Millions of patients are hospitalised each year because of Adverse Drug Reactions (ADRs), and researchers are seeking ways to promptly discover effects that remained hidden before a drug was approved and marketed. Electronic health records, published biomedical research, and clinical reports have been recognised as rich text-based sources of early warning signals, and recent research has begun investigating the potential of social media as well. Recent studies have validated the efficacy of text mining approaches to pharmacovigilance in social media. We test whether a text mining methodology that has proven successful at identifying hazards in consumer products from online reviews can be applied to the discovery of ADRs. Since lexicon generation is a key step in this methodology, as well as in several other pharmacovigilance approaches, we also test two methods of lexicon creation: one driven by statistical term prevalence, the other by manual curation by individuals and groups. We conduct an experiment to determine the relative accuracy and precision of manually created lexicons versus machine-learned lexicons. We find that our methodology is effective at differentiating online reviews with and without ADRs. We also find that the top quantile of manually curated lists outperforms statistical term prevalence (supervised machine learning) on a variety of classification metrics; that is, human-generated lexicons outperform machine learning only in the best cases. Additionally, we find that groups outperform individuals, with group brainstorming beating both individual and machine-generated lists, and that there is no correlation between list size and classification performance.
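The statistical term-prevalence approach mentioned above can be illustrated with a minimal sketch. The function names, smoothing choice, and toy review data below are invented for illustration and are not taken from the paper; the idea is simply to rank terms by how much more often they occur in ADR-labelled reviews than in non-ADR reviews, then flag new reviews that contain high-ranking terms.

```python
from collections import Counter

def build_prevalence_lexicon(pos_docs, neg_docs, top_k=5):
    """Rank terms by a simple prevalence ratio (with add-one smoothing):
    how much more often a term appears in ADR-labelled reviews
    than in non-ADR reviews. Hypothetical helper, not the paper's exact method."""
    pos_counts = Counter(w for d in pos_docs for w in d.lower().split())
    neg_counts = Counter(w for d in neg_docs for w in d.lower().split())
    scores = {
        term: (pos_counts[term] + 1) / (neg_counts[term] + 1)
        for term in pos_counts
    }
    return [t for t, _ in sorted(scores.items(), key=lambda x: -x[1])[:top_k]]

def classify(review, lexicon):
    """Flag a review as possibly describing an ADR if any lexicon term appears."""
    words = set(review.lower().split())
    return any(term in words for term in lexicon)

# Toy labelled reviews (invented for illustration only).
adr_reviews = ["gave me severe nausea and dizziness",
               "stopped taking it after the dizziness started"]
ok_reviews = ["worked great no problems",
              "cheap and effective worked as expected"]

lexicon = build_prevalence_lexicon(adr_reviews, ok_reviews)
print(classify("mild nausea after the first dose", lexicon))
```

A manually curated lexicon would simply replace `build_prevalence_lexicon` with a human-supplied term list, which is the comparison the experiment evaluates.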
Gruss, Richard J.; Abrahams, Alan; and Ractham, Peter, "Human vs computer generated search terms for adverse drug reactions in online reviews" (2022). PACIS 2022 Proceedings. 345.