Artificial intelligence (AI) rests on the premise that machines can behave in a human-like way and potentially solve complex analytics problems. In recent years, we have seen the emergence of several off-the-shelf AI technologies that claim to be ready to use. In this paper, we illustrate how one can use one such technology, called IBM Natural Language Understanding (NLU), to solve a data-analytics problem. First, we provide a detailed step-by-step tutorial on how to use NLU. Next, we introduce our case study in which we investigated the implications of Starbucks’ pledge to hire refugees. In this context, we used NLU to assign sentiment and emotion scores to social-media posts related to Starbucks made before and after the pledge. We found that consumers’ sentiment towards Starbucks became more positive after the pledge, whereas investors’ sentiment became more negative. Interestingly, we found no significant relationship between consumers’ and investors’ sentiments. With help from NLU, we also found that consumers’ sentiments lacked consensus in that their social-media posts contained a great deal of mixed emotions. As part of our case study, we found that NLU correctly classified the polarity of sentiments 72.64 percent of the time, an accuracy value much higher than the 49.77 percent that the traditional bag-of-words approach achieved. Besides illustrating how practitioners and researchers can use off-the-shelf AI technologies in practice, we believe the results from our case study provide value to organizations interested in implementing corporate social responsibility policies.
Carvalho, A., Levitt, A., Levitt, S., Khaddam, E., & Benamati, J. (2019). Off-the-Shelf Artificial Intelligence Technologies for Sentiment and Emotion Analysis: A Tutorial on Using IBM Natural Language Understanding. Communications of the Association for Information Systems, 44, pp-pp. https://doi.org/10.17705/1CAIS.04443