Paper Type

ERF

Abstract

Studies on complex chatbots (chatbots that engage users in long interactions requiring them to process many pieces of information and take many actions) show inconsistencies in the relationship between perceptions of these systems' complexity and adoption-related constructs. This paper reports on an ongoing research study that proposes shifting attention to user goal frameworks and goal magnitudes to study this relationship. We synthesize concepts from goal systems theory and goal framing theory to systematically develop a scale for measuring goal magnitudes within different goal frameworks during chatbot interactions. We then develop and test a theoretical model of how user goals influence the intention to switch to human representatives.

Paper Number

1481

Comments

SIGADIT

Top 25 Paper Badge
 
Aug 16th, 12:00 AM

From Simple Requests to Complex Conversations: How User Goals Shape Interactions with Chatbots and Switching Intentions

