AIS Transactions on Human-Computer Interaction


Persuasive system design (PSD) is an umbrella term for designs in information systems (IS) that can influence people's attitudes, behavior, or decision making, for better or for worse. On the one hand, PSD can improve users' engagement and motivation to change their attitudes, behavior, or decision making in a favorable way, which can help them achieve a desired outcome and, thus, improve their wellbeing. On the other hand, PSD misuse can lead to unethical and undesirable outcomes, such as disclosing unnecessary information or agreeing to terms that do not favor users, which, in turn, can negatively impact their wellbeing. These powerful persuasive designs can involve concepts such as gamification, gamblification, and digital nudging, all of which have become prominent in recent years and have been implemented successfully across different sectors, such as education, e-health, e-governance, e-finance, and digital privacy contexts. However, such persuasive influence on individuals raises ethical questions because PSD can impair users' autonomy or persuade them toward a third party's goals and, hence, lead to unethical decision-making processes and outcomes. In human-computer interaction, recent advances in artificial intelligence have made this topic particularly significant: these novel technologies make it possible to influence the decisions that users make, to gather data, and to profile and persuade users into unethical outcomes, which can cause them psychological and emotional harm. To understand the role that ethics plays in persuasive system design, we conducted an exhaustive systematic literature analysis and 20 interviews to provide an overview of ethical considerations for persuasive system design. Furthermore, we derive potential propositions for more ethical PSD and shed light on potential research gaps.




