Communications of the Association for Information Systems

Author ORCID Identifiers

Laurie Giddens: https://orcid.org/0000-0002-1495-3158

Natalie Gerhart: https://orcid.org/0000-0003-3926-3062

Russell Torres: https://orcid.org/0000-0002-2003-5847

Stacie Petter: https://orcid.org/0000-0003-0522-3943

Abstract

Nonconsensual sexually explicit deepfakes, also known as nonconsensual intimate images (NCII), are a taboo topic that requires additional attention within the information systems discipline. Although deepfakes have a variety of potential applications, some of which may be beneficial, both the origins and primary applications of deepfake technology are tied to the creation of pornography, the majority of which is nonconsensual. Current information systems research is generally focused on the application of deepfakes within business contexts or the nefarious use of deepfakes to propagate disinformation. Academia will fail to fully understand and theorize about beneficial or harmful deepfakes if we ignore that the overwhelming majority of deepfakes contain nonconsensual, sexually explicit content. Calls for papers, research agendas, and public discourse must reflect this reality, breaking free from the discomfort associated with the taboo nature of the subject. In this commentary, we aim to (1) draw attention to the topic of NCII deepfakes, which has received little attention in IS research, and (2) suggest areas of inquiry to demonstrate how existing IS research topics relate to the study of the creation, dissemination, and impact of NCII deepfakes, as well as responses to them.

DOI

10.17705/1CAIS.05725
