SIG ODIS - Artificial Intelligence and Semantic Technologies for Intelligent Systems

Paper Type

Complete

Paper Number

1437

Description

Bias in algorithms is a nascent issue, driven by the rapid expansion of algorithms used as decision-support tools or primary decision-makers in many organizations and institutions. Bias can creep into machine learning algorithms at many stages of the process lifecycle, from the training data to data modification after feedback. The fairness of the decisions made by these algorithms is another complex area to assess. This paper presents an ethnographic study of a project in which social media data was labeled and expanded via machine learning to research the political spectrum. We detail how both the initial manual labeling and the subsequent automatic labeling could have introduced bias. The study identifies a new type of bias, the perception bias of the audience, which was especially strong given the political nature of the data in our project. Furthermore, the study provides an example of the application of a fairness typology.

Comments

SIG ODIS

Aug 10th, 12:00 AM

Generating Outrage Through Data Bias: An Ethnographic Study