Paper Number

1624

Paper Type

Complete

Description

On digital labor platforms, interactions between workers and clients are algorithmically managed. Previous research has found that algorithmic management can disadvantage workers. In this paper, we empirically examine algorithmic unfairness from a sociotechnical perspective. Specifically, we conduct online focus groups with 23 workers who directly interact with algorithmic management practices on digital labor platforms. Using grounded theory methodology, we seek to understand how algorithmic management promotes unfairness on digital labor platforms. Our emergent theory conceptualizes algorithmic unfairness as algorithmic management practices that give rise to systematic disadvantages for workers. Algorithmic management practices either automate decisions or automate the delegation of decisions. Workers experience systematic disadvantages in the form of devaluation, restriction, and exclusion. Our findings serve as a starting point for mitigating algorithmic unfairness in the future.

Comments

04-Work

Dec 12th, 12:00 AM

Algorithmic Unfairness on Digital Labor Platforms: How Algorithmic Management Practices Disadvantage Workers

