Algorithmic systems allocate tasks, monitor performance, track data, and determine app-based workers' payouts in the platform economy. These algorithms are often referred to as black boxes due to their opaque and inscrutable nature (Pasquale 2015). While a growing body of literature acknowledges the pervasive influence of algorithms in shaping platform work, the internal mechanics of algorithmic systems remain underexplored. This paper attempts to unpack the layers of socio-technological processes involved in the production of algorithms.
Through technographic fieldwork at Mobi, a ride-hailing platform company based in Bengaluru, India, this research uncovers the dynamics underpinning algorithmic management systems within the platform company, thereby shifting attention towards the development of algorithms and the politics of their production at the firm level (Wood 2024). Technographic data were collected over the course of a year, from April 2023 to March 2024, through participant observation and over 80 interviews conducted across the company hierarchy, including software developers, product managers, and founders, to understand their role in the development of the app. The fieldwork also incorporates the perspectives of app-based drivers and union leaders to capture the lived experiences of workers subject to algorithmic management.
The paper identifies four key dimensions through which developers shape algorithmic systems: Optimization, Experimentation, Gamification, and Debugging. These dimensions illustrate how algorithmic systems are continually adjusted and refined, not only to enhance efficiency but also to nudge driver behaviour, manage labour supply, and respond to both technical and social frictions. Through these practices, developers play an active role in governing the conditions of app-based work.
Drawing from the anthropology of algorithms and scholarship on platform labour, the study argues that algorithms are not purely technical artefacts but are shaped by the politics of production within the company. In other words, a platform application is algorithmically coded with the decisions, values, and labour practices of its creators. The study highlights how developers write and rewrite the rules of the game, ultimately governing the livelihoods of platform workers, and reflects on the sociality embedded within algorithmic systems.
By peering inside the black box, this research reveals how platform companies strategically obscure their role as employers by presenting themselves as merely software providers, thereby denying workers their rights and evading responsibility. These findings underscore the imperative for regulatory interventions to hold platform companies accountable for the algorithmic practices through which they control labour.