Digitized Work, Dehumanized Workers

Bama Athreya
Oct 12, 2020

We have entered a new stage of capitalism. It is no longer the global economy that connects our economic activity but the digital economy, where geographies cease to matter, at least to capital, because everything now happens in cyberspace, on platforms.

In this William Gibson-esque world, work is being digitized, and workers are being dehumanized as a result. To understand and address labor exploitation in the digitized economy, we need to grasp the ways in which it both alters our labor relations and exacerbates longstanding problems.

The New Data Colonialism

A small handful of platform companies now dominates the entire globe, transforming our collective economic life. A ‘platform company’ is a corporate entity whose business model relies on a two-sided application programming interface (API) and the internet to ‘source, schedule, manage, ship, and bill task-based, project-driven work’, as Mary Gray and Siddharth Suri have described. Work is fragmented into digitally intermediated ‘gigs’ that in many ways resemble piece-work.

Platform work enables new forms of control over workers through the extraction and commodification of individual workers’ data. Shoshana Zuboff has detailed the frightening appropriation and commodification of our moment-by-moment lived experience. Or as Nick Couldry and Ulises Mejias put it, ‘whereas historical colonialism appropriated land, resources, and bodies, today’s new colonialism appropriates human life through extracting value from data.’

This ‘data colonization’ of platform workers has profound implications for workers’ agency and rights. Companies harvest worker data as inputs for algorithms that determine how to further optimize their operations. For example, ride-hailing apps use driver and rider data to build increasingly sophisticated models and projections of human mobility and to inform the development of self-driving vehicles. On the surface we may see this as benign and hope that such research contributes to better mobility for more people. However, platform workers are generally compelled to sign exceedingly broad agreements granting access to their personal data as a condition of employment. They have no meaningful way to opt out of being ‘digitized’ by the company, nor are they compensated in any way for this data labor. Worst of all, they have no access to their own data and no way to negotiate its use.
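To make this concrete, here is a purely hypothetical sketch of the kind of moment-by-moment record such an app might harvest. Every field name and value below is my own assumption for illustration, not any company’s actual schema.

```python
# Hypothetical example of the telemetry a ride-hailing app might
# continuously collect from a driver's phone. Field names are
# illustrative assumptions, not any real platform's data model.

telemetry_event = {
    "driver_id": "d-48219",
    "timestamp": "2020-10-12T08:31:04Z",
    "lat": 38.8951, "lon": -77.0364,     # location sampled continuously
    "speed_kmh": 42.5,
    "accepted_last_offer": False,        # behavioral signal fed to pricing models
    "hours_online_today": 9.4,
}

# Aggregated across millions of drivers, records like this can train
# mobility models and inform self-driving research. The driver cannot
# inspect, delete, or monetize any of it.
print(telemetry_event["driver_id"], telemetry_event["hours_online_today"])
```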

Algorithms and the Loss of Human Empathy

What’s even more insidious is the way in which the API disrupts the empathetic ‘brake’ that might otherwise mitigate acts of exploitation. Stanley Milgram’s famous 1960s obedience experiment is playing out everywhere, in real time. The experiment is best known for demonstrating the power of authority, but it also demonstrated the relevance of a technology-enabled, ‘gamified’ interface between the torturer and the victim. Gamification, the use of computer game-like rewards, penalties, and levels, is being applied everywhere in the platform economy. Clients and managers can hide behind algorithms and avoid responsibility for actions that penalize workers and service providers.

Gamification has become a central characteristic of the platform economy.

Algorithms that distribute work are built from code that necessarily relies on binary choices. These choices allow no consideration or understanding of human exigencies, such as the need to care for a sick family member or an unforeseen road blockage. Platforms may not have humans available to respond to workers who cannot meet the exact terms of a gig for some reason, and may therefore impose harsh penalties on the worker for non-performance. Gray and Suri refer to this as ‘inadvertent algorithmic cruelty’, since algorithms not only create distance but can remove the possibility of empathy between service provider and client. Yet behind each algorithmic decision there are human actors making choices in their business interests. In the platform economy, those actors remain forever anonymous and unreachable to workers.
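A minimal sketch can make the point concrete. The routine below is my own illustration, not any platform’s actual code; it assumes a simple score-and-penalize model to show how purely binary branches leave no room for human context.

```python
# Illustrative sketch only: a hypothetical dispatch/penalty routine.
# It shows how purely binary logic leaves no branch for human exigencies.

from dataclasses import dataclass

@dataclass
class GigOutcome:
    completed: bool   # did the worker finish the task?
    on_time: bool     # within the platform's deadline?
    # Note what is NOT modeled: illness, road closures, client errors.

def apply_penalty(worker_score: float, outcome: GigOutcome) -> float:
    """Binary rules: the worker either met the spec or did not.

    There is no input through which a worker could explain a sick
    child or a blocked road, and no human reviewer in the loop to
    exercise judgment.
    """
    if not outcome.completed:
        return worker_score - 10.0  # harsh penalty for non-performance
    if not outcome.on_time:
        return worker_score - 3.0   # lateness penalized regardless of cause
    return worker_score + 1.0       # reward only for exact compliance

# A worker who abandoned a gig to care for a sick relative is scored
# identically to one who simply failed to show up.
print(apply_penalty(50.0, GigOutcome(completed=False, on_time=False)))  # 40.0
```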

Workers Play Rating Roulette

Rating systems are commonly used by platforms of all kinds and offer another example of how platforms use clients to remove empathy. A simple one-to-five star rating system is a common way for clients or users of a platform to rate anything from a product they have purchased online to a service such as an Uber ride or an Airbnb stay. The system is couched as ‘crowdsourcing’, enabling the product or service to improve continuously as a result of customer feedback. In reality, it is often a control mechanism, instilling in gig workers a fear of ‘deactivation’ from the platform that may coerce them into accepting unsafe or exploitative conditions of work. Deactivation is the suspension of the account a worker uses to access gigs; it is effectively an electronic blacklist. This invisible and impersonal form of control is critical to consider as we identify violations of labor rights, particularly those meeting definitions of forced labor, because it may represent a form of force or coercion in which the agent of coercion is an algorithm. This raises serious challenges for accountability.
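As a rough illustration, consider the hypothetical deactivation rule sketched below. The rating threshold and the one-complaint trigger are assumptions for demonstration, not any specific platform’s policy.

```python
# Hypothetical sketch of a ratings-based deactivation rule, for
# illustration only; thresholds and parameters are assumptions.

def should_deactivate(ratings: list[int], complaints: int,
                      min_avg: float = 4.6) -> bool:
    """Deactivate (i.e., blacklist) a worker's account if their average
    star rating dips below a threshold or a single client complaint is
    on file. No appeal step is modeled because, as workers report, none
    is reliably offered."""
    avg = sum(ratings) / len(ratings) if ratings else 0.0
    return avg < min_avg or complaints >= 1

# One unverified complaint is enough to cut off the worker's income,
# even with a near-perfect rating history.
print(should_deactivate(ratings=[5, 5, 5, 4, 5], complaints=1))  # True
```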

Workers are penalized for receiving low ratings and may be deactivated from the platform on the basis of even a single client complaint. Many workers report that in such cases they are unable to access information about the client who lodged the complaint. One Lyft driver I interviewed asserted that it was common for passengers to lodge false complaints in order to get the company to refund their fare. Since the passenger no longer has a direct monetary transaction with the driver, this and similar behavior carries no possibility of contestation. Indeed, the popular delivery company Instacart’s system led to a widespread problem of ‘tip-baiting’, as clients sheltered by a completely transaction-free system engaged in coercive practices they might not have chosen if there were any possibility of a direct human confrontation.

‘Game Over’ for Workers’ Rights?

This system of rewards and punishments also acts coercively to prevent workers from speaking out when laws are violated. A domestic worker interviewed in the Brazilian documentary A Uberização do Trabalho (The Uberization of Work) described how the platform Rappi would determine how many hours a gig should take based on the work described by the client. However, she would often find additional cleaning tasks at the assigned location and, fearful that she would receive a poor rating if she did not complete them, would put in the extra time and work for no additional payment. Researchers who have documented app-based domestic work in the US and Europe share similar stories. Similarly, drivers I interviewed stated that they felt compelled to undertake assignments of dubious legality, such as transporting minors. Workers may also feel forced to forgo workplace safety concerns or decline to report sexual harassment for fear of receiving a poor rating from a client.

Algorithmic control erodes the human empathy necessary in employee-employer relations.

Moreover, algorithms acting as managers are not programmed to stop nudging workers toward ever more efficient work. The algorithms are coded to continue optimizing behavior even when rates of work and rates of compensation clearly violate local laws. If a worker is willing to accept a task at below minimum wage, or even to take on debt in order to be selected for a task, most platform algorithms will reward rather than prevent this. The listing of gigs at well below local and national minimum wage rates is a known feature of many platforms.
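The sketch below illustrates that logic. The bid values and the wage floor are my own assumptions, and no real platform’s code is represented; it simply shows how a cost-minimizing matcher with no legal floor encoded rewards whoever accepts the least.

```python
# Illustrative sketch: a task-matching objective that minimizes cost
# with no legal floor encoded. Numbers are assumptions for demonstration.

MINIMUM_WAGE_PER_TASK = 15.00  # the legal floor the code below ignores

def select_bid(bids: list[float]) -> float:
    """Pick the cheapest worker bid. Nothing prevents the winning bid
    from falling below the legal minimum; the optimizer simply rewards
    whoever accepts the least."""
    return min(bids)

bids = [18.00, 14.50, 9.75]  # the last two fall below the legal floor
print(select_bid(bids))      # 9.75: the algorithm 'optimizes' toward zero

# A compliant version would filter out illegal offers first:
# min(b for b in bids if b >= MINIMUM_WAGE_PER_TASK)
```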

One egregious example is the US-based household cleaning app Handy, which openly imposes a system of fees on workers, leaving some in debt bondage. The examples of domestic workers on Rappi shared by Repórter Brasil underscore the same point. This is an intentional design choice: the algorithms are ultimately optimized to treat work performed at no cost as the ideal and will continuously push toward that endpoint.

Yet the choice to allow lines of code to determine a reward, a penalty, or a blatantly illegal optimization endpoint is ultimately intentional, and the decision to remove a human interlocutor can be reversed. I believe we need to redefine what we consider forced labor for the platform economy. But that is not enough. We must also recognize that companies have brazenly asserted rights over our persons that they should never have been granted.

Our international laws prohibit the trade in organs and the trade in slave labor. At a recent conference, Zuboff called on all of us to reject the depiction of ourselves as ‘users’ and instead claim our rights as democratic citizens. In the digitized workplace, workers will first need to assert control over their own data and the right to contest oppressive optimization by algorithms. To paraphrase the preamble of the International Labour Organization, in a legitimate 21st-century economy, humans are not, nor should they ever be, a commodity.

* A longer version of this piece was published in Anti-Trafficking Review, Issue 15.

Title image photo credit: KamiPhuc

Originally published at https://connected2work.org on October 12, 2020.
