The Mexico City Generation Equality Mural by artist Adry del Rocio (UN Women / Dzilam Méndez)
On June 30 a long-awaited global event, the Generation Equality Forum, will commence. Governments, philanthropists, businesses and civil society advocates will place a global spotlight on gender equality. The event marks a quarter century of concerted effort following the 1995 Fourth World Conference on Women and its Beijing Declaration and Platform for Action. In the wake of a year defined by a global pandemic, lockdowns and worldwide economic instability, stakeholders will convene virtually to launch a new set of commitments, among them commitments on technology and innovation for gender equality.
The Action Coalition on Technology and Innovation spans four major themes. First, it addresses the gender digital divide, calling for greater access and more training for women. Second, it calls for investment in women as innovators. Third, it calls for doubling the number of women working in the tech sector. Fourth, it calls for an end to online discrimination and harassment.
On the surface, these are all admirable goals. Apart from the fourth pillar, however, none addresses underlying, historically rooted inequity. Given our growing understanding of the ways in which prominent digital platforms have undermined development goals, I am concerned that these commitments may be naive. Policymakers and others making new commitments must address technology’s real and potential harms. In particular, platform or ‘gig’ work is on the rise and needs to be a focus of the Action Coalition on Technology and Innovation. Like any other technology, platforms for work can be positive or negative; they can provide decent jobs or precarious ones. And, like any technology, the code underlying platforms can benefit or harm workers.
As I argue in a new paper for the International Development Research Centre, we cannot simply focus on providing access to technology, jobs or skills. Platforms and artificial intelligence risk disguising discrimination in labour markets. If we want technology to serve the aim of correcting inequality in the economy, our starting point must be to identify the root causes. Policymakers must address discrimination embedded in the technology itself, as well as the systems that perpetuate labour market discrimination. I offer three important considerations for any new technology-focused investments in the world of work.
First, remember a tool is not an end in itself
Policymakers must avoid excessive faith in the ability of tech to solve complex problems. Digital technologies are not a panacea for the complex and systemic challenges of unemployment and underemployment. Investors in platforms for low-wage and marginalized workers need to ask whether these investments really provide more and better information to job-seekers. Some have argued that platforms reduce discrimination in labour markets. This is not inherently true, as evidence of discrimination on platforms like Uber and Airbnb has illustrated. Let’s recognize that platforms can facilitate and mask discrimination on the basis of gender, ethnicity, race, or migration status. Donors, investors and policymakers should insist on gendered power analysis around these kinds of interventions to ensure that any inherent inequities in design are addressed up front.
Second, avoid over-reliance on individual empowerment
Optimism in technological fixes for difficult problems is not accidental; cultural narratives have continuously reinforced the promise of technology. The conflation of the terms “technology” and “innovation” in policy has masked the ways in which technology can actually serve regressive purposes. Recent efforts such as the Elephant in the Valley project offer keen insight into Silicon Valley’s tech industry, which pushes a narrative that winners succeed solely by dint of their individual merit, yet sustains a culture that perpetuates discrimination against women and people of colour.
An antidote to this is for governments to measure progress in broader terms than individual success. If some individuals benefit, but labour markets as a whole weaken, we must consider how platforms may be contributing structurally and systemically to the erosion of decent work. We must also consider interventions intended to empower workers — and particularly those who have been restricted or excluded from traditional labour markets — collectively rather than individually. An individual worker constrained by unpaid care burdens and offered a choice of flexible work may be better off individually. However, when an entire community or class of such workers is collectively made available to labour markets in ways that circumvent labour laws, this may erode opportunities for decent work. Individual consequences may look very different from collective consequences.
Third, the rabbit hole is real
The ways in which algorithms on social media serve to engage users through the display of increasingly extreme content are now well-documented (The Social Dilemma, 2020). Former Google engineer Guillaume Chaslot used the term “rabbit hole” to describe the ways in which these algorithmic nudges amplify not only more and more extreme content, but ultimately extreme and sometimes violent behaviors. One corollary to this finding is that the gender digital divide may serve to amplify gender-based violence. Artificial intelligence has been trained on biased data, as initial gender digital divides resulted in far more men than women online. Algorithms have incentivized users to engage with content that further reinforces gender bias and gender-based violence. This, in turn, has further constricted women’s ability to engage comfortably in virtual space. Some women have expressed the feeling they “should not be on the Internet.” When women and marginalized people impose barriers on themselves for fear of harm, simply providing greater access to connectivity or skills will not be effective.
Generation Equality commitment-makers may fall into the easy trap of focusing on inputs rather than systems. Providing access to broadband or skills training will not address important hidden barriers such as discriminatory norms, workplace cultures, or the persistence of gender-based violence. Approaches targeting individual women may actually be an obstacle to good policy solutions; this can be countered if policymakers reject the narrative of self-reliance and consider how well platforms are serving to shift overall labour market dynamics — particularly for disadvantaged groups — toward decent work.
Policymakers must address the increasingly toxic online environment if they want to ensure gender equity in access to platform work, or any online engagement. Policymakers concerned with gender-based violence in the world of work, both offline and online, must work to rein in the algorithms that amplify and provoke violence across all virtual spaces.
In brief, we must address the challenge of ongoing inequality in the digital economy in ways that are systemic and reflect our understanding of deeply rooted discrimination. As a start, let’s step away from proposals that rely on ahistoric, technology-led fixes. Instead, let’s rethink how we understand women’s autonomy and agency in a data-driven economy.