"Most women in the San Francisco Bay Area are soft and weak, cosseted and naive, despite their claims of worldliness, and generally full of (expletive)," wrote former Facebook product manager Antonio Garcia Martinez in 2016. "They have their self-regarding entitlement feminism, and ceaselessly vaunt their independence. But the reality is, come the epidemic plague or foreign invasion, they'd become precisely the sort of useless baggage you'd trade for a box of shotgun shells or a jerry can of diesel."
This is from his insider account of Silicon Valley, Chaos Monkeys. The book was a best-seller. The New York Times called it "an irresistible and indispensable 360-degree guide to the new technology establishment". Anyone who is surprised by the recent revelations of sexism spreading like wildfire through the technology industry has not been paying attention.
When Susan Fowler wrote about her experience of being sexually harassed at Uber, it prompted a chain of events that seemed unimaginable months ago, including an investigation led by former attorney general Eric Holder, and the departure of a number of key members of the company's leadership team.
Venture capitalist Justin Caldbeck faced allegations of harassing behaviour, and when he offered an unimpressive denial, companies funded by his firm banded together to condemn his tepidity. He subsequently resigned, and the future of his former firm is unclear. Since then, dozens of women have come forward to reveal the sexist culture in numerous Silicon Valley technology and venture capital firms.
At least this issue is being discussed in ways that open up the possibility that it will be addressed. But the problem of sexism in the tech industry goes much deeper and wider. American academic Melvin Kranzberg's first law of technology tells us that technology is neither inherently good nor bad, nor is it neutral. As a black mirror it reflects the problems that exist in society - including the oppression of women. Millions of people bark orders at Alexa every day, but rarely are we encouraged to wonder why the domestic organiser is voiced by a woman. The entry system for a women's locker room in a gym recently refused entry to a female member because its software categorised her title, "Dr", as male.
But the issue is not only that technology products reflect a backward view of the role of women. They often also appear ignorant or indifferent to women's lived experience. As the internet of things expands, more devices in our homes and on our bodies are collecting data about us and sending it to networks, a process over which we often have little control. This presents profound problems for vulnerable members of society, including survivors of domestic violence.
Threats by abusers
Unsurprisingly, technology is used by abusers: In a survey of domestic violence services organisations, 97 per cent reported that the survivors they work with have experienced harassment, monitoring, and threats by abusers through the misuse of technology. This often happens on phones, but 60 per cent of those surveyed also reported that abusers have spied on or eavesdropped on survivors or their children using other forms of technology, including toys and other gifts.
Products that are more responsive to the needs of women would be a great start. But we should also be thinking bigger: We must avoid reproducing sexism in system design. In natural language processing, for example, computers learn to engage with us conversationally by mapping words to vectors, and the spatial relationships between those vectors capture meaning. By reading a lot of text, a computer can learn that Paris is to France as Tokyo is to Japan. In effect, it develops a dictionary by association.
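The analogy trick above comes down to simple vector arithmetic. The following sketch uses tiny, hand-invented vectors purely for illustration - real models such as Word2vec learn hundreds of dimensions from billions of words - but the mechanism is the same: subtract one word vector from another, add a third, and look for the nearest word.

```python
import numpy as np

# Toy word vectors, invented for illustration only.
# Dimension 0 loosely encodes "is a country", dimension 1 "is a capital",
# and dimension 2 a shared country/capital trait.
vectors = {
    "paris":  np.array([0.0, 1.0, 0.2]),
    "france": np.array([1.0, 0.0, 0.2]),
    "tokyo":  np.array([0.0, 1.0, 0.9]),
    "japan":  np.array([1.0, 0.0, 0.9]),
}

def analogy(a, b, c, vocab):
    """Solve 'a is to b as c is to ?' via the arithmetic b - a + c."""
    target = vocab[b] - vocab[a] + vocab[c]
    best, best_sim = None, -2.0
    for word, vec in vocab.items():
        if word in (a, b, c):          # exclude the query words themselves
            continue
        # cosine similarity between candidate and target point
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("france", "paris", "japan", vectors))  # -> tokyo
```

With trained embeddings the same query is usually run through a library such as gensim rather than by hand, but the geometry - and therefore any bias baked into it - is identical.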
But this can create problems when the world is not exactly as it ought to be. For instance, researchers have experimented with one of these word-embedding models, Word2vec, a popular and freely available model whose vocabulary of three million words and phrases was learned from Google News articles. They found that it produces highly gendered analogies. For instance, when asked "Man is to woman as computer programmer is to?", the model will answer "homemaker". Or for "father is to mother as doctor is to?", the answer is "nurse". It is not hard to imagine how this model could also be racially biased.
These biases can be amplified during the process of language learning. As the MIT Technology Review points out: "If the phrase 'computer programmer' is more closely associated with men than women, then a search for the term 'computer programmer CVs' might rank men more highly than women". When this kind of language learning has applications across fields including medicine, education, employment, policymaking and criminal justice, it is not hard to see how much damage such biases can cause.
Removing such gender bias is a challenge, in part because the problem is inherently political: Word2vec entrenches the world as it is, rather than what it could or should be. But if we are to alter the models to reflect aspirations, how do we decide what kind of world we want to see?
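One family of proposed fixes, due to researchers including those who found the "homemaker" analogy, is to identify a "gender direction" in the vector space and project it out of words that ought to be gender-neutral. The sketch below again uses invented vectors; it only illustrates the projection step, not the full published method, and choosing which words to neutralise is exactly the political question raised above.

```python
import numpy as np

# Invented vectors for illustration. In a real model, "he" and "she"
# would be the learned embeddings of those words.
he  = np.array([ 1.0, 0.3, 0.5])
she = np.array([-1.0, 0.3, 0.5])
programmer = np.array([0.6, 0.8, 0.4])   # hypothetical, gender-tinged

# Unit vector along the he-she axis: the "gender direction".
gender_dir = he - she
gender_dir = gender_dir / np.linalg.norm(gender_dir)

# Remove the component of "programmer" that lies along that axis.
debiased = programmer - (programmer @ gender_dir) * gender_dir

print(debiased @ gender_dir)  # ~0: no remaining gender component
```

The arithmetic is easy; deciding that "programmer" should sit at zero on the gender axis - and what zero even means - is the hard, value-laden part.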
Digital technology offers myriad ways to put these understandings to work. It is not bad, but we have to challenge the presumption that it is neutral. Its potential is being explored in ways that are sometimes promising, often frightening, and occasionally amazing. To make the most of this moment, we need to imagine a future without the oppressions of the past. We need to allow women to reach their potential in workplaces where they feel safe and respected. But we also need to look into the black mirror of technology and find the cracks of light shining through.
The writer is a human rights lawyer and broadcaster