Earlier this year, fellow editor Sevin and I were fortunate enough to attend the 2021 All About Women talk, ‘How Smart is AI?’, by author and academic Kate Crawford with host Rae Johnston. We were unsure of what to expect when we sat down, but an hour later, we left the auditorium indignant and a little more glum than before the talk.
We walked along the harbour towards Circular Quay station in urgent conversation, rehashing everything we had just learned to commit it deeper into our memory. We trash-talked Jeff Bezos and affirmed what an awful person he is. We reminded each other of the lithium shortage, and of every billionaire’s desire to abandon the planet. As AI steadily permeates every aspect of society, including facial recognition software now being used for HR and hiring purposes, Kate Crawford reminds us that we have only a few short years to push back on these systems.
First, let’s lay down some harsh truths. We often associate AI with something entirely intangible and incorporeal. A quick Google search will show you how the concept of AI has been sold to us visually: AI is a dazzling circuit board made of light. Or a genderless human face made of light. Or a human brain… made of light. You get my point. It’s easy to see how we’ve come to regard AI, data, and the cloud as otherworldly, when in reality, data centres very much belong to this world. In fact, data centres (the buildings that house servers, i.e. the home of your internet and Netflix) contribute the same amount of carbon emissions as the airline industry.1 Not only do most data centres rely on non-renewable energy sources, but the industry is also expected to grow 500% by 2050.2 3
There have also been multiple reports and papers warning that we are due to run out of lithium soon. Lithium is the highly toxic but crucial metal in all of your rechargeable devices and batteries, and very, very soon, demand for it will outstrip supply. Rae Johnston makes the astute observation that there is currently a push for consumers to embrace electric cars, yet the very materials needed to make their batteries may run out in the next decade or two. Kate suggests we’ll run out of lithium by 2040, which is quite a generous estimate; other reports claim it may be as soon as 2025.4 5 If we figure out how to recycle lithium, the reserves may last until 2100, but according to Kate, we are so far very bad at figuring that out.
We are not only running out of a key resource and failing to recycle it, we’re also at a loss about what to do with our e-waste. There is a lake in Inner Mongolia, about 8–9 kilometres in diameter, that is entirely made of black sludge and toxic waste. “We’re literally terraforming the Earth with the legacies of e-waste, and it’s something we will be living with for generations until we can figure out a much better way to ... construct and use hardware.”
If this is not enough to convince you that the systems we have created are not sustainable, look to the projects that the billionaires are undertaking. As many of you may be aware, a new space race has taken root in our current era, culminating in Richard Branson ‘beating’ Bezos to space earlier this year. Elon Musk is promising to take humans to Mars. Don’t worry if you can’t afford it; you can pay it off when you get there. Indentured servitude might be frowned upon on Earth, but maybe the rules are different on Mars? CEO, entrepreneur (born in 1964) Jeffrey Bezos, however, is not looking to start a human colony; rather, he wants to move human labour off-world to mine asteroids. You know we’re in the folds of late-stage capitalism when even capitalism is spreading beyond the stratosphere.
“It strikes me as not a coincidence that the men who made their fortunes from AI ... are all now focused on leaving the Earth,” Kate muses. And given what we have learned about the environmental impact of these tech giants, can you blame the billionaires for wanting to launch themselves away from the Earth? She continues, “[Their] vision of the future is so astoundingly terrifying to me, which is the complete abrogation of responsibility and ultimately a departure from the Earth entirely. There’s something so shocking about that, as the end point of these great technologists’ vision.”
Instead of making this planet (the only planet in the solar system suitable for us to live and thrive on) more sustainable, the most powerful men in the world are looking to move on. We are forced to watch as billionaires invest their money, time, and teams of engineers into anything but the real, pressing issues of the planet we actually have. We are being sold an unrealistic and unattainable fantasy of moving to a new planet. It’s nonsense, but it’s powerful nonsense. When you move to a planet as inhospitable as Mars, I guess there’s not much biodiversity for you to ruin, eh, Elon?
Who exactly are the people in the ‘room where it happens’? Unfortunately, stereotypes about the gender imbalance in STEM industries are real; by a generous estimate, over 85% of Amazon engineers are male. The World Economic Forum found that ‘only 22% of AI professionals globally are female’.6 So, it’s easy to see how AI has ended up with its infamous gender and racial biases. When a system is being developed by such a homogeneous group, the resulting AI is bound to make mistakes. Kate posits, “Who are in the rooms where AI is designed? Who are the engineers that get to decide what problems can be resolved by AI and for whom this AI will work best? Who runs these companies?” Her answer, to no one’s surprise, is white men, who have ‘skewed the thinking and priorities that these technical systems are serving’.
Joy Buolamwini, founder of the Algorithmic Justice League, reported that the image datasets used to teach AI facial recognition skewed heavily towards ‘white persons’, with 75% male faces. Facial recognition systems are far more likely to misclassify women than men; darker-skinned women were found to be misclassified at an error rate of 35%.7 As the use of AI and facial recognition software in public spaces grows, racial and gender bias in these systems puts not only women at a disadvantage and at potential harm, but also people of colour, and trans and non-binary people. This will be one more thing that marginalised people will have to teach their children, unless we take action now.
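The disparity Buolamwini measured is, at its core, a simple calculation: instead of reporting one aggregate accuracy figure, you break classification errors down by demographic subgroup and compare. A minimal sketch of that audit step, using made-up records rather than any real dataset:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate for each demographic subgroup.

    records: iterable of (group, predicted_label, true_label) tuples.
    Returns a dict mapping group -> error rate (0.0 to 1.0).
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical gender-classifier outputs, tagged with the
# subject's subgroup (illustrative only, not real audit data).
sample = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),   # misclassified
    ("darker-skinned female", "female", "female"),
]
print(error_rates_by_group(sample))
```

A single accuracy number would hide the gap entirely; only the per-group breakdown reveals who the system fails most.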
A member of the audience asks, “Early on, women were prevalent in data and computer development. How did they get blocked out? And how do we get them back in?”
“I love this question,” Kate gushes. I love this question too; I thought of Margaret Hamilton, the software engineer instrumental in putting NASA’s astronauts on the Moon in 1969. Kate’s answer comes without missing a beat: money and popularity. As an industry grows more popular and becomes associated with profit, it becomes more male-dominated and women get pushed out. Instead of thinking that the 60s seemed progressive for hiring a female engineer to lead the Apollo software team, why had I never considered that the industry had in fact regressed? Kate argues that it fundamentally comes down to the way we’ve constructed these industries, what they’ve come to prioritise, and whose work they value (or, in the case of women’s work, don’t value). Again, this harks back to the questions of ‘who is hiring?’ and ‘who is being hired?’.
Despite society’s well-meaning attempts, the gender imbalance issue is more often than not simplified into: well, more girls need to be encouraged to pursue a STEM career! And so, the solutions include creating coding camps for girls or a mentoring program for girls, as if that’s enough to prepare women before they are thrown into male-dominated workplaces. It’s quite clear that society is treating a symptom rather than the disease, because, well, treating the disease is much harder. The problem is not that there aren’t enough girls interested in STEM; it’s that women are being marginalised, harassed, and made to feel lesser in these spaces.
Is this another story where we can dump the blame on consumers? Is it your fault for not boycotting the tech industry? Is it your fault that electric cars are being sold to you as the right choice, despite dwindling lithium reserves? No, absolutely not. This information is kept away from you while decisions are being made in rooms dominated by men. Kate argues that the details required for the public to understand the issue and ask the right questions ‘are kept at arm’s-length and are kept invisible and hidden’. It is the responsibility of those who are in the AI space to make these systems clear and comprehensible to the people who will be most affected by them. As Kate puts it, “It’s a much bigger question about how we are creating systems that are fundamentally unsustainable.”
She remains hopeful, though, despite having just spent an hour telling us how grim our future might very well be. We are taught that ‘technology is inevitable’: it will come, and you will have to use it. Except you don’t, Kate argues. Entire cities around the world have rejected the use of facial recognition software. All of these systems can be rejected; you don’t have to accept them. This is why it’s so important to have these discussions. “We have this window to create real change and that’s something we have to do collectively.”
If any part of this article sparked your interest, please consider reading Kate Crawford’s Atlas of AI. - https://bookshop.org/books/atl...
How Smart is AI? is available to rent on the Sydney Opera House website. - https://stream.sydneyoperahous...