For the first time in a year I managed to get abroad for a few days – the Ambrosetti Forum at the Villa D'Este on Lake Como – and apart from the inherent joy of being somewhere beautiful and sunny and foreign, it gave me plenty of reading time. One of the books I polished off is Kate Crawford's excellent Atlas of AI. It's a forensic exploration of the unseen structures shaping the way AI is being developed and deployed in the world, and it is fair to say she is profoundly sceptical about whether 'actually existing AI' is serving society broadly as opposed to making a small minority of (mainly) men rich and powerful. "To understand how AI is fundamentally political," she writes in the introduction, "we need to go beyond neural nets and statistical pattern recognition to instead ask what is being optimized, and for whom, and who gets to decide."
The book starts with the material basis of the industry, in particular the extraction of rare earths and the industry's voracious and growing consumption of energy. We all surely know about the energy appetite of crypto, but one point I hadn't really appreciated is this: "The amount of compute used to train a single AI model has increased by a factor of ten every year." The next chapter goes on to discuss the extent to which AI depends on low-cost human labour. I think the way Amazon's Mechanical Turk works is quite well known – a nice book about this was Ghost Work – but this chapter focuses on Amazon warehouses, image-labelling work ("the technical AI research community relies on cheap crowd-sourced labour for many tasks that can't be done by machine"), and also – nice neologism – 'fauxtomation', when tasks are transferred from human workers to human customers: think 'automated' checkouts in shops. The chapter has a nice section discussing the role of time in business models: in an evolution of the industrial organisation of time, the continuing automation of economic activity is requiring humans to work ever faster. The battle is for 'time sovereignty'.
There is, not surprisingly, a chapter on data, underlining the point that it is profoundly relational, but also noting that the reliance on 'data' downgrades other forms of knowledge, such as linguistic principles, and that there is remarkably little questioning of where the data comes from and what it actually measures. I hadn't realised that many computer science departments have not had ethical review processes, on the basis that they do not have human subjects for their research. It's also a bit of a shocker to realise that some widely used AI databases are palimpsests of older collections of data, embedding unpalatable classifications and assumptions.
There's a similarly shocking chapter on facial recognition – bad enough in the ungoverned way it's being used, but I hadn't clocked the latest trend of using it to "identify" people's moods. And the book winds up back at power. As Crawford writes, "We must focus less on ethics and more on power." I couldn't agree more. There are a gazillion ethics statements and loads of ethics research, but they won't change anything. I'd add incentives too, but given the concentration of all that compute, understanding the way AI and power structures interact will shape the kind of world we are in 10 years from now. Excellent book.