The welcome application of good sense to AI hype

Summer over in a flash, autumn wind and rain outside – perhaps cosy evenings will speed up both my reading and review-posting.

I just finished AI Snake Oil by Arvind Narayanan and Sayash Kapoor, having long been a fan of the blog of the same name. The book is a really useful guide through the current hype. It distinguishes three kinds of AI – generative, predictive and content moderation AI – an interesting categorisation.

On generative AI, so much in the air since ChatGPT was launched in late 2022, the persuasive debunking here is of the idea that we are anywhere close to ‘general’ machine intelligence, and of the notion that such models pose existential risks. The authors are far more concerned with the risks associated with the use of predictive AI in decision-making. These chapters provide an overview of the dangers: from data bias, to model fragility and overfitting, to the broad observation that social phenomena are fundamentally more complicated than any model can predict. As Professor Kevin Fong said in his evidence to the Covid inquiry last week, “There is more to know than you can count.” An important message in these times of excessive belief in the power of data.

The section on the challenges of content moderation was particularly interesting to me, as I’ve not thought much about it. The book argues that content moderation AI is no silver bullet for tackling the harms related to social media – in the authors’ view it is impossible to remove human judgement about context and appropriateness. They would like social media companies to spend far more on human moderators and on setting up redress mechanisms for when the automated moderation makes the wrong call: people currently have no recourse. They also point out that social media is set up with an internal AI conflict: content moderation algorithms are moderating the very content the platforms’ recommendation algorithms are recommending. The latter have the upper hand because recommendation doesn’t involve delicate judgements about content, only tracking the behaviour of platform users to amplify popular posts.

There have been a lot of new books about AI this year, and I’ve read many good ones. AI Snake Oil joins the stack: it’s well-informed, clear and persuasive – its cool breeze of knowledge and good sense is a good antidote for anybody inclined to believe the hyped claims and fears.
