Humans and machines

My colleague Neil Lawrence’s new book, The Atomic Human: Understanding Ourselves in the Age of AI, is a terrific account of why ‘artificial intelligence’ is fundamentally different from embodied human intelligence – which makes it on the one hand an optimistic perspective, but on the other leads him to end with an alarming warning: that the potential of pervasive machine intelligence “could be as damaging to our cultural ecosystem as our actions have been to the natural ecosystem.” The influence of AI on human society could parallel our adverse influence on the environment, however good the intentions. Just as nature moves at the pace of evolutionary time while human activity does not – so the interface between humans and nature has failed to take account of the damage we cause – human cognition is slow relative to machines, and the computer-human interface is characterised by the same kind of mismatch in information-processing speeds.

The book does not offer a handy list of actions to prevent the damage AI might do to us, but ends by warning about two things: the immense concentration of power in its development and use; and the use of automated decision-making in contexts where judgement is essential – which is to say many contexts, wherever uncertainty enters the picture. I rather fear those horses have bolted, though.

Most of the book is a fascinating account of both types of intelligence, AI and human cognition, using information theory as well as cognitive science to explain the profound differences. As he notes, “Shannon defined information as being separated from its context,” but humans need contextual understanding to communicate. Neil uses stories to provide context, to make what could be rather dry material more engaging, braiding the same examples (many from wartime: Bletchley Park, his grandfather’s D-Day experience alongside General Patton’s, the development of radar, missile testing…) through the text. Sometimes I found these confusing, but I have a very literal mind.
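A reminder of what that separation means – my gloss, not a formula from the book: Shannon’s measure of information depends only on the probability distribution of the symbols transmitted, not on what they mean to sender or receiver,

    H(X) = -\sum_{x} p(x) \log_2 p(x)

so two messages with identical symbol statistics carry identical Shannon information, however different their meaning – exactly the gap that embodied human communication has to fill with context.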

There have been lots of books about AI out this year, and I’ve generally enjoyed the ones I’ve read – although whatever you do, avoid Ray Kurzweil’s. I’d recommend adding this one to the to-read list, as it offers a fresh perspective on AI from a super-expert and super-thoughtful practitioner.

Escape velocity?

I’ve read Ray Kurzweil’s jaw-dropping book, The Singularity is Nearer: When We Merge With AI, so you don’t have to. He does literally believe we will be injected with nanobots to create an AI super-cortex above our own neocortex, plugged into the cloud and therefore into all of humanity’s accumulated intelligence, and thus become super-intelligent with capabilities we can hardly imagine. Among the other possibilities he foresees are AI ‘replicants’ (yes, he calls them that) created from the images and texts of deceased loved ones to restore them to artificial life; the main challenge he anticipates is their exact legal status. The book has a lot of capsule summaries about consciousness, intelligence and how AI works – and also about the general ways in which life is getting better: there will be more jobs, and our health and lifespans will improve by leaps and bounds.

Might he be wrong about reaching ‘longevity escape velocity’ and the AI singularity by 2030? There’s a hint of it when he notes that book production is so slow that what he wrote in 2023 will already have been overtaken by events by mid-2024, when we are reading it: “AI will likely be much more woven tightly into your daily life.” Hmm. Not sure about that prognostication. One of the scariest things about the book, though, is the advance praise from Bill Gates, who describes the author as “The best person I know at predicting the future of artificial intelligence.” Do all the Tech Types believe this?

One suspects they believe they’re already more super-intelligent than the rest of us, so what could possibly go wrong?

A depressing catalogue

The depressing UK election campaign (albeit far less depressing than some others around the world) sent me back to a book whose subtitle is ‘Half a century of British economic decline’. It’s Russell Jones’s excellent and sobering The Tyranny of Nostalgia. I read it in proof and, as I remembered, it offers a more or less ringside view of economic policymaking (mainly macro) in the UK over the past half century. It takes a couple of chapters to get into its stride – an initial one about the nature of economics and another about the years before Jones started his career as a professional economist – but then it does. As he sums up the story, “It is a depressing catalogue of misapprehensions, missteps, underachievement, wasted opportunities, crises and humiliations.”

The themes that jump out – and in my view remain key problems today – are consistent under-investment and what Jones describes as the ‘capriciousness’ of policy, or churn. And above all, nostalgia for past glories, which “infected programmes with wishful thinking. … Britain lost an empire and time and again it failed to find an enduringly workable economic policy framework.” The post-colonial angst is one reason the book describes the UK’s ever-fraught relationship with the rest of Europe as a ‘running sore’.

There’s scant sign in the current campaign of overt political recognition that most of Britain is a poor country by the standards of those we like to consider our peers, paying the price for at least five decades of failure to invest in the future. Also depressing is the absence of any meaningful consensus about long-term economic strategy, across parties or within them, suggesting that the British disease of policy churn will persist. We’ll see after 5th July whether things will get better…

Books at a workshop

An interdisciplinary workshop about capitalism was always going to generate an eclectic mix of books referred to; I always like to make a note of what crops up. So at The New Institute this week, the focus was Colin Mayer’s important recent book Capitalism and Crises, which centres on his proposal that profit maximisation should continue to be the aim of private companies, but that profit must be redefined to be net of the cost of any damages inflicted on the rest of society (including workers, but not rival businesses in competitive markets). He argues that UK corporate law is already capable of this interpretation.
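In symbols – my own notation, not Mayer’s – the proposal amounts to asking companies to maximise something like

    \pi_{\text{net}} = R - C_{\text{private}} - D_{\text{third-party}}

where R is revenue, C_{\text{private}} the firm’s own costs, and D_{\text{third-party}} the cost of remedying the damage imposed on the rest of society, with losses inflicted on rivals through ordinary competition excluded from D.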

The other books mentioned in the rest of the talks make up a pretty good capitalism reading list – although I might have missed a few:

The End of Enlightenment by Richard Whatmore

Sense, Nonsense and Subjectivity by Markus Gabriel

Capitalism and the Market Economy by Jonathan McMillan

Stabilizing an Unstable Economy by Hyman Minsky

The Ethics of Capitalism by Daniel Halliday and John Thrasher

Noise by Daniel Kahneman

A Moral Political Economy by Federica Carugati and Margaret Levi

The Corporation in the 21st Century (forthcoming) by John Kay

The Coming of Managerial Capitalism by Alfred Chandler

Competition Overdose by Ariel Ezrachi and Maurice Stucke

Ecological Economics and the Ecology of Economics by Herman Daly

Ökoliberal by Philipp Krohn

Governing the Commons by Elinor Ostrom

The public option

I’m on my way to a workshop at The New Institute in Hamburg, where I will talk about the scope for a public option in (especially) digital markets. As preparation, I’ve read a recent short (and moderately technical) book by Joanna Poyago-Theotoky surveying the literature on ‘mixed oligopoly’: oligopolistic markets with a mixture of private and public provision, where the public competitor has a broader objective function than profit maximisation – social welfare broadly, or ESG motivations, or universal service obligations. The basic idea is that because it has a different objective function, the public provider’s presence acts as a form of regulation: private firms will choose a lower price/higher quantity, or will opt to compete on a different level of quality.
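To see the price/quantity channel in the simplest possible setting, here is a minimal back-of-the-envelope sketch in Python – my own illustrative linear-demand example with made-up numbers, not a model taken from the book – comparing a mixed duopoly (one welfare-maximising public firm, one profit-maximising private firm) with an ordinary private Cournot duopoly:

    # Minimal mixed-duopoly illustration (my own assumptions, not the book's model):
    # linear inverse demand P = a - (q0 + q1), constant marginal costs,
    # firm 0 a welfare-maximising public firm, firm 1 a profit-maximising private firm.

    a = 10.0   # demand intercept (illustrative)
    c0 = 3.0   # public firm's marginal cost (assumed less efficient)
    c1 = 2.0   # private firm's marginal cost

    # Mixed duopoly: the public firm expands output until price equals its marginal
    # cost (P = c0), while the private firm best-responds with q1 = (a - c1 - q0) / 2.
    # Solving the two conditions together gives:
    q1_mixed = c0 - c1
    q0_mixed = a - 2 * c0 + c1
    p_mixed = a - (q0_mixed + q1_mixed)

    # Ordinary private Cournot duopoly with the same demand and costs, for comparison:
    q0_cournot = (a - 2 * c0 + c1) / 3
    q1_cournot = (a - 2 * c1 + c0) / 3
    p_cournot = a - (q0_cournot + q1_cournot)

    print(f"Mixed duopoly:   total output {q0_mixed + q1_mixed:.1f}, price {p_mixed:.1f}")
    print(f"Private Cournot: total output {q0_cournot + q1_cournot:.1f}, price {p_cournot:.1f}")

With these arbitrary numbers, total output rises from 5 to 7 and the price falls from 5 to 3 once the public firm is in the market – the regulatory effect of its broader objective function that the literature formalises. The quality results, as the next paragraph notes, are less clear-cut.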

I’m most interested in the quality dimension, where the formal results can go both ways. Public firms can either decide to offer a ‘basic’ package to deliver universal service, or can offer a higher-quality package than the private sector. Think of state schools versus private schools with their great sports fields and additional subjects in the former case, or public broadcasters ensuring the provision of children’s or religious programmes in the latter. Given the concentration in digital markets, and the limited tools governments other than the US and China have to affect the behaviour of Big Tech, provision of a public option in some domains is worth thinking about.

The book, Mixed Oligopoly and Public Enterprises, is a very nice survey of and introduction to the mixed oligopoly literature, much of it focused on price and quantity decisions and the optimal mix of private and public provision, but covering some more recent work on issues like R&D, quality and ESG standards. It ends by outlining a fascinating research agenda – introducing questions about the motivation of employees, and even wider objectives such as creating jobs and reducing inequality.
