How not to do economic forecasts

Every so often I come across a book that should be read by: (a) all economists; (b) all students; (c) everybody involved in politics and policy; and (d) everybody else with an interest in the world. Nate Silver’s [amazon_link id=”0141975652″ target=”_blank” ]The Signal and The Noise: The Art And Science of Prediction[/amazon_link] – which I’ve finally read shamefully long after publication – is one of those books. It should in fact be read by all macroeconomists who publish forecasts at least annually, as a condition of their continuing membership of the profession. If I were teaching, it would emphatically be a required item on the course reading list.

It is a wonderfully clear and engaging explanation of the challenges of making predictions in fields ranging from politics and elections to weather and earthquakes to economics and poker. Apart from a couple of sections on American sports, which might as well have been written in a foreign language, the examples illustrate how to, and how not to, make forecasts. You’ll be wiser for reading it, not to mention able to put Bayesian inference into practice. Silver makes a compelling case for adopting the Bayesian approach, rather than the standard (‘frequentist’) statistics descended from R. A. Fisher and universally taught to economists in their econometrics courses. The emerging new economics curricula should at least include Bayesian statistics in the modules covering empirical methods. As Silver writes:

“Essentially the frequentist approach toward statistics seeks to wash its hands of the reason that predictions most often go wrong: human error. It views uncertainty as something intrinsic to the experiment rather than something intrinsic to our ability to understand the real world.”

In other words, it is not true that collecting more and more data – although usually useful to a forecaster – will eliminate your uncertainty about the real world. The signal-noise problem is epistemologically unavoidable. What’s more, the frequentist approach involves assumptions about the distribution of the population; we know about the (in-)validity of the normal curve assumption, and anyway, “What ‘sample population’ was the September 11 attack drawn from?”
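For readers who want to see what “putting Bayesian inference into practice” amounts to, here is a minimal sketch of a Bayes’ rule update. The numbers are purely illustrative (they are not from the book): a prior belief is revised in the light of new evidence, rather than treated as a fixed property of an experiment.

```python
# Bayes' rule: revise a prior belief about a hypothesis H
# after observing some evidence. All numbers are illustrative.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior probability of H after seeing the evidence."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Start with a 5% prior that a recession is imminent, then observe
# an indicator that fires 80% of the time before recessions but
# only 10% of the time otherwise.
posterior = bayes_update(0.05, 0.80, 0.10)
print(round(posterior, 3))  # roughly 0.296
```

The evidence raises the probability from 5% to about 30% – a substantial revision, but far from certainty, which is exactly the spirit of Silver’s argument: forecasts should be probability statements that get updated, not point predictions.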

The chapter on macroeconomic forecasting is a pretty devastating critique of economists who do that kind of thing. There is a demand for macro forecasts, and I’d rather economists supply them than anybody else. But we shouldn’t pretend they’re going to be accurate. Almost all forecasters, even if they publish standard errors, give an impression of precision – is growth going to be 0.5% or 0.6%? – but it is false precision. Silver calculates that over the period 1993-2010, GDP growth fell outside the 90% confidence intervals of macro forecasts for the US economy a third of the time, and half the time if you look back to 1968.

Macroeconomic data are very noisy, especially early estimates of GDP: in the US the margin of error on the initial quarterly estimate of GDP is plus or minus 4.3%. The initial estimate for the final quarter of 2008 was a decline of 3.8% – later revised to minus 9 per cent. Silver makes the comparison between economic forecasts and weather forecasts, similarly difficult problems. However, weather forecasting has improved over the decades, thanks to a better understanding of the causal links and a greater degree of disaggregation of data, made possible by more powerful computers. Economists have neither the improved understanding – on the contrary, important causal links, notably finance, were ignored until recently – nor, seemingly, the appetite for better data (as I’ve pointed out before).

The book also makes the point, which others (like [amazon_link id=”0465053564″ target=”_blank” ]Paul Ormerod[/amazon_link]) have emphasised, that the economy is a complex non-linear system, so there is a lot of unavoidable uncertainty about forecasts more than a short period ahead. It also notes that although we know about the Lucas Critique and Goodhart’s Law – both pointing out that policy affects behaviour – economic forecasters typically ignore them in practice. Silver also underlines the rarely-resisted temptation to overfit the data – and microeconomists are just as guilty as macroeconomists here. The temptation is strong because an overfitted model will seem to ‘explain’ more than a ‘true’ model when the data are noisy, so the usual tests for goodness of fit will look better. [amazon_link id=”0472050079″ target=”_blank” ]Deirdre McCloskey and Stephen Ziliak[/amazon_link] have been pointing out the siren allure of ‘statistical significance’ for ages – it has almost nothing to do with economic meaning – and perhaps The Signal and the Noise will help broadcast the message further.
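The overfitting point can be made concrete with a deliberately extreme toy example (mine, not Silver’s, and with invented numbers): the ‘true’ process is a straight line plus noise, and the overfitted model simply memorises every training observation. The memorising model fits the sample perfectly, yet forecasts fresh data from the same process worse than the true line does.

```python
import random

# Toy illustration of overfitting: the 'true' process is y = 2x plus
# noise. A model that memorises the training sample 'explains' it
# perfectly, but does worse than the true line on fresh data.
random.seed(0)

def sample(n=200):
    return [(i / n, 2 * i / n + random.gauss(0, 0.3)) for i in range(n)]

train, test = sample(), sample()

def linear(x):           # the 'true' model
    return 2 * x

lookup = dict(train)     # the overfitted model: memorise every point
def memorise(x):
    return lookup.get(x, 0.0)

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

assert mse(memorise, train) < mse(linear, train)  # fits the sample 'better'
assert mse(memorise, test) > mse(linear, test)    # but forecasts worse
```

The in-sample fit statistics flatter the memorising model precisely because it has fitted the noise – which is the trap Silver describes for macro models estimated on short, noisy data series.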

Finally, I learned a lot from the book. The chapter on how to approach the question of CO2 emissions and climate change is a model of clear thinking. My favourite new fact: members of Congress – with access to lots of company information via lobbyists and the ability to influence companies’ fortunes by legislation – see a profit on their investments that beats the market averages by 5 to 10 per cent a year, “a remarkable rate that would make even Bernie Madoff blush,” as Silver observes.

Anyway, if you haven’t yet read this, go and do so now. The new UK paperback also has a wonderful cover image!

[amazon_image id=”B0097JYVAU” link=”true” target=”_blank” size=”medium” ]The Signal and the Noise: The Art and Science of Prediction[/amazon_image]

Update: Dan Davies (@dsquareddigest) has gently rebuked me for the paragraph about Bayesian versus frequentist statistics. Via Twitter, he said: “Silver has a really annoying misinterpretation of Bayesian vs frequentist which is now becoming commonplace… the paragraph you quote is really confused – NS is a good practical statistician but all over the place on theory & methodology. The (otherwise excellent) book gains nothing from taking a very strong position in someone else’s philosophical debate.” Needless to say, I know less than Dan about this debate. This doesn’t change my mind that econ students should be taught the Bayesian approach too, nor the conclusion that the book clearly explains how to do it in practice.

Reforming economics

An update on the process of reforming the undergraduate economics curriculum.

The story so far is that in February 2012 a conference on the subject of whether the curriculum at this level is appropriate in the light of the financial crisis, and of employers’ skill needs, was hosted by the Bank of England and Government Economic Service. The pre-and post-conference papers are collected in [amazon_link id=”1907994041″ target=”_blank” ]What’s The Use of Economics: Teaching the Dismal Science After the Crisis.[/amazon_link]

[amazon_image id=”1907994041″ link=”true” target=”_blank” size=”medium” ]What’s the Use of Economics?: Teaching the Dismal Science After the Crisis[/amazon_image]

A working group of academics and employers picked up the reins and drew up the statement published in the latest Royal Economic Society newsletter. There was real consensus about this, and some of the working group’s members are taking forward the conclusions in a discussion about subject benchmarks with the Quality Assurance Agency, which reviews the performance of higher education institutions.

In parallel, Wendy Carlin of UCL is starting work on developing a new curriculum and supporting teaching materials, to be freely available. The first stage of her work is being supported by INET. Her aim is to have a curriculum to pilot from the autumn of 2014.

This is progress beyond anything we dreamed of when I first started to contact people about the conference 18 months ago. I think the profession is really quite divided: there are many economists who don’t believe there’s all that much that needs to change. But it’s obvious that enough do now feel the need for intellectual reform that a real shift is under way.

Physics envy?

There’s a new book causing a stir in the physics community, Lee Smolin’s [amazon_link id=”1846142997″ target=”_blank” ]Time Reborn: From the Crisis in Physics to the Future of the Universe[/amazon_link]. It was reviewed at the weekend by, among others, Gillian Tett in the FT. She writes: “Smolin, who has worked for several decades at the cutting edge of cosmology, conducting research into areas such as quantum gravity and string theory, argues that it is a mistake to view scientific laws as universal. Rather, they are “path-dependent”, or a function of what occurred before.” As she points out, mid-20th century economics drew on physics, attracted by the scientific rigour of its discoveries. How wonderful it would be to derive precise ‘laws of economics’. Economists are notoriously charged with ‘physics envy’.

[amazon_image id=”1846142997″ link=”true” target=”_blank” size=”medium” ]Time Reborn: From the Crisis in Physics to the Future of the Universe[/amazon_image]

However, it would be physics envy to conclude that economics now needs historical context and path dependency just because a top physicist has written a book discovering the contingency, the historical specificity of events. In her review, Tett concludes: “Economies do not have a “natural balance”; nor do they operate according to timeless “rules”.” But many economists have been there for a while. Some – like the redoubtable Paul Ormerod in books like [amazon_link id=”0571220134″ target=”_blank” ]Why Most Things Fail[/amazon_link] – never stopped arguing that economics is an intrinsically disequilibrium subject, putting dynamic behaviour over time in specific contexts right at centre stage. Older works like Malinvaud’s [amazon_link id=”0470268832″ target=”_blank” ]Theory of Unemployment Reconsidered[/amazon_link], or, famously, Minsky’s [amazon_link id=”0071592997″ target=”_blank” ]Stabilizing an Unstable Economy[/amazon_link] were disequilibrium theories. Evolutionary economics, albeit always a minority sport, is inherently about dynamics.

I don’t think the concept of equilibrium should be wholly discarded; it can be a useful analytical tool to understand the dynamics of the economy. But on the whole, I don’t think economics needs another phase of physics envy. The subject is already well on the way to rediscovering the importance of time and place.

Economics, Star Trek and me

It’s time to come clean. I’ve been a lifelong Trekkie – never to the extent of attending fan conventions in Starfleet costume, but nevertheless devoted. Yesterday I went to see the new movie, Star Trek: Into Darkness, which is excellent. It sent me to a superb 2001 book on my shelves, [amazon_link id=”074562491X” target=”_blank” ]Star Trek: The Human Frontier[/amazon_link] by mother-and-son team Michele and Duncan Barrett. I love this book. They write: “We interpret Star Trek in a historical, cultural context. Much of its preoccupation lies in the nexus of questions about what we might shorthand as ‘modernity’ and ‘humanism’.”

[amazon_image id=”074562491X” link=”true” target=”_blank” size=”medium” ]Star Trek: The Human Frontier[/amazon_image]

They argue that the entire Star Trek oeuvre both embodies progressive 1960s politics and constitutes a reflection on what it means to be human. Over time, the simple rational and scientific optimism of the early series gave way to darker themes, explorations of fragmented identities, mental illness, religion and irrationalism, and the character of leadership. Not surprisingly, given the state of the world, the new movie continues in this more pessimistic vein, while ending with an upbeat reaffirmation of the original Enlightenment principles. It raises the pressing modern question of trust and distrust in authority. I thought the design of the movie was also fascinating – particularly the new versions of the Starfleet uniforms. The working uniforms are very similar to the 60s originals, but the dress uniforms in the ceremonial scenes on Earth are strikingly 1930s and conformist.

As anybody mildly interested will know, Benedict Cumberbatch stars as the bad guy, a rational, calculating genius with a strategic mind. I thought it was interesting that he should turn up in the role, given his characterisation as Sherlock – see my previous post on economists’ vision of ‘rational man’.

Finally, Spock has a simply brilliant line, which will appeal to all economists: “I’m a Vulcan; we embrace technicality.”

It should be the motto of all economists: Live long and prosper!

Another take on classics for economists

The admirable Noah Smith responded to my list of suggested classics for economists with a list of science fiction novels for economists. It’s an excellent list, and Paul Krugman responded enthusiastically – he says Isaac Asimov’s [amazon_link id=”0586010807″ target=”_blank” ]Foundation[/amazon_link] novels set him on the path to becoming an economist.

[amazon_image id=”0586010807″ link=”true” target=”_blank” size=”medium” ]Foundation (The Foundation Series)[/amazon_image]

Some years ago, I wrote a column in the Independent arguing that economics sees humans as either Star Trek’s Mr Spock or deductive geniuses like Hercule Poirot or Sherlock Holmes.

So the challenge now is for somebody to do the detective fiction for economists list.

[amazon_image id=”B00AHEMYEY” link=”true” target=”_blank” size=”medium” ]The Complete Sherlock Holmes[/amazon_image]

Interestingly, Benedict Cumberbatch has become a common thread between Sherlock and science fiction, with his role in the new Star Trek movie Into Darkness, which as an old Trekkie I can’t wait to see.

Natural born economist