From dead-end to dynamism

I’ve been reading an interesting, and non-technical, overview of complexity theory as applied to economics, [amazon_link id=”1781951969″ target=”_blank” ]The Rediscovery of Classical Economics: Adaptation, Complexity and Growth[/amazon_link] by David Simpson. As the title indicates, the book looks through the complexity lens – the economy as an evolving, self-organising system – at classical (as distinct from neoclassical) economics and at ‘Austrian’ business cycle theory. By classical he means not the specific body of thought of the 19th century, but rather the general perspective on the economy as dynamic and in disequilibrium.

[amazon_image id=”1781951969″ link=”true” target=”_blank” size=”medium” ]The Rediscovery of Classical Economics: Adaptation, Complexity and Growth (New Thinking in Political Economy Series)[/amazon_image]

The introduction states:

“Equilibrium theory has focused the attention of academic economists on issues surrounding the efficient allocation of a given set of resources among a number of competing uses at a single moment in time. While such questions have engaged the best brains of at least two generations in a number of intellectual conundrums, it has diverted them from an analysis of those features of a market economy that have impressed themselves on human history.”

Or in other words, the mainstream of economics, with its focus on the moment of equilibrium, along with the assumption of a common stock of knowledge and rational choice, has bypassed the most striking characteristics of actual economies – growth, uncertainty, and human unpredictability.

There are a few other books that serve as good introductions to complexity in economics, such as Paul Ormerod’s very accessible [amazon_link id=”0571197264″ target=”_blank” ]Butterfly Economics[/amazon_link] and Alan Kirman’s [amazon_link id=”0415594243″ target=”_blank” ]Complex Economics[/amazon_link]. The contribution of David Simpson’s book is to link the tools of complexity thinking to a particular tradition in economic thought that has always emphasised growth, uncertainty and the problem of knowledge. Even the thickest-skinned of mainstream economists is probably aware of Hayek’s work on knowledge, or rather the impossibility of knowing everything, and of [amazon_link id=”0415567890″ target=”_blank” ]Schumpeter’s ‘creative destruction'[/amazon_link]. One chapter quotes Paul Krugman saying: “Is the economy a self-organising system? Of course it is!” The problem is that most economists, even if they acknowledge the general point, haven’t been doing that kind of economics – Krugman’s macroeconomics is back-to-the-sixties Keynesian analysis, and happens in a different part of his brain from his 1996 book [amazon_link id=”1557866988″ target=”_blank” ]The Self-Organizing Economy[/amazon_link].

[amazon_image id=”1557866996″ link=”true” target=”_blank” size=”medium” ]The Self-Organizing Economy[/amazon_image]

Simpson brings in Austrian economics to consider the business cycle and particularly the present recession. I’m not at all familiar with the Austrian approach, so can’t really evaluate how well it fits into the complexity/uncertainty framework; but the narrative here emphasises the role of technology as the initial trigger, credit expansion in the boom, and the financial causes of the crisis. Simpson writes: “The business cycle carries many of the characteristic signatures of a complex process.” The economy self-organises then self-disorganises.

He concludes: “The marginal revolution of the last quarter of the 19th century had focused attention on the theory of value at the expense of the theory of growth. … The end result of assuming away so many important aspects of reality is that the theory is not operational. It is impossible to relate equilibrium theory to the empirical processes of an actual market economy. … Equilibrium theory has reached a dead end.”

My sense is that mainstream economists were already waking up to the irrelevance of much of the post-war work in the subject even before the Crisis. The collapse of the communist regimes, the dramatic impact of technology and globalisation, and the steady adoption of behavioural research were all contributing to a shift in the mainstream back towards reality. Events since 2008 have accelerated the move – for all that many economists remain in denial – to the point that curriculum reform is now well under way, as I’ve noted before.

David Simpson’s book is a clear and readable introduction to complexity in the specific context of the history of economic thought, and can thus fill a gap in far too many economists’ knowledge of their own subject. It’s unfortunately a pricey Edward Elgar book, so most readers will need to order it from their library.


Anti-decline

After this week’s trip to Liverpool, and being recommended the Tristram Hunt book [amazon_link id=”075381983X” target=”_blank” ]Building Jerusalem[/amazon_link], about Britain’s great Victorian cities, I turned back to Ed Glaeser’s excellent [amazon_link id=”0330458078″ target=”_blank” ]Triumph of the City[/amazon_link] to remind myself what he says about urban decline. The book is mainly about US cities, but he notes that Liverpool’s population has declined from 867,000 in 1937 to around half that figure. As in other industrial cities, the decline has been due to the changed structure of the economy, which set off a vicious circle of unemployment, de-skilling, disinvestment and poverty.

[amazon_image id=”0330458078″ link=”true” target=”_blank” size=”medium” ]Triumph of the City[/amazon_image]

As for what to do, Glaeser concludes the chapter: “The path back for declining industrial towns is long and hard. Over decades, they must undo the cursed legacy of big factories and heavy industry. They must return to their roots as places of small-scale entrepreneurship and commerce. Apart from investing in education and maintaining core public services with moderate taxes and regulations, governments can do little to speed the process.”

I’m less fatalistic than this about the scope for local government to encourage revival. It ignores the important role of urban leadership in shaping expectations and bringing about the kinds of connections between people that can spark creativity or prompt some investment. This is the kind of ‘soft’ government activity often overlooked by economists, but it’s important; we shouldn’t forget the vital role of expectations in determining growth. We do have to be patient – but these cities are certainly not lost causes. Even Detroit, which has suffered perhaps the greatest decline of all, seems to be turning the corner.

Heritage Liverpool – ripe for revival?

Northern glory

I’ve spent the past couple of days in Liverpool, a city almost as fine as my home city of Manchester. Whenever you walk around one of the great northern Victorian cities – either of those, or Glasgow, Belfast or Leeds – it’s impossible not to reflect on two things: (a) the confidence the civic elites of the 19th century must have felt to put up the huge and impressive buildings that still give these cities their character – my goodness they could build, and built to last; and (b) how different Britain must have been when economic dynamism was not concentrated in London. Britain is now one of the most centralised economies in the world, which is bad for the north, and bad for the south too.

Anyway, one of the other attendees recommended Tristram Hunt’s [amazon_link id=”075381983X” target=”_blank” ]Building Jerusalem: The Rise and Fall of the Victorian City[/amazon_link] – will have to order it!

[amazon_image id=”075381983X” link=”true” target=”_blank” size=”medium” ]Building Jerusalem: The Rise and Fall of the Victorian City[/amazon_image]

Economics vs politics

Malcolm Gladwell has written a very positive review of Jeremy Adelman’s biography of Albert Hirschman, [amazon_link id=”0691155674″ target=”_blank” ]Worldly Philosopher[/amazon_link]. The review also offers a superb thumbnail sketch of Hirschman’s character and philosophy. I’m looking forward to reading the book myself.

[amazon_image id=”0691155674″ link=”true” target=”_blank” size=”medium” ]Worldly Philosopher: The Odyssey of Albert O. Hirschman[/amazon_image]

Meanwhile it sent me back to thumb through [amazon_link id=”0674276604″ target=”_blank” ]Exit, Voice and Loyalty[/amazon_link], and I found this paragraph that speaks to my current preoccupations:

“Exit and voice, that is market and non-market forces, that is economic and political mechanisms, have been introduced as two principal actors of strictly equal rank and importance. … I hope to demonstrate to political scientists the usefulness of economic concepts and to economists the usefulness of political concepts. This reciprocity has been lacking in recent interdisciplinary work as economists have claimed that the concepts developed for the purpose of analyzing phenomena of scarcity and resource allocation can be successfully used for explaining political phenomena as diverse as power, democracy and nationalism. They have thus succeeded in occupying large portions of the neighbouring discipline.”

[amazon_image id=”0674276604″ link=”true” target=”_blank” size=”medium” ]Exit, Voice and Loyalty: Responses to Decline in Firms, Organizations and States[/amazon_image]

He hoped the book would restore the balance, but the succeeding decades were to be characterised by the primacy of market mechanisms in political decisions. Perhaps that’s just starting to change now.

How not to do economic forecasts

Every so often I come across a book that should be read by: (a) all economists; (b) all students; (c) everybody involved in politics and policy; and (d) everybody else with an interest in the world. Nate Silver’s [amazon_link id=”0141975652″ target=”_blank” ]The Signal and The Noise: The Art And Science of Prediction[/amazon_link] – which I’ve finally read, shamefully long after publication – is one of those books. It should in fact be read at least annually by all macroeconomists who publish forecasts, as a condition of their continuing membership of the profession. If I were teaching, it would emphatically be a required item on the course reading list.

It is a wonderfully clear and engaging explanation of the challenges of making predictions in fields ranging from politics and elections to weather and earthquakes to economics and poker. Apart from a couple of sections on American sports, which might as well have been written in a foreign language, the examples illustrate how to, and how not to, make forecasts. You’ll be wiser for reading it, not to mention able to put Bayesian inference into practice. Silver makes a compelling case for adopting the Bayesian approach, rather than the standard (‘frequentist’) statistics descended from R. A. Fisher and universally taught to economists in their econometrics courses. The emerging new economics curricula should at least include Bayesian statistics in the modules covering empirical methods. As Silver writes:

“Essentially the frequentist approach toward statistics seeks to wash its hands of the reason that predictions most often go wrong: human error. It views uncertainty as something intrinsic to the experiment rather than something intrinsic to our ability to understand the real world.”

In other words, it is not true that collecting more and more data – although usually useful to a forecaster – will eliminate your uncertainty about the real world. The signal-noise problem is epistemologically unavoidable. What’s more, the frequentist approach involves assumptions about the distribution of the population; we know about the (in-)validity of the normal curve assumption, and anyway, “What ‘sample population’ was the September 11 attack drawn from?”
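
For anyone who hasn’t seen it done, the Bayesian updating Silver advocates is a one-line calculation. Here is a minimal sketch in Python – the recession example and all the numbers are my own illustrative assumptions, not Silver’s – showing how a prior belief is revised as evidence accumulates:

```python
# A minimal sketch of Bayesian updating; the likelihoods below are
# illustrative assumptions, not figures from the book.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Hypothesis: the economy is entering recession.
prior = 0.10                 # belief before seeing any data
p_weak_if_recession = 0.70   # chance of a weak jobs report if a recession is starting
p_weak_otherwise = 0.20      # chance of a weak report in normal times

for month in (1, 2, 3):      # three weak reports in a row
    prior = bayes_update(prior, p_weak_if_recession, p_weak_otherwise)
    print(f"After weak report {month}: P(recession) = {prior:.2f}")
```

The virtue Silver stresses is that the prior is explicit: you state what you believed before the data arrived, and each new observation moves you from there.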

The chapter on macroeconomic forecasting is a pretty devastating critique of economists who do that kind of thing. There is a demand for macro forecasts, and I’d rather economists supply them than anybody else. But we shouldn’t pretend they’re going to be accurate. Almost all forecasters, even if they publish standard errors, will give the impression of precision – is growth going to be 0.5% or 0.6%? – but it is false precision. Silver calculates that over the period 1993-2010, US GDP growth fell outside the 90% confidence intervals of macro forecasts a third of the time, and half the time if you look back to 1968.
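
Silver’s calibration arithmetic is straightforward to replicate. A sketch with invented numbers (not his data): count how often the outcome lands outside the stated 90% interval – a well-calibrated forecaster should miss only about one time in ten.

```python
# Calibration check for interval forecasts; the numbers are invented
# for illustration, not taken from the book.

forecasts = [
    # (lower 90% bound, upper 90% bound, actual GDP growth) – all made up
    (1.2, 3.8, 4.1),
    (1.8, 4.2, 2.9),
    (0.2, 2.8, -0.4),
    (0.9, 3.1, 2.2),
    (1.5, 3.9, 1.0),
    (0.5, 2.9, 2.6),
]

misses = sum(1 for lo, hi, actual in forecasts if not lo <= actual <= hi)
share = misses / len(forecasts)
print(f"Outcome outside the 90% interval {misses}/{len(forecasts)} times ({share:.0%}); "
      f"a calibrated forecaster would miss about 10%.")
```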

Macroeconomic data are very noisy, especially early estimates of GDP: in the US the margin of error on the initial quarterly estimate of GDP growth is plus or minus 4.3%. The initial estimate for the final quarter of 2008 was a decline of 3.8% – later revised to a decline of 9%. Silver makes the comparison between economic forecasts and weather forecasts, similarly difficult problems. However, weather forecasting has improved over the decades, thanks to a better understanding of the causal links and a greater degree of disaggregation of data, made possible by more powerful computers. Economists have neither the improved understanding – on the contrary, important causal links, notably finance, were ignored until recently – nor, it seems, the appetite for better data (as I’ve pointed out before).

The book also makes a point that others (like [amazon_link id=”0465053564″ target=”_blank” ]Paul Ormerod[/amazon_link]) have emphasised: the economy is a complex non-linear system, so there is a lot of unavoidable uncertainty in forecasts more than a short period ahead. It also notes that although we know about the Lucas Critique and Goodhart’s Law – both pointing out that policy affects behaviour – economic forecasters typically ignore them in practice. Silver also underlines the rarely-resisted temptation to overfit the data – and microeconomists are just as guilty as macroeconomists here. The temptation is strong because an over-fitted model will seem to ‘explain’ more than a ‘true’ model when the data are noisy, so the usual tests of goodness of fit will look better. [amazon_link id=”0472050079″ target=”_blank” ]Deirdre McCloskey and Stephen Ziliak[/amazon_link] have been pointing out the siren allure of ‘statistical significance’ for ages – it has almost nothing to do with economic meaning – and perhaps The Signal and the Noise will help broadcast the message further.
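
The overfitting trap is easy to demonstrate in a few lines – this is my own illustration of the general point, not code from the book. Fit a straight line and a ninth-degree polynomial to the same noisy linear data: the wigglier model always wins in-sample, and typically loses on fresh data from the same process.

```python
# Overfitting in miniature: the true process is linear plus noise.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2 * x + rng.normal(0, 0.3, size=x.size)              # training data
x_new = np.linspace(0, 1, 20)
y_new = 2 * x_new + rng.normal(0, 0.3, size=x_new.size)  # fresh data, same process

for degree in (1, 9):
    coeffs = np.polyfit(x, y, degree)                    # least-squares polynomial fit
    in_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    out_mse = np.mean((np.polyval(coeffs, x_new) - y_new) ** 2)
    print(f"degree {degree}: in-sample MSE {in_mse:.3f}, out-of-sample MSE {out_mse:.3f}")
```

The in-sample fit statistics necessarily improve with the extra parameters; it is the out-of-sample error that reveals the noise has been fitted, not the signal.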

Finally, I learned a lot from the book. The chapter on how to approach the question of CO2 emissions and climate change is a model of clear thinking. My favourite new fact: members of Congress – with access to lots of company information via lobbyists and the ability to influence companies’ fortunes by legislation – see a profit on their investments that beats the market averages by 5 to 10 per cent a year, “a remarkable rate that would make even Bernie Madoff blush,” as Silver observes.

Anyway, if you haven’t yet read this, go and do so now. The new UK paperback also has a wonderful cover image!

[amazon_image id=”B0097JYVAU” link=”true” target=”_blank” size=”medium” ]The Signal and the Noise: The Art and Science of Prediction[/amazon_image]

Update: Dan Davies (@dsquareddigest) has gently rebuked me for the paragraph about Bayesian versus frequentist statistics. Via Twitter, he said: “Silver has a really annoying misinterpretation of Bayesian vs frequentist which is now becoming commonplace… the paragraph you quote is really confused – NS is a good practical statistician but all over the place on theory & methodology. The (otherwise excellent) book gains nothing from taking a very strong position in someone else’s philosophical debate.” Needless to say, I know less than Dan about this debate. This doesn’t change my mind that econ students should be taught the Bayesian approach too, nor the conclusion that the book clearly explains how to do it in practice.