Economics Textbooks

Steven Clarke asked in a comment on an earlier post about recommendations for an intro economics textbook. Here are my thoughts. Others might want to add their comments.

He has started reading Samuelson, which is a good choice – a classic – so carrying on with it is fine.

The current market leader is Greg Mankiw’s [amazon_link id=”0030259517″ target=”_blank” ]Principles of Economics[/amazon_link] – apparently it has 70% of the US market.

Others often used in UK undergraduate courses are:
Sloman – [amazon_link id=”0273763121″ target=”_blank” ]Economics[/amazon_link]
Lipsey and Chrystal – [amazon_link id=”0199286418″ target=”_blank” ]Economics[/amazon_link]
Begg – [amazon_link id=”0077107756″ target=”_blank” ]Economics[/amazon_link]

There is a Creative Commons download of an intro textbook by Preston McAfee – I haven’t read it but it looks pretty encouraging and is free, so it might be worth looking at for comparison.
http://www.mcafee.cc/Introecon/

The trouble with the textbooks in economics is that they contain things we economists know to be wrong. The brilliant John Sutton at LSE has long complained about this, and Steve Keen made much of it in [amazon_link id=”1848139926″ target=”_blank” ]Debunking Economics[/amazon_link]. So it is also worth reading some of the more popular books that don’t cover the technical grounding but do explain what economists actually do. My favourite is Tim Harford’s [amazon_link id=”0349119856″ target=”_blank” ]The Undercover Economist[/amazon_link]. My own book Sex, Drugs and Economics (pdf) is fabulous, and free, but a bit out of date. I’m not so keen on [amazon_link id=”0141019018″ target=”_blank” ]Freakonomics[/amazon_link] – it’s a good read but takes to a gimmicky extreme the Becker school of economics that says you can analyse decisions to commit crime or have babies or get married just like decisions to buy a new pair of shoes. There is some insight in this but it’s not the whole story.

[amazon_image id=”0349119856″ link=”true” target=”_blank” size=”medium” ]The Undercover Economist[/amazon_image]

Finally, Geoff Riley of Tutor2u has a great booklist of enrichment reading specifically for economics students.

Positive, normative and provocative economics

Last night it was my privilege to give the annual Pro Bono Economics lecture. I’d be delighted to hear people’s comments on it. (It would be even more pleasing if you’d look at the website and consider making a donation to their work.)

Many people in the audience have been enthusiastic, but one macroeconomist has taken great offence at my criticism of macro. I daresay I was too provocative – Dave Ramsden of the Treasury, chairing the evening, diplomatically described it as ‘challenging’ – but it does simply amaze me that so many (but not all) macroeconomists don’t think anything much needs to change in their area. Anyway, views welcome.

In the chat afterwards, somebody recommended to me [amazon_link id=”0521033888″ target=”_blank” ]Rational Economic Man[/amazon_link] by Martin Hollis and Edward Nell. The blurb says:

“Economics is probably the most subtle, precise and powerful of the social sciences and its theories have deep philosophical import. Yet the dominant alliance between economics and philosophy has long been cheerfully simple. This is the textbook alliance of neo-Classicism and Positivism, so crucial to the defence of orthodox economics against by now familiar objections. This is an unusual book and a deliberately controversial one. The authors cast doubt on assumptions which neo-Classicists often find too obvious to defend or, indeed, to mention. They set out to disturb an influential consensus and to champion an unpopular cause. Although they go deeper into both philosophy and economics than is usual in interdisciplinary works, they start from first principles and the text is provokingly clear. This will be a stimulating book for all economic theorists and philosophers interested in the philosophy of science and social science.”

[amazon_image id=”0521033888″ link=”true” target=”_blank” size=”medium” ]Rational Economic Man[/amazon_image]

I’d like it to have been a bit more specific about the authors’ doubts, but it sounds intriguing.

Richard Davies of The Economist (@RD_Economist on Twitter) has recommended [amazon_link id=”0631194355″ target=”_blank” ]Three Methods of Ethics[/amazon_link] by Marcia Baron et al.

[amazon_image id=”0631194355″ link=”true” target=”_blank” size=”medium” ]Three Methods of Ethics: A Debate (Great Debates in Philosophy)[/amazon_image]

I can see I’m going to have to improve my philosophy to continue in the vein of the Pro Bono lecture.

UPDATE: Paul Kelleher (@kelleher_) recommends [amazon_link id=”041588117X” target=”_blank” ]Philosophy of Economics[/amazon_link] by Julian Reiss.

[amazon_image id=”041588117X” link=”true” target=”_blank” size=”medium” ]Philosophy of Economics: A Contemporary Introduction (Routledge Contemporary Introductions to Philosophy)[/amazon_image]

The Scarlet Letter for economists

An econometrics paper that can make you laugh? Yes, Ed Leamer, famously the author of a 1983 paper, Let’s Take the Con Out of Econometrics (pdf), has a superb 2010 article in the Journal of Economic Perspectives, Tantalus on the Road to Asymptopia – it’s free to access, only moderately technical, and brilliant.

Leamer’s theme in the more recent paper is the same as in the earlier one, namely the need for a profound culture change in empirical economics:

“Can we economists agree that it is extremely hard work to squeeze truths from our data sets and what we genuinely understand will remain uncomfortably limited? We need words in our methodological vocabulary to express the limits. We need sensitivity analyses to make those limits transparent. Those who think otherwise should be required to wear a scarlet-letter O around their necks, for “overconfidence.””

The point is that the available economic data will always support a range of different theories, and Leamer advocates sensitivity analyses that illustrate the spectrum of parameter values and theories consistent with observed data. Economists need to go back to 1921, he argues, and read Keynes’s [amazon_link id=”B0080K73L6″ target=”_blank” ]Treatise on Probability[/amazon_link] and Frank Knight’s [amazon_link id=”0486447758″ target=”_blank” ]Risk, Uncertainty and Profit[/amazon_link]. Both books point out that decisions are subject to three-valued logic (yes, no, don’t know) whereas economic theory assumes away the large territory of don’t know.
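To make the idea concrete, here is a minimal sketch of a Leamer-style sensitivity analysis in Python – re-estimating a coefficient of interest under every combination of control variables and reporting the range of values it spans. The data and variable names below are entirely hypothetical, just to illustrate the mechanics rather than any particular study.

```python
# A minimal sketch of the kind of sensitivity (extreme bounds) analysis Leamer
# advocates: re-run the regression with every combination of controls and
# report the range the coefficient of interest spans.
# All data and variable names are hypothetical, for illustration only.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
controls = {name: rng.normal(size=n) for name in ["inflation", "openness", "schooling"]}
x = rng.normal(size=n)                                            # variable of interest
y = 0.5 * x + 0.3 * controls["schooling"] + rng.normal(size=n)    # 'true' effect is 0.5

estimates = []
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        cols = [np.ones(n), x] + [controls[c] for c in subset]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append((subset, beta[1]))                       # coefficient on x

lo = min(b for _, b in estimates)
hi = max(b for _, b in estimates)
print(f"Coefficient on x ranges from {lo:.3f} to {hi:.3f} across specifications")
```

If the coefficient flips sign or swings wildly across specifications, that is exactly the fragility Leamer wants reported openly rather than hidden behind a single preferred regression.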

I strongly agree with Leamer’s conclusions:

“Ignorance is a formidable foe, and to have hope of even modest victories, we economists need to use every resource and every weapon we can muster, including thought experiments (theory), and the analysis of data from nonexperiments, accidental experiments, and designed experiments. We should be celebrating the small genuine victories of the economists who use their tools most effectively, and we should dial back our adoration of those who can carry the biggest and brightest and least-understood weapons. We would benefit from some serious humility, and from burning our “Mission Accomplished” banners. It’s never gonna happen.”

He, like me, is profoundly sceptical about macroeconomics: “Our understanding of causal effects in macroeconomics is virtually nil, and will remain so.”

I must go away and read Leamer’s 2009 book, [amazon_link id=”364207975X” target=”_blank” ]Macroeconomic Patterns and Stories[/amazon_link].

[amazon_image id=”364207975X” link=”true” target=”_blank” size=”medium” ]Macroeconomic Patterns and Stories[/amazon_image]

From dead-end to dynamism

I’ve been reading an interesting, and non-technical, overview of complexity theory as applied to economics, [amazon_link id=”1781951969″ target=”_blank” ]The Rediscovery of Classical Economics: Adaptation, Complexity and Growth[/amazon_link] by David Simpson. As the title indicates, the book looks through the complexity lens – the economy as an evolving, self-organising system – at classical (as distinct from neoclassical) economics and at ‘Austrian’ business cycle theory. By classical he means not the specific body of thought of the 19th century, but rather the general perspective on the economy as dynamic and in disequilibrium.

[amazon_image id=”1781951969″ link=”true” target=”_blank” size=”medium” ]The Rediscovery of Classical Economics: Adaptation, Complexity and Growth (New Thinking in Political Economy Series)[/amazon_image]

The introduction states:

“Equilibrium theory has focused the attention of academic economists on issues surrounding the efficient allocation of a given set of resources among a number of competing uses at a single moment in time. While such questions have engaged the best brains of at least two generations in a number of intellectual conundrums, it has diverted them from an analysis of those features of a market economy that have impressed themselves on human history.”

Or in other words, the mainstream of economics, with its focus on the moment of equilibrium, along with the assumption of a common stock of knowledge and rational choice, has bypassed the most striking characteristics of actual economies – growth, uncertainty, and human unpredictability.

There are a few other books that serve as good introductions to complexity in economics, such as Paul Ormerod’s very accessible [amazon_link id=”0571197264″ target=”_blank” ]Butterfly Economics[/amazon_link] and Alan Kirman’s [amazon_link id=”0415594243″ target=”_blank” ]Complex Economics[/amazon_link]. The contribution of David Simpson’s book is to link the tools of complexity thinking to a particular tradition in economic thought that has always emphasised growth, uncertainty and the problem of knowledge. Even the thickest-skinned of mainstream economists is probably aware of Hayek’s work on knowledge, or rather the impossibility of knowing everything, and of [amazon_link id=”0415567890″ target=”_blank” ]Schumpeter’s ‘creative destruction'[/amazon_link]. One chapter quotes Paul Krugman saying: “Is the economy a self-organising system? Of course it is!” The problem is that most economists, even if they acknowledge the general point, haven’t been doing that kind of economics – Krugman’s macroeconomics is back-to-the-sixties Keynesian analysis, and happens in a different part of his brain from his 1996 book [amazon_link id=”1557866988″ target=”_blank” ]The Self-Organizing Economy[/amazon_link].

[amazon_image id=”1557866996″ link=”true” target=”_blank” size=”medium” ]The Self Organizing Economy[/amazon_image]

Simpson brings in Austrian economics to consider the business cycle and particularly the present recession. I’m not at all familiar with the Austrian approach, so can’t really evaluate how well it fits into the complexity/uncertainty framework; but the narrative here emphasises the role of technology as the initial trigger, credit expansion in the boom, and the financial causes of the crisis. Simpson writes: “The business cycle carries many of the characteristic signatures of a complex process.” The economy self-organises then self-disorganises.

He concludes: “The marginal revolution of the last quarter of the 19th century had focused attention on the theory of value at the expense of the theory of growth. …. The end result of assuming away so many important aspects of reality is that the theory is not operational. It is impossible to relate equilibrium theory to the empirical processes of an actual market economy. …. Equilibrium theory has reached a dead end.”

My sense is that mainstream economists were already waking up to the irrelevance of much of the post-war work in the subject even before the Crisis. The collapse of the communist regimes, the dramatic impact of technology and globalisation, the steady adoption of behavioural research were all contributing to a shift in the mainstream back towards reality. Events since 2008 have accelerated the move, for all that many economists remain in denial, to the point that curriculum reform is now well under way, as I’ve noted before.

David Simpson’s book is a clear and readable introduction to complexity in the specific context of the history of economic thought, and can thus fill a gap in far too many economists’ knowledge of their own subject. It’s unfortunately a pricey Edward Elgar book, so most readers will need to order it from their library.

How not to do economic forecasts

Every so often I come across a book that should be read by: (a) all economists; (b) all students; (c) everybody involved in politics and policy; and (d) everybody else with an interest in the world. Nate Silver’s [amazon_link id=”0141975652″ target=”_blank” ]The Signal and The Noise: The Art And Science of Prediction[/amazon_link] – which I’ve finally read shamefully long after publication – is one of those books. It should in fact be read by all macroeconomists who publish forecasts at least annually, as a condition of their continuing membership of the profession. If I were teaching, it would emphatically be a required item on the course reading list.

It is a wonderfully clear and engaging explanation of the challenges of making predictions in fields ranging from politics and elections to weather and earthquakes to economics and poker. Apart from a couple of sections on American sports, which might as well have been written in a foreign language, the examples illustrate how to, and how not to, make forecasts. You’ll be wiser for reading it, not to mention able to put Bayesian inference into practice. Silver makes a compelling case for adopting the Bayesian approach, rather than the standard (‘frequentist’) statistics descended from R. A. Fisher and universally taught to economists in their econometrics courses. The emerging new economics curricula should at least include Bayesian statistics in the modules covering empirical methods. As Silver writes:

“Essentially the frequentist approach toward statistics seeks to wash its hands of the reason that predictions most often go wrong: human error. It views uncertainty as something intrinsic to the experiment rather than something intrinsic to our ability to understand the real world.”

In other words, it is not true that collecting more and more data – although usually useful to a forecaster – will eliminate your uncertainty about the real world. The signal-noise problem is epistemologically unavoidable. What’s more, the frequentist approach involves assumptions about the distribution of the population; we know about the (in-)validity of the normal curve assumption, and anyway, “What ‘sample population’ was the September 11 attack drawn from?”
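For anyone wondering what ‘putting Bayesian inference into practice’ actually looks like, here is a minimal sketch using the standard beta-binomial model – not an example from Silver’s book, and the prior and data are invented – showing how a prior belief gets updated by new evidence.

```python
# A minimal sketch of Bayesian updating with the standard beta-binomial model.
# The prior and the 'evidence' below are made up purely for illustration.
from scipy import stats

prior = stats.beta(2, 2)            # prior belief about an unknown probability p
successes, trials = 7, 10           # hypothetical new evidence

# Beta prior + binomial likelihood gives a Beta posterior (conjugacy)
posterior = stats.beta(2 + successes, 2 + (trials - successes))

print(f"prior mean:     {prior.mean():.3f}")
print(f"posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```

The attraction is that the output is a full distribution over what you believe, which can be revised again as each new piece of evidence arrives, rather than a single point estimate dressed up with spurious certainty.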

The chapter on macroeconomic forecasting is a pretty devastating critique of economists who do that kind of thing. There is a demand for macro forecasts, and I’d rather economists supply them than anybody else. But we shouldn’t pretend they’re going to be accurate. Almost all forecasters, even if they publish standard errors, will give the impression of precision – is growth going to be 0.5% or 0.6%? – but it is spurious precision. Silver calculates that over the period 1993-2010, US GDP growth fell outside the 90% confidence intervals of macro forecasts a third of the time, and half the time if you look back to 1968.

Macroeconomic data are very noisy, especially early estimates of GDP: in the US the margin of error on the initial quarterly estimate of GDP is plus or minus 4.3%. The initial estimate for the final quarter of 2008 was a decline of 3.8% – later revised to minus 9 per cent. Silver makes the comparison between economic forecasts and weather forecasts, similarly difficult problems. However, weather forecasting has improved over the decades, thanks to a better understanding of the causal links and a greater degree of disaggregation of data, made possible by more powerful computers. Economists have neither the improved understanding – on the contrary, important causal links, notably finance, were ignored until recently – nor, it seems, the appetite for better data (as I’ve pointed out before).

The book also makes the point that others (like [amazon_link id=”0465053564″ target=”_blank” ]Paul Ormerod[/amazon_link]) have emphasised: the economy is a complex non-linear system, so there is a lot of unavoidable uncertainty about forecasts more than a short period ahead. It also notes that although we know about the Lucas Critique and Goodhart’s Law – both pointing out that policy affects behaviour – economic forecasters typically ignore them in practice. Silver also underlines the rarely-resisted temptation to overfit the data – and microeconomists are just as guilty as macroeconomists here. The temptation is strong because an over-fitted model will seem to ‘explain’ more than a ‘true’ model when the data are noisy, so the usual tests for good fit will look better. [amazon_link id=”0472050079″ target=”_blank” ]Deirdre McCloskey and Stephen Ziliak[/amazon_link] have been pointing out the siren allure of ‘statistical significance’ for ages – it has almost nothing to do with economic meaning – and perhaps The Signal and the Noise will help broadcast the message further.
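The overfitting point is easy to demonstrate for yourself. Below is a small simulated example (nothing to do with Silver’s own material): a high-order polynomial ‘explains’ a noisy linear relationship better in-sample than the true model does, but will generally do worse on fresh data.

```python
# A small illustration of overfitting: a degree-9 polynomial fits noisy data
# from a linear process better in-sample than the true linear model, but
# tends to predict worse out of sample. Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 20)
x_test = np.linspace(0, 1, 200)
true_f = lambda x: 2.0 * x                          # the 'true' model is linear
y_train = true_f(x_train) + rng.normal(scale=0.5, size=x_train.size)
y_test = true_f(x_test) + rng.normal(scale=0.5, size=x_test.size)

def r_squared(y, yhat):
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    in_sample = r_squared(y_train, np.polyval(coeffs, x_train))
    out_sample = r_squared(y_test, np.polyval(coeffs, x_test))
    print(f"degree {degree}: in-sample R^2 = {in_sample:.2f}, "
          f"out-of-sample R^2 = {out_sample:.2f}")
```

The in-sample fit statistics reward the over-fitted model; only testing against data the model has never seen reveals that the extra ‘explanatory power’ was just noise being memorised.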

Finally, I learned a lot from the book. The chapter on how to approach the question of CO2 emissions and climate change is a model of clear thinking. My favourite new fact: members of Congress – with access to lots of company information via lobbyists and the ability to influence companies’ fortunes by legislation – earn returns on their investments that beat the market average by 5 to 10 per cent a year, “a remarkable rate that would make even Bernie Madoff blush,” as Silver observes.

Anyway, if you haven’t yet read this, go and do so now. The new UK paperback also has a wonderful cover image!

[amazon_image id=”B0097JYVAU” link=”true” target=”_blank” size=”medium” ]The Signal and the Noise: The Art and Science of Prediction[/amazon_image]

Update: Dan Davies (@dsquareddigest) has gently rebuked me for the paragraph about Bayesian versus frequentist statistics. Via Twitter, he said: “Silver has a really annoying misintepretation of Bayesian vs frequentist which is now becoming commonplace… the paragraph you quote is really confused – NS is a good practical statistician but all over the place on theory & methodology. The (otherwise excellent) book gains nothing from taking a very strong position in someone else’s philosophical debate.” Needless to say, I know less than Dan about this debate. This doesn’t change my mind that econ students should be taught the Bayesian approach too, nor the conclusion that the book clearly explains how to do it in practice.