It’s always with great diffidence that I write about macroeconomics. Although I’m in good company in being sceptical about much of macro (see this roundup from Bruegel and this view from Noah Smith, for instance), I’m all too well aware of the limits of my knowledge. So with that warning, here’s what I made of Roger Farmer’s very interesting new book, Prosperity for All: How To Prevent Financial Crises.
The book starts with his critique of conventional DSGE models, both the real business cycle variety and the New Keynesian one. Farmer rightly dismisses the idea that it’s OK to mangle the data through a Hodrick-Prescott filter and ‘calibrate’ the model, as in the real business cycle approach. But he also has a chapter criticising the (now) more conventional New Keynesian models. He writes, “Macroeconomics has taken the wrong path. The error has nothing to do with classical versus New Keynesian approaches. It is a more fundamental error that pervades both.” That error is the idea that there is an ultimate full-employment equilibrium. Echoing Paul Romer‘s recent broadside, he describes both as phlogiston theory. Echoing a number of others, he describes New Keynesian models as being like Ptolemaic astronomy, adding more and more complexity in a desperate and degenerate attempt to keep the theory roughly aligned with the evidence.
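To make concrete what that Hodrick-Prescott ‘mangling’ involves, here is a minimal sketch in Python – purely illustrative, on a simulated GDP series, and nothing from the book itself:

```python
# Illustrative sketch of the HP-filter detrending step used in the RBC tradition.
# Assumes statsmodels is installed; the 'GDP' series below is simulated, not real data.
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(0)
t = np.arange(200)  # 50 years of quarterly observations
shocks = 0.5 * rng.normal(0, 0.01, size=200).cumsum()
log_gdp = pd.Series(9.0 + 0.005 * t + shocks, name="log_gdp")

# lamb=1600 is the conventional smoothing parameter for quarterly data
cycle, trend = hpfilter(log_gdp, lamb=1600)

# The 'cycle' component is what a calibrated RBC model is then asked to match;
# the trend is set aside. Farmer's complaint is that this filtering step largely
# predetermines what the model is allowed to explain.
print(cycle.std(), trend.diff().mean())
```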
The book demolishes the idea that there is a natural rate of unemployment (and thus also the idea of the output gap). Farmer argues that there are many possible equilibrium outcomes, and that in many of them unemployment persists. His alternative model – also a DSGE approach – replaces the New Keynesian Phillips Curve with a ‘belief function’, which assumes that beliefs about future expected (nominal) output growth equal current realized growth.
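In stylized notation (mine, not the book’s exact symbols): if $x_t$ is realized nominal output growth between $t-1$ and $t$, the belief function amounts to

$$\mathbb{E}_t[x_{t+1}] = x_t,$$

so expected future nominal growth simply tracks whatever growth has just been observed, and this expectational rule does the work in the model that the Phillips Curve does in the New Keynesian version.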
It seems obvious to me that this approach is preferable to the New Keynesian and certainly to the RBC models, although the obviousness is rooted in my intuition: of course unemployment can persist and there can be multiple equilibria, duh! However, some things about it are less appealing. Above all, there is Farmer’s assumption that consumption is a function of wealth, not income: he replaces the conventional consumption function with one more purely related to the Permanent Income Hypothesis. This troubles me, not because I disagree with his view that the conventional Keynesian consumption function and multiplier are inconsistent with the macro data, but because I thought the Permanent Income Hypothesis also sat badly with the data, as it implies more consumption smoothing than is observed. It also seems implausible that many people look very far ahead in determining their consumption, even if they do notice and respond to asset price changes. Besides, most people have low net wealth – indeed, 26% of Britons have negative net financial wealth.
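In stylized terms (again my notation, not Farmer’s), the contrast is between a textbook Keynesian consumption function, roughly

$$C_t = a + b\,Y_t,$$

with consumption driven by current income, and a wealth-based one closer to

$$C_t = k\,W_t,$$

where $W_t$ is household wealth including the discounted value of expected future income. It is that forward-looking $W_t$ that seems to me to ask a lot of households, most of whom hold little or no net wealth.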
As the book points out, this change of assumption, and the role of the belief function, have strong policy implications. The debate about austerity, which is in effect an argument about the size of the multiplier (which we don’t know), is a distraction: higher government deficits will not shift the economy to a lower-unemployment equilibrium. Instead, Farmer advocates using the composition of the central bank’s balance sheet to manage asset markets, and thus the level of output and beliefs, through wealth effects (rather than including asset price inflation in an inflation target, as some have advocated). Balance sheet policies are effective because financial markets are incomplete: future generations cannot trade now, and parents cannot leave negative bequests.
This is a policy debate I’ll be leaving to those who are more knowledgeable than me – although of course those who know what they’re talking about will also have their own horses they’re backing in the race. It is striking that these macro debates rage around the same small amount of data, which is insufficient to identify any of the models people are battling over. In his excellent book about forecasting, The Signal and the Noise, Nate Silver pointed out that weather forecasters reacted to weaknesses in their models and forecasts by gathering many more data points – more observation stations, more frequent collection. I see macroeconomists downloading the same data over and over again, and ignoring the kinds of data issues (such as the effects of temporal aggregation) that time series econometricians like David Giles point out. So something like David Hendry’s model-free approach, as set out in his recent paper, seems the best we can do for now.
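On the temporal aggregation point, here is a small simulated illustration in Python of the kind of pitfall I mean (made-up data, not any particular macro series): averaging a high-frequency random walk over longer periods induces serial correlation in the resulting growth rates even though the underlying shocks are pure noise.

```python
# Minimal sketch of one temporal-aggregation effect: time-averaging a random walk
# and then differencing produces autocorrelated 'growth rates' even though the
# underlying high-frequency changes are i.i.d. Simulated data only.
import numpy as np

rng = np.random.default_rng(1)
days_per_month, n_months = 22, 5000
x = rng.normal(size=days_per_month * n_months).cumsum()        # daily random walk
monthly_avg = x.reshape(n_months, days_per_month).mean(axis=1)  # monthly averages
growth = np.diff(monthly_avg)                                    # monthly 'growth'

# First-order autocorrelation of the aggregated growth rates
rho1 = np.corrcoef(growth[:-1], growth[1:])[0, 1]
print(round(rho1, 3))   # comes out close to 0.25, despite i.i.d. daily shocks
```

Estimating dynamic relationships on quarterly or annual aggregates while ignoring effects like this is exactly the sort of issue Giles highlights.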
My reservations should not stop anybody from reading Prosperity for All. It is an accessible read – undergraduate and maybe even A level students could cope with this – and the model is available at www.rogerfarmer.com. I’d like to have seen some discussion of non-US data, and also structural change/non-stationarity. It’s also a short book, and as a macro-ignoramus I’d have liked some sections to be explained more fully. But these are quibbles. This is an important contribution to the debate about macroeconomics, and it’s an important debate because this is what most citizens think economics is all about, and macro policy has a profound effect on their lives.
I’m keen to read other reviews of the book now. I’m sure Roger’s approach is more right than the conventional DSGE one – but I also think the how-to-do-macro debate is far from settled. How open-minded are more conventional macroeconomists?