Building new institutions

My ticket to hear Elinor Ostrom give the Hayek Memorial Lecture at the IEA dropped through the letterbox today, prompting me to have a quick look at her marvellous book [amazon_link id=”0521405998″ target=”_blank” ]Governing the Commons: The Evolution of Institutions for Collective Action[/amazon_link]. The question of institutional innovation, as part of the wide range of responses needed to address inequality and the erosion of social capital, has been on my mind – it was one of the issues in my recent Joseph Rowntree Foundation/University of York lecture (docx file – this is the draft; the final version will be on the JRF website soon, and the hour-long video is already there). Dramatic technological innovations have undermined business as usual in politics and social engagement as much as in business itself, but we have not seen much institutional adaptation. It will be essential – just as social innovation in the form of mutuals, unions, co-ops, libraries, museums, the expansion of education and so on constituted an essential response to the tech-driven social and economic dislocations of the 19th century.

Anyway, Ostrom’s work has looked at developing countries, but the criteria she sets out for the design of successful institutions are highly relevant to our own situation. The key question she addresses is exactly the one we face when trust is being corroded by inequality and many people, from bankers to rioters, are seeing what they can get away with: “How a group of principals who are in an interdependent situation can organize and govern themselves to obtain continuing joint benefits when all face temptations to free-ride, shirk, or otherwise act opportunistically.”

I don’t know if the lecture will be filmed, but I will tweet with the hashtag #ostromiea from 6.45 on 29th March.

[amazon_image id=”0521405998″ link=”true” target=”_blank” size=”medium” ]Governing the Commons: The Evolution of Institutions for Collective Action (Political Economy of Institutions and Decisions)[/amazon_image]

The thinking organization?

This morning I picked up Herbert Simon’s [amazon_link id=”0684835827″ target=”_blank” ]Administrative Behavior: A Study of Decision-Making Processes in Administrative Organizations[/amazon_link]. I have the 4th edition of 1997 – the original was published in 1947. Between the 1st and 4th editions the computer and internet revolution happened, and so Simon added a chapter commenting on its implications. He famously pointed out that the proliferation of information increases the scarcity of attention: “The limit is not information but our capacity to attend to it.” (p226) This was indeed the subject of a fascinating workshop at the Toulouse School of Economics last September (pdf here – The Invisible Hand Meets the Invisible Gorilla).

He goes on to say that there is nothing new about drowning in information: “The information that nature presents to us is unimaginably redundant.” The challenge of today’s apparent flood of information is to find the appropriate ways of organizing and processing it. “As important as advances in hardware and software design will be advances in our understanding of human information processing – of thinking, problem solving and decision making.”

While we all admire the iPad 3 and the Raspberry Pi, this is surely worth bearing in mind. If I were at the start of my career today, I’d certainly be tempted by the glamour of coding, and would definitely still love statistics and data visualization, but perhaps cognitive science and information theory would win out.

[amazon_image id=”0684835827″ link=”true” target=”_blank” size=”medium” ]Administrative Behavior: A Study of Decision-making Processes in Administrative Organizations[/amazon_image]

On the fragility of things

Mediaeval manuscripts were copied onto parchment, the skin of (usually) sheep or goats scraped free of hair and bumps. Parchment turned out to be extremely durable but took a lot of time to prepare, and so it was re-used. Monks would wash and scrape away an original text and copy another on top. If the original ink was tenacious, traces of the earlier text can still be deciphered. These layered manuscripts are ‘palimpsests’ – ‘scraped again’. This is one of many insights into the business of books and texts from classical times to the 15th century that I picked up from the marvellous [amazon_link id=”022407878X” target=”_blank” ]The Swerve[/amazon_link] by Stephen Greenblatt.

This image of the layered manuscript came to mind as I read on through The Swerve. The book is – on its surface layer – about a specific historical episode: the discovery in a remote German monastery, by the Italian papal bureaucrat (and obsessive text-hunter) Poggio Bracciolini, of De Rerum Natura ([amazon_link id=”0674992008″ target=”_blank” ]On The Nature of Things[/amazon_link]) by Lucretius, a long philosophical poem that had gone unread for nearly a millennium. Greenblatt argues that the subsequent recirculation of Lucretius’ long-lost book had a cumulatively decisive influence on the course of history. De Rerum Natura sets out a fundamentally modern view of the world: that matter consists of atoms, that humans are not central to the universe, that there is no life after death, that religion is a cruel delusion, that nature experiments and life as we know it is the result of that trial and error, and so on. Lucretius also argued that there are absolutely unpredictable ‘swerves’ in the movement of atoms, and that those minimal motions set off entirely new chains of events, with ultimately large consequences. Poggio Bracciolini’s 1417 discovery triggered one such ‘swerve’ in the course of history.

This is a cracking story, as told by Prof Greenblatt. But I see another layer underneath. In recounting the loss of scholarship, and the rise of Christian dogma, through the Dark Ages, and in setting out the random sequence of events that enabled the rediscovery of knowledge a thousand years later, the book offers a cautionary tale for our own times. Of the destruction of Alexandria as a great centre of learning, he writes:

“Libraries, museums and schools are fragile institutions; they cannot long survive violent assaults.” (p91)

Even in Poggio’s time, he and other humanist scholars combined their passion for ancient texts with mudslinging arguments that sound just like the nasty polemics that now take place online between bloggers purporting to engage in public debate:

“The extravagance and bitterness of the charges … discloses something rotten in the inner lives of these impressively learned individuals.” (p146)

The Papal court was so corrupt that its own employees were openly scathing about it and evidently felt a high degree of self-disgust. As for On The Nature of Things, Greenblatt notes that a great many people even now would contest its arguments while benefiting from the ample fruits of scientific discovery. The Swerve ends with a strong sense of the fragility of learning, and an implicit but urgent moral for our own rather bitter and rotten times. He writes of the poem:

“It survived because a succession of people, in a range of places and times and for reasons that seem largely accidental, encountered the material object – the papyrus or parchment or paper, with its inky marks attributed to Titus Lucretius Carus – and then sat down to make material copies of their own.” (p261)

The Swerve has been widely reviewed – see for example The Guardian, The Washington Post, The Telegraph. The reviews are all a bit sniffy, suggesting Greenblatt over-strains the scholarly argument by focusing on one text, even if it provides a good device for a popular book. I’m neither a literary nor a Renaissance scholar, so I count among the hoi polloi as far as the eminent critics are concerned. So perhaps I’m reading far too much into the (sub-)text, but I enjoyed this cautionary tale more than any other history book I’ve read recently.

[amazon_image id=”022407878X” link=”true” target=”_blank” size=”medium” ]The Swerve: How the Renaissance Began[/amazon_image]

Publishing on demand

My current book is completely absorbing, a surprising statement perhaps about a book whose subject is the early 15th-century discovery of a copy of an ancient document, [amazon_link id=”0140447962″ target=”_blank” ]De Rerum Natura[/amazon_link] by Lucretius. The book is [amazon_link id=”0393064476″ target=”_blank” ]The Swerve: How the World Became Modern[/amazon_link] by Stephen Greenblatt. (There seems to be a different subtitle on some editions – I have the US one.)

I find it literally difficult not to turn the page and carry on reading. Every sentence brings something interesting or surprising or thought-provoking. Among the many enjoyable nuggets is a description of how the Roman book trade worked. There was a distinction between librarii, or copyists, and scribes, scribae. The latter were free citizens who worked as bureaucrats or personal secretaries; the former were slaves who copied books. Booksellers had shops around the Forum. Books could be mass-produced by having one slave read out the text to be copied to a whole room full of librarii. Even better, there was a print-on-demand option: a customer could pitch up and request a particular text, and a copy would duly be produced.

Greenblatt adds that authors made nothing from book sales, and copying was freely done. Wealthy patrons supported writers, an arrangement that survived, he notes, until the 18th century.

[amazon_image id=”0393064476″ link=”true” target=”_blank” size=”medium” ]The Swerve: How the World Became Modern[/amazon_image]

[amazon_image id=”0140447962″ link=”true” target=”_blank” size=”medium” ]The Nature of Things (Penguin Classics)[/amazon_image]

Macro, models and me

There is a new edition of Michael Wickens’ graduate macro text, [amazon_link id=”0691152861″ target=”_blank” ]Macroeconomic Theory: A Dynamic General Equilibrium Approach[/amazon_link]. Given that the previous edition was published in 2008 and therefore written before the financial crisis, I was keen to see the update. There are two substantial new chapters, one on ‘Banks, financial intermediation and unconventional monetary policy’, and one on ‘Unemployment’, which covers search theory, efficiency wage theory and wage stickiness.

Mike Wickens is a staunch advocate of DSGE modelling, arguing that the critics who somehow blame it for the crisis are missing the point. As he puts it in the preface to this new edition, it would be a retrograde step to give up on the ambition of integrating micro and macroeconomics, and on the possibility of considering the complete picture of the economy. He argues against the ad hoc tradition of time series econometrics, saying that it leads to an over-emphasis on ‘goodness of fit’ and an under-emphasis on macro theory.

Who can argue with the principle that everything is connected? But I disagree with Mike. I used to be one of those time series econometricians, and when working on my PhD thesis in the early 1980s I decided to confront economic theories of the kind set out in the Unemployment chapter here (search theory, efficiency wage models and wage/price stickiness) with industry-level data. It turned out that:

a) the equilibrium models are simply not consistent with the data – one has to ditch either the theory or the evidence, and I’m on the side of the evidence. Therefore, until we do have a better theory (and anyway, what could be more ad hoc than wage stickiness as a model?), I prefer the traditional time series approach of careful but non-theoretical investigation of the data, which Mike Wickens rejects. (A stylised sketch of the contrast between the two approaches follows below.)

b) individual sectors of the economy are so different from each other in their time series properties that it is clear that any general macro theory needs to address aggregation and industry-specific factors, which are probably institutional. That seemed much too hard, and I set off on a voyage towards microeconomics instead.
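
To make the contrast in a) concrete – and this is my own stylised gloss, not Wickens’ notation or the equations from my thesis – a canonical DSGE building block is the representative-agent consumption Euler equation,

$$u'(c_t) = \beta \, \mathbb{E}_t\!\left[(1+r_{t+1})\, u'(c_{t+1})\right],$$

which imposes the theory’s restrictions on the data before estimation even begins. The time series tradition would instead start from something like an unrestricted autoregressive distributed lag regression,

$$y_t = \alpha + \sum_{i=1}^{p} \phi_i\, y_{t-i} + \sum_{j=0}^{q} \beta_j\, x_{t-j} + \varepsilon_t,$$

and let the data decide which lags and restrictions survive. My point in a) is that when restrictions of the first kind were confronted with disaggregated data, they were generally rejected.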

People who teach macroeconomics in universities need a textbook to teach from, and I’m sure this is one of the best around. (It’s graduate level, being moderately technical, although not too hard for a student who has got her mind around basic differential calculus, and the text explains the equations pretty clearly.) But I find it depressing that more than 20 years after my generation of graduate students found in our research that what became the DSGE approach had no evidential foundation, and in the wake of such a dramatic macroeconomic dysfunction, this is still the approach the textbooks take. (One more reason to regret that the academic journals won’t publish negative results – there were lots of 1980s theses reporting that the equilibrium modelling approach didn’t work.)

Publishers, lots of macroeconomists would now like to teach a different kind of macro course – humbler, more eclectic, more institutional, and including some entirely novel modelling approaches. Please can they have a textbook?

[amazon_image id=”0691152861″ link=”true” target=”_blank” size=”medium” ]Macroeconomic Theory: A Dynamic General Equilibrium Approach (Second Edition)[/amazon_image]