How the GFC is reshaping economics

Paul Diggle

Aberdeen Standard Investments

The severity of the global financial crisis, and the weakness of the subsequent recovery, triggered much soul-searching within the economics profession. The global economy may finally be escaping the long shadow of the crisis, but macroeconomics continues to undergo a major reassessment in light of its apparent failure to predict and explain the crisis. By adopting more realistic assumptions about how financial and credit markets work, how consumers and firms make decisions under uncertainty, and how individual consumers and firms differ from one another, the profession is doing what any good science should – learning from failure.

Revolutions in macroeconomic thinking have happened before – usually in the wake of severe real-world crises that prevailing theories and models failed to predict or explain. In the 1930s, following the Great Depression, John Maynard Keynes essentially invented macroeconomics as we know it today. By incorporating ‘nominal rigidities’ such as sticky wages (because it’s rare for nominal wages to fall) in a general equilibrium setting (where the labour, product and money markets all impact one another), he was able to explain why high unemployment could be persistent. This insight helped create the framework that guided policymakers in the post-war period.

And in the 1970s, as the low-growth, high-inflation ‘stagflation’ combination appeared, economists again had to invent a new set of tools to help them understand the macroeconomy. The theory of rational expectations (economic agents do not make systematic errors about the future) helped explain why Keynesian demand management could lead to wage-price spirals. Because policymakers cannot systematically fool economic agents, there was no permanent trade-off between output and inflation – stimulating demand past a certain point would generate unwanted inflation. In the vernacular, the Phillips curve, which had entered the toolkit in the 1950s, was vertical in the long run. Activist monetary policy, which sought to stabilise the economy at full employment with stable prices, became the policymaking paradigm. In the 1990s, John Taylor formalised the conduct of monetary policy in his eponymous Taylor rule, deriving the appropriate interest rate for any given level of output and inflation.
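For reference – the coefficients here are those of Taylor’s original 1993 paper, rather than something spelled out in this article – the rule can be written as:

\[ i_t = r^* + \pi_t + 0.5\,(\pi_t - \pi^*) + 0.5\,(y_t - \bar{y}_t) \]

where $i_t$ is the nominal policy rate, $r^*$ the equilibrium real interest rate, $\pi_t$ inflation, $\pi^*$ the central bank’s inflation target, and $y_t - \bar{y}_t$ the output gap. When inflation runs above target or output above potential, the rule prescribes a higher policy rate.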

Come the 2000s, macroeconomists had synthesised the insights of the past into ‘New Keynesianism’. The theoretical benchmark combined an ‘IS curve’ capturing Keynes’ relationship between interest rates and output, a Phillips curve explaining the relationship between output and inflation (augmented to be vertical in the long run), and a monetary rule along the lines of the Taylor rule. Policymakers built complex computer models of the economy – ‘dynamic stochastic general equilibrium models’ – from this theoretical foundation. And for a while, it all worked very well.
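In stylised textbook form – a standard sketch, not the article’s own notation – that three-equation benchmark can be written as:

\[ \begin{aligned} y_t &= \mathbb{E}_t\,y_{t+1} - \sigma\,(i_t - \mathbb{E}_t\,\pi_{t+1} - r^n_t) && \text{(IS curve)} \\ \pi_t &= \beta\,\mathbb{E}_t\,\pi_{t+1} + \kappa\,y_t && \text{(Phillips curve)} \\ i_t &= r^* + \phi_\pi\,\pi_t + \phi_y\,y_t && \text{(monetary rule)} \end{aligned} \]

where $y_t$ is the output gap, $\pi_t$ inflation, $i_t$ the policy rate, $r^n_t$ the natural real rate, and $\sigma$, $\beta$, $\kappa$, $\phi_\pi$, $\phi_y$ parameters governing interest-rate sensitivity, discounting, price stickiness and the strength of the policy response.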

Then came the financial crisis, and the long, hard slog of the recovery that followed. Most macroeconomists (as well as, let us not forget, most market participants) didn’t predict the crash, and struggled to explain its causes and consequences. In response, at least three major changes to macroeconomic theory and practice are now in progress:

Incorporating ‘financial frictions’… Going into the financial crisis, macroeconomic models generally assumed that financial intermediation functioned perfectly and costlessly – households and firms had unlimited access to credit markets, which instantly cleared; insurance was fully available against idiosyncratic risks; monetary policy influenced only the price of credit, rather than its availability; and changes in monetary policy were transmitted perfectly to the rest of the economy. No wonder the financial crisis caught many economists off guard.

Since then, however, economic models have incorporated credit availability; leveraged financial institutions; bankruptcy; asymmetric information between debtors and creditors; a preferred-habitat model of the yield curve; a lower bound on policy interest rates; a portfolio-rebalancing impact from large-scale asset purchases; and a spread between policy rates and market rates – to name just a few additions.

Relaxing ‘rational expectations’… Macroeconomic models, from the 1970s onwards, had assumed that consumers and firms make, and act upon, the best possible predictions about the future. They learn from past experiences and use all information available to them to make decisions. They do not keep making systematic mistakes about the future. On average, their forecasts about the (infinite) future are correct.
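Formally – a textbook statement rather than the article’s own – rational expectations implies that forecast errors are unpredictable given today’s information: for any economic variable $x_{t+1}$ and information set $I_t$,

\[ x_{t+1} = \mathbb{E}[x_{t+1} \mid I_t] + \varepsilon_{t+1}, \qquad \mathbb{E}[\varepsilon_{t+1} \mid I_t] = 0, \]

so agents can be wrong period by period, but never systematically and predictably so.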

But economists are now returning to the ‘animal spirits’ identified by Keynes – expectations are in constant flux, influenced by a whole range of factors including group psychology, and individual instincts and emotions. Economic agents use simple heuristics (rules of thumb) to make decisions, rather than undertaking a full cost-benefit analysis of counterfactuals. The new field of behavioural economics has documented economic agents’ irrationality, limited willpower and myopic time horizons. All of this is being incorporated into more realistic models.

Introducing ‘heterogeneous agents’… At the heart of the early-2000s benchmark macro model was a ‘representative agent’ – a single, infinitely lived economic agent who smooths lifetime consumption. Because the representative agent assumed away differences between individuals and firms, macroeconomists tended to ignore issues of inequality, which have had such a large impact on the course of economies and politics over the past few years.

Fortunately, ‘heterogeneous agents’ – consumers and firms that can differ in their income, wealth, productivity, size and trade exposures – are now making an appearance in economic models. Meanwhile, the emerging field of agent-based modelling allows group behaviour to emerge from the interactions of many individual agents within a network structure. In these models, each agent is a self-contained unit following a given set of behavioural rules; from their micro-level actions, macro trends can emerge that better fit with what we observe in the real world.
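To make the idea concrete, here is a minimal, purely illustrative agent-based sketch in Python. Everything in it – the rule-of-thumb consumption heuristic, the parameter values, the class and variable names – is invented for this example rather than drawn from any specific model in the literature:

```python
import random

random.seed(42)  # reproducible illustration

class Household:
    """A self-contained agent following a simple behavioural rule."""

    def __init__(self):
        # Heterogeneity: incomes and spending habits differ across agents.
        self.income = random.lognormvariate(0, 0.5)  # skewed income distribution
        self.wealth = 0.0
        self.mpc = random.uniform(0.5, 0.95)         # marginal propensity to consume

    def consume(self):
        # Rule of thumb: spend a fixed fraction of cash on hand, rather
        # than solving a full intertemporal optimisation problem.
        spending = self.mpc * (self.income + self.wealth)
        self.wealth += self.income - spending
        return spending

# Aggregate demand emerges from many individual micro-level decisions.
households = [Household() for _ in range(10_000)]
for period in range(5):
    aggregate_demand = sum(h.consume() for h in households)
    print(f"period {period}: aggregate demand = {aggregate_demand:,.0f}")
```

A richer version would let agents trade, lend and learn from one another across a network – which is where genuinely emergent macro dynamics, such as booms, busts and persistent inequality, come from.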

The German physicist Max Planck is often paraphrased as saying “science advances one funeral at a time”. Macroeconomics, it seems, evolves one crisis at a time. Encouragingly, the discipline has not let the financial crisis go to waste.

For further insights from Aberdeen Standard Investments, please visit our website.

