Cloudy, with a Chance of Default

by John on January 2, 2013

Timely forecasts can be essential to safety

Global finance has more in common with weather prediction than many economists might admit. If they studied the history of weather forecasting, they might be a little humbler in their statements about business cycles and equilibrium.

After all, until the ’60s, weather forecasters thought that more data, more computational power and better models would improve long-range forecasting so that a thankful public could plan weddings and picnics far in advance, free from meteorological harm.

Then MIT mathematician and meteorologist Edward Lorenz ruined everything when he noticed that his models, run from what appeared to be identical starting points, produced widely divergent results. Probing the underlying mathematics, Lorenz discovered chaos theory, one of the most important scientific developments of our time.
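Lorenz's observation is easy to reproduce. The sketch below is an illustration, not anything from Lorenz's own code: it integrates his famous three-variable system twice, from starting points that differ only in the eighth decimal place (the step size and perturbation are arbitrary choices), and prints how far apart the two runs drift.

```python
# Two runs of the Lorenz system from almost identical starting points.
# Step size (dt) and the 1e-8 perturbation are illustrative choices.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations one simple Euler step."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # differs only in the 8th decimal place
for steps in (100, 1000, 3000):
    gap = max(abs(p - q) for p, q in zip(run(a, steps), run(b, steps)))
    print(steps, gap)
```

Early on the two trajectories are indistinguishable; by a few thousand steps they bear no resemblance to one another, which is exactly why a tiny measurement error wrecks a long-range forecast.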

Until then, meteorology had progressed steadily through the understanding of physical systems. Descartes founded modern meteorology with his 1637 publication of ‘Les Météores’ but lacked the basic data to develop his theory. As instruments and understanding evolved (Galileo with his thermometer, his student Torricelli with the barometer) and observers realized that temperature and pressure vary with altitude, more attempts were made at forecasting, with little success.

Modern meteorology awaited early-20th-century developments and the fundamental observation of Norwegian physicist Vilhelm Bjerknes, who defined the properties of air masses, weather fronts and atmospheric waves. He stated that all future weather patterns stem from their initial form, acted upon by known mechanical and thermal laws. If only the financial ‘weather’ parameters were so well defined.

Meteorologists then felt that they simply needed to feed sufficient data into the correct models. They could not do so until April 1950, when John von Neumann, of physics and game-theory fame, working with meteorologist Jule Charney, produced the first successful computerized weather prediction. Soon the computerized models were more accurate than their human counterparts.

Despite these advances, precise weather prediction more than a week or so ahead remains out of reach. Meteorology has made great strides in understanding the weather: storm tracks for hurricanes, and the broader dynamics of drought, extreme weather and climate change. But, as Lorenz's discovery shows, predicting the weather at a specific location much beyond a week is barred by the chaos intrinsic to the global system. Why should the even more complex, dynamic and non-linear world financial system, governed mainly by human perceptions rather than physical laws, be any less chaotic?

Comparing the development of meteorology and climate science with economics, weather researchers began to recognize that, as the American Institute of Physics put it: “Before they could understand how climates change, scientists would have to understand the basic principles for how any complicated system can change. Early studies, using highly simplified models, could see nothing but simple and predictable behavior, either stable or cyclical. But in the 1950s, work with slightly more complex physical and computer models hinted that even quite simple systems could lurch in unexpected ways. During the 1960s, computer experts working on weather prediction realized that such surprises were common in systems with realistic feedbacks.”

Attempts to reduce climate physics to a couple of hundred lines of equations were set aside in favor of ever more sophisticated simulations, tested and refined against their ability to reproduce historic conditions: ice ages, or the aerosols thrown up by the 1991 Mt. Pinatubo eruption. Only after decades of global-warming modeling, international cooperation across disciplines from oceanography to paleobiology, and numerous model revisions to incorporate an increasing range of factors such as aerosols and ocean temperatures, could the editor of Science announce, in 2001: “Consensus as strong as the one that has developed around this topic is rare in the history of science.”

A.W.H. Phillips, of Phillips Curve and MONIAC hydromechanical-simulator fame, proposed that a similar effort be made in the study of economics. Remember that he had done pioneering work on analog simulators and economic stability in the 1950s, when the results were displayed on an oscilloscope!

Phillips proposed that future developments might enable the construction of an electronic analog machine in which, “using a combination of econometric and trial-and-error methods, the system of relationships, the form of time lags and values of parameters of the analog might be adjusted to produce as good a ‘fit’ as possible to historical time series, and the resulting system used in making economic forecasts and formulating policies. This would be a very ambitious project. Apart from the engineering problems involved, requiring close coordination of statistical, theoretical and historical studies in economics…” The engineering problems have since been overcome by massive computational capability, and the history of climate modeling provides a direct parallel in the study of global warming, an even larger problem than economic stability.

In 2008, Queen Elizabeth, visiting the London School of Economics, asked: “Why didn’t economists forecast the crash?” Consider global finance as a separate entity, assigning resources worldwide: the ‘weather system’ in which economies operate, a chaotic global flow of cash with its series of highs and lows, rapidly shifting storms and areas of calm.

To borrow from Lorenz's analogy of a butterfly's wing-flap setting off a tornado in Texas, the waving of financial paper by a London or Singapore trader can trigger the loss of billions of dollars in New York. Recall the ‘London Whale’ at JP Morgan, and the collapse of Barings Bank (1762-1995) at the hands of rogue trader Nick Leeson. He hoped to cover his errors by betting on a major rise in the Japanese Nikkei stock index, only to see it sent crashing by the 1995 Kobe earthquake. The delusion of human expectations met the realities of real-world events.

Back in the ’30s, John Maynard Keynes spoke eloquently on the unpredictability of such events and their financial impact: “About these matters there is no scientific basis on which to form any calculable probability whatever. We simply do not know.” More recently, Nassim Taleb used the analogy of the Black Swan to explain the unpredictability of events in his eponymous book.

Consider Taleb’s Mediocristan/Extremistan country analogy, replacing geography with an economic weather map on which the financial markets’ ‘extreme weather’ can radically affect real-world economics. Visualize the ‘real’ economy as an aircraft flying through the ‘atmosphere’ of the financial markets. We want a smooth ride through the stable areas of our financial weather map. But the risk of encountering extreme turbulence, such as an embedded thunderstorm, rises sharply when flying in unstable weather.

Airlines operate in all sorts of weather but avoid extremes, using weather radar and other short-term measurements to assess the risk of continuing a flight and to divert around hazards. If the weather en route is too severe, they don’t fly. Their judgment can still fail, as the 2009 Air France Airbus crash in the South Atlantic showed, but airline flying has become extraordinarily safe.

The FAA pilot handbooks’ guidelines on extreme weather could easily be recast as a policymakers’ ‘flight’ manual to ensure safety. In extreme weather, the most important task is to maintain a stable attitude, rather than hold altitude and maximum velocity, and to fly at a speed that avoids overstressing the aircraft’s structure, staying within its safe load and stability envelope. Flying at maximum speed and altitude into potential extreme turbulence is the economic equivalent of turning control over to an autopilot programmed by the financial community, oblivious to the risk of flying straight into financial turbulence capable of ripping the wings off the economy.

Dr. David Leinweber of the Advanced Computational Group at Lawrence Berkeley National Laboratory has proposed just such a solution to instability in flash trading: upon detecting instability, slow the rate of automated trading until markets re-enter a stable regime, as discussed in his OECD article ‘What can NASCAR teach NASDAQ about avoiding crashes?’. He proposes real-time monitoring of trading conditions, much as weather forecasters monitor physical data: rapidly identify conditions that will lead to instability, then slow the markets to prevent those conditions from getting out of hand, as in the famous ‘Flash Crash’ and more recent high-speed trading debacles.
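The core of such a scheme can be sketched in a few lines. The class below is a hypothetical illustration of the idea, not Leinweber's actual design: the class name, the volatility threshold and the slowdown factor are all invented here. It watches a rolling window of prices and stretches the minimum gap between automated orders whenever short-term volatility looks unstable.

```python
# Hypothetical sketch of instability-triggered throttling; the names,
# window size and thresholds are illustrative choices, not a real spec.
from collections import deque
import statistics

class TradingThrottle:
    def __init__(self, window=20, calm_stdev=0.5, slow_factor=10):
        self.prices = deque(maxlen=window)  # rolling window of recent prices
        self.calm_stdev = calm_stdev        # volatility below this = stable regime
        self.slow_factor = slow_factor      # slowdown applied when unstable
        self.base_interval_ms = 1           # normal minimum gap between orders

    def observe(self, price):
        self.prices.append(price)

    def unstable(self):
        if len(self.prices) < self.prices.maxlen:
            return False                    # not enough data yet; assume calm
        return statistics.stdev(self.prices) > self.calm_stdev

    def min_order_interval_ms(self):
        # In turbulence, force a longer pause between automated orders.
        return self.base_interval_ms * (self.slow_factor if self.unstable() else 1)

throttle = TradingThrottle()
for p in [100.0, 100.1, 99.9, 100.0] * 5:        # calm tape
    throttle.observe(p)
print(throttle.min_order_interval_ms())          # prints 1 (stable regime)

for p in [100.0, 103.0, 97.0, 104.0, 95.0] * 4:  # violent swings
    throttle.observe(p)
print(throttle.min_order_interval_ms())          # prints 10 (throttled)
```

The design choice mirrors the weather analogy: the monitor never tries to forecast the storm, it only measures current turbulence and slows the aircraft down while inside it.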

It’s tempting to draw analogies to Hurricane Sandy, whose exact birth was impossible to predict; once the storm developed, short-term predictions of its path and impact became possible. The phenomenon of ‘gravity waves’ is perhaps a far better analogy for the interactions of financial markets with the real-world economy. They can rapidly induce severe weather events, such as 4.5-inch-diameter hail in Texas, as illustrated in this NOAA article (be sure to watch the animation).

Weatherwise, a highly readable meteorological journal, carried a fine explanatory article describing snowfalls in which weather/ocean interactions over the Pacific and Arctic trigger fast-moving conditions on the East Coast that shut down virtually everything. These snowstorms carry real and immediate economic costs, including snow removal. Similarly, a financial ‘gravity wave’ such as the sub-prime debacle can cause a storm that dumps enough on the real-world economy to bring it to a near halt.

Perhaps economists should revisit William Phillips’ work on cybernetics and stability as a basis for studying the interactions of financial turbulence with the real-world economy. We could then give government agencies the computational and data-gathering tools they need to monitor Wall Street and the financial community in real time, so that they straighten up and fly right.

Till then, the forecast will be cloudy, with a chance of default.

4 comments

John Joss January 4, 2013 at 7:07 pm

Insightful, logical, well informed, excellently organized . . . thus unlikely to be considered by the dithering mass media.

Alex January 5, 2013 at 4:26 pm

AF447 didn’t crash under the automatic pilot, neither was it brought down by turbulence, nor did it suffer a structural failure (if you don’t count hitting the sea). It crashed under the control of the pilots, who responded to a fairly common problem (lost one ASI in the cruise due to icing) by making the opposite control input to that which would have been appropriate, and then persisting in their error with progressively more intensity in defiance of discrepant information.

Had the automatic pilot been functional, it would have done precisely what you suggest. Had the pilot been functional, he would have carried out the ADR DISAGREE/IAS DOUTEUSE checklist from the flight manual, which would have told him to set a broadly sensible pitch/power couple from a table and then carry out diagnostic checks on the aircraft systems. But not only did the pilots fail to fly the plane, they didn’t even start the checklist or really do anything to find out what their problem was.

They seem to have formed a theory that the plane was mystifyingly losing altitude very early on (when it wasn’t) and overspeeding (when it wasn’t) and concluded that the answer was to raise the nose. As a result they stalled, exceeding the maximum nose-up angle of the wing at that speed, altitude, and load. They didn’t identify the stall, and persisted all the way to their deaths trying to raise the nose and therefore making the stall worse.

Lowering the nose and increasing the air speed would have solved their problem at any time up to a few seconds before impact; the flight controls, structures, and engines were functioning perfectly to the finish.

Analogies to economics are invited, but I think that either a property boom or a banking crisis is a fairly common economic incident, a demand-deficient recession is the sort of thing that ought to be diagnosed without too much trouble, expansionary fiscal contraction is AF447-think, and clinging to the sidestick in the full back position as the waves get closer…hardly needs an analogy.

John January 6, 2013 at 10:56 am

You’re exactly right about the Airbus crash, and the transcripts of the final moments are chilling. The expansionary fiscal contraction model is a direct analog of the Airbus pilots’ behavior.

I’ve actually done a very simple visual simulation of the housing market in my work at Dominican, and you might be interested in my OECD piece on analog simulations in economics at: http://oecdinsights.org/2012/06/27/going-with-the-flow-can-analog-simulations-make-economics-an-experimental-science/
It has links to the intro video and working paper at Dominican.

You will see that the fluid dynamics based simulation, when hooked up to a flight dynamics model, gives you a visual output, and yes, you can crash the economy. The fluid dynamics portion of the model has been initially validated against economic time series data from the U.S. and Sweden, including income distribution.

Regarding the Airbus crash, we now have two examples where blind faith in one element of the control system (the ability to achieve the maximum lift coefficient automatically) has led pilots to fly their aircraft into the ground or ocean. The Toulouse Airbus crash is equally chilling, and I’m sure you’ve seen the videos, which I linked in another piece, entitled Hedging the Apocalypse, at: http://somewhatlogically.com/?p=598
which discusses risk, automated systems and the Airbus crashes.

Jonathan July 7, 2014 at 7:27 am

This piece is a word salad of analogies which are given with no way to determine the scope of their applicability. The general conclusion seems to be “oh me oh my it’s all so complicated.”

It’s not hard to avoid flash crashes; a Tobin tax small enough to be negligible to everyone but the HFT crowd would be sufficient. I don’t need a precise model of the financial system to know that a 25% capital requirement combined with a curb on opaque derivatives would help. I need a model to quantify the amount of help but not to know that it would be a good idea.

What should be done to speed up growth of Total Factor Productivity? If I knew that I’d be smart. But the claim that nothing is known when not everything is known is willful obfuscation.
