When it comes to the economy, we have been flying blind for decades. Economic activity has evolved from manufacturing to services and now to something else entirely: we are slowly leaving the physical medium and entering the digital one, where VR/AR, IoT, etc. are the new norms. Yet we have continued to use the same data, the same measurement techniques, the same assumptions, and the same economic models. No wonder we are not getting anything meaningful from them, while economic growth sputters and inequality continues to increase. The risk here is to our long-established institutional framework. If recent events in the UK and US are examples of the effect of these technological and economic developments, then, barring a change of course, the political system in the developed world could come under even more pressure.

There is so much to learn from history. Surely this is not the first time we have faced such disruptions. Indeed, after the crash of 1929 and during the Great Depression that followed, we similarly had no clue what to do. And that was largely because we did not know what had happened to us: thanks to the inventions of the Second Industrial Revolution, the economy had started transforming from mostly agriculture to manufacturing. Technology had made possible both the production of ever larger quantities of goods (anything to do with farming at the time) and the displacement (from employment) of ever more farmers: an uncanny increase in aggregate supply (AS) and a simultaneous decrease in aggregate demand (AD)!

“One reads with dismay of Presidents Hoover and then Roosevelt designing policies to combat the Great Depression of the 1930’s on the basis of such sketchy data as stock price indices, freight car loadings, and incomplete indices of industrial production. The fact was that comprehensive measures of national income and output did not exist at the time.” Richard T. Froyen, History of the NIPA

Part of the problem of not knowing what was going on was not being able to measure economic activity properly. Back then (the 1930s), in the 1980s when the manufacturing-to-services transition was largely taking place, and now, we seem to prefer to run on autopilot instead of updating our economic knowledge with the developments that have happened in other disciplines, such as technology.

For example, when manufacturing is only 12% of GDP, do we really need to continue to put so much importance on manufacturing PMIs? I remember sitting at my desk and looking at the Bloomberg terminal with trepidation every single time the NFP number was about to be released… No wonder it has become almost impossible to make money nowadays if our only gauges of economic activity are these publicized numbers. Correlations are broken, cause and effect has vanished. At best the numbers give us a snapshot of a small and shrinking part of the ‘economy’; at worst they are totally misleading. Take, indeed, NFP: unemployment is close to its all-time lows and the Fed has hiked rates in anticipation of an inflation pick-up, but there is not even the prospect of inflation.

When the economic status quo changes and we ignore it, the ensuing vacuum creates imbalances which, taken to the extreme, can cause asset price crashes and extreme economic hardship. We got lucky in the 1930s: some countries had strong leaders not afraid to take drastic action – FDR’s ‘New Deal’; Takahashi Korekiyo’s ‘helicopter money’. It was fortuitous that some countries also decided to make a clean break from past economic models – the gold standard was abandoned. These measures might have been enough to stop the economic plunge, but not enough to return the economy to its pre-crisis prosperity.

We know in hindsight, sadly, that we had to endure a great war to finally pull through the economic crisis. From an economic standpoint, WW2 did reduce aggregate supply as capital was depleted and destroyed, while the government stepping in with increased spending on the war effort stabilized aggregate demand. It took more than two decades, but eventually both the economy and the stock market managed to recover the ground lost during the Great Depression.

Let’s hope that we do not have to go through the same experience this time around. With all this Big Data sloshing around, and with the help of AI and machine learning and the prospect of quantum computing, someone out there is surely designing the next economic model to fit this new reality. Would the next Simon Kuznets please step up! According to the BEA, GDP was one of the great inventions of the 20th century, and even though it took 8 years after the crisis for the data to be organized and officially put to use, it was worth it.

“Much like a satellite in space can survey the weather across an entire continent so can the GDP give an overall picture of the state of the economy. It enables the President, Congress, and the Federal Reserve to judge whether the economy is contracting or expanding, whether the economy needs a boost or should be reined in a bit, and whether a severe recession or inflation threatens.

Without measures of economic aggregates like GDP, policymakers would be adrift in a sea of unorganized data. The GDP and related data are like beacons that help policymakers steer the economy toward the key economic objectives.” Economics, 15th edition, Paul Samuelson and William Nordhaus

We need our own 2008 Great Recession GDP breakthrough. Yes, the NIPA accounts have been continuously updated since the 1940s, but for all intents and purposes it does feel as if we are adrift in a sea of unorganized data. Nowadays there are billions of devices connected to the Internet with trillions of sensors, yet less than 1% of that data is utilized! For example, there are human sensors (miniature devices attached to our bodies which monitor our health) which could also be configured to monitor productivity; home device sensors which could give us real-time consumption; factory sensors which could give us real-time production, etc. Most of this data never reaches the operational decision makers. Yes, there are still some technical challenges, such as connectivity and storage, but as the blockchain becomes even more sophisticated and the data is organized in a proper way, there should be no reason why it cannot be more widely used for macro decision making.
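To make the idea concrete, here is a minimal sketch (in Python) of what a crude real-time gauge built on such feeds might look like. Everything in it is an assumption for the sake of illustration – the feed names, the split into ‘consumption’ and ‘production’ readings, the normalized values – not an actual statistical methodology:

```python
# Toy sketch: fold hypothetical real-time sensor feeds into rough
# demand-side and supply-side indices. All names and numbers are made up.
from dataclasses import dataclass
from statistics import mean


@dataclass
class SensorReading:
    source: str   # e.g. "home_energy", "factory_line" (illustrative)
    kind: str     # "consumption" (demand proxy) or "production" (supply proxy)
    value: float  # normalized activity level, 1.0 = baseline period


def realtime_indices(readings):
    """Average normalized readings by kind to get crude AD/AS proxies."""
    demand = [r.value for r in readings if r.kind == "consumption"]
    supply = [r.value for r in readings if r.kind == "production"]
    return {
        "demand_index": mean(demand) if demand else float("nan"),
        "supply_index": mean(supply) if supply else float("nan"),
    }


if __name__ == "__main__":
    # One illustrative, made-up reporting interval.
    snapshot = [
        SensorReading("home_energy", "consumption", 0.97),
        SensorReading("retail_pos", "consumption", 0.94),
        SensorReading("factory_line", "production", 1.06),
        SensorReading("logistics_fleet", "production", 1.04),
    ]
    indices = realtime_indices(snapshot)
    gap = indices["supply_index"] - indices["demand_index"]
    print(indices, f"AS-AD gap: {gap:+.3f}")
```

Crude as it is, even a toy like this would tell us something the monthly releases cannot: how demand and supply proxies are moving against each other today, not six weeks ago.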

Just as GDP came as a response to the transition from agriculture to manufacturing in the 1920s, we should probably have looked into completely revamping our NIPA methodology as early as the 1980s, when the shift from manufacturing to services was completed. For example, accounting for services is more difficult as they are not uniform like manufactured goods and are often tailored to the customer.

At least in 1971 we scrapped another vestige of that old world – the Bretton Woods agreement. However, the two consecutive oil crises that followed had the effect of one of those ‘black swan’ events that probably genuinely took us off course, as they substantially curtailed aggregate supply and contributed to the rise of inflation. What people do not appreciate is that a supply shock in peaceful times was the anomaly. And because it was a one-off event and did not destroy capital (or labor) the way wars or even natural disasters do, the inflationary spike which followed was genuinely temporary. Instead, we hiked rates to double digits, and as the economy sank even further, we got Reaganomics (supply-side economics) and adopted inflation targeting (which we still practice even now).

When that did not work (in times of technological progress and peace – the period after WW2 – supply is never the issue, but demand is), we decided to financialize the economy and submerge it in debt. With the focus on the supply side and not on the fact that aggregate demand was faltering as manufacturing jobs moved into services, but at lower wages, the only way to replace the lost purchasing power was with debt.

Sadly, the result proved to be not that different from the 1929 crash (despite the numerous warnings: the 1980s EM debt crisis, the S&L crisis, the 1990s EM crisis…). In 2008 the great super debt cycle finally stopped for good. True, the aftermath of this most recent crisis was not the disaster the Great Depression had been. Thanks to those lessons from the past, we quickly resorted to a version of the ‘New Deal’, but for banks, and we expanded the monetary base (and again, we created money which was only available to the banks; for everyone else, money came at a price – a still-positive interest rate despite base rates being negative in some countries).

In the aftermath of 2008 we managed to stabilize the situation, and even though, unlike the 1930s, the stock market and GDP are well above their pre-crisis highs, inequality continues to worsen. We still have not figured out the real reasons we ended up here. We keep asking questions like: Why is productivity so low? Why aren’t there any wage pressures? Why has labor participation declined? And so on. We need to answer these questions, but we need to stop guessing and start thinking about how to incorporate the advances of technology into our economic measurement techniques. We need the data to convince our central bankers of what is plainly obvious to anyone with ‘real-life’ experience: that, structurally, AD<AS.

And once we do that, we need to develop a plan for how to boost AD. As long as there is peace, and because in the developed world we are lucky to have the proper institutional infrastructure which promotes entrepreneurship and protects its achievements, AS will take care of itself. But at the same time, that exact institutional framework which allows AS to grow unhindered also prevents AD from growing: the same advances in technology which foster growth have also broken the Work=Job=Income model.

Therefore, it is a tricky and delicate situation: how do we change the institutional infrastructure to accommodate these technological changes while also preserving it so that it fosters even more advances? For example, can we institutionalize a rise in wages, like a ‘limited wage’? We could, but most likely that would only incentivize companies to speed up automation.

Or we could go the other way and impose a ‘Luddite-like’ ban on automation which forces companies to hire more humans. We could also limit the supply of labor by building a real wall on the border with Mexico or an imaginary wall with the EU, or through de-globalization, or, at the extreme, war… Obviously, none of these acts would be welcome, and most likely they would backfire and cause enormous human suffering. Actually, it is not completely out of the question that an overly ambitious and eager ‘benevolent’ government would one day use an AI algorithm to optimize away the AD<AS problem, with the end result being exactly one of the measures above.

How about just giving ‘money’ directly to people? And let’s not even use that loaded word, ‘money’, because all we would be doing is updating numbers on a spreadsheet. We can’t run out of numbers, can we?

It is difficult to offer a solution to these issues without the proper data. Even in the latter case, how do we know when AD eventually equals AS? How do we know when to stop before inflation does indeed rear its ugly head? Experimenting is what we have done for the last century or so, and thanks to the forces of capitalism we have done it well, certainly much better than under any other possible alternative. But even if we wanted to, it is becoming increasingly clear that we cannot continue with this trial-and-error method: the ‘errors’ are piling up and threaten to topple our faith in the institutions which govern our modern society. Let’s just hope it is not already too late for a change.
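Purely as a thought experiment – and only if real-time gauges like the ones sketched earlier actually existed – the ‘when to stop’ question could in principle be reduced to something as unglamorous as a feedback rule. Every threshold and variable name below is made up for illustration; none of it is an actual policy framework:

```python
# Toy feedback rule: keep the direct transfers flowing only while demand
# still lags supply and a (hypothetical) inflation nowcast stays tame.
def should_continue_transfers(demand_index: float,
                              supply_index: float,
                              inflation_nowcast: float,
                              inflation_ceiling: float = 0.02,
                              gap_tolerance: float = 0.005) -> bool:
    """Return True while AD still lags AS and inflation remains below the ceiling."""
    if inflation_nowcast >= inflation_ceiling:
        return False                        # stop: inflation is picking up
    demand_shortfall = supply_index - demand_index   # AS - AD, in index points
    return demand_shortfall > gap_tolerance          # stop once AD has caught up with AS


# Example: supply still running ahead of demand, inflation quiet -> keep going.
print(should_continue_transfers(demand_index=0.95,
                                supply_index=1.03,
                                inflation_nowcast=0.012))  # True
```

The point is not the rule itself, but that without the underlying data even a rule this crude is out of reach – which is exactly why the measurement problem has to come first.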