Opening Remarks by Mr Edward S. Robinson, Deputy Managing Director (Economic Policy) & Chief Economist, Monetary Authority of Singapore, at the 2024 Advanced Workshop for Central Bankers organised by the National University of Singapore on 11 March 2024.
(…)
Over the past year, central banks have had to answer difficult questions about our collective failure to foresee the persistence of post-pandemic inflation, which has in turn called into question the usefulness of our models. We may therefore ask whether economists should pay more attention to recent advances in data analytics and artificial intelligence (AI) to improve our forecasts and models. Nobel Laureate Michael Spence, for example, has suggested that AI can comprehend complex supply chain systems in ways that humans cannot fully grasp, and that such understanding might have tempered the disruptions seen during the pandemic.
To be sure, “traditional” big data and machine learning techniques are already widely used in economics and finance. The Billion Prices Project, started at the Massachusetts Institute of Technology (MIT) more than a decade ago, was an early adopter, pioneering the daily collection of online product prices from a large number of retailers in order to improve the measurement of aggregate price data. Since then, rising computing power and data availability have led to a surge in research on the economic applications of AI and machine learning (AI/ML). Central banks have also contributed in this regard, adopting AI/ML in areas ranging from financial supervision to macroeconomic surveillance. To cite a few examples, AI/ML techniques have been used to identify anomalous financial transactions, to help supervisors sift through large volumes of text submitted by financial institutions in order to identify vulnerable areas, and to generate dynamic measures of inflation expectations from social media posts.
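To make the approach concrete, here is a minimal, hypothetical sketch of the kind of computation such price-scraping projects perform: chaining a daily Jevons (geometric-mean) index from product prices. The data below are synthetic stand-ins, and the code is illustrative rather than the Billion Prices Project's actual pipeline.

```python
# Illustrative sketch (not the Billion Prices Project's actual pipeline):
# build a daily chained Jevons price index from scraped product prices.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic stand-in for scraped data: daily prices for 500 products over 30 days.
days = pd.date_range("2024-01-01", periods=30, freq="D")
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, (30, 500)), axis=0)),
    index=days,
)

# Day-on-day price relatives, aggregated with a geometric mean (Jevons), then chained.
relatives = prices / prices.shift(1)
jevons = np.exp(np.log(relatives).mean(axis=1))
index = 100 * jevons.fillna(1.0).cumprod()

print(index.tail())
```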
A key strength of AI/ML modelling approaches in predictive tasks is their ability to let the data flexibly determine the functional form of the model. This potentially allows AI/ML models to capture non-linearities in economic dynamics in a way that mimics expert (human) judgement. Recent advances in generative AI (GenAI) take this even further. State-of-the-art large language models (LLMs) trained on vast amounts of data can generate alternate scenarios, specify and simulate basic economic models and beat experts at forecasting inflation.
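As a stylised illustration of this flexibility, the sketch below fits both a linear regression (fixed functional form) and a tree ensemble to synthetic data with a deliberately non-linear relationship. The variable names and data-generating process are assumptions for the example, not drawn from the speech.

```python
# Illustrative sketch: a tree ensemble recovers a non-linear relationship
# that a linear regression (fixed functional form) misses. Synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
slack = rng.uniform(-4, 4, (2000, 1))  # hypothetical output-gap-like driver
# Assumed convex response: inflation rises sharply only when slack is positive.
infl = 2 + 0.3 * np.maximum(slack[:, 0], 0) ** 2 + rng.normal(0, 0.2, 2000)

train, test = slice(0, 1500), slice(1500, None)
for model in (LinearRegression(), GradientBoostingRegressor(random_state=0)):
    model.fit(slack[train], infl[train])
    mse = mean_squared_error(infl[test], model.predict(slack[test]))
    print(type(model).__name__, round(mse, 3))
# The boosted trees learn the convexity from the data alone, so their test
# error is far lower - no functional form was specified in advance.
```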
However, the flexibility of this class of models is also a drawback: AI/ML models can be “fragile” in that their output is often highly sensitive to the choice of model parameters or prompts provided. Together with their opacity, this flaw makes it difficult to parse the underlying drivers of the process being modelled. Despite their impressive capabilities, current LLMs struggle with logic puzzles and mathematical operations, suggesting that they are not yet capable of providing credible explanations for their own predictions.
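This fragility is easy to demonstrate. In the hypothetical sketch below, the same small neural network is refit on identical data under different random seeds, and its out-of-sample prediction shifts with each run; the data and settings are invented purely for illustration.

```python
# Illustrative sketch of "fragility": the same network refit with different
# random seeds gives different out-of-sample predictions. Synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, (300, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.3, 300)
x_new = np.array([[2.5]])  # a single out-of-sample query point

for seed in (0, 1, 2, 3):
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=seed)
    net.fit(X, y)
    print(f"seed={seed}: prediction at x=2.5 -> {net.predict(x_new)[0]:+.3f}")
# The spread across seeds illustrates why a single AI/ML point forecast,
# absent interpretability, is hard for a policymaker to rely on.
```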
Future directions
On the whole, AI models currently lack the “clarity of structure” that makes structural models useful to policymakers. To quote former Fed Governor Laurence Meyer, model-based forecasting needs to “start with a paradigm and end with a story”. Without the ability to articulate a vision of how the economy works or to discriminate between competing narratives, AI models cannot yet replace structural models at central banks.
AI, including GenAI, is not yet the general purpose technology, or GPT, envisaged by its enthusiastic proponents. For now, perhaps the best way to incorporate AI techniques into central bank modelling toolkits is to use them in satellite models that complement core structural models. Beyond using AI techniques independently for forecasting tasks, this could extend to “semi-structural” approaches connecting AI methods to economic theory. Promising applications in recent research include the use of deep learning models to estimate economic relationships, such as the Phillips Curve, that underpin standard macroeconomic models; a stylised sketch of this idea follows below. Our current models have been built up by rigorously incorporating the most relevant new developments while retaining their core theoretical foundations. As we improve our understanding of the mechanics underlying AI techniques, we could begin to bring them into our workhorse models in a similar way.
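One stylised way to picture such a semi-structural approach: theory dictates which variables enter the Phillips Curve (expected inflation and labour-market slack), while a small neural network is left to learn the functional form. The sketch below uses an assumed synthetic data-generating process and is not the method of any specific paper.

```python
# Illustrative sketch of a "semi-structural" approach: theory supplies the
# Phillips-curve inputs; a neural network learns the shape. Synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 2000
exp_infl = rng.normal(2.0, 0.5, n)   # expected inflation, per cent (assumed)
u_gap = rng.uniform(-2, 3, n)        # unemployment gap, pp (assumed)

# Assumed "true" curve: flat when slack is ample, steep when the economy is tight.
infl = exp_infl - 0.2 * u_gap - 0.6 * np.maximum(-u_gap, 0) ** 2 + rng.normal(0, 0.1, n)

X = np.column_stack([exp_infl, u_gap])
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0).fit(X, infl)

# The fitted net reproduces the non-linearity: a steeper local slope in tight
# labour markets (negative gap), holding expected inflation fixed at 2 per cent.
for gap in (-1.5, 0.0, 1.5):
    grid = np.column_stack([np.full(2, 2.0), [gap - 0.1, gap + 0.1]])
    slope = np.diff(net.predict(grid))[0] / 0.2
    print(f"u_gap={gap:+.1f}: local slope ~ {slope:+.2f}")
```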
We do need to prepare for the day when GenAI evolves into a GPT. Significant investment will be needed for the new technology to become a Pareto-superior reality, and, as we have been warned, some enlightened upstream intervention may be required to shape its transitional path. In economic research, as in other fields, we need to deliberately ensure that GenAI serves as an impetus for Harrod-neutral technological progress.
The full speech is available here.
Banking 4.0 – “how was the experience for you”
“So many people are coming here to Bucharest, people I see and interact with on LinkedIn, and now I get the chance to meet them in person. It was like being at the Football World Cup, but this was the LinkedIn World Cup of payments and open banking.”
Many more interesting quotes in the video below: