Providing an update on the Data Gaps Initiative were Robert Heath, deputy director of the IMF’s Statistics Department, and Pietro Franchini, a member of the FSB’s secretariat. Robert spoke first and outlined how each financial emergency, from the Mexican crisis to the Asian crisis, has led to efforts to generate better data that might provide warnings for the next one.
In the aftermath of the 2008 global financial crisis, the G20 set out to compile better data and produced 20 recommendations in four areas:
- The build-up of risk in the financial sector
- Cross-border financial links
- The vulnerability of domestic economies to shocks
- The communication of official statistics
This means that the IMF now compiles better data for measuring the health of financial institutions, the ‘tail-risks’ of assets or portfolios, and the state of financial products (such as its survey of credit default swaps).
However, one of the primary reasons for the severity of the 2008 crisis was the interconnected and interdependent nature of the global financial system. The IMF is therefore now trying to better understand the linkages and spillover effects between countries’ financial industries. Of particular importance is the need for improved data on the global too-big-to-fail banks (or systemically important financial institutions) and their role in the financial system.
Identifying when domestic problems may be bubbling up, before they spill over, is also crucial. In this regard the IMF now has a database of global real estate prices and has also worked to track government deficit and debt statistics more accurately, although it notes that progress in this field is slow because official statistical standards differ around the world.
Finally, the IMF recognised that critical data may be already available in official statistics, but it is not widely known to markets. Therefore communicating what data is available and making it internationally comparable is another aspect of the initiative.
These efforts made up the first phase of the project and the second phase will now look to consolidate the progress made along with doing further work on areas like foreign currency exposures, where there is a lack of data on potential vulnerabilities.
Pietro Franchini from the FSB then expanded on some of the finer points of the initiative. The focus is on monitoring capital requirements, risk management practices and regulation, but there is also attention on gathering better data on the trends and risks in the shadow banking sector and the derivatives markets.
These are the most interconnected and complex sectors of finance, and data on them is the only way to understand how the regular banking sector interacts with the shadow banks, both domestically and globally.
Another problem is the time lag in the stream of data, particularly in the midst of a crisis when weekly or even daily submissions are needed. The confidential or market-sensitive nature of the data is a further barrier to producing indicators of sufficient quality.
The FSB’s recommendations for improving global standards mostly revolve around introducing quality by design. This means the harmonization of definitions and standardisation of datasets to avoid problems such as double counting.
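The double-counting problem the FSB highlights is concrete: when both counterparties to a trade report it, a naive aggregation counts the same exposure twice. A minimal sketch of de-duplicating reports on a shared unique transaction identifier (field names here are hypothetical, not any regulator's actual schema):

```python
def aggregate_notional(reports):
    """Sum notional amounts once per trade, keyed on a shared
    unique transaction identifier (UTI)."""
    seen = set()
    total = 0.0
    for report in reports:
        uti = report["uti"]
        if uti in seen:
            continue  # already counted via the other counterparty's filing
        seen.add(uti)
        total += report["notional"]
    return total

# Two counterparties file the same trade (UTI-001); it is counted once.
reports = [
    {"uti": "UTI-001", "reporter": "BankA", "notional": 10_000_000.0},
    {"uti": "UTI-001", "reporter": "BankB", "notional": 10_000_000.0},
    {"uti": "UTI-002", "reporter": "BankA", "notional": 5_000_000.0},
]
print(aggregate_notional(reports))  # 15000000.0
```

The point of the harmonised identifier is precisely that this join is trivial; without it, matching the two filings of one trade requires fuzzy matching on dates, amounts and counterparty names.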
The problems that still lie ahead
Charles began by picking up on the point of standardisation, which he said should be the priority. With a truly global financial system, he drew an analogy with the internet to argue that finance needs its own global standards.
He said that if all institutions and countries used standard legal entity identifiers and unique transaction identifiers, for example, this would ease the burden on everybody. However, he made the point that this needs to be done worldwide, and there is no obvious body to direct it. Banks generally don’t create aggregate statistics, as this is the job of the regulators. Solving this, he said, would need a comprehensive global strategy.
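As an illustration of what a standard identifier buys: the ISO 17442 legal entity identifier (LEI) is a 20-character code whose final two digits are MOD 97-10 check digits (the same ISO 7064 scheme IBANs use), so any system can verify that an LEI is well-formed without consulting a registry. A minimal sketch, using a made-up 18-character prefix rather than any real entity's code:

```python
def _to_digits(s):
    # MOD 97-10 convention: map A..Z to 10..35, keep digits as-is,
    # then read the whole string as one large integer.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(base18):
    """Compute the two check digits for an 18-character LEI prefix."""
    return f"{98 - _to_digits(base18 + '00') % 97:02d}"

def lei_is_valid(lei):
    """A well-formed LEI is 20 characters and satisfies value mod 97 == 1."""
    return len(lei) == 20 and _to_digits(lei) % 97 == 1

base = "5493001KJTIIGC8Y1R"  # hypothetical 18-character prefix
lei = base + lei_check_digits(base)
print(lei_is_valid(lei))  # True
```

A single corrupted character breaks the mod-97 property, which is what makes the identifier robust to transcription errors as it moves between reporting systems.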
Chris Clack continued on from Charles’s reaction by emphasising the importance of using this data in models. He said the focus has to be on what is coming over the horizon, not on trying to fight the last crisis.
He also made the point that to do this, raw data was like gold dust to academics in his line of research. He is often looking for the lowest level of granularity in the data, but he does admit that confidentiality issues can be a significant barrier to that. He suggested that a solution to accessing this data could be secure facilities, similar to those being used for public administrative data.
At this point, the discussion was opened up to those around the table, and a main bone of contention was how far the standardisation of data should go. Cited as an example were economic statistics that are adjusted to bring them in line with regional standards (such as the recent revisions in GDP by the ONS).
Although smoothing data throughout the EU may make it easier to compare in that context, problems arise when it is compared outside that sphere without being questioned. Different users have very different needs for the data, so a consensus of sorts formed around the idea that both the raw and adjusted data should be published, with detailed metadata attached to clearly describe each series.
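A sketch of what "raw plus adjusted, with metadata" could look like in practice. All field names and figures here are hypothetical illustrations, not an actual statistical schema: the idea is simply that each published series carries tags describing whether and how it was adjusted, so a downstream user can check comparability before putting two series side by side.

```python
from dataclasses import dataclass, field

@dataclass
class Series:
    name: str
    values: list                 # the observations themselves
    metadata: dict = field(default_factory=dict)

raw = Series(
    name="GDP, quarterly",
    values=[512.3, 518.9, 521.4],   # illustrative figures only
    metadata={"adjusted": False,
              "source": "national accounts",
              "standard": "none (as collected)"},
)
adjusted = Series(
    name="GDP, quarterly",
    values=[514.1, 517.2, 522.0],   # illustrative figures only
    metadata={"adjusted": True,
              "source": "national accounts",
              "standard": "ESA 2010",
              "adjustment": "seasonal + regional harmonisation"},
)

def comparable(a, b):
    # Two series are directly comparable only if compiled to the same standard.
    return a.metadata.get("standard") == b.metadata.get("standard")

print(comparable(raw, adjusted))  # False
```

With this kind of tagging, the "compared outside that sphere without being questioned" failure mode becomes mechanically detectable rather than something each analyst must remember.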
The IMF and FSB are due to report back to the G20 finance ministers and central bankers in September of this year with a proposal for what shape the second phase of the initiative should take. They will be looking to build upon the existing 20 recommendations but through meetings like this one they are seeking to discover what emerging information needs there will be in the coming years.