From IBS Intelligence:
Banks are setting aside more resources as the Fundamental Review of the Trading Book (FRTB) looms large. A large part of their preparation involves balancing capital between the standardised approach (SA) and the internal model approach (IMA).
For smaller banks, the infrastructure the IMA demands in terms of systems, data and processes would cost more than the capital savings it offers over the SA. The SA calculations also have synergies with risk calculations for other regulations, such as the upcoming Initial Margin (IM) requirements. So, for institutions that will be pulled into the upcoming IM tranche next September, it makes sense to include SA calculations in their preparations.
Banks with larger trading books, which include instruments or assets with significant price variation, will however benefit from switching to the IMA. While more costly, an approved internal model means the capital adequacy reserves that need to be set aside are lower, so the greater investment is worthwhile in the long run. Caught up in these discussions, it can be easy to forget one key thing: your calculations are only as good as your data.
If you don’t have the correct data processes in place, the change from managing the classic one year’s worth of historical data to 10 years could be extremely painful and costly from an operational perspective. This means it’s time to get your data house in order.
A good first step is to make sure that everything is bucketed correctly. Regardless of whether you choose the SA or the IMA, the data backing your calculations needs to be reliable and timely.
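As a loose sketch of what “bucketed correctly” can mean in practice, the Python below groups risk-factor records by risk class and bucket and flags anything unmapped or stale. The field names, risk classes and staleness threshold are illustrative assumptions, not the full SA taxonomy.

```python
from collections import defaultdict
from datetime import date

# Illustrative records: each entry is a risk factor with the bucket metadata
# the SA aggregation needs and the date its market data was last refreshed.
sensitivities = [
    {"risk_factor": "EUR_SWAP_10Y", "risk_class": "GIRR", "bucket": "EUR",
     "last_updated": date(2024, 6, 28)},
    {"risk_factor": "XYZ_CORP_5Y", "risk_class": "CSR", "bucket": None,
     "last_updated": date(2024, 6, 10)},
]

def bucket_and_check(records, as_of, max_staleness_days=5):
    """Group records by (risk_class, bucket) and collect data-quality exceptions."""
    buckets = defaultdict(list)
    exceptions = []
    for rec in records:
        if rec["bucket"] is None:
            exceptions.append((rec["risk_factor"], "unmapped bucket"))
            continue
        if (as_of - rec["last_updated"]).days > max_staleness_days:
            exceptions.append((rec["risk_factor"], "stale data"))
        buckets[(rec["risk_class"], rec["bucket"])].append(rec["risk_factor"])
    return buckets, exceptions

buckets, exceptions = bucket_and_check(sensitivities, as_of=date(2024, 6, 28))
print(dict(buckets))   # {('GIRR', 'EUR'): ['EUR_SWAP_10Y']}
print(exceptions)      # [('XYZ_CORP_5Y', 'unmapped bucket')]
```

Surfacing exceptions like these daily, rather than discovering them at calculation time, is what keeps the downstream SA or IMA numbers trustworthy.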
Automation is key. Handling the data process manually is cumbersome: you have to put rules in place to identify the right proxy, check whether a price is valid and, if not, go to alternative sources. While this may be manageable for one year’s worth of historical data, it becomes nigh-on impossible across 10 years.
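To make that concrete, here is a minimal sketch of such a rules-based waterfall in Python: it checks each candidate price for basic validity, falls back through alternative sources and, failing that, records a proxy mapping. The source names, tolerance and proxy table are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Optional

# Toy quote store keyed by (source, risk_factor, date); in reality these would
# be vendor feeds and internal marks. All names and values are illustrative.
QUOTES = {
    ("primary_vendor", "EURUSD", "2024-06-28"): None,      # missing observation
    ("secondary_vendor", "EURUSD", "2024-06-28"): 1.0712,
}
SOURCES = ["primary_vendor", "secondary_vendor"]            # preference order
PROXIES = {"EURUSD": "EURUSD_PROXY"}                        # pre-agreed proxy map

@dataclass
class PricePoint:
    risk_factor: str
    date: str
    value: Optional[float]
    source: str

def is_valid(price: Optional[float], prev: Optional[float], max_jump: float = 0.5) -> bool:
    """Basic validity rules: the price exists, is positive, and has not moved
    by more than max_jump (here 50%) against the previous observation."""
    if price is None or price <= 0:
        return False
    if prev is not None and abs(price / prev - 1) > max_jump:
        return False
    return True

def resolve_price(risk_factor: str, date: str, prev: Optional[float]) -> PricePoint:
    """Walk the source waterfall; if nothing validates, record the proxy mapping."""
    for source in SOURCES:
        candidate = QUOTES.get((source, risk_factor, date))
        if is_valid(candidate, prev):
            return PricePoint(risk_factor, date, candidate, source)
    # No usable observation: fall back to the proxy so the gap stays auditable.
    return PricePoint(risk_factor, date, None, f"proxy:{PROXIES.get(risk_factor, 'unmapped')}")

print(resolve_price("EURUSD", "2024-06-28", prev=1.07))
# PricePoint(risk_factor='EURUSD', date='2024-06-28', value=1.0712, source='secondary_vendor')
```

Run daily across a full risk-factor universe and 10 years of history, rules like these are exactly the kind of work that has to be automated rather than handled by hand.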
It’s not just a case of difficult calculations: under BCBS 239, regulators are increasingly coming down on firms that leave their compliance open to human error. With the much larger data sets in scope for FRTB, even semi-automated processes will need to be fully automated.
However, it’s not all bad news. Banks that get these processes right stand to gain more than just avoiding compliance headaches. Automating the data flow frees employees who would previously have been waiting for batch processes to finish to focus on extracting the most value from the huge amounts of data now available to them.
Having a solid data foundation and an automated data process in place won’t just help with FRTB. Synergies with calculations for other regulations such as IM, and with ad-hoc regulatory stress tests, mean that addressing FRTB requirements thoroughly makes complying with other looming regulations quicker and easier.