The Fundamental Review of the Trading Book (FRTB) is an umbrella term for the range of regulatory deliverables implementing the revised minimum capital requirements for market risk. With a BCBS implementation date slated for 2022, risk managers must focus on the new rules with some urgency.

FRTB is a complex, large-scale regulatory programme that requires major business process, data, and technology modifications. Given that complexity, the temptation to implement it as a “bolt-on” set of new feeds or processes should be avoided.

Consistent data architecture

Although FRTB is not primarily about data, banks will need to undertake considerable data-heavy work to comply with its stringent data requirements. The vastness of those requirements will put pressure on banks’ infrastructure in many ways: FRTB’s front-to-back risk computation and aggregation demands often require major changes to banks’ technology and data architecture and, more broadly still, to their data strategy.

The complexity of the problem is exacerbated for organisations operating across different jurisdictions, with multiple regulators. Such organisations might consider splitting data by regulator, or creating a gold-standard data-architecture approach that encompasses all jurisdictions, branches and subsidiaries. A jurisdictional data architecture programme is never one-size-fits-all, and the scale and complexity of FRTB add to the rigour of analysis needed.

Even if some locations are regulator-agnostic, market-data differences between regions still raise issues. For example, a Hong Kong front-office system might use a different time point for calculating P&L than a London market risk system that uses a UK close-of-business timestamp, as the sketch below illustrates.
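To make the timing mismatch concrete, the short Python sketch below resolves each system’s close-of-business snapshot to UTC and shows the gap between them. The 18:00 and 17:30 cut-off times are assumptions for illustration only, not a statement of any firm’s actual conventions.

```python
# Resolve two regional close-of-business (COB) snapshot times to UTC and show
# how far apart the "same" business date actually is. Cut-off times are
# illustrative assumptions.
from datetime import date, datetime, timezone
from zoneinfo import ZoneInfo

COB_TIMES = {
    "HK_FRONT_OFFICE": ("Asia/Hong_Kong", 18, 0),   # assumed 18:00 local cut
    "LDN_MARKET_RISK": ("Europe/London", 17, 30),   # assumed 17:30 local cut
}

def cob_utc(system: str, business_date: date) -> datetime:
    """Resolve a system's close-of-business snapshot for a date to UTC."""
    tz_name, hour, minute = COB_TIMES[system]
    local = datetime(business_date.year, business_date.month, business_date.day,
                     hour, minute, tzinfo=ZoneInfo(tz_name))
    return local.astimezone(timezone.utc)

d = date(2021, 3, 15)
hk = cob_utc("HK_FRONT_OFFICE", d)
ldn = cob_utc("LDN_MARKET_RISK", d)
print(hk, ldn, ldn - hk)   # roughly a 7.5-hour gap: the two systems see different markets
```

Any P&L or risk figure computed from the earlier snapshot will reflect a different market state from one computed at the later cut, purely for timing reasons.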

Smaller institutions or smaller business units within larger organisations may for the first time have to compute and report risks for portfolios falling under FRTB’s trading book definition.  They will need to implement systems and a new data architecture that allow them to consume new inputs and perform calculations not needed under the old regulatory capital regime.  Larger institutions may have the needed data and infrastructure, so their efforts may be more concentrated on alignment between systems and processes.  This is still no mean feat.

Data integrity, lineage, ownership and controls

Why is data integrity important, what is the relevance for FRTB, and how does it affect our design choices?

Increasing data integrity by deploying golden sources and avoiding duplicated versions of the truth reduces operational risk and cuts out inefficient data manipulation and reconciliation processes. Capturing and maintaining data lineage means that we know our data sources, how the data is used, and how it is transformed, as in the sketch below.
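As a simple illustration of the kind of lineage metadata worth capturing against each risk input, the hypothetical record below tracks source, transformation, and ownership. The field names are illustrative and do not refer to any particular metadata tool.

```python
# A minimal, illustrative lineage record: where a value came from, how it was
# transformed, and who owns it. Field names are assumptions for this sketch.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    dataset: str            # e.g. "eod_equity_prices"
    source_system: str      # golden source the value was taken from
    transformation: str     # how the raw value was derived or adjusted
    owner: str              # accountable data owner
    as_of: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

record = LineageRecord(
    dataset="eod_equity_prices",
    source_system="market_data_golden_source",
    transformation="fx-converted to USD using EOD rates",
    owner="market-data-operations",
)
print(record)
```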

High on a risk manager’s list of concerns for implementing the Internal Models Approach (IMA) is the stringent Profit and Loss Attribution (PLA) test, which will fail if sources and processes are inconsistent. To avoid this risk, hard decisions need to be made about pushing some data-provider functions, such as risk calculations, upstream to eliminate duplicative processing. Such a decision requires major process, data, and technology transformation, yet for certain banks it will not be possible to operate under an internal-models waiver without these core changes. Controls must be implemented to maintain data integrity and ensure ongoing compliance.
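To make the source-consistency point concrete, the Python sketch below computes the two PLA test metrics for a desk: the Spearman rank correlation and the Kolmogorov-Smirnov distance between daily hypothetical P&L (from front-office systems) and risk-theoretical P&L (from the risk model). The thresholds shown reflect the commonly cited green/red boundaries in the 2019 FRTB text and should be checked against the final rules in each jurisdiction.

```python
# Illustrative PLA test metrics for one trading desk. Thresholds are the
# widely cited green/red boundaries; verify against the applicable rulebook.
import numpy as np
from scipy.stats import spearmanr, ks_2samp

def pla_test(hpl: np.ndarray, rtpl: np.ndarray) -> dict:
    """Return the two PLA metrics and a traffic-light zone for a desk."""
    corr, _ = spearmanr(hpl, rtpl)       # rank-correlation metric
    ks_stat, _ = ks_2samp(hpl, rtpl)     # distributional-distance metric
    if corr > 0.80 and ks_stat < 0.09:
        zone = "green"
    elif corr < 0.70 or ks_stat > 0.12:
        zone = "red"
    else:
        zone = "amber"
    return {"spearman": corr, "ks": ks_stat, "zone": zone}

# Simulated example: when HPL and RTPL come from inconsistent sources, the
# metrics degrade and the desk drifts towards the red zone.
rng = np.random.default_rng(0)
hpl = rng.normal(0.0, 1.0, 250)                    # ~ one year of daily P&L
rtpl_aligned = hpl + rng.normal(0.0, 0.1, 250)     # consistent sources
rtpl_misaligned = hpl + rng.normal(0.0, 1.0, 250)  # inconsistent sources
print(pla_test(hpl, rtpl_aligned))
print(pla_test(hpl, rtpl_misaligned))
```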

None of these concepts are new, and all were highlighted by BCBS 239. Indeed, BCBS 239 arose from the wreckage of the financial crisis, when many banks were found to have no clear idea of their risk exposure because of their weak data and reporting practices.

Technology architecture and cloud

How does solving our FRTB requirements fit into the bank’s overall systems landscape? We need to ask: are there any redundant systems, or are any decommissions required? How does this align with other system or data-flow rationalisation initiatives in flight, and do we have dependencies on them? Could FRTB data-harmonisation initiatives help those other programmes? Or, conversely, could other in-flight projects (e.g. IBOR replacement) add complexity to, or delay, our FRTB programme?

Vendor systems and data are also a consideration. Smaller organisations may focus on the Standardised Approach and find it cheaper to buy a vendor solution. A larger organisation may be considering using a vendor’s market data for Non-Modellable Risk Factors (NMRF) or for historical market-data simulation.

The IMA will need a full-revaluation or simulation-based approach to recalculating position values, along the lines of the sketch below. Could any new in-house technology initiatives, such as a move to the cloud for compute and storage, be leveraged to support this? Could that capacity in turn be used by other areas, such as the front office for intraday risk, or CCAR?
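The Python sketch below shows the compute pattern implied by full revaluation, using placeholder names (price_fn, positions, and the equity toy example are assumptions for illustration): reprice the whole portfolio under each scenario, build a P&L vector, and take a 97.5% expected shortfall. A production run would distribute the revaluation loop across a compute grid or cloud cluster.

```python
# Minimal full-revaluation pattern: reprice positions under each scenario,
# difference against the base case, then take a 97.5% expected shortfall.
import numpy as np

def expected_shortfall(pnl: np.ndarray, alpha: float = 0.975) -> float:
    """Average loss in the worst (1 - alpha) tail of a P&L vector."""
    losses = -pnl
    cutoff = np.quantile(losses, alpha)
    return float(losses[losses >= cutoff].mean())

def full_revaluation_pnl(positions, base_market, scenarios, price_fn):
    """Reprice the portfolio under each scenario and difference vs. the base case."""
    base_value = sum(price_fn(pos, base_market) for pos in positions)
    return np.array([
        sum(price_fn(pos, shocked) for pos in positions) - base_value
        for shocked in scenarios
    ])

# Toy example: a single equity position repriced under historical-style shocks.
price_fn = lambda pos, mkt: pos["qty"] * mkt["spot"]
positions = [{"qty": 1_000}]
base_market = {"spot": 100.0}
rng = np.random.default_rng(1)
scenarios = [{"spot": 100.0 * (1 + r)} for r in rng.normal(0.0, 0.02, 250)]
pnl = full_revaluation_pnl(positions, base_market, scenarios, price_fn)
print(f"97.5% ES: {expected_shortfall(pnl):,.0f}")
```

The expensive part is the revaluation loop, which scales with positions times scenarios; that is what makes elastic cloud compute attractive, and what other consumers such as intraday risk or CCAR could reuse.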

Conclusion

If we are competing for resources, there may be pressure to build something that gets us ‘over the line’ in the short term. For many firms, however, this could put at risk their ability to stay within the IMA going forward, while resulting in higher costs for future regulatory deliveries. Firms should instead concentrate on agreeing design principles that align front-office, risk, and finance data and models, while also enhancing market and reference data at the enterprise level.

Delivering all this requires alignment across process, data, and technology. Data programme initiation and maintenance models such as DCAM (the Data Management Capability Assessment Model), used by over 60% of the financial industry, create a common mindset and culture around data, increasing firms’ ability to succeed on complex, data-heavy regulatory programmes such as FRTB and BCBS 239.