Since we published our last article on how firms can efficiently develop a data strategy to manage the FRTB data requirements, the world has been impacted by the COVID-19 pandemic. As a result, adjustments have been made to the implementation deadlines of many new regulations. Here is a high-level overview of the revised timelines for the Fundamental Review of the Trading Book (FRTB):

FRTB is an umbrella term for a range of regulatory risk deliverables setting the minimum requirements for market risk capital. Such a complex and large-scale regulatory programme requires significant business process, data, and technology modifications, as well as comprehensive governance and reporting requirements. Given the complexity, risk managers should remain focused on the revised rules with urgency, and the temptation to implement them as a “bolt-on” set of new feeds or processes should be avoided.

Consistent data architecture

FRTB implementation requires considerable data-heavy work to ensure banks comply with its stringent data requirements.  The vastness of those requirements will pressure banks’ infrastructure in many ways. FRTB’s front-to-back risk computation and aggregation demands often require major changes to banks’ technology and data architecture and, even more broadly, data strategy.

The complexity of the problem is exacerbated for organisations operating across different jurisdictions with multiple regulators.  Such organisations might consider splitting data by regulator or creating a gold-standard data-architecture approach encompassing all jurisdictions, branches, and subsidiaries.  However, creating a jurisdictional data architecture programme is never one size fits all, and the scale and complexity of FRTB add to the rigour of analysis needed.

Even if some locations are regulator-agnostic, market-data differences between regions also raise issues. For example, a Hong Kong front office system might use a different time point for calculating P&L compared to a London market risk system that uses a UK close of business timestamp.
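
To make the point concrete, the sketch below (in Python, with purely illustrative prices, quantities, and timestamps of our own invention) shows how the same position can produce two different daily P&L figures simply because two systems snap the market at different close-of-business times:

```python
from datetime import datetime, timezone, timedelta

# Illustrative snapshots of the same position, taken at different close-of-business times.
position_qty = 1_000_000  # hypothetical notional
prev_price = 7.8460       # previous day's agreed closing price

# Hong Kong front office snapshot (HKT close, UTC+8)
hk_close = datetime(2023, 3, 1, 18, 0, tzinfo=timezone(timedelta(hours=8)))
hk_price = 7.8495

# London market risk snapshot (UK close, UTC)
ldn_close = datetime(2023, 3, 1, 17, 0, tzinfo=timezone.utc)
ldn_price = 7.8512

# The "same" daily P&L differs purely because of the snapshot time used.
pnl_front_office = position_qty * (hk_price - prev_price)
pnl_market_risk = position_qty * (ldn_price - prev_price)

print(f"Front office P&L (snapped {hk_close:%H:%M %Z}): {pnl_front_office:,.0f}")
print(f"Market risk P&L (snapped {ldn_close:%H:%M %Z}):  {pnl_market_risk:,.0f}")
print(f"Difference arising from timing alone: {pnl_market_risk - pnl_front_office:,.0f}")
```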

Smaller institutions or smaller business units within larger organisations may, for the first time, have to compute and report risks for portfolios falling under FRTB’s trading book definition.  They will need to implement systems and new data architecture to allow them to consume new inputs and perform calculations not required under the prior regulatory capital regime.  Larger institutions may have the needed data and infrastructure, so their efforts may be more concentrated on alignment between systems and processes.  However, this is still no mean feat.

Data integrity, lineage, ownership, and controls

Why is data integrity important, what is its relevance for FRTB, and how does it affect our design choices?

Increasing data integrity by deploying golden sources and avoiding duplicated versions of the truth reduces operational risk and inefficient data manipulation and reconciliation processes.  Capturing and maintaining data lineage means that we know our data sources, usage, and transformations.
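
As a simple illustration of what knowing our sources, usage, and transformations can mean in practice, the hypothetical sketch below attaches a minimal lineage record to each derived dataset; the class and field names are ours for illustration, not those of any particular lineage tooling:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    """Minimal lineage metadata attached to a derived dataset (illustrative only)."""
    dataset: str
    source_system: str          # the golden source the data was taken from
    transformation: str         # what was done to it
    produced_at: datetime
    upstream: list = field(default_factory=list)  # parent LineageRecords

raw_trades = LineageRecord(
    dataset="eod_trades_2023-03-01",
    source_system="trade_capture_golden_source",
    transformation="extract",
    produced_at=datetime.now(timezone.utc),
)

risk_positions = LineageRecord(
    dataset="risk_positions_2023-03-01",
    source_system="risk_engine",
    transformation="aggregate trades to desk-level positions",
    produced_at=datetime.now(timezone.utc),
    upstream=[raw_trades],
)

# Walking `upstream` answers "where did this number come from?" for audit and controls.
```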

High on a risk manager’s list of concerns for implementing the Internal Models Approach (IMA) is the stringent Profit and Loss Attribution (PLA) test, which will fail if sources and processes are inconsistent.  To avoid this risk, hard decisions need to be made about pushing some data-provider functions, such as risk calculations, upstream to eliminate duplicative processing.  Although such a decision requires major process, data, and technology transformation, for certain banks it will not be possible to function under an internal-models waiver without such core changes.  Controls must be implemented that will facilitate maintaining data integrity, thus ensuring compliance going forward.
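
For illustration, the sketch below computes the two PLA test metrics, the Spearman correlation and the Kolmogorov-Smirnov statistic, between hypothetical and risk-theoretical P&L series and maps them to indicative traffic-light zones. The thresholds reflect our reading of the Basel text and should be confirmed against the applicable final rules; the P&L data is simulated purely for illustration.

```python
import numpy as np
from scipy.stats import spearmanr, ks_2samp

def pla_metrics(hpl: np.ndarray, rtpl: np.ndarray):
    """Spearman correlation and KS statistic between hypothetical and risk-theoretical P&L."""
    rho, _ = spearmanr(hpl, rtpl)
    ks_stat, _ = ks_2samp(hpl, rtpl)
    return rho, ks_stat

def pla_zone(rho: float, ks_stat: float) -> str:
    """Indicative traffic-light zones (thresholds as we read the Basel text)."""
    if rho < 0.70 or ks_stat > 0.12:
        return "red"
    if rho > 0.80 and ks_stat < 0.09:
        return "green"
    return "amber"

# Illustrative: RTPL as a noisy, slightly perturbed version of HPL for one desk over ~250 days.
rng = np.random.default_rng(0)
hpl = rng.normal(0, 1_000_000, 250)
rtpl = hpl + rng.normal(0, 150_000, 250)

rho, ks = pla_metrics(hpl, rtpl)
print(f"Spearman={rho:.3f}, KS={ks:.3f}, zone={pla_zone(rho, ks)}")
```

If sources or processes diverge between the front office (HPL) and risk (RTPL) feeds, the correlation degrades and the KS distance grows, pushing the desk towards the amber or red zone.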

US banks that intend to use the IMA will need to have systems in place for the PLA and backtesting requirements. Data will need to be accumulated over a 12-month period, and processes need to be in place for modelling, data collection, and testing. In the US, desk structures need to be aligned with both FRTB and the Volcker Rule.
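
A minimal sketch of the backtesting side, assuming the familiar exception-counting approach against a 99% one-day VaR over roughly 250 trading days (both the data and the static VaR figure below are illustrative):

```python
import numpy as np

def count_var_exceptions(actual_pnl: np.ndarray, var_99: np.ndarray) -> int:
    """Count days where the realised loss exceeds the previous day's 99% one-day VaR."""
    return int(np.sum(-actual_pnl > var_99))

def backtesting_zone(exceptions: int) -> str:
    """Basel traffic-light zones over ~250 observations (indicative)."""
    if exceptions <= 4:
        return "green"
    if exceptions <= 9:
        return "amber"
    return "red"

rng = np.random.default_rng(1)
pnl = rng.normal(0, 1_000_000, 250)      # illustrative daily desk P&L
var_99 = np.full(250, 2_330_000)         # illustrative static 99% VaR estimate

n = count_var_exceptions(pnl, var_99)
print(f"{n} exceptions over 250 days -> {backtesting_zone(n)} zone")
```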

None of these concepts are new, and all were highlighted by BCBS 239.  Indeed, BCBS 239 arose from the wreckage of the financial crisis, when many banks were found to have no clear idea of their risk exposure due to weak data and reporting practices.

Technology architecture and the cloud

How does solving our FRTB requirements fit into the bank’s overall systems landscape?  We need to ask: Are there any redundant systems, or are any decommissions required?  How does this align with other general system or data-flow rationalisation initiatives in flight (and do we have dependencies)?  Could FRTB data harmonisation initiatives help those other programmes?  Or the contrary: could other in-flight projects, such as IBOR replacement, add complexity to or delay our FRTB programme?

Vendor systems and data are also a consideration.  Smaller organisations may focus on the Standardised Approach and find it cheaper to buy a vendor solution.  A larger organisation may be considering using a vendor’s market data for Non-Modellable Risk Factors (NMRFs) or for historical market data simulation purposes.
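
Where vendor data is being evaluated for NMRF purposes, a first sanity check is whether a risk factor’s real price observations would pass the risk factor eligibility test. The sketch below encodes the commonly cited criteria as we understand them; the exact requirements should be confirmed against the final Basel standard and the local implementation.

```python
from datetime import date, timedelta
from typing import List

def passes_rfet(observation_dates: List[date], as_of: date) -> bool:
    """Indicative risk factor eligibility test on real price observation dates.

    Criteria sketched here (confirm against the final rules):
      (a) at least 24 observations in the past 12 months, with at least 4 in
          every 90-day window, or
      (b) at least 100 observations in the past 12 months.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)

    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False

    # Check that every rolling 90-day window contains at least 4 observations.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True

# Illustrative: roughly fortnightly observations over the past year.
dates = [date(2023, 3, 1) - timedelta(days=14 * i) for i in range(26)]
print(passes_rfet(dates, as_of=date(2023, 3, 1)))
```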

The IMA will need a full revaluation or simulation-based approach to recalculating position values.  Could any new in-house technology initiatives be leveraged to support this, such as moving to the cloud for compute and storage?  Could this, in turn, be used by other areas, such as the front office for intraday risk, or for CCAR?
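
Full revaluation across hundreds of historical scenarios is embarrassingly parallel, which is what makes elastic cloud compute attractive. The sketch below, using a deliberately toy pricer and Python’s standard-library process pool, shows the general shape of farming scenario revaluations out to workers; in practice the pricer would call the bank’s pricing library and the pool would be a cloud grid.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def revalue(position: dict, scenario: dict) -> float:
    """Toy full-revaluation pricer: reprice the position under shocked market data."""
    shocked_spot = position["spot"] * (1 + scenario["spot_shock"])
    return position["quantity"] * shocked_spot

def scenario_pnl(position: dict, scenarios: list) -> np.ndarray:
    """P&L of one position under a chunk of scenarios."""
    base = position["quantity"] * position["spot"]
    return np.array([revalue(position, s) - base for s in scenarios])

if __name__ == "__main__":
    position = {"quantity": 10_000, "spot": 101.5}
    scenarios = [{"spot_shock": s} for s in np.random.default_rng(2).normal(0, 0.01, 260)]

    # Split the scenario set into chunks; each chunk could run on a separate cloud worker.
    n_workers = 4
    chunks = [scenarios[i::n_workers] for i in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(scenario_pnl, [position] * n_workers, chunks)
    pnl_vector = np.concatenate(list(results))

    # Rough expected-shortfall proxy: mean of the worst 2.5% of scenario P&Ls.
    worst = np.sort(pnl_vector)[: int(0.025 * len(pnl_vector))]
    print(f"97.5% ES proxy from {len(pnl_vector)} scenarios: {worst.mean():,.0f}")
```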

Conclusion

If we are competing for resources, there may be pressure to build something to get us ‘over the line’ in the short term.  However, for many firms, this could put their ability to stay on the IMA at risk while resulting in higher costs for future regulatory deliveries.  Firms should concentrate on agreeing design principles that align front office, risk, and finance data and models while also enhancing market and reference data at the enterprise level.

Delivering all this requires alignment across process, data, and technology.  Data programme initiation and maintenance models such as the Data Management Capability Assessment Model (DCAM), used by over 60% of the finance industry, create a common mindset and culture around data, increasing firms’ ability to succeed on complex data-heavy regulatory programmes such as FRTB and BCBS 239.