BCBS 239: a lot of delay, and "data lineage" is to blame


The 2008 crisis

The so-called “Subprime” financial crisis began in 2008 and gave way to the “Sovereign Debt” crisis. It hurt developed economies around the world; some took more than 10 years to recover.

Beyond the root causes of this episode, the crisis highlighted how hard it was for banks to manage risks on the basis of relevant, reliable reporting. Information systems and data architectures were heterogeneous and, above all, poorly controlled, which made it difficult to consolidate risks along the relevant axes of analysis within appropriate timeframes.

From there was born "BCBS 239", from the Basel Committee on Banking Supervision.

The birth of BCBS239

BCBS 239 is Standard No. 239 of the Basel Committee on Banking Supervision: in short, the IT side of a set of prudential standards aimed at warding off the specter of a new crisis like that of 2008!

BCBS 239 sets out “principles for effective risk data aggregation and risk reporting”.

The overall objective of the standard is to strengthen banks' risk data aggregation capabilities and internal risk reporting practices, thereby improving risk management and decision-making processes in banks.

“The right information must be available to the right people at the right time”.

The principles aim to put accurate risk data in front of decision makers so they can make timely and informed decisions. Simple!

This information should also allow the regulator to accurately assess a bank's risks.

It is a standard that has applied to Global Systemically Important Banks (G-SIBs), i.e. the 30 largest banks in the world, since… 2016.

Domestic Systemically Important Banks (D-SIBs) were expected to follow by 2019.

The key technical points

The principles require a good understanding of data flows and their sources, in order to validate the accuracy and freshness of the data and so guarantee the quality of risk reporting.

Of these 14 principles, three hold our attention.

Principle 2: IT Infrastructure & Data Architecture

“Information systems should be designed, deployed and maintained to fully enable risk aggregation and reporting, in normal times and in times of crisis”.

Principle 3: Accuracy and Integrity

“Data should be accurate and reliable. Data aggregation should rely heavily on automated processes to reduce the likelihood of errors”.

Principle 5: Timeliness

“Risk data must be as up-to-date as possible, exhaustive, reliable, and produced within a timeframe that takes into account the nature of the risk (liquidity, large exposures) and the situation (normal/crisis)”.

These three principles are the most “technical” in the text.

And all three relate to the concept of "data lineage", i.e. the ability to "trace" information exhaustively through the information system, from operational sources all the way to the data visualization layer.
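At its core, data lineage can be pictured as a directed graph whose nodes are datasets and reports and whose edges follow the data flows. Here is a minimal sketch of that idea; all table and dashboard names are invented for illustration:

```python
# Minimal sketch of a data-lineage graph: nodes are datasets or reports,
# and each node records the sources that feed it directly.
# All names (core_banking.trades, dashboard.credit_risk, ...) are hypothetical.
from collections import defaultdict

class LineageGraph:
    def __init__(self):
        self.upstream = defaultdict(set)  # node -> its direct sources

    def add_flow(self, source, target):
        """Record that `target` is fed by `source`."""
        self.upstream[target].add(source)

    def trace_sources(self, node):
        """Return every node reachable upstream of `node` (its full lineage)."""
        seen = set()
        stack = [node]
        while stack:
            for src in self.upstream[stack.pop()]:
                if src not in seen:
                    seen.add(src)
                    stack.append(src)
        return seen

g = LineageGraph()
g.add_flow("core_banking.trades", "staging.trades")
g.add_flow("staging.trades", "risk.exposures")
g.add_flow("market_data.prices", "risk.exposures")
g.add_flow("risk.exposures", "dashboard.credit_risk")

print(sorted(g.trace_sources("dashboard.credit_risk")))
# ['core_banking.trades', 'market_data.prices', 'risk.exposures', 'staging.trades']
```

In practice the edges would be extracted automatically (by parsing ETL jobs, SQL, and dashboard definitions) rather than declared by hand, which is precisely the point made below.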

A relative failure

It is clear that things are not progressing at the expected pace.

The Basel Committee's 2020 progress report indicates that, despite some progress, banks still struggle to implement the 14 principles of BCBS 239.

And overall, it is this subject of "data lineage", i.e. the tracking of data through systems, that is most often the source of these difficulties.

Why?

"Legacy systems" have accumulated (too) many technologies over too long a period:

  • To store information (various databases, on-prem, cloud),
  • To transport information (ELT/ETL, procedural code),
  • To expose it (static BI, data visualization tools, data exploration, etc.).

As a result, Information Systems have become unfathomably complex.

The effects on the implementation of the structuring points of the text are numerous:

  • Difficulty "guaranteeing a strong framework for risk data" (Principle 1: Governance),
  • Difficulty "ensuring accurate and reliable risk data" (Principle 3: Accuracy and Integrity),
  • Difficulty "producing, aggregating and updating this data quickly" (Principle 5: Timeliness).

Some lines of work

    1. It will therefore be necessary to implement a technical, non-declarative data lineage across the Information System's storage, processing, scheduling and data visualization technologies. These technical stacks must be analyzed together to provide a wide range of analysis axes: the source and destination of information, the freshness of the feeding chains, and the details of the calculations in the data visualization layer. "Parsing" all of these technical stacks is undoubtedly the approach most likely to provide the expected answers.
    2. This data lineage must be refreshed continuously so that it strictly reflects the Information System.
    3. For a shared understanding (business, IT, regulator) of these risk reports and their feeds, "business" terms will have to be propagated end to end along the supply chains. That way, everyone can verify that the data being investigated has a reliable and documented source.
    4. These flows must be scorable against objective criteria, with warnings in case of incompleteness: feeding chains with breaks, imperfect or too-slow scheduling, or risk dashboards with errors in the formulas used to calculate the risk indicators.
    5. All declarative processes should be avoided. Teams are overworked and cannot keep this text alive accurately by hand. Automation should be the rule. A golden rule.
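Point 4 above can be sketched very simply: score each feeding chain against a few objective criteria and emit warnings on incompleteness. The criteria names, weights and thresholds below are illustrative assumptions, not part of the standard:

```python
# Hypothetical sketch of scoring a feeding chain on objective criteria.
# A "chain" is a dict of observed facts; weights and thresholds are assumptions.
def score_chain(chain):
    """Return (score out of 100, list of warnings) for a feeding chain."""
    score, warnings = 100, []
    # Criterion 1: breaks in the feeding chain (missing or undocumented hops).
    if chain.get("breaks", 0) > 0:
        score -= 40
        warnings.append(f"{chain['breaks']} break(s) in the feeding chain")
    # Criterion 2: scheduling latency vs. what the risk's nature allows.
    if chain.get("latency_hours", 0) > chain.get("max_latency_hours", 24):
        score -= 30
        warnings.append("scheduling too slow for the nature of the risk")
    # Criterion 3: errors in the dashboard formulas computing risk indicators.
    if chain.get("formula_errors", 0) > 0:
        score -= 30
        warnings.append("errors in dashboard risk-indicator formulas")
    return max(score, 0), warnings

score, alerts = score_chain(
    {"breaks": 1, "latency_hours": 6, "max_latency_hours": 24, "formula_errors": 0}
)
print(score, alerts)  # 60 ['1 break(s) in the feeding chain']
```

The real criteria would of course come out of the parsed lineage itself, not from hand-maintained dicts, in line with the "no declarative process" rule.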


BCBS 239 is both a business challenge and an opportunity. A challenge, because implementing all the points of this text requires significant investment, and not a one-time investment.

We believe that the automation of all these processes, and their ability to "live" alongside the system they analyze, is the sine qua non condition for BCBS 239 to succeed.

The upside is that BCBS 239 can benefit the entire enterprise by reinforcing the importance of data quality and data governance. This can improve decisions of all kinds, and reduce the time it takes to implement them.



