How Technology and Data Management Are Influencing Banks’ Response to the Basel III Endgame
The tech and data implications of tighter capital standards
The Basel III endgame is the latest effort by US bank regulators to increase regulatory oversight of banks. The initiative will raise the cost of banks' lending activities, reduce available capital, expand regulatory reporting requirements, and place greater demands on the infrastructure of the largest US banks.
There are vast implications for the way banks allocate capital, manage their risk, master data, and ensure the technology that supports their institutions meets regulators’ increased demands.
What’s changing with the Basel III endgame initiative?
Banks with more than $100 billion in assets have until July 2025 to address the changes proposed by regulators.
A key area of scrutiny in this initiative is operational risk. The Federal Reserve Board is putting a spotlight on the people, processes, and technology that support banks' capital markets functions and beyond. Banks that experience operational lapses in daily functions, data management, and technology oversight will be subject to increased capital requirements for their lending and trading activities. To meet the higher bar on operational oversight, management teams across the banking system will place greater emphasis on operational excellence, data integrity, and technological efficiency.
Given the new demands, it’s critical for banks to think strategically about their technology, data management, and processes. Legacy technology platforms and data flows will be stressed by the increased requirements for real-time and near-real-time data.
It’s time for banks to reassess their current tech stacks and introduce transformative measures to build flexible and scalable platforms.
What are the challenges of Basel III endgame?
The biggest burden of the Basel III mandate is an increase in the risk-weighted assets (RWA) applied to many of the loans, structured products, and trading activities of banks. As a result, the data management and technology platforms banks use will face a new set of responsibilities.
Expanded focus on the risk-based approach to calculating RWAs will require new models, more data, and accelerated timelines to process this data to produce regulatory reports. RWA allocations will increase 19% on average across large banks.1
The Basel Committee on Banking Supervision estimates that Basel III endgame will increase market risk capital requirements by 57% on average for global systemically important banks.2 With significant cost increases and balance sheet constraint measures coming, it’s imperative for banks to have timely access to high-quality data to ensure accurate RWA calculations.
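To see why a higher RWA figure matters, it helps to remember that minimum capital is a fixed percentage of RWA, so any RWA increase flows straight through to required capital. The sketch below is illustrative only: the $500B starting RWA is a hypothetical figure, and the 8% ratio is the baseline minimum, ignoring buffers and surcharges.

```python
# Illustrative only: how an increase in risk-weighted assets (RWA)
# translates into additional required capital. All figures hypothetical.

def required_capital(rwa: float, capital_ratio: float = 0.08) -> float:
    """Minimum capital = RWA x required capital ratio (8% baseline)."""
    return rwa * capital_ratio

rwa_before = 500e9             # hypothetical $500B RWA before the endgame rules
rwa_after = rwa_before * 1.19  # ~19% average increase cited above

additional = required_capital(rwa_after) - required_capital(rwa_before)
print(f"Additional capital required: ${additional / 1e9:.1f}B")
# prints: Additional capital required: $7.6B
```

The point of the arithmetic: a bank does not need an operational failure to feel the rule change; the RWA uplift alone raises the capital floor.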
What information is necessary to meet Basel III demands?
Banks will need to shoulder the burden of increased data demands. Increased scrutiny will focus on:
- Exposure to depository institutions
- Cross default considerations
- Retail exposure to credit cards, revolving facilities, and loans
- Unused lines of credit
Large US banks, particularly those with large capital markets units, are accustomed to managing millions of data points every day. That data is used across their ecosystem for regulatory reporting, management reports, market surveillance, funding, and more. Increased focus on external data sources will require technology and data leaders to rethink how they capture data from an ever-growing list of sources and integrate that data with internal sources. Greater demands for regulatory reporting increase the need to access that data faster and more reliably.
What does operational risk mean for banks?
Martin Gruenberg, Chairman of the FDIC, said, “Operational risk refers to the risk of loss resulting from inadequate or failed internal processes, people, and systems, or from external events.” Gruenberg added, “Operational risk exposures have been, and continue to be, a persistent and growing risk for financial institutions.”3
Penalties for banks that fail to meet the new Basel III standards will be harsh. A multiplier will be applied to regulatory capital requirements for banks cited for operational failures. Heads of operations will have a new level of scrutiny on their ability to keep their business units functioning at optimal levels.
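The multiplier mechanism mirrors the Basel III standardized approach to operational risk, where the capital charge is a Business Indicator Component (BIC) scaled by an Internal Loss Multiplier (ILM) that grows with a bank's historical operational losses. A minimal sketch, using hypothetical dollar figures (the formula follows the Basel standard; the inputs are invented for illustration):

```python
import math

def internal_loss_multiplier(loss_component: float, bic: float) -> float:
    """ILM = ln(e - 1 + (LC / BIC)^0.8). Equals 1 when LC == BIC,
    and rises above 1 as historical operational losses grow."""
    return math.log(math.e - 1 + (loss_component / bic) ** 0.8)

def operational_risk_capital(bic: float, avg_annual_losses: float) -> float:
    """Operational risk charge = BIC x ILM, where the loss component
    is 15x average annual operational losses per the standard."""
    lc = 15 * avg_annual_losses
    return bic * internal_loss_multiplier(lc, bic)

# Hypothetical inputs: $2B business indicator component,
# $200M average annual operational losses.
print(f"ORC: ${operational_risk_capital(2e9, 200e6) / 1e9:.2f}B")
# prints: ORC: $2.26B
```

The design choice is what matters to operations heads: because realized losses feed the multiplier, every operational failure raises the capital charge going forward, not just the one-time loss.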
What tech considerations do banks need to address?
Senior management, data officers, and technology leaders at banks must address several key questions:
1. Does current infrastructure meet the demands of the markets’ continued evolution?
2. Will technology, processes, and people meet the elevated standards of the mandate’s operational risk requirements?
3. Do data management capabilities allow timely access to critical information that exists in trading, risk, treasury and funding, accounting, market surveillance, and management systems?
4. Does the critical data needed to support the regulations come from a reliable system of record?
5. Are banks able to ingest critical data, normalize it, and transform it in a timely and scalable manner?
Chief technology officers have to navigate legacy tech stacks that support disparate business units, asset complexity, numerous regions and legal entities, countless formats, and more.
How should banks be evaluating their systems?
An immediate impact assessment of all technology and processes is warranted. Banks must examine their enterprise data management systems to understand their ability to accurately capture transactions and prices, manage life cycle events, and reconcile information promptly.
Here are just some of the areas banks will need to assess:
- Front-end systems will be taxed with delivering data in real time
- Risk systems will require retrofitting new models to provide outputs to meet the daily demands of regulatory reporting
- Data lakes and data warehouses are increasingly critical repositories of financial data
Many of these systems and platforms were built to support business units in isolation. Homegrown builds, acquisitions, and software bought at the business unit level may drive efficiency and profitability for that unit. However, they can pose a headache for data architects who need normalized data to create structured output for regulators, senior management, and more.
Rather than re-engineer systems, banks will increasingly look to extract, transform, load (ETL) layers to ingest and normalize the data. These ETL capabilities will enable them to structure data for multiple use cases and maintain data lineage for an auditable trail of what transpired and why.
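The normalization step of such an ETL layer can be sketched as follows. The source formats, field names, and system names below are hypothetical, invented for illustration; the point is that each disparate source maps into one common schema while every record carries a lineage tag back to its system of origin.

```python
# Minimal sketch of an ETL normalization layer. The two "source systems"
# and their field names are hypothetical, not any vendor's actual format.
from dataclasses import dataclass
from datetime import date

@dataclass
class NormalizedTrade:
    trade_id: str
    notional: float
    currency: str
    trade_date: date
    source_system: str   # lineage: which system this record came from

def from_equities_system(raw: dict) -> NormalizedTrade:
    """Map a (hypothetical) equities-desk record into the common schema."""
    return NormalizedTrade(
        trade_id=raw["TradeRef"],
        notional=float(raw["Qty"]) * float(raw["Px"]),
        currency=raw["Ccy"],
        trade_date=date.fromisoformat(raw["TrdDt"]),
        source_system="equities",
    )

def from_rates_system(raw: dict) -> NormalizedTrade:
    """Map a (hypothetical) rates-desk record into the common schema."""
    return NormalizedTrade(
        trade_id=raw["deal_id"],
        notional=float(raw["notional_usd"]),
        currency="USD",
        trade_date=date.fromisoformat(raw["value_date"]),
        source_system="rates",
    )
```

Once every record lands in the common schema, the same normalized set can feed regulatory reports, management reporting, and risk models, and the `source_system` tag preserves the audit trail back to the original feed.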
What is the call to action?
It’s time for banks to review their governance structure.
Every technology and data process that supports risk management, operations, treasury and funding, audit, compliance, and regulatory reporting needs review and must be upgraded as necessary. Banks must be ready to identify and address data gaps, and to validate and execute their transformation plans.
Data strategies will be at the forefront of this transformation. Understanding that data intelligence is the lifeline of the next generation of banking is critical. Banks that embrace the change, invest in their tech and data stacks, and identify opportunities to use the changes mandated by the Basel III endgame to enhance their business models will be the clear winners.
Want to learn more about how to navigate your compliance obligations? Watch our on-demand webinar on Capital & Liquidity Management: Risk, Regulation and Technology to gain valuable insights from industry experts.
Sources:
1 Bank Capital Requirements: Basel III Endgame, Congressional Research Service, November 30, 2023
2 Basel III Monitoring Report, Basel Committee on Banking Supervision, February 2023
3 Remarks by Chairman Martin J. Gruenberg on the Basel III Endgame, Peterson Institute for International Economics, July 27, 2023