Risk Data Aggregation: Why Regulators and Banks Are Finally Paying Attention

Practical Law Article 0-613-4965

by Mayra Rodríguez Valladares, MRV Associates and Practical Law Finance
Law stated as of 19 May 2015 • USA (National/Federal)
A summary of the risk data aggregation principles and requirements applicable to banks.
In the summer of 2015, both Basel III and the Dodd-Frank Act will turn five years old. With the exception of operational risk measurement, the Basel Committee on Banking Supervision has completed most of the necessary guidelines for the Basel III regulatory capital framework (see Practice Note, Basel III: overview). According to the Davis Polk Progress Report, as of the first quarter of 2015, the 15 US financial and bank regulators have finalized over 60% of the necessary implementation rules for the Dodd-Frank Act (see also Practice Note, Road Map to the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010). Many of these new rules will require banks to produce high quality, complete, and consistent data proving and documenting their compliance. Data-intensive rules include:
  • The regulatory capital, liquidity, and leverage ratios under Basel III and related comprehensive capital adequacy requirements.
  • The Volcker Rule.
  • Living wills requirements.
  • Bank stress test requirements.
These and other rules, combined with the risk data aggregation principles described below, will require banks to produce this data:
  • In a timely manner for bank risk managers and bank regulators on an ongoing, business-as-usual basis.
  • Rapidly during a period of financial or economic stress.

Risk Data Aggregation Principles

In January 2013, the Basel Committee published BCBS 239: Principles for effective risk data aggregation and risk reporting (Principles). BCBS 239 sets out 14 principles that are intended to strengthen banks' risk data aggregation capabilities and internal risk reporting practices. Effective implementation of the Principles is expected to enhance risk management and decision-making processes at banks and to increase banks' ability to cope with stress and crisis situations.
The Principles define risk data aggregation to include the following activities to enable the bank to measure its performance against its risk tolerance/appetite:
  • Defining the bank’s understanding of the relevant risk data.
  • Gathering the necessary risk data.
  • Processing the risk data according to the bank’s risk reporting requirements, including:
    • sorting;
    • merging; or
    • breaking down relevant sets of data.
The Principles are designed to support bank efforts to:
  • Enhance the infrastructure for reporting key information, particularly that used by the board and senior management to identify, monitor and manage risks.
  • Improve the decision-making process throughout the banking organization.
  • Enhance the management of information across legal entities, while facilitating a comprehensive assessment of risk exposures at the global consolidated level.
  • Reduce the probability and severity of losses resulting from risk management weaknesses.
  • Improve the speed at which information is available and decisions can be made.
  • Improve the organization’s quality of strategic planning and the ability to manage the risk of new products and services.
They are also designed to complement existing efforts to improve the intensity and effectiveness of bank supervision. Importantly for bank resolution authorities, such as the FDIC in the US or the European Banking Authority in Europe, improved risk data aggregation can enable an orderly bank resolution, thereby reducing taxpayer bailouts of failed banks.
The 14 Principles are grouped into four categories:
  • Governance and Infrastructure:
    • I. Governance
    • II. Data architecture and IT infrastructure
  • Risk data aggregation capabilities:
    • III. Accuracy and integrity
    • IV. Completeness
    • V. Timeliness
    • VI. Adaptability
  • Risk reporting capabilities:
    • VII. Accuracy
    • VIII. Comprehensiveness
    • IX. Clarity and usefulness
    • X. Frequency
    • XI. Distribution
  • Supervisory review, tools and cooperation:
    • XII. Review
    • XIII. Remedial actions and supervisory measures
    • XIV. Home/Host Cooperation
The Principles impose a January 2016 compliance deadline for relevant financial institutions. While the Principles are primarily designed for globally systemically important banks, it is clear from the wording that the Basel Committee encourages regulators to also:
  • Strongly consider requiring domestic systemically important banks to comply with the same standards.
  • Consider imposing all or some of the standards on smaller banks, where appropriate.
The Basel Committee published the results of a progress survey in January 2015, which revealed numerous challenges that banks face in complying with the Principles by the January 2016 deadline. Regulators are increasingly concerned that banks’ weak risk data aggregation practices mean that capital ratios and data from stress tests and living wills may not be reliable enough to demonstrate banks’ true risk exposures, especially in a period of market stress.

Data Intensive Requirements

The significant challenges in complying with the Principles stem from the highly data-intensive requirements of both Basel III and the Dodd-Frank Act.

Capital Requirements

Basel III (Pillar I) capital requirements include guidance on how to measure credit, market, and operational risks. Most globally systemically important banks are allowed to use their own credit risk inputs, including:
  • Probability of default (PD).
  • Loss given default (LGD).
  • Exposure at default (EAD).
These inputs go into a formula calibrated by the Basel Committee to measure credit risk, as sketched below. Banks now also have to calculate capital charges for their derivatives counterparties, including both banks and central counterparties. Because there are different ways to calculate the credit risk inputs, having a defined source of accurate, consistent, and complete data is extremely important for the reliability of credit risk measurement.
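For illustration only, the following is a minimal Python sketch of the Basel internal ratings-based (IRB) risk-weight function for corporate exposures. The correlation and maturity-adjustment formulas follow the Basel framework, but floors, SME adjustments, and other refinements are omitted, and the example inputs are hypothetical:

from math import exp, log, sqrt
from scipy.stats import norm

def irb_capital(pd_, lgd, ead, maturity=2.5):
    """Basel IRB capital requirement for a corporate exposure (simplified)."""
    # Asset correlation, which decreases as PD increases
    decay = (1 - exp(-50 * pd_)) / (1 - exp(-50))
    r = 0.12 * decay + 0.24 * (1 - decay)
    # Maturity adjustment
    b = (0.11852 - 0.05478 * log(pd_)) ** 2
    # Conditional expected loss at the 99.9th percentile, less expected loss
    k = lgd * norm.cdf((norm.ppf(pd_) + sqrt(r) * norm.ppf(0.999)) / sqrt(1 - r)) - pd_ * lgd
    k *= (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)
    return k, k * 12.5 * ead  # capital as a share of EAD, and risk-weighted assets

# Hypothetical exposure: PD 1%, LGD 45%, EAD $1 million
k, rwa = irb_capital(0.01, 0.45, 1_000_000)
print(f"Capital requirement: {k:.2%} of EAD; RWA: ${rwa:,.0f}")

The point of the sketch is that small differences in the PD, LGD, or EAD inputs flow directly into the capital requirement, which is why a single, reliable source for these inputs matters.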
For market and operational risk, big banks are allowed to use their own models, which have to comply with numerous design, validation, compliance, and auditing requirements. The biggest data challenge for market risk measurement is having a reliable source of data for illiquid assets, such as infrequently traded bonds or tailored derivatives. Additionally, market risk models have to be stressed to test whether a bank could survive a period of significant shocks in interest rate, equity, foreign exchange, or commodity markets.
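As a rough illustration of the modeling involved, here is a minimal historical-simulation value-at-risk sketch. The desk, P&L series, and parameters are entirely hypothetical, and real market risk models are far more elaborate:

import numpy as np

def historical_var(pnl, confidence=0.99):
    """One-day value-at-risk by historical simulation: the loss exceeded
    on only (1 - confidence) of past trading days."""
    return -np.percentile(pnl, (1 - confidence) * 100)

# Hypothetical daily P&L history for a trading desk, in $ millions
rng = np.random.default_rng(42)
normal_pnl = rng.normal(0.0, 1.5, size=500)
stressed_pnl = rng.normal(-0.5, 4.0, size=250)  # a hypothetical crisis window

print(f"99% one-day VaR: ${historical_var(normal_pnl):.2f}m")
# Stressed VaR applies the same calculation over a period of severe market stress
print(f"99% one-day stressed VaR: ${historical_var(stressed_pnl):.2f}m")

The data problem described above shows up directly here: for illiquid assets there may be no clean daily P&L history to feed into the calculation at all.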
The biggest risk data aggregation challenges lie in measuring operational risk: the risk to a bank’s earnings or capital arising from failures in the day-to-day running of the business caused by:
  • People.
  • Processes.
  • Technology.
  • External threats, such as:
    • outsourcing to vendors;
    • natural disasters; or
    • cybersecurity breaches.
Operational risk is the most neglected of the financial risks and was not part of the Basel Accord until 2006. Many banks struggle to define, identify, measure, control, and monitor operational risk uniformly. Data challenges abound, especially at banks that do not have an established process for collecting data on their internal losses. When banks use loss data from another institution, they risk using data that may not be relevant, since each institution has a different risk culture and different internal controls.
Many of the most significant bank losses of the last twenty-five years, such as those at Barings, Société Générale, and JPMorgan, were due to operational risk. Establishing a process to obtain accurate and complete operational risk data in a timely manner should be a top priority for banks. Without an accurate measure of operational risk, banks will not have a complete picture of their aggregated risks across the whole bank, especially during a period of credit or market stress.
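To make the internal loss data collection concrete, here is a minimal sketch that aggregates hypothetical loss events by event type. The categories loosely echo, but do not reproduce, the Basel operational risk taxonomy, and all amounts are invented:

from collections import defaultdict

# Hypothetical internal operational loss events, tagged with simplified event types
events = [
    {"event_type": "internal fraud",  "loss": 1_200_000},
    {"event_type": "external fraud",  "loss": 300_000},
    {"event_type": "systems failure", "loss": 450_000},
    {"event_type": "external fraud",  "loss": 750_000},
]

# Aggregate loss frequency and severity by event type
totals = defaultdict(lambda: {"count": 0, "loss": 0})
for event in events:
    totals[event["event_type"]]["count"] += 1
    totals[event["event_type"]]["loss"] += event["loss"]

for event_type, t in sorted(totals.items()):
    print(f"{event_type}: {t['count']} event(s), ${t['loss']:,} in losses")

Even this trivial aggregation presupposes what many banks lack: a consistent, bank-wide process for capturing and categorizing each internal loss event in the first place.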

Basel III Liquidity Standard and Leverage Ratio

New rules under Basel III, such as the Liquidity Standard and the Leverage Ratio, require that banks know where their data originates and whether it is of high quality. These ratios are of great interest to all bank regulators, but especially to those, such as the FDIC or the European Banking Authority, that have bank resolution responsibilities. These authorities want to make sure that banks hold high quality liquid assets and are less leveraged, reducing their probability of failure.

Stress Testing Requirements

Under Basel III’s Pillar II, the Internal Capital Adequacy Assessment Process is a key area that requires an enormous amount of high quality data for capital adequacy and stress test requirements. In the US, the Federal Reserve is empowered under Title I of Dodd-Frank to mandate the Comprehensive Capital Analysis and Review (CCAR) and what is now popularly known as the Dodd-Frank Act Stress Test (DFAST). Stress tests require a significant amount of data, such as:
  • Loan balances.
  • Maturities.
  • Repayment history.
  • Geographic exposure.
  • Internal risk ratings.
  • Collateral data.
Most banks need to go to multiple sources for these data points, calling into question whether banks know the lineage, quality, completeness, and accuracy of the data. These data are necessary for banks to calculate important stress metrics such as expected losses, loss severities, liquidity, net income, and regulatory capital. A minimal sketch of one such calculation appears below.
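As an illustration of why loan-level data quality matters, here is a minimal sketch of one such stress metric, expected losses, computed as PD x LGD x EAD over hypothetical loan records. Real stress tests project these inputs under adverse scenarios rather than using point-in-time values:

# Hypothetical loan-level records pulled together from multiple source systems
loans = [
    {"balance": 250_000, "pd": 0.02, "lgd": 0.40},
    {"balance": 500_000, "pd": 0.01, "lgd": 0.35},
    {"balance": 125_000, "pd": 0.08, "lgd": 0.55},
]

# Expected loss per loan: EL = PD x LGD x EAD, using balance as a proxy for EAD
portfolio_el = sum(loan["pd"] * loan["lgd"] * loan["balance"] for loan in loans)
print(f"Portfolio expected loss: ${portfolio_el:,.0f}")

If any one source system reports a stale balance or an inconsistent internal risk rating, the error propagates straight into the bank’s reported stress results.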

Risk Disclosure

Basel III’s Pillar III (risk disclosures) also requires significant levels of high quality data. This part of Basel III is important because, if banks disclose high quality information, the market is in a better position to signal what it thinks about a bank’s risk exposures and risk management. The Basel Committee released revised and significantly improved disclosure requirements in January 2015. Banks now have templates that require them to provide their average risk weights across multiple asset classes. This will enable market participants to receive banks’ information in a uniform manner so that they can compare and contrast different banks’ risk exposures. High quality data is essential for this pillar because it enables the market to discipline banks by selling bank bonds or stocks if investors are dissatisfied with what the banks are signaling about their risk exposures.
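For illustration, an average risk weight is simply risk-weighted assets divided by exposure. Here is a minimal sketch with hypothetical asset classes and amounts; the actual Pillar III templates are considerably more detailed:

# Hypothetical disclosure inputs: risk-weighted assets and exposure by asset class
portfolio = {
    "corporate":   {"rwa": 80_000_000, "exposure": 100_000_000},
    "residential": {"rwa": 21_000_000, "exposure": 60_000_000},
    "sovereign":   {"rwa": 2_000_000,  "exposure": 40_000_000},
}

# Average risk weight per asset class: RWA divided by total exposure
for asset_class, p in portfolio.items():
    print(f"{asset_class}: average risk weight {p['rwa'] / p['exposure']:.0%}")

Uniform figures like these are what allow investors to compare two banks’ risk profiles asset class by asset class.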

Living Will Requirements

Under Title I of Dodd-Frank, banks are required to write living wills that describe to bank regulators how a failed bank would be resolved in all of the jurisdictions where it has legal entities (see Practice Note, Living Will Requirements for Financial Institutions). For this exercise, bank regulators should require banks to have strong risk data aggregation capabilities that map the different risk exposures to each of the bank’s legal entities. Bank management has traditionally approached risk management along business lines; yet for the purpose of resolving a failed bank, the location of domestic and foreign legal entities dictates which bankruptcy laws apply.

Volcker Rule

Title VI of Dodd-Frank (the Volcker Rule) also imposes significant data requirements on financial institutions (see Summary of the Dodd-Frank Act: The Volcker Rule). For banks to trade numerous securities and derivatives without violating the Volcker Rule, they must prove to regulators that they are market makers. To demonstrate they stand ready to buy or sell a wide range of securities and derivatives, banks have to report seven quantitative metrics:
  • Risk and Position Limits and Usage.
  • Risk Factor Sensitivities.
  • Value-at-Risk and Stress VaR.
  • Comprehensive Profit and Loss Attribution.
  • Inventory Turnover.
  • Inventory Aging.
  • Customer-Facing Trade Ratio.
Many of the metrics are not entirely new, but they have to be calculated by desk at a very granular level, which is new to the banking industry.
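As a rough sketch of the desk-level granularity involved, the following computes a customer-facing trade ratio per desk from hypothetical trade records. The rule defines the metric precisely, by both trade count and value over set calculation periods; this is illustrative only:

from collections import defaultdict

# Hypothetical trades, tagged by desk and by counterparty type
trades = [
    {"desk": "IG credit", "counterparty": "customer"},
    {"desk": "IG credit", "counterparty": "dealer"},
    {"desk": "IG credit", "counterparty": "customer"},
    {"desk": "rates",     "counterparty": "customer"},
    {"desk": "rates",     "counterparty": "dealer"},
]

# Count customer and non-customer trades per desk
counts = defaultdict(lambda: {"customer": 0, "dealer": 0})
for trade in trades:
    counts[trade["desk"]][trade["counterparty"]] += 1

# Customer-facing trade ratio: customer trades relative to non-customer trades
for desk, c in counts.items():
    print(f"{desk}: customer-facing trade ratio {c['customer'] / max(c['dealer'], 1):.2f}")

The operational burden is less the arithmetic than the tagging: every trade must be reliably attributed to a desk and a counterparty type across all of a bank’s trading systems.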

Banks' Progress in Complying With the Principles

According to surveys by the Basel Committee and Markit, 45-50% of globally systemically important banks will not be able to comply with BCBS 239 by the recommended January 2016 deadline (see Progress in Adopting the Principles for Effective Risk Data Aggregation and Risk Reporting, January 2015; and Risk Magazine, January 29, 2015). The weakest areas of bank compliance are:
  • Principle 2: Data architecture and IT infrastructure.
  • Principle 3: Accuracy and integrity.
  • Principle 6: Adaptability.
Over 50% of the banks need two to three years beyond the 2016 deadline to comply at even a satisfactory level. Most senior bank executives do not trust that, in a period of stress, their IT architecture would be robust enough to identify their aggregated risk exposures.

Benefits of the Principles

Meeting the enormous challenge of complying with the Principles should not be viewed simply as an expensive compliance exercise. Having high quality data in a timely manner can help a bank’s board of directors set the bank’s risk appetite. A single, coherent source of information, with clear data lineage, can be enormously useful to senior bank executives because it lets them trust the information they are seeing about their aggregated risk exposures. Moreover, good data can help a bank with its growth plans and with timely, proactive risk management. Without good data, banks cannot truly understand their customers’ needs, and it is impossible to make effective investment and risk management decisions without complete and accurate data.