A Cleaner Road: Loss Data and Operational Risk
08 February 2005

Customer intelligence will help define and recover bad loans, leading to a more effective debt management strategy and reducing the money required for capital allocation, as prescribed by Basel II, writes Sabyasachi Bardoloi.

Collection and analysis of loss data tops the agenda of the financial services industry (FSI) today. Loss data has become a key issue in the realm of operational risk as well as in the development of regulatory capital requirements. As the timeline for implementation of the Basel II accord nears, banks have been advised to collect three to five years of historical data. Banks and banking industry supervisors alike are expected to familiarise themselves with these requirements and develop the systems and processes needed to meet the high data quality standards prescribed by the Basel II accord.

Financial institutions are gearing up to meet this mammoth challenge. Quite a number have already initiated measures to collect and analyse operational loss data through internal capital assessment, using the allocation mechanism as the basis, but an industry-wide standard approach acceptable to all has yet to emerge.

Data loss is a major risk facing financial institutions today, but it is largely controllable. The first step in countering this risk is awareness and education, provided at all levels of an organisation - executive, management, IT staff and end-users.

Proper end-user training and robust back-up plans, accompanied by efficient disaster-recovery procedures, help prevent data loss. Data loss owing to human error can be averted, and the countermeasures fall into two areas: education and procedure.

Data collection is important for the assessment of operational risk among financial institutions. There is also an increasing recognition amongst banks and supervisors that the sharing of loss data, based on consistent definitions and metrics, is necessary to arrive at a comprehensive assessment of operational risk.

The need of the hour is clearly to focus on data collection, tracking, monitoring, analysis and reporting for operational risk management. Complying with Basel II calls for technology enhancements that support (a sketch of a possible loss-event record follows this list):

  • Proper identification and gathering of key risk indicators
  • Thorough monitoring of databases for internal, external, current and historical data
  • Gathering loss data
  • Identifying and estimating frequency and severity of losses
  • Efficient management of reporting
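
To make the loss data requirement more concrete, the sketch below shows one possible shape for an internal loss-event record. It is a hedged illustration only: the field names, event types and the LossEvent class are assumptions made for this example, not an official Basel II schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical internal loss-event record; fields are illustrative assumptions,
# not an official Basel II template.
@dataclass
class LossEvent:
    event_id: str          # unique identifier for the loss event
    business_line: str     # e.g. "retail banking", "trading and sales"
    event_type: str        # e.g. "internal fraud", "system failure"
    occurred_on: date      # when the loss event happened
    discovered_on: date    # when it was identified and recorded
    gross_loss: float      # loss amount before any recoveries
    recovered: float       # amount subsequently recovered
    source: str            # "internal" or "external" (e.g. consortium data)

    @property
    def net_loss(self) -> float:
        """Net loss after recoveries - the figure fed into frequency and severity analysis."""
        return self.gross_loss - self.recovered
```

Records of this kind, accumulated over the recommended three to five years, are the raw material for the frequency and severity estimates discussed later in this article.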

A real menace
Data quality continues to occupy the attention of management to an extraordinary extent. This is the finding of a new report issued by IDC: 'Business Analytics Implementation Challenges - Top 10 Considerations for 2003 and Beyond', which analyses the issue of data quality and its impact on business.

IDC lists data quality as the number two issue for companies implementing enterprise-wide business analytics software. As of today, the majority of banks have yet to adopt any high-end operational risk assessment procedures. The main challenge for banks in the coming months will be to locate the required data and build a data warehouse in which to hold it, ready for analysis. Moreover, banks must ensure effective integration of the different risk types and accurate computation of the various risk measures.

Banks must now also ensure that the data used in these calculations is thoroughly clean. A study conducted by The Data Warehousing Institute some time back estimated that poor-quality data costs US businesses $600 billion a year. Another study conducted by Gartner points out that lack of clean data is responsible for 80 per cent of failed Customer Relationship Management (CRM) projects.

These figures clearly indicate that current data cleansing mechanisms are significantly inadequate. Banks need to take a fresh look at the techniques they currently use and get their data into shape in order to be ready for the next round of capital charges. The Basel II accord has, for the first time, recommended capital allocations in the area of operational risk.

Banks were quite unaware that they had dirty data problems until a mammoth project like Basel II recommended clean data and awakened them from their slumber. Unrecognised dirty data is a real menace. With dirty data stockpiled in their systems, bank computers fail to interpret information accurately, important files can be lost or deleted, and banks can even lose track of customer and executive lists.

If this continues uncorrected, banks may be forced to abandon entire business strategies. Basel II is making the problem even more urgent since dirty data that has not been captured correctly from CRM or other business applications will not meet the standards required for accurate risk management assessments.

Sources of Dirty Data
Dirty data enters systems through various sources (a simple validation sketch follows this list):

  • Deliberate entry of incorrect data by online customers
  • Truncated entries made by call-centre operators to save time
  • Errors in third-party data
  • Keying errors introduced by customers in front-office systems
  • Divergent data formats across different systems
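
As a rough illustration of how such entries might be caught before they reach a risk data warehouse, the sketch below applies a handful of simple validation rules to incoming customer records. The field names, rules and thresholds are assumptions made for this example, not an industry standard.

```python
import re

# Hypothetical validation rules for incoming customer records; the fields and
# checks below are illustrative assumptions only.
def validate_record(record: dict) -> list[str]:
    """Return a list of data quality problems found in one customer record."""
    problems = []
    name = record.get("name", "").strip()
    if len(name) < 3:
        problems.append("name missing or truncated")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")):
        problems.append("email does not look valid")
    if not record.get("postcode", "").strip():
        problems.append("postcode missing")
    return problems

# Example: a truncated call-centre entry and a deliberately bogus online sign-up
records = [
    {"name": "Sm", "email": "j.smith@bank.example", "postcode": "SW1A 1AA"},
    {"name": "Donald Duck", "email": "none", "postcode": ""},
]
for rec in records:
    print(rec["name"], "->", validate_record(rec) or "clean")
```

In practice such rules would sit alongside de-duplication and format standardisation, but even this level of checking catches the truncated and deliberately false entries listed above.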

Churning in Quality Data
A recent joint survey conducted by Risk Waters magazine and SAS indicated that incomplete, imprecise or archaic data costs financial institutions up to $120 million a year through operational risk. The survey, which interviewed 400 risk managers from 300 financial institutions, is the largest ever conducted on operational risk management.

Twenty-eight per cent of respondents felt that the problem of collecting data for precise identification and management of operational risk is the major stumbling block in preventing losses, while 33 per cent said that poor data quality is a major concern. Mr Peyman Mastchian, head of risk at SAS UK, said: "Data quality has been a major issue for businesses for many years, and this survey shows that the problem has by no means gone away." He added: "The focus on operational risk management has increased with regulations such as Sarbanes-Oxley and Basel II but, once again, companies are coming up against the problem of finding and interpreting the data that they need."

The cornerstone of any successful customer relationship lies in the accuracy and fitness of the core data. Data quality affects customer profitability in countless ways, and it is hardly surprising that IDC, the global market intelligence and advisory firm, has identified it as the second biggest challenge faced by organisations today.

In a recent report titled Business Analytics Implementation Challenges, IDC estimated that a typical Fortune 1000 corporation has approximately 50 applications and 14 databases, all of which must be tapped to get an enterprise-wide view. FSIs have also witnessed countless mergers, which have led to varied systems and data inflow. Creating a single view of such customer databases becomes a real challenge.

A recent SAS European survey on the effects of data quality in FSIs reflects the growing importance of quality data. Sixty-six per cent of European organisations admitted in the survey that dirty data is affecting their profitability. Altogether, 86 per cent felt that around 10 per cent of their customer data is incorrect in some way. The survey revealed that 74 per cent of organisations have taken some measures to address the data quality issue.

The Measurement Matrix
Establishing a robust loss database - one that includes both internal and external loss data - is crucial to measuring operational risk in a credible manner. The advanced measurement approach (AMA) introduced by the Basel Committee on Banking Supervision to measure operational risk comprises three sub-categories: the Internal Measurement Approach (IMA), the Loss Distribution Approach (LDA) and the Scorecard Approach (SCA).

The AMA is more advanced than the basic indicator and standardised approaches since it allows banks to use external and internal loss data as well as internal expertise. The LDA and SCA are very similar, as both are based on a statistical value-at-risk (VaR) model. The difference between the two lies in the fact that in the LDA, only internal or external historical loss data is used to estimate the distribution functions, whereas in the SCA, banks are also allowed to apply expert knowledge in estimating them.

Using the statistical approach, operational risk events - say, lost cheques or errors in the transfer of funds - can be captured in terms of their frequency and severity. These frequency and severity distributions are combined into a loss distribution using a Monte Carlo simulation. The mean and a high percentile point of that loss distribution are then calculated to estimate expected losses (EL) and unexpected losses (UL), respectively. This measurement of VaR is used to allocate economic capital to operational risks.
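
The sketch below illustrates this frequency/severity approach. It is a minimal, uncalibrated example: the Poisson frequency, lognormal severity and all the parameters are assumptions chosen purely for illustration, not estimates from any real loss database.

```python
import numpy as np

# Minimal Loss Distribution Approach (LDA) sketch.
# Assumptions (illustrative only): annual event frequency is Poisson,
# individual loss severity is lognormal, and the parameters are invented.
rng = np.random.default_rng(seed=42)

FREQ_LAMBDA = 25               # expected number of loss events per year
SEV_MU, SEV_SIGMA = 9.0, 1.8   # lognormal severity parameters (log scale)
N_YEARS = 100_000              # number of simulated years (Monte Carlo trials)

annual_losses = np.empty(N_YEARS)
for i in range(N_YEARS):
    n_events = rng.poisson(FREQ_LAMBDA)                      # losses this year
    severities = rng.lognormal(SEV_MU, SEV_SIGMA, n_events)  # size of each loss
    annual_losses[i] = severities.sum()                      # aggregate annual loss

expected_loss = annual_losses.mean()              # EL: mean of the loss distribution
var_999 = np.percentile(annual_losses, 99.9)      # 99.9th percentile (operational VaR)
unexpected_loss = var_999 - expected_loss         # UL: losses beyond the expected level

print(f"Expected loss (EL):    {expected_loss:,.0f}")
print(f"99.9th percentile VaR: {var_999:,.0f}")
print(f"Unexpected loss (UL):  {unexpected_loss:,.0f}")
```

The 99.9th percentile here mirrors the soundness standard commonly associated with Basel II capital calculations; in a real model the frequency and severity distributions would be fitted to the bank's own and consortium loss data.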

Using the statistical measurement approach, direct losses related to operational risk events can be measured, while indirect or potential losses can be calculated using scenario analysis. The operational VaR method can be validated in two ways: by back-testing or by statistical testing. Back-testing may be difficult to conduct owing to limited data availability compared with market risk, but it is possible to establish the robustness of operational risk measurement with statistical testing.

The statistical model enables risk to be measured precisely, but it does not provide any tools or clues for reducing it. A statistical model is also likely to be beyond the grasp of most bank staff, which can be a barrier to attaining the required level of understanding of operational risks. Besides, owing to the relative scarcity of internal loss data, any statistical model relies heavily on external data.

With the VaR method, allocation of resources becomes more effective since it sets priorities for each loss type in each business line, enhancing daily operational risk management and allowing internal audits to be conducted in a more risk-focused manner. With scenario analysis, potential losses can be measured so that contingency plans can be drawn up to minimise the resulting damage.

An increasing number of banks are enhancing their loss data collection, not just for the measurement of VaR but also for robust risk management, in order to prioritise the risk categories in each business line. This should encourage them to upgrade their operational risk management, both quantitatively and qualitatively.

How can external loss data be used to supplement internal loss data? This is a key question today. International loss data consortia have started to pool member banks' loss data, which will enable participating banks to draw on external loss data.

The scorecard approach avoids many of the problems inherent in the analysis of historical data by capturing the knowledge and experience of the experts who design the scorecards. However, the data collection problems are transferred to the collection of risk indicators, which can also suffer from quality issues, and the reliability of the output becomes highly dependent on the experts employed to design the metrics and weightings within the scorecard.
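
To make the weighting idea concrete, here is a hedged sketch of how a scorecard might combine key risk indicator (KRI) scores into a single figure for one business line. The indicators, weights, scoring bands and the way the composite score scales capital are all invented for illustration; they are not taken from any regulatory template.

```python
# Hypothetical scorecard: each key risk indicator (KRI) is scored 1 (low risk)
# to 5 (high risk) and carries an expert-assigned weight; weights sum to 1.0.
SCORECARD = {
    "staff_turnover":        {"weight": 0.25, "score": 3},
    "failed_settlements":    {"weight": 0.35, "score": 4},
    "open_audit_findings":   {"weight": 0.20, "score": 2},
    "system_downtime_hours": {"weight": 0.20, "score": 5},
}

def composite_score(scorecard: dict) -> float:
    """Combine weighted KRI scores into a single business-line risk score (1-5)."""
    return sum(item["weight"] * item["score"] for item in scorecard.values())

BASELINE_CAPITAL = 10_000_000        # illustrative baseline allocation for the line
score = composite_score(SCORECARD)
scaled_capital = BASELINE_CAPITAL * (score / 3.0)   # 3.0 treated as the neutral mid-point

print(f"Composite risk score:      {score:.2f}")
print(f"Scaled capital allocation: {scaled_capital:,.0f}")
```

The value of such a scheme rests entirely on how well the experts choose the indicators and weights, which is exactly the dependency the next paragraph warns about.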

It is entirely possible that a badly designed scorecard could produce results completely inconsistent with reality, intuition and the actual history of losses. Relying completely on scorecards could therefore be compared to driving without keeping an eye on the rear-view mirror. An intelligent bank will always prefer to use a combination of methods, one supplementing or testing the other.

Towards a Cleaner Road
To stay ahead and maintain a competitive edge, well-organised banks have taken steps to monitor and clean up their information, be it internal or external. The top priority for all financial institutions today is to maintain clean, high-quality data by investing in appropriate data collection systems. An organisation's capacity to limit and measure risk will be much greater once these enhanced data collection and intelligence tools are applied, since they will help it to better understand its customers' positions. Proper customer intelligence will help define and recover bad loans, leading to a more effective debt management strategy and reducing the money required for capital allocation, as prescribed by Basel II.

