Financial institutions very traditional in the way they manage and protect data

Friday, 11 May 2018

The financial sector is under increasing pressure from many stakeholders to manage regulatory compliance and the associated risk more effectively. With the advent of the Basel III regime, as well as the restrictions laid down by regulators, correctly identifying and utilising the right data for controlling risk and doing business has become a critical process.

To comply with regulatory requirements, financial institutions will need to strengthen their governance in line with the new compliance requirements, improve the quality and protection of their data, and optimise the accumulation of new risk data. The assessment of risk depends mostly on properly validated data: data on counterparties, default history, peer data, local and overseas markets and internal operations. According to research, managing data and the infrastructure required to manage it takes up 7-10% of a bank’s operating income.
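To make the idea of "properly validated data" concrete, the following is a minimal sketch, in Python with hypothetical field names and invented figures, of the kind of completeness, plausibility and timeliness checks a counterparty record might pass through before it feeds a risk calculation. It is an illustration of the principle, not any particular bank's or regulator's rule set.

```python
from datetime import date

# Hypothetical counterparty record, as it might arrive from a source system.
record = {
    "counterparty_id": "CP-10492",
    "legal_entity": "Example Trading Ltd",
    "country": "LK",
    "credit_rating": "BB+",
    "exposure_usd": 2_450_000.0,
    "last_reviewed": date(2017, 11, 3),
}

REQUIRED_FIELDS = ["counterparty_id", "legal_entity", "country",
                   "credit_rating", "exposure_usd", "last_reviewed"]

def validate(rec):
    """Return a list of data-quality issues; an empty list means the record passes."""
    issues = []
    for field in REQUIRED_FIELDS:                      # completeness check
        if rec.get(field) in (None, ""):
            issues.append(f"missing field: {field}")
    if rec.get("exposure_usd", 0) < 0:                 # plausibility check
        issues.append("negative exposure")
    last_reviewed = rec.get("last_reviewed")
    if last_reviewed and (date.today() - last_reviewed).days > 365:
        issues.append("credit review older than one year")  # timeliness check
    return issues

print(validate(record))
```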

Data quality

For most institutions, data quality and the protection of that data have been low on boards’ priorities. The new emphasis on regulatory risk management means that the governance and integrity of the reference data used for holistic risk calculations have now become critical issues.

Regulators are also being forced to focus more closely on data collection, management and systems. They understand that management’s ability to control the business, and to quantify and manage risk, depends entirely on the quality of the relevant data available, and they are, with some reason, becoming more concerned about the poor standards of data management they are encountering. So while there is a regulatory push for improvement on one side, there is also a major potential benefit to be secured on the other, in the form of improved business capability.

Risk management is intimately dependent on issues of data: data integrity, sources, completeness, relevance and accuracy. And even in the smallest financial institution, good risk management depends on the IT architecture and systems used to store and process data. But many financial institutions, burdened with multiple ageing IT systems or poorly integrated homegrown systems built up through decades of add-ons, find it very difficult to aggregate and report data to support risk management.

Experience

The shortcomings of this practice were harshly exposed by the financial crisis. A key lesson was that large parts of the financial sector were unable to identify and aggregate risk across the financial system and to quantify its potential impact.

Furthermore, exposure could not easily be aggregated across trading and banking books, across geographies and across legal entities. This was because risk management, governance and the underlying data infrastructure were very weak. As a result, systemic risk was both obscured and underestimated. Unfortunately, many of these challenges to effective risk data aggregation and risk reporting remain mostly unresolved. The data architecture and IT infrastructure, the accuracy and integrity of data, and the ability of banks to adapt to changing demands for data interpretation and reporting remain a big challenge.
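As an illustration of what aggregating exposure across books, geographies and legal entities means in practice, here is a minimal sketch in plain Python, with made-up position data, that rolls individual positions up into a single group-wide view per counterparty. Real aggregation runs across many source systems and data models; the point here is only the roll-up itself.

```python
from collections import defaultdict

# Hypothetical positions held against the same counterparties in different
# books, countries and legal entities (all figures invented).
positions = [
    {"counterparty": "CP-10492", "book": "trading", "country": "UK",
     "entity": "Bank PLC", "exposure_usd": 1_200_000},
    {"counterparty": "CP-10492", "book": "banking", "country": "SG",
     "entity": "Bank Asia Ltd", "exposure_usd": 800_000},
    {"counterparty": "CP-20711", "book": "trading", "country": "US",
     "entity": "Bank PLC", "exposure_usd": 450_000},
]

# Group-wide exposure per counterparty, regardless of which book,
# country or legal entity holds the position.
totals = defaultdict(float)
for pos in positions:
    totals[pos["counterparty"]] += pos["exposure_usd"]

for counterparty, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{counterparty}: {total:,.0f} USD total exposure")
```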

However, in the last two years institutions have progressed towards consistent, timely and accurate reporting of top counterparty exposures, as well as implementing recognised best practices in risk management and data management.

Way forward

Weaknesses in systems and data management have also hampered the ability of institutions and their supervisors to run scenario tests and stress tests. The experience of stress-testing has shown that the systems and processes for aggregating and analysing risk in large financial institutions remain a big challenge.
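A stress test is, at its simplest, the same aggregation run again under a shocked set of assumptions. The short sketch below, in plain Python with invented shock factors, illustrates only that basic idea of re-valuing exposures under a hypothetical scenario; real stress-testing frameworks are of course far richer.

```python
# Hypothetical stress scenario: exposures in each region are scaled by a
# shock factor reflecting assumed market and credit deterioration.
exposures_by_region = {"UK": 2_000_000, "SG": 1_500_000, "US": 900_000}
shock_factors = {"UK": 1.25, "SG": 1.40, "US": 1.10}   # invented for illustration

baseline = sum(exposures_by_region.values())
stressed = sum(exposures_by_region[region] * shock_factors.get(region, 1.0)
               for region in exposures_by_region)

print(f"Baseline exposure: {baseline:,.0f} USD")
print(f"Stressed exposure: {stressed:,.0f} USD (+{stressed / baseline - 1:.1%})")
```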

Counterparty risk (also known as default risk), the risk each party to a contract must consider that the other will fail to meet its obligations, also requires high-quality data. In the final analysis, institutions in both the banking and non-banking sectors need to move away from ad hoc processes and manual interventions when producing summaries of potential risks and customer data. A robust data management and protection infrastructure can greatly improve the reliability of the assessments that are produced.

In general, there is a long way to go before the industry can produce the quality of data necessary to satisfy stakeholder expectations fully. To get there, institutions need to change their operating models and harmonise their systems and data, so that they finally arrive at a single point of view.
(The writer is a thought leader).