Data quality, integration and agility form an essential foundation for a comprehensive operational risk management solution.

Operational Risk Management (ORM) is defined under the Basel Committee on Banking Supervision (BCBS) Basel II Accord as “the risk of negative effects on the financial result and capital of the bank caused by omissions in the work of employees, inadequate internal procedures and processes, inadequate management of information and other systems, and unforeseeable external events.” These requirements were subsequently extended under BCBS 239, which sets out fourteen principles covering governance, risk data aggregation, and risk reporting.

In addition to these broad risk management policies, there are two major business drivers behind contemporary banking ORM initiatives:

  • Standardization across the value chain for competitive advantage and an enhanced customer experience
  • Capital adequacy management and analysis, such as that required under the Basel Accords

Initial ORM implementations were typically built on single data instances and were unable to adapt to evolving approaches. Next-generation systems need to support data integration across data silos and be agile and adaptive enough to reflect the results of continuous incremental change. ORM is not one and done; it is a constantly evolving need.

A major obstacle to developing a 360-degree view of risk is that the required data comes from many sources, including the following (a brief integration sketch follows the list):

  • Portfolio position data
  • Client data
  • Financial accounting data
  • Market data
  • Organizational structure data
  • Reference data
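
Purely as an illustration of what bringing these categories together can look like (the entity and field names below are hypothetical, not a prescribed schema), extracts from the separate sources might be joined into a single normalized risk view along the following lines:

```python
# Illustrative only: entity and field names are hypothetical, not a prescribed
# schema. Joins extracts from the source categories above into one normalized
# view keyed by shared client and organization reference data.
from dataclasses import dataclass


@dataclass
class RiskRecord:
    """One integrated row in a 360-degree view of risk."""
    business_unit: str    # from organizational structure data
    client_id: str        # from client data
    portfolio_id: str     # from portfolio position data
    exposure: float       # from portfolio position / financial accounting data
    market_value: float   # derived using market data
    rating: str           # from reference data


def integrate(positions, clients, org_units, market_prices, ratings):
    """Join separate source extracts into integrated RiskRecord rows."""
    records = []
    for pos in positions:
        client = clients[pos["client_id"]]
        records.append(RiskRecord(
            business_unit=org_units[client["unit_id"]],
            client_id=pos["client_id"],
            portfolio_id=pos["portfolio_id"],
            exposure=pos["notional"],
            market_value=pos["quantity"] * market_prices[pos["instrument"]],
            rating=ratings.get(pos["instrument"], "unrated"),
        ))
    return records
```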

As part of an ORM solution, quantified risks across all parts of the organization need to be continuously analyzed and monitored against accepted ranges so that remedial action can be taken and processes adjusted. Reference data must also be kept current so that the solution incorporates and reflects changes to targets and ranges.
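
As a minimal sketch of this kind of monitoring, assuming invented metric names and ranges rather than any standard thresholds, quantified metrics can be compared against accepted ranges held as reference data, with anything out of range flagged for remedial action:

```python
# Illustrative only: metric names and thresholds below are invented.
ACCEPTED_RANGES = {
    # metric: (lower bound, upper bound), maintained as reference data so
    # changes to targets and ranges flow straight into monitoring
    "settlement_fail_rate": (0.0, 0.02),
    "unreconciled_breaks": (0, 25),
}


def breaches(observed):
    """Return the metrics whose observed values fall outside accepted ranges."""
    out_of_range = []
    for metric, value in observed.items():
        low, high = ACCEPTED_RANGES[metric]
        if not low <= value <= high:
            out_of_range.append(metric)
    return out_of_range


# Example: flag metrics that need remedial action
print(breaches({"settlement_fail_rate": 0.035, "unreconciled_breaks": 12}))
# prints ['settlement_fail_rate']
```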

Capital adequacy management and analysis using approaches such as the Basel Accords’ Advanced Measurement Approach (AMA) require the collection of four broad classes of data:

  • Internal loss data
  • External data
  • Scenario analysis
  • Business environment and internal control factors

This requires that data be collected and managed at a low level of granularity so that losses can be categorized into standard units of measure before potential loss distributions are computed.
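
The sketch below illustrates the general idea under simplified assumptions: granular loss events are bucketed into units of measure by business line and event type, and an aggregate annual loss distribution is simulated from placeholder frequency and severity parameters. It is not the AMA methodology itself, and none of the parameters are calibrated.

```python
# Illustrative sketch only, not a calibrated AMA model. Buckets granular loss
# events into standard units of measure and simulates an aggregate annual
# loss distribution from placeholder frequency/severity assumptions.
import math
import random
from collections import defaultdict


def to_units_of_measure(loss_events):
    """Group individual loss amounts by (business_line, event_type)."""
    units = defaultdict(list)
    for event in loss_events:
        units[(event["business_line"], event["event_type"])].append(event["amount"])
    return units


def poisson_draw(lam):
    """Poisson-distributed random count (Knuth's algorithm)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= threshold:
            return k - 1


def simulate_annual_loss(freq_lambda, sev_mu, sev_sigma, trials=10_000):
    """Aggregate loss per simulated year: Poisson count of lognormal severities."""
    totals = []
    for _ in range(trials):
        n = poisson_draw(freq_lambda)
        totals.append(sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(n)))
    return sorted(totals)


# Example: a high quantile of the simulated distribution for one unit of measure
losses = simulate_annual_loss(freq_lambda=4.0, sev_mu=10.0, sev_sigma=1.2)
high_quantile = losses[int(0.999 * len(losses))]
```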

ORM is a complex subject, and the challenge of building a comprehensive and accurate view of risk factors only adds to that complexity. Tools that support increasingly fine levels of granularity, data integration across silos, and synchronized views for continuous data availability position operational risk management to succeed in the face of changing approaches and constant refinement.

How Kalido helps

  • Delivers industry-leading time-to-value through Kalido’s unique business-model-driven approach to automating the data warehouse: more effort goes into defining the business requirements of an operational risk and capital adequacy data warehouse, and less into the technical work of building and maintaining it.
  • Provides the integrated store of accurate, current information that is foundational to any ORM effort; the Kalido Information Engine is uniquely able to rapidly build and maintain such a foundation.
  • Ends semantic disconnects across the organization, ensuring that risk factors are clearly identified and correctly weighted in risk analysis.
  • Avoids high-risk “rip and replace” strategies by using the Kalido Information Engine to incrementally integrate, manage, and load transactional and operational data from disparate silo and legacy systems into an operational risk data warehouse.
  • Ensures that changes to source transactions and operational organization structures are automatically reflected in business analytics and reporting views, so that decisions are based on the most accurate risk profile available.
  • Uses Kalido Master Data Management to incorporate reference norms for continuous monitoring and to enforce data quality and governance through rules-based validation built into workflow (a generic sketch of this pattern follows the list).
  • Offers an agile technology and approach that accelerates project delivery to meet business needs faster.
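
Kalido’s own validation interfaces are not documented here; purely as a generic, hypothetical illustration of the rules-based validation pattern noted in the list above, incoming reference records might be checked against declared rules and routed to a data steward queue when they fail:

```python
# Generic illustration only: not Kalido's API. Rules-based validation of
# incoming reference data, with failing records routed for steward review.
RULES = [
    ("currency code is 3 letters", lambda rec: len(rec.get("currency", "")) == 3),
    ("risk weight between 0 and 1", lambda rec: 0.0 <= rec.get("risk_weight", -1.0) <= 1.0),
]


def validate(record):
    """Return the names of the rules this record fails (empty list = clean)."""
    return [name for name, check in RULES if not check(record)]


def route(records):
    """Split records into approved data and items queued for steward review."""
    approved, review_queue = [], []
    for record in records:
        failures = validate(record)
        if failures:
            review_queue.append((record, failures))
        else:
            approved.append(record)
    return approved, review_queue
```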

Resource Library

Read our short white paper and learn how Kalido solutions enable better risk management by improving risk data aggregation and more.

Get the White Paper

Forrester Wave: Master Data Management

Delivering clean, accurate, and consistent master data is critical for efficient business processes.

But it isn’t enough to simply get clean data – you need to keep it clean, too. And the best way to keep data clean is to proactively engage the people who know the most about the data in the process.

Forrester, a leading independent technology and market research company, has cited Kalido MDM as a “Strong Performer” in The Forrester Wave™: Master Data Management Solutions, Q1 2016. According to the Forrester Wave report author, Senior Analyst Michele Goetz, “The Kalido MDM tool makes master data management (MDM) easy for data stewards and subject matter experts (SME).”

Read the report and learn more about the 12 MDM providers that matter most and how they stack up.