Preparing for the Pain of Basel II

Compliance with Basel II, and with Sarbanes-Oxley, the new United States regulation on financial reporting, is the response to the issues illuminated by the WorldCom and Enron crises, among others. Whilst standard accounting regulations would seem an obvious move, there are growing concerns about both the cost and the means of compliance, at a time when financial institutions, like most enterprises, can ill afford to invest in more technology.

The New Basel Capital Accord, more commonly known as Basel II, is fundamentally about improving risk and asset management to avoid financial disasters. Compliance requires all banking institutions to have sufficient assets to offset any risks they may face, represented as an eligible capital to risk aggregate ratio of 8%. Part of this compliance dictates that data capture must be fully operational by 2004, and financial institutions must have three years of data on file by 2007, which of course means that work on this aspect of compliance needs to start now, if it hasn’t already started.
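The 8% capital floor described above is a simple ratio, and a minimal sketch makes the arithmetic concrete. The figures below are invented for illustration; only the 8% minimum comes from the accord.

```python
# Hypothetical illustration of the Basel II capital adequacy check:
# eligible capital must be at least 8% of the aggregate risk figure.
# The asset and capital amounts below are assumptions for the sketch.

def minimum_capital(risk_aggregate: float, minimum_ratio: float = 0.08) -> float:
    """Return the minimum eligible capital required for a given risk aggregate."""
    return risk_aggregate * minimum_ratio

def is_compliant(eligible_capital: float, risk_aggregate: float) -> bool:
    """True if the capital-to-risk ratio meets or exceeds the 8% floor."""
    return eligible_capital >= minimum_capital(risk_aggregate)

# A bank with 50 billion in risk-weighted exposure needs at least 4 billion
# in eligible capital; a 1% shift in that exposure moves the requirement
# by 40 million, which is why measurement accuracy matters.
print(minimum_capital(50_000_000_000))        # 4000000000.0
print(is_compliant(4_500_000_000, 50_000_000_000))  # True
```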

If banks are going to have to set aside assets to balance possible risks, then analysing and measuring those risks is going to be paramount. A 1% change in asset allocation may not sound like much, but international banks deal in very large numbers, and 1% can make a significant difference to operating capital.

Operational risk is defined by the Basel Capital Accord as: “The risk of direct or indirect loss resulting from inadequate or failed internal processes, people and systems, or from external events.” It is not just an IT issue: all companies are exposed to operational risk, and the integration of processes, systems and people has to be understood and continually monitored to mitigate these risks.

The extent of possible losses was indicated in a recent SAS/Risk Waters Group survey, where 93% of respondents had experienced losses of $10 million in one year, and 21% of respondents said that their company suffered losses between $10,000 and $1,000,000 at least once a day. The prime reasons given for such losses were incomplete, inaccurate or obsolete data, and inadequate processes.

Hence in order to comply with Basel II, financial institutions will need to have a full and in-depth understanding of all possible risks and their potential impact. This requirement is ongoing; it cannot in any way be regarded as a one-off, or something financial institutions do once a year to fill a page in the annual report.

Risk changes all the time: some risks are known and can be prepared for; others are unexpected and will need to be understood. The complexity of potential risk factors for financial institutions cannot be over-emphasised. For instance, fraud is a risk factor that might result in a direct financial loss, but what about the effect fraud might have on reputation, and the consequent loss of business? The bottom line is that financial institutions must be sound enough to weather any storm, literal or figurative, without needing to be bailed out if their assets come up short.

The challenge of compliance

It has been argued that compliance with Basel II is a software issue, that some cure, along the lines of those associated with Y2K, is needed. But Gartner believes that compliance with Sarbanes-Oxley, and by association Basel II, is not a software issue but a process issue. There is no software equivalent of a silver bullet for Sarbanes-Oxley or Basel II compliance.

Financial institutions should take a step back before they buy packaged Basel II solutions and make sure that they fully understand where compliance begins. A close look at Basel II compliance shows us that at heart it is a knowledge issue. The key areas in Basel II compliance are data capture, reporting and analysis of credit, market and operational risk, and then mitigating perceived risks through business processes, whether automated or performed physically.

In the IT and business world, knowledge starts with data, and risk factors are identified by analysing data, so the logical place to begin is data modelling. By understanding how you currently operate and what controls you have or haven’t got on the quality of your data, you will begin to identify where your risk areas might be. The basic building blocks for compliance are therefore understanding meta-data, developing data standards and building corporate data models. Like all basic building blocks, everything else will stand or fall on how complete the original data capture is, and decisions based on high-quality information will have better foundations than those based on poor or incomplete information.
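A data-quality control of the kind described above can be sketched as a check of captured records against a metadata definition. The field names and rules here are hypothetical illustrations, not part of any Basel II standard.

```python
# A minimal sketch of a data-quality control: validate captured records
# against a metadata definition before they enter the risk data store.
# Field names and rules are invented for illustration.

METADATA = {
    "customer_id": {"type": str,   "required": True},
    "exposure":    {"type": float, "required": True},
    "rating":      {"type": str,   "required": False},
}

def validate(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one record."""
    problems = []
    for field, rules in METADATA.items():
        if field not in record or record[field] is None:
            if rules["required"]:
                problems.append(f"missing required field: {field}")
        elif not isinstance(record[field], rules["type"]):
            problems.append(f"wrong type for {field}")
    return problems

print(validate({"customer_id": "C-101", "exposure": 250000.0}))  # []
print(validate({"exposure": "unknown"}))
```

Incomplete, inaccurate or obsolete data was the prime cause of loss in the survey cited earlier, and checks like this are where such problems are caught first.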

The next step is to gain a complete understanding of the processes, roles and skills employed in the operation of the business through business process modelling. Combining data and business processes together then provides an enterprise architecture, which details not only the processes and data themselves, but also the relationships between them. The most popular description of an enterprise architecture is based on the Zachman Framework, which models how all parts of an organisation fit together and provides an “as-is” diagram of the organisation.

The levels in the Zachman Framework can be equated to the blueprints for building a house. At the top level are the plans and diagrams that an architect might discuss with the owner, at lower levels are the more detailed specifications that are the concern of the builders. Changing the number of windows or bathrooms in the top level diagrams will have a knock-on effect for heating capacity or drainage requirements which will impact on the lower levels.

An enterprise architecture based on the Zachman Framework means that any changes made by those with an overall picture of the organisation can be examined and followed through the organisation to determine possible impacts at different operational levels. Clearly, the converse is also true – changes made at the deeper levels, for instance IT applications, can be tracked back to determine their implications for the organisation as a whole. In this way, enterprise architecture provides the interface that enables business and IT to be aligned.
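The impact tracing described above can be sketched as a traversal of the dependency graph that an enterprise architecture captures. The architectural elements below are invented examples, not from any real model.

```python
# A sketch of impact analysis over an enterprise architecture: model
# "depends on" relationships between levels as a graph and trace every
# element affected by a change. Element names are illustrative only.

from collections import deque

# "X depends on Y" edges, from business level down to IT applications.
DEPENDS_ON = {
    "loan approval process": ["customer data model", "credit scoring app"],
    "credit scoring app":    ["customer data model"],
    "annual risk report":    ["loan approval process"],
}

def impacted_by(changed: str) -> set[str]:
    """Return every element that directly or transitively depends on `changed`."""
    affected, queue = set(), deque([changed])
    while queue:
        current = queue.popleft()
        for element, dependencies in DEPENDS_ON.items():
            if current in dependencies and element not in affected:
                affected.add(element)
                queue.append(element)
    return affected

# Changing the customer data model ripples up through the application
# layer to the business processes and reports built on it.
print(sorted(impacted_by("customer data model")))
```

The same traversal run in the opposite direction gives the converse analysis: from a deep IT change back up to its organisational implications.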

Financial institutions can use an enterprise architecture to identify and measure risk areas, and to determine if mitigation procedures are in place. For instance, an example process might be a loan application, represented in diagrammatic form below, using the industry standard, Business Process Modeling Notation (BPMN).

Capturing the process defines operational activities and logic, and enables the linkage from the activity “Reject Application” to its associated risk definition diagram. The risk definition diagram contains related objects that define: the owner of the risk and its effects; the probability and impact of the risk; control procedures for monitoring and minimizing the risk; affected key performance indicators, such as processing efficiency and customer satisfaction; and actual losses incurred.

Hence capturing all relevant information against each risk and operational process enables the organisation to define the risk, identify actual and potential mitigation activities and rank the risk, thus providing all the information needed for well-informed decision-making.
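The risk information listed above can be sketched as a data structure, with a simple probability-times-impact score standing in for the ranking step. The fields, scales and figures are assumptions for illustration, not part of the BPMN or Basel II specifications.

```python
# A hedged sketch of one risk definition as a data structure: owner,
# probability, impact, controls, KPIs and losses, plus a simple expected-loss
# score for ranking. All field choices and numbers are illustrative.

from dataclasses import dataclass, field

@dataclass
class OperationalRisk:
    name: str
    owner: str
    probability: float          # assumed likelihood per year, 0.0 - 1.0
    impact: float               # estimated loss if the risk occurs
    controls: list[str] = field(default_factory=list)
    affected_kpis: list[str] = field(default_factory=list)
    actual_losses: float = 0.0  # losses already incurred

    def exposure(self) -> float:
        """Expected annual loss, used here to rank risks against each other."""
        return self.probability * self.impact

fraud = OperationalRisk(
    name="Fraudulent loan application accepted",
    owner="Head of Credit Operations",
    probability=0.02,
    impact=1_500_000.0,
    controls=["four-eyes review", "automated identity checks"],
    affected_kpis=["processing efficiency", "customer satisfaction"],
)
print(fraud.exposure())  # roughly 30,000 expected loss per year
```

Sorting a portfolio of such objects by `exposure()` gives the ranking the text describes, with the control and KPI lists providing the mitigation context for each decision.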

Once you have an accurate picture of the organisation, you can then begin to look ahead – and for financial institutions, being able to predict future scenarios will be a valuable tool in Basel II compliance in the long term. Business process simulation can be carried out using the enterprise architecture to examine how potential changes at different points in the organisation affect other areas. Using simulation gives an opportunity to measure risk and predict the cost of failed processes, thus enabling quantification of different potential risk scenarios. These predictions may highlight a need to invest in additional IT applications, demonstrate that current applications are being used inefficiently, or indicate that business processes need to be addressed. Basing business process simulation on sound original data will be the key to making sound business decisions based on the predictions.
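A minimal Monte Carlo sketch shows the kind of quantification such simulation enables: estimating the annual cost of failed process executions under an assumed failure rate. The volumes, rates and costs below are invented for the example.

```python
# A minimal Monte Carlo sketch of business process simulation: estimate the
# yearly cost of failed loan applications under an assumed failure rate and
# an assumed clean-up cost per failure. All parameters are illustrative.

import random

def simulate_annual_loss(applications: int, failure_rate: float,
                         cost_per_failure: float, runs: int = 1_000,
                         seed: int = 42) -> float:
    """Average simulated yearly loss from failed process executions."""
    rng = random.Random(seed)  # fixed seed for a repeatable sketch
    total = 0.0
    for _ in range(runs):
        failures = sum(1 for _ in range(applications)
                       if rng.random() < failure_rate)
        total += failures * cost_per_failure
    return total / runs

# 1,000 applications a year, 2% failing, $5,000 cost per failure:
# the expected loss is around 1000 * 0.02 * 5000 = $100,000 a year.
print(round(simulate_annual_loss(1_000, 0.02, 5_000.0)))
```

Rerunning the simulation with different failure rates or costs is how the scenarios in the text – extra IT investment, inefficient applications, broken processes – can be compared in money terms.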

Logic tells us that compliance with any new financial regulation – Basel II, Sarbanes-Oxley, the OFR – will be based on the same original data and business processes. Building Basel II compliance on an enterprise architecture will therefore leave financial institutions better prepared to meet other regulations, now and in the future.

Martin Owen
Manager of Consultancy Services
Popkin Software

www.popkin.com

Martin Owen, Consultancy Services Manager for Popkin, is responsible for the management, training and development of the Popkin International consultancy team. Since joining in 1989, Owen has been involved with many Popkin products and has worked across a range of roles. Organisations that he has worked with at a strategic level include CSC, British Telecom and IBM.

Owen has also been involved in the development of the Business Process Modelling Notation (BPMN) within the BPMI.org.
