IFRS 9 impairment regulation: How to prepare for the data tsunami
By Martim Rocha, Advisory Business Solutions Manager, SAS Risk Center of Excellence
With the introduction of International Financial Reporting Standard 9 (IFRS 9) in January 2018*, banks will have to change the processes they currently use to calculate credit impairments. The new regulation changes the way credit losses are recognized: today, impairments are based on “incurred losses,” while IFRS 9 introduces a forward-looking model based on expected credit losses (ECL).
The SAS white paper Achieving Optimal IFRS 9 Compliance provides the background to the issues and an approach to handling the latest ECL impairment standard. The principal impact on banks is the need to recognize ECL at all times for all financial instruments, and at individual- and grouped-asset levels. Banks will have to update the ECL amounts at each reporting date to reflect changes in the credit risk of financial instruments. This will significantly increase the number and frequency of impairment calculations that must be performed and the amount of information that must be collected to do so.
The volume of information will grow across several aspects of delivering the new impairment approach: higher granularity, the forecasting of future losses, new models and simulation requirements.
Higher granularity
The model to calculate ECL introduces the need to consider every financial instrument at an individual-asset level, which means a large amount of data needs to be collected and processed. While the need to collect data at the individual-asset level exists today in banks (e.g., capital calculations), the level of data granularity required for the IFRS 9 impairment process represents a new challenge for finance departments. It is also worth noting that the assessment of the impairment process will fall under the scope of statutory auditors, which is not the case with capital calculations.
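To make the granularity point concrete, below is a minimal sketch of an instrument-level ECL calculation using the common PD × LGD × EAD decomposition. The portfolio records, field names and figures are hypothetical and are not drawn from the white paper or any SAS product.

```python
# Minimal sketch: 12-month ECL per instrument, assuming ECL = PD * LGD * EAD.
# All identifiers and figures below are illustrative only.

portfolio = [
    {"id": "LOAN-001", "ead": 250_000.0, "pd_12m": 0.015, "lgd": 0.45},
    {"id": "LOAN-002", "ead": 1_000_000.0, "pd_12m": 0.003, "lgd": 0.30},
    {"id": "LOAN-003", "ead": 75_000.0, "pd_12m": 0.060, "lgd": 0.55},
]

def ecl_12m(asset):
    """Expected credit loss over 12 months for a single instrument."""
    return asset["pd_12m"] * asset["lgd"] * asset["ead"]

for asset in portfolio:
    print(f'{asset["id"]}: ECL = {ecl_12m(asset):,.2f}')

# The provision at the reporting date is the sum over every instrument held.
print(f"Portfolio ECL: {sum(ecl_12m(a) for a in portfolio):,.2f}")
```

Even this simple per-asset loop, repeated at every reporting date across millions of instruments, hints at why data volume and processing time become central concerns.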
Forecast of future losses
The new regulation implies banks will have to forecast future losses. In doing so, banks will have to take each contract and project its behavior into the future, projecting the future cash flows for every asset they hold. This step will generate a significant amount of data. Even though this data might not need to be retained for a long period of time, the mere fact that it must be generated affects both the time to deliver results and the storage space required.
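As a rough illustration of the volumes this creates, the sketch below projects the remaining monthly cash flows of a single amortizing loan. The annuity formula and the loan parameters are generic assumptions chosen for illustration; a production projection would also have to model prepayments, defaults and recoveries under each scenario.

```python
# Sketch: project the remaining monthly cash flows of one amortizing loan.
# Loan parameters are hypothetical.

principal = 200_000.0    # outstanding balance
annual_rate = 0.04       # nominal annual interest rate
months_left = 240        # remaining term in months

r = annual_rate / 12
# Standard annuity payment formula.
payment = principal * r / (1 - (1 + r) ** -months_left)

balance = principal
cash_flows = []
for month in range(1, months_left + 1):
    interest = balance * r
    amortization = payment - interest
    balance -= amortization
    cash_flows.append((month, payment, interest, amortization, balance))

print(f"Monthly payment: {payment:,.2f}")
print(f"Projected rows for this one loan: {len(cash_flows)}")
# A portfolio of a million similar loans would generate hundreds of millions
# of rows per run -- and the run is repeated for every scenario, every period.
```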
New models
In order to forecast losses, new models will have to be built to capture the behavior of a bank’s assets under different economic scenarios. These models will need to be developed from historical data so that past loss behavior is reflected. This implies that historical data capturing loss behavior will have to be available, adding an extra dimension to the already large amount of data needed to calculate ECL.
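The sketch below illustrates the general idea: a simple default model fitted on (synthetic) historical observations that include a macroeconomic driver, so that losses can later be projected under different economic scenarios. The choice of variables, the logistic form and the use of scikit-learn are assumptions made purely for illustration; they are not the methodology recommended by the white paper.

```python
# Sketch: fit a default model on historical data with a macro driver, then
# score a loan under two economic scenarios. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic history: per-loan unemployment rate, loan-to-value ratio and a
# default flag observed over past reporting periods.
n = 10_000
unemployment = rng.uniform(0.04, 0.12, n)
ltv = rng.uniform(0.3, 1.0, n)
true_logit = -6.0 + 25.0 * unemployment + 2.0 * ltv
default = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

model = LogisticRegression().fit(np.column_stack([unemployment, ltv]), default)

# Project the default probability of one loan under two economic scenarios.
loan_ltv = 0.8
for scenario, u in [("baseline", 0.06), ("downturn", 0.10)]:
    pd_hat = model.predict_proba([[u, loan_ltv]])[0, 1]
    print(f"{scenario}: projected PD = {pd_hat:.2%}")
```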
Simulation
The projected losses depend on subjective inputs and on the selection of economic scenarios. Banks will therefore want to simulate the ECL calculation for more than one scenario in order to better measure their exposure and increase the accuracy of the ECL calculation. That means running the calculations and analyzing the results more than once per period, against a backdrop of limited time to deliver. So in addition to the multiplication effect on the amount of results data that must be available to decision makers, there will also be performance constraints, pushing banks to rethink their risk management systems architecture.
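The sketch below shows the kind of probability-weighted view that multiple-scenario runs support; the scenario names, weights and per-scenario ECL figures are invented for illustration and carry no significance.

```python
# Sketch: combine per-scenario ECL results into a probability-weighted ECL.
# Scenario weights and loss figures are purely illustrative.

scenario_results = {
    # scenario: (probability weight, portfolio ECL under that scenario)
    "upside":   (0.20, 12_500_000.0),
    "baseline": (0.55, 18_000_000.0),
    "downturn": (0.25, 31_000_000.0),
}

weights_sum = sum(w for w, _ in scenario_results.values())
assert abs(weights_sum - 1.0) < 1e-9, "scenario weights must sum to one"

weighted_ecl = sum(w * ecl for w, ecl in scenario_results.values())
print(f"Probability-weighted ECL: {weighted_ecl:,.2f}")

# Each extra scenario multiplies the per-instrument results that must be
# produced, stored and analyzed within the same reporting window.
```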
The above challenges will put the spotlight on the ability of supporting IT systems to handle large amounts of data in a timely manner. It will be important that banks can generate robust and accurate impairment calculations on time – not only to meet regulatory reporting needs, but also because of the impact the results have on a bank’s bottom line and capital levels. The more good-quality data banks can assemble, the better their chances of arriving at optimal models and a sound ECL calculation methodology. More details are available in the SAS white paper Achieving Optimal IFRS 9 Compliance.
SAS is widely recognized for its modeling capabilities, solutions and technology that help the financial services industry measure its risk exposures. SAS® software can handle the expected data volumes IFRS 9 impairment processes will generate and offers banks innovative approaches for new IT architectures, such as in-memory processing and grid parallel processing.
* Subject to endorsement by local jurisdictions, with early adoption allowed.
Read More
- Check out SAS High-Performance Risk.
- Read the white paper, Achieving Optimal IFRS 9 Compliance.
- Read the article, Prepare for IFRS 9 convergence with better IT and data practices.