What was your data doing during the financial crisis?
Six ways to rethink your data management strategy – and avoid worst-case scenarios
by Mazhar LeGhari, Business Solutions Manager, SAS
“Those who cannot remember the past are condemned to repeat it.” Spanish philosopher George Santayana wrote those words more than a century ago. But enlightenment by hindsight is a slow process. Humans still persist in behaviors that do not work, and organizations still persist with information processes that are broken.
Nowhere is that more evident than in the financial services industry, where the past has been one of tremendous upheaval, and the future doesn’t look much different.
Naturally, the US federal government responded with regulations intended to prevent us from being condemned to repeat the past, such as the Sarbanes-Oxley Act of 2002 and the Dodd-Frank Act of 2010. The 17 countries in the eurozone have more than 40 financial supervisory authorities – with weak coordination among them – each with its own directives and oversight.
The aims of these regulations are worthy – to ensure that financial institutions know the sources and recipients of funds, have transparency in accounting processes, are accurate in financial representations, and have sufficient capital reserves to continue operations even during times of economic and financial duress.
However, each new regulation has added complexity and reporting burdens to a business environment that was already growing more complex. E-commerce, mobile and online banking, wave-to-pay and more – new banking channels are dramatically reshaping the data management landscape.
What was your data doing while all this was happening?
A common denominator in all these historical markers is data – either the lack of it or the inability to gain timely and trusted insights from it. Could we have foreseen the mortgage meltdown, the financial institutions’ crises and the recession if we had gotten our arms around more data and done more to correlate it? Could the dot-com bubble have been averted if investors had better knowledge about the true valuation of the companies they were investing in? Could crash losses have been avoided or minimized if data systems had detected the early-warning signs?
Yes, of course, but that level of knowledge has been elusive. Many IT architectures were built 10 and 20 years ago for business as usual. But business is not as usual anymore, and financial institutions need to focus on six top data issues:
- Unified perspective. Customer expectations and regulatory reporting require the ability to easily link quality data across business and product silos, without undertaking a wholesale overhaul or creating yet more single views.
- Data agility. Once you have gained the necessary cross-functional, cross-system perspective, provide the means for data processes to adapt quickly to inevitable future changes.
- Data definitions. To support enterprisewide decisions and reporting, establish consistency in how you define such elements as creditworthiness, risk tolerance and market segments.
- Defining anomalies. Get consensus about what constitutes a trouble condition, such as a high-risk credit application, fraud or other patterns that should be flagged for investigation (see the sketch after this list).
- Data-driven processes. Machine-to-machine interactions are commonplace, such as in algorithm-driven trading. Lacking human scrutiny, unmanned systems need trusted data and rigorous early-warning systems.
- Data governance. Who owns the data? Who manages it? Who can use it? And for what? Traditional governance has been a hybrid model – part centralized with IT and part patchwork entropy. Now that different functions need parallel access to the same data, data should be business-led and IT-managed, with a close coupling between the two.
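To make the "data definitions" and "defining anomalies" points concrete, here is a minimal sketch of what a shared definition of creditworthiness and an agreed-upon flagging rule might look like in code. The risk bands, thresholds and field names (credit_score, debt_to_income) are illustrative assumptions only – not any institution's actual policy and not SAS product functionality.

```python
from dataclasses import dataclass

# Illustrative, shared definition of creditworthiness bands.
# Thresholds are hypothetical -- the point is that every line of
# business imports this one definition instead of hard-coding its own.
RISK_BANDS = [
    ("prime", 720),       # credit score 720 and above
    ("near_prime", 640),  # 640-719
    ("subprime", 0),      # below 640
]

def risk_band(credit_score: int) -> str:
    """Map a credit score to the enterprise-wide risk band."""
    for band, floor in RISK_BANDS:
        if credit_score >= floor:
            return band
    return "subprime"

@dataclass
class CreditApplication:
    applicant_id: str
    credit_score: int
    debt_to_income: float  # e.g., 0.45 means 45 percent

def flag_for_review(app: CreditApplication) -> bool:
    """Agreed-upon anomaly rule: flag high-risk applications for review.

    The conditions below are placeholders; the governance point is that
    the rule lives in one versioned place, not scattered across silos.
    """
    return risk_band(app.credit_score) == "subprime" or app.debt_to_income > 0.50

if __name__ == "__main__":
    app = CreditApplication("A-1001", credit_score=605, debt_to_income=0.38)
    print(risk_band(app.credit_score), flag_for_review(app))  # subprime True
```

Whether such logic ultimately lives in SQL, a data quality tool or a rules engine matters less than the principle it illustrates: one agreed definition and one agreed trigger, owned by the business and managed by IT.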
Until recently, many financial institutions bought software applications in silos to suit the purposes and market pressures of each line of business. As a result, vendors often pitch capabilities to retail, commercial or wealth management divisions individually – even though those divisions are all part of the same organization.
Today, banks need to ask whether that siloed approach is still best. If we’re trying to avoid the mistakes of the past, we need to start doing things differently.
Mazhar LeGhari delivers strategic insight for solutions and best practices in information management based on his deep experience in data governance, data quality, master data management and data integration. Before joining SAS, LeGhari spent 11 years as a solution architect and product manager for enterprise information management systems with two other large software vendors.
Read More
- Bank of Queensland used data management to become a more efficient organization. (success story)
- Toyota Financial Services CIO is a model for change. (article)