
How Fares the Financial Services Industry in Implementing Data Management?

By Jennifer Zaino  /  October 15, 2015


Remember the financial crisis? It’s hard to forget that one, of course. It occurred after the financial services industry’s renewed interest in Data Management following 9/11, but unfortunately those efforts mostly failed to focus on the important issue of data relationships.

“Their eyes were set on cost-cutting, [but] the world was exploding with data,” said John Bottega, principal and managing member of the consulting firm Data Management Advisory Services LLC, during a presentation at last month’s Data Governance in Financial Services Conference, produced by DATAVERSITY® and DebTech International. Bottega is also a senior advisor to the EDM Council’s Chief Data Officer Forum and Data Management Practice, and a former CDO at both Bank of America and the Federal Reserve Bank of New York.

As data grew by leaps and bounds, “interdependencies and relationships piled on top of each other,” he said – so much so that when the financial crisis occurred in 2008, it was nearly impossible to link data together to model contagion. No one could understand who was financing whom, who was linked to whom, or the entirety of complex financial instruments.
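What “linking data together” means here is, at its simplest, a graph problem. The sketch below is purely illustrative (the entity names and exposure amounts are hypothetical), but it shows how, once counterparty exposures are captured as linked records, tracing the fallout of a failure becomes a straightforward graph traversal. Without those links, the same question has no answer.

```python
from collections import defaultdict, deque

# Hypothetical counterparty exposures: (creditor, debtor, amount) means
# the creditor is exposed to the debtor for that amount.
exposures = [
    ("FundA", "LehmanX", 500.0),
    ("BankB", "LehmanX", 300.0),
    ("BankC", "FundA",   200.0),  # BankC is in turn exposed to FundA
]

def trace_contagion(failed_entity, edges):
    """Breadth-first walk over exposure links from a failed entity,
    returning every creditor reachable through a chain of exposures."""
    creditors_of = defaultdict(list)  # debtor -> [(creditor, amount), ...]
    for creditor, debtor, amount in edges:
        creditors_of[debtor].append((creditor, amount))

    impacted = {}
    seen = {failed_entity}
    queue = deque([failed_entity])
    while queue:
        current = queue.popleft()
        for creditor, amount in creditors_of[current]:
            impacted[creditor] = impacted.get(creditor, 0.0) + amount
            if creditor not in seen:
                seen.add(creditor)
                queue.append(creditor)
    return impacted

print(trace_contagion("LehmanX", exposures))
# {'FundA': 500.0, 'BankB': 300.0, 'BankC': 200.0}
```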

Gaps in the quality and completeness of data left decision makers without the important, accurate, and timely information needed to take quick action that might have lessened the crisis’ effects. “There was more data but it wasn’t actionable,” he said. Harkening back to the Lehman Brothers bankruptcy, “we couldn’t determine the impact,” Bottega said. “Now, some say if we knew what it would have been, we probably would not have let [it] fail.”

In the wake of the crisis, a defensive posture was struck. Regulations and compliance requirements such as BCBS 239 emerged to demand risk data aggregation frameworks that are effective, fully documented, well resourced, and accountable. These frameworks were to support integrated data architectures; controls across the full data lifecycle; on-demand, ad-hoc, and scenario-based reporting; and timely, accurate, and comprehensive risk data aligned to concepts for consistency of meaning across the organization. BCBS 239, he summarized, has three main tenets: the governance mandate, the data infrastructure mandate, and the data quality mandate.
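The data quality mandate, in particular, comes down to concrete, testable controls. The following is a minimal sketch rather than anything prescribed by the BCBS 239 text itself; the field names, the 24-hour threshold, and the record layout are all assumptions, meant only to suggest what completeness, accuracy, and timeliness checks on risk data can look like in practice.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical schema and threshold, not taken from BCBS 239 itself.
REQUIRED_FIELDS = {"counterparty_id", "exposure_amount", "as_of"}
MAX_STALENESS = timedelta(hours=24)

def check_risk_record(record: dict) -> list[str]:
    """Return the data quality issues found in one risk data record."""
    issues = []

    # Completeness: every required field must be present.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"incomplete: missing {sorted(missing)}")

    # Accuracy: exposures must be non-negative numbers.
    amount = record.get("exposure_amount")
    if amount is not None and (not isinstance(amount, (int, float)) or amount < 0):
        issues.append("inaccurate: exposure_amount must be a non-negative number")

    # Timeliness: the record must fall inside the reporting window.
    as_of = record.get("as_of")
    if as_of is not None and datetime.now(timezone.utc) - as_of > MAX_STALENESS:
        issues.append("untimely: record is older than the reporting window")

    return issues

stale_record = {"counterparty_id": "CP-001", "exposure_amount": -10.0,
                "as_of": datetime.now(timezone.utc) - timedelta(days=2)}
print(check_risk_record(stale_record))
# ['inaccurate: exposure_amount must be a non-negative number',
#  'untimely: record is older than the reporting window']
```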

Fast-forward to today and the good news is that the financial services industry is at the point where Data Governance programs in the sector are more sustainable, he said. But it’s still not an easy job to support a well-governed approach to Data Management. He noted a number of factors that remain obstacles:

  • There are still many data repositories, contributing to implementation and orchestration challenges.
  • Business mindsets aren’t necessarily well-aligned with Data Governance efforts, with a tendency to view them as expensive, disruptive, and not particularly ROI-oriented.
  • There’s a dearth of available individuals with the right skills to take on the task, ranging from IT knowledge, to product and process expertise, to modeling and project management capabilities, as well as a good understanding of legal issues.
  • Efforts must trudge along even when data ownership and accountability, executive support, and guaranteed funding aren’t assured, while data quality problems certainly are.


Go on the Offense

There’s good reason, though, to keep working against the odds to get these Data Management and Governance issues in hand, and there are solutions to help financial institutions take on many of them.

For one thing, it’s critical to keep in mind that all the infrastructure that is put in place and steps that are taken around Data Governance to satisfy regulations and compliance requirements translate to helping financial organizations take Big Data in hand, as well. “We want to do Big Data stuff – cool stuff, analytics, predictive analytics, inference processing,” Bottega said. “It’s all about getting the right information to the right person at the right time,” he explained, with the end game of extracting maximum value from data that is discoverable, actionable, trusted, and accurate.

That’s the route to driving new products, creating service innovations, and better deciphering customer needs overall. It’s the road, as it were, to growth and revenue enablement. Ontologies, Semantics, Machine Learning, pattern recognition, linkages, data lakes – they’re all part of the Data Management equation for creating a controlled data environment and enabling the business’ offensive market moves.

The DCAM (Data Management Capability Assessment Model) developed by the EDM Council has an important place in all this. It provides a Data Management best practice foundation to address both the regulatory and business case opportunities. “It fits the eight dimensions of Data Management,” he said, into a coordinated, cohesive, operational model. Components of that model include assessing capability maturity around data quality to ensure that data is fit for purpose; data control environments; technology architectures; Data Management strategies; data architectures; Data Management business cases; Data Management programs; and Data Governance.

Each category is defined by a set of capabilities and sub-capabilities, which themselves are evidenced by a series of capability objectives. The capabilities of the Data Governance component, for instance, include creating a Data Governance structure, defining content governance, writing and approving policy standards, aligning cross-organizational data and technology governance, programming and putting in place governance controls, and operationalizing program governance. The sub-capabilities that go into writing and approving policy standards include factors such as their review by relevant program stakeholders and senior executive governing bodies.
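That component, capability, and sub-capability shape lends itself to a nested assessment structure. As a hypothetical illustration (the scores and the 1-to-5 scale below are assumptions, not DCAM’s published scoring rubric), a maturity assessment over that hierarchy might be organized like this:

```python
# Hypothetical assessment data shaped like the hierarchy described above:
# component -> capability -> sub-capability -> maturity score (assumed 1-5 scale).
dcam_assessment = {
    "Data Governance": {
        "Write and approve policy standards": {
            "Reviewed by relevant program stakeholders": 3,
            "Approved by senior executive governing bodies": 2,
        },
        "Create a Data Governance structure": {
            "Governance roles and structure defined": 4,
        },
    },
}

def component_maturity(component: dict) -> float:
    """Average the sub-capability scores rolled up under one component."""
    scores = [score
              for capability in component.values()
              for score in capability.values()]
    return sum(scores) / len(scores)

for name, component in dcam_assessment.items():
    print(f"{name}: {component_maturity(component):.1f} out of 5")
# Data Governance: 3.0 out of 5
```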

“Data Governance is very important to getting people to behave and change the way they do things,” Bottega said. The Chief Data Officer must be empowered to communicate that everyone in the organization is a steward of data in some way, shape, or form, he added. “If you don’t all collaborate around that, you fail in a data world,” he said.

Financial companies who align their data programs with the DCAM industry best practices, he noted, have an opportunity to communicate and demonstrate their Data Management capabilities up through the management chain and to others, including regulators, to show that they are adhering to a model that is critical to building, sustaining, and leveraging their data.

Financial Firms Live the Obstacles, but See the Opportunity

Fellow presenter Michael Atkin, EDM Council managing director, shared observations from the Council’s 2015 Data Management benchmarking study. Based on DCAM assessment measurements that represent the intersection of Data Management best practices and the reality of financial services operations, the study revolved around 21 questions covering Data Management strategy, funding mechanisms, end-to-end lineage, operating models, governance approaches, data architecture, and data quality practices. It was the first study of Data Management capability conducted across the financial industry.

The EDM Council collected information from thousands of global financial industry participants, finding, among other things, that BCBS 239 is now the number-one Data Management driver in the industry. Operational efficiency and business value, however, are also well represented in the rankings.

“We are doing Data Management because we have to, but we understand there is value there,” Atkin said. The industry sees that the same data can be applied to business use cases and that the investment in Data Management must be leveraged across the organization.

That said, the industry is still in something of a transitional phase when it comes to undertaking Data Management activity, Data Governance included. Just over one-third of respondents said their Data Management programs were fully operational, while 43% said their efforts were formed but still in the process of becoming operational. So Data Management is underway and programs are in place, but in many instances capabilities have not yet been achieved, never mind enhanced.

“The major players certainly understand the importance of getting data right, and the discipline of Data Management is gaining a strong foothold in the financial industry,” Atkin said. “But the scope of the task is significant and the organizational challenge of doing it right is daunting.”

About the author

Jennifer Zaino is a New York-based freelance writer specializing in business and technology journalism. She has been an executive editor at leading technology publications, including InformationWeek, where she spearheaded an award-winning news section, and Network Computing, where she helped develop online content strategies including review exclusives and analyst reports. Her freelance credentials include being a regular contributor of original content to The Semantic Web Blog; acting as a contributing writer to RFID Journal; and serving as executive editor at the Smart Architect Smart Enterprise Exchange group. Her work also has appeared in publications and on web sites including EdTech (K-12 and Higher Ed), Ingram Micro Channel Advisor, The CMO Site, and Federal Computer Week.
