
FIBO Summit Opening Remarks by EDMC Managing Director Mike Atkin

June 12, 2013

[Editor’s Note: As our own Jennifer Zaino recently reported, the Enterprise Data Management (EDM) Council, a not-for-profit trade association dedicated to addressing the practical business strategies and technical implementation realities of enterprise data management, held a two-day FIBO Technology Summit in conjunction with MediaBistro’s Semantic Technology & Business (SemTechBiz) Conference, June 7th and 8th in San Francisco, California.  SemTechBiz was chosen for the summit because of its proximity to the leading minds in Silicon Valley.
In morning and afternoon sessions, led by distinguished academic and industry leaders, 60 top developers discussed four key technology challenges and developed plans that will lead to solutions critical to simultaneously lowering the cost of operations in financial institutions and ensuring the transparency required by regulations put in place since the beginning of the financial crisis of 2008.
Michael Atkin, EDM Council Managing Director, began the deliberations with the following charge to the assembled experts.]

I spent the majority of my professional life as the scribe, analyst, advocate, facilitator and therapist for the information industry.   I started with the traditional publishers and then moved on to my engagement in the financial information industry.  I watched the business of information evolve through lots of IT revolutions … from microfiche to Boolean search to CD-ROM to videotext to client server architecture to the Internet and beyond.

At the baseline of everything was the concept of data tagging – as the key to search, retrieval and data value.  I saw the evolution from SGML (which gave rise to the database industry).  I witnessed the separation of content from form with the development of HTML.  And now we are standing at the forefront of capturing meaning with formal ontologies and using inference-based processing to perform complex analysis.

I have been both a witness to (and an organizer of) the information industry for the better part of 30 years.  It is my clear opinion that this development – and by that I mean the tagging of meaning and semantic processing – is the most important development I have witnessed.  It is about the representation of knowledge.  It is about complex analytical processing.  It is about the science of meaning.  It is about the next phase of innovation for the information industry.

Let me see if I can put all of this into perspective for you.  Because my goal is to enlist you into our journey.  I know (with absolute certainty) that we are standing at a breakthrough moment and I’m fortunate enough to have a front row seat in many of these discussions.  Some of you may know that I run the EDM Council and have been preaching the gospel and advising financial institutions around the world on the data mandate for many years – and now they care.  And they care at the top of the house.  And they care enough to deal with the huge task of changing how their organizations operate.    And they care enough to usher in a whole new infrastructure across their organizations and across the world.  This is truly a big deal.

I am a member of the US Treasury’s Financial Research Advisory Committee.  This is the mechanism created by the new Office of Financial Research to implement data standards and conduct research about systemic risk.  I am the chair of the Data and Technology Committee and charged with helping the OFR define the pathway forward from a data perspective.  And they understand the importance of data comparability.  They understand that without data standards (i.e. identifiers, language of the financial contract and classification) – they won’t be able to provide oversight over the unruly financial industry.  And they are starting to understand the importance of semantic processing as the pathway through the analytical minefield of interconnected global risk.

I am a member of the Technical Advisory Committee of the Commodity Futures Trading Commission.  They are sitting in the midst of a data wildfire – and they know it.  They need to facilitate transparency in the derivatives market.  They need to understand how these bespoke contracts actually work.  They need to standardize product identification.  They need to classify these instruments so they can be aggregated and linked.  They need to align data meaning with messaging standards.  They need to validate and normalize data across exchanges and across geography.  And they need to support complex analytics based on ad-hoc scenarios and on-demand – when threats to financial stability begin to emerge.  It’s the ultimate use case for both ontologies (the language part) and inference processing (the technology part).

I sit on the Financial Stability Board’s Public Sector Advisory Group helping to implement a legal entity identification standard.  This standard is only the first step.  The real goal is reporting about ownership structures, control relationships and intercompany linkages.  And the light is beginning to shine – not just on the importance of the identifier, but on the role of ontologies about ownership and control and in understanding the nature of (what David Newman describes as) transitive exposure.   This is about understanding who finances whom, who owns whom, who guarantees whom, who is obligated to whom – what happens under what conditions and ultimately “do I get paid before you do” in the event of another financial crisis.  This (of course) is the focus of FIBO for business entities – the first standard that we are releasing in partnership with the OMG.
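The idea of transitive exposure can be sketched in a few lines of code: if Bank A owns Fund B, and Fund B guarantees Dealer C, then Bank A is indirectly exposed to Dealer C.  The following is a minimal illustration only – the entity names and relationships are invented, and FIBO’s actual ownership-and-control model is far richer than a simple reachability walk over a graph.

```python
from collections import defaultdict

# Toy facts expressed as (subject, relation, object) links.
# All names are hypothetical, invented for this example.
facts = [
    ("BankA",   "owns",       "FundB"),
    ("FundB",   "guarantees", "DealerC"),
    ("DealerC", "owes",       "BankD"),
]

# Build a directed graph: entity -> entities it is linked to.
graph = defaultdict(list)
for subj, rel, obj in facts:
    graph[subj].append(obj)

def transitive_exposure(entity):
    """Return every entity reachable from `entity` via any chain of links."""
    seen, stack = set(), [entity]
    while stack:
        node = stack.pop()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(sorted(transitive_exposure("BankA")))  # ['BankD', 'DealerC', 'FundB']
```

In a real semantic deployment the same question would be posed to a reasoner over an OWL ontology, where a property chain (owns ∘ guarantees, and so on) lets the inference engine derive the indirect exposure automatically rather than by hand-written traversal.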

And this story continues.  We are right now working with the Bank of England to align their liquidity reporting requirements to FIBO – so that there is clarity in reporting and an ability to do comparative analysis.  We are right now working with some of the largest banks in the world to align their data repositories in order to do consistent aggregation – so that they can perform the range of stress tests now being mandated by the Federal Reserve.  We are right now working with Fannie Mae and the housing regulators to align data … to integrate it into the MISMO XML messaging schema … and to help them unravel the dynamics of the mortgage-backed securities market.

But most importantly, we are right now gearing up to address the most important new objective within this wonderful new Age of Transparency.  Earlier this year, the Basel Committee on Banking Supervision released a very important document – affectionately known as the Basel Risk Data Aggregation Principles (or Basel RDA).

This document is the result of an evolutionary process.  The evolution began just after the 2008 crisis as the global market authorities started documenting what went wrong and what we need to fix as a result.   The SEC and CFTC jointly released a study specifying that the only way through the minefield of complexity was to specify these complex financial instruments based on the underlying facts that define them – and that the financial industry should partner with government and academia to implement this “algorithmic capability” (that’s the term they used).  This is the study mandated by Section 719(b) of the Dodd-Frank Act, known to DC policy wonks simply as the 719(b) study.

In 2010, the Senior Banking Supervisors Group (these are the heads of the world’s leading central banks) released a study that defined the concept of a “risk data appetite framework” and made the strong and inextricable connection between risk management and data.  The story went something like this … we regulators are entrusted with a bunch of new tasks (financial stability, transparency and all that).  In order to accomplish these new tasks we need comparable data across your organizations and across the industry so we can feed them into our models and run our economic scenarios.  Plus we regulators are not technically capable of doing the reconciliation – so the onus is on you (the banks) to deliver aligned data.  Oh and by the way – if you can’t do this, we would be very worried about your internal ability to control your own risk.  So you need to fix this data problem.

This year, the Basel Committee released the RDA principles document.  This document takes the SBSG recommendations up a few notches and mandates the implementation of this control environment.  The 14 RDA principles say three things about data that give me hope and get me charged up.  The first is that the ability to aggregate risk is mandatory and that executive management is responsible for making sure that happens.  This is the governance mandate for data management.  The second is that the banks have to implement an aligned data infrastructure.  This includes identifiers, metadata, naming conventions and harmonized data definitions.  This is the infrastructure mandate for data management.  And the final part is that the banks must have the ability to aggregate risk across business units on demand and in response to ad-hoc economic scenarios.  And while the financial institutions and the regulatory authorities don’t fully understand it yet – this is the semantics mandate for the financial industry.

And this, the … “they don’t know it yet” part … is the essence of the challenge that lies before us.  All around the financial industry – from the financial institutions themselves to the regulators that oversee them to the data vendors that serve them – the objectives of transparency, financial stability and cross-asset market surveillance are tailor-made for the promise of precise language, based in contractual reality, combined with executable business rules, integrated with other taxonomies, aligned with messaging and managed via inference processing.

It is an outstanding use case for ontologies and for semantic triples.  It is backed with the threat of regulatory mandates.  It is designed to be implemented via standards.  And it is combined with the necessary governance to ensure that we don’t ignore the problem due to concerns about business case or fall victim to the unfortunate “curse of the short view” that is so prevalent among large financial institutions.

And so, while we have the financial industry use case with all the regulatory trimmings – we don’t have complete awareness.  We don’t really understand data as meaning.  We don’t really understand that this is not a data processing or IT problem.  We don’t have aligned data glossaries across business units.  We don’t even know how to spell metadata – let alone utter the “o” word in polite company.  We live in a world of reconciliation.  We’re used to reconciliation.  We know how to deal with operational crises on a tactical basis.  We’ve been practicing that reality since the dawn of credit.

But I’m undaunted.  In fact, I’m excited.  Data management has risen like a phoenix, crawled out of the depths of the back office and is no longer considered the ugly stepchild of IT.  At the moment, data management is on the agenda of every financial institution.  In fact, it is one of the top issues on the agenda of executive management within the financial industry.

It is also the hot topic of the day within the regulatory community.  Regulators, market authorities and agencies around the industry are waking up to the fact that they cannot accomplish their new goals of unraveling the complexity of the global financial market without comparable data and without a shared view about the “things” in our world, the “facts” about these things and about how the “relationships” among these things work in reality.
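That shared view of “things,” “facts,” and “relationships” is exactly what a semantic triple captures: a subject, a predicate, and an object.  The snippet below is a deliberately miniature, hypothetical illustration of the idea – a handful of invented triples and a wildcard pattern query – not the API of any real triple store such as those that would actually serve FIBO in RDF/OWL.

```python
# A semantic triple is (subject, predicate, object): a "thing",
# a "relationship", and another "thing" or "fact" about it.
# All identifiers below are invented for illustration.
triples = {
    ("IRSwap123", "isA",             "InterestRateSwap"),
    ("IRSwap123", "hasCounterparty", "BankA"),
    ("BankA",     "isA",             "LegalEntity"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "What do we know about IRSwap123?"
for t in sorted(query(s="IRSwap123")):
    print(t)
```

A SPARQL query over an RDF store works on the same principle – variables in a triple pattern play the role of the `None` wildcards here – which is why comparable, triple-shaped data is the precondition for the cross-institution analysis the regulators want.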

But we are at the beginning of the journey, not the end.  Awareness is essential.  Global economic crisis and tough regulatory oversight were necessary to change the orientation of this industry.  But awareness and drivers are not sufficient.  We have work to do.  And we don’t have a huge window of time in which to operate.

But we do have a good start.  The EDM Council has been developing the ontology of the contract as a collaborative project under the leadership of Mike Bennett for the past 5 years.  This is the Financial Industry Business Ontology (or FIBO).  It exists.  It covers all known financial instruments.  It covers business entities and the roles they play in financial processes.  It covers lots of the basic concepts of risk, transactions, corporate actions, issuance, guarantee and collateral.  It plays nicely in the sandbox with other ontologies and with messaging taxonomies.  And it’s governed by the technical rigor of the OMG standards process.

We do have a standard methodology for FIBO developed and implemented under the steady hand of Dennis Wisnosky and in partnership with the OMG architecture board under the tireless dedication of people like Elisa Kendall and Pete Rivett.

We do have a robust illustration of an operational ontology showing the intersection of interest rate swaps, credit default obligations and legal entity relationships under the direction of the remarkable David Newman from Wells Fargo and his OTC derivatives team.  We do have a metadata repository and a means of extracting FIBO in RDF/OWL thanks to Adaptive.  We do have good working relationships with other stakeholders including the messaging schemas, the SemTech community, the financial institutions and the regulatory agencies in the US, UK and Europe.  So we’re in a fairly good place.

What we need to complete this picture however – is you.  The semantic community.  There are some real technical challenges that need to be solved – like the four that have been teed up for this Summit.  We do have to be able to deliver on the promises that we are making.  And the stakeholders are seriously listening to the promises – mainly because they have real tasks to accomplish and this is the right pathway forward.

What I fear however is fragmentation rather than harmonization.  I see lots of activity – but not a sufficient mechanism for achieving alignment among the banks, the regulators, the semantic community, academia, the ontologists and the data vendors.  That’s what we hope to see emerge out of this event – the mechanism for alignment – the process for collaboration – and the means to collectively step up to the challenge.

We are standing at that rare moment – the perfect storm (if you will) between business objectives, regulatory mandates and data as the fundamental pillar that links these things together.  It’s been a long time coming and (as Rahm Emanuel once said) – “shame on us if we waste a good crisis.”

Thank you.

FIBO Summit Technical Objectives

  1. Generation of operational ontologies in RDF/OWL from conceptual ontologies
  2. Conversion of requirements (i.e. regulatory rules) into executable semantic rule statements
  3. Visual representation of all forms of semantic content with enough rigor to support reasoning
  4. Sharing of semantics and analytics at the scale of the financial system
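The second objective – turning a written regulatory rule into something a machine can execute – can be sketched very simply.  The rule text, the threshold, and the numbers below are all invented for illustration; real semantic rule statements would be expressed in a rules language over the ontology, not as ad-hoc Python.

```python
# Hypothetical prose rule: "a bank's exposure to any single counterparty
# must not exceed 25% of its capital" rendered as an executable check.
# Counterparty names and amounts are invented for the example.

def check_large_exposures(exposures, capital, limit=0.25):
    """Return the counterparties whose exposure breaches the limit."""
    return sorted(cp for cp, amount in exposures.items()
                  if amount > limit * capital)

exposures = {"BankA": 30.0, "FundB": 10.0}   # exposures in millions
breaches = check_large_exposures(exposures, capital=100.0)
print(breaches)  # ['BankA'] -- 30 exceeds 25% of 100
```

The point of the objective is that once the rule, the counterparties, and the exposures are all expressed against a shared ontology, checks like this can be generated from the rule statement itself rather than hand-coded bank by bank.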

About the Author

Mike is a professional facilitator and has been a financial information industry advocate for over 25 years. He is currently the Managing Director for the Enterprise Data Management Council – a business forum for financial institutions, data originators and vendors on the strategy and tactics of managing data as an enterprise-wide asset.  Mike is involved with many organizations, provides strategic advice to members and is a frequent speaker on a range of issues associated with data management. Mike has been the Managing Director of the EDM Council since February 2006.
