by John Ladley
The first part of our interview with our panel of experts focused on the long legs of the so-called "traditional" data warehouse, and what we might be looking for in the next few years. After all, DW and analytics as a discipline is about 18 to 20 years old and, unlike many other IT trends, it still has legs. Our experts also addressed the very visible changes in analytics: lower data latencies and the seemingly narrowing gap between operational and decisional analytic environments. We also looked at analytics as an expression of change, specifically a change in how business users will actually view business intelligence.
Our panel was made up of
- Ram Krishnan
- Neil Raden
- Claudia Imhoff
- Steven Brobst
- David Wells
First, we asked them directly whether they believed the operational and decisional environments would ever converge, and if not, what would keep them apart. None disagreed with the statement that analytics and DW are still hot topics. However, the discipline is maturing. Commoditization is occurring, and the model for meeting business needs is changing. Exactly how that model is changing may be open to debate, but it is changing.
In general, our panel agreed that while there are certainly changes at hand in the DW analytics area, the field is maturing. Of great interest was the mention by several of our experts that the DW is not necessarily a ‘single version’ of the truth. Steve summed it up by saying we need a ‘single source’ of the truth. It may not be in the guise of a traditional DW, but the old or new version of the DW feeds the analytics environment.
The gap between operational and decisional architecture will close. What that convergence will look like is where opinions diverged; our panelists differed on the "how."
Ram feels that the traditional DW architecture will give way to more service-oriented approaches. While the term DW will remain, it is likely to store not only events but also the records of actions taken in response to those events. Operational uses of analytics will include creating business rules to respond to certain conditions, e.g. in a call center. In essence, this is the long-sought closing of the loop. Neil echoed the convergence of operational and decisional structures but added that increased technology efficiencies will also have a role to play.
Claudia extended this idea: a self-service model is evolving in which the business does most of its own reporting rather than IT, and IT provides the data. Steve and David answered a little differently, in that IT must deliver metrics. The essence is that IT seems destined for a logistics role, while the end user provides the analytics value.
Several of our other panelists felt that while self-service is a goal, in reality a full self-service model will never happen, and should not happen. The "will not happen" comes down to the need to manage reports. Neil reinforced the fact that a lot of analytics is based on reports, and business users cannot keep 1,000 reports in their heads, so management of traditional outputs will remain for a while.
David spoke extensively on the role of desktop analytics. Business users rely heavily on Excel and are accustomed to desktop analytics. New products (we will talk about those in the next issue) will keep this trend in place.
Our interviews also detected a shift in WHAT kind of analytics are produced. DW/DM will remain as architecture components, as they are needed for strategic BI. But as mentioned earlier with the storing of actions taken, there are analytics beyond data. Event analytics will need to be stored and analyzed, often on the fly, as the events are more than transactions. We will most likely see more real-time analytics required for complex events like process control and customer service touch points. This analysis of actions taken on data can be called Decision Intelligence (Claudia's term). Beyond discrete data there is the convergence of content, data, and event analytics, all being brought together in some kind of decision-making environment. (Ed. note: I have labeled this Collaborative Intelligence in the past, if you need a label for it and if you are thinking about that level of sophistication.)
Pervasive BI and analytics are driving this convergence, if not in terms of physical architecture, then in terms of using BI structures and analytics tools to do work. Several of our gurus commented that this was a direct result of society in general being much more comfortable with analytics, having seen them all the time through the web and mass media.
There was some difference of degree regarding SOA, SaaS, or simply more efficient processing with current approaches. This leads EDJ to believe we are liable to see all three in some fashion. There will be some degree of self-service; commoditization of tools and new types of tools support this. But with services becoming more prevalent, Claudia stated the role of IT will change from where it is now, as the all-encompassing analytics provider, to more of a "data getter" and report manager.
Steve Brobst tossed out a slightly different take, mentioning that delivering metrics is the role of IT. You could argue this is the same thing, if the data to provide the analytics has to be gathered.
We also asked our panel whether analytics could have prevented the recent financial meltdowns. EDJ was surprised that they all agreed: the answer is no. Their replies are interesting to note, however.
Ram felt that better BI/analytics systems could have told us more about the exposures. But without closed-loop feedback (which is evolving, as we already discussed), or with only loose coupling to data, the financial sector basically chose to ignore whatever indicators may have been there. Neil mentioned that data was not behind the issue; it was the abuse of credit. Even the best and brightest may not have been able to see the crisis coming, because people do not look for failure. Neil added that even in the future, advanced analytics probably would not help.
Steve also does not think the current crisis would have been dampened by analytics. The causative issues were policy-based. The risk analytics were there, just ignored.
David was philosophical, stating that there is very little analytics can do to prevent greed. On the other hand, it may be hard to assess whether analytics could have prevented the crisis without a consistent definition of analytics. If you extend that definition to include placing numbers in some kind of context, then yes, perhaps analysis could have helped; but if analytics is just delivery and presentation, then no. Which does beg a question: if analytics is so prevalent, should we not consider the context in which the analysis is made? That is, maybe analytics could not have prevented the current problems, but in the future perhaps we can place analytics into a different context of risk management.
Claudia summed it up this way, paraphrasing Alan Greenspan: we could not have foreseen the financial meltdown because the analytics were done on only 20 years of data, a period of euphoria. Therefore, good analytics would not have prevented it. I guess since we will now have some bad years, we may have some offsetting analysis.
Next, EDJ asked about the use of analytics for decision making. In our experience, businesses still seem to shy away from committing to data-driven decisions; we see gut feel a lot. Will this change? Will cultures become more analytically oriented? There was a strong theme that modern times, with the use of data and the increasing use of socialized technology, will result in a stronger push toward analytics. Ram said, "Business cultures are changing to be more analytics-driven. Granted, there are slow adopters like financial services and healthcare. There are also managers today who do not want to look at data and do not like the change of relying on a spreadsheet. The organizations that are slow to take on more analytics need better context: analytics have to be friendlier, and business users need to start demanding a 360-degree view of the situation being analyzed."
David Wells echoed that while the change toward the analytical is slow, the current economy will show the need for better data. The Obama campaign used lots of numbers, and this will incentivize the next generation of BI executives. This incentive, along with vendors contributing collaborative technology in analytics applications, means a better socialization of analytics in an organization. In fact, there will be a commoditization of core metrics within domains.
Neil Raden added an observation on the current condition of decision making. "Gut feel is real, but some opinions can be changed by data, some by higher-level models and thoughts. A CEO does not make pure data decisions. Many corporate cultures are not ready for or interested in low-latency feedback; there are some that are, but many are not. Through analytics, consistency across large volumes of decisions is possible, and we can know how a decision was made and why it was made. Analytics only take you so far; then you need decisions."
Steven Brobst observed that poor architecture and poor delivery speaks of poor discipline – and disciplined organizations do better analytics. Steve mentioned organizations where poor performance was directly tied to lack of discipline in using data.
There was a difference in views on the role of gut feelings. Where Neil felt they would remain part and parcel of an organization, others felt that gut decisions are overrated. "Gut feelings mean following the leader. In God we trust; all others bring data" is the newer mantra overtaking gut feelings, according to Claudia. Culture changes are required, and on the way, for analytics. I suspect some organizations will go with gut feel, while others will proactively change their data cultures.
Many readers of EDJ have hundreds and thousands of files, spreadsheets, and Access databases. Is there truth to the comment that, with all the downstream data movement tools, users already have all the data; they just don't use it very well or do not know how to use analytics in day-to-day operations? If true, how do you respond to business users who still say, "just give me all the data"?
Some of the responses were short and sweet; others not so much. All were unified on one thought: just giving business users all that they ask for is not the path to healthy analytics. And inducing a change away from that mindset is not easy. Neil mentioned that the amount of data really does not matter; it is the usage and the tools. Business users are smart enough to exploit data if they can get to it. Ram echoed the quantity view philosophically: what defines ALL the data? It is not realistic to present all data. It cannot be defined easily, and even if it could, it could not be managed effectively.
Ten years ago, an IT department might have tried to deliver all the data, according to Steve Brobst. But data governance is addressing the fact that privacy and security restrict what is really accessible. Plus, the single-source-of-the-truth concept precludes gathering and shipping all data; just gather and provide what the business needs.
The quantity issue is no longer really an issue when defining "all the data," according to Claudia. With appliances and the like, we can store enormous quantities at reasonable cost. But users cannot drink from a fire hose, or search haystacks for answers. They have to be educated to ask for the right things. And IT needs to understand the business better to assist in embedding the proper data for analysis.
David Wells summed it up best: there will be a shift from giving all the data to giving the appropriate data. The social changes necessary for training people to do analytics (vs. just running a report) will occur. And IT will also work more effectively at removing the perception of "bad" data.
It is interesting to note, as we wrap up part 1, that culture change was mentioned many times. This has been confirmed by an increasing interest on the topic on the speaking circuit.
But technology changes are afoot as well. Cost-of-ownership issues and new "stuff" are heading our way. Part 2 will recap what our panel said. In addition, we will be asking some new and long-standing vendors to weigh in.
ABOUT THE AUTHOR
John Ladley is an internationally known information management practitioner and a popular speaker on information and knowledge management. John is widely published and has several regular columns. Until recently, John was a Director with Navigant Consulting. Prior to Navigant, John founded KI Solutions, and John was Senior Program Director of Data Warehouse strategies and a Research Fellow at Meta Group. Mr. Ladley is an authority on information architectures, business performance measurement architectures, knowledge management, collaborative applications, and information resource management. John is currently President of IMCue Solutions, a new firm focused on data governance and information management.