Many vendors and end users of data-driven products, services, and processes believe predictive analytics represents the pinnacle of forecasting capability.
When augmented with Machine Learning algorithms that establish data-derived precedents and incorporate them into future processes, predictive analytics can enhance analytics operations across vertical industries including retail, health care, and numerous facets of the Industrial Internet.
Nonetheless, there are a few notable limitations associated with predictive analytics, including:
- Data modeling: Frequently exacerbated by the paucity of Data Scientists, predictive analytics largely hinges on data models that are dependent on previous events to calculate the likelihood of future ones. Unexpected events can complicate, if not outright evade, this process, even though Machine Learning algorithms can minimize such instances.
- Data types: Such models are also restricted to certain types and sources of data, whether structured, unstructured, or semi-structured.
- Statistics: Predictive analytics is also circumscribed by a reliance on linear relationships between variables, which fails to account for non-linear relationships and variables that have not been anticipated or accounted for in data models, such as those likely to be found in Big Data sets.
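The statistical limitation above can be illustrated with a minimal sketch in plain Python (not tied to any particular analytics product): an ordinary least-squares line fit to a genuinely quadratic relationship learns essentially nothing about it.

```python
# Illustrative only: a linear model fit to a non-linear relationship
# systematically mis-predicts, which is the statistical limitation
# of model-based predictive analytics described above.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

xs = list(range(-5, 6))
ys = [x * x for x in xs]              # the true relationship is quadratic

a, b = fit_line(xs, ys)               # the line comes out flat: a = 0, b = 10
predictions = [a * x + b for x in xs]
worst_error = max(abs(p - y) for p, y in zip(predictions, ys))
print(a, b, worst_error)              # prints: 0.0 10.0 15.0
```

The fitted slope is zero, so the model predicts the mean everywhere and misses the extremes by a wide margin, even though the underlying pattern is perfectly regular.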
Touting what it refers to as its Natural Intelligence Platform, Saffron has unveiled a Cognitive Computing solution that forgoes predictive analytics for a much more profound way to identify the probability of future events based on real-time pattern recognition of Big Data sets. It involves:
- No Data Modeling
- Unifying Data Types (whether structured or unstructured)
- Machine Learning
It enables most servers to utilize Cognitive Computing capabilities and to sharpen their predictions by reasoning much as people do. During a session at the Enterprise Data World 2014 Conference and Expo, Saffron vice president of Healthcare & Strategic Partnerships Walt Gall observed:
“Whether it’s reading a book, looking at a video or listening to someone speak, you’re actually taking that information in and processing it—both unstructured and structured data—and you don’t have a model that you have to tune to learn…The human brain is amazing at basically having that sort of computational sequence of actually processing information, and that’s the mantra of how Saffron is engineered: to be able to actually make connections in a model free way, understand the context, and be able to learn from previous experiences and understand the patterns.”
Analytics on the Fly
Because they do not incorporate data modeling, the underlying analytics that power Saffron’s Natural Intelligence Platform can draw connections between anomalous events and expected ones alike. Its Machine Learning algorithms learn from the relationships between these events for whatever the platform’s projected use may be, whether fraud detection, cardiac health, or customer behavior. In this way the platform identifies numerous variables (not just those that have been previously identified) and learns what sort of impact they have on the outcome in question.
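One hypothetical way to picture “model-free” learning of relationships is simple co-occurrence counting: associations emerge from the data itself, with no schema declared up front. The event names below are invented for illustration and are not drawn from Saffron’s platform.

```python
# Illustrative sketch of model-free relationship learning: count how
# often events are observed together instead of fitting a predefined
# model. New event names simply appear; nothing is declared in advance.
from collections import Counter
from itertools import combinations

observations = [
    {"high_vibration", "bearing_wear", "helicopter_A"},
    {"high_vibration", "bearing_wear", "helicopter_B"},
    {"normal_vibration", "helicopter_A"},
    {"high_vibration", "bearing_wear", "helicopter_C"},
]

# Associate every pair of events seen together in one observation.
links = Counter()
for obs in observations:
    for pair in combinations(sorted(obs), 2):
        links[pair] += 1

# The strongest association surfaces without being pre-specified.
strongest = links.most_common(1)[0]
print(strongest)  # (('bearing_wear', 'high_vibration'), 3)
```

An anomalous event (a new name in an observation) is handled exactly like an expected one, which is the point: nothing in the counting scheme had to anticipate it.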
Beyond its Machine Learning and cognitive capabilities, the crux of Saffron’s technology is that, by integrating structured and unstructured data along with statistics and Semantic technology, it can identify every data element in a Big Data set and learn how those elements relate to a desired outcome. Its cognitive ability enables the platform to determine the context of those relationships in real time. Gall commented that:
“The applications are myriad, the industries are far-reaching and the use cases that we present today hopefully illustrate that. But the idea here that I want you to take away is that [the platform] can learn continuously. You don’t have to keep tuning the platform. Once you’ve actually applied the connections to retrospect the data, you’re able to actually take in data, and it just learns on the fly like the human brain does.”
Perhaps the most convincing endorsement of Saffron is the fact that for several years the company and its technology were employed to provide intelligence for the United States military. In the private sector, however, one of Saffron’s first clients was the aerospace manufacturer Boeing. Boeing had over 40 different sources of structured and unstructured dynamic data providing information from mechanics, pilots, the environment, and more, with which it attempted to build personalized maintenance schedules based on failure predictions for various helicopter parts. Gall recalled:
“We were put head to head against a blind study where a traditional algorithm approach utilized by NASA had 66 percent accuracy for predicting when a part was actually going to fail with a 16 percent false alarm. When Saffron was tested for this use case we came back with 100 percent recall with one percent false alarm.”
Within the medical industry, Saffron was used to determine how effectively cardiomyopathy could be distinguished from constrictive pericarditis using echocardiography, a procedure that enables medical personnel to create images of the heart using sound waves. In a research study, only 24 percent of medical personnel were able to distinguish the aforementioned conditions utilizing echocardiography.
Using traditional statistical methods, Saffron personnel were able to distinguish the two conditions with only 54 percent accuracy. However, by implementing its platform and analyzing a single heartbeat each second for each of the 15 patients involved in the study (which constituted 90 metrics in six different places in the heart and translated into 90 million Semantic triples), Saffron improved its rate of correctly differentiating the conditions to 90 percent.
Cognitive Computing and Semantics
The cognitive ability of Saffron is facilitated by two related facets of Cognitive Computing that allow the platform to overcome limitations typically associated with this class of technology. The first of these is associative memory, which provides the capability to associate huge quantities of data while still identifying patterns in real time. The second is cognitive distance, which enables the platform to reason by similarity and determine analogous relationships across copious data sets. The latter is partly based on Kolmogorov complexity, a measure defined by the length of the shortest program that can reproduce a complex data pattern, which helps discern data signals from the metaphoric “noise” around them. Cognitive distance and Kolmogorov complexity are particularly effective when used for real-time applications involving Big Data sets.
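Kolmogorov complexity itself is uncomputable, so similarity measures built on it must be approximated in practice. One common stand-in (shown here for illustration, not as a description of Saffron’s implementation) is the normalized compression distance, which substitutes a real compressor for the ideal shortest program:

```python
# Normalized compression distance (NCD): an illustrative, practical
# proxy for Kolmogorov-based similarity, using zlib as the compressor.
import random
import zlib

def ncd(a: bytes, b: bytes) -> float:
    """NCD is near 0 for similar data and near 1 for unrelated data."""
    ca = len(zlib.compress(a))
    cb = len(zlib.compress(b))
    cab = len(zlib.compress(a + b))
    return (cab - min(ca, cb)) / max(ca, cb)

pattern = b"lub-dub " * 200                  # a highly regular signal
similar = b"lub-dub " * 198 + b"lub--dub "   # nearly the same pattern
# Seeded random bytes serve as an effectively incompressible signal.
noise = bytes(random.Random(0).randrange(256) for _ in range(1600))

# The regular pattern is measurably "closer" to its variant than to noise.
print(ncd(pattern, similar) < ncd(pattern, noise))  # True
```

The intuition matches the article’s description: data that share structure compress well together, so their distance is small, while unrelated data do not, and no model of the signal has to be specified beforehand.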
Moreover, cognitive distance is able to resolve ambiguities by determining the context of data, which involves the use of Semantic technology and triples: simple descriptions of data’s characteristics consisting of a subject, a predicate (verb), and an object. Despite the usage of Semantics, those who deploy Saffron are not required to provide ontologies, although they may do so if they like. Ontologies are optional because the platform builds semantic graphs based on the actual data and their connections. Saffron Chief Technology Officer Paul Hoffman remarked:
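A minimal sketch of what such triples might look like, with a graph assembled directly from the data and no ontology supplied up front (the entities here are invented for illustration):

```python
# Semantic triples: each statement is a (subject, predicate, object)
# description of the data. The entities below are hypothetical.
triples = [
    ("part_17", "installed_on", "helicopter_A"),
    ("part_17", "reported_by", "mechanic_3"),
    ("helicopter_A", "flew_in", "sand_storm"),
]

# The graph emerges from the data itself: index each subject's
# outgoing (predicate, object) edges, with no schema declared first.
graph = {}
for s, p, o in triples:
    graph.setdefault(s, []).append((p, o))

print(graph["part_17"])
# [('installed_on', 'helicopter_A'), ('reported_by', 'mechanic_3')]
```

Everything known about a subject is recoverable by following its edges, which is why an explicit ontology is optional: the connections in the data define the graph’s structure.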
“We have married associative memories and cognitive distance, this reasoning by similarity [and] an information theoretic approach. Thus we are able to do what is called semantic search…and then we can do convergence…where things [data elements, their relationships and their cognitive distance] change over time, they come close and then they go away. Things are not standing. So if you use a modeled approach you wouldn’t see that. That is an advantage of a parameter free, non-parametric Machine Learning approach.”
Ultimately, the impact of Saffron extends beyond its usage in the public and private sectors, the many industries to which the platform can apply, and its various use cases, which span from classification (predictive maintenance and customer lifetime value) to unsupervised learning (in which an organization starts out knowing nothing about its data and can produce an evolutionary tree of it). It attests to the importance of Big Data, Machine Learning, and Cognitive Computing technologies, and to how their synthesized value exceeds that of the individual technologies alone. It also alludes to how these technologies may shape the future.