You’ve heard of Cognitive Computing. And you’ve heard of the Cloud. Now, it’s time to get familiar with Cognitive Cloud platforms.
What is a Cognitive Cloud platform? Think of it as an extension of the ecosystem that will make it possible to create Cognitive Computing applications and bring them to the masses. Startup Cognitive Scale is providing an incarnation of the concept via its portable, open-standards-based Cognitive Cloud platform, which aims to accelerate value from a company’s existing investments in Big Data. It takes a technology-independent approach: Cognitive applications built on its platform can run on any public Cloud infrastructure, whether Amazon Web Services, Google Cloud, or IBM Bluemix, and are also available in an on-premise version based on OpenStack open source software.
With Cognitive Scale’s solutions for sourcing, analyzing, and interpreting data of all sorts, along with context signals, enterprises get help building a new class of cognitive applications. These applications extract patterns and insights from all data sources, including dark data, while enforcing security and sovereignty policies. They understand natural language and generate personalized insights that learn with every user interaction, so that they grow smarter each time they are used, the company explains.
“Cognitive Computing is the next big evolution in the computing industry,” says Matt Sanchez, founder, Chief Technology Officer, and Vice President of Products at Cognitive Scale. That echoes IBM’s commentary at its launch of IBM Watson, cognitive technology that Sanchez knows well: He was IBM Watson Labs leader, in charge of the team that built the first commercially available Watson apps for financial services and healthcare. Sanchez says that while customers are interested in the potential of Cognitive Computing, they generally struggle with issues related to acquiring, integrating, using, and managing the data – in multiple forms, structured and unstructured, from internal silos and various external sources, some of it even potentially subject to location restrictions – that is critical to powering transformative cognitive applications.
Those struggles are not surprising. This is still new ground for most companies, and Cognitive Computing can take enterprises much further than they have gone with traditional business intelligence and analytics solutions. Such systems, for example, are largely unable to deal with dark data: social media postings, electronic medical record notes, and other unstructured data – whether text, images, or even electronic fitness device readings – that comprises so much of the information generated today.
Cognitive Computing, in contrast, can take on the challenge of pulling non-obvious insights out of massive amounts of multi-structured data. Unstructured dark data not only can be included for analysis but also combined with existing stores of structured information, such as customer records. Patterns and relationships can be found across all the data, and contextual associations made that help businesses take advantage of the knowledge implicit in the information, as Dataversity described in this article. Cognitive Computing solutions can provide users with answers backed by the evidence they gather from all this data, and they learn in real time and over time to continue to improve their reasoning and thus their answers.
All in the Platform
Cognitive Scale’s platform addresses the challenges companies face with features like its Data Fabric and Analytics Fabric. The former includes patents around maintaining the traceability and lifecycle of content – that is, the Big Data, both internal and third-party, dark and otherwise, that will be used to inform the highly contextual insights it delivers in natural language.
“We saw the need for this all to be a platform capability,” says Sanchez. In addition to providing these services for the curated, high-value data that Cognitive Scale includes in its platform’s industry and general purpose content catalogue, users also get traceability of that data so that at month’s end, they can see how much they have spent on content, what projects it was used in, and so on. The company also can help support users who want cleaner insight into content they themselves have directly licensed from external providers.
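The traceability idea can be sketched in a few lines of code. This is purely an illustrative toy, not Cognitive Scale's actual interface: the class, method, and content names below are hypothetical, standing in for a catalogue that logs which project consumed which licensed content so spend can be reported at month's end.

```python
# Hypothetical sketch of content-usage traceability; names are illustrative.
from collections import defaultdict

class ContentCatalogue:
    def __init__(self, prices):
        self.prices = prices              # content_id -> cost per use
        self.usage = defaultdict(list)    # content_id -> projects that used it

    def consume(self, content_id, project):
        """Record that a project pulled a piece of licensed content."""
        self.usage[content_id].append(project)

    def monthly_report(self):
        """Total spend per content item, plus the projects it was used in."""
        return {
            cid: {"spend": len(projects) * self.prices[cid],
                  "projects": sorted(set(projects))}
            for cid, projects in self.usage.items()
        }

catalogue = ContentCatalogue({"claims-data": 5.0})
catalogue.consume("claims-data", "underwriting-app")
catalogue.consume("claims-data", "fraud-app")
report = catalogue.monthly_report()
print(report["claims-data"]["spend"])  # 10.0
```

The point of the sketch is simply that every consumption event is logged with its project, so the month-end questions Sanchez describes – how much was spent on content, and where it was used – become lookups rather than forensics.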
Its Analytics Fabric takes on the challenge of getting value out of what should now be better-managed and better-secured data, combining and orchestrating different analytics engines to create composite or cognitive insights across first-party and third-party data. For example, Sanchez explains, IBM Watson is very good at finding answers to natural language queries in unstructured text, but for some questions, that capability needs to be combined with structured data analytics. Or, as he puts it, “sometimes you have to combine what Watson can do with what WolframAlpha can do.” The Analytics Fabric composite analytics engine can compose analytics from Watson, Google Analytics, WolframAlpha, or even custom models in Python to create insights that can be delivered to mobile devices, BI tools, packaged apps, or wherever they need to go, Sanchez says.
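The composition idea Sanchez describes can be sketched as follows. This is a minimal toy, assuming a common engine interface; the engine classes are stand-ins (one for an unstructured-text engine like Watson, one for a structured-data engine like WolframAlpha), not real API bindings.

```python
# Illustrative sketch of composite analytics orchestration; all names are
# hypothetical stand-ins, not Cognitive Scale's or IBM's actual APIs.

class Engine:
    """Minimal interface an analytics engine might expose."""
    def analyze(self, query, data):
        raise NotImplementedError

class TextQAEngine(Engine):
    # Stand-in for an unstructured-text engine such as Watson.
    def analyze(self, query, data):
        return {"answer": f"text evidence for '{query}'", "confidence": 0.72}

class StructuredEngine(Engine):
    # Stand-in for a structured-data engine such as WolframAlpha.
    def analyze(self, query, data):
        return {"answer": sum(data), "confidence": 0.95}

def composite_insight(query, engines, data):
    """Run every engine and merge the partial results into one insight."""
    results = [e.analyze(query, data) for e in engines]
    return {
        "query": query,
        "parts": results,
        # Naive combination rule for the sketch: average the confidences.
        "confidence": sum(r["confidence"] for r in results) / len(results),
    }

insight = composite_insight("quarterly churn drivers",
                            [TextQAEngine(), StructuredEngine()],
                            data=[10, 20, 30])
print(insight["confidence"])
```

The orchestration layer, not any single engine, owns the combination rule, which is why the composed insight can then be routed to mobile devices, BI tools, or packaged apps uniformly.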
The company also has patented the Cognitive Graph, which Sanchez says fulfills the need for a next-generation object database that has to be able to represent entities, relationships, and attributes in a probabilistic way, not just a deterministic way, so that users can do inferencing and generate hypotheses.
“When you think of Cognitive Computing and how it differs from other analytics, what sets it apart is that it operates like the human brain: It is context driven, generates multiple competing hypotheses, and learns and improves automatically over time,” he says. “You have to understand how to look at the context of whatever the situation is and tune your algorithms to do knowledge discovery and pattern generation for that context. The system must be able to reason over new data and learn from feedback and incrementally improve itself over time…. When you think of data representation to do those things, a new kind of data model is needed, and that is the Cognitive Graph.”
The Cognitive Graph brings to the table the ability to normalize many different data types and to learn from data over time: when something changes somewhere in the graph that may affect something elsewhere in it, that change is recognized at every point it touches. “People tend to interpret databases like they are the truth, but you can’t assume that about all information sources,” he says. “You need a system that captures and tracks its relative confidence to believe in the precision of the source data.”
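The probabilistic representation Sanchez describes can be illustrated with a toy graph where every relationship carries a confidence score rather than being treated as ground truth. The class, relation names, and the multiply-confidences inference rule below are all simplifying assumptions for the sketch, not Cognitive Scale's patented design.

```python
# Toy "cognitive graph": relationships are probabilistic, not deterministic.
# All names and the inference rule are illustrative assumptions.
from collections import defaultdict

class CognitiveGraph:
    def __init__(self):
        # edges[src] -> list of (dst, relation, confidence)
        self.edges = defaultdict(list)

    def relate(self, src, relation, dst, confidence):
        """Assert a relationship with a confidence, not as a hard fact."""
        self.edges[src].append((dst, relation, confidence))

    def infer(self, src, min_confidence=0.5):
        """Walk outward from src, multiplying confidences along each path,
        and keep only the hypotheses above the threshold."""
        hypotheses = []
        stack = [(src, 1.0, [])]
        while stack:
            node, conf, path = stack.pop()
            for dst, rel, c in self.edges[node]:
                combined = conf * c
                if combined >= min_confidence and dst not in path:
                    hypotheses.append((dst, rel, combined))
                    stack.append((dst, combined, path + [node]))
        return hypotheses

g = CognitiveGraph()
g.relate("patient", "has_symptom", "fatigue", 0.9)
g.relate("fatigue", "suggests", "anemia", 0.7)
# The indirect hypothesis inherits 0.9 * 0.7 = 0.63 confidence.
for entity, relation, conf in g.infer("patient"):
    print(entity, relation, round(conf, 2))
```

Because confidence is stored on every edge, new evidence can raise or lower one score and every downstream hypothesis recomputed from it shifts accordingly, which is the "change recognized at every point it touches" behavior described above.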
Getting on with the Platform
Cognitive Scale is focusing on high-ROI vertical apps, especially in healthcare, travel, retail, and finance, where use cases include bringing in a lot of data that may live outside internal systems. (See the related story at The Semantic Web Blog to learn more about some of the applications, including the vendor’s relationship with WayBlazer in the travel sector.) It has been developing expertise in these areas and accumulating content related to these vertical sectors to deliver to customers as part of its offerings, to help them get up to speed very fast.
The commercial apps it has delivered so far, available on a SaaS model, largely function as an entry point to its Cognitive Garage, which brings users the technology, data from the company’s content catalogue, and the methods and talent to quickly create a Cognitive Cloud of their own. Customers deliver their own cognitive app in two five-hour prototype sessions and customize it with their data within ten days. These quick starts will be offered at a fixed price. “In ten days we can light up a real prototype app on a real Cognitive Cloud, so we can engage very quickly with customers and show them the value of Cognitive Computing and our industry curated content,” says Sanchez. This, he says, solves the dilemma that can bedevil Cognitive Computing adoption: Most customers want to see value quickly, but it takes a long time to get all the data and train the algorithms.
“We had to make that faster, so we developed tools and methods to make that faster with our platform and industry content,” he says. Customers that license the cloud platform for apps they create will pay a monthly utility fee.
Moving forward, Sanchez says the company is looking at a few other Cloud infrastructure platform providers, widening the options users have for their Cognitive Cloud platforms. He’d also like to see a spirit of openness expand further in the industry: building and embracing more standards for cognitive workload management, creating more common services or service interfaces for Cognitive Cloud platforms – such as a standard way of consuming baseline natural language processing services – and establishing standards around content.
“That’s about how to manage a content catalogue or corpus of data for a cognitive system, almost like UDDI for data,” he says. “We’re thinking about it and starting to work with some bigger partners on this.”