Cognitive Computing Gives Hand Up to Healthcare

by Jennifer Zaino

IBM counts healthcare among the key sectors that it expects will realize business advantages with the help of Watson, its headline-making Cognitive Computing platform.

The vendor has partnered with some of the best-known hospitals and health benefits organizations, such as Memorial Sloan-Kettering, the University of Texas MD Anderson Cancer Center, Cleveland Clinic and WellPoint, on Watson-driven healthcare insight and decision support initiatives. Nearly one-quarter of the finalists in the recent Watson Mobile Developer Challenge also were in the health services category, including one of the winners: GenieMD, which entered an app that leveraged the Cognitive Computing technology to improve individuals’ health management (see The Semantic Web Blog’s coverage here).

At Modernizing Medicine, “Watson offers the opportunity for a wonderful enhancement to complement everything we are doing,” says Daniel Cane, president, CEO and co-founder of the company that provides intelligent electronic medical records technology to dermatologists, ophthalmologists, orthopedic surgeons, otolaryngologists, urologists, and gastroenterologists. Independent of Watson, its Electronic Medical Assistant (EMA) technology lets physicians use iPad tablets to touch and swipe their way to creating exam notes at the point of care, generating accurate billing codes, and searching its built-in and physician-informed diagnosis/treatment plans database for information to support healthcare decisions. Information from treatment to patient outcomes is recorded in structured format, using classifications and workflows specific to the medical specialty.
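Recording treatment and outcome data in a structured, specialty-specific format is what makes later outcome measurement possible. The idea can be caricatured in a minimal sketch; every field name below is a hypothetical illustration, not Modernizing Medicine's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical illustration of a specialty-specific structured exam note.
# None of these names reflect Modernizing Medicine's real data model.
@dataclass
class DermatologyExamNote:
    patient_id: str
    diagnosis_code: str               # e.g. an ICD-10 classification code
    treatment_plan: str
    billing_codes: list = field(default_factory=list)
    outcome: str = "pending"          # filled in later, enabling outcome measurement

note = DermatologyExamNote(
    patient_id="p-001",
    diagnosis_code="L40.0",           # ICD-10 code for psoriasis vulgaris
    treatment_plan="topical corticosteroid",
    billing_codes=["99213"],
)
```

Because every note shares the same classification scheme, outcomes across many physicians' records can later be compared and queried, which is exactly the "measure outcomes" capability Cane describes.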

“It lets them capture all the evidence of the real world and measure outcomes to understand what they do today,” says Cane. But that doesn’t answer the question of what happens in the more narrative world of published clinical research, he explains. That’s where Watson steps in: Cognitive Computing and natural language technology underlie the company’s new schEMA tablet app for digesting, comprehending, and surfacing answers from the massive collection of unstructured data in the ever-growing stack of published medical research.

“It’s nearly impossible for a doctor to keep up with all the various research, but now in seconds they can answer questions based on existing and new knowledge,” says Cane – not to spit out what the treatment should be, but to help them support their treatment protocols with relevant and current insight.

As an example, Cane notes that a dermatologist typically has her own method of caring for cases of psoriasis. But, for a particular patient with certain additional health conditions, she may also want to consult the EMA platform’s database to see how other physicians have handled similar patient situations. And from there, she may want to query schEMA to investigate further the clinical effects of using two different drugs that her peers have found helpful under similar circumstances. With Watson working diligently in the background, she could learn that research has shown one of the drugs to be more effective in this instance – but also that it can increase liver toxicity, so it’s important to conduct regular liver screenings to ensure no harm occurs.

With Watson and schEMA, Cane says, “it’s designed to be a conversation so the doctor can go from there to asking how often to do liver screenings for the patients that get the drug and get the answer.” The value comes in its being the freshest answer possible. Modernizing Medicine is aiming to incorporate and train up Watson on newly published information from premier sources, including peer-reviewed medical journals such as the Journal of the American Medical Association (JAMA) and New England Journal of Medicine, within a day or two of its release.

That way, if the National Psoriasis Foundation in the last month published updated guidelines around these test-timing parameters, the dermatologist in this example will have those at her fingertips ASAP – guidelines that otherwise “could very well have gone unnoticed without access at the point of care,” Cane says. “And with one touch the doctor can cite the evidence Watson found into the medical record.”

Ahead: Savings and Training

Giving physicians the tools to perform and record this kind of due diligence can impact patient outcomes, but Cane also speculates that it could have an effect on doctors’ financial models, as well. Doesn’t it make sense, he proposes, that they receive discounts on medical malpractice insurance premiums if they can show they spend a certain amount of time on cases reviewing the latest evidence, with the goal of reducing patient risk? He concedes that this is all still very new territory for everyone involved, payers and insurance providers included, and these questions are yet to be widely pondered.

In the meantime, the time savings doctors should be able to realize with Cognitive Computing’s help can’t be overlooked – not in an increasingly strained healthcare system that’s placing more documentation demands on practitioners.

“The difference between Cognitive Computing and search is that with search, a human has to find, filter, read, comprehend and decide what to do,” Cane says. “But Watson already has read the documents and come to some level of comprehension to be able to cite an answer that the doctor can utilize to make a decision.”

It’s new territory in other respects, as well, including how the IBM Watson ecosystem at large will be able to leverage new, high-quality, third-party content sources as they are on-boarded. Modernizing Medicine, for example, brought the JAMA relationship to IBM, Cane says, and that conversation expanded from being a simple license agreement to a broader discussion of how these content provider relationships will propagate within the IBM ecosystem. It makes sense, he thinks, that IBM would want to be able to offer other developers access to research from such prestigious resources to serve as “starter content…[so] they can just add their secret sauce to a baked corpus pulled from various medical journals.”

That said, Cane advises partners to be prepared for the work that goes into building a Watson-powered app. It’s not just a matter of pouring content in, asking questions, and immediately getting answers, he says. “It’s not just about access to even the same materials, but how effectively you train your system to use that,” he says. Even Modernizing Medicine, which knew that from the get-go, underestimated the amount of time it would spend building its corpus, creating question/answer pairs, and training the system to return truly relevant answers. “It’s very true that the first bits of information you teach it are the hardest,” he says.
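A rough sense of what a question/answer training corpus involves can be sketched as follows. This is a generic, hypothetical illustration only; the real Watson pipeline uses IBM's own tooling and formats, and the article titles and IDs here are invented:

```python
# Generic sketch of question/answer training pairs like those Cane describes
# assembling -- illustrative only, not IBM's actual training format.
qa_pairs = [
    {
        "question": "Is methotrexate appropriate for moderate plaque psoriasis?",
        "answer": "Several studies report it as a first-line systemic option.",
        "source_doc": "hypothetical-article-001",
        "confidence": 0.85,
    },
    # ...roughly 1,000 such pairs went into the first three-disease app
]

def answer(question, corpus):
    """Naive exact-match lookup standing in for Watson's NLP comprehension step.

    The real system generalizes to new questions with similar structure;
    this toy version only illustrates the pair-plus-evidence shape.
    """
    for pair in corpus:
        if pair["question"].lower() == question.lower():
            return pair["answer"], pair["source_doc"]
    return None, None

ans, src = answer("Is methotrexate appropriate for moderate plaque psoriasis?", qa_pairs)
```

Note that each pair carries its source document, which is what lets the app cite the evidence behind an answer back into the medical record.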

He recommends not starting by trying to boil the ocean. Modernizing Medicine currently focuses on a handful of medical specialties with EMA. For its first Watson-powered schEMA app, it narrowed the field to dermatology – and further to just three diseases of the skin, a handful of sources, and 1,000 question/answer pairs. Dermatology made sense as the place to start experimenting and as the first app to roll out, given that 25 percent of dermatologists in the U.S. use EMA. Going full-bore into all of its healthcare domains – including the four additional ones it is moving into – could run costs up into the millions of dollars. Building the three-disease concept prototype on Watson took about one man-year’s worth of hours in total development time.

“We decided to start by seeing if we could get it to be really intelligent about its answers to melanoma, and we could,” Cane says. So when the company taught it the next diseases (psoriasis and atopic dermatitis), “we were so much better off because of the time and energy we spent training it on one specific classification of questions.”

Getting Smarter

The hard work clearly has paid off. “As Watson gets more familiar with the corpus of information, it gets better and more efficient at answering even new questions that are parallel in structure or syntax to other questions we have taught it,” Cane says.

In other words, it’s living up to its billing. “It is getting smarter. It is learning from the overall corpus, not just the specific things we taught it,” he says. It has even taken the company and the physicians it works with down some surprising paths. In one instance, for example, a doctor who collaborates with Modernizing Medicine asked the schEMA app if it was okay to give an immune-compromised patient the shingles vaccine. The physician’s guess was that the answer would be no, because you can’t give a live attenuated virus to such a patient.

Watson thought differently – albeit at a very low confidence level, because it found only a single article stating, contrary to expectations, that the vaccine can be administered, with no contraindications noted in the published literature. The doctor reviewed the document that Watson used to deliver its answer, and “there was the evidence that said you could give the vaccine, as one thing buried in another clinical trial,” Cane says. “It was the right answer to the right question in the context of a doctor wanting to make a decision.”

Nearly 20 physicians at Modernizing Medicine are continually building schEMA’s subject matter expertise: asking questions to ensure that Watson understands what it is taught and can intelligently answer questions from the text of a document, and giving it feedback when it answers correctly to build its confidence. “It’s like educating a resident,” he says.
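The feedback loop Cane compares to educating a resident can be caricatured in a toy sketch. This is purely illustrative of the positive-reinforcement idea; Watson's actual training mechanism is IBM's own, and the identifiers below are invented:

```python
# Toy positive-feedback loop: a physician confirms or rejects an answer,
# nudging that answer's ranking score up or down. Purely illustrative.
scores = {}  # candidate answer id -> accumulated confidence score

def give_feedback(answer_id, correct, scores):
    """Record physician feedback, rewarding confirmed-correct answers."""
    delta = 1 if correct else -1
    scores[answer_id] = scores.get(answer_id, 0) + delta
    return scores[answer_id]

# Two physicians confirm the same answer; one rejects a different answer.
give_feedback("ans-42", correct=True, scores=scores)
give_feedback("ans-42", correct=True, scores=scores)
give_feedback("ans-7", correct=False, scores=scores)
```

Over many such interactions, confirmed answers rise in the ranking and rejected ones fall, which is the sense in which the system "builds its confidence" from physician input.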

The Next Chapter in Healthcare

The road ahead, Cane thinks, will see Watson adding new types of sources from which it can learn. Right now it is NLP-focused on unstructured text, and it struggles with structured data, he says. “To teach Watson based on structured data as well as images and video are definitely two areas that Cognitive Computing needs to evolve, and they are,” he says, noting that IBM is dedicating significant resources to these areas.

Cane looks forward to the day when the millions of images of patients in its system tagged with metadata about their diseases can be leveraged by Watson to help physicians correlate similarities in shapes, pigments or coloring to outcomes for diseases like melanoma or basal cell carcinoma, for example. “That would be incredible,” he says.

Cognitive Computing, in healthcare and other fields, “is the beginning of an entirely new chapter,” Cane says. “The capabilities are just beginning to be explored and it is one of the most exciting times in the history of computing – we are at the nascent stages of something that will permeate all aspects of technology.”
