Alme for Healthcare brings the personal virtual assistant to the disease management space. Next IT’s Alme natural language platform has a history in other sectors: financial services, where it services loan insurance for SWBC; transportation, where it enables self-service for Alaska Airlines customers; and health insurance, where it supports new user registrations for Aetna, among others. In fact, Aetna is one of the customers Next IT is working with on the new healthcare virtual assistant, which is designed to improve patient outcomes and quality of care.
The personal assistant works in a multi-modal model, supporting both spoken natural language and typed input; it runs across multiple platforms, including smartphones and tablets, and offers multi-lingual capabilities. The conversational assistant is built on what Next IT describes as a comprehensive patient ontology, support for goal-based conversations (such as helping patients stick with treatment plans), and interactive concept illustrations (showing where to do at-home injections, for example).
“We’re working on disease management,” says Fred Brown, founder and CEO of Next IT. “People see their doctor every six months and then sort of forget what they’re supposed to do. So we want to provide real-time help and assistance for them, and an escalation path to a live medical professional when needed.”
The platform aims to help address any chronic disease, but it’s still early on, and Next IT is working with a variety of specialty pharmaceutical firms, payers, providers and others across the health care continuum to evolve its human emulation process, in which the personal assistant asks enough questions to build a good record of the patient’s issues and gather enough data to personalize the interaction.
Next IT will spend the next year or so proving the innovation in various areas. It expects provider networks to enable the technology over time, and doctors to need it as they begin to receive payment based on patient outcomes. “It’s still a bit early in how that’s all going to work, but one thing I know is that the patient will have to manage more of his own health care, and they will need help through that hassle,” Brown says.
As patients interact with the virtual personal assistant, their conversations are captured, and that information is delivered back to the doctor or other client in whatever form they prefer. Information “can be bubbled up and used to inform the doctor not necessarily on a daily basis unless there’s a reason to escalate, but to help with things like the patient not being able to remember something when they go to the doctor’s office,” says Victor Morrison, VP of Healthcare at Next IT. “The doctor spends a lot of time trying to gather information and it’s a huge efficiency to pop that onto the doctor’s screen in the office.”
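The capture-and-escalate flow Morrison describes could be sketched roughly as follows. The class names, the flag field and the escalation rule here are illustrative assumptions, not details of Next IT’s actual system:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Interaction:
    day: date
    question: str
    answer: str
    flagged: bool = False  # e.g. a worrying symptom report

@dataclass
class PatientLog:
    interactions: list = field(default_factory=list)

    def record(self, interaction):
        self.interactions.append(interaction)

    def needs_escalation(self):
        # Escalate to a live professional only when something is flagged,
        # rather than pushing every conversation to the doctor daily.
        return any(i.flagged for i in self.interactions)

    def visit_summary(self):
        # Condensed history to "pop onto the doctor's screen" at a visit.
        return [f"{i.day}: {i.question} -> {i.answer}" for i in self.interactions]

log = PatientLog()
log.record(Interaction(date(2014, 5, 1), "Rate today's shot (1-5)", "4"))
log.record(Interaction(date(2014, 5, 2), "Any vision problems?", "Yes", flagged=True))
print(log.needs_escalation())  # True
print(log.visit_summary()[0])
```

The point of the pattern is the split Morrison draws: routine answers accumulate quietly for the next office visit, while flagged ones trigger the escalation path to a live professional.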
One example of an interaction: the virtual personal assistant helping an MS patient through his daily injection routine. Injections can be painful, and it helps to have the artificial intelligence coach send a reminder and then walk him through the process, including asking where the most recent injection sites were and suggesting another site option. At the same time, Alme can ask about the patient’s status: how he would rate that day’s shot, whether he still feels poorly even if the shot went okay, whether there are any issues with his eyesight or his ability to walk a certain distance, and so on. The patient can respond in natural language or tap an answer to move the conversation along.
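The injection-coaching example can be sketched as a simple site-rotation rule plus a daily check-in script. The site list and the questions here are hypothetical stand-ins, not Alme’s actual protocol:

```python
# Common subcutaneous injection sites, in a fixed rotation order (illustrative).
ROTATION = ["left thigh", "right thigh", "abdomen", "left arm", "right arm"]

def next_injection_site(recent_sites):
    # Suggest a site the patient has not used recently, so the
    # same spot is not reused while it is still sore.
    for site in ROTATION:
        if site not in recent_sites:
            return site
    return ROTATION[0]  # all sites used recently; restart the cycle

def daily_checkin(recent_sites):
    # Reminder plus the kind of status questions described above.
    site = next_injection_site(recent_sites)
    return [
        f"Reminder: time for today's injection. A good site option is your {site}.",
        "How would you rate today's shot, 1 (easy) to 5 (very painful)?",
        "Any trouble with your eyesight or walking your usual distance?",
    ]

prompts = daily_checkin(["abdomen", "left thigh"])
print(prompts[0])  # suggests the right thigh
```

Each prompt would accept either a typed or spoken natural-language reply or a tapped answer, matching the multi-modal interaction the article describes.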
“Because we are in the health care domain, or even more specifically around a drug or disease, we can get a lot of context and access to back-end source information, whether a pharmaceuticals company or payer, to answer questions in the context of you,” says Morrison.
“The idea is building an empathetic, caring human-like conversation doing what call center staff does, but being infinitely scalable,” says Brown. And the idea is to make the personal assistant a friend not just for medical issues but for everyday activities – to help patients the way a Siri might, with shopping or restaurant picks or other activities like playing music or sending a text message (especially helpful if the patient has a disability that makes it hard for him to do that himself). “If every time you interact with Alme it’s because you are dealing with taking a shot, you are not going to like it as much as if it can help you do something fun,” says Brown.
That builds stickiness, which the companies hope will drive greater patient cooperation on the medical tasks side. “We are banking on the user using us more and more because there is a relationship,” says Morrison. The Next IT team says research shows that avatars can be successful substitutes for the gold standard of face-to-face counseling for behavior change. “Doctors say if they get 10 percent of the population to change their behavior, it’s a massive home run. I think I can do more than 10 percent, but they are looking for little incremental gains to drive better outcomes,” Brown says.
Human-assisted machine learning is a key part of Alme’s intelligence, and Next IT is working on training it as one would train a human doing the job. “The training cycle we take it through is highly regulated and supervised by humans,” Brown says. “We store all that interaction and make sure our human understanding always gets smarter.”
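A human-supervised training loop of the kind Brown describes is often implemented as a review queue: utterances the system cannot interpret confidently are stored for a human trainer to label before they feed back into the system’s understanding. This is a generic sketch of that pattern, not Next IT’s implementation; the threshold, lookup table and function names are all assumptions:

```python
REVIEW_THRESHOLD = 0.8  # assumed cutoff; below this, a human must review

# Toy knowledge base mapping utterances to intents.
knowledge = {"i missed my shot yesterday": "missed_dose"}

review_queue = []  # interactions stored for human trainers, per Brown's quote

def interpret(utterance):
    # Toy "model": exact-match lookup with a confidence score.
    intent = knowledge.get(utterance.lower())
    return (intent, 1.0) if intent else (None, 0.0)

def handle(utterance):
    intent, confidence = interpret(utterance)
    if confidence < REVIEW_THRESHOLD:
        review_queue.append(utterance)  # queue for supervised review
    return intent

def human_label(utterance, intent):
    # The supervised step: a trainer's label becomes new understanding.
    knowledge[utterance.lower()] = intent

handle("My eyes are blurry today")         # unknown, so it is queued
human_label("My eyes are blurry today", "vision_symptom")
print(handle("My eyes are blurry today"))  # now understood
```

The loop mirrors the quote: every interaction is stored, humans supervise the labeling, and the system’s understanding “always gets smarter” as labeled cases accumulate.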