News came this week that a man accused of defrauding a financial group out of close to a million dollars through an investment in a fictional mobile medical tablet is scheduled to sign a plea agreement admitting that he committed mail fraud. The man, Howard Leventhal, had been promoting the Star Trek-inspired McCoy Home Health Care Tablet as a device that could instantaneously deliver detailed patient information to medical providers. (The product is discussed on the company’s still-surviving web site here.) He was arrested for the fraud in October and has been out on bail.
The interesting thing about this case is that the fake he was perpetrating isn’t far removed from the role mobile apps and systems actually will play in healthcare. There are, of course, plenty of mobile apps already available that help users do everything from monitoring their hearts to recording their blood-oxygen levels during the night to see whether they have sleep apnea. Research and Markets, for example, says the wireless health market will grow from $23.8 billion today to nearly $60 billion by 2018, with remote patient monitoring applications and diagnostics helping to drive the growth. But where things really get interesting is when mobile health takes on questions of semantic interoperability of accumulated data, and of assessing its meaning.
In fact, at the October 2012 Semantic Technology & Business Conference in New York City, Deborah Estrin, Professor of Computer Science at Cornell NYC Tech, talked about her work as co-founder of Open mHealth, a non-profit startup. She discussed the possibilities for using mobile systems to capture health data about users that can aid in diagnosing and treating conditions, including chronic diseases, and explained how a “light-handed” semantic technology approach can contribute to such efforts by helping to parse out the meaning in the data. Current Open mHealth projects include, for example, an effort with Kaiser Permanente Southern California and four of its partner organizations to implement the Open mHealth DSU (Data Storage Unit) API Specification for unified information sharing across disparate data streams, with the aim of improving the management of Type II Diabetes and Depression.
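To make the idea of unified sharing across disparate data streams concrete, here is a minimal sketch of how measurements from different devices might be wrapped in a common envelope before being stored in a DSU. The header/body split and the field names below follow the general shape of Open mHealth-style data-point schemas, but the specific names and the helper function are illustrative assumptions, not taken from the DSU API Specification itself.

```python
import json
import uuid
from datetime import datetime, timezone

def make_data_point(schema_name, body):
    """Wrap a measurement body in an Open mHealth-style data point.

    NOTE: illustrative sketch only; field names approximate the
    general data-point shape and are not copied from the spec.
    """
    return {
        "header": {
            "id": str(uuid.uuid4()),
            "creation_date_time": datetime.now(timezone.utc).isoformat(),
            "schema_id": {
                "namespace": "omh",
                "name": schema_name,
                "version": "1.0",
            },
        },
        "body": body,
    }

# A blood-glucose reading from one device and a step count from another
# share the same envelope, so a storage unit can persist and query them
# uniformly while the schema_id still identifies each stream's meaning.
glucose = make_data_point("blood-glucose", {
    "blood_glucose": {"value": 110, "unit": "mg/dL"},
})
steps = make_data_point("step-count", {
    "step_count": 4200,
})

print(json.dumps(glucose, indent=2))
```

The point of the common envelope is that downstream consumers need only one parsing path, while the `schema_id` carries enough semantics to interpret each body correctly.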
Or take the Tetherless World Constellation at Rensselaer, which counts Mobile Health in its investigations portfolio, with Deborah L. McGuinness, professor in the computer science and cognitive science departments and director of the Web Science Research Center at the university, as principal investigator. The work there includes leveraging semantic technologies, along with systems such as IBM Watson, to integrate, represent and reason with data from a variety of consumer and medical-grade devices to enable personalized healthcare.
RPI notes that it is the only university to have obtained IBM’s Watson system (see story here) and that it will leverage Watson to connect the integrated health-data view to relevant content. On tap is building a demonstration application whose initial use case will focus on weight management and fitness training, incorporating data from pedometers, heart monitors, and accelerometers. The platform will be designed to be easily extensible to other health sensor data and related focus areas, it says.
The INSIGHT Centre for Data Analytics, as we reported here, also will be investigating connected health and discovery apps, with expertise on the semantic web and Linked Data coming from participating parties such as the Digital Enterprise Research Institute. So, while the McCoy device wasn’t a real McCoy, the future of mobile apps and systems in personalized healthcare is a fact, and one which semantic tech will have a part in.