There’s been lots of talk about the Internet of Things (IoT) in the last few years. Companies have been figuring out how to tie many different kinds of devices to the Internet, going well beyond smartphones and tablets. We often talk about home appliances and other everyday systems becoming part of the Internet of Things: manufacturers have started to build sensors and connectivity into refrigerators, washers and dryers, and other appliances, which could give us entirely new ways to use the World Wide Web.
At the same time, some pioneers are looking into the future, arguing that eventually the Internet of Things is going to be more than just a set of smart devices.
The idea is that as the IoT grows and matures, it’s going to introduce a kind of “fourth dimension” for virtual design and activity.
One explanation comes from a paper called “The Internet of Things: A Case for Interoperable IoT Sensor Data and Meta-data Formats” by Milan Milenkovic. In it, Milenkovic talks about seeking the “conceptual interoperability” that can facilitate a big change in how we use data.
Think about a set of connected systems where the distributed components collect their own data and transmit it to one common repository; that aggregated data can then be used to build new models. Milenkovic’s paper uses the example of a fitness tracking platform that would pull from an individual’s wearable fitness tracker, as well as from smartphone sensors and other public systems, to provide a much more vibrant and detailed virtual model of that person’s physical activity.
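To make that aggregation idea concrete, here is a minimal sketch assuming two hypothetical sources (a wearable tracker and a phone app) and an invented shared record format. It is not Milenkovic’s design, just an illustration of normalizing differently shaped readings into one common repository.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical, simplified illustration: each source reports activity in its
# own shape, and an aggregator normalizes the readings into one shared record
# format before storing them in a common repository (here, just a list).

@dataclass
class ActivityRecord:
    source: str          # e.g. "wrist_tracker", "smartphone"
    timestamp: datetime  # when the reading was taken (UTC)
    metric: str          # e.g. "steps", "distance_m"
    value: float

repository: list[ActivityRecord] = []

def ingest_wearable(reading: dict) -> None:
    """Normalize a reading from a hypothetical wearable tracker."""
    repository.append(ActivityRecord(
        source="wrist_tracker",
        timestamp=datetime.fromtimestamp(reading["ts"], tz=timezone.utc),
        metric="steps",
        value=float(reading["step_count"]),
    ))

def ingest_phone(reading: dict) -> None:
    """Normalize a GPS-derived distance reading from a hypothetical phone app."""
    repository.append(ActivityRecord(
        source="smartphone",
        timestamp=datetime.fromisoformat(reading["time"]),
        metric="distance_m",
        value=float(reading["meters"]),
    ))

# Two differently shaped readings end up in the same repository schema.
ingest_wearable({"ts": 1700000000, "step_count": 4200})
ingest_phone({"time": "2023-11-14T22:15:00+00:00", "meters": 1250.5})

for record in repository:
    print(record)
```

Once the readings land in one schema, new models can be built across sources that never knew about each other.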
“Connected sensors share real-world data virtually,” Milenkovic writes. In other words, if we can get diverse data from all of these sensors that will be embedded all around us, that virtual model of the real world will become crystal clear in amazing and fascinating ways.
However, all of this relies on a set of designs and models for sharing data. That’s what Milenkovic calls “conceptual interoperability,” and it takes up most of the paper’s discussion of how these futuristic virtual models will be achieved.
First, Milenkovic argues, IT evolves too quickly for a truly universal standard to take hold. Instead, he contends, systems will have to adhere to basic design principles. Suggestions include using JSON as a common data format and agreeing on common naming of keys. The main goals, Milenkovic says, are to build semantic models, interoperable formats, taxonomies, and ontologies that will allow for higher-level uses of the Internet of Things.
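Here is a tiny, hypothetical illustration of what common naming of keys can buy: two vendors report the same measurement under different key names and units, and a small mapping layer rewrites each reading into one shared vocabulary. The key names and conversion are invented for the example, not drawn from the paper.

```python
import json

# Invented vendor-specific keys mapped to one shared vocabulary.
COMMON_KEYS = {
    "tempF": "temperature_c",
    "temp_celsius": "temperature_c",
    "device": "sensor_id",
    "sensorId": "sensor_id",
}

def to_common_format(raw: dict) -> dict:
    """Rewrite a vendor reading into the shared key vocabulary."""
    normalized = {}
    for key, value in raw.items():
        common_key = COMMON_KEYS.get(key, key)
        # Convert Fahrenheit to Celsius when the source key tells us the unit.
        if key == "tempF":
            value = round((value - 32) * 5.0 / 9.0, 2)
        normalized[common_key] = value
    return normalized

vendor_a = {"device": "A-17", "tempF": 71.6}
vendor_b = {"sensorId": "B-04", "temp_celsius": 22.0}

print(json.dumps(to_common_format(vendor_a)))  # {"sensor_id": "A-17", "temperature_c": 22.0}
print(json.dumps(to_common_format(vendor_b)))  # {"sensor_id": "B-04", "temperature_c": 22.0}
```

Once both vendors’ readings share the same keys and units, downstream tools can treat them interchangeably.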
“Hypermedia” and Versatile Metadata
Another insightful exploration of these types of possibilities comes from Michael Koster, Principal Research Engineer at SmartThings in Sunnyvale, CA. In an October 30 post on a blog called Data Models for the Internet of Things, Koster explores some of the ideas behind using metadata to evolve the IoT.
Koster works with an IRTF group called the Thing-to-Thing Research Group (T2TRG) in collaboration with the W3C Web of Things Interest Group (WoT-IG). Both groups are looking at something called “hypermedia” and its use in evolving networks.
“Hypermedia is the descriptive metadata about how to exchange state information between applications and resources,” Koster writes. “Using hypermedia, applications can read the metadata and automatically consume resources. This results in a machine-understandable interface.”
Going further, Koster describes Hypermedia As The Engine Of Application State, or HATEOAS, where hypermedia “drives” the interactions between applications and resources, and he identifies the types of hypermedia used on web pages: links and forms. He also talks about using JSON to achieve some of this web functionality.
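As a generic illustration of the HATEOAS idea (not the specific T2TRG or WoT-IG designs), the sketch below shows a JSON resource that carries its own “links” and “forms,” and a client that discovers what it can do next by reading that metadata rather than hard-coding URLs. All resource names and fields here are invented.

```python
from typing import Optional

# A hypothetical device resource whose response describes its own
# navigation (links) and state-changing actions (forms).
thermostat_response = {
    "temperature_c": 21.5,
    "links": [
        {"rel": "self",    "href": "/devices/thermostat-1"},
        {"rel": "history", "href": "/devices/thermostat-1/history"},
    ],
    "forms": [
        {"op": "set-target", "href": "/devices/thermostat-1/target", "method": "PUT"},
    ],
}

def find_link(resource: dict, rel: str) -> Optional[str]:
    """Let the hypermedia metadata, not the client, decide where to go next."""
    for link in resource.get("links", []):
        if link["rel"] == rel:
            return link["href"]
    return None

def find_form(resource: dict, op: str) -> Optional[dict]:
    """Discover an available state-changing action from the response itself."""
    for form in resource.get("forms", []):
        if form["op"] == op:
            return form
    return None

print(find_link(thermostat_response, "history"))    # /devices/thermostat-1/history
print(find_form(thermostat_response, "set-target")) # {'op': 'set-target', ...}
```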
Intelligent Communities
One of the applications with the greatest potential, and one of the areas where IoT evolution is proceeding most quickly, is the emergence of “smart city” models. These models seek to use diverse data, and the “conceptual interoperability” behind it, to improve the management, maintenance, and growth of city systems. The ideas could be applied to anything from water infrastructure to crime management to landscaping to energy efficiency – and they are. A July 2014 piece in ERCIM News looks at goals and applications for smart city IoT in areas like smart buildings, urban transport, and law enforcement. Meanwhile, various newspaper and media reports describe programs like this one in Chicago that are laying the groundwork for unprecedented data collection and use.
Metadata Variety
Along with the idea of gathering data from a wider variety of sources, there’s also the interesting puzzle of handling data in a wide variety of formats.
Some purists, more determined to refine the use of specific types of sensor data, might see format variety as a secondary question, but for others it’s a big part of building that new picture window into our digital lives. For instance, think about a real-time data system tracking a physical person. Sure, you can get a vibrant narrative of that person’s movements, activities, and behaviors just from a set of geospatial sensors, if they are ubiquitous enough, but what if you add data from video? All of a sudden, everything becomes a lot clearer.
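Here is one simplified, hypothetical way that fusion might look: GPS fixes and video-derived activity labels arrive as separate streams, and a nearest-timestamp join attaches a label to each fix. The data and the join rule are invented purely for illustration.

```python
from bisect import bisect_left

# Two hypothetical streams in different shapes: geospatial fixes and
# activity labels produced by some video-analysis step.
gps_fixes = [  # (unix_ts, lat, lon)
    (1000, 41.8781, -87.6298),
    (1060, 41.8790, -87.6305),
    (1120, 41.8802, -87.6310),
]

video_labels = [  # (unix_ts, label)
    (995,  "walking"),
    (1055, "running"),
    (1125, "standing"),
]

video_times = [ts for ts, _ in video_labels]

def nearest_label(ts: int) -> str:
    """Return the video label whose timestamp is closest to the GPS fix."""
    i = bisect_left(video_times, ts)
    candidates = [c for c in (i - 1, i) if 0 <= c < len(video_labels)]
    best = min(candidates, key=lambda c: abs(video_times[c] - ts))
    return video_labels[best][1]

# Each GPS fix now carries a richer, multi-format picture of the moment.
for ts, lat, lon in gps_fixes:
    print(ts, lat, lon, nearest_label(ts))
```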
Variety means more work, though.
“Handling diverse and messy data requires a lot of cleanup and preparation,” writes Edd Dumbill in a 2013 Forbes piece that talks about the challenge of getting enterprise data from a scattered field into one giant mechanism for analysis. Dumbill points out that the average business often has trouble with the aggregation and the analysis, partly because it’s so difficult to really define data assets for analytical purposes. Still, the more businesses dig down into their data stores and “tag” for conceptual interoperability, the more detailed their digital masterpieces become.
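One hedged sketch of what that “tagging” might look like in practice: each data asset gets a catalog entry with descriptive tags, so an analyst can query by concept rather than by file or table name. The catalog structure and tags here are invented, not Dumbill’s.

```python
# A hypothetical metadata catalog that describes where data lives and
# what concepts it covers.
catalog = [
    {"asset": "crm_exports/contacts_2023.csv", "owner": "sales",
     "tags": ["customer", "contact", "pii"]},
    {"asset": "warehouse.orders", "owner": "ops",
     "tags": ["customer", "transaction"]},
    {"asset": "web_logs/clickstream", "owner": "marketing",
     "tags": ["behavior", "customer"]},
]

def assets_tagged(*wanted: str) -> list[str]:
    """Return assets whose tags include every requested concept."""
    return [entry["asset"] for entry in catalog
            if set(wanted).issubset(entry["tags"])]

# Which assets describe customers and their transactions?
print(assets_tagged("customer", "transaction"))  # ['warehouse.orders']
```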
Privacy Barriers
As you might imagine, conceptual interoperability and related advances don’t mean that data will flow anywhere it can be used, especially not all at once.
“Data interoperability does not imply free sharing,” Milenkovic writes in his paper, and it’s likely that various privacy issues will continue to confound the most advanced projects well after the technical means are in hand. In response, data scientists are hard at work designing elaborate systems in which servers capture and store permissions, operate at layers below the actual aggregation, and try to triage and parse data delivery according to user-generated rules. Here is one example from GitHub showing a complex system for service provisioning.
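Real provisioning systems are far more elaborate, but the basic idea of rule-based delivery can be sketched simply. In this hypothetical example, each user’s sharing rules name which metrics each consumer may receive, and the delivery layer filters records before anything reaches an aggregator; all names and rules are invented.

```python
# Hypothetical user-generated sharing rules: user -> consumer -> allowed metrics.
user_rules = {
    "alice": {
        "city_dashboard": {"steps", "distance_m"},  # ok to share activity totals
        "insurer_pilot": set(),                     # share nothing with this consumer
    },
}

def deliverable(user: str, consumer: str, records: list[dict]) -> list[dict]:
    """Return only the records this user's rules allow this consumer to see."""
    allowed_metrics = user_rules.get(user, {}).get(consumer, set())
    return [r for r in records if r["metric"] in allowed_metrics]

records = [
    {"metric": "steps", "value": 4200},
    {"metric": "heart_rate_bpm", "value": 62},
]

print(deliverable("alice", "city_dashboard", records))  # only the steps record
print(deliverable("alice", "insurer_pilot", records))   # []
```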
What all of this has in common is an obsession with metadata, and an admission that contextual data is the lifeblood of any truly profound system that uses the Internet of Things (or any other kind of system) to gather and interpret information. As some of these researchers so clearly point out, sensor data without its metadata is often just a meaningless hash of numbers. It’s the metadata and descriptive tagging that must be designed, maintained, and deployed to allow modern systems to build that fourth dimension in a virtual space. Making metadata “semantic” is a core process, much like managing the lexicon of a human language or cataloguing the world’s species: an effort to impose order on chaos.
JSON and its linked-data extension, JSON-LD, are helping engineers pursue this kind of structural evolution on the Internet. When other kinds of device-specific systems catch up, that conceptual interoperability should start to evolve in a big way. Until then, we’re just on the cusp of an age in which our technologies can tell us much more about ourselves and our environment than our five senses can. Will the change be swift? And what will it mean for human societies? Efforts to improve the use of sensor metadata are one way that today’s IT pioneers are putting out their feelers toward what the rest of the 21st century has in store.