Intelligence at the interface — it’s just over the semantic horizon.
“This concept is what I really believe is under-emphasized when people get excited about the bottom-up technology of semantics,” says Tom Gruber, a pioneer in the field of knowledge sharing and collaboration on the web who established the DARPA Knowledge Sharing Library.
Gruber, who also co-founded RealTravel and Intraspect Software, and is founder and chief scientist at organizational effectiveness consultancy Consider Solutions, hates to use the phrase, but he sees a “paradigm shift” underway. The world is moving from hyperlinks, portals and search engines, where the onus is still on the user to figure out how to get the information he wants, to intelligence at the interface.
“The breakthrough is we’ve gone from where the user has to be relatively smart to say the magic words, to where the system is relatively smart now, and the user can more or less sit back. It’s heads-back interaction. You’re just driving, you’re just typing,” he says. “And the system knows enough about your preferences, your needs, where you are, and what you’ve done, to be able to advise you proactively,” leveraging the collective intelligence of the social web.
It’s already beginning to happen with services such as Twine, which helps users organize, share, and discover information without having to meticulously bookmark, categorize, and tag it themselves. What has yet to happen is to bring all the dimensions together in a single service: location and time awareness, your social networks, your trusted sources, and so on.
“Richer data and more rich inferencing produces a kind of emergent service, the quality of which isn’t available today,” he notes.
Gruber, who will address this issue as part of his presentation at the upcoming Semantic Technology Conference (SemTech), being held May 18-22 in San Jose, Calif., sees this computing on behalf of users as the natural result of our online lives being lived in relative transparency. “We’re already giving up our privacy and exposing ourselves to the infrastructure — let’s get the infrastructure to make maximum use of that information and bring the intelligence of the system to that interface,” he says.
Powerful, but potentially problematic? Do people understand what privacy they have given up, and the consequences of that, especially as “gigantic joins” make it possible to put the various pieces of people’s online lives together in ways that those individuals never expected?
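The risk of those “gigantic joins” can be sketched in a few lines. The datasets, field names, and records below are invented for illustration; the point is only that two individually innocuous data sets, joined on a shared identifier, reveal more than either does alone:

```python
# Toy illustration of a "gigantic join": two independently harmless
# datasets, linked on a shared identifier, expose a connection
# the individual never expected. All records here are fictional.

# Hypothetical traces a user left on two unrelated services.
travel_reviews = [
    {"user": "wanderer42", "email": "j.doe@example.com", "city": "Lisbon"},
]
forum_posts = [
    {"email": "j.doe@example.com", "real_name": "Jane Doe", "employer": "Acme"},
]

def join_on_email(left, right):
    """Inner-join two lists of dicts on their shared 'email' field."""
    by_email = {row["email"]: row for row in right}
    return [
        {**row, **by_email[row["email"]]}
        for row in left
        if row["email"] in by_email
    ]

linked = join_on_email(travel_reviews, forum_posts)
# Neither dataset alone ties the pseudonym "wanderer42" in Lisbon
# to Jane Doe at Acme; the join does.
print(linked[0])
```

Scaled up across the many services where people leave traces, this is exactly the recombination Gruber argues users rarely anticipate.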
“Intelligence at the interface says to give people the maximum complete information about the consequences, give them a service so they have transparency, and they can judge whether they’re getting enough personal value [out of something] to participate in the online world, or make an informed judgment that they don’t want to do that,” says Gruber.
Today, you have to be a pretty sophisticated technology user to ensure you are protecting your online identity, but just as semantics contribute to greater transparency, so too can they protect against its consequences. “The key thing is to know the consequences and the only way people can know that is to see the inferences and the only way to know that is through semantic computation,” says Gruber. He thinks there may even be a market opportunity here, for creating services that report back to you about what’s going on with your individual data.
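What “seeing the inferences” might mean in practice can be sketched with a minimal forward-chaining loop over subject-predicate-object triples. The triples and the single rule below are invented for illustration, not drawn from any real service:

```python
# Minimal sketch of semantic computation surfacing an inference:
# repeatedly apply a rule to subject-predicate-object triples until
# no new facts appear, then report only the derived facts.
# Triples and the rule are hypothetical.

triples = {
    ("jane", "posted_from", "lisbon_cafe"),
    ("lisbon_cafe", "located_in", "lisbon"),
}

def infer_locations(facts):
    """Rule: if X posted_from P and P located_in C, infer X was_in C."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        new = {
            (who, "was_in", city)
            for (who, p1, place) in derived if p1 == "posted_from"
            for (place2, p2, city) in derived
            if p2 == "located_in" and place2 == place
        }
        if not new <= derived:
            derived |= new
            changed = True
    # Return only what was inferred, not the original facts:
    # this is the report a transparency service could show the user.
    return derived - set(facts)

print(infer_locations(triples))
```

The derived fact, that jane was_in lisbon, never appears in the raw data; a service reporting such conclusions back to the user is the kind of market opportunity Gruber describes.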
As the world moves to intelligence at the interface, semantic web standards and technologies are going to be an extremely important vehicle for surfacing back-end structured data. “Intelligence at the interface says to hook up that incredibly intelligent backend to an incredibly intelligent front end that uses the data about humans to reason about what they want and what they need,” he says. Yet, the emphasis needs to be on semantic computing rather than on a particular standard for exchanging data.
Says Gruber, “We should use standards where we can, but the deep shift that is happening is about semantics, not the semantic web.”