
Hey Google: It’s Not Your Sentiment Analytics!

By Jennifer Zaino  /  December 22, 2010

This week General Sentiment put out a press release saying that an official Google blog post implied that the search engine giant itself owned the start-up’s sentiment analysis system – which it doesn’t. No hard feelings, though – General Sentiment would be happy to pursue a deeper relationship.

To give you some grounding in why General Sentiment sent the release out, you may want to read the Google blog entry here, which refers to a Google document here on the topic of large-scale sentiment analysis for news and blogs. That document, as it happens, was authored by a team including Steven Skiena, professor of computer science at Stony Brook University, who, over the course of about five years and along with dozens of grad students – two of whom now work at Google – developed the Lydia/TextMap system for large-scale news analysis. That system was licensed by General Sentiment, where Skiena is now chief science officer, last year.

Following all this? Here’s what General Sentiment CEO Gregory Artzt thinks happened: “Whoever wrote the blog on the Google website must have done [an] internal search for anyone there working on this [sentiment analysis] and came up with the two guys who [now work at Google] who worked on this big TextMap project. But the writer didn’t figure out whether Google owned or had any IP developed by these students related to sentiment analysis, which they don’t since we exclusively licensed this back in 2008,” he says.

General Sentiment reached out to Google about what Artzt calls the false implication – no lawsuits are brewing, but the CEO would like to point out that if Google wants to “make a statement like that, you need to buy us.” After all, he says, it’s not like Google can come up with the kind of NLP-based, detailed text and sentiment analytics technology General Sentiment has on its own overnight.

“You can’t throw lots of money at this and do it really fast,” Artzt says. The work at Stony Brook started in 2003, and on top of the five years of R&D there it took another year and a half for General Sentiment to fully commercialize the technology – focusing it on the metrics and sentiment analytics its target audience of marketing, communications, and brand professionals requires – to scale it for the web, and to launch it on the cloud. Currently it tracks over 1 billion entities of 130 different kinds across more than 50 million content sources, including social media, and from the day companies sign up they can evaluate their brand going back to 2004 (before the social media explosion, but inclusive of blogs and news accounts).

“It would take them years regardless of how much money they decided to throw at it,” he says.  “You can use our product just for brand monitoring but you really get value out of the text analysis, which is our business. So our ability to provide deeper, more granular accuracy about an entity, which could be a brand or product or topic, is much deeper and more sophisticated because we understand the English language.”

Google’s Still a Friend!

No Google acquisition plans are in the works, but what is ahead in the new year is an update to General Sentiment’s Heat Maps feature – which it describes as a unique capability to pinpoint the geographic location of online discussions – to work on Google Maps. “Hopefully they won’t cut us off from their API,” Artzt jokes. “This will be a lot cooler than the current e-map because it’s more interactive so you can really drill down.”

Also in Artzt’s sights, though not formally on the roadmap yet, is integrating broadcast TV and radio transcripts into the system. That ties into its Media Value feature, which reports unpaid media exposure on the basis of sentiment, perception, and impact value. A partnership with unnamed vendors could give General Sentiment the capability to provide such reports not just for online expressions in social media or blogs, but also for video sources, by extracting the content of transcripts at a deep text level.

Such potential features will come on the heels of a recent update that Artzt said increased the technology’s article-serving capacity, added many new sources, and made drill-down capabilities more interactive. “Understanding the context of signals is easier and faster now,” he says.

Where might General Sentiment go next? The financial markets present an enticing opportunity, for one thing, though Artzt says the company won’t move out of stealth there until it’s sure it has a really compelling proposition. It recently hired someone who did his thesis on creating a market-neutral trading portfolio using sentiment drawn from news media sources as signals for trades. “The hypothesis is less about day trading and more about allowing the opinion we find to be predictive of something that’s changing [a] couple of months out,” says Artzt.

About the author

Jennifer Zaino is a New York-based freelance writer specializing in business and technology journalism. She has been an executive editor at leading technology publications, including InformationWeek, where she spearheaded an award-winning news section, and Network Computing, where she helped develop online content strategies including review exclusives and analyst reports. Her freelance credentials include being a regular contributor of original content to The Semantic Web Blog; acting as a contributing writer to RFID Journal; and serving as executive editor at the Smart Architect Smart Enterprise Exchange group. Her work also has appeared in publications and on web sites including EdTech (K-12 and Higher Ed), Ingram Micro Channel Advisor, The CMO Site, and Federal Computer Week.
