
Semantic Tech Takes On Grants Funding, Portfolio Management

By Jennifer Zaino  /  June 16, 2014

Whether the discussion is about public grants funding or government agencies’ portfolio management at large, semantic technology can help optimize departments’ missions and outcomes. Octo Consulting, whose engagement with the National Institutes of Health The Semantic Web Blog discussed here, frames the issue as one of integrating and aggregating data across multiple pipes, vocabularies and standards. The goal is to let grant-makers or agency portfolio managers get the right answers when they search: Are grants being allocated to the right opportunities and executed properly? Are contracts hired out to the right vendors? Are licenses being duplicated?

Those funding public grants, for instance, should keep an eye on what projects private monies are going to, as well – a job that may involve incorporating data in other formats from other public datasets, social media and other sources in addition to their own information, in order to optimize decisions. “The nature of the public grant market is effectively understanding what the private grant market is doing and not doing the same thing,” says Octo executive VP Jay Shah.

At the same time, understanding where the private market isn’t investing may point the way to opportunities to use public funding, such as better serving citizens’ health with research towards the causes of a disease vs. a private funder’s primary interest in creating a new drug to treat that disease – with the prize being profits.

“The semantic web helps to facilitate the development and definition of a common vocabulary that we can use to correlate different contexts for the domain, and also gives the ability for us to integrate data sources seamlessly so the user can ask the questions they want to ask,” says Octo CTO Ashok Nare. “We can leave data where it resides vs. building a huge data warehouse and waiting for a month for metrics. Semantic technologies will let us leave data where it resides, but link it together and be able to answer questions much quicker than traditional technologies.”
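The idea Nare describes, querying across sources in place via a shared vocabulary rather than copying everything into a warehouse, can be sketched in a few lines. This is a minimal illustration, not Octo’s implementation: the two sources, their record identifiers and the predicate names are all invented, and each source is modeled as a simple list of subject-predicate-object triples.

```python
# Two hypothetical data sources that stay "where they reside" but share
# a common vocabulary (the predicate "fundsTopic"), so one query can
# span both. All identifiers are invented for illustration.

GRANTS_DB = [  # source 1: a public grants system
    ("grant:G1", "fundsTopic", "topic:Diabetes"),
    ("grant:G1", "awardedTo", "org:LabA"),
]

PRIVATE_FEED = [  # source 2: a feed of private-sector awards
    ("award:P9", "fundsTopic", "topic:Diabetes"),
    ("award:P9", "fundedBy", "org:PharmaCo"),
]

def query(sources, predicate, obj):
    """Federated lookup: scan each source in place for matching triples,
    without merging the sources into a single store."""
    for source in sources:
        for s, p, o in source:
            if p == predicate and o == obj:
                yield s

# Who -- public or private -- is funding diabetes work?
funders = list(query([GRANTS_DB, PRIVATE_FEED], "fundsTopic", "topic:Diabetes"))
print(funders)  # ['grant:G1', 'award:P9']
```

In a real deployment the shared predicate would come from an agreed ontology and the sources would be queried through a federated SPARQL endpoint rather than Python lists, but the shape of the answer, one query spanning data that was never centralized, is the same.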

There’s also the opportunity to do semantic analysis on grant proposals, says Dan Montgomery, senior consultant at Octo, “to discover if proposed research is 80 or 90 percent similar to research done two to three years ago,” so that it isn’t needlessly repeated. And there’s another opportunity to correlate data from multiple sources to discover whether there is an increased chance that a piece of proposed clinical research, for example, is likely to fail, perhaps under specific conditions. “We can shift the conversation around to do we have the data available [and merged together] to predict failure, or that under certain conditions failure rates go up,” says Shah. “So grants management becomes risk management.”
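Montgomery’s "80 or 90 percent similar" check can be illustrated with a toy similarity measure. The sketch below uses plain cosine similarity over word counts, which is a deliberate simplification of the semantic analysis the article describes; the proposal texts and the 0.8 threshold are invented examples.

```python
# Toy near-duplicate check for grant proposals: cosine similarity over
# word counts stands in for richer semantic analysis. Texts and the
# 80% threshold are illustrative only.
from collections import Counter
import math

def cosine(a: str, b: str) -> float:
    """Cosine similarity between two texts' word-count vectors."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = (math.sqrt(sum(v * v for v in ca.values()))
            * math.sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

old_proposal = "study of insulin response in adult patients with type 2 diabetes"
new_proposal = "study of insulin response in adult patients with type 1 diabetes"

score = cosine(old_proposal, new_proposal)
if score >= 0.8:
    print(f"possible duplicate research (similarity {score:.2f})")
```

A production system would compare concepts rather than raw words (so "heart attack" matches "myocardial infarction"), which is exactly where the common vocabularies discussed above earn their keep.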

While an agency might be willing to tolerate some higher risk in one grant because other grants might be safer bets, “can they tolerate that in all their grants? It might be a great way to see high returns but also to lose your shirt,” he says.

Vendor contracts, as part of agencies’ overall portfolio management responsibilities, are also ripe domains for semantic technology to support, says Montgomery. “We are working with customers [who use semantic technologies] to look at existing contracts; they want to ask if they are hiring the right vendors or if there are duplications in contracts. For example, if there are five different Oracle software licenses within agency departments, you are wasting money,” Nare says.
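The duplicate-license scenario Nare mentions reduces to grouping contract records by vendor and product and flagging anything held in more than one place. The sketch below assumes a flat list of records with invented department, vendor and product names; in practice the records would be linked data drawn from several contracting systems.

```python
# Toy duplicate-license check: group contracts by (vendor, product)
# and flag combinations held by more than one department. All records
# are invented for illustration.
from collections import defaultdict

contracts = [
    {"dept": "Finance", "vendor": "Oracle", "product": "Database EE"},
    {"dept": "HR",      "vendor": "Oracle", "product": "Database EE"},
    {"dept": "IT",      "vendor": "Oracle", "product": "Database EE"},
    {"dept": "Legal",   "vendor": "Adobe",  "product": "Acrobat Pro"},
]

holders = defaultdict(list)
for c in contracts:
    holders[(c["vendor"], c["product"])].append(c["dept"])

duplicates = {k: v for k, v in holders.items() if len(v) > 1}
print(duplicates)  # {('Oracle', 'Database EE'): ['Finance', 'HR', 'IT']}
```

The hard part in the real world is not the grouping but the linking: recognizing that "Oracle DB Enterprise" in one agency system and "Oracle Database EE" in another name the same product, which again comes back to shared vocabularies.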

At the end of the day, as more government data is available online for the public to wade through, agencies have to be smarter about connecting the dots around all facets of their own spending, Shah says, across agency silos. “That is building a business case for concepts like the semantic web and Linked Data and the ability to tag and understand the relationships and context of data,” he says. “We see a big uptick in attention to what agencies are doing to better understand duplicate investments, strategic sourcing and why there are 75 vendor licenses for products they aren’t using.”

About the author

Jennifer Zaino is a New York-based freelance writer specializing in business and technology journalism. She has been an executive editor at leading technology publications, including InformationWeek, where she spearheaded an award-winning news section, and Network Computing, where she helped develop online content strategies including review exclusives and analyst reports. Her freelance credentials include being a regular contributor of original content to The Semantic Web Blog; acting as a contributing writer to RFID Journal; and serving as executive editor at the Smart Architect Smart Enterprise Exchange group. Her work also has appeared in publications and on web sites including EdTech (K-12 and Higher Ed), Ingram Micro Channel Advisor, The CMO Site, and Federal Computer Week.
