
Stephen Wolfram Demos Knowledge-Based Programming Language As It Approaches Official Release (Video)

By Jennifer Zaino  /  February 26, 2014

Stephen Wolfram is talking more publicly about the Wolfram Language, this week releasing a video demo of the knowledge-based programming language. As he describes in the video below, the symbolic language builds in a vast amount of knowledge of how to do computations and about the world itself. “Through symbolic structure of the language,” he says, primitives for everything from processing images to looking up stock prices “are all set up to work together in a wonderfully coherent way.”

The concept of coherence – the idea that everything in the language must fit together – is in fact one of the principles that have guided the development of the language over the past decades, he explains, as is maximum automation – the idea that the language should take care of as much as possible. If you are working in machine learning, for example, and want to build a data classifier, “in the Wolfram Language there’s just one Super Function, Classify, that’s packed with meta-algorithms to automatically figure out what to do,” he says. There are thousands of Super Functions in the language, he says, which “effectively give you the highest possible level of building blocks for programs.”
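As a sketch of what that looks like (an illustrative example, not one from the demo, with made-up training data): Classify takes labeled examples and returns a classifier function that can be applied to new inputs.

```wolfram
(* Train on labeled examples; Classify chooses the method automatically *)
c = Classify[{1.0 -> "small", 1.3 -> "small", 8.7 -> "large", 9.2 -> "large"}];

(* Apply the resulting ClassifierFunction to a new input *)
c[1.1]
```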

These building blocks contain not only algorithms but knowledge and data, too: knowledge of how to import and export file formats and interact with external APIs, along with huge amounts of curated computable data – the same data that powers Wolfram Alpha, all of it programmatically accessible, he says. Ask it when the sun will set today, and you’ll get the answer for your current location, for instance.
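The sunset query he mentions is itself a one-liner in the language (a sketch; Sunset resolves the machine’s current geo location when given Here):

```wolfram
(* Sunset time today at the current location *)
Sunset[Here, Today]
```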

While the Wolfram Language has a precise way of talking about everything, according to Wolfram, “when dealing with the world at large we often just want to use natural language to describe what we are talking about. And thanks to Wolfram Alpha, we have a big stack of technology for doing that,” he says. Ask for countries in South America, for instance, and its natural language understanding system can figure out what that means and generate precise Wolfram Language code for it. So you can get a list of those countries to compute with – generating their flags, for instance, or finding the dominant colors in them.
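That pipeline – from entities to computation – might be sketched directly in Wolfram Language code (an illustrative example; CountryData, "Flag", and DominantColors are the documented names for these operations, though output shapes vary by version):

```wolfram
(* The countries of South America, as computable values *)
countries = CountryData["SouthAmerica"];

(* Look up each country's flag image, then find its dominant colors *)
flags = CountryData[#, "Flag"] & /@ countries;
colors = DominantColors /@ flags
```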

Indeed, one good way to get started with the language, he suggests, is to take the approach of giving input in natural language and starting to construct a program that way. “The Wolfram Language makes it really easy to create really powerful programs,” he says.

The Wolfram Language, he notes, is completely scalable, so users can write anything from powerful one-line programs to multi-million-line programs. “It’s easier to build and debug symbolic programs because at every stage every piece is meaningful,” he says.

Wolfram also says the language is in a unique position when it comes to interacting both with devices and with real-world data in general, “because in a sense it’s a language that has a model of the world built in.” The Wolfram Data Framework, or WDF – which takes everything Wolfram has learned from Wolfram Alpha about representing data and the world, and makes it available to use on data from anywhere – “encapsulates that model, and within the Wolfram Language there are lots of ways of taking unstructured data and getting it into WDF with its units and dates and entities and so on made canonical.”
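As an illustration of that canonicalization (a sketch using the language’s Interpreter framework; the input strings here are made up):

```wolfram
(* Turn loose strings into canonical entities, dates, and quantities *)
Interpreter["Country"]["brasil"]      (* a canonical country entity *)
Interpreter["Date"]["next tuesday"]   (* a canonical date object *)
Interpreter["Quantity"]["3 km"]       (* a canonical quantity with units *)
```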

The goal with the Wolfram Language, he summarizes, is to “encapsulate as much computational knowledge as possible so people can go from ideas to deployed products as quickly and easily as possible.”

Watch the video below for a more in-depth look:

About the author

Jennifer Zaino is a New York-based freelance writer specializing in business and technology journalism. She has been an executive editor at leading technology publications, including InformationWeek, where she spearheaded an award-winning news section, and Network Computing, where she helped develop online content strategies including review exclusives and analyst reports. Her freelance credentials include being a regular contributor of original content to The Semantic Web Blog; acting as a contributing writer to RFID Journal; and serving as executive editor at the Smart Architect Smart Enterprise Exchange group. Her work also has appeared in publications and on web sites including EdTech (K-12 and Higher Ed), Ingram Micro Channel Advisor, The CMO Site, and Federal Computer Week.
