Big Data at the NSA

by Angela Guess

Alex Williams of TechCrunch reports, “The National Security Agency (NSA) cloud is about big data and creating unicorns. And it all started when some geeks stole two servers. It makes no sense, according to conventional thinking, but these are unconventional times, and the cloud that NSA built had to be thought through differently, too. NSA’s goal is to unify data and use it to do analysis, said Nathanael Burton, a computer scientist with the security agency in a keynote address today at the OpenStack Summit in Portland. But with its old infrastructure, the data was spread across different systems that did not work together. Today, OpenStack is running across the NSA and has drastically changed the way the agency works with data. So much so that Burton said it is now becoming an advisor across the international intelligence community.”

Williams continues, “The NSA is a government agency, and it can take weeks — even months — to get an idea approved and resources committed. This was squelching innovation or, in Burton’s words, making people wonder, ‘why bother?’ Burton had heard about OpenStack and decided to attend its February 2011 conference in Santa Clara. He came back impassioned and decided to steal two servers instead of going through the arduous process of getting IT approval. (Of course, this was in a lab environment so ‘stealing’ two servers is used loosely.) In two weeks they had a pilot of OpenStack up and running on the Cactus release. ‘We started to see our first unicorns,’ Burton said.”


Read more here.

photo credit: NSA


Leave a Reply