by Angela Guess
Jaikumar Vijayan of ComputerWorld reports, "Hadoop and MapReduce have long been mainstays of the big data movement, but some companies now need new and faster ways to extract business value from massive -- and constantly growing -- datasets. While many large organizations are still turning to the open source Hadoop big data framework, Google, which created the technology on which it is based, and others have already moved on to newer technologies. The Apache Hadoop platform is an open source version of the Google File System and Google MapReduce technology, which the search engine giant developed to manage and process huge volumes of data on commodity hardware. That technology has been a core part of the processing pipeline Google uses to crawl and index the Web."
He continues, "Hundreds of enterprises have adopted Hadoop over the past three or so years to manage fast-growing volumes of structured, semi-structured and unstructured data. The open source technology has proved to be a cheaper option than traditional enterprise data warehousing technologies for applications such as log and event data analysis, security event management, social media analytics and other workloads involving petabyte-scale data sets. Analysts note that some enterprises have started looking beyond Hadoop not because of limitations in the technology, but because of the purposes for which it was designed."
photo credit: Hadoop