Big Data from Supercomputers

by Angela Guess

Rita Boland of Signal Online recently wrote, “Big data can mean big problems for the people trying to derive usable information from a large number of sources. Since coming into existence in March, the Scalable Data Management, Analysis and Visualization Institute has made strides to resolve this issue for programs running on supercomputers. The young organization’s efforts have applicability to a variety of scientific fields—including nuclear physics—and its tools are open source so others can take advantage of the findings.”

She continues, “Funding comes from the U.S. Department of Energy (DOE), under which the institute falls. For the next five years, the department will provide $5 million annually to support research among the members, which include seven universities and six national laboratories. Lawrence Berkeley National Laboratory is the lead, because one of its staff, Arie Shoshani, was chosen as the institute’s director. The private company Kitware Incorporated also is a member, supplying its virtual toolkit for partners to use. Shoshani explains that his organization’s emphasis on scientific data, large scale simulations and in-situ processing separate it from other agencies working on big data projects. Running data reduction, analysis and visualization tasks in situ means that these tasks are performed on the same machine where the simulation takes place.”
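The in-situ idea Shoshani describes — reducing and analyzing data on the same machine that runs the simulation, rather than writing raw output to disk for later processing — can be sketched in a few lines. This is a minimal illustration, not the institute's actual tooling; the simulation loop, function names, and statistics chosen here are all hypothetical.

```python
import random

def simulation_step(n=1_000):
    # Hypothetical stand-in for one timestep of a large-scale simulation
    # that produces a big field of values.
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def in_situ_reduce(values):
    # Reduce the full field to a few summary statistics on the same node,
    # instead of shipping the raw data elsewhere for post-hoc analysis.
    n = len(values)
    mean = sum(values) / n
    return {"n": n, "min": min(values), "max": max(values), "mean": mean}

def run(steps=5):
    summaries = []
    for _ in range(steps):
        field = simulation_step()
        # Analyze in place; the raw field is discarded, only the
        # small summary survives each step.
        summaries.append(in_situ_reduce(field))
    return summaries

if __name__ == "__main__":
    for summary in run():
        print(summary)
```

The payoff of this pattern is data-volume reduction: each timestep's full field never leaves the compute node, so only the small per-step summaries need to be stored or moved.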

Read more here.
