
Managing Big Data From Space

NASA and its dozens of missions gather hundreds of terabytes of data every hour. Data pour in every day like rushing rivers. Spacecraft monitor everything from our home planet to faraway galaxies, beaming back images and information to Earth. All those digital records need to be stored, indexed and processed so that spacecraft engineers, scientists and people across the globe can use the data to understand Earth and the universe beyond.

Mission planners and software engineers are coming up with new strategies for managing the ever-increasing flow of such large and complex data streams.

“Scientists use big data for everything from predicting weather on Earth to monitoring ice caps on Mars to searching for distant galaxies,” said Eric De Jong of JPL, principal investigator for NASA’s Solar System Visualization project. “We are the keepers of the data, and the users are the astronomers and scientists who need images, mosaics, maps and movies to find patterns and verify theories.”

De Jong explains that there are three aspects to wrangling data from space missions: storage, processing and access. The first task, to store or archive the data, is naturally more challenging for larger volumes of data. The Square Kilometre Array (SKA), a planned array of thousands of telescopes in South Africa and Australia, illustrates this problem. Led by the SKA Organization based in England and scheduled to begin construction in 2016, the array will scan the skies for radio waves coming from the earliest galaxies known.
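The three stages De Jong describes can be pictured as a small pipeline: raw payloads go into storage, metadata extraction produces a searchable index, and queries provide access. The sketch below is a toy illustration of that pattern, not an actual NASA/JPL system; every class, method and field name here is hypothetical.

```python
import hashlib

class MissionArchive:
    """Toy model of De Jong's three stages: storage, processing, access.
    All names are illustrative assumptions, not a real JPL API."""

    def __init__(self):
        self._store = {}   # storage: raw payloads keyed by content hash
        self._index = {}   # processing: searchable metadata per record

    def archive(self, payload: bytes, metadata: dict) -> str:
        # Storage: content-addressed, so duplicate downlinks are kept once.
        key = hashlib.sha256(payload).hexdigest()
        self._store[key] = payload
        # Processing: normalize metadata into a search index.
        self._index[key] = dict(metadata, size_bytes=len(payload))
        return key

    def query(self, **criteria):
        # Access: return keys whose metadata matches every criterion.
        return [k for k, meta in self._index.items()
                if all(meta.get(f) == v for f, v in criteria.items())]

archive = MissionArchive()
key = archive.archive(b"\x00" * 1024, {"mission": "SKA", "band": "radio"})
assert archive.query(mission="SKA") == [key]
```

The content-addressed keying is one common choice for archives of this kind: identical data arriving twice costs storage only once, and the index can grow independently of the raw store.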

JPL is involved with archiving the array’s torrents of images: 700 terabytes of data are expected to rush in every day. That’s equivalent to all the data flowing on the Internet every two days. Rather than build more hardware, engineers are busy developing creative software tools to better store the information, such as “cloud computing” techniques and automated programs for extracting data.
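One way "automated programs for extracting data" can tame volumes like this is to strip each incoming record down to the fields science users actually need before archiving. The sketch below assumes a simple JSON record stream; the field names and record shape are invented for illustration and do not reflect any real SKA data format.

```python
import json

def extract(records, keep_fields):
    """Toy automated-extraction pass: instead of archiving every raw
    record whole, keep only the fields downstream users need.
    Record shape and field names are hypothetical."""
    for raw in records:
        rec = json.loads(raw)
        yield {f: rec[f] for f in keep_fields if f in rec}

raw_stream = [
    json.dumps({"t": 0, "freq_mhz": 150.0, "amp": 3.2, "junk": "x" * 100}),
    json.dumps({"t": 1, "freq_mhz": 150.1, "amp": 3.1, "junk": "y" * 100}),
]
slim = list(extract(raw_stream, ["t", "freq_mhz", "amp"]))
assert slim[0] == {"t": 0, "freq_mhz": 150.0, "amp": 3.2}
```

Because `extract` is a generator, it processes records one at a time rather than loading a whole day's downlink into memory, which is the point when the daily intake is measured in hundreds of terabytes.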


“We don’t need to reinvent the wheel,” said Chris Mattmann, a principal investigator for JPL’s big-data initiative. “We can modify open-source computer codes to create faster, cheaper solutions.”

JPL has been increasingly bringing open-source software into its fold, creating improved data processing tools for space missions. The JPL tools then go back out into the world for others to use for different applications.

Read the Full Story.
