TECH SPACE
Managing the Deluge of 'Big Data' From Space
by Staff Writers
Pasadena CA (JPL) Oct 28, 2013


The center of the Milky Way galaxy imaged by NASA's Spitzer Space Telescope is displayed on a quarter-of-a-billion-pixel, high-definition 23-foot-wide (7-meter) LCD science visualization screen at NASA's Ames Research Center in Moffett Field, Calif. Image Credit: NASA/Ames/JPL-Caltech.

For NASA and its dozens of missions, data pour in every day like rushing rivers. Spacecraft monitor everything from our home planet to faraway galaxies, beaming back images and information to Earth. All those digital records need to be stored, indexed and processed so that spacecraft engineers, scientists and people across the globe can use the data to understand Earth and the universe beyond.

At NASA's Jet Propulsion Laboratory in Pasadena, Calif., mission planners and software engineers are coming up with new strategies for managing the ever-increasing flow of such large and complex data streams, referred to in the information technology community as "big data."

How big is big data? For NASA missions, hundreds of terabytes are gathered every hour. Just one terabyte is equivalent to the information printed on 50,000 trees' worth of paper.
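A quick back-of-the-envelope calculation shows how that analogy works; the per-page and per-tree figures in this sketch are illustrative assumptions, not NASA numbers:

```python
# Back-of-the-envelope check of the "1 terabyte = 50,000 trees of paper"
# analogy. Page and tree yields are illustrative assumptions, not NASA data.
BYTES_PER_PAGE = 2_000    # ~2 KB of plain text per printed page (assumption)
PAGES_PER_TREE = 10_000   # sheets of paper from one tree (assumption)
TERABYTE = 10**12         # decimal terabyte, in bytes

pages = TERABYTE / BYTES_PER_PAGE   # 500 million pages
trees = pages / PAGES_PER_TREE      # 50,000 trees
print(f"{pages:,.0f} pages ~= {trees:,.0f} trees per terabyte")
```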

"Scientists use big data for everything from predicting weather on Earth to monitoring ice caps on Mars to searching for distant galaxies," said Eric De Jong of JPL, principal investigator for NASA's Solar System Visualization project, which converts NASA mission science into visualization products that researchers can use.

"We are the keepers of the data, and the users are the astronomers and scientists who need images, mosaics, maps and movies to find patterns and verify theories."

Building Castles of Data
De Jong explains that there are three aspects to wrangling data from space missions: storage, processing and access. The first task, storing or archiving the data, naturally becomes more challenging as data volumes grow.

The Square Kilometer Array (SKA), a planned array of thousands of telescopes in South Africa and Australia, illustrates this problem. Led by the SKA Organization based in England and scheduled to begin construction in 2016, the array will scan the skies for radio waves coming from the earliest galaxies known.

JPL is involved with archiving the array's torrents of images: 700 terabytes of data are expected to rush in every day. That's equivalent to all the data flowing on the Internet every two days.
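In throughput terms, a simple conversion (decimal units assumed) puts that daily rate in perspective:

```python
# Convert the SKA's projected 700 terabytes/day into sustained throughput
# (decimal units assumed throughout).
TB_PER_DAY = 700
SECONDS_PER_DAY = 86_400

bytes_per_s = TB_PER_DAY * 10**12 / SECONDS_PER_DAY
print(f"{bytes_per_s / 10**9:.1f} GB/s, "
      f"or {bytes_per_s * 8 / 10**9:.0f} Gbit/s around the clock")
# -> about 8.1 GB/s, roughly 65 Gbit/s, every second of every day
```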

Rather than build more hardware, engineers are busy developing creative software tools to better store the information, such as "cloud computing" techniques and automated programs for extracting data.

"We don't need to reinvent the wheel," said Chris Mattmann, a principal investigator for JPL's big-data initiative. "We can modify open-source computer codes to create faster, cheaper solutions."

Software that is shared and free for all to build upon is called open source or open code. JPL has been increasingly bringing open-source software into its fold, creating improved data processing tools for space missions. The JPL tools then go back out into the world for others to use for different applications.
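The sketch below illustrates, in miniature, what such automated extraction tools do: crawl an archive and record searchable metadata for each file. The archive root and metadata fields here are hypothetical; production frameworks such as Apache OODT, an open-source data framework that originated at JPL, do far more.

```python
# Minimal sketch of automated metadata extraction over a data archive:
# crawl a directory tree and record searchable metadata per file.
# The archive root and fields are hypothetical; production frameworks
# such as Apache OODT (which originated at JPL) do far more.
import hashlib
import json
import os
from datetime import datetime, timezone

def extract_metadata(path):
    """Build one searchable metadata record for an archived file."""
    stat = os.stat(path)
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MB chunks
            digest.update(chunk)
    return {
        "path": path,
        "bytes": stat.st_size,
        "modified": datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
        "sha256": digest.hexdigest(),  # integrity check for long-term storage
    }

def crawl(root):
    """Yield a metadata record for every file under the archive root."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            yield extract_metadata(os.path.join(dirpath, name))

if __name__ == "__main__":
    records = list(crawl("/data/archive"))  # hypothetical archive root
    print(json.dumps(records[:3], indent=2))
```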

"It's a win-win solution for everybody," said Mattmann.

In Living Color
Archiving isn't the only challenge in working with big data. De Jong and his team develop new ways to visualize the information. Each image from one of the cameras on NASA's Mars Reconnaissance Orbiter, for example, contains 120 megapixels.

His team creates movies from data sets like these, in addition to computer graphics and animations that enable scientists and the public to get up close with the Red Planet.
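One basic technique behind such products is tiling: cutting a huge frame into small pieces so a viewer loads only what is on screen. This sketch uses the Pillow imaging library; the file name is hypothetical, and real pipelines also build lower-resolution overview levels.

```python
# Sketch: cut a very large frame into fixed-size tiles so a viewer loads
# only what is on screen. Uses the Pillow library; the file name is
# hypothetical, and real pipelines also build lower-resolution overviews.
from PIL import Image

Image.MAX_IMAGE_PIXELS = None  # allow 100+ megapixel frames (trusted data)
TILE = 512                     # tile edge, in pixels

def tile_image(src, out_pattern="tile_{x}_{y}.png"):
    img = Image.open(src)
    width, height = img.size
    for y in range(0, height, TILE):
        for x in range(0, width, TILE):
            box = (x, y, min(x + TILE, width), min(y + TILE, height))
            img.crop(box).save(out_pattern.format(x=x // TILE, y=y // TILE))

tile_image("mro_frame.png")  # hypothetical 120-megapixel camera frame
```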

"Data are not just getting bigger but more complex," said De Jong. "We are constantly working on ways to automate the process of creating visualization products, so that scientists and engineers can easily use the data."

Data Served Up to Go
Another big job in the field of big data is making it easy for users to grab what they need from the data archives.

"If you have a giant bookcase of books, you still have to know how to find the book you're looking for," said Steve Groom, manager of NASA's Infrared Processing and Analysis Center at the California Institute of Technology, Pasadena.

The center archives data for public use from a number of NASA astronomy missions, including the Spitzer Space Telescope, the Wide-field Infrared Survey Explorer (WISE) and the U.S. portion of the European Space Agency's Planck mission.

Sometimes users want to access all the data at once to look for global patterns, a benefit of big data archives. "Astronomers can also browse all the 'books' in our library simultaneously, something that can't be done on their own computers," said Groom.
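Archives like IPAC's expose programmatic query interfaces so that searches run on the archive's own servers. As one hedged illustration, the community-maintained astroquery package includes an IRSA module; the exact module path and catalog identifier below depend on the installed versions.

```python
# Illustrative server-side archive query using the community astroquery
# package's IRSA module. Module path and catalog identifier depend on the
# installed versions; treat this as a sketch, not the archive's only API.
import astropy.units as u
from astroquery.ipac.irsa import Irsa

# Ask IRSA for 2MASS point sources within 2 arcminutes of M31.
# The search runs on the archive's servers, so the full catalog
# never has to leave the data center.
table = Irsa.query_region("M31", catalog="fp_psc",
                          spatial="Cone", radius=2 * u.arcmin)
print(len(table), "sources returned")
```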

"No human can sort through that much data," said Andrea Donnellan of JPL, who is charged with a similarly mountainous task for the NASA-funded QuakeSim project, which brings together massive data sets -- space- and Earth-based -- to study earthquake processes.

QuakeSim's images and plots allow researchers to understand how earthquakes occur and develop long-term preventative strategies. The data sets include GPS data for hundreds of locations in California, where thousands of measurements are taken, resulting in millions of data points. Donnellan and her team develop software tools to help users sift through the flood of data.
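A minimal sketch of one such sifting step, on synthetic numbers: estimate the steady tectonic drift in a daily GPS position series, then flag abrupt offsets that stand out from the noise. QuakeSim's actual tools are far more sophisticated.

```python
# Sketch of one sifting step on synthetic numbers: estimate the steady
# tectonic drift in a daily GPS position series, then flag abrupt offsets
# that stand out from the noise. QuakeSim's real tools are far richer.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(3650)                                  # ~10 years, daily
east_mm = 0.08 * days + rng.normal(0, 1.5, days.size)   # drift + noise (mm)
east_mm[2000:] += 12.0                                  # synthetic offset

# Secular motion: least-squares slope of the whole series.
slope, _ = np.polyfit(days, east_mm, 1)

# Transients: day-to-day steps far larger than the noise floor.
steps = np.diff(east_mm)
jump_days = days[1:][np.abs(steps) > 6 * 1.5]
print(f"long-term rate ~{slope:.2f} mm/day; offsets on day(s): {jump_days}")
```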

Ultimately, the tide of big data will continue to swell, and NASA will develop new strategies to manage the flow. As new tools evolve, so will our ability to make sense of our universe and the world.



Related Links
JPL
Space Technology News - Applications and Research





