QUANTA

Monday, April 25, 2011


Data-intensive supercomputing

The amount of digital data generated by instruments such as DNA sequencers, cameras, telescopes, and MRIs is now doubling every 18 months, says Michael Norman, director of the San Diego Supercomputer Center (SDSC) at the University of California, San Diego (UCSD).

“Digital data is advancing at least as fast as, and probably faster than, Moore’s Law,” said Norman. “… But I/O (input/output) transfer rates are not keeping pace; that is what SDSC’s supercomputers are designed to solve.”
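To put that comparison in perspective, here is a rough back-of-the-envelope sketch in Python (not from the article; the 24-month Moore's Law doubling period is a common assumption) contrasting the two growth rates over the five-year span of the project:

    DATA_DOUBLING_MONTHS = 18    # growth rate quoted by Norman
    MOORE_DOUBLING_MONTHS = 24   # commonly cited Moore's Law period (assumed)

    def growth_factor(months, doubling_period_months):
        # Compound growth: the quantity doubles every `doubling_period_months`.
        return 2 ** (months / doubling_period_months)

    horizon = 60  # five years, the length of the NSF grant
    data_growth = growth_factor(horizon, DATA_DOUBLING_MONTHS)
    chip_growth = growth_factor(horizon, MOORE_DOUBLING_MONTHS)

    print(f"Data: {data_growth:.1f}x, transistors: {chip_growth:.1f}x, "
          f"gap: {data_growth / chip_growth:.1f}x")
    # -> Data: 10.1x, transistors: 5.7x, gap: 1.8x

Under those assumptions, data volumes grow almost twice as much as transistor counts over the grant period, which is the gap Norman is pointing at.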

Funded by a five-year, $20 million grant from the National Science Foundation, the supercomputer, named Gordon, will have 250 trillion bytes (250 terabytes) of flash memory and 64 I/O nodes, and will be capable of handling massive databases at speeds up to 100 times faster than hard-disk-drive systems for some queries.
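How a flash-based design could deliver that kind of speedup is easiest to see in rough numbers. The sketch below uses assumed, era-typical rates for small random reads; the figures are illustrative, not SDSC's published specifications:

    # Illustrative sketch only; the device rates below are assumptions.
    HDD_IOPS = 150        # random 4 KB reads on a 7,200 RPM hard drive (assumed)
    FLASH_IOPS = 15_000   # 2011-era enterprise flash (assumed)

    random_reads = 1_000_000  # hypothetical seek-bound database query

    hdd_seconds = random_reads / HDD_IOPS
    flash_seconds = random_reads / FLASH_IOPS

    print(f"HDD:   {hdd_seconds / 3600:.1f} hours")        # ~1.9 hours
    print(f"Flash: {flash_seconds / 60:.1f} minutes")      # ~1.1 minutes
    print(f"Speedup: {hdd_seconds / flash_seconds:.0f}x")  # 100x

The advantage comes from seek time: a spinning disk must physically move a head for each random read, while flash has no moving parts, so seek-bound queries see the largest gains.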

This makes Gordon ideal for data mining and data exploration, where researchers have to churn through tremendous amounts of data just to find a small amount of valuable information, not unlike a web search.
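As a loose illustration of that access pattern, the minimal Python sketch below streams a dataset once and keeps only the rare matching records; the file name and match predicate are hypothetical. A scan like this is limited almost entirely by how fast storage can feed it, which is exactly the I/O bottleneck Gordon targets:

    def scan_for_hits(path, predicate):
        # Stream the file line by line, yielding only matching records;
        # memory use stays constant no matter how large the file is.
        with open(path) as f:
            for line in f:
                if predicate(line):
                    yield line.rstrip("\n")

    # Hypothetical usage: pull the rare interesting records from a huge file.
    # hits = list(scan_for_hits("observations.txt", lambda rec: "anomaly" in rec))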

Potential uses include genome assembly from sequencer reads, classification of objects found in massive astronomical surveys, oceanography, atmospheric science, oil exploration, quantum chemistry, structural engineering, and computer-aided design/computer-aided manufacturing (CAD/CAM).

Source: http://goo.gl/eCdig

Source and additional resources: http://3.ly/rECc. Publisher/Author/Managing Editor: Andres Agostini (@Futuretronium on Twitter). Futuretronium Book at http://3.ly/rECc