New data-compression method reduces big-data bottleneck
Outperforms and enhances JPEG, handles both analog and digital signals
December 26, 2013
The JALALI-LAB group, led by Bahram Jalali, holder of the Northrop Grumman Opto-Electronic Chair in Electrical Engineering, discovered that it is possible to achieve data compression by stretching and warping the data using a new “anamorphic stretch transform” (AST) technique, which operates both in analog and digital domains.
In analog applications, AST makes it possible to not only capture and digitize signals that are faster than the speed of the sensor and the digitizer, but also to minimize the volume of data generated in the process.
AST can also compress digital records, such as medical data, so they can be transmitted over the Internet for a tele-consultation. The transformation reshapes the signal in such a way that its "sharp" features, its most defining characteristics, are stretched more than the data's "coarse" features.
The technique does not require prior knowledge of the data for the transformation to take place; it occurs naturally and in a streaming fashion.
"Our transformation causes feature-selective stretching of the data and allocation of more pixels to sharper features where they are needed the most," said Mohammad Asghari, a postdoctoral researcher in Jalali's group and the study's lead author. "For example, if we used the technique to take a picture of a sailboat on the ocean, our anamorphic stretch transform would cause the sailboat's features to be stretched much more than the ocean, to identify the boat while using a small file size."
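The idea of spending more samples where the signal is sharpest can be illustrated with a toy sketch. This is only an analogy, not the published AST algorithm (which operates on the signal's complex field); here, a hypothetical `feature_weighted_resample` function simply resamples a signal onto a nonuniform grid whose density follows the local gradient magnitude:

```python
import numpy as np

def feature_weighted_resample(signal, n_out):
    """Resample `signal` onto a nonuniform grid whose density follows the
    local gradient magnitude (a crude stand-in for feature 'sharpness')."""
    x = np.arange(len(signal), dtype=float)
    sharpness = np.abs(np.gradient(signal)) + 1e-3  # small floor keeps density positive
    cdf = np.cumsum(sharpness / sharpness.sum())
    cdf /= cdf[-1]
    # Invert the CDF: uniform steps in CDF space become dense sampling
    # where the signal is sharp and sparse sampling where it is smooth.
    targets = np.linspace(0.0, 1.0, n_out)
    new_x = np.interp(targets, cdf, x)
    return new_x, np.interp(new_x, x, signal)

# Example: a slowly varying background plus one sharp pulse at t = 0.5.
t = np.linspace(0.0, 1.0, 1000)
sig = np.sin(2 * np.pi * t) + np.exp(-((t - 0.5) / 0.01) ** 2)
xs, ys = feature_weighted_resample(sig, 100)
# A disproportionate share of the 100 output samples lands near the pulse.
```

On a uniform grid, only about 10 of the 100 samples would fall in the 10% of the axis around the pulse; the gradient-weighted grid places several times that many there, at the cost of sparser coverage of the smooth background.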
AST can also be used for image compression, either as a standalone algorithm or combined with existing digital compression techniques, to enhance speed or quality or to increase how much images can be compressed. Results have shown that AST can outperform the standard JPEG image compression format, with dramatic improvements in both image quality and compression factor.
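As a loose, hypothetical illustration of feature-selective pixel allocation (again, not the published AST algorithm), one can rank pixels by local gradient magnitude and keep only the sharpest fraction, concentrating the pixel budget on edges such as a boat's outline rather than a flat ocean:

```python
import numpy as np

def edge_weighted_mask(img, keep_fraction=0.1):
    """Keep the `keep_fraction` of pixels with the largest local gradient
    magnitude (a crude stand-in for the image's 'sharp' features)."""
    gy, gx = np.gradient(img.astype(float))
    sharpness = np.hypot(gx, gy)
    k = int(keep_fraction * img.size)
    keep = np.argsort(sharpness.ravel())[::-1][:k]  # indices of the k sharpest pixels
    mask = np.zeros(img.size, dtype=bool)
    mask[keep] = True
    return mask.reshape(img.shape)

# A sharp-edged square (the "sailboat") on a flat background (the "ocean").
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0
mask = edge_weighted_mask(img)
# Every edge pixel of the square survives; the flat interior and
# background are mostly discarded.
```

A real codec would also have to store the kept pixels' locations and reconstruct the discarded ones, for example by interpolation; the point of the sketch is only the edge-weighted allocation of the pixel budget.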
The new technique has its origin in another technology pioneered by the Jalali group, time stretch dispersive Fourier transform, which is a method for slowing down and amplifying faint but very fast signals so they can be detected and digitized in real time.
High-speed instruments created with this technology enabled the discovery of optical rogue waves in 2007 and the detection of cancer cells in blood with one-in-a-million sensitivity in 2012. But these instruments produce a fire hose of data that overwhelms even the most advanced computers. The need to deal with such data loads motivated the UCLA team to search for a new data compression technology.
"Reshaping the data by stretching and warping it in the prescribed manner compresses it without losing pertinent information," Jalali said. "It emulates what happens to waves as they travel through physical media with specific properties. It also brings to mind aspects of surrealism and the optical effects of anamorphism."
Asghari was supported by a grant from the Natural Sciences and Engineering Research Council of Canada. Jalali also has UCLA faculty appointments in bioengineering and in the David Geffen School of Medicine’s department of surgery, and he is a member of the California NanoSystems Institute.
Abstract of Applied Optics paper
A general method for compressing the modulation time–bandwidth product of analog signals is introduced. As one of its applications, this physics-based signal grooming, performed in the analog domain, allows a conventional digitizer to sample and digitize the analog signal with variable resolution. The net result is that frequency components that were beyond the digitizer bandwidth can now be captured and, at the same time, the total digital data size is reduced. This compression is lossless and is achieved through a feature selective reshaping of the signal’s complex field, performed in the analog domain prior to sampling. Our method is inspired by operation of Fovea centralis in the human eye and by anamorphic transformation in visual arts. The proposed transform can also be performed in the digital domain as a data compression algorithm to alleviate the storage and transmission bottlenecks associated with “big data.”