QUANTA

Tuesday, July 31, 2012

“…Technology and ideology are shaking the foundations of twenty-first-century capitalism. Technology is making skills and knowledge the only sources of sustainable strategic advantage…” ─Lester Thurow


“…We have lingered long enough on the shores of the cosmic ocean. We are ready at last to set sail for the stars…” ─Carl Sagan


Why graphene may be the substrate for the next generation of computer chips

Sandwiching individual graphene sheets between insulating layers to produce electrical devices with unique new properties could open up a new dimension of physics research

Graphene, the so-called wonder material, is a two-dimensional material consisting of a single layer of carbon atoms arranged in a honeycomb (chicken-wire) lattice. It is the thinnest material in the world, yet also one of the strongest; it conducts electricity as efficiently as copper and outperforms all other materials as a conductor of heat.

Now University of Manchester scientists have shown that a new side-view imaging technique can be used to visualize the individual atomic layers of graphene within the devices they have built. They found that the structures were almost perfect — even when more than 10 different layers were used to build the stack.

This surprising result indicates that the latest techniques of isolating graphene could be a huge leap forward for engineering at the atomic level, the scientists say, and gives more weight to graphene’s suitability as a major component in the next generation of computer chips.

The scientists note that the field has expanded beyond studying graphene as isolated 2D crystals. There is rapidly growing interest in atomic-scale heterostructures made from alternating layers of graphene, hexagonal boron nitride (hBN), MoS2, and so on. Such heterostructures provide higher electronic quality for lateral graphene devices and also allow a conceptually new degree of flexibility in designing electronic, optoelectronic, micromechanical, and other devices.

Side-view imaging

The researchers’ side-view imaging approach works by first extracting a thin slice from the center of the device. This is similar to cutting through a rock to reveal the geological layers or slicing into a chocolate cake to reveal the individual layers of icing.

The scientists used a beam of ions to cut into the surface of the graphene and dig a trench on either side of the section they wanted to isolate. They then removed a thin slice of the device.

“The difference is that our slices are only around 100 atoms thick and this allows us to visualize the individual atomic layers of graphene in projection,” said Dr. Sarah Haigh from The University of Manchester’s School of Materials.

“We have found that the observed roughness of the graphene is correlated with its conductivity. Of course we have to make all our electrical measurements before cutting into the device. We were also able to observe that the layers were perfectly clean and that any debris left over from production segregated into isolated pockets and so did not affect device performance.

“We plan to use this new side view imaging approach to improve the performance of our graphene devices.”
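
For a rough sense of the scales involved, here is a back-of-envelope estimate in Python of how thick a slice of "around 100 atoms" is; the carbon spacings are standard textbook values, not figures taken from the Manchester paper.

```python
# Back-of-envelope for the "around 100 atoms thick" slices mentioned above,
# using standard carbon-carbon spacings (textbook values, not from the paper).

CC_BOND_NM = 0.142     # graphene in-plane C-C bond length, nm
INTERLAYER_NM = 0.335  # graphite interlayer spacing, nm

atoms_across = 100
slice_thickness_nm = atoms_across * CC_BOND_NM
print(f"slice thickness: ~{slice_thickness_nm:.0f} nm")  # ~14 nm

# For comparison, a 10-layer graphene stack seen in the same cross-section
# spans 9 interlayer gaps:
print(f"10-layer stack: ~{9 * INTERLAYER_NM:.1f} nm")    # ~3.0 nm
```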

Demonstrating graphene’s remarkable properties won Professors Andre Geim and Kostya Novoselov the Nobel Prize in Physics in 2010. The University of Manchester is building a state-of-the-art National Graphene Institute to continue to lead the way in graphene research.




Chronic 2000-04 drought, worst in 800 years, may be the ‘new normal’

The chronic drought that hit western North America from 2000 to 2004 left dying forests and depleted river basins in its wake and was the most severe in 800 years, scientists have concluded. Yet they say those conditions will become the “new normal” for most of the coming century.

Such climatic extremes have increased as a result of global warming, a group of 10 researchers reported Sunday in Nature Geoscience. And as bad as conditions were during the 2000-04 drought, they may eventually be seen as the good old days.

Climate models and precipitation projections indicate this period will actually be closer to the “wet end” of a drier hydroclimate during the last half of the 21st century, scientists said.

Aside from its impact on forests, crops, rivers and water tables, the drought also cut carbon sequestration by an average of 51 percent in a massive region of the western United States, Canada and Mexico, although some areas were hit much harder than others. As vegetation withered, this released more carbon dioxide into the atmosphere, with the effect of amplifying global warming.

“Climatic extremes such as this will cause more large-scale droughts and forest mortality, and the ability of vegetation to sequester carbon is going to decline,” said Beverly Law, a co-author of the study, professor of global change biology and terrestrial systems science at Oregon State University, and former science director of AmeriFlux, an ecosystem observation network.

“During this drought, carbon sequestration from this region was reduced by half,” Law said. “That’s a huge drop. And if global carbon emissions don’t come down, the future will be even worse.”

This research was supported by the National Science Foundation, NASA, U.S. Department of Energy, and other agencies. The lead author was Christopher Schwalm at Northern Arizona University. Other collaborators were from the University of Colorado, University of California at Berkeley, University of British Columbia, San Diego State University, and other institutions.

It’s not clear whether the current drought in the Midwest, now being called one of the worst since the Dust Bowl, is related to these same forces, Law said. This study did not address that question, and some of the climate mechanisms at work in western North America affect that region more than they do other parts of the country.

But in the West, this multi-year drought was unlike anything seen in many centuries, based on tree ring data. The last two periods with drought events of similar severity were in the Middle Ages, from 977-981 and 1146-1151. The 2000-04 drought affected precipitation, soil moisture, river levels, crops, forests and grasslands.

Ordinarily, Law said, the land sink in North America is able to sequester the equivalent of about 30 percent of the carbon emitted into the atmosphere by the use of fossil fuels in the same region. However, based on projected changes in precipitation and drought severity, scientists said that this carbon sink, at least in western North America, could disappear by the end of the century.
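
To make that arithmetic concrete, here is a toy Python sketch of the regional carbon balance. The 30 percent offset and the 51 percent drought reduction come from the article; the emission figure itself is purely illustrative.

```python
# Toy carbon-balance sketch with an ILLUSTRATIVE emissions number.
# From the article: the western North American land sink normally offsets
# ~30% of the region's fossil-fuel emissions, and the 2000-04 drought cut
# sequestration by ~51% on average.

emissions = 100.0                        # regional emissions, arbitrary units/yr
normal_sink = 0.30 * emissions           # ~30% offset in a normal year
drought_sink = normal_sink * (1 - 0.51)  # ~51% reduction during the drought

net_normal = emissions - normal_sink
net_drought = emissions - drought_sink
print(f"net emissions, normal year:  {net_normal:.1f}")   # 70.0
print(f"net emissions, drought year: {net_drought:.1f}")  # 85.3
print(f"increase in net emissions:   {100 * (net_drought / net_normal - 1):.0f}%")  # ~22%
```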

“Areas that are already dry in the West are expected to get drier,” Law said. “We expect more extremes. And it’s these extreme periods that can really cause ecosystem damage, lead to climate-induced mortality of forests, and may cause some areas to convert from forest into shrublands or grassland.”

During the 2000-04 drought, runoff in the upper Colorado River basin was cut in half. Crop productivity in much of the West fell 5 percent. The productivity of forests and grasslands declined, along with snowpacks. Evapotranspiration decreased the most in evergreen needleleaf forests, about 33 percent.

The effects are driven by human-caused increases in temperature, with associated lower soil moisture and decreased runoff in all major water basins of the western U.S., researchers said in the study.

Although regional precipitation patterns are difficult to forecast, the researchers said that climate models are underestimating the extent and severity of drought compared to actual observations. They say the situation will continue to worsen, and that 80 of the 95 years from 2006 to 2100 will have precipitation levels as low as, or lower than, the “turn of the century” drought of 2000-04.

“Towards the latter half of the 21st century the precipitation regime associated with the turn of the century drought will represent an outlier of extreme wetness,” the scientists wrote in this study.

These long-term trends are consistent with a 21st century “megadrought,” they said.



Microsoft tech to control computers with a flex of a finger

In the future, Microsoft apparently believes, people may simply twitch their fingers or arms to control a computer, game console or mobile device, ReadWriteWeb reports.

Microsoft applied for a patent on electromyography (EMG) controlled computing on Thursday, suggesting that a future smart wristwatch or armband might simply detect a user’s muscle movements and interpret them as gestures or commands.

The “Wearable Electromyography-Based Controller” could also use a network of small sensors attached to the body, all communicating wirelessly with a central hub.
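
As a purely hypothetical illustration of that hub-and-sensors idea, the Python sketch below takes one polling cycle of raw EMG sample windows and maps the set of active sensors to a gesture. Every sensor name, threshold, and gesture mapping here is an assumption for the sake of the example, not anything taken from Microsoft’s patent.

```python
import math

# Hypothetical hub logic: each wireless EMG sensor reports a short window of
# muscle-activity samples; the hub decides which sensors are "active" and
# maps that set to a gesture. All names and values are illustrative.

ACTIVATION_RMS = 0.25  # assumed activation threshold, arbitrary units

def rms(samples):
    """Root-mean-square amplitude of one sensor's EMG window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify(windows):
    """Map the set of active sensors to a (made-up) gesture name."""
    active = frozenset(name for name, samples in windows.items()
                       if rms(samples) > ACTIVATION_RMS)
    gestures = {
        frozenset({"index_flexor"}): "select",
        frozenset({"index_flexor", "wrist_extensor"}): "scroll",
        frozenset({"forearm_flexor"}): "grab",
    }
    return gestures.get(active, "no_gesture")

# One polling cycle: raw sample windows from three body-worn sensors.
readings = {
    "index_flexor":   [0.40, -0.50, 0.45, -0.38],  # strong activity
    "wrist_extensor": [0.02, -0.01, 0.03, -0.02],  # quiet
    "forearm_flexor": [0.05, -0.04, 0.02, -0.03],  # quiet
}
print(classify(readings))  # -> select
```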

Microsoft also showed off a prototype of an EMG controller in 2010, and has filed complementary EMG controller patents as well as a patent covering the gestures used to control them.

But EMG-based computing does imply several interesting possibilities: the ability to type without a keyboard; wiggling a finger, rather than an arm, to provide fine-grained Kinect controls; or new ways to control “waldos” and other robotic appendages. Microsoft even suggests that a glove-based version of the EMG controller might be used to automatically translate American Sign Language into written or spoken English or other languages. That’s pretty cool.

Microsoft’s patent application claims that an EMG sensor is a “universal” method of controlling any computing device, such as a television or light sensor. That may be true, although something like voice commands could also control just about anything.


A Casimir chip that exploits the vacuum energy

University of Florida researchers have developed a way to keep objects flat enough to measure the strange Casimir force, which pushes two parallel conducting plates together when they are just a few dozen nanometers apart, Technology Review’s Physics arXiv Blog reports.
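
For a sense of the magnitudes involved, the textbook result for perfectly conducting parallel plates gives an attractive pressure P = π²ħc/(240d⁴). The short Python sketch below evaluates it at a few separations; note this is the idealized formula, not the corrected result for the silicon beams in the Florida device, since real materials and geometries deviate from the ideal.

```python
import math

# Ideal-conductor, parallel-plate Casimir pressure: P = pi^2 * hbar * c / (240 * d^4).
# Textbook formula for perfect conductors at zero temperature.

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure in pascals between ideal plates separated by d meters."""
    return math.pi**2 * hbar * c / (240 * d**4)

for d_nm in (50, 100, 200):
    print(f"d = {d_nm:3d} nm -> P = {casimir_pressure(d_nm * 1e-9):7.2f} Pa")
# d =  50 nm -> P =  208.02 Pa
# d = 100 nm -> P =   13.00 Pa
# d = 200 nm -> P =    0.81 Pa
```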

They carved a single device out of silicon that can measure the Casimir force between a pair of parallel silicon beams, the first on-chip device able to do so.

The device consists of one fixed beam and another moveable one attached to an electromechanical actuator. Other shapes should be possible to manufacture too. “This scheme opens the possibility of tailoring the Casimir force using lithographically defined components of non-conventional shapes,” the researchers say.

So instead of being hindered by uncontrollable Casimir forces, the next generation of microelectromechanical devices should be able to exploit them, perhaps to make stictionless bearings, springs and even actuators.

Microscopes for viewing nanoscale devices

Monitoring these kinds of ultra-small nanoscale devices requires special microscopes, such as the scanning electron microscope (SEM), which images a sample by scanning it with a beam of electrons. (The Casimir device image above is an example of an SEM image.)

An SEM can produce very high-resolution images of a sample surface, revealing details less than 1 nanometer in size (the size of small biomolecules).

FEI has just announced the new Verios XHR SEM, which provides the sub-nanometer resolution and enhanced contrast needed for precise measurements in materials science and advanced semiconductor manufacturing applications.

An even higher-resolution microscope is the transmission electron microscope (TEM), with a resolution of 0.5 angstroms (0.05 nm). An example of a TEM image appears in today’s news item on graphene layers.

Another type of nanoscale microscope is the atomic force microscope (AFM). It has several advantages over the SEM’s 2D imaging, such as providing a 3D surface profile, but it also has disadvantages: it cannot capture large scanned areas, and it is very slow.

Nonetheless, AFMs are vital tools in nanotechnology, and nanoHUB.org has just announced a two-part, web-based course covering the principles and practice of atomic force microscopy.


FDA OK’s ingestible sensor chip

Chip tracks adherence to oral medications, can report to caregiver

Proteus Digital Health, Inc. announced Monday that the U.S. Food and Drug Administration (FDA) has cleared its ingestible sensor for marketing as a medical device.

The ingestible sensor (formally referred to as the Ingestion Event Marker or IEM) is part of the Proteus digital health feedback system, an integrated, end-to-end personal health management system designed to help improve patients’ health habits and connections to caregivers.

“The FDA validation represents a major milestone in digital medicine,” said Dr. Eric Topol, professor of genomics at The Scripps Research Institute and author of The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Healthcare. “Directly digitizing pills, for the first time, in conjunction with our wireless infrastructure, may prove to be the new standard for influencing medication adherence and significantly aid chronic disease management.”

The Proteus ingestible sensor can be integrated into inert pills or other ingested products, such as pharmaceuticals. Once the ingestible sensor reaches the stomach, it is powered by contact with stomach fluid and communicates a unique signal that determines the identity and timing of ingestion.

This information is transferred through the user’s body tissue to a patch worn on the skin that detects the signal and marks the precise time an ingestible sensor has been taken. Additional physiologic and behavioral metrics collected by the patch include heart rate, body position and activity.

The patch relays information to a mobile phone application. With the patient’s consent, the information is accessible by caregivers and clinicians, helping individuals to develop and sustain healthy habits, families to make better health choices, and clinicians to provide more effective, data-driven care.
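
As a sketch of the kind of record such a system might pass from patch to phone, here is a hypothetical Python data structure built only from the fields the article mentions; the names, types, and values are illustrative assumptions, not Proteus’s actual data format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical ingestion-event record relayed from the skin patch to the
# mobile app: the unique pill signal plus the physiologic metrics the
# article says the patch collects. Schema is an illustrative assumption.

@dataclass
class IngestionEvent:
    sensor_id: str           # unique signal identifying the ingested sensor
    ingested_at: datetime    # time the patch detected the signal
    heart_rate_bpm: int
    body_position: str       # e.g. "upright", "supine"
    activity_level: float    # arbitrary 0-1 scale
    shared_with: list = field(default_factory=list)  # caregivers, with consent

event = IngestionEvent(
    sensor_id="IEM-0001",
    ingested_at=datetime.now(timezone.utc),
    heart_rate_bpm=72,
    body_position="upright",
    activity_level=0.2,
)
event.shared_with.append("caregiver@example.com")  # only with patient consent
print(event)
```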



Monday, July 9, 2012

J. Craig Venter says, “…We view the genome as the software, or even the operating system, of the cell…”



Wednesday, July 4, 2012

CERN experiments observe particle consistent with long-sought Higgs boson

Both the ATLAS and CMS experiments at CERN observed a new particle in the mass region around 125-126 GeV, physicists announced at a seminar held at CERN today.

“We observe in our data clear signs of a new particle, at the level of 5 sigma, in the mass region around 126 GeV. The outstanding performance of the LHC and ATLAS and the huge efforts of many people have brought us to this exciting stage,” said ATLAS experiment spokesperson Fabiola Gianotti, “but a little more time is needed to prepare these results for publication.”

“The results are preliminary but the 5 sigma signal at around 125 GeV we’re seeing is dramatic. This is indeed a new particle. We know it must be a boson and it’s the heaviest boson ever found,” said CMS experiment spokesperson Joe Incandela. “The implications are very significant and it is precisely for this reason that we must be extremely diligent in all of our studies and cross-checks.”
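
For readers wondering what “5 sigma” means: it is the particle-physics convention for a discovery, the one-sided probability that a background fluctuation alone would produce an excess at least this large, assuming Gaussian statistics. A few lines of Python make the numbers concrete.

```python
import math

# One-sided Gaussian tail probability for an n-sigma excess.
def one_sided_p(sigma):
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (3, 5):
    print(f"{s} sigma -> p = {one_sided_p(s):.2e}")
# 3 sigma -> p = 1.35e-03  ("evidence")
# 5 sigma -> p = 2.87e-07  ("discovery", roughly 1 in 3.5 million)
```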

“It’s hard not to get excited by these results,” said CERN Research Director Sergio Bertolucci. “We stated last year that in 2012 we would either find a new Higgs-like particle or exclude the existence of the Standard Model Higgs. With all the necessary caution, it looks to me that we are at a branching point: the observation of this new particle indicates the path for the future towards a more detailed understanding of what we’re seeing in the data.”

The results presented today are labelled preliminary. They are based on data collected in 2011 and 2012, with the 2012 data still under analysis. Publication of the analyses shown today is expected around the end of July. A more complete picture of today’s observations will emerge later this year after the LHC provides the experiments with more data.

The next step will be to determine the precise nature of the particle and its significance for our understanding of the universe. Are its properties as expected for the long-sought Higgs boson, the final missing ingredient in the Standard Model of particle physics? Or is it something more exotic? The Standard Model describes the fundamental particles from which we, and every visible thing in the universe, are made, and the forces acting between them. All the matter that we can see, however, appears to be no more than about 4% of the total. A more exotic version of the Higgs particle could be a bridge to understanding the 96% of the universe that remains obscure.

“We have reached a milestone in our understanding of nature,” said CERN Director General Rolf Heuer. “The discovery of a particle consistent with the Higgs boson opens the way to more detailed studies, requiring larger statistics, which will pin down the new particle’s properties, and is likely to shed light on other mysteries of our universe.”

Positive identification of the new particle’s characteristics will take considerable time and data. But whatever form the Higgs particle takes, our knowledge of the fundamental structure of matter is about to take a major step forward.
