2011-08-10

New Enterprise Associates, Lithium-Ion Battery, Leyden Energy, Battery Battery developer Leyden Energy recharges with $20M

Battery technology developers continue to attract venture funding.

Among them is Leyden Energy Inc. (Fremont, Calif.), which recently announced Series B funding totaling $20 million. The round was led by New Enterprise Associates (Menlo Park, Calif.) along with current investors Lightspeed Ventures, Sigma Partners and Walden Capital.

Leyden said it will use its latest infusion of capital to add manufacturing capacity for its next-generation lithium-ion batteries as well as for future development of advanced battery technologies. The production boost is tied to growing demand from customers in the "smaller consumer electronics sector," the company said.

Leyden's core technology includes thermal properties that address shortfalls in Li-ion battery performance at high temperatures. Chemical reactions in batteries speed up at high temperature, degrading performance and reducing the number of charging cycles. The company said it is offering a three-year warranty on its Li-ion batteries as compared to a standard one-year warranty.

“Market demand for high-performance, long-lasting batteries in consumer electronics is growing," Leyden Energy CEO Aakar Patel said in a statement.

The company also said Ron Bernal of New Enterprise Associates will  join its board of directors. “What [Leyden] has introduced is really an energy storage platform that can be applied to a number of different product markets in order to increase the value that those applications bring to end customers," Bernal said in a statement.

Along with consumer electronics, Leyden Energy is also targeting the electric vehicle, smart grid and backup storage markets.

Study builds on plausible scenario for origin of life on Earth

The study, "A Route to Enantiopure RNA Precursors from Nearly Racemic Starting Materials," shows how the precursors to RNA could have formed on Earth before any life existed. It was authored by Jason E. Hein, Eric Tse and Donna G. Blackmond, a team of researchers with the Scripps Research Institute. Hein is now a chemistry professor with UC Merced.

Biological molecules, such as RNA and proteins, can exist in either a natural or unnatural form, called enantiomers. By studying the chemical reactions carefully, the research team found that it was possible to generate only the natural form of the necessary RNA precursors by including simple amino acids.

"These amino acids changed how the reactions work and allowed only the naturally occurring RNA precursors to be generated in a stable form," said Hein. "In the end, we showed that an amazingly simple result emerged from some very complex and interconnected chemistry."

The natural enantiomer of the RNA precursor molecules formed a crystal structure visible to the naked eye. The crystals are stable and avoid normal chemical breakdown. They can exist until the conditions are right for them to change into RNA.

The study was led by Blackmond and builds on the work of John D. Sutherland and Matthew W. Powner published in 2009 and covered by outlets such as The New York Times and Wired. Sutherland is a chemist with Cambridge's Medical Research Council Laboratory of Molecular Biology. Powner is a post-doctoral scholar with Harvard University.



Flash Summit, Yoram Cedar, 3-D Flash, Resistive RAM, Flash, EUV, SanDisk, Lithography, Memory EUV delay will slow NAND supply growth

SANTA CLARA, Calif. – Delays delivering next-generation lithography will slow the growth in supply of NAND flash, said the chief technologist of SanDisk in a keynote address at the Flash Memory Summit here.

In an otherwise upbeat assessment of the outlook for the flash market, Yoram Cedar waved a yellow flag about delays fielding extreme ultraviolet lithography. The lack of EUV tools means the historical increases in flash supply and decreases in cost will be more moderate with future process technologies, he said.

Existing immersion lithography tools will serve flash makers down to geometries of less than 10nm, two generations from today's processes, he said. In addition, vendors are working to create 3-D stacks of NAND strings using existing fab tools to further boost capacity and supply, he added.

Further in the future, chip makers including SanDisk are developing 3-D structures that use changes in resistance to create denser chips. But the so-called resistive RAM will require EUV tools, he said.

Cedar declined to give any specifics about the timeframe for EUV or the status of the current 3-D chip research. However, he did say chip makers expect to ship 64 and 128 Gbit flash devices using immersion tools.

"Many people in the semiconductor industry are very concerned about EUV not only from the standpoint of its availability but also its cost--these things will cost many millions of dollars," said one audience member in question to Cedar after the keynote.

Some pre-production EUV tools reportedly began shipping in January. Costs for the tools could soar as high as $120 million, according to some reports.

Cedar expressed optimism that EUV systems will be affordable. He also noted historical fears of an end to Moore's Law have so far been unfounded.

"When we were at 90nm, we thought 56nm was difficult and may be the end of the game," he said.

The good news is flash demand is broad and strong. Flash is expected to grow 25 percent on a compound basis through 2015, nearly double the rate of hard disk storage and far above DRAM at only one percent, he said.

About a third of all NAND bits will go to smartphones by 2015 when as many as 1.1 billion units ship, Cedar said. Tablets will take another 15 percent of NAND bits for 327 million systems that year, he added.

"Tablets represent a sizeable market that came from nowhere," he said. "There is so much new development here that wasn’t forecast three or four years ago, and there's no reason this will not continue," he added.

He projected solid-state drives will consume 25 percent of NAND bits, selling into 133 million units for clients and 12 million for servers. The rest of NAND supply, about 26 percent, will go into existing systems such as MP3 players, USB drives and digital cameras, he said.

Researchers are working in parallel on 3-D flash structures using current immersion lithography (left) and extreme ultraviolet technology.



Solar flares: What does it take to be X-class? Sun emits an X-Class flare on August 9, 2011

The biggest flares are known as "X-class flares" based on a classification system that divides solar flares according to their strength. The smallest ones are A-class (near background levels), followed by B, C, M and X. Similar to the Richter scale for earthquakes, each letter represents a 10-fold increase in energy output. So an X is ten times an M and 100 times a C. Within each letter class there is a finer scale from 1 to 9.
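
As a rough illustration of that scale, the sketch below maps a flare classification string to an approximate peak soft X-ray flux. The flux values per letter follow the commonly used NOAA GOES convention (roughly 10^-4 W/m² for the X class); they are background knowledge rather than figures given in this article.

```python
# Minimal sketch: convert a flare class string (e.g. "X6.9") to an
# approximate peak soft X-ray flux in W/m^2. The base flux per letter
# follows the usual GOES convention (A = 1e-8 W/m^2 ... X = 1e-4 W/m^2);
# this mapping is an assumption from general solar-physics practice,
# not a figure taken from the article.

BASE_FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

def flare_flux(classification: str) -> float:
    """Return the approximate peak flux (W/m^2) for a class like 'M2.5' or 'X6.9'."""
    letter, multiplier = classification[0].upper(), float(classification[1:])
    return BASE_FLUX[letter] * multiplier

if __name__ == "__main__":
    for label in ("C1", "M1", "X1", "X6.9", "X28"):
        print(f"{label:>5}: {flare_flux(label):.1e} W/m^2")
    # Each step up a letter is a 10-fold jump, so an X1 is 10x an M1 and 100x a C1.
    print("X1 / M1 =", flare_flux("X1") / flare_flux("M1"))
```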

C-class and smaller flares are too weak to noticeably affect Earth. M-class flares can cause brief radio blackouts at the poles and minor radiation storms that might endanger astronauts.

And then come the X-class flares. Although X is the last letter, there are flares more than 10 times the power of an X1, so X-class flares can go higher than 9. The most powerful flare measured with modern methods was in 2003, during the last solar maximum, and it was so powerful that it overloaded the sensors measuring it. The sensors cut out at X28.

The biggest X-class flares are by far the largest explosions in the solar system and are awesome to watch. Loops tens of times the size of Earth leap up off the sun's surface when the sun's magnetic fields cross over each other and reconnect. In the biggest events, this reconnection process can produce as much energy as a billion hydrogen bombs.

If they're directed at Earth, such flares and associated CMEs can create long lasting radiation storms that can harm satellites, communications systems, and even ground-based technologies and power grids. X-class flares on December 5 and December 6, 2006, for example, triggered a CME that interfered with GPS signals being sent to ground-based receivers.

NASA and NOAA -- as well as the US Air Force Weather Agency (AFWA) and others -- keep a constant watch on the sun to monitor for X-class flares and their associated magnetic storms. With advance warning many satellites and spacecraft can be protected from the worst effects.

On August 9, 2011 at 3:48 a.m. EDT, the sun emitted an Earth-directed X6.9 flare, as measured by the NOAA GOES satellite. These gigantic bursts of radiation cannot pass through Earth's atmosphere to harm humans on the ground; however, they can disturb the atmosphere and disrupt GPS and communications signals. In this case, it appears the flare was strong enough to potentially cause some radio communication blackouts. It also produced increased solar energetic proton radiation -- enough to affect humans in space if they do not protect themselves.

There was also a coronal mass ejection (CME) associated with this flare. CMEs are another solar phenomenon that can send solar particles into space and affect electronic systems in satellites and on Earth. However, this CME is not traveling toward Earth, so no Earth-bound effects are expected.



Japan's Tohoku tsunami created icebergs in Antarctica

Kelly Brunt, a cryosphere specialist at Goddard Space Flight Center, Greenbelt, Md., and colleagues were able to link the calving of icebergs from the Sulzberger Ice Shelf in Antarctica to the Tohoku tsunami, which originated with an earthquake off the coast of Japan in March 2011. The finding, detailed in a paper published online in the Journal of Glaciology, marks the first direct observation of such a connection between tsunamis and icebergs.

The birth of an iceberg can come about in any number of ways. Often, scientists will see the towering, frozen monoliths break into the polar seas and work backwards to figure out the cause.

So when the Tohoku Tsunami was triggered in the Pacific Ocean on March 11 this spring, Brunt and colleagues immediately looked south. All the way south. Using multiple satellite images, Brunt, Emile Okal at Northwestern University and Douglas MacAyeal at University of Chicago were able to observe new icebergs floating off to sea shortly after the sea swell of the tsunami reached Antarctica.

To put the dynamics of this event in perspective: An earthquake off the coast of Japan caused massive waves to explode out from its epicenter. Swells of water swarmed toward an ice shelf in Antarctica, 8,000 miles (13,600 km) away, and about 18 hours after the earthquake occurred, those waves broke off several chunks of ice that together equaled about two times the surface area of Manhattan. According to historical records, this particular piece of ice hadn't budged in at least 46 years before the tsunami came along.
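
As a back-of-the-envelope check on those figures, the distance and delay quoted above imply an average propagation speed in the range typical of open-ocean tsunamis. A minimal sketch using only the article's own numbers:

```python
# Back-of-envelope: average speed implied by the article's figures of
# roughly 8,000 miles (13,600 km) traveled in about 18 hours. The two
# quoted distance figures differ slightly, so both are shown.

HOURS = 18.0
for label, distance_km in (("13,600 km", 13_600.0), ("8,000 mi", 8_000.0 * 1.609)):
    speed = distance_km / HOURS  # average km/h over the whole transit
    print(f"{label}: ~{speed:.0f} km/h (~{speed / 1.609:.0f} mph) average")
```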

And as all that was happening, scientists were able to watch the Antarctic ice shelves in as close to real-time as satellite imagery allows, and catch a glimpse of a new iceberg floating off into the Ross Sea.

"In the past we've had calving events where we've looked for the source. It's a reverse scenario -- we see a calving and we go looking for a source," Brunt said. "We knew right away this was one of the biggest events in recent history -- we knew there would be enough swell. And this time we had a source."

Scientists first speculated in the 1970s that repeated flexing of an ice shelf -- a floating extension of a glacier or ice sheet that sits on land -- by waves could cause icebergs to break off. Scientific papers in more recent years have used models and tide gauge measurements in an attempt to quantify the impact of sea swell on ice shelf fronts.

The swell was likely only about a foot high (30 cm) when it reached the Sulzberger shelf. But the consistency of the waves created enough stress to cause the calving. This particular stretch of floating ice shelf is about 260 feet (80 meters) thick, from its exposed surface to its submerged base.

When the earthquake happened, Okal immediately homed in on the vulnerable faces of the Antarctic continent. Using knowledge of iceberg calving and what a NOAA model showed of the tsunami's projected path across the unobstructed Pacific and Southern oceans, Okal, Brunt and MacAyeal began looking at the Sulzberger Ice Shelf, which faces Sulzberger Bay and New Zealand.

Through a fortuitous break in heavy cloud cover, Brunt spotted what appeared to be a new iceberg in Moderate Resolution Imaging Spectroradiometer (MODIS) data.

"I didn't have strong expectations either way whether we'd be able to see something," Brunt said. "The fastest imagery I could get to was from MODIS Rapid Response, but it was pretty cloudy. So I was more pessimistic that it would be too cloudy and we couldn't see anything. Then, there was literally one image where the clouds cleared, and you could see a calving event."

A closer look at synthetic aperture radar data from the European Space Agency's Envisat satellite, which can penetrate clouds, revealed two moderate-sized icebergs -- with more, smaller bergs in their wake. The largest iceberg was about four by six miles -- roughly the surface area of one Manhattan. Together, all the ice equaled about two Manhattans. After looking at historical satellite imagery, the group determined the small outcropping of ice had been there since at least 1965, when it was captured by USGS aerial photography.

The proof that seismic activity can cause Antarctic iceberg calving might shed some light on our knowledge of past events, Okal said.

"In September 1868, Chilean naval officers reported an unseasonal presence of large icebergs in the southernmost Pacific Ocean, and it was later speculated that they may have calved during the great Arica earthquake and tsunami a month earlier," Okal said. "We know now that this is a most probable scenario."

MacAyeal said the event is more proof of the interconnectedness of Earth systems.

"This is an example not only of the way in which events are connected across great ranges of oceanic distance, but also how events in one kind of Earth system, i.e., the plate tectonic system, can connect with another kind of seemingly unrelated event: the calving of icebergs from Antarctica's ice sheet," MacAyeal said.

In what could be one of the more lasting observations from this whole event, the bay in front of the Sulzberger shelf was largely lacking sea ice at the time of the tsunami. Sea ice is thought to help dampen swells that might cause this kind of calving. At the time of the Sumatra tsunami in 2004, the potentially vulnerable Antarctic fronts were buffered by a lot of sea ice, Brunt said, and scientists observed no calving events that they could tie to that tsunami.

"There are theories that sea ice can protect from calving. There was no sea ice in this case," Brunt said. "It's a big chunk of ice that calved because of an earthquake 13,000 kilometers away. I think it's pretty cool."



Polar dinosaur tracks open new trail to past

The discovery, reported in the journal Alcheringa, is the largest and best collection of polar dinosaur tracks ever found in the Southern Hemisphere.

"These tracks provide us with a direct indicator of how these dinosaurs were interacting with the polar ecosystems, during an important time in geological history," says Emory paleontologist Anthony Martin, who led the research. Martin is an expert in trace fossils, which include tracks, trails, burrows, cocoons and nests.

The three-toed tracks are preserved on two sandstone blocks from the Early Cretaceous Period. They appear to belong to three different sizes of small theropods -- a group of bipedal, mostly carnivorous dinosaurs whose descendants include modern birds.

The research team also included Thomas Rich, from the Museum Victoria; Michael Hall and Patricia Vickers-Rich, both from the School of Geosciences at Monash University in Victoria; and Gonzalo Vazquez-Prokopec, an ecologist and expert in spatial analysis from Emory's Department of Environmental Studies.

The tracks were found on the rocky shoreline of remote Milanesia Beach, in Otways National Park. This area, west of Melbourne, is known for energetic surf and rugged coastal cliffs, consisting of layers of sediment accumulated over millions of years. Riddled with fractures and pounded by waves and wind, the cliffs occasionally shed large chunks of rock, such as those containing the dinosaur tracks.

One sandstone block has about 15 tracks, including three consecutive footprints made by the smallest of the theropods, estimated to be the size of a chicken. Martin spotted this first known dinosaur trackway of Victoria last June 14, around noon. He was on the lookout, since he had earlier noticed ripple marks and trace fossils of what looked like insect burrows in piles of fallen rock.

"The ripples and burrows indicate a floodplain, which is the most likely area to find polar dinosaur tracks," Martin explains. The second block containing tracks was spotted about three hours later by Greg Denney, a local volunteer who accompanied Martin and Rich on that day's expedition. That block had similar characteristics to the first one, and included eight tracks. The tracks show what appear to be theropods ranging in size from a chicken to a large crane.

"We believe that the two blocks were from the same rock layer, and the same surface, that the dinosaurs were walking on," Martin says.

The small, medium and large tracks may have been made by three different species, Martin says. "They could also belong to two genders and a juvenile of one species -- a little dinosaur family -- but that's purely speculative," he adds.

The Victoria Coast marks the seam where Australia was once joined to Antarctica. During that era, about 115-105 million years ago, the dinosaurs roamed in prolonged polar darkness. Earth's average temperature was 68 degrees Fahrenheit -- just 10 degrees warmer than today -- and the spring thaws would cause torrential flooding in the river valleys.

The dinosaur tracks were probably made during the summer, Martin says. "The ground would have been frozen in the winter, and in order for the waters to subside so that animals could walk across the floodplain, it would have to be later in the season," he explains.

Lower Cretaceous strata of Victoria have yielded the best-documented assemblage of polar dinosaur bones in the world. Few dinosaur tracks, however, have been found.

In February 2006, Martin found the first known carnivorous dinosaur track in Victoria, at a coastal site known as Dinosaur Dreaming.

In May 2006, during a hike to another remote site near Milanesia Beach, he discovered the first trace fossil of a dinosaur burrow in Australia. That find came on the heels of Martin's co-discovery of the first known dinosaur burrow and burrowing dinosaur, in Montana. The two discoveries suggest that burrowing behaviors were shared by dinosaurs of different species, in different hemispheres, and spanned millions of years during the Cretaceous Period.



Hybrid solar system makes rooftop hydrogen

ScienceDaily (Aug. 9, 2011) — While roofs across the world sport photovoltaic solar panels to convert sunlight into electricity, a Duke University engineer believes a novel hybrid system can wring even more useful energy out of the sun's rays.

Instead of systems based on standard solar panels, Duke engineer Nico Hotz proposes a hybrid option in which sunlight heats a combination of water and methanol in a maze of glass tubes on a rooftop. After two catalytic reactions, the system produces hydrogen much more efficiently than current technology without significant impurities. The resulting hydrogen can be stored and used on demand in fuel cells.

For his analysis, Hotz compared the hybrid system to three different technologies in terms of their exergetic performance. Exergy is a way of describing how much of a given quantity of energy can theoretically be converted to useful work.

"The hybrid system achieved exergetic efficiencies of 28.5 percent in the summer and 18.5 percent in the winter, compared to 5 to 15 percent for the conventional systems in the summer, and 2.5 to 5 percent in the winter," said Hotz, assistant professor of mechanical engineering and materials science at Duke's Pratt School of Engineering.

The paper describing the results of Hotz's analysis was named the top paper during the ASME Energy Sustainability Fuel Cell 2011 conference in Washington, D.C. Hotz recently joined the Duke faculty after completing post-graduate work at the University of California-Berkeley, where he analyzed a model of the new system. He is currently constructing one of the systems at Duke to test whether or not the theoretical efficiencies are borne out experimentally.

Hotz's comparisons took place during the months of July and February in order to measure each system's performance during summer and winter months.

Like other solar-based systems, the hybrid system begins with the collection of sunlight. Then things get different. While the hybrid device might look like a traditional solar collector from a distance, it is actually a series of copper tubes coated with a thin layer of aluminum and aluminum oxide and partly filled with catalytic nanoparticles. A combination of water and methanol flows through the tubes, which are sealed in a vacuum.

"This set-up allows up to 95 percent of the sunlight to be absorbed with very little being lost as heat to the surroundings," Hotz said. "This is crucial because it permits us to achieve temperatures of well over 200 degrees Celsius within the tubes. By comparison, a standard solar collector can only heat water between 60 and 70 degrees Celsius."

Once the evaporated liquid achieves these higher temperatures, tiny amounts of a catalyst are added, which produces hydrogen. This combination of high temperature and added catalysts produces hydrogen very efficiently, Hotz said. The resulting hydrogen can then be immediately directed to a fuel cell to provide electricity to a building during the day, or compressed and stored in a tank to provide power later.

The three systems examined in the analysis were the standard photovoltaic cell which converts sunlight directly into electricity to then split water electrolytically into hydrogen and oxygen; a photocatalytic system producing hydrogen similar to Hotz's system, but simpler and not mature yet; and a system in which photovoltaic cells turn sunlight into electricity which is then stored in different types of batteries (with lithium ion being the most efficient).

"We performed a cost analysis and found that the hybrid solar-methanol is the least expensive solution, considering the total installation costs of $7,900 if designed to fulfill the requirements in summer, although this is still much more expensive than a conventional fossil fuel-fed generator," Hotz said.

Costs and efficiencies of systems can vary widely depending on location -- since the roof-mounted collectors that could provide all the building's needs in summer might not be enough for winter. A rooftop system large enough to supply all of a winter's electrical needs would produce more energy than needed in summer, so the owner could decide to shut down portions of the rooftop structure or, if possible, sell excess energy back to the grid.

"The installation costs per year including the fuel costs, and the price per amount of electricity produced, however showed that the (hybrid) solar scenarios can compete with the fossil fuel-based system to some degree," Hotz said. 'In summer, the first and third scenarios, as well as the hybrid system, are cheaper than a propane- or diesel-combusting generator."

This could be an important consideration, especially if a structure is to be located in a remote area where traditional forms of energy would be too difficult or expensive to obtain.

Hotz's research was supported by the Swiss National Science Fund. Joining him in the study were UC-Berkeley's Heng Pan and Costas Grigoropoulos, as well as Seung H. Ko of the Korea Advanced Institute of Science and Technology, Daejon.



Renesas, Glucose, Kit Renesas offers glucose meter demo kit

MANHASSET, NY -- Renesas Electronics America has made available its Continua demonstration platform, which the company says achieves a level of power efficiency that ensures long battery life.

The platform is based on the Renesas Electronics V850 microcontroller with certified Continua Blood Glucose agent software.

V850 MCUs, at about 350uA/DMIPS with up to 1 MB of flash memory and integrated USB function support, make it possible to handle the added software load without sacrificing the long battery life that end users have come to expect, according to Renesas.

By eliminating the error-prone process of manually collecting measurement data, Continua blood glucose meters as defined by the Continua Health Alliance help improve the quality of medical data and diagnosis.

The challenge in implementing these devices is that collecting and communicating the data adds a significant software burden to existing medical products. This software burden could become a human burden if the people who use these devices every day have to replace batteries frequently.

As an example, Renesas Electronics' 32-bit V850ES/Jx3-L MCUs relieve this burden with an extraordinarily high energy efficiency — 1.9 DMIPS/MHz — which enables the V850 devices to process more work than other MCUs at a lower clock frequency.
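
A minimal sketch of the kind of battery-life arithmetic those figures imply, using only the 350 µA/DMIPS and 1.9 DMIPS/MHz numbers quoted above; the workload and battery capacity below are hypothetical values chosen for illustration, not Renesas specifications.

```python
# Back-of-envelope battery-life estimate from the quoted efficiency figures.
# 350 uA/DMIPS and 1.9 DMIPS/MHz come from the article; the workload and
# coin-cell capacity below are hypothetical illustration values.

UA_PER_DMIPS = 350.0       # quoted active-current efficiency
DMIPS_PER_MHZ = 1.9        # quoted compute efficiency

workload_dmips = 2.0       # hypothetical average processing load
battery_mah = 230.0        # hypothetical coin-cell capacity

active_current_ma = workload_dmips * UA_PER_DMIPS / 1000.0   # mA drawn while active
required_mhz = workload_dmips / DMIPS_PER_MHZ                # clock needed for the load

hours = battery_mah / active_current_ma
print(f"~{required_mhz:.1f} MHz needed, ~{active_current_ma:.2f} mA active draw")
print(f"-> roughly {hours:.0f} hours ({hours / 24:.0f} days) of continuous operation")
```

In practice a meter would duty-cycle the MCU and sleep between measurements, so real battery life would be far longer; the point of the sketch is only how the per-DMIPS current figure feeds into such an estimate.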

The demonstration platform implements a set of guidelines for blood glucose meters developed by the Continua Health Alliance — a non-profit industry organization dedicated to establishing connected and interoperable personal health solutions.

Alliance guidelines utilize key industry standards to ensure an end-to-end architecture from personal medical devices to hospital information systems. One such standard is ISO/IEEE 11073, which defines a medical device, the measurements that it makes, and a protocol for communicating the measurement data. For the underlying communication transport method, the Continua guidelines specify open standards such as USB, Bluetooth  and ZigBee.

While the Continua demonstration platform uses USB, the V850ES/Jx3-L MCUs can also support Bluetooth and ZigBee functionality.

The Renesas Electronics V850 Continua demonstration platform runs software from Lamprey Networks Inc., developer of the reference code for the Continua Health Alliance. The software used in the platform has been certified by the Continua Health Alliance and is portable across multiple Renesas Electronics MCUs. Because the software is targeted to a specific type of medical device, the software footprint can be optimized.

The Continua Demonstration Platform for Renesas Electronics V850 MCUs is available now.

New eruption discovered at undersea volcano, after successfully forecasting the event

What makes the event so intriguing is that the scientists had forecast the eruption starting five years ago -- the first successful forecast of an undersea volcano.

Bill Chadwick, an Oregon State University geologist, and Scott Nooner, of Columbia University, have been monitoring Axial Seamount for more than a decade, and in 2006 published a paper in the Journal of Volcanology and Geothermal Research in which they forecast that Axial would erupt before the year 2014. Their forecast was based on a series of seafloor pressure measurements that indicated the volcano was inflating.

"Volcanoes are notoriously difficult to forecast and much less is known about undersea volcanoes than those on land, so the ability to monitor Axial Seamount, and determine that it was on a path toward an impending eruption is pretty exciting," said Chadwick, who was chief scientist on the recent expedition, which was jointly funded by the National Oceanic and Atmospheric Administration and the National Science Foundation.

Axial last erupted in 1998 and Chadwick, Nooner and colleagues have monitored it ever since. They used precise bottom pressure sensors -- the same instruments used to detect tsunamis in the deep ocean -- to measure vertical movements of the floor of the caldera much like scientists would use GPS on land to measure movements of the ground. They discovered that the volcano was gradually inflating at the rate of 15 centimeters (six inches) a year, indicating that magma was rising and accumulating under the volcano summit.

When Axial erupted in 1998, the floor of the caldera suddenly subsided or deflated by 3.2 meters (10.5 feet) as magma was removed from underground to erupt at the surface. The scientists estimated that the volcano would be ready to erupt again when re-inflation pushed the caldera floor back up to its 1998 level.

"Forecasting the eruption of most land volcanoes is normally very difficult at best and the behavior of most is complex and variable," said Nooner, who is affiliated with the Lamont-Doherty Earth Observatory. "We now have evidence, however, that Axial Seamount behaves in a more predictable way than many other volcanoes -- likely due to its robust magma supply coupled with its thin crust, and its location on a mid-ocean ridge spreading center.

"It is now the only volcano on the seafloor whose surface deformation has been continuously monitored throughout an entire eruption cycle," Nooner added.

The discovery of the new eruption came on July 28, when Chadwick, Nooner and University of Washington colleagues Dave Butterfield and Marvin Lilley led an expedition to Axial aboard the R/V Atlantis, operated by the Woods Hole Oceanographic Institution. Using Jason, a remotely operated robotic vehicle (ROV), they discovered a new lava flow on the seafloor that was not present a year ago.

"It's funny," Chadwick said. "When we first arrived on the seafloor, we thought we were in the wrong place because it looked so completely different. We couldn't find our markers or monitoring instruments or other distinctive features on the bottom. Once we figured out that an eruption had happened, we were pretty excited.

"When eruptions like this occur, a huge amount of heat comes out of the seafloor, the chemistry of seafloor hot springs is changed, and pre-existing vent biological communities are destroyed and new ones form," Chadwick added. "Some species are only found right after eruptions, so it is a unique opportunity to study them."

The first Jason ROV dive of the expedition targeted a field of "black smoker" hot springs on the western side of the caldera, beyond the reach of the new lava flows. Butterfield has been tracking the chemistry and microbiology of hot springs around the caldera since the 1998 eruption.

"The hot springs on the west side did not appear to be significantly disturbed, but the seawater within the caldera was much murkier than usual," Butterfield said, "and that meant something unusual was happening. When we saw the 'Snowblower' vents blasting out huge volumes of white floc and cloudy water on the next ROV dive, it was clear that the after-effects of the eruption were still going strong. This increased output seems to be associated with cooling of the lava flows and may last for a few months or up to a year."

The scientists will examine the chemistry of the vent water and work with Julie Huber of the Marine Biological Laboratory to analyze DNA and RNA of the microbes in the samples.

The scientists recovered seafloor instruments, including two bottom pressure recorders and two ocean-bottom hydrophones, which showed that the eruption took place on April 6 of this year. A third hydrophone was found buried in the new lava flows.

"So far, it is hard to tell the full scope of the eruption because we discovered it near the end of the expedition," said Chadwick, who works out of OSU's Hatfield Marine Science Center in Newport. "But it looks like it might be at least three times bigger than the 1998 eruption."

The lava flow from the 2011 eruptions was at least two kilometers (1.2 miles) wide, the scientists noted.

"Five years ago, these scientists forecast this eruption, which has resulted in millions of square meters of new lava flows on the seafloor," said Barbara Ransom, program director in the National Science Foundation's Division of Ocean Sciences. "The technological advances that allow this research to happen will lead to a new understanding of submarine volcanoes, and of any related hazards."

The bottom-anchored instruments documented hundreds of tiny earthquakes during the volcanic eruption, but land-based seismic monitors and the Sound Surveillance System (SOSUS) hydrophone array operated by the U.S. Navy only detected a handful of them on the day of the eruption because many components of the hydrophone system are offline.

"Because the earthquakes detected back in April at a distance from the volcano were so few and relatively small, we did not believe there was an eruption," said Bob Dziak, an OSU marine geologist who monitors the SOSUS array. "That is why discovering the eruption at sea last week was such a surprise." Both Dziak and Chadwick are affiliated with the Cooperative Institute for Marine Resource Studies -- a joint NOAA/Oregon State University institute.

This latest Axial eruption caused the caldera floor to subside by more than two meters (six feet). The scientists will be measuring the rate of magma inflation over the next few years to see if they can successfully forecast the next event.

"The acid test in science -- whether or not you understand a process in nature -- is to try to predict what will happen based on your observations," Chadwick said. "We have done this and it is extremely satisfying that we were successful. Now we can build on that knowledge and look to apply it to other undersea volcanoes -- and perhaps even volcanoes on land."



Solid-State Drives, Serial ATA, PCI Express, Flash Drives, SATA-IO, SSDs Serial ATA, PCIe converge for flash drives

SAN JOSE, Calif. – The serial ATA interconnect will take a ride on PCI Express to support data rates of 8 and 16 Gbits/second to serve the accelerating needs of solid-state and hybrid drives. The move is a sign of the rising proliferation of both flash drives and PCIe.

The Serial ATA International Organization (SATA-IO) will create a so-called SATA Express standard as part of its version 3.2 specifications expected out by the end of the year. The spec essentially ports serial ATA software to the PCI Express transport and defines new connectors needed for cards and drives that use it.

Many early flash drives adopted 3 Gbit/second SATA because it was fast enough, low cost and widely supported in PCs. But with the advent of new, faster NAND flash interfaces the SATA "host interface has become the bottleneck," said Mladen Luksic, president of SATA-IO and an interface engineer at hard drive maker Western Digital.

SATA Express will handle up to two lanes of PCI Express to deliver 8 Gbits/second when implemented with PCIe Gen 2 or 16 Gbits/s with PCIe Gen 3. The newly minted 8 GTransfers/s PCIe Gen 3 interface will begin shipping in volume in PC products over the next six months, said Amber Huffman, a technical lead at SATA-IO and a principal engineer in Intel's storage group.
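
The 8 and 16 Gbit/s figures follow directly from the per-lane rates and encoding overheads of the two PCIe generations; here is a minimal sketch of that arithmetic. The line rates and encoding schemes are standard PCIe parameters rather than details given in this article.

```python
# Sketch of where the SATA Express numbers come from: two PCIe lanes with
# the encoding overhead of each generation. The line rates and encodings
# (Gen 2: 5 GT/s with 8b/10b; Gen 3: 8 GT/s with 128b/130b) are standard
# PCIe parameters, not figures taken from the article.

GENERATIONS = {
    "PCIe Gen 2": {"gt_per_s": 5.0, "payload_bits": 8, "coded_bits": 10},
    "PCIe Gen 3": {"gt_per_s": 8.0, "payload_bits": 128, "coded_bits": 130},
}

LANES = 2  # SATA Express uses up to two lanes

for name, p in GENERATIONS.items():
    per_lane = p["gt_per_s"] * p["payload_bits"] / p["coded_bits"]  # usable Gbit/s per lane
    total = per_lane * LANES
    print(f"{name}: {per_lane:.2f} Gbit/s per lane x {LANES} lanes ~= {total:.1f} Gbit/s")
```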

Current SATA interfaces are usually implemented in PC chip sets and device SoCs using an embedded controller to support the Advanced Host Controller Interface. SATA Express will allow devices to tap directly into PCI Express links coming off chip interfaces and even some modern processors.

The low latency, particularly of the CPU links "makes it a very interesting interface for solid-state drives," said Huffman.

Flash drives are on the rise and increasingly adopting PCIe.

Objective Analysis (Los Gatos, Calif.) forecasts PCIe will become dominant in server SSDs in 2012, with unit shipments greater than the combined shipments of serial-attached SCSI (SAS) and Fibre Channel drives. By 2015, the market watcher predicts more than two million PCIe SSDs will ship, more than all of the SATA SSDs sold in 2010.

Serial ATA is used in the vast majority of notebook and desktop hard drives, but less than a third of server drives, territory owned by the more robust and higher cost SAS interface.

SAS specialists such as LSI Corp. are already showing 12 Gbit/s SAS chips, but industry efforts are also underway to port SAS to PCI Express. Specs for a so-called SCSI over PCI Express standard are not expected to be finished for about a year.

The existing 6 Gbit/s serial ATA interface adequately serves a wide range of desktop, notebook and consumer systems, said Luksic. The SATA-IO group will explore needs for those systems as they evolve in the future, he said.



2011-08-09

Freescale Semiconductor, Rich Beyer, White Box, Asia What’s in the white box for Freescale?



When asked recently about his company’s new multimedia processors, Freescale Semiconductor CEO Rich Beyer replied that Freescale “would like to work with white-box vendors in China” rather than, say, gun for design wins in the next Apple iPad, HP tablet or Dell netbook.

The response raised a few eyebrows. With Freescale no longer in the wireless baseband business, many observers assumed the company would need to score a high-profile design win to prove its multimedia processor’s chops to the world.

But Freescale is moving away from what it sees as the bloodbath caused by hyper-price-competitive tablets from companies like Dell and Hewlett-Packard. (Freescale barely survived the decline of its biggest customer, Motorola, several years back.)

Partnering with white-box vendors working on a broad range of smart mobile products—everything from personal media devices to automotive infotainment systems, e-readers and media tablets—is “attractive to Freescale,” Beyer says.

Click here to read the full story at EE Times Confidential.

MIT, Siggraph MIT's GelSight enhances 3-D imaging

MANHASSET, NY -- Researchers have created a simple, portable imaging system that combines a slab of transparent, synthetic rubber, a coat of paint containing tiny flecks of metal, and clever algorithms to achieve resolutions previously possible only with large and expensive lab equipment.

The device could enable a way to inspect products too large to fit under a microscope and could also have applications in medicine, forensics and biometrics.

GelSight  is a slab of transparent, synthetic rubber, one of whose sides is coated with a paint containing tiny flecks of metal. When pressed against the surface of an object, the paint-coated side of the slab deforms. Cameras mounted on the other side of the slab photograph the results, and computer-vision algorithms analyze the images.

A new, higher-resolution version of GelSight can register physical features less than a micrometer in depth and about two micrometers across. This compares to an earlier version presented in a 2009 paper at Siggraph which was sensitive enough to detect the raised ink patterns on a $20 bill.

GelSight grew out of a project to create tactile sensors that would give robots a sense of touch. But the researchers realized that their system provided much higher resolution than tactile sensing required.

The researchers shrank the flecks of metal in the paint and used a different lighting scheme than before, which in turn required a redesign of the computer-vision algorithm that measures surface features.

Traditionally, generating micrometer-scale images has required a large, expensive piece of equipment such as a confocal microscope or a white-light interferometer, which might take minutes or even hours to produce a 3-D image. Often, such a device has to be mounted on a vibration isolation table, which might consist of a granite slab held steady by shock absorbers.

In contrast, researchers Edward Adelson and Micah Kimo Johnson built a prototype sensor, about the size of a soda can, which produces 3-D images almost instantly.

With multiple cameras measuring the rubber’s deformation, the system can produce 3-D models of an object, which can be manipulated on a computer screen for examination from multiple angles.

Adelson and Johnson are in discussion with a major aerospace company and several manufacturers of industrial equipment, all of whom are interested in using GelSight to check the integrity of their products.

The technology has also drawn the interest of experts in criminal forensics, who think that it could provide a cheap, efficient way to identify the impressions that particular guns leave on the casings of spent shells.

The researchers work in MIT’s Department of Brain and Cognitive Sciences.

Papers delivered at Siggraph 2011 in Vancouver this week can be viewed here.

Gartner, Technologies, Hype Gartner expands technology 'hype' curve in 2011

MANHASSET, NY -- Market research firm Gartner has released the 2011 version of its "Hype Cycle for Semiconductors" report and has included six new technologies.

The renamed "Hype Cycle for Semiconductors and Electronics Technologies" includes the broader scope of technologies covered — such as displays, batteries, capacitors and wireless power

Specifically, the 2011 Hype Cycle includes the following six new technologies: quantum dot displays, cognitive radio, terahertz waves, MEMS displays, lithium iron phosphate batteries and 450-mm wafers.

Gartner senior analyst Jim Tully says these are important emerging technologies that should be tracked during the next few years.

Among the research's major findings are:

There were relatively few shifts in technologies during the past year. The prevailing economic conditions are no doubt partly responsible for this.

As a result, no technologies from last year's Hype Cycle have progressed beyond the Plateau of Productivity, and all remain in this year's Hype Cycle. No technologies have been pushed backward except for electronic paper, owing to a definition change that now includes color technologies as well.

Gartner has moved optical silicon, which has been in pre-peak position for the past several years, and floating body DRAM to join FPGA embedded in a SoC, micro fuel cells and phase change memory in the Trough of Disillusionment. (See figure)



Spansion, NOR, Flash, Memory, Semiconductor Spansion claims first 4-Gb NOR flash

SAN FRANCISCO—NOR flash memory vendor Spansion Inc. Tuesday (Aug. 9) announced what it said was the semiconductor industry's first single-die, 4-gigabit (Gb) NOR product implemented at the 65-nm node.

Spansion (Sunnyvale, Calif.) said the latest addition to its GL-S product line delivers high quality and fast read performance. The 4-Gb NOR device is sampling this month, Spansion said.

Based on Spansion's proprietary MirrorBit charge-trapping technology, the Spansion GL-S offers read performance up to 45 percent faster than competing NOR flash products, according to the company. The Spansion GL-S family is currently offered at 128-Mb through 2-Gb densities, and the company said it is gaining design-win momentum for the product family in consumer, automotive, gaming, telecom and industrial applications.


Scientists pioneer new method for nanoribbon production

The work, reported in Nature Materials, could pave the way for the production of nanomaterials for use in a new generation of computers and data storage devices that are faster, smaller and more powerful.

The Nottingham research group, led by Dr Andrei Khlobystov in the University's School of Chemistry, specialises in the chemistry of nanomaterials and has been studying carbon nanotubes as containers for molecules and atoms.

Carbon nanotubes are remarkable nanostructures with a typical diameter of 1-2 nanometres, which is 80,000 times smaller than the thickness of a human hair. Over the past few years, the researchers have discovered that physical and chemical properties of molecules inserted into carbon nanotubes are very different to the properties of free molecules. This presents a powerful mechanism for manipulating the molecules, harnessing their functional properties, such as magnetic or optical, and for controlling their chemical reactivity.

The latest study is a collaboration between Dr Khlobystov's chemical nanoscientists, theoretical chemists based in the University's School of Chemistry and electron microscopists from Ulm University in Germany.

Working together, they have demonstrated that carbon nanotubes can be used as nanoscale chemical reactors, and that chemical reactions involving carbon and sulphur atoms held within a nanotube lead to the formation of atomically thin strips of carbon, known as graphene nanoribbons, decorated with sulphur atoms around the edge.

Dr Khlobystov said: "Graphene nanoribbons possess a wealth of interesting physical properties making them more suitable for applications in electronic and spintronic devices than the parent material graphene -- the discovery of which attracted the Nobel Prize for Physics last year for University of Manchester scientists Professors Andre Geim and Konstantin Novoselov.

"Nanoribbons are very difficult to make but the Nottingham team's strategy of confining chemical reactions at the nanoscale sparks spontaneous formation of these remarkable structures. The team has also discovered that nanoribbons -- far from being simple flat and linear structures -- possess an unprecedented helical twist that changes over time, giving scientists a way of controlling physical properties of the nanoribbon, such as electrical conductivity."

Devices based on nanoribbons could potentially be used as nano-switches, nano-actuators and nano-transistors integrated in computers or data storage devices.



Research group shows iPhones cost less to support

iPhone 4


(PhysOrg.com) -- ClickFox, a firm that analyzes the customer experience of trying to solve problems with technology, has focused its attention on how much work and cost is involved in supporting and troubleshooting the three main kinds of smartphones: the iPhone, the BlackBerry and those running Google's Android OS. It found that iPhones are cheaper to support than BlackBerrys, and that Android phones are the most expensive of all.

ClickFox reached its conclusions by analyzing support data from North American carriers; after eliminating call data for questions about billing or queries about plan options, the company found that calls for assistance with iPhones were generally handled more expeditiously than those for Blackberry and even more so than for Android calls.

ClickFox, though not revealing exact figures, noted that the main difference between the types of support was the number of calls that had to be transferred to other support reps; that is, difficult problems often require the assistance of more than one support rep to be resolved. ClickFox says that the number of transfers for iPhone callers is fewer than for BlackBerry users, and far fewer than for Android users.

In an interview with InfoWorld, ClickFox analytics director Lauren Smith said that BlackBerry users cost carriers a total of $46 million more to support than iPhone users, while Android users cost $97 million more.

ClickFox suggests the disparity is due to the higher degree of difficulty in learning and using the Blackberry and Android phones versus the iPhone, resulting in confused users calling support lines only to find the reps oftentimes confused as well. ClickFox says that while iPhone users typically have their questions or problems resolved on the first call, Blackberry users find themselves transferred to another rep 37% of the time; and Android users get transferred a staggering 77% of the time.

This announcement by ClickFox comes at a bad time for Android users as reports from the recent DefCon Hacking conference in Las Vegas, suggest that the Android OS has a flaw in it that allows one app to change the focus of another app without user consent. Worse, the offending app can apparently also disable the Back button, preventing the user from going back to the original app. Security experts say such a flaw, in addition to being annoying, can allow a secondary app to masquerade as the first, setting up the user for a phishing attack.

© 2010 PhysOrg.com



UMC, foundry, semiconductor, sales, July UMC's July sales down 18% on 2010


LONDON – July sales for foundry United Microelectronics Corp. (UMC) were NT$8,809 million (about $300 million), down 4.1 percent from June and down 18.6 percent from UMC's sales in July 2010.

A fall was expected after UMC (Hsinchu, Taiwan) predicted sales would fall 10 to 12 percent in the third quarter compared with the second quarter. In a typical year, sales jump significantly in the third quarter as ICs go into consumer equipment for the winter buying season, which typically lasts from November to February.

UMC's year-to-date sales for the first seven months of 2011 are NT$65.08 billion (about $2.24 billion), 3.3 percent less than the equivalent figure in 2010. The year-on-year comparison has been deteriorating steadily since January, when UMC's sales were 10.75 percent ahead of the year before.

Rival foundry Taiwan Semiconductor Manufacturing Co. Ltd. (Hsinchu, Taiwan) is expected to post its July sales figures on Wednesday (Aug. 10).


Related links and articles:

UMC predicts declines in revenues, fab use

Taiwan comforted by Apple chip spend cycle

TSMC posts weak June, Q2 sales figures



Jim Hogan, EDA, Nimbic, semiconductor Jim Hogan joins board of cloud-EDA startup


LONDON – Nimbic Inc., a provider of software-as-a-service cloud computing for EDA, has announced that Jim Hogan, an EDA industry veteran, has joined its board of directors.

Nimbic (Mountain View, Calif.), founded as Physware in 2006, provides signal integrity, power integrity and electromagnetic interference design tools.

Hogan has worked in the semiconductor industry for more than 33 years, serving as a senior executive in electronic design automation, semiconductor intellectual property, semiconductor equipment, and fabrication companies. Hogan currently serves as chairman at Solido Design Automation and as director at Scoperta and Tela Innovations.

Hogan also has held a variety of executive and board positions at companies including: Altos Design Automation, Telos Venture Partners, Artisan Components and Cadence Design Systems Inc.

"Jim is a terrific addition to Nimbic's board of directors and is already bringing his industry expertise to bear as we launch our cloud solutions for EDA," said Raul Camposano, CEO of Nimbic, in a statement. Camposano is a former chief technology officer at Synopsys Inc. 

"In my opinion, Nimbic will be a company that establishes both innovative technology solutions as well as new and interesting business models to help world-class customers with their design challenges," said Hogan, in the same statement.


Related links and articles:

www.nimbic.com

News articles:


Nimbic launches cloud computing solution for EDA

Jim Hogan joins GateRocket advisory board

Layout optimization startup Tela buys Blaze DFM

Camposano appointed CEO at simulation startup



IMS, wireless, charging, market, smartphones, semiconductor IMS: Wireless charging market is on 85% CAGR


LONDON – The global market for wireless power components and accessories was $100 million in 2010 but will grow to be worth $4.5 billion, according to market research firm IMS Research Ltd. That is equivalent to a compound annual growth rate of 85.5 percent.
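
The 85.5 percent figure implies a growth horizon of roughly six years from the 2010 base; a minimal sketch of that arithmetic follows. The article does not state the target year, so the horizon is derived here from the quoted numbers.

```python
# Back out the time horizon implied by the quoted figures: $100M in 2010
# growing to $4.5B at an 85.5% compound annual growth rate. The article
# does not state the end year, so it is inferred here from the CAGR.

import math

start_value = 100e6     # 2010 market size quoted in the article
end_value = 4.5e9       # forecast market size quoted in the article
cagr = 0.855            # quoted compound annual growth rate

years = math.log(end_value / start_value) / math.log(1.0 + cagr)
print(f"Implied horizon: ~{years:.1f} years (i.e. roughly 2010 -> {2010 + round(years)})")
```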

Palm and Powermat are the pioneers in wireless charging, but their products are incompatible and both are being challenged by the Wireless Power Consortium and its Qi standard, the firm said.

IMS (Wellingborough, England) said it expects rivalries to focus around two implementations of wireless power. Powermat and WPC's Qi are in the "tightly coupled" camp that requires close proximity. Qualcomm and Witricity use magnetic resonance to transfer power over greater distances.

"Our forecast assumes that while competition will be fierce in the near term, a combination of market forces and industry alliances will coalesce to form a de facto standard for interoperability in the next several years," said Jason dePreaux, research manager at IMS Research, in a statement.


Related links and articles:


MediaTek goes to startup for charging tech

Wireless Power Consortium adds seven members

An introduction to the Wireless Power Consortium and TI solutions



Twitter, Facebook, BlackBerry Social media blamed as UK youths riot


Serious public disorder has broken out in several major cities of the United Kingdom with gangs of youths looting shops and attacking the police where they meet resistance, according to reports.

The modern tools of social networking, Facebook, Twitter and BlackBerry are being used by the young people to co-ordinate locations where they can meet, leaving the stretched UK police force scrambling to catch up.

The rioting began in the Tottenham district in the north of London on Saturday night but on Sunday night spread across the east and south of London in a series of copy-cat events as young people sought to challenge the authorities.  On Monday night violence and looting were reported in Birmingham, Bristol and Nottingham as well as London.

The spark for the initial violence was a peaceful gathering held outside a police station in Tottenham to protest at the fatal shooting of a man on Thursday. Since then, however, the violence and looting appear to have been opportunistic criminality among young people who have few opportunities for well-paid employment and who are pressured by declining welfare provision and the introduction of austerity measures.


Related links:

http://www.bbc.co.uk/news/uk-england-london-14450248

http://www.bbc.co.uk/news/uk-14454250

http://www.bbc.co.uk/news/uk-14456050




National Instruments, NIWeek 2011 National Instruments hails latest feats at NIWeek 2011

AUSTIN, Texas -- The elusive goal of developing nuclear fusion as a viable energy source has been a lifelong ambition of National Instruments' co-founder, at least over the 35 years of his company's existence.

NI president and CEO James Truchard, who co-founded the company in 1976 while working at The University of Texas at Austin, mentioned the elusive goal of sustainable energy from fusion in his opening remarks at what has become an annual "engineering lovefest," NIWeek. The latest edition, NIWeek 2011, was held here last week.
 
Since 1986, when Truchard and co-founder Jeff Kodosky invented NI's LabVIEW graphical development software, engineers and scientists have been exposed to the company's graphical system design environment as an intuitive way to develop customer-defined test, control and embedded design systems.

Both dates, 1976 and 1986, were marked at NIWeek 2011, held last week in 107-degree-Fahrenheit Austin, where some 3,300 people attended the "big tent" event at the Austin Convention Center. "You are in the hottest place for innovation held in one of the hottest cities," said Truchard in his keynote, before taking the crowd of enthusiastic NI engineers, scientists and followers on a condensed tour of the history of laboratory test and measurement equipment.

It was left to Shelley Gretlein, NI's director of software marketing, to provide the background for four modern application areas that in some ways address the 14 Grand Challenges for Engineering defined by the National Academy of Engineering.

Truchard was elected to membership in the National Academy of Engineering in 2007, widely considered the highest honor in the engineering profession. As such, he was able to help define and tackle the 14 Grand Challenges, and to have users of NI tools come up with partial solutions, including to the challenge of energy from fusion.

Gretlein eloquently presented some of the 14 challenges including:  a medical application from Santec; a civil infrastructure application with the Cockrell School of Engineering; a smart grid application with NEXTGen Consultancy; and energy from fusion with the University of Parma (Italy).

Watch the fusion application from NI Week 2011 and other NI presentations here.

The medical application was a Santec Corp. (Japan) portable optical coherence tomography (OCT) imaging system. OCT is a non-invasive imaging technique that relies on analyzing the frequency components of backscattered light from the internal structure of an object or tissue.

NI FlexRIO and FPGA technology were used to create the OCT system, which achieved a 4X speed increase and a dramatically smaller footprint compared with the company's previous rack-mounted OCT system. OCT provides much greater resolution than magnetic resonance imaging (MRI) or positron emission tomography (PET) and uses a low-power light source and the corresponding light reflections to create images, much as ultrasound does, only with light.

Another OCT project won NI's annual Graphical System Design Achievement Award, an NIWeek highlight. This year the contest was judged by a committee of NI technical experts who reviewed the papers and selected the finalists and winners. A total of 130 submissions were received from authors in 20 countries.
 
Kohji Ohbayashi, of Kitasato University, Graduate School of Medical Science, and his team of researchers created a 3-D OCT medical instrument that can detect cancer during medical checkups without requiring the patient to undergo a biopsy.

To achieve 3-D imaging capabilities, two FPGAs in the system computed more than 700,000 512-point fast Fourier transforms (FFTs) every second.
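Those figures imply a raw data rate of roughly 358 million samples per second. The NumPy sketch below is only an illustrative sanity check of that arithmetic and of a batched-FFT processing step; the array sizes and synthetic data are assumptions, not details of the Kitasato system.

```python
# Illustrative sanity check of the quoted FFT throughput (not the FPGA
# implementation described in the article).
import numpy as np

FFTS_PER_SECOND = 700_000   # figure quoted above
FFT_LENGTH = 512            # points per transform

# Raw sample throughput implied by the quoted numbers (~358 million samples/s).
samples_per_second = FFTS_PER_SECOND * FFT_LENGTH
print(f"~{samples_per_second / 1e6:.0f} million samples per second")

# Batched 512-point FFTs over synthetic A-scan data, standing in for the
# spectral-domain OCT processing step; 1% of a second's worth keeps memory modest.
a_scans = np.random.rand(FFTS_PER_SECOND // 100, FFT_LENGTH)
depth_profiles = np.abs(np.fft.fft(a_scans, axis=1))  # magnitude gives a depth profile per A-scan
print(depth_profiles.shape)  # (7000, 512)
```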

The system has three different real-time display modes: (a) continuous display of rendered 3-D images, (b) continuous 2-D cross-sectional frame scanning in a 3-D cube along each of the axes, and (c) continuous display of all acquired B-scan images.

The GSDAA NI Week 2011 finalists and winners can be viewed here. The full NIWeek 2011 conference presentations can be viewed here.

Read more NI Week coverage here:

The latest updates from NI Week 2011



National Instruments, NIWeek 2011 National Instruments hails latest feats at NIWeek 2011

Tri-Gate, Intel, ARM, Mobile Will tri-gate play an important role in the Intel-ARM tussle?

Rivalries between companies have a charm of their own. For many years, Intel v. AMD was the talk of the town, and then it became Microsoft v. Google. The most interesting rivalry today is, of course, Intel v. ARM. After Intel's tri-gate transistor announcement, I was therefore not surprised to see these news articles:



eWeek

3-D transistor will indeed help Intel beat back ARM, IHS iSuppli says

xbit Labs
ARM Not Afraid of Intel's 22nm/Tri-Gate Process Technology - Company.

Wired Revolution

Intel’s 3D tri-gate Transistor Redesign Brings Huge Efficiency Gains

At the moment, it certainly looks as though ARM will go planar at the 22-nm node, while Intel will go tri-gate. To judge if tri-gate offers Intel a significant advantage, a few questions need to be answered:

•   Intel announced that its 22-nm tri-gate transistor consumes 50 percent less power than its 32-nm planar transistor. But what are the power savings for a 22-nm tri-gate transistor compared with a 22-nm planar transistor?
•   How much chip power can one save by using a 22-nm tri-gate transistor in a microprocessor instead of a 22-nm planar transistor? Is it 10 percent? Or is it 30 percent? Or is it 50 percent?

It is not difficult to get estimates for these. Let’s take a look.   
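One rough way to frame such an estimate (this is not the article's own calculation) is to note that dynamic switching power scales roughly as P ≈ αCV²f, so much of the answer hinges on how much lower a supply voltage the better electrostatics of a tri-gate device permit at a given frequency. The sketch below uses purely hypothetical capacitance and voltage values to show the shape of the comparison.

```python
# Rough illustrative framing only: P_dyn ≈ alpha * C * V^2 * f.
# All numbers below are hypothetical placeholders, not Intel or ARM data.
def dynamic_power(alpha, cap_farads, vdd_volts, freq_hz):
    """Classic alpha*C*V^2*f switching-power estimate."""
    return alpha * cap_farads * vdd_volts**2 * freq_hz

ALPHA = 0.1            # assumed switching activity factor
FREQ = 2e9             # assumed clock frequency, 2 GHz
C_PLANAR = 1.0e-15     # hypothetical switched capacitance, 22-nm planar (F)
C_TRIGATE = 1.1e-15    # hypothetical: fins can add some gate capacitance
V_PLANAR = 0.9         # hypothetical supply voltage, 22-nm planar (V)
V_TRIGATE = 0.8        # hypothetical: better electrostatics may allow a lower Vdd

p_planar = dynamic_power(ALPHA, C_PLANAR, V_PLANAR, FREQ)
p_trigate = dynamic_power(ALPHA, C_TRIGATE, V_TRIGATE, FREQ)
print(f"tri-gate vs. planar dynamic power: {p_trigate / p_planar:.0%}")
# With these made-up numbers the ratio is about 87%; the real answer depends on
# the actual Vdd, capacitance and leakage figures the questions above ask for.
```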

Transistor-level calculations



Tri-Gate, Intel, ARM, Mobile Will tri-gate play an important role in the Intel-ARM tussle?

Sony, Panasonic, Samsung in 3D glasses deal

A man wears 3D glasses while watching a demonstration at the Sony booth during a consumer electronics fair in Las Vegas

Japan's Sony and Panasonic and South Korea's Samsung Electronics said Tuesday they will jointly develop new standards for glasses used to watch 3D images on television, computer and movie screens.

The three Asian consumer electronics giants, working with European technology firm X6D Limited, said in a joint statement that their collaboration will cover a technology called "3D active glasses."

The universal glasses -- which can be used on TVs from all three firms -- will go on sale in 2012 and will be compatible with 3D sets being released this year, the companies said.

"Today’s announcement marks a unique collaboration of the world’s leading 3D TV manufacturers and 3D technology providers for the benefit of consumers," they said, expressing hope the move would promote 3D technology.

(c) 2011 AFP


Sony, Panasonic, Samsung in 3D glasses deal

Severe low temperatures devastate coral reefs in Florida Keys

Lead author Dustin Kemp, a postdoctoral associate in the UGA Odum School of Ecology, said the study was prompted by an abnormal episode of extended cold weather in January and February 2010. Temperatures on inshore reefs in the upper Florida Keys dropped below 12 C (54 F), and remained below 18 C (64 F) for two weeks. Kemp and his colleagues had planned to sample corals at Admiral Reef, an inshore reef off Key Largo, just three weeks after the cold snap. When they arrived, they discovered that the reef, once abundant in hard and soft corals, was essentially dead. "It was the saddest thing I've ever seen," Kemp said. "The large, reef-building corals were gone. Some were estimated to be 200 to 300 years old and had survived other catastrophic events, such as the 1998 El Niño bleaching event. The severe cold water appeared to kill the corals quite rapidly."

Odum School Professor William Fitt, Kemp's doctoral advisor and one of the paper's co-authors, realized that the team had a unique opportunity. "Nearly 100 years ago, Alfred Mayer described the temperature tolerance of different corals in the Dry Tortugas and found very similar results," Kemp said. "We decided to take the next step and learn how and why the cold temperatures caused the corals to die."

The researchers took samples of Siderastrea siderea -- one of the few reef-building corals to survive -- from Admiral Reef. They also took samples of three common Florida Keys corals, Montastraea faveolata, Siderastrea siderea and Porites astreoides, from Little Grecian Reef, a nearby offshore reef that had not experienced the temperature anomaly to the extent of Admiral Reef. Kemp explained that Little Grecian Reef is far enough offshore that the cold-water temperatures were likely buffered by the warm waters of the Gulf Stream, which resulted in offshore coral reefs being less severely affected by the cold air mass that was pushed by an unusual weather pattern over much of the U.S. during that two-week period.

Back in the lab, they simulated the temperatures that had been recorded at Admiral Reef during the cold weather event, testing the different corals' physiological responses at 12 C (54 F) and 16 C (61 F) and then, after the corals' exposure to the cold, returning the temperature to 20 C (68 F). They found that although responses varied depending on the coral species, in general the stress of extended cold temperatures had an effect similar to that of high temperatures.

Kemp explained that corals depend on Symbiodinium, a type of symbiotic algae that lives inside them, for nutrition. Through photosynthesis, the algae produce sugars, which are passed on to the corals. "The cold temperatures inhibited photosynthesis in the algae, leading to a potential net loss of carbon transferred from the algae to the coral," said Kemp. He said that each coral species had its own unique type of Symbiodinium, some of which were better able to tolerate and recover from cold temperatures than others.

All of the corals experienced a significant decrease in photosynthesis at 12 C. Siderastrea siderea and M. faveolata were able to handle the 16 C temperatures, but P. astreoides was not, and did not show signs of recovery once the temperature was returned to 20 C. Siderastrea siderea was the only coral able to recover.

"Corals and their symbiotic algae have a range of stress tolerance," said Kemp. "Some can handle moderate stress, some are highly sensitive, and some are in between. But extreme cold is just one stressor among many." Other threats to coral health include increased seawater temperatures, diseases, ocean acidification, and pollution. "Adding stress from wintertime cold episodes could not only quickly kill corals but also may have long-term effects," he said. "For corals found in the Florida Keys, winter is typically a 'non-stressful' time and corals bulk up on tissue reserves that are important for surviving potentially 'stressful' summertime conditions (i.e. coral bleaching)."

Kemp said that researchers at NOAA attribute the record-breaking cold anomaly to a negative trend in the North Atlantic oscillation, an atmospheric pressure pattern that influences the weather in the northern hemisphere. "They speculate that if the trend continues, these kinds of extreme cold events may become more frequent," he said.

Kemp stressed that the study's findings should not be interpreted to downplay the major role of higher temperatures on corals' decline. "The study shows that warming may not be the only climate-related problem for coral reefs in the future," he said.

Kemp also pointed out that it was not only the corals that were devastated by the cold snap. "The corals provide the framework for the entire reef ecosystem," he said. "The lobster, shrimp, clams, fish -- all the creatures that depend on the reef -- were affected too. The potential consequences for coral ecosystems are extremely alarming."

Besides Kemp and Fitt, the paper's coauthors were Clinton Oakley and Gregory Schmidt of the UGA Department of Plant Biology, Daniel Thornhill of the nonprofit Defenders of Wildlife and Bowdoin College, and Laura Newcomb of Bowdoin College. The research was supported by the National Science Foundation and Bowdoin College.


Severe low temperatures devastate coral reefs in Florida Keys

'Endurance gene' for Olympic-level athletes: Genetic basis for muscle endurance discovered in animal study

The study appears online this week in the Journal of Clinical Investigation. The work has implications for improving muscle performance in disease states including metabolic disorders, obesity, and aging.

"We have shown that mice lacking the gene run six times longer than control mice and that the fatigable muscles of the mouse -- the fast muscle in the front of the leg -- have been reprogrammed and are now fatigue-resistant," explains senior author Tejvir S. Khurana, MD, PhD, professor of Physiology and member of the Pennsylvania Muscle Institute. "This has wide ramifications for various aspects of muscle biology ranging from athletics to treating muscle and metabolic diseases."

The gene codes for a protein called Interleukin-15 receptor-alpha (IL-15R-alpha), which acts alone or in conjunction with the IL-15 protein. IL-15R-alpha is important in the immune response, but it also has other functions. IL-15 and IL-15R-alpha have been implicated in muscle physiology, but the exact role in muscle function has not been defined.

"We found a previously unrecognized role for IL-15R-alpha in defining muscle function, and manipulation of this gene has the potential to improve muscle performance in disease states including metabolic disorders, obesity, and aging." says lead author Emidio E. Pistilli, PhD, who was a postdoctoral researcher at Penn and is now an assistant professor in the Division of Exercise Physiology at the West Virginia School of Medicine.

Slow vs. Fast

Slow muscles are used for endurance, and fast muscles are used for speed. The champion fast muscles are those that move the eye, yet they are also fatigue-resistant, the only muscles with both properties.

In the IL-15R-alpha knockout mouse used in this study, fast muscles behave like slow muscles. These mice ran 6.3 times greater distances and had greater ambulatory activity than controls. Their fast muscles displayed fatigue-resistance and slower contractions compared to fast muscles in control mice.

The researchers also showed that the loss of IL-15R-alpha induces a shift in how energy is burned in fast muscles, substantially increasing fatigue resistance and exercise capacity.

The molecular signature of the muscles in the knockout mice included a greater number of active transcription factors, indicating more muscle fibers with more mitochondria and with the machinery to better process calcium, the chemical that drives muscle contraction. Mitochondria are the energy powerhouses of the cell.

Morphologically, the fast muscles had a greater number of muscle fibers, smaller fiber areas, and a greater number of nuclei per fiber. The alterations of physiological properties and increased resistance to fatigue in the fast muscles are consistent with a shift towards a slower, more oxidative muscle type in the knockout mice.

The study also found significant associations between the gene and elite endurance athletes, supporting the possibility that these athletes have a genetic predisposition or advantage.

From these two lines of evidence, the researchers concluded that IL-15R-alpha plays a role in defining the function of fast skeletal muscles.

Importantly, the study demonstrates that muscles can be reprogrammed to perform much better at endurance sports. IL-15R-alpha manipulation is therefore of great concern from an athletic-doping standpoint, since it is currently neither tested for nor detectable by existing methods. The investigators are working toward such detection methods.

This research identifies a "druggable target" that allows possible reprogramming of muscle function by increasing genes, proteins and pathways typically expressed in slow or fatigue-resistant muscle, similar to adaptations seen after endurance exercise. It is widely accepted that these types of adaptations would be beneficial or protective against obesity, diabetes and aging and may help ameliorate pathology in myopathies such as muscular dystrophy. Hence, say the researchers, the identification of this pathway should facilitate better understanding of these diseases and aid in the development of rational therapies and drugs for these disorders.

From a translational research point of view, the team will test the role IL-15R-alpha plays in obesity, diabetes, aging and muscle diseases, as well as develop methods to harness its therapeutic potential for patients.

The research was funded by the National Institute of Arthritis and Musculoskeletal and Skin Diseases; the National Eye Institute; the National Institute on Aging; and the VA Puget Sound Health Care System.

In addition to Khurana and Pistilli, co-authors were from the Institute for Neuroscience and Muscle Research, The Children's Hospital at Westmead, Sydney, New South Wales, Australia; the Australian Institute of Sport, Canberra, Australia; Geriatric Research, Education, and Clinical Center, VA Puget Sound Health Care System, Seattle; Division of Gerontology and Geriatric Medicine, Department of Medicine, University of Washington, Seattle; and the Division of Endocrinology, Diabetes, and Metabolism, Penn.


'Endurance gene' for Olympic-level athletes: Genetic basis for muscle endurance discovered in animal study

Like Superman's X-ray vision, new microscope reveals nanoscale details

But that's not all. What's unusual about this new, nanoscale, X-ray microscope is that the images are not produced by a lens, but by means of a powerful computer program.

The scientists report in a paper published in this week's early online edition of the Proceedings of the National Academy of Sciences that this computer program, or algorithm, is able to convert the diffraction patterns produced by the X-rays bouncing off the nanoscale structures into resolvable images.

"The mathematics behind this is somewhat complicated," said Oleg Shpyrko, an assistant professor of physics at UC San Diego who headed the research team. "But what we did is to show that for the first time that we can image magnetic domains with nanometer precision. In other words, we can see magnetic structure at the nanoscale level without using any lenses."

One immediate application of this lens-less X-ray microscope is the development of smaller data storage devices for computers that can hold more memory.

"This will aid research in hard disk drives where the magnetic bits of data on the surface of the disk are currently only 15 nanometers in size," said Eric Fullerton, a co-author of the paper and director of UC San Diego's Center for Magnetic Recording Research. "This new ability to directly image the bits will be invaluable as we push to store even more data in the future."

The development should be also immediately applicable to other areas of nanoscience and nanotechnology.

"To advance nanoscience and nanotechnology, we have to be able to understand how materials behave at the nanoscale," said Shpyrko. "We want to be able to make materials in a controlled fashion to build magnetic devices for data storage or, in biology or chemistry, to be able to manipulate matter at nanoscale. And in order to do that we have to be able to see at nanoscale. This technique allows you to do that. It allows you to look into materials with X-rays and see details at the nanoscale."

"Because there is no lens in the way, putting a bulky magnet around the sample or adding equipment to change the sample environment in some other way during the measurement is much easier with this method than if we had to use a lens," Shpyrko added.

Ashish Tripathi, a graduate student in Shpyrko's lab, developed the algorithm that served as the X-ray microscope's lens. It worked, in principle, somewhat like the computer program that sharpened the Hubble Space Telescope's initially blurred images, a blur caused by a spherical aberration in the telescope's mirror before the telescope was repaired in space. A similar concept is employed by astronomers at ground-based telescopes, who use adaptive optics, movable mirrors controlled by computers, to remove the distortions that twinkling starlight picks up as it passes through the atmosphere.

But the technique Tripathi developed was entirely new. "There was a lot of simulation involved in the development; it was a lot of work," said Shpyrko.

To test their microscope's ability to penetrate and resolve details at the nanoscale, the physicists made a layered film composed of the elements gadolinium and iron. Such films are now being studied in the information technology industry to develop higher capacity, smaller, and faster computer memory and disk drives.

"Both are magnetic materials and if you combine them in a structure it turns out they spontaneously form nanoscale magnetic domains," Shpyrko. "They actually self assemble into magnetic stripes."

Under the X-ray microscope, the layered gadolinium and iron film looks something like baklava dessert that crinkles up magnetically to form a series of magnetic domains, which appear like the repeating swirls of the ridges in fingerprints. Being able to resolve those domains at the nanoscale for the first time is critically important for computer engineers seeking to cram more data into smaller and smaller hard drives.

As materials are made with smaller and smaller magnetic domains, or thinner and thinner fingerprint patterns, more data can be stored in a smaller space within a material. "The way we're able to do that is to shrink the size of the magnetic bits," Shpyrko said.

The technique should find many other uses outside computer engineering as well.

"By tuning the X-ray energy, we can also use the technique to look at different elements within materials, which is very important in chemistry," he added. "In biology, it can be used to image viruses, cells and different kinds of tissues with a spatial resolution that is better than resolution available using visible light."

The scientists used the Advanced Photon Source, the most brilliant source of coherent X-rays in the Western Hemisphere, at the University of Chicago's Argonne National Laboratory near Chicago to conduct their research project, which was funded by the U.S. Department of Energy. In addition to Tripathi, Shpyrko and Fullerton, a professor of electrical and computer engineering at UC San Diego, other co-authors of the paper include UC San Diego physics graduate students Jyoti Mohanty, Sebastian Dietze and Erik Shipton as well as physicists Ian McNulty and SangSoo Kim at Argonne National Laboratory.


Like Superman's X-ray vision, new microscope reveals nanoscale details

Scientist develops virus that targets HIV: Using a virus to kill a virus

Dr. Pin Wang's lentiviral vector latches onto HIV-infected cells, flagging them with what is called "suicide gene therapy" -- allowing drugs to later target and destroy them.

"If you deplete all of the HIV-infected cells, you can at least partially solve the problem," said Wang, chemical engineering professor with the USC Viterbi School of Engineering.

The process is analogous to the military practice of "buddy lasing" -- that is, having a soldier on the ground illuminate a target with a laser to guide a precision bombing strike from an aircraft.

Like a precision bombing raid, the lentiviral vector approach to targeting HIV has the advantage of avoiding collateral damage, keeping cells that are not infected by HIV out of harm's way. Such accuracy has not been achieved by using drugs alone, Wang said.

So far, the lentiviral vector has been tested only in culture dishes, where it has destroyed about 35 percent of existing HIV-infected cells. While that may not sound like a large percentage, if this treatment were used in humans it would likely be repeated several times to maximize effectiveness.

Among the next steps will be to test the procedure in mice. While this is an important breakthrough, it is not yet a cure, Wang said.

"This is an early stage of research, but certainly it is one of the options in that direction," he said.

Wang's research, which was funded by the National Institutes of Health, appears in the July 23 issue of Virus Research.


Scientist develops virus that targets HIV: Using a virus to kill a virus

DNA building blocks can be made in space, NASA evidence suggests

"People have been discovering components of DNA in meteorites since the 1960's, but researchers were unsure whether they were really created in space or if instead they came from contamination by terrestrial life," said Dr. Michael Callahan of NASA's Goddard Space Flight Center, Greenbelt, Md. "For the first time, we have three lines of evidence that together give us confidence these DNA building blocks actually were created in space." Callahan is lead author of a paper on the discovery appearing in Proceedings of the National Academy of Sciences of the United States of America.

The discovery adds to a growing body of evidence that the chemistry inside asteroids and comets is capable of making building blocks of essential biological molecules. For example, previously, these scientists at the Goddard Astrobiology Analytical Laboratory have found amino acids in samples of comet Wild 2 from NASA's Stardust mission, and in various carbon-rich meteorites. Amino acids are used to make proteins, the workhorse molecules of life, used in everything from structures like hair to enzymes, the catalysts that speed up or regulate chemical reactions.

In the new work, the Goddard team ground up samples of twelve carbon-rich meteorites, nine of which were recovered from Antarctica. They extracted each sample with a solution of formic acid and ran them through a liquid chromatograph, an instrument that separates a mixture of compounds. They further analyzed the samples with a mass spectrometer, which helps determine the chemical structure of compounds.

The team found adenine and guanine, which are components of DNA called nucleobases, as well as hypoxanthine and xanthine. DNA resembles a spiral ladder; adenine and guanine connect with two other nucleobases to form the rungs of the ladder. They are part of the code that tells the cellular machinery which proteins to make. Hypoxanthine and xanthine are not found in DNA, but are used in other biological processes.

Also, in two of the meteorites, the team discovered for the first time trace amounts of three molecules related to nucleobases: purine, 2,6-diaminopurine and 6,8-diaminopurine; the latter two are almost never used in biology. These compounds have the same core molecule as nucleobases but with a structure added or removed.

It's these nucleobase-related molecules, called nucleobase analogs, which provide the first piece of evidence that the compounds in the meteorites came from space and not terrestrial contamination. "You would not expect to see these nucleobase analogs if contamination from terrestrial life was the source, because they're not used in biology, aside from one report of 2,6-diaminopurine occurring in a virus (cyanophage S-2L)," said Callahan. "However, if asteroids are behaving like chemical 'factories' cranking out prebiotic material, you would expect them to produce many variants of nucleobases, not just the biological ones, due to the wide variety of ingredients and conditions in each asteroid."

The second piece of evidence involved research to further rule out the possibility of terrestrial contamination as a source of these molecules. The team also analyzed an eight-kilogram (17.6-pound) sample of ice from Antarctica, where most of the meteorites in the study were found, with the same methods used on the meteorites. The amounts of the two nucleobases, plus hypoxanthine and xanthine, found in the ice were much lower -- parts per trillion -- than in the meteorites, where they were generally present at several parts per billion. More significantly, none of the nucleobase analogs were detected in the ice sample. One of the meteorites with nucleobase analog molecules fell in Australia, and the team also analyzed a soil sample collected near the fall site. As with the ice sample, the soil sample had none of the nucleobase analog molecules present in the meteorite.

Thirdly, the team found these nucleobases -- both the biological and non-biological ones -- were produced in a completely non-biological reaction. "In the lab, an identical suite of nucleobases and nucleobase analogs were generated in non-biological chemical reactions containing hydrogen cyanide, ammonia, and water. This provides a plausible mechanism for their synthesis in the asteroid parent bodies, and supports the notion that they are extraterrestrial," says Callahan.

"In fact, there seems to be a 'goldilocks' class of meteorite, the so-called CM2 meteorites, where conditions are just right to make more of these molecules," adds Callahan.

The team includes Callahan and Drs. Jennifer C. Stern, Daniel P. Glavin, and Jason P. Dworkin of NASA Goddard's Astrobiology Analytical Laboratory; Ms. Karen E. Smith and Dr. Christopher H. House of Pennsylvania State University, University Park, Pa.; Dr. H. James Cleaves II of the Carnegie Institution of Washington, Washington, DC; and Dr. Josef Ruzicka of Thermo Fisher Scientific, Somerset, N.J. The research was funded by the NASA Astrobiology Institute, the Goddard Center for Astrobiology, the NASA Astrobiology: Exobiology and Evolutionary Biology Program, and the NASA Postdoctoral Program.


DNA building blocks can be made in space, NASA evidence suggests

Improved electrical conductivity in polymeric composites

The researchers in Luxembourg, in cooperation with scientists from the Netherlands, have studied the electrical percolation of carbon nanotubes in a polymer matrix and shown the percolation threshold -- the point at which the polymer composite becomes conductive -- can be considerably lowered if small quantities of a conductive polymer latex are added. The simulations were done in Luxembourg, while the experiments took place at Eindhoven University.

"In this project, the idea is to use as little as possible carbon nanotubes and still benefit from their favourable properties," says the project leader at the University of Luxembourg, Prof. Tania Schilling, "we have discovered that, by adding a second component, we could make use of the resulting interactions to reach our goal." By mixing finely dispersed particles, so-called colloidal particles, of differing shapes and sizes in the medium, system-spanning networks form: the prerequisite for electrically conductive composites.

The recent finding by the materials scientists of the University of Luxembourg was published in the peer-reviewed scientific journal Nature Nanotechnology. It is the result of a collaboration among scientists at the University of Luxembourg, the Technische Universiteit Eindhoven and the Dutch Polymer Institute.


Improved electrical conductivity in polymeric composites

Nordic Semiconductor, Broadcom, Bluetooth Broadcom, Nordic Semi team on Bluetooth LE

SAN FRANCISCO—RF chip vendor Nordic Semiconductor ASA Monday (Aug. 8) announced successful wireless communication tests between a prototype design for a small, low-cost Bluetooth low energy proximity tag and Broadcom Corp.'s BCM4330, the first combo chip certified compliant with the Bluetooth 4.0 standard.

The prototype Bluetooth low energy tags, also known as fobs, demonstrate interoperability between Bluetooth low energy chips and Bluetooth 4.0 devices, according to Nordic (Oslo, Norway). Adherence to the Bluetooth v4.0 specification ensures that devices from different providers, such as Broadcom and Nordic, communicate seamlessly, Nordic said.

The recently released Bluetooth v4.0 proximity profile enables the communication between the fob and next generation host devices like laptops and mobile phones, Nordic said.

Nordic said the fob is designed to prevent a device such as a laptop from being accessed in the owner’s absence. After pairing with the chip in the mobile device, the user carries the fob on their person, Nordic said. If the distance between the user and the mobile device exceeds a pre-set threshold—as could occur if the device is lost or stolen—the pairing is broken and the mobile device automatically locks, according to Nordic.
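In pseudologic, the behavior Nordic describes amounts to: while the fob is paired and within range, the host stays usable; if the link drops or the signal weakens past a preset threshold, the host locks itself. The sketch below is purely hypothetical; the link interface and its method names are invented for illustration and are not Nordic's or Broadcom's API.

```python
# Hypothetical sketch of the proximity-lock behavior described above.
# The link object and its methods are invented for illustration; this is
# not Nordic's or Broadcom's API.
import time

RSSI_THRESHOLD_DBM = -75   # assumed "fob too far away" signal level
POLL_INTERVAL_S = 1.0

class HostLocker:
    def __init__(self, ble_link, lock_fn):
        self.ble_link = ble_link   # assumed to expose is_connected() and rssi_dbm()
        self.lock_fn = lock_fn     # callback that locks the host device

    def check_once(self):
        """Lock the host if the fob link is lost or its signal is too weak."""
        if not self.ble_link.is_connected():
            self.lock_fn("link lost")
        elif self.ble_link.rssi_dbm() < RSSI_THRESHOLD_DBM:
            self.lock_fn("fob out of range")

    def run_forever(self):
        while True:
            self.check_once()
            time.sleep(POLL_INTERVAL_S)

# Example wiring with a stand-in link object:
class FakeLink:
    def is_connected(self): return True
    def rssi_dbm(self): return -80

HostLocker(FakeLink(), lambda reason: print(f"locking host: {reason}")).check_once()
```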

The fob is based on Nordic's µBlue nRF8001 single-chip Bluetooth low energy solution expected to be ready for volume production early in the third quarter, Nordic said. The power consumption of the nRF8001 maximizes the battery life of the CR2032 coin-cell powered fob, according to the company.

Broadcom’s BCM4330, the successor to the company’s BCM4329, is the industry’s first combo chip certified with the Bluetooth 4.0 standard, which includes Bluetooth low energy as a hallmark feature.

According to Peter Cooney, practice director for semiconductors at ABI Research, nearly all existing Bluetooth-enabled phones are expected to migrate to Bluetooth 4.0. This will result in more than 1 billion Bluetooth low energy-capable hosts in the handset market alone in the next few years, according to Cooney.
 
"Demand for Bluetooth low energy continues to grow as the technology is integrated into the increasing number of consumer electronics devices," said Craig Ochikubo, vice president and general manager of Broadcom’s Wireless Personal Area Networking line of business.

Several Bluetooth low energy profiles are expected to be released within the next few months by the Bluetooth Special Interest Group.

Nordic Semiconductor, Broadcom, Bluetooth Broadcom, Nordic Semi team on Bluetooth LE

Chimpanzees are spontaneously generous after all, study shows

The current study findings are available in the online edition of Proceedings of the National Academy of Sciences.

According to Yerkes researchers Victoria Horner, PhD, Frans de Waal, PhD, and their colleagues, chimpanzees may not have shown prosocial behaviors in other studies because of design issues, such as the complexity of the apparatus used to deliver rewards and the distance between the animals.

"I have always been skeptical of the previous negative findings and their over-interpretation, says Dr. de Waal. "This study confirms the prosocial nature of chimpanzees with a different test, better adapted to the species," he continues.

In the current study, Dr. Horner and colleagues greatly simplified the test, which focused on offering seven adult female chimpanzees a choice between two similar actions: one that rewards both the "actor," the term used in the paper for the lead study participant, and a partner, and another that rewards only the actor/chooser herself. Examples of the critically important simplified design aspects include allowing the study partners to sit close together and ensuring conspicuous food consumption, which the researchers achieved by wrapping pieces of banana in paper that made a loud noise upon removal.

In each trial, the chooser, which was always tested with her partner in sight, selected between differently colored tokens from a bin. One colored token could be exchanged with an experimenter for treats for both members of the pair (prosocial); the other colored token would result in a treat only for the chooser (selfish). All seven chimpanzees showed an overwhelming preference for the prosocial choice. The study also showed the choosers behaved altruistically especially towards partners who either patiently waited or gently reminded them that they were there by drawing attention to themselves. The chimpanzees making the choices were less likely to reward partners who made a fuss, begged persistently or spat water at them, thus showing their altruism was spontaneous and not subject to intimidation.

"We were excited to find female after female chose the option that gave both her and her partner food," says Dr. Horner. "It was also interesting to me that being overly persistent did not go down well with the choosers. It was far more productive for partners to be calm and remind the choosers they were there from time to time," she continues.

The authors say this study puts to rest a longstanding puzzle surrounding chimpanzee altruism. It is well-known these apes help each other in the wild and show various forms of empathy, such as reassurance of distressed parties. The negative findings of previous studies did not fit this image. These results, however, confirm chimpanzee altruism in a well-controlled experiment, suggesting human altruism is less of an anomaly than previously thought.

The study authors next plan to determine whether the altruistic tendency of the chimpanzees towards their partners is related to social interactions within the group, such as reciprocal exchanges of food or social support.

For eight decades, the Yerkes National Primate Research Center, Emory University, has been dedicated to conducting essential basic science and translational research to advance scientific understanding and to improve the health and well-being of humans and nonhuman primates. Today, the center, as one of only eight National Institutes of Health-funded national primate research centers, provides leadership, training and resources to foster scientific creativity, collaboration and discoveries. Yerkes-based research is grounded in scientific integrity, expert knowledge, respect for colleagues, an open exchange of ideas and compassionate quality animal care.

Within the fields of microbiology and immunology, neurologic diseases, neuropharmacology, behavioral, cognitive and developmental neuroscience, and psychiatric disorders, the center's research programs are seeking ways to: develop vaccines for infectious and noninfectious diseases; treat drug addiction; interpret brain activity through imaging; increase understanding of progressive illnesses such as Alzheimer's and Parkinson's diseases; unlock the secrets of memory; determine how the interaction between genetics and society shape who we are; and advance knowledge about the evolutionary links between biology and behavior.


Chimpanzees are spontaneously generous after all, study shows