Archive for October, 2011

Bidirectional brain signals sense and move virtual objects

Saturday, October 15th, 2011

In the study, monkeys moved and felt virtual objects using only their brain (credit: Duke University)

Two monkeys trained at the Duke University Center for Neuroengineering have learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

“Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton,” said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Sensing textures of virtual objects

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and differentiate their textures. Although the virtual objects employed in this study were visually identical, they were designed to have different artificial textures that could only be detected if the animals explored them with virtual hands controlled directly by their brain’s electrical activity.

The texture of the virtual objects was expressed as a pattern of electrical signals transmitted to the monkeys’ brains. Each of the three object textures was encoded by a distinct electrical pattern.

Because no part of the animal’s real body was involved in the operation of this brain-machine-brain interface, these experiments suggest that in the future, patients who are severely paralyzed by a spinal cord lesion may be able to use this technology both to regain mobility and to have their sense of touch restored, said Nicolelis.

First bidirectional link between brain and virtual body

“This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body,” Nicolelis said.

“In this BMBI, the virtual body is controlled directly by the animal’s brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal’s cortex. We hope that in the next few years this technology could help to restore a more autonomous life to many patients who are currently locked in without being able to move or experience any tactile sensation of the surrounding world,” Nicolelis said.

“This is also the first time we’ve observed a brain controlling a virtual arm that explores objects while the brain simultaneously receives electrical feedback signals that describe the fine texture of objects ‘touched’ by the monkey’s newly acquired virtual hand.

“Such an interaction between the brain and a virtual avatar was totally independent of the animal’s real body, because the animals did not move their real arms and hands, nor did they use their real skin to touch the objects and identify their texture. It’s almost like creating a new sensory channel through which the brain can resume processing information that cannot reach it anymore through the real body and peripheral nerves.”

The combined electrical activity of populations of 50 to 200 neurons in the monkey’s motor cortex controlled the steering of the avatar arm, while thousands of neurons in the primary tactile cortex were simultaneously receiving continuous electrical feedback from the virtual hand’s palm that let the monkey discriminate between objects, based on their texture alone.
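
In outline, the brain-machine-brain loop couples a motor decoder to a tactile encoder. The Python sketch below is a minimal illustration of that loop under assumed interfaces: the decoder weights, the stimulator object, and the pulse parameters are hypothetical stand-ins, not values from the actual experiment.

```python
import numpy as np

# Hypothetical mapping from each of the three virtual textures to a
# distinct microstimulation pattern (pulse parameters are made up).
TEXTURE_PATTERNS = {
    "coarse": {"pulse_hz": 100, "burst_ms": 50},
    "medium": {"pulse_hz": 200, "burst_ms": 25},
    "fine":   {"pulse_hz": 400, "burst_ms": 10},
}

def decode_velocity(spike_counts, weights):
    """Linear population decoder: map the binned firing of ~50-200
    motor-cortex neurons to a 2-D avatar-hand velocity."""
    return weights @ spike_counts  # weights shape: (2, n_neurons)

def bmbi_step(spike_counts, weights, hand_pos, objects, stimulator):
    """One loop cycle: decode the intended movement, advance the
    avatar hand, and stimulate somatosensory cortex if the hand is
    touching a virtual object (stimulator interface is assumed)."""
    hand_pos = hand_pos + decode_velocity(spike_counts, weights)
    for obj in objects:
        if np.linalg.norm(hand_pos - obj["pos"]) < obj["radius"]:
            stimulator.deliver(TEXTURE_PATTERNS[obj["texture"]])
    return hand_pos
```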

Robotic exoskeleton for paralyzed patients

“The remarkable success with non-human primates is what makes us believe that humans could accomplish the same task much more easily in the near future,” Nicolelis said.

The findings provide further evidence that it may be possible to create a robotic exoskeleton that severely paralyzed patients could wear in order to explore and receive feedback from the outside world, Nicolelis said. The exoskeleton would be directly controlled by the patient’s voluntary brain activity to allow the patient to move autonomously. Simultaneously, sensors distributed across the exoskeleton would generate the type of tactile feedback needed for the patient’s brain to identify the texture, shape and temperature of objects, as well as many features of the surface upon which they walk.

This overall therapeutic approach is the one chosen by the Walk Again Project, an international, non-profit consortium, established by a team of Brazilian, American, Swiss, and German scientists, which aims at restoring full-body mobility to quadriplegic patients through a brain-machine-brain interface implemented in conjunction with a full-body robotic exoskeleton.

The international scientific team recently proposed to carry out its first public demonstration of such an autonomous exoskeleton during the opening game of the 2014 FIFA Soccer World Cup that will be held in Brazil.

Ref.: Joseph E. O’Doherty, Mikhail A. Lebedev, Peter J. Ifft, Katie Z. Zhuang, Solaiman Shokur, Hannes Bleuler, and Miguel A. L. Nicolelis, Active tactile exploration using a brain–machine–brain interface, Nature, October 2011 [doi:10.1038/nature10489]

Source | KurzweilAI

Brain linked to robotic hand; success hailed

Saturday, October 15th, 2011

Assistant professor Jennifer Collinger, left, watches as quadriplegic research subject Tim Hemmes operates the mechanical prosthetic arm in a testing session at UPMC.

When it happened, emotions flashed like lightning.

The nearby robotic hand that Tim Hemmes was controlling with his mind touched his girlfriend Katie Schaffer’s outstretched hand.

One small touch for Mr. Hemmes; one giant reach for people with disabilities.

The tears of joy that flowed in an Oakland laboratory that day continued later, when Mr. Hemmes toasted his and the University of Pittsburgh researchers’ success at a local restaurant with two daiquiris.

A simple act for most people marked a major advance in two decades of research on what was once the stuff of science fiction.

Mr. Hemmes’ success in putting the robotic hand in the waiting hand of Ms. Schaffer, 27, of Philadelphia, represented the first time a person with quadriplegia has used his mind to control a robotic arm so masterfully.

The 30-year-old man from Connoquenessing Township, Butler County, hadn’t moved his arms, hands or legs since a motorcycle accident seven years earlier. But Mr. Hemmes had practiced six hours a day, six days a week for nearly a month to move the arm with his mind.

That successful act increases hope that people with paralysis or limb loss will one day be able to feed and dress themselves and open doors, among other tasks, with a mind-controlled robotic arm. It also improves the prospects of wiring around spinal cord injuries to allow motionless arms and legs to function once again.

“I think the potential here is incredible,” said Dr. Michael Boninger, director of UPMC’s Rehabilitation Institute and a principal investigator in the project. “This is a breakthrough for us.”

Mr. Hemmes? They say he’s a rock star.

Reading brain signals

In a project led by Andrew Schwartz, Ph.D., a University of Pittsburgh professor of neurobiology, researchers taught a monkey to use a mentally controlled robotic arm to feed itself marshmallows. Electrodes had been shallowly implanted in its brain to read signals from neurons known to control arm motion.

Electrocorticography, or ECoG — in which an electrode grid is surgically placed against the brain without penetrating it — captures brain signals less invasively.

ECoG has been used to locate sites of seizures and do other experiments in patients with epilepsy. Those experiments were prelude to seeking a candidate with quadriplegia to test ECoG’s capability to control a robotic arm developed by Johns Hopkins University.

The still unanswered question was whether the brains of people with long-term paralysis still produced signals to move their limbs.

ECoG picks up an array of brain signals, almost like a secret code or new language, that a computer algorithm can interpret and translate into robotic-arm movements matching the person’s intentions. It’s a simple explanation for complex science.
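
To make that concrete, a common approach in ECoG work is to extract band-power features from each electrode and pass them through a trained linear model. The sketch below is a simplified assumption about how such a decoder might look, not the Pitt team’s published algorithm; the high-gamma band and the linear readout are conventional choices in this field.

```python
import numpy as np

def band_power(ecog_window, fs, low, high):
    """Mean power of each ECoG channel in a frequency band,
    estimated via FFT; ecog_window has shape (channels, samples)."""
    freqs = np.fft.rfftfreq(ecog_window.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(ecog_window, axis=1)) ** 2
    band = (freqs >= low) & (freqs <= high)
    return spectrum[:, band].mean(axis=1)

def decode_velocity(ecog_window, fs, weights, bias):
    """Map high-gamma (~70-110 Hz) power on each electrode to a 3-D
    cursor or arm velocity with a linear model whose weights were
    fit while the user attempted or imagined movements."""
    features = band_power(ecog_window, fs, 70.0, 110.0)
    return weights @ features + bias  # weights shape: (3, channels)
```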

Mr. Hemmes’ name cropped up so many times as a potential candidate that the team called him to gauge his interest.

He said no.

He was already involved in a research project in Cleveland and feared this one would interfere. But knowing they had found the ideal candidate, the researchers called back. This time he agreed, as long as it would not limit his participation in future phases of research.

Mr. Hemmes became quadriplegic July 11, 2004, apparently after a deer darted onto the roadway, causing him to swerve his motorcycle onto gravel where his shoulder hit a mailbox, sending him flying headfirst into a guardrail. The top of his helmet became impaled on a guardrail I-beam, rendering his head motionless while his body continued flying, snapping his neck at the fourth cervical vertebra.

A passer-by found him with blue lips and no signs of breathing. Mr. Hemmes was flown by rescue helicopter to UPMC Mercy and diagnosed with quadriplegia — a condition in which he had lost the use of his limbs and his body below the neck or shoulders. He had to learn how to breathe on his own. His doctor told him it was the worst accident he’d ever seen in which the person survived.

But after the process of adapting psychologically to quadriplegia, Mr. Hemmes chose to pursue a full life, especially after he got a device to operate a computer and another to operate a wheelchair with head motions.

Since January, he has operated the website — www.Pittsburghpitbullrescue.com — to rescue homeless pit bulls and find them new owners.

The former hockey player’s competitive spirit and willingness to face risk were key attributes. Elizabeth Tyler-Kabara, the UPMC neurosurgeon who would place the ECoG grid on Mr. Hemmes’ brain, said he had strong motivation and a vision that paralysis could be cured.

Ever since his accident, Mr. Hemmes said, he’s had the goal of hugging his daughter Jaylei, now 8. This could be the first step.

“It’s an honor that they picked me, and I feel humbled,” Mr. Hemmes said.

Mental gymnastics

Mr. Hemmes underwent several hours of surgery to place the ECoG grid at a precise location against the brain. Wires running under the skin down to a port near his collarbone — where cables can connect to the robotic arm — left him with a stiff neck for a few days.

Two days after surgery, he began exhaustive training in mentally maneuvering a computer cursor in various directions to reach targets and make them disappear. Next he learned to move the cursor diagonally before working for hours to capture targets in a three-dimensional computer environment.

The U.S. Food and Drug Administration allowed the trial to last only 28 days, after which the ECoG grid had to be removed. The project, initially funded by UPMC, has received more than $6 million in funding from the Department of Veterans Affairs, the National Institutes of Health, and the U.S. Department of Defense’s Defense Advanced Research Projects Agency, known as DARPA.

Initially Mr. Hemmes tried thinking about flexing his arm to move the cursor. But he had better success visually grabbing the ball-shaped cursor to throw it toward a target on the screen. The “mental eye-grabbing” worked best when he was relaxed.

Soon he was capturing 15 of 16 targets and sometimes all 16 during timed sessions. The next challenge was moving the robotic arm with his mind.

The same mental processes worked, but the arm moved more slowly and in real space. Time was ticking away as the experiment approached its final days last month. With Mr. Hemmes finally moving the arm in all directions, Wei Wang — assistant professor of physical medicine and rehabilitation at Pitt’s School of Medicine, who also has worked on the signaling system — stood in front of him and raised his hand.

The robotic arm that Mr. Hemmes was controlling moved with fits and starts but in time reached Dr. Wang’s upheld hand. Mr. Hemmes gave him a high five.

The big moment arrived.

Katie Schaffer stood before her boyfriend with her hand extended. “Baby,” she said encouraging him, “touch my hand.”

It took several minutes, but he raised the robotic hand and pushed it toward Ms. Schaffer until its palm finally touched hers. Tears flowed.

“It’s the first time I’ve reached out to anybody in over seven years,” Mr. Hemmes said. “I wanted to touch Katie. I never got to do that before.”

“I have tattoos, and I’m a big, strong guy,” he said in retrospect. “So if I’m going to cry, I’m going to bawl my eyes out. It was pure emotion.”

Curing paralysis

Mr. Hemmes said his accomplishments represent a first step toward “a cure for paralysis.” The research team is cautious about such statements without denying the possibility. They prefer identifying the goal of restoring function in people with disabilities.

“This was way beyond what we expected,” Dr. Tyler-Kabara said. “We really hit a home run, and I’m thrilled.”

The next phase will include up to six people tested in another 30-day trial with ECoG. A year-long trial will test the electrode array that shallowly penetrates the brain. Goals during these phases include expanding the degrees of arm motions to allow people to “pick up a grape or grasp and turn a door knob,” Dr. Tyler-Kabara said.

Anyone interested in participating should call 1-800-533-8762.

Mr. Hemmes says he will participate in future research.

“This is something big, but I’m not done yet,” he said. “I want to hug my daughter.”

Source | Pittsburgh Post-Gazette

Moon Packed with Precious Titanium, NASA Probe Finds

Saturday, October 15th, 2011

This lunar mosaic shows the boundary between Mare Serenitatis and Mare Tranquillitatis. The relative blue color of the Tranquillitatis mare is due to higher abundances of the titanium-bearing mineral ilmenite.

A new map of the moon has uncovered a trove of areas rich in precious titanium ore, with some lunar rocks harboring 10 times as much of the stuff as rocks here on Earth do.

The map, which combined observations in visible and ultraviolet wavelengths, revealed the valuable titanium deposits. These findings could shed light on some of the mysteries of the lunar interior, and could also lay the groundwork for future mining on the moon, researchers said.

“Looking up at the moon, its surface appears painted with shades of grey — at least to the human eye,” Mark Robinson, of Arizona State University, said in a statement. “The maria appear reddish in some places and blue in others. Although subtle, these color variations tell us important things about the chemistry and evolution of the lunar surface. They indicate the titanium and iron abundance, as well as the maturity of a lunar soil.”

The results of the study were presented Friday (Oct. 7) at the joint meeting of the European Planetary Science Congress and the American Astronomical Society’s Division for Planetary Sciences in Nantes, France.

Mapping the lunar surface

The map of the moon’s surface was constructed using data from NASA’s Lunar Reconnaissance Orbiter (LRO), which has been circling the moon since June 2009. The probe’s wide-angle camera snapped pictures of the surface in seven different wavelengths at different resolutions.

Since specific minerals strongly reflect or absorb different parts of the electromagnetic spectrum, LRO’s instruments were able to give scientists a clearer picture of the chemical composition of the moon’s surface.

Robinson and his colleagues stitched together a mosaic using roughly 4,000 images that had been collected by the spacecraft over one month.

The researchers scanned the lunar surface and compared the brightness in the range of wavelengths from ultraviolet to visible light, picking out areas that are abundant in titanium. The scientists then cross-referenced their findings with lunar samples that were brought back to Earth from NASA’s Apollo flights and the Russian Luna missions.
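
The underlying technique is a band-ratio map: ilmenite-rich soils are relatively bright in the ultraviolet compared with the visible, so a UV-to-visible reflectance ratio tracks titanium content once it has been calibrated against the returned samples. Here is a minimal sketch, with a made-up linear calibration standing in for the actual fit:

```python
import numpy as np

# Hypothetical calibration; the real coefficients were fit against
# titanium abundances measured in Apollo and Luna samples.
CAL_SLOPE = 20.0
CAL_INTERCEPT = -10.0

def tio2_weight_percent(refl_uv, refl_vis):
    """Estimate TiO2 abundance (weight percent) from a UV/visible
    reflectance ratio (e.g., LROC wide-angle camera bands near
    321 nm and 415 nm), clipped to a plausible lunar range."""
    ratio = np.asarray(refl_uv) / np.asarray(refl_vis)
    return np.clip(CAL_SLOPE * ratio + CAL_INTERCEPT, 0.0, 15.0)
```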

These titanium-rich areas on the moon puzzled the researchers. The highest abundance of titanium in similar rocks on Earth hovers around 1 percent or less, the scientists explained. The new map shows that these troves of titanium on the moon range from about 1 percent to a little more than 10 percent.

“We still don’t really understand why we find much higher abundances of titanium on the moon compared to similar types of rocks on Earth,” Robinson said. “What the lunar titanium-richness does tell us is something about the conditions inside the moon shortly after it formed, knowledge that geochemists value for understanding the evolution of the moon.”

Valuable titanium ore

Titanium on the moon is primarily found in the mineral ilmenite, a compound that contains iron, titanium and oxygen. If humans one day mine on the moon, they could break down ilmenite to separate these elements.

Furthermore, Apollo data indicated that titanium-rich minerals are more efficient at retaining solar wind particles, such as helium and hydrogen. These gases would likely be vital resources in the construction of lunar colonies and for exploration of the moon, the researchers said.

“Astronauts will want to visit places with both high scientific value and a high potential for resources that can be used to support exploration activities,” Robinson said. “Areas with high titanium provide both — a pathway to understanding the interior of the moon and potential mining resources.”

This composite image of the lunar surface highlights regions with varying mare compositions and enigmatic small volcanic structures known as “domes.”

The lunar map also shows how space weather changes the surface of the moon. Charged particles from solar wind and micrometeorite impacts can change the moon’s surface materials, pulverizing rock into a fine powder and altering the chemical composition of the lunar surface.

“One of the exciting discoveries we’ve made is that the effects of weathering show up much more quickly in ultraviolet than in visible or infrared wavelengths,” study co-author Brett Denevi, of Johns Hopkins University Applied Physics Laboratory in Laurel, Md., said in a statement. “In the [Lunar Reconnaissance Orbiter Camera] ultraviolet mosaics, even craters that we thought were very young appear relatively mature. Only small, very recently formed craters show up as fresh regolith exposed on the surface.”

Source | SPACE

How to communicate better in virtual worlds

Saturday, October 15th, 2011

The experimental setup. Left: The participants wore a total of six tracked objects; right: the corresponding virtual environment, showing the avatars in the self-animated third-person perspective. (Credit: Trevor J. Dodds et al./PLoS One)

Mapping real-world motions to “self-animated” virtual avatars, using body tracking to communicate a wide range of gestures, helps people communicate better in virtual worlds like Second Life, say researchers from the Max Planck Institute for Biological Cybernetics and Korea University.
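
In implementation terms, self-animation amounts to streaming each tracked object’s pose onto the corresponding avatar joint every frame. The sketch below assumes hypothetical tracker and avatar interfaces, and the six tracker names are illustrative guesses at the six tracked objects worn by the participants.

```python
# Hypothetical mapping from the six tracked objects worn by each
# participant to the avatar joints they drive (names illustrative).
TRACKER_TO_JOINT = {
    "head": "avatar_head",
    "torso": "avatar_spine",
    "left_hand": "avatar_left_hand",
    "right_hand": "avatar_right_hand",
    "left_foot": "avatar_left_foot",
    "right_foot": "avatar_right_foot",
}

def update_avatar(tracker, avatar):
    """Each frame, copy every tracked object's position and
    orientation onto the matching avatar joint so the avatar
    reproduces the wearer's gestures in real time."""
    for tracker_name, joint_name in TRACKER_TO_JOINT.items():
        pose = tracker.get_pose(tracker_name)  # position + quaternion
        avatar.set_joint_pose(joint_name, pose)
```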

They conducted two experiments to investigate whether head-mounted-display virtual reality is useful for researching the influence of body gestures in communication, and whether body gestures help communicate the meaning of a word. Participants worked in pairs and played a communication game in which one person had to describe the meanings of words to the other.

Ref.: Trevor J. Dodds et al., Talk to the Virtual Hands: Self-Animated Avatars Improve Communication in Head-Mounted Display Virtual Environments, PLoS One [doi:10.1371/journal.pone.0025759] (free access)

Source | KurzweilAI