Formatting Gaia + Technological Symbiosis

December 2nd, 2011

Patrick Millard | Formatting Gaia + Technological Symbiosis from vasa on Vimeo.

Nature and nurture work together to shape the brain

November 21st, 2011

At the Neuroscience 2011 conference, scientists at The Rockefeller University, The Scripps Research Institute, and the University of Pennsylvania presented new research demonstrating the impact that life experiences can have on genes and behavior. The studies examine how such environmental information can be transmitted from one generation to the next — a phenomenon known as epigenetics. This new knowledge could ultimately improve understanding of brain plasticity, the cognitive benefits of motherhood, and how a parent’s exposure to drugs, alcohol, and stress can alter brain development and behavior in their offspring.

The new findings show that:

  • Brain cell activation changes a protein involved in turning genes on and off, suggesting the protein may play a role in brain plasticity.
  • Prenatal exposure to amphetamines and alcohol produces abnormal numbers of chromosomes in fetal mouse brains. The findings suggest these abnormal counts may contribute to the developmental defects seen in children exposed to drugs and alcohol in utero.
  • Cocaine-induced changes in the brain may be inheritable. Sons of male rats exposed to cocaine are resistant to the rewarding effects of the drug.
  • Motherhood protects female mice against some of the negative effects of stress.
  • Mice conceived through breeding — but not those conceived through reproductive technologies — show anxiety-like and depressive-like behaviors similar to their fathers. The findings call into question how these behaviors are transmitted across generations.

Source | Kurzweil AI

Robot controls a person’s arm using electrodes

November 21st, 2011

A robot that can control both its own arm and a person’s arm to manipulate objects in a collaborative manner has been developed by Montpellier Laboratory of Informatics, Robotics, and Microelectronics (LIRMM) researchers, IEEE Spectrum Automation reports.

The robot controls the human limb by sending small electrical currents to electrodes taped to the person’s forearm and biceps, which allows the robot to command the elbow and hand to move. In the experiment, the person holds a ball, and the robot holds a hoop; the robot, a small humanoid, has to coordinate the movement of both human and robot arms to successfully drop the ball through the hoop.
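
As a rough sketch of the control idea (a generic proportional controller, not the LIRMM team’s published method; the gain, current limit, and electrode names are illustrative assumptions), the error between the desired and measured elbow angle might be mapped to a stimulation command like this:

    # A minimal sketch (not the LIRMM implementation) of mapping an
    # elbow-angle error to an electrical-stimulation command. The gain,
    # current limit, and electrode names are illustrative assumptions.
    def stimulation_current(desired_angle, measured_angle,
                            gain=0.8, max_current_ma=25.0):
        """Map elbow-angle error (degrees) to a stimulation current (mA).

        A proportional controller: a bigger error means stronger
        stimulation, clipped to a safe maximum. The sign of the error
        selects the biceps or triceps electrode.
        """
        error = desired_angle - measured_angle
        current = max(-max_current_ma, min(max_current_ma, gain * error))
        electrode = "biceps" if current > 0 else "triceps"
        return electrode, abs(current)

    # Example: the robot wants the human elbow at 90 degrees; the sensor
    # reads 60 degrees.
    electrode, ma = stimulation_current(90.0, 60.0)
    print(f"stimulate {electrode} at {ma:.1f} mA")  # stimulate biceps at 24.0 mA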

The researchers say their goal is to develop robotic technologies that can help people suffering from paralysis and other disabilities to regain some of their motor skills.

Source | Kurzweil AI

Liquid Robotics’ Wave Gliders begin historic swim across Pacific

November 21st, 2011

Four Wave Gliders — self-propelled robots, each about the size of a dolphin — left San Francisco on Nov. 17 for a 60,000-kilometer journey, IEEE Spectrum Automation reports.

Built by Liquid Robotics, the robots will use waves to power their propulsion systems and the Sun to power their sensors, as a capability demonstration. They will measure water salinity, temperature, clarity, and oxygen content; collect weather data; and gather information on wave features and currents.

The data from the fleet of robots is being streamed via the Iridium satellite network and made freely available on Google Earth’s Ocean Showcase.
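
For a sense of what such a telemetry stream might look like, here is a hypothetical record in Python; the field names, units, and values are assumptions for illustration, not Liquid Robotics’ actual format:

    # A hypothetical Wave Glider telemetry record. Field names, units,
    # and values are illustrative assumptions, not Liquid Robotics'
    # actual message format.
    import json
    from datetime import datetime, timezone

    record = {
        "glider_id": "wave-glider-1",
        "timestamp": datetime(2011, 11, 21, 12, 0, tzinfo=timezone.utc).isoformat(),
        "position": {"lat": 36.95, "lon": -127.40},
        "water": {
            "salinity_psu": 33.8,
            "temperature_c": 14.2,
            "dissolved_o2_mg_per_l": 7.9,
        },
        "weather": {"wind_speed_m_per_s": 6.3, "air_pressure_hpa": 1016.2},
        "waves": {"significant_height_m": 2.1, "period_s": 9.4},
    }

    # Serialized once per reporting interval and relayed over Iridium.
    print(json.dumps(record, indent=2))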

Source | Kurzweil AI

Bidirectional brain signals sense and move virtual objects

October 15th, 2011

In the study, monkeys moved and felt virtual objects using only their brain (credit: Duke University)

Two monkeys trained at the Duke University Center for Neuroengineering have learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

“Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton,” said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Sensing textures of virtual objects

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and differentiate their textures. Although the virtual objects employed in this study were visually identical, they were designed to have different artificial textures that could only be detected if the animals explored them with virtual hands controlled directly by their brain’s electrical activity.

The texture of the virtual objects was expressed as a pattern of electrical signals transmitted to the monkeys’ brains. Three different electrical patterns corresponded to the three different object textures.
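
Conceptually, the encoding could be sketched as a lookup from texture to pulse-train parameters. The rates and timings below are made-up placeholders, not the stimulation parameters used in the study:

    # A conceptual sketch of texture-to-stimulation encoding: each
    # virtual texture maps to a distinct pulse-train pattern. The rates
    # and timings are made-up placeholders, not the study's parameters.
    TEXTURE_PATTERNS = {
        "texture_A": {"pulse_rate_hz": 100, "burst_ms": 50, "gap_ms": 50},
        "texture_B": {"pulse_rate_hz": 200, "burst_ms": 25, "gap_ms": 75},
        "texture_C": {"pulse_rate_hz": 400, "burst_ms": 10, "gap_ms": 90},
    }

    def stimulation_schedule(texture, duration_ms=500):
        """Yield (start_ms, pulse_rate_hz) bursts for one texture."""
        p = TEXTURE_PATTERNS[texture]
        t = 0
        while t < duration_ms:
            yield (t, p["pulse_rate_hz"])
            t += p["burst_ms"] + p["gap_ms"]

    # The avatar hand 'touching' texture_B would drive this pulse train.
    for burst in stimulation_schedule("texture_B"):
        print(burst)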

Because no part of the animal’s real body was involved in the operation of this brain-machine-brain interface, these experiments suggest that in the future, patients who are severely paralyzed due to a spinal cord lesion may take advantage of this technology to regain mobility and also to have their sense of touch restored, said Nicolelis.

First bidirectional link between brain and virtual body

“This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body,” Nicolelis said.

“In this BMBI, the virtual body is controlled directly by the animal’s brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal’s cortex. We hope that in the next few years this technology could help to restore a more autonomous life to many patients who are currently locked in without being able to move or experience any tactile sensation of the surrounding world,” Nicolelis said.

“This is also the first time we’ve observed a brain controlling a virtual arm that explores objects while the brain simultaneously receives electrical feedback signals that describe the fine texture of objects ‘touched’ by the monkey’s newly acquired virtual hand.

“Such an interaction between the brain and a virtual avatar was totally independent of the animal’s real body, because the animals did not move their real arms and hands, nor did they use their real skin to touch the objects and identify their texture. It’s almost like creating a new sensory channel through which the brain can resume processing information that cannot reach it anymore through the real body and peripheral nerves.”

The combined electrical activity of populations of 50 to 200 neurons in the monkey’s motor cortex controlled the steering of the avatar arm, while thousands of neurons in the primary tactile cortex were simultaneously receiving continuous electrical feedback from the virtual hand’s palm that let the monkey discriminate between objects, based on their texture alone.
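
A minimal sketch of such population decoding, with random stand-in weights (a real decoder’s weights are fit from training data, for example by least squares):

    # A minimal sketch of linear population decoding: firing rates from
    # ~100 motor-cortex neurons map to a 2-D arm velocity. The weights
    # here are random stand-ins; real decoder weights are fit from
    # training data (e.g., by least squares).
    import numpy as np

    rng = np.random.default_rng(0)
    n_neurons = 100
    W = rng.normal(scale=0.1, size=(2, n_neurons))  # assumed decoder weights

    def decode_velocity(firing_rates):
        """Map a vector of firing rates (spikes/s) to (vx, vy)."""
        return W @ firing_rates

    rates = rng.poisson(lam=20, size=n_neurons).astype(float)
    vx, vy = decode_velocity(rates)
    print(f"decoded velocity: ({vx:.2f}, {vy:.2f})")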

Robotic exoskeleton for paralyzed patients

“The remarkable success with non-human primates is what makes us believe that humans could accomplish the same task much more easily in the near future,” Nicolelis said.

The findings provide further evidence that it may be possible to create a robotic exoskeleton that severely paralyzed patients could wear in order to explore and receive feedback from the outside world, Nicolelis said. The exoskeleton would be directly controlled by the patient’s voluntary brain activity to allow the patient to move autonomously. Simultaneously, sensors distributed across the exoskeleton would generate the type of tactile feedback needed for the patient’s brain to identify the texture, shape and temperature of objects, as well as many features of the surface upon which they walk.

This overall therapeutic approach is the one chosen by the Walk Again Project, an international, non-profit consortium, established by a team of Brazilian, American, Swiss, and German scientists, which aims at restoring full-body mobility to quadriplegic patients through a brain-machine-brain interface implemented in conjunction with a full-body robotic exoskeleton.

The international scientific team recently proposed to carry out its first public demonstration of such an autonomous exoskeleton during the opening game of the 2014 FIFA Soccer World Cup that will be held in Brazil.

Ref.: Joseph E. O’Doherty, Mikhail A. Lebedev, Peter J. Ifft, Katie Z. Zhuang, Solaiman Shokur, Hannes Bleuler, and Miguel A. L. Nicolelis, Active tactile exploration using a brain–machine–brain interface, Nature, October 2011 [doi:10.1038/nature10489]

Source | KurzweilAI

Brain linked to robotic hand; success hailed

October 15th, 2011

Assistant professor Jennifer Collinger, left, watches as quadriplegic research subject Tim Hemmes operates the mechanical prosthetic arm in a testing session at UPMC.

When it happened, emotions flashed like lightning.

The nearby robotic hand that Tim Hemmes was controlling with his mind touched his girlfriend Katie Schaffer’s outstretched hand.

One small touch for Mr. Hemmes; one giant reach for people with disabilities.

The tears of joy that flowed in an Oakland laboratory that day continued later, when Mr. Hemmes toasted his and University of Pittsburgh researchers’ success at a local restaurant with two daiquiris.

A simple act for most people marked a major advance in two decades of research into what was once the stuff of science fiction.

Mr. Hemmes’ success in putting the robotic hand in the waiting hand of Ms. Schaffer, 27, of Philadelphia, represented the first time a person with quadriplegia had used his mind to control a robotic arm so masterfully.

The 30-year-old man from Connoquenessing Township, Butler County, hadn’t moved his arms, hands or legs since a motorcycle accident seven years earlier. But Mr. Hemmes had practiced six hours a day, six days a week for nearly a month to move the arm with his mind.

That successful act increases hope for people with paralysis or loss of limbs that they can feed and dress themselves and open doors, among other tasks, with a mind-controlled robotic arm. It’s also improved the prospects of wiring around spinal cord injuries to allow motionless arms and legs to function once again.

“I think the potential here is incredible,” said Dr. Michael Boninger, director of UPMC’s Rehabilitation Institute and a principal investigator in the project. “This is a breakthrough for us.”

Mr. Hemmes? They say he’s a rock star.

Reading brain signals

In a project led by Andrew Schwartz, Ph.D., a University of Pittsburgh professor of neurobiology, researchers taught a monkey to mentally control a robotic arm to feed itself marshmallows. Electrodes had been shallowly implanted in its brain to read signals from neurons known to control arm motion.

Electrocorticography, or ECoG — in which an electrode grid is surgically placed against the brain without penetrating it — captures brain signals less intrusively.

ECoG has been used to locate sites of seizures and do other experiments in patients with epilepsy. Those experiments were prelude to seeking a candidate with quadriplegia to test ECoG’s capability to control a robotic arm developed by Johns Hopkins University.

The still unanswered question was whether the brains of people with long-term paralysis still produced signals to move their limbs.

ECoG picks up an array of brain signals, almost like a secret code or new language, that a computer algorithm can interpret and then move a robotic arm based on the person’s intentions. It’s a simple explanation for complex science.
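
As a toy illustration of that pipeline (a generic band-power decoder, not the Pitt team’s algorithm; the channel count, frequency band, and weights are all assumptions):

    # A toy version of the pipeline: compute a high-gamma band-power
    # feature per ECoG channel, then map features to a movement command
    # with a linear model. Channel count, band, and weights are assumed.
    import numpy as np

    def band_power(signal, fs=1000, lo=70, hi=110):
        """Mean spectral power of one channel in the lo-hi band (Hz)."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return spectrum[(freqs >= lo) & (freqs <= hi)].mean()

    rng = np.random.default_rng(1)
    ecog = rng.normal(size=(32, 1000))      # 32 channels x 1 s (simulated)
    features = np.array([band_power(ch) for ch in ecog])
    weights = rng.normal(size=(3, 32))      # 3 movement axes (assumed)
    command = weights @ features            # the arm-velocity command
    print("decoded command:", np.round(command, 2))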

Mr. Hemmes’ name cropped up so many times as a potential candidate that the team called him to gauge his interest.

He said no.

He was already involved in a research project in Cleveland and feared this one would interfere. But knowing they had the ideal candidate, the team called back. This time he agreed, as long as it would not limit his participation in future phases of research.

Mr. Hemmes became quadriplegic July 11, 2004, apparently after a deer darted onto the roadway, causing him to swerve his motorcycle onto gravel where his shoulder hit a mailbox, sending him flying headfirst into a guardrail. The top of his helmet became impaled on a guardrail I-beam, rendering his head motionless while his body continued flying, snapping his neck at the fourth cervical vertebra.

A passer-by found him with blue lips and no signs of breathing. Mr. Hemmes was flown by rescue helicopter to UPMC Mercy and diagnosed with quadriplegia — a condition in which he had lost use of his limbs and his body below the neck or shoulders. He had to learn how to breathe on his own. His doctor told him it was the worst accident he’d ever seen in which the person survived.

But after the process of adapting psychologically to quadriplegia, Mr. Hemmes chose to pursue a full life, especially after he got a device to operate a computer and another to operate a wheelchair with head motions.

Since January, he has operated the website — www.Pittsburghpitbullrescue.com — to rescue homeless pit bulls and find them new owners.

The former hockey player’s competitive spirit and willingness to face risk were key attributes. Elizabeth Tyler-Kabara, the UPMC neurosurgeon who would install the ECoG in Mr. Hemmes’ brain, said he had strong motivation and a vision that paralysis could be cured.

Ever since his accident, Mr. Hemmes said, he’s had the goal of hugging his daughter Jaylei, now 8. This could be the first step.

“It’s an honor that they picked me, and I feel humbled,” Mr. Hemmes said.

Mental gymnastics

Mr. Hemmes underwent several hours of surgery to install the ECoG at a precise location against the brain. Wires running under the skin down to a port near his collarbone — where wires can connect to the robotic arm — left him with a stiff neck for a few days.

Two days after surgery, he began exhaustive training on mentally maneuvering a computer cursor in various directions to reach targets and make them disappear. Next he learned to move the cursor diagonally before working for hours to capture targets in a three-dimensional computer display.

The U.S. Food and Drug Administration allowed the trial to last only 28 days, after which the ECoG grid was removed. The project, initially funded by UPMC, has received more than $6 million in funding from the Department of Veterans Affairs, the National Institutes of Health, and the U.S. Department of Defense’s Defense Advanced Research Projects Agency, known as DARPA.

Initially Mr. Hemmes tried thinking about flexing his arm to move the cursor. But he had better success visually grabbing the ball-shaped cursor to throw it toward a target on the screen. The “mental eye-grabbing” worked best when he was relaxed.

Soon he was capturing 15 of 16 targets and sometimes all 16 during timed sessions. The next challenge was moving the robotic arm with his mind.

The same mental processes worked, but the arm moved more slowly and in real space. Time was ticking away as the experiment approached its final days last month. With Mr. Hemmes finally moving the arm in all directions, Wei Wang — assistant professor of physical medicine and rehabilitation at Pitt’s School of Medicine, who also has worked on the signaling system — stood in front of him and raised his hand.

The robotic arm that Mr. Hemmes was controlling moved with fits and starts but in time reached Dr. Wang’s upheld hand. Mr. Hemmes gave him a high five.

The big moment arrived.

Katie Schaffer stood before her boyfriend with her hand extended. “Baby,” she said, encouraging him, “touch my hand.”

It took several minutes, but he raised the robotic hand and pushed it toward Ms. Schaffer until its palm finally touched hers. Tears flowed.

“It’s the first time I’ve reached out to anybody in over seven years,” Mr. Hemmes said. “I wanted to touch Katie. I never got to do that before.”

“I have tattoos, and I’m a big, strong guy,” he said in retrospect. “So if I’m going to cry, I’m going to bawl my eyes out. It was pure emotion.”

Curing paralysis

Mr. Hemmes said his accomplishments represent a first step toward “a cure for paralysis.” The research team is cautious about such statements without denying the possibility. They prefer identifying the goal of restoring function in people with disabilities.

“This was way beyond what we expected,” Dr. Tyler-Kabara said. “We really hit a home run, and I’m thrilled.”

The next phase will include up to six people tested in another 30-day trial with ECoG. A year-long trial will test the electrode array that shallowly penetrates the brain. Goals during these phases include expanding the degrees of arm motions to allow people to “pick up a grape or grasp and turn a door knob,” Dr. Tyler-Kabara said.

Anyone interested in participating should call 1-800-533-8762.

Mr. Hemmes says he will participate in future research.

“This is something big, but I’m not done yet,” he said. “I want to hug my daughter.”

Source | Pittsburgh Post-Gazette

Moon Packed with Precious Titanium, NASA Probe Finds

October 15th, 2011

This lunar mosaic shows the boundary between Mare Serenitatis and Mare Tranquillitatis. The relative blue color of the Tranquillitatis mare is due to higher abundances of the titanium-bearing mineral ilmenite.

A new map of the moon has uncovered a trove of areas rich in precious titanium ore, with some lunar rocks harboring 10 times as much of the stuff as rocks here on Earth do.

The map, which combined observations in visible and ultraviolet wavelengths, revealed the valuable titanium deposits. These findings could shed light on some of the mysteries of the lunar interior, and could also lay the groundwork for future mining on the moon, researchers said.

“Looking up at the moon, its surface appears painted with shades of grey — at least to the human eye,” Mark Robinson, of Arizona State University, said in a statement. “The maria appear reddish in some places and blue in others. Although subtle, these color variations tell us important things about the chemistry and evolution of the lunar surface. They indicate the titanium and iron abundance, as well as the maturity of a lunar soil.”

The results of the study were presented Friday (Oct. 7) at the joint meeting of the European Planetary Science Congress and the American Astronomical Society’s Division for Planetary Sciences in Nantes, France.

Mapping the lunar surface

The map of the moon’s surface was constructed using data from NASA’s Lunar Reconnaissance Orbiter (LRO), which has been circling the moon since June 2009. The probe’s wide-angle camera snapped pictures of the surface in seven different wavelengths at different resolutions.

Since specific minerals strongly reflect or absorb different parts of the electromagnetic spectrum, LRO’s instruments were able to give scientists a clearer picture of the chemical composition of the moon’s surface.

Robinson and his colleagues stitched together a mosaic using roughly 4,000 images that had been collected by the spacecraft over one month.

The researchers scanned the lunar surface and compared the brightness in the range of wavelengths from ultraviolet to visible light, picking out areas that are abundant in titanium. The scientists then cross-referenced their findings with lunar samples that were brought back to Earth from NASA’s Apollo flights and the Russian Luna missions.

These titanium-rich areas on the moon puzzled the researchers. The highest abundance of titanium in similar rocks on Earth hovers around 1 percent or less, the scientists explained. The new map shows that these troves of titanium on the moon range from about 1 percent to a little more than 10 percent.
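
The mapping boils down to a calibrated ratio of ultraviolet to visible reflectance. A sketch of the idea, with a made-up linear calibration in place of the published fit:

    # A sketch of the ratio-mapping idea: the ultraviolet/visible
    # reflectance ratio correlates with TiO2 content, calibrated against
    # returned Apollo and Luna samples. The linear fit below is a
    # made-up placeholder, not the published calibration.
    def estimate_tio2_wt_percent(refl_uv_321nm, refl_vis_415nm,
                                 slope=50.0, intercept=-15.0):
        """Estimate TiO2 weight percent from a two-band reflectance ratio."""
        ratio = refl_uv_321nm / refl_vis_415nm
        return max(0.0, slope * ratio + intercept)

    # Example pixel: a 'blue' mare surface with a relatively high UV ratio.
    print(f"{estimate_tio2_wt_percent(0.045, 0.13):.1f} wt% TiO2")  # 2.3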

“We still don’t really understand why we find much higher abundances of titanium on the moon compared to similar types of rocks on Earth,” Robinson said. “What the lunar titanium-richness does tell us is something about the conditions inside the moon shortly after it formed, knowledge that geochemists value for understanding the evolution of the moon.”

Valuable titanium ore

Titanium on the moon is primarily found in the mineral ilmenite, a compound that contains iron, titanium and oxygen. If humans one day mine on the moon, they could break down ilmenite to separate these elements.
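
As a back-of-the-envelope sketch, hydrogen reduction (FeTiO3 + H2 -> Fe + TiO2 + H2O) is one commonly discussed route for extracting oxygen, as water, from ilmenite; the arithmetic below is simple molar-mass bookkeeping, not a process design:

    # Back-of-the-envelope ilmenite arithmetic. Hydrogen reduction
    # (FeTiO3 + H2 -> Fe + TiO2 + H2O) is one commonly discussed route;
    # each mole of ilmenite gives up one oxygen atom as water. This is
    # molar-mass bookkeeping, not a process design.
    M = {"Fe": 55.85, "Ti": 47.87, "O": 16.00, "H2O": 18.02}
    m_ilmenite = M["Fe"] + M["Ti"] + 3 * M["O"]  # ~151.7 g/mol

    water_per_kg = M["H2O"] / m_ilmenite         # kg water per kg ilmenite
    print(f"~{water_per_kg * 1000:.0f} g of water per kg of ilmenite")  # ~119 g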

Furthermore, Apollo data indicated that titanium-rich minerals are more efficient at retaining solar wind particles, such as helium and hydrogen. These gases would likely be vital resources in the construction of lunar colonies and for exploration of the moon, the researchers said.

“Astronauts will want to visit places with both high scientific value and a high potential for resources that can be used to support exploration activities,” Robinson said. “Areas with high titanium provide both — a pathway to understanding the interior of the moon and potential mining resources.”

This composite image of the lunar surface highlights regions with varying mare compositions and enigmatic small volcanic structures known as “domes.”

The lunar map also shows how space weather changes the surface of the moon. Charged particles from solar wind and micrometeorite impacts can change the moon’s surface materials, pulverizing rock into a fine powder and altering the chemical composition of the lunar surface.

“One of the exciting discoveries we’ve made is that the effects of weathering show up much more quickly in ultraviolet than in visible or infrared wavelengths,” study co-author Brett Denevi, of Johns Hopkins University Applied Physics Laboratory in Laurel, Md., said in a statement. “In the [Lunar Reconnaissance Orbiter Camera] ultraviolet mosaics, even craters that we thought were very young appear relatively mature. Only small, very recently formed craters show up as fresh regolith exposed on the surface.”

Source | SPACE

How to communicate better in virtual worlds

October 15th, 2011

The experimental setup. Left: The participants wore a total of six tracked objects; right: the corresponding virtual environment, showing the avatars in the self-animated third-person perspective. (Credit: Trevor J. Dodds et al./PLoS One)

Mapping real-world motions to “self-animated” virtual avatars, using body tracking to communicate a wide range of gestures, helps people communicate better in virtual worlds like Second Life, say researchers from the Max Planck Institute for Biological Cybernetics and Korea University.

They conducted two experiments to investigate whether head-mounted display virtual reality is useful for researching the influence of body gestures in communication, and whether body gestures help communicate the meaning of a word. Participants worked in pairs and played a communication game in which one person had to describe the meanings of words to the other.
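
The “self-animation” step can be pictured as per-frame retargeting of tracked poses onto the avatar’s skeleton. The tracker names and joint mapping below are assumptions about such a rig, not the study’s actual setup:

    # A sketch of "self-animation": each frame, the poses of a handful
    # of tracked objects are retargeted onto the avatar's skeleton so
    # gestures carry over. Tracker names and the joint mapping are
    # assumptions about such a rig, not the study's actual setup.
    TRACKER_TO_JOINT = {
        "head": "avatar_head",
        "torso": "avatar_spine",
        "left_hand": "avatar_left_wrist",
        "right_hand": "avatar_right_wrist",
        "left_foot": "avatar_left_ankle",
        "right_foot": "avatar_right_ankle",
    }

    def retarget(tracked_poses):
        """Map {tracker: (position, rotation)} to avatar joint targets."""
        return {TRACKER_TO_JOINT[name]: pose
                for name, pose in tracked_poses.items()
                if name in TRACKER_TO_JOINT}

    frame = {"head": ((0.0, 1.7, 0.0), (0, 0, 0, 1)),
             "right_hand": ((0.4, 1.2, 0.3), (0, 0.7, 0, 0.7))}
    print(retarget(frame))  # fed to the avatar's IK solver each frame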

Ref.: Trevor J. Dodds et al., Talk to the Virtual Hands: Self-Animated Avatars Improve Communication in Head-Mounted Display Virtual Environments, PLoS One, DOI: 10.1371/journal.pone.0025759 (free access)

Source | KurzweilAI

Robot octopus shakes your hand

September 18th, 2011

Researchers at the Scuola Superiore Sant’Anna in Pisa, Italy, are creating a robot that mimics the abilities of a real octopus, with a robotic tentacle that can hold your hand or even grab a bottle, reports the New Scientist TV blog.

Source | Kurzweil AI

Carnegie Mellon competitions aimed at building useful robots

September 18th, 2011

Carnegie Mellon University will host a series of “RoboBowl” competitions aimed at bringing new robotic technologies to manufacturing, healthcare, and national security applications, reports Network World.

Source | Kurzweil AI

A Robot that Flies like a Bird

September 15th, 2011

‘Trojan horse’ particle sneaks chemotherapy in to kill ovarian cancer cells

September 15th, 2011

Researchers at Queen Mary, University of London have delivered a common chemotherapy drug to cancer cells inside tiny microparticles. The drug reduced ovarian cancer tumors in an animal model 65 times more effectively than the standard delivery method. This approach is now being developed for clinical use.

Lead researcher Dr. Ateh and colleagues found that by coating tiny microparticles of around 0.5 μm diameter with a special protein called CD95, they could trigger cancer cells into ingesting the particles, delivering a dose of a common chemotherapy drug called paclitaxel.

The key to their success is that CD95 attaches to another protein called CD95L, which is found much more commonly on the surface of cancer cells than it is on normal healthy cells. Once attached, the cancer cells ingest CD95 and the microparticle with it. Inside the cell, the microparticle can unload its chemotherapy cargo, which kills the cell to reduce the size of the tumor.
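
A toy model makes the selectivity argument concrete: if uptake probability grows with CD95L surface density, cells that display more of the receptor ingest far more particles. The function and densities below are illustrative only:

    # A toy model of the selectivity: the chance a cell ingests a
    # particle rises with the density of CD95L on its surface, which is
    # assumed much higher on tumor cells. All numbers are illustrative.
    def uptake_probability(cd95l_density, k=0.02):
        """Probability a particle is ingested, given receptor density."""
        return 1.0 - (1.0 - k) ** cd95l_density

    print(f"cancer cell:  {uptake_probability(100):.2f}")  # dense CD95L, ~0.87
    print(f"healthy cell: {uptake_probability(5):.2f}")    # sparse CD95L, ~0.10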

The researchers are now advancing these studies, and the start-up company BioMoti, which will develop the technology for clinical use, is planning to partner with established pharmaceutical companies on the clinical development of new treatments for specific types of cancer.

Ref.: Davidson D. Ateh et al., The intracellular uptake of CD95 modified paclitaxel-loaded poly(lactic-co-glycolic acid) microparticles, Biomaterials, August 2011

Source | KurzweilAI

Brain-Computer Interface for Disabled People to Control Second Life With Thought Available Commercially Next Year

August 12th, 2011

This is an awesome use of a brain-computer interface developed for disabled people, letting them navigate the 3D virtual world of Second Life through a simple interface controlled by thought:

Developed by an Austrian medical engineering firm called G.Tec, the prototype in the video above was released last year. But since New Scientist wrote about the project recently, and since it’s one of the few real-world applications of Second Life that’s already showing tangible, scalable, incredibly important social results, I checked with the company for an update:

“The technology is already on the market for spelling,” G.Tec’s Christoph Guger tells me, pointing to a company called Intendix. “The SL control will be on the market in about one year.” I imagine there are many disabled people in SL right now who would benefit from this, and many more not in SL who could, once it’s on the market. (A Japanese academic created a similar brain-to-SL interface in 2007, but to my knowledge, there are no commercial plans for it as yet.)

Guger shared some insights on how the technology works, and the disabled volunteers who helped them develop it:

G. Tec test volunteers and interface, courtesy Christoph Guger

Above is a pic of the main G. Tec interface with all the basic SL commands. There are other UIs for chatting (with 55 commands) and searching (with 40 commands).
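
For a flavor of how a P300-style interface (the approach behind spellers such as Intendix) can pick one of dozens of on-screen commands, here is a toy simulation; the command names and scoring model are made up:

    # A toy simulation of P300-style selection (the approach behind
    # spellers such as Intendix): items flash repeatedly, and the
    # command whose flashes evoke the strongest averaged response wins.
    # Command names and the scoring model are made up.
    import random

    COMMANDS = ["walk", "fly", "turn_left", "turn_right", "chat", "search"]

    def select_command(target="fly", n_flashes=10):
        """Pick the command with the largest mean evoked-response score."""
        scores = {c: 0.0 for c in COMMANDS}
        for _ in range(n_flashes):
            for c in COMMANDS:
                evoked = 1.0 if c == target else 0.0  # idealized P300
                scores[c] += evoked + random.gauss(0, 0.4)
        return max(scores, key=scores.get)

    print("selected:", select_command())  # almost always 'fly'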

Not surprisingly, Guger tells me their disabled volunteers enjoyed flying in Second Life most. “It is of course slower than with the keyboard/mouse,” Guger allows, “but the big advantage is that you appear as a normal user in SL, even if you are paralyzed.”

This brain-to-SL interface literally gives housebound disabled people a world to explore, and a means to meet and interact with as many people there as live in San Francisco; that in itself is an absolute good. But beyond that, Guger sees other medical applications: “First of all you can use it for monitoring, if the patient is still engaged and as a tool to measure his performance. Beside that, it gives access to many other people, which would not be possible otherwise. New games are also developed for ADHD children for example.”

Source | New World Notes

After 30 years, IBM says PC going way of vacuum tube and typewriter

August 12th, 2011

Thirty years ago, IBM created the first personal computer running Microsoft’s MS-DOS. Today, IBM and Microsoft seem to have very different views on the future of the PC.

IBM CTO Mark Dean of the company’s Middle East and Africa division, one of a dozen IBM engineers who designed that first machine unveiled Aug. 12, 1981, says PCs are “going the way of the vacuum tube, typewriter, vinyl records, CRT and incandescent light bulbs.”

IBM, of course, sold its PC division to Lenovo in 2005. Dean, in a blog post, writes that “I, personally, have moved beyond the PC as well. My primary computer now is a tablet. When I helped design the PC, I didn’t think I’d live long enough to witness its decline. But, while PCs will continue to be much-used devices, they’re no longer at the leading edge of computing.”

Dean’s remarks continue a debate over whether we are now in a so-called “post-PC” era, in which smartphones and tablets are replacing desktops and laptops. Not surprisingly, Microsoft — seller of 400 million Windows 7 licenses — isn’t a fan of that term.

“I prefer to think of it as the PC-plus era,” Microsoft corporate communications VP Frank Shaw writes in a blog post of his own.

In Microsoft’s vision, it’s the PC plus Bing, Windows Live, Windows phones, Office 365, Xbox, Skype and more.

“Our software lights up Windows PCs, Windows Phones and Xbox-connected entertainment systems, and a whole raft of other devices with embedded processors from gasoline pumps to ATMs to the latest soda vending machines, to name just a few,” Shaw writes. “In some cases we build our own hardware (Xbox, Kinect), while in most other cases we work with hardware partners on PCs, phones and other devices to ensure a great end-to-end experience that optimizes the combination of hardware and software.”

Shaw notes that the Apple II, Commodore PET and other devices preceded the first IBM 5150 PC running MS-DOS but says it was the IBM and Microsoft partnership that “was a defining moment for our industry” and fulfilled “the dream of a PC on every desk and in every home.”

The first IBM PC even predates the Macintosh and Windows, which launched in 1984 and 1985, respectively. Shaw says he still owns his first computer, the IBM Personal Portable booting MS-DOS version 5.1.

Although Microsoft’s role in the daily lives of personal computer users could be diminished by the rise of iPhones, Android phones and iPads, IBM’s Dean says it’s not simply a new type of device that is replacing the PC as “the center of computing.”

“PCs are being replaced at the center of computing not by another type of device — though there’s plenty of excitement about smartphones and tablets — but by new ideas about the role that computing can play in progress,” Dean writes. “These days, it’s becoming clear that innovation flourishes best not on devices but in the social spaces between them, where people and ideas meet and interact. It is there that computing can have the most powerful impact on economy, society and people’s lives.”

While that sounds pretty vague, Dean notes that IBM has boosted its profit margins since selling off its PC division with a strategy of exiting commodity businesses and “expanding in higher-value markets.” One example: IBM’s Watson, newly crowned Jeopardy champion.

“We conduct fundamental scientific research, design some of the world’s most advanced chips and computers, provide software that companies and governments run on, and offer business consulting, IT services and solutions that enable our clients to transform themselves continuously, just like we do,” Dean writes.

For all the debate over whether this is a “post-PC” era, it’s clear more people today own Windows computers and Macs than smartphones and tablets, and our new mobile devices are complementing desktops and laptops rather than replacing them.

It’s hard to beat the convenience of an easy-to-use, Internet-connected device in one’s pocket, but many tasks are cumbersome without a full, physical keyboard. Even social media, which seems as “post-PC” as it gets upon first glance, requires a lot of typing.

Some people envision a future where a smartphone is the hub of all your computing needs, and simply hooks into a dock for those rare times you want a bigger screen, mouse and keyboard. Others talk about a future where any surface, whether a wall or table, is transformed into a touch-screen computer with a snap of one’s fingers.

For now, though, most people making these proclamations are typing their blog posts on PCs.

Source | Kurzweil AI

Scientists copy the ways viruses deliver genes

August 12th, 2011

National Physical Laboratory (NPL) scientists have mimicked the ways viruses infect human cells and deliver their genetic material, hoping to apply the approach to gene therapy to correct defective genes such as those that cause cancer.

The researchers used the GeT (gene transporter) model peptide sequence to transfer a synthetic gene encoding a green fluorescent protein — a protein whose fluorescence in cells can be seen and monitored using fluorescence microscopy. GeT wraps around genes, transports them through cell membranes, and helps them escape intracellular degradation traps. The process mimics the mechanisms viruses use to infect human cells.

GeT was designed to undergo differential membrane-induced folding — a process whereby the peptide changes its structure in response to only one type of membrane. This is what enables the peptide, and viruses, to carry genes into the cell. GeT is antibacterial and capable of gene transfer even in bacteria-challenged environments.

The gene transporter design can serve as a potential template for non-viral delivery systems and specialist treatments of genetic disorders, the researchers said.

Source | Kurzweil AI