
monkeywire - [monkeywire] Monkey’s Thoughts Propel Robot, a Step That May Help Humans

monkeywire AT lists.ibiblio.org

Subject: The #1 source for news about monkeys and apes

  • From: "Josh Greenman" <josh.greenman AT gmail.com>
  • To: monkeywire AT lists.ibiblio.org
  • Subject: [monkeywire] Monkey’s Thoughts Propel Robot, a Step That May Help Humans
  • Date: Mon, 14 Jan 2008 21:42:32 -0500

http://www.nytimes.com/2008/01/15/science/15robo.html

Monkey's Thoughts Propel Robot, a Step That May Help Humans

By SANDRA BLAKESLEE

If Idoya could talk, she would have plenty to boast about.

On Thursday, the 12-pound, 32-inch monkey made a 200-pound, 5-foot
humanoid robot walk on a treadmill using only her brain activity.

She was in North Carolina, and the robot was in Japan.

It was the first time that brain signals had been used to make a robot
walk, said Dr. Miguel A. L. Nicolelis, a neuroscientist at Duke
University whose laboratory designed and carried out the experiment.

In 2003, Dr. Nicolelis's team proved that monkeys could use their
thoughts alone to control a robotic arm for reaching and grasping.

These experiments, Dr. Nicolelis said, are the first steps toward a
brain-machine interface that might permit paralyzed people to walk by
directing devices with their thoughts. Electrodes in the person's
brain would send signals to a device worn on the hip, like a cell
phone or pager, that would relay those signals to a pair of braces, a
kind of external skeleton, worn on the legs.

"When that person thinks about walking," he said, "walking happens."

Richard A. Andersen, an expert on such systems at the California
Institute of Technology in Pasadena who was not involved in the
experiment, said that it was "an important advance to achieve
locomotion with a brain-machine interface."

Another expert, Nicho Hatsopoulos, a professor at the University of
Chicago, said that the experiment was "an exciting development. And
the use of an exoskeleton could be quite fruitful."

A brain-machine interface is any system that allows people or animals
to use their brain activity to control an external device. But until
ways are found to safely implant electrodes into human brains, most
research will remain focused on animals.

In preparing for the experiment, Idoya was trained to walk upright on
a treadmill. She held onto a bar with her hands and got treats —
raisins and Cheerios — as she walked at different speeds, forward and
backward, for 15 minutes a day, three days a week, for two months.

Meanwhile, electrodes implanted in the so-called leg area of Idoya's
brain recorded the activity of 250 to 300 neurons that fired while she
walked. Some neurons became active when her ankle, knee and hip joints
moved. Others responded when her feet touched the ground. And some
fired in anticipation of her movements.

To obtain a detailed model of Idoya's leg movements, the researchers
also painted her ankle, knee and hip joints with fluorescent stage
makeup and, using a special high speed camera, captured her movements
on video.

The video and brain cell activity were then combined and translated
into a format that a computer could read. The resulting model could
predict, with 90 percent accuracy, all permutations of Idoya's leg
movements three to four seconds before the movements took place.
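
The article does not spell out the lab's decoding algorithm. As a rough
illustration of the general idea, the Python sketch below fits a simple
linear decoder that maps a short history of binned firing rates from a
few hundred neurons to joint angles; the neuron count comes from the
article, while the bin count, joint count, sample sizes, and the
synthetic data standing in for real recordings are all hypothetical.

    # Illustrative only: a minimal linear decoder from binned firing
    # rates to joint angles, trained on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)

    n_neurons = 280   # article: 250 to 300 neurons were recorded
    n_lags = 5        # hypothetical: five bins of spiking history per prediction
    n_samples = 3000  # hypothetical length of a training session
    n_joints = 6      # hypothetical: ankle, knee and hip angles for both legs

    # Synthetic stand-in for binned firing rates from the leg area.
    rates = rng.poisson(5, size=(n_samples, n_neurons)).astype(float)

    def lagged(rates, n_lags):
        """Stack the last n_lags bins of firing rates into one feature row per step."""
        rows = [rates[t - n_lags:t].ravel() for t in range(n_lags, len(rates))]
        X = np.asarray(rows)
        return np.hstack([X, np.ones((len(X), 1))])  # bias column

    X = lagged(rates, n_lags)

    # Generate joint angles from a hidden linear mapping plus noise, standing
    # in for the video-derived joint angles, so the decoder has something to recover.
    true_w = rng.normal(size=(X.shape[1], n_joints))
    joints = X @ true_w + rng.normal(scale=0.1, size=(len(X), n_joints))

    # Fit the decoder by ordinary least squares on a training split.
    split = int(0.8 * len(X))
    w, *_ = np.linalg.lstsq(X[:split], joints[:split], rcond=None)
    pred = X[split:] @ w

    # Report held-out accuracy as a correlation per joint, loosely analogous
    # to the "90 percent accuracy" figure quoted above.
    for j in range(n_joints):
        r = np.corrcoef(pred[:, j], joints[split:, j])[0, 1]
        print(f"joint {j}: r = {r:.2f}")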

On Thursday, an alert and ready-to-work Idoya stepped onto her
treadmill and began walking at a steady pace with electrodes implanted
in her brain. Her walking pattern and brain signals were collected,
fed into the computer and transmitted over a high-speed Internet link
to a robot in Kyoto, Japan.

The robot, called CB for Computational Brain, has the same range of
motion as a human. It can dance, squat, point and "feel" the ground
with sensors embedded in its feet, and it will not fall over when
shoved.

Designed by Gordon Cheng and colleagues at the ATR Computational
Neuroscience Laboratories in Kyoto, the robot was chosen for the
experiment because of its extraordinary ability to mimic human
locomotion.

As Idoya's brain signals streamed into CB's actuators, her job was to
make the robot walk steadily via her own brain activity. She could see
the back of CB's legs on an enormous movie screen in front of her
treadmill and received treats if she could make the robot's joints
move in synchrony with her own leg movements.

As Idoya walked, CB walked at exactly the same pace. Recordings from
Idoya's brain revealed that her neurons fired each time she took a
step and each time the robot took a step.

"It's walking!" Dr. Nicolelis said. "That's one small step for a robot
and one giant leap for a primate."

The signals from Idoya's brain sent to the robot, and the video of the
robot sent back to Idoya, were relayed in less than a quarter of a
second, he said. That was so fast that the robot's movements meshed
with the monkey's experience.
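
The wire protocol behind that link is not described. Purely as an
illustration of the timing constraint, the sketch below streams
hypothetical decoded joint commands over a local UDP loopback, standing
in for the Duke-to-Kyoto link, and checks each round trip against the
quarter-second budget the article cites.

    # Illustrative only: check round-trip latency of command/feedback packets
    # against a 250 ms budget, using a local echo server as the "robot".
    import socket
    import struct
    import threading
    import time

    BUDGET_S = 0.25   # article: signals relayed in under a quarter of a second
    N_JOINTS = 6      # hypothetical joint count, matching the decoder sketch above

    def echo(sock):
        """Stand-in for the robot end: echo each command packet back as feedback."""
        while True:
            data, addr = sock.recvfrom(1024)
            sock.sendto(data, addr)

    robot = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    robot.bind(("127.0.0.1", 0))
    threading.Thread(target=echo, args=(robot,), daemon=True).start()

    lab = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    lab.connect(robot.getsockname())

    for step in range(10):
        # A packet of hypothetical decoded joint commands for this time step.
        command = struct.pack(f"{N_JOINTS}f", *([0.1 * step] * N_JOINTS))
        t0 = time.monotonic()
        lab.send(command)
        lab.recv(1024)
        rtt = time.monotonic() - t0
        status = "within budget" if rtt < BUDGET_S else "too slow"
        print(f"step {step}: round trip {rtt * 1000:.2f} ms ({status})")
        time.sleep(0.05)  # hypothetical 20 Hz command rate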

An hour into the experiment, the researchers pulled a trick on Idoya.
They stopped her treadmill. Everyone held their breath. What would
Idoya do?

"Her eyes remained focused like crazy on CB's legs," Dr. Nicolelis said.

She got treats galore. The robot kept walking. And the researchers
were jubilant.

When Idoya's brain signals made the robot walk, some neurons in her
brain controlled her own legs, whereas others controlled the robot's
legs. The latter set of neurons had basically become attuned to the
robot's legs after about an hour of practice and visual feedback.

Idoya cannot talk, but her brain signals revealed that after the
treadmill stopped, she was able to make CB walk for three full minutes
by attending to its legs and not her own.

Vision is a powerful, dominant signal in the brain, Dr. Nicolelis
said. Idoya's motor cortex, where the electrodes were implanted, had
started to absorb the representation of the robot's legs — as if they
belonged to Idoya herself.

In earlier experiments, Dr. Nicolelis found that 20 percent of cells
in a monkey's motor cortex were active only when a robotic arm moved.
He said it meant that tools like robotic arms and legs could be
assimilated via learning into an animal's body representation.

In the near future, Idoya and other bipedal monkeys will be getting
more feedback from CB in the form of microstimulation to neurons that
specialize in the sense of touch related to the legs and feet. When
CB's feet touch the ground, sensors will detect pressure and calculate
balance. When that information goes directly into the monkeys' brains,
Dr. Nicolelis said, they will have the strong impression that they can
feel CB's feet hitting the ground.

At that point, the monkeys will be asked to make CB walk across a room
by using just their thoughts.

"We have shown that you can take signals across the planet in the same
time scale that a biological system works," Dr. Nicolelis said. "Here
the target happens to be a robot. It could be a crane. Or any tool of
any size or magnitude. The body does not have a monopoly for enacting
the desires of the brain."

To prove this point, Dr. Nicolelis and his colleague, Dr. Manoel
Jacobsen Teixeira, a neurosurgeon at the Sirio-Lebanese Hospital in
São Paulo, Brazil, plan to demonstrate by the end of the year that
humans can operate an exoskeleton with their thoughts.

It is not uncommon for people to have their arms ripped from their
shoulder sockets during a motorcycle or automobile accident, Dr.
Nicolelis said. All the nerves are torn, leaving the arm paralyzed but
in chronic pain.

Dr. Teixeira is implanting electrodes on the surface of these
patients' brains and stimulating the underlying region where the arm
is represented. The pain goes away.

By pushing the same electrodes slightly deeper in the brain, Dr.
Nicolelis said, it should be possible to record brain activity
involved in moving the arm and intending to move the arm. The
patients' paralyzed arms will then be placed into an exoskeleton or
shell equipped with motors and sensors that send touch sensations back
to the brain.

"They should be able to move the arm with their thoughts," he said.
"This is science fiction coming to life."


