Sunday, October 16, 2011

MIND READING

Technology Turns Thought Into Action
An old technology is providing new insights into the human brain.
The technology is called electrocorticography, or ECoG, and it uses electrodes placed on the surface of the brain to detect electrical signals coming from the brain itself.
Doctors have been using ECoG since the 1950s to figure out which area of the brain is causing seizures in people with severe epilepsy. But in the past decade, scientists have shown that when connected to a computer running special software, ECoG also can be used to control robotic arms, study how the brain produces speech and even decode thoughts.
In one recent experiment, researchers were able to use ECoG to determine the word a person was imagining.
"This is both very exciting and somewhat frightening at the same time," says Gerwin Schalk, a researcher who studies ECoG at the New York State Department of Health's Wadsworth Center in Albany. "It really goes pretty close to what people used to call mind reading."
So perhaps it's not surprising that Schalk's research is funded by both the National Institutes of Health and the U.S. Army.
The key to all of the new uses for ECoG is software, designed in part by Schalk, that helps scientists decode the electrical signals coming from the brain.
The brain uses those signals every time we wiggle a toe or form a thought. But the signals also provide a real-time broadcast of precisely what the brain is doing, and Schalk's software allows scientists to eavesdrop on this broadcast.
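To make that idea concrete, here is a minimal Python sketch of the kind of processing such decoding software performs, run on synthetic data rather than real recordings: band-pass the multichannel signal into the high-gamma range that ECoG studies commonly rely on, then compute a smoothed power envelope that a decoder could map onto movements or sounds. The sampling rate, channel count, and filter settings below are assumptions for illustration, not the study's actual parameters.

    # Minimal sketch of ECoG feature extraction on synthetic data.
    # The real research software (the BCI2000 platform Schalk helped create)
    # is far more elaborate; nothing here is its actual code.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1200                      # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)   # ten seconds of data
    n_channels = 8

    # Stand-in multichannel "ECoG": broadband noise per channel.
    rng = np.random.default_rng(0)
    ecog = rng.standard_normal((n_channels, t.size))

    # Band-pass in the high-gamma range (~70-170 Hz), a band that tracks
    # local cortical activity in ECoG recordings.
    b, a = butter(4, [70, 170], btype="bandpass", fs=fs)
    high_gamma = filtfilt(b, a, ecog, axis=1)

    # Power envelope of one channel, smoothed over 100 ms windows: the basic
    # feature a decoder maps onto movements, sounds, or words.
    win = int(0.1 * fs)
    power = np.convolve(np.ones(win) / win, np.abs(high_gamma[0]) ** 2, mode="same")
    print(power.shape)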
[Video: How does electrocorticography interpret electric signals in the brain? Credit: American Museum of Natural History Science Bulletins]
Controlling A Virtual Hand
Schalk demonstrates some of ECoG's capabilities in a video shot by the American Museum of Natural History in New York as part of an exhibit called Brain: The Inside Story.
In the video, Schalk is seen working with a young man sitting in a hospital bed at Albany Medical Center, staring at the image of a hand on a computer screen.
Schalk asks him to close the hand. The hand on the screen closes. Schalk asks him to open the hand. The virtual hand opens.
What's striking about this scene is that the young man's own hand isn't moving — he is clenching and unclenching the virtual fist using only his thoughts.
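As a toy illustration (not the study's own code), the control step can be as simple as thresholding a decoded motor-cortex feature: when the feature crosses a calibrated level, the on-screen hand closes; when it falls back, the hand opens. The numbers below are made up.

    # Hypothetical mapping from a decoded motor feature to a virtual hand state.
    import numpy as np

    rng = np.random.default_rng(1)
    # Pretend these are successive high-gamma power readings from a hand-motor electrode.
    power_readings = rng.gamma(shape=2.0, scale=1.0, size=20)
    threshold = np.percentile(power_readings, 70)   # calibration step (assumed)

    for p in power_readings:
        state = "CLOSE" if p > threshold else "OPEN"
        print(f"power={p:5.2f} -> hand {state}")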
Like all volunteers in ECoG experiments so far, this patient has severe epilepsy. He took part in the experiment while doctors were using ECoG to find the source of his seizures.
But the experiment shows how the technology could help a very different sort of patient — someone paralyzed by a spinal injury or Lou Gehrig's disease. ECoG could allow someone like that to operate a robotic arm with just their thoughts.
The experiment also shows how many different areas of the brain get involved in things we take for granted, Schalk says.
"Even for simple functions such as opening and closing the hand, there are many, many areas that contribute to the movement," he says.
Hitting The Sweet Spot Of Brain Signals
ECoG has proven to be ideal for simultaneously detecting the signals from a large number of brain areas.
Bionic arms are just one potential use for ECoG. Researchers say the technology has proved far more powerful and versatile than anyone expected.
"Every couple of weeks we find something that really kind of makes us scratch our head and say, 'Wow, that's pretty neat,' " says Eric Leuthardt, a neurosurgeon at Washington University in St. Louis who has worked closely with Schalk.

One reason is that ECoG hits a sweet spot between two competing approaches to detecting brain signals, Leuthardt says.
One of these approaches requires placing electrodes deep in the brain. That allows scientists to monitor individual brain cells with great precision. But they can't monitor very many brain cells at the same time. Another approach is to put electrodes on the scalp, but the signals aren't very clear because they must pass through skin and bone.
ECoG does require surgery, but not on the brain itself.
Surgeons make an incision in the scalp and remove a portion of the skull, Leuthardt says. Then, he says, they place a grid of electrodes on the surface of the brain and "close everything back up."
Wires from the electrodes exit through the scalp and are connected directly to a computer, creating what's known as a brain-computer interface.
Watching The Brain Listen To Music
In theory, researchers could receive signals from hundreds or even thousands of electrodes. So far, they haven't gone beyond dozens, yet the results have been spectacular.
Schalk shows some of what ECoG can do in his lab. There aren't any animals or test tubes here, but there are plenty of computers, including one playing Pink Floyd's album The Wall.
Schalk is showing me the results of experiments he did using ECoG to monitor people listening to The Wall. He points toward two waveforms on the computer screen. One shows the mountains and valleys that represent changes in the music volume; the second waveform looks very similar, but it represents the electrical signals generated by the brain in response to the music.
"There's a very close correlation between the actual loudness in the music that is just playing right now and the intensity of the music that we're decoding or inferring from the person's brain," Schalk says. "Isn't that pretty awesome?"
A Brain On 'The Wall'
[Graph: In this experiment, researchers compared the volume of Pink Floyd music played to a patient (music power) with the data gathered from the brain listening to that music (decoded music power). Notice how the shapes of the waveforms are similar.]
The brain signal is so distinctive you could almost recognize the music from the waveform alone, Schalk says.
In the second part of the music experiment, volunteers listened to Pink Floyd for about 10 seconds, then the music was interrupted by about a second of complete silence.
The experiment shows that while it may have been silent in the room during the test, it was not silent in the volunteers' brains.
Schalk's computer screen shows that even when the music stops, the waveform from the brain continues as if the music were still playing. What we're seeing is the brain's attempt to fill in the missing sounds, Schalk says.
"The brain basically tells us a lot of information about the music in the times when there is really no music," he says.
It's a vivid illustration of something neuroscientists have been studying for many years, Schalk says. Whether it's musical phrases or strings of words or scenery we look at, our brains are always filling in missing information.
Eavesdropping On Your Inner Monologue?
ECoG is also revealing things about how the brain creates speech.
Schalk and other researchers are using the technology to watch the brains of people as they speak out loud and also as they say the words silently to themselves.
"One of the surprising initial findings coming out of that research was that actual and imagined speech [are] very, very different," Schalk says.
When your brain wants you to say a word out loud, it produces two sets of signals. One has to do with moving the muscles controlling the mouth and vocal tract. The second set involves signals in the brain's auditory system.
But when a person simply thinks of a word instead of saying it, there are no muscle signals — just the activity in the parts of the brain involved in listening.
"That seems to suggest that what imagined speech actually really is, it's more like internally listening to your own voice," Schalk says.
So, he says, it should be possible to use ECoG to eavesdrop on that inner voice and decode what we're thinking.
Schalk says he hasn't quite done that yet. But he's close. In one experiment, he says, the ECoG system tried to recognize several dozen unspoken words in the minds of volunteers. It was right about half the time.
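That is well above chance: with several dozen candidate words, say 36, random guessing would be right only about 3 percent of the time. Below is a hedged sketch of how such a word decoder might be evaluated, using scikit-learn on synthetic features rather than the study's actual data or model.

    # Illustrative evaluation of a word classifier on synthetic "ECoG features".
    # The word count, feature count, and classifier are assumptions, not the study's.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n_words, trials_per_word, n_features = 36, 10, 40

    # Each word gets its own mean feature pattern, plus per-trial noise.
    word_patterns = rng.standard_normal((n_words, n_features))
    X = np.repeat(word_patterns, trials_per_word, axis=0) + rng.standard_normal(
        (n_words * trials_per_word, n_features)
    )
    y = np.repeat(np.arange(n_words), trials_per_word)

    clf = LinearDiscriminantAnalysis()
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f} (chance ~ {1 / n_words:.2f})")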
(Mind Reading: Technology Turns Thought Into Action, by Jon Hamilton, NPR)
