It’s been decades since neuroscience began the search for ways to ‘read the brain’ so that people can move, communicate, and respond when their physical bodies can no longer do so. Just about every year brings advances, and announcements of this or that device that can interpret the electrical pulses of the brain’s neurons to do things like control a computer, activate a robotic limb, or produce artificial speech.
There are two important tracks to this research: one uses sensors physically implanted in the brain, the other uses external sensors (usually touching the skin of the head). The trade-offs are pretty obvious. Sensors in the brain can be very sensitive and precise, but implanting anything in the brain carries a relatively high degree of risk – and long-term implantation is even more problematic. External sensors are, of course, non-invasive and therefore less dangerous, but the signals they receive tend to be a lot less precise and clear.
Underlying both tracks of research is the work being done to isolate (if that’s the word) various areas of the brain associated with various kinds of thought and action – speaking, motor control, and so forth. This research has produced a lot of results, though often on the gross side – using the word gross to mean ‘roughly’, ‘in general’, ‘in the ballpark of’. Not too long ago it was thought that discrete actions or types of thought would have discrete locations in the brain. Sometimes this is true, but as research is now showing, the brain more often uses many areas in concert for much of its thinking, motor control, and memory. This makes the work of mapping brain signals more difficult, and the interpretation more difficult still.
This brings me to a new study from the University of Maryland (College Park, USA), just published in the Journal of Neuroscience. A team of neuroscientists headed by Jose Contreras-Vidal used an array of sensors on the scalps of five participants to record their brains’ electrical activity by electroencephalography (EEG). The participants were asked to touch a random sequence of buttons while their brain signals and hand motions were recorded. The team then attempted to decode the signals and reproduce the hand movements in three-dimensional graphics. It worked.
Unfortunately, the press release leaves out what must be a couple of key developments. Since EEG recording is nearly a hundred years old, how did the team achieve a more precise reading of brain signals using EEG? In some way it has to be better than older techniques. And then, how were they able to decode the signals? Presumably it was with a computer and a database of known signals, but the release provides no clue as to anything unusual. It is not unusual for the press releases of scientific findings to leave out, under-report, or fail to explain the most important ‘core’ techniques and insights, especially if they are new. I’m not sure if this comes from a desire to be secretive (although obviously most of the information has to be in the published paper), or if the person writing the release doesn’t understand the core issues well enough. Of course, if you want the most important insights – read the paper. Fine for specialists, but this leaves the general public and interested media without much of a clue.
In this case, after waiting a couple of days for the paper to be published, I found that the EEG equipment was a 55-channel device, with 34 sensors used – pretty standard. The signals were processed by standardized low-resolution brain electromagnetic tomography (sLORETA), which provided brain locations for the hand movements. In short, this is not fancy, although the interpretation and analysis probably required skill. The report notes that variation in hand movements made it difficult to decode accurately.
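To give a sense of what ‘decoding’ means here, the general idea behind studies like this one can be sketched as a regularized linear regression from time-lagged EEG samples to 3-D hand velocity. Everything below is synthetic and illustrative – the channel count, lag window, and ridge penalty are my assumptions, not the paper’s actual parameters or method.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels, n_lags = 2000, 8, 5

# Synthetic "EEG": random signals with a hidden linear relation to velocity.
eeg = rng.standard_normal((n_samples, n_channels))
true_w = rng.standard_normal((n_channels * n_lags, 3))  # maps lagged EEG -> 3-D velocity

def lagged(x, lags):
    """Stack each sample together with its previous `lags - 1` samples."""
    return np.array([x[t - lags + 1 : t + 1].ravel()
                     for t in range(lags - 1, len(x))])

X = lagged(eeg, n_lags)                       # shape: (n_samples - n_lags + 1, channels * lags)
velocity = X @ true_w + 0.1 * rng.standard_normal((len(X), 3))

# Ridge regression (least squares with a small penalty), fit on the first half.
half = len(X) // 2
lam = 1e-3
A = X[:half].T @ X[:half] + lam * np.eye(X.shape[1])
w_hat = np.linalg.solve(A, X[:half].T @ velocity[:half])

# Evaluate on the held-out second half: correlation per velocity axis.
pred = X[half:] @ w_hat
for axis in range(3):
    r = np.corrcoef(pred[:, axis], velocity[half:, axis])[0, 1]
    print(f"axis {axis}: r = {r:.2f}")
```

On this synthetic data the decoder recovers the hidden mapping almost perfectly; with real EEG, noise and the movement variability the paper mentions would pull those correlations down considerably.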
The most important point, subject to further testing and verification, is that it is possible to achieve accurate readings of cerebral motor control without resorting to brain implants. If it can be done now, in a form that must be considered initial or even crude, then it is possible that eventually cerebral control of external motor activity – controlling a robotic arm with brain waves, for example – could be achieved with some kind of external skull-reading device.