Studying Brain-to-Computer Interfaces Only on Humans: with great results!
December 19, 2012
Brain-to-computer interfaces can now, reportedly, tell you what people are thinking. The level of brain detail available today is said to be “amazing,” and our understanding of how the brain functions is growing rapidly. Researchers can now tell from brain signals whether someone is moving a hand to the left or to the right, and so on. The team says they have no idea how far this technology will go; only the future will tell.
Instead of experimenting on animals, this team is working to understand the human brain by studying humans alone.
"Brain to computer interfaces that can actually tell you what people are thinking" Really? I only know of interfaces that can read what we perceive (external visual information) and what we are intending our muscles to do (motor cortex). I think calling these things "thought" is a stretch, because then you could read someone's "thoughts" just by feeling the tension in their muscles, or reading nerve info in the PNS or optic nerves. What does it mean for someone to think "EE", is it a specific and slight tension in mouth, tongue and diagram? Is it the result of sonic mental imagery? Is it the the memory recall of a sound? These claims are not meaningful without knowing the answers to these questions, and AFAIK we don't.
If there is a working system that reads thoughts (like the contents of working memory, or an imagined visual image), then please post a link!
However, I agree that to really read the brain 100% and communicate its qualia, we will need advanced technologies, maybe even nano-bots that can cross the blood-brain barrier and send signals to a computer.
Posted by b. on 12/20 at 12:10 PM
I agree that animal studies are unethical. The only way to get the spatial and temporal resolution required for these studies is highly invasive, and the animal is destroyed afterwards.
I checked out the first link but don't have time right now to watch the BBC doc. I was not aware of thalamus recordings of visual perception. FYI, as far as I understand it, the thalamus routes pretty much all cortical activity, so it's slightly unclear how easy it would be to isolate the visual parts alone. It is interesting, though, in that the thalamus is implicated in consciousness.
The reason the brain-reading studies are motor- and sensory-based is that those are the cases where brain activity can be correlated with something objectively observable (mapping visual cortex activity to a perceived image, or mapping motor activity to muscle movement or EMG); the sketch below illustrates the idea.
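To make that correlational recipe concrete, here is a minimal sketch in Python. Everything is simulated: the channel count, the effect size, and the nearest-centroid decoder are illustrative assumptions, not any particular lab's method. The point is just that the "reading" consists of pairing recorded activity with an objectively observable label (left vs. right hand movement):

```python
# Toy illustration of correlation-based brain decoding: we "read" the
# brain only by pairing recorded activity with an observable behavior.
# All data is simulated; channel counts and effect sizes are invented.

import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels = 200, 16

# Simulated "brain activity": left trials shift a few channels up,
# right trials shift them down, plus noise.
labels = rng.integers(0, 2, n_trials)          # 0 = left, 1 = right
signal = np.zeros((n_trials, n_channels))
signal[:, :4] = np.where(labels[:, None] == 0, 1.0, -1.0)
data = signal + rng.normal(0, 1.5, (n_trials, n_channels))

# Fit a nearest-centroid decoder on half the trials...
train, test = slice(0, 100), slice(100, None)
centroids = np.array([data[train][labels[train] == k].mean(axis=0)
                      for k in (0, 1)])

# ...and evaluate on the held-out half: predict the closer centroid.
dists = np.linalg.norm(data[test][:, None, :] - centroids[None], axis=2)
predictions = dists.argmin(axis=1)
print(f"held-out decoding accuracy: {(predictions == labels[test]).mean():.2f}")
```

Note that the decoder never touches anything like "thought"; it only learns the statistical link between activity and an externally measurable movement label, which is exactly the limitation described above.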
In the case of self-speech (and other thoughts), we have no objectively observable feature to correlate the brain activity with. Sure, you could tell a person to say something specific in their head over and over again, but there are still lots of variables: in what voice is it said? How fast? With what accent? At what volume? From what distance? We know our brains are active all the time, so it's unclear which aspects of that activity are conscious and which are not; separating conscious from unconscious thought would be tricky to say the least. Just think of a Freudian slip: a person thinks they are saying one thing, but their mouth is saying another. How would brain-reading tell which one is the "thought"?
This is a key methodological problem with any "thought" reading.
My main criticism of these studies is that we already have a highly evolved system that interfaces the brain with the external world (technologies and other people): it's the PNS. It is so effective that we are hardly aware of it mediating our thoughts. These studies seem to miss the realization that cognition is embodied. Reading the PNS (EMG, etc.) directly has massive advantages over reading the brain directly; it's just not as cool, because it does not fit the sci-fi brain-in-a-jar idea. A rough sketch of what that direct reading could look like follows.
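For what it's worth, here is a toy sketch of what "reading the PNS" can amount to in practice: rectify a raw (here, simulated) EMG trace, smooth it into an amplitude envelope, and threshold that envelope to detect muscle activation. The sampling rate, window length, and threshold are illustrative assumptions, not values from any real system:

```python
# Toy sketch of direct EMG reading: rectify a simulated EMG trace,
# smooth it into an amplitude envelope, and threshold the envelope to
# detect when the muscle is active. All parameters are invented.

import numpy as np

rng = np.random.default_rng(1)
fs = 1000                                # assumed sampling rate, Hz
t = np.arange(0, 3.0, 1 / fs)            # 3 seconds of "recording"

# Simulated EMG: baseline noise, plus a burst of activity from 1.0-2.0 s.
burst = (t > 1.0) & (t < 2.0)
emg = rng.normal(0, 0.05, t.size) + burst * rng.normal(0, 0.5, t.size)

# Rectify, then smooth with a 100 ms moving-average window.
rectified = np.abs(emg)
window = int(0.1 * fs)
envelope = np.convolve(rectified, np.ones(window) / window, mode="same")

# Threshold the envelope to get a binary "muscle on/off" signal.
threshold = 5 * envelope[t < 0.5].mean()     # relative to resting baseline
active = envelope > threshold
onsets = t[np.flatnonzero(np.diff(active.astype(int)) == 1)]
print("detected activation onset(s) at:", np.round(onsets, 3), "s")
```

Nothing here requires opening a skull: a surface electrode, a little signal conditioning, and a threshold already give you a usable control signal, which is the practical advantage being argued for.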