In this study, published in Nature Neuroscience (2015), Maria C. Dadarlat, Joseph E. O'Doherty and Philip N. Sabes show that monkeys can learn to use non-biomimetic proprioceptive feedback, delivered via electrical microstimulation of somatosensory cortex, to guide movements.
The monkeys also integrated this artificial feedback with vision to optimize motor performance. The results suggest new learning-based approaches both to providing sensory feedback for brain–machine interfaces and to studying the neural mechanisms of adaptive sensory integration.
These new findings contribute to the expanding field of brain–machine interfaces (BMIs), an area of growing neuroethical interest, as well as to the related research area of brain-to-brain interfaces (BBIs).
As we read in the abstract: “Proprioception — the sense of the body’s position in space — is important to natural movement planning and execution and will likewise be necessary for successful motor prostheses and brain–machine interfaces (BMIs)”.
Dadarlat, O'Doherty and Sabes reported their findings as follows: “Here we demonstrate that monkeys were able to learn to use an initially unfamiliar multichannel intracortical microstimulation signal, which provided continuous information about hand position relative to an unseen target, to complete accurate reaches. Furthermore, monkeys combined this artificial signal with vision to form an optimal, minimum-variance estimate of relative hand position”.
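The "optimal, minimum-variance estimate" the authors describe corresponds to the standard inverse-variance-weighted combination of independent cues from models of sensory integration. As a hedged illustration (a generic sketch of that textbook formula, not the authors' actual analysis or data), the fusion of a visual and a microstimulation-based estimate of hand position can be written as:

```python
# Minimal sketch of minimum-variance (inverse-variance-weighted) cue
# combination, the standard model of optimal sensory integration.
# All values here are illustrative, not from the paper.

def combine_cues(x_vis, var_vis, x_icms, var_icms):
    """Fuse a visual estimate and a microstimulation-based estimate.

    Each cue is weighted inversely to its variance; this weighting
    minimizes the variance of the combined position estimate.
    """
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_icms)
    w_icms = 1 - w_vis
    x_hat = w_vis * x_vis + w_icms * x_icms
    # Combined variance is always lower than either cue's alone.
    var_hat = 1 / (1 / var_vis + 1 / var_icms)
    return x_hat, var_hat

# Example: vision is the more reliable cue (lower variance), so the
# fused estimate lies closer to the visual estimate.
x_hat, var_hat = combine_cues(x_vis=1.0, var_vis=0.5, x_icms=2.0, var_icms=2.0)
print(x_hat, var_hat)  # 1.2 0.4
```

The key signature of optimal integration, which the authors tested behaviorally, is that the variance of reaches guided by both cues together is lower than with either cue alone.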
They concluded that “these results demonstrate that a learning-based approach can be used to provide a rich artificial sensory feedback signal, suggesting a new strategy for restoring proprioception to patients using BMIs, as well as a powerful new tool for studying the adaptive mechanisms of sensory integration”.
Read the article here:
Nature Neuroscience 18, 138–144 (2015), doi:10.1038/nn.3883