A brain-computer interface that evokes tactile sensations improves robotic arm control

Publication/Creation Date
May 21 2021
Media Type
Journal Article

Prosthetic arms controlled by a brain-computer interface can enable people with tetraplegia to perform functional movements. However, vision provides limited feedback because information about grasping objects is best relayed through tactile feedback. We supplemented vision with tactile percepts evoked using a bidirectional brain-computer interface that records neural activity from the motor cortex and generates tactile sensations through intracortical microstimulation of the somatosensory cortex. This enabled a person with tetraplegia to substantially improve performance with a robotic limb; trial times on a clinical upper-limb assessment were reduced by half, from a median time of 20.9 to 10.2 seconds. Faster times were primarily due to less time spent attempting to grasp objects, revealing that mimicking known biological control principles results in task performance that is closer to able-bodied human abilities.

This research was funded by DARPA's Revolutionizing Prosthetics program.
Related Body Part
Hand, Brain
Marketing Keywords
Blackrock Microsystems

Date archived
September 21 2021
Last edited
December 7 2021
How to cite this entry
Robert A. Gaunt, Jennifer L. Collinger, Sharlene N. Flesher, John E. Downey, Jeffrey M. Weiss, Christopher L. Hughes, Angelica J. Herrera, Elizabeth C. Tyler-Kabara, Michael L. Boninger. (May 21 2021). "A brain-computer interface that evokes tactile sensations improves robotic arm control". Science. American Association for the Advancement of Science. Fabric of Digital Life. https://fabricofdigitallife.com/Detail/objects/5440