Rapid integration of artificial sensation

Abstract

Humans rely on both proprioceptive and visual feedback during reaching, integrating these two sensory streams to improve movement accuracy and precision [1, 2]. Patients using a brain-machine interface (BMI) will similarly require artificial proprioceptive feedback in addition to vision to finely control a prosthesis [3, 4]. Intracortical microstimulation (ICMS) elicits sensory percepts that could replace the lost proprioceptive signal. However, some learning may be required to interpret artificial sensation [5], as current technology does not give access to neurons with all of the desired encoding properties [6]. We developed a behavioral task for freely moving mice in which to test learning and integration of artificial sensory information. Five mice were implanted with a 16-channel microwire array in primary somatosensory cortex and trained to navigate to randomly selected targets on the floor of a custom behavioral training cage. Target location was encoded with visual and/or patterned multi-channel ICMS feedback. Mice received multimodal feedback from the beginning of training and achieved 75% success on multimodal trials after approximately 1000 training trials. Mice also quickly learned to use the ICMS signal to locate invisible targets, achieving 75% proficiency on ICMS-only trials when tested. Critically, we found that performance on multimodal trials significantly exceeded unimodal performance (vision or ICMS alone), demonstrating that the animals rapidly learned to integrate natural vision with artificial sensation.
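
The abstract does not specify how "patterned multi-channel ICMS" mapped target location onto the 16-channel array. As a minimal sketch of what such an encoder could look like, the hypothetical example below assigns each channel a preferred direction and cosine-tunes its pulse frequency to the target's egocentric bearing; the channel layout, frequency range, and tuning rule are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def icms_pattern(target_angle_rad, n_channels=16,
                 f_min=20.0, f_max=300.0):
    """Hypothetical encoder: map the egocentric direction of the
    target to per-channel ICMS pulse frequencies (Hz).

    Each channel is assigned an evenly spaced preferred direction;
    its frequency rises (cosine-tuned) as the target direction
    approaches that preferred direction. This is an illustrative
    assumption, not the encoding reported in the paper.
    """
    preferred = np.linspace(0, 2 * np.pi, n_channels, endpoint=False)
    # Rectified cosine tuning: channels facing away stay at f_min.
    tuning = np.clip(np.cos(target_angle_rad - preferred), 0, None)
    return f_min + (f_max - f_min) * tuning

# Example: target 45 degrees to the animal's left.
freqs = icms_pattern(np.deg2rad(45))
print(np.round(freqs, 1))
```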

Significance

Multisensory integration of visual and proprioceptive information facilitates accurate and precise movements. Intracortical microstimulation (ICMS) elicits percepts that could supplement visual information for patients controlling a prosthesis. Here, we developed a behavioral task for freely moving mice to examine how ICMS can be used to encode multi-variable, task-relevant information. Mice implanted with a cortical microwire array were trained to interpret patterned multi-channel ICMS to navigate to targets on the floor of a custom behavioral training cage. The mice quickly learned to use the ICMS signal to locate invisible targets and integrated the artificial signal with natural vision, improving task performance. This protocol can be applied to efficiently develop and test algorithms for encoding artificial proprioception for neural prostheses.
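
The paper's statistical test for the multimodal-versus-unimodal comparison is not given here. As one way such a comparison could be run per animal, the sketch below applies a one-sided Fisher's exact test to per-condition hit counts; the trial counts are invented for illustration and are not data from the paper.

```python
from scipy.stats import fisher_exact

# Hypothetical trial counts for one animal (not data from the paper).
multi_hits, multi_trials = 90, 110   # multimodal (vision + ICMS)
icms_hits, icms_trials = 75, 100     # ICMS-only

# 2x2 table of hits vs. misses for the two conditions.
table = [[multi_hits, multi_trials - multi_hits],
         [icms_hits, icms_trials - icms_hits]]
odds_ratio, p = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p:.4f}")
```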
