Sensory Survey 3D: an open source utility for the annotation of projected fields for sensory neural interfaces

Abstract

Background

One goal of neuroprosthetics is to restore sensation to those who have lost it due to disease or injury. This is often accomplished with electrical stimulation of nervous tissue. Common approaches to quantifying evoked sensations for somatosensory prostheses involve asking participants to draw or annotate flattened 2-dimensional (2D) representations of body parts, which can distort the 3-dimensional (3D) projected fields. In particular, patches of skin between the fingers and toes often go unrepresented in 2D, leaving them unquantified. Here, we present a 3D annotation tool that allows participants to accurately report evoked sensations on any given body part. Additionally, we present a pipeline for creating patient-specific models to accommodate different morphologies, and algorithms to synthesize data across models to facilitate analysis.

Methods

Patients implanted with either NeuroPort electrode arrays (Blackrock Neurotech) in Brodmann’s Area 1 (N = 2, male) or Composite Flat Interface Nerve Electrodes on the peripheral nerves of an amputated arm (N = 2, male) were electrically stimulated and asked to annotate the location of the resulting sensations. In the first cohort, patients reported sensations on a generic hand using either a 2D or 3D interface. We computed the Jaccard index for each electrode between annotation methods, as well as the proportion of 3D faces distorted during the 2D projection. In the second cohort, as a proof of concept, we created a 3D model of each participant’s residual hand and mirrored it, allowing participants to annotate their amputated hand. We then projected annotations from each custom model onto a generic model to enable direct comparison of annotations across participants.

Results

Participants reported consistent sensation locations between annotation methods when sensations were localized to central parts of the fingers and hand that are visible in 2D. However, as expected, the 3D interface better captured sensations on the edges of or between fingers. Development of an iterative Procrustes alignment algorithm allowed straightforward comparison of projected fields between patient-specific models.

Conclusion

Our 3D annotation tool captures evoked sensations more precisely than 2D alternatives and facilitates rapid analysis across patients with different morphologies.
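
The per-electrode Jaccard index mentioned in the Methods is the standard intersection-over-union measure of agreement between two annotations. The sketch below assumes each electrode's annotation is stored as a set of face indices on a shared hand mesh; the function name and data layout are illustrative, not the published implementation.

```python
def jaccard_index(faces_2d: set[int], faces_3d: set[int]) -> float:
    """Intersection over union of the mesh faces selected in each interface."""
    if not faces_2d and not faces_3d:
        return 1.0  # both annotations empty: treat as perfect agreement
    return len(faces_2d & faces_3d) / len(faces_2d | faces_3d)

# Example: 2D and 3D annotations of one electrode sharing 30 of 60 total faces
print(jaccard_index(set(range(40)), set(range(10, 60))))  # 0.5
```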
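
The iterative Procrustes alignment referenced in the Results can be pictured as an ICP-style loop: repeatedly pair vertices of the patient-specific mesh with their nearest neighbors on the generic mesh, then solve the orthogonal Procrustes problem for a rigid fit. The sketch below is a minimal illustration under that assumption, taking both meshes as (N, 3) vertex arrays; it is not the published algorithm.

```python
import numpy as np
from scipy.spatial import cKDTree

def iterative_procrustes(source: np.ndarray, target: np.ndarray,
                         n_iter: int = 50) -> np.ndarray:
    """Rigidly align source vertices to target vertices by alternating
    nearest-neighbor matching with an orthogonal Procrustes (Kabsch) step."""
    tree = cKDTree(target)
    aligned = source.copy()
    for _ in range(n_iter):
        # Pair every source vertex with its current nearest target vertex.
        _, idx = tree.query(aligned)
        matched = target[idx]
        # Solve the orthogonal Procrustes problem for the centered point sets.
        src_c, tgt_c = aligned.mean(axis=0), matched.mean(axis=0)
        H = (aligned - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        aligned = (aligned - src_c) @ R.T + tgt_c
    return aligned
```

Once the patient-specific mesh is aligned in this way, annotated faces could be transferred to the nearest faces of the generic model, which is one plausible reading of how annotations are projected across models for cross-participant comparison.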
