A fully-memristive trimodal fusion perception system integrating multisensory neuron with hybrid neural network accelerator
Abstract
Intelligent perception systems require efficient sensing and processing of heterogeneous multimodal information. Memristive sensory neurons and hybrid neural networks (HNNs) respectively enable multimodal sensing and flexible processing. However, existing implementations suffer from cross-modal interference, because memristive sensory neurons are restricted to two-dimensional feature spaces, and from excessive hardware overhead caused by complex CMOS-based HNN architectures. Critically, a fully-memristive system integrating both sensory and HNN functions has not been realized. Here, we report a fully-memristive perception system comprising a memristive multisensory neuron (MMN) and a memristive HNN accelerator (MHA) for efficient trimodal sensory processing. The MMN, realized via a NbOx memristor-sensor-integrated oscillator, expands the encoding dimensions to achieve trimodal fusion without cross-modal interference. The MHA, constructed with a 40-nm 1-Mb RRAM chip and NbOx-based memristive hybrid neurons (MHNs), flexibly processes the fused signals. Each MHN provides reconfigurable Tanh and leaky-integrate-and-fire functions, enabled respectively by the nonlinear and threshold-switching properties of the NbOx memristor, thereby reducing hardware overhead. For battery state-of-charge estimation, the complete MMN-MHA system demonstrates a 1.71% mean absolute error in artificial-neural-network mode and 92.1% classification accuracy in spiking-neural-network mode. Effective trimodal fusion yields 2.01× and 1.78× precision improvements over unimodal and bimodal schemes under high-noise conditions, demonstrating the great potential of this fully-memristive system for scalable, energy-efficient multimodal perception.
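To make the reconfigurable MHN behavior concrete, the sketch below is a minimal software model of a neuron that can be switched between a Tanh activation (ANN mode) and a leaky-integrate-and-fire response (SNN mode), mirroring the dual operating modes described in the abstract. It is purely illustrative: the class name, parameters (tau, v_th, v_reset, dt), and dynamics are assumptions for a behavioral simulation, not the authors' device-level implementation.

```python
import numpy as np


class HybridNeuronModel:
    """Illustrative behavioral model of a reconfigurable hybrid neuron.

    Hypothetical sketch: in "ann" mode the unit applies a Tanh activation;
    in "snn" mode it acts as a leaky-integrate-and-fire (LIF) neuron.
    Parameter names and values are assumed, not taken from the paper.
    """

    def __init__(self, mode="ann", tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
        self.mode = mode          # "ann" (Tanh) or "snn" (LIF)
        self.tau = tau            # membrane time constant (arbitrary units)
        self.v_th = v_th          # firing threshold
        self.v_reset = v_reset    # reset potential after a spike
        self.dt = dt              # simulation time step
        self.v = v_reset          # membrane potential state

    def __call__(self, x):
        if self.mode == "ann":
            # ANN mode: static nonlinear activation
            return np.tanh(x)
        # SNN mode: leaky integration, then threshold-and-reset
        self.v += (self.dt / self.tau) * (x - self.v)
        if self.v >= self.v_th:
            self.v = self.v_reset
            return 1.0            # emit a spike
        return 0.0


# Minimal usage: the same unit reused in both modes
neuron = HybridNeuronModel(mode="ann")
print(neuron(0.5))                              # Tanh response

neuron = HybridNeuronModel(mode="snn", tau=5.0, v_th=0.8)
print([neuron(1.0) for _ in range(10)])         # spike train under constant drive
```

In the hardware described by the abstract, this mode switching is reportedly provided by a single NbOx memristor (its nonlinear I-V characteristic for Tanh, its threshold switching for LIF), which is what allows one device to serve both network types and reduce overhead relative to a CMOS-only realization.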