On the Role of Quantum Entanglement in Capturing Long-Term Dependencies


Abstract

Given the current gap in the literature regarding whether quantum computing can effectively address long-term dependency modeling in sequential learning tasks, this study explicitly investigates the potential benefits of quantum entanglement in this context. We propose a hybrid Quantum Dilated Convolutional Neural Network (QDCNN) that integrates quantum entanglement with classical processing, leveraging the complementary strengths of both computational paradigms. Our architecture employs quantum dilated convolutions with exponentially increasing dilation rates, enabling the efficient capture of long-range temporal dependencies without a proportional increase in quantum circuit depth. To address the limitations of near-term quantum hardware, we introduce a novel clipping mechanism that ensures physically realizable entanglement when the dilation exceeds the quantum register size, while preserving global quantum correlations. The source code for the proposed quantum model is available at: https://github.com/HoceiniRihab/Quantum-dilated-CNN
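The following is a minimal sketch, not the authors' implementation, of how exponentially increasing dilation rates could be clipped to the size of a fixed quantum register so that entangling-gate offsets stay physically realizable. The function name, parameters (n_qubits, n_layers), and qubit-pairing scheme are illustrative assumptions; the actual circuit construction is in the linked repository.

```python
def dilated_entangling_pairs(n_qubits: int, n_layers: int):
    """Illustrative schedule: at each layer, pair qubit i with qubit i + d,
    where the dilation d = 2**layer is clipped to at most n_qubits - 1."""
    schedule = []
    for layer in range(n_layers):
        # Clip the dilation once it would exceed the register size.
        dilation = min(2 ** layer, n_qubits - 1)
        pairs = [(i, i + dilation) for i in range(n_qubits - dilation)]
        schedule.append((dilation, pairs))
    return schedule


if __name__ == "__main__":
    # Example: an 8-qubit register with 5 dilated layers; the last layers
    # are clipped to dilation 7 instead of 8 and 16.
    for dilation, pairs in dilated_entangling_pairs(n_qubits=8, n_layers=5):
        print(f"dilation={dilation}: entangle {pairs}")
```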
