Hybrid Quantum Inception-Inspired Convolutional Neural Network for Image Classification

Abstract

Quantum Neural Networks (QNNs) present a promising research direction in image recognition, demonstrating significant potential for various applications. However, QNNs with a fixed circuit topology are prone to encountering barren plateaus during training as the number of qubits grows, which limits their performance and scalability. In this paper, a novel Hybrid Quantum-Classical Inception-inspired Convolutional Neural Network (HQCNN) is proposed. The HQCNN enhances feature learning capability and mitigates the barren plateau issue by integrating quantum convolutional kernels with diverse circuit topologies across multiple channels. The architecture employs an amplitude encoding layer to map multi-channel classical data into quantum states, followed by diverse quantum convolutional kernels that extract feature representations. Hadamard gates then integrate the multi-channel features, quantum pooling reduces feature dimensionality, and a fully connected layer maps the quantum-enhanced features to class labels for classification. Finally, by analyzing the influence of circuit structures and quantum gate combinations on model performance, the optimal HQCNN architecture is constructed. Experimental results demonstrate that HQCNN achieves 98.5% accuracy on the MNIST-4 classification task and 92.5% accuracy on the MNIST-10 classification task, indicating that the proposed model outperforms state-of-the-art HQCNN models and classical CNNs in image classification while using only about 10% of the parameters. This study provides an effective approach for alleviating the barren plateau issue and for designing multi-channel quantum convolutional neural networks that process large-scale classical data with fewer quantum resources.
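The amplitude encoding layer mentioned above maps a classical vector of length 2^n onto the amplitudes of an n-qubit state. The following is a minimal NumPy sketch of that step under standard conventions (L2-normalisation, zero-padding to a power-of-two length); it illustrates the general technique, not the authors' specific implementation, and the function name `amplitude_encode` is an assumption for illustration.

```python
import numpy as np

def amplitude_encode(x):
    """Map a classical vector onto the amplitudes of an n-qubit state:
    pad with zeros to length 2**n, then L2-normalise so that the
    squared amplitudes sum to 1 (a valid quantum state)."""
    x = np.asarray(x, dtype=float).ravel()
    n_qubits = max(1, int(np.ceil(np.log2(len(x)))))
    padded = np.zeros(2 ** n_qubits)
    padded[: len(x)] = x
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the all-zero vector")
    return padded / norm

# A 2x2 image patch becomes a 2-qubit state vector.
state = amplitude_encode([[1.0, 2.0], [2.0, 4.0]])
print(state)               # amplitudes proportional to pixel values
print(np.sum(state ** 2))  # sums to 1.0, i.e. a normalised state
```

Because n qubits carry 2^n amplitudes, this encoding is what lets the multi-channel architecture described above process comparatively large classical inputs with few quantum resources.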