Quantum Destructive Self-Attention for NISQ-Era Transformers

Abstract

In the noisy intermediate-scale quantum (NISQ) era, developing attention mechanisms suitable for quantum hardware is essential for advancing quantum-enhanced Transformer applications. This work introduces a novel quantum Transformer model that integrates a low-depth, ancilla-free attention mechanism based on the destructive variant of the swap test, enabling efficient computation of attention scores without auxiliary qubits. Designed specifically for NISQ constraints, the model emphasizes noise resilience and reduced circuit depth, making it practical for current quantum devices. We evaluate the model on transparent and well-documented Natural Language Processing (NLP) datasets designed for Quantum Computing (QC). Our results demonstrate competitive performance on language tasks, supported by quantitative metrics that offer interpretable insights into the model's behavior. This work provides a viable pathway toward practical, interpretable quantum language models in the near term.
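To make the abstract's core primitive concrete, the sketch below illustrates the destructive swap test: it estimates the squared overlap between two quantum states using only pairwise CNOTs, Hadamards, and measurements, with no ancilla qubit. This is the kind of fidelity estimate a swap-test-based attention score can be built on. The sketch is not the paper's implementation; the single-qubit states, rotation angles, shot count, and use of Qiskit with the Aer simulator are all illustrative assumptions.

```python
# Minimal sketch of the ancilla-free (destructive) swap test in Qiskit.
# Assumptions: single-qubit states |psi> = Ry(theta_a)|0>, |phi> = Ry(theta_b)|0>,
# simulated with qiskit-aer; parameters are illustrative, not from the paper.

import numpy as np
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

def destructive_swap_test(theta_a: float, theta_b: float, shots: int = 4096) -> float:
    """Estimate |<psi|phi>|^2 without an auxiliary qubit."""
    qc = QuantumCircuit(2, 2)
    qc.ry(theta_a, 0)            # prepare |psi> on qubit 0
    qc.ry(theta_b, 1)            # prepare |phi> on qubit 1
    qc.cx(0, 1)                  # pairwise CNOT between the two registers
    qc.h(0)                      # Hadamard on the first register
    qc.measure([0, 1], [0, 1])   # measure both registers directly

    counts = AerSimulator().run(qc, shots=shots).result().get_counts()
    # An outcome "fails" when the bitwise AND of the paired results has odd
    # parity; for one qubit per register that is exactly the '11' outcome.
    # The destructive swap test gives |<psi|phi>|^2 = 1 - 2 * P(fail).
    p_fail = counts.get("11", 0) / shots
    return 1.0 - 2.0 * p_fail

print(destructive_swap_test(0.7, 0.7))    # identical states: ~1.0
print(destructive_swap_test(0.0, np.pi))  # orthogonal states: ~0.0
```

Because the circuit needs only one layer of CNOTs plus Hadamards and measurements, its depth stays constant in the register size, which is consistent with the low-depth, NISQ-oriented design the abstract describes.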
