Polynomially efficient quantum-enabled variational Monte Carlo for training neural-network quantum states for physico-chemical applications


Abstract

With diverse architectures and strong expressivity, neural-network quantum states (NQS) offer an alternative to traditional variational ansätze for simulating physical systems. Energy-based models such as Hopfield networks and Restricted Boltzmann Machines draw on statistical physics, mapping quantum states onto energy landscapes in the manner of associative memories. We show that these models can be trained efficiently by Monte Carlo sampling accelerated on quantum devices. Our algorithm scales linearly with circuit width and depth, requires only a constant number of measurements per sample, avoids mid-circuit measurements, and needs polynomial classical storage. It treats both the phase and amplitude fields of the wavefunction, enlarging the space of trial states. Sampling on quantum hardware shortens mixing times and yields more faithful estimates, revealing a quantum-assisted advantage. We demonstrate accurate learning of ground states for local spin models and nonlocal electronic-structure Hamiltonians, including at distorted geometries with strong multi-reference correlation. Benchmarks show close agreement and high robustness, highlighting the promise of machine-learning protocols paired with near-term quantum devices for state learning in chemistry and condensed-matter physics.
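To make the training loop described above concrete, the following is a minimal classical sketch of variational Monte Carlo with a Restricted Boltzmann Machine ansatz, applied to a small transverse-field Ising chain. It is not the authors' algorithm: the quantum-accelerated sampler is replaced here by ordinary Metropolis sampling, the parameters are real-valued (so the phase field is omitted), and all names (`log_psi`, `local_energy`, the 4-site system size) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 4, 8  # visible spins, hidden units (illustrative sizes)
a = 0.01 * rng.standard_normal(N)        # visible biases
b = 0.01 * rng.standard_normal(M)        # hidden biases
W = 0.01 * rng.standard_normal((M, N))   # coupling weights

def log_psi(s):
    # Log-amplitude of the RBM ansatz after tracing out hidden units:
    # log psi(s) = a.s + sum_j log 2 cosh(b_j + W_j . s)
    return a @ s + np.sum(np.log(2.0 * np.cosh(b + W @ s)))

def local_energy(s, h=1.0):
    # 1D transverse-field Ising model with periodic boundaries:
    # H = -sum_i Z_i Z_{i+1} - h sum_i X_i
    e = -np.sum(s * np.roll(s, -1))          # diagonal ZZ term
    for i in range(N):                        # off-diagonal X terms
        s_flip = s.copy()
        s_flip[i] *= -1
        e += -h * np.exp(log_psi(s_flip) - log_psi(s))
    return e

def metropolis_sample(n_samples, n_burn=200):
    # Single-spin-flip Metropolis chain targeting |psi(s)|^2.
    s = rng.choice([-1.0, 1.0], size=N)
    samples = []
    for t in range(n_burn + n_samples):
        i = rng.integers(N)
        s_new = s.copy()
        s_new[i] *= -1
        # Acceptance ratio |psi(s_new)/psi(s)|^2 for real parameters
        if rng.random() < np.exp(2.0 * (log_psi(s_new) - log_psi(s))):
            s = s_new
        if t >= n_burn:
            samples.append(s.copy())
    return samples

# One stochastic estimate of the variational energy <psi|H|psi>/<psi|psi>
samples = metropolis_sample(500)
E = np.mean([local_energy(s) for s in samples])
print(f"estimated energy: {E:.3f}")
```

A full training loop would additionally estimate the log-derivatives of `log_psi` with respect to `a`, `b`, and `W` over the same samples and take gradient (or stochastic-reconfiguration) steps; the quantum-assisted version in the paper replaces `metropolis_sample` with hardware-drawn samples to shorten mixing times.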
