Quantum AI Gets Easier

Los Alamos researchers prove that training a quantum neural network requires far less data than previously assumed.

A recent paper in the journal Nature Communications reveals that training a quantum computer for AI might require much less data than scientists previously believed. According to the press release, “It had been assumed that the number of parameters, or variables, would be determined by the size of a mathematical construct called a Hilbert space, which becomes exponentially large for training over large numbers of qubits. That size rendered this approach nearly impossible computationally. A qubit, or quantum bit, is the basic computational unit of quantum computing and is analogous to a bit in classical computing.”

Los Alamos quantum theorist Patrick Coles, co-author of the paper, says, "It is hard to imagine how vast the Hilbert space is: a space of a billion states even when you only have 30 qubits... But we showed you only need as many data points as the number of parameters in your model. That is often roughly equal to the number of qubits — so only about 30 data points."
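To make that contrast concrete, here is a minimal back-of-the-envelope sketch in Python (an illustration, not code from the paper): the Hilbert-space dimension grows as 2^n for n qubits, while the training-data requirement the paper describes tracks the parameter count, often roughly n itself.

```python
# Back-of-the-envelope comparison of the two scalings described above.
# The Hilbert-space dimension grows exponentially (2**n_qubits), while
# the data requirement claimed in the paper tracks the number of model
# parameters, which is often roughly the number of qubits.

for n_qubits in (10, 20, 30):
    hilbert_dim = 2 ** n_qubits   # states in the Hilbert space
    data_points = n_qubits        # ~ parameter count, per the paper's claim
    print(f"{n_qubits} qubits: {hilbert_dim:,} Hilbert-space states, "
          f"but only ~{data_points} training data points")
```

At 30 qubits this reproduces the billion-state figure Coles cites (2^30 = 1,073,741,824), set against only about 30 data points.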

Lukasz Cincio, another quantum theorist at Los Alamos and also a co-author, adds, “The need for large data sets could have been a roadblock to quantum AI, but our work removes this roadblock. While other issues for quantum AI could still exist, at least now we know that the size of the data set is not an issue.”

09/12/2022