The paper investigates optimal encoding of classical data for statistical inference using quantum computing. It shows that the accuracy of any statistical inference performed with quantum computing is upper bounded by a term proportional to the maximal quantum leakage from the classical data through its quantum encoding. Because this bound depends only on the quantum encoding of the data and not on the inference task itself, maximal quantum leakage is a universal measure of the quality of an encoding strategy for statistical inference.
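For context, maximal quantum leakage from a classical random variable X through an encoding x ↦ ρ_x is usually defined as a supremum over measurements. The display below is a sketch in commonly used notation from the maximal-leakage literature; the symbols Q, F_y, and p_X are assumed notation, not taken verbatim from the paper:

```latex
% Sketch of the usual definition of maximal quantum leakage (assumed notation):
% the supremum ranges over all POVMs {F_y} applied to the encoded state.
Q(X \to \rho) \;=\; \sup_{\{F_y\}_y \,\text{POVM}} \; \log_2 \sum_{y} \max_{x:\, p_X(x) > 0} \operatorname{Tr}\!\bigl(\rho_x F_y\bigr)
```

The accuracy bound described above ties the success of any downstream inference to this quantity, which is why maximizing it gives a task-independent (universal) objective for choosing the encoding.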
The optimal universal encoding strategy, i.e., the encoding strategy that maximizes the maximal quantum leakage, is proved to be attained by pure states. When enough qubits are available, basis encoding is proved to be universally optimal. An iterative method for numerically computing the optimal universal encoding strategy is also presented; it relies on subgradient ascent to maximize the maximal quantum leakage.
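As a concrete illustration of a pure-state encoding, the sketch below implements basis encoding, the strategy identified as universally optimal when enough qubits are available. This is a minimal NumPy sketch; the function name `basis_encode` and the amplitude-vector representation are illustrative choices, not the paper's implementation.

```python
import numpy as np

def basis_encode(x: int, support_size: int) -> np.ndarray:
    """Map a classical value x in {0, ..., support_size - 1} to the
    computational basis state |x> on ceil(log2(support_size)) qubits,
    represented as a vector of amplitudes (a pure state)."""
    num_qubits = int(np.ceil(np.log2(support_size)))
    state = np.zeros(2 ** num_qubits, dtype=complex)
    state[x] = 1.0  # all amplitude on the basis state indexed by x
    return state

# Example: encode x = 5 from a support of size |X| = 8, which uses 3 qubits.
print(basis_encode(5, 8))
```

Distinct inputs map to orthogonal pure states, so a measurement in the computational basis recovers x exactly, which is consistent with the intuition that basis encoding maximizes the leakage once the qubit count is sufficient.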
The results show that the number of qubits used for statistical inference must be larger than log2(|X|)/2, where X is the support set of the discrete input random variable, so as not to artificially constrain the performance of the inference model. The optimal universal encoder is independent of both the distribution of the input and the output of the statistical inference problem, demonstrating the universality of the proposed encoding strategy.
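Reading the bound literally, the smallest admissible integer number of qubits can be computed as below; the helper `min_qubits_for_inference` is a hypothetical illustration of the stated threshold, not code from the paper.

```python
import math

def min_qubits_for_inference(support_size: int) -> int:
    """Smallest integer number of qubits strictly larger than log2(|X|)/2,
    below which the encoding itself limits achievable inference accuracy."""
    threshold = math.log2(support_size) / 2
    return math.floor(threshold) + 1

# Example: an input with |X| = 256 values gives a threshold of 4,
# so at least 5 qubits are needed to avoid constraining the model.
print(min_qubits_for_inference(256))  # -> 5
```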
Key insights extracted from https://arxiv.org/pdf/2404.08172.pdf by Farhad Farok... (arxiv.org, 04-15-2024).