This work investigates the optimal encoding of classical data into quantum states for statistical inference on quantum computers. It shows that the accuracy of any statistical inference performed via quantum computing is upper bounded by a term proportional to the maximal quantum leakage from the classical data through its quantum encoding. Maximal quantum leakage is therefore a universal measure of the quality of an encoding strategy for statistical inference: it depends only on the quantum encoding of the data, not on the inference task itself.
The optimal universal encoding strategy, i.e., the encoding strategy that maximizes the maximal quantum leakage, is proved to be attained by pure states. When enough qubits are available, basis encoding is proved to be universally optimal. An iterative method for numerically computing the optimal universal encoding strategy is presented, based on subgradient ascent applied to the maximal quantum leakage.
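As a small illustration (not the paper's algorithm), the sketch below evaluates the leakage achieved by one fixed POVM for a pure-state basis encoding; the maximal quantum leakage is the supremum of this quantity over all POVMs. The function names and the numpy density-matrix representation are our own assumptions; for basis encoding with enough qubits, the computational-basis measurement already attains the maximum value log2(|X|).

```python
import numpy as np

def basis_encode(x, n_qubits):
    """Pure-state basis encoding: x -> density matrix |x><x| in dimension 2**n_qubits."""
    psi = np.zeros(2 ** n_qubits)
    psi[x] = 1.0
    return np.outer(psi, psi)

def leakage_for_measurement(states, povm):
    """log2 of sum_y max_x tr(rho_x M_y): the leakage achieved by one POVM.
    Maximal quantum leakage is the supremum of this over all POVMs."""
    total = sum(max(np.trace(rho @ M).real for rho in states) for M in povm)
    return np.log2(total)

n_qubits = 2
support = range(4)  # |X| = 4 <= 2**n_qubits, so basis encoding fits
states = [basis_encode(x, n_qubits) for x in support]
# Computational-basis projective measurement as the (fixed) POVM.
povm = [basis_encode(y, n_qubits) for y in range(2 ** n_qubits)]
print(leakage_for_measurement(states, povm))  # 2.0 = log2(|X|)
```

Each basis state is perfectly distinguished by the computational-basis measurement, so the sum inside the logarithm equals |X| and the leakage saturates at log2(|X|), consistent with basis encoding being universally optimal when the qubit count suffices.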
The results show that the number of qubits used for statistical inference must exceed log2(|X|)/2, where |X| is the size of the support set of the discrete input random variable; otherwise the performance of the inference model is artificially constrained. The optimal universal encoder is independent of both the distribution of the input and the output of the statistical inference problem, which demonstrates the universality of the proposed encoding strategy.
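The qubit requirement above is easy to make concrete. A minimal sketch (the helper name `min_qubits` is ours) computes the smallest integer number of qubits strictly greater than log2(|X|)/2:

```python
import math

def min_qubits(support_size):
    """Smallest integer n satisfying n > log2(|X|)/2 (strict inequality)."""
    bound = math.log2(support_size) / 2
    return math.floor(bound) + 1

print(min_qubits(8))   # 2 qubits: 2 > log2(8)/2 = 1.5
print(min_qubits(16))  # 3 qubits: 3 > log2(16)/2 = 2.0
```

Because the inequality is strict, a support of size 16 needs 3 qubits rather than 2, even though log2(16)/2 is exactly 2.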