Core Concepts
A new decoder for the surface code that combines the accuracy of tensor-network decoders with the efficiency and parallelism of the belief-propagation algorithm.
Abstract
The paper presents a new decoder for the surface code, which combines the accuracy of tensor-network decoders with the efficiency and parallelism of the belief-propagation (BP) algorithm. The main idea is to replace the expensive contraction step of tensor-network decoders with the blockBP algorithm, a recent approximate contraction algorithm based on BP.
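The overall flow this describes can be sketched as follows. This is a minimal, hypothetical skeleton, not the paper's implementation: `estimate_coset_probability` is a placeholder standing in for the blockBP approximate contraction, and the numbers inside it are made up purely so the sketch runs.

```python
def estimate_coset_probability(syndrome, logical_class, block_size):
    # Placeholder: a real implementation would build the tensor network for
    # this logical coset and contract it approximately with blockBP using
    # blocks of size `block_size`. The values below are toy stand-ins.
    toy_values = {"I": 0.02, "X": 0.65, "Y": 0.08, "Z": 0.25}
    return toy_values[logical_class]

def decode(syndrome, block_size=4):
    # Degenerate maximum-likelihood decoding: estimate the total probability
    # of each logical coset and return the most likely one.
    classes = ["I", "X", "Y", "Z"]
    probs = {c: estimate_coset_probability(syndrome, c, block_size)
             for c in classes}
    return max(probs, key=probs.get)

print(decode(syndrome=None))  # with the toy values above -> X
```

The point of the skeleton is that only the contraction subroutine changes between a conventional tensor-network decoder and the blockBP decoder; the surrounding argmax over logical cosets is the same.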
The key aspects of the blockBP decoder are:
It is a BP-based decoder that works in the degenerate maximum-likelihood decoding framework, summing probabilities over all errors in a logical coset, unlike conventional BP decoders, which solve the simpler quantum maximum-likelihood decoding problem of finding the single most likely error.
It can run efficiently in parallel, making it potentially suitable for real-time decoding, unlike conventional tensor-network decoders, which are too slow for that purpose.
The decoder's performance depends on the block size k used in the blockBP algorithm: larger blocks yield more accurate decoding at a higher computational cost.
Numerical simulations show that for code distances d ≤ 9, a block size of k = 1 or 2 already outperforms the minimum-weight perfect-matching (MWPM) decoder; for d ≤ 17 and d ≤ 25, block sizes of k = 4 and k = 6, respectively, provide better performance.
The blockBP decoder converges faster to the correct error coset than to the wrong ones, and its performance improves as the noise rate decreases, suggesting potential for further optimizations.
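The distinction between the two decoding frameworks mentioned above can be made concrete with a toy calculation. The coset structure and probabilities below are invented for illustration only: a coset with many individually unlikely but degenerate errors can have a larger total probability than the coset containing the single most likely error, so the two frameworks can disagree.

```python
# Toy illustration (made-up numbers) of degenerate vs. non-degenerate
# maximum-likelihood decoding. Each coset is a list of the probabilities
# of the individual errors it contains.
cosets = {
    "A": [0.10],             # one relatively likely error
    "B": [0.04, 0.04, 0.04], # several degenerate, individually less likely errors
}

# Quantum maximum-likelihood decoding: pick the coset containing the
# single most probable error.
qmld = max(cosets, key=lambda c: max(cosets[c]))

# Degenerate maximum-likelihood decoding: pick the coset with the largest
# total probability, summed over all of its errors.
dqmld = max(cosets, key=lambda c: sum(cosets[c]))

print(qmld, dqmld)  # A B -- the two frameworks choose different cosets
```

Here QMLD selects coset A (it contains the 0.10 error), while DQMLD selects coset B (total probability 0.12 > 0.10), which is why working in the degenerate framework can improve accuracy.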