In this work, the authors introduce a framework for applying Syndrome-Based Neural Decoders (SBND) to high-order Bit-Interleaved Coded Modulation (BICM). They extend previous SBND results to a broad class of linear modulations, with BICM as the main focus. By analyzing neural decoders such as RNN-based and transformer-based architectures, they compare Bit Error Rate (BER) performance and computational complexity, providing insight into the effectiveness of neural decoders in communication systems.
The paper discusses the challenges traditional Deep Neural Networks (DNNs) face in channel decoding due to the curse of dimensionality, and explores scalable alternatives, both model-based and model-free, emphasizing the role of machine learning in future communication systems. SBND is highlighted as a symmetric decoder whose operation does not depend on the transmitted codeword, so it can be trained using a single codeword. Extending SBND to higher-order modulations such as M-QAM and M-PSK is emphasized as the key step toward practical implementations.
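The single-codeword training property comes from the decoder operating on codeword-independent quantities: the syndrome of the hard decisions (which depends only on the noise pattern) and the reliability magnitudes. A minimal sketch of these inputs, using the (7,4) Hamming parity-check matrix purely as a stand-in (the paper itself targets polar codes, and the function name is illustrative):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code, used here only for
# illustration; the codes in the paper (polar codes) have larger H.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def sbnd_inputs(llrs):
    """Codeword-independent inputs of an SBND: syndrome + reliabilities."""
    hard = (llrs < 0).astype(int)      # hard decisions from bit-LLR signs
    syndrome = (H @ hard) % 2          # depends only on the error pattern
    reliability = np.abs(llrs)         # LLR magnitude carries confidence
    return syndrome, reliability
```

Because any valid codeword yields the all-zero syndrome, the training set can be generated from a single codeword without loss of generality.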
The work develops the theoretical channel model induced by bit Log-Likelihood Ratios (bit-LLRs) for different modulation schemes, paving the way for the design of an SBND for BICM. The proposed implementation takes bit-LLRs as input instead of raw channel outputs, enabling effective decoding of linear block codes under a BICM setting. Experimental results on polar codes compare RNN-based and transformer-based architectures in terms of both BER performance and complexity.
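As a concrete illustration of such a bit-LLR front end, the sketch below computes exact per-bit LLRs for one Gray-mapped 4-PAM rail of a 16-QAM constellation over AWGN. The labeling, scaling, and function names are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# A common Gray labeling of 4-PAM (one I/Q rail of 16-QAM): bits -> amplitude.
# The paper's exact mapping may differ; this one is assumed for the example.
CONST = {(0, 0): -3.0, (0, 1): -1.0, (1, 1): 1.0, (1, 0): 3.0}

def bit_llrs(y, sigma2):
    """Exact bit-LLRs log P(b_i=0|y)/P(b_i=1|y), equiprobable symbols, AWGN."""
    llrs = []
    for i in range(2):  # two bits per 4-PAM symbol
        num = sum(np.exp(-(y - s) ** 2 / (2 * sigma2))
                  for bits, s in CONST.items() if bits[i] == 0)
        den = sum(np.exp(-(y - s) ** 2 / (2 * sigma2))
                  for bits, s in CONST.items() if bits[i] == 1)
        llrs.append(float(np.log(num / den)))
    return llrs
```

Under BICM, the interleaved stream of such per-bit LLRs (rather than the complex channel outputs themselves) is what the neural decoder consumes, which is what makes the approach modulation-agnostic.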