Core Concepts
Establishing a systematic framework for semantic information theory.
Abstract
The paper introduces Semantic Information Theory (SIT) as an extension of classical information theory. It discusses synonymous mapping, semantic entropy, mutual information, channel capacity, and rate-distortion functions, proves coding theorems for SIT, and explores semantic communication for text, speech, image, and video sources. It highlights synonymous mapping as the key to understanding semantic information.
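As a rough illustration of how a synonymous mapping can lower entropy, the sketch below groups a toy source's symbols into synonymous sets and compares the entropy of the symbol distribution with the entropy of the induced set distribution. The symbols, probabilities, set names, and mapping are invented for illustration and are not taken from the paper; they only show the general idea that grouping synonyms reduces the semantic entropy relative to the Shannon entropy.

```python
import math
from collections import defaultdict

def entropy_bits(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy syntactic source: each symbol has a probability (values are illustrative).
symbol_probs = {
    "car": 0.30, "automobile": 0.10,   # both convey the meaning "vehicle"
    "movie": 0.25, "film": 0.15,       # both convey the meaning "picture"
    "quick": 0.12, "fast": 0.08,       # both convey the meaning "rapid"
}

# Illustrative synonymous mapping: syntactic symbol -> synonymous set (meaning).
synonymous_map = {
    "car": "vehicle", "automobile": "vehicle",
    "movie": "picture", "film": "picture",
    "quick": "rapid", "fast": "rapid",
}

# A synonymous set's probability is the sum over the symbols mapped to it.
set_probs = defaultdict(float)
for symbol, p in symbol_probs.items():
    set_probs[synonymous_map[symbol]] += p

H = entropy_bits(symbol_probs.values())   # classical (syntactic) entropy
Hs = entropy_bits(set_probs.values())     # semantic entropy over synonymous sets

print(f"Shannon entropy  H  = {H:.3f} bits")
print(f"Semantic entropy Hs = {Hs:.3f} bits (Hs <= H)")
```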
Stats
Semantic channel capacity of a band-limited Gaussian channel: C_s = B log[S^4 (1 + P / (N_0 B))]
Semantic rate-distortion function of a Gaussian source: R_s(D) = log(P / (S^4 D))
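To make the two formulas concrete, the sketch below evaluates them for arbitrary illustrative parameter values, alongside the classical Shannon expressions for reference. The base-2 logarithm convention, the reading of P as source power in the rate-distortion expression, the function names, and all numeric values are assumptions of this sketch rather than values from the paper.

```python
import math

def shannon_capacity(B, P, N0):
    """Classical capacity of a band-limited Gaussian channel, C = B log2(1 + P/(N0 B))."""
    return B * math.log2(1 + P / (N0 * B))

def semantic_capacity(B, P, N0, S):
    """Semantic capacity C_s = B log2[S^4 (1 + P/(N0 B))], following the formula above."""
    return B * math.log2(S**4 * (1 + P / (N0 * B)))

def classical_rate_distortion(P, D):
    """Classical Gaussian rate-distortion R(D) = (1/2) log2(P/D), shown for reference."""
    return 0.5 * math.log2(P / D)

def semantic_rate_distortion(P, D, S):
    """Semantic rate-distortion R_s(D) = log2(P / (S^4 D)), following the formula above."""
    return math.log2(P / (S**4 * D))

# Illustrative parameters (not taken from the paper): bandwidth B in Hz, signal
# power P, noise power spectral density N0, synonymous-mapping parameter S,
# and distortion level D.
B, P, N0, S, D = 1e6, 1.0, 1e-9, 1.5, 0.1

print(f"C    = {shannon_capacity(B, P, N0):.3e} bit/s")
print(f"C_s  = {semantic_capacity(B, P, N0, S):.3e} bit/s (C_s >= C whenever S >= 1)")
print(f"R(D)   = {classical_rate_distortion(P, D):.3f} bit")
print(f"R_s(D) = {semantic_rate_distortion(P, D, S):.3f} bit")
```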
Quotes
"We develop a systematic model for semantic communication with specific design criteria."
"Semantic communication systems based on deep learning demonstrate excellent performance."