This paper explores the connection between backward-adaptive lossy compression and online learning, focusing on the Natural Type Selection (NTS) algorithm and how it approaches the rate-distortion bound by balancing exploration and exploitation.
This paper presents a unified framework for distributed source coding, multiple description coding, and source coding with side information at decoders, characterizing their multi-letter rate-distortion regions for general correlated sources using constrained-random number generators.