Conditional Entropy Is Almost Time-Reversal Invariant: Implications for Learnability and Distributional Shift
The difference between the forward and backward conditional entropy of a sequential dataset is a small constant that depends only on the forward and backward models, making conditional entropy a practical measure of learnability and a control for distributional shift.
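A minimal sketch of the identity behind this claim, with $q_\rightarrow$ and $q_\leftarrow$ denoting hypothetical learned forward and backward models (notation introduced here for illustration, not taken from the text). By the chain rule, the true per-step conditional entropies sum to the same joint entropy in either direction:

$$
H(X_{1:T}) \;=\; \sum_{t=1}^{T} H(X_t \mid X_{<t}) \;=\; \sum_{t=1}^{T} H(X_t \mid X_{>t}).
$$

With learned models, the measured totals are cross-entropies, $H(p, q_\rightarrow) = \mathbb{E}_{x \sim p}\!\left[-\log q_\rightarrow(x_{1:T})\right]$ and likewise for $q_\leftarrow$, each equal to $H(X_{1:T})$ plus the corresponding KL divergence, so their gap

$$
H(p, q_\rightarrow) - H(p, q_\leftarrow) \;=\; D_{\mathrm{KL}}\!\left(p \,\|\, q_\rightarrow\right) - D_{\mathrm{KL}}\!\left(p \,\|\, q_\leftarrow\right)
$$

depends only on how well each model fits the data, not on the direction of time in the sequence.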