Prof. Gilad Gour: "From Static to Dynamic Divergences"
Abstract:
In this talk, I will introduce an axiomatic approach to channel divergences and channel relative entropies based on three information-theoretic axioms: monotonicity under superchannels (i.e., a generalized data-processing inequality), additivity under tensor products, and normalization. I will show that these axioms impose enough structure to yield numerous properties that apply to all channel divergences, including faithfulness, continuity, a type of triangle inequality, and boundedness between the min and max channel relative entropies. In addition, I will present a uniqueness theorem showing that the Kullback-Leibler divergence has only one extension to classical channels. For quantum channels, with the exception of the max relative entropy, this uniqueness does not hold. Instead, I will prove the optimality of the amortized channel extension of the Umegaki relative entropy by showing that it provides a lower bound on all channel relative entropies that reduce to the Kullback-Leibler divergence on classical states. If time permits, I will also introduce the maximal channel extension of a given classical state divergence and discuss its properties.
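For orientation, the first two axioms and the Kullback-Leibler divergence can be written schematically as follows. This is only a sketch: the symbols $\mathbb{D}$ (a channel divergence), $\mathcal{N}, \mathcal{M}$ (quantum channels), and $\Theta$ (a superchannel) are notational assumptions, not taken from the abstract, and the precise definitions (including the normalization axiom, which fixes the overall scale) are as given in the talk.

```latex
% Monotonicity under superchannels (generalized data-processing inequality):
\mathbb{D}\big(\Theta[\mathcal{N}]\,\big\|\,\Theta[\mathcal{M}]\big)
  \;\leq\; \mathbb{D}\big(\mathcal{N}\,\big\|\,\mathcal{M}\big)

% Additivity under tensor products:
\mathbb{D}\big(\mathcal{N}_1\otimes\mathcal{N}_2\,\big\|\,\mathcal{M}_1\otimes\mathcal{M}_2\big)
  \;=\; \mathbb{D}\big(\mathcal{N}_1\,\big\|\,\mathcal{M}_1\big)
        + \mathbb{D}\big(\mathcal{N}_2\,\big\|\,\mathcal{M}_2\big)

% Kullback-Leibler divergence on classical states (probability
% distributions p, q over a finite alphabet):
D(p\,\|\,q) \;=\; \sum_{x} p(x)\,\log\frac{p(x)}{q(x)}
```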