Hyperparameter Learning of Stochastic Image Generative Models with Bayesian Hierarchical Modeling and Its Effect on Lossless Image Coding

Authors

Yuta Nakahara
Waseda University
Toshiyasu Matsushima
Waseda University

Abstract

An explicit assumption of a stochastic data generative model is a remarkable feature of lossless compression of general data in information theory. However, current lossless image coding mostly focuses on coding procedures without an explicit assumption of the stochastic generative model. Consequently, it is difficult to discuss the theoretical optimality of a coding procedure with respect to the stochastic generative model. In this paper, we resolve this difficulty by constructing a stochastic generative model through interpreting a previous coding procedure from another perspective. An important problem in our approach is how to learn the hyperparameters of the stochastic generative model, because the optimality of our coding algorithm is guaranteed only asymptotically and the hyperparameter setting still affects the expected code length for data of finite length. For this problem, we use Bayesian hierarchical modeling and confirm its effect by numerical experiments. To the best of our knowledge, this is the first study in lossless image coding that assumes such an explicit stochastic generative model and learns its hyperparameters.
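
The abstract's point that the hyperparameter setting affects the code length for finite-length data, and that a hyperprior can hedge against a bad setting, can be illustrated with a toy example. The following Python sketch is a hypothetical illustration, not the paper's model: it codes a binary i.i.d. source with a Bayes code under a Beta(a, a) prior and compares fixed hyperparameter values against a discrete mixture over a grid of values, a crude stand-in for Bayesian hierarchical modeling.

    import math

    # Hypothetical toy illustration (not the paper's image model): a binary
    # i.i.d. source coded with a Bayes code under a Beta(a, a) prior. The
    # ideal Bayes code length is -log2 of the marginal likelihood, so the
    # hyperparameter a changes the code length for finite-length data even
    # though any fixed a is asymptotically optimal.

    def bayes_code_length(bits, a):
        """Ideal code length (bits) of `bits` under a Beta(a, a)-Bernoulli mixture."""
        n0 = n1 = 0
        length = 0.0
        for b in bits:
            # Sequential predictive probability (KT-style update); the product
            # of these predictives equals the marginal likelihood of the sequence.
            p1 = (n1 + a) / (n0 + n1 + 2.0 * a)
            length -= math.log2(p1 if b == 1 else 1.0 - p1)
            n1 += b
            n0 += 1 - b
        return length

    def hierarchical_code_length(bits, grid, weights):
        """Discrete hyperprior over a: a stand-in for Bayesian hierarchical modeling."""
        marginal = sum(w * 2.0 ** (-bayes_code_length(bits, a))
                       for a, w in zip(grid, weights))
        return -math.log2(marginal)

    data = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1]
    print(bayes_code_length(data, 0.5))   # Krichevsky-Trofimov prior (a = 0.5)
    print(bayes_code_length(data, 1.0))   # Laplace prior (a = 1)
    print(hierarchical_code_length(data, [0.1, 0.5, 1.0, 2.0], [0.25] * 4))

With uniform weights over K grid points, the mixture's code length exceeds that of the best fixed hyperparameter in hindsight by at most log2(K) bits, which is the usual motivation for hierarchical (mixture) modeling in universal coding.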

Paper Manuscript