Abstract
We consider a variant of the channel simulation problem with a single input and multiple outputs: Alice observes a probability distribution \(P\) from a prescribed set of probability distributions \(\mathcal{P}\) and sends a prefix-free codeword \(W\) to Bob, allowing him to generate \(n\) i.i.d. random variables \(X_{1}, X_{2}, \ldots, X_{n}\) that follow the distribution \(P\). This can also be regarded as a lossy compression setting for probability distributions. This paper describes encoding schemes for three cases of \(P\): \(P\) is a distribution over the positive integers, \(P\) is a continuous distribution over \([0, 1]\) with a non-increasing pdf, and \(P\) is a continuous distribution over \([0, \infty)\) with a non-increasing pdf. We show that the growth rate of the expected codeword length is sub-linear in \(n\) when a power-law bound is satisfied. One application of multiple-output channel simulation is the compression of probability distributions.