Realizing GANs via a Tunable Loss Function

Authors

Gowtham Kurri, Arizona State University
Tyler Sypherd, Arizona State University
Lalitha Sankar, Arizona State University

Abstract

We introduce a tunable GAN, called $\alpha$-GAN, parameterized by $\alpha \in (0,\infty]$, which interpolates between various $f$-GANs and Integral Probability Metric based GANs (under a constrained discriminator set). We construct $\alpha$-GAN using a supervised loss function, namely, $\alpha$-loss, which is a tunable loss function capturing several canonical losses. We show that $\alpha$-GAN is intimately related to the Arimoto divergence, which was first proposed by Österreicher (1996) and later studied by Liese and Vajda (2006). We posit that the holistic understanding that $\alpha$-GAN introduces will have practical benefits, such as alleviating issues of convergence and mode collapse.
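As a point of reference for the claim that $\alpha$-loss captures several canonical losses, a commonly cited form of $\alpha$-loss from the related classification literature is sketched below; this is an assumed formulation given for illustration, not the manuscript's exact definition. For a label $y$ and a soft prediction $P(y) \in (0,1]$,
\[
  \ell_{\alpha}(y, P) = \frac{\alpha}{\alpha - 1}\left[1 - P(y)^{\frac{\alpha - 1}{\alpha}}\right], \qquad \alpha \in (0,1) \cup (1,\infty),
\]
with the limits $\alpha \to 1$ and $\alpha \to \infty$ recovering the log-loss $-\log P(y)$ and the soft 0-1 loss $1 - P(y)$, respectively; this is the sense in which tuning $\alpha$ interpolates between canonical losses.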

Paper Manuscript