EFFICIENT SAMPLING FOR GENERATIVE ADVERSARIAL NETWORKS WITH REPARAMETERIZED MARKOV CHAINS

Abstract

Recently, sampling methods have been successfully applied to enhance the sample quality of Generative Adversarial Networks (GANs). However, in practice, they typically suffer from poor sample efficiency because their proposals are drawn independently from the generator. In this work, we propose REP-GAN, a novel sampling method that allows general dependent proposals by REParameterizing the Markov chains into the latent space of the generator. Theoretically, we show that our reparameterized proposal admits a closed-form Metropolis-Hastings acceptance ratio. Empirically, extensive experiments on synthetic and real datasets demonstrate that our REP-GAN substantially improves the sample efficiency while simultaneously obtaining better sample quality.

1. INTRODUCTION

Generative Adversarial Networks (GANs) (Goodfellow et al., 2014) have achieved great success in generating realistic images in recent years (Karras et al., 2019; Brock et al., 2019). Unlike previous models that explicitly parameterize the data distribution, GANs rely on alternating optimization between a generator and a discriminator to learn the data distribution implicitly. However, in practice, samples generated by GANs still suffer from problems such as mode collapse and bad artifacts.

Recently, sampling methods have shown promising results on enhancing the sample quality of GANs by exploiting the information in the discriminator. In the alternating training scheme of GANs, the generator performs only a few updates in the inner loop and does not fully utilize the density ratio information estimated by the discriminator. Thus, after GAN training, sampling methods further utilize this information to bridge the gap between the generative distribution and the data distribution in a fine-grained manner. For example, DRS (Azadi et al., 2019) applies rejection sampling, and MH-GAN (Turner et al., 2019) adopts Markov chain Monte Carlo (MCMC) sampling to improve the sample quality of GANs.

Nevertheless, these methods still suffer severely from the sample efficiency problem. For example, as will be shown in Section 5, MH-GAN's average acceptance ratio on CIFAR10 can be lower than 5%, which makes the Markov chains slow to mix. As MH-GAN adopts an independent proposal q, i.e., q(x'|x) = q(x'), the difference between consecutive samples can be so large that the proposal gets rejected easily. To address this limitation, we propose to generalize the independent proposal to a general dependent proposal q(x'|x). In this way, the proposed sample can be a refinement of the previous one, which leads to a higher acceptance ratio and better sample quality. We can also balance between the exploration and exploitation of the Markov chains by tuning the step size.
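To make the independent-proposal setting concrete, the sketch below implements MH-GAN-style MCMC on a 1-D toy problem. The generator and discriminator here are hypothetical stand-ins (the generator samples N(0,1), the "data" distribution is N(1,1), and D is the corresponding optimal discriminator); for an independent proposal x' ~ p_g, the Metropolis-Hastings ratio reduces to an expression in discriminator outputs only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for illustration: the generator samples from
# N(0, 1), the "data" distribution is N(1, 1), and D is the optimal
# discriminator D(x) = p_data(x) / (p_data(x) + p_g(x)).
def p_g(x):    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
def p_data(x): return np.exp(-0.5 * (x - 1) ** 2) / np.sqrt(2 * np.pi)
def D(x):      return p_data(x) / (p_data(x) + p_g(x))

def mh_gan_chain(n_steps=2000):
    """MH sampling with an *independent* proposal x' ~ p_g.

    Since the density ratio p_data/p_g equals D/(1-D), the acceptance
    probability simplifies to
        alpha = min(1, (1/D(x_k) - 1) / (1/D(x') - 1)),
    which needs only discriminator outputs.
    """
    x = rng.standard_normal()            # initial sample from the generator
    samples, accepted = [], 0
    for _ in range(n_steps):
        x_prop = rng.standard_normal()   # independent proposal from p_g
        alpha = min(1.0, (1.0 / D(x) - 1.0) / (1.0 / D(x_prop) - 1.0))
        if rng.random() < alpha:
            x, accepted = x_prop, accepted + 1
        samples.append(x)
    return np.array(samples), accepted / n_steps
```

On this easy toy problem the two distributions overlap heavily, so the chain mixes well; the inefficiency discussed above arises when the generator and data distributions differ substantially in high dimensions, driving the acceptance ratio down.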
However, it is hard to design a proper dependent proposal in the high-dimensional sample space X because the energy landscape can be very complex (Neal et al., 2010). Nevertheless, we notice that the generative distribution p_g(x) of GANs is implicitly defined as the push-forward of the latent prior distribution p_0(z), and designing proposals in the low-dimensional latent space is generally much easier. Hence, GAN's latent variable structure motivates us to design a structured dependent proposal with two paired Markov chains, one in the sample space X and the other in the latent space Z. As shown in Figure 1, given the current paired samples (z_k, x_k), we draw the next proposal x' in a bottom-up way: 1) drawing a latent proposal z' following q(z'|z_k); 2) pushing it forward through the generator and getting the sample proposal x' = G(z'); 3)
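Steps 1) and 2) of this bottom-up construction can be sketched as follows. The latent proposal q(z'|z_k) is taken to be a simple Gaussian random walk here (one possible choice, not the paper's only option), and the generator G is a hypothetical fixed affine map standing in for a trained network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in generator: a fixed affine push-forward from a
# 2-D latent space Z into sample space X (a trained network in practice).
A = np.array([[1.0, 0.5],
              [0.0, 2.0]])

def G(z):
    return A @ z

def dependent_proposal(z_k, step_size=0.1):
    """One bottom-up step of the structured dependent proposal:
    1) draw z' from a dependent latent proposal q(z'|z_k) -- here a
       Gaussian random walk centered at z_k;
    2) push z' through the generator to get the sample proposal x' = G(z').
    The step size controls how far x' can move from G(z_k), trading off
    exploration against exploitation of the chain.
    """
    z_prop = z_k + step_size * rng.standard_normal(z_k.shape)
    x_prop = G(z_prop)
    return z_prop, x_prop
```

Because the walk happens in the latent space, a small step size keeps x' close to the previous sample on the generator's manifold, which is what raises the acceptance ratio relative to independent proposals.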

