NEURAL OPTIMAL TRANSPORT

Abstract

We present a novel neural-networks-based algorithm to compute optimal transport maps and plans for strong and weak transport costs. To justify the usage of neural networks, we prove that they are universal approximators of transport plans between probability distributions. We evaluate the performance of our optimal transport algorithm on toy examples and on unpaired image-to-image translation.

1. INTRODUCTION

Solving optimal transport (OT) problems with neural networks has become widespread in machine learning, arguably starting with the introduction of large-scale OT (Seguy et al., 2017) and Wasserstein GANs (Arjovsky et al., 2017). The majority of existing methods compute the OT cost and use it as the loss function to update the generator in generative models (Gulrajani et al., 2017; Liu et al., 2019; Sanjabi et al., 2018; Petzka et al., 2017). Recently, Rout et al. (2022) and Daniels et al. (2021) demonstrated that the OT plan itself can be used as a generative model, providing comparable performance in practical tasks. In this paper, we focus on methods which compute the OT plan. Most recent methods (Korotin et al., 2021b; Rout et al., 2022) consider OT for the quadratic transport cost (the Wasserstein-2 distance, W2) and recover a nonstochastic OT plan, i.e., a deterministic OT map. In general, it may



Figure 1: Unpaired translation with our Neural Optimal Transport (NOT) Algorithm 1. (a) Celeba (female) → anime and outdoor → church: deterministic (one-to-one, W2). (b) Handbags → shoes: stochastic (one-to-many, W2,1).
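As a toy illustration of the deterministic (one-to-one) OT maps mentioned above: for the quadratic cost in one dimension, the optimal map between two equal-size empirical samples is simply the monotone (sorted) pairing. The sketch below is our own minimal example (the function name `w2_map_1d` is hypothetical) and is unrelated to the paper's neural algorithm; it assumes distinct sample points.

```python
def w2_map_1d(xs, ys):
    """Deterministic W2-optimal map between two equal-size 1D samples.

    For the quadratic cost c(x, y) = (x - y)^2 on the real line, the
    optimal coupling of two uniform empirical measures is the monotone
    rearrangement: pair the i-th smallest x with the i-th smallest y.
    Assumes the points in xs are distinct (so the map is well defined).
    Returns the map as a dict and the resulting average transport cost.
    """
    assert len(xs) == len(ys), "samples must have equal size"
    mapping = dict(zip(sorted(xs), sorted(ys)))
    cost = sum((x - y) ** 2 for x, y in mapping.items()) / len(xs)
    return mapping, cost


# Example: the sorted pairing 0->10, 1->11, 2->12 has average cost 100.0,
# no larger than any other one-to-one pairing of these samples.
mapping, cost = w2_map_1d([0, 1, 2], [10, 12, 11])
```

In higher dimensions no such closed form exists, which is precisely why neural parameterizations of the OT map or plan, as in the methods discussed above, are needed.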

