LEARNING INCOMPRESSIBLE FLUID DYNAMICS FROM SCRATCH - TOWARDS FAST, DIFFERENTIABLE FLUID MODELS THAT GENERALIZE

Abstract

Fast and stable fluid simulations are an essential prerequisite for applications ranging from computer-generated imagery to computer-aided design in research and development. However, solving the partial differential equations of incompressible fluids is a challenging task, and traditional numerical approximation schemes come at high computational cost. Recent deep-learning-based approaches promise vast speed-ups but do not generalize to new fluid domains, require fluid simulation data for training, or rely on complex pipelines that outsource major parts of the fluid simulation to traditional methods. In this work, we propose a novel physics-constrained training approach that generalizes to new fluid domains, requires no fluid simulation data, and allows convolutional neural networks to map a fluid state at time point t to a subsequent state at time t + dt in a single forward pass. This simplifies the pipeline for training and evaluating neural fluid models. After training, the framework yields models that are capable of fast fluid simulations and can handle various fluid phenomena, including the Magnus effect and Kármán vortex streets. We present an interactive real-time demo to show the speed and generalization capabilities of our trained models. Moreover, the trained neural networks are efficient differentiable fluid solvers, as they offer a differentiable update step to advance the fluid simulation in time. We exploit this fact in a proof-of-concept optimal-control experiment. Our models significantly outperform a recent differentiable fluid solver in terms of both computational speed and accuracy.
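The update scheme described above, a network that advances the fluid state from t to t + dt in a single forward pass and is rolled out step by step, can be sketched as follows. This is a minimal illustration only: the linear `step` function stands in for the trained CNN, and all names and the toy state are assumptions, not the authors' implementation. Because every step is a differentiable operation, a loss on the final state admits gradients with respect to the parameters, which is what enables gradient-based optimal control; here the gradient of the linear stand-in is written analytically.

```python
import numpy as np

def step(state, w):
    # Stand-in for the trained CNN: one forward pass maps the fluid
    # state at time t to the state at t + dt. A simple linear damping
    # update replaces the real network here (illustrative assumption).
    return w * state

def rollout(state, w, n_steps):
    # Advance the simulation by repeatedly applying the update step.
    for _ in range(n_steps):
        state = step(state, w)
    return state

# Toy 2D array standing in for a velocity field.
state0 = np.ones((4, 4))
n = 10
final = rollout(state0, w=0.9, n_steps=n)

# Since final = w**n * state0, a loss L = sum(final) has the analytic
# gradient dL/dw = n * w**(n-1) * sum(state0); with a real network this
# gradient would be obtained by automatic differentiation instead.
grad_w = n * 0.9 ** (n - 1) * state0.sum()
```

In the paper's setting, the rollout is performed by the trained convolutional network and the gradient is obtained by backpropagating through the unrolled update steps rather than in closed form.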

1. INTRODUCTION

Simulating the behavior of fluids by solving the incompressible Navier-Stokes equations is of great importance for a wide range of applications, and accurate as well as fast fluid simulations are a long-standing research goal. On top of simulating the behavior of fluids, several applications such as sensitivity analysis of fluids or gradient-based control algorithms rely on differentiable fluid simulators that allow gradients to be propagated throughout the simulation (Holl et al., 2020). Recent advances in deep learning aim for fast and accurate fluid simulations but rely on vast datasets and/or do not generalize to new fluid domains. Kim et al. (2019) present a framework to learn parameterized fluid simulations and allow efficient interpolation between such simulations. However, their work does not generalize to new domain geometries that lie outside the training data. Kim & Lee (2020) train an RNN-GAN that produces turbulent flow fields within a pipe domain, but do not show generalization results beyond pipe domains. Xie et al. (2018) introduce a tempoGAN to perform temporally consistent super-resolution of smoke simulations. This allows to

