CROM: CONTINUOUS REDUCED-ORDER MODELING OF PDES USING IMPLICIT NEURAL REPRESENTATIONS

Abstract

The long runtime of high-fidelity partial differential equation (PDE) solvers makes them unsuitable for time-critical applications. We propose to accelerate PDE solvers using reduced-order modeling (ROM). Whereas prior ROM approaches reduce the dimensionality of discretized vector fields, our continuous reduced-order modeling (CROM) approach builds a low-dimensional embedding of the continuous vector fields themselves, not their discretization. We represent this reduced manifold using continuously differentiable neural fields, which may train on any and all available numerical solutions of the continuous system, even when they are obtained using diverse methods or discretizations. We validate our approach on an extensive range of PDEs with training data from voxel grids, meshes, and point clouds. Compared to prior discretization-dependent ROM methods, such as linear subspace proper orthogonal decomposition (POD) and nonlinear manifold neural-network-based autoencoders, CROM features higher accuracy, lower memory consumption, dynamically adaptive resolutions, and applicability to any discretization. For equal latent space dimension, CROM exhibits 79× and 49× better accuracy, and 39× and 132× smaller memory footprint, than POD and autoencoder methods, respectively. Experiments demonstrate 109× and 89× wall-clock speedups over unreduced models on CPUs and GPUs, respectively. Videos and code are available on the project page: https://crom-pde.github.io.

1. INTRODUCTION

Many scientific and engineering models are posed as partial differential equations (PDEs) of the form $\mathcal{F}(f, \nabla f, \nabla^2 f, \ldots, \dot{f}, \ddot{f}, \ldots) = 0$, $f(x, t) : \Omega \times \mathcal{T} \to \mathbb{R}^d$, subject to initial and boundary conditions. Here $f$ is a spatiotemporally varying, multidimensional continuous vector field, such as temperature, velocity, or displacement; $\nabla$ and $\dot{(\cdot)}$ denote the spatial and temporal gradients; $\Omega \subset \mathbb{R}^m$ and $\mathcal{T} \subset \mathbb{R}$ are the spatial and temporal domains, respectively. We may solve for $f$ by discretizing in space, $f(x, t) \approx f_P(x, t) = \sum_{i=1}^{P} a_i(t) N_i(x)$, transforming the continuous spatial representation into a $(P \cdot d)$-dimensional vector whose coefficients $a_i(t) : \mathcal{T} \to \mathbb{R}^d$, together with the corresponding basis functions $N_i(x) : \Omega \to \mathbb{R}$ (e.g., a polynomial or Fourier basis), approximate the continuous solution. For instance, if $N_i$ is the linear finite element basis, the coefficients $a_i(t) = f(x_i, t)$ are the field values at the spatial samples $x_i$ (Hughes, 2012). After introducing temporal samples $\{t^n\}_{n=0}^{T}$, we evolve the solution in time by solving for the $P$ unknowns $\{a_i(t^{n+1})\}$ given the previous state $\{a_i(t^n)\}$. Unfortunately, when $P$ is large, the processing and memory costs of these full-order solves become intractable. To alleviate this computational burden, prior model reduction techniques (Berkooz et al., 1993; Willcox & Peraire, 2002; Benner et al., 2015) construct a manifold-parameterization function $g_P : \mathbb{R}^r \to \mathbb{R}^{Pd}$, with $r \ll Pd$, such that every low-dimensional latent space vector $q(t) \in \mathbb{R}^r$ maps to a discrete field $g_P(q) \mapsto (a_1, \ldots, a_P)^T$. For instance, for linear finite elements (Barbič & James, 2005), $g_P(q) \mapsto (f(x_1, t), \ldots, f(x_P, t))^T$.
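The spatial discretization above can be illustrated concretely. The following sketch (not from the paper; the grid, field, and node count are illustrative assumptions) builds a 1D linear finite element interpolant $f_P(x) = \sum_i a_i N_i(x)$ with "hat" basis functions, and checks the stated property that, for this basis, the coefficients $a_i$ coincide with the field values at the nodes $x_i$:

```python
import numpy as np

# Hypothetical sketch: spatial discretization f(x) ≈ sum_i a_i N_i(x) using
# linear ("hat") finite element basis functions on a uniform 1D grid.
nodes = np.linspace(0.0, 1.0, 11)   # spatial samples x_i (illustrative choice)
h = nodes[1] - nodes[0]             # uniform grid spacing

def hat(i, x):
    """Piecewise-linear basis N_i: equals 1 at node i and 0 at every other node."""
    return np.clip(1.0 - np.abs(x - nodes[i]) / h, 0.0, None)

f = lambda x: np.sin(2.0 * np.pi * x)   # a continuous field to approximate
a = f(nodes)                             # for linear FE, a_i = f(x_i)

def f_P(x):
    """Discretized field: linear combination of basis functions."""
    return sum(a[i] * hat(i, x) for i in range(len(nodes)))

# The interpolant reproduces the field exactly at the nodes...
err_nodes = np.max(np.abs(f_P(nodes) - f(nodes)))
# ...and approximates it between nodes with error shrinking as O(h^2).
x_mid = 0.5 * (nodes[:-1] + nodes[1:])
err_mid = np.max(np.abs(f_P(x_mid) - f(x_mid)))
print(err_nodes, err_mid)
```

With $h = 0.1$, the mid-node error is bounded by roughly $(h^2/8)\max|f''| \approx 0.05$, while the nodal error is zero up to floating point, matching the claim that $a_i(t) = f(x_i, t)$ for the linear finite element basis.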
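The manifold-parameterization function $g_P$ can likewise be sketched for the simplest case the paper cites, a linear subspace constructed via POD. The snapshot sizes and synthetic data below are illustrative assumptions, not the paper's setup; the point is only that a truncated SVD of snapshot data yields a linear map $g_P(q) = U_r\, q$ from $\mathbb{R}^r$ to $\mathbb{R}^P$:

```python
import numpy as np

# Hypothetical sketch of linear (POD-style) model reduction: build
# g_P : R^r -> R^P from a matrix of discrete-field snapshots.
rng = np.random.default_rng(0)
P, r, n_snapshots = 200, 5, 50   # illustrative sizes, with r << P

# Synthetic snapshots whose columns lie near an r-dimensional subspace.
basis_true = rng.standard_normal((P, r))
coeffs = rng.standard_normal((r, n_snapshots))
snapshots = basis_true @ coeffs + 1e-3 * rng.standard_normal((P, n_snapshots))

# POD: the leading r left singular vectors give the reduced basis U_r.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
U_r = U[:, :r]

def g_P(q):
    """Linear manifold parameterization: latent q in R^r -> discrete field in R^P."""
    return U_r @ q

def encode(a):
    """Orthogonal projection of a full discrete field onto the latent space."""
    return U_r.T @ a

# Round trip: a snapshot is reconstructed accurately from its r latent coordinates.
a = snapshots[:, 0]
rel_err = np.linalg.norm(a - g_P(encode(a))) / np.linalg.norm(a)
print(U_r.shape, rel_err)
```

Time evolution then updates only the $r$ latent coordinates $q(t)$ rather than the $P$ coefficients, which is the source of ROM speedups; nonlinear variants replace the matrix $U_r$ with a learned decoder.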

