FAST NONLINEAR VECTOR QUANTILE REGRESSION

Abstract

Quantile regression (QR) is a powerful tool for estimating one or more conditional quantiles of a target variable Y given explanatory features X. A limitation of QR is that it is only defined for scalar target variables, due to the formulation of its objective function, and since the notion of quantiles has no standard definition for multivariate distributions. Recently, vector quantile regression (VQR) was proposed as an extension of QR for vector-valued target variables, thanks to a meaningful generalization of the notion of quantiles to multivariate distributions via optimal transport. Despite its elegance, VQR is arguably not applicable in practice due to several limitations: (i) it assumes a linear model for the quantiles of the target Y given the features X; (ii) its exact formulation is intractable even for modestly-sized problems in terms of target dimensions, number of regressed quantile levels, or number of features, and its relaxed dual formulation may violate the monotonicity of the estimated quantiles; (iii) no fast or scalable solvers for VQR currently exist. In this work we fully address these limitations, namely: (i) We extend VQR to the nonlinear case, showing substantial improvement over linear VQR; (ii) We propose vector monotone rearrangement, a method which ensures that the quantile functions estimated by VQR are monotone functions; (iii) We provide fast, GPU-accelerated solvers for linear and nonlinear VQR which maintain a fixed memory footprint, and demonstrate that they scale to millions of samples and thousands of quantile levels; (iv) We release an optimized Python package of our solvers to promote the widespread use of VQR in real-world applications.

1. INTRODUCTION

Quantile regression (QR) (Koenker & Bassett, 1978) is a well-known method which estimates a conditional quantile of a target variable Y, given covariates X. A major limitation of QR is that it deals with a scalar-valued target variable, while many important applications require estimation of vector-valued responses. A trivial approach is to estimate conditional quantiles separately for each component of the vector-valued target. However, this assumes statistical independence between targets, a very strong assumption that rarely holds in practice. Extending QR to high-dimensional responses is not straightforward because (i) the notion of quantiles is not trivial to define for high-dimensional variables, and in fact multiple definitions of multivariate quantiles exist (Carlier et al., 2016); (ii) quantile regression is performed by minimizing the pinball loss, which is not defined for high-dimensional responses. Carlier et al. (2016) and Chernozhukov et al. (2017) introduced a notion of quantiles for vector-valued random variables, termed vector quantiles. Key to their approach is extending the notions of monotonicity and strong representation of scalar quantile functions to high dimensions, i.e.
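As a concrete illustration of the scalar case described above, the following is a minimal sketch (not from the paper) of how the pinball loss is defined and how minimizing it over a grid of constants recovers an empirical quantile; a real QR model would instead fit a function of the covariates X:

```python
import numpy as np

def pinball_loss(y_true, y_pred, u):
    """Pinball (quantile) loss at level u in (0, 1)."""
    residual = y_true - y_pred
    return np.mean(np.maximum(u * residual, (u - 1) * residual))

# Minimizing the pinball loss over constant predictions recovers the
# empirical u-th quantile of the sample (illustration only).
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)   # samples from N(0, 1)
u = 0.9
candidates = np.linspace(-3.0, 3.0, 601)
losses = [pinball_loss(y, c, u) for c in candidates]
q_hat = candidates[int(np.argmin(losses))]
# q_hat approximates the 0.9 quantile of N(0, 1), roughly 1.28
```

This asymmetry of the loss (weighting positive residuals by u and negative ones by u - 1) is exactly what has no standard multivariate analogue, motivating the optimal-transport-based definition used by VQR.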

Co-monotonicity:

$(Q_Y(u) - Q_Y(u'))^\top (u - u') \geq 0, \quad \forall\, u, u' \in [0, 1]^d$ (1)

Strong representation:

$Y = Q_Y(U), \quad U \sim \mathrm{U}[0, 1]^d$ (2)

where Y is a d-dimensional variable, and $Q_Y : [0, 1]^d \to \mathbb{R}^d$ is its vector quantile function (VQF). Moreover, Carlier et al. (2016) extended QR to vector-valued targets, which leads to vector quantile regression (VQR). VQR estimates the conditional vector quantile function (CVQF) $Q_{Y|X}$ from
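The co-monotonicity property in eq. (1) can be checked numerically. As a hedged sketch (not from the paper), the example below uses an affine map $Q_Y(u) = Au + b$ with a symmetric positive semi-definite A; such a map is the gradient of a convex potential and therefore satisfies eq. (1) for all pairs of quantile levels:

```python
import numpy as np

rng = np.random.default_rng(42)
d = 3

# A symmetric PSD matrix makes Q(u) = A @ u + b the gradient of the
# convex potential 0.5 * u^T A u + b^T u, hence co-monotone (eq. 1).
M = rng.normal(size=(d, d))
A = M @ M.T                      # symmetric positive semi-definite
b = rng.normal(size=d)

def Q(u):
    """Toy vector quantile function on [0, 1]^d."""
    return A @ u + b

# Verify (Q(u) - Q(u'))^T (u - u') >= 0 on random pairs in [0, 1]^d.
co_monotone = True
for _ in range(1000):
    u1, u2 = rng.random(d), rng.random(d)
    co_monotone &= float((Q(u1) - Q(u2)) @ (u1 - u2)) >= -1e-9
```

Here co-monotonicity holds because $(Au_1 - Au_2)^\top (u_1 - u_2) = (u_1 - u_2)^\top A (u_1 - u_2) \geq 0$ for PSD A; a general VQF need not be affine, but by Brenier's theorem it is likewise a gradient of a convex function.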

