CUBIC SPLINE SMOOTHING COMPENSATION FOR IRREGULARLY SAMPLED SEQUENCES

Abstract

The marriage of recurrent neural networks and neural ordinary differential equations (ODE-RNN) is effective at modeling irregularly sampled sequences. While the ODE produces smooth hidden states within each observation interval, the RNN triggers a hidden-state jump whenever a new observation arrives, causing an interpolation discontinuity problem. To address this issue, we propose cubic spline smoothing compensation, a stand-alone module applied to either the output or the hidden state of ODE-RNN that can be trained end-to-end. We derive its analytical solution and provide a theoretical bound on its interpolation error. Extensive experiments demonstrate its merits over both ODE-RNN and cubic spline interpolation.

1. INTRODUCTION

Recurrent neural networks (RNNs) are commonly used for modeling regularly sampled sequences (Cho et al., 2014). However, the standard RNN can only process discrete series without accounting for the unequal temporal intervals between sample points, so it fails to model the irregularly sampled time series common in domains such as healthcare (Rajkomar et al., 2018) and finance (Fagereng & Halvorsen, 2017). While some works adapt RNNs to handle such irregular scenarios, they often assume an exponential decay (either at the output or the hidden state) over the interval between observations (Che et al., 2018; Cao et al., 2018), which may not always hold. To remove the exponential decay assumption and better model the underlying dynamics, Chen et al. (2018) proposed using a neural ordinary differential equation (ODE) to model the continuous dynamics of hidden states during observation intervals. Leveraging a learnable ODE parametrized by a neural network, their method offers greater modeling capability and flexibility. However, an ODE's trajectory is fully determined by its initial state, so it cannot adjust the trajectory according to subsequent observations. A popular way to leverage subsequent observations is ODE-RNN (Rubanova et al., 2019; De Brouwer et al., 2019), which updates the hidden state upon each observation using an RNN and evolves the hidden state with an ODE between observations. While the ODE produces smooth hidden states between observation intervals, the RNN triggers a hidden-state jump at each observation point. This inconsistency (discontinuity) is hard to reconcile and jeopardizes continuous time-series modeling, especially for interpolation tasks (Fig. 1 top-left). We propose a Cubic Spline Smoothing Compensation (CSSC) module to tackle this discontinuity problem; it is especially suited to continuous time-series interpolation.
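The ODE-RNN mechanism described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the names `ode_field`, `rnn_jump`, and the random weight matrices are hypothetical stand-ins for learned components, and a simple Euler solver replaces an adaptive ODE solver.

```python
import numpy as np

rng = np.random.default_rng(0)
H, D = 8, 3  # hidden size and observation size (illustrative choices)

# Hypothetical "learned" parameters: an ODE vector field and an RNN update.
W_f = rng.normal(scale=0.1, size=(H, H))
W_z = rng.normal(scale=0.1, size=(H, H + D))

def ode_field(h):
    """Neural ODE dynamics dh/dt = f(h); a one-layer tanh net here."""
    return np.tanh(W_f @ h)

def rnn_jump(h, x):
    """RNN update at an observation; this overwrite is the discontinuity."""
    return np.tanh(W_z @ np.concatenate([h, x]))

def ode_rnn(times, observations, n_euler=20):
    """Evolve h smoothly with the ODE between observations,
    then jump with the RNN when each observation arrives."""
    h = np.zeros(H)
    states = []
    t_prev = times[0]
    for t, x in zip(times, observations):
        dt = (t - t_prev) / n_euler
        for _ in range(n_euler):      # smooth ODE evolution over the gap
            h = h + dt * ode_field(h)
        h = rnn_jump(h, x)            # discontinuous jump at the observation
        states.append(h.copy())
        t_prev = t
    return np.stack(states)

times = np.array([0.0, 0.4, 1.1, 1.5])   # irregular sampling times
obs = rng.normal(size=(4, D))
hs = ode_rnn(times, obs)
print(hs.shape)  # one hidden state per observation
```

The jump in `rnn_jump` is exactly what makes interpolation between observations discontinuous: the hidden trajectory evolved by the ODE up to time t does not, in general, match the state the RNN produces at t.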
Our CSSC employs the cubic spline as a means of compensation for the ODE-RNN to eliminate the jump, as illustrated in Fig. 1 top-right. While the latent ODE (Rubanova et al., 2019) with an encoder-decoder structure can also produce continuous interpolations, CSSC further ensures that the interpolated curve passes strictly through the observation points. Importantly, we derive a closed-form solution for CSSC and obtain its interpolation error bound. The error bound suggests two key factors for good interpolation: the time interval between observations and the performance of the underlying ODE-RNN. Furthermore, we propose hidden CSSC, which compensates the hidden state of ODE-RNN (Fig. 1 bottom); it not only assuages the discontinuity problem but is also more efficient when the observations are high-dimensional and exhibit continuity only at the semantic level. We conduct extensive experiments and ablation studies to demonstrate the effectiveness of CSSC and hidden CSSC, both of which outperform the comparison methods.
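The compensation idea can be illustrated with an off-the-shelf spline. The sketch below is not the authors' closed-form CSSC solution; it simply shows the underlying principle, assuming a hypothetical `model_output` standing in for a (discontinuous) ODE-RNN output: spline the residuals at the observation points with `scipy.interpolate.CubicSpline`, and the compensated curve passes strictly through every observation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Irregularly sampled observations of a 1-D signal.
t_obs = np.array([0.0, 0.3, 1.0, 1.4, 2.0])
x_obs = np.sin(2 * np.pi * t_obs)

def model_output(t):
    """Stand-in for an ODE-RNN output, with an artificial jump at t = 1."""
    return np.sin(2 * np.pi * t) + 0.1 * (t > 1.0)

# Compensation: fit a cubic spline to the residuals at the observations.
residual = x_obs - model_output(t_obs)
comp = CubicSpline(t_obs, residual, bc_type='natural')

def compensated(t):
    """Model output plus spline compensation; exact at every observation."""
    return model_output(t) + comp(t)

# The compensated curve interpolates the data exactly.
assert np.allclose(compensated(t_obs), x_obs)
```

Because the spline residual is zero-matched at each observation time, the combined curve is continuous between observations and interpolates the data exactly, which is the property the error-bound analysis then quantifies in terms of interval length and model accuracy.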

