IRREGULARITY REFLECTION NEURAL NETWORK FOR TIME SERIES FORECASTING

Abstract

Time series forecasting is a long-standing challenge in a variety of industries, and deep learning stands as the mainstream paradigm for handling this forecasting problem. Following recent successes, representations of time series components (e.g., trend and seasonality) are also considered in the learning process of these models. However, the residual remains underexplored because its inherent complexity is difficult to formulate. In this study, we propose a novel Irregularity Reflection Neural Network (IRN) that reflects the residual in time series forecasting. First, we redefine the residual as the irregularity and express it as a sum of individual, short regular waves by considering the Fourier series from a micro perspective. Second, we design a convolutional module, named the Irregularity Representation Block (IRB), that mimics the variables of the derived irregularity representation. IRN comprises an IRB on top of a forecasting model to learn the irregularity representation of the time series. Extensive experiments on multiple real-world datasets demonstrate that IRN outperforms state-of-the-art benchmarks in time series forecasting tasks.

1. INTRODUCTION

Figure 1: The Traffic data and its time series components (i.e., trend, seasonality, and irregularity).

Owing to ubiquitous computing systems, time series data are available in a wide range of domains, including traffic (Chen et al., 2001), power plants (Gensler et al., 2016), stock market indices (Song et al., 2021), and more (Liu et al., 2015; Duan et al., 2021). Naturally, interest in time series forecasting has grown, prompting intensive research into more accurate prediction.
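The decomposition shown in Figure 1 can be sketched with a classical moving-average decomposition. The snippet below is a minimal illustration, not the paper's method: the synthetic series, the daily period of 24, and the moving-average window are all illustrative assumptions standing in for the actual Traffic data.

```python
import numpy as np

# Synthetic hourly "traffic" series: linear trend + daily cycle + noise.
# Series length and period are illustrative assumptions.
rng = np.random.default_rng(0)
period = 24
t = np.arange(24 * 14)  # two weeks of hourly points
series = 0.01 * t + np.sin(2 * np.pi * t / period) + 0.3 * rng.standard_normal(t.size)

# Trend: centered moving average over one full period.
kernel = np.ones(period) / period
trend = np.convolve(series, kernel, mode="same")

# Seasonality: mean detrended value at each position within the period.
detrended = series - trend
seasonal_means = np.array([detrended[p::period].mean() for p in range(period)])
seasonality = np.tile(seasonal_means, t.size // period + 1)[: t.size]

# Irregularity (residual): what remains after removing trend and seasonality.
irregularity = series - trend - seasonality
```

By construction, the three components sum back to the original series; the irregularity is the part left unexplained by trend and seasonality, which is the component IRN targets.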



In the recent literature, many deep learning models have been favored for forecasting problems (Lim & Zohren, 2021). Recurrent Neural Networks (RNNs) and their extensions, such as Long Short-Term Memory (LSTM) (Hochreiter & Schmidhuber, 1997) and the Gated Recurrent Unit (GRU) (Chung et al., 2014), are popular choices for analyzing long sequences. Nevertheless, these models tend to be limited in handling multivariate time series. As a powerful alternative, Convolutional Neural Networks (CNNs) have been introduced to capture the overall characteristics of time series through parallel computation and filter operations. Building on this success in forecasting tasks, CNN-based models have been proposed for different types of time series data. The Temporal Convolutional Network (TCN) was applied to audio datasets (Oord et al., 2016), whereas the Graph Convolutional Network (GCN) was utilized for time series with graph characteristics (e.g., human skeleton-based action recognition (Zhang et al., 2020) and traffic datasets (Bai et al., 2020)). Attention models have also been applied to emphasize the specific sequence data that are primarily referenced when making predictions (Liu et al., 2021b).
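The filter operations underlying TCN-style models can be illustrated with a causal dilated 1-D convolution, in which each output depends only on the current and past inputs. The sketch below is a hypothetical minimal implementation (the function name and filter values are our own, not from any cited work), written in plain NumPy for clarity rather than efficiency.

```python
import numpy as np

def causal_dilated_conv1d(x, weights, dilation=1):
    """Causal dilated 1-D convolution: output y[t] mixes x[t], x[t-d],
    x[t-2d], ... so no future values leak into the prediction.
    x: (T,) input sequence; weights: (K,) filter taps (tap 0 = most recent)."""
    T, K = x.size, weights.size
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            idx = t - k * dilation  # look back k * dilation steps
            if idx >= 0:
                y[t] += weights[k] * x[idx]
    return y

# Average each point with the value two steps earlier (dilation = 2).
x = np.arange(8, dtype=float)
y = causal_dilated_conv1d(x, np.array([0.5, 0.5]), dilation=2)
```

Stacking such layers with geometrically growing dilations is what lets TCNs cover long histories with few parameters, which is the property the CNN-based forecasters above exploit.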

