# Deep and Confident Prediction for Time Series at Uber

```bibtex
@article{Zhu2017DeepAC,
  title   = {Deep and Confident Prediction for Time Series at Uber},
  author  = {Lingxue Zhu and Nikolay Pavlovich Laptev},
  journal = {2017 IEEE International Conference on Data Mining Workshops (ICDMW)},
  year    = {2017},
  pages   = {103-110}
}
```

Reliable uncertainty estimation for time series prediction is critical in many fields, including physics, biology, and manufacturing. At Uber, probabilistic time series forecasting is used for robust prediction of the number of trips during special events, driver incentive allocation, and real-time anomaly detection across millions of metrics. Classical time series models are often used in conjunction with a probabilistic formulation for uncertainty estimation. However, such models are hard…
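The paper's central technique is Monte Carlo (MC) dropout: keep dropout active at inference time and treat repeated stochastic forward passes as samples from an approximate posterior, so their spread estimates model uncertainty. A minimal, framework-free sketch of that step, using a toy network with placeholder random weights standing in for a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer regression network. The weights below are random
# placeholders; in practice they would come from training the forecaster.
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1))
b2 = np.zeros(1)

def forward(x, p_drop=0.1):
    """One stochastic forward pass with dropout kept ON at inference time."""
    h = np.tanh(x @ W1 + b1)
    mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)         # inverted-dropout rescaling
    return h @ W2 + b2

def mc_dropout_predict(x, n_samples=200):
    """Predictive mean and model-uncertainty std from repeated stochastic passes."""
    preds = np.stack([forward(x) for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x)
# An approximate 95% interval for the model-uncertainty component:
lower, upper = mean - 1.96 * std, mean + 1.96 * std
```

The full method in the paper also accounts for inherent noise and model misspecification; this sketch covers only the MC-dropout model-uncertainty term.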


#### 199 Citations

MOrdReD: Memory-based Ordinal Regression Deep Neural Networks for Time Series Forecasting

- Mathematics, Computer Science · ArXiv
- 2018

This work proposes a novel, end-to-end deep learning method for time series forecasting that not only provides an excellent predictive forecast, shadowing true future values, but also allows inferring valuable information, such as the predictive distribution of the occurrence of critical events of interest, accurately and reliably even over long time horizons.

DeepPIPE: A distribution-free uncertainty quantification approach for time series forecasting

- Computer Science · Neurocomputing
- 2020

A novel end-to-end framework called deep prediction interval and point estimation (DeepPIPE) that simultaneously performs multi-step point estimation and uncertainty quantification for time series forecasting and utilizes a novel hybrid loss function that improves the accuracy and stability of forecasting.

Probabilistic forecasting for energy time series considering uncertainties based on deep learning algorithms

- Computer Science · Electric Power Systems Research
- 2021

Probabilistic extensions to Deep Learning algorithms and their application on energy time series forecasting showed that Concrete Dropout, Deep Ensembles, and Bayesian Neural Networks performed similarly well and better than the reference methods.

Quantifying Uncertainty in Deep Spatiotemporal Forecasting

- Computer Science, Mathematics · KDD
- 2021

This paper describes two types of spatiotemporal forecasting problems, regular grid-based and graph-based, and analyzes UQ methods from both the Bayesian and the frequentist point of view, cast in a unified framework via statistical decision theory.

Applying SVGD to Bayesian Neural Networks for Cyclical Time-Series Prediction and Inference

- Computer Science, Mathematics · ArXiv
- 2019

A regression-based BNN model is proposed to predict spatiotemporal quantities like hourly rider demand with calibrated uncertainties, capable of producing time series predictions as well as measures of uncertainty surrounding the predictions.

Fast Memory-efficient Extreme Events Prediction in Complex Time series

- Computer Science
- 2020

This paper proposes a generic memory-efficient framework for realtime stochastic extreme events prediction in complex time series systems such as intrusion detection, Internet of Things (IoT), social…

Addressing model uncertainty in probabilistic forecasting using Monte Carlo dropout

- Computer Science
- 2020

This work proposes addressing the model uncertainty problem using Monte Carlo dropout, a variational approach that assigns distributions to the weights of a neural network instead of simply using fixed values.

Long-range forecasting in feature-evolving data streams

- Computer Science · Knowl. Based Syst.
- 2020

This paper proposes the OFAT algorithm, a stochastic deep neural network framework that addresses the stated problems collectively, and demonstrates that OFAT is fast, robust, accurate, and superior to state-of-the-art methods.

Uncertainty Intervals for Graph-based Spatio-Temporal Traffic Prediction

- Computer Science, Mathematics · ArXiv
- 2020

This work proposes Quantile Graph Wavenet, a Spatio-Temporal neural network that is trained to estimate a density given the measurements of previous timesteps, conditioned on a quantile, and produces uncertainty estimates without the need to sample during inference, such as in Monte Carlo Dropout.

Recurrent neural networks for time series prediction

- Computer Science
- 2020

This thesis proposes a novel approach to infer accurate and reliable predictive distributions from scarce time series data through a recurrent neural network approach that infers a joint feature representation over the space of quantised time series by means of a compilation of auxiliary datasets.

#### References

Showing 1-10 of 24 references

Time-series Extreme Event Forecasting with Neural Networks at Uber

- 2017

Accurate time-series forecasting during high variance segments (e.g., holidays), is critical for anomaly detection, optimal resource allocation, budget planning and other related tasks. At Uber…

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Mathematics, Computer Science · ICML
- 2016

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.

Concrete Dropout

- Computer Science, Mathematics · NIPS
- 2017

This work proposes a new dropout variant which gives improved performance and better calibrated uncertainties, and uses a continuous relaxation of dropout’s discrete masks to allow for automatic tuning of the dropout probability in large models, and as a result faster experimentation cycles.

Long Short-Term Memory

- Computer Science, Medicine · Neural Computation
- 1997

A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.

Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks

- Computer Science, Mathematics · ICML
- 2015

This work presents a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP), which works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients.

A new boosting algorithm for improved time-series forecasting with recurrent neural networks

- Computer Science · Inf. Fusion
- 2008

The overall results obtained through this ensemble method are more accurate than those obtained through the standard method, backpropagation through time, on these datasets and perform significantly better even when long-range dependencies play an important role.

Bayesian Recurrent Neural Networks

- Computer Science, Mathematics · ArXiv
- 2017

This work shows that a simple adaptation of truncated backpropagation through time can yield good quality uncertainty estimates and superior regularisation at only a small extra computational cost during training, and demonstrates how a novel kind of posterior approximation yields further improvements to the performance of Bayesian RNNs.

A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

- Computer Science, Mathematics · NIPS
- 2016

This work applies a new variational-inference-based dropout technique in LSTM and GRU models, which outperforms existing techniques and, to the best of the authors' knowledge, improves on the single-model state of the art in language modelling on the Penn Treebank.

Nonlinear Systems Identification Using Deep Dynamic Neural Networks

- Computer Science · ArXiv
- 2016

It is demonstrated that deep neural networks are effective model estimators from input-output data and associated characteristics of the underlying dynamical systems.

Auto-Encoding Variational Bayes

- Mathematics, Computer Science · ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.