Liquid Time-Constant Networks (LTCs)

We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system’s dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers.
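To make the dynamics concrete, below is a minimal NumPy sketch of one fused semi-implicit Euler step for a liquid time-constant cell of the form dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, where f is a nonlinear gate and A a bias vector. The single-layer gate and the parameter names (W, U, b, tau, A) are illustrative assumptions, not the released implementation.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def ltc_fused_step(x, I, dt, params):
        """One fused semi-implicit Euler step of a liquid time-constant cell:
        dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, with f a sigmoid gate."""
        W, U, b = params["W"], params["U"], params["b"]
        tau, A = params["tau"], params["A"]
        f = sigmoid(x @ W + I @ U + b)                  # nonlinear interlinked gate
        # The gate f both scales the effective time-constant (denominator)
        # and drives the state toward A (numerator), hence "liquid" time-constant.
        return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

    # Toy usage with random parameters (shapes are illustrative only).
    rng = np.random.default_rng(0)
    hidden, in_dim = 8, 3
    params = {
        "W": 0.1 * rng.normal(size=(hidden, hidden)),
        "U": 0.1 * rng.normal(size=(in_dim, hidden)),
        "b": np.zeros(hidden),
        "tau": np.ones(hidden),
        "A": rng.normal(size=hidden),
    }
    x = np.zeros(hidden)
    for _ in range(20):                                 # unroll over a toy input stream
        x = ltc_fused_step(x, rng.normal(size=in_dim), dt=0.1, params=params)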

These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach, deriving bounds on their dynamics and measuring their expressive power by trajectory length in a latent trajectory space. We then conduct a series of time-series prediction experiments to demonstrate the approximation capability of Liquid Time-Constant Networks (LTCs) compared to modern RNNs. Code and data are available here.
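As a brief note on the expressivity measurement: trajectory length is the arc length of the path the hidden state traces in latent space as the input varies. A minimal sketch of computing it (function name and toy data are illustrative):

    import numpy as np

    def trajectory_length(latent_states):
        """Arc length of a latent trajectory: sum of Euclidean distances
        between consecutive hidden states along the unrolled sequence."""
        deltas = np.diff(np.asarray(latent_states), axis=0)
        return np.linalg.norm(deltas, axis=1).sum()

    # Example: length of a random-walk trajectory in an 8-dimensional latent space.
    states = np.cumsum(np.random.default_rng(0).normal(size=(50, 8)), axis=0)
    print(trajectory_length(states))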

Github – LTCs

Paper – Liquid Time-constant Networks

ODE-LSTMs

Ordinary Differential Equation Long Short-term Memory

ODE-LSTMs allow for learning long-term dependencies in irregularly sampled time series. The motivation arises from the fact that all Neural ODE models provably suffer from vanishing/exploding gradients and therefore have difficulty learning long-term dependencies. In our NeurIPS 2020 paper, we proposed ODE-LSTMs as a powerful time-series modeling framework that can handle data arriving at arbitrary time-stamps.
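The sketch below illustrates the core idea in NumPy: a standard LSTM memory cell c stores long-term information with healthy gradient flow, while the hidden state h is evolved by a small learned ODE over the elapsed time between observations. This is one plausible rendering under stated assumptions (a single tanh ODE function, a few explicit Euler steps, illustrative parameter names); it is not the exact formulation from the paper or the repo's API.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def ode_lstm_step(h, c, x, dt, p, n_euler=4):
        """Process one observation x that arrives dt time units after the previous
        one. The LSTM memory c is updated discretely; the hidden state h is then
        evolved by a learned ODE integrated over the irregular gap dt."""
        z = np.concatenate([x, h])
        i = sigmoid(z @ p["Wi"] + p["bi"])      # input gate
        f = sigmoid(z @ p["Wf"] + p["bf"])      # forget gate
        o = sigmoid(z @ p["Wo"] + p["bo"])      # output gate
        g = np.tanh(z @ p["Wg"] + p["bg"])      # candidate memory update
        c = f * c + i * g                       # discrete LSTM memory update
        h_prop = o * np.tanh(c)                 # LSTM output proposal
        step = dt / n_euler                     # explicit Euler over the time gap
        for _ in range(n_euler):
            dh = np.tanh(np.concatenate([h, h_prop]) @ p["Wode"] + p["bode"])
            h = h + step * dh
        return h, c

    # Toy usage: two observations arriving 0.1 and 0.7 time units apart.
    rng = np.random.default_rng(0)
    in_dim, hidden = 3, 8
    p = {k: 0.1 * rng.normal(size=(in_dim + hidden, hidden))
         for k in ("Wi", "Wf", "Wo", "Wg")}
    p.update({"b" + k: np.zeros(hidden) for k in ("i", "f", "o", "g")})
    p["Wode"] = 0.1 * rng.normal(size=(2 * hidden, hidden))
    p["bode"] = np.zeros(hidden)
    h, c = np.zeros(hidden), np.zeros(hidden)
    for x, dt in [(rng.normal(size=in_dim), 0.1), (rng.normal(size=in_dim), 0.7)]:
        h, c = ode_lstm_step(h, c, x, dt, p)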

Github – Here is a TensorFlow 2 implementation of a dozen advanced ODE-based RNNs, as well as our performant ODE-LSTM: ODE-LSTMs

Paper – Learning Long-term Dependencies in Irregularly-Sampled Time Series