Long Short-Term Memory Networks (LSTMs)
Recurrent neural networks (RNNs) and Long Short-Term Memory networks (LSTMs) have been shown to model sequence data effectively.
A recurrent neural network can be thought of as multiple copies of the same network, each passing a message to its successor.
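To make the "copies passing a message" picture concrete, here is a minimal sketch of an unrolled recurrent step in plain NumPy. The weight names (W_xh, W_hh, b_h) are illustrative placeholders, not taken from any particular library:

```python
import numpy as np

def rnn_forward(inputs, h, W_xh, W_hh, b_h):
    """Apply the same cell at every timestep, passing the hidden
    state (the 'message') from one copy of the network to the next."""
    for x_t in inputs:                          # one "copy" per timestep
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    return h                                    # final hidden state
```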
Unlike the ordinary artificial neurons in a multilayer perceptron, LSTM units contain specific gates with the following properties (sketched in code after this list):
- The input gate protects the memory cell from irrelevant inputs
- The forget gate helps the unit discard previous memory contents when they are no longer needed
- The output gate controls whether the contents of the memory cell are exposed (or not) at the unit's output
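The sketch below shows how those three gates interact in a single LSTM timestep, using the standard LSTM cell equations. The dictionary keys ('i', 'f', 'o', 'g') and parameter names (W, U, b) are naming conventions chosen for this sketch, not a library API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM timestep. W, U, and b are dicts keyed by gate name,
    holding input weights, recurrent weights, and biases."""
    i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])  # input gate: blocks irrelevant input
    f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])  # forget gate: drops old memory contents
    o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])  # output gate: exposes the cell (or not)
    g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])  # candidate memory contents
    c = f * c_prev + i * g      # updated memory cell
    h = o * np.tanh(c)          # output exposed at this timestep
    return h, c
```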
We often recommend applying LSTM networks to data that is sequential in nature, such as:
- Web session data for users traversing a web site
- Smart grid sensor data
- Mesh network sensor data
- Financial transaction data
LSTMs are adept at handling sequences with one or many columns (features) of data per timestep.
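As a minimal sketch of what multi-column input looks like in practice, the snippet below feeds a batch of multivariate sequences through PyTorch's nn.LSTM; the batch size, sequence length, and feature count are hypothetical values chosen for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical shapes: 32 sequences, 50 timesteps each,
# 4 columns (features) per timestep -- e.g., sensor readings.
lstm = nn.LSTM(input_size=4, hidden_size=64, batch_first=True)

x = torch.randn(32, 50, 4)       # (batch, timesteps, features)
output, (h_n, c_n) = lstm(x)     # output: (32, 50, 64), one vector per timestep
```

A univariate sequence is simply the special case input_size=1, so the same layer handles single-column and multi-column data.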