Programming LSTM with Keras and TensorFlow (10.2)
Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) are two layer types commonly used to build recurrent neural networks in Keras. This video introduces these two network types as a foundation for Natural Language Processing (NLP) and time series prediction.
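As a companion to the video, here is a minimal sketch (my own illustration, not the video's code) of a Keras recurrent model using an LSTM layer on toy sequence data; swapping `LSTM` for `GRU` works the same way:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Toy data: 100 samples, each a sequence of 10 time steps with 1 feature.
x = np.random.rand(100, 10, 1)
y = np.random.rand(100, 1)

model = Sequential([
    Input(shape=(10, 1)),  # (time steps, features)
    LSTM(32),              # 32 recurrent units; replace with GRU(32) to compare
    Dense(1),              # single regression output
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=1, verbose=0)

pred = model.predict(x[:5], verbose=0)
print(pred.shape)  # one prediction per input sequence
```

The key point is the 3-D input shape `(samples, time steps, features)` that recurrent layers expect.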
Code for This Video:
Course Homepage: https://sites.wustl.edu/jeffheaton/t81-558/
Support Me on Patreon: https://www.patreon.com/jeffheaton