Programming LSTM with Keras and TensorFlow (10.2)

Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) layers are two layer types commonly used to build recurrent neural networks in Keras. This video introduces these two layer types as a foundation for Natural Language Processing (NLP) and time series prediction.
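As a rough sketch of the idea (not code from the course notebook), an LSTM layer and a GRU layer are interchangeable in a Keras `Sequential` model; the shapes and hyperparameters below are illustrative only:

```python
import numpy as np
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import LSTM, GRU, Dense

# Toy data: 100 samples, each a sequence of 10 timesteps with 1 feature
x = np.random.rand(100, 10, 1)
y = np.random.rand(100, 1)

# An LSTM-based regression model
lstm_model = Sequential([
    Input(shape=(10, 1)),
    LSTM(32),          # 32 recurrent units
    Dense(1),          # single regression output
])
lstm_model.compile(optimizer="adam", loss="mse")

# A GRU layer drops in where the LSTM layer was
gru_model = Sequential([
    Input(shape=(10, 1)),
    GRU(32),
    Dense(1),
])
gru_model.compile(optimizer="adam", loss="mse")

lstm_model.fit(x, y, epochs=1, verbose=0)
print(lstm_model.predict(x[:1], verbose=0).shape)
```

Both layers consume input shaped `(batch, timesteps, features)` and, with default settings, emit only the final hidden state, which is why a plain `Dense` head can follow them directly.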

Code for This Video:
https://github.com/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class_10_2_lstm.ipynb
Course Homepage: https://sites.wustl.edu/jeffheaton/t81-558/

Follow Me/Subscribe:
https://www.youtube.com/user/HeatonResearch
https://github.com/jeffheaton

Support Me on Patreon: https://www.patreon.com/jeffheaton
