Milestones in Neural Natural Language Processing with Sebastian Ruder – TWiML Talk #195
In our conversation, Sebastian and I discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also discuss the use of attention-based models, Tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his recent ULMFiT paper, short for "Universal Language Model Fine-tuning for Text Classification," which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.
For the complete show notes for this episode, visit https://twimlai.com/talk/195.
We hope you will enjoy this and some of our 14k+ other artificial intelligence videos. We keep adding new channels and playlists all the time, so the number of fresh videos keeps growing every day.