PyTorch Tutorial 11 – Softmax and Cross Entropy

New Tutorial series about Deep Learning with PyTorch!
⭐ Check out Tabnine, the FREE AI-powered code completion tool I use to help me code faster: *

In this part we learn about the softmax function and the cross entropy loss function. Softmax and cross entropy are popular functions used in neural nets, especially in multiclass classification problems. Learn the math behind these functions, when and how to use them in PyTorch, and the differences between binary and multiclass classification problems.
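As a quick preview of the math, softmax exponentiates each raw score (logit) and normalizes so the outputs form a probability distribution. A minimal sketch (the example logits are made up for illustration):

```python
import torch

# Softmax: exponentiate each logit, then normalize so the outputs sum to 1.
def softmax(x):
    exps = torch.exp(x)
    return exps / exps.sum()

logits = torch.tensor([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs)        # the largest logit gets the largest probability
print(probs.sum())  # tensor(1.)
```

This matches PyTorch's built-in `torch.softmax(logits, dim=0)`, which is what you would use in practice.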

– Softmax function
– Cross entropy loss
– Use softmax and cross entropy in PyTorch
– Differences between binary and multiclass classification
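The topics above can be sketched in a few lines of PyTorch. Note the key pitfall covered in the video: `nn.CrossEntropyLoss` expects raw logits and integer class labels (it applies log-softmax internally), so you must not add a softmax layer before it. The tensors below are illustrative values, not from the tutorial:

```python
import torch
import torch.nn as nn

# Multiclass: CrossEntropyLoss takes raw logits (no softmax layer!)
# and integer class indices (not one-hot vectors).
loss_fn = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 1.0, 0.1]])  # shape: (batch, n_classes)
target = torch.tensor([0])                # index of the correct class
loss = loss_fn(logits, target)

# Internally, CrossEntropyLoss = LogSoftmax + NLLLoss:
manual = nn.NLLLoss()(torch.log_softmax(logits, dim=1), target)
print(loss.item(), manual.item())  # the two values agree

# Binary classification: a single output with sigmoid + binary cross entropy.
# BCEWithLogitsLoss fuses the sigmoid for numerical stability.
bce = nn.BCEWithLogitsLoss()
raw_score = torch.tensor([0.8])
label = torch.tensor([1.0])
print(bce(raw_score, label).item())
```

This illustrates the binary vs. multiclass difference: one sigmoid output with BCE for two classes, softmax over N logits with cross entropy for many.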

Part 11: Softmax and Cross Entropy

📚 Get my FREE NumPy Handbook:

📓 Notebooks available on Patreon:

⭐ Join Our Discord:

If you enjoyed this video, please subscribe to the channel!

Official website:

Part 01:

Code for this tutorial series:

You can find me here:

#Python #DeepLearning #Pytorch

* This is a sponsored link. By clicking on it you will not have any additional costs, instead you will support me and my project. Thank you so much for the support! 🙏
