PyTorch Tutorial 11 – Softmax and Cross Entropy

New Tutorial series about Deep Learning with PyTorch!
⭐ Check out Tabnine, the FREE AI-powered code completion tool I use to help me code faster: https://www.tabnine.com/?utm_source=youtube.com&utm_campaign=PythonEngineer *

In this part we learn about the softmax function and the cross entropy loss. Softmax and cross entropy are popular functions in neural networks, especially for multiclass classification problems. Learn the math behind these functions, and when and how to use them in PyTorch. Also learn the differences between multiclass and binary classification problems.
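
As a quick reference, here is a minimal sketch of the softmax function covered in the video: softmax(x_i) = exp(x_i) / sum_j exp(x_j), which turns raw scores (logits) into probabilities that sum to 1. The array values below are just made-up example logits.

import numpy as np
import torch

def softmax(x):
    # softmax(x)_i = exp(x_i) / sum_j exp(x_j)
    return np.exp(x) / np.sum(np.exp(x), axis=0)

x = np.array([2.0, 1.0, 0.1])       # example logits (made up)
print(softmax(x))                   # ~[0.659, 0.242, 0.099]

x = torch.tensor([2.0, 1.0, 0.1])
print(torch.softmax(x, dim=0))      # same result with PyTorch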

– Softmax function
– Cross entropy loss
– Use softmax and cross entropy in PyTorch (see the sketch after this list)
– Differences between binary and multiclass classification
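
For the PyTorch usage and the binary-vs-multiclass point above, here is a minimal sketch with made-up example tensors. nn.CrossEntropyLoss expects raw logits (no softmax layer at the end of the model) and class labels as indices (not one-hot), because it applies LogSoftmax and the negative log-likelihood loss internally. For binary classification you would instead use a single output with a sigmoid and nn.BCELoss.

import torch
import torch.nn as nn

loss = nn.CrossEntropyLoss()

# 3 samples, 3 classes: raw logits, NOT softmax outputs
predictions = torch.tensor([[2.0, 1.0, 0.1],
                            [0.5, 2.5, 0.3],
                            [0.2, 0.4, 3.0]])
labels = torch.tensor([0, 1, 2])    # class indices, NOT one-hot

print(loss(predictions, labels))    # scalar loss, averaged over the batch

# predicted class = index of the largest logit per row
_, predicted = torch.max(predictions, dim=1)
print(predicted)                    # tensor([0, 1, 2])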

Part 11: Softmax and Cross Entropy

📚 Get my FREE NumPy Handbook:
https://www.python-engineer.com/numpybook

📓 Notebooks available on Patreon:
https://www.patreon.com/patrickloeber

⭐ Join Our Discord: https://discord.gg/FHMg9tKFSN

If you enjoyed this video, please subscribe to the channel!

Official website:
https://pytorch.org/

Code for this tutorial series:
https://github.com/python-engineer/pytorchTutorial

You can find me here:
Website: https://www.python-engineer.com
Twitter: https://twitter.com/python_engineer
GitHub: https://github.com/python-engineer

#Python #DeepLearning #Pytorch

———————————————————————————————————-
* This is a sponsored link. Clicking on it does not cost you anything extra; instead, you support me and my project. Thank you so much for the support! 🙏
