PyTorch Tutorial 12 – Activation Functions

New Tutorial series about Deep Learning with PyTorch!
⭐ Check out Tabnine, the FREE AI-powered code completion tool I use to help me code faster: *

In this part we learn about activation functions in neural nets: what activation functions are, why they are needed, and how to apply them in PyTorch.

I go over the following activation functions:
– Binary Step
– Sigmoid
– TanH (Hyperbolic Tangent)
– ReLU
– Leaky ReLU
– Softmax
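As a quick sketch of what the video covers, most of these activations are available directly in PyTorch, either as functions (`torch.*` / `torch.nn.functional`) or as layer modules (`torch.nn.*`) you drop into a model. The `NeuralNet` class below is a hypothetical example, not code from the video:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 1.0, 2.0, 3.0])

# Option 1: call activations as functions
s = torch.sigmoid(x)                      # squashes to (0, 1)
t = torch.tanh(x)                         # squashes to (-1, 1)
r = torch.relu(x)                         # max(0, x)
lr = F.leaky_relu(x, negative_slope=0.01) # small slope for x < 0
sm = torch.softmax(x, dim=0)              # probabilities summing to 1

# Option 2: use activations as nn modules inside a network
class NeuralNet(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.linear1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.linear2 = nn.Linear(hidden_size, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        out = self.relu(self.linear1(x))
        out = self.sigmoid(self.linear2(out))
        return out

model = NeuralNet(input_size=4, hidden_size=8)
y = model(x.unsqueeze(0))  # output in (0, 1)
```

Binary Step has no dedicated PyTorch module since its zero gradient makes it unusable for backpropagation; it is mainly discussed for intuition.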

📚 Get my FREE NumPy Handbook:

📓 Notebooks available on Patreon:

⭐ Join Our Discord:

Part 12: Activation Functions

If you enjoyed this video, please subscribe to the channel!

Official website:

Part 01:

Code for this tutorial series:

You can find me here:

#Python #DeepLearning #Pytorch

* This is a sponsored link. By clicking on it you will not have any additional costs, instead you will support me and my project. Thank you so much for the support! 🙏
