Overfitting test for deep learning in PyTorch Lightning

A useful sanity check when implementing a new model (or when checking whether a model might work with your data) is to try to overfit a single batch. If your model can't even do this, you know you have a major implementation problem.

In PyTorch Lightning you can set the Trainer flag `overfit_batches=1` (or a fraction of the training set, e.g. `overfit_batches=0.01`) without making any other changes to your code. Lightning will turn off shuffling for you and use the same batch for both train and val.

Documentation
https://pytorch-lightning.readthedocs.io/en/stable/trainer.html#overfit-batches

Colab:
https://colab.research.google.com/drive/1jO8iyCyfC1BiJbjAdqcQxc7Qclil5hok?usp=sharing
