Overfitting test for deep learning in PyTorch Lightning
A useful sanity check when implementing a new model (or when checking whether a model might work with your data) is to try to overfit a single batch. If the model cannot drive the training loss toward zero on one fixed batch, you almost certainly have an implementation problem.
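The idea behind the check can be illustrated without any framework. The sketch below (toy model, data, and function names are all hypothetical) fits a one-parameter linear model to a single fixed batch with plain gradient descent; a correct training loop should collapse the loss to near zero.

```python
# Framework-free sketch of the single-batch overfitting sanity check:
# fit y = w * x to one fixed "batch" and watch the loss go to ~0.

def overfit_single_batch(xs, ys, lr=0.01, steps=500):
    """Train on one fixed batch; return final weight and loss history."""
    w = 0.0
    losses = []
    n = len(xs)
    for _ in range(steps):
        # Mean squared error on the single batch.
        loss = sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / n
        # Analytic gradient of the MSE with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
        losses.append(loss)
    return w, losses

batch_x = [1.0, 2.0, 3.0, 4.0]
batch_y = [2.0, 4.0, 6.0, 8.0]  # ground truth: y = 2x
w, losses = overfit_single_batch(batch_x, batch_y)
```

If the loss plateaus well above zero instead, something in the model, loss, or optimizer wiring is broken — which is exactly what the check is designed to expose.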
In PyTorch Lightning you can do this without changing your code by passing `overfit_batches` to the `Trainer`: an int selects that many training batches, and a float selects that fraction of the training data. Lightning then turns off shuffling for the train dataloader and reuses the same training batches for validation.
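As a configuration sketch (assuming the standard `pytorch_lightning.Trainer` API and a hypothetical `LitModel`/`train_loader` defined elsewhere), overfitting a single batch looks like:

```python
import pytorch_lightning as pl

# Hypothetical LightningModule and dataloader, defined elsewhere.
model = LitModel()

# overfit_batches=1 -> train (and validate) on the same single batch;
# a float such as 0.01 would instead use 1% of the training data.
trainer = pl.Trainer(overfit_batches=1, max_epochs=100)
trainer.fit(model, train_loader)
```

Watch the logged training loss: on a single batch it should drop toward zero within a few hundred steps if the model and loss are wired correctly.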