How to set number of epochs in PyTorch Lightning?

Chris Staff asked 6 months ago
1 Answer
Best Answer
Chris Staff answered 6 months ago

You can use max_epochs for this purpose in your Trainer object. It limits training to at most this number of epochs:

  import pytorch_lightning as pl

  trainer = pl.Trainer(auto_scale_batch_size='power', gpus=1, deterministic=True, max_epochs=5)

If you instead want to enforce a minimum number of epochs (e.g. when applying early stopping or something similar), you can configure this with min_epochs:

  trainer = pl.Trainer(auto_scale_batch_size='power', gpus=1, deterministic=True, min_epochs=5)
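The way these two bounds interact with an early-stopping signal can be sketched with a plain-Python loop. This is an illustration of the semantics only, not Lightning's actual implementation, and run_training is a hypothetical helper:

```python
def run_training(max_epochs, min_epochs=0, should_stop_after=None):
    """Sketch of Lightning-style epoch bounds (hypothetical helper).

    max_epochs caps the total number of epochs; min_epochs prevents an
    early-stopping signal from ending training before that floor is reached.
    """
    epochs_run = 0
    for epoch in range(max_epochs):
        epochs_run += 1
        # Simulated early-stopping signal (e.g. from a monitored metric).
        early_stop = should_stop_after is not None and epochs_run >= should_stop_after
        # The early-stop request is honored only once min_epochs have completed.
        if early_stop and epochs_run >= min_epochs:
            break
    return epochs_run

print(run_training(max_epochs=10))                                     # 10: runs to the cap
print(run_training(max_epochs=10, min_epochs=5, should_stop_after=2))  # 5: floor overrides early stop
```

So max_epochs always wins as an upper bound, while min_epochs only delays early stopping; it does not extend training past max_epochs.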
