Custom training loops offer great flexibility. You can quickly add new functionality and gain deep insight into how your algorithm works under the hood. However, setting up custom loops over and over is tedious: the general layout is almost always the same, and only small parts change from project to project.
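As a minimal sketch of that recurring layout (written in PyTorch here as an assumption, since the text does not name a framework), the structure of iterating over batches, computing the loss, backpropagating, and stepping the optimizer stays fixed, while the model, loss, and logging are what typically change:

```python
import torch
from torch import nn

def train(model, loader, epochs=1, lr=1e-3, device="cpu"):
    """Basic supervised training loop; model and loader are placeholders."""
    model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for epoch in range(epochs):
        for inputs, targets in loader:
            inputs, targets = inputs.to(device), targets.to(device)
            optimizer.zero_grad()          # reset gradients from the previous step
            loss = loss_fn(model(inputs), targets)
            loss.backward()                # backpropagate
            optimizer.step()               # update parameters
```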
When working with image data, practitioners often use augmentations. Augmentations are techniques that artificially and randomly alter the data to increase diversity. Applying such transformations to the training data makes the model more robust.
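For illustration, a common way to apply such augmentations is to compose random transformations into the data loading path, so the model rarely sees the exact same input twice. The sketch below assumes a torchvision-based pipeline, which the text does not specify:

```python
from torchvision import transforms

# Each training image is randomly flipped, cropped, and color-jittered,
# artificially increasing the diversity of the training data.
train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
])
```

Augmentations like these are applied only to the training split; validation and test data are usually left unaltered so that evaluation stays comparable across runs.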
Once you have finally created that training script, it is time to scale things up. Moving from a local development environment, be it an IDE or Colab, to a large compute cluster is quite a stretch. The following best practices make this transition easier.