Without a doubt, the ImageNet dataset has been a critical factor in developing advanced Machine Learning algorithms. Its sheer size and large number of classes make it a challenge to handle.
Neural Networks. They seem to be everywhere. That is no problem if you are a seasoned practitioner: you understand what they are all about and can usually come up with an explanation quickly.
The specs are indeed intimidating: up to 32 GPU cores and up to 16 CPU cores. Pair that with 64 GB of RAM, and you're well equipped for any workload. And then there's the design. It seems Apple did it again.
Most of the time, we write and debug our code locally. Once the tests pass, we deploy the scripts to a remote environment. If we're fortunate, that environment gives us access to multiple GPUs.
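One practical detail in that workflow is telling the remote job which GPUs it may use. A common approach is to set `CUDA_VISIBLE_DEVICES` before launching the script. Below is a minimal, standard-library-only sketch; the script name `train.py` and the GPU indices are hypothetical placeholders, not part of the original text.

```python
import os
import shlex

def build_remote_command(script="train.py", gpu_ids=(0, 1)):
    """Prepare the environment and command for launching a training
    script so that it only sees the chosen GPUs.

    Both `script` and `gpu_ids` are illustrative; adapt them to
    your own project and hardware.
    """
    env = dict(os.environ)
    # CUDA_VISIBLE_DEVICES restricts which GPUs the process can see;
    # frameworks like PyTorch or TensorFlow pick this up automatically.
    env["CUDA_VISIBLE_DEVICES"] = ",".join(str(i) for i in gpu_ids)
    cmd = ["python", script]
    return env, cmd

env, cmd = build_remote_command(gpu_ids=(0, 1, 2, 3))
print(env["CUDA_VISIBLE_DEVICES"])  # 0,1,2,3
print(shlex.join(cmd))              # python train.py
```

The returned `env` and `cmd` can be passed to `subprocess.run(cmd, env=env)` on the remote machine, keeping GPU selection out of the training code itself.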