AWS re:Invent 2021 — A first look at the SageMaker Training Compiler
In this new video, I demo the newly launched SageMaker Training Compiler, a SageMaker capability that can accelerate the training of deep learning (DL) models by up to 50% through more efficient use of GPU instances.
Starting from a couple of sample notebooks based on Hugging Face models (BERT and GPT-2), I run both vanilla and compiler-enabled training jobs, and I compare their performance.