Transformer models deliver state-of-the-art performance on a wide range of machine learning tasks, including natural language processing, computer vision, and speech. However, training them at scale often requires massive amounts of computing power, making the whole process unnecessarily long, complex, and costly. Join us for a live webinar to learn how the joint solution from Hugging Face and Habana Labs makes it easier and faster to train high-quality transformer models. Live demo included!
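To give a flavor of what the joint solution looks like in code, here is a minimal sketch assuming it refers to the open-source Optimum Habana library (https://github.com/huggingface/optimum-habana), where `GaudiTrainer` and `GaudiTrainingArguments` act as drop-in replacements for the `Trainer` and `TrainingArguments` classes in Transformers. The model, dataset, and hyperparameters below are illustrative placeholders, and a Gaudi device is assumed to be available:

```python
# Minimal sketch: fine-tuning a Transformer on Habana Gaudi with Optimum Habana.
# Model, dataset, and hyperparameters are placeholders for illustration.
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a small slice of a text-classification dataset.
dataset = load_dataset("imdb", split="train[:1%]")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)

training_args = GaudiTrainingArguments(
    output_dir="./results",
    use_habana=True,       # run on Habana Gaudi HPUs instead of GPU/CPU
    use_lazy_mode=True,    # lazy-mode graph execution for better throughput
    gaudi_config_name="Habana/bert-base-uncased",  # Gaudi config from the Hugging Face Hub
    num_train_epochs=1,
    per_device_train_batch_size=8,
)

trainer = GaudiTrainer(
    model=model,
    args=training_args,
    train_dataset=dataset,
)
trainer.train()
```

The appeal of this design is that an existing `Trainer`-based training script needs only a handful of changed lines to run on Gaudi hardware; the webinar walks through this workflow in more depth.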