In this video, I show you how to accelerate Transformer training with Optimum Graphcore, an open-source library by Hugging Face that leverages the Graphcore IPU (Intelligence Processing Unit).
First, I walk you through setting up a Graphcore-enabled notebook on Paperspace. Then, I run a natural language processing job, adapting existing Transformer training code for Optimum Graphcore to accelerate a BERT model that classifies the star rating of Amazon product reviews. Finally, we take a quick look at additional sample notebooks available on Paperspace.