
Speaker "Vamsi Sistla" Details Back

 

Topic

GPU Training Made Easy - Pre-train and Fine-tune NLP Models

Abstract

A hands-on demo of training and fine-tuning the latest NLP models on cloud GPU infrastructure. The demo covers popular toolkits such as Fairseq and Hugging Face Transformers, as well as leading state-of-the-art NLP models such as BERT and RoBERTa. It also covers best practices for using GPU infrastructure in your training and fine-tuning cycles.
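
To give a concrete flavor of the workflow, the sketch below fine-tunes RoBERTa for binary sentiment classification with Hugging Face Transformers on a single GPU. The dataset, model size, and hyperparameters are illustrative assumptions, not material taken from the talk itself.

    # Minimal sketch (assumed setup): fine-tune RoBERTa with Hugging Face
    # Transformers on one GPU; dataset and hyperparameters are illustrative.
    import torch
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    model_name = "roberta-base"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

    # A small slice of a public dataset keeps the example quick on a single GPU.
    dataset = load_dataset("imdb", split="train[:2000]")
    dataset = dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True,
                                padding="max_length", max_length=256),
        batched=True,
    )

    args = TrainingArguments(
        output_dir="roberta-imdb",        # checkpoints land here
        per_device_train_batch_size=16,   # tune to your GPU memory
        num_train_epochs=1,
        fp16=torch.cuda.is_available(),   # mixed precision when a GPU is present
        logging_steps=50,
    )

    # Trainer moves the model to the available GPU automatically.
    Trainer(model=model, args=args, train_dataset=dataset).train()

The same pattern applies to other models (e.g. BERT) by swapping the checkpoint name; larger models mainly require adjusting the batch size and enabling gradient accumulation.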
 
Who is this presentation for?
Data scientists who want to train the latest ML models on GPU infrastructure, through a hands-on demo.
 
Prerequisite knowledge:
Knowledge of NLP, the latest NLP models, and GPUs. Access to GPU infrastructure is helpful for following along with the hands-on demo. The demo will be conducted on the trainML.ai serverless deep learning training infrastructure, but any GPU-enabled infrastructure will work; a quick check is sketched below.
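
As a quick sanity check before the session, the following sketch (assuming a PyTorch environment, which trainML and most cloud GPU providers offer) confirms that a CUDA-capable GPU is visible:

    # Minimal sketch: verify that a GPU is visible to PyTorch.
    import torch

    if torch.cuda.is_available():
        print("Using GPU:", torch.cuda.get_device_name(0))
    else:
        print("No GPU detected; training would fall back to CPU.")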
 
What you'll learn?
How to make GPU training easy, using NLP as a case study.

Profile

Head of Data Sciences.