Speaker: Rustem Feyzkhanov



Serverless architecture for AI applications


Deep learning and machine learning are becoming more and more essential for many businesses, for both internal and external use. One of the main deployment challenges is finding the right way to operationalize a model within the company. A serverless approach to deep learning provides a cheap, simple, scalable, and reliable architecture for it. Serverless architecture changes the rules of the game: instead of thinking about cluster management, scalability, and query processing, you can focus entirely on training the model. The downside of this approach is that you have to keep certain limitations in mind and integrate your model in the right fashion.

Who is this presentation for?
My talk will be beneficial for data scientists and machine learning engineers.

Prerequisite knowledge:
AWS, TensorFlow - beginner

What will you learn?
I will show how to deploy a TensorFlow model for image captioning on AWS infrastructure. AWS's function-as-a-service offering, Lambda, can achieve very significant results: 20-30k runs per dollar (a completely pay-as-you-go model), up to 10k functions running in parallel, and easy integration with other AWS services. This allows you to easily connect it to an API, a chatbot, a database, or a stream of events. I will also show how to construct serverless workflows for deep learning that enable A/B testing of models, canary deployments, and error handling.
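The deployment pattern described above can be sketched as a Lambda handler. This is a minimal illustration, not the speaker's actual code: the model loading is stubbed out with a hypothetical placeholder (in a real function you would download the frozen TensorFlow graph from S3 into /tmp and load it here), but the key structural point is real, namely that the model is loaded outside the handler so warm invocations reuse it.

```python
import json

def load_captioning_model():
    """Hypothetical stand-in for loading a TensorFlow captioning model.

    In a real Lambda you would fetch the model artifact from S3 and load it
    with TensorFlow at this point, OUTSIDE the handler, so the cold-start
    cost is paid once per container rather than once per request.
    """
    def caption(image_bytes: bytes) -> str:
        return f"a caption for a {len(image_bytes)}-byte image"
    return caption

MODEL = load_captioning_model()  # loaded once per container, reused across invocations

def handler(event, context):
    """Lambda entry point for an API Gateway-style event with the image in 'body'."""
    image_bytes = event.get("body", "").encode()
    return {
        "statusCode": 200,
        "body": json.dumps({"caption": MODEL(image_bytes)}),
    }
```

Keeping the model in a module-level variable is what makes the pay-per-request pricing viable: only the first request on a new container pays the load time.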
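The "20-30k runs per dollar" figure can be sanity-checked with Lambda's GB-second billing. The per-GB-second price, memory size, and inference time below are assumptions for illustration (check current AWS pricing), not figures from the talk:

```python
# Assumed Lambda compute price in USD per GB-second (verify against current AWS pricing).
PRICE_PER_GB_SECOND = 0.0000166667

memory_gb = 1.5   # assumed memory allocation for the TensorFlow model
duration_s = 2.0  # assumed inference time per invocation

cost_per_run = PRICE_PER_GB_SECOND * memory_gb * duration_s
runs_per_dollar = 1 / cost_per_run  # roughly 20,000 under these assumptions
```

Faster inference or a smaller memory allocation pushes the number toward the upper end of the quoted range.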


Rustem Feyzkhanov is a senior machine learning engineer at Instrumental, where he works on analytical models for the manufacturing industry, and an AWS Machine Learning Hero. Rustem is passionate about serverless infrastructure (and AI deployments on it) and is the author of the course and book "Serverless Deep Learning with TensorFlow and AWS Lambda" and "Practical Deep Learning on the Cloud". He is also the main contributor to an open-source repository for serverless packages.