
Speaker "Niraj Tank" Details


Topic

Model as a Service for Real-Time Decisioning

Abstract

Hosting models and productionizing them is a pain point. Let's fix that. Imagine a stream processing platform that leverages Machine Learning (ML) models and must make decisions in real time. Most solutions couple ML models tightly to the use case, which is rarely the most efficient way for a data scientist to update or roll back a model. With Model as a Service, disrupting the flow and relying on engineering teams to deploy, test and promote models is a thing of the past. We focus on building a decoupled, service-based architecture that upholds engineering best practices, delivering gains in model management and deployment. It also empowers data scientists by supporting patterns such as A/B testing, multi-armed bandits and ensemble modeling.

This talk will lay out, step by step, the critical aspects of building a well-managed ML pipeline that requires validation, versioning, auditing, and model risk governance. We discuss the benefits of breaking a monolithic ML use case into a service-based approach consisting of features, models, and rules. Join us for an insight into the technology behind the scenes: it accepts a raw serialized model built with popular libraries like H2O, scikit-learn or TensorFlow, or even plain Python source models, and serves them via REST/gRPC, making it easy to integrate the models into business applications and services that need predictions.
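To make the decoupling concrete, here is a minimal sketch of the core idea: a registry that accepts serialized model bytes, tracks which version of each named model is live, and lets a data scientist promote or roll back a model without redeploying the calling application. The `ModelRegistry` and `ThresholdModel` names are illustrative assumptions, not the speaker's actual implementation; a real service would receive the bytes over REST/gRPC rather than an in-process call.

```python
import pickle


class ThresholdModel:
    """Toy stand-in for a serialized model (e.g. from scikit-learn or H2O)."""

    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, features):
        return ["approve" if x >= self.threshold else "decline" for x in features]


class ModelRegistry:
    """Hypothetical registry: deserializes uploaded model bytes and tracks
    which version of each named model is live, so updates and rollbacks
    never touch the consuming application's deployment."""

    def __init__(self):
        self._models = {}   # (name, version) -> deserialized model
        self._active = {}   # name -> currently served version

    def register(self, name, version, serialized_model):
        # In a real service the bytes would arrive via a REST/gRPC upload.
        self._models[(name, version)] = pickle.loads(serialized_model)
        self._active[name] = version  # newest registration goes live

    def rollback(self, name, version):
        if (name, version) not in self._models:
            raise KeyError(f"{name} {version} was never registered")
        self._active[name] = version

    def predict(self, name, features):
        return self._models[(name, self._active[name])].predict(features)


registry = ModelRegistry()
registry.register("credit-decision", "v1", pickle.dumps(ThresholdModel(0.5)))
registry.register("credit-decision", "v2", pickle.dumps(ThresholdModel(0.7)))
registry.rollback("credit-decision", "v1")  # one call, no redeploy
```

In production the pickle deserialization would need to be sandboxed or replaced with a safer format, since unpickling untrusted bytes can execute arbitrary code.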
Who is this presentation for?
- Architects, technology leaders, data scientists and product managers
- Anyone interested in modern practices for operationalizing machine-learning models with open source container technology and open source application infrastructure
- Data scientists interested in transitioning from rule-based to ML model-based solutions
- Product owners concerned about time-to-market and achieving continuous delivery of ML-based solutions to keep up with the changing needs of their institutional domain

Prerequisite knowledge:
Basic knowledge of ML, micro-services and containers.

What you'll learn
This talk will provide an overview of building an ML decisioning pipeline from microservices built on open source container technologies. We will also touch on the DevOps practices involved: agility, automated QA, rolling upgrades, one-click promotions and fully automated deployments on a real-time decisioning platform.

Profile

Niraj is a Sr. Mgr, Software Engineer at Capital One, currently working on a team that has built a fast data streaming and decisioning platform for Capital One Bank. Niraj has been an engineer for the past 21 years; his diverse experience ranges from developing products for startups to leading various large-scale integration services.