Serverless Machine Learning Model Inference on Kubernetes with KServe by Stavros Kontopoulos

As a Machine Learning (ML) practitioner, have you ever wished there were a Kubernetes-native way to serve and scale an ML model? Fortunately, there is: KServe! In this talk we will go through:

How to integrate with the most popular ML frameworks to allow easy model inference and prototyping.
How to work with Knative Serving, a sophisticated extension of standard Kubernetes Services, for cost-effective autoscaling of your ML models.
How to take your deployments to the next level by using the inference graph to build complex ML pipelines.
How to monitor your models and deploy them to production effectively with different rollout strategies.

After this talk you will be able to put these technologies into practice to deploy your own models, as different scenarios will be described in detail.
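To give a flavour of what the abstract describes, here is a minimal sketch (not taken from the talk) using the KServe Python SDK: it creates an InferenceService backed by a scikit-learn model, lets Knative Serving scale it to zero when idle, and then shifts a share of traffic to a new revision as a canary rollout. The namespace "models" and the second storage URI are placeholders; the first storage URI is the public example model from the KServe documentation.

```python
# Minimal sketch: deploy a scikit-learn model as a KServe InferenceService,
# then canary a new model revision. Namespace and the new-model URI are placeholders.
from kubernetes import client as k8s_client
from kserve import (
    KServeClient,
    V1beta1InferenceService,
    V1beta1InferenceServiceSpec,
    V1beta1PredictorSpec,
    V1beta1SKLearnSpec,
    constants,
)

namespace = "models"      # assumed namespace
name = "sklearn-iris"

isvc = V1beta1InferenceService(
    api_version=constants.KSERVE_GROUP + "/v1beta1",
    kind=constants.KSERVE_KIND,
    metadata=k8s_client.V1ObjectMeta(name=name, namespace=namespace),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            min_replicas=0,  # Knative Serving scales the predictor to zero when idle
            sklearn=V1beta1SKLearnSpec(
                storage_uri="gs://kfserving-examples/models/sklearn/1.0/model"
            ),
        )
    ),
)

kserve = KServeClient()
kserve.create(isvc)

# Canary rollout: send 20% of traffic to a new model revision while the
# previous revision keeps serving the remaining 80%.
isvc.spec.predictor.canary_traffic_percent = 20
isvc.spec.predictor.sklearn.storage_uri = "gs://your-bucket/path/to/new-model"  # placeholder
kserve.replace(name, isvc, namespace=namespace)
```

Once the canary revision looks healthy in the monitoring dashboards, raising canary_traffic_percent to 100 (or dropping the field) promotes it to serve all traffic; the same InferenceService resource can also be extended with an inference graph to chain models into a pipeline.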
