Quick Example Deploying ML REST API with FastAPI, Docker, and ECS
I've been scared of Docker in the past. Even during the height of the k8s craze, I rarely ran individual containers; instead I would spin up an unnecessary cluster just to run my image. In the context of CI/CD for ML, I'm developing a new appreciation for it.

In this example, I have a simple API for my model built with FastAPI (I used to use Flask, but I'm trying to get with the times, and FastAPI seems sweet). I then wrote a simple Dockerfile for the app, pushed the image to Docker Hub, and deployed the API with ECS. "Hey ECS, run this Docker image" works pretty well. GCP, Azure, and all the other clouds have comparable ways to deploy containers.

This isn't a case for VMs, k8s, or anything else, but in the context of "how do I repeatedly and reliably build a web service for my ML project," using one of these services as part of a CD pipeline is super handy. My previous fears were unfounded.

Connect with me on LinkedIn: /gustafrcavanaugh
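A "simple Dockerfile" for an app like this might look as follows. This is a generic sketch, not the one from the video: the file names (`main.py`, `requirements.txt`) and port are assumptions.

```dockerfile
FROM python:3.11-slim
WORKDIR /app

# Install dependencies first so this layer is cached across code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
EXPOSE 80
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]
```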