How to Run Any GGUF AI Model with Ollama By Converting It

In this video, I'll show you how to run any GGUF AI model from Hugging Face with Ollama by converting it to the Ollama format. We'll go step by step through the conversion process, including the tools you'll need and how to set everything up. By the end of this guide, you'll be able to run your preferred GGUF models on Ollama's platform. Whether you're new to AI models or looking for a new way to work with GGUF, this tutorial has you covered.

The Lightning.AI studio environment used in this video is linked here: https://lightning.ai/openintegrator/s...

The steps for the conversion are included below.

First, download the model file with curl:

curl -L -o poro-34b-chat.Q5_K_M.gguf https://huggingface.co/LumiOpen/Poro-...

Then follow the instructions at https://github.com/ollama/ollama

Import from GGUF

Ollama supports importing GGUF models in a Modelfile:

# Create a file named Modelfile with a FROM instruction giving the local filepath of the model you want to import.
FROM ./poro-34b-chat.Q5_K_M.gguf

# Create the model in Ollama
ollama create poro-34b-chat.Q5_K_M -f Modelfile

# Run the model
ollama run poro-34b-chat.Q5_K_M

#AI #Ollama #GGUF #AIModel #Tutorial #TechGuide
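The Modelfile can carry more than the FROM line: Ollama also supports instructions such as PARAMETER and SYSTEM to set sampling options and a system prompt at import time. A sketch of a slightly fuller Modelfile (the temperature value and system prompt here are illustrative, not from the video):

```
# Import the local GGUF weights.
FROM ./poro-34b-chat.Q5_K_M.gguf

# Illustrative settings, not part of the original example.
PARAMETER temperature 0.7
SYSTEM You are a helpful assistant.
```

Re-running `ollama create poro-34b-chat.Q5_K_M -f Modelfile` after editing the file rebuilds the model with the new settings.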
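Before running `ollama create`, it can help to sanity-check the download: every valid GGUF file begins with the 4-byte magic "GGUF", so a truncated download or an HTML error page saved by curl is easy to spot. A minimal sketch (the filename matches the curl example above):

```shell
# Check the first 4 bytes of the downloaded file for the GGUF magic.
f=poro-34b-chat.Q5_K_M.gguf
if [ "$(head -c 4 "$f" 2>/dev/null)" = "GGUF" ]; then
  echo "looks like a GGUF file"
else
  echo "not a GGUF file (download may have failed)"
fi
```

If the check fails, re-run the curl command and make sure the `-L` flag is present so redirects from Hugging Face are followed.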
