Using Local Large Language Models in Semantic Kernel

Will Velida
Did you know that you can download large language models to your local machine and build Semantic Kernel agents with them, instead of having to use Azure OpenAI or the OpenAI API? In this video, I show you how to download LLMs and SLMs on your local machine via Ollama and LM Studio, and how to use those models in your Semantic Kernel applications!
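As a taste of what the video walks through, here's a minimal stdlib-only sketch of talking to a locally running Ollama model over its REST API (Ollama listens on http://localhost:11434 by default). The helper name and the `llama3` model tag are illustrative, not taken from the video:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_chat_request(model: str, prompt: str) -> bytes:
    """Build the JSON body that Ollama's /api/chat endpoint expects."""
    payload = {
        "model": model,  # e.g. "llama3" -- whatever you pulled with `ollama pull`
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single complete response, not a token stream
    }
    return json.dumps(payload).encode("utf-8")


body = build_chat_request("llama3", "Say hello in one sentence.")
print(json.loads(body)["model"])  # -> llama3

# Actually sending the request needs a running Ollama server, so it is left
# behind a flag here:
SEND = False  # flip to True once `ollama serve` is running locally
if SEND:
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
        print(reply["message"]["content"])
```

Semantic Kernel's connectors wrap this same HTTP exchange for you; the sketch just shows what travels over localhost underneath.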

0:00 Introduction
0:38 Option 1 - Ollama
2:16 Interacting with Ollama models via the terminal
3:48 Using Ollama models in Semantic Kernel applications
7:40 Interacting with Ollama models via Semantic Kernel
8:33 Option 2 - LM Studio
9:14 Interacting with LM Studio models in the app
9:50 Running our LM Studio models via localhost
11:39 Using LM Studio models in Semantic Kernel
13:15 Wrap up
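For the LM Studio chapters above, the flow is similar: LM Studio serves whatever model you've loaded over an OpenAI-compatible API (by default at http://localhost:1234/v1), so an OpenAI-style client or connector can simply be pointed at that base URL. A rough stdlib-only sketch of the request shape, with an illustrative model name:

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local server address


def build_completion_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at LM Studio."""
    body = json.dumps({
        "model": model,  # illustrative; use whichever model LM Studio has loaded
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )


req = build_completion_request("local-model", "Hello!")
print(req.full_url)  # -> http://localhost:1234/v1/chat/completions
# With LM Studio's local server running, urllib.request.urlopen(req) would
# return a JSON response with the reply under choices[0].message.content.
```

Because the endpoint speaks the OpenAI wire format, the same Semantic Kernel code that targets the OpenAI API can be redirected to localhost with only a base-URL change.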

Useful links
LM Studio: https://lmstudio.ai/
Ollama: https://ollama.com/

Connect with me!
Twitter: willvelida
GitHub: https://github.com/willvelida
Bluesky: https://bsky.app/profile/willvelida.b...
Published a month ago, on 1403/04/18 (Solar Hijri calendar).