LLM Chat App in Python w/ Ollama-py and Streamlit

Decoder

In this video I walk through the new Ollama Python library and use it to build a chat app with a UI powered by Streamlit. After reviewing some important methods from the library, I touch on Python generators as we construct our chat app, step by step.
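A rough sketch of the three ollama-py calls covered in the video (not the video's exact code; the response shapes noted in the comments can differ slightly between library versions, and it assumes a model called 'llama2' has already been pulled with `ollama pull llama2`):

import ollama

# list(): every model pulled locally, e.g. {'models': [{'name': 'llama2:latest', ...}, ...]}
print(ollama.list())

# show(): details for one model (modelfile, parameters, template, ...)
print(ollama.show('llama2'))

# chat(): send a message history, get the assistant's reply back
response = ollama.chat(
    model='llama2',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)
print(response['message']['content'])

A sketch of the full Streamlit app follows the timestamps below.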

Check out my other Ollama videos -  Get Started with Ollama

Links:
Code from video - https://decoder.sh/videos/llm-chat-ap...
Ollama-py - https://github.com/ollama/ollama-python
Streamlit - https://streamlit.io/
My website - https://decoder.sh

Timestamps:
00:00 - Intro
00:26 - Why not use the CLI?
01:17 - Looking at the ollama-py library
02:26 - Setting up Python environment
04:05 - Reviewing Ollama functions
04:14 - list()
04:52 - show()
05:44 - chat()
06:55 - Looking at Streamlit
07:59 - Start writing our app
08:51 - App: user input
11:16 - App: message history
13:09 - App: adding ollama response
15:00 - App: choosing a model
17:07 - Introducing generators
18:52 - App: streaming responses
21:22 - App: review
22:10 - Where to find the code
22:27 - Thank you for 2k
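
For reference, here is a rough sketch of the kind of app built in the video, mirroring the timestamped steps above (user input, message history, model selection, generator-based streaming). It is not the exact code from the video (grab that from the code link above); it assumes Ollama is running locally with at least one model pulled, and st.write_stream needs a recent Streamlit release:

import ollama
import streamlit as st

st.title("Ollama Chat")

# App: choosing a model - offer every locally available model
models = [m["model"] for m in ollama.list()["models"]]  # key is "name" in older ollama-py versions
model = st.selectbox("Model", models)

# App: message history - persist messages across Streamlit reruns
if "messages" not in st.session_state:
    st.session_state.messages = []
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# Generators / streaming: yield each chunk's text as it arrives from Ollama
def stream_chat(model, messages):
    for chunk in ollama.chat(model=model, messages=messages, stream=True):
        yield chunk["message"]["content"]

# App: user input and the streamed assistant response
if prompt := st.chat_input("Ask something..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)
    with st.chat_message("assistant"):
        reply = st.write_stream(stream_chat(model, st.session_state.messages))
    st.session_state.messages.append({"role": "assistant", "content": reply})

Save this as chat_app.py and run it with: streamlit run chat_app.py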