Run LLMs locally using OLLAMA | Private Local LLM | OLLAMA Tutorial | Karndeep Singh

Karndeep Singh
1.2K views · 6 months ago
The video explains how to run LLMs locally using OLLAMA, fast and easy. The following topics are covered in the video:
1. OLLAMA installation on Mac.
2. Download and use LLM models in OLLAMA.
3. Customize the OLLAMA Modelfile to set model parameters and system prompts.
4. Understand the different CLI commands in OLLAMA.
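The workflow covered above can be sketched roughly as follows. This is a minimal illustration, not the video's exact commands: the model name llama2, the model name my-assistant, and the parameter values are placeholders chosen for the example.

```
# Pull and run a model (llama2 is an example; any supported model works)
ollama pull llama2
ollama run llama2

# A Modelfile sets model parameters and a system prompt
cat > Modelfile <<'EOF'
FROM llama2
PARAMETER temperature 0.7
SYSTEM "You are a concise, helpful assistant."
EOF

# Build a custom model from the Modelfile and run it
ollama create my-assistant -f Modelfile
ollama run my-assistant

# Other useful CLI commands
ollama list              # list downloaded models
ollama rm my-assistant   # remove a model
```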

OLLAMA Github: https://github.com/jmorganca/ollama?t...
OLLAMA Website: https://ollama.ai/

Connect with me on :
1. LinkedIn: karndeepsingh

2. Telegram Group: https://telegram.me/datascienceclubac...

3. Github: https://www.github.com/karndeepsingh

Feel Good by MusicbyAden | SoundCloud: musicbyaden
Music promoted by https://www.chosic.com/free-music/all/
Creative Commons CC BY-SA 3.0
https://creativecommons.org/licenses/...

#ollama #mac  #llms
Published 6 months ago, on 1402/10/18.
1,255 views