Local AI (with a Docs Query Engine) running on just a laptop!!!
4.2K views · 8 months ago
A "Fully LOCAL RAG (Retrieval-Augmented Generation) / Docs Query Engine with Llama-index and Ollama" is a system that runs document retrieval and text generation entirely on your own machine, combining the indexing capabilities of Llama-index with local model inference from Ollama.
Purpose: Create a local RAG system that combines the strengths of Llama-index for document indexing and retrieval, and Ollama for natural language understanding and generation.
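The RAG loop described above can be sketched in plain Python. This is an illustrative stand-in, not the notebook's actual code: a toy keyword-overlap retriever plays the role of Llama-index's vector index, and the assembled prompt is what would be sent to a local model served by Ollama.

```python
# Toy RAG sketch: retrieve the most relevant document chunks for a
# question, then assemble them into a prompt for a local LLM.
# Scoring here is naive keyword overlap; Llama-index would use
# vector embeddings instead (illustrative stand-in only).

def score(chunk: str, question: str) -> int:
    """Count question words that also appear in the chunk."""
    chunk_words = set(chunk.lower().split())
    return sum(1 for w in question.lower().split() if w in chunk_words)

def retrieve(chunks: list[str], question: str, top_k: int = 2) -> list[str]:
    """Return the top_k highest-scoring chunks (the 'R' in RAG)."""
    ranked = sorted(chunks, key=lambda c: score(c, question), reverse=True)
    return ranked[:top_k]

def build_prompt(chunks: list[str], question: str) -> str:
    """Stuff retrieved context into a prompt; a local model served
    by Ollama would then generate the grounded answer."""
    context = "\n---\n".join(chunks)
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

docs = [
    "Ollama runs large language models locally on CPU.",
    "Llama-index builds searchable indexes over your documents.",
    "RAG grounds model answers in retrieved documents.",
]
question = "How does Ollama run models locally?"
prompt = build_prompt(retrieve(docs, question), question)
```

In the real notebook, the retrieval step is handled by a Llama-index query engine and generation by an Ollama-hosted model; the data flow is the same.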
🔗 Links 🔗
Local Notebook https://github.com/amrrs/local_doc_qu... (Run it on Jupyter Notebook or Visual Studio Code Locally)
Ollama Query Engine Pack for Llama-index - https://llamahub.ai/l/llama_packs-oll...
Download https://ollama.ai/ (free, open-source tool to run local models on CPU)
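Once Ollama is installed and a model has been pulled, it serves a REST API on localhost port 11434. A minimal standard-library sketch of querying it from Python; the model name and the running local server are assumptions:

```python
import json
import urllib.request

def build_payload(model: str, prompt: str) -> bytes:
    """JSON body for Ollama's /api/generate endpoint
    (stream=False requests one complete response)."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

def ask_ollama(prompt: str, model: str = "llama2") -> str:
    """Send a prompt to a locally running Ollama server.
    Model name 'llama2' is an assumption; use any model you pulled."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default port
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a local Ollama server with the model already pulled.
    print(ask_ollama("Why run an LLM locally?"))
```

Llama-index's Ollama integration wraps this same local endpoint, so the query engine pack linked above can generate answers without any cloud API.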
❤️ If you want to support the channel ❤️
Support here:
Patreon - 1littlecoder
Ko-Fi - https://ko-fi.com/1littlecoder
🧭 Follow me on 🧭
Twitter - 1littlecoder
LinkedIn - amrrs
Published on 1402/09/16 (Iranian calendar) · 4,261 views