An Offline Document Enquiry LLM for Everyone
An attempt to summarize text from `stdin` to `stdout` using a large language model, run locally and offline
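That repository's internals aren't shown here, but a minimal sketch of the same stdin-to-stdout idea, assuming a locally running Ollama server with a `llama2` model already pulled (both assumptions, not details from the project), could look like:

```python
# Minimal sketch: summarize stdin to stdout with a local Ollama server.
# Assumes `ollama serve` is running on its default port and that the
# "llama2" model has already been pulled.
import sys

import requests

text = sys.stdin.read()
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",
        "prompt": f"Summarize the following text:\n\n{text}",
        "stream": False,
    },
    timeout=300,
)
response.raise_for_status()
sys.stdout.write(response.json()["response"] + "\n")
```

Usage: `cat report.txt | python summarize.py` — everything stays on the local machine.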
User interface for a ChatGPT-like chatbot powered by a local Ollama AI server, with plugin support planned
A GPT-powered rubber-duck debugger, built as a CS50 2023 final project.
A local-LLM-assisted PowerPoint (PPT) generation tool
A simple "Be My Eyes" web app with a llama.cpp/llava backend
PalmHill.BlazorChat is a chat application and API built with Blazor WebAssembly, SignalR, and WebAPI, featuring real-time LLM conversations, markdown support, customizable settings, and a responsive design. This project supports Llama2 models and was tested with Orca2.
50-line local LLM assistant in Python with Streamlit and GPT4All
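As a rough illustration of how little code such an assistant needs, here is a hedged sketch along the same lines; the model filename is an example (GPT4All downloads it on first use), not necessarily what the project uses:

```python
# Sketch of a tiny Streamlit + GPT4All chat assistant.
import streamlit as st
from gpt4all import GPT4All

@st.cache_resource  # load the model once per server process
def load_model():
    return GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

model = load_model()
st.title("Local LLM assistant")

if "history" not in st.session_state:
    st.session_state.history = []

# Replay the conversation so far.
for role, text in st.session_state.history:
    st.chat_message(role).write(text)

if prompt := st.chat_input("Ask something"):
    st.chat_message("user").write(prompt)
    reply = model.generate(prompt, max_tokens=512)
    st.chat_message("assistant").write(reply)
    st.session_state.history += [("user", prompt), ("assistant", reply)]
```

Run with `streamlit run assistant.py`; no network access is needed after the first model download.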
Local AI Open Orca For Dummies is a user-friendly guide to running Large Language Models locally. Simplify your AI journey with easy-to-follow instructions and minimal setup. Perfect for developers tired of complex processes!
OpenAI-style, fast & lightweight local language model inference with documents
LLM prompt augmentation with RAG, integrating external custom data from a variety of sources so you can chat with those documents
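The core RAG pattern behind entries like this one can be sketched as follows; the embedding model name and the sample chunks are illustrative assumptions, not taken from the project:

```python
# Sketch of RAG-style prompt augmentation: retrieve the most relevant
# document chunks for a query and prepend them to the LLM prompt.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model

chunks = [
    "Invoices are due within 30 days of receipt.",
    "Refund requests must include the original order number.",
]
# Unit-length embeddings, so a dot product is cosine similarity.
chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)

def augment(query: str, k: int = 1) -> str:
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = chunk_vecs @ q
    top = np.argsort(scores)[::-1][:k]
    context = "\n".join(chunks[i] for i in top)
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

# The augmented prompt is then sent to whatever local LLM you run.
print(augment("When are invoices due?"))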
Data Whisperer is a chatbot that allows you to analyze your local files using Large Language Models (LLMs). You can upload PDFs, Markdown, Text, and Document files to the chatbot and then ask questions related to the content of these files.
Structured inference with Llama 2 in your browser
Serverless, single-HTML-page access to an OpenAI-API-compatible local LLM
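Although that project lives in a single HTML page, the same OpenAI-compatible endpoint can also be exercised from Python; a sketch, where the base URL, API key, and model name are all placeholders for whatever your local server (llama.cpp's server, LM Studio, etc.) exposes:

```python
# Sketch: calling an OpenAI-API-compatible local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # placeholder local endpoint
    api_key="sk-no-key-needed",           # most local servers ignore the key
)
resp = client.chat.completions.create(
    model="local-model",  # many local servers ignore or loosely match this
    messages=[{"role": "user", "content": "Hello from a local LLM!"}],
)
print(resp.choices[0].message.content)
```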
Infinite Craft, reimplemented in PySide6 and Python with a local LLM (Llama 2 and others) via Ollama
Implemented vector similarity algorithms from scratch to understand their inner workings, using local embedding models
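A sketch of what such a hand-rolled similarity implementation might look like; the toy vectors stand in for output from a local embedding model:

```python
# Hand-rolled vector similarity: cosine similarity and Euclidean distance.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def euclidean_distance(a: list[float], b: list[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy embeddings; a real setup would get these from an embedding model.
v1, v2 = [0.1, 0.8, 0.3], [0.2, 0.7, 0.4]
print(cosine_similarity(v1, v2), euclidean_distance(v1, v2))
```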