Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A
Updated Nov 6, 2023 - Python
This repository is an experiment with an agent that searches documents and repeatedly asks follow-up questions in response to the main question. It automatically determines the best answer from the current documents or recognizes when no answer exists.
An LLM-powered Slack bot built with Langchain.
A Document-based Question Answering system by implementing Retrieval-Augmented Generation (RAG) using OpenAI's API.
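The RAG pattern named above has two halves: retrieve the most relevant document chunks, then augment the LLM prompt with them. A minimal sketch follows, assuming a simple bag-of-words cosine retriever; in the actual repository the retrieval would likely use embeddings and the assembled prompt would be sent to OpenAI's API.

```python
# Minimal retrieval-augmented generation sketch. Retrieval here is plain
# bag-of-words cosine similarity (an assumption for illustration); a real
# pipeline would use embeddings and send the prompt to an LLM.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k chunks most similar to the question."""
    q = Counter(question.lower().split())
    ranked = sorted(chunks,
                    key=lambda c: cosine(q, Counter(c.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Assemble the augmented prompt that would be sent to the LLM."""
    context = "\n".join(retrieve(question, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The key design point of RAG is that the model answers from retrieved context rather than from its parametric memory, which grounds answers in the user's own documents.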
Cross-document QA leveraging LLM and taxonomy
This repository acts as an archive of the owner's experience working with large language models, neatly presented in Jupyter notebooks.