This repository has been archived by the owner on Dec 18, 2023. It is now read-only.

Web UI for Alpaca.cpp - Locally run an Instruction-Tuned Chat-Style LLM



ngxson/alpaca.cpp-webui


Alpaca.cpp Web UI (Next.js)

This is a web UI wrapper for alpaca.cpp.


Features

  • Save chat history to disk
  • Implement context memory
  • Conversation history
  • Interface for tweaking parameters
  • Better guide / documentation
  • Ability to stop / regenerate response
  • Detect code response / use monospace font
  • Responsive UI
  • Configuration presets

Screenshot:

How to use

Prerequisites:

  • Node.js v18+ installed on your machine (not required if you use Docker)
  • A Linux machine (Windows should also work, but has not been tested yet)

For Windows users, there is a detailed guide here: doc/windows.md

🔶 Step 1: Clone this repository to your local machine
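Assuming you have git installed, cloning could look like this (repository path taken from the header above):

```shell
# Clone the repository and enter its directory
git clone https://github.com/ngxson/alpaca.cpp-webui.git
cd alpaca.cpp-webui
```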

🔶 Step 2: Download the model and binary file to run the model. You have some options:

🔶 Step 3: Edit bin/config.js so that the executable name and the model file name are correct.
(If you are using chat and ggml-alpaca-7b-q4.bin, you don't need to modify anything.)
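As a rough sketch, the config might look something like the following. Note that the actual field names in bin/config.js may differ; the README only states that the executable name and the model file name live there, so everything below the two values is an assumption.

```javascript
// Hypothetical sketch of bin/config.js — field names are illustrative,
// not taken from the actual file. Adjust to match the real config keys.
module.exports = {
  // name/path of the alpaca.cpp executable (default binary is "chat")
  executable: './chat',
  // path to the downloaded model weights
  modelPath: './ggml-alpaca-7b-q4.bin',
};
```

If you keep the default `chat` binary and `ggml-alpaca-7b-q4.bin` model in place, no edits should be needed.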

🔶 Step 4: Run these commands

npm i
npm start

Alternatively, you can just use docker compose up if you have Docker installed.

Then, open http://localhost:13000/ in your browser.

TODO

  • Test on Windows
  • Proxy WebSocket connections via Next.js
  • Add Dockerfile / docker-compose
  • UI: add avatar
