An HTML UI for Ollama.
- Minimal & responsive UI: mobile & desktop
- Secure: zero dependencies. Built using HTML, CSS, and JavaScript
- Cross-browser: supports the last 2 versions of browsers with more than 0.5% global usage
- Simple installation: runs in your browser, or host it on your own server. Less than 20 KB gzipped
## Screenshots

Desktop, dark mode, and mobile.
## Installation

### Manually

Download Ollama from the website.

### macOS

```bash
brew install ollama
```

### Linux

For instructions on how to install Ollama on Linux, see https://ollama.ai/download/linux.
## Running

To run the Ollama UI, all you need is a web server that serves `dist/index.html` and the bundled JS and CSS files.

First, start Ollama:

```bash
$ ollama run dolphin-phi
```
### Using Caddy

To run the Ollama UI using Caddy, execute the following commands from the command line:

```bash
$ git clone git@github.com:christianhellsten/ollama-html-ui.git
$ cd ollama-html-ui
$ brew install caddy
$ caddy run
```
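`caddy run` looks for a `Caddyfile` in the working directory. If you need to write one yourself, a minimal configuration for serving the static files might look like this (the port and site root are assumptions; adjust them to your environment):

```
:8080

root * .
file_server
```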
### Using Docker

To run the Ollama UI using Docker, execute the following command from the command line:

```bash
$ docker run -p 80:80 aktagon/ollama-html-ui
```

Alternatively, build the image yourself:

```bash
$ git clone git@github.com:christianhellsten/ollama-html-ui.git
$ cd ollama-html-ui
$ docker build -t ollama-html-ui .
$ docker run -p 80:80 ollama-html-ui
```
### Using Parcel

First, clone the repository and install the dependencies:

```bash
$ git clone git@github.com:christianhellsten/ollama-html-ui.git
$ cd ollama-html-ui
$ yarn add --dev parcel
# Alternatively, use npm:
# npm install --save-dev parcel
```

Run the Ollama UI using Parcel:

```bash
$ yarn parcel index.html
```

Open the UI in a browser:

```bash
$ open http://localhost:1234
```
## Testing

Tests are written using Playwright and node:test.

The tests can be run from the command line using these commands:

```bash
$ ollama run dolphin-phi
$ node test
$ parcel build index.html
```
## Tasks

- Personas / Characters / Custom GPTs
- Ollama authentication
- Edit message / response
- Clear chat
- CSP
- Speech recognition
- Image upload / multi-modal
- Markdown support
- Fork chat
- Fork chat before / after message
- Mark message as good, bad, flagged
- Export chat messages to JSON
- Keyboard shortcuts
- Dark & light theme
- Delete message / response
- Ollama chat API / chat memory
- IndexedDB persistence
- Model parameters
- System prompt
- Copy message to clipboard
- Select model in settings (global)
- Select model in chat (local)
- Search chats
- Delete chat
- Select model
- Save settings
- View settings
- Clear chats
- Edit chat
- New chat
- Abort response
- Send message
- UI tests: https://nodejs.org/api/test.html
## Features
- https://ollama.ai support
**Chat**
- New chat
- Edit chat
- Delete chat
- Download chat
- Scroll to top/bottom
- Copy to clipboard
**Chat message**
- Delete chat message
- Copy to clipboard
- Mark as good, bad, or flagged
**Chats**
- Search chats
- Clear chats
- Chat history
- Export chats
**Settings**
- URL
- Model
- System prompt
- Model parameters
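These settings correspond to fields in Ollama's `/api/chat` request. As a sketch of how they might be combined into a request body (the function and field names on the UI side are illustrative, not the app's actual code):

```javascript
// Build an Ollama /api/chat request body from UI settings.
// `settings` is assumed to hold { model, systemPrompt, parameters }.
function buildChatRequest(settings, messages) {
  // Prepend the system prompt, if one is configured, as a system message.
  const allMessages = settings.systemPrompt
    ? [{ role: 'system', content: settings.systemPrompt }, ...messages]
    : messages;
  return {
    model: settings.model,              // e.g. 'dolphin-phi'
    messages: allMessages,              // [{ role, content }, ...]
    options: settings.parameters || {}, // model parameters, e.g. { temperature: 0.7 }
    stream: true,                       // stream tokens as they are generated
  };
}

// The body would then be POSTed to `${settings.url}/api/chat`.
```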
## Troubleshooting

If you experience compilation errors, try deleting the cache directory:

```bash
rm -rf .parcel-cache/
```
## Licensing

This project is available under two licensing options:

- **Open Source License (MIT):**
  - The code in this project is available under the terms of the MIT License.
  - You are free to use, modify, and distribute the code in your non-commercial, open source projects.
  - View the full text of the MIT License in the LICENSE file.
- **Commercial License:**
  - If you intend to use this code in a commercial project, we offer a separate commercial licensing option.
  - Our commercial license provides additional rights and support tailored to your commercial needs.
  - To inquire about our commercial licensing options, pricing, and terms, please contact us at [email protected] to discuss your specific requirements.
We value and support both our open source community and commercial users. By providing dual licensing options, we aim to make this project accessible to a wide range of users while offering customized solutions for commercial projects.