Local Docker Deployment & Instructions #1193

Open · wants to merge 4 commits into base: main
12 changes: 12 additions & 0 deletions .dockerignore
@@ -0,0 +1,12 @@
node_modules
test-results
# .env
# .env*

docker*

scripts/

.github

__tests__
41 changes: 41 additions & 0 deletions .env.local


I guess the .env.local file should not be committed to the repo


or a .env.local.example


the .env.local

As far as I understood, this file should be a .gitignored file containing the API keys and the like for the local environment, without it always popping up in the git changes.
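A committed template plus a gitignored private copy is the usual pattern here. A minimal sketch, assuming the project adopts the `.env.local.example` name suggested above (the file name and placeholder values are assumptions, not part of this PR):

```shell
# Sketch of the .env.local.example pattern (placeholder values are
# assumptions, not the PR's actual keys).
cat > .env.local.example <<'EOF'
NEXT_PUBLIC_SUPABASE_URL=http://127.0.0.1:54321
NEXT_PUBLIC_SUPABASE_ANON_KEY=replace-me
NEXT_PUBLIC_OLLAMA_URL=http://localhost:11434
EOF

# Each developer then makes a private, gitignored copy:
cp .env.local.example .env.local
```

The template stays under version control while the copy holds real secrets and never shows up in `git status`.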

@@ -0,0 +1,41 @@
# Supabase Public
NEXT_PUBLIC_SUPABASE_URL=http://127.0.0.1:54321
NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0

# Supabase Private
SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImV4cCI6MTk4MzgxMjk5Nn0.EGIM96RAZx35lJzdJsyH-qQwv8Hdp7fsn3W0YpN81IU

# Ollama
NEXT_PUBLIC_OLLAMA_URL=http://localhost:11434


##### ATTEMPT AT DOCKER COMPOSE ######
# Supabase Public
# NEXT_PUBLIC_SUPABASE_URL=http://supabase_kong_chatbotui:8000
# NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0

# # Supabase Private
# SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImV4cCI6MTk4MzgxMjk5Nn0.EGIM96RAZx35lJzdJsyH-qQwv8Hdp7fsn3W0YpN81IU

# # Ollama
# NEXT_PUBLIC_OLLAMA_URL=http://ollama:11434

########################################

# API Keys (Optional: Entering an API key here overrides the API keys globally for all users)
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GOOGLE_GEMINI_API_KEY=
MISTRAL_API_KEY=
PERPLEXITY_API_KEY=
OPENROUTER_API_KEY=

# OpenAI API Information
NEXT_PUBLIC_OPENAI_ORGANIZATION_ID=

# Azure API Information
AZURE_OPENAI_API_KEY=
NEXT_PUBLIC_AZURE_OPENAI_ENDPOINT=
NEXT_PUBLIC_AZURE_GPT_35_TURBO_ID=
NEXT_PUBLIC_AZURE_GPT_45_VISION_ID=
NEXT_PUBLIC_AZURE_GPT_45_TURBO_ID=
2 changes: 1 addition & 1 deletion .gitignore
@@ -27,7 +27,7 @@ yarn-error.log*

# local env files
.env
.env*.local
# .env*.local

# vercel
.vercel
38 changes: 38 additions & 0 deletions Dockerfile
@@ -0,0 +1,38 @@
# Use the Node.js official image as a parent image
# FROM node:20
FROM node:21.5.0-bookworm


Should this be node:20-bookworm?

Node 20 is the current LTS version


# Create a non-root user and switch to it
# RUN adduser --disabled-password myuser
# USER myuser

# Set the working directory in the container
WORKDIR /usr/src/app

# # Set ownership of the working directory to the non-root user
# RUN chown -R myuser:myuser /usr/src/app

# # Set appropriate permissions
# RUN chmod -R 755 /usr/src/app

# Copy package.json and package-lock.json (or yarn.lock) to the container working directory
COPY package*.json ./

# Install exact dependencies from the lockfile (note: npm ci also installs
# dev dependencies, which the build step below needs)
RUN npm cache clean -f
RUN npm ci

# Copy the rest of the application to the container working directory
COPY . .

# Build the Next.js application
RUN npm run build

# Set the environment to production to reduce Next.js application size
ENV NODE_ENV=production

# Expose the port the app runs on
EXPOSE 3000

# Command to run the app
CMD ["npm", "start"]
85 changes: 85 additions & 0 deletions Makefile


Wouldn't it make sense to add all those commands simply to the package.json so that npm is the only tool (instead of introducing make)?
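For reference, a partial sketch of what that could look like as npm scripts (the script names and composition are assumptions, and plain `npm run` chaining has no direct analogue for Make's prerequisite graph):

```json
{
  "scripts": {
    "supabase:start": "supabase start",
    "db:types": "supabase gen types typescript --local > supabase/types.ts",
    "docker:up": "docker compose up -d",
    "docker:down": "docker compose down",
    "chatbot:docker": "npm run supabase:start && npm run db:types && npm run docker:up"
  }
}
```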

@@ -0,0 +1,85 @@
.PHONY: start-chatbot-docker start-chatbot-local build install-supabase start-supabase generate-db-types install-ollama start-ollama start-docker start-ui pull-mistral-docker pull-ollama-docker pull-mistral-local down-chatbot remove-chatbot remove-db remove-all rebuild

# Start chatbot application using Docker
start-chatbot-docker: install-supabase start-supabase generate-db-types start-docker pull-mistral-docker

# Start chatbot application using local Ollama install
start-chatbot-local: install-ollama install-supabase start-supabase generate-db-types start-ui pull-mistral-local


### Setup Commands ###

build:
	docker compose build

install-supabase:
	./scripts/install_supabase.sh

start-supabase:
	@echo "Starting locally deployed Supabase..."
	supabase start

generate-db-types:
	@echo "Generating database types..."
	supabase gen types typescript --local > supabase/types.ts

install-ollama:
	./scripts/install_ollama.sh

# Check if the Ollama service is active and start it if not. The recipe
# runs as a single shell invocation, so the if/else is joined with
# line continuations.
start-ollama:
	@if ! systemctl is-active --quiet ollama; then \
		echo "Starting Ollama service..."; \
		sudo systemctl start ollama; \
	else \
		echo "Ollama service is already running."; \
	fi

### Start Commands ###

start-docker:
	@echo "Starting ollama Docker and Chatbot-ui Docker application..."
	docker compose up -d

start-ui:
	@echo "Starting Chatbot-ui Docker application..."
	docker compose -f docker-compose-ui.yml up -d

pull-mistral-docker:
	@echo "Docker Pulling Mistral model... download may take a few moments..."
	docker exec ollama ollama pull mistral
	@echo "********************************************************************************"
	@echo "PLEASE OPEN YOUR WEB BROWSER AND NAVIGATE TO: http://localhost:3000"
	@echo "********************************************************************************"

pull-ollama-docker:
	@echo "Please enter the name of the model you wish to pull: "
	@read MODEL_NAME; docker exec ollama ollama pull $$MODEL_NAME

pull-mistral-local:
	@echo "Pulling Mistral model... download may take a few moments..."
	ollama pull mistral

### Shutdown Commands ###

down-chatbot:
	@echo "Stopping chatbot-ui application and related services..."
	docker compose down

remove-chatbot:
	@echo "Stopping chatbot-ui application and removing volumes..."
	docker compose down -v

remove-db:
	@echo "Removing current Supabase project and related services..."
	supabase stop --no-backup
	docker volume ls --filter label=com.supabase.cli.project=chatbotui -q | xargs -r docker volume rm


remove-all: remove-chatbot remove-db
	@echo "Removing and shutting down frontend and database servers..."

rebuild: remove-all build start-chatbot-docker
	@echo "Rebuilding the deployment..."
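One detail worth noting in `remove-db`: GNU `xargs -r` (`--no-run-if-empty`) skips invoking the command entirely when its input is empty, so `docker volume rm` is never called with zero arguments. A Docker-free sketch of the behavior (the `vol1 vol2` input is just example data):

```shell
# With -r, empty input produces no invocation at all; non-empty input
# runs the command once with the items as arguments.
empty_out=$(printf '' | xargs -r echo removing)
nonempty_out=$(printf 'vol1 vol2' | xargs -r echo removing)
echo "empty: [$empty_out]"
echo "nonempty: [$nonempty_out]"
```

Without `-r`, GNU `xargs` would still run `docker volume rm` once with no volume names, which exits with an error.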
35 changes: 35 additions & 0 deletions README.md
@@ -142,6 +142,41 @@ Your local instance of Chatbot UI should now be running at [http://localhost:300

You can view your backend GUI at [http://localhost:54323/project/default/editor](http://localhost:54323/project/default/editor).

## Local Host Fully Dockerized

This deployment runs everything in Docker containers, with only the Supabase CLI used to start the backend database.

For convenience, a Makefile and helper scripts are provided to get the Docker deployment running quickly.

```bash
make start-chatbot-docker
```

This triggers the following steps:

1. Checks for Supabase (Linux/macOS) and installs it if required
2. Starts the Supabase backend using the chatbot-ui migration scripts and generates database types
3. Starts the Docker containers for Chat UI and Ollama (using default ports)
4. Once the containers are deployed, Ollama automatically pulls the Mistral model to get you started
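The two containers in step 3 can be pictured with a minimal compose sketch (the service names, image tag, and port mappings are assumptions for illustration, not the PR's actual `docker-compose.yml`):

```yaml
services:
  chatbot-ui:
    build: .
    ports:
      - "3000:3000"
    env_file: .env.local
  ollama:
    image: ollama/ollama
    container_name: ollama   # matches `docker exec ollama ...` in the Makefile
    ports:
      - "11434:11434"
```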

To pull additional models use the makefile:

```bash
make pull-ollama-docker
```

Follow the prompt to enter the name of the model to download. The model is pulled by the Docker instance of Ollama.

**ENV Assumption:** The standard Supabase keys are used (change them as you wish); the Ollama URL and Supabase URL remain static for standard deployments.

**CAUTION:** If you already have Ollama running on your system, you will need to adjust the ports in the Dockerfile and `.env.local` to a port of your choice for the Docker Ollama instance.
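For example, remapping the Docker Ollama to a different host port might look like this (11435 is an example value, not from the PR), with `NEXT_PUBLIC_OLLAMA_URL=http://localhost:11435` set in `.env.local` to match:

```yaml
# Hypothetical compose override: publish the container's 11434 on host 11435
# so a native Ollama can keep the default port.
services:
  ollama:
    ports:
      - "11435:11434"
```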

Fully automating the local instance of Ollama via the Makefile is a work in progress. If Ollama is already installed and serving models, it will just work. There is currently an error when the script tries to start Ollama by running `ollama serve`.

```bash
make start-chatbot-local
```

## Hosted Quickstart

Follow these steps to get your own Chatbot UI instance running in the cloud.
1 change: 1 addition & 0 deletions components/chat/chat-helpers/index.ts
@@ -161,6 +161,7 @@ export const handleLocalChat = async (
{
model: chatSettings.model,
messages: formattedMessages,
stream: false, // streaming responses fail to load; keep disabled until Issue #1088 is fixed
options: {
temperature: payload.chatSettings.temperature
}