iTeam-S/kaiz

kaiz

Before anything else, start the local Ollama service:

$ ~ cd local
$ ~ docker compose -f compose.ollama.yaml up -d
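
The repository ships its own compose.ollama.yaml; a minimal equivalent, assuming the container name local_ollama used below and Ollama's default port 11434, might look like:

```yaml
# Sketch only — the actual compose.ollama.yaml in the repo may differ.
services:
  ollama:
    image: ollama/ollama
    container_name: local_ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
```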

Then, access the container and pull a model:

$ ~ docker exec -it local_ollama bash
$ ~ ollama pull llama3.2:1b

Set environment variables:

export KAIZ_OLLAMA_HOST=...   # URL where the Ollama service is reachable
export KAIZ_OLLAMA_MODEL=...  # name of the model pulled above
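
On the application side, these variables would typically be read with fallbacks to Ollama's conventional defaults. A hypothetical sketch (the helper name resolveOllamaConfig and the default values are assumptions, not taken from kaiz's source):

```typescript
// Hypothetical sketch of how kaiz might resolve its Ollama settings.
// KAIZ_OLLAMA_HOST / KAIZ_OLLAMA_MODEL come from the exports above;
// the fallback values are assumptions (Ollama's usual defaults).
interface OllamaConfig {
  host: string;
  model: string;
}

function resolveOllamaConfig(env: Record<string, string | undefined>): OllamaConfig {
  return {
    host: env.KAIZ_OLLAMA_HOST ?? "http://localhost:11434",
    model: env.KAIZ_OLLAMA_MODEL ?? "llama3.2:1b",
  };
}

// Example: read from the current process environment.
const config = resolveOllamaConfig(process.env);
console.log(config.host, config.model);
```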

To install dependencies:

bun install

To run:

bun run index.ts Hello
bun run index.ts "Hello! How to Run Open Source LLMs Locally Using Ollama?"
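
Everything after index.ts is treated as the prompt. A sketch of how such a CLI might assemble the prompt and call Ollama's /api/generate endpoint (the function names promptFromArgs, buildGenerateRequest, and main are illustrative, not kaiz's actual code):

```typescript
// Join all CLI arguments after the script name into one prompt string.
function promptFromArgs(args: string[]): string {
  return args.join(" ").trim();
}

// Build the JSON body for Ollama's /api/generate endpoint.
// stream: false asks Ollama for a single complete response.
function buildGenerateRequest(model: string, prompt: string) {
  return { model, prompt, stream: false };
}

// main() would be invoked when the script runs directly.
async function main(): Promise<void> {
  const prompt = promptFromArgs(process.argv.slice(2));
  const host = process.env.KAIZ_OLLAMA_HOST ?? "http://localhost:11434";
  const model = process.env.KAIZ_OLLAMA_MODEL ?? "llama3.2:1b";

  const res = await fetch(`${host}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  const data = await res.json();
  console.log(data.response);
}
```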

To build:

  • Linux:
$ ~ bun build --compile --minify ./index.ts --target=bun-linux-x64 --outfile=bin/kaiz
  • Windows:
$ ~ bun build --compile --minify ./index.ts --target=bun-windows-x64 --outfile=bin/kaiz