The LLMKit backend is built with Rust, using Actix Web for the HTTP server and SQLite for persistence.

## Features

- RESTful API for prompt management and execution
- OpenAI-compatible endpoints
- Provider abstraction layer for multiple LLM services
- SQLite database for storage, with SQLx for type-safe queries
- Prompt versioning and evaluation
## Prerequisites

- Rust toolchain (latest stable)
- SQLx CLI for database management
- SQLite
## Database Setup

- Install the SQLx CLI if you don't have it:

  ```
  cargo install sqlx-cli
  ```

- Set the database URL:

  ```
  export DATABASE_URL="sqlite:llmkit.db"
  ```

- Create a new database:

  ```
  sqlx database create
  ```

- Run migrations to set up the schema:

  ```
  sqlx migrate run
  ```
### Creating a New Migration

- Create a new migration file:

  ```
  sqlx migrate add <migration_name>
  ```

- Edit the generated SQL file in the `migrations` directory
- Run the migration:

  ```
  sqlx migrate run
  ```
- Generate SQLx metadata for offline compile-time checking (add `--check` to verify that the existing metadata is up to date without regenerating it):

  ```
  cargo sqlx prepare
  ```
## Running the Server

```
cargo run
```

The server starts on `http://localhost:8000` by default.
## API Endpoints

### Prompts

- `GET /v1/ui/prompts` - List all prompts
- `POST /v1/ui/prompts` - Create a new prompt
- `GET /v1/ui/prompts/{id}` - Get a specific prompt
- `PUT /v1/ui/prompts/{id}` - Update a prompt
- `DELETE /v1/ui/prompts/{id}` - Delete a prompt
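Once the server is running, the list endpoint can be exercised with `curl`. A minimal sketch, assuming the default port from `cargo run`:

```shell
# Base URL for the local server; port 8000 is the default.
BASE_URL="http://localhost:8000"

# List all prompts as JSON.
curl -s "$BASE_URL/v1/ui/prompts"
```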
### Prompt Execution

- `POST /v1/ui/prompts/execute/{id}` - Execute a prompt
- `POST /v1/ui/prompts/execute/{id}/stream` - Stream a prompt execution
- `POST /v1/ui/prompts/execute/chat` - Execute a prompt in chat mode
### OpenAI-Compatible

- `POST /v1/chat/completions` - Chat completions API
- `POST /v1/chat/completions/stream` - Streaming chat completions
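Because these endpoints are OpenAI-compatible, a standard chat completions payload should work. A sketch, where the prompt name `my-prompt` and the `Authorization` header are assumptions to adjust for your setup:

```shell
# OpenAI-style request body; "my-prompt" is a placeholder prompt name.
BODY='{"model": "my-prompt", "messages": [{"role": "user", "content": "Hello"}]}'

# Send it to the local server started with `cargo run`.
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLMKIT_API_KEY" \
  -d "$BODY"
```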
### API Keys

- `GET /v1/ui/settings/api-keys` - List API keys
- `POST /v1/ui/settings/api-keys` - Create a new API key
- `DELETE /v1/ui/settings/api-keys/{id}` - Delete an API key
## Project Structure

- `src/controllers/` - API route handlers
- `src/db/` - Database access layer
- `src/middleware/` - HTTP middleware (auth, etc.)
- `src/services/` - Business logic and LLM provider integrations
## Prompt Template Format

Prompt templates use HTML-style comments to mark where each message role begins:

```
<!-- role:system -->
you are a helpful assistant
<!-- role:user -->
sup dude
```
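The role markers above split a template into (role, content) message pairs. An illustrative shell sketch of that split — not LLMKit's actual parsing code:

```shell
# Sample template using the role-comment format shown above.
TEMPLATE='<!-- role:system -->
you are a helpful assistant
<!-- role:user -->
sup dude'

# Each marker line sets the current role; other lines print as role-tagged content.
echo "$TEMPLATE" | awk '
  /^<!-- role:/ { gsub(/<!-- role:| -->/, ""); role = $0; next }
  { print role ": " $0 }
'
# → system: you are a helpful assistant
#   user: sup dude
```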