tiny-openai-embeddings-api

OpenAI Embeddings API-style local server, running on FastAPI.

This API is compatible with the OpenAI Embeddings API (Embeddings - OpenAI API).
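
Because the server exposes the same /v1/embeddings route, the official openai Python client (v1+) can be pointed at it by overriding the base URL. A minimal sketch, assuming the server is running locally on port 8000 as described under Usage below; the API key is a placeholder since the Authorization header may be ignored:

from openai import OpenAI

# Point the official client at the local server instead of api.openai.com.
client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="sk-dummy")

response = client.embeddings.create(
    input="Your text string goes here",
    model="sonoisa/sentence-bert-base-ja-mean-tokens-v2",
)
print(len(response.data[0].embedding))  # dimensionality of the returned vector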

Setup

This was built and tested with Python 3.10.8 on Ubuntu 20.04 (WSL2), but it should also work on Python 3.9+.

pip install -r requirements.txt

or

docker compose build

Usage

server

export PYTHONPATH=.
uvicorn main:app --host 0.0.0.0

or

docker compose up

client

Note: the Authorization header may be ignored by the server.

example 1: typical use case with a single input string, almost identical to the OpenAI Embeddings API example

curl http://127.0.0.1:8000/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "input": "Your text string goes here",
    "model": "sonoisa/sentence-bert-base-ja-mean-tokens-v2"
  }'

example 2: batch use case with a list of input strings

curl http://127.0.0.1:8000/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "input": ["Your text string goes here", "Where is your text string?"]
    "model": "sonoisa/sentence-bert-base-ja-mean-tokens-v2"
  }'
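
The response is expected to follow the OpenAI embeddings response shape: an object whose data list holds one embedding per input, in request order. A minimal Python sketch with requests, assuming the default host and port above:

import requests

payload = {
    "input": ["Your text string goes here", "Where is your text string?"],
    "model": "sonoisa/sentence-bert-base-ja-mean-tokens-v2",
}
resp = requests.post("http://127.0.0.1:8000/v1/embeddings", json=payload)
resp.raise_for_status()

# One embedding per input string, in the same order as the request.
for item in resp.json()["data"]:
    print(item["index"], len(item["embedding"]))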

download a model from huggingface_hub

python -m download_model --model_id sonoisa/sentence-bert-base-ja-mean-tokens-v2 --local_dir model
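
The download_model helper ships with this repository; a roughly equivalent standalone sketch using huggingface_hub's snapshot_download (the helper's exact options may differ) looks like:

from huggingface_hub import snapshot_download

# Fetch the model files into ./model so the server can load them locally.
snapshot_download(
    repo_id="sonoisa/sentence-bert-base-ja-mean-tokens-v2",
    local_dir="model",
)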

License

Everything by morioka is licensed under the MIT License.

TODO

Enjoy!
