Releases: withcatai/node-llama-cpp
v3.0.0-beta.27
3.0.0-beta.27 (2024-06-12)
Features
Shipped with llama.cpp release b3135.
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest` (learn more).
v3.0.0-beta.26
3.0.0-beta.26 (2024-06-11)
Bug Fixes
Shipped with llama.cpp release b3135.
v3.0.0-beta.25
3.0.0-beta.25 (2024-06-10)
Bug Fixes
Shipped with llama.cpp release b3091.
v3.0.0-beta.24
3.0.0-beta.24 (2024-06-09)
Bug Fixes
Shipped with llama.cpp release b3091.
v3.0.0-beta.23
3.0.0-beta.23 (2024-06-09)
Bug Fixes
Features
- parallel function calling (#225) (95f4645)
- preload prompt (#225) (95f4645)
- prompt completion engine (#225) (95f4645)
- chat wrapper based system message support (#225) (95f4645)
- add prompt completion to the Electron example (#225) (95f4645)
- model compatibility warnings (#225) (95f4645)
- Functionary v2.llama3 support (#225) (95f4645)
- parallel function calling with plain Llama 3 Instruct (#225) (95f4645)
- improve function calling support for default chat wrapper (#225) (95f4645)
- parallel model downloads (#225) (95f4645)
- improve the electron example (#225) (95f4645)
- `customStopTriggers` for `LlamaCompletion` (#225) (95f4645)
- improve loading status in the Electron example (#226) (4ea0c3c)
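The parallel function calling items above can be pictured with a small, self-contained sketch. This is not the library's actual API (in node-llama-cpp, functions are registered on the chat session and passed along with the prompt); the plain objects below are hypothetical and only illustrate the handler shape and why resolving several calls in one turn helps:

```typescript
// Hypothetical handler registry (illustrative only, NOT node-llama-cpp's API).
// With parallel function calling, a single model turn can request several of
// these functions and get all the results back before generation continues.
const functions = {
    getTemperature: {
        description: "Current temperature in a city, in Celsius",
        handler: ({city}: {city: string}) => ({city, temperatureC: 21}) // stubbed value
    },
    getHumidity: {
        description: "Current relative humidity in a city, in percent",
        handler: ({city}: {city: string}) => ({city, humidityPercent: 40}) // stubbed value
    }
};

// "What's the weather in Oslo?" can trigger both calls in one turn,
// costing a single model round trip instead of two sequential ones:
const weather = {
    ...functions.getTemperature.handler({city: "Oslo"}),
    ...functions.getHumidity.handler({city: "Oslo"})
};
console.log(weather); // {city: "Oslo", temperatureC: 21, humidityPercent: 40}
```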
Shipped with llama.cpp release b3091.
v2.8.11
v3.0.0-beta.22
3.0.0-beta.22 (2024-05-19)
Bug Fixes
Shipped with llama.cpp release b2929.
v3.0.0-beta.21
3.0.0-beta.21 (2024-05-19)
Bug Fixes
Shipped with llama.cpp release b2929.
v3.0.0-beta.20
3.0.0-beta.20 (2024-05-19)
Bug Fixes
Features
- `init` command to scaffold a new project from a template (with `node-typescript` and `electron-typescript-react` templates) (#217) (d6a0f43)
- debug mode (#217) (d6a0f43)
- load LoRA adapters (#217) (d6a0f43)
- improve Electron support (#217) (d6a0f43)
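The LoRA item in this list can be grounded with a conceptual sketch. This is not node-llama-cpp's loading API; the function below (a hypothetical name, with toy rank-1 matrices) only illustrates the arithmetic a LoRA adapter represents: low-rank factors A and B adjust a base weight matrix W as W' = W + scale · (B·A), without changing the base model file on disk.

```typescript
// Conceptual LoRA sketch (NOT node-llama-cpp's API): an adapter stores
// low-rank factors A (r x n) and B (m x r); the effective weight becomes
// W' = W + scale * (B · A), leaving the base weights untouched.
type Matrix = number[][];

function matmul(x: Matrix, y: Matrix): Matrix {
    // Naive dense matrix multiply, enough for this toy example.
    return x.map((row) =>
        y[0].map((_, j) => row.reduce((sum, v, k) => sum + v * y[k][j], 0))
    );
}

function applyLora(w: Matrix, a: Matrix, b: Matrix, scale: number): Matrix {
    const delta = matmul(b, a); // (m x r) · (r x n) = (m x n)
    return w.map((row, i) => row.map((v, j) => v + scale * delta[i][j]));
}

// 2x2 identity base weight, rank-1 adapter, half-strength scale:
const adapted = applyLora([[1, 0], [0, 1]], [[1, 2]], [[1], [1]], 0.5);
console.log(adapted); // [[1.5, 1], [0.5, 2]]
```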
Shipped with llama.cpp release b2928.
v3.0.0-beta.19
3.0.0-beta.19 (2024-05-12)
Bug Fixes
Features
Shipped with llama.cpp release b2861.