Releases: withcatai/node-llama-cpp
v2.5.0
2.5.0 (2023-09-26)
Bug Fixes
- adapt to `llama.cpp` interface change (#49) (9db72b0)
Features
- add `FalconChatPromptWrapper` (#53) (656bf3c)
- fall back to build from source if prebuilt binary loading fails (#54) (d99e3b0)
- load conversation history into a `LlamaChatSession` (#51) (4e274ce)
- only build one binary for all node versions (#50) (1e617cd)
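The "load conversation history" feature above (#51) suggests that past turns can be handed back to a chat session when it is created. As a rough, self-contained sketch of the idea (the `ConversationInteraction` shape and `renderHistory` helper here are hypothetical illustrations, not the library's actual API — check the node-llama-cpp docs for the real option name and types), a chat prompt wrapper could fold such a history back into a transcript like this:

```typescript
// Hypothetical shape of one past exchange; assumed for illustration only.
interface ConversationInteraction {
    prompt: string;
    response: string;
}

// Sketch of how a chat prompt wrapper might rebuild a transcript from
// prior turns before appending the new user prompt.
function renderHistory(
    history: ConversationInteraction[],
    nextPrompt: string
): string {
    const turns = history.map(
        ({prompt, response}) => `User: ${prompt}\nAssistant: ${response}`
    );
    // Leave the final "Assistant:" open so the model completes it.
    turns.push(`User: ${nextPrompt}\nAssistant:`);
    return turns.join("\n");
}

// Prints a two-turn transcript ending with an open "Assistant:" line.
console.log(
    renderHistory([{prompt: "Hi", response: "Hello!"}], "How are you?")
);
```

This only illustrates the concept; the actual template and delimiters depend on the model's prompt wrapper (e.g. the `FalconChatPromptWrapper` added in this release).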
v2.4.0
2.4.0 (2023-09-09)
Features
v2.3.2
2.3.2 (2023-09-02)
Bug Fixes
- load image URLs properly outside of GitHub as well (#35) (cf1f5f1)
v2.3.0
2.3.0 (2023-09-02)
Bug Fixes
- handle stop-word remainders properly in a chat session (#32) (9bdef11)
- move `default` export to be the last one in `package.json` (#31) (dd49959)
Features
v2.2.0
2.2.0 (2023-09-01)
Features
- export class options types (#29) (74be398)
- improve error message when `llama.cpp` source is not downloaded (#27) (7837af7)
- make contributions and support more efficient via GitHub templates (#28) (5fc0d18)
v2.1.2
2.1.2 (2023-08-28)
Bug Fixes
v2.1.0
2.1.0 (2023-08-28)
Features
- add grammar support (#13) (c28d2de)
- add support for Metal and CUDA in the `build` command (#17) (1043596)