
Releases: withcatai/node-llama-cpp

v2.5.0

26 Sep 20:08
d99e3b0

2.5.0 (2023-09-26)

Bug Fixes

  • adapt to llama.cpp interface change (#49) (9db72b0)

Features

  • add FalconChatPromptWrapper (#53) (656bf3c) (used in the sketch below)
  • fall back to build from source if prebuilt binary loading fails (#54) (d99e3b0)
  • load conversation history into a LlamaChatSession (#51) (4e274ce) (see the sketch below)
  • only build one binary for all node versions (#50) (1e617cd)
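
A minimal sketch combining the new FalconChatPromptWrapper with the conversation history loading from #51. The model path is a placeholder and the option names follow the 2.x API as documented, so treat this as an illustration rather than a definitive reference:

```typescript
import {LlamaModel, LlamaContext, LlamaChatSession, FalconChatPromptWrapper} from "node-llama-cpp";

// Placeholder path; point this at a local Falcon model file.
const model = new LlamaModel({modelPath: "./models/falcon-7b.q4_0.bin"});
const context = new LlamaContext({model});

const session = new LlamaChatSession({
    context,
    // Wrap prompts in Falcon's chat format (added in #53).
    promptWrapper: new FalconChatPromptWrapper(),
    // Seed the session with a previous exchange (added in #51);
    // the `conversationHistory` option name is taken from the 2.x docs.
    conversationHistory: [{
        prompt: "Remember that my favorite color is teal.",
        response: "Got it, your favorite color is teal."
    }]
});

const answer = await session.prompt("What is my favorite color?");
console.log(answer);
```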

v2.4.1

15 Sep 23:59
b3758b4

2.4.1 (2023-09-15)

v2.4.0

09 Sep 21:46
01b89ce

2.4.0 (2023-09-09)

Features

v2.3.2

02 Sep 23:05
cf1f5f1

2.3.2 (2023-09-02)

Bug Fixes

  • load image urls properly outside of GitHub as well (#35) (cf1f5f1)

v2.3.1

02 Sep 22:39
3ef4c00

2.3.1 (2023-09-02)

v2.3.0

02 Sep 19:49
47c3c5f

2.3.0 (2023-09-02)

Bug Fixes

  • handle stop word remainders properly in a chat session (#32) (9bdef11)
  • move default export to be the last one in package.json (#31) (dd49959)

Features

  • threads count setting on a model (#33) (47c3c5f)
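
The thread count setting above can be used like this; a hedged sketch in which the `threads` option name and its placement on the model options are assumptions based on the changelog wording:

```typescript
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

const model = new LlamaModel({
    modelPath: "./models/model.bin", // placeholder path
    threads: 6 // assumed option: number of CPU threads llama.cpp may use for evaluation
});
const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

console.log(await session.prompt("Hello!"));
```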

v2.2.0

01 Sep 16:35
74be398

2.2.0 (2023-09-01)

Features

  • export class options types (#29) (74be398) (see the typed-options sketch below)
  • improve error message when llama.cpp source is not downloaded (#27) (7837af7)
  • make contributions and support more efficient via GitHub templates (#28) (5fc0d18)
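
With the class option types exported, an options object can be typed explicitly before being passed to a class. The exact exported type name used below (`LlamaChatSessionOptions`) is an assumption mirroring the class name:

```typescript
import {
    LlamaModel, LlamaContext, LlamaChatSession,
    type LlamaChatSessionOptions // assumed export name, mirroring the class name
} from "node-llama-cpp";

const model = new LlamaModel({modelPath: "./models/model.bin"}); // placeholder path
const context = new LlamaContext({model});

// The options object can now be built and type-checked separately.
const sessionOptions: LlamaChatSessionOptions = {context};
const session = new LlamaChatSession(sessionOptions);
```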

v2.1.2

28 Aug 23:02
fd332e1

2.1.2 (2023-08-28)

Bug Fixes

v2.1.1

28 Aug 22:08
b34b3d7

2.1.1 (2023-08-28)

v2.1.0

28 Aug 19:03
1043596

2.1.0 (2023-08-28)

Features

  • add grammar support (#13) (c28d2de) (see the sketch below)
  • add support for metal and cuda in the build command (#17) (1043596)
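
A hedged sketch of the grammar support, constraining a response to JSON; `LlamaGrammar.getFor("json")` follows the 2.x documentation, and the exact API surface at 2.1.0 may have differed slightly:

```typescript
import {LlamaModel, LlamaContext, LlamaChatSession, LlamaGrammar} from "node-llama-cpp";

const model = new LlamaModel({modelPath: "./models/model.bin"}); // placeholder path
const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

// Load one of the bundled GBNF grammars ("json" here) to constrain generation.
const grammar = await LlamaGrammar.getFor("json");

const answer = await session.prompt("Describe a cat as a JSON object", {grammar});
console.log(JSON.parse(answer));
```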