Releases: withcatai/node-llama-cpp

v3.0.0-beta.36 (Pre-release)
Released 30 Jun 2024, 22:18 · commit 81e0575

Shipped with llama.cpp release b3267

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.35 (Pre-release)
Released 30 Jun 2024, 19:44 · commit a7a2517

Shipped with llama.cpp release b3266

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.34 (Pre-release)
Released 30 Jun 2024, 02:09 · commit 37c4b26

Shipped with llama.cpp release b3265

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.33 (Pre-release)
Released 29 Jun 2024, 23:09 · commit 1fbbf72

Features

  • move CUDA prebuilt binaries to dependency modules (#250) (8a92e31)

Shipped with llama.cpp release b3265

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v2.8.12
Released 21 Jun 2024, 14:49 · commit 2137c46

Bug Fixes

  • bump llama.cpp release used in prebuilt binaries (#247) (2137c46)

v3.0.0-beta.32 (Pre-release)
Released 18 Jun 2024, 01:37 · commit c89178f

Shipped with llama.cpp release b3166

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.31 (Pre-release)
Released 17 Jun 2024, 23:01 · commit 0b85800

Bug Fixes

  • remove CUDA binary compression for Windows (#243) (0b85800)
  • improve `inspect gpu` command output (#243) (0b85800)

Shipped with llama.cpp release b3166

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.30 (Pre-release)
Released 17 Jun 2024, 20:53 · commit 1e7c5d0

Shipped with llama.cpp release b3166

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.29 (Pre-release)
Released 16 Jun 2024, 01:35 · commit 0d40ffc

Bug Fixes

  • remove CUDA binary compression for now (#238) (0d40ffc)

Shipped with llama.cpp release b3153

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.28 (Pre-release)
Released 15 Jun 2024, 23:57 · commit b89ad2d

Features

  • compress CUDA prebuilt binaries (#236) (b89ad2d)
  • automatically solve more CUDA compilation errors (#236) (b89ad2d)

Shipped with llama.cpp release b3153

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.