How to load tokenizer config? #203

Closed · Answered by giladgd
linonetwo asked this question in Q&A
Apr 16, 2024 · 1 comment · 4 replies

There’s a JinjaChatWrapper you can use, and it gets used automatically when the model includes a chat template.
You can run this command to chat with a model and look for the chat wrapper metadata that's printed before the chat starts:

npx -n node-llama-cpp chat <model file path>

You can also use the fileInfo property of a model instance to read its metadata.
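
For illustration, here's a minimal sketch of reading that metadata programmatically, assuming the current getLlama / loadModel API and that node-llama-cpp parses nested GGUF keys such as tokenizer.chat_template into objects on fileInfo.metadata (the model path is a placeholder):

```ts
import {getLlama} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "path/to/model.gguf" // placeholder path
});

// fileInfo exposes the parsed GGUF metadata, including the tokenizer
// configuration. If the model ships a chat template, node-llama-cpp
// picks a Jinja-based chat wrapper for it automatically.
console.log(model.fileInfo.metadata.tokenizer?.chat_template);
```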

Answer selected by giladgd