Custom stop token #193

Closed Answered by giladgd
fakerybakery asked this question in Q&A

In the version 3 beta, you can generate a completion using LlamaCompletion and configure stopGenerationTriggers to stop generation when a given text sequence is produced.

For example:

import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaCompletion} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "stable-code-3b.Q5_K_M.gguf")
});
const context = await model.createContext();
const completion = new LlamaCompletion({
    contextSequence: context.getSequence(),
    stopGenerationTriggers: [
        [ "];" ]
    ]
});

const input = "const arrayFromOneToTwenty = [1, 2, 3,";
const res = await completion.generateCompletion(input, {
    maxTokens: 128
});

// generation stops once "];" is produced
console.log(input + res);
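Conceptually, a stop generation trigger just cuts the completion off at the point where one of the trigger strings appears in the generated text. Here is a minimal, standalone sketch of that idea in plain JavaScript (an illustration of the mechanism, not node-llama-cpp's internals):

```javascript
// Truncate generated text at the earliest occurrence of any stop trigger.
// Hypothetical helper for illustration only.
function applyStopTriggers(text, triggers) {
    let cutoff = -1;
    for (const trigger of triggers) {
        const index = text.indexOf(trigger);
        if (index !== -1 && (cutoff === -1 || index < cutoff))
            cutoff = index;
    }
    return cutoff === -1 ? text : text.slice(0, cutoff);
}

// With the trigger [ "];" ], the model's output for the array example
// would be cut right before the closing "];".
console.log(applyStopTriggers(" 4, 5]; console.log(arr);", ["];"]));
// → " 4, 5"
```

In the real library the check runs on the token stream during decoding, so the model stops producing tokens as soon as a trigger matches, rather than generating the full text and trimming it afterwards.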
