
Doing Batched Inference Using TokensPrompt #16388

Asked by wlm64 in Q&A · Answered by DarkLight1337
You should pass a list of TokensPrompt objects, not put the list of token IDs inside a single prompt.
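
For context, here is a minimal sketch of the distinction using vLLM's offline `LLM` API. It assumes a vLLM version that exports `TokensPrompt` from `vllm.inputs`; the model name and token IDs are illustrative only.

```python
# Minimal sketch (assumptions: vLLM exports TokensPrompt from vllm.inputs;
# model name and token IDs below are illustrative placeholders).
from vllm import LLM, SamplingParams
from vllm.inputs import TokensPrompt

llm = LLM(model="facebook/opt-125m")  # illustrative model
params = SamplingParams(max_tokens=32)

# Right: one TokensPrompt per sequence, collected in a Python list.
# Wrong: TokensPrompt(prompt_token_ids=[ids_a, ids_b]) -- nesting both
# sequences inside a single prompt is not a batch.
batch = [
    TokensPrompt(prompt_token_ids=[1, 2, 3, 4]),  # sequence A
    TokensPrompt(prompt_token_ids=[5, 6, 7]),     # sequence B
]

outputs = llm.generate(batch, params)  # all sequences in one batched call
for out in outputs:
    print(out.outputs[0].text)
```

`generate` then returns one `RequestOutput` per prompt, in the same order as the input list.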

Answer selected by wlm64