Cohere Command R Input Token Limit? #138672
Category: Models
Question: This model lists a context window of 131k input tokens. Yet when I run the sample for making a single request and pass ~125k tokens, I get an error. Why do the two numbers (131k and 8k) not line up?
Answered by matthewisabel, Sep 15, 2024
Sorry for the confusion. Those are the characteristics of the model itself; however, we apply per-request token limits, which you can read about here: https://docs.github.com/en/github-models/prototyping-with-ai-models#rate-limits We're still evaluating these limits since this is a limited preview, and there isn't yet a seamless upgrade path to go beyond these constraints. Hopefully more updates soon, and I'll think about how we can provide more clarity here.
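In practice this means the model's 131k context window is not the number that matters in the preview: the platform's per-request input cap (8k tokens, per the rate-limits page linked above) is what rejects large prompts. A minimal sketch of guarding against that cap before sending a request is below. The 8k figure, the 4-characters-per-token heuristic, and the `truncate_prompt` helper are all assumptions for illustration, not part of the GitHub Models API; a real tokenizer would give exact counts.

```python
# Assumed preview cap from the rate-limits doc, NOT the 131k model context.
MAX_INPUT_TOKENS = 8000
# Rough heuristic; a real tokenizer (e.g. the model's own) gives exact counts.
CHARS_PER_TOKEN = 4


def truncate_prompt(prompt: str, max_tokens: int = MAX_INPUT_TOKENS) -> str:
    """Trim the tail of `prompt` so its estimated token count fits the cap."""
    max_chars = max_tokens * CHARS_PER_TOKEN
    return prompt if len(prompt) <= max_chars else prompt[:max_chars]


# A ~125k-token prompt (by the heuristic) gets cut down to the assumed cap.
long_text = "x" * 500_000
print(len(truncate_prompt(long_text)))  # → 32000 (8000 tokens * 4 chars)
```

Truncating from the tail is just one policy; summarizing or chunking the input are alternatives when the dropped text matters.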
Answer selected by ayan4m1