Hi,
When I try to run [Chat_with_CSV_File_Lllama2], I encounter this problem:
Number of tokens (663) exceeded maximum context length (512).
Number of tokens (664) exceeded maximum context length (512).
Number of tokens (665) exceeded maximum context length (512).
Number of tokens (666) exceeded maximum context length (512).
Number of tokens (667) exceeded maximum context length (512).
Number of tokens (668) exceeded maximum context length (512).
Number of tokens (669) exceeded maximum context length (512).
Number of tokens (670) exceeded maximum context length (512).
I load the model like this:
llm = CTransformers(model="models/llama-2-7b-chat.ggmlv3.q8_0.bin",
                    model_type="llama",
                    max_new_tokens=512,
                    temperature=0.1)
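A possible fix (my assumption, not something stated in the notebook): the warnings suggest the prompt built from the CSV is longer than ctransformers' default context window of 512 tokens, and `max_new_tokens` only limits the generated output, not the input. ctransformers accepts a `context_length` option through its `config` dict, so a sketch along these lines might help:

```python
# Raise the model's context window above the prompt size.
# `context_length` defaults to 512 in ctransformers; 2048 is an
# illustrative value, assuming the model supports that window.
config = {
    "max_new_tokens": 512,
    "context_length": 2048,
    "temperature": 0.1,
}

# Hypothetical usage (requires the ctransformers/langchain packages
# and the local GGML file, so it is commented out here):
# from langchain.llms import CTransformers
# llm = CTransformers(model="models/llama-2-7b-chat.ggmlv3.q8_0.bin",
#                     model_type="llama",
#                     config=config)
```

Alternatively, shrinking the retrieved CSV context (fewer rows or a smaller chunk size) would keep the prompt under the existing 512-token limit.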
Can anyone help me solve this problem?