Whenever I run `deepseek-coder-v2:latest` through `ollama`, the following error appears in the log for each prompt:
```
llm_tokenizer_bpe::check_double_bos_eos: Added a BOS token to the prompt as specified by the model but the prompt also starts with a BOS token. So now the final prompt starts with 2 BOS tokens. Are you sure this is what you want?
```
I believe this is due to `tokenizer.ggml.add_bos_token` being set to `true` while the template also already starts with a `<|begin▁of▁sentence|>` token, so BOS ends up being added twice.
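For what it's worth, that flag can be checked directly in the GGUF metadata. Below is a minimal sketch using the `gguf` Python package from the llama.cpp repo (`pip install gguf`); the model path is hypothetical (Ollama keeps its model blobs under `~/.ollama/models/blobs/`), and the field-access details may vary between package versions.

```python
# Minimal sketch: read tokenizer.ggml.add_bos_token from a GGUF file.
# Assumes the `gguf` package from the llama.cpp repo; path is hypothetical.
from gguf import GGUFReader

reader = GGUFReader("deepseek-coder-v2.gguf")  # hypothetical path
field = reader.fields.get("tokenizer.ggml.add_bos_token")
if field is None:
    print("add_bos_token not set; the loader's default applies")
else:
    # For scalar fields, data[0] indexes the part holding the value.
    value = field.parts[field.data[0]][0]
    print("tokenizer.ggml.add_bos_token =", bool(value))
```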
I'm not sure to what extent this double BOS token affects the model's output quality or its tendency to hallucinate.
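As a possible workaround (an untested sketch, not a confirmed fix), the template could be overridden so that only the tokenizer inserts BOS, using Ollama's own Modelfile mechanism. `deepseek-coder-v2-nobos` is just a placeholder name:

```sh
# Dump the model's current Modelfile, including its TEMPLATE:
ollama show --modelfile deepseek-coder-v2:latest > Modelfile

# Edit Modelfile by hand to drop the leading <|begin▁of▁sentence|> from
# TEMPLATE, then build a variant that relies on the tokenizer's own BOS:
ollama create deepseek-coder-v2-nobos -f Modelfile
ollama run deepseek-coder-v2-nobos
```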