LLM_local_loader shows an error "data did not match any variant of untagged enum ModelWrapper at line 1251003 column 3" #158
JewelSamAmy asked this question in Q&A · Unanswered · 0 replies
@heshengtao, I'm still learning how your node functions, so I apologize if my question is too basic.
I am working through your starter docs and have tried both start_with_LLM_api and start_with_LLM_local. start_with_LLM_api works well with my local Ollama (I'm using the small model llama3.2:1b for testing purposes). As for start_with_LLM_local, I followed your guide and git-cloned meta-llama/Llama-3.2-1B to my local environment. When I run it, it shows the error in the title: "data did not match any variant of untagged enum ModelWrapper at line 1251003 column 3".

Could you please assist me and point me in the right direction? Thank you!