-
It would be nice to have a table with model recommendations depending on VRAM size. Thanks!
Answered by jhc13 on Jun 19, 2024
-
Please read the discussion in #161 and #169. I recommend CogVLM2 (if you're on Linux), CogVLM/CogAgent, or InternLM-XComposer 2.
Answer selected by ai-marat
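For reference, since the choice between these models comes down to available VRAM, here is a minimal sketch for checking how much VRAM your GPU has before picking one. It assumes PyTorch with CUDA support is installed; the function name and the GiB unit are just illustrative choices, and actual per-model requirements are discussed in #161 and #169.

```python
import torch


def total_vram_gib(device_index: int = 0) -> float:
    """Return the total VRAM of the given CUDA device in GiB."""
    if not torch.cuda.is_available():
        raise RuntimeError("No CUDA GPU detected.")
    props = torch.cuda.get_device_properties(device_index)
    return props.total_memory / 1024**3


if __name__ == "__main__":
    # Print the capacity of GPU 0; compare it against the figures
    # reported for each model in the linked discussions.
    print(f"GPU 0: {total_vram_gib():.1f} GiB VRAM")
```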