
Problem in Reproducing llava-onevision-72b-ov-chat #459

Open
Estrella142857 opened this issue Dec 13, 2024 · 0 comments
Hi, I used your demo (llava-onevision-72b-ov-chat) a few weeks ago, and it worked really well for my use case.
However, I noticed the demo website went down. I don't know if it will come back, so I deployed my own instance of llava-onevision-72b-ov-chat from the Hugging Face checkpoint (https://huggingface.co/lmms-lab/llava-onevision-qwen2-72b-ov-chat).
Sadly, this local version doesn't perform as well as your demo did. So I was wondering whether you used a default prompt in the demo. If so, where can I find it?
