Could you please open-source the LLM2CLIP-EVA02-L-14-224 model? #11
Comments
Sure. We think the 336 version is the more commonly used model, so we haven't added a 224 version. We will try to open-source it and respond to you soon.
Could you please open-source the Llama3.2-1B-CC model?
We have released the LLM2CLIP-Llama-3.2-1B-Instruct-CC-Finetuned model. We also plan to release the corresponding EVA02-L-14-224 ViT model.
Thank you very much for your work!
@konioy This may need to be done later, as we haven't started training this model yet. The corresponding EVA02-L-14-224 ViT model may take one to two more days to be released.
Thank you very much for your work!
Thank you for your attention to our work. We are already in the process of training.
@konioy We have already released Llama3.2-1B-CC on Hugging Face; please check https://huggingface.co/microsoft/LLM2CLIP-Llama-3.2-1B-Instruct-CC-Finetuned
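For anyone finding this thread later, here is a minimal sketch of pulling that checkpoint, assuming it loads through the standard transformers Auto* API with `trust_remote_code` enabled. The model ID comes from the link above; the tokenizer choice and output fields are assumptions, so please consult the model card for the official usage.

```python
# Minimal sketch (assumptions noted below); see the Hugging Face model card
# for the official loading and inference code.
from transformers import AutoModel, AutoTokenizer

model_id = "microsoft/LLM2CLIP-Llama-3.2-1B-Instruct-CC-Finetuned"

# Assumption: the checkpoint ships custom modeling code, so trust_remote_code
# is needed; the tokenizer bundled with the repo may also differ.
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

texts = ["a diagram", "a photo of a cat"]
inputs = tokenizer(texts, padding=True, return_tensors="pt")

# Only demonstrates a forward pass; the exact output fields depend on the
# custom modeling code in the repository.
outputs = model(**inputs)
```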
Can I ask which ViT model the Llama3.2-1B-CC model corresponds to, and whether you have released it?
The corresponding EVA02-L-14-336 is right here.
Do you have a plan to release it?
It has been included in our plan, so stay tuned.
Hello, I couldn't find this model on Hugging Face. Could you open-source it as well? Thanks.