
Can you add LRV-Instruction to your updated arXiv version? #23

Open
FuxiaoLiu opened this issue Sep 25, 2023 · 1 comment

Comments

@FuxiaoLiu

FuxiaoLiu commented Sep 25, 2023

Paper: Mitigating Hallucination in Large Multi-Modal Models via Robust Instruction Tuning
link: https://arxiv.org/pdf/2306.14565.pdf
Name: LRV-Instruction
Focus: Multimodal
Notes: A benchmark to evaluate hallucination and instruction-following ability in large multimodal models

bib:
@article{liu2023aligning,
  title={Aligning Large Multi-Modal Model with Robust Instruction Tuning},
  author={Liu, Fuxiao and Lin, Kevin and Li, Linjie and Wang, Jianfeng and Yacoob, Yaser and Wang, Lijuan},
  journal={arXiv preprint arXiv:2306.14565},
  year={2023}
}

@jindongwang (Collaborator)

@FuxiaoLiu Thanks for the recommendation. The survey paper is currently under review; we will add your work to the revised version after receiving the first-round reviews. :)
