OOM in HunyuanVideo24G #300

Open
rorrewang opened this issue Jan 2, 2025 · 3 comments

Comments

@rorrewang
I previously ran your 6 GB version successfully on my own 4060 Ti 16 GB. Since our company's server uses V100 32 GB GPUs, I tried running the 24 GB version there, but it fails with torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 1263.35 GiB. An allocation of over a thousand GiB is absurd, so I'd like to ask what is going wrong. Could you verify whether the 24 GB version works correctly?

@rorrewang
Author

I ran the 6 GB version again and still got OOM, so I suspect something went wrong when transferring the model files. I'll re-download them and try again.

@JerryLuYujie

Same here.

@Artiprocher
Collaborator

This may be a torch version issue. On older torch versions, the attention implementation falls back from flash attention 2 to naive attention. We recommend upgrading to the latest version.
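As a quick sanity check (a minimal sketch, not from this thread), you could print the installed torch version and confirm that PyTorch's flash-attention SDPA backend is available on your GPU; if attention falls back to a naive implementation, the full attention matrix gets materialized and the memory estimate can explode for long video token sequences. The calls below are standard PyTorch APIs (2.0+).

```python
# Minimal sketch: check the torch install and whether the flash-attention
# SDPA backend is compiled in and enabled. Assumes PyTorch >= 2.0 with CUDA.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
    print("compute capability:", torch.cuda.get_device_capability(0))
    # True means the flash-attention scaled_dot_product_attention kernel
    # is enabled globally; False suggests a fallback to other backends.
    print("flash SDP enabled:", torch.backends.cuda.flash_sdp_enabled())
```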
