
Question about "packed mode" #609

Open
99DHL opened this issue Mar 28, 2025 · 4 comments
99DHL commented Mar 28, 2025

Does packed mode (`packed=True`) yield different training results compared to `packed=False`? I thought packed mode does not apply any approximations, but it seems to produce slightly different results: PSNR is almost the same, but the number of Gaussians differs slightly.

@liruilong940607 (Collaborator) commented:

It could yield different results. In packed mode, Gaussians with zero gradients are skipped during updates, while in normal mode these zero-gradient Gaussians are still updated because of the momentum in the optimizer.
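To make the momentum point concrete, here is a minimal pure-Python sketch of an Adam-style update (not gsplat or PyTorch code; the function and parameter names are illustrative). After one step with a real gradient, a subsequent step with a zero gradient still moves the parameter, because the first-moment estimate `m` retains momentum:

```python
# Simplified single-parameter Adam step (illustrative, not torch.optim.Adam).
def adam_step(p, g, m, v, t, lr=1e-2, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g          # first moment: carries momentum
    v = b2 * v + (1 - b2) * g * g      # second moment
    m_hat = m / (1 - b1 ** t)          # bias correction
    v_hat = v / (1 - b2 ** t)
    p = p - lr * m_hat / (v_hat ** 0.5 + eps)
    return p, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, 1.0, m, v, t=1)      # step with a nonzero gradient
p_before = p
p, m, v = adam_step(p, 0.0, m, v, t=2)      # zero gradient this step
print(p != p_before)                        # True: momentum still moved p
```

A sparse optimizer such as `torch.optim.SparseAdam` avoids exactly this: parameters that receive no gradient entry are not touched, so their momentum state stays frozen.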

@99DHL (Author) commented Mar 30, 2025:

Thank you for your kind response.
I looked at the code, and it seems that when packed mode is used, Adam is used instead of SparseAdam. So I thought that even if the gradients are sparse, the optimizer and weight updates are still performed for all parameters. Did I misunderstand this? Also, if packed mode behaves differently from normal mode, could you explain how it differs from sparse_grad mode, which uses SparseAdam?

@liruilong940607 (Collaborator) commented:

Ah, I thought you were asking about packed together with sparse_grad.

Setting packed by itself does not affect the gradient update; it only does so if you also set sparse_grad.

If you only set packed, it should not affect the optimization process in any way. The difference you see between runs is probably the nondeterministic randomness in PyTorch.
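The sparse_grad distinction can be sketched in pure Python (illustrative only; these functions are not gsplat's or PyTorch's API, and the momentum-only update is a simplification of Adam). A SparseAdam-style step only touches parameters that actually received a gradient entry this iteration, while a dense step updates every parameter, even those with a zero gradient:

```python
def dense_step(params, moments, grads, lr=1e-2, b1=0.9):
    # Dense update: every parameter is touched; a zero-gradient parameter
    # still moves if its momentum is nonzero from earlier iterations.
    for i in range(len(params)):
        moments[i] = b1 * moments[i] + (1 - b1) * grads.get(i, 0.0)
        params[i] -= lr * moments[i]

def sparse_step(params, moments, grads, lr=1e-2, b1=0.9):
    # Sparse update: only indices present in `grads` are touched; the rest
    # keep both their value and their momentum state unchanged.
    for i, g in grads.items():
        moments[i] = b1 * moments[i] + (1 - b1) * g
        params[i] -= lr * moments[i]

params_d, mom_d = [1.0, 1.0], [0.5, 0.5]   # leftover momentum on both
params_s, mom_s = [1.0, 1.0], [0.5, 0.5]
grads = {0: 1.0}                           # only parameter 0 has a gradient

dense_step(params_d, mom_d, grads)
sparse_step(params_s, mom_s, grads)
print(params_d[1] < 1.0)    # True: dense step moved the zero-grad parameter
print(params_s[1] == 1.0)   # True: sparse step left it untouched
```

This is why packed alone should be a pure memory/speed optimization, while sparse_grad changes which parameters get optimizer updates and can therefore change training results.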

@99DHL (Author) commented Mar 30, 2025:

Got it, thanks for the clarification!
