Question about variance calculation in sampling step #18

Open
HIT-LiuChen opened this issue Mar 23, 2023 · 5 comments

@HIT-LiuChen

In Diffusion.py, line 66 computes posterior_var. Why does line 77 use torch.cat([self.posterior_var[1:2], self.betas[1:]]) to build the variance tensor instead of using posterior_var directly?

    def p_mean_variance(self, x_t, t):
        # below: only log_variance is used in the KL computations
        var = torch.cat([self.posterior_var[1:2], self.betas[1:]])    # I think var should just be self.posterior_var
        var = extract(var, t, x_t.shape)

        eps = self.model(x_t, t)
        xt_prev_mean = self.predict_xt_prev_mean_from_eps(x_t, t, eps=eps)

        return xt_prev_mean, var
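For anyone reading along: a minimal, self-contained sketch of what that torch.cat produces, compared with posterior_var itself. The linear beta schedule below is only illustrative; the actual values in Diffusion.py may differ.

    import torch

    T = 1000
    # Illustrative linear beta schedule (not necessarily Diffusion.py's values).
    betas = torch.linspace(1e-4, 0.02, T)
    alphas = 1.0 - betas
    alphas_bar = torch.cumprod(alphas, dim=0)
    alphas_bar_prev = torch.cat([torch.ones(1), alphas_bar[:-1]])

    # Posterior variance from the DDPM paper:
    #   tilde_beta_t = beta_t * (1 - alpha_bar_{t-1}) / (1 - alpha_bar_t)
    posterior_var = betas * (1.0 - alphas_bar_prev) / (1.0 - alphas_bar)

    print(posterior_var[0])                           # tensor(0.), since alpha_bar_prev[0] == 1
    var = torch.cat([posterior_var[1:2], betas[1:]])  # the tensor built in p_mean_variance
    print(var[0])                                     # posterior_var[1], a small positive value
    print(torch.allclose(var[1:], betas[1:]))         # True: every later entry is beta_t

So the concatenated tensor replaces the zero at index 0 with posterior_var[1] and uses beta_t everywhere else, rather than the posterior variance.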
@ljw919 commented May 9, 2023

I have the same doubt. What's the reason for doing this?

@kache1995

I have the same question. Shouldn't the variance be self.posterior_var?

@tsWen0309

Have you solved the problem? Can you fill me in?

@MetaInsight7

@zoubohao Could you answer this issue? Thank you!

@chenchen278

According to the DDPM paper, the variance should be either posterior_var or betas, so I don't understand why the two are concatenated here.
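One plausible explanation (not confirmed by the author): the DDPM paper says that both sigma_t^2 = beta_t and sigma_t^2 = tilde_beta_t (posterior_var) work, but posterior_var[0] is exactly zero, so its log is -inf at the final denoising step; swapping in posterior_var[1] at index 0 keeps the log-variance finite. A quick self-contained check with the same illustrative schedule as above:

    import torch

    T = 1000
    betas = torch.linspace(1e-4, 0.02, T)  # illustrative schedule, not Diffusion.py's
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)
    alphas_bar_prev = torch.cat([torch.ones(1), alphas_bar[:-1]])
    posterior_var = betas * (1.0 - alphas_bar_prev) / (1.0 - alphas_bar)

    print(torch.log(posterior_var)[0])           # -inf: posterior_var[0] == 0
    var = torch.cat([posterior_var[1:2], betas[1:]])
    print(torch.isfinite(torch.log(var)).all())  # tensor(True): log defined everywhere

If memory serves, the reference DDPM implementation clips the posterior log-variance in a similar spirit (replacing the first entry with posterior_variance[1]) precisely to avoid log(0).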
