```python
(1) hidden = self.latent2hidden(z)
(2) if self.bidirectional or self.num_layers > 1:
(3)     hidden = hidden.view(self.hidden_factor, batch_size, self.hidden_size)
(4) else:
(5)     hidden = hidden.unsqueeze(0)
```
I think line (3) should be
```python
hidden = hidden.view(batch_size, self.hidden_factor, self.hidden_size).transpose(0, 1)
```
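To illustrate the difference between the two reshapes, here is a small sketch using NumPy in place of PyTorch (the sizes are hypothetical, not from the issue). Both produce the same shape, `(hidden_factor, batch, hidden)`, but they place the elements differently: `latent2hidden` emits one row per sample, so reshaping straight to `(hidden_factor, batch, hidden)` mixes the layer and batch axes, while reshaping to `(batch, hidden_factor, hidden)` and then swapping the first two axes keeps each layer/direction slice aligned with the right samples.

```python
import numpy as np

# Hypothetical sizes for illustration.
batch_size, hidden_factor, hidden_size = 2, 2, 3

# latent2hidden produces a (batch, hidden_factor * hidden_size) tensor;
# each row holds one sample's states for all layers/directions.
flat = np.arange(batch_size * hidden_factor * hidden_size).reshape(
    batch_size, hidden_factor * hidden_size)

# Original unflatten: view(hidden_factor, batch, hidden)
a = flat.reshape(hidden_factor, batch_size, hidden_size)

# Proposed unflatten: view(batch, hidden_factor, hidden).transpose(0, 1)
b = flat.reshape(batch_size, hidden_factor, hidden_size).transpose(1, 0, 2)

print(a.shape == b.shape)    # True: both are (hidden_factor, batch, hidden)
print(np.array_equal(a, b))  # False: `a` mixes up the layer and batch axes
```

Note that in PyTorch the transposed tensor is non-contiguous, so a `.contiguous()` call may be needed before handing it to the RNN.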
This snippet of code appears in both forward and inference.
I tried it, but there is a dimensionality mismatch with your line...
I also encountered a problem with inference of a 2-layer model. The fix for me was changing these lines in the inference() code in model.py:
```python
if self.bidirectional or self.num_layers > 1:
    # unflatten hidden state
    hidden = hidden.view(self.hidden_factor, batch_size, self.hidden_size)

hidden = hidden.unsqueeze(0)
```
to
```python
if self.bidirectional or self.num_layers > 1:
    # unflatten hidden state
    hidden = hidden.view(self.hidden_factor, batch_size, self.hidden_size)

if self.num_layers == 1:
    hidden = hidden.unsqueeze(0)
```
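A shape-only sketch of why this guard matters, again with NumPy standing in for PyTorch and hypothetical sizes: for a 2-layer unidirectional model, the `view` already yields the 3-D `(num_layers, batch, hidden)` hidden state the RNN expects, so the unconditional `unsqueeze(0)` in the original code adds a fourth axis and breaks it. With the patched guard, `unsqueeze` only runs in the 1-layer case where no `view` happened.

```python
import numpy as np

# Hypothetical sizes: 2-layer, unidirectional model (not from the issue).
num_layers, batch_size, hidden_size = 2, 4, 8
hidden_factor = num_layers  # times 2 if bidirectional

flat = np.zeros((batch_size, hidden_factor * hidden_size))  # latent2hidden output

# view(hidden_factor, batch, hidden) already gives the 3-D state
hidden = flat.reshape(hidden_factor, batch_size, hidden_size)

# Original inference(): unsqueeze runs unconditionally after the view
broken = hidden[np.newaxis]  # shape (1, 2, 4, 8): 4-D, rejected by the RNN

# Patched inference(): unsqueeze only when no view happened
if num_layers == 1:
    hidden = hidden[np.newaxis]

print(broken.shape)  # (1, 2, 4, 8)
print(hidden.shape)  # (2, 4, 8): (num_layers, batch, hidden), as expected
```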