I ran into an issue when I wanted to decode the outputs of a sequential model. Kaldi fails to open the written .ark files for decoding and throws the following error:
ERROR (latgen-faster-mapped-parallel[5.5.646~1-cdf2]:DecodableMatrixScaledMapped():decoder/decodable-matrix.h:55) DecodableMatrixScaledMapped: mismatch, matrix has 1 cols but transition-model has 1992 pdf-ids.
I traced it down to the following line in core.py (pytorch-kaldi/core.py, line 663 at 775f5db), where the out_save array still contains the singleton batch dimension from the forwarded sequential model. Normally this does not happen, since PyTorch-Kaldi models implement e.g. the softmax as a separate model of its own; in my case the softmax is included in the model class of the sequential model, and that is where it goes wrong.
I have a small fix that squeezes the redundant dimension out of out_save with np.squeeze, but it still has to be tested before I can open a pull request.
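A minimal sketch of the fix, assuming out_save is the NumPy array written to the .ark file at that point in core.py and that the extra axis is the size-1 batch dimension kept by the sequential model (the exact shape is an assumption):

```python
import numpy as np

# Assumed shape: the forwarded sequential model keeps a size-1 batch axis,
# e.g. (num_frames, 1, num_pdfs), so the Kaldi writer ends up with a matrix
# whose column count does not match the number of pdf-ids (1992 here).
# Squeezing the singleton axes restores the expected 2-D
# (num_frames, num_pdfs) layout before the .ark file is written.
out_save = np.squeeze(out_save)
```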