This repository has been archived by the owner on Feb 12, 2022. It is now read-only.

ONNX LSTM-LM Export Fails #94

Open
abrarmatin opened this issue Jan 28, 2019 · 0 comments

Comments

@abrarmatin

I want to export a protobuf of the LSTM-LM in ONNX, but the export fails with this error:
"ValueError: Auto nesting doesn't know how to process an input object of type float. Accepted types: Tensors, or lists/tuples of them"

I modeled my export function after the word_language_model example:

def export_onnx(path, batch_size, seq_len):
    print('The model is also exported in ONNX format at {}'.format(
        os.path.realpath(args.onnx_export)))
    model.eval()
    dummy_input = torch.LongTensor(seq_len * batch_size).zero_().view(-1, batch_size).to(device)
    hidden = model.init_hidden(batch_size)
    torch.onnx.export(model, (dummy_input, hidden), path)
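For context, a minimal sketch of a possible workaround, assuming the error comes from bare Python scalars somewhere in the nested hidden state (torch.onnx.export flattens its inputs and only accepts Tensors or nested lists/tuples of Tensors). The helper wrap_scalars and the to_tensor callback below are hypothetical, not part of the repository's API:

```python
# Hypothetical helper: walk a nested list/tuple structure and replace any
# bare Python int/float with a tensor, so that ONNX export's auto-nesting
# only ever sees Tensors or lists/tuples of them.
def wrap_scalars(obj, to_tensor):
    """Recursively replace bare floats/ints with to_tensor(value)."""
    if isinstance(obj, (list, tuple)):
        # Rebuild the same container type with converted children.
        return type(obj)(wrap_scalars(x, to_tensor) for x in obj)
    if isinstance(obj, (int, float)) and not isinstance(obj, bool):
        return to_tensor(obj)
    return obj

# Usage sketch (with PyTorch, one might pass torch.tensor as to_tensor):
# hidden = wrap_scalars(model.init_hidden(batch_size), torch.tensor)
# torch.onnx.export(model, (dummy_input, hidden), path)
```

This only helps if the hidden state really does contain stray scalars; if the failure is the version issue referenced below, upgrading is the simpler fix.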

Searching online turns up a similar issue suggesting the bug is fixed in the newest ONNX release: OpenNMT/OpenNMT-py#638

Is it possible to get a protobuf of the AWD-LSTM LM?
