Model not compatible with TorchScript conversion (via torch.jit.script) #15
Unfortunately, I have no plans to support TorchScript at the moment. I'm also not convinced that's the only problem; for example, the computation of "same" padding could raise an error. If that's the case, I suspect it would also be necessary to retrain the models from scratch in PyTorch.
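For context on the "same" padding concern: TensorFlow-style "same" padding is typically computed from the input shape at run time. The following is a hypothetical sketch of that pattern for the 1-D case, not the repository's actual code:

```python
import torch
import torch.nn.functional as F


def same_pad_1d(x: torch.Tensor, kernel: int, stride: int) -> torch.Tensor:
    # Pad the last dimension so that out_len == ceil(in_len / stride),
    # mirroring TensorFlow's "same" padding convention.
    in_len = x.shape[-1]
    out_len = (in_len + stride - 1) // stride
    total = max((out_len - 1) * stride + kernel - in_len, 0)
    # Split the padding between the two sides (extra unit on the right).
    return F.pad(x, (total // 2, total - total // 2))


x = torch.randn(1, 4, 10)
y = same_pad_1d(x, kernel=3, stride=1)  # padded from length 10 to 12
```

Because the padding amount depends on the input shape, whether such code survives `torch.jit.script` depends on how those shapes are manipulated.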
The type hint issue was easy to get around but even more problematic is the use of
It looks like tracing may be compatible, though it is more restrictive; see the related einops issue: arogozhnikov/einops#115
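As a rough illustration of the tracing route (on a generic module, not MoViNet itself): `torch.jit.trace` records the operations executed for one example input, so it sidesteps static type-checking of annotations, but it freezes any data-dependent control flow at trace time:

```python
import torch
from torch import nn


class TinyNet(nn.Module):
    """Minimal stand-in module used only to demonstrate tracing."""

    def __init__(self) -> None:
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.conv(x))


model = TinyNet().eval()
example = torch.randn(1, 3, 32, 32)
# trace() records only the path taken for this example input;
# branches that depend on tensor values would be baked in.
traced = torch.jit.trace(model, example)
out = traced(example)
```

This is why tracing is "more restrictive": a model whose forward pass branches on input values (or shapes) can silently produce a graph that is only valid for inputs resembling the example.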
Is there any solution to this problem?
I have a fork that modifies the models so that they can be exported to TorchScript or ONNX.
It is not possible to convert the model to TorchScript using the function `torch.jit.script`. In particular, the code returns an error because of the usage of `...` in this line:

MoViNet-pytorch/movinets/models.py, line 276 in c2d1edf

Even after changing the type-hint definition to overcome this problem, the conversion is not possible because the attribute `activation` is initialized as `None` and then filled with a `Tensor`.
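For reference, the usual way around the None-then-Tensor pattern in scriptable modules is to declare the attribute as `Optional[Tensor]` up front. A minimal sketch of the pattern, assuming a toy module rather than the actual MoViNet code:

```python
from typing import Optional

import torch
from torch import Tensor, nn


class ScriptableBlock(nn.Module):
    # Declaring the attribute type here tells TorchScript it may be
    # None at construction and a Tensor later.
    activation: Optional[Tensor]

    def __init__(self) -> None:
        super().__init__()
        self.activation = None

    def forward(self, x: Tensor) -> Tensor:
        act = self.activation
        if act is None:  # TorchScript refines Optional[Tensor] -> Tensor here
            act = torch.zeros_like(x)
        self.activation = x  # stash the input for the next call
        return x + act


scripted = torch.jit.script(ScriptableBlock())  # compiles without error
out = scripted(torch.ones(2, 3))
```

The explicit annotation plus the `is None` refinement is what lets the compiler accept an attribute that changes from `None` to a `Tensor` across calls.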