Torchscript compatibility #83
Comments
It seems to be possible to overcome the error reported above by modifying the code in https://github.com/google-deepmind/tapnet/blob/main/torch/nets.py#L225-L233 so that it delegates to a `DummyModel`, where `DummyModel` is a dummy module. But then a further error appears, which can be resolved by replacing the offending code.
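A minimal sketch of the dummy-module workaround described above (all class names here are hypothetical stand-ins, not the actual tapnet code): a submodule whose data-dependent control flow would be frozen by tracing is swapped for a branch-free dummy with the same interface before `torch.jit.trace` is called.

```python
import torch
import torch.nn as nn

class Inner(nn.Module):
    # Stand-in for a submodule whose data-dependent branch is problematic
    # for tracing: the tracer would record only one path.
    def forward(self, x):
        if bool(x.sum() > 0):
            return x * 2
        return x

class DummyInner(nn.Module):
    # Branch-free replacement exposing the same interface.
    def forward(self, x):
        return x * 2

class Wrapper(nn.Module):
    def __init__(self, inner):
        super().__init__()
        self.inner = inner

    def forward(self, x):
        return self.inner(x)

model = Wrapper(Inner())
model.inner = DummyInner()  # swap in the dummy before tracing
traced = torch.jit.trace(model, torch.ones(3))
print(traced(torch.ones(3)))  # tensor([2., 2., 2.])
```

The trade-off is that the dummy must be numerically equivalent on the inputs you care about; tracing then sees a single straight-line graph.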
Hi, thanks for raising the issue. Prior to release we were also able to trace the model using the same method you described, but after testing it actually showed very little performance increase when used. Can I ask what the use case is for scripting here? Thanks
Sorry, I am familiar with scripting; I'm just trying to figure out what the use case is here. Since the model is compatible with torch.compile, this seems unnecessary. Thanks
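For context, a minimal sketch of the `torch.compile` path mentioned above (the `nn.Linear` toy model here is illustrative, not TAPIR; `backend="eager"` is used so the sketch runs without a codegen backend):

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 2)

# torch.compile wraps the module without requiring TorchScript;
# Python-level control flow is handled by the compiler at runtime.
compiled = torch.compile(net, backend="eager")

out = compiled(torch.randn(1, 4))
print(out.shape)  # torch.Size([1, 2])
```

Unlike scripting, however, the compiled module stays a Python object, which is exactly why it does not cover the C++ deployment case discussed below.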
@sgjheywa, the use case is LibTorch integration in C++. The model can be compiled with …
Hello! May I ask if you have implemented model training for the TAPIR PyTorch version?
While making the torch TAPIR model compatible with Torchscript tracing is easy by changing `TAPIR.forward()` in https://github.com/google-deepmind/tapnet/blob/main/torch/tapir_model.py#L196-L209 from its current form to a modified one (assuming it is OK to eliminate the `unrefined_` outputs), so that tracing succeeds, it is not so easy to make it Torchscript scripting compatible: scripting fails with an error.

How to make the model Torchscript scripting compatible?
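The gap between the two modes that the question above runs into can be shown on a toy module (illustrative, not the TAPIR code): tracing records a single execution path and silently bakes it in, while scripting compiles the control flow itself, but only accepts the subset of Python that TorchScript supports.

```python
import torch
import torch.nn as nn

class Branchy(nn.Module):
    def forward(self, x):
        # Data-dependent control flow: preserved by scripting,
        # frozen to one branch by tracing.
        if bool(x.sum() > 0):
            return x * 2
        return x * -1

m = Branchy()
traced = torch.jit.trace(m, torch.ones(3))  # records only the positive branch
scripted = torch.jit.script(m)              # compiles both branches

neg = -torch.ones(3)
print(traced(neg))    # wrong branch baked in: tensor([-2., -2., -2.])
print(scripted(neg))  # correct: tensor([1., 1., 1.])
```

This is why making a model "scriptable" is harder than making it traceable: every type annotation, container, and Python construct in `forward()` must fall within the TorchScript language subset, rather than merely executing once without error.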