My model inference is taking almost as long as training. Is this normal? #4254
Replies: 1 comment 2 replies
How do you measure inference time? Are you making sure the execution queue is fully executed? Also, are you using the autodiff backend even during inference?
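To illustrate the two measurement pitfalls above: time only the forward pass after a warm-up run, and synchronize the backend before stopping the timer. The sketch below uses only the Rust standard library; `forward_pass` is a hypothetical stand-in for your model's forward call, and the `B::sync(&device)` mentioned in the comment is how Burn exposes backend synchronization (check the Burn docs for your version, as this API has changed across releases).

```rust
use std::time::Instant;

// Hypothetical stand-in for the model forward pass; with Burn this would be
// model.forward(input) on a non-autodiff backend (e.g. after model.valid()).
fn forward_pass(x: &[f32]) -> f32 {
    x.iter().map(|v| v * v).sum()
}

fn main() {
    let input = vec![1.0_f32; 1_000_000];

    // Warm-up: the first call often pays one-time costs (kernel compilation,
    // allocations), so exclude it from the measurement.
    let _ = forward_pass(&input);

    let start = Instant::now();
    let out = forward_pass(&input);
    // With an async backend (e.g. wgpu), synchronize here before reading the
    // timer — roughly B::sync(&device) in Burn — otherwise you only measure
    // how long it takes to *submit* the work, not to execute it.
    let elapsed = start.elapsed();

    println!("output = {out}, inference took {elapsed:?}");
}
```

Averaging over several timed runs after the warm-up gives a more stable number than a single measurement.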
2 replies

I followed the Basic Workflow: From Training to Inference guide from the Burn Book and finished training my model. But when I ran inference, I found the inference time was almost the same as the training time. Since this is my first time using Burn and training a model, I'm not sure whether this is expected.
My code matches the example on GitHub. Also, I never saw the training dashboard during training.
experiment.log

config.json