I compared the time it took to train the models using 2 GPUs vs. using 1 GPU, and the result was that training with 2 GPUs is much slower. In fact, training with 2 GPUs takes at least twice as long as using a single GPU. What is happening? What is wrong?
I have looked at the messages displayed after every iteration, and although the "data" time does not vary with respect to the single-GPU case, the "time" time is at least twice as large in the 2-GPU case.
"data" time: the time it takes to load the data.
"time" time: the time it takes to do a whole iteration, including loading the data and the forward and backward passes.
Disclaimer: these confusing terms are the ones used in the code.
The comparisons have been made using the same hardware configurations.
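For reference, here is a minimal sketch of how such per-iteration meters are commonly computed in a PyTorch-style training loop. The names (data_time, batch_time) and the loop structure are illustrative assumptions, not taken from this repository:

```python
import time
import torch

def train_one_epoch(model, loader, optimizer, criterion, device):
    """Hypothetical training loop showing how 'data' and 'time' are often measured."""
    model.train()
    end = time.time()
    for images, targets in loader:
        # "data" time: how long the DataLoader took to yield this batch
        data_time = time.time() - end

        images, targets = images.to(device), targets.to(device)
        loss = criterion(model(images), targets)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # "time" time: the whole iteration, including data loading and
        # the forward and backward passes. Note that CUDA ops are
        # asynchronous, so accurate GPU timing may require a sync here.
        if device.type == "cuda":
            torch.cuda.synchronize()
        batch_time = time.time() - end
        end = time.time()

        print(f"time {batch_time:.3f}s  data {data_time:.3f}s  loss {loss.item():.4f}")
```

With meters like these, a "time" that roughly doubles while "data" stays flat would point at the forward/backward/communication portion of the iteration rather than at data loading.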