
GPUs and GPU usage #39

Hi.

I am sorry for the late reply.

  1. How many GPUs do you use to train the pre-trained models provided in the README (especially the BigGAN-2048 on ImageNet)?
    =>
    Models trained on CIFAR10: 1 GPU (2080Ti, RTX-TITAN, V100, A100, etc.)
    Models trained on Tiny_ImageNet: RTX-TITAN x 1 (from DCGAN to SAGAN), RTX-TITAN x 4 (from BigGAN to ContraGAN + ADA)
    Models trained on ImageNet: V100 32GB x 4 with Sync_BN (from SNGAN to BigGAN with batch size 256), V100 32GB x 8 with Sync_BN and DP (BigGAN with batch size 2048; training takes almost a month)

  2. I'm finding that GPU utilization is quite low when using multiple GPUs.
    => Yes, that is likely because you are training the model with DataParallel (DP); see the sketch after this list.
    If you train a model using Dist…
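
For anyone hitting the same issue, below is a minimal sketch of moving from DataParallel to DistributedDataParallel (DDP) with SyncBatchNorm in plain PyTorch. It is not StudioGAN's actual training script; the toy model, dataset, and hyperparameters are placeholders.

```python
# Minimal sketch: one process per GPU with DistributedDataParallel (DDP)
# instead of DataParallel (DP). Not StudioGAN's launcher; model, dataset,
# and hyperparameters below are placeholders.
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def run(rank, world_size):
    # One process per GPU; NCCL is the usual backend for multi-GPU training.
    os.environ.setdefault("MASTER_ADDR", "localhost")
    os.environ.setdefault("MASTER_PORT", "12355")
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    # Toy model (placeholder). Converting BatchNorm to SyncBatchNorm gives the
    # "Sync_BN" behaviour mentioned above: statistics are shared across GPUs.
    model = torch.nn.Sequential(
        torch.nn.Conv2d(3, 64, 3, padding=1),
        torch.nn.BatchNorm2d(64),
        torch.nn.ReLU(),
    ).to(rank)
    model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)
    model = DDP(model, device_ids=[rank])

    # DistributedSampler gives each process its own shard of the data, so no
    # GPU sits idle waiting on a master process (the usual DP bottleneck).
    data = TensorDataset(torch.randn(256, 3, 32, 32))
    sampler = DistributedSampler(data, num_replicas=world_size, rank=rank)
    loader = DataLoader(data, batch_size=32, sampler=sampler)

    opt = torch.optim.Adam(model.parameters(), lr=2e-4)
    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards every epoch
        for (x,) in loader:
            loss = model(x.to(rank)).mean()  # dummy loss for illustration
            opt.zero_grad()
            loss.backward()
            opt.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    n_gpus = torch.cuda.device_count()
    mp.spawn(run, args=(n_gpus,), nprocs=n_gpus)
```

The same per-GPU-process setup can also be launched with torchrun instead of mp.spawn; either way, the point is that each GPU runs its own process and only exchanges gradients, instead of funneling everything through a single master GPU as DP does.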
