
Can you release the training config for MobileNeXt-0.75? Failing to reproduce #8

Open
BluebirdStory opened this issue Jun 23, 2022 · 3 comments


BluebirdStory commented Jun 23, 2022

Can you release the training config for MobileNeXt-0.75? I failed to reproduce it. Here are my training settings:

1. SGD with momentum (m = 0.9), cosine LR schedule, initial LR = 0.1, batch size = 256 on a single V100 GPU, weight decay = 1e-4, 240 epochs
2. Label smoothing (eps = 0.1)
3. Common data augmentation (RandomResizedCrop + ColorJitter + RandomFlip)
4. LR warmup exactly following the settings you mentioned in the paper, and cool-down epochs as well

I tried dozens of times, and the top-1 accuracy just hangs around 69.5% and never breaks 70%, let alone the 72% reported in the paper.
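To make items 1 and 4 concrete, here is a minimal sketch of the schedule I use. The warmup and cool-down lengths (5 and 10 epochs) and the warmup starting LR are my own choices, since the paper does not state them:

```python
import math

def lr_at_epoch(epoch, base_lr=0.1, total_epochs=240,
                warmup_epochs=5, cooldown_epochs=10, warmup_start_lr=1e-5):
    """Cosine LR with linear warmup and a final cool-down phase.

    warmup_epochs / cooldown_epochs are guesses -- the paper only says
    these phases are used, not how long they are.
    """
    decay_epochs = total_epochs - warmup_epochs - cooldown_epochs
    if epoch < warmup_epochs:
        # linear warmup from warmup_start_lr up to base_lr
        return warmup_start_lr + (base_lr - warmup_start_lr) * epoch / warmup_epochs
    if epoch < warmup_epochs + decay_epochs:
        # cosine decay over the main training phase
        t = (epoch - warmup_epochs) / decay_epochs
        return 0.5 * base_lr * (1.0 + math.cos(math.pi * t))
    # cool-down: hold the final value of the cosine phase
    return 0.5 * base_lr * (1.0 + math.cos(math.pi))
```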

Actually, I am currently researching light-weight architecture design, and I don't know whether I should report the performance of MobileNeXt-0.75 based on my own experiments or the numbers reported in the paper.
Many thanks.

@BluebirdStory
Author

I even tried AdamW and RandAugment, and I also tried decreasing weight decay when stronger data augmentation is used, but I just can't break 70%.

@BluebirdStory
Author

Alright, I figured it out using some settings not mentioned in the paper. The MobileNeXt-0.75 I trained finally achieves 72.3% top-1 accuracy, slightly better than the 72% in the paper.

@Newbie-Tom

> Alright, I figured it out using some settings not mentioned in the paper. The MobileNeXt-0.75 I trained finally achieves 72.3% top-1 accuracy, slightly better than the 72% in the paper.

Hey, could you share the pretrained weights and training settings?
