
Non-convergence with SoftmaxLoss when num_output becomes ~800k #568

Open
peyer opened this issue May 30, 2019 · 0 comments

peyer commented May 30, 2019

I ran into a problem while training a model for face recognition: when I increase num_output to 800k, the probability of each category becomes 1.0/800k and never changes!
I tried the same network and optimization parameters with BVLC Caffe, and it converged successfully after 8k iterations.
What should I do?
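
A minimal sketch (my own illustration, not from the issue) of why the reported probability is exactly 1.0/800k at the start: with the usual small-variance weight initialization, all 800k logits are near zero, so softmax outputs a near-uniform distribution. The symptom described here is that training never moves away from that uniform starting point.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

num_output = 800_000
# Small-variance init => logits near zero => near-uniform probabilities.
rng = np.random.default_rng(0)
logits = rng.normal(0.0, 0.01, size=num_output)
probs = softmax(logits)

# Every class starts at roughly 1/num_output = 1.25e-06.
print(probs.min(), probs.max(), 1.0 / num_output)
```

This is expected at iteration 0; the bug is that the probabilities stay there, which usually points at vanishing gradients, a too-small learning rate for the much larger classifier layer, or a loss-scaling difference between the two frameworks.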
