Custom cost function is not being used for training #218
Comments
I guess you need this: http://mxnet.io/how_to/new_op.html
Note that …
Just discovered that there is a …
Thanks a lot! Is it possible to create the new operators with Julia? I have tried …
I have the same issue! Is there really no simpler way?
Seems no wrapper for creating …
Ohh, what a pity :(
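The how-to linked above describes defining a new operator by implementing its forward and backward passes yourself. As a rough, framework-free sketch of that idea (the class and method names below are illustrative only, not MXNet's actual API):

```python
import numpy as np

class EuclideanLossOp:
    """Illustrative custom loss operator: forward passes the predictions
    through unchanged; backward returns the gradient of
    0.5 * ||pred - label||^2 with respect to the predictions."""

    def forward(self, pred):
        # The forward pass of a loss layer typically just emits the
        # predictions; the loss value itself is only needed for reporting.
        return pred

    def backward(self, pred, label):
        # d/dpred of 0.5 * ||pred - label||^2 = pred - label
        return pred - label

op = EuclideanLossOp()
pred = np.array([1.0, 2.0])
label = np.array([0.0, 2.0])
print(op.backward(pred, label))  # gradient used for the weight update
```

It is this `backward` gradient, not any evaluation metric, that drives training.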
Hi.
I have implemented my own cost function, but I have realized it is only being used for printing, not for calculating the error for the gradient descent; in fact, the metric being used during training is the default one (accuracy).

The reason I say this is that I tried always returning the value 1 (return [(:EuclideanDist, 1)]) in the mx.get method, and I still get exactly the same results, when in that case the model should not be able to learn anything at all.

Thanks for your help.
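The behaviour described above, where the metric is only reported while the gradient comes from the network's output layer, can be shown with a minimal, framework-free NumPy sketch (all names here are hypothetical, not MXNet.jl's API):

```python
import numpy as np

def constant_metric(pred, label):
    # Analogous to always returning [(:EuclideanDist, 1)] from mx.get:
    # a bogus metric that reports 1 no matter what.
    return 1.0

def train(num_steps=200, lr=0.1):
    rng = np.random.RandomState(0)
    X = rng.randn(100, 3)
    true_w = np.array([1.5, -2.0, 0.5])
    y = X @ true_w
    w = np.zeros(3)
    for _ in range(num_steps):
        pred = X @ w
        # The update uses the gradient of the squared-error OUTPUT layer,
        # not the metric: grad = X^T (pred - y) / n
        grad = X.T @ (pred - y) / len(y)
        w -= lr * grad
        _ = constant_metric(pred, y)  # only reported, never used for updates
    return w, float(np.mean((X @ w - y) ** 2))

w, mse = train()
print(mse)  # the model still converges despite the constant "metric"
```

Training succeeds even with a metric that always reports 1, which matches the observation in the issue: swapping the metric changes only what is printed, not what is learned.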