Non-trainable parameters? #921
I'm not entirely sure, but I believe you can create a wrapper structure that defines how the
I would like to set up a network in which all of the parameters of one of the linear layers are hard-coded and do not change during training. In other libraries such as PyTorch, one can do this by clearing the requires_grad flag on the parameters one wishes to hold fixed. I can't find any equivalent in the dfdx documentation, nor any mention of the term "non-trainable" or similar. Does dfdx support this at all? If so, how does one set this up?
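For reference, the PyTorch pattern described above looks roughly like this (a minimal sketch; the layer sizes and values are arbitrary placeholders):

```python
import torch
import torch.nn as nn

# Two-layer network; we want the first linear layer's parameters fixed.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Hard-code the first layer's parameters, then freeze them by clearing
# requires_grad so autograd never computes gradients for them.
with torch.no_grad():
    net[0].weight.fill_(0.5)
    net[0].bias.zero_()
for p in net[0].parameters():
    p.requires_grad = False

# Only hand the trainable parameters to the optimizer.
opt = torch.optim.SGD(
    (p for p in net.parameters() if p.requires_grad), lr=0.1
)

# One training step: the frozen layer's parameters are left unchanged.
loss = net(torch.randn(16, 4)).pow(2).mean()
loss.backward()
opt.step()
assert torch.all(net[0].weight == 0.5)
```

The equivalent in dfdx would need whatever mechanism that crate provides for excluding parameters from gradient updates, which is exactly what the question is asking about.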