
How to fix some layers for transfer learning? #1706

Answered by matthias-wright
wztdream asked this question in Q&A

Hi @wztdream,

note that optax is now the recommended optimizer API (see here).

You can freeze a subset of your params by using optax.multi_transform.

One way of achieving this is to create a mask over your params that assigns one label to trainable parameters and another label to frozen parameters.

Consider this simple parameter tree:

 params
   frozen1
     kernel
     bias
   trainable2
     kernel
     bias
   trainable3
     kernel
     bias
   trainable4
     kernel
     bias

We want to freeze the params of the layers whose names start with frozen, so we assign them the label zero (the name is arbitrary). The other parameters are assigned the label adam (also arbitrary):

 params
   frozen1
     kernel: zero
     bias: zero
   trainable2
     kernel: adam
     bias: adam
   trainable3
     kernel: adam
     bias: adam
   trainable4
     kernel: adam
     bias: adam
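
Here is a minimal sketch of that labeling with optax.multi_transform, assuming a Flax-style nested params dict shaped like the tree above; the array shapes, the learning rate, and the helper name `label_params` are illustrative, not part of the original answer:

```python
import jax
import jax.numpy as jnp
import optax
from flax import traverse_util

# Toy params tree matching the structure above (shapes are arbitrary).
params = {
    'frozen1':    {'kernel': jnp.ones((3, 3)), 'bias': jnp.zeros(3)},
    'trainable2': {'kernel': jnp.ones((3, 3)), 'bias': jnp.zeros(3)},
    'trainable3': {'kernel': jnp.ones((3, 3)), 'bias': jnp.zeros(3)},
    'trainable4': {'kernel': jnp.ones((3, 3)), 'bias': jnp.zeros(3)},
}

def label_params(params):
    # Label each leaf by its top-level layer name: 'zero' for frozen
    # layers, 'adam' for everything else.
    flat = traverse_util.flatten_dict(params)
    labels = {
        path: 'zero' if path[0].startswith('frozen') else 'adam'
        for path in flat
    }
    return traverse_util.unflatten_dict(labels)

# 'zero'-labeled leaves get zero updates (i.e. they stay frozen);
# 'adam'-labeled leaves are updated by Adam.
tx = optax.multi_transform(
    {'zero': optax.set_to_zero(), 'adam': optax.adam(learning_rate=1e-3)},
    label_params(params),
)
opt_state = tx.init(params)

# In a train step, the frozen leaves then receive all-zero updates:
grads = jax.tree_util.tree_map(jnp.ones_like, params)  # stand-in for real grads
updates, opt_state = tx.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```

Because optax.set_to_zero produces zero updates rather than skipping the leaves, the frozen params keep their pretrained values while still flowing through the same tx.update / optax.apply_updates step as everything else.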

Replies: 2 comments · 9 replies

Answer selected by wztdream