
Default loop output transforms are too intrusive #362

Open
seanmor5 opened this issue Sep 17, 2022 · 1 comment · May be fixed by #573
Labels
area:loop Applies to loop API kind:chore Internal Improvements

Comments

@seanmor5
Contributor

For supervised training loops, we provide a convenience output transform which ensures only the model state is returned from the training loop. This means you always lose the rest of the training state, which might be of interest later on.

I propose instead that we return a tuple `{loop_state, transformed_state}`, which always includes the whole loop state as well as the transformed version. That way you never accidentally lose the entire state.
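Under this proposal, a training run might look like the following sketch (the tuple return is the proposed behavior, not what Axon does today; the loss and optimizer arguments are just illustrative):

```elixir
# Proposed: run returns both the untouched loop state and the
# transformed result (today, only the transformed model state).
{loop_state, model_state} =
  model
  |> Axon.Loop.trainer(:mean_squared_error, :adam)
  |> Axon.Loop.run(data, %{}, epochs: 5)

# loop_state retains metrics, epoch counters, step state, etc.,
# so nothing is accidentally lost; model_state is the convenience view.
```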

@seanmor5 seanmor5 added kind:chore Internal Improvements area:loop Applies to loop API labels Sep 20, 2022
@josevalim
Contributor

My suggestion is to get rid of output_transform altogether. After all, anyone can transform the output by piping an operation after it.
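For example (a sketch; it assumes the final loop state is an `Axon.Loop.State` whose `step_state` map holds the model state under `:model_state`, which may differ from the actual shape):

```elixir
# Sketch: with no built-in output_transform, the caller extracts
# whatever they need by piping after Axon.Loop.run/4.
model
|> Axon.Loop.trainer(:mean_squared_error, :adam)
|> Axon.Loop.run(data, %{}, epochs: 5)
|> then(& &1.step_state[:model_state])
```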

👍 for the trainer returning {model_state, loop_state} though. The user can even access other metadata inside state.step_state. If you want, you can even add more structure by defining a TrainerStep struct, which you then place as the step_state.

Perhaps it is best to make these changes sooner rather than later, since they are breaking?
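The extra structure mentioned above could look something like this sketch (the module name and fields are hypothetical, not Axon's actual layout):

```elixir
# Hypothetical struct giving the supervised trainer's step_state a
# fixed, documented shape; fields are illustrative only.
defmodule Axon.Loop.TrainerStep do
  defstruct [:model_state, :optimizer_state, :loss]
end
```

The trainer would then populate a `%Axon.Loop.TrainerStep{}` as `state.step_state` and return `{model_state, loop_state}`.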

@seanmor5 seanmor5 linked a pull request May 11, 2024 that will close this issue