Support for TF 2.16 #2

Open
j-woz opened this issue Nov 8, 2024 · 4 comments · May be fixed by #3
Comments

@j-woz
Contributor

j-woz commented Nov 8, 2024

No description provided.

j-woz added a commit that referenced this issue Nov 8, 2024
@j-woz
Contributor Author

j-woz commented Nov 8, 2024

optimizer.lr disappeared from the Keras API in TF 2.16, breaking Uno. This seems to work for both TF 2.15 and 2.16.

Should we do it like this, or simply refer to params["learning_rate"]?
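One way to express the first option is a small compatibility accessor (a sketch, not the actual commit; `get_learning_rate` is a hypothetical helper name, not a function in Uno):

```python
def get_learning_rate(optimizer):
    """Return the optimizer's learning rate across TF versions.

    Keras 3 (bundled with TF 2.16) removed the `optimizer.lr` alias;
    `optimizer.learning_rate` exists in both TF 2.15 and 2.16, so we
    prefer it and fall back to the legacy alias only when needed.
    """
    if hasattr(optimizer, "learning_rate"):
        return optimizer.learning_rate
    return optimizer.lr  # legacy alias, TF <= 2.15 only
```

This keeps a single call site working on both versions without branching on the TF version string.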

@rajeeja
Contributor

rajeeja commented Nov 23, 2024

Using params["learning_rate"] directly feels like a more solid, future-proof approach, since it keeps the code in line with newer TensorFlow versions and avoids relying on attributes that have been removed. What do you think about adding version checks to handle both TensorFlow 2.15 and 2.16? That way, we could switch between optimizer.learning_rate and params["learning_rate"] as needed.
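The version check could be as simple as comparing the major/minor components of tf.__version__ (a sketch; `keras3_or_newer` is a hypothetical helper, and how Uno would wire the two paths together is an assumption):

```python
def keras3_or_newer(tf_version):
    """True when this TF release bundles Keras 3 (TF >= 2.16),
    i.e. when the `optimizer.lr` alias no longer exists."""
    major, minor = (int(part) for part in tf_version.split(".")[:2])
    return (major, minor) >= (2, 16)

# In Uno this would be called as keras3_or_newer(tf.__version__);
# when True, read the rate from params["learning_rate"] rather
# than from the removed optimizer.lr attribute.
```

Splitting on "." and taking the first two components also tolerates pre-release strings like "2.16.0-rc0".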

@rajeeja
Contributor

rajeeja commented Nov 23, 2024

I'll test this out, create a PR, and reference this issue.

@j-woz
Contributor Author

j-woz commented Nov 25, 2024

Sounds good.
