
Change trainer.py inside engine #12673

Open
Jessica-hub opened this issue May 14, 2024 · 1 comment

Labels
question Further information is requested

Comments

@Jessica-hub
Search before asking

Question

Hi, I'm currently trying to use a new optimizer for YOLO training. Can I change the build_optimizer function in trainer.py to add the optimizer? Will this affect how YOLO runs? If I cannot, what would be the best way to add the optimizer?

Additional

No response

@Jessica-hub Jessica-hub added the question Further information is requested label May 14, 2024
@glenn-jocher
Member

Hello! 👋

Yes, you can modify the build_optimizer function in trainer.py to integrate a new optimizer into your YOLO training. As long as the new optimizer is implemented correctly and is compatible with the model's parameters, the change should not negatively affect training.

Here's a basic example of how you might modify the function:

def build_optimizer(self):
    if self.optimizer_name == 'new_optimizer':
        # NewOptimizer is a placeholder for your custom optimizer class
        return NewOptimizer(self.model.parameters(), lr=self.lr, ...)
    else:
        # fall back to an existing optimizer such as SGD
        return torch.optim.SGD(self.model.parameters(), lr=self.lr, ...)

Just ensure that your new optimizer adheres to the expected interface and that any specific requirements of the optimizer are met. If you encounter any issues, feel free to share them here!
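For reference, "adhering to the expected interface" essentially means subclassing torch.optim.Optimizer and implementing step(), so the trainer can call it like any built-in optimizer. Here's a minimal sketch of such a class (the SignSGD name and its update rule are just illustrative, not anything from the Ultralytics codebase):

```python
import torch


class SignSGD(torch.optim.Optimizer):
    """Toy optimizer: moves each parameter by lr * sign(grad)."""

    def __init__(self, params, lr=0.01):
        super().__init__(params, defaults={"lr": lr})

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    # descend along the sign of the gradient
                    p.add_(p.grad.sign(), alpha=-group["lr"])
        return loss


# quick sanity check on a single scalar parameter
w = torch.nn.Parameter(torch.tensor([2.0]))
opt = SignSGD([w], lr=0.1)
loss = (w ** 2).sum()
loss.backward()
opt.step()
print(round(w.item(), 4))  # 2.0 - 0.1 * sign(4.0) = 1.9
```

Because it follows the standard Optimizer interface (param_groups, step, zero_grad inherited from the base class), an instance returned from build_optimizer will work with the rest of the training loop unchanged.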

Happy coding! 🚀
