Does learning rate apply to linear booster? #11106
There are some hints throughout the R examples for linear boosters, such as in the docs for xgb.cb.gblinear.history, which mention things like:

xgboost/R-package/R/callbacks.R, line 984 in 543b57f

The examples then pass learning_rate (previously eta) in the parameters:

xgboost/R-package/R/callbacks.R, line 990 in 543b57f

... but according to the docs about parameters, eta and learning_rate are for tree-based boosters, or at least that's how they are documented, since they aren't listed as parameters for the linear booster:

xgboost/doc/parameter.rst, line 78 in 543b57f

Does the linear booster use learning_rate / eta if passed under params?

Comments
Yes, it's used.

xgboost/src/linear/updater_coordinate.cc, line 57 in 543b57f
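As a rough illustration of where a learning rate enters a coordinate-descent update, here is a minimal, self-contained R sketch for a squared-error objective with an L2 penalty. It is not the actual C++ in updater_coordinate.cc (which works on XGBoost's gradient/Hessian buffers and handles elastic-net regularisation); it only shows the role eta plays in each per-coordinate step:

```r
# One shrunk coordinate-descent step: the proposed Newton step for a single
# weight is scaled by the learning rate (eta) before being applied.
coordinate_step <- function(X, y, w, j, eta, lambda) {
  resid <- y - as.vector(X %*% w)                # current residuals
  grad  <- -sum(X[, j] * resid) + lambda * w[j]  # gradient w.r.t. w[j]
  hess  <- sum(X[, j]^2) + lambda                # second derivative w.r.t. w[j]
  w[j]  <- w[j] - eta * grad / hess              # eta shrinks the applied update
  w
}

set.seed(1)
X <- matrix(rnorm(20), nrow = 10)
y <- rnorm(10)
w <- coordinate_step(X, y, w = c(0, 0), j = 1, eta = 0.5, lambda = 1)
```

With eta = 1 this is a full Newton step per coordinate; smaller values shrink each update, which is the same role the learning rate plays for the tree booster.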
Thanks. Sounds like the docs could be updated then, so as to list eta / learning_rate under the linear booster as well.
Thank you for updating the doc!
@trivialfis But I only updated the R docs. The online docs are still not showing the parameter under "linear booster". It's not clear to me whether the default value is the same for both booster types either.
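For anyone wanting to confirm the behaviour empirically, here is a hedged sketch using the callback mentioned above. It assumes a recent R package where the callback is spelled xgb.cb.gblinear.history() and the recorded coefficients are extracted with xgb.gblinear.history(); older releases spelled the callback cb.gblinear.history():

```r
library(xgboost)

# Fit the same gblinear model with two different learning rates and compare
# the recorded coefficient paths. If the linear updater ignored the learning
# rate, the two histories would coincide.
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

fit <- function(lr) {
  xgb.train(
    params = list(
      booster   = "gblinear",
      objective = "binary:logistic",
      eta       = lr  # the parameter in question
    ),
    data = dtrain,
    nrounds = 20,
    callbacks = list(xgb.cb.gblinear.history())  # record coefficients per round
  )
}

h_small <- xgb.gblinear.history(fit(0.1))
h_large <- xgb.gblinear.history(fit(1.0))

max(abs(h_small - h_large))  # clearly nonzero, so eta is being applied
```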