add cli dict parsing for grpo_config #3082
Conversation
Thanks! Does it work if you directly modify …
Yes, but both of … In …
It seems to work:

```python
from transformers.training_args import _VALID_DICT_FIELDS
from trl import GRPOConfig

_VALID_DICT_FIELDS.append("model_init_kwargs")

args = GRPOConfig("output_dir", model_init_kwargs='{"num_labels": 2}')
print(args.model_init_kwargs)  # {"num_labels": 2}
```
To do this properly, the first step would be to convert … Then we could do:

```python
# in transformers
class TrainingArguments:
    _VALID_DICT_FIELDS = [...]

# in trl
class GRPOConfig(TrainingArguments):
    _VALID_DICT_FIELDS = TrainingArguments._VALID_DICT_FIELDS + ["model_init_kwargs"]
```

which eliminates the need to duplicate the post-init logic.
Yes, that was my initial thought as well. However, considering that the … What are your suggestions?
Yes, I think first modifying transformers is the way to go.
Okay, I'll notify you when the PR is merged. :)
Force-pushed from 0743c5a to 3e44f00
Hi @qgallouedec, the PR in Transformers is merged. 🥳

Any potential issues for merging?

I just need to review it carefully and ensure backwards compatibility.
What does this PR do?
Adds dict parsing logic for `model_init_kwargs` in `grpo_config`, enabling dynamic configuration via the CLI. Users can now pass dictionary-like strings (e.g. `--model_init_kwargs '{"torch_dtype": "bfloat16"}'`) through command-line arguments, which are automatically parsed into Python dicts for the target fields. The logic is the same as in `TrainingArguments`.
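To illustrate the parsing behavior the PR describes, here is a small standalone sketch (the helper name `parse_dict_field` is hypothetical, not part of trl's API): a CLI value is accepted either as a dict or as a JSON-style string, and anything else is rejected with a clear error.

```python
import json


def parse_dict_field(value):
    # Hypothetical helper mirroring the described behavior:
    # pass dicts through unchanged, parse JSON-style strings,
    # and reject values that don't decode to a dict.
    if isinstance(value, dict):
        return value
    try:
        parsed = json.loads(value)
    except json.JSONDecodeError as err:
        raise ValueError(f"Could not parse {value!r} as a dict") from err
    if not isinstance(parsed, dict):
        raise ValueError(f"Expected a dict, got {type(parsed).__name__}")
    return parsed


print(parse_dict_field('{"torch_dtype": "bfloat16"}'))  # {'torch_dtype': 'bfloat16'}
```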
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.