
Commit

Update max_tokens parameter
Update the max_tokens parameter to the newer max_completion_tokens for compatibility with newer OpenAI models.
ceicke committed Feb 3, 2025
1 parent b963887 commit ff70a45
Showing 1 changed file with 1 addition and 1 deletion.
app/services/ai_backend/open_ai.rb (1 addition, 1 deletion)
@@ -66,7 +66,7 @@ def set_client_config(config)
       model: @assistant.language_model.api_name,
       messages: system_message(config[:instructions]) + config[:messages],
       stream: config[:streaming] && @response_handler || nil,
-      max_tokens: 2000, # we should really set this dynamically, based on the model, to the max
+      max_completion_tokens: 2000, # we should really set this dynamically, based on the model, to the max
       stream_options: config[:streaming] && { include_usage: true } || nil,
       response_format: { type: "text" },
       tools: @assistant.language_model.supports_tools? && Toolbox.tools || nil,
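
The inline comment notes that the 2000 cap should really be derived from the model rather than hard-coded. A minimal Ruby sketch of that idea follows; the MAX_COMPLETION_TOKENS table, its limit values, and the max_completion_tokens_for helper are illustrative assumptions, not code from this repository.

  # Illustrative sketch: look up a per-model output-token cap instead of
  # hard-coding 2000. The model names and limits below are assumptions.
  MAX_COMPLETION_TOKENS = {
    "gpt-4o"      => 16_384,
    "gpt-4o-mini" => 16_384,
    "o1-mini"     => 65_536
  }.freeze

  def max_completion_tokens_for(api_name)
    # Fall back to the commit's current default for models not in the table.
    MAX_COMPLETION_TOKENS.fetch(api_name, 2000)
  end

  # Inside set_client_config(config), the parameter would then read:
  #   max_completion_tokens: max_completion_tokens_for(@assistant.language_model.api_name),

Keeping 2000 as the fallback in this sketch preserves the behavior introduced by the commit for any model without a known limit.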
