
Fix OpenAI max token parameter selection#170

Closed
auston314 wants to merge 1 commit into lsdefine:main from auston314:fix/llm-parameter-issues

Conversation

@auston314

I got an error when I was using the gpt-5.4 model.

@lsdefine
Owner

Thanks for the PR! I implemented a smaller equivalent fix directly on main in 08181be, covering Responses API max_output_tokens and Chat Completions max_completion_tokens for GPT-5/o-series while keeping max_tokens for other compatible providers. Closing this PR to avoid the larger helper change, but really appreciate the catch and contribution!
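The selection logic described above can be sketched as follows. This is a minimal illustration of the parameter-picking rule, not the actual code from commit 08181be; the function name, signature, and model-prefix checks are assumptions for the example.

```python
def max_token_param(model: str, api: str = "chat") -> str:
    """Pick the token-limit parameter name for an OpenAI-style request.

    Hypothetical helper illustrating the fix: the Responses API takes
    max_output_tokens; GPT-5 and o-series models on Chat Completions
    require max_completion_tokens; other compatible providers still
    use max_tokens.
    """
    if api == "responses":
        # Responses API endpoint uses max_output_tokens.
        return "max_output_tokens"
    if model.startswith(("gpt-5", "o1", "o3", "o4")):
        # GPT-5 / o-series reasoning models reject max_tokens on
        # Chat Completions and expect max_completion_tokens instead.
        return "max_completion_tokens"
    # Other OpenAI-compatible providers keep the legacy parameter.
    return "max_tokens"


model = "gpt-5.4"
payload = {"model": model, max_token_param(model): 1024}
```

Keeping the branch on the parameter *name* rather than forking the whole request builder is what makes the fix on main smaller than this PR's helper change.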

@lsdefine lsdefine closed this Apr 25, 2026

2 participants