
Conversation

@shota-kizawa

Fix

  • Added handling so that max_tokens is always set to a valid value instead of being sent as null (see the sketch after this list).
  • Verified that the request now succeeds with both gpt-5 and gpt-5-mini.
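
A minimal sketch of the behaviour described above, assuming the request parameters are resolved in Python before being handed to the OpenAI client through instructor. The helper names and the default value below are illustrative only and are not taken from the actual diff:

```python
from typing import Any, Optional

# Hypothetical fallback; the default chosen by the actual fix is not shown here.
DEFAULT_MAX_TOKENS = 4096


def resolve_max_tokens(requested: Optional[int]) -> int:
    """Return a concrete integer so the request body never contains "max_tokens": null.

    gpt-5 and gpt-5-mini reject a body whose max_tokens is null with a
    400 Bad Request, so a valid integer is always substituted.
    """
    return requested if requested is not None else DEFAULT_MAX_TOKENS


def build_completion_kwargs(
    model: str,
    messages: list[dict[str, Any]],
    requested_max_tokens: Optional[int] = None,
) -> dict[str, Any]:
    """Assemble keyword arguments for the chat completion call."""
    return {
        "model": model,
        "messages": messages,
        "max_tokens": resolve_max_tokens(requested_max_tokens),
    }
```

With this shape the key is always present with an integer value, which matches the first bullet above.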

Related issue

Closes #103

@liukidar
Contributor

Hello,
as it is, this would remove the possibility of passing max_tokens to any model.
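
One way to address this concern, sketched below as an alternative shape of the hypothetical helper above (names are illustrative, not from the repository), is to forward max_tokens only when the caller explicitly provides one and to omit the key entirely otherwise, so no model receives a null value and no model loses the option:

```python
from typing import Any, Optional


def build_completion_kwargs(
    model: str,
    messages: list[dict[str, Any]],
    max_tokens: Optional[int] = None,
) -> dict[str, Any]:
    """Forward an explicit max_tokens; drop the key when it is unset."""
    kwargs: dict[str, Any] = {"model": model, "messages": messages}
    if max_tokens is not None:
        # Only an explicit integer is sent; None never reaches the API as null.
        kwargs["max_tokens"] = max_tokens
    return kwargs
```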

Development

Successfully merging this pull request may close these issues.

FastGraphRAG with instructor fails on gpt-5 / gpt-5-mini: max_tokens sent as null (400 Bad Request)
