The error you're encountering, "Calculated available context size -353 was not non-negative," means that the tokens required for the prompt plus the tokens reserved for the output exceed the model's configured context window. Here are a few things to consider:

- Check the configured context window: if it is set lower than the model actually supports, the available size can come out negative even for a modest prompt.
- Reduce `max_tokens` (the number of tokens reserved for the output) so that prompt tokens plus output tokens fit inside the window.
- Shorten the prompt, or split the input into smaller chunks.

By adjusting these parameters, you should be able to resolve the issue with the negative context size.
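The arithmetic behind the error can be sketched as follows. This is an assumed shape, not the library's exact code, and the context-window figure of 2260 is hypothetical, chosen only so the numbers reproduce the -353 from the error message:

```python
# Minimal sketch of the budget check behind the error (assumed shape,
# not the library's exact code): the space left over is the window
# minus prompt tokens minus tokens reserved for the output.
def available_context(context_window: int, prompt_tokens: int, num_output: int) -> int:
    return context_window - prompt_tokens - num_output

# 1313 prompt tokens and max_tokens=1300 are the figures from the
# question; a 2260-token window is hypothetical, picked so the result
# matches the reported -353.
print(available_context(2260, 1313, 1300))  # -353
```

As soon as `context_window - prompt_tokens - num_output` drops below zero, the library raises rather than sending an over-budget request.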
Getting

> ValueError: Calculated available context size -353 was not non-negative.

using GPT-4o and an input token size of 1313 according to `tiktoken` using `o200k_base` as encoding. What am I doing wrong here? I have set `max_tokens` to 1300. I have also:
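One way to reproduce the token count is to measure the prompt the same way the question does, with tiktoken's `o200k_base` encoding (the one GPT-4o uses). This is a sketch with a placeholder prompt, plus a crude fallback so it still runs if tiktoken is unavailable:

```python
# Count prompt tokens with tiktoken's o200k_base encoding, as in the
# question. The prompt string here is a placeholder, not the asker's
# actual input.
try:
    import tiktoken
    enc = tiktoken.get_encoding("o200k_base")
    count = lambda s: len(enc.encode(s))
except Exception:
    # Crude stand-in (~4 chars per token) if tiktoken isn't installed.
    count = lambda s: max(1, len(s) // 4)

prompt = "What am I doing wrong here?"  # placeholder prompt
prompt_tokens = count(prompt)

# prompt_tokens + max_tokens must fit inside the model's context window;
# if the configured window is smaller, the available size goes negative.
max_tokens = 1300
print(prompt_tokens + max_tokens)
```

With the question's numbers, 1313 prompt tokens plus `max_tokens=1300` require at least 2613 tokens of context window before anything else is added, so a window configured below that produces the negative value.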