@akilonx akilonx commented Dec 12, 2024

This pull request fixes the max_tokens error that users get when they try to use the newer o1 models.

Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

The new o1 series of models deprecates the max_tokens parameter in favor of a new max_completion_tokens parameter; OpenAI also added support for max_completion_tokens to older models.

What this PR does is:

  1. defaults 'max_completion_tokens' to the value of 'max_tokens'
  2. removes 'max_tokens' from the request
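
The two steps above can be sketched as a small Lua transform (illustrative only; `remap_max_tokens` and the shape of `params` are assumptions, not the plugin's exact code):

```lua
-- Illustrative sketch of the remap, not the plugin's exact code.
local function remap_max_tokens(params)
  if params.max_tokens ~= nil then
    -- 1. default 'max_completion_tokens' to the old 'max_tokens' value
    params.max_completion_tokens = params.max_completion_tokens or params.max_tokens
    -- 2. drop 'max_tokens' so the o1 endpoint no longer rejects the request
    params.max_tokens = nil
  end
  return params
end
```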


v3ceban commented Dec 23, 2024

Api.edits also needs to be changed to make the edit-with-instructions feature work:

```lua
function Api.edits(custom_params, cb)
  local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
  local params = vim.tbl_extend("keep", custom_params, openai_params)
  if params.model == "text-davinci-edit-001" or params.model == "code-davinci-edit-001" then
    vim.notify("Edit models are deprecated", vim.log.levels.WARN)
    Api.make_call(Api.EDITS_URL, params, cb)
    return
  end

  if params.model == "o1-preview" or params.model == "o1-mini" then
    -- max_tokens is unsupported for o1 OpenAI models; older models are backward-compatible with max_tokens,
    -- but max_completion_tokens works with all models.
    params.max_completion_tokens = params.max_tokens
    params.max_tokens = nil
    -- o1 models changed "system" message role to "developer", however, current API
    -- only accepts "user" or "assistant". this will probably be fixed in the future.
    for _, message in ipairs(params.messages) do
      if message.role == "system" then
        message.role = "user"
      end
    end
  end

  Api.make_call(Api.CHAT_COMPLETIONS_URL, params, cb)
end
```


adesprez commented Mar 6, 2025

Hi all,

First of all, thanks for that plugin, it's really great!

Thanks for that PR @akilonx.
And yes, we do need that on the Api.edits function as well.

The same issue occurs with the o3 models: I'm getting the Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead. error when trying to use the o3-mini model.

And, as stated in #473 (comment), we can't work around it through configuration alone; it requires a code change in the API calls.

We also need to drop the temperature parameter for o3 models (I haven't tried with o1):

```lua
function Api.edits(custom_params, cb)
  local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
  local params = vim.tbl_extend("keep", custom_params, openai_params)
  if params.model == "text-davinci-edit-001" or params.model == "code-davinci-edit-001" then
    vim.notify("Edit models are deprecated", vim.log.levels.WARN)
    Api.make_call(Api.EDITS_URL, params, cb)
    return
  end

  if params.model == "o1-preview" or params.model == "o1-mini" or params.model == "o3-mini" then
    -- max_tokens is unsupported for o1/o3 reasoning models; older models are backward-compatible
    -- with max_tokens, but max_completion_tokens works with all models.
    params.max_completion_tokens = params.max_tokens
    params.max_tokens = nil
    -- reasoning models also reject a custom temperature, so drop it
    params.temperature = nil
    -- o1 models changed the "system" message role to "developer"; however, the current API
    -- only accepts "user" or "assistant". This will probably be fixed in the future.
    for _, message in ipairs(params.messages) do
      if message.role == "system" then
        message.role = "user"
      end
    end
  end

  Api.make_call(Api.CHAT_COMPLETIONS_URL, params, cb)
end
```
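
One possible hardening, as a suggestion rather than part of the fix above: the list of hard-coded model names will keep growing, so a prefix check could cover future o-series models without editing this function each time. `is_reasoning_model` is a hypothetical helper, not existing plugin code:

```lua
-- Hypothetical helper: treat any o1* or o3* model as a reasoning model
-- that needs max_completion_tokens and no custom temperature.
local function is_reasoning_model(model)
  return type(model) == "string"
    and (model:match("^o1") ~= nil or model:match("^o3") ~= nil)
end
```

The `if params.model == "o1-preview" or ...` chain would then become `if is_reasoning_model(params.model) then`.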
