
Conversation

@chaunceyjiang
Collaborator

Fix #17421
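
The linked issue reports a KeyError when the server builds the logprobs/top_logprobs fields of a chat completion response with the Mistral tokenizer. As a rough, hypothetical sketch only (not the actual diff merged in this PR), the general shape of such a guard is to route token-ID-to-string conversion through the tokenizer's decode path instead of indexing a token map that may not contain every ID; all names below are illustrative.

# Hypothetical illustration of guarding logprob token decoding against KeyError.
# Names are illustrative; this is not the actual change merged in this PR.
def decode_logprob_token(tokenizer, token_id: int) -> str:
    try:
        # Works when the tokenizer exposes the ID in its token map.
        return tokenizer.convert_ids_to_tokens([token_id])[0]
    except KeyError:
        # Some IDs may be absent from the plain token map (e.g. with the
        # Mistral tokenizer), so fall back to full decoding instead of raising.
        return tokenizer.decode([token_id])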

@github-actions

github-actions bot commented May 4, 2025

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default; only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@mergify mergify bot added the frontend label May 4, 2025
@chaunceyjiang
Collaborator Author

Test

vllm serve mistralai/Ministral-8B-Instruct-2410 --tokenizer_mode mistral --config_format mistral --load_format mistral --max-model-len 4096 --enable-auto-tool-choice --tool-call-parser mistral
curl -s -X POST \
  -H "Content-Type: application/json" \
  "http://localhost:8000/v1/chat/completions" \
  --data-binary @- << _EOF
  {
   "logprobs": true,
   "top_logprobs": 2,
   "messages": [
      {
          "role": "user",
          "content":  " "
      }
   ],
   "guided_json": {"properties": {}}
  }
_EOF
{"id":"chatcmpl-c02a9bf7edc3428297cbf84ee6046d8c","object":"chat.completion","created":1746375676,"model":"mistralai/Ministral-8B-Instruct-2410","choices":[{"index":0,"message":{"role":"assistant","reasoning_content":null,"content":"{}","tool_calls":[]},"logprobs":{"content":[{"token":"{","logprob":-0.03710511326789856,"bytes":[123],"top_logprobs":[{"token":"{","logprob":-0.03710511326789856,"bytes":[123]},{"token":"{}","logprob":-3.312495708465576,"bytes":[123,125]}]},{"token":"}","logprob":0.0,"bytes":[125],"top_logprobs":[{"token":"}","logprob":0.0,"bytes":[125]}]},{"token":"","logprob":-9999.0,"bytes":[],"top_logprobs":[]}]},"finish_reason":"stop","stop_reason":null}],"usage":{"prompt_tokens":4,"total_tokens":7,"completion_tokens":3,"prompt_tokens_details":null},"prompt_logprobs":null}

@chaunceyjiang
Collaborator Author

/cc @DarkLight1337 PTAL.

@DarkLight1337 (Member) left a comment

Looks reasonable, thanks for fixing!

@vllm-bot vllm-bot merged commit 5394ad7 into vllm-project:main May 5, 2025
26 checks passed
@chaunceyjiang chaunceyjiang deleted the top_logprobs branch May 6, 2025 02:13
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
mawong-amd pushed a commit to ROCm/vllm that referenced this pull request May 14, 2025
zzzyq pushed a commit to zzzyq/vllm that referenced this pull request May 24, 2025

Labels: frontend

Development

Successfully merging this pull request may close these issues:

[Bug]: KeyError on logprobs with MistralTokenizer