Conversation

@hjh0119
Collaborator

@hjh0119 hjh0119 commented Dec 2, 2025

  1. best_of is deprecated in SamplingParams ([V0 Deprecation] Remove best_of, vllm-project/vllm#29090)
  2. GuidedDecodingParams is deprecated, so importing trl raises an ImportError ([Chore] Cleanup guided namespace, move to structured outputs config, vllm-project/vllm#22772)

@gemini-code-assist
Contributor

Summary of Changes

Hello @hjh0119, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request updates the codebase to maintain compatibility with recent changes in the vllm library. It addresses the deprecation of best_of within SamplingParams and the removal of GuidedDecodingParams, ensuring that the project continues to function correctly with newer versions of vllm and avoids potential import errors.

Highlights

  • vllm.SamplingParams compatibility: Updated the _prepare_generation_config method to conditionally set SamplingParams attributes, specifically addressing the deprecation of best_of by checking for attribute existence before assignment.
  • vllm.GuidedDecodingParams deprecation: Implemented a try-except block to handle the removal of GuidedDecodingParams, aliasing vllm.sampling_params.StructuredOutputsParams to GuidedDecodingParams to prevent ImportError when trl is imported.
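
The first highlight boils down to a "check before you assign" pattern. Below is a minimal sketch of that pattern, assuming a hypothetical prepare_sampling_params helper and request_config object; it illustrates the approach rather than reproducing the merged ms-swift code.

```python
from vllm import SamplingParams


def prepare_sampling_params(request_config) -> SamplingParams:
    # Hypothetical helper mirroring the compatibility pattern described above.
    sampling_params = SamplingParams(
        max_tokens=request_config.max_tokens,
        temperature=request_config.temperature,
    )
    # best_of has been removed from SamplingParams in recent vLLM releases,
    # so only assign it when the installed version still exposes the field.
    if getattr(request_config, 'best_of', None) is not None and hasattr(
            sampling_params, 'best_of'):
        sampling_params.best_of = request_config.best_of
    return sampling_params
```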

@hjh0119 hjh0119 linked an issue Dec 2, 2025 that may be closed by this pull request

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request updates the code to be compatible with recent changes in vllm, specifically the deprecation of best_of in SamplingParams and GuidedDecodingParams. The change in vllm_engine.py to conditionally add parameters to SamplingParams is a good defensive measure. However, in grpo_trainer.py, the compatibility patch for GuidedDecodingParams is placed after the trl imports, which will cause it to fail. I've left a critical comment to move this block. I've also pointed out a broken link in a comment. Overall, the changes are in the right direction for maintaining compatibility, but the placement of the patch in grpo_trainer.py needs to be corrected.
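
For reference, the shim under discussion looks roughly like the sketch below. The StructuredOutputsParams path follows the rename mentioned in the summary; the surrounding structure is an assumption, not the merged patch verbatim. The point raised in the review is ordering: the alias must be installed before trl is first imported.

```python
try:
    from vllm.sampling_params import GuidedDecodingParams  # removed in newer vLLM
except ImportError:
    import vllm
    from vllm.sampling_params import StructuredOutputsParams

    # Re-export the renamed class under the old name so that downstream
    # imports (including those inside trl) keep resolving.
    GuidedDecodingParams = StructuredOutputsParams
    vllm.sampling_params.GuidedDecodingParams = StructuredOutputsParams
    vllm.GuidedDecodingParams = StructuredOutputsParams

# trl is safe to import only after the alias above is in place.
from trl import GRPOTrainer  # noqa: E402
```

Because the downstream import resolves against the already-patched vllm modules, placing this block above the trl imports is what makes it effective; placed after them, as the review notes, the ImportError has already been raised.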

@hjh0119
Collaborator Author

hjh0119 commented Dec 2, 2025

/gemini review

@hjh0119 hjh0119 merged commit c44d6bc into modelscope:main Dec 2, 2025
2 of 3 checks passed
@hjh0119 hjh0119 deleted the vllm-latest branch December 2, 2025 09:48
Development

Successfully merging this pull request may close these issues.

best_of feature will be removed in the upcoming vLLM release
