Use vLLM as default inference backend for DocSum#928

Merged
lvliang-intel merged 1 commit into opea-project:main from yongfengdu:docsum on Apr 2, 2025

Conversation

@yongfengdu
Collaborator

Use vLLM as default backend.
Update README accordingly.
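For context on what "vLLM as the default backend" means operationally: vLLM serves an OpenAI-compatible HTTP API, so a component like DocSum talks to it with standard chat-completion requests. Below is a minimal sketch of such a request payload; the helper name, prompt wording, and model ID are illustrative assumptions, not values taken from this PR.

```python
import json

def build_summarization_request(text: str, model: str, max_tokens: int = 128) -> dict:
    """Build an OpenAI-compatible chat-completion payload of the kind
    vLLM's /v1/chat/completions endpoint accepts.

    The prompt wording here is a placeholder, not DocSum's actual prompt.
    """
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": f"Summarize the following document:\n{text}"}
        ],
        "max_tokens": max_tokens,
    }

# Example with a hypothetical model name; in a real deployment the model ID
# comes from the service configuration (e.g. an LLM_MODEL_ID setting).
payload = build_summarization_request(
    "vLLM is a fast LLM serving engine.",
    model="meta-llama/Meta-Llama-3-8B-Instruct",
)
print(json.dumps(payload, indent=2))
```

Because the API surface is OpenAI-compatible, swapping the backend from TGI to vLLM mostly changes the serving container and its configuration rather than the request format the gateway sends.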

Description

A summary of the proposed changes, as well as the relevant motivation and context.

Issues

#913

Type of change

List the type of change from the options below. Delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds new functionality)
  • Breaking change (fix or feature that would break existing design and interface)

Dependencies

List any newly introduced third-party dependencies, if they exist.

Tests

Describe the tests that you ran to verify your changes.

Use vLLM as default backend.
Update README accordingly.

Signed-off-by: Dolpher Du <dolpher.du@intel.com>
@yongfengdu yongfengdu requested a review from lianhao as a code owner April 2, 2025 02:29
@yongfengdu yongfengdu requested review from mkbhanda and poussa April 2, 2025 02:46
@lvliang-intel lvliang-intel merged commit dcfcfc2 into opea-project:main Apr 2, 2025
13 checks passed


3 participants