
Conversation

@dsikka
Contributor

@dsikka dsikka commented Aug 9, 2024

Summary:

  • Splits [ Kernel ] AWQ Fused MoE #6422 into two separate PRs. This is the first of the two; the second will leverage the weight-loading changes introduced in this PR while adding the AWQ Fused MoE kernel
  • Refactors FusedMoE.weight_loader to enable loading AWQ models, whose weights are stored transposed on disk as (input_dim, output_dim), whereas fp16 and fp8 models store (output_dim, input_dim). This required more complex indexing logic for the TP case and the MergedColumn case
  • Refactors expert_params_mapping, which was overfit to fp16 and fp8 checkpoints. This required renaming the fp8 scale parameters to better match the state dicts we create in autofp8, limiting the amount of remapping needed in the model files
  • Updates layers to use fused_topk/grouped_topk and fused_experts rather than calling fused_moe directly, so that the logic can be reused across fp16, fp8, and AWQ
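The transposed-weight handling described above can be illustrated with a small sketch. This is a hypothetical simplification, not vLLM's actual FusedMoE.weight_loader; the function shard_moe_weight and its signature are invented here. It shows why the shard axis must flip when a checkpoint stores weights transposed, as AWQ does:

```python
import numpy as np

def shard_moe_weight(full_weight, shard_id, tp_rank, tp_size, is_transposed):
    """Return this tensor-parallel rank's shard of one expert weight.

    fp16/fp8 checkpoints store an expert weight as (output_dim, input_dim);
    AWQ checkpoints store the transpose, (input_dim, output_dim).  The axis
    we shard along therefore depends on the on-disk layout.
    """
    # w1/w3 (the merged column-parallel projections) are split along
    # output_dim; w2 (the row-parallel projection) is split along input_dim.
    shard_dim = 0 if shard_id in ("w1", "w3") else 1
    if is_transposed:
        shard_dim = 1 - shard_dim  # on-disk axes are swapped for AWQ
    shard_size = full_weight.shape[shard_dim] // tp_size
    index = [slice(None), slice(None)]
    index[shard_dim] = slice(tp_rank * shard_size, (tp_rank + 1) * shard_size)
    return full_weight[tuple(index)]

# A w1 weight with output_dim=8, input_dim=4, sharded across 2 ranks:
fp16_w1 = np.zeros((8, 4))   # (output_dim, input_dim)
awq_w1 = np.zeros((4, 8))    # same logical weight, transposed on disk
print(shard_moe_weight(fp16_w1, "w1", 0, 2, False).shape)  # (4, 4)
print(shard_moe_weight(awq_w1, "w1", 0, 2, True).shape)    # (4, 4)
```

Both calls return the same logical shard; only the slicing axis differs, which is the extra indexing complexity the refactor folds into a single weight loader.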

From Neural Magic
Co-authored-by: @robertgshaw2-neuralmagic

@github-actions

github-actions bot commented Aug 9, 2024

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, they only run fastcheck CI, which consists of a small, essential subset of CI tests that quickly catch errors. You can run additional CI tests on top of the default ones by unblocking the steps in your fastcheck build in the Buildkite UI.

Once the PR is approved and ready to go, please make sure to run full CI as it is required to merge (or just use auto-merge).

To run full CI, you can do one of these:

  • Comment /ready on the PR
  • Add ready label to the PR
  • Enable auto-merge.

🚀

@dsikka dsikka marked this pull request as ready for review August 9, 2024 14:51
@dsikka dsikka changed the title from [Misc] Update fused moe structure to [Misc] Update Fused MoE weight loading Aug 9, 2024
@dsikka
Contributor Author

dsikka commented Aug 9, 2024

/ready

@github-actions github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Aug 9, 2024
@dsikka dsikka mentioned this pull request Aug 9, 2024
Member

@mgoin mgoin left a comment


This LGTM but have you verified that DeepSeek MoE is okay with this PR?

@dsikka dsikka requested a review from comaniac August 13, 2024 01:49
@dsikka
Contributor Author

dsikka commented Aug 13, 2024

This LGTM but have you verified that DeepSeek MoE is okay with this PR?

Yes: DeepSeek, Mixtral, and Qwen.

@dsikka dsikka requested a review from mgoin August 13, 2024 17:18
Member

@mgoin mgoin left a comment


Thank you, this looks great after review!

@mgoin mgoin merged commit d3bdfd3 into vllm-project:main Aug 13, 2024
@mgoin mgoin deleted the update-fused-moe branch August 13, 2024 18:57
Alvant pushed a commit to compressa-ai/vllm that referenced this pull request Oct 26, 2024
LeiWang1999 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Mar 26, 2025