[Hardware][Gaudi][Feature] Enable Dynamic MoE for Mixtral #12303
Conversation
👋 Hi! Thank you for contributing to the vLLM project. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.
@kzawora-intel @mgoin, hi, could you help review this PR?
Hi @mgoin, could you review the updated code again?
This pull request has merge conflicts that must be resolved before it can be merged.
Hi @mgoin @WoosukKwon, could you help review again?
This pull request has merge conflicts that must be resolved before it can be merged.
WoosukKwon left a comment:
LGTM. Sorry for the delay.
Enable Dynamic MoE for Mixtral
Command for testing accuracy:
bash .buildkite/lm-eval-harness/run-lm-eval-gsm-hf-baseline.sh -m neuralmagic/Mixtral-8x7B-Instruct-v0.1 -b 32 -l 250 -f 5 -t 4

Accuracy results for bf16 precision:
[accuracy results image]
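For a quick functional sanity check before running the full lm-eval baseline above, a short vLLM offline-inference script along the following lines can be used. This is only a sketch: the checkpoint name, tensor-parallel size, and bf16 dtype simply mirror the accuracy command above and are not part of this PR.

```python
# Sketch of a quick sanity check: load Mixtral through vLLM's offline API
# and generate a short completion. tensor_parallel_size=4 and bfloat16
# mirror the "-t 4" / bf16 settings from the accuracy command above.
from vllm import LLM, SamplingParams

llm = LLM(
    model="neuralmagic/Mixtral-8x7B-Instruct-v0.1",  # same checkpoint as the lm-eval run
    tensor_parallel_size=4,
    dtype="bfloat16",
)

prompts = ["The capital of France is"]
sampling_params = SamplingParams(temperature=0.0, max_tokens=16)

for output in llm.generate(prompts, sampling_params):
    print(output.prompt, output.outputs[0].text)
```

If generation produces coherent text on the target cards, the GSM8K baseline from the command above can then be run to check accuracy end to end.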
