<!-- markdownlint-disable -->
## Description
This PR renames the `vllm_spyre` package to `sendnn_inference`.
Docs and publication workflows have been updated in preparation for
publishing this as `sendnn-inference==2.0.0`.
❗❗❗
Breaking configuration changes!
- The plugin name for `VLLM_PLUGINS` is now `sendnn_inference`
- All config options are now `SENDNN_INFERENCE_*`
- The precompiled model parser now expects `sendnn_inference_version`
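A minimal sketch of what the renamed settings look like in practice. The option name `LOG_LEVEL` is a hypothetical placeholder chosen only to illustrate the new `SENDNN_INFERENCE_*` prefix; the PR does not enumerate individual options:

```python
import os

# Select the renamed plugin (the plugin name was previously "vllm_spyre").
os.environ["VLLM_PLUGINS"] = "sendnn_inference"

# All config options now use the SENDNN_INFERENCE_ prefix; "LOG_LEVEL" is a
# hypothetical option name used purely for illustration.
os.environ["SENDNN_INFERENCE_LOG_LEVEL"] = "debug"

print(os.environ["VLLM_PLUGINS"])
```

Existing deployments that set the old plugin name or config prefix will need to update their environment accordingly before upgrading to 2.0.0.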
## Related Issues
Closes #939
## Test Plan
- CPU unit tests
- Spyre unit tests
- Spot checks with full model integration tests
- Ensure Test PyPI deploy works
## Checklist
- [ ] I have read the [contributing
guidelines](https://docs.vllm.ai/projects/spyre/en/latest/contributing)
- [ ] My code follows the project's code style (run `bash format.sh`)
- [ ] I have added tests for my changes (if applicable)
- [ ] I have updated the documentation (if applicable)
- [ ] My commits include a `Signed-off-by:` line (DCO compliance)
---------
Signed-off-by: Joe Runde <joe@joerun.de>