feat: enable multi-arch builds with ARM compatibility #4290
base: main
Conversation
Thanks for doing this! ❤️
Force-pushed from 6196645 to d248483
ashwinb left a comment
leseb left a comment
This looks good, thanks! I believe the next step is to publish a build, taking inspiration from .github/workflows/build-distributions.yml in the ops repo.
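For context, publishing a multi-arch image usually comes down to a single buildx invocation in the workflow. The sketch below is illustrative only and is not taken from the referenced workflow; the tag, platforms, and build context are assumptions, and it presumes QEMU emulation and a docker-container buildx builder are already set up on the runner.

```sh
# Illustrative multi-arch publish step (not the actual workflow contents).
# Assumes QEMU + a docker-container buildx builder are configured on the runner.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  --tag docker.io/llamastack/distribution-starter:latest \
  --push \
  .
```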
The build never ran: https://github.com/llamastack/llama-stack/actions/runs/20277364473. There is an issue with the matrix.
Signed-off-by: Doug Edgar <[email protected]>
Force-pushed from d248483 to 78adce0
Good catch, I see the issue now. I've updated this PR with a new commit. The tests have passed again, and the build now completes: https://github.com/llamastack/llama-stack/actions/runs/20313646963.
Might need a maintainer to remove that.
What does this PR do?
Enables llama-stack multi-architecture builds for ARM. The result is an image index that points to the image matching the calling user's architecture.
On Kubernetes, for example, images are still referenced in the same way, but resolve to the architecture of the node where they will run.
In practice, referring to the usual docker.io/llamastack/distribution-starter:latest should resolve to the image for the requesting user's architecture, without any additional configuration on the user's side.
Closes #406
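One hedged way to see the resolution behaviour described above (the tag is the one mentioned in this description; nothing else here is taken from the PR) is to inspect the published reference:

```sh
# The published tag points at an image index; each manifest entry is annotated
# with its platform, and `docker pull` selects the entry matching the host.
docker buildx imagetools inspect docker.io/llamastack/distribution-starter:latest
```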
Test Plan
Tested container builds on an amd64 architecture host, and deployed the starter distribution image via llama-stack-k8s-operator to an arm64 architecture OpenShift cluster. The deployment ran and the operator's e2e test suite completed as expected. Arm64-specific build tests run and pass on GitHub.
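As a minimal sketch of how such a deployment could be spot-checked on any given host (assuming docker is available locally; the tag matches the one above and is the only detail taken from the PR):

```sh
# Confirm the locally resolved image matches the host architecture.
docker pull docker.io/llamastack/distribution-starter:latest
docker image inspect \
  --format '{{.Os}}/{{.Architecture}}' \
  docker.io/llamastack/distribution-starter:latest
# Expected: linux/amd64 on an amd64 host, linux/arm64 on an arm64 node.
```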