musa: disable build of whisper.cpp, and update llama.cpp #1745

mikebonnet merged 1 commit into main from
Conversation
Updating the musa images to rc4.2.0 requires a newer version of llama.cpp, and broke the whisper.cpp build. For more info see: #1697

Signed-off-by: Mike Bonnet <[email protected]>
Reviewer's Guide

Adjusts the build script to use the updated llama.cpp commit for rc4.2.0 compatibility and disables the whisper.cpp build when building the Musa image to avoid build failures.

Class diagram for build script function changes:

    classDiagram
        class BuildScript {
            +clone_and_build_llama_cpp()
            +clone_and_build_whisper_cpp()
            +main()
        }
        BuildScript : - llama_cpp_sha updated
        BuildScript : - whisper.cpp build skipped for musa
Summary of Changes
Hello @mikebonnet, I'm Gemini Code Assist[^1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses compatibility requirements for musa images updated to rc4.2.0. It involves updating the llama.cpp dependency to a newer version and, as a direct consequence of this update and the musa environment, disabling the whisper.cpp build for musa to prevent build failures.

Highlights

- Dependency Update: Updated the llama.cpp dependency to a newer commit SHA (3f4fc97f1d745f1d5d3c853949503136d419e6de) in the build_llama_and_whisper.sh script. This update is necessary for compatibility with musa images rc4.2.0.
- Conditional Build Logic: Modified the build_llama_and_whisper.sh script to conditionally disable the whisper.cpp build. The build will now be skipped if the containerfile variable is set to musa, in addition to the existing check for the s390x architecture. This addresses a build breakage identified with the musa image update.
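A minimal sketch of what such a conditional skip could look like in a shell build script. The function and variable names here are illustrative assumptions, not the actual contents of build_llama_and_whisper.sh:

```shell
#!/bin/sh
# Hypothetical sketch only: names are illustrative, not the real script.

# Return success (0) when whisper.cpp should be built for this target.
should_build_whisper() {
    containerfile="$1"
    arch="$2"
    # whisper.cpp does not build on s390x, and the rc4.2.0 musa update
    # broke its build as well, so skip it in both cases.
    if [ "$arch" = "s390x" ] || [ "$containerfile" = "musa" ]; then
        return 1
    fi
    return 0
}

if should_build_whisper "${CONTAINERFILE:-}" "$(uname -m)"; then
    echo "building whisper.cpp"
else
    echo "skipping whisper.cpp"
fi
```

Folding both exclusions into one predicate keeps the call site readable and gives a single place to extend if further targets break.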
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | /gemini review | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | /gemini summary | Provides a summary of the current pull request in its current state. |
| Comment | @gemini-code-assist | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | /gemini help | Displays a list of available commands. |
Customization
To customize Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments or fill out our survey to provide feedback.
You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Footnotes

[^1]: Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.
Hey @mikebonnet - I've reviewed your changes - here's some feedback:
- Consider using a version tag or branch name instead of a hardcoded commit SHA for llama.cpp to simplify future updates.
- To improve readability, extract the combined s390x/musa build skip check into a descriptive variable or function and add a brief comment explaining why whisper.cpp is disabled for musa.
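One way the suggested refactor might look, sketched with an illustrative variable name (the real script's variables, such as `containerfile`, may differ):

```shell
#!/bin/sh
# Illustrative sketch of the reviewer's suggestion: name the combined
# skip check and explain it in a comment. Variable names are assumed.
containerfile="${containerfile:-musa}"
arch="${arch:-$(uname -m)}"

# whisper.cpp does not build on s390x, and the rc4.2.0 musa toolchain
# broke its build as well (see #1697), so skip it for both targets.
skip_whisper_build=false
if [ "$arch" = "s390x" ] || [ "$containerfile" = "musa" ]; then
    skip_whisper_build=true
fi

if [ "$skip_whisper_build" = "true" ]; then
    echo "skipping whisper.cpp build for $containerfile on $arch"
else
    echo "building whisper.cpp"
fi
```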
Summary of Changes
This pull request addresses compatibility issues arising from updating musa images to rc4.2.0. It resolves a dependency conflict by updating the llama.cpp library to a newer version and prevents a build failure by disabling the whisper.cpp build specifically for the musa container, ensuring the container images can be built successfully with the new musa version.
Highlights

- llama.cpp update: The llama.cpp checkout SHA has been updated from f667f1e6244e1f420512fa66692b7096ff17f366 to 3f4fc97f1d745f1d5d3c853949503136d419e6de.
- whisper.cpp build disabled for Musa: The build process for whisper.cpp is now conditionally skipped when building the musa container image, addressing a reported build breakage.
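Pinning a git dependency to an exact commit, as described in the highlight above, typically follows a pattern like this. The helper name and destination path are hypothetical; the SHA is the one quoted in this review:

```shell
#!/bin/sh
# Illustrative sketch of pinning a git dependency to an exact commit.
# "clone_at_sha" is a hypothetical helper, not from the real script.
llama_cpp_sha="3f4fc97f1d745f1d5d3c853949503136d419e6de"

clone_at_sha() {
    repo_url="$1"
    sha="$2"
    dest="$3"
    git clone "$repo_url" "$dest"
    # Detach at the pinned commit so builds are reproducible.
    git -C "$dest" checkout "$sha"
}

# Usage (network required):
# clone_at_sha https://github.com/ggml-org/llama.cpp "$llama_cpp_sha" /tmp/llama.cpp
```

Pinning a full 40-character SHA trades easy upgrades (the reviewer's point about tags) for exact reproducibility of the image build.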
Summary of Changes
This pull request addresses compatibility issues encountered when updating musa container images to rc4.2.0. It resolves these by bumping the version of llama.cpp used in the build process and conditionally disabling the whisper.cpp build for musa containers, as the latter was breaking the build.
Highlights

- llama.cpp Version Update: I've updated the llama.cpp checkout SHA from f667f1e6244e1f420512fa66692b7096ff17f366 to 3f4fc97f1d745f1d5d3c853949503136d419e6de in the build script.
- whisper.cpp Build Disablement for Musa: I've added a condition to disable the whisper.cpp build specifically when the containerfile variable is set to musa. This prevents build failures related to whisper.cpp when updating musa images.
Code Review
This pull request updates the llama.cpp dependency to a newer commit and disables the whisper.cpp build for the musa container, as described. The changes are correct and achieve the stated goal. I have one suggestion to improve the shell script's syntax for better readability and consistency with the rest of the file.
Summary by Sourcery

Update llama.cpp to a newer commit and disable the whisper.cpp build for the Musa container.

Enhancements: