✨ Bring in latest maven-index data version #177
Conversation
Signed-off-by: Juan Manuel Leflet Estrada <[email protected]>
Walkthrough

The Dockerfile's maven-index-data download mechanism is updated from a hardcoded v0.0.1 version reference to a dynamic retrieval system that queries GitHub's releases API for the latest available version, extracts the download URL, and downloads accordingly.

Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~8 minutes
Pre-merge checks and finishing touches: ✅ 3 checks passed.
Actionable comments posted: 1
🧹 Nitpick comments (1)
Dockerfile (1)
**31-34**: Consider using `jq` for robust JSON parsing.

Parsing GitHub's JSON API response with `grep` and `sed` is fragile and error-prone. If the response format changes or contains unexpected characters, extraction may fail silently. The container already installs `zip`; consider adding `jq` for reliable JSON parsing. While the current regex may work for typical responses, `jq` provides better maintainability and resilience.

Example refactor using `jq`:

```diff
 FROM registry.access.redhat.com/ubi9/ubi-minimal AS index-download
-RUN microdnf install -y wget zip && microdnf clean all && rm -rf /var/cache/dnf
+RUN microdnf install -y wget zip jq && microdnf clean all && rm -rf /var/cache/dnf
 WORKDIR /maven-index-data
-RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep '"browser_download_url".*maven-index-data.*\.zip' | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/') && \
+RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | jq -r '.assets[] | select(.name | contains("maven-index-data") and endswith(".zip")) | .browser_download_url' | head -1) && \
+    test -n "$DOWNLOAD_URL" || (echo "Failed to extract download URL"; exit 1) && \
     wget --quiet ${DOWNLOAD_URL} -O maven-index-data.zip && \
     unzip maven-index-data.zip && \
     rm maven-index-data.zip
```
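The `jq` filter in the suggestion above can be tried offline, assuming `jq` is installed locally; the JSON below is a hypothetical sample response, not a real GitHub payload:

```shell
# Hypothetical sample response containing only the fields the filter touches;
# a real releases/latest payload has many more.
RESPONSE='{"assets": [{"name": "sources.tar.gz", "browser_download_url": "https://example.com/sources.tar.gz"}, {"name": "maven-index-data-v0.0.2.zip", "browser_download_url": "https://example.com/maven-index-data-v0.0.2.zip"}]}'

# jq walks the assets array structurally, so whitespace and key order
# in the JSON no longer matter, unlike the grep/sed approach.
DOWNLOAD_URL=$(printf '%s' "$RESPONSE" \
  | jq -r '.assets[] | select(.name | contains("maven-index-data") and endswith(".zip")) | .browser_download_url' \
  | head -1)

echo "$DOWNLOAD_URL"
```

Because the filter selects by asset name, extra assets such as checksums or source archives are skipped even if they appear first in the array.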
```dockerfile
RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep '"browser_download_url".*maven-index-data.*\.zip' | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/') && \
    wget --quiet ${DOWNLOAD_URL} -O maven-index-data.zip && \
    unzip maven-index-data.zip && \
    rm maven-index-data.zip
```
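The extraction portion of this RUN pipeline can be exercised offline against a canned response; the JSON here is a hypothetical sample, not a real GitHub payload:

```shell
# Hypothetical sample of the releases/latest response. Note the '": "'
# spacing after the key, which the sed expression relies on.
RESPONSE='{"tag_name": "v0.0.2", "assets": [{"browser_download_url": "https://example.com/maven-index-data-v0.0.2.zip"}]}'

# Same grep/sed pipeline as the Dockerfile change under review.
DOWNLOAD_URL=$(printf '%s' "$RESPONSE" \
  | grep '"browser_download_url".*maven-index-data.*\.zip' \
  | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/')

echo "$DOWNLOAD_URL"
```

A formatting change in the API response (for example, no space after the colon) would make this same pipeline return an empty string, which is what motivates the `jq` and error-handling suggestions in this review.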
Add error handling for API query and URL extraction failures.
If the GitHub API call fails, the `grep` doesn't match, or `sed` fails to extract a valid URL, `DOWNLOAD_URL` will be empty. The subsequent `wget ${DOWNLOAD_URL}` will execute with no argument, leading to silent failures or undefined behavior. Docker `RUN` instructions do not enable `set -e` by default, so failures in the grep/sed pipeline won't halt the build; the build will succeed despite a missing download.
Apply this diff to add explicit error handling:

```diff
 RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep '"browser_download_url".*maven-index-data.*\.zip' | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/') && \
+    if [ -z "$DOWNLOAD_URL" ]; then echo "Failed to extract download URL from GitHub API"; exit 1; fi && \
     wget --quiet ${DOWNLOAD_URL} -O maven-index-data.zip && \
     unzip maven-index-data.zip && \
     rm maven-index-data.zip
```

Alternatively, use a more robust approach:
```diff
-RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep '"browser_download_url".*maven-index-data.*\.zip' | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/') && \
+RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep -o '"browser_download_url":"[^"]*maven-index-data[^"]*\.zip[^"]*"' | head -1 | cut -d'"' -f4) && \
+    test -n "$DOWNLOAD_URL" || (echo "Failed to extract download URL"; exit 1) && \
     wget --quiet ${DOWNLOAD_URL} -O maven-index-data.zip && \
     unzip maven-index-data.zip && \
     rm maven-index-data.zip
```
rm maven-index-data.zip📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep '"browser_download_url".*maven-index-data.*\.zip' | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/') && \ | |
| wget --quiet ${DOWNLOAD_URL} -O maven-index-data.zip && \ | |
| unzip maven-index-data.zip && \ | |
| rm maven-index-data.zip | |
| RUN DOWNLOAD_URL=$(wget --quiet -O - https://api.github.com/repos/konveyor/maven-search-index/releases/latest | grep '"browser_download_url".*maven-index-data.*\.zip' | sed -E 's/.*"browser_download_url": "([^"]+)".*/\1/') && \ | |
| if [ -z "$DOWNLOAD_URL" ]; then echo "Failed to extract download URL from GitHub API"; exit 1; fi && \ | |
| wget --quiet ${DOWNLOAD_URL} -O maven-index-data.zip && \ | |
| unzip maven-index-data.zip && \ | |
| rm maven-index-data.zip |
🤖 Prompt for AI Agents
In Dockerfile around lines 31 to 34, the RUN pipeline that queries the GitHub
API and extracts a download URL can produce an empty DOWNLOAD_URL (if the API
call, grep, or sed fails) which then causes wget to run with no argument and the
build to continue silently; update this block to explicitly fail the build on
any error by enabling errexit for the command (or test the extracted variable),
validate that DOWNLOAD_URL is non-empty before calling wget (exit with a clear
error message if empty), and prefer using a robust API parsing command (e.g.,
curl/wget with non-zero-exit check and jq to extract the browser_download_url)
so that any HTTP or parsing failure stops the Docker build and surfaces a
helpful error.
Quality, license, and security checks are reporting back as incomplete, but the inner status is green. Merging.
Signed-off-by: Juan Manuel Leflet Estrada <[email protected]>
Signed-off-by: Cherry Picker <[email protected]>
Co-authored-by: Juan Manuel Leflet Estrada <[email protected]>
Always bring in the latest version of the generated maven index data.