Prepare a new "generic" template, use "src layout" for Lakeflow template #3671
Merged
Conversation
lennartkats-db commented on Sep 29, 2025
Comment on lines +37 to +38:

{{- /* We avoid a relative path here to work around https://github.com/databricks/cli/issues/3674 */}}
- --editable ${workspace.file_path}
Contributor
Author
N.B., this is a workaround for #3674; `-e .` doesn't work in the current CLI.
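For context, a minimal sketch of how such a dependency entry might sit in a job's serverless environment spec in the generated `databricks.yml`. Apart from the `--editable ${workspace.file_path}` line taken from the diff above, the resource names and version value are illustrative assumptions:

```yaml
# Hypothetical fragment of a generated databricks.yml; names and values are illustrative.
resources:
  jobs:
    sample_job:
      environments:
        - environment_key: default
          spec:
            environment_version: "2"
            dependencies:
              # A relative "-e ." is avoided here; see databricks/cli#3674.
              - --editable ${workspace.file_path}
```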
Collaborator
Updates acceptance test expectations for lakeflow-pipelines templates to reflect the environment_version change from 4 to 2. 🤖 Generated with [Claude Code](https://claude.ai/code) Co-Authored-By: Claude <[email protected]>
Resolved conflicts in NEXT_CHANGELOG.md by keeping only the new lakeflow-pipelines template update from this branch. 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude <[email protected]>
Force-pushed from ac73e99 to 5d261bc
Contributor
Author
Scheduling merge post-bugbash.
deco-sdk-tagging bot added a commit that referenced this pull request on Oct 16, 2025:
## Release v0.273.0

### Notable Changes
* (via Terraform v1.92.0) DABs will no longer try to update pipeline permissions upon pipeline deletion. This fixes PERMISSION_ERROR upon 'bundle destroy' for pipelines that have the run_as setting enabled (described in https://community.databricks.com/t95/data-engineering/dab-dlt-destroy-fails-due-to-ownership-permissions-mismatch/td-p/132101). The downside is that if the 'permissions:' block is removed from the resource, DABs will no longer try to restore permissions to just the owner of the pipeline.

### CLI
* Add the `--configure-serverless` flag to `databricks auth login` to configure Databricks Connect to use serverless.

### Dependency updates
* Upgrade Go SDK to 0.82.0 ([#3769](#3769))
* Upgrade TF provider to 1.92.0 ([#3772](#3772))

### Bundles
* Updated the internal lakeflow-pipelines template to use an "src" layout ([#3671](#3671)).
* Fix for pip flags with an equals sign being incorrectly treated as local file names ([#3766](#3766))
github-merge-queue bot pushed a commit that referenced this pull request on Nov 4, 2025:
…3712)

## Changes

This updates the `default-python` template according to the latest Lakeflow conventions as established in #3671. Notably, the new template moves away from the use of notebooks for pipeline source code. The new layout looks as follows when the user selects they want both the sample job and the sample pipeline:

`📁 resources`
`├── sample_job.job.yml`
`└── sample_etl.pipeline.yml`
`📁 src`
`├── 📁 my_project` — shared source code for use in jobs and/or pipelines
`│   ├── __init__.py`
`│   └── main.py`
`└── 📁 my_project_etl` — source code for the sample_etl pipeline
`    ├── __init__.py`
`    ├── 📁 transformations`
`    │   ├── __init__.py`
`    │   ├── sample_zones_my_project.py`
`    │   └── sample_trips_my_project.py`
`    ├── 📁 explorations` — exploratory notebooks
`    │   ├── __init__.py`
`    │   └── sample_exploration.ipynb`
`    └── README.md`
`📁 tests` — unit tests
`📁 fixtures` — fixtures (these can now be used with [`load_fixture`](https://github.com/databricks/cli/blob/af524bb993eaffe059d65f93854d544a162fc6ef/acceptance/bundle/templates/default-python/serverless/output/my_default_python/fixtures/.gitkeep))
`databricks.yml`
`pyproject.toml`
`README.md`

The template prompts have been updated to cater to this structure. Notably, they include a new prompt to manage the catalog and schema used by the template. These settings are propagated to both the job and the pipeline:

```
Welcome to the default Python template for Databricks Asset Bundles!

Answer the following questions to customize your project. You can always change your configuration in the databricks.yml file later.

Note that https://e2-dogfood.staging.cloud.databricks.com is used for initialization. (For information on how to change your profile, see https://docs.databricks.com/dev-tools/cli/profiles.html.)

Unique name for this project [my_project]: my_project
Include a Lakeflow job that runs a notebook: yes
Include an ETL pipeline: yes
Include a sample Python package that builds into a wheel file: yes
Use serverless compute: yes
Default catalog for any tables created by this project [main]: main
Use a personal schema for each user working on this project. (This is recommended. Your personal schema will be 'main.lennart_kats'.): yes

✨ Your new project has been created in the 'my_project' directory!

To get started, refer to the project README.md file and the documentation at https://docs.databricks.com/dev-tools/bundles/index.html.
```

## Testing

* Standard unit testing, acceptance testing
* AI exercised the templates with all permutations of options, deploying/testing/running/inspecting the result
* Bug bash of the original `lakeflow-pipelines` template from #3671

---------

Co-authored-by: Claude <[email protected]>
Co-authored-by: Julia Crawford (Databricks) <[email protected]>
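As a rough illustration of how the catalog/schema answers above could be propagated to a pipeline, here is a hypothetical `databricks.yml` fragment; the variable names, defaults, and resource key are assumptions rather than the template's actual output:

```yaml
# Hypothetical sketch: surfacing the catalog/schema prompts as bundle variables
# and wiring them into the sample pipeline. Not the template's literal output.
variables:
  catalog:
    description: Default catalog for any tables created by this project
    default: main
  schema:
    description: Schema used by the sample pipeline
    default: ${workspace.current_user.short_name}

resources:
  pipelines:
    sample_etl:
      name: sample_etl
      serverless: true
      catalog: ${var.catalog}
      schema: ${var.schema}
```

A per-user default such as `${workspace.current_user.short_name}` would mirror the "personal schema" prompt shown in the transcript.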
Changes

This PR prepares a generic `default` template that I want to use as the basis for `default-python`, `lakeflow-pipelines`, and (likely) `default-sql`. As part of this, the `lakeflow-pipelines` template gains `catalog` and `schema` parameters, since we ask a question about catalog/schema in the template.

To support the notion of a "generic" template, the template schema format now supports a `template_dir` argument. This allows us to have multiple `databricks_template_schema.json` files that point to one template directory.
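A rough sketch of how a schema file using this argument could be structured (shown here as YAML with comments for readability; the real `databricks_template_schema.json` is JSON). Apart from `template_dir` itself, which this PR introduces, the path and property definitions below are illustrative assumptions:

```yaml
# Illustrative structure only; the actual file is JSON and these values are assumptions.
template_dir: ../default   # hypothetical path pointing this schema at the shared template directory
welcome_message: "Welcome to the Lakeflow Pipelines template for Databricks Asset Bundles!"
properties:
  project_name:
    type: string
    default: my_lakeflow_pipelines
    description: Unique name for this project
    order: 1
  default_catalog:
    type: string
    default: main
    description: Default catalog for any tables created by this project
    order: 2
```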
Out of scope: this PR does not yet update `default-python`. For early testing purposes, an early version is available as `experimental-default-python`. To keep the diff cleaner, I removed acceptance tests for this template; that's something for a follow-up PR.

Why

Tests