Update the default-python template according to Lakeflow conventions #3712
Merged
Conversation
Collaborator
12 failing tests:
🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude <[email protected]>
…n-template

# Conflicts:
# NEXT_CHANGELOG.md
# acceptance/auth/bundle_and_profile/output.txt
# acceptance/bundle/templates/default-python/classic/out.plan_prod.direct.json
# acceptance/bundle/templates/default-python/integration_classic/output.txt
# acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/resources/lakeflow_pipelines_etl.pipeline.yml
# acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/resources/my_lakeflow_pipelines_etl.pipeline.yml
# acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/lakeflow_pipelines_etl/README.md
# acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/my_lakeflow_pipelines_etl/README.md
# acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/resources/lakeflow_pipelines_etl.pipeline.yml
# acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/resources/my_lakeflow_pipelines_etl.pipeline.yml
# acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/src/lakeflow_pipelines_etl/README.md
# acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/src/my_lakeflow_pipelines_etl/README.md
# libs/template/template.go
# libs/template/templates/default/template/__preamble.tmpl
# libs/template/templates/experimental-default-python-vnext/databricks_template_schema.json
# libs/template/templates/lakeflow-pipelines/databricks_template_schema.json
…ibility

This commit applies several critical fixes to the template system:

1. Non-UC workspace support (deeeb38): Set default_catalog to "hive_metastore" when no UC metastore is available, ensuring templates work on non-UC workspaces. Applied to default-python and lakeflow-pipelines templates.
2. Fix catalog property references (63758b4): Changed template conditionals from using the helper function `default_catalog` to the property `.default_catalog` in pipeline templates. This ensures proper evaluation of the user-provided value.
3. Serverless catalog field (0088b28): Always emit the catalog field for serverless pipelines, changing the comment to clarify that "Serverless compute requires Unity Catalog". Applied to both YAML and Python pipeline templates.
4. PyDABs compatibility: Added missing enable_pydabs property to default-python template schema to support the merged PyDABs infrastructure from main.

Acceptance test outputs have been updated to reflect these changes.
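To make the second and third fixes more concrete, here is a minimal sketch of what the affected part of a pipeline resource template could look like after the change. It assumes Go text/template syntax inside a `.pipeline.yml.tmpl` file; the resource name and surrounding fields are illustrative, not the literal template contents:

```yaml
# Sketch of a pipeline resource template (.pipeline.yml.tmpl); illustrative only.
resources:
  pipelines:
    sample_etl:
      name: sample_etl
      serverless: true
      # Serverless compute requires Unity Catalog, so the catalog field is
      # always emitted. The value is read from the user-provided template
      # property .default_catalog (rather than the default_catalog helper
      # function), so the prompt answer is what ends up in the config.
      catalog: {{.default_catalog}}
```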
pietern reviewed Nov 3, 2025
pietern reviewed Nov 3, 2025
Integration tests were failing because when a UC workspace has no
metastore default catalog, the template would use "hive_metastore"
for serverless pipelines, which the API rejects.
Hardcode 'catalog: main' instead of '${var.catalog}' for this case.
This provides a clear error if "main" doesn't exist and matches the
approach used in origin/main.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>
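As a rough illustration of the resulting behavior, the generated pipeline resource for that case might look like the sketch below. Only the `catalog: main` fallback is taken from the commit message above; the resource name and other fields are assumptions:

```yaml
# Sketch of the generated resource when the workspace has no metastore
# default catalog; names other than the catalog value are illustrative.
resources:
  pipelines:
    sample_etl:
      name: sample_etl
      serverless: true
      # Hardcoded instead of ${var.catalog}: a "hive_metastore" value would be
      # rejected by the API for serverless pipelines, while a missing "main"
      # catalog produces a clear error at deploy time.
      catalog: main
```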
deco-sdk-tagging bot added a commit that referenced this pull request on Nov 5, 2025
## Release v0.276.0

### CLI
* Remove previously added flags from the `jobs create` and `pipelines create` commands. ([#3870](#3870))

### Bundles
* Updated the default-python template to follow the Lakeflow conventions: pipelines as source files, pyproject.toml. ([#3712](#3712))
* Fix a permissions bug that added a second IS_OWNER grant and caused a "The job must have exactly one owner." error. Introduced in 0.274.0. ([#3850](#3850))
Changes
This updates the `default-python` template according to the latest Lakeflow conventions as established in #3671. Notably, the new template moves away from using notebooks for pipeline source code.

The new layout looks as follows when the user selects both the sample job and the sample pipeline:
📁 resources
├── sample_job.job.yml
└── sample_etl.pipeline.yml
📁 src
├── 📁 my_project — shared source code for use in jobs and/or pipelines
│   ├── __init__.py
│   └── main.py
└── 📁 my_project_etl — source code for the sample_etl pipeline
    ├── __init__.py
    ├── 📁 transformations
    │   ├── __init__.py
    │   ├── sample_zones_my_project.py
    │   └── sample_trips_my_project.py
    ├── 📁 explorations — exploratory notebooks
    │   ├── __init__.py
    │   └── sample_exploration.ipynb
    └── README.md
📁 tests — unit tests
📁 fixtures — fixtures (these can now be used with load_fixture)
databricks.yml
pyproject.toml
README.md

The template prompts have been updated to cater to this structure. Notably, they include a new prompt to manage the catalog and schema used by the template. These settings are propagated to both the job and the pipeline.
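A minimal sketch of how those prompt values could be propagated, assuming they are surfaced as bundle variables named `catalog` and `schema` (the variable names and file contents here are illustrative, not the exact template output):

```yaml
# databricks.yml (sketch): prompt answers exposed as bundle variables.
variables:
  catalog:
    description: Catalog used by the sample job and pipeline.
    default: main
  schema:
    description: Schema used by the sample job and pipeline.
    default: my_project

# resources/sample_etl.pipeline.yml (sketch): the pipeline reads the same values,
# and includes the non-notebook source files from the src/ layout.
resources:
  pipelines:
    sample_etl:
      name: sample_etl
      catalog: ${var.catalog}
      schema: ${var.schema}
      libraries:
        - glob:
            include: ../src/my_project_etl/transformations/**
```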
Testing
`lakeflow-pipelines` template from Prepare a new "generic" template, use "src layout" for Lakeflow template (#3671)