Merged
Changes from 6 commits
1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
@@ -12,5 +12,6 @@
### Dependency updates

### Bundles
* Added support for the `--bind` flag in `bundle generate` ([#3782](https://github.com/databricks/cli/pull/3782))

### API Changes
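
The changelog entry above introduces the `--bind` flag; for orientation, a minimal usage sketch mirroring the example added to the `bundle generate job` help text later in this diff (job ID and resource key are placeholders):

# Generate configuration for an existing job and bind the bundle resource to it in one step
databricks bundle generate job --existing-job-id 12345 --key my_etl_job --bind
# A subsequent deploy applies the generated configuration to the existing job rather than creating a new one
databricks bundle deploy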
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/alert/output.txt
@@ -3,7 +3,8 @@

>>> [CLI] bundle deployment bind my_alert [UUID] --auto-approve
Updating deployment state...
Successfully bound alert with an id '[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound alert with an id '[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle summary
Name: test-bundle-$UNIQUE_NAME
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/cluster/output.txt
@@ -6,7 +6,8 @@

>>> [CLI] bundle deployment bind cluster1 [CLUSTER-ID] --auto-approve
Updating deployment state...
Successfully bound cluster with an id '[CLUSTER-ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound cluster with an id '[CLUSTER-ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deployment unbind cluster1
Updating deployment state...
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/dashboard/output.txt
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind dashboard1 [DASHBOARD_ID] --auto-approve
Updating deployment state...
Successfully bound dashboard with an id '[DASHBOARD_ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound dashboard with an id '[DASHBOARD_ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
(next changed file; name not shown)
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind dashboard1 [DASHBOARD_ID] --auto-approve
Updating deployment state...
Successfully bound dashboard with an id '[DASHBOARD_ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound dashboard with an id '[DASHBOARD_ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> errcode [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
(next changed file; name not shown)
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind database_instance1 [UUID] --auto-approve
Updating deployment state...
Successfully bound database_instance with an id '[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound database_instance with an id '[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle summary
Name: test-bundle-$UNIQUE_NAME
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/experiment/output.txt
@@ -3,7 +3,8 @@
=== Substitute variables in the template
=== Create a pre-defined experiment
=== Bind experiment: Updating deployment state...
Successfully bound experiment with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound experiment with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

=== Deploy bundle: Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
Deploying resources...
(next changed file; name not shown)
@@ -18,7 +18,8 @@ test.py

>>> [CLI] bundle deployment bind test_job_key [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-generate-bind-[UNIQUE_NAME]/files...
(next changed file; name not shown)
@@ -5,7 +5,8 @@ Created job with ID: [NUMID]
=== Bind job:
>>> [CLI] bundle deployment bind foo [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

=== Remove .databricks directory to simulate fresh deployment:
>>> rm -rf .databricks
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/job/noop-job/output.txt
@@ -3,7 +3,8 @@

>>> [CLI] bundle deployment bind job_1 [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/my_project/default/files...
(next changed file; name not shown)
@@ -3,7 +3,8 @@

>>> uv run --with [DATABRICKS_BUNDLES_WHEEL] -q [CLI] bundle deployment bind job_1 [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> uv run --with [DATABRICKS_BUNDLES_WHEEL] -q [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/my_project/default/files...
(next changed file; name not shown)
@@ -29,7 +29,8 @@ resources:

>>> [CLI] bundle deployment bind endpoint1 test-endpoint-[UUID]
Updating deployment state...
Successfully bound model_serving_endpoint with an id 'test-endpoint-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound model_serving_endpoint with an id 'test-endpoint-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
(next changed file; name not shown)
@@ -13,7 +13,8 @@

>>> [CLI] bundle deployment bind monitor1 catalog.schema.table
Updating deployment state...
Successfully bound quality_monitor with an id 'catalog.schema.table'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound quality_monitor with an id 'catalog.schema.table'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/bind-quality-monitor-test-localonly/default/files...
(next changed file; name not shown)
@@ -15,7 +15,8 @@ resources:

>>> [CLI] bundle deployment bind model1 main.test-schema-rmodel-[UUID].test-registered-model-[UUID]
Updating deployment state...
Successfully bound registered_model with an id 'main.test-schema-rmodel-[UUID].test-registered-model-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound registered_model with an id 'main.test-schema-rmodel-[UUID].test-registered-model-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/schema/output.txt
@@ -7,7 +7,8 @@
}

=== Bind schema: Updating deployment state...
Successfully bound schema with an id 'main.test-schema-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound schema with an id 'main.test-schema-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

=== Deploy bundle: Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
Deploying resources...
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/secret-scope/output.txt
@@ -3,7 +3,8 @@

>>> [CLI] bundle deployment bind secret_scope1 test-secret-scope-[UUID] --auto-approve
Updating deployment state...
Successfully bound secret_scope with an id 'test-secret-scope-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound secret_scope with an id 'test-secret-scope-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/bind-secret-scope-test-[UNIQUE_NAME]/default/files...
(next changed file; name not shown)
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind sql_warehouse1 [SQL-WAREHOUSE-ID] --auto-approve
Updating deployment state...
Successfully bound sql_warehouse with an id '[SQL-WAREHOUSE-ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound sql_warehouse with an id '[SQL-WAREHOUSE-ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle summary
Name: test-bundle-$UNIQUE_NAME
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/volume/output.txt
@@ -8,7 +8,8 @@
=== Create a pre-defined volume:
>>> [CLI] bundle deployment bind volume1 main.test-schema-[UUID].volume-[UUID] --auto-approve
Updating deployment state...
Successfully bound volume with an id 'main.test-schema-[UUID].volume-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound volume with an id 'main.test-schema-[UUID].volume-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
8 changes: 8 additions & 0 deletions acceptance/bundle/generate/auto-bind/databricks.yml.tmpl
@@ -0,0 +1,8 @@
bundle:
name: auto-bind-test

workspace:
root_path: /tmp/${UNIQUE_NAME}

include:
- resources/*.yml
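
The ${UNIQUE_NAME} placeholder in this template is rendered by the test script with envsubst before deployment; a minimal sketch of that step (the variable value is a placeholder):

# Render the template; envsubst substitutes ${UNIQUE_NAME} from the environment
export UNIQUE_NAME="example-1234"   # placeholder value for illustration
envsubst < databricks.yml.tmpl > databricks.yml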
5 changes: 5 additions & 0 deletions acceptance/bundle/generate/auto-bind/out.test.toml

(generated file; contents not rendered in this view)

72 changes: 72 additions & 0 deletions acceptance/bundle/generate/auto-bind/output.txt
@@ -0,0 +1,72 @@

=== Create a pre-defined job:
Created job with ID: [NUMID]

>>> [CLI] workspace mkdirs /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]

>>> [CLI] workspace import /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]/test --file test.py --language PYTHON

=== Generate and bind in one step:
>>> [CLI] bundle generate job --key test_job --existing-job-id [NUMID] --config-dir resources --source-dir src --bind
File successfully saved to src/test.py
Job configuration successfully saved to resources/test_job.job.yml
Updating deployment state...
Successfully bound job with an id '[NUMID]'

>>> ls src/
test.py

>>> cat resources/test_job.job.yml
resources:
jobs:
test_job:
name: auto-bind-job-[UNIQUE_NAME]
tasks:
- task_key: test
new_cluster:
azure_attributes:
availability: ON_DEMAND_AZURE
enable_elastic_disk: true
node_type_id: [NODE_TYPE_ID]
num_workers: 1
spark_version: 13.3.x-snapshot-scala2.12
email_notifications: {}
notebook_task:
notebook_path: ../src/test.py
source: WORKSPACE
run_if: ALL_SUCCESS
timeout_seconds: 0
email_notifications: {}
max_concurrent_runs: 1
queue:
enabled: true
timeout_seconds: 0
webhook_notifications: {}

=== Deploy the bound job:
>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/tmp/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Destroy the bundle:
>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete job test_job

All files and directories at the following location will be deleted: /Workspace/tmp/[UNIQUE_NAME]

Deleting files...
Destroy complete!

=== Check that job is bound and does not exist after bundle is destroyed:
>>> errcode [CLI] jobs get [NUMID] --output json
Error: Job [NUMID] does not exist.

Exit code: 1

=== Delete the tmp folder:
>>> [CLI] workspace delete /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]/test

>>> [CLI] workspace delete /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]
50 changes: 50 additions & 0 deletions acceptance/bundle/generate/auto-bind/script
@@ -0,0 +1,50 @@
title "Create a pre-defined job:\n"

PYTHON_NOTEBOOK_DIR="/Workspace/Users/${CURRENT_USER_NAME}/python-${UNIQUE_NAME}"
PYTHON_NOTEBOOK="${PYTHON_NOTEBOOK_DIR}/test"

JOB_ID=$($CLI jobs create --json '
{
"name": "auto-bind-job-'${UNIQUE_NAME}'",
"tasks": [
{
"task_key": "test",
"new_cluster": {
"spark_version": "'${DEFAULT_SPARK_VERSION}'",
"node_type_id": "'${NODE_TYPE_ID}'",
"num_workers": 1
},
"notebook_task": {
"notebook_path": "'${PYTHON_NOTEBOOK}'"
}
}
]
}' | jq -r '.job_id')

echo "Created job with ID: $JOB_ID"

envsubst < databricks.yml.tmpl > databricks.yml

cleanup() {
title "Delete the tmp folder:"
trace $CLI workspace delete ${PYTHON_NOTEBOOK}
trace $CLI workspace delete ${PYTHON_NOTEBOOK_DIR}
}
trap cleanup EXIT

trace $CLI workspace mkdirs "${PYTHON_NOTEBOOK_DIR}"
trace $CLI workspace import "${PYTHON_NOTEBOOK}" --file test.py --language PYTHON

title "Generate and bind in one step:"
trace $CLI bundle generate job --key test_job --existing-job-id $JOB_ID --config-dir resources --source-dir src --bind
trace ls src/
trace cat resources/test_job.job.yml

title "Deploy the bound job:"
trace $CLI bundle deploy

title "Destroy the bundle:"
trace $CLI bundle destroy --auto-approve

title "Check that job is bound and does not exist after bundle is destroyed:"
trace errcode $CLI jobs get "${JOB_ID}" --output json
2 changes: 2 additions & 0 deletions acceptance/bundle/generate/auto-bind/test.py
@@ -0,0 +1,2 @@
# Databricks notebook source
print("Test notebook")
26 changes: 26 additions & 0 deletions acceptance/bundle/generate/auto-bind/test.toml
@@ -0,0 +1,26 @@
# This test uses the workspace import API to load a notebook file.
# That API has logic for accepting notebook files and distinguishing them from regular Python files.
# To succeed locally we would need to replicate this logic in the fake_workspace.
Review comment (Contributor): Worth doing some time?

Reply (Contributor Author): Certainly, there are a bunch of other tests that need this; worth doing this as a separate PR.

Local = false
Cloud = true

Ignore = [
"databricks.yml",
"resources/*",
"src/*",
".databricks",
]

[EnvMatrix]
DATABRICKS_BUNDLE_ENGINE = ["terraform"]


[Env]
# MSYS2 automatically converts absolute paths like /Users/$username/$UNIQUE_NAME to
# C:/Program Files/Git/Users/$username/$UNIQUE_NAME before passing them to the CLI.
# Setting this environment variable prevents that conversion on Windows.
MSYS_NO_PATHCONV = "1"

[[Repls]]
Old = '\\'
New = '/'
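
The [[Repls]] rule above rewrites backslashes to forward slashes so Windows-style paths in captured output match the recorded expectations; a rough shell equivalent of that substitution (file names are placeholders):

# Rough equivalent of the Repls rule: normalize path separators before comparison
sed 's|\\|/|g' raw_output.txt > normalized_output.txt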
(next changed file; name not shown)
@@ -46,6 +46,7 @@ Flags:
--watch watch for changes to the dashboard and update the configuration

Global Flags:
--bind automatically bind the generated resource to the existing resource
--debug enable debug logging
--key string resource key to use for the generated configuration
-o, --output type output type: text or json (default text)
4 changes: 4 additions & 0 deletions acceptance/bundle/help/bundle-generate-job/output.txt
@@ -13,6 +13,9 @@
databricks bundle generate job --existing-job-id 67890 \
--key data_pipeline --config-dir resources --source-dir src

# Generate and automatically bind to the existing job
databricks bundle generate job --existing-job-id 12345 --key my_etl_job --bind

What gets generated:
- Job configuration YAML file in the resources directory
- Any associated notebook or Python files in the source directory
@@ -32,6 +35,7 @@
-s, --source-dir string Dir path where the downloaded files will be stored (default "src")

Global Flags:
--bind automatically bind the generated resource to the existing resource
--debug enable debug logging
--key string resource key to use for the generated configuration
-o, --output type output type: text or json (default text)
4 changes: 4 additions & 0 deletions acceptance/bundle/help/bundle-generate-pipeline/output.txt
@@ -14,6 +14,9 @@
databricks bundle generate pipeline --existing-pipeline-id def456 \
--key data_transformation --config-dir resources --source-dir src

# Generate and automatically bind to the existing pipeline
databricks bundle generate pipeline --existing-pipeline-id abc123 --key etl_pipeline --bind

What gets generated:
- Pipeline configuration YAML file with settings and libraries
- Pipeline notebooks downloaded to the source directory
@@ -32,6 +35,7 @@
-s, --source-dir string Dir path where the downloaded files will be stored (default "src")

Global Flags:
--bind automatically bind the generated resource to the existing resource
--debug enable debug logging
--key string resource key to use for the generated configuration
-o, --output type output type: text or json (default text)