@@ -31,7 +31,11 @@ limitations under the License.

#### Best practices

> This documentation content is currently under development.
##### Pipeline parameters

Elyra supports [configuring pipeline parameters](pipelines.html#defining-pipeline-parameters) for Kubeflow Pipelines workflows. A parameter can be selected as input to a custom node from the dropdown list for an input property. Only parameters whose type matches that of the given node property can be selected as input for that property. Currently, the Kubeflow Pipelines types `String`, `Bool`, `Float`, and `Integer` are supported.

See the [Kubeflow Pipelines documentation](https://www.kubeflow.org/docs/components/pipelines/v1/sdk/parameters/) for more information on parameters.
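
For illustration only (this example is not part of the Elyra documentation), the type of a custom node's input property is determined by the underlying component definition. A hypothetical Kubeflow Pipelines v1 lightweight component with a `String` and a `Float` input might look like the sketch below; only pipeline parameters of matching types could be selected for those inputs.

```python
# Hypothetical Kubeflow Pipelines (v1 SDK) lightweight component, shown only to
# illustrate where a custom node's input types come from. The "message" input is
# inferred as String and "learning_rate" as Float, so only pipeline parameters of
# those types could be selected for these inputs in the Visual Pipeline Editor.
from kfp.components import create_component_from_func

def log_message(message: str, learning_rate: float) -> str:
    return f"{message} (learning rate: {learning_rate})"

log_message_op = create_component_from_func(log_message, base_image="python:3.9")
```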

### Apache Airflow components

7 changes: 7 additions & 0 deletions docs/source/user_guide/best-practices-file-based-nodes.md
@@ -112,3 +112,10 @@ notebook or script is executed in:
The Kubernetes Secrets property can be used to associate environment variable names with secrets, preventing sensitive information from being exposed in the pipeline file, the pipeline editor, and the runtime environment. As with static environment variables, secret-based environment variable values can be set on an individual node and/or defined as pipeline default values and shared across nodes belonging to the same pipeline. A default value can also be overridden for a particular node by redefining the secret for a given variable name in the node properties.

Secrets are ignored when the pipeline is executed locally. For remote execution, if an environment variable was assigned both a static value (via the 'Environment Variables' property) and a Kubernetes secret value, the secret's value is used.

### Pipeline parameters

File-based components can take advantage of [pipeline parameters](pipelines.html#defining-pipeline-parameters) when the runtime processor supports them. In that case, the `Pipeline Parameters` property is present in the `Node Properties` panel; check the box next to each parameter that should be passed to this node.

![Select pipeline parameters](../images/user_guide/best-practices-file-based-nodes/elyra-node-parameters.png)

Parameters are passed to file-based nodes by setting them as environment variables in the node container. Because environment variable values are always strings, the parameter value appears as a string when accessed in the generic node, regardless of the `Type` that was selected for the parameter in the `Pipeline Parameters` tab.
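
For example (a minimal sketch, not taken from the Elyra documentation, and assuming the environment variable carries the parameter's name), a parameter defined with type `Float` still needs to be converted from its string form inside the notebook or script:

```python
import os

# Hypothetical parameter "learning_rate", defined with type Float in the
# Pipeline Parameters tab. Inside the node container it is exposed as an
# environment variable whose value is a string, so convert it before use.
learning_rate = float(os.environ.get("learning_rate", "0.01"))
print(f"Training with learning rate {learning_rate}")
```
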
58 changes: 49 additions & 9 deletions docs/source/user_guide/pipelines.md
@@ -107,6 +107,28 @@ Each pipeline node is configured using properties. Default node properties are a

- [Disable node caching](#disable-node-caching)

#### Defining pipeline parameters

Certain runtime-specific pipelines support pipeline parameters, which are configured in the "Pipeline Editor Properties" panel.
To access the panel, click the "Open panel" button on the right side and select the "Pipeline Parameters" tab. The tab is only present for runtime platforms that support pipeline parameters.

![Open the pipeline parameters panel](../images/user_guide/pipelines/open-pipeline-parameters.gif)

Click `Add` to add a parameter to this pipeline. Each parameter has the following attributes:

- `Parameter Name`: The name of this parameter. This must be a unique identifier among all defined parameters. Hover over the tooltip (`?`) to display a description that may include runtime-specific constraints on the format of a parameter name.
- `Description`: Optional. A description for this parameter.
- `Type`: The type of this parameter. The options displayed in this dropdown will be unique to the pipeline runtime platform.
- `Default Value`: Optional. A default value for the parameter. This value can be overridden by providing a value for this parameter during pipeline submit or export.
- `Required`: Whether a value is required for this parameter during pipeline submit or export. The default is `False`.

Next, configure node properties to use the defined parameters as desired. This process differs for generic and custom nodes. See the [`Node properties reference` subsection](#pipeline-parameters) or the [best practices guide](best-practices-file-based-nodes.html#pipeline-parameters) for information on configuring parameters for generic nodes.
For custom nodes, a parameter can be selected as input to a node from the dropdown list for an input property. Only parameters whose type matches that of the given node property can be selected as input for that property.

![Select a pipeline parameter to use as a node input](../images/user_guide/pipelines/select-parameter.gif)

On pipeline submit or export, all parameters that are referenced by generic or custom nodes are displayed in a pop-up dialog. Values can be assigned in this dialog to override any defined default values. A value must be provided if a parameter has been marked as required; the `OK` button is disabled until a value is entered. See [Running pipelines](#running-a-pipeline-from-the-visual-pipeline-editor) or [Exporting pipelines](#exporting-a-pipeline-from-the-visual-pipeline-editor) for more information.
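
As background (this is not Elyra-generated code), pipeline parameters on Kubeflow Pipelines correspond to typed arguments of the pipeline function, whose defaults can be overridden when a run is created; the submit/export dialog plays a conceptually similar role. A minimal KFP v1 SDK sketch with hypothetical parameter names:

```python
import kfp
from kfp import dsl

# Hypothetical pipeline with two typed, defaulted parameters (KFP v1 SDK).
@dsl.pipeline(name="example-pipeline", description="Pipeline parameter illustration")
def example_pipeline(message: str = "hello", learning_rate: float = 0.01):
    # A trivial step that consumes both parameters.
    dsl.ContainerOp(
        name="echo-parameters",
        image="alpine:3.18",
        command=["echo", message, learning_rate],
    )

# Overriding a default at submission time is analogous to entering a value in
# the submit/export dialog. The endpoint below is a placeholder.
client = kfp.Client(host="http://localhost:8080")
client.create_run_from_pipeline_func(
    example_pipeline, arguments={"learning_rate": 0.05}
)
```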

#### Adding nodes

Generic nodes are added to a pipeline by dragging notebooks or scripts from the JupyterLab File Browser onto the canvas.
@@ -149,6 +171,7 @@ Nodes that are implemented using [generic components](pipeline-components.html#g
- [Filename](#filename)
- [Runtime image](#runtime-image)
- [Resources (CPU, GPU, and RAM)](#resources-cpu-gpu-and-ram)
- [Pipeline parameters](#pipeline-parameters)
- [File dependencies](#file-dependencies)
- [Include subdirectories](#file-dependencies)
- [Environment variables](#environment-variables)
@@ -230,7 +253,7 @@ The following alphabetically sorted list identifies the node properties that are
- Format:
- _Environment variable_: name of the variable to be set. Example: `optimize`
- _Value_: the value to be assigned to said variable. Example: `true`
- A set of default environment variables can also be set in the pipeline properties tab. If any default environment variables are set, the **Environment Variables** property in the node properties tab will include these variables and their values with a note that each is a pipeline default. Pipeline default environment variables are not editable from the node properties tab. Individual nodes can override a pipeline default value for a given variable by re-defining the variable/value pair in its own node properties.

##### File Dependencies
- This property applies only to generic components.
@@ -283,6 +306,11 @@ The following alphabetically sorted list identifies the node properties that are
- A list of files generated by the notebook inside the image to be passed as inputs to the next step of the pipeline. Specify one file, directory, or expression per line. Supported patterns are `*` and `?`.
- Example: `data/*.csv`

##### Pipeline Parameters
- This property applies only to generic components. Custom components can also use pipeline parameters, but they are [configured differently](#defining-pipeline-parameters).
- A list of defined [pipeline parameters](#defining-pipeline-parameters) that should be passed to this generic component.
- Check the box next to a parameter name to indicate that it should be passed to this node. Parameters are passed to generic components by setting them as environment variables in the node container. Because environment variable values are always strings, the parameter value appears as a string when accessed in the generic node, regardless of the `Type` selected for the parameter in the `Pipeline Parameters` tab.

##### Resources: CPU, GPU, and RAM
- Resources that the notebook or script requires. RAM takes units of gigabytes (10<sup>9</sup> bytes).
- Specify a custom Kubernetes GPU vendor, if desired. The default vendor is `nvidia.com/gpu`. See [this topic in the Kubernetes documentation](https://kubernetes.io/docs/tasks/manage-gpus/scheduling-gpus/) for more information.
@@ -314,16 +342,22 @@ To run a pipeline from the Visual Pipeline Editor:

![Open pipeline run wizard](../images/user_guide/pipelines/pipeline-editor-run.png)

1. For generic pipelines select a runtime platform (local, Kubeflow Pipelines, Apache Airflow) and a runtime configuration for that platform. For runtime-specific pipelines select a runtime configuration.
2. For generic pipelines select a runtime platform (local, Kubeflow Pipelines, Apache Airflow) and a runtime configuration for that platform. For runtime-specific pipelines select a runtime configuration.

![Configure pipeline run options](../images/user_guide/pipelines/configure-pipeline-run-options.png)

1. Elyra does not include a pipeline run monitoring interface for pipelines:
3. [Configure pipeline parameters](pipelines.html#defining-pipeline-parameters), if applicable. If any nodes reference parameters defined in the `Pipeline Parameters` panel, the values of these parameters can be customized here. If a parameter is marked as required and no default value is set, a value must be provided before the `OK` button is enabled.

![Configure pipeline submit options with parameters](../images/user_guide/pipelines/configure-pipeline-submit-options-parameters.gif)

4. Select `OK`.

5. Elyra does not include a pipeline run monitoring interface for pipelines:
- For local/JupyterLab execution check the console output.
- For Kubeflow Pipelines open the Central Dashboard link.
- For Apache Airflow open the web GUI link.

1. The pipeline run output artifacts are stored in the following locations:
6. The pipeline run output artifacts are stored in the following locations:
- For local/JupyterLab execution all artifacts are stored in the local file system.
- For Kubeflow Pipelines and Apache Airflow output artifacts for generic components are stored in the runtime configuration's designated object storage bucket.

@@ -359,19 +393,25 @@ Before you can export a pipeline on Kubeflow Pipelines or Apache Airflow you mus
#### Exporting a pipeline from the Visual Pipeline Editor

To export a pipeline from the Visual Pipeline Editor:
1. Click `Export Pipeline` in the editor's tool bar.
1. Click `Export Pipeline` in the editor's toolbar.

![Open pipeline export wizard](../images/user_guide/pipelines/pipeline-editor-export.png)

1. For generic pipelines select a runtime platform (local, Kubeflow Pipelines, or Apache Airflow) and a runtime configuration for that platform. For runtime-specific pipelines select a runtime configuration.
2. For generic pipelines select a runtime platform (local, Kubeflow Pipelines, or Apache Airflow) and a runtime configuration for that platform. For runtime-specific pipelines select a runtime configuration.

1. Select an export format.
3. Select an export format.

1. Customize your file name using the Export Filename box
4. Customize your file name using the Export Filename box

![Configure pipeline export options](../images/user_guide/pipelines/configure-pipeline-export-options.png)

1. Import the exported pipeline file using the Kubeflow Central Dashboard or add it to the Git repository that Apache Airflow is monitoring.
5. [Configure pipeline parameters](pipelines.html#defining-pipeline-parameters), if applicable. If any nodes reference parameters defined in the `Pipeline Parameters` panel, the values of these parameters can be customized here. If a parameter is marked as required and no default value is set, a value must be provided before the `OK` button is enabled.

![Configure pipeline export options with parameters](../images/user_guide/pipelines/configure-pipeline-export-options-parameters.gif)

6. Select `OK`.

7. Import the exported pipeline file using the Kubeflow Central Dashboard or add it to the Git repository that Apache Airflow is monitoring.


#### Exporting a pipeline from the command line interface