flowerpower pipeline Commands¶
This section details the commands available under flowerpower pipeline.
run¶
Run a pipeline immediately.
This command executes a pipeline with the specified configuration and inputs. The pipeline will run synchronously, and the command will wait for completion.
Usage¶
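A sketch of the invocation, based on the options table below (the exact form may vary by version):

```shell
flowerpower pipeline run [OPTIONS] NAME
```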
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| name | str (arg) | Name of the pipeline to run. | — |
| --executor | str | Executor type: one of "synchronous", "threadpool", "processpool", "ray", "dask". | None |
| --executor-cfg | str | Executor configuration as JSON/dict; supports keys: type, max_workers, num_cpus. | None |
| --executor-max-workers | int | Convenience: set executor.max_workers. | None |
| --executor-num-cpus | int | Convenience: set executor.num_cpus. | None |
| --base-dir | str | Base directory for the pipeline/project. | None |
| --inputs | str | Inputs as JSON/dict string; parsed to dict. | None |
| --final-vars | str | Final variables as JSON/list string; parsed to list. | None |
| --config | str | Hamilton runtime config as JSON/dict string. | None |
| --cache | str | Cache config as JSON/dict string. | None |
| --storage-options | str | Storage options as JSON/dict string; parsed to dict. | None |
| --log-level | str | Logging level: debug, info, warning, error, critical. | None |
| --with-adapter | str | Adapter config as JSON/dict string. | None |
| --max-retries | int | Max retry attempts on failure. | 0 |
| --retry-delay | float | Base delay between retries (seconds). | 1.0 |
| --jitter-factor | float | Random jitter factor [0-1]. | 0.1 |
Examples¶
YAML values support ${VAR} interpolation. Example in conf/pipelines/<name>.yml:
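A minimal sketch of such a config; the keys and the MY_API_KEY variable are illustrative assumptions, not the tool's confirmed schema:

```yaml
# conf/pipelines/my_pipeline.yml (illustrative keys)
params:
  api_key: ${MY_API_KEY}   # resolved via ${VAR} interpolation at load time
```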
Executor config JSON example (shell-escaped):
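For instance, a threadpool executor with eight workers might be passed like this (pipeline name assumed):

```shell
# Single quotes keep the JSON string intact through the shell
flowerpower pipeline run my_pipeline \
  --executor-cfg '{"type": "threadpool", "max_workers": 8}'
```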
Convenience flags example:
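The same settings expressed with the convenience flags (pipeline name assumed):

```shell
flowerpower pipeline run my_pipeline \
  --executor threadpool \
  --executor-max-workers 8
```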
new¶
Create a new pipeline.
This command creates the configuration and module files for a new pipeline in the project.
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| name | str (arg) | Name for the new pipeline | — |
| --base-dir | str | Base directory to create the pipeline in | None |
| --storage-options | str | Options for storage backends (JSON/dict string) | None |
| --log-level | str | Logging level (debug, info, warning, error, critical) | None |
| --overwrite | bool | Overwrite existing pipeline if it exists | False |
Examples¶
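A sketch of creating a pipeline with these options (the command and pipeline names are assumed):

```shell
# Create a pipeline, replacing any existing one with the same name
flowerpower pipeline new my_pipeline --overwrite
```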
delete¶
Delete a pipeline's configuration and/or module files.
This command removes a pipeline's configuration file and/or module file from the project. If neither --cfg nor --module is specified, both will be deleted.
Usage¶
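A sketch of the invocation, per the options below:

```shell
flowerpower pipeline delete [OPTIONS] NAME
```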
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| name | str (arg) | Name of the pipeline to delete | — |
| --base-dir | str | Base directory containing the pipeline | None |
| --cfg | bool | Delete only the configuration file | False |
| --module | bool | Delete only the pipeline module | False |
| --storage-options | str | Options for storage backends (JSON/dict string) | None |
| --log-level | str | Logging level | None |
Examples¶
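Sketches with an assumed pipeline name:

```shell
# Remove both the config file and the module (the default)
flowerpower pipeline delete my_pipeline

# Remove only the configuration file, keeping the module
flowerpower pipeline delete my_pipeline --cfg
```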
show_dag¶
Show the DAG (Directed Acyclic Graph) of a pipeline.
This command generates and displays a visual representation of the pipeline's execution graph, showing how nodes are connected and the dependencies between them.
Usage¶
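A sketch of the invocation; the command is spelled here as in the heading, though some installations expose hyphenated command names:

```shell
flowerpower pipeline show_dag [OPTIONS] NAME
```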
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| name | str (arg) | Name of the pipeline to visualize | — |
| --base-dir | str | Base directory containing the pipeline | None |
| --storage-options | str | Options for storage backends (JSON/dict string) | None |
| --log-level | str | Logging level | None |
| --format | str | Output format: png, svg, pdf; raw returns the graph object | "png" |
Examples¶
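A sketch with an assumed pipeline name:

```shell
# Display the DAG rendered as SVG
flowerpower pipeline show_dag my_pipeline --format svg
```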
save_dag¶
Save the DAG (Directed Acyclic Graph) of a pipeline to a file.
This command generates a visual representation of the pipeline's execution graph and saves it to a file in the specified format.
Usage¶
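A sketch of the invocation, per the options below:

```shell
flowerpower pipeline save_dag [OPTIONS] NAME
```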
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| name | str (arg) | Name of the pipeline to visualize | — |
| --base-dir | str | Base directory containing the pipeline | None |
| --storage-options | str | Options for storage backends (JSON/dict string) | None |
| --log-level | str | Logging level | None |
| --format | str | Output format: png, svg, pdf | "png" |
| --output-path | str | Custom file path to save the output | None |
Examples¶
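A sketch with assumed pipeline and output names:

```shell
# Save the DAG as a PDF to a custom path
flowerpower pipeline save_dag my_pipeline --format pdf --output-path docs/dag.pdf
```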
show_pipelines¶
List all available pipelines in the project.
This command displays a list of all pipelines defined in the project, providing an overview of what pipelines are available to run.
Usage¶
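A sketch of the invocation, per the options below:

```shell
flowerpower pipeline show_pipelines [OPTIONS]
```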
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| --base-dir | str | Base directory containing pipelines | None |
| --storage-options | str | Options for storage backends (JSON/dict string) | None |
| --log-level | str | Logging level | None |
| --format | str | Output format (table, json, yaml) | "table" |
Examples¶
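For example, switching off the default table output:

```shell
# List pipelines as JSON instead of a table
flowerpower pipeline show_pipelines --format json
```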
show_summary¶
Show summary information for one or all pipelines.
This command displays detailed information about pipelines including their configuration, code structure, and project context. You can view information for a specific pipeline or get an overview of all pipelines.
Usage¶
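A sketch of the invocation, per the options below:

```shell
flowerpower pipeline show_summary [OPTIONS]
```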
Options¶
| Name | Type | Description | Default |
|---|---|---|---|
| --name | str | Name of specific pipeline to summarize (all if not specified) | None |
| --cfg | bool | Include configuration details | True |
| --code | bool | Include code/module details | True |
| --project | bool | Include project context information | True |
| --base-dir | str | Base directory containing pipelines | None |
| --storage-options | str | Options for storage backends (JSON/dict string) | None |
| --log-level | str | Logging level | None |
| --to-html | bool | Generate HTML output instead of text | False |
| --to-svg | bool | Generate SVG output (where applicable) | False |
| --output-file | str | File path to save the output instead of printing to console | None |
Examples¶
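A sketch with assumed pipeline and file names:

```shell
# Summarize one pipeline and write the report as HTML
flowerpower pipeline show_summary --name my_pipeline --to-html --output-file summary.html
```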
add_hook¶
Add a hook to a pipeline configuration.
This command adds a hook function to a pipeline's configuration. Hooks are functions that are called at specific points during pipeline execution to perform additional tasks like logging, monitoring, or data validation.
Usage¶
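A sketch of the invocation; whether the remaining parameters are positional or flags is not confirmed here, so only the two names from the table below are shown as arguments:

```shell
flowerpower pipeline add_hook [OPTIONS] NAME FUNCTION_NAME
```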
Arguments¶
| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Name of the pipeline to add the hook to | Required |
| function_name | str | Name of the hook function (must be defined in the pipeline module) | Required |
| type | str | Type of hook (determines when the hook is called during execution) | Required |
| to | str | Target node or tag (required only for node-specific hooks) | None |
| base_dir | str | Base directory containing the pipeline | None |
| storage_options | str | Options for storage backends (JSON/dict string) | None |
| log_level | str | Logging level | None |