
Decorators

@batch(...)

[source]

from metaflow import batch

Step decorator to specify that this step should execute on AWS Batch.

This decorator indicates that your step should execute on AWS Batch. Note that you can apply this decorator automatically to all steps using the --with batch argument when calling run/resume. Step-level decorators within the code are overrides and will force a step to execute on AWS Batch regardless of the --with specification.

To use, annotate your step as follows:

@batch
@step
def my_step(self):
    ...
Parameters 

cpu: int

Number of CPUs required for this step. Defaults to 1. If @resources is also present, the maximum value from all decorators is used

gpu: int

Number of GPUs required for this step. Defaults to 0. If @resources is also present, the maximum value from all decorators is used

memory: int

Memory size (in MB) required for this step. Defaults to 4096. If @resources is also present, the maximum value from all decorators is used

image: string

Docker image to use when launching on AWS Batch. If not specified, a default Docker image mapping to the current version of Python is used

queue: string

AWS Batch Job Queue to submit the job to. Defaults to the one specified by the environment variable METAFLOW_BATCH_JOB_QUEUE

iam_role: string

AWS IAM role that the AWS Batch container uses to access AWS cloud resources (Amazon S3, Amazon DynamoDB, etc.). Defaults to the one specified by the environment variable METAFLOW_ECS_S3_ACCESS_IAM_ROLE

execution_role: string

AWS IAM role that AWS Batch can use to trigger AWS Fargate tasks. Defaults to the one determined by the environment variable METAFLOW_ECS_FARGATE_EXECUTION_ROLE. See https://docs.aws.amazon.com/batch/latest/userguide/execution-IAM-role.html for details.

shared_memory: int

The value for the size (in MiB) of the /dev/shm volume for this step. This parameter maps to the --shm-size option to docker run.

max_swap: int

The total amount of swap memory (in MiB) a container can use for this step. This parameter is translated to the --memory-swap option to docker run where the value is the sum of the container memory plus the max_swap value.

swappiness: int

This allows you to tune memory swappiness behavior for this step. A swappiness value of 0 causes swapping not to happen unless absolutely necessary. A swappiness value of 100 causes pages to be swapped very aggressively. Accepted values are whole numbers between 0 and 100.
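For example, a step can request more resources and a specific job queue; the queue name and resource values below are illustrative:

@batch(cpu=4, gpu=1, memory=16000, queue='my-gpu-queue')
@step
def train(self):
    ...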

Attributes 

package_sha

package_url

run_time_limit

@card(...)

[source]

from metaflow import card

@catch(...)

[source]

from metaflow import catch

Step decorator to specify error handling for your step.

This decorator indicates that exceptions in the step should be caught and not fail the entire flow.

This can be used in conjunction with the @retry decorator. In that case, catch will only activate if all retries fail and will catch the last exception thrown by the last retry.

To use, annotate your step as follows:

@catch(var='foo')
@step
def my_step(self):
    ...
Parameters 

var: string

Name of the artifact in which to store the caught exception. If not specified, the exception is not stored

print_exception: bool

Determines whether or not the exception is printed to stdout when caught. Defaults to True
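For example, a sketch combining @catch with @retry, where a downstream step inspects the stored exception artifact. It assumes the artifact is None when the step succeeds; step, artifact, and function names are illustrative:

@catch(var='train_failure')
@retry(times=2)
@step
def train(self):
    # fit_model is a hypothetical function that may raise an exception
    self.model = fit_model()
    self.next(self.report)

@step
def report(self):
    if self.train_failure is not None:
        print('training failed:', self.train_failure)
    self.next(self.end)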

@conda(...)

[source]

from metaflow import conda

Conda decorator that sets the Conda environment for your step.

To use, add this decorator to your step:

@conda
@step
def my_step(self):
    ...

Information in this decorator overrides any @conda_base flow-level decorator.

Parameters

libraries: Dict

Libraries to use for this step. The key is the name of the package and the value is the version to use. Defaults to {}

python: string

Version of Python to use (for example: '3.7.4'). Defaults to None (will use the current Python version)

disabled: bool

If set to True, disables Conda. Defaults to False
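For example, a step can pin a library and a Python version; the versions shown are illustrative:

@conda(libraries={'numpy': '1.21.2'}, python='3.8.10')
@step
def my_step(self):
    import numpy
    ...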

Attributes 

conda

environments

@kubernetes(...)

[source]

from metaflow import kubernetes

Step decorator to specify that this step should execute on Kubernetes.

This decorator indicates that your step should execute on Kubernetes. Note that you can apply this decorator automatically to all steps using the --with kubernetes argument when calling run/resume. Step-level decorators within the code are overrides and will force a step to execute on Kubernetes regardless of the --with specification.

To use, annotate your step as follows:

@kubernetes
@step
def my_step(self):
    ...

Parameters

cpu: int

Number of CPUs required for this step. Defaults to 1. If @resources is also present, the maximum value from all decorators is used

memory: int

Memory size (in MB) required for this step. Defaults to 4096. If @resources is also present, the maximum value from all decorators is used

disk: int

Disk size (in MB) required for this step. Defaults to 10240 (10 GB). If @resources is also present, the maximum value from all decorators is used

image: string

Docker image to use when launching on Kubernetes. If not specified, a default Docker image mapping to the current version of Python is used
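For example, a step can request specific resources and a custom image; the values and image name are illustrative:

@kubernetes(cpu=2, memory=8192, image='python:3.8')
@step
def my_step(self):
    ...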

Attributes 

package_sha

package_url

run_time_limit

@parallel

[source]

from metaflow import parallel

@project

[source]

from metaflow import project

@resources(...)

[source]

from metaflow import resources

Step decorator to specify the resources needed when executing this step.

This decorator passes this information along to the container orchestrator (AWS Batch, Kubernetes, etc.) when requesting resources to execute this step.

This decorator is ignored if the execution of the step happens locally.

To use, annotate your step as follows:

@resources(cpu=32)
@step
def my_step(self):
    ...

Parameters

cpu: int

Number of CPUs required for this step. Defaults to 1

gpu: int

Number of GPUs required for this step. Defaults to 0

memory: int

Memory size (in MB) required for this step. Defaults to 4096

shared_memory: int

The value for the size (in MiB) of the /dev/shm volume for this step. This parameter maps to the --shm-size option to docker run.
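For example, a step can declare several requirements at once; the values are illustrative, and they take effect when the flow runs on a remote backend, e.g. with run --with batch:

@resources(cpu=8, gpu=1, memory=32000)
@step
def train(self):
    ...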

@step(...)

[source]

from metaflow import step

The step decorator. Makes a method a step in the workflow.
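A minimal sketch of a complete flow built from @step methods; the class and artifact names are illustrative:

from metaflow import FlowSpec, step

class MinimalFlow(FlowSpec):

    @step
    def start(self):
        # Artifacts assigned to self are persisted and passed to later steps
        self.message = 'hello'
        self.next(self.end)

    @step
    def end(self):
        print(self.message)

if __name__ == '__main__':
    MinimalFlow()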