Apache Airflow® Executors
Executors are a configuration property of the scheduler process in every Apache Airflow® environment. The executor you choose for a task determines where and how it runs. You can choose from several pre-configured executors designed for different use cases, or you can define a custom executor. Airflow 2.10 introduced the experimental multi-executor configuration feature, which lets you specify executors for individual tasks without needing to use pre-configured combined executor options.
In this guide, you'll learn how to choose and configure executors in Airflow.
To learn more about how to adjust scaling parameters for task execution in Airflow, see Scaling Airflow to optimize performance.
Assumed knowledge
To get the most out of this guide, you should have an understanding of:
- Basic Airflow concepts. See Introduction to Apache Airflow.
- Airflow components. See Airflow components.
Choosing an executor
There are several pre-configured executors in Airflow for local and production use cases. In production, Astronomer recommends using either of the following executors:
- CeleryExecutor: Uses a Celery backend (such as Redis, RabbitMQ, Redis Sentinel, or another message queue system) to coordinate tasks between pre-configured workers. This executor is ideal for high volumes of short-running tasks or environments with consistent task loads. The CeleryExecutor is available as part of the Celery provider.
- KubernetesExecutor: Calls the Kubernetes API to create a separate Kubernetes pod for each task, which lets you pass custom configurations to individual tasks and use resources efficiently. The KubernetesExecutor is available as part of the CNCF Kubernetes provider. This executor is ideal in the following scenarios:
- You have long-running tasks that you don't want to be interrupted by code deploys or Airflow updates.
- Your tasks require very specific resource configurations (see the sketch after this list).
- Your tasks run infrequently, and you don't want to incur worker resource costs when they aren't running.
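For per-task resource configurations, the KubernetesExecutor accepts a pod_override in a task's executor_config parameter. The following is a minimal sketch; the task ID and resource values are illustrative:

```python
from airflow.operators.bash import BashOperator
from kubernetes.client import models as k8s

BashOperator(
    task_id="resource_intensive_task",  # illustrative task ID
    bash_command="echo 'crunching numbers'",
    executor_config={
        "pod_override": k8s.V1Pod(
            spec=k8s.V1PodSpec(
                containers=[
                    k8s.V1Container(
                        name="base",  # "base" targets the main task container
                        resources=k8s.V1ResourceRequirements(
                            requests={"cpu": "1", "memory": "4Gi"},
                        ),
                    )
                ],
            ),
        ),
    },
)
```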
For local development, Astronomer recommends using the LocalExecutor. It executes tasks locally inside the scheduler process and does not require workers. It supports parallelism and hyperthreading.
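For example, a minimal airflow.cfg sketch for local development; the parallelism value is illustrative and caps how many task instances can run at once:

```ini
[core]
executor = LocalExecutor
# maximum number of task instances allowed to run concurrently
parallelism = 32
```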
Other available executors include the SequentialExecutor, the experimental AWS ECS Executor, and the experimental AWS Batch Executor.
Additionally, you can write and use your own custom executor.
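As a rough sketch, a custom executor subclasses BaseExecutor and implements at least execute_async and sync. The class name and logging below are illustrative, and a real executor must actually dispatch and track work:

```python
from airflow.executors.base_executor import BaseExecutor


class MyCustomExecutor(BaseExecutor):
    """Illustrative skeleton, not a production-ready executor."""

    def execute_async(self, key, command, queue=None, executor_config=None):
        # The scheduler calls this method to hand off a task instance.
        # A real executor submits `command` to its execution backend here.
        self.log.info("Received task %s", key)

    def sync(self):
        # Called on every scheduler heartbeat to reconcile task states.
        # A real executor checks its backend and reports results with
        # self.success(key) or self.fail(key).
        pass
```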
Two statically coded hybrid executors, the CeleryKubernetesExecutor and the LocalKubernetesExecutor, allow you to use two different executors in the same Airflow environment in versions 2.9 and earlier. These executors are rarely used, and as of Airflow 2.10 they are no longer recommended.
Configure your executor on Astro
Astro users can choose and configure their executors when creating a deployment. See Manage Airflow executors on Astro for more information. Astro supports the CeleryExecutor and the KubernetesExecutor.
Astronomer's open-source local development environment, the Astro CLI, uses the LocalExecutor.
Configure your executor for self-hosted Airflow
When working with self-hosted Airflow solutions, you can set your executor using the core.executor Airflow config variable.
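For example, a minimal airflow.cfg snippet selecting the CeleryExecutor. The same value can be supplied through the AIRFLOW__CORE__EXECUTOR environment variable, which takes precedence over the file:

```ini
[core]
executor = CeleryExecutor
```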
Note that when using self-hosted Airflow with executors appropriate for production, you will need to configure your own Celery or Kubernetes setup. For more information on available configuration parameters, see the configuration references for the CeleryExecutor and KubernetesExecutor.
Run multiple executors concurrently
In Airflow 2.10+, you can run a multi-executor configuration. Note that this feature is experimental. When using multiple executors, you need to provide the relevant classes to the core.executor Airflow config variable as a comma-separated string. You can give an executor class a short name by adding it after a : character:
```ini
[core]
executor = CeleryExecutor,KubernetesExecutor,my_custom_package.MyCustomExecutor:MyExecutor
```
The first executor in the list is the default executor. To assign a specific task to another executor from the list, set its executor parameter to the class name or short name of that executor.
TaskFlow API:

```python
from airflow.decorators import task


@task(executor="MyExecutor")
def my_task_with_custom_execution():
    print("Hi! :)")


my_task_with_custom_execution()
```
Traditional syntax:

```python
from airflow.operators.bash import BashOperator

BashOperator(
    task_id="my_task_in_its_own_pod",
    executor="KubernetesExecutor",
    bash_command="echo 'hi :)'",
)
```
You can also override the default executor for all tasks in one DAG by setting the executor key in the default_args DAG parameter. Note that this feature is not yet supported by Astro Runtime.
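As a sketch, assuming the multi-executor configuration shown above, a DAG-level override could look like the following; the DAG and task names are illustrative:

```python
from pendulum import datetime

from airflow.decorators import dag, task


@dag(
    start_date=datetime(2024, 1, 1),
    schedule=None,
    # applies to every task in this DAG unless a task sets its own executor
    default_args={"executor": "KubernetesExecutor"},
)
def my_multi_executor_dag():
    @task
    def my_task():
        print("Runs with the KubernetesExecutor")

    my_task()


my_multi_executor_dag()
```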