Airflow
The airflow orchestrator allows workflows to be programmatically authored, scheduled, and monitored.
Getting Started #
Prerequisites #
If you haven't already, follow the initial steps of the Getting Started guide:
Installation and configuration #
Using the Command Line Interface #
- Add the airflow orchestrator to your project using meltano add:
  meltano add orchestrator airflow
- Configure the settings below using meltano config.
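Taken together, the two steps above might look like this in a shell session. The setting name and value used here are illustrative examples, not required values:

```shell
# Add the Airflow orchestrator to your Meltano project
meltano add orchestrator airflow

# List the plugin's current configuration
meltano config airflow list

# Set an individual setting, e.g. the DAGs folder (the path is an example)
meltano config airflow set core.dags_folder "$MELTANO_PROJECT_ROOT/orchestrate/dags"
```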
Next steps #
- Use the meltano schedule command to create pipeline schedules in your project, to be run by Airflow.
- Start the Airflow scheduler and webserver, or execute Airflow commands directly, using the instructions in the Meltano docs.
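As a sketch of those next steps, assuming a recent Meltano version (where schedules are created with meltano schedule add) and using hypothetical extractor, loader, and schedule names:

```shell
# Create a daily pipeline schedule (tap/target names are examples)
meltano schedule add gitlab-to-postgres \
  --extractor tap-gitlab \
  --loader target-postgres \
  --interval @daily

# Run Airflow commands through the plugin:
# start the scheduler, which picks up the generated DAGs
meltano invoke airflow scheduler

# and, in another terminal, the webserver UI
meltano invoke airflow webserver
```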
If you run into any issues, learn how to get help.
Settings #
The supported settings are documented below.
DAGs Folder (core.dags_folder) #
- Environment variable: AIRFLOW__CORE__DAGS_FOLDER, alias: AIRFLOW_CORE.DAGS_FOLDER
- Default: $MELTANO_PROJECT_ROOT/orchestrate/dags
How to use #
Manage this setting using meltano config or an environment variable:
meltano config airflow set core.dags_folder <core.dags_folder>
export AIRFLOW__CORE__DAGS_FOLDER=<core.dags_folder>
Plugins Folder (core.plugins_folder) #
- Environment variable: AIRFLOW__CORE__PLUGINS_FOLDER, alias: AIRFLOW_CORE.PLUGINS_FOLDER
- Default: $MELTANO_PROJECT_ROOT/orchestrate/plugins
How to use #
Manage this setting using meltano config or an environment variable:
meltano config airflow set core.plugins_folder <core.plugins_folder>
export AIRFLOW__CORE__PLUGINS_FOLDER=<core.plugins_folder>
SQL Alchemy Connection (core.sql_alchemy_conn) #
- Environment variable: AIRFLOW__CORE__SQL_ALCHEMY_CONN, alias: AIRFLOW_CORE.SQL_ALCHEMY_CONN
- Default: sqlite:///$MELTANO_PROJECT_ROOT/.meltano/orchestrators/airflow/airflow.db
How to use #
Manage this setting using meltano config or an environment variable:
meltano config airflow set core.sql_alchemy_conn <core.sql_alchemy_conn>
export AIRFLOW__CORE__SQL_ALCHEMY_CONN=<core.sql_alchemy_conn>
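Since the SQLite default is generally only suitable for local experimentation, a common change is to point Airflow at a PostgreSQL metadata database instead. A minimal sketch, where the host, credentials, and database name are placeholders:

```shell
# Example: use a PostgreSQL metadata database instead of the SQLite default
# (user, password, host, port, and database name below are placeholders)
meltano config airflow set core.sql_alchemy_conn \
  "postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow"

# Equivalent via the environment variable
export AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow"
```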
Load Examples (core.load_examples) #
- Environment variable: AIRFLOW__CORE__LOAD_EXAMPLES, alias: AIRFLOW_CORE.LOAD_EXAMPLES
- Default: false
How to use #
Manage this setting using meltano config or an environment variable:
meltano config airflow set core.load_examples <core.load_examples>
export AIRFLOW__CORE__LOAD_EXAMPLES=<core.load_examples>
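Because this is a boolean setting, the placeholder in the commands above is simply replaced with true or false. For example, to enable Airflow's bundled example DAGs:

```shell
# Enable Airflow's bundled example DAGs (boolean setting, default: false)
meltano config airflow set core.load_examples true

# Equivalent via the environment variable
export AIRFLOW__CORE__LOAD_EXAMPLES=true
```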
Pause DAGs at Creation (core.dags_are_paused_at_creation) #
- Environment variable: AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION, alias: AIRFLOW_CORE.DAGS_ARE_PAUSED_AT_CREATION
- Default: false
How to use #
Manage this setting using meltano config or an environment variable:
meltano config airflow set core.dags_are_paused_at_creation <core.dags_are_paused_at_creation>
export AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=<core.dags_are_paused_at_creation>
Looking for help? #
If you're having trouble getting the airflow orchestrator to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.
Found an issue on this page? #
This page is generated from a YAML file that you can contribute changes to. Edit it on GitHub!