Enabling remote logging

To enable this feature, `airflow.cfg` must be configured as follows:

```ini
[logging]
# Airflow can store logs remotely in AWS CloudWatch. Users must supply a log group
# ARN (starting with 'cloudwatch://...') and an Airflow connection id that provides
# write and read access to the log location.
remote_logging = True
remote_base_log_folder = cloudwatch://arn:aws:logs:<region name>:<account id>:log-group:<group name>
remote_log_conn_id = MyCloudwatchConn
```
I've looked at the examples in example_dags, and example_trigger_controller_dag.py is related but not the scenario I'm describing. I don't need one DAG to trigger another DAG. I want to trigger a DAG directly from the command line and pass parameters to it.
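A minimal sketch of that CLI invocation, assuming Airflow 2.x (where the command is `airflow dags trigger`; in 1.x it was `airflow trigger_dag`) and a hypothetical DAG id `my_dag`:

```shell
# Trigger a DAG run and pass parameters as a JSON --conf payload.
# Inside the DAG, tasks can read these values from dag_run.conf,
# e.g. in a templated field: {{ dag_run.conf["path"] }}
airflow dags trigger my_dag --conf '{"path": "/tmp/input.csv", "retries": 2}'
```

The `--conf` string must be valid JSON; Airflow attaches it to the created DagRun so every task in that run can read it.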
First, we have to create a new Python file in the AIRFLOW_HOME/dags directory. The Airflow scheduler scans this folder periodically, so after a few seconds you are able to see your DAG in the Airflow UI. In my case AIRFLOW_HOME=/home/pawel/airflow, which means I need to place my DAG files in the /home/pawel/airflow/dags folder.
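A minimal DAG file for that directory could look like the sketch below. This assumes Airflow 2.x; the DAG id `hello_dag` and the file name are hypothetical, not taken from the original text:

```python
# /home/pawel/airflow/dags/hello_dag.py
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# The dag_id is what appears in the Airflow UI and what you pass to the CLI.
with DAG(
    dag_id="hello_dag",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # no schedule: only runs when triggered manually
    catchup=False,
) as dag:
    hello = BashOperator(
        task_id="say_hello",
        bash_command="echo hello",
    )
```

Once this file is saved under the dags folder, the scheduler picks it up on its next scan and the DAG appears in the UI.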
Work with sample DAGs

In Airflow, a DAG is a collection of all the tasks you want to run, organized in a way that reflects their relationships and dependencies.
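To illustrate the idea (this is not Airflow's actual scheduler code), the dependency ordering of a small DAG can be sketched with Python's standard-library `graphlib`; the task names here are made up:

```python
from graphlib import TopologicalSorter

# A tiny three-task DAG: extract must finish before transform,
# and transform before load. Keys are tasks; values are the
# tasks each one depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

# A topological order guarantees every task appears after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'load']
```

This dependency-before-dependent ordering is exactly the property Airflow relies on when it decides which tasks are ready to run.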