

# Airflow Docker File How To

First of all, we need Docker. It can be downloaded and installed from the official Docker site.

## Fetch docker-compose.yaml from Airflow

After installing Docker, create a working folder in your preferred location. Then, in your terminal, fetch the docker-compose.yaml with the following command.
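The Airflow documentation publishes a pinned compose file for each release; the URL below assumes version 2.6.3 as an example, so substitute the release you intend to run:

```bash
# Download the official docker-compose.yaml for a specific Airflow release
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.6.3/docker-compose.yaml'
```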

This compose file describes the services that make up the Airflow environment and is what Docker uses to build everything when you run the docker command. It also expects a few directories and a user setting, so let's configure them:

```bash
# Create the folders the containers will mount, and set the host user IDs
mkdir -p ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env
```

Your file structure should look like this now:

```
.
├── docker-compose.yaml
├── .env
├── dags/
├── logs/
└── plugins/
```
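With the folders and `.env` file in place, the environment can be brought up. A typical sequence, assuming the official compose file fetched above and Docker Compose v2 (use `docker-compose` with a hyphen on v1):

```bash
# One-time setup: run database migrations and create the default user
docker compose up airflow-init

# Start all Airflow services (webserver, scheduler, etc.) in the background
docker compose up -d
```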
## Configure your DAG file

Assuming you have a DAG for running some BigQuery tasks, we need to place the DAG file into the dags directory. How to construct a DAG will be covered in another post; here I am using a simple DAG with just a few BigQueryOperator tasks that run some queries for demonstration purposes.
The following is the DAG we are going to use. It starts with these imports:

```python
import json

from airflow.contrib.operators.bigquery_operator import BigQueryOperator
from airflow.contrib.operators.bigquery_check_operator import BigQueryCheckOperator
```
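Building on those imports, here is a minimal runnable sketch of such a DAG. The project, dataset, and table names are placeholders, and the `dag_id`, schedule, and default arguments are illustrative assumptions; the existence check queries the dataset's `__TABLES__` view, which is one common approach. Note that the `airflow.contrib` paths are Airflow 1.x; on the Airflow 2 images used by the compose file, the equivalent operators live in `airflow.providers.google.cloud.operators.bigquery`.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator
from airflow.contrib.operators.bigquery_check_operator import BigQueryCheckOperator

# Placeholder names -- substitute your own GCP project, dataset, and table
PROJECT = "my-gcp-project"
DATASET = "my_dataset"
TABLE = "my_table"

# Illustrative defaults; tune owner, retries, and start_date for your setup
default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
    "start_date": datetime(2021, 1, 1),
}

# Define DAG: set ID and assign default args and schedule interval
with DAG(
    dag_id="bigquery_demo",
    default_args=default_args,
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Task 1: check that the data table exists in the dataset.
    # The check fails when the first row of the result is falsy (COUNT = 0).
    check_table_exists = BigQueryCheckOperator(
        task_id="check_table_exists",
        sql=(
            f"SELECT COUNT(*) FROM `{PROJECT}.{DATASET}.__TABLES__` "
            f"WHERE table_id = '{TABLE}'"
        ),
        use_legacy_sql=False,
    )

    # Task 2: run a demo query once the check passes
    run_query = BigQueryOperator(
        task_id="run_demo_query",
        sql=f"SELECT COUNT(*) AS row_count FROM `{PROJECT}.{DATASET}.{TABLE}`",
        use_legacy_sql=False,
    )

    check_table_exists >> run_query
```

Save this as, for example, `dags/bigquery_demo.py`; the compose file mounts the local `dags` folder into the containers, so the scheduler will pick it up automatically.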
