This example shows how to create a CI/CD pipeline for Airflow DAGs on Amazon MWAA.
- Make sure the AWS CLI is installed and configured with the AWS account you want to use.
npm install -g aws-cdk
cdk --version
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
export AWS_PROFILE=<the configuration profile for aws-cli>
cdk deploy AirflowCICD/Bucket
After the bucket is created, do the following:
- Copy the bucket name from the CDK output and replace YOUR_BUCKET_NAME with it in resources/dags/collect_whats_new_weekly.py.
- Go to the S3 console, create dags and requirements folders in the bucket, and upload resources/requirements.txt into the requirements folder.
- Copy the object version string of the uploaded requirements.txt, then set a shell variable as below.
export REQUIREMENTS_S3_OBJ_VER=the_version_string_you_copied
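The bucket-configuration steps above can be sketched as shell commands. The bucket name below is a hypothetical placeholder for the value in your CDK output, and the AWS calls need configured credentials, so they are shown commented out:

```shell
# Hypothetical bucket name copied from the `cdk deploy AirflowCICD/Bucket` output.
BUCKET_NAME="airflowcicd-bucket-abc123"
KEY="requirements/requirements.txt"

# Replace the YOUR_BUCKET_NAME placeholder in the DAG file in place
# (the .bak suffix keeps the command portable across GNU and BSD sed).
DAG_FILE="resources/dags/collect_whats_new_weekly.py"
if [ -f "$DAG_FILE" ]; then
  sed -i.bak "s/YOUR_BUCKET_NAME/${BUCKET_NAME}/g" "$DAG_FILE"
fi

# Upload requirements.txt and capture its S3 object version string
# (needs AWS credentials, so shown commented out):
# aws s3 cp resources/requirements.txt "s3://${BUCKET_NAME}/${KEY}"
# export REQUIREMENTS_S3_OBJ_VER=$(aws s3api head-object \
#   --bucket "$BUCKET_NAME" --key "$KEY" \
#   --query VersionId --output text)
```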
cdk deploy AirflowCICD/Network
cdk deploy AirflowCICD/MWAA
After the deployment completes, do the following:
- Copy the topic ARN from the CDK output and replace YOUR_TOPIC_ARN with it in resources/dags/collect_whats_new_weekly.py.
- Create an Email subscription for the SNS topic. Confirming the subscription is required before notifications are delivered.
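The two post-deployment steps above can be sketched likewise. The topic ARN and email address are hypothetical placeholders, and the subscribe call needs AWS credentials, so it is shown commented out:

```shell
# Hypothetical topic ARN copied from the `cdk deploy AirflowCICD/MWAA` output.
TOPIC_ARN="arn:aws:sns:us-east-1:123456789012:AirflowCICD-Topic"

# Replace the YOUR_TOPIC_ARN placeholder in the DAG file in place
# (| is used as the sed delimiter because the ARN contains colons).
DAG_FILE="resources/dags/collect_whats_new_weekly.py"
if [ -f "$DAG_FILE" ]; then
  sed -i.bak "s|YOUR_TOPIC_ARN|${TOPIC_ARN}|g" "$DAG_FILE"
fi

# Create the email subscription; SNS then mails a confirmation link that
# must be clicked before notifications arrive (shown commented out):
# aws sns subscribe --topic-arn "$TOPIC_ARN" \
#   --protocol email --notification-endpoint you@example.com
```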
cdk deploy AirflowCICD/CICD
- Copy the git clone URL from the CDK output and run the clone command. CodeCommit offers three ways to clone a repository; choose the one you prefer. This example uses HTTPS.
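For the HTTPS method, Git authenticates to CodeCommit through the AWS CLI credential helper. The clone URL below is a hypothetical example of what the CDK output looks like, not your actual repository URL:

```shell
# One-time setup: route CodeCommit HTTPS authentication through the AWS CLI.
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# Hypothetical clone URL copied from the `cdk deploy AirflowCICD/CICD` output
# (shown commented out, since it needs access to the real repository):
# git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/airflow_dags
```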
- Push the source code into the repo:
cd airflow_dags
mkdir dags
cp resources/dags/collect_whats_new_weekly.py dags/
git status
git add .
git commit -m "initial commit"
git push
- Check the pipeline:
Go to the MWAA console and open the Airflow UI.
Run the whatsnew DAG. To start it, toggle the pause switch to unpause the DAG.
Check your email after the run completes.
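The pipeline can also be checked from the CLI instead of the console. The pipeline name here is a hypothetical placeholder for the one created by the AirflowCICD/CICD stack, and the call needs AWS credentials, so it is shown commented out:

```shell
# Hypothetical pipeline name from the AirflowCICD/CICD stack.
PIPELINE_NAME="AirflowCICD-Pipeline"
# Show the latest status of every pipeline stage:
# aws codepipeline get-pipeline-state --name "$PIPELINE_NAME" \
#   --query 'stageStates[].{stage:stageName,status:latestExecution.status}' \
#   --output table
```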