Hello, in this post I will show you how to set up the official apache/airflow image with PostgreSQL and LocalExecutor using Docker and docker-compose. I won’t be going through what Airflow is and how it is used; please check the official documentation for more information about that.
Before setting up and running Apache Airflow, please install Docker and Docker Compose.
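A quick way to confirm both are installed (the exact version output will differ on your machine):
#check installation
docker --version
docker-compose --version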
In this chapter, I will show you the files and directories needed to run Airflow, and in the next chapter I will go through them file by file, line by line, explaining what is going on.
Firstly, in the root directory create three more directories: dags, logs, and scripts. Then create the following files: **.env**, **docker-compose.yml**, **entrypoint.sh**, and **dummy_dag.py**. Please make sure those files and directories follow the structure below (a shell snippet that creates this layout follows the tree).
#project structure
root/
├── dags/
│ └── dummy_dag.py
├── scripts/
│ └── entrypoint.sh
├── logs/
├── .env
└── docker-compose.yml
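If you prefer the terminal, a snippet like the following (run from root/) creates the layout in one go; making entrypoint.sh executable up front avoids a permission error when the webserver container tries to run it:
#create project structure
mkdir -p dags logs scripts
touch .env docker-compose.yml dags/dummy_dag.py scripts/entrypoint.sh
chmod +x scripts/entrypoint.sh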
Created files should contain the following:
#docker-compose.yml
version: '3.8'
services:
  postgres:
    image: postgres
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
  scheduler:
    image: apache/airflow
    command: scheduler
    restart: on-failure
    depends_on:
      - postgres
    env_file:
      - .env
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
  webserver:
    image: apache/airflow
    entrypoint: ./scripts/entrypoint.sh
    restart: on-failure
    depends_on:
      - postgres
      - scheduler
    env_file:
      - .env
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
      - ./scripts:/opt/airflow/scripts
    ports:
      - "8080:8080"