# ETL pipelines with Apache Airflow
## Running Airflow for the first time
- First, update the Postgres volume's absolute path in the compose file manually (see the sketch at the end of this section).
- Run Airflow with Docker Compose:

```sh
make airflow-init
make airflow
```
From then on, you just need to run the following:

```sh
make airflow
```
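The Postgres volume mentioned in the first step lives in the Compose file. Below is a minimal sketch of what that entry typically looks like, assuming the file is named `docker-compose.yaml` and the service is called `postgres`; only the `5433` host port comes from this README, everything else is a placeholder:

```yaml
services:
  postgres:
    image: postgres:16            # image tag is an assumption
    ports:
      - "5433:5432"               # host port 5433, per this README
    volumes:
      # Replace the left-hand side with an absolute path on your machine.
      - /absolute/path/on/your/host/pgdata:/var/lib/postgresql/data
```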
## Local development

- Make sure Rust & Cargo are installed on your machine.
- Make sure you have Python version `3.12` installed.
- Run `pipenv install`. We only install `apache-airflow` for the typings when developing Airflow DAGs (see the sketch after this list).
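Since `apache-airflow` is installed purely so the imports type-check while editing, a DAG file needs very little to be valid. A minimal sketch, where the `dag_id`, schedule, and task body are all hypothetical:

```python
# Minimal illustrative DAG; dag_id, schedule, and the task logic are
# hypothetical -- apache-airflow is installed locally only so that
# these imports resolve while editing.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_locations() -> None:
    # Placeholder for the real extraction logic.
    print("extracting...")


with DAG(
    dag_id="location_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule=None,                  # `schedule` kwarg per Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract_locations)
```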
After extraction, the data is loaded into the `location_etl_db` Postgres database, which runs on host port `5433`.
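To verify the load target, you can connect to that database directly. A quick connectivity check using `psycopg2` (the driver choice and the `airflow`/`airflow` credentials are assumptions; only the database name and port come from this README):

```python
# Quick connectivity check; user/password are placeholders for whatever
# the compose file actually defines -- only the database name and port
# come from this README.
import psycopg2  # assumes psycopg2-binary is installed

conn = psycopg2.connect(
    host="localhost",
    port=5433,                 # host port mapped to the Postgres container
    dbname="location_etl_db",
    user="airflow",            # hypothetical credentials
    password="airflow",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone())
conn.close()
```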