This demonstration shows an Airflow integration with Weights and Biases. It is a slimmed-down version of the full demo, which also includes data transformations with Astronomer's Cosmos for dbt and the Astronomer buildkit for simplified Python virtual environment creation. For the full demo, see the main branch.
This workflow includes:
- data ingest to Postgres running as a local docker container
- feature engineering, model training and predictions with the Astro SDK and scikit-learn
- model management with Weights and Biases
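As a minimal sketch of the feature engineering, training, and prediction steps, the snippet below uses scikit-learn directly; in the actual DAG this logic runs inside Astro SDK tasks and the resulting model is logged to Weights and Biases. The dataset and model choice here are illustrative assumptions, not the demo's exact pipeline:

```python
# Illustrative training/prediction sketch; the real DAG wraps similar
# logic in Astro SDK tasks and logs the model artifact to W&B.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for engineered customer features (hypothetical data).
X, y = make_classification(n_samples=200, n_features=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Train a classifier and score held-out predictions.
model = LogisticRegression().fit(X_train, y_train)
predictions = model.predict(X_test)
accuracy = model.score(X_test, y_test)
```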
Your Astro project contains the following files and folders:
- dags: This folder contains the Python files for the Airflow DAG.
- Dockerfile: This file specifies a versioned Astro Runtime Docker image, Astronomer's distribution of Apache Airflow. If you want to execute other commands or overrides at runtime, specify them here.
- include: This folder contains additional files and directories for the services used in the demo.
- packages.txt: Install OS-level packages needed for the project.
- requirements.txt: Install Python packages needed for the project.
- plugins: Add custom or community plugins for your project to this folder. It is empty by default.
- airflow_settings.yaml: Use this local-only file to specify Airflow Connections, Variables, and Pools instead of entering them in the Airflow UI as you develop DAGs in this project.
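A hypothetical airflow_settings.yaml fragment is sketched below; the key names follow the Astro CLI's schema for this file, but the connection and variable values shown are placeholder assumptions, not the demo's actual settings:

```yaml
# Hypothetical example; replace values with your own local settings.
airflow:
  connections:
    - conn_id: postgres_default
      conn_type: postgres
      conn_host: host.docker.internal
      conn_port: 5432
      conn_login: postgres
      conn_password: postgres
  variables:
    - variable_name: wandb_project
      variable_value: customer_analytics
```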
Prerequisites:
- Docker Desktop or a similar Docker service running locally.
- A Weights and Biases account (a free trial account works).
- Install the Astronomer (Astro) CLI. The Astro CLI is a command-line interface for data orchestration; it lets you get started with Apache Airflow quickly and can be used with all Astronomer products. It provides a local instance of Airflow if you don't have an existing service. For macOS:
brew install astro
For Linux:
curl -sSL install.astronomer.io | sudo bash -s
- Clone this repository.
git clone https://github.com/astronomer/airflow-wandb-demo -b simple
cd airflow-wandb-demo
Edit the .env file and set "WANDB_API_KEY" to your Weights and Biases API key.
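As a sketch, the relevant line in .env looks like the following; the placeholder is an assumption to be replaced with your own key:

```shell
# Replace the placeholder with the API key from your W&B account settings.
WANDB_API_KEY=<your-wandb-api-key>
```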
- Start an Airflow instance.
astro dev start
- Run the Airflow DAG in the Airflow UI
- Open localhost:8080 in a browser and log in (username: admin, password: admin).
- Click the "Play" button for the customer_analytics DAG and select "Trigger DAG".
- After testing in local dev mode, update the .env file with S3 credentials/buckets and deploy to Astro Cloud.