Ubuntu – Airflow webserver `http://0.0.0.0:8080/` not working
$ airflow webserver
[Airflow ASCII-art startup banner] …
In a task, I serialise a dict (convert it to a string) and push it to XCom: result["data"] = json.dumps({"agents": ["john.doe@example.com"], "houses": ["jane.doe@example.com"]}) In Airflow's UI it looks good as a string, but at the DAG level, I get in…
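The round trip that snippet describes can be sketched without Airflow at all, since XCom just stores and returns the string; `json.loads` on the pulling side restores the dict. The function names below are illustrative, not the asker's tasks:

```python
import json

def push_side():
    # What the pushing task stores in XCom: a JSON string, not a dict.
    payload = {"agents": ["john.doe@example.com"],
               "houses": ["jane.doe@example.com"]}
    return json.dumps(payload)

def pull_side(xcom_value):
    # XCom hands the string back unchanged; deserialise it explicitly.
    return json.loads(xcom_value)
```

The key point is symmetry: whichever side calls `json.dumps` obliges the other side to call `json.loads` before indexing into the value.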
I installed Airflow in Ubuntu under WSL (Windows 10) by following the process in this post. But when running airflow db init, I get the following error: AttributeError: module 'wtforms.fields' has no attribute 'TextField' Complete trace: (airflow_env) sultani@Khalid:~/c/users/administrator/airflowhome$ airflow db…
Kubernetes newbie here. I was following a YouTube tutorial from Marc Lamberti on how to install Airflow on Kubernetes locally, using Kind. I was able to create the k8s cluster with the commands below: kind create cluster --name airflow-cluster --config…
I have an Airflow instance in Azure Data Factory. I can test the APIs from Swagger, but when I try to test from Postman, it shows an authentication error. I tried basic auth with my Azure credentials, but it did not work. How do…
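Worth noting: Airflow's stable REST API authenticates an Airflow user (with the `basic_auth` backend enabled), not a cloud-account identity; whether Azure Data Factory's managed Airflow exposes that backend to external clients is an assumption here. A sketch of the Basic header Postman would send, with placeholder credentials:

```python
import base64

def basic_auth_header(user: str, password: str) -> dict:
    # HTTP Basic auth: base64 of "user:password", prefixed with "Basic ".
    # The user/password here must be an Airflow user, not Azure credentials.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}
```

If the managed service issues tokens instead, the header would be `Authorization: Bearer <token>`; the failure mode in the question is consistent with Azure credentials not mapping to any Airflow user.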
I'm running the following script in MWAA and in my local env. from airflow import DAG, XComArg from airflow.providers.amazon.aws.operators.redshift_sql import RedshiftSQLOperator from airflow.models.connection import Connection from airflow.operators.dummy_operator import DummyOperator from airflow.utils.task_group import TaskGroup # from airflow.providers.slack.hooks.slack_webhook import SlackWebhookHook from airflow.decorators import dag,…
I am trying to set up Airflow on Managed Apache Airflow (MWAA). Everything seems to be working fine except for my AWS Redshift connection. I am using the Connections tab in the UI and editing redshift_default with my values. It's working fine locally…
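For comparison between environments, a connection like `redshift_default` boils down to a URI the form fields are parsed into. A hedged sketch of that shape, with placeholder host and database names; `redshift` as the conn type for the Amazon provider's `RedshiftSQLOperator` is an assumption worth checking against your provider version:

```python
from urllib.parse import quote

def redshift_conn_uri(user, password, host, db, port=5439):
    # Airflow connection URI form: <conn-type>://login:password@host:port/schema.
    # Credentials are percent-escaped so special characters survive parsing.
    return f"redshift://{quote(user)}:{quote(password, safe='')}@{host}:{port}/{db}"
```

Exporting the working local connection (e.g. via `airflow connections export`) and diffing it against what MWAA holds is one way to spot the mismatched field.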
I am trying to SSH into my own AWS MWAA instance in order to install some system dependencies. I'm coming from GCP, so this is a bit different for me. I can't find the exact EC2 instance it is hosted on.…
I've modified the docker-compose for Airflow (apache/airflow:2.5.0-python3.10) and added a MariaDB service to emulate the target DB for a project in dev. In _PIP_ADDITIONAL_REQUIREMENTS I've included pymsql, and I am attempting to create a MySQL connection under Admin…
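One thing to check: the PyPI driver is spelled `pymysql` (and Airflow's MySQL connection type comes from the `apache-airflow-providers-mysql` package), so `pymsql` in `_PIP_ADDITIONAL_REQUIREMENTS` would fail to install. A sketch of the SQLAlchemy-style URL such a connection resolves to, with placeholder service and database names from a typical compose setup:

```python
def mariadb_url(user, password, host="mariadb", port=3306, db="dev"):
    # MariaDB speaks the MySQL protocol, so the Airflow conn type is "mysql"
    # and the SQLAlchemy dialect is "mysql+pymysql" when PyMySQL is the driver.
    return f"mysql+pymysql://{user}:{password}@{host}:{port}/{db}"
```

Inside the compose network, the host is the service name (`mariadb` here), not `localhost`, which is another frequent cause of this connection failing.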
Is there a way to run a specific batch script on every worker node that MWAA spins up? Is there any feature in Airflow that can do this?
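MWAA does offer this as a platform feature rather than an Airflow one: a shell script stored in the environment's S3 bucket, referenced by `StartupScriptS3Path`, runs on each worker, scheduler, and webserver as it boots. A sketch of the update-call arguments, with placeholder environment and key names; the boto3 call itself is left as a comment rather than executed:

```python
def startup_script_update_args(env_name: str, script_key: str) -> dict:
    # Arguments for the MWAA UpdateEnvironment API: StartupScriptS3Path is
    # the key (relative to the environment's source bucket) of a shell
    # script MWAA runs on every component at start.
    return {"Name": env_name, "StartupScriptS3Path": script_key}

# Applied with something like:
#   import boto3
#   boto3.client("mwaa").update_environment(
#       **startup_script_update_args("my-env", "startup.sh"))
```

This is the usual substitute for SSH access on MWAA: system-level dependencies go in the startup script, Python packages in `requirements.txt`.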