I have an Airflow task where I try to load a file into an S3 bucket. I have Airflow running on an EC2 instance. I'm running Airflow version 2.4.3. I have done
pip install 'apache-airflow[amazon]'
I start up my Airflow server, log in, and go to the Admin section to add a connection. I open a new connection and I don't have an option for S3.
My only Amazon options are:
Amazon Elastic MapReduce
Amazon Redshift
Amazon Web Services
What else am I missing?
3 Answers
You need to define the AWS connection under "Amazon Web Services Connection".
For more details, see here.
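Once that connection is saved, a task can reference it by its connection ID. A minimal sketch, assuming a connection ID of aws_default and made-up file and bucket names:

    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    def upload_to_s3():
        # Uses the connection configured under Admin -> Connections
        hook = S3Hook(aws_conn_id="aws_default")
        hook.load_file(
            filename="/tmp/my_file.csv",  # local file on the EC2 instance
            key="my_file.csv",            # object key inside the bucket
            bucket_name="my-bucket",
            replace=True,
        )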
You should define the connection within your DAG.
You should also use a secure settings.ini file to save your secrets, and then call those variables from your DAG.
See this answer for a complete guide: Airflow s3 connection using UI
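A rough sketch of that pattern, assuming a settings.ini at a hypothetical path with an [aws] section and a made-up connection ID (adapt the names to your setup):

    import configparser

    from airflow import settings
    from airflow.models import Connection

    # Read the secrets from a settings.ini kept outside version control.
    config = configparser.ConfigParser()
    config.read("/home/ec2-user/airflow/settings.ini")
    aws_key = config["aws"]["access_key_id"]
    aws_secret = config["aws"]["secret_access_key"]

    # Register the connection so tasks can refer to it by conn_id.
    conn = Connection(
        conn_id="my_aws_conn",
        conn_type="aws",
        login=aws_key,
        password=aws_secret,
        extra='{"region_name": "us-east-1"}',
    )
    session = settings.Session()
    session.add(conn)
    session.commit()

A task can then use S3Hook(aws_conn_id="my_aws_conn") as in the first answer.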
I am using Airflow version 2.4.3.
I also did:
pip install 'apache-airflow[amazon]'
and also only had the same options in the UI: Amazon Elastic MapReduce, Amazon Redshift, and Amazon Web Services.
Other tutorials (maybe for older versions?) show that there should be an Amazon S3 option.
However, if you select Amazon Web Services, then add your configuration in the "Extra" field:
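For example, something along these lines (placeholder values, not the original poster's configuration; the Amazon provider reads aws_access_key_id, aws_secret_access_key, and region_name from the Extra JSON):

    {
        "aws_access_key_id": "YOUR_ACCESS_KEY",
        "aws_secret_access_key": "YOUR_SECRET_KEY",
        "region_name": "us-east-1"
    }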
Then it will work.