
Context

I’m trying to set up an Airflow/dbt Docker infrastructure with Snowflake as the database, AWS S3 for file storage, and MongoDB as a data source.

Issue

I installed apache-airflow-providers-snowflake. I can find airflow.providers.snowflake.transfers.copy_into_snowflake, but I can’t find airflow.providers.snowflake.transfers.s3_to_snowflake!

Here’s the error I get in my main DAG:

ModuleNotFoundError: No module named 'airflow.providers.snowflake.transfers.s3_to_snowflake'

Have you ever had the same issue importing the Snowflake provider and only getting the copy_into_snowflake module?

Source code (I can give you more details if needed)

Here’s my project directory tree:

project
|-- airflow
|   |-- dags
|   |   |-- main.py
|   |-- Dockerfile
|   |-- requirements.txt
|-- docker-compose.yaml
|-- .env

Airflow main DAG:

import pendulum
from airflow.providers.snowflake.transfers.s3_to_snowflake import S3ToSnowflakeOperator

from airflow import DAG

with DAG(
    dag_id="main",
    start_date=pendulum.datetime(year=2024, month=3, day=26, hour=15, minute=35),
    schedule_interval="15 4 * * *",
    catchup=False,
) as dag:
    pass


Answers


  1. Taking a look at the docs, it seems the airflow.providers.snowflake.transfers.s3_to_snowflake module is only available in provider versions prior to 4.4.2. If you are using a newer version, the import will fail.
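     For newer provider versions, the copy_into_snowflake module you can already import exposes CopyFromExternalStageToSnowflakeOperator, which covers the S3-to-Snowflake use case via a Snowflake external stage. A minimal sketch, assuming an external stage named my_s3_stage already exists in Snowflake and points at your bucket (the stage, table, connection ID, and file format below are placeholders, not something from your setup):

     ```python
     import pendulum

     from airflow import DAG
     # In recent apache-airflow-providers-snowflake releases the S3 transfer
     # lives in copy_into_snowflake rather than s3_to_snowflake.
     from airflow.providers.snowflake.transfers.copy_into_snowflake import (
         CopyFromExternalStageToSnowflakeOperator,
     )

     with DAG(
         dag_id="main",
         start_date=pendulum.datetime(year=2024, month=3, day=26, hour=15, minute=35),
         schedule_interval="15 4 * * *",
         catchup=False,
     ) as dag:
         # Placeholder names: "my_s3_stage" must be an external stage created
         # in Snowflake beforehand (CREATE STAGE ... URL='s3://...').
         copy_from_s3 = CopyFromExternalStageToSnowflakeOperator(
             task_id="copy_from_s3",
             snowflake_conn_id="snowflake_default",
             table="MY_TABLE",
             stage="my_s3_stage",
             file_format="(type = 'CSV', field_delimiter = ',', skip_header = 1)",
         )
     ```

     Alternatively, if you want to keep the old S3ToSnowflakeOperator import, pinning apache-airflow-providers-snowflake to an older version in requirements.txt should work, per the version noted above.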
