
Can someone share example code for unloading data from a Snowflake reader account to an S3 bucket, as mentioned in the documentation?

The text is taken from their documentation.

https://docs.snowflake.com/en/user-guide/data-sharing-reader-create

It states: "However, you can use the COPY INTO command with your connection credentials to unload data into a cloud storage location."


Answers


  1. It’s exactly the same process as unloading data from any other account: use COPY INTO.

    Please review the documentation below for more information and examples:

    https://docs.snowflake.com/en/sql-reference/sql/copy-into-location
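
    For example, a minimal unload to an external stage might look like the following (the stage and table names here are placeholders, and the stage is assumed to already exist):

    -- Unload the contents of a table to an external stage as gzip-compressed CSV
    COPY INTO @my_unload_stage/exports/
      FROM my_table
      FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');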

  2. A Reader Account is typically limited in its capabilities, so you’ll need to ensure the following:

    1. Grant S3 Storage Integration Access: The Snowflake account that created the Reader Account must define the storage integration and grant it access to the Reader Account.
    2. Use COPY INTO Command: Unloading data to S3 uses the COPY INTO command.

    Below is an example that assumes the required S3 storage integration has already been set up and the appropriate permissions granted.
    The account that created the Reader Account must set up the storage integration and grant access:

    -- Create the storage integration
    CREATE OR REPLACE STORAGE INTEGRATION my_s3_integration
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my-snowflake-role'
      STORAGE_ALLOWED_LOCATIONS = ('s3://my-s3-bucket/path/');
    
    -- Grant usage of the storage integration to the share
    GRANT USAGE ON INTEGRATION my_s3_integration TO SHARE my_share;
    
    -- Add the reader account to the share
    ALTER SHARE my_share ADD ACCOUNTS = reader_account_name;
    

    In the Reader Account, create an external stage using the shared storage integration:

    -- Create the stage using the shared storage integration
    CREATE OR REPLACE STAGE my_s3_stage
      URL = 's3://my-s3-bucket/path/'
      STORAGE_INTEGRATION = my_s3_integration;
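
    Alternatively, since the documentation quoted in the question mentions using "your connection credentials", the stage can be created with explicit AWS credentials instead of a storage integration (the key values below are placeholders):

    -- Create a stage using AWS credentials directly
    CREATE OR REPLACE STAGE my_s3_stage
      URL = 's3://my-s3-bucket/path/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');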
    

    Now, use the COPY INTO command to unload data from a table to the S3 bucket:

    COPY INTO @my_s3_stage
      FROM my_table
      FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',')
      HEADER = TRUE
      OVERWRITE = TRUE
      SINGLE = FALSE;
    

    Verify that the data was unloaded into the S3 bucket. If you face any difficulties while unloading the data, feel free to raise a Snowflake support ticket.
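
    To check the results from within Snowflake, you can list the files written to the stage (using the stage name from the example above):

    -- List the files that were written to the stage
    LIST @my_s3_stage;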
