
I am trying to set up Airflow on Amazon Managed Workflows for Apache Airflow (MWAA). Everything seems to be working fine except my AWS Redshift connection.

I am using the Connections tab in the UI and editing redshift_default with my values. It works fine locally, but when I trigger the DAG in MWAA I get the following error:

[2023-02-27, 16:15:55 UTC] {{base.py:71}} INFO - Using connection ID 'redshift_default' for task execution.
[2023-02-27, 16:18:07 UTC] {{taskinstance.py:1851}} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/airflow/.local/lib/python3.10/site-packages/redshift_connector/core.py", line 585, in __init__
    self._usock.connect((host, port))
TimeoutError: [Errno 110] Connection timed out

Any help will be greatly appreciated.

2 Answers


  1. Chosen as BEST ANSWER

    It turned out my connection host was wrong. Fixing it rectified the problem.
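    In case anyone else hits this: a quick way to confirm whether the host in the connection is reachable at all, before digging into Airflow itself, is a plain TCP check. This is just a sketch; the function name and the cluster endpoint shown are illustrative, and the default port 5439 is Redshift's standard port:

    ```python
    import socket

    def can_reach(host: str, port: int = 5439, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            # create_connection resolves the hostname and opens a socket;
            # any failure (DNS, refused, timed out) raises OSError.
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Hypothetical endpoint for illustration only:
    # can_reach("my-cluster.abc123.us-east-1.redshift.amazonaws.com")
    ```

    If this returns False from the environment running the task, the problem is the hostname or the network path (security groups, VPC routing), not Airflow.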


  2. Have you tried increasing the value of the timeout parameter using the extras field? Here is a list of all the extras that are available on a Redshift connection object in Airflow: https://github.com/aws/amazon-redshift-python-driver#connection-parameters

    There is a timeout parameter that can be altered in the extras.
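    For example, the Extra field of the Airflow connection accepts JSON, so (assuming the driver's timeout parameter, which is given in seconds) something like:

    ```json
    {"timeout": 60}
    ```

    Note that if the cluster is unreachable for network reasons, raising the timeout only delays the same error rather than fixing it.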
