
I am setting up an Azure Data Factory that links to a PostgreSQL server hosted elsewhere. Several of our pipelines are failing to copy data from their source PostgreSQL tables into their Azure SQL sink tables. The failures appear to be caused by a failing "time" type conversion.

While I am able to validate that other tables that do not contain "time" columns are being brought in correctly by their respective pipelines, I am unable to even preview the source dataset for the tables that do contain "time" columns in Azure Data Factory. Attempting to do so returns the following error:

Error code
DEDICATED_DPERRORCODE0

Details
Can't cast database type time without time zone to NpgsqlInterval

The DEDICATED_DPERRORCODE0 link goes to a generic troubleshooting page and does not touch on time zones or casting errors.

It appears that ADF is unable to read the columns whose Postgres type is "Time", because it tries to cast them to an "Interval" type without a time zone. I do not control the Postgres server; however, I have validated that the columns in question are of type "time(0)". I do not know how to verify whether the columns have a time zone attached, as specified in the Postgres documentation.
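If someone with access to the server can run a query along these lines (the table name is only a placeholder, matching the example later in this post), the data_type column should show whether the columns are "time without time zone" or "time with time zone", and datetime_precision should show the (0):

SELECT column_name, data_type, datetime_precision
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'table'
  AND data_type LIKE 'time%';  -- data_type reads 'time without time zone' or 'time with time zone'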

The Azure documentation for the PostgreSQL connector specifies that, when importing from Postgres, source columns of type "time" are mapped to a "TimeSpan" interim service data type, which is then converted to "Time" as the destination type for my columns. In the pipeline, I can see that this is the mapping Azure generates for those columns.

For reference, here is an example screenshot of a failing test-pipeline:

Pipeline Failing to Preview Data in Schema View

This is the same error that occurs when trying to preview the data from the source table.

If I am reading the error correctly, Azure is attempting to cast the Time columns to an interval type. Doing that appears to require a time zone it doesn’t have and doesn’t match the stated documentation (but maybe Azure’s interim TimeSpan data type is secretly an interval?).

If I utilize the Query option in the source tab of the Copy Data pipeline, I am able to return results by passing through a query that only selects columns that do not require this conversion, such as:

SELECT "Id"
FROM public.table;

This is how we are currently bypassing the error; however, it leaves out the data from those time columns.

There may be a query that would pass the data through as some non-erroring type (see the sketch below); ideally, though, I'd like to resolve the root of the issue without writing custom SQL for each import dataset, as we plan to add more tables and time columns in the future.
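For example, one shape such a query might take (the time column name here is hypothetical) is casting the time column to text so that it arrives as a string type instead of triggering the interval cast:

SELECT "Id",
       "StartTime"::text AS "StartTime"  -- hypothetical column name; ::text sidesteps the NpgsqlInterval cast
FROM public.table;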

Altering the server would be somewhat difficult, but not necessarily impossible. Is there a way to specify a time zone in ADF that it can automatically use when mapping the types Postgres provides, or some other method to successfully cast the Postgres time columns into something ADF can use?

2 Answers


  1. Can’t cast database type time without time zone to NpgsqlInterval

    The above error occurs when trying to convert or cast the time type to the Interval type, which is not supported. If you want to copy data from an Azure PostgreSQL table that has a time type column to an Azure SQL table, follow the approach below:

    Select the PostgreSQL table as the source and the SQL table as the sink, as shown below:

    [screenshot: source and sink configuration]

    Enable the Auto create table option in the sink, as shown below:

    [screenshot: sink settings with Auto create table enabled]
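    For illustration only (the column names and types below are assumed, not taken from the question), the auto-created Azure SQL table would look roughly like this, with the Postgres time(0) without time zone column arriving as a SQL Server time column:

    CREATE TABLE [dbo].[table] (
        [Id] int,
        [StartTime] time  -- precision may differ from the source time(0)
    );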

    Go to mapping and click Import schemas; the time without time zone column will then be mapped to a Time type column, as shown below:

    [screenshot: imported schema mapping showing the Time type column]

    You will be able to preview the PostgreSQL table data, as shown below:

    [screenshot: source data preview]

    The table will copy successfully without any error. Alternatively, you can use a data flow and change the data type of the column in the Projection tab of the source, as shown below:

    [screenshot: data flow source Projection tab]

  2. This issue was fixed at the end of October.
    If you are using the Azure IR, please retry.
    If you are using a SHIR, please upgrade to version 5.46.90 or above.
