I would like to use the COPY INTO statement to load a table, together with its column names, from external storage.
I would like to achieve this with a generic procedure that can be reused for different tables.
Below is a draft. The input parameters could be the temp file path, the schema, the table name and the list of column names of the destination table.
The variable temp_file_path is the location of the files in the Azure data lake (dl).
The column-list variable is a placeholder; I need to know how it can be implemented (a tentative sketch is included after the environment notes below).
Other suggestions are welcome.
CREATE PROC [copy_into_sql_from_dl_columns]
    @temp_file_path [VARCHAR](4096)
    , @dest_schema [VARCHAR](255)
    , @dest_table [VARCHAR](255)
    , @dest_columns [VARCHAR](4000) -- comma-separated destination column names (widened from 255, which a column list can easily exceed)
AS
IF @temp_file_path IS NULL OR @dest_schema IS NULL OR @dest_table IS NULL OR @dest_columns IS NULL
BEGIN
    PRINT 'ERROR: You must specify temp_file_path, destination schema, table and column names.'
END
ELSE
BEGIN
    DECLARE @dest_temp_table AS VARCHAR(4096)
    SET @dest_temp_table = '[' + @dest_schema + '].[' + @dest_table + ']'

    -- Placeholder: for now the target column list is taken straight from the parameter
    DECLARE @dest_temp_columns AS VARCHAR(4000)
    SET @dest_temp_columns = @dest_columns

    DECLARE @copy_into_query AS VARCHAR(8000)
    SET @copy_into_query = 'COPY INTO ' + @dest_temp_table + ' (' + @dest_temp_columns + ')'
        + ' FROM ''' + @temp_file_path + ''' WITH (FILE_TYPE = ''parquet'', AUTO_CREATE_TABLE = ''OFF'')'

    EXEC (@copy_into_query)
END
GO
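For illustration, a call could then look like the following (the schema, table, column names and data-lake path here are made-up placeholders, not values from a real environment):

-- Hypothetical invocation; [dbo].[sales], the columns and the ADLS path are examples only
EXEC [copy_into_sql_from_dl_columns]
    @temp_file_path = 'https://mydatalake.dfs.core.windows.net/staging/sales/'
    , @dest_schema = 'dbo'
    , @dest_table = 'sales'
    , @dest_columns = '[sale_id], [sale_date], [amount]'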
The environment is Azure Synapse in the Azure cloud.
The DB is a dedicated SQL pool (Azure Synapse Analytics).
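For the column-list placeholder, one possible implementation (just a sketch, assuming the destination table already exists, as AUTO_CREATE_TABLE = 'OFF' implies, and that STRING_AGG is available in the dedicated pool) is to derive the list from the catalog views inside the procedure instead of passing it in:

-- Sketch: build the comma-separated, bracketed column list from catalog views.
-- Assumes the table's column order matches the fields in the Parquet files.
DECLARE @derived_columns VARCHAR(8000)

SELECT @derived_columns = STRING_AGG('[' + c.name + ']', ', ')
                          WITHIN GROUP (ORDER BY c.column_id)
FROM sys.columns AS c
JOIN sys.tables AS t ON t.object_id = c.object_id
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE s.name = @dest_schema
  AND t.name = @dest_table

The result could then replace the SET @dest_temp_columns = @dest_columns assignment in the draft above, making the @dest_columns parameter optional.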
2 Answers
Can you use SELECT INTO?
I repro'd this, and below is the approach. The expression @activity('Lookup1').output.value is given.