I have this tableConfig.json in an ADLS container location.
It has table-specific details.
{
"tableName":"employee",
"databaseName": "dbo",
"location" : "/mnt/clean/demo",
"colsList" : ["emp_id","emp_name","emp_city"]
}
Now I want to read that tableConfig.json in an Azure Databricks Python notebook, dynamically build the CREATE TABLE statement, and execute it so that a delta table is created:
CREATE TABLE <databaseName.tableName>
(<colsList>)
USING delta
LOCATION <location>
Is there any approach to do this?
Basically, I want a notebook that, when executed, creates the delta table as per the JSON config.
2 Answers
I don't think there is an option to create a table based on the config file yet, but we can iterate through the JSON (dict) and create the table. Example:
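A minimal sketch, assuming the config file is reachable at a mounted path (the path below is a placeholder) and that every column is STRING, since the config only carries column names:

import json

# Hypothetical mount path to the config file in the ADLS container; adjust as needed.
config_path = "/dbfs/mnt/config/tableConfig.json"

with open(config_path, "r") as f:
    cfg = json.load(f)

# Column types are not in the config, so STRING is assumed for every column.
cols = ", ".join(f"{c} STRING" for c in cfg["colsList"])

ddl = f"""
CREATE TABLE {cfg['databaseName']}.{cfg['tableName']}
({cols})
USING delta
LOCATION '{cfg['location']}'
"""

# spark is available by default in a Databricks notebook.
spark.sql(ddl)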
I agree with @notNull about using the spark.sql function to create the table. In addition to that, you can follow the dataframe approach below.
First, load the JSON data into a dataframe and follow the steps below.
If you have only these columns in the list, you can build a SQL script for each record in the dataframe and execute the spark.sql function on it. Below is your sample data, which I used.
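A sketch of loading the config into a dataframe, assuming the file is multiline JSON at a hypothetical path:

# Read the multiline JSON config into a dataframe; the path is a placeholder.
df = spark.read.option("multiline", "true").json("/mnt/config/tableConfig.json")
df.show(truncate=False)
# Spark infers the columns alphabetically: colsList, databaseName, location, tableName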
Create a function that builds the SQL script and register it as a UDF.
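A sketch of such a function; column types are assumed to be STRING because the config only lists column names:

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

def make_script(database_name, table_name, cols_list, location):
    # Build the CREATE TABLE statement for one config record.
    cols = ", ".join(f"{c} STRING" for c in cols_list)
    return (f"CREATE TABLE IF NOT EXISTS {database_name}.{table_name} ({cols}) "
            f"USING delta LOCATION '{location}'")

make_script_udf = udf(make_script, StringType())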
Next, add a new column called script by calling the above function.
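Continuing the sketch above:

from pyspark.sql.functions import col

# Call the registered UDF on each config record to produce the script column.
df_with_script = df.withColumn(
    "script",
    make_script_udf(col("databaseName"), col("tableName"), col("colsList"), col("location"))
)
df_with_script.select("script").show(truncate=False)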
Next, take only the script column and run the spark.sql function on it. This creates the delta table in the given location.
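For example, collecting the scripts to the driver and executing each one:

# Execute every generated CREATE TABLE statement.
for row in df_with_script.select("script").collect():
    spark.sql(row["script"])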
Output