
How can I change the location of the default database for the warehouse? (Spark) – Ubuntu

...
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>hdfs://spark-master-01:9000/skybluelee/skybluelee_warehouse_mysql_5.7</value>
  <description>location of default database for the warehouse</description>
</property>
...

This snippet is part of /user/spark3/conf/hive-site.xml. At first the value was hdfs://spark-master-01:9000/kikang/skybluelee_warehouse_mysql_5.7, and I changed it to hdfs://spark-master-01:9000/skybluelee/skybluelee_warehouse_mysql_5.7. Below are the code and its result: println(spark.conf.get("spark.sql.warehouse.dir"))…
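For context, the Spark documentation notes that hive.metastore.warehouse.dir is deprecated in favour of spark.sql.warehouse.dir, and the warehouse path is a static setting that must be fixed before the session is created. A minimal sketch follows, assuming a standalone Scala entry point (the object name and app name are invented for illustration) and reusing the HDFS path from the excerpt; it is not the asker's code.

// Hedged sketch: set the warehouse location explicitly on the SparkSession.
// When spark.sql.warehouse.dir is set it is typically what Spark reports,
// regardless of hive.metastore.warehouse.dir in hive-site.xml.
import org.apache.spark.sql.SparkSession

object WarehouseDirCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("warehouse-dir-check")
      .config("spark.sql.warehouse.dir",
              "hdfs://spark-master-01:9000/skybluelee/skybluelee_warehouse_mysql_5.7")
      .enableHiveSupport()   // requires Hive classes and hive-site.xml on the classpath
      .getOrCreate()

    // Should print the path passed above.
    println(spark.conf.get("spark.sql.warehouse.dir"))

    spark.stop()
  }
}

Note that databases which already exist usually keep the location recorded in the metastore when they were created, so tables in an old database may still resolve to the previous directory even after the configuration is changed.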


Postgresql – Find start date and end date for continuous status

Here is my dataset:

processed_on | status
-------------+--------
2023-01-01   | Success
2023-01-02   | Success
2023-01-03   | Success
2023-01-04   | Fail
2023-01-05   | Fail
2023-01-06   | Success
2023-01-07   | Fail
2023-01-08   | Success
2023-01-09   | Success

The expected output is:

start_date | end_date   | status
-----------+------------+--------
2023-01-01 | 2023-01-03 | Success
2023-01-04 | 2023-01-05 | Fail
2023-01-06 | 2023-01-06 | Success
2023-01-07 | 2023-01-07 | Fail
2023-01-08 | 2023-01-09 | Success

I…
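This is the classic gaps-and-islands problem. One common approach, sketched below under the assumption that the rows live in a table named processed_log (the excerpt does not give the table name), is to subtract a per-status ROW_NUMBER() from an overall ROW_NUMBER(): the difference stays constant within each consecutive run, so grouping on it yields the start and end date of every run. Only standard window functions are used, so the query should run on PostgreSQL as written.

-- Hedged sketch: "processed_log" is a hypothetical table holding
-- the (processed_on, status) rows from the question.
SELECT MIN(processed_on) AS start_date,
       MAX(processed_on) AS end_date,
       status
FROM (
    SELECT processed_on,
           status,
           ROW_NUMBER() OVER (ORDER BY processed_on)
         - ROW_NUMBER() OVER (PARTITION BY status ORDER BY processed_on) AS grp
    FROM processed_log
) runs
GROUP BY status, grp
ORDER BY start_date;

On the sample data this should produce the five rows shown in the expected output.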
