I have a server with Redis and Maven configured. I then create a SparkSession as follows:
import pyspark

spark = (
    pyspark.sql.SparkSession
    .builder
    .master('local[4]')
    .appName('try_one_core')
    .config("spark.redis.host", "XX.XXX.XXX.XXX")
    .config("spark.redis.port", "6379")
    .config("spark.redis.auth", "XXXX")
    .getOrCreate()
)
I am trying to connect to a remote Redis server and write/load data to and from it. However, when I try to save with the following command:
df.write \
    .format("org.apache.spark.sql.redis") \
    .option("table", "df") \
    .option("key.column", "case_id") \
    .save()
I get the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling
o327.save. : java.lang.ClassNotFoundException: Failed to find data
source: org.apache.spark.sql.redis. Please find packages at
http://spark.apache.org/third-party-projects.html
Is there any fix to this?
2 Answers
It means that spark-redis-<version>-jar-with-dependencies.jar is not loaded in Spark. You have to run pyspark with the following arguments, as stated in the documentation:
$ bin/pyspark --jars <path-to>/spark-redis-<version>-jar-with-dependencies.jar --conf "spark.redis.host=localhost" --conf "spark.redis.port=6379" --conf "spark.redis.auth=passwd"
As an addition to @fe2s' answer: instead of loading it from disk or network storage, the jar can also be pulled directly from Maven.
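For example, using the --packages flag instead of --jars (the exact Maven coordinates here are an assumption; check the spark-redis documentation for the artifact matching your Spark and Scala versions):

$ bin/pyspark --packages com.redislabs:spark-redis_2.12:3.1.0 --conf "spark.redis.host=localhost" --conf "spark.redis.port=6379" --conf "spark.redis.auth=passwd"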
The --packages and --jars arguments can also be used with a normal spark-submit command.
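If you prefer to keep everything in the Python script rather than passing command-line arguments, the dependency can also be declared through the spark.jars.packages configuration when building the session. A minimal sketch, assuming the same com.redislabs:spark-redis_2.12:3.1.0 coordinates as above (the config must be set before the first SparkSession is created):

import pyspark

spark = (
    pyspark.sql.SparkSession
    .builder
    .master('local[4]')
    .appName('try_one_core')
    # assumed Maven coordinates; must match your Spark/Scala versions
    .config("spark.jars.packages", "com.redislabs:spark-redis_2.12:3.1.0")
    .config("spark.redis.host", "XX.XXX.XXX.XXX")
    .config("spark.redis.port", "6379")
    .config("spark.redis.auth", "XXXX")
    .getOrCreate()
)

Once the package is on the classpath, the same data source format can be used to read the table back:

df = (
    spark.read
    .format("org.apache.spark.sql.redis")
    .option("table", "df")
    .option("key.column", "case_id")
    .load()
)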