
I have a server with Redis and Maven configured. I then create the following SparkSession:

spark = pyspark.sql.SparkSession.builder \
    .master('local[4]') \
    .appName('try_one_core') \
    .config("spark.redis.host", "XX.XXX.XXX.XXX") \
    .config("spark.redis.port", "6379") \
    .config("spark.redis.auth", "XXXX") \
    .getOrCreate()

I am trying to connect to a remote Redis server and write/load data to and from it. However, when I try to .save() with the following command:

df.write \
    .format("org.apache.spark.sql.redis") \
    .option("table", "df") \
    .option("key.column", "case_id") \
    .save()

I get the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling
o327.save. : java.lang.ClassNotFoundException: Failed to find data
source: org.apache.spark.sql.redis. Please find packages at
http://spark.apache.org/third-party-projects.html

Is there any fix to this?

2 Answers


  1. It means that spark-redis-<version>-jar-with-dependencies.jar is not loaded into Spark.

    You have to run pyspark with the following arguments as stated in the documentation:

    $ bin/pyspark --jars <path-to>/spark-redis-<version>-jar-with-dependencies.jar --conf "spark.redis.host=localhost" --conf "spark.redis.port=6379" --conf "spark.redis.auth=passwd"

  2. As an addition to @fe2s' answer, instead of loading it from disk or network storage, it can also be loaded directly from Maven:

    bin/pyspark --packages com.redislabs:spark-redis:2.4.0
    

    The --packages and --jars arguments can also be used with the normal spark-submit command.
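    For illustration, a spark-submit invocation mirroring the pyspark commands above (the script name app.py is a placeholder):

```shell
# Either resolve spark-redis from Maven at submit time...
spark-submit \
  --packages com.redislabs:spark-redis:2.4.0 \
  --conf "spark.redis.host=localhost" \
  --conf "spark.redis.port=6379" \
  --conf "spark.redis.auth=passwd" \
  app.py

# ...or point at a local jar instead:
spark-submit \
  --jars <path-to>/spark-redis-<version>-jar-with-dependencies.jar \
  app.py
```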
