    C:\Users\sorun\.jdks\openjdk-14.0.1\bin\java.exe "-javaagent:D:\Intellij IDEA\IntelliJ IDEA 2020.1.1\lib\idea_rt.jar=50945:D:\Intellij IDEA\IntelliJ IDEA 2020.1.1\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\sorun\IdeaProjects\spark-streaming-kafka\target\classes;[Maven dependency jars from C:\Users\sorun\.m2\repository, including spark-core_2.11-2.2.0, spark-sql_2.11-2.2.0, spark-catalyst_2.11-2.2.0, spark-sql-kafka-0-10_2.11-2.2.0, kafka-clients-0.10.0.1, scala-library-2.11.8, hadoop-client-2.6.5, jackson-databind-2.6.5, gson-2.8.3, and the remaining transitive dependencies] StreamingConsumer

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/06/19 12:39:42 INFO SparkContext: Running Spark version 2.2.0
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/C:/Users/sorun/.m2/repository/org/apache/hadoop/hadoop-auth/2.6.5/hadoop-auth-2.6.5.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/06/19 12:39:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/06/19 12:39:44 INFO SparkContext: Submitted application: Streaming-kafka
20/06/19 12:39:44 INFO SecurityManager: Changing view acls to: OZAN-OKAN
20/06/19 12:39:44 INFO SecurityManager: Changing modify acls to: OZAN-OKAN
20/06/19 12:39:44 INFO SecurityManager: Changing view acls groups to: 
20/06/19 12:39:44 INFO SecurityManager: Changing modify acls groups to: 
20/06/19 12:39:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(OZAN-OKAN); groups with view permissions: Set(); users  with modify permissions: Set(OZAN-OKAN); groups with modify permissions: Set()
20/06/19 12:39:45 INFO Utils: Successfully started service 'sparkDriver' on port 50966.
20/06/19 12:39:45 INFO SparkEnv: Registering MapOutputTracker
20/06/19 12:39:45 INFO SparkEnv: Registering BlockManagerMaster
20/06/19 12:39:45 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/06/19 12:39:45 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/06/19 12:39:45 INFO DiskBlockManager: Created local directory at C:\Users\sorun\AppData\Local\Temp\blockmgr-0794380e-6e2b-4559-bf6c-7d10c2074bc8
20/06/19 12:39:45 INFO MemoryStore: MemoryStore started with capacity 1040.4 MB
20/06/19 12:39:45 INFO SparkEnv: Registering OutputCommitCoordinator
20/06/19 12:39:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/06/19 12:39:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.56.1:4040
20/06/19 12:39:46 INFO Executor: Starting executor ID driver on host localhost
20/06/19 12:39:46 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50975.
20/06/19 12:39:46 INFO NettyBlockTransferService: Server created on 192.168.56.1:50975
20/06/19 12:39:46 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/06/19 12:39:46 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:50975 with 1040.4 MB RAM, BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/Users/sorun/IdeaProjects/spark-streaming-kafka/spark-warehouse/').
20/06/19 12:39:46 INFO SharedState: Warehouse path is 'file:/C:/Users/sorun/IdeaProjects/spark-streaming-kafka/spark-warehouse/'.
20/06/19 12:39:47 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
20/06/19 12:39:47 INFO CatalystSqlParser: Parsing command: string
20/06/19 12:39:49 INFO SparkSqlParser: Parsing command: CAST(value AS STRING) message
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`product`' given input columns: [jsontostructs(message)];
    at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:88)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:288)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4$$anonfun$apply$10.apply(TreeNode.scala:323)
    at scala.collection.MapLike$MappedValues$$anonfun$iterator$3.apply(MapLike.scala:246)
    at scala.collection.MapLike$MappedValues$$anonfun$iterator$3.apply(MapLike.scala:246)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.IterableLike$$anon$1.foreach(IterableLike.scala:311)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
    at scala.collection.mutable.MapBuilder.$plus$plus$eq(MapBuilder.scala:25)
    at scala.collection.TraversableViewLike$class.force(TraversableViewLike.scala:88)
    at scala.collection.IterableLike$$anon$1.force(IterableLike.scala:311)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:331)
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286)
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:268)
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:268)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:279)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:289)
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$6.apply(QueryPlan.scala:298)
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:298)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:268)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:78)
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:78)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:91)
    at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.resolveAndBind(ExpressionEncoder.scala:256)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:206)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:170)
    at org.apache.spark.sql.Dataset$.apply(Dataset.scala:61)
    at org.apache.spark.sql.Dataset.as(Dataset.scala:380)
    at StreamingConsumer.main(StreamingConsumer.java:24)
20/06/19 12:39:50 INFO SparkContext: Invoking stop() from shutdown hook
20/06/19 12:39:50 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
20/06/19 12:39:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/06/19 12:39:50 INFO MemoryStore: MemoryStore cleared
20/06/19 12:39:50 INFO BlockManager: BlockManager stopped
20/06/19 12:39:50 INFO BlockManagerMaster: BlockManagerMaster stopped
20/06/19 12:39:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/06/19 12:39:50 INFO SparkContext: Successfully stopped SparkContext
20/06/19 12:39:50 INFO ShutdownHookManager: Shutdown hook called
20/06/19 12:39:50 INFO ShutdownHookManager: Deleting directory C:\Users\sorun\AppData\Local\Temp\spark-b70ecbcc-e6cf-4328-9069-97cc41cc72d7

Process finished with exit code 1

3 Answers


  1. Exception in thread "main" org.apache.spark.sql.AnalysisException: 
    cannot resolve '`product`' given input columns: [jsontostructs(message)];
    
    

    The exception message says the column you are selecting, product, is not available in the DataFrame; the only input column is jsontostructs(message). Alias (rename) that column to product and select the alias instead.
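
    A minimal sketch of that renaming, assuming load already holds the Kafka stream as a DataFrame and schema is the StructType for the JSON payload (the variable names here are illustrative, not taken from the question):

    Dataset<Row> parsed = load
            .selectExpr("CAST(value AS STRING) AS message")
            // Without an alias, the parsed column is named jsontostructs(message).
            .select(functions.from_json(functions.col("message"), schema).as("product"));

    parsed.printSchema();                                  // now shows a single column named "product"
    Dataset<Row> flattened = parsed.select("product.*");   // flatten the struct's fields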

  2. Change the from_json(...) select so it ends with schema).as("json")), giving the parsed struct the alias json, and then select json.* from it:

    Dataset<SearchProductModel> data = load
            .selectExpr("CAST(value AS STRING) as message")                              // Kafka value bytes -> JSON string
            .select(functions.from_json(functions.col("message"), schema).as("json"))    // parse the JSON and alias the struct
            .select("json.*")                                                            // flatten the struct into top-level columns
            .as(Encoders.bean(SearchProductModel.class));                                // map each row onto the bean
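
    For context, load in the snippet above would normally come from the Kafka source. A sketch assuming a local broker and a topic named products (both are placeholders; the question does not show them):

    SparkSession spark = SparkSession.builder()
            .appName("Streaming-kafka")
            .master("local[*]")
            .getOrCreate();

    // Structured Streaming Kafka source (spark-sql-kafka-0-10)
    Dataset<Row> load = spark.readStream()
            .format("kafka")
            .option("kafka.bootstrap.servers", "localhost:9092")   // placeholder broker address
            .option("subscribe", "products")                        // placeholder topic name
            .load();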
    
  3. And if your model also has a "message" field, add it to the schema StructType as well:

    StructType schema = new StructType().add("product","string").add("time", DataTypes.TimestampType).add("message", DataTypes.StringType);
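
    For Encoders.bean(SearchProductModel.class) to bind against that struct, the bean's properties have to match the schema fields. A minimal sketch of such a model (field names assumed from the schema above; the real class is not shown in the question):

    public class SearchProductModel implements java.io.Serializable {
        private String product;
        private java.sql.Timestamp time;   // TimestampType maps to java.sql.Timestamp
        private String message;

        public SearchProductModel() { }    // the bean encoder needs a no-arg constructor

        public String getProduct() { return product; }
        public void setProduct(String product) { this.product = product; }
        public java.sql.Timestamp getTime() { return time; }
        public void setTime(java.sql.Timestamp time) { this.time = time; }
        public String getMessage() { return message; }
        public void setMessage(String message) { this.message = message; }
    }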
    