How could we run the Kafka example in the official Spark project?


I am trying to run the simple Kafka example in the official Spark project.



Spark built properly, so I can run most of the examples, except this Kafka one.



When I run the following command:


bin/run-example streaming.JavaDirectKafkaWordCount localhost:9092 test



I get the following error:


Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka010/LocationStrategies



Any idea how we could run the Kafka example properly?



(Kafka client running on port 9092. Using the latest Spark and Kafka, and Java 8)
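
One direction that may help (not verified here): the kafka010 classes in the error come from the separate spark-streaming-kafka-0-10 module, which is not always on the examples' classpath. Assuming bin/run-example forwards spark-submit options such as --packages, the module can be pulled in explicitly; the Scala suffix and version below are guesses and must match your Spark build:

bin/run-example --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0 streaming.JavaDirectKafkaWordCount localhost:9092 test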







same problem, if you find a solution... – Jokin, Jun 1 at 12:41





ibm.com/support/knowledgecenter/en/SSWTQQ_2.0.0/solnguide/… – mangusta, Jun 1 at 15:57





Have you found the kafka010 jar in the Spark library folder? – cricket_007, Jun 2 at 0:15






1 Answer



In my project, I finally got Kafka and Spark connected properly. I realised that using local mode in the Spark script is quite useful during development: it is much easier to debug and experiment with.





You can find a detailed, dockerized implementation here: https://github.com/zoltan-nz/kafka-spark-project



Related notes: https://github.com/zoltan-nz/kafka-spark-project/blob/master/SparkStreamer/README.md#experiment-06---using-local
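
For completeness, here is a minimal sketch of the kind of local-mode consumer this amounts to, built on the kafka010 classes named in the error (KafkaUtils, LocationStrategies, ConsumerStrategies). It assumes the spark-streaming-kafka-0-10 and kafka-clients jars are on the classpath; the broker address, topic name and group id are placeholders:

import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class LocalKafkaWordCount {
  public static void main(String[] args) throws InterruptedException {
    // local[2] keeps everything in one JVM, which makes debugging much easier.
    SparkConf conf = new SparkConf().setAppName("LocalKafkaWordCount").setMaster("local[2]");
    JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

    // Consumer settings; bootstrap server, group id and topic are placeholders.
    Map<String, Object> kafkaParams = new HashMap<>();
    kafkaParams.put("bootstrap.servers", "localhost:9092");
    kafkaParams.put("key.deserializer", StringDeserializer.class);
    kafkaParams.put("value.deserializer", StringDeserializer.class);
    kafkaParams.put("group.id", "local-dev");
    kafkaParams.put("auto.offset.reset", "latest");
    kafkaParams.put("enable.auto.commit", false);

    Collection<String> topics = Collections.singletonList("test");

    // These are the kafka010 classes the NoClassDefFoundError complains about.
    JavaInputDStream<ConsumerRecord<String, String>> stream =
        KafkaUtils.createDirectStream(
            jssc,
            LocationStrategies.PreferConsistent(),
            ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));

    // Simple word count over the message values of each micro-batch.
    stream.map(ConsumerRecord::value)
          .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
          .countByValue()
          .print();

    jssc.start();
    jssc.awaitTermination();
  }
}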






