Serialization error when connecting Kafka 2.11-0.11.0.0 to Spark Streaming

Kafka_2.11-0.11.0.0

spark-streaming-kafka-0-10_2.11

The error message is as follows:

java.io.NotSerializableException: org.apache.kafka.clients.consumer.ConsumerRecord
Serialization stack:
	- object not serializable (class: org.apache.kafka.clients.consumer.ConsumerRecord, value: ConsumerRecord(topic = news, partition = 0, offset = 115900, CreateTime = 1548486965892, checksum = 3320474937, serialized key size = -1, serialized value size = 51, key = null, value = 2019-01-26 1548486965891 911 550 entertainment view))
	- element of array (index: 0)
	- array (class [Lorg.apache.kafka.clients.consumer.ConsumerRecord;, size 11)
	at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:450)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Solution

Set the serializer property when creating the SparkConf. ConsumerRecord does not implement java.io.Serializable, so the default Java serializer throws the exception above whenever Spark has to serialize the records themselves; judging by the stack trace, the task result here contained an array of ConsumerRecord. Kryo does not require classes to implement Serializable, so switching to it resolves the error:

set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

val sparkConf = new SparkConf().setAppName("KafkaReceiver")
                .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .setMaster("local[3]")
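
For context, here is a minimal end-to-end sketch of where the setting fits in a spark-streaming-kafka-0-10 job. The broker address, group id, and batch interval are placeholder assumptions; the topic name news comes from the stack trace above.

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaReceiver {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("KafkaReceiver")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer") // the fix
      .setMaster("local[3]")
    val ssc = new StreamingContext(sparkConf, Seconds(5)) // assumed batch interval

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092", // assumption: point at your brokers
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "news-consumer", // assumption: any unused group id
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("news"), kafkaParams))

    // Extracting the value early means later stages serialize plain
    // Strings rather than the ConsumerRecord object itself.
    stream.map(record => record.value).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

An alternative that works even with the default serializer is to map each ConsumerRecord to its key or value as early as possible, as the map above does, so the record object itself never has to cross process boundaries.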
