The exception
The job fails with a ClassNotFoundException: org.apache.spark.streaming.kafka010.InternalKafkaConsumer cannot be found at runtime.
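Before changing the cluster configuration, it can help to confirm whether the class is actually packaged inside the integration jar being shipped with the job. A quick diagnostic sketch (the jar path is the one used in the submit command below; adjust it to your environment):

```shell
# List the classes bundled in the spark-streaming-kafka-0-10 jar and
# search for the one the JVM failed to load.
jar tf /var/lib/hadoop-hdfs/spark-streaming-kafka-0-10_2.11-2.4.4.jar \
  | grep InternalKafkaConsumer
```

If the class is present in the jar but still cannot be loaded, the problem is usually a version conflict between the jars shipped with the job and the Kafka client libraries provided by the cluster, which is what the CDH setting below addresses.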
The fix:
Spark's Kafka integration must match the Kafka client version provided by the cluster. In Cloudera Manager, open the Spark service configuration and search for "kafka",
then set the Kafka version to 0.10.
Save the change, then restart Spark.
After redeploying the client configuration, re-run the submit command:
spark2-submit \
--master yarn \
--deploy-mode cluster \
--class com.bigdata.PreWarningScalaAppV2 \
--jars /var/lib/hadoop-hdfs/converter-moshi-2.1.0.jar,/var/lib/hadoop-hdfs/fastjson-1.2.58.jar,/var/lib/hadoop-hdfs/guava-20.0.jar,/var/lib/hadoop-hdfs/influxdb-java-2.5.jar,file:/var/lib/hadoop-hdfs/kafka-clients-2.0.0.jar,file:/var/lib/hadoop-hdfs/logging-interceptor-3.5.0.jar,file:/var/lib/hadoop-hdfs/moshi-1.2.0.jar,file:/var/lib/hadoop-hdfs/okhttp-3.5.0.jar,file:/var/lib/hadoop-hdfs/okio-1.11.0.jar,file:/var/lib/hadoop-hdfs/retrofit-2.1.0.jar,file:/var/lib/hadoop-hdfs/spark-streaming-kafka-0-10_2.11-2.4.4.jar,file:/var/lib/hadoop-hdfs/mysql-connector-java-5.1.48.jar \
--conf "spark.driver.userClassPathFirst=true" \
/var/lib/hadoop-hdfs/prewarning-1.0.jar
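One note on the classpath flag: spark.driver.userClassPathFirst only makes the driver prefer the user-supplied jars over Spark's own. If the executors also hit jar conflicts (common in cluster deploy mode), Spark provides a matching executor-side option; both are marked experimental in the Spark configuration docs. A sketch of passing both:

```shell
--conf spark.driver.userClassPathFirst=true \
--conf spark.executor.userClassPathFirst=true \
```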
The job now shows as running OK.