1. Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at: org.apache.spark.SparkContext.
Solution: in my case this was self-inflicted: the code created more than one SparkContext. After changing it to create the SparkContext only once, the program ran fine.
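A safe pattern for "create it only once" is `SparkContext.getOrCreate`, which returns the already-running context instead of trying to construct a second one. The sketch below is a minimal illustration (the object name and master URL are my own choices, not from the original post):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SingleContextExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("single-context").setMaster("local[*]")

    // getOrCreate returns the existing SparkContext if one is already
    // running in this JVM, instead of throwing the SPARK-2243 error.
    val sc = SparkContext.getOrCreate(conf)

    // A second call yields the same instance, not a new context.
    val sameSc = SparkContext.getOrCreate(conf)
    assert(sc eq sameSc)

    sc.stop()
  }
}
```

Routing every place that needs a context through `getOrCreate` (or, in Spark 2.x+, through a single `SparkSession.builder().getOrCreate()`) removes the temptation to flip `spark.driver.allowMultipleContexts`, which only masks the underlying bug.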
2. Exception in thread "main" java.io.IOException: No FileSystem for scheme: hdfs
Solution: add the following to the core-default.xml inside the jar:
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
  <description>The FileSystem for hdfs: uris.</description>
</property>
<property>
  <name>fs.file.impl</name>
  <value>org.apache.hadoop.fs.LocalFileSystem</value>
  <description>The FileSystem for file: uris.</description>
</property>
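If editing the XML inside the jar is inconvenient, the same two properties can be set programmatically on the driver's Hadoop configuration before any filesystem access. This is a hedged alternative I am adding, not part of the original post; `sc` is assumed to be an existing SparkContext:

```scala
// Set the filesystem implementations on the Hadoop Configuration that
// Spark uses, equivalent to the two <property> entries above.
val hadoopConf = sc.hadoopConfiguration
hadoopConf.set("fs.hdfs.impl",
  classOf[org.apache.hadoop.hdfs.DistributedFileSystem].getName)
hadoopConf.set("fs.file.impl",
  classOf[org.apache.hadoop.fs.LocalFileSystem].getName)
```

The "No FileSystem for scheme: hdfs" error typically appears when an uber-jar is built and the `META-INF/services` entries of hadoop-common and hadoop-hdfs overwrite each other; either fix above forces Hadoop to resolve the `hdfs:` and `file:` schemes explicitly.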