Environment
- Windows 10
- Spark 2.4
- Scala 2.11.12
Problem Description
\sbtSpark1_jar>spark-submit --class com.spark.WordCount.WordCount sbtSpark.jar
java.io.IOException: Failed to delete: C:\Users\dell\AppData\Local\Temp\spark-034fc522-85f7-4f32-9d52-1c6b3ad13e14\userFiles-886284db-d53e-467a-857c-04c1ff61cca3\sbtSpark1.jar
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:
Error Analysis and Fix
>spark-submit --class com.spark.WordCount.WordCount sbtSpark.jar
The error above appears when we submit the packaged Spark program. It is a known issue of Spark on Windows itself and has nothing to do with our program: the shutdown hook tries to delete Spark's temporary directory under %TEMP%, but Windows still holds a lock on the jar file there. If you want to suppress the error messages, add the following lines to the log4j.properties file under %SPARK_HOME%/conf:
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
If that file does not exist, find log4j.properties.template in the same directory and rename a copy of it to log4j.properties.
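The steps above can be done from a Windows Command Prompt; this is a sketch that assumes the SPARK_HOME environment variable is set and that log4j.properties does not exist yet:

```shell
REM Create log4j.properties from the template shipped with Spark
cd /d %SPARK_HOME%\conf
copy log4j.properties.template log4j.properties

REM Silence the shutdown-hook cleanup error and reduce SparkEnv log noise
echo log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF>> log4j.properties
echo log4j.logger.org.apache.spark.SparkEnv=ERROR>> log4j.properties
```

Note that this only hides the log output; the temp directories under %TEMP% are still left behind and can be cleaned up manually.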