java.io.IOException: Failed to delete: C:\Users\dell\AppData\Local\Temp\spark- on Windows

Environment

  • Windows 10
  • Spark 2.4
  • Scala 2.11.12

Problem Description

\sbtSpark1_jar>spark-submit --class com.spark.WordCount.WordCount sbtSpark.jar
java.io.IOException: Failed to delete: C:\Users\dell\AppData\Local\Temp\spark-034fc522-85f7-4f32-9d52-1c6b3ad13e14\userFiles-886284db-d53e-467a-857c-04c1ff61cca3\sbtSpark1.jar
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:

Error Analysis and Solution

>spark-submit --class com.spark.WordCount.WordCount sbtSpark.jar

When we submit the packaged Spark program, it reports the error above. This is a known issue with Spark on Windows and has nothing to do with our program: at JVM shutdown, Spark's shutdown hook tries to delete its temporary directory under %TEMP%, and on Windows the delete fails, typically because the JAR file is still locked. The job itself completes normally. To suppress the error, add the following lines to the log4j.properties file under %SPARK_HOME%/conf:

log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR

If that file does not exist, find log4j.properties.template in the same directory and rename a copy of it to log4j.properties.
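The steps above can be sketched as a short shell session. To keep it self-contained, a temporary directory stands in for your real %SPARK_HOME%, so the paths here are illustrative, not literal:

```shell
# Simulate a Spark conf directory (a stand-in for your real %SPARK_HOME%/conf).
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/conf"
echo "# default Spark log4j template" > "$SPARK_HOME/conf/log4j.properties.template"

cd "$SPARK_HOME/conf"

# Step 1: if log4j.properties is missing, create it from the template.
[ -f log4j.properties ] || cp log4j.properties.template log4j.properties

# Step 2: append the two logger settings that silence the shutdown-hook error.
cat >> log4j.properties <<'EOF'
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
EOF

# Show the result.
cat log4j.properties
```

On a real installation you would run only steps 1 and 2 inside %SPARK_HOME%/conf; the settings take effect the next time spark-submit is run.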

References

From stackoverflow.com

Installing Hadoop on Windows and debugging Spark programs in IDEA (Scala version)
