java.io.IOException: Failed to delete: C:\Users\dell\AppData\Local\Temp\spark- on Windows

Environment

  • Windows 10
  • Spark 2.4
  • Scala 2.11.12

Problem description

\sbtSpark1_jar>spark-submit --class com.spark.WordCount.WordCount sbtSpark.jar
java.io.IOException: Failed to delete: C:\Users\dell\AppData\Local\Temp\spark-034fc522-85f7-4f32-9d52-1c6b3ad13e14\userFiles-886284db-d53e-467a-857c-04c1ff61cca3\sbtSpark1.jar
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:

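For context, the class being submitted is a plain WordCount job. The post does not include its source; a minimal sketch of what com.spark.WordCount.WordCount might look like (the input and output paths below are hypothetical hard-coded placeholders, since the submit command above passes no arguments) is:

package com.spark.WordCount

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // The master is supplied by spark-submit; only the app name is set here.
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)
    sc.textFile("input.txt")        // hypothetical input path
      .flatMap(_.split("\\s+"))     // split each line into words
      .map((_, 1))                  // pair each word with a count of 1
      .reduceByKey(_ + _)           // sum counts per word
      .saveAsTextFile("output")     // hypothetical output path
    sc.stop()
  }
}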
Error analysis and fix

>spark-submit --class com.spark.WordCount.WordCount sbtSpark.jar

This error appears when we submit the packaged Spark application. It is a known issue of Spark on Windows itself and has nothing to do with our program: at JVM shutdown, Spark's cleanup hook tries to delete its temp directory, but Windows still holds a lock on the jar file the JVM has open, so the delete fails. To suppress the error message, add the following lines to log4j.properties under %SPARK_HOME%/conf:

log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
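
Note that these settings only silence the message at shutdown; the leftover spark-* directories under %TEMP% are still not cleaned up automatically and can be deleted by hand if they accumulate.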

If the file does not exist, make a copy of log4j.properties.template in the same directory and rename it to log4j.properties.
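
For example, from a Windows command prompt (assuming the SPARK_HOME environment variable is set):

>copy "%SPARK_HOME%\conf\log4j.properties.template" "%SPARK_HOME%\conf\log4j.properties"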

References

From stackoverflow.com

Installing Hadoop on Windows && debugging Spark programs with IDEA (Scala version)
