Spark Worker launch error: No subfolder can be created in

Solution up front: spark-env.sh has a parameter, SPARK_LOCAL_DIRS, which points to the directory where shuffle data is spilled to disk. This error means that directory does not exist (or cannot be written to) on the worker node. Create the directory and restart the worker, then rebalance the cores and memory assigned to the workers. A minimal sketch is given below.
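For reference, a minimal spark-env.sh sketch; the path /data/spark/local, the core/memory values, and the master host are illustrative assumptions, not values from the original post:

    # spark-env.sh (on each worker node) -- example values, adjust to your cluster
    # SPARK_LOCAL_DIRS: scratch space for shuffle spill files; every path listed
    # here must already exist and be writable by the user running the worker.
    export SPARK_LOCAL_DIRS=/data/spark/local    # hypothetical path
    export SPARK_WORKER_CORES=16                 # hypothetical: balance cores...
    export SPARK_WORKER_MEMORY=64g               # ...and memory across workers

    # Create the directory, then restart the worker so the setting takes effect
    # (script names per Spark 2.x; <master-host> is a placeholder).
    mkdir -p /data/spark/local
    chown spark:spark /data/spark/local
    $SPARK_HOME/sbin/stop-slave.sh
    $SPARK_HOME/sbin/start-slave.sh spark://<master-host>:7077

If SPARK_LOCAL_DIRS lists several comma-separated directories, each one must exist; Spark fails with "No subfolder can be created in" as soon as it cannot create its scratch subdirectory under any configured path.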

For reference, the error log:

18/03/29 09:59:01 ERROR Worker: Failed to launch executor app-20180329063203-1549/1642 for Mobius.di.2::bdp-141. [dispatcher-event-loop-21]
java.io.IOException: No subfolder can be created in .
    at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1$$anonfun$9.apply(Worker.scala:499)
    at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1$$anonfun$9.apply(Worker.scala:484)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at scala.collection.AbstractMap.getOrElse(Map.scala:59)
    at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1.applyOrElse(Worker.scala:484)
    at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
    at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
    at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
    at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)