An analysis of Spark's shell scripts

  1. bin directory: { spark-shell , spark-sql } --> spark-submit --> spark-class
  2. sbin directory: cluster start/stop scripts (not covered in this post)

part1: the bin directory

spark-shell

function main() {
  export SPARK_SUBMIT_OPTS
  # Forward every user argument through to spark-submit, with the REPL as the main class.
  "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
}
main "$@"

spark-sql

exec "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver "$@"

spark-submit

exec "${SPARK_HOME}"/bin/spark-class org.apache.spark.deploy.SparkSubmit "$@"

spark-class

# For tests
build_command() {
  # launcher.Main prints the final launch command as NUL-separated words;
  # we then append the launcher's own exit code, also NUL-terminated.
  "${JAVA_HOME}/bin/java" -Xmx128m -cp "${SPARK_HOME}/jars/*" org.apache.spark.launcher.Main "$@"
  printf "%d\0" $?
}

CMD=()
# Read the NUL-delimited words emitted by build_command into an array.
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(build_command "$@")

COUNT=${#CMD[@]}
LAST=$((COUNT - 1))
# The last array element is the exit code appended by build_command.
LAUNCHER_EXIT_CODE=${CMD[$LAST]}
if [ "$LAUNCHER_EXIT_CODE" != 0 ]; then
  exit "$LAUNCHER_EXIT_CODE"
fi

# Drop the exit code, then replace this shell process with the generated command.
CMD=("${CMD[@]:0:$LAST}")
exec "${CMD[@]}"

part2: the Spark Scala classes being invoked

The main method of the Scala class org.apache.spark.deploy.SparkSubmit

def main(args: Array[String]): Unit = {
  // Parse the command line, merging defaults from the properties file
  // and environment variables.
  val appArgs = new SparkSubmitArguments(args)

  if (appArgs.verbose) {
    printStream.println(appArgs)
  }

  // Dispatch on the requested action; SUBMIT is the default.
  appArgs.action match {
    case SparkSubmitAction.SUBMIT => submit(appArgs)
    case SparkSubmitAction.KILL => kill(appArgs)
    case SparkSubmitAction.REQUEST_STATUS => requestStatus(appArgs)
  }
}
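Which case fires is decided during parsing in SparkSubmitArguments: the --kill and --status flags (available for standalone and Mesos cluster deployments) select the KILL and REQUEST_STATUS actions, and everything else defaults to SUBMIT. The driver ID, master URL, class, and jar below are placeholders:

# SUBMIT (default action): launch an application
./bin/spark-submit --master spark://host:7077 --deploy-mode cluster --class my.Main my-app.jar

# KILL: terminate a previously submitted driver
./bin/spark-submit --master spark://host:7077 --kill driver-20200101000000-0000

# REQUEST_STATUS: query a driver's state
./bin/spark-submit --master spark://host:7077 --status driver-20200101000000-0000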