Spark on YARN test archive

As the spark-class script and the subsequent logs show, Spark startup first reads the environment variables in spark-env.sh (loading SPARK_MEM and related variables), then locates the jars and builds the classpath. In yarn-client mode the job is submitted to run on YARN: the SparkContext runs locally on the client, while the tasks run in the YARN cluster. The job's result and its progress in the cluster can be observed on the web UI (see the figures that follow). The spark-shell command likewise reads the variables in spark-env.sh and reports its status through a port to the web UI, where it can be inspected directly.
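To make the first step concrete, here is a minimal sketch of what such a conf/spark-env.sh could contain. The HADOOP_CONF_DIR and SPARK_JAR values match the exports recorded later in this session; the SPARK_MEM value is an illustrative assumption, not taken from the log.

```shell
# conf/spark-env.sh -- sourced by the spark-class script before any JVM starts.
# SPARK_MEM was the 0.9-era knob for the default JVM heap size (illustrative value).
export SPARK_MEM=512m
# Lets Spark find the YARN ResourceManager and HDFS configuration.
export HADOOP_CONF_DIR=/etc/hadoop/conf
# Assembly jar staged on HDFS so YARN containers can localize it.
export SPARK_JAR=hdfs://demo/user/spark/share/lib/spark-assembly.jar
```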

-bash-4.1$ CLASSPATH=/etc/hadoop/conf

-bash-4.1$ export HADOOP_HOME=/usr/lib/hadoop

-bash-4.1$  CLASSPATH=$CLASSPATH:$HADOOP_HOME/*:$HADOOP_HOME/lib/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/../hadoop-mapreduce/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/../hadoop-mapreduce/lib/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/../hadoop-yarn/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/../hadoop-yarn/lib/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/../hadoop-hdfs/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/../hadoop-hdfs/lib/*

-bash-4.1$ export SPARK_HOME=/usr/lib/spark

-bash-4.1$ CLASSPATH=$CLASSPATH:$SPARK_HOME/assembly/lib/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$SPARK_HOME/examples/lib/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/client/*

-bash-4.1$ CLASSPATH=$CLASSPATH:$HADOOP_HOME/client-0.20/*

-bash-4.1$ export JAVA_HOME=/usr/java/jdk1.7.0_45/

-bash-4.1$ export PATH=$JAVA_HOME/bin:$PATH

-bash-4.1$ export SPARK_JAR=hdfs://demo/user/spark/share/lib/spark-assembly.jar

-bash-4.1$ export HADOOP_CONF_DIR=/etc/hadoop/conf
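The interactive exports above can be collected into a single setup script. This is a sketch assuming the same CDH-style layout under /usr/lib; it only builds the CLASSPATH string, so the wildcard entries stay literal (the shell does not expand globs inside variable assignments) and the JVM expands them at launch.

```shell
#!/bin/sh
# build_classpath.sh -- assemble the Spark-on-YARN classpath in one place.
# Paths assume the CDH layout used in this session; adjust for your install.
export HADOOP_HOME=/usr/lib/hadoop
export SPARK_HOME=/usr/lib/spark
export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=$JAVA_HOME/bin:$PATH
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_JAR=hdfs://demo/user/spark/share/lib/spark-assembly.jar

CLASSPATH=$HADOOP_CONF_DIR
# Hadoop core, MapReduce, YARN, and HDFS jars plus their lib/ dependencies.
for d in "$HADOOP_HOME" \
         "$HADOOP_HOME/../hadoop-mapreduce" \
         "$HADOOP_HOME/../hadoop-yarn" \
         "$HADOOP_HOME/../hadoop-hdfs"; do
  CLASSPATH=$CLASSPATH:$d/*:$d/lib/*
done
# Spark assembly and examples, plus the Hadoop client jars.
CLASSPATH=$CLASSPATH:$SPARK_HOME/assembly/lib/*:$SPARK_HOME/examples/lib/*
CLASSPATH=$CLASSPATH:$HADOOP_HOME/client/*:$HADOOP_HOME/client-0.20/*
export CLASSPATH
echo "$CLASSPATH"
```

Sourcing this once (`. ./build_classpath.sh`) reproduces the environment built line by line above.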

-----------------------------------------spark-shell---------------------------------------------------------------

-bash-4.1$ master=yarn-client spark-shell

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/lib/spark/assembly/lib/spark-assembly_2.10-0.9.0-cdh5.0.0-hadoop2.3.0-cdh5.0.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

14/03/06 17:38:27 INFO spark.HttpServer: Starting HTTP Server

14/03/06 17:38:27 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:38:27 INFO server.AbstractConnector: Started [email protected]:57344

Welcome to

      ____              __

     / __/__  ___ _____/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 0.9.0

      /_/

 

Using Scala version 2.10.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)

Type in expressions to have them evaluated.

Type :help for more information.

14/03/06 17:38:37 INFO slf4j.Slf4jLogger: Slf4jLogger started

14/03/06 17:38:37 INFO Remoting: Starting remoting

14/03/06 17:38:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@saltdb:57724]

14/03/06 17:38:37 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@saltdb:57724]

14/03/06 17:38:38 INFO spark.SparkEnv: Registering BlockManagerMaster

14/03/06 17:38:38 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140306173838-f882

14/03/06 17:38:38 INFO storage.MemoryStore: MemoryStore started with capacity 297.0 MB.

14/03/06 17:38:38 INFO network.ConnectionManager: Bound socket to port 39571 with id = ConnectionManagerId(saltdb,39571)

14/03/06 17:38:38 INFO storage.BlockManagerMaster: Trying to register BlockManager

14/03/06 17:38:38 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager saltdb:39571 with 297.0 MB RAM

14/03/06 17:38:38 INFO storage.BlockManagerMaster: Registered BlockManager

14/03/06 17:38:38 INFO spark.HttpServer: Starting HTTP Server

14/03/06 17:38:38 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:38:38 INFO server.AbstractConnector: Started [email protected]:44142

14/03/06 17:38:38 INFO broadcast.HttpBroadcast: Broadcast server started at http://192.168.10.240:44142

14/03/06 17:38:38 INFO spark.SparkEnv: Registering MapOutputTracker

14/03/06 17:38:38 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-ce8a68d6-5ab3-4122-b579-3a9187fcb1f3

14/03/06 17:38:38 INFO spark.HttpServer: Starting HTTP Server

14/03/06 17:38:38 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:38:38 INFO server.AbstractConnector: Started [email protected]:45789

14/03/06 17:38:39 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage/rdd,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/stage,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/pool,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/environment,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/executors,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/metrics/json,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/static,null}

14/03/06 17:38:39 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/,null}

14/03/06 17:38:39 INFO server.AbstractConnector: Started [email protected]:4040

14/03/06 17:38:39 INFO ui.SparkUI: Started Spark Web UI at http://saltdb:4040

14/03/06 17:38:39 INFO client.AppClient$ClientActor: Connecting to master spark://saltdb:7077...

14/03/06 17:38:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

14/03/06 17:38:45 INFO repl.SparkILoop: Created spark context..

Spark context available as sc.

 

scala>

      

scala>

 

--------------Run spark-shell a second time to check stability--------------------------------------------------------

-bash-4.1$ master=yarn-client spark-shell

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/lib/spark/assembly/lib/spark-assembly_2.10-0.9.0-cdh5.0.0-hadoop2.3.0-cdh5.0.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

14/03/06 17:45:37 INFO spark.HttpServer: Starting HTTP Server

14/03/06 17:45:37 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:45:37 INFO server.AbstractConnector: Started [email protected]:37412

Welcome to

      ____              __

     / __/__  ___ _____/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 0.9.0

      /_/

 

Using Scala version 2.10.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)

Type in expressions to have them evaluated.

Type :help for more information.

14/03/06 17:45:53 INFO slf4j.Slf4jLogger: Slf4jLogger started

14/03/06 17:45:53 INFO Remoting: Starting remoting

14/03/06 17:45:55 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@saltdb:40986]

14/03/06 17:45:55 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@saltdb:40986]

14/03/06 17:45:55 INFO spark.SparkEnv: Registering BlockManagerMaster

14/03/06 17:45:55 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140306174555-6817

14/03/06 17:45:55 INFO storage.MemoryStore: MemoryStore started with capacity 297.0 MB.

14/03/06 17:45:55 INFO network.ConnectionManager: Bound socket to port 42956 with id = ConnectionManagerId(saltdb,42956)

14/03/06 17:45:55 INFO storage.BlockManagerMaster: Trying to register BlockManager

14/03/06 17:45:55 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager saltdb:42956 with 297.0 MB RAM

14/03/06 17:45:55 INFO storage.BlockManagerMaster: Registered BlockManager

14/03/06 17:45:55 INFO spark.HttpServer: Starting HTTP Server

14/03/06 17:45:55 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:45:55 INFO server.AbstractConnector: Started [email protected]:41502

14/03/06 17:45:55 INFO broadcast.HttpBroadcast: Broadcast server started at http://192.168.10.240:41502

14/03/06 17:45:55 INFO spark.SparkEnv: Registering MapOutputTracker

14/03/06 17:45:55 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-6a564fdb-d60c-4288-8ffb-c3b92962a653

14/03/06 17:45:55 INFO spark.HttpServer: Starting HTTP Server

14/03/06 17:45:55 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:45:55 INFO server.AbstractConnector: Started [email protected]:52291

14/03/06 17:45:56 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage/rdd,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/stage,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/pool,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/environment,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/executors,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/metrics/json,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/static,null}

14/03/06 17:45:56 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/,null}

14/03/06 17:45:56 INFO server.AbstractConnector: Started [email protected]:4040

14/03/06 17:45:56 INFO ui.SparkUI: Started Spark Web UI at http://saltdb:4040

14/03/06 17:45:57 INFO client.AppClient$ClientActor: Connecting to master spark://saltdb:7077...

14/03/06 17:46:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

14/03/06 17:46:02 INFO repl.SparkILoop: Created spark context..

Spark context available as sc.

scala>

 

scala>

 

------------------Submit a Spark job to YARN; the result is correct-------------------------------------------------------

-bash-4.1$ java -cp $CLASSPATH org.apache.spark.examples.SparkPi yarn-client 10

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/lib/spark/assembly/lib/spark-assembly_2.10-0.9.0-cdh5.0.0-hadoop2.3.0-cdh5.0.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

14/03/06 18:57:25 INFO slf4j.Slf4jLogger: Slf4jLogger started

14/03/06 18:57:25 INFO Remoting: Starting remoting

14/03/06 18:57:25 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@saltdb:58888]

14/03/06 18:57:25 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@saltdb:58888]

14/03/06 18:57:25 INFO spark.SparkEnv: Registering BlockManagerMaster

14/03/06 18:57:25 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140306185725-0f24

14/03/06 18:57:25 INFO storage.MemoryStore: MemoryStore started with capacity 73.1 MB.

14/03/06 18:57:25 INFO network.ConnectionManager: Bound socket to port 42703 with id = ConnectionManagerId(saltdb,42703)

14/03/06 18:57:25 INFO storage.BlockManagerMaster: Trying to register BlockManager

14/03/06 18:57:25 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager saltdb:42703 with 73.1 MB RAM

14/03/06 18:57:25 INFO storage.BlockManagerMaster: Registered BlockManager

14/03/06 18:57:25 INFO spark.HttpServer: Starting HTTP Server

14/03/06 18:57:25 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 18:57:25 INFO server.AbstractConnector: Started [email protected]:33783

14/03/06 18:57:25 INFO broadcast.HttpBroadcast: Broadcast server started at http://192.168.10.240:33783

14/03/06 18:57:25 INFO spark.SparkEnv: Registering MapOutputTracker

14/03/06 18:57:25 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-9c1a614e-8299-4824-81ad-f23f4f907417

14/03/06 18:57:25 INFO spark.HttpServer: Starting HTTP Server

14/03/06 18:57:25 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 18:57:25 INFO server.AbstractConnector: Started [email protected]:49453

14/03/06 18:57:25 INFO server.Server: jetty-7.x.y-SNAPSHOT

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage/rdd,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/stage,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/pool,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/environment,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/executors,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/metrics/json,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/static,null}

14/03/06 18:57:25 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/,null}

14/03/06 18:57:25 INFO server.AbstractConnector: Started [email protected]:4040

14/03/06 18:57:25 INFO ui.SparkUI: Started Spark Web UI at http://saltdb:4040

14/03/06 18:57:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

14/03/06 18:57:26 INFO spark.SparkContext: Added JAR /usr/lib/spark/examples/lib/spark-examples_2.10-0.9.0-cdh5.0.0.jar at http://192.168.10.240:49453/jars/spark-examples_2.10-0.9.0-cdh5.0.0.jar with timestamp 1394103446686

14/03/06 18:57:27 INFO yarn.Client: Got Cluster metric info from ApplicationsManager (ASM), number of NodeManagers: 3

14/03/06 18:57:27 INFO yarn.Client: Queue info ... queueName: root.default, queueCurrentCapacity: 0.0, queueMaxCapacity: -1.0,

      queueApplicationCount = 0, queueChildQueueCount = 0

14/03/06 18:57:27 INFO yarn.Client: Max mem capabililty of a single resource in this cluster 8192

14/03/06 18:57:27 INFO yarn.Client: Preparing Local resources

14/03/06 18:57:48 INFO yarn.Client: Uploading hdfs://demo/user/spark/share/lib/spark-assembly.jar to hdfs://demo/user/hdfs/.sparkStaging/application_1399648782708_0003/spark-assembly.jar

14/03/06 18:57:50 INFO yarn.Client: Setting up the launch environment

14/03/06 18:57:50 INFO yarn.Client: Setting up container launch context

14/03/06 18:57:50 INFO yarn.Client: Command for starting the Spark ApplicationMaster: $JAVA_HOME/bin/java -server -Xmx512m -Djava.io.tmpdir=$PWD/tmp org.apache.spark.deploy.yarn.WorkerLauncher --class notused --jar null --args  'saltdb:58888'  --worker-memory 1024 --worker-cores 1 --num-workers 2 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr

14/03/06 18:57:50 INFO yarn.Client: Submitting application to ASM

14/03/06 18:57:51 INFO impl.YarnClientImpl: Submitted application application_1399648782708_0003

14/03/06 18:57:51 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:52 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:53 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:54 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:55 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:56 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:57 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: -1

         appStartTime: 1399650490673

         yarnAppState: ACCEPTED

 

14/03/06 18:57:58 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:

         appMasterRpcPort: 0

         appStartTime: 1399650490673

         yarnAppState: RUNNING

 

14/03/06 18:58:00 INFO cluster.YarnClientClusterScheduler: YarnClientClusterScheduler.postStartHook done

14/03/06 18:58:00 INFO cluster.YarnClientSchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@hadoop3:47746/user/Executor#56353724] with ID 2

14/03/06 18:58:01 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:39

14/03/06 18:58:01 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:39) with 10 output partitions (allowLocal=false)

14/03/06 18:58:01 INFO scheduler.DAGScheduler: Final stage: Stage 0 (reduce at SparkPi.scala:39)

14/03/06 18:58:01 INFO scheduler.DAGScheduler: Parents of final stage: List()

14/03/06 18:58:01 INFO scheduler.DAGScheduler: Missing parents: List()

14/03/06 18:58:01 INFO scheduler.DAGScheduler: Submitting Stage 0 (MappedRDD[1] at map at SparkPi.scala:35), which has no missing parents

14/03/06 18:58:01 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager hadoop3:57329 with 589.2 MB RAM

14/03/06 18:58:02 INFO scheduler.DAGScheduler: Submitting 10 missing tasks from Stage 0 (MappedRDD[1] at map at SparkPi.scala:35)

14/03/06 18:58:02 INFO cluster.YarnClientClusterScheduler: Adding task set 0.0 with 10 tasks

14/03/06 18:58:02 INFO scheduler.TaskSetManager: Starting task 0.0:0 as TID 0 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:02 INFO scheduler.TaskSetManager: Serialized task 0.0:0 as 1413 bytes in 7 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:1 as TID 1 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:1 as 1413 bytes in 0 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 0 in 1271 ms on hadoop3 (progress: 0/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 0)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:2 as TID 2 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:2 as 1413 bytes in 1 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 1 in 37 ms on hadoop3 (progress: 1/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 1)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:3 as TID 3 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:3 as 1413 bytes in 1 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 2 in 28 ms on hadoop3 (progress: 2/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 2)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:4 as TID 4 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:4 as 1413 bytes in 1 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 3 in 34 ms on hadoop3 (progress: 3/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 3)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:5 as TID 5 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:5 as 1413 bytes in 0 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 4 in 26 ms on hadoop3 (progress: 4/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 4)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:6 as TID 6 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:6 as 1413 bytes in 0 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 5 in 23 ms on hadoop3 (progress: 5/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 5)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:7 as TID 7 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:7 as 1413 bytes in 0 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 6 in 30 ms on hadoop3 (progress: 6/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 6)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:8 as TID 8 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:8 as 1413 bytes in 1 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 7 in 32 ms on hadoop3 (progress: 7/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 7)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Starting task 0.0:9 as TID 9 on executor 2: hadoop3 (PROCESS_LOCAL)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Serialized task 0.0:9 as 1413 bytes in 0 ms

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 8 in 29 ms on hadoop3 (progress: 8/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 8)

14/03/06 18:58:03 INFO scheduler.TaskSetManager: Finished TID 9 in 24 ms on hadoop3 (progress: 9/10)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Completed ResultTask(0, 9)

14/03/06 18:58:03 INFO scheduler.DAGScheduler: Stage 0 (reduce at SparkPi.scala:39) finished in 1.495 s

14/03/06 18:58:03 INFO cluster.YarnClientClusterScheduler: Remove TaskSet 0.0 from pool

14/03/06 18:58:03 INFO spark.SparkContext: Job finished: reduce at SparkPi.scala:39, took 1.741379435 s

Pi is roughly 3.142412

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/static,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/metrics/json,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/executors,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/environment,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/stages,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/stages/pool,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/stages/stage,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/storage,null}

14/03/06 18:58:03 INFO handler.ContextHandler: stopped o.e.j.s.h.ContextHandler{/storage/rdd,null}

14/03/06 18:58:03 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors

14/03/06 18:58:03 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down

14/03/06 18:58:03 INFO cluster.YarnClientSchedulerBackend: Stoped

14/03/06 18:58:04 INFO spark.MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!

14/03/06 18:58:04 INFO network.ConnectionManager: Selector thread was interrupted!

14/03/06 18:58:04 INFO network.ConnectionManager: ConnectionManager stopped

14/03/06 18:58:04 INFO storage.MemoryStore: MemoryStore cleared

14/03/06 18:58:04 INFO storage.BlockManager: BlockManager stopped

14/03/06 18:58:04 INFO storage.BlockManagerMasterActor: Stopping BlockManagerMaster

14/03/06 18:58:04 INFO storage.BlockManagerMaster: BlockManagerMaster stopped

14/03/06 18:58:04 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.

14/03/06 18:58:04 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.

14/03/06 18:58:04 INFO spark.SparkContext: Successfully stopped SparkContext

-bash-4.1$
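For reference, the SparkPi example estimates Pi by Monte Carlo sampling: each task draws random points in the square [-1,1] x [-1,1] and counts how many fall inside the unit circle, and the driver reduces the counts into 4 * hits / n. The same arithmetic can be sketched locally in plain awk (a standalone illustration of the math, not the distributed Spark code):

```shell
# Monte Carlo estimate of Pi, mirroring the sampling SparkPi distributes
# across YARN executors: 4 * (points inside unit circle / total points).
awk 'BEGIN {
  srand(42)                      # fixed seed for repeatability
  n = 100000; hits = 0
  for (i = 0; i < n; i++) {
    x = 2*rand() - 1; y = 2*rand() - 1    # random point in the square
    if (x*x + y*y <= 1) hits++            # inside the inscribed circle?
  }
  printf "Pi is roughly %f\n", 4*hits/n
}'
```

With 100,000 samples the estimate lands near 3.14, which is why the job above prints "Pi is roughly 3.142412" rather than an exact value.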

 
