Spark error: Hot Swap failed, executor log ends with RECEIVED SIGNAL 15: SIGTERM

Hot Swap failed
        Test3: hierarchy change not implemented; 
        Test3: Operation not supported by VM
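For context (an assumption about the environment, not stated in the post): the two "Test3" lines are typical of the JVM's standard hot code replace, which only accepts edits that change an existing method body. Adding or removing members, or changing a class's superclass or interfaces, is a "hierarchy change" and the stock VM rejects it with "Operation not supported by VM". A minimal sketch of the distinction, with hypothetical class names:

```java
// HotSwapSketch.java -- hypothetical illustration, not code from the post.
// Standard JVM hot code replace (JVMTI RedefineClasses) accepts edits that
// only change method bodies, and rejects structural edits to a class.

class Test3 {                       // stand-in for the class being hot-swapped
    int answer() {
        return 41;                  // OK to hot-swap: editing this body (e.g.
    }                               // to "return 42;") changes only bytecode
                                    // inside an existing method
}

// NOT hot-swappable while debugging: making Test3 extend a new superclass or
// implement a new interface is a hierarchy change, which the stock VM reports
// as "hierarchy change not implemented" / "Operation not supported by VM".
// A full restart of the debuggee is needed for edits like:
//     class Test3 extends SomeBase { ... }

public class HotSwapSketch {
    public static void main(String[] args) {
        System.out.println(new Test3().answer());
    }
}
```

In practice this means the Hot Swap failure is independent of the Spark log below: it only says the debugger could not patch the running JVM, so the old bytecode kept executing.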

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/28 05:42:58 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/07/28 05:42:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/07/28 05:42:59 INFO SecurityManager: Changing view acls to: lin
16/07/28 05:42:59 INFO SecurityManager: Changing modify acls to: lin
16/07/28 05:42:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(lin); users with modify permissions: Set(lin)
16/07/28 05:42:59 INFO SecurityManager: Changing view acls to: lin
16/07/28 05:42:59 INFO SecurityManager: Changing modify acls to: lin
16/07/28 05:42:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(lin); users with modify permissions: Set(lin)
16/07/28 05:43:00 INFO Slf4jLogger: Slf4jLogger started
16/07/28 05:43:00 INFO Remoting: Starting remoting
16/07/28 05:43:00 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:47001]
16/07/28 05:43:00 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 47001.
16/07/28 05:43:00 INFO DiskBlockManager: Created local directory at /tmp/spark-f433d17e-57d3-48d2-8edd-84eb33eb2808/executor-c8bbebdc-0763-414e-a4aa-8298c064b7cf/blockmgr-afe3481c-f6ea-483f-b7cb-f5006e3cd2c7
16/07/28 05:43:00 INFO MemoryStore: MemoryStore started with capacity 511.0 MB
16/07/28 05:43:00 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://[email protected]:53018
16/07/28 05:43:00 INFO WorkerWatcher: Connecting to worker spark://[email protected]:54752
16/07/28 05:43:00 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
16/07/28 05:43:00 INFO Executor: Starting executor ID 0 on host zookeeper2
16/07/28 05:43:00 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42924.
16/07/28 05:43:00 INFO NettyBlockTransferService: Server created on 42924
16/07/28 05:43:00 INFO BlockManagerMaster: Trying to register BlockManager
16/07/28 05:43:00 INFO BlockManagerMaster: Registered BlockManager
16/07/28 05:43:00 INFO CoarseGrainedExecutorBackend: Got assigned task 0
16/07/28 05:43:00 INFO CoarseGrainedExecutorBackend: Got assigned task 1
16/07/28 05:43:00 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
16/07/28 05:43:00 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/07/28 05:43:00 INFO Executor: Fetching http://192.168.1.121:49535/jars/SprakJustRowkey.jar with timestamp 1469709778265
16/07/28 05:43:00 INFO Utils: Fetching http://192.168.1.121:49535/jars/SprakJustRowkey.jar to /tmp/spark-f433d17e-57d3-48d2-8edd-84eb33eb2808/executor-c8bbebdc-0763-414e-a4aa-8298c064b7cf/spark-9625a52d-1a42-4545-a98c-1e6a5b460bd5/fetchFileTemp1622181222613571614.tmp
16/07/28 05:43:02 INFO Utils: Copying /tmp/spark-f433d17e-57d3-48d2-8edd-84eb33eb2808/executor-c8bbebdc-0763-414e-a4aa-8298c064b7cf/spark-9625a52d-1a42-4545-a98c-1e6a5b460bd5/6067580111469709778265_cache to /home/lin/Desktop/testHBase/spark-1.6.1-bin-hadoop2.6/work/app-20160728054258-0013/0/./SprakJustRowkey.jar
16/07/28 05:43:03 INFO Executor: Adding file:/home/lin/Desktop/testHBase/spark-1.6.1-bin-hadoop2.6/work/app-20160728054258-0013/0/./SprakJustRowkey.jar to class loader
16/07/28 05:43:03 INFO TorrentBroadcast: Started reading broadcast variable 0
16/07/28 05:43:03 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 21.7 KB, free 21.7 KB)
16/07/28 05:43:03 INFO TorrentBroadcast: Reading broadcast variable 0 took 198 ms
16/07/28 05:43:03 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 63.7 KB, free 85.5 KB)
16/07/28 05:43:03 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
16/07/28 05:43:03 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
16/07/28 05:43:03 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
16/07/28 05:43:03 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
16/07/28 05:43:03 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
16/07/28 05:43:04 INFO FileOutputCommitter: Saved output of task 'attempt_201607280542_0000_m_000001_1' to file:/home/lin/Desktop/testHBase/TestSpark1/3/_temporary/0/task_201607280542_0000_m_000001
16/07/28 05:43:04 INFO FileOutputCommitter: Saved output of task 'attempt_201607280542_0000_m_000000_0' to file:/home/lin/Desktop/testHBase/TestSpark1/3/_temporary/0/task_201607280542_0000_m_000000
16/07/28 05:43:04 INFO SparkHadoopMapRedUtil: attempt_201607280542_0000_m_000001_1: Committed
16/07/28 05:43:04 INFO SparkHadoopMapRedUtil: attempt_201607280542_0000_m_000000_0: Committed
16/07/28 05:43:04 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1865 bytes result sent to driver
16/07/28 05:43:04 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1865 bytes result sent to driver
16/07/28 05:43:04 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
16/07/28 05:43:04 ERROR CoarseGrainedExecutorBackend: RECEIVED SIGNAL 15: SIGTERM