Overview
This post is reproduced from elsewhere; noting that here for the record.
Today we are not going to discuss any of Spark's intricate internals; instead, let's talk briefly about how to trace through its code. As is well known, Spark is written in Scala, and Scala's abundant syntactic sugar means that when following the code you can easily lose the thread. On top of that, Spark uses Akka for message passing, so how do you work out who the receiver of a given message is?
new Throwable().printStackTrace
When tracing code, we often lean on the logs: for every line of log output, we want to know who the caller was. But when your knowledge of the Spark codebase, or of Scala itself, is not deep enough, the answer can be elusive. Is there a simple shortcut?
My shortcut is to add the following line at the point where the log statement appears:
new Throwable().printStackTrace()
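To see what this line does in isolation, here is a minimal standalone Scala sketch, nothing Spark-specific: constructing a Throwable captures the current call stack, and printStackTrace dumps it to stderr without anything actually being thrown.

object TraceDemo {
  def inner(): Unit = {
    // Capture and print the call chain that led here; no exception is raised.
    new Throwable().printStackTrace()
  }
  def outer(): Unit = inner()

  def main(args: Array[String]): Unit = outer()
}

Running it prints a java.lang.Throwable header followed by frames for inner, outer, and main -- exactly the kind of trace exploited below.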
Here is a concrete example.
After starting spark-shell, typing the very simple line sc.textFile("README.md") produces the following log output:
14/07/05 19:53:27 INFO MemoryStore: ensureFreeSpace(32816) called with curMem=0, maxMem=308910489
14/07/05 19:53:27 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 32.0 KB, free 294.6 MB)
14/07/05 19:53:27 DEBUG BlockManager: Put block broadcast_0 locally took 78 ms
14/07/05 19:53:27 DEBUG BlockManager: Putting block broadcast_0 without replication took 79 ms
res0: org.apache.spark.rdd.RDD[String] = README.md MappedRDD[1] at textFile at <console>:13
Now suppose I want to know who calls the tryToPut function that emits the second log line. What to do?
Open MemoryStore.scala and find this statement:
logInfo("Block %s stored as %s in memory (estimated size %s, free %s)".format( blockId, valuesOrBytes, Utils.bytesToString(size), Utils.bytesToString(freeMemory)))
Immediately above it, add:
new Throwable().printStackTrace()
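The touched region of tryToPut then looks roughly like this (a sketch of the surrounding code, not a verbatim copy of the file):

// Temporary debugging aid -- remove before committing.
new Throwable().printStackTrace()
logInfo("Block %s stored as %s in memory (estimated size %s, free %s)".format(
  blockId, valuesOrBytes, Utils.bytesToString(size), Utils.bytesToString(freeMemory)))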
Then rebuild the source:
sbt/sbt assembly
Open spark-shell again and run sc.textFile("README.md"). You now get the output below, which shows exactly who calls tryToPut: reading the trace top-down, tryToPut is reached via putValues, doPut, put, and putSingle, which were in turn triggered by HttpBroadcast's constructor, all the way down to the textFile call itself.
14/07/05 19:53:27 INFO MemoryStore: ensureFreeSpace(32816) called with curMem=0, maxMem=308910489
14/07/05 19:53:27 WARN MemoryStore: just show the calltrace by entering some modified code
java.lang.Throwable
	at org.apache.spark.storage.MemoryStore.tryToPut(MemoryStore.scala:182)
	at org.apache.spark.storage.MemoryStore.putValues(MemoryStore.scala:76)
	at org.apache.spark.storage.MemoryStore.putValues(MemoryStore.scala:92)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:699)
	at org.apache.spark.storage.BlockManager.put(BlockManager.scala:570)
	at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:821)
	at org.apache.spark.broadcast.HttpBroadcast.<init>(HttpBroadcast.scala:52)
	at org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcastFactory.scala:35)
	at org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcastFactory.scala:29)
	at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
	at org.apache.spark.SparkContext.broadcast(SparkContext.scala:787)
	at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:556)
	at org.apache.spark.SparkContext.textFile(SparkContext.scala:468)
	at $line5.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:13)
	at $line5.$read$$iwC$$iwC$$iwC.<init>(<console>:18)
	at $line5.$read$$iwC$$iwC.<init>(<console>:20)
	at $line5.$read$$iwC.<init>(<console>:22)
	at $line5.$read.<init>(<console>:24)
	at $line5.$read$.<init>(<console>:28)
	at $line5.$read$.<clinit>(<console>)
	at $line5.$eval$.<init>(<console>:7)
	at $line5.$eval$.<clinit>(<console>)
	at $line5.$eval.$print(<console>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
	at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
	at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
	at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
	at org.apache.spark.repl.Main$.main(Main.scala:31)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
14/07/05 19:53:27 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 32.0 KB, free 294.6 MB)
14/07/05 19:53:27 DEBUG BlockManager: Put block broadcast_0 locally took 78 ms
14/07/05 19:53:27 DEBUG BlockManager: Putting block broadcast_0 without replication took 79 ms
res0: org.apache.spark.rdd.RDD[String] = README.md MappedRDD[1] at textFile at <console>:13
Syncing with git
Having made such a modification, suppose you do not want to commit it; how do you bring the latest upstream content back into your local tree? One way is the pair of commands below; note that git reset --hard throws away all uncommitted local changes, including the printStackTrace line added above:
git reset --hard
git pull origin master
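If you would rather keep the experimental change instead of discarding it, stashing achieves the same sync while preserving the edit; this is plain git, nothing Spark-specific:

git stash
git pull origin master
git stash pop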
Tracing Akka messages
Finding out who receives a message is relatively easy; skilled use of grep is all it takes, provided you have at least a passing familiarity with the actor model.
Again, a concrete example: we know that CoarseGrainedSchedulerBackend sends out LaunchTask messages, so who is the receiver? Just run the following command:
grep LaunchTask -r core/src/main
The output below makes it clear that CoarseGrainedExecutorBackend is the receiver of LaunchTask; to see what happens after the message arrives, simply read the receiver's receive function.
core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala: case LaunchTask(data) =>
core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala: logError("Received LaunchTask command but executor was null")
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedClusterMessage.scala: case class LaunchTask(data: SerializableBuffer) extends CoarseGrainedClusterMessage
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala: executorActor(task.executorId) ! LaunchTask(new SerializableBuffer(serializedTask))
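For readers less familiar with the actor model, here is a minimal, self-contained sketch of the send/receive pattern that the grep output reveals. The names deliberately mirror Spark's, but the code is a simplified, hypothetical illustration rather than the actual implementation; it assumes the akka-actor library (the 2.2/2.3-era classic API, matching the Spark version discussed here) is on the classpath.

import akka.actor.{Actor, ActorSystem, Props}

// Simplified stand-in for Spark's LaunchTask message
// (the real one carries a SerializableBuffer, not a String).
case class LaunchTask(data: String)

// Receiver, analogous to CoarseGrainedExecutorBackend:
// its receive function pattern-matches on incoming messages.
class ExecutorBackend extends Actor {
  def receive = {
    case LaunchTask(data) =>
      println(s"launching task with payload: $data")
  }
}

object LaunchTaskDemo {
  def main(args: Array[String]): Unit = {
    val system = ActorSystem("demo")
    val executorActor = system.actorOf(Props[ExecutorBackend], "executor")
    // Sender side, analogous to CoarseGrainedSchedulerBackend's
    //   executorActor(task.executorId) ! LaunchTask(...)
    executorActor ! LaunchTask("serialized-task-bytes")
    Thread.sleep(500)  // give the asynchronous message time to be processed
    system.shutdown()  // akka 2.2/2.3-era API (replaced by terminate() in 2.4+)
  }
}

The ! operator sends the message asynchronously, and the actor's receive partial function pattern-matches on it; that is precisely the case LaunchTask(data) => line the grep found in CoarseGrainedExecutorBackend.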