Cause
An external variable referenced inside a Spark operator is actually a copy of that variable: modifying it inside the operator only changes the copy, and the variable on the driver side remains unchanged.
Put simply, values assigned inside foreach cannot be brought back out, unless you use map and return the results as an RDD to collect on the driver.
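The map-and-collect alternative mentioned above can be sketched as follows. This is a minimal sketch, assuming the same fileRdd of (imsi, time, cell) tuples used in the solutions below; the object name maptest is hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object maptest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[1]").setAppName("MapTest")
    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")

    val fileRdd = sc.parallelize(Array(
      ("imsi1", "2018-07-29 11:22:23", "zd-A"),
      ("imsi2", "2018-07-29 11:22:24", "zd-A"),
      ("imsi3", "2018-07-29 11:22:25", "zd-A")))

    // Transform on the executors, then collect the results back to the driver.
    // No shared mutable state is needed: the data flows back as the RDD's result.
    val result: Map[String, String] = fileRdd
      .map(input => input._1 -> (input._1 + "\t" + input._2 + "\t" + input._3))
      .collect()
      .toMap

    println(result.size) // 3

    sc.stop()
  }
}
```

Because collect() materializes the RDD on the driver, the resulting Map is an ordinary driver-side value, unlike a variable mutated inside foreach.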
Solutions:
1. Use a broadcast variable
```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.broadcast.Broadcast
import scala.collection.mutable

object foreachtest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setMaster("local[1]")
    conf.setAppName("WcAppTask")
    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")

    val fileRdd = sc.parallelize(Array(
      ("imsi1", "2018-07-29 11:22:23", "zd-A"),
      ("imsi2", "2018-07-29 11:22:24", "zd-A"),
      ("imsi3", "2018-07-29 11:22:25", "zd-A")))

    val result = mutable.Map.empty[String, String]
    val resultBroadCast: Broadcast[mutable.Map[String, String]] = sc.broadcast(result)

    fileRdd.foreach(input => {
      val str = input._1 + "\t" + input._2 + "\t" + input._3
      resultBroadCast.value += (input._1 -> str)
      println(resultBroadCast.value.size) // prints 1, 2, 3
    })

    println(result.size) // prints 3
  }
}
```

Note that this works here only because local mode runs everything in one JVM, so resultBroadCast.value is the same object as result. Broadcast variables are intended to be read-only; on a real cluster, mutations made on remote executors will not propagate back to the driver.
2. Use a collection accumulator
```scala
val accum = sc.collectionAccumulator[mutable.Map[String, String]]("My Accumulator")
fileRdd.foreach(input => {
  val str = input._1 + "\t" + input._2 + "\t" + input._3
  accum.add(mutable.Map(input._1 -> str))
})
println(accum.value.size()) // prints 3, one entry per add
```
3. Use a longAccumulator for counting
```scala
val longaa = sc.longAccumulator("count")
fileRdd.foreach(input => {
  longaa.add(1L)
})
println(longaa.count) // prints 3
```