Spark WordCount Data Flow Analysis


import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by huangle63 on 2016/9/22.
  */
object WordCount {
	def main(args: Array[String]): Unit = {
		val conf = new SparkConf()
		conf.setAppName("My first spark app")
		conf.setMaster("local")
		val sc = new SparkContext(conf)
		// Read the input file as an RDD of lines (one partition)
		val lines = sc.textFile("E://SOFTLEARN//BOOKDATA//dataset//test.txt", 1)
		// Split each line into words
		val words = lines.flatMap { line => line.split(" ") }
		// Map each word to a (word, 1) pair
		val pairs = words.map((_, 1))
		// Sum the counts for each word
		val wordcounts = pairs.reduceByKey(_ + _)
		wordcounts.foreach(println)
		sc.stop()
	}
}

Contents of test.txt:

Hello Spark Hello Scala
Hello Hadoop
Hello Flink
Spark is Awesome
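To make the data flow concrete, here is a sketch of the same pipeline using plain Scala collections instead of RDDs. The shape is identical: `flatMap` turns each line into words, `map` pairs each word with a 1, and the grouping-plus-sum step plays the role of Spark's `reduceByKey` (which in a real cluster also triggers a shuffle). The object and method names here are my own, for illustration only.

```scala
// Plain-Scala sketch of the WordCount data flow (no Spark required).
object WordCountFlow {
  // Mirrors the RDD pipeline: flatMap -> map -> reduceByKey
  def count(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))   // "Hello Spark" -> Seq("Hello", "Spark")
      .map((_, 1))             // "Hello" -> ("Hello", 1)
      .groupBy(_._1)           // group pairs by word (the shuffle step in Spark)
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) } // sum per word

  def main(args: Array[String]): Unit = {
    val lines = Seq(
      "Hello Spark Hello Scala",
      "Hello Hadoop",
      "Hello Flink",
      "Spark is Awesome")
    count(lines).foreach(println)
  }
}
```

Running this on the sample input above yields (Hello,4), (Spark,2), and a count of 1 for each remaining word.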





