Spark development in IDEA without sbt

  • Create a non-sbt Scala project
  • Add the Spark jar
File -> Project Structure -> Libraries, then reference spark-assembly-1.5.2-hadoop2.6.0.jar
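If the library reference is correct, the org.apache.spark imports used below will resolve. As a quick sanity check, a minimal sketch like the following (VersionCheck is a hypothetical name, not part of the project) should compile and print the library version:

import org.apache.spark.{SparkConf, SparkContext}

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // If this compiles, the spark-assembly jar is on the classpath
    val sc = new SparkContext(new SparkConf().setAppName("check").setMaster("local"))
    println("Spark version: " + sc.version) // expect 1.5.2 here
    sc.stop()
  }
}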
  • Write the code
import scala.math.random
import org.apache.spark._
/**
 * Created by code-pc on 16/3/2.
 */
object test1 {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Pi").setMaster("local")
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2 // partition count, optionally from args
    val n = 100000 * slices
    val count = spark.parallelize(1 to n, slices).map { i =>
      // Sample a uniform random point in the square [-1, 1) x [-1, 1)
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0 // 1 if the point lands inside the unit circle
    }.reduce(_ + _)
    // The circle covers pi/4 of the square, so pi is about 4 * count / n
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
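Why this estimates pi: the sampled points are uniform over the square [-1, 1) x [-1, 1), and the unit circle covers pi/4 of that square, so four times the fraction of hits converges to pi. The same estimate can be checked in plain Scala without Spark; a minimal sketch (PiCheck is a hypothetical name):

import scala.math.random

object PiCheck {
  def main(args: Array[String]): Unit = {
    val n = 200000
    // Count uniform random points in [-1, 1) x [-1, 1) that land inside the unit circle
    val inside = (1 to n).count { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      x * x + y * y < 1
    }
    // The circle covers pi/4 of the square, so scale the hit rate by 4
    println("Pi is roughly " + 4.0 * inside / n)
  }
}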
  • Build the jar
File -> Project Structure -> Artifacts -> + -> JAR -> From modules with dependencies, then pick test1 as the Main Class when prompted
Menu bar: Build -> Build Artifacts
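Before submitting, it is worth confirming that the entry class actually made it into the artifact. Assuming the artifact path used in the run step below, the jar tool can list the contents:

jar tf ~/IdeaProjects/test1/out/artifacts/Pi/test1.jar | grep test1

The listing should include test1.class; if it does not, recheck the Main Class and module selection in the artifact settings.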
  • Run
./spark-submit --class test1 --master local ~/IdeaProjects/test1/out/artifacts/Pi/test1.jar
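The trailing path is the artifact built in the previous step. The program also accepts an optional first argument, which the code reads as the number of slices (args(0)); for example, a run with ten partitions (the value 10 here is arbitrary):

./spark-submit --class test1 --master local ~/IdeaProjects/test1/out/artifacts/Pi/test1.jar 10

Note that the setMaster("local") hard-coded in the SparkConf takes precedence over the --master flag, so to choose the master at submit time, drop setMaster from the code.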