Setting Up Spark: Single-Machine Mode

This article walks through: Install => Run => Spark-Shell

Install

cd /opt/services/

wget https://mirror.tuna.tsinghua.edu.cn/apache/spark/spark-2.4.7/spark-2.4.7-bin-hadoop2.7.tgz

tar xf spark-2.4.7-bin-hadoop2.7.tgz

mv spark-2.4.7-bin-hadoop2.7 spark

Run

cd spark

./bin/run-example SparkPi 10

./bin/run-example SparkPi 10 2>&1 | grep "Pi is roughly"
# Pi is roughly 3.1415431415431416
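As background, the SparkPi example estimates pi with Monte Carlo sampling: it scatters random points over the unit square and counts how many fall inside the unit circle, so pi ≈ 4 * inside / total. The following is a minimal standalone Scala sketch of that same idea, not part of the steps above; the object name EstimatePi and the sample count are illustrative choices, and it would need to be packaged and launched with spark-submit to actually run.

import org.apache.spark.sql.SparkSession

// Illustrative sketch of the Monte Carlo idea behind the bundled SparkPi example.
// "EstimatePi" is a made-up name; package it and launch it with spark-submit.
object EstimatePi {
  def main(args: Array[String]): Unit = {
    // Local SparkSession with two worker threads, matching --master local[2] used below.
    val spark = SparkSession.builder().appName("EstimatePi").master("local[2]").getOrCreate()
    val sc = spark.sparkContext

    val n = 1000000                                   // number of random samples (arbitrary)
    val inside = sc.parallelize(1 to n).map { _ =>
      val x = math.random * 2 - 1                     // random point in the square [-1, 1] x [-1, 1]
      val y = math.random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0                // 1 when the point lands inside the unit circle
    }.reduce(_ + _)

    // The circle/square area ratio is pi/4, so pi is roughly 4 * inside / n.
    println(s"Pi is roughly ${4.0 * inside / n}")
    spark.stop()
  }
}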

Spark-Shell

For now, only Scala and Python are supported (via spark-shell and pyspark, respectively). The --master local[2] option below runs Spark locally with two worker threads.

./bin/spark-shell --master local[2]

scala> val textFile = sc.textFile("file:///opt/services/spark/README.md")
textFile: org.apache.spark.rdd.RDD[String] = file:///opt/services/spark/README.md MapPartitionsRDD[1] at textFile at <console>:24

scala> textFile.count()
res0: Long = 104
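
As a further exercise, here is a small word-count sketch (not part of the official quick start) that can be pasted into the same spark-shell session; it reuses the textFile RDD defined above and prints the five most frequent words in README.md.

// Word count on the same README.md; "textFile" is the RDD defined above in this session.
val words = textFile.flatMap(line => line.split("\\s+")).filter(_.nonEmpty)   // split lines into non-empty tokens
val counts = words.map(word => (word, 1)).reduceByKey(_ + _)                  // sum one count per occurrence of each word
counts.sortBy { case (_, c) => -c }.take(5).foreach(println)                  // show the five most frequent words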

scala> :quit
