Setting Up a Spark Big Data Platform on macOS (Including Scala)
localhost:~ didi$ java -version
java version "1.8.0_102"
Java(TM) SE Runtime Environment (build 1.8.0_102-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.102-b14, mixed mode)
Add Scala to the PATH (e.g. in ~/.bash_profile):
export PATH="$PATH:/usr/local/Cellar/scala-2.11.8/bin"
localhost:~ didi$ scala
Welcome to Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_102).
Type in expressions for evaluation. Or try :help.
scala>
OK! Scala is installed successfully.
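As the REPL banner says, you can type expressions for evaluation. A few arbitrary lines to paste at the scala> prompt to confirm the toolchain works:

```scala
// Paste into the scala> prompt; each line is evaluated immediately
val xs = List(1, 2, 3, 4)          // a simple immutable list
val doubled = xs.map(_ * 2)        // each element multiplied by 2
val total = doubled.sum            // add them up
println(s"total = $total")         // prints: total = 20
```

If the REPL echoes the intermediate values after each line, the Scala install is working.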
Add Spark to the PATH as well:
export PATH="$PATH:/usr/local/Cellar/spark-2.0.1-bin-hadoop2.7/bin"
3.4 Modify Spark's configuration. In the conf directory, copy the template:
cp spark-env.sh.template spark-env.sh
Then add the following settings to spark-env.sh:
export SCALA_HOME=/usr/local/Cellar/scala-2.11.8
export SPARK_MASTER_IP=localhost
export SPARK_WORKER_MEMORY=4g
3.5 Start Spark: run ./start-all.sh from Spark's sbin directory.
3.6 Test with the Spark shell
localhost:bin didi$ spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/01 21:09:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/01 21:09:47 WARN Utils: Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 10.97.182.157 instead (on interface en0)
16/11/01 21:09:47 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/11/01 21:09:48 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://10.97.182.157:4040
Spark context available as 'sc' (master = local[*], app id = local-1478005788625).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.0.1
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_102)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
OK! The Spark environment is set up successfully!
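As a final smoke test, note that the RDD API you use in spark-shell mirrors Scala's collection API (filter, map, sum, and so on). The pipeline below is a plain-Scala sketch with arbitrary numbers; in spark-shell you would wrap the range with sc.parallelize(...) to run the same steps distributed across workers:

```scala
// Plain-Scala sketch of a pipeline you can later run on an RDD;
// in spark-shell, replace (1 to 100) with sc.parallelize(1 to 100)
val evens = (1 to 100).filter(_ % 2 == 0)  // keep even numbers only
val squares = evens.map(n => n * n)        // square each one
val result = squares.sum                   // reduce to a single value
println(s"sum of squares of evens up to 100 = $result")
```

Pasting the RDD version into the scala> prompt of spark-shell and getting the same number back confirms the cluster is executing jobs end to end.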