Spark installation:
1. Download Scala: get scala-2.11.8.tgz from the http://www.scala-lang.org/download/2.11.8.html page
Download Spark: get spark-2.0.0-bin-hadoop2.7.tgz from the http://spark.apache.org/downloads.html page
2. Install Scala
Unpack (note the -C flag and the space before the target directory):
# tar -zxvf scala-2.11.8.tgz -C /home/netlab
Configure the environment variables:
[netlab@master ~]$ vi .bash_profile
export SCALA_HOME=/home/netlab/scala-2.11.8
export PATH=$PATH:$SCALA_HOME/bin
[netlab@master ~]$ source .bash_profile
Test the installation: run scala; if the Scala REPL prompt appears, the installation succeeded.
3. Install Spark
Unpack into /usr/local/wl/spark: tar -zxvf spark-2.0.0-bin-hadoop2.7.tgz -C /usr/local/wl/spark
Configure the environment variables:
[netlab@master ~]$ vi .bash_profile
export SPARK_HOME=/usr/local/wl/spark/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
[netlab@master ~]$ source .bash_profile
[netlab@master ~]$ cd spark-2.0.0-bin-hadoop2.7
[netlab@master spark-2.0.0-bin-hadoop2.7]$ cd conf
[netlab@master conf]$ ls
[netlab@master conf]$ cp spark-env.sh.template spark-env.sh
[netlab@master conf]$ vi spark-env.sh
Add the Java, Scala, and Spark paths.
[netlab@master conf]$ source spark-env.sh
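A minimal spark-env.sh for this setup might look like the following sketch. The JDK path, master hostname, and worker memory are example values, not from the original notes, and must match your own machines:

```shell
# Example spark-env.sh entries -- adjust every path and value to your cluster.
export JAVA_HOME=/usr/java/jdk1.8.0_101                          # placeholder JDK path
export SCALA_HOME=/home/netlab/scala-2.11.8
export SPARK_HOME=/usr/local/wl/spark/spark-2.0.0-bin-hadoop2.7
export SPARK_MASTER_IP=master                                    # hostname of the master node
export SPARK_WORKER_MEMORY=1g                                    # example memory limit per worker
```

Spark's start scripts source this file themselves, so the `source spark-env.sh` step above is not strictly required.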
[netlab@master conf]$ cp slaves.template slaves
[netlab@master conf]$ vi slaves
Replace the hostname entries with the slave-node hostnames.
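The slaves file simply lists one worker hostname per line; slave1 and slave2 below are placeholders for your actual node names:

```text
slave1
slave2
```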
4. Copy the master's Spark configuration to the slave nodes
[netlab@master ~]$ scp -r ~/spark-2.0.0-bin-hadoop2.7 slave:~/
[netlab@master ~]$ cd spark-2.0.0-bin-hadoop2.7
[netlab@master spark-2.0.0-bin-hadoop2.7]$ cd sbin/
[netlab@master sbin]$ ls
[netlab@master sbin]$ ./start-all.sh    (use ./ -- sbin is not on PATH, and a bare start-all.sh may invoke Hadoop's script instead)
[netlab@master sbin]$ jps    (check that the Master and Worker processes started)
Open http://master:8080 in a browser to view the Spark master web UI.
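To confirm the cluster actually accepts jobs, you can submit the SparkPi example that ships with Spark; this is a sketch assuming the master hostname used above, Spark's default standalone port 7077, and the Spark 2.0.0 jar layout:

```shell
# Submit the bundled SparkPi example to the standalone cluster.
# spark://master:7077 assumes "master" resolves to the master node.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://master:7077 \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.0.0.jar 10
```

If the job runs, the driver output should include a line like "Pi is roughly 3.14", and the application will appear on the master:8080 page.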