# Spark HA Installation

### Download Spark
```
# closer.lua returns a mirror-picker page, not the tarball; fetch from the archive instead
wget https://archive.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
```
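Optionally verify the download; Apache publishes a SHA-512 digest next to each release on the archive. A quick sketch that prints both digests for manual comparison (the published file's formatting varies, so `sha512sum -c` may not accept it directly):
```
wget https://archive.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz.sha512
# print the local digest and the published one, then compare by eye
sha512sum spark-2.4.3-bin-hadoop2.7.tgz
cat spark-2.4.3-bin-hadoop2.7.tgz.sha512
```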
### Extract the archive
```
tar zxvf spark-2.4.3-bin-hadoop2.7.tgz -C /opt/module/
```
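If the unpack was done as root, hand the tree to the cluster user before continuing (a sketch; `hadoop` is the user shown in the shell prompts later in this guide):
```
sudo chown -R hadoop:hadoop /opt/module/spark-2.4.3-bin-hadoop2.7
```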
### Copy the template config files
```
cd /opt/module/spark-2.4.3-bin-hadoop2.7/conf
cp spark-env.sh.template spark-env.sh
cp spark-defaults.conf.template spark-defaults.conf
cp slaves.template slaves
```
### Edit spark-env.sh
```
# file contents shown via grep (comments and blank lines stripped)
grep -vE '^#|^$' spark-env.sh
export SPARK_DIST_CLASSPATH=$(/opt/module/hadoop-3.2.0/bin/hadoop classpath)
JAVA_HOME=/opt/module/jdk1.8.0_211
# note: the prebuilt Spark 2.4.3 package bundles Scala 2.11; SCALA_HOME here is not used by the binaries
SCALA_HOME=/opt/module/scala-2.13.0
HADOOP_CONF_DIR=/opt/module/hadoop-3.2.0/etc/hadoop
HADOOP_HOME=/opt/module/hadoop-3.2.0
# ZooKeeper-based HA: the masters register under /spark and elect a leader
SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=hadoop11:2181,hadoop12:2181,hadoop13:2181 -Dspark.deploy.zookeeper.dir=/spark"
```
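Master recovery now depends on the ZooKeeper quorum, so it is worth confirming it is reachable before starting anything (a sketch; assumes `zkCli.sh` from the ZooKeeper installation is on the PATH):
```
zkCli.sh -server hadoop11:2181 ls /
```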
### Configure spark-defaults.conf
```
grep -vE '^#|^$' spark-defaults.conf
spark.master                     spark://hadoop11:7077,hadoop12:7077
```
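Because both masters appear in `spark.master`, clients try each address in turn and keep working when the active master changes. The same list can be passed explicitly on submission (a sketch; the examples jar name follows the Spark 2.4.3 prebuilt layout):
```
./bin/spark-submit \
  --master spark://hadoop11:7077,hadoop12:7077 \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_2.11-2.4.3.jar 100
```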
### Edit slaves
```
grep -vE '^#|^$' slaves
hadoop13
hadoop14
```
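`start-all.sh` launches a Worker over SSH on every host listed in `slaves`, so passwordless SSH from this node must already be in place. A quick check:
```
for h in hadoop13 hadoop14; do ssh "$h" hostname; done
```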
### Add environment variables
```
## SPARK_HOME (append to /etc/profile or ~/.bashrc on every node, then source it)
export SPARK_HOME=/opt/module/spark-2.4.3-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
```
### Sync to the other nodes
```
xsync.sh spark-2.4.3-bin-hadoop2.7/
```
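`xsync.sh` is a site-local distribution script. Plain `rsync` does the same job if you don't have it (a sketch, assuming the node names used throughout this guide):
```
for h in hadoop12 hadoop13 hadoop14; do
  rsync -av /opt/module/spark-2.4.3-bin-hadoop2.7/ "$h":/opt/module/spark-2.4.3-bin-hadoop2.7/
done
```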
### Start Spark
```
# run from $SPARK_HOME on hadoop11; starts the local Master plus the Workers listed in slaves
./sbin/start-all.sh
```
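Note that `start-all.sh` starts a Master only on the node it runs on; the `jps` output below shows a Master on hadoop12 as well, so the standby is presumably started separately, for example:
```
ssh hadoop12 /opt/module/spark-2.4.3-bin-hadoop2.7/sbin/start-master.sh
```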
### Check the processes
```
xcall.sh jps
================== hadoop11 jps==================
3056 QuorumPeerMain
3745 DFSZKFailoverController
3539 JournalNode
10691 Master
10875 Jps
3294 NameNode
3918 NodeManager
6990 HMaster
================== hadoop12 jps==================
7796 Master
2629 NameNode
2935 DFSZKFailoverController
3111 NodeManager
2793 JournalNode
4073 HMaster
2492 QuorumPeerMain
7884 Jps
2703 DataNode
================== hadoop13 jps==================
2693 JournalNode
6631 Worker
2457 QuorumPeerMain
3066 ResourceManager
6747 Jps
2588 DataNode
3997 HRegionServer
3199 NodeManager
================== hadoop14 jps==================
2755 ResourceManager
3767 HRegionServer
6007 Jps
5896 Worker
2603 DataNode
2846 NodeManager
```
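The masters should also have registered under the ZooKeeper directory configured in `spark-env.sh` (same `zkCli.sh` assumption as above):
```
zkCli.sh -server hadoop11:2181 ls /spark
```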
### Web UI verification
![Spark Master web UI](../img/spark/01_01.png)
### Failover test
```
# stop the active Master on hadoop11; the standby on hadoop12 should take over
[hadoop@hadoop11 spark-2.4.3-bin-hadoop2.7]$ ./sbin/stop-master.sh
stopping org.apache.spark.deploy.master.Master
```
![Standby Master promoted to ALIVE](../img/spark/01_02.png)
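The role of each master can also be checked from the shell: the master web UI (port 8080 by default) reports its status as ALIVE or STANDBY. A rough sketch that just greps the page:
```
curl -s http://hadoop12:8080 | grep -iE 'alive|standby'
```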
### Verify with a test job
```
[hadoop@hadoop11 spark-2.4.3-bin-hadoop2.7]$ bin/run-example SparkPi 2>&1 | grep "Pi is"
Pi is roughly 3.1379156895784477

[hadoop@hadoop11 spark-2.4.3-bin-hadoop2.7]$ ./bin/spark-shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/module/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/module/hadoop-3.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2019-06-13 17:09:04,701 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://hadoop11:4040
Spark context available as 'sc' (master = spark://hadoop11:7077,hadoop12:7077, app id = app-20190613170930-0001).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_211)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 3+5
res0: Int = 8

scala> :quit
```
### Manually start the Master
```
# bring a stopped Master back online; it rejoins the cluster as a standby
[hadoop@hadoop12 spark-2.4.3-bin-hadoop2.7]$ ./sbin/start-master.sh
```
 
