Before starting the setup, install tree: the tree command shows a directory's layout at a glance, with no need to cd into every directory and run ls.
Install the tree command:
[root@localhost software]# yum -y install tree
Note: a software directory was created under / to hold the installation packages.
Check that the installation succeeded:
[root@localhost software]# yum list installed tree
Loaded plugins: fastestmirror, langpacks
Loading mirror speeds from cached hostfile
* base: mirror.jdcloud.com
* extras: mirrors.163.com
* updates: mirrors.huaweicloud.com
Installed Packages
tree.x86_64 1.6.0-10.el7 @base
[root@localhost software]# tree -L 2 -C -p
.
├── [drwxr-xr-x] java
│ ├── [drwxr-xr-x] jdk1.8.0_202
│ └── [-rw-r--r--] jdk-8u202-linux-x64.tar.gz
├── [drwxr-xr-x] scala
│ ├── [drwxrwxr-x] scala-2.12.8
│ └── [-rw-r--r--] scala-2.12.8.tgz
└── [drwxr-xr-x] spark
├── [drwxr-xr-x] spark-2.4.0-bin-hadoop2.7
└── [-rw-r--r--] spark-2.4.0-bin-hadoop2.7.tgz
6 directories, 3 files
Unpack the java, scala, and spark archives with tar -zxvf.
Running java -version at this point shows the distribution's bundled OpenJDK, so remove the preinstalled JDK first:
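Assuming the archives sit under /software in the per-component directories shown by tree above, the extraction commands look like this:

```shell
# Unpack each archive next to itself (-C sets the output directory)
tar -zxvf /software/java/jdk-8u202-linux-x64.tar.gz     -C /software/java
tar -zxvf /software/scala/scala-2.12.8.tgz              -C /software/scala
tar -zxvf /software/spark/spark-2.4.0-bin-hadoop2.7.tgz -C /software/spark
```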
# rpm -qa | grep java
# rpm -qa | grep jdk
# rpm -e --nodeps java-1.8.0-openjdk-headless-1.8.0.65-3.b17.el7.x86_64
# rpm -e --nodeps java-1.8.0-openjdk-1.8.0.65-3.b17.el7.x86_64
# rpm -e --nodeps java-1.7.0-openjdk-1.7.0.91-2.6.2.3.el7.x86_64
# rpm -e --nodeps java-1.7.0-openjdk-headless-1.7.0.91-2.6.2.3.el7.x86_64
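The four rpm -e calls above can also be collapsed into one pipeline. A sketch, assuming every package matching "openjdk" should go (review the rpm -qa output first):

```shell
# Erase every installed OpenJDK package in one pass;
# xargs -r skips the rpm call entirely when grep finds nothing
rpm -qa | grep -i openjdk | xargs -r rpm -e --nodeps
```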
Run java -version again:
[root@localhost software]# java -version
java version "1.8.0_202"
Java(TM) SE Runtime Environment (build 1.8.0_202-b08)
Java HotSpot(TM) 64-Bit Server VM (build 25.202-b08, mixed mode)
Configure the environment variables:
[root@localhost spark-2.4.0-bin-hadoop2.7]# vim /etc/profile
Make the configuration take effect:
[root@localhost spark-2.4.0-bin-hadoop2.7]# source /etc/profile
The finished environment variables:
# java
export JAVA_HOME=/software/java/jdk1.8.0_202
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
# scala
export SCALA_HOME=/software/scala/scala-2.12.8
export PATH=$PATH:$SCALA_HOME/bin
# spark
export SPARK_HOME=/software/spark/spark-2.4.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
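After sourcing /etc/profile, a quick sanity check that all three tools resolve from the new PATH (version numbers per the installs above):

```shell
source /etc/profile
java -version      # should report 1.8.0_202
scala -version     # should report 2.12.8
echo $SPARK_HOME   # should print /software/spark/spark-2.4.0-bin-hadoop2.7
```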
Switch to the conf directory under the Spark installation directory and create working copies of the templates:
# cp spark-env.sh.template spark-env.sh
# cp slaves.template slaves
Details:
[root@localhost conf]# ll
total 52
-rw-r--r--. 1 k8s-master k8s-master 996 Oct 29 14:36 docker.properties.template
-rw-r--r--. 1 k8s-master k8s-master 1105 Oct 29 14:36 fairscheduler.xml.template
-rw-r--r--. 1 root root 2025 Mar 9 14:50 log4j.properties
-rw-r--r--. 1 k8s-master k8s-master 2025 Oct 29 14:36 log4j.properties.template
-rw-r--r--. 1 k8s-master k8s-master 7801 Oct 29 14:36 metrics.properties.template
-rw-r--r--. 1 root root 872 Mar 9 15:02 slaves
-rw-r--r--. 1 k8s-master k8s-master 865 Oct 29 14:36 slaves.template
-rw-r--r--. 1 k8s-master k8s-master 1292 Oct 29 14:36 spark-defaults.conf.template
-rwxr-xr-x. 1 root root 4504 Mar 9 14:49 spark-env.sh
-rwxr-xr-x. 1 k8s-master k8s-master 4221 Oct 29 14:36 spark-env.sh.template
Edit spark-env.sh with vim and append the following at the end. (In Spark 2.x, SPARK_MASTER_IP is deprecated in favor of SPARK_MASTER_HOST; it still works but the start scripts will warn about it.)
export JAVA_HOME=/software/java/jdk1.8.0_202
export SCALA_HOME=/software/scala/scala-2.12.8
export SPARK_HOME=/software/spark/spark-2.4.0-bin-hadoop2.7
export SPARK_MASTER_IP=192.168.247.133
export SPARK_WORKER_MEMORY=2g
export SPARK_WORKER_CORES=2
export SPARK_WORKER_INSTANCES=1
Lower the log level in conf/log4j.properties:
log4j.rootCategory=WARN, console
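The same edit can be made non-interactively. A sketch, assuming log4j.properties was copied from the template and still reads INFO:

```shell
# In-place edit: lower the root logger from INFO to WARN
sed -i 's/^log4j.rootCategory=INFO, console/log4j.rootCategory=WARN, console/' \
    /software/spark/spark-2.4.0-bin-hadoop2.7/conf/log4j.properties
```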
Start the cluster:
[root@localhost spark-2.4.0-bin-hadoop2.7]# ./sbin/start-all.sh
Verify the daemons are running:
[root@localhost conf]# jps
6914 Worker
6829 Master
10846 Jps
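Before touching the firewall, it is worth confirming the daemons are actually listening. By default the master web UI binds port 8080 and the master RPC port is 7077; a quick check from the VM itself:

```shell
# Show the Spark daemons' listening sockets (master RPC 7077, web UI 8080)
ss -tlnp | grep -E ':7077|:8080'
# The UI should answer locally even if the firewall blocks outside access
curl -s -o /dev/null -w '%{http_code}\n' http://127.0.0.1:8080
```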
If the host machine cannot reach Spark's web UI port, shut down the VM's firewall:
[root@localhost spark-2.4.0-bin-hadoop2.7]# systemctl stop firewalld.service
[root@localhost spark-2.4.0-bin-hadoop2.7]# systemctl disable firewalld.service
[root@localhost spark-2.4.0-bin-hadoop2.7]# systemctl status firewalld.service
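Disabling firewalld entirely is the blunt fix. A narrower alternative, if you prefer to keep the firewall running, is to open only the Spark ports (assuming the default 8080/7077 ports from the setup above):

```shell
firewall-cmd --permanent --add-port=8080/tcp   # master web UI
firewall-cmd --permanent --add-port=7077/tcp   # master RPC port
firewall-cmd --reload                          # apply without restarting firewalld
```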
● firewalld.service - firewalld - dynamic firewall daemon
Loaded: loaded (/usr/lib/systemd/system/firewalld.service; disabled; vendor preset: enabled)
Active: inactive (dead)
Mar 09 13:09:16 localhost.localdomain systemd[1]: Starting firewalld - dynamic firewall daemon...
Mar 09 13:09:23 localhost.localdomain systemd[1]: Started firewalld - dynamic firewall daemon.
Mar 09 15:34:22 localhost.localdomain systemd[1]: Stopping firewalld - dynamic firewall daemon...
Mar 09 15:34:27 localhost.localdomain systemd[1]: Stopped firewalld - dynamic firewall daemon.
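With the Master and Worker up and the ports reachable, the SparkPi example bundled with the distribution verifies the cluster end-to-end. A sketch, building the master URL from the SPARK_MASTER_IP configured above (the examples jar for Spark 2.4.0 ships under examples/jars, built against Scala 2.11):

```shell
# Submit the bundled SparkPi job to the standalone master;
# it should print a line like "Pi is roughly 3.14..."
$SPARK_HOME/bin/spark-submit \
  --master spark://192.168.247.133:7077 \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.4.0.jar 100
```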