Installing a custom Spark on YARN on top of HDP-installed YARN

I. Overview

Spark on YARN installed through HDP+Ambari generally works fine, but the drawback is that the Spark version is fixed by the HDP package, which is extremely inflexible. The goal here is to keep the YARN installed by HDP+Ambari, deploy Spark ourselves, and make sure the self-installed Spark runs on the Ambari-deployed YARN.

II. Deployment Steps

1.

Download the installation package and extract it, for example spark-2.2.1-bin-hadoop2.7.tgz.

2.

Go to the /usr/hdp/2.5.3.0-37/hadoop-yarn/lib directory (the HDP install location) and copy jersey-client-1.9.jar and jersey-core-1.9.jar into /opt/spark-2.2.1-bin-hadoop2.7/jars. That directory already contains jersey-client-2.22.2.jar; just rename it so the copied 1.9 jars take its place, as shown in the sketch below.
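
A minimal sketch of the jar swap, assuming the paths above (adjust the HDP version directory to match your cluster):

cp /usr/hdp/2.5.3.0-37/hadoop-yarn/lib/jersey-client-1.9.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
cp /usr/hdp/2.5.3.0-37/hadoop-yarn/lib/jersey-core-1.9.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
# rename the newer jar out of the way (the .bak suffix is only a convention)
mv /opt/spark-2.2.1-bin-hadoop2.7/jars/jersey-client-2.22.2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/jersey-client-2.22.2.jar.bak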

3.

Add the SPARK_HOME environment variable, for example as sketched below.
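
A sketch assuming Spark was extracted to /opt/spark-2.2.1-bin-hadoop2.7 and a system-wide profile is acceptable:

echo 'export SPARK_HOME=/opt/spark-2.2.1-bin-hadoop2.7' >> /etc/profile
echo 'export PATH=$SPARK_HOME/bin:$PATH' >> /etc/profile
source /etc/profile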

4.

Modify the configuration files.

①. cp spark-defaults.conf.template spark-defaults.conf

Add the following content:
spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.eventLog.dir hdfs:///spark2-history/
spark.eventLog.enabled true
spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.history.fs.logDirectory hdfs:///spark2-history/
spark.history.kerberos.keytab none
spark.history.kerberos.principal none
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.ui.port 18081
spark.yarn.historyServer.address aaa:18081
spark.yarn.queue default
The default queue is used here; set it according to the actual YARN queue layout.
The hostname above (aaa) must be replaced according to your environment. The history settings also assume a running Spark history server; see the sketch after ② below.

②. cp spark-env.sh.template spark-env.sh

Add the following content:
export JAVA_HOME=/opt/jdk1.8.0_111
export SCALA_HOME=/opt/scala-2.11.7
export HADOOP_HOME=/usr/hdp/current/hadoop-client
export HADOOP_CONF_DIR=/usr/hdp/current/hadoop-client/conf
export SPARK_HOME=/opt/spark-2.2.1-bin-hadoop2.7
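
The history-related settings in spark-defaults.conf take effect only while the Spark history server is running; a sketch using the start script bundled with the distribution:

cd /opt/spark-2.2.1-bin-hadoop2.7
./sbin/start-history-server.sh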

5. Make the change in the Ambari UI

[Screenshot: configuration change in the Ambari UI]
The version number is taken from the output of hdp-select | grep hadoop-hdfs-datanode.
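
A common way to supply this version so that ${hdp.version} resolves inside YARN containers is to pin it in spark-defaults.conf (an assumption about what the screenshot showed; substitute your own version string):

spark.driver.extraJavaOptions -Dhdp.version=2.5.3.0-37
spark.yarn.am.extraJavaOptions -Dhdp.version=2.5.3.0-37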

6.

Run hadoop fs -mkdir /spark2-history (the directory must be created if history logging is configured in the conf files).
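
The directory also has to be writable by the users submitting applications; one permissive option (an assumption, tighten it to match your security policy):

hadoop fs -chmod 777 /spark2-history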

7.

In the YARN configuration options in Ambari, add either of the following two settings:
yarn.nodemanager.vmem-check-enabled=false
yarn.nodemanager.pmem-check-enabled=false

[Screenshot: YARN configuration in the Ambari UI]
If the corresponding settings already exist, no change is needed.
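
At this point the installation can be smoke-tested by submitting the bundled SparkPi example to YARN (a sketch, assuming the examples jar shipped with the 2.2.1 distribution):

cd /opt/spark-2.2.1-bin-hadoop2.7
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster examples/jars/spark-examples_2.11-2.2.1.jar 100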

III. Common Problems

1.
Stack trace: ExitCodeException exitCode=1:
/hadoop/yarn/local/usercache/root/appcache/application_1522807066160_0019/container_e26_1522807066160_0019_02_000001/launch_container.sh:
line 21:
$PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.3.0-37/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
bad substitution

Solution:
[Screenshot: configuration change in the Ambari UI]
Apply the same change as in step 5 of section II; the version number is taken from the output of hdp-select | grep hadoop-hdfs-datanode.
2.
java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
Solution:
Go to the /usr/hdp/2.5.3.0-37/hadoop-yarn/lib directory and copy jersey-client-1.9.jar and jersey-core-1.9.jar into /opt/spark-2.2.1-bin-hadoop2.7/jars. Since /opt/spark-2.2.1-bin-hadoop2.7/jars already contains jersey-client-2.22.2.jar, just rename it so that the 1.9 jars replace it (the same fix as step 2 of section II).
