Installing Flink 1.9 on Ambari HDP 2.6.5

To download the Flink service folder, run the following commands:

VERSION=`hdp-select status hadoop-client | sed 's/hadoop-client - \([0-9]\.[0-9]\).*/\1/'`
sudo git clone https://github.com/jzyhappy/ambari-flink-service.git  /var/lib/ambari-server/resources/stacks/HDP/$VERSION/services/FLINK 

Here the URL has been replaced with my own GitHub fork, which packages Flink 1.9.1; if that version suits you, no further changes are needed. The original repository is https://github.com/abajwa-hw/ambari-flink-service.git.
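
A quick sanity check, assuming the hdp-select line above resolved VERSION to 2.6 on HDP 2.6.5, is to confirm the service folder is in place before restarting Ambari:

echo $VERSION    # expected: 2.6
ls /var/lib/ambari-server/resources/stacks/HDP/$VERSION/services/FLINK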

Restart Ambari:

ambari-server restart

The service definition ships with Flink 1.8; change it to 1.9 as follows.

Edit configuration/flink-ambari-config.xml:

<property>
    <name>flink_download_url</name>
    <value>http://X.X.151.15/Package/flink-1.9.0-bin-scala_2.11.tgz</value>
    <description>Snapshot download location. Downloaded when setup_prebuilt is true</description>
</property>

The value http://X.X.151.15/Package/flink-1.9.0-bin-scala_2.11.tgz above is a local mirror of mine. I recommend using https://archive.apache.org/dist/ instead: browse it for the version you need and fill in the complete link. Two other mirrors reachable from within China are http://www.us.apache.org/dist/flink/ and https://archive.apache.org/dist/.
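
Before restarting Ambari, it can be worth confirming that the URL you filled in is actually reachable; the archive URL below is only an example value, substitute your own:

curl -I https://archive.apache.org/dist/flink/flink-1.9.0/flink-1.9.0-bin-scala_2.11.tgz   # expect HTTP 200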

Edit metainfo.xml:

<name>FLINK</name>
<displayName>Flink</displayName>
<comment>Apache Flink is a streaming dataflow engine that provides data distribution, communication, and fault tolerance for distributed computations over data streams.</comment>
<version>1.9.0</version>

FLINK on YARN

Flink runs on YARN out of the box when yarn.client.failover-proxy-provider is set to:

<property>
	<name>yarn.client.failover-proxy-provider</name>
	<value>org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider</value>
</property>

Flink fails to start on YARN when it is set to:

<property>
	<name>yarn.client.failover-proxy-provider</name>
	<value>org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider</value>
</property>

After changing this property, restart the YARN cluster.
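
To confirm which provider a node is actually using, you can grep the client configuration; the path below is the usual HDP location and may differ on your cluster:

grep -A1 'yarn.client.failover-proxy-provider' /etc/hadoop/conf/yarn-site.xml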

Configuration

Settings to change:
java_home only needs to match the JAVA_HOME exported in /etc/profile, and hadoop_conf_dir can simply use the configuration directory generated by HDP.
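
A minimal way to look up both values on an HDP node; the hadoop-client path is the typical HDP layout and is an assumption about your install:

echo $JAVA_HOME                             # should match what /etc/profile exports
ls -d /usr/hdp/current/hadoop-client/conf   # typical hadoop_conf_dir on HDP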

Startup failures

Permission issues

1. The flink user has no HDFS read/write permission

Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=flink, access=EXECUTE, inode="/user/flink/.flink/application_1579166472849_0003":root:hdfs:drwxrwx---
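
A more direct alternative to the workaround in the Solutions section below (which switches the Hadoop user instead) would be to hand the flink user ownership of its HDFS home directory; the group name here is an assumption:

sudo -u hdfs hdfs dfs -chown -R flink:hdfs /user/flink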

2. No local read/write permission

Execution of 'yarn application -list 2>/dev/null | awk '/flinkapp-from-ambari/ {print $1}' | head -n1 > /var/run/flink/flink.pid' returned 1. -bash: /var/run/flink/flink.pid: Permission denied
awk: (FILENAME=- FNR=4) warning: error writing standard output (Broken pipe)
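
Again, a more targeted alternative to running the command as root (the author's fix below) would be to make the pid directory writable by the Flink service user; the user name flink is an assumption about your service account:

sudo mkdir -p /var/run/flink
sudo chown -R flink: /var/run/flink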


Solutions

1. Use the hdfs user.
Add export HADOOP_USER_NAME=hdfs to /etc/profile (hdfs has the highest privileges), then run source /etc/profile so the change takes effect immediately.

Alternatively, append the line with sed; remember to source /etc/profile afterwards as well:

sed -i '$a export HADOOP_USER_NAME=hdfs' /etc/profile

2. Edit flink.py so the status check runs as user 'root'.
The script lives in:
/var/lib/ambari-server/resources/stacks/HDP/2.6/services/FLINK/package/scripts

Execute("yarn application -list 2>/dev/null | awk '/" + params.flink_appname + "/ {print $1}' | head -n1 > " + status_params.flink_pid_file, user='root')

Test Flink

./flink run --jobmanager yarn-cluster -yn 1 -ytm 768 -yjm 768 /opt/flink/examples/batch/WordCount.jar --input  hdfs://test01:8020/jzytest/1.csv  --output hdfs://test01:8020/jzytest/test.txt

Check the generated output file.
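
For example, with the paths used in the run above:

hdfs dfs -cat hdfs://test01:8020/jzytest/test.txt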

Problem

Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties

Solution

In yarn-site.xml, set yarn.timeline-service.enabled to false.
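
To verify the change on a node (standard HDP client config path assumed):

grep -A1 'yarn.timeline-service.enabled' /etc/hadoop/conf/yarn-site.xml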

References:
1. https://blog.csdn.net/high2011/article/details/90272331
2. https://www.jianshu.com/p/09c08b156885
3. https://blog.csdn.net/jzy3711/article/details/85003606
4. https://github.com/abajwa-hw/ambari-flink-service
