Installing Flink 1.9 on Ambari HDP 2.6.5

To download the Flink service folder, run the following commands:

VERSION=`hdp-select status hadoop-client | sed 's/hadoop-client - \([0-9]\.[0-9]\).*/\1/'`
sudo git clone https://github.com/jzyhappy/ambari-flink-service.git /var/lib/ambari-server/resources/stacks/HDP/$VERSION/services/FLINK

The clone above points at my fork, which packages Flink 1.9.1; if that version suits you, nothing else needs to change. The original repository is https://github.com/abajwa-hw/ambari-flink-service.git

Restart Ambari:

ambari-server restart

The service definition targets Flink 1.8; change it to 1.9.

Edit configuration/flink-ambari-config.xml:

<property>
  <name>flink_download_url</name>
  <value>http://X.X.151.15/Package/flink-1.9.0-bin-scala_2.11.tgz</value>
  <description>Snapshot download location. Downloaded when setup_prebuilt is true</description>
</property>

For http://X.X.151.15/Package/flink-1.9.0-bin-scala_2.11.tgz I used a local mirror. I recommend browsing https://archive.apache.org/dist/ to find the version you want and fill in the complete link. Another mirror reachable from China is http://www.us.apache.org/dist/flink/
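The archive layout is predictable, so the full download link can be assembled from the release and Scala build you chose. A minimal sketch (the version numbers below are examples; swap in your own):

```shell
# archive.apache.org keeps every release under dist/flink/flink-<version>/,
# with binaries named flink-<version>-bin-scala_<scala>.tgz
FLINK_VERSION=1.9.0
SCALA_VERSION=2.11
URL="https://archive.apache.org/dist/flink/flink-${FLINK_VERSION}/flink-${FLINK_VERSION}-bin-scala_${SCALA_VERSION}.tgz"
echo "$URL"
```

Paste the printed URL into the flink_download_url property above.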

Edit metainfo.xml:

<name>FLINK</name>
<displayName>Flink</displayName>
<comment>Apache Flink is a streaming dataflow engine that provides data distribution, communication, and fault tolerance for distributed computations over data streams.</comment>
<version>1.9.0</version>
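If you prefer to script the edit, a sed substitution works. Here it is demonstrated on a scratch copy so the pattern is easy to verify; point the same sed at metainfo.xml in the cloned service folder to do it for real:

```shell
# Demonstrate the version bump on a throwaway file; the real target is
# .../services/FLINK/metainfo.xml from the git clone step above.
tmp=$(mktemp)
printf '<version>1.8.0</version>\n' > "$tmp"
sed -i 's|<version>1.8.0</version>|<version>1.9.0</version>|' "$tmp"
grep '<version>' "$tmp"
rm -f "$tmp"
```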

FLINK on YARN

With the following setting in yarn-site.xml, Flink runs on YARN directly:

<property>
	<name>yarn.client.failover-proxy-provider</name>
	<value>org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider</value>
</property>

With this setting, Flink fails to start on YARN:

<property>
	<name>yarn.client.failover-proxy-provider</name>
	<value>org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider</value>
</property>

After changing it, restart the YARN cluster.

Configuration

Settings to modify:
java_home should match the value in /etc/profile; hadoop_conf_dir should be the HDP-generated config directory.
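For reference, on a stock HDP 2.6.5 node these two values typically look like the following (the JDK path is an assumption for illustration; check your own /etc/profile):

```
java_home: /usr/jdk64/jdk1.8.0_112   # assumed JDK path; must match /etc/profile
hadoop_conf_dir: /etc/hadoop/conf    # HDP-generated config dir
```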

Startup failures

Permission problems

1. No read/write permission on HDFS

Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=flink, access=EXECUTE, inode="/user/flink/.flink/application_1579166472849_0003":root:hdfs:drwxrwx---

2. No local read/write permission

Execution of 'yarn application -list 2>/dev/null | awk '/flinkapp-from-ambari/ {print $1}' | head -n1 > /var/run/flink/flink.pid' returned 1. -bash: /var/run/flink/flink.pid: Permission denied
awk: (FILENAME=- FNR=4) warning: error writing standard output (Broken pipe)


Solution

1. Use the hdfs user.
Add export HADOOP_USER_NAME=hdfs to /etc/profile (hdfs has the highest privileges), then run source /etc/profile so the change takes effect immediately.

You can also append the line with sed (again, remember to source afterwards):

sed -i '$a export HADOOP_USER_NAME=hdfs' /etc/profile
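The '$a' address in that sed appends after the last line of the file. Here is the same form exercised on a throwaway file, so you can see exactly what lands in /etc/profile:

```shell
# '$a text' in GNU sed appends "text" as a new last line of the file.
tmp=$(mktemp)
printf 'export PATH=$PATH\n' > "$tmp"
sed -i '$a export HADOOP_USER_NAME=hdfs' "$tmp"
tail -n1 "$tmp"
rm -f "$tmp"
```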

2. Edit flink.py and set user = 'root'. The script lives in:
/var/lib/ambari-server/resources/stacks/HDP/2.6/services/FLINK/package/scripts

Execute("yarn application -list 2>/dev/null | awk '/" + params.flink_appname + "/ {print $1}' | head -n1 > " + status_params.flink_pid_file, user='root')

Testing Flink

./flink run --jobmanager yarn-cluster -yn 1 -ytm 768 -yjm 768 /opt/flink/examples/batch/WordCount.jar --input  hdfs://test01:8020/jzytest/1.csv  --output hdfs://test01:8020/jzytest/test.txt

Check the generated output file, e.g. with hdfs dfs -cat hdfs://test01:8020/jzytest/test.txt

Problem

Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties

Solution

In yarn-site.xml, set yarn.timeline-service.enabled to false.
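After the change, the property in yarn-site.xml reads (a minimal sketch of just this one property):

```xml
<property>
  <name>yarn.timeline-service.enabled</name>
  <value>false</value>
</property>
```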

References:
1. https://blog.csdn.net/high2011/article/details/90272331
2. https://www.jianshu.com/p/09c08b156885
3. https://blog.csdn.net/jzy3711/article/details/85003606
4. https://github.com/abajwa-hw/ambari-flink-service
