Errors when installing and starting Kylin 3.0.0 on a Hadoop 3.1.1 cluster

Kylin version: apache-kylin-3.0.0-beta-bin-hadoop3

HDP version: 3.1.0.0

1. Permission denied: user=root, access=WRITE, inode="/kylin":hdfs:hdfs

Solution: create the directory as the hdfs superuser and open up its permissions:

su - hdfs

hdfs dfs -mkdir /kylin

hdfs dfs -chmod a+rwx /kylin

2. Something wrong with Hive CLI or Beeline, please execute Hive CLI or Beeline CLI in terminal to find the root cause.

My workaround:

vi bin/find-hive-dependency.sh (line 37)

Remove the ${hive_conf_properties} variable (it is not configured here) from the line hive_env=`hive ${hive_conf_properties} -e set 2>&1 | grep 'env:CLASSPATH'`, i.e. change it to

hive_env=`hive -e set 2>&1 | grep 'env:CLASSPATH'`

Then start Kylin again:

bin/kylin.sh start
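The same one-line edit can also be scripted with sed instead of vi. The snippet below is only a sketch: it demonstrates the substitution on a throwaway copy of the script under /tmp (a stand-in path, not the real Kylin install), so it can be run anywhere safely.

```shell
# Demo: reproduce the line-37 edit non-interactively with sed.
# /tmp/kylin-demo is a stand-in for ${KYLIN_HOME}.
mkdir -p /tmp/kylin-demo/bin
cat > /tmp/kylin-demo/bin/find-hive-dependency.sh <<'EOF'
hive_env=`hive ${hive_conf_properties} -e set 2>&1 | grep 'env:CLASSPATH'`
EOF
# Drop the unset ${hive_conf_properties} variable from the hive invocation:
sed -i 's/hive ${hive_conf_properties} -e set/hive -e set/' /tmp/kylin-demo/bin/find-hive-dependency.sh
cat /tmp/kylin-demo/bin/find-hive-dependency.sh
```

On a real install, point sed at ${KYLIN_HOME}/bin/find-hive-dependency.sh (after backing it up) instead of the demo copy.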

3. spark not found, set SPARK_HOME, or run bin/download-spark.sh

vi /etc/profile

export SPARK_HOME=/usr/hdp/current/spark2-client
export HIVE_CONF=/etc/hive/conf
export HCAT_HOME=/usr/hdp/current/hive-webhcat

source /etc/profile
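Before starting Kylin it is worth checking that each exported variable actually points at an existing directory. The sketch below demonstrates the check with demo directories under /tmp standing in for the real HDP paths above:

```shell
# Hedged sketch: verify each exported path exists before starting Kylin.
# The /tmp/hdp-demo directories are stand-ins for the real HDP paths.
mkdir -p /tmp/hdp-demo/spark2-client /tmp/hdp-demo/hive-conf
export SPARK_HOME=/tmp/hdp-demo/spark2-client   # real value: /usr/hdp/current/spark2-client
export HIVE_CONF=/tmp/hdp-demo/hive-conf        # real value: /etc/hive/conf
for v in SPARK_HOME HIVE_CONF; do
  eval "dir=\$$v"
  if [ -d "$dir" ]; then echo "$v ok"; else echo "$v MISSING: $dir"; fi
done
```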

4. ${KYLIN_HOME}/tomcat/conf/.keystore (No such file or directory)

Solution: Kylin's bundled Tomcat ships an HTTPS connector in server.xml; if you are not using HTTPS, comment that section out:

<Connector port="7443" protocol="org.apache.coyote.http11.Http11Protocol"
                   maxThreads="150" SSLEnabled="true" scheme="https" secure="true"
                   keystoreFile="conf/.keystore" keystorePass="changeit"
                   clientAuth="false" sslProtocol="TLS" />

5. Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException

Solution: copy commons-configuration-*.jar into Kylin's tomcat/lib directory:

cp /usr/hdp/share/hst/hst-common/lib/commons-configuration-1.10.jar tomcat/lib/
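If the jar lives somewhere else on your HDP install, locate it first with find. A hedged sketch, demonstrated against a stand-in directory under /tmp (on a real cluster, search under /usr/hdp instead):

```shell
# Create a stand-in jar so the search runs anywhere, then locate it.
mkdir -p /tmp/hdp-demo/hst-lib
touch /tmp/hdp-demo/hst-lib/commons-configuration-1.10.jar
find /tmp/hdp-demo -name 'commons-configuration-*.jar'
```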

spark2/jars/derbyLocale_cs.jar (No such file or directory)

spark2/jars/derbyLocale_de_DE.jar (No such file or directory)

spark2/jars/derbyLocale_es.jar (No such file or directory)

spark2/jars/derbyLocale_fr.jar (No such file or directory)

spark2/jars/derbyLocale_hu.jar (No such file or directory)

spark2/jars/derbyLocale_it.jar (No such file or directory)

spark2/jars/derbyLocale_ja_JP.jar (No such file or directory)

spark2/jars/derbyLocale_ko_KR.jar (No such file or directory)

spark2/jars/derbyLocale_pl.jar (No such file or directory)

spark2/jars/derbyLocale_pt_BR.jar (No such file or directory)

spark2/jars/derbyLocale_ru.jar (No such file or directory)

spark2/jars/derbyLocale_zh_CN.jar (No such file or directory)

spark2/jars/derbyLocale_zh_TW.jar (No such file or directory)

I simply ignored these warnings (alternatively, you can download the corresponding jars from the Maven Central repository and put them in the directory).

For example, download and place it as: /usr/hdp/3.1.0.0-78/spark2/jars/derbyLocale_cs-10.15.1.3.jar

cd /usr/hdp/3.1.0.0-78/spark2/jars/

ln -s derbyLocale_cs-10.15.1.3.jar derbyLocale_cs.jar
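Creating the symlinks one by one gets tedious for a dozen locale jars; a loop over the versioned names covers them all. A hedged sketch, demonstrated in a throwaway directory (on the cluster, run the loop inside /usr/hdp/3.1.0.0-78/spark2/jars/ after downloading the jars; the version suffix 10.15.1.3 is taken from the example above):

```shell
# Demo directory standing in for /usr/hdp/3.1.0.0-78/spark2/jars/.
mkdir -p /tmp/spark2-jars-demo && cd /tmp/spark2-jars-demo
# Stand-ins for a few of the downloaded versioned jars:
touch derbyLocale_cs-10.15.1.3.jar derbyLocale_ja_JP-10.15.1.3.jar derbyLocale_zh_CN-10.15.1.3.jar
# Link each versioned jar to the unversioned name the startup check expects:
for jar in derbyLocale_*-10.15.1.3.jar; do
  ln -sf "$jar" "${jar%-10.15.1.3.jar}.jar"
done
ls -l derbyLocale_*.jar
```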

7. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify dfs.replication at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)

Solution: in conf/kylin_hive_conf.xml, comment out the dfs.replication property, and the mapreduce.job.split.metainfo.maxsize property as well.
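Concretely, the two entries in conf/kylin_hive_conf.xml end up wrapped in an XML comment along these lines (the values and descriptions shown are illustrative; keep whatever your file contains):

```xml
<!-- Commented out: Hive on HDP 3 refuses to modify these at runtime -->
<!--
<property>
    <name>dfs.replication</name>
    <value>2</value>
</property>
<property>
    <name>mapreduce.job.split.metainfo.maxsize</name>
    <value>-1</value>
</property>
-->
```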

8. ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=root, access=EXECUTE, inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------

Solution: change the dfs.permissions.enabled parameter of the HDFS service to false, then restart the HDFS components and Kylin. (Note that this disables HDFS permission checking cluster-wide; a narrower fix is to grant the submitting user access to the Hive warehouse directory instead.)

Test with the bundled demo: bin/sample.sh

Restart Kylin Server or click Web UI => System Tab => Reload Metadata to take effect

Only then will the sample project show up in the UI.

9. logs/kylin.log reported: SLF4J: Class path contains multiple SLF4J bindings. I first assumed this was caused by multiple copies of StaticLoggerBinder.class in the Hadoop and Hive jars, but then noticed the following error was also being printed:

Error: Error while processing statement: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=42000,code=1)

My fix for this: add the following custom properties to hive-site:

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist</name>
  <value>mapred.*|hive.*|mapreduce.*|spark.*</value>
</property>

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>mapred.*|hive.*|mapreduce.*|spark.*</value>
</property>

 
