1. Hive
Start the Hive Metastore and HiveServer2:
nohup /opt/module/hive3.1.2/bin/hive --service metastore >$HIVE_LOG_DIR/metastore.log 2>&1 &
nohup /opt/module/hive3.1.2/bin/hive --service hiveserver2 >$HIVE_LOG_DIR/hiveServer2.log 2>&1 &
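The two nohup commands above redirect their logs to $HIVE_LOG_DIR, so that variable must be set (and the directory must exist) before running them. A minimal sketch — the log directory path here is an assumption, adjust it to your install:

```shell
# Assumed log location under the Hive install dir; change as needed.
export HIVE_LOG_DIR=/opt/module/hive3.1.2/logs
mkdir -p "$HIVE_LOG_DIR"
# Now run the two nohup commands above; HiveServer2 should end up
# listening on port 10000 (check with: ss -tln | grep 10000).
```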
2. Configure the connection username and password
vim conf/hive-site.xml
<property>
    <name>hive.server2.thrift.client.user</name>
    <value>lytfly</value>
    <description>Username to use against thrift client</description>
</property>
<property>
    <name>hive.server2.thrift.client.password</name>
    <value>199000</value>
    <description>Password to use against thrift client</description>
</property>
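With these credentials in place, the connection can be tested from beeline. The host and port below match the JDBC URI used elsewhere in these notes (node101:10000); adjust them for your cluster:

```shell
# Connect using the username/password configured in hive-site.xml.
beeline -u jdbc:hive2://node101:10000/default -n lytfly -p 199000
# Equivalently, from inside the beeline shell:
#   !connect jdbc:hive2://node101:10000/default
```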
Make sure the added user lytfly has write permission on the HDFS directory /tmp/hive; otherwise the following error is reported:
java.lang.RuntimeException: Error applying authorization policy on hive configuration: The dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
Solution: https://my.oschina.net/liuyuantao/blog/5276881
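A common fix for this permission error — assuming the hdfs client is on PATH and you run it as a user with HDFS superuser rights — is to loosen the permissions on Hive's scratch directory:

```shell
# Make /tmp/hive (Hive's HDFS scratch dir) writable for all users.
hdfs dfs -chmod -R 777 /tmp/hive
# Verify the new permissions on the directory itself:
hdfs dfs -ls -d /tmp/hive
```

777 is the simplest fix for a test cluster; on a shared cluster a sticky-bit mode such as 1777 is safer.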
When connecting with beeline, the following error may appear:
Error: Could not open client transport with JDBC Uri: jdbc:hive2://node101:10000/default: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: lytfly is not allowed to impersonate lytfly (state=08S01,code=0)
This happens because Hadoop's core-site.xml is missing the following configuration:
<property>
    <name>hadoop.proxyuser.lytfly.groups</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.lytfly.hosts</name>
    <value>*</value>
</property>
Then restart the Hadoop cluster, and restart Hive's Metastore and HiveServer2.
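A full restart reloads core-site.xml, but Hadoop can also reload just the proxyuser settings at runtime. A sketch, assuming you run these as the HDFS/YARN admin user on a running cluster:

```shell
# Reload hadoop.proxyuser.* settings on the NameNode and ResourceManager
# without restarting the whole cluster.
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration
```

HiveServer2 still needs a restart afterwards so it picks up any hive-site.xml changes.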