Introduction
- Once the Hive client is set up, applications that connect to Hive remotely need to supply a username and password;
- Hive's default username and password are both empty, so we need to define our own;
Implementation
- First, use a Java IDE to package a jar utility class that validates the username and password; you can download a ready-made jar directly: hiveAuth.jar;
- Alternatively, write the code and build the jar yourself, as follows:
package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class CustomPasswdAuthenticator implements org.apache.hive.service.auth.PasswdAuthenticationProvider {

    private static final Logger LOG = LoggerFactory.getLogger(CustomPasswdAuthenticator.class);

    // Each user's password is read from the property "hive.jdbc_passwd.auth.<username>"
    private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

    private Configuration conf = null;

    @Override
    public void Authenticate(String userName, String passwd) throws AuthenticationException {
        LOG.info("user: " + userName + " try login.");
        String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
        if (passwdConf == null) {
            String message = "user's ACL configuration is not found. user: " + userName;
            LOG.info(message);
            throw new AuthenticationException(message);
        }
        if (!passwd.equals(passwdConf)) {
            String message = "username and password do not match. user: " + userName;
            throw new AuthenticationException(message);
        }
    }

    public Configuration getConf() {
        if (conf == null) {
            this.conf = new Configuration(new HiveConf());
        }
        return conf;
    }

    public void setConf(Configuration conf) {
        this.conf = conf;
    }
}
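The heart of the provider above is deriving the per-user property key with `String.format` and comparing the submitted password against the configured one. The same logic can be sketched standalone, with a plain `Map` standing in for the Hadoop `Configuration` so it runs without any Hive jars (the class and method names here are illustrative only):

```java
import java.util.Map;

public class AuthLogicSketch {

    // Same key template the authenticator uses: "hive.jdbc_passwd.auth.<username>"
    private static final String PREFIX = "hive.jdbc_passwd.auth.%s";

    // Returns true only when the user has a configured password and it matches
    static boolean authenticate(Map<String, String> conf, String user, String passwd) {
        String expected = conf.get(String.format(PREFIX, user));
        return expected != null && expected.equals(passwd);
    }

    public static void main(String[] args) {
        Map<String, String> conf = Map.of("hive.jdbc_passwd.auth.sixmonth", "sixmonth");
        System.out.println(authenticate(conf, "sixmonth", "sixmonth")); // true
        System.out.println(authenticate(conf, "sixmonth", "wrong"));    // false
        System.out.println(authenticate(conf, "nobody", "x"));          // false
    }
}
```

This makes it clear why the hive-site.xml property name below must end with the exact username: the lookup key is built from the login name at authentication time.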
- Next, place the jar in the lib directory under the Hive root directory, and modify the hive-site.xml configuration file under conf;
<!--Custom username and password for remote connections-->
<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value><!--the default is NONE; change it to CUSTOM-->
</property>
<!--Point Hive at the authentication class in the jar-->
<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
</property>
<!--Set the username and password-->
<property>
  <name>hive.jdbc_passwd.auth.sixmonth</name><!--the username is the last segment of the name: sixmonth-->
  <value>sixmonth</value><!--the password-->
</property>
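Note that this setup stores passwords in plaintext in hive-site.xml. A hypothetical hardening, not part of the setup above, would be to store a SHA-256 hex digest in the property instead and have the authenticator hash the submitted password before comparing; a standalone sketch of that comparison (class and method names are illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class HashedPasswdCheck {

    // Hex-encode the SHA-256 digest of a password
    static String sha256Hex(String passwd) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest(passwd.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            // SHA-256 is guaranteed to be available on every JVM
            throw new IllegalStateException(e);
        }
    }

    // Compare the digest of the submitted password with the configured digest
    static boolean matches(String submitted, String configuredDigest) {
        return sha256Hex(submitted).equals(configuredDigest);
    }

    public static void main(String[] args) {
        String stored = sha256Hex("sixmonth"); // the value you would put in hive-site.xml
        System.out.println(matches("sixmonth", stored)); // true
        System.out.println(matches("wrong", stored));    // false
    }
}
```

With this variant, only the digest would ever appear in the config file, so reading hive-site.xml would no longer reveal the password itself.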
- Finally, modify the relevant Hadoop configuration: switch to the Hadoop configuration directory hadoop/etc/hadoop and edit core-site.xml, otherwise the Java client will not have permission to connect to Hive. (The second `hadoop` in the property names below is the OS user that runs HiveServer2; replace it with your own if it differs.)
<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>*</value>
</property>
- Restart Hadoop and Hive. You can test with the beeline command; here we test with a Java client instead;
1. Add the Hive dependency to the pom;
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>2.1.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.eclipse.jetty.aggregate</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>
2. Create a main method for the connection test;
package com.springboot.sixmonth.common.util;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/**
 * Hive connection test class
 * @author sixmonth
 * @Date 2019-05-13
 *
 */
public class HiveTest {

    // 9019 is a custom remote-connection port; the default is 10000
    private static final String URLHIVE = "jdbc:hive2://47.100.200.200:9019/default";

    private static Connection connection = null;

    public static Connection getHiveConnection() {
        if (null == connection) {
            synchronized (HiveTest.class) {
                if (null == connection) {
                    try {
                        Class.forName("org.apache.hive.jdbc.HiveDriver");
                        connection = DriverManager.getConnection(URLHIVE, "sixmonth", "sixmonth");
                        System.out.println("Hive connection established!");
                    } catch (SQLException | ClassNotFoundException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        return connection;
    }

    public static void main(String[] args) throws SQLException {
        String sql1 = "select * from sixmonth limit 1";
        PreparedStatement pstm = getHiveConnection().prepareStatement(sql1);
        // The SQL was already supplied to prepareStatement, so call executeQuery() with no argument
        ResultSet rs = pstm.executeQuery();
        while (rs.next()) {
            System.out.println(rs.getString(2));
        }
        // Close the ResultSet before the statement that produced it
        rs.close();
        pstm.close();
    }
}
3. Once the program runs successfully, it can read the data in Hive, as shown in the screenshot below:
Summary
- When connecting to Hive remotely from a development tool, the port can be customized; the default is 10000. On an Alibaba Cloud server, you also need to open the port in the security-group rules;
- Practice is the sole criterion for testing truth; roll up your sleeves and do it yourself~~