I originally followed this article (https://segmentfault.com/a/1190000019658767?utm_source=tag-newest — my comment is at the bottom of that page) and overrode the getConnection method. Kerberos login succeeded, but the connection I got back still had no permissions.
I then went back to the Druid source and overrode init() instead; the Connection obtained that way carried the authenticated identity.
@Override
public void init() throws SQLException {
    ImpalaDataSource _this = this;
    loginUser.doAs(new PrivilegedAction<Void>() {
        public Void run() {
            try {
                _this.superInit();
            } catch (Exception e) {
                e.printStackTrace();
            }
            return null;
        }
    });
}
Setting a breakpoint, however, showed that init() runs every time a connection is requested, so my override went through loginUser.doAs on every call — unnecessary work. Looking at DruidDataSource's own init() method, I saw that it sets inited to true after the first initialization. Taking that hint, I added the same guard to my override. The final source:
import com.alibaba.druid.pool.DruidDataSource;
import lombok.Getter;
import lombok.Setter;
import org.apache.hadoop.security.UserGroupInformation;

import java.security.PrivilegedAction;
import java.sql.SQLException;

@Getter
@Setter
public class ImpalaDataSource extends DruidDataSource {

    private UserGroupInformation loginUser;

    public ImpalaDataSource(UserGroupInformation loginUser) {
        this.loginUser = loginUser;
    }

    @Override
    public void init() throws SQLException {
        // Mirror DruidDataSource's own guard: once the pool is initialized,
        // skip the doAs step on subsequent calls.
        if (super.inited) {
            return;
        }
        ImpalaDataSource _this = this;
        try {
            loginUser.doAs(new PrivilegedAction<Void>() {
                public Void run() {
                    try {
                        _this.superInit();
                    } catch (SQLException e) {
                        // PrivilegedAction.run() cannot throw checked exceptions,
                        // so wrap and rethrow instead of swallowing the failure
                        throw new RuntimeException(e);
                    }
                    return null;
                }
            });
        } catch (RuntimeException e) {
            if (e.getCause() instanceof SQLException) {
                throw (SQLException) e.getCause();
            }
            throw e;
        }
    }

    // Exposes DruidDataSource.init() so the anonymous inner class can call it
    public void superInit() throws SQLException {
        super.init();
    }
}
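The inited guard is just an init-once pattern. To isolate it from the Druid and Hadoop dependencies, here is a minimal standalone sketch (a hypothetical InitOnce class, not part of Druid) showing why the second call becomes a no-op:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative init-once guard, mirroring DruidDataSource's `inited` flag.
public class InitOnce {
    private volatile boolean inited = false;
    private final AtomicInteger initCount = new AtomicInteger();

    public synchronized void init() {
        if (inited) {
            return;                       // already initialized: skip the expensive work
        }
        initCount.incrementAndGet();      // stands in for loginUser.doAs(...)
        inited = true;
    }

    public int initCount() {
        return initCount.get();
    }

    public static void main(String[] args) {
        InitOnce ds = new InitOnce();
        ds.init();
        ds.init();                        // no-op thanks to the guard
        System.out.println(ds.initCount()); // prints 1
    }
}
```

In the real class the expensive work is the Kerberos doAs call, which is exactly what we want to run only once.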
Below is the Kerberos authentication performed before instantiating the data source:
@Bean(name = "impalaDataSource", initMethod = "init", destroyMethod = "close")
public ImpalaDataSource getImpalaDataSource() throws Exception {
    // Kerberos login for Impala
    org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
    KerberosInitor.initKerberosEnv(conf, impalaPrincipalName, impalaKeytabPath, krb5ConfPath, loginConfigPath);
    // fetch the user that was just authenticated
    UserGroupInformation loginUser = UserGroupInformation.getLoginUser();

    ImpalaDataSource datasource = new ImpalaDataSource(loginUser);
    datasource.setUrl(impalaUrl);
    datasource.setDriverClassName(impalaDriverClassName);
    // pool configuration
    datasource.setInitialSize(initialSize);
    datasource.setMinIdle(minIdle);
    datasource.setMaxActive(maxActive);
    datasource.setMaxWait(maxWait);
    datasource.setMinEvictableIdleTimeMillis(minEvictableIdleTimeMillis);
    datasource.setValidationQuery(validationQuery);
    datasource.setTestWhileIdle(testWhileIdle);
    datasource.setTestOnBorrow(testOnBorrow);
    datasource.setTestOnReturn(testOnReturn);
    datasource.setPoolPreparedStatements(poolPreparedStatements);
    datasource.setMaxPoolPreparedStatementPerConnectionSize(maxPoolPreparedStatementPerConnectionSize);
    return datasource;
}
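The fields injected above (impalaUrl, impalaDriverClassName, etc.) would typically come from application properties. A sketch of plausible values — the property keys, host names, realm, and port are all assumptions to adapt to your cluster; AuthMech=1 selects Kerberos in the Cloudera Impala JDBC driver:

```properties
# Hypothetical values; adjust host, realm, and service name to your environment.
impala.url=jdbc:impala://impala-host:21050/default;AuthMech=1;KrbRealm=EXAMPLE.COM;KrbHostFQDN=impala-host;KrbServiceName=impala
impala.driver-class-name=com.cloudera.impala.jdbc41.Driver
```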
public static void initKerberosEnv(Configuration conf, String principalName, String keytabPath,
                                   String krb5ConfPath, String loginConfigPath) throws Exception {
    // On Linux, /etc/krb5.conf is read by default; on Windows, C:/Windows/krb5.ini
    // is used when no path is specified
    System.setProperty("java.security.krb5.conf", krb5ConfPath);
    System.setProperty("java.security.auth.login.config", loginConfigPath);
    conf.set("hadoop.security.authentication", "Kerberos");
    UserGroupInformation.setConfiguration(conf);
    UserGroupInformation.loginUserFromKeytab(principalName, keytabPath);
}
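The loginConfigPath above points to a JAAS login configuration file. A minimal sketch of what such a file typically contains — the entry name "Client", keytab path, and principal are assumptions to match your own setup:

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/path/to/impala.keytab"
  principal="impala-user@EXAMPLE.COM"
  storeKey=true
  doNotPrompt=true;
};
```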