Spark program throws a NullPointerException when initializing HiveContext
Versions:
spark: 1.6.1
scala: 2.10.8
hive: 1.2.1
Error message:
java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:440)
at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at com.winner.clu.spark.batch.analysis.AccPresetConditionData.mainFun(AccPresetConditionData.scala:60)
at com.winner.clu.spark.batch.BatchJobMain$.main(BatchJobMain.scala:53)
at com.winner.clu.spark.batch.BatchJobMain.main(BatchJobMain.scala)
Problem description
While debugging the program locally in IDEA, the call val htx = new HiveContext(sc) throws a NullPointerException.
Cause
Tracing through the source shows that the exception is thrown at the following step (creating a local temporary directory) in SessionState:
// org.apache.hadoop.hive.ql.session.SessionState#createPath
private void createPath(HiveConf conf, Path path, String permission, boolean isLocal,
    boolean isCleanUp) throws IOException {
  FsPermission fsPermission = new FsPermission(permission);
  FileSystem fs;
  if (isLocal) {
    fs = FileSystem.getLocal(conf);
  } else {
    fs = path.getFileSystem(conf);
  }
  if (!fs.exists(path)) {
    fs.mkdirs(path, fsPermission);
    String dirType = isLocal ? "local" : "HDFS";
    LOG.info("Created " + dirType + " directory: " + path.toString());
  }
  if (isCleanUp) {
    fs.deleteOnExit(path);
  }
}
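The failure mode here is commonly attributed to Hadoop's Shell class on Windows: it resolves the path to winutils.exe in a static initializer, and when the executable is missing the path field is left null, so the NPE only surfaces later when local-filesystem operations (such as the directory creation above) try to use it. The following is a simplified illustration of that pattern, not Hadoop's actual code; the class and field names are made up for this sketch:

```java
import java.io.File;

// Simplified sketch (hypothetical class, not from Hadoop) of how a failed
// static lookup surfaces later as a NullPointerException.
public class NullFieldDemo {
    static String winutilsPath; // stays null when the lookup fails

    static {
        // Mirrors the idea of resolving %HADOOP_HOME%\bin\winutils.exe at class load.
        File f = new File(System.getenv().getOrDefault("HADOOP_HOME", ""),
                "bin" + File.separator + "winutils.exe");
        winutilsPath = f.isFile() ? f.getAbsolutePath() : null;
    }

    public static void main(String[] args) {
        try {
            // Using the path without a null check reproduces the failure mode:
            // the root cause (missing file) is far away from where the NPE appears.
            System.out.println(winutilsPath.length());
        } catch (NullPointerException e) {
            System.out.println("NullPointerException: winutils.exe was not found");
        }
    }
}
```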
In addition, at startup the program logs an error because winutils.exe is missing from the local hadoop/bin directory, which suggested that the NPE is caused by the same problem.
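Before digging into Spark at all, a quick preflight check can confirm this diagnosis. The helper below is a hypothetical utility (not part of Hadoop or Spark); it only assumes the documented lookup order in which Hadoop resolves its home directory, first the hadoop.home.dir system property, then the HADOOP_HOME environment variable:

```java
import java.io.File;

public class WinutilsCheck {
    // Hypothetical helper: returns true if winutils.exe exists under
    // <hadoopHome>/bin, i.e. where Hadoop expects to find it on Windows.
    static boolean hasWinutils(String hadoopHome) {
        if (hadoopHome == null || hadoopHome.isEmpty()) {
            return false;
        }
        return new File(hadoopHome, "bin" + File.separator + "winutils.exe").isFile();
    }

    public static void main(String[] args) {
        // Hadoop checks -Dhadoop.home.dir first, then HADOOP_HOME.
        String home = System.getProperty("hadoop.home.dir", System.getenv("HADOOP_HOME"));
        System.out.println(hasWinutils(home)
                ? "winutils.exe found"
                : "winutils.exe missing -- HiveContext will likely fail on Windows");
    }
}
```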
Solution
Download the winutils.exe file and copy it into the local hadoop/bin directory, as shown in the figure below:

Then rerun the program from IDEA; the problem is resolved.
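As an alternative to relying on the HADOOP_HOME environment variable, the Hadoop home can also be pointed at the directory containing bin\winutils.exe programmatically, before any Spark or Hive class is loaded. This is a sketch of that approach; the C:\hadoop path is only an example and should be replaced with your local directory:

```java
public class HadoopHomeSetup {
    // Sets the hadoop.home.dir system property, which Hadoop consults before
    // the HADOOP_HOME environment variable when locating winutils.exe.
    public static void configure(String hadoopDir) {
        System.setProperty("hadoop.home.dir", hadoopDir);
    }

    public static void main(String[] args) {
        // Must run before the SparkContext/HiveContext is created, e.g.:
        //   val sc  = new SparkContext(conf)
        //   val htx = new HiveContext(sc)
        configure("C:\\hadoop"); // example path: <hadoopDir>\bin must hold winutils.exe
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```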