spark-sql cannot create Hive tables and fails with the error "/user/hive/warehouse is not a directory or unable to create one"
2. Solution
Copy $HIVE_HOME/conf/hive-site.xml to $SPARK_HOME/conf/.
In $SPARK_HOME/conf/hive-site.xml, change the value of the hive.metastore.warehouse.dir property; its default value is /user/hive/warehouse.
Restart spark-sql.
The modified file is:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://192.168.2.181:9000/user/hive/warehouse</value>
    <description>hive.metastore.warehouse.dir</description>
  </property>
</configuration>
Note: hdfs://192.168.2.181 is this author's HDFS address; replace it with your own address when using this configuration!
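The manual edit above can also be scripted. The following is a minimal Python sketch, using only the standard-library xml.etree.ElementTree, that rewrites (or adds) the hive.metastore.warehouse.dir property in a hive-site.xml file. The demo deliberately writes to a throwaway temp file, a stand-in for the real $SPARK_HOME/conf/hive-site.xml; the HDFS URI is the example address from the config above and must likewise be replaced with your own.

```python
import os
import tempfile
import xml.etree.ElementTree as ET

def set_warehouse_dir(hive_site_path, warehouse_uri):
    """Set (or append) hive.metastore.warehouse.dir in a hive-site.xml file."""
    tree = ET.parse(hive_site_path)
    root = tree.getroot()  # the <configuration> element
    for prop in root.findall("property"):
        if prop.findtext("name") == "hive.metastore.warehouse.dir":
            prop.find("value").text = warehouse_uri
            break
    else:
        # Property not present yet: append a new <property> block.
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = "hive.metastore.warehouse.dir"
        ET.SubElement(prop, "value").text = warehouse_uri
    tree.write(hive_site_path)

# Demo on a throwaway copy (stand-in for $SPARK_HOME/conf/hive-site.xml):
demo = os.path.join(tempfile.mkdtemp(), "hive-site.xml")
with open(demo, "w") as f:
    f.write("<configuration><property>"
            "<name>hive.metastore.warehouse.dir</name>"
            "<value>/user/hive/warehouse</value>"
            "</property></configuration>")
set_warehouse_dir(demo, "hdfs://192.168.2.181:9000/user/hive/warehouse")
new_value = ET.parse(demo).getroot().find("property/value").text
print(new_value)
```

After running the sketch against the copied file, restart spark-sql so the new warehouse location takes effect.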