Sqoop import into a Hive table reports an error

Command: /sqoop-1.4.6/bin/sqoop import --connect jdbc:oracle:thin:@10.100.100.100:1521:orcl --username aaa --password aaa --table tablename --hive-import -m 1 --fields-terminated-by '\t' --hive-overwrite --hive-table log.hivetablename -- --default-character-set=utf-8

Error: Move from: hdfs://XXX to: hdfs://YYY is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.

ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 44

Cause: the Hive table's location does not match default.fs.name.

  Check the table's location: desc extended hivetablename
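
  If your Hive version supports it, desc formatted prints the storage details as labelled fields, which makes the location easier to spot. A minimal sketch, assuming the table from the example above (the path shown is only illustrative):

    hive> desc formatted log.hivetablename;
    ...
    Location:    hdfs://<old-namenode>:<port>/user/hive/warehouse/log/hivetablename
    ...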

  Check fs.default.name (the Hive error message calls it "default.fs.name"): it is set in the core-site.xml file under the Hadoop installation's configuration directory.
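
  For reference, the relevant entry in core-site.xml looks roughly like this; the value below matches the address used in the alter table statement further down, and on Hadoop 2.x the key is named fs.defaultFS (fs.default.name is kept as a deprecated alias):

    <property>
      <name>fs.default.name</name>
      <value>hdfs://10.100.111.1:9000</value>
    </property>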

Fix: change the Hive table's location so that it uses the value of fs.default.name.


Change the Hive table location: alter table hivetablename set location 'hdfs://10.100.111.1:9000/user/hive/warehouse/log/hivetablename';
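
Note that alter table ... set location only updates the metastore pointer; it does not move any existing files, so the next --hive-overwrite import simply writes into the new location. To confirm the change took effect, describe the table again and check that its location now starts with the same hdfs://host:port as fs.default.name:

    hive> desc extended log.hivetablename;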


If you then get org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory XXX already exists, delete XXX from the /user/hadoop/ directory on HDFS:

hadoop fs -rmr /user/hadoop/XXX
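
On Hadoop 2.x and later, hadoop fs -rmr is deprecated; the equivalent command is:

    hadoop fs -rm -r /user/hadoop/XXX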
