While using Sqoop to migrate data between Hive and a MySQL database, I ran into the error below.

Ran: ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.10:3306/ekp_11 --table job_log --username root --password 123456 --hive-table job_log

The intent was to copy the relational table's schema into Hive, but the command produced the following pile of errors:

Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
15/08/02 02:04:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/08/02 02:04:14 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
15/08/02 02:04:14 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
15/08/02 02:04:14 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_start_time had to be cast to a less precise type in Hive
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_end_time had to be cast to a less precise type in Hive
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /cloud/hadoop-2.2.0/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/08/02 02:04:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/02 02:04:17 INFO hive.HiveImport: Loading uploaded data into Hive
15/08/02 02:04:17 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Cannot run program "hive": error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
        at java.lang.Runtime.exec(Runtime.java:617)
        at java.lang.Runtime.exec(Runtime.java:528)
        at org.apache.sqoop.util.Executor.exec(Executor.java:76)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:382)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:335)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:239)
        at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
Caused by: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
        at java.lang.ProcessImpl.start(ProcessImpl.java:130)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
        ... 13 more


Force of habit: I had assumed Sqoop would be smart enough to locate the Hive installation on this machine by itself. It is not; as the stack trace shows, it simply tries to run the hive command, which was not available in the environment it launched.

Solution: tell Sqoop which Hive environment to use.
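
Before changing any configuration, it is worth confirming the diagnosis: Sqoop launches the Hive CLI as an external process, so if the hive executable cannot be resolved from the PATH of the user running Sqoop, you get exactly the "error=2, No such file or directory" above. A quick check might look like this (the Hive install path is an assumption, matching the example used later):

      which hive                                   # prints nothing if hive is not resolvable on the PATH
      ls /cloud/apache-hive-1.2.1-bin/bin/hive     # yet the binary itself exists on disk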

The steps are as follows:
1. In /sqoop-1.4.4/conf, find the file sqoop-env-template.sh and rename it to sqoop-env.sh;
2. Edit sqoop-env.sh and set it to your Hive installation directory; that is all it takes (see the fuller sketch below).

      e.g.: export HIVE_HOME=/cloud/apache-hive-1.2.1-bin
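
Putting it together, a minimal sqoop-env.sh for this setup might look like the sketch below. The Hadoop path is taken from the native-library warning in the log above; treat both paths as placeholders for your own installation, not as required values.

      cd /sqoop-1.4.4/conf
      mv sqoop-env-template.sh sqoop-env.sh

      # contents of sqoop-env.sh (example paths for this environment):
      export HADOOP_COMMON_HOME=/cloud/hadoop-2.2.0
      export HADOOP_MAPRED_HOME=/cloud/hadoop-2.2.0
      export HIVE_HOME=/cloud/apache-hive-1.2.1-bin
      # If you use HBase or HCatalog, exporting HBASE_HOME / HCAT_HOME here should
      # also silence the warnings at the top of the Sqoop output.

After saving the file, rerun the ./sqoop create-hive-table command; Sqoop should now be able to launch the Hive CLI and create the job_log table.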
