Sqoop throws an exception when importing Oracle data into Hive: running import job: java.io.IOException: Hive exited with status 1

[root@node2 sqoop-1.4.6]# sqoop import --connect jdbc:oracle:thin:@192.168.8.110:1521:orcl --username SKY --password 123456 --table GJJY20150613 -m 8 --hive-import
Warning: /home/sqoop-1.4.6//../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/sqoop-1.4.6//../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/sqoop-1.4.6//../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
17/03/30 09:29:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/03/30 09:29:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/03/30 09:29:25 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/03/30 09:29:25 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/03/30 09:29:25 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
17/03/30 09:29:25 INFO manager.SqlManager: Using default fetchSize of 1000
17/03/30 09:29:25 INFO tool.CodeGenTool: Beginning code generation
17/03/30 09:29:33 INFO manager.OracleManager: Time zone has been set to GMT
17/03/30 09:29:33 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM GJJY20150613 t WHERE 1=0
17/03/30 09:29:33 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop-2.5.1
Note: /tmp/sqoop-root/compile/99be5bfaadbeef110d46ecba09bf90e8/GJJY20150613.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/03/30 09:29:36 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/99be5bfaadbeef110d46ecba09bf90e8/GJJY20150613.jar
17/03/30 09:29:36 INFO manager.OracleManager: Time zone has been set to GMT
17/03/30 09:29:36 INFO manager.OracleManager: Time zone has been set to GMT
17/03/30 09:29:36 INFO mapreduce.ImportJobBase: Beginning import of GJJY20150613
17/03/30 09:29:36 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/03/30 09:29:36 INFO manager.OracleManager: Time zone has been set to GMT
17/03/30 09:29:37 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/03/30 09:29:44 INFO db.DBInputFormat: Using read commited transaction isolation
17/03/30 09:29:44 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(ID), MAX(ID) FROM GJJY20150613
17/03/30 09:29:44 INFO mapreduce.JobSubmitter: number of splits:8
17/03/30 09:29:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1490692324324_0011
17/03/30 09:29:45 INFO impl.YarnClientImpl: Submitted application application_1490692324324_0011
17/03/30 09:29:45 INFO mapreduce.Job: The url to track the job: http://node1:8088/proxy/application_1490692324324_0011/
17/03/30 09:29:45 INFO mapreduce.Job: Running job: job_1490692324324_0011
17/03/30 09:30:14 INFO mapreduce.Job: Job job_1490692324324_0011 running in uber mode : false
17/03/30 09:30:14 INFO mapreduce.Job:  map 0% reduce 0%
17/03/30 09:30:32 INFO mapreduce.Job:  map 13% reduce 0%
17/03/30 09:30:53 INFO mapreduce.Job:  map 25% reduce 0%
17/03/30 09:30:54 INFO mapreduce.Job:  map 38% reduce 0%
17/03/30 09:30:59 INFO mapreduce.Job:  map 50% reduce 0%
17/03/30 09:31:04 INFO mapreduce.Job:  map 63% reduce 0%
17/03/30 09:31:24 INFO mapreduce.Job:  map 75% reduce 0%
17/03/30 09:31:53 INFO mapreduce.Job:  map 88% reduce 0%
17/03/30 09:31:58 INFO mapreduce.Job:  map 100% reduce 0%
17/03/30 09:31:59 INFO mapreduce.Job: Job job_1490692324324_0011 completed successfully
17/03/30 09:32:00 INFO mapreduce.Job: Counters: 31
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=940960
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=895
HDFS: Number of bytes written=128576343
HDFS: Number of read operations=32
HDFS: Number of large read operations=0
HDFS: Number of write operations=16
Job Counters 
Killed map tasks=4
Launched map tasks=12
Other local map tasks=12
Total time spent by all maps in occupied slots (ms)=552284
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=552284
Total vcore-seconds taken by all map tasks=552284
Total megabyte-seconds taken by all map tasks=565538816
Map-Reduce Framework
Map input records=1749094
Map output records=1749094
Input split bytes=895
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=1077
CPU time spent (ms)=51780
Physical memory (bytes) snapshot=1926582272
Virtual memory (bytes) snapshot=14274105344
Total committed heap usage (bytes)=1232076800
File Input Format Counters 
Bytes Read=0
File Output Format Counters 
Bytes Written=128576343
17/03/30 09:32:00 INFO mapreduce.ImportJobBase: Transferred 122.62 MB in 142.9521 seconds (878.3562 KB/sec)
17/03/30 09:32:00 INFO mapreduce.ImportJobBase: Retrieved 1749094 records.
17/03/30 09:32:00 INFO manager.OracleManager: Time zone has been set to GMT
17/03/30 09:32:00 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM GJJY20150613 t WHERE 1=0
17/03/30 09:32:00 WARN hive.TableDefWriter: Column IC_AMOUNT had to be cast to a less precise type in Hive
17/03/30 09:32:00 WARN hive.TableDefWriter: Column ID had to be cast to a less precise type in Hive
17/03/30 09:32:00 INFO hive.HiveImport: Loading uploaded data into Hive
17/03/30 09:32:06 INFO hive.HiveImport: 
17/03/30 09:32:06 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/hive-1.2.1/lib/hive-common-1.2.1.jar!/hive-log4j.properties
17/03/30 09:32:12 INFO hive.HiveImport: Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
17/03/30 09:32:12 INFO hive.HiveImport: at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
17/03/30 09:32:12 INFO hive.HiveImport: at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
17/03/30 09:32:12 INFO hive.HiveImport: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
17/03/30 09:32:12 INFO hive.HiveImport: at java.lang.reflect.Method.invoke(Method.java:606)
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
17/03/30 09:32:12 INFO hive.HiveImport: Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
17/03/30 09:32:12 INFO hive.HiveImport: at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
17/03/30 09:32:12 INFO hive.HiveImport: ... 7 more
17/03/30 09:32:13 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 1
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:389)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:339)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

The error above occurred while importing data from Oracle into Hive with Sqoop. Note that the stack trace itself names the root cause: the HDFS scratch directory /tmp/hive must be writable, but its permissions were rwx------, so those permissions need to be opened up before the Hive load step can succeed.

Solution: in addition, locate the lib folder under HIVE_HOME and copy libthrift-0.9.2.jar from it into the lib folder under SQOOP_HOME; with that in place the import completes.
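The two fixes above can be sketched as shell commands. This is a minimal sketch, assuming HIVE_HOME and SQOOP_HOME are set in the environment and that the hdfs client is on the PATH; the exact libthrift jar version should match what ships in your Hive lib directory.

```shell
# Fix 1: make the Hive scratch dir on HDFS writable.
# This addresses the error in the log:
#   "The root scratch dir: /tmp/hive on HDFS should be writable.
#    Current permissions are: rwx------"
hdfs dfs -chmod -R 777 /tmp/hive

# Fix 2: give Sqoop the Thrift client classes Hive uses,
# by copying Hive's libthrift jar into Sqoop's classpath.
cp "$HIVE_HOME/lib/libthrift-0.9.2.jar" "$SQOOP_HOME/lib/"
```

After both steps, rerun the same sqoop import command; the "Loading uploaded data into Hive" phase should then complete instead of exiting with status 1.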
