Notes on some Sqoop usage, written down as a refresher; I used it in an earlier project and am reviewing it here.
Usage Examples
Importing Data
In Sqoop, "import" means transferring data from a non-big-data cluster (an RDBMS) into the big-data cluster (HDFS, Hive, HBase), i.e. using the import keyword.
RDBMS to HDFS
Create a table in MySQL and insert some data:
create database company_test;
create table company_test.staff(id int(4) primary key not null auto_increment, name varchar(255), sex varchar(255));
insert into company_test.staff(name, sex) values('Thomas', 'Male');
insert into company_test.staff(name, sex) values('Catalina', 'FeMale');
mysql> select * from company_test.staff;
+----+----------+--------+
| id | name     | sex    |
+----+----------+--------+
|  1 | Thomas   | Male   |
|  2 | Catalina | FeMale |
+----+----------+--------+
Full import
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--table staff \
--target-dir /sqoop/test/company_test \
--delete-target-dir \
--num-mappers 1 \
--fields-terminated-by "\t"
Note that this imports data onto HDFS, so check that the relevant directories exist: for the command above, at least /sqoop/test/ must already exist, otherwise the job fails.
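If the parent path is missing, it can be created up front (a minimal sketch; the path matches the --target-dir used above):
hdfs dfs -mkdir -p /sqoop/test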
Check the import result:
-rw-r--r-- 2 root supergroup 0 2020-01-15 20:02 /sqoop/test/company_test/_SUCCESS
-rw-r--r-- 2 root supergroup 32 2020-01-15 20:02 /sqoop/test/company_test/part-m-00000
[root@mym sqoop]# hdfs dfs -cat /sqoop/test/company_test/part-m-00000
1 Thomas Male
2 Catalina FeMale
Import with a free-form query
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--target-dir /sqoop/test/company_test \
--delete-target-dir \
--num-mappers 1 \
--fields-terminated-by "\t" \
--query 'select name,sex from staff where id <=1 and $CONDITIONS;'
Note: the query must contain '$CONDITIONS' in the WHERE clause.
If the query is wrapped in double quotes instead of single quotes, $CONDITIONS must be escaped as \$CONDITIONS so that the shell does not expand it as a shell variable.
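For reference, a minimal sketch of the same import written with double quotes; the only difference is the escaped \$CONDITIONS:
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--target-dir /sqoop/test/company_test \
--delete-target-dir \
--num-mappers 1 \
--fields-terminated-by "\t" \
--query "select name,sex from staff where id <=1 and \$CONDITIONS;"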
Import specific columns
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--target-dir /sqoop/test/company_test \
--delete-target-dir \
--num-mappers 1 \
--fields-terminated-by "\t" \
--columns id,sex \
--table staff
Note: when --columns lists multiple columns, separate them with commas and do not put spaces after the commas.
Filter the imported data with Sqoop's where keyword
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--table staff \
--target-dir /sqoop/test/company_test \
--delete-target-dir \
--num-mappers 1 \
--fields-terminated-by "\t" \
--where "id=1"
RDBMS to Hive
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--table staff \
--num-mappers 1 \
--hive-import \
--fields-terminated-by "\t" \
--hive-overwrite \
--hive-table staff_hive
Note: this runs in two steps: step one imports the data into HDFS, and step two moves that data from HDFS into the Hive warehouse. The default staging directory for step one is /user/<current user>/<table name>.
The log shows the job succeeded; check the result in Hive:
hive (default)> show tables;
OK
tab_name
staff_hive
student_test
Time taken: 0.416 seconds, Fetched: 12 row(s)
hive (default)> select * from staff_hive;
OK
staff_hive.id staff_hive.name staff_hive.sex
1 Thomas Male
2 Catalina FeMale
Time taken: 0.689 seconds, Fetched: 2 row(s)
RDBMS to HBase
bin/sqoop import \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--table staff \
--columns "id,name,sex" \
--column-family "info" \
--hbase-create-table \
--hbase-row-key "id" \
--hbase-table "hbase_company" \
--num-mappers 1 \
--split-by id
Note: Sqoop 1.4.6 only supports automatically creating the HBase table for HBase versions earlier than 1.0.1.
Against a newer HBase the job fails with the following error:
20/01/15 20:49:16 INFO mapreduce.HBaseImportJob: Creating missing HBase table hbase_company
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
at org.apache.sqoop.mapreduce.HBaseImportJob.jobSetup(HBaseImportJob.java:222)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:264)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Workaround: create the HBase table manually:
hbase> create 'hbase_company','info'
Then scan the table in HBase to verify the imported data:
hbase> scan 'hbase_company'
Exporting Data
In Sqoop, "export" means transferring data from the big-data cluster (HDFS, Hive, HBase) to a non-big-data cluster (an RDBMS), i.e. using the export keyword.
Hive/HDFS to RDBMS
bin/sqoop export \
--connect jdbc:mysql://mym:3306/company_test \
--username root \
--password 123456 \
--table staff \
--num-mappers 1 \
--export-dir /user/hive/warehouse/staff_hive \
--input-fields-terminated-by "\t"
Note: if the target table does not exist in MySQL, it is not created automatically and the job fails with an error.
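A minimal sketch of pre-creating a target table before an export (staff_export here is a hypothetical name; the export command above writes into the already-existing staff table):
create table if not exists company_test.staff_export(id int(4) primary key not null auto_increment, name varchar(255), sex varchar(255));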
Packaging as a script
Put the sqoop command into an .opt file and execute that file.
1) Create an .opt file
mkdir opt
touch opt/job_HDFS2RDBMS.opt
2) Write the sqoop script
One option or value per line:
import
--connect
jdbc:mysql://192.168.31.201:3306/company_test
--username
root
--password
123456
--target-dir
/sqoop/test/company_test
--delete-target-dir
--num-mappers
1
--fields-terminated-by
"\t"
--columns
id,sex
--table
staff
3) Execute the script
bin/sqoop --options-file opt/job_HDFS2RDBMS.opt
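An options file can also be combined with extra arguments appended after it on the command line (a small sketch; --verbose just turns on verbose logging):
bin/sqoop --options-file opt/job_HDFS2RDBMS.opt --verbose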
Common Sqoop commands and parameters
Some of the commonly used Sqoop commands are listed below for reference; for a deeper look, read the source code of the corresponding tool class.
No. | Command | Class | Description |
---|---|---|---|
1 | import | ImportTool | Import data into the cluster |
2 | export | ExportTool | Export data out of the cluster |
3 | codegen | CodeGenTool | Generate Java code for a database table and package it as a jar |
4 | create-hive-table | CreateHiveTableTool | Create a Hive table |
5 | eval | EvalSqlTool | Run a SQL statement and view the result |
6 | import-all-tables | ImportAllTablesTool | Import all tables of a database into HDFS |
7 | job | JobTool | Define a sqoop job; the job is not executed when created, only when it is explicitly run |
8 | list-databases | ListDatabasesTool | List all database names |
9 | list-tables | ListTablesTool | List all tables in a database |
10 | merge | MergeTool | Merge data from different HDFS directories and store it in a specified directory |
11 | metastore | MetastoreTool | Store sqoop job metadata; if no metastore instance is started, the default metadata directory is ~/.sqoop, which can be changed in sqoop-site.xml |
12 | help | HelpTool | Print sqoop help information |
13 | version | VersionTool | Print sqoop version information |
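For example, eval is handy for quickly running a SQL statement against MySQL without setting up a full import job (a minimal sketch reusing the connection settings from above):
bin/sqoop eval \
--connect jdbc:mysql://192.168.31.201:3306/company_test \
--username root \
--password 123456 \
--query "select * from staff"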