Hive Basic Operations

Database Operations

List all databases

hive> show databases;

Use the default database

hive> use default;

View database information

hive> describe database default;
OK
db_name comment location owner_name owner_type parameters
default Default Hive database hdfs://hadoop1:8020/user/hive/warehouse public ROLE
Time taken: 0.042 seconds, Fetched: 1 row(s)

Explicitly show the current database in the prompt

hive> set hive.cli.print.current.db=true;
hive(default)>

Display column headers

hive (default)> set hive.cli.print.header=true;
hive (default)> desc addressall_2015_07_09;
OK
col_name data_type comment
_c0 string
addr1 bigint
addr2 bigint
addr3 bigint
Time taken: 0.182 seconds, Fetched: 4 row(s)
hive (default)> select * from addressall_2015_07_09;
OK
addressall_2015_07_09._c0 addressall_2015_07_09.addr1 addressall_2015_07_09.addr2 addressall_2015_07_09.addr3
2015_07_09 536 488 493
Time taken: 10.641 seconds, Fetched: 1 row(s)
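
Both set commands above affect only the current CLI session. As a minimal sketch (relying on the Hive CLI reading a ~/.hiverc startup file, which it does by default), they can be made permanent by putting them there:

set hive.cli.print.current.db=true;
set hive.cli.print.header=true;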

Create a database

hive (default)> create database liguodong;
OK
Time taken: 10.128 seconds
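
CREATE DATABASE also accepts an optional comment and an explicit warehouse location. A minimal sketch, with a hypothetical database name and path:

hive (default)> create database if not exists mydb comment 'demo database' location '/user/hive/warehouse/mydb.db';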

Switch the current database

hive (default)> use liguodong;
OK
Time taken: 0.031 seconds
hive (liguodong)>

Drop a database

A database that still contains tables cannot be dropped directly; attempting to do so raises an error. To drop it anyway, append the CASCADE keyword, which removes the database together with everything in it.

hive> DROP DATABASE DbName [CASCADE];    -- CASCADE is optional
hive> DROP DATABASE IF EXISTS DbName CASCADE;
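
For example, with a hypothetical non-empty database mydb:

-- fails if mydb still contains tables
DROP DATABASE mydb;
-- CASCADE drops mydb together with all of its tables
DROP DATABASE IF EXISTS mydb CASCADE;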

Table Operations

List the tables in a database

hive> SHOW TABLES IN DbName;
hive (liguodong)> SHOW TABLES IN liguodong;
OK
tab_name
Time taken: 0.165 seconds

A wildcard pattern can also be used: hive> SHOW TABLES LIKE 'h*';

hive (default)> SHOW TABLES LIKE '*all*' ;
OK
tab_name
addressall_2015_07_09
Time taken: 0.039 seconds, Fetched: 1 row(s)

Get the CREATE TABLE statement for a table

hive (default)> show create table address1_2015_07_09;
OK
createtab_stmt
CREATE  TABLE `address1_2015_07_09`(
  `addr1` bigint)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
STORED AS INPUTFORMAT
  'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://nameservice1/user/hive/warehouse/address1_2015_07_09'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='true',
  'numFiles'='1',
  'numRows'='0',
  'rawDataSize'='0',
  'totalSize'='4',
  'transient_lastDdlTime'='1436408451')
Time taken: 0.11 seconds, Fetched: 17 row(s)

Create a table

CREATE [TEMPORARY] [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name    -- (Note: TEMPORARY available in Hive 0.14.0 and later)
  [(col_name data_type [COMMENT col_comment], ...)]
  [COMMENT table_comment]
  [PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
  [CLUSTERED BY (col_name, col_name, ...) [SORTED BY (col_name [ASC|DESC], ...)] INTO num_buckets BUCKETS]
  [SKEWED BY (col_name, col_name, ...)                  -- (Note: Available in Hive 0.10.0 and later)]
     ON ((col_value, col_value, ...), (col_value, col_value, ...), ...)
     [STORED AS DIRECTORIES]
  [
   [ROW FORMAT row_format] 
   [STORED AS file_format]
     | STORED BY 'storage.handler.class.name' [WITH SERDEPROPERTIES (...)]  -- (Note: Available in Hive 0.6.0 and later)
  ]
  [LOCATION hdfs_path]
  [TBLPROPERTIES (property_name=property_value, ...)]   -- (Note: Available in Hive 0.6.0 and later)
  [AS select_statement];   -- (Note: Available in Hive 0.5.0 and later; not supported for external tables)
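
A sketch that exercises a few more of the clauses above (the table name logs, its columns, and the dt partition column are hypothetical):

create table if not exists logs(
  id bigint,
  msg string comment 'log message'
)
comment 'example partitioned table'
partitioned by (dt string)
row format delimited fields terminated by '\t'
stored as textfile;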

hive (default)> create table test(id int);
OK
Time taken: 10.143 seconds

Get the CREATE TABLE statement for the new table

hive (default)> show create table test;
OK
createtab_stmt
CREATE  TABLE `test`(
  `id` int)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
STORED AS INPUTFORMAT
  'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://nameservice1/user/hive/warehouse/test'
TBLPROPERTIES (
  'transient_lastDdlTime'='1436799093')
Time taken: 0.135 seconds, Fetched: 12 row(s)

Create an internal table

(The managed-table example, testtable, appears below, after the external-table example.)

Other ways to get table information
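
Besides SHOW CREATE TABLE, a table's columns and metadata can be inspected with DESCRIBE, for example on the test table created above:

hive (default)> describe test;
hive (default)> describe extended test;
hive (default)> describe formatted test;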

Create an external table

hive (default)> create external table testexttable(
              > name string comment 'name value',
              > addr string comment 'addr value'
              > );
OK
Time taken: 10.172 seconds

create table if not exists testtable(
name string comment 'name value',
addr string comment 'addr value'
)
row format delimited fields terminated by '\t' lines terminated by '\n' stored as textfile
;

Load data

hive (default)> load data local inpath '/liguodong/hivedata/datatest' overwrite into table testtable;

hive (default)> load data local inpath '/liguodong/hivedata/datatest' into table testtable;

Without OVERWRITE, another copy of the data file is added alongside the existing data instead of replacing it.
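
Without the LOCAL keyword the path is interpreted as an HDFS path, and the file is moved into the table's directory rather than copied. A sketch, assuming the same data file already sits in HDFS:

hive (default)> load data inpath '/liguodong/hivedata/datatest' into table testtable;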

hive (default)> create external table if not exists employees(
              > name string,
              > salary string,
              > subordinates array<string>,
              > deductions map<string,float>,
              > address struct<street:string,city:string,state:string,zip:int>
              > )
              > row format delimited fields terminated by '\t'
              > collection items terminated by ','
              > map keys terminated by ':'
              > lines terminated by '\n'
              > stored as textfile
              > location '/liguodong/data/'
              > ;
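
Given the delimiters above (fields by '\t', collection items by ',', map keys by ':'), a line of the text data under /liguodong/data/ would look roughly as follows; this is reconstructed from the query output below, with <TAB> standing for a literal tab character:

tony<TAB>1338<TAB>a1,a2,a3<TAB>k1:1.0,k2:2.0,k3:3.0<TAB>s1,s2,s3,4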

Query table data

hive> select * from employees;
OK
tony    1338    ["a1","a2","a3"]        {"k1":1.0,"k2":2.0,"k3":3.0}    {"street":"s1","city":"s2","state":"s3","zip":4}
mark    5453    ["a4","a5","a6"]        {"k4":4.0,"k5":5.0,"k6":6.0}    {"street":"s4","city":"s5","state":"s6","zip":6}
ivy     323     ["a7","a8","a9"]        {"k7":7.0,"k8":8.0,"k9":9.0}    {"street":"s7","city":"s8","state":"s9","zip":9}
Time taken: 10.204 seconds, Fetched: 3 row(s)

Query an array element
hive> select subordinates[1]  from employees;
Total MapReduce CPU Time Spent: 2 seconds 740 msec
OK
a2
a5
a8

Query a map value
hive> select deductions["k2"]  from employees;

OK
2.0
NULL
NULL
Time taken: 75.812 seconds, Fetched: 3 row(s)

Query a struct field
hive> select address.city  from employees;
Total MapReduce CPU Time Spent: 2 seconds 200 msec
OK
s2
s5
s8
Time taken: 75.311 seconds, Fetched: 3 row(s)

Note: select * does not launch a MapReduce job; it is answered with a local fetch only.
Selecting specific columns generates a job and runs MapReduce.

select * from employees;
select * from employees limit 10;
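
By contrast, a query that projects individual columns, such as the sketch below, is compiled into a job and run through MapReduce (as the CPU-time lines in the array/map/struct queries above show):

select name, salary from employees;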

Drop a table

Dropping an internal (managed) table also deletes its data stored in HDFS; dropping an external table only removes the table's metadata and leaves the data in place.

hive (default)> drop table testtable;
OK
Time taken: 10.283 seconds
hive (default)> drop table testexttable;
OK
Time taken: 0.258 seconds

Other ways to create tables in Hive

Create a new table from an existing one

(Only the table structure is copied, not the data.)
create table test3 like test2;
No MapReduce job is needed.

hive> desc testtable;
OK
name                    string                  name value
addr                    string                  addr value
Time taken: 0.267 seconds, Fetched: 2 row(s)
hive> create table testtableNew like testtable;
OK
Time taken: 10.323 seconds
hive> desc testtableNew;
OK
name                    string                  name value
addr                    string                  addr value
Time taken: 0.135 seconds, Fetched: 2 row(s)

Create a table from the result of a query on another table

(The table structure and the data are both copied.)
create table test2 as select name,addr from test1;
A MapReduce job is executed.

hive> create table testNewtable as select name,addr from testtable;
hive> desc testNewtable;
OK
name                    string
addr                    string
Time taken: 0.122 seconds, Fetched: 2 row(s)
hive> select * from testNewtable;
OK
liguodong       cd
aobama  lsj
Time taken: 10.226 seconds, Fetched: 4 row(s)