Baidu Netdisk link: snappy native library for hadoop-2.6.0-cdh5.7.1
https://pan.baidu.com/s/1UNXWFq5_eNyqMAaZGO2VcA
Extraction code: 52tw
1. Download and extract the archive, then place the files under $HADOOP_HOME/lib/native.
Run hadoop checknative -a to check whether the installation succeeded.
2. If every entry still shows false, add export HADOOP_ROOT_LOGGER=DEBUG,console to hadoop-env.sh,
then run hadoop checknative -a again to see the detailed errors.
Fix:
add the following to hadoop-env.sh:
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
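Steps 1 and 2 can be sketched as a shell session. The archive name below is an assumption (use whatever the netdisk download is actually called):

```shell
# Assumed archive name; substitute the real file downloaded from the netdisk.
tar -zxvf hadoop-2.6.0-cdh5.7.1-snappy-native.tar.gz

# Copy the native libraries into Hadoop's native directory,
# preserving symlinks (-d).
cp -d ./native/* $HADOOP_HOME/lib/native/

# Same line as added to hadoop-env.sh: point the JVM at the native directory.
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"

# Verify: the snappy line should now report "true".
hadoop checknative -a
```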
3. With snappy installed, use snappy compression
3.1 First import data with Sqoop:
./sqoop import \
  --connect jdbc:mysql://node2:3306/sqoop \
  --username root --password root \
  --query 'select * from test_1 where $CONDITIONS' \
  -m 1 \
  --fields-terminated-by ',' \
  --target-dir /sqoop/ \
  --hive-import \
  --hive-database tkdw \
  --hive-table wang01
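To confirm the import landed, you can count the rows on the Hive side. The warehouse path below assumes the default Hive warehouse location:

```shell
# Row count in the freshly imported Hive table.
hive -e 'select count(*) from tkdw.wang01;'

# With --hive-import, the staging files under --target-dir are moved
# into the Hive warehouse, so the data ends up here (default location).
hdfs dfs -ls /user/hive/warehouse/tkdw.db/wang01/
```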
3.2 Then create a snappy-compressed table (note that the documented value for orc.compress is uppercase SNAPPY):
create table wang_snappy (id int, name string, age string)
row format delimited fields terminated by ','
stored as orc tblproperties ("orc.compress"="SNAPPY");
3.3 Then insert the data:
insert into table wang_snappy select * from wang01;
3.4 Check and compare the compressed sizes.
[screenshot: size without compression]
[screenshot: size with snappy compression]
[screenshot: size with ORC's default ZLIB compression]
On balance, snappy is the common recommendation, weighing compression/decompression speed, CPU usage, and so on.
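The size comparison in 3.4 can be done with hdfs dfs -du. The warehouse location below assumes the default /user/hive/warehouse:

```shell
# Uncompressed text table (as imported by Sqoop).
hdfs dfs -du -h /user/hive/warehouse/tkdw.db/wang01/

# ORC table with snappy compression.
hdfs dfs -du -h /user/hive/warehouse/tkdw.db/wang_snappy/
```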
4. Enable Hive compression (add to hive-site.xml)
<!-- enable intermediate compression -->
<property>
<name>hive.exec.compress.intermediate</name>
<value>true</value>
</property>
<property>
<name>mapred.map.output.compression.codec</name>
<value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<!-- enable output compression -->
<property>
<name>hive.exec.compress.output</name>
<value>true</value>
</property>
<property>
<name>mapred.output.compression.codec</name>
<value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
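Instead of (or before) editing hive-site.xml, the same four properties can be toggled per session; a minimal sketch:

```shell
# Session-level equivalents of the hive-site.xml properties above.
hive -e "
set hive.exec.compress.intermediate=true;
set mapred.map.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
set hive.exec.compress.output=true;
set mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
-- run the query that should produce compressed output here
"
```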
5. Enable Hadoop compression
Add the following to mapred-site.xml:
<property>
<name>mapreduce.output.fileoutputformat.compress</name>
<value>true</value>
</property>
<property>
<name>mapreduce.output.fileoutputformat.compress.codec</name>
<value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
and in core-site.xml:
<property>
<name>io.compression.codecs</name>
<value>
org.apache.hadoop.io.compress.GzipCodec,
org.apache.hadoop.io.compress.DefaultCodec,
org.apache.hadoop.io.compress.BZip2Codec,
com.hadoop.compression.lzo.LzoCodec,
org.apache.hadoop.io.compress.Lz4Codec,
org.apache.hadoop.io.compress.SnappyCodec
</value>
</property>
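After editing the config files, restart the cluster services and confirm the codec is picked up. The output path and file name below are assumptions based on the Sqoop command earlier:

```shell
# The snappy line must show "true" here.
hadoop checknative -a

# MR/Sqoop output files should now carry the .snappy extension;
# hdfs dfs -text decodes them through the configured codec.
hdfs dfs -ls /sqoop/
hdfs dfs -text /sqoop/part-m-00000.snappy | head
```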
Note: files uploaded with hdfs dfs -put do not go through MapReduce, so they are not compressed; importing into HDFS with Sqoop runs a MapReduce job, so its output is compressed.