CDH 4.3 does not ship a ready-made Parquet package, so to use the Parquet format in Hive or Impala you have to install it manually. When creating a Parquet-format table, you must declare the Parquet-specific InputFormat, OutputFormat, and SerDe explicitly. The CREATE TABLE statement looks like this:
hive> create table parquet_test(x int, y string)
    > row format serde 'parquet.hive.serde.ParquetHiveSerDe'
    > stored as inputformat 'parquet.hive.DeprecatedParquetInputFormat'
    > outputformat 'parquet.hive.DeprecatedParquetOutputFormat';
FAILED: SemanticException [Error 10055]: Output Format must implement HiveOutputFormat, otherwise it should be either IgnoreKeyTextOutputFormat or SequenceFileOutputFormat
The statement fails because the class parquet.hive.DeprecatedParquetOutputFormat is not on Hive's CLASSPATH. The class lives in parquet-hive-1.2.5.jar under $IMPALA_HOME/lib, so creating a symlink to that jar under $HIVE_HOME/lib is enough:
cd $HIVE_HOME/lib
ln -s $IMPALA_HOME/lib/parquet-hive-1.2.5.jar
Submitting the CREATE TABLE statement again produces a different error:
hive> create table parquet_test(x int, y string)
    > row format serde 'parquet.hive.serde.ParquetHiveSerDe'
    > stored as inputformat 'parquet.hive.DeprecatedParquetInputFormat'
    > outputformat 'parquet.hive.DeprecatedParquetOutputFormat';
Exception in thread "main" java.lang.NoClassDefFoundError: parquet/hadoop/api/WriteSupport
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:247)
	at org.apache.hadoop.hive.ql.plan.CreateTableDesc.validate(CreateTableDesc.java:403)
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:8858)
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8190)
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:459)
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:349)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:938)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.ClassNotFoundException: parquet.hadoop.api.WriteSupport
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	... 20 more
This time the cause is a number of missing Parquet-related jar files; downloading them directly into $HIVE_HOME/lib fixes it:
cd /usr/lib/hive/lib
for f in parquet-avro parquet-cascading parquet-column parquet-common parquet-encoding parquet-generator parquet-hadoop parquet-hive parquet-pig parquet-scrooge parquet-test-hadoop2 parquet-thrift
do
  curl -O https://oss.sonatype.org/service/local/repositories/releases/content/com/twitter/${f}/1.2.5/${f}-1.2.5.jar
done
curl -O https://oss.sonatype.org/service/local/repositories/releases/content/com/twitter/parquet-format/1.0.0/parquet-format-1.0.0.jar
With the jars in place, the CREATE TABLE statement goes through normally. Once the table exists, the next step is to load data from other tables into the Parquet-format table. The HQL that performs this conversion also needs the Parquet jars at runtime, and there are two ways to provide them. The first is to run an add jar command for every jar before executing the statement, which is tedious. The second is to configure them once in hive-site.xml:
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/lib/hadoop/lib/parquet-hive-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-hadoop-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-avro-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-cascading-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-column-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-common-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-encoding-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-format-1.0.0.jar,file:///usr/lib/hadoop/lib/parquet-generator-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-scrooge-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-test-hadoop2-1.2.5.jar,file:///usr/lib/hadoop/lib/parquet-thrift-1.2.5.jar</value>
</property>
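For comparison, the per-session alternative mentioned above would look roughly like the following in the Hive CLI. This is only a sketch: it assumes the jars sit under /usr/lib/hive/lib (the download location used earlier), and only a few of the jars are listed for brevity.

```sql
-- Register the Parquet jars for the current Hive session only;
-- these must be repeated in every new session (paths are assumed)
add jar /usr/lib/hive/lib/parquet-hive-1.2.5.jar;
add jar /usr/lib/hive/lib/parquet-hadoop-1.2.5.jar;
add jar /usr/lib/hive/lib/parquet-column-1.2.5.jar;
add jar /usr/lib/hive/lib/parquet-common-1.2.5.jar;
add jar /usr/lib/hive/lib/parquet-encoding-1.2.5.jar;
add jar /usr/lib/hive/lib/parquet-format-1.0.0.jar;
-- ...and so on for the remaining parquet-*.jar files
```

The hive-site.xml approach is clearly preferable when every session needs Parquet support, since the jars are picked up automatically.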
Once the configuration is in place, set the parquet.compression property to choose the compression codec applied during the format conversion; UNCOMPRESSED, GZIP, and SNAPPY are currently supported. The conversion itself is then an ordinary insert...select.
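Putting the pieces together, a conversion session might look like the sketch below. The source table name some_text_table is hypothetical (any existing table with matching columns would do); parquet_test is the Parquet table created earlier.

```sql
-- Pick the codec the Parquet OutputFormat will write with
-- (UNCOMPRESSED, GZIP, or SNAPPY, per the text above)
set parquet.compression=SNAPPY;

-- Rewrite rows from an existing table into Parquet format;
-- some_text_table is an illustrative placeholder
insert overwrite table parquet_test
select x, y from some_text_table;
```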
Reference: http://cmenguy.github.io/blog/2013/10/30/using-hive-with-parquet-format-in-cdh-4-dot-3/