I. Runtime Environment
CentOS 6.5, 64-bit
JDK correctly installed and configured
Hadoop correctly installed and configured
MySQL correctly installed
II. Required Software
apache-hive-1.2.1-bin.tar.gz (download: apache-hive-1.2.1-bin.tar.gz)
mysql-connector-java-5.1.22-bin.jar (download: mysql-connector-java-5.1.22-bin.jar)
III. Installation and Configuration
1. Extract apache-hive-1.2.1-bin.tar.gz
tar -zxvf apache-hive-1.2.1-bin.tar.gz
Then move mysql-connector-java-5.1.22-bin.jar into the apache-hive-1.2.1-bin/lib/ directory:
mv mysql-connector-java-5.1.22-bin.jar apache-hive-1.2.1-bin/lib/
2. Configure environment variables
vim /etc/profile
export HIVE_HOME=/hadoop/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin
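To make the new variables take effect in the current shell, run source /etc/profile. The net effect is equivalent to exporting them directly, which also gives a quick sanity check:

```shell
# Same exports as added to /etc/profile; echo to confirm they are set.
export HIVE_HOME=/hadoop/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin
echo "$HIVE_HOME"   # prints /hadoop/apache-hive-1.2.1-bin
```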
3. Modify the Hive configuration (files in apache-hive-1.2.1-bin/conf/, except hive-config.sh, which is in bin/)
- hive-config.sh
# vim /hadoop/apache-hive-1.2.1-bin/bin/hive-config.sh
export JAVA_HOME=/usr/java/jdk1.7.0_71
export HIVE_HOME=/hadoop/apache-hive-1.2.1-bin
export HADOOP_HOME=/hadoop/hadoop-2.6.2
- cp hive-env.sh.template hive-env.sh
- vim hive-env.sh
HADOOP_HOME=/hadoop/hadoop-2.6.2
export HIVE_CONF_DIR=/hadoop/apache-hive-1.2.1-bin/conf
- hive-site.xml
#cp hive-default.xml.template hive-site.xml
#vim hive-site.xml
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://<remote host IP, or localhost>:3306/hive</value>
<!-- This value is the only difference between local and remote mode -->
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>your database username</value>
<description>Username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>your database password</value>
<description>password to use against metastore database</description>
</property>
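Filled in for remote mode, the four connection properties might look like the following; the IP address is a placeholder, and the hive/hive credentials match the MySQL setup in the appendix:

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.1.100:3306/hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
```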
# Without the following settings, errors may occur.
<property>
<name>hive.exec.local.scratchdir</name>
<value>a directory of your choice</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>a directory of your choice</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
<name>hive.querylog.location</name>
<value>a directory of your choice</value>
<description>Location of Hive run time structured log file</description>
</property>
<property>
<name>hive.server2.logging.operation.log.location</name>
<value>a directory of your choice/operation_logs</value>
<description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>
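The "directory of your choice" values must point at existing directories writable by the user running Hive, or the errors mentioned above can still appear. A minimal sketch, using /tmp/hive-tmp as an illustrative base (substitute your own path in hive-site.xml):

```shell
# Create the local directories referenced by the four properties above.
# /tmp/hive-tmp and the subdirectory names are illustrative only.
BASE=/tmp/hive-tmp
mkdir -p "$BASE/scratch" "$BASE/resources" "$BASE/querylog" "$BASE/operation_logs"
ls "$BASE"
```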
4. Startup
Start Hadoop:
start-dfs.sh
start-yarn.sh
Start Hive:
hive
IV. Common Errors
Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
The cause is an old version of jline in the Hadoop tree:
/hadoop-2.6.2/share/hadoop/yarn/lib: jline-0.9.94.jar
The fix:
Copy the newer jline JAR shipped with Hive into that Hadoop directory (/hadoop-2.6.2/share/hadoop/yarn/lib); you may also need to remove or rename jline-0.9.94.jar there so the old version is not loaded first:
cp /hadoop/apache-hive-1.2.1-bin/lib/jline-2.12.jar ./
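An alternative that avoids copying JARs around, documented in Hive's Getting Started guide for this jline conflict, is to tell the hadoop launcher scripts to put user classpath entries (including Hive's newer jline) ahead of Hadoop's own:

```shell
# Make the hadoop scripts prepend user-supplied JARs to the classpath,
# so Hive's jline-2.12 wins over Hadoop's bundled jline-0.9.94.
export HADOOP_USER_CLASSPATH_FIRST=true
echo "$HADOOP_USER_CLASSPATH_FIRST"
```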
V. Appendix: Configuring MySQL
Install MySQL:
[root@server1 ~]# yum install mysql mysql-server
Answer yes at each prompt.
Add the MySQL service:
[root@server1 ~]# /sbin/chkconfig --add mysqld
Start MySQL:
[root@server1 ~]# service mysqld start
Starting mysqld: [ OK ]
Log in to MySQL locally as root:
[root@server1 ~]# mysql -u root
The welcome banner appears and you are in the MySQL monitor.
Create the hive database:
mysql> CREATE DATABASE hive;
Create the hive user:
mysql> CREATE USER 'hive' IDENTIFIED BY 'hive';
Grant the hive user the necessary access and read/write privileges:
mysql> GRANT ALL ON hive.* TO hive@localhost;
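Note that hive@localhost only covers connections from the local machine. If hive-site.xml uses the machine's IP address in the JDBC URL (remote mode), MySQL treats that as a different host, so an additional grant is needed; a sketch, reusing the hive/hive credentials from above:

```sql
-- Allow the hive user to connect from any host (remote metastore access).
GRANT ALL ON hive.* TO 'hive'@'%' IDENTIFIED BY 'hive';
FLUSH PRIVILEGES;
```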