1. Install CentOS in a virtual machine
ISO download address: http://vault.centos.org/6.4/isos/x86_64/
Download the two images marked in the figure above; the first one is enough for the installation.
The installation process itself is omitted here; you can follow this document:
http://pan.baidu.com/s/1dDowIjv
My environment is:
[root@hadoop1 target]# uname -a
Linux hadoop1 2.6.32-358.el6.x86_64 #1 SMP Fri Feb 22 00:31:26 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux
2. Install the JDK -- jdk-7u67-linux-x64.tar.gz
Hadoop is written in Java, so the JDK is required to compile it.
The version installed here is jdk-7u67-linux-x64.tar.gz.
The JDK archive can be downloaded from this share:
http://pan.baidu.com/s/1jGqUupw
Installation:
Step 1: tar -zxvf jdk-7u67-linux-x64.tar.gz (extract under /usr/local so the path matches JAVA_HOME below)
Step 2: mv jdk1.7.0_67 jdk1.7
Step 3: vi /etc/profile
Append the following:
export JAVA_HOME=/usr/local/jdk1.7
export PATH=.:$PATH:$JAVA_HOME/bin
Save, then reload it: source /etc/profile
Step 4: java -version
[root@hadoop1 soft]# java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
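The two export lines above can be sanity-checked without logging out again. A minimal sketch (paths assumed from the profile snippet above) showing that the appended entry resolves to the JDK's bin directory:

```shell
# mirror the /etc/profile lines and inspect the last PATH entry
JAVA_HOME=/usr/local/jdk1.7
PATH=.:$PATH:$JAVA_HOME/bin
echo "$PATH" | tr ':' '\n' | tail -n 1
```

This should print /usr/local/jdk1.7/bin, which is where the java and javac launchers live.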
3. Install Maven -- apache-maven-3.0.5-bin.tar.gz
Hadoop 2 uses Maven to manage the project, so Maven is required to build the source.
Maven download address: http://pan.baidu.com/s/1bnlcZeR
You can of course also download it from the official site.
Installation steps:
Extract and rename (under /usr/local):
tar -zxvf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 maven
vi /etc/profile
Append the following (note the /bin suffix -- the mvn launcher lives in maven/bin):
export MAVEN_HOME=/usr/local/maven
export PATH=.:$PATH:$MAVEN_HOME/bin
source /etc/profile
Verify: mvn -version
[root@hadoop1 soft]# mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 21:51:28+0800)
Maven home: /usr/local/maven
Java version: 1.7.0_67, vendor: Oracle Corporation
Java home: /usr/local/jdk1.7/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"
4. Install FindBugs -- findbugs-3.0.0.tar.gz
FindBugs is only needed to generate the documentation; if you do not need the docs build, you can skip this step.
Download address:
http://pan.baidu.com/s/1pJr0dR1
Installation steps:
tar -zxvf findbugs-3.0.0.tar.gz
mv findbugs-3.0.0 findbugs
Configure the environment variables:
vi /etc/profile
export FINDBUGS_HOME=/usr/local/findbugs
export PATH=.:$PATH:$FINDBUGS_HOME/bin
source /etc/profile
Verify: findbugs -version
[root@hadoop1 soft]# findbugs -version
3.0.0
5. Install protoc
Hadoop uses Protocol Buffers for its RPC communication, so protoc is required.
Official site: https://code.google.com/p/protobuf/downloads/list
My download address: http://pan.baidu.com/s/1zshyA
Before building protoc, install the following tools.
Prerequisite: the CentOS VM must have network access.
yum install gcc
yum install gcc-c++
yum install make
Build steps:
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc
make
make install
As long as none of these steps reports an error, the build is done.
The compiled files are installed under /usr/local/protoc.
Configure the environment variables in /etc/profile (note the /bin suffix -- the protoc binary is installed to protoc/bin):
export PROTOC_HOME=/usr/local/protoc
export PATH=.:$PATH:$PROTOC_HOME/bin
source /etc/profile
Verify:
[root@hadoop1 soft]# protoc --version
libprotoc 2.5.0
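The --prefix option explains why PATH needs the bin subdirectory: make install places the compiler in $prefix/bin, headers in $prefix/include, and libraries in $prefix/lib. A minimal sketch simulating that layout in a temporary directory (not the real install):

```shell
# simulate the directory tree `make install` creates under --prefix
prefix=$(mktemp -d)
mkdir -p "$prefix/bin" "$prefix/include" "$prefix/lib"
touch "$prefix/bin/protoc"   # the binary your PATH entry must be able to reach
ls "$prefix"
```

Listing the prefix shows the bin, include, and lib subdirectories; only bin belongs on PATH.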
6. Install the other dependencies
yum install cmake
yum install openssl-devel
yum install ncurses-devel
7. Build the source
Download the Hadoop 2.2 source from the official site:
http://apache.fayea.com/apache-mirror/hadoop/common/hadoop-2.2.0/
or from the share below:
http://pan.baidu.com/s/1dD6DC2l
Steps:
tar -zxvf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
First, fix one known bug.
In the hadoop-2.2.0-src tree, edit this file:
/usr/local/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml
At line 55, add the following dependency:
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
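An edit like this can also be scripted with a line-addressed sed insert. A minimal sketch on a scratch file (not the real pom.xml; GNU sed syntax, as shipped with CentOS):

```shell
# demo of inserting a line at a fixed line number, the same technique
# you would use to add the dependency at line 55 of pom.xml
printf 'line1\nline2\nline3\n' > demo.txt
sed -i '2i inserted-here' demo.txt   # insert before line 2
cat demo.txt
```

After the edit, demo.txt contains line1, inserted-here, line2, line3; the same `55i ...` address would place text before line 55 of the real file.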
Then build:
cd hadoop-2.2.0-src
mvn package -DskipTests -Pdist,native,docs
If you did not install FindBugs for documentation generation, drop docs from the profile list.
Because Maven has to download the required jars from the network, this command takes a long time to run.
If the console finally prints output like the following:
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [2.625s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.623s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [3.604s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.365s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.368s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.922s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [13.938s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.717s]
[INFO] Apache Hadoop Common .............................. SUCCESS [5:56.983s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [12.786s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.097s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [10:03.524s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [39.430s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [13.182s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [6.215s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.179s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.392s]
[INFO] hadoop-yarn-api ................................... SUCCESS [57.293s]
[INFO] hadoop-yarn-common ................................ SUCCESS [38.550s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.608s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.493s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [17.368s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [3.668s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [14.096s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.432s]
[INFO] hadoop-yarn-client ................................ SUCCESS [5.831s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.140s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [3.396s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.148s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [30.974s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [4.917s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.177s]
[INFO] hadoop-yarn-project ............................... SUCCESS [5.514s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.587s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.728s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [12.216s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.015s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [4.927s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.892s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [7.238s]
[INFO] hadoop-mapreduce .................................. SUCCESS [2.417s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [6.014s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [9.235s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.558s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [7.183s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [5.011s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [3.364s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [3.617s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [3.970s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.288s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.060s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [41.536s]
[INFO] Apache Hadoop Client .............................. SUCCESS [6.460s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.586s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:42.823s
[INFO] Finished at: Fri Aug 08 07:08:01 CST 2014
[INFO] Final Memory: 71M/239M
[INFO] ------------------------------------------------------------------------
The keyword BUILD SUCCESS indicates that the build completed successfully.
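For a build that runs this long, it helps to capture the console output to a log and grep for the marker afterwards (a real run would pipe through something like `mvn package ... 2>&1 | tee build.log`). A sketch with a stand-in log file:

```shell
# stand-in for a captured Maven log; a real run would use `tee build.log`
printf '[INFO] BUILD SUCCESS\n' > build.log
grep -q 'BUILD SUCCESS' build.log && echo "build ok"
```

The grep exits with status 0 only when the marker is present, so it also works in scripts that should abort on a failed build.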
The compiled output is placed in the target directories; the item marked in the figure above is the built Hadoop 2.2 (the complete distribution ends up under hadoop-dist/target).