xgboost Distributed Deployment Tutorial

xgboost is an excellent open-source tool for gradient boosting. Thanks to a number of numerical and non-numerical algorithmic optimizations (see XGBoost: A Scalable Tree Boosting System), it is remarkably fast: on a dataset where Spark needed 10 hours to train a GBDT (Gradient Boosting Decision Tree) model, xgboost finished in 10 minutes using half the cluster resources. For various reasons, deploying xgboost on our Hadoop environment took me over a month, during which I asked many questions on the xgboost issue tracker and even helped the authors track down bugs. This tutorial documents the deployment process for colleagues who need it.
Notes:

  • This tutorial deploys xgboost on a cluster where gcc is older than 4.8 and no native libhdfs binaries are available, so it covers the vast majority of problems you may hit during deployment.
  • Because the current head of the code has not been tested, this tutorial pins specific versions of the code.
  • All files that xgboost depends on at runtime are collected in the xgboost-packages directory, so redeploying only requires copying it with scp -r xgboost-packages into the ${HOME} directory of the target machine.

Obtain a specific version of xgboost

  • From GitHub, git clone xgboost together with its dmlc-core and rabit submodules:
    git clone --recursive https://github.com/dmlc/xgboost
  • In the xgboost directory, check out commit 76c320e9f0db7cf4aed73593ddcb4e0be0673810:
    git checkout 76c320e9f0db7cf4aed73593ddcb4e0be0673810
  • In the dmlc-core directory, check out commit 706f4d477a48fc75cb46b226ea007fbac862f9c2:
    git checkout 706f4d477a48fc75cb46b226ea007fbac862f9c2
  • In the rabit directory, check out commit 112d866dc92354304c0891500374fe40cdf13a50:
    git checkout 112d866dc92354304c0891500374fe40cdf13a50
  • Create the xgboost-packages directory under ${HOME} and copy xgboost into it:
  mkdir xgboost-packages
  cp -r xgboost xgboost-packages/
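Checking out a raw SHA like the ones above leaves git in a detached-HEAD state, which is exactly what we want for a pinned build. A minimal local sketch of that operation on a throwaway repository (the repo and commit messages here are placeholders, not part of the real xgboost tree):

```shell
# Create a throwaway repo with two commits, then pin to the first one,
# mirroring how the tutorial pins xgboost/dmlc-core/rabit to fixed SHAs.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "first"
first=$(git rev-parse HEAD)
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "second"
git checkout -q "$first"   # detached HEAD at the pinned commit
git rev-parse HEAD         # prints the same SHA stored in $first
```

On a real clone git prints a detached-HEAD warning; that is expected, since we only build from this tree and never commit to it.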

Install build dependencies

Install gcc-4.8.2

  cd gcc-4.8.2
  # download the gmp/mpfr/mpc prerequisites needed to build gcc
  ./contrib/download_prerequisites
  cd ..
  • Create a separate directory for the build output:
    mkdir gcc-build-4.8.2
  • Enter it and run configure to generate the makefiles (installing under ${HOME}):
cd gcc-build-4.8.2
../gcc-4.8.2/configure --enable-checking=release --enable-languages=c,c++ --disable-multilib --prefix=${HOME}
  • Build:
    make -j21
  • Install:
    make install
  • Switch the default gcc by updating PATH, and save the new runtime libraries into xgboost-packages:
export PATH=$HOME/bin:$PATH
cp -r ~/lib64 ~/xgboost-packages
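Since lib64 is copied above so it can later be shipped to the workers, the gateway machine itself also needs the new libstdc++ at run time. A sketch of the environment block you would append to ~/.bashrc to make the switch permanent, plus a quick check that the user-local gcc now wins the PATH lookup:

```shell
# Resolve tools from the user-local install first.
export PATH="$HOME/bin:$PATH"
# Find the gcc-4.8.2 runtime (libstdc++, libgcc_s) before the system copies;
# the ${VAR:+...} form avoids a dangling ':' when LD_LIBRARY_PATH is unset.
export LD_LIBRARY_PATH="$HOME/lib64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# Sanity check: the first PATH entry should now be $HOME/bin.
echo "${PATH%%:*}"
```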

Install cmake

tar -zxf cmake-3.5.2.tar.gz
cd cmake-3.5.2
./bootstrap --prefix=${HOME}
make -j21
make install

Download and build libhdfs

unzip hadoop-common-cdh5-2.6.0_5.5.0.zip
cd hadoop-common-cdh5-2.6.0_5.5.0/hadoop-hdfs-project/hadoop-hdfs/src
cmake -DGENERATED_JAVAH=/opt/jdk1.8.0_60 -DJAVA_HOME=/opt/jdk1.8.0_60 .
make
# copy the built libraries into xgboost-packages
cp -r target/usr/local/lib ${HOME}/xgboost-packages/libhdfs

Install xgboost

  • Build:
cd ${HOME}/xgboost-packages/xgboost
cp make/config.mk ./
# edit config.mk to enable HDFS support:
# whether use HDFS support during compile
USE_HDFS = 1
HADOOP_HOME = /usr/lib/hadoop
HDFS_LIB_PATH = $(HOME)/xgboost-packages/libhdfs
# build
make -j22
  • Patch the interpreter lines (not needed if env python resolves to version 2.7 or newer):
# change the first line of dmlc_yarn.py to
#!/usr/bin/python2.7
# change the first line of run_hdfs_prog.py to
#!/usr/bin/python2.7
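The shebang change can be scripted with sed instead of editing by hand. The sketch below demonstrates it on a scratch file; on the real tree you would point it at dmlc-core/tracker/dmlc_yarn.py and dmlc-core/yarn/run_hdfs_prog.py (the file name here is just a placeholder):

```shell
# Create a scratch script whose shebang uses 'env python' ...
cat > /tmp/dmlc_yarn_demo.py <<'EOF'
#!/usr/bin/env python
print("hello")
EOF
# ... and rewrite line 1 to an explicit python2.7 interpreter path.
sed -i '1s|.*|#!/usr/bin/python2.7|' /tmp/dmlc_yarn_demo.py
head -n 1 /tmp/dmlc_yarn_demo.py   # now: #!/usr/bin/python2.7
```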
  • Test:
# add the required training parameters
cd ${HOME}/xgboost-packages/xgboost/demo/distributed-training
echo -e "booster = gbtree\nobjective = binary:logistic\nsave_period = 0\neval_train = 1" > mushroom.hadoop.conf
# test script: run_yarn.sh
#!/bin/bash
if [ "$#" -lt 2 ]; then
        echo "Usage: $0 <nworkers> <nthreads>"
        exit 1
fi

# put the local training files onto HDFS (uncomment on the first run)
DATA_DIR="/user/`whoami`/xgboost-dist-test"
#hadoop fs -test -d ${DATA_DIR} && hadoop fs -rm -r ${DATA_DIR}
#hadoop fs -mkdir ${DATA_DIR}
#hadoop fs -put ../data/agaricus.txt.train ${DATA_DIR}
#hadoop fs -put ../data/agaricus.txt.test ${DATA_DIR}

# necessary env
export LD_LIBRARY_PATH=${HOME}/xgboost-packages/lib64:$JAVA_HOME/jre/lib/amd64/server:${HOME}/xgboost-packages/libhdfs:$LD_LIBRARY_PATH
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-yarn
export HADOOP_YARN_HOME=$HADOOP_MAPRED_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

# running rabit, pass address in hdfs
../../dmlc-core/tracker/dmlc_yarn.py -n $1 --vcores $2 \
    --ship-libcxx ${HOME}/xgboost-packages/lib64 \
    -q root.machinelearning \
    -f ${HOME}/xgboost-packages/libhdfs/libhdfs.so.0.0.0 \
    ../../xgboost mushroom.hadoop.conf nthread=$2 \
    data=hdfs://ss-hadoop${DATA_DIR}/agaricus.txt.train \
    eval[test]=hdfs://ss-hadoop${DATA_DIR}/agaricus.txt.test \
    eta=1.0 \
    max_depth=3 \
    num_round=3 \
    model_out=hdfs://ss-hadoop/tmp/mushroom.final.model

# get the final model file
hadoop fs -get /tmp/mushroom.final.model final.model

# use dmlc-core/yarn/run_hdfs_prog.py to set up the appropriate env

# output prediction task=pred
#../../xgboost.dmlc mushroom.hadoop.conf task=pred model_in=final.model test:data=../data/agaricus.txt.test
#../../dmlc-core/yarn/run_hdfs_prog.py ../../xgboost mushroom.hadoop.conf task=pred model_in=final.model test:data=../data/agaricus.txt.test
# print the boosters of final.model in dump.raw.txt
#../../xgboost.dmlc mushroom.hadoop.conf task=dump model_in=final.model name_dump=dump.raw.txt
#../../dmlc-core/yarn/run_hdfs_prog.py ../../xgboost mushroom.hadoop.conf task=dump model_in=final.model name_dump=dump.raw.txt
# use the feature map in printing for better visualization
#../../xgboost.dmlc mushroom.hadoop.conf task=dump model_in=final.model fmap=../data/featmap.txt name_dump=dump.nice.txt
../../dmlc-core/yarn/run_hdfs_prog.py ../../xgboost mushroom.hadoop.conf task=dump model_in=final.model fmap=../data/featmap.txt name_dump=dump.nice.txt
cat dump.nice.txt
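Save the script above as run_yarn.sh and invoke it as, for example, bash run_yarn.sh 4 2 (4 workers, 2 vcores/threads each). The argument guard at the top of the script can be sketched in isolation as a function:

```shell
# Standalone version of run_yarn.sh's argument check: require
# <nworkers> and <nthreads>, fail with a usage message otherwise.
check_args() {
    if [ "$#" -lt 2 ]; then
        echo "Usage: run_yarn.sh <nworkers> <nthreads>" >&2
        return 1
    fi
    echo "workers=$1 threads=$2"
}
check_args 4 2   # prints: workers=4 threads=2
```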
  • Results
    (screenshot of the successful run omitted)

References

  1. https://github.com/dmlc/xgboost
  2. https://github.com/dmlc/xgboost/issues/854
  3. https://github.com/dmlc/xgboost/issues/856
  4. https://github.com/dmlc/xgboost/issues/861
  5. https://github.com/dmlc/xgboost/issues/866
  6. https://github.com/dmlc/xgboost/issues/869
  7. https://github.com/dmlc/xgboost/issues/1150
  8. http://arxiv.org/pdf/1603.02754v1.pdf