Environment variables currently configured in my /etc/profile:
export JAVA_HOME=/opt/java/jdk1.8.0_131
export SPARK_HOME=/opt/spark-2.4.4-bin-hadoop2.7
export HIVE_HOME=/usr/hdp/current/hive-client
export LIVY_HOME=/opt/livy/livy-0.5.0-incubating-bin
export HBASE_HOME=/opt/hbase-2.2.1
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export HADOOP_HOME=/opt/hadoop-2.7.7
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_USER_NAME=hdfs
export HADOOP_HDFS_HOME=$HADOOP_HOME
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/usr/local/go/bin
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$SPARK_HOME/bin:$LIVY_HOME/bin:$PATH
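Before relying on this profile, it is worth checking that every `*_HOME` directory actually exists, since a typo in one path silently breaks the tools that depend on it. A minimal sketch (paths copied from the profile above; adjust them to your own install locations):

```shell
#!/bin/sh
# Sketch: report whether each *_HOME directory from /etc/profile exists.
check_home() {
  # $1 = variable name, $2 = path it points to
  if [ -d "$2" ]; then
    echo "OK: $1=$2"
  else
    echo "MISSING: $1=$2"
  fi
}

check_home JAVA_HOME  /opt/java/jdk1.8.0_131
check_home HADOOP_HOME /opt/hadoop-2.7.7
check_home SPARK_HOME /opt/spark-2.4.4-bin-hadoop2.7
```

Run it once after `source /etc/profile`; any `MISSING` line points at a path to fix before testing Hadoop itself.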
After finishing the environment-variable setup I started testing, and ran into an error with the command below:
hadoop fs -ls /   # list files on HDFS
# Running `hadoop classpath` shows which jars Hadoop currently has on its classpath
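To see what a classpath wildcard entry (such as `client/*`) actually resolves to, you can expand each entry and count the jars behind it. A small sketch (the example classpath string below is hypothetical; in practice feed it `"$(hadoop classpath)"`):

```shell
#!/bin/sh
# Sketch: split a ':'-separated classpath and, for wildcard entries
# ending in '/*', report how many .jar files they actually match.
count_jars() {
  cp="$1"
  old_ifs="$IFS"; IFS=':'
  set -f                       # disable globbing while splitting on ':'
  for entry in $cp; do
    case "$entry" in
      */\*)                    # entry ends in '/*' -> count jars in that dir
        dir=${entry%/\*}
        n=0
        set +f
        for j in "$dir"/*.jar; do [ -e "$j" ] && n=$((n+1)); done
        set -f
        echo "$entry -> $n jar(s)"
        ;;
      *) echo "$entry" ;;
    esac
  done
  set +f
  IFS="$old_ifs"
}

# Hypothetical example; real usage: count_jars "$(hadoop classpath)"
count_jars "/opt/hadoop-2.7.7/etc/hadoop:/opt/hadoop-2.7.7/client/*"
```

An entry that reports `0 jar(s)` is a strong hint that the classpath points at an empty or wrong directory.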
The fix is as follows:
At the end of libexec/hadoop-config.sh, add this line:
CLASSPATH=${CLASSPATH}:'/opt/hadoop-2.7.7/client/*'
Then run hadoop fs -ls / again to confirm it works.
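If you apply this fix on several machines, it helps to script the edit so the line is appended only once, not duplicated on every run. A sketch, assuming the same hadoop-config.sh path as above:

```shell
#!/bin/sh
# Sketch: idempotently append the CLASSPATH line to hadoop-config.sh.
# Path assumed from the setup above; adjust to your HADOOP_HOME.
CONF=/opt/hadoop-2.7.7/libexec/hadoop-config.sh
LINE="CLASSPATH=\${CLASSPATH}:'/opt/hadoop-2.7.7/client/*'"

append_once() {
  # append $2 to file $1 only if that exact line is not already present
  file="$1"; line="$2"
  grep -qF "$line" "$file" || printf '%s\n' "$line" >> "$file"
}

if [ -f "$CONF" ]; then
  append_once "$CONF" "$LINE"
else
  echo "not found: $CONF (adjust the HADOOP_HOME path)"
fi
```

Re-running the script is safe: `grep -qF` matches the literal line, so it is never added twice.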