java.lang.NoClassDefFoundError: org/codehaus/jackson/map/JsonMappingException
Searching with this as the keyword turned up a solution (nothing is luckier than finding an article in Chinese):
http://www.maplef.net/blog/archives/eclipse-hadoop-plugin.html
The article's author explains that this is a problem with 0.20.203; after switching everything to 0.20.2, connecting to HDFS works.
However, the author also points out that "even with 0.20.2, there is still no way to run MapReduce programs directly through Hadoop; they have to be run manually," using a Makefile like this one:
JarFile="wordcount-0.1.jar"
MainFunc="net.maplef.hadoop.wordcount.WordCountDriver"
LocalOutDir="/tmp/output"

all: help

jar:
	jar -cvf ${JarFile} -C bin/ .

input:
	hadoop fs -rmr input
	hadoop fs -put ${input} input

run:
	hadoop jar ${JarFile} ${MainFunc} input output

clean:
	hadoop fs -rmr output

output:
	rm -rf ${LocalOutDir}
	hadoop fs -get output ${LocalOutDir}
	gedit ${LocalOutDir}/part-r-00000 &

help:
	@echo "Usage:"
	@echo " make jar - Build Jar File."
	@echo " make input input=yourinputpath - Set up Input Path"
	@echo " make clean - Clean up Output directory on HDFS."
	@echo " make run - Run your MapReduce code on Hadoop."
	@echo " make output - Download and show output file"
	@echo " make help - Show Makefile options."
	@echo " "
	@echo "Example:"
	@echo " make jar; make input input=/home/username/hadoop-test/input/*; make run; make output; make clean"
Finally, a small pat on my own back: few things in life go smoothly on the first try, but only by persevering do you have a chance of finding the way. Compared with before, when my enthusiasm would fade after three days, this counts as progress.