Hadoop MapReduce: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text error

java.lang.Exception: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text


(Hadoop's input key of type LongWritable cannot be cast to Text)


When you read a file with an M/R program, the input key of your mapper is the position of the line in the file (its byte offset), while the input value is the full line. So what is happening here is that you are declaring the line offset as a Text object, which is wrong; you need a LongWritable instead so that Hadoop does not complain about the type.


When you read a file with a MapReduce program, the Mapper's input key is the offset of the current line in the file, and the entire contents of that line are passed in as the Mapper's input value. Therefore the first type parameter of your Mapper class (the input key type) must not be declared as Text; declare it as LongWritable (or Object) instead, for example as shown below.
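
The sketch below shows a minimal word-count style Mapper using the new org.apache.hadoop.mapreduce API, with the input key declared as LongWritable so it matches what TextInputFormat actually supplies. The class name WordCountMapper and the tokenizing logic are illustrative assumptions, not taken from the original post.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical example class: the input key type is LongWritable, NOT Text.
// Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT>
public class WordCountMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // key   = byte offset of the current line within the input split
        // value = the full text of the current line
        for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
                word.set(token);
                context.write(word, ONE);
            }
        }
    }
}
```

If you declare the input key as Object instead (as many WordCount examples do), the same code still works, because the framework passes in a LongWritable at runtime; what it cannot do is cast that LongWritable to Text, which is exactly the exception shown above.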
