1. org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /user/bjdata/user/wuyb/semv/SemAAJob_3/calsigma/_temporary/_attempt_201306261152_42270_r_000000_0/part-r-00000 for DFSClient_attempt_201306261152_42270_r_000000_0 on client 10.0.2.79 because current leaseholder is trying to recreate file.
Fix:
MultipleOutputs.addNamedOutput(job, CalSigmaReducer.RECORD_OUT, MultipleTextOutputFormat.class, Text.class, NullWritable.class);
The named output had originally been registered with TextOutputFormat; switching it to MultipleTextOutputFormat resolved the error.
2. When using a Thrift-generated class: java.lang.IllegalStateException: Runtime parameterized Protobuf/Thrift class is unkonwn. This object was probably created with default constructor. Please use setConverter(Class).
MultipleOutputs.addNamedOutput(job, CalSigmaReducer.RECORD_OUT2, MultipleSequenceFileOutputFormat.class, ThriftWritable.class, NullWritable.class);
The intent was to use SequenceFileOutputFormat as the output format with ThriftWritable as the key type. But Hadoop instantiates the key class through its default constructor, while ThriftWritable requires a wrapped type to be set after construction. That never happens on Hadoop's instantiation path, so Hadoop ends up with a ThriftWritable object that has no wrapped type, which triggers the exception above.
Fix: wrote a utility class that converts the Thrift-generated object to a String (so as not to modify the Thrift-generated classes themselves).
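The source does not show that utility, but one minimal sketch of the idea, assuming libthrift is on the classpath, is to render the Thrift object as JSON with TSerializer and TSimpleJSONProtocol. The class name ThriftToString here is hypothetical, not from the original code:

```java
import org.apache.thrift.TBase;
import org.apache.thrift.TException;
import org.apache.thrift.TSerializer;
import org.apache.thrift.protocol.TSimpleJSONProtocol;

// Hypothetical helper: renders any Thrift-generated object as a JSON string,
// so it can be written through a plain Text output instead of ThriftWritable.
public final class ThriftToString {
    private ThriftToString() {}

    public static String toJson(TBase<?, ?> record) throws TException {
        // TSerializer is not thread-safe, so create one per call (or per thread).
        TSerializer serializer = new TSerializer(new TSimpleJSONProtocol.Factory());
        return serializer.toString(record);
    }
}
```

The reducer would then emit ThriftToString.toJson(record) as a Text key, sidestepping the default-constructor problem entirely.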
3. Forgot to close the MultipleOutputs instance after use, and then found that some output files had no content.
Fix: call mos.close() in cleanup().
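A minimal sketch of that pattern, assuming the standard org.apache.hadoop.mapreduce.lib.output.MultipleOutputs API; the reducer body and type parameters are illustrative, not the original job's code:

```java
import java.io.IOException;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

// Illustrative reducer showing the setup/write/close lifecycle of MultipleOutputs.
public class ExampleReducer extends Reducer<Text, Text, Text, NullWritable> {
    public static final String RECORD_OUT = "record";
    private MultipleOutputs<Text, NullWritable> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<Text, NullWritable>(context);
    }

    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        for (Text value : values) {
            mos.write(RECORD_OUT, value, NullWritable.get());
        }
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        // Without this close() the named-output writers are never flushed,
        // which is why some part files came out empty.
        mos.close();
    }
}
```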