When running a MapReduce job on Hadoop, the following errors are thrown:
java.io.IOException: All datanodes xxx.xxx.xxx.xxx:xxx are bad. Aborting…
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.processDatanodeError(DFSClient.java:2158)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1400(DFSClient.java:1735)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1889)
java.io.IOException: Could not get block locations. Aborting…
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.processDatanodeError(DFSClient.java:2143)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1400(DFSClient.java:1735)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1889)
Investigation showed that the root cause was the Linux machines running out of open file descriptors.
Running the command ulimit -n shows that the default open-file limit on Linux is 1024.
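For reference, a quick way to check the limit actually in effect (the pid below is a placeholder for a running datanode's process id, assuming the daemon runs as the hadoop user):

ulimit -n                            # open-file limit of the current shell
su - hadoop -c 'ulimit -n'           # limit a fresh hadoop login session would get
grep 'open files' /proc/pid/limits   # limit of a running process, on newer kernels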
To fix it, edit /etc/security/limits.conf
and add an entry such as hadoop soft nofile 65535 to raise the limit for the hadoop user
(other related settings mentioned online can be applied at the same time).
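For example, the entries in /etc/security/limits.conf could look like the following (the hard line is an extra precaution beyond the single soft entry above; the format is domain type item value, where nofile is the item for the maximum number of open files):

hadoop  soft  nofile  65535
hadoop  hard  nofile  65535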
Then rerun the job (ideally, make the same change on all of the datanodes).
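Note that limits.conf is applied by PAM at login, so the new value only takes effect in a fresh login session. One way to verify it and restart a datanode afterwards (the script path is an assumption and depends on the install layout):

su - hadoop                                   # start a fresh login session
ulimit -n                                     # should now print 65535
$HADOOP_HOME/bin/hadoop-daemon.sh stop datanode    # restart so the daemon
$HADOOP_HOME/bin/hadoop-daemon.sh start datanode   # picks up the new limit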
Problem solved.