Hadoop file upload error

```
19/06/06 16:09:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.56.120:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1456)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
19/06/06 16:09:26 INFO hdfs.DFSClient: Abandoning BP-1551374179-192.168.56.119-1559807706695:blk_1073741825_1001
19/06/06 16:09:26 INFO hdfs.DFSClient: Excluding datanode 192.168.56.120:50010
19/06/06 16:09:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.56.121:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1456)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
19/06/06 16:09:26 INFO hdfs.DFSClient: Abandoning BP-1551374179-192.168.56.119-1559807706695:blk_1073741826_1002
19/06/06 16:09:26 INFO hdfs.DFSClient: Excluding datanode 192.168.56.121:50010
```

The upload fails with `Bad connect ack with firstBadLink`, which means the client could not reach the DataNodes on the worker nodes. This is a connectivity problem, not anything to do with file size. A failed connection like this usually points to a firewall that was never turned off. Checking confirmed the firewall was still running on the nodes, which is why the connections failed. After shutting it down and re-uploading, the transfer succeeded.
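Before changing anything, you can confirm the firewall is the culprit by probing the DataNode data-transfer port from the client (50010 is the default in Hadoop 2.x). A minimal sketch, assuming `nc` is installed and using the node IPs from the log above:

```bash
# Probe the DataNode transfer port on each worker node.
# A timeout or "connection refused" here, while the DataNode
# process is running, points to a firewall in the way.
for node in 192.168.56.120 192.168.56.121; do
    # -z: connect without sending data; -w 3: 3-second timeout
    nc -z -w 3 "$node" 50010 && echo "$node: open" || echo "$node: blocked"
done
```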

Check the firewall status on CentOS 7:

```bash
firewall-cmd --state
```

Stop the firewall:

```bash
systemctl stop firewalld.service
```

Disable the firewall so it no longer starts on boot:

```bash
systemctl disable firewalld.service
```
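These commands only affect the machine they run on, and the log shows two different DataNodes rejecting connections, so the firewall has to be stopped on every node in the cluster. A quick way to do that, assuming passwordless SSH as root and the same IPs as above:

```bash
# Stop and disable firewalld on every cluster node.
# Assumes root SSH access; adjust the host list to your cluster.
for node in 192.168.56.119 192.168.56.120 192.168.56.121; do
    ssh "root@$node" "systemctl stop firewalld.service && systemctl disable firewalld.service"
done
```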

After stopping the firewall, the upload succeeded on the next attempt.
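To verify, repeat the upload and list the target directory. The file and directory names below are placeholders:

```bash
# Re-run the upload that previously failed (hypothetical paths).
hdfs dfs -put ./test.txt /user/hadoop/
# Confirm the file landed in HDFS.
hdfs dfs -ls /user/hadoop/
```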

Reposted from: https://www.cnblogs.com/hanwen1014/p/10985681.html
