• Hadoop troubleshooting: Bad connect ack with firstBadLink (No route to host)


    [root@Node1 ~]# hdfs dfs -put /home/test.txt /lab/input
    15/04/15 17:29:44 INFO hdfs.DFSClient: Exception in createBlockOutputStream
    java.io.IOException: Bad connect ack with firstBadLink as 192.168.249.133:50010
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1366)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1271)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525)
    15/04/15 17:29:44 INFO hdfs.DFSClient: Abandoning BP-1826895056-192.168.249.131-1428690996123:blk_1073755081_14261
    15/04/15 17:29:44 INFO hdfs.DFSClient: Excluding datanode 192.168.249.133:50010
    15/04/15 17:29:44 INFO hdfs.DFSClient: Exception in createBlockOutputStream
    java.io.IOException: Bad connect ack with firstBadLink as 192.168.249.132:50010
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1366)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1271)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525)
    15/04/15 17:29:44 INFO hdfs.DFSClient: Abandoning BP-1826895056-192.168.249.131-1428690996123:blk_1073755082_14262
    15/04/15 17:29:44 INFO hdfs.DFSClient: Excluding datanode 192.168.249.132:50010
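
    The client writes a block by opening a pipeline to the DataNodes chosen by the NameNode on their data transfer port (50010 by default, as shown in the log). A "No route to host" during createBlockOutputStream usually means that connection is being blocked, most often by a host firewall on the DataNodes. A quick way to confirm, run from the client node (a minimal check; the IPs and port are taken from the log above, and nc may need to be installed):

    # Test whether each DataNode's data transfer port is reachable from the client
    nc -zv 192.168.249.133 50010
    nc -zv 192.168.249.132 50010
    # telnet 192.168.249.133 50010 also works if nc is not available

    If the connection is refused or times out even though the DataNode process is running, the firewall is the likely culprit.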

    Fix: as the root user, stop the firewall on the 192.168.249.133 and 192.168.249.132 DataNode hosts:

    service iptables stop
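
    Note that service iptables stop only clears the rules until the next reboot. A sketch of making the change persistent (assuming a CentOS/RHEL 6-era system, which matches the service iptables command above; on CentOS/RHEL 7+ the firewall is firewalld instead):

    # CentOS/RHEL 6: keep iptables from starting at boot
    chkconfig iptables off

    # CentOS/RHEL 7+: stop and disable firewalld instead
    # systemctl stop firewalld
    # systemctl disable firewalld

    Alternatively, instead of disabling the firewall outright, you can open the HDFS data transfer port (50010 in this cluster) in the firewall rules. After the change, re-run the hdfs dfs -put command; the write pipeline should now be established without excluding any DataNodes.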

• Original post: https://www.cnblogs.com/tmeily/p/4429223.html