Hadoop's Nasty "Be Replicated to 0 nodes, instead of 1" Exception

It has been a while since I last wrote a blog post; I have gotten rather lazy, and admittedly have not been trying hard enough. For close to a year now I have been building infrastructure applications on top of Hadoop.

After the new project went live, we noticed that some members were uploading resources to our cluster at close to the cluster's full throughput, reaching 70 MB+/s. While putting data into the cluster, this exception was thrown:
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /xxx/xxx/xx could only be replicated to 0 nodes, instead of 1

This message tells me there are no usable nodes left in the cluster. Since it happened during the put phase, my gut reaction was: have all the nodes already been filled up with data?
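
For context, the put path that triggers this is nothing exotic. A minimal client-side sketch looks roughly like the following; the local path is a placeholder, and the destination simply reuses the elided path from the exception above:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch of a client-side put; paths are placeholders.
public class HdfsPut {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();   // reads core-site.xml / hdfs-site.xml from the classpath
    FileSystem fs = FileSystem.get(conf);
    // The NameNode chooses target DataNodes while the write pipeline is set up;
    // that is where the "could only be replicated to 0 nodes" IOException surfaces.
    fs.copyFromLocalFile(new Path("/local/upload/file"), new Path("/xxx/xxx/xx"));
    fs.close();
  }
}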

Problems like this are easier to observe on a small cluster. The dfshealth.jsp page showed that at least three nodes were still writable, yet the DFSClient kept throwing the no-node-available exception when putting data.

Digging into the source, the ReplicationTargetChooser#isGoodTarget method on the NameNode side gives the explanation:
// check the communication traffic of the target machine
if (considerLoad) {
  double avgLoad = 0;
  int size = clusterMap.getNumOfLeaves();
  if (size != 0) {
    avgLoad = (double) fs.getTotalLoad() / size;
  }
  if (node.getXceiverCount() > (2.0 * avgLoad)) {
    logr.debug("Node " + NodeBase.getPath(node) +
               " is not chosen because the node is too busy");
    return false;
  }
}

The isGoodTarget method passes the final verdict on a candidate DataNode. Besides having usable disk space, the node must also be under a reasonable amount of load, and the yardstick here is the number of connections accepted by the DataNode's XceiverServer. This value is easy to overlook when running Hadoop, because it is not conveniently exposed in the usual statistics. The code above says that a node's connection count must not exceed twice the average connection count across all nodes in the cluster. To keep my investigation self-contained, I printed each node's connection count on the dfshealth.jsp page, and the result matched the check in the code above exactly.

[Screenshot: per-node connection counts from dfshealth.jsp]

For example, suppose ReplicationTargetChooser picks node13: even though node13 has plenty of free space to write to, the code above will ultimately reject it as an unqualified target, because its connection count exceeds twice the cluster average:

157 > ((27 + 45 + 44 + 54 + 35 + 50 + 104 + 55 + 73 + 69 + 157 + 146) / 12 * 2), that is, 157 > 859 / 12 * 2 ≈ 143.2
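
The same check, written out as a quick standalone sketch against the connection counts read off my dfshealth.jsp dump. This is only the arithmetic; the 2.0 factor mirrors isGoodTarget, it is not the NameNode's actual code path:

// Worked check of the numbers above (per-DataNode xceiver/connection counts).
public class XceiverLoadCheck {
  public static void main(String[] args) {
    int[] xceivers = {27, 45, 44, 54, 35, 50, 104, 55, 73, 69, 157, 146};
    int total = 0;
    for (int x : xceivers) {
      total += x;
    }
    double avgLoad = (double) total / xceivers.length;   // 859 / 12 ≈ 71.6
    double limit = 2.0 * avgLoad;                        // ≈ 143.2
    int node13 = 157;
    // Same predicate as isGoodTarget: busier than twice the average means "too busy".
    System.out.printf("avg=%.1f limit=%.1f node13=%d tooBusy=%b%n",
        avgLoad, limit, node13, node13 > limit);
  }
}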

The usual fix for this kind of exception is to add nodes, or, if the existing nodes can take the extra load, to loosen the threshold in this piece of the algorithm.
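
As a side note, in the Hadoop version this code comes from, the whole block is guarded by the considerLoad boolean, which the NameNode reads from its configuration with a default of true; setting it to false disables the load-based rejection entirely. The property name below is what I believe this era of Hadoop uses (later releases renamed it to dfs.namenode.replication.considerLoad), so verify it against your own version. A minimal sketch that just echoes the flag:

import org.apache.hadoop.conf.Configuration;

public class ConsiderLoadFlag {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // ReplicationTargetChooser is constructed with this flag; false skips the
    // "too busy" check shown above. The key name may differ across versions.
    boolean considerLoad = conf.getBoolean("dfs.replication.considerLoad", true);
    System.out.println("dfs.replication.considerLoad = " + considerLoad);
  }
}
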
Comments
#3  david.org  2011-11-30
siguzhishu wrote:
Ran into this with HBase; change the Linux max connection limit.

This should have nothing to do with the Linux max connection limit; this selection policy is applied by the NameNode.
#2  siguzhishu  2011-07-26
Ran into this with HBase; change the Linux max connection limit.
#1  langyu  2011-07-13
I tracked this problem down a few days ago as well; really annoying. The cluster's model for computing load still needs a closer look.

