Hadoop Source Code Walkthrough: The Embedded Jetty HTTP Server


Hadoop embeds the Jetty HTTP server, which serves two main purposes:

1. A web interface that exposes Hadoop's internal state.

2. Participation in the operation and management of the Hadoop cluster.

 

Take the NameNode as an example. The NameNode starts the HttpServer (Jetty) through

startHttpServer(conf);

and the relevant code is:

          httpServer = new HttpServer("hdfs", infoHost, infoPort, 
              infoPort == 0, conf, 
              SecurityUtil.getAdminAcls(conf, DFSConfigKeys.DFS_ADMIN));
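The arguments are the webapp name ("hdfs"), the bind host and port, a findPort flag (passing infoPort == 0 lets Jetty pick a free ephemeral port, which is why the actual port is read back after start() further below), the Configuration object, and the admin ACL.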

Digging into HttpServer shows that the code above mounts the hdfs directory under HADOOP_HOME\webapps as Jetty's default context (the DataNode uses the datanode directory, the JobTracker uses job, the TaskTracker uses task, and the SecondaryNameNode uses secondary):

    webAppContext = new WebAppContext();
    webAppContext.setDisplayName("WepAppsContext");
    webAppContext.setContextPath("/");
    webAppContext.setWar(appDir + "/" + name);
    webAppContext.getServletContext().setAttribute(CONF_CONTEXT_ATTRIBUTE, conf);
    webAppContext.getServletContext().setAttribute(ADMINS_ACL, adminsAcl);
    webServer.addHandler(webAppContext);
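Outside of Hadoop, the same pattern boils down to a few lines. Here is a minimal standalone sketch, assuming the Jetty 6 (org.mortbay) API that Hadoop 1.x/CDH3 bundles; the port and war directory are placeholders:

    import org.mortbay.jetty.Server;
    import org.mortbay.jetty.webapp.WebAppContext;

    public class MiniWebServer {
      public static void main(String[] args) throws Exception {
        Server server = new Server(50070);   // placeholder listen port
        WebAppContext ctx = new WebAppContext();
        ctx.setContextPath("/");             // mount at the root, as Hadoop does
        ctx.setWar("webapps/hdfs");          // an unpacked webapp dir, like appDir + "/" + name
        server.addHandler(ctx);              // Jetty 6 style; later Jetty versions use setHandler
        server.start();
        server.join();
      }
    }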

It also adds contexts for the logs and for the static resources (CSS, JS, images) under webapps:

  protected void addDefaultApps(ContextHandlerCollection parent,
      final String appDir) throws IOException {
    // set up the context for "/logs/" if "hadoop.log.dir" property is defined. 
    String logDir = System.getProperty("hadoop.log.dir");
    if (logDir != null) {
      Context logContext = new Context(parent, "/logs");
      logContext.setResourceBase(logDir);
      logContext.addServlet(AdminAuthorizedServlet.class, "/");
      logContext.setDisplayName("logs");
      setContextAttributes(logContext);
      defaultContexts.put(logContext, true);
    }
    // set up the context for "/static/*"
    Context staticContext = new Context(parent, "/static");
    staticContext.setResourceBase(appDir + "/static");
    staticContext.addServlet(DefaultServlet.class, "/*");
    staticContext.setDisplayName("static");
    setContextAttributes(staticContext);
    defaultContexts.put(staticContext, true);
  }
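With these contexts in place (assuming the default 1.x NameNode web port 50070; it is configurable), /logs/ exposes the files under hadoop.log.dir and /static/* serves the shared assets; note that /logs is guarded by AdminAuthorizedServlet, while /static is handled by Jetty's plain DefaultServlet.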

as well as a set of default servlets that expose status information:

  /**
   * Add default servlets.
   */
  protected void addDefaultServlets() {
    // set up default servlets
    addServlet("stacks", "/stacks", StackServlet.class);
    addServlet("logLevel", "/logLevel", LogLevel.Servlet.class);
    addServlet("metrics", "/metrics", MetricsServlet.class);
    addServlet("conf", "/conf", ConfServlet.class);
    addServlet("jmx", "/jmx", JMXJsonServlet.class);
  }
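These endpoints speak plain HTTP, so any client can query them. A minimal sketch (assuming a NameNode web UI reachable at localhost:50070; adjust to your cluster):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class NameNodeHttpProbe {
      public static void main(String[] args) throws Exception {
        // /conf dumps the effective configuration; /metrics, /stacks
        // and /jmx are queried the same way.
        URL url = new URL("http://localhost:50070/conf");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try {
          BufferedReader in = new BufferedReader(
              new InputStreamReader(conn.getInputStream(), "UTF-8"));
          String line;
          while ((line = in.readLine()) != null) {
            System.out.println(line);
          }
          in.close();
        } finally {
          conn.disconnect();
        }
      }
    }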
 

Finally, back in the NameNode, a number of NameNode-specific endpoints are added, for example:

/fsck runs the filesystem check, and

/getimage is the entry point the SecondaryNameNode uses to fetch the fsimage.

 

          httpServer.addInternalServlet("getDelegationToken", 
                                        GetDelegationTokenServlet.PATH_SPEC, 
                                        GetDelegationTokenServlet.class, true);
          httpServer.addInternalServlet("renewDelegationToken", 
                                        RenewDelegationTokenServlet.PATH_SPEC, 
                                        RenewDelegationTokenServlet.class, true);
          httpServer.addInternalServlet("cancelDelegationToken", 
                                        CancelDelegationTokenServlet.PATH_SPEC, 
                                        CancelDelegationTokenServlet.class,
                                        true);
          httpServer.addInternalServlet("fsck", "/fsck", FsckServlet.class, true);
          httpServer.addInternalServlet("getimage", "/getimage", 
              GetImageServlet.class, true);
          httpServer.addInternalServlet("listPaths", "/listPaths/*", 
              ListPathsServlet.class, false);
          httpServer.addInternalServlet("data", "/data/*", 
              FileDataServlet.class, false);
          httpServer.addInternalServlet("checksum", "/fileChecksum/*",
              FileChecksumServlets.RedirectServlet.class, false);
          httpServer.addInternalServlet("contentSummary", "/contentSummary/*",
              ContentSummaryServlet.class, false);
          httpServer.start();
      
          // The web-server port can be ephemeral... ensure we have the correct info
          infoPort = httpServer.getPort();
          httpAddress = new InetSocketAddress(infoHost, infoPort);
          conf.set("dfs.http.address", infoHost + ":" + infoPort);
          LOG.info("Web-server up at: " + infoHost + ":" + infoPort);
          return httpServer;
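The addInternalServlet calls above all follow one signature: a name, a path spec, the servlet class, and whether the endpoint requires authentication. As a hedged illustration of the same mechanism (PingServlet is a made-up example, not part of Hadoop):

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class PingServlet extends HttpServlet {
      @Override
      protected void doGet(HttpServletRequest req, HttpServletResponse resp)
          throws IOException {
        resp.setContentType("text/plain");
        resp.getWriter().println("pong");   // trivial liveness probe
      }
    }

    // Registered like the servlets above; false = no authentication required:
    // httpServer.addInternalServlet("ping", "/ping", PingServlet.class, false);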

 

If you then open the hdfs webapp directory, you will find that the index page jumps straight to dfshealth.jsp. Looking at web.xml:

    <servlet-mapping>
        <servlet-name>org.apache.hadoop.hdfs.server.namenode.dfshealth_jsp</servlet-name>
        <url-pattern>/dfshealth.jsp</url-pattern>
    </servlet-mapping>

dfshealth_jsp.class can be found inside hadoop-core-xxx.jar.
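This class is not hand-written: the build precompiles dfshealth.jsp into the dfshealth_jsp servlet (dfshealth_jsp.java is generated by the ant build, as the comments below note), which web.xml then maps onto /dfshealth.jsp.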

 

 

Comments
#2 leongfans 2012-04-13
cjnetwork wrote: (quoting the question in comment #1 below)

I'm not sure about 1.0.1; what I describe here is the Cloudera CDH3u2 tree.
dfshealth_jsp.java is generated by the ant build script, so you need to run ant first,
and then also add the Java files under src in the build directory to the classpath.
#1 cjnetwork 2012-04-09
> dfshealth_jsp.class can be found inside hadoop-core-xxx.jar

I searched hadoop-core-1.0.1.jar but did not find this class... could you point me in the right direction?
I have imported the Hadoop projects into MyEclipse, but when starting the namenode, Jetty fails with
java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.dfshealth_jsp
	at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:252)

Apart from that, everything else works.
