2011-06-09 20:10:50,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = localhost.localdomain/127.0.0.1
STARTUP_MSG: args = []
STARTUP_MSG: version = 0.21.0
STARTUP_MSG: classpath = /opt/hadoop-0.21.0/bin/../conf:/usr/java/jdk1.6.0_16/lib/tools.jar:/opt/hadoop-0.21.0/bin/..:/opt/hadoop-0.21.0/bin/../hadoop-common-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-common-test-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-hdfs-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-hdfs-0.21.0-sources.jar:/opt/hadoop-0.21.0/bin/../hadoop-hdfs-ant-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-hdfs-test-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-hdfs-test-0.21.0-sources.jar:/opt/hadoop-0.21.0/bin/../hadoop-mapred-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-mapred-0.21.0-sources.jar:/opt/hadoop-0.21.0/bin/../hadoop-mapred-examples-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-mapred-test-0.21.0.jar:/opt/hadoop-0.21.0/bin/../hadoop-mapred-tools-0.21.0.jar:/opt/hadoop-0.21.0/bin/../lib/ant-1.6.5.jar:/opt/hadoop-0.21.0/bin/../lib/asm-3.2.jar:/opt/hadoop-0.21.0/bin/../lib/aspectjrt-1.6.5.jar:/opt/hadoop-0.21.0/bin/../lib/aspectjtools-1.6.5.jar:/opt/hadoop-0.21.0/bin/../lib/avro-1.3.2.jar:/opt/hadoop-0.21.0/bin/../lib/commons-cli-1.2.jar:/opt/hadoop-0.21.0/bin/../lib/commons-codec-1.4.jar:/opt/hadoop-0.21.0/bin/../lib/commons-el-1.0.jar:/opt/hadoop-0.21.0/bin/../lib/commons-httpclient-3.1.jar:/opt/hadoop-0.21.0/bin/../lib/commons-lang-2.5.jar:/opt/hadoop-0.21.0/bin/../lib/commons-logging-1.1.1.jar:/opt/hadoop-0.21.0/bin/../lib/commons-logging-api-1.1.jar:/opt/hadoop-0.21.0/bin/../lib/commons-net-1.4.1.jar:/opt/hadoop-0.21.0/bin/../lib/core-3.1.1.jar:/opt/hadoop-0.21.0/bin/../lib/ftplet-api-1.0.0.jar:/opt/hadoop-0.21.0/bin/../lib/ftpserver-core-1.0.0.jar:/opt/hadoop-0.21.0/bin/../lib/ftpserver-deprecated-1.0.0-M2.jar:/opt/hadoop-0.21.0/bin/../lib/hsqldb-1.8.0.10.jar:/opt/hadoop-0.21.0/bin/../lib/jackson-core-asl-1.4.2.jar:/opt/hadoop-0.21.0/bin/../lib/jackson-mapper-asl-1.4.2.jar:/opt/hadoop-0.21.0/bin/../lib/jasper-compiler-5.5.12.jar:/opt/hadoop-0.21.0/bin/../lib/jasper-runtime-5.5.12.jar:/opt/hadoop-0.21.0/bin/../lib/jdiff-1.0.9.jar:/opt/hadoop-0.21.0/bin/../lib/jets3t-0.7.1.jar:/opt/hadoop-0.21.0/bin/../lib/jetty-6.1.14.jar:/opt/hadoop-0.21.0/bin/../lib/jetty-util-6.1.14.jar:/opt/hadoop-0.21.0/bin/../lib/jsp-2.1-6.1.14.jar:/opt/hadoop-0.21.0/bin/../lib/jsp-api-2.1-6.1.14.jar:/opt/hadoop-0.21.0/bin/../lib/junit-4.8.1.jar:/opt/hadoop-0.21.0/bin/../lib/kfs-0.3.jar:/opt/hadoop-0.21.0/bin/../lib/log4j-1.2.15.jar:/opt/hadoop-0.21.0/bin/../lib/mina-core-2.0.0-M5.jar:/opt/hadoop-0.21.0/bin/../lib/mockito-all-1.8.2.jar:/opt/hadoop-0.21.0/bin/../lib/oro-2.0.8.jar:/opt/hadoop-0.21.0/bin/../lib/paranamer-2.2.jar:/opt/hadoop-0.21.0/bin/../lib/paranamer-ant-2.2.jar:/opt/hadoop-0.21.0/bin/../lib/paranamer-generator-2.2.jar:/opt/hadoop-0.21.0/bin/../lib/qdox-1.10.1.jar:/opt/hadoop-0.21.0/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/hadoop-0.21.0/bin/../lib/slf4j-api-1.5.11.jar:/opt/hadoop-0.21.0/bin/../lib/slf4j-log4j12-1.5.11.jar:/opt/hadoop-0.21.0/bin/../lib/xmlenc-0.52.jar:/opt/hadoop-0.21.0/bin/../lib/jsp-2.1/*.jar:/opt/hadoop-0.21.0/hdfs/bin/../conf:/opt/hadoop-0.21.0/hdfs/bin/../hadoop-hdfs-*.jar:/opt/hadoop-0.21.0/hdfs/bin/../lib/*.jar:/opt/hadoop-0.21.0/bin/../mapred/conf:/opt/hadoop-0.21.0/bin/../mapred/hadoop-mapred-*.jar:/opt/hadoop-0.21.0/bin/../mapred/lib/*.jar:/opt/hadoop-0.21.0/hdfs/bin/../hadoop-hdfs-*.jar:/opt/hadoop-0.21.0/hdfs/bin/../lib/*.jar
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.21 -r 985326; compiled by 'tomwhite' on Tue Aug 17 01:02:28 EDT 2010
************************************************************/
2011-06-09 20:10:50,745 INFO org.apache.hadoop.security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
2011-06-09 20:10:51,321 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.lang.NoClassDefFoundError: javax/net/SocketFactory
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
at java.net.URLClassLoader.access$000(URLClassLoader.java:56)
at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1074)
at org.apache.hadoop.net.NetUtils.getSocketFactoryFromProperty(NetUtils.java:115)
at org.apache.hadoop.net.NetUtils.getDefaultSocketFactory(NetUtils.java:99)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:242)
at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:183)
at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:163)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:237)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1440)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1393)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1407)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1552)
Caused by: java.lang.ClassNotFoundException: javax.net.SocketFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
... 25 more
2011-06-09 20:10:51,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at localhost.localdomain/127.0.0.1
************************************************************/
Solution:
Check whether the jsse file under jre/lib/ of your JRE ends in .jar or .pack. If it is still jsse.pack, unpack it with unpack200 jsse.pack jsse.jar, then restart the DataNode and you're done.
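A minimal sketch of the check and fix, assuming a Sun JDK install like the /usr/java/jdk1.6.0_16 shown in the classpath above (adjust the path to your own JRE):

# The missing javax.net.SocketFactory is provided by jsse.jar; check whether it is still pack200-compressed
cd /usr/java/jdk1.6.0_16/jre/lib
ls jsse.*

# If only jsse.pack is present, rebuild jsse.jar from it, then restart the DataNode
if [ -f jsse.pack ] && [ ! -f jsse.jar ]; then
    unpack200 jsse.pack jsse.jar
fi

Any other leftover *.pack files under jre/lib can be converted the same way with unpack200.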