Running
hadoop fs -ls
produces the following error:
root@ubuntu:/home/chenwq/hadoop/book/ch03/src/main/java# hadoop fs -ls
11/08/31 22:51:37 INFO security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
11/08/31 22:51:38 WARN conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
11/08/31 22:51:39 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s).
11/08/31 22:51:40 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 1 time(s).
11/08/31 22:51:41 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 2 time(s).
11/08/31 22:51:42 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 3 time(s).
11/08/31 22:51:43 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 4 time(s).
11/08/31 22:51:44 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 5 time(s).
11/08/31 22:51:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 6 time(s).
11/08/31 22:51:46 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 7 time(s).
11/08/31 22:51:47 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 8 time(s).
11/08/31 22:51:48 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 9 time(s).
Bad connection to FS. command aborted.
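The repeated "Retrying connect to server: localhost/127.0.0.1:8020" lines mean that nothing is listening on the NameNode RPC port (8020, the address given by fs.default.name in core-site.xml), i.e. the NameNode process is not running. A quick way to confirm this (a sketch; the netstat call assumes net-tools is installed):
jps                        # NameNode should appear in this list; here it does not
netstat -nlpt | grep 8020  # shows whether anything is listening on the NameNode RPC port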
Solution:
Format the NameNode:
hadoop namenode -format
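Formatting re-creates the NameNode's metadata directory and erases anything already stored in HDFS, which is normally acceptable on a fresh pseudo-distributed setup like this one. If daemons are still half-running from the failed attempt, it is safer to stop them first. A minimal sketch of the full sequence (assuming the Hadoop bin directory is on the PATH):
stop-all.sh               # stop any leftover daemons (harmless if none are running)
hadoop namenode -format   # re-initialize the NameNode metadata; existing HDFS data is lost
start-all.sh              # bring the HDFS and MapReduce daemons back up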
After restarting with start-all.sh, run
jps
to check the background processes:
root@ubuntu:/home/chenwq/hadoop/hadoop-0.21.0/bin# jps
8734 NameNode
8934 DataNode
9137 SecondaryNameNode
9479 Jps
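Had NameNode been missing from this list, the next place to look would be its log under $HADOOP_HOME/logs; by default the file is named hadoop-<user>-namenode-<hostname>.log, so for this setup the path would presumably be something like:
tail -n 50 /home/chenwq/hadoop/hadoop-0.21.0/logs/hadoop-root-namenode-ubuntu.log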
The NameNode is now running.
Running
hadoop fs -ls
again now gives:
11/08/31 23:09:37 INFO security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
11/08/31 23:09:37 WARN conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
Found 1 items
-rw-r--r-- 1 root supergroup 628 2011-08-31 23:01 /user/root/books/URLCat.java
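With HDFS reachable again, other filesystem commands work as expected; for example, the uploaded file can be printed back out (relative paths resolve against the HDFS home directory, here /user/root):
hadoop fs -cat books/URLCat.java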