Hadoop script commands: the differences between hadoop, hadoop dfs, and hdfs dfs

 

 

 

 

1. The hadoop command:

 

[root@chinadaas01 ~]# hadoop
Usage: hadoop [--config confdir] COMMAND
       where COMMAND is one of:
  fs                   run a generic filesystem user client
  version              print the version
  jar <jar>            run a jar file
  checknative [-a|-h]  check native hadoop and compression libraries availability
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME

 

Common subcommands:

hadoop job : inspect Hadoop MapReduce jobs, e.g. hadoop job -list all  (usage below, followed by a few example invocations)

 

Usage: JobClient <command> <args>
        [-submit <job-file>]
        [-status <job-id>]
        [-counter <job-id> <group-name> <counter-name>]
        [-kill <job-id>]
        [-set-priority <job-id> <priority>]. Valid values for priorities are: VERY_HIGH HIGH NORMAL LOW VERY_LOW
        [-events <job-id> <from-event-#> <#-of-events>]
        [-history <jobOutputDir>]
        [-list [all]]
        [-list-active-trackers]
        [-list-blacklisted-trackers]
        [-list-attempt-ids <job-id> <task-type> <task-state>]

        [-kill-task <task-id>]
        [-fail-task <task-id>]
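
A few typical invocations as a quick sketch; the job ID below is a made-up placeholder, substitute one reported by -list:

# list every known job (the example mentioned above)
hadoop job -list all

# show the status of a single job (job_201310251438_0001 is a hypothetical ID)
hadoop job -status job_201310251438_0001

# kill a runaway job
hadoop job -kill job_201310251438_0001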

 

hadoop version : print the installed version

 

[root@chinadaas01 ~]# hadoop version
Hadoop 2.0.0-transwarp
Subversion file:///root/wangb/hadoop/build/hadoop/rpm/BUILD/hadoop-2.0.0-transwarp/src/hadoop-common-project/hadoop-common -r Unknown
Compiled by root on Fri Oct 25 14:38:23 CST 2013
From source with checksum ec693572f265ae4d8b8c1f52a22e37f5
This command was run using /usr/lib/hadoop/hadoop-common-2.0.0-transwarp.jar

 

hadoop jar : run a Hadoop jar program:

 

[root@chinadaas01 ~]# hadoop jar
RunJar jarFile [mainClass] args...
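
For instance, running the WordCount example that ships with the MapReduce examples jar; the jar path below is an assumption and differs per distribution, and /user/test/input and /user/test/output are hypothetical HDFS paths:

# the "wordcount" argument selects the example program inside the jar
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar wordcount /user/test/input /user/test/output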

 

hadoop dfsadmin -report / -safemode  (while the cluster restarts and the NameNode waits for the DataNodes to report their blocks, it refuses client requests and only starts serving once everything is in place; this period of refusing clients is safe mode)
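
A sketch of the corresponding commands (hdfs dfsadmin is the current form; hadoop dfsadmin still runs in 2.x but is deprecated):

# cluster capacity, live/dead DataNodes, block statistics
hdfs dfsadmin -report

# query / enter / leave safe mode manually
hdfs dfsadmin -safemode get
hdfs dfsadmin -safemode enter
hdfs dfsadmin -safemode leave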

hadoop fsck / -openforwrite -files  (checks the file system and lists files still open for write; as a side note, the larger the block size, the smaller the block map the NameNode keeps in memory and the faster lookups become)
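
For example, checking the whole namespace (fsck needs a path argument; / simply means start from the root):

# overall health plus per-file block and location info
hadoop fsck / -files -blocks -locations

# list files that are still open for write
hadoop fsck / -openforwrite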

 

 

hadoop fs : Hadoop's command for operating on the file systems it supports, equivalent to the hdfs dfs command (examples below)
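
A few everyday filesystem-shell commands to illustrate the equivalence; the /tmp/demo path and localfile.txt are just examples:

# the two forms below do exactly the same thing
hadoop fs  -ls /
hdfs dfs   -ls /

# create a directory, upload, view, download
hdfs dfs -mkdir -p /tmp/demo
hdfs dfs -put localfile.txt /tmp/demo/
hdfs dfs -cat /tmp/demo/localfile.txt
hdfs dfs -get /tmp/demo/localfile.txt ./copy.txt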

 

 

2. The hdfs command:

[root@chinadaas01 ~]# hdfs
Usage: hdfs [--config confdir] COMMAND
       where COMMAND is one of:
  dfs                  run a filesystem command on the file systems supported in Hadoop.
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  journalnode          run the DFS journalnode
  zkfc                 run the ZK Failover Controller daemon
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  haadmin              run a DFS HA admin client
  fsck                 run a DFS filesystem checking utility
  balancer             run a cluster balancing utility
  jmxget               get JMX exported values from NameNode or DataNode.
  oiv                  apply the offline fsimage viewer to an fsimage
  oev                  apply the offline edits viewer to an edits file
  fetchdt              fetch a delegation token from the NameNode
  getconf              get config values from configuration
  groups               get the groups which users belong to
                                                Use -help to see options
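
Two small examples from the hdfs side (both subcommands appear in the usage above):

# which hosts are configured as NameNodes
hdfs getconf -namenodes

# which groups the current user belongs to, as resolved on the NameNode
hdfs groups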

 

 

 

From the two usages above you can see that the hadoop command and the hdfs command are two separate command sets:

one works at the cluster level, e.g. printing the version, listing MapReduce jobs, and running file commands against the HDFS cluster;

the other works at the HDFS level, e.g. formatting the NameNode, running the DataNode daemon, and so on.

 

Where the two overlap is that  hadoop dfs  =  hdfs dfs
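
All three spellings below operate on HDFS; in Hadoop 2.x the hadoop dfs form still runs but prints a deprecation notice pointing to hdfs dfs, while hadoop fs is the generic, filesystem-agnostic form:

hadoop fs  -ls /user
hadoop dfs -ls /user     # deprecated alias in 2.x, kept for compatibility
hdfs dfs   -ls /user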

 

 

 

 
