gaojingsong
The Most Detailed Guide to Sqoop Commands (Part 1)


1. Getting to know the sqoop command

[root@hadoop0 bin]# ./sqoop

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

Try 'sqoop help' for usage.
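The two warnings above are harmless unless you actually use HCatalog or Accumulo. If you want to silence them anyway, one option (a sketch; the stub paths below are assumptions, not from this session) is to point the environment variables at existing directories before running sqoop, since the launcher script only checks that the directories exist:

```shell
# Create empty stub directories (hypothetical paths) so the launcher's
# existence checks pass and the HCatalog/Accumulo warnings are suppressed.
mkdir -p /tmp/sqoop-stubs/hcatalog /tmp/sqoop-stubs/accumulo
export HCAT_HOME=/tmp/sqoop-stubs/hcatalog
export ACCUMULO_HOME=/tmp/sqoop-stubs/accumulo
```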

[root@hadoop0 bin]# ./sqoop help

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

99/06/23 18:00:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6

usage: sqoop COMMAND [ARGS]

 

Available commands:

  codegen            Generate code to interact with database records

  create-hive-table  Import a table definition into Hive

  eval               Evaluate a SQL statement and display the results

  export             Export an HDFS directory to a database table

  help               List available commands

  import             Import a table from a database to HDFS

  import-all-tables  Import tables from a database to HDFS

  import-mainframe   Import datasets from a mainframe server to HDFS

  job                Work with saved jobs

  list-databases     List available databases on a server

  list-tables        List available tables in a database

  merge              Merge results of incremental imports

  metastore          Run a standalone Sqoop metastore

  version            Display version information

 

See 'sqoop help COMMAND' for information on a specific command.

2. Syntax breakdown

1) List all databases on a MySQL server

 

Table 36. Common arguments

Argument                              Description
--connect <jdbc-uri>                  Specify JDBC connect string
--connection-manager <class-name>     Specify connection manager class to use
--driver <class-name>                 Manually specify JDBC driver class to use
--hadoop-home <dir>                   Override $HADOOP_HOME
--help                                Print usage instructions
-P                                    Read password from console
--password <password>                 Set authentication password
--username <username>                 Set authentication username
--verbose                             Print more information while working
--connection-param-file <filename>    Optional properties file that provides connection parameters
Example:
 
[root@hadoop0 bin]# ./sqoop list-databases --connect jdbc:mysql://192.168.1.101/ --username root --password root
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:03:22 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:03:22 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
99/06/23 18:03:22 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
hive
mysql
performance_schema
test
[root@hadoop0 bin]# ./sqoop list-databases --connect jdbc:mysql://192.168.1.101/ --username root --password root --verbose
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:03:42 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:03:42 DEBUG tool.BaseSqoopTool: Enabled debug logging.
99/06/23 18:03:42 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
99/06/23 18:03:42 DEBUG sqoop.ConnFactory: Loaded manager factory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
99/06/23 18:03:42 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
99/06/23 18:03:42 DEBUG sqoop.ConnFactory: Trying ManagerFactory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
99/06/23 18:03:42 DEBUG oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop can be called by Sqoop!
99/06/23 18:03:42 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
99/06/23 18:03:42 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:mysql:
99/06/23 18:03:42 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
99/06/23 18:03:42 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.MySQLManager@a1fcba
99/06/23 18:03:42 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
information_schema
hive
mysql
performance_schema
test
[root@hadoop0 bin]# ./sqoop list-databases --connect jdbc:mysql://192.168.1.101/ --username root -P --verbose
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:03:57 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:03:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
Enter password: 
99/06/23 18:04:00 DEBUG sqoop.ConnFactory: Loaded manager factory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
99/06/23 18:04:00 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
99/06/23 18:04:00 DEBUG sqoop.ConnFactory: Trying ManagerFactory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
99/06/23 18:04:00 DEBUG oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop can be called by Sqoop!
99/06/23 18:04:00 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
99/06/23 18:04:00 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:mysql:
99/06/23 18:04:00 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
99/06/23 18:04:00 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.MySQLManager@1a4e7e0
99/06/23 18:04:00 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
information_schema
hive
mysql
performance_schema
test
[root@hadoop0 bin]# 
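As the WARN line in the output says, putting the password on the command line is insecure. Besides `-P` (prompt on the console, shown above), Sqoop 1.4.x also accepts `--password-file`. A sketch of preparing such a file (the path and password value are illustrative, not from this session):

```shell
# Remove any stale copy first (the chmod 400 below would block overwriting it).
rm -f /tmp/mysql.pw
# printf '%s' avoids a trailing newline, which would otherwise
# become part of the password Sqoop reads from the file.
printf '%s' 'root' > /tmp/mysql.pw
chmod 400 /tmp/mysql.pw
# Then (sketch):
#   ./sqoop list-databases --connect jdbc:mysql://192.168.1.101/ \
#     --username root --password-file file:///tmp/mysql.pw
```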
 
2) Connect to MySQL and list the tables in a database

Table 37. Common arguments

Argument                              Description
--connect <jdbc-uri>                  Specify JDBC connect string
--connection-manager <class-name>     Specify connection manager class to use
--driver <class-name>                 Manually specify JDBC driver class to use
--hadoop-home <dir>                   Override $HADOOP_HOME
--help                                Print usage instructions
-P                                    Read password from console
--password <password>                 Set authentication password
--username <username>                 Set authentication username
--verbose                             Print more information while working
--connection-param-file <filename>    Optional properties file that provides connection parameters
 
Example:
[root@hadoop0 bin]# ./sqoop list-tables --connect jdbc:mysql://192.168.1.101/hive --username root --password root
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:11:20 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:11:20 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
99/06/23 18:11:20 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
aux_table
bucketing_cols
cds
columns_v2
compaction_queue
types
version
[root@hadoop0 bin]# ./sqoop list-tables --connect jdbc:mysql://192.168.1.101/mysql --username root --password root
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:11:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:11:47 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
99/06/23 18:11:48 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
columns_priv
db
event
func
general_log
help_category
help_keyword
servers
slave_master_info
slave_relay_log_info
slave_worker_info
slow_log
tables_priv
time_zone
time_zone_leap_second
time_zone_name
time_zone_transition
time_zone_transition_type
user
[root@hadoop0 bin]# 
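Note that `list-tables` differs from `list-databases` only in that a database name (`hive`, `mysql`, ...) is appended to the JDBC URL. A minimal sketch of how the connect string is assembled (the variable names are illustrative; the host and database come from this article's examples):

```shell
# Build the JDBC connect string from its parts.
HOST=192.168.1.101
DB=hive
JDBC_URL="jdbc:mysql://${HOST}/${DB}"
echo "${JDBC_URL}"    # prints jdbc:mysql://192.168.1.101/hive
```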

3) Copy a relational table's structure into Hive (i.e., automatically create the table definition in Hive from a MySQL table)

Table 31. Common arguments

Argument                              Description
--connect <jdbc-uri>                  Specify JDBC connect string
--connection-manager <class-name>     Specify connection manager class to use
--driver <class-name>                 Manually specify JDBC driver class to use
--hadoop-home <dir>                   Override $HADOOP_HOME
--help                                Print usage instructions
-P                                    Read password from console
--password <password>                 Set authentication password
--username <username>                 Set authentication username
--verbose                             Print more information while working
--connection-param-file <filename>    Optional properties file that provides connection parameters

 

Table 32. Hive arguments:

Argument                   Description
--hive-home <dir>          Override $HIVE_HOME
--hive-overwrite           Overwrite existing data in the Hive table
--create-hive-table        If set, the job fails if the target Hive table already exists (false by default)
--hive-table <table-name>  Sets the table name to use when importing to Hive
--table                    The database table to read the definition from

 

Table 33. Output line formatting arguments:

Argument                         Description
--enclosed-by <char>             Sets a required field enclosing character
--escaped-by <char>              Sets the escape character
--fields-terminated-by <char>    Sets the field separator character
--lines-terminated-by <char>     Sets the end-of-line character
--mysql-delimiters               Uses MySQL's default delimiter set: fields: ,  lines: \n  escaped-by: \  optionally-enclosed-by: '
--optionally-enclosed-by <char>  Sets a field enclosing character
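Delimiters can be given literally (`','`) or as escape sequences (`'\t'`); internally Sqoop records them as octal escapes, which is why the generated DDL in the log below shows `FIELDS TERMINATED BY '\054'` and `LINES TERMINATED BY '\012'`. A quick sanity check (POSIX `printf` octal-escape behavior assumed):

```shell
# In ASCII, comma is octal 054 and newline is octal 012;
# printf interprets \NNN octal escapes in its format string.
printf '\054\012'    # prints "," followed by a newline
```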

Example:

[root@hadoop0 bin]# ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.101/test --username root --password root --table people --hive-table emps --fields-terminated-by ',' --verbose

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

99/06/23 18:32:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6

99/06/23 18:32:30 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648

99/06/23 18:32:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `people` AS t LIMIT 1

99/06/23 18:32:30 DEBUG manager.SqlManager: Found column id of type [4, 11, 0]

99/06/23 18:32:30 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@c1d29e is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@c1d29e is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2095)

        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1510)

        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

99/06/23 18:32:30 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException

java.lang.NullPointerException (solution: replace the MySQL JDBC driver jar; this is reportedly a bug in the driver)

        at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:175)

        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

[root@hadoop0 bin]# ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.101/test --username root --password root --table people --hive-table emps --fields-terminated-by ',' --verbose

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

99/06/23 18:33:44 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6

99/06/23 18:33:44 DEBUG tool.BaseSqoopTool: Enabled debug logging.

99/06/23 18:33:44 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.

99/06/23 18:33:44 DEBUG sqoop.ConnFactory: Loaded manager factory: org.apache.sqoop.manager.oracle.OraOopManagerFactory

99/06/23 18:33:44 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory

99/06/23 18:33:44 DEBUG sqoop.ConnFactory: Trying ManagerFactory: org.apache.sqoop.manager.oracle.OraOopManagerFactory

99/06/23 18:33:45 DEBUG oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop can be called by Sqoop!

99/06/23 18:33:45 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory

99/06/23 18:33:45 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:mysql:

99/06/23 18:33:45 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.

99/06/23 18:33:45 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.MySQLManager@a1fcba

99/06/23 18:33:45 DEBUG hive.HiveImport: Hive.inputTable: people

99/06/23 18:33:45 DEBUG hive.HiveImport: Hive.outputTable: emps

99/06/23 18:33:45 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM `people` AS t LIMIT 1

99/06/23 18:33:45 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.

99/06/23 18:33:45 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648

99/06/23 18:33:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `people` AS t LIMIT 1

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column id of type [4, 11, 0]

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column extdata of type [12, 1000, 0]

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column start_name of type [12, 20, 0]

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column nickname of type [12, 20, 0]

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column name of type [12, 20, 0]

99/06/23 18:33:45 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648

99/06/23 18:33:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `people` AS t LIMIT 1

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column id of type INT

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column extdata of type VARCHAR

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column start_name of type VARCHAR

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column nickname of type VARCHAR

99/06/23 18:33:45 DEBUG manager.SqlManager: Found column name of type VARCHAR

99/06/23 18:33:45 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE IF NOT EXISTS `emps` ( `id` INT, `extdata` STRING, `start_name` STRING, `nickname` STRING, `name` STRING) COMMENT 'Imported by sqoop on 1999/06/23 18:33:45' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\012' STORED AS TEXTFILE

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/opt/bigdata/hadoop272/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/opt/bigdata/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

99/06/23 18:33:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

99/06/23 18:33:48 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://hadoop0:9000/user/root/people' INTO TABLE `emps`

99/06/23 18:33:49 INFO hive.HiveImport: Loading uploaded data into Hive

99/06/23 18:33:49 DEBUG hive.HiveImport: Using external Hive process.

99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.

99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/bigdata/hive2.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]

99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/bigdata/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/bigdata/hadoop272/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

99/06/23 18:34:34 INFO hive.HiveImport: 

99/06/23 18:34:34 INFO hive.HiveImport: Logging initialized using configuration in file:/opt/bigdata/hive2.0/conf/hive-log4j2.properties

99/06/23 18:35:18 INFO hive.HiveImport: OK

99/06/23 18:35:18 INFO hive.HiveImport: Time taken: 8.382 seconds

99/06/23 18:35:20 INFO hive.HiveImport: Hive import complete.

 

Once this command finishes, the table is created automatically in Hive, under the name emps:
hive> show tables;
OK
student
teacher
tt
Time taken: 0.752 seconds, Fetched: 3 row(s)
hive> show tables;
OK
emps
student
teacher
tt
Time taken: 0.127 seconds, Fetched: 4 row(s)
hive> show create table emps;
OK
CREATE TABLE `emps`(
  `id` int, 
  `extdata` string, 
  `start_name` string, 
  `nickname` string, 
  `name` string)
COMMENT 'Imported by sqoop on 1999/06/23 18:33:45'
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
WITH SERDEPROPERTIES ( 
  'field.delim'=',', 
  'line.delim'='\n', 
  'serialization.format'=',') 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://hadoop0:9000/user/hive/warehouse/emps'
TBLPROPERTIES (
  'transient_lastDdlTime'='930134117')
Time taken: 0.854 seconds, Fetched: 21 row(s)
hive> 
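The `transient_lastDdlTime` property in the output above is the table's last-DDL timestamp in Unix epoch seconds; you can decode it with `date` (GNU coreutils syntax assumed):

```shell
# 930134117 seconds after 1970-01-01 00:00 UTC falls on 1999-06-23,
# matching the session timestamps in the logs above.
date -u -d @930134117 +%Y-%m-%d
```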