I. How well do you know the sqoop commands?
[root@hadoop0 bin]# ./sqoop
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Try 'sqoop help' for usage.
[root@hadoop0 bin]# ./sqoop help
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:00:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

See 'sqoop help COMMAND' for information on a specific command.
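As the last line suggests, each command has its own help page. For example, to see the arguments accepted by list-databases (run from the same bin directory as above):

```shell
# Print per-command usage; works for any command in the list above.
./sqoop help list-databases
```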
II. Breaking down the command syntax
1) List all databases on a MySQL server
Table 36. Common arguments
--connect <jdbc-uri>               | Specify JDBC connect string
--connection-manager <class-name>  | Specify connection manager class to use
--driver <class-name>              | Manually specify JDBC driver class to use
--hadoop-home <dir>                | Override $HADOOP_HOME
--help                             | Print usage instructions
-P                                 | Read password from console
--password <password>              | Set authentication password
--username <username>              | Set authentication username
--verbose                          | Print more information while working
--connection-param-file <filename> | Optional properties file that provides connection parameters
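As a sketch, a list-databases invocation could look like the following (the host and credentials are reused from the create-hive-table example later in this article; it assumes the MySQL JDBC driver jar is in Sqoop's lib directory):

```shell
# List all databases on the MySQL server.
# -P prompts for the password at the console instead of exposing it on the command line.
./sqoop list-databases \
  --connect jdbc:mysql://192.168.1.101:3306/ \
  --username root \
  -P
```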
2) List all tables in a MySQL database

Table 37. Common arguments
--connect <jdbc-uri>               | Specify JDBC connect string
--connection-manager <class-name>  | Specify connection manager class to use
--driver <class-name>              | Manually specify JDBC driver class to use
--hadoop-home <dir>                | Override $HADOOP_HOME
--help                             | Print usage instructions
-P                                 | Read password from console
--password <password>              | Set authentication password
--username <username>              | Set authentication username
--verbose                          | Print more information while working
--connection-param-file <filename> | Optional properties file that provides connection parameters
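A corresponding list-tables call might look like the following; the target database name goes at the end of the JDBC URL (here the `test` database from the example later in this article):

```shell
# List all tables in the `test` database (hypothetical host/credentials).
./sqoop list-tables \
  --connect jdbc:mysql://192.168.1.101:3306/test \
  --username root \
  -P
```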
3) Copy a relational table's schema into Hive (i.e., automatically create a Hive table definition from a MySQL table)
Table 31. Common arguments
--connect <jdbc-uri>               | Specify JDBC connect string
--connection-manager <class-name>  | Specify connection manager class to use
--driver <class-name>              | Manually specify JDBC driver class to use
--hadoop-home <dir>                | Override $HADOOP_HOME
--help                             | Print usage instructions
-P                                 | Read password from console
--password <password>              | Set authentication password
--username <username>              | Set authentication username
--verbose                          | Print more information while working
--connection-param-file <filename> | Optional properties file that provides connection parameters
Table 32. Hive arguments:
--hive-home <dir>         | Override $HIVE_HOME
--hive-overwrite          | Overwrite existing data in the Hive table.
--create-hive-table       | If set, then the job will fail if the target Hive table exists. By default this property is false.
--hive-table <table-name> | Sets the table name to use when importing to Hive.
--table                   | The database table to read the definition from.
Table 33. Output line formatting arguments:
--enclosed-by <char>           | Sets a required field enclosing character
--escaped-by <char>            | Sets the escape character
--fields-terminated-by <char>  | Sets the field separator character
--lines-terminated-by <char>   | Sets the end-of-line character
--mysql-delimiters             | Uses MySQL's default delimiter set: fields: ,  lines: \n  escaped-by: \  optionally-enclosed-by: '
--optionally-enclosed-by <char>| Sets a field enclosing character
Example:
[root@hadoop0 bin]# ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.101/test -username root --password root --table people --hive-table emps --fields-terminated-by ',' --verbose
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:32:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:32:30 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
99/06/23 18:32:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `people` AS t LIMIT 1
99/06/23 18:32:30 DEBUG manager.SqlManager: Found column id of type [4, 11, 0]
99/06/23 18:32:30 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@c1d29e is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@c1d29e is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2095)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1510)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
99/06/23 18:32:30 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:175)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
(Fix: replace the MySQL JDBC driver jar; this is reportedly a bug in the MySQL connector. The retry below succeeded after swapping the jar.)
[root@hadoop0 bin]# ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.101/test -username root --password root --table people --hive-table emps --fields-terminated-by ',' --verbose
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop-1.4.6/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
99/06/23 18:33:44 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
99/06/23 18:33:44 DEBUG tool.BaseSqoopTool: Enabled debug logging.
99/06/23 18:33:44 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
99/06/23 18:33:44 DEBUG sqoop.ConnFactory: Loaded manager factory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
99/06/23 18:33:44 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
99/06/23 18:33:44 DEBUG sqoop.ConnFactory: Trying ManagerFactory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
99/06/23 18:33:45 DEBUG oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop can be called by Sqoop!
99/06/23 18:33:45 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
99/06/23 18:33:45 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:mysql:
99/06/23 18:33:45 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
99/06/23 18:33:45 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.MySQLManager@a1fcba
99/06/23 18:33:45 DEBUG hive.HiveImport: Hive.inputTable: people
99/06/23 18:33:45 DEBUG hive.HiveImport: Hive.outputTable: emps
99/06/23 18:33:45 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM `people` AS t LIMIT 1
99/06/23 18:33:45 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
99/06/23 18:33:45 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
99/06/23 18:33:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `people` AS t LIMIT 1
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column id of type [4, 11, 0]
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column extdata of type [12, 1000, 0]
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column start_name of type [12, 20, 0]
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column nickname of type [12, 20, 0]
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column name of type [12, 20, 0]
99/06/23 18:33:45 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
99/06/23 18:33:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `people` AS t LIMIT 1
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column id of type INT
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column extdata of type VARCHAR
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column start_name of type VARCHAR
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column nickname of type VARCHAR
99/06/23 18:33:45 DEBUG manager.SqlManager: Found column name of type VARCHAR
99/06/23 18:33:45 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE IF NOT EXISTS `emps` ( `id` INT, `extdata` STRING, `start_name` STRING, `nickname` STRING, `name` STRING) COMMENT 'Imported by sqoop on 1999/06/23 18:33:45' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\012' STORED AS TEXTFILE
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/bigdata/hadoop272/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/bigdata/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
99/06/23 18:33:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
99/06/23 18:33:48 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://hadoop0:9000/user/root/people' INTO TABLE `emps`
99/06/23 18:33:49 INFO hive.HiveImport: Loading uploaded data into Hive
99/06/23 18:33:49 DEBUG hive.HiveImport: Using external Hive process.
99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/bigdata/hive2.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/bigdata/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/bigdata/hadoop272/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
99/06/23 18:34:29 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
99/06/23 18:34:34 INFO hive.HiveImport:
99/06/23 18:34:34 INFO hive.HiveImport: Logging initialized using configuration in file:/opt/bigdata/hive2.0/conf/hive-log4j2.properties
99/06/23 18:35:18 INFO hive.HiveImport: OK
99/06/23 18:35:18 INFO hive.HiveImport: Time taken: 8.382 seconds
99/06/23 18:35:20 INFO hive.HiveImport: Hive import complete.
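After the import completes, a quick sanity check is to look at the table definition Sqoop generated on the Hive side (assuming the hive CLI is on the PATH and points at the same metastore):

```shell
# Show the columns of the newly created Hive table.
hive -e "DESCRIBE emps;"
```

The column list should match the MySQL source table: id, extdata, start_name, nickname, name, with INT mapped to INT and VARCHAR mapped to STRING, as shown in the CREATE TABLE statement in the log above.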