[sqoop1 error] java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf

Problem:

[zkkafka@yanfabu2-37 ~]$ sqoop import \
> --connect jdbc:mysql://10.156.50.36:3306/mqh \
> --username root \
> --password root \
> --table device \
> --fields-terminated-by '\t' \
> --delete-target-dir \
> --num-mappers 1 \
> --hive-import \
> --hive-database test \
> --hive-table hive_bbs_product_snappy;
Warning: /home/zkkafka/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/zkkafka/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/05/29 12:00:27 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/05/29 12:00:27 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/05/29 12:00:27 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/05/29 12:00:27 INFO tool.CodeGenTool: Beginning code generation
Wed May 29 12:00:27 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/05/29 12:00:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/05/29 12:00:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/05/29 12:00:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/zkkafka/hadoop
Note: /tmp/sqoop-zkkafka/compile/0edae8bed32dad40d1fe8ca424e30387/device.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/05/29 12:00:31 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zkkafka/compile/0edae8bed32dad40d1fe8ca424e30387/device.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/05/29 12:00:32 INFO tool.ImportTool: Destination directory device deleted.
19/05/29 12:00:32 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/05/29 12:00:32 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/05/29 12:00:32 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/05/29 12:00:32 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/05/29 12:00:32 INFO mapreduce.ImportJobBase: Beginning import of device
19/05/29 12:00:32 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
19/05/29 12:00:32 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
Wed May 29 12:00:39 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/05/29 12:00:39 INFO db.DBInputFormat: Using read commited transaction isolation
19/05/29 12:00:40 INFO mapreduce.JobSubmitter: number of splits:1
19/05/29 12:00:40 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1558676658010_0008
19/05/29 12:00:41 INFO impl.YarnClientImpl: Submitted application application_1558676658010_0008
19/05/29 12:00:41 INFO mapreduce.Job: The url to track the job: http://master1:8088/proxy/application_1558676658010_0008/
19/05/29 12:00:41 INFO mapreduce.Job: Running job: job_1558676658010_0008
19/05/29 12:00:49 INFO mapreduce.Job: Job job_1558676658010_0008 running in uber mode : false
19/05/29 12:00:49 INFO mapreduce.Job:  map 0% reduce 0%
19/05/29 12:00:57 INFO mapreduce.Job:  map 100% reduce 0%
19/05/29 12:00:59 INFO mapreduce.Job: Job job_1558676658010_0008 completed successfully
19/05/29 12:01:00 INFO mapreduce.Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=128603
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=87
		HDFS: Number of bytes written=156
		HDFS: Number of read operations=4
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters 
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=5465
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=5465
		Total vcore-milliseconds taken by all map tasks=5465
		Total megabyte-milliseconds taken by all map tasks=5596160
	Map-Reduce Framework
		Map input records=6
		Map output records=6
		Input split bytes=87
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=103
		CPU time spent (ms)=1380
		Physical memory (bytes) snapshot=168935424
		Virtual memory (bytes) snapshot=2104311808
		Total committed heap usage (bytes)=93323264
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=156
19/05/29 12:01:00 INFO mapreduce.ImportJobBase: Transferred 156 bytes in 27.7324 seconds (5.6252 bytes/sec)
19/05/29 12:01:00 INFO mapreduce.ImportJobBase: Retrieved 6 records.
19/05/29 12:01:00 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table device
Wed May 29 12:01:00 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/05/29 12:01:00 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/05/29 12:01:00 WARN hive.TableDefWriter: Column register_time had to be cast to a less precise type in Hive
19/05/29 12:01:00 INFO hive.HiveImport: Loading uploaded data into Hive
19/05/29 12:01:00 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
19/05/29 12:01:00 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
	at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
	at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
	at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)
	at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
	... 12 more

 

Solution:

The MapReduce import to HDFS itself completed (6 records transferred), but the follow-on "Loading uploaded data into Hive" step failed because org.apache.hadoop.hive.conf.HiveConf is not on Sqoop's classpath. Copy the Hive jar that provides it into Sqoop's lib directory:
cp /home/zkkafka/hive/lib/hive-exec-**.jar  /home/zkkafka/sqoop/lib/
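
If copying hive-exec alone does not clear the error, a minimal sketch of the same idea (assuming the /home/zkkafka layout from the log and a standard Hive distribution) is to also copy hive-common, the jar that normally contains HiveConf, or to expose the whole Hive lib directory to Sqoop through the environment before re-running the import:

# hive-common is where org.apache.hadoop.hive.conf.HiveConf normally lives;
# copy it alongside hive-exec if the error persists (paths assume the layout above).
cp /home/zkkafka/hive/lib/hive-common-*.jar /home/zkkafka/sqoop/lib/

# Alternative: leave sqoop/lib untouched and put the Hive jars and conf dir on the
# classpath for the current shell, then re-run the same sqoop import command.
export HIVE_HOME=/home/zkkafka/hive
export HIVE_CONF_DIR=$HIVE_HOME/conf
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*

The HIVE_CONF_DIR hint comes straight from the error message itself ("Make sure HIVE_CONF_DIR is set correctly"); the environment-variable route is just the non-invasive variant of the jar copy.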

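After the fix, re-running the same sqoop import should get past the HiveConf error. A quick way to confirm the Hive side (assuming the hive CLI is on the PATH and the database/table names from the command above) is:

hive -e "SELECT * FROM test.hive_bbs_product_snappy LIMIT 5;"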