
HBase throws java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString


Problem Description

        When running a MapReduce job against HBase, the job fails with the following exception: IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString

        Environment: CDH 5.0.1, HBase 0.96.1
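        For reference, here is a minimal driver sketch of the kind of job that triggers this; the class name, table name, and input path are placeholders, not the original ImportFromFile code. As the stack trace further below shows, the failure happens during submission, when JobSubmitter.checkSpecs instantiates TableOutputFormat:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

    public class ImportDriverSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // Placeholder target table; mapper and output key/value setup omitted for brevity.
            conf.set(TableOutputFormat.OUTPUT_TABLE, "testtable");

            Job job = Job.getInstance(conf, "import-into-hbase");
            job.setJarByClass(ImportDriverSketch.class);
            job.setOutputFormatClass(TableOutputFormat.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));

            // On CDH 5.0.x / HBase 0.96.x, submission fails around here with
            // IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
            // unless hbase-protocol is on the submitting JVM's classpath.
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }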

Cause

        This issue occurs because of an optimization introduced in HBASE-9867 that inadvertently introduced a classloader dependency. It affects both jobs submitted with the -libjars option and "fat jar" jobs, which package their runtime dependencies in a nested lib folder.

        "Fat jar" mode relies on a special Hadoop feature: Hadoop loads every library JAR found in the lib/ directory inside the job JAR, so the JARs a job depends on at runtime can be packaged under lib/ within the job JAR itself.
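        The access rule behind the error: HBaseZeroCopyByteString is deliberately declared in the com.google.protobuf package so that it can extend the package-private LiteralByteString from protobuf-java. Two classes only share a runtime package if they are defined by the same classloader, so when the fat-jar (or -libjars) classloader supplies one class and the system classpath supplies the other, the JVM rejects the subclass relationship and throws IllegalAccessError. Below is a small diagnostic sketch, assuming hbase-protocol and protobuf-java are visible to the JVM it runs in:

    // Prints which classloader and which JAR each of the two classes comes from.
    // If the loaders differ, the IllegalAccessError above is expected; loading
    // HBaseZeroCopyByteString may even throw it directly, which confirms the diagnosis.
    public class CheckProtobufClassLoading {
        public static void main(String[] args) throws Exception {
            String[] names = {
                "com.google.protobuf.LiteralByteString",      // shipped in protobuf-java
                "com.google.protobuf.HBaseZeroCopyByteString" // shipped in hbase-protocol
            };
            for (String name : names) {
                Class<?> c = Class.forName(name);
                Object source = c.getProtectionDomain().getCodeSource() == null
                    ? "unknown" : c.getProtectionDomain().getCodeSource().getLocation();
                System.out.println(name + "\n  loader: " + c.getClassLoader()
                    + "\n  source: " + source);
            }
        }
    }

        The two classes will always come from different JARs; what matters is the loader line. If they are defined by different classloaders, they sit in different runtime packages even though the package name is identical.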

Solution

    1. Permanent solution

         cd /opt/cloudera/parcels/CDH-5.0.0-1.cdh5.0.0.p0.47/lib/hadoop

         ln -s /opt/cloudera/parcels/CDH-5.0.0-1.cdh5.0.0.p0.47/lib/hbase/lib/hbase-protocol-0.96.1.1-cdh5.0.0.jar hbase-protocol-0.96.1.1-cdh5.0.0.jar

    2. Temporary solution

    HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH-5.0.0-1.cdh5.0.0.p0.47/lib/hbase/lib/hbase-protocol-0.96.1.1-cdh5.0.0.jar:/etc/hbase/conf hadoop jar myjar.jar MyJobMainClass
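        A related driver-side option, offered as a hedged sketch rather than a guaranteed fix: HBase's TableMapReduceUtil.addDependencyJars(Job) ships the HBase runtime jars (hbase-protocol among them) to the tasks via the distributed cache, so the task side resolves both classes from one place. The submitting client still needs hbase-protocol on its own classpath, which is what the HADOOP_CLASSPATH export above provides.

    import java.io.IOException;

    import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
    import org.apache.hadoop.mapreduce.Job;

    public class ShipHBaseJars {
        // Call after the Job is configured, before job.waitForCompletion():
        // adds HBase's dependency jars to the job's distributed-cache classpath.
        public static void shipHBaseDependencies(Job job) throws IOException {
            TableMapReduceUtil.addDependencyJars(job);
        }
    }

        If your HBase build provides the `hbase mapredcp` command, its output can also be used to populate HADOOP_CLASSPATH instead of hard-coding the parcel path.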

Full Error Output

 

14/07/16 14:40:50 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch hbase:meta table: 
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Wed Jul 16 14:32:49 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405,java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
Wed Jul 16 14:32:52 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:32:57 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:33:07 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:33:28 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:34:08 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:35:49 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:37:29 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:39:10 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405, java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
Wed Jul 16 14:40:50 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@52380405,java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString


        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:134)
        at org.apache.hadoop.hbase.client.HTable.getRowOrBefore(HTable.java:705)
        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:144)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1102)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1162)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1054)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1011)
        at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:326)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:192)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:206)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:455)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
        at mapreduce.ImportFromFile.run(ImportFromFile.java:228)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at mapreduce.ImportFromFile.main(ImportFromFile.java:185)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
        at mapreduce.Driver.main(Driver.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:897)
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildGetRowOrBeforeRequest(RequestConverter.java:131)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1402)
        at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:701)
        at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:699)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:120)
        ... 38 more
Exception in thread "main" java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:897)
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildGetRowOrBeforeRequest(RequestConverter.java:131)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1402)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1176)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1054)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1011)
        at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:326)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:192)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:206)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:455)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
        at mapreduce.ImportFromFile.run(ImportFromFile.java:228)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at mapreduce.ImportFromFile.main(ImportFromFile.java:185)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
        at mapreduce.Driver.main(Driver.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
