Failed to load native-hadoop: resolving the native-library mismatch

15/06/25 00:14:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Solution:

First, enable debug logging to see why the native library fails to load:

[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console

[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz

15/06/25 00:44:05 DEBUG util.Shell: setsid exited with exit code 0

15/06/25 00:44:05 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml

15/06/25 00:44:05 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5

15/06/25 00:44:05 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml

15/06/25 00:44:05 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986

15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])

15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])

15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])

15/06/25 00:44:06 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics

15/06/25 00:44:06 DEBUG security.Groups:  Creating new Groups object

15/06/25 00:44:06 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...

15/06/25 00:44:06 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /usr/hadoop/lib/native/libhadoop.so.1.0.0)

15/06/25 00:44:06 DEBUG util.NativeCodeLoader: java.library.path=/usr/hadoop/lib/native

15/06/25 00:44:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

15/06/25 00:44:06 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based

15/06/25 00:44:06 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping

15/06/25 00:44:06 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000

15/06/25 00:44:06 DEBUG security.UserGroupInformation: hadoop login

15/06/25 00:44:06 DEBUG security.UserGroupInformation: hadoop login commit

15/06/25 00:44:06 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop

15/06/25 00:44:06 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)

15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false

15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false

15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false

15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =

15/06/25 00:44:06 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null

15/06/25 00:44:06 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@78dd667e

15/06/25 00:44:07 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@60dcc9fe

15/06/25 00:44:07 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.

15/06/25 00:44:07 DEBUG ipc.Client: The ping interval is 60000 ms.

15/06/25 00:44:07 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020

15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1

15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop sending #0

15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop got value #0

15/06/25 00:44:07 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 71ms

text: `/test/data/origz/access.log.gz': No such file or directory

15/06/25 00:44:07 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@60dcc9fe

15/06/25 00:44:07 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@60dcc9fe

15/06/25 00:44:07 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@60dcc9fe

15/06/25 00:44:07 DEBUG ipc.Client: Stopping client

15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: closed

15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0
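The debug output above pinpoints the cause: libhadoop.so.1.0.0 requires `GLIBC_2.14`, but the system glibc is older. As a quicker check than full debug logging, Hadoop 2.x also ships a native-library self-test (it needs a Hadoop installation, so its output is not reproduced here):

```shell
# Report whether each native component (hadoop, zlib, snappy, lz4, bzip2)
# can be loaded; with -a the command exits non-zero if any check fails
hadoop checknative -a
```

On this machine the `hadoop` entry should report `false` until glibc is upgraded.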

 

Check the system's libc version:

[hadoop@master001 native]$ ll /lib64/libc.so.6

lrwxrwxrwx. 1 root root 12 Apr 14 16:14 /lib64/libc.so.6 -> libc-2.12.so

The installed version is 2.12, but libhadoop.so.1.0.0 requires GLIBC_2.14 (see the UnsatisfiedLinkError above), so glibc must be upgraded.
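The required version can also be read straight out of the shared object. The sketch below demonstrates the technique on the system libc so it runs anywhere; in practice, point `target` at /usr/hadoop/lib/native/libhadoop.so.1.0.0:

```shell
# List the GLIBC version symbols a binary references (highest last).
# Demonstrated on the system libc; substitute libhadoop.so.1.0.0 in practice.
target="$(ldd /bin/sh | awk '/libc\.so/ {print $3}')"
grep -aoE 'GLIBC_2\.[0-9]+' "$target" | sort -uV | tail -n 3
```

If any version listed is newer than what /lib64/libc.so.6 provides, the dynamic loader rejects the library with exactly the UnsatisfiedLinkError shown earlier.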

From http://ftp.gnu.org/gnu/glibc/ download glibc-2.14.tar.bz2 and glibc-linuxthreads-2.5.tar.bz2.
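One way to fetch them (same GNU mirror as above; the target directory matches the tar commands below):

```shell
# Download the glibc source tarballs into the software directory
cd /home/hadoop/software
wget http://ftp.gnu.org/gnu/glibc/glibc-2.14.tar.bz2
wget http://ftp.gnu.org/gnu/glibc/glibc-linuxthreads-2.5.tar.bz2
```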

[hadoop@master001 native]$ tar -jxvf /home/hadoop/software/glibc-2.14.tar.bz2

[hadoop@master001 native]$ cd glibc-2.14/

[hadoop@master001 glibc-2.14]$ tar -jxvf /home/hadoop/software/glibc-linuxthreads-2.5.tar.bz2

[hadoop@master001 glibc-2.14]$ cd ..    # must return to the parent directory: glibc refuses to configure inside its own source tree

[hadoop@master001 native]$ export CFLAGS="-g -O2"    # keep the optimization flag, or the build fails

[hadoop@master001 native]$ ./glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin

The configure step may fail with:

no acceptable C compiler found in $PATH

which means gcc is not installed; install it first (as root): yum install gcc

 

[hadoop@master001 native]$ make    # compile; this takes a long time and may fail transiently, re-run it on failure

[hadoop@master001 native]$ sudo make install    # install; must run as root

# verify that the version was upgraded

[hadoop@master001 native]$ ll /lib64/libc.so.6

lrwxrwxrwx 1 root root 12 Jun 25 02:07 /lib64/libc.so.6 -> libc-2.14.so    # now shows 2.14
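Besides inspecting the symlink, the active glibc version can be queried directly:

```shell
# Two distro-independent ways to read the active glibc version
getconf GNU_LIBC_VERSION
ldd --version | head -n 1
```

After the upgrade both should report 2.14.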

Enable debug logging again and retry:

[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console

# the "Loaded the native-hadoop library" line below shows the native library now loads cleanly
# (the "No such file or directory" message at the end only means the test file is absent, which is unrelated)

[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz

15/06/25 02:10:01 DEBUG util.Shell: setsid exited with exit code 0

15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml

15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5

15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml

15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986

15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])

15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])

15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])

15/06/25 02:10:02 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics

15/06/25 02:10:02 DEBUG security.Groups:  Creating new Groups object

15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...

15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library

15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution

15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping

15/06/25 02:10:02 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000

15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login

15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login commit

15/06/25 02:10:02 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop

15/06/25 02:10:02 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)

15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false

15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false

15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false

15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =

15/06/25 02:10:03 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null

15/06/25 02:10:03 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@501edcf1

15/06/25 02:10:03 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@16e7dcfd

15/06/25 02:10:04 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@7e499e08: starting with interruptCheckPeriodMs = 60000

15/06/25 02:10:04 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.

15/06/25 02:10:04 DEBUG ipc.Client: The ping interval is 60000 ms.

15/06/25 02:10:04 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020

15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop sending #0

15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1

15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop got value #0

15/06/25 02:10:04 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 122ms

text: `/test/data/origz/access.log.gz': No such file or directory

15/06/25 02:10:04 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@16e7dcfd

15/06/25 02:10:04 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@16e7dcfd

15/06/25 02:10:04 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@16e7dcfd

15/06/25 02:10:04 DEBUG ipc.Client: Stopping client

15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: closed

15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0

==============================

Once the upgrade is done, restart the cluster and verify basic HDFS operations:

[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-dfs.sh

[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-yarn.sh

[hadoop@master001 ~]$ hadoop fs -ls /

[hadoop@master001 ~]$ hadoop fs -mkdir /usr

[hadoop@master001 ~]$ hadoop fs -ls /

Found 1 items

drwxr-xr-x   - hadoop supergroup          0 2015-06-25 02:27 /usr

=============================================================================

 
