
Hadoop practice: WARN util.NativeCodeLoader


I started Hadoop with "./start-all.sh" after configuring it by following the suggestions in other people's blogs.

The following warning appeared during startup; see the log:

[hadoop@Test-01 sbin]$ ./start-all.sh 
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/06/10 15:20:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Test-01]
...

 

1. Not being familiar with Hadoop, I went through blogs on the net looking for a solution to this issue. Thanks to jiedushi, I found a way to locate the cause of the warning, described in his blog: http://blog.csdn.net/jiedushi/article/details/7496327

Then set the environment variable:

 export HADOOP_ROOT_LOGGER=DEBUG,console

 This outputs debug-level logging to the console, which makes it easy to find the cause of the issue.
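
 Note that this setting only affects the current shell; a minimal sketch of turning the extra logging on and off again (the fs command is just an example):

export HADOOP_ROOT_LOGGER=DEBUG,console   # debug-level output goes to the console
./hadoop fs -ls /                         # any client command now prints DEBUG logs
unset HADOOP_ROOT_LOGGER                  # restore the default logging afterwards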

 

2. Then run the same operation again and capture the console output:

[hadoop@Test-01 bin]$ export HADOOP_ROOT_LOGGER=DEBUG,console
[hadoop@Test-01 bin]$ ./hadoop fs -text /home/hadoop/ant1.8.tar 
14/06/10 15:44:58 DEBUG util.Shell: setsid exited with exit code 0
14/06/10 15:44:58 DEBUG conf.Configuration: parsing URL jar:file:/home/hadoop/hadoop-2.4.0/share/hadoop/common/hadoop-common-2.4.0.jar!/core-default.xml
14/06/10 15:44:58 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@531d2a
14/06/10 15:44:58 DEBUG conf.Configuration: parsing URL file:/home/hadoop/hadoop-2.4.0/etc/hadoop/core-site.xml
14/06/10 15:44:58 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@7d611f
14/06/10 15:44:58 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
14/06/10 15:44:58 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
14/06/10 15:44:58 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
14/06/10 15:44:58 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
14/06/10 15:44:58 DEBUG security.Groups:  Creating new Groups object
14/06/10 15:44:58 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
14/06/10 15:44:58 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /home/hadoop/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0: /lib/libc.so.6: version `GLIBC_2.6' not found (required by /home/hadoop/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0)
14/06/10 15:44:58 DEBUG util.NativeCodeLoader: java.library.path=/home/hadoop/hadoop-2.4.0/lib/native
14/06/10 15:44:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/06/10 15:44:58 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
14/06/10 15:44:58 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
14/06/10 15:44:58 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
14/06/10 15:44:58 DEBUG security.UserGroupInformation: hadoop login
14/06/10 15:44:58 DEBUG security.UserGroupInformation: hadoop login commit
14/06/10 15:44:58 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
14/06/10 15:44:58 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
14/06/10 15:44:58 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
14/06/10 15:44:58 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
14/06/10 15:44:58 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
14/06/10 15:44:58 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = 
14/06/10 15:44:58 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
14/06/10 15:44:58 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@273cef
14/06/10 15:44:59 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@102ae46
14/06/10 15:44:59 DEBUG hdfs.BlockReaderLocal: Both short-circuit local reads and UNIX domain socket are disabled.
14/06/10 15:44:59 DEBUG ipc.Client: The ping interval is 60000 ms.
14/06/10 15:44:59 DEBUG ipc.Client: Connecting to /127.0.0.1:9000
14/06/10 15:44:59 DEBUG ipc.Client: IPC Client (31620548) connection to /127.0.0.1:9000 from hadoop: starting, having connections 1
14/06/10 15:44:59 DEBUG ipc.Client: IPC Client (31620548) connection to /127.0.0.1:9000 from hadoop sending #0
14/06/10 15:44:59 DEBUG ipc.Client: IPC Client (31620548) connection to /127.0.0.1:9000 from hadoop got value #0
14/06/10 15:44:59 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 272ms
text: `/home/hadoop/ant1.8.tar': No such file or directory
14/06/10 15:44:59 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@102ae46
14/06/10 15:44:59 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@102ae46
14/06/10 15:44:59 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@102ae46
14/06/10 15:44:59 DEBUG ipc.Client: Stopping client
14/06/10 15:44:59 DEBUG ipc.Client: IPC Client (31620548) connection to /127.0.0.1:9000 from hadoop: closed
14/06/10 15:44:59 DEBUG ipc.Client: IPC Client (31620548) connection to /127.0.0.1:9000 from hadoop: stopped, remaining connections 0

 Check the util.NativeCodeLoader lines in the log above: they give the detailed cause of the warning (libhadoop.so.1.0.0 requires GLIBC_2.6, which /lib/libc.so.6 does not provide), which made it possible to resolve the problem quickly.
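
 A quicker way to see the native-library status (assuming your Hadoop release ships the subcommand; recent 2.x versions do) is hadoop checknative:

./hadoop checknative -a    # reports whether the native hadoop library and compression codecs can be loaded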

 

---------------------------------------------------glibc split line---------------------------------------------------------

The log shows that the system does not provide the glibc "version `GLIBC_2.6'" required by the native library.

If you are experienced with Linux, this is easy to fix.

For how to upgrade glibc, refer to CsuBoy's blog: http://blog.csuboy.com/glibc-update-to-2-6/

 

Before upgrading glibc, check the native library's linkage as shown below; doing this helps you diagnose such issues yourself:

[hadoop@Test-01 sbin]$ ll /home/hadoop/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0
-rwxr-xr-x 1 hadoop hadoop 488873 Mar 31 16:49 /home/hadoop/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0
[hadoop@Test-01 sbin]$ cd /home/hadoop/hadoop-2.4.0/lib/native/
[hadoop@Test-01 native]$ ll
total 2288
-rw-r--r-- 1 hadoop hadoop 687184 Mar 31 16:49 libhadoop.a
-rw-r--r-- 1 hadoop hadoop 534024 Mar 31 16:49 libhadooppipes.a
lrwxrwxrwx 1 hadoop hadoop     18 May 28 15:42 libhadoop.so -> libhadoop.so.1.0.0
-rwxr-xr-x 1 hadoop hadoop 488873 Mar 31 16:49 libhadoop.so.1.0.0
-rw-r--r-- 1 hadoop hadoop 226360 Mar 31 16:49 libhadooputils.a
-rw-r--r-- 1 hadoop hadoop 204586 Mar 31 16:49 libhdfs.a
lrwxrwxrwx 1 hadoop hadoop     16 May 28 15:42 libhdfs.so -> libhdfs.so.0.0.0
-rwxr-xr-x 1 hadoop hadoop 167760 Mar 31 16:49 libhdfs.so.0.0.0
[hadoop@Test-01 native]$ file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), not stripped
[hadoop@Test-01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib/libc.so.6: version `GLIBC_2.6' not found (required by ./libhadoop.so.1.0.0)
./libhadoop.so.1.0.0: /lib/libc.so.6: version `GLIBC_2.7' not found (required by ./libhadoop.so.1.0.0)
        linux-gate.so.1 =>  (0x00a6c000)
        libdl.so.2 => /lib/libdl.so.2 (0x00dd1000)
        libjvm.so => not found
        libc.so.6 => /lib/libc.so.6 (0x00110000)
        /lib/ld-linux.so.2 (0x004e7000)
[hadoop@Test-01 native]$ rpm -qa glibc
glibc-2.5-81.el5_8.7
[root@Test-03 native]# /lib/libc.so.6
GNU C Library stable release version 2.5, by Roland McGrath et al.
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE.
Compiled by GNU CC version 4.1.2 20080704 (Red Hat 4.1.2-54).
Compiled on a Linux 2.6.9 system on 2013-10-08.
Available extensions:
        The C stubs add-on version 2.1.2.
        crypt add-on version 2.1 by Michael Glad and others
        GNU Libidn by Simon Josefsson
        GNU libio by Per Bothner
        NIS(YP)/NIS+ NSS modules 0.19 by Thorsten Kukuk
        Native POSIX Threads Library by Ulrich Drepper et al
        BIND-8.2.3-T5B
        RT using linux kernel aio
Thread-local storage support included.
For bug reporting instructions, please see:
<http://www.gnu.org/software/libc/bugs.html>.
[root@Test-03 native]#
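
One further check that is often useful (not part of the original diagnosis; the paths are assumptions for this machine): compare the GLIBC_* symbol versions that the installed libc provides with those that libhadoop.so requires:

strings /lib/libc.so.6 | grep '^GLIBC_'         # versions the system libc exports
readelf -V libhadoop.so.1.0.0 | grep 'GLIBC_'   # versions the native library requires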

After this, you will understand more than just the problem itself.

 

---------------------------------------------------32-x64 split line---------------------------------------------------------

If you use a 64-bit OS, you will need to recompile the native libraries from source, because the bundled ones are 32-bit (as the `file` output above shows);

you can follow iloveyin's blog: http://blog.csdn.net/iloveyin/article/details/28909771
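
For reference, a rough sketch of the native rebuild (assuming the prerequisites described in that blog, such as a JDK, Maven, protobuf 2.5 and cmake, are already installed):

# run from the top of the hadoop-2.4.0 source tree
mvn package -Pdist,native -DskipTests -Dtar
# the rebuilt 64-bit native libraries should end up under
# hadoop-dist/target/hadoop-2.4.0/lib/native; copy them over
# $HADOOP_HOME/lib/native on each node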

 

--------------------------------------------------- lib link split line---------------------------------------------------------

After recompiling, the issue was still there:

 

14/06/13 13:19:56 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
14/06/13 13:19:56 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
14/06/13 13:19:56 DEBUG util.NativeCodeLoader: java.library.path=/data/bigdata/.hadoop/hadoop-2.4.0/lib
14/06/13 13:19:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

 Just set the environment variable:

 

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
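
An alternative that is often suggested (not from the original post) is to pass the native directory to the JVM through HADOOP_OPTS, e.g. in etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"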

 

--------------------------------------------------- dev split line---------------------------------------------------------