Unable to load bzip2 into Hadoop native library

My environment: CentOS 7, Spark 1.6.1, Hadoop 2.6.4, running in cluster mode with two slave nodes.

When I run a hadoop command, I get:

 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I then ran hadoop checknative -a and every check reported false. This was partly solved by adding

 export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
 export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native/"

to hadoop-env.sh and reinstalling openssl-devel. However, hadoop checknative -a still prints a warning:

 [hadoop@host-10-174-101-17 ~]$ hadoop checknative -a
 16/07/13 14:36:24 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
 16/07/13 14:36:24 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
 Native library checking:
 hadoop:  true /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
 zlib:    true /lib64/libz.so.1
 snappy:  true /lib64/libsnappy.so.1
 lz4:     true revision:99
 bzip2:   false
 openssl: true /lib64/libcrypto.so
 16/07/13 14:36:24 INFO util.ExitUtil: Exiting with status 1

I reinstalled bzip2 and checked bzip2 --version :

 [hadoop@host-10-174-101-17 ~]$ bzip2 --version
 bzip2, a block-sorting file compressor.  Version 1.0.6, 6-Sept-2010.
 Copyright (C) 1996-2010 by Julian Seward.

 This program is free software; you can redistribute it and/or modify
 it under the terms set out in the LICENSE file, which is included
 in the bzip2-1.0.6 source distribution.

 This program is distributed in the hope that it will be useful,
 but WITHOUT ANY WARRANTY; without even the implied warranty of
 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 LICENSE file for more details.

 bzip2: I won't write compressed data to a terminal.
 bzip2: For help, type: `bzip2 --help'.

What's more, I checked the directory /lib64/: both libbz2.so.1 and libbz2.so exist there, which should be enough for bzip2 to be loaded. So bzip2 seems to be installed correctly, but Hadoop still cannot load it. I also tried recompiling as described in https://issues.apache.org/jira/browse/HADOOP-10409 . Following the exact same steps does not work:

 [hadoop@host-10-174-101-17 ~]$ strings /export/apps/hadoop/latest/lib/native/libhadoop.so | grep initIDs
 strings: '/export/apps/hadoop/latest/lib/native/libhadoop.so': No such file

If I change the path to my Hadoop installation, /usr/local/hadoop/lib/native/libhadoop.so, this is what I get:

 [hadoop@host-10-174-101-17 ~]$ strings /usr/local/hadoop/lib/native/libhadoop.so | grep initIDs
 Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
 Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
 Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
 Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
 Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
 Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
 Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
 Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
 Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
 Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
 Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
 Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
 Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
 Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
 Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
 Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
 Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
 Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
 Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
 Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
 Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
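I notice the output lists initIDs symbols for lz4, snappy, openssl, and zlib, but none for bzip2. If I understand correctly, a libhadoop.so built with bzip2 support should export the bzip2 JNI symbols as well, so a quick way to confirm my suspicion (a sketch; the path matches my installation, adjust it for yours):

```shell
# If native bzip2 were compiled into libhadoop.so, this should print
# bzip2-related symbols; an empty result suggests the library was
# built without bzip2 support at compile time.
strings /usr/local/hadoop/lib/native/libhadoop.so | grep -i bzip2
```

On my machine this prints nothing, which would explain why the runtime libbz2.so in /lib64/ makes no difference.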

After all of this, hadoop checknative -a still reports bzip2: false. What can I do in this situation? Thank you very much.
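Edit: for reference, the recompile I attempted went roughly like this (a sketch only: the bzip2-devel package name assumes CentOS 7, the Maven flags follow Hadoop's BUILDING.txt, and the dist path assumes Hadoop 2.6.4 built from its source tree):

```shell
# The bzip2 headers must be present at build time, otherwise the
# native bzip2 codec is silently left out of libhadoop.so
sudo yum install -y bzip2-devel

# Rebuild the native libraries from the Hadoop source tree;
# -Pnative compiles libhadoop.so, -DskipTests speeds up the build
mvn package -Pdist,native -DskipTests -Dtar

# Copy the rebuilt native libraries over the installed ones
cp hadoop-dist/target/hadoop-2.6.4/lib/native/* /usr/local/hadoop/lib/native/
```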
