• Compiling Hadoop 2.2.0 on 64-bit CentOS


    After downloading the prebuilt Hadoop binary tarball, hadoop-2.2.0.tar.gz, and starting it up, this warning kept appearing:

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    

    The cause is that the binary package from the Apache website ships 32-bit native libraries. Annoying, since hardly any server still runs 32-bit these days.

    $ file $HADOOP_PREFIX/lib/native/libhadoop.so.1.0.0
    libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, BuildID[sha1]=0x9eb1d49b05f67d38454e42b216e053a27ae8bac9, not stripped
    

    We need to download the Hadoop 2.2.0 source code, recompile it on 64-bit Linux, and then replace the 32-bit native libraries with the 64-bit ones.
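
    Before rebuilding, it's worth a quick sanity check that the build host really is 64-bit (an addition, not in the original post):

    $ uname -m
    x86_64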

    1. Download and extract the Hadoop 2.2.0 source package

    $ wget http://mirrors.hust.edu.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
    $ tar zxf hadoop-2.2.0-src.tar.gz
    

    2. Install the following packages

    $ sudo yum install cmake lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel
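
    On a minimal CentOS install the C++ compiler may also be missing; parts of the native build (e.g. Hadoop pipes) can need it, so installing it up front avoids a late failure (an addition, not in the original):

    $ sudo yum install gcc-c++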
    

    3. Install Maven

    Install Maven 3.1.4:

    $ wget http://mirror.esocc.com/apache/maven/maven-3/3.1.4/binaries/apache-maven-3.1.4-bin.tar.gz
    $ sudo tar zxf apache-maven-3.1.4-bin.tar.gz -C /opt
    $ sudo vim /etc/profile
    export MAVEN_HOME=/opt/apache-maven-3.1.4
    export PATH=$PATH:$MAVEN_HOME/bin
    

    Log out and log back in so the environment variables take effect.
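
    Alternatively, to apply the variables in the current shell without re-logging and to verify the install (a hedged shortcut, not from the original):

    $ source /etc/profile
    $ mvn -version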

    4. Install Ant

    $ wget http://apache.dataguru.cn//ant/binaries/apache-ant-1.9.4-bin.tar.gz
    $ sudo tar zxf apache-ant-1.9.4-bin.tar.gz -C /opt
    $ sudo vim /etc/profile
    export ANT_HOME=/opt/apache-ant-1.9.4
    export PATH=$PATH:$ANT_HOME/bin
    

    5. Install Findbugs

    $ wget -O findbugs-2.0.3.tar.gz "http://prdownloads.sourceforge.net/findbugs/findbugs-2.0.3.tar.gz?download"
    $ sudo tar zxf findbugs-2.0.3.tar.gz -C /opt
    $ sudo vim /etc/profile
    export FINDBUGS_HOME=/opt/findbugs-2.0.3
    export PATH=$PATH:$FINDBUGS_HOME/bin
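
    As with Maven, a quick check that the tools are visible once /etc/profile has been re-read (an added verification step):

    $ ant -version
    $ echo $FINDBUGS_HOME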
    

    6. Install protobuf

    Building Hadoop 2.2.0 requires protoc, the protobuf compiler, at version 2.5.0 or above. The version in yum is 2.3, which is too old, so download the source and build and install it:

    $ wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
    $ tar zxf protobuf-2.5.0.tar.gz
    $ cd protobuf-2.5.0
    $ ./configure
    $ make
    $ sudo make install
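
    After make install, protoc lands under /usr/local and links against libprotobuf there; on CentOS the loader cache may need a refresh before it resolves (the ldconfig step assumes the default /usr/local prefix):

    $ sudo ldconfig
    $ protoc --version
    libprotoc 2.5.0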
    

    7. Patch the Hadoop source code

    The code extracted from the latest Hadoop 2.2.0 source tarball has a bug and needs a patch before it will compile. Otherwise, building hadoop-auth fails with the following error:

    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-auth: Compilation failure: Compilation failure:
    [ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[84,13] cannot access org.mortbay.component.AbstractLifeCycle
    [ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
    

    Patch: https://issues.apache.org/jira/browse/HADOOP-10110

    $ wget https://issues.apache.org/jira/secure/attachment/12614482/HADOOP-10110.patch
    $ patch -p0 < HADOOP-10110.patch
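
    Note that the paths inside HADOOP-10110.patch are relative to the source root, so run the patch command from inside the hadoop-2.2.0-src directory (an assumption based on the patch layout; adjust -p if the files aren't found).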

    8. Build Hadoop

    $ cd hadoop-2.2.0-src
    $ mvn package -DskipTests -Pdist,native -Dtar
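
    Once the build finishes, it's worth confirming the freshly built library really is 64-bit before swapping anything in (mirroring the file check from the top; the exact output shape is an assumption):

    $ file hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
    libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, ...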
    

    9. Replace the 32-bit native libraries

    Replace hadoop-2.2.0/lib/native with hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native:

    $ rm -rf ~/local/opt/hadoop-2.2.0/lib/native
    $ cp -r ./hadoop-dist/target/hadoop-2.2.0/lib/native ~/local/opt/hadoop-2.2.0/lib/
    

    Then restart the Hadoop cluster; the warning no longer shows up on the console.
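
    For reference, a typical restart with the stock scripts (paths assume the install at ~/local/opt/hadoop-2.2.0):

    $ cd ~/local/opt/hadoop-2.2.0
    $ sbin/stop-dfs.sh && sbin/stop-yarn.sh
    $ sbin/start-dfs.sh && sbin/start-yarn.sh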

    10. Fix the startup failure on Ubuntu

    On Ubuntu the problem goes beyond a WARN: Hadoop fails to start at all. The culprit is line 55 of ./sbin/start-dfs.sh:

    NAMENODES=$($HADOOP_PREFIX/bin/hdfs getconf -namenodes)
    

    Run that command by itself in a shell:

    $ ./bin/hdfs getconf -namenodes
    OpenJDK 64-Bit Server VM warning: You have loaded library /home/soulmachine/local/opt/hadoop-2.2.0/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
    It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
    14/02/14 13:14:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    localhost
    

    Only the localhost on the last line is a valid namenode. But because of the pile of warnings before it, the script splits the entire output on whitespace and treats every word as a namenode, and everything goes wrong from there.
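
    The failure is plain shell word splitting; a minimal sketch (not from the original post) reproduces it:

    $ NAMENODES=$(printf 'WARN some warning\nlocalhost\n')
    $ for nn in $NAMENODES; do echo "namenode: $nn"; done
    namenode: WARN
    namenode: some
    namenode: warning
    namenode: localhost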

    The root cause is, again, the 32-bit native libraries.

    Delete the bundled 32-bit native directory, copy in the freshly built 64-bit one, and run the command again:

    $ ./bin/hdfs getconf -namenodes
    localhost
    

    Now it works!
