• Building Hadoop 2.2.0 from source on 64-bit CentOS 6.4


    Build environment: CentOS 6.4, 64-bit

    1. Install the JDK (see here)
    2. Install Maven
    From the official Maven download page you can either build from source or grab a prebuilt binary; here we just download the prebuilt one:
    wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.1.1/binaries/apache-maven-3.1.1-bin.zip
    After unpacking, configure the environment variables in /etc/profile, same as before:
    vim /etc/profile
    export MAVEN_HOME=/opt/maven3.1.1
    export PATH=$PATH:$MAVEN_HOME/bin
    source /etc/profile
    Verify the setup: mvn -version

    Apache Maven 3.1.1 (0728685237757ffbf44136acec0402957f723d9a; 2013-09-17 23:22:22+0800)
    Maven home: /opt/maven3.1.1
    Java version: 1.7.0_45, vendor: Oracle Corporation
    Java home: /opt/jdk1.7/jre
    Default locale: en_US, platform encoding: UTF-8
    OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"

    Maven's overseas servers can be unreachable from China, so configure a domestic mirror first. In the Maven directory, edit conf/settings.xml and add the following inside <mirrors></mirrors>, leaving the existing entries untouched:
    <mirror>
      <id>nexus-osc</id>
      <mirrorOf>*</mirrorOf>
      <name>Nexusosc</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
    </mirror>
    Likewise, add a new entry inside <profiles></profiles>:
    <profile>
      <id>jdk-1.7</id>
      <activation>
        <jdk>1.7</jdk>
      </activation>
      <repositories>
        <repository>
          <id>nexus</id>
          <name>local private nexus</name>
          <url>http://maven.oschina.net/content/groups/public/</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>nexus</id>
          <name>local private nexus</name>
          <url>http://maven.oschina.net/content/groups/public/</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </pluginRepository>
      </pluginRepositories>
    </profile>
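    Before kicking off a long build it is worth confirming the mirror entry actually made it into settings.xml. The snippet below is a self-contained sketch: it writes a minimal demo settings.xml to /tmp (a stand-in path; the real file lives at $MAVEN_HOME/conf/settings.xml) and greps for the mirror URL.

```shell
# Demo: minimal settings.xml with the OSChina mirror, written to a temp file
# so the check is runnable anywhere. Adjust the path for a real install.
cat > /tmp/settings-demo.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>nexus-osc</id>
      <mirrorOf>*</mirrorOf>
      <url>http://maven.oschina.net/content/groups/public/</url>
    </mirror>
  </mirrors>
</settings>
EOF
# A quick grep confirms the mirror is present before starting the build.
grep -q 'maven.oschina.net' /tmp/settings-demo.xml && echo "mirror configured"
```

    The same grep against $MAVEN_HOME/conf/settings.xml tells you whether the edit above was saved.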
    3. Install protoc 2.5.0
    Building Hadoop 2.2.0 requires protoc 2.5.0, so download protoc as well.
    Download page: https://code.google.com/p/protobuf/downloads/list (be sure to grab version 2.5.0).
    Before building and installing protoc, install a few dependencies (skip any already installed): gcc, gcc-c++, make
    yum install gcc
    yum install gcc-c++
    yum install make
    Build and install protoc:
    tar -xvf protobuf-2.5.0.tar.bz2
    cd protobuf-2.5.0
    ./configure --prefix=/opt/protoc/
    make && make install
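    The Hadoop 2.2.0 build fails partway through if protoc is missing or is any version other than 2.5.0, so it pays to check up front. A small guard sketch, assuming the /opt/protoc prefix used above:

```shell
# Sketch: make the freshly installed protoc visible, then verify the exact
# version Hadoop 2.2.0 expects. /opt/protoc is the --prefix chosen above.
export PATH=$PATH:/opt/protoc/bin
need="2.5.0"
have=$(protoc --version 2>/dev/null | awk '{print $2}')   # "libprotoc 2.5.0" -> "2.5.0"
if [ "$have" = "$need" ]; then
  echo "protoc OK"
else
  echo "protoc $need required, found: ${have:-none}"
fi
```

    Adding the export line to /etc/profile keeps protoc on PATH for later shells as well.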
    4. Install the cmake, openssl-devel, and ncurses-devel dependencies (skip any already installed):
    yum install cmake
    yum install openssl-devel
    yum install ncurses-devel
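    With the dependencies from steps 2-4 in place, a quick pre-flight sketch can confirm every required tool is on PATH before committing to a build that takes around twelve minutes (note that protoc lands in /opt/protoc/bin, which must be on PATH):

```shell
# Sketch: pre-flight check that each build tool installed in the steps above
# is actually resolvable on PATH. Prints one "found"/"MISSING" line per tool.
for tool in mvn protoc cmake gcc g++ make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```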
    5. Build Hadoop
    First download the Hadoop source from an official mirror and unpack it:
    wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
    tar -zxvf hadoop-2.2.0-src.tar.gz
    Now the build can start:
    cd hadoop-2.2.0-src
    mvn package -Pdist,native -DskipTests -Dtar

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................ SUCCESS [3.709s]
    [INFO] Apache Hadoop Project POM ......................... SUCCESS [2.229s]
    [INFO] Apache Hadoop Annotations ......................... SUCCESS [5.270s]
    [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.388s]
    [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [3.485s]
    [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.655s]
    [INFO] Apache Hadoop Auth ................................ SUCCESS [7.782s]
    [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.731s]
    [INFO] Apache Hadoop Common .............................. SUCCESS [1:52.476s]
    [INFO] Apache Hadoop NFS ................................. SUCCESS [9.935s]
    [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.110s]
    [INFO] Apache Hadoop HDFS ................................ SUCCESS [1:58.347s]
    [INFO] Apache Hadoop HttpFS .............................. SUCCESS [26.915s]
    [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [17.002s]
    [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [5.292s]
    [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.073s]
    [INFO] hadoop-yarn ....................................... SUCCESS [0.335s]
    [INFO] hadoop-yarn-api ................................... SUCCESS [54.478s]
    [INFO] hadoop-yarn-common ................................ SUCCESS [39.215s]
    [INFO] hadoop-yarn-server ................................ SUCCESS [0.241s]
    [INFO] hadoop-yarn-server-common ......................... SUCCESS [15.601s]
    [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [21.566s]
    [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.754s]
    [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [20.625s]
    [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.755s]
    [INFO] hadoop-yarn-client ................................ SUCCESS [6.748s]
    [INFO] hadoop-yarn-applications .......................... SUCCESS [0.155s]
    [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.661s]
    [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.160s]
    [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [36.090s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.753s]
    [INFO] hadoop-yarn-site .................................. SUCCESS [0.151s]
    [INFO] hadoop-yarn-project ............................... SUCCESS [4.771s]
    [INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.870s]
    [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.812s]
    [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [15.759s]
    [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.831s]
    [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [8.126s]
    [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.320s]
    [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [9.596s]
    [INFO] hadoop-mapreduce .................................. SUCCESS [3.905s]
    [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.118s]
    [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [11.651s]
    [INFO] Apache Hadoop Archives ............................ SUCCESS [2.671s]
    [INFO] Apache Hadoop Rumen ............................... SUCCESS [10.038s]
    [INFO] Apache Hadoop Gridmix ............................. SUCCESS [6.062s]
    [INFO] Apache Hadoop Data Join ........................... SUCCESS [4.104s]
    [INFO] Apache Hadoop Extras .............................. SUCCESS [4.210s]
    [INFO] Apache Hadoop Pipes ............................... SUCCESS [9.419s]
    [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.306s]
    [INFO] Apache Hadoop Tools ............................... SUCCESS [0.037s]
    [INFO] Apache Hadoop Distribution ........................ SUCCESS [21.579s]
    [INFO] Apache Hadoop Client .............................. SUCCESS [7.299s]
    [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [7.347s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 11:53.144s
    [INFO] Finished at: Fri Nov 22 16:58:32 CST 2013
    [INFO] Final Memory: 70M/239M
    [INFO] ------------------------------------------------------------------------
    Once you see the output above, the build has finished.

    The built distribution is at: hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0
    The Hadoop version can be checked with the following command:
    [root@localhost bin]# ./hadoop version
    Hadoop 2.2.0
    Subversion Unknown -r Unknown
    Compiled by root on 2013-11-22T08:47Z
    Compiled with protoc 2.5.0
    From source with checksum 79e53ce7994d1628b240f09af91e1af4
    This command was run using /data/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar

    [root@localhost hadoop-2.2.0]# file lib//native/*
    lib//native/libhadoop.a: current ar archive
    lib//native/libhadooppipes.a: current ar archive
    lib//native/libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
    lib//native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
    lib//native/libhadooputils.a: current ar archive
    lib//native/libhdfs.a: current ar archive
    lib//native/libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
    lib//native/libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
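    The `file` output above is the key check: the native libraries must be 64-bit ELF objects (the stock 2.2.0 release tarball shipped 32-bit native libraries, which is a common reason for rebuilding from source on a 64-bit machine). If `file` is unavailable, the same information can be read straight from the ELF header; the sketch below uses /bin/ls as a stand-in object so it runs anywhere:

```shell
# An ELF object starts with the bytes 0x7f 'E' 'L' 'F', and the byte at
# offset 4 is 2 for ELFCLASS64 (64-bit) or 1 for ELFCLASS32 (32-bit).
# /bin/ls stands in here for lib/native/libhadoop.so.1.0.0.
magic=$(head -c 4 /bin/ls | tail -c 3)                 # "ELF" for any ELF file
class=$(od -An -t u1 -j 4 -N 1 /bin/ls | tr -d ' ')    # 2 = 64-bit, 1 = 32-bit
echo "magic=$magic class=$class"
```

    Running the same two reads against libhadoop.so.1.0.0 should report class=2 on this build.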

  • Original post: https://www.cnblogs.com/guoyongrong/p/3522671.html