• Building Hadoop from Source on Windows 7 (64-bit)


    Overview:

      This post describes building the Hadoop source code on the Windows platform. The build instructions shipped with the source tree (BUILDING.txt) are summarized below.

    Requirements from the official Hadoop documentation:

    * Windows System
    * JDK 1.7+
    * Maven 3.0 or later
    * Findbugs 1.3.9 (if running findbugs)
    * ProtocolBuffer 2.5.0
    * CMake 2.6 or newer
    * Windows SDK 7.1 or Visual Studio 2010 Professional [not used here; I used cmd instead]
    * Windows SDK 8.1 (if building CPU rate control for the container executor) [not used; pick either Windows SDK 7.1 or 8.1]
    * zlib headers (if building native code bindings for zlib) [not used]
    * Internet connection for first build (to fetch all Maven and Hadoop dependencies) [I used a local Nexus mirror, nexus-2.14.1-01-bundle.zip]
    * Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip [not used]

    These tools must be present on your PATH.
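Before starting the build, it may help to confirm that each tool really is reachable on the PATH. A minimal sketch in a POSIX shell (e.g. Git Bash on Windows); the executable names below are the usual ones and are my assumption, adjust if yours differ:

```shell
#!/bin/sh
# Report whether each build prerequisite is reachable on the PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found:   $1"
  else
    echo "missing: $1"
  fi
}

# Typical executable names for the tools listed above (assumed, not
# taken from BUILDING.txt verbatim).
for tool in java mvn protoc cmake findbugs; do
  check_tool "$tool"
done
```

Anything reported as missing needs its directory added to the Path variable, as in the next section.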

    1. Files I used:

      (1) apache-maven-3.0.4

      (2) protoc-2.5.0-win32.zip

      (3) findbugs-1.3.9.zip

      (4) cmake-3.8.0-rc2-win64-x64.zip

    2. Add the tool directories to the system Path variable:

      F:\Maven\apache-maven-3.0.4-bin\apache-maven-3.0.4\bin;

    E:\Linux\google-protobuf-2.5.0\protoc-2.5.0-win32;
    E:\Linux\findbugs-1.3.9\bin;
    E:\Linux\cmake-3.8.0-rc2-win64-x64\bin;
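If you work from a POSIX shell such as Git Bash instead of cmd, the same Path entries can be sketched as exports. The `/f/...` and `/e/...` drive paths below follow this post's layout and are assumptions; adjust them to your own install directories:

```shell
# Git Bash equivalent of the Path entries above; on plain cmd you would
# instead append the directories to the Path system variable.
export PATH="/f/Maven/apache-maven-3.0.4-bin/apache-maven-3.0.4/bin:$PATH"
export PATH="/e/Linux/google-protobuf-2.5.0/protoc-2.5.0-win32:$PATH"
export PATH="/e/Linux/findbugs-1.3.9/bin:$PATH"
export PATH="/e/Linux/cmake-3.8.0-rc2-win64-x64/bin:$PATH"

# Sanity check: the Maven bin directory should now appear in PATH.
case ":$PATH:" in
  *"apache-maven-3.0.4/bin"*) echo "maven dir on PATH" ;;
  *)                          echo "maven dir NOT on PATH" ;;
esac
```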

    3. First, go into the hadoop-maven-plugins folder under the Hadoop source tree

      [mine: E:\Linux\hadoop-2.7.3-src\hadoop-maven-plugins] Type cmd in the Explorer address bar and press Enter to open a command prompt in that directory.

    4. Run mvn clean install to download dependencies and build the plugin

      

    5. Go back to the Hadoop source root directory and run:

      mvn eclipse:eclipse -DskipTests

    On success, the tail of the output looks like this:

      

    main:
        [mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\target\test-dir
        [mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\target\testdata
    [INFO] Executed tasks
    [INFO]
    [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-tools-dist ---
    [INFO]
    [INFO] <<< maven-eclipse-plugin:2.10:eclipse (default-cli) < generate-resources @ hadoop-tools-dist <<<
    [INFO]
    [INFO] --- maven-eclipse-plugin:2.10:eclipse (default-cli) @ hadoop-tools-dist ---
    [INFO] Using Eclipse Workspace: null
    [INFO] Adding default classpath container: org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7
    [INFO] Wrote settings to E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\.settings\org.eclipse.jdt.core.prefs
    [INFO] File E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\.project already exists.
           Additional settings will be preserved, run mvn eclipse:clean if you want old settings to be removed.
    [INFO] Wrote Eclipse project for "hadoop-tools-dist" to E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist.
    [INFO]
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building Apache Hadoop Tools 2.7.3
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] >>> maven-eclipse-plugin:2.10:eclipse (default-cli) > generate-resources @ hadoop-tools >>>
    [INFO]
    [INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-tools ---
    [INFO] Executing tasks
    
    main:
        [mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-tools\target\test-dir
    [INFO] Executed tasks
    [INFO]
    [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-tools ---
    [INFO]
    [INFO] <<< maven-eclipse-plugin:2.10:eclipse (default-cli) < generate-resources @ hadoop-tools <<<
    [INFO]
    [INFO] --- maven-eclipse-plugin:2.10:eclipse (default-cli) @ hadoop-tools ---
    [INFO] Not running eclipse plugin goal for pom project
    [INFO] Using Eclipse Workspace: null
    [INFO] Adding default classpath container: org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building Apache Hadoop Distribution 2.7.3
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] >>> maven-eclipse-plugin:2.10:eclipse (default-cli) > generate-resources @ hadoop-dist >>>
    [INFO]
    [INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-dist ---
    [INFO] Executing tasks
    
    main:
        [mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-dist\target\test-dir
    [INFO] Executed tasks
    [INFO]
    [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-dist ---
    [INFO]
    [INFO] <<< maven-eclipse-plugin:2.10:eclipse (default-cli) < generate-resources @ hadoop-dist <<<
    [INFO]
    [INFO] --- maven-eclipse-plugin:2.10:eclipse (default-cli) @ hadoop-dist ---
    [INFO] Using Eclipse Workspace: null
    [INFO] Adding default classpath container: org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7
    [INFO] Wrote settings to E:\Linux\hadoop-2.7.3-src\hadoop-dist\.settings\org.eclipse.jdt.core.prefs
    [INFO] File E:\Linux\hadoop-2.7.3-src\hadoop-dist\.project already exists.
           Additional settings will be preserved, run mvn eclipse:clean if you want old settings to be removed.
    [INFO] Wrote Eclipse project for "hadoop-dist" to E:\Linux\hadoop-2.7.3-src\hadoop-dist.
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................. SUCCESS [  0.424 s]
    [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  0.215 s]
    [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.520 s]
    [INFO] Apache Hadoop Annotations .......................... SUCCESS [  0.108 s]
    [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.084 s]
    [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.095 s]
    [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  1.108 s]
    [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  3.725 s]
    [INFO] Apache Hadoop Auth ................................. SUCCESS [  3.602 s]
    [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.471 s]
    [INFO] Apache Hadoop Common ............................... SUCCESS [ 10.259 s]
    [INFO] Apache Hadoop NFS .................................. SUCCESS [ 13.015 s]
    [INFO] Apache Hadoop KMS .................................. SUCCESS [ 10.464 s]
    [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.070 s]
    [INFO] Apache Hadoop HDFS ................................. SUCCESS [  5.865 s]
    [INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 32.714 s]
    [INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [  5.733 s]
    [INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  1.789 s]
    [INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.088 s]
    [INFO] hadoop-yarn ........................................ SUCCESS [  0.066 s]
    [INFO] hadoop-yarn-api .................................... SUCCESS [  0.710 s]
    [INFO] hadoop-yarn-common ................................. SUCCESS [  8.698 s]
    [INFO] hadoop-yarn-server ................................. SUCCESS [  0.065 s]
    [INFO] hadoop-yarn-server-common .......................... SUCCESS [  5.843 s]
    [INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [  2.519 s]
    [INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  0.636 s]
    [INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  0.958 s]
    [INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [  3.172 s]
    [INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 13.857 s]
    [INFO] hadoop-yarn-client ................................. SUCCESS [  2.377 s]
    [INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [  2.371 s]
    [INFO] hadoop-yarn-applications ........................... SUCCESS [  0.070 s]
    [INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  0.777 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  0.478 s]
    [INFO] hadoop-yarn-site ................................... SUCCESS [  0.069 s]
    [INFO] hadoop-yarn-registry ............................... SUCCESS [  1.384 s]
    [INFO] hadoop-yarn-project ................................ SUCCESS [  0.477 s]
    [INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.227 s]
    [INFO] hadoop-mapreduce-client-core ....................... SUCCESS [  5.169 s]
    [INFO] hadoop-mapreduce-client-common ..................... SUCCESS [  7.680 s]
    [INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  3.952 s]
    [INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  1.673 s]
    [INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  5.814 s]
    [INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  1.755 s]
    [INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  0.515 s]
    [INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.946 s]
    [INFO] hadoop-mapreduce ................................... SUCCESS [  0.253 s]
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  1.095 s]
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [05:56 min]
    [INFO] Apache Hadoop Archives ............................. SUCCESS [  3.804 s]
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [  0.891 s]
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [  7.510 s]
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [  0.351 s]
    [INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  0.483 s]
    [INFO] Apache Hadoop Extras ............................... SUCCESS [  0.567 s]
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.067 s]
    [INFO] Apache Hadoop OpenStack support .................... SUCCESS [  1.997 s]
    [INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [01:30 min]
    [INFO] Apache Hadoop Azure support ........................ SUCCESS [  5.079 s]
    [INFO] Apache Hadoop Client ............................... SUCCESS [  3.223 s]
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.524 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  7.012 s]
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  7.261 s]
    [INFO] Apache Hadoop Tools ................................ SUCCESS [  0.067 s]
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [  0.272 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 10:56 min
    [INFO] Finished at: 2017-03-20T08:39:17+08:00
    [INFO] Final Memory: 77M/247M
    [INFO] ------------------------------------------------------------------------
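Taken together, steps 3 to 5 are just two Maven invocations run from different directories. The helper below is a sketch of that sequence for a POSIX shell (e.g. Git Bash); the `HADOOP_SRC` default follows this post's layout and the `run_mvn` helper name is my own, not part of Hadoop's tooling:

```shell
#!/bin/sh
# Sketch of build steps 3-5. HADOOP_SRC follows this post's directory
# layout; override it to point at your own source checkout.
HADOOP_SRC="${HADOOP_SRC:-/e/Linux/hadoop-2.7.3-src}"

run_mvn() {
  # Run a Maven goal inside the given module directory,
  # failing early with a clear message if mvn is not installed.
  dir=$1; shift
  if ! command -v mvn >/dev/null 2>&1; then
    echo "mvn not found on PATH; set up Maven first" >&2
    return 1
  fi
  ( cd "$dir" && mvn "$@" )
}

# Usage (the two invocations from steps 3-5 above):
#   run_mvn "$HADOOP_SRC/hadoop-maven-plugins" clean install    # steps 3-4
#   run_mvn "$HADOOP_SRC" eclipse:eclipse -DskipTests           # step 5
```

Running the goals inside a subshell with `cd` keeps the caller's working directory unchanged between the two steps.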

    The build output directory:

      

      

  • Original post: https://www.cnblogs.com/zhangxiaolin/p/6585982.html