• Downloading Hadoop with Git and Building a Local Eclipse Development Environment


    Problem scenario
    According to the official guide at http://wiki.apache.org/hadoop/EclipseEnvironment, downloading Hadoop to the local machine and building an Eclipse development environment takes only three commands:
    $ git clone git://git.apache.org/hadoop-common.git
    $ mvn install -DskipTests
    $ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

    That should be all, but after I ran the second command locally, the build failed with the following error log:
    [INFO]
    [INFO] --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-common ---
    [INFO] Executing tasks

    main:
         [exec] target/compile-proto.sh: line 17: protoc: command not found
         [exec] target/compile-proto.sh: line 17: protoc: command not found
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................ SUCCESS [2.389s]
    [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.698s]
    [INFO] Apache Hadoop Annotations ......................... SUCCESS [1.761s]
    [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.729s]
    [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.353s]
    [INFO] Apache Hadoop Auth ................................ SUCCESS [1.998s]
    [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.227s]
    [INFO] Apache Hadoop Common .............................. FAILURE [1.132s]
    [INFO] Apache Hadoop Common Project ...................... SKIPPED
    [INFO] Apache Hadoop HDFS ................................ SKIPPED
    [INFO] Apache Hadoop HttpFS .............................. SKIPPED
    [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
    [INFO] Apache Hadoop HDFS Project ........................ SKIPPED
    [INFO] hadoop-yarn ....................................... SKIPPED
    [INFO] hadoop-yarn-api ................................... SKIPPED
    [INFO] hadoop-yarn-common ................................ SKIPPED
    [INFO] hadoop-yarn-server ................................ SKIPPED
    [INFO] hadoop-yarn-server-common ......................... SKIPPED
    [INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
    [INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
    [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
    [INFO] hadoop-yarn-server-tests .......................... SKIPPED
    [INFO] hadoop-mapreduce-client ........................... SKIPPED
    [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
    [INFO] hadoop-yarn-applications .......................... SKIPPED
    [INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
    [INFO] hadoop-yarn-site .................................. SKIPPED
    [INFO] hadoop-mapreduce-client-common .................... SKIPPED
    [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
    [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
    [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
    [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
    [INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
    [INFO] hadoop-mapreduce .................................. SKIPPED
    [INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
    [INFO] Apache Hadoop Distributed Copy .................... SKIPPED
    [INFO] Apache Hadoop Archives ............................ SKIPPED
    [INFO] Apache Hadoop Rumen ............................... SKIPPED
    [INFO] Apache Hadoop Extras .............................. SKIPPED
    [INFO] Apache Hadoop Tools Dist .......................... SKIPPED
    [INFO] Apache Hadoop Tools ............................... SKIPPED
    [INFO] Apache Hadoop Distribution ........................ SKIPPED
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 12.483s
    [INFO] Finished at: Mon Jan 30 22:57:23 GMT+08:00 2012
    [INFO] Final Memory: 24M/81M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <goals> -rf :hadoop-common


    At this point I had not yet worked out the exact cause or a fix, so I simply recorded the problem.
    Further analysis
    Re-running the build with the -e switch to print more detailed error information:
    $ mvn install -DskipTests -e

    The detailed error output is:
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 9.387s
    [INFO] Finished at: Mon Jan 30 23:11:07 GMT+08:00 2012
    [INFO] Final Memory: 19M/81M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
    org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
    Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 127
        at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
        ... 19 more
    Caused by: /Users/apple/Documents/Hadoop-common-dev/hadoop-common/hadoop-common-project/hadoop-common/target/antrun/build-main.xml:23: exec returned: 127
        at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
        at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
        at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
        at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:390)
        at org.apache.tools.ant.Target.performTasks(Target.java:411)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
        at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
        ... 21 more
    [ERROR]
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]

    With this error output in hand it becomes easier to track down a solution.

    Following the hint in the log, I visited https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    and found the following explanation:
            Unlike many other errors, this exception is not generated by the Maven core itself but by a plugin. As a rule of thumb, plugins use this error to signal a problem in their configuration or the information they retrieved from the POM. 
            In other words: this error is not produced by Maven core itself; as a rule of thumb, a plugin raises it to signal a problem in its own configuration or in the information it retrieved from the POM.
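
    Another clue sits in the exit status itself: "exec returned: 127". In a POSIX shell, 127 is the status returned when the command to be run cannot be found, which lines up with the "protoc: command not found" lines earlier in the log. A quick illustration, separate from the Hadoop build (the exact wording of the message varies by shell):

    $ sh -c 'no-such-command'; echo $?
    sh: no-such-command: command not found
    127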

    The next step is to look at which plugins the Maven build of Hadoop uses.
            The error log shows that the build uses the maven-antrun-plugin, and since the failure happens while building hadoop-common, the place to look is the pom.xml of the hadoop-common project, which contains the following:
    <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-antrun-plugin</artifactId>
            <executions>
              <execution>
                <id>compile-proto</id>
                <phase>generate-sources</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <target>
                    <echo file="target/compile-proto.sh">
                        PROTO_DIR=src/main/proto
                        JAVA_DIR=target/generated-sources/java
                        which cygpath 2> /dev/null
                        if [ $? = 1 ]; then
                          IS_WIN=false
                        else
                          IS_WIN=true
                          WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
                          WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
                        fi
                        mkdir -p $JAVA_DIR 2> /dev/null
                        for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
                        do
                            if [ "$IS_WIN" = "true" ]; then
                              protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
                            else
                              protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
                            fi
                        done
                    </echo>
                    <exec executable="sh" dir="${basedir}" failonerror="true">
                      <arg line="target/compile-proto.sh"/>
                    </exec>
                  </target>
                </configuration>
              </execution>
              <execution>
                <id>compile-test-proto</id>
                <phase>generate-test-sources</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <target>
                    <echo file="target/compile-test-proto.sh">
                        PROTO_DIR=src/test/proto
                        JAVA_DIR=target/generated-test-sources/java
                        which cygpath 2> /dev/null
                        if [ $? = 1 ]; then
                          IS_WIN=false
                        else
                          IS_WIN=true
                          WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
                          WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
                        fi
                        mkdir -p $JAVA_DIR 2> /dev/null
                        for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
                        do
                            if [ "$IS_WIN" = "true" ]; then
                              protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
                            else
                              protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
                            fi
                        done
                    </echo>
                    <exec executable="sh" dir="${basedir}" failonerror="true">
                      <arg line="target/compile-test-proto.sh"/>
                    </exec>
                  </target>
                </configuration>
              </execution>
              <execution>
                <id>save-version</id>
                <phase>generate-sources</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <target>
                    <mkdir dir="${project.build.directory}/generated-sources/java"/>
                    <exec executable="sh">
                      <arg
                          line="${basedir}/dev-support/saveVersion.sh ${project.version} ${project.build.directory}/generated-sources/java"/>
                    </exec>
                  </target>
                </configuration>
              </execution>
              <execution>
                <id>generate-test-sources</id>
                <phase>generate-test-sources</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <target>

                    <mkdir dir="${project.build.directory}/generated-test-sources/java"/>

                    <taskdef name="recordcc" classname="org.apache.hadoop.record.compiler.ant.RccTask">
                      <classpath refid="maven.compile.classpath"/>
                    </taskdef>
                    <recordcc destdir="${project.build.directory}/generated-test-sources/java">
                      <fileset dir="${basedir}/src/test/ddl" includes="**/*.jr"/>
                    </recordcc>
                  </target>
                </configuration>
              </execution>
              <execution>
                <id>create-log-dir</id>
                <phase>process-test-resources</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <target>
                    <!--
                    TODO: there are tests (TestLocalFileSystem#testCopy) that fail if data
                    TODO: from a previous run is present
                    -->
                    <delete dir="${test.build.data}"/>
                    <mkdir dir="${test.build.data}"/>
                    <mkdir dir="${hadoop.log.dir}"/>

                    <copy toDir="${project.build.directory}/test-classes">
                      <fileset dir="${basedir}/src/main/conf"/>
                    </copy>
                  </target>
                </configuration>
              </execution>
              <execution>
                <phase>pre-site</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <tasks>
                    <copy file="src/main/resources/core-default.xml" todir="src/site/resources"/>
                    <copy file="src/main/xsl/configuration.xsl" todir="src/site/resources"/>
                  </tasks>
                </configuration>
              </execution>
            </executions>
          </plugin>

    This is how the pom.xml drives the Ant plugin from within Maven; note in particular this line:
    <echo file="target/compile-proto.sh">

    This looked puzzling at first, but the HowToContribute wiki page at http://wiki.apache.org/hadoop/HowToContribute suggests the likely cause: Protocol Buffers is not installed locally, because the page states explicitly:
    Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work.
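
    In fact, the failure can be reproduced without Maven at all. The antrun plugin merely writes the script shown above to target/compile-proto.sh and then runs it with sh, so invoking the already-generated script by hand from the hadoop-common module directory (the path appears in the stack trace) should print the same error as long as protoc is missing:

    $ cd hadoop-common-project/hadoop-common
    $ sh target/compile-proto.sh
    target/compile-proto.sh: line 17: protoc: command not found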
    The plan, then, is to install ProtocolBuffers locally and run the build again.
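
    Before rebuilding, it is worth confirming that protoc really is missing and then putting it on the PATH. A minimal sketch follows; the brew command assumes Homebrew on macOS, which matches the /Users/... path in the stack trace, and on other platforms the protobuf package from the system package manager, or a build from source, works equally well:

    $ which protoc || echo "protoc is not installed"   # exit code 127 from the build means this finds nothing
    $ brew install protobuf                            # one possible install route on macOS (assumption); or build protobuf from source
    $ protoc --version                                 # should now print a libprotoc version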


    As expected, once Protocol Buffers was installed locally, the remaining two commands ran to completion. All that was left was to import the projects in the directory into Eclipse as the official guide describes; the source code can then be studied and debugged from within Eclipse.
  • Original post: https://www.cnblogs.com/adolfmc/p/3280307.html