To kick off this Hadoop series, we start, as is customary, by compiling the source code and importing it into Eclipse. That way, whenever we want to understand a component later, or a component breaks, we can go straight to the source.
Before compiling the Hadoop 2.4.1 source code, Maven and Ant must be installed. Hadoop also requires protoc 2.5.0, so download protobuf as well; I used protobuf-2.5.0.tar.bz2.
Before building and installing protoc, install a few dependency packages: gcc, gcc-c++, and make (skip any that are already installed):
yum install gcc
yum install gcc-c++
yum install make
yum install cmake
yum install openssl-devel
yum install ncurses-devel
Install protoc:
tar -xvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure --prefix=/opt/protoc/
make && make install
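With protoc installed under /opt/protoc/, its bin directory still has to be put on PATH so the Hadoop build can find it. A minimal sketch, assuming the --prefix used above (adjust if you chose a different prefix):

```shell
# protoc was installed with --prefix=/opt/protoc/ above; expose it on PATH.
# For a permanent setup, add this line to /etc/profile or ~/.bashrc instead.
export PATH=/opt/protoc/bin:$PATH

# Sanity check: Hadoop's build expects exactly protoc 2.5.0.
command -v protoc >/dev/null && protoc --version || echo "protoc not on PATH"
```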
On the Linux system, run the build command:

mvn install eclipse:eclipse -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
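Before kicking off the build (which takes a while), it is worth confirming the whole toolchain is actually reachable. A small check loop, with a hypothetical helper named have (the tool list reflects the requirements described above):

```shell
# have CMD : true if CMD is resolvable on PATH.
have() { command -v "$1" >/dev/null 2>&1; }

# The build needs Maven, Ant, protoc, and the C/C++ toolchain for -Pnative.
for t in mvn ant protoc gcc g++ cmake; do
  have "$t" || echo "missing: $t"
done
```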
After the build finishes, check the hadoop-dist target directory:
[root@localhost target]# ll
total 153824
drwxr-xr-x. 2 root root      4096 Jul  9 17:00 antrun
-rw-r--r--. 1 root root      4809 Jul  9 17:00 dist-layout-stitching.sh
-rw-r--r--. 1 root root       666 Jul  9 17:01 dist-tar-stitching.sh
drwxr-xr-x. 9 root root      4096 Jul  9 17:00 hadoop-3.0.0-SNAPSHOT
-rw-r--r--. 1 root root 157482988 Jul  9 17:01 hadoop-3.0.0-SNAPSHOT.tar.gz
-rw-r--r--. 1 root root      3445 Jul  9 17:01 hadoop-dist-3.0.0-SNAPSHOT.jar
drwxr-xr-x. 2 root root      4096 Jul  9 17:01 maven-archiver
drwxr-xr-x. 2 root root      4096 Jul  9 17:00 test-dir
[root@localhost target]# pwd
/home/fish/hadoop/hadoop-dist/target
Check the Hadoop version:
[root@localhost bin]# cd /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin
[root@localhost bin]# ./hadoop version
Hadoop 3.0.0-SNAPSHOT
Source code repository https://github.com/apache/hadoop.git -r e0febce0e74ec69597376774f771da46834c42b1
Compiled by root on 2015-07-09T08:53Z
Compiled with protoc 2.5.0
From source with checksum d69dd13fde158d22d95a263a0f12bc8
This command was run using /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/share/hadoop/common/hadoop-common-3.0.0-SNAPSHOT.jar
[root@localhost bin]# pwd
/home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin
Inspect the native libraries that were built:
[root@localhost hadoop-3.0.0-SNAPSHOT]# file lib//native/*
lib//native/libhadoop.a:          current ar archive
lib//native/libhadooppipes.a:     current ar archive
lib//native/libhadoop.so:         symbolic link to `libhadoop.so.1.0.0'
lib//native/libhadoop.so.1.0.0:   ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libhadooputils.a:     current ar archive
lib//native/libhdfs.a:            current ar archive
lib//native/libhdfs.so:           symbolic link to `libhdfs.so.0.0.0'
lib//native/libhdfs.so.0.0.0:     ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libnativetask.a:      current ar archive
lib//native/libnativetask.so:     symbolic link to `libnativetask.so.1.0.0'
lib//native/libnativetask.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
[root@localhost hadoop-3.0.0-SNAPSHOT]# pwd
/home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
Build problems
Problem 1:
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:3.0.0-SNAPSHOT: Failure to find org.apache.hadoop:hadoop-auth:jar:tests:3.0.0-SNAPSHOT in http://10.0.1.88:8081/nexus/content/repositories/thirdparty/ was cached in the local repository, resolution will not be reattempted until the update interval of thirdparty has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
Rename the stale marker files in the local .m2 repository:
mv /root/.m2/repository/org/apache/hadoop/hadoop-auth/3.0.0-SNAPSHOT/hadoop-auth-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-auth/3.0.0-SNAPSHOT/hadoop-auth-3.0.0-SNAPSHOT-tests.jar
mv /root/.m2/repository/org/apache/hadoop/hadoop-kms/3.0.0-SNAPSHOT/hadoop-kms-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-kms/3.0.0-SNAPSHOT/hadoop-kms-3.0.0-SNAPSHOT-tests.jar
mv /root/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar
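Renaming each marker by hand works, but a more common fix for this class of error (an alternative, not from the original walkthrough) is to delete the .lastUpdated markers wholesale and force Maven to re-resolve with -U, as the error message itself hints ("until ... updates are forced"):

```shell
# Drop every stale .lastUpdated marker under a local repository, then
# re-run the build with -U to force dependency updates.
# clean_last_updated is a hypothetical helper name.
clean_last_updated() {
  find "$1" -name '*.lastUpdated' -delete
}

# clean_last_updated /root/.m2/repository
# mvn install -U -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
```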
Problem 2:
Some other errors report that a jar cannot be downloaded. In that case, check the official repository at http://search.maven.org/ to see whether the artifact actually exists; if it does, the failure is probably a network issue, and re-running the build a few times usually gets through.
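The "run it a few more times" advice can be wrapped in a small retry helper (a sketch; retry is a name made up here, not part of Maven or Hadoop tooling):

```shell
# retry N CMD... : run CMD until it succeeds, at most N times.
retry() {
  n=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$n" ] && return 1
  done
}

# Example, with the build command from earlier:
# retry 3 mvn install -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
```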
Problem 3:
[root@localhost bin]# ./hadoop
: No such file or directory

The hadoop script was saved with DOS line endings; convert it to Unix format:
dos2unix /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin/hadoop
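If dos2unix is not installed, stripping the trailing carriage returns with sed achieves the same conversion. A small helper sketch (strip_cr is a hypothetical name; assumes GNU sed, as on most Linux systems):

```shell
# strip_cr FILE : remove trailing carriage returns (CRLF -> LF) in place.
strip_cr() {
  sed -i 's/\r$//' "$1"
}

# strip_cr /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin/hadoop
```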