- Set the hostname
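A minimal sketch for CentOS 6 (matching the hadoop-2.7.4-with-centos-6.7 tarball used below); `node001` is the first node's name used throughout this note — repeat with `node002`/`node003` on the other machines:

```shell
# Set the hostname for the current session (CentOS 6 style)
hostname node001

# Make it persistent across reboots by editing HOSTNAME=
# in /etc/sysconfig/network
sed -i 's/^HOSTNAME=.*/HOSTNAME=node001/' /etc/sysconfig/network
```

On CentOS 7 and later, `hostnamectl set-hostname node001` does both steps in one command.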
- Map each node's IP address to its hostname
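A sketch of the mapping, to be run on every node. The IP addresses below are placeholders — substitute the actual addresses of your three machines:

```shell
# Append the cluster hostname mapping to /etc/hosts.
# The 192.168.1.x addresses are examples only.
cat >> /etc/hosts <<'EOF'
192.168.1.101 node001
192.168.1.102 node002
192.168.1.103 node003
EOF
```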
- Configure passwordless SSH login
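One common way to do this, run on node001 (the node that will start the cluster); it assumes root is the install user, as in the scp commands later in this note:

```shell
# Generate an RSA key pair with an empty passphrase
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# Copy the public key to every node, including node001 itself
ssh-copy-id root@node001
ssh-copy-id root@node002
ssh-copy-id root@node003
```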
- Configure the firewall
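For a lab cluster the simplest approach is to disable the firewall on every node; on CentOS 6 that is iptables (a production cluster should instead open only the Hadoop ports):

```shell
# Stop iptables for the current session (CentOS 6)
service iptables stop
# Keep it from starting again on boot
chkconfig iptables off
```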
- Install the JDK
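A sketch that unpacks the JDK so JAVA_HOME matches the path used later in hadoop-env.sh. The tarball name is an assumption — use whatever JDK 8u65 archive you actually downloaded:

```shell
# Unpack the JDK under /export/servers/ so that it lands at
# /export/servers/jdk1.8.0_65 (the JAVA_HOME used below).
# The tarball filename here is an example.
tar -zxvf jdk-8u65-linux-x64.tar.gz -C /export/servers/

# Add JAVA_HOME to the environment
cat >> /etc/profile <<'EOF'
export JAVA_HOME=/export/servers/jdk1.8.0_65
export PATH=$PATH:$JAVA_HOME/bin
EOF
source /etc/profile
```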
- Upload the Hadoop tarball to /export/softwares/
- Unpack it: tar -zxvf hadoop-2.7.4-with-centos-6.7.tar.gz -C /export/servers/
- Edit the configuration files
- hadoop-env.sh

export JAVA_HOME=/export/servers/jdk1.8.0_65
- core-site.xml (inside the <configuration> element)

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://node001:9000</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/export/data/hadoopdata</value>
</property>
- hdfs-site.xml (inside the <configuration> element)

<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
<property>
  <name>dfs.namenode.secondary.http-address</name>
  <value>node002:50090</value>
</property>
- mapred-site.xml (inside the <configuration> element; rename the template first)

mv mapred-site.xml.template mapred-site.xml

<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
- yarn-site.xml (inside the <configuration> element)

<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>node001</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
- slaves

Delete localhost, then add:
node001
node002
node003
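In Hadoop 2.x the slaves file lives under the installation's etc/hadoop directory, so the step above can be done in one command:

```shell
# Overwrite the slaves file with the three worker hostnames,
# replacing the default "localhost" entry
cat > /export/servers/hadoop-2.7.4/etc/hadoop/slaves <<'EOF'
node001
node002
node003
EOF
```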
- Add Hadoop to the environment variables
vim /etc/profile

export HADOOP_HOME=/export/servers/hadoop-2.7.4
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
- Distribute Hadoop to the other two machines
scp -r /export/servers/hadoop-2.7.4/ root@node002:/export/servers/
scp -r /export/servers/hadoop-2.7.4/ root@node003:/export/servers/
scp /etc/profile root@node002:/etc/
scp /etc/profile root@node003:/etc/
- Reload the environment variables on all three machines
source /etc/profile
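Once the variables are reloaded, a quick sanity check on each node confirms that the PATH change took effect:

```shell
# Should print the Hadoop 2.7.4 version banner
hadoop version

# Should resolve to the installed copy under HADOOP_HOME
which hadoop
```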