Installation:
1. Copy hadoop-core-0.20.2-cdh3u6/contrib/eclipse-plugin/hadoop-eclipse-plugin-0.20.2-cdh3u6.jar into Eclipse's plugins directory
2. Restart Eclipse; you can then open the perspective via Window -> Open Perspective -> Other -> Map/Reduce
Configuration:
3. Open the view: Window -> Show View -> Other -> Map/Reduce Locations
4. Right-click the blue elephant entry, choose Edit, and fill in the parameters:
General tab:
    Location name: any name you like
    Under Map/Reduce Master:
        Host: IP address of the host the Master runs on
        Port: 9001
    Under DFS Master:
        Check "Use M/R Master host"
        Host: defaults to the same value as Host under "Map/Reduce Master"
        Port: 9000
    User name: the Linux user name on the Master, usually hadoop
Advanced parameters tab:
    The entries here must match the corresponding settings under conf in the Hadoop installation directory on the Master.
    Under conf:
        core-site.xml:
            fs.default.name: hdfs://172.16.11.74:9000
            hadoop.tmp.dir: /home/hadoop/tmp
        hdfs-site.xml:
            dfs.name.dir: /home/hadoop/dfs/name
            dfs.data.dir: /home/hadoop/dfs/data
    Reference values for this tab (a quick programmatic check of these settings is sketched right after this list):
        dfs.data.dir: /home/hadoop/dfs/data
        dfs.name.dir: /home/hadoop/dfs/name
        dfs.name.edits.dir: /home/hadoop/dfs/name
        eclipse.plug-in.jobtracker.host: 172.16.11.74
        eclipse.plug-in.jobtracker.port: 9001
        eclipse.plug-in.namenode.host: 172.16.11.74
        eclipse.plug-in.namenode.port: 9000
        fs.checkpoint.dir: /home/hadoop/dfs/
        fs.checkpoint.edits.dir: /home/hadoop/dfs/
        fs.default.name: hdfs://172.16.11.74:9000/
        mapred.job.tracker: 172.16.11.74:9001
        mapred.local.dir: /home/hadoop/tmp/mapred/local
        mapred.system.dir: /home/hadoop/tmp/mapred/system
        mapred.temp.dir: /home/hadoop/tmp/mapred/temp
        mapreduce.jobtracker.staging.root.dir: /home/hadoop/tmp/mapred/staging
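As a quick sanity check of these settings, the same values can be applied programmatically through the Hadoop Java API. Below is a minimal sketch, assuming the hadoop-core-0.20.2-cdh3u6 jar and its dependencies are on the classpath, using the example Master IP from above; ConnectionCheck is just an illustrative class name:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConnectionCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Same values as in the Advanced parameters tab above
            conf.set("fs.default.name", "hdfs://172.16.11.74:9000");  // DFS Master (NameNode)
            conf.set("mapred.job.tracker", "172.16.11.74:9001");      // Map/Reduce Master (JobTracker)
            FileSystem fs = FileSystem.get(conf);
            // exists() forces an RPC call to the NameNode, so wrong settings fail fast here
            System.out.println("Root exists: " + fs.exists(new Path("/")) + " via " + fs.getUri());
            fs.close();
        }
    }

If this program connects but the plugin does not, the problem is usually on the Eclipse side rather than in the cluster configuration.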
5. After this configuration you should be able to connect, but first confirm that the firewall on the Master's Linux host is turned off.
Check with /etc/init.d/iptables status; if it reports "Firewall is stopped." you can try to connect.
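If the connection still fails, it helps to distinguish a firewall problem from a wrong host/port. The sketch below (plain Java with no Hadoop dependencies; PortCheck is an illustrative name and the IP is the example Master address) simply checks whether the NameNode and JobTracker ports are reachable from the machine running Eclipse:

    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class PortCheck {
        public static void main(String[] args) throws Exception {
            String host = "172.16.11.74";             // Master IP from this guide
            for (int port : new int[] {9000, 9001}) { // NameNode and JobTracker ports
                Socket s = new Socket();
                try {
                    s.connect(new InetSocketAddress(host, port), 3000); // 3-second timeout
                    System.out.println("port " + port + " reachable");
                } catch (Exception e) {
                    System.out.println("port " + port + " NOT reachable: " + e.getMessage());
                } finally {
                    s.close();
                }
            }
        }
    }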
6. Once the connection succeeds, expand the DFS Locations entry to browse the directory structure of the Hadoop file system.
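What DFS Locations shows is essentially a listing of the HDFS namespace, so the same information can be read from code as a cross-check that the plugin and the cluster agree. A minimal sketch, under the same assumptions as the ConnectionCheck example above ("/" is simply the HDFS root):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListRoot {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://172.16.11.74:9000"); // NameNode, see step 4
            FileSystem fs = FileSystem.get(conf);
            // List the HDFS root, i.e. the top level that DFS Locations displays
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println((status.isDir() ? "dir  " : "file ") + status.getPath());
            }
            fs.close();
        }
    }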
Reference: document uploaded by laov: hadoop搭建与eclipse开发环境设置--已验证通过.docx ("Hadoop setup and Eclipse development environment configuration -- verified")