    Hadoop Installation
     
    1 Install the Java environment
    tar -xf jdk-8u131-linux-x64.tar.gz -C /usr/local/
    vi /etc/profile
    export JAVA_HOME=/usr/local/jdk1.8.0_131
    export CLASSPATH=$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar
    export PATH=${JAVA_HOME}/bin:$PATH
    source /etc/profile
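     
    A quick sanity check after sourcing the profile (a minimal sketch; the exact output depends on your JDK build):
    java -version        # should report version 1.8.0_131
    echo $JAVA_HOME      # /usr/local/jdk1.8.0_131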
     
    2 Set up SSH trust for passwordless login
    192.168.1.84 vm1 namenode
    192.168.1.85 vm2 datanode
    192.168.1.86 vm3 datanode
    Create the hadoop user (needed on the namenode and every datanode):
    useradd hadoop
    passwd hadoop
     
    Generate an SSH key pair on the namenode:
    # su - hadoop
    # ssh-keygen
    This creates id_rsa (private key) and id_rsa.pub (public key) under /home/hadoop/.ssh/.
    Copy the public key id_rsa.pub to every datanode.
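    One way to push the key from the namenode (a sketch run as the hadoop user; ssh-copy-id appends the key to authorized_keys for you, otherwise scp the file over and append it by hand as described below):
    ssh-copy-id hadoop@vm2
    ssh-copy-id hadoop@vm3
    # or: scp ~/.ssh/id_rsa.pub hadoop@vm2:/home/hadoop/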
     
    On each datanode, paste the namenode's id_rsa.pub into authorized_keys:
    # vi /home/hadoop/.ssh/authorized_keys
    The permissions must be exactly 700 on the .ssh directory and 600 on authorized_keys:
    chmod 700 /home/hadoop/.ssh
    chmod 600 /home/hadoop/.ssh/authorized_keys
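     
    To confirm the trust works (a quick check, assuming vm2 and vm3 resolve to the addresses listed above):
    ssh hadoop@vm2 hostname   # should print vm2 without asking for a password
    ssh hadoop@vm3 hostname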
     
    3 Install Hadoop
    # su - hadoop
    # cd /home/hadoop/
    Download the Hadoop tarball from one of:
    https://archive.apache.org/dist/hadoop/common/
    https://mirrors.cnnic.cn/apache/hadoop/common/
    tar -xf hadoop-3.2.2.tar.gz
    cd /home/hadoop/hadoop-3.2.2/etc/hadoop/
     
    Edit the configuration files:
    vi hadoop-env.sh
    export JAVA_HOME=/usr/local/jdk1.8.0_131
     
    vi core-site.xml
    <configuration>
      <property><name>fs.defaultFS</name><value>hdfs://192.168.1.84:50070</value></property>
      <property><name>hadoop.tmp.dir</name><value>/home/hadoop/data/tmp</value></property>
      <property><name>fs.checkpoint.period</name><value>3600</value></property>
    </configuration>
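     
    The paths above (and the dfs.*.dir paths in hdfs-site.xml below) sit under the hadoop user's home directory. Creating them up front is an optional, assumed step that avoids permission surprises:
    mkdir -p /home/hadoop/data/tmp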
     
    vi hdfs-site.xml
    <configuration>
      <property><name>dfs.replication</name><value>2</value></property>
      <property><name>dfs.namenode.name.dir</name><value>/home/hadoop/data</value></property>
      <property><name>dfs.datanode.data.dir</name><value>/home/hadoop/data</value></property>
      <property><name>dfs.namenode.secondary.http-address</name><value>192.168.1.84:9001</value></property>
      <property><name>dfs.http.address</name><value>0.0.0.0:50070</value></property>
    </configuration>
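     
    Once the files are saved, a quick way to confirm Hadoop picks up a value is hdfs getconf, run from the unpacked tree (a sketch; requires the JAVA_HOME set in hadoop-env.sh above):
    /home/hadoop/hadoop-3.2.2/bin/hdfs getconf -confKey dfs.replication   # expect 2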
     
    vi mapred-site.xml
    <configuration>
      <property><name>mapred.job.tracker.http.address</name><value>0.0.0.0:50030</value></property>
      <property><name>mapred.task.tracker.http.address</name><value>0.0.0.0:50060</value></property>
      <property><name>mapreduce.framework.name</name><value>yarn</value></property>
      <property>
        <name>mapreduce.application.classpath</name>
        <value>
          /home/hadoop/hadoop-3.2.2/etc/hadoop,
          /home/hadoop/hadoop-3.2.2/share/hadoop/common/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/common/lib/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/hdfs/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/hdfs/lib/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/mapreduce/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/mapreduce/lib/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/yarn/*,
          /home/hadoop/hadoop-3.2.2/share/hadoop/yarn/lib/*
        </value>
      </property>
    </configuration>
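     
    As a cross-check on the long classpath value, the hadoop classpath command prints the directories and jars the installation itself puts on its classpath (the path shown assumes the tarball location used above):
    /home/hadoop/hadoop-3.2.2/bin/hadoop classpath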
     
    vi yarn-site.xml
    <configuration>
      <property><name>yarn.resourcemanager.hostname</name><value>vm1</value></property>
      <property><name>yarn.nodemanager.aux-services</name><value>mapreduce_shuffle</value></property>
    </configuration>
     
    vi /home/hadoop/hadoop-3.2.2/etc/hadoop/workers
    vm2
    vm3
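     
    Every datanode needs the same hadoop-3.2.2 tree and configuration. One minimal way to get it there, assuming the passwordless SSH from step 2 (this distribution step is an assumption, not spelled out above):
    scp -r /home/hadoop/hadoop-3.2.2 hadoop@vm2:/home/hadoop/
    scp -r /home/hadoop/hadoop-3.2.2 hadoop@vm3:/home/hadoop/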
     
     
    Original article: https://www.cnblogs.com/skyzy/p/16874466.html