• [Big Data] Installing HBase on Hadoop (repost)


    https://note.youdao.com/share/?id=c27485373a08517f7ad2e7ec901cd8d5&type=note#/

    Before installing, confirm that your HBase and Hadoop versions are compatible (the HBase reference guide publishes a Hadoop version support matrix).

    Environment configuration (to make these persistent, append them to ~/.bashrc and re-source it)

    export JAVA_HOME=/usr/local/jdk
    export HADOOP_HOME=/usr/local/hadoop
    export HBASE_HOME=/usr/local/hbase
    export HIVE_HOME=/usr/local/hive
    

    Hadoop

    Edit ${HADOOP_HOME}/etc/hadoop/hadoop-env.sh

    export JAVA_HOME=/usr/local/jdk
    

    Edit ${HADOOP_HOME}/etc/hadoop/core-site.xml

    <configuration>
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://localhost:9000</value>
        </property>
    </configuration>
    

    Edit ${HADOOP_HOME}/etc/hadoop/hdfs-site.xml

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
    </configuration>
    

    Test whether you can SSH into localhost without a password:

    ssh localhost
    

    If not, run the following commands:

    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 0600 ~/.ssh/authorized_keys
    

    Format the NameNode (run once, before the first start):

    ${HADOOP_HOME}/bin/hdfs namenode -format
    

    Start HDFS:

    ${HADOOP_HOME}/sbin/start-dfs.sh
    

    Stop HDFS:

    ${HADOOP_HOME}/sbin/stop-dfs.sh
    

    Web UI: http://localhost:50070/ (Hadoop 2.x; on Hadoop 3.x the NameNode UI moved to port 9870)

    HBase

    Edit ${HBASE_HOME}/conf/hbase-env.sh

    export JAVA_HOME=/usr/local/jdk
    

    Edit ${HBASE_HOME}/conf/hbase-site.xml

    <configuration>
      <property>
        <name>hbase.rootdir</name>
        <value>hdfs://localhost:9000/hbase</value>
      </property>
      <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
      </property>
      <property>
        <name>hbase.zookeeper.quorum</name>
        <value>localhost</value>
      </property>
    </configuration>
    

    Configure the server's /etc/hostname and /etc/hosts (get these right, otherwise remote clients will not be able to connect)
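
    For example, the entries might look like the following (the hostname hbase1 and the address 192.168.1.100 are placeholders; substitute your machine's actual name and a LAN-reachable IP):

    ```
    # /etc/hostname  (placeholder hostname)
    hbase1

    # /etc/hosts -- map the hostname to an address remote clients can reach,
    # not only to 127.0.0.1
    127.0.0.1        localhost
    192.168.1.100    hbase1
    ```

    If the hostname resolves only to 127.0.0.1, HBase advertises a loopback address in ZooKeeper and remote clients fail to connect.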

    Start HBase:

    ${HBASE_HOME}/bin/start-hbase.sh
    

    Stop HBase:

    ${HBASE_HOME}/bin/stop-hbase.sh
    

    Test from the HBase shell:

    ${HBASE_HOME}/bin/hbase shell
    
    create 'users','basic','business'                # table 'users' with column families 'basic' and 'business'
    put 'users','zhangsan','basic:name','张三'       # write one cell (row key, column, value)
    get 'users','zhangsan'                           # read all cells of row 'zhangsan'
    scan 'users'                                     # full-table scan
    put 'users','zhangsan','basic:sex','Male'
    put 'users','zhangsan','business:company','STW'
    put 'users','zhangsan','basic:sex','Female'      # newer version shadows 'Male'
    delete 'users','zhangsan','basic:sex'            # delete a single cell
    deleteall 'users','zhangsan'                     # delete the whole row
    count 'users'                                    # number of rows
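
    The shell session above exercises HBase's data model: a sparse map of row key to column (family:qualifier) to versioned values, where a newer put shadows the older version and deletes remove a cell or a whole row. A toy in-memory sketch of that behavior in plain Python (no HBase required; the class and method names are made up for illustration):

    ```python
    import time

    class ToyTable:
        """Minimal stand-in for an HBase table: row -> column -> versioned cells."""

        def __init__(self, *families):
            self.families = set(families)
            self.rows = {}  # row_key -> {"family:qualifier": [(timestamp, value), ...]}

        def put(self, row, column, value):
            family = column.split(":")[0]
            if family not in self.families:
                raise KeyError(f"unknown column family: {family}")
            # Versions accumulate; reads return the newest one.
            self.rows.setdefault(row, {}).setdefault(column, []).append((time.time(), value))

        def get(self, row):
            # Like `get 'users','zhangsan'`: newest value per column.
            return {col: cells[-1][1] for col, cells in self.rows.get(row, {}).items()}

        def delete(self, row, column):
            # Like `delete`: remove a single cell.
            if row in self.rows:
                self.rows[row].pop(column, None)

        def deleteall(self, row):
            # Like `deleteall`: remove the whole row.
            self.rows.pop(row, None)

        def count(self):
            # Like `count`: number of rows.
            return len(self.rows)

    users = ToyTable("basic", "business")
    users.put("zhangsan", "basic:name", "张三")
    users.put("zhangsan", "basic:sex", "Male")
    users.put("zhangsan", "business:company", "STW")
    users.put("zhangsan", "basic:sex", "Female")  # newer version shadows "Male"
    print(users.get("zhangsan")["basic:sex"])     # -> Female
    users.deleteall("zhangsan")
    print(users.count())                          # -> 0
    ```

    Real HBase keeps multiple versions per cell on disk and prunes them by configuration; this sketch only models the "newest version wins" read behavior.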
    

    Hive

    Create the HDFS directories Hive expects (on Hive 2.x and later you also need to initialize the metastore schema once, e.g. ${HIVE_HOME}/bin/schematool -dbType derby -initSchema):

    ${HADOOP_HOME}/bin/hadoop fs -mkdir       /tmp
    ${HADOOP_HOME}/bin/hadoop fs -mkdir       /user
    ${HADOOP_HOME}/bin/hadoop fs -mkdir       /user/hive
    ${HADOOP_HOME}/bin/hadoop fs -mkdir       /user/hive/warehouse
    ${HADOOP_HOME}/bin/hadoop fs -chmod g+w   /tmp
    ${HADOOP_HOME}/bin/hadoop fs -chmod g+w   /user/hive/warehouse
• Original article: https://www.cnblogs.com/quietwalk/p/8032414.html