• HBase shell notes


    Cluster plan: nn1.hadoop   nn2.hadoop   s1.hadoop   s2.hadoop


    ZooKeeper:        nn1   nn2   s1
    JournalNode:      nn1   nn2   s1
    DataNode:         nn1   nn2   s1   s2
    NodeManager:      nn1   nn2   s1   s2
    NameNode:         nn1   nn2
    ResourceManager:  s1    s2
    ZKFC:             nn1   nn2
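
    For reference, a minimal /etc/hosts sketch for this layout; the IP-to-host
    mapping below is only an assumption inferred from the web UI addresses
    further down, so verify it against your own network:

    # /etc/hosts (assumed mapping)
    192.168.80.166   nn1.hadoop   nn1
    192.168.80.167   nn2.hadoop   nn2
    192.168.80.168   s1.hadoop    s1      # assumed, not stated in these notes
    192.168.80.169   s2.hadoop    s2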

    yum -y install svn ncurses-devel gcc* lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel bzip2


    Startup order:

    zkServer.sh start        // nn1 nn2 s1
    hadoop-daemon.sh start journalnode     //nn1 nn2 s1

    Format the NameNode (on nn1.hadoop):
    hadoop namenode -format      //nn1
    cd /data
    scp -r hadoopdata/ nn2.hadoop:$PWD

    Format ZKFC (on nn1):
    hdfs zkfc -formatZK
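
    To confirm the HA znode was created, a quick sanity check in the ZooKeeper
    CLI (the entry under /hadoop-ha is whatever nameservice you configured in
    dfs.nameservices):

    zkCli.sh -server nn1.hadoop:2181
    # inside the ZooKeeper shell:
    ls /hadoop-ha        # should list the HDFS nameservice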

    Start HDFS (on nn1):
    start-dfs.sh        # or: hadoop-daemon.sh start namenode

    nn2:
    hadoop namenode -bootstrapStandby
    hadoop-daemon.sh start namenode


    s1/s2:
    hadoop-daemon.sh start datanode

    s2:
    start-yarn.sh        # starts the ResourceManager here plus the NodeManagers

    s1:
    yarn-daemon.sh start resourcemanager      # the second ResourceManager
    yarn-daemon.sh stop resourcemanager

    nn1:
    start-yarn.sh
    stop-yarn.sh

    On nn1, start the MapReduce job history server:
    mr-jobhistory-daemon.sh start historyserver
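
    After the full startup sequence, a rough way to verify is jps on each node.
    The daemon names below are the standard process names; the exact set per
    node follows the role layout above:

    jps        # on nn1, expect something like:
    # QuorumPeerMain, JournalNode, NameNode, DFSZKFailoverController,
    # DataNode, NodeManager, JobHistoryServer

    jps        # on s2, expect something like:
    # DataNode, NodeManager, ResourceManager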


    Check the state of each HA master:
    HDFS:
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2

    YARN:
    yarn rmadmin -getServiceState rm1
    yarn rmadmin -getServiceState rm2
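
    Each command should report active on one node and standby on the other.
    For a manual switch, a hedged sketch (with automatic failover/zkfc enabled,
    manual transitions may be refused unless forced):

    hdfs haadmin -failover nn1 nn2        # ask nn2 to become active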


    hadoop-daemon.sh start namenode
    hadoop-daemon.sh stop namenode

    hadoop-daemon.sh start datanode
    hadoop-daemon.sh stop datanode

    Create a directory:
    hdfs dfs -mkdir /hadoop/

    Upload a file:
    hadoop fs -put /opt/tools/CentOS-7-x86_64-Minimal-1908.iso /hadoop/
    hadoop fs -ls /hadoop/

    Delete a file:
    hdfs dfs -rmr /hadoop/CentOS-7-x86_64-Minimal-1908.iso
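
    A few more everyday HDFS commands for the same kind of housekeeping
    (-rmr still works but is deprecated in favor of -rm -r):

    hdfs dfs -ls -R /hadoop/                                      # recursive listing
    hdfs dfs -du -h /hadoop/                                      # space used, human readable
    hdfs dfs -get /hadoop/CentOS-7-x86_64-Minimal-1908.iso /tmp/  # download to local
    hdfs dfs -rm -r /hadoop/                                      # recursive delete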


    http://192.168.80.166:50070                 # NameNode web UI
    http://192.168.80.167:50070                 # NameNode web UI
    http://192.168.80.166:19888/jobhistory      # MapReduce JobHistory UI

    http://192.168.80.169:8088/cluster          # YARN ResourceManager UI


    HBase mirrors:
    http://mirrors.hust.edu.cn/apache/hbase/

    HBase installation guide:
    https://www.cnblogs.com/qingyunzong/p/8668880.html


    HBase introduction:
    https://www.cnblogs.com/qingyunzong/p/8665698.html

    #HBase
    export HBASE_HOME=/usr/local/hbase-1.3.6
    export PATH=$PATH:$HBASE_HOME/bin


    sudo chown -R hadoop:hadoop hbase-1.3.6
    sudo ln -s hbase-1.3.6 hbase
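
    After setting HBASE_HOME and the symlink, a quick sanity check (assumes the
    exports above were added to /etc/profile or your shell profile):

    source /etc/profile      # or re-login so HBASE_HOME/PATH take effect
    hbase version            # should print the HBase 1.3.6 build info
    which hbase              # should resolve under /usr/local/hbase-1.3.6/bin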


    http://192.168.80.166:16010/master-status      # HBase Master web UI (default port 16010)

    http://192.168.80.169:16000/master-status

    Start HBase daemons manually:
    hbase-daemon.sh start master
    hbase-daemon.sh start regionserver
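
    For a whole-cluster start/stop instead of per-daemon scripts (region servers
    are read from conf/regionservers, extra masters from conf/backup-masters):

    start-hbase.sh        # HMaster plus all region servers
    stop-hbase.sh
    jps                   # look for HMaster / HRegionServer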


    Getting help inside the hbase shell:
    > help
    > help 'create'
    > help 'dml'
    > help 'get'
    > help 'list'

    Table operations: create, list, desc (describe), drop, truncate, and alter.

    >create 't1', {NAME => 'f1'}, {NAME => 'f2'}, {NAME => 'f3'}

    Create a table:
    >create 'myHbase',{NAME => 'myCard',VERSIONS => 5}

    >list

    Describe the table structure:
    >desc 'myHbase'

    Alter: add a column family (myInfo):
    >alter 'myHbase', NAME => 'myInfo'

    Delete a column family:
    >alter 'myHbase', NAME => 'myCard', METHOD => 'delete'

    Delete a column family (shorthand form):
    >alter 'myHbase', 'delete' => 'myCard'


    Add column family hehe and delete column family myInfo in one statement:
    > alter 'myHbase', {NAME => 'hehe'}, {NAME => 'myInfo', METHOD => 'delete'}

    Truncate the table:
    >truncate 'myHbase'

    Drop the table (it must be disabled first):
    >disable 'myHbase'
    > drop 'myHbase'
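
    A few related table-management commands that pair with the ones above
    (all standard hbase shell commands):

    > exists 'myHbase'         # does the table exist?
    > is_enabled 'myHbase'     # check enable state before dropping
    > enable 'myHbase'         # re-enable after a disable
    > status                   # cluster summary: servers, regions, load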


    Data operations on HBase tables
    Insert (put), delete (delete), read (get + scan); an update is just another put on the same cell.

    >create 'user_info',{NAME=>'base_info',VERSIONS=>3 },{NAME=>'extra_info',VERSIONS=>1 }
    >put 'user_info', 'user0001', 'base_info:name', 'zhangsan1'

    put 'user_info', 'zhangsan_20150701_0001', 'base_info:name', 'zhangsan1'
    put 'user_info', 'zhangsan_20150701_0002', 'base_info:name', 'zhangsan2'
    put 'user_info', 'zhangsan_20150701_0003', 'base_info:name', 'zhangsan3'
    put 'user_info', 'zhangsan_20150701_0004', 'base_info:name', 'zhangsan4'
    put 'user_info', 'zhangsan_20150701_0005', 'base_info:name', 'zhangsan5'
    put 'user_info', 'zhangsan_20150701_0006', 'base_info:name', 'zhangsan6'
    put 'user_info', 'zhangsan_20150701_0007', 'base_info:name', 'zhangsan7'
    put 'user_info', 'zhangsan_20150701_0008', 'base_info:name', 'zhangsan8'

    put 'user_info', 'zhangsan_20150701_0001', 'base_info:age', '21'
    put 'user_info', 'zhangsan_20150701_0002', 'base_info:age', '22'
    put 'user_info', 'zhangsan_20150701_0003', 'base_info:age', '23'
    put 'user_info', 'zhangsan_20150701_0004', 'base_info:age', '24'
    put 'user_info', 'zhangsan_20150701_0005', 'base_info:age', '25'
    put 'user_info', 'zhangsan_20150701_0006', 'base_info:age', '26'
    put 'user_info', 'zhangsan_20150701_0007', 'base_info:age', '27'
    put 'user_info', 'zhangsan_20150701_0008', 'base_info:age', '28'

    put 'user_info', 'zhangsan_20150701_0001', 'extra_info:Hobbies', 'music'
    put 'user_info', 'zhangsan_20150701_0002', 'extra_info:Hobbies', 'sport'
    put 'user_info', 'zhangsan_20150701_0003', 'extra_info:Hobbies', 'music'
    put 'user_info', 'zhangsan_20150701_0004', 'extra_info:Hobbies', 'sport'
    put 'user_info', 'zhangsan_20150701_0005', 'extra_info:Hobbies', 'music'
    put 'user_info', 'zhangsan_20150701_0006', 'extra_info:Hobbies', 'sport'
    put 'user_info', 'zhangsan_20150701_0007', 'extra_info:Hobbies', 'music'

    put 'user_info', 'baiyc_20150716_0001', 'base_info:name', 'baiyc1'
    put 'user_info', 'baiyc_20150716_0002', 'base_info:name', 'baiyc2'
    put 'user_info', 'baiyc_20150716_0003', 'base_info:name', 'baiyc3'
    put 'user_info', 'baiyc_20150716_0004', 'base_info:name', 'baiyc4'
    put 'user_info', 'baiyc_20150716_0005', 'base_info:name', 'baiyc5'
    put 'user_info', 'baiyc_20150716_0006', 'base_info:name', 'baiyc6'
    put 'user_info', 'baiyc_20150716_0007', 'base_info:name', 'baiyc7'
    put 'user_info', 'baiyc_20150716_0008', 'base_info:name', 'baiyc8'

    put 'user_info', 'baiyc_20150716_0001', 'base_info:age', '21'
    put 'user_info', 'baiyc_20150716_0002', 'base_info:age', '22'
    put 'user_info', 'baiyc_20150716_0003', 'base_info:age', '23'
    put 'user_info', 'baiyc_20150716_0004', 'base_info:age', '24'
    put 'user_info', 'baiyc_20150716_0005', 'base_info:age', '25'
    put 'user_info', 'baiyc_20150716_0006', 'base_info:age', '26'
    put 'user_info', 'baiyc_20150716_0007', 'base_info:age', '27'
    put 'user_info', 'baiyc_20150716_0008', 'base_info:age', '28'

    put 'user_info', 'baiyc_20150716_0001', 'extra_info:Hobbies', 'music'
    put 'user_info', 'baiyc_20150716_0002', 'extra_info:Hobbies', 'sport'
    put 'user_info', 'baiyc_20150716_0003', 'extra_info:Hobbies', 'music'
    put 'user_info', 'baiyc_20150716_0004', 'extra_info:Hobbies', 'sport'
    put 'user_info', 'baiyc_20150716_0005', 'extra_info:Hobbies', 'music'
    put 'user_info', 'baiyc_20150716_0006', 'extra_info:Hobbies', 'sport'
    put 'user_info', 'baiyc_20150716_0007', 'extra_info:Hobbies', 'music'
    put 'user_info', 'baiyc_20150716_0008', 'extra_info:Hobbies', 'sport'

    > get 'user_info', 'user0001'
    > get 'user_info', 'rk0001', 'base_info'
    > scan 'user_info'
    > scan 'user_info', {COLUMNS => 'base_info'}
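
    Since base_info was created with VERSIONS => 3, get can return older
    versions, and scan accepts row-range and limit options; a small sketch:

    > get 'user_info', 'zhangsan_20150701_0001', {COLUMN => 'base_info:name', VERSIONS => 3}
    > scan 'user_info', {COLUMNS => ['base_info:name', 'base_info:age'], LIMIT => 5}
    > scan 'user_info', {STARTROW => 'zhangsan_20150701_0001', STOPROW => 'zhangsan_20150701_0005'}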


    Delete the cell in user_info with row key rk0001 and column base_info:name:

    > delete 'user_info', 'rk0001', 'base_info:name'
    > scan 'user_info', {COLUMNS => 'base_info'}
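
    Two related commands worth knowing: deleteall removes a whole row, and
    count gives a row count by scanning the table:

    > deleteall 'user_info', 'rk0001'     # delete every cell in the row
    > count 'user_info'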


    HBase Java API operations:
    https://www.cnblogs.com/qingyunzong/p/8671804.html

    For how to pull in the required jars, see:
    http://www.cnblogs.com/qingyunzong/p/8623309.html

    Using MapReduce with HBase:
    https://www.cnblogs.com/qingyunzong/p/8681490.html


    A recommended HBase blog:
    http://hbasefly.com
