• Hive Installation


    1. Installing Hive

    System environment
    Once the Hadoop environment is set up, Hive can be installed on the NameNode machine (NameNode-82).

    Download: hive-0.9.0.tar.gz
    Extract to: /home/hadoop/hive
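    A minimal sketch of the download-and-extract step, assuming the tarball sits in /home/hadoop and that the extracted directory is named hive-0.9.0 (the exact name may differ depending on the release archive):

    cd /home/hadoop
    # unpack the release and rename the extracted directory to the path used in this guide
    tar -zxf hive-0.9.0.tar.gz
    mv hive-0.9.0 hive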


    Hive configuration


    [root@NameNode-82 ~]# cd /home/hadoop/hive/conf
    [root@NameNode-82 conf]# cp hive-default.xml.template hive-site.xml
    [root@NameNode-82 conf]# cp hive-log4j.properties.template hive-log4j.properties
    [root@NameNode-82 conf]# cp hive-env.sh.template hive-env.sh


    Configure hive-env.sh
    vi hive-env.sh
    HADOOP_HOME=/home/hadoop/hadoop
    export HIVE_CONF_DIR=/home/hadoop/hive/conf
    export HIVE_AUX_JARS_PATH=/home/hadoop/hive/lib
    source hive-env.sh
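    As a quick sanity check, the paths referenced in hive-env.sh should all exist before Hive is started; a rough check:

    # all three directories should be listed without errors
    ls -d /home/hadoop/hadoop /home/hadoop/hive/conf /home/hadoop/hive/lib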


    Modify the hive-site.xml configuration file so that Hive's metadata is stored in MySQL.

    vi conf/hive-site.xml

    <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive_metadata?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
    </property>


    <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
    </property>


    <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>username to use against metastore database</description>
    </property>


    <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
    </property>


    <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
    </property>
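    With these properties in place, it can help to confirm that the MySQL instance referenced by javax.jdo.option.ConnectionURL is reachable with the configured credentials (the hive user and hive_metadata database are created in the MySQL step further below). A rough check, assuming the mysql client is installed on this machine:

    # an empty result set is expected on a freshly created metastore database
    mysql -h localhost -u hive -phive hive_metadata -e "SHOW TABLES;"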


    Environment variable setup

    vi /etc/profile

    export JAVA_HOME=/usr/local/jdk1.8.0_45
    export JRE_HOME=/usr/local/jdk1.8.0_45/jre
    export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
    export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH

    export HADOOP_HOME=/home/hadoop/hadoop
    export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

    export HBASE_HOME=/home/hadoop/hbase
    export PATH=$PATH:$HBASE_HOME/bin

    export HIVE_HOME=/home/hadoop/hive
    export PATH=$PATH:$HIVE_HOME/bin
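    After editing /etc/profile, reload it in the current shell and confirm that the hive command resolves; a quick check:

    source /etc/profile
    # both commands should print the paths configured above
    echo $HIVE_HOME
    which hive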

    Create the required directories on HDFS

    hadoop fs -mkdir /tmp
    hadoop fs -mkdir /user/hive/warehouse
    hadoop fs -chmod g+w /tmp
    hadoop fs -chmod g+w /user/hive/warehouse
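    A quick way to confirm the directories exist with the expected group-writable permissions (run as the user that administers HDFS):

    # the permission column should include the group write bit, e.g. drwxrwxr-x
    hadoop fs -ls /
    hadoop fs -ls /user/hive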


    Create the database in MySQL

    create database hive_metadata;
    grant all on hive_metadata.* to hive@'%' identified by 'hive';
    grant all on hive_metadata.* to hive@localhost identified by 'hive';
    ALTER DATABASE hive_metadata CHARACTER SET latin1;
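    To confirm the grants took effect, log in as the newly created hive user; the hive_metadata database should appear in the output:

    mysql -u hive -phive -e "SHOW DATABASES;"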


    Manually copy the MySQL JDBC driver into hive/lib
    ls /home/hadoop/hive/lib
    mysql-connector-java-5.1.22-bin.jar
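    A sketch of that copy step, assuming the connector jar has already been downloaded to the current directory (the exact version may differ):

    # put the MySQL JDBC driver on Hive's classpath
    cp mysql-connector-java-5.1.22-bin.jar /home/hadoop/hive/lib/
    ls /home/hadoop/hive/lib | grep mysql-connector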

    Start Hive

    # Start the metastore service
    hive --service metastore 
    Starting Hive Metastore Server

    # Start the hiveserver service
    hive --service hiveserver 
    Starting Hive Thrift Server
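    Both commands above run in the foreground; in practice the services are usually started in the background and then checked by port (the metastore listens on 9083 and the Thrift server on 10000 by default). A rough sketch:

    # start both services in the background and keep their logs
    nohup hive --service metastore  > /home/hadoop/hive/metastore.log  2>&1 &
    nohup hive --service hiveserver > /home/hadoop/hive/hiveserver.log 2>&1 &

    # default ports: 9083 (metastore), 10000 (hiveserver)
    netstat -nltp | grep -E '9083|10000'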


    # Start the Hive client
    hive shell
    Logging initialized using configuration in file:/root/hive-0.9.0/conf/hive-log4j.properties
    Hive history file=/tmp/root/hive_job_log_root_201211141845_1864939641.txt

    hive> show tables;
    OK
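    As an end-to-end smoke test, a throwaway table can also be created and dropped non-interactively with hive -e (the table name here is just an example):

    # create a test table, confirm it is registered in the metastore, then drop it
    hive -e "CREATE TABLE smoke_test (id INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';"
    hive -e "SHOW TABLES;"
    hive -e "DROP TABLE smoke_test;"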

  • Original post: https://www.cnblogs.com/slgkaifa/p/6971508.html