Hive depends on Hadoop.
Required packages: hive-0.13.1-cdh5.3.6.tar.gz and hadoop-2.5.0-cdh5.3.6.tar.gz
1. For the Hadoop installation steps, see:
http://www.cnblogs.com/xningge/articles/8433297.html
2. Upload the Hive package to the designated directory on the Linux system: /opt/softwares/cdh
3. Extract hive-0.13.1-cdh5.3.6.tar.gz into the target directory /opt/modules/cdh/:
tar -zxvf hive-0.13.1-cdh5.3.6.tar.gz -C /opt/modules/cdh/
4. Rename hive-env.sh (drop the .template suffix) and set:
HADOOP_HOME=/opt/modules/cdh/hadoop-2.5.0-cdh5.3.6
export HIVE_CONF_DIR=/opt/modules/cdh/hive-0.13.1-cdh5.3.6/conf
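The rename and the two settings above can be scripted roughly as follows. A throwaway sandbox directory stands in for the real conf/ directory here so the commands are safe to dry-run; on an actual node, run the same `mv` and append inside /opt/modules/cdh/hive-0.13.1-cdh5.3.6/conf.

```shell
# Sandbox stand-in for the Hive conf/ directory (replace with the real path).
conf=$(mktemp -d)
touch "$conf/hive-env.sh.template"   # stand-in for the template shipped with Hive

# Drop the .template suffix, then append the two environment settings.
mv "$conf/hive-env.sh.template" "$conf/hive-env.sh"
cat >> "$conf/hive-env.sh" <<'EOF'
HADOOP_HOME=/opt/modules/cdh/hadoop-2.5.0-cdh5.3.6
export HIVE_CONF_DIR=/opt/modules/cdh/hive-0.13.1-cdh5.3.6/conf
EOF

grep '^export HIVE_CONF_DIR' "$conf/hive-env.sh"
```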
5. Rename the template to hive-site.xml (drop the .template suffix and change "default" to "site"), then configure the metastore connection properties:
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://hadoop01.xningge.com:3306/cdhmetastore?createDatabaseIfNotExist=true</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>xningge</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>???</value>
</property>
**Note: the template file has a bug: line 2781 of hive-site.xml is missing an opening <property> tag.
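Because the bug above is a missing opening <property> tag, the open/close tag counts in the file no longer match, which a plain grep comparison can flag without any extra tooling. The snippet below demonstrates the check on a tiny stand-in file that reproduces the unbalanced-tag shape; on a real node, point $site at conf/hive-site.xml instead.

```shell
# Stand-in file reproducing the bug: a </property> close with no matching open.
site=$(mktemp)
cat > "$site" <<'EOF'
<configuration>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
</configuration>
EOF

# Count opening vs closing tags; a mismatch means the file is malformed.
opens=$(grep -c '<property>' "$site")
closes=$(grep -c '</property>' "$site")
[ "$opens" -eq "$closes" ] || echo "unbalanced <property> tags: $opens open vs $closes close"
```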
6. Rename hive-log4j.properties (drop the .template suffix)
$ mkdir logs
** Edit hive-log4j.properties:
hive.log.dir=/opt/modules/cdh/hive-0.13.1-cdh5.3.6/logs
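Step 6 can likewise be done in two commands, sketched below against a sandbox directory; the template's stock hive.log.dir line used here is only a stand-in. Run the same `mv` and `sed` inside the real conf/ directory.

```shell
# Sandbox stand-in for the Hive conf/ directory (replace with the real path).
conf=$(mktemp -d)
printf 'hive.log.dir=${java.io.tmpdir}/${user.name}\n' > "$conf/hive-log4j.properties.template"

# Drop the .template suffix, then point hive.log.dir at the logs directory.
mv "$conf/hive-log4j.properties.template" "$conf/hive-log4j.properties"
sed -i 's|^hive.log.dir=.*|hive.log.dir=/opt/modules/cdh/hive-0.13.1-cdh5.3.6/logs|' \
  "$conf/hive-log4j.properties"

grep '^hive.log.dir=' "$conf/hive-log4j.properties"
```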
7. Copy the JDBC driver
** Copy the MySQL driver jar into lib/:
$ cp /opt/software/mysql-connector-java-5.1.27-bin.jar lib/
8. Switch to the CDH Hadoop directory, create the warehouse directory, and set its permissions
** /user/hive/warehouse is the directory where Hive stores its data
$ bin/hdfs dfs -mkdir -p /user/hive/warehouse
$ bin/hdfs dfs -chmod g+w /user/hive/warehouse
$ bin/hdfs dfs -chmod g+w /tmp
9. Start the client and use Hive
Note: if HIVE_HOME is configured as a global variable in the profile, comment it out first.
$ bin/hive
hive> show databases;
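The "comment out HIVE_HOME" note in step 9 can be done with a one-line sed, sketched below on a stand-in file; on a real node, point $profile at /etc/profile (editing it requires root) and then re-source it.

```shell
# Stand-in for /etc/profile with a leftover HIVE_HOME setting.
profile=$(mktemp)
printf 'export HIVE_HOME=/some/old/path\n' > "$profile"

# Prefix the HIVE_HOME export with '#' to comment it out ('&' = the matched text).
sed -i 's|^export HIVE_HOME|#&|' "$profile"

grep 'HIVE_HOME' "$profile"
```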