• Using HBase in MapReduce jobs on CDH 5.1.3


    Environment: CentOS 6.5, CDH 5.1.3

    1. The hadoop command cannot find the HBase-related classes

    (1) Inspect the output of hadoop classpath:

    1. The classpath includes /etc/hadoop/conf, the directory of the configuration files Hadoop is currently using.

    2. The other entries end with a * wildcard, so every jar under those directories is picked up (see the sketch below).
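
    The classpath is printed as one long colon-separated string, so splitting it onto separate lines makes it easier to read and to confirm that nothing HBase-related is on it yet. A minimal illustration; the exact entries depend on your install:

      # print each classpath entry on its own line
      hadoop classpath | tr ':' '\n'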

    (2) Locate the HBase-related jar files
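
    On a parcel-based CDH install the HBase jars sit under the parcel directory. The path below is the versioned parcel path that appears in the hadoop-env.sh later in this post; adjust it to your own parcel version and location:

      # jars of HBase itself and of its bundled dependencies
      ls /opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/*.jar
      ls /opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/lib/*.jar

      # or search the parcel tree if you are not sure where they are
      find /opt/cloudera-manager/cloudera/parcels -name 'hbase*.jar' 2>/dev/null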

    (3) Modify hadoop-env.sh

    Open /etc/hadoop/conf/hadoop-env.sh and add the jar paths found in the previous step to HADOOP_CLASSPATH.

    1. Pointing HADOOP_CLASSPATH at the directory alone does not work; the path has to end with "/*" so the jars inside it are picked up (a minimal sketch follows this list).

    2. The hadoop classpath command always reflects the current contents of /etc/hadoop/conf/hadoop-env.sh, so you can re-run it at any time to check the change.
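
    A minimal sketch of the difference, using the parcel path from this post (substitute your own):

      # does NOT work: the bare directory is not expanded into the jars inside it
      HADOOP_CLASSPATH=/opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/lib

      # works: the trailing /* makes every jar in that directory visible to the hadoop command
      HADOOP_CLASSPATH=/opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/lib/*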

    After the modification is done:

     The modified hadoop-env.sh (relevant section) is attached below:

      export HADOOP_MAPRED_HOME=$( ([[ ! '/opt/cloudera-manager/cloudera/parcels/CDH/lib/hadoop-mapreduce' =~ CDH_MR2_HOME ]] && echo /opt/cloudera-manager/cloudera/parcels/CDH/lib/hadoop-mapreduce ) || echo ${CDH_MR2_HOME:-/usr/lib/hadoop-mapreduce/} )
      HADOOP_CLASSPATH=/usr/share/cmf/lib/cdh5/*:/opt/cloudera-manager/cm-5.1.3/share/cmf/lib/cdh5/*:/opt/cloudera-manager/cm-5.1.3/share/cmf/cloudera-navigator-server/libs/cdh5/*:/opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/*:/opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/lib/*:/opt/cloudera-manager/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase/*:
      # JAVA_LIBRARY_PATH={{JAVA_LIBRARY_PATH}}
      export YARN_OPTS="-Xms825955249 -Xmx825955249 -Djava.net.preferIPv4Stack=true $YARN_OPTS"
      export HADOOP_CLIENT_OPTS="-Djava.net.preferIPv4Stack=true $HADOOP_CLIENT_OPTS"
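
    A quick way to double-check the result (output will differ per cluster); the second command is an illustrative trick that relies on the hadoop script running an arbitrary class name with the same classpath it gives to jobs:

      # the HBase jars should now appear among the classpath entries
      hadoop classpath | tr ':' '\n' | grep -i hbase

      # if the classpath is correct, this prints the HBase version instead of a ClassNotFoundException
      hadoop org.apache.hadoop.hbase.util.VersionInfo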
     

    2. Specify the dependency jars in the job code

    To run MapReduce jobs that use HBase, you need to add the HBase and Zookeeper JAR files to the Hadoop Java classpath. You can do this by adding the following statement to each job:

    TableMapReduceUtil.addDependencyJars(job); 

     


    Even with this statement added, you still need to modify /etc/hadoop/conf/hadoop-env.sh as described in the previous step.

     

    Reference:

    http://www.cloudera.com/content/cloudera/en/documentation/cdh5/v5-0-0/CDH5-Installation-Guide/cdh5ig_mapreduce_hbase.html

     

    Part of the code is attached below (the mapper and reducer classes are omitted):

      import java.text.SimpleDateFormat;
      import java.util.Date;
      
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.hbase.client.HTableUtil;
      import org.apache.hadoop.hbase.client.Put;
      import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
      import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
      import org.apache.hadoop.hbase.mapreduce.TableReducer;
      import org.apache.hadoop.hbase.util.Bytes;
      import org.apache.hadoop.io.LongWritable;
      import org.apache.hadoop.io.NullWritable;
      import org.apache.hadoop.io.Text;
      import org.apache.hadoop.mapreduce.Job;
      import org.apache.hadoop.mapreduce.Mapper;
      import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
      import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
      
      public class BatchImport {
      
          // Note: BatchImportMapper and BatchImportReducer are part of the omitted code;
          // some of the imports above (Put, Bytes, TableReducer, ...) belong to those classes.
          public static void main(String[] args) throws Exception {
              final Configuration configuration = new Configuration();
              // ZooKeeper quorum used by the HBase client
              configuration.set("hbase.zookeeper.quorum", "192.168.1.170:2181");
              // target HBase table for TableOutputFormat
              configuration.set(TableOutputFormat.OUTPUT_TABLE, "ww_log");
      
              final Job job = new Job(configuration, BatchImport.class.getSimpleName());
              // ship the HBase/ZooKeeper dependency jars with the job
              TableMapReduceUtil.addDependencyJars(job);
              job.setJarByClass(BatchImport.class);
              job.setMapperClass(BatchImportMapper.class);
              job.setReducerClass(BatchImportReducer.class);
              // set the map output types; the reduce output is handled by TableOutputFormat
              job.setMapOutputKeyClass(LongWritable.class);
              job.setMapOutputValueClass(Text.class);
      
              job.setInputFormatClass(TextInputFormat.class);
              // no output path is set; the output format writes directly into HBase
              job.setOutputFormatClass(TableOutputFormat.class);
      
              FileInputFormat.setInputPaths(job, "hdfs://192.168.1.170:8020/data/ww_log");
      
              job.waitForCompletion(true);
          }
      }
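
    Two follow-up steps, sketched with assumed names: TableOutputFormat only writes to a table that already exists, so ww_log has to be created first (the column family f1 below is just a placeholder; use whatever the omitted reducer actually writes to), and batch-import.jar is likewise only an example name for the jar containing BatchImport:

      # create the target table before running the job (placeholder column family 'f1')
      echo "create 'ww_log', 'f1'" | hbase shell

      # submit the job with the plain hadoop command, which is why the classpath setup above matters
      hadoop jar batch-import.jar BatchImport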

     

     




