• Sqoop deployment


    - Download

    Create an app directory under the hadoop user's home directory.

    [hadoop@hadoop001 ~]$ mkdir app

    [hadoop@hadoop001 app]$ pwd
    /home/hadoop/app

    [hadoop@hadoop001 app]$ wget http://archive.cloudera.com/cdh5/cdh/5/sqoop-1.4.6-cdh5.7.0.tar.gz   # download the Sqoop tarball; since our cluster runs CDH, we download the matching CDH build

    - Extract
    [hadoop@hadoop001 app]$ tar -zxvf sqoop-1.4.6-cdh5.7.0.tar.gz

    [hadoop@hadoop001 app]$ ll
    drwxr-xr-x 10 hadoop hadoop 4096 Apr 4 15:10 sqoop-1.4.6-cdh5.7.0
    -rw-r--r-- 1 root root 29966286 Apr 3 17:38 sqoop-1.4.6-cdh5.7.0.tar.gz

    - Configure environment variables

    [hadoop@hadoop001 ~]$ vi ~/.bash_profile

    export SQOOP_HOME=/home/hadoop/app/sqoop-1.4.6-cdh5.7.0

    export PATH=$SQOOP_HOME/bin:$PATH

    [hadoop@hadoop001 ~]$ source ~/.bash_profile   
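
    A quick sanity check (a suggested verification, not part of the original steps) confirms the variables took effect:

    [hadoop@hadoop001 ~]$ echo $SQOOP_HOME     # should print /home/hadoop/app/sqoop-1.4.6-cdh5.7.0
    [hadoop@hadoop001 ~]$ which sqoop          # should resolve to $SQOOP_HOME/bin/sqoop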

    - Enter the conf directory

    [hadoop@hadoop001 ~]$ cd $SQOOP_HOME/conf    

    - Copy the configuration template and rename it

    [hadoop@hadoop001 conf]$ cp sqoop-env-template.sh sqoop-env.sh

    - Edit the environment settings

    [hadoop@hadoop001 conf]$ vi sqoop-env.sh

    - Uncomment the Hadoop and Hive entries and point them at the correct directories

    #Set path to where bin/hadoop is available
    export HADOOP_COMMON_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0

    #Set path to where hadoop-*-core.jar is available
    export HADOOP_MAPRED_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0

    #set the path to where bin/hbase is available
    #export HBASE_HOME=

    #Set the path to where bin/hive is available
    export HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
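
    The three paths above must match the actual install locations; a quick check (assuming the CDH directory names used throughout this post) is to make sure the key binaries exist:

    [hadoop@hadoop001 conf]$ ls /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin/hadoop
    [hadoop@hadoop001 conf]$ ls /home/hadoop/app/hive-1.1.0-cdh5.7.0/bin/hive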

    - Add the MySQL JDBC driver jar

    [hadoop@hadoop001 conf]$ cd $SQOOP_HOME/lib

    [hadoop@hadoop001 lib]$ wget http://central.maven.org/maven2/mysql/mysql-connector-java/5.1.27/mysql-connector-java-5.1.27.jar

    - Add the java-json jar

    [hadoop@hadoop001 lib]$ wget http://www.java2s.com/Code/JarDownload/java-json/java-json.jar.zip

    [hadoop@hadoop001 lib]$ unzip java-json.jar.zip

    - Add hive-exec-*.jar

    [hadoop@hadoop001 lib]$ cp $HIVE_HOME/lib/hive-exec-*.jar $SQOOP_HOME/lib
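
    To confirm that all three jars (JDBC driver, java-json, hive-exec) landed in Sqoop's lib directory, a simple check not in the original post:

    [hadoop@hadoop001 lib]$ ls $SQOOP_HOME/lib | grep -E 'mysql-connector|java-json|hive-exec'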

    - Run sqoop version to check that the installation succeeded

    [hadoop@hadoop001 lib]$ sqoop version
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../hbase does not exist! HBase imports will fail.
    Please set $HBASE_HOME to the root of your HBase installation.
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../hcatalog does not exist! HCatalog jobs will fail.
    Please set $HCAT_HOME to the root of your HCatalog installation.
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../zookeeper does not exist! Accumulo imports will fail.
    Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
    19/04/04 20:19:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
    Sqoop 1.4.6-cdh5.7.0
    git commit id
    Compiled by jenkins on Wed Mar 23 11:30:51 PDT 2016
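
    The HBase/HCatalog/Accumulo/ZooKeeper warnings are harmless here because those components are not used. If you want to silence them, one common workaround (an assumption, not covered in the original post) is to comment out the corresponding checks in $SQOOP_HOME/bin/configure-sqoop:

    [hadoop@hadoop001 lib]$ vi $SQOOP_HOME/bin/configure-sqoop    # comment out the HBASE_HOME / HCAT_HOME / ACCUMULO_HOME / ZOOKEEPER_HOME warning blocks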

    - List the MySQL databases
    [hadoop@hadoop001 lib]$ sqoop list-databases \
    > --connect jdbc:mysql://localhost:3306 \
    > --username root --password 123456

    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../hbase does not exist! HBase imports will fail.
    Please set $HBASE_HOME to the root of your HBase installation.
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../hcatalog does not exist! HCatalog jobs will fail.
    Please set $HCAT_HOME to the root of your HCatalog installation.
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    Warning: /home/hadoop/app/sqoop-1.4.6-cdh5.7.0/../zookeeper does not exist! Accumulo imports will fail.
    Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
    19/04/04 20:25:09 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
    19/04/04 20:25:09 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    19/04/04 20:25:09 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
    information_schema
    mysql
    performance_schema
    ruoze_d6
    ruozedata
    test
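
    As the warning in the log suggests, passing --password on the command line is insecure; -P prompts for the password interactively instead. An equivalent, slightly safer invocation would be:

    [hadoop@hadoop001 lib]$ sqoop list-databases \
    > --connect jdbc:mysql://localhost:3306 \
    > --username root -P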

    - View the available Sqoop commands
    [hadoop@hadoop001 lib]$ sqoop help

    codegen            Generate code to interact with database records
    create-hive-table  Import a table definition into Hive
    eval               Evaluate a SQL statement and display the results
    export             Export an HDFS directory to a database table
    help               List available commands
    import             Import a table from a database to HDFS
    import-all-tables  Import tables from a database to HDFS
    import-mainframe   Import datasets from a mainframe server to HDFS
    job                Work with saved jobs
    list-databases     List available databases on a server
    list-tables        List available tables in a database
    merge              Merge results of incremental imports
    metastore          Run a standalone Sqoop metastore
    version            Display version information
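
    Of these, eval is handy for verifying connectivity before running a real import; for example, a sketch assuming the same MySQL credentials as above:

    [hadoop@hadoop001 lib]$ sqoop eval \
    > --connect jdbc:mysql://localhost:3306/mysql \
    > --username root --password 123456 \
    > --query "select user, host from user limit 5"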

    - Import data from MySQL into Hive with Sqoop

    sqoop import \
    --connect jdbc:mysql://localhost:3306/hiveDB \
    --username root --password 123456 \
    --table DBS \
    --hive-import \
    --hive-database myhive \
    --hive-table hiveDBS \
    --fields-terminated-by '\t' \
    --lines-terminated-by '\n' \
    -m 2
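
    After the job finishes, the imported table can be verified from Hive (a suggested check using the target database and table names from the command above):

    [hadoop@hadoop001 lib]$ hive -e "select * from myhive.hiveDBS limit 10"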

     

     
