• sqoop


    Sqoop Installation Procedure

    Sqoop is a tool for moving data back and forth between Hadoop and relational databases: it can import data from a relational database (e.g. MySQL, Oracle, Postgres) into HDFS, and it can also export data from HDFS back into a relational database.
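    The two directions correspond to Sqoop's `import` and `export` subcommands. A minimal sketch of each (all angle-bracket values are placeholders, not names from this setup; `-P` prompts for the password interactively):

```shell
# RDBMS table -> HDFS text files (template; fill in placeholders)
sqoop import --connect jdbc:mysql://<host>:3306/<db> \
    --username <user> -P --table <table>

# HDFS files -> RDBMS table (the reverse direction)
sqoop export --connect jdbc:mysql://<host>:3306/<db> \
    --username <user> -P --table <table> --export-dir /user/<user>/<table>
```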

    Hadoop 0.20.2 is already installed. The official Sqoop releases do not support this Hadoop version, but the CDH3 builds do:

    Sqoop (CDH3 build): http://archive.cloudera.com/cdh/3/sqoop-1.2.0-CDH3B4.tar.gz

    Hadoop (CDH3 build): http://archive.cloudera.com/cdh/3/hadoop-0.20.2-CDH3B4.tar.gz

    By copying the required jars into sqoop-1.2.0-CDH3B4/lib, you can keep using Hadoop 0.20.2.

    Sqoop version: sqoop-1.2.0-CDH3B4

    Hadoop version: 0.20.2

    MySQL version: 5.6.11

    1) Unpack the Sqoop tarball

    [hadoop@node01 ~]$ tar -xzvf sqoop-1.2.0-CDH3B4.tar.gz

    2) sqoop-1.2.0-CDH3B4 depends on hadoop-core-0.20.2-CDH3B4.jar, so download hadoop-0.20.2-CDH3B4.tar.gz, unpack it, and copy hadoop-0.20.2-CDH3B4/hadoop-core-0.20.2-CDH3B4.jar into sqoop-1.2.0-CDH3B4/lib.

    [hadoop@node01 ~]$ cp hadoop-core-0.20.2-CDH3B4.jar sqoop-1.2.0-CDH3B4/lib

    [hadoop@node01 ~]$ ls -l sqoop-1.2.0-CDH3B4/lib/hadoop-core-0.20.2-CDH3B4.jar

    -rw-r--r--. 1 hadoop root 3452461 May  9 05:40 sqoop-1.2.0-CDH3B4/lib/hadoop-core-0.20.2-CDH3B4.jar

    3) Sqoop also needs the MySQL JDBC driver (mysql-connector-java-*.jar) at runtime when importing from MySQL, so download it and copy it into sqoop-1.2.0-CDH3B4/lib as well.

    [hadoop@node01 ~]$ cp mysql-connector-java-5.1.24-bin.jar sqoop-1.2.0-CDH3B4/lib

    [hadoop@node01 ~]$ ls -l sqoop-1.2.0-CDH3B4/lib/mysql-connector-java-5.1.24-bin.jar

    -rw-r--r--. 1 hadoop root 846263 May  9 05:43 sqoop-1.2.0-CDH3B4/lib/mysql-connector-java-5.1.24-bin.jar

    4) Edit Sqoop's configure-sqoop script and comment out the HBase and ZooKeeper checks (unless you plan to use HBase or other such components on top of Hadoop); otherwise Sqoop may get stuck at these checks when it starts.
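    The checks abort when $HBASE_HOME or $ZOOKEEPER_HOME is not set, and disabling them just means prefixing the check lines with `#`. A sketch of doing that with sed, demonstrated on a throwaway copy (the real file's line numbers vary by release, so on the real bin/configure-sqoop you would limit the sed range to the HBase/ZooKeeper blocks only):

```shell
# Throwaway copy with a check block shaped like the one in bin/configure-sqoop
mkdir -p /tmp/sqoop-demo
cat > /tmp/sqoop-demo/configure-sqoop <<'EOF'
if [ ! -d "${HBASE_HOME}" ]; then
  echo "Error: $HBASE_HOME does not exist!"
  exit 1
fi
EOF
# Comment out every line of the block (on the real file, restrict the range,
# e.g. sed -i '82,90 s/^/#/' bin/configure-sqoop with the actual line numbers)
sed -i 's/^/#/' /tmp/sqoop-demo/configure-sqoop
grep -c '^#' /tmp/sqoop-demo/configure-sqoop   # -> 4 (all lines commented)
```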

    5) Start Hadoop

    [hadoop@node01 bin]$ start-all.sh

    [hadoop@node01 bin]$ jps

    2732 Jps

    2478 NameNode

    2665 JobTracker

    2600 SecondaryNameNode

    6) Import data from MySQL into HDFS

    (1) Create a test database named sqooptest in MySQL

    [hadoop@node01 ~]$ mysql -u root -p

    mysql> create database sqooptest;

    Query OK, 1 row affected (0.01 sec)

    (2) Create a dedicated user for Sqoop

    mysql> create user 'sqoop' identified by 'sqoop';

    Query OK, 0 rows affected (0.00 sec)

    mysql> grant all privileges on *.* to 'sqoop' with grant option;

    Query OK, 0 rows affected (0.00 sec)

    mysql> flush privileges;

    Query OK, 0 rows affected (0.00 sec)

    (3) Generate test data

    mysql> use sqooptest;

    Database changed

    mysql> create table tb1 as select table_schema,table_name,table_type from information_schema.TABLES;

    Query OK, 154 rows affected (0.28 sec)

    Records: 154  Duplicates: 0  Warnings: 0

    (4) Test the connection between Sqoop and MySQL

    [hadoop@node01 ~]$ sqoop list-databases --connect jdbc:mysql://node01:3306/ --username sqoop --password sqoop

    (5) Import the data from MySQL into HDFS (-m 1 runs the import with a single map task, so the output ends up in a single file)

    [hadoop@node01 ~]$ sqoop import --connect jdbc:mysql://node01:3306/sqooptest --username sqoop --password sqoop --table tb1 -m 1

    (6) Inspect the newly imported data on HDFS

    [hadoop@node01 ~]$ hadoop dfs -ls tb1
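    With -m 1 the import produces a single file, tb1/part-m-00000, holding one comma-separated line per row. On the cluster you could sanity-check the row count against MySQL's 154 with `hadoop dfs -cat tb1/part-m-00000 | wc -l`; the same check is sketched below locally on a small sample in Sqoop's default text format (the sample rows are illustrative, not taken from the actual import):

```shell
# Two sample rows shaped like Sqoop's default output (fields comma-separated)
printf 'information_schema,TABLES,SYSTEM VIEW\nsqooptest,tb1,BASE TABLE\n' > /tmp/part-sample
# Count the rows; on HDFS the equivalent is: hadoop dfs -cat tb1/part-m-00000 | wc -l
wc -l < /tmp/part-sample   # -> 2
```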

  • Original post: https://www.cnblogs.com/catcoding/p/5436636.html