Sqoop data migration
List all databases
sqoop list-databases --connect jdbc:mysql://hadoop1:3306 --username root --password root
List all tables in the test database
sqoop list-tables --connect jdbc:mysql://hadoop1:3306/test --username root --password root
Query all rows in ysyy_bill
sqoop eval --connect jdbc:mysql://hadoop1:3306/test --username root --password root --query "select * from ysyy_bill"
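sqoop eval also works for quick aggregate checks; a minimal sketch against the same ysyy_bill table:
sqoop eval --connect jdbc:mysql://hadoop1:3306/test --username root --password root --query "select count(*) from ysyy_bill"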
Data transfer between systems
MySQL -> HDFS
(1) Full-table import (simple and commonly used)
sqoop import --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t1 -m 1 --target-dir /sqoop/t1
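To spot-check the result on HDFS (with -m 1 the output is usually a single part-m-00000 file; the exact file name may vary):
hdfs dfs -ls /sqoop/t1
hdfs dfs -cat /sqoop/t1/part-m-00000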
(2) Incremental import (used even more often)
sqoop import --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t1 -m 1 --target-dir /sqoop/t1 --incremental append --check-column id --last-value "2"
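Rather than passing --last-value by hand on every run, the increment can be wrapped in a saved Sqoop job, which records the last value automatically; a sketch (the job name t1_incr is arbitrary, and a --password-file is usually preferred over --password for saved jobs):
sqoop job --create t1_incr -- import --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t1 -m 1 --target-dir /sqoop/t1 --incremental append --check-column id --last-value "2"
sqoop job --exec t1_incr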
HDFS -> MySQL
First, manually create the target table structure in MySQL.
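For example, assuming the HDFS data under /sqoop/t1 has two columns (id, name) — the column names and types here are illustrative only — the target table could be created like this, then the export run:
mysql -h hadoop1 -uroot -proot test -e "CREATE TABLE t2 (id INT, name VARCHAR(100));"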
sqoop export --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t2 -m 1 --export-dir /sqoop/t1
MySQL -> Hive
First, replicate the MySQL table schema into Hive, then run the import:
sqoop create-hive-table --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t1 --hive-table sqoop.t1 --fields-terminated-by "\001" --lines-terminated-by "\n"
sqoop import --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t1 --hive-table sqoop.t1 -m 1 --fields-terminated-by "\001" --hive-import
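To confirm the Hive-side load, a quick query from the command line (assuming the hive CLI is on the path):
hive -e "select * from sqoop.t1 limit 10;"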
Hive -> MySQL
First, manually create the target table structure in MySQL. Note that --input-fields-terminated-by '\001' matches Hive's default field delimiter.
sqoop export --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t3 --export-dir /user/hive/warehouse/sqoop.db/t1 --input-fields-terminated-by '\001'
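The export can then be spot-checked with sqoop eval against the new table (t3 is assumed to exist from the step above):
sqoop eval --connect jdbc:mysql://hadoop1:3306/test --username root --password root --query "select count(*) from t3"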
MySQL -> HBase
Manually create the HBase table first (in the HBase shell): create 't1','cf'
sqoop import --connect jdbc:mysql://hadoop1:3306/test --username root --password root --table t1 --hbase-table t1 --column-family cf --hbase-row-key id -m 1
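To verify the load on the HBase side (HBase shell):
scan 't1', {LIMIT => 5}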