1. Environment:
CDH
SQOOP2
2. Procedure
1. Prepare the table and primary key
Table name: test.test_log
Primary key: sys_seq_id
Create-table statement:
CREATE TABLE hbase_test_test_log_0307 (
    ACCT_DATE   string,
    SYS_SEQ_ID  string,
    MER_CUST_ID string,
    INSERTDT    string
);
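As a quick sanity check (a minimal sketch; it assumes the table was created in the default Hive database), confirm the definition from the shell:
hive -e "DESCRIBE hbase_test_test_log_0307;"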
2. Set environment variables
export JAVA_HOME=/usr/java/jdk1.8.0_111/
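To confirm the variable took effect in the current shell:
echo $JAVA_HOME
$JAVA_HOME/bin/java -version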
3. Set up the HDFS directory and permissions
List the files under the directory:
hdfs dfs -ls /tmp
Delete the directory if it already exists:
hdfs dfs -rm -r -f /tmp/hive_test_test_log_0307
Grant permissions:
hdfs dfs -chmod -R 777 /tmp
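These steps can be bundled into a small helper script (a sketch; the target path follows the example above, and 777 is as permissive as the step itself, so tighten it for production):
#!/bin/bash
# Hypothetical helper: prepare the HDFS target directory for the Sqoop import
TARGET_DIR=/tmp/hive_test_test_log_0307
hdfs dfs -ls /tmp                  # inspect what is already under /tmp
hdfs dfs -rm -r -f "$TARGET_DIR"   # remove any leftover target dir; -f ignores a missing path
hdfs dfs -chmod -R 777 /tmp        # open permissions, matching the step above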
4. Run the import script
sqoop import \
  --driver com.vertica.jdbc.Driver \
  --connect jdbc:vertica://192.168.1.1:8888/dbname?searchpath=test \
  --username user --password passwd \
  --query "select * from test.test_log where acct_date between '20160101' and '20160101' and sys_seq_id > 1 and \$CONDITIONS" \
  --hive-import --create-hive-table --hive-table hive_test_test_log_0307 \
  --target-dir "/tmp/hive_test_test_log_0307" \
  -m 1
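With -m 1 the whole query runs through a single mapper, so $CONDITIONS never actually splits the work. A sketch of a parallel variant (assuming sys_seq_id is numeric and reasonably evenly distributed; the mapper count of 4 is illustrative, and --create-hive-table is dropped because the table already exists after the first run):
sqoop import \
  --driver com.vertica.jdbc.Driver \
  --connect jdbc:vertica://192.168.1.1:8888/dbname?searchpath=test \
  --username user --password passwd \
  --query "select * from test.test_log where acct_date between '20160101' and '20160101' and sys_seq_id > 1 and \$CONDITIONS" \
  --split-by sys_seq_id \
  --hive-import --hive-table hive_test_test_log_0307 \
  --target-dir "/tmp/hive_test_test_log_0307" \
  -m 4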
5. Check the import result and troubleshoot
6. Verify the data is correct
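A simple check is to compare row counts on both sides (a sketch; the vsql flags mirror the JDBC settings above and may need adjusting for your cluster):
# Count the rows landed in Hive
hive -e "select count(*) from hive_test_test_log_0307;"
# Count the matching rows in Vertica
vsql -h 192.168.1.1 -p 8888 -U user -w passwd -c "select count(*) from test.test_log where acct_date between '20160101' and '20160101' and sys_seq_id > 1;"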
e.g. how to kill an abnormal job:
hadoop job -list
hadoop job -kill job_1478510263374_0069
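On YARN-based CDH the hadoop job commands are deprecated; the equivalent yarn CLI calls are below (the application id is the job id with the job_ prefix replaced by application_):
yarn application -list
yarn application -kill application_1478510263374_0069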