Recently I had a fairly large file, about 350 million lines in total, with regularly structured data that needed to be imported into MySQL.
[t1@test01 tmp]# wc -l winter1224.txt
356336714 winter1224.txt
Importing it directly would risk bringing the database down, so I first used split to break it into twelve smaller files.
The command:
split -l 30000000 -d winter1224.txt winter1224_
Notes: -l sets the number of lines per output file, and -d makes split use numeric suffixes.
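Before running it, you can estimate how many chunks to expect with a ceiling division of the total line count by the chunk size; a minimal bash sketch (the numbers come from the wc output above):

# expected number of 30-million-line chunks for a 356,336,714-line file
total=356336714
chunk=30000000
echo $(( (total + chunk - 1) / chunk ))    # prints 12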
The result after running it:
[t1@test01 tmp]$ ls
winter1224_00  winter1224_01  winter1224_02  winter1224_03  winter1224_04  winter1224_05
winter1224_06  winter1224_07  winter1224_08  winter1224_09  winter1224_10  winter1224_11
winter1224.txt
Verify the result:
[t1@test01 tmp]$ wc -l winter1224_*
  30000000 winter1224_00
  30000000 winter1224_01
  30000000 winter1224_02
  30000000 winter1224_03
  30000000 winter1224_04
  30000000 winter1224_05
  30000000 winter1224_06
  30000000 winter1224_07
  30000000 winter1224_08
  30000000 winter1224_09
  30000000 winter1224_10
  26336714 winter1224_11
 356336714 total
[t1@test01 tmp]$ wc -l winter1224.txt
356336714 winter1224.txt
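Rather than comparing the two totals by eye, the same check can be scripted; a small bash sketch (my own addition, not part of the original procedure):

orig=$(wc -l < winter1224.txt)
sum=$(cat winter1224_* | wc -l)    # the glob matches only the chunk files
if [ "$orig" -eq "$sum" ]; then echo "OK: $sum lines"; else echo "MISMATCH: $orig vs $sum"; fi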
The line counts match, so the split is complete.
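With the chunks verified, they can be loaded into MySQL one file at a time, so that no single import has to swallow all 356 million rows at once. A minimal sketch, assuming a target table t_winter in database testdb, tab-separated columns, and LOAD DATA LOCAL INFILE enabled on both client and server (the table, database, and delimiter are my assumptions, not from the original note):

# load each chunk separately; MySQL credentials are expected in ~/.my.cnf
for f in winter1224_*; do
    mysql --local-infile=1 testdb \
        -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE t_winter FIELDS TERMINATED BY '\t';"
    echo "loaded $f"
done

Loading chunk by chunk also makes it easier to resume from the last successful file if an import fails partway through.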