How do you insert 1,000,000 rows into a table? Looping over single-row INSERTs is painfully slow; you would be waiting a long time.
Reference blog: https://blog.csdn.net/gzt19881123/article/details/122815596
Solutions:
1. Enable MyBatis batch mode in application.yml:

mybatis:
  executor-type: batch
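With batch mode enabled, statements issued through a mapper in that session are buffered and sent to the database together instead of one round trip per row. A minimal sketch of the programmatic equivalent, assuming the project already has a configured SqlSessionFactory and a UserMapper with an insert method (both hypothetical names, not from the original post):

```java
import org.apache.ibatis.session.ExecutorType;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class BatchInsertSketch {
    // sqlSessionFactory, UserMapper and User are assumed to exist in the project
    public void batchInsert(SqlSessionFactory sqlSessionFactory) {
        // Open a session that buffers statements instead of executing them one by one
        try (SqlSession session = sqlSessionFactory.openSession(ExecutorType.BATCH)) {
            UserMapper mapper = session.getMapper(UserMapper.class);
            for (int i = 0; i < 1_000_000; i++) {
                mapper.insert(new User("name" + i)); // buffered, not yet sent
            }
            session.flushStatements(); // send the buffered batch to the database
            session.commit();
        }
    }
}
```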
2. Use raw JDBC batching:
// Grab the Spring-managed DataSource directly
@Resource(name = "dataSource")
private DataSource dataSource;

@Test
public void jdbcTest() throws SQLException {
    Connection connection = dataSource.getConnection();
    connection.setAutoCommit(false); // commit everything in one transaction
    String sql = "INSERT INTO tpm_user (id, name, createDate, remark) VALUES (?, ?, ?, ?)";
    PreparedStatement statement = connection.prepareStatement(sql);
    for (int i = 0; i < 1000000; i++) {
        statement.setLong(1, snowflakeService.nextId());
        statement.setString(2, "name" + i);
        statement.setDate(3, new Date(System.currentTimeMillis()));
        statement.setString(4, "remark" + i);
        statement.addBatch(); // buffer the row instead of executing it immediately
    }
    long start = System.currentTimeMillis();
    statement.executeBatch(); // send all buffered rows to the database
    connection.commit();
    statement.close();
    connection.close();
    System.out.print("Elapsed: ");
    System.out.println(System.currentTimeMillis() - start);
}
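One caveat worth adding (an assumption about the target database, not stated in the original post): if the table lives in MySQL and the Connector/J driver is used, JDBC batching only becomes a true multi-row insert when rewriteBatchedStatements=true is appended to the JDBC URL; without it the driver still sends the statements one by one. A sketch of the datasource config under that assumption:

spring:
  datasource:
    url: jdbc:mysql://localhost:3306/test?rewriteBatchedStatements=true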
Test result: all 1,000,000 rows inserted in a matter of seconds.
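Buffering one million rows in a single executeBatch() keeps them all in driver memory at once; a common variant (a sketch, not part of the original post) flushes every N rows instead. The flush-scheduling logic can be isolated and verified without a database, with the callbacks standing in for addBatch() and executeBatch():

```java
import java.util.function.IntConsumer;

public class ChunkedBatch {
    /**
     * Iterates over `total` rows, invoking `addRow` for each row and `flush`
     * every `batchSize` rows, plus once at the end for any partial batch.
     * Returns the number of flush calls made.
     */
    public static int run(int total, int batchSize, IntConsumer addRow, Runnable flush) {
        int flushes = 0;
        for (int i = 0; i < total; i++) {
            addRow.accept(i);              // e.g. statement.addBatch()
            if ((i + 1) % batchSize == 0) {
                flush.run();               // e.g. statement.executeBatch()
                flushes++;
            }
        }
        if (total % batchSize != 0) {      // flush the final partial batch
            flush.run();
            flushes++;
        }
        return flushes;
    }
}
```

For 1,000,000 rows with a flush size of 1,000 this yields exactly 1,000 executeBatch() calls, bounding driver-side memory while keeping the round-trip count low.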