Batch insert in MyBatis
Method 1:
<insert id="insertbatch" parameterType="java.util.List">
<selectKey keyProperty="fetchTime" order="BEFORE"
resultType="java.lang.String">
SELECT CURRENT_TIMESTAMP()
</selectKey>
insert into kangaiduoyaodian ( depart1, depart2, product_name,
generic_name, img, product_specification, unit,
approval_certificate, manufacturer, marketPrice, vipPrice,
website, fetch_time, productdesc ) values
<foreach collection="list" item="item" index="index"
separator=",">
( #{item.depart1}, #{item.depart2}, #{item.productName},
#{item.genericName}, #{item.img},
#{item.productSpecification}, #{item.unit},
#{item.approvalCertificate}, #{item.manufacturer},
#{item.marketprice}, #{item.vipprice}, #{item.website},
#{fetchTime}, #{item.productdesc} )
</foreach>
</insert>
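For context, here is a minimal sketch of a mapper interface and call site that could drive this statement; the interface name, the Product bean, and how the list is obtained are assumptions, not part of the original mapper.
import java.util.List;

// Assumed mapper interface; the method name must match the statement id "insertbatch".
public interface KangaiduoyaodianMapper {
    int insertbatch(List<Product> list);
}

// Usage (sketch), given a mapper instance obtained from a SqlSession or injected by Spring:
// List<Product> products = buildProducts();   // hypothetical source of the beans
// int rows = kangaiduoyaodianMapper.insertbatch(products);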
Method 2:
<insert id="batchInsertB2B" parameterType="ArrayList">
insert into xxxxtable(hkgs,hkgsjsda,office,asdf,ddd,ffff,supfullName,classtype,agent_type,remark)
<foreach collection="list" item="item" index="index" separator="union all">
select #{item.hkgs,jdbcType=VARCHAR},
#{item.hkgsjsda,jdbcType=VARCHAR},
#{item.office,jdbcType=VARCHAR},
#{item.asdf,jdbcType=VARCHAR},
#{item.ddd,jdbcType=VARCHAR},
#{item.ffff,jdbcType=VARCHAR},
#{item.supfullName,jdbcType=VARCHAR},0,0,
#{item.remark,jdbcType=VARCHAR} from dual
</foreach>
</insert>
You can also implement batch insert with UNION ALL.
For example:
insert into XX_TABLE(XX,XX,XX)select 'xx','xx','xx' union all select 'xx','xx','xx' union all select 'xx','xx','xx' ...
Assemble the statement beforehand, then dynamically pass in the part that follows insert into XX_TABLE(XX,XX,XX).
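If you take this route, the trailing SELECT ... UNION ALL ... fragment can be assembled in Java before being handed to the statement. A minimal sketch, assuming three fixed columns and values that are already trusted or properly escaped, since plain string concatenation is open to SQL injection:
import java.util.List;
import java.util.stream.Collectors;

public class UnionAllBuilder {
    // Builds the "select 'a','b','c' union all select ..." tail that follows
    // insert into XX_TABLE(XX,XX,XX).
    public static String buildTail(List<String[]> rows) {
        return rows.stream()
                .map(r -> "select '" + r[0] + "','" + r[1] + "','" + r[2] + "'")
                .collect(Collectors.joining(" union all "));
    }
}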
Batch delete in MyBatis
<!-- Batch-delete records by a collection of primary keys -->
<delete id="batchRemoveUserByPks" parameterType="java.util.List">
DELETE FROM LD_USER WHERE ID in
<foreach item="item" index="index" collection="list" open="(" separator="," close=")">
#{item}
</foreach>
</delete>
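A minimal sketch of a matching mapper interface and call; the interface name and key type are assumptions:
import java.util.List;

// Assumed mapper interface; the method binds to the statement id "batchRemoveUserByPks".
public interface LdUserMapper {
    int batchRemoveUserByPks(List<Long> ids);
}

// Usage (sketch):
// int deleted = ldUserMapper.batchRemoveUserByPks(Arrays.asList(1L, 2L, 3L));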
IN clauses in MyBatis
1. Only one parameter
Declare the parameter type as List or an array; inside <foreach>, reference a List parameter as collection="list" and an array parameter as collection="array".
The SQL configuration is as follows:
<select id="selectProduct" resultMap="Map">
SELECT *
FROM PRODUCT
WHERE PRODUCTNO IN
<foreach item="productNo" index="index" collection="参数的类型List或array">
#{productNo}
</foreach>
</select>
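For the single-parameter case, the mapper method can take the List (or array) directly; a sketch, with the interface and the Product type assumed:
import java.util.List;

// Assumed mapper interface. A bare List parameter is exposed to <foreach> as "list",
// and an array parameter as "array".
public interface ProductMapper {
    List<Product> selectProduct(List<String> productNos);
}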
2. Multiple parameters
First put all the parameters into a single map, then pass that map to the mapper as the one parameter.
The SQL configuration is as follows:
<select id="selectProduct" resultMap="Map">
SELECT *
FROM PRODUCT
WHERE PRODUCTNO IN
<foreach item="productNo" index="index" collection="map中集合参数的名称">
#{productNo}
</foreach>
</select>
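For the multi-parameter case, a sketch of the call site; the key "productNos", the extra "minPrice" entry, and the Product type are illustrative, and whatever key holds the list is what the <foreach> collection attribute must name:
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.ibatis.session.SqlSession;

public class InClauseDemo {
    // Assumed call: all parameters go into one map, which is passed as the single parameter.
    public static List<Product> findProducts(SqlSession session, List<String> productNos) {
        Map<String, Object> params = new HashMap<>();
        params.put("productNos", productNos); // referenced in the mapper as collection="productNos"
        params.put("minPrice", 10);           // other entries remain available as #{minPrice}
        return session.selectList("selectProduct", params);
    }
}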
Batch update in MyBatis
<update id="updateOrders" parameterType="java.util.List">
update orders set state = '0' where no in
<foreach collection="list" item="nos" open="(" separator="," close=")">
#{nos}
</foreach>
</update>
- MyBatis's predecessor is the well-known iBatis, which for some reason left Apache and was renamed MyBatis.
MyBatis is said to be a lightweight ORM framework, but judging from a benchmark report I read online, its advantage over Hibernate did not seem obvious.
Here is an interesting observation. According to the official MyBatis documentation, when obtaining a sqlSession there is an executor type prepared specifically for batch updates:
session = sessionFactory.openSession(ExecutorType.BATCH, true); // for batch updates
Generally speaking, the speed of batch operations against a MySQL database depends on whether a separate connection is opened for each statement or a single connection is shared by the whole batch. According to the MyBatis manual, choosing ExecutorType.BATCH means the sqlSession obtained will batch all update statements. However, when I tested inserting 1000 rows, the ExecutorType.BATCH approach was actually much slower than the ordinary approach. The insert configured in the Mapper I used for the test is shown below; I then inserted 1000 records in a for loop:
<insert id="insert" parameterType="sdc.mybatis.test.Student">
<!-- WARNING - @mbggenerated This element is automatically generated by
MyBatis Generator, do not modify. This element was generated on Mon May 09
11:09:37 CST 2011. -->
insert into student (id, name, sex,
address, telephone, t_id
)
values (#{id,jdbcType=INTEGER}, #{name,jdbcType=VARCHAR},
#{sex,jdbcType=VARCHAR},
#{address,jdbcType=VARCHAR}, #{telephone,jdbcType=VARCHAR}, #{tId,jdbcType=INTEGER}
)
</insert>
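For reference, a minimal sketch of how such a test loop with the batch executor might look; the StudentMapper interface, the source of the student list, and the manual commit are assumptions:
import java.util.List;
import org.apache.ibatis.session.ExecutorType;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class BatchExecutorDemo {
    public static void insertAll(SqlSessionFactory sessionFactory, List<Student> students) {
        // The BATCH executor reuses one connection and queues the JDBC statements.
        try (SqlSession session = sessionFactory.openSession(ExecutorType.BATCH, false)) {
            StudentMapper mapper = session.getMapper(StudentMapper.class);
            for (Student s : students) {
                mapper.insert(s);          // queued, not executed immediately
            }
            session.flushStatements();     // send the queued batch to the database
            session.commit();
        }
    }
}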
- I was not sure where the cause was, so I configured log4j for MyBatis to inspect the logs. I downloaded log4j.jar and commons-logging.jar, added them to the project's classpath, and created a file named log4j.properties on the source path with the following content:
log4j.rootLogger=DEBUG, stdout
# SqlMap logging configuration...
log4j.logger.com.ibatis=DEBUG
log4j.logger.com.ibatis.common.jdbc.SimpleDataSource=DEBUG
log4j.logger.com.ibatis.sqlmap.engine.cache.CacheModel=DEBUG
log4j.logger.com.ibatis.sqlmap.engine.impl.SqlMapClientImpl=DEBUG
log4j.logger.com.ibatis.sqlmap.engine.builder.xml.SqlMapParser=DEBUG
log4j.logger.com.ibatis.common.util.StopWatch=DEBUG
log4j.logger.java.sql.Connection=DEBUG
log4j.logger.java.sql.Statement=DEBUG
log4j.logger.java.sql.PreparedStatement=DEBUG
log4j.logger.java.sql.ResultSet=DEBUG
# Console output...
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p [%t] - %m%n
DEBUG [main] - Created connection 3502256.
DEBUG [main] - ooo Connection Opened
DEBUG [main] - ==> Executing: insert into student ( name, sex, address, telephone,t_id ) values ( ?, ?, ?, ?, ? )
DEBUG [main] - ==> Parameters: 新人0(String), male(String), addr0(String),dd(String),3(Integer)
DEBUG [main] - ==> Executing: insert into student ( name, sex, address, telephone,t_id ) values ( ?, ?, ?, ?, ? )
DEBUG [main] - ==> Parameters: 新人1(String), male(String),
...............
...............
DEBUG [main] - xxx Connection Closed
DEBUG [main] - Returned connection 3502256 to pool.
- The last point concerns SQL-statement-level optimization of batch inserts. I deliberately tested two approaches by configuring two insert modes in StudentMapper: the first issues a separate insert per row (insert value1; insert value2; ...), the second uses a single multi-row statement (insert ... values (value1), (value2), ...). The latter indeed turned out to be much faster. Below are the two insert modes used in the test:
<!-- called 1000 times from an outer for loop -->
<insert id="insert" parameterType="sdc.mybatis.test.Student">
insert into student (id, name, sex,
address, telephone, t_id
)
values (#{id,jdbcType=INTEGER}, #{name,jdbcType=VARCHAR},
#{sex,jdbcType=VARCHAR},
#{address,jdbcType=VARCHAR}, #{telephone,jdbcType=VARCHAR}, #{tId,jdbcType=INTEGER}
)
</insert>
<!-- batch version: a list of length 1000 is passed in -->
<insert id="insertBatch">
insert into student ( <include refid="Base_Column_List"/> )
values
<foreach collection="list" item="item" index="index" separator=",">
(null,#{item.name},#{item.sex},#{item.address},#{item.telephone},#{item.tId})
</foreach>
</insert>
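As a usage note, here is a minimal sketch of driving the multi-row insertBatch statement from Java; the StudentMapper interface and the chunk size are assumptions. Splitting a very large list into chunks keeps each generated multi-row INSERT within MySQL's max_allowed_packet limit:
import java.util.ArrayList;
import java.util.List;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class MultiRowInsertDemo {
    private static final int CHUNK_SIZE = 500; // assumed chunk size

    public static void insertAll(SqlSessionFactory sessionFactory, List<Student> students) {
        try (SqlSession session = sessionFactory.openSession()) {
            StudentMapper mapper = session.getMapper(StudentMapper.class);
            // Each chunk becomes one multi-row INSERT built by the <foreach> above.
            for (int i = 0; i < students.size(); i += CHUNK_SIZE) {
                List<Student> chunk = new ArrayList<>(
                        students.subList(i, Math.min(i + CHUNK_SIZE, students.size())));
                mapper.insertBatch(chunk); // assumed mapper method bound to id="insertBatch"
            }
            session.commit();
        }
    }
}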