
Batch Importing Excel Data with Java SSM

Contents: Import the Maven dependency · Mapper and mapping file · Excel listener · Testing

Import the Maven dependency

<dependency>
	<groupId>com.alibaba</groupId>
	<artifactId>easyexcel</artifactId>
	<version>${easyexcel.version}</version>
</dependency>
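The version is referenced through a property, so it must be defined once in the POM's <properties> block. A minimal sketch, assuming EasyExcel 3.1.1 (substitute whichever release your project targets):

<properties>
	<easyexcel.version>3.1.1</easyexcel.version>
</properties>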

Mapper and mapping file

UserMapper.java

import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Param;
import java.util.List;

@Mapper
public interface UserMapper {
    int batchInsert(@Param("list") List<User> list);
}
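Both the mapper and the listener below operate on com.hc.domain.User, which the article never shows. A minimal sketch reconstructed from the columns in the resultMap further down, assuming Lombok and EasyExcel's @ExcelProperty for column mapping (the Excel header names are assumptions; adjust them to match your spreadsheet):

import com.alibaba.excel.annotation.ExcelIgnore;
import com.alibaba.excel.annotation.ExcelProperty;
import lombok.Data;
import java.util.Date;

@Data
public class User {
    @ExcelIgnore
    private Long id;          // auto-generated primary key, not read from Excel
    @ExcelProperty("nickname")
    private String nickname;
    @ExcelProperty("avatar")
    private String avatar;
    @ExcelProperty("account")
    private String account;
    @ExcelProperty("password")
    private String password;
    @ExcelProperty("gender")
    private Integer gender;
    @ExcelProperty("tel")
    private String tel;
    @ExcelProperty("email")
    private String email;
    @ExcelProperty("qq")
    private String qq;
    @ExcelProperty("wechat")
    private String wechat;
    @ExcelProperty("salt")
    private String salt;
    @ExcelProperty("info")
    private String info;
    @ExcelProperty("status")
    private Integer status;
    @ExcelProperty("create_time")
    private Date createTime;
    @ExcelProperty("update_time")
    private Date updateTime;
}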

UserMapper.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="com.hc.mapper.UserMapper">
  <resultMap id="BaseResultMap" type="com.hc.domain.User">
    <!--@mbg.generated-->
    <!--@Table tb_user-->
    <id column="id" jdbcType="BIGINT" property="id" />
    <result column="nickname" jdbcType="VARCHAR" property="nickname" />
    <result column="avatar" jdbcType="VARCHAR" property="avatar" />
    <result column="account" jdbcType="VARCHAR" property="account" />
    <result column="password" jdbcType="VARCHAR" property="password" />
    <result column="gender" jdbcType="TINYINT" property="gender" />
    <result column="tel" jdbcType="VARCHAR" property="tel" />
    <result column="email" jdbcType="VARCHAR" property="email" />
    <result column="qq" jdbcType="VARCHAR" property="qq" />
    <result column="wechat" jdbcType="VARCHAR" property="wechat" />
    <result column="salt" jdbcType="VARCHAR" property="salt" />
    <result column="info" jdbcType="VARCHAR" property="info" />
    <result column="status" jdbcType="TINYINT" property="status" />
    <result column="create_time" jdbcType="TIMESTAMP" property="createTime" />
    <result column="update_time" jdbcType="TIMESTAMP" property="updateTime" />
  </resultMap>
  <sql id="Base_Column_List">
    <!--@mbg.generated-->
    id, nickname, avatar, account, `password`, gender, tel, email, qq, wechat, salt,
    info, `status`, create_time, update_time
  </sql>
  <insert id="batchInsert" keyColumn="id" keyProperty="id" parameterType="map" useGeneratedKeys="true">
    <!--@mbg.generated-->
    insert into tb_user
    (nickname, avatar, account, `password`, gender, tel, email, qq, wechat, salt, info,
    `status`, create_time, update_time)
    values
    <foreach collection="list" item="item" separator=",">
      (#{item.nickname,jdbcType=VARCHAR}, #{item.avatar,jdbcType=VARCHAR}, #{item.account,jdbcType=VARCHAR},
      #{item.password,jdbcType=VARCHAR}, #{item.gender,jdbcType=TINYINT}, #{item.tel,jdbcType=VARCHAR},
      #{item.email,jdbcType=VARCHAR}, #{item.qq,jdbcType=VARCHAR}, #{item.wechat,jdbcType=VARCHAR},
      #{item.salt,jdbcType=VARCHAR}, #{item.info,jdbcType=VARCHAR}, #{item.status,jdbcType=TINYINT},
      #{item.createTime,jdbcType=TIMESTAMP}, #{item.updateTime,jdbcType=TIMESTAMP})
    </foreach>
  </insert>
</mapper>
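For reference, with two buffered rows the <foreach> above expands into a single multi-row prepared statement, roughly:

insert into tb_user (nickname, avatar, account, `password`, gender, tel, email, qq, wechat, salt, info, `status`, create_time, update_time)
values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?),
       (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)

One round trip per batch is what makes the import fast; the trade-off is statement size, so very large batches may run into MySQL's max_allowed_packet limit.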

Excel listener

@Log4j2
@Service
public class UserExcelListener extends AnalysisEventListener<User> {

	@Resource
	private UserMapper userMapper;

	/**
	 * Batch size: flush to the database every BATCH_COUNT rows
	 */
	private static final int BATCH_COUNT = 250;
	@Getter
	List<User> list = new ArrayList<>(BATCH_COUNT);

	@Override
	public void invoke(User user, AnalysisContext analysisContext) { // called once per parsed row
		log.info("********** Parsed one row: {}", JSON.toJSONString(user));
		list.add(user);
		if (list.size() >= BATCH_COUNT) {
			log.info("Buffered {} rows, flushing to the database", list.size());
			// persist every 250 rows so memory usage stays bounded
			int batchInsertRes = userMapper.batchInsert(list);
			log.info("Inserted {} rows", batchInsertRes);
			list.clear();
		}
	}

	@Override
	public void doAfterAllAnalysed(AnalysisContext analysisContext) {
		// flush whatever remains after the last full batch; without this,
		// the final (fewer than BATCH_COUNT) rows would never be inserted
		if (!list.isEmpty()) {
			userMapper.batchInsert(list);
			list.clear();
		}
		log.info("********** All rows parsed!");
	}
}
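Note that the listener buffers rows in a field, so registering it as a singleton Spring bean is only safe if imports never run concurrently. EasyExcel's documentation recommends creating a fresh listener per read instead; a sketch of that alternative, assuming UserExcelListener is given a constructor that takes the mapper rather than relying on field injection:

// one listener instance per import, so concurrent imports don't share the buffer
UserExcelListener listener = new UserExcelListener(userMapper);
EasyExcel.read("users.xlsx", User.class, listener).sheet().doRead();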

Testing

@ExtendWith(SpringExtension.class)
@ContextConfiguration("/applicationContext.xml")
public class ExcelUtilTest {
    @Resource
    private UserExcelListener userExcelListener;

    @Test
    void userListener() {
        EasyExcel.read("e:\\projects\\worksdisplay\\data\\users.xlsx", User.class, userExcelListener)
                .sheet()
                .doRead();
    }
}
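In a deployed SSM application the spreadsheet usually arrives as an upload rather than a fixed path on disk. A minimal controller sketch, assuming Spring MVC multipart support is configured (the mapping and parameter name are illustrative):

import com.alibaba.excel.EasyExcel;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;
import javax.annotation.Resource;
import java.io.IOException;

@RestController
public class UserImportController {

    @Resource
    private UserExcelListener userExcelListener;

    // accepts a multipart form field named "file" and streams it through EasyExcel
    @PostMapping("/users/import")
    public String importUsers(@RequestParam("file") MultipartFile file) throws IOException {
        EasyExcel.read(file.getInputStream(), User.class, userExcelListener)
                .sheet()
                .doRead();
        return "imported";
    }
}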

That concludes this article on batch importing Excel data with Java SSM. For more on importing Excel data in Java, see the related articles, and thanks for your support!