Optimizing a PHP export of millions of rows
程序员文章站
2022-05-29 11:41:00
When the data set to export is very large, generating a real Excel file needs a huge amount of memory and the server can't cope. In that case, consider generating CSV instead: CSV read/write performance is much better than Excel's.
Test table student and its data (you can insert 3 million+ test rows with a script; only a small sample is shown here):
set names utf8mb4;
set foreign_key_checks = 0;

-- ----------------------------
-- table structure for student
-- ----------------------------
drop table if exists `student`;
create table `student` (
  `id` int(11) not null auto_increment,
  `stuno` varchar(32) character set utf8 collate utf8_general_ci not null,
  `stuname` varchar(10) character set utf8 collate utf8_general_ci not null,
  `stuage` int(11) null default null,
  primary key (`id`) using btree
) engine = innodb auto_increment = 12 character set = utf8 collate = utf8_general_ci row_format = compact;

-- ----------------------------
-- records of student
-- ----------------------------
insert into `student` values (1, 'a001', '小明', 22);
insert into `student` values (2, 'a005', '小李', 23);
insert into `student` values (3, 'a007', '小红', 24);
insert into `student` values (4, 'a003', '小明', 22);
insert into `student` values (5, 'a002', '小李', 23);
insert into `student` values (6, 'a004', '小红', 24);
insert into `student` values (7, 'a006', '小王', 25);
insert into `student` values (8, 'a008', '乔峰', 27);
insert into `student` values (9, 'a009', '欧阳克', 22);
insert into `student` values (10, 'a010', '老顽童', 34);
insert into `student` values (11, 'a011', '黄老邪', 33);
set foreign_key_checks = 1;
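To actually get a few million test rows, one option is to batch them into multi-row inserts, which is far faster than one INSERT per row. A minimal sketch: `buildInsertValues()` is a hypothetical helper (not from the original post), the credentials are the same local-test placeholders used above, and the 3,000,000 target is arbitrary.

```php
<?php
// Hypothetical helper: render one multi-row VALUES list for `student`.
// Generates rows like ('a0000042', 'stu42', 20).
function buildInsertValues($start, $count) {
    $values = array();
    for ($j = $start; $j < $start + $count; $j++) {
        $values[] = sprintf("('a%07d', 'stu%d', %d)", $j, $j, 18 + $j % 20);
    }
    return implode(',', $values);
}
```

Usage against the test database (assumed local credentials):

```php
<?php
$con = mysqli_connect('127.0.0.1', 'root', 'root', 'test');
if (mysqli_connect_errno())
    die('connect error');
$total = 3000000;
$batch = 1000; // rows per statement; tune against max_allowed_packet
for ($i = 0; $i < $total; $i += $batch) {
    mysqli_query($con, "insert into `student` (`stuno`, `stuname`, `stuage`) values "
        . buildInsertValues($i, $batch));
}
mysqli_close($con);
```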
The export script, export.php:
<?php
set_time_limit(0); // no script time limit
ini_set('memory_limit', '128M'); // modest memory is enough: rows are streamed, not buffered
$filename = date('YmdHis');
header('Content-Type: application/vnd.ms-excel; charset=utf-8');
header('Content-Disposition: attachment; filename="' . $filename . '.csv"');
// Note: for very large exports (hundreds of thousands to millions of rows) you may hit a
// 504 Gateway Time-out; raise the max_execution_time setting in php.ini if so.
// Open the PHP standard output stream in append mode
$fp = fopen('php://output', 'a');
// Connect to the database
$dbhost = '127.0.0.1';
$dbuser = 'root';
$dbpwd = 'root';
$con = mysqli_connect($dbhost, $dbuser, $dbpwd);
if (mysqli_connect_errno())
    die('connect error');
$database = 'test'; // select the database
mysqli_select_db($con, $database);
mysqli_query($con, "set names utf8"); // set the connection charset if needed
// Export 1,000,000 rows with fputcsv: fetch 10,000 rows per query, over 100 iterations.
// If 10,000 rows at a time is too many, lower $nums and raise $step accordingly.
$step = 100;
$nums = 10000;
$where = "where 1=1"; // filter conditions; add your own as needed
// Header row. Note the lowercase "id": if the first two characters of a CSV/XLS file are
// the uppercase letters "I","D", Excel complains that it "has detected that 'xxx.xls' is
// a SYLK file, but cannot load it".
$title = array('id', '编号', '姓名', '年龄');
foreach ($title as $key => $item)
    $title[$key] = iconv('utf-8', 'gb2312//IGNORE', $item);
fputcsv($fp, $title);
for ($s = 1; $s <= $step; $s++) {
    $start = ($s - 1) * $nums;
    $result = mysqli_query($con, "select id,stuno,stuname,stuage from `student` " . $where . " order by `id` limit {$start},{$nums}");
    if ($result) {
        while ($row = mysqli_fetch_assoc($result)) {
            foreach ($row as $key => $item)
                $row[$key] = iconv('utf-8', 'gbk//IGNORE', $item); // convert the encoding here, or Excel shows mojibake
            fputcsv($fp, $row);
        }
        mysqli_free_result($result); // free the result set
        ob_flush(); // flush the output buffer after every batch of 10,000 rows
        flush();
    }
}
mysqli_close($con); // close the connection
Export result: (the original post showed a screenshot here, which is not preserved)
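As an aside, the SYLK pitfall mentioned above is easy to reproduce without Excel: write a header row to an in-memory stream and inspect the first two bytes of the resulting CSV. A minimal, self-contained sketch; `csvFirstBytes()` is a hypothetical helper for illustration:

```php
<?php
// Write one CSV row to an in-memory stream and return its first two bytes.
function csvFirstBytes(array $title) {
    $fp = fopen('php://temp', 'w+');
    fputcsv($fp, $title);
    rewind($fp);
    $bytes = fread($fp, 2);
    fclose($fp);
    return $bytes;
}
// A file starting with the uppercase bytes "ID" is what triggers Excel's
// "SYLK file" warning; a lowercase "id" header avoids it.
```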