Importing 100,000+ rows from a large CSV file into MySQL with PHP, and exporting 100,000+ rows back out to a CSV file
Date: 2021-07-01 10:21:17
1. Read the file line by line with fgets(), convert the encoding, and split each line on commas. This works, but only for small files:

```php
$handle = fopen("1.csv", "r");
while (!feof($handle)) {
    // fgetss() is like fgets() but strips HTML tags (removed in PHP 8; use fgets() there)
    $buffer = fgetss($handle, 2048);
    $row  = mb_convert_encoding(trim($buffer), 'utf-8', 'gbk');
    $data = explode(",", $row);
    $insertRows[] = $data;
}
fclose($handle);
```
2. The fgetcsv($handle, 2048, ',') function returns an array; for plain data it is essentially explode(",", fgets($handle)) rolled into one call (though, unlike explode(), it also parses quoted fields correctly). This approach suits small CSV files, and since it does no encoding conversion it is a poor fit for GBK files containing Chinese characters.
```php
$handle = fopen("1.csv", "r");
while ($data = fgetcsv($handle, 1000, ",")) {
    $insertRows[] = $data;
}
fclose($handle);
```
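A quick illustration of the difference (the file contents here are made up for the example): fgetcsv() understands quoted fields, while a naive explode() splits right through them:

```php
<?php
// Hypothetical one-line CSV where a quoted field contains a comma.
$tmp = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($tmp, "1,\"Hello, world\",3\n");

$handle = fopen($tmp, 'r');
$byFgetcsv = fgetcsv($handle, 1000, ',');              // respects the quotes: 3 fields
rewind($handle);
$byExplode = explode(',', trim(fgets($handle, 1000))); // naive split: 4 pieces
fclose($handle);
unlink($tmp);
```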
3. Now for the main point. The two methods above are fine for small CSV files; with a file of 100,000+ rows, they become painfully slow.
```php
$excelData = array();
$content = trim(file_get_contents($fileName));
$excelData = explode("\n", $content);
```
Alternatively, just use $excelData = file($fileName); — the file() function reads the whole file into an array in one call, one element per line.
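One side note on file(): by default each array element keeps its trailing newline, and blank lines survive as empty-ish elements. A minimal sketch (against a throwaway temp file) of the flags that clean this up:

```php
<?php
$tmp = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($tmp, "a,b\nc,d\n\n");

// FILE_IGNORE_NEW_LINES strips the trailing "\n" from each element,
// FILE_SKIP_EMPTY_LINES drops blank lines entirely.
$excelData = file($tmp, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
unlink($tmp);
```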
We read the entire file into the array $excelData in a single pass. From that point on we never touch the CSV file again; it is pure PHP array work, so the import will not crash with an exception:
```php
$chunkData = array_chunk($excelData, 5000);  // process 5000 lines per batch
$count = count($chunkData);
for ($i = 0; $i < $count; $i++) {
    $insertRows = array();
    foreach ($chunkData[$i] as $value) {
        $string = mb_convert_encoding(trim(strip_tags($value)), 'utf-8', 'gbk');
        $v = explode(',', trim($string));
        $row = array();
        $row['cdate']    = empty($v[0]) ? date('Y-m-d') : date('Y-m-d', strtotime($v[0]));
        $row['business'] = $v[1];
        $row['project']  = $v[2];
        $row['shopname'] = $v[3];
        $row['shopid']   = $v[4];
        $row['fanli']    = formatNumber($v[5]);  // formatNumber(): project helper
        $row['fb']       = $v[6] * 100;
        $row['jifen']    = $v[7];
        // Build one "('a','b',...)" VALUES tuple per CSV row
        $sqlString = "('" . implode("','", $row) . "')";
        $insertRows[] = $sqlString;
    }
    $result = $model->addDetail($insertRows);
}
```
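One caveat with the string-building above: the field values are interpolated into the SQL raw, so a stray quote in the CSV breaks the statement. A minimal sketch of escaping each field first (addslashes() is only a stand-in here; with a live connection, mysqli_real_escape_string() or prepared statements are the safer choice):

```php
<?php
// Hypothetical row where one field contains a single quote.
$row = array('2013-12-18', "O'Reilly", '12.5');

// Escape every field before joining it into the VALUES tuple.
$escaped = array_map('addslashes', $row);
$sqlString = "('" . implode("','", $escaped) . "')";
```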
Inserting into the database is, of course, done as a batch insert:
```php
public function addDetail($rows) {
    if (empty($rows)) {
        return false;
    }

    // One multi-row INSERT instead of one statement per row
    $data = implode(',', $rows);
    $sql = "INSERT IGNORE INTO tb_account_detail
                (cdate, business, project, shopname, shopid, fanli, fb, jifen)
            VALUES {$data}";
    $this->query($sql);
    return true;
}
```
OK, personally tested: 100,000 rows with 6 fields inserted in about 10 seconds.
----------------- Updated 2013-12-18 -----------------

4. Exporting 100,000+ rows of data to CSV

I have abandoned the approach from my earlier post (http://blog.csdn.net/think2me/article/details/8596833), because above roughly 500,000 rows it can crash the browser and blow past the memory limit.

The method below writes the data to a file first, and only then sends that file out as a download.
```php
public function dump2Excel() {
    // A large export can run long and use plenty of memory.
    set_time_limit(0);
    ini_set('memory_limit', '640M');

    $name  = $this->getActionName();
    $model = D(GROUP_NAME . '.' . $name);
    $map   = $this->_search();

    // One file name for error rows only, another for the full dump.
    if (isset($_GET['error']) && $_GET['error'] > 0) {
        $filename = C('IMG_PATH') . 'account_data_error_' . $map['action_id'] . '_' . date('Y-m-d') . '.csv';
    } else {
        $filename = C('IMG_PATH') . 'account_data_all_' . $map['action_id'] . '_' . date('Y-m-d') . '.csv';
    }

    // Header row, converted to GB2312 so Excel displays the labels correctly.
    $header[] = iconv("utf-8", "gb2312", "用户信息");
    $header[] = iconv("utf-8", "gb2312", "商家ID");
    $header[] = iconv("utf-8", "gb2312", "联盟");
    $header[] = iconv("utf-8", "gb2312", "商家订单号");
    $header[] = iconv("utf-8", "gb2312", "商品分类");
    $header[] = iconv("utf-8", "gb2312", "确认类别");
    $header[] = iconv("utf-8", "gb2312", "下单时间");
    $header[] = iconv("utf-8", "gb2312", "完成时间");
    $header[] = iconv("utf-8", "gb2312", "实际支付金额");
    $header[] = iconv("utf-8", "gb2312", "佣金");
    $header[] = iconv("utf-8", "gb2312", "佣金补贴");
    $header[] = iconv("utf-8", "gb2312", "返利");
    $header[] = iconv("utf-8", "gb2312", "F币");
    $header[] = iconv("utf-8", "gb2312", "论坛积分");
    $header[] = iconv("utf-8", "gb2312", "备注");
    $header[] = iconv("utf-8", "gb2312", "强制入库");
    $header[] = iconv("utf-8", "gb2312", "唯一标识");
    $header[] = iconv("utf-8", "gb2312", "错误信息");

    $headerFile = implode(',', $header);

    // Start from a fresh file containing only the header row.
    @unlink($filename);
    file_put_contents($filename, $headerFile . "\n");

    // Build a lookup table from error code to error message.
    $list = D('Fanli')->table('tb_account_action_data_error_code')->field('id,err_msg')->findAll();
    $error_msg = array();
    foreach ($list as $value) {
        $error_msg[$value['id']] = $value['err_msg'];
    }

    if (isset($_GET['error']) && $_GET['error'] > 0) {
        $map['error_code'] = array('gt', 0);
    }

    if (!empty($map['action_id'])) {
        $allCount  = $model->where($map)->field('count(1) as count')->select();
        $pageLimit = ceil($allCount[0]['count'] / self::PAGE_COUNT);

        if (!$handle = fopen($filename, 'a')) {
            echo "Cannot open file $filename";
            exit;
        }

        // Page through the result set instead of loading everything at once.
        for ($i = 0; $i < $pageLimit; $i++) {
            $count = self::PAGE_COUNT;
            $start = $count * $i;
            $limit = "$start,$count";
            $voList = $model->where($map)->limit($limit)->order('id desc')->findAll();

            $excelString = array();
            foreach ($voList as $v) {
                $dumpExcel = array();
                $dumpExcel[] = mb_convert_encoding($v['user_info'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['shopid'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['league'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['order_id'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['classify'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['confirm_type'], 'GBK', 'UTF-8');
                // The leading apostrophe keeps Excel from reformatting the dates.
                $dumpExcel[] = "'" . mb_convert_encoding($v['buydate'], 'GBK', 'UTF-8');
                $dumpExcel[] = "'" . mb_convert_encoding($v['paydate'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['real_pay'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['commission'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['commission_plus'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['fanli'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['jifen'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['bbs'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($v['remarks'], 'GBK', 'UTF-8');
                $dumpExcel[] = intval($v['persist_execute']);
                $dumpExcel[] = mb_convert_encoding($v['unique_sign'], 'GBK', 'UTF-8');
                $dumpExcel[] = mb_convert_encoding($error_msg[$v['error_code']], 'GBK', 'UTF-8');
                $excelString[] = implode(',', $dumpExcel);
            }

            // Append this page to the file, then free the memory.
            foreach ($excelString as $content) {
                fwrite($handle, $content . "\n");
            }
            unset($excelString);
        }
        fclose($handle);
    }

    // Finally, push the finished file to the browser as a download.
    header("Content-type: application/octet-stream");
    header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
    header("Content-Length: " . filesize($filename));
    readfile($filename);
}
```
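Because the export rows are joined with a bare implode(',', ...), a field that itself contains a comma or a quote would corrupt the CSV. PHP's built-in fputcsv() handles the quoting automatically; a minimal sketch against a temp file:

```php
<?php
$tmp = tempnam(sys_get_temp_dir(), 'csv');
$handle = fopen($tmp, 'w');

// fputcsv() encloses fields that contain the delimiter or quote characters.
fputcsv($handle, array('2013-12-18', 'shop, inc.', '12.50'));
fclose($handle);

$line = trim(file_get_contents($tmp));
unlink($tmp);
```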