PHP PDO select query fails on a large number of records

Problem description

I have a PostgreSQL database and need to query a table containing 20 million rows. I then push the result set into a CSV file and finally up to S3. I have put together a PHP script to do this. When I limit the query to 6 million rows it succeeds, but anything above that seems to fail. No error message is logged or displayed, even with all error checking and reporting enabled. I also increased the memory on my CentOS instance to 3 GB, but no luck. My code is below:

//connect to database
$myPDO = new PDO('pgsql:host=127.0.0.1;port=5433;dbname=test', 'test', 'test');
$myPDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_WARNING);

//time and memory usage variables
error_reporting(-1);
$time = microtime(TRUE);
$mem = memory_get_usage();

$stmt = $myPDO->prepare("SELECT * FROM table"); 
$stmt->execute(); 

$u = "export.csv";

    $h = fopen($u, "w+");

     $first = true;

     //counter
     $counter = 0;

//loop through all rows
     while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
     $counter++; 

//export column header
     if ($first) {
         fputcsv($h, array_keys($row));
         $first = false;
     }
         //export content
         fputcsv($h, $row);
     }

     fclose($h);

require('AwsS3Upload.php');

//pdo error output
if (!$stmt) {
    echo "\nPDO::errorInfo():\n";
    print_r($myPDO->errorInfo());
}
print_r(array('memory' => (memory_get_usage() - $mem) / (1024 * 1024), 'seconds' => microtime(TRUE) - $time));
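One likely reason nothing is logged: with ERRMODE_WARNING a failed query only raises a PHP warning, and if the process is killed by the kernel OOM killer (plausible when buffering millions of rows on a 3 GB box), PHP never gets a chance to report anything; the system log (dmesg) would show the kill. A minimal sketch of surfacing driver errors as exceptions, not part of the original post:

//make PDO throw on failure instead of emitting a warning
$myPDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $stmt = $myPDO->prepare("SELECT * FROM table");
    $stmt->execute();
} catch (PDOException $e) {
    //any driver-level failure now surfaces here
    error_log('query failed: ' . $e->getMessage());
    exit(1);
}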

Update - working code:

$myPDO = new PDO('pgsql:host=127.0.0.1;port=5433;dbname=test', 'test', 'test');
$myPDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_WARNING);
$myPDO->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
$myPDO->setAttribute(PDO::ATTR_STRINGIFY_FETCHES, false);

//time and memory usage variables
error_reporting(-1);
$time = microtime(TRUE);
$mem = memory_get_usage();

for ($i = 0; $i < 6; $i++) {
    $limit  = 5000000;
    $offset = $i * $limit;

    //a column name cannot be bound as a parameter, so ORDER BY id is written literally;
    //LIMIT and OFFSET accept bound parameters with native (non-emulated) prepares
    $sql = 'SELECT * FROM table ORDER BY id LIMIT :limit OFFSET :offset';
    $stmt = $myPDO->prepare($sql);
    $stmt->bindParam(':limit', $limit, PDO::PARAM_INT);
    $stmt->bindParam(':offset', $offset, PDO::PARAM_INT);

    $stmt->execute();

    $u = "$i-export.csv";
    $h = fopen($u, "w+");
    $first = true;

    //counter
    $counter = 0;

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $counter++;

        if ($first) {
            fputcsv($h, array_keys($row));
            $first = false;
        }

        fputcsv($h, $row);
    }

    fclose($h);

    //upload this chunk to S3
    require('AwsS3Upload.php');
}
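A side note on LIMIT/OFFSET chunking: PostgreSQL still has to scan and discard every skipped row, so each successive chunk gets slower. If the table has an indexed integer key (assumed here to be a column named id; adjust to your schema), keyset pagination avoids the rescans. A rough sketch:

$lastId = 0;
$limit  = 5000000;
$chunk  = 0;

do {
    //seek past the last id seen instead of skipping rows with OFFSET
    $stmt = $myPDO->prepare(
        'SELECT * FROM table WHERE id > :lastId ORDER BY id LIMIT :limit'
    );
    $stmt->bindValue(':lastId', $lastId, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $limit, PDO::PARAM_INT);
    $stmt->execute();

    $h = fopen($chunk . '-export.csv', 'w');
    $first = true;
    $rows = 0;

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $rows++;
        $lastId = $row['id'];   //remember where this chunk ended
        if ($first) {
            fputcsv($h, array_keys($row));
            $first = false;
        }
        fputcsv($h, $row);
    }

    fclose($h);
    $chunk++;
} while ($rows === $limit);     //a short chunk means the table is exhausted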

Tags: php, postgresql, pdo

Solution


You have to cut it into smaller chunks, something like this:

for ($i = 0; $i < 6; $i++) {
    $limit  = 5000000;
    $offset = $i * $limit;

    //PostgreSQL syntax: LIMIT ... OFFSET ...
    $stmt = $myPDO->prepare("SELECT * FROM table ORDER BY id LIMIT $limit OFFSET $offset");
    $stmt->execute();

    // retrieve and write the records into file
}
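Why the original single query fails: the pgsql PDO driver pulls the entire result set into client memory when execute() returns, so fetching row by row afterwards does not reduce memory use, and 20 million rows simply do not fit. Chunking works because each individual result set stays small. An alternative that keeps a single query is a server-side cursor, fetched in batches inside a transaction. A sketch, assuming a batch size of 100000 and the table name used above:

$myPDO->beginTransaction();   //non-holdable cursors only live inside a transaction
$myPDO->exec('DECLARE export_cur CURSOR FOR SELECT * FROM table');

$h = fopen('export.csv', 'w');
$first = true;

while (true) {
    //pull one batch from the server; client memory stays bounded by the batch size
    $stmt = $myPDO->query('FETCH FORWARD 100000 FROM export_cur');
    $batch = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$batch) {
        break;   //cursor exhausted
    }
    foreach ($batch as $row) {
        if ($first) {
            fputcsv($h, array_keys($row));
            $first = false;
        }
        fputcsv($h, $row);
    }
}

fclose($h);
$myPDO->exec('CLOSE export_cur');
$myPDO->commit();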

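And if the goal is just to dump a whole table to a local file, the pgsql driver's COPY support avoids PHP-side row handling entirely. A short sketch; note the output is PostgreSQL's text format (delimiter-separated, not quoted CSV):

//stream the table to a local client-side file via COPY ... TO STDOUT
if (!$myPDO->pgsqlCopyToFile('table', 'export.tsv', "\t")) {
    echo "COPY failed\n";
}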