PHP PDO buffered query problem

I am having serious trouble with PHP Data Objects. I am trying to loop over a large result set (~60k rows, ~1 GB) using a buffered query to avoid fetching the whole set into memory.

No matter what I do, the script just hangs on PDO::query(). It seems the query is running buffered (why else would the size of the result set matter?). Here is my code to reproduce the problem:

<?php
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true
    )
);

$rQuery = $Database->query('SELECT id FROM mytable');

// This is never reached because the result set is too large
echo 'Made it through.';

foreach ($rQuery as $aRow) {
    print_r($aRow);
}
?>

If I limit the query to some reasonable number, it works fine:

$rQuery = $Database->query('SELECT id FROM mytable LIMIT 10');

I tried playing with PDO::MYSQL_ATTR_MAX_BUFFER_SIZE and using PDO::prepare() and PDOStatement::execute() (although there are no parameters in the query above); neither had any effect. Any help would be appreciated.


If I understand this right, buffered queries tell PHP to wait for the entire result set to be transferred before processing begins. Before PDO, this was the default behavior, and you had to call mysql_unbuffered_query() if you wanted to process results as they arrived.

Why this is not explained on the MySQL PDO driver page, I do not know.
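What this implies for the code in the question is a sketch like the following, assuming the goal is to stream rows rather than load them all up front: setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false makes the driver hand rows to PHP on demand, so query() returns immediately. The connection details are the (hypothetical) ones from the question; adjust to your setup.

```php
<?php
// Connection details copied from the question; replace with your own.
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);

// false = unbuffered: rows are streamed as you fetch them
// instead of being buffered in full before query() returns.
$Database->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$rQuery = $Database->query('SELECT id FROM mytable');
foreach ($rQuery as $aRow) {
    print_r($aRow); // each row is fetched on demand
}
```

One caveat: with an unbuffered query the connection stays busy until every row has been fetched or the statement is closed with closeCursor(), so you cannot run a second query on the same connection mid-loop.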


You could try breaking it into chunks small enough not to cause problems:

<?php
$id = 0;
$rQuery = $Database->query('SELECT id FROM mytable ORDER BY id ASC LIMIT 100');

do {
    stuff($rQuery); // process the current chunk
    $id += 100;
    $rQuery = $Database->query(
        'SELECT id FROM mytable ORDER BY id ASC LIMIT 100 OFFSET ' . $id
    );
} while ($rQuery->rowCount() > 0); // query() returns a statement object even
                                   // for an empty result, so test the row count
?>

...you get the idea.


Or maybe you could try the old mysql functions:

while ($row = mysql_fetch_row($query)) {
    ...
}

Most likely it will be faster, since that foreach statement implies using fetchAll() instead of fetch()ing each row individually.
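If you want to stay with PDO rather than the legacy mysql_* API, the equivalent row-by-row loop uses PDOStatement::fetch() directly. A minimal sketch, assuming $Database is the PDO connection and mytable the table from the question:

```php
<?php
// Assumes $Database is the PDO connection set up in the question.
$rQuery = $Database->query('SELECT id FROM mytable');

// fetch() pulls one row per call rather than materialising
// the whole result set, as fetchAll() would.
while ($aRow = $rQuery->fetch(PDO::FETCH_ASSOC)) {
    print_r($aRow);
}
```

Note that this only avoids holding the full set in PHP memory if the query itself runs unbuffered; with a buffered query the driver has already transferred every row before the first fetch().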

