iterating many rows consumes a LOT of memory
#1

I've got utility scripts I use to tidy up my data before deploying it to my production site. One script in particular has to fetch a lot of records and iterate them all (NOTE: I have *tried* to refactor this code but cannot find a way to do it without iterating all the records). This is causing CodeIgniter to chew up a LOT of memory, and I'm getting an 'out of memory' error. It seems to me like CI's mysqli driver might have a memory leak. Is there something I can do to improve memory management for this script? Here's an example script:
PHP Code:
public function example() {

    $this->db->save_queries = FALSE; // disables query logging

    $fetch_sql = "SELECT t1.col1, t2.col2, t3.col3, t4.*
        FROM
            table1 t1,
            table2 t2,
            table3 t3,
            table4 t4
        WHERE
            t1.foo = 'foo' AND
            t2.bar != 'bar' AND
            t3.arf = 'arf' AND
            t4.bowwow = 'bowwow'";

    $query = $this->db->query($fetch_sql)
        or die("fetch query failed: " . $fetch_sql);

    $total_records = 0;
    echo $query->num_rows() . " rows found\n";
    echo memory_get_usage(TRUE) . " memory consumed\n";

    while ($row = $query->unbuffered_row("array")) {

        $total_records++;

        // update the ind_cat_code for each record
        $mystring = "col1=" . $row['col1'] . ", col2=" . $row['col2'];

        if (($total_records % 1000) == 0) {
            echo "$total_records records\n";
            echo number_format(memory_get_usage(TRUE), 0) . " memory consumed\n";
        }

        if ($total_records >= 77000) {
            die('that is 77000');
        }

        unset($row); // I even tried this to try and free up memory but NO DICE

    } // for each record

    echo $total_records . " total records\n";

} // example()
The output of this particular script shows an increase in memory usage of exactly 1,572,864 bytes for every 1000 records iterated. Here's the output:
Code:
90121 rows found
23855104 memory consumed
1000 records
25,427,968 memory consumed
2000 records
26,738,688 memory consumed
3000 records
28,311,552 memory consumed
4000 records
29,622,272 memory consumed
5000 records
31,195,136 memory consumed
.
. etc
.
75000 records
131,858,432 memory consumed
76000 records
133,431,296 memory consumed

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in /var/www/sneakyimp-ci-project/system/database/drivers/mysqli/mysqli_result.php on line 168

Am I missing something? Note that I set $this->db->save_queries to FALSE and I'm using $query->unbuffered_row(), yet I still have this problem. I have been searching, and I see a post from four years ago which IMHO was prematurely closed.
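To rule out the iteration pattern itself, here's a DB-free sketch (the `row_stream()` generator is a stand-in I made up, not part of my script) showing that yielding one row at a time keeps memory flat, which suggests the growth comes from rows being retained somewhere inside the driver rather than from the loop:

```php
<?php
// DB-free sketch: a generator yields one synthetic "row" at a time, so
// nothing accumulates across iterations. row_stream() is a stand-in for
// the database fetch, not a real CI or mysqli call.
function row_stream(int $n): Generator {
    for ($i = 1; $i <= $n; $i++) {
        yield ['col1' => "value$i", 'col2' => $i];
    }
}

$start = memory_get_usage(TRUE);
$total = 0;
foreach (row_stream(90000) as $row) {
    $total++; // ...per-row work would go here...
}
$growth = memory_get_usage(TRUE) - $start;
// $growth stays near zero because no row survives past its iteration
echo "$total rows iterated, $growth bytes of growth\n";
```

Running this shows essentially no growth across 90,000 iterations, unlike the CI loop above which grows by ~1.5 MB per 1000 rows.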

My CI version is 3.0.7-dev downloaded from github today.
#2

Maybe someone has an idea, but the other angle is to figure out how you can change the database tables so you have IDs or dates or something, so that you can process the records in logical batches.
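Something like keyset pagination, for example. This is just a sketch of the paging logic (it assumes a hypothetical auto-increment `id` column, and uses a plain array in place of the database so the idea is clear; in CI it would be the equivalent of `WHERE id > ? ORDER BY id LIMIT ?`):

```php
<?php
// Sketch: process rows in keyset-paginated batches instead of holding
// one huge result set. Assumes a hypothetical auto-increment `id`
// column; a plain array stands in for the database table here.
function fetch_batch(array $table, int $last_id, int $batch_size): array {
    // Stand-in for: SELECT * FROM table1 WHERE id > ? ORDER BY id LIMIT ?
    $batch = array_filter($table, fn($row) => $row['id'] > $last_id);
    usort($batch, fn($a, $b) => $a['id'] <=> $b['id']);
    return array_slice($batch, 0, $batch_size);
}

// Simulated table of 10 rows.
$table = array_map(fn($i) => ['id' => $i, 'col1' => "value$i"], range(1, 10));

$last_id = 0;
$total   = 0;
while ($batch = fetch_batch($table, $last_id, 3)) {
    foreach ($batch as $row) {
        $total++;              // ...do the per-row tidy-up here...
        $last_id = $row['id']; // remember where this batch ended
    }
    // only one small batch is ever in memory at a time
}
echo "$total rows processed\n";
```

Each pass fetches only `$batch_size` rows starting after the last ID seen, so memory stays bounded no matter how many total records there are, and unlike OFFSET-based paging it doesn't slow down as you go deeper.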
#3

>>> chew up a LOT of memory and I'm getting an 'out of memory' error.

Are the search fields indexed? And why request all of the t4.* fields?



