Active Record large result memory exhausted
#11

[eluser]Juan Ignacio Borda[/eluser]
That's why programming is such an exciting job: one day you're working with a shopping cart, and the next you end up trying to run a process over a million records!


My problem right now is that I'm trying to do it with CI.
I know at least three other ways to do the same thing without using CI, but I made this post hoping for an answer to why I can't open a table with a million records.
In fact, I can hack the library to do what I want.
#12

[eluser]Mat-Moo[/eluser]
I've had similar issues, but it always boiled down to a bad SQL query. After running the query, try print $this->db->last_query(); and run the output in phpMyAdmin (unless it's really, really simple).

#13

[eluser]gRoberts[/eluser]
As InsiteFX has said: why "WOULD" you want to open a table with a million records?

Any developer worth their salt should know that large tables must be paged/paginated into chunks.

I've only ever come across two instances where you would "need" to process large amounts of data in one go, and in those instances I used a direct connection to the database from PHP, i.e. mysql_connect.

CodeIgniter and ActiveRecord (IMO) have been built for the common developer who, every day, has to keep writing the same code over and over again. They've been built so that accessing single rows or small result sets is made extremely easy, where normally it would require a lot more work.

If you need to process large result sets, I would suggest using plain PHP and MySQL calls, so that CI and ActiveRecord do not consume any additional memory during the processing...
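To illustrate the "plain PHP and MySQL calls" approach, here is a minimal sketch using mysqli (the modern equivalent of the mysql_* functions mentioned above) with an unbuffered query. The credentials and table name are placeholders, not anything from this thread.

```php
<?php
// Sketch only: streaming rows with an unbuffered mysqli query.
// Host, credentials and database name are placeholders.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// MYSQLI_USE_RESULT fetches rows from the server one at a time
// instead of buffering the entire result set in PHP's memory.
$result = $db->query('SELECT * FROM ci_mem_test', MYSQLI_USE_RESULT);

while ($row = $result->fetch_object()) {
    // process $row here; memory use stays flat regardless of row count
}

$result->free();
$db->close();
```

One caveat with unbuffered results: you must read (or free) every row before issuing another query on the same connection.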
#14

[eluser]Juan Ignacio Borda[/eluser]
Hey thanks for the replies.

The query is very simple, just a select from a table.
The explanation of my needs is irrelevant to the bug/defect; you can't criticize my needs based on your own.
Maybe it's not the most common scenario, but an abstraction layer should be capable of handling 1M records seamlessly, like most other abstraction layers do.

If the DB layer of CI is intended for small result sets, that should be noted and emphasized in the docs, and the library should be made safe in that respect and handle the exceptions.

Let's say you start with an empty database and for some reason you end up with a big result set: then you will hit a memory exhaustion error.
#15

[eluser]gRoberts[/eluser]
If you can provide an actual reason for returning 1M records in one result set, then maybe there is a flaw in the CI DB system. Under normal circumstances, though, even databases with more than 1M records work fine with the CI DB system through pagination or by processing the data correctly.

For example, regardless of how many records you have, you should use pagination to break down the results when presenting them, i.e.

Code:
select * from Table limit 0, 100

which returns 100 rows.

If you have 1M records that you need to process in one movement, you can either use the native mysql calls that PHP provides, or, what's probably better, batch-process them in, say, 10,000 rows at a time.
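A rough sketch of that batch approach with plain mysqli, looping over the table in LIMIT/OFFSET chunks. The connection details, table name and chunk size are placeholders.

```php
<?php
// Sketch only: batch-processing a large table in fixed-size chunks.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$chunk  = 10000;
$offset = 0;

do {
    // Fetch one chunk; ordering by the key keeps chunks stable.
    $result = $db->query(
        "SELECT * FROM ci_mem_test ORDER BY id LIMIT $offset, $chunk"
    );

    $rows = $result->num_rows;
    while ($row = $result->fetch_object()) {
        // process $row here
    }

    $result->free();          // release this chunk before fetching the next
    $offset += $chunk;
} while ($rows === $chunk);   // a short chunk means we reached the end

$db->close();
```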
#16

[eluser]Juan Ignacio Borda[/eluser]
I don't need to present this, just run a process over those records, so pagination is not needed.
#17

[eluser]Juan Ignacio Borda[/eluser]
Here's a controller to reproduce the error:

Code:
<?php
//----file name: /controllers/testdb.php
if (!defined('BASEPATH'))
    exit('No direct script access allowed');

class Testdb extends CI_Controller {

    public function index() {
        echo "<h1>DB Memory Exhaustion TEST (128MB)</h1>";
        echo "<ol>";
        echo "<li><a href='testdb/create_db'>Create a table with 400k records</a></li>";
        echo "<li><a href='testdb/run_test'>Run test</a></li>";
        echo "</ol>";
    }

    public function run_test() {
        $query = $this->db->get('ci_mem_test');

        foreach ($query->result() as $row) { //<---- This line will exhaust 128MB

        }
    }

    public function create_db() {
        //---load dbforge
        set_time_limit(3600);
        $this->load->dbforge();

        $this->dbforge->drop_table('ci_mem_test');
        $this->dbforge->add_field('id');
        $this->dbforge->create_table('ci_mem_test');

        $object = new stdClass();
        $object->id = 0;
        for ($i = 1; $i <= 400000; $i++) {
            $this->db->insert('ci_mem_test', $object);
        }
        //$this->output->enable_profiler(TRUE);
    }

}

/* End of file testdb.php */
/* Location: ./application/controllers/testdb.php */
#18

[eluser]CroNiX[/eluser]
It would be better to create a detailed issue on GitHub if you really want this looked at. If you come up with a solution, issue a pull request; that's more likely to be integrated than just creating an issue.
#19

[eluser]Mat-Moo[/eluser]
OK, had a quick poke around and tried your code. CI seems to cache the whole result in memory and then serve it from there, e.g. (my MySQL driver):

Code:
function result_object()
{
    if (count($this->result_object) > 0)
    {
        return $this->result_object;
    }

    // In the event that query caching is on the result_id variable
    // will return FALSE since there isn't a valid SQL resource so
    // we'll simply return an empty array.
    if ($this->result_id === FALSE OR $this->num_rows() == 0)
    {
        return array();
    }

    $this->_data_seek(0);
    while ($row = $this->_fetch_object())
    {
        $this->result_object[] = $row;
    }

    return $this->result_object;
}

I don't see any obvious way to avoid this, so while pagination is not what you want, processing records 10,000 at a time would probably solve your issue... or process the query result with standard mysql commands.
#20

[eluser]CroNiX[/eluser]
You don't really need 'pagination', just use a limit with an offset. It's not very efficient to retrieve millions of rows even if you have enough RAM. It's best to do it in batches of 5,000 or so, freeing the result in between and using the offset/limit. Surely you aren't displaying that many records, just doing some sort of processing. This leaves you plenty of free resources for other operations, like visitors on the website and whatnot.
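Inside a CI controller, that limit/offset batching with free_result() might look something like the sketch below (the table name and batch size are placeholders):

```php
<?php
// Sketch only: limit/offset batching through CI's Active Record,
// freeing each result before fetching the next batch.
$batch  = 5000;
$offset = 0;

do {
    $this->db->limit($batch, $offset);
    $query = $this->db->get('ci_mem_test');

    foreach ($query->result() as $row) {
        // process $row here
    }

    $rows = $query->num_rows();
    $query->free_result();   // release this batch's memory before the next
    $offset += $batch;
} while ($rows === $batch);  // stop once a batch comes back short
```

For very large offsets it can be faster to seek on the primary key instead (WHERE id > $last_id LIMIT $batch), since MySQL has to scan past all the skipped rows when using an offset.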



