I have been using DMZ DataMapper on large result sets (100,000s of rows) and it quickly runs out of memory, because CI loads result sets in full. As a workaround, I've been using this pattern:
Code:
$model = new Model_Object();
# Build a query that returns a big result set, but instead of calling get(), call get_sql()
$sql = $model->..where clauses,etc....->get_sql();
$results = $this->db->query($sql);
if ($results->num_rows() > 0) {
    # Iterate over the rows one at a time. _fetch_object() is the DB driver's
    # internal per-row fetch, so CI never builds the full result array in memory.
    while ($row = $results->_fetch_object()) {
        # Do something with the row
    }
}
It might be useful for anyone dealing with large result sets if this pattern were part of the library. In place of get(), you could call something like get_cursor(), and then to read rows you could call something like fetch(), which would return a model object populated with the next row. If the query was not generated with get_cursor(), fetch() would throw an error.
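To make the suggestion concrete, here is a rough sketch of what such an extension might look like. This is only an illustration, not part of DataMapper: the class name DataMapperCursor and the methods get_cursor() and fetch() are the hypothetical names proposed above, and the sketch assumes get_sql() and the driver's _fetch_object() behave as in the workaround.

```php
<?php
// Hypothetical sketch only -- none of this exists in DMZ DataMapper today.
class DataMapperCursor extends DataMapper {

    protected $cursor = NULL;

    // Build the query as usual, but run it directly through the DB driver
    // so rows can be fetched one at a time instead of being buffered
    // into $this->all by get().
    public function get_cursor()
    {
        $this->cursor = $this->db->query($this->get_sql());
        return $this;
    }

    // Return a model object populated with the next row,
    // FALSE when the rows are exhausted.
    public function fetch()
    {
        if ($this->cursor === NULL) {
            // The proposed error case: fetch() without get_cursor()
            show_error('fetch() called on a query not built with get_cursor()');
        }
        $row = $this->cursor->_fetch_object();
        if ($row === FALSE) {
            return FALSE;
        }
        // Populate a fresh model instance with the row's fields.
        $class = get_class($this);
        $item  = new $class();
        foreach ($row as $field => $value) {
            $item->{$field} = $value;
        }
        return $item;
    }
}
```

Usage would then mirror the workaround, but stay inside the model API:

Code:
$model = new DataMapperCursor();
$model->where('status', 'active')->get_cursor();
while ($row = $model->fetch()) {
    # Do something with $row
}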