Exceeded execution time
#1

[eluser]belial.seed[/eluser]
Hi, I'm having trouble generating a file from a big resultset. I'll explain the problem: the code I'm using works for queries that return around 5,000 rows, and it's really fast. However, when I try the same thing with a bigger resultset (about 4 million rows with 10 fields each), it seems I run out of time. I've set max_execution_time = 300 in my php.ini and I've tried limiting the resultset to half a million rows, but I still have the same problem. A little light on my way would be appreciated; thanks for your attention.

The code I'm using:
Code:
$this->load->dbutil();
$this->load->library('zip');
$query = $this->gic_inventario->inventarios2($tmp01,$tmp02,$tmp03);
$delimiter = "\t";
$newline = "\r\n";
$datosArchivo=$this->dbutil->csv_from_result($query,$delimiter,$newline);
$nombreArchivo = date("Ymd_G-i").'_file.txt';
$rutaArchivo = './aplicacion/tmp/files/';
$nombreZip = date("Ymd_G-i").'_file.zip';
$this->zip->add_data($nombreArchivo, $datosArchivo);
$this->zip->archive($rutaArchivo.$nombreZip);
$this->zip->download($nombreZip);

P.S. The query alone takes about 2 minutes to complete.
#2

[eluser]tkyy[/eluser]
At the top of the file that has that code, try:

Code:
<?php
set_time_limit(0);
?>
#3

[eluser]belial.seed[/eluser]
Thanks for your answer tkyy, it works. Now I have a different issue: I'm exhausting the allowed memory. I've set 512 MB in php.ini and I'm only processing 200,000 rows at a time; however, when I run the query to retrieve the next set of rows, the error shows up, and it seems like the unset() function doesn't free the memory used by the resultset.

Here's the code:
Code:
// get the number of rows
$size=200000;
$query = $this->gic_pedidos->exporta_pedidos(1,0,0);
$counter= $query->result_array();
$veces=ceil($counter['0']['ctd']/$size);
$datosArchivo='';
// write temporary files
for($i=0;$i<2;$i++){
   $limiteInf=$i*$size;
   $limiteSup=($i+1)*$size;
   $query2 = $this->gic_pedidos->exporta_pedidos(2,$limiteInf,$limiteSup);
   $delimiter = "\t";
   $newline = "\r\n";
   $datosArchivo=$this->dbutil->csv_from_result($query2,$delimiter,$newline);
   if ( ! write_file('./aplicacion/temp/files/'.date("Ymd_G-i").'_file_'.$i.'.txt',
                      $datosArchivo))
      {
      echo 'Error! file:'.$i;
      }
   unset($query2,$datosArchivo);
}
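One thing worth noting about the loop above: it computes $veces with ceil() but then hard-codes two passes. A minimal plain-PHP sketch of the chunk-window arithmetic (no CodeIgniter involved; the function name is illustrative, not from the thread):

```php
<?php
// Sketch: compute the LIMIT windows needed to export $total rows
// in chunks of $size, driving the loop off ceil() rather than a
// hard-coded pass count.
function chunk_limits(int $total, int $size): array
{
    $chunks = [];
    $veces = (int) ceil($total / $size); // number of passes
    for ($i = 0; $i < $veces; $i++) {
        $chunks[] = ['offset' => $i * $size, 'count' => $size];
    }
    return $chunks;
}
```

For example, chunk_limits(500000, 200000) yields three windows, with offsets 0, 200000, and 400000 and a count of 200000 each.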
#4

[eluser]danmontgomery[/eluser]
Try:

Code:
$this->db->free_result();

Instead of unset(). (Or in addition to it; leaving the unset() in won't hurt anything.)
#5

[eluser]icomefromthenet[/eluser]
You can use this undocumented method from the CI db library. It just wraps the underlying PHP function, so it won't load the entire result object into memory the way the documented CI methods do. I've found in the past that it helps conserve memory:

Code:
while ($result = $data_row->_fetch_assoc()) {
    // process one row at a time; only the current row is held in memory
}
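A fetch loop like that pairs naturally with writing each row straight to disk instead of accumulating the whole CSV in one string, which is what keeps memory flat. A plain-PHP sketch of the idea (the iterable stands in for the row-by-row fetch loop; nothing here is CodeIgniter API):

```php
<?php
// Sketch: stream rows to a tab-delimited file one at a time.
// Memory stays bounded by the size of a single row, unlike
// building the entire file contents in a string first.
function stream_rows_to_csv(iterable $rows, string $path, string $delimiter = "\t"): int
{
    $fh = fopen($path, 'w');
    $written = 0;
    foreach ($rows as $row) {
        // Write and discard each row immediately.
        fwrite($fh, implode($delimiter, $row) . "\r\n");
        $written++;
    }
    fclose($fh);
    return $written;
}
```

With the fetch-loop approach above, each `$result` from `_fetch_assoc()` would be written inside the loop body the same way.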
#6

[eluser]belial.seed[/eluser]
Thanks for your answer noctrum, however it seems like the resultset is not being cleared. Here is the Xdebug error log:

Code:
( ! )  Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 43 bytes) in C:\xampp\htdocs\gic\gic_qp\aplicacion\database\drivers\mysql\mysql_result.php on line 147
Call Stack
#    Time        Memory       Function    Location
1    0.0008      381688       {main}( )    ..\index.php:0
2    0.0016      431360       require_once( 'C:\xampp\htdocs\gic\gic_qp\aplicacion\codeigniter\CodeIgniter.php' )    ..\index.php:115
3    0.0668      5550040      call_user_func_array ( )    ..\CodeIgniter.php:236
4    0.0668      5550088      Rpt_Gic_Pedidos->a_csv( )    ..\CodeIgniter.php:0
5    179.7741    6051336      CI_DB_utility->csv_from_result( )    ..\rpt_gic_pedidos.php:182
6    179.7744    6051648      CI_DB_result->result_array( )    ..\DB_utility.php:214
7    194.0130    536828792    CI_DB_mysql_result->_fetch_assoc( )    ..\DB_result.php:108
8    194.0130    536828824    mysql_fetch_assoc ( )    ..\mysql_result.php:147

I'm clearing and unsetting like this:
Code:
$query2->free_result();
unset($datosArchivo);

Is there a way to clear the resultset the hard way?
#7

[eluser]belial.seed[/eluser]
The error was in the query limit: the second time around it brought back 400,000 rows. Thanks for your help.
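That symptom fits MySQL's LIMIT semantics: the clause is `LIMIT offset, row_count`, where the second argument is a row count, not an upper row number. The loop in post #3 passed $limiteSup = ($i + 1) * $size as the second argument, so the second pass asked for up to 400,000 rows. A hedged sketch of the corrected clause (helper name is illustrative):

```php
<?php
// Sketch: build a correct LIMIT clause for chunk $chunk of size
// $size. The second LIMIT argument stays $size on every pass;
// only the offset advances.
function limit_clause(int $chunk, int $size): string
{
    $offset = $chunk * $size;
    return "LIMIT {$offset}, {$size}";
}
```

So pass 0 is `LIMIT 0, 200000` and pass 1 is `LIMIT 200000, 200000`, each fetching at most one chunk.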



