Memory problem with huge sql insertion [solved]
#1

[eluser]dede125a[/eluser]
Hello guys,

I have to insert a lot of records into a SQL database (MySQL) from formatted files (CSV), and I use this code:
Code:
$total = 0;
while (!feof($lefichier)) {
    $ligne = fgets($lefichier);
    if ($ligne != '') {
        //log_message('debug', $ligne);
        list($nom, $prenom, $sexe, $daten, $ine, $us1, $us2, $bac, $us3, $us4, $avis) = explode("|", $ligne);
        // if the bac is anything other than general or technical, don't insert the data
        if ($bac == "430" || $bac == "470") {
            $lignedata['nom']    = trim($nom);
            $lignedata['prenom'] = trim($prenom);
            $lignedata['ine']    = trim($ine);
            $lignedata['bac']    = trim($bac);
            $lignedata['result'] = trim($avis);

            // insert into the database
            $this->db->insert('resultats', $lignedata);
            if (mysql_error()) {
                $message = "<h3>An error occurred while inserting the data</h3>\n";
                $message .= "<p>" . mysql_errno() . ": " . mysql_error() . "</p>\n";
                break;
            }
            $total++;
            $message = "<h3>The data was inserted successfully</h3>\n";
        } // endif bac type
    } // endif empty line
} // endwhile
$this->firephp->info($message);
fclose($lefichier);
The files contain over 20,000 records, and during the insertion I got this error:
Code:
[26-Jun-2009 15:19:11] PHP Fatal error:  Allowed memory size of 8388608 bytes exhausted (tried to allocate 38 bytes) in /Volumes/Serveur/web/opebac2009/system/database/drivers/mysql/mysql_driver.php on line 442

Because I'm not an expert, I'm hoping some good guys could help me find a better way?

Thanks in advance Smile
#2

[eluser]TheFuzzy0ne[/eluser]
I'd suggest doing the insert in two stages. Stage one: parse the CSV into a file of insert statements (containing, say, 100 rows each). Stage two: read through the resulting file and fire each line (which will be a full insert query) against the database.
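A minimal sketch of that two-stage idea, using only two of the CSV columns for brevity. The file names and the 100-row chunk size are assumptions, and escaping is left to mysql_real_escape_string(), which needs an open MySQL connection:

```php
// Stage one: convert the pipe-delimited CSV into a file of multi-row
// INSERT statements, 100 rows per statement. File names and the chunk
// size are assumptions, not from the original post.
$in  = fopen('export.csv', 'r');
$out = fopen('inserts.sql', 'w');
$values = array();
while (($ligne = fgets($in)) !== false) {
    $f = explode('|', $ligne);
    $values[] = sprintf("('%s','%s')",
        mysql_real_escape_string(trim($f[0])),   // nom
        mysql_real_escape_string(trim($f[1])));  // prenom
    if (count($values) == 100) {
        fwrite($out, "INSERT INTO resultats (nom, prenom) VALUES "
            . implode(',', $values) . ";\n");
        $values = array();
    }
}
if (count($values)) { // flush the last partial chunk
    fwrite($out, "INSERT INTO resultats (nom, prenom) VALUES "
        . implode(',', $values) . ";\n");
}
fclose($in);
fclose($out);

// Stage two: replay the generated file, one full query per line.
$sqlfile = fopen('inserts.sql', 'r');
while (($query = fgets($sqlfile)) !== false) {
    $this->db->query($query);
}
fclose($sqlfile);
```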

Looking at your code, whilst it would hammer the database, I can't see where the bottleneck is. What's the size of the largest entry you need to insert?

EDIT: It might also be worth seeing if you can up the amount of RAM you have allocated. In my limited experience, a lot of Web hosts tend to offer a 16MB RAM limit.
#3

[eluser]Michael Wales[/eluser]
Depends on your environment - I do something very similar to this quite often at work and prefer Python. I read a row of data, perform a few selects around the database to validate foreign keys, then perform an insert. Database server is within my local network, so the traffic isn't bad and the server can easily handle the punishment of 3-4 selects per row and thousands of insertions.
#4

[eluser]xwero[/eluser]
try LOAD DATA INFILE
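For reference, a LOAD DATA INFILE statement matching the eleven pipe-delimited columns of the original code might look like this. The server-side file path is an assumption, and the @-variables simply absorb the fields that aren't inserted:

```php
// Hypothetical sketch: let MySQL parse the pipe-delimited file itself.
// Column names follow the original code; the file path is an assumption.
// User variables (@sexe, @daten, ...) discard the unused fields.
$sql = "LOAD DATA INFILE '/path/to/resultats.csv'
        INTO TABLE resultats
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'
        (nom, prenom, @sexe, @daten, ine, @us1, @us2, bac, @us3, @us4, result)";
$this->db->query($sql);
```

Note that the original code's filtering (only bac 430 or 470) would still need a follow-up DELETE, since LOAD DATA INFILE imports every row.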
#5

[eluser]jedd[/eluser]
You don't say how many lines in a file you can cope with - have you tried to narrow it down? If you're close to your 20k threshold, then it might be easier just to bump up the php memory limit. From the error you posted it looks like this is set to the August 1997 default of 8MB .. which is a bit too retro for a hip new server. As I say, if you're getting close then this would be an easier solution, but until you ascertain how badly your script is choking on memory resources I wouldn't jump into a redesign.
#6

[eluser]mddd[/eluser]
I can't really see why it would take up so much memory. You are handling only one record at a time, so the MySQL insert is not so big; there are just many lines!

Maybe your file is formatted in such a way that PHP doesn't see the line endings. So, it tries to read everything all at once. If your PHP memory limit is not much bigger than the file, that would cause problems.

Try the auto_detect_line_endings setting of PHP.
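If the export comes from a tool that uses bare "\r" line endings, fgets() can indeed treat the whole file as one giant line. The setting has to be enabled before the file is opened; the file name here is an assumption:

```php
// Detect Mac-style "\r" line endings as well as "\n" / "\r\n",
// so fgets() returns one record at a time instead of the whole file.
ini_set('auto_detect_line_endings', '1');
$lefichier = fopen('export.csv', 'r');
```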
#7

[eluser]Phil Sturgeon[/eluser]
You need to build up an insert string instead of inserting one at a time.

You are currently doing this.

INSERT INTO (col1, col2) VALUES (val1, val2);
INSERT INTO (col1, col2) VALUES (val1, val2);
INSERT INTO (col1, col2) VALUES (val1, val2);
INSERT INTO (col1, col2) VALUES (val1, val2);
INSERT INTO (col1, col2) VALUES (val1, val2);
INSERT INTO (col1, col2) VALUES (val1, val2);

And you want to be doing this:

INSERT INTO (col1, col2) VALUES
(val1, val2),
(val1, val2),
(val1, val2),
(val1, val2),
(val1, val2);
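In CodeIgniter the multi-row form doesn't have to be concatenated by hand: if your CI version ships Active Record's insert_batch() (added around 1.7.2; check your version), you can buffer rows and flush them in chunks. A sketch, with the 100-row chunk size as an assumption:

```php
$rows = array();
while (!feof($lefichier)) {
    // ... parse $lignedata from the line, as in the original code ...
    $rows[] = $lignedata;
    if (count($rows) >= 100) {
        // one multi-row INSERT instead of 100 single-row ones
        $this->db->insert_batch('resultats', $rows);
        $rows = array();
    }
}
if (!empty($rows)) {
    $this->db->insert_batch('resultats', $rows); // flush the remainder
}
```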
#8

[eluser]dede125a[/eluser]
First, thanks all for your suggestions.

A typical line is something like this:
NAME |FIRSTNAME |F|1990/02/26|M605000100 |0130003H|0130178Y|470|1|L |B|

So I'll test the different options and I'll report back soon Smile
#9

[eluser]Marcelo Lipienski[/eluser]
Try putting this into your .htaccess file:

Code:
php_value max_execution_time 120
php_value memory_limit 64M

If it works, please send feedback Smile
#10

[eluser]dede125a[/eluser]
The right solution in this case was to increase the memory limit (which was indeed extremely low) ^^'

I can't use an external solution (like Python) because PHP was the only language permitted by the client (not too hard with CI)! :p

LOAD DATA INFILE can't be the solution, because I can't edit the file content: I'm not the final operator. One person puts the file on the server (via FTP), and another person exploits the file through the interface I'm working on. And some parts of the file need treatment before insertion (but I discovered a useful SQL function, thanks).

The multi-row INSERT was an excellent solution too, but increasing the memory limit was sufficient, so...

Thanks to all! Smile



