Need help to improve data import from text file
[eluser]thinzar[/eluser]
I'm currently working on an application (an import module) that imports data from a text file uploaded by the user. The import works fine if the record count is under 20,000, but it can't handle more than that. Possible problems I can think of are:

1. Since CI uses a persistent SQL connection, there might be a connection timeout problem.
2. Is it because the cache limit is exceeded, since it has to hold a long SQL query? (But even when I insert one record at a time, the connection timeout still occurs.)

This is my first project using CI, so I'm not very familiar with its functionality yet. Please give me suggestions on how to improve the performance, since the module needs to handle imports of large data sets. Or is there another method I can use (rather than trimming each record and inserting rows one by one)? A partial code snippet is below.

Code: function importData($filepath, $data, $datasource){

P.S. I'm using PostgreSQL. Will COPY instead of INSERT work?
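[Editor's note] Since the poster asks whether COPY would work, here is a minimal sketch of that route using PHP's pgsql extension, which exposes COPY directly through pg_copy_from(). This bypasses CI's database layer entirely; the connection string, the table name import_data, and the tab-delimited file format are all assumptions for illustration, not code from the thread.

Code:
<?php
// Hedged sketch: bulk-load an uploaded text file into PostgreSQL with
// COPY via pg_copy_from(), instead of building one huge INSERT.
// Connection details, table name, and delimiter are assumed here.
function importWithCopy($filepath)
{
    $conn = pg_connect('host=localhost dbname=mydb user=me password=secret');
    if ($conn === false) {
        return false;
    }

    // Read the file and normalize each record into the newline-terminated,
    // tab-delimited rows that pg_copy_from() expects.
    $rows = array();
    foreach (file($filepath, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $rows[] = trim($line) . "\n";
    }

    // COPY loads all rows in a single command, so there is no giant
    // multi-row INSERT string to hold in memory and no per-row query
    // overhead, which is where the timeouts likely come from.
    $ok = pg_copy_from($conn, 'import_data', $rows, "\t");
    pg_close($conn);
    return $ok;
}

If staying inside CI's query builder is preferred, another common workaround is inserting in chunks of a few hundred rows per query inside a transaction, but COPY is generally much faster for bulk loads of this size.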
[eluser]Chris Williams[/eluser]
I actually skipped trying to figure out this one using CI and chose a solution already out there. Check out http://www.phpcsvimporter.co.uk/ You may be able to pull code from it for your issue as well. Who knows. ;-)
[eluser]thinzar[/eluser]
[quote author="Chris Williams" date="1212357861"]I actually skipped trying to figure out this one using CI and chose a solution already out there. Check out http://www.phpcsvimporter.co.uk/ You may be able to pull code from it for your issue as well. Who knows. ;-)[/quote] Thanks for the suggestion... this can be one option.