uploading and processing large files into a database
#1

[eluser]sherwoodforest[/eluser]
I am trying to keep a web app in sync with our company's business software. Because the business software exposes no hooks I can connect to, I have to dump a large amount of data (200 MB+), process it, and import only the parts that changed into the web app, either weekly or daily depending on how fast I can make the synchronization run. I would like suggestions on how to approach this problem.
Do I upload the raw data into a temporary table and then look for changes since the last upload?
Do I read it a few lines at a time and check whether each line differs from what I got last time?

Or is there something better? Any suggestions?

Is there a CI version of fgets() or stream_get_line()?
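For reference, a minimal sketch of the line-by-line read I have in mind, using plain PHP's fgets() (the path and the tab-delimited format are just assumptions about my export):

[code]
<?php
// Stream the 200 MB+ export line by line so it never has to fit in memory.
// '/tmp/leads_export.txt' and the tab delimiter are hypothetical.
$handle = fopen('/tmp/leads_export.txt', 'r');
if ($handle === false) {
    exit('Could not open export file');
}

while (($line = fgets($handle)) !== false) {
    $fields = explode("\t", rtrim($line, "\r\n"));
    // ... compare $fields against the last import here ...
}

fclose($handle);
[/code]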
#2

[eluser]coolgeek[/eluser]
Nothing you can write in PHP/CI is going to match the performance of native MySQL processing. So, to start with, loading into a temp table is a better option than trying to parse the file yourself.

The next thing you want to do is try to limit the amount of data you have to import. Where is the data coming from? Is it timestamped on import, on update, or both?
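Something like this in a model would hand the bulk parsing to MySQL. This is only a rough sketch: it assumes a tab-delimited file with a header row, LOCAL INFILE enabled on the server, and a leads_temp staging table that mirrors the live table (all names are hypothetical).

[code]
<?php
class Lead_import_model extends CI_Model {

    // Bulk-load the raw export into a staging table with native MySQL,
    // instead of parsing the file row by row in PHP.
    public function load_into_temp($path)
    {
        $this->db->query('TRUNCATE TABLE leads_temp');

        $sql = "LOAD DATA LOCAL INFILE " . $this->db->escape($path) . "
                INTO TABLE leads_temp
                FIELDS TERMINATED BY '\\t'
                LINES TERMINATED BY '\\n'
                IGNORE 1 LINES";

        return $this->db->query($sql);
    }
}
[/code]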
#3

[eluser]sherwoodforest[/eluser]
I would love to be able to limit the data

where is the data coming from?

I get a text file of all leads, and I can only limit it by geography. It doesn't give me a way to pull just the ones that changed, so I don't have timestamps or a last-updated field, only the current state of each lead.

So it looks like I get to load it into a temp table and then do a compare between the live table and the temp table to find the modified records (which should normally be <0.01% of the records).
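Roughly what I am picturing for the compare, assuming both tables share a lead_id key (the status/owner/phone columns are just placeholders for whatever fields the export actually has):

[code]
<?php
// Find rows in the staging table that are new or differ from the live table.
$sql = "SELECT t.*
        FROM   leads_temp t
        LEFT JOIN leads l ON l.lead_id = t.lead_id
        WHERE  l.lead_id IS NULL        -- brand new lead
           OR  l.status <> t.status     -- or a tracked column changed
           OR  l.owner  <> t.owner
           OR  l.phone  <> t.phone";

$changed = $this->db->query($sql)->result();
// $changed should be only the small fraction of rows that actually need
// to be inserted or updated in the live table.
[/code]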



