[eluser]ciuser99[/eluser]
I basically agree with the assessment that I need to break the download into smaller pieces. Right now it downloads 100 records at a time and builds each bulk INSERT statement by string concatenation. I build one bulk insert string per table and execute each one separately, four in total.
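Not CodeIgniter code, but a minimal Python/sqlite3 sketch of that chunking idea; the table name `records`, its columns, and the chunk size of 100 are placeholders. Using a parameterized `executemany` per chunk avoids hand-building the SQL string, and committing after each chunk releases locks between batches instead of holding them for the whole run:

```python
import sqlite3

def bulk_insert_chunked(conn, rows, chunk_size=100):
    """Insert rows in fixed-size chunks with a parameterized
    statement instead of one giant concatenated INSERT string."""
    cur = conn.cursor()
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start:start + chunk_size]
        # one parameterized INSERT per row in the chunk, all in one transaction
        cur.executemany("INSERT INTO records (name, value) VALUES (?, ?)", chunk)
        conn.commit()  # commit per chunk so locks are released between chunks

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (name TEXT, value INTEGER)")
bulk_insert_chunked(conn, [("r%d" % i, i) for i in range(250)])
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 250
```

The same pattern applies per table, so the four bulk inserts become four chunked loops rather than four single long-running statements.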
Right now the bulk insert is a manual process: the user clicks a button to trigger it. I was planning to make it a cron job running every hour on the hour, but the site-stalling issue has given me second thoughts, and the reliability of the bulk insert itself gives me pause.
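For reference, "every hour on the hour" would be a crontab entry like the following; the PHP binary and script paths here are hypothetical placeholders, not anything from my setup:

```
# run the import at minute 0 of every hour; paths are examples only
0 * * * * /usr/bin/php /path/to/import_script.php >> /var/log/import.log 2>&1
```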
I tried executing an INSERT for each record, but that was costly in processing time. That is how I originally coded it, and it stalled the site until the bulk inserts completed. It's a challenge, but an interesting one that I'd like to solve. Something CodeIgniter is doing, possibly in its initialization code, may be preventing me from accessing the database while the bulk insert is processing, perhaps in the way it accesses the database.
I've studied other ways programmers bulk-update their databases while users are hitting the site, and I wonder what the advantages and disadvantages of these strategies are.
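One strategy I've seen is the staging-table swap: load the fresh data into a separate table while readers keep using the live one, then swap the tables at the end so the live table is never mid-load. A hedged sketch, again in Python/sqlite3 with made-up table names; on MySQL, `RENAME TABLE old TO tmp, staging TO old` can swap both names in one atomic step, which this simplified version doesn't do:

```python
import sqlite3

def refresh_via_staging(conn, rows):
    """Load new data into a staging table, then rename it into
    place so readers never see a partially loaded table."""
    cur = conn.cursor()
    cur.execute("DROP TABLE IF EXISTS records_staging")
    cur.execute("CREATE TABLE records_staging (v INTEGER)")
    cur.executemany("INSERT INTO records_staging (v) VALUES (?)",
                    [(r,) for r in rows])
    # the swap itself is fast; the slow load happened off to the side
    cur.execute("DROP TABLE IF EXISTS records")
    cur.execute("ALTER TABLE records_staging RENAME TO records")
    conn.commit()

conn = sqlite3.connect(":memory:")
refresh_via_staging(conn, range(500))
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 500
```

The trade-off: it needs roughly double the storage during the load, and it only fits full-refresh workloads, not incremental updates.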
Thanks for your input. I'm gonna keep searching...