Insert/Save performance on 1,000 records/minute
#2

[eluser]tonanbarbarian[/eluser]
I would assume that currently your biggest overhead is processing the XML.
There are a couple of questions I would ask yourself before deciding to change the process you have:

1. Is it working acceptably? If so, don't change it unless you really must.
2. Is your database on the same server as the code, or is it remote?
If it is remote, then reducing the number of queries you need to run will improve performance.
3. Is there any common data, i.e. is there data in jobStatus that has to come from the records, such as an id?
If so, leave the process as it is.
4. Are there any queries being run in the save methods apart from inserts?
For example, do you look up data before inserting for whatever reason?
If so, you might want to look at ways to cache the lookup data in memory if possible.
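To illustrate point 4, here is a rough sketch of caching lookup data in memory. The table and column names (`job_status`, `name`, `id`) are invented for the example, and it assumes CodeIgniter's database class; the point is simply that one SELECT up front replaces one SELECT per record:

```php
<?php
// Sketch only: caches a hypothetical status-name -> id lookup in memory
// so repeated saves don't each issue their own SELECT. Adapt table and
// column names to your actual schema.
class Status_lookup {
    private $CI;
    private $cache = null;

    public function __construct() {
        $this->CI =& get_instance();
    }

    public function id_for($name) {
        if ($this->cache === null) {
            $this->cache = array();
            // One query up front instead of one per record processed.
            $query = $this->CI->db->get('job_status');
            foreach ($query->result() as $row) {
                $this->cache[$row->name] = $row->id;
            }
        }
        return isset($this->cache[$name]) ? $this->cache[$name] : null;
    }
}
```

At 1,000 records per minute this turns roughly 1,000 lookup queries per minute into one per request, at the cost of holding the lookup table in memory — fine for a small reference table, not for a large one.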

Since you are processing 1,000 records per minute (and I assume this limit is based on the speed of the code), storing all the data to be saved in arrays is going to use up memory very quickly. As I alluded to earlier, the XML processing may consume considerable memory to begin with; if you also accumulate your data in arrays and then insert in batches, you may end up running out of memory.
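If you do go down the batching route, a compromise is to flush in fixed-size chunks while streaming the XML, so the buffer never grows unbounded. A minimal sketch, assuming CodeIgniter's `insert_batch()` and an invented `$records` array of rows parsed from the feed (the chunk size of 100 is a guess worth benchmarking):

```php
<?php
// Sketch: flush inserts in fixed-size chunks as the XML is processed,
// bounding both the row buffer's memory use and the total query count.
$buffer = array();
$chunk  = 100; // rows per multi-row INSERT; tune for your server

foreach ($records as $record) {   // $records: rows parsed from the XML
    $buffer[] = array(
        'job_id' => $record['job_id'],
        'status' => $record['status'],
    );
    if (count($buffer) >= $chunk) {
        // One multi-row INSERT for the whole chunk.
        $this->db->insert_batch('job_status', $buffer);
        $buffer = array();
    }
}
if (!empty($buffer)) {
    $this->db->insert_batch('job_status', $buffer); // flush the remainder
}
```

This keeps at most `$chunk` rows in memory at a time while still cutting the query count to roughly 1/100th — a useful middle ground if the database is remote (point 2) but memory is tight.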


Messages In This Thread
Insert/Save performance on 1,000 records/minute - by El Forum - 12-14-2010, 10:41 PM
Insert/Save performance on 1,000 records/minute - by El Forum - 12-15-2010, 12:02 AM


