How about a max_insert_batch method?


I deal with lots of data: several hundred thousand rows across several websites every day.
The data is not always clean (it comes from external sources), so insert_batch sometimes does not insert all of it, and that's OK.
But I'd like to maximize the amount of data actually inserted into the database, so I was thinking about extending DB_query_builder to add a "max_insert_batch" method, which would do almost the same as insert_batch, except for this:
each time you call _insert_batch, you check the number of affected_rows
  if it differs from the number of rows sent (which is batch_size, except for the last batch), you recursively call insert_batch with the same block of data you just used, but with a batch_size half as large (unless, of course, batch_size is already 1)
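The recursive halving described above can be sketched roughly as follows. This is a Python illustration, not CodeIgniter code: `insert_batch` here is a hypothetical stand-in that simulates a multi-row INSERT failing wholesale when the batch contains a bad row, and `max_insert_batch` is the proposed retry logic.

```python
def insert_batch(rows):
    """Hypothetical stand-in for the driver call; returns rows inserted.
    A batch containing None simulates dirty data and is rejected wholesale,
    the way a failed multi-row INSERT would insert nothing."""
    if all(r is not None for r in rows):
        return len(rows)  # whole batch accepted
    return 0              # batch rejected

def max_insert_batch(rows, batch_size=100):
    """Insert as many rows as possible, halving batch_size on failure."""
    inserted = 0
    for start in range(0, len(rows), batch_size):
        block = rows[start:start + batch_size]
        affected = insert_batch(block)
        if affected == len(block):
            inserted += affected
        elif batch_size > 1:
            # affected_rows != rows sent: retry the same block,
            # but with a batch size twice as small
            inserted += max_insert_batch(block, max(1, batch_size // 2))
        # at batch_size == 1 the single bad row is simply skipped
    return inserted
```

With one bad row in ten, the recursion narrows down to it and inserts the other nine, at the cost of a few extra INSERT round trips on the dirty blocks.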

Your opinion?


There is also a limit set on how many rows can be inserted in MySQL itself.


