best practice: multiple updates
#1

[eluser]dmol[/eluser]
Hi all,
I'm looking for a way to keep track of which tables/fields in a DB need to be updated.
Let's say I have 6 tables sharing a common key ($id), and there are also some duplicated fields. When I'm doing an update, I need to keep track of which tables and which fields to update. Is there a good way of doing this? My approach is as follows:

pseudo code:
1/ Get a list of the tables/field names used in the query:
$data['mytable']['fields'] = $this->db->list_fields('mytable');
-- repeat for each table queried --

2/ Process the form.
3/ Compare the existing field values to the new values and store the changes as $changes[$field_name] => $value.
4/ foreach field in $data[$table]['fields']:
if isset($changes[$field]) --> call an update function with args ($id, $table, $field, $value)
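Steps 3 and 4 above can be sketched in plain code. This is a Python sketch of the pattern for illustration (the thread is about the approach rather than CodeIgniter specifics); the `diff_fields`/`plan_updates` names and the sample tables are made up:

```python
# Step 3: diff the old row against the form data.
def diff_fields(old, new):
    """Return {field: new_value} for every field whose value changed."""
    return {f: v for f, v in new.items() if old.get(f) != v}

# Step 4: route each changed field to every table that contains it,
# which also handles the duplicated fields mentioned above.
def plan_updates(changes, table_fields):
    """table_fields: {table: [field, ...]}  (step 1's output).
    Returns a list of (table, field, value) update operations."""
    return [(table, field, changes[field])
            for table, fields in table_fields.items()
            for field in fields
            if field in changes]

old = {"name": "Dave", "email": "dave@example.com", "city": "Cork"}
new = {"name": "Dave", "email": "dave@new.example.com", "city": "Dublin"}
tables = {"users": ["name", "email"], "profiles": ["email", "city"]}

changes = diff_fields(old, new)
ops = plan_updates(changes, tables)
```

Because `email` is duplicated across two tables, it yields two update operations, while the unchanged `name` yields none.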

Is there a streamlined way of doing a search-then-update across multiple tables? I find the above approach too cumbersome.
Thanks
Dave
#2

[eluser]toopay[/eluser]
If you need to do maintenance or regular checks, use cron. If you need to make sure all your database operations, and all the functions related to them, stay consistent, write unit tests.
#3

[eluser]dmol[/eluser]
Sorry. What?
#4

[eluser]toopay[/eluser]
I suspect you are trying to keep your database table fields consistent with your model functions, so I suggested writing unit tests. If I misunderstood, describe your problem/issue. And please use code tags when you post code.
Code:
// Remove any space
[ code]<-Yourcodehere->[ /code]
#5

[eluser]dmol[/eluser]
Thank you Toopay.
My question is about the complexity of updating multiple tables, and about finding a best-practice way of doing it.

In my code I compare the tables/fields returned by a query against the new, updated data from a form. This involves iterating through one array and comparing its values with another array, which feels like a labor-intensive and error-prone approach.

I was just wondering whether, for users out there whose code interacts with large, complex databases, this is the most efficient way of doing database updates.

Thanks
Dave
#6

[eluser]toopay[/eluser]
If your objective is just to do a "normal" update, then I'd say you are doing "too much". It consumes more memory on your server and gives poor performance, since you run nested queries every time you do the update. Validation rules are, I think, a more appropriate way of filtering/inspecting the data before you put it into your database. You can use a "callback" function within the validation rules and specify all the rules the input (from the form) should match, including patterns (using regular expression functions).
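The validate-before-update idea above can be shown framework-free. This is a minimal Python sketch of the pattern, not CodeIgniter's actual form_validation API; the rule table and `validate` helper are invented for illustration:

```python
import re

# Each rule is a callable returning True if the value is acceptable;
# the regex rule covers the "pattern" suggestion above.
rules = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "age":   lambda v: v.isdigit() and 0 < int(v) < 130,
}

def validate(form):
    """Return {field: 'invalid'} for every rule that fails.

    An empty result means the form data is safe to write to the database.
    """
    return {f: "invalid" for f, check in rules.items()
            if f in form and not check(form[f])}

errors = validate({"email": "dave@example.com", "age": "200"})
```

Only data that passes every rule ever reaches the update step, so the database code itself stays simple.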

You may need the above process if you want to save every update, with its details, into some "tracking" table. But I can think of several ways to avoid it, like generating a "log" file instead. If performance is a key part of your application, think twice before doing the above. Every kilobyte of server memory is precious ;-)
#7

[eluser]dmol[/eluser]
Thanks Toopay,
It's a small, low-use application, so I suppose maintainability is the key requirement.
There's not that much DB I/O going on: one query followed by an update per table.
I just want to write clean, reusable and maintainable code, where changes only need to be made sparingly and not throughout the application every time I change a table structure. I'll continue with the above approach.
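One way to keep table-structure changes from rippling through the code is to discover each table's columns at runtime, which is what CodeIgniter's `$this->db->list_fields($table)` does. A self-contained Python/SQLite sketch of that idea (the `users` table and helper names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Dave', 'dave@example.com')")

def list_fields(conn, table):
    """Discover a table's columns at runtime; row[1] of PRAGMA
    table_info is the column name."""
    return [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]

def update_field(conn, table, key, field, value):
    """Generic single-field update driven by the discovered schema."""
    # Identifiers can't be bound as ? parameters, so the field name is
    # checked against the discovered column list before interpolation.
    if field not in list_fields(conn, table):
        raise ValueError(f"unknown field {field!r} in {table}")
    conn.execute(f"UPDATE {table} SET {field} = ? WHERE id = ?", (value, key))

update_field(conn, "users", 1, "email", "dave@new.example.com")
row = conn.execute("SELECT email FROM users WHERE id = 1").fetchone()
```

Adding a column to the table then needs no code change: the next `list_fields()` call picks it up automatically.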

Thanks for the reply. I was just hoping I was missing something obvious.
Dave




Theme © iAndrew 2016 - Forum software by © MyBB