Store hundreds of tasks per day per user in custom log sub-system
#1

[eluser]CARP[/eluser]
Hi

I've been instructed to develop a custom log system for the CI app I'm finishing.

This log system consists of:
- 1 table with the fields id INT(20) | task_datetime | taskdesc | fk_user | value | comment
- 1 library with a "logtask()" function that writes to the table through the model_log model
- The library is autoloaded
- A one-line call to the library's do_log() function in every function of every controller

So, my app has 12 controllers with 5 functions each on average (mostly CRUD), and 6 users.
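For reference, a minimal sketch of what a library like that could look like in CI 2.x. The class name Tasklog, the model method insert_task(), and the session key user_id are my assumptions here, not taken from the actual code:

```php
<?php defined('BASEPATH') OR exit('No direct script access allowed');

class Tasklog {

    protected $CI;

    public function __construct()
    {
        // Grab the CI super-object so the library can use loaded resources
        $this->CI =& get_instance();
        $this->CI->load->model('model_log');
    }

    // One-line call from any controller method, e.g.:
    // $this->tasklog->do_log('user/edit', 42, 'updated profile');
    public function do_log($taskdesc, $value = NULL, $comment = NULL)
    {
        $this->CI->model_log->insert_task(array(
            'task_datetime' => date('Y-m-d H:i:s'),
            'taskdesc'      => $taskdesc,
            'fk_user'       => $this->CI->session->userdata('user_id'),
            'value'         => $value,
            'comment'       => $comment,
        ));
    }
}
```

The model's insert_task() would just be a $this->db->insert() on the log table, so each logged task costs a single INSERT.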

What would be the best approach so I can store hundreds of tasks every day, over a month? I'd do a manual cleanup of this table each month.

I've also thought about combining the existing MySQL models with a separate file-based database, only for this logging (SQLite?). Could this be done in CodeIgniter?

Thanks in advance,
#2

[eluser]CroNiX[/eluser]
Yes, CI has an SQLite driver included. But why not just continue to use MySQL, since you are already using it? As long as the table is properly indexed, millions of records shouldn't be an issue.
#3

[eluser]CARP[/eluser]
[quote author="CroNiX" date="1408116958"]Yes, CI has a SQLite driver included. But why not just continue to use MySQL since you are already using it? As long as it's properly indexed, millions of records shouldn't be an issue.[/quote]

That's the idea, but I'm not used to "big data" and would like to learn, and be sure, that I'll be able to store a couple of million records (accumulated over a year, e.g.) in this "log" table without affecting the whole app's performance.

What changes would you make, and which indexes would you create, on this table?

Thanks again
#4

[eluser]CroNiX[/eluser]
I'd read up on some articles about MySQL indexing. Indexing can tremendously speed up queries as your tables grow large, and it is often the most overlooked part of database design.

Generally, you want to index columns that you:
- ORDER BY (ORDER BY first_name) (first_name should be indexed)
- JOIN ON (JOIN assignments ON assignments.user_id = employees.id) (assignments.user_id should be indexed; employees.id is probably already a primary key and therefore already indexed, but if not, it should be)
- GROUP BY (GROUP BY first_name) (first_name should be indexed)
- use in a WHERE clause (WHERE first_name = 'someone') (first_name should be indexed)
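Applied to the log table described in the first post, that advice suggests something like the following. Column names come from the original post; the table name task_log and the index names are made up for the example:

```sql
-- fk_user: used in WHERE / JOIN when filtering tasks per user
ALTER TABLE task_log ADD INDEX idx_fk_user (fk_user);

-- task_datetime: used in date-range WHEREs, ORDER BY, and the monthly cleanup
ALTER TABLE task_log ADD INDEX idx_task_datetime (task_datetime);

-- a composite index covers queries that filter by user AND a date range
ALTER TABLE task_log ADD INDEX idx_user_datetime (fk_user, task_datetime);
```

Keep in mind every index slightly slows down INSERTs, so only add the ones your actual queries use.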

If you don't use indexes, MySQL has to scan ALL of the rows you are querying. With indexes, it knows "where" to look.

Think about how long it would take you to manually search a phone book for a specific phone number vs. searching for someone's name (names are in alphabetical order, so you know where to start looking). That's the difference between non-indexed and indexed.
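You can see this difference yourself with EXPLAIN, which reports whether a query would use an index (the key column) or scan the whole table (type: ALL). Using the employees example from above:

```sql
-- Without an index on first_name, the plan shows type = ALL (full table scan);
-- after adding one, the key column shows the index MySQL chose.
EXPLAIN SELECT * FROM employees WHERE first_name = 'someone';
```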
#5

[eluser]jonez[/eluser]
If you anticipate this table growing to over 1M records, you could create a second copy of the table and use a cron job to move old data into the archive. This keeps queries on current data fast while still allowing historical reporting (users understand those reports taking longer).
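A cron-driven archive step along those lines could look like this; the table names task_log and task_log_archive are assumptions (the archive table would be created as an identical copy of the live one):

```sql
-- Move rows older than one month into the archive table,
-- then delete them from the live table.
INSERT INTO task_log_archive
SELECT * FROM task_log
WHERE task_datetime < DATE_SUB(NOW(), INTERVAL 1 MONTH);

DELETE FROM task_log
WHERE task_datetime < DATE_SUB(NOW(), INTERVAL 1 MONTH);
```

In practice you'd compute one fixed cutoff timestamp and use it in both statements (or wrap them in a transaction), so rows logged mid-run can't be deleted without having been archived.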
#6

[eluser]CARP[/eluser]
thanks guys for the help



