How do log files affect performance?
#1

[eluser]Clooner[/eluser]
On my production site I run logging at level 1, which means that PHP errors and CI errors are logged.
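(For reference, that threshold lives in application/config/config.php; as far as I know 1 means errors only, but check the comments in your own config for the exact levels.)
Code:
// application/config/config.php
// 0 = disable logging, 1 = error messages only, higher values add debug/info output
$config['log_threshold'] = 1;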

A service that my site relies on went down. This didn't cause any problems on my site, since I could simply serve a cached version of the data. However, I noticed the log files grew to 200 megabytes a day. Judging from some Pingdom reports, site speed seemed to be affected by this, but I am not 100% sure.

I use the log files to keep track of problems, and as soon as errors start popping up regularly I address them, so I'm hesitant to disable error logging since it is such a helpful tool. But if logging affects speed dramatically (10-20% is my best guess), I will need a way to deal with it.

Any thoughts on how much log files affect the speed of a CI app, and if they do, what are the best practices for dealing with it?
#2

[eluser]Unknown[/eluser]
Logging is really a tool for the development environment. I have also noticed that the application's response time increases when logging is enabled, which is why I always leave it activated only in development.
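One rough way to set that up, assuming a CI 2.x install where the ENVIRONMENT constant is set in index.php (the per-environment config folders are optional, so create them if they don't exist yet):
Code:
// application/config/production/config.php
$config['log_threshold'] = 0;   // no logging in production

// application/config/development/config.php
$config['log_threshold'] = 1;   // errors only while developing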
#3

[eluser]Aken[/eluser]
200mb is a ton for log files, at least in my experience (I'm not working with millions of users yet). What sort of things are being logged? If they're repeat logs of what's already in your Apache / PHP error logs, then you're probably just being redundant.
#4

[eluser]Clooner[/eluser]
[quote author="Aken" date="1363553065"]200mb is a ton for log files, at least in my experience (I'm not working with millions of users yet). What sort of things are being logged? If they're repeat logs of what's already in your Apache / PHP error logs, then you're probably just being redundant.[/quote]
You make a good point. I can simply write a script to remove duplicate entries.
#5

[eluser]Clooner[/eluser]
I wrote this simple function to clean the log file of duplicate entries, which dramatically reduces its size. An added benefit is that it also shows which errors happen most often. I run it as a cron job once a day.
Code:
function clean_yesterdays_log()
{
    $filename = setting('log_storage').'log-'.date("Y-m-d", time() - 60 * 60 * 24).'.php';
    if (file_exists($filename))
    {
        $handle = fopen($filename, 'r');
        $log = array();
        $count = array();
        // The first line is CI's "no direct access" guard, the second is blank
        // (or "minimized" once this function has already run). Keep both for later.
        $protect = fgets($handle, 4096);
        $space = fgets($handle, 4096);
        if (trim($space) != 'minimized')
        {
            // Loop over the file, strip the "LEVEL - date --> " prefix (32 chars)
            // and count how often each unique message occurs.
            while (($buffer = fgets($handle, 4096)) !== false)
            {
                $entry = substr($buffer, 32);
                if ( ! in_array($entry, $log))
                {
                    $log[] = $entry;
                }
                $key = array_search($entry, $log);
                if (array_key_exists($key, $count))
                {
                    $count[$key] += 1;
                }
                else
                {
                    $count[$key] = 1;
                }
            }
            fclose($handle);

            // Prefix each unique message with its occurrence count.
            $logfile = array();
            foreach ($log as $key => $entry)
            {
                $logfile[] = str_pad($count[$key], 5).' --> '.trim($entry);
            }

            // Sort so the most frequent errors end up at the top, then overwrite
            // the original file. array_merge() is used instead of the + operator,
            // which would silently drop entries whose numeric keys collide.
            natsort($logfile);
            $logfile = array_reverse($logfile);
            file_put_contents($filename, implode("\r\n", array_merge(array(trim($protect), 'minimized'), $logfile)));
        }
        else
        {
            fclose($handle);
        }
    }
}
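In case it helps anyone, a rough sketch of how it could be wired up for cron. The Cron controller name is just an example, and it assumes clean_yesterdays_log() is available through an autoloaded helper:
Code:
// application/controllers/cron.php -- controller name is just an example
class Cron extends CI_Controller {

    public function clean_log()
    {
        // refuse to run over HTTP; cron calls this via the CLI
        if ( ! $this->input->is_cli_request())
        {
            show_404();
        }
        clean_yesterdays_log();
    }
}
Then a daily crontab entry along the lines of "php /path/to/index.php cron clean_log" runs it.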
#6

[eluser]jmadsen[/eluser]
You're getting 200MB per day of level 1 error messages, and you want to write a script to reduce duplicates? Sounds like you need to fix some code...
#7

[eluser]CroNiX[/eluser]
I only turn logging on when troubleshooting a problem. There's no reason to log all of that and slow the application down unless something is actually wrong; then turn logging on to find it. But more often than not, I don't need those logs to know there is a problem or where it is.



