Database Timeout
#1

[eluser]JHackamack[/eluser]
Recently I have been querying large amounts of data (over 4,000 rows) and the database library kept timing out. By limiting the SQL query I found the cutoff seemed to be around 2,000 rows, but I recently made a change that works around the limit:
Code:
while ($row = $this->_fetch_object())
        {
            set_time_limit(0); // added: lifts the execution time limit while rows are being fetched
            $this->result_object[] = $row;
        }

On line 76 of DB_Result.php I added the set_time_limit(0) call, which removes the script's execution time limit, and it works flawlessly. No more unnecessary hacks dealing with the script dying.
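For what it's worth, the same effect should be possible without editing the core library by calling set_time_limit(0) in the controller right before the heavy query runs. A rough sketch (the table name is just an example):
Code:
// in the controller, before running the big query
set_time_limit(0); // lifts the execution time limit for the rest of the request (no effect with safe_mode on)

$query = $this->db->query('SELECT * FROM search_results'); // example table name
foreach ($query->result() as $row)
{
    // process $row
}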
#2

[eluser]brianw1975[/eluser]
Note: this workaround will only work if PHP is not running with safe_mode on

A better solution would be to find the real reason behind the db disconnect -- or use a better query.
#3

[eluser]JHackamack[/eluser]
I agree that this isn't the best solution. I usually paginate search results, which doesn't hit this limit, but I also have a download link that lets a person download an Excel file (using the Excel plugin I found here) of all the search results. With this temporary fix I've been able to increase the limit to 30,000 results at a time. And you know, 30,000 rows ought to be enough for anybody. I don't know if this is a bug in Active Record or a feature request. Any ideas on what might be causing this?
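For reference, the pagination path that avoids the limit looks roughly like this (table name, page size, and URI segment are just examples):
Code:
// paginated search results: only $per_page rows are fetched per request
$per_page = 50;
$offset   = (int) $this->uri->segment(3, 0); // current offset taken from the URL

$this->db->limit($per_page, $offset);
$query = $this->db->get('search_results'); // example table

$data['rows'] = $query->result();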
#4

[eluser]brianw1975[/eluser]
Is the database on localhost (relative to the web server, that is) or on a "remote" dedicated db server machine? Is it MySQL or MSSQL Server, or something else?
#5

[eluser]JHackamack[/eluser]
It is a dedicated remote server running MySQL.
#6

[eluser]brianw1975[/eluser]
Hrm... interesting... it runs if you set the time limit to 0 and fails if you don't. It almost sounded like a memory issue (on the remote server), then I was thinking a security timeout...

But now, after doing some reading, how about this: turn on log_slow_queries (in the my.ini config file) and make sure your query is actually optimized. Also, make sure you have indexes on the appropriate columns. In my newbie days I didn't understand the benefits of indexes; I had a page that took upwards of 5 minutes to load, and when I turned on log_slow_queries I found that a table of over 500,000 rows was being scanned for every table row that was being displayed on the website.
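A quick way to check the query plan from CodeIgniter itself looks something like this (table and column names are just examples):
Code:
// run EXPLAIN on the slow query and inspect the plan
$query = $this->db->query('EXPLAIN SELECT * FROM results WHERE user_id = 42');
print_r($query->result_array()); // "key" should name an index and "rows" should stay small

// if no index is being used, add one, e.g.:
// $this->db->query('ALTER TABLE results ADD INDEX idx_user_id (user_id)');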
#7

[eluser]JHackamack[/eluser]
I'm coming across this issue again and the set_time_limit doesn't help anymore.
Code:
CI_DB_mysql_result Object
(
    [conn_id] => Resource id #113
    [result_id] => Resource id #165
    [result_array] => Array
        (
        )

    [result_object] => Array
        (
I added a print_r in mysql_result.php on line 162, and this is what I got while it was in its loop. It stops right in the middle of result_object and the view comes back null (nothing is rendered). There are 19,100 rows of data that need to be displayed, but somehow they aren't getting displayed, and when I remove the foreach loop over $data->result() the code works just fine. Any ideas?
#8

[eluser]brianw1975[/eluser]
Ouch... but I have run into a similar problem as well. I bet the server is running out of memory. In order to fix the issue you will be looking at processing the data in chunks: process some data, dump it to the user, process more data, dump it to the user, rinse and repeat.

You may replace "dump it to the user" with "dump it to an xml/txt/xls/csv file, then send the file to the user".
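A rough sketch of that chunked approach, assuming MySQL and made-up table name and chunk size:
Code:
// stream the export in chunks instead of buffering all 19,000+ rows at once
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$fp     = fopen('php://output', 'w');
$chunk  = 1000;   // example chunk size
$offset = 0;

do
{
    $query = $this->db->query("SELECT * FROM search_results LIMIT $offset, $chunk"); // example table
    foreach ($query->result_array() as $row)
    {
        fputcsv($fp, $row);
    }
    $rows = $query->num_rows();
    $query->free_result(); // release this result set before fetching the next chunk
    $offset += $chunk;
}
while ($rows == $chunk);

fclose($fp);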



