Update about 1 million rows in a MySQL table every hour
Hello! I use CodeIgniter 3.1.11 and have a question. I need to update about 1 million rows (in the future it will be much more) in a MySQL table every hour via cron. The problem is that if I update more than about 200-300 rows using this code, my server CPU hits 100% and the table stops updating after about 200-300 rows. I even have to restart PHP on the server to get it back to normal. What am I doing wrong? Please suggest how to do this task correctly, so that the database query executes quickly and there is no heavy load on the server.
This is the code from the controller:

PHP Code:
function cron_rows_update() {

This is the code from the model:

PHP Code:
function rows_update() {
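The posted code bodies were cut off, but based on the description (a foreach loop issuing one UPDATE per row), the model likely resembles the sketch below. The table name "games", the column names, and the flat increment are assumptions drawn from variable names mentioned later in this thread, not the poster's actual code.

PHP Code:
function rows_update()
{
    // Assumed table name "games"; the original SELECT is unknown.
    $games = $this->db->get('games')->result();

    foreach ($games as $game) {
        // Per-row increment; the real formula from the original post is
        // unknown, so a flat value stands in for it here.
        $amount = 0.5;

        // One UPDATE per row -- roughly a million queries per cron run,
        // which is what drives the CPU to 100%.
        $this->db->where('id', $game->id);
        $this->db->update('games', array(
            'game_available_balance' => $game->game_available_balance + $amount,
        ));
    }
}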
By the looks of it, you are executing (or at least want to execute) one million + 1 queries every hour. That's a lot. Depending on what you want to do with your data, you need to rethink your structure:
1. Fetch/calculate only when needed
2. Run all calculations in SQL

(07-21-2020, 06:59 AM)jreklund Wrote: By the looks of it, you are executing (or at least want to execute) one million + 1 queries every hour. That's a lot. Depending on what you want to do with your data, you need to rethink your structure.

Thanks for the answer. If I run all calculations in SQL, for example:

PHP Code:
$this->db->set("game_available_balance", "game_available_balance + " . $game_balance_in_hour_amount, FALSE);

will this SQL query be faster than making the calculations in PHP? And another question: if I use a batch update, is it faster than a foreach loop? If yes, can you write an example of a batch update for my case?
By running calculations in SQL, I mean you need to skip your foreach completely and do all the math in the SQL itself, only passing $currency_numbers_after_dot, $game_currency_percentage_max and $game_currency_speed_in_hour to the query and letting it do everything for you.
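As a sketch of that idea (table and column names assumed, and the increment/rounding is a stand-in for the real formula): a single UPDATE lets MySQL apply the calculation to every row, so PHP issues one query instead of roughly a million.

PHP Code:
function rows_update()
{
    // Assumed values; in the real code these presumably come from config.
    $game_currency_speed_in_hour = 0.5;
    $currency_numbers_after_dot  = 8;

    // One server-side UPDATE touches every row; no PHP loop at all.
    $sql = 'UPDATE games
            SET game_available_balance = ROUND(game_available_balance + ?, ?)';

    $this->db->query($sql, array($game_currency_speed_in_hour, $currency_numbers_after_dot));
}

Whatever cap $game_currency_percentage_max imposes can usually be folded into the same statement, e.g. with LEAST().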
Batch updates are a third option; I didn't think of that one. You can find an example of a batch update in the user guide. It will be faster, as you get 10,000 queries instead of 1,000,000 (by default), but not as fast as running it as one query. https://codeigniter.com/userguide3/datab...ating-data

As you expect more than 1,000,000 users, you need to plan ahead, as it will only get slower every time.
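For reference, a batch update with CodeIgniter 3's update_batch() would look roughly like this (the table, key column, and increment are assumptions). It groups rows into multi-row queries, 100 rows per query by default, which is where the 10,000-versus-1,000,000 figure comes from.

PHP Code:
function rows_update_batch()
{
    // Assumed table/columns; the increment stands in for the real formula.
    $games = $this->db->get('games')->result();

    $batch = array();
    foreach ($games as $game) {
        $batch[] = array(
            'id'                     => $game->id,
            'game_available_balance' => $game->game_available_balance + 0.5,
        );
    }

    // update_batch() groups rows into multi-row queries (100 rows each
    // by default), so ~1,000,000 rows become ~10,000 queries.
    $this->db->update_batch('games', $batch, 'id');
}

Note that the calculation still runs in PHP here, and holding a million-element $batch array costs memory, so in practice you would flush the batch to update_batch() every few thousand rows and reset the array.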
(07-21-2020, 07:27 AM)olegrain Wrote:
(07-21-2020, 06:59 AM)jreklund Wrote: By the looks of it, you are executing (or at least want to execute) one million + 1 queries every hour. That's a lot. Depending on what you want to do with your data, you need to rethink your structure.

I'm not actually going to answer your question about the framework, but I just wanted to mention that foreach loops capture all of the data in memory. That would be a very bad thing with large data sets. I believe a while loop would save you from memory issues: a while loop can go row by row, resulting in lower memory usage.
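In CodeIgniter 3, that row-by-row style maps to unbuffered_row(), which hands back one row per call instead of materializing the whole result set as a PHP array the way result() does. Table and column names are again assumptions:

PHP Code:
function rows_update_row_by_row()
{
    $query = $this->db->query('SELECT id, game_available_balance FROM games');

    // unbuffered_row() fetches one row at a time, so PHP never builds
    // a million-element result array.
    while ($row = $query->unbuffered_row()) {
        $this->db->where('id', $row->id);
        $this->db->update('games', array(
            'game_available_balance' => $row->game_available_balance + 0.5, // increment assumed
        ));
    }
}

This addresses the memory problem only; it still issues one UPDATE per row, so the query count stays the same.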
@olegrain,
I used cron on a daily basis to download and convert three XLS spreadsheets into MySQL database tables. The task used to take over a minute to complete, and it also failed frequently because of 100% CPU load, sometimes necessitating a reboot.

After numerous attempts at refactoring, the same task now completes in just over ten seconds and runs every hour! The drastic time reduction was due to the frequent use of PHP's flush(). Try giving the flush command a whirl and see if it makes a difference.
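For completeness, flush() here is PHP's standard output flush dropped into the long-running job; a minimal sketch of the pattern described above, where import_chunk() is a hypothetical stand-in for the real download/convert work:

PHP Code:
foreach ($spreadsheets as $sheet) {
    import_chunk($sheet); // hypothetical: download and convert one spreadsheet

    // Flush PHP's output buffers between work items, as suggested above.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}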