Parallel execution of helper function
#1

Hi,
My code is run by a cron job every five minutes. Among other things, the code calls helper functions that are used for caching content. Three helper functions are called in a loop around 50-100 times, depending on how much content my code needs to cache.

Right now I just run a foreach loop, and on every pass I call a helper function that starts processing data. But much of the time I get a Gateway Timeout, as the total time for executing everything is rather long.

What I would like to do, if it is possible, is call each helper function as a separate process, so that if one of them times out, only that specific case is affected.

Currently, if it times out on the 45th of 100 passes, nothing is processed after the timeout.
#2

From your description, I assume your cron job calls the helper functions over HTTP.

I do not know if your environment permits calling your CI instance directly via the CLI, but if so, I recommend doing that; it will definitely solve your timeout problem. Depending on your setup, you will likely need to update your router configuration and/or controller method.
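As a rough sketch of what that could look like in CodeIgniter 3 (CI4 uses `php spark` commands instead), the controller can refuse HTTP requests and the crontab can invoke `index.php` directly. All names here (`Cache_refresh`, `run`, the helper and its functions, the paths) are assumptions for illustration, not the poster's actual code:

```php
<?php
// Hypothetical CI3 controller: application/controllers/Cache_refresh.php
class Cache_refresh extends CI_Controller
{
    public function run()
    {
        // Restrict this long-running endpoint to CLI invocations,
        // so a stray browser hit cannot trigger it over HTTP.
        if (!$this->input->is_cli_request()) {
            show_error('This job can only be run from the CLI.', 403);
            return;
        }

        $this->load->helper('cache');          // hypothetical helper file
        foreach ($this->get_content_ids() as $id) {
            cache_content($id);                // hypothetical helper function
        }
    }
}

// Crontab entry, every five minutes (path is an assumption):
// */5 * * * * php /var/www/app/index.php cache_refresh run
```

Run this way, the request never passes through the web server or any proxy, so no gateway timeout applies.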
#3

(08-31-2020, 06:20 AM)bivanbi Wrote: From your description, I assume your cron job calls the helper functions over HTTP.

I do not know if your environment permits calling your CI instance directly via the CLI, but if so, I recommend doing that; it will definitely solve your timeout problem. Depending on your setup, you will likely need to update your router configuration and/or controller method.

Well, you are probably right: my cron job does indeed use the CLI, and it finishes everything without a timeout, but when I call the same function over HTTP I get a timeout. Hmm, so is that the reason for the timeout?

This is the thing that was confusing me all day. When the cron job runs the script, everything seems fine and I get results where I expect them, but when I run the same script in my browser, 9 times out of 10 I get a timeout.

Sorry for the noob question, but I am not really familiar with what the CLI does differently than HTTP.
#4

Under the CLI, the script execution time limit is practically unlimited (PHP's max_execution_time defaults to 0 there). If you want the same behavior in your browser, add set_time_limit(0); at the beginning of your cron script.
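A minimal sketch of the idea, placed at the top of the entry point before the loop starts (ignore_user_abort is an optional extra, not something the poster mentioned):

```php
<?php
// When invoked over HTTP, lift PHP's script execution time limit
// for this request. The PHP CLI already runs with no limit
// (max_execution_time = 0), which is why the cron job never times out.
set_time_limit(0);          // 0 = no time limit for this script

// Optional: keep processing even if the browser disconnects mid-run.
ignore_user_abort(true);

// ... long-running cache refresh loop follows ...
```

Note that this only removes PHP's own limit; timeouts enforced by the web server or a proxy in front of it are a separate matter, as the next post points out.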
#5

Depending on the hosting environment, there might be enforced time limits beyond your control, for instance if there is a web application firewall or proxy between the browser and the PHP script. If that is the case, and you really have to call that long-running process via HTTP for whatever reason, and it still times out, you may need to run the job as a separate process in the background. While I have no personal experience with PHP launching background (detached) processes, this Stack Overflow question might give you a jump-start: https://stackoverflow.com/questions/4595...nd-process
My advice here is to implement some sort of locking mechanism on the background job, to prevent overloading the server with an arbitrary number of background processes launched by someone repeatedly hitting the reload button in the browser.
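One simple way to get such a lock is flock() on a dedicated lock file, so a second invocation bails out immediately while one is still running. The lock file path is an assumption; this is a sketch of the locking idea, not the poster's setup:

```php
<?php
// Minimal flock()-based single-instance guard.
// 'c' opens the file for writing, creating it if it does not exist,
// without truncating it.
$lockFile = fopen('/tmp/cache_refresh.lock', 'c');  // path is an assumption

// LOCK_EX | LOCK_NB: try to take an exclusive lock without blocking.
if ($lockFile === false || !flock($lockFile, LOCK_EX | LOCK_NB)) {
    // Another instance already holds the lock; exit instead of piling up.
    exit('Cache refresh job is already running.' . PHP_EOL);
}

try {
    // ... long-running cache refresh work here ...
} finally {
    // Release the lock so the next scheduled run can start.
    flock($lockFile, LOCK_UN);
    fclose($lockFile);
}
```

The lock is also released automatically if the process dies, so a crashed run does not block future runs the way a stale "lock" row in a database would.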



