'threaded' CI model for batch processing files?

I have a large recursive script that I have inherited.

Basically, it
1. reads exif data from a lot of large images
2. copies them into directories based on date and renames them
3. runs a shell_exec that creates video files from each directory
4. runs another shell_exec that resizes all of the copied images (it was using image_lib at first, but that was too much for it).
5. migrates processed source images into a backup directory.
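For context, the five steps above might look roughly like this. This is only a sketch — the actual commands the inherited script shells out to are unknown, so `ffmpeg` and ImageMagick's `mogrify` here are assumed stand-ins, and all paths are made up:

```php
<?php
// Stages 1-2: read EXIF dates, copy into per-date directories.
foreach (glob('/path/to/incoming/*.jpg') as $img) {
    $exif = exif_read_data($img);
    $date = date('Y-m-d', strtotime($exif['DateTimeOriginal'] ?? 'now'));

    $dir = '/path/to/sorted/' . $date;
    if (!is_dir($dir)) { mkdir($dir, 0755, true); }
    copy($img, $dir . '/' . $date . '_' . basename($img));
}

// Stages 3-4: one video per directory, then resize the copies.
foreach (glob('/path/to/sorted/*', GLOB_ONLYDIR) as $dir) {
    // Hypothetical ffmpeg invocation (stage 3).
    shell_exec('ffmpeg -pattern_type glob -i '
        . escapeshellarg($dir . '/*.jpg') . ' '
        . escapeshellarg($dir . '/slideshow.mp4'));

    // Hypothetical ImageMagick invocation (stage 4).
    shell_exec('mogrify -resize 1600x1600 ' . escapeshellarg($dir) . '/*.jpg');
}
// Stage 5 (move originals to backup) omitted for brevity.
```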

So I'm testing this on a run of 2500 images.
The script works fine through stages 1 and 2 but doesn't like 4 and 5 - no surprises there.

If I run the entire process on a single directory it works, so I'm about to rewrite the whole thing to process one directory at a time and run the script as a cron job.

I figured the simplest way would be to save an array of tasks into my database and then delete each directory from the array after it's been processed.
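That queue could be sketched as a small CI model like the one below — table and column names (`tasks`, `directory`, `processed_at`) are invented for the example, and it marks rows done rather than deleting them, which keeps a processing history:

```php
<?php
// CodeIgniter-style model sketch; schema names are assumptions.
class Task_model extends CI_Model
{
    // Claim the next unprocessed directory for this cron run.
    public function claim_next()
    {
        $row = $this->db->where('processed_at IS NULL', null, false)
                        ->limit(1)
                        ->get('tasks')
                        ->row();
        return $row ? $row->directory : null;
    }

    // Mark a directory as finished once the run completes.
    public function mark_done($directory)
    {
        $this->db->where('directory', $directory)
                 ->update('tasks', array('processed_at' => date('Y-m-d H:i:s')));
    }
}
```

Each cron invocation then just calls `claim_next()`, processes that one directory, and calls `mark_done()` — if the run dies, the row stays unclaimed and is retried next time.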

Just wondering if there's a different/better way I can do this within CI?



Any time I have ever had to run multiple processes at the same time, I have always used one parent PHP script to launch (using shell_exec) multiple child processes.
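For what it's worth, the parent/child pattern usually looks something like this (script path and log path are hypothetical) — without the output redirect and trailing `&`, shell_exec would block until each child finishes:

```php
<?php
// Parent script: launch one detached child per directory.
foreach (glob('/path/to/sorted/*', GLOB_ONLYDIR) as $dir) {
    // Redirect output and background the child so the parent
    // returns immediately instead of waiting on it.
    $cmd = 'php /path/to/process_dir.php ' . escapeshellarg($dir)
         . ' >> /var/log/imgbatch.log 2>&1 &';
    shell_exec($cmd);
}
```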

My issue is the length of time I can run one script for.
Thankfully I have a dedicated server and can extend the script execution time, but it still stops after a while, particularly when resizing large JPEGs - a resource hog on any OS.
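One thing worth checking: when PHP runs from the CLI (which a cron job would be), max_execution_time defaults to 0 (unlimited), so the hard stop may actually be the memory limit rather than the time limit. Both can be raised at the top of the script — the 512M figure below is a guess, not a recommendation:

```php
<?php
// Explicitly remove the time limit (already 0 by default on CLI).
set_time_limit(0);

// Raise the memory ceiling; decoding a JPEG needs roughly
// width * height * 4 bytes in RAM, far more than the file size,
// so size this to your largest images.
ini_set('memory_limit', '512M');
```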

Perhaps using pipes with proc_open? Or is that more or less the same as shell_exec?

Perhaps the underlying issue is actually memory usage? I'm not positive, but something may be causing a memory leak... if the calls you are making let you destroy variables after use, that might help.
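On the leak theory: if GD (image_lib's default driver) is in play anywhere, each image resource has to be freed explicitly with imagedestroy() — PHP won't reclaim the pixel buffer just because the loop moves on. A sketch of the pattern, with paths invented:

```php
<?php
foreach (glob('/path/to/sorted/*.jpg') as $file) {
    $img = imagecreatefromjpeg($file);
    // ... resize / process the image here ...
    imagedestroy($img);  // release the pixel buffer immediately
    unset($img);

    // Optional: log usage per iteration to confirm whether
    // memory is actually creeping upward across the loop.
    // echo memory_get_usage(true), PHP_EOL;
}
```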

I have never used proc_open before.
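For the record, the practical difference from shell_exec is that proc_open hands you the child's stdin/stdout/stderr as streams and gives you the exit code, so a parent can supervise several children. A minimal sketch (the child script path is hypothetical):

```php
<?php
$descriptors = array(
    0 => array('pipe', 'r'),  // child's stdin (we write to it)
    1 => array('pipe', 'w'),  // child's stdout (we read from it)
    2 => array('pipe', 'w'),  // child's stderr
);

$proc = proc_open('php /path/to/process_dir.php', $descriptors, $pipes);
if (is_resource($proc)) {
    fclose($pipes[0]);                       // nothing to send
    $out  = stream_get_contents($pipes[1]);  // capture output
    $err  = stream_get_contents($pipes[2]);  // capture errors
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exit = proc_close($proc);               // child's exit code
}
```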
