Creating Queue and Worker in CodeIgniter 3

(This post was last modified: 07-10-2020, 09:39 AM by php_rocs.)

Hi, I'm trying to build a queue worker service in CodeIgniter.

I've searched and tried every library Google turned up, but no luck.

I also tried a library that uses Gearman, but it returned errors as well.

I tried this lib

but when I load the library, it returns an error along the lines of "Can't find the class".

I'm also trying to use

and it returns an error when I start it with "gearmand -d":

gearmand: Could not open log file "/usr/local/var/log/gearmand.log", from "/var/www/html/lists/services", switching to stderr. (No such file or directory)
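That error is not a CodeIgniter problem: the gearmand daemon simply can't write its default log file because the directory doesn't exist. A possible fix (the paths below are examples; adjust to your system):

```shell
# Either create the default log directory...
sudo mkdir -p /usr/local/var/log

# ...or point gearmand at a log file it can write to:
sudo gearmand -d --log-file=/var/log/gearmand.log
```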

My case is something like this:

1. I have one big database containing millions of rows; let's call it "Database A".
2. I need to grab data from "Database A", build a new list of data from it, and save it into a new database; let's call it "Database B".

Grabbing that much data (more than 1 million rows) can't be done with a single query; the main problem I hit is the memory limit, and I can't upgrade the server (nginx with 2 vCPUs and 4 GB of RAM). I've already set php.ini to the maximum memory and connection timeout.

So I need to build workers that grab the data slowly but consistently, maybe splitting the 1 million rows across 10 or 11 workers and saving them into the new database (it doesn't matter if the process takes minutes or hours).
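For what it's worth, the memory problem alone can be solved without any queue daemon by paging through the source table in fixed-size chunks instead of one huge query. A minimal sketch using PDO with keyset pagination (table and column names here are made up; swap the SQLite DSNs used for testing with your real Database A / Database B connections):

```php
<?php
// Copy rows from a source DB to a destination DB in fixed-size batches.
// Keyset pagination (WHERE id > :last) stays fast on millions of rows,
// unlike LIMIT/OFFSET which re-scans skipped rows on every page.
function copyInBatches(PDO $src, PDO $dest, int $batch = 10000): int
{
    $copied = 0;
    $lastId = 0;
    $select = $src->prepare(
        'SELECT id, payload FROM big_table WHERE id > :last ORDER BY id LIMIT :lim'
    );
    $insert = $dest->prepare(
        'INSERT INTO new_table (id, payload) VALUES (:id, :payload)'
    );

    while (true) {
        $select->bindValue(':last', $lastId, PDO::PARAM_INT);
        $select->bindValue(':lim', $batch, PDO::PARAM_INT);
        $select->execute();
        $rows = $select->fetchAll(PDO::FETCH_ASSOC);
        if (!$rows) {
            break; // nothing left to copy
        }

        // One transaction per batch keeps inserts fast and memory bounded.
        $dest->beginTransaction();
        foreach ($rows as $row) {
            $insert->execute($row);
        }
        $dest->commit();

        $lastId = (int) end($rows)['id'];
        $copied += count($rows);
    }
    return $copied;
}
```

Only one batch of rows is ever held in memory at a time, so the batch size, not the table size, determines the peak memory use.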

That's why I'm trying to find a library or service I can use with CodeIgniter.

Please help me solve this problem.

Any help is appreciated. Thanks in advance.

You can use a cron job.
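For example, a crontab entry that runs a CodeIgniter 3 CLI controller every five minutes, letting each run process one batch (the controller name, method, and paths below are placeholders):

```shell
# crontab -e
# Every 5 minutes, run a CI3 CLI controller that processes one batch of rows.
*/5 * * * * /usr/bin/php /var/www/html/lists/index.php worker process_batch >> /var/log/worker.log 2>&1
```

CI3 routes CLI calls as `php index.php <controller> <method>`, so no web request (and no web-request timeout) is involved.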

Learning CI4 from my works, from errors and how to fix bugs in the community

Love CI & Thanks CI Teams


Gearman is a service that needs to be installed separately.

Is there a reason you want to do this with CodeIgniter? If I understand you correctly, you just need to execute a SQL statement (get a batch of data from your database) and insert the result into another database.

I would recommend looking at https://www.php.net/manual/en/function.pcntl-fork.php

You could write a single PHP file; the process flow would be the following:
- connect to DB A and find out the number of rows to fetch
- calculate the batch size and the number of workers needed
- use pcntl_fork() to fork the script, where each fork executes a single batch (i.e. fetches 100k rows) and then inserts it into DB B

Assuming you do not need reporting on and management of the forks, this is simple.
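The flow above can be sketched as follows (the row count, worker count, and batch logic are placeholders; requires the pcntl extension, CLI only):

```php
<?php
// Split the total row count into batches and fork one child per batch.
$totalRows = 1000000;  // e.g. from: SELECT COUNT(*) FROM big_table
$workers   = 10;
$batchSize = (int) ceil($totalRows / $workers);

$pids = [];
for ($i = 0; $i < $workers; $i++) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {
        // Child process: each child must open its OWN database connections
        // here; connections inherited across fork() cannot be shared safely.
        $offset = $i * $batchSize;
        // Fetch rows LIMIT $batchSize OFFSET $offset from DB A,
        // insert them into DB B, then exit.
        exit(0);
    }
    $pids[] = $pid; // parent keeps forking
}

// Parent: wait for every child to finish.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
echo "all workers done\n";
```

Note that each child must open fresh database connections after the fork; sharing a connection handle between forked processes leads to corrupted protocol state.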

There are also simple ways to build a worker/manager/broker with ZeroMQ, with ready-to-use examples in PHP: http://zguide.zeromq.org/php:chapter2#Sh...ER-sockets
