Multiple function call
#1

Hello,

I have a situation: I need to import a large amount of data from a JSON file. Loading, parsing, and preparing the JSON data for the function is fine; there is no timeout problem in that part.

But when I run the function itself, no matter how many records the JSON data has, I always hit the execution-timeout error. Is there a way to split the JSON data into chunks (for example, 1000 records each) and call the same function several times, so that each call starts fresh with its own time limit?

Thanks in advance for the assistance...
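A minimal sketch of the chunked approach described above, assuming the JSON is already decoded into a PHP array; `process_chunk()` is a hypothetical stand-in for the real import function:

```php
<?php
// Hypothetical stand-in for the real import logic: just count the rows.
function process_chunk(array $chunk): int
{
    return count($chunk);
}

// In practice this would be file_get_contents('products.json').
$records  = json_decode('[{"id":1},{"id":2},{"id":3},{"id":4},{"id":5}]', true);
$imported = 0;

// Split the records into fixed-size chunks and process each one,
// resetting the execution-time budget before every chunk.
foreach (array_chunk($records, 2) as $chunk) {
    set_time_limit(30);                 // restart the 30-second limit
    $imported += process_chunk($chunk);
}

echo $imported, "\n"; // prints 5
```

Note that `set_time_limit()` resets the timer within a single request; splitting across separate HTTP requests (or a CLI cron job) gives each call a truly fresh limit.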
#2

@msdcorporation,

A couple of questions... What is your timeout value? How much data, on average, is being pulled? I have a feeling something is wrong, because I'm able to do a JSON call of 40K+ records in seconds. What versions of PHP, CI, and MySQL are you running?
#3

(This post was last modified: 11-14-2018, 11:07 AM by msdcorporation.)

(11-14-2018, 10:53 AM)php_rocs Wrote: @msdcorporation,

A couple of questions... What is your timeout value? How much data, on average, is being pulled? I have a feeling something is wrong, because I'm able to do a JSON call of 40K+ records in seconds. What versions of PHP, CI, and MySQL are you running?

I have a JSON file of 20,000 products, and I call a function which updates a few tables in the database. That function works with the WooCommerce tables, and I need to create images as one of the steps in the whole process. My PHP version is 7.1 on MariaDB 5.6, and I think the execution timeout is about 30 seconds.

When I call this function with all 20,000 items in the JSON data, the function runs for the full time but I end up with only about 1,500 items in the database. However, if I split the data into 20 files of 1,000 items and process the files one by one, I get all 20,000 items into the database.

I hope the whole scenario is a bit clearer now.

Thanks again for the advice and assistance...
#4

@msdcorporation,

How long does the step that creates the images take? Also, how fast is the query call? I suspect the bottleneck is the image creation.
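One way to answer that is to time each step separately. A small sketch with `microtime()`; the `timed()` helper and the `usleep()` stand-in workloads are hypothetical:

```php
<?php
// Hypothetical helper: run a step and print how long it took,
// so the image-creation step can be compared against the query.
function timed(string $label, callable $step): void
{
    $start = microtime(true);
    $step();
    printf("%s took %.3f s\n", $label, microtime(true) - $start);
}

// Stand-in workloads; replace with the real image/query code.
timed('image creation', function () { usleep(20000); });
timed('db query',       function () { usleep(5000);  });
```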
#5

@php_rocs

Without image creation I get 2,000 - 2,500 new items in the database; with images it is about 1,500 items. But if I call it with 1,000 items, then another 1,000, then another 1,000... everything works fine.

The process that updates/inserts data takes the data from the JSON, applies some business logic, prepares the data for inserting or updating, and then calls the replace or insert methods of the DB class.

Once again: 1,000 items at once works fine, and maybe 3,000 would work fine too, but I decided to pass 1,000 per step.

Thanks again for the assistance...
#6

If you are doing one INSERT per item, it would be much faster to do one INSERT for all items, i.e. "INSERT INTO table (field list) VALUES (item1 data), (item2 data), ..." and so on. Another solution I have used is to write the data to a file and then call LOAD DATA INFILE (in MySQL), which loads all the data into the DB in one go.
bill
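The multi-row INSERT suggested above can be sketched like this. The thread uses MySQL/MariaDB; an in-memory SQLite database is used here purely so the example is self-contained:

```php
<?php
// Sketch of a single multi-row INSERT instead of one INSERT per item.
// SQLite in-memory is used for illustration; the same SQL shape works
// on MySQL/MariaDB.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE products (sku TEXT, price REAL)');

$items = [
    ['sku' => 'A1', 'price' => 9.99],
    ['sku' => 'B2', 'price' => 4.50],
    ['sku' => 'C3', 'price' => 1.25],
];

// One placeholder group "(?, ?)" per item, joined into one statement.
$placeholders = implode(',', array_fill(0, count($items), '(?, ?)'));
$stmt = $pdo->prepare("INSERT INTO products (sku, price) VALUES $placeholders");

// Flatten the rows into one positional-parameter list.
$values = [];
foreach ($items as $item) {
    $values[] = $item['sku'];
    $values[] = $item['price'];
}
$stmt->execute($values);

echo $pdo->query('SELECT COUNT(*) FROM products')->fetchColumn(), "\n"; // prints 3
```

For very large batches, cap the number of rows per statement so the query stays under the server's `max_allowed_packet` limit.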
#7

I'm using the CodeIgniter Query Builder for data manipulation, and in some cases I use insert_batch(), but if you have a better idea, great.
In one case I need to do an insert, catch the last insert ID, and use it in another insert, so if there is a solution for that, post it here :-)
Thanks for the advice...
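For the insert-then-reuse-the-ID case: in CodeIgniter 3's Query Builder that is `$this->db->insert_id()` called right after `$this->db->insert()`. A framework-free sketch of the same pattern with PDO (SQLite in-memory, so it runs standalone; table names are made up for illustration):

```php
<?php
// Insert a parent row, grab its auto-generated ID, and use that ID
// in a child-row insert. In CodeIgniter 3 the equivalent call is
// $this->db->insert_id() after $this->db->insert(...).
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE orders (id INTEGER PRIMARY KEY, note TEXT)');
$pdo->exec('CREATE TABLE order_items (order_id INTEGER, sku TEXT)');

$pdo->prepare('INSERT INTO orders (note) VALUES (?)')->execute(['first order']);
$orderId = (int) $pdo->lastInsertId();   // the ID needed for the child row

$pdo->prepare('INSERT INTO order_items (order_id, sku) VALUES (?, ?)')
    ->execute([$orderId, 'A1']);

echo $orderId, "\n"; // prints 1
```

Note that batched multi-row inserts only return the first generated ID, so the rows that need their ID captured have to be inserted one at a time.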
#8

Using 

Code:
setTimeout()

 it is possible to launch a function after a specified delay:

Code:
setTimeout(myFunction, 60000);

But what if I would like to launch the function multiple times? Every time the interval passes, I would like to execute the function again.