PHP timeout doing multiple curl and file_get_contents calls

I have a PHP script that does a GET request to an API and then, from the results, has to fetch over 1000 images.

My problem is that the script always times out partway through this process.

What would be the best approach here?

I've tried looping over the response and fetching the files in smaller batches, but I still hit the timeout.

Should I store the progress, and do even smaller batches?
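Something along these lines is what I have in mind (a sketch only; the `images.json`, `progress.json`, and batch-size values are placeholders): each batch is downloaded concurrently with `curl_multi`, and progress is written to disk after every batch so a re-run can resume where it left off.

```php
<?php
// Sketch: resumable, batched image downloads.
// Assumptions: images.json holds the URL list from the API response,
// progress.json records completed URLs, and images/ is writable.

$imageUrls    = json_decode(file_get_contents('images.json'), true);
$progressFile = 'progress.json';
$done = file_exists($progressFile)
    ? json_decode(file_get_contents($progressFile), true)
    : [];

$batchSize = 50; // arbitrary; tune to stay well under the time limit
$pending = array_values(array_diff($imageUrls, $done));

foreach (array_chunk($pending, $batchSize) as $batch) {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($batch as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run all transfers in this batch concurrently.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    foreach ($handles as $url => $ch) {
        if (curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200) {
            file_put_contents('images/' . basename($url), curl_multi_getcontent($ch));
            $done[] = $url;
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    // Persist progress after every batch, so a timeout loses at most one batch.
    file_put_contents($progressFile, json_encode($done));
}
```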

1 answer

  • answered 2018-07-12 00:47 solarc

    PHP has a configuration directive called max_execution_time which you should take a look at.
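    For example, you can inspect and raise the limit at runtime (note that on the CLI, max_execution_time already defaults to 0, i.e. unlimited; for web requests the host configuration must allow overriding it):

    ```php
    <?php
    // Current limit in seconds ("0" means no limit).
    echo ini_get('max_execution_time'), "\n";

    // Remove the limit for the rest of this script.
    set_time_limit(0);
    // Equivalent: ini_set('max_execution_time', '0');
    // Or set it permanently in php.ini: max_execution_time = 0
    ```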

    If you want to divide the processing into separate small processes, try using a queue system like RabbitMQ. Your main process would publish the tasks to RabbitMQ, and then one or more workers would listen on the queue and process each item.
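    A minimal sketch of that pattern, assuming the php-amqplib library is installed (`composer require php-amqplib/php-amqplib`) and a RabbitMQ server is running locally; `$imageUrls` and the `images/` directory are placeholders:

    ```php
    <?php
    require 'vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel = $connection->channel();
    $channel->queue_declare('image_downloads', false, true, false, false);

    // Producer (the main process): one persistent message per image URL.
    foreach ($imageUrls as $url) {
        $msg = new AMQPMessage($url, ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]);
        $channel->basic_publish($msg, '', 'image_downloads');
    }

    // Worker (normally a separate script, run as many copies as you like).
    // prefetch=1 so each worker takes one task at a time; ack only on success.
    $channel->basic_qos(null, 1, null);
    $channel->basic_consume('image_downloads', '', false, false, false, false,
        function (AMQPMessage $msg) {
            $data = file_get_contents($msg->body);
            if ($data !== false) {
                file_put_contents('images/' . basename($msg->body), $data);
                $msg->ack();
            } else {
                $msg->nack(true); // requeue on failure
            }
        });

    while ($channel->is_consuming()) {
        $channel->wait();
    }
    ```

    Because each worker runs as its own process, no single PHP request has to stay alive for all 1000+ downloads, and failed downloads can simply be requeued.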