When a connection times out while using curl or file_get_contents, because of a network error or because the remote server doesn't respond, it kills my script.
I am making these remote calls in a loop, and if one of them fails the whole script dies.
What is the best way to handle a failed request so that the loop moves on to the next item instead of the script dying?
First, set a timeout limit (in seconds) on the cURL handle:
curl_setopt($ch, CURLOPT_TIMEOUT, 1800);
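As a minimal sketch of that setup (the URL and timeout values here are only examples, not part of the original answer), you can also cap the connection phase separately with CURLOPT_CONNECTTIMEOUT:

```php
// Hypothetical endpoint; adjust the values to your situation.
$url = 'https://example.com/api';
$ch  = curl_init($url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);   // seconds to wait for the TCP connection
curl_setopt($ch, CURLOPT_TIMEOUT, 1800);        // seconds allowed for the whole transfer
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // make curl_exec() return the body instead of printing it
```

With CURLOPT_RETURNTRANSFER enabled, curl_exec() returns the response body on success and false on failure, which is what the check below relies on.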
The return value of your curl_exec() call tells you whether the request succeeded:
for (/* anything */) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    // ... set URL and other options ...
    $result = curl_exec($ch);
    curl_close($ch);
    if ($result === false) {
        continue; // skip to the next loop iteration
    }
    // this code is not reached if the request failed
}
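Since the question also mentions file_get_contents, here is a sketch of the same pattern with a stream-context timeout; the URL list and the 30-second value are hypothetical, not from the original answer:

```php
// Hypothetical list of endpoints to fetch in a loop.
$urls    = ['https://example.com/a', 'https://example.com/b'];
$context = stream_context_create(['http' => ['timeout' => 30]]); // seconds

foreach ($urls as $url) {
    // Returns false on failure; @ suppresses the warning so the script keeps running.
    $body = @file_get_contents($url, false, $context);
    if ($body === false) {
        continue; // skip this URL and move on to the next
    }
    // process $body here
}
```

The key point is the same in both APIs: check for a false return value and `continue` instead of letting the failure stop the loop.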