I am trying to pull a large set of data from one database, rename the columns, then dump it into another. I started getting timeouts and upped max_execution_time and max_input_time. This helped, but I still wasn't getting all the data. I then added the following:
set_time_limit(0);
ignore_user_abort(1);
This doubled the amount of data I was able to pull, but I'm still coming up short, so it's still timing out. I'm wondering if there is a better way to do this.
I am using Laravel 5.6, PHP 7.2, and MySQL 5.6.
I am pulling from one database and inserting into another.
$availabilities = DB::connection('mysql2')->select('select vi.status as availability_status_code, vi.date as availability_date, v.masterid as im_id from table1 vi
inner join table2 v on v.id = vi.vid where vi.date >= CURDATE() and v.masterid > 0 order by v.masterid, vi.date');
foreach($availabilities as $availability) {
Availabilities::create((array)$availability);
}
This works, but as noted, it times out.
Is there a more efficient way to handle this or should I just increase different time limits until it works? Keep in mind this will run once or twice a day via a job.
You could use:
DB::table('yourtable')->insert([['name' => '1st record'], ['name' => '2nd record']]);
Of course you should chunk this into parts: don't insert, say, 1 million records in a single statement, but break them into batches of, for example, 200 records each. It will be much faster than inserting each record separately as you do now.
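For example, here is a minimal sketch of batching the rows from the question (assuming the Availabilities model writes to a table named availabilities; adjust the name to match your schema):
$rows = array_map(function ($availability) {
    // DB::select() returns stdClass objects; cast each one to a plain array
    return (array) $availability;
}, $availabilities);
// one INSERT statement per 200 rows instead of one per row
foreach (array_chunk($rows, 200) as $batch) {
    DB::table('availabilities')->insert($batch);
}
Note that DB::table()->insert() bypasses Eloquent, so created_at and updated_at are not filled in automatically the way they are by Availabilities::create().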
You can use the collection chunk method to split your $availabilities into smaller parts.
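Putting the two together, a sketch using the collection chunk method (same availabilities table assumption as above):
collect($availabilities)
    ->map(function ($availability) {
        return (array) $availability; // stdClass to array for insert()
    })
    ->chunk(200)
    ->each(function ($batch) {
        // values() re-indexes the chunk keys before the bulk insert
        DB::table('availabilities')->insert($batch->values()->all());
    });
With batches of 200, a million rows becomes roughly 5,000 INSERT statements instead of a million, which should fit comfortably within a normal execution time for a job that runs once or twice a day.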