Question:

I would like to modify a cron script which includes three table insertions, one table update and one deletion.

There are thousands of records, so inserting the data takes a long time.

The data is inserted into a products table, a product payment table, a product details table and a product images table.

The current script checks whether the product already exists.

If it does not exist, the script first inserts into the products table and gets back the newly inserted product id. After that, it inserts into the other three tables. There may be multiple inserts into the images table (depending on the number of images).

If the product already exists, the script updates the tables. For the images table, it deletes the existing image records for that product and then inserts all the images as new rows.

This is what happens now:

if(!$prod_exists){
    $product_id = insert_product($product_data);
    if($product_id){
        insert_payment($paymnt_data, $product_id);
        insert_details($details_data, $product_id);
        insert_images($image_data, $product_id);
    }
}else{
    update_product($product_data, $product_id);
    update_payment($paymnt_data, $product_id);
    update_details($details_data, $product_id);
    delete_images($product_id);
    insert_images($image_data, $product_id);
}

As the data set is enormous, the script takes a long time to execute. Is there any way to optimize this entire process?

Thanks

M

Answer:

What you could do is use a REPLACE INTO SQL query to either insert a row or replace the existing values in one go.

Take a look at this article: http://blogs.coldbuffer.com/inserting-or-updating-records-using-mysql-replace
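
A minimal sketch of what that could look like with PDO (the products table, its columns and the $pdo connection below are assumptions for illustration, not taken from the question; note that REPLACE needs a PRIMARY KEY or UNIQUE index to detect the existing row):

// Hypothetical: 'products' has a UNIQUE key on 'sku'.
// REPLACE INTO deletes any existing row with that key and inserts a fresh one.
$stmt = $pdo->prepare(
    'REPLACE INTO products (sku, name, price) VALUES (:sku, :name, :price)'
);
$stmt->execute([
    ':sku'   => $product_data['sku'],
    ':name'  => $product_data['name'],
    ':price' => $product_data['price'],
]);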

Answer:

Another way to deal with lots of data you have to add or update is to create a temporary table (probably with ENGINE = MEMORY) and do bulk inserts into it. Instead of adding one data set at a time, build SQL inserts with multiple value sets and run them at once. This is far faster than looping over the elements and calling an insert for each data set.

After that you can do a mass update by joining your original table with that temporary table, which is also much faster.

We use this approach and it speeds up inserts/updates quite well.
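
A rough sketch of that idea, assuming MySQL via PDO (the staging table, its columns and the $products array are made up for illustration):

// 1. Create an in-memory staging table.
$pdo->exec('CREATE TEMPORARY TABLE tmp_products (
    sku   VARCHAR(64) PRIMARY KEY,
    name  VARCHAR(255),
    price DECIMAL(10,2)
) ENGINE=MEMORY');

// 2. Bulk-insert many rows in a single statement instead of one INSERT per row.
$placeholders = [];
$params = [];
foreach ($products as $p) {
    $placeholders[] = '(?, ?, ?)';
    array_push($params, $p['sku'], $p['name'], $p['price']);
}
$pdo->prepare('INSERT INTO tmp_products (sku, name, price) VALUES '
              . implode(', ', $placeholders))
    ->execute($params);

// 3. Mass-update the real table by joining it with the staging table.
$pdo->exec('UPDATE products p
            JOIN tmp_products t ON t.sku = p.sku
            SET p.name = t.name, p.price = t.price');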

Answer:

Assuming MySQL:

Prepared statements!

http://uk1.php.net/pdo.prepared-statements

This will help immensely with your speed issue, if you're talking thousands of queries.
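
A small sketch of the prepare-once, execute-many pattern with PDO (the DSN, table and column names below are placeholders, not from the question):

// Connect once; throw exceptions on errors.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Prepare the statement once, outside the loop...
$stmt = $pdo->prepare(
    'INSERT INTO product_images (product_id, path) VALUES (:product_id, :path)'
);

// ...then execute it for every image with different values.
foreach ($image_data as $image) {
    $stmt->execute([
        ':product_id' => $product_id,
        ':path'       => $image['path'],
    ]);
}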

Also, wrap at least each individual block of queries in a transaction, if you're on InnoDB (which you should be!):

https://dev.mysql.com/doc/refman/5.0/en/ansi-diff-transactions.html

That won't help with speed, but will make the whole operation a good deal safer.
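
For example, wrapping one product's block of queries in a single transaction could look roughly like this (a sketch reusing the helper functions from the question and assuming a PDO connection in $pdo):

$pdo->beginTransaction();
try {
    $product_id = insert_product($product_data);
    insert_payment($paymnt_data, $product_id);
    insert_details($details_data, $product_id);
    insert_images($image_data, $product_id);
    $pdo->commit();      // everything for this product is saved together
} catch (Exception $e) {
    $pdo->rollBack();    // on failure, none of the partial writes remain
    throw $e;
}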
