I have a table that needs to be updated regularly, and the updates arrive in batches. Unlike INSERT, I can't just include multiple rows in a single query, so my plan is to prepare one UPDATE statement, then loop over all the items and execute it once per row. The preparation happens only once, of course, but there are still many executions.
I created several versions of the table at different sizes (thinking it might help to index or split the table). However, table size did not affect the update time: 100 updates take about 4 seconds whether the table has 1,000 rows or 500,000 rows.
Is there a faster way to do this?
As mentioned in the comments, here is the actual code (PHP) I tested with. The column id is the primary key.
$stmt = $dblink->prepare("UPDATE my_table SET col1 = ?, col2 = ? WHERE id = ?");
$rc = $stmt->bind_param("ssi", $c1, $c2, $id);

foreach ($items as $item) {
    $c1 = $item['c1'];
    $c2 = $item['c2'];
    $id = $item['id'];
    $rc = $stmt->execute();   // one round trip per row
}

$stmt->close();
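Since mysqli runs in autocommit mode by default, each execute() in a loop like this also pays for a commit. One commonly suggested variant, worth benchmarking, is to wrap the whole batch in a single explicit transaction so the server commits once. A sketch (not benchmarked here), using the same $dblink and $items as above:

```
$dblink->begin_transaction();
$stmt = $dblink->prepare("UPDATE my_table SET col1 = ?, col2 = ? WHERE id = ?");
$stmt->bind_param("ssi", $c1, $c2, $id);

foreach ($items as $item) {
    $c1 = $item['c1'];
    $c2 = $item['c2'];
    $id = $item['id'];
    $stmt->execute();
}

$stmt->close();
$dblink->commit();   // one commit for the entire batch
```

On InnoDB this avoids a durable commit per row, which is often where most of the per-update time goes.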
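Another option sometimes suggested is to fold the whole batch into a single UPDATE using CASE expressions, so only one statement is sent. build_batch_update below is a hypothetical helper sketching the idea for my_table, col1, col2, and integer ids; the addslashes() escaping is deliberately naive and stands in for proper parameter quoting in real code:

```php
<?php
// Build one UPDATE ... CASE statement covering every item in the batch.
// Hypothetical helper; assumes integer ids and string col1/col2 values.
function build_batch_update(array $items): string {
    $ids = [];
    $c1  = [];
    $c2  = [];
    foreach ($items as $item) {
        $id    = (int)$item['id'];
        $ids[] = $id;
        // Naive escaping for illustration only; use real quoting in practice.
        $c1[] = "WHEN $id THEN '" . addslashes($item['c1']) . "'";
        $c2[] = "WHEN $id THEN '" . addslashes($item['c2']) . "'";
    }
    return "UPDATE my_table SET "
         . "col1 = CASE id " . implode(' ', $c1) . " END, "
         . "col2 = CASE id " . implode(' ', $c2) . " END "
         . "WHERE id IN (" . implode(',', $ids) . ")";
}
```

The resulting string would be sent with a single $dblink->query() call instead of many execute() calls; whether it is actually faster than a prepared statement in a transaction is something to measure.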