Maintaining a large table of unique values in MySQL

This is probably a common situation, but I could not find a specific answer on SO or Google.

I have a large table (>10 million rows) of friendships in a MySQL database. The table is very important and must be maintained so that it contains no duplicate rows. Each row stores a (user, possiblefriend) pair. The SQL for the table:

CREATE TABLE possiblefriends (
    id INT NOT NULL AUTO_INCREMENT,
    PRIMARY KEY (id),
    user INT,
    possiblefriend INT
)

The way the table works: each user has roughly 1,000 “possible friends” that are discovered and need to be stored, but duplicate “possible friends” must be avoided.

The problem is that, due to the design of the program, over the course of a day I need to add a million rows or more to the table, and these may or may not duplicate existing rows. The simple answer would seem to be to check each row before inserting it to see whether it is a duplicate. But that technique is likely to become very slow as the table grows to 100 million rows, 1 billion rows, or beyond (which I expect in the near future).

What is the best (i.e. fastest) way to keep this unique table?

I do not need the table to be unique at all times; I only need it to be unique once a day, for batch jobs. In that case, should I create a separate table that simply accepts all incoming rows (duplicates and all), and then, at the end of each day, build a second table containing only the unique rows of the first?
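The staging-table idea can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 module rather than MySQL (the in-memory database and table names are assumptions for the demo; in MySQL the end-of-day step would be an INSERT IGNORE ... SELECT DISTINCT against your real tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: accepts duplicates freely during the day.
cur.execute("CREATE TABLE staging (user INTEGER, possiblefriend INTEGER)")

# Final table: uniqueness is enforced by the composite primary key.
cur.execute("""CREATE TABLE possiblefriends (
                   user INTEGER,
                   possiblefriend INTEGER,
                   PRIMARY KEY (user, possiblefriend))""")

# Simulate the day's inserts, duplicates included.
rows = [(1, 2), (1, 3), (1, 2), (2, 5), (1, 3)]
cur.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# End-of-day batch job: copy only the distinct pairs, then clear staging.
cur.execute("""INSERT INTO possiblefriends (user, possiblefriend)
               SELECT DISTINCT user, possiblefriend FROM staging""")
cur.execute("DELETE FROM staging")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM possiblefriends").fetchone()[0])  # 3
```

The trade-off is that during the day the staging table holds redundant rows, but every insert is a cheap append with no uniqueness check.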

If not, what is the best way to maintain this table over the long term?

(If indexes are the best long-term solution, tell me which indexes to use)

+6
mysql large-data
2 answers

Add a unique index on (user, possiblefriend), then use one of INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE to avoid errors when you try to insert a duplicate row.
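A quick sketch of a duplicate-tolerant insert, using Python's built-in sqlite3 module for illustration (SQLite spells MySQL's INSERT IGNORE as INSERT OR IGNORE; the in-memory database is just for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE possiblefriends (
                   id INTEGER PRIMARY KEY AUTOINCREMENT,
                   user INTEGER,
                   possiblefriend INTEGER,
                   UNIQUE (user, possiblefriend))""")

# The first insert succeeds; the second hits the unique
# index on (user, possiblefriend) and is silently skipped.
cur.execute("INSERT OR IGNORE INTO possiblefriends (user, possiblefriend) VALUES (1, 2)")
cur.execute("INSERT OR IGNORE INTO possiblefriends (user, possiblefriend) VALUES (1, 2)")

print(cur.execute("SELECT COUNT(*) FROM possiblefriends").fetchone()[0])  # 1
```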

You might also consider whether you can drop your auto-incrementing primary key and use (user, possiblefriend) as the primary key instead. That would reduce the size of the table, and the primary key would double as the index, saving you from having to create a separate one.

See also:

  • INSERT IGNORE vs. INSERT ... ON DUPLICATE KEY UPDATE
+7

A unique index lets you ensure that a column (or combination of columns) is really unique. You can add one like this:

 CREATE TABLE possiblefriends (
     id INT NOT NULL AUTO_INCREMENT,
     user INT,
     possiblefriend INT,
     PRIMARY KEY (id),
     UNIQUE INDEX DefUserID_UNIQUE (user ASC, possiblefriend ASC)
 )

This will also speed up lookups against the table considerably.

Your other problem, the bulk inserts, is a bit more complicated. You can use MySQL's built-in ON DUPLICATE KEY UPDATE clause, as below:

 INSERT INTO table (a,b,c) VALUES (1,2,3)
     ON DUPLICATE KEY UPDATE c = c + 1;

If a row with a = 1 already exists, the statement above has the same effect as:

 UPDATE table SET c = c + 1 WHERE a = 1;
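For illustration, here is the same upsert behaviour sketched with Python's built-in sqlite3 module (SQLite 3.24+ spells MySQL's ON DUPLICATE KEY UPDATE as ON CONFLICT ... DO UPDATE; the table name t is just for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (a INTEGER PRIMARY KEY, b INTEGER, c INTEGER)")

cur.execute("INSERT INTO t (a, b, c) VALUES (1, 2, 3)")
# This insert conflicts on the primary key a=1, so instead
# of failing it runs the UPDATE branch and increments c.
cur.execute("""INSERT INTO t (a, b, c) VALUES (1, 2, 3)
               ON CONFLICT(a) DO UPDATE SET c = c + 1""")

print(cur.execute("SELECT c FROM t WHERE a = 1").fetchone()[0])  # 4
```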
+2
