How can I make MySQL run INSERTs faster?

Well, I have about 175 thousand INSERT instructions, rather large INSERT instructions, for example:

INSERT INTO `gast` (`ID`,`Identiteitskaartnummer`,`Naam`,`Voornaam`,`Adres`,`Postcode`,`Stad`,`Land`,`Tel`,`Gsm`,`Fax`,`Email`,`Creditcardnummer`) VALUES ('100001','5121-1131-6328-9416','Cameron','Rhoda','Ap #192-1541 Velit Rd.','54398','Catskill','Bermuda','1-321-643-8255','(120) 502-0360','1 48 428 3971-3704',' tempor@justo.org ','8378-3645-3748-8446'); 

(Ignore the fact that the column names are in Dutch.) All of the data is randomly generated.

I need to do this about 3-4 times, and running 100k INSERT statements takes about an hour on my PC. Is there a way to do this faster without changing the INSERT statements themselves? I am using MySQL Workbench. Thanks.

EDIT

I have tried everything that has been suggested so far. Combining the rows into a single INSERT with multiple VALUES lists gives me an error that the statement is too large. Disabling indexes before inserting did not improve performance either.
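The "statement too large" error is usually MySQL's `max_allowed_packet` limit. A sketch of checking and raising it (the 256 MB value is just an example; `SET GLOBAL` requires sufficient privileges and only affects connections opened afterwards):

```sql
-- Check the current limit (in bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for new connections (example value: 256 MB);
-- requires admin privileges, and existing sessions must reconnect to pick it up
SET GLOBAL max_allowed_packet = 268435456;
```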

I also tried loading the .sql file through the command line, but nothing seems to happen ...

Insert performance ranges from 0.015s to 0.032s per INSERT, which at 100k statements works out to roughly 25-55 minutes, consistent with the hour I observed.

+4
4 answers

INSERT all of the data into a staging table first, and then populate the target table with a single INSERT ... SELECT statement: that runs as one batch operation instead of 175k separate statements.

Also, before running the INSERT ... SELECT into the target table, disable its indexes with ALTER TABLE yourtablename DISABLE KEYS , and re-enable them afterwards with ALTER TABLE yourtablename ENABLE KEYS . (Note that DISABLE KEYS only affects non-unique indexes on MyISAM tables.)
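A sketch of the whole approach, assuming a staging table named gast_staging with the same columns (the table names are placeholders):

```sql
-- Create a staging table with the same structure as the target
CREATE TABLE gast_staging LIKE gast;

-- ... load the 175k single-row INSERTs into gast_staging here ...

-- Disable non-unique indexes on the target
-- (effective for MyISAM tables; InnoDB ignores this with a warning)
ALTER TABLE gast DISABLE KEYS;

-- One batched statement instead of 175k separate ones
INSERT INTO gast SELECT * FROM gast_staging;

-- Rebuild the indexes in a single pass
ALTER TABLE gast ENABLE KEYS;
```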

Personally, I would also build a covering index on the staging table, so the INSERT ... SELECT would not have to read the data pages at all.

+2

Just use one statement

 INSERT INTO `gast` (`ID`,`Identiteitskaartnummer`,`Naam`,`Voornaam`,`Adres`,`Postcode`,`Stad`,`Land`,`Tel`,`Gsm`,`Fax`,`Email`,`Creditcardnummer`)
 VALUES
   ('100001','5121-1131-6328-9416','Cameron','Rhoda','Ap #192-1541 Velit Rd.','54398','Catskill','Bermuda','1-321-643-8255','(120) 502-0360','1 48 428 3971-3704',' tempor@justo.org ','8378-3645-3748-8446'),
   ('100002','5121-1131-6328-9416','Cameron','Rhoda','Ap #192-1541 Velit Rd.','54398','Catskill','Bermuda','1-321-643-8255','(120) 502-0360','1 48 428 3971-3704',' tempor@justo.org ','8378-3645-3748-8446');

and so on.

You should probably batch the rows, say 10,000 per statement, rather than putting all 175k into a single one.

+1

Another solution would be to save the data in a text file and then load it using LOAD DATA LOCAL INFILE.

It is not as elegant, but it will be much faster.
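A sketch, assuming the rows were exported to a comma-separated file at /tmp/gast.csv (the path, delimiters, and quoting are placeholders to adapt to the actual export; local_infile must be enabled on both the client and the server):

```sql
LOAD DATA LOCAL INFILE '/tmp/gast.csv'
INTO TABLE gast
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(ID, Identiteitskaartnummer, Naam, Voornaam, Adres, Postcode, Stad, Land,
 Tel, Gsm, Fax, Email, Creditcardnummer);
```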

+1

Put all of your data into a CSV file and use LOAD DATA INFILE 'filename' INTO TABLE tablename. It should not take more than a minute. If you want to stick with inserts, it is better to use a single INSERT that sends several rows at once.

+1
