Massive MySQL INSERT Optimization

I have an application that needs to run a daily script; the daily script loads a CSV file with 1,000,000 rows and inserts those rows into a table.

I host my application on Dreamhost. I wrote a while loop that goes through all the CSV lines and runs an INSERT query for each one, but I get a "500 Internal Server Error". Even if I split the job into 1000 files of 1000 lines each, I cannot insert more than 40 or 50 thousand rows in one run.

Is there any way to optimize these inserts? I am also considering moving to a dedicated server; do you think that would help?

Thanks!

Pedro

+1
10 answers

The fastest way to do this in MySQL is LOAD DATA INFILE.

To load a CSV file, something like:

LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY '\r\n'
  IGNORE 1 LINES;
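
For completeness, here is a minimal sketch of calling this from the daily PHP job, using the old mysql_* API that other answers in this thread mention. The credentials, database, file path and table name are placeholders, and the host has to allow LOCAL INFILE.

<?php
// Minimal sketch (placeholders throughout): run the bulk load from PHP.
$link = mysql_connect('localhost', 'db_user', 'db_pass') or die('connect failed');
mysql_select_db('my_database', $link);

// LOCAL makes the client send the file, which is usually what you want on
// shared hosting where the CSV is not on the database server itself.
$sql = "LOAD DATA LOCAL INFILE '/home/user/data.csv'
        INTO TABLE tbl_name
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\r\\n'
        IGNORE 1 LINES";

if (!mysql_query($sql, $link)) {
    die('LOAD DATA failed: ' . mysql_error($link));
}
echo mysql_affected_rows($link) . " rows loaded\n";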
+13

Instead of

insert into table values(1,2);

do

insert into table values (1,2),(2,3),(4,5);

Packing many rows into one statement saves a network round trip and a statement parse per row, so it is much faster.

Also have a look at LOAD DATA INFILE, which is faster still:

http://dev.mysql.com/doc/refman/5.0/en/load-data.html
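
To make the multi-row idea concrete, here is a rough PHP sketch (mine, not from the answer) that reads the CSV and sends one extended INSERT per batch; the table name, the two columns and the batch size of 1000 are assumptions.

<?php
// Sketch: one INSERT per batch of CSV rows instead of one INSERT per row.
// tbl_name, col1/col2 and the batch size are placeholders.
$link = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('my_database', $link);

$fh    = fopen('/home/user/data.csv', 'r');
fgets($fh);                                    // skip the header line
$batch = array();

while (($row = fgetcsv($fh)) !== false) {
    $batch[] = sprintf("('%s','%s')",
        mysql_real_escape_string($row[0], $link),
        mysql_real_escape_string($row[1], $link));

    if (count($batch) == 1000) {               // send 1000 rows at a time
        mysql_query('INSERT INTO tbl_name (col1, col2) VALUES ' . implode(',', $batch), $link);
        $batch = array();
    }
}
if ($batch) {                                  // leftover rows
    mysql_query('INSERT INTO tbl_name (col1, col2) VALUES ' . implode(',', $batch), $link);
}
fclose($fh);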

+6

As others have said, LOAD DATA INFILE is almost certainly the fastest option if you can use it.

The MySQL manual has a whole section on speeding up INSERTs that is worth reading: http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html

If you have to stay with INSERT statements, the main points are:

  • Disable/re-enable the indexes around the load:

    ALTER TABLE tbl_name DISABLE KEYS; ALTER TABLE tbl_name ENABLE KEYS;

  • Use multi-row (extended) INSERT statements.

    I.e.: INSERT INTO table (col1, col2) VALUES (val1, val2), (..,..),...

    Watch the total statement size, though; keep each statement under the server's maximum packet size (a few thousand rows, e.g. 4096, per statement is a sensible upper bound).

  • FLUSH TABLES before you start, so the load begins from a clean state rather than on top of pending writes from normal use of the table.

Beyond that, look at LOCK TABLES: if you lock the table for the duration of the load, the key buffer is flushed to disk only once, when the lock is released, rather than after every INSERT statement.
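
Put together, the surrounding statements could look roughly like this sketch (my arrangement of the points above, assuming a MyISAM table named tbl_name; the batched INSERTs go where the comment is):

<?php
// Sketch of the ordering: stop index maintenance, lock the table, do all the
// bulk INSERTs, then unlock and rebuild the indexes exactly once at the end.
$link = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('my_database', $link);

mysql_query('ALTER TABLE tbl_name DISABLE KEYS', $link);
mysql_query('LOCK TABLES tbl_name WRITE', $link);

// ... run the multi-row INSERT batches here ...

mysql_query('UNLOCK TABLES', $link);
mysql_query('ALTER TABLE tbl_name ENABLE KEYS', $link);   // one index rebuild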

UPDATE

From the comments it sounds like the real problem is that the script gets killed before it finishes, rather than raw INSERT speed. A few more suggestions:

  • Load into a separate "staging" table first (TRUNCATE it at the start of each run), so a failed run never leaves the live table half-populated.
  • Make the script resumable: have it record how far through the file it has got, so that if it is killed it can pick up where it left off instead of starting from scratch (see the sketch after this list).
  • If you disable the keys, only run ENABLE KEYS once, at the very end, since rebuilding the indexes is the expensive part.
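
As a rough illustration of the resumable idea (entirely my own sketch, with placeholder paths, columns and a 10,000-row chunk size): keep a tiny progress file with the number of rows already loaded, and let each run pick up from there.

<?php
// Sketch of a resumable chunked loader. Each run inserts at most $chunkSize
// rows and records its progress, so a killed run simply resumes next time.
set_time_limit(0);                         // may be ignored on shared hosting

$progressFile = '/home/user/import.progress';
$chunkSize    = 10000;

$done = file_exists($progressFile) ? (int) file_get_contents($progressFile) : 0;

$link = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('my_database', $link);

$fh = fopen('/home/user/data.csv', 'r');
fgets($fh);                                // header line
for ($i = 0; $i < $done; $i++) {           // skip rows loaded by earlier runs
    fgets($fh);
}

$inserted = 0;
while ($inserted < $chunkSize && ($row = fgetcsv($fh)) !== false) {
    // single-row INSERT kept for clarity; combine with the multi-row
    // batching above for speed
    mysql_query(sprintf("INSERT INTO tbl_name (col1, col2) VALUES ('%s','%s')",
        mysql_real_escape_string($row[0], $link),
        mysql_real_escape_string($row[1], $link)), $link);
    $inserted++;
    file_put_contents($progressFile, $done + $inserted);
}
fclose($fh);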
+3

I would set up a cron job that runs the script every x minutes. Each run of the cron job inserts only the next batch of rows and remembers where it stopped, so no single run lasts long enough to be killed.

Spread out like that, the load never spikes and the web server has no reason to abort anything.

It is also much kinder to a shared host than one huge request. Slow and steady gets the file in.

Also note that setting time_limit to 0 may not actually work on a shared host like Dreamhost; they tend to enforce their own limits regardless.
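
As a sketch of what such a cron-driven chunk script might look like (the schedule, paths and settings are placeholders of mine):

<?php
// import_chunk.php -- meant to be run by cron rather than via a web request,
// e.g. a crontab entry like (placeholder schedule/paths):
//   */10 * * * * php /home/user/import_chunk.php >> /home/user/import.log
set_time_limit(0);          // lift PHP's limit; the host may still impose one
ignore_user_abort(true);    // keep going even if nothing is watching

// ... load the next chunk of rows here, as in the resumable sketch above ...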

0

A PHP script run through the web server is limited in how long it can execute, and that limit is what is killing your script; the 500 error is just the web server reporting that the script died.

If you can, bypass PHP for the heavy lifting and feed the data to mysql directly (LOAD DATA INFILE or the command-line client).

0

OMG Ponies' answer is the way to go; if for some reason you cannot bulk-load the "raw" CSV, another option is to generate the data in mysqldump format and import that instead. It amounts to the same bulk load.

0

Are you using transactions? Send BEGIN to MySQL before you start inserting and COMMIT when you are done (or every few thousand rows). With autocommit on, every single INSERT is flushed to disk on its own, which is exactly the kind of overhead that makes a script like this crawl.
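
A minimal sketch of that (my assumption; this only buys you something if the table is InnoDB, since MyISAM ignores transactions):

<?php
// Sketch: group the inserts into transactions so MySQL does not flush to disk
// after every single row. Paths, table and columns are placeholders.
$link = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('my_database', $link);

$fh = fopen('/home/user/data.csv', 'r');
fgets($fh);                                  // header line

mysql_query('BEGIN', $link);
$n = 0;
while (($row = fgetcsv($fh)) !== false) {
    mysql_query(sprintf("INSERT INTO tbl_name (col1, col2) VALUES ('%s','%s')",
        mysql_real_escape_string($row[0], $link),
        mysql_real_escape_string($row[1], $link)), $link);

    if (++$n % 10000 == 0) {                 // commit in blocks, not per row
        mysql_query('COMMIT', $link);
        mysql_query('BEGIN', $link);
    }
}
mysql_query('COMMIT', $link);
fclose($fh);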

0

Following on from what nos said, splitting the work up is the right idea, but a couple of details matter.

On a reasonably healthy MySQL server a single INSERT should not take anywhere near 10 seconds. Group the INSERTs into batches, as nos suggests, rather than firing them one at a time. And since it is PHP's timeout that is cutting you off, call set_time_limit($seconds) before each INSERT to reset the clock, so the script is never killed in the middle of an INSERT.

Also, do not assume every INSERT worked: after each one, check what MySQL reports via mysql_errno() and mysql_error(), and confirm the number of rows actually added with mysql_affected_rows(). Don't just fire and forget.
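
For instance, a small helper like this (the names are mine) wraps each INSERT with the checks described above:

<?php
// Sketch: run one (possibly multi-row) INSERT and verify it. $link is an open
// mysql_connect() handle; $sql and $expectedRows come from the batching code.
function run_checked_insert($link, $sql, $expectedRows)
{
    if (!mysql_query($sql, $link)) {
        error_log('INSERT failed: [' . mysql_errno($link) . '] ' . mysql_error($link));
        return false;
    }

    $inserted = mysql_affected_rows($link);
    if ($inserted != $expectedRows) {
        error_log("Only $inserted of $expectedRows rows were inserted");
        return false;
    }
    return true;
}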

0

Another tool in this space is sqlloader. I have used SQL Loader to bulk-load csv files of this size before and it handles them easily; it is Oracle's equivalent of what LOAD DATA INFILE does for MySQL. http://www.oracle-dba-online.com/sql_loader.htm

0

You could also import the file through phpmyadmin, which has a CSV import feature and handles the batching for you.

Keep the storage "engine" in mind too: bulk insert / load performance differs noticeably between InnoDB and MyISAM.

Worth checking which engine the table uses before you settle on an approach.

0
