How to transfer a large .txt file to MySQL?

I have a very large .txt file with millions of lines, and each line uses a delimiter character.

What is the easiest way to load all of these rows into the database? Should I open the file with PHP's fopen, explode each row, and insert it into the database?

The file is 2 GB.

+4
3 answers

Use phpMyAdmin to generate the command for you, e.g.:

LOAD DATA LOCAL INFILE 'input_file' INTO TABLE `tablename` FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n' 

Quote from http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
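
If you prefer to run that statement from PHP rather than from phpMyAdmin, a minimal sketch with mysqli might look like the following. It assumes the server allows LOAD DATA LOCAL INFILE (local_infile enabled); the connection details, file path and table name are placeholders.

<?php
// Enable LOCAL INFILE on the client before connecting.
$db = mysqli_init();
mysqli_options($db, MYSQLI_OPT_LOCAL_INFILE, true);
mysqli_real_connect($db, "localhost", "user", "password", "database");

// Let the server read, parse and load the file in one statement.
$sql = "LOAD DATA LOCAL INFILE 'input_file'
        INTO TABLE `tablename`
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'";

if (!mysqli_query($db, $sql)) {
    die(mysqli_error($db));
}
mysqli_close($db);

This is usually much faster than inserting row by row, because the server parses the file itself.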

+8
$mysql = mysql_connect("localhost", "user", "password");
mysql_select_db("database", $mysql);

// Read the whole file into memory (only feasible if the file fits in RAM).
$filename = "file.txt";
$handle   = fopen($filename, "rb");
$contents = fread($handle, filesize($filename));
fclose($handle);

// Split on the record separator and insert one row per record.
$array = explode('[separator]', $contents);
foreach ($array as $line) {
    mysql_query("INSERT INTO table VALUES ('" . mysql_real_escape_string($line) . "')", $mysql);
}
mysql_close($mysql);

This is another solution, but boj's answer is better than this one.

+1

It depends on how much memory your machine has. If you are not limited by RAM, you can try to read the entire file, explode it into an array, and generate a single insert query.

You will probably need to build several insert queries anyway because of database limits (e.g. max_allowed_packet in MySQL).

If you do not have enough memory, you will need to read the file in several steps (chunks of bytes) and, again, build several insert queries.

The fopen and fread functions can be useful here; a sketch of this approach follows.
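
A minimal sketch of the chunked approach described above, assuming one record per line with fields separated by '|'. It reads the file line by line with fgets rather than raw fread chunks, which keeps memory usage flat; the connection details, table name and batch size are placeholders (keep each batch under max_allowed_packet).

<?php
$db = mysqli_connect("localhost", "user", "password", "database");

$handle    = fopen("file.txt", "rb");
$batch     = array();
$batchSize = 1000; // rows per multi-row INSERT

while (($line = fgets($handle)) !== false) {
    // Split the record into fields and escape each one.
    $fields  = explode('|', rtrim($line, "\r\n"));
    $escaped = array_map(function ($f) use ($db) {
        return "'" . mysqli_real_escape_string($db, $f) . "'";
    }, $fields);
    $batch[] = '(' . implode(',', $escaped) . ')';

    // Flush a full batch as one INSERT ... VALUES (...),(...) statement.
    if (count($batch) >= $batchSize) {
        mysqli_query($db, "INSERT INTO `tablename` VALUES " . implode(',', $batch));
        $batch = array();
    }
}

// Flush the last, partially filled batch.
if ($batch) {
    mysqli_query($db, "INSERT INTO `tablename` VALUES " . implode(',', $batch));
}

fclose($handle);
mysqli_close($db);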

0
