I need to insert about 1.8 million lines from a CSV file into a MySQL database. (only one table)
Currently, Java is used to parse the file and insert each line.
As you can imagine, this takes quite a few hours (around 10).
The reason I'm not loading it straight from the file into the db is that the data has to be manipulated before it's added to the database.
This process needs to be run by an IT manager, so I've set it up as a simple batch file for them to run after they drop the new CSV file into the right location. In other words, it has to work nicely by just dropping the file in a specific place and running the batch file. (Windows environment)
My question is: what would be the fastest way to insert this much data, large bulk inserts from a temporary parsed file, or one insert at a time? Maybe some other idea?
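To make the first option concrete, the sketch below is roughly what I mean by "large inserts from a temporary parsed file": write the already-manipulated rows to a temp file, then bulk-load it in one statement. The table name, columns, paths, and connection settings are just placeholders, not my real schema.

```java
import java.io.FileWriter;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class TempFileLoadSketch {
    public static void main(String[] args) throws Exception {
        // 1) write the manipulated rows to a temporary CSV file
        try (PrintWriter out = new PrintWriter(new FileWriter("C:/temp/parsed.csv"))) {
            for (String[] row : parsedRows()) {        // stands in for the existing parsing/manipulation step
                out.println(row[0] + "," + row[1]);
            }
        }

        // 2) bulk-load the temporary file with a single LOAD DATA statement
        //    (allowLoadLocalInfile lets Connector/J send a local file to the server)
        String url = "jdbc:mysql://localhost/mydb?allowLoadLocalInfile=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             Statement st = conn.createStatement()) {
            st.execute("LOAD DATA LOCAL INFILE 'C:/temp/parsed.csv' "
                     + "INTO TABLE my_table FIELDS TERMINATED BY ',' (col_a, col_b)");
        }
    }

    private static Iterable<String[]> parsedRows() {
        // placeholder for rows produced by the real CSV parsing code
        return java.util.Collections.emptyList();
    }
}
```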
The second question is how I can optimize my MySQL installation to allow very quick inserts. (At some point a large select of all the data will be required as well.)
Note: the table will eventually be dropped and the whole process run again later.
Some clarification: I'm currently using ...opencsv.CSVReader to parse the file and then doing an insert for each line. I combine some columns and ignore others.
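Simplified, the current code looks roughly like this (the opencsv package name, column indices, table and column names are just illustrative, not the real code):

```java
import com.opencsv.CSVReader;   // assuming a recent opencsv package name

import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CurrentApproachSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb", "user", "pass");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO my_table (col_a, col_b) VALUES (?, ?)");
             CSVReader reader = new CSVReader(new FileReader("C:/data/new.csv"))) {
            String[] line;
            while ((line = reader.readNext()) != null) {
                ps.setString(1, line[0] + " " + line[1]);  // combine a couple of columns
                ps.setString(2, line[3]);                  // keep this one, ignore the rest
                ps.executeUpdate();                        // one INSERT per row -- the slow part
            }
        }
    }
}
```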
More clarification: local DB, MyISAM table
java mysql
Derek organ