Depending on the data format (you said CSV) and the database, you are probably better off loading the data directly into the database (either into the tables Django manages or into temporary tables). Oracle and SQL Server, for example, ship with dedicated tools for bulk-loading large amounts of data. With MySQL there are plenty of tricks as well; for instance, you can write a Perl or Python script that reads the CSV file and produces a SQL script full of INSERT statements, then feed that script straight to MySQL.
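As a minimal sketch of that last idea (the file name, table name, and column handling are all placeholders; real data usually needs more careful quoting):

```python
# Turn data.csv into insert.sql, which you can then feed to the mysql
# client, e.g.:  mysql mydb < insert.sql
import csv

with open("data.csv", newline="") as src, open("insert.sql", "w") as dst:
    reader = csv.reader(src)
    header = next(reader)          # assume the first row holds the column names
    columns = ", ".join(header)
    for row in reader:
        # Naive quoting: escape single quotes only; NULLs, numbers and
        # encodings would need extra handling in practice.
        values = ", ".join("'" + field.replace("'", "''") + "'" for field in row)
        dst.write(f"INSERT INTO my_table ({columns}) VALUES ({values});\n")
```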
As already noted, drop indexes and triggers before loading large amounts of data, and add them back afterwards; rebuilding the indexes after every single insert is a major processing hit.
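For MySQL, the drop-and-rebuild step might look roughly like this (the table and index names are made up, and the sketch assumes the `mysql-connector-python` package):

```python
# Drop a secondary index before a bulk load and rebuild it afterwards.
import mysql.connector

conn = mysql.connector.connect(user="me", password="secret", database="mydb")
cur = conn.cursor()

cur.execute("ALTER TABLE my_table DROP INDEX idx_my_column")
# ... run the bulk load here ...
cur.execute("ALTER TABLE my_table ADD INDEX idx_my_column (my_column)")

conn.close()
```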
If you use transactions, either disable them or batch your inserts so that the transactions don't get too large (the definition of "too large" varies, but if you are loading a million rows of data, breaking that into somewhere around a thousand transactions is probably about right).
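A rough sketch of the batching idea, again with made-up file, table, and column names and `mysql-connector-python` (whose connections do not autocommit, so each commit closes out one transaction):

```python
# Insert rows in transactions of roughly 1,000 rows each.
import csv
import mysql.connector

BATCH_SIZE = 1000

conn = mysql.connector.connect(user="me", password="secret", database="mydb")
cur = conn.cursor()

with open("data.csv", newline="") as src:
    reader = csv.reader(src)
    next(reader)  # skip the header row
    for i, row in enumerate(reader, start=1):
        cur.execute("INSERT INTO my_table (col_a, col_b) VALUES (%s, %s)", row)
        if i % BATCH_SIZE == 0:
            conn.commit()  # end this transaction and start a new one

conn.commit()  # commit any leftover rows
conn.close()
```

In practice something like `cursor.executemany()` or MySQL's `LOAD DATA INFILE` would be faster still; the point here is just keeping each transaction to a manageable size.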
And most importantly: BACK UP YOUR DATABASE FIRST! The only thing worse than having to restore your database from a backup because an import screwed it up is not having a current backup to restore from.