Python and sqlite3 - adding thousands of rows

I have a .sql file containing thousands of individual INSERT statements, and running it takes forever. I am trying to find a more efficient way to do this. In Python, the sqlite3 module cannot run dot-commands like ".read" or ".import", and executescript is too slow for this many inserts.

I installed the sqlite3.exe shell in the hope of using ".read" or ".import", but I cannot figure out how to use it. Running it through Django in Eclipse does not work, because it expects the database to be at the root of my C drive, which seems silly. Running it from the command line does not work either, because it cannot find my database file (unless I am doing something wrong).

Any tips?

Thanks!

+6
python sql django sqlite sqlite3
4 answers

Are you using transactions? By default, SQLite creates a separate transaction for every statement, which slows things down.

By default, the sqlite3 module opens a transaction implicitly before a Data Modification Language (DML) statement (i.e. INSERT / UPDATE / DELETE / REPLACE)

If you manually open one transaction at the beginning and commit it at the end, this will significantly speed up the process.
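A minimal sketch of the idea (the `items` table and row values are made up for illustration): with the sqlite3 module's default isolation level, a transaction is opened implicitly before the first INSERT and stays open until you commit, so all inserts land in a single transaction as long as you only commit once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # substitute your database file path
cur = conn.cursor()
cur.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# One implicit transaction covers all of these inserts, because we
# do not commit until the loop is finished.
for i in range(10000):
    cur.execute("INSERT INTO items (name) VALUES (?)", ("item-%d" % i,))

conn.commit()  # a single commit at the end, instead of one per statement
```

The same effect can be had with `with conn:`, which commits on normal exit and rolls back on an exception.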

+11

Have you tried running the inserts within a single transaction? If not, each insert is treated as its own transaction and... well, you can read the SQLite FAQ on this.

+3

Also try running VACUUM and ANALYZE on the database file. That helped with a similar issue of mine.
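A quick sketch of running both from Python (the table `t` and its contents are hypothetical, just to give VACUUM something to reclaim):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # substitute your database file path
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
conn.execute("DELETE FROM t WHERE x % 2 = 0")
conn.commit()

conn.isolation_level = None  # autocommit; VACUUM cannot run inside an open transaction
conn.execute("VACUUM")   # rebuilds the file, reclaiming space left by deleted rows
conn.execute("ANALYZE")  # gathers table statistics for the query planner
```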

+2

Use parameterized queries

and

use a transaction.
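Both suggestions combined in one sketch (the `people` table and rows are illustrative): `executemany` reuses a single prepared statement with `?` placeholders, and the `with conn:` block wraps all the inserts in one transaction, committing once on success.

```python
import sqlite3

rows = [("alice", 30), ("bob", 25), ("carol", 41)]

conn = sqlite3.connect(":memory:")  # substitute your database file path
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")

# One transaction for the whole batch; one compiled statement for all rows.
with conn:
    conn.executemany("INSERT INTO people (name, age) VALUES (?, ?)", rows)
```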

+1
