I'm using Python to save CSV data to the database line by line ... but it is very slow!
The CSV contains 70 million lines, and my script can only save about a thousand rows per second.
This is what my script looks like:
reader = csv.reader(open('test_results.csv', 'r'))
for row in reader:
    TestResult(type=row[0], name=row[1], result=row[2]).save()
I'm starting to think that I may have to switch to MySQL or PostgreSQL for this.
Any ideas or tips? This is the first time I've dealt with such a huge amount of data. :)
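For what it's worth, the bottleneck in the loop above is likely one INSERT (and one transaction) per `.save()` call. A minimal sketch of batching the inserts instead, using the stdlib `sqlite3` module rather than the Django ORM (the `test_result` table name and columns here are assumptions for illustration):

```python
import csv
import io
import sqlite3

# In-memory database stands in for the real one; the test_result
# table and its schema are assumed purely for illustration.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE test_result (type TEXT, name TEXT, result TEXT)')

# Small stand-in for test_results.csv.
csv_data = io.StringIO('unit,test_foo,pass\nunit,test_bar,fail\n')

# executemany() inserts all rows in one batch inside a single
# transaction, instead of one INSERT and commit per row.
reader = csv.reader(csv_data)
conn.executemany('INSERT INTO test_result VALUES (?, ?, ?)', reader)
conn.commit()

count = conn.execute('SELECT COUNT(*) FROM test_result').fetchone()[0]
print(count)  # prints 2
```

If staying on the Django ORM, `TestResult.objects.bulk_create(...)` with chunks of rows should give a similar batching effect without raw SQL.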
python django mysql sqlite csv
Radianthex