Import a CSV file into an SQLite3 database at the command line or via a batch file

I would like to ask: is there any way to import a CSV file, produced by my SELECT statements in SQLite3, into a new database? Below is the code I have so far:

sqlite3.exe -csv logsql.sqlite "SELECT local_port AS port, COUNT(local_port) AS hitcount FROM connections  WHERE connection_type = 'accept' GROUP BY local_port ORDER BY hitcount DESC;" > output.csv
sqlite3.exe -csv test.sqlite "CREATE TABLE test (name varchar(255) not null, blah varchar(255) not null);" .import ./output.csv test

As you can see, the first command dumps the query results to a CSV file.

With the second command, I'm trying to create a new database and import the CSV file into the table "test".

Thanks in advance for any help! :D

5 answers

You can put the CREATE TABLE statement and the import dot-commands into a script file and redirect it into the sqlite3 shell:
sqlite3.exe test.sqlite < import.sql

import.sql:

CREATE TABLE test (name varchar(255) not null, blah varchar(255) not null);
.separator ,
.import output.csv test

If the source and the destination are both SQLite databases, you can skip the CSV round-trip entirely by using ATTACH: attach the second database and copy the rows directly with either CREATE TABLE ... AS SELECT ... or a plain INSERT.

For example, issuing the statements from a program (the original used PHP):

"ATTACH 'c:\directory\to\database\test.db' as TESTDB;"
"CREATE TABLE TESTDB.test AS SELECT local_port AS port, COUNT(local_port) AS hitcount FROM connections  WHERE connection_type = 'accept' GROUP BY local_port ORDER BY hitcount DESC;"

Or, creating the table explicitly first and then filling it:

"ATTACH 'c:\directory\to\database\test.db' as TESTDB;"
"CREATE TABLE TESTDB.test (name varchar(255) not null, blah varchar(255) not null);"
"INSERT INTO TESTDB.test SELECT local_port AS port, COUNT(local_port) AS hitcount FROM connections  WHERE connection_type = 'accept' GROUP BY local_port ORDER BY hitcount DESC;"
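
The same ATTACH trick can be driven from Python's sqlite3 module instead of PHP. A minimal sketch; the `connections` schema and sample rows below are stand-ins invented to make it runnable:

```python
import sqlite3

# Build a small stand-in for logsql.sqlite in memory (the real file and its
# schema are assumptions based on the question).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE connections (local_port INTEGER, connection_type TEXT)")
src.executemany("INSERT INTO connections VALUES (?, ?)",
                [(80, 'accept'), (80, 'accept'), (22, 'accept'), (22, 'drop')])

# Attach the target database and copy the aggregated rows straight across;
# no CSV round-trip needed.
src.execute("ATTACH ':memory:' AS TESTDB")
src.execute("""
    CREATE TABLE TESTDB.test AS
    SELECT local_port AS port, COUNT(local_port) AS hitcount
    FROM connections
    WHERE connection_type = 'accept'
    GROUP BY local_port
    ORDER BY hitcount DESC
""")
rows = src.execute("SELECT port, hitcount FROM TESTDB.test").fetchall()
print(rows)  # [(80, 2), (22, 1)]
```

In a real script you would connect to logsql.sqlite and attach test.sqlite by path instead of using `:memory:`.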

To drive the sqlite3 shell's .import command from Python, you can shell out with os.system (this works on Linux, Unix, Mac OS X, and under Cygwin on Windows):

cmd = '(echo .separator ,; echo .import ' + csv_file + ' ' + table + ')'
cmd += '| sqlite3 ' + db_name
os.system(cmd)
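
Building the pipeline by string concatenation breaks as soon as a file name contains shell metacharacters. A sketch of a subprocess-based variant (the helper names are mine) that passes the dot-commands on stdin and avoids the shell entirely:

```python
import subprocess

def import_script(csv_file, table):
    # The dot-commands the sqlite3 shell should run.
    return ".separator ,\n.import {0} {1}\n".format(csv_file, table)

def import_csv(db_name, csv_file, table):
    # No shell involved: the argument list goes straight to the sqlite3
    # binary and the commands arrive on its stdin.
    subprocess.run(["sqlite3", db_name], input=import_script(csv_file, table),
                   text=True, check=True)
```

Note that .import itself still parses its arguments, so a path containing spaces would need quoting inside the dot-command as well.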

The SQLite command-line shell can do this, but scripting it cleanly on Windows is awkward.

Both Perl and Python have SQLite bindings, and both run fine on Windows.

That would be my choice here.


If the CSV is large, or the sqlite3 shell's .import chokes on it, a small Python script can load the CSV instead, creating the table from the CSV header row:

#!/usr/bin/env python
import sqlite3
from csv import DictReader

class SQLiteDB():
    def __init__(self, dbname=':memory:'):
        self.db=sqlite3.connect(dbname)

    def importFromCSV(self, csvfilename, tablename, separator=","):
        with open(csvfilename, 'r') as fh:
            dr = DictReader(fh, delimiter=separator)
            # Build the column list and a matching "?,?,..." placeholder
            # string from the CSV header row.
            fieldlist = ",".join(dr.fieldnames)
            ph = ",".join("?" * len(dr.fieldnames))
            self.db.execute("DROP TABLE IF EXISTS %s" % tablename)
            self.db.execute("CREATE TABLE %s(%s)" % (tablename, fieldlist))
            ins = "insert into %s (%s) values (%s)" % (tablename, fieldlist, ph)
            for line in dr:
                self.db.execute(ins, [line[k] for k in dr.fieldnames])
        self.db.commit()

if __name__ == '__main__':
    db=SQLiteDB("mydatabase.sqlite")
    db.importFromCSV("mydata.csv", "mytable")

When importing large amounts of data, batch the inserts into transactions instead of committing row by row.
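
A sketch of that batching advice (the helper name is mine): `executemany` runs every insert inside one implicit transaction that is committed once at the end, which is much faster than committing per row:

```python
import sqlite3
from csv import DictReader

def bulk_import(db, csvfilename, tablename):
    # Same idea as importFromCSV above, but all rows go through a single
    # executemany call inside one implicit transaction.
    with open(csvfilename, "r") as fh:
        dr = DictReader(fh)
        fields = ",".join(dr.fieldnames)
        ph = ",".join("?" * len(dr.fieldnames))
        db.execute("CREATE TABLE IF NOT EXISTS %s(%s)" % (tablename, fields))
        db.executemany("INSERT INTO %s (%s) VALUES (%s)" % (tablename, fields, ph),
                       ([row[k] for k in dr.fieldnames] for row in dr))
    db.commit()
```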

Hth


The one-liner that worked for me, feeding the commands to sqlite3 through a bash here-document:

sqlite3 inventory.sqlite.db << EOF
delete from audit;
.separator "\t"
.import audit-sorted-uniq.tsv audit
EOF

Hope this helps.

