Fastest way to read data from a SQLite database?

I have a local SQLite database

PARTS TABLE

    -- Describe PREFIX_LIST
    CREATE TABLE PREFIX_LIST(ITEM VARCHAR(25) PRIMARY KEY)

    -- Describe SUFFIX_LIST
    CREATE TABLE SUFFIX_LIST(ITEM VARCHAR(25) PRIMARY KEY)

    -- Describe VALID_LIST
    CREATE TABLE VALID_LIST (
        "PART1" TEXT,
        "PART2" TEXT,
        PRIMARY KEY(PART1, PART2)
    )

Now this list is really huge, and I need to save data from it.

Here is my implementation.

    SQLiteConnection con = null;
    SQLiteCommand cmd = null;
    Connect(DbPath, ref con, ref cmd);
    cmd.CommandText =
        "SELECT PART1 || '@' || PART2 FROM VALID_LIST " +
        "WHERE NOT EXISTS (SELECT * FROM PREFIX_LIST WHERE VALID_LIST.PART1 LIKE '%' || ITEM || '%') " +
        "AND NOT EXISTS (SELECT * FROM SUFFIX_LIST WHERE VALID_LIST.PART2 LIKE '%' || ITEM || '%')";
    var reader = cmd.ExecuteReader();
    if (reader.HasRows)
    {
        string savePath;
        if (SaveTextFile(out savePath) == DialogResult.OK)
        {
            TextWriter writer = new StreamWriter(savePath);
            while (reader.Read())
            {
                writer.WriteLine(reader.GetString(0));
            }
            writer.Close();
            writer.Dispose();
        }
    }
    reader.Close();
    reader.Dispose();
    cmd.Dispose();
    con.Close();
    con.Dispose();
    MessageBox.Show("List Saved!", Application.ProductName,
        MessageBoxButtons.OK, MessageBoxIcon.Information);

I would like this to be faster. VALID_LIST contains 2,639,117 records,

and saving the output of the above SQL query took 15 minutes!

Please let me know if the SQL query can be optimized!

Thanks in advance

+6
3 answers

LIKE queries are generally very slow unless the pattern has a fixed (wildcard-free) prefix. A predicate such as LIKE '%foo%' cannot be answered with an ordinary B-tree index, so every row must be scanned.
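You can see this with a quick sketch using Python's built-in sqlite3 module (the table and data here are made up for illustration): a prefix pattern can use the index, while a substring pattern forces a full scan.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# An indexed TEXT column (the PRIMARY KEY gives it a B-tree index).
con.execute("CREATE TABLE t(item TEXT PRIMARY KEY)")
# The LIKE index optimization requires case-sensitive LIKE
# (or a COLLATE NOCASE column).
con.execute("PRAGMA case_sensitive_like = ON")

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry a human-readable 'detail' column.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

prefix_plan = plan("SELECT * FROM t WHERE item LIKE 'foo%'")   # can use the index
substr_plan = plan("SELECT * FROM t WHERE item LIKE '%foo%'")  # must scan every row

print(prefix_plan)  # e.g. "SEARCH t USING COVERING INDEX ..."
print(substr_plan)  # e.g. "SCAN t"
```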

However, you can replace the heavy use of LIKE in SQLite with its full-text search (FTS) feature.

The FTS3 and FTS4 extension modules allow users to create special tables with built-in full-text index (hereinafter referred to as “FTS tables”). The full-text index allows the user to efficiently query the database for all rows containing one or more words (hereinafter “tokens”), even if the table contains many large documents.

The documentation includes an example that looks promising, performance-wise, for your use case:

    CREATE VIRTUAL TABLE enrondata1 USING fts3(content TEXT);     /* FTS3 table */
    CREATE TABLE enrondata2(content TEXT);                        /* Ordinary table */

    SELECT count(*) FROM enrondata1 WHERE content MATCH 'linux';  /* 0.03 seconds */
    SELECT count(*) FROM enrondata2 WHERE content LIKE '%linux%'; /* 22.5 seconds */
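Here is a miniature, runnable sketch of that comparison (toy data, so no meaningful timings; it assumes an SQLite build with the FTS3 module compiled in):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE enrondata1 USING fts3(content TEXT)")  # FTS3 table
con.execute("CREATE TABLE enrondata2(content TEXT)")                     # ordinary table

docs = [("linux kernel notes",), ("windows driver model",), ("gnu linux howto",)]
con.executemany("INSERT INTO enrondata1(content) VALUES (?)", docs)
con.executemany("INSERT INTO enrondata2(content) VALUES (?)", docs)

# MATCH uses the full-text index; LIKE '%...%' scans every row.
(n_match,) = con.execute(
    "SELECT count(*) FROM enrondata1 WHERE content MATCH 'linux'").fetchone()
(n_like,) = con.execute(
    "SELECT count(*) FROM enrondata2 WHERE content LIKE '%linux%'").fetchone()
print(n_match, n_like)  # both queries find the same 2 rows
```

On tiny data the two agree; the difference only shows up at scale, where MATCH avoids the full scan.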
+6

Consider using full-text search.

For this to work, the values in PART1 and PART2 must be tokenizable (made up of separate words), and the ITEM you are trying to match must appear as a whole token in one of those values (not as part of a word, or as two words run together). For example, the values in PART1 and PART2 should look like "RED BLUE GREEN" or "DOG, CAT, CAPYBARA", and the values for ITEM should be RED, BLUE, GREEN, DOG, CAT, or CAPYBARA.

If these conditions are met, you can enable full-text search, recreate VALID_LIST as a full-text table, and replace LIKE (and its wildcards) with MATCH. SQLite will then maintain an index on every token found in PART1 or PART2, and this part of the search will be much faster.
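A sketch of how that rewrite might look on the schema from the question, using Python's sqlite3 for brevity. The VALID_FTS shadow table and the toy data are my own invention, and it assumes an SQLite build with FTS4; note that MATCH finds whole tokens, not substrings, which is exactly the condition described above.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE PREFIX_LIST(ITEM TEXT PRIMARY KEY);
CREATE TABLE SUFFIX_LIST(ITEM TEXT PRIMARY KEY);
CREATE TABLE VALID_LIST(PART1 TEXT, PART2 TEXT, PRIMARY KEY(PART1, PART2));
-- hypothetical FTS shadow copy of VALID_LIST, kept in sync by rowid
CREATE VIRTUAL TABLE VALID_FTS USING fts4(PART1, PART2);

INSERT INTO PREFIX_LIST VALUES ('RED');
INSERT INTO SUFFIX_LIST VALUES ('CAT');
INSERT INTO VALID_LIST VALUES ('RED DOG','BIRD'),('BLUE DOG','CAT FISH'),('GREEN','BIRD');
INSERT INTO VALID_FTS(rowid, PART1, PART2) SELECT rowid, PART1, PART2 FROM VALID_LIST;
""")

# Build one MATCH expression per list (assumes both lists are non-empty).
prefixes = " OR ".join(item for (item,) in con.execute("SELECT ITEM FROM PREFIX_LIST"))
suffixes = " OR ".join(item for (item,) in con.execute("SELECT ITEM FROM SUFFIX_LIST"))

rows = con.execute("""
    SELECT PART1 || '@' || PART2 FROM VALID_LIST
    WHERE rowid NOT IN (SELECT rowid FROM VALID_FTS WHERE PART1 MATCH ?)
      AND rowid NOT IN (SELECT rowid FROM VALID_FTS WHERE PART2 MATCH ?)
""", (prefixes, suffixes)).fetchall()
print(rows)  # only GREEN@BIRD survives: no whole-token RED in PART1, no CAT in PART2
```

Both NOT IN subqueries are answered from the full-text index rather than by running a LIKE scan per ITEM per row.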

Unfortunately, enabling FTS in SQLite can require compiling it from source with one or more compile-time flags set. I have no experience with this.
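That said, many prebuilt SQLite binaries already ship with FTS enabled, so it is worth probing at runtime before recompiling anything. A small sketch (the probe table name is arbitrary):

```python
import sqlite3

def fts_available(module="fts4"):
    """Return True if this SQLite build can create tables with the given FTS module."""
    con = sqlite3.connect(":memory:")
    try:
        con.execute(f"CREATE VIRTUAL TABLE fts_probe USING {module}(content)")
        return True
    except sqlite3.OperationalError:  # "no such module: ..."
        return False
    finally:
        con.close()

print(fts_available("fts4"), fts_available("no_such_module"))
```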

+2

I'm not sure if this is what you want, but it should speed up the writing part. Try buffering the rows you read from the database in a StringBuilder and writing them to the file in batches. For example, read 100k rows, then write all 100k to the file at once.

    StringBuilder builder = new StringBuilder();
    int count = 0; // limits the number of rows buffered in the StringBuilder
    while (reader.Read())
    {
        builder.AppendLine(reader.GetString(0));
        count++;
        // Write every 100k or so rows at once. This number depends on how much
        // RAM you can spare for buffering: with 2 GB of free RAM it could
        // easily be 1 million, but it always depends on the size of each
        // string stored in the database.
        if (count == 100000)
        {
            File.AppendAllText(path, builder.ToString()); // append the buffered rows to the file
            builder.Clear();  // clear the buffer for the next 100k rows
            count = 0;        // reset the counter
        }
    }
    if (builder.Length > 0)
    {
        File.AppendAllText(path, builder.ToString()); // flush any remaining rows
    }

Let me know if it helped.

0

Source: https://habr.com/ru/post/926886/

