Best way to store data locally in .NET (C#)

I am writing an application that takes user data and saves it locally for later use. The application will be started and stopped quite often, and I would like it to load data at startup and save it at shutdown.

It would be quite simple if I used flat files, since the data really does not need to be protected (it will be stored only on this PC). Possible options:

  • Flat files
  • XML
  • SQL DB

Flat files require a bit more support (there are no built-in classes for them, as there are for XML), but I have not used XML before, and SQL seems like overkill for this relatively simple task.

Are there any other options worth exploring? If not, which of these is the best solution?




Edit: To add a bit more detail to the problem, basically the only thing I need to save is a dictionary that looks like this:

Dictionary<string, List<Account>> 

where Account is a user-defined type.

Would I serialize the dictionary as the XML root, and then the Account type as attributes?




Update 2:

So, the dictionary can be serialized. The difficulty is that the value in this dictionary is itself a generic: a list of complex objects of type Account. Each Account is fairly simple, just a bunch of properties.

The idea, as I understand it, is to serialize to something like this:

 <Username1>
   <Account1>
     <Data1>data1</Data1>
     <Data2>data2</Data2>
   </Account1>
 </Username1>
 <Username2>
   <Account1>
     <Data1>data1</Data1>
     <Data2>data2</Data2>
   </Account1>
   <Account2>
     <Data1>data1</Data1>
     <Data2>data2</Data2>
   </Account2>
 </Username2>

As you can see, the hierarchy is:

  • Username (the dictionary's string key)
  • Account (each account in the list)
  • Account data (i.e., the class properties).

Getting this layout from Dictionary<Username, List<Account>> is the hard bit and the gist of this question.

There are plenty of answers here explaining how to serialize in general; that is my fault for not making it clearer early on, but now I am looking for a specific solution.

+53
c# xml data-storage
Dec 21 '09 at 19:00
19 answers

I would save the file as JSON. Since you are storing a dictionary, which is just a list of name/value pairs, this is pretty much what JSON was designed for. There are plenty of decent, free .NET JSON libraries out there; one is linked here, but you can find a full list at the first link.
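As a sketch of how that dictionary could round-trip through JSON (the answer does not name a specific library, so the choice of Json.NET here, and the Account properties, are my assumptions):

    using System.Collections.Generic;
    using Newtonsoft.Json;

    public class Account
    {
        // Placeholder properties; substitute the real Account members.
        public string Name { get; set; }
        public int Number { get; set; }
    }

    public static class JsonStore
    {
        // Serialize the whole dictionary to a JSON string
        // (write it to disk with File.WriteAllText).
        public static string Save(Dictionary<string, List<Account>> accounts)
        {
            return JsonConvert.SerializeObject(accounts, Formatting.Indented);
        }

        // Rebuild the dictionary from the JSON text read back from disk.
        public static Dictionary<string, List<Account>> Load(string json)
        {
            return JsonConvert.DeserializeObject<Dictionary<string, List<Account>>>(json);
        }
    }

The usernames become the JSON object's property names, and each account list becomes a JSON array, so no extra mapping code is needed.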

+20
Dec 21 '09 at 20:19

It really depends on what you are storing. If you are talking about structured data, then either XML or a very lightweight SQL RDBMS, such as SQLite or SQL Server Compact Edition, will serve you well. The SQL solution becomes especially compelling once the data grows beyond a trivial size.

If you are storing large chunks of relatively unstructured data (for example, binary objects such as images), then obviously neither a database nor XML is a good fit; but given your question, I would guess it is more the former than the latter.
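As a sketch of the SQL route, assuming the Microsoft.Data.Sqlite package (the answer names SQLite only generically, so the library, table, and column names here are my own):

    using Microsoft.Data.Sqlite;

    class SqliteSketch
    {
        static void Main()
        {
            using (var conn = new SqliteConnection("Data Source=accounts.db"))
            {
                conn.Open();

                // One row per account, keyed by username.
                var create = conn.CreateCommand();
                create.CommandText =
                    "CREATE TABLE IF NOT EXISTS Accounts (Username TEXT, Name TEXT, Number INTEGER)";
                create.ExecuteNonQuery();

                // Parameterized insert; never build INSERT statements by string concatenation.
                var insert = conn.CreateCommand();
                insert.CommandText =
                    "INSERT INTO Accounts (Username, Name, Number) VALUES ($u, $n, $num)";
                insert.Parameters.AddWithValue("$u", "Bob");
                insert.Parameters.AddWithValue("$n", "Checking");
                insert.Parameters.AddWithValue("$num", 1);
                insert.ExecuteNonQuery();
            }
        }
    }

Loading the dictionary back is then a SELECT grouped by Username.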

+21
Dec 21 '09 at 19:03

XML is easy to use, via serialization. Use isolated storage.

See also How to decide where to store state for each user? Registry? Application data? Isolated Storage?

    public class UserDB
    {
        // actual data to be preserved for each user
        public int A;
        public string Z;

        // metadata
        public DateTime LastSaved;
        public int eon;
        private string dbpath;

        public static UserDB Load(string path)
        {
            UserDB udb;
            try
            {
                var s = new System.Xml.Serialization.XmlSerializer(typeof(UserDB));
                using (System.IO.StreamReader reader = System.IO.File.OpenText(path))
                {
                    udb = (UserDB)s.Deserialize(reader);
                }
            }
            catch
            {
                udb = new UserDB();
            }
            udb.dbpath = path;
            return udb;
        }

        public void Save()
        {
            LastSaved = System.DateTime.Now;
            eon++;

            var s = new System.Xml.Serialization.XmlSerializer(typeof(UserDB));
            var ns = new System.Xml.Serialization.XmlSerializerNamespaces();
            ns.Add("", "");
            System.IO.StreamWriter writer = System.IO.File.CreateText(dbpath);
            s.Serialize(writer, this, ns);
            writer.Close();
        }
    }
+13
Dec 21 '09 at 19:02

All of the above are good answers and will generally solve the problem.

If you need a simple, free way to scale to millions of records, try the ESENT Managed Interface project on CodePlex.

ESENT is an embeddable database storage engine (ISAM) that is part of Windows. It provides reliable, transacted, concurrent, high-performance data storage with row-level locking, write-ahead logging, and snapshot isolation. This project is a managed wrapper for the ESENT Win32 API.

It has a PersistentDictionary object that is quite easy to use. Think of it as a regular Dictionary object, except that it is automatically loaded from and saved to disk, without extra code.

For example:

    /// <summary>
    /// Ask the user for their first name and see if we remember
    /// their last name.
    /// </summary>
    public static void Main()
    {
        PersistentDictionary<string, string> dictionary =
            new PersistentDictionary<string, string>("Names");

        Console.WriteLine("What is your first name?");
        string firstName = Console.ReadLine();
        if (dictionary.ContainsKey(firstName))
        {
            Console.WriteLine("Welcome back {0} {1}", firstName, dictionary[firstName]);
        }
        else
        {
            Console.WriteLine("I don't know you, {0}. What is your last name?", firstName);
            dictionary[firstName] = Console.ReadLine();
        }
    }

To answer George's question:

Supported Key Types

Only these types are supported as dictionary keys:

Boolean, Byte, Int16, UInt16, Int32, UInt32, Int64, UInt64, Float, Double, Guid, DateTime, TimeSpan, String

Supported Value Types

Dictionary values can be any of the key types, Nullable versions of the key types, Uri, IPAddress, or a serializable structure. A structure is only considered serializable if it meets all of these criteria:

  • The structure is marked as [Serializable]
  • Each member of the struct is one of:
    1. A primitive data type (for example, Int32)
    2. A String, Uri, or IPAddress
    3. A serializable structure

Or, in other words, a serializable structure cannot contain any references to class objects. This is done to preserve API consistency. Adding an object to a PersistentDictionary stores a copy of the object, made through serialization. Modifying the original object would not modify the copy, which would lead to confusing behavior. To avoid those problems, the PersistentDictionary only accepts value types as values.

Can be serialized:

    [Serializable]
    struct Good
    {
        public DateTime? Received;
        public string Name;
        public decimal Price;
        public Uri Url;
    }

Cannot be serialized:

    [Serializable]
    struct Bad
    {
        public byte[] Data;     // arrays aren't supported
        public Exception Error; // reference object
    }

+10
Dec 21 '09 at 19:41

I recommend XML for the files, because it is easy to serialize to.

Serialization in C#

Serialization (known as pickling in Python) is an easy way to convert an object to a binary representation that can then be, for example, written to disk or sent over the wire.

It is useful, for example, for saving settings to a file.

You can serialize your own classes if you mark them with the [Serializable] attribute. This serializes all members of the class, except those marked as [NonSerialized].

Below is the code showing how to do this:

    using System;
    using System.Collections.Generic;
    using System.Text;
    using System.Drawing;

    namespace ConfigTest
    {
        [Serializable()]
        public class ConfigManager
        {
            private string windowTitle = "Corp";
            private string printTitle = "Inventory";

            public string WindowTitle
            {
                get { return windowTitle; }
                set { windowTitle = value; }
            }

            public string PrintTitle
            {
                get { return printTitle; }
                set { printTitle = value; }
            }
        }
    }

Then, perhaps in a ConfigForm, you instantiate your ConfigManager class and serialize it:

    public ConfigForm()
    {
        InitializeComponent();
        cm = new ConfigManager();
        ser = new XmlSerializer(typeof(ConfigManager));
        LoadConfig();
    }

    private void LoadConfig()
    {
        try
        {
            if (File.Exists(filepath))
            {
                FileStream fs = new FileStream(filepath, FileMode.Open);
                cm = (ConfigManager)ser.Deserialize(fs);
                fs.Close();
            }
            else
            {
                MessageBox.Show("Could not find User Configuration File\n\nCreating new file...",
                                "User Config Not Found");
                FileStream fs = new FileStream(filepath, FileMode.CreateNew);
                TextWriter tw = new StreamWriter(fs);
                ser.Serialize(tw, cm);
                tw.Close();
                fs.Close();
            }
            setupControlsFromConfig();
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }
    }

Once it has been deserialized, you can access your configuration settings via cm.WindowTitle, and so on.

+8
Dec 21 '09 at 19:07

A fourth option, beyond those you mention, is binary files. Although that sounds arcane and difficult, it is very easy with the .NET serialization API.

Whether you choose binary or XML files, you can use the same serialization API, although you would use different serializers.

To serialize a class to binary, it must be marked with the [Serializable] attribute, or it must implement ISerializable.

You can do something similar with XML, although there the interface is called IXmlSerializable, and the attributes are [XmlRoot] and the others in the System.Xml.Serialization namespace.
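One wrinkle for the asker's case: XmlSerializer cannot serialize a Dictionary directly, so a common workaround is to copy the dictionary into a serializable list of entries first. A sketch, where the UserAccounts wrapper and the Account members are my own inventions:

    using System.Collections.Generic;
    using System.IO;
    using System.Linq;
    using System.Xml.Serialization;

    public class Account
    {
        // Placeholder members; substitute the real Account properties.
        public string Name;
        public int Number;
    }

    public class UserAccounts
    {
        [XmlAttribute]
        public string Username;             // the dictionary key
        public List<Account> Accounts = new List<Account>();
    }

    public static class XmlStore
    {
        // Copy the dictionary into a serializable list, then let XmlSerializer do the rest.
        public static void Save(Dictionary<string, List<Account>> dict, string path)
        {
            var entries = dict
                .Select(kvp => new UserAccounts { Username = kvp.Key, Accounts = kvp.Value })
                .ToList();
            var ser = new XmlSerializer(typeof(List<UserAccounts>));
            using (var writer = File.CreateText(path))
                ser.Serialize(writer, entries);
        }

        public static Dictionary<string, List<Account>> Load(string path)
        {
            var ser = new XmlSerializer(typeof(List<UserAccounts>));
            using (var reader = File.OpenText(path))
            {
                var entries = (List<UserAccounts>)ser.Deserialize(reader);
                return entries.ToDictionary(e => e.Username, e => e.Accounts);
            }
        }
    }

The resulting XML nests one element per username, each containing its list of accounts, which is close to the layout sketched in the question.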

If you want to use a relational database, SQL Server Compact Edition is free, very lightweight, and based on a single file.

+7
Dec 21 '09 at 19:04

If your collection gets too big, I have found that XML serialization gets pretty slow. Another option for serializing your dictionary is to "roll your own" using BinaryReader and BinaryWriter.

Here is some sample code to get you started. You can turn these into generic extension methods that handle any kind of dictionary; that works quite well, but is too verbose to post here.

    class Account
    {
        public string AccountName { get; set; }
        public int AccountNumber { get; set; }

        internal void Serialize(BinaryWriter bw)
        {
            // Add logic to serialize everything you need here.
            // Keep in sync with Deserialize.
            bw.Write(AccountName);
            bw.Write(AccountNumber);
        }

        internal void Deserialize(BinaryReader br)
        {
            // Add logic to deserialize everything you need here.
            // Keep in sync with Serialize.
            AccountName = br.ReadString();
            AccountNumber = br.ReadInt32();
        }
    }

    class Program
    {
        static void Serialize(string OutputFile)
        {
            // Write to disk
            using (Stream stream = File.Open(OutputFile, FileMode.Create))
            {
                BinaryWriter bw = new BinaryWriter(stream);

                // Save number of entries
                bw.Write(accounts.Count);

                foreach (KeyValuePair<string, List<Account>> accountKvp in accounts)
                {
                    // Save each key/value pair
                    bw.Write(accountKvp.Key);
                    bw.Write(accountKvp.Value.Count);
                    foreach (Account account in accountKvp.Value)
                    {
                        account.Serialize(bw);
                    }
                }
            }
        }

        static void Deserialize(string InputFile)
        {
            accounts.Clear();

            // Read from disk
            using (Stream stream = File.Open(InputFile, FileMode.Open))
            {
                BinaryReader br = new BinaryReader(stream);
                int entryCount = br.ReadInt32();

                for (int entries = 0; entries < entryCount; entries++)
                {
                    // Read in the key/value pairs
                    string key = br.ReadString();
                    int accountCount = br.ReadInt32();
                    List<Account> accountList = new List<Account>();
                    for (int i = 0; i < accountCount; i++)
                    {
                        Account account = new Account();
                        account.Deserialize(br);
                        accountList.Add(account);
                    }
                    accounts.Add(key, accountList);
                }
            }
        }

        static Dictionary<string, List<Account>> accounts =
            new Dictionary<string, List<Account>>();

        static void Main(string[] args)
        {
            string accountName = "Bob";
            List<Account> newAccounts = new List<Account>();
            newAccounts.Add(AddAccount("A", 1));
            newAccounts.Add(AddAccount("B", 2));
            newAccounts.Add(AddAccount("C", 3));
            accounts.Add(accountName, newAccounts);

            accountName = "Tom";
            newAccounts = new List<Account>();
            newAccounts.Add(AddAccount("A1", 11));
            newAccounts.Add(AddAccount("B1", 22));
            newAccounts.Add(AddAccount("C1", 33));
            accounts.Add(accountName, newAccounts);

            string saveFile = @"C:\accounts.bin";
            Serialize(saveFile);

            // clear it out to prove it works
            accounts.Clear();
            Deserialize(saveFile);
        }

        static Account AddAccount(string AccountName, int AccountNumber)
        {
            Account account = new Account();
            account.AccountName = AccountName;
            account.AccountNumber = AccountNumber;
            return account;
        }
    }
+5
Dec 21 '09 at 20:52

I just finished coding the data storage for my current project. Here are my five cents.

I started with binary serialization. It was slow (about 30 seconds to load 100,000 objects), and it also produced a fairly large file on disk. However, it took only a few lines of code to implement, and it covered all of my storage needs. To get better performance, I moved to custom serialization, using the FastSerialization framework by Tim Haines on The Code Project. Indeed, it is several times faster (12 seconds to load, 8 seconds to save, for 100K records), and it takes up less disk space. The framework is based on the technique described by GalacticJello in a previous post.

Then I switched to SQLite and got another two to three times speedup: 6 seconds to load and 4 seconds to save 100K records. That includes parsing the ADO.NET tables into application types. It also gave me a much smaller file on disk. This article explains how to get the best performance out of ADO.NET with SQLite: http://sqlite.phxsoftware.com/forums/t/134.aspx . Generating INSERT statements as strings is a very bad idea; you can guess how I found that out. :) Mind you, the SQLite implementation took quite a bit of time, plus careful measurement of the time taken by practically every line of code.

+5
Dec 23 '09 at 20:34

If your data is complex or large, or you need to query it locally, then object databases may be a valid option. I suggest looking at Db4o or Karvonite.

+4
Dec 21 '09 at 19:40

The first thing I would look at is a database. However, serialization is also an option. If you go for binary serialization, I would avoid BinaryFormatter: it has a tendency to get upset between versions if you change fields, etc. XML via XmlSerializer would be fine, and can sit side by side (i.e., with the same class definitions) with protobuf-net if you want to try contract-based binary serialization (without any extra effort).

+4
Dec 21 '09 at 20:35

Many of the answers in this thread attempt to over-engineer the solution. If I'm reading you right, you just want to save user settings.

For that, use an .ini file or an App.config file.
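As a minimal sketch of the App.config route (the key names here are made up), an appSettings section might look like:

    <configuration>
      <appSettings>
        <add key="WindowTitle" value="Corp" />
        <add key="LastUser" value="Bob" />
      </appSettings>
    </configuration>

It can then be read back with ConfigurationManager.AppSettings["WindowTitle"] (from System.Configuration).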

If I'm mistaken and you are saving data that is more than just settings, use a flat text file in CSV format. It is quick and easy, without the overhead of XML. People love to poo-poo these since they are not as elegant and don't look as good on a resume, but it may be the best solution for you, depending on what you need.

+3
Dec 21 '09 at 19:37

I have made several "standalone" applications with local data storage. I found it best to use SQL Server Compact Edition (formerly known as SQL Anywhere).

It is lightweight and free. Plus, you can stick to writing a data access layer that is reusable in other projects; and if the application ever needs to scale up to a full-size SQL Server, you only have to change the connection string.

+2
Dec 21 '09 at 19:17

My first inclination is an Access database. The .mdb files are stored locally and can be encrypted if that is deemed necessary. Though XML or JSON would also work for many scenarios. Flat files I would use only for read-only, non-search information. I tend to prefer CSV format over fixed width.

0
Dec 21 '09 at 19:03

It depends on the amount of data you want to store. In practice, there is not much difference between flat files and XML; XML is likely preferable because it gives the document structure.

A final option, which many applications now use, is the Windows registry. I personally do not recommend it (registry bloat, corruption, and other potential problems), but it is an option.

0
Dec 21 '09 at 19:06

Without knowing what your data looks like (its complexity, size, etc.): XML is easy to maintain and easy to access. I would NOT use an Access database, and flat files are harder to maintain over the long haul, particularly if you are dealing with more than one data field/element per file.

I deal with large flat-file data feeds in good quantities daily, and even at that extreme, flat-file data is much harder to maintain than the XML data feeds I process.

A simple example of loading XML data into a dataset using C #:

    DataSet reportData = new DataSet();
    reportData.ReadXml(fi.FullName);

You can also check out LINQ to XML as an option for querying XML data...

HTH...

0
Dec 21 '09 at 19:15

If you go the binary serialization route, consider how quickly a particular member of the data needs to be accessed. If it is only a small collection, loading the whole file will make sense; but if it will be large, you might also consider an index file.

Tracking account properties/fields located at specific addresses in the file can help you speed up access time, especially if you optimize that index file based on key usage (maybe even as you write to disk).

0
Dec 21 '09 at 19:37

Depending on the complexity of your Account object, I would recommend either XML or a flat file.

If there are only a few values for each account, you could store them in a properties file, like this:

    account.1.somekey=Some value
    account.1.someotherkey=Some other value
    account.1.somedate=2009-12-21
    account.2.somekey=Some value 2
    account.2.someotherkey=Some other value 2

... and so on. Reading from a properties file should be simple, as it maps directly to a string dictionary.
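A sketch of that mapping (the helper name and the skip-malformed-lines policy are my choices, not part of the answer):

    using System.Collections.Generic;

    public static class PropertiesFile
    {
        // Parse "key=value" lines into a string dictionary.
        // Lines without '=' are skipped; values may themselves contain '='.
        public static Dictionary<string, string> Parse(IEnumerable<string> lines)
        {
            var result = new Dictionary<string, string>();
            foreach (string line in lines)
            {
                int eq = line.IndexOf('=');
                if (eq <= 0) continue; // skip blank or malformed lines
                result[line.Substring(0, eq).Trim()] = line.Substring(eq + 1).Trim();
            }
            return result;
        }
    }

Feed it File.ReadAllLines(path) to load the file from disk.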

As for where to store this file, the best choice would be the AppData folder, inside a subfolder for your program. This is a location where the current user will always have write access, and it is protected from other users by the OS itself.

0
Dec 21 '09 at 19:44

Keep it simple: as you said, a flat file is sufficient. Use a flat file.

That assumes you have analyzed your requirements correctly. I would skip the serialize-to-XML step; it is overkill for a simple dictionary. The same goes for a database.

0
Dec 21 '09 at 20:23

In my experience, in most cases JSON in a file is enough (mostly you need to store an array, an object, or just a single number or string). I rarely need SQLite (which takes more time to set up and use; most of the time it is overkill).

0
May 12 '17 at 7:05 a.m.


