I have a tab-delimited .txt file with 500K entries. I am using the code below to read the data into a DataSet. It works fine with 50K entries, but with 500K it throws a System.OutOfMemoryException.
What is a more efficient way to read large tab-delimited data, or how can I solve this problem? Please give me an example.
public DataSet DataToDataSet(string fullpath, string file)
{
    // Jet text driver: fullpath is the directory, file is the file name (used as the table name).
    string sql = "SELECT * FROM " + file;
    string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + fullpath + ";"
        + "Extended Properties=\"text;HDR=YES;FMT=Delimited\"";
    // using blocks ensure the connection and adapter are disposed even if Fill throws.
    using (OleDbConnection connection = new OleDbConnection(connStr))
    using (OleDbDataAdapter ole = new OleDbDataAdapter(sql, connection))
    {
        DataSet dataset = new DataSet();
        ole.Fill(dataset);   // loads the entire file into memory at once
        return dataset;
    }
}
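One way to avoid the memory spike is to not materialize all 500K rows in a DataSet at all, and instead stream the file line by line, processing each row as it is read. Below is a minimal sketch of that approach using StreamReader; the `processRow` callback and the assumption that the first line is a header (matching HDR=YES above) are illustrative, not part of the original code.

```csharp
using System;
using System.IO;

public static class TabFileReader
{
    // Streams a tab-delimited file one row at a time instead of
    // loading everything into memory. processRow receives the split
    // fields of each data row.
    public static void StreamTabFile(string fullPath, Action<string[]> processRow)
    {
        using (var reader = new StreamReader(fullPath))
        {
            // Skip the header row (equivalent to HDR=YES in the Jet connection string).
            reader.ReadLine();

            string line;
            while ((line = reader.ReadLine()) != null)
            {
                string[] fields = line.Split('\t');  // tab-delimited
                processRow(fields);                  // handle one row at a time
            }
        }
    }
}
```

Because only one line is held in memory at a time, this scales to files far larger than 500K rows; the trade-off is that you give up the ad-hoc SQL querying that the Jet text driver provides.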