Efficient way to read BLOB data in C# / SQL Server 2005

What is the most memory-efficient way to read SQL Server 2005 image (BLOB) fields using C# 3.5?

Right now I have (byte[])cm.ExecuteScalar("...") .

Ideally I would avoid reading the entire contents of the field into memory at once.

+6
sql sql-server tsql sql-server-2005
2 answers

See this article or this blog post for a detailed explanation of how to do this.

Basically, you need to use a SqlDataReader and specify CommandBehavior.SequentialAccess when creating it; then you can read (or write) the BLOB from the database in chunks of whatever size suits you best.

Basically something like:

    int bufferSize = 8040;                   // chunk size; declaration added for completeness
    byte[] outbyte = new byte[bufferSize];   // buffer to read into; declaration added for completeness

    SqlDataReader myReader = getEmp.ExecuteReader(CommandBehavior.SequentialAccess);

    while (myReader.Read())
    {
        long startIndex = 0;

        // Read the bytes into outbyte[] and retain the number of bytes returned.
        long retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);

        // Continue reading and writing while there are bytes beyond the size of the buffer.
        while (retval == bufferSize)
        {
            // write the buffer to the output, e.g. a file
            // ...

            // Reposition the start index to the end of the last buffer and refill the buffer.
            startIndex += bufferSize;
            retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
        }

        // write the last "retval" bytes of the buffer to the output, e.g. a file
        // ...
    }

    // Close the reader and the connection.
    myReader.Close();

Mark

+4

The trick here is to use ExecuteReader in sequential mode and read the data from the IDataReader in chunks. The CLOB and BLOB versions are almost identical; the BLOB version below uses byte[] and GetBytes(...) .

Something like:

    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        byte[] buffer = new byte[8040]; // or some multiple (SQL Server page size)

        while (reader.Read()) // each row
        {
            long dataOffset = 0, read;
            while ((read = reader.GetBytes(colIndex, dataOffset, buffer, 0, buffer.Length)) > 0)
            {
                // TODO: process "read"-many bytes from "buffer"
                dataOffset += read;
            }
        }
    }
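To show how the chunked read above fits together end to end, here is a minimal sketch that streams a BLOB column straight into a file without ever holding the whole value in memory. The connection string, table name (Employees), column name (Photo), and parameter are placeholders invented for illustration, not from the original question.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

// Hypothetical end-to-end sketch: table/column/connection names are
// placeholders, assumed for illustration only.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT Photo FROM Employees WHERE EmployeeId = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", employeeId);
    conn.Open();

    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    using (var output = File.Create(outputPath))
    {
        byte[] buffer = new byte[8040]; // multiple of the SQL Server page size

        while (reader.Read())
        {
            long offset = 0, read;
            // GetBytes returns 0 when the column is exhausted.
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, (int)read); // only the bytes actually read
                offset += read;
            }
        }
    }
}
```

Because SequentialAccess is specified, columns must be read in order and each only once, so the BLOB column should come last in the SELECT list if other columns are also needed.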
+2
