Quite a while ago (2006!), I bumped into an issue when copying large chunks of data to a network share.
I posted it to Google and mentioned that breaking the data up into smaller blocks worked, but I never had the time to post the solution.
So here it is :-)
First, the problem:
The old code consistently fails when:
- the FileStream is on a network
- and the MemoryStream is 64 megabytes or larger
The old code succeeds when:
- the MemoryStream is smaller than 64 megabytes
- or the FileStream is not on a network
An example of the exception message you get upon failure:
System.IO.IOException: Insufficient system resources exist to complete the requested service.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.WriteCore(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.FileStream.Write(Byte[] array, Int32 offset, Int32 count)
at System.IO.BufferedStream.Write(Byte[] array, Int32 offset, Int32 count)
...
Then the old code:
public static void WriteMemoryStreamToFile(string filename, MemoryStream memory)
{
    using (Stream
        file = new FileStream(filename, FileMode.OpenOrCreate, FileAccess.ReadWrite),
        fileBuffer = new BufferedStream(file)
    )
    {
        byte[] memoryBuffer = memory.GetBuffer();
        int memoryLength = (int)memory.Length;
        fileBuffer.Write(memoryBuffer, 0, memoryLength); //##jpl: drawback: works only up to 2 gigabyte!
        fileBuffer.Close();
        file.Close();
    }
}
Note that the old code already has a 2 gigabyte limitation (Length is cast to an int); back then this was not an issue, since in 2006 not that many people had more than 2 gigabytes of memory. Now that is a different story.
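The fix that worked for me was the one I alluded to above: copy the data in smaller blocks instead of handing the whole buffer to FileStream.Write in one call. Below is a minimal sketch of that idea; the method name and the 64 KB block size are illustrative choices, not taken verbatim from my original solution:

public static void WriteMemoryStreamToFileInBlocks(string filename, MemoryStream memory)
{
    const int blockSize = 64 * 1024; // 64 KB per Write call, far below the size that triggers the error

    using (Stream file = new FileStream(filename, FileMode.OpenOrCreate, FileAccess.ReadWrite))
    {
        byte[] block = new byte[blockSize];
        memory.Position = 0; // start reading the MemoryStream from the beginning

        int bytesRead;
        while ((bytesRead = memory.Read(block, 0, block.Length)) > 0)
        {
            file.Write(block, 0, bytesRead); // write only the bytes actually read
        }
    }
}

Writing in blocks this small keeps each individual FileStream.Write well below the 64 megabyte threshold that triggered the IOException on the network share.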