How to read large file from SQL Server?

I am trying to read a file (650 megabytes) from SQL Server:

using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    if (reader.Read())
    {
        using (var dbStream = reader.GetStream(0))
        {
            if (!reader.IsDBNull(0))
            {
                stream.Position = 0;
                dbStream.CopyTo(stream, 256);
            }

            dbStream.Close();
        }
    }

    reader.Close();
}

But I get an OutOfMemoryException on CopyTo().

This snippet works fine for small files. How can I handle a large file?

You can read and write the data in small chunks to a temporary file. You can see an example on MSDN - Retrieving Binary Data.
//Column Index in the result set
const int colIdx = 0;

// Writes the BLOB to a file (*.bmp).  
FileStream stream;                            
// Streams the BLOB to the FileStream object.  
BinaryWriter writer;                          

// Size of the BLOB buffer.  
int bufferSize = 100;                     
// The BLOB byte[] buffer to be filled by GetBytes.  
byte[] outByte = new byte[bufferSize];    
// The bytes returned from GetBytes.  
long retval;                              
// The starting position in the BLOB output.  
long startIndex = 0;                      

// Open the connection and read data into the DataReader.  
connection.Open();  
SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess);  

while (reader.Read())  
{  

  // Create a file to hold the output.  
  stream = new FileStream(  
    "some-physical-file-name-to-dump-data.bmp", FileMode.OpenOrCreate, FileAccess.Write);  
  writer = new BinaryWriter(stream);  

  // Reset the starting byte for the new BLOB.  
  startIndex = 0;  

  // Read bytes into outByte[] and retain the number of bytes returned.  
  retval = reader.GetBytes(colIdx, startIndex, outByte, 0, bufferSize);  

  // Continue while there are bytes beyond the size of the buffer.  
  while (retval == bufferSize)  
  {  
    writer.Write(outByte);  
    writer.Flush();  

    // Reposition start index to end of last buffer and fill buffer.  
    startIndex += bufferSize;  
    retval = reader.GetBytes(colIdx, startIndex, outByte, 0, bufferSize);  
  }  

  // Write the remaining buffer.  
  writer.Write(outByte, 0, (int)retval);  
  writer.Flush();  

  // Close the output file.  
  writer.Close();  
  stream.Close();  
}  

// Close the reader and the connection.  
reader.Close();  
connection.Close();

Make sure you use the SqlDataReader with CommandBehavior.SequentialAccess; note this line in the snippet above:

 SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess);  

More information on the CommandBehavior enum can be found here.

Edit:

Let me clarify. I agree with @MickyD: the cause of the problem is not whether you use CommandBehavior.SequentialAccess, but reading the large file in one go.

I emphasized this because developers often overlook it: they tend to read files in chunks, but without setting CommandBehavior.SequentialAccess they will run into other problems. Although it was already present in the original question, I highlighted it in my answer for any newcomers.

@MatthewWatson yeah var stream = new MemoreStream(); What is not right with it? – Kliver Max 15 hours ago

Your problem is not whether or not you are using:

`command.ExecuteReader(CommandBehavior.SequentialAccess)` 

...as we can see; nor is it that your stream-copy buffer size is too large (it is in fact tiny). The problem is that you are using a MemoryStream, as you stated in the comment above. You are most likely loading the 650 MB file twice: once from SQL, and a second copy stored in the MemoryStream, thus leading to your OutOfMemoryException.

Although the solution is to write to a FileStream instead, the cause of the problem was not highlighted in the accepted answer. Unless you know the cause of a problem, you won't learn to avoid such problems in the future.
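To make the fix concrete, here is a minimal sketch of the corrected version of the asker's snippet, streaming the column straight to disk instead of into a MemoryStream. The connection string, query, parameter value, and output path are hypothetical placeholders; adjust them to your schema:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

// Hypothetical connection string and query; substitute your own.
using (var connection = new SqlConnection("<your-connection-string>"))
using (var command = new SqlCommand(
    "SELECT FileData FROM Files WHERE Id = @id", connection))
{
    command.Parameters.AddWithValue("@id", 42);
    connection.Open();

    // SequentialAccess lets the reader stream the column
    // instead of buffering the entire row in memory.
    using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read() && !reader.IsDBNull(0))
        {
            using (var dbStream = reader.GetStream(0))
            using (var fileStream = new FileStream(
                "output.bin", FileMode.Create, FileAccess.Write))
            {
                // CopyTo only ever holds one internal buffer
                // (80 KB by default) in memory, not the whole 650 MB.
                dbStream.CopyTo(fileStream);
            }
        }
    }
}
```

With SequentialAccess, columns must be read in order, so IsDBNull(0) is checked before GetStream(0). The peak memory use here is a single copy buffer, regardless of the size of the BLOB.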