
Owin WebApi Post method with Large Object

I am running a self-hosted OWIN Web API 2 Windows service. It works fine in most cases, except for large custom objects, which cause an OutOfMemoryException on the client (a WinForms app).

Question: how do I POST a large custom object?

The OutOfMemoryException originally occurred in JsonConvert.SerializeObject, at the end of this code:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Newtonsoft.Json.Converters;
using Newtonsoft.Json.Serialization;

static HttpClient _httpClient = new HttpClient();

public async Task SaveMyObjectAsync(MyObject largeObject)
{
    var response = await _httpClient.PostAsync("myobjects/route/", new JsonContent(largeObject));
    response.EnsureSuccessStatusCode();
}

public static class JsonSettings
{
    public static readonly JsonSerializerSettings Default =
        new JsonSerializerSettings
        {
            ContractResolver = new DefaultContractResolver(),
            NullValueHandling = NullValueHandling.Ignore,
            ReferenceLoopHandling = ReferenceLoopHandling.Serialize,
            DateTimeZoneHandling = DateTimeZoneHandling.RoundtripKind,
            DateParseHandling = DateParseHandling.DateTimeOffset,
            DateFormatHandling = DateFormatHandling.IsoDateFormat,
            Formatting = Formatting.Indented,
            Converters = new List<JsonConverter>
            {
                new StringEnumConverter(),
            }
        };
}

public class JsonContent : StringContent
{
    // SerializeObject builds the entire JSON document as one string in memory
    public JsonContent(object value)
        : base(JsonConvert.SerializeObject(value, JsonSettings.Default), Encoding.UTF8, "application/json") { }
}
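One way to avoid ever materializing the full JSON string would be to serialize straight into the outgoing request stream. This is only a sketch, not the approach the question actually takes: it assumes the PushStreamContent class from the Microsoft.AspNet.WebApi.Client package is available, and StreamedJsonContent is a hypothetical name.

```csharp
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using Newtonsoft.Json;

// Hypothetical drop-in replacement for JsonContent: serializes the object
// directly into the request stream instead of building one giant string first.
public class StreamedJsonContent : PushStreamContent
{
    public StreamedJsonContent(object value)
        : base((outStream, content, context) =>
        {
            using (var writer = new StreamWriter(outStream))
            using (var jsonWriter = new JsonTextWriter(writer))
            {
                // JsonSerializer.Create applies the shared settings explicitly;
                // JsonConvert.DefaultSettings does not affect a serializer built this way
                var serializer = JsonSerializer.Create(JsonSettings.Default);
                serializer.Serialize(jsonWriter, value);
            } // disposing the writers closes outStream, which completes the request body
        }, new MediaTypeHeaderValue("application/json"))
    { }
}
```

With this in place, `new JsonContent(largeObject)` above would become `new StreamedJsonContent(largeObject)` and the client never holds the whole payload in memory.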

First attempt

Following this answer, I swapped the serialization method out for one that writes to a local file. That worked for a while, until I realized all it had done was raise the size limit it could handle. I still get an OutOfMemoryException for larger objects, but now it happens in File.ReadAllText:

public JsonContent(object value)
    : base(SerializeObjectByStream(value), Encoding.UTF8, "application/json") { }

static string SerializeObjectByStream(object value)
{
    using (TextWriter textWriter = File.CreateText("LocalJsonFile.json"))
    using (var jsonWriter = new JsonTextWriter(textWriter))
    {
        // JsonConvert.DefaultSettings only affects the static JsonConvert methods,
        // so the serializer has to be created from the settings explicitly
        var serializer = JsonSerializer.Create(JsonSettings.Default);
        serializer.Serialize(jsonWriter, value);
        jsonWriter.Flush();
    }
    // this still materializes the whole document as a single string
    return File.ReadAllText("LocalJsonFile.json");
}
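Reading the scratch file back into one string is what reintroduces the memory spike. A possible variant, sketched here under the assumption that the file-writing half above has been split into a hypothetical SerializeObjectToFile helper, would hand the FileStream to StreamContent so HttpClient can stream it out in buffered chunks:

```csharp
public async Task SaveMyObjectViaFileAsync(MyObject largeObject)
{
    // hypothetical helper: the file-writing part of SerializeObjectByStream,
    // without the File.ReadAllText at the end
    SerializeObjectToFile(largeObject, "LocalJsonFile.json");

    // StreamContent reads the file in buffered chunks as the request is sent,
    // so the full JSON document never sits in memory at once
    using (var fileStream = File.OpenRead("LocalJsonFile.json"))
    using (var content = new StreamContent(fileStream))
    {
        content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
        var response = await _httpClient.PostAsync("myobjects/route/", content);
        response.EnsureSuccessStatusCode();
    }
}
```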

Multipart attempt

Sending such a large object as a single part is probably a bad idea anyway, so I tried breaking it up with MultipartContent. Most examples seem to cover reading a multipart request rather than creating one, but this code works for my normal-sized custom objects. Unfortunately, it still throws an OutOfMemoryException for large objects; this time it is inside Newtonsoft's JsonSerializer, at ser.Serialize(jsonWriter, request).

I also tried a FileStream instead of the MemoryStream, with the same problem; that time the OutOfMemoryException was in _httpClient.PostAsync:

using (var content = new MultipartContent())
using (var stream = new MemoryStream())
{
    // the writers are deliberately not disposed here, because disposing them
    // would close the MemoryStream before StreamContent can read from it
    var writer = new StreamWriter(stream);
    var jsonWriter = new JsonTextWriter(writer);
    var ser = JsonSerializer.Create(JsonSettings.Default);
    ser.Serialize(jsonWriter, request);
    jsonWriter.Flush();
    // at this point the entire serialized object is buffered in memory anyway
    stream.Seek(0, SeekOrigin.Begin);
    content.Add(new StreamContent(stream));

    var response = await _httpClient.PostAsync("myobjects/route/", content);
    response.EnsureSuccessStatusCode();
}

It seems all I have done is push the data around so that it runs out of memory somewhere else.
How can I break this large custom object into chunks, while still keeping it in a single transaction?

Well, this turned out to be far less intuitive than I expected. The code below seems to perform well (cancellation tokens etc. removed for brevity). I am a little concerned about using a fixed file name for my FileStream, though; since the method is async, presumably two calls could try to create the same file at the same time?

Switching from JsonSerializer to BinaryFormatter was quite painful, because it meant going through and decorating all of my classes with [Serializable]. I was also relying on JsonSerializer's behaviour of using the default constructor, as described here, which converts null values to empty strings for me.

Client:

const int MaximumChunkSize = 1024000;

public async Task SaveMyObjectAsync(MyObject largeObject)
{
    // use a FileStream to write the object to disk,
    // so the whole thing is never held in memory at once
    using (var stream = new FileStream("LocalStreamFile.json", FileMode.Create))
    {
        // JsonSerializer was unable to serialize this large an object
        // without an OOM error, so use BinaryFormatter
        var formatter = new BinaryFormatter();
        formatter.Serialize(stream, largeObject);
        // return to the start of the stream
        stream.Seek(0, SeekOrigin.Begin);
        using (var content = new MultipartContent())
        {
            var buffer = new byte[MaximumChunkSize];
            int bytesRead;
            while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // copy only the bytes actually read, so the final (partial)
                // chunk does not carry stale bytes left over from the previous one
                var chunk = new byte[bytesRead];
                Array.Copy(buffer, chunk, bytesRead);
                // add the large object to the multipart content chunk by chunk
                content.Add(new JsonContent(chunk));
            }
            var response = await _httpClient.PostAsync("myobjects/route/", content);
            response.EnsureSuccessStatusCode();
        }
    }
}
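On the fixed-file-name worry above: one possible fix, sketched here as an assumption rather than tested code, is to let the framework pick a unique temp file per call and delete it afterwards. Path.GetTempFileName() creates a uniquely named empty file on disk, so concurrent calls cannot collide:

```csharp
// unique scratch file per invocation instead of the shared "LocalStreamFile.json"
var tempPath = Path.GetTempFileName();
try
{
    using (var stream = new FileStream(tempPath, FileMode.Create))
    {
        // ... serialize, chunk and POST exactly as in the method above ...
    }
}
finally
{
    // clean up the scratch file even if the POST fails
    File.Delete(tempPath);
}
```

The same substitution would apply on the server side, which writes to a fixed file name as well.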

Server:

[HttpPost, Route("myobjects/route/")]
public async Task<IHttpActionResult> SaveMyObjectAsync()
{
    if (Request.Content.IsMimeMultipartContent() == false)
    {
        return StatusCode(HttpStatusCode.BadRequest);
    }
    var contentStreamProvider = await Request.Content.ReadAsMultipartAsync();
    using (var stream = new FileStream("LocalStreamFile.json", FileMode.Create))
    {
        foreach (var content in contentStreamProvider.Contents)
        {
            // read out each chunk and convert it back from JSON to a byte array
            var requestArray = JsonConvert.DeserializeObject<byte[]>(await content.ReadAsStringAsync(), JsonSettings.Default);
            stream.Write(requestArray, 0, requestArray.Length);
        }
        stream.Seek(0, SeekOrigin.Begin);
        var formatter = new BinaryFormatter();
        // convert the stream back into our original large object
        var request = (MyObject)formatter.Deserialize(stream);
        // save to database etc
        ...
        return Ok();
    }
}