How do I diagnose a memory explosion?

Diagnosing Memory Explosion in ASP.NET Core File Uploads

Summary
An ASP.NET Core endpoint that accepts a FileChunk model containing byte arrays in its request body causes process memory to spike by ~40% during large file uploads, eventually failing with an IOException. This happens because model binding buffers the entire request payload in memory.


Root Cause

Model binding in ASP.NET Core buffers the entire request body in memory when parameters are bound to complex types like FileChunk. For large files:

  • Byte arrays in the model force full-payload buffering
  • Default request limits cap the body size at 30,000,000 bytes (~28.6 MB)
  • Response buffering compounds memory pressure
  • Key Takeaway: The framework doesn’t stream byte-array-bound payloads automatically.

Why This Happens in Real Systems

  • Legacy code migration from SOAP/WCF where buffered payloads were acceptable
  • Insufficient stress testing with large payloads during development
  • Misunderstanding framework mechanics:
    • Web APIs abstract HTTP body parsing
    • Async methods don’t auto-stream binding operations
  • Default configurations prioritize simplicity over scalability

Real-World Impact

  • Resource Exhaustion: Spikes in Kubernetes pods trigger OOM kills
  • Cost Escalation: Cloud apps scale horizontally under memory pressure
  • User Experience: Upload failures at ~130 MB leave users stranded
  • Diagnostic Noise: IOException masks the core buffering issue

Example or Code

The problematic endpoint:

public ActionResult<string> UploadFileChunk(
    [FromBody] FileChunk chunk, string guid) // Byte arrays buffered in memory!  
{ /* ... */ }  

public class FileChunk  
{  
    public byte[] Header { get; set; }  
    public byte[] Payload { get; set; } // 🚫 Memory bomb!  
}
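A detail that makes this worse for JSON bodies: byte[] properties round-trip through JSON as base64 strings, so the wire payload is about a third larger than the raw bytes, and during binding both the base64 string and the decoded array are live on the heap at once. A minimal sketch (sizes are approximate):

```csharp
using System.Text.Json;

// byte[] serializes to a base64 string: ~33% larger on the wire, and the
// string plus the decoded array coexist in memory during deserialization.
var raw = new byte[100 * 1024 * 1024];        // 100 MB of data
string json = JsonSerializer.Serialize(raw);  // ~140 M base64 characters
// As a .NET string (UTF-16) that is roughly 280 MB of heap before the
// deserialized byte[] is even allocated.
```

This inflation is consistent with uploads failing around ~130 MB when raw limits appear higher.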

How Senior Engineers Fix It

Three fixes, ordered from durable solution to stopgap:

  1. Stream-Based Processing

    [HttpPost]
    public async Task<IActionResult> Upload()
    {
        using (var reader = new StreamReader(Request.Body))
        {
            await ProcessStream(reader); // Streams chunks incrementally
        }
        return Ok();
    }
  2. Increase Memory Limits (Temporary Workaround)
    In Program.cs:

    builder.Services.Configure<KestrelServerOptions>(options =>
    {
        options.Limits.MaxRequestBodySize = 500 * 1024 * 1024; // 500 MB
    });
  3. Disable Form Value Caching

    services.AddMvc(options =>
    {
        // Stop MVC from buffering form values for its value providers
        options.ValueProviderFactories.RemoveType<FormValueProviderFactory>();
    });
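Fix 1 in fuller form: a minimal streaming endpoint built on MultipartReader, the approach the framework itself provides for large uploads. A sketch, assuming a multipart/form-data request; the route name and temp-file destination are illustrative:

```csharp
using Microsoft.AspNetCore.Http.Extensions;   // GetMultipartBoundary()
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.WebUtilities;      // MultipartReader

[HttpPost("upload-streamed")]                 // illustrative route
public async Task<IActionResult> UploadStreamed()
{
    var boundary = Request.GetMultipartBoundary();
    if (string.IsNullOrEmpty(boundary))
        return BadRequest("Expected a multipart request.");

    // Read the body section by section; no part is ever fully buffered.
    var reader = new MultipartReader(boundary, Request.Body);
    MultipartSection section;
    while ((section = await reader.ReadNextSectionAsync()) != null)
    {
        await using var destination = System.IO.File.Create(Path.GetTempFileName());
        await section.Body.CopyToAsync(destination);
    }
    return Ok();
}
```

The key design choice: nothing is bound to a model, so the request body is copied straight from the socket to its destination in small chunks.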

Diagnostic Workflow:

  1. Use dotnet-counters monitor -p <PID> System.Runtime to watch GC Heap Size during requests
  2. Profile allocations with dotnet-dump:
    dotnet-dump collect -p <PID>  
    dotnet-dump analyze <path_to_dump>  
    > dumpheap -type Byte[]   # Identify large byte[] allocations  
    > gcroot <address>        # Trace ownership path  
  3. Critical Clue: Heap dumps showing System.Byte[] instances rooted in the request-buffering stream (FileBufferingReadStream)

Why Juniors Miss It

  • Abstraction Blindness: Trusting model binding without understanding implementation
  • Diagnostic Overload: Misinterpreting heap diff views that hide indirect ownership
  • Tool Misuse: Focusing on visualized allocations rather than GC roots
  • Symptom Fixation: Chasing the IOException instead of allocation patterns
  • Myth Belief: Assuming async equals stream processing
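
The last point deserves a concrete illustration; a sketch reusing the FileChunk model above (ProcessChunkAsync is a hypothetical helper):

```csharp
// "async" does not mean "streaming": model binding has already read and
// buffered the entire request body before this method body ever runs.
[HttpPost]
public async Task<IActionResult> Upload([FromBody] FileChunk chunk)
{
    // chunk.Payload is a fully materialized in-memory array at this point;
    // the await below changes scheduling, not how the body was read.
    await ProcessChunkAsync(chunk);   // hypothetical helper
    return Ok();
}
```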

Senior Insight: Always profile payload transformations under load. Framework conveniences are rarely free.
