fix(http): fix Strong reference leak in server response streaming (#25965)

## Summary

Fix a memory leak in `RequestContext.doRenderWithBody()` where a
`Strong.Impl` allocation was orphaned when proxying streaming responses
through Bun's HTTP server.

## Problem

When a streaming response (e.g., from a proxied fetch request) was
forwarded through Bun's server:

1. `response_body_readable_stream_ref` was initialized at line 1836
(from `lock.readable`) or line 1841 (via `Strong.init()`)
2. For `.Bytes` streams with `has_received_last_chunk=false`, a **new**
Strong reference was created at line 1902
3. The old Strong reference was **never deinit'd**, causing
`Strong.Impl` memory to leak

This leak accumulated with every streaming response proxied through the
server.
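
A minimal sketch of the leaking pattern, reconstructed from the
description above (identifiers follow the PR text; the surrounding
control flow is simplified and hypothetical):

```zig
// Hypothetical reconstruction of the buggy flow in doRenderWithBody().

// Line 1836/1841: the field is initialized once.
this.response_body_readable_stream_ref = Strong.init(stream, globalThis);

// Line 1902: for .Bytes streams still receiving chunks, the field is
// overwritten with a brand-new Strong. The previous Strong's
// heap-allocated Impl is never deinit'd, so it leaks.
if (!byte_stream.has_received_last_chunk) {
    this.response_body_readable_stream_ref = Strong.init(stream, globalThis);
}
```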

## Solution

Add a call to `this.response_body_readable_stream_ref.deinit()` before
creating the new Strong reference (see the sketch after this list).
This is safe because:

- `stream` exists as a stack-local variable
- JSC's conservative GC tracks stack-local JSValues
- No GC can occur between consecutive synchronous Zig statements
- Therefore, `stream` won't be collected between `deinit()` and
`Strong.init()`
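
A minimal sketch of the fix, under the same simplified, hypothetical
names as the sketch in the Problem section:

```zig
if (!byte_stream.has_received_last_chunk) {
    // Release the old Strong first, freeing its heap-allocated Impl.
    // Safe: `stream` is a stack-local JSValue that JSC's conservative GC
    // keeps alive, and no GC can run between these two synchronous Zig
    // statements, so the value cannot be collected in between.
    this.response_body_readable_stream_ref.deinit();
    this.response_body_readable_stream_ref = Strong.init(stream, globalThis);
}
```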

## Test

Added `test/js/web/fetch/server-response-stream-leak.test.ts` which:
- Creates a backend server that returns delayed streaming responses
- Creates a proxy server that forwards the streaming responses
- Makes 200 requests and checks that ReadableStream objects don't
accumulate
- Fails on system Bun v1.3.5 (202 leaked `ReadableStream` objects);
  passes with the fix
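
The test can be run in isolation against a build containing the fix,
e.g. `bun test test/js/web/fetch/server-response-stream-leak.test.ts`
(the exact harness invocation in Bun's CI may differ).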

## Related

Similar to the Strong reference leak fixes in:
- #23313 (fetch memory leak)
- #25846 (fetch cyclic reference leak)
Commit 461ad886bd (parent b6abbd50a0)
Authored by SUZUKI Sosuke on 2026-01-13 07:41:58 +09:00, committed by GitHub
2 changed files with 55 additions and 0 deletions


test/js/web/fetch/server-response-stream-leak.test.ts (new file)
@@ -0,0 +1,52 @@
import { heapStats } from "bun:jsc";
import { describe, expect, test } from "bun:test";

describe("Bun.serve response stream leak", () => {
  test("proxy server forwarding streaming response should not leak", async () => {
    // Backend server that returns a streaming response with a delay between chunks
    await using backend = Bun.serve({
      port: 0,
      fetch(req) {
        const stream = new ReadableStream({
          async start(controller) {
            controller.enqueue(new TextEncoder().encode("chunk1"));
            await Bun.sleep(10);
            controller.enqueue(new TextEncoder().encode("chunk2"));
            controller.close();
          },
        });
        return new Response(stream);
      },
    });

    // Proxy server that forwards the response body stream
    await using proxy = Bun.serve({
      port: 0,
      async fetch(req) {
        const backendResponse = await fetch(`http://localhost:${backend.port}/`);
        return new Response(backendResponse.body);
      },
    });

    const url = `http://localhost:${proxy.port}/`;

    async function leak() {
      const response = await fetch(url);
      return await response.text();
    }

    for (let i = 0; i < 200; i++) {
      await leak();
    }

    // Let in-flight streams settle, then force two full GC passes
    await Bun.sleep(10);
    Bun.gc(true);
    await Bun.sleep(10);
    Bun.gc(true);

    // Without the fix, ~200 ReadableStream objects remain reachable after
    // 200 requests; the threshold of 50 leaves headroom for live objects.
    const readableStreamCount = heapStats().objectTypeCounts.ReadableStream || 0;
    const responseCount = heapStats().objectTypeCounts.Response || 0;
    expect(readableStreamCount).toBeLessThanOrEqual(50);
    expect(responseCount).toBeLessThanOrEqual(50);
  });
});