Fixes #25862
### What does this PR do?
When a client sends pipelined data immediately after CONNECT request
headers in the same TCP segment, Bun now properly delivers this data to
the `head` parameter of the 'connect' event handler, matching Node.js
behavior.
This enables compatibility with Cap'n Proto's KJ HTTP library used by
Cloudflare's workerd runtime, which pipelines RPC data after CONNECT.
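For reference, a minimal sketch of where `head` matters in a Node-style CONNECT proxy (the proxy setup here is illustrative, not from this PR):
```ts
import * as http from "node:http";
import * as net from "node:net";

const proxy = http.createServer();

proxy.on("connect", (req, clientSocket, head) => {
  // `head` holds any bytes the client pipelined after the CONNECT headers
  // in the same TCP segment; they must be forwarded before piping.
  const [host, port] = (req.url ?? "").split(":");
  const upstream = net.connect(Number(port) || 443, host, () => {
    clientSocket.write("HTTP/1.1 200 Connection Established\r\n\r\n");
    if (head.length > 0) upstream.write(head);
    upstream.pipe(clientSocket);
    clientSocket.pipe(upstream);
  });
});

proxy.listen(8080);
```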
### How did you verify your code works?
<img width="694" height="612" alt="CleanShot 2026-01-09 at 15 30 22@2x"
src="https://github.com/user-attachments/assets/3ffe840e-1792-429c-8303-d98ac3e6912a"
/>
Tests
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
- Add comprehensive TypeScript type definitions for `Bun.Archive` in
`bun.d.ts`
- `ArchiveInput` and `ArchiveCompression` types
- Full JSDoc documentation with examples for all methods (`from`,
`write`, `extract`, `blob`, `bytes`, `files`)
- Add documentation page at `docs/runtime/archive.mdx`
- Quickstart examples
- Creating and extracting archives
- `files()` method with glob filtering
- Compression support
- Full API reference section
- Add Archive to docs sidebar under "Data & Storage"
- Add `files()` benchmark comparing `Bun.Archive.files()` vs node-tar
- Shows ~7x speedup for reading archive contents into memory (59µs vs
434µs)
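A usage sketch based on the method names above; exact signatures are assumptions, so see the `bun.d.ts` changes in this PR for the real types:
```ts
// Sketch only: method names come from this PR; signatures are assumptions.
const archive = Bun.Archive.from(await Bun.file("release.tar.gz").bytes());

// Read entries into memory, filtered by glob (the case benchmarked above)
const files = await archive.files("**/*.json");

// Extract everything to a directory
await archive.extract("./out");
```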
## Test plan
- [x] TypeScript types compile correctly
- [x] Documentation renders properly in Mintlify format
- [x] Benchmark runs successfully and shows performance comparison
- [x] Verified `files()` method works correctly with both Bun.Archive
and node-tar
🤖 Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
## Summary
- Fixes a path traversal vulnerability via symlink when installing
GitHub packages
- Validates symlink targets before creation to ensure they stay within
the extraction directory
- Rejects absolute symlinks and relative paths that would escape the
extraction directory
## Details
When extracting GitHub tarballs, Bun did not validate symlink targets. A
malicious tarball could:
1. Create a symlink pointing outside the extraction directory (e.g.,
`../../../../../../../tmp`)
2. Include a file entry through that symlink path (e.g.,
`symlink-to-tmp/pwned.txt`)
When extracted, the symlink would be created first, then the file would
be written through it, ending up outside the intended package directory
(e.g., `/tmp/pwned.txt`).
### The Fix
Added `isSymlinkTargetSafe()` function that:
1. Rejects absolute symlink targets (starting with `/`)
2. Normalizes the combined path (symlink location + target) and rejects
if the result starts with `..` (would escape)
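A TypeScript sketch of the equivalent check (the real fix lives in Bun's Zig installer code; names here mirror the description above):
```ts
import * as path from "node:path";

// Returns false for absolute targets and for relative targets that would
// resolve outside the extraction root.
function isSymlinkTargetSafe(linkPathInArchive: string, target: string): boolean {
  if (target.startsWith("/")) return false; // 1. reject absolute targets
  // 2. normalize symlink location + target; reject anything escaping the root
  const combined = path.posix.normalize(
    path.posix.join(path.posix.dirname(linkPathInArchive), target),
  );
  return combined !== ".." && !combined.startsWith("../");
}
```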
## Test plan
- [x] Added regression test
`test/cli/install/symlink-path-traversal.test.ts`
- [x] Tests verify relative path traversal symlinks are blocked
- [x] Tests verify absolute symlink targets are blocked
- [x] Tests verify safe relative symlinks within the package still work
- [x] Verified test fails with system bun (vulnerable) and passes with
debug build (fixed)
🤖 Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
## Summary
Adds a new CLI flag `--compile-executable-path` that allows specifying a
custom Bun executable path for cross-compilation instead of downloading
from the npm registry.
## Usage
```bash
bun build --compile --target=bun-linux-x64 \
--compile-executable-path=/path/to/bun-linux-x64 app.ts
```
## Motivation
The `executablePath` option was already available in the JavaScript
`Bun.build()` API. This exposes the same functionality from the CLI.
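For reference, a hedged sketch of the JavaScript side (the exact placement of `executablePath` inside the `Bun.build()` options is an assumption here):
```ts
// Assumption: executablePath sits alongside the compile target options.
await Bun.build({
  entrypoints: ["./app.ts"],
  compile: {
    target: "bun-linux-x64",
    executablePath: "/path/to/bun-linux-x64", // option named in this PR
  },
});
```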
## Changes
- Added `--compile-executable-path <STR>` CLI parameter in
`src/cli/Arguments.zig`
- Added `compile_executable_path` field to `BundlerOptions` in
`src/cli.zig`
- Wired the option through to `StandaloneModuleGraph.toExecutable()` in
`src/cli/build_command.zig`
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Removes `agent` from the global `WebSocket` (in Node.js it uses a dispatcher, not an agent) and adds `agent` support to the `ws` module (which actually uses an agent).
### How did you verify your code works?
Tests
### What does this PR do?
Update FFI documentation with a note on Windows API handle values, as
pointer encoding to double causes intermittent failures with some
classes of handles.
Putting it in this accordion feels less than ideal but it's also a very
specific use case.
The specific issue I experienced: HDCs and HBITMAPs are effectively 32-bit values, although the headers typedef them to HANDLE. The GDI APIs return and accept them as sign-extended versions of the underlying 32-bit value. A double's mantissa holds only 53 bits, so when such a sign-extended value (which occupies all 64 bits) goes through the ptr <-> double pathway of bun FFI, the bottom 11 bits are lost whenever the original value had bit 31 set, and subsequent calls will fail.
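A small illustration of the rounding (the handle value is hypothetical):
```ts
// Hypothetical sign-extended 32-bit handle with bit 31 set.
const handle = 0xffff_ffff_8000_0abcn;
// A double carries a 53-bit mantissa, but this value needs all 64 bits,
// so the low 11 bits are rounded away on the way through.
const roundTripped = BigInt(Number(handle));
console.log(roundTripped === handle); // false
```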
Probably this is fixable by correctly encoding 'negative' pointers in
the double representation, and I might tackle that if I find time.
## Summary
Fixes #25869
Two fixes to enable `jest.useFakeTimers()` to work with
`@testing-library/react` and `@testing-library/user-event`:
- Set `setTimeout.clock = true` when fake timers are enabled.
testing-library/react's `jestFakeTimersAreEnabled()` checks for this
property to determine if `jest.advanceTimersByTime()` should be called
when draining the microtask queue. Without this, testing-library never
advances timers.
- Make `advanceTimersByTime(0)` fire `setTimeout(fn, 0)` timers.
`setTimeout(fn, 0)` is internally scheduled with a 1ms delay per HTML
spec. Jest/testing-library expect `advanceTimersByTime(0)` to fire such
"immediate" timers, but we were advancing by 0ms so they never fired.
## Test plan
- [x] All 30 existing fake timer tests pass
- [x] New regression test validates both fixes
- [x] Original user-event reproduction now works (test completes instead
of hanging)
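The second fix, as a minimal regression-style sketch (test shape is illustrative):
```ts
import { test, expect, jest } from "bun:test";

test("advanceTimersByTime(0) fires immediate timers", () => {
  jest.useFakeTimers();
  let fired = false;
  // Internally scheduled with a 1ms delay per the HTML spec...
  setTimeout(() => { fired = true; }, 0);
  // ...but a 0ms advance is expected to fire it anyway.
  jest.advanceTimersByTime(0);
  expect(fired).toBe(true);
  jest.useRealTimers();
});
```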
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
## Summary
- **libarchive.zig:110**: Fix self-assignment bug where `this.pos` was
assigned to itself instead of `new_pos`
- **s3/credentials.zig:165,176,199**: Fix impossible range checks -
`and` should be `or` for pageSize, partSize, and retry validation (a
value cannot be both less than MIN and greater than MAX simultaneously)
- **postgres.zig:14**: Fix copy-paste error where createConnection
function was internally named "createQuery"
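The corrected range check, sketched in TypeScript for clarity (the real change is in `s3/credentials.zig`):
```ts
// With `&&` the branch could never be taken; `||` rejects both extremes.
function validatePartSize(partSize: number, min: number, max: number): void {
  if (partSize < min || partSize > max) {
    throw new RangeError(`partSize must be between ${min} and ${max}`);
  }
}
```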
## Test plan
- [ ] Verify S3 credential validation now properly rejects out-of-range
values for pageSize, partSize, and retry
- [ ] Verify libarchive seek operations work correctly
- [ ] Verify postgres createConnection function has correct internal
name
🤖 Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
## Summary
- Fixes#25903 - `Bun.write()` mode option ignored when copying from
`Bun.file()`
- The destination file now correctly uses the specified `mode` option
instead of default permissions
- Works on Linux (via open flags), macOS (chmod after clonefile), and
Windows (chmod after copyfile)
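Usage sketch of the now-respected option (file names are illustrative):
```ts
// Copy a BunFile to a new destination with explicit permissions;
// previously the `mode` option was silently ignored on this path.
await Bun.write(Bun.file("dist/cli.sh"), Bun.file("src/cli.sh"), { mode: 0o755 });
```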
## Test plan
- [x] Added regression test in `test/regression/issue/25903.test.ts`
- [x] Test passes with `bun bd test test/regression/issue/25903.test.ts`
- [x] Test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25903.test.ts` (verifies the bug exists)
## Changes
- `src/bun.js/webcore/Blob.zig`: Add `mode` field to `WriteFileOptions`
and parse from options
- `src/bun.js/webcore/blob/copy_file.zig`: Use `destination_mode` in
`CopyFile` struct and `doOpenFile`
- `packages/bun-types/bun.d.ts`: Add `mode` option to BunFile copy
overloads
🤖 Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
Add support for in-memory entrypoints and files in `Bun.build` via the
`files` option:
```ts
await Bun.build({
entrypoints: ["/app/index.ts"],
files: {
"/app/index.ts": `
import { greet } from "./greet.ts";
console.log(greet("World"));
`,
"/app/greet.ts": `
export function greet(name: string) {
return "Hello, " + name + "!";
}
`,
},
});
```
### Features
- **Bundle entirely from memory**: No files on disk needed
- **Override files on disk**: In-memory files take priority over disk
files
- **Mix disk and virtual files**: Real files can import virtual files
and vice versa
- **Multiple content types**: Supports `string`, `Blob`, `TypedArray`,
and `ArrayBuffer`
### Use Cases
- Code generation at build time
- Injecting build-time constants
- Testing with mock modules
- Bundling dynamically generated code
- Overriding configuration files for different environments
### Implementation Details
- Added `FileMap` struct in `JSBundler.zig` with `resolve`, `get`,
`contains`, `fromJS`, and `deinit` methods
- Uses `"memory"` namespace to avoid `pathWithPrettyInitialized`
allocation issues during linking phase
- FileMap checks added in:
- `runResolver` (entry point resolution)
- `runResolutionForParseTask` (import resolution)
- `enqueueEntryPoints` (entry point handling)
- `getCodeForParseTaskWithoutPlugins` (file content reading)
- Root directory defaults to cwd when all entrypoints are in the FileMap
- Added TypeScript types with JSDoc documentation
- Added bundler documentation with examples
## Test plan
- [x] Basic in-memory file bundling
- [x] In-memory files with absolute imports
- [x] In-memory files with relative imports (same dir, subdirs, parent
dirs)
- [x] Nested/chained imports between in-memory files
- [x] TypeScript and JSX support
- [x] Blob, Uint8Array, and ArrayBuffer content types
- [x] Re-exports and default exports
- [x] In-memory file overrides real file on disk
- [x] Real file on disk imports in-memory file via relative path
- [x] Mixed disk and memory files with complex import graphs
Run tests with: `bun bd test test/bundler/bundler_files.test.ts`
🤖 Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
## Summary
This PR implements a new `Bun.JSONC.parse()` API that allows parsing
JSONC (JSON with Comments) files. It addresses the feature request from
issue #16257 by providing a native API for parsing JSON with comments
and trailing commas.
The implementation follows the same pattern as `Bun.YAML` and
`Bun.TOML`, leveraging the existing `TSConfigParser` which already
handles JSONC parsing internally.
## Features
- **Parse JSON with comments**: Supports both `//` single-line and `/*
*/` block comments
- **Handle trailing commas**: Works with trailing commas in objects and
arrays
- **Full JavaScript object conversion**: Returns native JavaScript
objects/arrays
- **Error handling**: Proper error throwing for invalid JSON
- **TypeScript compatibility**: Works with TypeScript config files and
other JSONC formats
## Usage Example
```javascript
const result = Bun.JSONC.parse(`{
// This is a comment
"name": "my-app",
"version": "1.0.0", // trailing comma is allowed
"dependencies": {
"react": "^18.0.0",
},
}`);
// Returns native JavaScript object
```
## Implementation Details
- Created `JSONCObject.zig` following the same pattern as
`YAMLObject.zig` and `TOMLObject.zig`
- Uses the existing `TSConfigParser` from `json.zig` which already
handles comments and trailing commas
- Added proper C++ bindings and exports following Bun's established
patterns
- Comprehensive test suite covering various JSONC features
## Test Plan
- [x] Basic JSON parsing works
- [x] Single-line comments (`//`) are handled correctly
- [x] Block comments (`/* */`) are handled correctly
- [x] Trailing commas in objects and arrays work
- [x] Complex nested structures parse correctly
- [x] Error handling for invalid JSON
- [x] Empty objects and arrays work
- [x] Boolean and null values work correctly
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
### What does this PR do?
Fix bytecode CJS pragma detection when source file contains a shebang.
When bundling with `--bytecode` and the source file has a shebang, the
output silently fails to execute (exits 0, no output).
Reproduction:
[github.com/kjanat/bun-bytecode-banner-bug](https://github.com/kjanat/bun-bytecode-banner-bug)
```js
// Bundled output:
#!/usr/bin/env bun // shebang preserved
// @bun @bytecode @bun-cjs // pragma on line 2
(function(exports, require, module, __filename, __dirname) { ... })
```
The pragma parser in `hasBunPragma()` correctly skips the shebang line,
but uses `self.lexer.end` instead of `contents.len` when scanning for
`@bun-cjs`/`@bytecode` tokens. This causes the pragma to not be
recognized.
**Fix:**
```zig
// Before
while (cursor < self.lexer.end) : (cursor += 1) {
// After
while (cursor < end) : (cursor += 1) {
```
Where `end` is already defined as `contents.len` at the top of the
function.
### How did you verify your code works?
- Added bundler test `banner/SourceHashbangWithBytecodeAndCJSTargetBun`
in `test/bundler/bundler_banner.test.ts`
- Added regression tests in
`test/regression/issue/bun-bytecode-shebang.test.ts` that verify:
- CJS wrapper executes when source has shebang
- CJS wrapper executes when source has shebang + bytecode pragma
- End-to-end: bundled bytecode output with source shebang runs correctly
- Ran the tests in the
[kjanat/bun-bytecode-banner-bug](https://github.com/kjanat/bun-bytecode-banner-bug)
repo to verify the issue is fixed
---------
Signed-off-by: Kaj Kowalski <info@kajkowalski.nl>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
Fixes `ASSERTION FAILED: isUInt32AsAnyInt()` errors that occurred
intermittently, particularly in inspector-related tests.
## Problem
The code was directly manipulating JSPromise internal fields using
`asUInt32AsAnyInt()`:
```cpp
promise->internalField(JSC::JSPromise::Field::Flags).get().asUInt32AsAnyInt()
```
This caused assertion failures when the internal field state was not a
valid uint32.
## Solution
Use WebKit's official JSPromise helper methods instead:
| Before | After |
|--------|-------|
| Direct `internalField` manipulation for rejected promise | `promise->rejectAsHandled(vm, globalObject, value)` |
| Direct `internalField` manipulation for resolved promise | `promise->fulfill(vm, globalObject, value)` |
| Direct `internalField` manipulation for handled flag | `promise->markAsHandled()` |
## Files Changed
- `src/bun.js/bindings/ZigGlobalObject.cpp`
- `src/bun.js/bindings/ModuleLoader.cpp`
- `src/bake/BakeGlobalObject.cpp`
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
## Summary
- Use `describe.concurrent` at module scope for parallel test execution
across node/bun executables and padding strategies
- Replace `Bun.spawnSync` with async `Bun.spawn` in memory leak test
- Replace `beforeEach`/`afterEach` server setup with `await using` in
each test
- Add `Symbol.asyncDispose` to `nodeEchoServer` helper for proper
cleanup
- Fix IPv6/IPv4 binding issue by explicitly binding echo server to
127.0.0.1
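A sketch of the `Symbol.asyncDispose` addition (only the pattern is from this PR; the helper internals are assumptions):
```ts
import { createServer } from "node:net";
import { once } from "node:events";

async function nodeEchoServer() {
  const server = createServer((sock) => sock.pipe(sock));
  server.listen(0, "127.0.0.1"); // explicit IPv4 bind, per this PR
  await once(server, "listening");
  return {
    server,
    async [Symbol.asyncDispose]() {
      // runs automatically when an `await using` binding leaves scope
      await new Promise<void>((resolve) => server.close(() => resolve()));
    },
  };
}
```
Tests can then write `await using echo = await nodeEchoServer();` and cleanup happens at scope exit.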
## Test plan
- [x] Run `bun test test/js/node/http2/node-http2.test.js` - all 245
tests pass (6 skipped)
- [x] Verify tests run faster due to concurrent execution
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
- Wrap all tests in `describe.concurrent` at module scope for parallel
test execution
- Replace `Bun.spawnSync` with `Bun.spawn` + `await` throughout
- Replace `run_dir`/`writeFile` pattern with `tempDir` for automatic
cleanup via `using` declarations
- Remove `beforeEach` hook that created shared temp directory
## Test plan
- [x] All 291 tests pass with `bun bd test
test/cli/install/bun-run.test.ts`
- [x] All tests pass with `USE_SYSTEM_BUN=1 bun test
test/cli/install/bun-run.test.ts`
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
- Disable HTTP keep-alive when a proxy returns a 407 (Proxy
Authentication Required) status code
- This prevents subsequent requests from trying to reuse a connection
that the proxy server has closed
- Refactored proxy tests to use `describe.concurrent` and async
`Bun.spawn` patterns
## Test plan
- [x] Added test `simultaneous proxy auth failures should not hang` that
verifies multiple concurrent requests with invalid proxy credentials
complete without hanging
- [x] Existing proxy tests pass
🤖 Generated with [Claude Code](https://claude.ai/code)
## Summary
- **SQLClient.cpp**: Fix bug where `RETURN_IF_EXCEPTION` after
`JSONParse` would never trigger on JSON parse failure since `JSONParse`
doesn't throw
- **BunString.cpp**: Simplify by using `JSONParseWithException` instead
of manually checking for empty result and throwing
## Details
`JSC::JSONParse` returns an empty `JSValue` on failure **without
throwing an exception**. This means that `RETURN_IF_EXCEPTION(scope,
{})` will never catch JSON parsing errors when used after `JSONParse`.
Before this fix in `SQLClient.cpp`:
```cpp
JSC::JSValue json = JSC::JSONParse(globalObject, str);
RETURN_IF_EXCEPTION(scope, {}); // This never triggers on parse failure!
return json; // Returns empty JSValue
```
This could cause issues when parsing invalid JSON data from SQL
databases (e.g., PostgreSQL's JSON/JSONB columns).
`JSONParseWithException` properly throws a `SyntaxError` exception that
`RETURN_IF_EXCEPTION` can catch.
## Test plan
- [x] Build succeeds with `bun bd`
- The changes follow the same pattern used in `ModuleLoader.cpp` and
`BunObject.cpp`
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
- Aligns Bun's temp directory resolution with Node.js's `os.tmpdir()`
behavior
- Checks `TMPDIR`, `TMP`, and `TEMP` environment variables in order
(matching Node.js)
- Uses `bun.once` for lazy initialization instead of mutable static
state
- Removes `setTempdir` function and simplifies the API to use
`RealFS.tmpdirPath()` directly
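The lookup order, sketched (the `/tmp` fallback is the POSIX case):
```ts
// TMPDIR, then TMP, then TEMP — matching Node.js's os.tmpdir() order.
function tmpdirPath(env: Record<string, string | undefined>): string {
  return env.TMPDIR || env.TMP || env.TEMP || "/tmp";
}
```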
## Test plan
- [ ] Verify temp directory resolution matches Node.js behavior
- [ ] Test with various environment variable configurations
- [ ] Ensure existing tests pass with `bun bd test`
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
Without this change, building with `-DWEBKIT_LOCAL=ON` fails with:
```
/work/bun/src/bun.js/bindings/BunObject.cpp:12:10: fatal error: 'JavaScriptCore/JSBase.h' file not found
12 | #include <JavaScriptCore/JSBase.h>
| ^~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
```
This happens because the directory structure differs between downloaded and local WebKit.
Downloaded WebKit:
```
build/debug/cache/webkit-6d0f3aac0b817cc0/
└── include/
└── JavaScriptCore/
└── JSBase.h ← Direct path
```
Local WebKit:
```
vendor/WebKit/WebKitBuild/Debug/
└── JavaScriptCore/Headers/
└── JavaScriptCore/
└── JSBase.h ← Nested path
```
The include paths are thus configured differently for each build type.
For Remote WebKit (when `WEBKIT_LOCAL=OFF`):
- `SetupWebKit.cmake` line 22 sets: `WEBKIT_INCLUDE_PATH = ${WEBKIT_PATH}/include`
- `BuildBun.cmake` line 1253 adds: `include_directories(${WEBKIT_INCLUDE_PATH})`
- This resolves to: `build/debug/cache/webkit-6d0f3aac0b817cc0/include/`
- So `#include <JavaScriptCore/JSBase.h>` finds the file at `include/JavaScriptCore/JSBase.h` ✅

For Local WebKit (when `WEBKIT_LOCAL=ON`):
- The original code only added: `${WEBKIT_PATH}/JavaScriptCore/Headers/JavaScriptCore`
- This resolves to: `vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/Headers/JavaScriptCore/`
- So `#include <JavaScriptCore/JSBase.h>` fails because there's no `JavaScriptCore/` subdirectory at that level ❌
- The fix adds: `${WEBKIT_PATH}/JavaScriptCore/Headers`
- Now the include path includes: `vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/Headers/`
- So `#include <JavaScriptCore/JSBase.h>` finds the file at `Headers/JavaScriptCore/JSBase.h` ✅
### How did you verify your code works?
Built locally.
Co-authored-by: Carl Smedstad <carsme@archlinux.org>
### What does this PR do?
In CMake, when `execute_process` fails to launch a command, it places a message string in the `RESULT_VARIABLE`. If that string starts with 'No', as in 'No such file or directory', an `if()` check interprets it as the boolean false and the build does not halt. The new code uses the built-in option for halting the build on any failure (`COMMAND_ERROR_IS_FATAL`), so the script now halts correctly when that error occurs. This could also be fixed by quoting, but we might as well use the CMake feature.
I encountered this error when I improperly defined `BUN_EXECUTABLE`.
### How did you verify your code works?
<details>
<summary>Ran the build with invalid executable path:</summary>
```
$ node ./scripts/build.mjs \
-GNinja \
-DCMAKE_BUILD_TYPE=Debug \
-B build/debug \
--log-level=NOTICE \
-DBUN_EXECUTABLE="foo" \
-DNPM_EXECUTABLE="$(which npm)" \
-DZIG_EXECUTABLE="$(which zig)" \
-DENABLE_ASAN=OFF
Globbed 1971 sources [178.21ms]
CMake Configure
$ cmake -G Ninja -DCMAKE_BUILD_TYPE=Debug -B /work/bun/build/debug --log-level NOTICE -DBUN_EXECUTABLE=foo -DNPM_EXECUTABLE=/bin/npm -DZIG_EXECUTABLE=/bin/zig -DENABLE_ASAN=OFF -S /work/bun -DCACHE_STRATEGY=auto
sccache: Using local cache strategy.
```
</details>
Result:
```
CMake Error at cmake/targets/BuildBun.cmake:422 (execute_process):
execute_process failed command indexes:
1: "Abnormal exit with child return code: no such file or directory"
Call Stack (most recent call first):
CMakeLists.txt:66 (include)
```
### What does this PR do?
Before this change, downloading WebKit would fail silently if there were
e.g. a connection or TLS certificate issue. I experienced this when
trying to build bun in an overly-restrictive sandbox.
After this change, the failure reason will be visible.
### How did you verify your code works?
<details>
<summary>Brought down my network interface and ran the build:</summary>
```sh
$ node ./scripts/build.mjs \
-GNinja \
-DCMAKE_BUILD_TYPE=Debug \
-B build/debug \
--log-level=NOTICE \
-DBUN_EXECUTABLE="$(which node)" \
-DNPM_EXECUTABLE="$(which npm)" \
-DZIG_EXECUTABLE="$(which zig)" \
-DENABLE_ASAN=OFF
Globbed 1971 sources [180.67ms]
CMake Configure
$ cmake -G Ninja -DCMAKE_BUILD_TYPE=Debug -B /work/bun/build/debug --log-level NOTICE -DBUN_EXECUTABLE=/bin/node -DNPM_EXECUTABLE=/bin/npm -DZIG_EXECUTABLE=/bin/zig -DENABLE_ASAN=OFF -S /work/bun -DCACHE_STRATEGY=auto
sccache: Using local cache strategy.
...
```
</details>
Result:
```
CMake Error at cmake/tools/SetupWebKit.cmake:99 (message):
Failed to download WebKit: 6;"Could not resolve hostname"
Call Stack (most recent call first):
cmake/targets/BuildBun.cmake:1204 (include)
CMakeLists.txt:66 (include)
```
## Summary
Fixes #20821
When `bun build --compile` was used with 8 or more embedded files, the
compiled binary would silently fail to execute any code (exit code 0, no
output).
**Root cause:** Chunks were sorted alphabetically by their `entry_bits` key bytes. For entry point 0, the key starts with bit 0 set (byte pattern `0x01`), but for entry point 8, bit 8 falls in the second byte (pattern `0x00, 0x01`). Alphabetically, `0x00 < 0x01`, so entry point 8's chunk sorted before entry point 0's.
This caused the wrong entry point to be identified as the main entry,
resulting in asset wrapper code being executed instead of the user's
code.
**Fix:** Custom sort that ensures `entry_point_id=0` (the main entry
point) always sorts first, with remaining chunks sorted alphabetically
for determinism.
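The ordering rule, sketched in TypeScript (names are illustrative):
```ts
type Chunk = { entryPointId: number; key: string };

// Main entry point (id 0) always first; the rest keep alphabetical
// order for deterministic output.
function compareChunks(a: Chunk, b: Chunk): number {
  if (a.entryPointId === 0) return -1;
  if (b.entryPointId === 0) return 1;
  return a.key < b.key ? -1 : a.key > b.key ? 1 : 0;
}
```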
## Test plan
- Added regression test `compile/ManyEmbeddedFiles` that embeds 8 files
and verifies the main entry point runs correctly
- Verified manually with reproduction case from issue
🤖 Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
It is standard practice to send HTTP/2 WINDOW_UPDATE frames once half the window is used up:
- [nghttp2](https://en.wikipedia.org/wiki/Nghttp2#HTTP/2_implementation)
- [Go](https://github.com/golang/net/blob/master/http2/flow.go#L31)
- [Rust
h2](https://github.com/hyperium/h2/blob/master/src/proto/streams/flow_control.rs#L17)
The current behaviour of Bun is to send a window update only once the client has sent exactly 65535 bytes. This interrupts throughput while the client waits for the window update to arrive.
This is not just a performance concern, however; some clients will not handle it well, since stopping even 1 byte short of 65535 produces a deadlock. I came across this issue because the Rust `hyper` crate appears to always reserve an additional 1 byte of the connection window for each HTTP/2 stream
(https://github.com/hyperium/hyper/blob/master/src/proto/h2/mod.rs#L122).
This means that with two concurrent requests, the client treats the window as exhausted when it actually has a byte remaining, creating a sequential dependency between the requests that can deadlock if they depend on running concurrently. I accept this is not a problem with Bun itself, but it's a happy accident that we can resolve such off-by-one issues by increasing the window size once it is 50% utilized.
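The threshold behaviour, sketched (names and structure are illustrative):
```ts
const INITIAL_WINDOW = 65535;

// Send a WINDOW_UPDATE once half the window is consumed, instead of
// waiting for the window to be exhausted exactly.
function onDataReceived(state: { consumed: number }, bytes: number): number {
  state.consumed += bytes;
  if (state.consumed >= INITIAL_WINDOW / 2) {
    const increment = state.consumed; // replenish what was consumed
    state.consumed = 0;
    return increment; // caller sends WINDOW_UPDATE with this increment
  }
  return 0; // no update yet
}
```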
### How did you verify your code works?
Using wireshark, bun debug logging, and client logging I can observe
that the window updates are now sent after 32767 bytes. This also
resolves the h2 crate client issue.
### What does this PR do?
- I was trying to understand how SEA bundling works in Bun and noticed the size of the `Offsets` struct is hard-coded here to 32. It should use the computed size to be future-proof against changes in the schema.
### How did you verify your code works?
- I didn't. Can add tests if this is not covered by existing tests.
ChatGPT agreed with me though. =)
## Summary
This PR fixes a memory leak that occurs when `fetch()` is called with a
`ReadableStream` body. The ReadableStream objects were not being
properly released, causing them to accumulate in memory.
## Problem
When using `fetch()` with a ReadableStream body:
```javascript
const stream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode("data"));
controller.close();
}
});
await fetch(url, { method: "POST", body: stream });
```
The ReadableStream objects leak because `FetchTasklet.clearData()` has a
conditional that prevents `detach()` from being called on ReadableStream
request bodies after streaming has started.
### Root Cause
The problematic condition in `clearData()`:
```zig
if (this.request_body != .ReadableStream or this.is_waiting_request_stream_start) {
this.request_body.detach();
}
```
After `startRequestStream()` is called:
- `is_waiting_request_stream_start` becomes `false`
- `request_body` is still `.ReadableStream`
- The condition evaluates to `(false or false) = false`
- `detach()` is skipped → **memory leak**
### Why the Original Code Was Wrong
The original code appears to assume that when `startRequestStream()` is
called, ownership of the Strong reference is transferred to
`ResumableSink`. However, this is incorrect:
1. `startRequestStream()` creates a **new independent** Strong reference
in `ResumableSink` (see `ResumableSink.zig:119`)
2. The FetchTasklet's original reference is **not transferred** - it
becomes redundant
3. Strong references in Bun are independent - calling `deinit()` on one
does not affect the other
## Solution
Remove the conditional and always call `detach()`:
```zig
// Always detach request_body regardless of type.
// When request_body is a ReadableStream, startRequestStream() creates
// an independent Strong reference in ResumableSink, so FetchTasklet's
// reference becomes redundant and must be released to avoid leaks.
this.request_body.detach();
```
### Safety Analysis
This change is safe because:
1. **Strong references are independent**: Each Strong reference
maintains its own ref count. Detaching FetchTasklet's reference doesn't
affect ResumableSink's reference
2. **Idempotency**: `detach()` is safe to call on already-detached
references
3. **Timing**: `clearData()` is only called from `deinit()` after
streaming has completed (ref_count = 0)
4. **No UAF risk**: `deinit()` only runs when ref_count reaches 0, which
means all streaming operations have completed
## Test Results
Before fix (with system Bun):
```
Expected: <= 100
Received: 501 (Request objects leaked)
Received: 1002 (ReadableStream objects leaked)
```
After fix:
```
6 pass
0 fail
```
## Test Coverage
Added comprehensive tests in
`test/js/web/fetch/fetch-cyclic-reference.test.ts` covering:
- Response stream leaks with cyclic references
- Streaming response body leaks
- Request body stream leaks with cyclic references
- ReadableStream body leaks (no cyclic reference needed to reproduce)
- Concurrent fetch operations with cyclic references
---------
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Include the Windows module definition file (symbols.def) in
link-metadata.json, similar to how
symbols.txt and symbols.dyn are already included for macOS and Linux.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Claude <noreply@anthropic.com>
- Introduced `requestPayer` option in S3-related functions and
structures to handle Requester Pays buckets.
- Updated S3 client methods to accept and propagate the `requestPayer`
flag.
- Enhanced documentation for the `requestPayer` option in the S3 type
definitions.
- Adjusted existing S3 operations to utilize the `requestPayer`
parameter where applicable, ensuring compatibility with AWS S3's
Requester Pays feature.
- Ensured that the new functionality is integrated into multipart
uploads and simple requests.
### What does this PR do?
This change allows users to specify whether they are willing to pay for
data transfer costs when accessing objects in Requester Pays buckets,
improving flexibility and compliance with AWS S3's billing model.
This closes #25499
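A hedged usage sketch (credentials omitted; the exact option placement is an assumption, so see the type-definition changes in this PR):
```ts
import { S3Client } from "bun";

// Assumption: requestPayer is accepted per-file and defaults to false.
const s3 = new S3Client({ bucket: "hl-mainnet-evm-blocks" });
const file = s3.file("0/0/1.rmp.lz4", { requestPayer: true });
const bytes = await file.bytes(); // requester agrees to pay transfer costs
```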
### How did you verify your code works?
I have added a new test file to verify this functionality, and all my
tests pass.
I also tested this against an actual S3 bucket which can only be accessed when the requester pays. I can confirm that it's accessible when `requestPayer` is `true`, and the default of `false` does not allow access.
An example bucket is here: s3://hl-mainnet-evm-blocks/0/0/1.rmp.lz4
(my usecase is indexing [hyperliquid block
data](https://hyperliquid.gitbook.io/hyperliquid-docs/for-developers/hyperevm/raw-hyperevm-block-data)
which is stored in s3, and I want to use bun to index faster)
---------
Co-authored-by: Alistair Smith <hi@alistair.sh>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
### What does this PR do?
The `sql()` helper now filters out `undefined` values in INSERT
statements instead of converting them to `NULL`. This allows columns
with `DEFAULT` values to use their defaults when `undefined` is passed,
rather than being overridden with `NULL`.
**Before:** `sql({ foo: undefined, id: "123" })` in INSERT would
generate `(foo, id) VALUES (NULL, "123")`, causing NOT NULL constraint
violations even when the column has a DEFAULT.
**After:** `sql({ foo: undefined, id: "123" })` in INSERT generates
`(id) VALUES ("123")`, omitting the undefined column entirely and
letting the database use the DEFAULT value.
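The behaviour in a tagged-template insert (table and columns are illustrative):
```ts
import { sql } from "bun";

// `foo` is undefined, so it is omitted entirely and the column's
// DEFAULT applies instead of NULL being inserted.
await sql`INSERT INTO users ${sql({ id: "123", foo: undefined })}`;
// Effectively: INSERT INTO users (id) VALUES ('123')
```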
Also fixes a data loss bug in bulk inserts where columns were determined
only from the first item - now all items are checked, so values in later
items aren't silently dropped.
Fixes#25829
### How did you verify your code works?
- Added regression test for #25829 (NOT NULL column with DEFAULT)
- Added tests for bulk insert with mixed undefined patterns which is the
data loss scenario