Compare commits


5 Commits

Author SHA1 Message Date
Claude Bot
55792c6e66 fix(http): prevent out-of-bounds write in buildRequest with excessive headers
The buildRequest function copied user-provided headers into a fixed-size
global array of 256 picohttp.Header entries without bounds checking. When
more than ~250 headers were provided via fetch(), the function would write
past the end of shared_request_headers_buf, corrupting adjacent memory
(shared_response_headers_buf) in release builds or panicking in debug builds.

Fix by dynamically allocating a larger buffer when the header count exceeds
the static buffer capacity, with the overflow buffer cached for reuse. Also
add bounds checks as a safety net for the allocation failure fallback path.
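
The grow-and-cache strategy can be sketched in TypeScript (hypothetical names; the actual fix is in Zig and allocates `picohttp.Header` entries):

```typescript
// Sketch of the fix's buffer-selection logic (hypothetical names).
const STATIC_CAPACITY = 256;
const staticBuf: string[] = new Array(STATIC_CAPACITY);
let overflowBuf: string[] | null = null; // cached for reuse across requests

function pickHeaderBuffer(userHeaderCount: number): string[] {
  // Up to 6 extra headers (Connection, User-Agent, Accept, Host,
  // Accept-Encoding, Content-Length/Transfer-Encoding) may be appended.
  const maxHeaders = userHeaderCount + 6;
  if (maxHeaders <= STATIC_CAPACITY) return staticBuf;
  // Reuse the cached overflow buffer if it is already large enough.
  if (overflowBuf !== null && overflowBuf.length >= maxHeaders) return overflowBuf;
  overflowBuf = new Array(maxHeaders);
  return overflowBuf;
}
```

The cache means a client that repeatedly sends many headers pays for at most one allocation per high-water mark, not one per request.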

Co-Authored-By: Claude <noreply@anthropic.com>
2026-02-12 04:50:09 +00:00
Dylan Conway
50e478dcdc fix(crypto): correct method and constructor names mangled by number renamer (#26876)
## Problem

The bundler's number renamer was mangling `.name` properties on crypto
class prototype methods and constructors:

- `hash.update.name` → `"update2"` instead of `"update"`
- `verify.verify.name` → `"verify2"` instead of `"verify"`
- `cipher.update.name` → `"update3"` instead of `"update"`
- `crypto.Hash.name` → `"Hash2"` instead of `"Hash"`

### Root causes

1. **Named function expressions on prototypes** collided with other
bindings after scope flattening (e.g. `Verify.prototype.verify =
function verify(...)` collided with the imported `verify`)
2. **Block-scoped constructor declarations** (`Hash`, `Hmac`) got
renamed when the bundler hoisted them out of block scope
3. **Shared function declarations** in the Cipher/Decipher block all got
numeric suffixes (`update3`, `final2`, `setAutoPadding2`, etc.)

## Fix

- Use `Object.assign` with object literals for prototype methods —
object literal property keys correctly infer `.name` and aren't subject
to the renamer
- Remove unnecessary block scopes around `Hash` and `Hmac` constructors
so they stay at module level and don't get renamed
- Inline `Cipheriv` methods and copy references to `Decipheriv`
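
A small example of why the `Object.assign` approach is rename-proof: an object-literal property value gets its `.name` inferred from the key at creation time, whereas a named function expression exposes whatever identifier the renamer emits:

```typescript
// A renamer may turn `function update(...)` into `function update2(...)`,
// and a named function expression exposes that identifier via `.name`:
const renamed = function update2() {};

// An anonymous function in an object literal infers `.name` from the
// property key, so renaming inner identifiers cannot affect it:
const proto = Object.assign({}, {
  update: function () {},
});

console.log(renamed.name); // "update2"
console.log(proto.update.name); // "update"
```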

## Tests

Added comprehensive `.name` tests for all crypto classes: Hash, Hmac,
Sign, Verify, Cipheriv, Decipheriv, DiffieHellman, ECDH, plus factory
functions and constructor names.
2026-02-10 23:06:22 -08:00
robobun
e8f73601c0 fix(compile): use remaining buffer in standalone POSIX write loop (#26882)
## What does this PR do?

Fixes the write loop in `StandaloneModuleGraph.inject()` for POSIX
targets (the `else` branch handling ELF/Linux standalone binaries) to
pass `remain` instead of `bytes` to `Syscall.write()`.

## Problem

The write loop that appends the bundled module graph to the end of the
executable uses a standard partial-write retry pattern, but passes the
full `bytes` buffer on every iteration instead of the remaining portion:

```zig
var remain = bytes;
while (remain.len > 0) {
    switch (Syscall.write(cloned_executable_fd, bytes)) {   // bug: should be 'remain'
        .result => |written| remain = remain[written..],
        ...
    }
}
```

If a partial write occurs, the next iteration re-writes from the start
of the buffer instead of continuing where it left off, corrupting the
output binary. The analogous read loop elsewhere in the same file
already correctly uses `remain`.

## Fix

One-character change: `bytes` → `remain` in the `Syscall.write()` call.
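
The same retry pattern, sketched in TypeScript with Node's `fs` API (illustrative only; the real code is the Zig loop above):

```typescript
import { openSync, writeSync, closeSync, readFileSync, unlinkSync } from "fs";
import { join } from "path";
import { tmpdir } from "os";

// Correct partial-write loop: each iteration passes the *remaining*
// slice, not the full buffer, so a short write resumes where it left off.
function writeAll(fd: number, bytes: Uint8Array): void {
  let remain = bytes;
  while (remain.length > 0) {
    const written = writeSync(fd, remain); // passing `bytes` here was the bug
    remain = remain.subarray(written);
  }
}

const path = join(tmpdir(), "writeall-demo.bin");
const fd = openSync(path, "w");
writeAll(fd, new TextEncoder().encode("standalone module graph"));
closeSync(fd);

const roundTripped = readFileSync(path, "utf8");
unlinkSync(path);
```

With the buggy version, any short write would re-emit the start of the buffer, so the output length would be right but the contents shifted — exactly the corruption described above.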

## How did you verify your code works?

- `bun bd` compiles successfully
- `bun bd test test/bundler/bun-build-compile.test.ts` — 4/4 pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Alistair Smith <alistair@anthropic.com>
2026-02-10 23:04:46 -08:00
robobun
ba6e84fecd fix(compile): seek to start of file before EXDEV cross-device copy (#26883)
## What does this PR do?

Fixes `bun build --compile` producing an all-zeros binary when the
output directory is on a different filesystem than the temp directory.
This is common in Docker containers, Gitea runners, and other
environments using overlayfs.

## Problem

When `inject()` finishes writing the modified executable to the temp
file, the file descriptor's offset is at EOF. If the subsequent
`renameat()` to the output path fails with `EXDEV` (cross-device — the
temp file and output dir are on different filesystems), the code falls
back to `copyFileZSlowWithHandle()`, which:

1. Calls `fallocate()` to pre-allocate the output file to the correct
size (filled with zeros)
2. Calls `bun.copyFile(in_handle, out_handle)` — but `in_handle`'s
offset is at EOF
3. `copy_file_range` / `sendfile` / `read` all use the current file
offset (EOF), read 0 bytes, and return immediately
4. Result: output file is the correct size but entirely zeros

This explains user reports of `bun build --compile
--target=bun-darwin-arm64` producing invalid binaries that `file`
identifies as "data" rather than a Mach-O executable.

## Fix

Seek the input fd to offset 0 in `copyFileZSlowWithHandle` before
calling `bun.copyFile`.
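
The offset behavior is easy to reproduce in TypeScript with Node's `fs` API: after writing, a read that uses the current file offset sees EOF and returns 0 bytes, while an explicit position 0 (the analogue of the `setFileOffset(in_handle, 0)` call) recovers the data:

```typescript
import { openSync, writeSync, readSync, closeSync, unlinkSync } from "fs";
import { join } from "path";
import { tmpdir } from "os";

const path = join(tmpdir(), "offset-demo.bin");
const fd = openSync(path, "w+");
writeSync(fd, Buffer.from("data"));

const buf = Buffer.alloc(4);
// position=null: read from the current offset, which sits at EOF
// after the write — exactly the bug's setup.
const bytesAtEof = readSync(fd, buf, 0, 4, null);

// position=0: read from the start of the file, like seeking back first.
const bytesFromStart = readSync(fd, buf, 0, 4, 0);

closeSync(fd);
unlinkSync(path);
```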

## How did you verify your code works?

- `bun bd` compiles successfully
- `bun bd test test/bundler/bun-build-compile.test.ts` — 6/6 pass
- Added tests that verify compiled binaries have valid executable
headers and produce correct output
- Manually verified cross-compilation: `bun build --compile
--target=bun-darwin-arm64` produces a valid Mach-O binary

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-10 22:32:31 -08:00
SUZUKI Sosuke
e29e830a25 perf(path): use pre-built Structure for path.parse() result objects (#26865)
## Summary

Optimize `path.parse()` by caching a pre-built JSC Structure for the
result object. Instead of creating a new empty object and performing 5
`putDirect` calls (each triggering a Structure transition), we now use
`constructEmptyObject` with the cached Structure and write values
directly via `putDirectOffset`.

## What changed

- **`ZigGlobalObject.h/cpp`**: Added `m_pathParsedObjectStructure` as a
`LazyPropertyOfGlobalObject<Structure>` with fixed property offsets for
`{root, dir, base, ext, name}`.
- **`Path.cpp`**: Added `PathParsedObject__create` extern "C" factory
function that constructs the object using the pre-built Structure and
`putDirectOffset`.
- **`path.zig`**: Replaced `toJSObject()` implementation to call the C++
factory function instead of `createEmptyObject` + 5x `.put()`.
- **`bench/snippets/path-parse.mjs`**: Added benchmark for
`path.parse()`.

This follows the same pattern used by `JSSocketAddressDTO` and
`m_jsonlParseResultStructure`.
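
The optimization relies on `path.parse()` always producing the same five-property shape, which a quick TypeScript check confirms:

```typescript
import { posix, win32 } from "path";

// Every parse() result has the same keys in the same insertion order,
// so one cached Structure (hidden class) fits all of them.
const keys = Object.keys(posix.parse("/home/user/dir/file.txt"));
console.log(keys); // ["root", "dir", "base", "ext", "name"]

// The shape is identical for win32 and for edge cases like "".
const sameShape = [win32.parse("file.txt"), posix.parse("")]
  .every(r => Object.keys(r).join() === keys.join());
```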

## Benchmark results (Apple M4 Max)

| Benchmark | Before (1.3.9) | After | Speedup |
|---|---|---|---|
| `posix.parse("/home/user/dir/file.txt")` | 266.71 ns | 119.62 ns | **2.23x** |
| `posix.parse("/home/user/dir/")` | 239.10 ns | 91.46 ns | **2.61x** |
| `posix.parse("file.txt")` | 232.55 ns | 89.20 ns | **2.61x** |
| `posix.parse("/root")` | 246.75 ns | 92.68 ns | **2.66x** |
| `posix.parse("")` | 152.19 ns | 20.72 ns | **7.34x** |
| `win32.parse("/home/user/dir/file.txt")` | 260.08 ns | 118.12 ns | **2.20x** |
| `win32.parse("/home/user/dir/")` | 234.35 ns | 93.47 ns | **2.51x** |
| `win32.parse("file.txt")` | 224.19 ns | 80.56 ns | **2.78x** |
| `win32.parse("/root")` | 241.20 ns | 88.23 ns | **2.73x** |
| `win32.parse("")` | 160.39 ns | 24.20 ns | **6.63x** |

**~2.2x–2.8x faster** for typical paths, **~7x faster** for empty
strings.

## GC Safety

- `LazyPropertyOfGlobalObject<Structure>` is automatically visited via
`FOR_EACH_GLOBALOBJECT_GC_MEMBER`.
- JSValues created by `createUTF8ForJS` are protected by JSC's
conservative stack scanning during the factory function call.

## Test

All 116 existing path tests pass (`bun bd test test/js/node/path/`).

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-02-10 22:32:08 -08:00
21 changed files with 574 additions and 465 deletions

View File

@@ -0,0 +1,18 @@
import { posix, win32 } from "path";
import { bench, run } from "../runner.mjs";
const paths = ["/home/user/dir/file.txt", "/home/user/dir/", "file.txt", "/root", ""];
paths.forEach(p => {
  bench(`posix.parse(${JSON.stringify(p)})`, () => {
    globalThis.abc = posix.parse(p);
  });
});
paths.forEach(p => {
  bench(`win32.parse(${JSON.stringify(p)})`, () => {
    globalThis.abc = win32.parse(p);
  });
});
await run();

View File

@@ -935,7 +935,7 @@ pub const StandaloneModuleGraph = struct {
var remain = bytes;
while (remain.len > 0) {
- switch (Syscall.write(cloned_executable_fd, bytes)) {
+ switch (Syscall.write(cloned_executable_fd, remain)) {
.result => |written| remain = remain[written..],
.err => |err| {
Output.prettyErrorln("<r><red>error<r><d>:<r> failed to write to temporary file\n{f}", .{err});

View File

@@ -114,6 +114,25 @@ static JSC::JSObject* createPath(JSGlobalObject* globalThis, bool isWindows)
} // namespace Zig
extern "C" JSC::EncodedJSValue PathParsedObject__create(
JSC::JSGlobalObject* globalObject,
JSC::EncodedJSValue root,
JSC::EncodedJSValue dir,
JSC::EncodedJSValue base,
JSC::EncodedJSValue ext,
JSC::EncodedJSValue name)
{
auto* global = JSC::jsCast<Zig::GlobalObject*>(globalObject);
auto& vm = JSC::getVM(globalObject);
JSC::JSObject* result = JSC::constructEmptyObject(vm, global->pathParsedObjectStructure());
result->putDirectOffset(vm, 0, JSC::JSValue::decode(root));
result->putDirectOffset(vm, 1, JSC::JSValue::decode(dir));
result->putDirectOffset(vm, 2, JSC::JSValue::decode(base));
result->putDirectOffset(vm, 3, JSC::JSValue::decode(ext));
result->putDirectOffset(vm, 4, JSC::JSValue::decode(name));
return JSC::JSValue::encode(result);
}
namespace Bun {
JSC::JSValue createNodePathBinding(Zig::GlobalObject* globalObject)

View File

@@ -2067,6 +2067,30 @@ void GlobalObject::finishCreation(VM& vm)
init.set(structure);
});
this->m_pathParsedObjectStructure.initLater(
[](const Initializer<Structure>& init) {
// { root, dir, base, ext, name } — path.parse() result
Structure* structure = init.owner->structureCache().emptyObjectStructureForPrototype(
init.owner, init.owner->objectPrototype(), 5);
PropertyOffset offset;
structure = Structure::addPropertyTransition(init.vm, structure,
Identifier::fromString(init.vm, "root"_s), 0, offset);
RELEASE_ASSERT(offset == 0);
structure = Structure::addPropertyTransition(init.vm, structure,
Identifier::fromString(init.vm, "dir"_s), 0, offset);
RELEASE_ASSERT(offset == 1);
structure = Structure::addPropertyTransition(init.vm, structure,
Identifier::fromString(init.vm, "base"_s), 0, offset);
RELEASE_ASSERT(offset == 2);
structure = Structure::addPropertyTransition(init.vm, structure,
Identifier::fromString(init.vm, "ext"_s), 0, offset);
RELEASE_ASSERT(offset == 3);
structure = Structure::addPropertyTransition(init.vm, structure,
init.vm.propertyNames->name, 0, offset);
RELEASE_ASSERT(offset == 4);
init.set(structure);
});
this->m_pendingVirtualModuleResultStructure.initLater(
[](const Initializer<Structure>& init) {
init.set(Bun::PendingVirtualModuleResult::createStructure(init.vm, init.owner, init.owner->objectPrototype()));

View File

@@ -567,6 +567,7 @@ public:
V(public, LazyClassStructure, m_JSHTTPParserClassStructure) \
\
V(private, LazyPropertyOfGlobalObject<Structure>, m_jsonlParseResultStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_pathParsedObjectStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_pendingVirtualModuleResultStructure) \
V(private, LazyPropertyOfGlobalObject<JSFunction>, m_performMicrotaskFunction) \
V(private, LazyPropertyOfGlobalObject<JSFunction>, m_nativeMicrotaskTrampoline) \
@@ -702,6 +703,7 @@ public:
void reload();
JSC::Structure* jsonlParseResultStructure() { return m_jsonlParseResultStructure.get(this); }
JSC::Structure* pathParsedObjectStructure() { return m_pathParsedObjectStructure.get(this); }
JSC::Structure* pendingVirtualModuleResultStructure() { return m_pendingVirtualModuleResultStructure.get(this); }
// We need to know if the napi module registered itself or we registered it.

View File

@@ -71,13 +71,12 @@ fn PathParsed(comptime T: type) type {
name: []const T = "",
pub fn toJSObject(this: @This(), globalObject: *jsc.JSGlobalObject) bun.JSError!jsc.JSValue {
- var jsObject = jsc.JSValue.createEmptyObject(globalObject, 5);
- jsObject.put(globalObject, jsc.ZigString.static("root"), try bun.String.createUTF8ForJS(globalObject, this.root));
- jsObject.put(globalObject, jsc.ZigString.static("dir"), try bun.String.createUTF8ForJS(globalObject, this.dir));
- jsObject.put(globalObject, jsc.ZigString.static("base"), try bun.String.createUTF8ForJS(globalObject, this.base));
- jsObject.put(globalObject, jsc.ZigString.static("ext"), try bun.String.createUTF8ForJS(globalObject, this.ext));
- jsObject.put(globalObject, jsc.ZigString.static("name"), try bun.String.createUTF8ForJS(globalObject, this.name));
- return jsObject;
+ const root = try bun.String.createUTF8ForJS(globalObject, this.root);
+ const dir = try bun.String.createUTF8ForJS(globalObject, this.dir);
+ const base = try bun.String.createUTF8ForJS(globalObject, this.base);
+ const ext = try bun.String.createUTF8ForJS(globalObject, this.ext);
+ const name_val = try bun.String.createUTF8ForJS(globalObject, this.name);
+ return PathParsedObject__create(globalObject, root, dir, base, ext, name_val);
}
};
}
@@ -2748,6 +2747,14 @@ pub fn resolveJS_T(comptime T: type, globalObject: *jsc.JSGlobalObject, allocato
}
extern "c" fn Process__getCachedCwd(*jsc.JSGlobalObject) jsc.JSValue;
extern "c" fn PathParsedObject__create(
*jsc.JSGlobalObject,
jsc.JSValue,
jsc.JSValue,
jsc.JSValue,
jsc.JSValue,
jsc.JSValue,
) jsc.JSValue;
pub fn resolve(globalObject: *jsc.JSGlobalObject, isWindows: bool, args_ptr: [*]jsc.JSValue, args_len: u16) bun.JSError!jsc.JSValue {
var arena = bun.ArenaAllocator.init(bun.default_allocator);

View File

@@ -23,6 +23,7 @@ var print_every_i: usize = 0;
// we always rewrite the entire HTTP request when write() returns EAGAIN
// so we can reuse this buffer
var shared_request_headers_buf: [256]picohttp.Header = undefined;
var shared_request_headers_overflow: ?[]picohttp.Header = null;
// this doesn't need to be stack memory because it is immediately cloned after use
var shared_response_headers_buf: [256]picohttp.Header = undefined;
@@ -605,7 +606,32 @@ pub fn buildRequest(this: *HTTPClient, body_len: usize) picohttp.Request {
var header_entries = this.header_entries.slice();
const header_names = header_entries.items(.name);
const header_values = header_entries.items(.value);
- var request_headers_buf = &shared_request_headers_buf;
+ // The maximum number of headers is the user-provided headers plus up to
+ // 6 extra headers that may be added below (Connection, User-Agent,
+ // Accept, Host, Accept-Encoding, Content-Length/Transfer-Encoding).
+ const max_headers = header_names.len + 6;
+ const static_buf_len = shared_request_headers_buf.len;
+ // Use the static buffer for the common case, dynamically allocate for overflow.
+ // The overflow buffer is kept around for reuse to avoid repeated allocations.
+ var request_headers_buf: []picohttp.Header = if (max_headers <= static_buf_len)
+ &shared_request_headers_buf
+ else blk: {
+ if (shared_request_headers_overflow) |overflow| {
+ if (overflow.len >= max_headers) {
+ break :blk overflow;
+ }
+ bun.default_allocator.free(overflow);
+ shared_request_headers_overflow = null;
+ }
+ const buf = bun.default_allocator.alloc(picohttp.Header, max_headers) catch
+ // On allocation failure, fall back to the static buffer and
+ // truncate headers rather than writing out of bounds.
+ break :blk @as([]picohttp.Header, &shared_request_headers_buf);
+ shared_request_headers_overflow = buf;
+ break :blk buf;
+ };
var override_accept_encoding = false;
var override_accept_header = false;
@@ -667,43 +693,32 @@ pub fn buildRequest(this: *HTTPClient, body_len: usize) picohttp.Request {
else => {},
}
if (header_count >= request_headers_buf.len) break;
request_headers_buf[header_count] = .{
.name = name,
.value = this.headerStr(header_values[i]),
};
// header_name_hashes[header_count] = hash;
// // ensure duplicate headers come after each other
// if (header_count > 2) {
// var head_i: usize = header_count - 1;
// while (head_i > 0) : (head_i -= 1) {
// if (header_name_hashes[head_i] == header_name_hashes[header_count]) {
// std.mem.swap(picohttp.Header, &header_name_hashes[header_count], &header_name_hashes[head_i + 1]);
// std.mem.swap(u64, &request_headers_buf[header_count], &request_headers_buf[head_i + 1]);
// break;
// }
// }
// }
header_count += 1;
}
- if (!override_connection_header and !this.flags.disable_keepalive) {
+ if (!override_connection_header and !this.flags.disable_keepalive and header_count < request_headers_buf.len) {
request_headers_buf[header_count] = connection_header;
header_count += 1;
}
- if (!override_user_agent) {
+ if (!override_user_agent and header_count < request_headers_buf.len) {
request_headers_buf[header_count] = getUserAgentHeader();
header_count += 1;
}
- if (!override_accept_header) {
+ if (!override_accept_header and header_count < request_headers_buf.len) {
request_headers_buf[header_count] = accept_header;
header_count += 1;
}
- if (!override_host_header) {
+ if (!override_host_header and header_count < request_headers_buf.len) {
request_headers_buf[header_count] = .{
.name = host_header_name,
.value = this.url.host,
@@ -711,31 +726,33 @@ pub fn buildRequest(this: *HTTPClient, body_len: usize) picohttp.Request {
header_count += 1;
}
- if (!override_accept_encoding and !this.flags.disable_decompression) {
+ if (!override_accept_encoding and !this.flags.disable_decompression and header_count < request_headers_buf.len) {
request_headers_buf[header_count] = accept_encoding_header;
header_count += 1;
}
if (body_len > 0 or this.method.hasRequestBody()) {
if (this.flags.is_streaming_request_body) {
if (add_transfer_encoding and this.flags.upgrade_state == .none) {
request_headers_buf[header_count] = chunked_encoded_header;
if (header_count < request_headers_buf.len) {
if (body_len > 0 or this.method.hasRequestBody()) {
if (this.flags.is_streaming_request_body) {
if (add_transfer_encoding and this.flags.upgrade_state == .none) {
request_headers_buf[header_count] = chunked_encoded_header;
header_count += 1;
}
} else {
request_headers_buf[header_count] = .{
.name = content_length_header_name,
.value = std.fmt.bufPrint(&this.request_content_len_buf, "{d}", .{body_len}) catch "0",
};
header_count += 1;
}
} else {
} else if (original_content_length) |content_length| {
request_headers_buf[header_count] = .{
.name = content_length_header_name,
.value = std.fmt.bufPrint(&this.request_content_len_buf, "{d}", .{body_len}) catch "0",
.value = content_length,
};
header_count += 1;
}
} else if (original_content_length) |content_length| {
request_headers_buf[header_count] = .{
.name = content_length_header_name,
.value = content_length,
};
header_count += 1;
}
return picohttp.Request{

View File

@@ -199,18 +199,18 @@ function Sign(algorithm, options): void {
}
$toClass(Sign, "Sign", Writable);
- Sign.prototype._write = function _write(chunk, encoding, callback) {
- this.update(chunk, encoding);
- callback();
- };
- Sign.prototype.update = function update(data, encoding) {
- return this[kHandle].update(this, data, encoding);
- };
- Sign.prototype.sign = function sign(options, encoding) {
- return this[kHandle].sign(options, encoding);
- };
+ Object.assign(Sign.prototype, {
+ _write: function (chunk, encoding, callback) {
+ this.update(chunk, encoding);
+ callback();
+ },
+ update: function (data, encoding) {
+ return this[kHandle].update(this, data, encoding);
+ },
+ sign: function (options, encoding) {
+ return this[kHandle].sign(options, encoding);
+ },
+ });
crypto_exports.Sign = Sign;
crypto_exports.sign = sign;
@@ -237,9 +237,11 @@ $toClass(Verify, "Verify", Writable);
Verify.prototype._write = Sign.prototype._write;
Verify.prototype.update = Sign.prototype.update;
- Verify.prototype.verify = function verify(options, signature, sigEncoding) {
- return this[kHandle].verify(options, signature, sigEncoding);
- };
+ Object.assign(Verify.prototype, {
+ verify: function (options, signature, sigEncoding) {
+ return this[kHandle].verify(options, signature, sigEncoding);
+ },
+ });
crypto_exports.Verify = Verify;
crypto_exports.verify = verify;
@@ -250,82 +252,76 @@ function createVerify(algorithm, options?) {
crypto_exports.createVerify = createVerify;
{
function Hash(algorithm, options?): void {
if (!new.target) {
return new Hash(algorithm, options);
}
const handle = new _Hash(algorithm, options);
this[kHandle] = handle;
LazyTransform.$apply(this, [options]);
function Hash(algorithm, options?): void {
if (!new.target) {
return new Hash(algorithm, options);
}
$toClass(Hash, "Hash", LazyTransform);
Hash.prototype.copy = function copy(options) {
const handle = new _Hash(algorithm, options);
this[kHandle] = handle;
LazyTransform.$apply(this, [options]);
}
$toClass(Hash, "Hash", LazyTransform);
Object.assign(Hash.prototype, {
copy: function (options) {
return new Hash(this[kHandle], options);
};
Hash.prototype._transform = function _transform(chunk, encoding, callback) {
},
_transform: function (chunk, encoding, callback) {
this[kHandle].update(this, chunk, encoding);
callback();
};
Hash.prototype._flush = function _flush(callback) {
},
_flush: function (callback) {
this.push(this[kHandle].digest(null, false));
callback();
};
Hash.prototype.update = function update(data, encoding) {
},
update: function (data, encoding) {
return this[kHandle].update(this, data, encoding);
};
Hash.prototype.digest = function digest(outputEncoding) {
},
digest: function (outputEncoding) {
return this[kHandle].digest(outputEncoding);
};
},
});
crypto_exports.Hash = Hash;
crypto_exports.createHash = function createHash(algorithm, options) {
return new Hash(algorithm, options);
};
}
crypto_exports.Hash = Hash;
crypto_exports.createHash = function createHash(algorithm, options) {
return new Hash(algorithm, options);
};
{
function Hmac(hmac, key, options?): void {
if (!new.target) {
return new Hmac(hmac, key, options);
}
const handle = new _Hmac(hmac, key, options);
this[kHandle] = handle;
LazyTransform.$apply(this, [options]);
function Hmac(hmac, key, options?): void {
if (!new.target) {
return new Hmac(hmac, key, options);
}
$toClass(Hmac, "Hmac", LazyTransform);
Hmac.prototype.update = function update(data, encoding) {
const handle = new _Hmac(hmac, key, options);
this[kHandle] = handle;
LazyTransform.$apply(this, [options]);
}
$toClass(Hmac, "Hmac", LazyTransform);
Object.assign(Hmac.prototype, {
update: function (data, encoding) {
return this[kHandle].update(this, data, encoding);
};
Hmac.prototype.digest = function digest(outputEncoding) {
},
digest: function (outputEncoding) {
return this[kHandle].digest(outputEncoding);
};
Hmac.prototype._transform = function _transform(chunk, encoding, callback) {
},
_transform: function (chunk, encoding, callback) {
this[kHandle].update(this, chunk, encoding);
callback();
};
Hmac.prototype._flush = function _flush(callback) {
},
_flush: function (callback) {
this.push(this[kHandle].digest());
callback();
};
},
});
crypto_exports.Hmac = Hmac;
crypto_exports.createHmac = function createHmac(hmac, key, options) {
return new Hmac(hmac, key, options);
};
}
crypto_exports.Hmac = Hmac;
crypto_exports.createHmac = function createHmac(hmac, key, options) {
return new Hmac(hmac, key, options);
};
crypto_exports.getHashes = getHashes;
@@ -390,62 +386,6 @@ crypto_exports.createECDH = function createECDH(curve) {
return decoder;
}
- function setAutoPadding(ap) {
- this[kHandle].setAutoPadding(ap);
- return this;
- }
- function getAuthTag() {
- return this[kHandle].getAuthTag();
- }
- function setAuthTag(tagbuf, encoding) {
- this[kHandle].setAuthTag(tagbuf, encoding);
- return this;
- }
- function setAAD(aadbuf, options) {
- this[kHandle].setAAD(aadbuf, options);
- return this;
- }
- function _transform(chunk, encoding, callback) {
- this.push(this[kHandle].update(chunk, encoding));
- callback();
- }
- function _flush(callback) {
- try {
- this.push(this[kHandle].final());
- } catch (e) {
- callback(e);
- return;
- }
- callback();
- }
- function update(data, inputEncoding, outputEncoding) {
- const res = this[kHandle].update(data, inputEncoding);
- if (outputEncoding && outputEncoding !== "buffer") {
- this._decoder = getDecoder(this._decoder, outputEncoding);
- return this._decoder.write(res);
- }
- return res;
- }
- function final(outputEncoding) {
- const res = this[kHandle].final();
- if (outputEncoding && outputEncoding !== "buffer") {
- this._decoder = getDecoder(this._decoder, outputEncoding);
- return this._decoder.end(res);
- }
- return res;
- }
function Cipheriv(cipher, key, iv, options): void {
if (!new.target) {
return new Cipheriv(cipher, key, iv, options);
@@ -457,13 +397,52 @@ crypto_exports.createECDH = function createECDH(curve) {
}
$toClass(Cipheriv, "Cipheriv", LazyTransform);
- Cipheriv.prototype.setAutoPadding = setAutoPadding;
- Cipheriv.prototype.getAuthTag = getAuthTag;
- Cipheriv.prototype.setAAD = setAAD;
- Cipheriv.prototype._transform = _transform;
- Cipheriv.prototype._flush = _flush;
- Cipheriv.prototype.update = update;
- Cipheriv.prototype.final = final;
+ Object.assign(Cipheriv.prototype, {
+ setAutoPadding: function (ap) {
+ this[kHandle].setAutoPadding(ap);
+ return this;
+ },
+ getAuthTag: function () {
+ return this[kHandle].getAuthTag();
+ },
+ setAAD: function (aadbuf, options) {
+ this[kHandle].setAAD(aadbuf, options);
+ return this;
+ },
+ _transform: function (chunk, encoding, callback) {
+ this.push(this[kHandle].update(chunk, encoding));
+ callback();
+ },
+ _flush: function (callback) {
+ try {
+ this.push(this[kHandle].final());
+ } catch (e) {
+ callback(e);
+ return;
+ }
+ callback();
+ },
+ update: function (data, inputEncoding, outputEncoding) {
+ const res = this[kHandle].update(data, inputEncoding);
+ if (outputEncoding && outputEncoding !== "buffer") {
+ this._decoder = getDecoder(this._decoder, outputEncoding);
+ return this._decoder.write(res);
+ }
+ return res;
+ },
+ final: function (outputEncoding) {
+ const res = this[kHandle].final();
+ if (outputEncoding && outputEncoding !== "buffer") {
+ this._decoder = getDecoder(this._decoder, outputEncoding);
+ return this._decoder.end(res);
+ }
+ return res;
+ },
+ });
function Decipheriv(cipher, key, iv, options): void {
if (!new.target) {
@@ -476,13 +455,18 @@ crypto_exports.createECDH = function createECDH(curve) {
}
$toClass(Decipheriv, "Decipheriv", LazyTransform);
- Decipheriv.prototype.setAutoPadding = setAutoPadding;
- Decipheriv.prototype.setAuthTag = setAuthTag;
- Decipheriv.prototype.setAAD = setAAD;
- Decipheriv.prototype._transform = _transform;
- Decipheriv.prototype._flush = _flush;
- Decipheriv.prototype.update = update;
- Decipheriv.prototype.final = final;
+ Object.assign(Decipheriv.prototype, {
+ setAutoPadding: Cipheriv.prototype.setAutoPadding,
+ setAuthTag: function (tagbuf, encoding) {
+ this[kHandle].setAuthTag(tagbuf, encoding);
+ return this;
+ },
+ setAAD: Cipheriv.prototype.setAAD,
+ _transform: Cipheriv.prototype._transform,
+ _flush: Cipheriv.prototype._flush,
+ update: Cipheriv.prototype.update,
+ final: Cipheriv.prototype.final,
+ });
crypto_exports.Cipheriv = Cipheriv;
crypto_exports.Decipheriv = Decipheriv;

View File

@@ -4092,6 +4092,12 @@ pub fn copyFileZSlowWithHandle(in_handle: bun.FileDescriptor, to_dir: bun.FileDe
_ = std.os.linux.fallocate(out_handle.cast(), 0, 0, @intCast(stat_.size));
}
// Seek input to beginning — the caller may have written to this fd,
// leaving the file offset at EOF. copy_file_range / sendfile / read
// all use the current offset when called with null offsets.
// Ignore errors: the fd may be non-seekable (e.g. a pipe).
_ = setFileOffset(in_handle, 0);
switch (bun.copyFile(in_handle, out_handle)) {
.err => |e| return .{ .err = e },
.result => {},

View File

@@ -121,4 +121,71 @@ describe("Bun.build compile", () => {
});
});
describe("compiled binary validity", () => {
test("output binary has valid executable header", async () => {
using dir = tempDir("build-compile-valid-header", {
"app.js": `console.log("hello");`,
});
const outfile = join(dir + "", "app-out");
const result = await Bun.build({
entrypoints: [join(dir + "", "app.js")],
compile: {
outfile,
},
});
expect(result.success).toBe(true);
// Read the first 4 bytes and verify it's a valid executable magic number
const file = Bun.file(result.outputs[0].path);
const header = new Uint8Array(await file.slice(0, 4).arrayBuffer());
if (isMacOS) {
// MachO magic: 0xCFFAEDFE (little-endian)
expect(header[0]).toBe(0xcf);
expect(header[1]).toBe(0xfa);
expect(header[2]).toBe(0xed);
expect(header[3]).toBe(0xfe);
} else if (isLinux) {
// ELF magic: 0x7F 'E' 'L' 'F'
expect(header[0]).toBe(0x7f);
expect(header[1]).toBe(0x45); // 'E'
expect(header[2]).toBe(0x4c); // 'L'
expect(header[3]).toBe(0x46); // 'F'
} else if (isWindows) {
// PE magic: 'M' 'Z'
expect(header[0]).toBe(0x4d); // 'M'
expect(header[1]).toBe(0x5a); // 'Z'
}
});
test("compiled binary runs and produces expected output", async () => {
using dir = tempDir("build-compile-runs", {
"app.js": `console.log("compile-test-output");`,
});
const outfile = join(dir + "", "app-run");
const result = await Bun.build({
entrypoints: [join(dir + "", "app.js")],
compile: {
outfile,
},
});
expect(result.success).toBe(true);
await using proc = Bun.spawn({
cmd: [result.outputs[0].path],
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout.trim()).toBe("compile-test-output");
expect(exitCode).toBe(0);
});
});
// file command test works well

View File

@@ -3,15 +3,16 @@ import { itBundled } from "../expectBundled";
describe("css", () => {
itBundled("css-module/GlobalPseudoFunction", {
files: {
"/index.module.css": /* css */ `
"index.module.css": /* css */ `
:global(.foo) {
color: red;
}
`,
},
outfile: "/out.css",
outdir: "/out",
entryPoints: ["/index.module.css"],
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`
api.expectFile("/out/index.module.css").toEqualIgnoringWhitespace(`
/* index.module.css */
.foo {
color: red;

View File

@@ -3,15 +3,16 @@ import { itBundled } from "../expectBundled";
describe("css", () => {
itBundled("css/is-selector", {
files: {
"/index.css": /* css */ `
"index.css": /* css */ `
.foo:is(input:checked) {
color: red;
}
`,
},
outfile: "/out.css",
outdir: "/out",
entryPoints: ["/index.css"],
onAfterBundle(api) {
api.expectFile("/out.css").toMatchInlineSnapshot(`
api.expectFile("/out/index.css").toMatchInlineSnapshot(`
"/* index.css */
.foo:-webkit-any(input:checked) {
color: red;

View File

@@ -3,7 +3,7 @@ import { itBundled } from "../expectBundled";
describe("css", () => {
itBundled("css/view-transition-class-selector-23600", {
files: {
"/index.css": /* css */ `
"index.css": /* css */ `
@keyframes slide-out {
from {
opacity: 1;
@@ -33,9 +33,10 @@ describe("css", () => {
}
`,
},
outfile: "/out.css",
outdir: "/out",
entryPoints: ["/index.css"],
onAfterBundle(api) {
api.expectFile("/out.css").toMatchInlineSnapshot(`
api.expectFile("/out/index.css").toMatchInlineSnapshot(`
"/* index.css */
@keyframes slide-out {
from {

View File

@@ -4,7 +4,6 @@ import { itBundled } from "../../expectBundled";
const runTest = (property: string, input: string, expected: string) => {
const testTitle = `${property}: ${input}`;
itBundled(testTitle, {
- virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -12,7 +11,7 @@ h1 {
}
`,
},
outfile: "/out.css",
outfile: "out.css",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`

View File

@@ -4,7 +4,6 @@ import { itBundled } from "../../expectBundled";
const runTest = (testTitle: string, input: string, expected: string) => {
testTitle = testTitle.length === 0 ? input : testTitle;
itBundled(testTitle, {
- virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -12,7 +11,7 @@ h1 {
}
`,
},
outfile: "/out.css",
outfile: "out.css",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`

View File

@@ -3,7 +3,6 @@ import { itBundled } from "../../expectBundled";
const runTest = (input: string, expected: string) => {
itBundled(input, {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -11,7 +10,7 @@ h1 {
}
`,
},
outfile: "/out.css",
outfile: "out.css",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`

View File

@@ -5,7 +5,6 @@ let i = 0;
const testname = () => `test-${i++}`;
describe("relative_color_out_of_gamut", () => {
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -13,7 +12,7 @@ h1 {
}
`,
},
outfile: "/out.css",
outfile: "out.css",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`
@@ -26,7 +25,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -47,7 +45,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -68,7 +65,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -89,7 +85,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -110,7 +105,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -131,7 +125,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -152,7 +145,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -173,7 +165,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -194,7 +185,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -215,7 +205,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -236,7 +225,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -257,7 +245,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -278,7 +265,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -299,7 +285,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -320,7 +305,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -341,7 +325,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -362,7 +345,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -383,7 +365,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -404,7 +385,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -425,7 +405,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -446,7 +425,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -467,7 +445,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -488,7 +465,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -509,7 +485,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -530,7 +505,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {
@@ -551,7 +525,6 @@ h1 {
});
itBundled(testname(), {
virtual: true,
files: {
"/a.css": /* css */ `
h1 {

View File

@@ -16,9 +16,9 @@ describe("bundler", () => {
color: black }
`,
},
outfile: "/out.css",
outfile: "/out.js",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`
api.expectFile("/out.js").toEqualIgnoringWhitespace(`
/* entry.css */
body {
color: #000;
@@ -31,9 +31,9 @@ describe("bundler", () => {
files: {
"/entry.css": /* css */ `\n`,
},
outfile: "/out.css",
outfile: "/out.js",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`
api.expectFile("/out.js").toEqualIgnoringWhitespace(`
/* entry.css */`);
},
});
@@ -48,12 +48,12 @@ describe("bundler", () => {
}
}`,
},
outfile: "/out.css",
outfile: "/out.js",
onAfterBundle(api) {
api.expectFile("/out.css").toEqualIgnoringWhitespace(`
api.expectFile("/out.js").toEqualIgnoringWhitespace(`
/* entry.css */
body {
& h1 {
&h1 {
color: #fff;
}
}

View File

@@ -299,13 +299,6 @@ export interface BundlerTestInput {
/** Run after the bun.build function is called with its output */
onAfterApiBundle?(build: BuildOutput): Promise<void> | void;
/**
* Run the build entirely in memory using Bun.build's `files` API.
* No temp directories or files are created. Outputs are read from BuildArtifact.text().
* The `onAfterBundle` callback still works with the same API.
*/
virtual?: boolean;
}
export interface SourceMapTests {
@@ -415,40 +408,6 @@ function testRef(id: string, options: BundlerTestInput): BundlerTestRef {
return { id, options };
}
/**
* Extract capture function calls from file contents.
* Finds all occurrences of fnName(...) and returns the argument contents.
*/
function extractCaptures(fileContents: string, file: string, fnName: string): string[] {
let i = 0;
const length = fileContents.length;
const matches: string[] = [];
while (i < length) {
i = fileContents.indexOf(fnName, i);
if (i === -1) break;
const start = i;
let depth = 0;
while (i < length) {
const char = fileContents[i];
if (char === "(") depth++;
else if (char === ")") {
depth--;
if (depth === 0) break;
}
i++;
}
if (depth !== 0) {
throw new Error(`Could not find closing paren for ${fnName} call in ${file}`);
}
matches.push(fileContents.slice(start + fnName.length + 1, i));
i++;
}
if (matches.length === 0) {
throw new Error(`No ${fnName} calls found in ${file}`);
}
return matches;
}
function expectBundled(
id: string,
opts: BundlerTestInput,
@@ -535,7 +494,6 @@ function expectBundled(
generateOutput = true,
onAfterApiBundle,
throw: _throw = false,
virtual = false,
...unknownProps
} = opts;
@@ -622,198 +580,6 @@ function expectBundled(
return testRef(id, opts);
}
// Virtual mode: run entirely in memory without disk I/O
if (virtual) {
// Validate that unsupported options are not set
const unsupportedOptions: string[] = [];
if (runtimeFiles && Object.keys(runtimeFiles).length > 0) unsupportedOptions.push("runtimeFiles");
if (run) unsupportedOptions.push("run");
if (dce) unsupportedOptions.push("dce");
if (cjs2esm) unsupportedOptions.push("cjs2esm");
if (matchesReference) unsupportedOptions.push("matchesReference");
if (snapshotSourceMap) unsupportedOptions.push("snapshotSourceMap");
if (expectExactFilesize) unsupportedOptions.push("expectExactFilesize");
if (onAfterApiBundle) unsupportedOptions.push("onAfterApiBundle");
if (bundleWarnings) unsupportedOptions.push("bundleWarnings");
if (keepNames) unsupportedOptions.push("keepNames");
if (emitDCEAnnotations) unsupportedOptions.push("emitDCEAnnotations");
if (ignoreDCEAnnotations) unsupportedOptions.push("ignoreDCEAnnotations");
if (bytecode) unsupportedOptions.push("bytecode");
if (compile) unsupportedOptions.push("compile");
if (features && features.length > 0) unsupportedOptions.push("features");
if (outdir) unsupportedOptions.push("outdir (use outfile instead)");
if (unsupportedOptions.length > 0) {
throw new Error(`Virtual mode does not support the following options: ${unsupportedOptions.join(", ")}`);
}
return (async () => {
// Prepare virtual files with dedent applied for strings, preserve binary content as-is
// Use relative paths (strip leading /) to get consistent path comments in CSS output
const virtualFiles: Record<string, string | Buffer | Uint8Array | Blob> = {};
for (const [file, contents] of Object.entries(files)) {
const relativePath = file.startsWith("/") ? file.slice(1) : file;
virtualFiles[relativePath] = typeof contents === "string" ? dedent(contents) : contents;
}
// Convert entrypoints to relative paths too
const relativeEntryPoints = entryPoints.map(ep => (ep.startsWith("/") ? ep.slice(1) : ep));
const build = await Bun.build({
entrypoints: relativeEntryPoints,
files: virtualFiles,
target,
format,
minify: {
whitespace: minifyWhitespace,
syntax: minifySyntax,
identifiers: minifyIdentifiers,
},
external,
plugins: typeof plugins === "function" ? [{ name: "plugin", setup: plugins }] : plugins,
splitting,
treeShaking,
sourcemap: sourceMap,
publicPath,
banner,
footer,
packages,
loader,
jsx: jsx
? {
runtime: jsx.runtime,
importSource: jsx.importSource,
factory: jsx.factory,
fragment: jsx.fragment,
sideEffects: jsx.sideEffects,
development: jsx.development,
}
: undefined,
define,
drop,
conditions,
});
const expectedErrors = bundleErrors
? Object.entries(bundleErrors).flatMap(([file, v]) => v.map(error => ({ file, error })))
: null;
if (!build.success) {
// Collect actual errors from build logs
const actualErrors = build.logs
.filter(x => x.level === "error")
.map(x => ({
file: x.position?.file || "",
error: x.message,
}));
// Check if errors were expected
if (expectedErrors && expectedErrors.length > 0) {
const errorsLeft = [...expectedErrors];
const unexpectedErrors: typeof actualErrors = [];
for (const error of actualErrors) {
const i = errorsLeft.findIndex(item => error.file.endsWith(item.file) && error.error.includes(item.error));
if (i === -1) {
unexpectedErrors.push(error);
} else {
errorsLeft.splice(i, 1);
}
}
if (unexpectedErrors.length > 0) {
throw new Error(
"Unexpected errors reported while bundling:\n" +
unexpectedErrors.map(e => `${e.file}: ${e.error}`).join("\n") +
"\n\nExpected errors:\n" +
expectedErrors.map(e => `${e.file}: ${e.error}`).join("\n"),
);
}
if (errorsLeft.length > 0) {
throw new Error(
"Expected errors were not found while bundling:\n" +
errorsLeft.map(e => `${e.file}: ${e.error}`).join("\n") +
"\n\nActual errors:\n" +
actualErrors.map(e => `${e.file}: ${e.error}`).join("\n"),
);
}
return testRef(id, opts);
}
throw new Error(`Bundle failed:\n${actualErrors.map(e => `${e.file}: ${e.error}`).join("\n")}`);
} else if (expectedErrors && expectedErrors.length > 0) {
throw new Error(
"Errors were expected while bundling:\n" + expectedErrors.map(e => `${e.file}: ${e.error}`).join("\n"),
);
}
// Build in-memory file cache from BuildArtifact outputs
const outputCache: Record<string, string> = {};
for (const output of build.outputs) {
// Normalize path: "./a.css" -> "/a.css"
let outputPath = output.path;
if (outputPath.startsWith("./")) outputPath = outputPath.slice(1);
if (!outputPath.startsWith("/")) outputPath = "/" + outputPath;
outputCache[outputPath] = await output.text();
}
// Determine the main output file path
const mainOutputPath = Object.keys(outputCache)[0] || "/out.js";
const outfileVirtual = outfile ? (outfile.startsWith("/") ? outfile : "/" + outfile) : mainOutputPath;
// Create API object that reads from in-memory cache
const readFile = (file: string): string => {
// Normalize the file path
let normalizedFile = file;
if (normalizedFile.startsWith("./")) normalizedFile = normalizedFile.slice(1);
if (!normalizedFile.startsWith("/")) normalizedFile = "/" + normalizedFile;
// Try exact match first
if (normalizedFile in outputCache) return outputCache[normalizedFile];
// For single-output builds, allow accessing the output by the configured outfile path
const outputs = Object.keys(outputCache);
if (outputs.length === 1 && normalizedFile === outfileVirtual) {
return outputCache[outputs[0]];
}
throw new Error(`Virtual file not found: ${file}. Available: ${Object.keys(outputCache).join(", ")}`);
};
const api = {
root: "/virtual",
outfile: outfileVirtual,
outdir: "/virtual/out",
join: (...paths: string[]) => "/" + paths.join("/").replace(/^\/+/, ""),
readFile,
writeFile: (_file: string, _contents: string) => {
throw new Error("writeFile not supported in virtual mode");
},
expectFile: (file: string) => expect(readFile(file)),
prependFile: (_file: string, _contents: string) => {
throw new Error("prependFile not supported in virtual mode");
},
appendFile: (_file: string, _contents: string) => {
throw new Error("appendFile not supported in virtual mode");
},
assertFileExists: (file: string) => {
readFile(file); // Will throw if not found
},
warnings: {} as Record<string, ErrorMeta[]>,
options: opts,
captureFile: (file: string, fnName = "capture") => extractCaptures(readFile(file), file, fnName),
} satisfies BundlerTestBundleAPI;
if (onAfterBundle) {
onAfterBundle(api);
}
return testRef(id, opts);
})();
}
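The expected-vs-actual error comparison in the removed virtual-mode block is a small multiset-matching routine: each actual error may consume at most one expected entry, and leftovers on either side are reported. A standalone sketch of that logic (the `matchErrors` name and `ErrEntry` shape are illustrative, not from the source):

```typescript
interface ErrEntry { file: string; error: string }

// Each actual error consumes at most one expected entry; what remains on
// either side is surfaced as "unexpected" or "missing".
function matchErrors(actual: ErrEntry[], expected: ErrEntry[]) {
  const errorsLeft = [...expected];
  const unexpected: ErrEntry[] = [];
  for (const err of actual) {
    // Suffix-match the file and substring-match the message, as above
    const i = errorsLeft.findIndex(e => err.file.endsWith(e.file) && err.error.includes(e.error));
    if (i === -1) unexpected.push(err);
    else errorsLeft.splice(i, 1); // consumed, so duplicates must match one-to-one
  }
  return { unexpected, missing: errorsLeft };
}
```

The `splice` on match is what makes duplicate expected errors work: two identical expected entries require two matching actual errors.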
return (async () => {
if (!backend) {
backend =
@@ -1555,7 +1321,42 @@ for (const [key, blob] of build.outputs) {
},
warnings: warningReference,
options: opts,
captureFile: (file, fnName = "capture") => extractCaptures(readFile(file), file, fnName),
captureFile: (file, fnName = "capture") => {
const fileContents = readFile(file);
let i = 0;
const length = fileContents.length;
const matches = [];
while (i < length) {
i = fileContents.indexOf(fnName, i);
if (i === -1) {
break;
}
const start = i;
let depth = 0;
while (i < length) {
const char = fileContents[i];
if (char === "(") {
depth++;
} else if (char === ")") {
depth--;
if (depth === 0) {
break;
}
}
i++;
}
if (depth !== 0) {
throw new Error(`Could not find closing paren for ${fnName} call in ${file}`);
}
matches.push(fileContents.slice(start + fnName.length + 1, i));
i++;
}
if (matches.length === 0) {
throw new Error(`No ${fnName} calls found in ${file}`);
}
return matches;
},
} satisfies BundlerTestBundleAPI;
// DCE keep scan
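The `captureFile` scanner inlined above is a balanced-paren argument extractor: it finds each `fnName(...)` occurrence and slices out the argument text, tracking paren depth so nested parentheses stay inside one capture. A standalone sketch of the same technique (assuming, as the original does, that `(` immediately follows the function name):

```typescript
// Find every `fnName(...)` call in `src` and return the argument text of
// each, with nested parentheses balanced.
function extractCaptures(src: string, fnName: string): string[] {
  const matches: string[] = [];
  let i = 0;
  while (i < src.length) {
    i = src.indexOf(fnName, i);
    if (i === -1) break;
    const start = i;
    let depth = 0;
    // Scan forward, counting parens, until the opening paren is closed
    while (i < src.length) {
      const ch = src[i];
      if (ch === "(") depth++;
      else if (ch === ")" && --depth === 0) break;
      i++;
    }
    if (depth !== 0) throw new Error(`Could not find closing paren for ${fnName}`);
    // Skip past `fnName(` so only the argument text is captured
    matches.push(src.slice(start + fnName.length + 1, i));
    i++;
  }
  return matches;
}
```

For example, `extractCaptures('capture(1 + (2 * 3)); capture(id)', "capture")` yields `["1 + (2 * 3)", "id"]` — the nested `(2 * 3)` does not terminate the first capture early.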

View File

@@ -504,6 +504,110 @@ describe("Hash", () => {
expect(hash.update.name).toBe("update");
expect(hash.digest.name).toBe("digest");
expect(hash.copy.name).toBe("copy");
expect(hash._transform.name).toBe("_transform");
expect(hash._flush.name).toBe("_flush");
});
});
describe("Hmac", () => {
it("should have correct method names", () => {
const hmac = crypto.createHmac("sha256", "key");
expect(hmac.update.name).toBe("update");
expect(hmac.digest.name).toBe("digest");
expect(hmac._transform.name).toBe("_transform");
expect(hmac._flush.name).toBe("_flush");
});
});
describe("Sign", () => {
it("should have correct method names", () => {
const sign = crypto.createSign("sha256");
expect(sign.update.name).toBe("update");
expect(sign.sign.name).toBe("sign");
expect(sign._write.name).toBe("_write");
});
});
describe("Verify", () => {
it("should have correct method names", () => {
const verify = crypto.createVerify("sha256");
expect(verify.update.name).toBe("update");
expect(verify.verify.name).toBe("verify");
expect(verify._write.name).toBe("_write");
});
});
describe("Cipheriv", () => {
it("should have correct method names", () => {
const cipher = crypto.createCipheriv("aes-256-cbc", Buffer.alloc(32), Buffer.alloc(16));
expect(cipher.update.name).toBe("update");
expect(cipher.final.name).toBe("final");
expect(cipher.setAutoPadding.name).toBe("setAutoPadding");
expect(cipher.getAuthTag.name).toBe("getAuthTag");
expect(cipher.setAAD.name).toBe("setAAD");
expect(cipher._transform.name).toBe("_transform");
expect(cipher._flush.name).toBe("_flush");
});
});
describe("Decipheriv", () => {
it("should have correct method names", () => {
const decipher = crypto.createDecipheriv("aes-256-cbc", Buffer.alloc(32), Buffer.alloc(16));
expect(decipher.update.name).toBe("update");
expect(decipher.final.name).toBe("final");
expect(decipher.setAutoPadding.name).toBe("setAutoPadding");
expect(decipher.setAuthTag.name).toBe("setAuthTag");
expect(decipher.setAAD.name).toBe("setAAD");
expect(decipher._transform.name).toBe("_transform");
expect(decipher._flush.name).toBe("_flush");
});
});
describe("DiffieHellman", () => {
it("should have correct method names", () => {
const dh = crypto.createDiffieHellman(512);
expect(dh.generateKeys.name).toBe("generateKeys");
expect(dh.computeSecret.name).toBe("computeSecret");
expect(dh.getPrime.name).toBe("getPrime");
expect(dh.getGenerator.name).toBe("getGenerator");
expect(dh.getPublicKey.name).toBe("getPublicKey");
expect(dh.getPrivateKey.name).toBe("getPrivateKey");
expect(dh.setPublicKey.name).toBe("setPublicKey");
expect(dh.setPrivateKey.name).toBe("setPrivateKey");
});
});
describe("ECDH", () => {
it("should have correct method names", () => {
const ecdh = crypto.createECDH("prime256v1");
expect(ecdh.generateKeys.name).toBe("generateKeys");
expect(ecdh.computeSecret.name).toBe("computeSecret");
expect(ecdh.getPublicKey.name).toBe("getPublicKey");
expect(ecdh.getPrivateKey.name).toBe("getPrivateKey");
expect(ecdh.setPublicKey.name).toBe("setPublicKey");
expect(ecdh.setPrivateKey.name).toBe("setPrivateKey");
});
});
describe("crypto module", () => {
it("should have correct factory function names", () => {
expect(crypto.createHash.name).toBe("createHash");
expect(crypto.createHmac.name).toBe("createHmac");
expect(crypto.createSign.name).toBe("createSign");
expect(crypto.createVerify.name).toBe("createVerify");
expect(crypto.createCipheriv.name).toBe("createCipheriv");
expect(crypto.createDecipheriv.name).toBe("createDecipheriv");
expect(crypto.createDiffieHellman.name).toBe("createDiffieHellman");
expect(crypto.createECDH.name).toBe("createECDH");
expect(crypto.hash.name).toBe("hash");
expect(crypto.pbkdf2.name).toBe("pbkdf2");
});
it("should have correct constructor names", () => {
expect(crypto.Hash.name).toBe("Hash");
expect(crypto.Hmac.name).toBe("Hmac");
expect(crypto.Sign.name).toBe("Sign");
expect(crypto.Verify.name).toBe("Verify");
});
});
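The fix these tests cover relies on how JavaScript infers `Function.prototype.name`: an object-literal method gets its name from the property key, while a named function expression keeps whatever name it was declared with — so if a bundler renames the declaration to avoid a collision (e.g. `update` → `update2`), the `.name` property is mangled too. A minimal sketch of both behaviors (variable names here are illustrative, not the actual `node:crypto` source):

```typescript
// Simulates the post-rename state the bundler produced: the function
// expression's declared name was rewritten to avoid a binding collision.
const update2 = function update2(data: string) { return data; };

// Assigning a named function expression: .name stays the (renamed) name
const viaAssignment = { updateA: update2 };

// Object-literal method shorthand: .name is inferred from the property key,
// which the number renamer does not touch
const viaLiteral = Object.assign({}, {
  update(data: string) { return data; },
});
```

Here `viaAssignment.updateA.name` is `"update2"` while `viaLiteral.update.name` is `"update"`, which is why the fix installs prototype methods with `Object.assign` and object literals.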

View File

@@ -0,0 +1,87 @@
import { describe, expect, test } from "bun:test";
import { once } from "node:events";
import { createServer } from "node:net";
describe("fetch with many headers", () => {
test("should not crash or corrupt memory with more than 256 headers", async () => {
// Use a raw TCP server to avoid uws header count limits on the server side.
// We just need to verify that the client sends the request without crashing.
await using server = createServer(socket => {
let data = "";
socket.on("data", (chunk: Buffer) => {
data += chunk.toString();
// Wait for the end of HTTP headers (double CRLF)
if (data.includes("\r\n\r\n")) {
// Count headers (lines between the request line and the blank line)
const headerSection = data.split("\r\n\r\n")[0];
const lines = headerSection.split("\r\n");
// First line is the request line (GET / HTTP/1.1), rest are headers
const headerCount = lines.length - 1;
const body = String(headerCount);
const response = ["HTTP/1.1 200 OK", `Content-Length: ${body.length}`, "Connection: close", "", body].join(
"\r\n",
);
socket.write(response);
socket.end();
}
});
}).listen(0);
await once(server, "listening");
const port = (server.address() as any).port;
// Build 300 unique custom headers (exceeds the 256-entry static buffer)
const headers = new Headers();
const headerCount = 300;
for (let i = 0; i < headerCount; i++) {
headers.set(`x-custom-${i}`, `value-${i}`);
}
const res = await fetch(`http://localhost:${port}/`, { headers });
const receivedCount = parseInt(await res.text(), 10);
expect(res.status).toBe(200);
// The server should receive our custom headers plus default ones
// (host, connection, user-agent, accept, accept-encoding = 5 extra)
expect(receivedCount).toBeGreaterThanOrEqual(headerCount);
});
test("should handle exactly 256 user headers without issues", async () => {
await using server = createServer(socket => {
let data = "";
socket.on("data", (chunk: Buffer) => {
data += chunk.toString();
if (data.includes("\r\n\r\n")) {
const headerSection = data.split("\r\n\r\n")[0];
const lines = headerSection.split("\r\n");
const headerCount = lines.length - 1;
const body = String(headerCount);
const response = ["HTTP/1.1 200 OK", `Content-Length: ${body.length}`, "Connection: close", "", body].join(
"\r\n",
);
socket.write(response);
socket.end();
}
});
}).listen(0);
await once(server, "listening");
const port = (server.address() as any).port;
const headers = new Headers();
const headerCount = 256;
for (let i = 0; i < headerCount; i++) {
headers.set(`x-custom-${i}`, `value-${i}`);
}
const res = await fetch(`http://localhost:${port}/`, { headers });
const receivedCount = parseInt(await res.text(), 10);
expect(res.status).toBe(200);
expect(receivedCount).toBeGreaterThanOrEqual(headerCount);
});
});
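Both raw-socket servers above count request headers the same way: take the text before the first blank line (the `\r\n\r\n` terminator), split on CRLF, and subtract one for the request line. Extracted as a small helper for clarity (the `countHeaders` name is illustrative; like the test, this ignores obsolete folded headers):

```typescript
// Count the header lines in a raw HTTP/1.1 request.
function countHeaders(rawRequest: string): number {
  // Everything before the first blank line is the request line + headers
  const headerSection = rawRequest.split("\r\n\r\n")[0];
  // First CRLF-delimited line is "GET / HTTP/1.1"; the rest are headers
  return headerSection.split("\r\n").length - 1;
}
```

With 300 `x-custom-*` headers sent, this returns 300 plus whatever default headers the client adds (host, connection, user-agent, accept, accept-encoding), which is why the tests assert `toBeGreaterThanOrEqual(headerCount)` rather than an exact count.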