Compare commits


3 Commits

Author SHA1 Message Date
Claude Bot
ae88edef82 fix(profiler): resolve sourcemapped paths in CPU profile output for compiled binaries
The CPU profiler was using `vm.computeLineColumnWithSourcemap()` which
only remaps line/column but not the source URL. For `bun build --compile
--sourcemap` binaries, this meant profiles showed internal `/$bunfs/root/`
paths instead of original source file paths.

Replace the callback with direct calls to `Bun__remapStackFramePositions`
which remaps both the URL and line/column through sourcemaps, matching
the behavior of error stack traces.
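The shape of the new remap can be sketched as follows. This is a hedged C miniature, not Bun's actual C++; the struct fields and the sourcemap lookup are faked for illustration. The two points that matter are that the remap rewrites the URL alongside line/column, and that it converts between the profiler's 1-based and the sourcemap's 0-based positions on the way in and out.

```c
#include <string.h>

/* Hypothetical miniature of remapSourceLocation(); names are illustrative. */
typedef struct {
    char url[256];
    int line_zero_based;
    int column_zero_based;
    int remapped;
} stack_frame;

/* Stand-in for Bun__remapStackFramePositions: pretend the sourcemap maps
   the bundled chunk back to one original TypeScript file/line. */
static void remap_positions(stack_frame *f) {
    if (strstr(f->url, "/$bunfs/root/") != NULL) {
        strcpy(f->url, "src/myfile.ts");
        f->line_zero_based = 44; /* 0-based, i.e. line 45 */
        f->remapped = 1;
    }
}

void remap_source_location(char *url, int *line, int *column) {
    if (*line < 0)
        return;
    stack_frame f = {0};
    strcpy(f.url, url);
    f.line_zero_based = *line > 0 ? *line - 1 : 0;       /* 1-based -> 0-based */
    f.column_zero_based = *column > 0 ? *column - 1 : 0;
    remap_positions(&f);
    if (f.remapped) {
        strcpy(url, f.url);                              /* URL is rewritten too */
        *line = f.line_zero_based + 1;                   /* back to 1-based */
        *column = f.column_zero_based + 1;
    }
}
```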

Co-Authored-By: Claude <noreply@anthropic.com>
2026-02-23 10:23:57 +00:00
Dylan Conway
cb3c39be23 ci: add Intel SDE baseline verification for Windows, unify baseline checks (#27121)
Adds a unified baseline verification script
(`scripts/verify-baseline.ts`) that combines basic CPU instruction
checks and JIT stress test fixtures into a single step.

**Changes:**
- New TypeScript script replaces separate `verify-baseline-cpu.sh` and
`verify-jit-stress-qemu.sh` CI steps
- Adds Windows x64 baseline verification using Intel SDE v9.58 with
Nehalem emulation
- Linux continues to use QEMU (Nehalem for x64, Cortex-A53 for aarch64)
- SDE violations are detected by checking output for `SDE-ERROR`
messages rather than exit codes, avoiding false positives from
application errors
- JIT stress fixtures now run on every build instead of only when WebKit
changes
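The violation-detection rule above boils down to the following C sketch (the real check lives in the new TypeScript script and uses a stricter regex than a plain substring match): under SDE on Windows the exit code can reflect an ordinary application error, so output is scanned for SDE's chip-check message instead, while under QEMU on Linux an illegal instruction surfaces as SIGILL, i.e. exit code 128 + 4 = 132.

```c
#include <stdbool.h>
#include <string.h>

/* Mirror of the detection rule described above (simplified). */
bool is_instruction_violation(bool is_windows, int exit_code, const char *output) {
    if (is_windows)
        return strstr(output, "SDE-ERROR") != NULL; /* scan output, not exit code */
    return exit_code == 132;                        /* 128 + SIGILL(4) under QEMU */
}
```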

**Platform support:**
| Platform | Emulator | CPU Model |
|----------|----------|-----------|
| Linux x64 baseline | QEMU | Nehalem (SSE4.2, no AVX) |
| Linux aarch64 | QEMU | Cortex-A53 (no LSE/SVE) |
| Windows x64 baseline | Intel SDE | Nehalem (no AVX) |

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-02-21 14:24:21 -08:00
robobun
bc98025d93 fix(spawn): close libuv pipes before freeing to prevent handle queue corruption (#27064)
## Summary

Fixes #27063

On Windows, when `Bun.spawn` fails (e.g., ENOENT for a nonexistent
executable), pipes that were already initialized with `uv_pipe_init`
were being freed directly with `allocator.destroy()` without first
calling `uv_close()`. This left dangling pointers in libuv's
`handle_queue` linked list, corrupting it. Subsequent spawn calls would
crash with a segfault when inserting new handles into the corrupted
list.

Three sites were freeing pipe handles without `uv_close`:

- **`process.zig` `Stdio.deinit()`**: When spawn failed,
already-initialized pipes were freed without `uv_close()`. Now uses
`closePipeAndDestroy()` which checks `pipe.loop` to determine if the
pipe was registered with the event loop.
- **`process.zig` `spawnProcessWindows` IPC handling**: Unsupported IPC
pipes in stdin/stdout/stderr were freed directly. Now uses the same safe
close-then-destroy pattern.
- **`source.zig` `openPipe()`**: If `pipe.open(fd)` failed after
`pipe.init()` succeeded, the pipe was destroyed directly. Now calls
`uv_close()` with a callback that frees the memory.

Additionally, pipe allocations in `stdio.zig` are now zero-initialized
so that the `loop` field is reliably `null` before `uv_pipe_init`,
enabling the init detection in `deinit`.
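The resulting close-then-destroy pattern can be sketched like this. The struct is a hypothetical stand-in for `uv_pipe_t` with only the fields the pattern needs; real code calls `uv_close()` and the callback runs later from the event loop rather than synchronously as here.

```c
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical stand-in for uv_pipe_t. */
typedef struct {
    void *loop;    /* NULL until uv_pipe_init registers the handle */
    bool  closing; /* true once a close has been requested         */
} fake_pipe;

static int freed_count = 0;

/* In real libuv this callback is deferred and runs from the event loop. */
static void on_close_free(fake_pipe *p) {
    free(p);
    freed_count++;
}

/* The safe pattern from this fix: free directly only if the pipe never
   reached uv_pipe_init; otherwise close first and free in the callback,
   so the handle is unlinked from the loop's handle_queue before the
   memory goes away. */
static void close_pipe_and_destroy(fake_pipe *p) {
    if (p->loop == NULL) {
        free(p); /* never registered with the loop: safe to free directly */
        freed_count++;
    } else if (!p->closing) {
        p->closing = true;
        on_close_free(p); /* real code: uv_close(handle, on_close_free) */
    }
    /* else: already closing, the pending close callback owns the free */
}
```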

## Test plan

- [x] Added regression test `test/regression/issue/27063.test.ts` that
spawns nonexistent executables repeatedly and verifies a valid spawn
still works afterward
- [x] Verified existing spawn tests pass (`exit-code.test.ts`,
`spawnSync.test.ts` — timing-related pre-existing flakes only)
- [x] Debug build compiles successfully
- [ ] Windows CI should verify the fix prevents the segfault


🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-02-21 14:04:25 -08:00
22 changed files with 758 additions and 317 deletions

View File

@@ -593,8 +593,35 @@ function getTargetTriplet(platform) {
*/
function needsBaselineVerification(platform) {
const { os, arch, baseline } = platform;
if (os !== "linux") return false;
return (arch === "x64" && baseline) || arch === "aarch64";
if (os === "linux") return (arch === "x64" && baseline) || arch === "aarch64";
if (os === "windows") return arch === "x64" && baseline;
return false;
}
/**
* Returns the emulator binary name for the given platform.
* Linux uses QEMU user-mode; Windows uses Intel SDE.
* @param {Platform} platform
* @returns {string}
*/
function getEmulatorBinary(platform) {
const { os, arch } = platform;
if (os === "windows") return "sde-external/sde.exe";
if (arch === "aarch64") return "qemu-aarch64-static";
return "qemu-x86_64-static";
}
const SDE_VERSION = "9.58.0-2025-06-16";
const SDE_URL = `https://downloadmirror.intel.com/859732/sde-external-${SDE_VERSION}-win.tar.xz`;
/**
* @param {Platform} platform
* @param {PipelineOptions} options
* @returns {Step}
*/
function hasWebKitChanges(options) {
const { changedFiles = [] } = options;
return changedFiles.some(file => file.includes("SetupWebKit.cmake"));
}
/**
@@ -603,9 +630,31 @@ function needsBaselineVerification(platform) {
* @returns {Step}
*/
function getVerifyBaselineStep(platform, options) {
const { arch } = platform;
const { os } = platform;
const targetKey = getTargetKey(platform);
const archArg = arch === "x64" ? "x64" : "aarch64";
const triplet = getTargetTriplet(platform);
const emulator = getEmulatorBinary(platform);
const jitStressFlag = hasWebKitChanges(options) ? " --jit-stress" : "";
const setupCommands =
os === "windows"
? [
`echo Downloading build artifacts...`,
`buildkite-agent artifact download ${triplet}.zip . --step ${targetKey}-build-bun`,
`echo Extracting ${triplet}.zip...`,
`tar -xf ${triplet}.zip`,
`echo Downloading Intel SDE...`,
`curl.exe -fsSL -o sde.tar.xz "${SDE_URL}"`,
`echo Extracting Intel SDE...`,
`7z x -y sde.tar.xz`,
`7z x -y sde.tar`,
`ren sde-external-${SDE_VERSION}-win sde-external`,
]
: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${triplet}.zip'`,
`chmod +x ${triplet}/bun`,
];
return {
key: `${targetKey}-verify-baseline`,
@@ -614,57 +663,10 @@ function getVerifyBaselineStep(platform, options) {
agents: getLinkBunAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
timeout_in_minutes: 5,
timeout_in_minutes: hasWebKitChanges(options) ? 30 : 10,
command: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${getTargetTriplet(platform)}.zip'`,
`unzip -o '${getTargetTriplet(platform)}-profile.zip'`,
`chmod +x ${getTargetTriplet(platform)}/bun ${getTargetTriplet(platform)}-profile/bun-profile`,
`./scripts/verify-baseline-cpu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}/bun`,
`./scripts/verify-baseline-cpu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}-profile/bun-profile`,
],
};
}
/**
* Returns true if the PR modifies SetupWebKit.cmake (WebKit version changes).
* JIT stress tests under QEMU should run when WebKit is updated to catch
* JIT-generated code that uses unsupported CPU instructions.
* @param {PipelineOptions} options
* @returns {boolean}
*/
function hasWebKitChanges(options) {
const { changedFiles = [] } = options;
return changedFiles.some(file => file.includes("SetupWebKit.cmake"));
}
/**
* Returns a step that runs JSC JIT stress tests under QEMU.
* This verifies that JIT-compiled code doesn't use CPU instructions
* beyond the baseline target (no AVX on x64, no LSE on aarch64).
* @param {Platform} platform
* @param {PipelineOptions} options
* @returns {Step}
*/
function getJitStressTestStep(platform, options) {
const { arch } = platform;
const targetKey = getTargetKey(platform);
const archArg = arch === "x64" ? "x64" : "aarch64";
return {
key: `${targetKey}-jit-stress-qemu`,
label: `${getTargetLabel(platform)} - jit-stress-qemu`,
depends_on: [`${targetKey}-build-bun`],
agents: getLinkBunAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
// JIT stress tests are slow under QEMU emulation
timeout_in_minutes: 30,
command: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${getTargetTriplet(platform)}.zip'`,
`chmod +x ${getTargetTriplet(platform)}/bun`,
`./scripts/verify-jit-stress-qemu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}/bun`,
...setupCommands,
`bun scripts/verify-baseline.ts --binary ${triplet}/${os === "windows" ? "bun.exe" : "bun"} --emulator ${emulator}${jitStressFlag}`,
],
};
}
@@ -1264,10 +1266,6 @@ async function getPipeline(options = {}) {
if (needsBaselineVerification(target)) {
steps.push(getVerifyBaselineStep(target, options));
// Run JIT stress tests under QEMU when WebKit is updated
if (hasWebKitChanges(options)) {
steps.push(getJitStressTestStep(target, options));
}
}
return getStepWithDependsOn(

View File

@@ -13,6 +13,11 @@ else()
set(LSHPACK_INCLUDES .)
endif()
# Suppress all warnings from vendored lshpack on Windows (clang-cl)
if(WIN32)
set(LSHPACK_CMAKE_ARGS "-DCMAKE_C_FLAGS=-w")
endif()
register_cmake_command(
TARGET
lshpack
@@ -28,6 +33,7 @@ register_cmake_command(
# _lshpack_enc_get_static_name in libls-hpack.a(lshpack.c.o)
# _update_hash in libls-hpack.a(lshpack.c.o)
-DCMAKE_BUILD_TYPE=Release
${LSHPACK_CMAKE_ARGS}
INCLUDES
${LSHPACK_INCLUDES}
)

View File

@@ -79,12 +79,22 @@ endif()
if(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64|arm64|ARM64|AARCH64" AND NOT APPLE)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_NO_OPT_ARCH=ON)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_SIMD=ON)
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-moutline-atomics")
if(NOT WIN32)
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-moutline-atomics")
endif()
elseif(NOT ENABLE_BASELINE)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_ARCH=ON)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_SIMD=ON)
endif()
# Suppress all warnings from mimalloc on Windows — it's vendored C code compiled
# as C++ (MI_USE_CXX=ON) which triggers many clang-cl warnings (-Wold-style-cast,
# -Wzero-as-null-pointer-constant, -Wc++98-compat-pedantic, etc.)
if(WIN32)
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-w")
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_CXX_FLAGS=-w")
endif()
if(WIN32)
if(DEBUG)
set(MIMALLOC_LIBRARY mimalloc-static-debug)

View File

@@ -7,9 +7,16 @@ register_repository(
12882eee073cfe5c7621bcfadf679e1372d4537b
)
# Suppress all warnings from vendored tinycc on Windows (clang-cl)
if(WIN32)
set(TINYCC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-w")
endif()
register_cmake_command(
TARGET
tinycc
ARGS
${TINYCC_CMAKE_ARGS}
LIBRARIES
tcc
)

scripts/verify-baseline.ts (new file, 233 lines)
View File

@@ -0,0 +1,233 @@
// Verify that a Bun binary doesn't use CPU instructions beyond its baseline target.
//
// Detects the platform and chooses the appropriate emulator:
// Linux x64: QEMU with Nehalem CPU (no AVX)
// Linux arm64: QEMU with Cortex-A53 (no LSE/SVE)
// Windows x64: Intel SDE with -nhm (no AVX)
//
// Usage:
// bun scripts/verify-baseline.ts --binary ./bun --emulator /usr/bin/qemu-x86_64
// bun scripts/verify-baseline.ts --binary ./bun.exe --emulator ./sde.exe
import { readdirSync } from "node:fs";
import { basename, dirname, join, resolve } from "node:path";
const { parseArgs } = require("node:util");
const { values } = parseArgs({
args: process.argv.slice(2),
options: {
binary: { type: "string" },
emulator: { type: "string" },
"jit-stress": { type: "boolean", default: false },
},
strict: true,
});
const binary = resolve(values.binary!);
function resolveEmulator(name: string): string {
const found = Bun.which(name);
if (found) return found;
// Try without -static suffix (e.g. qemu-aarch64 instead of qemu-aarch64-static)
if (name.endsWith("-static")) {
const fallback = Bun.which(name.slice(0, -"-static".length));
if (fallback) return fallback;
}
// Last resort: resolve as a relative path (e.g. sde-external/sde.exe)
return resolve(name);
}
const emulatorPath = resolveEmulator(values.emulator!);
const scriptDir = dirname(import.meta.path);
const repoRoot = resolve(scriptDir, "..");
const fixturesDir = join(repoRoot, "test", "js", "bun", "jsc-stress", "fixtures");
const wasmFixturesDir = join(fixturesDir, "wasm");
const preloadPath = join(repoRoot, "test", "js", "bun", "jsc-stress", "preload.js");
// Platform detection
const isWindows = process.platform === "win32";
const isAarch64 = process.arch === "arm64";
// SDE outputs this when a chip-check violation occurs
const SDE_VIOLATION_PATTERN = /SDE-ERROR:.*not valid for specified chip/i;
// Configure emulator based on platform
const config = isWindows
? {
runnerCmd: [emulatorPath, "-nhm", "--"],
cpuDesc: "Nehalem (SSE4.2, no AVX/AVX2/AVX512)",
// SDE must run from its own directory for Pin DLL resolution
cwd: dirname(emulatorPath),
}
: isAarch64
? {
runnerCmd: [emulatorPath, "-cpu", "cortex-a53"],
cpuDesc: "Cortex-A53 (ARMv8.0-A+CRC, no LSE/SVE)",
cwd: undefined,
}
: {
runnerCmd: [emulatorPath, "-cpu", "Nehalem"],
cpuDesc: "Nehalem (SSE4.2, no AVX/AVX2/AVX512)",
cwd: undefined,
};
function isInstructionViolation(exitCode: number, output: string): boolean {
if (isWindows) return SDE_VIOLATION_PATTERN.test(output);
return exitCode === 132; // SIGILL = 128 + signal 4
}
console.log(`--- Verifying ${basename(binary)} on ${config.cpuDesc}`);
console.log(` Binary: ${binary}`);
console.log(` Emulator: ${config.runnerCmd.join(" ")}`);
console.log();
let instructionFailures = 0;
let otherFailures = 0;
let passed = 0;
const failedTests: string[] = [];
interface RunTestOptions {
cwd?: string;
/** Tee output live to the console while still capturing it for analysis */
live?: boolean;
}
/** Read a stream, write each chunk to a writable, and return the full text. */
async function teeStream(stream: ReadableStream<Uint8Array>, output: NodeJS.WriteStream): Promise<string> {
const chunks: Uint8Array[] = [];
for await (const chunk of stream) {
chunks.push(chunk);
output.write(chunk);
}
return Buffer.concat(chunks).toString();
}
async function runTest(label: string, binaryArgs: string[], options?: RunTestOptions): Promise<boolean> {
console.log(`+++ ${label}`);
const start = performance.now();
const live = options?.live ?? false;
const proc = Bun.spawn([...config.runnerCmd, binary, ...binaryArgs], {
// config.cwd takes priority — SDE on Windows must run from its own directory for Pin DLL resolution
cwd: config.cwd ?? options?.cwd,
stdout: "pipe",
stderr: "pipe",
});
let stdout: string;
let stderr: string;
if (live) {
[stdout, stderr] = await Promise.all([
teeStream(proc.stdout as ReadableStream<Uint8Array>, process.stdout),
teeStream(proc.stderr as ReadableStream<Uint8Array>, process.stderr),
proc.exited,
]);
} else {
[stdout, stderr] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
}
const exitCode = proc.exitCode!;
const elapsed = ((performance.now() - start) / 1000).toFixed(1);
const output = stdout + "\n" + stderr;
if (exitCode === 0) {
if (!live && stdout.trim()) console.log(stdout.trim());
console.log(` PASS (${elapsed}s)`);
passed++;
return true;
}
if (isInstructionViolation(exitCode, output)) {
if (!live && output.trim()) console.log(output.trim());
console.log();
console.log(` FAIL: CPU instruction violation detected (${elapsed}s)`);
if (isAarch64) {
console.log(" The aarch64 build targets Cortex-A53 (ARMv8.0-A+CRC).");
console.log(" LSE atomics, SVE, and dotprod instructions are not allowed.");
} else {
console.log(" The baseline x64 build targets Nehalem (SSE4.2).");
console.log(" AVX, AVX2, and AVX512 instructions are not allowed.");
}
instructionFailures++;
failedTests.push(label);
} else {
if (!live && output.trim()) console.log(output.trim());
console.log(` WARN: exit code ${exitCode} (${elapsed}s, not a CPU instruction issue)`);
otherFailures++;
}
return false;
}
// Phase 1: SIMD code path verification (always runs)
const simdTestPath = join(repoRoot, "test", "js", "bun", "jsc-stress", "fixtures", "simd-baseline.test.ts");
await runTest("SIMD baseline tests", ["test", simdTestPath], { live: true });
// Phase 2: JIT stress fixtures (only with --jit-stress, e.g. on WebKit changes)
if (values["jit-stress"]) {
const jsFixtures = readdirSync(fixturesDir)
.filter(f => f.endsWith(".js"))
.sort();
console.log();
console.log(`--- JS fixtures (DFG/FTL) — ${jsFixtures.length} tests`);
for (let i = 0; i < jsFixtures.length; i++) {
const fixture = jsFixtures[i];
await runTest(`[${i + 1}/${jsFixtures.length}] ${fixture}`, ["--preload", preloadPath, join(fixturesDir, fixture)]);
}
const wasmFixtures = readdirSync(wasmFixturesDir)
.filter(f => f.endsWith(".js"))
.sort();
console.log();
console.log(`--- Wasm fixtures (BBQ/OMG) — ${wasmFixtures.length} tests`);
for (let i = 0; i < wasmFixtures.length; i++) {
const fixture = wasmFixtures[i];
await runTest(
`[${i + 1}/${wasmFixtures.length}] ${fixture}`,
["--preload", preloadPath, join(wasmFixturesDir, fixture)],
{ cwd: wasmFixturesDir },
);
}
} else {
console.log();
console.log("--- Skipping JIT stress fixtures (pass --jit-stress to enable)");
}
// Summary
console.log();
console.log("--- Summary");
console.log(` Passed: ${passed}`);
console.log(` Instruction failures: ${instructionFailures}`);
console.log(` Other failures: ${otherFailures} (warnings, not CPU instruction issues)`);
console.log();
if (instructionFailures > 0) {
console.error(" FAILED: Code uses unsupported CPU instructions.");
// Report to Buildkite annotations tab
const platform = isWindows ? "Windows x64" : isAarch64 ? "Linux aarch64" : "Linux x64";
const annotation = [
`<details>`,
`<summary>CPU instruction violation on ${platform} — ${instructionFailures} failed</summary>`,
`<p>The baseline build uses instructions not available on <code>${config.cpuDesc}</code>.</p>`,
`<ul>${failedTests.map(t => `<li><code>${t}</code></li>`).join("")}</ul>`,
`</details>`,
].join("\n");
Bun.spawnSync(["buildkite-agent", "annotate", "--append", "--style", "error", "--context", "verify-baseline"], {
stdin: new Blob([annotation]),
});
process.exit(1);
}
if (otherFailures > 0) {
console.log(" Some tests failed for reasons unrelated to CPU instructions.");
}
console.log(` All baseline verification passed on ${config.cpuDesc}.`);

View File

@@ -209,7 +209,7 @@ pub fn Parse(
.body_loc = body_loc,
.properties = properties.items,
.has_decorators = has_any_decorators,
.should_lower_standard_decorators = has_auto_accessor or (p.options.features.standard_decorators and has_any_decorators),
.should_lower_standard_decorators = p.options.features.standard_decorators and (has_any_decorators or has_auto_accessor),
};
}

View File

@@ -300,9 +300,8 @@ pub fn ParseProperty(
}
},
.p_accessor => {
// "accessor" keyword for auto-accessor fields (TC39 proposal)
// Always recognized in classes regardless of decorator mode
if (opts.is_class and
// "accessor" keyword for auto-accessor fields (TC39 standard decorators)
if (opts.is_class and p.options.features.standard_decorators and
(js_lexer.PropertyModifierKeyword.List.get(raw) orelse .p_static) == .p_accessor)
{
kind = .auto_accessor;

View File

@@ -1087,8 +1087,10 @@ pub const WindowsSpawnOptions = struct {
dup2: struct { out: bun.jsc.Subprocess.StdioKind, to: bun.jsc.Subprocess.StdioKind },
pub fn deinit(this: *const Stdio) void {
if (this.* == .buffer) {
this.buffer.closeAndDestroy();
switch (this.*) {
.buffer => |pipe| pipe.closeAndDestroy(),
.ipc => |pipe| pipe.closeAndDestroy(),
else => {},
}
}
};
@@ -1629,9 +1631,10 @@ pub fn spawnProcessWindows(
stdio.flags = uv.UV_INHERIT_FD;
stdio.data.fd = fd_i;
},
.ipc => |my_pipe| {
// ipc option inside stdin, stderr or stdout are not supported
bun.default_allocator.destroy(my_pipe);
.ipc => {
// ipc option inside stdin, stderr or stdout is not supported.
// Don't free the pipe here — the caller owns it and will
// clean it up via WindowsSpawnOptions.deinit().
stdio.flags = uv.UV_IGNORE;
},
.ignore => {
@@ -1829,7 +1832,7 @@ pub const sync = struct {
.ignore => .ignore,
.buffer => .{
.buffer = if (Environment.isWindows)
bun.handleOom(bun.default_allocator.create(bun.windows.libuv.Pipe)),
bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)),
},
};
}

View File

@@ -187,7 +187,7 @@ pub fn create(globalThis: *jsc.JSGlobalObject, socket: SocketType) *WindowsNamed
});
// named_pipe owns the pipe (PipeWriter owns the pipe and will close and deinit it)
this.named_pipe = uws.WindowsNamedPipe.from(bun.handleOom(bun.default_allocator.create(uv.Pipe)), .{
this.named_pipe = uws.WindowsNamedPipe.from(bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe)), .{
.ctx = this,
.ref_ctx = @ptrCast(&WindowsNamedPipeContext.ref),
.deref_ctx = @ptrCast(&WindowsNamedPipeContext.deref),
@@ -288,6 +288,8 @@ pub fn deinit(this: *WindowsNamedPipeContext) void {
bun.destroy(this);
}
const std = @import("std");
const bun = @import("bun");
const Output = bun.Output;
const jsc = bun.jsc;

View File

@@ -235,10 +235,10 @@ pub const Stdio = union(enum) {
return .{ .err = .blob_used_as_out };
}
break :brk .{ .buffer = bun.handleOom(bun.default_allocator.create(uv.Pipe)) };
break :brk .{ .buffer = createZeroedPipe() };
},
.ipc => .{ .ipc = bun.handleOom(bun.default_allocator.create(uv.Pipe)) },
.capture, .pipe, .array_buffer, .readable_stream => .{ .buffer = bun.handleOom(bun.default_allocator.create(uv.Pipe)) },
.ipc => .{ .ipc = createZeroedPipe() },
.capture, .pipe, .array_buffer, .readable_stream => .{ .buffer = createZeroedPipe() },
.fd => |fd| .{ .pipe = fd },
.dup2 => .{ .dup2 = .{ .out = stdio.dup2.out, .to = stdio.dup2.to } },
.path => |pathlike| .{ .path = pathlike.slice() },
@@ -487,12 +487,18 @@ pub const Stdio = union(enum) {
}
};
/// Allocate a zero-initialized uv.Pipe. Zero-init ensures `pipe.loop` is null
/// for pipes that never reach `uv_pipe_init`, so `closeAndDestroy` can tell
/// whether `uv_close` is needed.
fn createZeroedPipe() *uv.Pipe {
return bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe));
}
const std = @import("std");
const bun = @import("bun");
const Environment = bun.Environment;
const Output = bun.Output;
const default_allocator = bun.default_allocator;
const uv = bun.windows.libuv;
const jsc = bun.jsc;

View File

@@ -3,6 +3,7 @@
#include "ZigGlobalObject.h"
#include "helpers.h"
#include "BunString.h"
#include "headers-handwritten.h"
#include <JavaScriptCore/SamplingProfiler.h>
#include <JavaScriptCore/VM.h>
#include <JavaScriptCore/JSGlobalObject.h>
@@ -72,6 +73,35 @@ struct ProfileNode {
WTF::Vector<int> children;
};
// Remap a source URL and line/column through sourcemaps.
// This handles both `bun build --sourcemap` and `bun build --compile --sourcemap`,
// resolving bundled paths like `/$bunfs/root/chunk-xyz.js:123` back to the original
// source file and line number (e.g., `src/myfile.ts:45`).
// lineNumber and columnNumber use 1-based convention on both input and output.
static void remapSourceLocation(JSC::VM& vm, WTF::String& url, int& lineNumber, int& columnNumber)
{
if (url.isEmpty() || lineNumber < 0)
return;
ZigStackFrame frame = {};
frame.source_url = Bun::toStringRef(url);
// Convert from 1-based to 0-based for ZigStackFrame
frame.position.line_zero_based = lineNumber > 0 ? lineNumber - 1 : 0;
frame.position.column_zero_based = columnNumber > 0 ? columnNumber - 1 : 0;
frame.remapped = false;
Bun__remapStackFramePositions(Bun::vm(vm), &frame, 1);
if (frame.remapped) {
WTF::String remappedUrl = frame.source_url.toWTFString();
if (!remappedUrl.isEmpty())
url = remappedUrl;
// Convert back to 1-based
lineNumber = frame.position.line().oneBasedInt();
columnNumber = frame.position.column().oneBasedInt();
}
}
WTF::String stopCPUProfilerAndGetJSON(JSC::VM& vm)
{
s_isProfilerRunning = false;
@@ -172,48 +202,30 @@ WTF::String stopCPUProfilerAndGetJSON(JSC::VM& vm)
if (provider) {
url = provider->sourceURL();
scriptId = static_cast<int>(provider->asID());
// Convert absolute paths to file:// URLs
// Check for:
// - Unix absolute path: /path/to/file
// - Windows drive letter: C:\path or C:/path
// - Windows UNC path: \\server\share
bool isAbsolutePath = false;
if (!url.isEmpty()) {
if (url[0] == '/') {
// Unix absolute path
isAbsolutePath = true;
} else if (url.length() >= 2 && url[1] == ':') {
// Windows drive letter (e.g., C:\)
char firstChar = url[0];
if ((firstChar >= 'A' && firstChar <= 'Z') || (firstChar >= 'a' && firstChar <= 'z')) {
isAbsolutePath = true;
}
} else if (url.length() >= 2 && url[0] == '\\' && url[1] == '\\') {
// Windows UNC path (e.g., \\server\share)
isAbsolutePath = true;
}
}
if (isAbsolutePath) {
url = WTF::URL::fileURLWithFileSystemPath(url).string();
}
}
if (frame.hasExpressionInfo()) {
// Apply sourcemap if available
JSC::LineColumn sourceMappedLineColumn = frame.semanticLocation.lineColumn;
if (provider) {
#if USE(BUN_JSC_ADDITIONS)
auto& fn = vm.computeLineColumnWithSourcemap();
if (fn) {
fn(vm, provider, sourceMappedLineColumn);
}
#endif
}
lineNumber = static_cast<int>(sourceMappedLineColumn.line);
columnNumber = static_cast<int>(sourceMappedLineColumn.column);
lineNumber = static_cast<int>(frame.semanticLocation.lineColumn.line);
columnNumber = static_cast<int>(frame.semanticLocation.lineColumn.column);
// Remap through sourcemaps (updates url, lineNumber, columnNumber)
remapSourceLocation(vm, url, lineNumber, columnNumber);
}
// Convert absolute paths to file:// URLs (after sourcemap remapping)
bool isAbsolutePath = false;
if (!url.isEmpty()) {
if (url[0] == '/')
isAbsolutePath = true;
else if (url.length() >= 2 && url[1] == ':') {
char firstChar = url[0];
if ((firstChar >= 'A' && firstChar <= 'Z') || (firstChar >= 'a' && firstChar <= 'z'))
isAbsolutePath = true;
} else if (url.length() >= 2 && url[0] == '\\' && url[1] == '\\')
isAbsolutePath = true;
}
if (isAbsolutePath)
url = WTF::URL::fileURLWithFileSystemPath(url).string();
}
// Create a unique key for this frame based on parent + callFrame
@@ -647,35 +659,30 @@ void stopCPUProfiler(JSC::VM& vm, WTF::String* outJSON, WTF::String* outText)
if (provider) {
url = provider->sourceURL();
scriptId = static_cast<int>(provider->asID());
bool isAbsolutePath = false;
if (!url.isEmpty()) {
if (url[0] == '/')
isAbsolutePath = true;
else if (url.length() >= 2 && url[1] == ':') {
char firstChar = url[0];
if ((firstChar >= 'A' && firstChar <= 'Z') || (firstChar >= 'a' && firstChar <= 'z'))
isAbsolutePath = true;
} else if (url.length() >= 2 && url[0] == '\\' && url[1] == '\\')
isAbsolutePath = true;
}
if (isAbsolutePath)
url = WTF::URL::fileURLWithFileSystemPath(url).string();
}
if (frame.hasExpressionInfo()) {
JSC::LineColumn sourceMappedLineColumn = frame.semanticLocation.lineColumn;
if (provider) {
#if USE(BUN_JSC_ADDITIONS)
auto& fn = vm.computeLineColumnWithSourcemap();
if (fn)
fn(vm, provider, sourceMappedLineColumn);
#endif
}
lineNumber = static_cast<int>(sourceMappedLineColumn.line);
columnNumber = static_cast<int>(sourceMappedLineColumn.column);
lineNumber = static_cast<int>(frame.semanticLocation.lineColumn.line);
columnNumber = static_cast<int>(frame.semanticLocation.lineColumn.column);
// Remap through sourcemaps (updates url, lineNumber, columnNumber)
remapSourceLocation(vm, url, lineNumber, columnNumber);
}
// Convert absolute paths to file:// URLs (after sourcemap remapping)
bool isAbsolutePath = false;
if (!url.isEmpty()) {
if (url[0] == '/')
isAbsolutePath = true;
else if (url.length() >= 2 && url[1] == ':') {
char firstChar = url[0];
if ((firstChar >= 'A' && firstChar <= 'Z') || (firstChar >= 'a' && firstChar <= 'z'))
isAbsolutePath = true;
} else if (url.length() >= 2 && url[0] == '\\' && url[1] == '\\')
isAbsolutePath = true;
}
if (isAbsolutePath)
url = WTF::URL::fileURLWithFileSystemPath(url).string();
}
WTF::StringBuilder keyBuilder;
@@ -817,35 +824,31 @@ void stopCPUProfiler(JSC::VM& vm, WTF::String* outJSON, WTF::String* outText)
if (frame.frameType == JSC::SamplingProfiler::FrameType::Executable && frame.executable) {
auto sourceProviderAndID = frame.sourceProviderAndID();
auto* provider = std::get<0>(sourceProviderAndID);
if (provider) {
if (provider)
url = provider->sourceURL();
bool isAbsolutePath = false;
if (!url.isEmpty()) {
if (url[0] == '/')
isAbsolutePath = true;
else if (url.length() >= 2 && url[1] == ':') {
char firstChar = url[0];
if ((firstChar >= 'A' && firstChar <= 'Z') || (firstChar >= 'a' && firstChar <= 'z'))
isAbsolutePath = true;
} else if (url.length() >= 2 && url[0] == '\\' && url[1] == '\\')
isAbsolutePath = true;
}
if (isAbsolutePath)
url = WTF::URL::fileURLWithFileSystemPath(url).string();
if (frame.hasExpressionInfo()) {
int columnNumber = -1;
lineNumber = static_cast<int>(frame.semanticLocation.lineColumn.line);
columnNumber = static_cast<int>(frame.semanticLocation.lineColumn.column);
// Remap through sourcemaps (updates url, lineNumber, columnNumber)
remapSourceLocation(vm, url, lineNumber, columnNumber);
}
if (frame.hasExpressionInfo()) {
JSC::LineColumn sourceMappedLineColumn = frame.semanticLocation.lineColumn;
if (provider) {
#if USE(BUN_JSC_ADDITIONS)
auto& fn = vm.computeLineColumnWithSourcemap();
if (fn)
fn(vm, provider, sourceMappedLineColumn);
#endif
}
lineNumber = static_cast<int>(sourceMappedLineColumn.line);
// Convert absolute paths to file:// URLs (after sourcemap remapping)
bool isAbsolutePath = false;
if (!url.isEmpty()) {
if (url[0] == '/')
isAbsolutePath = true;
else if (url.length() >= 2 && url[1] == ':') {
char firstChar = url[0];
if ((firstChar >= 'A' && firstChar <= 'Z') || (firstChar >= 'a' && firstChar <= 'z'))
isAbsolutePath = true;
} else if (url.length() >= 2 && url[0] == '\\' && url[1] == '\\')
isAbsolutePath = true;
}
if (isAbsolutePath)
url = WTF::URL::fileURLWithFileSystemPath(url).string();
}
WTF::String location = formatLocation(url, lineNumber);

View File

@@ -31,8 +31,17 @@
#include <utility>
#include <vector>
#ifdef _WIN32
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wmicrosoft-include"
#endif
#define v8 real_v8
#define private public
#include "node/v8.h"
#undef private
#undef v8
#ifdef _WIN32
#pragma clang diagnostic pop
#endif

View File

@@ -923,9 +923,9 @@ pub const SendQueue = struct {
pub fn windowsConfigureClient(this: *SendQueue, pipe_fd: bun.FileDescriptor) !void {
log("configureClient", .{});
const ipc_pipe = bun.handleOom(bun.default_allocator.create(uv.Pipe));
const ipc_pipe = bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe));
ipc_pipe.init(uv.Loop.get(), true).unwrap() catch |err| {
bun.default_allocator.destroy(ipc_pipe);
bun.destroy(ipc_pipe);
return err;
};
ipc_pipe.open(pipe_fd).unwrap() catch |err| {

View File

@@ -564,8 +564,8 @@ pub fn runScriptsWithFilter(ctx: Command.Context) !noreturn {
.config = script,
.options = .{
.stdin = .ignore,
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.cwd = std.fs.path.dirname(script.package_json_path) orelse "",
.windows = if (Environment.isWindows) .{ .loop = bun.jsc.EventLoopHandle.init(event_loop) },
.stream = true,

View File

@@ -762,8 +762,8 @@ pub fn run(ctx: Command.Context) !noreturn {
.color_idx = color_idx,
.options = .{
.stdin = .ignore,
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.cwd = config.cwd,
.windows = if (Environment.isWindows) .{ .loop = bun.jsc.EventLoopHandle.init(event_loop) },
.stream = true,


@@ -1415,15 +1415,23 @@ pub const Pipe = extern struct {
return @ptrCast(this);
}
-/// Close the pipe handle and then free it in the close callback.
-/// Use this when a pipe has been init'd but needs to be destroyed
-/// (e.g. when open() fails after init() succeeded).
+/// Close the pipe handle (if needed) and then free it.
+/// Handles all states: never-initialized (loop == null), already closing,
+/// or active. After uv_pipe_init the handle is in the event loop's
+/// handle_queue; freeing without uv_close corrupts that list.
pub fn closeAndDestroy(this: *@This()) void {
-this.close(&onCloseDestroy);
+if (this.loop == null) {
+// Never initialized — safe to free directly.
+bun.destroy(this);
+} else if (!this.isClosing()) {
+// Initialized and not yet closing — must uv_close first.
+this.close(&onCloseDestroy);
+}
+// else: already closing — the pending close callback owns the lifetime.
}
fn onCloseDestroy(handle: *@This()) callconv(.c) void {
-bun.default_allocator.destroy(handle);
+bun.destroy(handle);
}
};
const union_unnamed_416 = extern union {


@@ -187,8 +187,8 @@ pub const LifecycleScriptSubprocess = struct {
null,
};
if (Environment.isWindows) {
-this.stdout.source = .{ .pipe = bun.handleOom(bun.default_allocator.create(uv.Pipe)) };
-this.stderr.source = .{ .pipe = bun.handleOom(bun.default_allocator.create(uv.Pipe)) };
+this.stdout.source = .{ .pipe = bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe)) };
+this.stderr.source = .{ .pipe = bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe)) };
}
const spawn_options = bun.spawn.SpawnOptions{
.stdin = if (this.foreground)


@@ -1200,7 +1200,7 @@ pub const WindowsBufferedReader = struct {
fn onPipeClose(handle: *uv.Pipe) callconv(.c) void {
const this = bun.cast(*uv.Pipe, handle.data);
-bun.default_allocator.destroy(this);
+bun.destroy(this);
}
fn onTTYClose(handle: *uv.uv_tty_t) callconv(.c) void {


@@ -210,11 +210,11 @@ pub const Source = union(enum) {
pub fn openPipe(loop: *uv.Loop, fd: bun.FileDescriptor) bun.sys.Maybe(*Source.Pipe) {
log("openPipe (fd = {f})", .{fd});
-const pipe = bun.default_allocator.create(Source.Pipe) catch |err| bun.handleOom(err);
+const pipe = bun.new(Source.Pipe, std.mem.zeroes(Source.Pipe));
// we should never init using IPC here see ipc.zig
switch (pipe.init(loop, false)) {
.err => |err| {
-bun.default_allocator.destroy(pipe);
+bun.destroy(pipe);
return .{ .err = err };
},
else => {},


@@ -363,6 +363,100 @@ describe.concurrent("--cpu-prof", () => {
expect(profileContent).toContain("# CPU Profile");
});
test("--cpu-prof with --compile --sourcemap shows sourcemapped paths", async () => {
using dir = tempDir("cpu-prof-compile-sourcemap", {
"src/index.ts": `
import { heavyWork } from "./worker";
function main() {
const now = performance.now();
while (now + 100 > performance.now()) {
heavyWork();
}
}
main();
`,
"src/worker.ts": `
export function heavyWork() {
let sum = 0;
for (let i = 0; i < 10000; i++) {
sum += Math.sqrt(i);
}
return sum;
}
`,
});
// Build with --compile --sourcemap
await using buildProc = Bun.spawn({
cmd: [
bunExe(),
"build",
"--compile",
"--sourcemap",
"--outfile",
join(String(dir), "app"),
join(String(dir), "src/index.ts"),
],
cwd: String(dir),
env: bunEnv,
stderr: "pipe",
});
const buildStderr = await buildProc.stderr.text();
const buildExitCode = await buildProc.exited;
expect(buildStderr).not.toContain("error");
expect(buildExitCode).toBe(0);
// Run the compiled binary with cpu-prof flags via BUN_OPTIONS env var
// (standalone binaries don't parse runtime args, but do parse BUN_OPTIONS)
await using runProc = Bun.spawn({
cmd: [join(String(dir), "app")],
cwd: String(dir),
env: { ...bunEnv, BUN_OPTIONS: "--cpu-prof --cpu-prof-md" },
stderr: "pipe",
});
const runStderr = await runProc.stderr.text();
const runExitCode = await runProc.exited;
expect(runExitCode).toBe(0);
// Check JSON profile (.cpuprofile)
const files = readdirSync(String(dir));
const profileFiles = files.filter(f => f.endsWith(".cpuprofile"));
expect(profileFiles.length).toBeGreaterThan(0);
const profile = JSON.parse(readFileSync(join(String(dir), profileFiles[0]), "utf-8"));
// Collect all URLs with line numbers from profile nodes (frames with expression info)
const framesWithLocation: { url: string; line: number }[] = profile.nodes
.map((n: any) => ({ url: n.callFrame.url, line: n.callFrame.lineNumber }))
.filter((f: { url: string; line: number }) => f.url.length > 0 && f.line >= 0);
// Frames WITH line info should NOT contain /$bunfs/ paths or chunk- names
// (frames without line info may still show the binary name, which is expected)
const bunfsPaths = framesWithLocation.filter(
(f: { url: string }) => f.url.includes("$bunfs") || f.url.includes("chunk-"),
);
expect(bunfsPaths).toEqual([]);
// Should contain original source file names
const allUrls = framesWithLocation.map((f: { url: string }) => f.url);
const hasWorkerTs = allUrls.some((u: string) => u.includes("worker.ts"));
const hasIndexTs = allUrls.some((u: string) => u.includes("index.ts"));
expect(hasWorkerTs || hasIndexTs).toBe(true);
// Check Markdown profile (.md)
const mdFiles = files.filter(f => f.endsWith(".md") && f.startsWith("CPU."));
expect(mdFiles.length).toBeGreaterThan(0);
const mdContent = readFileSync(join(String(dir), mdFiles[0]), "utf-8");
// Should contain original source filenames in markdown
expect(mdContent.includes("worker.ts") || mdContent.includes("index.ts")).toBe(true);
});
test("--cpu-prof and --cpu-prof-md together creates both files", async () => {
using dir = tempDir("cpu-prof-both-formats", {
"test.js": `


@@ -0,0 +1,203 @@
// Exercises Bun's SIMD code paths to verify the baseline binary doesn't
// emit instructions beyond its CPU target (no AVX on x64, no LSE/SVE on aarch64).
//
// Each test uses inputs large enough to hit vectorized fast paths (>= 16 bytes
// for @Vector(16, u8), >= 64 bytes for wider paths) and validates correctness
// to catch both SIGILL and miscompilation from wrong instruction lowering.
import { describe, expect, test } from "bun:test";
// Use Buffer.alloc instead of "x".repeat() — repeat is slow in debug JSC builds.
const ascii256 = Buffer.alloc(256, "a").toString();
const ascii1k = Buffer.alloc(1024, "x").toString();
describe("escapeHTML — @Vector(16, u8) gated by enableSIMD", () => {
test("clean passthrough", () => {
expect(Bun.escapeHTML(ascii256)).toBe(ascii256);
});
test("ampersand in middle", () => {
const input = ascii256 + "&" + ascii256;
const escaped = Bun.escapeHTML(input);
expect(escaped).toContain("&amp;");
// The raw "&" should have been replaced — only "&amp;" should remain
expect(escaped.replaceAll("&amp;", "").includes("&")).toBe(false);
});
test("all special chars", () => {
const input = '<div class="test">' + ascii256 + "</div>";
const escaped = Bun.escapeHTML(input);
expect(escaped).toContain("&lt;");
expect(escaped).toContain("&gt;");
expect(escaped).toContain("&quot;");
});
});
describe("stringWidth — @Vector(16, u8) ungated", () => {
test("ascii", () => {
expect(Bun.stringWidth(ascii256)).toBe(256);
});
test("empty", () => {
expect(Bun.stringWidth("")).toBe(0);
});
test("tabs", () => {
expect(Bun.stringWidth(Buffer.alloc(32, "\t").toString())).toBe(0);
});
test("mixed printable and zero-width", () => {
const mixed = "hello" + "\x00".repeat(64) + "world";
expect(Bun.stringWidth(mixed)).toBe(10);
});
});
describe("Buffer hex encoding — @Vector(16, u8) gated by enableSIMD", () => {
test.each([16, 32, 64, 128, 256])("size %d", size => {
const buf = Buffer.alloc(size, 0xab);
const hex = buf.toString("hex");
expect(hex.length).toBe(size * 2);
expect(hex).toBe("ab".repeat(size));
});
test("all byte values", () => {
const varied = Buffer.alloc(256);
for (let i = 0; i < 256; i++) varied[i] = i;
const hex = varied.toString("hex");
expect(hex).toStartWith("000102030405");
expect(hex).toEndWith("fdfeff");
});
});
describe("base64 — simdutf runtime dispatch", () => {
test("ascii roundtrip", () => {
const encoded = btoa(ascii1k);
expect(atob(encoded)).toBe(ascii1k);
});
test("binary roundtrip", () => {
const binary = String.fromCharCode(...Array.from({ length: 256 }, (_, i) => i));
expect(atob(btoa(binary))).toBe(binary);
});
});
describe("TextEncoder/TextDecoder — simdutf runtime dispatch", () => {
const encoder = new TextEncoder();
const decoder = new TextDecoder();
test("ascii roundtrip", () => {
const bytes = encoder.encode(ascii1k);
expect(bytes.length).toBe(1024);
expect(decoder.decode(bytes)).toBe(ascii1k);
});
test("mixed ascii + multibyte", () => {
const mixed = ascii256 + "\u00e9\u00e9\u00e9" + ascii256 + "\u2603\u2603" + ascii256;
expect(decoder.decode(encoder.encode(mixed))).toBe(mixed);
});
test("emoji surrogate pairs", () => {
const emoji = "\u{1F600}".repeat(64);
expect(decoder.decode(encoder.encode(emoji))).toBe(emoji);
});
});
describe("decodeURIComponent — SIMD % scanning", () => {
test("clean passthrough", () => {
const clean = Buffer.alloc(256, "a").toString();
expect(decodeURIComponent(clean)).toBe(clean);
});
test("encoded at various positions", () => {
const input = "a".repeat(128) + "%20" + "b".repeat(128) + "%21";
expect(decodeURIComponent(input)).toBe("a".repeat(128) + " " + "b".repeat(128) + "!");
});
test("heavy utf8 encoding", () => {
const input = Array.from({ length: 64 }, () => "%C3%A9").join("");
expect(decodeURIComponent(input)).toBe("\u00e9".repeat(64));
});
});
describe("URL parsing — Highway indexOfChar/indexOfAny", () => {
test("long URL with all components", () => {
const longPath = "/" + "segment/".repeat(32) + "end";
const url = new URL("https://user:pass@example.com:8080" + longPath + "?key=value&foo=bar#section");
expect(url.protocol).toBe("https:");
expect(url.hostname).toBe("example.com");
expect(url.port).toBe("8080");
expect(url.pathname).toBe(longPath);
expect(url.search).toBe("?key=value&foo=bar");
expect(url.hash).toBe("#section");
});
});
describe("JSON — JS lexer SIMD string scanning", () => {
test("large object roundtrip", () => {
const obj: Record<string, string> = {};
for (let i = 0; i < 100; i++) {
obj["key_" + Buffer.alloc(32, "a").toString() + "_" + i] = "value_" + Buffer.alloc(64, "b").toString() + "_" + i;
}
const parsed = JSON.parse(JSON.stringify(obj));
expect(Object.keys(parsed).length).toBe(100);
expect(parsed["key_" + Buffer.alloc(32, "a").toString() + "_0"]).toBe(
"value_" + Buffer.alloc(64, "b").toString() + "_0",
);
});
test("string with escape sequences", () => {
const original = { msg: 'quote"here\nand\ttab' + Buffer.alloc(256, "x").toString() };
const reparsed = JSON.parse(JSON.stringify(original));
expect(reparsed.msg).toBe(original.msg);
});
});
describe("HTTP parsing — llhttp SSE4.2 PCMPESTRI", () => {
test("long headers", async () => {
const longHeaderValue = Buffer.alloc(512, "v").toString();
using server = Bun.serve({
port: 0,
fetch(req) {
return new Response(req.headers.get("X-Test-Header") || "missing");
},
});
const resp = await fetch(`http://localhost:${server.port}/` + "path/".repeat(20), {
headers: {
"X-Test-Header": longHeaderValue,
"X-Header-A": Buffer.alloc(64, "a").toString(),
"X-Header-B": Buffer.alloc(64, "b").toString(),
"X-Header-C": Buffer.alloc(64, "c").toString(),
"Accept": "application/json",
"Accept-Language": "en-US,en;q=0.9,fr;q=0.8,de;q=0.7",
},
});
expect(await resp.text()).toBe(longHeaderValue);
});
});
describe("Latin-1 to UTF-8 — @Vector(16, u8) ungated", () => {
test("full byte range", () => {
const latin1Bytes = Buffer.alloc(256);
for (let i = 0; i < 256; i++) latin1Bytes[i] = i;
const latin1Str = latin1Bytes.toString("latin1");
const utf8Buf = Buffer.from(latin1Str, "utf-8");
expect(utf8Buf.length).toBeGreaterThan(256);
expect(utf8Buf.toString("utf-8").length).toBe(256);
});
});
describe("String search — Highway memMem/indexOfChar", () => {
test("indexOf long string", () => {
const haystack = Buffer.alloc(1000, "a").toString() + "needle" + Buffer.alloc(1000, "b").toString();
expect(haystack.indexOf("needle")).toBe(1000);
expect(haystack.indexOf("missing")).toBe(-1);
expect(haystack.lastIndexOf("needle")).toBe(1000);
});
test("includes long string", () => {
const haystack = Buffer.alloc(1000, "a").toString() + "needle" + Buffer.alloc(1000, "b").toString();
expect(haystack.includes("needle")).toBe(true);
expect(haystack.includes("missing")).toBe(false);
});
});


@@ -1,140 +0,0 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
// https://github.com/oven-sh/bun/issues/27335
// The `accessor` keyword should work in TypeScript classes even when
// `experimentalDecorators: true` is set in tsconfig.json.
test("accessor keyword works with experimentalDecorators: true", async () => {
using dir = tempDir("issue-27335", {
"tsconfig.json": JSON.stringify({
compilerOptions: {
experimentalDecorators: true,
},
}),
"main.ts": `
class Person {
public accessor name: string = "John";
}
const p = new Person();
console.log(p.name);
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "main.ts"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout).toBe("John\n");
expect(exitCode).toBe(0);
});
test("accessor keyword works with various modifiers and experimentalDecorators", async () => {
using dir = tempDir("issue-27335-modifiers", {
"tsconfig.json": JSON.stringify({
compilerOptions: {
experimentalDecorators: true,
},
}),
"main.ts": `
class Foo {
accessor x = 1;
public accessor y = 2;
private accessor z = 3;
static accessor w = 4;
getZ() { return this.z; }
}
const f = new Foo();
console.log(f.x);
console.log(f.y);
console.log(f.getZ());
console.log(Foo.w);
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "main.ts"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout).toBe("1\n2\n3\n4\n");
expect(exitCode).toBe(0);
});
test("accessor keyword works without experimentalDecorators (standard mode)", async () => {
using dir = tempDir("issue-27335-standard", {
"tsconfig.json": JSON.stringify({
compilerOptions: {},
}),
"main.ts": `
class Person {
public accessor name: string = "John";
}
const p = new Person();
console.log(p.name);
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "main.ts"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout).toBe("John\n");
expect(exitCode).toBe(0);
});
test("accessor with experimental decorators on other members", async () => {
using dir = tempDir("issue-27335-mixed", {
"tsconfig.json": JSON.stringify({
compilerOptions: {
experimentalDecorators: true,
},
}),
"main.ts": `
function log(target: any, key: string) {
// simple experimental decorator
}
class MyClass {
@log
greet() { return "hello"; }
accessor count: number = 42;
}
const obj = new MyClass();
console.log(obj.greet());
console.log(obj.count);
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "main.ts"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout).toBe("hello\n42\n");
expect(exitCode).toBe(0);
});