Compare commits


11 Commits

Author SHA1 Message Date
Dylan Conway
b3b80ec6f2 skip tail-call-should-consume-stack-in-bbq.js on Windows SDE (17+ min) 2026-02-21 16:45:54 -08:00
Dylan Conway
98064f6d2f fix: retry WebKit download up to 3 times on transient network errors 2026-02-21 15:29:54 -08:00
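The retry policy above (the CMake hunk later on this page uses 3 attempts with a 5-second pause) can be sketched generically; `downloadWithRetry` and its defaults are hypothetical names for illustration, not the actual build code:

```typescript
// Minimal sketch of a bounded retry with a pause between attempts,
// assuming any thrown error is a transient network failure.
export async function downloadWithRetry(
  download: () => Promise<void>,
  attempts = 3,
  pauseMs = 5000,
): Promise<void> {
  for (let i = 1; i <= attempts; i++) {
    try {
      return await download();
    } catch (err) {
      // Give up only after the final attempt; otherwise wait and retry.
      if (i === attempts) throw err;
      console.warn(`download attempt ${i} failed:`, err);
      await new Promise(resolve => setTimeout(resolve, pauseMs));
    }
  }
}
```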
Dylan Conway
e623787ea3 fix(cmake): use GITHUB_TOKEN for dependency downloads to avoid rate limits
Pass Authorization header for GitHub URLs when GITHUB_TOKEN is set.
Raises the rate limit from 60 to 5000 requests/hour, preventing
download failures when many dependencies are fetched in parallel.

Only applies to github.com URLs. No-op when GITHUB_TOKEN is unset.
2026-02-21 15:00:25 -08:00
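The header selection described above can be sketched as follows; `githubDownloadHeaders` is a hypothetical helper, not the CMake code itself:

```typescript
// Minimal sketch: attach a bearer token only for github.com URLs and only
// when GITHUB_TOKEN is set, so the change is a no-op everywhere else.
export function githubDownloadHeaders(
  url: string,
  token = process.env.GITHUB_TOKEN,
): Record<string, string> {
  const headers: Record<string, string> = { "User-Agent": "download-sketch" };
  if (token && url.startsWith("https://github.com/")) {
    headers["Authorization"] = `Bearer ${token}`;
  }
  return headers;
}
```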
Dylan Conway
cb3c39be23 ci: add Intel SDE baseline verification for Windows, unify baseline checks (#27121)
Adds a unified baseline verification script
(`scripts/verify-baseline.ts`) that combines basic CPU instruction
checks and JIT stress test fixtures into a single step.

**Changes:**
- New TypeScript script replaces separate `verify-baseline-cpu.sh` and
`verify-jit-stress-qemu.sh` CI steps
- Adds Windows x64 baseline verification using Intel SDE v9.58 with
Nehalem emulation
- Linux continues to use QEMU (Nehalem for x64, Cortex-A53 for aarch64)
- SDE violations are detected by checking output for `SDE-ERROR`
messages rather than exit codes, avoiding false positives from
application errors
- JIT stress fixtures now run on every build instead of only when WebKit
changes

**Platform support:**
| Platform | Emulator | CPU Model |
|----------|----------|-----------|
| Linux x64 baseline | QEMU | Nehalem (SSE4.2, no AVX) |
| Linux aarch64 | QEMU | Cortex-A53 (no LSE/SVE) |
| Windows x64 baseline | Intel SDE | Nehalem (no AVX) |

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-02-21 14:24:21 -08:00
robobun
bc98025d93 fix(spawn): close libuv pipes before freeing to prevent handle queue corruption (#27064)
## Summary

Fixes #27063

On Windows, when `Bun.spawn` fails (e.g., ENOENT for a nonexistent
executable), pipes that were already initialized with `uv_pipe_init`
were being freed directly with `allocator.destroy()` without first
calling `uv_close()`. This left dangling pointers in libuv's
`handle_queue` linked list, corrupting it. Subsequent spawn calls would
crash with a segfault when inserting new handles into the corrupted
list.

Three sites were freeing pipe handles without `uv_close`:

- **`process.zig` `Stdio.deinit()`**: When spawn failed,
already-initialized pipes were freed without `uv_close()`. Now uses
`closePipeAndDestroy()` which checks `pipe.loop` to determine if the
pipe was registered with the event loop.
- **`process.zig` `spawnProcessWindows` IPC handling**: Unsupported IPC
pipes in stdin/stdout/stderr were freed directly. Now uses the same safe
close-then-destroy pattern.
- **`source.zig` `openPipe()`**: If `pipe.open(fd)` failed after
`pipe.init()` succeeded, the pipe was destroyed directly. Now calls
`uv_close()` with a callback that frees the memory.

Additionally, pipe allocations in `stdio.zig` are now zero-initialized
so that the `loop` field is reliably `null` before `uv_pipe_init`,
enabling the init detection in `deinit`.

## Test plan

- [x] Added regression test `test/regression/issue/27063.test.ts` that
spawns nonexistent executables repeatedly and verifies a valid spawn
still works afterward
- [x] Verified existing spawn tests pass (`exit-code.test.ts`,
`spawnSync.test.ts` — timing-related pre-existing flakes only)
- [x] Debug build compiles successfully
- [ ] Windows CI should verify the fix prevents the segfault


🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-02-21 14:04:25 -08:00
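The regression pattern from the test plan above can be sketched with Node's `child_process` (the executable name is illustrative): repeated failed spawns must not corrupt process state, and a valid spawn must still succeed afterward.

```typescript
import { spawnSync } from "node:child_process";

// Minimal sketch of the issue-27063 regression check: spawn a nonexistent
// executable repeatedly, then verify a subsequent valid spawn still works.
export function spawnFailuresThenSuccess(attempts: number): string {
  for (let i = 0; i < attempts; i++) {
    const bad = spawnSync("definitely-not-a-real-executable-27063");
    if (!bad.error) throw new Error("expected spawn of a nonexistent executable to fail");
  }
  // If earlier failures corrupted handle bookkeeping, this spawn would crash.
  const ok = spawnSync(process.execPath, ["-e", "process.stdout.write('alive')"]);
  return ok.stdout.toString();
}
```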
Jarred Sumner
b371bf9420 fix(install): resolve DT_UNKNOWN entries on NFS/FUSE filesystems (#27008)
## Summary

- Fixes `bun install` producing incomplete `node_modules` on NFS, FUSE,
and some bind mount filesystems
- On these filesystems, `getdents64` returns `DT_UNKNOWN` for `d_type`
instead of `DT_DIR`/`DT_REG`
- The directory walker was silently skipping these entries, causing
missing files (e.g., 484 instead of 1070 for `@sinclair/typebox`)
- When an entry has unknown kind, we now fall back to `fstatat()` to
resolve the actual file type

## Test plan

- [x] Reproduced with Docker NFS environment: npm installs 1071 files,
bun installs only 484
- [x] Verified fix: bun-debug now installs 1070 files (matching npm
minus `.package-lock.json`)
- [x] Second install from cache also works correctly (1070 files)
- [x] `bun run zig:check-all` passes on all 16 platform targets
- [ ] CI passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-02-21 02:18:47 -08:00
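The fallback described above can be sketched in TypeScript (the `walk` helper is hypothetical, not Bun's Zig walker): when a directory entry reports neither file nor directory, resolve the real type with an explicit stat instead of silently skipping the entry.

```typescript
import { lstatSync, readdirSync } from "node:fs";
import { join } from "node:path";

// Minimal sketch of the DT_UNKNOWN fallback: on NFS/FUSE mounts a dirent's
// type may be unknown, so stat the path rather than dropping the entry.
export function walk(dir: string, out: string[] = []): string[] {
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    const path = join(dir, entry.name);
    // isDirectory()/isFile() are both false when the kind is unknown;
    // fall back to lstat to decide instead of skipping.
    const isDir =
      entry.isDirectory() || (!entry.isFile() && lstatSync(path).isDirectory());
    if (isDir) walk(path, out);
    else out.push(path);
  }
  return out;
}
```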
Jarred Sumner
e6ec92244c fix(bindgen): hoist WTF::String temps to dispatch scope to prevent use-after-free (#27324)
## Summary

Fixes a use-after-free in bindgen v1 generated C++ bindings that causes
`"switch on corrupt value"` panics in `String.deref` on Windows. This is
a top crash (500+ reports across v1.3.3–v1.3.9), predominantly affecting
standalone executables.

## Root Cause

`Bun::toString(WTF::String&)` copies the raw `StringImpl*` pointer
**without adding a reference**. For optional string arguments with
defaults and dictionary string fields, the generated code declares
`WTF::String` inside an `if` block, but the resulting `BunString`
outlives it:

```cpp
BunString argStr;
if (!arg.isUndefinedOrNull()) {
    WTF::String wtfString_0 = WebCore::convert<IDLDOMString>(...);
    argStr = Bun::toString(wtfString_0);  // copies pointer, no ref
}  // ← wtfString_0 destroyed here, drops ref → StringImpl may be freed
// argStr now holds a dangling pointer to freed memory
```

When the freed memory is reused, `String.deref()` reads garbage for the
tag field → `"switch on corrupt value"` panic.

### Why it was Windows-only / elevated recently

- The mimalloc v3 update (shipped in v1.3.7/v1.3.8) changed heap reuse
patterns on Windows, causing freed memory to be overwritten more
aggressively — turning a latent UAF into a frequent crash
- The mimalloc v3 revert in v1.3.9 reduced crash frequency back to
baseline but did not fix the underlying bug
- A [previous fix](https://github.com/oven-sh/bun/pull/26717) was
reverted due to unrelated CI failures

## Fix

Hoist all `WTF::String` temporaries to the same scope as the Zig
dispatch call, so they stay alive until the `BunString` values are
consumed:

1. **Function string arguments**: `WTF::String` is declared at the top
of the generated function, before any `if` blocks for optional arguments
2. **Dictionary string fields**: The `convert*` function accepts
`WTF::String&` references owned by the caller, so the string data
outlives the `convert*` function and remains valid through the dispatch
call

This approach is exception-safe — `WTF::String` destructors handle
cleanup automatically on all exit paths (normal return,
`RETURN_IF_EXCEPTION`, etc.) with no leaked refs.

### Difference from the previous fix

The [previous fix](https://github.com/oven-sh/bun/pull/26717) hoisted
`WTF::String` for function arguments but kept dictionary field temps
**inside** the `convert*` function. This left dictionary string fields
as use-after-return — `result->encoding` would be a dangling pointer
after `convert*` returned. This fix correctly passes `WTF::String&` refs
from the dispatch scope through to the `convert*` function.

### Affected call sites

Only 2 call sites have the vulnerable pattern (`DOMString` +
`.default(...)`):
- `Bun.stringWidth()` — `str: t.DOMString.default("")`  
- `os.userInfo()` — `encoding: t.DOMString.default("")` in
`UserInfoOptions` dictionary

Note: bindgen v2 is not affected — it uses `releaseImpl().leakRef()`
which transfers ownership correctly.
2026-02-21 02:15:52 -08:00
Dylan Conway
b509acb533 Revert "fix: clean up ESM registry when require() of ESM module fails (#27288)" (#27325)

This reverts commit 21c3439bb4.
2026-02-21 01:14:27 -08:00
robobun
ede635b8a9 fix(install): store tarball integrity hash in lockfile for HTTPS dependencies (#27018)
## Summary
- HTTPS/URL tarball dependencies were not having their integrity hash
stored in the lockfile, allowing a malicious server to change the
tarball contents without detection on subsequent installs
- Now computes a sha512 hash from the downloaded tarball bytes during
extraction and stores it in both the binary lockfile and text bun.lock
- The hash is verified on re-download, matching the behavior of npm
registry packages
- Old lockfiles without integrity hashes continue to work (backward
compatible)

## Changes
- `src/install/integrity.zig`: Added `Integrity.forBytes()` to compute
sha512 from raw bytes
- `src/install/install.zig`: Added `integrity` field to `ExtractData`
struct
- `src/install/PackageManagerTask.zig`: Compute hash from tarball bytes
for both remote and local tarball tasks
- `src/install/PackageManager/processDependencyList.zig`: Set
`package.meta.integrity` from computed hash
- `src/install/lockfile/bun.lock.zig`: Serialize/deserialize integrity
for `remote_tarball` and `local_tarball` types

## Test plan
- [x] Integrity hash is stored in text lockfile for tarball URL deps
- [x] Integrity hash is consistent/deterministic across reinstalls
- [x] Integrity mismatch (changed tarball content) causes install
failure
- [x] Old lockfiles without integrity still install successfully
(backward compat)
- [x] Fresh installs produce integrity hashes
- [x] All 12 existing tarball tests pass (no regressions)
- [x] Tests fail with `USE_SYSTEM_BUN=1` (confirms fix is effective)

Fixes GHSA-jfhr-4v9p-9rm4

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-02-20 23:52:45 -08:00
Jarred Sumner
ebb3730166 Update ci.mjs 2026-02-20 23:17:53 -08:00
robobun
76ceb26e0a fix(socket): prevent null deref in Listener.getsockname (#27303)
## Summary
- Fix null pointer dereference in `Listener.getsockname()` when called
without an object argument (or with a non-object argument)
- `getsockname()` wrote properties directly into its first argument via
`.put()`, which calls `getObject()` in C++ — this returns null for
non-object values like `undefined`, causing a crash at
`BunString.cpp:942`
- Now validates the argument is an object first; if not, creates a new
empty object, writes properties into it, and returns it

## Crash reproduction
```js
const listener = Bun.listen({
    hostname: "localhost",
    port: 0,
    socket: { data() {} },
});
listener.getsockname(); // Segfault - null pointer dereference
```

## Test plan
- [x] Added `test/js/bun/http/listener-getsockname.test.ts` with tests
for calling `getsockname()` with no argument, with an object argument,
and with a non-object argument
- [x] Verified test crashes with system bun and passes with patched
build
- [x] Verified original fuzzer reproduction no longer crashes

---------

Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 22:50:44 -08:00
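The fix validates the destination argument before writing properties into it; a minimal TypeScript sketch of that guard (the Zig hunk on this page throws for non-objects, and the field values below are illustrative):

```typescript
// Minimal sketch: refuse to write into a non-object destination instead of
// dereferencing null, mirroring the validate-first pattern in the fix.
export function fillSockname(out: unknown): Record<string, unknown> {
  if (typeof out !== "object" || out === null) {
    throw new TypeError("Expected object");
  }
  const target = out as Record<string, unknown>;
  target.family = "IPv4";
  target.address = "127.0.0.1";
  target.port = 4000;
  return target;
}
```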
46 changed files with 1398 additions and 242 deletions

View File

@@ -593,8 +593,35 @@ function getTargetTriplet(platform) {
*/
function needsBaselineVerification(platform) {
const { os, arch, baseline } = platform;
if (os !== "linux") return false;
return (arch === "x64" && baseline) || arch === "aarch64";
if (os === "linux") return (arch === "x64" && baseline) || arch === "aarch64";
if (os === "windows") return arch === "x64" && baseline;
return false;
}
/**
* Returns the emulator binary name for the given platform.
* Linux uses QEMU user-mode; Windows uses Intel SDE.
* @param {Platform} platform
* @returns {string}
*/
function getEmulatorBinary(platform) {
const { os, arch } = platform;
if (os === "windows") return "sde-external/sde.exe";
if (arch === "aarch64") return "qemu-aarch64-static";
return "qemu-x86_64-static";
}
const SDE_VERSION = "9.58.0-2025-06-16";
const SDE_URL = `https://downloadmirror.intel.com/859732/sde-external-${SDE_VERSION}-win.tar.xz`;
/**
* @param {Platform} platform
* @param {PipelineOptions} options
* @returns {Step}
*/
function hasWebKitChanges(options) {
const { changedFiles = [] } = options;
return changedFiles.some(file => file.includes("SetupWebKit.cmake"));
}
/**
@@ -603,9 +630,31 @@ function needsBaselineVerification(platform) {
* @returns {Step}
*/
function getVerifyBaselineStep(platform, options) {
const { arch } = platform;
const { os } = platform;
const targetKey = getTargetKey(platform);
const archArg = arch === "x64" ? "x64" : "aarch64";
const triplet = getTargetTriplet(platform);
const emulator = getEmulatorBinary(platform);
const jitStressFlag = hasWebKitChanges(options) ? " --jit-stress" : "";
const setupCommands =
os === "windows"
? [
`echo Downloading build artifacts...`,
`buildkite-agent artifact download ${triplet}.zip . --step ${targetKey}-build-bun`,
`echo Extracting ${triplet}.zip...`,
`tar -xf ${triplet}.zip`,
`echo Downloading Intel SDE...`,
`curl.exe -fsSL -o sde.tar.xz "${SDE_URL}"`,
`echo Extracting Intel SDE...`,
`7z x -y sde.tar.xz`,
`7z x -y sde.tar`,
`ren sde-external-${SDE_VERSION}-win sde-external`,
]
: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${triplet}.zip'`,
`chmod +x ${triplet}/bun`,
];
return {
key: `${targetKey}-verify-baseline`,
@@ -614,57 +663,10 @@ function getVerifyBaselineStep(platform, options) {
agents: getLinkBunAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
timeout_in_minutes: 5,
timeout_in_minutes: hasWebKitChanges(options) ? 30 : 10,
command: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${getTargetTriplet(platform)}.zip'`,
`unzip -o '${getTargetTriplet(platform)}-profile.zip'`,
`chmod +x ${getTargetTriplet(platform)}/bun ${getTargetTriplet(platform)}-profile/bun-profile`,
`./scripts/verify-baseline-cpu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}/bun`,
`./scripts/verify-baseline-cpu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}-profile/bun-profile`,
],
};
}
/**
* Returns true if the PR modifies SetupWebKit.cmake (WebKit version changes).
* JIT stress tests under QEMU should run when WebKit is updated to catch
* JIT-generated code that uses unsupported CPU instructions.
* @param {PipelineOptions} options
* @returns {boolean}
*/
function hasWebKitChanges(options) {
const { changedFiles = [] } = options;
return changedFiles.some(file => file.includes("SetupWebKit.cmake"));
}
/**
* Returns a step that runs JSC JIT stress tests under QEMU.
* This verifies that JIT-compiled code doesn't use CPU instructions
* beyond the baseline target (no AVX on x64, no LSE on aarch64).
* @param {Platform} platform
* @param {PipelineOptions} options
* @returns {Step}
*/
function getJitStressTestStep(platform, options) {
const { arch } = platform;
const targetKey = getTargetKey(platform);
const archArg = arch === "x64" ? "x64" : "aarch64";
return {
key: `${targetKey}-jit-stress-qemu`,
label: `${getTargetLabel(platform)} - jit-stress-qemu`,
depends_on: [`${targetKey}-build-bun`],
agents: getLinkBunAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
// JIT stress tests are slow under QEMU emulation
timeout_in_minutes: 30,
command: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${getTargetTriplet(platform)}.zip'`,
`chmod +x ${getTargetTriplet(platform)}/bun`,
`./scripts/verify-jit-stress-qemu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}/bun`,
...setupCommands,
`bun scripts/verify-baseline.ts --binary ${triplet}/${os === "windows" ? "bun.exe" : "bun"} --emulator ${emulator}${jitStressFlag}`,
],
};
}
@@ -1264,10 +1266,6 @@ async function getPipeline(options = {}) {
if (needsBaselineVerification(target)) {
steps.push(getVerifyBaselineStep(target, options));
// Run JIT stress tests under QEMU when WebKit is updated
if (hasWebKitChanges(options)) {
steps.push(getJitStressTestStep(target, options));
}
}
return getStepWithDependsOn(
@@ -1349,6 +1347,10 @@ async function main() {
{ headers: { Authorization: `Bearer ${getSecret("GITHUB_TOKEN")}` } },
);
const doc = await res.json();
if (!Array.isArray(doc)) {
console.error(`-> page ${i}, unexpected response:`, JSON.stringify(doc));
break;
}
console.log(`-> page ${i}, found ${doc.length} items`);
if (doc.length === 0) break;
for (const { filename, status } of doc) {
@@ -1363,7 +1365,7 @@ async function main() {
} catch (e) {
console.error(e);
}
if (allFiles.every(filename => filename.startsWith("docs/"))) {
if (allFiles.length > 0 && allFiles.every(filename => filename.startsWith("docs/"))) {
console.log(`- PR is only docs, skipping tests!`);
return;
}

View File

@@ -30,6 +30,11 @@ else()
set(DOWNLOAD_ACCEPT_HEADER "Accept: */*")
endif()
set(DOWNLOAD_AUTH_HEADER)
if(DEFINED ENV{GITHUB_TOKEN} AND NOT "$ENV{GITHUB_TOKEN}" STREQUAL "" AND DOWNLOAD_URL MATCHES "^https://github\\.com/")
set(DOWNLOAD_AUTH_HEADER HTTPHEADER "Authorization: Bearer $ENV{GITHUB_TOKEN}")
endif()
foreach(i RANGE 10)
set(DOWNLOAD_TMP_FILE_${i} ${DOWNLOAD_TMP_FILE}.${i})
@@ -38,12 +43,13 @@ foreach(i RANGE 10)
else()
message(STATUS "Downloading ${DOWNLOAD_URL}... (retry ${i})")
endif()
file(DOWNLOAD
${DOWNLOAD_URL}
${DOWNLOAD_TMP_FILE_${i}}
HTTPHEADER "User-Agent: cmake/${CMAKE_VERSION}"
HTTPHEADER ${DOWNLOAD_ACCEPT_HEADER}
${DOWNLOAD_AUTH_HEADER}
STATUS DOWNLOAD_STATUS
INACTIVITY_TIMEOUT 60
TIMEOUT 180

View File

@@ -13,6 +13,11 @@ else()
set(LSHPACK_INCLUDES .)
endif()
# Suppress all warnings from vendored lshpack on Windows (clang-cl)
if(WIN32)
set(LSHPACK_CMAKE_ARGS "-DCMAKE_C_FLAGS=-w")
endif()
register_cmake_command(
TARGET
lshpack
@@ -28,6 +33,7 @@ register_cmake_command(
# _lshpack_enc_get_static_name in libls-hpack.a(lshpack.c.o)
# _update_hash in libls-hpack.a(lshpack.c.o)
-DCMAKE_BUILD_TYPE=Release
${LSHPACK_CMAKE_ARGS}
INCLUDES
${LSHPACK_INCLUDES}
)

View File

@@ -79,12 +79,22 @@ endif()
if(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64|arm64|ARM64|AARCH64" AND NOT APPLE)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_NO_OPT_ARCH=ON)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_SIMD=ON)
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-moutline-atomics")
if(NOT WIN32)
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-moutline-atomics")
endif()
elseif(NOT ENABLE_BASELINE)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_ARCH=ON)
list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_SIMD=ON)
endif()
# Suppress all warnings from mimalloc on Windows — it's vendored C code compiled
# as C++ (MI_USE_CXX=ON) which triggers many clang-cl warnings (-Wold-style-cast,
# -Wzero-as-null-pointer-constant, -Wc++98-compat-pedantic, etc.)
if(WIN32)
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-w")
list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_CXX_FLAGS=-w")
endif()
if(WIN32)
if(DEBUG)
set(MIMALLOC_LIBRARY mimalloc-static-debug)

View File

@@ -7,9 +7,16 @@ register_repository(
12882eee073cfe5c7621bcfadf679e1372d4537b
)
# Suppress all warnings from vendored tinycc on Windows (clang-cl)
if(WIN32)
set(TINYCC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-w")
endif()
register_cmake_command(
TARGET
tinycc
ARGS
${TINYCC_CMAKE_ARGS}
LIBRARIES
tcc
)

View File

@@ -241,13 +241,21 @@ if(EXISTS ${WEBKIT_PATH}/package.json)
endif()
endif()
file(
DOWNLOAD ${WEBKIT_DOWNLOAD_URL} ${CACHE_PATH}/${WEBKIT_FILENAME} SHOW_PROGRESS
STATUS WEBKIT_DOWNLOAD_STATUS
)
if(NOT "${WEBKIT_DOWNLOAD_STATUS}" MATCHES "^0;")
message(FATAL_ERROR "Failed to download WebKit: ${WEBKIT_DOWNLOAD_STATUS}")
endif()
foreach(WEBKIT_DOWNLOAD_ATTEMPT RANGE 1 3)
file(
DOWNLOAD ${WEBKIT_DOWNLOAD_URL} ${CACHE_PATH}/${WEBKIT_FILENAME} SHOW_PROGRESS
STATUS WEBKIT_DOWNLOAD_STATUS
)
if("${WEBKIT_DOWNLOAD_STATUS}" MATCHES "^0;")
break()
endif()
message(WARNING "WebKit download attempt ${WEBKIT_DOWNLOAD_ATTEMPT} failed: ${WEBKIT_DOWNLOAD_STATUS}")
file(REMOVE ${CACHE_PATH}/${WEBKIT_FILENAME})
if(WEBKIT_DOWNLOAD_ATTEMPT EQUAL 3)
message(FATAL_ERROR "Failed to download WebKit after 3 attempts: ${WEBKIT_DOWNLOAD_STATUS}")
endif()
execute_process(COMMAND ${CMAKE_COMMAND} -E sleep 5)
endforeach()
file(ARCHIVE_EXTRACT INPUT ${CACHE_PATH}/${WEBKIT_FILENAME} DESTINATION ${CACHE_PATH} TOUCH)
file(REMOVE ${CACHE_PATH}/${WEBKIT_FILENAME})

View File

@@ -39,6 +39,12 @@ else()
unsupported(CMAKE_BUILD_TYPE)
endif()
# Since Bun 1.1, Windows has been built using ReleaseSafe.
# This is because it caught more crashes, but we can reconsider this in the future
if(WIN32 AND DEFAULT_ZIG_OPTIMIZE STREQUAL "ReleaseFast")
set(DEFAULT_ZIG_OPTIMIZE "ReleaseSafe")
endif()
optionx(ZIG_OPTIMIZE "ReleaseFast|ReleaseSafe|ReleaseSmall|Debug" "The Zig optimize level to use" DEFAULT ${DEFAULT_ZIG_OPTIMIZE})
# To use LLVM bitcode from Zig, more work needs to be done. Currently, an install of

scripts/verify-baseline.ts (new file, 238 lines)
View File

@@ -0,0 +1,238 @@
// Verify that a Bun binary doesn't use CPU instructions beyond its baseline target.
//
// Detects the platform and chooses the appropriate emulator:
// Linux x64: QEMU with Nehalem CPU (no AVX)
// Linux arm64: QEMU with Cortex-A53 (no LSE/SVE)
// Windows x64: Intel SDE with -nhm (no AVX)
//
// Usage:
// bun scripts/verify-baseline.ts --binary ./bun --emulator /usr/bin/qemu-x86_64
// bun scripts/verify-baseline.ts --binary ./bun.exe --emulator ./sde.exe
import { readdirSync } from "node:fs";
import { basename, dirname, join, resolve } from "node:path";
const { parseArgs } = require("node:util");
const { values } = parseArgs({
args: process.argv.slice(2),
options: {
binary: { type: "string" },
emulator: { type: "string" },
"jit-stress": { type: "boolean", default: false },
},
strict: true,
});
const binary = resolve(values.binary!);
function resolveEmulator(name: string): string {
const found = Bun.which(name);
if (found) return found;
// Try without -static suffix (e.g. qemu-aarch64 instead of qemu-aarch64-static)
if (name.endsWith("-static")) {
const fallback = Bun.which(name.slice(0, -"-static".length));
if (fallback) return fallback;
}
// Last resort: resolve as a relative path (e.g. sde-external/sde.exe)
return resolve(name);
}
const emulatorPath = resolveEmulator(values.emulator!);
const scriptDir = dirname(import.meta.path);
const repoRoot = resolve(scriptDir, "..");
const fixturesDir = join(repoRoot, "test", "js", "bun", "jsc-stress", "fixtures");
const wasmFixturesDir = join(fixturesDir, "wasm");
const preloadPath = join(repoRoot, "test", "js", "bun", "jsc-stress", "preload.js");
// Platform detection
const isWindows = process.platform === "win32";
const isAarch64 = process.arch === "arm64";
// SDE outputs this when a chip-check violation occurs
const SDE_VIOLATION_PATTERN = /SDE-ERROR:.*not valid for specified chip/i;
// Configure emulator based on platform
const config = isWindows
? {
runnerCmd: [emulatorPath, "-nhm", "--"],
cpuDesc: "Nehalem (SSE4.2, no AVX/AVX2/AVX512)",
// SDE must run from its own directory for Pin DLL resolution
cwd: dirname(emulatorPath),
}
: isAarch64
? {
runnerCmd: [emulatorPath, "-cpu", "cortex-a53"],
cpuDesc: "Cortex-A53 (ARMv8.0-A+CRC, no LSE/SVE)",
cwd: undefined,
}
: {
runnerCmd: [emulatorPath, "-cpu", "Nehalem"],
cpuDesc: "Nehalem (SSE4.2, no AVX/AVX2/AVX512)",
cwd: undefined,
};
function isInstructionViolation(exitCode: number, output: string): boolean {
if (isWindows) return SDE_VIOLATION_PATTERN.test(output);
return exitCode === 132; // SIGILL = 128 + signal 4
}
console.log(`--- Verifying ${basename(binary)} on ${config.cpuDesc}`);
console.log(` Binary: ${binary}`);
console.log(` Emulator: ${config.runnerCmd.join(" ")}`);
console.log();
let instructionFailures = 0;
let otherFailures = 0;
let passed = 0;
const failedTests: string[] = [];
interface RunTestOptions {
cwd?: string;
/** Tee output live to the console while still capturing it for analysis */
live?: boolean;
}
/** Read a stream, write each chunk to a writable, and return the full text. */
async function teeStream(stream: ReadableStream<Uint8Array>, output: NodeJS.WriteStream): Promise<string> {
const chunks: Uint8Array[] = [];
for await (const chunk of stream) {
chunks.push(chunk);
output.write(chunk);
}
return Buffer.concat(chunks).toString();
}
async function runTest(label: string, binaryArgs: string[], options?: RunTestOptions): Promise<boolean> {
console.log(`+++ ${label}`);
const start = performance.now();
const live = options?.live ?? false;
const proc = Bun.spawn([...config.runnerCmd, binary, ...binaryArgs], {
// config.cwd takes priority — SDE on Windows must run from its own directory for Pin DLL resolution
cwd: config.cwd ?? options?.cwd,
stdout: "pipe",
stderr: "pipe",
});
let stdout: string;
let stderr: string;
if (live) {
[stdout, stderr] = await Promise.all([
teeStream(proc.stdout as ReadableStream<Uint8Array>, process.stdout),
teeStream(proc.stderr as ReadableStream<Uint8Array>, process.stderr),
proc.exited,
]);
} else {
[stdout, stderr] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
}
const exitCode = proc.exitCode!;
const elapsed = ((performance.now() - start) / 1000).toFixed(1);
const output = stdout + "\n" + stderr;
if (exitCode === 0) {
if (!live && stdout.trim()) console.log(stdout.trim());
console.log(` PASS (${elapsed}s)`);
passed++;
return true;
}
if (isInstructionViolation(exitCode, output)) {
if (!live && output.trim()) console.log(output.trim());
console.log();
console.log(` FAIL: CPU instruction violation detected (${elapsed}s)`);
if (isAarch64) {
console.log(" The aarch64 build targets Cortex-A53 (ARMv8.0-A+CRC).");
console.log(" LSE atomics, SVE, and dotprod instructions are not allowed.");
} else {
console.log(" The baseline x64 build targets Nehalem (SSE4.2).");
console.log(" AVX, AVX2, and AVX512 instructions are not allowed.");
}
instructionFailures++;
failedTests.push(label);
} else {
if (!live && output.trim()) console.log(output.trim());
console.log(` WARN: exit code ${exitCode} (${elapsed}s, not a CPU instruction issue)`);
otherFailures++;
}
return false;
}
// Phase 1: SIMD code path verification (always runs)
const simdTestPath = join(repoRoot, "test", "js", "bun", "jsc-stress", "fixtures", "simd-baseline.test.ts");
await runTest("SIMD baseline tests", ["test", simdTestPath], { live: true });
// Phase 2: JIT stress fixtures (only with --jit-stress, e.g. on WebKit changes)
if (values["jit-stress"]) {
const jsFixtures = readdirSync(fixturesDir)
.filter(f => f.endsWith(".js"))
.sort();
console.log();
console.log(`--- JS fixtures (DFG/FTL) — ${jsFixtures.length} tests`);
for (let i = 0; i < jsFixtures.length; i++) {
const fixture = jsFixtures[i];
await runTest(`[${i + 1}/${jsFixtures.length}] ${fixture}`, ["--preload", preloadPath, join(fixturesDir, fixture)]);
}
const skipFixtures = new Set<string>();
if (isWindows) {
// 100k recursive tail calls with exceptions — takes 17+ minutes under SDE
skipFixtures.add("tail-call-should-consume-stack-in-bbq.js");
}
const wasmFixtures = readdirSync(wasmFixturesDir)
.filter(f => f.endsWith(".js") && !skipFixtures.has(f))
.sort();
console.log();
console.log(`--- Wasm fixtures (BBQ/OMG) — ${wasmFixtures.length} tests`);
for (let i = 0; i < wasmFixtures.length; i++) {
const fixture = wasmFixtures[i];
await runTest(
`[${i + 1}/${wasmFixtures.length}] ${fixture}`,
["--preload", preloadPath, join(wasmFixturesDir, fixture)],
{ cwd: wasmFixturesDir },
);
}
} else {
console.log();
console.log("--- Skipping JIT stress fixtures (pass --jit-stress to enable)");
}
// Summary
console.log();
console.log("--- Summary");
console.log(` Passed: ${passed}`);
console.log(` Instruction failures: ${instructionFailures}`);
console.log(` Other failures: ${otherFailures} (warnings, not CPU instruction issues)`);
console.log();
if (instructionFailures > 0) {
console.error(" FAILED: Code uses unsupported CPU instructions.");
// Report to Buildkite annotations tab
const platform = isWindows ? "Windows x64" : isAarch64 ? "Linux aarch64" : "Linux x64";
const annotation = [
`<details>`,
`<summary>CPU instruction violation on ${platform}: ${instructionFailures} failed</summary>`,
`<p>The baseline build uses instructions not available on <code>${config.cpuDesc}</code>.</p>`,
`<ul>${failedTests.map(t => `<li><code>${t}</code></li>`).join("")}</ul>`,
`</details>`,
].join("\n");
Bun.spawnSync(["buildkite-agent", "annotate", "--append", "--style", "error", "--context", "verify-baseline"], {
stdin: new Blob([annotation]),
});
process.exit(1);
}
if (otherFailures > 0) {
console.log(" Some tests failed for reasons unrelated to CPU instructions.");
}
console.log(` All baseline verification checks passed on ${config.cpuDesc}.`);

View File

@@ -21,16 +21,3 @@ export const gc = fn({
},
ret: t.usize,
});
export const StringWidthOptions = t.dictionary({
countAnsiEscapeCodes: t.boolean.default(false),
ambiguousIsNarrow: t.boolean.default(true),
});
export const stringWidth = fn({
args: {
str: t.DOMString.default(""),
opts: StringWidthOptions.default({}),
},
ret: t.usize,
});

View File

@@ -34,6 +34,7 @@ pub const BunObject = struct {
pub const sha = toJSCallback(host_fn.wrapStaticMethod(Crypto.SHA512_256, "hash_", true));
pub const shellEscape = toJSCallback(Bun.shellEscape);
pub const shrink = toJSCallback(Bun.shrink);
pub const stringWidth = toJSCallback(Bun.stringWidth);
pub const sleepSync = toJSCallback(Bun.sleepSync);
pub const spawn = toJSCallback(host_fn.wrapStaticMethod(api.Subprocess, "spawn", false));
pub const spawnSync = toJSCallback(host_fn.wrapStaticMethod(api.Subprocess, "spawnSync", false));
@@ -179,6 +180,7 @@ pub const BunObject = struct {
@export(&BunObject.sha, .{ .name = callbackName("sha") });
@export(&BunObject.shellEscape, .{ .name = callbackName("shellEscape") });
@export(&BunObject.shrink, .{ .name = callbackName("shrink") });
@export(&BunObject.stringWidth, .{ .name = callbackName("stringWidth") });
@export(&BunObject.sleepSync, .{ .name = callbackName("sleepSync") });
@export(&BunObject.spawn, .{ .name = callbackName("spawn") });
@export(&BunObject.spawnSync, .{ .name = callbackName("spawnSync") });
@@ -1382,14 +1384,8 @@ pub fn getUnsafe(globalThis: *jsc.JSGlobalObject, _: *jsc.JSObject) jsc.JSValue
return UnsafeObject.create(globalThis);
}
pub fn stringWidth(str: bun.String, opts: gen.StringWidthOptions) usize {
if (str.length() == 0)
return 0;
if (opts.count_ansi_escape_codes)
return str.visibleWidth(!opts.ambiguous_is_narrow);
return str.visibleWidthExcludeANSIColors(!opts.ambiguous_is_narrow);
pub fn stringWidth(globalObject: *jsc.JSGlobalObject, callFrame: *jsc.CallFrame) bun.JSError!jsc.JSValue {
return bun.String.jsGetStringWidth(globalObject, callFrame);
}
/// EnvironmentVariables is runtime defined.

View File

@@ -1087,8 +1087,10 @@ pub const WindowsSpawnOptions = struct {
dup2: struct { out: bun.jsc.Subprocess.StdioKind, to: bun.jsc.Subprocess.StdioKind },
pub fn deinit(this: *const Stdio) void {
if (this.* == .buffer) {
this.buffer.closeAndDestroy();
switch (this.*) {
.buffer => |pipe| pipe.closeAndDestroy(),
.ipc => |pipe| pipe.closeAndDestroy(),
else => {},
}
}
};
@@ -1629,9 +1631,10 @@ pub fn spawnProcessWindows(
stdio.flags = uv.UV_INHERIT_FD;
stdio.data.fd = fd_i;
},
.ipc => |my_pipe| {
// ipc option inside stdin, stderr or stdout are not supported
bun.default_allocator.destroy(my_pipe);
.ipc => {
// ipc option inside stdin, stderr or stdout is not supported.
// Don't free the pipe here — the caller owns it and will
// clean it up via WindowsSpawnOptions.deinit().
stdio.flags = uv.UV_IGNORE;
},
.ignore => {
@@ -1829,7 +1832,7 @@ pub const sync = struct {
.ignore => .ignore,
.buffer => .{
.buffer = if (Environment.isWindows)
bun.handleOom(bun.default_allocator.create(bun.windows.libuv.Pipe)),
bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)),
},
};
}


@@ -851,6 +851,9 @@ pub fn getsockname(this: *Listener, globalThis: *jsc.JSGlobalObject, callFrame:
}
const out = callFrame.argumentsAsArray(1)[0];
if (!out.isObject()) {
return globalThis.throwInvalidArguments("Expected object", .{});
}
const socket = this.listener.uws;
var buf: [64]u8 = [_]u8{0} ** 64;


@@ -187,7 +187,7 @@ pub fn create(globalThis: *jsc.JSGlobalObject, socket: SocketType) *WindowsNamed
});
// named_pipe owns the pipe (PipeWriter owns the pipe and will close and deinit it)
this.named_pipe = uws.WindowsNamedPipe.from(bun.handleOom(bun.default_allocator.create(uv.Pipe)), .{
this.named_pipe = uws.WindowsNamedPipe.from(bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe)), .{
.ctx = this,
.ref_ctx = @ptrCast(&WindowsNamedPipeContext.ref),
.deref_ctx = @ptrCast(&WindowsNamedPipeContext.deref),
@@ -288,6 +288,8 @@ pub fn deinit(this: *WindowsNamedPipeContext) void {
bun.destroy(this);
}
const std = @import("std");
const bun = @import("bun");
const Output = bun.Output;
const jsc = bun.jsc;


@@ -235,10 +235,10 @@ pub const Stdio = union(enum) {
return .{ .err = .blob_used_as_out };
}
break :brk .{ .buffer = bun.handleOom(bun.default_allocator.create(uv.Pipe)) };
break :brk .{ .buffer = createZeroedPipe() };
},
.ipc => .{ .ipc = bun.handleOom(bun.default_allocator.create(uv.Pipe)) },
.capture, .pipe, .array_buffer, .readable_stream => .{ .buffer = bun.handleOom(bun.default_allocator.create(uv.Pipe)) },
.ipc => .{ .ipc = createZeroedPipe() },
.capture, .pipe, .array_buffer, .readable_stream => .{ .buffer = createZeroedPipe() },
.fd => |fd| .{ .pipe = fd },
.dup2 => .{ .dup2 = .{ .out = stdio.dup2.out, .to = stdio.dup2.to } },
.path => |pathlike| .{ .path = pathlike.slice() },
@@ -487,12 +487,18 @@ pub const Stdio = union(enum) {
}
};
/// Allocate a zero-initialized uv.Pipe. Zero-init ensures `pipe.loop` is null
/// for pipes that never reach `uv_pipe_init`, so `closeAndDestroy` can tell
/// whether `uv_close` is needed.
fn createZeroedPipe() *uv.Pipe {
return bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe));
}
const std = @import("std");
const bun = @import("bun");
const Environment = bun.Environment;
const Output = bun.Output;
const default_allocator = bun.default_allocator;
const uv = bun.windows.libuv;
const jsc = bun.jsc;


@@ -995,7 +995,7 @@ JSC_DEFINE_HOST_FUNCTION(functionFileURLToPath, (JSC::JSGlobalObject * globalObj
stderr BunObject_lazyPropCb_wrap_stderr DontDelete|PropertyCallback
stdin BunObject_lazyPropCb_wrap_stdin DontDelete|PropertyCallback
stdout BunObject_lazyPropCb_wrap_stdout DontDelete|PropertyCallback
stringWidth Generated::BunObject::jsStringWidth DontDelete|Function 2
stringWidth BunObject_callback_stringWidth DontDelete|Function 2
stripANSI jsFunctionBunStripANSI DontDelete|Function 1
wrapAnsi jsFunctionBunWrapAnsi DontDelete|Function 3
Terminal BunObject_lazyPropCb_wrap_Terminal DontDelete|PropertyCallback


@@ -31,8 +31,17 @@
#include <utility>
#include <vector>
#ifdef _WIN32
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wmicrosoft-include"
#endif
#define v8 real_v8
#define private public
#include "node/v8.h"
#undef private
#undef v8
#ifdef _WIN32
#pragma clang diagnostic pop
#endif


@@ -923,9 +923,9 @@ pub const SendQueue = struct {
pub fn windowsConfigureClient(this: *SendQueue, pipe_fd: bun.FileDescriptor) !void {
log("configureClient", .{});
const ipc_pipe = bun.handleOom(bun.default_allocator.create(uv.Pipe));
const ipc_pipe = bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe));
ipc_pipe.init(uv.Loop.get(), true).unwrap() catch |err| {
bun.default_allocator.destroy(ipc_pipe);
bun.destroy(ipc_pipe);
return err;
};
ipc_pipe.open(pipe_fd).unwrap() catch |err| {


@@ -564,8 +564,8 @@ pub fn runScriptsWithFilter(ctx: Command.Context) !noreturn {
.config = script,
.options = .{
.stdin = .ignore,
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.cwd = std.fs.path.dirname(script.package_json_path) orelse "",
.windows = if (Environment.isWindows) .{ .loop = bun.jsc.EventLoopHandle.init(event_loop) },
.stream = true,


@@ -762,8 +762,8 @@ pub fn run(ctx: Command.Context) !noreturn {
.color_idx = color_idx,
.options = .{
.stdin = .ignore,
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = bun.new(bun.windows.libuv.Pipe, std.mem.zeroes(bun.windows.libuv.Pipe)) },
.cwd = config.cwd,
.windows = if (Environment.isWindows) .{ .loop = bun.jsc.EventLoopHandle.init(event_loop) },
.stream = true,


@@ -142,6 +142,36 @@ function resolveComplexArgumentStrategy(
}
}
function collectStringTemps(args: readonly { type: TypeImpl }[]): Map<number, string[]> {
const result = new Map<number, string[]>();
// Use the same JS argument indexing as the arg processing loop:
// skip virtual args entirely, increment for everything else.
let jsArgIdx = 0;
for (const arg of args) {
const type = arg.type;
if (type.isVirtualArgument()) continue;
if (type.isIgnoredUndefinedType()) {
jsArgIdx++;
continue;
}
if (type.isStringType()) {
result.set(jsArgIdx, [cpp.nextTemporaryName("wtfString")]);
} else if (type.kind === "dictionary") {
const temps: string[] = [];
for (const field of type.data as DictionaryField[]) {
if (field.type.isStringType()) {
temps.push(cpp.nextTemporaryName("wtfString"));
}
}
if (temps.length > 0) {
result.set(jsArgIdx, temps);
}
}
jsArgIdx++;
}
return result;
}
function emitCppCallToVariant(name: string, variant: Variant, dispatchFunctionName: string) {
cpp.line(`auto& vm = JSC::getVM(global);`);
cpp.line(`auto throwScope = DECLARE_THROW_SCOPE(vm);`);
@@ -157,6 +187,16 @@ function emitCppCallToVariant(name: string, variant: Variant, dispatchFunctionNa
communicationStruct.emitCpp(cppInternal, communicationStruct.name());
}
// Hoist all WTF::String temps to the dispatch scope so they outlive any
// BunString values that reference them. Bun::toString() does not ref the
// StringImpl, so the WTF::String must stay alive until the dispatch call.
const hoistedTemps = collectStringTemps(variant.args);
for (const temps of hoistedTemps.values()) {
for (const temp of temps) {
cpp.line(`WTF::String ${temp};`);
}
}
let i = 0;
for (const arg of variant.args) {
const type = arg.type;
@@ -201,6 +241,8 @@ function emitCppCallToVariant(name: string, variant: Variant, dispatchFunctionNa
/** If the final representation may include null */
const isNullable = type.flags.optional && !("default" in type.flags);
const argTemps = hoistedTemps.get(i);
if (isOptionalToUser) {
if (needDeclare) {
addHeaderForType(type);
@@ -215,7 +257,8 @@ function emitCppCallToVariant(name: string, variant: Variant, dispatchFunctionNa
cpp.line(`if (!${jsValueRef}.${isUndefinedOrNull}()) {`);
}
cpp.indent();
emitConvertValue(storageLocation, arg.type, jsValueRef, exceptionContext, "assign");
const hoistedTemp = type.isStringType() ? argTemps?.[0] : undefined;
emitConvertValue(storageLocation, arg.type, jsValueRef, exceptionContext, "assign", hoistedTemp, argTemps);
cpp.dedent();
if ("default" in type.flags) {
cpp.line(`} else {`);
@@ -229,7 +272,16 @@ function emitCppCallToVariant(name: string, variant: Variant, dispatchFunctionNa
}
cpp.line(`}`);
} else {
emitConvertValue(storageLocation, arg.type, jsValueRef, exceptionContext, needDeclare ? "declare" : "assign");
const hoistedTemp = type.isStringType() ? argTemps?.[0] : undefined;
emitConvertValue(
storageLocation,
arg.type,
jsValueRef,
exceptionContext,
needDeclare ? "declare" : "assign",
hoistedTemp,
argTemps,
);
}
i += 1;
@@ -424,6 +476,8 @@ function emitConvertValue(
jsValueRef: string,
exceptionContext: ExceptionContext,
decl: "declare" | "assign",
hoistedTemp?: string,
dictStringTemps?: string[],
) {
if (decl === "declare") {
addHeaderForType(type);
@@ -473,8 +527,12 @@ function emitConvertValue(
case "USVString":
case "DOMString":
case "ByteString": {
const temp = cpp.nextTemporaryName("wtfString");
cpp.line(`WTF::String ${temp} = WebCore::convert<WebCore::IDL${type.kind}>(*global, ${jsValueRef});`);
const temp = hoistedTemp ?? cpp.nextTemporaryName("wtfString");
if (hoistedTemp) {
cpp.line(`${temp} = WebCore::convert<WebCore::IDL${type.kind}>(*global, ${jsValueRef});`);
} else {
cpp.line(`WTF::String ${temp} = WebCore::convert<WebCore::IDL${type.kind}>(*global, ${jsValueRef});`);
}
cpp.line(`RETURN_IF_EXCEPTION(throwScope, {});`);
if (decl === "declare") {
@@ -484,8 +542,12 @@ function emitConvertValue(
break;
}
case "UTF8String": {
const temp = cpp.nextTemporaryName("wtfString");
cpp.line(`WTF::String ${temp} = WebCore::convert<WebCore::IDLDOMString>(*global, ${jsValueRef});`);
const temp = hoistedTemp ?? cpp.nextTemporaryName("wtfString");
if (hoistedTemp) {
cpp.line(`${temp} = WebCore::convert<WebCore::IDLDOMString>(*global, ${jsValueRef});`);
} else {
cpp.line(`WTF::String ${temp} = WebCore::convert<WebCore::IDLDOMString>(*global, ${jsValueRef});`);
}
cpp.line(`RETURN_IF_EXCEPTION(throwScope, {});`);
if (decl === "declare") {
@@ -498,7 +560,13 @@ function emitConvertValue(
if (decl === "declare") {
cpp.line(`${type.cppName()} ${storageLocation};`);
}
cpp.line(`auto did_convert = convert${type.cppInternalName()}(&${storageLocation}, global, ${jsValueRef});`);
if (dictStringTemps && dictStringTemps.length > 0) {
cpp.line(
`auto did_convert = convert${type.cppInternalName()}(&${storageLocation}, global, ${jsValueRef}, ${dictStringTemps.join(", ")});`,
);
} else {
cpp.line(`auto did_convert = convert${type.cppInternalName()}(&${storageLocation}, global, ${jsValueRef});`);
}
cpp.line(`RETURN_IF_EXCEPTION(throwScope, {});`);
cpp.line(`if (!did_convert) return {};`);
break;
@@ -582,10 +650,22 @@ function emitConvertDictionaryFunction(type: TypeImpl) {
addHeaderForType(type);
// Build WTF::String& params for string fields (caller owns the storage).
const stringFieldParams: string[] = [];
for (const field of fields) {
if (field.type.isStringType()) {
stringFieldParams.push(`WTF::String& ${field.key}_str`);
}
}
cpp.line(`// Internal dictionary parse for ${type.name()}`);
cpp.line(
`bool convert${type.cppInternalName()}(${type.cppName()}* result, JSC::JSGlobalObject* global, JSC::JSValue value) {`,
);
const params = [
`${type.cppName()}* result`,
`JSC::JSGlobalObject* global`,
`JSC::JSValue value`,
...stringFieldParams,
];
cpp.line(`bool convert${type.cppInternalName()}(${params.join(", ")}) {`);
cpp.indent();
cpp.line(`auto& vm = JSC::getVM(global);`);
@@ -610,9 +690,12 @@ function emitConvertDictionaryFunction(type: TypeImpl) {
);
cpp.line(` RETURN_IF_EXCEPTION(throwScope, false);`);
cpp.line(`}`);
// For string fields, use the caller-owned WTF::String& ref so the
// string data outlives this function and the BunString in the result.
const hoistedTemp = fieldType.isStringType() ? `${key}_str` : undefined;
cpp.line(`if (!propValue.isUndefined()) {`);
cpp.indent();
emitConvertValue(`result->${key}`, fieldType, "propValue", { type: "none" }, "assign");
emitConvertValue(`result->${key}`, fieldType, "propValue", { type: "none" }, "assign", hoistedTemp);
cpp.dedent();
cpp.line(`} else {`);
cpp.indent();

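The hoisting logic above pairs each JS argument index with the WTF::String temporaries it will need. A minimal sketch of that index bookkeeping, using hypothetical `Arg` shapes rather than the real bindgen `TypeImpl` types: virtual args are skipped entirely, ignored-undefined args consume an index but get no temps, string args get one temp, and dictionaries get one temp per string field.

```typescript
// Illustrative only; names and shapes are ours, not the bindgen's.
type Arg =
  | { kind: "virtual" }
  | { kind: "ignoredUndefined" }
  | { kind: "string" }
  | { kind: "dictionary"; stringFields: string[] }
  | { kind: "other" };

function collectStringTemps(args: Arg[]): Map<number, string[]> {
  const result = new Map<number, string[]>();
  let jsArgIdx = 0;
  let tempCounter = 0;
  for (const arg of args) {
    if (arg.kind === "virtual") continue; // no JS slot at all
    if (arg.kind === "ignoredUndefined") {
      jsArgIdx++; // consumes a slot, needs no temp
      continue;
    }
    if (arg.kind === "string") {
      result.set(jsArgIdx, [`wtfString${tempCounter++}`]);
    } else if (arg.kind === "dictionary") {
      const temps = arg.stringFields.map(() => `wtfString${tempCounter++}`);
      if (temps.length > 0) result.set(jsArgIdx, temps);
    }
    jsArgIdx++;
  }
  return result;
}
```

The point of mirroring the arg-processing loop's indexing exactly is that a mismatch would hoist a temp for the wrong argument, reintroducing the dangling-StringImpl bug the hoisting exists to fix.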

@@ -1415,15 +1415,23 @@ pub const Pipe = extern struct {
return @ptrCast(this);
}
/// Close the pipe handle and then free it in the close callback.
/// Use this when a pipe has been init'd but needs to be destroyed
/// (e.g. when open() fails after init() succeeded).
/// Close the pipe handle (if needed) and then free it.
/// Handles all states: never-initialized (loop == null), already closing,
/// or active. After uv_pipe_init the handle is in the event loop's
/// handle_queue; freeing without uv_close corrupts that list.
pub fn closeAndDestroy(this: *@This()) void {
this.close(&onCloseDestroy);
if (this.loop == null) {
// Never initialized — safe to free directly.
bun.destroy(this);
} else if (!this.isClosing()) {
// Initialized and not yet closing — must uv_close first.
this.close(&onCloseDestroy);
}
// else: already closing — the pending close callback owns the lifetime.
}
fn onCloseDestroy(handle: *@This()) callconv(.c) void {
bun.default_allocator.destroy(handle);
bun.destroy(handle);
}
};
const union_unnamed_416 = extern union {

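The three-way check in `closeAndDestroy` above can be modeled as a small state machine: a handle that never reached `uv_pipe_init` (loop is null) can be freed directly; an active handle must be closed first so libuv unlinks it from the loop's handle_queue; a handle already closing is freed by the pending close callback. A sketch under those assumptions, with illustrative names rather than the real libuv bindings:

```typescript
// Models only the decision logic, not libuv itself.
type PipeState = "uninitialized" | "active" | "closing";

class FakePipe {
  destroyed = false; // "freed directly" path
  closeRequested = false; // "uv_close issued, callback frees later" path

  constructor(public state: PipeState) {}

  closeAndDestroy(): void {
    if (this.state === "uninitialized") {
      // Never entered the handle queue; safe to free directly.
      this.destroyed = true;
    } else if (this.state !== "closing") {
      // Active handle: must close first so the loop unlinks it.
      this.closeRequested = true;
      this.state = "closing";
    }
    // Already closing: the pending close callback owns the lifetime,
    // so this call does nothing.
  }
}
```

Freeing an active handle without the close step is exactly the handle-queue corruption the fix targets, since libuv would still hold a pointer into freed memory.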

@@ -167,18 +167,8 @@ fn onClose(this: *WindowsNamedPipe) void {
log("onClose", .{});
if (!this.flags.is_closed) {
this.flags.is_closed = true; // only call onClose once
// Stop reading and clear the timer to prevent further callbacks,
// but don't call deinit() here. The context (owner) will call
// named_pipe.deinit() when it runs its own deferred deinit.
// Calling deinit() synchronously here causes use-after-free when
// this callback is triggered from within the SSL wrapper's
// handleTraffic() call chain (the wrapper.deinit() frees the SSL
// state while we're still on the wrapper's call stack).
this.setTimeout(0);
if (this.writer.getStream()) |stream| {
_ = stream.readStop();
}
this.handlers.onClose(this.handlers.ctx);
this.deinit();
}
}
@@ -595,7 +585,6 @@ pub fn deinit(this: *WindowsNamedPipe) void {
wrapper.deinit();
this.wrapper = null;
}
this.incoming.clearAndFree(bun.default_allocator);
var ssl_error = this.ssl_error;
ssl_error.deinit();
this.ssl_error = .{};


@@ -383,6 +383,7 @@ pub const PackageInstall = struct {
&[_]bun.OSPathSlice{},
&[_]bun.OSPathSlice{},
) catch |err| return Result.fail(err, .opening_cache_dir, @errorReturnTrace());
walker_.resolve_unknown_entry_types = true;
defer walker_.deinit();
const FileCopier = struct {
@@ -520,6 +521,7 @@ pub const PackageInstall = struct {
else
&[_]bun.OSPathSlice{},
) catch |err| bun.handleOom(err);
state.walker.resolve_unknown_entry_types = true;
if (!Environment.isWindows) {
state.subdir = destbase.makeOpenPath(bun.span(destpath), .{


@@ -1011,6 +1011,7 @@ pub const PackageInstaller = struct {
.local_tarball => {
this.manager.enqueueTarballForReading(
dependency_id,
package_id,
alias.slice(this.lockfile.buffers.string_bytes.items),
resolution,
context,


@@ -126,6 +126,7 @@ pub fn enqueueTarballForDownload(
pub fn enqueueTarballForReading(
this: *PackageManager,
dependency_id: DependencyID,
package_id: PackageID,
alias: string,
resolution: *const Resolution,
task_context: TaskCallbackContext,
@@ -144,6 +145,8 @@ pub fn enqueueTarballForReading(
if (task_queue.found_existing) return;
const integrity = this.lockfile.packages.items(.meta)[package_id].integrity;
this.task_batch.push(ThreadPool.Batch.from(enqueueLocalTarball(
this,
task_id,
@@ -151,6 +154,7 @@ pub fn enqueueTarballForReading(
alias,
path,
resolution.*,
integrity,
)));
}
@@ -1133,6 +1137,7 @@ pub fn enqueueDependencyWithMainAndSuccessFn(
this.lockfile.str(&dependency.name),
url,
res,
.{},
)));
},
.remote => {
@@ -1294,6 +1299,7 @@ fn enqueueLocalTarball(
name: string,
path: string,
resolution: Resolution,
integrity: Integrity,
) *ThreadPool.Task {
var task = this.preallocated_resolve_tasks.get();
task.* = Task{
@@ -1313,6 +1319,7 @@ fn enqueueLocalTarball(
.cache_dir = this.getCacheDirectory(),
.temp_dir = this.getTemporaryDirectory().handle,
.dependency_id = dependency_id,
.integrity = integrity,
.url = strings.StringOrTinyString.initAppendIfNeeded(
path,
*FileSystem.FilenameStore,
@@ -1886,6 +1893,7 @@ const DependencyID = bun.install.DependencyID;
const ExtractTarball = bun.install.ExtractTarball;
const Features = bun.install.Features;
const FolderResolution = bun.install.FolderResolution;
const Integrity = bun.install.Integrity;
const Npm = bun.install.Npm;
const PackageID = bun.install.PackageID;
const PackageNameHash = bun.install.PackageNameHash;


@@ -193,6 +193,9 @@ pub fn processExtractedTarballPackage(
};
package.meta.setHasInstallScript(has_scripts);
if (data.integrity.tag.isSupported()) {
package.meta.integrity = data.integrity;
}
package = manager.lockfile.appendPackage(package) catch unreachable;
package_id.* = package.meta.id;


@@ -25,21 +25,23 @@ pub inline fn run(this: *const ExtractTarball, log: *logger.Log, bytes: []const
}
var result = try this.extract(log, bytes);
// Compute and store SHA-512 integrity hash for GitHub tarballs so the
// lockfile can pin the exact tarball content. On subsequent installs the
// hash stored in the lockfile is forwarded via this.integrity and verified
// Compute and store SHA-512 integrity hash for GitHub / URL / local tarballs
// so the lockfile can pin the exact tarball content. On subsequent installs
// the hash stored in the lockfile is forwarded via this.integrity and verified
// above, preventing a compromised server from silently swapping the tarball.
if (this.resolution.tag == .github) {
if (this.integrity.tag.isSupported()) {
// Re-installing with an existing lockfile: integrity was already
// verified above, propagate the known value to ExtractData so that
// the lockfile keeps it on re-serialisation.
result.integrity = this.integrity;
} else {
// First install (no integrity in the lockfile yet): compute it.
result.integrity = .{ .tag = .sha512 };
Crypto.SHA512.hash(bytes, result.integrity.value[0..Crypto.SHA512.digest]);
}
switch (this.resolution.tag) {
.github, .remote_tarball, .local_tarball => {
if (this.integrity.tag.isSupported()) {
// Re-installing with an existing lockfile: integrity was already
// verified above, propagate the known value to ExtractData so that
// the lockfile keeps it on re-serialisation.
result.integrity = this.integrity;
} else {
// First install (no integrity in the lockfile yet): compute it.
result.integrity = Integrity.forBytes(bytes);
}
},
else => {},
}
return result;
@@ -566,7 +568,6 @@ const string = []const u8;
const Npm = @import("./npm.zig");
const std = @import("std");
const Crypto = @import("../sha.zig").Hashers;
const FileSystem = @import("../fs.zig").FileSystem;
const Integrity = @import("./integrity.zig").Integrity;
const Resolution = @import("./resolution.zig").Resolution;

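The verify-or-compute branch above can be sketched in a few lines: if the lockfile already carries a supported hash (verified earlier in `run`), propagate it so re-serialisation keeps it; otherwise compute sha512 over the raw tarball bytes on first install. Hypothetical shapes, not Bun's actual types:

```typescript
import { createHash } from "node:crypto";

// Minimal model of the integrity flow for tarball dependencies.
interface Integrity {
  tag: "unknown" | "sha512";
  value: string; // base64 digest when tag is "sha512"
}

function resolveIntegrity(existing: Integrity, tarballBytes: Buffer): Integrity {
  if (existing.tag === "sha512") {
    // Re-install with a lockfile: hash was already verified upstream,
    // just keep the known value.
    return existing;
  }
  // First install: pin the exact bytes we just downloaded.
  return {
    tag: "sha512",
    value: createHash("sha512").update(tarballBytes).digest("base64"),
  };
}
```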

@@ -209,6 +209,9 @@ pub const ExtractData = struct {
path: string = "",
buf: []u8 = "",
} = null,
/// Integrity hash computed from the raw tarball bytes.
/// Used for HTTPS/local tarball dependencies where the hash
/// is not available from a registry manifest.
integrity: Integrity = .{},
};
@@ -270,9 +273,9 @@ pub const ExternalStringList = external.ExternalStringList;
pub const ExternalStringMap = external.ExternalStringMap;
pub const VersionSlice = external.VersionSlice;
pub const Integrity = @import("./integrity.zig").Integrity;
pub const Dependency = @import("./dependency.zig");
pub const Behavior = @import("./dependency.zig").Behavior;
pub const Integrity = @import("./integrity.zig").Integrity;
pub const Lockfile = @import("./lockfile.zig");
pub const PatchedDep = Lockfile.PatchedDep;


@@ -180,6 +180,14 @@ pub const Integrity = extern struct {
}
}
/// Compute a sha512 integrity hash from raw bytes (e.g. a downloaded tarball).
pub fn forBytes(bytes: []const u8) Integrity {
const len = std.crypto.hash.sha2.Sha512.digest_length;
var value: [digest_buf_len]u8 = empty_digest_buf;
Crypto.SHA512.hash(bytes, value[0..len]);
return .{ .tag = .sha512, .value = value };
}
pub fn verify(this: *const Integrity, bytes: []const u8) bool {
return @call(bun.callmod_inline, verifyByTag, .{ this.tag, bytes, &this.value });
}

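The string form this produces in bun.lock follows npm's Subresource-Integrity convention: the base64 of the raw SHA-512 digest with a `sha512-` prefix. A sketch (the helper name is ours, not Bun's):

```typescript
import { createHash } from "node:crypto";

// SRI-style integrity string: "sha512-" + base64(SHA-512(bytes)).
// A 64-byte digest always base64-encodes to 88 characters.
function integrityForBytes(bytes: Buffer): string {
  return "sha512-" + createHash("sha512").update(bytes).digest("base64");
}
```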

@@ -1082,6 +1082,7 @@ pub fn installIsolatedPackages(
.local_tarball => {
manager.enqueueTarballForReading(
dep_id,
pkg_id,
dep.name.slice(string_buf),
&pkg_res,
ctx,


@@ -12,12 +12,16 @@ pub const FileCopier = struct {
return .{
.src_path = src_path,
.dest_subpath = dest_subpath,
.walker = try .walk(
src_dir,
bun.default_allocator,
&.{},
skip_dirnames,
),
.walker = walker: {
var w = try Walker.walk(
src_dir,
bun.default_allocator,
&.{},
skip_dirnames,
);
w.resolve_unknown_entry_types = true;
break :walker w;
},
};
}


@@ -15,12 +15,16 @@ pub fn init(
.src_dir = folder_dir,
.src = src,
.dest = dest,
.walker = try .walk(
folder_dir,
bun.default_allocator,
&.{},
skip_dirnames,
),
.walker = walker: {
var w = try Walker.walk(
folder_dir,
bun.default_allocator,
&.{},
skip_dirnames,
);
w.resolve_unknown_entry_types = true;
break :walker w;
},
};
}


@@ -187,8 +187,8 @@ pub const LifecycleScriptSubprocess = struct {
null,
};
if (Environment.isWindows) {
this.stdout.source = .{ .pipe = bun.handleOom(bun.default_allocator.create(uv.Pipe)) };
this.stderr.source = .{ .pipe = bun.handleOom(bun.default_allocator.create(uv.Pipe)) };
this.stdout.source = .{ .pipe = bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe)) };
this.stderr.source = .{ .pipe = bun.new(uv.Pipe, std.mem.zeroes(uv.Pipe)) };
}
const spawn_options = bun.spawn.SpawnOptions{
.stdin = if (this.foreground)


@@ -535,7 +535,13 @@ pub const Stringifier = struct {
&path_buf,
);
try writer.writeByte(']');
if (pkg_meta.integrity.tag.isSupported()) {
try writer.print(", \"{f}\"]", .{
pkg_meta.integrity,
});
} else {
try writer.writeByte(']');
}
},
.remote_tarball => {
try writer.print("[\"{f}@{f}\", ", .{
@@ -558,7 +564,13 @@ pub const Stringifier = struct {
&path_buf,
);
try writer.writeByte(']');
if (pkg_meta.integrity.tag.isSupported()) {
try writer.print(", \"{f}\"]", .{
pkg_meta.integrity,
});
} else {
try writer.writeByte(']');
}
},
.symlink => {
try writer.print("[\"{f}@link:{f}\", ", .{
@@ -1876,6 +1888,16 @@ pub fn parseIntoBinaryLockfile(
pkg.meta.integrity = Integrity.parse(integrity_str);
},
.local_tarball, .remote_tarball => {
// integrity is optional for tarball deps (backward compat)
if (i < pkg_info.len) {
const integrity_expr = pkg_info.at(i);
if (integrity_expr.asString(allocator)) |integrity_str| {
pkg.meta.integrity = Integrity.parse(integrity_str);
i += 1;
}
}
},
inline .git, .github => |tag| {
// .bun-tag
if (i >= pkg_info.len) {


@@ -1200,7 +1200,7 @@ pub const WindowsBufferedReader = struct {
fn onPipeClose(handle: *uv.Pipe) callconv(.c) void {
const this = bun.cast(*uv.Pipe, handle.data);
bun.default_allocator.destroy(this);
bun.destroy(this);
}
fn onTTYClose(handle: *uv.uv_tty_t) callconv(.c) void {


@@ -210,11 +210,11 @@ pub const Source = union(enum) {
pub fn openPipe(loop: *uv.Loop, fd: bun.FileDescriptor) bun.sys.Maybe(*Source.Pipe) {
log("openPipe (fd = {f})", .{fd});
const pipe = bun.default_allocator.create(Source.Pipe) catch |err| bun.handleOom(err);
const pipe = bun.new(Source.Pipe, std.mem.zeroes(Source.Pipe));
// we should never init using IPC here see ipc.zig
switch (pipe.init(loop, false)) {
.err => |err| {
bun.default_allocator.destroy(pipe);
bun.destroy(pipe);
return .{ .err = err };
},
else => {},


@@ -108,11 +108,6 @@ export function overridableRequire(this: JSCommonJSModule, originalId: string, o
} catch (exception) {
// Since the ESM code is mostly JS, we need to handle exceptions here.
$requireMap.$delete(id);
// Also remove the failed module from the ESM registry so that
// a subsequent import() can re-evaluate it from scratch instead
// of finding the partially-initialized module entry.
// https://github.com/oven-sh/bun/issues/27287
Loader.registry.$delete(id);
throw exception;
}
@@ -326,11 +321,6 @@ export function requireESMFromHijackedExtension(this: JSCommonJSModule, id: stri
} catch (exception) {
// Since the ESM code is mostly JS, we need to handle exceptions here.
$requireMap.$delete(id);
// Also remove the failed module from the ESM registry so that
// a subsequent import() can re-evaluate it from scratch instead
// of finding the partially-initialized module entry.
// https://github.com/oven-sh/bun/issues/27287
Loader.registry.$delete(id);
throw exception;
}


@@ -1093,15 +1093,41 @@ pub const String = extern struct {
extern fn JSC__createRangeError(*jsc.JSGlobalObject, str: *const String) jsc.JSValue;
pub fn jsGetStringWidth(globalObject: *jsc.JSGlobalObject, callFrame: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const argument = callFrame.argument(0);
const str = try argument.toJSString(globalObject);
const view = str.view(globalObject);
const args = callFrame.argumentsAsArray(2);
const argument = args[0];
const opts_val = args[1];
if (argument == .zero or argument.isUndefined()) {
return .jsNumber(@as(i32, 0));
}
const js_str = try argument.toJSString(globalObject);
const view = js_str.view(globalObject);
if (view.isEmpty()) {
return .jsNumber(@as(i32, 0));
}
const width = bun.String.init(view).visibleWidth(false);
const str = bun.String.init(view);
// Parse options: { countAnsiEscapeCodes?: bool, ambiguousIsNarrow?: bool }
var count_ansi: bool = false;
var ambiguous_is_narrow: bool = true;
if (opts_val.isObject()) {
if (try opts_val.getTruthyComptime(globalObject, "countAnsiEscapeCodes")) |v| {
count_ansi = v.toBoolean();
}
if (try opts_val.getTruthyComptime(globalObject, "ambiguousIsNarrow")) |v| {
ambiguous_is_narrow = v.toBoolean();
}
}
const width = if (count_ansi)
str.visibleWidth(!ambiguous_is_narrow)
else
str.visibleWidthExcludeANSIColors(!ambiguous_is_narrow);
return .jsNumber(width);
}

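A rough model of what the default `countAnsiEscapeCodes: false` path implies: CSI sequences such as `\x1b[31m` are stripped before the width is measured. This sketch covers common SGR/CSI sequences only and uses code-point count as a stand-in for terminal cell width; Bun's real implementation handles more escape forms plus East Asian (ambiguous) width.

```typescript
// Common CSI sequences: ESC [ <params> <final byte>.
const CSI = /\x1b\[[0-9;]*[A-Za-z]/g;

function visibleWidthExcludeAnsi(s: string): number {
  // Code points, not grapheme clusters or cell widths; illustration only.
  return [...s.replace(CSI, "")].length;
}
```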

@@ -6,6 +6,7 @@ skip_filenames: []const u64 = &[_]u64{},
skip_dirnames: []const u64 = &[_]u64{},
skip_all: []const u64 = &[_]u64{},
seed: u64 = 0,
resolve_unknown_entry_types: bool = false,
const NameBufferList = std.array_list.Managed(bun.OSPathChar);
@@ -38,7 +39,22 @@ pub fn next(self: *Walker) bun.sys.Maybe(?WalkerEntry) {
.err => |err| return .initErr(err),
.result => |res| {
if (res) |base| {
switch (base.kind) {
// Some filesystems (NFS, FUSE, bind mounts) don't provide
// d_type and return DT_UNKNOWN. Optionally resolve via
// fstatat so callers get accurate types for recursion.
// This only affects POSIX; Windows always provides types.
const kind: @TypeOf(base.kind) = if (comptime !Environment.isWindows)
(if (base.kind == .unknown and self.resolve_unknown_entry_types) brk: {
const dir_fd = top.iter.iter.dir;
break :brk switch (bun.sys.lstatat(dir_fd, base.name.sliceAssumeZ())) {
.result => |stat_buf| bun.sys.kindFromMode(stat_buf.mode),
.err => continue, // skip entries we can't stat
};
} else base.kind)
else
base.kind;
switch (kind) {
.directory => {
if (std.mem.indexOfScalar(
u64,
@@ -78,7 +94,7 @@ pub fn next(self: *Walker) bun.sys.Maybe(?WalkerEntry) {
const cur_len = self.name_buffer.items.len;
bun.handleOom(self.name_buffer.append(0));
if (base.kind == .directory) {
if (kind == .directory) {
const new_dir = switch (bun.openDirForIterationOSPath(top.iter.iter.dir, base.name.slice())) {
.result => |fd| fd,
.err => |err| return .initErr(err),
@@ -95,7 +111,7 @@ pub fn next(self: *Walker) bun.sys.Maybe(?WalkerEntry) {
.dir = top.iter.iter.dir,
.basename = self.name_buffer.items[dirname_len..cur_len :0],
.path = self.name_buffer.items[0..cur_len :0],
.kind = base.kind,
.kind = kind,
});
} else {
var item = self.stack.pop().?;

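The `DT_UNKNOWN` fallback above has a direct analogue in Node terms: when a directory entry's type is not known (some NFS/FUSE/bind-mount filesystems leave `d_type` unset), classify it with an `lstat` on the entry instead. A sketch, assuming the caller supplies the parent directory path; Node often resolves unknown types itself, so this is illustration, not a required workaround:

```typescript
import { lstatSync } from "node:fs";
import { join } from "node:path";

function entryIsDirectory(dir: string, name: string): boolean {
  try {
    // lstat (not stat) so symlinks are not followed, matching the walker.
    return lstatSync(join(dir, name)).isDirectory();
  } catch {
    // Mirror the walker's behavior: skip entries we can't stat.
    return false;
  }
}
```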

@@ -13,7 +13,7 @@ exports[`should write plaintext lockfiles 1`] = `
},
},
"packages": {
"dummy-package": ["bar@./bar-0.0.2.tgz", {}],
"dummy-package": ["bar@./bar-0.0.2.tgz", {}, "sha512-DXWxn8qZ4n87XMJjwZUdYPnsrl8Ntz66PudFoxDVkaPEkZBBzENAKsJPgbBacD782W8RwD/v4mjwVyqlPpQ59w=="],
}
}
"


@@ -0,0 +1,491 @@
import { file, spawn } from "bun";
import { afterAll, beforeAll, describe, expect, it, setDefaultTimeout } from "bun:test";
import { rm, writeFile } from "fs/promises";
import { bunExe, bunEnv as env, readdirSorted } from "harness";
import { join } from "path";
import {
createTestContext,
destroyTestContext,
dummyAfterAll,
dummyBeforeAll,
dummyRegistryForContext,
setContextHandler,
type TestContext,
} from "./dummy.registry";
beforeAll(() => {
setDefaultTimeout(1000 * 60 * 5);
dummyBeforeAll();
});
afterAll(dummyAfterAll);
// Helper function that sets up test context and ensures cleanup
async function withContext(
opts: { linker?: "hoisted" | "isolated" } | undefined,
fn: (ctx: TestContext) => Promise<void>,
): Promise<void> {
const ctx = await createTestContext(opts ? { linker: opts.linker! } : undefined);
try {
await fn(ctx);
} finally {
destroyTestContext(ctx);
}
}
// Default context options for most tests
const defaultOpts = { linker: "hoisted" as const };
describe.concurrent("tarball integrity", () => {
it("should store integrity hash for tarball URL in text lockfile", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
}),
);
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stdin: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
// Read the text lockfile and verify integrity hash is present for the tarball package
const lockContent = await file(join(ctx.package_dir, "bun.lock")).text();
// bun.lock uses trailing commas (not strict JSON), so match with regex
expect(lockContent).toMatch(/"baz":\s*\[.*"sha512-[A-Za-z0-9+/]+=*"\]/s);
});
});
it("should store integrity hash for local tarball in text lockfile", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: join(import.meta.dir, "baz-0.0.3.tgz"),
},
}),
);
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stdin: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
// Read the text lockfile and verify integrity hash is present for the local tarball package
const lockContent = await file(join(ctx.package_dir, "bun.lock")).text();
expect(lockContent).toMatch(/"baz":\s*\[.*"sha512-[A-Za-z0-9+/]+=*"\]/s);
});
});
it("should store consistent integrity hash for tarball URL across reinstalls", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
}),
);
// First install to generate lockfile with integrity
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
}
// Read and verify integrity hash exists
const lockContent1 = await file(join(ctx.package_dir, "bun.lock")).text();
const integrityMatch1 = lockContent1.match(/"(sha512-[A-Za-z0-9+/]+=*)"/);
expect(integrityMatch1).not.toBeNull();
const integrity1 = integrityMatch1![1];
// Delete lockfile and node_modules, reinstall from scratch
await rm(join(ctx.package_dir, "bun.lock"), { force: true });
await rm(join(ctx.package_dir, "node_modules"), { recursive: true, force: true });
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
}
// Verify the same integrity hash was computed
const lockContent2 = await file(join(ctx.package_dir, "bun.lock")).text();
const integrityMatch2 = lockContent2.match(/"(sha512-[A-Za-z0-9+/]+=*)"/);
expect(integrityMatch2).not.toBeNull();
expect(integrityMatch2![1]).toBe(integrity1);
});
});
it("should fail integrity check when tarball URL content changes", async () => {
await withContext(defaultOpts, async ctx => {
// Serve baz-0.0.3.tgz on first install, then baz-0.0.5.tgz (different content) on second
let requestCount = 0;
setContextHandler(ctx, async request => {
const url = request.url;
if (url.endsWith(".tgz")) {
requestCount++;
// First request: serve baz-0.0.3.tgz, subsequent: serve baz-0.0.5.tgz (different content)
const tgzFile = requestCount <= 1 ? "baz-0.0.3.tgz" : "baz-0.0.5.tgz";
return new Response(file(join(import.meta.dir, tgzFile)));
}
return new Response("Not found", { status: 404 });
});
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
}),
);
// First install - succeeds, stores integrity hash
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
}
// Verify integrity hash was stored
const lockContent = await file(join(ctx.package_dir, "bun.lock")).text();
expect(lockContent).toMatch(/"sha512-[A-Za-z0-9+/]+=*"/);
// Remove node_modules to force re-download
await rm(join(ctx.package_dir, "node_modules"), { recursive: true, force: true });
// Second install - server now returns different tarball, integrity should fail
{
await using proc = spawn({
cmd: [bunExe(), "install"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
const out = await proc.stdout.text();
expect(err + out).toContain("Integrity check failed");
expect(await proc.exited).toBe(1);
}
});
});
it("should install successfully from text lockfile without integrity hash (backward compat)", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
// Write a text lockfile WITHOUT integrity hash (old format / backward compat)
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
}),
);
await writeFile(
join(ctx.package_dir, "bun.lock"),
JSON.stringify({
lockfileVersion: 1,
configVersion: 1,
workspaces: {
"": {
name: "foo",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
},
},
packages: {
baz: [`baz@${ctx.registry_url}baz-0.0.3.tgz`, { bin: { "baz-run": "index.js" } }],
},
}),
);
// Install with the old-format lockfile - should succeed without errors
await using proc = spawn({
cmd: [bunExe(), "install"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
const out = await proc.stdout.text();
// Should not contain any integrity-related errors
expect(err).not.toContain("Integrity check failed");
expect(err).not.toContain("error:");
expect(await proc.exited).toBe(0);
// Package should be installed
expect(await readdirSorted(join(ctx.package_dir, "node_modules", "baz"))).toEqual(["index.js", "package.json"]);
});
});
it("should add integrity hash to lockfile when re-resolving tarball dep", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
}),
);
// Fresh install (no existing lockfile) should produce integrity hash
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
// The newly generated lockfile should have the integrity hash
const lockContent = await file(join(ctx.package_dir, "bun.lock")).text();
expect(lockContent).toMatch(/"baz":\s*\[.*"sha512-[A-Za-z0-9+/]+=*"\]/s);
});
});
it("should store consistent integrity hash for local tarball across reinstalls", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: join(import.meta.dir, "baz-0.0.3.tgz"),
},
}),
);
// First install
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
}
const lockContent1 = await file(join(ctx.package_dir, "bun.lock")).text();
const integrityMatch1 = lockContent1.match(/"(sha512-[A-Za-z0-9+/]+=*)"/);
expect(integrityMatch1).not.toBeNull();
const integrity1 = integrityMatch1![1];
// Delete lockfile and node_modules, reinstall
await rm(join(ctx.package_dir, "bun.lock"), { force: true });
await rm(join(ctx.package_dir, "node_modules"), { recursive: true, force: true });
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).toContain("Saved lockfile");
expect(await proc.exited).toBe(0);
}
const lockContent2 = await file(join(ctx.package_dir, "bun.lock")).text();
const integrityMatch2 = lockContent2.match(/"(sha512-[A-Za-z0-9+/]+=*)"/);
expect(integrityMatch2).not.toBeNull();
expect(integrityMatch2![1]).toBe(integrity1);
});
});
it("should produce same integrity hash for same tarball via URL and local path", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
// Install via URL
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: `${ctx.registry_url}baz-0.0.3.tgz`,
},
}),
);
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
expect(await proc.exited).toBe(0);
}
const lockContent1 = await file(join(ctx.package_dir, "bun.lock")).text();
const integrityMatch1 = lockContent1.match(/"(sha512-[A-Za-z0-9+/]+=*)"/);
expect(integrityMatch1).not.toBeNull();
const urlIntegrity = integrityMatch1![1];
// Clean up
await rm(join(ctx.package_dir, "bun.lock"), { force: true });
await rm(join(ctx.package_dir, "node_modules"), { recursive: true, force: true });
// Install via local path
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: join(import.meta.dir, "baz-0.0.3.tgz"),
},
}),
);
{
await using proc = spawn({
cmd: [bunExe(), "install", "--save-text-lockfile"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
expect(await proc.exited).toBe(0);
}
const lockContent2 = await file(join(ctx.package_dir, "bun.lock")).text();
const integrityMatch2 = lockContent2.match(/"(sha512-[A-Za-z0-9+/]+=*)"/);
expect(integrityMatch2).not.toBeNull();
expect(integrityMatch2![1]).toBe(urlIntegrity);
});
});
it("should install successfully from text lockfile without integrity hash for local tarball (backward compat)", async () => {
await withContext(defaultOpts, async ctx => {
const urls: string[] = [];
setContextHandler(ctx, dummyRegistryForContext(ctx, urls));
const tgzPath = join(import.meta.dir, "baz-0.0.3.tgz");
await writeFile(
join(ctx.package_dir, "package.json"),
JSON.stringify({
name: "foo",
version: "0.0.1",
dependencies: {
baz: tgzPath,
},
}),
);
await writeFile(
join(ctx.package_dir, "bun.lock"),
JSON.stringify({
lockfileVersion: 1,
configVersion: 1,
workspaces: {
"": {
name: "foo",
dependencies: {
baz: tgzPath,
},
},
},
packages: {
baz: [`baz@${tgzPath}`, { bin: { "baz-run": "index.js" } }],
},
}),
);
await using proc = spawn({
cmd: [bunExe(), "install"],
cwd: ctx.package_dir,
stdout: "pipe",
stderr: "pipe",
env,
});
const err = await proc.stderr.text();
expect(err).not.toContain("Integrity check failed");
expect(err).not.toContain("error:");
expect(await proc.exited).toBe(0);
expect(await readdirSorted(join(ctx.package_dir, "node_modules", "baz"))).toEqual(["index.js", "package.json"]);
});
});
});
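The regexes in the tests above match Subresource Integrity (SRI) strings. An SRI `sha512` value is simply `"sha512-"` followed by the base64 of the tarball's SHA-512 digest, which is why the tests match `[A-Za-z0-9+/]+=*`. A minimal sketch of computing one (for illustration only, not Bun's internal implementation):

```typescript
import { createHash } from "node:crypto";

// Compute an SRI-format integrity string for arbitrary tarball bytes.
// "sha512-" + base64(sha512(data)) — base64 uses A-Z, a-z, 0-9, +, /
// with trailing "=" padding, matching the lockfile regexes above.
function sriSha512(data: Uint8Array): string {
  return "sha512-" + createHash("sha512").update(data).digest("base64");
}

const integrity = sriSha512(new Uint8Array([1, 2, 3]));
console.log(/^sha512-[A-Za-z0-9+/]+=*$/.test(integrity)); // true
```

Because the hash is a pure function of the tarball bytes, reinstalls of identical content yield identical integrity strings — the property the "consistent integrity hash across reinstalls" tests rely on.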


@@ -68,9 +68,13 @@ it("should not print anything to stderr when running bun.lockb", async () => {
});
const stdoutOutput = await stdout.text();
expect(stdoutOutput).toBe(
`# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.\n# yarn lockfile v1\n# bun ./bun.lockb --hash: 8B7A1C2DA8966A48-f4830e6e283fffe9-DE5BD0E91FD9910F-f0bf88071b3f7ec9\n\n\n\"bar@file:./bar-0.0.2.tgz\":\n version \"./bar-0.0.2.tgz\"\n resolved \"./bar-0.0.2.tgz\"\n`,
);
expect(stdoutOutput).toContain("# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.");
expect(stdoutOutput).toContain("# yarn lockfile v1");
expect(stdoutOutput).toContain("# bun ./bun.lockb --hash:");
expect(stdoutOutput).toContain('"bar@file:./bar-0.0.2.tgz":');
expect(stdoutOutput).toContain(' version "./bar-0.0.2.tgz"');
expect(stdoutOutput).toContain(' resolved "./bar-0.0.2.tgz"');
expect(stdoutOutput).toContain(" integrity sha512-");
const stderrOutput = await stderr.text();
expect(stderrOutput).toBe("");


@@ -0,0 +1,49 @@
import { expect, test } from "bun:test";
test("Listener.getsockname works with an object argument", () => {
using listener = Bun.listen({
hostname: "localhost",
port: 0,
socket: {
data() {},
},
});
const out: Record<string, unknown> = {};
const result = listener.getsockname(out);
expect(result).toBeUndefined(); // returns undefined, populates object in-place
expect(out).toEqual(
expect.objectContaining({
family: expect.any(String),
address: expect.any(String),
port: expect.any(Number),
}),
);
});
test("Listener.getsockname throws with non-object argument", () => {
using listener = Bun.listen({
hostname: "localhost",
port: 0,
socket: {
data() {},
},
});
expect(() => (listener as any).getsockname(123)).toThrow();
expect(() => (listener as any).getsockname("foo")).toThrow();
});
test("Listener.getsockname throws with no argument", () => {
using listener = Bun.listen({
hostname: "localhost",
port: 0,
socket: {
data() {},
},
});
// Previously crashed with null pointer dereference in BunString.cpp
// when called without an object argument. Now it should throw a TypeError.
expect(() => (listener as any).getsockname()).toThrow();
});


@@ -0,0 +1,203 @@
// Exercises Bun's SIMD code paths to verify the baseline binary doesn't
// emit instructions beyond its CPU target (no AVX on x64, no LSE/SVE on aarch64).
//
// Each test uses inputs large enough to hit vectorized fast paths (>= 16 bytes
// for @Vector(16, u8), >= 64 bytes for wider paths) and validates correctness
// to catch both SIGILL and miscompilation from wrong instruction lowering.
import { describe, expect, test } from "bun:test";
// Use Buffer.alloc instead of "x".repeat() — repeat is slow in debug JSC builds.
const ascii256 = Buffer.alloc(256, "a").toString();
const ascii1k = Buffer.alloc(1024, "x").toString();
describe("escapeHTML — @Vector(16, u8) gated by enableSIMD", () => {
test("clean passthrough", () => {
expect(Bun.escapeHTML(ascii256)).toBe(ascii256);
});
test("ampersand in middle", () => {
const input = ascii256 + "&" + ascii256;
const escaped = Bun.escapeHTML(input);
expect(escaped).toContain("&amp;");
// The raw "&" should have been replaced — only "&amp;" should remain
expect(escaped.replaceAll("&amp;", "").includes("&")).toBe(false);
});
test("all special chars", () => {
const input = '<div class="test">' + ascii256 + "</div>";
const escaped = Bun.escapeHTML(input);
expect(escaped).toContain("&lt;");
expect(escaped).toContain("&gt;");
expect(escaped).toContain("&quot;");
});
});
describe("stringWidth — @Vector(16, u8) ungated", () => {
test("ascii", () => {
expect(Bun.stringWidth(ascii256)).toBe(256);
});
test("empty", () => {
expect(Bun.stringWidth("")).toBe(0);
});
test("tabs", () => {
expect(Bun.stringWidth(Buffer.alloc(32, "\t").toString())).toBe(0);
});
test("mixed printable and zero-width", () => {
const mixed = "hello" + "\x00".repeat(64) + "world";
expect(Bun.stringWidth(mixed)).toBe(10);
});
});
describe("Buffer hex encoding — @Vector(16, u8) gated by enableSIMD", () => {
test.each([16, 32, 64, 128, 256])("size %d", size => {
const buf = Buffer.alloc(size, 0xab);
const hex = buf.toString("hex");
expect(hex.length).toBe(size * 2);
expect(hex).toBe("ab".repeat(size));
});
test("all byte values", () => {
const varied = Buffer.alloc(256);
for (let i = 0; i < 256; i++) varied[i] = i;
const hex = varied.toString("hex");
expect(hex).toStartWith("000102030405");
expect(hex).toEndWith("fdfeff");
});
});
describe("base64 — simdutf runtime dispatch", () => {
test("ascii roundtrip", () => {
const encoded = btoa(ascii1k);
expect(atob(encoded)).toBe(ascii1k);
});
test("binary roundtrip", () => {
const binary = String.fromCharCode(...Array.from({ length: 256 }, (_, i) => i));
expect(atob(btoa(binary))).toBe(binary);
});
});
describe("TextEncoder/TextDecoder — simdutf runtime dispatch", () => {
const encoder = new TextEncoder();
const decoder = new TextDecoder();
test("ascii roundtrip", () => {
const bytes = encoder.encode(ascii1k);
expect(bytes.length).toBe(1024);
expect(decoder.decode(bytes)).toBe(ascii1k);
});
test("mixed ascii + multibyte", () => {
const mixed = ascii256 + "\u00e9\u00e9\u00e9" + ascii256 + "\u2603\u2603" + ascii256;
expect(decoder.decode(encoder.encode(mixed))).toBe(mixed);
});
test("emoji surrogate pairs", () => {
const emoji = "\u{1F600}".repeat(64);
expect(decoder.decode(encoder.encode(emoji))).toBe(emoji);
});
});
describe("decodeURIComponent — SIMD % scanning", () => {
test("clean passthrough", () => {
const clean = Buffer.alloc(256, "a").toString();
expect(decodeURIComponent(clean)).toBe(clean);
});
test("encoded at various positions", () => {
const input = "a".repeat(128) + "%20" + "b".repeat(128) + "%21";
expect(decodeURIComponent(input)).toBe("a".repeat(128) + " " + "b".repeat(128) + "!");
});
test("heavy utf8 encoding", () => {
const input = Array.from({ length: 64 }, () => "%C3%A9").join("");
expect(decodeURIComponent(input)).toBe("\u00e9".repeat(64));
});
});
describe("URL parsing — Highway indexOfChar/indexOfAny", () => {
test("long URL with all components", () => {
const longPath = "/" + "segment/".repeat(32) + "end";
const url = new URL("https://user:pass@example.com:8080" + longPath + "?key=value&foo=bar#section");
expect(url.protocol).toBe("https:");
expect(url.hostname).toBe("example.com");
expect(url.port).toBe("8080");
expect(url.pathname).toBe(longPath);
expect(url.search).toBe("?key=value&foo=bar");
expect(url.hash).toBe("#section");
});
});
describe("JSON — JS lexer SIMD string scanning", () => {
test("large object roundtrip", () => {
const obj: Record<string, string> = {};
for (let i = 0; i < 100; i++) {
obj["key_" + Buffer.alloc(32, "a").toString() + "_" + i] = "value_" + Buffer.alloc(64, "b").toString() + "_" + i;
}
const parsed = JSON.parse(JSON.stringify(obj));
expect(Object.keys(parsed).length).toBe(100);
expect(parsed["key_" + Buffer.alloc(32, "a").toString() + "_0"]).toBe(
"value_" + Buffer.alloc(64, "b").toString() + "_0",
);
});
test("string with escape sequences", () => {
const original = { msg: 'quote"here\nand\ttab' + Buffer.alloc(256, "x").toString() };
const reparsed = JSON.parse(JSON.stringify(original));
expect(reparsed.msg).toBe(original.msg);
});
});
describe("HTTP parsing — llhttp SSE4.2 PCMPESTRI", () => {
test("long headers", async () => {
const longHeaderValue = Buffer.alloc(512, "v").toString();
using server = Bun.serve({
port: 0,
fetch(req) {
return new Response(req.headers.get("X-Test-Header") || "missing");
},
});
const resp = await fetch(`http://localhost:${server.port}/` + "path/".repeat(20), {
headers: {
"X-Test-Header": longHeaderValue,
"X-Header-A": Buffer.alloc(64, "a").toString(),
"X-Header-B": Buffer.alloc(64, "b").toString(),
"X-Header-C": Buffer.alloc(64, "c").toString(),
"Accept": "application/json",
"Accept-Language": "en-US,en;q=0.9,fr;q=0.8,de;q=0.7",
},
});
expect(await resp.text()).toBe(longHeaderValue);
});
});
describe("Latin-1 to UTF-8 — @Vector(16, u8) ungated", () => {
test("full byte range", () => {
const latin1Bytes = Buffer.alloc(256);
for (let i = 0; i < 256; i++) latin1Bytes[i] = i;
const latin1Str = latin1Bytes.toString("latin1");
const utf8Buf = Buffer.from(latin1Str, "utf-8");
expect(utf8Buf.length).toBeGreaterThan(256);
expect(utf8Buf.toString("utf-8").length).toBe(256);
});
});
describe("String search — Highway memMem/indexOfChar", () => {
test("indexOf long string", () => {
const haystack = Buffer.alloc(1000, "a").toString() + "needle" + Buffer.alloc(1000, "b").toString();
expect(haystack.indexOf("needle")).toBe(1000);
expect(haystack.indexOf("missing")).toBe(-1);
expect(haystack.lastIndexOf("needle")).toBe(1000);
});
test("includes long string", () => {
const haystack = Buffer.alloc(1000, "a").toString() + "needle" + Buffer.alloc(1000, "b").toString();
expect(haystack.includes("needle")).toBe(true);
expect(haystack.includes("missing")).toBe(false);
});
});
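Per the compare head, CI runs fixtures like this one under QEMU or Intel SDE and flags baseline violations by scanning emulator output for `SDE-ERROR` rather than trusting exit codes. A minimal sketch of that scan (helper name hypothetical, sample message text illustrative; the real logic lives in `scripts/verify-baseline.ts`):

```typescript
// Detect an illegal-instruction report in Intel SDE output. Exit codes are
// unreliable here: an ordinary application error and an SDE violation can
// both be nonzero, so the marker string in the output is the signal.
function hasBaselineViolation(emulatorOutput: string): boolean {
  return emulatorOutput.includes("SDE-ERROR");
}

console.log(hasBaselineViolation("SDE-ERROR: Executed instruction not valid for specified chip")); // true
console.log(hasBaselineViolation("test failed: expected 1, got 2")); // false
```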


@@ -1,5 +1,5 @@
import { describe, expect, it } from "bun:test";
import { bunEnv, bunExe, isASAN, tmpdirSync } from "harness";
import { bunEnv, bunExe, tmpdirSync } from "harness";
import { join } from "node:path";
import tls from "node:tls";
@@ -263,7 +263,7 @@ describe.concurrent("fetch-tls", () => {
});
const start = performance.now();
const TIMEOUT = 200;
const THRESHOLD = 150 * (isASAN ? 2 : 1); // ASAN can be very slow, so we need to increase the threshold for it
const THRESHOLD = 150;
try {
await fetch(server.url, {


@@ -1,49 +0,0 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
// https://github.com/oven-sh/bun/issues/27287
test("CJS require() of failing ESM does not corrupt module for subsequent import()", async () => {
using dir = tempDir("issue-27287", {
"bad-esm.mjs": `throw globalThis.err;\nexport const foo = 2;\n`,
"entry.cjs": `
'use strict';
globalThis.err = new Error('intentional error');
// First: require() the failing ESM module
try {
require('./bad-esm.mjs');
} catch (e) {
console.log('require_error:', e.message);
}
// Second: import() the same module - should re-throw the original error, not ReferenceError
import('./bad-esm.mjs')
.then(() => {
console.log('import_result: resolved');
})
.catch((e) => {
console.log('import_error_type:', e.constructor.name);
console.log('import_error_msg:', e.message);
});
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "run", "entry.cjs"],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout).toContain("require_error: intentional error");
// The import() should re-throw the original evaluation error, NOT a ReferenceError
// about uninitialized exports. The module threw during evaluation, so import() should
// reject with the same error.
expect(stdout).not.toContain("ReferenceError");
expect(stdout).toContain("import_error_type: Error");
expect(stdout).toContain("import_error_msg: intentional error");
expect(exitCode).toBe(0);
});