Compare commits


19 Commits

Claude Bot
95c759a578 fix(serve): allow server config with stop method to auto-start
The fix for #26142 incorrectly used `typeof def.stop !== 'function'`
to detect Server instances returned by Bun.serve(). This caused server
config objects with a custom `stop` method (e.g., Elysia apps) to not
auto-start.

Changed the detection to use `typeof def.reload !== 'function'` instead,
since `reload` is specific to Server instances and unlikely to be on
user config objects.
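The heuristic can be sketched in TypeScript (the actual implementation is in Bun's Zig source; `isServerInstance` is an illustrative name, not a real API):

```typescript
// Hypothetical sketch of the detection heuristic described above.
// A Server returned by Bun.serve() always has a reload() method, while
// user-provided server config objects (e.g. an Elysia app) may define
// stop() but rarely reload().
function isServerInstance(def: { reload?: unknown; stop?: unknown }): boolean {
  // The old, buggy check was `typeof def.stop !== "function"`, which
  // misclassified config objects that happen to have a stop() method.
  return typeof def.reload === "function";
}
```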

Fixes #26747

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-05 07:03:03 +00:00
robobun
ddefa11070 fix(fs): handle '.' path normalization on Windows (#26634)
## Summary
- Fix path normalization for "." on Windows where `normalizeStringBuf`
was incorrectly stripping it to an empty string
- This caused `existsSync('.')`, `statSync('.')`, and other fs
operations to fail on Windows

## Test plan
- Added regression test `test/regression/issue/26631.test.ts` that tests
`existsSync`, `exists`, `statSync`, and `stat` for both `.` and `..`
paths
- All tests pass locally with `bun bd test
test/regression/issue/26631.test.ts`
- Verified code compiles on all platforms with `bun run zig:check-all`

Fixes #26631

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-01 00:33:59 -08:00
Dylan Conway
35f8154319 bump versions 2026-01-31 17:38:23 -08:00
robobun
9d68ec882a require --compile for ESM bytecode (#26624)
## Summary
- Add validation to require `--compile` when using ESM bytecode
- Update documentation to clarify ESM bytecode requirements

## Why
ESM module resolution is two-phase: (1) analyze imports/exports, (2)
evaluate. Without `--compile`, there's no `module_info` embedded, so JSC
must still parse the file for module analysis even with bytecode -
causing a double-parse deopt.
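A hedged TypeScript sketch of the JS-API validation (the function name and option shape are illustrative; the real check lives inside Bun's build pipeline):

```typescript
interface BuildOpts {
  bytecode?: boolean;
  format?: "esm" | "cjs";
  compile?: boolean;
}

// Illustrative version of the new validation: ESM bytecode is only useful
// when --compile embeds module_info; otherwise JSC must parse the file
// anyway for module analysis, defeating the point of bytecode.
function validateBytecodeOptions(opts: BuildOpts): void {
  if (opts.bytecode && opts.format === "esm" && !opts.compile) {
    throw new Error(
      "ESM bytecode requires --compile. Use --format=cjs for bytecode without --compile.",
    );
  }
}
```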

## Changes
- **CLI**: Error when `--bytecode --format=esm` is used without
`--compile`
- **JS API**: Error when `bytecode: true, format: 'esm'` is used without
`compile: true`
- **Docs**: Update bytecode.mdx, executables.mdx, index.mdx to clarify
requirements
- **Types**: Update JSDoc for bytecode option in bun.d.ts

## Test plan
```bash
# Should error
bun build ./test.js --bytecode --format=esm --outdir=./out
# error: ESM bytecode requires --compile. Use --format=cjs for bytecode without --compile.

# Should work
bun build ./test.js --bytecode --format=esm --compile --outfile=./mytest
bun build ./test.js --bytecode --format=cjs --outdir=./out
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-31 17:35:03 -08:00
Dylan Conway
1337f5dba4 add --cpu-prof-interval flag (#26620)
Adds `--cpu-prof-interval` to configure the CPU profiler sampling
interval in microseconds (default: 1000), matching Node.js's
`--cpu-prof-interval` flag.

```sh
bun --cpu-prof --cpu-prof-interval 500 index.js
```

- Parsed as `u32`, truncated to `c_int` when passed to JSC's
`SamplingProfiler::setTimingInterval`
- Invalid values silently fall back to the default (1000μs)
- Warns if used without `--cpu-prof` or `--cpu-prof-md`
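The parse-and-fallback behavior can be sketched in TypeScript (the real parsing happens in Bun's Zig CLI code; `parseCpuProfInterval` is a hypothetical name, and treating non-positive values as invalid is an assumption of this sketch):

```typescript
const DEFAULT_INTERVAL_US = 1000;

// Mirrors the described behavior: parse as an unsigned 32-bit integer,
// silently falling back to the default (1000 microseconds) on invalid input.
function parseCpuProfInterval(raw: string): number {
  const n = Number(raw);
  if (!Number.isInteger(n) || n <= 0 || n > 0xffffffff) {
    return DEFAULT_INTERVAL_US;
  }
  return n;
}
```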

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-31 16:59:03 -08:00
robobun
56b5be4ba4 fix(shell): prevent double-free during GC finalization (#26626)
## Summary

Fixes #26625

This fixes a segmentation fault that occurred on Windows x64 when the GC
finalizer tried to free shell interpreter resources that were already
partially freed during normal shell completion.

- Added explicit `cleanup_state` enum to track resource ownership state
  - `needs_full_cleanup`: Nothing cleaned up yet, finalizer must clean everything
  - `runtime_cleaned`: `finish()` already cleaned IO/shell, finalizer skips those
- Early return in `#derefRootShellAndIOIfNeeded()` when already cleaned
- Explicit state-based cleanup in `deinitFromFinalizer()`

The vulnerability existed on all platforms but was most reliably
triggered on Windows with high GC pressure (many concurrent shell
commands).
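The ownership-tracking idea translates to this hypothetical TypeScript sketch (the real code is Zig; the class and counter are stand-ins, only the state names follow the enum described above):

```typescript
type CleanupState = "needs_full_cleanup" | "runtime_cleaned";

// Illustrates the double-free guard: whichever path runs first performs the
// cleanup and records it, so the GC finalizer skips what finish() already freed.
class ShellResources {
  state: CleanupState = "needs_full_cleanup";
  freeCount = 0; // stand-in for actually releasing IO/shell resources

  finish(): void {
    if (this.state === "needs_full_cleanup") {
      this.freeCount++; // release IO/shell at normal shell completion
      this.state = "runtime_cleaned"; // the finalizer must now skip these
    }
  }

  deinitFromFinalizer(): void {
    if (this.state === "needs_full_cleanup") {
      this.freeCount++; // nothing was cleaned yet; finalizer cleans everything
    }
    // state === "runtime_cleaned": skip, preventing the double free
  }
}
```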

## Test plan

- [x] Build passes (`bun bd`)
- [x] New regression test added (`test/regression/issue/26625.test.ts`)
- [x] Existing shell tests pass (same 4 pre-existing failures, no new
failures)


🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-31 16:57:59 -08:00
Dylan Conway
6c119d608e Simplify bun run build:local to auto-build JSC (#26645)
## Summary

- `bun run build:local` now handles everything: configuring JSC,
building JSC, and building Bun in a single command on all platforms
(macOS, Linux, Windows). Previously required manually running `bun run
jsc:build:debug`, deleting a duplicate `InspectorProtocolObjects.h`
header, and then running the Bun build separately.
- Incremental JSC rebuilds: JSC is built via `add_custom_target` that
delegates to JSC's inner Ninja, which tracks WebKit source file changes
and only rebuilds what changed. `ninja -Cbuild/debug-local` also works
after the first build.
- Cross-platform support:
  - macOS: Uses system ICU automatically
  - Linux: Uses system ICU via `find_package` instead of requiring bundled static libs
  - Windows: Builds ICU from source automatically (only when libs don't already exist), sets up static CRT and ICU naming conventions

### Changes
- cmake/tools/SetupWebKit.cmake: Replace the old WEBKIT_LOCAL block
(which just set include paths and assumed JSC was pre-built) with full
JSC configure + build integration for all platforms
- cmake/targets/BuildBun.cmake: Add jsc as a build dependency, use
system ICU on Linux for local builds, handle bmalloc linking for local
builds
- CONTRIBUTING.md / docs/project/contributing.mdx: Simplify "Building
WebKit locally" docs from ~15 lines of manual steps to 3 lines

## Test plan

- [x] macOS arm64: clean build, incremental rebuild, WebKit source
change rebuild
- [x] Windows x64: clean build with ICU, incremental rebuild with ICU
skip
- [x] Linux x64: build with system ICU via find_package
- [x] No duplicate InspectorProtocolObjects.h errors
- [x] build/debug-local/bun-debug --version works

Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-31 16:52:51 -08:00
Ciro Spaciari
a14a89ca95 fix(proxy): respect NO_PROXY for explicit proxy options in fetch and ws (#26608)
### What does this PR do?

Extract NO_PROXY checking logic from getHttpProxyFor into a reusable
isNoProxy method on the env Loader. This allows both fetch() and
WebSocket to check NO_PROXY even when a proxy is explicitly provided via
the proxy option (not just via http_proxy env var).

Changes:
- env_loader.zig: Extract isNoProxy() from getHttpProxyFor()
- FetchTasklet.zig: Check isNoProxy() before using explicit proxy
- WebSocket.cpp: Check Bun__isNoProxy() before using explicit proxy
- virtual_machine_exports.zig: Export Bun__isNoProxy for C++ access
- Add NO_PROXY tests for both fetch and WebSocket proxy paths
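As a rough TypeScript sketch of what `isNoProxy` decides (the real logic is in `env_loader.zig`; this follows the common NO_PROXY interpretation and is not the exact Bun implementation):

```typescript
// Hypothetical NO_PROXY host matching: comma-separated entries, where "*"
// disables proxying entirely and a leading dot (".example.com") matches
// subdomains. An entry without a dot also matches its own subdomains here.
function isNoProxy(host: string, noProxyValue: string): boolean {
  const h = host.toLowerCase();
  for (const raw of noProxyValue.split(",")) {
    const entry = raw.trim().toLowerCase();
    if (!entry) continue;
    if (entry === "*") return true;
    const suffix = entry.startsWith(".") ? entry : "." + entry;
    if (h === entry || h.endsWith(suffix)) return true;
  }
  return false;
}
```

With this fix, the same check now runs even when a proxy was passed explicitly via the `proxy` option, not only when it came from the `http_proxy` environment variable.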

### How did you verify your code works?
Tests

---------

Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
2026-01-30 16:20:45 -08:00
robobun
a5246344fa fix(types): Socket.reload() now correctly expects { socket: handler } (#26291)
## Summary
- Fix type definition for `Socket.reload()` to match runtime behavior
- The runtime expects `{ socket: handler }` but types previously
accepted just `handler`
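The corrected call shape, per the fixed types (handler contents are illustrative):

```typescript
// After this fix the argument must wrap the handler, matching the runtime:
//   socket.reload({ socket: handler })   // correct
//   socket.reload(handler)               // previously allowed by types, wrong at runtime
const handler = {
  data(_socket: unknown, _chunk: Uint8Array): void {
    // handle incoming bytes
  },
};
const reloadArgument = { socket: handler };
```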

## Test plan
- [x] Added regression test `test/regression/issue/26290.test.ts`
- [x] Verified test passes with `bun bd test`

Fixes #26290

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Alistair Smith <hi@alistair.sh>
2026-01-30 13:23:04 -08:00
robobun
f648483fe7 fix(types): add missing SIMD variants to Bun.Build.CompileTarget type (#26248)
## Summary

- Adds missing SIMD variants to the `Build.Target` TypeScript type
- The runtime accepts targets like `bun-linux-x64-modern` but TypeScript
was rejecting them
- Generalized the type to use `${Architecture}` template where possible

## Test plan

- [x] Added regression test in `test/regression/issue/26247.test.ts`
that validates all valid target combinations type-check correctly
- [x] Verified with `bun bd test test/regression/issue/26247.test.ts`

Fixes #26247

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Alistair Smith <hi@alistair.sh>
2026-01-30 13:13:28 -08:00
ddmoney420
01fa61045f fix(types): add missing bun-linux-x64-${SIMD} compile target type (#26607)
## Summary

- Adds missing `bun-linux-x64-baseline` and `bun-linux-x64-modern`
compile target types
- These targets are supported by the Bun CLI but were missing from the
TypeScript type definitions

## Changes

Added `bun-linux-x64-${SIMD}` to the `CompileTarget` type union, which
expands to:
- `bun-linux-x64-baseline`
- `bun-linux-x64-modern`
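A self-contained TypeScript sketch of the template-literal expansion, simplified to just the targets named in this commit (the full `CompileTarget` union in `bun.d.ts` covers more platforms):

```typescript
type SIMD = "baseline" | "modern";

// `bun-linux-x64-${SIMD}` expands to exactly the two previously missing members.
type LinuxX64CompileTarget = "bun-linux-x64" | `bun-linux-x64-${SIMD}`;

// These now type-check; before the fix the SIMD variants were rejected.
const modern: LinuxX64CompileTarget = "bun-linux-x64-modern";
const baseline: LinuxX64CompileTarget = "bun-linux-x64-baseline";
console.log(modern, baseline);
```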

## Test plan

- [x] TypeScript should now accept `target: 'bun-linux-x64-modern'`
without type errors

Closes #26247

🤖 Generated with [Claude Code](https://claude.com/claude-code)
2026-01-30 12:21:11 -08:00
Alistair Smith
71ce550cfa esm bytecode (#26402)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
2026-01-30 01:38:45 -08:00
robobun
8f61adf494 Harden chunked encoding parser (#26594)
## Summary
- Improve handling of fragmented chunk data in the HTTP parser
- Add test coverage for edge cases

## Test plan
- [x] New tests pass
- [x] Existing tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 01:18:39 -08:00
Dylan Conway
b4b7cc6d78 fix multi-run.test.ts on windows (#26590)
### What does this PR do?

fixes https://github.com/oven-sh/bun/issues/26597

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-29 23:35:53 -08:00
SUZUKI Sosuke
3feea91087 ci: add QEMU JIT stress tests when WebKit is updated (#26589)
## Summary

Add a CI step that runs JSC JIT stress tests under QEMU when
`SetupWebKit.cmake` is modified. This complements #26571 (basic baseline
CPU verification) by also testing JIT-generated code.

## Motivation

PR #26571 added QEMU-based verification that catches illegal
instructions in:
- Startup code
- Static initialization
- Basic interpreter execution

However, JIT compilers (DFG, FTL, Wasm BBQ/OMG) generate code at runtime
that could emit AVX or LSE instructions even if the compiled binary
doesn't. The JSC stress tests from #26380 exercise all JIT tiers through
hot loops that trigger tier-up.

## How it works

1. Detects if `cmake/tools/SetupWebKit.cmake` is modified in the PR
2. If WebKit changes are detected, runs `verify-jit-stress-qemu.sh`
after the build
3. Executes all 78 JIT stress test fixtures under QEMU with restricted
CPU features:
   - x64: `qemu-x86_64 -cpu Nehalem` (SSE4.2, no AVX)
   - aarch64: `qemu-aarch64 -cpu cortex-a53` (ARMv8.0-A, no LSE)
4. Any SIGILL from JIT-generated code fails the build

## Platforms tested

| Target | CPU Model | What it catches |
|---|---|---|
| `linux-x64-baseline` | Nehalem | JIT emitting AVX/AVX2/AVX512 |
| `linux-x64-musl-baseline` | Nehalem | JIT emitting AVX/AVX2/AVX512 |
| `linux-aarch64` | Cortex-A53 | JIT emitting LSE atomics, SVE |
| `linux-aarch64-musl` | Cortex-A53 | JIT emitting LSE atomics, SVE |

## Timeout

The step has a 30-minute timeout since QEMU emulation is ~10-50x slower
than native. This only runs on WebKit update PRs, so it won't affect
most CI runs.

## Refs

- #26380 - Added JSC JIT stress tests
- #26571 - Added basic QEMU baseline verification
2026-01-29 21:12:36 -08:00
Jarred Sumner
bb4d5b9af5 feat(cli/run): add --parallel and --sequential for running multiple scripts with workspace support (#26551)
## Summary

Adds `bun run --parallel` and `bun run --sequential` — new flags for
running multiple package.json scripts concurrently or sequentially with
Foreman-style prefixed output. Includes full `--filter`/`--workspaces`
integration for running scripts across workspace packages.

### Usage

```bash
# Run "build" and "test" concurrently from the current package.json
bun run --parallel build test

# Run "build" and "test" sequentially with prefixed output
bun run --sequential build test

# Glob-matched script names
bun run --parallel "build:*"

# Run "build" in all workspace packages concurrently
bun run --parallel --filter '*' build

# Run "build" in all workspace packages sequentially
bun run --sequential --workspaces build

# Glob-matched scripts across all packages
bun run --parallel --filter '*' "build:*"

# Multiple scripts across all packages
bun run --parallel --filter '*' build lint test

# Continue running even if one package fails
bun run --parallel --no-exit-on-error --filter '*' test

# Skip packages missing the script
bun run --parallel --workspaces --if-present build
```

## How it works

### Output format

Each script's stdout/stderr is prefixed with a colored, padded label:

```
build | compiling...
test  | running suite...
lint  | checking files...
```

### Label format

- **Without `--filter`/`--workspaces`**: labels are just the script name
→ `build | output`
- **With `--filter`/`--workspaces`**: labels are `package:script` →
`pkg-a:build | output`
- **Fallback**: if a package.json has no `name` field, the relative path
from the workspace root is used (e.g., `packages/my-pkg:build`)

### Execution model

- **`--parallel`**: all scripts start immediately, output is interleaved
with prefixes
- **`--sequential`**: scripts run one at a time in order, each waiting
for the previous to finish
- **Pre/post scripts** (`prebuild`/`postbuild`) are grouped with their
main script and run in dependency order within each group
- By default, a failure kills all remaining scripts.
`--no-exit-on-error` lets all scripts finish.

### Workspace integration

The workspace branch in `multi_run.zig` uses a two-pass approach for
deterministic ordering:

1. **Collect**: iterate workspace packages using
`FilterArg.PackageFilterIterator` (same infrastructure as
`filter_run.zig`), filtering with `FilterArg.FilterSet`, collecting
matched packages with their scripts, PATH, and cwd.
2. **Sort**: sort matched packages by name (tiebreak by directory path)
for deterministic ordering — filesystem iteration order from the glob
walker is nondeterministic.
3. **Build configs**: for each sorted package, expand script names
(including globs like `build:*`) against that package's scripts map,
creating `ScriptConfig` entries with `pkg:script` labels and per-package
cwd/PATH.
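The collect/sort/expand passes above can be sketched in TypeScript (types and names here are illustrative, not the actual Zig structures in `multi_run.zig`):

```typescript
interface WorkspacePkg {
  name: string | null; // package.json "name", may be absent
  dir: string; // path relative to the workspace root
  scripts: Record<string, string>;
}

interface ScriptConfig { label: string; command: string; cwd: string; }

// Input is the pass-1 result (matched packages); filtering is elided.
function buildScriptConfigs(matched: WorkspacePkg[], patterns: string[]): ScriptConfig[] {
  // Pass 2: sort by name (tiebreak: directory) — glob-walker order is nondeterministic.
  const sorted = [...matched].sort(
    (a, b) =>
      (a.name ?? a.dir).localeCompare(b.name ?? b.dir) || a.dir.localeCompare(b.dir),
  );

  // Pass 3: expand script-name globs like "build:*" against each package's scripts.
  const configs: ScriptConfig[] = [];
  for (const pkg of sorted) {
    const label = pkg.name ?? pkg.dir; // fallback: relative path when "name" is missing
    for (const pattern of patterns) {
      const re = new RegExp(
        "^" + pattern.replace(/[.+^${}()|[\]\\]/g, "\\$&").replace(/\*/g, ".*") + "$",
      );
      for (const [script, command] of Object.entries(pkg.scripts)) {
        if (re.test(script)) {
          configs.push({ label: `${label}:${script}`, command, cwd: pkg.dir });
        }
      }
    }
  }
  return configs;
}
```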

### Behavioral consistency with `filter_run.zig`

| Behavior | `filter_run.zig` | `multi_run.zig` (this PR) |
|----------|-------------------|---------------------------|
| `--workspaces` skips root package | Yes | Yes |
| `--workspaces` errors on missing script | Yes | Yes |
| `--if-present` silently skips missing | Yes | Yes |
| `--filter` without `--workspaces` includes root | Yes (if matches) |
Yes (if matches) |
| Pre/post script chains | Per-package | Per-package |
| Per-package cwd | Yes | Yes |
| Per-package PATH (`node_modules/.bin`) | Yes | Yes |

### Key implementation details

- Each workspace package script runs in its own package directory with
its own `node_modules/.bin` PATH
- `dirpath` from the glob walker is duped to avoid use-after-free when
the iterator's arena is freed between patterns
- `addScriptConfigs` takes an optional `label_prefix` parameter — `null`
for single-package mode, package name for workspace mode
- `MultiRunProcessHandle` is registered in the `ProcessExitHandler`
tagged pointer union in `process.zig`

## Files changed

| File | Change |
|------|--------|
| `src/cli/multi_run.zig` | New file: process management, output
routing, workspace integration, dependency ordering |
| `src/cli.zig` | Dispatch to `MultiRun.run()` for
`--parallel`/`--sequential`, new context fields |
| `src/cli/Arguments.zig` | Parse `--parallel`, `--sequential`,
`--no-exit-on-error` flags |
| `src/bun.js/api/bun/process.zig` | Register `MultiRunProcessHandle` in
`ProcessExitHandler` tagged pointer union |
| `test/cli/run/multi-run.test.ts` | 118 tests (102 core + 16 workspace
integration) |
| `docs/pm/filter.mdx` | Document `--parallel`/`--sequential` +
`--filter`/`--workspaces` combination |
| `docs/snippets/cli/run.mdx` | Add `--parallel`, `--sequential`,
`--no-exit-on-error` parameter docs |

## Test plan

All 118 tests pass with debug build (`bun bd test
test/cli/run/multi-run.test.ts`). The 16 new workspace tests all fail
with system bun (`USE_SYSTEM_BUN=1`), confirming they test new
functionality.

### Workspace integration tests (16 tests)

1. `--parallel --filter='*'` runs script in all packages
2. `--parallel --filter='pkg-a'` runs only in matching package
3. `--parallel --workspaces` matches all workspace packages
4. `--parallel --filter='*'` with glob expands per-package scripts
5. `--sequential --filter='*'` runs in sequence (deterministic order)
6. Workspace + failure aborts other scripts
7. Workspace + `--no-exit-on-error` lets all finish
8. `--workspaces` skips root package
9. Each workspace script runs in its own package directory (cwd
verification)
10. Multiple script names across workspaces (`build` + `test`)
11. Pre/post scripts work per workspace package
12. `--filter` skips packages without the script (no error)
13. `--workspaces` errors when a package is missing the script
14. `--workspaces --if-present` skips missing scripts silently
15. Labels are padded correctly across workspace packages
16. Package without `name` field uses relative path as label

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2026-01-29 20:20:39 -08:00
Dylan Conway
adc1a6b05c Fix aarch64 SIGILL: disable mimalloc LSE atomics + update WebKit + QEMU verification (#26586)
Fixes illegal instruction (SIGILL) crashes on ARMv8.0 aarch64 CPUs
(Cortex-A53, Raspberry Pi 4, AWS a1 instances).

## Root cause

Upstream mimalloc force-enables `MI_OPT_ARCH` on arm64, which adds
`-march=armv8.1-a` and emits LSE atomic instructions (`casa`, `swpa`,
`ldaddl`). These are not available on ARMv8.0 CPUs.

## Fix

- Pass `MI_NO_OPT_ARCH=ON` to mimalloc on aarch64 (has priority over
`MI_OPT_ARCH` in mimalloc's CMake)
- Update WebKit to autobuild-596e48e22e3a1090e5b802744a7938088b1ea860
which explicitly passes `-march` flags to the WebKit build

## Verification

Includes QEMU-based baseline CPU verification CI steps (#26571) that
catch these regressions automatically.
2026-01-29 17:18:57 -08:00
Dylan Conway
8a11a03297 [publish images] 2026-01-29 16:04:44 -08:00
Dylan Conway
baea21f0c7 ci: add QEMU-based baseline CPU verification steps (#26571)
## Summary

Add CI steps that verify baseline builds don't use CPU instructions
beyond their target. Uses QEMU user-mode emulation with restricted CPU
features — any illegal instruction causes SIGILL and fails the build.

## Platforms verified

| Build Target | QEMU Command | What it catches |
|---|---|---|
| `linux-x64-baseline` (glibc) | `qemu-x86_64 -cpu Nehalem` | AVX, AVX2,
AVX512 |
| `linux-x64-musl-baseline` | `qemu-x86_64 -cpu Nehalem` | AVX, AVX2,
AVX512 |
| `linux-aarch64` (glibc) | `qemu-aarch64 -cpu cortex-a35` | LSE
atomics, SVE, dotprod |
| `linux-aarch64-musl` | `qemu-aarch64 -cpu cortex-a35` | LSE atomics,
SVE, dotprod |

## How it works

Each verify step:
1. Downloads the built binary artifact from the `build-bun` step
2. Installs `qemu-user-static` on-the-fly (dnf/apk/apt-get)
3. Runs two smoke tests under QEMU with restricted CPU features:
   - `bun --version` — validates startup, linker, static init code
   - `bun -e eval` — validates JSC initialization and basic execution
4. Hard fails on SIGILL (exit code 132)

The verify step runs in the build group after `build-bun`, with a
5-minute timeout.

## Known issue this will surface

**mimalloc on aarch64**: Built with `MI_OPT_ARCH=ON` which adds
`-march=armv8.1-a`, enabling LSE atomics. This will SIGILL on
Cortex-A35/A53 CPUs. The aarch64 verify steps are expected to fail
initially, confirming the test catches real issues. Fix can be done
separately in `cmake/targets/BuildMimalloc.cmake`.
2026-01-29 15:53:34 -08:00
75 changed files with 6787 additions and 336 deletions


@@ -26,7 +26,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 wget curl git python3 python3-pip ninja-build \
 software-properties-common apt-transport-https \
 ca-certificates gnupg lsb-release unzip \
-libxml2-dev ruby ruby-dev bison gawk perl make golang ccache \
+libxml2-dev ruby ruby-dev bison gawk perl make golang ccache qemu-user-static \
 && add-apt-repository ppa:ubuntu-toolchain-r/test \
 && apt-get update \
 && apt-get install -y gcc-13 g++-13 libgcc-13-dev libstdc++-13-dev \


@@ -537,6 +537,109 @@ function getLinkBunStep(platform, options) {
};
}
/**
* Returns the artifact triplet for a platform, e.g. "bun-linux-aarch64" or "bun-linux-x64-musl-baseline".
* Matches the naming convention in cmake/targets/BuildBun.cmake.
* @param {Platform} platform
* @returns {string}
*/
function getTargetTriplet(platform) {
const { os, arch, abi, baseline } = platform;
let triplet = `bun-${os}-${arch}`;
if (abi === "musl") {
triplet += "-musl";
}
if (baseline) {
triplet += "-baseline";
}
return triplet;
}
/**
* Returns true if a platform needs QEMU-based baseline CPU verification.
* x64 baseline builds verify no AVX/AVX2 instructions snuck in.
* aarch64 builds verify no LSE/SVE instructions snuck in.
* @param {Platform} platform
* @returns {boolean}
*/
function needsBaselineVerification(platform) {
const { os, arch, baseline } = platform;
if (os !== "linux") return false;
return (arch === "x64" && baseline) || arch === "aarch64";
}
/**
* @param {Platform} platform
* @param {PipelineOptions} options
* @returns {Step}
*/
function getVerifyBaselineStep(platform, options) {
const { arch } = platform;
const targetKey = getTargetKey(platform);
const archArg = arch === "x64" ? "x64" : "aarch64";
return {
key: `${targetKey}-verify-baseline`,
label: `${getTargetLabel(platform)} - verify-baseline`,
depends_on: [`${targetKey}-build-bun`],
agents: getLinkBunAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
timeout_in_minutes: 5,
command: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${getTargetTriplet(platform)}.zip'`,
`unzip -o '${getTargetTriplet(platform)}-profile.zip'`,
`chmod +x ${getTargetTriplet(platform)}/bun ${getTargetTriplet(platform)}-profile/bun-profile`,
`./scripts/verify-baseline-cpu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}/bun`,
`./scripts/verify-baseline-cpu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}-profile/bun-profile`,
],
};
}
/**
* Returns true if the PR modifies SetupWebKit.cmake (WebKit version changes).
* JIT stress tests under QEMU should run when WebKit is updated to catch
* JIT-generated code that uses unsupported CPU instructions.
* @param {PipelineOptions} options
* @returns {boolean}
*/
function hasWebKitChanges(options) {
const { changedFiles = [] } = options;
return changedFiles.some(file => file.includes("SetupWebKit.cmake"));
}
/**
* Returns a step that runs JSC JIT stress tests under QEMU.
* This verifies that JIT-compiled code doesn't use CPU instructions
* beyond the baseline target (no AVX on x64, no LSE on aarch64).
* @param {Platform} platform
* @param {PipelineOptions} options
* @returns {Step}
*/
function getJitStressTestStep(platform, options) {
const { arch } = platform;
const targetKey = getTargetKey(platform);
const archArg = arch === "x64" ? "x64" : "aarch64";
return {
key: `${targetKey}-jit-stress-qemu`,
label: `${getTargetLabel(platform)} - jit-stress-qemu`,
depends_on: [`${targetKey}-build-bun`],
agents: getLinkBunAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
// JIT stress tests are slow under QEMU emulation
timeout_in_minutes: 30,
command: [
`buildkite-agent artifact download '*.zip' . --step ${targetKey}-build-bun`,
`unzip -o '${getTargetTriplet(platform)}.zip'`,
`chmod +x ${getTargetTriplet(platform)}/bun`,
`./scripts/verify-jit-stress-qemu.sh --arch ${archArg} --binary ${getTargetTriplet(platform)}/bun`,
],
};
}
/**
* @param {Platform} platform
* @param {PipelineOptions} options
@@ -774,6 +877,7 @@ function getBenchmarkStep() {
* @property {Platform[]} [buildPlatforms]
* @property {Platform[]} [testPlatforms]
* @property {string[]} [testFiles]
* @property {string[]} [changedFiles]
*/
/**
@@ -1126,6 +1230,14 @@ async function getPipeline(options = {}) {
steps.push(getBuildZigStep(target, options));
steps.push(getLinkBunStep(target, options));
if (needsBaselineVerification(target)) {
steps.push(getVerifyBaselineStep(target, options));
// Run JIT stress tests under QEMU when WebKit is updated
if (hasWebKitChanges(options)) {
steps.push(getJitStressTestStep(target, options));
}
}
return getStepWithDependsOn(
{
key: getTargetKey(target),
@@ -1223,6 +1335,7 @@ async function main() {
console.log(`- PR is only docs, skipping tests!`);
return;
}
options.changedFiles = allFiles;
}
startGroup("Generating pipeline...");


@@ -259,18 +259,13 @@ $ git clone https://github.com/oven-sh/WebKit vendor/WebKit
 # Check out the commit hash specified in `set(WEBKIT_VERSION <commit_hash>)` in cmake/tools/SetupWebKit.cmake
 $ git -C vendor/WebKit checkout <commit_hash>
-# Make a debug build of JSC. This will output build artifacts in ./vendor/WebKit/WebKitBuild/Debug
-# Optionally, you can use `bun run jsc:build` for a release build
-$ bun run jsc:build:debug && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
-# After an initial run of `make jsc-debug`, you can rebuild JSC with:
-$ cmake --build vendor/WebKit/WebKitBuild/Debug --target jsc && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
-# Build bun with the local JSC build
+# Build bun with the local JSC build — this automatically configures and builds JSC
 $ bun run build:local
 ```
-Using `bun run build:local` will build Bun in the `./build/debug-local` directory (instead of `./build/debug`), you'll have to change a couple of places to use this new directory:
+`bun run build:local` handles everything: configuring JSC, building JSC, and building Bun. On subsequent runs, JSC will incrementally rebuild if any WebKit sources changed. `ninja -Cbuild/debug-local` also works after the first build, and will build Bun+JSC.
+The build output goes to `./build/debug-local` (instead of `./build/debug`), so you'll need to update a couple of places:
 - The first line in [`src/js/builtins.d.ts`](/src/js/builtins.d.ts)
 - The `CompilationDatabase` line in [`.clangd` config](/.clangd) should be `CompilationDatabase: build/debug-local`
@@ -281,7 +276,7 @@ Note that the WebKit folder, including build artifacts, is 8GB+ in size.
 If you are using a JSC debug build and using VScode, make sure to run the `C/C++: Select a Configuration` command to configure intellisense to find the debug headers.
-Note that if you change make changes to our [WebKit fork](https://github.com/oven-sh/WebKit), you will also have to change [`SetupWebKit.cmake`](/cmake/tools/SetupWebKit.cmake) to point to the commit hash.
+Note that if you make changes to our [WebKit fork](https://github.com/oven-sh/WebKit), you will also have to change [`SetupWebKit.cmake`](/cmake/tools/SetupWebKit.cmake) to point to the commit hash.
 ## Troubleshooting
## Troubleshooting


@@ -1 +1 @@
-1.3.7
+1.3.8


@@ -1273,13 +1273,18 @@ else()
 ${WEBKIT_LIB_PATH}/libWTF.a
 ${WEBKIT_LIB_PATH}/libJavaScriptCore.a
 )
-if(NOT APPLE OR EXISTS ${WEBKIT_LIB_PATH}/libbmalloc.a)
+if(WEBKIT_LOCAL OR NOT APPLE OR EXISTS ${WEBKIT_LIB_PATH}/libbmalloc.a)
 target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libbmalloc.a)
 endif()
 endif()
 include_directories(${WEBKIT_INCLUDE_PATH})
+# When building with a local WebKit, ensure JSC is built before compiling Bun's C++ sources.
+if(WEBKIT_LOCAL AND TARGET jsc)
+  add_dependencies(${bun} jsc)
+endif()
 # Include the generated dependency versions header
 include_directories(${CMAKE_BINARY_DIR})
@@ -1324,9 +1329,14 @@ if(LINUX)
 target_link_libraries(${bun} PUBLIC libatomic.so)
 endif()
-target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libicudata.a)
-target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libicui18n.a)
-target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libicuuc.a)
+if(WEBKIT_LOCAL)
+  find_package(ICU REQUIRED COMPONENTS data i18n uc)
+  target_link_libraries(${bun} PRIVATE ICU::data ICU::i18n ICU::uc)
+else()
+  target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libicudata.a)
+  target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libicui18n.a)
+  target_link_libraries(${bun} PRIVATE ${WEBKIT_LIB_PATH}/libicuuc.a)
+endif()
 endif()
if(WIN32)


@@ -69,8 +69,18 @@ if(ENABLE_VALGRIND)
 list(APPEND MIMALLOC_CMAKE_ARGS -DMI_VALGRIND=ON)
 endif()
-# Enable SIMD optimizations when not building for baseline (older CPUs)
-if(NOT ENABLE_BASELINE)
+# Enable architecture-specific optimizations when not building for baseline.
+# On Linux aarch64, upstream mimalloc force-enables MI_OPT_ARCH which adds
+# -march=armv8.1-a (LSE atomics). This crashes on ARMv8.0 CPUs
+# (Cortex-A53, Raspberry Pi 4, AWS a1 instances). Use MI_NO_OPT_ARCH
+# to prevent that, but keep SIMD enabled. -moutline-atomics for runtime
+# dispatch to LSE/LL-SC. macOS arm64 always has LSE (Apple Silicon) so
+# MI_OPT_ARCH is safe there.
+if(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64|arm64|ARM64|AARCH64" AND NOT APPLE)
+  list(APPEND MIMALLOC_CMAKE_ARGS -DMI_NO_OPT_ARCH=ON)
+  list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_SIMD=ON)
+  list(APPEND MIMALLOC_CMAKE_ARGS "-DCMAKE_C_FLAGS=-moutline-atomics")
+elseif(NOT ENABLE_BASELINE)
 list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_ARCH=ON)
 list(APPEND MIMALLOC_CMAKE_ARGS -DMI_OPT_SIMD=ON)
 endif()


@@ -1,5 +1,9 @@
+# NOTE: Changes to this file trigger QEMU JIT stress tests in CI.
+# See scripts/verify-jit-stress-qemu.sh for details.
 option(WEBKIT_VERSION "The version of WebKit to use")
 option(WEBKIT_LOCAL "If a local version of WebKit should be used instead of downloading")
+option(WEBKIT_BUILD_TYPE "The build type for local WebKit (defaults to CMAKE_BUILD_TYPE)")
 if(NOT WEBKIT_VERSION)
 set(WEBKIT_VERSION 515344bc5d65aa2d4f9ff277b5fb944f0e051dcd)
@@ -12,7 +16,10 @@ string(SUBSTRING ${WEBKIT_VERSION} 0 16 WEBKIT_VERSION_PREFIX)
 string(SUBSTRING ${WEBKIT_VERSION} 0 8 WEBKIT_VERSION_SHORT)
 if(WEBKIT_LOCAL)
-set(DEFAULT_WEBKIT_PATH ${VENDOR_PATH}/WebKit/WebKitBuild/${CMAKE_BUILD_TYPE})
+if(NOT WEBKIT_BUILD_TYPE)
+  set(WEBKIT_BUILD_TYPE ${CMAKE_BUILD_TYPE})
+endif()
+set(DEFAULT_WEBKIT_PATH ${VENDOR_PATH}/WebKit/WebKitBuild/${WEBKIT_BUILD_TYPE})
 else()
 set(DEFAULT_WEBKIT_PATH ${CACHE_PATH}/webkit-${WEBKIT_VERSION_PREFIX})
 endif()
@@ -27,35 +34,153 @@ set(WEBKIT_INCLUDE_PATH ${WEBKIT_PATH}/include)
set(WEBKIT_LIB_PATH ${WEBKIT_PATH}/lib)
if(WEBKIT_LOCAL)
if(EXISTS ${WEBKIT_PATH}/cmakeconfig.h)
# You may need to run:
# make jsc-compile-debug jsc-copy-headers
include_directories(
${WEBKIT_PATH}
${WEBKIT_PATH}/JavaScriptCore/Headers
${WEBKIT_PATH}/JavaScriptCore/Headers/JavaScriptCore
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders
${WEBKIT_PATH}/bmalloc/Headers
${WEBKIT_PATH}/WTF/Headers
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders/JavaScriptCore
${WEBKIT_PATH}/JavaScriptCore/DerivedSources/inspector
)
set(WEBKIT_SOURCE_DIR ${VENDOR_PATH}/WebKit)
# On Windows, add ICU include path from vcpkg
if(WIN32)
# Auto-detect vcpkg triplet
set(VCPKG_ARM64_PATH ${VENDOR_PATH}/WebKit/vcpkg_installed/arm64-windows-static)
set(VCPKG_X64_PATH ${VENDOR_PATH}/WebKit/vcpkg_installed/x64-windows-static)
if(EXISTS ${VCPKG_ARM64_PATH})
set(VCPKG_ICU_PATH ${VCPKG_ARM64_PATH})
if(WIN32)
# --- Build ICU from source (Windows only) ---
# On macOS, ICU is found automatically (Homebrew icu4c for headers, system for libs).
# On Linux, ICU is found automatically from system packages (e.g. libicu-dev).
# On Windows, there is no system ICU, so we build it from source.
set(ICU_LOCAL_ROOT ${VENDOR_PATH}/WebKit/WebKitBuild/icu)
if(NOT EXISTS ${ICU_LOCAL_ROOT}/lib/sicudt.lib)
message(STATUS "Building ICU from source...")
if(CMAKE_SYSTEM_PROCESSOR MATCHES "arm64|ARM64|aarch64|AARCH64")
set(ICU_PLATFORM "ARM64")
else()
set(VCPKG_ICU_PATH ${VCPKG_X64_PATH})
set(ICU_PLATFORM "x64")
endif()
if(EXISTS ${VCPKG_ICU_PATH}/include)
include_directories(${VCPKG_ICU_PATH}/include)
message(STATUS "Using ICU from vcpkg: ${VCPKG_ICU_PATH}/include")
execute_process(
COMMAND powershell -ExecutionPolicy Bypass -File
${WEBKIT_SOURCE_DIR}/build-icu.ps1
-Platform ${ICU_PLATFORM}
-BuildType ${WEBKIT_BUILD_TYPE}
-OutputDir ${ICU_LOCAL_ROOT}
RESULT_VARIABLE ICU_BUILD_RESULT
)
if(NOT ICU_BUILD_RESULT EQUAL 0)
message(FATAL_ERROR "Failed to build ICU (exit code: ${ICU_BUILD_RESULT}).")
endif()
endif()
# Copy ICU libs to WEBKIT_LIB_PATH with the names BuildBun.cmake expects.
# Prebuilt WebKit uses 's' prefix (static) and 'd' suffix (debug).
file(MAKE_DIRECTORY ${WEBKIT_LIB_PATH})
if(WEBKIT_BUILD_TYPE STREQUAL "Debug")
set(ICU_SUFFIX "d")
else()
set(ICU_SUFFIX "")
endif()
file(COPY_FILE ${ICU_LOCAL_ROOT}/lib/sicudt.lib ${WEBKIT_LIB_PATH}/sicudt${ICU_SUFFIX}.lib ONLY_IF_DIFFERENT)
file(COPY_FILE ${ICU_LOCAL_ROOT}/lib/icuin.lib ${WEBKIT_LIB_PATH}/sicuin${ICU_SUFFIX}.lib ONLY_IF_DIFFERENT)
file(COPY_FILE ${ICU_LOCAL_ROOT}/lib/icuuc.lib ${WEBKIT_LIB_PATH}/sicuuc${ICU_SUFFIX}.lib ONLY_IF_DIFFERENT)
endif()
# --- Configure JSC ---
message(STATUS "Configuring JSC from local WebKit source at ${WEBKIT_SOURCE_DIR}...")
set(JSC_CMAKE_ARGS
-S ${WEBKIT_SOURCE_DIR}
-B ${WEBKIT_PATH}
-G ${CMAKE_GENERATOR}
-DPORT=JSCOnly
-DENABLE_STATIC_JSC=ON
-DUSE_THIN_ARCHIVES=OFF
-DENABLE_FTL_JIT=ON
-DCMAKE_EXPORT_COMPILE_COMMANDS=ON
-DUSE_BUN_JSC_ADDITIONS=ON
-DUSE_BUN_EVENT_LOOP=ON
-DENABLE_BUN_SKIP_FAILING_ASSERTIONS=ON
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON
-DCMAKE_BUILD_TYPE=${WEBKIT_BUILD_TYPE}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DENABLE_REMOTE_INSPECTOR=ON
)
if(WIN32)
# ICU paths and Windows-specific compiler/linker settings
list(APPEND JSC_CMAKE_ARGS
-DICU_ROOT=${ICU_LOCAL_ROOT}
-DICU_LIBRARY=${ICU_LOCAL_ROOT}/lib
-DICU_INCLUDE_DIR=${ICU_LOCAL_ROOT}/include
-DCMAKE_LINKER=lld-link
)
# Static CRT and U_STATIC_IMPLEMENTATION
if(WEBKIT_BUILD_TYPE STREQUAL "Debug")
set(JSC_MSVC_RUNTIME "MultiThreadedDebug")
else()
set(JSC_MSVC_RUNTIME "MultiThreaded")
endif()
list(APPEND JSC_CMAKE_ARGS
-DCMAKE_MSVC_RUNTIME_LIBRARY=${JSC_MSVC_RUNTIME}
"-DCMAKE_C_FLAGS=/DU_STATIC_IMPLEMENTATION"
"-DCMAKE_CXX_FLAGS=/DU_STATIC_IMPLEMENTATION /clang:-fno-c++-static-destructors"
)
endif()
if(ENABLE_ASAN)
list(APPEND JSC_CMAKE_ARGS -DENABLE_SANITIZERS=address)
endif()
# Pass through ccache if available
if(CMAKE_C_COMPILER_LAUNCHER)
list(APPEND JSC_CMAKE_ARGS -DCMAKE_C_COMPILER_LAUNCHER=${CMAKE_C_COMPILER_LAUNCHER})
endif()
if(CMAKE_CXX_COMPILER_LAUNCHER)
list(APPEND JSC_CMAKE_ARGS -DCMAKE_CXX_COMPILER_LAUNCHER=${CMAKE_CXX_COMPILER_LAUNCHER})
endif()
execute_process(
COMMAND ${CMAKE_COMMAND} ${JSC_CMAKE_ARGS}
RESULT_VARIABLE JSC_CONFIGURE_RESULT
)
if(NOT JSC_CONFIGURE_RESULT EQUAL 0)
message(FATAL_ERROR "Failed to configure JSC (exit code: ${JSC_CONFIGURE_RESULT}). "
"Check the output above for errors.")
endif()
if(WIN32)
set(JSC_BYPRODUCTS
${WEBKIT_LIB_PATH}/JavaScriptCore.lib
${WEBKIT_LIB_PATH}/WTF.lib
${WEBKIT_LIB_PATH}/bmalloc.lib
)
else()
set(JSC_BYPRODUCTS
${WEBKIT_LIB_PATH}/libJavaScriptCore.a
${WEBKIT_LIB_PATH}/libWTF.a
${WEBKIT_LIB_PATH}/libbmalloc.a
)
endif()
if(WIN32)
add_custom_target(jsc ALL
COMMAND ${CMAKE_COMMAND} --build ${WEBKIT_PATH} --config ${WEBKIT_BUILD_TYPE} --target jsc
BYPRODUCTS ${JSC_BYPRODUCTS}
COMMENT "Building JSC (${WEBKIT_PATH})"
)
else()
add_custom_target(jsc ALL
COMMAND ${CMAKE_COMMAND} --build ${WEBKIT_PATH} --config ${WEBKIT_BUILD_TYPE} --target jsc
BYPRODUCTS ${JSC_BYPRODUCTS}
COMMENT "Building JSC (${WEBKIT_PATH})"
USES_TERMINAL
)
endif()
include_directories(
${WEBKIT_PATH}
${WEBKIT_PATH}/JavaScriptCore/Headers
${WEBKIT_PATH}/JavaScriptCore/Headers/JavaScriptCore
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders
${WEBKIT_PATH}/bmalloc/Headers
${WEBKIT_PATH}/WTF/Headers
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders/JavaScriptCore
)
# On Windows, add ICU headers from the local ICU build
if(WIN32)
include_directories(${ICU_LOCAL_ROOT}/include)
endif()
# After this point, only prebuilt WebKit is supported


@@ -7,9 +7,9 @@ Bytecode caching is a build-time optimization that dramatically improves applica
## Usage
### Basic usage
### Basic usage (CommonJS)
Enable bytecode caching with the `--bytecode` flag:
Enable bytecode caching with the `--bytecode` flag. Without `--format`, this defaults to CommonJS:
```bash terminal icon="terminal"
bun build ./index.ts --target=bun --bytecode --outdir=./dist
@@ -17,7 +17,7 @@ bun build ./index.ts --target=bun --bytecode --outdir=./dist
This generates two files:
- `dist/index.js` - Your bundled JavaScript
- `dist/index.js` - Your bundled JavaScript (CommonJS)
- `dist/index.jsc` - The bytecode cache file
At runtime, Bun automatically detects and uses the `.jsc` file:
@@ -28,14 +28,24 @@ bun ./dist/index.js # Automatically uses index.jsc
### With standalone executables
When creating executables with `--compile`, bytecode is embedded into the binary:
When creating executables with `--compile`, bytecode is embedded into the binary. Both ESM and CommonJS formats are supported:
```bash terminal icon="terminal"
# ESM (requires --compile)
bun build ./cli.ts --compile --bytecode --format=esm --outfile=mycli
# CommonJS (works with or without --compile)
bun build ./cli.ts --compile --bytecode --outfile=mycli
```
The resulting executable contains both the code and bytecode, giving you maximum performance in a single file.
### ESM bytecode
ESM bytecode requires `--compile` because Bun embeds module metadata (import/export information) in the compiled binary. This metadata allows the JavaScript engine to skip parsing entirely at runtime.
Without `--compile`, ESM bytecode would still require parsing the source to analyze module dependencies—defeating the purpose of bytecode caching.
### Combining with other optimizations
Bytecode works great with minification and source maps:
@@ -90,35 +100,9 @@ Larger applications benefit more because they have more code to parse.
- ❌ **Code that runs once**
- ❌ **Development builds**
- ❌ **Size-constrained environments**
- ❌ **Code with top-level await** (not supported)
## Limitations
### CommonJS only
Bytecode caching currently works with CommonJS output format. Bun's bundler automatically converts most ESM code to CommonJS, but **top-level await** is the exception:
```js
// This prevents bytecode caching
const data = await fetch("https://api.example.com");
export default data;
```
**Why**: Top-level await requires async module evaluation, which can't be represented in CommonJS. The module graph becomes asynchronous, and the CommonJS wrapper function model breaks down.
**Workaround**: Move async initialization into a function:
```js
async function init() {
const data = await fetch("https://api.example.com");
return data;
}
export default init;
```
Now the module exports a function that the consumer can await when needed.
### Version compatibility
Bytecode is **not portable across Bun versions**. The bytecode format is tied to JavaScriptCore's internal representation, which changes between versions.
@@ -236,8 +220,6 @@ It's normal for it to log a cache miss multiple times since Bun doesn't curre
- Compressing `.jsc` files for network transfer (gzip/brotli)
- Evaluating if the startup performance gain is worth the size increase
**Top-level await**: Not supported. Refactor to use async initialization functions.
## What is bytecode?
When you run JavaScript, the JavaScript engine doesn't execute your source code directly. Instead, it goes through several steps:


@@ -322,10 +322,7 @@ Using bytecode compilation, `tsc` starts 2x faster:
Bytecode compilation moves parsing overhead for large input files from runtime to bundle time. Your app starts faster, in exchange for making the `bun build` command a little slower. It doesn't obscure source code.
<Warning>
**Experimental:** Bytecode compilation is an experimental feature. Only `cjs` format is supported (which means no
top-level-await). Let us know if you run into any issues!
</Warning>
<Note>Bytecode compilation supports both `cjs` and `esm` formats when used with `--compile`.</Note>
### What do these flags do?


@@ -1508,22 +1508,43 @@ BuildArtifact (entry-point) {
## Bytecode
The `bytecode: boolean` option can be used to generate bytecode for any JavaScript/TypeScript entrypoints. This can greatly improve startup times for large applications. Only supported for `"cjs"` format, only supports `"target": "bun"` and dependent on a matching version of Bun. This adds a corresponding `.jsc` file for each entrypoint.
The `bytecode: boolean` option can be used to generate bytecode for any JavaScript/TypeScript entrypoints. This can greatly improve startup times for large applications. Requires `"target": "bun"` and is dependent on a matching version of Bun.
- **CommonJS**: Works with or without `compile: true`. Generates a `.jsc` file alongside each entrypoint.
- **ESM**: Requires `compile: true`. Bytecode and module metadata are embedded in the standalone executable.
Without an explicit `format`, bytecode defaults to CommonJS.
<Tabs>
<Tab title="JavaScript">
```ts title="build.ts" icon="/icons/typescript.svg"
// CommonJS bytecode (generates .jsc files)
await Bun.build({
entrypoints: ["./index.tsx"],
outdir: "./out",
bytecode: true,
})
// ESM bytecode (requires compile)
await Bun.build({
entrypoints: ["./index.tsx"],
outfile: "./mycli",
bytecode: true,
format: "esm",
compile: true,
})
```
</Tab>
<Tab title="CLI">
```bash terminal icon="terminal"
# CommonJS bytecode
bun build ./index.tsx --outdir ./out --bytecode
# ESM bytecode (requires --compile)
bun build ./index.tsx --outfile ./mycli --bytecode --format=esm --compile
```
</Tab>
</Tabs>
@@ -1690,7 +1711,10 @@ interface BuildConfig {
* start times, but will make the final output larger and slightly increase
* memory usage.
*
* Bytecode is currently only supported for CommonJS (`format: "cjs"`).
* - CommonJS: works with or without `compile: true`
* - ESM: requires `compile: true`
*
* Without an explicit `format`, defaults to CommonJS.
*
* Must be `target: "bun"`
* @default false


@@ -97,6 +97,31 @@ Filters respect your [workspace configuration](/pm/workspaces): If you have a `p
bun run --filter foo myscript
```
### Parallel and sequential mode
Combine `--filter` or `--workspaces` with `--parallel` or `--sequential` to run scripts across workspace packages with Foreman-style prefixed output:
```bash terminal icon="terminal"
# Run "build" in all matching packages concurrently
bun run --parallel --filter '*' build
# Run "build" in all workspace packages sequentially
bun run --sequential --workspaces build
# Run glob-matched scripts across all packages
bun run --parallel --filter '*' "build:*"
# Continue running even if one package's script fails
bun run --parallel --no-exit-on-error --filter '*' test
# Run multiple scripts across all packages
bun run --parallel --filter '*' build lint
```
Each line of output is prefixed with the package and script name (e.g. `pkg-a:build | ...`). Without `--filter`/`--workspaces`, the prefix is just the script name (e.g. `build | ...`). When a package's `package.json` has no `name` field, the relative path from the workspace root is used instead.
Use `--if-present` with `--workspaces` to skip packages that don't have the requested script instead of erroring.
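
The prefixing rules above can be sketched as a small helper. This is an illustrative sketch only; none of these names come from Bun's implementation:

```typescript
// Illustrative sketch of the output-prefix rules described above;
// the function and option names here are hypothetical.
function outputPrefix(opts: {
  script: string;
  packageName?: string; // "name" field from package.json, if present
  relativePath?: string; // fallback when the package has no name
  usedFilter: boolean; // --filter or --workspaces was passed
}): string {
  if (!opts.usedFilter) return opts.script; // "build | ..."
  const pkg = opts.packageName ?? opts.relativePath ?? "";
  return `${pkg}:${opts.script}`; // "pkg-a:build | ..."
}

console.log(outputPrefix({ script: "build", packageName: "pkg-a", usedFilter: true })); // "pkg-a:build"
```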
### Dependency Order
Bun will respect package dependency order when running scripts. Say you have a package `foo` that depends on another package `bar` in your workspace, and both packages have a `build` script. When you run `bun --filter '*' build`, you will notice that `foo` will only start running once `bar` is done.


@@ -266,18 +266,13 @@ git clone https://github.com/oven-sh/WebKit vendor/WebKit
# Check out the commit hash specified in `set(WEBKIT_VERSION <commit_hash>)` in cmake/tools/SetupWebKit.cmake
git -C vendor/WebKit checkout <commit_hash>
# Make a debug build of JSC. This will output build artifacts in ./vendor/WebKit/WebKitBuild/Debug
# Optionally, you can use `bun run jsc:build` for a release build
bun run jsc:build:debug && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
# After an initial run of `make jsc-debug`, you can rebuild JSC with:
cmake --build vendor/WebKit/WebKitBuild/Debug --target jsc && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
# Build bun with the local JSC build
# Build bun with the local JSC build — this automatically configures and builds JSC
bun run build:local
```
Using `bun run build:local` will build Bun in the `./build/debug-local` directory (instead of `./build/debug`), you'll have to change a couple of places to use this new directory:
`bun run build:local` handles everything: configuring JSC, building JSC, and building Bun. On subsequent runs, JSC will incrementally rebuild if any WebKit sources changed. `ninja -Cbuild/debug-local` also works after the first build, and will build Bun+JSC.
The build output goes to `./build/debug-local` (instead of `./build/debug`), so you'll need to update a couple of places:
- The first line in `src/js/builtins.d.ts`
- The `CompilationDatabase` line in `.clangd` config should be `CompilationDatabase: build/debug-local`
@@ -288,7 +283,7 @@ Note that the WebKit folder, including build artifacts, is 8GB+ in size.
If you are using a JSC debug build and using VScode, make sure to run the `C/C++: Select a Configuration` command to configure intellisense to find the debug headers.
Note that if you change make changes to our [WebKit fork](https://github.com/oven-sh/WebKit), you will also have to change `SetupWebKit.cmake` to point to the commit hash.
Note that if you make changes to our [WebKit fork](https://github.com/oven-sh/WebKit), you will also have to change `SetupWebKit.cmake` to point to the commit hash.
## Troubleshooting


@@ -50,7 +50,8 @@ bun build <entry points>
</ParamField>
<ParamField path="--format" type="string" default="esm">
Module format of the output bundle. One of <code>esm</code>, <code>cjs</code>, or <code>iife</code>
Module format of the output bundle. One of <code>esm</code>, <code>cjs</code>, or <code>iife</code>. Defaults to{" "}
<code>cjs</code> when <code>--bytecode</code> is used.
</ParamField>
### File Naming


@@ -40,6 +40,18 @@ bun run <file or script>
Run a script in all workspace packages (from the <code>workspaces</code> field in <code>package.json</code>)
</ParamField>
<ParamField path="--parallel" type="boolean">
Run multiple scripts or workspace scripts concurrently with prefixed output
</ParamField>
<ParamField path="--sequential" type="boolean">
Run multiple scripts or workspace scripts one after another with prefixed output
</ParamField>
<ParamField path="--no-exit-on-error" type="boolean">
When using <code>--parallel</code> or <code>--sequential</code>, continue running other scripts when one fails
</ParamField>
### Runtime &amp; Process Control
<ParamField path="--bun" type="boolean">


@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.3.8",
"version": "1.3.9",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"


@@ -2433,12 +2433,13 @@ declare module "bun" {
type SIMD = "baseline" | "modern";
type CompileTarget =
| `bun-darwin-${Architecture}`
| `bun-darwin-x64-${SIMD}`
| `bun-darwin-${Architecture}-${SIMD}`
| `bun-linux-${Architecture}`
| `bun-linux-${Architecture}-${Libc}`
| `bun-linux-${Architecture}-${SIMD}`
| `bun-linux-${Architecture}-${SIMD}-${Libc}`
| "bun-windows-x64"
| `bun-windows-x64-${SIMD}`
| `bun-linux-x64-${SIMD}-${Libc}`;
| `bun-windows-x64-${SIMD}`;
}
/**
@@ -2593,7 +2594,10 @@ declare module "bun" {
* start times, but will make the final output larger and slightly increase
* memory usage.
*
* Bytecode is currently only supported for CommonJS (`format: "cjs"`).
* - CommonJS: works with or without `compile: true`
* - ESM: requires `compile: true`
*
* Without an explicit `format`, defaults to CommonJS.
*
* Must be `target: "bun"`
* @default false
@@ -5678,7 +5682,7 @@ declare module "bun" {
*
* This will apply to all sockets from the same {@link Listener}. It is per-socket only for {@link Bun.connect}.
*/
reload(handler: SocketHandler): void;
reload(options: Pick<SocketOptions<Data>, "socket">): void;
/**
* Get the server that created this socket
@@ -6021,7 +6025,7 @@ declare module "bun" {
stop(closeActiveConnections?: boolean): void;
ref(): void;
unref(): void;
reload(options: Pick<Partial<SocketOptions>, "socket">): void;
reload(options: Pick<SocketOptions<Data>, "socket">): void;
data: Data;
}
interface TCPSocketListener<Data = unknown> extends SocketListener<Data> {


@@ -204,26 +204,38 @@ namespace uWS {
}
// do we have data to emit all?
if (data.length() >= chunkSize(state)) {
unsigned int remaining = chunkSize(state);
if (data.length() >= remaining) {
// emit all but 2 bytes then reset state to 0 and goto beginning
// not fin
std::string_view emitSoon;
bool shouldEmit = false;
if (chunkSize(state) > 2) {
emitSoon = std::string_view(data.data(), chunkSize(state) - 2);
shouldEmit = true;
// Validate the chunk terminator (\r\n) accounting for partial reads
switch (remaining) {
default:
// remaining > 2: emit data and validate full terminator
emitSoon = std::string_view(data.data(), remaining - 2);
shouldEmit = true;
[[fallthrough]];
case 2:
// remaining >= 2: validate both \r and \n
if (data[remaining - 2] != '\r' || data[remaining - 1] != '\n') {
state = STATE_IS_ERROR;
return std::nullopt;
}
break;
case 1:
// remaining == 1: only \n left to validate
if (data[0] != '\n') {
state = STATE_IS_ERROR;
return std::nullopt;
}
break;
case 0:
// remaining == 0: terminator already consumed
break;
}
// Validate that the chunk terminator is \r\n to prevent request smuggling
// The last 2 bytes of the chunk must be exactly \r\n
// Note: chunkSize always includes +2 for the terminator (added in consumeHexNumber),
// and chunks with size 0 (chunkSize == 2) are handled earlier at line 190.
// Therefore chunkSize >= 3 here, so no underflow is possible.
size_t terminatorOffset = chunkSize(state) - 2;
if (data[terminatorOffset] != '\r' || data[terminatorOffset + 1] != '\n') {
state = STATE_IS_ERROR;
return std::nullopt;
}
data.remove_prefix(chunkSize(state));
data.remove_prefix(remaining);
state = STATE_IS_CHUNKED;
if (shouldEmit) {
return emitSoon;
@@ -232,19 +244,45 @@ namespace uWS {
} else {
/* We will consume all our input data */
std::string_view emitSoon;
if (chunkSize(state) > 2) {
uint64_t maximalAppEmit = chunkSize(state) - 2;
if (data.length() > maximalAppEmit) {
unsigned int size = chunkSize(state);
size_t len = data.length();
if (size > 2) {
uint64_t maximalAppEmit = size - 2;
if (len > maximalAppEmit) {
emitSoon = data.substr(0, maximalAppEmit);
// Validate terminator bytes being consumed
size_t terminatorBytesConsumed = len - maximalAppEmit;
if (terminatorBytesConsumed >= 1 && data[maximalAppEmit] != '\r') {
state = STATE_IS_ERROR;
return std::nullopt;
}
if (terminatorBytesConsumed >= 2 && data[maximalAppEmit + 1] != '\n') {
state = STATE_IS_ERROR;
return std::nullopt;
}
} else {
//cb(data);
emitSoon = data;
}
} else if (size == 2) {
// Only terminator bytes remain, validate what we have
if (len >= 1 && data[0] != '\r') {
state = STATE_IS_ERROR;
return std::nullopt;
}
if (len >= 2 && data[1] != '\n') {
state = STATE_IS_ERROR;
return std::nullopt;
}
} else if (size == 1) {
// Only \n remains
if (data[0] != '\n') {
state = STATE_IS_ERROR;
return std::nullopt;
}
}
decChunkSize(state, (unsigned int) data.length());
decChunkSize(state, (unsigned int) len);
state |= STATE_IS_CHUNKED;
// new: decrease data by its size (bug)
data.remove_prefix(data.length()); // new bug fix for getNextChunk
data.remove_prefix(len);
if (emitSoon.length()) {
return emitSoon;
} else {
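
The terminator rule this patch enforces (after each chunk's payload, the next two bytes must be exactly `\r\n`, or the parser errors out to prevent request smuggling) can be modeled with a simplified sketch. This illustrates the rule only; the real uWS parser is an incremental state machine that also handles partial reads:

```typescript
// Simplified model of HTTP/1.1 chunked-body parsing with the strict
// terminator check this patch adds: after each chunk's payload the next
// two bytes must be exactly "\r\n", otherwise parsing fails.
// Illustrative only; not the uWS implementation.
function parseChunkedBody(raw: string): string {
  let i = 0;
  let out = "";
  for (;;) {
    const lineEnd = raw.indexOf("\r\n", i);
    if (lineEnd < 0) throw new Error("malformed chunk-size line");
    const size = parseInt(raw.slice(i, lineEnd), 16);
    if (Number.isNaN(size)) throw new Error("bad chunk size");
    i = lineEnd + 2;
    if (size === 0) return out; // terminal "0" chunk
    out += raw.slice(i, i + size);
    i += size;
    // The fix: validate both terminator bytes instead of blindly skipping them.
    if (raw.slice(i, i + 2) !== "\r\n") throw new Error("bad chunk terminator");
    i += 2;
  }
}

console.log(parseChunkedBody("4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n")); // "Wikipedia"
```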


@@ -1,5 +1,5 @@
#!/bin/sh
# Version: 26
# Version: 27
# A script that installs the dependencies needed to build and test Bun.
# This should work on macOS and Linux with a POSIX shell.
@@ -1061,6 +1061,11 @@ install_build_essentials() {
go \
xz
install_packages apache2-utils
# QEMU user-mode for baseline CPU verification in CI
case "$arch" in
x64) install_packages qemu-x86_64 ;;
aarch64) install_packages qemu-aarch64 ;;
esac
;;
esac

scripts/verify-baseline-cpu.sh Executable file

@@ -0,0 +1,100 @@
#!/usr/bin/env bash
set -euo pipefail
# Verify that a Bun binary doesn't use CPU instructions beyond its baseline target.
# Uses QEMU user-mode emulation with restricted CPU features.
# Any illegal instruction (SIGILL) causes exit code 132 and fails the build.
#
# QEMU must be pre-installed in the CI image (see .buildkite/Dockerfile and
# scripts/bootstrap.sh).
ARCH=""
BINARY=""
while [[ $# -gt 0 ]]; do
case $1 in
--arch) ARCH="$2"; shift 2 ;;
--binary) BINARY="$2"; shift 2 ;;
*) echo "Unknown arg: $1"; exit 1 ;;
esac
done
if [ -z "$ARCH" ] || [ -z "$BINARY" ]; then
echo "Usage: $0 --arch <x64|aarch64> --binary <path>"
exit 1
fi
if [ ! -f "$BINARY" ]; then
echo "ERROR: Binary not found: $BINARY"
exit 1
fi
# Select QEMU binary and CPU model
HOST_ARCH=$(uname -m)
if [ "$ARCH" = "x64" ]; then
QEMU_BIN="qemu-x86_64"
if [ -f "/usr/bin/qemu-x86_64-static" ]; then
QEMU_BIN="qemu-x86_64-static"
fi
QEMU_CPU="Nehalem"
CPU_DESC="Nehalem (SSE4.2, no AVX/AVX2/AVX512)"
elif [ "$ARCH" = "aarch64" ]; then
QEMU_BIN="qemu-aarch64"
if [ -f "/usr/bin/qemu-aarch64-static" ]; then
QEMU_BIN="qemu-aarch64-static"
fi
# cortex-a53 is ARMv8.0-A (no LSE atomics, no SVE). It's the most widely
# supported ARMv8.0 model across QEMU versions.
QEMU_CPU="cortex-a53"
CPU_DESC="Cortex-A53 (ARMv8.0-A+CRC, no LSE/SVE)"
else
echo "ERROR: Unknown arch: $ARCH"
exit 1
fi
if ! command -v "$QEMU_BIN" &>/dev/null; then
echo "ERROR: $QEMU_BIN not found. It must be pre-installed in the CI image."
exit 1
fi
BINARY_NAME=$(basename "$BINARY")
echo "--- Verifying $BINARY_NAME on $CPU_DESC"
echo " Binary: $BINARY"
echo " QEMU: $QEMU_BIN -cpu $QEMU_CPU"
echo " Host: $HOST_ARCH"
echo ""
run_test() {
local label="$1"
shift
echo "+++ $BINARY_NAME: $label"
if "$QEMU_BIN" -cpu "$QEMU_CPU" "$@"; then
echo " PASS"
return 0
else
local exit_code=$?
echo ""
if [ $exit_code -eq 132 ]; then
echo " FAIL: Illegal instruction (SIGILL)"
echo ""
echo " The $BINARY_NAME binary uses CPU instructions not available on $QEMU_CPU."
if [ "$ARCH" = "x64" ]; then
echo " The baseline x64 build targets Nehalem (SSE4.2)."
echo " AVX, AVX2, and AVX512 instructions are not allowed."
else
echo " The aarch64 build targets Cortex-A53 (ARMv8.0-A+CRC)."
echo " LSE atomics, SVE, and dotprod instructions are not allowed."
fi
else
echo " FAIL: exit code $exit_code"
fi
exit $exit_code
fi
}
run_test "bun --version" "$BINARY" --version
run_test "bun -e eval" "$BINARY" -e "console.log(JSON.stringify({ok:1+1}))"
echo ""
echo " All checks passed for $BINARY_NAME on $QEMU_CPU."
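
The exit code 132 checked above follows the usual shell convention of reporting a signal death as 128 + signal number; SIGILL is signal 4. A small illustrative sketch of that decoding (the helper name is hypothetical):

```typescript
// Shell convention: a process killed by signal N is reported with exit
// status 128 + N, so SIGILL (signal 4) shows up as exit code 132 — the
// value the script above keys on. Illustrative sketch only.
const SIGNAL_NAMES: Record<number, string> = {
  4: "SIGILL",
  6: "SIGABRT",
  9: "SIGKILL",
  11: "SIGSEGV",
};

function signalFromExitCode(code: number): string | null {
  if (code <= 128) return null; // normal exit, not a signal death
  const n = code - 128;
  return SIGNAL_NAMES[n] ?? `signal ${n}`;
}

console.log(signalFromExitCode(132)); // "SIGILL"
```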

scripts/verify-jit-stress-qemu.sh Executable file

@@ -0,0 +1,148 @@
#!/usr/bin/env bash
set -euo pipefail
# Run JSC JIT stress tests under QEMU to verify that JIT-compiled code
# doesn't use CPU instructions beyond the baseline target.
#
# This script exercises all JIT tiers (DFG, FTL, Wasm BBQ/OMG) and catches
# cases where JIT-generated code emits AVX instructions on x64 or LSE
# atomics on aarch64.
#
# See: test/js/bun/jsc-stress/ for the test fixtures.
ARCH=""
BINARY=""
while [[ $# -gt 0 ]]; do
case $1 in
--arch) ARCH="$2"; shift 2 ;;
--binary) BINARY="$2"; shift 2 ;;
*) echo "Unknown arg: $1"; exit 1 ;;
esac
done
if [ -z "$ARCH" ] || [ -z "$BINARY" ]; then
echo "Usage: $0 --arch <x64|aarch64> --binary <path>"
exit 1
fi
if [ ! -f "$BINARY" ]; then
echo "ERROR: Binary not found: $BINARY"
exit 1
fi
# Convert to absolute path for use after pushd
BINARY="$(cd "$(dirname "$BINARY")" && pwd)/$(basename "$BINARY")"
# Select QEMU binary and CPU model
if [ "$ARCH" = "x64" ]; then
QEMU_BIN="qemu-x86_64"
if [ -f "/usr/bin/qemu-x86_64-static" ]; then
QEMU_BIN="qemu-x86_64-static"
fi
QEMU_CPU="Nehalem"
CPU_DESC="Nehalem (SSE4.2, no AVX/AVX2/AVX512)"
elif [ "$ARCH" = "aarch64" ]; then
QEMU_BIN="qemu-aarch64"
if [ -f "/usr/bin/qemu-aarch64-static" ]; then
QEMU_BIN="qemu-aarch64-static"
fi
QEMU_CPU="cortex-a53"
CPU_DESC="Cortex-A53 (ARMv8.0-A+CRC, no LSE/SVE)"
else
echo "ERROR: Unknown arch: $ARCH"
exit 1
fi
if ! command -v "$QEMU_BIN" &>/dev/null; then
echo "ERROR: $QEMU_BIN not found. It must be pre-installed in the CI image."
exit 1
fi
BINARY_NAME=$(basename "$BINARY")
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
FIXTURES_DIR="$REPO_ROOT/test/js/bun/jsc-stress/fixtures"
WASM_FIXTURES_DIR="$FIXTURES_DIR/wasm"
PRELOAD_PATH="$REPO_ROOT/test/js/bun/jsc-stress/preload.js"
echo "--- Running JSC JIT stress tests on $CPU_DESC"
echo " Binary: $BINARY"
echo " QEMU: $QEMU_BIN -cpu $QEMU_CPU"
echo ""
SIGILL_FAILURES=0
OTHER_FAILURES=0
PASSED=0
run_fixture() {
local fixture="$1"
local fixture_name
fixture_name=$(basename "$fixture")
echo "+++ $fixture_name"
if "$QEMU_BIN" -cpu "$QEMU_CPU" "$BINARY" --preload "$PRELOAD_PATH" "$fixture" 2>&1; then
echo " PASS"
((PASSED++))
return 0
else
local exit_code=$?
if [ $exit_code -eq 132 ]; then
echo " FAIL: Illegal instruction (SIGILL)"
echo ""
echo " JIT-compiled code in $fixture_name uses CPU instructions not available on $QEMU_CPU."
if [ "$ARCH" = "x64" ]; then
echo " The baseline x64 build targets Nehalem (SSE4.2)."
echo " JIT must not emit AVX, AVX2, or AVX512 instructions."
else
echo " The aarch64 build targets Cortex-A53 (ARMv8.0-A+CRC)."
echo " JIT must not emit LSE atomics, SVE, or dotprod instructions."
fi
((SIGILL_FAILURES++))
else
# Non-SIGILL failures are warnings (test issues, not CPU instruction issues)
echo " WARN: exit code $exit_code (not a CPU instruction issue)"
((OTHER_FAILURES++))
fi
return $exit_code
fi
}
# Run JS fixtures (DFG/FTL)
echo "--- JS fixtures (DFG/FTL)"
for fixture in "$FIXTURES_DIR"/*.js; do
if [ -f "$fixture" ]; then
run_fixture "$fixture" || true
fi
done
# Run Wasm fixtures (BBQ/OMG)
echo "--- Wasm fixtures (BBQ/OMG)"
for fixture in "$WASM_FIXTURES_DIR"/*.js; do
if [ -f "$fixture" ]; then
# Wasm tests need to run from the wasm fixtures directory
# because they reference .wasm files relative to the script
pushd "$WASM_FIXTURES_DIR" > /dev/null
run_fixture "$fixture" || true
popd > /dev/null
fi
done
echo ""
echo "--- Summary"
echo " Passed: $PASSED"
echo " SIGILL failures: $SIGILL_FAILURES"
echo " Other failures: $OTHER_FAILURES (warnings, not CPU instruction issues)"
echo ""
if [ $SIGILL_FAILURES -gt 0 ]; then
echo " FAILED: JIT-generated code uses unsupported CPU instructions."
exit 1
fi
if [ $OTHER_FAILURES -gt 0 ]; then
echo " Some tests failed for reasons unrelated to CPU instructions."
echo " These are warnings and do not indicate JIT instruction issues."
fi
echo " All JIT stress tests passed on $QEMU_CPU (no SIGILL)."


@@ -15,6 +15,7 @@ hash: u64 = 0,
is_executable: bool = false,
source_map_index: u32 = std.math.maxInt(u32),
bytecode_index: u32 = std.math.maxInt(u32),
module_info_index: u32 = std.math.maxInt(u32),
output_kind: jsc.API.BuildArtifact.OutputKind,
/// Relative
dest_path: []const u8 = "",
@@ -210,6 +211,7 @@ pub const Options = struct {
hash: ?u64 = null,
source_map_index: ?u32 = null,
bytecode_index: ?u32 = null,
module_info_index: ?u32 = null,
output_path: string,
source_index: Index.Optional = .none,
size: ?usize = null,
@@ -251,6 +253,7 @@ pub fn init(options: Options) OutputFile {
.hash = options.hash orelse 0,
.output_kind = options.output_kind,
.bytecode_index = options.bytecode_index orelse std.math.maxInt(u32),
.module_info_index = options.module_info_index orelse std.math.maxInt(u32),
.source_map_index = options.source_map_index orelse std.math.maxInt(u32),
.is_executable = options.is_executable,
.value = switch (options.data) {


@@ -92,6 +92,10 @@ pub const StandaloneModuleGraph = struct {
contents: Schema.StringPointer = .{},
sourcemap: Schema.StringPointer = .{},
bytecode: Schema.StringPointer = .{},
module_info: Schema.StringPointer = .{},
/// The file path used when generating bytecode (e.g., "B:/~BUN/root/app.js").
/// Must match exactly at runtime for bytecode cache hits.
bytecode_origin_path: Schema.StringPointer = .{},
encoding: Encoding = .latin1,
loader: bun.options.Loader = .file,
module_format: ModuleFormat = .none,
@@ -159,6 +163,10 @@ pub const StandaloneModuleGraph = struct {
encoding: Encoding = .binary,
wtf_string: bun.String = bun.String.empty,
bytecode: []u8 = "",
module_info: []u8 = "",
/// The file path used when generating bytecode (e.g., "B:/~BUN/root/app.js").
/// Must match exactly at runtime for bytecode cache hits.
bytecode_origin_path: []const u8 = "",
module_format: ModuleFormat = .none,
side: FileSide = .server,
@@ -333,6 +341,8 @@ pub const StandaloneModuleGraph = struct {
else
.none,
.bytecode = if (module.bytecode.length > 0) @constCast(sliceTo(raw_bytes, module.bytecode)) else &.{},
.module_info = if (module.module_info.length > 0) @constCast(sliceTo(raw_bytes, module.module_info)) else &.{},
.bytecode_origin_path = if (module.bytecode_origin_path.length > 0) sliceToZ(raw_bytes, module.bytecode_origin_path) else "",
.module_format = module.module_format,
.side = module.side,
},
@@ -382,6 +392,8 @@ pub const StandaloneModuleGraph = struct {
} else if (output_file.output_kind == .bytecode) {
// Allocate up to 256 byte alignment for bytecode
string_builder.cap += (output_file.value.buffer.bytes.len + 255) / 256 * 256 + 256;
} else if (output_file.output_kind == .module_info) {
string_builder.cap += output_file.value.buffer.bytes.len;
} else {
if (entry_point_id == null) {
if (output_file.side == null or output_file.side.? == .server) {
@@ -477,6 +489,19 @@ pub const StandaloneModuleGraph = struct {
}
};
// Embed module_info for ESM bytecode
const module_info: StringPointer = brk: {
if (output_file.module_info_index != std.math.maxInt(u32)) {
const mi_bytes = output_files[output_file.module_info_index].value.buffer.bytes;
const offset = string_builder.len;
const writable = string_builder.writable();
@memcpy(writable[0..mi_bytes.len], mi_bytes[0..mi_bytes.len]);
string_builder.len += mi_bytes.len;
break :brk StringPointer{ .offset = @truncate(offset), .length = @truncate(mi_bytes.len) };
}
break :brk .{};
};
if (comptime bun.Environment.is_canary or bun.Environment.isDebug) {
if (bun.env_var.BUN_FEATURE_FLAG_DUMP_CODE.get()) |dump_code_dir| {
const buf = bun.path_buffer_pool.get();
@@ -498,6 +523,13 @@ pub const StandaloneModuleGraph = struct {
}
}
// When there's bytecode, store the bytecode output file's path as bytecode_origin_path.
// This path was used to generate the bytecode cache and must match at runtime.
const bytecode_origin_path: StringPointer = if (output_file.bytecode_index != std.math.maxInt(u32))
string_builder.appendCountZ(output_files[output_file.bytecode_index].dest_path)
else
.{};
var module = CompiledModuleGraphFile{
.name = string_builder.fmtAppendCountZ("{s}{s}", .{
prefix,
@@ -515,6 +547,8 @@ pub const StandaloneModuleGraph = struct {
else => .none,
} else .none,
.bytecode = bytecode,
.module_info = module_info,
.bytecode_origin_path = bytecode_origin_path,
.side = switch (output_file.side orelse .server) {
.server => .server,
.client => .client,
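The hunks above embed each blob (bytecode, module_info, bytecode_origin_path) into one string-builder region and address it with an (offset, length) `StringPointer`. A minimal Python sketch of that embedding scheme follows; the helper names are illustrative only, not Bun's API:

```python
# Sketch: append blobs into one contiguous region and reference them
# with (offset, length) pairs, like string_builder + StringPointer above.
def embed(blobs):
    region = bytearray()
    pointers = []
    for blob in blobs:
        offset = len(region)
        region.extend(blob)
        pointers.append((offset, len(blob)))
    return bytes(region), pointers

def slice_to(region, pointer):
    # counterpart of sliceTo(raw_bytes, ptr) in the hunk above
    offset, length = pointer
    return region[offset:offset + length]

region, ptrs = embed([b"bytecode...", b"module-info", b""])
assert slice_to(region, ptrs[1]) == b"module-info"
assert slice_to(region, ptrs[2]) == b""  # length 0 -> empty slice
```

An empty pointer (length 0) yields an empty slice, which is why the Zig code can use `.{}` as its "no data" sentinel.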

View File

@@ -0,0 +1,513 @@
pub const RecordKind = enum(u8) {
/// var_name
declared_variable,
/// let_name
lexical_variable,
/// module_name, import_name, local_name
import_info_single,
/// module_name, import_name, local_name
import_info_single_type_script,
/// module_name, import_name = '*', local_name
import_info_namespace,
/// export_name, import_name, module_name
export_info_indirect,
/// export_name, local_name, padding (for local => indirect conversion)
export_info_local,
/// export_name, module_name
export_info_namespace,
/// module_name
export_info_star,
_,
pub fn len(record: RecordKind) !usize {
return switch (record) {
.declared_variable, .lexical_variable => 1,
.import_info_single => 3,
.import_info_single_type_script => 3,
.import_info_namespace => 3,
.export_info_indirect => 3,
.export_info_local => 3,
.export_info_namespace => 2,
.export_info_star => 1,
else => return error.InvalidRecordKind,
};
}
};
pub const Flags = packed struct(u8) {
contains_import_meta: bool = false,
is_typescript: bool = false,
_padding: u6 = 0,
};
pub const ModuleInfoDeserialized = struct {
strings_buf: []const u8,
strings_lens: []align(1) const u32,
requested_modules_keys: []align(1) const StringID,
requested_modules_values: []align(1) const ModuleInfo.FetchParameters,
buffer: []align(1) const StringID,
record_kinds: []align(1) const RecordKind,
flags: Flags,
owner: union(enum) {
module_info,
allocated_slice: struct {
slice: []const u8,
allocator: std.mem.Allocator,
},
},
pub fn deinit(self: *ModuleInfoDeserialized) void {
switch (self.owner) {
.module_info => {
const mi: *ModuleInfo = @fieldParentPtr("_deserialized", self);
mi.destroy();
},
.allocated_slice => |as| {
as.allocator.free(as.slice);
as.allocator.destroy(self);
},
}
}
inline fn eat(rem: *[]const u8, len: usize) ![]const u8 {
if (rem.*.len < len) return error.BadModuleInfo;
const res = rem.*[0..len];
rem.* = rem.*[len..];
return res;
}
inline fn eatC(rem: *[]const u8, comptime len: usize) !*const [len]u8 {
if (rem.*.len < len) return error.BadModuleInfo;
const res = rem.*[0..len];
rem.* = rem.*[len..];
return res;
}
pub fn create(source: []const u8, gpa: std.mem.Allocator) !*ModuleInfoDeserialized {
const duped = try gpa.dupe(u8, source);
errdefer gpa.free(duped);
var rem: []const u8 = duped;
const res = try gpa.create(ModuleInfoDeserialized);
errdefer gpa.destroy(res);
const record_kinds_len = std.mem.readInt(u32, try eatC(&rem, 4), .little);
const record_kinds = std.mem.bytesAsSlice(RecordKind, try eat(&rem, record_kinds_len * @sizeOf(RecordKind)));
_ = try eat(&rem, (4 - (record_kinds_len % 4)) % 4); // alignment padding
const buffer_len = std.mem.readInt(u32, try eatC(&rem, 4), .little);
const buffer = std.mem.bytesAsSlice(StringID, try eat(&rem, buffer_len * @sizeOf(StringID)));
const requested_modules_len = std.mem.readInt(u32, try eatC(&rem, 4), .little);
const requested_modules_keys = std.mem.bytesAsSlice(StringID, try eat(&rem, requested_modules_len * @sizeOf(StringID)));
const requested_modules_values = std.mem.bytesAsSlice(ModuleInfo.FetchParameters, try eat(&rem, requested_modules_len * @sizeOf(ModuleInfo.FetchParameters)));
const flags: Flags = @bitCast((try eatC(&rem, 1))[0]);
_ = try eat(&rem, 3); // alignment padding
const strings_len = std.mem.readInt(u32, try eatC(&rem, 4), .little);
const strings_lens = std.mem.bytesAsSlice(u32, try eat(&rem, strings_len * @sizeOf(u32)));
const strings_buf = rem;
res.* = .{
.strings_buf = strings_buf,
.strings_lens = strings_lens,
.requested_modules_keys = requested_modules_keys,
.requested_modules_values = requested_modules_values,
.buffer = buffer,
.record_kinds = record_kinds,
.flags = flags,
.owner = .{ .allocated_slice = .{
.slice = duped,
.allocator = gpa,
} },
};
return res;
}
/// Wrapper around `create` for use when loading from a cache (transpiler cache or standalone module graph).
/// Returns `null` instead of panicking on corrupt/truncated data.
pub fn createFromCachedRecord(source: []const u8, gpa: std.mem.Allocator) ?*ModuleInfoDeserialized {
return create(source, gpa) catch |e| switch (e) {
error.OutOfMemory => bun.outOfMemory(),
error.BadModuleInfo => null,
};
}
pub fn serialize(self: *const ModuleInfoDeserialized, writer: anytype) !void {
try writer.writeInt(u32, @truncate(self.record_kinds.len), .little);
try writer.writeAll(std.mem.sliceAsBytes(self.record_kinds));
try writer.writeByteNTimes(0, (4 - (self.record_kinds.len % 4)) % 4); // alignment padding
try writer.writeInt(u32, @truncate(self.buffer.len), .little);
try writer.writeAll(std.mem.sliceAsBytes(self.buffer));
try writer.writeInt(u32, @truncate(self.requested_modules_keys.len), .little);
try writer.writeAll(std.mem.sliceAsBytes(self.requested_modules_keys));
try writer.writeAll(std.mem.sliceAsBytes(self.requested_modules_values));
try writer.writeByte(@bitCast(self.flags));
try writer.writeByteNTimes(0, 3); // alignment padding
try writer.writeInt(u32, @truncate(self.strings_lens.len), .little);
try writer.writeAll(std.mem.sliceAsBytes(self.strings_lens));
try writer.writeAll(self.strings_buf);
}
};
const StringMapKey = enum(u32) {
_,
};
pub const StringContext = struct {
strings_buf: []const u8,
strings_lens: []const u32,
pub fn hash(_: @This(), s: []const u8) u32 {
return @as(u32, @truncate(std.hash.Wyhash.hash(0, s)));
}
pub fn eql(self: @This(), fetch_key: []const u8, item_key: StringMapKey, item_i: usize) bool {
return bun.strings.eqlLong(fetch_key, self.strings_buf[@intFromEnum(item_key)..][0..self.strings_lens[item_i]], true);
}
};
pub const ModuleInfo = struct {
    /// All strings are WTF-8; the index in the hashmap is the StringID.
gpa: std.mem.Allocator,
strings_map: std.ArrayHashMapUnmanaged(StringMapKey, void, void, true),
strings_buf: std.ArrayListUnmanaged(u8),
strings_lens: std.ArrayListUnmanaged(u32),
requested_modules: std.AutoArrayHashMap(StringID, FetchParameters),
buffer: std.ArrayListUnmanaged(StringID),
record_kinds: std.ArrayListUnmanaged(RecordKind),
flags: Flags,
exported_names: std.AutoArrayHashMapUnmanaged(StringID, void),
finalized: bool = false,
/// only initialized after .finalize() is called
_deserialized: ModuleInfoDeserialized,
pub fn asDeserialized(self: *ModuleInfo) *ModuleInfoDeserialized {
bun.assert(self.finalized);
return &self._deserialized;
}
pub const FetchParameters = enum(u32) {
none = std.math.maxInt(u32),
javascript = std.math.maxInt(u32) - 1,
webassembly = std.math.maxInt(u32) - 2,
json = std.math.maxInt(u32) - 3,
_, // host_defined: cast to StringID
pub fn hostDefined(value: StringID) FetchParameters {
return @enumFromInt(@intFromEnum(value));
}
};
pub const VarKind = enum { declared, lexical };
pub fn addVar(self: *ModuleInfo, name: StringID, kind: VarKind) !void {
switch (kind) {
.declared => try self.addDeclaredVariable(name),
.lexical => try self.addLexicalVariable(name),
}
}
fn _addRecord(self: *ModuleInfo, kind: RecordKind, data: []const StringID) !void {
bun.assert(!self.finalized);
bun.assert(data.len == kind.len() catch unreachable);
try self.record_kinds.append(self.gpa, kind);
try self.buffer.appendSlice(self.gpa, data);
}
pub fn addDeclaredVariable(self: *ModuleInfo, id: StringID) !void {
try self._addRecord(.declared_variable, &.{id});
}
pub fn addLexicalVariable(self: *ModuleInfo, id: StringID) !void {
try self._addRecord(.lexical_variable, &.{id});
}
pub fn addImportInfoSingle(self: *ModuleInfo, module_name: StringID, import_name: StringID, local_name: StringID, only_used_as_type: bool) !void {
try self._addRecord(if (only_used_as_type) .import_info_single_type_script else .import_info_single, &.{ module_name, import_name, local_name });
}
pub fn addImportInfoNamespace(self: *ModuleInfo, module_name: StringID, local_name: StringID) !void {
try self._addRecord(.import_info_namespace, &.{ module_name, try self.str("*"), local_name });
}
pub fn addExportInfoIndirect(self: *ModuleInfo, export_name: StringID, import_name: StringID, module_name: StringID) !void {
if (try self._hasOrAddExportedName(export_name)) return; // a syntax error will be emitted later in this case
try self._addRecord(.export_info_indirect, &.{ export_name, import_name, module_name });
}
pub fn addExportInfoLocal(self: *ModuleInfo, export_name: StringID, local_name: StringID) !void {
if (try self._hasOrAddExportedName(export_name)) return; // a syntax error will be emitted later in this case
try self._addRecord(.export_info_local, &.{ export_name, local_name, @enumFromInt(std.math.maxInt(u32)) });
}
pub fn addExportInfoNamespace(self: *ModuleInfo, export_name: StringID, module_name: StringID) !void {
if (try self._hasOrAddExportedName(export_name)) return; // a syntax error will be emitted later in this case
try self._addRecord(.export_info_namespace, &.{ export_name, module_name });
}
pub fn addExportInfoStar(self: *ModuleInfo, module_name: StringID) !void {
try self._addRecord(.export_info_star, &.{module_name});
}
pub fn _hasOrAddExportedName(self: *ModuleInfo, name: StringID) !bool {
if (try self.exported_names.fetchPut(self.gpa, name, {}) != null) return true;
return false;
}
pub fn create(gpa: std.mem.Allocator, is_typescript: bool) !*ModuleInfo {
const res = try gpa.create(ModuleInfo);
res.* = ModuleInfo.init(gpa, is_typescript);
return res;
}
fn init(allocator: std.mem.Allocator, is_typescript: bool) ModuleInfo {
return .{
.gpa = allocator,
.strings_map = .{},
.strings_buf = .{},
.strings_lens = .{},
.exported_names = .{},
.requested_modules = std.AutoArrayHashMap(StringID, FetchParameters).init(allocator),
.buffer = .empty,
.record_kinds = .empty,
.flags = .{ .contains_import_meta = false, .is_typescript = is_typescript },
._deserialized = undefined,
};
}
fn deinit(self: *ModuleInfo) void {
self.strings_map.deinit(self.gpa);
self.strings_buf.deinit(self.gpa);
self.strings_lens.deinit(self.gpa);
self.exported_names.deinit(self.gpa);
self.requested_modules.deinit();
self.buffer.deinit(self.gpa);
self.record_kinds.deinit(self.gpa);
}
pub fn destroy(self: *ModuleInfo) void {
const alloc = self.gpa;
self.deinit();
alloc.destroy(self);
}
pub fn str(self: *ModuleInfo, value: []const u8) !StringID {
try self.strings_buf.ensureUnusedCapacity(self.gpa, value.len);
try self.strings_lens.ensureUnusedCapacity(self.gpa, 1);
const gpres = try self.strings_map.getOrPutAdapted(self.gpa, value, StringContext{
.strings_buf = self.strings_buf.items,
.strings_lens = self.strings_lens.items,
});
if (gpres.found_existing) return @enumFromInt(@as(u32, @intCast(gpres.index)));
gpres.key_ptr.* = @enumFromInt(@as(u32, @truncate(self.strings_buf.items.len)));
gpres.value_ptr.* = {};
self.strings_buf.appendSliceAssumeCapacity(value);
self.strings_lens.appendAssumeCapacity(@as(u32, @truncate(value.len)));
return @enumFromInt(@as(u32, @intCast(gpres.index)));
}
pub fn requestModule(self: *ModuleInfo, import_record_path: StringID, fetch_parameters: FetchParameters) !void {
        // JSC only records the attributes of the first import with a given import_record_path, so only insert if the key is not already present.
const gpres = try self.requested_modules.getOrPut(import_record_path);
if (!gpres.found_existing) gpres.value_ptr.* = fetch_parameters;
}
/// Replace all occurrences of old_id with new_id in records and requested_modules.
/// Used to fix up cross-chunk import specifiers after final paths are computed.
pub fn replaceStringID(self: *ModuleInfo, old_id: StringID, new_id: StringID) void {
bun.assert(!self.finalized);
// Replace in record buffer
for (self.buffer.items) |*item| {
if (item.* == old_id) item.* = new_id;
}
// Replace in requested_modules keys (preserving insertion order)
if (self.requested_modules.getIndex(old_id)) |idx| {
self.requested_modules.keys()[idx] = new_id;
self.requested_modules.reIndex() catch {};
}
}
/// find any exports marked as 'local' that are actually 'indirect' and fix them
pub fn finalize(self: *ModuleInfo) !void {
bun.assert(!self.finalized);
var local_name_to_module_name = std.AutoArrayHashMap(StringID, struct { module_name: StringID, import_name: StringID, record_kinds_idx: usize }).init(bun.default_allocator);
defer local_name_to_module_name.deinit();
{
var i: usize = 0;
for (self.record_kinds.items, 0..) |k, idx| {
if (k == .import_info_single or k == .import_info_single_type_script) {
try local_name_to_module_name.put(self.buffer.items[i + 2], .{ .module_name = self.buffer.items[i], .import_name = self.buffer.items[i + 1], .record_kinds_idx = idx });
}
i += k.len() catch unreachable;
}
}
{
var i: usize = 0;
for (self.record_kinds.items) |*k| {
if (k.* == .export_info_local) {
if (local_name_to_module_name.get(self.buffer.items[i + 1])) |ip| {
k.* = .export_info_indirect;
self.buffer.items[i + 1] = ip.import_name;
self.buffer.items[i + 2] = ip.module_name;
// In TypeScript, the re-exported import may target a type-only
// export that was elided. Convert the import to SingleTypeScript
// so JSC tolerates it being NotFound during linking.
if (self.flags.is_typescript) {
self.record_kinds.items[ip.record_kinds_idx] = .import_info_single_type_script;
}
}
}
i += k.len() catch unreachable;
}
}
self._deserialized = .{
.strings_buf = self.strings_buf.items,
.strings_lens = self.strings_lens.items,
.requested_modules_keys = self.requested_modules.keys(),
.requested_modules_values = self.requested_modules.values(),
.buffer = self.buffer.items,
.record_kinds = self.record_kinds.items,
.flags = self.flags,
.owner = .module_info,
};
self.finalized = true;
}
};
pub const StringID = enum(u32) {
star_default = std.math.maxInt(u32),
star_namespace = std.math.maxInt(u32) - 1,
_,
};
export fn zig__renderDiff(expected_ptr: [*:0]const u8, expected_len: usize, received_ptr: [*:0]const u8, received_len: usize, globalThis: *bun.jsc.JSGlobalObject) void {
const formatter = DiffFormatter{
.received_string = received_ptr[0..received_len],
.expected_string = expected_ptr[0..expected_len],
.globalThis = globalThis,
};
bun.Output.errorWriter().print("DIFF:\n{any}\n", .{formatter}) catch {};
}
export fn zig__ModuleInfoDeserialized__toJSModuleRecord(
globalObject: *bun.jsc.JSGlobalObject,
vm: *bun.jsc.VM,
module_key: *const IdentifierArray,
source_code: *const SourceCode,
declared_variables: *VariableEnvironment,
lexical_variables: *VariableEnvironment,
res: *ModuleInfoDeserialized,
) ?*JSModuleRecord {
defer res.deinit();
var identifiers = IdentifierArray.create(res.strings_lens.len);
defer identifiers.destroy();
var offset: usize = 0;
for (0.., res.strings_lens) |index, len| {
if (res.strings_buf.len < offset + len) return null; // error!
const sub = res.strings_buf[offset..][0..len];
identifiers.setFromUtf8(index, vm, sub);
offset += len;
}
{
var i: usize = 0;
for (res.record_kinds) |k| {
if (i + (k.len() catch 0) > res.buffer.len) return null;
switch (k) {
.declared_variable => declared_variables.add(vm, identifiers, res.buffer[i]),
.lexical_variable => lexical_variables.add(vm, identifiers, res.buffer[i]),
.import_info_single, .import_info_single_type_script, .import_info_namespace, .export_info_indirect, .export_info_local, .export_info_namespace, .export_info_star => {},
else => return null,
}
i += k.len() catch unreachable; // handled above
}
}
const module_record = JSModuleRecord.create(globalObject, vm, module_key, source_code, declared_variables, lexical_variables, res.flags.contains_import_meta, res.flags.is_typescript);
for (res.requested_modules_keys, res.requested_modules_values) |reqk, reqv| {
switch (reqv) {
.none => module_record.addRequestedModuleNullAttributesPtr(identifiers, reqk),
.javascript => module_record.addRequestedModuleJavaScript(identifiers, reqk),
.webassembly => module_record.addRequestedModuleWebAssembly(identifiers, reqk),
.json => module_record.addRequestedModuleJSON(identifiers, reqk),
else => |uv| module_record.addRequestedModuleHostDefined(identifiers, reqk, @enumFromInt(@intFromEnum(uv))),
}
}
{
var i: usize = 0;
for (res.record_kinds) |k| {
if (i + (k.len() catch unreachable) > res.buffer.len) unreachable; // handled above
switch (k) {
.declared_variable, .lexical_variable => {},
.import_info_single => module_record.addImportEntrySingle(identifiers, res.buffer[i + 1], res.buffer[i + 2], res.buffer[i]),
.import_info_single_type_script => module_record.addImportEntrySingleTypeScript(identifiers, res.buffer[i + 1], res.buffer[i + 2], res.buffer[i]),
.import_info_namespace => module_record.addImportEntryNamespace(identifiers, res.buffer[i + 1], res.buffer[i + 2], res.buffer[i]),
.export_info_indirect => module_record.addIndirectExport(identifiers, res.buffer[i + 0], res.buffer[i + 1], res.buffer[i + 2]),
.export_info_local => module_record.addLocalExport(identifiers, res.buffer[i], res.buffer[i + 1]),
.export_info_namespace => module_record.addNamespaceExport(identifiers, res.buffer[i], res.buffer[i + 1]),
.export_info_star => module_record.addStarExport(identifiers, res.buffer[i]),
else => unreachable, // handled above
}
i += k.len() catch unreachable; // handled above
}
}
return module_record;
}
export fn zig__ModuleInfo__destroy(info: *ModuleInfo) void {
info.destroy();
}
const VariableEnvironment = opaque {
extern fn JSC__VariableEnvironment__add(environment: *VariableEnvironment, vm: *bun.jsc.VM, identifier_array: *IdentifierArray, identifier_index: StringID) void;
pub const add = JSC__VariableEnvironment__add;
};
const IdentifierArray = opaque {
extern fn JSC__IdentifierArray__create(len: usize) *IdentifierArray;
pub const create = JSC__IdentifierArray__create;
extern fn JSC__IdentifierArray__destroy(identifier_array: *IdentifierArray) void;
pub const destroy = JSC__IdentifierArray__destroy;
extern fn JSC__IdentifierArray__setFromUtf8(identifier_array: *IdentifierArray, n: usize, vm: *bun.jsc.VM, str: [*]const u8, len: usize) void;
pub fn setFromUtf8(self: *IdentifierArray, n: usize, vm: *bun.jsc.VM, str: []const u8) void {
JSC__IdentifierArray__setFromUtf8(self, n, vm, str.ptr, str.len);
}
};
const SourceCode = opaque {};
const JSModuleRecord = opaque {
extern fn JSC_JSModuleRecord__create(global_object: *bun.jsc.JSGlobalObject, vm: *bun.jsc.VM, module_key: *const IdentifierArray, source_code: *const SourceCode, declared_variables: *VariableEnvironment, lexical_variables: *VariableEnvironment, has_import_meta: bool, is_typescript: bool) *JSModuleRecord;
pub const create = JSC_JSModuleRecord__create;
extern fn JSC_JSModuleRecord__declaredVariables(module_record: *JSModuleRecord) *VariableEnvironment;
pub const declaredVariables = JSC_JSModuleRecord__declaredVariables;
extern fn JSC_JSModuleRecord__lexicalVariables(module_record: *JSModuleRecord) *VariableEnvironment;
pub const lexicalVariables = JSC_JSModuleRecord__lexicalVariables;
extern fn JSC_JSModuleRecord__addIndirectExport(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, export_name: StringID, import_name: StringID, module_name: StringID) void;
pub const addIndirectExport = JSC_JSModuleRecord__addIndirectExport;
extern fn JSC_JSModuleRecord__addLocalExport(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, export_name: StringID, local_name: StringID) void;
pub const addLocalExport = JSC_JSModuleRecord__addLocalExport;
extern fn JSC_JSModuleRecord__addNamespaceExport(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, export_name: StringID, module_name: StringID) void;
pub const addNamespaceExport = JSC_JSModuleRecord__addNamespaceExport;
extern fn JSC_JSModuleRecord__addStarExport(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, module_name: StringID) void;
pub const addStarExport = JSC_JSModuleRecord__addStarExport;
extern fn JSC_JSModuleRecord__addRequestedModuleNullAttributesPtr(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, module_name: StringID) void;
pub const addRequestedModuleNullAttributesPtr = JSC_JSModuleRecord__addRequestedModuleNullAttributesPtr;
extern fn JSC_JSModuleRecord__addRequestedModuleJavaScript(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, module_name: StringID) void;
pub const addRequestedModuleJavaScript = JSC_JSModuleRecord__addRequestedModuleJavaScript;
extern fn JSC_JSModuleRecord__addRequestedModuleWebAssembly(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, module_name: StringID) void;
pub const addRequestedModuleWebAssembly = JSC_JSModuleRecord__addRequestedModuleWebAssembly;
extern fn JSC_JSModuleRecord__addRequestedModuleJSON(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, module_name: StringID) void;
pub const addRequestedModuleJSON = JSC_JSModuleRecord__addRequestedModuleJSON;
extern fn JSC_JSModuleRecord__addRequestedModuleHostDefined(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, module_name: StringID, host_defined_import_type: StringID) void;
pub const addRequestedModuleHostDefined = JSC_JSModuleRecord__addRequestedModuleHostDefined;
extern fn JSC_JSModuleRecord__addImportEntrySingle(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, import_name: StringID, local_name: StringID, module_name: StringID) void;
pub const addImportEntrySingle = JSC_JSModuleRecord__addImportEntrySingle;
extern fn JSC_JSModuleRecord__addImportEntrySingleTypeScript(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, import_name: StringID, local_name: StringID, module_name: StringID) void;
pub const addImportEntrySingleTypeScript = JSC_JSModuleRecord__addImportEntrySingleTypeScript;
extern fn JSC_JSModuleRecord__addImportEntryNamespace(module_record: *JSModuleRecord, identifier_array: *IdentifierArray, import_name: StringID, local_name: StringID, module_name: StringID) void;
pub const addImportEntryNamespace = JSC_JSModuleRecord__addImportEntryNamespace;
};
export fn zig_log(msg: [*:0]const u8) void {
bun.Output.errorWriter().print("{s}\n", .{std.mem.span(msg)}) catch {};
}
const bun = @import("bun");
const std = @import("std");
const DiffFormatter = @import("./bun.js/test/diff_format.zig").DiffFormatter;

View File

@@ -68,6 +68,7 @@ ts_enums: TsEnumsMap = .{},
/// This is a list of named exports that may exist in a CommonJS module
/// We use this with `commonjs_at_runtime` to re-export CommonJS
has_commonjs_export_names: bool = false,
has_import_meta: bool = false,
import_meta_ref: Ref = Ref.None,
pub const CommonJSNamedExport = struct {

View File

@@ -52,7 +52,7 @@ ts_enums: Ast.TsEnumsMap = .{},
flags: BundledAst.Flags = .{},
pub const Flags = packed struct(u8) {
pub const Flags = packed struct(u16) {
// This is a list of CommonJS features. When a file uses CommonJS features,
// it's not a candidate for "flat bundling" and must be wrapped in its own
// closure.
@@ -65,6 +65,8 @@ pub const Flags = packed struct(u8) {
has_lazy_export: bool = false,
commonjs_module_exports_assigned_deoptimized: bool = false,
has_explicit_use_strict_directive: bool = false,
has_import_meta: bool = false,
_padding: u7 = 0,
};
pub const empty = BundledAst.init(Ast.empty);
@@ -116,6 +118,7 @@ pub fn toAST(this: *const BundledAst) Ast {
.has_lazy_export = this.flags.has_lazy_export,
.commonjs_module_exports_assigned_deoptimized = this.flags.commonjs_module_exports_assigned_deoptimized,
.directive = if (this.flags.has_explicit_use_strict_directive) "use strict" else null,
.has_import_meta = this.flags.has_import_meta,
};
}
@@ -168,6 +171,7 @@ pub fn init(ast: Ast) BundledAst {
.has_lazy_export = ast.has_lazy_export,
.commonjs_module_exports_assigned_deoptimized = ast.commonjs_module_exports_assigned_deoptimized,
.has_explicit_use_strict_directive = strings.eqlComptime(ast.directive orelse "", "use strict"),
.has_import_meta = ast.has_import_meta,
},
};
}

View File

@@ -6591,6 +6591,7 @@ pub fn NewParser_(
.top_level_await_keyword = p.top_level_await_keyword,
.commonjs_named_exports = p.commonjs_named_exports,
.has_commonjs_export_names = p.has_commonjs_export_names,
.has_import_meta = p.has_import_meta,
.hashbang = hashbang,
// TODO: cross-module constant inlining

View File

@@ -433,6 +433,7 @@ pub fn buildWithVm(ctx: bun.cli.Command.Context, cwd: []const u8, vm: *VirtualMa
.asset => {},
.bytecode => {},
.sourcemap => {},
.module_info => {},
.@"metafile-json", .@"metafile-markdown" => {},
}
},

View File

@@ -285,7 +285,9 @@ pub const Run = struct {
.dir = cpu_prof_opts.dir,
.md_format = cpu_prof_opts.md_format,
.json_format = cpu_prof_opts.json_format,
.interval = cpu_prof_opts.interval,
};
CPUProfiler.setSamplingInterval(cpu_prof_opts.interval);
CPUProfiler.startCPUProfiler(vm.jsc_vm);
bun.analytics.Features.cpu_profile += 1;
}

View File

@@ -694,6 +694,7 @@ pub const AsyncModule = struct {
&printer,
.esm_ascii,
mapper.get(),
null,
);
}

View File

@@ -178,6 +178,7 @@ pub fn transpileSourceCode(
var cache = jsc.RuntimeTranspilerCache{
.output_code_allocator = allocator,
.sourcemap_allocator = bun.default_allocator,
.esm_record_allocator = bun.default_allocator,
};
const old = jsc_vm.transpiler.log;
@@ -422,6 +423,10 @@ pub fn transpileSourceCode(
dumpSourceString(jsc_vm, specifier, entry.output_code.byteSlice());
}
// TODO: module_info is only needed for standalone ESM bytecode.
// For now, skip it entirely in the runtime transpiler.
const module_info: ?*analyze_transpiled_module.ModuleInfoDeserialized = null;
return ResolvedSource{
.allocator = null,
.source_code = switch (entry.output_code) {
@@ -436,6 +441,7 @@ pub fn transpileSourceCode(
.specifier = input_specifier,
.source_url = input_specifier.createIfDifferent(path.text),
.is_commonjs_module = entry.metadata.module_type == .cjs,
.module_info = module_info,
.tag = brk: {
if (entry.metadata.module_type == .cjs and source.path.isFile()) {
const actual_package_json: *PackageJSON = package_json orelse brk2: {
@@ -504,6 +510,11 @@ pub fn transpileSourceCode(
jsc_vm.resolved_count += jsc_vm.transpiler.linker.import_counter - start_count;
jsc_vm.transpiler.linker.import_counter = 0;
const is_commonjs_module = parse_result.ast.has_commonjs_export_names or parse_result.ast.exports_kind == .cjs;
// TODO: module_info is only needed for standalone ESM bytecode.
// For now, skip it entirely in the runtime transpiler.
const module_info: ?*analyze_transpiled_module.ModuleInfo = null;
var printer = source_code_printer.*;
printer.ctx.reset();
defer source_code_printer.* = printer;
@@ -516,6 +527,7 @@ pub fn transpileSourceCode(
&printer,
.esm_ascii,
mapper.get(),
module_info,
);
};
@@ -529,9 +541,12 @@ pub fn transpileSourceCode(
}
}
const module_info_deserialized: ?*anyopaque = if (module_info) |mi| @ptrCast(mi.asDeserialized()) else null;
if (jsc_vm.isWatcherEnabled()) {
var resolved_source = jsc_vm.refCountedResolvedSource(printer.ctx.written, input_specifier, path.text, null, false);
resolved_source.is_commonjs_module = parse_result.ast.has_commonjs_export_names or parse_result.ast.exports_kind == .cjs;
resolved_source.is_commonjs_module = is_commonjs_module;
resolved_source.module_info = module_info_deserialized;
return resolved_source;
}
@@ -564,7 +579,8 @@ pub fn transpileSourceCode(
},
.specifier = input_specifier,
.source_url = input_specifier.createIfDifferent(path.text),
.is_commonjs_module = parse_result.ast.has_commonjs_export_names or parse_result.ast.exports_kind == .cjs,
.is_commonjs_module = is_commonjs_module,
.module_info = module_info_deserialized,
.tag = tag,
};
},
@@ -1192,9 +1208,15 @@ pub fn fetchBuiltinModule(jsc_vm: *VirtualMachine, specifier: bun.String) !?Reso
.source_code = file.toWTFString(),
.specifier = specifier,
.source_url = specifier.dupeRef(),
// bytecode_origin_path is the path used when generating bytecode; must match for cache hits
.bytecode_origin_path = if (file.bytecode_origin_path.len > 0) bun.String.fromBytes(file.bytecode_origin_path) else bun.String.empty,
.source_code_needs_deref = false,
.bytecode_cache = if (file.bytecode.len > 0) file.bytecode.ptr else null,
.bytecode_cache_size = file.bytecode.len,
.module_info = if (file.module_info.len > 0)
analyze_transpiled_module.ModuleInfoDeserialized.createFromCachedRecord(file.module_info, bun.default_allocator)
else
null,
.is_commonjs_module = file.module_format == .cjs,
};
}
@@ -1324,6 +1346,7 @@ const string = []const u8;
const Fs = @import("../fs.zig");
const Runtime = @import("../runtime.zig");
const analyze_transpiled_module = @import("../analyze_transpiled_module.zig");
const ast = @import("../import_record.zig");
const node_module_module = @import("./bindings/NodeModuleModule.zig");
const std = @import("std");

View File

@@ -14,7 +14,8 @@
/// Version 15: Updated global defines table list.
/// Version 16: Added typeof undefined minification optimization.
/// Version 17: Removed transpiler import rewrite for bun:test. Not bumping it causes test/js/bun/http/req-url-leak.test.ts to fail with SyntaxError: Export named 'expect' not found in module 'bun:test'.
const expected_version = 17;
/// Version 18: Include ESM record (module info) with an ES Module, see #15758
const expected_version = 18;
const debug = Output.scoped(.cache, .visible);
const MINIMUM_CACHE_SIZE = 50 * 1024;
@@ -32,6 +33,7 @@ pub const RuntimeTranspilerCache = struct {
sourcemap_allocator: std.mem.Allocator,
output_code_allocator: std.mem.Allocator,
esm_record_allocator: std.mem.Allocator,
const seed = 42;
pub const Metadata = struct {
@@ -52,6 +54,10 @@ pub const RuntimeTranspilerCache = struct {
sourcemap_byte_length: u64 = 0,
sourcemap_hash: u64 = 0,
esm_record_byte_offset: u64 = 0,
esm_record_byte_length: u64 = 0,
esm_record_hash: u64 = 0,
pub const size = brk: {
var count: usize = 0;
const meta: Metadata = .{};
@@ -78,6 +84,10 @@ pub const RuntimeTranspilerCache = struct {
try writer.writeInt(u64, this.sourcemap_byte_offset, .little);
try writer.writeInt(u64, this.sourcemap_byte_length, .little);
try writer.writeInt(u64, this.sourcemap_hash, .little);
try writer.writeInt(u64, this.esm_record_byte_offset, .little);
try writer.writeInt(u64, this.esm_record_byte_length, .little);
try writer.writeInt(u64, this.esm_record_hash, .little);
}
pub fn decode(this: *Metadata, reader: anytype) !void {
@@ -102,6 +112,10 @@ pub const RuntimeTranspilerCache = struct {
this.sourcemap_byte_length = try reader.readInt(u64, .little);
this.sourcemap_hash = try reader.readInt(u64, .little);
this.esm_record_byte_offset = try reader.readInt(u64, .little);
this.esm_record_byte_length = try reader.readInt(u64, .little);
this.esm_record_hash = try reader.readInt(u64, .little);
switch (this.module_type) {
.esm, .cjs => {},
// Invalid module type
@@ -120,6 +134,7 @@ pub const RuntimeTranspilerCache = struct {
metadata: Metadata,
output_code: OutputCode = .{ .utf8 = "" },
sourcemap: []const u8 = "",
esm_record: []const u8 = "",
pub const OutputCode = union(enum) {
utf8: []const u8,
@@ -142,11 +157,14 @@ pub const RuntimeTranspilerCache = struct {
}
};
pub fn deinit(this: *Entry, sourcemap_allocator: std.mem.Allocator, output_code_allocator: std.mem.Allocator) void {
pub fn deinit(this: *Entry, sourcemap_allocator: std.mem.Allocator, output_code_allocator: std.mem.Allocator, esm_record_allocator: std.mem.Allocator) void {
this.output_code.deinit(output_code_allocator);
if (this.sourcemap.len > 0) {
sourcemap_allocator.free(this.sourcemap);
}
if (this.esm_record.len > 0) {
esm_record_allocator.free(this.esm_record);
}
}
pub fn save(
@@ -156,6 +174,7 @@ pub const RuntimeTranspilerCache = struct {
input_hash: u64,
features_hash: u64,
sourcemap: []const u8,
esm_record: []const u8,
output_code: OutputCode,
exports_kind: bun.ast.ExportsKind,
) !void {
@@ -201,10 +220,16 @@ pub const RuntimeTranspilerCache = struct {
.output_byte_offset = Metadata.size,
.output_byte_length = output_bytes.len,
.sourcemap_byte_offset = Metadata.size + output_bytes.len,
.esm_record_byte_offset = Metadata.size + output_bytes.len + sourcemap.len,
.esm_record_byte_length = esm_record.len,
};
metadata.output_hash = hash(output_bytes);
metadata.sourcemap_hash = hash(sourcemap);
if (esm_record.len > 0) {
metadata.esm_record_hash = hash(esm_record);
}
var metadata_stream = std.io.fixedBufferStream(&metadata_buf);
try metadata.encode(metadata_stream.writer());
@@ -219,20 +244,26 @@ pub const RuntimeTranspilerCache = struct {
break :brk metadata_buf[0..metadata_stream.pos];
};
const vecs: []const bun.PlatformIOVecConst = if (output_bytes.len > 0)
&.{
bun.platformIOVecConstCreate(metadata_bytes),
bun.platformIOVecConstCreate(output_bytes),
bun.platformIOVecConstCreate(sourcemap),
}
else
&.{
bun.platformIOVecConstCreate(metadata_bytes),
bun.platformIOVecConstCreate(sourcemap),
};
var vecs_buf: [4]bun.PlatformIOVecConst = undefined;
var vecs_i: usize = 0;
vecs_buf[vecs_i] = bun.platformIOVecConstCreate(metadata_bytes);
vecs_i += 1;
if (output_bytes.len > 0) {
vecs_buf[vecs_i] = bun.platformIOVecConstCreate(output_bytes);
vecs_i += 1;
}
if (sourcemap.len > 0) {
vecs_buf[vecs_i] = bun.platformIOVecConstCreate(sourcemap);
vecs_i += 1;
}
if (esm_record.len > 0) {
vecs_buf[vecs_i] = bun.platformIOVecConstCreate(esm_record);
vecs_i += 1;
}
const vecs: []const bun.PlatformIOVecConst = vecs_buf[0..vecs_i];
var position: isize = 0;
const end_position = Metadata.size + output_bytes.len + sourcemap.len;
const end_position = Metadata.size + output_bytes.len + sourcemap.len + esm_record.len;
if (bun.Environment.allow_assert) {
var total: usize = 0;
@@ -242,7 +273,7 @@ pub const RuntimeTranspilerCache = struct {
}
bun.assert(end_position == total);
}
bun.assert(end_position == @as(i64, @intCast(sourcemap.len + output_bytes.len + Metadata.size)));
bun.assert(end_position == @as(i64, @intCast(sourcemap.len + output_bytes.len + Metadata.size + esm_record.len)));
bun.sys.preallocate_file(tmpfile.fd.cast(), 0, @intCast(end_position)) catch {};
while (position < end_position) {
@@ -263,6 +294,7 @@ pub const RuntimeTranspilerCache = struct {
file: std.fs.File,
sourcemap_allocator: std.mem.Allocator,
output_code_allocator: std.mem.Allocator,
esm_record_allocator: std.mem.Allocator,
) !void {
const stat_size = try file.getEndPos();
if (stat_size < Metadata.size + this.metadata.output_byte_length + this.metadata.sourcemap_byte_length) {
@@ -338,6 +370,23 @@ pub const RuntimeTranspilerCache = struct {
this.sourcemap = sourcemap;
}
if (this.metadata.esm_record_byte_length > 0) {
const esm_record = try esm_record_allocator.alloc(u8, this.metadata.esm_record_byte_length);
errdefer esm_record_allocator.free(esm_record);
const read_bytes = try file.preadAll(esm_record, this.metadata.esm_record_byte_offset);
if (read_bytes != this.metadata.esm_record_byte_length) {
return error.MissingData;
}
if (this.metadata.esm_record_hash != 0) {
if (hash(esm_record) != this.metadata.esm_record_hash) {
return error.InvalidHash;
}
}
this.esm_record = esm_record;
}
}
};
@@ -455,6 +504,7 @@ pub const RuntimeTranspilerCache = struct {
input_stat_size: u64,
sourcemap_allocator: std.mem.Allocator,
output_code_allocator: std.mem.Allocator,
esm_record_allocator: std.mem.Allocator,
) !Entry {
var tracer = bun.perf.trace("RuntimeTranspilerCache.fromFile");
defer tracer.end();
@@ -469,6 +519,7 @@ pub const RuntimeTranspilerCache = struct {
input_stat_size,
sourcemap_allocator,
output_code_allocator,
esm_record_allocator,
);
}
@@ -479,6 +530,7 @@ pub const RuntimeTranspilerCache = struct {
input_stat_size: u64,
sourcemap_allocator: std.mem.Allocator,
output_code_allocator: std.mem.Allocator,
esm_record_allocator: std.mem.Allocator,
) !Entry {
var metadata_bytes_buf: [Metadata.size * 2]u8 = undefined;
const cache_fd = try bun.sys.open(cache_file_path.sliceAssumeZ(), bun.O.RDONLY, 0).unwrap();
@@ -510,7 +562,7 @@ pub const RuntimeTranspilerCache = struct {
return error.MismatchedFeatureHash;
}
try entry.load(file, sourcemap_allocator, output_code_allocator);
try entry.load(file, sourcemap_allocator, output_code_allocator, esm_record_allocator);
return entry;
}
@@ -527,6 +579,7 @@ pub const RuntimeTranspilerCache = struct {
input_hash: u64,
features_hash: u64,
sourcemap: []const u8,
esm_record: []const u8,
source_code: bun.String,
exports_kind: bun.ast.ExportsKind,
) !void {
@@ -566,6 +619,7 @@ pub const RuntimeTranspilerCache = struct {
input_hash,
features_hash,
sourcemap,
esm_record,
output_code,
exports_kind,
);
@@ -599,7 +653,7 @@ pub const RuntimeTranspilerCache = struct {
parser_options.hashForRuntimeTranspiler(&features_hasher, used_jsx);
this.features_hash = features_hasher.final();
this.entry = fromFile(input_hash, this.features_hash.?, source.contents.len, this.sourcemap_allocator, this.output_code_allocator) catch |err| {
this.entry = fromFile(input_hash, this.features_hash.?, source.contents.len, this.sourcemap_allocator, this.output_code_allocator, this.esm_record_allocator) catch |err| {
debug("get(\"{s}\") = {s}", .{ source.path.text, @errorName(err) });
return false;
};
@@ -615,7 +669,7 @@ pub const RuntimeTranspilerCache = struct {
if (comptime bun.Environment.isDebug) {
if (!bun_debug_restore_from_cache) {
if (this.entry) |*entry| {
entry.deinit(this.sourcemap_allocator, this.output_code_allocator);
entry.deinit(this.sourcemap_allocator, this.output_code_allocator, this.esm_record_allocator);
this.entry = null;
}
}
@@ -624,7 +678,7 @@ pub const RuntimeTranspilerCache = struct {
return this.entry != null;
}
pub fn put(this: *RuntimeTranspilerCache, output_code_bytes: []const u8, sourcemap: []const u8) void {
pub fn put(this: *RuntimeTranspilerCache, output_code_bytes: []const u8, sourcemap: []const u8, esm_record: []const u8) void {
if (comptime !bun.FeatureFlags.runtime_transpiler_cache)
@compileError("RuntimeTranspilerCache is disabled");
@@ -635,7 +689,7 @@ pub const RuntimeTranspilerCache = struct {
const output_code = bun.String.cloneLatin1(output_code_bytes);
this.output_code = output_code;
toFile(this.input_byte_length.?, this.input_hash.?, this.features_hash.?, sourcemap, output_code, this.exports_kind) catch |err| {
toFile(this.input_byte_length.?, this.input_hash.?, this.features_hash.?, sourcemap, esm_record, output_code, this.exports_kind) catch |err| {
debug("put() = {s}", .{@errorName(err)});
return;
};

View File

@@ -315,6 +315,7 @@ pub const RuntimeTranspilerStore = struct {
var cache = jsc.RuntimeTranspilerCache{
.output_code_allocator = allocator,
.sourcemap_allocator = bun.default_allocator,
.esm_record_allocator = bun.default_allocator,
};
var log = logger.Log.init(allocator);
defer {
@@ -471,6 +472,10 @@ pub const RuntimeTranspilerStore = struct {
dumpSourceString(vm, specifier, entry.output_code.byteSlice());
}
// TODO: module_info is only needed for standalone ESM bytecode.
// For now, skip it entirely in the runtime transpiler.
const module_info: ?*analyze_transpiled_module.ModuleInfoDeserialized = null;
this.resolved_source = ResolvedSource{
.allocator = null,
.source_code = switch (entry.output_code) {
@@ -483,6 +488,7 @@ pub const RuntimeTranspilerStore = struct {
},
},
.is_commonjs_module = entry.metadata.module_type == .cjs,
.module_info = module_info,
.tag = this.resolved_source.tag,
};
@@ -541,6 +547,11 @@ pub const RuntimeTranspilerStore = struct {
printer = source_code_printer.?.*;
}
const is_commonjs_module = parse_result.ast.has_commonjs_export_names or parse_result.ast.exports_kind == .cjs;
// TODO: module_info is only needed for standalone ESM bytecode.
// For now, skip it entirely in the runtime transpiler.
const module_info: ?*analyze_transpiled_module.ModuleInfo = null;
{
var mapper = vm.sourceMapHandler(&printer);
defer source_code_printer.?.* = printer;
@@ -550,7 +561,9 @@ pub const RuntimeTranspilerStore = struct {
&printer,
.esm_ascii,
mapper.get(),
module_info,
) catch |err| {
if (module_info) |mi| mi.destroy();
this.parse_error = err;
return;
};
@@ -589,7 +602,8 @@ pub const RuntimeTranspilerStore = struct {
this.resolved_source = ResolvedSource{
.allocator = null,
.source_code = source_code,
.is_commonjs_module = parse_result.ast.has_commonjs_export_names or parse_result.ast.exports_kind == .cjs,
.is_commonjs_module = is_commonjs_module,
.module_info = if (module_info) |mi| @ptrCast(mi.asDeserialized()) else null,
.tag = this.resolved_source.tag,
};
}
@@ -597,6 +611,7 @@ pub const RuntimeTranspilerStore = struct {
};
const Fs = @import("../fs.zig");
const analyze_transpiled_module = @import("../analyze_transpiled_module.zig");
const node_fallbacks = @import("../node_fallbacks.zig");
const std = @import("std");
const AsyncModule = @import("./AsyncModule.zig").AsyncModule;

View File

@@ -675,8 +675,8 @@ pub const JSBundler = struct {
if (try config.getOptionalEnum(globalThis, "format", options.Format)) |format| {
this.format = format;
if (this.bytecode and format != .cjs) {
return globalThis.throwInvalidArguments("format must be 'cjs' when bytecode is true. Eventually we'll add esm support as well.", .{});
if (this.bytecode and format != .cjs and format != .esm) {
return globalThis.throwInvalidArguments("format must be 'cjs' or 'esm' when bytecode is true.", .{});
}
}
@@ -1019,6 +1019,13 @@ pub const JSBundler = struct {
}
}
// ESM bytecode requires compile because module_info (import/export metadata)
// is only embedded in compiled binaries. Without it, JSC must still parse the
// source for module analysis even though bytecode covers evaluation - a deopt.
if (this.bytecode and this.format == .esm and this.compile == null) {
return globalThis.throwInvalidArguments("ESM bytecode requires compile: true. Use format: 'cjs' for bytecode without compile.", .{});
}
return this;
}
@@ -1717,11 +1724,12 @@ pub const BuildArtifact = struct {
@"entry-point",
sourcemap,
bytecode,
module_info,
@"metafile-json",
@"metafile-markdown",
pub fn isFileInStandaloneMode(this: OutputKind) bool {
return this != .sourcemap and this != .bytecode and this != .@"metafile-json" and this != .@"metafile-markdown";
return this != .sourcemap and this != .bytecode and this != .module_info and this != .@"metafile-json" and this != .@"metafile-markdown";
}
};

View File

@@ -84,6 +84,7 @@ pub const ProcessExitHandler = struct {
LifecycleScriptSubprocess,
ShellSubprocess,
ProcessHandle,
MultiRunProcessHandle,
SecurityScanSubprocess,
SyncProcess,
},
@@ -111,6 +112,10 @@ pub const ProcessExitHandler = struct {
const subprocess = this.ptr.as(ProcessHandle);
subprocess.onProcessExit(process, status, rusage);
},
@field(TaggedPointer.Tag, @typeName(MultiRunProcessHandle)) => {
const subprocess = this.ptr.as(MultiRunProcessHandle);
subprocess.onProcessExit(process, status, rusage);
},
@field(TaggedPointer.Tag, @typeName(ShellSubprocess)) => {
const subprocess = this.ptr.as(ShellSubprocess);
subprocess.onProcessExit(process, status, rusage);
@@ -2251,6 +2256,7 @@ pub const sync = struct {
};
const std = @import("std");
const MultiRunProcessHandle = @import("../../../cli/multi_run.zig").ProcessHandle;
const ProcessHandle = @import("../../../cli/filter_run.zig").ProcessHandle;
const bun = @import("bun");

View File

@@ -0,0 +1,337 @@
#include "root.h"
#include "JavaScriptCore/JSInternalPromise.h"
#include "JavaScriptCore/JSModuleRecord.h"
#include "JavaScriptCore/GlobalObjectMethodTable.h"
#include "JavaScriptCore/Nodes.h"
#include "JavaScriptCore/Parser.h"
#include "JavaScriptCore/ParserError.h"
#include "JavaScriptCore/SyntheticModuleRecord.h"
#include <wtf/text/MakeString.h>
#include "JavaScriptCore/JSGlobalObject.h"
#include "JavaScriptCore/ExceptionScope.h"
#include "ZigSourceProvider.h"
#include "BunAnalyzeTranspiledModule.h"
// ref: JSModuleLoader.cpp
// ref: ModuleAnalyzer.cpp
// ref: JSModuleRecord.cpp
// ref: NodesAnalyzeModule.cpp, search ::analyzeModule
#include "JavaScriptCore/ModuleAnalyzer.h"
#include "JavaScriptCore/ErrorType.h"
namespace JSC {
String dumpRecordInfo(JSModuleRecord* moduleRecord);
Identifier getFromIdentifierArray(VM& vm, Identifier* identifierArray, uint32_t n)
{
if (n == std::numeric_limits<uint32_t>::max()) {
return vm.propertyNames->starDefaultPrivateName;
}
return identifierArray[n];
}
extern "C" JSModuleRecord* zig__ModuleInfoDeserialized__toJSModuleRecord(JSGlobalObject* globalObject, VM& vm, const Identifier& module_key, const SourceCode& source_code, VariableEnvironment& declared_variables, VariableEnvironment& lexical_variables, bun_ModuleInfoDeserialized* module_info);
extern "C" void zig__renderDiff(const char* expected_ptr, size_t expected_len, const char* received_ptr, size_t received_len, JSGlobalObject* globalObject);
extern "C" Identifier* JSC__IdentifierArray__create(size_t len)
{
return new Identifier[len];
}
extern "C" void JSC__IdentifierArray__destroy(Identifier* identifier)
{
delete[] identifier;
}
extern "C" void JSC__IdentifierArray__setFromUtf8(Identifier* identifierArray, size_t n, VM& vm, char* str, size_t len)
{
identifierArray[n] = Identifier::fromString(vm, AtomString::fromUTF8(std::span<const char>(str, len)));
}
extern "C" void JSC__VariableEnvironment__add(VariableEnvironment& environment, VM& vm, Identifier* identifierArray, uint32_t index)
{
environment.add(getFromIdentifierArray(vm, identifierArray, index));
}
extern "C" VariableEnvironment* JSC_JSModuleRecord__declaredVariables(JSModuleRecord* moduleRecord)
{
return const_cast<VariableEnvironment*>(&moduleRecord->declaredVariables());
}
extern "C" VariableEnvironment* JSC_JSModuleRecord__lexicalVariables(JSModuleRecord* moduleRecord)
{
return const_cast<VariableEnvironment*>(&moduleRecord->lexicalVariables());
}
extern "C" JSModuleRecord* JSC_JSModuleRecord__create(JSGlobalObject* globalObject, VM& vm, const Identifier* moduleKey, const SourceCode& sourceCode, const VariableEnvironment& declaredVariables, const VariableEnvironment& lexicalVariables, bool hasImportMeta, bool isTypescript)
{
JSModuleRecord* result = JSModuleRecord::create(globalObject, vm, globalObject->moduleRecordStructure(), *moduleKey, sourceCode, declaredVariables, lexicalVariables, hasImportMeta ? ImportMetaFeature : 0);
result->m_isTypeScript = isTypescript;
return result;
}
extern "C" void JSC_JSModuleRecord__addIndirectExport(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t exportName, uint32_t importName, uint32_t moduleName)
{
moduleRecord->addExportEntry(JSModuleRecord::ExportEntry::createIndirect(getFromIdentifierArray(moduleRecord->vm(), identifierArray, exportName), getFromIdentifierArray(moduleRecord->vm(), identifierArray, importName), getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName)));
}
extern "C" void JSC_JSModuleRecord__addLocalExport(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t exportName, uint32_t localName)
{
moduleRecord->addExportEntry(JSModuleRecord::ExportEntry::createLocal(getFromIdentifierArray(moduleRecord->vm(), identifierArray, exportName), getFromIdentifierArray(moduleRecord->vm(), identifierArray, localName)));
}
extern "C" void JSC_JSModuleRecord__addNamespaceExport(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t exportName, uint32_t moduleName)
{
moduleRecord->addExportEntry(JSModuleRecord::ExportEntry::createNamespace(getFromIdentifierArray(moduleRecord->vm(), identifierArray, exportName), getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName)));
}
extern "C" void JSC_JSModuleRecord__addStarExport(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t moduleName)
{
moduleRecord->addStarExportEntry(getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName));
}
extern "C" void JSC_JSModuleRecord__addRequestedModuleNullAttributesPtr(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t moduleName)
{
RefPtr<ScriptFetchParameters> attributes = RefPtr<ScriptFetchParameters> {};
moduleRecord->appendRequestedModule(getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName), std::move(attributes));
}
extern "C" void JSC_JSModuleRecord__addRequestedModuleJavaScript(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t moduleName)
{
Ref<ScriptFetchParameters> attributes = ScriptFetchParameters::create(ScriptFetchParameters::Type::JavaScript);
moduleRecord->appendRequestedModule(getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName), std::move(attributes));
}
extern "C" void JSC_JSModuleRecord__addRequestedModuleWebAssembly(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t moduleName)
{
Ref<ScriptFetchParameters> attributes = ScriptFetchParameters::create(ScriptFetchParameters::Type::WebAssembly);
moduleRecord->appendRequestedModule(getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName), std::move(attributes));
}
extern "C" void JSC_JSModuleRecord__addRequestedModuleJSON(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t moduleName)
{
Ref<ScriptFetchParameters> attributes = ScriptFetchParameters::create(ScriptFetchParameters::Type::JSON);
moduleRecord->appendRequestedModule(getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName), std::move(attributes));
}
extern "C" void JSC_JSModuleRecord__addRequestedModuleHostDefined(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t moduleName, uint32_t hostDefinedImportType)
{
Ref<ScriptFetchParameters> attributes = ScriptFetchParameters::create(getFromIdentifierArray(moduleRecord->vm(), identifierArray, hostDefinedImportType).string());
moduleRecord->appendRequestedModule(getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName), std::move(attributes));
}
extern "C" void JSC_JSModuleRecord__addImportEntrySingle(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t importName, uint32_t localName, uint32_t moduleName)
{
moduleRecord->addImportEntry(JSModuleRecord::ImportEntry {
.type = JSModuleRecord::ImportEntryType::Single,
.moduleRequest = getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName),
.importName = getFromIdentifierArray(moduleRecord->vm(), identifierArray, importName),
.localName = getFromIdentifierArray(moduleRecord->vm(), identifierArray, localName),
});
}
extern "C" void JSC_JSModuleRecord__addImportEntrySingleTypeScript(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t importName, uint32_t localName, uint32_t moduleName)
{
moduleRecord->addImportEntry(JSModuleRecord::ImportEntry {
.type = JSModuleRecord::ImportEntryType::SingleTypeScript,
.moduleRequest = getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName),
.importName = getFromIdentifierArray(moduleRecord->vm(), identifierArray, importName),
.localName = getFromIdentifierArray(moduleRecord->vm(), identifierArray, localName),
});
}
extern "C" void JSC_JSModuleRecord__addImportEntryNamespace(JSModuleRecord* moduleRecord, Identifier* identifierArray, uint32_t importName, uint32_t localName, uint32_t moduleName)
{
moduleRecord->addImportEntry(JSModuleRecord::ImportEntry {
.type = JSModuleRecord::ImportEntryType::Namespace,
.moduleRequest = getFromIdentifierArray(moduleRecord->vm(), identifierArray, moduleName),
.importName = getFromIdentifierArray(moduleRecord->vm(), identifierArray, importName),
.localName = getFromIdentifierArray(moduleRecord->vm(), identifierArray, localName),
});
}
static EncodedJSValue fallbackParse(JSGlobalObject* globalObject, const Identifier& moduleKey, const SourceCode& sourceCode, JSInternalPromise* promise, JSModuleRecord* resultValue = nullptr);
extern "C" EncodedJSValue Bun__analyzeTranspiledModule(JSGlobalObject* globalObject, const Identifier& moduleKey, const SourceCode& sourceCode, JSInternalPromise* promise)
{
VM& vm = globalObject->vm();
auto scope = DECLARE_THROW_SCOPE(vm);
auto rejectWithError = [&](JSValue error) {
promise->reject(vm, globalObject, error);
return promise;
};
VariableEnvironment declaredVariables = VariableEnvironment();
VariableEnvironment lexicalVariables = VariableEnvironment();
auto provider = static_cast<Zig::SourceProvider*>(sourceCode.provider());
if (provider->m_resolvedSource.module_info == nullptr) {
dataLog("[note] module_info is null for module: ", moduleKey.utf8(), "\n");
RELEASE_AND_RETURN(scope, JSValue::encode(rejectWithError(createError(globalObject, WTF::String::fromLatin1("module_info is null")))));
}
auto moduleRecord = zig__ModuleInfoDeserialized__toJSModuleRecord(globalObject, vm, moduleKey, sourceCode, declaredVariables, lexicalVariables, static_cast<bun_ModuleInfoDeserialized*>(provider->m_resolvedSource.module_info));
// zig__ModuleInfoDeserialized__toJSModuleRecord consumes and frees the module_info.
// Null it out to prevent use-after-free via the dangling pointer.
provider->m_resolvedSource.module_info = nullptr;
if (moduleRecord == nullptr) {
RELEASE_AND_RETURN(scope, JSValue::encode(rejectWithError(createError(globalObject, WTF::String::fromLatin1("parseFromSourceCode failed")))));
}
#if BUN_DEBUG
RELEASE_AND_RETURN(scope, fallbackParse(globalObject, moduleKey, sourceCode, promise, moduleRecord));
#else
promise->resolve(globalObject, moduleRecord);
RELEASE_AND_RETURN(scope, JSValue::encode(promise));
#endif
}
static EncodedJSValue fallbackParse(JSGlobalObject* globalObject, const Identifier& moduleKey, const SourceCode& sourceCode, JSInternalPromise* promise, JSModuleRecord* resultValue)
{
VM& vm = globalObject->vm();
auto scope = DECLARE_THROW_SCOPE(vm);
auto rejectWithError = [&](JSValue error) {
promise->reject(vm, globalObject, error);
return promise;
};
ParserError error;
std::unique_ptr<ModuleProgramNode> moduleProgramNode = parseRootNode<ModuleProgramNode>(
vm, sourceCode, ImplementationVisibility::Public, JSParserBuiltinMode::NotBuiltin,
StrictModeLexicallyScopedFeature, JSParserScriptMode::Module, SourceParseMode::ModuleAnalyzeMode, error);
if (error.isValid())
RELEASE_AND_RETURN(scope, JSValue::encode(rejectWithError(error.toErrorObject(globalObject, sourceCode))));
ASSERT(moduleProgramNode);
ModuleAnalyzer moduleAnalyzer(globalObject, moduleKey, sourceCode, moduleProgramNode->varDeclarations(), moduleProgramNode->lexicalVariables(), moduleProgramNode->features());
RETURN_IF_EXCEPTION(scope, JSValue::encode(promise->rejectWithCaughtException(globalObject, scope)));
auto result = moduleAnalyzer.analyze(*moduleProgramNode);
if (!result) {
auto [errorType, message] = std::move(result.error());
RELEASE_AND_RETURN(scope, JSValue::encode(rejectWithError(createError(globalObject, errorType, message))));
}
JSModuleRecord* moduleRecord = result.value();
if (resultValue != nullptr) {
auto actual = dumpRecordInfo(resultValue);
auto expected = dumpRecordInfo(moduleRecord);
if (actual != expected) {
dataLog("\n\n\n\n\n\n\x1b[95mBEGIN analyzeTranspiledModule\x1b(B\x1b[m\n --- module key ---\n", moduleKey.utf8().data(), "\n --- code ---\n\n", sourceCode.toUTF8().data(), "\n");
dataLog(" ------", "\n");
dataLog(" BunAnalyzeTranspiledModule:", "\n");
zig__renderDiff(expected.utf8().data(), expected.utf8().length(), actual.utf8().data(), actual.utf8().length(), globalObject);
            RELEASE_AND_RETURN(scope, JSValue::encode(rejectWithError(createError(globalObject, WTF::String::fromLatin1("Module records differ between parseFromSourceCode and fallbackParse")))));
}
}
scope.release();
promise->resolve(globalObject, resultValue == nullptr ? moduleRecord : resultValue);
return JSValue::encode(promise);
}
String dumpRecordInfo(JSModuleRecord* moduleRecord)
{
WTF::StringPrintStream stream;
{
Vector<String> sortedVars;
for (const auto& pair : moduleRecord->declaredVariables())
sortedVars.append(String(pair.key.get()));
std::sort(sortedVars.begin(), sortedVars.end(), [](const String& a, const String& b) {
return codePointCompare(a, b) < 0;
});
stream.print(" varDeclarations:\n");
for (const auto& name : sortedVars)
stream.print(" - ", name, "\n");
}
{
Vector<String> sortedVars;
for (const auto& pair : moduleRecord->lexicalVariables())
sortedVars.append(String(pair.key.get()));
std::sort(sortedVars.begin(), sortedVars.end(), [](const String& a, const String& b) {
return codePointCompare(a, b) < 0;
});
stream.print(" lexicalVariables:\n");
for (const auto& name : sortedVars)
stream.print(" - ", name, "\n");
}
stream.print(" features: (not accessible)\n");
stream.print("\nAnalyzing ModuleRecord key(", moduleRecord->moduleKey().impl(), ")\n");
stream.print(" Dependencies: ", moduleRecord->requestedModules().size(), " modules\n");
{
Vector<String> sortedDeps;
for (const auto& request : moduleRecord->requestedModules()) {
WTF::StringPrintStream line;
if (request.m_attributes == nullptr)
line.print(" module(", request.m_specifier, ")\n");
else
line.print(" module(", request.m_specifier, "),attributes(", (uint8_t)request.m_attributes->type(), ", ", request.m_attributes->hostDefinedImportType(), ")\n");
sortedDeps.append(line.toString());
}
std::sort(sortedDeps.begin(), sortedDeps.end(), [](const String& a, const String& b) {
return codePointCompare(a, b) < 0;
});
for (const auto& dep : sortedDeps)
stream.print(dep);
}
stream.print(" Import: ", moduleRecord->importEntries().size(), " entries\n");
{
Vector<String> sortedImports;
for (const auto& pair : moduleRecord->importEntries()) {
WTF::StringPrintStream line;
auto& importEntry = pair.value;
line.print(" import(", importEntry.importName, "), local(", importEntry.localName, "), module(", importEntry.moduleRequest, ")\n");
sortedImports.append(line.toString());
}
std::sort(sortedImports.begin(), sortedImports.end(), [](const String& a, const String& b) {
return codePointCompare(a, b) < 0;
});
for (const auto& imp : sortedImports)
stream.print(imp);
}
stream.print(" Export: ", moduleRecord->exportEntries().size(), " entries\n");
Vector<String> sortedEntries;
for (const auto& pair : moduleRecord->exportEntries()) {
WTF::StringPrintStream line;
auto& exportEntry = pair.value;
switch (exportEntry.type) {
case AbstractModuleRecord::ExportEntry::Type::Local:
line.print(" [Local] ", "export(", exportEntry.exportName, "), local(", exportEntry.localName, ")\n");
break;
case AbstractModuleRecord::ExportEntry::Type::Indirect:
line.print(" [Indirect] ", "export(", exportEntry.exportName, "), import(", exportEntry.importName, "), module(", exportEntry.moduleName, ")\n");
break;
case AbstractModuleRecord::ExportEntry::Type::Namespace:
line.print(" [Namespace] ", "export(", exportEntry.exportName, "), module(", exportEntry.moduleName, ")\n");
break;
}
sortedEntries.append(line.toString());
}
std::sort(sortedEntries.begin(), sortedEntries.end(), [](const String& a, const String& b) {
return codePointCompare(a, b) < 0;
});
for (const auto& entry : sortedEntries)
stream.print(entry);
{
Vector<String> sortedStarExports;
for (const auto& moduleName : moduleRecord->starExportEntries()) {
WTF::StringPrintStream line;
line.print(" [Star] module(", moduleName.get(), ")\n");
sortedStarExports.append(line.toString());
}
std::sort(sortedStarExports.begin(), sortedStarExports.end(), [](const String& a, const String& b) {
return codePointCompare(a, b) < 0;
});
for (const auto& entry : sortedStarExports)
stream.print(entry);
}
stream.print(" -> done\n");
return stream.toString();
}
}

View File

@@ -0,0 +1 @@
struct bun_ModuleInfoDeserialized;

View File

@@ -19,6 +19,12 @@
extern "C" void Bun__startCPUProfiler(JSC::VM* vm);
extern "C" void Bun__stopCPUProfiler(JSC::VM* vm, BunString* outJSON, BunString* outText);
extern "C" void Bun__setSamplingInterval(int intervalMicroseconds);
void Bun__setSamplingInterval(int intervalMicroseconds)
{
Bun::setSamplingInterval(intervalMicroseconds);
}
namespace Bun {

View File

@@ -3,11 +3,17 @@ pub const CPUProfilerConfig = struct {
dir: []const u8,
md_format: bool = false,
json_format: bool = false,
interval: u32 = 1000,
};
// C++ function declarations
extern fn Bun__startCPUProfiler(vm: *jsc.VM) void;
extern fn Bun__stopCPUProfiler(vm: *jsc.VM, outJSON: ?*bun.String, outText: ?*bun.String) void;
extern fn Bun__setSamplingInterval(intervalMicroseconds: c_int) void;
pub fn setSamplingInterval(interval: u32) void {
Bun__setSamplingInterval(@intCast(interval));
}
pub fn startCPUProfiler(vm: *jsc.VM) void {
Bun__startCPUProfiler(vm);

View File

@@ -24,8 +24,16 @@ pub const ResolvedSource = extern struct {
/// This is for source_code
source_code_needs_deref: bool = true,
already_bundled: bool = false,
// -- Bytecode cache fields --
bytecode_cache: ?[*]u8 = null,
bytecode_cache_size: usize = 0,
module_info: ?*anyopaque = null,
/// The file path used as the source origin for bytecode cache validation.
/// JSC validates bytecode by checking if the origin URL matches exactly what
/// was used at build time. If empty, the origin is derived from source_url.
/// This is converted to a file:// URL on the C++ side.
bytecode_origin_path: bun.String = bun.String.empty,
pub const Tag = @import("ResolvedSourceTag").ResolvedSourceTag;
};

View File

@@ -75,6 +75,14 @@ Ref<SourceProvider> SourceProvider::create(
JSC::SourceProviderSourceType sourceType,
bool isBuiltin)
{
// Use BunTranspiledModule when module_info is present.
// This allows JSC to skip parsing during the analyze phase (uses pre-computed imports/exports).
// Bytecode cache (if present) is used separately during the evaluate phase.
if (resolvedSource.module_info != nullptr) {
ASSERT(!resolvedSource.isCommonJSModule);
sourceType = JSC::SourceProviderSourceType::BunTranspiledModule;
}
auto string = resolvedSource.source_code.toWTFString(BunString::ZeroCopy);
auto sourceURLString = resolvedSource.source_url.toWTFString(BunString::ZeroCopy);
@@ -91,6 +99,18 @@ Ref<SourceProvider> SourceProvider::create(
// https://github.com/oven-sh/bun/issues/9521
}
// Compute source origin: use explicit bytecode_origin_path if provided, otherwise derive from source_url.
// bytecode_origin_path is used for bytecode cache validation where the origin must match
// exactly what was used at build time.
const auto getSourceOrigin = [&]() -> SourceOrigin {
auto bytecodeOriginPath = resolvedSource.bytecode_origin_path.toWTFString(BunString::ZeroCopy);
if (!bytecodeOriginPath.isNull() && !bytecodeOriginPath.isEmpty()) {
// Convert file path to file:// URL (same as build time)
return SourceOrigin(WTF::URL::fileURLWithFileSystemPath(bytecodeOriginPath));
}
return toSourceOrigin(sourceURLString, isBuiltin);
};
const auto getProvider = [&]() -> Ref<SourceProvider> {
if (resolvedSource.bytecode_cache != nullptr) {
const auto destructorPtr = [](const void* ptr) {
@@ -101,13 +121,15 @@ Ref<SourceProvider> SourceProvider::create(
};
const auto destructor = resolvedSource.needsDeref ? destructorPtr : destructorNoOp;
auto origin = getSourceOrigin();
Ref<JSC::CachedBytecode> bytecode = JSC::CachedBytecode::create(std::span<uint8_t>(resolvedSource.bytecode_cache, resolvedSource.bytecode_cache_size), destructor, {});
auto provider = adoptRef(*new SourceProvider(
globalObject->isThreadLocalDefaultGlobalObject ? globalObject : nullptr,
resolvedSource,
string.isNull() ? *StringImpl::empty() : *string.impl(),
JSC::SourceTaintedOrigin::Untainted,
toSourceOrigin(sourceURLString, isBuiltin),
origin,
sourceURLString.impl(), TextPosition(),
sourceType));
provider->m_cachedBytecode = WTF::move(bytecode);
@@ -119,7 +141,7 @@ Ref<SourceProvider> SourceProvider::create(
resolvedSource,
string.isNull() ? *StringImpl::empty() : *string.impl(),
JSC::SourceTaintedOrigin::Untainted,
toSourceOrigin(sourceURLString, isBuiltin),
getSourceOrigin(),
sourceURLString.impl(), TextPosition(),
sourceType));
};
@@ -189,6 +211,8 @@ extern "C" bool generateCachedModuleByteCodeFromSourceCode(BunString* sourceProv
auto key = JSC::sourceCodeKeyForSerializedModule(vm, sourceCode);
dataLogLnIf(JSC::Options::verboseDiskCache(), "[Bytecode Build] generateModule url=", sourceProviderURL->toWTFString(), " origin=", sourceCode.provider()->sourceOrigin().url().string(), " sourceSize=", inputSourceCodeSize, " keyHash=", key.hash());
RefPtr<JSC::CachedBytecode> cachedBytecode = JSC::encodeCodeBlock(vm, key, unlinkedCodeBlock);
if (!cachedBytecode)
return false;
@@ -222,6 +246,8 @@ extern "C" bool generateCachedCommonJSProgramByteCodeFromSourceCode(BunString* s
auto key = JSC::sourceCodeKeyForSerializedProgram(vm, sourceCode);
dataLogLnIf(JSC::Options::verboseDiskCache(), "[Bytecode Build] generateCJS url=", sourceProviderURL->toWTFString(), " origin=", sourceCode.provider()->sourceOrigin().url().string(), " sourceSize=", inputSourceCodeSize, " keyHash=", key.hash());
RefPtr<JSC::CachedBytecode> cachedBytecode = JSC::encodeCodeBlock(vm, key, unlinkedCodeBlock);
if (!cachedBytecode)
return false;


@@ -116,8 +116,13 @@ typedef struct ResolvedSource {
uint32_t tag;
bool needsDeref;
bool already_bundled;
// -- Bytecode cache fields --
uint8_t* bytecode_cache;
size_t bytecode_cache_size;
void* module_info;
// File path used as source origin for bytecode cache validation.
// Converted to file:// URL. If empty, origin is derived from source_url.
BunString bytecode_origin_path;
} ResolvedSource;
static const uint32_t ResolvedSourceTagPackageJSONTypeModule = 1;
typedef union ErrorableResolvedSourceResult {


@@ -83,6 +83,7 @@
namespace WebCore {
WTF_MAKE_TZONE_ALLOCATED_IMPL(WebSocket);
extern "C" int Bun__getTLSRejectUnauthorizedValue();
extern "C" bool Bun__isNoProxy(const char* hostname, size_t hostname_len, const char* host, size_t host_len);
static ErrorEvent::Init createErrorEventInit(WebSocket& webSocket, const String& reason, JSC::JSGlobalObject* globalObject)
{
@@ -573,6 +574,19 @@ ExceptionOr<void> WebSocket::connect(const String& url, const Vector<String>& pr
// Determine connection type based on proxy usage and TLS requirements
bool hasProxy = proxyConfig.has_value();
// Check NO_PROXY even for explicitly-provided proxies
if (hasProxy) {
auto hostStr = m_url.host().toString();
auto hostWithPort = hostName(m_url, is_secure);
auto hostUtf8 = hostStr.utf8();
auto hostWithPortUtf8 = hostWithPort.utf8();
if (Bun__isNoProxy(hostUtf8.data(), hostUtf8.length(), hostWithPortUtf8.data(), hostWithPortUtf8.length())) {
proxyConfig = std::nullopt;
hasProxy = false;
}
}
bool proxyIsHTTPS = hasProxy && proxyConfig->isHTTPS;
// Connection type determines what kind of socket we use:


@@ -649,6 +649,10 @@ pub const PathLike = union(enum) {
const normal = path_handler.normalizeBuf(resolve, b, .windows);
return strings.toKernel32Path(@alignCast(std.mem.bytesAsSlice(u16, buf)), normal);
}
// Handle "." specially since normalizeStringBuf strips it to an empty string
if (s.len == 1 and s[0] == '.') {
return strings.toKernel32Path(@alignCast(std.mem.bytesAsSlice(u16, buf)), ".");
}
const normal = path_handler.normalizeStringBuf(s, b, true, .windows, false);
return strings.toKernel32Path(@alignCast(std.mem.bytesAsSlice(u16, buf)), normal);
}


@@ -158,6 +158,13 @@ export fn Bun__getTLSRejectUnauthorizedValue() i32 {
return if (jsc.VirtualMachine.get().getTLSRejectUnauthorized()) 1 else 0;
}
export fn Bun__isNoProxy(hostname_ptr: [*]const u8, hostname_len: usize, host_ptr: [*]const u8, host_len: usize) bool {
const vm = jsc.VirtualMachine.get();
const hostname: ?[]const u8 = if (hostname_len > 0) hostname_ptr[0..hostname_len] else null;
const host: ?[]const u8 = if (host_len > 0) host_ptr[0..host_len] else null;
return vm.transpiler.env.isNoProxy(hostname, host);
}
export fn Bun__setVerboseFetchValue(value: i32) void {
VirtualMachine.get().default_verbose_fetch = if (value == 1) .headers else if (value == 2) .curl else .none;
}

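`Bun__isNoProxy` defers to the env table's `isNoProxy`, whose exact matching rules live in Zig. As a rough illustration of conventional `NO_PROXY` semantics (comma-separated entries, `*` wildcard, suffix matching on the host), not Bun's exact implementation:

```javascript
// Hypothetical sketch of typical NO_PROXY matching; Bun's real logic
// is env.isNoProxy in Zig and may differ in edge cases (ports, CIDRs).
function isNoProxy(noProxy, host) {
  if (!noProxy) return false;
  const h = host.toLowerCase();
  return noProxy
    .split(",")
    .map((e) => e.trim().toLowerCase())
    .some((entry) => {
      if (entry === "*") return true;
      // ".example.com" or "example.com" match the host and its subdomains
      const suffix = entry.startsWith(".") ? entry : "." + entry;
      return h === entry || h.endsWith(suffix);
    });
}

console.log(isNoProxy("localhost,.internal", "api.internal")); // true
console.log(isNoProxy("localhost,.internal", "example.com")); // false
```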

@@ -1036,9 +1036,14 @@ pub const FetchTasklet = struct {
var proxy: ?ZigURL = null;
if (fetch_options.proxy) |proxy_opt| {
if (!proxy_opt.isEmpty()) { //if is empty just ignore proxy
proxy = fetch_options.proxy orelse jsc_vm.transpiler.env.getHttpProxyFor(fetch_options.url);
// Check NO_PROXY even for explicitly-provided proxies
if (!jsc_vm.transpiler.env.isNoProxy(fetch_options.url.hostname, fetch_options.url.host)) {
proxy = proxy_opt;
}
}
// else: proxy: "" means explicitly no proxy (direct connection)
} else {
// no proxy provided, use default proxy resolution
proxy = jsc_vm.transpiler.env.getHttpProxyFor(fetch_options.url);
}


@@ -502,6 +502,11 @@ pub const Chunk = struct {
///
/// Mutated while sorting chunks in `computeChunks`
css_chunks: []u32 = &.{},
/// Serialized ModuleInfo for ESM bytecode (--compile --bytecode --format=esm)
module_info_bytes: ?[]const u8 = null,
/// Unserialized ModuleInfo for deferred serialization (after chunk paths are resolved)
module_info: ?*analyze_transpiled_module.ModuleInfo = null,
};
pub const CssChunk = struct {
@@ -654,6 +659,7 @@ pub const ParseTask = bun.bundle_v2.ParseTask;
const string = []const u8;
const HTMLImportManifest = @import("./HTMLImportManifest.zig");
const analyze_transpiled_module = @import("../analyze_transpiled_module.zig");
const std = @import("std");
const options = @import("../options.zig");


@@ -70,6 +70,7 @@ pub const LinkerContext = struct {
css_chunking: bool = false,
source_maps: options.SourceMapOption = .none,
target: options.Target = .browser,
compile: bool = false,
metafile: bool = false,
/// Path to write JSON metafile (for Bun.build API)
metafile_json_path: []const u8 = "",


@@ -971,6 +971,7 @@ pub const BundleV2 = struct {
this.linker.options.target = transpiler.options.target;
this.linker.options.output_format = transpiler.options.output_format;
this.linker.options.generate_bytecode_cache = transpiler.options.bytecode;
this.linker.options.compile = transpiler.options.compile;
this.linker.options.metafile = transpiler.options.metafile;
this.linker.options.metafile_json_path = transpiler.options.metafile_json_path;
this.linker.options.metafile_markdown_path = transpiler.options.metafile_markdown_path;
@@ -4508,9 +4509,19 @@ pub const CrossChunkImport = struct {
};
pub const CompileResult = union(enum) {
pub const DeclInfo = struct {
pub const Kind = enum(u1) { declared, lexical };
name: []const u8,
kind: Kind,
};
javascript: struct {
source_index: Index.Int,
result: js_printer.PrintResult,
/// Top-level declarations collected from converted statements during
/// parallel printing. Used by postProcessJSChunk to populate ModuleInfo
/// without re-scanning the original (unconverted) AST.
decls: []const DeclInfo = &.{},
pub fn code(this: @This()) []const u8 {
return switch (this.result) {


@@ -167,7 +167,7 @@ pub const ServerEntryPoint = struct {
\\var hmrSymbol = Symbol("BunServerHMR");
\\var entryNamespace = start;
\\function isServerConfig(def) {{
\\ return def && def !== globalThis && (typeof def.fetch === 'function' || def.app != undefined) && typeof def.stop !== 'function';
\\ return def && def !== globalThis && (typeof def.fetch === 'function' || def.app != undefined) && typeof def.reload !== 'function';
\\}}
\\if (typeof entryNamespace?.then === 'function') {{
\\ entryNamespace = entryNamespace.then((entryNamespace) => {{
@@ -206,7 +206,7 @@ pub const ServerEntryPoint = struct {
\\import * as start from "{f}";
\\var entryNamespace = start;
\\function isServerConfig(def) {{
\\ return def && def !== globalThis && (typeof def.fetch === 'function' || def.app != undefined) && typeof def.stop !== 'function';
\\ return def && def !== globalThis && (typeof def.fetch === 'function' || def.app != undefined) && typeof def.reload !== 'function';
\\}}
\\if (typeof entryNamespace?.then === 'function') {{
\\ entryNamespace = entryNamespace.then((entryNamespace) => {{

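The two-line change above swaps the Server-instance heuristic from `stop` to `reload`. In isolation, with hypothetical stand-in objects for an Elysia-style app and a `Bun.serve()` Server instance:

```javascript
// The detection function from the entry point shim, after the fix:
function isServerConfig(def) {
  return def && def !== globalThis && (typeof def.fetch === 'function' || def.app != undefined) && typeof def.reload !== 'function';
}

// An Elysia-style app: has fetch() AND a user-defined stop() method.
const elysiaLikeApp = { fetch() {}, stop() {} };
// A Server instance returned by Bun.serve(): has fetch, stop, AND reload.
const serverLike = { fetch() {}, stop() {}, reload() {} };

console.log(isServerConfig(elysiaLikeApp)); // true  -> auto-start it
console.log(isServerConfig(serverLike)); // false -> already running
```

With the old `typeof def.stop !== 'function'` check, the Elysia-style app would have been misclassified as an already-running server and never auto-started.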

@@ -3,7 +3,7 @@
//! chunk indexing remains the same:
//!
//! 1. chunks
//! 2. sourcemaps and bytecode
//! 2. sourcemaps, bytecode, and module_info
//! 3. additional output files
//!
//! We can calculate the space ahead of time and avoid having to do something
@@ -41,7 +41,7 @@ pub fn init(
chunks: []const bun.bundle_v2.Chunk,
_: usize,
) !@This() {
const length, const source_map_and_bytecode_count = OutputFileList.calculateOutputFileListCapacity(c, chunks);
const length, const supplementary_file_count = OutputFileList.calculateOutputFileListCapacity(c, chunks);
var output_files = try std.array_list.Managed(options.OutputFile).initCapacity(
allocator,
length,
@@ -51,8 +51,8 @@ pub fn init(
return .{
.output_files = output_files,
.index_for_chunk = 0,
.index_for_sourcemaps_and_bytecode = if (source_map_and_bytecode_count == 0) null else @as(u32, @truncate(chunks.len)),
.additional_output_files_start = @as(u32, @intCast(chunks.len)) + source_map_and_bytecode_count,
.index_for_sourcemaps_and_bytecode = if (supplementary_file_count == 0) null else @as(u32, @truncate(chunks.len)),
.additional_output_files_start = @as(u32, @intCast(chunks.len)) + supplementary_file_count,
.total_insertions = 0,
};
}
@@ -94,7 +94,10 @@ pub fn calculateOutputFileListCapacity(c: *const bun.bundle_v2.LinkerContext, ch
break :bytecode_count bytecode_count;
} else 0;
return .{ @intCast(chunks.len + source_map_count + bytecode_count + c.parse_graph.additional_output_files.items.len), @intCast(source_map_count + bytecode_count) };
// module_info is generated for ESM bytecode in --compile builds
const module_info_count = if (c.options.generate_bytecode_cache and c.options.output_format == .esm and c.options.compile) bytecode_count else 0;
return .{ @intCast(chunks.len + source_map_count + bytecode_count + module_info_count + c.parse_graph.additional_output_files.items.len), @intCast(source_map_count + bytecode_count + module_info_count) };
}
pub fn insertForChunk(this: *OutputFileList, output_file: options.OutputFile) u32 {

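The capacity math above reserves three contiguous regions in the output file list: chunks first, then supplementary files (sourcemaps, bytecode, and now module_info), then additional output files. As plain arithmetic, with made-up counts:

```javascript
// Hypothetical counts illustrating the index layout in OutputFileList.
const chunks = 3, sourcemaps = 3, bytecode = 3, moduleInfo = 3, additional = 2;
const supplementary = sourcemaps + bytecode + moduleInfo;

const indexForChunk = 0; // chunks occupy [0, chunks)
const indexForSupplementary = supplementary === 0 ? null : chunks;
const additionalStart = chunks + supplementary; // additional files follow
const total = chunks + supplementary + additional;

console.log(indexForChunk, indexForSupplementary, additionalStart, total);
// -> 0 3 12 14
```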

@@ -304,6 +304,69 @@ pub fn generateChunksInParallel(
}
}
// After final_rel_path is computed for all chunks, fix up module_info
// cross-chunk import specifiers. During printing, cross-chunk imports use
// unique_key placeholders as paths. Now that final paths are known, replace
// those placeholders with the resolved paths and serialize.
if (c.options.generate_bytecode_cache and c.options.output_format == .esm and c.options.compile) {
// Build map from unique_key -> final resolved path
const b = @as(*bun.bundle_v2.BundleV2, @fieldParentPtr("linker", c));
var unique_key_to_path = bun.StringHashMap([]const u8).init(c.allocator());
defer unique_key_to_path.deinit();
for (chunks) |*ch| {
if (ch.unique_key.len > 0 and ch.final_rel_path.len > 0) {
// Use the per-chunk public_path to match what IntermediateOutput.code()
// uses during emission (browser chunks from server builds use the
// browser transpiler's public_path).
const public_path = if (ch.flags.is_browser_chunk_from_server_build)
b.transpilerForTarget(.browser).options.public_path
else
c.options.public_path;
const normalizer = bun.bundle_v2.cheapPrefixNormalizer(public_path, ch.final_rel_path);
const resolved = std.fmt.allocPrint(c.allocator(), "{s}{s}", .{ normalizer[0], normalizer[1] }) catch |err| bun.handleOom(err);
unique_key_to_path.put(ch.unique_key, resolved) catch |err| bun.handleOom(err);
}
}
// Fix up each chunk's module_info
for (chunks) |*chunk| {
if (chunk.content != .javascript) continue;
const mi = chunk.content.javascript.module_info orelse continue;
// Collect replacements first (can't modify string table while iterating)
const Replacement = struct { old_id: analyze_transpiled_module.StringID, resolved_path: []const u8 };
var replacements: std.ArrayListUnmanaged(Replacement) = .{};
defer replacements.deinit(c.allocator());
var offset: usize = 0;
for (mi.strings_lens.items, 0..) |slen, string_index| {
const len: usize = @intCast(slen);
const s = mi.strings_buf.items[offset..][0..len];
if (unique_key_to_path.get(s)) |resolved_path| {
replacements.append(c.allocator(), .{
.old_id = @enumFromInt(@as(u32, @intCast(string_index))),
.resolved_path = resolved_path,
}) catch |err| bun.handleOom(err);
}
offset += len;
}
for (replacements.items) |rep| {
const new_id = mi.str(rep.resolved_path) catch |err| bun.handleOom(err);
mi.replaceStringID(rep.old_id, new_id);
}
// Serialize the fixed-up module_info
chunk.content.javascript.module_info_bytes = bun.js_printer.serializeModuleInfo(mi);
// Free the ModuleInfo now that it's been serialized to bytes.
// It was allocated with bun.default_allocator (not the arena),
// so it must be explicitly destroyed.
mi.destroy();
chunk.content.javascript.module_info = null;
}
}
// Generate metafile JSON fragments for each chunk (after paths are resolved)
if (c.options.metafile) {
for (chunks) |*chunk| {
@@ -431,6 +494,14 @@ pub fn generateChunksInParallel(
.none => {},
}
// Compute side early so it can be used for bytecode, module_info, and main chunk output files
const side: bun.bake.Side = if (chunk.content == .css or chunk.flags.is_browser_chunk_from_server_build)
.client
else switch (c.graph.ast.items(.target)[chunk.entry_point.source_index]) {
.browser => .client,
else => .server,
};
const bytecode_output_file: ?options.OutputFile = brk: {
if (c.options.generate_bytecode_cache) {
const loader: Loader = if (chunk.entry_point.is_entry_point)
@@ -444,7 +515,18 @@ pub fn generateChunksInParallel(
jsc.VirtualMachine.is_bundler_thread_for_bytecode_cache = true;
jsc.initialize(false);
var fdpath: bun.PathBuffer = undefined;
var source_provider_url = try bun.String.createFormat("{s}" ++ bun.bytecode_extension, .{chunk.final_rel_path});
// For --compile builds, the bytecode URL must match the module name
// that will be used at runtime. The module name is:
// public_path + final_rel_path (e.g., "/$bunfs/root/app.js")
// Without this prefix, the JSC bytecode cache key won't match at runtime.
// Use the per-chunk public_path (already computed above) for browser chunks
// from server builds, and normalize with cheapPrefixNormalizer for consistency
// with module_info path fixup.
// For non-compile builds, use the normal .jsc extension.
var source_provider_url = if (c.options.compile) url_blk: {
const normalizer = bun.bundle_v2.cheapPrefixNormalizer(public_path, chunk.final_rel_path);
break :url_blk try bun.String.createFormat("{s}{s}", .{ normalizer[0], normalizer[1] });
} else try bun.String.createFormat("{s}" ++ bun.bytecode_extension, .{chunk.final_rel_path});
source_provider_url.ref();
defer source_provider_url.deref();
@@ -469,7 +551,7 @@ pub fn generateChunksInParallel(
.data = .{
.buffer = .{ .data = bytecode, .allocator = cached_bytecode.allocator() },
},
.side = .server,
.side = side,
.entry_point_index = null,
.is_executable = false,
});
@@ -485,6 +567,40 @@ pub fn generateChunksInParallel(
break :brk null;
};
// Create module_info output file for ESM bytecode in --compile builds
const module_info_output_file: ?options.OutputFile = brk: {
if (c.options.generate_bytecode_cache and c.options.output_format == .esm and c.options.compile) {
const loader: Loader = if (chunk.entry_point.is_entry_point)
c.parse_graph.input_files.items(.loader)[
chunk.entry_point.source_index
]
else
.js;
if (chunk.content == .javascript and loader.isJavaScriptLike()) {
if (chunk.content.javascript.module_info_bytes) |module_info_bytes| {
break :brk options.OutputFile.init(.{
.output_path = bun.handleOom(std.fmt.allocPrint(bun.default_allocator, "{s}.module-info", .{chunk.final_rel_path})),
.input_path = bun.handleOom(std.fmt.allocPrint(bun.default_allocator, "{s}.module-info", .{chunk.final_rel_path})),
.input_loader = .js,
.hash = if (chunk.template.placeholder.hash != null) bun.hash(module_info_bytes) else null,
.output_kind = .module_info,
.loader = .file,
.size = @as(u32, @truncate(module_info_bytes.len)),
.display_size = @as(u32, @truncate(module_info_bytes.len)),
.data = .{
.buffer = .{ .data = module_info_bytes, .allocator = bun.default_allocator },
},
.side = side,
.entry_point_index = null,
.is_executable = false,
});
}
}
}
break :brk null;
};
const source_map_index: ?u32 = if (sourcemap_output_file != null)
try output_files.insertForSourcemapOrBytecode(sourcemap_output_file.?)
else
@@ -495,6 +611,11 @@ pub fn generateChunksInParallel(
else
null;
const module_info_index: ?u32 = if (module_info_output_file != null)
try output_files.insertForSourcemapOrBytecode(module_info_output_file.?)
else
null;
const output_kind = if (chunk.content == .css)
.asset
else if (chunk.entry_point.is_entry_point)
@@ -502,12 +623,6 @@ pub fn generateChunksInParallel(
else
.chunk;
const side: bun.bake.Side = if (chunk.content == .css or chunk.flags.is_browser_chunk_from_server_build)
.client
else switch (c.graph.ast.items(.target)[chunk.entry_point.source_index]) {
.browser => .client,
else => .server,
};
const chunk_index = output_files.insertForChunk(options.OutputFile.init(.{
.data = .{
.buffer = .{
@@ -525,6 +640,7 @@ pub fn generateChunksInParallel(
.is_executable = chunk.flags.is_executable,
.source_map_index = source_map_index,
.bytecode_index = bytecode_index,
.module_info_index = module_info_index,
.side = side,
.entry_point_index = if (output_kind == .@"entry-point")
chunk.entry_point.source_index - @as(u32, (if (c.framework) |fw| if (fw.server_components != null) 3 else 1 else 1))
@@ -564,6 +680,7 @@ pub const ThreadPool = bun.bundle_v2.ThreadPool;
const debugPartRanges = Output.scoped(.PartRanges, .hidden);
const analyze_transpiled_module = @import("../../analyze_transpiled_module.zig");
const std = @import("std");
const bun = @import("bun");

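The fixup pass near the top of this file rewrites module_info's string table: any entry equal to a chunk's `unique_key` placeholder is replaced by `public_path + final_rel_path`. A simplified JavaScript model of the same two-phase replace (collect matches first, then mutate, since the table can't be modified while iterating), with hypothetical placeholder values:

```javascript
// Simplified model: the strings table is an array here; the real code
// stores lengths into a packed buffer and calls replaceStringID.
function fixupSpecifiers(strings, uniqueKeyToPath) {
  const replacements = [];
  strings.forEach((s, id) => {
    const resolved = uniqueKeyToPath.get(s);
    if (resolved !== undefined) replacements.push({ id, resolved });
  });
  for (const { id, resolved } of replacements) strings[id] = resolved;
  return strings;
}

const table = ["default", "A1B2C3", "x"]; // "A1B2C3" is a placeholder key
const map = new Map([["A1B2C3", "/$bunfs/root/chunk-1.js"]]);
console.log(fixupSpecifiers(table, map));
// -> [ 'default', '/$bunfs/root/chunk-1.js', 'x' ]
```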

@@ -10,6 +10,7 @@ pub fn generateCodeForFileInChunkJS(
stmts: *StmtList,
allocator: std.mem.Allocator,
temp_allocator: std.mem.Allocator,
decl_collector: ?*DeclCollector,
) js_printer.PrintResult {
const parts: []Part = c.graph.ast.items(.parts)[part_range.source_index.get()].slice()[part_range.part_index_begin..part_range.part_index_end];
const all_flags: []const JSMeta.Flags = c.graph.meta.items(.flags);
@@ -613,6 +614,15 @@ pub fn generateCodeForFileInChunkJS(
};
}
// Collect top-level declarations from the converted statements.
// This is done here (after convertStmtsForChunk) rather than in
// postProcessJSChunk, because convertStmtsForChunk transforms the AST
// (e.g. export default expr → var, export stripping) and the converted
// statements reflect what actually gets printed.
if (decl_collector) |dc| {
dc.collectFromStmts(out_stmts, r, c);
}
return c.printCodeForFileInChunkJS(
r,
allocator,
@@ -628,6 +638,77 @@ pub fn generateCodeForFileInChunkJS(
);
}
pub const DeclCollector = struct {
decls: std.ArrayListUnmanaged(CompileResult.DeclInfo) = .{},
allocator: std.mem.Allocator,
const CompileResult = bun.bundle_v2.CompileResult;
/// Collect top-level declarations from **converted** statements (after
/// `convertStmtsForChunk`). At that point, export statements have already
/// been transformed:
/// - `s_export_default` → `s_local` / `s_function` / `s_class`
/// - `s_export_clause` → removed entirely
/// - `s_export_from` / `s_export_star` → removed or converted to `s_import`
///
/// Remaining `s_import` statements (external, non-bundled) don't need
/// handling here; their bindings are recorded separately in
/// `postProcessJSChunk` by scanning the original AST import records.
pub fn collectFromStmts(self: *DeclCollector, stmts: []const Stmt, r: renamer.Renamer, c: *LinkerContext) void {
for (stmts) |stmt| {
switch (stmt.data) {
.s_local => |s| {
const kind: CompileResult.DeclInfo.Kind = if (s.kind == .k_var) .declared else .lexical;
for (s.decls.slice()) |decl| {
self.collectFromBinding(decl.binding, kind, r, c);
}
},
.s_function => |s| {
if (s.func.name) |name_loc_ref| {
if (name_loc_ref.ref) |name_ref| {
self.addRef(name_ref, .lexical, r, c);
}
}
},
.s_class => |s| {
if (s.class.class_name) |class_name| {
if (class_name.ref) |name_ref| {
self.addRef(name_ref, .lexical, r, c);
}
}
},
else => {},
}
}
}
fn collectFromBinding(self: *DeclCollector, binding: Binding, kind: CompileResult.DeclInfo.Kind, r: renamer.Renamer, c: *LinkerContext) void {
switch (binding.data) {
.b_identifier => |b| {
self.addRef(b.ref, kind, r, c);
},
.b_array => |b| {
for (b.items) |item| {
self.collectFromBinding(item.binding, kind, r, c);
}
},
.b_object => |b| {
for (b.properties) |prop| {
self.collectFromBinding(prop.value, kind, r, c);
}
},
.b_missing => {},
}
}
fn addRef(self: *DeclCollector, ref: Ref, kind: CompileResult.DeclInfo.Kind, r: renamer.Renamer, c: *LinkerContext) void {
const followed = c.graph.symbols.follow(ref);
const name = r.nameForSymbol(followed);
if (name.len == 0) return;
self.decls.append(self.allocator, .{ .name = name, .kind = kind }) catch return;
}
};
fn mergeAdjacentLocalStmts(stmts: *std.ArrayListUnmanaged(Stmt), allocator: std.mem.Allocator) void {
if (stmts.items.len == 0)
return;

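`collectFromBinding` walks a destructuring pattern recursively, so every identifier it reaches becomes a top-level declaration. The same shape in JavaScript, over a toy binding AST (the node shapes here are illustrative, not Bun's real `b_identifier`/`b_array`/`b_object` types):

```javascript
// Toy nodes: {id: name} | {array: [binding...]} | {object: [{value: binding}...]}
function collectNames(binding, out = []) {
  if (binding.id) out.push(binding.id); // b_identifier
  else if (binding.array) binding.array.forEach((b) => collectNames(b, out)); // b_array
  else if (binding.object) binding.object.forEach((p) => collectNames(p.value, out)); // b_object
  return out; // b_missing: nothing to record
}

// const { a, b: [c, d] } = ...  declares a, c, d (all lexical).
const pattern = {
  object: [
    { value: { id: "a" } },
    { value: { array: [{ id: "c" }, { id: "d" }] } },
  ],
};
console.log(collectNames(pattern)); // -> [ 'a', 'c', 'd' ]
```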

@@ -46,6 +46,9 @@ fn generateCompileResultForJSChunkImpl(worker: *ThreadPool.Worker, c: *LinkerCon
const toESMRef = c.graph.symbols.follow(runtime_members.get("__toESM").?.ref);
const runtimeRequireRef = if (c.options.output_format == .cjs) null else c.graph.symbols.follow(runtime_members.get("__require").?.ref);
const collect_decls = c.options.generate_bytecode_cache and c.options.output_format == .esm and c.options.compile;
var dc = DeclCollector{ .allocator = allocator };
const result = c.generateCodeForFileInChunkJS(
&buffer_writer,
chunk.renamer,
@@ -57,6 +60,7 @@ fn generateCompileResultForJSChunkImpl(worker: *ThreadPool.Worker, c: *LinkerCon
&worker.stmt_list,
worker.allocator,
arena.allocator(),
if (collect_decls) &dc else null,
);
// Update bytesInOutput for this source in the chunk (for metafile)
@@ -75,6 +79,7 @@ fn generateCompileResultForJSChunkImpl(worker: *ThreadPool.Worker, c: *LinkerCon
.javascript = .{
.source_index = part_range.source_index.get(),
.result = result,
.decls = if (collect_decls) dc.decls.items else &.{},
},
};
}
@@ -82,6 +87,8 @@ fn generateCompileResultForJSChunkImpl(worker: *ThreadPool.Worker, c: *LinkerCon
pub const DeferredBatchTask = bun.bundle_v2.DeferredBatchTask;
pub const ParseTask = bun.bundle_v2.ParseTask;
const DeclCollector = @import("./generateCodeForFileInChunkJS.zig").DeclCollector;
const bun = @import("bun");
const Environment = bun.Environment;
const ThreadPoolLib = bun.ThreadPool;


@@ -25,6 +25,15 @@ pub fn postProcessJSChunk(ctx: GenerateChunkCtx, worker: *ThreadPool.Worker, chu
const toESMRef = c.graph.symbols.follow(runtime_members.get("__toESM").?.ref);
const runtimeRequireRef = if (c.options.output_format == .cjs) null else c.graph.symbols.follow(runtime_members.get("__require").?.ref);
// Create ModuleInfo for ESM bytecode in --compile builds
const generate_module_info = c.options.generate_bytecode_cache and c.options.output_format == .esm and c.options.compile;
const loader = c.parse_graph.input_files.items(.loader)[chunk.entry_point.source_index];
const is_typescript = loader.isTypeScript();
const module_info: ?*analyze_transpiled_module.ModuleInfo = if (generate_module_info)
analyze_transpiled_module.ModuleInfo.create(bun.default_allocator, is_typescript) catch null
else
null;
{
const print_options = js_printer.Options{
.bundling = true,
@@ -39,6 +48,7 @@ pub fn postProcessJSChunk(ctx: GenerateChunkCtx, worker: *ThreadPool.Worker, chu
.target = c.options.target,
.print_dce_annotations = c.options.emit_dce_annotations,
.mangled_props = &c.mangled_props,
.module_info = module_info,
// .const_values = c.graph.const_values,
};
@@ -84,7 +94,124 @@ pub fn postProcessJSChunk(ctx: GenerateChunkCtx, worker: *ThreadPool.Worker, chu
);
}
// Generate the exports for the entry point, if there are any
// Populate ModuleInfo with declarations collected during parallel printing,
// external import records from the original AST, and wrapper refs.
if (module_info) |mi| {
// 1. Add declarations collected by DeclCollector during parallel part printing.
// These come from the CONVERTED statements (after convertStmtsForChunk transforms
// export default → var, strips exports, etc.), so they match what's actually printed.
for (chunk.compile_results_for_chunk) |cr| {
const decls = switch (cr) {
.javascript => |js| js.decls,
else => continue,
};
for (decls) |decl| {
const var_kind: analyze_transpiled_module.ModuleInfo.VarKind = switch (decl.kind) {
.declared => .declared,
.lexical => .lexical,
};
const string_id = mi.str(decl.name) catch continue;
mi.addVar(string_id, var_kind) catch continue;
}
}
// 1b. Check if any source in this chunk uses import.meta. The per-part
// parallel printer does not have module_info, so the printer cannot set
// this flag during per-part printing. We derive it from the AST instead.
// Note: the runtime source (index 0) also uses import.meta (e.g.
// `import.meta.require`), so we must not skip it.
{
const all_ast_flags = c.graph.ast.items(.flags);
for (chunk.content.javascript.parts_in_chunk_in_order) |part_range| {
if (all_ast_flags[part_range.source_index.get()].has_import_meta) {
mi.flags.contains_import_meta = true;
break;
}
}
}
// 2. Collect truly-external imports from the original AST. Bundled imports
// (where source_index is valid) are removed by convertStmtsForChunk and
// re-created as cross-chunk imports — those are already captured by the
// printer when it prints cross_chunk_prefix_stmts above. Only truly-external
// imports (node built-ins, etc.) survive as s_import in per-file parts and
// need recording here.
const all_parts = c.graph.ast.items(.parts);
const all_flags = c.graph.meta.items(.flags);
const all_import_records = c.graph.ast.items(.import_records);
for (chunk.content.javascript.parts_in_chunk_in_order) |part_range| {
if (all_flags[part_range.source_index.get()].wrap == .cjs) continue;
const source_parts = all_parts[part_range.source_index.get()].slice();
const source_import_records = all_import_records[part_range.source_index.get()].slice();
var part_i = part_range.part_index_begin;
while (part_i < part_range.part_index_end) : (part_i += 1) {
for (source_parts[part_i].stmts) |stmt| {
switch (stmt.data) {
.s_import => |s| {
const record = &source_import_records[s.import_record_index];
if (record.path.is_disabled) continue;
if (record.tag == .bun) continue;
// Skip bundled imports — these are converted to cross-chunk
// imports by the linker. The printer already recorded them
// when printing cross_chunk_prefix_stmts.
if (record.source_index.isValid()) continue;
const import_path = record.path.text;
const irp_id = mi.str(import_path) catch continue;
mi.requestModule(irp_id, .none) catch continue;
if (s.default_name) |name| {
if (name.ref) |name_ref| {
const local_name = chunk.renamer.nameForSymbol(name_ref);
const local_name_id = mi.str(local_name) catch continue;
mi.addVar(local_name_id, .lexical) catch continue;
mi.addImportInfoSingle(irp_id, mi.str("default") catch continue, local_name_id, false) catch continue;
}
}
for (s.items) |item| {
if (item.name.ref) |name_ref| {
const local_name = chunk.renamer.nameForSymbol(name_ref);
const local_name_id = mi.str(local_name) catch continue;
mi.addVar(local_name_id, .lexical) catch continue;
mi.addImportInfoSingle(irp_id, mi.str(item.alias) catch continue, local_name_id, false) catch continue;
}
}
if (record.flags.contains_import_star) {
const local_name = chunk.renamer.nameForSymbol(s.namespace_ref);
const local_name_id = mi.str(local_name) catch continue;
mi.addVar(local_name_id, .lexical) catch continue;
mi.addImportInfoNamespace(irp_id, local_name_id) catch continue;
}
},
else => {},
}
}
}
}
// 3. Add wrapper-generated declarations (init_xxx, require_xxx) that are
// not in any part statement.
const all_wrapper_refs = c.graph.ast.items(.wrapper_ref);
for (chunk.content.javascript.parts_in_chunk_in_order) |part_range| {
const source_index = part_range.source_index.get();
if (all_flags[source_index].wrap != .none) {
const wrapper_ref = all_wrapper_refs[source_index];
if (!wrapper_ref.isEmpty()) {
const name = chunk.renamer.nameForSymbol(wrapper_ref);
if (name.len > 0) {
const string_id = mi.str(name) catch continue;
mi.addVar(string_id, .declared) catch continue;
}
}
}
}
}
// Generate the exports for the entry point, if there are any.
// This must happen before module_info serialization so the printer
// can populate export entries in module_info.
const entry_point_tail = brk: {
if (chunk.isEntryPoint()) {
break :brk generateEntryPointTailJS(
@@ -95,12 +222,21 @@ pub fn postProcessJSChunk(ctx: GenerateChunkCtx, worker: *ThreadPool.Worker, chu
worker.allocator,
arena.allocator(),
chunk.renamer,
module_info,
);
}
break :brk CompileResult.empty;
};
// Store unserialized ModuleInfo on the chunk. Serialization is deferred to
// generateChunksInParallel after final chunk paths are computed, so that
// cross-chunk import specifiers (which use unique_key placeholders during
// printing) can be resolved to actual paths.
if (module_info) |mi| {
chunk.content.javascript.module_info = mi;
}
var j = StringJoiner{
.allocator = worker.allocator,
.watcher = .{
@@ -435,6 +571,37 @@ pub fn postProcessJSChunk(ctx: GenerateChunkCtx, worker: *ThreadPool.Worker, chu
}
}
/// Recursively walk a binding and add all declared names to `ModuleInfo`.
/// Handles `b_identifier`, `b_array`, `b_object`, and `b_missing`.
fn addBindingVarsToModuleInfo(
mi: *analyze_transpiled_module.ModuleInfo,
binding: Binding,
var_kind: analyze_transpiled_module.ModuleInfo.VarKind,
r: renamer.Renamer,
symbols: *const js_ast.Symbol.Map,
) void {
switch (binding.data) {
.b_identifier => |b| {
const name = r.nameForSymbol(symbols.follow(b.ref));
if (name.len > 0) {
const str_id = mi.str(name) catch return;
mi.addVar(str_id, var_kind) catch {};
}
},
.b_array => |b| {
for (b.items) |item| {
addBindingVarsToModuleInfo(mi, item.binding, var_kind, r, symbols);
}
},
.b_object => |b| {
for (b.properties) |prop| {
addBindingVarsToModuleInfo(mi, prop.value, var_kind, r, symbols);
}
},
.b_missing => {},
}
}
pub fn generateEntryPointTailJS(
c: *LinkerContext,
toCommonJSRef: Ref,
@@ -443,6 +610,7 @@ pub fn generateEntryPointTailJS(
allocator: std.mem.Allocator,
temp_allocator: std.mem.Allocator,
r: renamer.Renamer,
module_info: ?*analyze_transpiled_module.ModuleInfo,
) CompileResult {
const flags: JSMeta.Flags = c.graph.meta.items(.flags)[source_index];
var stmts = std.array_list.Managed(Stmt).init(temp_allocator);
@@ -825,6 +993,22 @@ pub fn generateEntryPointTailJS(
},
}
// Add generated local declarations from entry point tail to module_info.
// This captures vars like `var export_foo = cjs.foo` for CJS export copies.
if (module_info) |mi| {
for (stmts.items) |stmt| {
switch (stmt.data) {
.s_local => |s| {
const var_kind: analyze_transpiled_module.ModuleInfo.VarKind = if (s.kind == .k_var) .declared else .lexical;
for (s.decls.slice()) |decl| {
addBindingVarsToModuleInfo(mi, decl.binding, var_kind, r, &c.graph.symbols);
}
},
else => {},
}
}
}
if (stmts.items.len == 0) {
return .{
.javascript = .{
@@ -850,6 +1034,7 @@ pub fn generateEntryPointTailJS(
.print_dce_annotations = c.options.emit_dce_annotations,
.minify_syntax = c.options.minify_syntax,
.mangled_props = &c.mangled_props,
.module_info = module_info,
// .const_values = c.graph.const_values,
};
@@ -875,6 +1060,7 @@ pub fn generateEntryPointTailJS(
};
}
const analyze_transpiled_module = @import("../../analyze_transpiled_module.zig");
const std = @import("std");
const bun = @import("bun");


@@ -392,6 +392,7 @@ pub const Command = struct {
enabled: bool = false,
name: []const u8 = "",
dir: []const u8 = "",
interval: u32 = 1000,
md_format: bool = false,
json_format: bool = false,
} = .{},
@@ -425,6 +426,9 @@ pub const Command = struct {
filters: []const []const u8 = &.{},
workspaces: bool = false,
if_present: bool = false,
parallel: bool = false,
sequential: bool = false,
no_exit_on_error: bool = false,
preloads: []const string = &.{},
has_loaded_global_config: bool = false,
@@ -888,6 +892,13 @@ pub const Command = struct {
const ctx = try Command.init(allocator, log, .RunCommand);
ctx.args.target = .bun;
if (ctx.parallel or ctx.sequential) {
MultiRun.run(ctx) catch |err| {
Output.prettyErrorln("<r><red>error<r>: {s}", .{@errorName(err)});
Global.exit(1);
};
}
if (ctx.filters.len > 0 or ctx.workspaces) {
FilterRun.runScriptsWithFilter(ctx) catch |err| {
Output.prettyErrorln("<r><red>error<r>: {s}", .{@errorName(err)});
@@ -927,6 +938,13 @@ pub const Command = struct {
};
ctx.args.target = .bun;
if (ctx.parallel or ctx.sequential) {
MultiRun.run(ctx) catch |err| {
Output.prettyErrorln("<r><red>error<r>: {s}", .{@errorName(err)});
Global.exit(1);
};
}
if (ctx.filters.len > 0 or ctx.workspaces) {
FilterRun.runScriptsWithFilter(ctx) catch |err| {
Output.prettyErrorln("<r><red>error<r>: {s}", .{@errorName(err)});
@@ -1762,6 +1780,7 @@ const string = []const u8;
const AddCompletions = @import("./cli/add_completions.zig");
const FilterRun = @import("./cli/filter_run.zig");
const MultiRun = @import("./cli/multi_run.zig");
const PmViewCommand = @import("./cli/pm_view_command.zig");
const fs = @import("./fs.zig");
const options = @import("./options.zig");


@@ -91,6 +91,7 @@ pub const runtime_params_ = [_]ParamType{
clap.parseParam("--cpu-prof-name <STR> Specify the name of the CPU profile file") catch unreachable,
clap.parseParam("--cpu-prof-dir <STR> Specify the directory where the CPU profile will be saved") catch unreachable,
clap.parseParam("--cpu-prof-md Output CPU profile in markdown format (grep-friendly, designed for LLM analysis)") catch unreachable,
clap.parseParam("--cpu-prof-interval <STR> Specify the sampling interval in microseconds for CPU profiling (default: 1000)") catch unreachable,
clap.parseParam("--heap-prof Generate V8 heap snapshot on exit (.heapsnapshot)") catch unreachable,
clap.parseParam("--heap-prof-name <STR> Specify the name of the heap profile file") catch unreachable,
clap.parseParam("--heap-prof-dir <STR> Specify the directory where the heap profile will be saved") catch unreachable,
@@ -130,6 +131,9 @@ pub const auto_or_run_params = [_]ParamType{
clap.parseParam("-b, --bun Force a script or package to use Bun's runtime instead of Node.js (via symlinking node)") catch unreachable,
clap.parseParam("--shell <STR> Control the shell used for package.json scripts. Supports either 'bun' or 'system'") catch unreachable,
clap.parseParam("--workspaces Run a script in all workspace packages (from the \"workspaces\" field in package.json)") catch unreachable,
clap.parseParam("--parallel Run multiple scripts concurrently with Foreman-style output") catch unreachable,
clap.parseParam("--sequential Run multiple scripts sequentially with Foreman-style output") catch unreachable,
clap.parseParam("--no-exit-on-error Continue running other scripts when one fails (with --parallel/--sequential)") catch unreachable,
};
pub const auto_only_params = [_]ParamType{
@@ -175,7 +179,7 @@ pub const build_only_params = [_]ParamType{
clap.parseParam("--sourcemap <STR>? Build with sourcemaps - 'linked', 'inline', 'external', or 'none'") catch unreachable,
clap.parseParam("--banner <STR> Add a banner to the bundled output such as \"use client\"; for a bundle being used with RSCs") catch unreachable,
clap.parseParam("--footer <STR> Add a footer to the bundled output such as // built with bun!") catch unreachable,
clap.parseParam("--format <STR> Specifies the module format to build to. \"esm\", \"cjs\" and \"iife\" are supported. Defaults to \"esm\".") catch unreachable,
clap.parseParam("--format <STR> Specifies the module format to build to. \"esm\", \"cjs\" and \"iife\" are supported. Defaults to \"esm\", or \"cjs\" with --bytecode.") catch unreachable,
clap.parseParam("--root <STR> Root directory used for multiple entry points") catch unreachable,
clap.parseParam("--splitting Enable code splitting") catch unreachable,
clap.parseParam("--public-path <STR> A prefix to be appended to any import paths in bundled code") catch unreachable,
@@ -453,6 +457,9 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
ctx.filters = args.options("--filter");
ctx.workspaces = args.flag("--workspaces");
ctx.if_present = args.flag("--if-present");
ctx.parallel = args.flag("--parallel");
ctx.sequential = args.flag("--sequential");
ctx.no_exit_on_error = args.flag("--no-exit-on-error");
if (args.option("--elide-lines")) |elide_lines| {
if (elide_lines.len > 0) {
@@ -858,6 +865,9 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
ctx.runtime_options.cpu_prof.md_format = cpu_prof_md_flag;
// json_format is true if --cpu-prof is passed (regardless of --cpu-prof-md)
ctx.runtime_options.cpu_prof.json_format = cpu_prof_flag;
if (args.option("--cpu-prof-interval")) |interval_str| {
ctx.runtime_options.cpu_prof.interval = std.fmt.parseInt(u32, interval_str, 10) catch 1000;
}
} else {
// Warn if --cpu-prof-name or --cpu-prof-dir is used without a profiler flag
if (args.option("--cpu-prof-name")) |_| {
@@ -866,6 +876,9 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
if (args.option("--cpu-prof-dir")) |_| {
Output.warn("--cpu-prof-dir requires --cpu-prof or --cpu-prof-md to be enabled", .{});
}
if (args.option("--cpu-prof-interval")) |_| {
Output.warn("--cpu-prof-interval requires --cpu-prof or --cpu-prof-md to be enabled", .{});
}
}
const heap_prof_v8 = args.flag("--heap-prof");
@@ -968,7 +981,6 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
args.flag("--debug-no-minify");
}
-// TODO: support --format=esm
if (ctx.bundler_options.bytecode) {
ctx.bundler_options.output_format = .cjs;
ctx.args.target = .bun;
@@ -1181,6 +1193,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-hide-console is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-hide-console requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-hide-console requires --compile", .{});
Global.crash();
@@ -1192,6 +1208,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-icon is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-icon requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-icon requires --compile", .{});
Global.crash();
@@ -1203,6 +1223,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-title is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-title requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-title requires --compile", .{});
Global.crash();
@@ -1214,6 +1238,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-publisher is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-publisher requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-publisher requires --compile", .{});
Global.crash();
@@ -1225,6 +1253,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-version is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-version requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-version requires --compile", .{});
Global.crash();
@@ -1236,6 +1268,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-description is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-description requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-description requires --compile", .{});
Global.crash();
@@ -1247,6 +1283,10 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
Output.errGeneric("Using --windows-copyright is only available when compiling on Windows", .{});
Global.crash();
}
if (ctx.bundler_options.compile_target.os != .windows) {
Output.errGeneric("--windows-copyright requires a Windows compile target", .{});
Global.crash();
}
if (!ctx.bundler_options.compile) {
Output.errGeneric("--windows-copyright requires --compile", .{});
Global.crash();
@@ -1306,9 +1346,18 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
}
ctx.bundler_options.output_format = format;
-if (format != .cjs and ctx.bundler_options.bytecode) {
-Output.errGeneric("format must be 'cjs' when bytecode is true. Eventually we'll add esm support as well.", .{});
-Global.exit(1);
+if (ctx.bundler_options.bytecode) {
+if (format != .cjs and format != .esm) {
+Output.errGeneric("format must be 'cjs' or 'esm' when bytecode is true.", .{});
+Global.exit(1);
+}
+// ESM bytecode requires --compile because module_info (import/export metadata)
+// is only available in compiled binaries. Without it, JSC must parse the file
+// twice (once for module analysis, once for bytecode), which is a deopt.
+if (format == .esm and !ctx.bundler_options.compile) {
+Output.errGeneric("ESM bytecode requires --compile. Use --format=cjs for bytecode without --compile.", .{});
+Global.exit(1);
+}
}
}
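The decision table enforced here is small enough to sketch standalone. A minimal TypeScript model of the same rules, with hypothetical names (`validateBytecode` is illustrative, not Bun's actual API):

```typescript
type Format = "esm" | "cjs" | "iife";

// Returns an error message for invalid flag combinations, or null if valid.
function validateBytecode(format: Format, bytecode: boolean, compile: boolean): string | null {
  if (!bytecode) return null;
  if (format !== "cjs" && format !== "esm") {
    return "format must be 'cjs' or 'esm' when bytecode is true.";
  }
  // ESM bytecode needs --compile so the import/export metadata (module_info)
  // can be embedded; otherwise JSC would have to parse the module twice.
  if (format === "esm" && !compile) {
    return "ESM bytecode requires --compile.";
  }
  return null;
}
```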


@@ -583,6 +583,7 @@ pub const BuildCommand = struct {
.asset => Output.prettyFmt("<magenta>", true),
.sourcemap => Output.prettyFmt("<d>", true),
.bytecode => Output.prettyFmt("<d>", true),
.module_info => Output.prettyFmt("<d>", true),
.@"metafile-json", .@"metafile-markdown" => Output.prettyFmt("<green>", true),
});
@@ -614,6 +615,7 @@ pub const BuildCommand = struct {
.asset => "asset",
.sourcemap => "source map",
.bytecode => "bytecode",
.module_info => "module info",
.@"metafile-json" => "metafile json",
.@"metafile-markdown" => "metafile markdown",
}});

src/cli/multi_run.zig

@@ -0,0 +1,841 @@
const ScriptConfig = struct {
label: []const u8,
command: [:0]const u8,
cwd: []const u8,
PATH: []const u8,
};
/// Wraps a BufferedReader and tracks whether it represents stdout or stderr,
/// so output can be routed to the correct parent stream.
const PipeReader = struct {
const This = @This();
reader: bun.io.BufferedReader = bun.io.BufferedReader.init(This),
handle: *ProcessHandle = undefined, // set in ProcessHandle.start()
is_stderr: bool,
line_buffer: std.array_list.Managed(u8) = std.array_list.Managed(u8).init(bun.default_allocator),
pub fn onReadChunk(this: *This, chunk: []const u8, hasMore: bun.io.ReadState) bool {
_ = hasMore;
this.handle.state.readChunk(this, chunk) catch {};
return true;
}
pub fn onReaderDone(this: *This) void {
_ = this;
}
pub fn onReaderError(this: *This, err: bun.sys.Error) void {
_ = this;
_ = err;
}
pub fn eventLoop(this: *This) *bun.jsc.MiniEventLoop {
return this.handle.state.event_loop;
}
pub fn loop(this: *This) *bun.Async.Loop {
if (comptime bun.Environment.isWindows) {
return this.handle.state.event_loop.loop.uv_loop;
} else {
return this.handle.state.event_loop.loop;
}
}
};
pub const ProcessHandle = struct {
const This = @This();
config: *ScriptConfig,
state: *State,
color_idx: usize,
stdout_reader: PipeReader = .{ .is_stderr = false },
stderr_reader: PipeReader = .{ .is_stderr = true },
process: ?struct {
ptr: *bun.spawn.Process,
status: bun.spawn.Status = .running,
} = null,
options: bun.spawn.SpawnOptions,
start_time: ?std.time.Instant = null,
end_time: ?std.time.Instant = null,
remaining_dependencies: usize = 0,
/// Dependents within the same script group (pre->main->post chain).
/// These are NOT started if this handle fails, even with --no-exit-on-error.
group_dependents: std.array_list.Managed(*This) = std.array_list.Managed(*This).init(bun.default_allocator),
/// Dependents across sequential groups (group N -> group N+1).
/// These ARE started even if this handle fails when --no-exit-on-error is set.
next_dependents: std.array_list.Managed(*This) = std.array_list.Managed(*This).init(bun.default_allocator),
fn start(this: *This) !void {
this.state.remaining_scripts += 1;
var argv = [_:null]?[*:0]const u8{
this.state.shell_bin,
if (Environment.isPosix) "-c" else "exec",
this.config.command,
null,
};
this.start_time = std.time.Instant.now() catch null;
var spawned: bun.spawn.process.SpawnProcessResult = brk: {
var arena = std.heap.ArenaAllocator.init(bun.default_allocator);
defer arena.deinit();
const original_path = this.state.env.map.get("PATH") orelse "";
bun.handleOom(this.state.env.map.put("PATH", this.config.PATH));
defer bun.handleOom(this.state.env.map.put("PATH", original_path));
const envp = try this.state.env.map.createNullDelimitedEnvMap(arena.allocator());
break :brk try (try bun.spawn.spawnProcess(&this.options, argv[0..], envp)).unwrap();
};
var process = spawned.toProcess(this.state.event_loop, false);
this.stdout_reader.handle = this;
this.stderr_reader.handle = this;
this.stdout_reader.reader.setParent(&this.stdout_reader);
this.stderr_reader.reader.setParent(&this.stderr_reader);
if (Environment.isWindows) {
this.stdout_reader.reader.source = .{ .pipe = this.options.stdout.buffer };
this.stderr_reader.reader.source = .{ .pipe = this.options.stderr.buffer };
}
if (Environment.isPosix) {
if (spawned.stdout) |stdout_fd| {
_ = bun.sys.setNonblocking(stdout_fd);
try this.stdout_reader.reader.start(stdout_fd, true).unwrap();
}
if (spawned.stderr) |stderr_fd| {
_ = bun.sys.setNonblocking(stderr_fd);
try this.stderr_reader.reader.start(stderr_fd, true).unwrap();
}
} else {
try this.stdout_reader.reader.startWithCurrentPipe().unwrap();
try this.stderr_reader.reader.startWithCurrentPipe().unwrap();
}
this.process = .{ .ptr = process };
process.setExitHandler(this);
switch (process.watchOrReap()) {
.result => {},
.err => |err| {
if (!process.hasExited())
process.onExit(.{ .err = err }, &std.mem.zeroes(bun.spawn.Rusage));
},
}
}
pub fn onProcessExit(this: *This, proc: *bun.spawn.Process, status: bun.spawn.Status, _: *const bun.spawn.Rusage) void {
this.process.?.status = status;
this.end_time = std.time.Instant.now() catch null;
_ = proc;
this.state.processExit(this) catch {};
}
pub fn eventLoop(this: *This) *bun.jsc.MiniEventLoop {
return this.state.event_loop;
}
pub fn loop(this: *This) *bun.Async.Loop {
if (comptime bun.Environment.isWindows) {
return this.state.event_loop.loop.uv_loop;
} else {
return this.state.event_loop.loop;
}
}
};
const colors = [_][]const u8{
"\x1b[36m", // cyan
"\x1b[33m", // yellow
"\x1b[35m", // magenta
"\x1b[32m", // green
"\x1b[34m", // blue
"\x1b[31m", // red
};
const reset = "\x1b[0m";
const State = struct {
const This = @This();
handles: []ProcessHandle,
event_loop: *bun.jsc.MiniEventLoop,
remaining_scripts: usize = 0,
max_label_len: usize,
shell_bin: [:0]const u8,
aborted: bool = false,
no_exit_on_error: bool,
env: *bun.DotEnv.Loader,
use_colors: bool,
pub fn isDone(this: *This) bool {
return this.remaining_scripts == 0;
}
fn readChunk(this: *This, pipe: *PipeReader, chunk: []const u8) (std.Io.Writer.Error || bun.OOM)!void {
try pipe.line_buffer.appendSlice(chunk);
// Route to correct parent stream: child stdout -> parent stdout, child stderr -> parent stderr
const writer = if (pipe.is_stderr) Output.errorWriter() else Output.writer();
// Process complete lines
while (std.mem.indexOfScalar(u8, pipe.line_buffer.items, '\n')) |newline_pos| {
const line = pipe.line_buffer.items[0 .. newline_pos + 1];
try this.writeLineWithPrefix(pipe.handle, line, writer);
// Remove processed line from buffer
const remaining = pipe.line_buffer.items[newline_pos + 1 ..];
std.mem.copyForwards(u8, pipe.line_buffer.items[0..remaining.len], remaining);
pipe.line_buffer.items.len = remaining.len;
}
}
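The buffering strategy in `readChunk` is: append the chunk, emit every complete (newline-terminated) line with its prefix, and keep the partial tail for the next chunk. A minimal sketch of that split, with illustrative names:

```typescript
// Split buffered data into complete lines plus a carried-over remainder.
// Newlines are kept on each emitted line, as in the Zig code above.
function splitLines(buffer: string, chunk: string): { lines: string[]; rest: string } {
  let data = buffer + chunk;
  const lines: string[] = [];
  let nl: number;
  while ((nl = data.indexOf("\n")) !== -1) {
    lines.push(data.slice(0, nl + 1));
    data = data.slice(nl + 1);
  }
  return { lines, rest: data };
}
```

The remainder is only flushed (with a trailing newline added if missing) when the process exits, matching `flushPipeBuffer` below.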
fn writeLineWithPrefix(this: *This, handle: *ProcessHandle, line: []const u8, writer: *std.Io.Writer) std.Io.Writer.Error!void {
try this.writePrefix(handle, writer);
try writer.writeAll(line);
}
fn writePrefix(this: *This, handle: *ProcessHandle, writer: *std.Io.Writer) std.Io.Writer.Error!void {
if (this.use_colors) {
try writer.writeAll(colors[handle.color_idx % colors.len]);
}
try writer.writeAll(handle.config.label);
const padding = this.max_label_len -| handle.config.label.len;
for (0..padding) |_| {
try writer.writeByte(' ');
}
if (this.use_colors) {
try writer.writeAll(reset);
}
try writer.writeAll(" | ");
}
fn flushPipeBuffer(this: *This, handle: *ProcessHandle, pipe: *PipeReader) std.Io.Writer.Error!void {
if (pipe.line_buffer.items.len > 0) {
const line = pipe.line_buffer.items;
const needs_newline = line.len > 0 and line[line.len - 1] != '\n';
const writer = if (pipe.is_stderr) Output.errorWriter() else Output.writer();
try this.writeLineWithPrefix(handle, line, writer);
if (needs_newline) {
writer.writeAll("\n") catch {};
}
pipe.line_buffer.clearRetainingCapacity();
}
}
fn processExit(this: *This, handle: *ProcessHandle) std.Io.Writer.Error!void {
this.remaining_scripts -= 1;
// Flush remaining buffers (stdout first, then stderr)
try this.flushPipeBuffer(handle, &handle.stdout_reader);
try this.flushPipeBuffer(handle, &handle.stderr_reader);
// Print exit status to stderr (status messages always go to stderr)
const writer = Output.errorWriter();
try this.writePrefix(handle, writer);
switch (handle.process.?.status) {
.exited => |exited| {
if (exited.code != 0) {
try writer.print("Exited with code {d}\n", .{exited.code});
} else {
if (handle.start_time != null and handle.end_time != null) {
const duration = handle.end_time.?.since(handle.start_time.?);
const ms = @as(f64, @floatFromInt(duration)) / 1_000_000.0;
if (ms > 1000.0) {
try writer.print("Done in {d:.2}s\n", .{ms / 1000.0});
} else {
try writer.print("Done in {d:.0}ms\n", .{ms});
}
} else {
try writer.writeAll("Done\n");
}
}
},
.signaled => |signal| {
try writer.print("Signaled: {s}\n", .{@tagName(signal)});
},
else => {
try writer.writeAll("Error\n");
},
}
// Check if we should abort on error
const failed = switch (handle.process.?.status) {
.exited => |exited| exited.code != 0,
.signaled => true,
else => true,
};
if (failed and !this.no_exit_on_error) {
this.abort();
return;
}
if (failed) {
// Pre->main->post chain is broken -- skip group dependents.
this.skipDependents(handle.group_dependents.items);
// But cascade to next-group dependents (sequential --no-exit-on-error).
if (!this.aborted) {
this.startDependents(handle.next_dependents.items);
}
return;
}
// Success: cascade to all dependents
if (!this.aborted) {
this.startDependents(handle.group_dependents.items);
this.startDependents(handle.next_dependents.items);
}
}
fn startDependents(_: *This, dependents: []*ProcessHandle) void {
for (dependents) |dependent| {
dependent.remaining_dependencies -= 1;
if (dependent.remaining_dependencies == 0) {
dependent.start() catch {
Output.prettyErrorln("<r><red>error<r>: Failed to start process", .{});
Global.exit(1);
};
}
}
}
/// Skip group dependents that will never start because their predecessor
/// failed. Recursively skip their group dependents too.
fn skipDependents(this: *This, dependents: []*ProcessHandle) void {
for (dependents) |dependent| {
dependent.remaining_dependencies -= 1;
if (dependent.remaining_dependencies == 0) {
this.skipDependents(dependent.group_dependents.items);
// Still cascade next_dependents so sequential chains continue
if (!this.aborted) {
this.startDependents(dependent.next_dependents.items);
}
}
}
}
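The `remaining_dependencies` counter above implements a simple in-degree cascade: a handle starts only after every predecessor has reported in. A stripped-down sketch of that mechanism, with hypothetical names:

```typescript
// Each node starts exactly once, when its last remaining dependency completes.
interface DepNode {
  deps: number;          // remaining_dependencies
  dependents: DepNode[]; // nodes waiting on this one
  started: boolean;
}

function notifyDependents(done: DepNode, start: (n: DepNode) => void): void {
  for (const d of done.dependents) {
    d.deps -= 1;
    if (d.deps === 0) {
      d.started = true;
      start(d);
    }
  }
}
```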
pub fn abort(this: *This) void {
this.aborted = true;
for (this.handles) |*handle| {
if (handle.process) |*proc| {
if (proc.status == .running) {
_ = proc.ptr.kill(std.posix.SIG.INT);
}
}
}
}
pub fn finalize(this: *This) u8 {
for (this.handles) |handle| {
if (handle.process) |proc| {
switch (proc.status) {
.exited => |exited| {
if (exited.code != 0) return exited.code;
},
.signaled => |signal| return signal.toExitCode() orelse 1,
else => return 1,
}
}
}
return 0;
}
};
const AbortHandler = struct {
var should_abort = false;
fn posixSignalHandler(sig: i32, info: *const std.posix.siginfo_t, _: ?*const anyopaque) callconv(.c) void {
_ = sig;
_ = info;
should_abort = true;
}
fn windowsCtrlHandler(dwCtrlType: std.os.windows.DWORD) callconv(.winapi) std.os.windows.BOOL {
if (dwCtrlType == std.os.windows.CTRL_C_EVENT) {
should_abort = true;
return std.os.windows.TRUE;
}
return std.os.windows.FALSE;
}
pub fn install() void {
if (Environment.isPosix) {
const action = std.posix.Sigaction{
.handler = .{ .sigaction = AbortHandler.posixSignalHandler },
.mask = std.posix.sigemptyset(),
.flags = std.posix.SA.SIGINFO | std.posix.SA.RESTART | std.posix.SA.RESETHAND,
};
std.posix.sigaction(std.posix.SIG.INT, &action, null);
} else {
const res = bun.c.SetConsoleCtrlHandler(windowsCtrlHandler, std.os.windows.TRUE);
if (res == 0) {
if (Environment.isDebug) {
Output.warn("Failed to set abort handler\n", .{});
}
}
}
}
pub fn uninstall() void {
if (Environment.isWindows) {
_ = bun.c.SetConsoleCtrlHandler(null, std.os.windows.FALSE);
}
}
};
/// Simple glob matching: `*` matches any sequence of characters.
fn matchesGlob(pattern: []const u8, name: []const u8) bool {
var pi: usize = 0;
var ni: usize = 0;
var star_pi: usize = 0;
var star_ni: usize = 0;
var have_star = false;
while (ni < name.len or pi < pattern.len) {
if (pi < pattern.len and pattern[pi] == '*') {
have_star = true;
star_pi = pi;
star_ni = ni;
pi += 1;
} else if (pi < pattern.len and ni < name.len and pattern[pi] == name[ni]) {
pi += 1;
ni += 1;
} else if (have_star) {
pi = star_pi + 1;
star_ni += 1;
ni = star_ni;
if (ni > name.len) return false;
} else {
return false;
}
}
return true;
}
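The same single-star backtracking algorithm, ported to TypeScript for reference: on a mismatch after a `*`, the match position rewinds so the star absorbs one more character.

```typescript
// `*` matches any sequence of characters; everything else is literal.
function matchesGlob(pattern: string, name: string): boolean {
  let pi = 0, ni = 0, starPi = 0, starNi = 0, haveStar = false;
  while (ni < name.length || pi < pattern.length) {
    if (pi < pattern.length && pattern[pi] === "*") {
      haveStar = true;
      starPi = pi;
      starNi = ni;
      pi += 1;
    } else if (pi < pattern.length && ni < name.length && pattern[pi] === name[ni]) {
      pi += 1;
      ni += 1;
    } else if (haveStar) {
      // Backtrack: let the most recent `*` consume one more character.
      pi = starPi + 1;
      starNi += 1;
      ni = starNi;
      if (ni > name.length) return false;
    } else {
      return false;
    }
  }
  return true;
}
```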
/// Add configs for a single script name (with pre/post handling).
/// When `label_prefix` is non-null, labels become "{prefix}:{name}" (for workspace runs).
fn addScriptConfigs(
configs: *std.array_list.Managed(ScriptConfig),
group_infos: *std.array_list.Managed(GroupInfo),
raw_name: []const u8,
scripts_map: ?*const bun.StringArrayHashMap([]const u8),
allocator: std.mem.Allocator,
cwd: []const u8,
PATH: []const u8,
label_prefix: ?[]const u8,
) !void {
const group_start = configs.items.len;
const label = if (label_prefix) |prefix|
try std.fmt.allocPrint(allocator, "{s}:{s}", .{ prefix, raw_name })
else
raw_name;
const script_content = if (scripts_map) |sm| sm.get(raw_name) else null;
if (script_content) |content| {
// It's a package.json script - check for pre/post
const pre_name = try std.fmt.allocPrint(allocator, "pre{s}", .{raw_name});
const post_name = try std.fmt.allocPrint(allocator, "post{s}", .{raw_name});
const pre_content = if (scripts_map) |sm| sm.get(pre_name) else null;
const post_content = if (scripts_map) |sm| sm.get(post_name) else null;
if (pre_content) |pc| {
var cmd_buf = try std.array_list.Managed(u8).initCapacity(allocator, pc.len + 1);
try RunCommand.replacePackageManagerRun(&cmd_buf, pc);
try cmd_buf.append(0);
try configs.append(.{
.label = label,
.command = cmd_buf.items[0 .. cmd_buf.items.len - 1 :0],
.cwd = cwd,
.PATH = PATH,
});
}
// Main script
{
var cmd_buf = try std.array_list.Managed(u8).initCapacity(allocator, content.len + 1);
try RunCommand.replacePackageManagerRun(&cmd_buf, content);
try cmd_buf.append(0);
try configs.append(.{
.label = label,
.command = cmd_buf.items[0 .. cmd_buf.items.len - 1 :0],
.cwd = cwd,
.PATH = PATH,
});
}
if (post_content) |pc| {
var cmd_buf = try std.array_list.Managed(u8).initCapacity(allocator, pc.len + 1);
try RunCommand.replacePackageManagerRun(&cmd_buf, pc);
try cmd_buf.append(0);
try configs.append(.{
.label = label,
.command = cmd_buf.items[0 .. cmd_buf.items.len - 1 :0],
.cwd = cwd,
.PATH = PATH,
});
}
} else {
// Not a package.json script - run as a raw command
// If it looks like a file path, prefix with bun executable
const is_file = raw_name.len > 0 and (raw_name[0] == '.' or raw_name[0] == '/' or
(Environment.isWindows and raw_name[0] == '\\') or hasRunnableExtension(raw_name));
const command_z = if (is_file) brk: {
const bun_path = bun.selfExePath() catch "bun";
// Quote the bun path so that backslashes on Windows are not
// interpreted as escape characters by `bun exec` (Bun's shell).
const cmd_str = try std.fmt.allocPrint(allocator, "\"{s}\" {s}" ++ "\x00", .{ bun_path, raw_name });
break :brk cmd_str[0 .. cmd_str.len - 1 :0];
} else try allocator.dupeZ(u8, raw_name);
try configs.append(.{
.label = label,
.command = command_z,
.cwd = cwd,
.PATH = PATH,
});
}
try group_infos.append(.{
.start = group_start,
.count = configs.items.len - group_start,
});
}
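For package.json scripts, `addScriptConfigs` follows the standard pre/post lifecycle convention: `pre<name>` and `post<name>` wrap `<name>` when they exist. A sketch of just that expansion (the helper name is illustrative):

```typescript
// Expand a script name into its run order: pre<name>, <name>, post<name>,
// including the pre/post hooks only when they are defined.
function expandScript(name: string, scripts: Record<string, string>): string[] {
  const order: string[] = [];
  if (`pre${name}` in scripts) order.push(`pre${name}`);
  order.push(name);
  if (`post${name}` in scripts) order.push(`post${name}`);
  return order;
}
```

In the real code each entry in this order becomes one `ScriptConfig`, and the group chaining below ensures they run strictly in sequence.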
const GroupInfo = struct { start: usize, count: usize };
pub fn run(ctx: Command.Context) !noreturn {
// Validate flags
if (ctx.parallel and ctx.sequential) {
Output.prettyErrorln("<r><red>error<r>: --parallel and --sequential cannot be used together", .{});
Global.exit(1);
}
// Collect script names from positionals + passthrough
// For RunCommand: positionals[0] is "run", skip it. For AutoCommand: no "run" prefix.
var script_names = std.array_list.Managed([]const u8).init(ctx.allocator);
var positionals = ctx.positionals;
if (positionals.len > 0 and (strings.eqlComptime(positionals[0], "run") or strings.eqlComptime(positionals[0], "r"))) {
positionals = positionals[1..];
}
for (positionals) |pos| {
if (pos.len > 0) {
try script_names.append(pos);
}
}
for (ctx.passthrough) |pt| {
if (pt.len > 0) {
try script_names.append(pt);
}
}
if (script_names.items.len == 0) {
Output.prettyErrorln("<r><red>error<r>: --parallel/--sequential requires at least one script name", .{});
Global.exit(1);
}
// Set up the transpiler/environment
const fsinstance = try bun.fs.FileSystem.init(null);
var this_transpiler: transpiler.Transpiler = undefined;
_ = try RunCommand.configureEnvForRun(ctx, &this_transpiler, null, true, false);
const cwd = fsinstance.top_level_dir;
const event_loop = bun.jsc.MiniEventLoop.initGlobal(this_transpiler.env, null);
const shell_bin: [:0]const u8 = if (Environment.isPosix)
RunCommand.findShell(this_transpiler.env.get("PATH") orelse "", cwd) orelse return error.MissingShell
else
bun.selfExePath() catch return error.MissingShell;
// Build ScriptConfigs and ProcessHandles
// Each script name can produce up to 3 handles (pre, main, post)
var configs = std.array_list.Managed(ScriptConfig).init(ctx.allocator);
var group_infos = std.array_list.Managed(GroupInfo).init(ctx.allocator);
if (ctx.filters.len > 0 or ctx.workspaces) {
// Workspace-aware mode: iterate over matching workspace packages
var filters_to_use = ctx.filters;
if (ctx.workspaces) {
filters_to_use = &.{"*"};
}
var filter_instance = try FilterArg.FilterSet.init(ctx.allocator, filters_to_use, cwd);
var patterns = std.array_list.Managed([]u8).init(ctx.allocator);
var root_buf: bun.PathBuffer = undefined;
const resolve_root = try FilterArg.getCandidatePackagePatterns(ctx.allocator, ctx.log, &patterns, cwd, &root_buf);
var package_json_iter = try FilterArg.PackageFilterIterator.init(ctx.allocator, patterns.items, resolve_root);
defer package_json_iter.deinit();
// Phase 1: Collect matching packages (filesystem order is nondeterministic)
const MatchedPackage = struct {
name: []const u8,
dirpath: []const u8,
scripts: *const bun.StringArrayHashMap([]const u8),
PATH: []const u8,
};
var matched_packages = std.array_list.Managed(MatchedPackage).init(ctx.allocator);
while (try package_json_iter.next()) |package_json_path| {
const dirpath = try ctx.allocator.dupe(u8, std.fs.path.dirname(package_json_path) orelse Global.crash());
const path = bun.strings.withoutTrailingSlash(dirpath);
// When using --workspaces, skip the root package to prevent recursion
if (ctx.workspaces and strings.eql(path, resolve_root)) {
continue;
}
const pkgjson = bun.PackageJSON.parse(&this_transpiler.resolver, dirpath, .invalid, null, .include_scripts, .main) orelse {
continue;
};
if (!filter_instance.matches(path, pkgjson.name))
continue;
const pkg_scripts = pkgjson.scripts orelse continue;
const pkg_PATH = try RunCommand.configurePathForRunWithPackageJsonDir(ctx, dirpath, &this_transpiler, null, dirpath, ctx.debug.run_in_bun);
const pkg_name = if (pkgjson.name.len > 0)
pkgjson.name
else
// Fallback: use relative path from workspace root
try ctx.allocator.dupe(u8, bun.path.relativePlatform(resolve_root, path, .posix, false));
try matched_packages.append(.{
.name = pkg_name,
.dirpath = dirpath,
.scripts = pkg_scripts,
.PATH = pkg_PATH,
});
}
// Phase 2: Sort by package name, then by path as tiebreaker for deterministic ordering
std.mem.sort(MatchedPackage, matched_packages.items, {}, struct {
fn lessThan(_: void, a: MatchedPackage, b: MatchedPackage) bool {
const name_order = std.mem.order(u8, a.name, b.name);
if (name_order != .eq) return name_order == .lt;
return std.mem.order(u8, a.dirpath, b.dirpath) == .lt;
}
}.lessThan);
// Phase 3: Build configs from sorted packages
for (matched_packages.items) |pkg| {
for (script_names.items) |raw_name| {
if (std.mem.indexOfScalar(u8, raw_name, '*') != null) {
// Glob: expand against this package's scripts
var matches = std.array_list.Managed([]const u8).init(ctx.allocator);
for (pkg.scripts.keys()) |key| {
if (matchesGlob(raw_name, key)) {
try matches.append(key);
}
}
std.mem.sort([]const u8, matches.items, {}, struct {
fn lessThan(_: void, a: []const u8, b: []const u8) bool {
return std.mem.order(u8, a, b) == .lt;
}
}.lessThan);
for (matches.items) |matched_name| {
try addScriptConfigs(&configs, &group_infos, matched_name, pkg.scripts, ctx.allocator, pkg.dirpath, pkg.PATH, pkg.name);
}
} else {
if (pkg.scripts.get(raw_name) != null) {
try addScriptConfigs(&configs, &group_infos, raw_name, pkg.scripts, ctx.allocator, pkg.dirpath, pkg.PATH, pkg.name);
} else if (ctx.workspaces and !ctx.if_present) {
Output.prettyErrorln("<r><red>error<r>: Missing \"{s}\" script in package \"{s}\"", .{ raw_name, pkg.name });
Global.exit(1);
}
}
}
}
if (configs.items.len == 0) {
if (ctx.if_present) {
Global.exit(0);
}
if (ctx.workspaces) {
Output.prettyErrorln("<r><red>error<r>: No workspace packages have matching scripts", .{});
} else {
Output.prettyErrorln("<r><red>error<r>: No packages matched the filter", .{});
}
Global.exit(1);
}
} else {
// Single-package mode: use the root package.json
const PATH = try RunCommand.configurePathForRunWithPackageJsonDir(ctx, "", &this_transpiler, null, cwd, ctx.debug.run_in_bun);
// Load package.json scripts
const root_dir_info = this_transpiler.resolver.readDirInfo(cwd) catch {
Output.prettyErrorln("<r><red>error<r>: Failed to read directory", .{});
Global.exit(1);
} orelse {
Output.prettyErrorln("<r><red>error<r>: Failed to read directory", .{});
Global.exit(1);
};
const package_json = root_dir_info.enclosing_package_json;
const scripts_map: ?*const bun.StringArrayHashMap([]const u8) = if (package_json) |pkg| pkg.scripts else null;
for (script_names.items) |raw_name| {
// Check if this is a glob pattern
if (std.mem.indexOfScalar(u8, raw_name, '*') != null) {
if (scripts_map) |sm| {
// Collect matching script names
var matches = std.array_list.Managed([]const u8).init(ctx.allocator);
for (sm.keys()) |key| {
if (matchesGlob(raw_name, key)) {
try matches.append(key);
}
}
// Sort alphabetically
std.mem.sort([]const u8, matches.items, {}, struct {
fn lessThan(_: void, a: []const u8, b: []const u8) bool {
return std.mem.order(u8, a, b) == .lt;
}
}.lessThan);
if (matches.items.len == 0) {
Output.prettyErrorln("<r><red>error<r>: No scripts match pattern \"{s}\"", .{raw_name});
Global.exit(1);
}
for (matches.items) |matched_name| {
try addScriptConfigs(&configs, &group_infos, matched_name, scripts_map, ctx.allocator, cwd, PATH, null);
}
} else {
Output.prettyErrorln("<r><red>error<r>: Cannot use glob pattern \"{s}\" without package.json scripts", .{raw_name});
Global.exit(1);
}
} else {
try addScriptConfigs(&configs, &group_infos, raw_name, scripts_map, ctx.allocator, cwd, PATH, null);
}
}
}
if (configs.items.len == 0) {
Output.prettyErrorln("<r><red>error<r>: No scripts to run", .{});
Global.exit(1);
}
// Compute max label width
var max_label_len: usize = 0;
for (configs.items) |*config| {
if (config.label.len > max_label_len) {
max_label_len = config.label.len;
}
}
const use_colors = Output.enable_ansi_colors_stderr;
var state = State{
.handles = try ctx.allocator.alloc(ProcessHandle, configs.items.len),
.event_loop = event_loop,
.max_label_len = max_label_len,
.shell_bin = shell_bin,
.no_exit_on_error = ctx.no_exit_on_error,
.env = this_transpiler.env,
.use_colors = use_colors,
};
// Initialize handles
for (configs.items, 0..) |*config, i| {
// Find which group this belongs to, for color assignment
var color_idx: usize = 0;
for (group_infos.items, 0..) |group, gi| {
if (i >= group.start and i < group.start + group.count) {
color_idx = gi;
break;
}
}
state.handles[i] = ProcessHandle{
.state = &state,
.config = config,
.color_idx = color_idx,
.options = .{
.stdin = .ignore,
.stdout = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.stderr = if (Environment.isPosix) .buffer else .{ .buffer = try bun.default_allocator.create(bun.windows.libuv.Pipe) },
.cwd = config.cwd,
.windows = if (Environment.isWindows) .{ .loop = bun.jsc.EventLoopHandle.init(event_loop) },
.stream = true,
},
};
}
// Set up pre->main->post chaining within each group
for (group_infos.items) |group| {
if (group.count > 1) {
var j: usize = group.start;
while (j < group.start + group.count - 1) : (j += 1) {
try state.handles[j].group_dependents.append(&state.handles[j + 1]);
state.handles[j + 1].remaining_dependencies += 1;
}
}
}
// For sequential mode, chain groups together
if (ctx.sequential) {
var gi: usize = 0;
while (gi < group_infos.items.len - 1) : (gi += 1) {
const current_group = group_infos.items[gi];
const next_group = group_infos.items[gi + 1];
// Last handle of current group -> first handle of next group
const last_in_current = current_group.start + current_group.count - 1;
const first_in_next = next_group.start;
try state.handles[last_in_current].next_dependents.append(&state.handles[first_in_next]);
state.handles[first_in_next].remaining_dependencies += 1;
}
}
// Start handles with no dependencies
for (state.handles) |*handle| {
if (handle.remaining_dependencies == 0) {
handle.start() catch {
Output.prettyErrorln("<r><red>error<r>: Failed to start process", .{});
Global.exit(1);
};
}
}
AbortHandler.install();
while (!state.isDone()) {
if (AbortHandler.should_abort and !state.aborted) {
AbortHandler.uninstall();
state.abort();
}
event_loop.tickOnce(&state);
}
const status = state.finalize();
Global.exit(status);
}
fn hasRunnableExtension(name: []const u8) bool {
const ext = std.fs.path.extension(name);
const loader = bun.options.defaultLoaders.get(ext) orelse return false;
return loader.canBeRunByBun();
}
const FilterArg = @import("./filter_arg.zig");
const std = @import("std");
const RunCommand = @import("./run_command.zig").RunCommand;
const bun = @import("bun");
const Environment = bun.Environment;
const Global = bun.Global;
const Output = bun.Output;
const strings = bun.strings;
const transpiler = bun.transpiler;
const CLI = bun.cli;
const Command = CLI.Command;


@@ -184,71 +184,88 @@ pub const Loader = struct {
}
}
// NO_PROXY filter
// See the syntax at https://about.gitlab.com/blog/2021/01/27/we-need-to-talk-no-proxy/
if (http_proxy != null and hostname != null) {
if (this.get("no_proxy") orelse this.get("NO_PROXY")) |no_proxy_text| {
if (no_proxy_text.len == 0 or strings.eqlComptime(no_proxy_text, "\"\"") or strings.eqlComptime(no_proxy_text, "''")) {
return http_proxy;
}
var no_proxy_iter = std.mem.splitScalar(u8, no_proxy_text, ',');
while (no_proxy_iter.next()) |no_proxy_item| {
var no_proxy_entry = strings.trim(no_proxy_item, &strings.whitespace_chars);
if (no_proxy_entry.len == 0) {
continue;
}
if (strings.eql(no_proxy_entry, "*")) {
return null;
}
// strip a leading "."
if (strings.startsWithChar(no_proxy_entry, '.')) {
no_proxy_entry = no_proxy_entry[1..];
if (no_proxy_entry.len == 0) {
continue;
}
}
// Determine if entry contains a port or is an IPv6 address
// IPv6 addresses contain multiple colons (e.g., "::1", "2001:db8::1")
// Bracketed IPv6 with port: "[::1]:8080"
// Host with port: "localhost:8080" (single colon)
const colon_count = std.mem.count(u8, no_proxy_entry, ":");
const is_bracketed_ipv6 = strings.startsWithChar(no_proxy_entry, '[');
const has_port = blk: {
if (is_bracketed_ipv6) {
// Bracketed IPv6: check for "]:port" pattern
if (std.mem.indexOf(u8, no_proxy_entry, "]:")) |_| {
break :blk true;
}
break :blk false;
} else if (colon_count == 1) {
// Single colon means host:port (not IPv6)
break :blk true;
}
// Multiple colons without brackets = bare IPv6 literal (no port)
break :blk false;
};
if (has_port) {
// Entry has a port, do exact match against host:port
if (host) |h| {
if (strings.eqlCaseInsensitiveASCII(h, no_proxy_entry, true)) {
return null;
}
}
} else {
// Entry is hostname/IPv6 only, match against hostname (suffix match)
if (strings.endsWith(hostname.?, no_proxy_entry)) {
return null;
}
}
}
if (this.isNoProxy(hostname, host)) {
return null;
}
}
return http_proxy;
}
/// Returns true if the given hostname/host should bypass the proxy
/// according to the NO_PROXY / no_proxy environment variable.
pub fn isNoProxy(this: *const Loader, hostname: ?[]const u8, host: ?[]const u8) bool {
// NO_PROXY filter
// See the syntax at https://about.gitlab.com/blog/2021/01/27/we-need-to-talk-no-proxy/
const hn = hostname orelse return false;
const no_proxy_text = this.get("no_proxy") orelse this.get("NO_PROXY") orelse return false;
if (no_proxy_text.len == 0 or strings.eqlComptime(no_proxy_text, "\"\"") or strings.eqlComptime(no_proxy_text, "''")) {
return false;
}
var no_proxy_iter = std.mem.splitScalar(u8, no_proxy_text, ',');
while (no_proxy_iter.next()) |no_proxy_item| {
var no_proxy_entry = strings.trim(no_proxy_item, &strings.whitespace_chars);
if (no_proxy_entry.len == 0) {
continue;
}
if (strings.eql(no_proxy_entry, "*")) {
return true;
}
// strip a leading "."
if (strings.startsWithChar(no_proxy_entry, '.')) {
no_proxy_entry = no_proxy_entry[1..];
if (no_proxy_entry.len == 0) {
continue;
}
}
// Determine if entry contains a port or is an IPv6 address
// IPv6 addresses contain multiple colons (e.g., "::1", "2001:db8::1")
// Bracketed IPv6 with port: "[::1]:8080"
// Host with port: "localhost:8080" (single colon)
const colon_count = std.mem.count(u8, no_proxy_entry, ":");
const is_bracketed_ipv6 = strings.startsWithChar(no_proxy_entry, '[');
const has_port = blk: {
if (is_bracketed_ipv6) {
// Bracketed IPv6: check for "]:port" pattern
if (std.mem.indexOf(u8, no_proxy_entry, "]:")) |_| {
break :blk true;
}
break :blk false;
} else if (colon_count == 1) {
// Single colon means host:port (not IPv6)
break :blk true;
}
// Multiple colons without brackets = bare IPv6 literal (no port)
break :blk false;
};
if (has_port) {
// Entry has a port, do exact match against host:port
if (host) |h| {
if (strings.eqlCaseInsensitiveASCII(h, no_proxy_entry, true)) {
return true;
}
}
} else {
// Entry is hostname/IPv6 only, match exact or dot-boundary suffix (case-insensitive)
const entry_len = no_proxy_entry.len;
if (hn.len == entry_len) {
if (strings.eqlCaseInsensitiveASCII(hn, no_proxy_entry, true)) return true;
} else if (hn.len > entry_len and
hn[hn.len - entry_len - 1] == '.' and
strings.eqlCaseInsensitiveASCII(hn[hn.len - entry_len ..], no_proxy_entry, true))
{
return true;
}
}
}
return false;
}
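The rules `isNoProxy` implements (wildcard `*`, leading-dot stripping, exact case-insensitive match when the entry carries a port, and a dot-boundary suffix match on the hostname otherwise) can be sketched in TypeScript. `shouldBypassProxy` and its signature are illustrative, not Bun's API:

```typescript
// Sketch of NO_PROXY matching: wildcard, leading-dot stripping,
// exact host:port match, and dot-boundary suffix match on the hostname.
function shouldBypassProxy(noProxy: string, hostname: string, hostWithPort: string): boolean {
  for (const raw of noProxy.split(",")) {
    let entry = raw.trim();
    if (entry.length === 0) continue;
    if (entry === "*") return true; // bypass the proxy for everything
    if (entry.startsWith(".")) entry = entry.slice(1);
    if (entry.length === 0) continue;

    const bracketed = entry.startsWith("[");
    const colons = entry.split(":").length - 1;
    // A port is present for "[v6]:port", or for a single colon ("host:port");
    // multiple colons without brackets mean a bare IPv6 literal (no port).
    const hasPort = bracketed ? entry.includes("]:") : colons === 1;

    if (hasPort) {
      // Entry carries a port: exact, case-insensitive host:port match.
      if (hostWithPort.toLowerCase() === entry.toLowerCase()) return true;
    } else {
      // Hostname-only entry: exact match, or suffix match on a "." boundary.
      const h = hostname.toLowerCase();
      const e = entry.toLowerCase();
      if (h === e) return true;
      if (h.length > e.length && h.endsWith(e) && h[h.length - e.length - 1] === ".") return true;
    }
  }
  return false;
}

console.log(shouldBypassProxy("example.com", "api.example.com", "api.example.com:443")); // true
console.log(shouldBypassProxy("example.com", "notexample.com", "notexample.com:443"));   // false
console.log(shouldBypassProxy("localhost:8080", "localhost", "localhost:8080"));         // true
```

The dot-boundary check is what distinguishes the new `isNoProxy` from the old plain `endsWith` suffix match: `notexample.com` no longer bypasses the proxy for an entry of `example.com`.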
var did_load_ccache_path: bool = false;
pub fn loadCCachePath(this: *Loader, fs: *Fs.FileSystem) void {


@@ -395,6 +395,7 @@ pub const Options = struct {
target: options.Target = .browser,
runtime_transpiler_cache: ?*bun.jsc.RuntimeTranspilerCache = null,
module_info: ?*analyze_transpiled_module.ModuleInfo = null,
input_files_for_dev_server: ?[]logger.Source = null,
commonjs_named_exports: js_ast.Ast.CommonJSNamedExports = .{},
@@ -632,9 +633,44 @@ fn NewPrinter(
binary_expression_stack: std.array_list.Managed(BinaryExpressionVisitor) = undefined,
was_lazy_export: bool = false,
module_info: if (!may_have_module_info) void else ?*analyze_transpiled_module.ModuleInfo = if (!may_have_module_info) {} else null,
const Printer = @This();
const may_have_module_info = is_bun_platform and !rewrite_esm_to_cjs;
const TopLevelAndIsExport = if (!may_have_module_info) struct {} else struct {
is_export: bool = false,
is_top_level: ?analyze_transpiled_module.ModuleInfo.VarKind = null,
};
const TopLevel = if (!may_have_module_info) struct {
pub inline fn init(_: IsTopLevel) @This() {
return .{};
}
pub inline fn subVar(_: @This()) @This() {
return .{};
}
pub inline fn isTopLevel(_: @This()) bool {
return false;
}
} else struct {
is_top_level: IsTopLevel = .no,
pub inline fn init(is_top_level: IsTopLevel) @This() {
return .{ .is_top_level = is_top_level };
}
pub fn subVar(self: @This()) @This() {
if (self.is_top_level == .no) return @This().init(.no);
return @This().init(.var_only);
}
pub inline fn isTopLevel(self: @This()) bool {
return self.is_top_level != .no;
}
};
const IsTopLevel = enum { yes, var_only, no };
inline fn moduleInfo(self: *const Printer) ?*analyze_transpiled_module.ModuleInfo {
if (!may_have_module_info) return null;
return self.module_info;
}
/// When Printer is used as an io.Writer, this represents its error type, i.e. no errors.
pub const Error = error{};
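The `TopLevel` tracking above encodes the usual JavaScript scoping rule: `var` declarations hoist out of blocks to module scope, while `let`/`const`/`function`/`class` inside a block stay block-scoped, so `subVar()` downgrades `.yes` to `.var_only` on entering a block. A small TypeScript sketch of that lattice (the enum values mirror the Zig code; `recordsAtTopLevel` is an illustrative helper, not a Bun function):

```typescript
// Mirror of the IsTopLevel lattice: entering any nested block downgrades
// .yes to .var_only, so only `var` is still recorded as top-level there.
type IsTopLevel = "yes" | "var_only" | "no";

function subVar(level: IsTopLevel): IsTopLevel {
  return level === "no" ? "no" : "var_only";
}

// Should a declaration at `level` be recorded as a module-level variable?
function recordsAtTopLevel(level: IsTopLevel, kind: "var" | "let" | "const"): boolean {
  if (kind === "var") return level !== "no"; // var hoists through blocks
  return level === "yes";                    // let/const only at true top level
}

const moduleTop: IsTopLevel = "yes";
const insideBlock = subVar(moduleTop); // "var_only"

console.log(recordsAtTopLevel(moduleTop, "let"));   // true
console.log(recordsAtTopLevel(insideBlock, "var")); // true: var hoists
console.log(recordsAtTopLevel(insideBlock, "let")); // false: block-scoped
console.log(subVar("no"));                          // "no": function bodies stay non-top-level
```

This is why `printFunc` and function bodies start from `TopLevel.init(.no)` while blocks, `if`/loop bodies, and `try`/`catch` pass `subVar()` down: a `var` nested in a block is still a module-level variable, but one nested in a function is not.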
@@ -1031,6 +1067,25 @@ fn NewPrinter(
p.printSemicolonAfterStatement();
}
// Record var declarations for module_info. printGlobalBunImportStatement
// bypasses printDeclStmt/printBinding, so we must record vars explicitly.
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
if (import.star_name_loc != null) {
const name = p.renamer.nameForSymbol(import.namespace_ref);
bun.handleOom(mi.addVar(bun.handleOom(mi.str(name)), .declared));
}
if (import.default_name) |default| {
const name = p.renamer.nameForSymbol(default.ref.?);
bun.handleOom(mi.addVar(bun.handleOom(mi.str(name)), .declared));
}
for (import.items) |item| {
const name = p.renamer.nameForSymbol(item.name.ref.?);
bun.handleOom(mi.addVar(bun.handleOom(mi.str(name)), .declared));
}
}
}
}
pub inline fn printSpaceBeforeIdentifier(
@@ -1073,30 +1128,30 @@ fn NewPrinter(
}
}
pub fn printBody(p: *Printer, stmt: Stmt) void {
pub fn printBody(p: *Printer, stmt: Stmt, tlmtlo: TopLevel) void {
switch (stmt.data) {
.s_block => |block| {
p.printSpace();
p.printBlock(stmt.loc, block.stmts, block.close_brace_loc);
p.printBlock(stmt.loc, block.stmts, block.close_brace_loc, tlmtlo);
p.printNewline();
},
else => {
p.printNewline();
p.indent();
p.printStmt(stmt) catch unreachable;
p.printStmt(stmt, tlmtlo) catch unreachable;
p.unindent();
},
}
}
pub fn printBlockBody(p: *Printer, stmts: []const Stmt) void {
pub fn printBlockBody(p: *Printer, stmts: []const Stmt, tlmtlo: TopLevel) void {
for (stmts) |stmt| {
p.printSemicolonIfNeeded();
p.printStmt(stmt) catch unreachable;
p.printStmt(stmt, tlmtlo) catch unreachable;
}
}
pub fn printBlock(p: *Printer, loc: logger.Loc, stmts: []const Stmt, close_brace_loc: ?logger.Loc) void {
pub fn printBlock(p: *Printer, loc: logger.Loc, stmts: []const Stmt, close_brace_loc: ?logger.Loc, tlmtlo: TopLevel) void {
p.addSourceMapping(loc);
p.print("{");
if (stmts.len > 0) {
@@ -1104,7 +1159,7 @@ fn NewPrinter(
p.printNewline();
p.indent();
p.printBlockBody(stmts);
p.printBlockBody(stmts, tlmtlo);
p.unindent();
p.printIndent();
@@ -1123,8 +1178,8 @@ fn NewPrinter(
p.printNewline();
p.indent();
p.printBlockBody(prepend);
p.printBlockBody(stmts);
p.printBlockBody(prepend, TopLevel.init(.no));
p.printBlockBody(stmts, TopLevel.init(.no));
p.unindent();
p.needs_semicolon = false;
@@ -1132,7 +1187,7 @@ fn NewPrinter(
p.print("}");
}
pub fn printDecls(p: *Printer, comptime keyword: string, decls_: []G.Decl, flags: ExprFlag.Set) void {
pub fn printDecls(p: *Printer, comptime keyword: string, decls_: []G.Decl, flags: ExprFlag.Set, tlm: TopLevelAndIsExport) void {
p.print(keyword);
p.printSpace();
var decls = decls_;
@@ -1240,7 +1295,7 @@ fn NewPrinter(
.is_single_line = true,
};
const binding = Binding.init(&b_object, target_e_dot.target.loc);
p.printBinding(binding);
p.printBinding(binding, tlm);
}
p.printWhitespacer(ws(" = "));
@@ -1256,7 +1311,7 @@ fn NewPrinter(
}
{
p.printBinding(decls[0].binding);
p.printBinding(decls[0].binding, tlm);
if (decls[0].value) |value| {
p.printWhitespacer(ws(" = "));
@@ -1268,7 +1323,7 @@ fn NewPrinter(
p.print(",");
p.printSpace();
p.printBinding(decl.binding);
p.printBinding(decl.binding, tlm);
if (decl.value) |value| {
p.printWhitespacer(ws(" = "));
@@ -1342,7 +1397,7 @@ fn NewPrinter(
p.print("...");
}
p.printBinding(arg.binding);
p.printBinding(arg.binding, .{});
if (arg.default) |default| {
p.printWhitespacer(ws(" = "));
@@ -1358,7 +1413,7 @@ fn NewPrinter(
pub fn printFunc(p: *Printer, func: G.Fn) void {
p.printFnArgs(func.open_parens_loc, func.args, func.flags.contains(.has_rest_arg), false);
p.printSpace();
p.printBlock(func.body.loc, func.body.stmts, null);
p.printBlock(func.body.loc, func.body.stmts, null, TopLevel.init(.no));
}
pub fn printClass(p: *Printer, class: G.Class) void {
@@ -1382,7 +1437,7 @@ fn NewPrinter(
if (item.kind == .class_static_block) {
p.print("static");
p.printSpace();
p.printBlock(item.class_static_block.?.loc, item.class_static_block.?.stmts.slice(), null);
p.printBlock(item.class_static_block.?.loc, item.class_static_block.?.stmts.slice(), null, TopLevel.init(.no));
p.printNewline();
continue;
}
@@ -2009,6 +2064,7 @@ fn NewPrinter(
p.print(".importMeta");
} else if (!p.options.import_meta_ref.isValid()) {
// Most of the time, leave it in there
if (p.moduleInfo()) |mi| mi.flags.contains_import_meta = true;
p.print("import.meta");
} else {
// Note: The bundler will not hit this code path. The bundler will replace
@@ -2034,6 +2090,7 @@ fn NewPrinter(
p.printSpaceBeforeIdentifier();
p.addSourceMapping(expr.loc);
}
if (p.moduleInfo()) |mi| mi.flags.contains_import_meta = true;
p.print("import.meta.main");
} else {
bun.debugAssert(p.options.module_type != .internal_bake_dev);
@@ -2539,7 +2596,7 @@ fn NewPrinter(
}
if (!wasPrinted) {
p.printBlock(e.body.loc, e.body.stmts, null);
p.printBlock(e.body.loc, e.body.stmts, null, TopLevel.init(.no));
}
if (wrap) {
@@ -3525,13 +3582,21 @@ fn NewPrinter(
p.printExpr(initial, .comma, ExprFlag.None());
}
pub fn printBinding(p: *Printer, binding: Binding) void {
pub fn printBinding(p: *Printer, binding: Binding, tlm: TopLevelAndIsExport) void {
switch (binding.data) {
.b_missing => {},
.b_identifier => |b| {
p.printSpaceBeforeIdentifier();
p.addSourceMapping(binding.loc);
p.printSymbol(b.ref);
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const local_name = p.renamer.nameForSymbol(b.ref);
const name_id = bun.handleOom(mi.str(local_name));
if (tlm.is_top_level) |vk| bun.handleOom(mi.addVar(name_id, vk));
if (tlm.is_export) bun.handleOom(mi.addExportInfoLocal(name_id, name_id));
}
}
},
.b_array => |b| {
p.print("[");
@@ -3558,7 +3623,7 @@ fn NewPrinter(
p.print("...");
}
p.printBinding(item.binding);
p.printBinding(item.binding, tlm);
p.maybePrintDefaultBindingValue(item);
@@ -3605,7 +3670,7 @@ fn NewPrinter(
p.print("]:");
p.printSpace();
p.printBinding(property.value);
p.printBinding(property.value, tlm);
p.maybePrintDefaultBindingValue(property);
continue;
}
@@ -3630,6 +3695,13 @@ fn NewPrinter(
switch (property.value.data) {
.b_identifier => |id| {
if (str.eql(string, p.renamer.nameForSymbol(id.ref))) {
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const name_id = bun.handleOom(mi.str(str.data));
if (tlm.is_top_level) |vk| bun.handleOom(mi.addVar(name_id, vk));
if (tlm.is_export) bun.handleOom(mi.addExportInfoLocal(name_id, name_id));
}
}
p.maybePrintDefaultBindingValue(property);
continue;
}
@@ -3647,6 +3719,14 @@ fn NewPrinter(
switch (property.value.data) {
.b_identifier => |id| {
if (strings.utf16EqlString(str.slice16(), p.renamer.nameForSymbol(id.ref))) {
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const str8 = str.slice(p.options.allocator);
const name_id = bun.handleOom(mi.str(str8));
if (tlm.is_top_level) |vk| bun.handleOom(mi.addVar(name_id, vk));
if (tlm.is_export) bun.handleOom(mi.addExportInfoLocal(name_id, name_id));
}
}
p.maybePrintDefaultBindingValue(property);
continue;
}
@@ -3666,7 +3746,7 @@ fn NewPrinter(
p.printSpace();
}
p.printBinding(property.value);
p.printBinding(property.value, tlm);
p.maybePrintDefaultBindingValue(property);
}
@@ -3692,7 +3772,7 @@ fn NewPrinter(
}
}
pub fn printStmt(p: *Printer, stmt: Stmt) !void {
pub fn printStmt(p: *Printer, stmt: Stmt, tlmtlo: TopLevel) !void {
const prev_stmt_tag = p.prev_stmt_tag;
defer {
@@ -3729,23 +3809,25 @@ fn NewPrinter(
}
p.addSourceMapping(name.loc);
p.printSymbol(nameRef);
const local_name = p.renamer.nameForSymbol(nameRef);
p.printIdentifier(local_name);
p.printFunc(s.func);
// if (rewrite_esm_to_cjs and s.func.flags.contains(.is_export)) {
// p.printSemicolonAfterStatement();
// p.print("var ");
// p.printSymbol(nameRef);
// p.@"print = "();
// p.printSymbol(nameRef);
// p.printSemicolonAfterStatement();
// } else {
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const name_id = bun.handleOom(mi.str(local_name));
// function declarations are lexical (block-scoped in modules);
// only record at true top-level, not inside blocks.
if (tlmtlo.is_top_level == .yes) bun.handleOom(mi.addVar(name_id, .lexical));
if (s.func.flags.contains(.is_export)) bun.handleOom(mi.addExportInfoLocal(name_id, name_id));
}
}
p.printNewline();
// }
if (rewrite_esm_to_cjs and s.func.flags.contains(.is_export)) {
p.printIndent();
p.printBundledExport(p.renamer.nameForSymbol(nameRef), p.renamer.nameForSymbol(nameRef));
p.printBundledExport(local_name, local_name);
p.printSemicolonAfterStatement();
}
},
@@ -3767,9 +3849,20 @@ fn NewPrinter(
p.print("class ");
p.addSourceMapping(s.class.class_name.?.loc);
p.printSymbol(nameRef);
const nameStr = p.renamer.nameForSymbol(nameRef);
p.printIdentifier(nameStr);
p.printClass(s.class);
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const name_id = bun.handleOom(mi.str(nameStr));
// class declarations are lexical (block-scoped in modules);
// only record at true top-level, not inside blocks.
if (tlmtlo.is_top_level == .yes) bun.handleOom(mi.addVar(name_id, .lexical));
if (s.is_export) bun.handleOom(mi.addExportInfoLocal(name_id, name_id));
}
}
if (rewrite_esm_to_cjs and s.is_export) {
p.printSemicolonAfterStatement();
} else {
@@ -3805,6 +3898,13 @@ fn NewPrinter(
p.export_default_start = p.writer.written;
p.printExpr(expr, .comma, ExprFlag.None());
p.printSemicolonAfterStatement();
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
bun.handleOom(mi.addExportInfoLocal(bun.handleOom(mi.str("default")), .star_default));
bun.handleOom(mi.addVar(.star_default, .lexical));
}
}
return;
},
@@ -3825,26 +3925,44 @@ fn NewPrinter(
p.maybePrintSpace();
}
if (func.func.name) |name| {
p.printSymbol(name.ref.?);
const func_name: ?[]const u8 = if (func.func.name) |name| p.renamer.nameForSymbol(name.ref.?) else null;
if (func_name) |fn_name| {
p.printIdentifier(fn_name);
}
p.printFunc(func.func);
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const local_name: analyze_transpiled_module.StringID = if (func_name) |f| bun.handleOom(mi.str(f)) else .star_default;
bun.handleOom(mi.addExportInfoLocal(bun.handleOom(mi.str("default")), local_name));
bun.handleOom(mi.addVar(local_name, .lexical));
}
}
p.printNewline();
},
.s_class => |class| {
p.printSpaceBeforeIdentifier();
const class_name: ?[]const u8 = if (class.class.class_name) |name| p.renamer.nameForSymbol(name.ref orelse Output.panic("Internal error: Expected class to have a name ref", .{})) else null;
if (class.class.class_name) |name| {
p.print("class ");
p.printSymbol(name.ref orelse Output.panic("Internal error: Expected class to have a name ref", .{}));
p.printIdentifier(p.renamer.nameForSymbol(name.ref.?));
} else {
p.print("class");
}
p.printClass(class.class);
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const local_name: analyze_transpiled_module.StringID = if (class_name) |f| bun.handleOom(mi.str(f)) else .star_default;
bun.handleOom(mi.addExportInfoLocal(bun.handleOom(mi.str("default")), local_name));
bun.handleOom(mi.addVar(local_name, .lexical));
}
}
p.printNewline();
},
else => {
@@ -3875,8 +3993,21 @@ fn NewPrinter(
p.printWhitespacer(ws("from "));
}
const irp = p.importRecord(s.import_record_index).path.text;
p.printImportRecordPath(p.importRecord(s.import_record_index));
p.printSemicolonAfterStatement();
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const irp_id = bun.handleOom(mi.str(irp));
bun.handleOom(mi.requestModule(irp_id, .none));
if (s.alias) |alias| {
bun.handleOom(mi.addExportInfoNamespace(bun.handleOom(mi.str(alias.original_name)), irp_id));
} else {
bun.handleOom(mi.addExportInfoStar(irp_id));
}
}
}
},
.s_export_clause => |s| {
if (rewrite_esm_to_cjs) {
@@ -4026,7 +4157,14 @@ fn NewPrinter(
p.printIndent();
}
const name = p.renamer.nameForSymbol(item.name.ref.?);
p.printExportClauseItem(item);
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
bun.handleOom(mi.addExportInfoLocal(bun.handleOom(mi.str(item.alias)), bun.handleOom(mi.str(name))));
}
}
}
if (!s.is_single_line) {
@@ -4079,8 +4217,20 @@ fn NewPrinter(
}
p.printWhitespacer(ws("} from "));
const irp = import_record.path.text;
p.printImportRecordPath(import_record);
p.printSemicolonAfterStatement();
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const irp_id = bun.handleOom(mi.str(irp));
bun.handleOom(mi.requestModule(irp_id, .none));
for (s.items) |item| {
const name = p.renamer.nameForSymbol(item.name.ref.?);
bun.handleOom(mi.addExportInfoIndirect(bun.handleOom(mi.str(item.alias)), bun.handleOom(mi.str(name)), irp_id));
}
}
}
},
.s_local => |s| {
p.printIndent();
@@ -4088,41 +4238,42 @@ fn NewPrinter(
p.addSourceMapping(stmt.loc);
switch (s.kind) {
.k_const => {
p.printDeclStmt(s.is_export, "const", s.decls.slice());
p.printDeclStmt(s.is_export, "const", s.decls.slice(), tlmtlo);
},
.k_let => {
p.printDeclStmt(s.is_export, "let", s.decls.slice());
p.printDeclStmt(s.is_export, "let", s.decls.slice(), tlmtlo);
},
.k_var => {
p.printDeclStmt(s.is_export, "var", s.decls.slice());
p.printDeclStmt(s.is_export, "var", s.decls.slice(), tlmtlo);
},
.k_using => {
p.printDeclStmt(s.is_export, "using", s.decls.slice());
p.printDeclStmt(s.is_export, "using", s.decls.slice(), tlmtlo);
},
.k_await_using => {
p.printDeclStmt(s.is_export, "await using", s.decls.slice());
p.printDeclStmt(s.is_export, "await using", s.decls.slice(), tlmtlo);
},
}
},
.s_if => |s| {
p.printIndent();
p.printIf(s, stmt.loc);
p.printIf(s, stmt.loc, tlmtlo.subVar());
},
.s_do_while => |s| {
p.printIndent();
p.printSpaceBeforeIdentifier();
p.addSourceMapping(stmt.loc);
p.print("do");
const sub_var = tlmtlo.subVar();
switch (s.body.data) {
.s_block => {
p.printSpace();
p.printBlock(s.body.loc, s.body.data.s_block.stmts, s.body.data.s_block.close_brace_loc);
p.printBlock(s.body.loc, s.body.data.s_block.stmts, s.body.data.s_block.close_brace_loc, sub_var);
p.printSpace();
},
else => {
p.printNewline();
p.indent();
p.printStmt(s.body) catch unreachable;
p.printStmt(s.body, sub_var) catch unreachable;
p.printSemicolonIfNeeded();
p.unindent();
p.printIndent();
@@ -4150,7 +4301,7 @@ fn NewPrinter(
p.printSpace();
p.printExpr(s.value, .lowest, ExprFlag.None());
p.print(")");
p.printBody(s.body);
p.printBody(s.body, tlmtlo.subVar());
},
.s_for_of => |s| {
p.printIndent();
@@ -4170,7 +4321,7 @@ fn NewPrinter(
p.printSpace();
p.printExpr(s.value, .comma, ExprFlag.None());
p.print(")");
p.printBody(s.body);
p.printBody(s.body, tlmtlo.subVar());
},
.s_while => |s| {
p.printIndent();
@@ -4181,7 +4332,7 @@ fn NewPrinter(
p.print("(");
p.printExpr(s.test_, .lowest, ExprFlag.None());
p.print(")");
p.printBody(s.body);
p.printBody(s.body, tlmtlo.subVar());
},
.s_with => |s| {
p.printIndent();
@@ -4192,7 +4343,7 @@ fn NewPrinter(
p.print("(");
p.printExpr(s.value, .lowest, ExprFlag.None());
p.print(")");
p.printBody(s.body);
p.printBody(s.body, tlmtlo.subVar());
},
.s_label => |s| {
if (!p.options.minify_whitespace and p.options.indent.count > 0) {
@@ -4202,7 +4353,7 @@ fn NewPrinter(
p.addSourceMapping(stmt.loc);
p.printSymbol(s.name.ref orelse Output.panic("Internal error: expected label to have a name", .{}));
p.print(":");
p.printBody(s.stmt);
p.printBody(s.stmt, tlmtlo.subVar());
},
.s_try => |s| {
p.printIndent();
@@ -4210,7 +4361,8 @@ fn NewPrinter(
p.addSourceMapping(stmt.loc);
p.print("try");
p.printSpace();
p.printBlock(s.body_loc, s.body, null);
const sub_var_try = tlmtlo.subVar();
p.printBlock(s.body_loc, s.body, null, sub_var_try);
if (s.catch_) |catch_| {
p.printSpace();
@@ -4219,18 +4371,18 @@ fn NewPrinter(
if (catch_.binding) |binding| {
p.printSpace();
p.print("(");
p.printBinding(binding);
p.printBinding(binding, .{});
p.print(")");
}
p.printSpace();
p.printBlock(catch_.body_loc, catch_.body, null);
p.printBlock(catch_.body_loc, catch_.body, null, sub_var_try);
}
if (s.finally) |finally| {
p.printSpace();
p.print("finally");
p.printSpace();
p.printBlock(finally.loc, finally.stmts, null);
p.printBlock(finally.loc, finally.stmts, null, sub_var_try);
}
p.printNewline();
@@ -4261,7 +4413,7 @@ fn NewPrinter(
}
p.print(")");
p.printBody(s.body);
p.printBody(s.body, tlmtlo.subVar());
},
.s_switch => |s| {
p.printIndent();
@@ -4293,11 +4445,12 @@ fn NewPrinter(
p.print(":");
const sub_var_case = tlmtlo.subVar();
if (c.body.len == 1) {
switch (c.body[0].data) {
.s_block => {
p.printSpace();
p.printBlock(c.body[0].loc, c.body[0].data.s_block.stmts, c.body[0].data.s_block.close_brace_loc);
p.printBlock(c.body[0].loc, c.body[0].data.s_block.stmts, c.body[0].data.s_block.close_brace_loc, sub_var_case);
p.printNewline();
continue;
},
@@ -4309,7 +4462,7 @@ fn NewPrinter(
p.indent();
for (c.body) |st| {
p.printSemicolonIfNeeded();
p.printStmt(st) catch unreachable;
p.printStmt(st, sub_var_case) catch unreachable;
}
p.unindent();
}
@@ -4494,16 +4647,68 @@ fn NewPrinter(
.dataurl => p.printWhitespacer(ws(" with { type: \"dataurl\" }")),
.text => p.printWhitespacer(ws(" with { type: \"text\" }")),
.bunsh => p.printWhitespacer(ws(" with { type: \"sh\" }")),
// sqlite_embedded only relevant when bundling
.sqlite, .sqlite_embedded => p.printWhitespacer(ws(" with { type: \"sqlite\" }")),
.html => p.printWhitespacer(ws(" with { type: \"html\" }")),
.md => p.printWhitespacer(ws(" with { type: \"md\" }")),
};
p.printSemicolonAfterStatement();
if (may_have_module_info) {
if (p.moduleInfo()) |mi| {
const import_record_path = record.path.text;
const irp_id = bun.handleOom(mi.str(import_record_path));
const fetch_parameters: analyze_transpiled_module.ModuleInfo.FetchParameters = if (comptime is_bun_platform) (if (record.loader) |loader| switch (loader) {
.json => .json,
.jsx => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("jsx"))),
.js => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("js"))),
.ts => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("ts"))),
.tsx => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("tsx"))),
.css => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("css"))),
.file => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("file"))),
.jsonc => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("jsonc"))),
.toml => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("toml"))),
.yaml => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("yaml"))),
.wasm => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("wasm"))),
.napi => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("napi"))),
.base64 => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("base64"))),
.dataurl => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("dataurl"))),
.text => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("text"))),
.bunsh => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("sh"))),
.sqlite, .sqlite_embedded => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("sqlite"))),
.html => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("html"))),
.json5 => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("json5"))),
.md => analyze_transpiled_module.ModuleInfo.FetchParameters.hostDefined(bun.handleOom(mi.str("md"))),
} else .none) else .none;
bun.handleOom(mi.requestModule(irp_id, fetch_parameters));
if (s.default_name) |name| {
const local_name = p.renamer.nameForSymbol(name.ref.?);
const local_name_id = bun.handleOom(mi.str(local_name));
bun.handleOom(mi.addVar(local_name_id, .lexical));
bun.handleOom(mi.addImportInfoSingle(irp_id, bun.handleOom(mi.str("default")), local_name_id, false));
}
for (s.items) |item| {
const local_name = p.renamer.nameForSymbol(item.name.ref.?);
const local_name_id = bun.handleOom(mi.str(local_name));
bun.handleOom(mi.addVar(local_name_id, .lexical));
// In bundled output, all surviving imports are value imports
// (tree-shaking already removed type-only ones). The finalize()
// step handles type-only re-exports separately.
bun.handleOom(mi.addImportInfoSingle(irp_id, bun.handleOom(mi.str(item.alias)), local_name_id, false));
}
if (record.flags.contains_import_star) {
const local_name = p.renamer.nameForSymbol(s.namespace_ref);
bun.handleOom(mi.addVar(bun.handleOom(mi.str(local_name)), .lexical));
bun.handleOom(mi.addImportInfoNamespace(irp_id, bun.handleOom(mi.str(local_name))));
}
}
}
},
.s_block => |s| {
p.printIndent();
p.printBlock(stmt.loc, s.stmts, s.close_brace_loc);
p.printBlock(stmt.loc, s.stmts, s.close_brace_loc, tlmtlo.subVar());
p.printNewline();
},
.s_debugger => {
@@ -4779,19 +4984,19 @@ fn NewPrinter(
.s_local => |s| {
switch (s.kind) {
.k_var => {
p.printDecls("var", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }));
p.printDecls("var", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }), .{});
},
.k_let => {
p.printDecls("let", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }));
p.printDecls("let", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }), .{});
},
.k_const => {
p.printDecls("const", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }));
p.printDecls("const", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }), .{});
},
.k_using => {
p.printDecls("using", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }));
p.printDecls("using", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }), .{});
},
.k_await_using => {
p.printDecls("await using", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }));
p.printDecls("await using", s.decls.slice(), ExprFlag.Set.init(.{ .forbid_in = true }), .{});
},
}
},
@@ -4802,7 +5007,7 @@ fn NewPrinter(
},
}
}
pub fn printIf(p: *Printer, s: *const S.If, loc: logger.Loc) void {
pub fn printIf(p: *Printer, s: *const S.If, loc: logger.Loc, tlmtlo: TopLevel) void {
p.printSpaceBeforeIdentifier();
p.addSourceMapping(loc);
p.print("if");
@@ -4814,7 +5019,7 @@ fn NewPrinter(
switch (s.yes.data) {
.s_block => |block| {
p.printSpace();
p.printBlock(s.yes.loc, block.stmts, block.close_brace_loc);
p.printBlock(s.yes.loc, block.stmts, block.close_brace_loc, tlmtlo);
if (s.no != null) {
p.printSpace();
@@ -4829,7 +5034,7 @@ fn NewPrinter(
p.printNewline();
p.indent();
p.printStmt(s.yes) catch unreachable;
p.printStmt(s.yes, tlmtlo) catch unreachable;
p.unindent();
p.needs_semicolon = false;
@@ -4844,7 +5049,7 @@ fn NewPrinter(
} else {
p.printNewline();
p.indent();
p.printStmt(s.yes) catch unreachable;
p.printStmt(s.yes, tlmtlo) catch unreachable;
p.unindent();
if (s.no != null) {
@@ -4863,16 +5068,16 @@ fn NewPrinter(
switch (no_block.data) {
.s_block => {
p.printSpace();
p.printBlock(no_block.loc, no_block.data.s_block.stmts, null);
p.printBlock(no_block.loc, no_block.data.s_block.stmts, null, tlmtlo);
p.printNewline();
},
.s_if => {
p.printIf(no_block.data.s_if, no_block.loc);
p.printIf(no_block.data.s_if, no_block.loc, tlmtlo);
},
else => {
p.printNewline();
p.indent();
p.printStmt(no_block) catch unreachable;
p.printStmt(no_block, tlmtlo) catch unreachable;
p.unindent();
},
}
@@ -4953,11 +5158,20 @@ fn NewPrinter(
}
}
pub fn printDeclStmt(p: *Printer, is_export: bool, comptime keyword: string, decls: []G.Decl) void {
pub fn printDeclStmt(p: *Printer, is_export: bool, comptime keyword: string, decls: []G.Decl, tlmtlo: TopLevel) void {
if (!rewrite_esm_to_cjs and is_export) {
p.print("export ");
}
p.printDecls(keyword, decls, ExprFlag.None());
const tlm: TopLevelAndIsExport = if (may_have_module_info) .{
.is_export = is_export,
.is_top_level = if (comptime strings.eqlComptime(keyword, "var"))
(if (tlmtlo.isTopLevel()) .declared else null)
else
// let/const are block-scoped: only record at true top-level,
// not inside blocks where subVar() downgrades to .var_only.
(if (tlmtlo.is_top_level == .yes) .lexical else null),
} else .{};
p.printDecls(keyword, decls, ExprFlag.None(), tlm);
p.printSemicolonAfterStatement();
if (rewrite_esm_to_cjs and is_export and decls.len > 0) {
for (decls) |decl| {
@@ -5002,7 +5216,7 @@ fn NewPrinter(
p.print("}");
},
else => {
p.printBinding(decl.binding);
p.printBinding(decl.binding, .{});
},
}
p.print(")");
@@ -5335,7 +5549,7 @@ fn NewPrinter(
p.printFnArgs(func.open_parens_loc, func.args, func.flags.contains(.has_rest_arg), false);
p.print(" => {\n");
p.indent();
p.printBlockBody(func.body.stmts);
p.printBlockBody(func.body.stmts, TopLevel.init(.no));
p.unindent();
p.printIndent();
p.print("}, ");
@@ -5792,6 +6006,9 @@ pub fn printAst(
}
}
printer.was_lazy_export = tree.has_lazy_export;
if (PrinterType.may_have_module_info) {
printer.module_info = opts.module_info;
}
var bin_stack_heap = std.heap.stackFallback(1024, bun.default_allocator);
printer.binary_expression_stack = std.array_list.Managed(PrinterType.BinaryExpressionVisitor).init(bin_stack_heap.get());
defer printer.binary_expression_stack.clearAndFree();
@@ -5813,11 +6030,18 @@ pub fn printAst(
// This is never a symbol collision because `uses_require_ref` means
// `require` must be an unbound variable.
printer.print("var {require}=import.meta;");
if (PrinterType.may_have_module_info) {
if (printer.moduleInfo()) |mi| {
mi.flags.contains_import_meta = true;
bun.handleOom(mi.addVar(bun.handleOom(mi.str("require")), .declared));
}
}
}
for (tree.parts.slice()) |part| {
for (part.stmts) |stmt| {
try printer.printStmt(stmt);
try printer.printStmt(stmt, PrinterType.TopLevel.init(.yes));
if (printer.writer.getError()) {} else |err| {
return err;
}
@@ -5825,26 +6049,30 @@ pub fn printAst(
}
}
if (comptime FeatureFlags.runtime_transpiler_cache and generate_source_map) {
if (opts.source_map_handler) |handler| {
var source_maps_chunk = printer.source_map_builder.generateChunk(printer.writer.ctx.getWritten());
if (opts.runtime_transpiler_cache) |cache| {
cache.put(printer.writer.ctx.getWritten(), source_maps_chunk.buffer.list.items);
}
const have_module_info = PrinterType.may_have_module_info and opts.module_info != null;
if (have_module_info) {
try opts.module_info.?.finalize();
}
defer source_maps_chunk.deinit();
var source_maps_chunk: ?SourceMap.Chunk = if (comptime generate_source_map)
if (opts.source_map_handler != null)
printer.source_map_builder.generateChunk(printer.writer.ctx.getWritten())
else
null
else
null;
defer if (source_maps_chunk) |*chunk| chunk.deinit();
try handler.onSourceMapChunk(source_maps_chunk, source);
} else {
if (opts.runtime_transpiler_cache) |cache| {
cache.put(printer.writer.ctx.getWritten(), "");
}
}
} else if (comptime generate_source_map) {
if (opts.runtime_transpiler_cache) |cache| {
var srlz_res = std.array_list.Managed(u8).init(bun.default_allocator);
defer srlz_res.deinit();
if (have_module_info) try opts.module_info.?.asDeserialized().serialize(srlz_res.writer());
cache.put(printer.writer.ctx.getWritten(), if (source_maps_chunk) |chunk| chunk.buffer.list.items else "", srlz_res.items);
}
if (comptime generate_source_map) {
if (opts.source_map_handler) |handler| {
var chunk = printer.source_map_builder.generateChunk(printer.writer.ctx.getWritten());
defer chunk.deinit();
try handler.onSourceMapChunk(chunk, source);
try handler.onSourceMapChunk(source_maps_chunk.?, source);
}
}
@@ -5986,6 +6214,9 @@ pub fn printWithWriterAndPlatform(
getSourceMapBuilder(if (generate_source_maps) .eager else .disable, is_bun_platform, opts, source, &ast),
);
printer.was_lazy_export = ast.has_lazy_export;
if (PrinterType.may_have_module_info) {
printer.module_info = opts.module_info;
}
var bin_stack_heap = std.heap.stackFallback(1024, bun.default_allocator);
printer.binary_expression_stack = std.array_list.Managed(PrinterType.BinaryExpressionVisitor).init(bin_stack_heap.get());
defer printer.binary_expression_stack.clearAndFree();
@@ -6004,7 +6235,7 @@ pub fn printWithWriterAndPlatform(
for (parts) |part| {
for (part.stmts) |stmt| {
printer.printStmt(stmt) catch |err| {
printer.printStmt(stmt, PrinterType.TopLevel.init(.yes)) catch |err| {
return .{ .err = err };
};
if (printer.writer.getError()) {} else |err| {
@@ -6074,7 +6305,7 @@ pub fn printCommonJS(
for (tree.parts.slice()) |part| {
for (part.stmts) |stmt| {
try printer.printStmt(stmt);
try printer.printStmt(stmt, PrinterType.TopLevel.init(.yes));
if (printer.writer.getError()) {} else |err| {
return err;
}
@@ -6098,9 +6329,24 @@ pub fn printCommonJS(
return @as(usize, @intCast(@max(printer.writer.written, 0)));
}
/// Serializes ModuleInfo to an owned byte slice. Returns null on failure.
/// The caller is responsible for freeing the returned slice with bun.default_allocator.
pub fn serializeModuleInfo(module_info: ?*analyze_transpiled_module.ModuleInfo) ?[]const u8 {
const mi = module_info orelse return null;
if (!mi.finalized) {
mi.finalize() catch return null;
}
const deserialized = mi.asDeserialized();
var buf: std.ArrayList(u8) = .empty;
defer buf.deinit(bun.default_allocator);
deserialized.serialize(buf.writer(bun.default_allocator)) catch return null;
return buf.toOwnedSlice(bun.default_allocator) catch null;
}
const string = []const u8;
const SourceMap = @import("./sourcemap/sourcemap.zig");
const analyze_transpiled_module = @import("./analyze_transpiled_module.zig");
const fs = @import("./fs.zig");
const importRecord = @import("./import_record.zig");
const options = @import("./options.zig");


@@ -280,6 +280,17 @@ pub const Interpreter = struct {
exit_code: ?ExitCode = 0,
this_jsvalue: JSValue = .zero,
/// Tracks which resources have been cleaned up to avoid double-free.
/// When the interpreter finishes normally via finish(), it cleans up
/// the runtime resources (IO, shell env) and sets this to .runtime_cleaned.
/// The GC finalizer then only cleans up what remains (args, interpreter itself).
cleanup_state: enum(u8) {
/// Nothing has been cleaned up yet - need full cleanup
needs_full_cleanup,
/// Runtime resources (IO, shell env) have been cleaned up via finish()
runtime_cleaned,
} = .needs_full_cleanup,
__alloc_scope: if (bun.Environment.enableAllocScopes) bun.AllocationScope else void,
estimated_size_for_gc: usize = 0,
@@ -1222,6 +1233,11 @@ pub const Interpreter = struct {
}
fn #derefRootShellAndIOIfNeeded(this: *ThisInterpreter, free_buffered_io: bool) void {
// Check if already cleaned up to prevent double-free
if (this.cleanup_state == .runtime_cleaned) {
return;
}
if (free_buffered_io) {
// Can safely be called multiple times.
if (this.root_shell._buffered_stderr == .owned) {
@@ -1240,10 +1256,26 @@ pub const Interpreter = struct {
}
this.this_jsvalue = .zero;
// Mark that runtime resources have been cleaned up
this.cleanup_state = .runtime_cleaned;
}
fn deinitFromFinalizer(this: *ThisInterpreter) void {
this.#derefRootShellAndIOIfNeeded(true);
log("Interpreter(0x{x}) deinitFromFinalizer (cleanup_state={s})", .{ @intFromPtr(this), @tagName(this.cleanup_state) });
switch (this.cleanup_state) {
.needs_full_cleanup => {
// The interpreter never finished normally (e.g., early error or never started),
// so we need to clean up IO and shell env here
this.root_io.deref();
this.root_shell.deinitImpl(false, true);
},
.runtime_cleaned => {
// finish() already cleaned up IO and shell env via #derefRootShellAndIOIfNeeded,
// nothing more to do for those resources
},
}
this.keep_alive.disable();
this.args.deinit();
this.allocator.destroy(this);


@@ -783,6 +783,7 @@ pub const Transpiler = struct {
comptime enable_source_map: bool,
source_map_context: ?js_printer.SourceMapHandler,
runtime_transpiler_cache: ?*bun.jsc.RuntimeTranspilerCache,
module_info: ?*analyze_transpiled_module.ModuleInfo,
) !usize {
const tracer = if (enable_source_map)
bun.perf.trace("JSPrinter.printWithSourceMap")
@@ -872,6 +873,7 @@ pub const Transpiler = struct {
.inline_require_and_import_errors = false,
.import_meta_ref = ast.import_meta_ref,
.runtime_transpiler_cache = runtime_transpiler_cache,
.module_info = module_info,
.target = transpiler.options.target,
.print_dce_annotations = transpiler.options.emit_dce_annotations,
.hmr_ref = ast.wrapper_ref,
@@ -900,6 +902,7 @@ pub const Transpiler = struct {
false,
null,
null,
null,
);
}
@@ -910,6 +913,7 @@ pub const Transpiler = struct {
writer: Writer,
comptime format: js_printer.Format,
handler: js_printer.SourceMapHandler,
module_info: ?*analyze_transpiled_module.ModuleInfo,
) !usize {
if (bun.feature_flag.BUN_FEATURE_FLAG_DISABLE_SOURCE_MAPS.get()) {
return transpiler.printWithSourceMapMaybe(
@@ -921,6 +925,7 @@ pub const Transpiler = struct {
false,
handler,
result.runtime_transpiler_cache,
module_info,
);
}
return transpiler.printWithSourceMapMaybe(
@@ -932,6 +937,7 @@ pub const Transpiler = struct {
true,
handler,
result.runtime_transpiler_cache,
module_info,
);
}
@@ -1621,6 +1627,7 @@ const Fs = @import("./fs.zig");
const MimeType = @import("./http/MimeType.zig");
const NodeFallbackModules = @import("./node_fallbacks.zig");
const Router = @import("./router.zig");
const analyze_transpiled_module = @import("./analyze_transpiled_module.zig");
const runtime = @import("./runtime.zig");
const std = @import("std");
const DataURL = @import("./resolver/data_url.zig").DataURL;


@@ -1,7 +1,8 @@
import { Database } from "bun:sqlite";
import { describe, expect, test } from "bun:test";
import { rmSync } from "fs";
import { bunEnv, bunExe, isWindows, tempDirWithFiles } from "harness";
import { bunEnv, bunExe, isWindows, tempDir, tempDirWithFiles } from "harness";
import { join } from "path";
import { itBundled } from "./expectBundled";
describe("bundler", () => {
@@ -89,6 +90,135 @@ describe("bundler", () => {
},
},
});
// ESM bytecode test matrix: each scenario × {default, minified} = 2 tests per scenario.
// With --compile, static imports are inlined into one chunk, but dynamic imports
// create separate modules in the standalone graph — each with its own bytecode + ModuleInfo.
const esmBytecodeScenarios: Array<{
name: string;
files: Record<string, string>;
stdout: string;
}> = [
{
name: "HelloWorld",
files: {
"/entry.ts": `console.log("Hello, world!");`,
},
stdout: "Hello, world!",
},
{
// top-level await is ESM-only; if ModuleInfo or bytecode generation
// mishandles async modules, this breaks.
name: "TopLevelAwait",
files: {
"/entry.ts": `
const result = await Promise.resolve("tla works");
console.log(result);
`,
},
stdout: "tla works",
},
{
// import.meta is ESM-only.
name: "ImportMeta",
files: {
"/entry.ts": `
console.log(typeof import.meta.url === "string" ? "ok" : "fail");
console.log(typeof import.meta.dir === "string" ? "ok" : "fail");
`,
},
stdout: "ok\nok",
},
{
// Dynamic import creates a separate module in the standalone graph,
// exercising per-module bytecode + ModuleInfo.
name: "DynamicImport",
files: {
"/entry.ts": `
const { value } = await import("./lazy.ts");
console.log("lazy:", value);
`,
"/lazy.ts": `export const value = 42;`,
},
stdout: "lazy: 42",
},
{
// Dynamic import of a module that itself uses top-level await.
// The dynamically imported module is a separate chunk with async
// evaluation — stresses both ModuleInfo and async bytecode loading.
name: "DynamicImportTLA",
files: {
"/entry.ts": `
const mod = await import("./async-mod.ts");
console.log("value:", mod.value);
`,
"/async-mod.ts": `export const value = await Promise.resolve(99);`,
},
stdout: "value: 99",
},
{
// Multiple dynamic imports: several separate modules in the graph,
// each with its own bytecode + ModuleInfo.
name: "MultipleDynamicImports",
files: {
"/entry.ts": `
const [a, b] = await Promise.all([
import("./mod-a.ts"),
import("./mod-b.ts"),
]);
console.log(a.value, b.value);
`,
"/mod-a.ts": `export const value = "a";`,
"/mod-b.ts": `export const value = "b";`,
},
stdout: "a b",
},
];
for (const scenario of esmBytecodeScenarios) {
for (const minify of [false, true]) {
itBundled(`compile/ESMBytecode+${scenario.name}${minify ? "+minify" : ""}`, {
compile: true,
bytecode: true,
format: "esm",
...(minify && {
minifySyntax: true,
minifyIdentifiers: true,
minifyWhitespace: true,
}),
files: scenario.files,
run: { stdout: scenario.stdout },
});
}
}
// Multi-entry ESM bytecode with Worker (can't be in the matrix — needs
// entryPointsRaw, outfile, setCwd). Each entry becomes a separate module
// in the standalone graph with its own bytecode + ModuleInfo.
itBundled("compile/WorkerBytecodeESM", {
backend: "cli",
compile: true,
bytecode: true,
format: "esm",
files: {
"/entry.ts": /* js */ `
import {rmSync} from 'fs';
// Verify we're not just importing from the filesystem
rmSync("./worker.ts", {force: true});
console.log("Hello, world!");
new Worker("./worker.ts");
`,
"/worker.ts": /* js */ `
console.log("Worker loaded!");
`.trim(),
},
entryPointsRaw: ["./entry.ts", "./worker.ts"],
outfile: "dist/out",
run: {
stdout: "Hello, world!\nWorker loaded!\n",
file: "dist/out",
setCwd: true,
},
});
// https://github.com/oven-sh/bun/issues/8697
itBundled("compile/EmbeddedFileOutfile", {
compile: true,
@@ -311,6 +441,8 @@ describe("bundler", () => {
format: "cjs" | "esm";
}> = [
{ bytecode: true, minify: true, format: "cjs" },
{ bytecode: true, format: "esm" },
{ bytecode: true, minify: true, format: "esm" },
{ format: "cjs" },
{ format: "cjs", minify: true },
{ format: "esm" },
@@ -736,6 +868,54 @@ const server = serve({
.throws(true);
});
// Verify ESM bytecode is actually loaded from the cache at runtime, not just generated.
// Uses regex matching on stderr (not itBundled) since we don't know the exact
// number of cache hit/miss lines for ESM standalone.
test("ESM bytecode cache is used at runtime", async () => {
const ext = isWindows ? ".exe" : "";
using dir = tempDir("esm-bytecode-cache", {
"entry.js": `console.log("esm bytecode loaded");`,
});
const outfile = join(String(dir), `app${ext}`);
// Build with ESM + bytecode
await using build = Bun.spawn({
cmd: [
bunExe(),
"build",
"--compile",
"--bytecode",
"--format=esm",
join(String(dir), "entry.js"),
"--outfile",
outfile,
],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [, buildStderr, buildExitCode] = await Promise.all([build.stdout.text(), build.stderr.text(), build.exited]);
expect(buildStderr).toBe("");
expect(buildExitCode).toBe(0);
// Run with verbose disk cache to verify bytecode is loaded
await using exe = Bun.spawn({
cmd: [outfile],
env: { ...bunEnv, BUN_JSC_verboseDiskCache: "1" },
stdout: "pipe",
stderr: "pipe",
});
const [exeStdout, exeStderr, exeExitCode] = await Promise.all([exe.stdout.text(), exe.stderr.text(), exe.exited]);
expect(exeStdout).toContain("esm bytecode loaded");
expect(exeStderr).toMatch(/\[Disk Cache\].*Cache hit/i);
expect(exeExitCode).toBe(0);
});
// When compiling with 8+ entry points, the main entry point should still run correctly.
test("compile with 8+ entry points runs main entry correctly", async () => {
const dir = tempDirWithFiles("compile-many-entries", {


@@ -36,5 +36,30 @@ describe("bundler", () => {
stdout: "app entry\nheader rendering\nmenu showing\nitems: home,about,contact",
},
});
for (const minify of [false, true]) {
itBundled(`compile/splitting/ImportMetaInSplitChunk${minify ? "+minify" : ""}`, {
compile: true,
splitting: true,
bytecode: true,
format: "esm",
...(minify ? { minifySyntax: true, minifyIdentifiers: true, minifyWhitespace: true } : {}),
files: {
"/entry.ts": /* js */ `
const mod = await import("./worker.ts");
mod.run();
`,
"/worker.ts": /* js */ `
export function run() {
console.log(typeof import.meta.url === "string" ? "ok" : "fail");
console.log(typeof import.meta.dir === "string" ? "ok" : "fail");
}
`,
},
run: {
stdout: "ok\nok",
},
});
}
});
});


@@ -163,8 +163,8 @@ describe.skipIf(!isWindows).concurrent("Windows compile metadata", () => {
const [stderr, exitCode] = await Promise.all([proc.stderr.text(), proc.exited]);
expect(exitCode).not.toBe(0);
// When cross-compiling to non-Windows, it tries to download the target but fails
expect(stderr.toLowerCase()).toContain("target platform");
// Windows flags require a Windows compile target
expect(stderr.toLowerCase()).toContain("windows compile target");
});
});

File diff suppressed because it is too large


@@ -1,10 +1,21 @@
import { expectType } from "./utilities";
import { expectAssignable, expectType } from "./utilities";
Bun.build({
entrypoints: ["hey"],
splitting: false,
});
// Build.CompileTarget should accept SIMD variants (issue #26247)
expectAssignable<Bun.Build.CompileTarget>("bun-linux-x64-modern");
expectAssignable<Bun.Build.CompileTarget>("bun-linux-x64-baseline");
expectAssignable<Bun.Build.CompileTarget>("bun-linux-arm64-modern");
expectAssignable<Bun.Build.CompileTarget>("bun-linux-arm64-baseline");
expectAssignable<Bun.Build.CompileTarget>("bun-linux-x64-modern-glibc");
expectAssignable<Bun.Build.CompileTarget>("bun-linux-x64-modern-musl");
expectAssignable<Bun.Build.CompileTarget>("bun-darwin-x64-modern");
expectAssignable<Bun.Build.CompileTarget>("bun-darwin-arm64-baseline");
expectAssignable<Bun.Build.CompileTarget>("bun-windows-x64-modern");
Bun.build({
entrypoints: ["hey"],
splitting: false,


@@ -145,3 +145,23 @@ listener.reload({
// ...listener.
},
});
// Test Socket.reload() type signature (issue #26290)
// The socket instance's reload() method should also accept { socket: handler }
await Bun.connect({
data: { arg: "asdf" },
socket: {
open(socket) {
// Socket.reload() should accept { socket: handler }, not handler directly
socket.reload({
socket: {
open() {},
data() {},
},
});
},
data() {},
},
hostname: "localhost",
port: 1,
});


@@ -1,4 +1,4 @@
import { expect, test } from "bun:test" with { todo: "true" };
import { expect, test } from "bun:test";
import "reflect-metadata";
function Abc() {
return (target: any, field: string) => {};


@@ -1,7 +1,112 @@
import type { Socket } from "bun";
import { setSocketOptions } from "bun:internal-for-testing";
import { describe, test } from "bun:test";
import { isPosix } from "harness";
import { describe, expect, test } from "bun:test";
import { bunEnv, bunExe, isPosix } from "harness";
describe.if(isPosix)("HTTP server handles chunked transfer encoding", () => {
test("handles fragmented chunk terminators", async () => {
const script = `
const server = Bun.serve({
port: 0,
async fetch(req) {
const body = await req.text();
return new Response("Got: " + body);
},
});
const { promise, resolve } = Promise.withResolvers();
const socket = await Bun.connect({
hostname: "localhost",
port: server.port,
socket: {
data(socket, data) {
console.log(data.toString());
socket.end();
},
open(socket) {
socket.write("POST / HTTP/1.1\\r\\nHost: localhost\\r\\nTransfer-Encoding: chunked\\r\\n\\r\\n4\\r\\nWiki\\r");
socket.flush();
setTimeout(() => {
socket.write("\\n0\\r\\n\\r\\n");
socket.flush();
}, 50);
},
error() {},
close() { resolve(); },
},
});
await promise;
server.stop();
`;
await using proc = Bun.spawn({
cmd: [bunExe(), "-e", script],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
expect(stdout).toContain("200 OK");
expect(stdout).toContain("Got: Wiki");
expect(exitCode).toBe(0);
});
test("rejects invalid terminator in fragmented reads", async () => {
const script = `
const server = Bun.serve({
port: 0,
async fetch(req) {
const body = await req.text();
return new Response("Got: " + body);
},
});
const { promise, resolve } = Promise.withResolvers();
const socket = await Bun.connect({
hostname: "localhost",
port: server.port,
socket: {
data(socket, data) {
console.log(data.toString());
socket.end();
},
open(socket) {
socket.write("POST / HTTP/1.1\\r\\nHost: localhost\\r\\nTransfer-Encoding: chunked\\r\\n\\r\\n4\\r\\nTestX");
socket.flush();
setTimeout(() => {
socket.write("\\n0\\r\\n\\r\\n");
socket.flush();
}, 50);
},
error() {},
close() { resolve(); },
},
});
await promise;
server.stop();
`;
await using proc = Bun.spawn({
cmd: [bunExe(), "-e", script],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
expect(stdout).toContain("400");
expect(exitCode).toBe(0);
});
});
describe.if(isPosix)("HTTP server handles fragmented requests", () => {
test("handles requests with tiny send buffer (regression test)", async () => {


@@ -1,7 +1,7 @@
import axios from "axios";
import type { Server } from "bun";
import { afterAll, beforeAll, describe, expect, test } from "bun:test";
import { tls as tlsCert } from "harness";
import { bunEnv, bunExe, tls as tlsCert } from "harness";
import { HttpsProxyAgent } from "https-proxy-agent";
import { once } from "node:events";
import net from "node:net";
@@ -859,3 +859,84 @@ describe("proxy object format with headers", () => {
expect(response.status).toBe(200);
});
});
describe.concurrent("NO_PROXY with explicit proxy option", () => {
// These tests use subprocess spawning because NO_PROXY is read from the
// process environment at startup. A dead proxy that immediately closes
// connections is used so that if NO_PROXY doesn't work, the fetch fails
// with a connection error.
let deadProxyPort: number;
let deadProxy: ReturnType<typeof Bun.listen>;
beforeAll(() => {
deadProxy = Bun.listen({
hostname: "127.0.0.1",
port: 0,
socket: {
open(socket) {
socket.end();
},
data() {},
},
});
deadProxyPort = deadProxy.port;
});
afterAll(() => {
deadProxy.stop(true);
});
test("NO_PROXY bypasses explicit proxy for fetch", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const resp = await fetch("http://localhost:${httpServer.port}", { proxy: "http://127.0.0.1:${deadProxyPort}" }); console.log(resp.status);`,
],
env: { ...bunEnv, NO_PROXY: "localhost" },
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
expect(stdout.trim()).toBe("200");
expect(exitCode).toBe(0);
});
test("NO_PROXY with port bypasses explicit proxy for fetch", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const resp = await fetch("http://localhost:${httpServer.port}", { proxy: "http://127.0.0.1:${deadProxyPort}" }); console.log(resp.status);`,
],
env: { ...bunEnv, NO_PROXY: `localhost:${httpServer.port}` },
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
expect(stdout.trim()).toBe("200");
expect(exitCode).toBe(0);
});
test("NO_PROXY non-match does not bypass explicit proxy", async () => {
// NO_PROXY doesn't match, so fetch should try the dead proxy and fail
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`try { await fetch("http://localhost:${httpServer.port}", { proxy: "http://127.0.0.1:${deadProxyPort}" }); process.exit(1); } catch { process.exit(0); }`,
],
env: { ...bunEnv, NO_PROXY: "other.com" },
stdout: "pipe",
stderr: "pipe",
});
const exitCode = await proc.exited;
// exit(0) means fetch threw (proxy connection failed), proving proxy was used
expect(exitCode).toBe(0);
});
});


@@ -0,0 +1,499 @@
import { describe, expect, test } from "bun:test";
import { bunEnv, bunExe, isWindows, tempDirWithFiles } from "harness";
const ext = isWindows ? ".exe" : "";
function compileAndRun(dir: string, entrypoint: string) {
const outfile = dir + `/compiled${ext}`;
const buildResult = Bun.spawnSync({
cmd: [bunExe(), "build", "--compile", "--bytecode", "--format=esm", entrypoint, "--outfile", outfile],
env: bunEnv,
cwd: dir,
stdio: ["inherit", "pipe", "pipe"],
});
expect(buildResult.stderr.toString()).toBe("");
expect(buildResult.exitCode).toBe(0);
return Bun.spawnSync({
cmd: [outfile],
env: bunEnv,
cwd: dir,
stdio: ["inherit", "pipe", "pipe"],
});
}
const a_file = `
export type my_string = "1";
export type my_value = "2";
export const my_value = "2";
export const my_only = "3";
`;
const a_no_value = `
export type my_string = "1";
export type my_value = "2";
export const my_only = "3";
`;
const a_with_value = `
export type my_string = "1";
export const my_value = "2";
`;
const b_files = [
{
name: "export from",
value: `export { my_string, my_value, my_only } from "./a.ts";`,
},
{
name: "import then export",
value: `
import { my_string, my_value, my_only } from "./a.ts";
export { my_string, my_value, my_only };
`,
},
{
name: "export star",
value: `export * from "./a.ts";`,
},
{
name: "export merge",
value: `export * from "./a_no_value.ts"; export * from "./a_with_value.ts"`,
},
];
const c_files = [
{ name: "require", value: `console.log(JSON.stringify(require("./b")));` },
{ name: "import star", value: `import * as b from "./b"; console.log(JSON.stringify(b));` },
{ name: "await import", value: `console.log(JSON.stringify(await import("./b")));` },
{
name: "import individual",
value: `
import { my_string, my_value, my_only } from "./b";
console.log(JSON.stringify({ my_only, my_value }));
`,
},
];
for (const b_file of b_files) {
describe(`re-export with ${b_file.name}`, () => {
for (const c_file of c_files) {
describe(`import with ${c_file.name}`, () => {
const dir = tempDirWithFiles("type-export", {
"a.ts": a_file,
"b.ts": b_file.value,
"c.ts": c_file.value,
"a_no_value.ts": a_no_value,
"a_with_value.ts": a_with_value,
});
describe.each(["run", "compile", "build"])("%s", mode => {
// TODO: "run" is skipped until ESM module_info is enabled in the runtime transpiler.
// Currently module_info is only generated for standalone ESM bytecode (--compile).
// Once enabled, flip this to include "run".
test.skipIf(mode === "run")("works", async () => {
let result: Bun.SyncSubprocess<"pipe", "inherit"> | Bun.SyncSubprocess<"pipe", "pipe">;
if (mode === "compile") {
result = compileAndRun(dir, dir + "/c.ts");
} else if (mode === "build") {
const build_result = await Bun.build({
entrypoints: [dir + "/c.ts"],
outdir: dir + "/dist",
});
expect(build_result.success).toBe(true);
result = Bun.spawnSync({
cmd: [bunExe(), "run", dir + "/dist/c.js"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "inherit"],
});
} else {
result = Bun.spawnSync({
cmd: [bunExe(), "run", "c.ts"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "inherit"],
});
}
const parsedOutput = JSON.parse(result.stdout.toString().trim());
expect(parsedOutput).toEqual({ my_value: "2", my_only: "3" });
expect(result.exitCode).toBe(0);
});
});
});
}
});
}
describe("import not found", () => {
for (const [ccase, target_value, name] of [
[``, /SyntaxError: Export named 'not_found' not found in module '[^']+?'\./, "none"],
[
`export default function not_found() {};`,
/SyntaxError: Export named 'not_found' not found in module '[^']+?'\. Did you mean to import default\?/,
"default with same name",
],
[
`export type not_found = "not_found";`,
/SyntaxError: Export named 'not_found' not found in module '[^']+?'\./,
"type",
],
] as const)
test(`${name}`, () => {
const dir = tempDirWithFiles("type-export", {
"a.ts": ccase,
"b.ts": /*js*/ `
import { not_found } from "./a";
console.log(not_found);
`,
"nf.ts": "",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "run", "b.ts"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toMatch(target_value);
expect({
exitCode: result.exitCode,
stdout: result.stdout?.toString().trim(),
}).toEqual({
exitCode: 1,
stdout: "",
});
});
});
test("js file type export", () => {
const dir = tempDirWithFiles("type-export", {
"a.js": "export {not_found};",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "a.js"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude('error: "not_found" is not declared in this file');
expect(result.exitCode).toBe(1);
});
test("js file type import", () => {
const dir = tempDirWithFiles("type-import", {
"b.js": "import {type_only} from './ts.ts';",
"ts.ts": "export type type_only = 'type_only';",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "b.js"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude("Export named 'type_only' not found in module '");
expect(result.stderr?.toString().trim()).not.toInclude("Did you mean to import default?");
expect(result.exitCode).toBe(1);
});
test("js file type import with default export", () => {
const dir = tempDirWithFiles("type-import", {
"b.js": "import {type_only} from './ts.ts';",
"ts.ts": "export type type_only = 'type_only'; export default function type_only() {};",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "b.js"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude("Export named 'type_only' not found in module '");
expect(result.stderr?.toString().trim()).toInclude("Did you mean to import default?");
expect(result.exitCode).toBe(1);
});
test("js file with through export", () => {
const dir = tempDirWithFiles("type-import", {
"b.js": "export {type_only} from './ts.ts';",
"ts.ts": "export type type_only = 'type_only'; export default function type_only() {};",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "b.js"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude("SyntaxError: export 'type_only' not found in './ts.ts'");
expect(result.exitCode).toBe(1);
});
test("js file with through export 2", () => {
const dir = tempDirWithFiles("type-import", {
"b.js": "import {type_only} from './ts.ts'; export {type_only};",
"ts.ts": "export type type_only = 'type_only'; export default function type_only() {};",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "b.js"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude("SyntaxError: export 'type_only' not found in './ts.ts'");
expect(result.exitCode).toBe(1);
});
describe("through export merge", () => {
// this isn't allowed, even in typescript (tsc emits "Duplicate identifier 'value'.")
for (const fmt of ["js", "ts"]) {
describe(fmt, () => {
for (const [name, mode] of [
["through", "export {value} from './b'; export {value} from './c';"],
["direct", "export {value} from './b'; export const value = 'abc';"],
["direct2", "export const value = 'abc'; export {value};"],
["ns", "export * as value from './c'; export * as value from './c';"],
]) {
describe(name, () => {
const dir = tempDirWithFiles("type-import", {
["main." + fmt]: "import {value} from './a'; console.log(value);",
["a." + fmt]: mode,
["b." + fmt]: fmt === "ts" ? "export type value = 'b';" : "",
["c." + fmt]: "export const value = 'c';",
});
for (const file of ["main." + fmt, "a." + fmt]) {
test(file, () => {
const result = Bun.spawnSync({
cmd: [bunExe(), file],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude(
file === "a." + fmt
? 'error: Multiple exports with the same name "value"\n' // bun's syntax error
: "SyntaxError: Cannot export a duplicate name 'value'.\n", // jsc's syntax error
);
expect(result.exitCode).toBe(1);
});
}
});
}
});
}
});
describe("check ownkeys from a star import", () => {
const dir = tempDirWithFiles("ownkeys-star-import", {
["main.ts"]: `
import * as ns from './a';
console.log(JSON.stringify({
keys: Object.keys(ns).sort(),
ns,
has_sometype: Object.hasOwn(ns, 'sometype'),
}));
`,
["a.ts"]: "export * from './b'; export {sometype} from './b';",
["b.ts"]: "export const value = 'b'; export const anotherValue = 'another'; export type sometype = 'sometype';",
});
const expected = {
keys: ["anotherValue", "value"],
ns: {
anotherValue: "another",
value: "b",
},
has_sometype: false,
};
describe.each(["run", "compile"] as const)("%s", mode => {
const testFn = mode === "run" ? test.skip : test;
testFn("works", () => {
const result =
mode === "compile"
? compileAndRun(dir, dir + "/main.ts")
: Bun.spawnSync({
cmd: [bunExe(), "main.ts"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toBe("");
expect(JSON.parse(result.stdout?.toString().trim())).toEqual(expected);
expect(result.exitCode).toBe(0);
});
});
});
test("check commonjs", () => {
const dir = tempDirWithFiles("commonjs", {
["main.ts"]: "const {my_value, my_type} = require('./a'); console.log(my_value, my_type);",
["a.ts"]: "module.exports = require('./b');",
["b.ts"]: "export const my_value = 'my_value'; export type my_type = 'my_type';",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "main.ts"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toBe("");
expect(result.stdout?.toString().trim()).toBe("my_value undefined");
expect(result.exitCode).toBe(0);
});
test("check merge", () => {
const dir = tempDirWithFiles("merge", {
["main.ts"]: "import {value} from './a'; console.log(value);",
["a.ts"]: "export * from './b'; export * from './c';",
["b.ts"]: "export const value = 'b';",
["c.ts"]: "export const value = 'c';",
});
const result = Bun.spawnSync({
cmd: [bunExe(), "main.ts"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toInclude(
"SyntaxError: Export named 'value' cannot be resolved due to ambiguous multiple bindings in module",
);
expect(result.exitCode).toBe(1);
});
describe("export * from './module'", () => {
for (const fmt of ["js", "ts"]) {
describe(fmt, () => {
const dir = tempDirWithFiles("export-star", {
["main." + fmt]: "import {value} from './a'; console.log(value);",
["a." + fmt]: "export * from './b';",
["b." + fmt]: "export const value = 'b';",
});
for (const file of ["main." + fmt, "a." + fmt]) {
test(file, () => {
const result = Bun.spawnSync({
cmd: [bunExe(), file],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toBe("");
expect(result.exitCode).toBe(0);
});
}
});
}
});
describe("export * as ns from './module'", () => {
for (const fmt of ["js", "ts"]) {
describe(fmt, () => {
const dir = tempDirWithFiles("export-star-as", {
["main." + fmt]: "import {ns} from './a'; console.log(ns.value);",
["a." + fmt]: "export * as ns from './b';",
["b." + fmt]: "export const value = 'b';",
});
for (const file of ["main." + fmt, "a." + fmt]) {
test(file, () => {
const result = Bun.spawnSync({
cmd: [bunExe(), file],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toBe("");
expect(result.exitCode).toBe(0);
});
}
});
}
});
describe("export type {Type} from './module'", () => {
for (const fmt of ["ts"]) {
describe(fmt, () => {
const dir = tempDirWithFiles("export-type", {
["main." + fmt]: "import {Type} from './a'; const x: Type = 'test'; console.log(x);",
["a." + fmt]: "export type {Type} from './b';",
["b." + fmt]: "export type Type = string;",
});
for (const file of ["main." + fmt, "a." + fmt]) {
test(file, () => {
const result = Bun.spawnSync({
cmd: [bunExe(), file],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toBe("");
expect(result.exitCode).toBe(0);
});
}
});
}
});
describe("import only used in decorator (#8439)", () => {
const dir = tempDirWithFiles("import-only-used-in-decorator", {
["index.ts"]: /*js*/ `
import { TestInterface } from "./interface.ts";
function Decorator(): PropertyDecorator {
return () => {};
}
class TestClass {
@Decorator()
test?: TestInterface;
}
class OtherClass {
other?: TestInterface;
}
export {TestInterface};
`,
["interface.ts"]: "export interface TestInterface {};",
"tsconfig.json": JSON.stringify({
compilerOptions: {
experimentalDecorators: true,
emitDecoratorMetadata: true,
},
}),
});
describe.each(["run", "compile"] as const)("%s", mode => {
const testFn = mode === "run" ? test.skip : test;
testFn("works", () => {
const result =
mode === "compile"
? compileAndRun(dir, dir + "/index.ts")
: Bun.spawnSync({
cmd: [bunExe(), "index.ts"],
cwd: dir,
env: bunEnv,
stdio: ["inherit", "pipe", "pipe"],
});
expect(result.stderr?.toString().trim()).toBe("");
expect(result.exitCode).toBe(0);
});
});
});
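The type-only export failures exercised above all come down to TypeScript type erasure. A minimal sketch (module shape invented for illustration) of why a type-only binding is invisible to a JavaScript importer at runtime:

```typescript
// Hypothetical sketch: after TypeScript compiles a module such as
//   export type type_only = 'type_only';
//   export default function type_only() {}
// the type alias is erased, so the runtime module keeps only the function.
const runtimeExports: Record<string, unknown> = {
  default: function type_only() {},
  // no named "type_only" export — the type alias was erased at compile time
};

console.log("type_only" in runtimeExports); // false — named import would fail
console.log("default" in runtimeExports); // true — only the default survives
```

This is why `import {type_only} from './ts.ts'` from a `.js` file errors with "Export named 'type_only' not found", while the same import in a `.ts` file can be elided as type-only.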


@@ -13,6 +13,8 @@ const { HttpsProxyAgent } = require("https-proxy-agent") as {
// Use docker-compose infrastructure for squid proxy
const gc = harness.gc;
const bunExe = harness.bunExe;
const bunEnv = harness.bunEnv;
const isDockerEnabled = harness.isDockerEnabled;
// HTTP CONNECT proxy server for WebSocket tunneling
@@ -656,3 +658,86 @@ describe("ws module with HttpsProxyAgent", () => {
gc();
});
});
describe.concurrent("WebSocket NO_PROXY bypass", () => {
test("NO_PROXY matching hostname bypasses explicit proxy for ws://", async () => {
// authProxy requires credentials; if NO_PROXY works, the WebSocket bypasses
// the proxy and connects directly. If NO_PROXY doesn't work, the proxy
// rejects with 407 and the WebSocket errors.
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const ws = new WebSocket("ws://127.0.0.1:${wsPort}", { proxy: "http://127.0.0.1:${authProxyPort}" });
ws.onopen = () => { ws.close(); process.exit(0); };
ws.onerror = () => { process.exit(1); };`,
],
env: { ...bunEnv, NO_PROXY: "127.0.0.1" },
stdout: "pipe",
stderr: "pipe",
});
const [stderr, exitCode] = await Promise.all([proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
expect(exitCode).toBe(0);
});
test("NO_PROXY matching host:port bypasses proxy for ws://", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const ws = new WebSocket("ws://127.0.0.1:${wsPort}", { proxy: "http://127.0.0.1:${authProxyPort}" });
ws.onopen = () => { ws.close(); process.exit(0); };
ws.onerror = () => { process.exit(1); };`,
],
env: { ...bunEnv, NO_PROXY: `127.0.0.1:${wsPort}` },
stdout: "pipe",
stderr: "pipe",
});
const [stderr, exitCode] = await Promise.all([proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
expect(exitCode).toBe(0);
});
test("NO_PROXY not matching still uses proxy (auth fails)", async () => {
// NO_PROXY doesn't match the target, so the WebSocket should go through
// the auth proxy without credentials, which rejects with 407.
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const ws = new WebSocket("ws://127.0.0.1:${wsPort}", { proxy: "http://127.0.0.1:${authProxyPort}" });
ws.onopen = () => { process.exit(1); };
ws.onerror = () => { process.exit(0); };`,
],
env: { ...bunEnv, NO_PROXY: "other.host.com" },
stdout: "pipe",
stderr: "pipe",
});
const exitCode = await proc.exited;
// exit(0) means onerror fired, proving the proxy was used (and auth failed)
expect(exitCode).toBe(0);
});
test("NO_PROXY=* bypasses all proxies", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const ws = new WebSocket("ws://127.0.0.1:${wsPort}", { proxy: "http://127.0.0.1:${authProxyPort}" });
ws.onopen = () => { ws.close(); process.exit(0); };
ws.onerror = () => { process.exit(1); };`,
],
env: { ...bunEnv, NO_PROXY: "*" },
stdout: "pipe",
stderr: "pipe",
});
const [stderr, exitCode] = await Promise.all([proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
expect(exitCode).toBe(0);
});
});
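The four cases above pin down the NO_PROXY matching semantics: a bare hostname entry, a host:port entry, a non-matching entry, and the `*` wildcard. A minimal sketch of a matcher with those behaviors (function name and structure are illustrative, not Bun's implementation):

```typescript
// Hypothetical sketch of NO_PROXY matching, covering the behaviors the
// tests above exercise: entries are comma-separated and may be a bare
// hostname, a host:port pair, or "*" to bypass every proxy.
function bypassesProxy(noProxy: string, host: string, port: number): boolean {
  return noProxy
    .split(",")
    .map(entry => entry.trim())
    .filter(Boolean)
    .some(entry => {
      if (entry === "*") return true; // wildcard bypasses all proxies
      if (entry === host) return true; // bare hostname matches any port
      if (entry === `${host}:${port}`) return true; // exact host:port match
      return false;
    });
}

console.log(bypassesProxy("127.0.0.1", "127.0.0.1", 8080)); // true
console.log(bypassesProxy("127.0.0.1:9999", "127.0.0.1", 8080)); // false
console.log(bypassesProxy("other.host.com", "127.0.0.1", 8080)); // false
console.log(bypassesProxy("*", "example.com", 443)); // true
```

A non-match means the request goes through the proxy as usual, which is exactly what the 407-rejection test relies on.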


@@ -0,0 +1,42 @@
import { expect, test } from "bun:test";
import { existsSync, statSync } from "node:fs";
import { exists, stat } from "node:fs/promises";
// https://github.com/oven-sh/bun/issues/26631
// Path resolution fails for current directory '.' on Windows
test("existsSync('.') should return true", () => {
expect(existsSync(".")).toBe(true);
});
test("exists('.') should return true", async () => {
expect(await exists(".")).toBe(true);
});
test("statSync('.') should return directory stats", () => {
const stats = statSync(".");
expect(stats.isDirectory()).toBe(true);
});
test("stat('.') should return directory stats", async () => {
const stats = await stat(".");
expect(stats.isDirectory()).toBe(true);
});
test("existsSync('..') should return true", () => {
expect(existsSync("..")).toBe(true);
});
test("exists('..') should return true", async () => {
expect(await exists("..")).toBe(true);
});
test("statSync('..') should return directory stats", () => {
const stats = statSync("..");
expect(stats.isDirectory()).toBe(true);
});
test("stat('..') should return directory stats", async () => {
const stats = await stat("..");
expect(stats.isDirectory()).toBe(true);
});
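The regression these tests guard against was a normalizer reducing `.` to an empty string. Node's own `path.normalize` documents the expected behavior: when normalization yields an empty path, `.` must be returned. A short illustration (using the posix implementation so the output is platform-independent):

```typescript
// Sketch of the normalization edge case: "." and paths that cancel out
// to nothing must normalize to ".", never to an empty string that would
// then fail to resolve in fs calls like existsSync(".").
import { posix } from "node:path";

console.log(posix.normalize(".")); // "."
console.log(posix.normalize("..")); // ".." — cannot ascend above a relative root
console.log(posix.normalize("./foo/..")); // "." — segments cancel, but not to ""
```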


@@ -0,0 +1,77 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
// https://github.com/oven-sh/bun/issues/26747
// Server config objects with a `stop` method should still auto-start.
// The previous fix for #26142 incorrectly used the presence of a `stop` method
// to detect Server instances, but user config objects (like Elysia apps) can
// legitimately have a `stop` method.
test("server config with stop method as default export should auto-start", async () => {
using dir = tempDir("issue-26747", {
"server.js": `
// Export a config object with a stop method
// This should still trigger auto-start
export default {
port: 0,
fetch(req) {
return new Response("Hello from server with stop method");
},
stop() {
// Custom stop method - should not prevent auto-start
},
};
// Force the process to exit after 100ms so the test can verify the startup message
// without the server blocking forever
setTimeout(() => process.exit(0), 100);
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "server.js"],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
// Should have started the server (look for the debug message on stdout)
expect(stdout).toContain("Started");
expect(exitCode).toBe(0);
});
test("server config with both stop and reload methods should not auto-start", async () => {
// A config object with a `reload` method is likely a Server instance
// or something that manages itself, so we should not auto-start it
using dir = tempDir("issue-26747-reload", {
"server.js": `
export default {
port: 0,
fetch(req) {
return new Response("Hello");
},
stop() {},
reload() {},
};
console.log("Script completed without auto-starting");
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "server.js"],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
if (exitCode !== 0) console.error("stderr:", stderr);
// Should NOT have started the server because it has a reload method
expect(stdout).not.toContain("Started");
expect(stdout).toContain("Script completed without auto-starting");
expect(exitCode).toBe(0);
});
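Per the commit message, the fix switched the Server-instance check from `typeof def.stop !== 'function'` to `typeof def.reload !== 'function'`, since `reload` is specific to `Bun.serve()` Server instances while user config objects (such as Elysia apps) commonly define `stop`. A minimal sketch of that heuristic (function name and shape are illustrative, not Bun's internal code):

```typescript
// Hypothetical sketch of the auto-start detection after the fix: a
// default export auto-starts if it looks like a server config, unless
// it appears to already be a live Server instance — identified by a
// `reload` method rather than a `stop` method.
function shouldAutoStart(def: unknown): boolean {
  if (def === null || typeof def !== "object") return false;
  const maybe = def as { fetch?: unknown; reload?: unknown };
  // Must look like a server config (has a fetch handler)...
  if (typeof maybe.fetch !== "function") return false;
  // ...and must not already be a running Server instance.
  return typeof maybe.reload !== "function";
}

console.log(shouldAutoStart({ fetch() {}, stop() {} })); // true — Elysia-style config
console.log(shouldAutoStart({ fetch() {}, reload() {} })); // false — looks like a Server
```

This matches the two tests above: a config with only `stop` auto-starts, while one that also has `reload` does not.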