Compare commits


16 Commits

Author SHA1 Message Date
Dylan Conway
6c7e97231b fix(bundler): barrel optimization drops exports used by dynamic import (#27695)
## What does this PR do?

Fixes invalid JS output when a `sideEffects: false` barrel is used by
both a **static named import** and a **dynamic `import()`** in the same
build with `--splitting --format=esm`.

## Root Cause

`src/bundler/barrel_imports.zig` had a heuristic that skipped escalating
`requested_exports` to `.all` for `import()` when the target already had
a partial entry from a static import:

```zig
} else if (ir.kind == .dynamic) {
    // Only escalate to .all if no prior requests exist for this target.
    if (!this.requested_exports.contains(target)) {
        try this.requested_exports.put(this.allocator(), target, .all);
    }
}
```

This is unsafe — `await import()` returns the **full module namespace**
at runtime. The consumer can destructure or access any export and we
can't statically determine which ones.

## Failure chain

1. Static `import { a } from "barrel"` seeds `requested_exports[barrel]
= .partial{"a"}`
2. Dynamic `import("barrel")` is ignored because
`requested_exports.contains(barrel)` is true
3. Barrel optimization marks `export { b } from "./b.js"` as `is_unused`
→ `source_index` cleared
4. Code splitting makes the barrel a chunk entry point (dynamic import →
own chunk)
5. Linker can't resolve the barrel's re-export symbol for `b` → uses the
unbound re-export ref directly
6. Output: `export { b2 as b }` where `b2` has no declaration →
**SyntaxError at runtime**

With `--bytecode --compile` this manifests as an `OutputFileListBuilder`
assertion (`total_insertions != output_files.items.len`) because JSC
rejects the chunk during bytecode generation, leaving an unfilled slot.

## Real-world trigger

`@smithy/credential-provider-imds` is a `sideEffects: false` barrel used
by:
- `@aws-sdk/credential-providers` — static: `import {
fromInstanceMetadata } from "@smithy/credential-provider-imds"`
- `@smithy/util-defaults-mode-node` — dynamic: `const {
getInstanceMetadataEndpoint, httpRequest } = await
import("@smithy/credential-provider-imds")`

## Fix

Dynamic import **always** marks the target as `.all` in both Phase 1
seeding and Phase 2 BFS. This is the same treatment as `require()`.
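
A minimal TypeScript model of that rule (the names mirror the Zig fields, but this is an illustrative sketch, not the bundler's actual code):

```typescript
// Toy model of requested_exports bookkeeping; the real logic lives in
// src/bundler/barrel_imports.zig.
type ExportRequest = { kind: "all" } | { kind: "partial"; names: Set<string> };

function recordStaticImport(reqs: Map<string, ExportRequest>, target: string, name: string): void {
  const prev = reqs.get(target);
  if (prev === undefined) {
    reqs.set(target, { kind: "partial", names: new Set([name]) });
  } else if (prev.kind === "partial") {
    prev.names.add(name);
  }
}

function recordDynamicImport(reqs: Map<string, ExportRequest>, target: string): void {
  // The fix: import() unconditionally escalates to .all, even when a
  // partial entry from a static import already exists.
  reqs.set(target, { kind: "all" });
}

const reqs = new Map<string, ExportRequest>();
recordStaticImport(reqs, "barrel", "a"); // partial: {"a"}
recordDynamicImport(reqs, "barrel");     // escalates to .all
console.log(reqs.get("barrel")!.kind);   // "all"
```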

## Tests

- **New:** `barrel/DynamicImportWithStaticImportSameTarget` — repros the
bug with `--splitting`, verifies both exports work at runtime. Fails on
system bun with `SyntaxError: Exported binding 'b' needs to refer to a
top-level declared variable.`
- **Updated:** `barrel/DynamicImportInSubmodule` — was testing that a
fire-and-forget `import()` doesn't force unused submodules to load. That
optimization can't be safely applied, so the test now verifies the
conservative (correct) behavior: all barrel exports are preserved.
2026-03-02 14:33:22 -08:00
Jarred Sumner
32edef77e9 markdown: add {index, depth, ordered, start} to listItem callback meta (#27688) 2026-03-02 04:48:43 -08:00
anthonybaldwin
aa51d6032c fix(css): preserve geometry-box values in mask shorthand parsing (#27680)
### What does this PR do?

Wrap parse calls in `Mask.parse()` with `input.tryParse()` to restore
parser state on failure, matching the pattern used in `background.zig`
([#L55-L105](97c113d010/src/css/properties/background.zig (L55-L105))).
Without this, failed parsers consume tokens before returning errors,
causing `geometry-box` values (`padding-box`, `content-box`) to be
silently dropped and rules with different geometry boxes to be
incorrectly merged.

### How did you verify your code works?

Ran `test/bundler/css/mask-geometry-box.test.ts` against the fixed build
and against `v1.3.10` + `v1.3.11-canary.60`:

- Fixed build: 2 pass, 0 fail
  - All existing CSS bundler tests pass (166 tests across 8 files)
- `v1.3.10` / `v1.3.11-canary.60`: 0 pass, 2 fail
- confirms `geometry-box` values (`padding-box`, `content-box`) are
stripped and rules are incorrectly merged

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 04:09:16 -08:00
robobun
1896d270e7 fix(windows): keep event loop alive during Bun.file() async reads (#26633)
## Summary

- Fix `Bun.file().text()` silently failing on Windows when reading
non-existent files
- Add event loop reference counting to `ReadFileUV` to prevent premature
process exit

## Root Cause

`ReadFileUV` (Windows-only code path) was not calling
`refConcurrently()`/`unrefConcurrently()` on the event loop, causing the
process to exit before the async libuv file read operations completed.
This resulted in promises never resolving/rejecting and the process
exiting with code 0 silently.

The fix follows the same pattern used in `CopyFileWindows` which
properly maintains event loop references.
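
The ref-counting contract can be modeled in a few lines of TypeScript (a toy model for illustration, not Bun's actual event loop):

```typescript
// Toy model: the process may only exit when no async operation holds a
// reference on the event loop.
class EventLoopRefs {
  private count = 0;
  refConcurrently(): void { this.count++; }
  unrefConcurrently(): void { this.count--; }
  get mayExit(): boolean { return this.count === 0; }
}

const loop = new EventLoopRefs();
// Buggy ReadFileUV: no ref taken, so mayExit stays true and the process
// can exit before the libuv read completes.
// Fixed: hold a ref for the lifetime of the read.
loop.refConcurrently();                 // read started
const aliveDuringRead = !loop.mayExit;  // true: the loop must stay alive
loop.unrefConcurrently();               // read finished (or errored)
console.log(aliveDuringRead, loop.mayExit); // true true
```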

## Test plan

- [x] Added regression test in `test/regression/issue/26632.test.ts`
- [x] Tests pass with debug build on Linux
- [x] Existing blob/file tests pass

> Note: This bug is Windows-specific. The regression test will properly
validate the fix on Windows CI.

Fixes #26632

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 04:06:30 -08:00
robobun
9e1044fbe7 harden(valkey): add max nesting depth to RESP parser (#27502)
## Summary

- Adds a maximum nesting depth limit (128) to the RESP protocol parser
for aggregate types (Array, Map, Set, Attribute, Push)
- The recursive `readValue()` function now tracks depth and returns
`NestingDepthExceeded` when the limit is reached, preventing excessive
stack usage from deeply nested server responses
- Follows the same pattern used by the CSS parser's
`maximum_nesting_depth` guard
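
A depth-capped recursive descent of this shape can be sketched in TypeScript (illustrative only; the real parser reads the RESP wire format in Zig):

```typescript
// Toy RESP-like reader: "*" introduces an aggregate whose next token is
// the element count; anything else is an integer leaf.
const MAX_NESTING_DEPTH = 128;

type Value = number | Value[];

function readValue(tokens: string[], depth = 0): Value {
  if (depth > MAX_NESTING_DEPTH) throw new Error("NestingDepthExceeded");
  const tok = tokens.shift();
  if (tok === undefined) throw new Error("unexpected end of input");
  if (tok === "*") {
    const count = Number(tokens.shift());
    const out: Value[] = [];
    for (let i = 0; i < count; i++) out.push(readValue(tokens, depth + 1));
    return out;
  }
  return Number(tok);
}

// Shallow nesting parses fine:
console.log(readValue(["*", "2", "1", "*", "1", "2"])); // [1, [2]]

// Depth 256 is rejected instead of risking stack exhaustion:
const deep: string[] = [];
for (let i = 0; i < 256; i++) deep.push("*", "1");
deep.push("7");
try {
  readValue(deep);
} catch (e) {
  console.log((e as Error).message); // "NestingDepthExceeded"
}
```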

## Test plan

- [x] New test: `test/js/valkey/reliability/resp-nesting-depth.test.ts`
- Verifies deeply nested responses (depth 256) are rejected with an
error
- Verifies shallow nested responses (depth 3) are still parsed correctly
- Subprocess test verifies the process exits cleanly on depth 1000 (no
SIGSEGV)
- [x] Verified test fails on system Bun (without fix) and passes with
debug build
- [x] Debug build compiles successfully

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 04:04:50 -08:00
robobun
36793dfefe test(vite): update vite build test to use rolldown-vite 7.x (#27685)
## Summary
- Update the vite-build integration test from Vite 5 to **rolldown-vite
7.3.x** (Vite 7 powered by the Rolldown bundler written in Rust)
- Upgrade SvelteKit ecosystem deps (`@sveltejs/kit` 2.53.4,
`@sveltejs/vite-plugin-svelte` 6.2.4, Svelte 5) to versions compatible
with Vite 7
- Fix deprecated `csrf: false` config → `csrf: { trustedOrigins: [] }`
in `svelte.config.js`
- Increase test timeout from 60s to 120s to accommodate rolldown build

## Test plan
- [x] `USE_SYSTEM_BUN=1 bun test
test/integration/vite-build/vite-build.test.ts` passes (1 pass, 11.83s)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-02 04:03:15 -08:00
robobun
a67d8e2a64 harden chunked encoding to handle CRLF split across TCP segments (#27669)
## Summary

- Harden the chunked encoding parser to correctly handle chunk-size CRLF
terminators split across TCP segments
- When a segment boundary falls between `\r` and `\n` of a chunk-size
line (e.g. `"5\r"` in one packet, `"\nHello..."` in the next), the
parser now correctly buffers the partial CRLF state and resumes on the
next data arrival
- Add `STATE_WAITING_FOR_LF` flag to `consumeHexNumber()` to track when
`\r` has been consumed but `\n` is still pending
- Add defense-in-depth check in `getNextChunk()` to break out of the
parsing loop if no progress is made

Reported by [sim1222 (Kokecchi)](https://github.com/sim1222).
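
The split-CRLF hazard can be shown with a tiny chunk-size-line state machine in TypeScript (an assumed shape for illustration; the real parser is uWS's C++ code using bit flags):

```typescript
// Toy chunk-size parser that survives a TCP segment boundary between
// "\r" and "\n" by remembering a waiting-for-LF flag (the analogue of
// STATE_WAITING_FOR_LF in the real parser).
interface SizeLineParser {
  size: number;
  waitingForLF: boolean;
  done: boolean;
}

function feed(p: SizeLineParser, segment: string): void {
  for (const ch of segment) {
    if (p.done) return;
    if (p.waitingForLF) {
      if (ch !== "\n") throw new Error("invalid chunk terminator");
      p.waitingForLF = false;
      p.done = true;
      continue;
    }
    if (ch === "\r") {
      p.waitingForLF = true; // lone CR: the LF may arrive in the next segment
      continue;
    }
    p.size = p.size * 16 + parseInt(ch, 16); // hex chunk size
  }
}

const p: SizeLineParser = { size: 0, waitingForLF: false, done: false };
feed(p, "5\r"); // segment 1 ends on the lone CR
feed(p, "\n");  // segment 2 delivers the LF
console.log(p.size, p.done); // 5 true
```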

## Test plan

- [x] New test: "handles lone CR at end of chunk-size line across TCP
segments" — sends chunk-size `\r` and `\n` in separate TCP segments,
verifies server processes the request correctly
- [x] New test: "handles lone CR in chunk-size with extensions" — same
split but with chunk extensions before the CRLF
- [x] Existing test "handles fragmented chunk terminators" still passes
- [x] Existing test "rejects invalid terminator in fragmented reads"
still passes
- [x] Verified new tests timeout/fail with `USE_SYSTEM_BUN=1`
(unpatched) and pass with `bun bd test` (patched)


🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 02:19:51 -08:00
Jarred Sumner
c176f0b12a Fixes #27479 2026-03-02 02:09:13 -08:00
Christopher Stöckl
d5badc2f78 fix(blob): handle UTF-8 paths with non-ASCII characters (#26646)
## What does this PR do?

Fixes encoding corruption when using Bun.file().stat() or
Bun.file().delete() with file paths containing UTF-8 characters (e.g.,
German umlauts, Japanese characters, emoji).

### The Bug
When calling Bun.file().stat() or Bun.file().delete() with paths
containing non-ASCII UTF-8 characters, the path was being corrupted due
to double-encoding:

- UTF-8 bytes were being treated as Latin1 by ZigString.init()
- When converting to ZigString.Slice, the Latin1-to-UTF-8 conversion
would encode the bytes again
- Result: paths like "über.txt" became "Ã¼ber.txt" (mojibake)
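
The corruption path is easy to reproduce with Node's `Buffer` (a demonstration of the encoding mix-up, not Bun's internal code):

```typescript
// UTF-8 bytes misread as Latin1, then re-encoded: classic double-encoding.
const original = "über.txt";
const utf8Bytes = Buffer.from(original, "utf8"); // "ü" is 0xC3 0xBC
const misread = utf8Bytes.toString("latin1");    // each byte becomes its own character
console.log(misread); // "Ã¼ber.txt", the mojibake the bug produced
// Re-encoding the misread string as UTF-8 bakes the corruption into the path:
const corrupted = Buffer.from(misread, "utf8");
console.log(corrupted.length > utf8Bytes.length); // true: the path grew
```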

### The Fix
Changed ZigString.init() to ZigString.fromUTF8() in two locations:
- src/bun.js/webcore/Blob.zig (getStat function)
- src/bun.js/webcore/blob/Store.zig (unlink function)

The fromUTF8() function marks the string as UTF-8 if it contains
non-ASCII characters, preventing the double-encoding issue.

## How did you verify your code works?

- Added comprehensive test coverage in
test/regression/issue/utf8-path-encoding.test.ts
- Tests verify Bun.file().stat() and Bun.file().delete() work correctly
with:
  - German umlauts (ä, ö, ü)
  - Japanese characters
  - Emoji
  - Mixed special characters
- Tests compare Bun results against Node.js fs module to ensure
consistency


Fixes #26647

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-03-02 01:50:05 -08:00
SUZUKI Sosuke
1278e46231 fix: use WriteBarrierEarlyInit for WriteBarriers initialized before finishCreation (#27459)
## Summary

Fix a latent GC safety issue in `NodeVMModule` and
`NodeVMSyntheticModule` where `WriteBarrier` fields were initialized
with the regular constructor before `finishCreation()` was called on the
cell.

## Problem

The regular `WriteBarrier(vm, owner, value)` constructor calls
`vm.writeBarrier(owner, value)`, which assumes the owner cell is already
fully constructed and markable by the GC. However, in C++ member
initializer lists, `finishCreation()` has not been called yet, so the
cell is not in a valid state for the GC to process.

Two fields were affected:
- `NodeVMModule::m_moduleWrapper` (`NodeVMModule.cpp:157`)
- `NodeVMSyntheticModule::m_syntheticEvaluationSteps`
(`NodeVMSyntheticModule.h:57`)

If a GC cycle happens to run during construction (between the member
initializer list and `finishCreation()`), this could lead to a crash.
The bug is timing-dependent and difficult to reproduce, but the code is
clearly incorrect.

## Fix

Use `WriteBarrierEarlyInit` instead, which stores the pointer without
issuing a write barrier. This is the standard JSC pattern — used
extensively in upstream WebKit (`JSObject`, `Structure`, `CodeBlock`,
`JSBoundFunction`, `JSPromiseReaction`, etc.) and already used by the
adjacent `m_context` field in the same constructor.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-03-02 01:41:57 -08:00
Dylan Conway
0b61853ac6 build: compile windows shim for native arch on aarch64 (#27448)
## What does this PR do?

The `bun_shim_impl.exe` used for `node_modules/.bin/*` on Windows was
hardcoded to build for x86_64 (nehalem), even when building bun for
Windows ARM64. This meant every package binary invocation on ARM64 ran
under x64 emulation.

This PR makes `WindowsShim.create()` use the bun build's target arch —
building a native aarch64 shim when building `bun.exe` for aarch64.

## Why is this safe?

The shim source (`src/install/windows-shim/bun_shim_impl.zig`) contains
no arch-specific code:
- Uses only NTDLL (`NtCreateFile`, `NtReadFile`, `NtClose`,
`RtlExitUserProcess`) and kernel32 (`CreateProcessW`,
`WaitForSingleObject`, etc.)
- Uses `std.os.windows.teb()` which already has a native aarch64
implementation (`mov %[ptr], x18`)
- No inline assembly, no x86-specific intrinsics

Verified the ARM64 shim compiles cleanly to a 12.5KB PE32+ AArch64
binary.

## How did you verify your code works?

- Built the ReleaseFast shim for `aarch64-windows` locally — compiles
without errors
- CI will verify the embedded build path on Windows ARM64

Note: the `zig build windows-shim` standalone helper currently fails on
Debug builds due to a pre-existing `fmtUtf16Le` format string issue —
this is unrelated to this change (same error on x64) and does not affect
the ReleaseFast shim that gets embedded into bun.exe.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-03-02 01:41:19 -08:00
robobun
5d79daa2e3 fix: re-read version/name after lifecycle scripts in pack/publish (#27241)
## Summary

- Fixes `bun pack` and `bun publish` using a stale `version` (and
`name`) when lifecycle scripts (`prepublishOnly`, `prepack`, `prepare`)
modify `package.json` during execution.
- After lifecycle scripts run and `package.json` is re-read from disk,
now also refreshes the `package_name` and `package_version` variables
from the updated JSON.
- Previously, the version was captured once before scripts ran and never
updated, causing the tarball filename and publish registry metadata
(`dist-tags`, `versions` keys, `_id`, `_attachments`) to use the
original version instead of the script-modified one.
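
The fixed flow can be sketched in TypeScript (a simplified model; the real code is Bun's pack/publish implementation):

```typescript
import { mkdtempSync, readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Simulate: capture the version, run a lifecycle script that bumps it,
// then re-read package.json before naming the tarball.
const dir = mkdtempSync(join(tmpdir(), "pack-demo-"));
const pkgPath = join(dir, "package.json");
writeFileSync(pkgPath, JSON.stringify({ name: "demo", version: "1.0.0" }));

const stale = JSON.parse(readFileSync(pkgPath, "utf8")).version; // captured up front

// Stand-in for a `prepack` script that rewrites package.json on disk:
writeFileSync(pkgPath, JSON.stringify({ name: "demo", version: "1.0.1" }));

// The fix: refresh name/version from the updated JSON afterwards.
const fresh = JSON.parse(readFileSync(pkgPath, "utf8")).version;
console.log(`${stale} -> ${fresh}`); // "1.0.0 -> 1.0.1"
const tarball = `demo-${fresh}.tgz`; // tarball name now uses the updated version
console.log(tarball); // "demo-1.0.1.tgz"
```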

## Test plan

- [x] Added pack test: verifies tarball filename uses updated version
when `prepack` script modifies `package.json`
- [x] Added publish test: verifies package is published under the
updated version when `prepublishOnly` modifies it
- [x] Verified new test fails with `USE_SYSTEM_BUN=1` (unfixed bun) and
passes with `bun bd test` (fixed build)
- [x] Existing pack tests (basic, shasum, workspace) still pass

Closes #17195

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 01:40:08 -08:00
robobun
6f256c4b82 fix(shell): preserve empty string arguments in Bun shell (#27231)
## Summary

Fixes #17294

- Empty string arguments (`""`, `''`, `${''}`) were silently dropped
instead of being passed as arguments
- This affected commands like `ssh-keygen -N ""` where the empty
passphrase argument was lost

### Root causes fixed

1. **`appendBunStr`**: empty interpolated values (`${''}`) now emit
literal `""` in the script text so the lexer sees an empty quoted string
2. **Single quote lexing**: added `break_word` calls matching double
quote behavior, so `''` produces proper `SingleQuotedText` tokens
(previously quote context was lost entirely)
3. **`isImmediatelyEscapedQuote`**: now handles `''` in addition to `""`
4. **New `quoted_empty` AST atom**: preserves empty quoted strings
through parsing into expansion
5. **`pushCurrentOut`**: no longer drops empty results when the word
contained quoted empty content
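
Why this matters is easy to demonstrate: an empty string is a real argv entry, not nothing. The check below uses `node` itself so it is portable (illustrative; it does not exercise Bun's shell):

```typescript
import { spawnSync } from "node:child_process";

// Pass two script arguments, the second being the empty string "".
// With -e, process.argv is [execPath, ...extraArgs].
const res = spawnSync(
  process.execPath,
  ["-e", "console.log(JSON.stringify(process.argv.slice(1)))", "--", "x", ""],
  { encoding: "utf8" },
);
console.log(res.stdout.trim()); // ["x",""]: the empty string survived as its own argv entry
```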

## Test plan

- [x] New regression test `test/regression/issue/17294.test.ts` with 6
test cases covering interpolation, double quotes, single quotes,
multiple empty strings, and mixed args
- [x] Test passes with `bun bd test` and fails with `USE_SYSTEM_BUN=1`
- [x] Updated `test/js/bun/shell/lex.test.ts` expectation for
single-quoted text (now correctly tagged as `SingleQuotedText` instead
of `Text`)
- [x] All existing shell tests pass (lex, parse, brace, exec, bunshell)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 01:27:20 -08:00
robobun
5ff453675e fix(watcher): reset read_ptr after consuming remaining inotify events (#27668)
## Summary

- Fix 100% CPU spin in the Linux inotify file watcher caused by
`read_ptr` never being reset to `null` after consuming remaining events
from an overflowed buffer

## Root Cause

In `INotifyWatcher.read()`, when a single `read()` syscall returns more
than 128 inotify events, `read_ptr` is set to save the buffer position
so the remaining events can be returned on the next call. However, after
those remaining events were fully consumed, `read_ptr` was **never reset
to `null`**. This caused every subsequent call to `read()` to:

1. Enter the `if (this.read_ptr)` branch, skipping the actual `read()`
syscall
2. Re-parse the same stale byte buffer from the saved offset
3. Return the same events repeatedly in an infinite hot loop at 100% CPU

The fix is a single line: `this.read_ptr = null;` after the remaining
events are consumed.
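
A stripped-down model of the bug and the one-line fix (TypeScript for illustration; the real code is Zig in `INotifyWatcher.read()`):

```typescript
// Reader that saves a resume offset when a batch overflows, and must
// clear it after draining the leftovers. Without the reset, every later
// read() replays stale events forever (the 100% CPU spin).
class OverflowingReader {
  private buffer: number[] = [];
  private readPtr: number | null = null;
  private readonly limit = 2; // stand-in for the 128-event batch limit

  read(syscall: () => number[]): number[] {
    if (this.readPtr !== null) {
      const rest = this.buffer.slice(this.readPtr);
      this.readPtr = null; // the fix: reset after consuming the remainder
      return rest;
    }
    this.buffer = syscall();
    if (this.buffer.length > this.limit) {
      this.readPtr = this.limit; // save position for the next call
      return this.buffer.slice(0, this.limit);
    }
    return this.buffer;
  }
}

const r = new OverflowingReader();
console.log(r.read(() => [1, 2, 3, 4])); // [1, 2] (overflow, position saved)
console.log(r.read(() => []));           // [3, 4] (leftovers drained)
console.log(r.read(() => [5]));          // [5] (without the reset, this would replay [3, 4])
```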

## Test plan

- [x] Added regression test `test/regression/issue/27667.test.ts` that
exercises the watcher under high event load
- [x] Test passes with `bun bd test test/regression/issue/27667.test.ts`

Note: The exact overflow condition (>128 events in a single `read()`)
depends on kernel timing and is hard to trigger deterministically in
tests, but the bug is clear from code inspection — `read_ptr` is set on
line 206 but never cleared anywhere.

Closes #27667

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-02 01:22:53 -08:00
robobun
4ee2e52c28 fix(windows): fix fd leak and use-after-free in ReadFileUV error paths (#27666)
## Summary

Fixes two error handling bugs in `ReadFileUV` (identified in [PR #26633
review](https://github.com/oven-sh/bun/pull/26633)):

- **`onRead()` fd leak**: The error path at line 772 called
`this.finalize()` directly instead of `this.onFinish()`, bypassing
`doClose()` and leaking the open file descriptor. Every other error path
correctly goes through `onFinish()` → `doClose()` → `finalize()`.

- **`queueRead()` use-after-free**: The OOM catch block for non-regular
file buffer expansion called `this.onFinish()` but was missing a
`return`, causing execution to fall through to `this.remainingBuffer()`
on freed memory.

## Test plan

- [x] `bun run zig:check-windows` passes (all 25/25 build steps
succeeded)
- These are Windows-only code paths (`ReadFileUV` uses libuv), so they
only affect Windows builds

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-02 01:09:47 -08:00
robobun
616501d1e1 fix(buffer): prevent heap read-after-free in indexOf via valueOf-triggered detachment (#26927)
## Summary

- Fix a heap read-after-free vulnerability in `Buffer.indexOf`,
`Buffer.lastIndexOf`, and `Buffer.includes` where the raw `typedVector`
pointer was cached before calling `toNumber()` on the `byteOffset`
argument. A user-supplied `valueOf()` callback could call
`ArrayBuffer.prototype.transfer()` to detach the underlying
`ArrayBuffer`, freeing the original memory, causing these methods to
scan freed heap data.
- The fix defers fetching the `typedVector` pointer until after
`toNumber()` completes, adds a detachment check, and throws a
`TypeError` if the buffer was detached.

## Test plan

- [x] New test file `test/js/node/buffer-indexOf-detach.test.ts` with 7
test cases:
  - `indexOf` throws `TypeError` when buffer detached via `valueOf`
  - `lastIndexOf` throws `TypeError` when buffer detached via `valueOf`
  - `includes` throws `TypeError` when buffer detached via `valueOf`
  - `indexOf` with string value throws `TypeError` when buffer detached
  - `indexOf` with Buffer value throws `TypeError` when buffer detached
  - Normal `indexOf`/`lastIndexOf`/`includes` functionality still works
  - `indexOf` with non-detaching `valueOf` still works correctly
- [x] All 7 tests pass with `bun bd test`
- [x] The 5 detachment tests fail with `USE_SYSTEM_BUN=1` (confirming
they test the fix)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-03-01 23:44:35 -08:00
175 changed files with 2614 additions and 1000 deletions


@@ -98,7 +98,7 @@ const BunBuildOptions = struct {
 pub fn windowsShim(this: *BunBuildOptions, b: *Build) WindowsShim {
     return this.windows_shim orelse {
-        this.windows_shim = WindowsShim.create(b);
+        this.windows_shim = WindowsShim.create(b, this.arch);
         return this.windows_shim.?;
     };
 }
@@ -759,7 +759,6 @@ fn configureObj(b: *Build, opts: *BunBuildOptions, obj: *Compile) void {
obj.no_link_obj = opts.os != .windows and !opts.no_llvm;
if (opts.enable_asan and !enableFastBuild(b)) {
if (@hasField(Build.Module, "sanitize_address")) {
if (opts.enable_fuzzilli) {
@@ -986,10 +985,16 @@ const WindowsShim = struct {
 exe: *Compile,
 dbg: *Compile,
-fn create(b: *Build) WindowsShim {
+fn create(b: *Build, arch: Arch) WindowsShim {
     const target = b.resolveTargetQuery(.{
-        .cpu_model = .{ .explicit = &std.Target.x86.cpu.nehalem },
-        .cpu_arch = .x86_64,
+        .cpu_model = switch (arch) {
+            .aarch64 => .baseline,
+            else => .{ .explicit = &std.Target.x86.cpu.nehalem },
+        },
+        .cpu_arch = switch (arch) {
+            .aarch64 => .aarch64,
+            else => .x86_64,
+        },
         .os_tag = .windows,
         .os_version_min = getOSVersionMin(.windows),
     });


@@ -400,6 +400,7 @@
 "/guides/http/file-uploads",
 "/guides/http/fetch-unix",
 "/guides/http/stream-iterator",
+"/guides/http/sse",
 "/guides/http/stream-node-streams-in-bun"
 ]
 },

docs/guides/http/sse.mdx (new file, 91 lines)

@@ -0,0 +1,91 @@
---
title: Server-Sent Events (SSE) with Bun
sidebarTitle: Server-Sent Events
mode: center
---
[Server-Sent Events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events) let you push a stream of text events to the browser over a single HTTP response. The client consumes them via [`EventSource`](https://developer.mozilla.org/en-US/docs/Web/API/EventSource).
In Bun, you can implement an SSE endpoint by returning a `Response` whose body is a streaming source and setting the `Content-Type` header to `text/event-stream`.
<Note>
`Bun.serve` closes idle connections after **10 seconds** by default. A quiet SSE stream counts as idle, so the
examples below call `server.timeout(req, 0)` to disable the timeout for the stream. See
[`idleTimeout`](/runtime/http/server#idletimeout) for details.
</Note>
## Using an async generator
In Bun, `new Response` accepts an async generator function directly. This is usually the simplest way to write an SSE endpoint — each `yield` flushes a chunk to the client, and if the client disconnects, the generator's `finally` block runs so you can clean up.
```ts server.ts icon="/icons/typescript.svg"
Bun.serve({
port: 3000,
routes: {
"/events": (req, server) => {
// SSE streams are often quiet between events. By default,
// Bun.serve closes connections after 10 seconds of inactivity.
// Disable the idle timeout for this request so the stream
// stays open indefinitely.
server.timeout(req, 0);
return new Response(
async function* () {
yield `data: connected at ${Date.now()}\n\n`;
// Emit a tick every 5 seconds until the client disconnects.
// When the client goes away, the generator is returned
// (cancelled) and this loop stops automatically.
while (true) {
await Bun.sleep(5000);
yield `data: tick ${Date.now()}\n\n`;
}
},
{
headers: {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
},
},
);
},
},
});
```
## Using a `ReadableStream`
If your events originate from callbacks — message brokers, timers, external pushes — rather than a linear `await` flow, a `ReadableStream` often fits better. When the client disconnects, Bun calls the stream's `cancel()` method automatically, so you can release any resources you set up in `start()`.
```ts server.ts icon="/icons/typescript.svg"
Bun.serve({
port: 3000,
routes: {
"/events": (req, server) => {
server.timeout(req, 0);
let timer: Timer;
const stream = new ReadableStream({
start(controller) {
controller.enqueue(`data: connected at ${Date.now()}\n\n`);
timer = setInterval(() => {
controller.enqueue(`data: tick ${Date.now()}\n\n`);
}, 5000);
},
cancel() {
// Called automatically when the client disconnects.
clearInterval(timer);
},
});
return new Response(stream, {
headers: {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
},
});
},
},
});
```


@@ -171,12 +171,14 @@ Unlike unix domain sockets, abstract namespace sockets are not bound to the file
## idleTimeout
-To configure the idle timeout, set the `idleTimeout` field in Bun.serve.
+By default, `Bun.serve` closes connections after **10 seconds** of inactivity. A connection is considered idle when there is no data being sent or received — this includes in-flight requests where your handler is still running but hasn't written any bytes to the response yet. Browsers and `fetch()` clients will see this as a connection reset.
+To configure this, set the `idleTimeout` field (in seconds). The maximum value is `255`, and `0` disables the timeout entirely.
 ```ts
 Bun.serve({
-  // 10 seconds:
-  idleTimeout: 10,
+  // 30 seconds (default is 10)
+  idleTimeout: 30,

   fetch(req) {
     return new Response("Bun!");
@@ -184,7 +186,11 @@ Bun.serve({
});
```
-This is the maximum amount of time a connection is allowed to be idle before the server closes it. A connection is idling if there is no data sent or received.
+<Note>
+**Streaming & Server-Sent Events** — The idle timer applies while a response is being streamed. If your stream goes
+quiet for longer than `idleTimeout`, the connection will be closed mid-response. For long-lived streams, disable the
+timeout for that request with [`server.timeout(req, 0)`](#server-timeout-request-seconds).
+</Note>
---
@@ -296,12 +302,12 @@ This is useful for development and hot reloading. Only `fetch`, `error`, and `ro
### `server.timeout(Request, seconds)`
-Set a custom idle timeout for individual requests:
+Override the idle timeout for an individual request. Pass `0` to disable the timeout entirely for that request.
```ts
const server = Bun.serve({
async fetch(req, server) {
-// Set 60 second timeout for this request
+// Give this request up to 60 seconds of inactivity instead of the default 10
server.timeout(req, 60);
// If they take longer than 60 seconds to send the body, the request will be aborted
@@ -312,7 +318,28 @@ const server = Bun.serve({
});
```
-Pass `0` to disable the timeout for a request.
+This is the recommended way to keep long-lived streaming responses (like Server-Sent Events) alive without raising the global `idleTimeout` for every request:
```ts
Bun.serve({
routes: {
"/events": (req, server) => {
// Disable the idle timeout for this streaming response.
// Otherwise the connection will be closed if no bytes
// are sent for 10 seconds (the default idleTimeout).
server.timeout(req, 0);
return new Response(
async function* () {
yield "data: hello\n\n";
// events can arrive sporadically without the connection being killed
},
{ headers: { "Content-Type": "text/event-stream" } },
);
},
},
});
```
### `server.requestIP(Request)`


@@ -124,22 +124,32 @@ Return a string to replace the element's rendering. Return `null` or `undefined`
### Block callbacks
| Callback | Meta | Description |
| ------------ | ------------------------------------------- | ---------------------------------------------------------------------------------------- |
| `heading`    | `{ level: number, id?: string }`            | Heading level 1–6. `id` is set when `headings: { ids: true }` is enabled                 |
| `paragraph` | — | Paragraph block |
| `blockquote` | — | Blockquote block |
| `code` | `{ language?: string }` | Fenced or indented code block. `language` is the info-string when specified on the fence |
| `list` | `{ ordered: boolean, start?: number }` | Ordered or unordered list. `start` is the start number for ordered lists |
| `listItem` | `{ checked?: boolean }` | List item. `checked` is set for task list items (`- [x]` / `- [ ]`) |
| `hr` | — | Horizontal rule |
| `table` | — | Table block |
| `thead` | — | Table head |
| `tbody` | — | Table body |
| `tr` | — | Table row |
| `th` | `{ align?: "left" \| "center" \| "right" }` | Table header cell. `align` is set when alignment is specified |
| `td` | `{ align?: "left" \| "center" \| "right" }` | Table data cell. `align` is set when alignment is specified |
| `html` | — | Raw HTML content |
| Callback | Meta | Description |
| ------------ | --------------------------------------------- | ---------------------------------------------------------------------------------------- |
| `heading`    | `{ level, id? }`                              | Heading level 1–6. `id` is set when `headings: { ids: true }` is enabled                 |
| `paragraph` | — | Paragraph block |
| `blockquote` | — | Blockquote block |
| `code` | `{ language? }` | Fenced or indented code block. `language` is the info-string when specified on the fence |
| `list` | `{ ordered, start?, depth }` | `depth` is nesting level (0 = top-level). `start` is set for ordered lists |
| `listItem` | `{ index, depth, ordered, start?, checked? }` | See [List item meta](#list-item-meta) below |
| `hr` | — | Horizontal rule |
| `table` | — | Table block |
| `thead` | — | Table head |
| `tbody` | — | Table body |
| `tr` | — | Table row |
| `th` | `{ align? }` | Table header cell. `align` is `"left"`, `"center"`, `"right"`, or absent |
| `td` | `{ align? }` | Table data cell |
| `html` | — | Raw HTML content |
#### List item meta
The `listItem` callback receives everything needed to render markers directly:
- `index` — 0-based position within the parent list
- `depth` — the parent list's nesting level (0 = top-level)
- `ordered` — whether the parent list is ordered
- `start` — the parent list's start number (only when `ordered` is true)
- `checked` — task list state (only for `- [x]` / `- [ ]` items)
### Inline callbacks
@@ -205,6 +215,33 @@ const ansi = Bun.markdown.render("# Hello\n\nThis is **bold** and *italic*", {
});
```
#### Nested list numbering
The `listItem` callback receives everything needed to render markers directly — no post-processing:
```ts
const result = Bun.markdown.render("1. first\n 1. sub-a\n 2. sub-b\n2. second", {
listItem: (children, { index, depth, ordered, start }) => {
const n = (start ?? 1) + index;
// 1. 2. 3. at depth 0, a. b. c. at depth 1, i. ii. iii. at depth 2
const marker = !ordered
? "-"
: depth === 0
? `${n}.`
: depth === 1
? `${String.fromCharCode(96 + n)}.`
: `${toRoman(n)}.`;
return " ".repeat(depth) + marker + " " + children.trimEnd() + "\n";
},
// Prepend a newline so nested lists are separated from their parent item's text
list: children => "\n" + children,
});
// 1. first
// a. sub-a
// b. sub-b
// 2. second
```
#### Code block syntax highlighting
````ts


@@ -201,6 +201,10 @@ export const GuidesList = () => {
title: "Streaming HTTP Server with Async Iterators",
href: "/guides/http/stream-iterator",
},
{
title: "Server-Sent Events (SSE)",
href: "/guides/http/sse",
},
{
title: "Streaming HTTP Server with Node.js Streams",
href: "/guides/http/stream-node-streams-in-bun",

View File

@@ -1193,10 +1193,20 @@ declare module "bun" {
ordered: boolean;
/** The start number for ordered lists. */
start?: number;
/** Nesting depth. `0` for a top-level list, `1` for a list inside a list item, etc. */
depth: number;
}
/** Meta passed to the `listItem` callback. */
interface ListItemMeta {
/** 0-based index of this item within its parent list. */
index: number;
/** Nesting depth of the parent list. `0` for items in a top-level list. */
depth: number;
/** Whether the parent list is ordered. */
ordered: boolean;
/** The start number of the parent list (only set when `ordered` is true). */
start?: number;
/** Task list checked state. Set for `- [x]` / `- [ ]` items. */
checked?: boolean;
}
@@ -1234,8 +1244,8 @@ declare module "bun" {
code?: (children: string, meta?: CodeBlockMeta) => string | null | undefined;
/** Ordered or unordered list. `start` is the first item number for ordered lists. */
list?: (children: string, meta: ListMeta) => string | null | undefined;
/** List item. `meta.checked` is set for task list items (`- [x]` / `- [ ]`). Only passed for task list items. */
listItem?: (children: string, meta?: ListItemMeta) => string | null | undefined;
/** List item. `meta` always includes `{index, depth, ordered}`. `meta.start` is set for ordered lists; `meta.checked` is set for task list items. */
listItem?: (children: string, meta: ListItemMeta) => string | null | undefined;
/** Horizontal rule. */
hr?: (children: string) => string | null | undefined;
/** Table. */

View File

@@ -32,120 +32,102 @@ namespace uWS {
constexpr uint64_t STATE_HAS_SIZE = 1ull << (sizeof(uint64_t) * 8 - 1);//0x8000000000000000;
constexpr uint64_t STATE_IS_CHUNKED = 1ull << (sizeof(uint64_t) * 8 - 2);//0x4000000000000000;
constexpr uint64_t STATE_IS_CHUNKED_EXTENSION = 1ull << (sizeof(uint64_t) * 8 - 3);//0x2000000000000000;
constexpr uint64_t STATE_SIZE_MASK = ~(STATE_HAS_SIZE | STATE_IS_CHUNKED | STATE_IS_CHUNKED_EXTENSION);//0x1FFFFFFFFFFFFFFF;
constexpr uint64_t STATE_WAITING_FOR_LF = 1ull << (sizeof(uint64_t) * 8 - 4);//0x1000000000000000;
constexpr uint64_t STATE_SIZE_MASK = ~(STATE_HAS_SIZE | STATE_IS_CHUNKED | STATE_IS_CHUNKED_EXTENSION | STATE_WAITING_FOR_LF);//0x0FFFFFFFFFFFFFFF;
constexpr uint64_t STATE_IS_ERROR = ~0ull;//0xFFFFFFFFFFFFFFFF;
constexpr uint64_t STATE_SIZE_OVERFLOW = 0x0Full << (sizeof(uint64_t) * 8 - 8);//0x0F00000000000000;
/* Overflow guard: if any of bits 55-59 are set before the next *16, one more
* hex digit (plus the +2 for the trailing CRLF of chunk-data) would carry into
* STATE_WAITING_FOR_LF at bit 60. Limits chunk size to 14 hex digits (~72 PB). */
constexpr uint64_t STATE_SIZE_OVERFLOW = 0x1Full << (sizeof(uint64_t) * 8 - 9);//0x0F80000000000000;
inline uint64_t chunkSize(uint64_t state) {
return state & STATE_SIZE_MASK;
}
inline bool isParsingChunkedExtension(uint64_t state) {
return (state & STATE_IS_CHUNKED_EXTENSION) != 0;
}
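The state-word layout above — flags in the top four bits, chunk size in the rest — can be sketched outside C++ as well. This is a TypeScript illustration (BigInt standing in for `uint64_t`, since plain numbers lose precision past 2^53), assuming the bit positions defined by the constants above:

```typescript
// Flags occupy bits 60-63; the chunk size lives in the low 60 bits.
const HAS_SIZE = 1n << 63n;
const IS_CHUNKED = 1n << 62n;
const IS_CHUNKED_EXTENSION = 1n << 61n;
const WAITING_FOR_LF = 1n << 60n;
// Mask to 64 bits because BigInt `~` would otherwise go negative.
const SIZE_MASK =
  ~(HAS_SIZE | IS_CHUNKED | IS_CHUNKED_EXTENSION | WAITING_FOR_LF) &
  ((1n << 64n) - 1n);

const chunkSize = (state: bigint): bigint => state & SIZE_MASK;

// Folding hex digits into the size, as the parser's inner loop does:
// multiply the size portion by 16, add the digit, re-set the flag bits.
let state = IS_CHUNKED;
for (const digit of [0xan, 0xbn]) {
  state = ((state & SIZE_MASK) * 16n + digit) | IS_CHUNKED;
}
// chunkSize(state) now equals 0xABn (171n)
```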
/* Parses the chunk-size line: HEXDIG+ [;ext...] CRLF
*
* Returns the new state. On return, exactly one of:
* - state has STATE_HAS_SIZE set (success, data advanced past LF)
* - state == STATE_IS_ERROR (malformed input)
* - data is empty (short read; flags persist for resume)
*
* Resume flags:
* STATE_WAITING_FOR_LF -> saw '\r' on previous call, need '\n'
* STATE_IS_CHUNKED_EXTENSION -> mid-extension, skip hex parsing on resume
*
* Structure follows upstream uWS (scan-for-LF) with strict CRLF validation
* added. Every byte is consumed in a forward scan so TCP segment boundaries
* splitting the line at any point are handled by construction.
*
* RFC 7230 4.1.1:
* chunk = chunk-size [ chunk-ext ] CRLF chunk-data CRLF
* chunk-size = 1*HEXDIG
* chunk-ext = *( ";" chunk-ext-name [ "=" chunk-ext-val ] )
* chunk-ext-name = token
* chunk-ext-val = token / quoted-string (TODO: quoted-string unsupported)
*/
inline uint64_t consumeHexNumber(std::string_view &data, uint64_t state) {
/* Resume: '\r' was the last byte of the previous segment. Rare path,
* use data directly to avoid the p/len load on the hot path. */
if (state & STATE_WAITING_FOR_LF) [[unlikely]] {
if (!data.length()) return state;
if (data[0] != '\n') return STATE_IS_ERROR;
data.remove_prefix(1);
return ((state & ~(STATE_WAITING_FOR_LF | STATE_IS_CHUNKED_EXTENSION)) + 2)
| STATE_HAS_SIZE | STATE_IS_CHUNKED;
}
/* Reads hex number until CR or out of data to consume. Updates state. Returns bytes consumed. */
inline void consumeHexNumber(std::string_view &data, uint64_t &state) {
/* Load pointer+length into locals so the loops operate in registers.
* Without this, Clang writes back to the string_view on every iteration.
* Error paths skip the writeback: HttpParser returns immediately on
* STATE_IS_ERROR and never reads data. */
const char *p = data.data();
size_t len = data.length();
/* RFC 9110: 5.5 Field Values (TLDR; anything above 31 is allowed \r, \n ; depending on context)*/
if(!isParsingChunkedExtension(state)){
/* Consume everything higher than 32 and not ; (extension)*/
while (data.length() && data[0] > 32 && data[0] != ';') {
unsigned char digit = (unsigned char)data[0];
unsigned int number;
if (digit >= '0' && digit <= '9') {
number = digit - '0';
} else if (digit >= 'a' && digit <= 'f') {
number = digit - 'a' + 10;
} else if (digit >= 'A' && digit <= 'F') {
number = digit - 'A' + 10;
} else {
state = STATE_IS_ERROR;
return;
}
if ((chunkSize(state) & STATE_SIZE_OVERFLOW)) {
state = STATE_IS_ERROR;
return;
}
// extract state bits
uint64_t bits = /*state &*/ STATE_IS_CHUNKED;
state = (state & STATE_SIZE_MASK) * 16ull + number;
state |= bits;
data.remove_prefix(1);
/* Hex digits. Skipped when resuming mid-extension so that extension bytes
* like 'a' aren't misparsed as hex. */
if (!(state & STATE_IS_CHUNKED_EXTENSION)) {
while (len) {
unsigned char c = (unsigned char) *p;
if (c <= 32 || c == ';') break; /* fall through to drain loop */
unsigned int d = c | 0x20; /* fold A-F -> a-f; '0'..'9' unchanged */
unsigned int n;
if ((unsigned)(d - '0') < 10) [[likely]] n = d - '0';
else if ((unsigned)(d - 'a') < 6) n = d - 'a' + 10;
else return STATE_IS_ERROR;
if (chunkSize(state) & STATE_SIZE_OVERFLOW) [[unlikely]] return STATE_IS_ERROR;
state = ((state & STATE_SIZE_MASK) * 16ull + n) | STATE_IS_CHUNKED;
++p; --len;
}
}
auto len = data.length();
if(len) {
// consume extension
if(data[0] == ';' || isParsingChunkedExtension(state)) {
// mark that we are parsing chunked extension
state |= STATE_IS_CHUNKED_EXTENSION;
/* we got chunk extension lets remove it*/
while(data.length()) {
if(data[0] == '\r') {
// we are done parsing extension
state &= ~STATE_IS_CHUNKED_EXTENSION;
break;
}
/* RFC 9110: Token format (TL;DR: anything below 32 is not allowed)
* TODO: add support for quoted-strings values (RFC 9110: 3.2.6. Quoted-String)
* Example of chunked encoding with extensions:
*
* 4;key=value\r\n
* Wiki\r\n
* 5;foo=bar;baz=quux\r\n
* pedia\r\n
* 0\r\n
* \r\n
*
* The chunk size is in hex (4, 5, 0), followed by optional
* semicolon-separated extensions. Extensions consist of a key
* (token) and optional value. The value may be a token or a
* quoted string. The chunk data follows the CRLF after the
* extensions and must be exactly the size specified.
*
* RFC 7230 Section 4.1.1 defines chunk extensions as:
* chunk-ext = *( ";" chunk-ext-name [ "=" chunk-ext-val ] )
* chunk-ext-name = token
* chunk-ext-val = token / quoted-string
*/
if(data[0] <= 32) {
state = STATE_IS_ERROR;
return;
}
data.remove_prefix(1);
/* Drain [;ext...] \r \n. Upstream-style forward scan for LF, with strict
* validation: only >32 bytes (extension) and exactly one '\r' immediately
* before '\n' are allowed. */
while (len) {
unsigned char c = (unsigned char) *p;
if (c == '\n') return STATE_IS_ERROR; /* bare LF */
++p; --len;
if (c == '\r') {
if (!len) {
data = std::string_view(p, len);
return state | STATE_WAITING_FOR_LF;
}
if (*p != '\n') return STATE_IS_ERROR;
++p; --len;
data = std::string_view(p, len);
return ((state & ~STATE_IS_CHUNKED_EXTENSION) + 2)
| STATE_HAS_SIZE | STATE_IS_CHUNKED;
}
if(data.length() >= 2) {
/* Consume \r\n */
if((data[0] != '\r' || data[1] != '\n')) {
state = STATE_IS_ERROR;
return;
}
state += 2; // include the two last /r/n
state |= STATE_HAS_SIZE | STATE_IS_CHUNKED;
data.remove_prefix(2);
}
if (c <= 32) return STATE_IS_ERROR;
state |= STATE_IS_CHUNKED_EXTENSION;
}
// short read
data = std::string_view(p, len);
return state; /* short read */
}
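The contract described in the comments — consume what you can in one forward scan, record resume flags on a short read, signal errors eagerly — can be sketched in TypeScript. This is an illustration of the same shape, not the actual parser; it uses a plain state object instead of packed bits and assumes the strict-CRLF rules above:

```typescript
type ScanState = {
  size: number;          // accumulated chunk size
  hasSize: boolean;      // chunk-size line fully parsed
  inExtension: boolean;  // resume flag: mid-extension, skip hex on resume
  waitingForLf: boolean; // resume flag: saw '\r' at end of prior segment
  error: boolean;
};

// Returns the unconsumed remainder of `data`; "" means short read.
function scanChunkSizeLine(data: string, s: ScanState): string {
  let i = 0;
  if (s.waitingForLf) {
    if (i >= data.length) return "";
    if (data[i] !== "\n") { s.error = true; return data; }
    i++; s.waitingForLf = false; s.inExtension = false; s.hasSize = true;
    return data.slice(i);
  }
  // Hex digits — skipped on resume mid-extension so extension bytes
  // like 'a' aren't misparsed as hex.
  if (!s.inExtension) {
    while (i < data.length) {
      const c = data[i];
      if (c <= " " || c === ";") break;
      const d = parseInt(c, 16);
      if (Number.isNaN(d)) { s.error = true; return data.slice(i); }
      s.size = s.size * 16 + d;
      i++;
    }
  }
  // Drain [;ext...] \r \n with strict validation.
  while (i < data.length) {
    const c = data[i];
    if (c === "\n") { s.error = true; return data.slice(i); } // bare LF
    i++;
    if (c === "\r") {
      if (i >= data.length) { s.waitingForLf = true; return ""; }
      if (data[i] !== "\n") { s.error = true; return data.slice(i); }
      i++; s.inExtension = false; s.hasSize = true;
      return data.slice(i);
    }
    if (c < "!") { s.error = true; return data.slice(i); } // ctrl/space
    s.inExtension = true;
  }
  return ""; // short read; flags persist for resume
}
```

Because every byte is consumed in a forward scan and the resume flags survive between calls, a chunk-size line split across TCP segments at any point — even between `\r` and `\n` — is handled by construction.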
inline void decChunkSize(uint64_t &state, uint64_t by) {
//unsigned int bits = state & STATE_IS_CHUNKED;
state = (state & ~STATE_SIZE_MASK) | (chunkSize(state) - by);
//state |= bits;
}
inline bool hasChunkSize(uint64_t state) {
@@ -187,8 +169,8 @@ namespace uWS {
}
if (!hasChunkSize(state)) {
consumeHexNumber(data, state);
if (isParsingInvalidChunkedEncoding(state)) {
state = consumeHexNumber(data, state);
if (isParsingInvalidChunkedEncoding(state)) [[unlikely]] {
return std::nullopt;
}
if (hasChunkSize(state) && chunkSize(state) == 2) {
@@ -204,6 +186,10 @@ namespace uWS {
return std::string_view(nullptr, 0);
}
if (!hasChunkSize(state)) [[unlikely]] {
/* Incomplete chunk-size line — need more data from the network. */
return std::nullopt;
}
continue;
}

View File

@@ -1130,18 +1130,7 @@ install_llvm() {
"clang$(llvm_version)" \
"scudo-malloc" \
"lld$(llvm_version)" \
"llvm$(llvm_version)-dev"
# Alpine uses versioned binary names (e.g. llvm-symbolizer-21, llvm21-symbolizer).
# Create unversioned symlinks so tools like bun-tracestrings can find them.
if ! command -v llvm-symbolizer > /dev/null 2>&1; then
local llvm_v="$(llvm_version)"
# Try Debian-style naming (llvm-symbolizer-21), then Alpine-style (llvm21-symbolizer)
if command -v "llvm-symbolizer-${llvm_v}" > /dev/null 2>&1; then
execute_sudo ln -sf "$(which "llvm-symbolizer-${llvm_v}")" /usr/bin/llvm-symbolizer
elif command -v "llvm${llvm_v}-symbolizer" > /dev/null 2>&1; then
execute_sudo ln -sf "$(which "llvm${llvm_v}-symbolizer")" /usr/bin/llvm-symbolizer
fi
fi
"llvm$(llvm_version)-dev" # Ensures llvm-symbolizer is installed
;;
esac
}

View File

@@ -759,8 +759,12 @@ const JsCallbackRenderer = struct {
const StackEntry = struct {
buffer: std.ArrayListUnmanaged(u8) = .{},
block_type: md.BlockType = .doc,
data: u32 = 0,
flags: u32 = 0,
/// For ul/ol: number of li children seen so far (next li's index).
/// For li: this item's 0-based index within its parent list.
child_index: u32 = 0,
detail: md.SpanDetail = .{},
};
@@ -853,7 +857,22 @@ const JsCallbackRenderer = struct {
if (block_type == .h) {
self.#heading_tracker.enterHeading();
}
try self.#stack.append(self.#allocator, .{ .data = data, .flags = flags });
// For li: record its 0-based index within the parent list, then
// increment the parent's counter so the next sibling gets index+1.
var child_index: u32 = 0;
if (block_type == .li and self.#stack.items.len > 0) {
const parent = &self.#stack.items[self.#stack.items.len - 1];
child_index = parent.child_index;
parent.child_index += 1;
}
try self.#stack.append(self.#allocator, .{
.block_type = block_type,
.data = data,
.flags = flags,
.child_index = child_index,
});
}
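The sibling-index bookkeeping above — read the parent's counter on enter, then bump it — can be sketched in a few lines of TypeScript (an illustration of the scheme, not the Zig code itself):

```typescript
// When entering an "li", read the parent's counter as this item's
// 0-based index, then increment it so the next sibling gets index + 1.
type StackEntry = { blockType: string; childIndex: number };
const stack: StackEntry[] = [];

function enterBlock(blockType: string): void {
  let childIndex = 0;
  if (blockType === "li" && stack.length > 0) {
    const parent = stack[stack.length - 1];
    childIndex = parent.childIndex;
    parent.childIndex += 1;
  }
  stack.push({ blockType, childIndex });
}

function leaveBlock(): StackEntry {
  return stack.pop()!;
}
```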
fn leaveBlockImpl(ptr: *anyopaque, block_type: md.BlockType, _: u32) bun.JSError!void {
@@ -986,6 +1005,30 @@ const JsCallbackRenderer = struct {
// Metadata object creation
// ========================================
/// Walks the stack to count enclosing ul/ol blocks. Called during leave,
/// so the top entry is the block itself and is always skipped: an li wants
/// only its enclosing lists, and a ul/ol's own depth excludes itself.
fn countListDepth(self: *JsCallbackRenderer) u32 {
var depth: u32 = 0;
// Skip the top entry (self) — we want enclosing lists only.
const len = self.#stack.items.len;
if (len < 2) return 0;
for (self.#stack.items[0 .. len - 1]) |entry| {
if (entry.block_type == .ul or entry.block_type == .ol) depth += 1;
}
return depth;
}
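A TypeScript sketch of the same depth computation (illustrative only, mirroring the Zig above):

```typescript
// Count enclosing ul/ol entries on the stack, skipping the top entry
// (the block currently being left).
type Entry = { blockType: string };

function countListDepth(stack: Entry[]): number {
  if (stack.length < 2) return 0;
  let depth = 0;
  for (const entry of stack.slice(0, -1)) {
    if (entry.blockType === "ul" || entry.blockType === "ol") depth++;
  }
  return depth;
}
```

For an `li` at the top of the stack this counts its parent list too, which is why the `.li` branch below subtracts one so items in a top-level list report depth 0.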
/// Returns the parent ul/ol entry for the current li (top of stack).
/// Returns null if the stack shape is unexpected.
fn parentList(self: *JsCallbackRenderer) ?*const StackEntry {
const len = self.#stack.items.len;
if (len < 2) return null;
const parent = &self.#stack.items[len - 2];
if (parent.block_type == .ul or parent.block_type == .ol) return parent;
return null;
}
fn createBlockMeta(self: *JsCallbackRenderer, block_type: md.BlockType, data: u32, flags: u32) bun.JSError!?JSValue {
const g = self.#globalObject;
switch (block_type) {
@@ -1000,15 +1043,10 @@ const JsCallbackRenderer = struct {
return obj;
},
.ol => {
const obj = JSValue.createEmptyObject(g, 2);
obj.put(g, ZigString.static("ordered"), .true);
obj.put(g, ZigString.static("start"), JSValue.jsNumber(data));
return obj;
return BunMarkdownMeta__createList(g, true, JSValue.jsNumber(data), self.countListDepth());
},
.ul => {
const obj = JSValue.createEmptyObject(g, 1);
obj.put(g, ZigString.static("ordered"), .false);
return obj;
return BunMarkdownMeta__createList(g, false, .js_undefined, self.countListDepth());
},
.code => {
if (flags & md.BLOCK_FENCED_CODE != 0) {
@@ -1023,21 +1061,31 @@ const JsCallbackRenderer = struct {
},
.th, .td => {
const alignment = md.types.alignmentFromData(data);
if (md.types.alignmentName(alignment)) |align_str| {
const obj = JSValue.createEmptyObject(g, 1);
obj.put(g, ZigString.static("align"), try bun.String.createUTF8ForJS(g, align_str));
return obj;
}
return null;
const align_js = if (md.types.alignmentName(alignment)) |align_str|
try bun.String.createUTF8ForJS(g, align_str)
else
JSValue.js_undefined;
return BunMarkdownMeta__createCell(g, align_js);
},
.li => {
// The li entry is still on top of the stack; parent ul/ol is at len-2.
const len = self.#stack.items.len;
const item_index = if (len > 1) self.#stack.items[len - 1].child_index else 0;
const parent = self.parentList();
const is_ordered = parent != null and parent.?.block_type == .ol;
// countListDepth() includes the immediate parent list; subtract it
// so that items in a top-level list report depth 0.
const enclosing = self.countListDepth();
const depth: u32 = if (enclosing > 0) enclosing - 1 else 0;
const task_mark = md.types.taskMarkFromData(data);
if (task_mark != 0) {
const obj = JSValue.createEmptyObject(g, 1);
obj.put(g, ZigString.static("checked"), JSValue.jsBoolean(md.types.isTaskChecked(task_mark)));
return obj;
}
return null;
const start_js = if (is_ordered) JSValue.jsNumber(parent.?.data) else JSValue.js_undefined;
const checked_js = if (task_mark != 0)
JSValue.jsBoolean(md.types.isTaskChecked(task_mark))
else
JSValue.js_undefined;
return BunMarkdownMeta__createListItem(g, item_index, depth, is_ordered, start_js, checked_js);
},
else => return null,
}
@@ -1047,14 +1095,18 @@ const JsCallbackRenderer = struct {
const g = self.#globalObject;
switch (span_type) {
.a => {
const obj = JSValue.createEmptyObject(g, 2);
obj.put(g, ZigString.static("href"), try bun.String.createUTF8ForJS(g, detail.href));
if (detail.title.len > 0) {
obj.put(g, ZigString.static("title"), try bun.String.createUTF8ForJS(g, detail.title));
}
return obj;
const href = try bun.String.createUTF8ForJS(g, detail.href);
const title = if (detail.title.len > 0)
try bun.String.createUTF8ForJS(g, detail.title)
else
JSValue.js_undefined;
return BunMarkdownMeta__createLink(g, href, title);
},
.img => {
// Image meta shares its shape with link meta (src/href are both the
// first field), but using a separate cached structure would require a
// second slot, so we fall back to the generic path here; images are
// rare enough that it doesn't matter.
const obj = JSValue.createEmptyObject(g, 2);
obj.put(g, ZigString.static("src"), try bun.String.createUTF8ForJS(g, detail.href));
if (detail.title.len > 0) {
@@ -1114,6 +1166,14 @@ const TagIndex = enum(u8) {
extern fn BunMarkdownTagStrings__getTagString(*jsc.JSGlobalObject, u8) JSValue;
// Fast-path meta-object constructors using cached Structures (see
// BunMarkdownMeta.cpp). Each constructs via putDirectOffset so the
// resulting objects share a single Structure and stay monomorphic.
extern fn BunMarkdownMeta__createListItem(*jsc.JSGlobalObject, u32, u32, bool, JSValue, JSValue) JSValue;
extern fn BunMarkdownMeta__createList(*jsc.JSGlobalObject, bool, JSValue, u32) JSValue;
extern fn BunMarkdownMeta__createCell(*jsc.JSGlobalObject, JSValue) JSValue;
extern fn BunMarkdownMeta__createLink(*jsc.JSGlobalObject, JSValue, JSValue) JSValue;
fn getCachedTagString(globalObject: *jsc.JSGlobalObject, tag: TagIndex) JSValue {
return BunMarkdownTagStrings__getTagString(globalObject, @intFromEnum(tag));
}

View File

@@ -0,0 +1,123 @@
#include "BunMarkdownMeta.h"
#include "JavaScriptCore/JSObjectInlines.h"
#include "JavaScriptCore/ObjectConstructor.h"
#include "JavaScriptCore/JSCast.h"
using namespace JSC;
namespace Bun {
namespace MarkdownMeta {
// Builds a cached Structure with N fixed property offsets. Properties are
// laid out in declaration order so the extern "C" create functions can use
// putDirectOffset without name lookups.
static Structure* buildStructure(VM& vm, JSGlobalObject* globalObject, std::initializer_list<ASCIILiteral> names)
{
Structure* structure = globalObject->structureCache().emptyObjectStructureForPrototype(
globalObject,
globalObject->objectPrototype(),
names.size());
PropertyOffset offset;
PropertyOffset expected = 0;
for (auto name : names) {
structure = structure->addPropertyTransition(vm, structure, Identifier::fromString(vm, name), 0, offset);
ASSERT_UNUSED(expected, offset == expected);
expected++;
}
return structure;
}
Structure* createListItemMetaStructure(VM& vm, JSGlobalObject* globalObject)
{
return buildStructure(vm, globalObject, { "index"_s, "depth"_s, "ordered"_s, "start"_s, "checked"_s });
}
Structure* createListMetaStructure(VM& vm, JSGlobalObject* globalObject)
{
return buildStructure(vm, globalObject, { "ordered"_s, "start"_s, "depth"_s });
}
Structure* createCellMetaStructure(VM& vm, JSGlobalObject* globalObject)
{
return buildStructure(vm, globalObject, { "align"_s });
}
Structure* createLinkMetaStructure(VM& vm, JSGlobalObject* globalObject)
{
return buildStructure(vm, globalObject, { "href"_s, "title"_s });
}
} // namespace MarkdownMeta
} // namespace Bun
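Seen from the JS side, what the cached Structures buy is a fixed object shape: every property is always assigned (undefined when absent), so all meta objects of a kind share one shape and property accesses in the callback stay monomorphic. A plain-JS sketch of the equivalent discipline:

```typescript
// Always materialize every property, in a fixed order, even when the
// value is undefined — this keeps every listItem meta object on the
// same shape (hidden class), matching the declaration order used by
// buildStructure above.
function makeListItemMeta(
  index: number,
  depth: number,
  ordered: boolean,
  start?: number,
  checked?: boolean,
) {
  return { index, depth, ordered, start, checked };
}
```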
// ──────────────────────────────────────────────────────────────────────────
// extern "C" constructors — callable from MarkdownObject.zig
// ──────────────────────────────────────────────────────────────────────────
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createListItem(
JSGlobalObject* globalObject,
uint32_t index,
uint32_t depth,
bool ordered,
EncodedJSValue start,
EncodedJSValue checked)
{
auto* global = jsCast<Zig::GlobalObject*>(globalObject);
VM& vm = global->vm();
JSObject* obj = constructEmptyObject(vm, global->JSMarkdownListItemMetaStructure());
obj->putDirectOffset(vm, 0, jsNumber(index));
obj->putDirectOffset(vm, 1, jsNumber(depth));
obj->putDirectOffset(vm, 2, jsBoolean(ordered));
obj->putDirectOffset(vm, 3, JSValue::decode(start));
obj->putDirectOffset(vm, 4, JSValue::decode(checked));
return JSValue::encode(obj);
}
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createList(
JSGlobalObject* globalObject,
bool ordered,
EncodedJSValue start,
uint32_t depth)
{
auto* global = jsCast<Zig::GlobalObject*>(globalObject);
VM& vm = global->vm();
JSObject* obj = constructEmptyObject(vm, global->JSMarkdownListMetaStructure());
obj->putDirectOffset(vm, 0, jsBoolean(ordered));
obj->putDirectOffset(vm, 1, JSValue::decode(start));
obj->putDirectOffset(vm, 2, jsNumber(depth));
return JSValue::encode(obj);
}
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createCell(
JSGlobalObject* globalObject,
EncodedJSValue align)
{
auto* global = jsCast<Zig::GlobalObject*>(globalObject);
VM& vm = global->vm();
JSObject* obj = constructEmptyObject(vm, global->JSMarkdownCellMetaStructure());
obj->putDirectOffset(vm, 0, JSValue::decode(align));
return JSValue::encode(obj);
}
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createLink(
JSGlobalObject* globalObject,
EncodedJSValue href,
EncodedJSValue title)
{
auto* global = jsCast<Zig::GlobalObject*>(globalObject);
VM& vm = global->vm();
JSObject* obj = constructEmptyObject(vm, global->JSMarkdownLinkMetaStructure());
obj->putDirectOffset(vm, 0, JSValue::decode(href));
obj->putDirectOffset(vm, 1, JSValue::decode(title));
return JSValue::encode(obj);
}

View File

@@ -0,0 +1,61 @@
#pragma once
#include "root.h"
#include "headers.h"
#include "JavaScriptCore/JSObjectInlines.h"
#include "ZigGlobalObject.h"
using namespace JSC;
namespace Bun {
namespace MarkdownMeta {
// Cached Structures for the small metadata objects passed as the second
// argument to Bun.markdown.render() callbacks. These have fixed shapes
// so JSC's property access inline caches stay monomorphic and we avoid
// the string-hash + property-transition cost of `put()`-style construction
// on every callback (which matters a lot for list items and table cells).
Structure* createListItemMetaStructure(VM& vm, JSGlobalObject* globalObject);
Structure* createListMetaStructure(VM& vm, JSGlobalObject* globalObject);
Structure* createCellMetaStructure(VM& vm, JSGlobalObject* globalObject);
Structure* createLinkMetaStructure(VM& vm, JSGlobalObject* globalObject);
} // namespace MarkdownMeta
} // namespace Bun
// ListItemMeta: {index, depth, ordered, start, checked}
// `start` and `checked` are always present (jsUndefined() when not applicable)
// so the shape is fixed.
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createListItem(
JSGlobalObject* globalObject,
uint32_t index,
uint32_t depth,
bool ordered,
EncodedJSValue start, // jsNumber or jsUndefined
EncodedJSValue checked // jsBoolean or jsUndefined
);
// ListMeta: {ordered, start, depth}
// `start` is always present (jsUndefined for unordered).
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createList(
JSGlobalObject* globalObject,
bool ordered,
EncodedJSValue start, // jsNumber or jsUndefined
uint32_t depth);
// CellMeta: {align}
// `align` is always present (jsUndefined when no alignment).
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createCell(
JSGlobalObject* globalObject,
EncodedJSValue align // jsString or jsUndefined
);
// LinkMeta / ImageMeta: {href, title} or {src, title}
// `title` is always present (jsUndefined when missing). `href` and `src`
// share the structure slot (first property) — the property name differs
// but the shape is the same; two separate structures are used.
extern "C" JSC::EncodedJSValue BunMarkdownMeta__createLink(
JSGlobalObject* globalObject,
EncodedJSValue href,
EncodedJSValue title // jsString or jsUndefined
);

View File

@@ -1507,7 +1507,6 @@ static int64_t indexOfBuffer(JSC::JSGlobalObject* lexicalGlobalObject, bool last
static int64_t indexOf(JSC::JSGlobalObject* lexicalGlobalObject, ThrowScope& scope, JSC::CallFrame* callFrame, typename IDLOperation<JSArrayBufferView>::ClassParameter buffer, bool last)
{
bool dir = !last;
const uint8_t* typedVector = buffer->typedVector();
size_t byteLength = buffer->byteLength();
std::optional<BufferEncodingType> encoding = std::nullopt;
double byteOffsetD = 0;
@@ -1523,17 +1522,41 @@ static int64_t indexOf(JSC::JSGlobalObject* lexicalGlobalObject, ThrowScope& sco
byteOffsetValue = jsUndefined();
byteOffsetD = 0;
} else {
// toNumber() can trigger JavaScript execution (valueOf/Symbol.toPrimitive),
// which could detach the underlying ArrayBuffer. We must re-fetch the
// pointer and length after this call.
byteOffsetD = byteOffsetValue.toNumber(lexicalGlobalObject);
RETURN_IF_EXCEPTION(scope, -1);
if (byteOffsetD > 0x7fffffffp0f) byteOffsetD = 0x7fffffffp0f;
if (byteOffsetD < -0x80000000p0f) byteOffsetD = -0x80000000p0f;
}
// After any call that can trigger JS execution (toNumber, toWTFString,
// toString), the buffer may have been detached. We must re-fetch typedVector
// and byteLength from the buffer right before use, and check for detachment
// after all JS calls in each code path are complete.
// Helper: re-fetch buffer state after JS calls. Returns false if the buffer
// was detached; caller treats this as an empty buffer (matches Node.js: -1).
auto refetchBufferState = [&](const uint8_t*& typedVector, size_t& len) -> bool {
if (buffer->isDetached()) [[unlikely]] {
typedVector = nullptr;
len = 0;
return false;
}
typedVector = buffer->typedVector();
len = buffer->byteLength();
return true;
};
if (std::isnan(byteOffsetD)) byteOffsetD = dir ? 0 : byteLength;
if (valueValue.isNumber()) {
auto byteValue = static_cast<uint8_t>((valueValue.toInt32(lexicalGlobalObject)) % 256);
RETURN_IF_EXCEPTION(scope, -1);
const uint8_t* typedVector;
if (!refetchBufferState(typedVector, byteLength)) return -1;
if (byteLength == 0) return -1;
return indexOfNumber(lexicalGlobalObject, last, typedVector, byteLength, byteOffsetD, byteValue);
}
@@ -1550,13 +1573,26 @@ static int64_t indexOf(JSC::JSGlobalObject* lexicalGlobalObject, ThrowScope& sco
if (!encoding.has_value()) {
return Bun::ERR::UNKNOWN_ENCODING(scope, lexicalGlobalObject, encodingString);
}
auto* str = valueValue.toStringOrNull(lexicalGlobalObject);
auto* str = valueValue.toString(lexicalGlobalObject);
RETURN_IF_EXCEPTION(scope, -1);
const uint8_t* typedVector;
if (!refetchBufferState(typedVector, byteLength)) return -1;
if (byteLength == 0) return -1;
return indexOfString(lexicalGlobalObject, last, typedVector, byteLength, byteOffsetD, str, encoding.value());
}
if (auto* array = JSC::jsDynamicCast<JSC::JSUint8Array*>(valueValue)) {
if (!encoding.has_value()) encoding = BufferEncodingType::utf8;
const uint8_t* typedVector;
if (!refetchBufferState(typedVector, byteLength)) return -1;
if (byteLength == 0) return -1;
// The needle's backing buffer may also have been detached by a
// valueOf/toPrimitive callback during toNumber/toWTFString above.
// Treat a detached needle as an empty buffer (matches Node.js:
// returns the computed byteOffset for a zero-length match).
if (array->isDetached()) [[unlikely]] {
return indexOfOffset(byteLength, byteOffsetD, 0, !last);
}
return indexOfBuffer(lexicalGlobalObject, last, typedVector, byteLength, byteOffsetD, array, encoding.value());
}
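The hazard the re-fetch guards against is easy to reproduce from JS: coercing an argument can run arbitrary user code (`valueOf` / `Symbol.toPrimitive`), which may transfer the buffer out from under the caller mid-operation. A sketch, assuming a runtime with `structuredClone` transfer support:

```typescript
// Coercion runs user code, which detaches the buffer mid-operation.
const ab = new ArrayBuffer(8);
const haystack = new Uint8Array(ab);

const evilOffset = {
  valueOf(): number {
    // Detach `ab` during coercion, e.g. via structuredClone transfer.
    structuredClone(ab, { transfer: [ab] });
    return 0;
  },
};

const offset = Number(evilOffset); // runs valueOf, detaching ab
// After detachment the view is unusable: byteLength collapses to 0,
// and any pointer/length captured before the coercion is stale.
```

This is exactly why `typedVector` and `byteLength` are re-fetched after every `toNumber`/`toString` call above rather than read once up front.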

View File

@@ -154,7 +154,7 @@ NodeVMModule::NodeVMModule(JSC::VM& vm, JSC::Structure* structure, WTF::String i
: Base(vm, structure)
, m_identifier(WTF::move(identifier))
, m_context(context && context.isObject() ? asObject(context) : nullptr, JSC::WriteBarrierEarlyInit)
, m_moduleWrapper(vm, this, moduleWrapper)
, m_moduleWrapper(moduleWrapper, JSC::WriteBarrierEarlyInit)
{
}

View File

@@ -54,7 +54,7 @@ private:
NodeVMSyntheticModule(JSC::VM& vm, JSC::Structure* structure, WTF::String identifier, JSValue context, JSValue moduleWrapper, WTF::HashSet<String> exportNames, JSValue syntheticEvaluationSteps)
: Base(vm, structure, WTF::move(identifier), context, moduleWrapper)
, m_exportNames(WTF::move(exportNames))
, m_syntheticEvaluationSteps(vm, this, syntheticEvaluationSteps)
, m_syntheticEvaluationSteps(syntheticEvaluationSteps, JSC::WriteBarrierEarlyInit)
{
}

View File

@@ -124,6 +124,7 @@
#include "JSSink.h"
#include "JSSocketAddressDTO.h"
#include "JSReactElement.h"
#include "BunMarkdownMeta.h"
#include "JSSQLStatement.h"
#include "JSStringDecoder.h"
#include "JSTextEncoder.h"
@@ -1802,6 +1803,23 @@ void GlobalObject::finishCreation(VM& vm)
init.set(Bun::JSReactElement::createStructure(init.vm, init.owner));
});
m_JSMarkdownListItemMetaStructure.initLater(
[](const Initializer<Structure>& init) {
init.set(Bun::MarkdownMeta::createListItemMetaStructure(init.vm, init.owner));
});
m_JSMarkdownListMetaStructure.initLater(
[](const Initializer<Structure>& init) {
init.set(Bun::MarkdownMeta::createListMetaStructure(init.vm, init.owner));
});
m_JSMarkdownCellMetaStructure.initLater(
[](const Initializer<Structure>& init) {
init.set(Bun::MarkdownMeta::createCellMetaStructure(init.vm, init.owner));
});
m_JSMarkdownLinkMetaStructure.initLater(
[](const Initializer<Structure>& init) {
init.set(Bun::MarkdownMeta::createLinkMetaStructure(init.vm, init.owner));
});
m_JSSQLStatementStructure.initLater(
[](const Initializer<Structure>& init) {
init.set(WebCore::createJSSQLStatementStructure(init.owner));

View File

@@ -302,6 +302,10 @@ public:
Structure* CommonJSModuleObjectStructure() const { return m_commonJSModuleObjectStructure.getInitializedOnMainThread(this); }
Structure* JSSocketAddressDTOStructure() const { return m_JSSocketAddressDTOStructure.getInitializedOnMainThread(this); }
Structure* JSReactElementStructure() const { return m_JSReactElementStructure.getInitializedOnMainThread(this); }
Structure* JSMarkdownListItemMetaStructure() const { return m_JSMarkdownListItemMetaStructure.getInitializedOnMainThread(this); }
Structure* JSMarkdownListMetaStructure() const { return m_JSMarkdownListMetaStructure.getInitializedOnMainThread(this); }
Structure* JSMarkdownCellMetaStructure() const { return m_JSMarkdownCellMetaStructure.getInitializedOnMainThread(this); }
Structure* JSMarkdownLinkMetaStructure() const { return m_JSMarkdownLinkMetaStructure.getInitializedOnMainThread(this); }
Structure* ImportMetaObjectStructure() const { return m_importMetaObjectStructure.getInitializedOnMainThread(this); }
Structure* ImportMetaBakeObjectStructure() const { return m_importMetaBakeObjectStructure.getInitializedOnMainThread(this); }
Structure* AsyncContextFrameStructure() const { return m_asyncBoundFunctionStructure.getInitializedOnMainThread(this); }
@@ -597,6 +601,10 @@ public:
V(private, LazyPropertyOfGlobalObject<Structure>, m_commonJSModuleObjectStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_JSSocketAddressDTOStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_JSReactElementStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_JSMarkdownListItemMetaStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_JSMarkdownListMetaStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_JSMarkdownCellMetaStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_JSMarkdownLinkMetaStructure) \
V(private, LazyPropertyOfGlobalObject<Structure>, m_memoryFootprintStructure) \
V(private, LazyPropertyOfGlobalObject<JSObject>, m_requireFunctionUnbound) \
V(private, LazyPropertyOfGlobalObject<JSObject>, m_requireResolveFunctionUnbound) \

View File

@@ -138,7 +138,7 @@ pub fn doReadFile(this: *Blob, comptime Function: anytype, global: *JSGlobalObje
promise_value.ensureStillAlive();
handler.promise.strong.set(global, promise_value);
read_file.ReadFileUV.start(handler.globalThis.bunVM().uvLoop(), this.store.?, this.offset, this.size, Handler, handler);
read_file.ReadFileUV.start(handler.globalThis.bunVM().eventLoop(), this.store.?, this.offset, this.size, Handler, handler);
return promise_value;
}
@@ -180,7 +180,7 @@ pub fn NewInternalReadFileHandler(comptime Context: type, comptime Function: any
pub fn doReadFileInternal(this: *Blob, comptime Handler: type, ctx: Handler, comptime Function: anytype, global: *JSGlobalObject) void {
if (Environment.isWindows) {
const ReadFileHandler = NewInternalReadFileHandler(Handler, Function);
return read_file.ReadFileUV.start(libuv.Loop.get(), this.store.?, this.offset, this.size, ReadFileHandler, ctx);
return read_file.ReadFileUV.start(global.bunVM().eventLoop(), this.store.?, this.offset, this.size, ReadFileHandler, ctx);
}
const file_read = read_file.ReadFile.createWithCtx(
bun.default_allocator,
@@ -3134,7 +3134,7 @@ pub fn getStat(this: *Blob, globalThis: *jsc.JSGlobalObject, callback: *jsc.Call
.encoded_slice = switch (path_like) {
// it's already converted to utf8
.encoded_slice => |slice| try slice.toOwned(bun.default_allocator),
else => try ZigString.init(path_like.slice()).toSliceClone(bun.default_allocator),
else => try ZigString.fromUTF8(path_like.slice()).toSliceClone(bun.default_allocator),
},
},
}, globalThis.bunVM());

View File

@@ -263,7 +263,7 @@ pub const File = struct {
.path = .{
.encoded_slice = switch (path_like) {
.encoded_slice => |slice| try slice.toOwned(bun.default_allocator),
else => try jsc.ZigString.init(path_like.slice()).toSliceClone(bun.default_allocator),
else => try jsc.ZigString.fromUTF8(path_like.slice()).toSliceClone(bun.default_allocator),
},
},
}, globalThis.bunVM()),

View File

@@ -523,6 +523,7 @@ pub const ReadFileUV = struct {
pub const doClose = FileCloser(@This()).doClose;
loop: *libuv.Loop,
event_loop: *jsc.EventLoop,
file_store: FileStore,
byte_store: ByteStore = ByteStore{ .allocator = bun.default_allocator },
store: *Store,
@@ -543,10 +544,11 @@ pub const ReadFileUV = struct {
req: libuv.fs_t = std.mem.zeroes(libuv.fs_t),
pub fn start(loop: *libuv.Loop, store: *Store, off: SizeType, max_len: SizeType, comptime Handler: type, handler: *anyopaque) void {
pub fn start(event_loop: *jsc.EventLoop, store: *Store, off: SizeType, max_len: SizeType, comptime Handler: type, handler: *anyopaque) void {
log("ReadFileUV.start", .{});
var this = bun.new(ReadFileUV, .{
.loop = loop,
.loop = event_loop.virtual_machine.uvLoop(),
.event_loop = event_loop,
.file_store = store.data.file,
.store = store,
.offset = off,
@@ -555,15 +557,20 @@ pub const ReadFileUV = struct {
.on_complete_fn = @ptrCast(&Handler.run),
});
store.ref();
// Keep the event loop alive while the async operation is pending
event_loop.refConcurrently();
this.getFd(onFileOpen);
}
pub fn finalize(this: *ReadFileUV) void {
log("ReadFileUV.finalize", .{});
const event_loop = this.event_loop;
defer {
this.store.deref();
this.req.deinit();
bun.destroy(this);
// Release the event loop reference now that we're done
event_loop.unrefConcurrently();
log("ReadFileUV.finalize destroy", .{});
}
@@ -691,8 +698,9 @@ pub const ReadFileUV = struct {
return;
}
// add an extra 16 bytes to the buffer to avoid having to resize it for trailing extra data
this.buffer.ensureTotalCapacityPrecise(this.byte_store.allocator, @min(this.size + 16, @as(usize, std.math.maxInt(bun.windows.ULONG)))) catch |err| {
this.errno = err;
this.buffer.ensureTotalCapacityPrecise(this.byte_store.allocator, @min(this.size + 16, @as(usize, std.math.maxInt(bun.windows.ULONG)))) catch {
this.errno = error.OutOfMemory;
this.system_error = bun.sys.Error.fromCode(bun.sys.E.NOMEM, .read).toSystemError();
this.onFinish();
return;
};
@@ -719,9 +727,11 @@ pub const ReadFileUV = struct {
// non-regular files have variable sizes, so we always ensure
// there's at least 4096 bytes of free space. there has already
// been an initial allocation done for us
this.buffer.ensureUnusedCapacity(this.byte_store.allocator, 4096) catch |err| {
this.errno = err;
this.buffer.ensureUnusedCapacity(this.byte_store.allocator, 4096) catch {
this.errno = error.OutOfMemory;
this.system_error = bun.sys.Error.fromCode(bun.sys.E.NOMEM, .read).toSystemError();
this.onFinish();
return;
};
}
@@ -750,8 +760,9 @@ pub const ReadFileUV = struct {
// We are done reading.
this.byte_store = ByteStore.init(
this.buffer.toOwnedSlice(this.byte_store.allocator) catch |err| {
this.errno = err;
this.buffer.toOwnedSlice(this.byte_store.allocator) catch {
this.errno = error.OutOfMemory;
this.system_error = bun.sys.Error.fromCode(bun.sys.E.NOMEM, .read).toSystemError();
this.onFinish();
return;
},
@@ -769,15 +780,16 @@ pub const ReadFileUV = struct {
if (result.errEnum()) |errno| {
this.errno = bun.errnoToZigErr(errno);
this.system_error = bun.sys.Error.fromCode(errno, .read).toSystemError();
this.finalize();
this.onFinish();
return;
}
if (result.int() == 0) {
// We are done reading.
this.byte_store = ByteStore.init(
this.buffer.toOwnedSlice(this.byte_store.allocator) catch |err| {
this.errno = err;
this.buffer.toOwnedSlice(this.byte_store.allocator) catch {
this.errno = error.OutOfMemory;
this.system_error = bun.sys.Error.fromCode(bun.sys.E.NOMEM, .read).toSystemError();
this.onFinish();
return;
},

View File

@@ -301,10 +301,9 @@ pub fn scheduleBarrelDeferredImports(this: *BundleV2, result: *ParseTask.Result.
// Handle import records without named bindings (not in named_imports).
// - `import "x"` (bare statement): tree-shakeable with sideEffects: false — skip.
// - `require("x")`: synchronous, needs full module — always mark as .all.
// - `import("x")`: mark as .all ONLY if the barrel has no prior requests,
// meaning this is the sole reference. If the barrel already has a .partial
// entry from a static import, the dynamic import is likely a secondary
// (possibly circular) reference and should not escalate requirements.
// - `import("x")`: returns the full module namespace at runtime — consumer
// can destructure or access any export. Must mark as .all. We cannot
// safely assume which exports will be used.
for (file_import_records.slice(), 0..) |ir, idx| {
const target = if (ir.source_index.isValid())
ir.source_index.get()
@@ -319,10 +318,9 @@ pub fn scheduleBarrelDeferredImports(this: *BundleV2, result: *ParseTask.Result.
const gop = try this.requested_exports.getOrPut(this.allocator(), target);
gop.value_ptr.* = .all;
} else if (ir.kind == .dynamic) {
// Only escalate to .all if no prior requests exist for this target.
if (!this.requested_exports.contains(target)) {
try this.requested_exports.put(this.allocator(), target, .all);
}
// import() returns the full module namespace — must preserve all exports.
const gop = try this.requested_exports.getOrPut(this.allocator(), target);
gop.value_ptr.* = .all;
}
}
@@ -354,8 +352,8 @@ pub fn scheduleBarrelDeferredImports(this: *BundleV2, result: *ParseTask.Result.
}
}
// Add bare require/dynamic-import targets to BFS as star imports (matching
// the seeding logic above — require always, dynamic only when sole reference).
// Add bare require/dynamic-import targets to BFS as star imports — both
// always need the full namespace.
for (file_import_records.slice(), 0..) |ir, idx| {
const target = if (ir.source_index.isValid())
ir.source_index.get()
@@ -366,8 +364,7 @@ pub fn scheduleBarrelDeferredImports(this: *BundleV2, result: *ParseTask.Result.
if (ir.flags.is_internal) continue;
if (named_ir_indices.contains(@intCast(idx))) continue;
if (ir.flags.was_originally_bare_import) continue;
const is_all = if (this.requested_exports.get(target)) |re| re == .all else false;
const should_add = ir.kind == .require or (ir.kind == .dynamic and is_all);
const should_add = ir.kind == .require or ir.kind == .dynamic;
if (should_add) {
try queue.append(queue_alloc, .{ .barrel_source_index = target, .alias = "", .is_star = true });
}
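The escalation rule in the hunk above exists because `import()` resolves to the full module namespace, and property access on that namespace can be fully dynamic. A minimal runnable sketch (the `data:` URL stands in for a hypothetical `sideEffects: false` barrel; `read` is an illustrative helper, not Bun API):

```javascript
// Why import() must request all exports: the awaited value is the full
// module namespace object, and the export actually read can be decided
// at runtime — no static analysis at this call site can narrow it.
const barrel = "data:text/javascript,export const a='A';export const b='B';";

async function read(name) {
  const ns = await import(barrel);
  // Nothing here tells a bundler which export is used; `name` is runtime data.
  return ns[name];
}

read("b").then(v => console.log(v)); // reads b even though no static import names it
```

This is exactly the shape that broke before the fix: a static `import { a }` seeded a partial request, and the dynamic consumer's access to `b` was dropped.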

View File

@@ -1207,8 +1207,8 @@ pub const PackCommand = struct {
// maybe otp
}
const package_name_expr: Expr = json.root.get("name") orelse return error.MissingPackageName;
const package_name = try package_name_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageName;
var package_name_expr: Expr = json.root.get("name") orelse return error.MissingPackageName;
var package_name = try package_name_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageName;
if (comptime for_publish) {
const is_scoped = try Dependency.isScopedPackageName(package_name);
if (manager.options.publish_config.access) |access| {
@@ -1220,8 +1220,8 @@ pub const PackCommand = struct {
defer if (comptime !for_publish) ctx.allocator.free(package_name);
if (package_name.len == 0) return error.InvalidPackageName;
const package_version_expr: Expr = json.root.get("version") orelse return error.MissingPackageVersion;
const package_version = try package_version_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageVersion;
var package_version_expr: Expr = json.root.get("version") orelse return error.MissingPackageVersion;
var package_version = try package_version_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageVersion;
defer if (comptime !for_publish) ctx.allocator.free(package_version);
if (package_version.len == 0) return error.InvalidPackageVersion;
@@ -1395,8 +1395,6 @@ pub const PackCommand = struct {
};
// Re-validate private flag after scripts may have modified it.
// Note: The tarball filename uses the original name/version (matching npm behavior),
// but we re-check private to prevent accidentally publishing a now-private package.
if (comptime for_publish) {
if (json.root.get("private")) |private| {
if (private.asBool()) |is_private| {
@@ -1406,6 +1404,16 @@ pub const PackCommand = struct {
}
}
}
// Re-read name and version from the updated package.json, since lifecycle
// scripts (e.g. prepublishOnly, prepack) may have modified them.
package_name_expr = json.root.get("name") orelse return error.MissingPackageName;
package_name = try package_name_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageName;
if (package_name.len == 0) return error.InvalidPackageName;
package_version_expr = json.root.get("version") orelse return error.MissingPackageVersion;
package_version = try package_version_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageVersion;
if (package_version.len == 0) return error.InvalidPackageVersion;
}
// Create the edited package.json content after lifecycle scripts have run

View File

@@ -1700,9 +1700,8 @@ pub fn dumpStackTrace(trace: std.builtin.StackTrace, limits: WriteStackTraceLimi
const programs: []const [:0]const u8 = switch (bun.Environment.os) {
.windows => &.{"pdb-addr2line"},
// if `llvm-symbolizer` doesn't work, also try versioned names:
// `llvm-symbolizer-21` (Debian/Ubuntu) or `llvm21-symbolizer` (Alpine)
else => &.{ "llvm-symbolizer", "llvm-symbolizer-21", "llvm21-symbolizer" },
// if `llvm-symbolizer` doesn't work, also try `llvm-symbolizer-21`
else => &.{ "llvm-symbolizer", "llvm-symbolizer-21" },
};
for (programs) |program| {
var arena = bun.ArenaAllocator.init(bun.default_allocator);

View File

@@ -270,14 +270,14 @@ pub const Mask = struct {
while (true) {
if (image == null) {
if (@call(.auto, @field(Image, "parse"), .{input}).asValue()) |value| {
if (input.tryParse(Image.parse, .{}).asValue()) |value| {
image = value;
continue;
}
}
if (position == null) {
if (Position.parse(input).asValue()) |value| {
if (input.tryParse(Position.parse, .{}).asValue()) |value| {
position = value;
size = input.tryParse(struct {
pub inline fn parseFn(i: *css.Parser) css.Result(BackgroundSize) {
@@ -290,35 +290,35 @@ pub const Mask = struct {
}
if (repeat == null) {
if (BackgroundRepeat.parse(input).asValue()) |value| {
if (input.tryParse(BackgroundRepeat.parse, .{}).asValue()) |value| {
repeat = value;
continue;
}
}
if (origin == null) {
if (GeometryBox.parse(input).asValue()) |value| {
if (input.tryParse(GeometryBox.parse, .{}).asValue()) |value| {
origin = value;
continue;
}
}
if (clip == null) {
if (MaskClip.parse(input).asValue()) |value| {
if (input.tryParse(MaskClip.parse, .{}).asValue()) |value| {
clip = value;
continue;
}
}
if (composite == null) {
if (MaskComposite.parse(input).asValue()) |value| {
if (input.tryParse(MaskComposite.parse, .{}).asValue()) |value| {
composite = value;
continue;
}
}
if (mode == null) {
if (MaskMode.parse(input).asValue()) |value| {
if (input.tryParse(MaskMode.parse, .{}).asValue()) |value| {
mode = value;
continue;
}

View File

@@ -1020,6 +1020,9 @@ pub const AST = struct {
Var: []const u8,
VarArgv: u8,
Text: []const u8,
/// An empty string from a quoted context (e.g. "", '', or ${''}). Preserved as an
/// explicit empty argument during expansion, unlike unquoted empty text which is dropped.
quoted_empty,
asterisk,
double_asterisk,
brace_begin,
@@ -1042,6 +1045,7 @@ pub const AST = struct {
.Var => false,
.VarArgv => false,
.Text => false,
.quoted_empty => false,
.asterisk => true,
.double_asterisk => true,
.brace_begin => false,
@@ -1845,6 +1849,9 @@ pub const Parser = struct {
if (txt.len > 0) {
try atoms.append(.{ .Text = txt });
}
} else if (txt.len == 0 and (peeked == .SingleQuotedText or peeked == .DoubleQuotedText)) {
// Preserve empty quoted strings ("", '') as explicit empty arguments
try atoms.append(.quoted_empty);
} else {
try atoms.append(.{ .Text = txt });
}
@@ -2794,10 +2801,12 @@ pub fn NewLexer(comptime encoding: StringEncoding) type {
comptime assertSpecialChar('\'');
if (self.chars.state == .Single) {
try self.break_word(false);
self.chars.state = .Normal;
continue;
}
if (self.chars.state == .Normal) {
try self.break_word(false);
self.chars.state = .Single;
continue;
}
@@ -2893,9 +2902,12 @@ pub fn NewLexer(comptime encoding: StringEncoding) type {
}
inline fn isImmediatelyEscapedQuote(self: *@This()) bool {
return (self.chars.state == .Double and
return ((self.chars.state == .Double and
(self.chars.current != null and !self.chars.current.?.escaped and self.chars.current.?.char == '"') and
(self.chars.prev != null and !self.chars.prev.?.escaped and self.chars.prev.?.char == '"'));
(self.chars.prev != null and !self.chars.prev.?.escaped and self.chars.prev.?.char == '"')) or
(self.chars.state == .Single and
(self.chars.current != null and !self.chars.current.?.escaped and self.chars.current.?.char == '\'') and
(self.chars.prev != null and !self.chars.prev.?.escaped and self.chars.prev.?.char == '\'')));
}
fn break_word_impl(self: *@This(), add_delimiter: bool, in_normal_space: bool, in_operator: bool) !void {
@@ -3234,6 +3246,15 @@ pub fn NewLexer(comptime encoding: StringEncoding) type {
}
fn handleJSStringRef(self: *@This(), bunstr: bun.String) !void {
if (bunstr.length() == 0) {
// Empty JS string ref: emit a zero-length DoubleQuotedText token directly.
// The parser converts this to a quoted_empty atom, preserving the empty arg.
// This works regardless of the lexer's current quote state (Normal/Single/Double)
// because the \x08 marker is processed before quote-state handling.
const pos = self.j;
try self.tokens.append(@unionInit(Token, "DoubleQuotedText", .{ .start = pos, .end = pos }));
return;
}
try self.appendStringToStrPool(bunstr);
}
@@ -4068,6 +4089,13 @@ pub const ShellSrcBuilder = struct {
) bun.OOM!bool {
const invalid = (bunstr.isUTF16() and !bun.simdutf.validate.utf16le(bunstr.utf16())) or (bunstr.isUTF8() and !bun.simdutf.validate.utf8(bunstr.byteSlice()));
if (invalid) return false;
// Empty interpolated values must still produce an argument (e.g. `${''}` should
// pass "" as an arg). Route through appendJSStrRef so the \x08 marker is recognized
// by the lexer regardless of quote context (e.g. inside single quotes).
if (allow_escape and bunstr.length() == 0) {
try this.appendJSStrRef(bunstr);
return true;
}
if (allow_escape) {
if (needsEscapeBunstr(bunstr)) {
try this.appendJSStrRef(bunstr);

View File

@@ -168,8 +168,10 @@ fn commandImplStart(this: *CondExpr) Yield {
.@"-d",
.@"-f",
=> {
// Empty string expansion produces no args; the path doesn't exist.
if (this.args.items.len == 0) return this.parent.childDone(this, 1);
// Empty string expansion produces no args, or the path is an empty string;
// the path doesn't exist. On Windows, stat("") can succeed and return the
// cwd's stat, so we must check for empty paths explicitly.
if (this.args.items.len == 0 or this.args.items[0].len == 0) return this.parent.childDone(this, 1);
this.state = .waiting_stat;
return this.doStat();
},

View File

@@ -36,6 +36,9 @@ child_state: union(enum) {
out_exit_code: ExitCode = 0,
out: Result,
out_idx: u32,
/// Set when the word contains a quoted_empty atom, indicating that an empty
/// result should still be preserved as an argument (POSIX: `""` produces an empty arg).
has_quoted_empty: bool = false,
pub const ParentPtr = StatePtrUnion(.{
Cmd,
@@ -193,6 +196,9 @@ pub fn next(this: *Expansion) Yield {
bun.handleOom(this.current_out.insert(0, '~'));
},
}
} else if (this.has_quoted_empty) {
// ~"" or ~'' should expand to the home directory
bun.handleOom(this.current_out.appendSlice(homedir.slice()));
}
}
@@ -586,6 +592,11 @@ pub fn expandSimpleNoIO(this: *Expansion, atom: *const ast.SimpleAtom, str_list:
.Text => |txt| {
bun.handleOom(str_list.appendSlice(txt));
},
.quoted_empty => {
// A quoted empty string ("", '', or ${''}). We must ensure the word
// is not dropped by pushCurrentOut, so mark it with a flag.
this.has_quoted_empty = true;
},
.Var => |label| {
bun.handleOom(str_list.appendSlice(this.expandVar(label)));
},
@@ -630,8 +641,8 @@ pub fn appendSlice(this: *Expansion, buf: *std.array_list.Managed(u8), slice: []
}
pub fn pushCurrentOut(this: *Expansion) void {
if (this.current_out.items.len == 0) return;
if (this.current_out.items[this.current_out.items.len - 1] != 0) bun.handleOom(this.current_out.append(0));
if (this.current_out.items.len == 0 and !this.has_quoted_empty) return;
if (this.current_out.items.len == 0 or this.current_out.items[this.current_out.items.len - 1] != 0) bun.handleOom(this.current_out.append(0));
switch (this.out.pushResult(&this.current_out)) {
.copied => {
this.current_out.clearRetainingCapacity();
@@ -709,6 +720,7 @@ fn expansionSizeHint(this: *const Expansion, atom: *const ast.Atom, has_unknown:
fn expansionSizeHintSimple(this: *const Expansion, simple: *const ast.SimpleAtom, has_unknown: *bool) usize {
return switch (simple.*) {
.Text => |txt| txt.len,
.quoted_empty => 0,
.Var => |label| this.expandVar(label).len,
.VarArgv => |int| this.expandVarArgv(int).len,
.brace_begin, .brace_end, .comma, .asterisk => 1,
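The `has_quoted_empty` flag and the `pushCurrentOut` change above implement one POSIX rule: an empty expansion result is dropped, unless the word contained a quoted empty (`""`, `''`, or `${''}`), which must survive as an explicit empty argument. A minimal model (hypothetical `pushArg` helper, not the Zig code):

```javascript
// Word expansion result: a text plus a flag recording whether the word
// contained a quoted-empty atom.
function pushArg(args, text, hasQuotedEmpty) {
  // Empty result with no quoted-empty atom: the whole word is dropped,
  // as with an unset variable like $UNSET.
  if (text.length === 0 && !hasQuotedEmpty) return;
  args.push(text); // quoted empty survives as an explicit "" argument
}

const args = [];
pushArg(args, "", false);         // $UNSET        → dropped
pushArg(args, "", true);          // "" or ${''}   → kept as empty arg
pushArg(args, "file.txt", false); // normal word   → kept
console.log(args.length, JSON.stringify(args));
```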

View File

@@ -26,6 +26,7 @@ pub const RedisError = error{
UnsupportedProtocol,
ConnectionTimeout,
IdleTimeout,
NestingDepthExceeded,
};
pub fn valkeyErrorToJS(globalObject: *jsc.JSGlobalObject, message: ?[]const u8, err: RedisError) jsc.JSValue {
@@ -55,6 +56,7 @@ pub fn valkeyErrorToJS(globalObject: *jsc.JSGlobalObject, message: ?[]const u8,
error.InvalidResponseType => .REDIS_INVALID_RESPONSE_TYPE,
error.ConnectionTimeout => .REDIS_CONNECTION_TIMEOUT,
error.IdleTimeout => .REDIS_IDLE_TIMEOUT,
error.NestingDepthExceeded => .REDIS_INVALID_RESPONSE,
error.JSError => return globalObject.takeException(error.JSError),
error.OutOfMemory => globalObject.throwOutOfMemory() catch return globalObject.takeException(error.JSError),
error.JSTerminated => return globalObject.takeException(error.JSTerminated),
@@ -420,7 +422,16 @@ pub const ValkeyReader = struct {
};
}
/// Maximum allowed nesting depth for RESP aggregate types.
/// This limits recursion to prevent excessive stack usage from
/// deeply nested responses.
const max_nesting_depth = 128;
pub fn readValue(self: *ValkeyReader, allocator: std.mem.Allocator) RedisError!RESPValue {
return self.readValueWithDepth(allocator, 0);
}
fn readValueWithDepth(self: *ValkeyReader, allocator: std.mem.Allocator, depth: usize) RedisError!RESPValue {
const type_byte = try self.readByte();
return switch (RESPType.fromByte(type_byte) orelse return error.InvalidResponseType) {
@@ -451,6 +462,7 @@ pub const ValkeyReader = struct {
return RESPValue{ .BulkString = owned };
},
.Array => {
if (depth >= max_nesting_depth) return error.NestingDepthExceeded;
const len = try self.readInteger();
if (len < 0) return RESPValue{ .Array = &[_]RESPValue{} };
const array = try allocator.alloc(RESPValue, @as(usize, @intCast(len)));
@@ -462,7 +474,7 @@ pub const ValkeyReader = struct {
}
}
while (i < len) : (i += 1) {
array[i] = try self.readValue(allocator);
array[i] = try self.readValueWithDepth(allocator, depth + 1);
}
return RESPValue{ .Array = array };
},
@@ -495,6 +507,7 @@ pub const ValkeyReader = struct {
return RESPValue{ .VerbatimString = try self.readVerbatimString(allocator) };
},
.Map => {
if (depth >= max_nesting_depth) return error.NestingDepthExceeded;
const len = try self.readInteger();
if (len < 0) return error.InvalidMap;
@@ -508,11 +521,15 @@ pub const ValkeyReader = struct {
}
while (i < len) : (i += 1) {
entries[i] = .{ .key = try self.readValue(allocator), .value = try self.readValue(allocator) };
var key = try self.readValueWithDepth(allocator, depth + 1);
errdefer key.deinit(allocator);
const value = try self.readValueWithDepth(allocator, depth + 1);
entries[i] = .{ .key = key, .value = value };
}
return RESPValue{ .Map = entries };
},
.Set => {
if (depth >= max_nesting_depth) return error.NestingDepthExceeded;
const len = try self.readInteger();
if (len < 0) return error.InvalidSet;
@@ -525,11 +542,12 @@ pub const ValkeyReader = struct {
}
}
while (i < len) : (i += 1) {
set[i] = try self.readValue(allocator);
set[i] = try self.readValueWithDepth(allocator, depth + 1);
}
return RESPValue{ .Set = set };
},
.Attribute => {
if (depth >= max_nesting_depth) return error.NestingDepthExceeded;
const len = try self.readInteger();
if (len < 0) return error.InvalidAttribute;
@@ -542,9 +560,9 @@ pub const ValkeyReader = struct {
}
}
while (i < len) : (i += 1) {
var key = try self.readValue(allocator);
var key = try self.readValueWithDepth(allocator, depth + 1);
errdefer key.deinit(allocator);
const value = try self.readValue(allocator);
const value = try self.readValueWithDepth(allocator, depth + 1);
attrs[i] = .{ .key = key, .value = value };
}
@@ -553,7 +571,7 @@ pub const ValkeyReader = struct {
errdefer {
allocator.destroy(value_ptr);
}
value_ptr.* = try self.readValue(allocator);
value_ptr.* = try self.readValueWithDepth(allocator, depth + 1);
return RESPValue{ .Attribute = .{
.attributes = attrs,
@@ -561,11 +579,13 @@ pub const ValkeyReader = struct {
} };
},
.Push => {
if (depth >= max_nesting_depth) return error.NestingDepthExceeded;
const len = try self.readInteger();
if (len < 0 or len == 0) return error.InvalidPush;
// First element is the push type
const push_type = try self.readValue(allocator);
var push_type = try self.readValueWithDepth(allocator, depth + 1);
defer push_type.deinit(allocator);
var push_type_str: []const u8 = "";
switch (push_type) {
@@ -594,7 +614,7 @@ pub const ValkeyReader = struct {
}
}
while (i < len - 1) : (i += 1) {
data[i] = try self.readValue(allocator);
data[i] = try self.readValueWithDepth(allocator, depth + 1);
}
return RESPValue{ .Push = .{
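The depth threading in the hunks above can be sketched independently of the Zig reader. A minimal depth-limited RESP parser over a pre-split line array (a hypothetical simplification; the real `ValkeyReader` works over a byte stream and handles many more types):

```javascript
// Cap recursion on aggregate types so a hostile peer cannot force
// unbounded stack growth with deeply nested arrays.
const MAX_NESTING_DEPTH = 128;

function readValue(lines, depth = 0) {
  const line = lines.shift();
  if (line[0] === "*") { // array header, e.g. "*2"
    if (depth >= MAX_NESTING_DEPTH) throw new Error("NestingDepthExceeded");
    const len = Number(line.slice(1));
    const out = [];
    for (let i = 0; i < len; i++) out.push(readValue(lines, depth + 1));
    return out;
  }
  if (line[0] === ":") return Number(line.slice(1)); // integer
  if (line[0] === "+") return line.slice(1);         // simple string
  throw new Error("InvalidResponseType");
}

// 200 nested "*1" headers are rejected instead of recursing 200 frames deep:
const hostile = Array(200).fill("*1").concat(":1");
try { readValue(hostile); } catch (e) { console.log(e.message); }
```

The key detail, as in the diff, is that every recursive call goes through the depth-aware entry point with `depth + 1`; a single call site left on the depth-0 wrapper would reset the counter and defeat the limit.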

View File

@@ -212,6 +212,7 @@ pub fn read(this: *INotifyWatcher) bun.sys.Maybe([]const *align(1) Event) {
}
}
this.read_ptr = null;
return .{ .result = this.eventlist_ptrs[0..count] };
}

View File

@@ -541,7 +541,9 @@ describe("bundler", () => {
});
// --- Ported from Rolldown: dynamic-import-entry ---
// A submodule dynamically imports the barrel back
// A submodule dynamically imports the barrel back. import() returns the full
// module namespace — all barrel exports must be preserved, even if the
// import() result is discarded (we can't statically prove it isn't used).
itBundled("barrel/DynamicImportInSubmodule", {
files: {
@@ -562,17 +564,103 @@ describe("bundler", () => {
export const a = 'dyn-a';
import('./index.js');
`,
// b.js has a syntax error — only a is imported, so b should be skipped
"/node_modules/dynlib/b.js": /* js */ `
export const b = <<<SYNTAX_ERROR>>>;
export const b = 'dyn-b';
`,
},
outdir: "/out",
onAfterBundle(api) {
api.expectFile("/out/entry.js").toContain("dyn-a");
// b must be included — import() needs the full namespace
api.expectFile("/out/entry.js").toContain("dyn-b");
},
});
// Dynamic import returns the full namespace at runtime — consumer can access any export.
// When a file also has a static named import of the same barrel, the barrel
// optimization must not drop exports the dynamic import might use.
// Previously, the dynamic import was ignored if a static import already seeded
// requested_exports, producing invalid JS (export clause referencing undeclared symbol).
itBundled("barrel/DynamicImportWithStaticImportSameTarget", {
files: {
"/entry.js": /* js */ `
import { a } from "barrel";
console.log(a);
const run = async () => {
const { b } = await import("barrel");
console.log(b);
};
run();
`,
"/node_modules/barrel/package.json": JSON.stringify({
name: "barrel",
main: "./index.js",
sideEffects: false,
}),
"/node_modules/barrel/index.js": /* js */ `
export { a } from "./a.js";
export { b } from "./b.js";
`,
"/node_modules/barrel/a.js": /* js */ `
export const a = "A";
`,
"/node_modules/barrel/b.js": /* js */ `
export const b = "B";
`,
},
splitting: true,
format: "esm",
target: "bun",
outdir: "/out",
run: {
stdout: "A\nB",
},
});
// Same as above but static and dynamic importers are in separate files.
// This was parse-order dependent — if the static importer's
// scheduleBarrelDeferredImports ran first, it seeded .partial and the dynamic
// importer's escalation was skipped. Now import() always escalates to .all.
itBundled("barrel/DynamicImportWithStaticImportSeparateFiles", {
files: {
"/static-user.js": /* js */ `
import { a } from "barrel2";
console.log(a);
`,
"/dynamic-user.js": /* js */ `
const run = async () => {
const { b } = await import("barrel2");
console.log(b);
};
run();
`,
"/node_modules/barrel2/package.json": JSON.stringify({
name: "barrel2",
main: "./index.js",
sideEffects: false,
}),
"/node_modules/barrel2/index.js": /* js */ `
export { a } from "./a.js";
export { b } from "./b.js";
`,
"/node_modules/barrel2/a.js": /* js */ `
export const a = "A";
`,
"/node_modules/barrel2/b.js": /* js */ `
export const b = "B";
`,
},
entryPoints: ["/static-user.js", "/dynamic-user.js"],
splitting: true,
format: "esm",
target: "bun",
outdir: "/out",
run: [
{ file: "/out/static-user.js", stdout: "A" },
{ file: "/out/dynamic-user.js", stdout: "B" },
],
});
// --- Ported from Rolldown: multiple-entries ---
// Multiple entry points that each import different things from barrels

View File

@@ -0,0 +1,44 @@
import { describe, expect } from "bun:test";
import { itBundled } from "../expectBundled";
describe("css", () => {
itBundled("css/mask-geometry-box-preserved", {
files: {
"index.css": /* css */ `
.test-a::after {
mask: linear-gradient(#fff 0 0) padding-box, linear-gradient(#fff 0 0);
}
.test-b::after {
mask: linear-gradient(#fff 0 0) content-box, linear-gradient(#fff 0 0);
}
`,
},
outdir: "/out",
entryPoints: ["/index.css"],
onAfterBundle(api) {
const output = api.readFile("/out/index.css");
expect(output).toContain("padding-box");
expect(output).toContain("content-box");
expect(output).toContain(".test-a");
expect(output).toContain(".test-b");
expect(output).not.toContain(".test-a:after, .test-b:after");
},
});
itBundled("css/webkit-mask-geometry-box-preserved", {
files: {
"index.css": /* css */ `
.test-c::after {
-webkit-mask: linear-gradient(#fff 0 0) padding-box, linear-gradient(#fff 0 0);
-webkit-mask-composite: xor;
}
`,
},
outdir: "/out",
entryPoints: ["/index.css"],
onAfterBundle(api) {
const output = api.readFile("/out/index.css");
expect(output).toContain("padding-box");
},
});
});

View File

@@ -664,6 +664,42 @@ tarball: \${fs.existsSync("pack-lifecycle-order-1.1.1.tgz")}\`)`;
]);
});
test("lifecycle script modifying version updates tarball filename (#17195)", async () => {
const updateScript = `const fs = require("fs");
const pkg = JSON.parse(fs.readFileSync("package.json", "utf8"));
pkg.version = "2.0.0-snapshot.test";
fs.writeFileSync("package.json", JSON.stringify(pkg, null, 2));`;
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-version-update",
version: "1.0.0",
scripts: {
prepack: `${bunExe()} update-version.js`,
},
}),
),
write(join(packageDir, "update-version.js"), updateScript),
write(join(packageDir, "index.js"), "module.exports = {};"),
]);
await pack(packageDir, bunEnv);
// The tarball filename should use the UPDATED version
expect(await exists(join(packageDir, "pack-version-update-2.0.0-snapshot.test.tgz"))).toBeTrue();
// The old version tarball should NOT exist
expect(await exists(join(packageDir, "pack-version-update-1.0.0.tgz"))).toBeFalse();
const tarball = readTarball(join(packageDir, "pack-version-update-2.0.0-snapshot.test.tgz"));
expect(tarball.entries).toMatchObject([
{ "pathname": "package/package.json" },
{ "pathname": "package/index.js" },
{ "pathname": "package/update-version.js" },
]);
});
describe("bundledDependencies", () => {
for (const bundledDependencies of ["bundledDependencies", "bundleDependencies"]) {
test(`basic (${bundledDependencies})`, async () => {

View File

@@ -652,6 +652,43 @@ postpack: \${fs.existsSync("postpack.txt")}\`)`;
});
});
test("prepublishOnly modifying version publishes correct version (#17195)", async () => {
const { packageDir, packageJson } = await registry.createTestDir();
const bunfig = await registry.authBunfig("version-update");
const updateScript = `const fs = require("fs");
const pkg = JSON.parse(fs.readFileSync("package.json", "utf8"));
pkg.version = "9.9.9";
fs.writeFileSync("package.json", JSON.stringify(pkg, null, 2));`;
await Promise.all([
rm(join(registry.packagesPath, "publish-version-update"), { recursive: true, force: true }),
write(
packageJson,
JSON.stringify({
name: "publish-version-update",
version: "1.0.0",
scripts: {
prepublishOnly: `${bunExe()} update-version.js`,
},
dependencies: {
"publish-version-update": "9.9.9",
},
}),
),
write(join(packageDir, "update-version.js"), updateScript),
write(join(packageDir, "bunfig.toml"), bunfig),
]);
const { out, err, exitCode } = await publish(env, packageDir);
expect(err).not.toContain("error:");
expect(exitCode).toBe(0);
// Should be able to install the package at the UPDATED version (9.9.9), not the original (1.0.0)
await runBunInstall(env, packageDir);
const installedPkg = await file(join(packageDir, "node_modules", "publish-version-update", "package.json")).json();
expect(installedPkg.version).toBe("9.9.9");
});
test("attempting to publish a private package should fail", async () => {
const { packageDir, packageJson } = await registry.createTestDir();
const bunfig = await registry.authBunfig("privatepackage");

File diff suppressed because it is too large

View File

@@ -1,14 +1,14 @@
{
"$schema": "https://shadcn-svelte.com/schema.json",
"style": "default",
"tailwind": {
"config": "tailwind.config.ts",
"css": "app/app.css",
"baseColor": "slate"
},
"aliases": {
"components": "$lib/components",
"utils": "$lib/utils"
},
"typescript": true
"$schema": "https://shadcn-svelte.com/schema.json",
"style": "default",
"tailwind": {
"config": "tailwind.config.ts",
"css": "src/app.css",
"baseColor": "slate"
},
"aliases": {
"components": "$lib/components",
"utils": "$lib/utils"
},
"typescript": true
}

View File

@@ -1,127 +1,127 @@
{
"name": "bun-issue",
"version": "0.0.1",
"private": true,
"scripts": {
"prepare": "svelte-kit sync",
"dev": "bunx --bun vite dev",
"build": "bunx --bun vite build",
"preview": "bunx --bun vite preview",
"start": "bun ./serve.ts",
"cluster": "bun ./cluster.ts",
"check": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json",
"check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
"lint": "prettier --check . && eslint .",
"format": "prettier --write ."
},
"devDependencies": {
"@sveltejs/adapter-auto": "^3.3.1",
"@sveltejs/adapter-node": "^5.2.12",
"@sveltejs/kit": "2.11",
"@sveltejs/vite-plugin-svelte": "^3.1.2",
"@types/bun": "^1.1.14",
"@types/capitalize": "^2.0.2",
"@types/cli-progress": "^3.11.6",
"@types/eslint": "^8.56.12",
"@types/natural-compare-lite": "^1.4.2",
"@types/node": "^22.10.2",
"@types/nodemailer": "^6.4.17",
"@types/pg": "^8.11.10",
"@types/prompts": "^2.4.9",
"@types/ramda": "^0.30.2",
"@types/string-pixel-width": "^1.10.3",
"@types/tmp": "^0.2.6",
"@types/xml2js": "^0.4.14",
"autoprefixer": "^10.4.20",
"eslint": "^9.17.0",
"eslint-config-prettier": "^9.1.0",
"eslint-plugin-import": "^2.31.0",
"eslint-plugin-perfectionist": "^3.9.1",
"eslint-plugin-svelte": "^2.46.1",
"globals": "^15.14.0",
"kysely-codegen": "^0.16.8",
"kysely-ctl": "^0.9.0",
"make-vfs": "^1.0.15",
"p-limit": "^6.1.0",
"postcss": "^8.4.49",
"prettier": "^3.4.2",
"prettier-plugin-organize-imports": "^4.1.0",
"prettier-plugin-svelte": "^3.3.2",
"prettier-plugin-tailwindcss": "^0.6.9",
"svelte": "^4.2.19",
"svelte-check": "^3.8.6",
"tailwindcss": "3.4.17",
"tslib": "^2.8.1",
"typescript": "^5.7.2",
"typescript-eslint": "^8.18.1"
},
"type": "module",
"dependencies": {
"vite": "^5.4.11",
"@ethercorps/sveltekit-og": "^3.0.0",
"@internationalized/date": "^3.6.0",
"@lucia-auth/adapter-postgresql": "^3.1.2",
"@lucia-auth/adapter-sqlite": "^3.0.2",
"@tailwindcss/container-queries": "^0.1.1",
"@types/cookie": "^0.6.0",
"bits-ui": "^0.21.16",
"camelcase-keys": "^9.1.3",
"capitalize": "^2.0.4",
"chalk": "^5.3.0",
"cli-progress": "^3.12.0",
"clsx": "^2.1.1",
"cmdk-sv": "^0.0.18",
"commander": "^12.1.0",
"confbox": "^0.1.8",
"cookie": "^0.6.0",
"dayjs": "^1.11.13",
"dedent": "^1.5.3",
"dotenv": "^16.4.7",
"fflate": "^0.8.2",
"formsnap": "^1.0.1",
"fuzzball": "^2.1.3",
"kysely": "^0.27.5",
"kysely-bun-sqlite": "^0.3.2",
"lucia": "^3.2.2",
"lucide-svelte": "^0.454.0",
"magic-bytes.js": "^1.10.0",
"natural-compare-lite": "^1.4.0",
"node-addon-api": "^8.3.0",
"node-gyp": "^10.3.1",
"node-html-parser": "^7.0.1",
"node-stream-zip": "^1.15.0",
"nodemailer": "^6.9.16",
"p-map": "^7.0.3",
"p-queue": "^8.0.1",
"pg": "^8.13.1",
"pretty-bytes": "^6.1.1",
"prompts": "^2.4.2",
"ramda": "^0.30.1",
"sharp": "^0.33.5",
"slugify": "^1.6.6",
"streamsaver": "^2.0.6",
"string-pixel-width": "^1.11.0",
"strip-ansi": "^7.1.0",
"svelte-dnd-action": "^0.9.53",
"svelte-meta-tags": "^3.1.4",
"svelte-sonner": "^0.3.28",
"sveltekit-superforms": "^2.22.1",
"tailwind-merge": "^2.5.5",
"tailwind-variants": "^0.2.1",
"tmp": "^0.2.3",
"ts-pattern": "^5.6.0",
"type-fest": "^4.30.2",
"windows-1252": "^3.0.4",
"xml2js": "^0.6.2",
"yaml": "^2.6.1",
"zod": "^3.24.1",
"zod-urlsearchparams": "^0.0.14"
},
"trustedDependencies": [
"@sveltejs/kit",
"esbuild",
"sharp",
"svelte-adapter-bun",
"svelte-preprocess"
]
"name": "bun-issue",
"version": "0.0.1",
"private": true,
"scripts": {
"prepare": "svelte-kit sync",
"dev": "bunx --bun vite dev",
"build": "bunx --bun vite build",
"preview": "bunx --bun vite preview",
"start": "bun ./serve.ts",
"cluster": "bun ./cluster.ts",
"check": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json",
"check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
"lint": "prettier --check . && eslint .",
"format": "prettier --write ."
},
"devDependencies": {
"@sveltejs/adapter-auto": "^7.0.1",
"@sveltejs/adapter-node": "^5.5.4",
"@sveltejs/kit": "^2.53.4",
"@sveltejs/vite-plugin-svelte": "^6.2.4",
"@types/bun": "^1.1.14",
"@types/capitalize": "^2.0.2",
"@types/cli-progress": "^3.11.6",
"@types/eslint": "^8.56.12",
"@types/natural-compare-lite": "^1.4.2",
"@types/node": "^22.10.2",
"@types/nodemailer": "^6.4.17",
"@types/pg": "^8.11.10",
"@types/prompts": "^2.4.9",
"@types/ramda": "^0.30.2",
"@types/string-pixel-width": "^1.10.3",
"@types/tmp": "^0.2.6",
"@types/xml2js": "^0.4.14",
"autoprefixer": "^10.4.20",
"eslint": "^9.17.0",
"eslint-config-prettier": "^9.1.0",
"eslint-plugin-import": "^2.31.0",
"eslint-plugin-perfectionist": "^3.9.1",
"eslint-plugin-svelte": "^2.46.1",
"globals": "^15.14.0",
"kysely-codegen": "^0.16.8",
"kysely-ctl": "^0.9.0",
"make-vfs": "^1.0.15",
"p-limit": "^6.1.0",
"postcss": "^8.4.49",
"prettier": "^3.4.2",
"prettier-plugin-organize-imports": "^4.1.0",
"prettier-plugin-svelte": "^3.3.2",
"prettier-plugin-tailwindcss": "^0.6.9",
"svelte": "^5.0.0",
"svelte-check": "^4.4.4",
"tailwindcss": "3.4.17",
"tslib": "^2.8.1",
"typescript": "^5.7.2",
"typescript-eslint": "^8.18.1"
},
"type": "module",
"dependencies": {
"vite": "npm:rolldown-vite@^7.3.1",
"@ethercorps/sveltekit-og": "^3.0.0",
"@internationalized/date": "^3.6.0",
"@lucia-auth/adapter-postgresql": "^3.1.2",
"@lucia-auth/adapter-sqlite": "^3.0.2",
"@tailwindcss/container-queries": "^0.1.1",
"@types/cookie": "^0.6.0",
"bits-ui": "^0.21.16",
"camelcase-keys": "^9.1.3",
"capitalize": "^2.0.4",
"chalk": "^5.3.0",
"cli-progress": "^3.12.0",
"clsx": "^2.1.1",
"cmdk-sv": "^0.0.18",
"commander": "^12.1.0",
"confbox": "^0.1.8",
"cookie": "^0.6.0",
"dayjs": "^1.11.13",
"dedent": "^1.5.3",
"dotenv": "^16.4.7",
"fflate": "^0.8.2",
"formsnap": "^1.0.1",
"fuzzball": "^2.1.3",
"kysely": "^0.27.5",
"kysely-bun-sqlite": "^0.3.2",
"lucia": "^3.2.2",
"lucide-svelte": "^0.454.0",
"magic-bytes.js": "^1.10.0",
"natural-compare-lite": "^1.4.0",
"node-addon-api": "^8.3.0",
"node-gyp": "^10.3.1",
"node-html-parser": "^7.0.1",
"node-stream-zip": "^1.15.0",
"nodemailer": "^6.9.16",
"p-map": "^7.0.3",
"p-queue": "^8.0.1",
"pg": "^8.13.1",
"pretty-bytes": "^6.1.1",
"prompts": "^2.4.2",
"ramda": "^0.30.1",
"sharp": "^0.33.5",
"slugify": "^1.6.6",
"streamsaver": "^2.0.6",
"string-pixel-width": "^1.11.0",
"strip-ansi": "^7.1.0",
"svelte-dnd-action": "^0.9.53",
"svelte-meta-tags": "^3.1.4",
"svelte-sonner": "^0.3.28",
"sveltekit-superforms": "^2.22.1",
"tailwind-merge": "^2.5.5",
"tailwind-variants": "^0.2.1",
"tmp": "^0.2.3",
"ts-pattern": "^5.6.0",
"type-fest": "^4.30.2",
"windows-1252": "^3.0.4",
"xml2js": "^0.6.2",
"yaml": "^2.6.1",
"zod": "^3.24.1",
"zod-urlsearchparams": "^0.0.14"
},
"trustedDependencies": [
"@sveltejs/kit",
"esbuild",
"sharp",
"svelte-adapter-bun",
"svelte-preprocess"
]
}

Some files were not shown because too many files have changed in this diff.