Compare commits

...

262 Commits

Author SHA1 Message Date
Claude Bot
24395f24ed Merge remote-tracking branch 'origin/main' into pr-25604 2025-12-24 06:48:53 +00:00
Aiden Cline
822d75a380 fix(@types/bun): add missing autoloadTsconfig and autoloadPackageJson types (#25501)
### What does this PR do?

Adds the missing types and fixes a typo.

### How did you verify your code works?

Added the missing types from:
https://github.com/oven-sh/bun/pull/25340/changes

---------

Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-24 06:47:07 +00:00
SUZUKI Sosuke
bffccf3d5f Upgrade WebKit 2025/12/07 (#25429)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
2025-12-23 22:24:18 -08:00
robobun
0300150324 docs: fix incorrect [env] section documentation in bunfig.toml (#25634)
## Summary
- Fixed documentation that incorrectly claimed you could use `[env]` as
a TOML section to set environment variables directly
- The `env` option in bunfig.toml only controls whether automatic `.env`
file loading is disabled (via `env = false`)
- Updated to show the correct approaches: using preload scripts or
`.env` files with `--env-file`
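
A minimal sketch of the preload-script approach mentioned above, assuming the
script is registered via `preload` in bunfig.toml or `--preload` (the file name
and variables are illustrative):

```ts
// preload.ts — registered via `preload = ["./preload.ts"]` in bunfig.toml (illustrative)
// Set defaults only when the environment does not already provide a value.
process.env.MY_FLAG ??= "enabled";
process.env.API_URL ??= "http://localhost:3000";
```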

## Test plan
- Documentation-only change, no code changes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-23 15:31:12 -08:00
robobun
34a1e2adad fix: use LLVM unstable repo for Debian Trixie (13) (#25657)
## Summary

- Fix LLVM installation on Debian Trixie (13) by using the unstable
repository from apt.llvm.org

The `llvm.sh` script doesn't automatically detect that Debian Trixie
needs to use the unstable repository. This is because trixie's `VERSION`
is `13 (trixie)` rather than `testing`, and apt.llvm.org doesn't have a
dedicated trixie repository.

Without this fix, the LLVM installation falls back to Debian's main
repository packages, which don't include `libclang-rt-19-dev` (the
compiler-rt sanitizer libraries) by default. This causes builds with
ASan (AddressSanitizer) to fail with:

```
ld.lld: error: cannot open /usr/lib/llvm-19/lib/clang/19/lib/x86_64-pc-linux-gnu/libclang_rt.asan.a: No such file or directory
```

This was breaking the [Daily Docker
Build](https://github.com/oven-sh/bun-development-docker-image/actions/runs/20437290601)
in the bun-development-docker-image repo.

## Test plan

- [ ] Wait for the PR CI to pass
- [ ] After merging, the next Daily Docker Build should succeed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-22 23:10:15 -08:00
robobun
8484e1b827 perf: shrink ConcurrentTask from 24 bytes to 16 bytes (#25636) 2025-12-22 12:07:24 -08:00
robobun
3898ed5e3f perf: pack boolean flags and reorder fields to reduce struct padding (#25627) 2025-12-21 17:12:42 -08:00
Jarred Sumner
c08ffadf56 perf(linux): add memfd optimizations and typed flags (#25597)
## Summary

- Add `MemfdFlags` enum to replace raw integer flags for `memfd_create`,
providing semantic clarity for different use cases (`executable`,
`non_executable`, `cross_process`)
- Add support for `MFD_EXEC` and `MFD_NOEXEC_SEAL` flags (Linux 6.3+)
with automatic fallback to older kernel flags when `EINVAL` is returned
- Use memfd + `/proc/self/fd/{fd}` path for loading embedded `.node`
files in standalone builds, avoiding disk writes entirely on Linux

## Test plan

- [ ] Verify standalone builds with embedded `.node` files work on Linux
- [ ] Verify fallback works on older kernels (pre-6.3)
- [ ] Verify subprocess stdio memfd still works correctly

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-19 23:18:21 -08:00
Dylan Conway
fa983247b2 fix(create): crash when running postinstall task with --no-install (#25616)
## Summary
- Fix segmentation fault in `bun create` when using `--no-install` with
a template that has a `bun-create.postinstall` task starting with "bun "
- The bug was caused by unconditionally slicing `argv[2..]` which
created an empty array when `npm_client` was null
- Added check for `npm_client != null` before slicing

## Reproduction
```bash
# Create template with bun-create.postinstall
mkdir -p ~/.bun-create/test-template
echo '{"name":"test","bun-create":{"postinstall":"bun install"}}' > ~/.bun-create/test-template/package.json

# This would crash before the fix
bun create test-template /tmp/my-app --no-install
```

## Test plan
- [x] Verified the reproduction case crashes before the fix
- [x] Verified the reproduction case works after the fix

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-19 23:17:51 -08:00
Dylan Conway
99b0a16c33 fix: prevent out-of-bounds access in NO_PROXY parsing (#25617)
## Summary
- Fix out-of-bounds access when parsing `NO_PROXY` environment variable
with empty entries
- Empty entries (e.g., `"localhost, , example.com"`) would cause a panic
when checking if the host starts with a dot
- Skip empty entries after trimming whitespace
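
An illustrative TypeScript sketch of the corrected parsing (the real fix lives
in Bun's native HTTP client; the matching rules shown here are simplified):

```ts
function shouldBypassProxy(host: string, noProxy: string): boolean {
  for (const raw of noProxy.split(",")) {
    const entry = raw.trim();
    if (entry.length === 0) continue; // the fix: skip empty entries before inspecting entry[0]
    if (entry.startsWith(".")) {
      if (host.endsWith(entry)) return true; // ".example.com" matches sub.example.com
    } else if (host === entry) {
      return true;
    }
  }
  return false;
}

shouldBypassProxy("example.com", "localhost, , example.com"); // previously tripped on the empty entry
```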

fixes BUN-110G
fixes BUN-128V

## Test plan
- [x] Verify `NO_PROXY="localhost, , example.com"` no longer crashes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-19 23:17:29 -08:00
Dylan Conway
085e25d5d1 fix: protect StringOrBuffer from GC in async operations (#25594)
## Summary

- Fix use-after-free crash in async zstd compression, scrypt, and
JSTranspiler operations
- When `StringOrBuffer.fromJSMaybeAsync` is called with `is_async=true`,
the buffer's JSValue is now protected from garbage collection
- Previously, the buffer could be GC'd while a worker thread was still
accessing it, causing segfaults in zstd's `HIST_count_simple` and
similar functions

Fixes BUN-167Z

## Changes

- `fromJSMaybeAsync`: Call `protect()` on buffer when `is_async=true`
- `fromJSWithEncodingMaybeAsync`: Same protection for the early return
path
- `Scrypt`: Fix cleanup to use `deinitAndUnprotect()` for async path,
add missing `deinit()` in sync path
- `JSTranspiler`: Use new protection mechanism instead of manual
`protect()`/`unprotect()` calls
- Simplify `createOnJSThread` signatures to not return errors (OOM is
handled internally)
- Update all callers to use renamed/simplified APIs

## Test plan

- [x] Code review of all callsites to verify correct protect/unprotect
pairing
- [ ] Run existing zstd tests
- [ ] Run existing scrypt tests
- [ ] Run existing transpiler tests

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-19 17:30:26 -08:00
Jarred Sumner
ce5c336ea5 Revert "fix: memory leaks in IPC message handling (#25602)"
This reverts commit 05b12e0ed0.

The tests did not fail with the system version of Bun.
2025-12-19 17:28:54 -08:00
robobun
05b12e0ed0 fix: memory leaks in IPC message handling (#25602)
## Summary

- Add periodic memory reclamation for IPC buffers after processing
messages
- Fix missing `deref()` on `bun.String` created from `cmd` property in
`handleIPCMessage`
- Add `reclaimMemory()` function to shrink incoming buffer and send
queue when they exceed 2MB capacity
- Track message count to trigger memory reclamation every 256 messages

The incoming `ByteList` buffer and send queue `ArrayList` would grow but
never shrink, causing memory accumulation during sustained IPC
messaging.
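
An illustrative sketch of the reclamation policy described above (the names and
buffer interface are not Bun internals):

```ts
// Policy sketch only; Bun's actual implementation is in the native IPC code.
const RECLAIM_INTERVAL = 256;                  // check every 256 messages
const MAX_RETAINED_CAPACITY = 2 * 1024 * 1024; // shrink buffers that grew past 2 MB

let messageCount = 0;

function afterIpcMessage(buffer: { capacity: number; shrinkToFit(): void }) {
  messageCount += 1;
  if (messageCount % RECLAIM_INTERVAL === 0 && buffer.capacity > MAX_RETAINED_CAPACITY) {
    buffer.shrinkToFit(); // release the oversized backing allocation
  }
}
```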

## Test plan

- [x] Added regression tests in
`test/js/bun/spawn/spawn-ipc-memory.test.ts`
- [x] Existing IPC tests pass (`spawn.ipc.test.ts`)
- [x] Existing cluster tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-19 17:27:09 -08:00
Angus Comrie
d9459f8540 Fix postgres empty check when handling arrays (#25607)
### What does this PR do?
Closes #25505. This adjusts the byte-length check in `DataCell.fromBytes` to
12 bytes instead of 16, as zero-dimensional arrays have a shorter preamble
(see the repro sketch below).
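
A repro sketch per the linked issue, assuming Bun's `sql` tagged-template
export with a Postgres connection configured via environment variables (the
query is illustrative):

```ts
import { sql } from "bun";

const [first] = await sql`SELECT '{}'::text[] AS empty`;
const [second] = await sql`SELECT '{}'::text[] AS empty`; // the bug surfaced once the connection was reused
console.log(first.empty, second.empty); // both should be []
```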

### How did you verify your code works?
Test suite passes, and I've added a new test that fails in the main
branch but passes with this change. The issue only seems to crop up when
a connection is _reused_, which is curious.
2025-12-19 14:49:12 -08:00
Jarred Sumner
e79b512a9d Propagate debugger CLI config in single-file executables (#25600) 2025-12-19 09:49:02 -08:00
Claude Bot
dee17270ae Restore Next.js 16.1.0 in test files
Restore Next.js 16.1.0 and update snapshots. The dev-server test
flakiness will be investigated separately.

- next: ^16.1.0
- next-auth: 5.0.0-beta.30
- react: ^19.2.3
- react-dom: ^19.2.3

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 16:03:03 +00:00
Claude Bot
34fcffd1eb Revert Next.js to 15.x in test files for Bun compatibility
Next.js 16 has compatibility issues with Bun's test infrastructure.
Revert test files to Next.js 15.5.9 while keeping React 19.2.3.

Changes:
- test/integration/next-pages: next ^15.5.9, react ^19.2.3
- test/js/third_party/next-auth/fixture: next ^15.5.9, next-auth 5.0.0-beta.25
- bench/install: next ^15.5.9, next-auth 5.0.0-beta.25, react ^19.2.3

The init templates (src/init/*) still use the latest versions for
user-facing templates.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 15:51:35 +00:00
Claude Bot
87d2e28f6c Update next-build.test.ts snapshots for Next.js 16 / React 19
Updated lockfile snapshots to reflect new dependency versions:
- next: ^16.1.0
- react: ^19.2.3
- react-dom: ^19.2.3

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 14:55:46 +00:00
Claude Bot
0d2f84c625 Revert root and bench React to 18.x to fix test infrastructure
The root package.json React is used by the test infrastructure via
file: references. Upgrading it to React 19 breaks the test harness
since test/package.json still has React 18 type definitions and
react-dom 18.3.1.

Keep React 19 updates only in:
- src/init/* templates (user-facing)
- test/integration/next-pages (Next.js integration test)
- test/js/third_party/next-auth/fixture (next-auth test)
- test/bake/fixtures/react-spa-simple (bake test)
- bench/install (install benchmark with Next.js)
- bench/react-hello-world (React benchmark)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 11:56:39 +00:00
Claude Bot
267ebf501c Update next-auth to 5.0.0-beta.30 for Next.js 16 support
next-auth 5.0.0-beta.25 has peer dependency "next": "^14.0.0-0 || ^15.0.0"
next-auth 5.0.0-beta.30 has peer dependency "next": "^14.0.0-0 || ^15.0.0 || ^16.0.0"

This fixes the peer dependency conflict with Next.js 16.1.0.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 11:00:55 +00:00
Claude Bot
d75322ed05 Update lockfiles for react, react-dom, and next upgrades
Run bun install to regenerate lockfiles after updating:
- react: ^19.2.3
- react-dom: ^19.2.3
- next: ^16.1.0

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 09:50:21 +00:00
Claude Bot
b084d8e9bf Update react, react-dom, and next to latest versions
Update all package.json files to use the latest versions:
- react: ^19.2.3
- react-dom: ^19.2.3
- @types/react: ^19.2.3
- @types/react-dom: ^19.2.3
- next: ^16.1.0 (where applicable)
- eslint-config-next: ^16.1.0 (where applicable)

Updated files:
- src/init/react-app/package.json
- src/init/react-tailwind/package.json
- src/init/react-shadcn/package.json
- test/integration/next-pages/package.json
- test/js/third_party/next-auth/fixture/package.json
- test/bake/fixtures/react-spa-simple/package.json
- bench/install/package.json
- bench/package.json
- bench/react-hello-world/package.json
- package.json (root)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 09:46:58 +00:00
robobun
9902039b1f fix: memory leaks in error-handling code for Brotli, Zstd, and Zlib compression state machines (#25592)
## Summary

Fix several memory leaks in the compression libraries:

- **NativeBrotli/NativeZstd reset()** - Each call to `reset()` allocated
a new encoder/decoder without freeing the previous one
- **NativeBrotli/NativeZstd init() error paths** - If `setParams()`
failed after `stream.init()` succeeded, the instance was leaked
- **NativeZstd init()** - If `setPledgedSrcSize()` failed after context
creation, the context was leaked
- **ZlibCompressorArrayList** - After `deflateInit2_()` succeeded, if
`ensureTotalCapacityPrecise()` failed with OOM, zlib internal state was
never freed
- **NativeBrotli close()** - Now sets state to null to prevent potential
double-free (defensive)
- **LibdeflateState** - Added `deinit()` for API consistency

## Test plan

- [x] Added regression test that calls `reset()` 100k times and measures
memory growth
- [x] Test shows memory growth dropped from ~600MB to ~10MB for Brotli
- [x] Verified no double-frees by tracing code paths
- [x] Existing zlib tests pass (except pre-existing timeout in debug
build)

Before fix (system bun 1.3.3):
```
Memory growth after 100000 reset() calls: 624.38 MB  (BrotliCompress)
Memory growth after 100000 reset() calls: 540.63 MB  (BrotliDecompress)
```

After fix:
```
Memory growth after 100000 reset() calls: 11.84 MB   (BrotliCompress)
Memory growth after 100000 reset() calls: 0.16 MB    (BrotliDecompress)
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 21:42:14 -08:00
Dylan Conway
f3fd7506ef fix(windows): handle UV_UNKNOWN and UV_EAI_* error codes in libuv errno mapping (#25596)
## Summary
- Add missing `UV_UNKNOWN` and `UV_EAI_*` error code mappings to the
`errno()` function in `ReturnCode`
- Fixes panic "integer does not fit in destination type" on Windows when
libuv returns unmapped error codes
- Speculative fix for BUN-131E

## Root Cause
The `errno()` function was missing mappings for `UV_UNKNOWN` (-4094) and
all `UV_EAI_*` address info errors (-3000 to -3014). When libuv returned
these codes, the switch fell through to `else => null`, and the caller
at `sys_uv.zig:317` assumed success and tried to cast the negative
return code to `usize`, causing a panic.

This was triggered in `readFileWithOptions` -> `preadv` when:
- Memory-mapped file operations encounter exceptions (file
modified/truncated by another process, network drive issues)
- Windows returns error codes that libuv cannot map to standard errno
values

## Crash Report
```
Bun v1.3.5 (1e86ceb) on windows x86_64baseline []
panic: integer does not fit in destination type
sys_uv.zig:294: preadv
node_fs.zig:5039: readFileWithOptions
```

## Test plan
- [ ] This fix prevents a panic, converting it to a proper error.
Testing would require triggering `UV_UNKNOWN` from libuv, which is
difficult to do reliably (requires memory-mapped file exceptions or
unusual Windows errors).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-18 21:41:41 -08:00
robobun
c21c51a0ff test(security-scanner): add TTY prompt tests using Bun.Terminal (#25587)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-19 05:21:44 +00:00
robobun
0bbf6c74b5 test: add describe blocks for grouping in bun-types.test.ts (#25598)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-19 04:53:18 +00:00
Dylan Conway
57cbbc09e4 fix: correct off-by-one bounds checks in bundler and package installer (#25582)
## Summary

- Fix two off-by-one bounds check errors that used `>` instead of `>=`
- Both bugs could cause undefined behavior (array out-of-bounds access)
when an index equals the array length

## The Bugs

### 1. `src/install/postinstall_optimizer.zig:62`

```zig
// Before (buggy):
if (resolution > metas.len) continue;
const meta: *const Meta = &metas[resolution];  // Out-of-bounds when resolution == metas.len

// After (fixed):
if (resolution >= metas.len) continue;
```

### 2. `src/bundler/linker_context/doStep5.zig:10`

```zig
// Before (buggy):
if (id > c.graph.meta.len) return;
const resolved_exports = &c.graph.meta.items(.resolved_exports)[id];  // Out-of-bounds when id == c.graph.meta.len

// After (fixed):
if (id >= c.graph.meta.len) return;
```

## Why These Are Bugs

Valid array indices are `0` to `len - 1`. When `index == len`:
- `index > len` evaluates to `false` → check passes
- `array[index]` accesses `array[len]` → out-of-bounds / undefined
behavior

## Codebase Patterns

The rest of the codebase correctly uses `>=` for these checks:
- `lockfile.zig:484`: `if (old_resolution >= old.packages.len)
continue;`
- `lockfile.zig:522`: `if (old_resolution >= old.packages.len)
continue;`
- `LinkerContext.zig:389`: `if (source_index >= import_records_list.len)
continue;`
- `LinkerContext.zig:1667`: `if (source_index >= c.graph.ast.len) {`

## Test plan

- [x] Verified fix aligns with existing codebase patterns
- [ ] CI passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 18:04:28 -08:00
Jarred Sumner
7f589ffb4b Disable coderabbit enrichment 2025-12-18 18:03:23 -08:00
Francis F
cea59d7fc0 docs(sqlite): fix .run() return value documentation (#25060)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-18 20:44:35 +00:00
Jarred Sumner
4ea1454e4a Delete unused workflow 2025-12-18 12:04:28 -08:00
Dylan Conway
8941a363c3 fix: dupe ca string in .npmrc to prevent use-after-free (#25563)
## Summary

- Fix use-after-free bug when parsing `ca` option from `.npmrc`
- The `ca` string was being stored directly from the parser's arena
without duplication
- Since the parser arena is freed at the end of `loadNpmrc`, this
created a dangling pointer

## The Bug

In `src/ini.zig`, the `ca` string wasn't being duplicated like all other
string properties:

```zig
// Lines 983-986 explicitly warn about this:
// Need to be very, very careful here with strings.
// They are allocated in the Parser's arena, which of course gets
// deinitialized at the end of the scope.
// We need to dupe all strings

// Line 981: Parser arena is freed here
defer parser.deinit();

// Line 1016-1020: THE BUG - string not duped!
if (out.asProperty("ca")) |query| {
    if (query.expr.asUtf8StringLiteral()) |str| {
        install.ca = .{
            .str = str,  // ← Dangling pointer after parser.deinit()!
        };
```

All other string properties in the same function correctly duplicate:
- `registry` (line 996): `try allocator.dupe(u8, str)`
- `cache` (line 1002): `try allocator.dupe(u8, str)`
- `cafile` (line 1037): `asStringCloned(allocator)`
- `ca` array items (line 1026): `asStringCloned(allocator)`

## User Impact

When a user has `ca=<certificate>` in their `.npmrc` file:
1. The certificate string is parsed and stored
2. The parser arena is freed
3. `install.ca.str` becomes a dangling pointer
4. Later TLS/SSL operations access freed memory
5. Could cause crashes, undefined behavior, or security issues

## Test plan

- Code inspection confirms this matches the pattern used for all other
string properties
- The fix adds `try allocator.dupe(u8, str)` to match `cache`,
`registry`, etc.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 19:56:25 -08:00
Dylan Conway
722ac3aa5a fix: check correct variable in subprocess stdin cleanup (#25562)
## Summary

- Fix typo in `onProcessExit` where `existing_stdin_value.isCell()` was
checked instead of `existing_value.isCell()`
- Since `existing_stdin_value` is always `.zero` at that point, the
condition was always false, making the inner block dead code

## The Bug

In `src/bun.js/api/bun/subprocess.zig:593`:

```zig
var existing_stdin_value = jsc.JSValue.zero;  // Line 590 - always .zero
if (this_jsvalue != .zero) {
    if (jsc.Codegen.JSSubprocess.stdinGetCached(this_jsvalue)) |existing_value| {
        if (existing_stdin_value.isCell()) {  // BUG! Should be existing_value
            // This block was DEAD CODE - never executed
```

Compare with the correct pattern used elsewhere:
```zig
// shell/subproc.zig:251-252 (CORRECT)
if (jsc.Codegen.JSSubprocess.stdinGetCached(subprocess.this_jsvalue)) |existing_value| {
    jsc.WebCore.FileSink.JSSink.setDestroyCallback(existing_value, 0);  // Uses existing_value
}
```

## Impact

The dead code prevented:
- Recovery of stdin from cached JS value when `weak_file_sink_stdin_ptr`
is null
- Proper cleanup via `onAttachedProcessExit` on the FileSink  
- `setDestroyCallback` cleanup in `onProcessExit`

Note: The user-visible impact was mitigated by redundant cleanup paths
in `Writable.zig` that also call `setDestroyCallback`.

## Test plan

- Code inspection confirms this is a straightforward typo fix
- Existing subprocess tests continue to pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 18:34:58 -08:00
Dylan Conway
a333d02f84 fix: correct inverted buffer allocation logic in Postgres array parsing (#25564)
## Summary

- Fix inverted buffer allocation logic when parsing strings in Postgres
arrays
- Strings larger than 16KB were incorrectly using the stack buffer
instead of dynamically allocating
- This caused spurious `InvalidByteSequence` errors for valid data

## The Bug

In `src/sql/postgres/DataCell.zig`, the condition for when to use
dynamic allocation was inverted:

```zig
// BEFORE (buggy):
const needs_dynamic_buffer = str_bytes.len < stack_buffer.len;  // TRUE when SMALL

// AFTER (fixed):
const needs_dynamic_buffer = str_bytes.len > stack_buffer.len;  // TRUE when LARGE
```

## What happened with large strings (>16KB):

1. `needs_dynamic_buffer` = false (e.g., 20000 < 16384 is false)
2. Uses `stack_buffer[0..]` which is only 16KB
3. `unescapePostgresString` hits bounds check and returns
`BufferTooSmall`
4. Error converted to `InvalidByteSequence`
5. User gets error even though data is valid

## User Impact

Users with Postgres arrays containing JSON or string elements larger
than 16KB would get spurious InvalidByteSequence errors even though
their data was perfectly valid.

## Test plan

- Code inspection confirms the logic was inverted
- The fix aligns with the intended behavior: use stack buffer for small
strings, dynamic allocation for large strings

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 18:34:17 -08:00
Dylan Conway
c1acb0b9a4 fix(shell): prevent double-close of fd when using &> redirect with builtins (#25568)
## Summary

- Fix double-close of file descriptor when using `&>` redirect with
shell builtin commands
- Add `dupeRef()` helper for cleaner reference counting semantics
- Add tests for `&>` and `&>>` redirects with builtins

## Test plan

- [x] Added tests in `test/js/bun/shell/file-io.test.ts` that reproduce
the bug
- [x] All file-io tests pass

## The Bug

When using `&>` to redirect both stdout and stderr to the same file with
a shell builtin command (e.g., `pwd &> file.txt`), the code was creating
two separate `IOWriter` instances that shared the same file descriptor.
When both `IOWriter`s were destroyed, they both tried to close the same
fd, causing an `EBADF` (bad file descriptor) error.

```javascript
import { $ } from "bun";
await $`pwd &> output.txt`; // Would crash with EBADF
```

## The Fix

1. Share a single `IOWriter` between stdout and stderr when both are
redirected to the same file, with proper reference counting
2. Rename `refSelf` to `dupeRef` for clarity across `IOReader`,
`IOWriter`, `CowFd`, and add it to `Blob` for consistency
3. Fix the `Body.Value` blob case to also properly reference count when
the same blob is assigned to multiple outputs

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Latest model <noreply@anthropic.com>
2025-12-17 18:33:53 -08:00
Jarred Sumner
ffd2240c31 Bump 2025-12-17 11:42:54 -08:00
190n
fa5a5bbe55 fix: v8::Value::IsInt32()/IsUint32() edge cases (#25548)
### What does this PR do?

- fixes both functions returning false for double-encoded values (even
if the numeric value is a valid int32/uint32)
- fixes IsUint32() returning false for values that don't fit in int32
- fixes the test from #22462 not testing anything (the native functions
were being passed a callback to run garbage collection as the first
argument, so it was only ever testing what the type check APIs returned
for that function)
- extends the test to cover the first edge case above

### How did you verify your code works?

The new tests fail without these fixes.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-17 00:52:16 -08:00
Dylan Conway
1e86cebd74 Add bun_version to link metadata (#25545)
## Summary
- Add `bun_version` field to `link-metadata.json`
- Pass `VERSION` CMake variable to the metadata script as `BUN_VERSION`
env var

This ensures the build version is captured in the link metadata JSON
file, which is useful for tracking which version produced a given build
artifact.

## Test plan
- Build with `bun bd` and verify `link-metadata.json` includes
`bun_version`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-16 19:53:05 -08:00
robobun
bc47f87450 fix(ini): support env var expansion in quoted .npmrc values (#25518)
## Summary

Fixes environment variable expansion in quoted `.npmrc` values and adds
support for the `?` optional modifier.

### Changes

**Simplified quoted value handling:**
- Removed unnecessary `isProperlyQuoted` check that added complexity
without benefit
- When JSON.parse succeeds for quoted strings, expand env vars in the
result
- When JSON.parse fails for single-quoted strings like `'${VAR}'`, still
expand env vars

**Added `?` modifier support (matching npm behavior):**
- `${VAR}` - if VAR is undefined, leaves as `${VAR}` (no expansion)
- `${VAR?}` - if VAR is undefined, expands to empty string

This applies consistently to both quoted and unquoted values.

### Examples

```ini
# Env var found - all expand to the value
token = ${NPM_TOKEN}
token = "${NPM_TOKEN}"
token = '${NPM_TOKEN}'

# Env var NOT found - left as-is
token = ${NPM_TOKEN}         # → ${NPM_TOKEN}
token = "${NPM_TOKEN}"       # → ${NPM_TOKEN}
token = '${NPM_TOKEN}'       # → ${NPM_TOKEN}

# Optional modifier (?) - expands to empty if not found
token = ${NPM_TOKEN?}        # → (empty)
token = "${NPM_TOKEN?}"      # → (empty)
auth = "Bearer ${TOKEN?}"    # → Bearer 
```

### Test Plan

- Added 8 new tests for the `?` modifier covering quoted and unquoted
values
- Verified all expected values match `npm config get` behavior
- All 30 ini tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-12-16 19:49:23 -08:00
Dylan Conway
698b004ea4 Add step in CI to upload link metadata (#25448)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-16 14:30:10 -08:00
robobun
b135c207ed fix(yaml): remove YAML 1.1 legacy boolean values for YAML 1.2 compliance (#25537)
## Summary

- Remove YAML 1.1 legacy boolean values (`yes/no/on/off/y/Y`) that are
not part of the YAML 1.2 Core Schema
- Keep YAML 1.2 Core Schema compliant values: `true/True/TRUE`,
`false/False/FALSE`, `null/Null/NULL`, `0x` hex, `0o` octal
- Add comprehensive roundtrip tests for YAML 1.2 compliance

**Removed (now parsed as strings):**
- `yes`, `Yes`, `YES` (were `true`)
- `no`, `No`, `NO` (were `false`)
- `on`, `On`, `ON` (were `true`)
- `off`, `Off`, `OFF` (were `false`)
- `y`, `Y` (were `true`)

This fixes a common pain point where GitHub Actions workflow files with
`on:` keys would have the key parsed as boolean `true` instead of the
string `"on"`.
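
For example, assuming Bun's `Bun.YAML.parse`, the behaviour changes like this:

```ts
const workflow = Bun.YAML.parse("on: [push]\nenabled: yes");
console.log(workflow);
// YAML 1.1 behaviour (before): { "true": ["push"], enabled: true }
// YAML 1.2 behaviour (after):  { on: ["push"], enabled: "yes" }
```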

## YAML 1.2 Core Schema Specification

From [YAML 1.2.2 Section 10.3.2 Tag
Resolution](https://yaml.org/spec/1.2.2/#1032-tag-resolution):

| Regular expression | Resolved to tag |
|-------------------|-----------------|
| `null \| Null \| NULL \| ~` | tag:yaml.org,2002:null |
| `/* Empty */` | tag:yaml.org,2002:null |
| `true \| True \| TRUE \| false \| False \| FALSE` | tag:yaml.org,2002:bool |
| `[-+]? [0-9]+` | tag:yaml.org,2002:int (Base 10) |
| `0o [0-7]+` | tag:yaml.org,2002:int (Base 8) |
| `0x [0-9a-fA-F]+` | tag:yaml.org,2002:int (Base 16) |
| `[-+]? ( \. [0-9]+ \| [0-9]+ ( \. [0-9]* )? ) ( [eE] [-+]? [0-9]+ )?` | tag:yaml.org,2002:float |
| `[-+]? ( \.inf \| \.Inf \| \.INF )` | tag:yaml.org,2002:float (Infinity) |
| `\.nan \| \.NaN \| \.NAN` | tag:yaml.org,2002:float (Not a number) |

Note: `yes`, `no`, `on`, `off`, `y`, `n` are **not** in the YAML 1.2 Core
Schema boolean list. These YAML 1.1 booleans were dropped in YAML 1.2, as noted
in [YAML 1.2 Section 1.2](https://yaml.org/spec/1.2.2/#12-yaml-history):

> The YAML 1.2 specification was published in 2009. Its primary focus
was making YAML a strict superset of JSON. **It also removed many of the
problematic implicit typing recommendations.**

## Test plan

- [x] Updated existing YAML tests to reflect YAML 1.2 Core Schema
behavior
- [x] Added roundtrip tests (stringify → parse) for YAML 1.2 compliance
- [x] Verified tests fail with system Bun (YAML 1.1 behavior) and pass
with debug build (YAML 1.2)
- [x] Run `bun bd test test/js/bun/yaml/yaml.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-16 14:29:39 -08:00
Ciro Spaciari
a1dd26d7db fix(usockets) fix last_write_failed flag (#25496)
https://github.com/oven-sh/bun/pull/25361 needs to be merged before this
PR

## Summary
- Move the `last_write_failed` flag from the loop level to a per-socket flag
for correctness

## Changes

- Move `last_write_failed` from `loop->data` to `socket->flags`
- More semantically correct since write status is per-socket, not
per-loop

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-16 14:26:42 -08:00
Christian Rishøj
7c06320d0f fix(ws): fix zlib version mismatch on Windows (segfault) (#25538)
## Summary

Fixes #24593 - WebSocket segfault on Windows when publishing large
messages with `perMessageDeflate: true`.
Also fixes #21028 (duplicate issue).
Also closes #25457 (alternative PR).

**Root cause:** 

On Windows, the C++ code was compiled against system zlib headers
(1.3.1) but linked against Bun's vendored Cloudflare zlib (1.2.8).

This version mismatch caused `deflateInit2()` to return
`Z_VERSION_ERROR` (-6), leaving the deflate stream in an invalid state.
All subsequent `deflate()` calls returned `Z_STREAM_ERROR` (-2),
producing zero output, which then caused an integer underflow when
subtracting the 4-byte trailer → segfault in memcpy.
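
A minimal repro sketch of that condition, assuming Bun.serve's
`perMessageDeflate` websocket option (payload size and topic are illustrative):

```ts
const bigMessage = JSON.stringify({ data: "x".repeat(110_000) }); // ~109 KB, as in the report

const server = Bun.serve({
  port: 0,
  fetch(req, srv) {
    if (srv.upgrade(req)) return;
    return new Response("upgrade required", { status: 426 });
  },
  websocket: {
    perMessageDeflate: true,
    open(ws) {
      ws.subscribe("updates");
      server.publish("updates", bigMessage); // segfaulted on Windows before this fix
    },
    message() {},
  },
});
```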

**Fix:** 

Add `${VENDOR_PATH}/zlib` to the C++ include paths in
`cmake/targets/BuildBun.cmake`. This ensures the vendored zlib headers
are found before system headers, maintaining header/library version
consistency.

This is a simpler alternative to #25457 which worked around the issue by
using libdeflate exclusively.

## Test plan

- [x] Added regression test `test/regression/issue/24593.test.ts` with 4
test cases:
  - Large ~109KB JSON message publish (core reproduction)
  - Multiple rapid publishes (buffer corruption)
  - Broadcast to multiple subscribers
  - Messages at CORK_BUFFER_SIZE boundary (16KB)
- [x] Tests pass on Windows (was crashing before fix)
- [x] Tests pass on macOS

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-16 14:23:29 -08:00
robobun
dd04c57258 feat: implement V8 Value type checking APIs (#22462)
## Summary

This PR implements four V8 C++ API methods for type checking that are
commonly used by native Node.js modules:
- `v8::Value::IsMap()` - checks if value is a Map
- `v8::Value::IsArray()` - checks if value is an Array  
- `v8::Value::IsInt32()` - checks if value is a 32-bit integer
- `v8::Value::IsBigInt()` - checks if value is a BigInt

## Implementation Details

The implementation maps V8's type checking APIs to JavaScriptCore's
equivalent functionality:
- `IsMap()` uses JSC's `inherits<JSC::JSMap>()` check
- `IsArray()` uses JSC's `isArray()` function with the global object
- `IsInt32()` uses JSC's `isInt32()` method  
- `IsBigInt()` uses JSC's `isBigInt()` method

## Changes

- Added method declarations to `V8Value.h`
- Implemented the methods in `V8Value.cpp` 
- Added symbol exports to `napi.zig` (both Unix and Windows mangled
names)
- Added symbols to `symbols.txt` and `symbols.dyn`
- Added comprehensive tests in `v8-module/main.cpp` and `v8.test.ts`

## Testing

The implementation has been verified to:
- Compile successfully without errors
- Export the correct symbols in the binary
- Follow established patterns in the V8 compatibility layer

Tests cover various value types including empty and populated
Maps/Arrays, different numeric ranges, BigInts, and other JavaScript
types.

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 19:50:11 -08:00
robobun
344b2c1dfe fix: Response.clone() no longer locks body when body was accessed before clone (#25484)
## Summary
- Fix bug where `Response.clone()` would lock the original response's
body when `response.body` was accessed before cloning
- Apply the same fix to `Request.clone()`

## Root Cause
When `response.body` was accessed before calling `response.clone()`, the
original response's body would become locked after cloning. This
happened because:

1. When the cloned response was wrapped with `toJS()`,
`checkBodyStreamRef()` was called which moved the stream from
`Locked.readable` to `js.gc.stream` and cleared `Locked.readable`
2. The subsequent code tried to get the stream from `Locked.readable`,
which was now empty, so the body cache update was skipped
3. The JavaScript-level body property cache still held the old locked
stream

## Fix
Updated the cache update logic to:
1. For the cloned response: use `js.gc.stream.get()` instead of
`Locked.readable.get()` since `toJS()` already moved the stream
2. For the original response: use `Locked.readable.get()` which still
holds the teed stream since `checkBodyStreamRef` hasn't been called yet

## Reproduction
```javascript
const readableStream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("Hello, world!"));
    controller.close();
  },
});

const response = new Response(readableStream);
console.log(response.body?.locked); // Accessing body before clone
const cloned = response.clone();
console.log(response.body?.locked); // Expected: false, Actual: true 
console.log(cloned.body?.locked);   // Expected: false, Actual: false 
```

## Test plan
- [x] Added regression tests for `Response.clone()` in
`test/js/web/fetch/response.test.ts`
- [x] Added regression test for `Request.clone()` in
`test/js/web/request/request.test.ts`
- [x] Verified tests fail with system bun (before fix) and pass with
debug build (after fix)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 18:46:02 -08:00
Ciro Spaciari
aef0b5b4a6 fix(usockets): safely handle socket reallocation during context adoption (#25361)
## Summary
- Fix use-after-free vulnerability during socket adoption by properly
tracking reallocated sockets
- Add safety checks to prevent linking closed sockets to context lists
- Properly track socket state with new `is_closed`, `adopted`, and
`is_tls` flags

## What does this PR do?

This PR improves event loop stability by addressing potential
use-after-free issues that can occur when sockets are reallocated during
adoption (e.g., when upgrading a TCP socket to TLS).

### Key Changes

**Socket State Tracking
([internal.h](packages/bun-usockets/src/internal/internal.h))**
- Added `is_closed` flag to explicitly track when a socket has been
closed
- Added `adopted` flag to mark sockets that were reallocated during
context adoption
- Added `is_tls` flag to track TLS socket state for proper low-priority
queue handling

**Safe Socket Adoption
([context.c](packages/bun-usockets/src/context.c))**
- When `us_poll_resize()` returns a new pointer (reallocation occurred),
the old socket is now:
  - Marked as closed (`is_closed = 1`)
  - Added to the closed socket cleanup list
  - Marked as adopted (`adopted = 1`)
  - Has its `prev` pointer set to the new socket for event redirection
- Added guards to
`us_internal_socket_context_link_socket/listen_socket/connecting_socket`
to prevent linking already-closed sockets

**Event Loop Handling ([loop.c](packages/bun-usockets/src/loop.c))**
- After callbacks that can trigger socket adoption (`on_open`,
`on_writable`, `on_data`), the event loop now checks if the socket was
reallocated and redirects to the new socket
- Low-priority socket handling now properly checks `is_closed` state and
uses `is_tls` flag for correct SSL handling

**Poll Resize Safety
([epoll_kqueue.c](packages/bun-usockets/src/eventing/epoll_kqueue.c))**
- Changed `us_poll_resize()` to always allocate new memory with
`us_calloc()` instead of `us_realloc()` to ensure the old pointer
remains valid for cleanup
- Now takes `old_ext_size` parameter to correctly calculate memory sizes
- Re-enabled `us_internal_loop_update_pending_ready_polls()` call in
`us_poll_change()` to ensure pending events are properly redirected

### How did you verify your code works?
Ran the existing CI and the existing socket-upgrade tests under an ASan build.
2025-12-15 18:43:51 -08:00
robobun
740fb23315 fix(windows): improve bunx metadata validation (#25012)
## Summary

- Improved validation for bunx metadata files on Windows
- Added graceful error handling for malformed metadata instead of
crashing
- Added regression test for the fix

## Test plan

- [x] Run `bun bd test test/cli/install/bunx.test.ts -t "should not
crash on corrupted"`
- [x] Manual testing with corrupted `.bunx` files
- [x] Verified normal operation still works

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 18:37:09 -08:00
robobun
2dd997c4b5 fix(node): support duplicate dlopen calls with DLHandleMap (#24404)
## Summary

Fixes an issue where loading the same native module
(NODE_MODULE_CONTEXT_AWARE) multiple times would fail with:
```
symbol 'napi_register_module_v1' not found in native module
```

Fixes https://github.com/oven-sh/bun/issues/23136
Fixes https://github.com/oven-sh/bun/issues/21432

## Root Cause

When a native module is loaded for the first time:
1. `dlopen()` loads the shared library
2. Static constructors run and call `node_module_register()`
3. The module registers successfully

On subsequent loads of the same module:
1. `dlopen()` returns the same handle (library already loaded)
2. Static constructors **do not run again**
3. No registration occurs, leading to the "symbol not found" error

## Solution

Implemented a thread-safe `DLHandleMap` to cache and replay module
registrations:

1. **Thread-local storage** captures the `node_module*` during static
constructor execution
2. **After successful first load**, save the registration to the global
map
3. **On subsequent loads**, look up the cached registration and replay
it

This approach matches Node.js's `global_handle_map` implementation.
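
A sketch of the duplicate-load case this fixes (the addon path is illustrative):

```ts
const addonPath = "./build/Release/addon.node"; // hypothetical prebuilt addon

const first = { exports: {} as Record<string, unknown> };
const second = { exports: {} as Record<string, unknown> };

process.dlopen(first, addonPath);
process.dlopen(second, addonPath); // previously: "symbol 'napi_register_module_v1' not found"
console.log(Object.keys(second.exports)); // now populated from the cached registration
```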

## Changes

- Created `src/bun.js/bindings/DLHandleMap.h` - thread-safe singleton
cache
- Added thread-local storage in `src/bun.js/bindings/v8/node.cpp`
- Modified `src/bun.js/bindings/BunProcess.cpp` to save/lookup cached
modules
- Also includes the exports fix (using `toObject()` to match Node.js
behavior)

## Test Plan

Added `test/js/node/process/dlopen-duplicate-load.test.ts` with tests
that:
- Build a native addon using node-gyp
- Load it twice with `process.dlopen`
- Verify both loads succeed
- Test with different exports objects

All tests pass.

## Related Issue

Fixes the second bug discovered in the segfault investigation.

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 18:35:26 -08:00
robobun
4061e1cb4f fix: handle EINVAL from copy_file_range on eCryptfs (#25534)
## Summary
- Add `EINVAL` and `OPNOTSUPP` to the list of errors that trigger
fallback from `copy_file_range` to `sendfile`/read-write loop
- Fixes `Bun.write` and `fs.copyFile` failing on eCryptfs filesystems
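
A sketch of the user-visible behaviour this covers (paths are illustrative): on
an eCryptfs mount, `copy_file_range` reports `EINVAL` and Bun now falls back to
`sendfile`/read-write instead of surfacing an error:

```ts
import { copyFile } from "node:fs/promises";

const src = "/home/user/Private/original.bin"; // eCryptfs-backed home directory (illustrative)
await Bun.write("/home/user/Private/copy-via-bun-write.bin", Bun.file(src));
await copyFile(src, "/home/user/Private/copy-via-fs.bin");
```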

## Test plan
- [x] Existing `copyFile` tests pass (`bun bd test
test/js/node/fs/fs.test.ts -t "copyFile"`)
- [x] Existing `copy_file_range` fallback tests pass (`bun bd test
test/js/bun/io/bun-write.test.js -t "should work when copyFileRange is
not available"`)

Fixes #13968

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 17:47:08 -08:00
robobun
6386eef8aa fix(bunx): handle empty string arguments on Windows (#25025)
## Summary

Fixes #13316
Fixes #18275

Running `bunx cowsay ""` (or any package with an empty string argument)
on Windows caused a panic. Additionally, `bunx concurrently "command
with spaces"` was splitting quoted arguments incorrectly.

**Repro #13316:**
```bash
bunx cowsay ""
# panic(main thread): reached unreachable code
```

**Repro #18275:**
```bash
bunx concurrently "bun --version" "bun --version"
# Only runs once, arguments split incorrectly
# Expected: ["bun --version", "bun --version"]
# Actual: ["bun", "--version", "bun", "--version"]
```

## Root Cause

The bunx fast path on Windows bypasses libuv and calls `CreateProcessW`
directly to save 5-12ms. The command line building logic had two issues:

1. **Empty strings**: Not quoted at all, resulting in invalid command
line
2. **Arguments with spaces**: Not quoted, causing them to be split into
multiple arguments

## Solution

Implement Windows command-line argument quoting using libuv's proven
algorithm:
- Port of libuv's `quote_cmd_arg` function (process backwards + reverse)
- Empty strings become `""`
- Strings with spaces/tabs/quotes are wrapped in quotes
- Backslashes before quotes are properly escaped per Windows rules
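
A simplified TypeScript sketch of these quoting rules (libuv's `quote_cmd_arg`
processes the string backwards and then reverses it; the forward pass below has
the same effect):

```ts
function quoteCmdArg(arg: string): string {
  if (arg.length === 0) return '""';    // empty strings must still produce an argument
  if (!/[ \t"]/.test(arg)) return arg;  // nothing needs quoting
  let out = '"';
  let backslashes = 0;
  for (const ch of arg) {
    if (ch === "\\") {
      backslashes++;
      continue;
    }
    if (ch === '"') {
      out += "\\".repeat(backslashes * 2) + '\\"'; // double the run, then escape the quote
    } else {
      out += "\\".repeat(backslashes) + ch;        // backslashes not before a quote are literal
    }
    backslashes = 0;
  }
  return out + "\\".repeat(backslashes * 2) + '"'; // a trailing run must not escape the closing quote
}

quoteCmdArg("");              // => ""
quoteCmdArg("bun --version"); // => "bun --version"
```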

**Why not use libuv directly?**
- Normal `Bun.spawn()` uses `uv_spawn()` which handles quoting
internally
- bunx fast path bypasses libuv to save 5-12ms (calls `CreateProcessW`
directly)
- libuv's `quote_cmd_arg` is a static function (not exported)
- Solution: port the algorithm to Zig

## Test Plan

- [x] Added regression test for empty strings (#13316)
- [x] Added regression test for arguments with spaces (#18275)
- [x] Verified system bun (v1.3.3) fails both tests
- [x] Verified fix passes both tests
- [x] Implementation based on battle-tested libuv algorithm

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-15 17:29:04 -08:00
robobun
3394fd3bdd fix(node:url): return empty string for invalid domains in domainToASCII/domainToUnicode (#25196)
## Summary
- Fixes `url.domainToASCII` and `url.domainToUnicode` to return empty
string instead of throwing `TypeError` when given invalid domains
- Per Node.js docs: "if `domain` is an invalid domain, the empty string
is returned"

## Test plan
- [x] Run `bun bd test test/regression/issue/24191.test.ts` - all 2
tests pass
- [x] Verify tests fail with system Bun (`USE_SYSTEM_BUN=1`) to confirm
fix validity
- [x] Manual verification: `url.domainToASCII('xn--iñvalid.com')`
returns `""`

## Example

Before (bug):
```
$ bun -e "import url from 'node:url'; console.log(url.domainToASCII('xn--iñvalid.com'))"
TypeError: domainToASCII failed
```

After (fixed):
```
$ bun -e "import url from 'node:url'; console.log(url.domainToASCII('xn--iñvalid.com'))"
(empty string output)
```

Closes #24191

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 17:26:32 -08:00
robobun
5a8cdc08f0 docs(bundler): add comprehensive Bun.build compile API documentation (#25536) 2025-12-15 16:20:50 -08:00
Jarred Sumner
dcc3386611 [publish images] 2025-12-15 15:34:04 -08:00
robobun
8dc79641c8 fix(http): support proxy passwords longer than 4096 characters (#25530)
## Summary
- Fixes silent 401 Unauthorized errors when using proxies with long
passwords (e.g., JWT tokens > 4096 chars)
- Bun was silently dropping proxy passwords exceeding 4095 characters,
falling through to code that only encoded the username

## Changes
- Added `PercentEncoding.decodeWithFallback` which uses a 4KB stack
buffer for the common case and falls back to heap allocation only for
larger inputs
- Updated proxy auth encoding in `AsyncHTTP.zig` to use the new fallback
method
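
A sketch of the previously failing case, assuming Bun's fetch `proxy` option
(the proxy address and token are illustrative):

```ts
const longJwt = "eyJhbGciOiJIUzI1NiJ9.".padEnd(5000, "x"); // > 4096 characters
const res = await fetch("https://example.com/", {
  proxy: `http://user:${encodeURIComponent(longJwt)}@127.0.0.1:8080`,
});
// Before this fix the long password was silently dropped and the proxy answered 401 Unauthorized.
```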

## Test plan
- [x] Added test case that verifies passwords > 4096 chars are handled
correctly
- [x] Test fails with system bun (v1.3.3), passes with this fix
- [x] All 29 proxy tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 13:21:41 -08:00
robobun
d865ef41e2 feat: add Bun.Terminal API for pseudo-terminal (PTY) support (#25415)
## Summary

This PR adds a new `Bun.Terminal` API for creating and managing
pseudo-terminals (PTYs), enabling interactive terminal applications in
Bun.

### Features

- **Standalone Terminal**: Create PTYs directly with `new
Bun.Terminal(options)`
- **Spawn Integration**: Spawn processes with PTY attached via
`Bun.spawn({ terminal: options })`
- **Full PTY Control**: Write data, resize, set raw mode, and handle
callbacks

## Examples

### Basic Terminal with Spawn (Recommended)

```typescript
const proc = Bun.spawn(["bash"], {
  terminal: {
    cols: 80,
    rows: 24,
    data(terminal, data) {
      // Handle output from the terminal
      process.stdout.write(data);
    },
    exit(terminal, code, signal) {
      console.log(`Process exited with code ${code}`);
    },
  },
});

// Write commands to the terminal
proc.terminal.write("echo Hello from PTY!\n");
proc.terminal.write("exit\n");

await proc.exited;
proc.terminal.close();
```

### Interactive Shell

```typescript
// Create an interactive shell that mirrors to stdout
const proc = Bun.spawn(["bash", "-i"], {
  terminal: {
    cols: process.stdout.columns || 80,
    rows: process.stdout.rows || 24,
    data(term, data) {
      process.stdout.write(data);
    },
  },
});

// Forward stdin to the terminal
process.stdin.setRawMode(true);
for await (const chunk of process.stdin) {
  proc.terminal.write(chunk);
}
```

### Running Interactive Programs (vim, htop, etc.)

```typescript
const proc = Bun.spawn(["vim", "file.txt"], {
  terminal: {
    cols: process.stdout.columns,
    rows: process.stdout.rows,
    data(term, data) {
      process.stdout.write(data);
    },
  },
});

// Handle terminal resize
process.stdout.on("resize", () => {
  proc.terminal.resize(process.stdout.columns, process.stdout.rows);
});

// Forward input
process.stdin.setRawMode(true);
for await (const chunk of process.stdin) {
  proc.terminal.write(chunk);
}
```

### Capturing Colored Output

```typescript
const chunks: Uint8Array[] = [];

const proc = Bun.spawn(["ls", "--color=always"], {
  terminal: {
    data(term, data) {
      chunks.push(data);
    },
  },
});

await proc.exited;
proc.terminal.close();

// Output includes ANSI color codes
const output = Buffer.concat(chunks).toString();
console.log(output);
```

### Standalone Terminal (Advanced)

```typescript
const terminal = new Bun.Terminal({
  cols: 80,
  rows: 24,
  data(term, data) {
    console.log("Received:", data.toString());
  },
});

// Use terminal.stdin as the fd for child process stdio
const proc = Bun.spawn(["bash"], {
  stdin: terminal.stdin,
  stdout: terminal.stdin,
  stderr: terminal.stdin,
});

terminal.write("echo hello\n");

// Clean up
terminal.close();
```

### Testing TTY Detection

```typescript
const proc = Bun.spawn([
  "bun", "-e", 
  "console.log('isTTY:', process.stdout.isTTY)"
], {
  terminal: {},
});

// Output: isTTY: true
```

## API

### `Bun.spawn()` with `terminal` option

```typescript
const proc = Bun.spawn(cmd, {
  terminal: {
    cols?: number,        // Default: 80
    rows?: number,        // Default: 24  
    name?: string,        // Default: "xterm-256color"
    data?: (terminal: Terminal, data: Uint8Array) => void,
    exit?: (terminal: Terminal, code: number, signal: string | null) => void,
    drain?: (terminal: Terminal) => void,
  }
});

// Access the terminal
proc.terminal.write(data);
proc.terminal.resize(cols, rows);
proc.terminal.setRawMode(enabled);
proc.terminal.close();

// Note: proc.stdin, proc.stdout, proc.stderr return null when terminal is used
```

### `new Bun.Terminal(options)`

```typescript
const terminal = new Bun.Terminal({
  cols?: number,
  rows?: number,
  name?: string,
  data?: (terminal, data) => void,
  exit?: (terminal, code, signal) => void,
  drain?: (terminal) => void,
});

terminal.stdin;   // Slave fd (for child process)
terminal.stdout;  // Master fd (for reading)
terminal.closed;  // boolean
terminal.write(data);
terminal.resize(cols, rows);
terminal.setRawMode(enabled);
terminal.ref();
terminal.unref();
terminal.close();
await terminal[Symbol.asyncDispose]();
```

## Implementation Details

- Uses `openpty()` to create pseudo-terminal pairs
- Properly manages file descriptor lifecycle with reference counting
- Integrates with Bun's event loop via `BufferedReader` and
`StreamingWriter`
- Supports `await using` syntax for automatic cleanup
- POSIX only (Linux, macOS) - not available on Windows

## Test Results

- 80 tests passing
- Covers: construction, writing, reading, resize, raw mode, callbacks,
spawn integration, error handling, GC safety

## Changes

- `src/bun.js/api/bun/Terminal.zig` - Terminal implementation
- `src/bun.js/api/bun/Terminal.classes.ts` - Class definition for
codegen
- `src/bun.js/api/bun/subprocess.zig` - Added terminal field and getter
- `src/bun.js/api/bun/js_bun_spawn_bindings.zig` - Terminal option
parsing
- `src/bun.js/api/BunObject.classes.ts` - Terminal getter on Subprocess
- `packages/bun-types/bun.d.ts` - TypeScript types
- `docs/runtime/child-process.mdx` - Documentation
- `test/js/bun/terminal/terminal.test.ts` - Comprehensive tests

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 12:51:13 -08:00
robobun
e66b4639bd fix: use LLVM unstable repo for Debian trixie/forky [publish images] (#25470)
## Summary
- Fix Docker image build failure on Debian trixie by using LLVM's
`unstable` repository instead of the non-existent `trixie` repository
- The LLVM apt repository (`apt.llvm.org`) doesn't have packages for
Debian trixie (13) or forky - attempts to access
`llvm-toolchain-trixie-19` return 404
- Pass `-n=unstable` flag to `llvm.sh` when running on these Debian
versions

## Test plan
- [ ] Verify the Docker image build succeeds at
https://github.com/oven-sh/bun-development-docker-image/actions

Fixes the build failure from:
https://github.com/oven-sh/bun-development-docker-image/actions/runs/20105199193

Error was:
```
E: The repository 'http://apt.llvm.org/trixie llvm-toolchain-trixie-19 Release' does not have a Release file.
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 12:45:45 -08:00
robobun
8698d25c52 fix: ensure TLS handshake callback fires before HTTP request handler (#25525)
## Summary

Fixes a flaky test (`test-http-url.parse-https.request.js`) where
`request.socket._secureEstablished` was intermittently `false` when the
HTTP request handler was called on HTTPS servers.

## Root Cause

The `isAuthorized` flag was stored in
`HttpContextData::flags.isAuthorized`, which is **shared across all
sockets** in the same context. This meant multiple concurrent TLS
connections could overwrite each other's authorization state, and the
value could be stale when read.

## Fix

Moved the `isAuthorized` flag from the context-level `HttpContextData`
to the per-socket `AsyncSocketData` base class. This ensures each socket
has its own authorization state that is set correctly during its TLS
handshake callback.

## Changes

- **`AsyncSocketData.h`**: Added per-socket `bool isAuthorized` field
- **`HttpContext.h`**: Updated handshake callback to set per-socket flag
instead of context-level flag
- **`JSNodeHTTPServerSocket.cpp`**: Updated `isAuthorized()` to read
from per-socket `AsyncSocketData` (via `HttpResponseData` which inherits
from it)

## Testing

Ran the flaky test 50+ times with 100% pass rate.

Also verified gRPC and HTTP2 tests still pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 12:44:26 -08:00
github-actions[bot]
81a5c79928 deps: update c-ares to v1.34.6 (#25509)
## What does this PR do?

Updates c-ares to version v1.34.6

Compare:
d3a507e920...3ac47ee46e

Auto-updated by [this
workflow](https://github.com/oven-sh/bun/actions/workflows/update-cares.yml)

Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-12-15 11:36:18 -08:00
Alistair Smith
fa996ad1a8 fix: Support @types/node@25.0.2 (#25532)
### What does this PR do?

CI failed again because of a change in @types/node

### How did you verify your code works?

bun-types.test.ts passes
2025-12-15 11:29:04 -08:00
robobun
ed1d6e595c skip HandleScope GC test on ASAN builds (#25523)
## Summary
- Skip `test_handle_scope_gc` test on ASAN builds due to false positives
from dynamic library boundary crossing (Bun built with ASAN+UBSAN,
native addon without sanitizers)

## Test plan
- CI should pass on ASAN builds with this test skipped
- Non-ASAN builds continue to run the test normally

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 11:11:11 -08:00
Jarred Sumner
7c98b0f440 Deflake test/js/node/test/parallel/test-fs-readdir-stack-overflow.js 2025-12-14 22:49:51 -08:00
robobun
f0d18d73c9 Disable CodeRabbit issue enrichment (#25520) 2025-12-14 14:56:58 -08:00
Ciro Spaciari
a5712b92b8 Fix 100% CPU usage with idle WebSocket connections on macOS (kqueue) (#25475)
### What does this PR do?

Fixes a bug where idle WebSocket connections would cause 100% CPU usage
on macOS and other BSD systems using kqueue.

**Root cause:** The kqueue event filter comparison was using bitwise AND
(`&`) instead of equality (`==`) when checking the filter type. Combined
with missing `EV_ONESHOT` flags on writable events, this caused the
event loop to continuously spin even when no actual I/O was pending.

**Changes:**
1. **Fixed filter comparison** in `epoll_kqueue.c`: Changed `filter &
EVFILT_READ` to `filter == EVFILT_READ` (same for `EVFILT_WRITE`). The
filter field is a value, not a bitmask.

2. **Added `EV_ONESHOT` flag** to writable events: kqueue writable
events now use one-shot mode to prevent continuous triggering.

3. **Re-arm writable events when needed**: After a one-shot writable
event fires, the code now properly updates the poll state and re-arms
the writable event if another write is still pending.

### How did you verify your code works?

Added a test that:
1. Creates a TLS WebSocket server and client
2. Sends messages then lets the connection sit idle
3. Measures CPU usage over 3 seconds
4. Fails if CPU usage exceeds 2% (expected is ~0.XX% when idle)
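
A rough sketch of the measurement in step 3, assuming `process.cpuUsage()` sampling (the real test also sets up the TLS WebSocket pair first):

```ts
// Sample CPU time across a 3-second idle window and convert to a percentage.
const before = process.cpuUsage();
await Bun.sleep(3000);
const delta = process.cpuUsage(before);
const cpuPercent = ((delta.user + delta.system) / 1000 / 3000) * 100;
if (cpuPercent > 2) throw new Error(`idle CPU too high: ${cpuPercent.toFixed(2)}%`);
```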
2025-12-12 11:10:22 -08:00
robobun
7dcd49f832 fix(install): only apply default trusted dependencies to npm packages (#25163)
## Summary
- The default trusted dependencies list should only apply to packages
installed from npm
- Non-npm sources (file:, link:, git:, github:) now require explicit
trustedDependencies
- This prevents malicious packages from spoofing trusted names through
local paths or git repos

## Test plan
- [x] Added test: file: dependency named "esbuild" does NOT auto-run
postinstall scripts
- [x] Added test: file: dependency runs scripts when explicitly added to
trustedDependencies
- [x] Verified tests fail with system bun (old behavior) and pass with
new build
- [x] Build compiles successfully

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-12-11 17:44:41 -08:00
robobun
c59a6997cd feat(bundler): add statically-analyzable dead-code elimination via feature flags (#25462)
## Summary
- Adds `import { feature } from "bun:bundle"` for compile-time feature
flag checking
- `feature("FLAG_NAME")` calls are replaced with `true`/`false` at
bundle time
- Enables dead-code elimination through `--feature=FLAG_NAME` CLI
argument
- Works in `bun build`, `bun run`, and `bun test`
- Available in both CLI and `Bun.build()` JavaScript API

## Usage

```ts
import { feature } from "bun:bundle";

if (feature("SUPER_SECRET")) {
  console.log("Secret feature enabled!");
} else {
  console.log("Normal mode");
}
```

### CLI
```bash
# Enable feature during build
bun build --feature=SUPER_SECRET index.ts

# Enable at runtime
bun run --feature=SUPER_SECRET index.ts

# Enable in tests
bun test --feature=SUPER_SECRET
```

### JavaScript API
```ts
await Bun.build({
  entrypoints: ['./index.ts'],
  outdir: './out',
  features: ['SUPER_SECRET', 'ANOTHER_FLAG'],
});
```

## Implementation
- Added `bundler_feature_flags` (as `*const bun.StringSet`) to
`RuntimeFeatures` and `BundleOptions`
- Added `bundler_feature_flag_ref` to Parser struct to track the
`feature` import
- Handle `bun:bundle` import at parse time (similar to macros) - capture
ref, return empty statement
- Handle `feature()` calls in `e_call` visitor - replace with boolean
based on flags
- Wire feature flags through CLI arguments and `Bun.build()` API to
bundler options
- Added `features` option to `JSBundler.zig` for JavaScript API support
- Added TypeScript types in `bun.d.ts`
- Added documentation to `docs/bundler/index.mdx`

## Test plan
- [x] Basic feature flag enabled/disabled tests (both CLI and API
backends)
- [x] Multiple feature flags test
- [x] Dead code elimination verification tests
- [x] Error handling for invalid arguments
- [x] Runtime tests with `bun run --feature=FLAG`
- [x] Test runner tests with `bun test --feature=FLAG`
- [x] Aliased import tests (`import { feature as checkFeature }`)
- [x] Ternary operator DCE tests
- [x] Tests use `itBundled` with both `backend: "cli"` and `backend:
"api"`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Alistair Smith <hi@alistair.sh>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-11 17:44:14 -08:00
Alistair Smith
1d50af7fe8 @types/bun: Update to @types/node@25, fallback to PropertyKey in test expect matchers when keyof unknown is used (#25460)
More accurately: developers cannot pass any value when an expect matcher's
key type resolves to `never`, which is easy to run into with the
`toContainKey*` matchers. Falling back to `PropertyKey` when this happens
is a sensible default.
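
A type-level illustration of the fallback (runnable with `bun test`; the matcher call itself is just an example):

```ts
import { expect, test } from "bun:test";

// With an `unknown`-typed value, `keyof unknown` is `never`, so without the
// PropertyKey fallback no key argument would type-check here.
test("toContainKey accepts a PropertyKey for unknown values", () => {
  const payload: unknown = { id: 1 };
  expect(payload).toContainKey("id");
});
```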

### What does this PR do?

fixes #25456, cc @MonsterDeveloper
fixes #25461

### How did you verify your code works?

bun types integration test
2025-12-10 18:15:55 -08:00
Jarred Sumner
98cee5a57e Improve Bun.stringWidth accuracy and robustness (#25447)
This PR significantly improves `Bun.stringWidth` to handle a wider
variety of Unicode characters and escape sequences correctly.

## Zero-width character handling

Added support for many previously unhandled zero-width characters:
- Soft hyphen (U+00AD)
- Word joiner and invisible operators (U+2060-U+2064)
- Lone surrogates (U+D800-U+DFFF)
- Arabic formatting characters (U+0600-U+0605, U+06DD, U+070F, U+08E2)
- Indic script combining marks (Devanagari through Malayalam)
- Thai and Lao combining marks
- Combining Diacritical Marks Extended and Supplement
- Tag characters (U+E0000-U+E007F)

## ANSI escape sequence handling

### CSI sequences
- Now properly handles ALL CSI final bytes (0x40-0x7E), not just `m`
- This means cursor movement (A/B/C/D), erase (J/K), scroll (S/T), and
other CSI commands are now correctly excluded from width calculation

### OSC sequences
- Added support for OSC sequences (ESC ] ... BEL/ST)
- OSC 8 hyperlinks are now properly handled
- Supports both BEL (0x07) and ST (ESC \) terminators

### ESC ESC fix
- Fixed state machine bug where `ESC ESC` would incorrectly reset state
- Now correctly handles consecutive ESC characters

## Emoji handling

Added proper grapheme-aware emoji width calculation:
- Flag emoji (regional indicator pairs) → width 2
- Skin tone modifiers → width 2
- ZWJ sequences (family, professions, etc.) → width 2
- Keycap sequences → width 2
- Variation selectors (VS15 for text, VS16 for emoji presentation)
- Uses ICU's `UCHAR_EMOJI` property for accurate emoji detection

## Test coverage

Added comprehensive test suite with **94 tests** covering:
- All zero-width character categories
- All CSI final bytes
- OSC sequences with various terminators
- Emoji edge cases (flags, skin tones, ZWJ, keycaps, variation
selectors)
- East Asian width (CJK, fullwidth, halfwidth katakana)
- Indic and Thai script combining marks
- Fuzzer-like stress tests for robustness

## Breaking changes

This is a behavior change - `stringWidth` will return different values
for some inputs. However, the new values are more accurate
representations of terminal display width:

| Input | Old | New | Why |
|-------|-----|-----|-----|
| Flag emoji 🇺🇸 | 1 | 2 | Flags display as 2 cells |
| Skin tone 👋🏽 | 4 | 2 | Emoji + modifier = 1 grapheme |
| ZWJ family 👨‍👩‍👧 | 8 | 2 | ZWJ sequence = 1 grapheme |
| Word joiner U+2060 | 1 | 0 | Invisible character |
| OSC 8 hyperlinks | counted URL | just visible text | URLs are invisible |
| Cursor movement ESC[5A | counted | 0 | Control sequence |
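
A few spot checks matching the table above:

```ts
console.log(Bun.stringWidth("🇺🇸"));          // 2 (flag emoji)
console.log(Bun.stringWidth("👋🏽"));          // 2 (emoji + skin tone modifier)
console.log(Bun.stringWidth("\u2060"));       // 0 (word joiner is invisible)
console.log(Bun.stringWidth("\x1b[5Ahello")); // 5 (cursor-movement sequence excluded)
```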

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
2025-12-10 16:17:57 -08:00
Elfayeur - Remi
ac0099ebc6 docs: fix code highlight (#25411)
### What does this PR do?

Fix code highlight line, see problem:
<img width="684" height="663" alt="Screenshot 2025-12-08 at 11 40 39 AM"
src="https://github.com/user-attachments/assets/9894a7b7-ddd6-4bad-b7de-3e0e55ecd8cd"
/>


### How did you verify your code works?

Co-authored-by: Michael H <git@riskymh.dev>
2025-12-10 16:06:45 +11:00
Ryan Machado
64146d47f9 Update os-signals.mdx (#25372)
### What does this PR do?

Remove a loose section from os-signals documentation

### How did you verify your code works?

Just a small documentation change.

Co-authored-by: Michael H <git@riskymh.dev>
2025-12-10 16:06:39 +11:00
robobun
a2d8b75962 fix(yaml): quote strings ending with colons (#25443)
## Summary
- Fixes strings ending with colons (e.g., `"tin:"`) not being quoted in
YAML.stringify output
- This caused YAML.parse to fail with "Unexpected token" when parsing
the output back
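
A minimal round-trip sketch of the failing case:

```ts
// "tin:" previously serialized unquoted, so parsing the output back failed.
const text = Bun.YAML.stringify({ name: "tin:" });
console.log(text);                 // the trailing-colon value is now quoted
console.log(Bun.YAML.parse(text)); // { name: "tin:" }
```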

## Test plan
- Added regression tests in `test/regression/issue/25439.test.ts`
- Verified round-trip works for various strings ending with colons
- Ran existing YAML tests to ensure no regressions

Fixes #25439

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-09 18:20:26 -08:00
Kyle
a15fe76bf2 add brotli and zstd to CompressionStream and DecompressionStream types (#25374)
### What does this PR do?

- removes the `Unimplemented in Bun` comment on `CompressionStream` and
`DecompressionStream`
- updates the types for `CompressionStream` and `DecompressionStream` to
add a new internal `CompressionFormat` type to the constructor, which
adds `brotli` and `zstd` to the union
- adds tests for brotli and zstd usage
- adds lib.dom.d.ts exclusions for brotli and zstd as these don't exist
in the DOM version of CompressionFormat
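
A short sketch of the newly-typed formats in use (brotli works the same way):

```ts
// Round-trip a string through zstd compression using the Web streams API.
const source = new Blob(["hello bun"]).stream();
const roundTrip = source
  .pipeThrough(new CompressionStream("zstd"))
  .pipeThrough(new DecompressionStream("zstd"));
console.log(await new Response(roundTrip).text()); // "hello bun"
```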

fixes #25367

### How did you verify your code works?

typechecks and tests
2025-12-09 17:56:55 -08:00
robobun
8dc084af5f fix(fetch): ignore proxy object without url property (#25414)
## Summary
- When a URL object is passed as the proxy option, or when a proxy
object lacks a "url" property, ignore it instead of throwing an error
- This fixes a regression introduced in 1.3.4 where libraries like taze
that pass URL objects as proxy values would fail

## Test plan
- Added test: "proxy as URL object should be ignored (no url property)"
- passes a URL object directly as proxy
- Updated test: "proxy object without url is ignored (regression
#25413)" - proxy object with headers but no url
- Updated test: "proxy object with null url is ignored (regression
#25413)" - proxy object where url is null
- All 29 proxy tests pass

Fixes #25413

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-09 12:31:45 -08:00
Alistair Smith
2028e21d60 fmt bun.d.ts 2025-12-08 18:00:09 -08:00
Ciro Spaciari
f25ea59683 feat(s3): add Content-Disposition support for S3 uploads (#25363)
### What does this PR do?
- Add `contentDisposition` option to S3 file uploads to control the
`Content-Disposition` HTTP header
- Support passing `contentDisposition` through all S3 upload paths
(simple uploads, multipart uploads, and streaming uploads)
- Add TypeScript types for the new option
Fixes https://github.com/oven-sh/bun/issues/25362
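
A hedged sketch of the option in use, assuming it is accepted alongside existing per-upload options such as `type` (the exact call shape is an assumption, not taken from this PR):

```ts
import { s3 } from "bun";

// Assumed placement: contentDisposition next to other upload options like `type`.
const bytes = await Bun.file("latest.pdf").arrayBuffer();
await s3.file("reports/latest.pdf").write(bytes, {
  type: "application/pdf",
  contentDisposition: 'attachment; filename="latest.pdf"',
});
```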
### How did you verify your code works?
Test
2025-12-08 15:30:20 -08:00
Jarred Sumner
55c6afb498 Deflake test/js/bun/net/socket.test.ts 2025-12-08 11:16:26 -08:00
Jarred Sumner
0aca002161 Deflake test/js/bun/util/sleep.test.ts 2025-12-08 11:14:52 -08:00
robobun
4980736786 docs(server): fix satisfies Serve type in export default example (#25410) 2025-12-08 09:25:00 -08:00
robobun
3af0d23d53 docs: expand single-file executable file embedding documentation (#25408)
## Summary

- Expanded documentation for embedding files in single-file executables
with `with { type: "file" }`
- Added clear explanation of how the import attribute works and path
transformation at build time
- Added examples for reading embedded files with both `Bun.file()` and
Node.js `fs` APIs
- Added practical examples: JSON configs, HTTP static assets, templates,
binary files (WASM, fonts)
- Improved `Bun.embeddedFiles` section with a dynamic asset server
example
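
A minimal sketch of the pattern those examples cover (the file name is illustrative):

```ts
// The import attribute embeds config.json into the compiled binary, and the
// imported value is the path it can be read from at runtime.
import configPath from "./config.json" with { type: "file" };

const config = await Bun.file(configPath).json();
console.log(config);
```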

## Test plan

- [x] Verified all code examples compile and run correctly with `bun
build --compile`
- [x] Tested `Bun.file()` reads embedded files correctly
- [x] Tested `node:fs` APIs (`readFileSync`, `promises.readFile`,
`stat`) work with embedded files
- [x] Tested `Bun.embeddedFiles` returns correct blob array
- [x] Tested `--asset-naming` flag removes content hashes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-07 18:43:00 -08:00
Hamidreza Hanafi
9c96937329 fix(transpiler): preserve simplified property values in object spread expressions (#25401)
Fixes #25398

### What does this PR do?

Fixes a bug where object expressions with spread properties and nullish
coalescing to empty objects (e.g., `k?.x ?? {}`) would produce invalid
JavaScript output like `k?.x ?? ` (missing `{}`).

### Root Cause

In `src/ast/SideEffects.zig`, the `simplifyUnusedExpr` function handles
unused object expressions with spread properties. When simplifying
property values:

1. The code creates a mutable copy `prop` from the original `prop_`
2. When a property value is simplified (e.g., `k?.x ?? {}` → `k?.x`), it
updates `prop.value`
3. **Bug:** The code then wrote back `prop_` (the original) instead of
`prop` (the modified copy)

Because `simplifyUnusedExpr` mutates the AST in place when handling
nullish coalescing (setting `bin.right` to empty), the original `prop_`
now contained an expression with `bin.right` as an empty/missing
expression, resulting in invalid output.

### How did you verify your code works?
- Added regression test in `test/regression/issue/25398.test.ts`
- Verified the original reproduction case passes
- Verified existing CommonJS tests continue to pass
- Verified test fails with system bun and passes with the fix
2025-12-07 16:33:37 -08:00
robobun
d4eaaf8363 docs: document autoload options for standalone executables (#25385)
## Summary

- Document new default behavior in v1.3.4: `tsconfig.json` and
`package.json` loading is now disabled by default for standalone
executables
- Add documentation for `--compile-autoload-tsconfig` and
`--compile-autoload-package-json` CLI flags
- Document all four JavaScript API options: `autoloadTsconfig`,
`autoloadPackageJson`, `autoloadDotenv`, `autoloadBunfig`
- Note that `.env` and `bunfig.toml` may also be disabled by default in
a future version

## Test plan

- [ ] Review rendered documentation for accuracy and formatting

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-07 15:44:06 -08:00
Jarred Sumner
e1aa437694 Bump 2025-12-07 15:42:23 -08:00
robobun
73c3f0004f fix(vm): delete internal Loader property from node:vm global object (#25397) 2025-12-07 13:29:32 -08:00
robobun
b80cb629c6 fix(docs): correct mock documentation examples (#25384)
## Summary

- Fix `mock.mock.args` to `mock.mock.calls` in mock-functions.mdx (the
`.args` property doesn't exist)
- Fix mock.restore example to use module methods instead of spy
functions (calling spy functions after restore returns `undefined`)
- Add missing `vi` import in Vitest compatibility example
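
For reference, the corrected property in use:

```ts
import { expect, mock, test } from "bun:test";

test("recorded calls live on `.mock.calls`", () => {
  const double = mock((n: number) => n * 2);
  double(21);
  expect(double.mock.calls[0]).toEqual([21]); // `.mock.args` does not exist
});
```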

## Test plan

- [x] Verified each code block works by running tests against the debug
build

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-06 22:17:51 -08:00
Jarred Sumner
8773f7ab65 Delete TODO.md 2025-12-06 19:58:21 -08:00
Dylan Conway
5eb2145b31 fix(compile): use 8-byte header for embedded section to ensure bytecode alignment (#25377)
## Summary
- Change the size header in embedded Mach-O and PE sections from `u32`
(4 bytes) to `u64` (8 bytes)
- Ensures the data payload starts at an 8-byte aligned offset, which is
required for the bytecode cache

## Test plan
- [x] Test standalone compilation on macOS
- [ ] Test standalone compilation on Windows

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-06 16:37:09 -08:00
Jarred Sumner
cde167cacd Revert "Add Tanstack Start to bun init (#24648)"
Adding a 260 KB bun header image is not a good use of binary size

This reverts commit 830fd9b0ae.
2025-12-05 18:32:51 -08:00
robobun
6ce419d3f8 fix(napi): napi_typeof returns napi_object for String objects (#25365)
## Summary

- Fix `napi_typeof` to return `napi_object` for boxed String objects
(`new String("hello")`) instead of incorrectly returning `napi_string`
- Add regression test for boxed primitive objects (String, Number,
Boolean)

The issue was that `StringObjectType` and `DerivedStringObjectType` JSC
cell types were falling through to return `napi_string`, but these
represent object wrappers around strings, not primitive strings.

## Test plan

- [x] `bun bd test test/napi/napi.test.ts -t "napi_typeof"` passes
- [x] Test fails with `USE_SYSTEM_BUN=1` (confirming the bug exists in
released version)

Fixes #25351

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-05 18:27:06 -08:00
Alistair Smith
05508a627d Reapply "use event.message when no event.error in HMR during event" (#25360)
This reverts commit b4c8379447.

### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-05 17:38:56 -08:00
robobun
23383b32b0 feat(compile): add --compile-autoload-tsconfig and --compile-autoload-package-json flags (#25340)
## Summary

By default, standalone executables no longer load `tsconfig.json` and
`package.json` at runtime. This improves startup performance and
prevents unexpected behavior from config files in the runtime
environment.

- Added `--compile-autoload-tsconfig` / `--no-compile-autoload-tsconfig`
CLI flags (default: false)
- Added `--compile-autoload-package-json` /
`--no-compile-autoload-package-json` CLI flags (default: false)
- Added `autoloadTsconfig` and `autoloadPackageJson` options to the
`Bun.build()` compile config
- Flags are stored in `StandaloneModuleGraph.Flags` and applied at
runtime boot

This follows the same pattern as the existing
`--compile-autoload-dotenv` and `--compile-autoload-bunfig` flags.
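
A hedged sketch of the JS API side; nesting the options under `compile` follows the PR's mention of the `Bun.build()` compile config and is otherwise an assumption:

```ts
// Assumed shape: autoload options on Bun.build's compile config.
await Bun.build({
  entrypoints: ["./cli.ts"],
  compile: {
    autoloadTsconfig: false,    // matches the new default
    autoloadPackageJson: false, // matches the new default
  },
});
```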

## Test plan

- [x] Added tests in `test/bundler/bundler_compile_autoload.test.ts`
- [x] Verified standalone executables work correctly with runtime config
files that differ from compile-time configs
- [x] Verified the new CLI flags are properly parsed and applied
- [x] Verified the JS API options work correctly

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-05 14:43:53 -08:00
eroderust
0d5a7c36ed chore: remove duplicate words in comment (#25347) 2025-12-05 11:19:47 -08:00
Alistair Smith
b4c8379447 Revert "use event.message when no event.error in HMR during event"
This reverts commit 438aaf9e95.
2025-12-05 11:16:52 -08:00
Alistair Smith
438aaf9e95 use event.message when no event.error in HMR during event 2025-12-05 11:14:45 -08:00
robobun
4d60b6f69d docs: clarify SQLite embed example requires existing database (#25329)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-03 22:02:39 -08:00
pfg
e9e93244cb remove CMakeCache before building (#24860)
So it doesn't cache flags that are passed to the build
2025-12-01 22:02:46 -08:00
pfg
800a937cc2 Add fake timers for bun:test (#23764)
Fixes ENG-21288

TODO: Test with `@testing-library/react` `waitFor`

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-01 21:59:11 -08:00
Lydia Hallie
830fd9b0ae Add Tanstack Start to bun init (#24648)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-01 21:05:47 -08:00
Meghan Denny
fe0aba79f4 test: add regression tests for building docker containers (#25210) 2025-12-01 20:20:06 -08:00
sdm2345
a4aaec5b2f Fix http.Agent connection pool not reusing connections (#24351)
This fixes a critical bug where `http.Agent` with `keepAlive: true` was
not reusing connections, causing a 66% performance degradation compared
to Node.js. Every request was establishing a new TCP/TLS connection
instead of reusing existing ones.

**Root Cause:**

Three independent bugs were causing the connection pool to fail:

1. **TypeScript layer** (`src/js/node/_http_client.ts:271`)
   - Reading wrong property: `keepalive` instead of `keepAlive`
   - User's `keepAlive: true` setting was being ignored

2. **Request header handling** (`src/http.zig:591`)
   - Only handled `Connection: close`, ignored `Connection: keep-alive`
   - Missing explicit flag update for keep-alive header

3. **Response header handling** (`src/http.zig:2240`)
   - Used compile-time function `eqlComptime` at runtime (always failed)
   - Inverted logic: disabled pool when NOT "keep-alive"
   - Ignored case-sensitivity (should use `eqlIgnoreCase` per RFC 7230)

**Performance Impact:**

- **Before**: All requests ~940ms, stddev 33ms (0% improvement) 
- **After**: First request ~930ms, subsequent ~320ms (65.9% improvement)

- Performance now matches Node.js (65.9% vs 66.5% improvement)
- QPS increased from 4.2 to 12.2 req/s (190% improvement)

**Files Changed:**
- `src/js/node/_http_client.ts` - Fix property name (1 line)
- `src/http.zig` - Fix request/response header handling (5 lines)

Fixes #12053

### What does this PR do?

This PR fixes the HTTP connection pool by correcting three bugs:

1. **Fixes TypeScript property name**: Changes `this[kAgent]?.keepalive`
to `this[kAgent]?.keepAlive` to properly read the user's keepAlive
setting from http.Agent

2. **Adds keep-alive request header handling**: Explicitly sets
`disable_keepalive = false` when receiving `Connection: keep-alive`
header

3. **Fixes response header parsing**: 
- Replaces compile-time `strings.eqlComptime()` with runtime
`std.ascii.eqlIgnoreCase()`
- Corrects inverted logic to properly enable connection pool on
`Connection: keep-alive`
   - Makes header comparison case-insensitive per RFC 7230

All three bugs must be fixed together - any single bug would cause the
connection pool to fail.

### How did you verify your code works?

**Test 1: Minimal reproduction with 10 sequential HTTPS requests**
```typescript
const agent = new https.Agent({ keepAlive: true });
// Make 10 requests to https://api.example.com
```

Results:
- First request: 930ms (cold start with TCP/TLS handshake)
- Subsequent requests: ~320ms average (connection reused)
- **Improvement: 65.9%** (matches Node.js 66.5%)
- Verified across 3 repeated test runs for stability

**Test 2: Response header validation**
- Confirmed server returns `Connection: Keep-Alive`
- Verified Bun correctly parses and applies the header

**Test 3: Performance comparison**

| Runtime | First Request | Subsequent Avg | Improvement | QPS |
|---------|--------------|----------------|-------------|-----|
| Node.js v20.18.0 | 938ms | 314ms | **66.5%** | 12.2 |
| Bun v1.3.1 (broken) | 935ms | 942ms | -0.7%  | 4.2 |
| Bun v1.3.2 (fixed) | 930ms | 317ms | **65.9%**  | 12.2 |

Bun now performs identically to Node.js, confirming the connection pool
works correctly.
2025-12-01 17:31:15 -08:00
Meghan Denny
24bc8aa416 ci: remove ubuntu 24 (#25288)
redundant with 25
2025-12-01 17:01:14 -08:00
robobun
2ab6efeea3 fix(ffi): restore CString constructor functionality (#25257)
## Summary
- Fix regression where `new Bun.FFI.CString(ptr)` throws "function is
not a constructor"
- Pass the same function as both call and constructor callbacks for
CString

## Root Cause
PR #24910 replaced `jsc.createCallback` with `jsc.JSFunction.create` for
all FFI functions. However, `JSFunction.create` doesn't allow
constructor calls by default (it uses `callHostFunctionAsConstructor`
which throws). The old `createCallback` used `JSFFIFunction` which
allowed the same function to be called with `new`.

## Fix
Pass the same function as both the `implementation` and `constructor`
option to `JSFunction.create` for CString specifically. This allows `new
CString(ptr)` to work while keeping the refactoring from #24910.

Additionally, the `bun:ffi` module now replaces `Bun.FFI.CString` with
the proper JS CString class after loading, so users get the full class
with `.ptr`, `.byteOffset`, etc. properties.
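
A small sketch of the restored constructor path via the public `bun:ffi` exports:

```ts
import { CString, ptr } from "bun:ffi";

// Read a NUL-terminated C string back out of a buffer through its pointer.
const bytes = Buffer.from("hello\0", "utf8");
const str = new CString(ptr(bytes));
console.log(str.toString()); // "hello"
```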

## Test plan
- [x] Added regression test `test/regression/issue/25231.test.ts`
- [x] Test fails with `USE_SYSTEM_BUN=1` (v1.3.3), passes with fix
- [x] Verified reproduction case from issue works

Fixes #25231

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-01 15:47:27 -08:00
Amdadul Haq
6745bdaa85 Add protocol property to serve.d.ts (#25267) 2025-12-01 13:43:14 -08:00
Lydia Hallie
dce7a02f4d Docs: Minor fixes and improvements (#25284)
This PR addresses several issues opened for the docs:

- Add callout for SQLite caching behavior between prepare() and query()
- Fix SQLite types and fix deprecated exec to run
- Fix Secrets API example
- Update SolidStart guide
- Add bun upgrade guide
- Prefer `process.versions.bun` over `typeof Bun` for detection
- Document complete `bunx` flags
- Improve Nitro preset documentation for Nuxt

Fixes #23165, #24424, #24294, #25175, #18433, #16804, #22967, #22527,
#10560, #14744
2025-12-01 13:32:08 -08:00
Michael H
9c420c9eff fix production build for vscode extension (#25274) 2025-12-01 12:59:27 -08:00
github-actions[bot]
9c2ca4b8fd deps: update sqlite to 3.51.100 (#25243)
## What does this PR do?

Updates SQLite to version 3.51.100

Compare: https://sqlite.org/src/vdiff?from=3.51.0&to=3.51.100

Auto-updated by [this
workflow](https://github.com/oven-sh/bun/actions/workflows/update-sqlite3.yml)

Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-12-01 11:58:34 -08:00
robobun
e624f1e571 fix(jest): handle null SourceOrigin in jest.mock() to prevent crash (#25281)
## Summary
- Added null check for `sourceOrigin` before accessing its URL in
`jest.mock()`
- When `callerSourceOrigin()` returns null (e.g., when called with
invalid arguments), the code now safely returns early instead of
crashing

## Test plan
- [x] Added regression test `test/regression/issue/ENG-24434.test.ts`
- [x] `bun bd test test/regression/issue/ENG-24434.test.ts` passes

Fixes ENG-24434

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-01 11:54:03 -08:00
robobun
27381063b6 fix(windows): use GetConsoleCP() instead of GetConsoleOutputCP() for input codepage (#25252)
## Summary

- Fix typo where `GetConsoleOutputCP()` was called twice instead of
calling `GetConsoleCP()` for the input codepage
- Add missing `GetConsoleCP()` extern declaration in windows.zig

The code was saving the output codepage twice, meaning the input
codepage was never properly saved and thus couldn't be correctly
restored.

## Note

This fix corrects a bug in the codepage save/restore logic, but **may
not fully resolve the garbled text issue** in #25151. The garbled text
problem occurs when `bunx` (without `--bun`) runs a package via Node.js,
and that package tries to spawn `bun`. The error message from cmd.exe
gets garbled on non-English Windows systems.

Further investigation may be needed to determine if additional codepage
handling is required when spawning processes.

Related to #25151

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-30 23:11:33 -08:00
Michael H
9ca8de6eb9 vscode test runner add the new test functions to static analysis (#25256) 2025-11-30 17:31:17 -08:00
robobun
fdcfac6a75 fix(node:tls): use SSL_session_reused for TLSSocket.isSessionReused (#25258)
## Summary
Fixes `TLSSocket.isSessionReused()` to use BoringSSL's
`SSL_session_reused()` API instead of incorrectly checking if a session
was set.

The previous implementation returned `!!this[ksession]` which would
return `true` if `setSession()` was called, even if the session wasn't
actually reused by the SSL layer. This fix correctly uses the native SSL
API like Node.js does.

## Changes
- Added native `isSessionReused` function in Zig that calls
`SSL_session_reused()`
- Updated `TLSSocket.prototype.isSessionReused` to use the native
implementation
- Added regression tests

## Test plan
- [x] `bun bd test test/regression/issue/25190.test.ts` passes
- [x] `bun bd test test/js/node/tls/node-tls-connect.test.ts` passes

Fixes #25190

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-30 17:00:25 -08:00
robobun
ce1981c525 fix(node:assert): handle Number and Boolean wrappers in deepStrictEqual (#25201)
## Summary
- Fixes `assert.deepStrictEqual()` to properly compare Number and
Boolean wrapper objects
- Previously, `new Number(1)` and `new Number(2)` were incorrectly
considered equal because they have no enumerable properties
- Now correctly extracts and compares internal values using
`JSC::sameValue()`, then falls through to check own properties

## Test plan
- [x] Run `bun bd test test/regression/issue/24045.test.ts` - all 6
tests pass
- [x] Verify tests fail with system Bun (`USE_SYSTEM_BUN=1`) to confirm
fix validity
- [x] Verified behavior matches Node.js exactly (see table below)

## Node.js Compatibility

| Test Case | Node.js | Bun |
|-----------|---------|-----|
| Different Number values (`new Number(1)` vs `new Number(2)`) | throws | throws |
| Same Number values (`new Number(1)` vs `new Number(1)`) | equal | equal |
| 0 vs -0 (`new Number(0)` vs `new Number(-0)`) | throws | throws |
| NaN equals NaN (`new Number(NaN)` vs `new Number(NaN)`) | equal | equal |
| Different Boolean values (`new Boolean(true)` vs `new Boolean(false)`) | throws | throws |
| Same Boolean values | equal | equal |
| Number wrapper vs primitive (`new Number(1)` vs `1`) | throws | throws |
| Number vs Boolean wrapper | throws | throws |
| Same value, different own properties | throws | throws |
| Same value, same own properties | equal | equal |
| Different own property values | throws | throws |

## Example

Before (bug):
```javascript
assert.deepStrictEqual(new Number(1), new Number(2)); // passes incorrectly
```

After (fixed):
```javascript
assert.deepStrictEqual(new Number(1), new Number(2)); // throws AssertionError
```

Closes #24045

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-30 16:58:58 -08:00
Dylan Conway
cc3fc5a1d3 fix ENG-24015 (#25222)
### What does this PR do?
Ensures `ptr` is either a number or heap big int before converting to a
number.

also fixes ENG-24039
### How did you verify your code works?
Added a test
2025-11-29 19:13:32 -08:00
Dylan Conway
d83e0eb1f1 fix ENG-24017 (#25224)
### What does this PR do?
Fixes checking for exceptions when creating empty or used readable
streams

also fixes ENG-24038
### How did you verify your code works?
Added a test for creating empty streams
2025-11-29 19:13:06 -08:00
Dylan Conway
72b9525507 update bunfig telemetry docs (#25237)
### What does this PR do?

### How did you verify your code works?
2025-11-29 19:12:18 -08:00
robobun
0f7494569e fix(console): implement %j format specifier for JSON output (#25195)
## Summary
- Implements the `%j` format specifier for `console.log` and related
console methods
- `%j` outputs the JSON stringified representation of the value
- Previously, `%j` was not recognized and was left as literal text in
the output

## Test plan
- [x] Run `bun bd test test/regression/issue/24234.test.ts` - all 5
tests pass
- [x] Verify tests fail with system Bun (`USE_SYSTEM_BUN=1`) to confirm
fix validity
- [x] Manual verification: `console.log('%j', {foo: 'bar'})` outputs
`{"foo":"bar"}`

## Example

Before (bug):
```
$ bun -e "console.log('%j %s', {foo: 'bar'}, 'hello')"
%j [object Object] hello
```

After (fixed):
```
$ bun -e "console.log('%j %s', {foo: 'bar'}, 'hello')"
{"foo":"bar"} hello
```

Closes #24234

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-28 22:57:55 -08:00
Meghan Denny
9fd6b54c10 ci: fix windows git dependencies tests (#25213) 2025-11-28 22:56:54 -08:00
robobun
19acc4dcac fix(buffer): handle string allocation failures in encoding operations (#25214)
## Summary
- Add proper bounds checking for encoding operations that produce larger
output than input
- Handle allocation failures gracefully by returning appropriate errors
- Add defensive checks in string initialization functions

## Test plan
- Added test case for encoding operations with large buffers
- Verified existing buffer tests still pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-28 22:56:28 -08:00
Meghan Denny
56da7c4fd9 zig: address a VM todo (#23678) 2025-11-28 19:38:26 -08:00
Meghan Denny
5bdb8ec0cb all: update to debian 13 (#24055) [publish images] 2025-11-28 15:01:40 -08:00
Meghan Denny
4cf9b794c9 ci: update buildkite agent to v3.114.0 (#25127) [publish images] 2025-11-28 15:01:20 -08:00
Meghan Denny
998ec54da9 test: fix spacing in sql.test.ts (#24691) 2025-11-28 14:40:58 -08:00
Jarred Sumner
0305f3d4d2 feat(url): implement URLPattern API (#25168)
## Summary

Implements the [URLPattern Web
API](https://developer.mozilla.org/en-US/docs/Web/API/URLPattern) based
on WebKit's implementation. URLPattern provides declarative pattern
matching for URLs, similar to how regular expressions work for strings.

### Features

- **Constructor**: Create patterns from strings or `URLPatternInit`
dictionaries
- **`test()`**: Check if a URL matches the pattern (returns boolean)
- **`exec()`**: Extract matched groups from a URL (returns
`URLPatternResult` or null)
- **Pattern properties**: `protocol`, `username`, `password`,
`hostname`, `port`, `pathname`, `search`, `hash`
- **`hasRegExpGroups`**: Detect if the pattern uses custom regular
expressions

### Example Usage

```js
// Match URLs with a user ID parameter
const pattern = new URLPattern({ pathname: '/users/:id' });

pattern.test('https://example.com/users/123'); // true
pattern.test('https://example.com/posts/456'); // false

const result = pattern.exec('https://example.com/users/123');
console.log(result.pathname.groups.id); // "123"

// Wildcard matching
const filesPattern = new URLPattern({ pathname: '/files/*' });
const match = filesPattern.exec('https://example.com/files/image.png');
console.log(match.pathname.groups[0]); // "image.png"
```

## Implementation Notes

- Adapted from WebKit's URLPattern implementation
- Modified JS bindings to work with Bun's infrastructure (simpler
`convertDictionary` patterns, WTF::Variant handling)
- Added IsoSubspaces for proper GC integration

## Test Plan

- [x] 408 tests from Web Platform Tests pass
- [x] Tests fail with system Bun (URLPattern not defined), pass with
debug build
- [x] Manual testing of basic functionality

Fixes https://github.com/oven-sh/bun/issues/2286

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 00:04:30 -08:00
Henrique Fonseca
1006a4fac2 Fix incorrect file name in React component test example 🌟 (#25167)
# What does this PR do?

This PR fixes a small mistake in the documentation where the code
block for a React component test example was using the wrong filename.

It was previously labeled as `matchers.d.ts`, but it should be something
like `myComponent.test.tsx` to properly reflect a test file for a React
component using `@testing-library/react`.

This makes the example clearer and more accurate for developers using
Bun to test their React components.

# How did you verify your code works?

It's just docs, one single line. Please review and merge it when you
can!
2025-11-27 23:15:34 -08:00
Michael H
c7f7d9bb82 run fmt (#25148)
prettier released a new update which seems to have changed a few
formatting rules
2025-11-28 17:51:45 +11:00
robobun
37bce389a0 docs: document inlining process.env.* values in static HTML bundling (#25084)
## Summary

- Add documentation for the `env` option that inlines `process.env.*`
values in frontend code when bundling HTML files
- Document runtime configuration via `bunfig.toml` `[serve.static]`
section for `bun ./index.html`
- Document production build configuration via CLI (`--env=PUBLIC_*`) and
`Bun.build` API (`env: "PUBLIC_*"`)
- Explain prefix filtering to avoid exposing sensitive environment
variables
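
The `Bun.build` form described above, as a short sketch:

```ts
// Inline only PUBLIC_-prefixed process.env values into the client bundle.
await Bun.build({
  entrypoints: ["./index.html"],
  outdir: "./dist",
  env: "PUBLIC_*",
});
```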

## Test plan

- [x] Verify documentation renders correctly in local preview
- [x] Cross-reference with existing `env` documentation in
bundler/index.mdx

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Michael H <git@riskymh.dev>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 17:51:11 +11:00
robobun
bab583497c fix(cli): correct --dry-run help text for bun publish (#25137)
## Summary
- Fix `bun publish --help` showing incorrect `--dry-run` description
("Don't install anything" → "Perform a dry run without making changes")
- The `--dry-run` flag is in a shared params array used by multiple
commands, so the new generic message works for all of them

Fixes #24806

## Test plan
- [x] Verify `bun publish --help` shows "Perform a dry run without
making changes" for --dry-run
- [x] Regression test added that validates the correct help text is
shown
- [x] Test passes with debug build, fails with system bun (validating it
tests the right thing)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 18:12:07 -08:00
robobun
a83fceafc7 fix(http2): return server from setTimeout for method chaining (#25138)
## Summary
- Make `Http2Server.setTimeout()` and `Http2SecureServer.setTimeout()`
return `this` to enable method chaining
- Matches Node.js behavior where `server.setTimeout(1000).listen()`
works
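
The chained form this enables, roughly:

```ts
import http2 from "node:http2";

// setTimeout() now returns the server, so the Node-style chain works.
const server = http2.createServer().setTimeout(1000).listen(0, () => {
  server.close();
});
```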

Fixes #24924

## Test plan
- [x] Test that `Http2Server.setTimeout()` returns server instance
- [x] Test that `Http2SecureServer.setTimeout()` returns server instance
- [x] Test method chaining works (e.g.,
`server.setTimeout(1000).close()`)
- [x] Tests pass with debug build, fail with system bun

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 16:46:15 -08:00
robobun
ef8eef3df8 fix(http): stricter validation in chunked encoding parser (#25159)
## Summary
- Adds stricter validation for chunk boundaries in the HTTP chunked
transfer encoding parser
- Ensures conformance with RFC 9112 requirements for chunk formatting
- Adds additional test coverage for chunked encoding edge cases

## Test plan
- Added new tests in `test/js/bun/http/request-smuggling.test.ts`
- All existing HTTP tests pass
- `bun bd test test/js/bun/http/request-smuggling.test.ts` passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-27 16:29:35 -08:00
robobun
69b571da41 Delete claude.yml workflow (#25157) 2025-11-27 12:26:50 -08:00
robobun
908ab9ce30 feat(fetch): add proxy object format with headers support (#25090)
## Summary

- Extends `fetch()` proxy option to accept an object format: `proxy: {
url: string, headers?: Headers }`
- Allows sending custom headers to the proxy server (useful for proxy
authentication, custom routing headers, etc.)
- Headers are sent in CONNECT requests (for HTTPS targets) and direct
proxy requests (for HTTP targets)
- User-provided `Proxy-Authorization` header overrides auto-generated
credentials from URL

## Usage

```typescript
// Old format (still works)
fetch(url, { proxy: "http://proxy.example.com:8080" });

// New object format with headers
fetch(url, {
  proxy: {
    url: "http://proxy.example.com:8080",
    headers: {
      "Proxy-Authorization": "Bearer token",
      "X-Custom-Proxy-Header": "value"
    }
  }
});
```

## Test plan

- [x] Test proxy object with url string works same as string proxy
- [x] Test proxy object with headers sends headers to proxy (HTTP
target)
- [x] Test proxy object with headers sends headers in CONNECT request
(HTTPS target)
- [x] Test proxy object with Headers instance
- [x] Test proxy object with empty headers
- [x] Test proxy object with undefined headers
- [x] Test user-provided Proxy-Authorization overrides URL credentials
- [x] All existing proxy tests pass (25 total)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-26 15:11:45 -08:00
robobun
43c46b1f77 fix(FormData): throw error instead of assertion failure on very large input (#25006)
## Summary

- Fix crash in `FormData.from()` when called with very large ArrayBuffer
input
- Add length check in C++ `toString` function against both Bun's
synthetic limit and WebKit's `String::MaxLength`
- For UTF-8 tagged strings, use simdutf to calculate actual UTF-16
length only when byte length exceeds the limit

## Root Cause

When `FormData.from()` was called with a very large ArrayBuffer (e.g.,
`new Uint32Array(913148244)` = ~3.6GB), the code would crash with:

```
ASSERTION FAILED: data.size() <= MaxLength
vendor/WebKit/Source/WTF/wtf/text/StringImpl.h(886)
```

The `toString()` function in `helpers.h` was only checking against
`Bun__stringSyntheticAllocationLimit` (which defaults to ~4GB), but not
against WebKit's `String::MaxLength` (INT32_MAX, ~2GB). When the input
exceeded `String::MaxLength`, `createWithoutCopying()` would fail with
an assertion.

## Changes

1. **helpers.h**: Added `|| str.len > WTF::String::MaxLength` checks to
all three code paths in `toString()`:
- UTF-8 tagged pointer path (with simdutf length calculation only when
needed)
   - External pointer path
   - Non-copying creation path

2. **url.zig**: Reverted the incorrect Zig-side check (UTF-8 byte length
!= UTF-16 character length)

## Test plan

- [x] Added test that verifies FormData.from with oversized input
doesn't crash
- [x] Verified original crash case now returns empty FormData instead of
crashing:
  ```js
  const v3 = new Uint32Array(913148244);
  FormData.from(v3); // No longer crashes
  ```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-26 13:46:08 -08:00
robobun
a0c5f3dc69 fix(mmap): use coerceToInt64 for offset/size to prevent assertion failure (#25101)
## Summary

- Fix assertion failure in `Bun.mmap` when `offset` or `size` options
are non-numeric values
- Add validation to reject negative `offset`/`size` with clear error
messages

Minimal reproduction: `Bun.mmap("", { offset: null });`

## Root Cause

`Bun.mmap` was calling `toInt64()` directly on the `offset` and `size`
options without validating they are numbers first. `toInt64()` has an
assertion that the value must be a number or BigInt, which fails when
non-numeric values like `null` or functions are passed.

## Test plan

- [x] Added tests for negative offset/size rejection
- [x] Added tests for non-number inputs (null, undefined)
- [x] `bun bd test test/js/bun/util/mmap.test.js` passes

Closes ENG-22413

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-26 13:37:41 -08:00
robobun
5965ff18ea fix(test): fix assertion failure in expect.extend with non-JSFunction callables (#25099)
## Summary

- Fix debug assertion failure in `JSWrappingFunction` when
`expect.extend()` is called with objects containing non-`JSFunction`
callables
- The crash occurred because `jsCast<JSFunction*>` was used, which
asserts the value inherits from `JSFunction`, but callable class
constructors (like `Expect`) inherit from `InternalFunction` instead

## Changes

- Change `JSWrappingFunction` to store `JSObject*` instead of
`JSFunction*`
- Use `jsDynamicCast` instead of `jsCast` in `getWrappedFunction`
- Use `getObject()` instead of `jsCast` in `create()`

## Reproduction

```js
const jest = Bun.jest();
jest.expect.extend(jest);
```

Before fix (debug build):
```
ASSERTION FAILED: !from || from->JSCell::inherits(std::remove_pointer<To>::type::info())
JSCast.h(40) : To JSC::jsCast(From *) [To = JSC::JSFunction *, From = JSC::JSCell]
```

After fix: Properly throws `TypeError: expect.extend: 'jest' is not a
valid matcher`

## Test plan

- [x] Added regression test
`test/regression/issue/fuzzer-ENG-22942.test.ts`
- [x] Existing `expect-extend.test.js` tests pass (27 tests)
- [x] Build succeeds

Fixes ENG-22942

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-26 13:34:02 -08:00
robobun
44f2328111 fix(fs.access): handle Windows device paths correctly (#25023)
## Summary

Fixes #23292

`fs.access()` and `fs.accessSync()` threw EUNKNOWN (-134) when checking
named pipes on Windows (paths like `\\.\pipe\name`), but Node.js worked
fine.

**Repro:**
```ts
// Server creates pipe at \\.\pipe\bun-test
import net from 'net';
const server = net.createServer();
server.listen('\\\\.\\pipe\\bun-test');

// Client tries to check if pipe exists
import fs from 'fs';
fs.accessSync('\\\\.\\pipe\\bun-test', fs.constants.F_OK);
// Error: EUNKNOWN: unknown error, access '\\.\pipe\bun-test'
```

## Root Cause

The `osPathKernel32` function normalizes paths before passing to Windows
APIs. The normalization logic treats a single `.` as a "current
directory" component and removes it, so `\.\pipe\name` incorrectly
became `\pipe\name` - an invalid path.

## Solution

Detect Windows device paths (starting with `\\.\` or `\\?\`) and skip
normalization for these special paths, preserving the device prefix.

## Test Plan

- [x] Added regression test `test/regression/issue/23292.test.ts`
- [x] Test fails with system bun (v1.3.3): 3 failures (EUNKNOWN)
- [x] Test passes with fix: 4 pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-26 00:07:51 -08:00
robobun
85e0a723f3 Add analytics tracking for CPU profiling and heap snapshots (#25053)
## Summary

- Add `cpu_profile` and `heap_snapshot` counters to `Analytics.Features`
- Export `heap_snapshot` to C++ as `Bun__Feature__heap_snapshot`
- Increment `cpu_profile` when `--cpu-prof` flag is used
- Increment `heap_snapshot` in all heap snapshot creation locations:
  - `Bun.generateHeapSnapshot()`
  - `bun:jsc` `generateHeapSnapshotForDebugging()`
  - `console.takeHeapSnapshot()`
  - Internal `JSC__JSGlobalObject__generateHeapSnapshot()`

## Test plan

- [x] Build succeeds
- [x] Heap snapshot generation works
- [x] CPU profiling works with `--cpu-prof`
- [x] Existing tests pass: `test/js/bun/util/v8-heap-snapshot.test.ts`
- [x] Existing tests pass: `test/cli/run/cpu-prof.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-26 00:02:43 -08:00
robobun
d50c5a385f Add analytics tracking for HTTP client proxy usage (#25056)
## Summary
- Added `http_client_proxy` counter to `analytics.Features` struct
- Incremented counter in `ProxyTunnel.onOpen()` when proxy tunnel
connection opens successfully

This allows tracking HTTP client proxy usage in analytics/crash reports
alongside other features like `fetch`, `WebSocket`, `http_server`, etc.

## Test plan
- [x] Build completes successfully (`bun bd`)
- [x] Existing proxy tests pass (`bun bd test
test/js/bun/http/proxy.test.ts`)
- [x] Counter is properly integrated into the analytics framework

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-26 00:02:26 -08:00
Jarred Sumner
bb7641b3b0 fix: use GC-safe operations when populating error stack traces (#25096)
## Summary
- Adds `FinalizerSafety` enum to control whether `functionName` uses
GC-safe operations when iterating over stack traces
- Uses the enum in `populateStackFrameMetadata` to choose between the
richer callee-based path vs the GC-safe path
- Updates call sites in `ZigException.cpp` and
`FormatStackTraceForJS.cpp`

Fixes https://github.com/oven-sh/bun/issues/25094
Fixes https://github.com/oven-sh/bun/issues/20399
Fixes https://github.com/oven-sh/bun/issues/22662

## Test plan
- [ ] Add regression test

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-11-25 23:55:41 -08:00
Meghan Denny
22f4dfae7c ci: fix release step
don't try to release freebsd yet
2025-11-25 22:27:02 -08:00
Meghan Denny
9fce97bac3 scripts: add freebsd support to bootstrap.sh (#24534) 2025-11-25 17:16:38 -08:00
Dylan Conway
2fb3aa8991 update minimumReleaseAge (#25057)
### What does this PR do?

### How did you verify your code works?
2025-11-25 11:06:24 -08:00
robobun
dc25d66b00 fix(Buffer): improve input validation in *Write methods (#25011)
## Summary
Improve bounds checking logic in Buffer.*Write methods (utf8Write,
base64urlWrite, etc.) to properly handle edge cases with non-numeric
offset and length arguments, matching Node.js behavior.

## Changes
- Handle non-numeric offset by converting to integer (treating invalid
values as 0)
- Clamp length to available buffer space instead of throwing
- Reorder operations to check buffer state after argument conversion

## Node.js Compatibility

This matches Node.js's C++ implementation in `node_buffer.cc`:

**Offset handling via `ParseArrayIndex`**
([node_buffer.cc:211-234](https://github.com/nodejs/node/blob/main/src/node_buffer.cc#L211-L234)):
```cpp
inline MUST_USE_RESULT Maybe<bool> ParseArrayIndex(Environment* env,
                                                   Local<Value> arg,
                                                   size_t def,
                                                   size_t* ret) {
  if (arg->IsUndefined()) {
    *ret = def;
    return Just(true);
  }

  int64_t tmp_i;
  if (!arg->IntegerValue(env->context()).To(&tmp_i))
    return Nothing<bool>();
  // ...
}
```
V8's `IntegerValue` converts non-numeric values (including NaN) to 0.

**Length clamping in `SlowWriteString`**
([node_buffer.cc:1498-1502](https://github.com/nodejs/node/blob/main/src/node_buffer.cc#L1498-L1502)):
```cpp
THROW_AND_RETURN_IF_OOB(ParseArrayIndex(env, args[2], 0, &offset));
THROW_AND_RETURN_IF_OOB(
    ParseArrayIndex(env, args[3], ts_obj_length - offset, &max_length));

max_length = std::min(ts_obj_length - offset, max_length);
```
Node.js clamps `max_length` to available buffer space rather than
throwing.

## Test plan
- Added regression tests for all `*Write` methods verifying proper
handling of edge cases
- Verified behavior matches Node.js
- All 447 buffer tests pass

fixes ENG-21985, fixes ENG-21863, fixes ENG-21751, fixes ENG-21984

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 23:34:36 -08:00
robobun
ae29340708 fix: prevent calling class constructors marked with call: false (#25047)
## Summary

Fixes a crash (ENG-22243) where calling class constructors marked with
`call: false` would create invalid instances instead of throwing an
error.

## Root Cause

When a class definition has `call: false` (like `Bun.RedisClient`), the
code generator was still allowing the constructor to be invoked without
`new`. This created invalid instances that caused a buffer overflow
during garbage collection.

## The Fix

Modified `src/codegen/generate-classes.ts` to properly check the `call`
property:
- When `call: false`: throws `TypeError: Class constructor X cannot be
invoked without 'new'`
- When `call: true`: behaves as before, allowing construction without
`new`

## Test Plan

- [x] Added regression test in `test/regression/issue/22243.test.ts`
- [x] Test fails with system bun (has the bug)
- [x] Test passes with fixed build
- [x] Verified `Bun.RedisClient()` now throws proper error
- [x] Verified `new Bun.RedisClient()` still works

## Before

```bash
$ bun -e "Bun.RedisClient()"
# Creates invalid instance, no error
```

## After

```bash
$ bun -e "Bun.RedisClient()"
TypeError: Class constructor RedisClient cannot be invoked without 'new'
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 23:34:16 -08:00
robobun
123ac9dc2d chore: disable comment-lint and labeled workflows (#25059)
## Summary
- Disable `comment-lint.yml` (C++ linter comment workflow)
- Disable `labeled.yml` (Issue/PR labeled automation workflow)

Workflows are disabled by renaming them to `.yml.disabled`.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 21:15:57 -08:00
Marko Vejnovic
48617563b5 ENG-21534: Satisfy aikido (#24880)
### What does this PR do?

- Bumps some packages
- Applies some _best practices_ in certain areas to minimize Aikido noise.

### How did you verify your code works?

CI.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-24 20:16:03 -08:00
robobun
cc393e43f2 fix(test): use putDirectMayBeIndex in spyOn for indexed property keys (#25020)
## Summary
- Fix `spyOn` crash when using indexed property keys (e.g., `spyOn(arr,
0)`)
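
A short sketch of the now-working call shape:

```ts
import { expect, spyOn, test } from "bun:test";

test("spyOn accepts an indexed property key", () => {
  const handlers = [() => "original"];
  const spy = spyOn(handlers, 0).mockReturnValue("mocked");
  expect(handlers[0]()).toBe("mocked");
  expect(spy).toHaveBeenCalledTimes(1);
});
```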

## Test plan
- [x] Added tests for `spyOn` with numeric indexed properties
- [x] Added tests for `spyOn` with string indexed properties (e.g.,
`"0"`)
- [x] All existing `spyOn` tests pass
- [x] Full `mock-fn.test.js` test suite passes

Fixes ENG-21973

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 19:27:33 -08:00
robobun
3f0681996f fix(indexOfLine): properly coerce non-number offset argument (#25021)
## Summary
- Fix assertion failure when `Bun.indexOfLine` is called with a
non-number offset argument
- Changed from `.to(u32)` to `.coerce(i32, globalThis)` for proper
JavaScript type coercion

## Test plan
- [x] Added regression test in `test/js/bun/util/index-of-line.test.ts`
- [x] `bun bd test test/js/bun/util/index-of-line.test.ts` passes

Closes ENG-21997

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-24 19:27:14 -08:00
robobun
e14d5593c5 fix(install): use >= instead of > for resolution bounds check (#25028)
## Summary

- Fix off-by-one error in `preprocessUpdateRequests` where the bounds
check used `>` instead of `>=` when validating package IDs from the
resolution buffer
- When `old_resolution == packages.len`, the `> packages.len` check does
not trigger the `continue`, so `resolutions_of_yore[old_resolution]` reads
out of bounds since valid indices are `0` to `packages.len - 1`
- This causes an internal assertion failure during `bun install` with
update requests

## The Bug

```zig
// BEFORE (buggy) - at lockfile.zig:484 and :522
if (old_resolution > old.packages.len) continue;
const res = resolutions_of_yore[old_resolution];  // OOB when old_resolution == packages.len

// AFTER (fixed)
if (old_resolution >= old.packages.len) continue;
const res = resolutions_of_yore[old_resolution];  // Now safe
```

## Crash Report

From
[bun.report](https://bun.report/1.3.3/wi1274e01cAggkggB+rt/F+pvBiw3rDqul/Doyi4Emzi5Ewj44FuvbgjMog00yDCYKERNEL32.DLLut0LCSntdll.dll4zijBA0eNrzzCtJLcpLzFFILC5OLSrJzM9TSEvMzCktSgUAiSkKPg/view):

```
panic: Internal assertion failure
- lockfile.zig:523: preprocessUpdateRequests
- install_with_manager.zig:605: installWithManager
- updatePackageJSONAndInstall.zig:340
Features: extracted_packages, text_lockfile
```

## Test plan

- [x] `bun run zig:check` passes
- [ ] CI passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 19:22:45 -08:00
taylor.fish
7335cb747b Fix conversions from JSValue to FFI pointer (#25045)
Fixes this issue, where two identical JS numbers could become two
different FFI pointers:

```c
// gcc -fpic -shared -o main.so
#include <stdio.h>

void* getPtr(void) {
    return (void*)123;
}

void printPtr(void* ptr) {
    printf("%zu\n", (size_t)ptr);
}
```

```js
import { dlopen, FFIType } from "bun:ffi";

const lib = dlopen("./main.so", {
  getPtr: { args: [], returns: FFIType.ptr },
  printPtr: { args: [FFIType.ptr], returns: FFIType.void },
});

const ptr = lib.symbols.getPtr();
console.log(`${typeof ptr} ${ptr}`);

const ptr2 = Number(String(ptr));
console.log(`${typeof ptr2} ${ptr2}`);

console.log(`pointers equal? ${ptr === ptr2}`);
lib.symbols.printPtr(ptr);
lib.symbols.printPtr(ptr2);
```

```console
$ bun main.js
number 123
number 123
pointers equal? true
123
18446744073709551615
```

Fixes #20072

(For internal tracking: fixes ENG-22327)
2025-11-24 17:34:39 -08:00
Marko Vejnovic
9c8575f975 bug(#19588): Fix undici bindings overflow (#25046)
### What does this PR do?

- Fixes an overflow that happened in the undici bindings generator.
- Establishes a pattern using `std::tuple` so this doesn't happen again.

Fixes:

-
https://bun-p9.sentry.io/issues/6597708115/?query=createUndiciInternalBinding&referrer=issue-stream
- https://github.com/oven-sh/bun/issues/19588

### How did you verify your code works?

CI.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-24 17:31:11 -08:00
robobun
0da132ef6d fix(test): skip grpc-js resolver tests that use unavailable domain (#25039)
## Summary

- Skip 2 tests that use `grpctest.kleinsch.com` (domain no longer
exists)
- Fix flaky "should not keep repeating failed resolutions" test

These tests were originally skipped when added in #14286, but were
accidentally un-skipped in #20051. This restores them to match upstream
grpc-node.

## To re-enable these tests in the future

Bun could set up its own DNS TXT record at `*.bun.sh`. According to the
[gRPC A2
spec](https://github.com/grpc/proposal/blob/master/A2-service-configs-in-dns.md):

**DNS Setup needed:**
1. A record: `grpctest.bun.sh` → any valid IP (e.g., `127.0.0.1`)
2. TXT record: `_grpc_config.grpctest.bun.sh` with value:
   ```
   grpc_config=[{"serviceConfig":{"loadBalancingPolicy":"round_robin","methodConfig":[{"name":[{"service":"MyService","method":"Foo"}],"waitForReady":true}]}}]
   ```

Then update the tests to use `grpctest.bun.sh` instead.

## Test plan

- [x] `bun bd test test/js/third_party/grpc-js/test-resolver.test.ts`
passes (20 pass, 3 skip, 1 todo, 0 fail)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 10:34:26 -08:00
Jarred Sumner
0d863ba237 Don't use globalThis.takeException when rejecting Promise values (#24986)
### What does this PR do?

We can't use `globalThis.takeException()` because it throws an out-of-memory
error in cases where we instead need to take the exception.

### How did you verify your code works?
2025-11-23 20:27:15 -08:00
Michael H
f31db64bd4 cross-platform bun bd (#24983)
closes #24969
2025-11-23 15:09:43 -08:00
robobun
ddcec61f59 fix: use >= instead of > for String.max_length() check (#24988)
## Summary

- Fixed boundary check in `String.zig` to use `>=` instead of `>` for
`max_length()` comparisons
- String creation must fail when the length is exactly equal to
`max_length()`, not only when it exceeds it
- This affects both `createExternal` and
`createExternalGloballyAllocated` functions

## Test plan

- Existing tests should continue to pass
- Strings with length exactly equal to `max_length()` will now be
properly rejected

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-23 01:42:32 -08:00
Dylan Conway
29051f9340 fix(Bun.plugin): return on invalid target error (#24945)
### What does this PR do?

### How did you verify your code works?

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-23 01:41:42 -08:00
robobun
7076fbbe68 fix(glob): fix typo that caused patterns like .*/* to escape cwd boundary (#24939)
## Summary

- Fixed a typo in `makeComponent` that incorrectly identified
2-character patterns starting with `.` (like `.*`) as `..` (DotBack)
patterns
- The condition checked `pattern[component.start] == '.'` twice instead
of checking both characters at positions 0 and 1
- This caused patterns like `.*/*` to be parsed as `../` + `*`, making
the glob walker traverse into parent directories

Fixes #24936
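
A quick illustration of the intended behavior (hypothetical paths, not from the test file):

```ts
import { Glob } from "bun";

// With the fix, ".*/*" only matches entries inside hidden directories under
// cwd — it is no longer parsed as "../" + "*" and never leaves cwd.
const glob = new Glob(".*/*");
for (const path of glob.scanSync({ cwd: ".", dot: true })) {
  console.log(path); // e.g. ".git/config" — never "../sibling-file"
}
```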

## Test plan

- [x] Added tests in `test/js/bun/glob/scan.test.ts` that verify
patterns like `.*/*` and `.*/**/*.ts` don't escape the cwd boundary
- [x] Tests fail with system bun (bug reproduced) and pass with the fix
- [x] All existing glob tests pass (169 tests)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-23 01:41:17 -08:00
Marko Vejnovic
67be07fca4 Fix fuzzilli_command.zig (#24941)
### What does this PR do?

Needed to fix `fuzzilli_command.zig` to get it to build

### How did you verify your code works?
2025-11-23 00:34:27 -08:00
Dylan Conway
25b91e5c86 Update JSValue.toSliceClone to use JSError (#24949)
### What does this PR do?
Removes a TODO
### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-23 00:32:38 -08:00
Jarred Sumner
3c60fcda33 Fixes #24711 (#24982)
### What does this PR do?

Calling `.withAsyncContextIfNeeded` on a Promise is both unnecessary and
incorrect

### How did you verify your code works?
2025-11-22 23:50:34 -08:00
robobun
8da699681e test: add security scanner integration tests for minimum-release-age (#24944) 2025-11-22 15:08:12 -08:00
Alistair Smith
7a06dfcb89 fix: collect all dependencies from workspace packages in scanner (#24942)
### What does this PR do?

Fixes #23688

### How did you verify your code works?

Another test
2025-11-21 18:31:45 -08:00
Michael H
9ed53283a4 bump versions to v1.3.3 (#24933) 2025-11-21 14:22:28 -08:00
Michael H
4450d738fa docs: more consistency + minor updates (#24764)
Co-authored-by: RiskyMH <git@riskymh.dev>
2025-11-21 14:06:19 -08:00
robobun
7ec1aa8c95 docs: document new features from v1.3.2 and v1.3.3 (#24932)
## What does this PR do?

Adds missing documentation for features introduced in Bun v1.3.2 and
v1.3.3:

- **Standalone executable config flags**
(`docs/bundler/executables.mdx`): Document
`--no-compile-autoload-dotenv` and `--no-compile-autoload-bunfig` flags
that control automatic config file loading in compiled binaries
- **Test retry/repeats** (`docs/test/writing-tests.mdx`): Document the
`retry` and `repeats` test options for handling flaky tests
- **Disable env file loading**
(`docs/runtime/environment-variables.mdx`): Document `--no-env-file`
flag and `env = false` bunfig option

## How did you verify your code works?

- [x] Verified documentation is accurate against source code
implementation in `src/cli/Arguments.zig`
- [x] Verified features are not already documented elsewhere
- [x] Cross-referenced with v1.3.2 and v1.3.3 release notes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-21 12:45:57 -08:00
Marko Vejnovic
abb1b0c4d7 test(ENG-21524): Fuzzilli Stop-Gap (#24826)
### What does this PR do?

Adds [@mschwarzl's Fuzzilli Support
PR](https://github.com/oven-sh/bun/pull/23862) with the changes
necessary to be able to:

- Run it in CI
- Make no impact on `debug` and `release` mode.

### How did you verify your code works?

---------

Co-authored-by: Martin Schwarzl <mschwarzl@cloudflare.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-11-20 23:37:31 -08:00
Dylan Conway
274e01c737 remove jsc.createCallback (#24910)
### What does this PR do?
This was creating `Zig::FFIFunction` when we could instead use a plain
`JSC::JSFunction`
### How did you verify your code works?
Added a test
2025-11-20 20:56:02 -08:00
Conner Phillippi
a0c5edb15b Add new package paths to .aikido configuration 2025-11-20 17:35:16 -08:00
Meghan Denny
5702b39ef1 runtime: implement CompressionStream/DecompressionStream (#24757)
Closes https://github.com/oven-sh/bun/issues/1723
Closes https://github.com/oven-sh/bun/pull/22214
Closes https://github.com/oven-sh/bun/pull/24241

also supports the `"brotli"` and `"zstd"` formats
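
A quick usage sketch (`"gzip"` is a standard format; the `"brotli"` and `"zstd"` strings are Bun extensions):

```ts
// Round-trip a string through the new Web-standard streams.
const compressed = new Blob(["hello from bun"])
  .stream()
  .pipeThrough(new CompressionStream("gzip"));

const decompressed = compressed.pipeThrough(new DecompressionStream("gzip"));
console.log(await new Response(decompressed).text()); // "hello from bun"
```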

<img width="1244" height="547" alt="image"
src="https://github.com/user-attachments/assets/aecf4489-29ad-411d-9f6b-3bee50ed1b27"
/>
2025-11-20 17:14:37 -08:00
Dylan Conway
b72ba31441 fix(Blob.prototype.stream): handle undefined chunkSize (#24900)
### What does this PR do?
`blob.stream(undefined)`
### How did you verify your code works?
Added a test

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-20 17:01:24 -08:00
Meghan Denny
b92d2edcff Rename test-http-chunked-encoding-must be-valid-after-without-flushHeaders.ts to test-http-chunked-encoding-must-be-valid-after-without-flushHeaders.ts 2025-11-20 15:36:31 -08:00
Meghan Denny
a4af0aa4a8 Rename test-http-should-not-emit-or-throw error-when-writing-after-socket.end.ts to test-http-should-not-emit-or-throw-error-when-writing-after-socket.end.ts 2025-11-20 15:36:02 -08:00
Conner Phillippi
28b950e2b0 Add 'bench' to excluded paths in .aikido (#24879)
### What does this PR do?

### How did you verify your code works?
2025-11-19 23:46:30 -08:00
Conner Phillippi
595ad7de93 Add .aikido configuration file to exclude test and scripts directories (#24877) 2025-11-19 23:37:55 -08:00
Alistair Smith
b38ba38a18 types: correct ReadableStream methods, allow Response instance for serve routes under a method (#24872) 2025-11-19 23:08:49 -08:00
Jarred Sumner
788f03454d Show debugger in crash reports (#24871)
### What does this PR do?

Show debugger in crash reports

### How did you verify your code works?
2025-11-19 22:52:01 -08:00
Dylan Conway
0e23375d20 fix ENG-21527 (#24861)
### What does this PR do?
fixes ENG-21527
### How did you verify your code works?
Added a test
2025-11-19 22:44:21 -08:00
Jarred Sumner
d584c86d5b Delete incorrect debug assertion 2025-11-19 22:16:41 -08:00
Dylan Conway
0480d55a67 fix(YAML): handle exponential merge keys (#24729)
### What does this PR do?
Fixes ENG-21490
### How did you verify your code works?
Added a test that would previously fail due to timeout. It also confirms
the parsed result is correct.

---------

Co-authored-by: taylor.fish <contact@taylor.fish>
2025-11-19 21:20:55 -08:00
Jarred Sumner
9189fc4fa1 Fixes #24817 (#24864)
### What does this PR do?

Fixes #24817

### How did you verify your code works?
Test

---------

Co-authored-by: taylor.fish <contact@taylor.fish>
2025-11-19 21:17:51 -08:00
Dylan Conway
b554626662 fix ENG-21528 (#24865)
### What does this PR do?
Makes sure we are creating error messages with an allocator that will
not `deinit` at the end of function scope on error.

fixes ENG-21528
### How did you verify your code works?
Added a test
2025-11-19 20:31:37 -08:00
Dylan Conway
0054506538 fix(windows): close worker libuv loop (#24811)
### What does this PR do?
We need to call `uv_loop_close` in order to remove the threadlocal loop
from a list in libuv so it won't be used later. This explains the crash
reports, which have `workers_terminated` in their features.

Fixes #24804
Closes BUN-3NV
Closes ENG-21523
### How did you verify your code works?
Manually. I'm not sure how to write a test yet other than manually
clicking sleep

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Zack Radisic <zack@theradisic.com>
2025-11-19 18:54:41 -08:00
Meghan Denny
1b36d35253 cmake: separate variable for ZIG_COMPILER_SAFE default is unnecessary (#24797) 2025-11-18 17:51:19 -08:00
Ciro Spaciari
9a9473d4b9 fix(createEmptyObject) fix some createEmptyObject values part 2 (#24827)
### What does this PR do?
We must use the right number of properties or we should set it to 0

### How did you verify your code works?
Read the code to check the number of properties + CI
2025-11-18 14:02:21 -08:00
Ciro Spaciari
2d954995cd Fix Response wrapper and us_internal_disable_sweep_timer (#24623)
### What does this PR do?
Fix Response wrapper and us_internal_disable_sweep_timer
Fixes
https://linear.app/oven/issue/ENG-21510/panic-attempt-to-use-null-value-in-responsezig
### How did you verify your code works?
CI
2025-11-18 14:00:53 -08:00
Meghan Denny
11b20aa508 runtime: fix small leak in Bun.listen (#24799)
pulled out of https://github.com/oven-sh/bun/pull/21663

Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2025-11-18 12:00:56 -08:00
Meghan Denny
cac8e62635 zig: switch exhaustively on os + arch more (#24796) 2025-11-18 10:49:21 -08:00
Meghan Denny
f03957474e runtime: fix small leak in Bun.spawn (#24798)
pulled out of https://github.com/oven-sh/bun/pull/21663
2025-11-18 09:57:19 -05:00
Meghan Denny
ab80bbe4c2 runtime: fix small leak in node:net.SocketAddress (#24800)
pulled out of https://github.com/oven-sh/bun/pull/21663
2025-11-18 09:55:45 -05:00
Meghan Denny
af498a0483 runtime: fix small leak in Blob deinit (#24802)
pulled out of https://github.com/oven-sh/bun/pull/21663
2025-11-18 09:55:15 -05:00
robobun
7c485177ee Add compile-time flags to control .env and bunfig.toml autoloading (#24790)
## Summary

This PR adds two new compile options to control whether standalone
executables autoload `.env` files and `bunfig.toml` configuration files.

## New Options

### JavaScript API
```js
await Bun.build({
  entrypoints: ["./entry.ts"],
  compile: {
    autoloadDotenv: false,  // Disable .env loading (default: true)
    autoloadBunfig: false,  // Disable bunfig.toml loading (default: true)
  }
});
```

### CLI Flags
```bash
bun build --compile --no-compile-autoload-dotenv entry.ts
bun build --compile --no-compile-autoload-bunfig entry.ts
bun build --compile --compile-autoload-dotenv entry.ts
bun build --compile --compile-autoload-bunfig entry.ts
```

## Implementation

The flags are stored in a new `Flags` packed struct in
`StandaloneModuleGraph`:
```zig
pub const Flags = packed struct(u32) {
    disable_default_env_files: bool = false,
    disable_autoload_bunfig: bool = false,
    _padding: u30 = 0,
};
```

These flags are:
1. Set during compilation from CLI args or JS API options
2. Serialized into the `StandaloneModuleGraph` embedded in the
executable
3. Read at runtime in `bootStandalone()` to conditionally load config
files

## Testing

Manually tested and verified:
-  Default behavior loads `.env` files
-  `--no-compile-autoload-dotenv` disables `.env` loading
-  `--compile-autoload-dotenv` explicitly enables `.env` loading
-  Default behavior loads `bunfig.toml` (verified with preload script)
-  `--no-compile-autoload-bunfig` disables `bunfig.toml` loading

Test cases added in `test/bundler/bundler_compile_autoload.test.ts`

## Files Changed

- `src/StandaloneModuleGraph.zig` - Added Flags struct, updated
encode/decode
- `src/bun.js.zig` - Checks flags in bootStandalone()
- `src/bun.js/api/JSBundler.zig` - Added autoload options to
CompileOptions
- `src/bundler/bundle_v2.zig` - Pass flags to toExecutable()
- `src/cli.zig` - Added flags to BundlerOptions
- `src/cli/Arguments.zig` - Added CLI argument parsing
- `src/cli/build_command.zig` - Pass flags from context
- `test/bundler/expectBundled.ts` - Support new compile options
- `test/bundler/bundler_compile_autoload.test.ts` - New test file

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-18 09:46:44 -05:00
Marko Vejnovic
9513c1d1d9 chore: HttpContext.h cleanup (#24730) 2025-11-17 13:36:03 -08:00
robobun
509a97a435 Add --no-env-file flag to disable automatic .env loading (#24767)
## Summary

Implements `--no-env-file` CLI flag and bunfig configuration options to
disable automatic `.env` file loading at runtime and in the bundler.

## Motivation

Users may want to disable automatic `.env` file loading for:
- Production environments where env vars are managed externally
- CI/CD pipelines where .env files should be ignored
- Testing scenarios where explicit env control is needed
- Security contexts where .env files should not be trusted

## Changes

### CLI Flag
- Added `--no-env-file` flag that disables loading of default .env files
- Still respects explicit `--env-file` arguments for intentional env
loading

### Bunfig Configuration
Added support for disabling .env loading via `bunfig.toml`:
- `env = false` - disables default .env file loading
- `env = null` - disables default .env file loading  
- `env.file = false` - disables default .env file loading
- `env.file = null` - disables default .env file loading

### Implementation
- Added `disable_default_env_files` field to `api.TransformOptions` with
serialization support
- Added `disable_default_env_files` field to `options.Env` struct
- Implemented `loadEnvConfig` in bunfig parser to handle env
configuration
- Wired up flag throughout runtime and bundler code paths
- Preserved package.json script runner behavior (always skips default
.env files)

## Tests

Added comprehensive test suite (`test/cli/run/no-envfile.test.ts`) with
9 tests covering:
- `--no-env-file` flag with `.env`, `.env.local`,
`.env.development.local`
- Bunfig configurations: `env = false`, `env.file = false`, `env = true`
- `--no-env-file` with `-e` eval flag
- `--no-env-file` combined with `--env-file` (explicit files still load)
- Production mode behavior

All tests pass with debug bun and fail with system bun (as expected).

## Example Usage

```bash
# Disable all default .env files
bun --no-env-file index.js

# Disable defaults but load explicit file
bun --no-env-file --env-file .env.production index.js

# Disable via bunfig.toml
cat > bunfig.toml << 'CONFIG'
env = false
CONFIG
bun index.js
```

## Files Changed
- `src/cli/Arguments.zig` - CLI flag parsing
- `src/api/schema.zig` - API schema field with encode/decode
- `src/options.zig` - Env struct field and wiring
- `src/bunfig.zig` - Config parsing with loadEnvConfig
- `src/transpiler.zig` - Runtime wiring
- `src/bun.js.zig` - Runtime wiring
- `src/cli/exec_command.zig` - Runtime wiring
- `src/cli/run_command.zig` - Preserved package.json script runner
behavior
- `test/cli/run/no-envfile.test.ts` - Comprehensive test suite

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-17 15:04:42 -05:00
Dylan Conway
983bb52df7 fix #24550 (#24726)
### What does this PR do?
Fixes a regression introduced in Bun v1.3.2 with #24283.

We are not able to skip `sharp` lifecycle scripts before v0.33.0 because
previous versions did not use optional dependencies with prebuilds.

Fixes #24550
Fixes ENG-21519
### How did you verify your code works?
Manually

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-17 15:04:20 -05:00
Meghan Denny
8b5b36ec7a runtime: fix n-api ThreadSafeFunction finalizer (#24771)
Closes https://github.com/oven-sh/bun/issues/24552
Closes https://github.com/oven-sh/bun/issues/24664
Closes https://github.com/oven-sh/bun/issues/24702
Closes https://github.com/oven-sh/bun/issues/24703
Closes https://github.com/oven-sh/bun/issues/24768
2025-11-17 11:23:13 -08:00
Michael H
87eca6bbc7 docs: re-apply many recent changes that somehow aren't present (#24719)
lots of recent changes aren't present, so this reapplies them
2025-11-16 19:23:01 +11:00
Meghan Denny
2cb8d4eae8 cmake: remove GIT_CHANGED_SOURCES (#24737)
dead code
2025-11-15 16:45:37 -08:00
Meghan Denny
e53ceb62ec zig: fix missing uses of bun.callmod_inline (#24738)
results in better stack traces in debug mode
2025-11-15 16:36:15 -08:00
pfg
277fc558e2 only-failures fix (#24701)
### What does this PR do?

Removes these accidental blank lines

<img width="170" height="139" alt="image"
src="https://github.com/user-attachments/assets/b44d6496-a497-4be6-9666-8134a70d7324"
/>


### How did you verify your code works?
2025-11-14 19:52:43 -08:00
Dylan Conway
5908bfbfc6 fix(YAML.stringify): number-like strings prefixed with 0 (#24731)
### What does this PR do?
Ensures strings that would parse as a number with leading zeroes aren't
emitted without quotes.

fixes #23691
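
Illustrative sketch (assuming the `Bun.YAML` API; the exact quoting style may differ):

```ts
// A number-like string with a leading zero must stay a string after a round-trip.
const doc = Bun.YAML.stringify({ zip: "01234" });
console.log(doc); // emits the value quoted, e.g. zip: "01234"

const parsed = Bun.YAML.parse(doc) as { zip: string };
console.log(parsed.zip); // "01234" — still a string, not the number 1234
```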

### How did you verify your code works?
Added a test
2025-11-14 17:43:36 -08:00
Dylan Conway
19f21c00bd fix #24510 (#24563)
### What does this PR do?
The assertion was too strict.

This PR changes the assertion to allow multiple of the same dependency id
to be present. It also changes all the assertions to debug assertions.

fixes #24510
### How did you verify your code works?
Manually, and added a new test

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Marko Vejnovic <marko@bun.com>
2025-11-14 16:49:21 -08:00
Lydia Hallie
8650e7ace4 Docs: Add templates to guides (#24732)
Adds template cards to the TanStack Start and Next.js guides
2025-11-14 16:45:21 -08:00
robobun
b2c219a56c Implement retry and repeats options for bun:test (#23713)
Fixes #16051, Fixes ENG-21437

Implements retry/repeats

```ts
test("my test", () => {
    if (Math.random() < 0.1) throw new Error("uh oh!");
}, {repeats: 20});
```

```
Error: uh oh!
✗ my test
```

```ts
test("my test", () => {
    if (Math.random() < 0.1) throw new Error("uh oh!");
}, {retry: 5});
```

```
Error: uh oh!
✓ my test (attempt 2)
```

Also fixes a bug where onTestFinished inside a test would not run if the
test failed

```ts
test("abc", () => {
    onTestFinished(() => { console.log("hello"); });
    throw new Error("uh oh!");
});
```

```
Error: uh oh!
hello
```

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: pfg <pfg@pfg.pw>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-14 16:21:04 -08:00
Luke Parker
f216673f98 fix: Add missing SIGWINCH for windows (#24704)
### What does this PR do?
Fixes https://github.com/oven-sh/bun/issues/22288
Fixes #22402
Fixes https://github.com/oven-sh/bun/issues/23224
Fixes https://github.com/oven-sh/bun/issues/17803

cc: Should unblock opencode/opentui window resize on windows
https://github.com/sst/opentui/issues/152
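
A minimal way to observe this (sketch):

```ts
// With this fix, terminal resize events fire on Windows as well.
process.on("SIGWINCH", () => {
  console.log(`resized to ${process.stdout.columns}x${process.stdout.rows}`);
});
```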

### How did you verify your code works?
Clone the linked repro, verified latest bun failed, node worked, then
iterated till my local bun worked.

Here is a screenshot of the branch working with bun on windows

<img width="1427" height="891" alt="image"
src="https://github.com/user-attachments/assets/18642db7-4cb6-4758-bb76-a38d277cbc23"
/>

Additionally using bun vs bun-debug on a little test for our downstream
package proves this works

<img width="1137" height="679" alt="image"
src="https://github.com/user-attachments/assets/4dbe7605-ced9-4bcb-84f0-ed793f8aa942"
/>
<img width="1138" height="684" alt="image"
src="https://github.com/user-attachments/assets/f658b3b9-e4bc-4bfa-84f0-e1eb3af83d89"
/>
2025-11-14 14:05:47 -08:00
Lydia Hallie
a70f2b7ff9 Docs: Add custom server instructions to TanStack guide (#24723)
Add docs on how to deploy a custom Bun server for TanStack Start. Based
on [this
example](https://github.com/TanStack/router/tree/main/examples/react/start-bun/server.ts)
2025-11-14 11:53:23 -08:00
Braden Wong
65a215bb4e docs(watch): use relativePath parameter name in recursive example (#24716)
This updates the documentation for `fs.watch()` to use `relativePath`
instead of `filename` in the recursive example, following the same
convention from PR #23990.

When `recursive: true` is set on `fs.watch()`, the callback receives a
relative path to the changed file rather than just a simple filename.
Using `relativePath` as the parameter name makes this distinction
clearer to users.
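
For example (a minimal sketch; the directory name is arbitrary):

```ts
import { watch } from "node:fs";

// With recursive: true, the second callback argument is a path relative to
// the watched root, not just a bare filename.
watch("./src", { recursive: true }, (eventType, relativePath) => {
  console.log(eventType, relativePath); // e.g. "change" "nested/dir/file.ts"
});
```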

**Related to:** https://github.com/oven-sh/bun/pull/23990

Co-authored-by: Michael H <git@riskymh.dev>
2025-11-15 01:10:35 +11:00
Michael H
c3c91442ac docs: fix custom loader example to be correct & other file-type doc updates (#24677) 2025-11-14 16:32:00 +11:00
Nino
93ab167a8d docs: fix environment variable syntax in executable example (#24706)
### What does this PR do?

Fixes a typo in the docs. 

`bun_BE_BUN=1` doesn't work, it has to be capitalized `BUN_BE_BUN=1`
2025-11-13 21:31:07 -08:00
pfg
d8ee26509c Fix progress showing kb for downloading packages instead of count (#24700)
- show bytes for upgrading bun
- show no unit for other progress bars

Fix for issue introduced in #24266
2025-11-13 19:29:16 -08:00
Ciro Spaciari
21d582a3cd fix(createEmptyObject) fix some createEmptyObject values (#22512)
### What does this PR do?
We must use the right number of properties (not more or less) or we
should set it to 0
### How did you verify your code works?
Read the code; this will avoid potential crashes and improve stability
2025-11-13 15:19:18 -08:00
Meghan Denny
d7bf4fb443 ci/format: update bun version (#24693) 2025-11-13 15:00:40 -08:00
Ciro Spaciari
263d1ab178 update(crypto) update root certificates to NSS 3.117 (#24607)
### What does this PR do?
This is the certdata.txt[0] from NSS 3.117, released on 2025-11-11.

This is the version of NSS that will ship in Firefox 145.0 on
2025-11-11.

Certificates added:
- OISTE Server Root ECC G1
- OISTE Server Root RSA G1

[0]
https://hg.mozilla.org/projects/nss/raw-file/NSS_3_117_RTM/lib/ckfw/builtins/certdata.txt
765c9e86-0a91-4dad-b410-801cd60f8b32

Fixes https://linear.app/oven/issue/ENG-21508/update-root-certs
### How did you verify your code works?
CI
2025-11-13 13:26:34 -08:00
Marko Vejnovic
08843030f5 [publish images] 2025-11-13 12:04:47 -08:00
Kristjan Broder Lund
9ccc8fb795 docs: format code blocks correctly (#24672)
### What does this PR do?

The code blocks were not properly formatted, and did not render
correctly

`main`:
<img width="699" height="196" alt="image"
src="https://github.com/user-attachments/assets/c08bc29e-9481-47ae-bafe-dd94b22d0c09"
/>

this pr:
<img width="691" height="306" alt="image"
src="https://github.com/user-attachments/assets/947fb9d7-04f3-42e8-aafe-0d70127fefd1"
/>

### How did you verify your code works?

ran docs locally with mint

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-13 20:36:45 +11:00
S.T.P
4c03d3b8b6 docs: remove the redundant tags (#24668)
### What does this PR do?

Remove the redundant code block tag

<img width="1469" height="918" alt="image"
src="https://github.com/user-attachments/assets/3eb3b499-3165-409c-9360-2fe1872162ed"
/>

After change

<img width="1458" height="1006" alt="image"
src="https://github.com/user-attachments/assets/69eac47c-28cd-4459-9478-0098b51f78fe"
/>


### How did you verify your code works?

Preview the documentation locally

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-13 18:37:22 +11:00
Marko Vejnovic
6d2ce3892b build(ENG-21514): Fix sccache invocation (#24651)
### What does this PR do?

Fixes some miswritten `cmake` steps so that `sccache` actually works

### How did you verify your code works?
2025-11-12 14:51:08 -08:00
Marko Vejnovic
e03d3bee10 ci(ENG-21502): Fix sccache not working inside Docker (#24597) 2025-11-12 14:40:12 -08:00
Caio Borghi
fff47f0267 docs: update EdgeDB references to Gel rebrand (#24487)
## Summary
EdgeDB has rebranded to Gel. This PR comprehensively updates all
documentation to reflect the rebrand.

## Changes Made

### Documentation & Branding
- **Guide title**: "Use EdgeDB with Bun" → "Use Gel with Bun"
- **File renamed**: `docs/guides/ecosystem/edgedb.mdx` → `gel.mdx`
- **Description**: Added "(formerly EdgeDB)" note
- **All path references**: Updated from `/guides/ecosystem/edgedb` to
`/guides/ecosystem/gel`

### CLI Commands
- `edgedb project init` → `gel project init`
- `edgedb` → `gel` (REPL)
- `edgedb migration create` → `gel migration create`
- `edgedb migrate` → `gel migrate`

### npm Packages
- `edgedb` → `gel`
- `@edgedb/generate` → `@gel/generate`

### Installation & Documentation URLs
- Installation link: `docs.geldata.com/learn/installation` (functional)
- Documentation reference: `docs.geldata.com/` (operational)
- Installation scripts: Verified working (`https://www.geldata.com/sh`
and `ps1`)
- Added Homebrew option: `brew install geldata/tap/gel-cli`

### Code Examples
- Updated all imports: `import { createClient } from "gel"`
- Updated codegen commands: `bunx @gel/generate`

## Verified
All commands verified against official Gel documentation at
https://docs.geldata.com/

Fixes #17721

---------

Co-authored-by: Lydia Hallie <lydiajuliettehallie@gmail.com>
2025-11-12 14:18:59 -08:00
Ciro Spaciari
4e1d9a2cbc remove dead code in src/bake/DevServer/SerializedFailure.zig (#24635)
### What does this PR do?
remove dead code in src/bake/DevServer/SerializedFailure.zig
### How did you verify your code works?
It builds
2025-11-12 13:39:36 -08:00
Ciro Spaciari
1f0c885e91 proper handle on_data if we receive null (#24624)
### What does this PR do?
If for some reason data is null we should handle it as empty
Fixes
https://linear.app/oven/issue/ENG-21511/panic-attempt-to-use-null-value-in-socket-on-data
### How did you verify your code works?
Ci
2025-11-12 12:42:06 -08:00
Ciro Spaciari
ab32a2fc4a fix(bun getcompletes) add windows support and remove TODO panic (#24620)
### What does this PR do?
Fixes https://linear.app/oven/issue/ENG-21509/panic-todo-in-completions
### How did you verify your code works?
Test
2025-11-12 12:41:47 -08:00
Ciro Spaciari
8912957aa5 compatibility(node:net) _handle.fd property (#24575)
### What does this PR do?
Expose fd property in _handle for node:net/node:tls
Fixes
https://linear.app/oven/issue/ENG-21507/expose-fd-in-nodenetnodetls
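
Illustrative usage (sketch; `_handle` is an internal Node-compatible property):

```ts
import { createConnection } from "node:net";

const socket = createConnection({ host: "example.com", port: 80 }, () => {
  // The underlying handle now exposes the file descriptor, like Node.js.
  console.log((socket as any)._handle.fd);
  socket.end();
});
```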

### How did you verify your code works?
Test

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2025-11-12 12:28:55 -08:00
Ciro Spaciari
df4e42bf1c fix(DevServer) remove panic in case of source type none (#24634)
### What does this PR do?
Remove the panic in case of source type none so we can handle it more
gracefully. We can discuss whether this is the best solution, but it looks
sensible to me. This is really hard to repro, but it can happen when
deleting files referred to by dynamic imports.


Fixes
https://linear.app/oven/issue/ENG-21513/panic-missing-internal-precomputed-line-count-in-renderjson-on
Fixes https://github.com/oven-sh/bun/issues/21714
### How did you verify your code works?
CI

---------

Co-authored-by: taylor.fish <contact@taylor.fish>
2025-11-12 12:28:17 -08:00
Ciro Spaciari
f67bec90c5 refactor(us_socket_t.zig) safer use of intCast (#24622)
### What does this PR do?
make sure to always use a safe intCast in us_socket_t
### How did you verify your code works?
Compiles
2025-11-12 11:02:39 -08:00
Michael H
fa099336da docs: node does support "import path re-mapping" (#17133)
fixes #4545
2025-11-13 06:02:12 +11:00
Michael H
7f8dff64c4 docs: revert minifier doc's format (#24639) 2025-11-12 11:01:25 -08:00
Michael H
98a01e5d2a docs: fix some pages (#24632) 2025-11-13 06:00:05 +11:00
Michael H
d1fa27acce docs: document more loaders (#24616)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 10:48:03 -08:00
Michael H
cf6662d48f types: document configVersion in BunLockFile (#24641) 2025-11-12 10:47:30 -08:00
Marko Vejnovic
2563a9b3ad build(ENG-21491): Improve sccache behavior on developer machines (#24568)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 09:11:33 -08:00
Meghan Denny
e0aae8adc1 ci: remove unified-builds and unified-tests options (#24626) 2025-11-11 22:52:46 -08:00
Marko Vejnovic
6b8a75f6ab chore(ENG-21504): Remove bit-rotted scripts (#24606)
### What does this PR do?

Removes some scripts which haven't been tested in a while.

### How did you verify your code works?

CI passes
2025-11-11 22:39:20 -08:00
pfg
97c113d010 remove unused writer type parameters in src/css/ (#24571)
No longer needed after zig upgrade

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-11 21:09:50 -08:00
Meghan Denny
7f4e65464e zig: fix spurious dependency loop compile error in ResumableSink (#24618) 2025-11-11 20:22:24 -08:00
Meghan Denny
4b05629131 ci: update no-validate-exceptions.txt 2025-11-11 20:21:24 -08:00
Ciro Spaciari
d868c6019c fix(DevServer) unconditional unwrap in IncrementalGraph (#24608)
### What does this PR do?
Fixes
https://linear.app/oven/issue/ENG-21505/panic-attempt-to-use-null-value-at-incrementalgraph-by-misusing-jscode

When calling `takeJSBundleToList`/`takeJSBundle`, the desired behavior is
to get only JS chunks from the graph. Since the graph can also contain CSS
chunks, we can simply skip those and keep the desired behavior in a safe
way, instead of unconditionally unwrapping something that is not guaranteed
to have a jsCode.

### How did you verify your code works?
CI
2025-11-11 16:46:28 -08:00
robobun
0c42b46af3 docs: remove outdated version mentions (1.0.x and 1.1.x) (#24570)
## Summary

Remove outdated version mentions (1.0.x and 1.1.x) from documentation
for better consistency. These versions are over a year old - you should
be using a recent version of bun :).

## What changed

**Removed version mentions from:**
- `docs/pm/lifecycle.mdx` - v1.0.16 (trusted dependencies)
- `docs/bundler/executables.mdx` - v1.0.23, v1.1.25, v1.1.30 (various
features)
- `docs/guides/install/jfrog-artifactory.mdx` - v1.0.3+ (env var
comment)
- `docs/guides/install/azure-artifacts.mdx` - v1.0.3+ (env var comment)
- `docs/runtime/workers.mdx` - v1.1.13, v1.1.35 (blob URLs, preload)
- `docs/runtime/networking/dns.mdx` - v1.1.9 (DNS caching)
- `docs/guides/runtime/import-html.mdx` - v1.1.5
- `docs/guides/runtime/define-constant.mdx` - v1.1.5
- `docs/runtime/sqlite.mdx` - v1.1.31

**Kept version mentions in:**
- All 1.2.x versions (still recent, less than a year old)
- Benchmark version numbers (e.g., S3 performance comparison with
v1.1.44)
- `docs/guides/install/yarnlock.mdx` (bun.lock introduction context)
- `docs/project/building-windows.mdx` (build requirements)
- `docs/runtime/http/websockets.mdx` (performance benchmarks)

## Why

The docs lack consistency around version mentions - we don't document
every feature's version, so keeping scattered old version numbers looks
inconsistent. These changes represent a small percentage of features
added recently, and users on ancient versions have bigger problems than
needing to know exactly when a feature landed.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: RiskyMH <git@riskymh.dev>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 10:44:52 +11:00
Nathan Soares
1cee6cf36b docs: Change code block header from package.json to tsconfig.json (#24511)
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-12 10:25:23 +11:00
pfg
9671a98dca remove unused "recommended zig version" (#24611)
Dead code.
2025-11-11 15:10:57 -08:00
yinheli
c6aa5a97dc fix(docs): Remove duplicate sections in guides.jsx (#24595)
### What does this PR do?

This PR fixes an issue on the Guides page where duplicate sections were
being displayed. The problem was caused by a misplaced return statement
and a duplicated JSX block introduced in commit
[1606a9f24e](https://github.com/oven-sh/bun/blob/1606a9f24e/docs/snippets/guides.jsx#L504-L514).
2025-11-11 14:53:59 -08:00
robobun
925e8bcfe1 Format download sizes in human-readable format (#24266)
## Summary

- Use `std.fmt.fmtIntSizeBin` to format progress indicators with byte
sizes
- Improves readability during operations like `bun upgrade`
- Changes display from raw bytes (e.g., "23982378/2398284") to
human-readable format (e.g., "23.2MiB/100MiB")

## Changes

Modified `src/Progress.zig`:
- Updated progress formatting to use `std.fmt.fmtIntSizeBin` for both
current and total sizes
- Applied to both progress with total (`[current/total unit]`) and
without total (`[current unit]`)

## Test plan

- [x] Build succeeds with `bun bd`
- [ ] Manual verification with `bun upgrade` shows human-readable sizes

Fixes #24226 fixes #7826

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: pfg <pfg@pfg.pw>
2025-11-10 20:02:00 -08:00
robobun
b87ac4a781 Update ci_info with more CI detection (#23708)
Fixes ENG-21481

Updates ci_info to include more CIs. The CI detection is now code-generated
from the JSON in the ci-info package. It also supports setting CI=true to
force CI detection.

---------

Co-authored-by: pfg <pfg@pfg.pw>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 19:58:02 -08:00
robobun
b876938f6d docs: update documentation for Bun v1.3.2 features (#24503)
## Summary
Updates documentation for all major features and changes introduced in
Bun v1.3.2 blog post.

## Changes

### Package Manager
-  Document `configVersion` system for controlling default linker
behavior
-  Clarify that "existing projects (made pre-v1.3.2)" use hoisted
installs for backward compatibility
-  Add smart postinstall script optimization with environment variable
flags
-  Document improved Git dependency resolution with HTTP tarball
optimization
-  Add `bun list` alias for `bun pm ls`

### Testing
-  Document new `onTestFinished` lifecycle hook with simple example
-  Add to lifecycle hooks table in test documentation

### Runtime & Performance
-  Add CPU profiling with `--cpu-prof` flag documentation
-  Place after memory usage section for better flow

### WebSockets
-  Add `subscriptions` getter to existing pub/sub example
-  Add TypeScript reference for the subscriptions property

## Documentation Improvements
All documentation now consistently:
- Uses "made pre-v1.3.2" to clarify existing project behavior
- Simplifies default linker explanations with clear references to
`/docs/pm/isolated-installs`
- Uses `/docs/pm/isolated-installs` for all internal references
- Avoids confusing technical details in favor of user-friendly summaries

## Files Modified
- `docs/guides/install/add-git.mdx` - Added GitHub tarball optimization
note
- `docs/pm/cli/install.mdx` - Added installation strategies and smart
postinstall docs
- `docs/pm/cli/pm.mdx` - Added bun list alias
- `docs/pm/isolated-installs.mdx` - Updated default behavior section
with configVersion table
- `docs/project/benchmarking.mdx` - Added CPU profiling section
- `docs/runtime/bunfig.mdx` - Clarified install.linker defaults
- `docs/runtime/http/websockets.mdx` - Added subscriptions to example
and TypeScript interface
- `docs/test/lifecycle.mdx` - Added onTestFinished hook documentation

## Diff

````diff
diff --git a/docs/guides/install/add-git.mdx b/docs/guides/install/add-git.mdx
index 70950e1a63..7f8f3c8d81 100644
--- a/docs/guides/install/add-git.mdx
+++ b/docs/guides/install/add-git.mdx
@@ -33,6 +33,8 @@ bun add git@github.com:lodash/lodash.git
 bun add github:colinhacks/zod
 ```
 
+**Note:** GitHub dependencies download via HTTP tarball when possible for faster installation.
+
 ---
 
 See [Docs > Package manager](https://bun.com/docs/cli/install) for complete documentation of Bun's package manager.
diff --git a/docs/pm/cli/install.mdx b/docs/pm/cli/install.mdx
index 7affb62646..dde268b7e5 100644
--- a/docs/pm/cli/install.mdx
+++ b/docs/pm/cli/install.mdx
@@ -88,6 +88,13 @@ Lifecycle scripts will run in parallel during installation. To adjust the maximu
 bun install --concurrent-scripts 5
 ```
 
+Bun automatically optimizes postinstall scripts for popular packages (like `esbuild`, `sharp`, etc.) by determining which scripts need to run. To disable these optimizations:
+
+```bash terminal icon="terminal"
+BUN_FEATURE_FLAG_DISABLE_NATIVE_DEPENDENCY_LINKER=1 bun install
+BUN_FEATURE_FLAG_DISABLE_IGNORE_SCRIPTS=1 bun install
+```
+
 ---
 
 ## Workspaces
@@ -231,7 +238,7 @@ Bun supports installing dependencies from Git, GitHub, and local or remotely-hos
 
 Bun supports two package installation strategies that determine how dependencies are organized in `node_modules`:
 
-### Hoisted installs (default for single projects)
+### Hoisted installs
 
 The traditional npm/Yarn approach that flattens dependencies into a shared `node_modules` directory:
 
@@ -249,7 +256,15 @@ bun install --linker isolated
 
 Isolated installs create a central package store in `node_modules/.bun/` with symlinks in the top-level `node_modules`. This ensures packages can only access their declared dependencies.
 
-For complete documentation on isolated installs, refer to [Package manager > Isolated installs](/pm/isolated-installs).
+### Default strategy
+
+The default linker strategy depends on whether you're starting fresh or have an existing project:
+
+- **New workspaces/monorepos**: `isolated` (prevents phantom dependencies)
+- **New single-package projects**: `hoisted` (traditional npm behavior)
+- **Existing projects (made pre-v1.3.2)**: `hoisted` (preserves backward compatibility)
+
+The default is controlled by a `configVersion` field in your lockfile. For a detailed explanation, see [Package manager > Isolated installs](/docs/pm/isolated-installs).
 
 ---
 
@@ -319,8 +334,7 @@ dryRun = false
 concurrentScripts = 16 # (cpu count or GOMAXPROCS) x2
 
 # installation strategy: "hoisted" or "isolated"
-# default: "hoisted" (for single-project projects)
-# default: "isolated" (for monorepo projects)
+# default varies by project type - see /docs/pm/isolated-installs
 linker = "hoisted"
 
 
diff --git a/docs/pm/cli/pm.mdx b/docs/pm/cli/pm.mdx
index fc297753d3..9c8faa7da1 100644
--- a/docs/pm/cli/pm.mdx
+++ b/docs/pm/cli/pm.mdx
@@ -115,6 +115,8 @@ To print a list of installed dependencies in the current project and their resol
 
 ```bash terminal icon="terminal"
 bun pm ls
+# or
+bun list
 ```
 
 ```txt
@@ -130,6 +132,8 @@ To print all installed dependencies, including nth-order dependencies.
 
 ```bash terminal icon="terminal"
 bun pm ls --all
+# or
+bun list --all
 ```
 
 ```txt
diff --git a/docs/pm/isolated-installs.mdx b/docs/pm/isolated-installs.mdx
index 73c6748b15..17afe02fe1 100644
--- a/docs/pm/isolated-installs.mdx
+++ b/docs/pm/isolated-installs.mdx
@@ -5,7 +5,7 @@ description: "Strict dependency isolation similar to pnpm's approach"
 
 Bun provides an alternative package installation strategy called **isolated installs** that creates strict dependency isolation similar to pnpm's approach. This mode prevents phantom dependencies and ensures reproducible, deterministic builds.
 
-This is the default installation strategy for monorepo projects.
+This is the default installation strategy for **new** workspace/monorepo projects (with `configVersion = 1` in the lockfile). Existing projects continue using hoisted installs unless explicitly configured.
 
 ## What are isolated installs?
 
@@ -43,8 +43,23 @@ linker = "isolated"
 
 ### Default behavior
 
-- For monorepo projects, Bun uses the **isolated** installation strategy by default.
-- For single-project projects, Bun uses the **hoisted** installation strategy by default.
+The default linker strategy depends on your project's lockfile `configVersion`:
+
+| `configVersion` | Using workspaces? | Default Linker |
+| --------------- | ----------------- | -------------- |
+| `1`             | Yes               | `isolated`     |
+| `1`             | No                | `hoisted`      |
+| `0`             | Yes               | `hoisted`      |
+| `0`             | No                | `hoisted`      |
+
+**New projects**: Default to `configVersion = 1`. In workspaces, v1 uses the isolated linker by default; otherwise it uses hoisted linking.
+
+**Existing Bun projects (made pre-v1.3.2)**: If your existing lockfile doesn't have a version yet, Bun sets `configVersion = 0` when you run `bun install`, preserving the previous hoisted linker default.
+
+**Migrations from other package managers**:
+
+- From pnpm: `configVersion = 1` (using isolated installs in workspaces)
+- From npm or yarn: `configVersion = 0` (using hoisted installs)
 
 You can override the default behavior by explicitly specifying the `--linker` flag or setting it in your configuration file.
 
diff --git a/docs/project/benchmarking.mdx b/docs/project/benchmarking.mdx
index 1263a06729..2ab8bcafc8 100644
--- a/docs/project/benchmarking.mdx
+++ b/docs/project/benchmarking.mdx
@@ -216,3 +216,26 @@ numa nodes:       1
    elapsed:       0.068 s
    process: user: 0.061 s, system: 0.014 s, faults: 0, rss: 57.4 MiB, commit: 64.0 MiB
 ```
+
+## CPU profiling
+
+Profile JavaScript execution to identify performance bottlenecks with the `--cpu-prof` flag.
+
+```sh terminal icon="terminal"
+bun --cpu-prof script.js
+```
+
+This generates a `.cpuprofile` file you can open in Chrome DevTools (Performance tab → Load profile) or VS Code's CPU profiler.
+
+### Options
+
+```sh terminal icon="terminal"
+bun --cpu-prof --cpu-prof-name my-profile.cpuprofile script.js
+bun --cpu-prof --cpu-prof-dir ./profiles script.js
+```
+
+| Flag                         | Description          |
+| ---------------------------- | -------------------- |
+| `--cpu-prof`                 | Enable profiling     |
+| `--cpu-prof-name <filename>` | Set output filename  |
+| `--cpu-prof-dir <dir>`       | Set output directory |
diff --git a/docs/runtime/bunfig.mdx b/docs/runtime/bunfig.mdx
index 91005c1607..5b7fe49823 100644
--- a/docs/runtime/bunfig.mdx
+++ b/docs/runtime/bunfig.mdx
@@ -497,9 +497,9 @@ print = "yarn"
 
 ### `install.linker`
 
-Configure the default linker strategy. Default `"hoisted"` for single-project projects, `"isolated"` for monorepo projects.
+Configure the linker strategy for installing dependencies. Defaults to `"isolated"` for new workspaces, `"hoisted"` for new single-package projects and existing projects (made pre-v1.3.2).
 
-For complete documentation refer to [Package manager > Isolated installs](/pm/isolated-installs).
+For complete documentation refer to [Package manager > Isolated installs](/docs/pm/isolated-installs).
 
 ```toml title="bunfig.toml" icon="settings"
 [install]
diff --git a/docs/runtime/http/websockets.mdx b/docs/runtime/http/websockets.mdx
index b33f37c29f..174043200d 100644
--- a/docs/runtime/http/websockets.mdx
+++ b/docs/runtime/http/websockets.mdx
@@ -212,6 +212,9 @@ const server = Bun.serve({
       // this is a group chat
       // so the server re-broadcasts incoming message to everyone
       server.publish("the-group-chat", `${ws.data.username}: ${message}`);
+
+      // inspect current subscriptions
+      console.log(ws.subscriptions); // ["the-group-chat"]
     },
     close(ws) {
       const msg = `${ws.data.username} has left the chat`;
@@ -393,6 +396,7 @@ interface ServerWebSocket {
   readonly data: any;
   readonly readyState: number;
   readonly remoteAddress: string;
+  readonly subscriptions: string[];
   send(message: string | ArrayBuffer | Uint8Array, compress?: boolean): number;
   close(code?: number, reason?: string): void;
   subscribe(topic: string): void;
diff --git a/docs/test/lifecycle.mdx b/docs/test/lifecycle.mdx
index 6427175df6..3837f0e948 100644
--- a/docs/test/lifecycle.mdx
+++ b/docs/test/lifecycle.mdx
@@ -6,11 +6,12 @@ description: "Learn how to use beforeAll, beforeEach, afterEach, and afterAll li
 The test runner supports the following lifecycle hooks. This is useful for loading test fixtures, mocking data, and configuring the test environment.
 
 | Hook             | Description                                                |
-| ------------ | --------------------------- |
+| ---------------- | ---------------------------------------------------------- |
 | `beforeAll`      | Runs once before all tests.                                |
 | `beforeEach`     | Runs before each test.                                     |
 | `afterEach`      | Runs after each test.                                      |
 | `afterAll`       | Runs once after all tests.                                 |
+| `onTestFinished` | Runs after a single test finishes (after all `afterEach`). |
 
 ## Per-Test Setup and Teardown
 
@@ -90,6 +91,23 @@ describe("test group", () => {
 });
 ```
 
+### `onTestFinished`
+
+Use `onTestFinished` to run a callback after a single test completes. It runs after all `afterEach` hooks.
+
+```ts title="test.ts" icon="/icons/typescript.svg"
+import { test, onTestFinished } from "bun:test";
+
+test("cleanup after test", () => {
+  onTestFinished(() => {
+    // runs after all afterEach hooks
+    console.log("test finished");
+  });
+});
+```
+
+Not supported in concurrent tests; use `test.serial` instead.
+
 ## Global Setup and Teardown
 
 To scope the hooks to an entire multi-file test run, define the hooks in a separate file.
````

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Michael H <git@riskymh.dev>
Co-authored-by: Lydia Hallie <lydiajuliettehallie@gmail.com>
2025-11-10 18:18:07 -08:00
Lydia Hallie
6b70b71895 Add TanStack Start guide (#24572)
Adds a guide on how to build and deploy TanStack Start with Bun
2025-11-10 17:38:48 -08:00
Marko Vejnovic
80a5b59fe5 bug(ENG-21501): Fix integer overflow in hosted_git_info.zig (#24561)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 16:30:47 -08:00
Alistair Smith
d87a928b94 Remove dependency on React's types in @types/bun 2025-11-10 15:50:45 -08:00
pfg
05d0475c6c Update to zig 0.15.2 (#24204)
Fixes ENG-21287

Build times, from `bun run build && echo '//' >> src/main.zig && time
bun run build`

|Platform|0.14.1|0.15.2|Speedup|
|-|-|-|-|
|macos debug asan|126.90s|106.27s|1.19x|
|macos debug noasan|60.62s|50.85s|1.19x|
|linux debug asan|292.77s|241.45s|1.21x|
|linux debug noasan|146.58s|130.94s|1.12x|
|linux debug use_llvm=false|n/a|78.27s|1.87x|
|windows debug asan|177.13s|142.55s|1.24x|

Runtime performance:

- next build memory usage may have gone up by 5%. Otherwise seems the
same. Some code with writers may have gotten slower, especially one
instance of a counting writer and a few instances of unbuffered writers
that now have vtable overhead.
- File size reduced by 800kb (from 100.2mb to 99.4mb)

Improvements:

- `@export` hack is no longer needed for watch
- native x86_64 backend for linux builds faster. to use it, set use_llvm
false and no_link_obj false. also set `ASAN_OPTIONS=detect_leaks=0`
otherwise it will spam the output with tens of thousands of lines of
debug info errors. may need to use the zig lldb fork for debugging.
- zig test-obj, which we will be able to use for zig unit tests

Still an issue:

- false 'dependency loop' errors remain in watch mode
- watch mode crashes observed

Follow-up:

- [ ] search `comptime Writer: type` and `comptime W: type` and remove
- [ ] remove format_mode in our zig fork
- [ ] remove deprecated.zig autoFormatLabelFallback
- [ ] remove deprecated.zig autoFormatLabel
- [ ] remove deprecated.BufferedWriter and BufferedReader
- [ ] remove override_no_export_cpp_apis as it is no longer needed
- [ ] css Parser(W) -> Parser, and remove all the comptime writer: type
params
- [ ] remove deprecated writer fully

Files that add lines:

```
649     src/deprecated.zig
167     scripts/pack-codegen-for-zig-team.ts
54      scripts/cleartrace-impl.js
46      scripts/cleartrace.ts
43      src/windows.zig
18      src/fs.zig
17      src/bun.js/ConsoleObject.zig
16      src/output.zig
12      src/bun.js/test/debug.zig
12      src/bun.js/node/node_fs.zig
8       src/env_loader.zig
7       src/css/printer.zig
7       src/cli/init_command.zig
7       src/bun.js/node.zig
6       src/string/escapeRegExp.zig
6       src/install/PnpmMatcher.zig
5       src/bun.js/webcore/Blob.zig
4       src/crash_handler.zig
4       src/bun.zig
3       src/install/lockfile/bun.lock.zig
3       src/cli/update_interactive_command.zig
3       src/cli/pack_command.zig
3       build.zig
2       src/Progress.zig
2       src/install/lockfile/lockfile_json_stringify_for_debugging.zig
2       src/css/small_list.zig
2       src/bun.js/webcore/prompt.zig
1       test/internal/ban-words.test.ts
1       test/internal/ban-limits.json
1       src/watcher/WatcherTrace.zig
1       src/transpiler.zig
1       src/shell/builtin/cp.zig
1       src/js_printer.zig
1       src/io/PipeReader.zig
1       src/install/bin.zig
1       src/css/selectors/selector.zig
1       src/cli/run_command.zig
1       src/bun.js/RuntimeTranspilerStore.zig
1       src/bun.js/bindings/JSRef.zig
1       src/bake/DevServer.zig
```

Files that remove lines:

```
-1      src/test/recover.zig
-1      src/sql/postgres/SocketMonitor.zig
-1      src/sql/mysql/MySQLRequestQueue.zig
-1      src/sourcemap/CodeCoverage.zig
-1      src/css/values/color_js.zig
-1      src/compile_target.zig
-1      src/bundler/linker_context/convertStmtsForChunk.zig
-1      src/bundler/bundle_v2.zig
-1      src/bun.js/webcore/blob/read_file.zig
-1      src/ast/base.zig
-2      src/sql/postgres/protocol/ArrayList.zig
-2      src/shell/builtin/mkdir.zig
-2      src/install/PackageManager/patchPackage.zig
-2      src/install/PackageManager/PackageManagerDirectories.zig
-2      src/fmt.zig
-2      src/css/declaration.zig
-2      src/css/css_parser.zig
-2      src/collections/baby_list.zig
-2      src/bun.js/bindings/ZigStackFrame.zig
-2      src/ast/E.zig
-3      src/StandaloneModuleGraph.zig
-3      src/deps/picohttp.zig
-3      src/deps/libuv.zig
-3      src/btjs.zig
-4      src/threading/Futex.zig
-4      src/shell/builtin/touch.zig
-4      src/meta.zig
-4      src/install/lockfile.zig
-4      src/css/selectors/parser.zig
-5      src/shell/interpreter.zig
-5      src/css/error.zig
-5      src/bun.js/web_worker.zig
-5      src/bun.js.zig
-6      src/cli/test_command.zig
-6      src/bun.js/VirtualMachine.zig
-6      src/bun.js/uuid.zig
-6      src/bun.js/bindings/JSValue.zig
-9      src/bun.js/test/pretty_format.zig
-9      src/bun.js/api/BunObject.zig
-14     src/install/install_binding.zig
-14     src/fd.zig
-14     src/bun.js/node/path.zig
-14     scripts/pack-codegen-for-zig-team.sh
-17     src/bun.js/test/diff_format.zig
```

`git diff --numstat origin/main...HEAD | awk '{ print ($1-$2)"\t"$3 }' |
sort -rn`

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Meghan Denny <meghan@bun.com>
Co-authored-by: tayor.fish <contact@taylor.fish>
2025-11-10 14:38:26 -08:00
Lydia Hallie
143ad2ea58 Remove bun version from Vercel guide (#24562) 2025-11-10 14:09:04 -08:00
Dylan Conway
6f9843ea9a fix(install): bun pm ls with unresolved dependencies (#24541)
### What does this PR do?
Fixes `bun pm ls --all` crash with unresolved optional peer
dependencies.
Fixes `bun pm ls` crash with empty lockfiles.

Fixes #24502 
### How did you verify your code works?
Added a test for both crashes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 11:19:57 -08:00
github-actions[bot]
0a307ed880 deps: update sqlite to 3.51.0 (#24530) 2025-11-09 01:09:25 -08:00
robobun
b4f85c8866 Update docs example versions to 1.3.2 (#24522)
## Summary

Updated all example version placeholders in documentation from 1.3.1 and
1.2.20 to 1.3.2.

## Changes

Updated version examples in:
- Installation examples (Linux/macOS and Windows install commands)
- Package manager output examples (`bun install`, `bun publish`, `bun pm` commands)
- Test runner output examples
- Spawn/child process output examples
- Fetch User-Agent header examples in debugging docs
- `Bun.version` API example

## Notes

- Historical version references (e.g., "As of Bun v1.x.x..." or "Bun
v1.x.x+ required") were intentionally **preserved** as they document
when features were introduced
- Generic package.json version examples (non-Bun package versions) were
**preserved**
- Only example outputs and code snippets showing current Bun version
were updated

## Files Changed (13 total)

- `docs/installation.mdx`
- `docs/guides/install/from-npm-install-to-bun-install.mdx`
- `docs/guides/install/add-peer.mdx`
- `docs/bundler/html-static.mdx` (6 occurrences)
- `docs/test/dom.mdx`
- `docs/pm/cli/publish.mdx`
- `docs/pm/cli/pm.mdx`
- `docs/guides/test/snapshot.mdx` (2 occurrences)
- `docs/guides/ecosystem/nuxt.mdx`
- `docs/guides/util/version.mdx`
- `docs/runtime/debugger.mdx` (3 occurrences)
- `docs/runtime/networking/fetch.mdx`
- `docs/runtime/child-process.mdx`

**Total:** 23 version references updated

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-09 16:20:04 +11:00
Michael H
614e8292e3 docs: fix discord invite (#24498)
### What does this PR do?

We don't have the Discord vanity invite.

### How did you verify your code works?
2025-11-08 21:09:57 -08:00
Michael H
3829b6d0aa add .mdx to .gitattributes (#24525)
### What does this PR do?

### How did you verify your code works?
2025-11-08 20:56:38 -08:00
Meghan Denny
f30e3951a7 Bump 2025-11-07 23:58:34 -08:00
Michael H
b131639cc5 ci: run modified tests first (#24463)
Co-authored-by: Meghan Denny <meghan@bun.com>
2025-11-07 21:49:58 -08:00
Jarred Sumner
b9b07172aa Update package.json 2025-11-07 21:22:53 -08:00
Marko Vejnovic
02b474415d bug(ENG-21479): Fix Valkey URL Parsing (#24458)
### What does this PR do?

Fixes https://github.com/oven-sh/bun/issues/24385

### How did you verify your code works?

Confirmed that the test added in the first commit fails on mainline
`bun` and is fixed in this PR.

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-07 21:18:07 -08:00
Dylan Conway
de9a38bd11 fix(install): create bun.lock instead of bun.lockb if npm/yarn/pnpm migration fails (#24494)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-07 20:58:44 -08:00
Marko Vejnovic
2e57c6bf95 fix(ENG-21492): Fix private git+ssh installs (#24490)
### What does this PR do?

This PR is the fix-only version of
https://github.com/oven-sh/bun/pull/24486. Unfortunately, due to the complexity
of setting up all CI agents to access private git repos, I was unable to get CI
passing there.

### How did you verify your code works?

I ran this:

```
marko@fedora:~/Desktop/bun-4$ bun add git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8
bun add v1.3.2-canary.108 (44402ad2)
  🔍 Resolving [1/1] error: "git clone" for "git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8" failed
error: InstallFailed cloning repository for git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8
error: git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8 failed to resolve
```

followed by

```
marko@fedora:~/Desktop/bun-4$ BUN_DEBUG_QUIET_LOGS=1 ./build/debug/bun-debug add git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8
bun add v1.3.2 (0db90b25)

installed private-install-test-repo@git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8

[1.61s] done
```

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-11-07 19:56:34 -08:00
Lydia Hallie
35a57ff008 Add Upstash guide with Bun Redis (#24493) 2025-11-07 18:32:19 -08:00
Meghan Denny
7a931d5b26 [publish images] 2025-11-07 16:41:57 -08:00
Meghan Denny
2b42be9dcc [publish images] 2025-11-07 16:40:13 -08:00
Meghan Denny
e6be28b8d4 [publish images] 2025-11-07 16:37:57 -08:00
Meghan Denny
d0a1984a20 ci: skip running tests when a PR only changes docs (#24459)
fixes https://linear.app/oven/issue/ENG-21489
2025-11-07 15:52:37 -08:00
Lydia Hallie
1896c75d78 Add deploy guides for AWS Lambda, Google Run, DigitalOcean (#24414)
Adds deployment guides for Bun apps on AWS Lambda, Google Cloud Run, and
DigitalOcean using a custom `Dockerfile`

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-07 15:03:36 -08:00
Marko Vejnovic
3a810da66c build(ENG-21466): Fix sccache not caching across builds (#24423) 2025-11-07 14:33:26 -08:00
Jarred Sumner
0db90b2526 Implement isolated event loop for spawnSync (#24436) 2025-11-07 05:28:33 -08:00
1352 changed files with 97383 additions and 50514 deletions

19
.aikido Normal file
View File

@@ -0,0 +1,19 @@
exclude:
paths:
- test
- scripts
- bench
- packages/bun-lambda
- packages/bun-release
- packages/bun-wasm
- packages/bun-vscode
- packages/bun-plugin-yaml
- packages/bun-plugin-svelte
- packages/bun-native-plugin-rs
- packages/bun-native-bundler-plugin-api
- packages/bun-inspector-protocol
- packages/bun-inspector-frontend
- packages/bun-error
- packages/bun-debug-adapter-protocol
- packages/bun-build-mdx-rs
- packages/@types/bun

View File

@@ -133,6 +133,20 @@ RUN ARCH=$(if [ "$TARGETARCH" = "arm64" ]; then echo "arm64"; else echo "amd64";
RUN mkdir -p /var/cache/buildkite-agent /var/log/buildkite-agent /var/run/buildkite-agent /etc/buildkite-agent /var/lib/buildkite-agent/cache/bun
# The following is necessary to configure buildkite to use a stable
# checkout directory. sccache hashes absolute paths into its cache keys,
# so if buildkite uses a different checkout path each time (which it does
# by default), sccache will be useless.
RUN mkdir -p -m 755 /var/lib/buildkite-agent/hooks && \
cat <<'EOF' > /var/lib/buildkite-agent/hooks/environment
#!/bin/sh
set -efu
export BUILDKITE_BUILD_CHECKOUT_PATH=/var/lib/buildkite-agent/build
EOF
RUN chmod 744 /var/lib/buildkite-agent/hooks/environment
COPY ../*/agent.mjs /var/bun/scripts/
ENV BUN_INSTALL_CACHE=/var/lib/buildkite-agent/cache/bun

View File

@@ -16,6 +16,7 @@ import {
getEmoji,
getEnv,
getLastSuccessfulBuild,
getSecret,
isBuildkite,
isBuildManual,
isFork,
@@ -30,7 +31,7 @@ import {
} from "../scripts/utils.mjs";
/**
* @typedef {"linux" | "darwin" | "windows"} Os
* @typedef {"linux" | "darwin" | "windows" | "freebsd"} Os
* @typedef {"aarch64" | "x64"} Arch
* @typedef {"musl"} Abi
* @typedef {"debian" | "ubuntu" | "alpine" | "amazonlinux"} Distro
@@ -113,6 +114,7 @@ const buildPlatforms = [
{ os: "linux", arch: "x64", abi: "musl", baseline: true, distro: "alpine", release: "3.22" },
{ os: "windows", arch: "x64", release: "2019" },
{ os: "windows", arch: "x64", baseline: true, release: "2019" },
{ os: "freebsd", arch: "x64", release: "14.3" },
];
/**
@@ -123,16 +125,13 @@ const testPlatforms = [
{ os: "darwin", arch: "aarch64", release: "13", tier: "previous" },
{ os: "darwin", arch: "x64", release: "14", tier: "latest" },
{ os: "darwin", arch: "x64", release: "13", tier: "previous" },
{ os: "linux", arch: "aarch64", distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "x64", distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "x64", profile: "asan", distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "aarch64", distro: "debian", release: "13", tier: "latest" },
{ os: "linux", arch: "x64", distro: "debian", release: "13", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "debian", release: "13", tier: "latest" },
{ os: "linux", arch: "x64", profile: "asan", distro: "debian", release: "13", tier: "latest" },
{ os: "linux", arch: "aarch64", distro: "ubuntu", release: "25.04", tier: "latest" },
{ os: "linux", arch: "aarch64", distro: "ubuntu", release: "24.04", tier: "latest" },
{ os: "linux", arch: "x64", distro: "ubuntu", release: "25.04", tier: "latest" },
{ os: "linux", arch: "x64", distro: "ubuntu", release: "24.04", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "ubuntu", release: "25.04", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "ubuntu", release: "24.04", tier: "latest" },
{ os: "linux", arch: "aarch64", abi: "musl", distro: "alpine", release: "3.22", tier: "latest" },
{ os: "linux", arch: "x64", abi: "musl", distro: "alpine", release: "3.22", tier: "latest" },
{ os: "linux", arch: "x64", abi: "musl", baseline: true, distro: "alpine", release: "3.22", tier: "latest" },
@@ -555,7 +554,6 @@ function getBuildBunStep(platform, options) {
/**
* @typedef {Object} TestOptions
* @property {string} [buildId]
* @property {boolean} [unifiedTests]
* @property {string[]} [testFiles]
* @property {boolean} [dryRun]
*/
@@ -568,12 +566,13 @@ function getBuildBunStep(platform, options) {
*/
function getTestBunStep(platform, options, testOptions = {}) {
const { os, profile } = platform;
const { buildId, unifiedTests, testFiles } = testOptions;
const { buildId, testFiles } = testOptions;
const args = [`--step=${getTargetKey(platform)}-build-bun`];
if (buildId) {
args.push(`--build-id=${buildId}`);
}
if (testFiles) {
args.push(...testFiles.map(testFile => `--include=${testFile}`));
}
@@ -590,7 +589,7 @@ function getTestBunStep(platform, options, testOptions = {}) {
agents: getTestAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
parallelism: unifiedTests ? undefined : os === "darwin" ? 2 : 10,
parallelism: os === "darwin" ? 2 : 10,
timeout_in_minutes: profile === "asan" || os === "windows" ? 45 : 30,
env: {
ASAN_OPTIONS: "allow_user_segv_handler=1:disable_coredump=0:detect_leaks=0",
@@ -659,7 +658,7 @@ function getReleaseStep(buildPlatforms, options) {
agents: {
queue: "test-darwin",
},
depends_on: buildPlatforms.map(platform => `${getTargetKey(platform)}-build-bun`),
depends_on: buildPlatforms.filter(p => p.os !== "freebsd").map(platform => `${getTargetKey(platform)}-build-bun`),
env: {
CANARY: revision,
},
@@ -772,8 +771,6 @@ function getBenchmarkStep() {
* @property {Platform[]} [buildPlatforms]
* @property {Platform[]} [testPlatforms]
* @property {string[]} [testFiles]
* @property {boolean} [unifiedBuilds]
* @property {boolean} [unifiedTests]
*/
/**
@@ -944,22 +941,6 @@ function getOptionsStep() {
default: "false",
options: booleanOptions,
},
{
key: "unified-builds",
select: "Do you want to build each platform in a single step?",
hint: "If true, builds will not be split into separate steps (this will likely slow down the build)",
required: false,
default: "false",
options: booleanOptions,
},
{
key: "unified-tests",
select: "Do you want to run tests in a single step?",
hint: "If true, tests will not be split into separate steps (this will be very slow)",
required: false,
default: "false",
options: booleanOptions,
},
],
};
}
@@ -1025,8 +1006,6 @@ async function getPipelineOptions() {
buildImages: parseBoolean(options["build-images"]),
publishImages: parseBoolean(options["publish-images"]),
testFiles: parseArray(options["test-files"]),
unifiedBuilds: parseBoolean(options["unified-builds"]),
unifiedTests: parseBoolean(options["unified-tests"]),
buildPlatforms: buildPlatformKeys?.length
? buildPlatformKeys.flatMap(key => buildProfiles.map(profile => ({ ...buildPlatformsMap.get(key), profile })))
: Array.from(buildPlatformsMap.values()),
@@ -1092,7 +1071,7 @@ async function getPipeline(options = {}) {
const imagePlatforms = new Map(
buildImages || publishImages
? [...buildPlatforms, ...testPlatforms]
.filter(({ os }) => os === "linux" || os === "windows")
.filter(({ os }) => os !== "darwin")
.map(platform => [getImageKey(platform), platform])
: [],
);
@@ -1108,7 +1087,7 @@ async function getPipeline(options = {}) {
});
}
let { skipBuilds, forceBuilds, unifiedBuilds, dryRun } = options;
let { skipBuilds, forceBuilds, dryRun } = options;
dryRun = dryRun || !!buildImages;
/** @type {string | undefined} */
@@ -1126,10 +1105,13 @@ async function getPipeline(options = {}) {
const includeASAN = !isMainBranch();
if (!buildId) {
const relevantBuildPlatforms = includeASAN
let relevantBuildPlatforms = includeASAN
? buildPlatforms
: buildPlatforms.filter(({ profile }) => profile !== "asan");
// run build-image but no build-bun yet
relevantBuildPlatforms = relevantBuildPlatforms.filter(({ os }) => os !== "freebsd");
steps.push(
...relevantBuildPlatforms.map(target => {
const imageKey = getImageKey(target);
@@ -1139,13 +1121,16 @@ async function getPipeline(options = {}) {
dependsOn.push(`${imageKey}-build-image`);
}
const steps = [];
steps.push(getBuildCppStep(target, options));
steps.push(getBuildZigStep(target, options));
steps.push(getLinkBunStep(target, options));
return getStepWithDependsOn(
{
key: getTargetKey(target),
group: getTargetLabel(target),
steps: unifiedBuilds
? [getBuildBunStep(target, options)]
: [getBuildCppStep(target, options), getBuildZigStep(target, options), getLinkBunStep(target, options)],
steps,
},
...dependsOn,
);
@@ -1154,13 +1139,13 @@ async function getPipeline(options = {}) {
}
if (!isMainBranch()) {
const { skipTests, forceTests, unifiedTests, testFiles } = options;
const { skipTests, forceTests, testFiles } = options;
if (!skipTests || forceTests) {
steps.push(
...testPlatforms.map(target => ({
key: getTargetKey(target),
group: getTargetLabel(target),
steps: [getTestBunStep(target, options, { unifiedTests, testFiles, buildId })],
steps: [getTestBunStep(target, options, { testFiles, buildId })],
})),
);
}
@@ -1203,6 +1188,43 @@ async function main() {
console.log("Generated options:", options);
}
startGroup("Querying GitHub for files...");
if (options && isBuildkite && !isMainBranch()) {
/** @type {string[]} */
let allFiles = [];
/** @type {string[]} */
let newFiles = [];
let prFileCount = 0;
try {
console.log("on buildkite: collecting new files from PR");
const per_page = 50;
const { BUILDKITE_PULL_REQUEST } = process.env;
for (let i = 1; i <= 10; i++) {
const res = await fetch(
`https://api.github.com/repos/oven-sh/bun/pulls/${BUILDKITE_PULL_REQUEST}/files?per_page=${per_page}&page=${i}`,
{ headers: { Authorization: `Bearer ${getSecret("GITHUB_TOKEN")}` } },
);
const doc = await res.json();
console.log(`-> page ${i}, found ${doc.length} items`);
if (doc.length === 0) break;
for (const { filename, status } of doc) {
prFileCount += 1;
allFiles.push(filename);
if (status !== "added") continue;
newFiles.push(filename);
}
if (doc.length < per_page) break;
}
console.log(`- PR ${BUILDKITE_PULL_REQUEST}, ${prFileCount} files, ${newFiles.length} new files`);
} catch (e) {
console.error(e);
}
if (allFiles.every(filename => filename.startsWith("docs/"))) {
console.log(`- PR is only docs, skipping tests!`);
return;
}
}
startGroup("Generating pipeline...");
const pipeline = await getPipeline(options);
if (!pipeline) {

View File

@@ -1,5 +1,9 @@
language: en-US
issue_enrichment:
auto_enrich:
enabled: false
reviews:
profile: assertive
request_changes_workflow: false

1
.gitattributes vendored
View File

@@ -16,6 +16,7 @@
*.map text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.md text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mdc text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mdx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mjs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2

View File

@@ -1,66 +0,0 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
github.repository == 'oven-sh/bun' &&
(
(github.event_name == 'issue_comment' && (github.event.comment.author_association == 'MEMBER' || github.event.comment.author_association == 'OWNER' || github.event.comment.author_association == 'COLLABORATOR')) ||
(github.event_name == 'pull_request_review_comment' && (github.event.comment.author_association == 'MEMBER' || github.event.comment.author_association == 'OWNER' || github.event.comment.author_association == 'COLLABORATOR')) ||
(github.event_name == 'pull_request_review' && (github.event.review.author_association == 'MEMBER' || github.event.review.author_association == 'OWNER' || github.event.review.author_association == 'COLLABORATOR')) ||
(github.event_name == 'issues' && (github.event.issue.author_association == 'MEMBER' || github.event.issue.author_association == 'OWNER' || github.event.issue.author_association == 'COLLABORATOR'))
) &&
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: claude
env:
IS_SANDBOX: 1
container:
image: localhost:5000/claude-bun:latest
options: --privileged --user 1000:1000
permissions:
contents: read
id-token: write
steps:
- name: Checkout repository
working-directory: /workspace/bun
run: |
git config --global user.email "claude-bot@bun.sh" && \
git config --global user.name "Claude Bot" && \
git config --global url."git@github.com:".insteadOf "https://github.com/" && \
git config --global url."git@github.com:".insteadOf "http://github.com/" && \
git config --global --add safe.directory /workspace/bun && \
git config --global push.default current && \
git config --global pull.rebase true && \
git config --global init.defaultBranch main && \
git config --global core.editor "vim" && \
git config --global color.ui auto && \
git config --global fetch.prune true && \
git config --global diff.colorMoved zebra && \
git config --global merge.conflictStyle diff3 && \
git config --global rerere.enabled true && \
git config --global core.autocrlf input
git fetch origin ${{ github.event.pull_request.head.sha }}
git checkout ${{ github.event.pull_request.head.ref }}
git reset --hard origin/${{ github.event.pull_request.head.ref }}
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@v1
with:
timeout_minutes: "180"
claude_args: |
--dangerously-skip-permissions
--system-prompt "You are working on the Bun codebase"
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}

View File

@@ -1,58 +0,0 @@
name: Codex Test Sync
on:
pull_request:
types: [labeled, opened]
env:
BUN_VERSION: "1.2.15"
jobs:
sync-node-tests:
runs-on: ubuntu-latest
if: |
(github.event.action == 'labeled' && github.event.label.name == 'codex') ||
(github.event.action == 'opened' && contains(github.event.pull_request.labels.*.name, 'codex')) ||
contains(github.head_ref, 'codex')
permissions:
contents: write
pull-requests: write
steps:
- name: Checkout
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@v44
with:
files: |
test/js/node/test/parallel/**/*.{js,mjs,ts}
test/js/node/test/sequential/**/*.{js,mjs,ts}
- name: Sync tests
if: steps.changed-files.outputs.any_changed == 'true'
shell: bash
run: |
echo "Changed test files:"
echo "${{ steps.changed-files.outputs.all_changed_files }}"
# Process each changed test file
for file in ${{ steps.changed-files.outputs.all_changed_files }}; do
# Extract test name from file path
test_name=$(basename "$file" | sed 's/\.[^.]*$//')
echo "Syncing test: $test_name"
bun node:test:cp "$test_name"
done
- name: Commit changes
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: "Sync Node.js tests with upstream"

View File

@@ -9,7 +9,7 @@ on:
pull_request:
merge_group:
env:
BUN_VERSION: "1.2.20"
BUN_VERSION: "1.3.2"
LLVM_VERSION: "19.1.7"
LLVM_VERSION_MAJOR: "19"

View File

@@ -9,3 +9,6 @@ test/snippets
test/js/node/test
test/napi/node-napi-tests
bun.lock
# the output codeblocks need to stay minified
docs/bundler/minifier.mdx

16
.vscode/settings.json vendored
View File

@@ -27,18 +27,22 @@
"git.ignoreLimitWarning": true,
// Zig
"zig.initialSetupDone": true,
"zig.buildOption": "build",
// "zig.initialSetupDone": true,
// "zig.buildOption": "build",
"zig.zls.zigLibPath": "${workspaceFolder}/vendor/zig/lib",
"zig.buildArgs": ["-Dgenerated-code=./build/debug/codegen", "--watch", "-fincremental"],
"zig.zls.buildOnSaveStep": "check",
"zig.buildOnSaveArgs": [
"-Dgenerated-code=./build/debug/codegen",
"--watch",
"-fincremental"
],
// "zig.zls.buildOnSaveStep": "check",
// "zig.zls.enableBuildOnSave": true,
// "zig.buildOnSave": true,
"zig.buildFilePath": "${workspaceFolder}/build.zig",
// "zig.buildFilePath": "${workspaceFolder}/build.zig",
"zig.path": "${workspaceFolder}/vendor/zig/zig.exe",
"zig.zls.path": "${workspaceFolder}/vendor/zig/zls.exe",
"zig.formattingProvider": "zls",
"zig.zls.enableInlayHints": false,
// "zig.zls.enableInlayHints": false,
"[zig]": {
"editor.tabSize": 4,
"editor.useTabStops": false,

View File

@@ -6,7 +6,7 @@ This is the Bun repository - an all-in-one JavaScript runtime & toolkit designed
- **Build Bun**: `bun bd`
- Creates a debug build at `./build/debug/bun-debug`
- **CRITICAL**: no need for a timeout, the build is really fast!
- **CRITICAL**: do not set a timeout when running `bun bd`
- **Run tests with your debug build**: `bun bd test <test-file>`
- **CRITICAL**: Never use `bun test` directly - it won't include your changes
- **Run any command with debug build**: `bun bd <command>`
@@ -38,16 +38,36 @@ If no valid issue number is provided, find the best existing file to modify inst
### Writing Tests
Tests use Bun's Jest-compatible test runner with proper test fixtures:
Tests use Bun's Jest-compatible test runner with proper test fixtures.
- For **single-file tests**, prefer `-e` over `tempDir`.
- For **multi-file tests**, prefer `tempDir` and `Bun.spawn`.
```typescript
import { test, expect } from "bun:test";
import { bunEnv, bunExe, normalizeBunSnapshot, tempDir } from "harness";
test("my feature", async () => {
test("(single-file test) my feature", async () => {
await using proc = Bun.spawn({
cmd: [bunExe(), "-e", "console.log('Hello, world!')"],
env: bunEnv,
});
const [stdout, stderr, exitCode] = await Promise.all([
proc.stdout.text(),
proc.stderr.text(),
proc.exited,
]);
expect(normalizeBunSnapshot(stdout)).toMatchInlineSnapshot(`"Hello, world!"`);
expect(exitCode).toBe(0);
});
test("(multi-file test) my feature", async () => {
// Create temp directory with test files
using dir = tempDir("test-prefix", {
"index.js": `console.log("hello");`,
"index.js": `import { foo } from "./foo.ts"; foo();`,
"foo.ts": `export function foo() { console.log("foo"); }`,
});
// Spawn Bun process
@@ -74,7 +94,7 @@ test("my feature", async () => {
- Always use `port: 0`. Do not hardcode ports. Do not use your own random port number function.
- Use `normalizeBunSnapshot` to normalize snapshot output of the test.
- NEVER write tests that check for no "panic" or "uncaught exception" or similar in the test output. That is NOT a valid test.
- NEVER write tests that check for no "panic" or "uncaught exception" or similar in the test output. These tests will never fail in CI.
- Use `tempDir` from `"harness"` to create a temporary directory. **Do not** use `tmpdirSync` or `fs.mkdtempSync` to create temporary directories.
- When spawning processes, tests should expect(stdout).toBe(...) BEFORE expect(exitCode).toBe(0). This gives you a more useful error message on test failure.
- **CRITICAL**: Do not write flaky tests. Do not use `setTimeout` in tests. Instead, `await` the condition to be met. You are not testing the TIME PASSING, you are testing the CONDITION.

View File

@@ -25,16 +25,6 @@ if(CMAKE_HOST_APPLE)
endif()
include(SetupLLVM)
find_program(SCCACHE_PROGRAM sccache)
if(SCCACHE_PROGRAM AND NOT DEFINED ENV{NO_SCCACHE})
include(SetupSccache)
else()
find_program(CCACHE_PROGRAM ccache)
if(CCACHE_PROGRAM)
include(SetupCcache)
endif()
endif()
# --- Project ---
parse_package_json(VERSION_VARIABLE DEFAULT_VERSION)
@@ -57,6 +47,16 @@ include(SetupEsbuild)
include(SetupZig)
include(SetupRust)
find_program(SCCACHE_PROGRAM sccache)
if(SCCACHE_PROGRAM AND NOT DEFINED ENV{NO_SCCACHE})
include(SetupSccache)
else()
find_program(CCACHE_PROGRAM ccache)
if(CCACHE_PROGRAM)
include(SetupCcache)
endif()
endif()
# Generate dependency versions header
include(GenerateDependencyVersions)

View File

@@ -201,7 +201,7 @@ Bun generally takes about 2.5 minutes to compile a debug build when there are Zi
- Batch up your changes
- Ensure zls is running with incremental watching for LSP errors (if you use VSCode and install Zig and run `bun run build` once to download Zig, this should just work)
- Prefer using the debugger ("CodeLLDB" in VSCode) to step through the code.
- Use debug logs. `BUN_DEBUG_<scope>=1` will enable debug logging for the corresponding `Output.scoped(.<scope>, .hidden)` logs. You can also set `BUN_DEBUG_QUIET_LOGS=1` to disable all debug logging that isn't explicitly enabled. To dump debug lgos into a file, `BUN_DEBUG=<path-to-file>.log`. Debug logs are aggressively removed in release builds.
- Use debug logs. `BUN_DEBUG_<scope>=1` will enable debug logging for the corresponding `Output.scoped(.<scope>, .hidden)` logs. You can also set `BUN_DEBUG_QUIET_LOGS=1` to disable all debug logging that isn't explicitly enabled. To dump debug logs into a file, `BUN_DEBUG=<path-to-file>.log`. Debug logs are aggressively removed in release builds.
- src/js/\*\*.ts changes are pretty much instant to rebuild. C++ changes are a bit slower, but still much faster than the Zig code (Zig is one compilation unit, C++ is many).
## Code generation scripts

2
LATEST
View File

@@ -1 +1 @@
1.3.1
1.3.5

View File

@@ -54,7 +54,7 @@ Bun supports Linux (x64 & arm64), macOS (x64 & Apple Silicon) and Windows (x64).
curl -fsSL https://bun.com/install | bash
# on windows
powershell -c "irm bun.com/install.ps1 | iex"
powershell -c "irm bun.sh/install.ps1 | iex"
# with npm
npm install -g bun
@@ -104,13 +104,13 @@ bun upgrade --canary
- [File types (Loaders)](https://bun.com/docs/runtime/loaders)
- [TypeScript](https://bun.com/docs/runtime/typescript)
- [JSX](https://bun.com/docs/runtime/jsx)
- [Environment variables](https://bun.com/docs/runtime/env)
- [Environment variables](https://bun.com/docs/runtime/environment-variables)
- [Bun APIs](https://bun.com/docs/runtime/bun-apis)
- [Web APIs](https://bun.com/docs/runtime/web-apis)
- [Node.js compatibility](https://bun.com/docs/runtime/nodejs-apis)
- [Node.js compatibility](https://bun.com/docs/runtime/nodejs-compat)
- [Single-file executable](https://bun.com/docs/bundler/executables)
- [Plugins](https://bun.com/docs/runtime/plugins)
- [Watch mode / Hot Reloading](https://bun.com/docs/runtime/hot)
- [Watch mode / Hot Reloading](https://bun.com/docs/runtime/watch-mode)
- [Module resolution](https://bun.com/docs/runtime/modules)
- [Auto-install](https://bun.com/docs/runtime/autoimport)
- [bunfig.toml](https://bun.com/docs/runtime/bunfig)
@@ -230,7 +230,7 @@ bun upgrade --canary
- Ecosystem
- [Use React and JSX](https://bun.com/guides/ecosystem/react)
- [Use EdgeDB with Bun](https://bun.com/guides/ecosystem/edgedb)
- [Use Gel with Bun](https://bun.com/guides/ecosystem/gel)
- [Use Prisma with Bun](https://bun.com/guides/ecosystem/prisma)
- [Add Sentry to a Bun app](https://bun.com/guides/ecosystem/sentry)
- [Create a Discord bot](https://bun.com/guides/ecosystem/discordjs)

View File

@@ -1,5 +1,6 @@
{
"lockfileVersion": 1,
"configVersion": 0,
"workspaces": {
"": {
"name": "bench",

View File

@@ -1,5 +1,6 @@
{
"lockfileVersion": 1,
"configVersion": 0,
"workspaces": {
"": {
"name": "installbench",
@@ -12,11 +13,11 @@
"@trpc/server": "^11.0.0",
"drizzle-orm": "^0.41.0",
"esbuild": "^0.25.11",
"next": "^15.2.3",
"next-auth": "5.0.0-beta.25",
"next": "^16.1.0",
"next-auth": "5.0.0-beta.30",
"postgres": "^3.4.4",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"react": "^19.2.3",
"react-dom": "^19.2.3",
"server-only": "^0.0.1",
"superjson": "^2.2.1",
"zod": "^3.24.2",
@@ -25,8 +26,8 @@
"@biomejs/biome": "1.9.4",
"@tailwindcss/postcss": "^4.0.15",
"@types/node": "^20.14.10",
"@types/react": "^19.0.0",
"@types/react-dom": "^19.0.0",
"@types/react": "^19.2.3",
"@types/react-dom": "^19.2.3",
"drizzle-kit": "^0.30.5",
"postcss": "^8.5.3",
"tailwindcss": "^4.0.15",
@@ -175,23 +176,23 @@
"@jridgewell/trace-mapping": ["@jridgewell/trace-mapping@0.3.31", "", { "dependencies": { "@jridgewell/resolve-uri": "3.1.2", "@jridgewell/sourcemap-codec": "1.5.5" } }, "sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw=="],
"@next/env": ["@next/env@15.5.6", "", {}, "sha512-3qBGRW+sCGzgbpc5TS1a0p7eNxnOarGVQhZxfvTdnV0gFI61lX7QNtQ4V1TSREctXzYn5NetbUsLvyqwLFJM6Q=="],
"@next/env": ["@next/env@16.1.0", "", {}, "sha512-Dd23XQeFHmhf3KBW76leYVkejHlCdB7erakC2At2apL1N08Bm+dLYNP+nNHh0tzUXfPQcNcXiQyacw0PG4Fcpw=="],
"@next/swc-darwin-arm64": ["@next/swc-darwin-arm64@15.5.6", "", { "os": "darwin", "cpu": "arm64" }, "sha512-ES3nRz7N+L5Umz4KoGfZ4XX6gwHplwPhioVRc25+QNsDa7RtUF/z8wJcbuQ2Tffm5RZwuN2A063eapoJ1u4nPg=="],
"@next/swc-darwin-arm64": ["@next/swc-darwin-arm64@16.1.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-onHq8dl8KjDb8taANQdzs3XmIqQWV3fYdslkGENuvVInFQzZnuBYYOG2HGHqqtvgmEU7xWzhgndXXxnhk4Z3fQ=="],
"@next/swc-darwin-x64": ["@next/swc-darwin-x64@15.5.6", "", { "os": "darwin", "cpu": "x64" }, "sha512-JIGcytAyk9LQp2/nuVZPAtj8uaJ/zZhsKOASTjxDug0SPU9LAM3wy6nPU735M1OqacR4U20LHVF5v5Wnl9ptTA=="],
"@next/swc-darwin-x64": ["@next/swc-darwin-x64@16.1.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-Am6VJTp8KhLuAH13tPrAoVIXzuComlZlMwGr++o2KDjWiKPe3VwpxYhgV6I4gKls2EnsIMggL4y7GdXyDdJcFA=="],
"@next/swc-linux-arm64-gnu": ["@next/swc-linux-arm64-gnu@15.5.6", "", { "os": "linux", "cpu": "arm64" }, "sha512-qvz4SVKQ0P3/Im9zcS2RmfFL/UCQnsJKJwQSkissbngnB/12c6bZTCB0gHTexz1s6d/mD0+egPKXAIRFVS7hQg=="],
"@next/swc-linux-arm64-gnu": ["@next/swc-linux-arm64-gnu@16.1.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-fVicfaJT6QfghNyg8JErZ+EMNQ812IS0lmKfbmC01LF1nFBcKfcs4Q75Yy8IqnsCqH/hZwGhqzj3IGVfWV6vpA=="],
"@next/swc-linux-arm64-musl": ["@next/swc-linux-arm64-musl@15.5.6", "", { "os": "linux", "cpu": "arm64" }, "sha512-FsbGVw3SJz1hZlvnWD+T6GFgV9/NYDeLTNQB2MXoPN5u9VA9OEDy6fJEfePfsUKAhJufFbZLgp0cPxMuV6SV0w=="],
"@next/swc-linux-arm64-musl": ["@next/swc-linux-arm64-musl@16.1.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-TojQnDRoX7wJWXEEwdfuJtakMDW64Q7NrxQPviUnfYJvAx5/5wcGE+1vZzQ9F17m+SdpFeeXuOr6v3jbyusYMQ=="],
"@next/swc-linux-x64-gnu": ["@next/swc-linux-x64-gnu@15.5.6", "", { "os": "linux", "cpu": "x64" }, "sha512-3QnHGFWlnvAgyxFxt2Ny8PTpXtQD7kVEeaFat5oPAHHI192WKYB+VIKZijtHLGdBBvc16tiAkPTDmQNOQ0dyrA=="],
"@next/swc-linux-x64-gnu": ["@next/swc-linux-x64-gnu@16.1.0", "", { "os": "linux", "cpu": "x64" }, "sha512-quhNFVySW4QwXiZkZ34SbfzNBm27vLrxZ2HwTfFFO1BBP0OY1+pI0nbyewKeq1FriqU+LZrob/cm26lwsiAi8Q=="],
"@next/swc-linux-x64-musl": ["@next/swc-linux-x64-musl@15.5.6", "", { "os": "linux", "cpu": "x64" }, "sha512-OsGX148sL+TqMK9YFaPFPoIaJKbFJJxFzkXZljIgA9hjMjdruKht6xDCEv1HLtlLNfkx3c5w2GLKhj7veBQizQ=="],
"@next/swc-linux-x64-musl": ["@next/swc-linux-x64-musl@16.1.0", "", { "os": "linux", "cpu": "x64" }, "sha512-6JW0z2FZUK5iOVhUIWqE4RblAhUj1EwhZ/MwteGb//SpFTOHydnhbp3868gxalwea+mbOLWO6xgxj9wA9wNvNw=="],
"@next/swc-win32-arm64-msvc": ["@next/swc-win32-arm64-msvc@15.5.6", "", { "os": "win32", "cpu": "arm64" }, "sha512-ONOMrqWxdzXDJNh2n60H6gGyKed42Ieu6UTVPZteXpuKbLZTH4G4eBMsr5qWgOBA+s7F+uB4OJbZnrkEDnZ5Fg=="],
"@next/swc-win32-arm64-msvc": ["@next/swc-win32-arm64-msvc@16.1.0", "", { "os": "win32", "cpu": "arm64" }, "sha512-+DK/akkAvvXn5RdYN84IOmLkSy87SCmpofJPdB8vbLmf01BzntPBSYXnMvnEEv/Vcf3HYJwt24QZ/s6sWAwOMQ=="],
"@next/swc-win32-x64-msvc": ["@next/swc-win32-x64-msvc@15.5.6", "", { "os": "win32", "cpu": "x64" }, "sha512-pxK4VIjFRx1MY92UycLOOw7dTdvccWsNETQ0kDHkBlcFH1GrTLUjSiHU1ohrznnux6TqRHgv5oflhfIWZwVROQ=="],
"@next/swc-win32-x64-msvc": ["@next/swc-win32-x64-msvc@16.1.0", "", { "os": "win32", "cpu": "x64" }, "sha512-Tr0j94MphimCCks+1rtYPzQFK+faJuhHWCegU9S9gDlgyOk8Y3kPmO64UcjyzZAlligeBtYZ/2bEyrKq0d2wqQ=="],
"@panva/hkdf": ["@panva/hkdf@1.2.1", "", {}, "sha512-6oclG6Y3PiDFcoyk8srjLfVKyMfVCKJ27JwNPViuXziFpmdz+MZnZN/aKY0JGXgYuO/VghU0jcOAZgWXZ1Dmrw=="],
@@ -243,13 +244,13 @@
"@trpc/server": ["@trpc/server@11.7.1", "", { "peerDependencies": { "typescript": "5.9.3" } }, "sha512-N3U8LNLIP4g9C7LJ/sLkjuPHwqlvE3bnspzC4DEFVdvx2+usbn70P80E3wj5cjOTLhmhRiwJCSXhlB+MHfGeCw=="],
"@types/cookie": ["@types/cookie@0.6.0", "", {}, "sha512-4Kh9a6B2bQciAhf7FSuMRRkUWecJgJu9nPnx3yzpsfXX/c50REIqpHY4C82bXP90qrLtXtkDxTZosYO3UpOwlA=="],
"@types/node": ["@types/node@20.19.24", "", { "dependencies": { "undici-types": "6.21.0" } }, "sha512-FE5u0ezmi6y9OZEzlJfg37mqqf6ZDSF2V/NLjUyGrR9uTZ7Sb9F7bLNZ03S4XVUNRWGA7Ck4c1kK+YnuWjl+DA=="],
"@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "3.1.3" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="],
"@types/react": ["@types/react@19.2.7", "", { "dependencies": { "csstype": "^3.2.2" } }, "sha512-MWtvHrGZLFttgeEj28VXHxpmwYbor/ATPYbBfSFZEIRK0ecCFLl2Qo55z52Hss+UV9CRN7trSeq1zbgx7YDWWg=="],
"@types/react-dom": ["@types/react-dom@19.2.2", "", { "peerDependencies": { "@types/react": "19.2.2" } }, "sha512-9KQPoO6mZCi7jcIStSnlOWn2nEF3mNmyr3rIAsGnAbQKYbRLyqmeSc39EVgtxXVia+LMT8j3knZLAZAh+xLmrw=="],
"@types/react-dom": ["@types/react-dom@19.2.3", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ=="],
"baseline-browser-mapping": ["baseline-browser-mapping@2.9.11", "", { "bin": { "baseline-browser-mapping": "dist/cli.js" } }, "sha512-Sg0xJUNDU1sJNGdfGWhVHX0kkZ+HWcvmVymJbj6NSgZZmW/8S9Y2HQ5euytnIgakgxN6papOAWiwDo1ctFDcoQ=="],
"buffer-from": ["buffer-from@1.1.2", "", {}, "sha512-E+XQCRwSbaaiChtv6k6Dwgc+bx+Bs6vuKJHHl5kox/BaKbhiXzqQOwK4cO22yElGp2OCmjwVhT3HmxgyPGnJfQ=="],
@@ -257,11 +258,9 @@
"client-only": ["client-only@0.0.1", "", {}, "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA=="],
"cookie": ["cookie@0.7.1", "", {}, "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w=="],
"copy-anything": ["copy-anything@4.0.5", "", { "dependencies": { "is-what": "5.5.0" } }, "sha512-7Vv6asjS4gMOuILabD3l739tsaxFQmC+a7pLZm02zyvs8p977bL3zEgq3yDk5rn9B0PbYgIv++jmHcuUab4RhA=="],
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
"csstype": ["csstype@3.2.3", "", {}, "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ=="],
"debug": ["debug@4.4.3", "", { "dependencies": { "ms": "2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],
@@ -323,9 +322,9 @@
"nanoid": ["nanoid@3.3.11", "", { "bin": { "nanoid": "bin/nanoid.cjs" } }, "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w=="],
"next": ["next@15.5.6", "", { "dependencies": { "@next/env": "15.5.6", "@swc/helpers": "0.5.15", "caniuse-lite": "1.0.30001752", "postcss": "8.4.31", "styled-jsx": "5.1.6" }, "optionalDependencies": { "@next/swc-darwin-arm64": "15.5.6", "@next/swc-darwin-x64": "15.5.6", "@next/swc-linux-arm64-gnu": "15.5.6", "@next/swc-linux-arm64-musl": "15.5.6", "@next/swc-linux-x64-gnu": "15.5.6", "@next/swc-linux-x64-musl": "15.5.6", "@next/swc-win32-arm64-msvc": "15.5.6", "@next/swc-win32-x64-msvc": "15.5.6", "sharp": "0.34.4" }, "peerDependencies": { "react": "19.2.0", "react-dom": "19.2.0" }, "bin": { "next": "dist/bin/next" } }, "sha512-zTxsnI3LQo3c9HSdSf91O1jMNsEzIXDShXd4wVdg9y5shwLqBXi4ZtUUJyB86KGVSJLZx0PFONvO54aheGX8QQ=="],
"next": ["next@16.1.0", "", { "dependencies": { "@next/env": "16.1.0", "@swc/helpers": "0.5.15", "baseline-browser-mapping": "^2.8.3", "caniuse-lite": "^1.0.30001579", "postcss": "8.4.31", "styled-jsx": "5.1.6" }, "optionalDependencies": { "@next/swc-darwin-arm64": "16.1.0", "@next/swc-darwin-x64": "16.1.0", "@next/swc-linux-arm64-gnu": "16.1.0", "@next/swc-linux-arm64-musl": "16.1.0", "@next/swc-linux-x64-gnu": "16.1.0", "@next/swc-linux-x64-musl": "16.1.0", "@next/swc-win32-arm64-msvc": "16.1.0", "@next/swc-win32-x64-msvc": "16.1.0", "sharp": "^0.34.4" }, "peerDependencies": { "@opentelemetry/api": "^1.1.0", "@playwright/test": "^1.51.1", "babel-plugin-react-compiler": "*", "react": "^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0", "react-dom": "^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0", "sass": "^1.3.0" }, "optionalPeers": ["@opentelemetry/api", "@playwright/test", "babel-plugin-react-compiler", "sass"], "bin": { "next": "dist/bin/next" } }, "sha512-Y+KbmDbefYtHDDQKLNrmzE/YYzG2msqo2VXhzh5yrJ54tx/6TmGdkR5+kP9ma7i7LwZpZMfoY3m/AoPPPKxtVw=="],
"next-auth": ["next-auth@5.0.0-beta.25", "", { "dependencies": { "@auth/core": "0.37.2" }, "peerDependencies": { "next": "15.5.6", "react": "19.2.0" } }, "sha512-2dJJw1sHQl2qxCrRk+KTQbeH+izFbGFPuJj5eGgBZFYyiYYtvlrBeUw1E/OJJxTRjuxbSYGnCTkUIRsIIW0bog=="],
"next-auth": ["next-auth@5.0.0-beta.30", "", { "dependencies": { "@auth/core": "0.41.0" }, "peerDependencies": { "@simplewebauthn/browser": "^9.0.1", "@simplewebauthn/server": "^9.0.2", "next": "^14.0.0-0 || ^15.0.0 || ^16.0.0", "nodemailer": "^7.0.7", "react": "^18.2.0 || ^19.0.0" }, "optionalPeers": ["@simplewebauthn/browser", "@simplewebauthn/server", "nodemailer"] }, "sha512-+c51gquM3F6nMVmoAusRJ7RIoY0K4Ts9HCCwyy/BRoe4mp3msZpOzYMyb5LAYc1wSo74PMQkGDcaghIO7W6Xjg=="],
"oauth4webapi": ["oauth4webapi@3.8.2", "", {}, "sha512-FzZZ+bht5X0FKe7Mwz3DAVAmlH1BV5blSak/lHMBKz0/EBMhX6B10GlQYI51+oRp8ObJaX0g6pXrAxZh5s8rjw=="],
@@ -339,11 +338,9 @@
"preact-render-to-string": ["preact-render-to-string@6.5.11", "", { "peerDependencies": { "preact": "10.24.3" } }, "sha512-ubnauqoGczeGISiOh6RjX0/cdaF8v/oDXIjO85XALCQjwQP+SB4RDXXtvZ6yTYSjG+PC1QRP2AhPgCEsM2EvUw=="],
"pretty-format": ["pretty-format@3.8.0", "", {}, "sha512-WuxUnVtlWL1OfZFQFuqvnvs6MiAGk9UNsBostyBOB0Is9wb5uRESevA6rnl/rkksXaGX3GzZhPup5d6Vp1nFew=="],
"react": ["react@19.2.3", "", {}, "sha512-Ku/hhYbVjOQnXDZFv2+RibmLFGwFdeeKHFcOTlrt7xplBnya5OGn/hIRDsqDiSUcfORsDC7MPxwork8jBwsIWA=="],
"react": ["react@19.2.0", "", {}, "sha512-tmbWg6W31tQLeB5cdIBOicJDJRR2KzXsV7uSK9iNfLWQ5bIZfxuPEHp7M8wiHyHnn0DD1i7w3Zmin0FtkrwoCQ=="],
"react-dom": ["react-dom@19.2.0", "", { "dependencies": { "scheduler": "0.27.0" }, "peerDependencies": { "react": "19.2.0" } }, "sha512-UlbRu4cAiGaIewkPyiRGJk0imDN2T3JjieT6spoL2UeSf5od4n5LB/mQ4ejmxhCFT1tYe8IvaFulzynWovsEFQ=="],
"react-dom": ["react-dom@19.2.3", "", { "dependencies": { "scheduler": "^0.27.0" }, "peerDependencies": { "react": "^19.2.3" } }, "sha512-yELu4WmLPw5Mr/lmeEpox5rw3RETacE++JgHqQzd2dg+YbJuat3jH4ingc+WPZhxaoFzdv9y33G+F7Nl5O0GBg=="],
"resolve-pkg-maps": ["resolve-pkg-maps@1.0.0", "", {}, "sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw=="],
@@ -387,7 +384,7 @@
"next/postcss": ["postcss@8.4.31", "", { "dependencies": { "nanoid": "3.3.11", "picocolors": "1.1.1", "source-map-js": "1.2.1" } }, "sha512-PS08Iboia9mts/2ygV3eLpY5ghnUcfLV/EXTOW1E2qYxJKGGBUtNjN76FYHnMs36RmARn41bC0AZmn+rR0OVpQ=="],
"next-auth/@auth/core": ["@auth/core@0.37.2", "", { "dependencies": { "@panva/hkdf": "1.2.1", "@types/cookie": "0.6.0", "cookie": "0.7.1", "jose": "5.10.0", "oauth4webapi": "3.8.2", "preact": "10.11.3", "preact-render-to-string": "5.2.3" } }, "sha512-kUvzyvkcd6h1vpeMAojK2y7+PAV5H+0Cc9+ZlKYDFhDY31AlvsB+GW5vNO4qE3Y07KeQgvNO9U0QUx/fN62kBw=="],
"next-auth/@auth/core": ["@auth/core@0.41.0", "", { "dependencies": { "@panva/hkdf": "^1.2.1", "jose": "^6.0.6", "oauth4webapi": "^3.3.0", "preact": "10.24.3", "preact-render-to-string": "6.5.11" }, "peerDependencies": { "@simplewebauthn/browser": "^9.0.1", "@simplewebauthn/server": "^9.0.2", "nodemailer": "^6.8.0" }, "optionalPeers": ["@simplewebauthn/browser", "@simplewebauthn/server", "nodemailer"] }, "sha512-Wd7mHPQ/8zy6Qj7f4T46vg3aoor8fskJm6g2Zyj064oQ3+p0xNZXAV60ww0hY+MbTesfu29kK14Zk5d5JTazXQ=="],
"@esbuild-kit/core-utils/esbuild/@esbuild/android-arm": ["@esbuild/android-arm@0.18.20", "", { "os": "android", "cpu": "arm" }, "sha512-fyi7TDI/ijKKNZTUJAQqiG5T7YjJXgnzkURqmGj13C6dCqckZBLdl4h7bkhHt/t0WP+zO9/zwroDvANaOqO5Sw=="],
@@ -478,11 +475,5 @@
"drizzle-kit/esbuild/@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.19.12", "", { "os": "win32", "cpu": "ia32" }, "sha512-+ZOE6pUkMOJfmxmBZElNOx72NKpIa/HFOMGzu8fqzQJ5kgf6aTGrcJaFsNiVMH4JKpMipyK+7k0n2UXN7a8YKQ=="],
"drizzle-kit/esbuild/@esbuild/win32-x64": ["@esbuild/win32-x64@0.19.12", "", { "os": "win32", "cpu": "x64" }, "sha512-T1QyPSDCyMXaO3pzBkF96E8xMkiRYbUEZADd29SyPGabqxMViNoii+NcK7eWJAEoU6RZyEm5lVSIjTmcdoB9HA=="],
"next-auth/@auth/core/jose": ["jose@5.10.0", "", {}, "sha512-s+3Al/p9g32Iq+oqXxkW//7jk2Vig6FF1CFqzVXoTUXt2qz89YWbL+OwS17NFYEvxC35n0FKeGO2LGYSxeM2Gg=="],
"next-auth/@auth/core/preact": ["preact@10.11.3", "", {}, "sha512-eY93IVpod/zG3uMF22Unl8h9KkrcKIRs2EGar8hwLZZDU1lkjph303V9HZBwufh2s736U6VXuhD109LYqPoffg=="],
"next-auth/@auth/core/preact-render-to-string": ["preact-render-to-string@5.2.3", "", { "dependencies": { "pretty-format": "3.8.0" }, "peerDependencies": { "preact": "10.11.3" } }, "sha512-aPDxUn5o3GhWdtJtW0svRC2SS/l8D9MAgo2+AWml+BhDImb27ALf04Q2d+AHqUUOc6RdSXFIBVa2gxzgMKgtZA=="],
}
}

View File

@@ -26,11 +26,11 @@
"@trpc/server": "^11.0.0",
"drizzle-orm": "^0.41.0",
"esbuild": "^0.25.11",
"next": "^15.2.3",
"next-auth": "5.0.0-beta.25",
"next": "^16.1.0",
"next-auth": "5.0.0-beta.30",
"postgres": "^3.4.4",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"react": "^19.2.3",
"react-dom": "^19.2.3",
"server-only": "^0.0.1",
"superjson": "^2.2.1",
"zod": "^3.24.2"
@@ -39,8 +39,8 @@
"@biomejs/biome": "1.9.4",
"@tailwindcss/postcss": "^4.0.15",
"@types/node": "^20.14.10",
"@types/react": "^19.0.0",
"@types/react-dom": "^19.0.0",
"@types/react": "^19.2.3",
"@types/react-dom": "^19.2.3",
"drizzle-kit": "^0.30.5",
"postcss": "^8.5.3",
"tailwindcss": "^4.0.15",

View File

@@ -1,18 +1,19 @@
{
"lockfileVersion": 1,
"configVersion": 0,
"workspaces": {
"": {
"name": "react-hello-world",
"dependencies": {
"react": "^19.2.0",
"react-dom": "^19.2.0",
"react": "^19.2.3",
"react-dom": "^19.2.3",
},
},
},
"packages": {
"react": ["react@19.2.0", "", {}, "sha512-tmbWg6W31tQLeB5cdIBOicJDJRR2KzXsV7uSK9iNfLWQ5bIZfxuPEHp7M8wiHyHnn0DD1i7w3Zmin0FtkrwoCQ=="],
"react": ["react@19.2.3", "", {}, "sha512-Ku/hhYbVjOQnXDZFv2+RibmLFGwFdeeKHFcOTlrt7xplBnya5OGn/hIRDsqDiSUcfORsDC7MPxwork8jBwsIWA=="],
"react-dom": ["react-dom@19.2.0", "", { "dependencies": { "scheduler": "^0.27.0" }, "peerDependencies": { "react": "^19.2.0" } }, "sha512-UlbRu4cAiGaIewkPyiRGJk0imDN2T3JjieT6spoL2UeSf5od4n5LB/mQ4ejmxhCFT1tYe8IvaFulzynWovsEFQ=="],
"react-dom": ["react-dom@19.2.3", "", { "dependencies": { "scheduler": "^0.27.0" }, "peerDependencies": { "react": "^19.2.3" } }, "sha512-yELu4WmLPw5Mr/lmeEpox5rw3RETacE++JgHqQzd2dg+YbJuat3jH4ingc+WPZhxaoFzdv9y33G+F7Nl5O0GBg=="],
"scheduler": ["scheduler@0.27.0", "", {}, "sha512-eNv+WrVbKu1f3vbYJT/xtiF5syA5HPIMtf9IgY/nKg0sWqzAUEvqY/xm7OcZc/qafLx/iO9FgOmeSAp4v5ti/Q=="],
}

View File

@@ -11,7 +11,7 @@
"author": "Colin McDonnell",
"license": "ISC",
"dependencies": {
"react": "^19.2.0",
"react-dom": "^19.2.0"
"react": "^19.2.3",
"react-dom": "^19.2.3"
}
}

View File

@@ -13,7 +13,4 @@ export function run(opts = {}) {
}
export const bench = Mitata.bench;
export function group(_name, fn) {
return Mitata.group(fn);
}
export const group = Mitata.group;

View File

@@ -0,0 +1,156 @@
import { bench, group, run } from "../runner.mjs";
const runAll = !process.argv.includes("--simple");
const small = new Uint8Array(1024);
const medium = new Uint8Array(1024 * 100);
const large = new Uint8Array(1024 * 1024);
for (let i = 0; i < large.length; i++) {
const value = Math.floor(Math.sin(i / 100) * 128 + 128);
if (i < small.length) small[i] = value;
if (i < medium.length) medium[i] = value;
large[i] = value;
}
const format = new Intl.NumberFormat("en-US", { notation: "compact", unit: "byte" });
async function compress(data, format) {
const cs = new CompressionStream(format);
const writer = cs.writable.getWriter();
const reader = cs.readable.getReader();
writer.write(data);
writer.close();
const chunks = [];
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(value);
}
const result = new Uint8Array(chunks.reduce((acc, chunk) => acc + chunk.length, 0));
let offset = 0;
for (const chunk of chunks) {
result.set(chunk, offset);
offset += chunk.length;
}
return result;
}
async function decompress(data, format) {
const ds = new DecompressionStream(format);
const writer = ds.writable.getWriter();
const reader = ds.readable.getReader();
writer.write(data);
writer.close();
const chunks = [];
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(value);
}
const result = new Uint8Array(chunks.reduce((acc, chunk) => acc + chunk.length, 0));
let offset = 0;
for (const chunk of chunks) {
result.set(chunk, offset);
offset += chunk.length;
}
return result;
}
async function roundTrip(data, format) {
const compressed = await compress(data, format);
return await decompress(compressed, format);
}
const formats = ["deflate", "gzip", "deflate-raw"];
if (runAll) formats.push("brotli", "zstd");
// Small data benchmarks (1KB)
group(`CompressionStream ${format.format(small.length)}`, () => {
for (const fmt of formats) {
try {
new CompressionStream(fmt);
bench(fmt, async () => await compress(small, fmt));
} catch (e) {
// Skip unsupported formats
}
}
});
// Medium data benchmarks (100KB)
group(`CompressionStream ${format.format(medium.length)}`, () => {
for (const fmt of formats) {
try {
new CompressionStream(fmt);
bench(fmt, async () => await compress(medium, fmt));
} catch (e) {}
}
});
// Large data benchmarks (1MB)
group(`CompressionStream ${format.format(large.length)}`, () => {
for (const fmt of formats) {
try {
new CompressionStream(fmt);
bench(fmt, async () => await compress(large, fmt));
} catch (e) {
// Skip unsupported formats
}
}
});
const compressedData = {};
for (const fmt of formats) {
try {
compressedData[fmt] = {
small: await compress(small, fmt),
medium: await compress(medium, fmt),
large: await compress(large, fmt),
};
} catch (e) {
// Skip unsupported formats
}
}
group(`DecompressionStream ${format.format(small.length)}`, () => {
for (const fmt of formats) {
if (compressedData[fmt]) {
bench(fmt, async () => await decompress(compressedData[fmt].small, fmt));
}
}
});
group(`DecompressionStream ${format.format(medium.length)}`, () => {
for (const fmt of formats) {
if (compressedData[fmt]) {
bench(fmt, async () => await decompress(compressedData[fmt].medium, fmt));
}
}
});
group(`DecompressionStream ${format.format(large.length)}`, () => {
for (const fmt of formats) {
if (compressedData[fmt]) {
bench(fmt, async () => await decompress(compressedData[fmt].large, fmt));
}
}
});
group(`roundtrip ${format.format(large.length)}`, () => {
for (const fmt of formats) {
try {
new CompressionStream(fmt);
bench(fmt, async () => await roundTrip(large, fmt));
} catch (e) {
// Skip unsupported formats
}
}
});
await run();

View File

@@ -0,0 +1,48 @@
import { bench, group, run } from "../runner.mjs";
const patterns = [
{ name: "string pattern", input: "https://(sub.)?example(.com/)foo" },
{ name: "hostname IDN", input: { hostname: "xn--caf-dma.com" } },
{
name: "pathname + search + hash + baseURL",
input: {
pathname: "/foo",
search: "bar",
hash: "baz",
baseURL: "https://example.com:8080",
},
},
{ name: "pathname with regex", input: { pathname: "/([[a-z]--a])" } },
{ name: "named groups", input: { pathname: "/users/:id/posts/:postId" } },
{ name: "wildcard", input: { pathname: "/files/*" } },
];
const testURL = "https://sub.example.com/foo";
group("URLPattern parse (constructor)", () => {
for (const { name, input } of patterns) {
bench(name, () => {
return new URLPattern(input);
});
}
});
group("URLPattern.test()", () => {
for (const { name, input } of patterns) {
const pattern = new URLPattern(input);
bench(name, () => {
return pattern.test(testURL);
});
}
});
group("URLPattern.exec()", () => {
for (const { name, input } of patterns) {
const pattern = new URLPattern(input);
bench(name, () => {
return pattern.exec(testURL);
});
}
});
await run();

View File

@@ -18,22 +18,6 @@ const OperatingSystem = @import("src/env.zig").OperatingSystem;
const pathRel = fs.path.relative;
/// When updating this, make sure to adjust SetupZig.cmake
const recommended_zig_version = "0.14.0";
// comptime {
// if (!std.mem.eql(u8, builtin.zig_version_string, recommended_zig_version)) {
// @compileError(
// "" ++
// "Bun requires Zig version " ++ recommended_zig_version ++ ", but you have " ++
// builtin.zig_version_string ++ ". This is automatically configured via Bun's " ++
// "CMake setup. You likely meant to run `bun run build`. If you are trying to " ++
// "upgrade the Zig compiler, edit ZIG_COMMIT in cmake/tools/SetupZig.cmake or " ++
// "comment this error out.",
// );
// }
// }
const zero_sha = "0000000000000000000000000000000000000000";
const BunBuildOptions = struct {
@@ -48,6 +32,7 @@ const BunBuildOptions = struct {
/// enable debug logs in release builds
enable_logs: bool = false,
enable_asan: bool,
enable_fuzzilli: bool,
enable_valgrind: bool,
use_mimalloc: bool,
tracy_callstack_depth: u16,
@@ -97,9 +82,10 @@ const BunBuildOptions = struct {
opts.addOption(bool, "baseline", this.isBaseline());
opts.addOption(bool, "enable_logs", this.enable_logs);
opts.addOption(bool, "enable_asan", this.enable_asan);
opts.addOption(bool, "enable_fuzzilli", this.enable_fuzzilli);
opts.addOption(bool, "enable_valgrind", this.enable_valgrind);
opts.addOption(bool, "use_mimalloc", this.use_mimalloc);
opts.addOption([]const u8, "reported_nodejs_version", b.fmt("{}", .{this.reported_nodejs_version}));
opts.addOption([]const u8, "reported_nodejs_version", b.fmt("{f}", .{this.reported_nodejs_version}));
opts.addOption(bool, "zig_self_hosted_backend", this.no_llvm);
opts.addOption(bool, "override_no_export_cpp_apis", this.override_no_export_cpp_apis);
@@ -134,8 +120,8 @@ pub fn getOSVersionMin(os: OperatingSystem) ?Target.Query.OsVersion {
pub fn getOSGlibCVersion(os: OperatingSystem) ?Version {
return switch (os) {
// Compiling with a newer glibc than this will break certain cloud environments.
.linux => .{ .major = 2, .minor = 27, .patch = 0 },
// Compiling with a newer glibc than this will break certain cloud environments. See symbols.test.ts.
.linux => .{ .major = 2, .minor = 26, .patch = 0 },
else => null,
};
@@ -271,6 +257,7 @@ pub fn build(b: *Build) !void {
.tracy_callstack_depth = b.option(u16, "tracy_callstack_depth", "") orelse 10,
.enable_logs = b.option(bool, "enable_logs", "Enable logs in release") orelse false,
.enable_asan = b.option(bool, "enable_asan", "Enable asan") orelse false,
.enable_fuzzilli = b.option(bool, "enable_fuzzilli", "Enable fuzzilli instrumentation") orelse false,
.enable_valgrind = b.option(bool, "enable_valgrind", "Enable valgrind") orelse false,
.use_mimalloc = b.option(bool, "use_mimalloc", "Use mimalloc as default allocator") orelse false,
.llvm_codegen_threads = b.option(u32, "llvm_codegen_threads", "Number of threads to use for LLVM codegen") orelse 1,
@@ -290,14 +277,16 @@ pub fn build(b: *Build) !void {
var o = build_options;
var unit_tests = b.addTest(.{
.name = "bun-test",
.optimize = build_options.optimize,
.root_source_file = b.path("src/unit_test.zig"),
.test_runner = .{ .path = b.path("src/main_test.zig"), .mode = .simple },
.target = build_options.target,
.root_module = b.createModule(.{
.optimize = build_options.optimize,
.root_source_file = b.path("src/unit_test.zig"),
.target = build_options.target,
.omit_frame_pointer = false,
.strip = false,
}),
.use_llvm = !build_options.no_llvm,
.use_lld = if (build_options.os == .mac) false else !build_options.no_llvm,
.omit_frame_pointer = false,
.strip = false,
});
configureObj(b, &o, unit_tests);
// Setting `linker_allow_shlib_undefined` causes the linker to ignore
@@ -331,6 +320,7 @@ pub fn build(b: *Build) !void {
var step = b.step("check", "Check for semantic analysis errors");
var bun_check_obj = addBunObject(b, &build_options);
bun_check_obj.generated_bin = null;
// bun_check_obj.use_llvm = false;
step.dependOn(&bun_check_obj.step);
// The default install step will run zig build check. This is so ZLS
@@ -503,6 +493,7 @@ fn addMultiCheck(
.no_llvm = root_build_options.no_llvm,
.enable_asan = root_build_options.enable_asan,
.enable_valgrind = root_build_options.enable_valgrind,
.enable_fuzzilli = root_build_options.enable_fuzzilli,
.use_mimalloc = root_build_options.use_mimalloc,
.override_no_export_cpp_apis = root_build_options.override_no_export_cpp_apis,
};
@@ -616,15 +607,22 @@ fn configureObj(b: *Build, opts: *BunBuildOptions, obj: *Compile) void {
obj.llvm_codegen_threads = opts.llvm_codegen_threads orelse 0;
}
obj.no_link_obj = true;
obj.no_link_obj = opts.os != .windows and !opts.no_llvm;
if (opts.enable_asan and !enableFastBuild(b)) {
if (@hasField(Build.Module, "sanitize_address")) {
if (opts.enable_fuzzilli) {
obj.sanitize_coverage_trace_pc_guard = true;
}
obj.root_module.sanitize_address = true;
} else {
const fail_step = b.addFail("asan is not supported on this platform");
obj.step.dependOn(&fail_step.step);
}
} else if (opts.enable_fuzzilli) {
const fail_step = b.addFail("fuzzilli requires asan");
obj.step.dependOn(&fail_step.step);
}
obj.bundle_compiler_rt = false;
obj.bundle_ubsan_rt = false;
@@ -779,6 +777,13 @@ fn addInternalImports(b: *Build, mod: *Module, opts: *BunBuildOptions) void {
mod.addImport("cpp", cppImport);
cppImport.addImport("bun", mod);
}
{
const ciInfoImport = b.createModule(.{
.root_source_file = (std.Build.LazyPath{ .cwd_relative = opts.codegen_path }).path(b, "ci_info.zig"),
});
mod.addImport("ci_info", ciInfoImport);
ciInfoImport.addImport("bun", mod);
}
inline for (.{
.{ .import = "completions-bash", .file = b.path("completions/bun.bash") },
.{ .import = "completions-zsh", .file = b.path("completions/bun.zsh") },
@@ -804,7 +809,7 @@ fn addInternalImports(b: *Build, mod: *Module, opts: *BunBuildOptions) void {
fn propagateImports(source_mod: *Module) !void {
var seen = std.AutoHashMap(*Module, void).init(source_mod.owner.graph.arena);
defer seen.deinit();
var queue = std.ArrayList(*Module).init(source_mod.owner.graph.arena);
var queue = std.array_list.Managed(*Module).init(source_mod.owner.graph.arena);
defer queue.deinit();
try queue.appendSlice(source_mod.import_table.values());
while (queue.pop()) |mod| {

View File

@@ -1,5 +1,6 @@
{
"lockfileVersion": 1,
"configVersion": 1,
"workspaces": {
"": {
"name": "bun",
@@ -31,16 +32,11 @@
"dependencies": {
"@types/node": "*",
},
"devDependencies": {
"@types/react": "^19",
},
"peerDependencies": {
"@types/react": "^19",
},
},
},
"overrides": {
"@types/bun": "workspace:packages/@types/bun",
"@types/node": "25.0.0",
"bun-types": "workspace:packages/bun-types",
},
"packages": {
@@ -90,13 +86,13 @@
"@esbuild/win32-x64": ["@esbuild/win32-x64@0.21.5", "", { "os": "win32", "cpu": "x64" }, "sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw=="],
"@lezer/common": ["@lezer/common@1.2.3", "", {}, "sha512-w7ojc8ejBqr2REPsWxJjrMFsA/ysDCFICn8zEOR9mrqzOu2amhITYuLD8ag6XZf0CFXDrhKqw7+tW8cX66NaDA=="],
"@lezer/common": ["@lezer/common@1.3.0", "", {}, "sha512-L9X8uHCYU310o99L3/MpJKYxPzXPOS7S0NmBaM7UO/x2Kb2WbmMLSkfvdr1KxRIFYOpbY0Jhn7CfLSUDzL8arQ=="],
"@lezer/cpp": ["@lezer/cpp@1.1.3", "", { "dependencies": { "@lezer/common": "^1.2.0", "@lezer/highlight": "^1.0.0", "@lezer/lr": "^1.0.0" } }, "sha512-ykYvuFQKGsRi6IcE+/hCSGUhb/I4WPjd3ELhEblm2wS2cOznDFzO+ubK2c+ioysOnlZ3EduV+MVQFCPzAIoY3w=="],
"@lezer/highlight": ["@lezer/highlight@1.2.1", "", { "dependencies": { "@lezer/common": "^1.0.0" } }, "sha512-Z5duk4RN/3zuVO7Jq0pGLJ3qynpxUVsh7IbUbGj88+uV2ApSAn6kWg2au3iJb+0Zi7kKtqffIESgNcRXWZWmSA=="],
"@lezer/highlight": ["@lezer/highlight@1.2.3", "", { "dependencies": { "@lezer/common": "^1.3.0" } }, "sha512-qXdH7UqTvGfdVBINrgKhDsVTJTxactNNxLk7+UMwZhU13lMHaOBlJe9Vqp907ya56Y3+ed2tlqzys7jDkTmW0g=="],
"@lezer/lr": ["@lezer/lr@1.4.2", "", { "dependencies": { "@lezer/common": "^1.0.0" } }, "sha512-pu0K1jCIdnQ12aWNaAVU5bzi7Bd1w54J3ECgANPmYLtQKP0HBj2cE/5coBD66MT10xbtIuUr7tg0Shbsvk0mDA=="],
"@lezer/lr": ["@lezer/lr@1.4.3", "", { "dependencies": { "@lezer/common": "^1.0.0" } }, "sha512-yenN5SqAxAPv/qMnpWW0AT7l+SxVrgG+u0tNsRQWqbrz66HIl8DnEbBObvy21J5K7+I1v7gsAnlE2VQ5yYVSeA=="],
"@octokit/app": ["@octokit/app@14.1.0", "", { "dependencies": { "@octokit/auth-app": "^6.0.0", "@octokit/auth-unauthenticated": "^5.0.0", "@octokit/core": "^5.0.0", "@octokit/oauth-app": "^6.0.0", "@octokit/plugin-paginate-rest": "^9.0.0", "@octokit/types": "^12.0.0", "@octokit/webhooks": "^12.0.4" } }, "sha512-g3uEsGOQCBl1+W1rgfwoRFUIR6PtvB2T1E4RpygeUU5LrLvlOqcxrt5lfykIeRpUPpupreGJUYl70fqMDXdTpw=="],
@@ -150,7 +146,7 @@
"@sentry/types": ["@sentry/types@7.120.4", "", {}, "sha512-cUq2hSSe6/qrU6oZsEP4InMI5VVdD86aypE+ENrQ6eZEVLTCYm1w6XhW1NvIu3UuWh7gZec4a9J7AFpYxki88Q=="],
"@types/aws-lambda": ["@types/aws-lambda@8.10.152", "", {}, "sha512-soT/c2gYBnT5ygwiHPmd9a1bftj462NWVk2tKCc1PYHSIacB2UwbTS2zYG4jzag1mRDuzg/OjtxQjQ2NKRB6Rw=="],
"@types/aws-lambda": ["@types/aws-lambda@8.10.159", "", {}, "sha512-SAP22WSGNN12OQ8PlCzGzRCZ7QDCwI85dQZbmpz7+mAk+L7j+wI7qnvmdKh+o7A5LaOp6QnOZ2NJphAZQTTHQg=="],
"@types/btoa-lite": ["@types/btoa-lite@1.0.2", "", {}, "sha512-ZYbcE2x7yrvNFJiU7xJGrpF/ihpkM7zKgw8bha3LNJSesvTtUNxbpzaT7WXBIryf6jovisrxTBvymxMeLLj1Mg=="],
@@ -160,9 +156,7 @@
"@types/ms": ["@types/ms@2.1.0", "", {}, "sha512-GsCCIZDE/p3i96vtEqx+7dBUGXrc7zeSK3wwPHIaRThS+9OhWIXRqzs4d6k1SVU8g91DrNRWxWUGhp5KXQb2VA=="],
"@types/node": ["@types/node@24.2.1", "", { "dependencies": { "undici-types": "~7.10.0" } }, "sha512-DRh5K+ka5eJic8CjH7td8QpYEV6Zo10gfRkjHCO3weqZHWDtAaSTFtl4+VMqOJ4N5jcuhZ9/l+yy8rVgw7BQeQ=="],
"@types/react": ["@types/react@19.1.10", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-EhBeSYX0Y6ye8pNebpKrwFJq7BoQ8J5SO6NlvNwwHjSj6adXJViPQrKlsyPw7hLBLvckEMO1yxeGdR82YBBlDg=="],
"@types/node": ["@types/node@25.0.0", "", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-rl78HwuZlaDIUSeUKkmogkhebA+8K1Hy7tddZuJ3D0xV8pZSfsYGTsliGUol1JPzu9EKnTxPC4L1fiWouStRew=="],
"aggregate-error": ["aggregate-error@3.1.0", "", { "dependencies": { "clean-stack": "^2.0.0", "indent-string": "^4.0.0" } }, "sha512-4I7Td01quW/RpocfNayFdFVk1qSuoh0E7JrbRJ16nH01HhKFQ88INq9Sd+nd72zqRySlr9BmDA8xlEJ6vJMrYA=="],
@@ -192,11 +186,9 @@
"constant-case": ["constant-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3", "upper-case": "^2.0.2" } }, "sha512-I2hSBi7Vvs7BEuJDr5dDHfzb/Ruj3FyvFyh7KLilAjNQw3Be+xgqUBA2W6scVEcL0hL1dwPRtIqEPVUCKkSsyQ=="],
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
"deprecation": ["deprecation@2.3.1", "", {}, "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="],
"detect-libc": ["detect-libc@2.0.4", "", {}, "sha512-3UDv+G9CsCKO1WKMGw9fwq/SWJYbI0c5Y7LU1AXYoDdbhE2AHQ6N6Nb34sG8Fj7T5APy8qXDCKuuIHd1BR0tVA=="],
"detect-libc": ["detect-libc@2.1.2", "", {}, "sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ=="],
"dot-case": ["dot-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-Kv5nKlh6yRrdrGvxeJ2e5y2eRUpkUosIW4A2AS38zwSz27zu7ufDwQPi5Jhs3XAlGNetl3bmnGhQsMtkKJnj3w=="],
@@ -220,27 +212,29 @@
"jws": ["jws@3.2.2", "", { "dependencies": { "jwa": "^1.4.1", "safe-buffer": "^5.0.1" } }, "sha512-YHlZCB6lMTllWDtSPHz/ZXTsi8S00usEV6v1tjq8tOUZzw7DpSDWVXjXDre6ed1w/pd495ODpHZYSdkRTsa0HA=="],
"lightningcss": ["lightningcss@1.30.1", "", { "dependencies": { "detect-libc": "^2.0.3" }, "optionalDependencies": { "lightningcss-darwin-arm64": "1.30.1", "lightningcss-darwin-x64": "1.30.1", "lightningcss-freebsd-x64": "1.30.1", "lightningcss-linux-arm-gnueabihf": "1.30.1", "lightningcss-linux-arm64-gnu": "1.30.1", "lightningcss-linux-arm64-musl": "1.30.1", "lightningcss-linux-x64-gnu": "1.30.1", "lightningcss-linux-x64-musl": "1.30.1", "lightningcss-win32-arm64-msvc": "1.30.1", "lightningcss-win32-x64-msvc": "1.30.1" } }, "sha512-xi6IyHML+c9+Q3W0S4fCQJOym42pyurFiJUHEcEyHS0CeKzia4yZDEsLlqOFykxOdHpNy0NmvVO31vcSqAxJCg=="],
"lightningcss": ["lightningcss@1.30.2", "", { "dependencies": { "detect-libc": "^2.0.3" }, "optionalDependencies": { "lightningcss-android-arm64": "1.30.2", "lightningcss-darwin-arm64": "1.30.2", "lightningcss-darwin-x64": "1.30.2", "lightningcss-freebsd-x64": "1.30.2", "lightningcss-linux-arm-gnueabihf": "1.30.2", "lightningcss-linux-arm64-gnu": "1.30.2", "lightningcss-linux-arm64-musl": "1.30.2", "lightningcss-linux-x64-gnu": "1.30.2", "lightningcss-linux-x64-musl": "1.30.2", "lightningcss-win32-arm64-msvc": "1.30.2", "lightningcss-win32-x64-msvc": "1.30.2" } }, "sha512-utfs7Pr5uJyyvDETitgsaqSyjCb2qNRAtuqUeWIAKztsOYdcACf2KtARYXg2pSvhkt+9NfoaNY7fxjl6nuMjIQ=="],
"lightningcss-darwin-arm64": ["lightningcss-darwin-arm64@1.30.1", "", { "os": "darwin", "cpu": "arm64" }, "sha512-c8JK7hyE65X1MHMN+Viq9n11RRC7hgin3HhYKhrMyaXflk5GVplZ60IxyoVtzILeKr+xAJwg6zK6sjTBJ0FKYQ=="],
"lightningcss-android-arm64": ["lightningcss-android-arm64@1.30.2", "", { "os": "android", "cpu": "arm64" }, "sha512-BH9sEdOCahSgmkVhBLeU7Hc9DWeZ1Eb6wNS6Da8igvUwAe0sqROHddIlvU06q3WyXVEOYDZ6ykBZQnjTbmo4+A=="],
"lightningcss-darwin-x64": ["lightningcss-darwin-x64@1.30.1", "", { "os": "darwin", "cpu": "x64" }, "sha512-k1EvjakfumAQoTfcXUcHQZhSpLlkAuEkdMBsI/ivWw9hL+7FtilQc0Cy3hrx0AAQrVtQAbMI7YjCgYgvn37PzA=="],
"lightningcss-darwin-arm64": ["lightningcss-darwin-arm64@1.30.2", "", { "os": "darwin", "cpu": "arm64" }, "sha512-ylTcDJBN3Hp21TdhRT5zBOIi73P6/W0qwvlFEk22fkdXchtNTOU4Qc37SkzV+EKYxLouZ6M4LG9NfZ1qkhhBWA=="],
"lightningcss-freebsd-x64": ["lightningcss-freebsd-x64@1.30.1", "", { "os": "freebsd", "cpu": "x64" }, "sha512-kmW6UGCGg2PcyUE59K5r0kWfKPAVy4SltVeut+umLCFoJ53RdCUWxcRDzO1eTaxf/7Q2H7LTquFHPL5R+Gjyig=="],
"lightningcss-darwin-x64": ["lightningcss-darwin-x64@1.30.2", "", { "os": "darwin", "cpu": "x64" }, "sha512-oBZgKchomuDYxr7ilwLcyms6BCyLn0z8J0+ZZmfpjwg9fRVZIR5/GMXd7r9RH94iDhld3UmSjBM6nXWM2TfZTQ=="],
"lightningcss-linux-arm-gnueabihf": ["lightningcss-linux-arm-gnueabihf@1.30.1", "", { "os": "linux", "cpu": "arm" }, "sha512-MjxUShl1v8pit+6D/zSPq9S9dQ2NPFSQwGvxBCYaBYLPlCWuPh9/t1MRS8iUaR8i+a6w7aps+B4N0S1TYP/R+Q=="],
"lightningcss-freebsd-x64": ["lightningcss-freebsd-x64@1.30.2", "", { "os": "freebsd", "cpu": "x64" }, "sha512-c2bH6xTrf4BDpK8MoGG4Bd6zAMZDAXS569UxCAGcA7IKbHNMlhGQ89eRmvpIUGfKWNVdbhSbkQaWhEoMGmGslA=="],
"lightningcss-linux-arm64-gnu": ["lightningcss-linux-arm64-gnu@1.30.1", "", { "os": "linux", "cpu": "arm64" }, "sha512-gB72maP8rmrKsnKYy8XUuXi/4OctJiuQjcuqWNlJQ6jZiWqtPvqFziskH3hnajfvKB27ynbVCucKSm2rkQp4Bw=="],
"lightningcss-linux-arm-gnueabihf": ["lightningcss-linux-arm-gnueabihf@1.30.2", "", { "os": "linux", "cpu": "arm" }, "sha512-eVdpxh4wYcm0PofJIZVuYuLiqBIakQ9uFZmipf6LF/HRj5Bgm0eb3qL/mr1smyXIS1twwOxNWndd8z0E374hiA=="],
"lightningcss-linux-arm64-musl": ["lightningcss-linux-arm64-musl@1.30.1", "", { "os": "linux", "cpu": "arm64" }, "sha512-jmUQVx4331m6LIX+0wUhBbmMX7TCfjF5FoOH6SD1CttzuYlGNVpA7QnrmLxrsub43ClTINfGSYyHe2HWeLl5CQ=="],
"lightningcss-linux-arm64-gnu": ["lightningcss-linux-arm64-gnu@1.30.2", "", { "os": "linux", "cpu": "arm64" }, "sha512-UK65WJAbwIJbiBFXpxrbTNArtfuznvxAJw4Q2ZGlU8kPeDIWEX1dg3rn2veBVUylA2Ezg89ktszWbaQnxD/e3A=="],
"lightningcss-linux-x64-gnu": ["lightningcss-linux-x64-gnu@1.30.1", "", { "os": "linux", "cpu": "x64" }, "sha512-piWx3z4wN8J8z3+O5kO74+yr6ze/dKmPnI7vLqfSqI8bccaTGY5xiSGVIJBDd5K5BHlvVLpUB3S2YCfelyJ1bw=="],
"lightningcss-linux-arm64-musl": ["lightningcss-linux-arm64-musl@1.30.2", "", { "os": "linux", "cpu": "arm64" }, "sha512-5Vh9dGeblpTxWHpOx8iauV02popZDsCYMPIgiuw97OJ5uaDsL86cnqSFs5LZkG3ghHoX5isLgWzMs+eD1YzrnA=="],
"lightningcss-linux-x64-musl": ["lightningcss-linux-x64-musl@1.30.1", "", { "os": "linux", "cpu": "x64" }, "sha512-rRomAK7eIkL+tHY0YPxbc5Dra2gXlI63HL+v1Pdi1a3sC+tJTcFrHX+E86sulgAXeI7rSzDYhPSeHHjqFhqfeQ=="],
"lightningcss-linux-x64-gnu": ["lightningcss-linux-x64-gnu@1.30.2", "", { "os": "linux", "cpu": "x64" }, "sha512-Cfd46gdmj1vQ+lR6VRTTadNHu6ALuw2pKR9lYq4FnhvgBc4zWY1EtZcAc6EffShbb1MFrIPfLDXD6Xprbnni4w=="],
"lightningcss-win32-arm64-msvc": ["lightningcss-win32-arm64-msvc@1.30.1", "", { "os": "win32", "cpu": "arm64" }, "sha512-mSL4rqPi4iXq5YVqzSsJgMVFENoa4nGTT/GjO2c0Yl9OuQfPsIfncvLrEW6RbbB24WtZ3xP/2CCmI3tNkNV4oA=="],
"lightningcss-linux-x64-musl": ["lightningcss-linux-x64-musl@1.30.2", "", { "os": "linux", "cpu": "x64" }, "sha512-XJaLUUFXb6/QG2lGIW6aIk6jKdtjtcffUT0NKvIqhSBY3hh9Ch+1LCeH80dR9q9LBjG3ewbDjnumefsLsP6aiA=="],
"lightningcss-win32-x64-msvc": ["lightningcss-win32-x64-msvc@1.30.1", "", { "os": "win32", "cpu": "x64" }, "sha512-PVqXh48wh4T53F/1CCu8PIPCxLzWyCnn/9T5W1Jpmdy5h9Cwd+0YQS6/LwhHXSafuc61/xg9Lv5OrCby6a++jg=="],
"lightningcss-win32-arm64-msvc": ["lightningcss-win32-arm64-msvc@1.30.2", "", { "os": "win32", "cpu": "arm64" }, "sha512-FZn+vaj7zLv//D/192WFFVA0RgHawIcHqLX9xuWiQt7P0PtdFEVaxgF9rjM/IRYHQXNnk61/H/gb2Ei+kUQ4xQ=="],
"lightningcss-win32-x64-msvc": ["lightningcss-win32-x64-msvc@1.30.2", "", { "os": "win32", "cpu": "x64" }, "sha512-5g1yc73p+iAkid5phb4oVFMB45417DkRevRbt/El/gKXJk4jid+vPFF/AXbxn05Aky8PapwzZrdJShv5C0avjw=="],
"lodash.includes": ["lodash.includes@4.3.0", "", {}, "sha512-W3Bx6mdkRTGtlJISOvVD/lbqjTlPPUDTMnlXZFnVwi9NKJ6tiAk6LVdlhZMm17VZisqhKcgzpO5Wz91PCt5b0w=="],
@@ -296,7 +290,7 @@
"scheduler": ["scheduler@0.23.2", "", { "dependencies": { "loose-envify": "^1.1.0" } }, "sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ=="],
"semver": ["semver@7.7.2", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-RF0Fw+rO5AMf9MAyaRXI4AV0Ulj5lMHqVxxdSgiVbixSCXoEmmX/jk0CuJw4+3SqroYO9VoUh+HcuJivvtJemA=="],
"semver": ["semver@7.7.3", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-SdsKMrI9TdgjdweUSR9MweHA4EJ8YxHn8DFaDisvhVlUOe4BF1tLD7GAj0lIqWVl+dPb/rExr0Btby5loQm20Q=="],
"sentence-case": ["sentence-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3", "upper-case-first": "^2.0.2" } }, "sha512-8LS0JInaQMCRoQ7YUytAo/xUu5W2XnQxV2HI/6uM6U7CITS1RqPElr30V6uIqyMKM9lJGRVFy5/4CuzcixNYSg=="],
@@ -312,7 +306,7 @@
"uglify-js": ["uglify-js@3.19.3", "", { "bin": { "uglifyjs": "bin/uglifyjs" } }, "sha512-v3Xu+yuwBXisp6QYTcH4UbH+xYJXqnq2m/LtQVWKWzYc1iehYnLixoQDN9FH6/j9/oybfd6W9Ghwkl8+UMKTKQ=="],
"undici-types": ["undici-types@7.10.0", "", {}, "sha512-t5Fy/nfn+14LuOc2KNYg75vZqClpAiqscVvMygNnlsHBFpSXdJaYtXMcdNLpl/Qvc3P2cB3s6lOV51nqsFq4ag=="],
"undici-types": ["undici-types@7.16.0", "", {}, "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw=="],
"universal-github-app-jwt": ["universal-github-app-jwt@1.2.0", "", { "dependencies": { "@types/jsonwebtoken": "^9.0.0", "jsonwebtoken": "^9.0.2" } }, "sha512-dncpMpnsKBk0eetwfN8D8OUHGfiDhhJ+mtsbMl+7PfW7mYjiH8LIcqRmYMtzYLgSh47HjfdBtrBwIQ/gizKR3g=="],

View File

@@ -10,4 +10,4 @@ preload = "./test/preload.ts"
[install]
linker = "isolated"
minimumReleaseAge = 1
minimumReleaseAge = 259200 # three days

View File

@@ -51,6 +51,23 @@ if(ENABLE_ASAN)
)
endif()
if(ENABLE_FUZZILLI)
register_compiler_flags(
DESCRIPTION "Enable coverage instrumentation for fuzzing"
-fsanitize-coverage=trace-pc-guard
)
register_linker_flags(
DESCRIPTION "Link coverage instrumentation"
-fsanitize-coverage=trace-pc-guard
)
register_compiler_flags(
DESCRIPTION "Enable fuzzilli-specific code"
-DFUZZILLI_ENABLED
)
endif()
# --- Optimization level ---
if(DEBUG)
register_compiler_flags(

View File

@@ -125,7 +125,8 @@ setx(CWD ${CMAKE_SOURCE_DIR})
setx(BUILD_PATH ${CMAKE_BINARY_DIR})
optionx(CACHE_PATH FILEPATH "The path to the cache directory" DEFAULT ${BUILD_PATH}/cache)
optionx(CACHE_STRATEGY "read-write|read-only|none" "The strategy to use for caching" DEFAULT "read-write")
optionx(CACHE_STRATEGY "auto|distributed|local|none" "The strategy to use for caching" DEFAULT
"auto")
optionx(CI BOOL "If CI is enabled" DEFAULT OFF)
optionx(ENABLE_ANALYSIS BOOL "If static analysis targets should be enabled" DEFAULT OFF)
@@ -141,9 +142,39 @@ optionx(TMP_PATH FILEPATH "The path to the temporary directory" DEFAULT ${BUILD_
# --- Helper functions ---
# list_filter_out_regex()
#
# Description:
# Filters out elements from a list that match a regex pattern.
#
# Arguments:
# list - The list of strings to traverse
# pattern - The regex pattern to filter out
# touched - A variable to set if any items were removed
function(list_filter_out_regex list pattern touched)
set(result_list "${${list}}")
set(keep_list)
set(was_modified OFF)
foreach(line IN LISTS result_list)
if(line MATCHES "${pattern}")
set(was_modified ON)
else()
list(APPEND keep_list ${line})
endif()
endforeach()
set(${list} "${keep_list}" PARENT_SCOPE)
set(${touched} ${was_modified} PARENT_SCOPE)
endfunction()
# setenv()
# Description:
# Sets an environment variable during the build step, and writes it to a .env file.
#
# See Also:
# unsetenv()
#
# Arguments:
# variable string - The variable to set
# value string - The value to set the variable to
@@ -156,13 +187,7 @@ function(setenv variable value)
if(EXISTS ${ENV_PATH})
file(STRINGS ${ENV_PATH} ENV_FILE ENCODING UTF-8)
foreach(line ${ENV_FILE})
if(line MATCHES "^${variable}=")
list(REMOVE_ITEM ENV_FILE ${line})
set(ENV_MODIFIED ON)
endif()
endforeach()
list_filter_out_regex(ENV_FILE "^${variable}=" ENV_MODIFIED)
if(ENV_MODIFIED)
list(APPEND ENV_FILE "${variable}=${value}")
@@ -178,6 +203,28 @@ function(setenv variable value)
message(STATUS "Set ENV ${variable}: ${value}")
endfunction()
# unsetenv()
# Description:
# Removes a variable previously written by setenv() from the .env file.
# Arguments:
# variable string - The variable to unset.
# See Also:
# setenv()
function(unsetenv variable)
set(ENV_PATH ${BUILD_PATH}/.env)
if(NOT EXISTS ${ENV_PATH})
return()
endif()
file(STRINGS ${ENV_PATH} ENV_FILE ENCODING UTF-8)
list_filter_out_regex(ENV_FILE "^${variable}=" ENV_MODIFIED)
if(ENV_MODIFIED)
list(JOIN ENV_FILE "\n" ENV_FILE)
file(WRITE ${ENV_PATH} ${ENV_FILE})
endif()
endfunction()
# satisfies_range()
# Description:
# Check if a version satisfies a version range or list of ranges

View File

@@ -127,6 +127,8 @@ if (NOT ENABLE_ASAN)
set(ENABLE_ZIG_ASAN OFF)
endif()
optionx(ENABLE_FUZZILLI BOOL "If fuzzilli support should be enabled" DEFAULT OFF)
if(RELEASE AND LINUX AND CI AND NOT ENABLE_ASSERTIONS AND NOT ENABLE_ASAN)
set(DEFAULT_LTO ON)
else()

View File

@@ -34,26 +34,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(CLANG_FORMAT_CHANGED_SOURCES)
foreach(source ${CLANG_FORMAT_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND CLANG_FORMAT_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(CLANG_FORMAT_CHANGED_SOURCES)
set(CLANG_FORMAT_DIFF_COMMAND ${CLANG_FORMAT_PROGRAM}
-i # edits files in-place
--verbose
${CLANG_FORMAT_CHANGED_SOURCES}
)
else()
set(CLANG_FORMAT_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for clang-format")
endif()
register_command(
TARGET
clang-format-diff

View File

@@ -3,7 +3,7 @@
set(CLANG_TIDY_SOURCES ${BUN_C_SOURCES} ${BUN_CXX_SOURCES})
set(CLANG_TIDY_COMMAND ${CLANG_TIDY_PROGRAM}
-p ${BUILD_PATH}
-p ${BUILD_PATH}
--config-file=${CWD}/.clang-tidy
)
@@ -40,27 +40,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(CLANG_TIDY_CHANGED_SOURCES)
foreach(source ${CLANG_TIDY_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND CLANG_TIDY_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(CLANG_TIDY_CHANGED_SOURCES)
set(CLANG_TIDY_DIFF_COMMAND ${CLANG_TIDY_PROGRAM}
${CLANG_TIDY_CHANGED_SOURCES}
--fix
--fix-errors
--fix-notes
)
else()
set(CLANG_TIDY_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for clang-tidy")
endif()
register_command(
TARGET
clang-tidy-diff

View File

@@ -92,26 +92,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(PRETTIER_CHANGED_SOURCES)
foreach(source ${PRETTIER_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND PRETTIER_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(PRETTIER_CHANGED_SOURCES)
set(PRETTIER_DIFF_COMMAND ${PRETTIER_COMMAND}
--write
--plugin=prettier-plugin-organize-imports
${PRETTIER_CHANGED_SOURCES}
)
else()
set(PRETTIER_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for prettier")
endif()
register_command(
TARGET
prettier-diff

View File

@@ -25,25 +25,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(ZIG_FORMAT_CHANGED_SOURCES)
foreach(source ${ZIG_FORMAT_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND ZIG_FORMAT_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(ZIG_FORMAT_CHANGED_SOURCES)
set(ZIG_FORMAT_DIFF_COMMAND ${ZIG_EXECUTABLE}
fmt
${ZIG_FORMAT_CHANGED_SOURCES}
)
else()
set(ZIG_FORMAT_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for zig-format")
endif()
register_command(
TARGET
zig-format-diff

View File

@@ -317,6 +317,10 @@ set(BUN_CPP_OUTPUTS
${CODEGEN_PATH}/cpp.zig
)
set(BUN_CI_INFO_OUTPUTS
${CODEGEN_PATH}/ci_info.zig
)
register_command(
TARGET
bun-cppbind
@@ -334,6 +338,21 @@ register_command(
${BUN_CPP_OUTPUTS}
)
register_command(
TARGET
bun-ci-info
COMMENT
"Generating CI info"
COMMAND
${BUN_EXECUTABLE}
${CWD}/src/codegen/ci_info.ts
${CODEGEN_PATH}/ci_info.zig
SOURCES
${BUN_JAVASCRIPT_CODEGEN_SOURCES}
OUTPUTS
${BUN_CI_INFO_OUTPUTS}
)
register_command(
TARGET
bun-js-modules
@@ -612,6 +631,7 @@ set(BUN_ZIG_GENERATED_SOURCES
${BUN_ZIG_GENERATED_CLASSES_OUTPUTS}
${BUN_JAVASCRIPT_OUTPUTS}
${BUN_CPP_OUTPUTS}
${BUN_CI_INFO_OUTPUTS}
${BUN_BINDGENV2_ZIG_OUTPUTS}
)
@@ -675,6 +695,7 @@ register_command(
-Dcpu=${ZIG_CPU}
-Denable_logs=$<IF:$<BOOL:${ENABLE_LOGS}>,true,false>
-Denable_asan=$<IF:$<BOOL:${ENABLE_ZIG_ASAN}>,true,false>
-Denable_fuzzilli=$<IF:$<BOOL:${ENABLE_FUZZILLI}>,true,false>
-Denable_valgrind=$<IF:$<BOOL:${ENABLE_VALGRIND}>,true,false>
-Duse_mimalloc=$<IF:$<BOOL:${USE_MIMALLOC_AS_DEFAULT_ALLOCATOR}>,true,false>
-Dllvm_codegen_threads=${LLVM_ZIG_CODEGEN_THREADS}
@@ -851,6 +872,7 @@ target_include_directories(${bun} PRIVATE
${CODEGEN_PATH}
${VENDOR_PATH}
${VENDOR_PATH}/picohttpparser
${VENDOR_PATH}/zlib
${NODEJS_HEADERS_PATH}/include
${NODEJS_HEADERS_PATH}/include/node
)
@@ -1178,6 +1200,29 @@ set_target_properties(${bun} PROPERTIES LINK_DEPENDS ${BUN_SYMBOLS_PATH})
include(SetupWebKit)
if(BUN_LINK_ONLY)
register_command(
TARGET
${bun}
TARGET_PHASE
POST_BUILD
COMMENT
"Uploading link metadata"
COMMAND
${CMAKE_COMMAND} -E env
BUN_VERSION=${VERSION}
WEBKIT_DOWNLOAD_URL=${WEBKIT_DOWNLOAD_URL}
WEBKIT_VERSION=${WEBKIT_VERSION}
ZIG_COMMIT=${ZIG_COMMIT}
${BUN_EXECUTABLE} ${CWD}/scripts/create-link-metadata.mjs ${BUILD_PATH} ${bun}
SOURCES
${BUN_ZIG_OUTPUT}
${BUN_CPP_OUTPUT}
ARTIFACTS
${BUILD_PATH}/link-metadata.json
)
endif()
if(WIN32)
if(DEBUG)
target_link_libraries(${bun} PRIVATE

View File

@@ -4,7 +4,7 @@ register_repository(
REPOSITORY
c-ares/c-ares
COMMIT
d3a507e920e7af18a5efb7f9f1d8044ed4750013
3ac47ee46edd8ea40370222f91613fc16c434853
)
register_cmake_command(

View File

@@ -4,41 +4,9 @@ find_command(
COMMAND
git
REQUIRED
OFF
${CI}
)
if(NOT GIT_PROGRAM)
return()
endif()
set(GIT_DIFF_COMMAND ${GIT_PROGRAM} diff --no-color --name-only --diff-filter=AMCR origin/main HEAD)
execute_process(
COMMAND
${GIT_DIFF_COMMAND}
WORKING_DIRECTORY
${CWD}
OUTPUT_STRIP_TRAILING_WHITESPACE
OUTPUT_VARIABLE
GIT_DIFF
ERROR_STRIP_TRAILING_WHITESPACE
ERROR_VARIABLE
GIT_DIFF_ERROR
RESULT_VARIABLE
GIT_DIFF_RESULT
)
if(NOT GIT_DIFF_RESULT EQUAL 0)
message(WARNING "Command failed: ${GIT_DIFF_COMMAND} ${GIT_DIFF_ERROR}")
return()
endif()
string(REPLACE "\n" ";" GIT_CHANGED_SOURCES "${GIT_DIFF}")
if(CI)
set(GIT_CHANGED_SOURCES "${GIT_CHANGED_SOURCES}")
message(STATUS "Set GIT_CHANGED_SOURCES: ${GIT_CHANGED_SOURCES}")
endif()
list(TRANSFORM GIT_CHANGED_SOURCES PREPEND ${CWD}/)
list(LENGTH GIT_CHANGED_SOURCES GIT_CHANGED_SOURCES_COUNT)

View File

@@ -1,60 +1,108 @@
# Setup sccache as the C and C++ compiler launcher to speed up builds by caching
if(CACHE_STRATEGY STREQUAL "none")
return()
endif()
function(check_aws_credentials OUT_VAR)
set(HAS_CREDENTIALS FALSE)
set(SCCACHE_SHARED_CACHE_REGION "us-west-1")
set(SCCACHE_SHARED_CACHE_BUCKET "bun-build-sccache-store")
if(DEFINED ENV{AWS_ACCESS_KEY_ID} AND DEFINED ENV{AWS_SECRET_ACCESS_KEY})
set(HAS_CREDENTIALS TRUE)
message(NOTICE
"sccache: Using AWS credentials found in environment variables")
# Function to check if the system AWS credentials have access to the sccache S3 bucket.
function(check_aws_credentials OUT_VAR)
# Install dependencies first
execute_process(
COMMAND ${BUN_EXECUTABLE} install --frozen-lockfile
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/scripts/build-cache
RESULT_VARIABLE INSTALL_EXIT_CODE
OUTPUT_VARIABLE INSTALL_OUTPUT
ERROR_VARIABLE INSTALL_ERROR
)
if(NOT INSTALL_EXIT_CODE EQUAL 0)
message(FATAL_ERROR "Failed to install dependencies in scripts/build-cache\n"
"Exit code: ${INSTALL_EXIT_CODE}\n"
"Output: ${INSTALL_OUTPUT}\n"
"Error: ${INSTALL_ERROR}")
endif()
# Check for ~/.aws directory since sccache may use that.
if(NOT HAS_CREDENTIALS)
if(WIN32)
set(AWS_CONFIG_DIR "$ENV{USERPROFILE}/.aws")
else()
set(AWS_CONFIG_DIR "$ENV{HOME}/.aws")
endif()
# Check AWS credentials
execute_process(
COMMAND
${BUN_EXECUTABLE}
run
have-access.ts
--bucket ${SCCACHE_SHARED_CACHE_BUCKET}
--region ${SCCACHE_SHARED_CACHE_REGION}
WORKING_DIRECTORY
${CMAKE_SOURCE_DIR}/scripts/build-cache
RESULT_VARIABLE HAVE_ACCESS_EXIT_CODE
)
if(EXISTS "${AWS_CONFIG_DIR}/credentials")
set(HAS_CREDENTIALS TRUE)
message(NOTICE
"sccache: Using AWS credentials found in ${AWS_CONFIG_DIR}/credentials")
endif()
if(HAVE_ACCESS_EXIT_CODE EQUAL 0)
set(HAS_CREDENTIALS TRUE)
else()
set(HAS_CREDENTIALS FALSE)
endif()
set(${OUT_VAR} ${HAS_CREDENTIALS} PARENT_SCOPE)
endfunction()
function(check_running_in_ci OUT_VAR)
set(IS_CI FALSE)
# Query EC2 instance metadata service to check if running on buildkite-agent
# The IP address 169.254.169.254 is a well-known link-local address for querying EC2 instance
# metadata:
# https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-retrieval.html
execute_process(
COMMAND curl -s -m 0.5 http://169.254.169.254/latest/meta-data/tags/instance/Service
OUTPUT_VARIABLE METADATA_OUTPUT
ERROR_VARIABLE METADATA_ERROR
RESULT_VARIABLE METADATA_RESULT
OUTPUT_STRIP_TRAILING_WHITESPACE
ERROR_QUIET
)
# Check if the request succeeded and returned exactly "buildkite-agent"
if(METADATA_RESULT EQUAL 0 AND METADATA_OUTPUT STREQUAL "buildkite-agent")
set(IS_CI TRUE)
endif()
set(${OUT_VAR} ${IS_CI} PARENT_SCOPE)
# Configure sccache to use the local cache only.
function(sccache_configure_local_filesystem)
unsetenv(SCCACHE_BUCKET)
unsetenv(SCCACHE_REGION)
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
endfunction()
check_running_in_ci(IS_IN_CI)
find_command(VARIABLE SCCACHE_PROGRAM COMMAND sccache REQUIRED ${IS_IN_CI})
# Configure sccache to use the distributed cache (S3 + local).
function(sccache_configure_distributed)
setenv(SCCACHE_BUCKET "${SCCACHE_SHARED_CACHE_BUCKET}")
setenv(SCCACHE_REGION "${SCCACHE_SHARED_CACHE_REGION}")
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
endfunction()
function(sccache_configure_environment_ci)
if(CACHE_STRATEGY STREQUAL "auto" OR CACHE_STRATEGY STREQUAL "distributed")
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(HAS_AWS_CREDENTIALS)
sccache_configure_distributed()
message(NOTICE "sccache: Using distributed cache strategy.")
else()
message(FATAL_ERROR "CI CACHE_STRATEGY is set to '${CACHE_STRATEGY}', but no valid AWS "
"credentials were found. Note that 'auto' requires AWS credentials to access the shared "
"cache in CI.")
endif()
elseif(CACHE_STRATEGY STREQUAL "local")
# We disallow this because we want our CI runs to always use the shared cache to accelerate
# builds.
# none, distributed and auto are all okay.
#
# If local is configured, it's as good as "none", so this is probably user error.
message(FATAL_ERROR "CI CACHE_STRATEGY is set to 'local', which is not allowed.")
endif()
endfunction()
function(sccache_configure_environment_developer)
# Local environments can use any strategy they like. The S3 bucket is configured to clean out
# old entries automatically.
if (CACHE_STRATEGY STREQUAL "auto" OR CACHE_STRATEGY STREQUAL "local")
# In the local environment, we prioritize the local cache: sccache factors the full absolute
# path of each compiled file into its cache key, and it's very unlikely that users share the
# same absolute paths on their local machines.
sccache_configure_local_filesystem()
message(NOTICE "sccache: Using local cache strategy.")
elseif(CACHE_STRATEGY STREQUAL "distributed")
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(HAS_AWS_CREDENTIALS)
sccache_configure_distributed()
message(NOTICE "sccache: Using distributed cache strategy.")
else()
message(FATAL_ERROR "CACHE_STRATEGY is set to 'distributed', but no valid AWS credentials "
"were found.")
endif()
endif()
endfunction()
find_command(VARIABLE SCCACHE_PROGRAM COMMAND sccache REQUIRED ${CI})
if(NOT SCCACHE_PROGRAM)
message(WARNING "sccache not found. Your builds will be slower.")
return()
@@ -66,25 +114,10 @@ foreach(arg ${SCCACHE_ARGS})
list(APPEND CMAKE_ARGS -D${arg}=${${arg}})
endforeach()
# Configure S3 bucket for distributed caching
setenv(SCCACHE_BUCKET "bun-build-sccache-store")
setenv(SCCACHE_REGION "us-west-1")
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
# Handle credentials based on cache strategy
if (CACHE_STRATEGY STREQUAL "read-only")
setenv(SCCACHE_S3_NO_CREDENTIALS "1")
message(STATUS "sccache configured in read-only mode.")
else()
# Check for AWS credentials and enable anonymous access if needed
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(NOT IS_IN_CI AND NOT HAS_AWS_CREDENTIALS)
setenv(SCCACHE_S3_NO_CREDENTIALS "1")
message(NOTICE "sccache: No AWS credentials found, enabling anonymous S3 "
"access. Writing to the cache will be disabled.")
endif()
endif()
setenv(SCCACHE_LOG "info")
message(STATUS "sccache configured for bun-build-sccache-store (us-west-1).")
if (CI)
sccache_configure_environment_ci()
else()
sccache_configure_environment_developer()
endif()

View File

@@ -2,7 +2,7 @@ option(WEBKIT_VERSION "The version of WebKit to use")
option(WEBKIT_LOCAL "If a local version of WebKit should be used instead of downloading")
if(NOT WEBKIT_VERSION)
set(WEBKIT_VERSION 6d0f3aac0b817cc01a846b3754b21271adedac12)
set(WEBKIT_VERSION 8400ec68b2649d7b95f10576ad648c3aa8a92b77)
endif()
string(SUBSTRING ${WEBKIT_VERSION} 0 16 WEBKIT_VERSION_PREFIX)

View File

@@ -20,7 +20,7 @@ else()
unsupported(CMAKE_SYSTEM_NAME)
endif()
set(ZIG_COMMIT "55fdbfa0c86be86b68d43a4ba761e6909eb0d7b2")
set(ZIG_COMMIT "c1423ff3fc7064635773a4a4616c5bf986eb00fe")
optionx(ZIG_TARGET STRING "The zig target to use" DEFAULT ${DEFAULT_ZIG_TARGET})
if(CMAKE_BUILD_TYPE STREQUAL "Release")
@@ -55,13 +55,7 @@ optionx(ZIG_OBJECT_FORMAT "obj|bc" "Output file format for Zig object files" DEF
optionx(ZIG_LOCAL_CACHE_DIR FILEPATH "The path to local the zig cache directory" DEFAULT ${CACHE_PATH}/zig/local)
optionx(ZIG_GLOBAL_CACHE_DIR FILEPATH "The path to the global zig cache directory" DEFAULT ${CACHE_PATH}/zig/global)
if(CI)
set(ZIG_COMPILER_SAFE_DEFAULT ON)
else()
set(ZIG_COMPILER_SAFE_DEFAULT OFF)
endif()
optionx(ZIG_COMPILER_SAFE BOOL "Download a ReleaseSafe build of the Zig compiler." DEFAULT ${ZIG_COMPILER_SAFE_DEFAULT})
optionx(ZIG_COMPILER_SAFE BOOL "Download a ReleaseSafe build of the Zig compiler." DEFAULT ${CI})
setenv(ZIG_LOCAL_CACHE_DIR ${ZIG_LOCAL_CACHE_DIR})
setenv(ZIG_GLOBAL_CACHE_DIR ${ZIG_GLOBAL_CACHE_DIR})

View File

@@ -1,4 +1,4 @@
FROM debian:bookworm-slim AS build
FROM debian:trixie-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
@@ -55,7 +55,7 @@ RUN apt-get update -qq \
&& which bun \
&& bun --version
FROM debian:bookworm-slim
FROM debian:trixie-slim
# Disable the runtime transpiler cache by default inside Docker containers.
# On ephemeral containers, the cache is not useful

View File

@@ -1,4 +1,4 @@
FROM debian:bookworm-slim AS build
FROM debian:trixie-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
@@ -56,7 +56,7 @@ RUN apt-get update -qq \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun
FROM debian:bookworm
FROM debian:trixie
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin/bun

View File

@@ -1,4 +1,4 @@
FROM debian:bookworm-slim AS build
FROM debian:trixie-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
@@ -55,7 +55,7 @@ RUN apt-get update -qq \
&& which bun \
&& bun --version
FROM gcr.io/distroless/base-nossl-debian11
FROM gcr.io/distroless/base-nossl-debian13
# Disable the runtime transpiler cache by default inside Docker containers.
# On ephemeral containers, the cache is not useful
@@ -71,6 +71,7 @@ ENV PATH "${PATH}:/usr/local/bun-node-fallback-bin"
# Temporarily use the `build`-stage image binaries to create a symlink:
RUN --mount=type=bind,from=build,source=/usr/bin,target=/usr/bin \
--mount=type=bind,from=build,source=/etc/alternatives/which,target=/etc/alternatives/which \
--mount=type=bind,from=build,source=/bin,target=/bin \
--mount=type=bind,from=build,source=/usr/lib,target=/usr/lib \
--mount=type=bind,from=build,source=/lib,target=/lib \

View File

@@ -34,7 +34,7 @@ By default, Bun's CSS bundler targets the following browsers:
The CSS Nesting specification allows you to write more concise and intuitive stylesheets by nesting selectors inside one another. Instead of repeating parent selectors across your CSS file, you can write child styles directly within their parent blocks.
```css title="styles.css" icon="file-code"
```scss title="styles.css" icon="file-code"
/* With nesting */
.card {
background: white;
@@ -72,7 +72,7 @@ Bun's CSS bundler automatically converts this nested syntax into traditional fla
You can also nest media queries and other at-rules inside selectors, eliminating the need to repeat selector patterns:
```css title="styles.css" icon="file-code"
```scss title="styles.css" icon="file-code"
.responsive-element {
display: block;
@@ -100,7 +100,7 @@ This compiles to:
The `color-mix()` function gives you an easy way to blend two colors together according to a specified ratio in a chosen color space. This powerful feature lets you create color variations without manually calculating the resulting values.
```css title="styles.css" icon="file-code"
```scss title="styles.css" icon="file-code"
.button {
/* Mix blue and red in the RGB color space with a 30/70 proportion */
background-color: color-mix(in srgb, blue 30%, red);

View File

@@ -65,6 +65,7 @@ In Bun's CLI, simple boolean flags like `--minify` do not accept an argument. Ot
| `--chunk-names` | `--chunk-naming` | Renamed for consistency with naming in JS API |
| `--color` | n/a | Always enabled |
| `--drop` | `--drop` | |
| n/a | `--feature` | Bun-specific. Enable feature flags for compile-time dead-code elimination via `import { feature } from "bun:bundle"` |
| `--entry-names` | `--entry-naming` | Renamed for consistency with naming in JS API |
| `--global-name` | n/a | Not applicable, Bun does not support `iife` output at this time |
| `--ignore-annotations` | `--ignore-dce-annotations` | |
@@ -231,23 +232,67 @@ const myPlugin: BunPlugin = {
### onResolve
<Tabs>
<Tab title="options">- 🟢 `filter` - 🟢 `namespace`</Tab>
<Tab title="options">
- 🟢 `filter`
- 🟢 `namespace`
</Tab>
<Tab title="arguments">
- 🟢 `path` - 🟢 `importer` - 🔴 `namespace` - 🔴 `resolveDir` - 🔴 `kind` - 🔴 `pluginData`
- 🟢 `path`
- 🟢 `importer`
- 🔴 `namespace`
- 🔴 `resolveDir`
- 🔴 `kind`
- 🔴 `pluginData`
</Tab>
<Tab title="results">
- 🟢 `namespace` - 🟢 `path` - 🔴 `errors` - 🔴 `external` - 🔴 `pluginData` - 🔴 `pluginName` - 🔴 `sideEffects` -
🔴 `suffix` - 🔴 `warnings` - 🔴 `watchDirs` - 🔴 `watchFiles`
- 🟢 `namespace`
- 🟢 `path`
- 🔴 `errors`
- 🔴 `external`
- 🔴 `pluginData`
- 🔴 `pluginName`
- 🔴 `sideEffects`
- 🔴 `suffix`
- 🔴 `warnings`
- 🔴 `watchDirs`
- 🔴 `watchFiles`
</Tab>
</Tabs>
### onLoad
<Tabs>
<Tab title="options">- 🟢 `filter` - 🟢 `namespace`</Tab>
<Tab title="arguments">- 🟢 `path` - 🔴 `namespace` - 🔴 `suffix` - 🔴 `pluginData`</Tab>
<Tab title="options">
- 🟢 `filter`
- 🟢 `namespace`
</Tab>
<Tab title="arguments">
- 🟢 `path`
- 🔴 `namespace`
- 🔴 `suffix`
- 🔴 `pluginData`
</Tab>
<Tab title="results">
- 🟢 `contents` - 🟢 `loader` - 🔴 `errors` - 🔴 `pluginData` - 🔴 `pluginName` - 🔴 `resolveDir` - 🔴 `warnings` -
🔴 `watchDirs` - 🔴 `watchFiles`
- 🟢 `contents`
- 🟢 `loader`
- 🔴 `errors`
- 🔴 `pluginData`
- 🔴 `pluginName`
- 🔴 `resolveDir`
- 🔴 `warnings`
- 🔴 `watchDirs`
- 🔴 `watchFiles`
</Tab>
</Tabs>

File diff suppressed because it is too large

View File

@@ -427,8 +427,8 @@ This will allow you to use TailwindCSS utility classes in your HTML and CSS file
<!doctype html>
<html>
<head>
<link rel="stylesheet" href="tailwindcss" />
<!-- [!code ++] -->
<link rel="stylesheet" href="tailwindcss" />
</head>
<!-- the rest of your HTML... -->
</html>
@@ -448,8 +448,8 @@ Alternatively, you can import TailwindCSS in your CSS file:
<!doctype html>
<html>
<head>
<link rel="stylesheet" href="./style.css" />
<!-- [!code ++] -->
<link rel="stylesheet" href="./style.css" />
</head>
<!-- the rest of your HTML... -->
</html>
@@ -492,6 +492,28 @@ Bun will lazily resolve and load each plugin and use them to bundle your routes.
the CLI.
</Note>
## Inline Environment Variables
Bun can replace `process.env.*` references in your frontend JavaScript and TypeScript with their actual values at build time. Configure the `env` option in your `bunfig.toml`:
```toml title="bunfig.toml" icon="settings"
[serve.static]
env = "PUBLIC_*" # only inline env vars starting with PUBLIC_ (recommended)
# env = "inline" # inline all environment variables
# env = "disable" # disable env var replacement (default)
```
<Note>
This only works with literal `process.env.FOO` references, not `import.meta.env` or indirect access like `const env =
process.env; env.FOO`.
If an environment variable is not set, you may see runtime errors like `ReferenceError: process
is not defined` in the browser.
</Note>
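As a minimal illustration (the file name is hypothetical, and it assumes `PUBLIC_API_URL` is set when the dev server starts), only the literal form is rewritten:
```ts title="src/config.ts" icon="/icons/typescript.svg"
// Literal reference: replaced with the actual string value at build time.
const apiUrl = process.env.PUBLIC_API_URL;
// Indirect access: not rewritten, so `process` is undefined in the browser
// and evaluating the next line throws a ReferenceError at runtime.
const env = process.env;
export const alsoApiUrl = env.PUBLIC_API_URL;
```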
See the [HTML & static sites documentation](/bundler/html-static#inline-environment-variables) for more details on build-time configuration and examples.
## How It Works
Bun uses `HTMLRewriter` to scan for `<script>` and `<link>` tags in HTML files, uses them as entrypoints for Bun's bundler, generates an optimized bundle for the JavaScript/TypeScript/TSX/JSX and CSS files, and serves the result.
@@ -632,7 +654,7 @@ const server = serve({
console.log(`🚀 Server running on ${server.url}`);
```
```html title="public/index.html"
```html title="public/index.html" icon="file-code"
<!DOCTYPE html>
<html>
<head>
@@ -757,7 +779,7 @@ export function App() {
}
```
```css title="src/styles.css"
```css title="src/styles.css" icon="file-code"
* {
margin: 0;
padding: 0;
@@ -999,7 +1021,7 @@ CMD ["bun", "index.js"]
### Environment Variables
```bash title=".env.production" icon="file-code"
```ini title=".env.production" icon="file-code"
NODE_ENV=production
PORT=3000
DATABASE_URL=postgresql://user:pass@localhost:5432/myapp

View File

@@ -9,7 +9,7 @@ Hot Module Replacement (HMR) allows you to update modules in a running applicati
## `import.meta.hot` API Reference
Bun implements a client-side HMR API modeled after [Vite's `import.meta.hot` API](https://vitejs.dev/guide/api-hmr.html). It can be checked for with `if (import.meta.hot)`, tree-shaking it in production.
Bun implements a client-side HMR API modeled after [Vite's `import.meta.hot` API](https://vite.dev/guide/api-hmr). It can be checked for with `if (import.meta.hot)`, tree-shaking it in production.
```ts title="index.ts" icon="/icons/typescript.svg"
if (import.meta.hot) {
@@ -144,7 +144,7 @@ Indicates that multiple dependencies' modules can be accepted. This variant acce
`import.meta.hot.data` maintains state between module instances during hot replacement, enabling data transfer from previous to new versions. When `import.meta.hot.data` is written into, Bun will also mark this module as capable of self-accepting (equivalent of calling `import.meta.hot.accept()`).
```jsx title="index.ts" icon="/icons/typescript.svg"
```tsx title="index.tsx" icon="/icons/typescript.svg"
import { createRoot } from "react-dom/client";
import { App } from "./app";

View File

@@ -25,7 +25,7 @@ bun ./index.html
```
```
Bun v1.2.20
Bun v1.3.3
ready in 6.62ms
→ http://localhost:3000/
Press h + Enter to show shortcuts
@@ -51,7 +51,7 @@ bun index.html
```
```
Bun v1.2.20
Bun v1.3.3
ready in 6.62ms
→ http://localhost:3000/
Press h + Enter to show shortcuts
@@ -81,7 +81,7 @@ bun ./index.html ./about.html
```
```txt
Bun v1.2.20
Bun v1.3.3
ready in 6.62ms
→ http://localhost:3000/
Routes:
@@ -104,7 +104,7 @@ bun ./**/*.html
```
```
Bun v1.2.20
Bun v1.3.3
ready in 6.62ms
→ http://localhost:3000/
Routes:
@@ -122,7 +122,7 @@ bun ./index.html ./about/index.html ./about/foo/index.html
```
```
Bun v1.2.20
Bun v1.3.3
ready in 6.62ms
→ http://localhost:3000/
Routes:
@@ -164,7 +164,7 @@ For example:
}
```
```css abc.css
```css abc.css icon="file-code"
body {
background-color: red;
}
@@ -174,7 +174,7 @@ body {
This outputs:
```css
```css styles.css icon="file-code"
body {
background-color: red;
}
@@ -237,17 +237,118 @@ Then, reference TailwindCSS in your HTML via `<link>` tag, `@import` in CSS, or
<Tabs>
<Tab title="index.html">
```html title="index.html" icon="file-code"
{/* Reference TailwindCSS in your HTML */}
<!-- Reference TailwindCSS in your HTML -->
<link rel="stylesheet" href="tailwindcss" />
```
</Tab>
<Tab title="styles.css">
```css title="styles.css" icon="file-code"
@import "tailwindcss";
```
</Tab>
<Tab title="app.ts">
```ts title="app.ts" icon="/icons/typescript.svg"
import "tailwindcss";
```
</Tab>
<Tab title="styles.css">```css title="styles.css" icon="file-code" @import "tailwindcss"; ```</Tab>
<Tab title="app.ts">```ts title="app.ts" icon="/icons/typescript.svg" import "tailwindcss"; ```</Tab>
</Tabs>
<Info>Only one of these is necessary, not all three.</Info>
## Inline environment variables
Bun can replace `process.env.*` references in your JavaScript and TypeScript with their actual values at build time. This is useful for injecting configuration like API URLs or feature flags into your frontend code.
### Dev server (runtime)
To inline environment variables when using `bun ./index.html`, configure the `env` option in your `bunfig.toml`:
```toml title="bunfig.toml" icon="settings"
[serve.static]
env = "PUBLIC_*" # only inline env vars starting with PUBLIC_ (recommended)
# env = "inline" # inline all environment variables
# env = "disable" # disable env var replacement (default)
```
<Note>
This only works with literal `process.env.FOO` references, not `import.meta.env` or indirect access like `const env =
process.env; env.FOO`.
If an environment variable is not set, you may see runtime errors like `ReferenceError: process
is not defined` in the browser.
</Note>
Then run the dev server:
```bash terminal icon="terminal"
PUBLIC_API_URL=https://api.example.com bun ./index.html
```
### Build for production
When building static HTML for production, use the `env` option to inline environment variables:
<Tabs>
<Tab title="CLI">
```bash terminal icon="terminal"
# Inline all environment variables
bun build ./index.html --outdir=dist --env=inline
# Only inline env vars with a specific prefix (recommended)
bun build ./index.html --outdir=dist --env=PUBLIC_*
```
</Tab>
<Tab title="API">
```ts title="build.ts" icon="/icons/typescript.svg"
// Inline all environment variables
await Bun.build({
entrypoints: ["./index.html"],
outdir: "./dist",
env: "inline", // [!code highlight]
});
// Only inline env vars with a specific prefix (recommended)
await Bun.build({
entrypoints: ["./index.html"],
outdir: "./dist",
env: "PUBLIC_*", // [!code highlight]
});
```
</Tab>
</Tabs>
### Example
Given this source file:
```ts title="app.ts" icon="/icons/typescript.svg"
const apiUrl = process.env.PUBLIC_API_URL;
console.log(`API URL: ${apiUrl}`);
```
And running with `PUBLIC_API_URL=https://api.example.com`:
```bash terminal icon="terminal"
PUBLIC_API_URL=https://api.example.com bun build ./index.html --outdir=dist --env=PUBLIC_*
```
The bundled output will contain:
```js title="dist/app.js" icon="/icons/javascript.svg"
const apiUrl = "https://api.example.com";
console.log(`API URL: ${apiUrl}`);
```
## Echo console logs from browser to terminal
Bun's dev server supports streaming console logs from the browser to the terminal.
@@ -259,7 +360,7 @@ bun ./index.html --console
```
```
Bun v1.2.20
Bun v1.3.3
ready in 6.62ms
→ http://localhost:3000/
Press h + Enter to show shortcuts
@@ -371,7 +472,8 @@ All paths are resolved relative to your HTML file, making it easy to organize yo
- Need more configuration options for things like asset handling
- Need a way to configure CORS, headers, etc.
If you want to submit a PR, most of the code is [here](https://github.com/oven-sh/bun/blob/main/src/bun.js/api/bun/html-rewriter.ts). You could even copy paste that file into your project and use it as a starting point.
{/* todo: find the correct link to link to as this 404's and there isn't any similar files */}
{/* If you want to submit a PR, most of the code is [here](https://github.com/oven-sh/bun/blob/main/src/bun.js/api/bun/html-rewriter.ts). You could even copy paste that file into your project and use it as a starting point. */}
</Warning>

View File

@@ -106,7 +106,7 @@ For each file specified in `entrypoints`, Bun will generate a new bundle. This b
The contents of `out/index.js` will look something like this:
```ts title="out/index.js" icon="/icons/javascript.svg"
```js title="out/index.js" icon="/icons/javascript.svg"
// out/index.js
// ...
// ~20k lines of code
@@ -160,8 +160,12 @@ Like the Bun runtime, the bundler supports an array of file types out of the box
| ----------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `.js` `.jsx` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` | Uses Bun's built-in transpiler to parse the file and transpile TypeScript/JSX syntax to vanilla JavaScript. The bundler executes a set of default transforms including dead code elimination and tree shaking. At the moment Bun does not attempt to down-convert syntax; if you use recent ECMAScript syntax, that will be reflected in the bundled code. |
| `.json` | JSON files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import pkg from "./package.json";<br/>pkg.name; // => "my-package"<br/>` |
| `.jsonc` | JSON with comments. Files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import config from "./config.jsonc";<br/>config.name; // => "my-config"<br/>` |
| `.toml` | TOML files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import config from "./bunfig.toml";<br/>config.logLevel; // => "debug"<br/>` |
| `.yaml` `.yml` | YAML files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import config from "./config.yaml";<br/>config.name; // => "my-app"<br/>` |
| `.txt` | The contents of the text file are read and inlined into the bundle as a string.<br/><br/>`js<br/>import contents from "./file.txt";<br/>console.log(contents); // => "Hello, world!"<br/>` |
| `.html` | HTML files are processed and any referenced assets (scripts, stylesheets, images) are bundled. |
| `.css` | CSS files are bundled together into a single `.css` file in the output directory. |
| `.node` `.wasm` | These files are supported by the Bun runtime, but during bundling they are treated as assets. |
### Assets
@@ -523,7 +527,7 @@ Injects environment variables into the bundled output by converting `process.env
For the input below:
```ts title="input.js" icon="/icons/javascript.svg"
```js title="input.js" icon="/icons/javascript.svg"
// input.js
console.log(process.env.FOO);
console.log(process.env.BAZ);
@@ -531,7 +535,7 @@ console.log(process.env.BAZ);
The generated bundle will contain the following code:
```ts title="output.js" icon="/icons/javascript.svg"
```js title="output.js" icon="/icons/javascript.svg"
// output.js
console.log("bar");
console.log("123");
@@ -576,7 +580,7 @@ console.log(process.env.BAZ);
The generated bundle will contain the following code:
```ts title="output.js" icon="/icons/javascript.svg"
```js title="output.js" icon="/icons/javascript.svg"
console.log(process.env.FOO);
console.log("https://acme.com");
console.log(process.env.BAZ);
@@ -718,7 +722,7 @@ Normally, bundling `index.tsx` would generate a bundle containing the entire sou
The generated bundle will look something like this:
```ts title="out/index.js" icon="/icons/javascript.svg"
```js title="out/index.js" icon="/icons/javascript.svg"
import { z } from "zod";
// ...
@@ -1022,7 +1026,7 @@ Setting `publicPath` will prefix all file paths with the specified value.
The output file would now look something like this.
```ts title="out/index.js" icon="/icons/javascript.svg"
```js title="out/index.js" icon="/icons/javascript.svg"
var logo = "https://cdn.example.com/logo-a7305bdef.svg";
```
@@ -1137,6 +1141,84 @@ Remove function calls from a bundle. For example, `--drop=console` will remove a
</Tab>
</Tabs>
### features
Enable compile-time feature flags for dead-code elimination. This provides a way to conditionally include or exclude code paths at bundle time using `import { feature } from "bun:bundle"`.
```ts title="app.ts" icon="/icons/typescript.svg"
import { feature } from "bun:bundle";
if (feature("PREMIUM")) {
// Only included when PREMIUM flag is enabled
initPremiumFeatures();
}
if (feature("DEBUG")) {
// Only included when DEBUG flag is enabled
console.log("Debug mode");
}
```
<Tabs>
<Tab title="JavaScript">
```ts title="build.ts" icon="/icons/typescript.svg"
await Bun.build({
entrypoints: ['./app.ts'],
outdir: './out',
features: ["PREMIUM"], // PREMIUM=true, DEBUG=false
})
```
</Tab>
<Tab title="CLI">
```bash terminal icon="terminal"
bun build ./app.ts --outdir ./out --feature PREMIUM
```
</Tab>
</Tabs>
The `feature()` function is replaced with `true` or `false` at bundle time. Combined with minification, unreachable code is eliminated:
```ts title="Input" icon="/icons/typescript.svg"
import { feature } from "bun:bundle";
const mode = feature("PREMIUM") ? "premium" : "free";
```
```js title="Output (with --feature PREMIUM --minify)" icon="/icons/javascript.svg"
var mode = "premium";
```
```js title="Output (without --feature PREMIUM, with --minify)" icon="/icons/javascript.svg"
var mode = "free";
```
**Key behaviors:**
- `feature()` requires a string literal argument — dynamic values are not supported
- The `bun:bundle` import is completely removed from the output
- Works with `bun build`, `bun run`, and `bun test`
- Multiple flags can be enabled: `--feature FLAG_A --feature FLAG_B`
- For type safety, augment the `Registry` interface to restrict `feature()` to known flags (see below)
**Use cases:**
- Platform-specific code (`feature("SERVER")` vs `feature("CLIENT")`)
- Environment-based features (`feature("DEVELOPMENT")`)
- Gradual feature rollouts
- A/B testing variants
- Paid tier features
**Type safety:** By default, `feature()` accepts any string. To get autocomplete and catch typos at compile time, create an `env.d.ts` file (or add to an existing `.d.ts`) and augment the `Registry` interface:
```ts title="env.d.ts" icon="/icons/typescript.svg"
declare module "bun:bundle" {
interface Registry {
features: "DEBUG" | "PREMIUM" | "BETA_FEATURES";
}
}
```
Ensure the file is included in your `tsconfig.json` (e.g., `"include": ["src", "env.d.ts"]`). Now `feature()` only accepts those flags, and invalid strings like `feature("TYPO")` become type errors.
## Outputs
The `Bun.build` function returns a `Promise<BuildOutput>`, defined as:
@@ -1352,10 +1434,12 @@ interface BuildConfig {
* JSX configuration object for controlling JSX transform behavior
*/
jsx?: {
runtime?: "automatic" | "classic";
importSource?: string;
factory?: string;
fragment?: string;
importSource?: string;
runtime?: "automatic" | "classic";
sideEffects?: boolean;
development?: boolean;
};
naming?:
| string
@@ -1372,7 +1456,7 @@ interface BuildConfig {
publicPath?: string;
define?: Record<string, string>;
loader?: { [k in string]: Loader };
sourcemap?: "none" | "linked" | "inline" | "external" | "linked" | boolean; // default: "none", true -> "inline"
sourcemap?: "none" | "linked" | "inline" | "external" | boolean; // default: "none", true -> "inline"
/**
* package.json `exports` conditions used when resolving imports
*
@@ -1439,13 +1523,20 @@ interface BuildConfig {
drop?: string[];
/**
* When set to `true`, the returned promise rejects with an AggregateError when a build failure happens.
* When set to `false`, the `success` property of the returned object will be `false` when a build failure happens.
* - When set to `true`, the returned promise rejects with an AggregateError when a build failure happens.
* - When set to `false`, returns a {@link BuildOutput} with `{success: false}`
*
* This defaults to `false` in Bun 1.1 and will change to `true` in Bun 1.2
* as most usage of `Bun.build` forgets to check for errors.
* @default true
*/
throw?: boolean;
/**
* Custom tsconfig.json file path to use for path resolution.
* Equivalent to `--tsconfig-override` in the CLI.
*/
tsconfig?: string;
outdir?: string;
}
interface BuildOutput {
@@ -1462,7 +1553,21 @@ interface BuildArtifact extends Blob {
sourcemap: BuildArtifact | null;
}
type Loader = "js" | "jsx" | "ts" | "tsx" | "json" | "toml" | "file" | "napi" | "wasm" | "text";
type Loader =
| "js"
| "jsx"
| "ts"
| "tsx"
| "css"
| "json"
| "jsonc"
| "toml"
| "yaml"
| "text"
| "file"
| "napi"
| "wasm"
| "html";
interface BuildOutput {
outputs: BuildArtifact[];

View File

@@ -7,14 +7,16 @@ The Bun bundler implements a set of default loaders out of the box.
> As a rule of thumb: **the bundler and the runtime both support the same set of file types out of the box.**
`.js` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` `.jsx` `.toml` `.json` `.txt` `.wasm` `.node` `.html`
`.js` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` `.jsx` `.css` `.json` `.jsonc` `.toml` `.yaml` `.yml` `.txt` `.wasm` `.node` `.html` `.sh`
Bun uses the file extension to determine which built-in loader should be used to parse the file. Every loader has a name, such as `js`, `tsx`, or `json`. These names are used when building plugins that extend Bun with custom loaders.
You can explicitly specify which loader to use using the `'loader'` import attribute.
You can explicitly specify which loader to use with the `'type'` import attribute.
```ts title="index.ts" icon="/icons/typescript.svg"
import my_toml from "./my_file" with { loader: "toml" };
import my_toml from "./my_file" with { type: "toml" };
// or with dynamic imports
const { default: my_toml } = await import("./my_file", { with: { type: "toml" } });
```
## Built-in loaders
@@ -85,7 +87,7 @@ If a `.json` file is passed as an entrypoint to the bundler, it will be converte
}
```
```ts Output
```js Output
export default {
name: "John Doe",
age: 35,
@@ -97,7 +99,32 @@ export default {
---
### toml
### `jsonc`
**JSON with Comments loader.** Default for `.jsonc`.
JSONC (JSON with Comments) files can be directly imported. Bun will parse them, stripping out comments and trailing commas.
```js
import config from "./config.jsonc";
console.log(config);
```
During bundling, the parsed JSONC is inlined into the bundle as a JavaScript object, identical to the `json` loader.
```js
var config = {
option: "value",
};
```
<Note>
Bun automatically uses the `jsonc` loader for `tsconfig.json`, `jsconfig.json`, `package.json`, and `bun.lock` files.
</Note>
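For instance, a small sketch that leans on this behavior (it assumes the project's `tsconfig.json` contains comments and defines `compilerOptions.target`):
```ts
// tsconfig.json is parsed with the jsonc loader, so comments and
// trailing commas are stripped instead of causing a syntax error.
import tsconfig from "./tsconfig.json";
console.log(tsconfig.compilerOptions.target);
```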
---
### `toml`
**TOML loader.** Default for `.toml`.
@@ -131,7 +158,7 @@ age = 35
email = "johndoe@example.com"
```
```ts Output
```js Output
export default {
name: "John Doe",
age: 35,
@@ -143,7 +170,53 @@ export default {
---
### text
### `yaml`
**YAML loader.** Default for `.yaml` and `.yml`.
YAML files can be directly imported. Bun will parse them with its fast native YAML parser.
```js
import config from "./config.yaml";
console.log(config);
// via import attribute:
import data from "./data.txt" with { type: "yaml" };
```
During bundling, the parsed YAML is inlined into the bundle as a JavaScript object.
```js
var config = {
name: "my-app",
version: "1.0.0",
// ...other fields
};
```
If a `.yaml` or `.yml` file is passed as an entrypoint, it will be converted to a `.js` module that `export default`s the parsed object.
<CodeGroup>
```yaml Input
name: John Doe
age: 35
email: johndoe@example.com
```
```js Output
export default {
name: "John Doe",
age: 35,
email: "johndoe@example.com",
};
```
</CodeGroup>
---
### `text`
**Text loader.** Default for `.txt`.
@@ -173,7 +246,7 @@ If a `.txt` file is passed as an entrypoint, it will be converted to a `.js` mod
Hello, world!
```
```ts Output
```js Output
export default "Hello, world!";
```
@@ -181,7 +254,7 @@ export default "Hello, world!";
---
### napi
### `napi`
**Native addon loader.** Default for `.node`.
@@ -196,7 +269,7 @@ console.log(addon);
---
### sqlite
### `sqlite`
**SQLite loader.** Requires `with { "type": "sqlite" }` import attribute.
@@ -226,7 +299,9 @@ Otherwise, the database to embed is copied into the `outdir` with a hashed filen
---
### html
### `html`
**HTML loader.** Default for `.html`.
The `html` loader processes HTML files and bundles any referenced assets. It will:
@@ -237,7 +312,7 @@ The `html` loader processes HTML files and bundles any referenced assets. It wil
For example, given this HTML file:
```html title="src/index.html"
```html title="src/index.html" icon="file-code"
<!DOCTYPE html>
<html>
<body>
@@ -250,7 +325,7 @@ For example, given this HTML file:
It will output a new HTML file with the bundled assets:
```html title="dist/index.html"
```html title="dist/index.html" icon="file-code"
<!DOCTYPE html>
<html>
<body>
@@ -301,7 +376,27 @@ The `html` loader behaves differently depending on how it's used:
---
### sh
### `css`
**CSS loader.** Default for `.css`.
CSS files can be directly imported. The bundler will parse and bundle CSS files, handling `@import` statements and `url()` references.
```js
import "./styles.css";
```
During bundling, all imported CSS files are bundled together into a single `.css` file in the output directory.
```css
.my-class {
background: url("./image.png");
}
```
---
### `sh`
**Bun Shell loader.** Default for `.sh` files.
@@ -313,7 +408,7 @@ bun run ./script.sh
---
### file
### `file`
**File loader.** Default for all unrecognized file types.

View File

@@ -87,7 +87,7 @@ macro();
When shipping a library containing a macro to npm or another package registry, use the `"macro"` export condition to provide a special version of your package exclusively for the macro environment.
```json title="package.json" icon="file-code"
```json title="package.json" icon="file-json"
{
"name": "my-package",
"exports": {

File diff suppressed because it is too large

View File

@@ -42,7 +42,21 @@ type PluginBuilder = {
config: BuildConfig;
};
type Loader = "js" | "jsx" | "ts" | "tsx" | "css" | "json" | "toml";
type Loader =
| "js"
| "jsx"
| "ts"
| "tsx"
| "json"
| "jsonc"
| "toml"
| "yaml"
| "file"
| "napi"
| "wasm"
| "text"
| "css"
| "html";
```
## Usage

View File

@@ -188,7 +188,7 @@
{
"group": "Publishing & Analysis",
"icon": "upload",
"pages": ["/pm/cli/publish", "/pm/cli/outdated", "/pm/cli/why", "/pm/cli/audit"]
"pages": ["/pm/cli/publish", "/pm/cli/outdated", "/pm/cli/why", "/pm/cli/audit", "/pm/cli/info"]
},
{
"group": "Workspace Management",
@@ -298,7 +298,14 @@
{
"group": "Deployment",
"icon": "rocket",
"pages": ["/guides/deployment/vercel", "/guides/deployment/railway", "/guides/deployment/render"]
"pages": [
"/guides/deployment/vercel",
"/guides/deployment/railway",
"/guides/deployment/render",
"/guides/deployment/aws-lambda",
"/guides/deployment/digital-ocean",
"/guides/deployment/google-cloud-run"
]
},
{
"group": "Runtime & Debugging",
@@ -319,6 +326,7 @@
"group": "Utilities",
"icon": "wrench",
"pages": [
"/guides/util/upgrade",
"/guides/util/detect-bun",
"/guides/util/version",
"/guides/util/hash-a-password",
@@ -347,7 +355,7 @@
"/guides/ecosystem/discordjs",
"/guides/ecosystem/docker",
"/guides/ecosystem/drizzle",
"/guides/ecosystem/edgedb",
"/guides/ecosystem/gel",
"/guides/ecosystem/elysia",
"/guides/ecosystem/express",
"/guides/ecosystem/hono",
@@ -362,13 +370,15 @@
"/guides/ecosystem/qwik",
"/guides/ecosystem/react",
"/guides/ecosystem/remix",
"/guides/ecosystem/tanstack-start",
"/guides/ecosystem/sentry",
"/guides/ecosystem/solidstart",
"/guides/ecosystem/ssr-react",
"/guides/ecosystem/stric",
"/guides/ecosystem/sveltekit",
"/guides/ecosystem/systemd",
"/guides/ecosystem/vite"
"/guides/ecosystem/vite",
"/guides/ecosystem/upstash"
]
},
{
@@ -455,6 +465,7 @@
"/guides/test/update-snapshots",
"/guides/test/coverage",
"/guides/test/coverage-threshold",
"/guides/test/concurrent-test-glob",
"/guides/test/skip-tests",
"/guides/test/todo-tests",
"/guides/test/timeout",

View File

@@ -4,13 +4,9 @@ description: Share feedback, bug reports, and feature requests
mode: center
---
import Feedback from "/snippets/cli/feedback.mdx";
Whether you've found a bug, have a performance issue, or just want to suggest an improvement, here's how you can open a helpful issue:
<Callout icon="discord">
For general questions, please join our [Discord](https://discord.com/invite/CXdq2DP29u).
</Callout>
<Callout icon="discord">For general questions, please join our [Discord](https://bun.com/discord).</Callout>
## Reporting Issues
@@ -56,9 +52,7 @@ Whether you've found a bug, have a performance issue, or just want to suggest an
<Note>
- For macOS and Linux: copy the output of `uname -mprs`
- For Windows: copy the output of this command in the PowerShell console:
```powershell
"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"
```
`"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"`
</Note>
</Step>
@@ -79,7 +73,3 @@ echo "please document X" | bun feedback --email you@example.com
```
You can provide feedback as text arguments, file paths, or piped input.
---
<Feedback />

View File

@@ -26,4 +26,4 @@ const regularArr = Array.from(uintArr);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -23,4 +23,4 @@ blob.type; // => "application/octet-stream"
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -24,4 +24,4 @@ const nodeBuffer = Buffer.from(arrBuffer, 0, 16); // view first 16 bytes
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -14,4 +14,4 @@ const str = decoder.decode(buf);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -38,4 +38,4 @@ const arr = new Uint8Array(buffer, 0, 16); // view first 16 bytes
---
See [Docs > API > Utils](https://bun.com/docs/api/utils) for more useful utilities.
See [Docs > API > Utils](/runtime/utils) for more useful utilities.

View File

@@ -13,4 +13,4 @@ const buf = await blob.arrayBuffer();
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const arr = new DataView(await blob.arrayBuffer());
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const stream = await blob.stream();
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -14,4 +14,4 @@ const str = await blob.text();
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const arr = new Uint8Array(await blob.arrayBuffer());
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const arrBuf = nodeBuf.buffer;
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const blob = new Blob([buf]);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -40,4 +40,4 @@ const stream = blob.stream(1024);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -24,4 +24,4 @@ const str = buf.toString("utf8", 0, 5);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ buf instanceof Uint8Array; // => true
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -14,4 +14,4 @@ const str = decoder.decode(dv);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -24,4 +24,4 @@ arr.byteLength; // => 32
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -15,4 +15,4 @@ console.log(await blob.text());
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const buf = Buffer.from(arr);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -13,4 +13,4 @@ const dv = new DataView(arr.buffer, arr.byteOffset, arr.byteLength);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -40,4 +40,4 @@ const stream = blob.stream(1024);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -15,4 +15,4 @@ const str = decoder.decode(arr);
---
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.
See [Docs > API > Binary Data](/runtime/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -0,0 +1,204 @@
---
title: Deploy a Bun application on AWS Lambda
sidebarTitle: Deploy on AWS Lambda
mode: center
---
[AWS Lambda](https://aws.amazon.com/lambda/) is a serverless compute service that lets you run code without provisioning or managing servers.
In this guide, we will deploy a Bun HTTP server to AWS Lambda using a `Dockerfile`.
<Note>
Before continuing, make sure you have:
- A Bun application ready for deployment
- An [AWS account](https://aws.amazon.com/)
- [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html) installed and configured
- [Docker](https://docs.docker.com/get-started/get-docker/) installed and added to your `PATH`
</Note>
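If you don't have an application ready yet, a minimal `index.ts` along these lines is enough to follow this guide. This is only a sketch using `Bun.serve`; the Lambda adapter forwards requests to the port set by the `PORT` environment variable (8080 in the Dockerfile below):
```ts index.ts icon="/icons/typescript.svg"
// Minimal Bun HTTP server used as the container's entry point.
// The Lambda adapter proxies incoming requests to this port.
Bun.serve({
  port: Number(process.env.PORT ?? 8080),
  fetch() {
    return new Response("Hello from Bun on Lambda!");
  },
});
```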
---
<Steps>
<Step title="Create a new Dockerfile">
Make sure you're in the directory containing your project, then create a new `Dockerfile` in the root of your project. This file contains the instructions to initialize the container, copy your local project files into it, install dependencies, and start the application.
```docker Dockerfile icon="docker"
# Use the official AWS Lambda adapter image to handle the Lambda runtime
FROM public.ecr.aws/awsguru/aws-lambda-adapter:0.9.0 AS aws-lambda-adapter
# Use the official Bun image to run the application
FROM oven/bun:debian AS bun_latest
# Copy the Lambda adapter into the container
COPY --from=aws-lambda-adapter /lambda-adapter /opt/extensions/lambda-adapter
# Set the port to 8080. This is required for the AWS Lambda adapter.
ENV PORT=8080
# Set the work directory to `/var/task`. This is the default work directory for Lambda.
WORKDIR "/var/task"
# Copy the package.json and bun.lock into the container
COPY package.json bun.lock ./
# Install the dependencies
RUN bun install --production --frozen-lockfile
# Copy the rest of the application into the container
COPY . /var/task
# Run the application.
CMD ["bun", "index.ts"]
```
<Note>
Make sure that the start command corresponds to your application's entry point. This can also be `CMD ["bun", "run", "start"]` if you have a start script in your `package.json`.
This image installs dependencies and runs your app with Bun inside a container. If your app doesn't have dependencies, you can omit the `RUN bun install --production --frozen-lockfile` line.
</Note>
Create a new `.dockerignore` file in the root of your project. This file contains the files and directories that should be _excluded_ from the container image, such as `node_modules`. This makes your builds faster and smaller:
```docker .dockerignore icon="Docker"
node_modules
Dockerfile*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
.env
# Any other files or directories you want to exclude
```
</Step>
<Step title="Build the Docker image">
Make sure you're in the directory containing your `Dockerfile`, then build the Docker image. In this case, we'll call the image `bun-lambda-demo` and tag it as `latest`.
```bash terminal icon="terminal"
# cd /path/to/your/app
docker build --provenance=false --platform linux/amd64 -t bun-lambda-demo:latest .
```
</Step>
<Step title="Create an ECR repository">
To deploy the image to AWS Lambda, we first need to create an [ECR repository](https://aws.amazon.com/ecr/) to push it to.
By running the following command, we:
- Create an ECR repository named `bun-lambda-demo` in the `us-east-1` region
- Get the repository URI and export it as an environment variable. This is optional, but it makes the next steps easier.
```bash terminal icon="terminal"
export ECR_URI=$(aws ecr create-repository --repository-name bun-lambda-demo --region us-east-1 --query 'repository.repositoryUri' --output text)
echo $ECR_URI
```
```txt
[id].dkr.ecr.us-east-1.amazonaws.com/bun-lambda-demo
```
<Note>
If you're using IAM Identity Center (SSO) or have configured AWS CLI with profiles, you'll need to add the `--profile` flag to your AWS CLI commands.
For example, if your profile is named `my-sso-app`, use `--profile my-sso-app`. Check your AWS CLI configuration with `aws configure list-profiles` to see available profiles.
```bash terminal icon="terminal"
export ECR_URI=$(aws ecr create-repository --repository-name bun-lambda-demo --region us-east-1 --profile my-sso-app --query 'repository.repositoryUri' --output text)
echo $ECR_URI
```
</Note>
</Step>
<Step title="Authenticate with the ECR repository">
Log in to the ECR repository:
```bash terminal icon="terminal"
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin $ECR_URI
```
```txt
Login Succeeded
```
<Note>
If using a profile, use the `--profile` flag:
```bash terminal icon="terminal"
aws ecr get-login-password --region us-east-1 --profile my-sso-app | docker login --username AWS --password-stdin $ECR_URI
```
</Note>
</Step>
<Step title="Tag and push the docker image to the ECR repository">
Make sure you're in the directory containing your `Dockerfile`, then tag the docker image with the ECR repository URI.
```bash terminal icon="terminal"
docker tag bun-lambda-demo:latest ${ECR_URI}:latest
```
Then, push the image to the ECR repository.
```bash terminal icon="terminal"
docker push ${ECR_URI}:latest
```
</Step>
<Step title="Create an AWS Lambda function">
Go to **AWS Console** > **Lambda** > [**Create Function**](https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1#/create/function?intent=authorFromImage) > Select **Container image**
<Warning>Make sure you've selected the right region; this URL defaults to `us-east-1`.</Warning>
<Frame>
![Create Function](/images/guides/lambda1.png)
</Frame>
Give the function a name, like `my-bun-function`.
</Step>
<Step title="Select the container image">
Then, go to the **Container image URI** section and click **Browse images**. Select the image we just pushed to the ECR repository.
<Frame>
![Select Container Repository](/images/guides/lambda2.png)
</Frame>
Then, select the `latest` image, and click on **Select image**.
<Frame>
![Select Container Image](/images/guides/lambda3.png)
</Frame>
</Step>
<Step title="Configure the function">
To get a public URL for the function, we need to go to **Additional configurations** > **Networking** > **Function URL**.
Set this to **Enable**, with Auth Type **NONE**.
<Frame>
![Set the Function URL](/images/guides/lambda4.png)
</Frame>
</Step>
<Step title="Create the function">
Click **Create function** at the bottom of the page to create the function.
<Frame>
![Create Function](/images/guides/lambda6.png)
</Frame>
</Step>
<Step title="Get the function URL">
Once the function has been created, you'll be redirected to the function's page, where you can find the function URL in the **Function URL** section.
<Frame>
![Function URL](/images/guides/lambda5.png)
</Frame>
</Step>
<Step title="Test the function">
🥳 Your app is now live! To test the function, you can either go to the **Test** tab, or call the function URL directly.
```bash terminal icon="terminal"
curl -X GET https://[your-function-id].lambda-url.us-east-1.on.aws/
```
```txt
Hello from Bun on Lambda!
```
</Step>
</Steps>

View File

@@ -0,0 +1,161 @@
---
title: Deploy a Bun application on DigitalOcean
sidebarTitle: Deploy on DigitalOcean
mode: center
---
[DigitalOcean](https://www.digitalocean.com/) is a cloud platform that provides a range of services for building and deploying applications.
In this guide, we will deploy a Bun HTTP server to DigitalOcean using a `Dockerfile`.
<Note>
Before continuing, make sure you have:
- A Bun application ready for deployment
- A [DigitalOcean account](https://www.digitalocean.com/)
- [DigitalOcean CLI](https://docs.digitalocean.com/reference/doctl/how-to/install/#step-1-install-doctl) installed and configured
- [Docker](https://docs.docker.com/get-started/get-docker/) installed and added to your `PATH`
</Note>
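If you need a starting point, a minimal `index.ts` like the one below works for this guide. This is a sketch, not an official template; App Platform injects the `PORT` environment variable at runtime, and the Dockerfile below exposes 8080 as a fallback:
```ts index.ts icon="/icons/typescript.svg"
// Minimal Bun HTTP server. App Platform sets PORT at runtime;
// fall back to 8080 to match the port exposed in the Dockerfile.
Bun.serve({
  port: Number(process.env.PORT ?? 8080),
  fetch() {
    return new Response("Hello from Bun on DigitalOcean!");
  },
});
```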
---
<Steps>
<Step title="Create a new DigitalOcean Container Registry">
Create a new Container Registry to store the Docker image.
<Tabs>
<Tab title="Through the DigitalOcean dashboard">
In the DigitalOcean dashboard, go to [**Container Registry**](https://cloud.digitalocean.com/registry), and enter the details for the new registry.
<Frame>
![DigitalOcean registry dashboard](/images/guides/digitalocean-7.png)
</Frame>
Make sure the details are correct, then click **Create Registry**.
</Tab>
<Tab title="Through the DigitalOcean CLI">
```bash terminal icon="terminal"
doctl registry create bun-digitalocean-demo
```
```txt
Name Endpoint Region slug
bun-digitalocean-demo registry.digitalocean.com/bun-digitalocean-demo sfo2
```
</Tab>
</Tabs>
You should see the new registry in the [**DigitalOcean registry dashboard**](https://cloud.digitalocean.com/registry):
<Frame>
![DigitalOcean registry dashboard](/images/guides/digitalocean-1.png)
</Frame>
</Step>
<Step title="Create a new Dockerfile">
Make sure you're in the directory containing your project, then create a new `Dockerfile` in the root of your project. This file contains the instructions to initialize the container, copy your local project files into it, install dependencies, and start the application.
```docker Dockerfile icon="docker"
# Use the official Bun image to run the application
FROM oven/bun:debian
# Set the work directory to `/app`
WORKDIR /app
# Copy the package.json and bun.lock into the container
COPY package.json bun.lock ./
# Install the dependencies
RUN bun install --production --frozen-lockfile
# Copy the rest of the application into the container
COPY . .
# Expose the port (DigitalOcean will set PORT env var)
EXPOSE 8080
# Run the application
CMD ["bun", "index.ts"]
```
<Note>
Make sure that the start command corresponds to your application's entry point. This can also be `CMD ["bun", "run", "start"]` if you have a start script in your `package.json`.
This image installs dependencies and runs your app with Bun inside a container. If your app doesn't have dependencies, you can omit the `RUN bun install --production --frozen-lockfile` line.
</Note>
Create a new `.dockerignore` file in the root of your project. This file contains the files and directories that should be _excluded_ from the container image, such as `node_modules`. This makes your builds faster and smaller:
```docker .dockerignore icon="Docker"
node_modules
Dockerfile*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
.env
# Any other files or directories you want to exclude
```
</Step>
<Step title="Authenticate Docker with DigitalOcean registry">
Before building and pushing the Docker image, authenticate Docker with the DigitalOcean Container Registry:
```bash terminal icon="terminal"
doctl registry login
```
```txt
Successfully authenticated with registry.digitalocean.com
```
<Note>
This command authenticates Docker with DigitalOcean's registry using your DigitalOcean credentials. Without this step, the build and push command will fail with a 401 authentication error.
</Note>
</Step>
<Step title="Build and push the Docker image to the DigitalOcean registry">
Make sure you're in the directory containing your `Dockerfile`, then build and push the Docker image to the DigitalOcean registry in one command:
```bash terminal icon="terminal"
docker buildx build --platform=linux/amd64 -t registry.digitalocean.com/bun-digitalocean-demo/bun-digitalocean-demo:latest --push .
```
<Note>
If you're building on an ARM Mac (M1/M2), you must use `docker buildx` with `--platform=linux/amd64` to ensure compatibility with DigitalOcean's infrastructure. Using `docker build` without the platform flag will create an ARM64 image that won't run on DigitalOcean.
</Note>
Once the image is pushed, you should see it in the [**DigitalOcean registry dashboard**](https://cloud.digitalocean.com/registry):
<Frame>
![DigitalOcean registry dashboard](/images/guides/digitalocean-2.png)
</Frame>
</Step>
<Step title="Create a new DigitalOcean App Platform project">
In the DigitalOcean dashboard, go to [**App Platform**](https://cloud.digitalocean.com/apps) > **Create App**. We can create a project directly from the container image.
<Frame>
![DigitalOcean App Platform project dashboard](/images/guides/digitalocean-3.png)
</Frame>
Make sure the details are correct, then click **Next**.
<Frame>
![DigitalOcean App Platform service dashboard](/images/guides/digitalocean-4.png)
</Frame>
Review and configure resource settings, then click **Create app**.
<Frame>
![DigitalOcean App Platform service dashboard](/images/guides/digitalocean-6.png)
</Frame>
</Step>
<Step title="Visit your live application">
🥳 Your app is now live! Once the app is created, you should see it in the App Platform dashboard with the public URL.
<Frame>
![DigitalOcean App Platform app dashboard](/images/guides/digitalocean-5.png)
</Frame>
</Step>
</Steps>

View File

@@ -0,0 +1,194 @@
---
title: Deploy a Bun application on Google Cloud Run
sidebarTitle: Deploy on Google Cloud Run
mode: center
---
[Google Cloud Run](https://cloud.google.com/run) is a managed platform for deploying and scaling serverless applications. Google handles the infrastructure for you.
In this guide, we will deploy a Bun HTTP server to Google Cloud Run using a `Dockerfile`.
<Note>
Before continuing, make sure you have:
- A Bun application ready for deployment
- A [Google Cloud account](https://cloud.google.com/) with billing enabled
- [Google Cloud CLI](https://cloud.google.com/sdk/docs/install) installed and configured
</Note>
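If your project doesn't include a server yet, a minimal `index.ts` along these lines is enough to follow along. This is a sketch; Cloud Run injects the `PORT` environment variable into the container, so the server should read it rather than hard-code a port:
```ts index.ts icon="/icons/typescript.svg"
// Minimal Bun HTTP server. Cloud Run sets PORT at runtime;
// default to 8080 for local runs.
Bun.serve({
  port: Number(process.env.PORT ?? 8080),
  fetch() {
    return new Response("Hello from Bun on Cloud Run!");
  },
});
```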
---
<Steps>
<Step title={<span>Initialize <code>gcloud</code> by selecting or creating a project</span>}>
Make sure that you've initialized the Google Cloud CLI. This command logs you in and prompts you to either select an existing project or create a new one.
For more help with the Google Cloud CLI, see the [official documentation](https://docs.cloud.google.com/sdk/gcloud/reference/init).
```bash terminal icon="terminal"
gcloud init
```
```txt
Welcome! This command will take you through the configuration of gcloud.
You must sign in to continue. Would you like to sign in (Y/n)? Y
You are signed in as [email@example.com].
Pick cloud project to use:
[1] existing-bun-app-1234
[2] Enter a project ID
[3] Create a new project
Please enter numeric choice or text value (must exactly match list item): 3
Enter a Project ID. my-bun-app
Your current project has been set to: [my-bun-app]
The Google Cloud CLI is configured and ready to use!
```
</Step>
<Step title="(Optional) Store your project info in environment variables">
Set variables for your project ID and number so they're easier to reuse in the following steps.
```bash terminal icon="terminal"
PROJECT_ID=$(gcloud projects list --format='value(projectId)' --filter='name="my bun app"')
PROJECT_NUMBER=$(gcloud projects list --format='value(projectNumber)' --filter='name="my bun app"')
echo $PROJECT_ID $PROJECT_NUMBER
```
```txt
my-bun-app-... [PROJECT_NUMBER]
```
</Step>
<Step title="Link a billing account">
List your available billing accounts and link one to your project:
```bash terminal icon="terminal"
gcloud billing accounts list
```
```txt
ACCOUNT_ID NAME OPEN MASTER_ACCOUNT_ID
[BILLING_ACCOUNT_ID] My Billing Account True
```
Link your billing account to your project. Replace `[BILLING_ACCOUNT_ID]` with the ID of your billing account.
```bash terminal icon="terminal"
gcloud billing projects link $PROJECT_ID --billing-account=[BILLING_ACCOUNT_ID]
```
```txt
billingAccountName: billingAccounts/[BILLING_ACCOUNT_ID]
billingEnabled: true
name: projects/my-bun-app-.../billingInfo
projectId: my-bun-app-...
```
</Step>
<Step title="Enable APIs and configure IAM roles">
Activate the necessary services and grant Cloud Build permissions:
```bash terminal icon="terminal"
gcloud services enable run.googleapis.com cloudbuild.googleapis.com
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=serviceAccount:$PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--role=roles/run.builder
```
<Note>
These commands enable Cloud Run (`run.googleapis.com`) and Cloud Build (`cloudbuild.googleapis.com`), which are required for deploying from source. Cloud Run runs your containerized app, while Cloud Build handles building and packaging it.
The IAM binding grants the Compute Engine service account (`$PROJECT_NUMBER-compute@developer.gserviceaccount.com`) permission to build and deploy images on your behalf.
</Note>
</Step>
<Step title="Add a Dockerfile">
Create a new `Dockerfile` in the root of your project. This file contains the instructions to initialize the container, copy your local project files into it, install dependencies, and start the application.
```docker Dockerfile icon="docker"
# Use the official Bun image to run the application
FROM oven/bun:latest
# Copy the package.json and bun.lock into the container
COPY package.json bun.lock ./
# Install the dependencies
RUN bun install --production --frozen-lockfile
# Copy the rest of the application into the container
COPY . .
# Run the application
CMD ["bun", "index.ts"]
```
<Note>
Make sure that the start command corresponds to your application's entry point. This can also be `CMD ["bun", "run", "start"]` if you have a start script in your `package.json`.
This image installs dependencies and runs your app with Bun inside a container. If your app doesn't have dependencies, you can omit the `RUN bun install --production --frozen-lockfile` line.
</Note>
Create a new `.dockerignore` file in the root of your project. This file contains the files and directories that should be _excluded_ from the container image, such as `node_modules`. This makes your builds faster and smaller:
```docker .dockerignore icon="Docker"
node_modules
Dockerfile*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
.env
# Any other files or directories you want to exclude
```
</Step>
<Step title="Deploy your service">
Make sure you're in the directory containing your `Dockerfile`, then deploy directly from your local source:
<Note>
Update the `--region` flag to your preferred region. You can also omit this flag to get an interactive prompt to
select a region.
</Note>
```bash terminal icon="terminal"
gcloud run deploy my-bun-app --source . --region=us-west1 --allow-unauthenticated
```
```txt
Deploying from source requires an Artifact Registry Docker repository to store built containers. A repository named
[cloud-run-source-deploy] in region [us-west1] will be created.
Do you want to continue (Y/n)? Y
Building using Dockerfile and deploying container to Cloud Run service [my-bun-app] in project [my-bun-app-...] region [us-west1]
✓ Building and deploying... Done.
✓ Validating Service...
✓ Uploading sources...
✓ Building Container... Logs are available at [https://console.cloud.google.com/cloud-build/builds...].
✓ Creating Revision...
✓ Routing traffic...
✓ Setting IAM Policy...
Done.
Service [my-bun-app] revision [my-bun-app-...] has been deployed and is serving 100 percent of traffic.
Service URL: https://my-bun-app-....us-west1.run.app
```
</Step>
<Step title="Visit your live application">
🎉 Your Bun application is now live!
Visit the Service URL (`https://my-bun-app-....us-west1.run.app`) to confirm everything works as expected.
</Step>
</Steps>

View File

@@ -4,8 +4,6 @@ sidebarTitle: Deploy on Vercel
mode: center
---
import { ProductCard } from "/snippets/product-card.mdx";
[Vercel](https://vercel.com/) is a cloud platform that lets you build, deploy, and scale your apps.
<Warning>
@@ -32,7 +30,7 @@ import { ProductCard } from "/snippets/product-card.mdx";
Vercel automatically detects this configuration and runs your application on Bun. The value has to be `"1.x"`; Vercel handles the minor version internally.
For best results, match your local Bun version with the version used by Vercel. **Currently, Bun version `1.2.23` is supported**.
For best results, match your local Bun version with the version used by Vercel.
</Step>
<Step title="Next.js configuration">
@@ -81,7 +79,7 @@ import { ProductCard } from "/snippets/product-card.mdx";
console.log("runtime", process.versions.bun);
```
```txt
runtime 1.2.23
runtime 1.3.3
```
[See the Vercel Bun Runtime documentation for feature support →](https://vercel.com/docs/functions/runtimes/bun#feature-support)

View File

@@ -22,7 +22,7 @@ bun add discord.js
---
Before we go further, we need to go to the [Discord developer portal](https://discord.com/developers/applications), login/signup, create a new _Application_, then create a new _Bot_ within that application. Follow the [official guide](https://discordjs.guide/preparations/setting-up-a-bot-application.html#creating-your-bot) for step-by-step instructions.
Before we go further, we need to go to the [Discord developer portal](https://discord.com/developers/applications), login/signup, create a new _Application_, then create a new _Bot_ within that application. Follow the [official guide](https://discordjs.guide/legacy/preparations/app-setup#creating-your-bot) for step-by-step instructions.
---
@@ -30,7 +30,7 @@ Once complete, you'll be presented with your bot's _private key_. Let's add this
<Note>This is an example token that has already been invalidated.</Note>
```txt .env.local icon="settings"
```ini .env.local icon="settings"
DISCORD_TOKEN=NzkyNzE1NDU0MTk2MDg4ODQy.X-hvzA.Ovy4MCQywSkoMRRclStW4xAYK7I
```

View File

@@ -7,8 +7,8 @@ mode: center
Express and other major Node.js HTTP libraries should work out of the box. Bun implements the [`node:http`](https://nodejs.org/api/http.html) and [`node:https`](https://nodejs.org/api/https.html) modules that these libraries rely on.
<Note>
Refer to the [Runtime > Node.js APIs](https://bun.com/docs/runtime/nodejs-apis#node-http) page for more detailed
compatibility information.
Refer to the [Runtime > Node.js APIs](/runtime/nodejs-compat#node-http) page for more detailed compatibility
information.
</Note>
```sh terminal icon="terminal"

View File

@@ -1,23 +1,27 @@
---
title: Use EdgeDB with Bun
sidebarTitle: EdgeDB with Bun
title: Use Gel with Bun
sidebarTitle: Gel with Bun
mode: center
---
EdgeDB is a graph-relational database powered by Postgres under the hood. It provides a declarative schema language, migrations system, and object-oriented query language, in addition to supporting raw SQL queries. It solves the object-relational mapping problem at the database layer, eliminating the need for an ORM library in your application code.
Gel (formerly EdgeDB) is a graph-relational database powered by Postgres under the hood. It provides a declarative schema language, migrations system, and object-oriented query language, in addition to supporting raw SQL queries. It solves the object-relational mapping problem at the database layer, eliminating the need for an ORM library in your application code.
---
First, [install EdgeDB](https://www.edgedb.com/install) if you haven't already.
First, [install Gel](https://docs.geldata.com/learn/installation) if you haven't already.
<CodeGroup>
```sh Linux/macOS terminal icon="terminal"
curl --proto '=https' --tlsv1.2 -sSf https://sh.edgedb.com | sh
curl https://www.geldata.com/sh --proto "=https" -sSf1 | sh
```
```sh Windows terminal icon="windows"
iwr https://ps1.edgedb.com -useb | iex
irm https://www.geldata.com/ps1 | iex
```
```sh Homebrew terminal icon="terminal"
brew install geldata/tap/gel-cli
```
</CodeGroup>
@@ -34,35 +38,35 @@ bun init -y
---
We'll use the EdgeDB CLI to initialize an EdgeDB instance for our project. This creates an `edgedb.toml` file in our project root.
We'll use the Gel CLI to initialize a Gel instance for our project. This creates a `gel.toml` file in our project root.
```sh terminal icon="terminal"
edgedb project init
gel project init
```
```txt
No `edgedb.toml` found in `/Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app` or above
No `gel.toml` found in `/Users/colinmcd94/Documents/bun/fun/examples/my-gel-app` or above
Do you want to initialize a new project? [Y/n]
> Y
Specify the name of EdgeDB instance to use with this project [default: my_edgedb_app]:
> my_edgedb_app
Checking EdgeDB versions...
Specify the version of EdgeDB to use with this project [default: x.y]:
Specify the name of Gel instance to use with this project [default: my_gel_app]:
> my_gel_app
Checking Gel versions...
Specify the version of Gel to use with this project [default: x.y]:
> x.y
┌─────────────────────┬────────────────────────────────────────────────────────────────────────
│ Project directory │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app
│ Project config │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/edgedb.toml
│ Schema dir (empty) │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/dbschema
│ Installation method │ portable package
│ Version │ x.y+6d5921b
│ Instance name │ my_edgedb_app
└─────────────────────┴────────────────────────────────────────────────────────────────────────
┌─────────────────────┬──────────────────────────────────────────────────────────────────┐
│ Project directory │ /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app
│ Project config │ /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app/gel.toml│
│ Schema dir (empty) │ /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app/dbschema│
│ Installation method │ portable package │
│ Version │ x.y+6d5921b │
│ Instance name │ my_gel_app │
└─────────────────────┴──────────────────────────────────────────────────────────────────┘
Version x.y+6d5921b is already downloaded
Initializing EdgeDB instance...
Initializing Gel instance...
Applying migrations...
Everything is up to date. Revision initial
Project initialized.
To connect to my_edgedb_app, run `edgedb`
To connect to my_gel_app, run `gel`
```
---
@@ -70,8 +74,8 @@ To connect to my_edgedb_app, run `edgedb`
To see if the database is running, let's open a REPL and run a simple query.
```sh terminal icon="terminal"
edgedb
edgedb> select 1 + 1;
gel
gel> select 1 + 1;
```
```txt
@@ -81,12 +85,12 @@ edgedb> select 1 + 1;
Then run `\quit` to exit the REPL.
```sh terminal icon="terminal"
edgedb> \quit
gel> \quit
```
---
With the project initialized, we can define a schema. The `edgedb project init` command already created a `dbschema/default.esdl` file to contain our schema.
With the project initialized, we can define a schema. The `gel project init` command already created a `dbschema/default.esdl` file to contain our schema.
```txt File Tree icon="folder-tree"
dbschema
@@ -112,15 +116,15 @@ module default {
Then generate and apply an initial migration.
```sh terminal icon="terminal"
edgedb migration create
gel migration create
```
```txt
Created /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/dbschema/migrations/00001.edgeql, id: m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq
Created /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app/dbschema/migrations/00001.edgeql, id: m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq
```
```sh terminal icon="terminal"
edgedb migrate
gel migrate
```
```txt
@@ -129,11 +133,11 @@ Applied m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq (00001.edgeql)
---
With our schema applied, let's execute some queries using EdgeDB's JavaScript client library. We'll install the client library and EdgeDB's codegen CLI, and create a `seed.ts`.file.
With our schema applied, let's execute some queries using Gel's JavaScript client library. We'll install the client library and Gel's codegen CLI, and create a `seed.ts` file.
```sh terminal icon="terminal"
bun add edgedb
bun add -D @edgedb/generate
bun add gel
bun add -D @gel/generate
touch seed.ts
```
@@ -144,7 +148,7 @@ Paste the following code into `seed.ts`.
The client auto-connects to the database. We insert a couple movies using the `.execute()` method. We will use EdgeQL's `for` expression to turn this bulk insert into a single optimized query.
```ts seed.ts icon="/icons/typescript.svg"
import { createClient } from "edgedb";
import { createClient } from "gel";
const client = createClient();
@@ -184,10 +188,10 @@ Seeding complete.
---
EdgeDB implements a number of code generation tools for TypeScript. To query our newly seeded database in a typesafe way, we'll use `@edgedb/generate` to code-generate the EdgeQL query builder.
Gel implements a number of code generation tools for TypeScript. To query our newly seeded database in a typesafe way, we'll use `@gel/generate` to code-generate the EdgeQL query builder.
```sh terminal icon="terminal"
bunx @edgedb/generate edgeql-js
bunx @gel/generate edgeql-js
```
```txt
@@ -213,7 +217,7 @@ the query builder directory? The following line will be added:
In `index.ts`, we can import the generated query builder from `./dbschema/edgeql-js` and write a simple select query.
```ts index.ts icon="/icons/typescript.svg"
import { createClient } from "edgedb";
import { createClient } from "gel";
import e from "./dbschema/edgeql-js";
const client = createClient();
@@ -254,4 +258,4 @@ bun run index.ts
---
For complete documentation, refer to the [EdgeDB docs](https://www.edgedb.com/docs).
For complete documentation, refer to the [Gel docs](https://docs.geldata.com/).

View File

@@ -89,4 +89,4 @@ Moo!
---
This is a simple introduction to using Mongoose with TypeScript and Bun. As you build your application, refer to the official [MongoDB](https://docs.mongodb.com/) and [Mongoose](https://mongoosejs.com/docs/) sites for complete documentation.
This is a simple introduction to using Mongoose with TypeScript and Bun. As you build your application, refer to the official [MongoDB](https://www.mongodb.com/docs) and [Mongoose](https://mongoosejs.com/docs/) sites for complete documentation.

View File

@@ -20,7 +20,7 @@ bun add -D drizzle-kit
Create a `.env.local` file and add your [Neon Postgres connection string](https://neon.tech/docs/connect/connect-from-any-app) to it.
```txt .env.local icon="settings"
```ini .env.local icon="settings"
DATABASE_URL=postgresql://usertitle:password@ep-adj-noun-guid.us-east-1.aws.neon.tech/neondb?sslmode=require
```
@@ -33,7 +33,7 @@ import { neon } from "@neondatabase/serverless";
import { drizzle } from "drizzle-orm/neon-http";
// Bun automatically loads the DATABASE_URL from .env.local
// Refer to: https://bun.com/docs/runtime/env for more information
// Refer to: https://bun.com/docs/runtime/environment-variables for more information
const sql = neon(process.env.DATABASE_URL!);
export const db = drizzle(sql);

View File

@@ -21,7 +21,7 @@ bun add @neondatabase/serverless
Create a `.env.local` file and add your [Neon Postgres connection string](https://neon.tech/docs/connect/connect-from-any-app) to it.
```sh .env.local icon="settings"
```ini .env.local icon="settings"
DATABASE_URL=postgresql://usertitle:password@ep-adj-noun-guid.us-east-1.aws.neon.tech/neondb?sslmode=require
```
@@ -33,7 +33,7 @@ Paste the following code into your project's `index.ts` file.
import { neon } from "@neondatabase/serverless";
// Bun automatically loads the DATABASE_URL from .env.local
// Refer to: https://bun.com/docs/runtime/env for more information
// Refer to: https://bun.com/docs/runtime/environment-variables for more information
const sql = neon(process.env.DATABASE_URL);
const rows = await sql`SELECT version()`;

View File

@@ -4,54 +4,100 @@ sidebarTitle: Next.js with Bun
mode: center
---
Initialize a Next.js app with `create-next-app`. This will scaffold a new Next.js project and automatically install dependencies.
```sh terminal icon="terminal"
bun create next-app
```
```txt
✔ What is your project named? … my-app
✔ Would you like to use TypeScript with this project? … No / Yes
✔ Would you like to use ESLint with this project? … No / Yes
✔ Would you like to use Tailwind CSS? ... No / Yes
✔ Would you like to use `src/` directory with this project? … No / Yes
✔ Would you like to use App Router? (recommended) ... No / Yes
✔ What import alias would you like configured? … @/*
Creating a new Next.js app in /path/to/my-app.
```
[Next.js](https://nextjs.org/) is a React framework for building full-stack web applications. It supports server-side rendering, static site generation, API routes, and more. Bun provides fast package installation and can run Next.js development and production servers.
---
You can specify a starter template using the `--example` flag.
<Steps>
<Step title="Create a new Next.js app">
Use the interactive CLI to create a new Next.js app. This will scaffold a new Next.js project and automatically install dependencies.
```sh
bun create next-app --example with-supabase
```
```sh terminal icon="terminal"
bun create next-app@latest my-bun-app
```
```txt
✔ What is your project named? … my-app
...
```
</Step>
<Step title="Start the dev server">
Change to the project directory and run the dev server with Bun.
```sh terminal icon="terminal"
cd my-bun-app
bun --bun run dev
```
This starts the Next.js dev server with Bun's runtime.
Open [`http://localhost:3000`](http://localhost:3000) with your browser to see the result. Any changes you make to `app/page.tsx` will be hot-reloaded in the browser.
</Step>
<Step title="Update scripts in package.json">
Modify the scripts field in your `package.json` by prefixing the Next.js CLI commands with `bun --bun`. This ensures that Bun executes the Next.js CLI for common tasks like `dev`, `build`, and `start`.
```json package.json icon="file-json"
{
"scripts": {
"dev": "bun --bun next dev", // [!code ++]
"build": "bun --bun next build", // [!code ++]
"start": "bun --bun next start", // [!code ++]
}
}
```
</Step>
</Steps>
---
To start the dev server with Bun, run `bun --bun run dev` from the project root.
## Hosting
```sh terminal icon="terminal"
cd my-app
bun --bun run dev
```
Next.js applications on Bun can be deployed to various platforms.
<Columns cols={3}>
<Card title="Vercel" href="/guides/deployment/vercel" icon="/icons/ecosystem/vercel.svg">
Deploy on Vercel
</Card>
<Card title="Railway" href="/guides/deployment/railway" icon="/icons/ecosystem/railway.svg">
Deploy on Railway
</Card>
<Card title="DigitalOcean" href="/guides/deployment/digital-ocean" icon="/icons/ecosystem/digitalocean.svg">
Deploy on DigitalOcean
</Card>
<Card title="AWS Lambda" href="/guides/deployment/aws-lambda" icon="/icons/ecosystem/aws.svg">
Deploy on AWS Lambda
</Card>
<Card title="Google Cloud Run" href="/guides/deployment/google-cloud-run" icon="/icons/ecosystem/gcp.svg">
Deploy on Google Cloud Run
</Card>
<Card title="Render" href="/guides/deployment/render" icon="/icons/ecosystem/render.svg">
Deploy on Render
</Card>
</Columns>
---
To run the dev server with Node.js instead, omit `--bun`.
## Templates
```sh terminal icon="terminal"
cd my-app
bun run dev
```
<Columns cols={2}>
<Card
title="Bun + Next.js Basic Starter"
img="/images/templates/bun-nextjs-basic.png"
href="https://github.com/bun-templates/bun-nextjs-basic"
arrow="true"
cta="Go to template"
>
A simple App Router starter with Bun, Next.js, and Tailwind CSS.
</Card>
<Card
title="Todo App with Next.js + Bun"
img="/images/templates/bun-nextjs-todo.png"
href="https://github.com/bun-templates/bun-nextjs-todo"
arrow="true"
cta="Go to template"
>
A full-stack todo application built with Bun, Next.js, and PostgreSQL.
</Card>
</Columns>
---
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result. Any changes you make to `(pages/app)/index.tsx` will be hot-reloaded in the browser.
[→ See Next.js's official documentation](https://nextjs.org/docs) for more information on building and deploying Next.js applications.

View File

@@ -14,7 +14,7 @@ bunx nuxi init my-nuxt-app
✔ Which package manager would you like to use?
bun
◐ Installing dependencies...
bun install v1.3.1 (16b4bf34)
bun install v1.3.3 (16b4bf34)
+ @nuxt/devtools@0.8.2
+ nuxt@3.7.0
785 packages installed [2.67s]
@@ -74,6 +74,12 @@ export default defineNuxtConfig({
});
```
Alternatively, you can set the preset via environment variable:
```sh terminal icon="terminal"
NITRO_PRESET=bun bun run build
```
<Note>
Some packages provide Bun-specific exports that Nitro will not bundle correctly using the default preset. In this
case, you need to use the Bun preset so that the packages work correctly in production builds.

View File

@@ -62,7 +62,7 @@ mode: center
<Step title="Configure database connection">
Set up your Postgres database URL in the `.env` file.
```env .env icon="settings"
```ini .env icon="settings"
DATABASE_URL="postgresql://username:password@localhost:5432/mydb?schema=public"
```
</Step>

View File

@@ -161,4 +161,4 @@ mode: center
---
That's it! Now that you've set up Prisma using Bun, we recommend referring to the [official Prisma docs](https://www.prisma.io/docs/concepts/components/prisma-client) as you continue to develop your application.
That's it! Now that you've set up Prisma using Bun, we recommend referring to the [official Prisma docs](https://www.prisma.io/docs/orm/prisma-client) as you continue to develop your application.

View File

@@ -65,14 +65,14 @@ bun create qwik
│ bun qwik add │
│ │
│ Relevant docs: │
│ https://qwik.builder.io/docs/getting-started/ │
│ https://qwik.dev/docs/getting-started/
│ │
│ Questions? Start the conversation at: │
│ https://qwik.builder.io/chat
│ https://qwik.dev/chat
│ https://twitter.com/QwikDev │
│ │
│ Presentations, Podcasts and Videos: │
│ https://qwik.builder.io/media/
│ https://qwik.dev/media/
│ │
│ Next steps: │
│ cd my-app │
@@ -111,4 +111,4 @@ Open [http://localhost:5173](http://localhost:5173) with your browser to see the
---
Refer to the [Qwik docs](https://qwik.builder.io/docs/getting-started/) for complete documentation.
Refer to the [Qwik docs](https://qwik.dev/docs/getting-started/) for complete documentation.

View File

@@ -20,7 +20,7 @@ bun add @sentry/bun
Then, initialize the Sentry SDK with your Sentry DSN in your app's entry file. You can find your DSN in your Sentry project settings.
```js sentry.ts icon="/icons/typescript.svg"
```ts sentry.ts icon="/icons/typescript.svg"
import * as Sentry from "@sentry/bun";
// Ensure to call this before importing any other modules!
@@ -37,7 +37,7 @@ Sentry.init({
You can verify that Sentry is working by capturing a test error:
```js sentry.ts icon="/icons/typescript.svg"
```ts sentry.ts icon="/icons/typescript.svg"
setTimeout(() => {
try {
foo();

View File

@@ -4,63 +4,59 @@ sidebarTitle: "SolidStart with Bun"
mode: center
---
<Warning>
SolidStart currently relies on Node.js APIs that Bun does not yet implement. The guide below uses Bun to initialize a
project and install dependencies, but it uses Node.js to run the dev server.
</Warning>
---
Initialize a SolidStart app with `create-solid`.
Initialize a SolidStart app with `create-solid`. You can specify the `--solidstart` flag to create a SolidStart project, and `--ts` for TypeScript support. When prompted for a template, select `basic` for a minimal starter app.
```sh terminal icon="terminal"
bun create solid my-app
bun create solid my-app --solidstart --ts
```
```txt
create-solid version 0.2.31
Welcome to the SolidStart setup wizard!
There are definitely bugs and some feature might not work yet.
If you encounter an issue, have a look at
https://github.com/solidjs/solid-start/issues and open a new one,
if it is not already tracked.
✔ Which template do you want to use? todomvc
✔ Server Side Rendering? … yes
✔ Use TypeScript? … yes
cloned solidjs/solid-start#main to /path/to/my-app/.solid-start
✔ Copied project files
Create-Solid v0.6.11
◇ Project Name
│ my-app
◇ Which template would you like to use?
│ basic
◇ Project created 🎉
◇ To get started, run: ─╮
│ │
│ cd my-app │
│ bun install │
│ bun dev │
│ │
├────────────────────────╯
```
---
As instructed by the `create-solid` CLI, let's install our dependencies.
As instructed by the `create-solid` CLI, install the dependencies.
```sh terminal icon="terminal"
cd my-app
bun install
```
---
Then run the development server.
Then run the development server with `bun dev`.
```sh terminal icon="terminal"
bun run dev
# or, equivalently
bunx solid-start dev
bun dev
```
---
```txt
$ vinxi dev
vinxi v0.5.8
vinxi starting dev server
➜ Local: http://localhost:3000/
➜ Network: use --host to expose
```
Open [localhost:3000](http://localhost:3000). Any changes you make to `src/routes/index.tsx` will be hot-reloaded automatically.
<Frame>
![SolidStart demo app](https://github.com/oven-sh/bun/assets/3084745/1e8043c4-49d1-498c-9add-c1eaab6c7167)
</Frame>
---
Refer to the [SolidStart website](https://start.solidjs.com/getting-started/what-is-solidstart) for complete framework documentation.
Refer to the [SolidStart website](https://docs.solidjs.com/solid-start) for complete framework documentation.

View File

@@ -88,7 +88,7 @@ To build for production, you'll need to add the right SvelteKit adapter. Current
Now, make the following changes to your `svelte.config.js`.
```ts svelte.config.js icon="file-code"
```js svelte.config.js icon="file-code"
import adapter from "@sveltejs/adapter-auto"; // [!code --]
import adapter from "svelte-adapter-bun"; // [!code ++]
import { vitePreprocess } from "@sveltejs/vite-plugin-svelte";

View File

@@ -0,0 +1,791 @@
---
title: Use TanStack Start with Bun
sidebarTitle: TanStack Start with Bun
mode: center
---
[TanStack Start](https://tanstack.com/start/latest) is a full-stack framework powered by TanStack Router. It supports full-document SSR, streaming, server functions, bundling and more, powered by TanStack Router and [Vite](https://vite.dev/).
---
<Steps>
<Step title="Create a new TanStack Start app">
Use the interactive CLI to create a new TanStack Start app.
```sh terminal icon="terminal"
bun create @tanstack/start@latest my-tanstack-app
```
</Step>
<Step title="Start the dev server">
Change to the project directory and run the dev server with Bun.
```sh terminal icon="terminal"
cd my-tanstack-app
bun --bun run dev
```
This starts the Vite dev server with Bun.
</Step>
<Step title="Update scripts in package.json">
Modify the scripts field in your `package.json` by prefixing the Vite CLI commands with `bun --bun`. This ensures that Bun executes the Vite CLI for common tasks like `dev`, `build`, and `preview`.
```json package.json icon="file-json"
{
"scripts": {
"dev": "bun --bun vite dev", // [!code ++]
"build": "bun --bun vite build", // [!code ++]
"serve": "bun --bun vite preview" // [!code ++]
}
}
```
</Step>
</Steps>
---
## Hosting
To host your TanStack Start app, you can use [Nitro](https://nitro.build/) or a custom Bun server for production deployments.
<Tabs>
<Tab title="Nitro">
<Steps>
<Step title="Add Nitro to your project">
Add [Nitro](https://nitro.build/) to your project. This tool allows you to deploy your TanStack Start app to different platforms.
```sh terminal icon="terminal"
bun add nitro
```
</Step>
<Step title={<span>Update your <code>vite.config.ts</code> file</span>}>
Update your `vite.config.ts` file to include the necessary plugins for TanStack Start with Bun.
```ts vite.config.ts icon="/icons/typescript.svg"
// other imports...
import { nitro } from "nitro/vite"; // [!code ++]
const config = defineConfig({
plugins: [
tanstackStart(),
nitro({ preset: "bun" }), // [!code ++]
// other plugins...
],
});
export default config;
```
<Note>
The `bun` preset is optional, but it configures the build output specifically for Bun's runtime.
</Note>
</Step>
<Step title="Update the start command">
Make sure `build` and `start` scripts are present in your `package.json` file:
```json package.json icon="file-json"
{
"scripts": {
"build": "bun --bun vite build", // [!code ++]
// The .output files are created by Nitro when you run `bun run build`.
// Not necessary when deploying to Vercel.
"start": "bun run .output/server/index.mjs" // [!code ++]
}
}
```
<Note>
You do **not** need the custom `start` script when deploying to Vercel.
</Note>
</Step>
<Step title="Deploy your app">
Check out one of our guides to deploy your app to a hosting provider.
<Note>
When deploying to Vercel, you can either add `"bunVersion": "1.x"` to your `vercel.json` file or add it to the `nitro` config in your `vite.config.ts` file:
<Warning>
Do **not** use the `bun` Nitro preset when deploying to Vercel.
</Warning>
```ts vite.config.ts icon="/icons/typescript.svg"
export default defineConfig({
plugins: [
tanstackStart(),
nitro({
preset: "bun", // [!code --]
vercel: { // [!code ++]
functions: { // [!code ++]
runtime: "bun1.x", // [!code ++]
}, // [!code ++]
}, // [!code ++]
}),
],
});
```
</Note>
</Step>
</Steps>
</Tab>
<Tab title="Custom Server">
<Note>
This custom server implementation is based on [TanStack's Bun template](https://github.com/TanStack/router/blob/main/examples/react/start-bun/server.ts). It provides fine-grained control over static asset serving, including configurable memory management that preloads small files into memory for fast serving while serving larger files on-demand. This approach is useful when you need precise control over resource usage and asset loading behavior in production deployments.
</Note>
<Steps>
<Step title="Create the production server">
Create a `server.ts` file in your project root with the following custom server implementation:
```ts server.ts icon="/icons/typescript.svg" expandable
/**
* TanStack Start Production Server with Bun
*
* A high-performance production server for TanStack Start applications that
* implements intelligent static asset loading with configurable memory management.
*
* Features:
* - Hybrid loading strategy (preload small files, serve large files on-demand)
* - Configurable file filtering with include/exclude patterns
* - Memory-efficient response generation
* - Production-ready caching headers
*
* Environment Variables:
*
* PORT (number)
* - Server port number
* - Default: 3000
*
* ASSET_PRELOAD_MAX_SIZE (number)
* - Maximum file size in bytes to preload into memory
* - Files larger than this will be served on-demand from disk
* - Default: 5242880 (5MB)
* - Example: ASSET_PRELOAD_MAX_SIZE=5242880 (5MB)
*
* ASSET_PRELOAD_INCLUDE_PATTERNS (string)
* - Comma-separated list of glob patterns for files to include
* - If specified, only matching files are eligible for preloading
* - Patterns are matched against filenames only, not full paths
* - Example: ASSET_PRELOAD_INCLUDE_PATTERNS="*.js,*.css,*.woff2"
*
* ASSET_PRELOAD_EXCLUDE_PATTERNS (string)
* - Comma-separated list of glob patterns for files to exclude
* - Applied after include patterns
* - Patterns are matched against filenames only, not full paths
* - Example: ASSET_PRELOAD_EXCLUDE_PATTERNS="*.map,*.txt"
*
* ASSET_PRELOAD_VERBOSE_LOGGING (boolean)
* - Enable detailed logging of loaded and skipped files
* - Default: false
* - Set to "true" to enable verbose output
*
* ASSET_PRELOAD_ENABLE_ETAG (boolean)
* - Enable ETag generation for preloaded assets
* - Default: true
* - Set to "false" to disable ETag support
*
* ASSET_PRELOAD_ENABLE_GZIP (boolean)
* - Enable Gzip compression for eligible assets
* - Default: true
* - Set to "false" to disable Gzip compression
*
* ASSET_PRELOAD_GZIP_MIN_SIZE (number)
* - Minimum file size in bytes required for Gzip compression
* - Files smaller than this will not be compressed
* - Default: 1024 (1KB)
*
* ASSET_PRELOAD_GZIP_MIME_TYPES (string)
* - Comma-separated list of MIME types eligible for Gzip compression
* - Supports partial matching for types ending with "/"
* - Default: text/,application/javascript,application/json,application/xml,image/svg+xml
*
* Usage:
* bun run server.ts
*/
import path from 'node:path'
// Configuration
const SERVER_PORT = Number(process.env.PORT ?? 3000)
const CLIENT_DIRECTORY = './dist/client'
const SERVER_ENTRY_POINT = './dist/server/server.js'
// Logging utilities for professional output
const log = {
info: (message: string) => {
console.log(`[INFO] ${message}`)
},
success: (message: string) => {
console.log(`[SUCCESS] ${message}`)
},
warning: (message: string) => {
console.log(`[WARNING] ${message}`)
},
error: (message: string) => {
console.log(`[ERROR] ${message}`)
},
header: (message: string) => {
console.log(`\n${message}\n`)
},
}
// Preloading configuration from environment variables
const MAX_PRELOAD_BYTES = Number(
process.env.ASSET_PRELOAD_MAX_SIZE ?? 5 * 1024 * 1024, // 5MB default
)
// Parse comma-separated include patterns (no defaults)
const INCLUDE_PATTERNS = (process.env.ASSET_PRELOAD_INCLUDE_PATTERNS ?? '')
.split(',')
.map((s) => s.trim())
.filter(Boolean)
.map((pattern: string) => convertGlobToRegExp(pattern))
// Parse comma-separated exclude patterns (no defaults)
const EXCLUDE_PATTERNS = (process.env.ASSET_PRELOAD_EXCLUDE_PATTERNS ?? '')
.split(',')
.map((s) => s.trim())
.filter(Boolean)
.map((pattern: string) => convertGlobToRegExp(pattern))
// Verbose logging flag
const VERBOSE = process.env.ASSET_PRELOAD_VERBOSE_LOGGING === 'true'
// Optional ETag feature
const ENABLE_ETAG = (process.env.ASSET_PRELOAD_ENABLE_ETAG ?? 'true') === 'true'
// Optional Gzip feature
const ENABLE_GZIP = (process.env.ASSET_PRELOAD_ENABLE_GZIP ?? 'true') === 'true'
const GZIP_MIN_BYTES = Number(process.env.ASSET_PRELOAD_GZIP_MIN_SIZE ?? 1024) // 1KB
const GZIP_TYPES = (
process.env.ASSET_PRELOAD_GZIP_MIME_TYPES ??
'text/,application/javascript,application/json,application/xml,image/svg+xml'
)
.split(',')
.map((v) => v.trim())
.filter(Boolean)
/**
* Convert a simple glob pattern to a regular expression
* Supports * wildcard for matching any characters
*/
function convertGlobToRegExp(globPattern: string): RegExp {
// Escape regex special chars except *, then replace * with .*
const escapedPattern = globPattern
.replace(/[-/\\^$+?.()|[\]{}]/g, '\\$&')
.replace(/\*/g, '.*')
return new RegExp(`^${escapedPattern}$`, 'i')
}
/**
* Compute ETag for a given data buffer
*/
function computeEtag(data: Uint8Array): string {
const hash = Bun.hash(data)
return `W/"${hash.toString(16)}-${data.byteLength.toString()}"`
}
/**
* Metadata for preloaded static assets
*/
interface AssetMetadata {
route: string
size: number
type: string
}
/**
* In-memory asset with ETag and Gzip support
*/
interface InMemoryAsset {
raw: Uint8Array
gz?: Uint8Array
etag?: string
type: string
immutable: boolean
size: number
}
/**
* Result of static asset preloading process
*/
interface PreloadResult {
routes: Record<string, (req: Request) => Response | Promise<Response>>
loaded: AssetMetadata[]
skipped: AssetMetadata[]
}
/**
* Check if a file is eligible for preloading based on configured patterns
*/
function isFileEligibleForPreloading(relativePath: string): boolean {
const fileName = relativePath.split(/[/\\]/).pop() ?? relativePath
// If include patterns are specified, file must match at least one
if (INCLUDE_PATTERNS.length > 0) {
if (!INCLUDE_PATTERNS.some((pattern) => pattern.test(fileName))) {
return false
}
}
// If exclude patterns are specified, file must not match any
if (EXCLUDE_PATTERNS.some((pattern) => pattern.test(fileName))) {
return false
}
return true
}
/**
* Check if a MIME type is compressible
*/
function isMimeTypeCompressible(mimeType: string): boolean {
return GZIP_TYPES.some((type) =>
type.endsWith('/') ? mimeType.startsWith(type) : mimeType === type,
)
}
/**
* Conditionally compress data based on size and MIME type
*/
function compressDataIfAppropriate(
data: Uint8Array,
mimeType: string,
): Uint8Array | undefined {
if (!ENABLE_GZIP) return undefined
if (data.byteLength < GZIP_MIN_BYTES) return undefined
if (!isMimeTypeCompressible(mimeType)) return undefined
try {
return Bun.gzipSync(data.buffer as ArrayBuffer)
} catch {
return undefined
}
}
/**
* Create response handler function with ETag and Gzip support
*/
function createResponseHandler(
asset: InMemoryAsset,
): (req: Request) => Response {
return (req: Request) => {
const headers: Record<string, string> = {
'Content-Type': asset.type,
'Cache-Control': asset.immutable
? 'public, max-age=31536000, immutable'
: 'public, max-age=3600',
}
if (ENABLE_ETAG && asset.etag) {
const ifNone = req.headers.get('if-none-match')
if (ifNone && ifNone === asset.etag) {
return new Response(null, {
status: 304,
headers: { ETag: asset.etag },
})
}
headers.ETag = asset.etag
}
if (
ENABLE_GZIP &&
asset.gz &&
req.headers.get('accept-encoding')?.includes('gzip')
) {
headers['Content-Encoding'] = 'gzip'
headers['Content-Length'] = String(asset.gz.byteLength)
const gzCopy = new Uint8Array(asset.gz)
return new Response(gzCopy, { status: 200, headers })
}
headers['Content-Length'] = String(asset.raw.byteLength)
const rawCopy = new Uint8Array(asset.raw)
return new Response(rawCopy, { status: 200, headers })
}
}
/**
* Create composite glob pattern from include patterns
*/
function createCompositeGlobPattern(): Bun.Glob {
const raw = (process.env.ASSET_PRELOAD_INCLUDE_PATTERNS ?? '')
.split(',')
.map((s) => s.trim())
.filter(Boolean)
if (raw.length === 0) return new Bun.Glob('**/*')
if (raw.length === 1) return new Bun.Glob(raw[0])
return new Bun.Glob(`{${raw.join(',')}}`)
}
/**
* Initialize static routes with intelligent preloading strategy
* Small files are loaded into memory, large files are served on-demand
*/
async function initializeStaticRoutes(
clientDirectory: string,
): Promise<PreloadResult> {
const routes: Record<string, (req: Request) => Response | Promise<Response>> =
{}
const loaded: AssetMetadata[] = []
const skipped: AssetMetadata[] = []
log.info(`Loading static assets from ${clientDirectory}...`)
if (VERBOSE) {
console.log(
`Max preload size: ${(MAX_PRELOAD_BYTES / 1024 / 1024).toFixed(2)} MB`,
)
if (INCLUDE_PATTERNS.length > 0) {
console.log(
`Include patterns: ${process.env.ASSET_PRELOAD_INCLUDE_PATTERNS ?? ''}`,
)
}
if (EXCLUDE_PATTERNS.length > 0) {
console.log(
`Exclude patterns: ${process.env.ASSET_PRELOAD_EXCLUDE_PATTERNS ?? ''}`,
)
}
}
let totalPreloadedBytes = 0
try {
const glob = createCompositeGlobPattern()
for await (const relativePath of glob.scan({ cwd: clientDirectory })) {
const filepath = path.join(clientDirectory, relativePath)
const route = `/${relativePath.split(path.sep).join(path.posix.sep)}`
try {
// Get file metadata
const file = Bun.file(filepath)
// Skip if file doesn't exist or is empty
if (!(await file.exists()) || file.size === 0) {
continue
}
const metadata: AssetMetadata = {
route,
size: file.size,
type: file.type || 'application/octet-stream',
}
// Determine if file should be preloaded
const matchesPattern = isFileEligibleForPreloading(relativePath)
const withinSizeLimit = file.size <= MAX_PRELOAD_BYTES
if (matchesPattern && withinSizeLimit) {
// Preload small files into memory with ETag and Gzip support
const bytes = new Uint8Array(await file.arrayBuffer())
const gz = compressDataIfAppropriate(bytes, metadata.type)
const etag = ENABLE_ETAG ? computeEtag(bytes) : undefined
const asset: InMemoryAsset = {
raw: bytes,
gz,
etag,
type: metadata.type,
immutable: true,
size: bytes.byteLength,
}
routes[route] = createResponseHandler(asset)
loaded.push({ ...metadata, size: bytes.byteLength })
totalPreloadedBytes += bytes.byteLength
} else {
// Serve large or filtered files on-demand
routes[route] = () => {
const fileOnDemand = Bun.file(filepath)
return new Response(fileOnDemand, {
headers: {
'Content-Type': metadata.type,
'Cache-Control': 'public, max-age=3600',
},
})
}
skipped.push(metadata)
}
} catch (error: unknown) {
// Ignore "is a directory" errors; depending on the error shape this may be reported via `code` or `name`
if (error instanceof Error && error.name !== 'EISDIR' && (error as { code?: string }).code !== 'EISDIR') {
log.error(`Failed to load ${filepath}: ${error.message}`)
}
}
}
// Show detailed file overview only when verbose mode is enabled
if (VERBOSE && (loaded.length > 0 || skipped.length > 0)) {
const allFiles = [...loaded, ...skipped].sort((a, b) =>
a.route.localeCompare(b.route),
)
// Calculate max path length for alignment
const maxPathLength = Math.min(
Math.max(...allFiles.map((f) => f.route.length)),
60,
)
// Format file size with KB and actual gzip size
const formatFileSize = (bytes: number, gzBytes?: number) => {
const kb = bytes / 1024
const sizeStr = kb < 100 ? kb.toFixed(2) : kb.toFixed(1)
if (gzBytes !== undefined) {
const gzKb = gzBytes / 1024
const gzStr = gzKb < 100 ? gzKb.toFixed(2) : gzKb.toFixed(1)
return {
size: sizeStr,
gzip: gzStr,
}
}
// Rough gzip estimation (typically 30-70% compression) if no actual gzip data
const gzipKb = kb * 0.35
return {
size: sizeStr,
gzip: gzipKb < 100 ? gzipKb.toFixed(2) : gzipKb.toFixed(1),
}
}
if (loaded.length > 0) {
console.log('\n📁 Preloaded into memory:')
console.log(
'Path │ Size │ Gzip Size',
)
loaded
.sort((a, b) => a.route.localeCompare(b.route))
.forEach((file) => {
const { size, gzip } = formatFileSize(file.size)
const paddedPath = file.route.padEnd(maxPathLength)
const sizeStr = `${size.padStart(7)} kB`
const gzipStr = `${gzip.padStart(7)} kB`
console.log(`${paddedPath} │ ${sizeStr} │ ${gzipStr}`)
})
}
if (skipped.length > 0) {
console.log('\n💾 Served on-demand:')
console.log(
'Path │ Size │ Gzip Size',
)
skipped
.sort((a, b) => a.route.localeCompare(b.route))
.forEach((file) => {
const { size, gzip } = formatFileSize(file.size)
const paddedPath = file.route.padEnd(maxPathLength)
const sizeStr = `${size.padStart(7)} kB`
const gzipStr = `${gzip.padStart(7)} kB`
console.log(`${paddedPath} │ ${sizeStr} │ ${gzipStr}`)
})
}
}
// Show detailed verbose info if enabled
if (VERBOSE) {
if (loaded.length > 0 || skipped.length > 0) {
const allFiles = [...loaded, ...skipped].sort((a, b) =>
a.route.localeCompare(b.route),
)
console.log('\n📊 Detailed file information:')
console.log(
'Status │ Path │ MIME Type │ Reason',
)
allFiles.forEach((file) => {
const isPreloaded = loaded.includes(file)
const status = isPreloaded ? 'MEMORY' : 'ON-DEMAND'
const reason =
!isPreloaded && file.size > MAX_PRELOAD_BYTES
? 'too large'
: !isPreloaded
? 'filtered'
: 'preloaded'
const route =
file.route.length > 30
? file.route.substring(0, 27) + '...'
: file.route
console.log(
`${status.padEnd(12)} │ ${route.padEnd(30)} │ ${file.type.padEnd(28)} │ ${reason.padEnd(10)}`,
)
})
} else {
console.log('\n📊 No files found to display')
}
}
// Log summary after the file list
console.log() // Empty line for separation
if (loaded.length > 0) {
log.success(
`Preloaded ${String(loaded.length)} files (${(totalPreloadedBytes / 1024 / 1024).toFixed(2)} MB) into memory`,
)
} else {
log.info('No files preloaded into memory')
}
if (skipped.length > 0) {
const tooLarge = skipped.filter((f) => f.size > MAX_PRELOAD_BYTES).length
const filtered = skipped.length - tooLarge
log.info(
`${String(skipped.length)} files will be served on-demand (${String(tooLarge)} too large, ${String(filtered)} filtered)`,
)
}
} catch (error) {
log.error(
`Failed to load static files from ${clientDirectory}: ${String(error)}`,
)
}
return { routes, loaded, skipped }
}
/**
* Initialize the server
*/
async function initializeServer() {
log.header('Starting Production Server')
// Load TanStack Start server handler
let handler: { fetch: (request: Request) => Response | Promise<Response> }
try {
const serverModule = (await import(SERVER_ENTRY_POINT)) as {
default: { fetch: (request: Request) => Response | Promise<Response> }
}
handler = serverModule.default
log.success('TanStack Start application handler initialized')
} catch (error) {
log.error(`Failed to load server handler: ${String(error)}`)
process.exit(1)
}
// Build static routes with intelligent preloading
const { routes } = await initializeStaticRoutes(CLIENT_DIRECTORY)
// Create Bun server
const server = Bun.serve({
port: SERVER_PORT,
routes: {
// Serve static assets (preloaded or on-demand)
...routes,
// Fallback to TanStack Start handler for all other routes
'/*': (req: Request) => {
try {
return handler.fetch(req)
} catch (error) {
log.error(`Server handler error: ${String(error)}`)
return new Response('Internal Server Error', { status: 500 })
}
},
},
// Global error handler
error(error) {
log.error(
`Uncaught server error: ${error instanceof Error ? error.message : String(error)}`,
)
return new Response('Internal Server Error', { status: 500 })
},
})
log.success(`Server listening on http://localhost:${String(server.port)}`)
}
// Initialize the server
initializeServer().catch((error: unknown) => {
log.error(`Failed to start server: ${String(error)}`)
process.exit(1)
})
```
</Step>
<Step title="Update package.json scripts">
Add a `start` script to run the custom server:
```json package.json icon="file-json"
{
"scripts": {
"build": "bun --bun vite build",
"start": "bun run server.ts" // [!code ++]
}
}
```
</Step>
<Step title="Build and run">
Build your application and start the server:
```sh terminal icon="terminal"
bun run build
bun run start
```
The server will start on port 3000 by default (configurable via the `PORT` environment variable).
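The environment variables documented at the top of `server.ts` let you tune how assets are preloaded and served. For example, to run on a different port with a larger per-file preload budget and verbose logging (the values here are only illustrative):
```sh terminal icon="terminal"
# 10485760 bytes = 10 MB maximum size for files preloaded into memory
PORT=8080 ASSET_PRELOAD_MAX_SIZE=10485760 ASSET_PRELOAD_VERBOSE_LOGGING=true bun run start
```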
</Step>
</Steps>
</Tab>
</Tabs>
<Columns cols={3}>
<Card title="Vercel" href="/guides/deployment/vercel" icon="/icons/ecosystem/vercel.svg">
Deploy on Vercel
</Card>
<Card title="Render" href="/guides/deployment/render" icon="/icons/ecosystem/render.svg">
Deploy on Render
</Card>
<Card title="Railway" href="/guides/deployment/railway" icon="/icons/ecosystem/railway.svg">
Deploy on Railway
</Card>
<Card title="DigitalOcean" href="/guides/deployment/digital-ocean" icon="/icons/ecosystem/digitalocean.svg">
Deploy on DigitalOcean
</Card>
<Card title="AWS Lambda" href="/guides/deployment/aws-lambda" icon="/icons/ecosystem/aws.svg">
Deploy on AWS Lambda
</Card>
<Card title="Google Cloud Run" href="/guides/deployment/google-cloud-run" icon="/icons/ecosystem/gcp.svg">
Deploy on Google Cloud Run
</Card>
</Columns>
---
## Templates
<Columns cols={2}>
<Card
title="Todo App with Tanstack + Bun"
img="/images/templates/bun-tanstack-todo.png"
href="https://github.com/bun-templates/bun-tanstack-todo"
arrow="true"
cta="Go to template"
>
A Todo application built with Bun, TanStack Start, and PostgreSQL.
</Card>
<Card
title="Bun + TanStack Start Application"
img="/images/templates/bun-tanstack-basic.png"
href="https://github.com/bun-templates/bun-tanstack-basic"
arrow="true"
cta="Go to template"
>
A TanStack Start template using Bun with SSR and file-based routing.
</Card>
<Card
title="Basic Bun + Tanstack Starter"
img="/images/templates/bun-tanstack-start.png"
href="https://github.com/bun-templates/bun-tanstack-start"
arrow="true"
cta="Go to template"
>
The basic TanStack starter using the Bun runtime and Bun's file APIs.
</Card>
</Columns>
---
[→ See TanStack Start's official documentation](https://tanstack.com/start/latest/docs/framework/react/guide/hosting) for more information on hosting.


@@ -0,0 +1,87 @@
---
title: Bun Redis with Upstash
sidebarTitle: Upstash with Bun
mode: center
---
[Upstash](https://upstash.com/) is a fully managed Redis database as a service. Upstash works with the Redis® API, which means you can use Bun's native Redis client to connect to your Upstash database.
<Note>TLS is enabled by default for all Upstash Redis databases.</Note>
---
<Steps>
<Step title="Create a new project">
Create a new project by running `bun init`:
```sh terminal icon="terminal"
bun init bun-upstash-redis
cd bun-upstash-redis
```
</Step>
<Step title="Create an Upstash Redis database">
Go to the [Upstash dashboard](https://console.upstash.com/) and create a new Redis database. After completing the [getting started guide](https://upstash.com/docs/redis/overall/getstarted), you'll see your database page with connection information.
The database page displays two connection methods: HTTP (REST) and TLS. For Bun's Redis client, you need the **TLS** connection details; this URL starts with `rediss://`.
<Frame>
![Upstash Redis database page](/images/guides/upstash-1.png)
</Frame>
</Step>
<Step title="Connect using Bun's Redis client">
You can connect to Upstash with Bun's default `redis` client just by setting an environment variable.
Set the `REDIS_URL` environment variable in your `.env` file using the Redis endpoint (not the REST URL):
```ini .env icon="settings"
REDIS_URL=rediss://********@********.upstash.io:6379
```
Bun's Redis client reads connection information from `REDIS_URL` by default:
```ts index.ts icon="/icons/typescript.svg"
import { redis } from "bun";
// Reads from process.env.REDIS_URL automatically
await redis.set("counter", "0"); // [!code ++]
```
Alternatively, you can create a custom client using `RedisClient`:
```ts index.ts icon="/icons/typescript.svg"
import { RedisClient } from "bun";
const redis = new RedisClient(process.env.REDIS_URL); // [!code ++]
```
</Step>
<Step title="Use the Redis client">
You can now use the Redis client to interact with your Upstash Redis database:
```ts index.ts icon="/icons/typescript.svg"
import { redis } from "bun";
// Get a value
let counter = await redis.get("counter");
// Set a value if it doesn't exist
if (!counter) {
await redis.set("counter", "0");
}
// Increment the counter
await redis.incr("counter");
// Get the updated value
counter = await redis.get("counter");
console.log(counter);
```
```txt
1
```
The Redis client manages its connection automatically in the background, so you don't need to connect or disconnect manually for basic operations.
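If you do want to control the connection lifecycle yourself, for example in a short-lived script, a minimal sketch using `RedisClient`'s `connect()` and `close()` methods looks like this:
```ts index.ts icon="/icons/typescript.svg"
import { RedisClient } from "bun";

const client = new RedisClient(process.env.REDIS_URL);

// Open the connection up front instead of lazily on the first command
await client.connect();
await client.set("counter", "0");
console.log(await client.get("counter"));

// Release the connection once the script is done
client.close();
```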
</Step>
</Steps>


@@ -74,4 +74,4 @@ bunx --bun vite build
---
-This is a stripped down guide to get you started with Vite + Bun. For more information, see the [Vite documentation](https://vitejs.dev/guide/).
+This is a stripped down guide to get you started with Vite + Bun. For more information, see the [Vite documentation](https://vite.dev/guide/).
