Compare commits

...

160 Commits

Author SHA1 Message Date
Claude Bot
2185505c40 fix(cache): handle stale file descriptors in readFileWithAllocator
When rebundling without a DevServer (e.g., after server.reload() in
development mode), cached file descriptors may become invalid because
they were closed by a previous bundle operation.

This fix:
1. Catches seekTo errors on cached file descriptors
2. Falls back to reopening the file when seek fails
3. Tracks whether we opened a new file to ensure proper cleanup

Fixes #26075

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:25:09 +00:00
Jarred Sumner
fbd800551b Bump 2026-01-13 15:06:36 -08:00
robobun
113cdd9648 fix(completions): add update command to Fish completions (#25978)
## Summary

- Add the `update` subcommand to Fish shell completions
- Apply the install/add/remove flags (--global, --dry-run, --force,
etc.) to the `update` command

Previously, Fish shell autocompletion for `bun update --gl<TAB>` would
not work because:
1. The `update` command was missing from the list of built-in commands
2. The install/add/remove flags were not being applied to `update`

Fixes #25953

## Test plan

- [x] Verify `update` appears in subcommand completions (`bun <TAB>`)
- [x] Verify `--global` flag completion works (`bun update --gl<TAB>`)
- [x] Verify other install flags work with update (--dry-run, --force,
etc.)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 15:05:00 -08:00
robobun
3196178fa7 fix(timers): add _idleStart property to Timeout object (#26021)
## Summary

- Add `_idleStart` property (getter/setter) to the Timeout object
returned by `setTimeout()` and `setInterval()`
- The property returns a monotonic timestamp (in milliseconds)
representing when the timer was created
- This mimics Node.js's behavior where `_idleStart` is the libuv
timestamp at timer creation time

## Test plan

- [x] Verified test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25639.test.ts`
- [x] Verified test passes with `bun bd test
test/regression/issue/25639.test.ts`
- [x] Manual verification:
  ```bash
  # Bun with fix - _idleStart exists
./build/debug/bun-debug -e "const t = setTimeout(() => {}, 0);
console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number
  
  # Node.js reference - same behavior
node -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in
t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number
  ```

Closes #25639

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 19:35:11 -08:00
robobun
d530ed993d fix(css): restore handler context after minifying nested rules (#25997)
## Summary
- Fixes handler context not being restored after minifying nested CSS
rules
- Adds regression test for the issue

## Test plan
- [x] Test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25794.test.ts`
- [x] Test passes with `bun bd test test/regression/issue/25794.test.ts`

Fixes #25794

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 14:55:27 -08:00
Dylan Conway
959169dfaf feat(archive): change API to constructor-based with S3 support (#25940)
## Summary
- Change Archive API from `Bun.Archive.from(data)` to `new
Bun.Archive(data, options?)`
- Change compression options from `{ gzip: true }` to `{ compress:
"gzip", level?: number }`
- Default to no compression when no options provided
- Use `{ compress: "gzip" }` to enable gzip compression (level 6 by
default)
- Add Archive support for S3 and local file writes via `Bun.write()`

## New API

```typescript
// Create archive - defaults to uncompressed tar
const archive = new Bun.Archive({
  "hello.txt": "Hello, World!",
  "data.json": JSON.stringify({ foo: "bar" }),
});

// Enable gzip compression
const compressed = new Bun.Archive(files, { compress: "gzip" });

// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive(files, { compress: "gzip", level: 12 });

// Write to local file
await Bun.write("archive.tar", archive);           // uncompressed by default
await Bun.write("archive.tar.gz", compressed);     // gzipped

// Write to S3
await client.write("archive.tar.gz", compressed);          // S3Client.write()
await Bun.write("s3://bucket/archive.tar.gz", compressed); // S3 URL
await s3File.write(compressed);                            // s3File.write()

// Get bytes/blob (uses compression setting from constructor)
const bytes = await archive.bytes();
const blob = await archive.blob();
```

## TypeScript Types

```typescript
type ArchiveCompression = "gzip";

type ArchiveOptions = {
  compress?: "gzip";
  level?: number;  // 1-12, default 6 when gzip enabled
};
```

## Test plan
- [x] 98 archive tests pass
- [x] S3 integration tests updated to new API
- [x] TypeScript types updated
- [x] Documentation updated with new examples

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-12 14:54:21 -08:00
SUZUKI Sosuke
461ad886bd fix(http): fix Strong reference leak in server response streaming (#25965)
## Summary

Fix a memory leak in `RequestContext.doRenderWithBody()` where
`Strong.Impl` memory was leaked when proxying streaming responses
through Bun's HTTP server.

## Problem

When a streaming response (e.g., from a proxied fetch request) was
forwarded through Bun's server:

1. `response_body_readable_stream_ref` was initialized at line 1836
(from `lock.readable`) or line 1841 (via `Strong.init()`)
2. For `.Bytes` streams with `has_received_last_chunk=false`, a **new**
Strong reference was created at line 1902
3. The old Strong reference was **never deinit'd**, causing
`Strong.Impl` memory to leak

This leak accumulated over time with every streaming response proxied
through the server.

## Solution

Add `this.response_body_readable_stream_ref.deinit()` before creating
the new Strong reference. This is safe because:

- `stream` exists as a stack-local variable
- JSC's conservative GC tracks stack-local JSValues
- No GC can occur between consecutive synchronous Zig statements
- Therefore, `stream` won't be collected between `deinit()` and
`Strong.init()`

## Test

Added `test/js/web/fetch/server-response-stream-leak.test.ts` which:
- Creates a backend server that returns delayed streaming responses
- Creates a proxy server that forwards the streaming responses
- Makes 200 requests and checks that ReadableStream objects don't
accumulate
- Fails on system Bun v1.3.5 (202 leaked), passes with the fix
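A minimal sketch of that proxied-streaming pattern (server setup is illustrative, not the actual test file):

```ts
// Backend: returns a delayed streaming response.
const backend = Bun.serve({
  port: 0,
  fetch() {
    return new Response(
      new ReadableStream({
        async start(controller) {
          controller.enqueue(new TextEncoder().encode("chunk"));
          await Bun.sleep(5);
          controller.close();
        },
      }),
    );
  },
});

// Proxy: forwards the streaming body, the code path that leaked Strong refs.
const proxy = Bun.serve({
  port: 0,
  fetch: () => fetch(`http://localhost:${backend.port}/`),
});

for (let i = 0; i < 200; i++) {
  await (await fetch(`http://localhost:${proxy.port}/`)).text();
}
```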

## Related

Similar to the Strong reference leak fixes in:
- #23313 (fetch memory leak)
- #25846 (fetch cyclic reference leak)
2026-01-12 14:41:58 -08:00
Markus Schmidt
b6abbd50a0 fix(Bun.SQL): handle binary columns in MySQL correctly (#26011)
## What does this PR do?
Currently binary columns are returned as strings, which means they get
corrupted when encoded as UTF-8. This PR returns binary columns as
Buffers, which is what users actually expect and is also consistent with
PostgreSQL and SQLite.
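
A short sketch of the corrected behavior, assuming a MySQL connection via Bun.SQL (connection string and table are illustrative):

```ts
import { SQL } from "bun";

const sql = new SQL("mysql://user:pass@localhost:3306/mydb");

// With this fix, VARBINARY/BLOB columns come back as Buffers, not
// UTF-8-mangled strings, matching the PostgreSQL and SQLite drivers.
const [row] = await sql`SELECT payload FROM blobs WHERE id = 1`;
console.log(Buffer.isBuffer(row.payload)); // true
```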
### How did you verify your code works?
I added tests to verify the correct behavior. Before there were no tests
for binary columns at all.

This fixes #23991
2026-01-12 11:56:02 -08:00
Alex Miller
beccd01647 fix(FileSink): add Promise<number> to FileSink.write() return type (#25962)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2026-01-11 12:51:16 -08:00
github-actions[bot]
35eb53994a deps: update sqlite to 3.51.200 (#25957)
## What does this PR do?

Updates SQLite to version 3.51.200

Compare: https://sqlite.org/src/vdiff?from=3.51.1&to=3.51.200

Auto-updated by [this
workflow](https://github.com/oven-sh/bun/actions/workflows/update-sqlite3.yml)

Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2026-01-10 22:46:17 -08:00
robobun
ebf39e9811 fix(install): prevent use-after-free when retrying failed HTTP requests (#25949) 2026-01-10 18:03:24 -08:00
Ciro Spaciari
b610e80ee0 fix(http): properly handle pipelined data in CONNECT requests (#25938)
Fixes #25862

### What does this PR do?

When a client sends pipelined data immediately after CONNECT request
headers in the same TCP segment, Bun now properly delivers this data to
the `head` parameter of the 'connect' event handler, matching Node.js
behavior.

This enables compatibility with Cap'n Proto's KJ HTTP library used by
Cloudflare's workerd runtime, which pipelines RPC data after CONNECT.
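
A hedged sketch of what the `head` parameter now receives (the handler shape follows the standard node:http API; the server itself is illustrative):

```ts
import { createServer } from "node:http";

const server = createServer();
server.on("connect", (req, clientSocket, head) => {
  // With this fix, bytes pipelined in the same TCP segment as the
  // CONNECT headers show up in `head` instead of being dropped.
  clientSocket.write("HTTP/1.1 200 Connection Established\r\n\r\n");
  if (head.length > 0) {
    console.log("pipelined bytes:", head.toString());
  }
});
server.listen(0);
```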

### How did you verify your code works?
<img width="694" height="612" alt="CleanShot 2026-01-09 at 15 30 22@2x"
src="https://github.com/user-attachments/assets/3ffe840e-1792-429c-8303-d98ac3e6912a"
/>

Tests

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-09 19:08:02 -08:00
robobun
7076a49bb1 feat(archive): add TypeScript types, docs, and files() benchmark (#25922)
## Summary

- Add comprehensive TypeScript type definitions for `Bun.Archive` in
`bun.d.ts`
  - `ArchiveInput` and `ArchiveCompression` types
- Full JSDoc documentation with examples for all methods (`from`,
`write`, `extract`, `blob`, `bytes`, `files`)
- Add documentation page at `docs/runtime/archive.mdx`
  - Quickstart examples
  - Creating and extracting archives
  - `files()` method with glob filtering
  - Compression support
  - Full API reference section
- Add Archive to docs sidebar under "Data & Storage"
- Add `files()` benchmark comparing `Bun.Archive.files()` vs node-tar
  - Shows ~7x speedup for reading archive contents into memory (59µs vs 434µs)

## Test plan

- [x] TypeScript types compile correctly
- [x] Documentation renders properly in Mintlify format
- [x] Benchmark runs successfully and shows performance comparison
- [x] Verified `files()` method works correctly with both Bun.Archive
and node-tar

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-09 19:00:19 -08:00
robobun
d4a966f8ae fix(install): prevent symlink path traversal in tarball extraction (#25584)
## Summary

- Fixes a path traversal vulnerability via symlink when installing
GitHub packages
- Validates symlink targets before creation to ensure they stay within
the extraction directory
- Rejects absolute symlinks and relative paths that would escape the
extraction directory

## Details

When extracting GitHub tarballs, Bun did not validate symlink targets. A
malicious tarball could:
1. Create a symlink pointing outside the extraction directory (e.g.,
`../../../../../../../tmp`)
2. Include a file entry through that symlink path (e.g.,
`symlink-to-tmp/pwned.txt`)

When extracted, the symlink would be created first, then the file would
be written through it, ending up outside the intended package directory
(e.g., `/tmp/pwned.txt`).

### The Fix

Added `isSymlinkTargetSafe()` function that:
1. Rejects absolute symlink targets (starting with `/`)
2. Normalizes the combined path (symlink location + target) and rejects
if the result starts with `..` (would escape)
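
The actual fix is in Zig; here is a minimal TypeScript sketch of the same idea (function shape and names are illustrative):

```ts
import { normalize, dirname, join, isAbsolute } from "node:path/posix";

// A symlink at `linkPath` (relative to the extraction root) pointing at
// `target` is safe only if the resolved location stays inside the root.
function isSymlinkTargetSafe(linkPath: string, target: string): boolean {
  if (isAbsolute(target)) return false; // reject absolute targets
  const resolved = normalize(join(dirname(linkPath), target));
  return !(resolved === ".." || resolved.startsWith("../"));
}

isSymlinkTargetSafe("a/b/link", "../../..");       // false: escapes the root
isSymlinkTargetSafe("a/b/link", "../sibling.txt"); // true: stays inside
```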

## Test plan

- [x] Added regression test
`test/cli/install/symlink-path-traversal.test.ts`
- [x] Tests verify relative path traversal symlinks are blocked
- [x] Tests verify absolute symlink targets are blocked  
- [x] Tests verify safe relative symlinks within the package still work
- [x] Verified test fails with system bun (vulnerable) and passes with
debug build (fixed)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2026-01-09 16:56:31 -08:00
Dylan Conway
7704dca660 feat(build): add --compile-executable-path CLI flag (#25934)
## Summary

Adds a new CLI flag `--compile-executable-path` that allows specifying a
custom Bun executable path for cross-compilation instead of downloading
from the npm registry.

## Usage

```bash
bun build --compile --target=bun-linux-x64 \
  --compile-executable-path=/path/to/bun-linux-x64 app.ts
```

## Motivation

The `executablePath` option was already available in the JavaScript
`Bun.build()` API. This exposes the same functionality from the CLI.

## Changes

- Added `--compile-executable-path <STR>` CLI parameter in
`src/cli/Arguments.zig`
- Added `compile_executable_path` field to `BundlerOptions` in
`src/cli.zig`
- Wired the option through to `StandaloneModuleGraph.toExecutable()` in
`src/cli/build_command.zig`

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-09 16:21:40 -08:00
Jarred Sumner
1e0f51ddcc Revert "feat(shell): add $.trace for analyzing shell commands without execution (#25667)"
This reverts commit 6b5de25d8a.
2026-01-09 16:20:18 -08:00
Ciro Spaciari
32a76904fe remove agent in global WebSocket, add agent support in ws module (#25935)
### What does this PR do?
Removes `agent` from the global WebSocket (in Node.js it uses a
dispatcher, not an agent) and adds `agent` support to the `ws` module
(which actually uses an agent).
### How did you verify your code works?
Tests
2026-01-09 16:18:47 -08:00
Jarred Sumner
367eeb308e Fixes #23177 (#25936)
### What does this PR do?

Fixes #23177

### How did you verify your code works?
2026-01-09 15:14:00 -08:00
freya
1879b7eeca Note that Windows API handles should be u64, not ptr (#25886)
### What does this PR do?

Update FFI documentation with a note on Windows API handle values, as
pointer encoding to double causes intermittent failures with some
classes of handles.

Putting it in this accordion feels less than ideal but it's also a very
specific use case.

The specific issue I experienced: HDCs and HBITMAPs are basically 32
bits, although they are typedef'd in the headers to HANDLE. They are
returned from and passed to the GDI APIs as sign-extended versions of
the underlying 32-bit value, and so when going through the ptr <->
double pathway of bun FFI, the bottom 11 bits of those values are lost
if the original value had bit 31 set, and subsequent calls will fail.
Probably this is fixable by correctly encoding 'negative' pointers in
the double representation, and I might tackle that if I find time.
2026-01-09 14:04:14 -08:00
robobun
70fa6af355 feat: add Bun.Archive API for creating and extracting tarballs (#25665)
## Summary

- Adds new `Bun.Archive` API for working with tar archives
- `Bun.Archive.from(data)` - Create archive from object, Blob,
TypedArray, or ArrayBuffer
- `Bun.Archive.write(path, data, compress?)` - Write archive to disk
(async)
- `archive.extract(path)` - Extract to directory, returns
`Promise<number>` (file count)
- `archive.blob(compress?)` - Get archive as Blob (async)
- `archive.bytes(compress?)` - Get archive as Uint8Array (async)
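
Putting those signatures together (argument shapes are assumptions based on the list above; this API was later reworked in #25940):

```ts
// Create an archive from a map of path -> contents.
const archive = Bun.Archive.from({
  "hello.txt": "Hello, World!",
  "data.json": JSON.stringify({ ok: true }),
});

// Extract to a directory; resolves with the number of files written.
const count = await archive.extract("./out");

// Get the bytes, optionally gzip-compressed.
const bytes = await archive.bytes("gzip");
console.log(count, bytes.byteLength);
```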

Key implementation details:
- Uses existing libarchive bindings for tarball creation/extraction via
`extractToDisk`
- Uses libdeflate for gzip compression
- Immediate byte copying for GC safety (no JSValue protection, no
`hasPendingActivity`)
- Async operations run on worker pool threads with proper VM reference
handling
- Growing memory buffer via `archive_write_open2` callbacks for
efficient tarball creation

## Test plan

- [x] 65 comprehensive tests covering:
  - Normal operations (create, extract, blob, bytes, write)
  - GC safety (unreferenced archives, mutation isolation)  
  - Error handling (invalid args, corrupted data, I/O errors)
- Edge cases (large files, many files, special characters, path
normalization)
  - Concurrent operations

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-09 00:33:35 -08:00
robobun
eb5b498c62 fix(test): make fake timers work with testing-library/react (#25915)
## Summary

Fixes #25869

Two fixes to enable `jest.useFakeTimers()` to work with
`@testing-library/react` and `@testing-library/user-event`:

- Set `setTimeout.clock = true` when fake timers are enabled.
testing-library/react's `jestFakeTimersAreEnabled()` checks for this
property to determine if `jest.advanceTimersByTime()` should be called
when draining the microtask queue. Without this, testing-library never
advances timers.

- Make `advanceTimersByTime(0)` fire `setTimeout(fn, 0)` timers.
`setTimeout(fn, 0)` is internally scheduled with a 1ms delay per HTML
spec. Jest/testing-library expect `advanceTimersByTime(0)` to fire such
"immediate" timers, but we were advancing by 0ms so they never fired.

## Test plan

- [x] All 30 existing fake timer tests pass
- [x] New regression test validates both fixes
- [x] Original user-event reproduction now works (test completes instead
of hanging)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-08 20:25:35 -08:00
Dylan Conway
596e83c918 fix: correct logic bugs in libarchive, s3 credentials, and postgres bindings (#25905)
## Summary

- **libarchive.zig:110**: Fix self-assignment bug where `this.pos` was
assigned to itself instead of `new_pos`
- **s3/credentials.zig:165,176,199**: Fix impossible range checks -
`and` should be `or` for pageSize, partSize, and retry validation (a
value cannot be both less than MIN and greater than MAX simultaneously)
- **postgres.zig:14**: Fix copy-paste error where createConnection
function was internally named "createQuery"

## Test plan

- [ ] Verify S3 credential validation now properly rejects out-of-range
values for pageSize, partSize, and retry
- [ ] Verify libarchive seek operations work correctly
- [ ] Verify postgres createConnection function has correct internal
name

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
2026-01-08 19:46:06 -08:00
SUZUKI Sosuke
3842a5ee18 Fix stack precommit crash on Windows (#25891)
### What does this PR do?

Attempt to fix stack precommit crash on Windows

https://github.com/oven-sh/WebKit/pull/128

### How did you verify your code works?
2026-01-08 18:12:51 -08:00
robobun
50daf5df27 fix(io): respect mode option when copying files with Bun.write() (#25906)
## Summary
- Fixes #25903 - `Bun.write()` mode option ignored when copying from
`Bun.file()`
- The destination file now correctly uses the specified `mode` option
instead of default permissions
- Works on Linux (via open flags), macOS (chmod after clonefile), and
Windows (chmod after copyfile)
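
A hedged usage sketch (the `mode` option follows the PR description; paths are illustrative):

```ts
// Copy src.sh to dest.sh with an explicit executable mode.
// Previously `mode` was ignored on the BunFile-to-BunFile copy path.
await Bun.write(Bun.file("dest.sh"), Bun.file("src.sh"), { mode: 0o755 });
```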

## Test plan
- [x] Added regression test in `test/regression/issue/25903.test.ts`
- [x] Test passes with `bun bd test test/regression/issue/25903.test.ts`
- [x] Test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25903.test.ts` (verifies the bug exists)

## Changes
- `src/bun.js/webcore/Blob.zig`: Add `mode` field to `WriteFileOptions`
and parse from options
- `src/bun.js/webcore/blob/copy_file.zig`: Use `destination_mode` in
`CopyFile` struct and `doOpenFile`
- `packages/bun-types/bun.d.ts`: Add `mode` option to BunFile copy
overloads

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-08 17:51:42 -08:00
Ciro Spaciari
c90c0e69cb feat(websocket): add HTTP/HTTPS proxy support (#25614)
## Summary

Add `proxy` option to WebSocket constructor for connecting through HTTP
CONNECT proxies.

### Features
- Support for `ws://` and `wss://` through HTTP proxies
- Support for `ws://` and `wss://` through HTTPS proxies (with
`rejectUnauthorized: false`)
- Proxy authentication via URL credentials (Basic auth)
- Custom proxy headers support
- Full TLS options (`ca`, `cert`, `key`, etc.) for target connections
using `SSLConfig.fromJS`

### API

```javascript
// String format
new WebSocket("wss://example.com", { proxy: "http://proxy:8080" })

// With credentials
new WebSocket("wss://example.com", { proxy: "http://user:pass@proxy:8080" })

// Object format with custom headers
new WebSocket("wss://example.com", {
  proxy: { url: "http://proxy:8080", headers: { "X-Custom": "value" } }
})

// HTTPS proxy
new WebSocket("ws://example.com", {
  proxy: "https://proxy:8443",
  tls: { rejectUnauthorized: false }
})
```

### Implementation

| File | Changes |
|------|---------|
| `WebSocketUpgradeClient.zig` | Proxy state machine and CONNECT handling |
| `WebSocketProxyTunnel.zig` | **New** - TLS tunnel inside CONNECT for wss:// through HTTP proxy |
| `JSWebSocket.cpp` | Parse proxy option and TLS options using `SSLConfig.fromJS` |
| `WebSocket.cpp` | Pass proxy parameters to Zig, handle HTTPS proxy socket selection |
| `bun.d.ts` | Add `proxy` and full TLS options to WebSocket types |

### Supported Scenarios

| Scenario | Status |
|----------|--------|
| ws:// through HTTP proxy | Working |
| wss:// through HTTP proxy | Working (TLS tunnel) |
| ws:// through HTTPS proxy | Working (with `rejectUnauthorized: false`) |
| wss:// through HTTPS proxy | Working (with `rejectUnauthorized: false`) |
| Proxy authentication (Basic) | Working |
| Custom proxy headers | Working |
| Custom CA for HTTPS proxy | Working |

## Test plan

- [x] API tests verify proxy option is accepted in various formats
- [x] Functional tests with local HTTP CONNECT proxy server
- [x] Proxy authentication tests (Basic auth)
- [x] HTTPS proxy tests with `rejectUnauthorized: false`
- [x] Error handling tests (auth failures, wrong credentials)

Run tests: `bun test test/js/web/websocket/websocket-proxy.test.ts`

## Changelog

- Added `proxy` option to `WebSocket` constructor for HTTP/HTTPS proxy
support
- Added full TLS options (`ca`, `cert`, `key`, `passphrase`, etc.) to
`WebSocket` constructor

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-08 16:21:34 -08:00
robobun
24b97994e3 feat(bundler): add files option for in-memory bundling (#25852)
## Summary

Add support for in-memory entrypoints and files in `Bun.build` via the
`files` option:

```ts
await Bun.build({
  entrypoints: ["/app/index.ts"],
  files: {
    "/app/index.ts": `
      import { greet } from "./greet.ts";
      console.log(greet("World"));
    `,
    "/app/greet.ts": `
      export function greet(name: string) {
        return "Hello, " + name + "!";
      }
    `,
  },
});
```

### Features

- **Bundle entirely from memory**: No files on disk needed
- **Override files on disk**: In-memory files take priority over disk
files
- **Mix disk and virtual files**: Real files can import virtual files
and vice versa
- **Multiple content types**: Supports `string`, `Blob`, `TypedArray`,
and `ArrayBuffer`

### Use Cases

- Code generation at build time
- Injecting build-time constants
- Testing with mock modules
- Bundling dynamically generated code
- Overriding configuration files for different environments

### Implementation Details

- Added `FileMap` struct in `JSBundler.zig` with `resolve`, `get`,
`contains`, `fromJS`, and `deinit` methods
- Uses `"memory"` namespace to avoid `pathWithPrettyInitialized`
allocation issues during linking phase
- FileMap checks added in:
  - `runResolver` (entry point resolution)
  - `runResolutionForParseTask` (import resolution)
  - `enqueueEntryPoints` (entry point handling)
  - `getCodeForParseTaskWithoutPlugins` (file content reading)
- Root directory defaults to cwd when all entrypoints are in the FileMap
- Added TypeScript types with JSDoc documentation
- Added bundler documentation with examples

## Test plan

- [x] Basic in-memory file bundling
- [x] In-memory files with absolute imports
- [x] In-memory files with relative imports (same dir, subdirs, parent
dirs)
- [x] Nested/chained imports between in-memory files
- [x] TypeScript and JSX support
- [x] Blob, Uint8Array, and ArrayBuffer content types
- [x] Re-exports and default exports
- [x] In-memory file overrides real file on disk
- [x] Real file on disk imports in-memory file via relative path
- [x] Mixed disk and memory files with complex import graphs

Run tests with: `bun bd test test/bundler/bundler_files.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2026-01-08 15:05:41 -08:00
Jarred Sumner
dda9a9b0fd Increase shard count for Windows & Linux test runs (#25913)
### What does this PR do?

### How did you verify your code works?
2026-01-08 14:38:35 -08:00
robobun
eeef013365 Add Bun.JSONC API for parsing JSON with comments and trailing commas (#22115)
## Summary

This PR implements a new `Bun.JSONC.parse()` API that allows parsing
JSONC (JSON with Comments) files. It addresses the feature request from
issue #16257 by providing a native API for parsing JSON with comments
and trailing commas.

The implementation follows the same pattern as `Bun.YAML` and
`Bun.TOML`, leveraging the existing `TSConfigParser` which already
handles JSONC parsing internally.

## Features

- **Parse JSON with comments**: Supports both `//` single-line and `/*
*/` block comments
- **Handle trailing commas**: Works with trailing commas in objects and
arrays
- **Full JavaScript object conversion**: Returns native JavaScript
objects/arrays
- **Error handling**: Proper error throwing for invalid JSON
- **TypeScript compatibility**: Works with TypeScript config files and
other JSONC formats

## Usage Example

```javascript
const result = Bun.JSONC.parse(`{
  // This is a comment
  "name": "my-app",
  "version": "1.0.0", // trailing comma is allowed
  "dependencies": {
    "react": "^18.0.0",
  },
}`);
// Returns native JavaScript object
```

## Implementation Details

- Created `JSONCObject.zig` following the same pattern as
`YAMLObject.zig` and `TOMLObject.zig`
- Uses the existing `TSConfigParser` from `json.zig` which already
handles comments and trailing commas
- Added proper C++ bindings and exports following Bun's established
patterns
- Comprehensive test suite covering various JSONC features

## Test Plan

- [x] Basic JSON parsing works
- [x] Single-line comments (`//`) are handled correctly
- [x] Block comments (`/* */`) are handled correctly  
- [x] Trailing commas in objects and arrays work
- [x] Complex nested structures parse correctly
- [x] Error handling for invalid JSON
- [x] Empty objects and arrays work
- [x] Boolean and null values work correctly

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2026-01-08 13:27:47 -08:00
Kaj Kowalski
65d006aae0 fix(parser): fix bytecode CJS pragma detection after shebang (#25868)
### What does this PR do?

Fix bytecode CJS pragma detection when source file contains a shebang.

When bundling with `--bytecode` and the source file has a shebang, the
output silently fails to execute (exits 0, no output).

Reproduction:
[github.com/kjanat/bun-bytecode-banner-bug](https://github.com/kjanat/bun-bytecode-banner-bug)

```js
// Bundled output:
#!/usr/bin/env bun           // shebang preserved
// @bun @bytecode @bun-cjs   // pragma on line 2
(function(exports, require, module, __filename, __dirname) { ... })
```

The pragma parser in `hasBunPragma()` correctly skips the shebang line,
but uses `self.lexer.end` instead of `contents.len` when scanning for
`@bun-cjs`/`@bytecode` tokens. This causes the pragma to not be
recognized.

**Fix:**

```zig
// Before
while (cursor < self.lexer.end) : (cursor += 1) {

// After
while (cursor < end) : (cursor += 1) {
```

Where `end` is already defined as `contents.len` at the top of the
function.

### How did you verify your code works?

- Added bundler test `banner/SourceHashbangWithBytecodeAndCJSTargetBun`
in `test/bundler/bundler_banner.test.ts`
- Added regression tests in
`test/regression/issue/bun-bytecode-shebang.test.ts` that verify:
  - CJS wrapper executes when source has shebang
  - CJS wrapper executes when source has shebang + bytecode pragma
- End-to-end: bundled bytecode output with source shebang runs correctly
- Ran the tests in the
[kjanat/bun-bytecode-banner-bug](https://github.com/kjanat/bun-bytecode-banner-bug)
repo to verify the issue is fixed

---------

Signed-off-by: Kaj Kowalski <info@kajkowalski.nl>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-08 11:32:08 -08:00
Alistair Smith
8b59b8d17d types: Missing methods on udp socket 2026-01-08 09:38:31 +00:00
robobun
a1f1252771 refactor(test): migrate bun-install tests to concurrent execution (#25895) 2026-01-08 01:06:03 -08:00
Jarred Sumner
bf1e4922b4 Speed up some more tests (#25892)
### What does this PR do?

### How did you verify your code works?
2026-01-07 23:39:10 -08:00
SUZUKI Sosuke
fbf47d0256 fix: use JSPromise helper methods instead of direct internal field manipulation (#25889)
## Summary

Fixes `ASSERTION FAILED: isUInt32AsAnyInt()` errors that occurred
intermittently, particularly in inspector-related tests.

## Problem

The code was directly manipulating JSPromise internal fields using
`asUInt32AsAnyInt()`:

```cpp
promise->internalField(JSC::JSPromise::Field::Flags).get().asUInt32AsAnyInt()
```

This caused assertion failures when the internal field state was not a
valid uint32.

## Solution

Use WebKit's official JSPromise helper methods instead:

| Before | After |
|--------|-------|
| Direct `internalField` manipulation for rejected promise | `promise->rejectAsHandled(vm, globalObject, value)` |
| Direct `internalField` manipulation for resolved promise | `promise->fulfill(vm, globalObject, value)` |
| Direct `internalField` manipulation for handled flag | `promise->markAsHandled()` |

## Files Changed

- `src/bun.js/bindings/ZigGlobalObject.cpp`
- `src/bun.js/bindings/ModuleLoader.cpp`
- `src/bake/BakeGlobalObject.cpp`

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-07 23:23:27 -08:00
robobun
f83214e0a9 test(http2): refactor tests to use describe.concurrent and await using (#25893)
## Summary
- Use `describe.concurrent` at module scope for parallel test execution
across node/bun executables and padding strategies
- Replace `Bun.spawnSync` with async `Bun.spawn` in memory leak test
- Replace `beforeEach`/`afterEach` server setup with `await using` in
each test
- Add `Symbol.asyncDispose` to `nodeEchoServer` helper for proper
cleanup
- Fix IPv6/IPv4 binding issue by explicitly binding echo server to
127.0.0.1

## Test plan
- [x] Run `bun test test/js/node/http2/node-http2.test.js` - all 245
tests pass (6 skipped)
- [x] Verify tests run faster due to concurrent execution

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-07 23:15:50 -08:00
robobun
81debb4269 feat(bundler): add metafile support matching esbuild format (#25842) 2026-01-07 22:46:51 -08:00
robobun
962ac0c2fd refactor(test): use describe.concurrent and async spawn in bun-run.test.ts (#25890)
## Summary

- Wrap all tests in `describe.concurrent` at module scope for parallel
test execution
- Replace `Bun.spawnSync` with `Bun.spawn` + `await` throughout
- Replace `run_dir`/`writeFile` pattern with `tempDir` for automatic
cleanup via `using` declarations
- Remove `beforeEach` hook that created shared temp directory

## Test plan

- [x] All 291 tests pass with `bun bd test
test/cli/install/bun-run.test.ts`
- [x] All tests pass with `USE_SYSTEM_BUN=1 bun test
test/cli/install/bun-run.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-07 21:52:41 -08:00
Jarred Sumner
bdc95c2dc5 Speed up shell leak test (#25880)
### What does this PR do?
18s -> 3s

### How did you verify your code works?
2026-01-07 21:31:33 -08:00
Jarred Sumner
29a6c0d263 Speed up require-cache.test.ts (#25887)
### What does this PR do?

21.92s -> 6s

### How did you verify your code works?
2026-01-07 21:13:28 -08:00
Jarred Sumner
39e2c22e1a fix(http): disable keep-alive on proxy authentication failure (407) (#25884)
## Summary

- Disable HTTP keep-alive when a proxy returns a 407 (Proxy
Authentication Required) status code
- This prevents subsequent requests from trying to reuse a connection
that the proxy server has closed
- Refactored proxy tests to use `describe.concurrent` and async
`Bun.spawn` patterns
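
A sketch of the scenario the new test covers (proxy address and credentials are made up):

```ts
// Several concurrent requests through a proxy that rejects our
// credentials: each should fail with a 407-related error instead of
// hanging on a kept-alive connection the proxy already closed.
const results = await Promise.allSettled(
  Array.from({ length: 5 }, () =>
    fetch("https://example.com/", {
      proxy: "http://user:wrong-password@127.0.0.1:8080",
    }),
  ),
);
console.log(results.map(r => r.status)); // all settle, none hang
```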

## Test plan

- [x] Added test `simultaneous proxy auth failures should not hang` that
verifies multiple concurrent requests with invalid proxy credentials
complete without hanging
- [x] Existing proxy tests pass

🤖 Generated with [Claude Code](https://claude.ai/code)
2026-01-07 20:49:30 -08:00
robobun
b20a70dc40 fix: use JSONParseWithException for proper error handling (#25881)
## Summary
- **SQLClient.cpp**: Fix bug where `RETURN_IF_EXCEPTION` after
`JSONParse` would never trigger on JSON parse failure since `JSONParse`
doesn't throw
- **BunString.cpp**: Simplify by using `JSONParseWithException` instead
of manually checking for empty result and throwing

## Details

`JSC::JSONParse` returns an empty `JSValue` on failure **without
throwing an exception**. This means that `RETURN_IF_EXCEPTION(scope,
{})` will never catch JSON parsing errors when used after `JSONParse`.

Before this fix in `SQLClient.cpp`:
```cpp
JSC::JSValue json = JSC::JSONParse(globalObject, str);
RETURN_IF_EXCEPTION(scope, {});  // This never triggers on parse failure!
return json;  // Returns empty JSValue
```

This could cause issues when parsing invalid JSON data from SQL
databases (e.g., PostgreSQL's JSON/JSONB columns).

`JSONParseWithException` properly throws a `SyntaxError` exception that
`RETURN_IF_EXCEPTION` can catch.

## Test plan
- [x] Build succeeds with `bun bd`
- The changes follow the same pattern used in `ModuleLoader.cpp` and
`BunObject.cpp`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-07 20:13:01 -08:00
Jarred Sumner
1f22f4447d Align temp directory resolution with os.tmpdir() (#25878)
## Summary
- Aligns Bun's temp directory resolution with Node.js's `os.tmpdir()`
behavior
- Checks `TMPDIR`, `TMP`, and `TEMP` environment variables in order
(matching Node.js)
- Uses `bun.once` for lazy initialization instead of mutable static
state
- Removes `setTempdir` function and simplifies the API to use
`RealFS.tmpdirPath()` directly
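
For reference, a sketch of the `os.tmpdir()` behavior being matched (POSIX; assumes no other temp env vars interfere):

```ts
import { tmpdir } from "node:os";

// Node checks TMPDIR, then TMP, then TEMP, then falls back to /tmp.
// Bun's internal temp-directory resolution now follows the same order.
process.env.TMPDIR = "/custom/tmp";
console.log(tmpdir()); // "/custom/tmp"
```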

## Test plan
- [ ] Verify temp directory resolution matches Node.js behavior
- [ ] Test with various environment variable configurations
- [ ] Ensure existing tests pass with `bun bd test`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-07 16:09:49 -08:00
Bradley Walters
ff590e9cfd fix(cmake): fix include paths for local WebKit (#25815)
### What does this PR do?

Without this change, building with `-DWEBKIT_LOCAL=ON` fails with:
```
/work/bun/src/bun.js/bindings/BunObject.cpp:12:10: fatal error: 'JavaScriptCore/JSBase.h' file not found
   12 | #include <JavaScriptCore/JSBase.h>
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
```

The reason for this is because the directory structure differs between
downloaded and local WebKit.

Downloaded WebKit:
```
build/debug/cache/webkit-6d0f3aac0b817cc0/
  └── include/
      └── JavaScriptCore/
          └── JSBase.h          ← Direct path
```

Local WebKit:
```
vendor/WebKit/WebKitBuild/Debug/
  └── JavaScriptCore/Headers/
      └── JavaScriptCore/
          └── JSBase.h          ← Nested path
```

The include paths are thus configured differently for each build type.

For Remote WebKit (when WEBKIT_LOCAL=OFF):
- SetupWebKit.cmake line 22 sets: WEBKIT_INCLUDE_PATH =
${WEBKIT_PATH}/include
- BuildBun.cmake line 1253 adds:
include_directories(${WEBKIT_INCLUDE_PATH})
- This resolves to: build/debug/cache/webkit-6d0f3aac0b817cc0/include/
- So #include <JavaScriptCore/JSBase.h> finds the file at
include/JavaScriptCore/JSBase.h 

For Local WebKit (when WEBKIT_LOCAL=ON):
- The original code only added:
${WEBKIT_PATH}/JavaScriptCore/Headers/JavaScriptCore
- This resolves to:
vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/Headers/JavaScriptCore/
- So #include <JavaScriptCore/JSBase.h> fails because there's no
JavaScriptCore/ subdirectory at that level 
- The fix adds: ${WEBKIT_PATH}/JavaScriptCore/Headers
- Now the include path includes:
vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/Headers/
- So #include <JavaScriptCore/JSBase.h> finds the file at
Headers/JavaScriptCore/JSBase.h 

### How did you verify your code works?

Built locally.

Co-authored-by: Carl Smedstad <carsme@archlinux.org>
2026-01-07 15:46:46 -08:00
Bradley Walters
18f242daa1 feat(cmake): simplify bindgenv2 error handling using COMMAND_ERROR_IS_FATAL (#25814)
### What does this PR do?

In CMake, failure to execute a process will place a message string in
the RESULT_VARIABLE. If the message string starts with 'No' such as in
'No such file or directory' then CMake will interpret that as the
boolean false and not halt the build. The new code uses a built-in
option to halt the build on any failure, so the script will halt
correctly if that error occurs. This could also be fixed by quoting, but
might as well use the CMake feature.

I encountered this error when I improperly defined BUN_EXECUTABLE.

### How did you verify your code works?

<details>

<summary>Ran the build with invalid executable path:</summary>

```
$ node ./scripts/build.mjs \
                   -GNinja \
                   -DCMAKE_BUILD_TYPE=Debug \
                   -B build/debug \
                   --log-level=NOTICE \
                   -DBUN_EXECUTABLE="foo" \
                   -DNPM_EXECUTABLE="$(which npm)" \
                   -DZIG_EXECUTABLE="$(which zig)" \
                   -DENABLE_ASAN=OFF

Globbed 1971 sources [178.21ms]
CMake Configure
  $ cmake -G Ninja -DCMAKE_BUILD_TYPE=Debug -B /work/bun/build/debug --log-level NOTICE -DBUN_EXECUTABLE=foo -DNPM_EXECUTABLE=/bin/npm -DZIG_EXECUTABLE=/bin/zig -DENABLE_ASAN=OFF -S /work/bun -DCACHE_STRATEGY=auto
sccache: Using local cache strategy.
```

</details>

Result:

```
CMake Error at cmake/targets/BuildBun.cmake:422 (execute_process):
  execute_process failed command indexes:

    1: "Abnormal exit with child return code: no such file or directory"

Call Stack (most recent call first):
  CMakeLists.txt:66 (include)
```
2026-01-07 15:46:00 -08:00
Bradley Walters
fbc175692f feat(cmake): log download failure from SetupWebKit.cmake (#25813)
### What does this PR do?

Before this change, downloading WebKit would fail silently if there were
e.g. a connection or TLS certificate issue. I experienced this when
trying to build bun in an overly-restrictive sandbox.

After this change, the failure reason will be visible.

### How did you verify your code works?

<details>

<summary>Brought down my network interface and ran the build:</summary>

```sh
$ node ./scripts/build.mjs \
                   -GNinja \
                   -DCMAKE_BUILD_TYPE=Debug \
                   -B build/debug \
                   --log-level=NOTICE \
                   -DBUN_EXECUTABLE="$(which node)" \
                   -DNPM_EXECUTABLE="$(which npm)" \
                   -DZIG_EXECUTABLE="$(which zig)" \
                   -DENABLE_ASAN=OFF

Globbed 1971 sources [180.67ms]
CMake Configure
  $ cmake -G Ninja -DCMAKE_BUILD_TYPE=Debug -B /work/bun/build/debug --log-level NOTICE -DBUN_EXECUTABLE=/bin/node -DNPM_EXECUTABLE=/bin/npm -DZIG_EXECUTABLE=/bin/zig -DENABLE_ASAN=OFF -S /work/bun -DCACHE_STRATEGY=auto
sccache: Using local cache strategy.
...
```

</details>

Result:

```
CMake Error at cmake/tools/SetupWebKit.cmake:99 (message):
  Failed to download WebKit: 6;"Could not resolve hostname"
Call Stack (most recent call first):
  cmake/targets/BuildBun.cmake:1204 (include)
  CMakeLists.txt:66 (include)
```
2026-01-07 15:44:57 -08:00
Jarred Sumner
22315000e0 Update CLAUDE.md 2026-01-07 12:33:21 -08:00
M Waheed
bc02c18dc5 Fix typo in PM2 configuration example
According to the PM2 documentation, the name of the application is
defined by the "name" key, not "title".
2026-01-07 14:04:06 +00:00
Dylan Conway
4c492c66b8 fix(bundler): fix --compile with 8+ embedded files (#25859)
## Summary

Fixes #20821

When `bun build --compile` was used with 8 or more embedded files, the
compiled binary would silently fail to execute any code (exit code 0, no
output).

**Root cause:** Chunks were sorted alphabetically by their `entry_bits`
key bytes. For entry point 0, the key starts with bit 0 set (byte pattern
`0x01`), but for entry point 8, the set bit falls in the second byte
(pattern `0x00, 0x01`). Alphabetically, `0x00 < 0x01`, so entry point 8's
chunk sorted before entry point 0's.

This caused the wrong entry point to be identified as the main entry,
resulting in asset wrapper code being executed instead of the user's
code.

**Fix:** Custom sort that ensures `entry_point_id=0` (the main entry
point) always sorts first, with remaining chunks sorted alphabetically
for determinism.

## Test plan

- Added regression test `compile/ManyEmbeddedFiles` that embeds 8 files
and verifies the main entry point runs correctly
- Verified manually with reproduction case from issue

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-06 23:05:01 +00:00
Jack Kleeman
46801ec926 Send http2 window resize frames after half the window is used up (#25847)
### What does this PR do?
It is standard practice to send HTTP2 window resize frames once half the
window is used up:
- [nghttp2](https://en.wikipedia.org/wiki/Nghttp2#HTTP/2_implementation)
- [Go](https://github.com/golang/net/blob/master/http2/flow.go#L31)
- [Rust
h2](https://github.com/hyperium/h2/blob/master/src/proto/streams/flow_control.rs#L17)

The current behaviour of Bun is to send a window update once the client
has sent 65535 bytes exactly. This leads to an interruption in
throughput while the window update is received.

This is not just a performance concern, however: some clients will not
handle it well, because stopping even 1 byte short of 65535 produces a
deadlock. I came across this issue because the Rust `hyper` crate always
reserves an additional 1 byte of the connection window for each http2
stream
(https://github.com/hyperium/hyper/blob/master/src/proto/h2/mod.rs#L122).
This means that with two concurrent requests, the client treats the
window as exhausted when it actually has a byte remaining, creating a
sequential dependency between the requests that can deadlock if they
depend on running concurrently. I accept this is not a problem with Bun,
but it's a happy accident that we can resolve such off-by-one issues by
increasing the window size once it is 50% utilized.

### How did you verify your code works?
Using wireshark, bun debug logging, and client logging I can observe
that the window updates are now sent after 32767 bytes. This also
resolves the h2 crate client issue.
2026-01-06 11:37:50 -08:00
robobun
5617b92a5a test: refactor spawnSync to spawn with describe.concurrent (#25849)
## Summary

- Refactor 16 test files to use async `Bun.spawn` instead of
`Bun.spawnSync`
- Wrap tests in `describe.concurrent` blocks for parallel execution
- Use `await using` for automatic resource cleanup

## Performance Improvement

| Test File | Before | After | Improvement |
|-----------|--------|-------|-------------|
| `node-module-module.test.js` (28 tests) | ~325ms | ~185ms | **43% faster** |
| `non-english-import.test.js` (3 tests) | ~238ms | ~157ms | **34% faster** |

## Files Changed

- `test/cli/run/commonjs-invalid.test.ts`
- `test/cli/run/commonjs-no-export.test.ts`
- `test/cli/run/empty-file.test.ts`
- `test/cli/run/jsx-symbol-collision.test.ts`
- `test/cli/run/run-cjs.test.ts`
- `test/cli/run/run-extensionless.test.ts`
- `test/cli/run/run-shell.test.ts`
- `test/cli/run/run-unicode.test.ts`
- `test/js/bun/resolve/non-english-import.test.js`
- `test/js/node/module/node-module-module.test.js`
- `test/regression/issue/00631.test.ts`
- `test/regression/issue/03216.test.ts`
- `test/regression/issue/03830.test.ts`
- `test/regression/issue/04011.test.ts`
- `test/regression/issue/04893.test.ts`
- `test/regression/issue/hashbang-still-works.test.ts`

## Test plan

- [x] All refactored tests pass with `USE_SYSTEM_BUN=1 bun test <file>`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-06 15:37:56 +00:00
Jarred Sumner
c6a73fc23e Hardcode __NR_close_range (#25840)
### What does this PR do?

Fixes https://github.com/oven-sh/bun/issues/25839

### How did you verify your code works?
2026-01-06 15:07:19 +00:00
Andrew Johnston
3de2dc1287 fix: use the computed size of the Offsets struct instead of hard-co… (#25816)
### What does this PR do?

- I was trying to understand how the SEA bundling worked in Bun and
noticed the size of the `Offsets` struct is hard-coded here to 32. It
should use the computed size to be future proof to changes in the
schema.

### How did you verify your code works?

- I didn't. Can add tests if this is not covered by existing tests.
ChatGPT agreed with me though. =)
2026-01-06 15:04:28 +00:00
SUZUKI Sosuke
370e6fb9fa fix(fetch): fix ReadableStream memory leak when using stream body (#25846)
## Summary

This PR fixes a memory leak that occurs when `fetch()` is called with a
`ReadableStream` body. The ReadableStream objects were not being
properly released, causing them to accumulate in memory.

## Problem

When using `fetch()` with a ReadableStream body:

```javascript
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("data"));
    controller.close();
  }
});

await fetch(url, { method: "POST", body: stream });
```

The ReadableStream objects leak because `FetchTasklet.clearData()` has a
conditional that prevents `detach()` from being called on ReadableStream
request bodies after streaming has started.

### Root Cause

The problematic condition in `clearData()`:

```zig
if (this.request_body != .ReadableStream or this.is_waiting_request_stream_start) {
    this.request_body.detach();
}
```

After `startRequestStream()` is called:
- `is_waiting_request_stream_start` becomes `false`
- `request_body` is still `.ReadableStream`
- The condition evaluates to `(false or false) = false`
- `detach()` is skipped → **memory leak**

### Why the Original Code Was Wrong

The original code appears to assume that when `startRequestStream()` is
called, ownership of the Strong reference is transferred to
`ResumableSink`. However, this is incorrect:

1. `startRequestStream()` creates a **new independent** Strong reference
in `ResumableSink` (see `ResumableSink.zig:119`)
2. The FetchTasklet's original reference is **not transferred** - it
becomes redundant
3. Strong references in Bun are independent - calling `deinit()` on one
does not affect the other

## Solution

Remove the conditional and always call `detach()`:

```zig
// Always detach request_body regardless of type.
// When request_body is a ReadableStream, startRequestStream() creates
// an independent Strong reference in ResumableSink, so FetchTasklet's
// reference becomes redundant and must be released to avoid leaks.
this.request_body.detach();
```

### Safety Analysis

This change is safe because:

1. **Strong references are independent**: Each Strong reference
maintains its own ref count. Detaching FetchTasklet's reference doesn't
affect ResumableSink's reference
2. **Idempotency**: `detach()` is safe to call on already-detached
references
3. **Timing**: `clearData()` is only called from `deinit()` after
streaming has completed (ref_count = 0)
4. **No UAF risk**: `deinit()` only runs when ref_count reaches 0, which
means all streaming operations have completed

## Test Results

Before fix (with system Bun):
```
Expected: <= 100
Received: 501   (Request objects leaked)
Received: 1002  (ReadableStream objects leaked)
```

After fix:
```
6 pass
0 fail
```

## Test Coverage

Added comprehensive tests in
`test/js/web/fetch/fetch-cyclic-reference.test.ts` covering:
- Response stream leaks with cyclic references
- Streaming response body leaks
- Request body stream leaks with cyclic references
- ReadableStream body leaks (no cyclic reference needed to reproduce)
- Concurrent fetch operations with cyclic references

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-06 15:00:52 +00:00
Dylan Conway
91f7a94d84 Add symbols.def to link-metadata.json (#25841)
Include the Windows module definition file (symbols.def) in
link-metadata.json, similar to how symbols.txt and symbols.dyn are
already included for macOS and Linux.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-05 17:45:41 -08:00
Prithvish Baidya
9ab6365a13 Add support for Requester Pays in S3 operations (#25514)
- Introduced `requestPayer` option in S3-related functions and
structures to handle Requester Pays buckets.
- Updated S3 client methods to accept and propagate the `requestPayer`
flag.
- Enhanced documentation for the `requestPayer` option in the S3 type
definitions.
- Adjusted existing S3 operations to utilize the `requestPayer`
parameter where applicable, ensuring compatibility with AWS S3's
Requester Pays feature.
- Ensured that the new functionality is integrated into multipart
uploads and simple requests.

### What does this PR do?

This change allows users to specify whether they are willing to pay for
data transfer costs when accessing objects in Requester Pays buckets,
improving flexibility and compliance with AWS S3's billing model.

This closes #25499
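
A hedged usage sketch based on the description (exact option placement may differ from the final API; region is an assumption):

```ts
import { S3Client } from "bun";

const s3 = new S3Client({
  bucket: "hl-mainnet-evm-blocks",
  region: "us-east-1", // assumption for the example bucket
});

// Opt in to paying transfer costs on a Requester Pays bucket.
const bytes = await s3
  .file("0/0/1.rmp.lz4", { requestPayer: true })
  .arrayBuffer();
```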

### How did you verify your code works?

I have added a new test file to verify this functionality, and all my
tests pass.
I also tested this against an actual S3 bucket which can only be
accessed if requester pays. I can confirm that it's accessible with
`requestPayer` is `true`, and the default of `false` does not allow
access.

An example bucket is here: s3://hl-mainnet-evm-blocks/0/0/1.rmp.lz4
(my usecase is indexing [hyperliquid block
data](https://hyperliquid.gitbook.io/hyperliquid-docs/for-developers/hyperevm/raw-hyperevm-block-data)
which is stored in s3, and I want to use bun to index faster)

---------

Co-authored-by: Alistair Smith <hi@alistair.sh>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2026-01-05 15:04:20 -08:00
Alistair Smith
bf937f7294 sql: filter out undefined values in INSERT helper instead of treating as NULL (#25830)
### What does this PR do?

the `sql()` helper now filters out `undefined` values in INSERT
statements instead of converting them to `NULL`. This allows columns
with `DEFAULT` values to use their defaults when `undefined` is passed,
rather than being overridden with `NULL`.

**Before:** `sql({ foo: undefined, id: "123" })` in INSERT would
generate `(foo, id) VALUES (NULL, "123")`, causing NOT NULL constraint
violations even when the column has a DEFAULT.

**After:** `sql({ foo: undefined, id: "123" })` in INSERT generates
`(id) VALUES ("123")`, omitting the undefined column entirely and
letting the database use the DEFAULT value.
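
In code, assuming a table with a DEFAULT column (a sketch using Bun's `sql` insert helper):

```ts
import { sql } from "bun";

// Table: users(id TEXT, foo TEXT NOT NULL DEFAULT 'bar')
await sql`INSERT INTO users ${sql({ id: "123", foo: undefined })}`;
// Before: INSERT INTO users (foo, id) VALUES (NULL, '123') -> constraint error
// After:  INSERT INTO users (id) VALUES ('123')            -> DEFAULT applies
```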

Also fixes a data loss bug in bulk inserts where columns were determined
only from the first item - now all items are checked, so values in later
items aren't silently dropped.

Fixes #25829

### How did you verify your code works?

- Added regression test for #25829 (NOT NULL column with DEFAULT)
- Added tests for bulk insert with mixed undefined patterns, which is the data loss scenario
2026-01-05 15:03:20 -08:00
Meghan Denny
ce9788716f test: resolve this napi TODO (#25287)
since https://github.com/oven-sh/bun/pull/20772

Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2026-01-05 12:07:28 -08:00
robobun
4301af9f3e Harden TLS hostname verification (#25727)
## Summary
- Tighten wildcard certificate matching logic for improved security
- Add tests for wildcard hostname verification edge cases

## Test plan
- [x] `bun bd test test/js/web/fetch/fetch.tls.wildcard.test.ts` passes
- [x] Existing TLS tests continue to pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-05 10:21:49 -08:00
Jarred Sumner
8d1de78c7e Deflake stress.test.ts 2026-01-05 17:21:34 +00:00
Darwin ❤️❤️❤️
27ff6aaae0 fix(web): make URLSearchParams.prototype.size configurable (#25762) 2026-01-02 04:57:48 -08:00
robobun
779764332a feat(cli): add --grep as alias for -t/--test-name-pattern in bun test (#25788) 2026-01-02 04:52:47 -08:00
Alex Miller
0141a4fac9 docs: fix shell prompt rendering and remove hardcoded prompts (#25792) 2026-01-02 01:43:19 +00:00
robobun
113830d3cf docs: update npmrc.mdx with all supported .npmrc fields (#25783) 2025-12-31 01:46:37 -08:00
robobun
d9ae93e025 fix(cmake): fix JSON parsing in SetupBuildkite.cmake (#25755)
## Summary

- Fix CMake JSON parsing error when Buildkite API returns commit
messages with newlines

CMake's `file(READ ...)` reads files with literal newlines, which breaks
`string(JSON ...)` when the JSON contains escape sequences like `\n` in
string values (e.g., commit messages from Buildkite API).

Use `file(STRINGS ...)` to read line-by-line, then join with `\n` to
preserve valid JSON escape sequences while avoiding literal newlines.

## Test plan

- Verify CMake configure succeeds when Buildkite build has commit
messages with newlines

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-30 23:51:29 -08:00
robobun
604c83c8a6 perf(ipc): fix O(n²) JSON scanning for large chunked messages (#25743)
## Summary

- Fix O(n²) performance bug in JSON mode IPC when receiving large
messages that arrive in chunks
- Add `JsonIncomingBuffer` wrapper that tracks newline positions to
avoid re-scanning
- Each byte is now scanned exactly once (on arrival or when preceding
message is consumed)

## Problem

When data arrives in chunks in JSON mode, `decodeIPCMessage` was calling
`indexOfChar(data, '\n')` on the ENTIRE accumulated buffer every time.
For a 10MB message arriving in 160 chunks of 64KB:

- Chunk 1: scan 64KB
- Chunk 2: scan 128KB  
- Chunk 3: scan 192KB
- ...
- Chunk 160: scan 10MB

Total: ~800MB scanned for one 10MB message.

## Solution

Introduced a `JsonIncomingBuffer` struct that:
1. Tracks `newline_pos: ?u32` - position of known upcoming newline (if
any)
2. On `append(bytes)`: Only scans new chunk for `\n` if no position is
cached
3. On `consume(bytes)`: Updates or re-scans as needed after message
processing

This ensures O(n) scanning instead of O(n²).
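
A TypeScript sketch of the caching idea (the real `JsonIncomingBuffer` is Zig and operates on bytes; names and shapes here are illustrative):

```ts
class JsonIncomingBuffer {
  private buf = "";
  private newlinePos: number | null = null; // cached position of next '\n'

  append(chunk: string): void {
    // Only scan the new chunk if we don't already know where the next
    // newline is; never re-scan the accumulated buffer.
    if (this.newlinePos === null) {
      const i = chunk.indexOf("\n");
      if (i !== -1) this.newlinePos = this.buf.length + i;
    }
    this.buf += chunk;
  }

  /** Returns the next complete message, or null if none is buffered yet. */
  next(): string | null {
    if (this.newlinePos === null) return null;
    const line = this.buf.slice(0, this.newlinePos);
    this.buf = this.buf.slice(this.newlinePos + 1);
    // Scan only the remainder for the following newline.
    const i = this.buf.indexOf("\n");
    this.newlinePos = i === -1 ? null : i;
    return line;
  }
}
```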

## Test plan

- [x] `bun run zig:check-all` passes (all platforms compile)
- [x] `bun bd test test/js/bun/spawn/spawn.ipc.test.ts` - 4 tests pass
- [x] `bun bd test test/js/node/child_process/child_process_ipc.test.js` - 1 test passes
- [x] `bun bd test test/js/bun/spawn/bun-ipc-inherit.test.ts` - 1 test passes
- [x] `bun bd test test/js/bun/spawn/spawn.ipc.bun-node.test.ts` - 1 test passes
- [x] `bun bd test test/js/bun/spawn/spawn.ipc.node-bun.test.ts` - 1 test passes
- [x] `bun bd test test/js/node/child_process/child_process_ipc_large_disconnect.test.js` - 1 test passes
- [x] Manual verification with `child-process-send-cb-more.js` (32KB
messages)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-29 20:02:18 -08:00
robobun
370b25c086 perf(Buffer.indexOf): use SIMD-optimized search functions (#25745)
## Summary

Optimize `Buffer.indexOf` and `Buffer.includes` by replacing
`std::search` with Highway SIMD-optimized functions:

- **Single-byte search**: `highway_index_of_char` - SIMD-accelerated
character search
- **Multi-byte search**: `highway_memmem` - SIMD-accelerated substring
search

These Highway functions are already used throughout Bun's codebase for
fast string searching (URL parsing, JS lexer, etc.) and provide
significant speedups for pattern matching in large buffers.

### Changes

```cpp
// Before: std::search (scalar)
auto it = std::search(haystack.begin(), haystack.end(), needle.begin(), needle.end());

// After: SIMD-optimized
if (valueLength == 1) {
    size_t result = highway_index_of_char(haystackPtr, haystackLen, valuePtr[0]);
} else {
    void* result = highway_memmem(haystackPtr, haystackLen, valuePtr, valueLength);
}
```
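
The observable semantics are unchanged; a quick sanity check covering both paths (sizes arbitrary):

```ts
const haystack = Buffer.alloc(1 << 20, "a");
haystack.write("needle", 512 * 1024);

console.log(haystack.indexOf(0x6e));     // 524288 (single-byte path, 'n')
console.log(haystack.indexOf("needle")); // 524288 (multi-byte path)
console.log(haystack.includes("xyz"));   // false
```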

## Test plan

- [x] Debug build compiles successfully
- [x] Basic functionality verified (`indexOf`, `includes` with single
and multi-byte patterns)
- [ ] Existing Buffer tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-29 17:27:12 -08:00
Tommy D. Rossi
538be1399c feat(bundler): expose reactFastRefresh option in Bun.build API (#25731)
Fixes #25716

Adds support for a `reactFastRefresh: boolean` option in the `Bun.build`
JavaScript API, matching the existing `--react-fast-refresh` CLI flag.

```ts
const result = await Bun.build({
    reactFastRefresh: true,
    entrypoints: ["src/App.tsx"],
});
```

When enabled, the bundler adds React Fast Refresh transform code
(`$RefreshReg$`, `$RefreshSig$`) to the output.
2025-12-28 22:07:47 -08:00
robobun
d04b86d34f perf: use jsonStringifyFast for faster JSON serialization (#25733)
## Summary

Apply the same optimization technique from PR #25717 (Response.json) to
other APIs that use JSON.stringify internally:

- **IPC message serialization** (`ipc.zig`) - used for inter-process
communication
- **console.log with %j format** (`ConsoleObject.zig`) - commonly used
for debugging
- **PostgreSQL JSON/JSONB types** (`PostgresRequest.zig`) - database
operations
- **MySQL JSON type** (`MySQLTypes.zig`) - database operations
- **Jest %j/%o format specifiers** (`jest.zig`) - test output formatting
- **Transpiler tsconfig/macros** (`JSTranspiler.zig`) - build
configuration

### Root Cause

When calling `JSONStringify(globalObject, value, 0)`, the space
parameter `0` becomes `jsNumber(0)`, which is NOT `undefined`. This
causes JSC's FastStringifier (SIMD-optimized) to bail out:

```cpp
// In WebKit's JSONObject.cpp FastStringifier::stringify()
if (!space.isUndefined()) {
    logOutcome("space"_s);
    return { };  // Bail out to slow path
}
```

Using `jsonStringifyFast` which passes `jsUndefined()` triggers the fast
path.

### Expected Performance Improvement

Based on PR #25717 results, these changes should provide ~3x speedup for
JSON serialization in the affected APIs.

## Test plan

- [x] Debug build compiles successfully
- [x] Basic functionality verified (IPC, console.log %j, Response.json)
- [x] Existing tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-28 18:01:07 -08:00
robobun
37fc8e99f7 Harden WebSocket client decompression (#25724)
## Summary
- Add maximum decompressed message size limit to WebSocket client
deflate handling
- Add test coverage for decompression limits

## Test plan
- Run `bun test
test/js/web/websocket/websocket-permessage-deflate-edge-cases.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-28 17:58:24 -08:00
robobun
6b5de25d8a feat(shell): add $.trace for analyzing shell commands without execution (#25667)
## Summary

Adds `Bun.$.trace` for tracing shell commands without executing them.

```js
const result = $.trace`cat /tmp/file.txt > output.txt`;
// { operations: [...], cwd: "...", success: true, error: null }
```

## Test plan

- [x] `bun bd test test/js/bun/shell/trace.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-27 17:25:52 -08:00
Alex Miller
7b49654db6 fix(io): Prevent data corruption in Bun.write for files >2GB (#25720)
Closes #8254

Fixes a data corruption bug in `Bun.write()` where files larger than 2GB
would have chunks skipped resulting in corrupted output with missing
data.

The `doWriteLoop` had an issue where it effectively applied the offset
twice for every 2GB chunk:
- it first sliced the buffer by `total_written`:
  `remain = remain[@min(this.total_written, remain.len)..]`
- it would then increment `bytes_blob.offset`:
  `this.bytes_blob.offset += @truncate(wrote)`

But because `sharedView()` already applies the blob offset (`slice_ = slice_[this.offset..]`), the offset ended up being doubled.

In a local reproduction writing a 16GB file with each 2GB chunk filled with incrementing values `[1, 2, 3, 4, 5, 6, 7, 8]`, the buggy version produced: `[1, 3, 5, 7, …]`, skipping every other chunk.

The fix is to simply remove the redundant manual offset and rely only on `total_written` to track write progress.
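
A scaled-down sketch of that verification (the original repro used eight 2GB chunks and needs ~16GB of disk; increase `CHUNK` accordingly to actually cross the 2GB threshold):

```ts
const CHUNK = 256 * 1024 * 1024; // scaled down; the original repro used 2GB chunks
const chunks = Array.from({ length: 8 }, (_, i) => new Uint8Array(CHUNK).fill(i + 1));

// Each chunk is filled with its 1-based index; the buggy version wrote
// chunks [1, 3, 5, 7] and silently skipped every other one.
await Bun.write("big.bin", new Blob(chunks));

const file = Bun.file("big.bin");
for (let i = 0; i < 8; i++) {
  const byte = new Uint8Array(await file.slice(i * CHUNK, i * CHUNK + 1).arrayBuffer())[0];
  console.log(`chunk ${i + 1}: got ${byte}, expected ${i + 1}`);
}
```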
2025-12-27 16:58:36 -08:00
SUZUKI Sosuke
603bbd18a0 Enable CHECK_REF_COUNTED_LIFECYCLE in WebKit (#25705)
### What does this PR do?

Enables `CHECK_REF_COUNTED_LIFECYCLE` in WebKit (
https://github.com/oven-sh/WebKit/pull/121 )

See also
a978fae619

#### `CHECK_REF_COUNTED_LIFECYCLE`?

A compile-time macro that enables lifecycle validation for
reference-counted objects in debug builds.

**Definition**
```cpp
  #if ASSERT_ENABLED || ENABLE(SECURITY_ASSERTIONS)
  #define CHECK_REF_COUNTED_LIFECYCLE 1
  #else
  #define CHECK_REF_COUNTED_LIFECYCLE 0
  #endif
```
**Purpose**

Detects three categories of bugs:
1. Missing adoption - objects stored in `RefPtr` without using `adoptRef()`
2. Ref during destruction - `ref()` called while the destructor is running (causes dangling pointers)
3. Thread safety violations - unsafe ref/deref across threads

**Implementation**

When enabled, `RefCountDebugger` adds two tracking flags:
- `m_deletionHasBegun` - set when the destructor starts
- `m_adoptionIsRequired` - cleared when `adoptRef()` is called

These flags are checked on every `ref()`/`deref()` call, with assertions
failing on violations.

**Motivation**

Refactored debug code into a separate `RefCountDebugger` class to:
- Improve readability of the core refcount logic
- Eliminate duplicate code across `RefCounted`, `ThreadSafeRefCounted`, etc.
- Simplify adding new refcount classes

**Overhead**

Zero in release builds - the flags and checks are compiled out entirely.

### How did you verify your code works?

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-27 15:02:11 -08:00
robobun
1d7cb4bbad perf(Response.json): use JSC's FastStringifier by passing undefined for space (#25717)
## Summary

- Fix performance regression where `Response.json()` was 2-3x slower
than `JSON.stringify() + new Response()`
- Root cause: The existing code called `JSC::JSONStringify` with
`indent=0`, which internally passes `jsNumber(0)` as the space
parameter. This bypasses WebKit's FastStringifier optimization.
- Fix: Add a new `jsonStringifyFast` binding that passes `jsUndefined()`
for the space parameter, triggering JSC's FastStringifier
(SIMD-optimized) code path.

## Root Cause Analysis

In WebKit's `JSONObject.cpp`, the `stringify()` function has this logic:

```cpp
static NEVER_INLINE String stringify(JSGlobalObject& globalObject, JSValue value, JSValue replacer, JSValue space)
{
    // ...
    if (String result = FastStringifier<Latin1Character, BufferMode::StaticBuffer>::stringify(globalObject, value, replacer, space, failureReason); !result.isNull())
        return result;
    // Falls back to slow Stringifier...
}
```

And `FastStringifier::stringify()` checks:
```cpp
if (!space.isUndefined()) {
    logOutcome("space"_s);
    return { };  // Bail out to slow path
}
```

So when we called `JSONStringify(globalObject, value, (unsigned)0)`, it
converted to `jsNumber(0)` which is NOT `undefined`, causing
FastStringifier to bail out.

## Performance Results

### Before (3.5x slower than manual approach)
```
Response.json():                2415ms
JSON.stringify() + Response():  689ms
Ratio:                          3.50x
```

### After (parity with manual approach)
```
Response.json():                ~700ms  
JSON.stringify() + Response():  ~700ms
Ratio:                          ~1.09x
```
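
A rough sketch of the kind of microbenchmark behind these numbers (payload shape and iteration count are arbitrary):

```ts
const payload = { items: Array.from({ length: 1_000 }, (_, i) => ({ i, s: "x".repeat(16) })) };

function bench(label: string, fn: () => void, iters = 20_000) {
  const start = Bun.nanoseconds();
  for (let i = 0; i < iters; i++) fn();
  console.log(label, `${((Bun.nanoseconds() - start) / 1e6).toFixed(0)}ms`);
}

bench("Response.json()", () => Response.json(payload));
bench("JSON.stringify() + Response()", () => new Response(JSON.stringify(payload)));
```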

## Test plan

- [x] Existing `Response.json()` tests pass
(`test/regression/issue/21257.test.ts`)
- [x] Response tests pass (`test/js/web/fetch/response.test.ts`)
- [x] Manual verification that output is correct for various JSON inputs

Fixes #25693

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Sosuke Suzuki <sosuke@bun.com>
2025-12-27 15:01:28 -08:00
SUZUKI Sosuke
01de0ecbd9 Add simple benchmark for Array.of (#25711)
**before:**

```
$ bun bench/snippets/array-of.js
cpu: Apple M4 Max
runtime: bun 1.3.5 (arm64-darwin)

benchmark                                  time (avg)             (min … max)       p75       p99      p999
----------------------------------------------------------------------------- -----------------------------
int: Array.of(1,2,3,4,5)                 9.19 ns/iter       (8.1 ns … 108 ns)   9.28 ns  13.63 ns  69.44 ns
int: Array.of(100 elements)             1'055 ns/iter   (94.58 ns … 1'216 ns)  1'108 ns  1'209 ns  1'216 ns
double: Array.of(1.1,2.2,3.3,4.4,5.5)   10.34 ns/iter      (8.81 ns … 102 ns)  10.17 ns  17.19 ns  73.51 ns
double: Array.of(100 elements)          1'073 ns/iter     (124 ns … 1'215 ns)  1'136 ns  1'204 ns  1'215 ns
object: Array.of(obj x5)                19.19 ns/iter     (16.58 ns … 122 ns)  19.06 ns   77.6 ns  85.75 ns
object: Array.of(100 elements)          1'340 ns/iter     (294 ns … 1'568 ns)  1'465 ns  1'537 ns  1'568 ns
```

**after:**

```
$ ./build/release/bun bench/snippets/array-of.js
cpu: Apple M4 Max
runtime: bun 1.3.6 (arm64-darwin)

benchmark                                  time (avg)             (min … max)       p75       p99      p999
----------------------------------------------------------------------------- -----------------------------
int: Array.of(1,2,3,4,5)                 2.68 ns/iter    (2.14 ns … 92.96 ns)   2.52 ns   3.95 ns  59.73 ns
int: Array.of(100 elements)             23.69 ns/iter     (18.88 ns … 155 ns)  20.91 ns  83.82 ns  96.66 ns
double: Array.of(1.1,2.2,3.3,4.4,5.5)    3.62 ns/iter    (2.97 ns … 75.44 ns)   3.46 ns   5.05 ns  65.82 ns
double: Array.of(100 elements)          26.96 ns/iter     (20.14 ns … 156 ns)  24.45 ns  87.75 ns  96.88 ns
object: Array.of(obj x5)                11.82 ns/iter     (9.6 ns … 87.38 ns)  11.23 ns  68.99 ns  77.09 ns
object: Array.of(100 elements)            236 ns/iter       (206 ns … 420 ns)    273 ns    325 ns    386 ns
```
2025-12-27 00:05:57 -08:00
Oleksandr Herasymov
d3a5f2eef2 perf: speed up Bun.hash.crc32 by switching to zlib CRC32 (#25692)
## What does this PR do?
Switch `Bun.hash.crc32` to use `zlib`'s CRC32 implementation. Bun
already links `zlib`, which provides highly optimized,
hardware-accelerated CRC32. Because `zlib.crc32` takes a 32-bit length,
chunk large inputs to avoid truncation/regressions on buffers >4GB.

Note: This was tried before (PR #12164 by Jarred), which switched CRC32
to zlib for speed. This proposal keeps that approach and adds explicit
chunking to avoid the >4GB length pitfall.

**Problem:** `Bun.hash.crc32` is a significant outlier in
microbenchmarks compared to other hash functions (about 21x slower than
`zlib.crc32` in a 1MB test on M1).

**Root cause:** `Bun.hash.crc32` uses Zig's `std.hash.Crc32`
implementation, which is software-only and does not leverage hardware
acceleration (e.g., `PCLMULQDQ` on x86 or `CRC32` instructions on ARM).

**Implementation:**
https://github.com/oven-sh/bun/blob/main/src/bun.js/api/HashObject.zig

```zig
pub const crc32 = hashWrap(struct {
    pub fn hash(seed: u32, bytes: []const u8) u32 {
        // zlib takes a 32-bit length, so chunk large inputs to avoid truncation.
        var crc: u64 = seed;
        var offset: usize = 0;
        while (offset < bytes.len) {
            const remaining = bytes.len - offset;
            const max_len: usize = std.math.maxInt(u32);
            const chunk_len: u32 = if (remaining > max_len) @intCast(max_len) else @intCast(remaining);
            crc = bun.zlib.crc32(crc, bytes.ptr + offset, chunk_len);
            offset += chunk_len;
        }
        return @intCast(crc);
    }
});
```
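
For reference, the two paths compared in the benchmark, assuming Node's `zlib.crc32` helper (available in recent Node releases and in Bun):

```ts
import { crc32 } from "node:zlib";

const data = Buffer.alloc(1024 * 1024, 7);

// Both should produce the same checksum, now at comparable speed.
console.log(Bun.hash.crc32(data));
console.log(crc32(data));
```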

### How did you verify your code works?
**Benchmark (1MB payload):**
- **Before:** Bun 1.3.5 `Bun.hash.crc32` = 2,644,444 ns/op vs
`zlib.crc32` = 124,324 ns/op (~21x slower)
- **After (local bun-debug):** `Bun.hash.crc32` = 360,591 ns/op vs
`zlib.crc32` = 359,069 ns/op (~1.0x), results match

## Test environment
- **Machine:** MacBook Pro 13" (M1, 2020)
- **OS:** macOS 15.7.3
- **Baseline Bun:** 1.3.5
- **After Bun:** local `bun-debug` (build/debug)
2025-12-26 23:41:10 -08:00
robobun
b51e993bc2 fix: reject null bytes in spawn args, env, and shell arguments (#25698)
## Summary

- Reject null bytes in command-line arguments passed to `Bun.spawn` and
`Bun.spawnSync`
- Reject null bytes in environment variable keys and values
- Reject null bytes in shell (`$`) template literal arguments

This prevents null byte injection attacks (CWE-158) where null bytes in
strings could cause unintended truncation when passed to the OS,
potentially allowing attackers to bypass file extension validation or
create files with unexpected names.
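
A hedged sketch of the now-rejected inputs (exact error types and messages may differ):

```ts
import { $ } from "bun";

const evil = "report.pdf\0.sh"; // would truncate to "report.pdf" at the OS boundary

try {
  Bun.spawnSync(["touch", evil]);
} catch (e) {
  console.log("argv rejected:", (e as Error).message);
}

try {
  Bun.spawnSync(["printenv"], { env: { ["BAD\0KEY"]: "v" } });
} catch (e) {
  console.log("env rejected:", (e as Error).message);
}

await $`touch ${evil}`.catch(e => console.log("shell rejected:", e.message));
```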

## Test plan

- [x] Added tests in `test/js/bun/spawn/null-byte-injection.test.ts`
- [x] Tests pass with debug build: `bun bd test
test/js/bun/spawn/null-byte-injection.test.ts`
- [x] Tests fail with system Bun (confirming the fix works)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-26 23:39:37 -08:00
SUZUKI Sosuke
92f105dbe1 Add microbench for String#includes (#25699)
Note: the large improvement below is due to constant folding by the JIT. For `String#includes`
on dynamic strings, the performance improvement is not this significant.

**before:**
```
$ bun ./bench/snippets/string-includes.mjs
cpu: Apple M4 Max
runtime: bun 1.3.5 (arm64-darwin)

benchmark                                  time (avg)             (min … max)       p75       p99      p999
----------------------------------------------------------------------------- -----------------------------
String.includes - short, hit (middle)   82.24 ns/iter     (14.95 ns … 881 ns)  84.98 ns    470 ns    791 ns
String.includes - short, hit (start)    37.44 ns/iter      (8.46 ns … 774 ns)  26.08 ns    379 ns    598 ns
String.includes - short, hit (end)      97.27 ns/iter     (16.93 ns … 823 ns)    107 ns    537 ns    801 ns
String.includes - short, miss             102 ns/iter       (0 ps … 1'598 µs)     42 ns    125 ns    167 ns
String.includes - long, hit (middle)    16.01 ns/iter     (14.34 ns … 115 ns)  16.03 ns   20.1 ns   53.1 ns
String.includes - long, miss              945 ns/iter       (935 ns … 972 ns)    948 ns    960 ns    972 ns
String.includes - with position          9.83 ns/iter    (8.44 ns … 58.45 ns)   9.83 ns  12.31 ns  15.69 ns
```

**after:**
```
$ ./build/release/bun bench/snippets/string-includes.mjs
cpu: Apple M4 Max
runtime: bun 1.3.6 (arm64-darwin)

benchmark                                  time (avg)             (min … max)       p75       p99      p999
----------------------------------------------------------------------------- -----------------------------
String.includes - short, hit (middle)     243 ps/iter     (203 ps … 10.13 ns)    244 ps    325 ps    509 ps !
String.includes - short, hit (start)      374 ps/iter     (244 ps … 19.78 ns)    387 ps    488 ps    691 ps
String.includes - short, hit (end)        708 ps/iter     (407 ps … 18.03 ns)    651 ps   2.62 ns   2.69 ns
String.includes - short, miss            1.49 ns/iter     (407 ps … 27.93 ns)   2.87 ns   3.09 ns   3.78 ns
String.includes - long, hit (middle)     3.28 ns/iter      (3.05 ns … 118 ns)   3.15 ns   8.75 ns  16.07 ns
String.includes - long, miss             7.28 ns/iter      (3.44 ns … 698 ns)   9.34 ns  42.85 ns    240 ns
String.includes - with position          7.97 ns/iter       (3.7 ns … 602 ns)   9.68 ns  52.19 ns    286 ns
```
2025-12-26 21:49:00 -08:00
Dylan Conway
d0bd1b121f Fix DCE producing invalid syntax for empty objects in spreads (#25710)
## Summary
- Fixes dead code elimination producing invalid syntax like `{ ...a, x:
}` when simplifying empty objects in spread contexts
- The issue was that `simplifyUnusedExpr` and `joinAllWithCommaCallback`
could return `E.Missing` instead of `null` to indicate "no side effects"
- Added checks to return `null` when the result is `E.Missing`

Fixes #25609
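
A hypothetical input along these lines (the actual reproduction lives in the regression test):

```ts
// entry.ts, bundled with: bun build --minify entry.ts
// The object literal below is an unused expression: DCE keeps the spread
// (it may trigger getters) but used to simplify the empty object to an
// E.Missing node, emitting invalid syntax like `({ ...a, x: })`.
const a: Record<string, unknown> = { y: 1 };
({ ...a, x: {} });
```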

## Test plan
- [x] Added regression test that fails on v1.3.5 and passes with fix
- [x] `bun bd test test/regression/issue/25609.test.ts` passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-26 21:48:19 -08:00
robobun
81b4a40fbd [publish images] Remove sccache, use ccache only (#25682)
## Summary
- Remove sccache support entirely, use ccache only
- Missing ccache no longer fails the build (just skips caching)
- Remove S3 distributed cache support

## Changes
- Remove `cmake/tools/SetupSccache.cmake` and S3 distributed cache
support
- Simplify `CMakeLists.txt` to only use ccache
- Update `SetupCcache.cmake` to not fail when ccache is missing
- Replace sccache with ccache in bootstrap scripts (sh, ps1)
- Update `.buildkite/Dockerfile` to install ccache instead of sccache
- Update `flake.nix` and `shell.nix` to use ccache
- Update documentation (CONTRIBUTING.md, contributing.mdx,
building-windows.mdx)
- Remove `scripts/build-cache/` directory (was only for sccache S3
access)

## Test plan
- [x] Build completes successfully with `bun bd`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-26 20:24:27 -08:00
Nico Cevallos
5715b54614 add test for dependency order when a package's name is larger than 8 characters + fix (#25697)
### What does this PR do?

- Add a test that fails before the code changes, and fix the previous
test by making the dependency's script take a moment to execute. Without
the `setTimeout` in the tests, race conditions made them always succeed.
I tried combining both tests, with dependencies `dep0` and
`larger-than-8-char`, but they succeed if the timeout is the same.
- Fix the added use case by using the correct buffer for
`Dependency.name`; otherwise it contains garbage when the package name
is longer than 8 characters. This should fix #12203

### How did you verify your code works?

Reverted the code changes to verify the new test fails, then confirmed
it passes again with the changes applied.
2025-12-25 23:49:23 -08:00
Jarred Sumner
28fd495b39 Deflake test/js/bun/resolve/load-same-js-file-a-lot.test.ts 2025-12-25 17:43:43 -08:00
SUZUKI Sosuke
699d8b1e1c Upgrade WebKit Dec 24, 2025 (#25684)
- WTFMove → WTF::move / std::move: Replaced WTFMove() macro with
WTF::move() function for WTF types, std::move() for std types
- SortedArrayMap removed: Replaced with if-else chains in
EventFactory.cpp, JSCryptoKeyUsage.cpp
- Wasm::Memory::create signature changed: Removed VM parameter
- URLPattern allocation: Changed from WTF_MAKE_ISO_ALLOCATED to
WTF_MAKE_TZONE_ALLOCATED

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-25 14:00:58 -08:00
robobun
2247c3859a chore: convert .cursor/rules to .claude/skills (#25683)
## Summary
- Migrate Cursor rules to Claude Code skills format
- Add 4 new skills for development guidance:
  - `writing-dev-server-tests`: HMR/dev server test guidance
  - `implementing-jsc-classes-cpp`: C++ JSC class implementation  
  - `implementing-jsc-classes-zig`: Zig JSC bindings generator
  - `writing-bundler-tests`: bundler test guidance with itBundled
- Remove all `.cursor/rules/` files

## Test plan
- [x] Skills follow Claude Code skill authoring guidelines
- [x] Each skill has proper YAML frontmatter with name and description
- [x] Skills are concise and actionable

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-24 23:37:26 -08:00
Jarred Sumner
08e03814e5 [publish images] Fix CI, remove broken freebsd image step 2025-12-24 20:02:56 -08:00
Jarred Sumner
0dd4f025b6 [publish images] (+ add Object.hasOwn benchmark) 2025-12-24 19:55:44 -08:00
Jarred Sumner
79067037ff Add Promise.race microbenchmark 2025-12-23 22:53:24 -08:00
Aiden Cline
822d75a380 fix(@types/bun): add missing autoloadTsconfig and autoloadPackageJson types (#25501)
### What does this PR do?

Adds missing types, fixes typo

### How did you verify your code works?

Add missing types from: 
https://github.com/oven-sh/bun/pull/25340/changes

---------

Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-24 06:47:07 +00:00
SUZUKI Sosuke
bffccf3d5f Upgrade WebKit 2025/12/07 (#25429)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
2025-12-23 22:24:18 -08:00
robobun
0300150324 docs: fix incorrect [env] section documentation in bunfig.toml (#25634)
## Summary
- Fixed documentation that incorrectly claimed you could use `[env]` as
a TOML section to set environment variables directly
- The `env` option in bunfig.toml only controls whether automatic `.env`
file loading is disabled (via `env = false`)
- Updated to show the correct approaches: using preload scripts or
`.env` files with `--env-file`

## Test plan
- Documentation-only change, no code changes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-23 15:31:12 -08:00
robobun
34a1e2adad fix: use LLVM unstable repo for Debian Trixie (13) (#25657)
## Summary

- Fix LLVM installation on Debian Trixie (13) by using the unstable
repository from apt.llvm.org

The `llvm.sh` script doesn't automatically detect that Debian Trixie
needs to use the unstable repository. This is because trixie's `VERSION`
is `13 (trixie)` rather than `testing`, and apt.llvm.org doesn't have a
dedicated trixie repository.

Without this fix, the LLVM installation falls back to Debian's main
repository packages, which don't include `libclang-rt-19-dev` (the
compiler-rt sanitizer libraries) by default. This causes builds with
ASan (AddressSanitizer) to fail with:

```
ld.lld: error: cannot open /usr/lib/llvm-19/lib/clang/19/lib/x86_64-pc-linux-gnu/libclang_rt.asan.a: No such file or directory
```

This was breaking the [Daily Docker
Build](https://github.com/oven-sh/bun-development-docker-image/actions/runs/20437290601)
in the bun-development-docker-image repo.

## Test plan

- [ ] Wait for the PR CI to pass
- [ ] After merging, the next Daily Docker Build should succeed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-22 23:10:15 -08:00
robobun
8484e1b827 perf: shrink ConcurrentTask from 24 bytes to 16 bytes (#25636) 2025-12-22 12:07:24 -08:00
robobun
3898ed5e3f perf: pack boolean flags and reorder fields to reduce struct padding (#25627) 2025-12-21 17:12:42 -08:00
Jarred Sumner
c08ffadf56 perf(linux): add memfd optimizations and typed flags (#25597)
## Summary

- Add `MemfdFlags` enum to replace raw integer flags for `memfd_create`,
providing semantic clarity for different use cases (`executable`,
`non_executable`, `cross_process`)
- Add support for `MFD_EXEC` and `MFD_NOEXEC_SEAL` flags (Linux 6.3+)
with automatic fallback to older kernel flags when `EINVAL` is returned
- Use memfd + `/proc/self/fd/{fd}` path for loading embedded `.node`
files in standalone builds, avoiding disk writes entirely on Linux

## Test plan

- [ ] Verify standalone builds with embedded `.node` files work on Linux
- [ ] Verify fallback works on older kernels (pre-6.3)
- [ ] Verify subprocess stdio memfd still works correctly

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-19 23:18:21 -08:00
Dylan Conway
fa983247b2 fix(create): crash when running postinstall task with --no-install (#25616)
## Summary
- Fix segmentation fault in `bun create` when using `--no-install` with
a template that has a `bun-create.postinstall` task starting with "bun "
- The bug was caused by unconditionally slicing `argv[2..]` which
created an empty array when `npm_client` was null
- Added check for `npm_client != null` before slicing

## Reproduction
```bash
# Create template with bun-create.postinstall
mkdir -p ~/.bun-create/test-template
echo '{"name":"test","bun-create":{"postinstall":"bun install"}}' > ~/.bun-create/test-template/package.json

# This would crash before the fix
bun create test-template /tmp/my-app --no-install
```

## Test plan
- [x] Verified the reproduction case crashes before the fix
- [x] Verified the reproduction case works after the fix

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-19 23:17:51 -08:00
Dylan Conway
99b0a16c33 fix: prevent out-of-bounds access in NO_PROXY parsing (#25617)
## Summary
- Fix out-of-bounds access when parsing `NO_PROXY` environment variable
with empty entries
- Empty entries (e.g., `"localhost, , example.com"`) would cause a panic
when checking if the host starts with a dot
- Skip empty entries after trimming whitespace

fixes BUN-110G
fixes BUN-128V
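
A quick sketch of the trigger (proxy host and URL are placeholders; any request that consults the proxy settings would do):

```ts
process.env.HTTP_PROXY = "http://127.0.0.1:1"; // placeholder proxy so NO_PROXY is consulted
process.env.NO_PROXY = "localhost, , example.com"; // the empty entry used to panic

// Parsing NO_PROXY no longer crashes on the empty entry.
await fetch("http://example.com/").catch(() => {});
console.log("no panic");
```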

## Test plan
- [x] Verify `NO_PROXY="localhost, , example.com"` no longer crashes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-19 23:17:29 -08:00
Dylan Conway
085e25d5d1 fix: protect StringOrBuffer from GC in async operations (#25594)
## Summary

- Fix use-after-free crash in async zstd compression, scrypt, and
JSTranspiler operations
- When `StringOrBuffer.fromJSMaybeAsync` is called with `is_async=true`,
the buffer's JSValue is now protected from garbage collection
- Previously, the buffer could be GC'd while a worker thread was still
accessing it, causing segfaults in zstd's `HIST_count_simple` and
similar functions

Fixes BUN-167Z

## Changes

- `fromJSMaybeAsync`: Call `protect()` on buffer when `is_async=true`
- `fromJSWithEncodingMaybeAsync`: Same protection for the early return
path
- `Scrypt`: Fix cleanup to use `deinitAndUnprotect()` for async path,
add missing `deinit()` in sync path
- `JSTranspiler`: Use new protection mechanism instead of manual
`protect()`/`unprotect()` calls
- Simplify `createOnJSThread` signatures to not return errors (OOM is
handled internally)
- Update all callers to use renamed/simplified APIs

## Test plan

- [x] Code review of all callsites to verify correct protect/unprotect
pairing
- [ ] Run existing zstd tests
- [ ] Run existing scrypt tests
- [ ] Run existing transpiler tests

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-19 17:30:26 -08:00
Jarred Sumner
ce5c336ea5 Revert "fix: memory leaks in IPC message handling (#25602)"
This reverts commit 05b12e0ed0.

The tests did not fail with the system version of Bun.
2025-12-19 17:28:54 -08:00
robobun
05b12e0ed0 fix: memory leaks in IPC message handling (#25602)
## Summary

- Add periodic memory reclamation for IPC buffers after processing
messages
- Fix missing `deref()` on `bun.String` created from `cmd` property in
`handleIPCMessage`
- Add `reclaimMemory()` function to shrink incoming buffer and send
queue when they exceed 2MB capacity
- Track message count to trigger memory reclamation every 256 messages

The incoming `ByteList` buffer and send queue `ArrayList` would grow but
never shrink, causing memory accumulation during sustained IPC
messaging.

## Test plan

- [x] Added regression tests in
`test/js/bun/spawn/spawn-ipc-memory.test.ts`
- [x] Existing IPC tests pass (`spawn.ipc.test.ts`)
- [x] Existing cluster tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-19 17:27:09 -08:00
Angus Comrie
d9459f8540 Fix postgres empty check when handling arrays (#25607)
### What does this PR do?
Closes #25505. This adjusts the byte length check in
`DataCell.fromBytes` to 12 bytes instead of 16, as zero-dimensional
arrays have a shorter preamble.

### How did you verify your code works?
Test suite passes, and I've added a new test that fails in the main
branch but passes with this change. The issue only seems to crop up when
a connection is _reused_, which is curious.
2025-12-19 14:49:12 -08:00
Jarred Sumner
e79b512a9d Propagate debugger CLI config in single-file executables (#25600) 2025-12-19 09:49:02 -08:00
robobun
9902039b1f fix: memory leaks in error-handling code for Brotli, Zstd, and Zlib compression state machines (#25592)
## Summary

Fix several memory leaks in the compression libraries:

- **NativeBrotli/NativeZstd reset()** - Each call to `reset()` allocated
a new encoder/decoder without freeing the previous one
- **NativeBrotli/NativeZstd init() error paths** - If `setParams()`
failed after `stream.init()` succeeded, the instance was leaked
- **NativeZstd init()** - If `setPledgedSrcSize()` failed after context
creation, the context was leaked
- **ZlibCompressorArrayList** - After `deflateInit2_()` succeeded, if
`ensureTotalCapacityPrecise()` failed with OOM, zlib internal state was
never freed
- **NativeBrotli close()** - Now sets state to null to prevent potential
double-free (defensive)
- **LibdeflateState** - Added `deinit()` for API consistency

## Test plan

- [x] Added regression test that calls `reset()` 100k times and measures
memory growth
- [x] Test shows memory growth dropped from ~600MB to ~10MB for Brotli
- [x] Verified no double-frees by tracing code paths
- [x] Existing zlib tests pass (except pre-existing timeout in debug
build)

Before fix (system bun 1.3.3):
```
Memory growth after 100000 reset() calls: 624.38 MB  (BrotliCompress)
Memory growth after 100000 reset() calls: 540.63 MB  (BrotliDecompress)
```

After fix:
```
Memory growth after 100000 reset() calls: 11.84 MB   (BrotliCompress)
Memory growth after 100000 reset() calls: 0.16 MB    (BrotliDecompress)
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 21:42:14 -08:00
Dylan Conway
f3fd7506ef fix(windows): handle UV_UNKNOWN and UV_EAI_* error codes in libuv errno mapping (#25596)
## Summary
- Add missing `UV_UNKNOWN` and `UV_EAI_*` error code mappings to the
`errno()` function in `ReturnCode`
- Fixes panic "integer does not fit in destination type" on Windows when
libuv returns unmapped error codes
- Speculative fix for BUN-131E

## Root Cause
The `errno()` function was missing mappings for `UV_UNKNOWN` (-4094) and
all `UV_EAI_*` address info errors (-3000 to -3014). When libuv returned
these codes, the switch fell through to `else => null`, and the caller
at `sys_uv.zig:317` assumed success and tried to cast the negative
return code to `usize`, causing a panic.

This was triggered in `readFileWithOptions` -> `preadv` when:
- Memory-mapped file operations encounter exceptions (file
modified/truncated by another process, network drive issues)
- Windows returns error codes that libuv cannot map to standard errno
values

## Crash Report
```
Bun v1.3.5 (1e86ceb) on windows x86_64baseline []
panic: integer does not fit in destination type
sys_uv.zig:294: preadv
node_fs.zig:5039: readFileWithOptions
```

## Test plan
- [ ] This fix prevents a panic, converting it to a proper error.
Testing would require triggering `UV_UNKNOWN` from libuv, which is
difficult to do reliably (requires memory-mapped file exceptions or
unusual Windows errors).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-18 21:41:41 -08:00
robobun
c21c51a0ff test(security-scanner): add TTY prompt tests using Bun.Terminal (#25587)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-19 05:21:44 +00:00
robobun
0bbf6c74b5 test: add describe blocks for grouping in bun-types.test.ts (#25598)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-19 04:53:18 +00:00
Dylan Conway
57cbbc09e4 fix: correct off-by-one bounds checks in bundler and package installer (#25582)
## Summary

- Fix two off-by-one bounds check errors that used `>` instead of `>=`
- Both bugs could cause undefined behavior (array out-of-bounds access)
when an index equals the array length

## The Bugs

### 1. `src/install/postinstall_optimizer.zig:62`

```zig
// Before (buggy):
if (resolution > metas.len) continue;
const meta: *const Meta = &metas[resolution];  // Out-of-bounds when resolution == metas.len

// After (fixed):
if (resolution >= metas.len) continue;
```

### 2. `src/bundler/linker_context/doStep5.zig:10`

```zig
// Before (buggy):
if (id > c.graph.meta.len) return;
const resolved_exports = &c.graph.meta.items(.resolved_exports)[id];  // Out-of-bounds when id == c.graph.meta.len

// After (fixed):
if (id >= c.graph.meta.len) return;
```

## Why These Are Bugs

Valid array indices are `0` to `len - 1`. When `index == len`:
- `index > len` evaluates to `false` → check passes
- `array[index]` accesses `array[len]` → out-of-bounds / undefined
behavior

## Codebase Patterns

The rest of the codebase correctly uses `>=` for these checks:
- `lockfile.zig:484`: `if (old_resolution >= old.packages.len)
continue;`
- `lockfile.zig:522`: `if (old_resolution >= old.packages.len)
continue;`
- `LinkerContext.zig:389`: `if (source_index >= import_records_list.len)
continue;`
- `LinkerContext.zig:1667`: `if (source_index >= c.graph.ast.len) {`

## Test plan

- [x] Verified fix aligns with existing codebase patterns
- [ ] CI passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 18:04:28 -08:00
Jarred Sumner
7f589ffb4b Disable coderabbit enrichment 2025-12-18 18:03:23 -08:00
Francis F
cea59d7fc0 docs(sqlite): fix .run() return value documentation (#25060)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-18 20:44:35 +00:00
Jarred Sumner
4ea1454e4a Delete unused workflow 2025-12-18 12:04:28 -08:00
Dylan Conway
8941a363c3 fix: dupe ca string in .npmrc to prevent use-after-free (#25563)
## Summary

- Fix use-after-free bug when parsing `ca` option from `.npmrc`
- The `ca` string was being stored directly from the parser's arena
without duplication
- Since the parser arena is freed at the end of `loadNpmrc`, this
created a dangling pointer

## The Bug

In `src/ini.zig`, the `ca` string wasn't being duplicated like all other
string properties:

```zig
// Lines 983-986 explicitly warn about this:
// Need to be very, very careful here with strings.
// They are allocated in the Parser's arena, which of course gets
// deinitialized at the end of the scope.
// We need to dupe all strings

// Line 981: Parser arena is freed here
defer parser.deinit();

// Line 1016-1020: THE BUG - string not duped!
if (out.asProperty("ca")) |query| {
    if (query.expr.asUtf8StringLiteral()) |str| {
        install.ca = .{
            .str = str,  // ← Dangling pointer after parser.deinit()!
        };
```

All other string properties in the same function correctly duplicate:
- `registry` (line 996): `try allocator.dupe(u8, str)`
- `cache` (line 1002): `try allocator.dupe(u8, str)`
- `cafile` (line 1037): `asStringCloned(allocator)`
- `ca` array items (line 1026): `asStringCloned(allocator)`

## User Impact

When a user has `ca=<certificate>` in their `.npmrc` file:
1. The certificate string is parsed and stored
2. The parser arena is freed
3. `install.ca.str` becomes a dangling pointer
4. Later TLS/SSL operations access freed memory
5. Could cause crashes, undefined behavior, or security issues

## Test plan

- Code inspection confirms this matches the pattern used for all other
string properties
- The fix adds `try allocator.dupe(u8, str)` to match `cache`,
`registry`, etc.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 19:56:25 -08:00
Dylan Conway
722ac3aa5a fix: check correct variable in subprocess stdin cleanup (#25562)
## Summary

- Fix typo in `onProcessExit` where `existing_stdin_value.isCell()` was
checked instead of `existing_value.isCell()`
- Since `existing_stdin_value` is always `.zero` at that point, the
condition was always false, making the inner block dead code

## The Bug

In `src/bun.js/api/bun/subprocess.zig:593`:

```zig
var existing_stdin_value = jsc.JSValue.zero;  // Line 590 - always .zero
if (this_jsvalue != .zero) {
    if (jsc.Codegen.JSSubprocess.stdinGetCached(this_jsvalue)) |existing_value| {
        if (existing_stdin_value.isCell()) {  // BUG! Should be existing_value
            // This block was DEAD CODE - never executed
```

Compare with the correct pattern used elsewhere:
```zig
// shell/subproc.zig:251-252 (CORRECT)
if (jsc.Codegen.JSSubprocess.stdinGetCached(subprocess.this_jsvalue)) |existing_value| {
    jsc.WebCore.FileSink.JSSink.setDestroyCallback(existing_value, 0);  // Uses existing_value
}
```

## Impact

The dead code prevented:
- Recovery of stdin from cached JS value when `weak_file_sink_stdin_ptr`
is null
- Proper cleanup via `onAttachedProcessExit` on the FileSink  
- `setDestroyCallback` cleanup in `onProcessExit`

Note: The user-visible impact was mitigated by redundant cleanup paths
in `Writable.zig` that also call `setDestroyCallback`.

## Test plan

- Code inspection confirms this is a straightforward typo fix
- Existing subprocess tests continue to pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 18:34:58 -08:00
Dylan Conway
a333d02f84 fix: correct inverted buffer allocation logic in Postgres array parsing (#25564)
## Summary

- Fix inverted buffer allocation logic when parsing strings in Postgres
arrays
- Strings larger than 16KB were incorrectly using the stack buffer
instead of dynamically allocating
- This caused spurious `InvalidByteSequence` errors for valid data

## The Bug

In `src/sql/postgres/DataCell.zig`, the condition for when to use
dynamic allocation was inverted:

```zig
// BEFORE (buggy):
const needs_dynamic_buffer = str_bytes.len < stack_buffer.len;  // TRUE when SMALL

// AFTER (fixed):
const needs_dynamic_buffer = str_bytes.len > stack_buffer.len;  // TRUE when LARGE
```

## What happened with large strings (>16KB):

1. `needs_dynamic_buffer` = false (e.g., 20000 < 16384 is false)
2. Uses `stack_buffer[0..]` which is only 16KB
3. `unescapePostgresString` hits bounds check and returns
`BufferTooSmall`
4. Error converted to `InvalidByteSequence`
5. User gets error even though data is valid

## User Impact

Users with Postgres arrays containing JSON or string elements larger
than 16KB would get spurious InvalidByteSequence errors even though
their data was perfectly valid.

## Test plan

- Code inspection confirms the logic was inverted
- The fix aligns with the intended behavior: use stack buffer for small
strings, dynamic allocation for large strings

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 18:34:17 -08:00
Dylan Conway
c1acb0b9a4 fix(shell): prevent double-close of fd when using &> redirect with builtins (#25568)
## Summary

- Fix double-close of file descriptor when using `&>` redirect with
shell builtin commands
- Add `dupeRef()` helper for cleaner reference counting semantics
- Add tests for `&>` and `&>>` redirects with builtins

## Test plan

- [x] Added tests in `test/js/bun/shell/file-io.test.ts` that reproduce
the bug
- [x] All file-io tests pass

## The Bug

When using `&>` to redirect both stdout and stderr to the same file with
a shell builtin command (e.g., `pwd &> file.txt`), the code was creating
two separate `IOWriter` instances that shared the same file descriptor.
When both `IOWriter`s were destroyed, they both tried to close the same
fd, causing an `EBADF` (bad file descriptor) error.

```javascript
import { $ } from "bun";
await $`pwd &> output.txt`; // Would crash with EBADF
```

## The Fix

1. Share a single `IOWriter` between stdout and stderr when both are
redirected to the same file, with proper reference counting
2. Rename `refSelf` to `dupeRef` for clarity across `IOReader`,
`IOWriter`, `CowFd`, and add it to `Blob` for consistency
3. Fix the `Body.Value` blob case to also properly reference count when
the same blob is assigned to multiple outputs

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Latest model <noreply@anthropic.com>
2025-12-17 18:33:53 -08:00
Jarred Sumner
ffd2240c31 Bump 2025-12-17 11:42:54 -08:00
190n
fa5a5bbe55 fix: v8::Value::IsInt32()/IsUint32() edge cases (#25548)
### What does this PR do?

- fixes both functions returning false for double-encoded values (even
if the numeric value is a valid int32/uint32)
- fixes IsUint32() returning false for values that don't fit in int32
- fixes the test from #22462 not testing anything (the native functions
were being passed a callback to run garbage collection as the first
argument, so it was only ever testing what the type check APIs returned
for that function)
- extends the test to cover the first edge case above

### How did you verify your code works?

The new tests fail without these fixes.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-17 00:52:16 -08:00
Dylan Conway
1e86cebd74 Add bun_version to link metadata (#25545)
## Summary
- Add `bun_version` field to `link-metadata.json`
- Pass `VERSION` CMake variable to the metadata script as `BUN_VERSION`
env var

This ensures the build version is captured in the link metadata JSON
file, which is useful for tracking which version produced a given build
artifact.

## Test plan
- Build with `bun bd` and verify `link-metadata.json` includes
`bun_version`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-16 19:53:05 -08:00
robobun
bc47f87450 fix(ini): support env var expansion in quoted .npmrc values (#25518)
## Summary

Fixes environment variable expansion in quoted `.npmrc` values and adds
support for the `?` optional modifier.

### Changes

**Simplified quoted value handling:**
- Removed unnecessary `isProperlyQuoted` check that added complexity
without benefit
- When JSON.parse succeeds for quoted strings, expand env vars in the
result
- When JSON.parse fails for single-quoted strings like `'${VAR}'`, still
expand env vars

**Added `?` modifier support (matching npm behavior):**
- `${VAR}` - if VAR is undefined, leaves as `${VAR}` (no expansion)
- `${VAR?}` - if VAR is undefined, expands to empty string

This applies consistently to both quoted and unquoted values.

### Examples

```ini
# Env var found - all expand to the value
token = ${NPM_TOKEN}
token = "${NPM_TOKEN}"
token = '${NPM_TOKEN}'

# Env var NOT found - left as-is
token = ${NPM_TOKEN}         # → ${NPM_TOKEN}
token = "${NPM_TOKEN}"       # → ${NPM_TOKEN}
token = '${NPM_TOKEN}'       # → ${NPM_TOKEN}

# Optional modifier (?) - expands to empty if not found
token = ${NPM_TOKEN?}        # → (empty)
token = "${NPM_TOKEN?}"      # → (empty)
auth = "Bearer ${TOKEN?}"    # → Bearer 
```

### Test Plan

- Added 8 new tests for the `?` modifier covering quoted and unquoted
values
- Verified all expected values match `npm config get` behavior
- All 30 ini tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-12-16 19:49:23 -08:00
Dylan Conway
698b004ea4 Add step in CI to upload link metadata (#25448)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-16 14:30:10 -08:00
robobun
b135c207ed fix(yaml): remove YAML 1.1 legacy boolean values for YAML 1.2 compliance (#25537)
## Summary

- Remove YAML 1.1 legacy boolean values (`yes/no/on/off/y/Y`) that are
not part of the YAML 1.2 Core Schema
- Keep YAML 1.2 Core Schema compliant values: `true/True/TRUE`,
`false/False/FALSE`, `null/Null/NULL`, `0x` hex, `0o` octal
- Add comprehensive roundtrip tests for YAML 1.2 compliance

**Removed (now parsed as strings):**
- `yes`, `Yes`, `YES` (were `true`)
- `no`, `No`, `NO` (were `false`)
- `on`, `On`, `ON` (were `true`)
- `off`, `Off`, `OFF` (were `false`)
- `y`, `Y` (were `true`)

This fixes a common pain point where GitHub Actions workflow files with
`on:` keys would have the key parsed as boolean `true` instead of the
string `"on"`.
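
A before/after sketch, assuming Bun's `Bun.YAML.parse` API:

```ts
const doc = Bun.YAML.parse(`
on: push
answer: yes
enabled: true
`) as Record<string, unknown>;

// YAML 1.1 behavior: { true: "push", answer: true, enabled: true }
// YAML 1.2 behavior: { on: "push", answer: "yes", enabled: true }
console.log(doc);
```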

## YAML 1.2 Core Schema Specification

From [YAML 1.2.2 Section 10.3.2 Tag
Resolution](https://yaml.org/spec/1.2.2/#1032-tag-resolution):

| Regular expression | Resolved to tag |
|-------------------|-----------------|
| `null \| Null \| NULL \| ~` | tag:yaml.org,2002:null |
| `/* Empty */` | tag:yaml.org,2002:null |
| `true \| True \| TRUE \| false \| False \| FALSE` |
tag:yaml.org,2002:bool |
| `[-+]? [0-9]+` | tag:yaml.org,2002:int (Base 10) |
| `0o [0-7]+` | tag:yaml.org,2002:int (Base 8) |
| `0x [0-9a-fA-F]+` | tag:yaml.org,2002:int (Base 16) |
| `[-+]? ( \. [0-9]+ \| [0-9]+ ( \. [0-9]* )? ) ( [eE] [-+]? [0-9]+ )?`
| tag:yaml.org,2002:float |
| `[-+]? ( \.inf \| \.Inf \| \.INF )` | tag:yaml.org,2002:float
(Infinity) |
| `\.nan \| \.NaN \| \.NAN` | tag:yaml.org,2002:float (Not a number) |

Note: `yes`, `no`, `on`, `off`, `y`, `n` are **not** in the YAML 1.2
Core Schema boolean list. These were removed from YAML 1.1 as noted in
[YAML 1.2 Section 1.2](https://yaml.org/spec/1.2.2/#12-yaml-history):

> The YAML 1.2 specification was published in 2009. Its primary focus
was making YAML a strict superset of JSON. **It also removed many of the
problematic implicit typing recommendations.**

## Test plan

- [x] Updated existing YAML tests to reflect YAML 1.2 Core Schema
behavior
- [x] Added roundtrip tests (stringify → parse) for YAML 1.2 compliance
- [x] Verified tests fail with system Bun (YAML 1.1 behavior) and pass
with debug build (YAML 1.2)
- [x] Run `bun bd test test/js/bun/yaml/yaml.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-16 14:29:39 -08:00
Ciro Spaciari
a1dd26d7db fix(usockets) fix last_write_failed flag (#25496)
https://github.com/oven-sh/bun/pull/25361 needs to be merged before this
PR

## Summary
- Move `last_write_failed` flag from loop-level to per-socket flag for
correctness

## Changes

- Move `last_write_failed` from `loop->data` to `socket->flags`
- More semantically correct since write status is per-socket, not
per-loop

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-16 14:26:42 -08:00
Christian Rishøj
7c06320d0f fix(ws): fix zlib version mismatch on Windows (segfault) (#25538)
## Summary

Fixes #24593 - WebSocket segfault on Windows when publishing large
messages with `perMessageDeflate: true`.
Also fixes #21028 (duplicate issue).
Also closes #25457 (alternative PR).

**Root cause:** 

On Windows, the C++ code was compiled against system zlib headers
(1.3.1) but linked against Bun's vendored Cloudflare zlib (1.2.8).

This version mismatch caused `deflateInit2()` to return
`Z_VERSION_ERROR` (-6), leaving the deflate stream in an invalid state.
All subsequent `deflate()` calls returned `Z_STREAM_ERROR` (-2),
producing zero output, which then caused an integer underflow when
subtracting the 4-byte trailer → segfault in memcpy.

**Fix:** 

Add `${VENDOR_PATH}/zlib` to the C++ include paths in
`cmake/targets/BuildBun.cmake`. This ensures the vendored zlib headers
are found before system headers, maintaining header/library version
consistency.

This is a simpler alternative to #25457 which worked around the issue by
using libdeflate exclusively.
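
A condensed sketch of the reproduction (the real tests live in `test/regression/issue/24593.test.ts`):

```ts
const server = Bun.serve({
  port: 0,
  fetch(req, srv) {
    if (srv.upgrade(req)) return;
    return new Response("expected a WebSocket", { status: 400 });
  },
  websocket: {
    perMessageDeflate: true,
    open(ws) {
      ws.subscribe("room");
      // Publishing a large compressed message used to segfault on Windows.
      server.publish("room", JSON.stringify({ data: "x".repeat(109_000) }));
    },
    message() {},
  },
});

const ws = new WebSocket(`ws://localhost:${server.port}`);
ws.onmessage = (e) => {
  console.log("received", (e.data as string).length, "bytes");
  ws.close();
  server.stop();
};
```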

## Test plan

- [x] Added regression test `test/regression/issue/24593.test.ts` with 4
test cases:
  - Large ~109KB JSON message publish (core reproduction)
  - Multiple rapid publishes (buffer corruption)
  - Broadcast to multiple subscribers
  - Messages at CORK_BUFFER_SIZE boundary (16KB)
- [x] Tests pass on Windows (was crashing before fix)
- [x] Tests pass on macOS

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-16 14:23:29 -08:00
robobun
dd04c57258 feat: implement V8 Value type checking APIs (#22462)
## Summary

This PR implements four V8 C++ API methods for type checking that are
commonly used by native Node.js modules:
- `v8::Value::IsMap()` - checks if value is a Map
- `v8::Value::IsArray()` - checks if value is an Array  
- `v8::Value::IsInt32()` - checks if value is a 32-bit integer
- `v8::Value::IsBigInt()` - checks if value is a BigInt

## Implementation Details

The implementation maps V8's type checking APIs to JavaScriptCore's
equivalent functionality:
- `IsMap()` uses JSC's `inherits<JSC::JSMap>()` check
- `IsArray()` uses JSC's `isArray()` function with the global object
- `IsInt32()` uses JSC's `isInt32()` method  
- `IsBigInt()` uses JSC's `isBigInt()` method

## Changes

- Added method declarations to `V8Value.h`
- Implemented the methods in `V8Value.cpp` 
- Added symbol exports to `napi.zig` (both Unix and Windows mangled
names)
- Added symbols to `symbols.txt` and `symbols.dyn`
- Added comprehensive tests in `v8-module/main.cpp` and `v8.test.ts`

## Testing

The implementation has been verified to:
- Compile successfully without errors
- Export the correct symbols in the binary
- Follow established patterns in the V8 compatibility layer

Tests cover various value types including empty and populated
Maps/Arrays, different numeric ranges, BigInts, and other JavaScript
types.

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 19:50:11 -08:00
robobun
344b2c1dfe fix: Response.clone() no longer locks body when body was accessed before clone (#25484)
## Summary
- Fix bug where `Response.clone()` would lock the original response's
body when `response.body` was accessed before cloning
- Apply the same fix to `Request.clone()`

## Root Cause
When `response.body` was accessed before calling `response.clone()`, the
original response's body would become locked after cloning. This
happened because:

1. When the cloned response was wrapped with `toJS()`,
`checkBodyStreamRef()` was called which moved the stream from
`Locked.readable` to `js.gc.stream` and cleared `Locked.readable`
2. The subsequent code tried to get the stream from `Locked.readable`,
which was now empty, so the body cache update was skipped
3. The JavaScript-level body property cache still held the old locked
stream

## Fix
Updated the cache update logic to:
1. For the cloned response: use `js.gc.stream.get()` instead of
`Locked.readable.get()` since `toJS()` already moved the stream
2. For the original response: use `Locked.readable.get()` which still
holds the teed stream since `checkBodyStreamRef` hasn't been called yet

## Reproduction
```javascript
const readableStream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("Hello, world!"));
    controller.close();
  },
});

const response = new Response(readableStream);
console.log(response.body?.locked); // Accessing body before clone
const cloned = response.clone();
console.log(response.body?.locked); // Expected: false, Actual: true 
console.log(cloned.body?.locked);   // Expected: false, Actual: false 
```

## Test plan
- [x] Added regression tests for `Response.clone()` in
`test/js/web/fetch/response.test.ts`
- [x] Added regression test for `Request.clone()` in
`test/js/web/request/request.test.ts`
- [x] Verified tests fail with system bun (before fix) and pass with
debug build (after fix)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 18:46:02 -08:00
Ciro Spaciari
aef0b5b4a6 fix(usockets): safely handle socket reallocation during context adoption (#25361)
## Summary
- Fix use-after-free vulnerability during socket adoption by properly
tracking reallocated sockets
- Add safety checks to prevent linking closed sockets to context lists
- Properly track socket state with new `is_closed`, `adopted`, and
`is_tls` flags

## What does this PR do?

This PR improves event loop stability by addressing potential
use-after-free issues that can occur when sockets are reallocated during
adoption (e.g., when upgrading a TCP socket to TLS).

### Key Changes

**Socket State Tracking
([internal.h](packages/bun-usockets/src/internal/internal.h))**
- Added `is_closed` flag to explicitly track when a socket has been
closed
- Added `adopted` flag to mark sockets that were reallocated during
context adoption
- Added `is_tls` flag to track TLS socket state for proper low-priority
queue handling

**Safe Socket Adoption
([context.c](packages/bun-usockets/src/context.c))**
- When `us_poll_resize()` returns a new pointer (reallocation occurred),
the old socket is now:
  - Marked as closed (`is_closed = 1`)
  - Added to the closed socket cleanup list
  - Marked as adopted (`adopted = 1`)
  - Has its `prev` pointer set to the new socket for event redirection
- Added guards to
`us_internal_socket_context_link_socket/listen_socket/connecting_socket`
to prevent linking already-closed sockets

**Event Loop Handling ([loop.c](packages/bun-usockets/src/loop.c))**
- After callbacks that can trigger socket adoption (`on_open`,
`on_writable`, `on_data`), the event loop now checks if the socket was
reallocated and redirects to the new socket
- Low-priority socket handling now properly checks `is_closed` state and
uses `is_tls` flag for correct SSL handling

**Poll Resize Safety
([epoll_kqueue.c](packages/bun-usockets/src/eventing/epoll_kqueue.c))**
- Changed `us_poll_resize()` to always allocate new memory with
`us_calloc()` instead of `us_realloc()` to ensure the old pointer
remains valid for cleanup
- Now takes `old_ext_size` parameter to correctly calculate memory sizes
- Re-enabled `us_internal_loop_update_pending_ready_polls()` call in
`us_poll_change()` to ensure pending events are properly redirected

### How did you verify your code works?
Ran existing CI and the existing socket-upgrade tests under an ASAN build.
2025-12-15 18:43:51 -08:00
robobun
740fb23315 fix(windows): improve bunx metadata validation (#25012)
## Summary

- Improved validation for bunx metadata files on Windows
- Added graceful error handling for malformed metadata instead of
crashing
- Added regression test for the fix

## Test plan

- [x] Run `bun bd test test/cli/install/bunx.test.ts -t "should not
crash on corrupted"`
- [x] Manual testing with corrupted `.bunx` files
- [x] Verified normal operation still works

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 18:37:09 -08:00
robobun
2dd997c4b5 fix(node): support duplicate dlopen calls with DLHandleMap (#24404)
## Summary

Fixes an issue where loading the same native module
(NODE_MODULE_CONTEXT_AWARE) multiple times would fail with:
```
symbol 'napi_register_module_v1' not found in native module
```

Fixes https://github.com/oven-sh/bun/issues/23136
Fixes https://github.com/oven-sh/bun/issues/21432

## Root Cause

When a native module is loaded for the first time:
1. `dlopen()` loads the shared library
2. Static constructors run and call `node_module_register()`
3. The module registers successfully

On subsequent loads of the same module:
1. `dlopen()` returns the same handle (library already loaded)
2. Static constructors **do not run again**
3. No registration occurs, leading to the "symbol not found" error

## Solution

Implemented a thread-safe `DLHandleMap` to cache and replay module
registrations:

1. **Thread-local storage** captures the `node_module*` during static
constructor execution
2. **After successful first load**, save the registration to the global
map
3. **On subsequent loads**, look up the cached registration and replay
it

This approach matches Node.js's `global_handle_map` implementation.
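
For illustration, a minimal sketch of the scenario this fixes (the addon path is hypothetical):

```ts
const addonPath = "./build/Release/addon.node"; // hypothetical prebuilt addon

// First load: dlopen() runs static constructors, which register the module.
const first = { exports: {} as Record<string, unknown> };
process.dlopen(first, addonPath);

// Second load: dlopen() returns the same handle and the constructors do not
// re-run, so the cached registration is replayed from DLHandleMap.
const second = { exports: {} as Record<string, unknown> };
process.dlopen(second, addonPath);

// Previously the second call threw "symbol 'napi_register_module_v1' not
// found"; now both exports objects are populated.
console.log(Object.keys(first.exports), Object.keys(second.exports));
```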

## Changes

- Created `src/bun.js/bindings/DLHandleMap.h` - thread-safe singleton
cache
- Added thread-local storage in `src/bun.js/bindings/v8/node.cpp`
- Modified `src/bun.js/bindings/BunProcess.cpp` to save/lookup cached
modules
- Also includes the exports fix (using `toObject()` to match Node.js
behavior)

## Test Plan

Added `test/js/node/process/dlopen-duplicate-load.test.ts` with tests
that:
- Build a native addon using node-gyp
- Load it twice with `process.dlopen`
- Verify both loads succeed
- Test with different exports objects

All tests pass.

## Related Issue

Fixes the second bug discovered in the segfault investigation.

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 18:35:26 -08:00
robobun
4061e1cb4f fix: handle EINVAL from copy_file_range on eCryptfs (#25534)
## Summary
- Add `EINVAL` and `OPNOTSUPP` to the list of errors that trigger
fallback from `copy_file_range` to `sendfile`/read-write loop
- Fixes `Bun.write` and `fs.copyFile` failing on eCryptfs filesystems
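
For reference, the two affected call sites (no user-side change is needed; a sketch assuming `src.txt` exists):

```ts
import { copyFile } from "node:fs/promises";

// Both now fall back to sendfile or a read-write loop when the kernel
// returns EINVAL from copy_file_range (e.g., on eCryptfs):
await Bun.write("dest-1.txt", Bun.file("src.txt"));
await copyFile("src.txt", "dest-2.txt");
```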

## Test plan
- [x] Existing `copyFile` tests pass (`bun bd test
test/js/node/fs/fs.test.ts -t "copyFile"`)
- [x] Existing `copy_file_range` fallback tests pass (`bun bd test
test/js/bun/io/bun-write.test.js -t "should work when copyFileRange is
not available"`)

Fixes #13968

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 17:47:08 -08:00
robobun
6386eef8aa fix(bunx): handle empty string arguments on Windows (#25025)
## Summary

Fixes #13316
Fixes #18275

Running `bunx cowsay ""` (or any package with an empty string argument)
on Windows caused a panic. Additionally, `bunx concurrently "command
with spaces"` was splitting quoted arguments incorrectly.

**Repro #13316:**
```bash
bunx cowsay ""
# panic(main thread): reached unreachable code
```

**Repro #18275:**
```bash
bunx concurrently "bun --version" "bun --version"
# Only runs once, arguments split incorrectly
# Expected: ["bun --version", "bun --version"]
# Actual: ["bun", "--version", "bun", "--version"]
```

## Root Cause

The bunx fast path on Windows bypasses libuv and calls `CreateProcessW`
directly to save 5-12ms. The command line building logic had two issues:

1. **Empty strings**: Not quoted at all, resulting in invalid command
line
2. **Arguments with spaces**: Not quoted, causing them to be split into
multiple arguments

## Solution

Implement Windows command-line argument quoting using libuv's proven
algorithm:
- Port of libuv's `quote_cmd_arg` function (process backwards + reverse)
- Empty strings become `""`
- Strings with spaces/tabs/quotes are wrapped in quotes
- Backslashes before quotes are properly escaped per Windows rules

**Why not use libuv directly?**
- Normal `Bun.spawn()` uses `uv_spawn()` which handles quoting
internally
- bunx fast path bypasses libuv to save 5-12ms (calls `CreateProcessW`
directly)
- libuv's `quote_cmd_arg` is a static function (not exported)
- Solution: port the algorithm to Zig
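
For reference, a TypeScript sketch of the quoting rules described above (Bun's actual implementation is the Zig port of libuv's `quote_cmd_arg`; this forward-scanning version is equivalent):

```ts
function quoteCmdArg(arg: string): string {
  if (arg.length === 0) return '""'; // empty strings must become ""
  if (!/[ \t"]/.test(arg)) return arg; // nothing needs escaping

  let out = '"';
  let backslashes = 0;
  for (const ch of arg) {
    if (ch === "\\") {
      backslashes++;
      continue;
    }
    if (ch === '"') {
      // Backslashes preceding a quote are doubled, then the quote is escaped.
      out += "\\".repeat(backslashes * 2 + 1) + '"';
    } else {
      out += "\\".repeat(backslashes) + ch;
    }
    backslashes = 0;
  }
  // Trailing backslashes are doubled so they don't escape the closing quote.
  return out + "\\".repeat(backslashes * 2) + '"';
}

console.log(quoteCmdArg(""));              // ""
console.log(quoteCmdArg("bun --version")); // "bun --version" (stays one argument)
```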

## Test Plan

- [x] Added regression test for empty strings (#13316)
- [x] Added regression test for arguments with spaces (#18275)
- [x] Verified system bun (v1.3.3) fails both tests
- [x] Verified fix passes both tests
- [x] Implementation based on battle-tested libuv algorithm

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-15 17:29:04 -08:00
robobun
3394fd3bdd fix(node:url): return empty string for invalid domains in domainToASCII/domainToUnicode (#25196)
## Summary
- Fixes `url.domainToASCII` and `url.domainToUnicode` to return empty
string instead of throwing `TypeError` when given invalid domains
- Per Node.js docs: "if `domain` is an invalid domain, the empty string
is returned"

## Test plan
- [x] Run `bun bd test test/regression/issue/24191.test.ts` - all 2
tests pass
- [x] Verify tests fail with system Bun (`USE_SYSTEM_BUN=1`) to confirm
fix validity
- [x] Manual verification: `url.domainToASCII('xn--iñvalid.com')`
returns `""`

## Example

Before (bug):
```
$ bun -e "import url from 'node:url'; console.log(url.domainToASCII('xn--iñvalid.com'))"
TypeError: domainToASCII failed
```

After (fixed):
```
$ bun -e "import url from 'node:url'; console.log(url.domainToASCII('xn--iñvalid.com'))"
(empty string output)
```

Closes #24191

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 17:26:32 -08:00
robobun
5a8cdc08f0 docs(bundler): add comprehensive Bun.build compile API documentation (#25536) 2025-12-15 16:20:50 -08:00
Jarred Sumner
dcc3386611 [publish images] 2025-12-15 15:34:04 -08:00
robobun
8dc79641c8 fix(http): support proxy passwords longer than 4096 characters (#25530)
## Summary
- Fixes silent 401 Unauthorized errors when using proxies with long
passwords (e.g., JWT tokens > 4096 chars)
- Bun was silently dropping proxy passwords exceeding 4095 characters,
falling through to code that only encoded the username

## Changes
- Added `PercentEncoding.decodeWithFallback` which uses a 4KB stack
buffer for the common case and falls back to heap allocation only for
larger inputs
- Updated proxy auth encoding in `AsyncHTTP.zig` to use the new fallback
method

## Test plan
- [x] Added test case that verifies passwords > 4096 chars are handled
correctly
- [x] Test fails with system bun (v1.3.3), passes with this fix
- [x] All 29 proxy tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 13:21:41 -08:00
robobun
d865ef41e2 feat: add Bun.Terminal API for pseudo-terminal (PTY) support (#25415)
## Summary

This PR adds a new `Bun.Terminal` API for creating and managing
pseudo-terminals (PTYs), enabling interactive terminal applications in
Bun.

### Features

- **Standalone Terminal**: Create PTYs directly with `new
Bun.Terminal(options)`
- **Spawn Integration**: Spawn processes with PTY attached via
`Bun.spawn({ terminal: options })`
- **Full PTY Control**: Write data, resize, set raw mode, and handle
callbacks

## Examples

### Basic Terminal with Spawn (Recommended)

```typescript
const proc = Bun.spawn(["bash"], {
  terminal: {
    cols: 80,
    rows: 24,
    data(terminal, data) {
      // Handle output from the terminal
      process.stdout.write(data);
    },
    exit(terminal, code, signal) {
      console.log(`Process exited with code ${code}`);
    },
  },
});

// Write commands to the terminal
proc.terminal.write("echo Hello from PTY!\n");
proc.terminal.write("exit\n");

await proc.exited;
proc.terminal.close();
```

### Interactive Shell

```typescript
// Create an interactive shell that mirrors to stdout
const proc = Bun.spawn(["bash", "-i"], {
  terminal: {
    cols: process.stdout.columns || 80,
    rows: process.stdout.rows || 24,
    data(term, data) {
      process.stdout.write(data);
    },
  },
});

// Forward stdin to the terminal
process.stdin.setRawMode(true);
for await (const chunk of process.stdin) {
  proc.terminal.write(chunk);
}
```

### Running Interactive Programs (vim, htop, etc.)

```typescript
const proc = Bun.spawn(["vim", "file.txt"], {
  terminal: {
    cols: process.stdout.columns,
    rows: process.stdout.rows,
    data(term, data) {
      process.stdout.write(data);
    },
  },
});

// Handle terminal resize
process.stdout.on("resize", () => {
  proc.terminal.resize(process.stdout.columns, process.stdout.rows);
});

// Forward input
process.stdin.setRawMode(true);
for await (const chunk of process.stdin) {
  proc.terminal.write(chunk);
}
```

### Capturing Colored Output

```typescript
const chunks: Uint8Array[] = [];

const proc = Bun.spawn(["ls", "--color=always"], {
  terminal: {
    data(term, data) {
      chunks.push(data);
    },
  },
});

await proc.exited;
proc.terminal.close();

// Output includes ANSI color codes
const output = Buffer.concat(chunks).toString();
console.log(output);
```

### Standalone Terminal (Advanced)

```typescript
const terminal = new Bun.Terminal({
  cols: 80,
  rows: 24,
  data(term, data) {
    console.log("Received:", data.toString());
  },
});

// Use terminal.stdin as the fd for child process stdio
const proc = Bun.spawn(["bash"], {
  stdin: terminal.stdin,
  stdout: terminal.stdin,
  stderr: terminal.stdin,
});

terminal.write("echo hello\n");

// Clean up
terminal.close();
```

### Testing TTY Detection

```typescript
const proc = Bun.spawn([
  "bun", "-e", 
  "console.log('isTTY:', process.stdout.isTTY)"
], {
  terminal: {},
});

// Output: isTTY: true
```

## API

### `Bun.spawn()` with `terminal` option

```typescript
const proc = Bun.spawn(cmd, {
  terminal: {
    cols?: number,        // Default: 80
    rows?: number,        // Default: 24  
    name?: string,        // Default: "xterm-256color"
    data?: (terminal: Terminal, data: Uint8Array) => void,
    exit?: (terminal: Terminal, code: number, signal: string | null) => void,
    drain?: (terminal: Terminal) => void,
  }
});

// Access the terminal
proc.terminal.write(data);
proc.terminal.resize(cols, rows);
proc.terminal.setRawMode(enabled);
proc.terminal.close();

// Note: proc.stdin, proc.stdout, proc.stderr return null when terminal is used
```

### `new Bun.Terminal(options)`

```typescript
const terminal = new Bun.Terminal({
  cols?: number,
  rows?: number,
  name?: string,
  data?: (terminal, data) => void,
  exit?: (terminal, code, signal) => void,
  drain?: (terminal) => void,
});

terminal.stdin;   // Slave fd (for child process)
terminal.stdout;  // Master fd (for reading)
terminal.closed;  // boolean
terminal.write(data);
terminal.resize(cols, rows);
terminal.setRawMode(enabled);
terminal.ref();
terminal.unref();
terminal.close();
await terminal[Symbol.asyncDispose]();
```

## Implementation Details

- Uses `openpty()` to create pseudo-terminal pairs
- Properly manages file descriptor lifecycle with reference counting
- Integrates with Bun's event loop via `BufferedReader` and
`StreamingWriter`
- Supports `await using` syntax for automatic cleanup
- POSIX only (Linux, macOS) - not available on Windows

## Test Results

- 80 tests passing
- Covers: construction, writing, reading, resize, raw mode, callbacks,
spawn integration, error handling, GC safety

## Changes

- `src/bun.js/api/bun/Terminal.zig` - Terminal implementation
- `src/bun.js/api/bun/Terminal.classes.ts` - Class definition for
codegen
- `src/bun.js/api/bun/subprocess.zig` - Added terminal field and getter
- `src/bun.js/api/bun/js_bun_spawn_bindings.zig` - Terminal option
parsing
- `src/bun.js/api/BunObject.classes.ts` - Terminal getter on Subprocess
- `packages/bun-types/bun.d.ts` - TypeScript types
- `docs/runtime/child-process.mdx` - Documentation
- `test/js/bun/terminal/terminal.test.ts` - Comprehensive tests

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-15 12:51:13 -08:00
robobun
e66b4639bd fix: use LLVM unstable repo for Debian trixie/forky [publish images] (#25470)
## Summary
- Fix Docker image build failure on Debian trixie by using LLVM's
`unstable` repository instead of the non-existent `trixie` repository
- The LLVM apt repository (`apt.llvm.org`) doesn't have packages for
Debian trixie (13) or forky - attempts to access
`llvm-toolchain-trixie-19` return 404
- Pass `-n=unstable` flag to `llvm.sh` when running on these Debian
versions

## Test plan
- [ ] Verify the Docker image build succeeds at
https://github.com/oven-sh/bun-development-docker-image/actions

Fixes the build failure from:
https://github.com/oven-sh/bun-development-docker-image/actions/runs/20105199193

Error was:
```
E: The repository 'http://apt.llvm.org/trixie llvm-toolchain-trixie-19 Release' does not have a Release file.
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 12:45:45 -08:00
robobun
8698d25c52 fix: ensure TLS handshake callback fires before HTTP request handler (#25525)
## Summary

Fixes a flaky test (`test-http-url.parse-https.request.js`) where
`request.socket._secureEstablished` was intermittently `false` when the
HTTP request handler was called on HTTPS servers.

## Root Cause

The `isAuthorized` flag was stored in
`HttpContextData::flags.isAuthorized`, which is **shared across all
sockets** in the same context. This meant multiple concurrent TLS
connections could overwrite each other's authorization state, and the
value could be stale when read.

## Fix

Moved the `isAuthorized` flag from the context-level `HttpContextData`
to the per-socket `AsyncSocketData` base class. This ensures each socket
has its own authorization state that is set correctly during its TLS
handshake callback.

## Changes

- **`AsyncSocketData.h`**: Added per-socket `bool isAuthorized` field
- **`HttpContext.h`**: Updated handshake callback to set per-socket flag
instead of context-level flag
- **`JSNodeHTTPServerSocket.cpp`**: Updated `isAuthorized()` to read
from per-socket `AsyncSocketData` (via `HttpResponseData` which inherits
from it)

## Testing

Ran the flaky test 50+ times with 100% pass rate.

Also verified gRPC and HTTP2 tests still pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 12:44:26 -08:00
github-actions[bot]
81a5c79928 deps: update c-ares to v1.34.6 (#25509)
## What does this PR do?

Updates c-ares to version v1.34.6

Compare:
d3a507e920...3ac47ee46e

Auto-updated by [this
workflow](https://github.com/oven-sh/bun/actions/workflows/update-cares.yml)

Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-12-15 11:36:18 -08:00
Alistair Smith
fa996ad1a8 fix: Support @types/node@25.0.2 (#25532)
### What does this PR do?

CI failed again because of a change in @types/node

### How did you verify your code works?

bun-types.test.ts passes
2025-12-15 11:29:04 -08:00
robobun
ed1d6e595c skip HandleScope GC test on ASAN builds (#25523)
## Summary
- Skip `test_handle_scope_gc` test on ASAN builds due to false positives
from dynamic library boundary crossing (Bun built with ASAN+UBSAN,
native addon without sanitizers)

## Test plan
- CI should pass on ASAN builds with this test skipped
- Non-ASAN builds continue to run the test normally

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-15 11:11:11 -08:00
Jarred Sumner
7c98b0f440 Deflake test/js/node/test/parallel/test-fs-readdir-stack-overflow.js 2025-12-14 22:49:51 -08:00
robobun
f0d18d73c9 Disable CodeRabbit issue enrichment (#25520) 2025-12-14 14:56:58 -08:00
Ciro Spaciari
a5712b92b8 Fix 100% CPU usage with idle WebSocket connections on macOS (kqueue) (#25475)
### What does this PR do?

Fixes a bug where idle WebSocket connections would cause 100% CPU usage
on macOS and other BSD systems using kqueue.

**Root cause:** The kqueue event filter comparison was using bitwise AND
(`&`) instead of equality (`==`) when checking the filter type. Combined
with missing `EV_ONESHOT` flags on writable events, this caused the
event loop to continuously spin even when no actual I/O was pending.

**Changes:**
1. **Fixed filter comparison** in `epoll_kqueue.c`: Changed `filter &
EVFILT_READ` to `filter == EVFILT_READ` (same for `EVFILT_WRITE`). The
filter field is a value, not a bitmask.

2. **Added `EV_ONESHOT` flag** to writable events: kqueue writable
events now use one-shot mode to prevent continuous triggering.

3. **Re-arm writable events when needed**: After a one-shot writable
event fires, the code now properly updates the poll state and re-arms
the writable event if another write is still pending.

### How did you verify your code works?

Added a test that:
1. Creates a TLS WebSocket server and client
2. Sends messages then lets the connection sit idle
3. Measures CPU usage over 3 seconds
4. Fails if CPU usage exceeds 2% (expected: ~0.XX% when idle)
2025-12-12 11:10:22 -08:00
robobun
7dcd49f832 fix(install): only apply default trusted dependencies to npm packages (#25163)
## Summary
- The default trusted dependencies list should only apply to packages
installed from npm
- Non-npm sources (file:, link:, git:, github:) now require explicit
trustedDependencies
- This prevents malicious packages from spoofing trusted names through
local paths or git repos

## Test plan
- [x] Added test: file: dependency named "esbuild" does NOT auto-run
postinstall scripts
- [x] Added test: file: dependency runs scripts when explicitly added to
trustedDependencies
- [x] Verified tests fail with system bun (old behavior) and pass with
new build
- [x] Build compiles successfully

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-12-11 17:44:41 -08:00
robobun
c59a6997cd feat(bundler): add statically-analyzable dead-code elimination via feature flags (#25462)
## Summary
- Adds `import { feature } from "bun:bundle"` for compile-time feature
flag checking
- `feature("FLAG_NAME")` calls are replaced with `true`/`false` at
bundle time
- Enables dead-code elimination through `--feature=FLAG_NAME` CLI
argument
- Works in `bun build`, `bun run`, and `bun test`
- Available in both CLI and `Bun.build()` JavaScript API

## Usage

```ts
import { feature } from "bun:bundle";

if (feature("SUPER_SECRET")) {
  console.log("Secret feature enabled!");
} else {
  console.log("Normal mode");
}
```

### CLI
```bash
# Enable feature during build
bun build --feature=SUPER_SECRET index.ts

# Enable at runtime
bun run --feature=SUPER_SECRET index.ts

# Enable in tests
bun test --feature=SUPER_SECRET
```

### JavaScript API
```ts
await Bun.build({
  entrypoints: ['./index.ts'],
  outdir: './out',
  features: ['SUPER_SECRET', 'ANOTHER_FLAG'],
});
```

## Implementation
- Added `bundler_feature_flags` (as `*const bun.StringSet`) to
`RuntimeFeatures` and `BundleOptions`
- Added `bundler_feature_flag_ref` to Parser struct to track the
`feature` import
- Handle `bun:bundle` import at parse time (similar to macros) - capture
ref, return empty statement
- Handle `feature()` calls in `e_call` visitor - replace with boolean
based on flags
- Wire feature flags through CLI arguments and `Bun.build()` API to
bundler options
- Added `features` option to `JSBundler.zig` for JavaScript API support
- Added TypeScript types in `bun.d.ts`
- Added documentation to `docs/bundler/index.mdx`

## Test plan
- [x] Basic feature flag enabled/disabled tests (both CLI and API
backends)
- [x] Multiple feature flags test
- [x] Dead code elimination verification tests
- [x] Error handling for invalid arguments
- [x] Runtime tests with `bun run --feature=FLAG`
- [x] Test runner tests with `bun test --feature=FLAG`
- [x] Aliased import tests (`import { feature as checkFeature }`)
- [x] Ternary operator DCE tests
- [x] Tests use `itBundled` with both `backend: "cli"` and `backend:
"api"`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Alistair Smith <hi@alistair.sh>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-11 17:44:14 -08:00
Alistair Smith
1d50af7fe8 @types/bun: Update to @types/node@25, fallback to PropertyKey in test expect matchers when keyof unknown is used (#25460)
More accurately, developers cannot pass a value when expect values
resolve to `never`. This is easy to fall into when using the
`toContainKey*` matchers; falling back to `PropertyKey` when this happens
is a sensible default.

### What does this PR do?

fixes #25456, cc @MonsterDeveloper
fixes #25461

### How did you verify your code works?

bun types integration test
2025-12-10 18:15:55 -08:00
Jarred Sumner
98cee5a57e Improve Bun.stringWidth accuracy and robustness (#25447)
This PR significantly improves `Bun.stringWidth` to handle a wider
variety of Unicode characters and escape sequences correctly.

## Zero-width character handling

Added support for many previously unhandled zero-width characters:
- Soft hyphen (U+00AD)
- Word joiner and invisible operators (U+2060-U+2064)
- Lone surrogates (U+D800-U+DFFF)
- Arabic formatting characters (U+0600-U+0605, U+06DD, U+070F, U+08E2)
- Indic script combining marks (Devanagari through Malayalam)
- Thai and Lao combining marks
- Combining Diacritical Marks Extended and Supplement
- Tag characters (U+E0000-U+E007F)

## ANSI escape sequence handling

### CSI sequences
- Now properly handles ALL CSI final bytes (0x40-0x7E), not just `m`
- This means cursor movement (A/B/C/D), erase (J/K), scroll (S/T), and
other CSI commands are now correctly excluded from width calculation

### OSC sequences
- Added support for OSC sequences (ESC ] ... BEL/ST)
- OSC 8 hyperlinks are now properly handled
- Supports both BEL (0x07) and ST (ESC \) terminators

### ESC ESC fix
- Fixed state machine bug where `ESC ESC` would incorrectly reset state
- Now correctly handles consecutive ESC characters

## Emoji handling

Added proper grapheme-aware emoji width calculation:
- Flag emoji (regional indicator pairs) → width 2
- Skin tone modifiers → width 2
- ZWJ sequences (family, professions, etc.) → width 2
- Keycap sequences → width 2
- Variation selectors (VS15 for text, VS16 for emoji presentation)
- Uses ICU's `UCHAR_EMOJI` property for accurate emoji detection

## Test coverage

Added comprehensive test suite with **94 tests** covering:
- All zero-width character categories
- All CSI final bytes
- OSC sequences with various terminators
- Emoji edge cases (flags, skin tones, ZWJ, keycaps, variation
selectors)
- East Asian width (CJK, fullwidth, halfwidth katakana)
- Indic and Thai script combining marks
- Fuzzer-like stress tests for robustness

## Breaking changes

This is a behavior change - `stringWidth` will return different values
for some inputs. However, the new values are more accurate
representations of terminal display width:

| Input | Old | New | Why |
|-------|-----|-----|-----|
| Flag emoji 🇺🇸 | 1 | 2 | Flags display as 2 cells |
| Skin tone 👋🏽 | 4 | 2 | Emoji + modifier = 1 grapheme |
| ZWJ family 👨‍👩‍👧 | 8 | 2 | ZWJ sequence = 1 grapheme |
| Word joiner U+2060 | 1 | 0 | Invisible character |
| OSC 8 hyperlinks | counted URL | just visible text | URLs are invisible |
| Cursor movement ESC[5A | counted | 0 | Control sequence |
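
A quick spot-check of the new behavior, using values from the table above:

```ts
console.log(Bun.stringWidth("🇺🇸"));        // 2 (flag emoji is one 2-cell grapheme)
console.log(Bun.stringWidth("👨‍👩‍👧"));     // 2 (ZWJ sequence collapses to one grapheme)
console.log(Bun.stringWidth("a\u2060b"));  // 2 (word joiner contributes 0)
console.log(Bun.stringWidth("\x1b[5Ahi")); // 2 (cursor-movement CSI contributes 0)
```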

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
2025-12-10 16:17:57 -08:00
Elfayeur - Remi
ac0099ebc6 docs: fix code highlight (#25411)
### What does this PR do?

Fix code highlight line, see problem:
<img width="684" height="663" alt="Screenshot 2025-12-08 at 11 40 39 AM"
src="https://github.com/user-attachments/assets/9894a7b7-ddd6-4bad-b7de-3e0e55ecd8cd"
/>


### How did you verify your code works?

Co-authored-by: Michael H <git@riskymh.dev>
2025-12-10 16:06:45 +11:00
Ryan Machado
64146d47f9 Update os-signals.mdx (#25372)
### What does this PR do?

Remove a loose section from os-signals documentation

### How did you verify your code works?

Just a small documentation change.

Co-authored-by: Michael H <git@riskymh.dev>
2025-12-10 16:06:39 +11:00
robobun
a2d8b75962 fix(yaml): quote strings ending with colons (#25443)
## Summary
- Fixes strings ending with colons (e.g., `"tin:"`) not being quoted in
YAML.stringify output
- This caused YAML.parse to fail with "Unexpected token" when parsing
the output back
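
A round-trip sketch of the fixed behavior (assuming the API is exposed as `Bun.YAML`; adjust the import to the actual binding):

```ts
const doc = { label: "tin:" };

// Before the fix, "tin:" was emitted unquoted, so parsing the output back
// failed with "Unexpected token"; it is now quoted.
const text = Bun.YAML.stringify(doc);
const parsed = Bun.YAML.parse(text) as { label: string };
console.log(parsed.label === "tin:"); // true
```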

## Test plan
- Added regression tests in `test/regression/issue/25439.test.ts`
- Verified round-trip works for various strings ending with colons
- Ran existing YAML tests to ensure no regressions

Fixes #25439

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-09 18:20:26 -08:00
Kyle
a15fe76bf2 add brotli and zstd to CompressionStream and DecompressionStream types (#25374)
### What does this PR do?

- removes the `Unimplemented in Bun` comment on `CompressionStream` and
`DecompressionStream`
- updates the types for `CompressionStream` and `DecompressionStream` to
add a new internal `CompressionFormat` type to the constructor, which
adds `brotli` and `zstd` to the union
- adds tests for brotli and zstd usage
- adds lib.dom.d.ts exclusions for brotli and zstd as these don't exist
in the DOM version of CompressionFormat

fixes #25367
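
A round-trip sketch using the new `"zstd"` format value (the same pattern applies to `"brotli"`):

```ts
const input = new TextEncoder().encode("hello hello hello");

const compressed = await Bun.readableStreamToArrayBuffer(
  new Blob([input]).stream().pipeThrough(new CompressionStream("zstd")),
);

const restored = await Bun.readableStreamToText(
  new Blob([compressed]).stream().pipeThrough(new DecompressionStream("zstd")),
);

console.log(restored); // "hello hello hello"
```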

### How did you verify your code works?

typechecks and tests
2025-12-09 17:56:55 -08:00
robobun
8dc084af5f fix(fetch): ignore proxy object without url property (#25414)
## Summary
- When a URL object is passed as the proxy option, or when a proxy
object lacks a "url" property, ignore it instead of throwing an error
- This fixes a regression introduced in 1.3.4 where libraries like taze
that pass URL objects as proxy values would fail
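
A sketch of the previously-failing call (the proxy target is hypothetical):

```ts
// In 1.3.4 this threw; a proxy value without a string "url" is now ignored
// and the request proceeds without a proxy.
const res = await fetch("https://example.com", {
  proxy: new URL("http://127.0.0.1:8080") as unknown as string,
});
console.log(res.status);
```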

## Test plan
- Added test: "proxy as URL object should be ignored (no url property)"
- passes a URL object directly as proxy
- Updated test: "proxy object without url is ignored (regression
#25413)" - proxy object with headers but no url
- Updated test: "proxy object with null url is ignored (regression
#25413)" - proxy object where url is null
- All 29 proxy tests pass

Fixes #25413

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-12-09 12:31:45 -08:00
Alistair Smith
2028e21d60 fmt bun.d.ts 2025-12-08 18:00:09 -08:00
Ciro Spaciari
f25ea59683 feat(s3): add Content-Disposition support for S3 uploads (#25363)
### What does this PR do?
- Add `contentDisposition` option to S3 file uploads to control the
`Content-Disposition` HTTP header
- Support passing `contentDisposition` through all S3 upload paths
(simple uploads, multipart uploads, and streaming uploads)
- Add TypeScript types for the new option
Fixes https://github.com/oven-sh/bun/issues/25362
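
A minimal usage sketch, assuming S3 credentials come from the environment and a local `q4.pdf` exists:

```ts
import { s3 } from "bun";

const pdfBytes = await Bun.file("q4.pdf").arrayBuffer();

// Sets the Content-Disposition header on the stored object so browsers
// download it with the suggested filename.
await s3.file("reports/q4.pdf").write(pdfBytes, {
  contentDisposition: 'attachment; filename="q4.pdf"',
});
```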
### How did you verify your code works?
Test
2025-12-08 15:30:20 -08:00
Jarred Sumner
55c6afb498 Deflake test/js/bun/net/socket.test.ts 2025-12-08 11:16:26 -08:00
Jarred Sumner
0aca002161 Deflake test/js/bun/util/sleep.test.ts 2025-12-08 11:14:52 -08:00
robobun
4980736786 docs(server): fix satisfies Serve type in export default example (#25410) 2025-12-08 09:25:00 -08:00
robobun
3af0d23d53 docs: expand single-file executable file embedding documentation (#25408)
## Summary

- Expanded documentation for embedding files in single-file executables
with `with { type: "file" }`
- Added clear explanation of how the import attribute works and path
transformation at build time
- Added examples for reading embedded files with both `Bun.file()` and
Node.js `fs` APIs
- Added practical examples: JSON configs, HTTP static assets, templates,
binary files (WASM, fonts)
- Improved `Bun.embeddedFiles` section with a dynamic asset server
example
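
The core pattern the expanded docs walk through, in brief:

```ts
// At build time (`bun build --compile`), the import is rewritten to the
// embedded file's internal path; at runtime, Bun.file() reads it back.
import configPath from "./config.json" with { type: "file" };

const config = await Bun.file(configPath).json();
console.log(config);
```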

## Test plan

- [x] Verified all code examples compile and run correctly with `bun
build --compile`
- [x] Tested `Bun.file()` reads embedded files correctly
- [x] Tested `node:fs` APIs (`readFileSync`, `promises.readFile`,
`stat`) work with embedded files
- [x] Tested `Bun.embeddedFiles` returns correct blob array
- [x] Tested `--asset-naming` flag removes content hashes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-07 18:43:00 -08:00
Hamidreza Hanafi
9c96937329 fix(transpiler): preserve simplified property values in object spread expressions (#25401)
Fixes #25398

### What does this PR do?

Fixes a bug where object expressions with spread properties and nullish
coalescing to empty objects (e.g., `k?.x ?? {}`) would produce invalid
JavaScript output like `k?.x ?? ` (missing `{}`).

### Root Cause

In `src/ast/SideEffects.zig`, the `simplifyUnusedExpr` function handles
unused object expressions with spread properties. When simplifying
property values:

1. The code creates a mutable copy `prop` from the original `prop_`
2. When a property value is simplified (e.g., `k?.x ?? {}` → `k?.x`), it
updates `prop.value`
3. **Bug:** The code then wrote back `prop_` (the original) instead of
`prop` (the modified copy)

Because `simplifyUnusedExpr` mutates the AST in place when handling
nullish coalescing (setting `bin.right` to empty), the original `prop_`
now contained an expression with `bin.right` as an empty/missing
expression, resulting in invalid output.
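
Concretely (the simplified outputs are shown as comments):

```ts
declare const k: { x?: object } | undefined;

// Unused object expression containing a spread and nullish coalescing:
({ ...(k?.x ?? {}) });

// Buggy simplified output (invalid JavaScript: the right operand vanished):
//   k?.x ?? ;
// Fixed simplified output:
//   k?.x;
```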

### How did you verify your code works?
- Added regression test in `test/regression/issue/25398.test.ts`
- Verified the original reproduction case passes
- Verified existing CommonJS tests continue to pass
- Verified test fails with system bun and passes with the fix
2025-12-07 16:33:37 -08:00
robobun
d4eaaf8363 docs: document autoload options for standalone executables (#25385)
## Summary

- Document new default behavior in v1.3.4: `tsconfig.json` and
`package.json` loading is now disabled by default for standalone
executables
- Add documentation for `--compile-autoload-tsconfig` and
`--compile-autoload-package-json` CLI flags
- Document all four JavaScript API options: `autoloadTsconfig`,
`autoloadPackageJson`, `autoloadDotenv`, `autoloadBunfig`
- Note that `.env` and `bunfig.toml` may also be disabled by default in
a future version
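
A sketch of the JavaScript API shape (the option names come from this change; their exact placement and defaults here are assumptions):

```ts
await Bun.build({
  entrypoints: ["./cli.ts"],
  compile: true,
  // Assumed placement; consult the rendered docs for the final shape.
  autoloadTsconfig: false,    // off by default for executables since v1.3.4
  autoloadPackageJson: false, // off by default for executables since v1.3.4
  autoloadDotenv: true,
  autoloadBunfig: true,
});
```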

## Test plan

- [ ] Review rendered documentation for accuracy and formatting

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-07 15:44:06 -08:00
Jarred Sumner
e1aa437694 Bump 2025-12-07 15:42:23 -08:00
robobun
73c3f0004f fix(vm): delete internal Loader property from node:vm global object (#25397) 2025-12-07 13:29:32 -08:00
robobun
b80cb629c6 fix(docs): correct mock documentation examples (#25384)
## Summary

- Fix `mock.mock.args` to `mock.mock.calls` in mock-functions.mdx (the
`.args` property doesn't exist)
- Fix mock.restore example to use module methods instead of spy
functions (calling spy functions after restore returns `undefined`)
- Add missing `vi` import in Vitest compatibility example
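
The corrected pattern, runnable under `bun test`:

```ts
import { expect, mock, test } from "bun:test";

test("recorded calls live on mock.calls, not mock.args", () => {
  const add = mock((a: number, b: number) => a + b);
  add(1, 2);
  expect(add.mock.calls).toEqual([[1, 2]]); // `.args` does not exist
});
```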

## Test plan

- [x] Verified each code block works by running tests against the debug
build

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-06 22:17:51 -08:00
Jarred Sumner
8773f7ab65 Delete TODO.md 2025-12-06 19:58:21 -08:00
797 changed files with 51562 additions and 22309 deletions

View File

@@ -26,7 +26,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 wget curl git python3 python3-pip ninja-build \
 software-properties-common apt-transport-https \
 ca-certificates gnupg lsb-release unzip \
-libxml2-dev ruby ruby-dev bison gawk perl make golang \
+libxml2-dev ruby ruby-dev bison gawk perl make golang ccache \
 && add-apt-repository ppa:ubuntu-toolchain-r/test \
 && apt-get update \
 && apt-get install -y gcc-13 g++-13 libgcc-13-dev libstdc++-13-dev \
@@ -35,7 +35,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 && wget https://apt.llvm.org/llvm.sh \
 && chmod +x llvm.sh \
 && ./llvm.sh ${LLVM_VERSION} all \
-&& rm llvm.sh
+&& rm llvm.sh \
+&& rm -rf /var/lib/apt/lists/*
 RUN --mount=type=tmpfs,target=/tmp \
@@ -48,14 +49,6 @@ RUN --mount=type=tmpfs,target=/tmp \
 wget -O /tmp/cmake.sh "$cmake_url" && \
 sh /tmp/cmake.sh --skip-license --prefix=/usr
-RUN --mount=type=tmpfs,target=/tmp \
-sccache_version="0.12.0" && \
-arch=$(uname -m) && \
-sccache_url="https://github.com/mozilla/sccache/releases/download/v${sccache_version}/sccache-v${sccache_version}-${arch}-unknown-linux-musl.tar.gz" && \
-wget -O /tmp/sccache.tar.gz "$sccache_url" && \
-tar -xzf /tmp/sccache.tar.gz -C /tmp && \
-install -m755 /tmp/sccache-v${sccache_version}-${arch}-unknown-linux-musl/sccache /usr/local/bin
 RUN update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-13 130 \
 --slave /usr/bin/g++ g++ /usr/bin/g++-13 \
 --slave /usr/bin/gcc-ar gcc-ar /usr/bin/gcc-ar-13 \
@@ -134,9 +127,7 @@ RUN ARCH=$(if [ "$TARGETARCH" = "arm64" ]; then echo "arm64"; else echo "amd64";
 RUN mkdir -p /var/cache/buildkite-agent /var/log/buildkite-agent /var/run/buildkite-agent /etc/buildkite-agent /var/lib/buildkite-agent/cache/bun
 # The following is necessary to configure buildkite to use a stable
-# checkout directory. sccache hashes absolute paths into its cache keys,
-# so if buildkite uses a different checkout path each time (which it does
-# by default), sccache will be useless.
+# checkout directory for ccache to be effective.
 RUN mkdir -p -m 755 /var/lib/buildkite-agent/hooks && \
 cat <<'EOF' > /var/lib/buildkite-agent/hooks/environment
 #!/bin/sh

View File

@@ -31,7 +31,7 @@ import {
 } from "../scripts/utils.mjs";
 /**
- * @typedef {"linux" | "darwin" | "windows" | "freebsd"} Os
+ * @typedef {"linux" | "darwin" | "windows"} Os
  * @typedef {"aarch64" | "x64"} Arch
  * @typedef {"musl"} Abi
  * @typedef {"debian" | "ubuntu" | "alpine" | "amazonlinux"} Distro
@@ -114,7 +114,6 @@ const buildPlatforms = [
 { os: "linux", arch: "x64", abi: "musl", baseline: true, distro: "alpine", release: "3.22" },
 { os: "windows", arch: "x64", release: "2019" },
 { os: "windows", arch: "x64", baseline: true, release: "2019" },
-{ os: "freebsd", arch: "x64", release: "14.3" },
 ];
 /**
@@ -572,6 +571,7 @@ function getTestBunStep(platform, options, testOptions = {}) {
 if (buildId) {
 args.push(`--build-id=${buildId}`);
 }
+
 if (testFiles) {
 args.push(...testFiles.map(testFile => `--include=${testFile}`));
 }
@@ -588,7 +588,7 @@
 agents: getTestAgent(platform, options),
 retry: getRetry(),
 cancel_on_build_failing: isMergeQueue(),
-parallelism: os === "darwin" ? 2 : 10,
+parallelism: os === "darwin" ? 2 : 20,
 timeout_in_minutes: profile === "asan" || os === "windows" ? 45 : 30,
 env: {
 ASAN_OPTIONS: "allow_user_segv_handler=1:disable_coredump=0:detect_leaks=0",
@@ -657,7 +657,7 @@ function getReleaseStep(buildPlatforms, options) {
 agents: {
 queue: "test-darwin",
 },
-depends_on: buildPlatforms.filter(p => p.os !== "freebsd").map(platform => `${getTargetKey(platform)}-build-bun`),
+depends_on: buildPlatforms.map(platform => `${getTargetKey(platform)}-build-bun`),
 env: {
 CANARY: revision,
 },
@@ -1108,9 +1108,6 @@ async function getPipeline(options = {}) {
 ? buildPlatforms
 : buildPlatforms.filter(({ profile }) => profile !== "asan");
-// run build-image but no build-bun yet
-relevantBuildPlatforms = relevantBuildPlatforms.filter(({ os }) => os !== "freebsd");
 steps.push(
 ...relevantBuildPlatforms.map(target => {
 const imageKey = getImageKey(target);

View File

@@ -0,0 +1,184 @@
---
name: implementing-jsc-classes-cpp
description: Implements JavaScript classes in C++ using JavaScriptCore. Use when creating new JS classes with C++ bindings, prototypes, or constructors.
---
# Implementing JavaScript Classes in C++
## Class Structure
For publicly accessible Constructor and Prototype, create 3 classes:
1. **`class Foo : public JSC::DestructibleObject`** - if C++ fields exist; otherwise use `JSC::constructEmptyObject` with `putDirectOffset`
2. **`class FooPrototype : public JSC::JSNonFinalObject`**
3. **`class FooConstructor : public JSC::InternalFunction`**
No public constructor? Only Prototype and class needed.
## Iso Subspaces
Classes with C++ fields need subspaces in:
- `src/bun.js/bindings/webcore/DOMClientIsoSubspaces.h`
- `src/bun.js/bindings/webcore/DOMIsoSubspaces.h`
```cpp
template<typename MyClassT, JSC::SubspaceAccess mode>
static JSC::GCClient::IsoSubspace* subspaceFor(JSC::VM& vm) {
if constexpr (mode == JSC::SubspaceAccess::Concurrently)
return nullptr;
return WebCore::subspaceForImpl<MyClassT, WebCore::UseCustomHeapCellType::No>(
vm,
[](auto& spaces) { return spaces.m_clientSubspaceForMyClassT.get(); },
[](auto& spaces, auto&& space) { spaces.m_clientSubspaceForMyClassT = std::forward<decltype(space)>(space); },
[](auto& spaces) { return spaces.m_subspaceForMyClassT.get(); },
[](auto& spaces, auto&& space) { spaces.m_subspaceForMyClassT = std::forward<decltype(space)>(space); });
}
```
## Property Definitions
```cpp
static JSC_DECLARE_HOST_FUNCTION(jsFooProtoFuncMethod);
static JSC_DECLARE_CUSTOM_GETTER(jsFooGetter_property);
static const HashTableValue JSFooPrototypeTableValues[] = {
{ "property"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsFooGetter_property, 0 } },
{ "method"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsFooProtoFuncMethod, 1 } },
};
```
## Prototype Class
```cpp
class JSFooPrototype final : public JSC::JSNonFinalObject {
public:
using Base = JSC::JSNonFinalObject;
static constexpr unsigned StructureFlags = Base::StructureFlags;
static JSFooPrototype* create(JSC::VM& vm, JSC::JSGlobalObject* globalObject, JSC::Structure* structure) {
JSFooPrototype* prototype = new (NotNull, allocateCell<JSFooPrototype>(vm)) JSFooPrototype(vm, structure);
prototype->finishCreation(vm);
return prototype;
}
template<typename, JSC::SubspaceAccess>
static JSC::GCClient::IsoSubspace* subspaceFor(JSC::VM& vm) { return &vm.plainObjectSpace(); }
DECLARE_INFO;
static JSC::Structure* createStructure(JSC::VM& vm, JSC::JSGlobalObject* globalObject, JSC::JSValue prototype) {
auto* structure = JSC::Structure::create(vm, globalObject, prototype, JSC::TypeInfo(JSC::ObjectType, StructureFlags), info());
structure->setMayBePrototype(true);
return structure;
}
private:
JSFooPrototype(JSC::VM& vm, JSC::Structure* structure) : Base(vm, structure) {}
void finishCreation(JSC::VM& vm);
};
void JSFooPrototype::finishCreation(VM& vm) {
Base::finishCreation(vm);
reifyStaticProperties(vm, JSFoo::info(), JSFooPrototypeTableValues, *this);
JSC_TO_STRING_TAG_WITHOUT_TRANSITION();
}
```
## Getter/Setter/Function Definitions
```cpp
// Getter
JSC_DEFINE_CUSTOM_GETTER(jsFooGetter_prop, (JSGlobalObject* globalObject, EncodedJSValue thisValue, PropertyName)) {
VM& vm = globalObject->vm();
auto scope = DECLARE_THROW_SCOPE(vm);
JSFoo* thisObject = jsDynamicCast<JSFoo*>(JSValue::decode(thisValue));
if (UNLIKELY(!thisObject)) {
Bun::throwThisTypeError(*globalObject, scope, "JSFoo"_s, "prop"_s);
return {};
}
return JSValue::encode(jsBoolean(thisObject->value()));
}
// Function
JSC_DEFINE_HOST_FUNCTION(jsFooProtoFuncMethod, (JSGlobalObject* globalObject, CallFrame* callFrame)) {
VM& vm = globalObject->vm();
auto scope = DECLARE_THROW_SCOPE(vm);
auto* thisObject = jsDynamicCast<JSFoo*>(callFrame->thisValue());
if (UNLIKELY(!thisObject)) {
Bun::throwThisTypeError(*globalObject, scope, "Foo"_s, "method"_s);
return {};
}
return JSValue::encode(thisObject->doSomething(vm, globalObject));
}
```
## Constructor Class
```cpp
class JSFooConstructor final : public JSC::InternalFunction {
public:
using Base = JSC::InternalFunction;
static constexpr unsigned StructureFlags = Base::StructureFlags;
static JSFooConstructor* create(JSC::VM& vm, JSC::Structure* structure, JSC::JSObject* prototype) {
JSFooConstructor* constructor = new (NotNull, JSC::allocateCell<JSFooConstructor>(vm)) JSFooConstructor(vm, structure);
constructor->finishCreation(vm, prototype);
return constructor;
}
DECLARE_INFO;
template<typename CellType, JSC::SubspaceAccess>
static JSC::GCClient::IsoSubspace* subspaceFor(JSC::VM& vm) { return &vm.internalFunctionSpace(); }
static JSC::Structure* createStructure(JSC::VM& vm, JSC::JSGlobalObject* globalObject, JSC::JSValue prototype) {
return JSC::Structure::create(vm, globalObject, prototype, JSC::TypeInfo(JSC::InternalFunctionType, StructureFlags), info());
}
private:
JSFooConstructor(JSC::VM& vm, JSC::Structure* structure) : Base(vm, structure, callFoo, constructFoo) {}
void finishCreation(JSC::VM& vm, JSC::JSObject* prototype) {
Base::finishCreation(vm, 0, "Foo"_s);
putDirectWithoutTransition(vm, vm.propertyNames->prototype, prototype, JSC::PropertyAttribute::DontEnum | JSC::PropertyAttribute::DontDelete | JSC::PropertyAttribute::ReadOnly);
}
};
```
## Structure Caching
Add to `ZigGlobalObject.h`:
```cpp
JSC::LazyClassStructure m_JSFooClassStructure;
```
Initialize in `ZigGlobalObject.cpp`:
```cpp
m_JSFooClassStructure.initLater([](LazyClassStructure::Initializer& init) {
Bun::initJSFooClassStructure(init);
});
```
Visit in `visitChildrenImpl`:
```cpp
m_JSFooClassStructure.visit(visitor);
```
## Expose to Zig
```cpp
extern "C" JSC::EncodedJSValue Bun__JSFooConstructor(Zig::GlobalObject* globalObject) {
return JSValue::encode(globalObject->m_JSFooClassStructure.constructor(globalObject));
}
extern "C" EncodedJSValue Bun__Foo__toJS(Zig::GlobalObject* globalObject, Foo* foo) {
auto* structure = globalObject->m_JSFooClassStructure.get(globalObject);
return JSValue::encode(JSFoo::create(globalObject->vm(), structure, globalObject, WTFMove(foo)));
}
```
Include `#include "root.h"` at the top of C++ files.

View File

@@ -0,0 +1,206 @@
---
name: implementing-jsc-classes-zig
description: Creates JavaScript classes using Bun's Zig bindings generator (.classes.ts). Use when implementing new JS APIs in Zig with JSC integration.
---
# Bun's JavaScriptCore Class Bindings Generator
Bridge JavaScript and Zig through `.classes.ts` definitions and Zig implementations.
## Architecture
1. **Zig Implementation** (.zig files)
2. **JavaScript Interface Definition** (.classes.ts files)
3. **Generated Code** (C++/Zig files connecting them)
## Class Definition (.classes.ts)
```typescript
define({
name: "TextDecoder",
constructor: true,
JSType: "object",
finalize: true,
proto: {
decode: { args: 1 },
encoding: { getter: true, cache: true },
fatal: { getter: true },
},
});
```
Options:
- `name`: Class name
- `constructor`: Has public constructor
- `JSType`: "object", "function", etc.
- `finalize`: Needs cleanup
- `proto`: Properties/methods
- `cache`: Cache property values via WriteBarrier
## Zig Implementation
```zig
pub const TextDecoder = struct {
pub const js = JSC.Codegen.JSTextDecoder;
pub const toJS = js.toJS;
pub const fromJS = js.fromJS;
pub const fromJSDirect = js.fromJSDirect;
encoding: []const u8,
fatal: bool,
pub fn constructor(
globalObject: *JSGlobalObject,
callFrame: *JSC.CallFrame,
) bun.JSError!*TextDecoder {
return bun.new(TextDecoder, .{ .encoding = "utf-8", .fatal = false });
}
pub fn decode(
this: *TextDecoder,
globalObject: *JSGlobalObject,
callFrame: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
const args = callFrame.arguments();
if (args.len < 1 or args.ptr[0].isUndefinedOrNull()) {
return globalObject.throw("Input cannot be null", .{});
}
return JSC.JSValue.jsString(globalObject, "result");
}
pub fn getEncoding(this: *TextDecoder, globalObject: *JSGlobalObject) JSC.JSValue {
return JSC.JSValue.createStringFromUTF8(globalObject, this.encoding);
}
fn deinit(this: *TextDecoder) void {
// Release resources
}
pub fn finalize(this: *TextDecoder) void {
this.deinit();
bun.destroy(this);
}
};
```
**Key patterns:**
- Use `bun.JSError!JSValue` return type for error handling
- Use `globalObject` not `ctx`
- `deinit()` for cleanup, `finalize()` called by GC
- Update `src/bun.js/bindings/generated_classes_list.zig`
## CallFrame Access
```zig
const args = callFrame.arguments();
const first_arg = args.ptr[0]; // Access as slice
const argCount = args.len;
const thisValue = callFrame.thisValue();
```
## Property Caching
For `cache: true` properties, generated accessors:
```zig
// Get cached value
pub fn encodingGetCached(thisValue: JSC.JSValue) ?JSC.JSValue {
const result = TextDecoderPrototype__encodingGetCachedValue(thisValue);
if (result == .zero) return null;
return result;
}
// Set cached value
pub fn encodingSetCached(thisValue: JSC.JSValue, globalObject: *JSC.JSGlobalObject, value: JSC.JSValue) void {
TextDecoderPrototype__encodingSetCachedValue(thisValue, globalObject, value);
}
```
## Error Handling
```zig
pub fn method(this: *MyClass, globalObject: *JSGlobalObject, callFrame: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const args = callFrame.arguments();
if (args.len < 1) {
return globalObject.throw("Missing required argument", .{});
}
return JSC.JSValue.jsString(globalObject, "Success!");
}
```
## Memory Management
```zig
pub fn deinit(this: *TextDecoder) void {
this._encoding.deref();
if (this.buffer) |buffer| {
bun.default_allocator.free(buffer);
}
}
pub fn finalize(this: *TextDecoder) void {
JSC.markBinding(@src());
this.deinit();
bun.default_allocator.destroy(this);
}
```
## Creating a New Binding
1. Define interface in `.classes.ts`:
```typescript
define({
name: "MyClass",
constructor: true,
finalize: true,
proto: {
myMethod: { args: 1 },
myProperty: { getter: true, cache: true },
},
});
```
2. Implement in `.zig`:
```zig
pub const MyClass = struct {
pub const js = JSC.Codegen.JSMyClass;
pub const toJS = js.toJS;
pub const fromJS = js.fromJS;
value: []const u8,
pub const new = bun.TrivialNew(@This());
pub fn constructor(globalObject: *JSGlobalObject, callFrame: *JSC.CallFrame) bun.JSError!*MyClass {
return MyClass.new(.{ .value = "" });
}
pub fn myMethod(this: *MyClass, globalObject: *JSGlobalObject, callFrame: *JSC.CallFrame) bun.JSError!JSC.JSValue {
return JSC.JSValue.jsUndefined();
}
pub fn getMyProperty(this: *MyClass, globalObject: *JSGlobalObject) JSC.JSValue {
return JSC.JSValue.jsString(globalObject, this.value);
}
pub fn deinit(this: *MyClass) void {}
pub fn finalize(this: *MyClass) void {
this.deinit();
bun.destroy(this);
}
};
```
3. Add to `src/bun.js/bindings/generated_classes_list.zig`
## Generated Components
- **C++ Classes**: `JSMyClass`, `JSMyClassPrototype`, `JSMyClassConstructor`
- **Method Bindings**: `MyClassPrototype__myMethodCallback`
- **Property Accessors**: `MyClassPrototype__myPropertyGetterWrap`
- **Zig Bindings**: External function declarations, cached value accessors

View File

@@ -0,0 +1,222 @@
---
name: writing-bundler-tests
description: Guides writing bundler tests using itBundled/expectBundled in test/bundler/. Use when creating or modifying bundler, transpiler, or code transformation tests.
---
# Writing Bundler Tests
Bundler tests use `itBundled()` from `test/bundler/expectBundled.ts` to test Bun's bundler.
## Basic Usage
```typescript
import { describe } from "bun:test";
import { itBundled, dedent } from "./expectBundled";
describe("bundler", () => {
itBundled("category/TestName", {
files: {
"index.js": `console.log("hello");`,
},
run: {
stdout: "hello",
},
});
});
```
Test ID format: `category/TestName` (e.g., `banner/CommentBanner`, `minify/Empty`)
## File Setup
```typescript
{
files: {
"index.js": `console.log("test");`,
"lib.ts": `export const foo = 123;`,
"nested/file.js": `export default {};`,
},
entryPoints: ["index.js"], // defaults to first file
runtimeFiles: { // written AFTER bundling
"extra.js": `console.log("added later");`,
},
}
```
## Bundler Options
```typescript
{
outfile: "/out.js",
outdir: "/out",
format: "esm" | "cjs" | "iife",
target: "bun" | "browser" | "node",
// Minification
minifyWhitespace: true,
minifyIdentifiers: true,
minifySyntax: true,
// Code manipulation
banner: "// copyright",
footer: "// end",
define: { "PROD": "true" },
external: ["lodash"],
// Advanced
sourceMap: "inline" | "external",
splitting: true,
treeShaking: true,
drop: ["console"],
}
```
## Runtime Verification
```typescript
{
run: {
stdout: "expected output", // exact match
stdout: /regex/, // pattern match
partialStdout: "contains this", // substring
stderr: "error output",
exitCode: 1,
env: { NODE_ENV: "production" },
runtime: "bun" | "node",
// Runtime errors
error: "ReferenceError: x is not defined",
},
}
```
## Bundle Errors/Warnings
```typescript
{
bundleErrors: {
"/file.js": ["error message 1", "error message 2"],
},
bundleWarnings: {
"/file.js": ["warning message"],
},
}
```
## Dead Code Elimination (DCE)
Add markers in source code:
```javascript
// KEEP - this should survive
const used = 1;
// REMOVE - this should be eliminated
const unused = 2;
```
```typescript
{
dce: true,
dceKeepMarkerCount: 5, // expected KEEP markers
}
```
## Capture Pattern
Verify exact transpilation with `capture()`:
```typescript
itBundled("string/Folding", {
files: {
"index.ts": `capture(\`\${1 + 1}\`);`,
},
capture: ['"2"'], // expected captured value
minifySyntax: true,
});
```
## Post-Bundle Assertions
```typescript
{
onAfterBundle(api) {
api.expectFile("out.js").toContain("console.log");
api.assertFileExists("out.js");
const content = api.readFile("out.js");
expect(content).toMatchSnapshot();
const values = api.captureFile("out.js");
expect(values).toEqual(["2"]);
},
}
```
## Common Patterns
**Simple output verification:**
```typescript
itBundled("banner/Comment", {
banner: "// copyright",
files: { "a.js": `console.log("Hello")` },
onAfterBundle(api) {
api.expectFile("out.js").toContain("// copyright");
},
});
```
**Multi-file CJS/ESM interop:**
```typescript
itBundled("cjs/ImportSyntax", {
files: {
"entry.js": `import lib from './lib.cjs'; console.log(lib);`,
"lib.cjs": `exports.foo = 'bar';`,
},
run: { stdout: '{"foo":"bar"}' },
});
```
**Error handling:**
```typescript
itBundled("edgecase/InvalidLoader", {
files: { "index.js": `...` },
bundleErrors: {
"index.js": ["Unsupported loader type"],
},
});
```
## Test Organization
```text
test/bundler/
├── bundler_banner.test.ts
├── bundler_string.test.ts
├── bundler_minify.test.ts
├── bundler_cjs.test.ts
├── bundler_edgecase.test.ts
├── bundler_splitting.test.ts
├── css/
├── transpiler/
└── expectBundled.ts
```
## Running Tests
```bash
bun bd test test/bundler/bundler_banner.test.ts
BUN_BUNDLER_TEST_FILTER="banner/Comment" bun bd test bundler_banner.test.ts
BUN_BUNDLER_TEST_DEBUG=1 bun bd test bundler_minify.test.ts
```
## Key Points
- Use `dedent` for readable multi-line code
- File paths are relative (e.g., `/index.js`)
- Use `capture()` to verify exact transpilation results
- Use `.toMatchSnapshot()` for complex outputs
- Pass array to `run` for multiple test scenarios

View File

@@ -0,0 +1,94 @@
---
name: writing-dev-server-tests
description: Guides writing HMR/Dev Server tests in test/bake/. Use when creating or modifying dev server, hot reloading, or bundling tests.
---
# Writing HMR/Dev Server Tests
Dev server tests validate hot-reloading robustness and reliability.
## File Structure
- `test/bake/bake-harness.ts` - shared utilities: `devTest`, `prodTest`, `devAndProductionTest`, `Dev` class, `Client` class
- `test/bake/client-fixture.mjs` - subprocess for `Client` (page loading, IPC queries)
- `test/bake/dev/*.test.ts` - dev server and hot reload tests
- `test/bake/dev-and-prod.ts` - tests running on both dev and production mode
## Test Categories
- `bundle.test.ts` - DevServer-specific bundling bugs
- `css.test.ts` - CSS bundling issues
- `plugins.test.ts` - development mode plugins
- `ecosystem.test.ts` - library compatibility (prefer concrete bugs over full package tests)
- `esm.test.ts` - ESM features in development
- `html.test.ts` - HTML file handling
- `react-spa.test.ts` - React, react-refresh transform, server components
- `sourcemap.test.ts` - source map correctness
## devTest Basics
```ts
import { devTest, emptyHtmlFile } from "../bake-harness";
devTest("html file is watched", {
files: {
"index.html": emptyHtmlFile({
scripts: ["/script.ts"],
body: "<h1>Hello</h1>",
}),
"script.ts": `console.log("hello");`,
},
async test(dev) {
await dev.fetch("/").expect.toInclude("<h1>Hello</h1>");
await dev.patch("index.html", { find: "Hello", replace: "World" });
await dev.fetch("/").expect.toInclude("<h1>World</h1>");
await using c = await dev.client("/");
await c.expectMessage("hello");
await c.expectReload(async () => {
await dev.patch("index.html", { find: "World", replace: "Bar" });
});
await c.expectMessage("hello");
},
});
```
## Key APIs
- **`files`**: Initial filesystem state
- **`dev.fetch()`**: HTTP requests
- **`dev.client()`**: Opens browser instance
- **`dev.write/patch/delete`**: Filesystem mutations (wait for hot-reload automatically)
- **`c.expectMessage()`**: Assert console.log output
- **`c.expectReload()`**: Wrap code that causes hard reload
**Important**: Use `dev.write/patch/delete` instead of `node:fs` - they wait for hot-reload.
## Testing Errors
```ts
devTest("import then create", {
  files: {
    "index.html": `<!DOCTYPE html><html><head></head><body><script type="module" src="/script.ts"></script></body></html>`,
    "script.ts": `import data from "./data"; console.log(data);`,
  },
  async test(dev) {
    const c = await dev.client("/", {
      errors: ['script.ts:1:18: error: Could not resolve: "./data"'],
    });
    await c.expectReload(async () => {
      await dev.write("data.ts", "export default 'data';");
    });
    await c.expectMessage("data");
  },
});
```
Specify expected errors with the `errors` option:
```ts
await dev.delete("other.ts", {
  errors: ['index.ts:1:16: error: Could not resolve: "./other"'],
});
```

View File

@@ -0,0 +1,268 @@
---
name: zig-system-calls
description: Guides using bun.sys for system calls and file I/O in Zig. Use when implementing file operations instead of std.fs or std.posix.
---
# System Calls & File I/O in Zig
Use `bun.sys` instead of `std.fs` or `std.posix` for cross-platform syscalls with proper error handling.
## bun.sys.File (Preferred)
For most file operations, use the `bun.sys.File` wrapper:
```zig
const File = bun.sys.File;
const file = switch (File.open(path, bun.O.RDWR, 0o644)) {
    .result => |f| f,
    .err => |err| return .{ .err = err },
};
defer file.close();
// Read/write
_ = try file.read(buffer).unwrap();
_ = try file.writeAll(data).unwrap();
// Get file info
const stat = try file.stat().unwrap();
const size = try file.getEndPos().unwrap();
// std.io compatible
const reader = file.reader();
const writer = file.writer();
```
### Complete Example
```zig
const File = bun.sys.File;
pub fn writeFile(path: [:0]const u8, data: []const u8) File.WriteError!void {
    const file = switch (File.open(path, bun.O.WRONLY | bun.O.CREAT | bun.O.TRUNC, 0o664)) {
        .result => |f| f,
        .err => |err| return err.toError(),
    };
    defer file.close();
    _ = switch (file.writeAll(data)) {
        .result => {},
        .err => |err| return err.toError(),
    };
}
```
## Why bun.sys?
| Aspect | bun.sys | std.fs/std.posix |
| ----------- | -------------------------------- | ------------------- |
| Return Type | `Maybe(T)` with detailed Error | Generic error union |
| Windows | Full support with libuv fallback | Limited/POSIX-only |
| Error Info | errno, syscall tag, path, fd | errno only |
| EINTR | Automatic retry | Manual handling |
## Error Handling with Maybe(T)
`bun.sys` functions return `Maybe(T)` - a tagged union:
```zig
const sys = bun.sys;
// Pattern 1: Switch on result/error
switch (sys.read(fd, buffer)) {
    .result => |bytes_read| {
        // use bytes_read
    },
    .err => |err| {
        // err.errno, err.syscall, err.fd, err.path
        if (err.getErrno() == .AGAIN) {
            // handle EAGAIN
        }
    },
}
// Pattern 2: Unwrap with try (converts to Zig error)
const bytes = try sys.read(fd, buffer).unwrap();
// Pattern 3: Unwrap with default
const value = sys.stat(path).unwrapOr(default_stat);
```
## Low-Level File Operations
Only use these when `bun.sys.File` doesn't meet your needs.
### Opening Files
```zig
const sys = bun.sys;
// Use bun.O flags (cross-platform normalized)
const fd = switch (sys.open(path, bun.O.RDONLY, 0)) {
    .result => |fd| fd,
    .err => |err| return .{ .err = err },
};
defer fd.close();
// Common flags
bun.O.RDONLY, bun.O.WRONLY, bun.O.RDWR
bun.O.CREAT, bun.O.TRUNC, bun.O.APPEND
bun.O.NONBLOCK, bun.O.DIRECTORY
```
### Reading & Writing
```zig
// Single read (may return less than buffer size)
switch (sys.read(fd, buffer)) {
    .result => |n| { /* n bytes read */ },
    .err => |err| { /* handle error */ },
}
// Read until EOF or buffer full
const total = try sys.readAll(fd, buffer).unwrap();
// Position-based read/write
sys.pread(fd, buffer, offset)
sys.pwrite(fd, data, offset)
// Vector I/O
sys.readv(fd, iovecs)
sys.writev(fd, iovecs)
```
### File Info
```zig
sys.stat(path) // Follow symlinks
sys.lstat(path) // Don't follow symlinks
sys.fstat(fd) // From file descriptor
sys.fstatat(fd, path)
// Linux-only: faster selective stat
sys.statx(path, &.{ .size, .mtime })
```
### Path Operations
```zig
sys.unlink(path)
sys.unlinkat(dir_fd, path)
sys.rename(from, to)
sys.renameat(from_dir, from, to_dir, to)
sys.readlink(path, buf)
sys.readlinkat(fd, path, buf)
sys.link(T, src, dest)
sys.linkat(src_fd, src, dest_fd, dest)
sys.symlink(target, dest)
sys.symlinkat(target, dirfd, dest)
sys.mkdir(path, mode)
sys.mkdirat(dir_fd, path, mode)
sys.rmdir(path)
```
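These primitives compose into common patterns. As a minimal sketch (the helper name is illustrative, and it assumes the `Maybe` results returned by `File.open` and `sys.rename` support `.unwrap()` as described above), an atomic file replacement might look like:
```zig
const sys = bun.sys;
const File = bun.sys.File;

// Hypothetical helper: write to a temp path, then atomically rename it
// over the destination so readers never observe a partial file.
pub fn replaceFileAtomically(tmp_path: [:0]const u8, dest_path: [:0]const u8, data: []const u8) !void {
    const file = try File.open(tmp_path, bun.O.WRONLY | bun.O.CREAT | bun.O.TRUNC, 0o644).unwrap();
    errdefer _ = sys.unlink(tmp_path); // best-effort cleanup if anything below fails
    {
        defer file.close();
        _ = try file.writeAll(data).unwrap();
    }
    // rename() atomically replaces dest_path on POSIX systems.
    try sys.rename(tmp_path, dest_path).unwrap();
}
```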
### Permissions
```zig
sys.chmod(path, mode)
sys.fchmod(fd, mode)
sys.fchmodat(fd, path, mode, flags)
sys.chown(path, uid, gid)
sys.fchown(fd, uid, gid)
```
### Closing File Descriptors
Close is on `bun.FD`:
```zig
fd.close(); // Asserts on error (use in defer)
// Or if you need error info:
if (fd.closeAllowingBadFileDescriptor(null)) |err| {
    // handle error
}
```
## Directory Operations
```zig
var buf: bun.PathBuffer = undefined;
const cwd = try sys.getcwd(&buf).unwrap();
const cwdZ = try sys.getcwdZ(&buf).unwrap(); // Zero-terminated
sys.chdir(path, destination)
```
### Directory Iteration
Use `bun.DirIterator` instead of `std.fs.Dir.Iterator`:
```zig
var iter = bun.iterateDir(dir_fd);
while (true) {
    switch (iter.next()) {
        .result => |entry| {
            if (entry) |e| {
                const name = e.name.slice();
                const kind = e.kind; // .file, .directory, .sym_link, etc.
            } else {
                break; // End of directory
            }
        },
        .err => |err| return .{ .err = err },
    }
}
```
## Socket Operations
**Important**: `bun.sys` has limited socket support. For network I/O:
- **Non-blocking sockets**: Use `uws.Socket` (libuwebsockets) exclusively
- **Pipes/blocking I/O**: Use `PipeReader.zig` and `PipeWriter.zig`
Available in bun.sys:
```zig
sys.setsockopt(fd, level, optname, value)
sys.socketpair(domain, socktype, protocol, nonblocking_status)
```
Do NOT use `bun.sys` for socket read/write - use `uws.Socket` instead.
## Other Operations
```zig
sys.ftruncate(fd, size)
sys.lseek(fd, offset, whence)
sys.dup(fd)
sys.dupWithFlags(fd, flags)
sys.fcntl(fd, cmd, arg)
sys.pipe()
sys.mmap(...)
sys.munmap(memory)
sys.access(path, mode)
sys.futimens(fd, atime, mtime)
sys.utimens(path, atime, mtime)
```
## Error Type
```zig
const err: bun.sys.Error = ...;
err.errno // Raw errno value
err.getErrno() // As std.posix.E enum
err.syscall // Which syscall failed (Tag enum)
err.fd // Optional: file descriptor
err.path // Optional: path string
```
## Key Points
- Prefer `bun.sys.File` wrapper for most file operations
- Use low-level `bun.sys` functions only when needed
- Use `bun.O.*` flags instead of `std.os.O.*`
- Handle `Maybe(T)` with switch or `.unwrap()`
- Use `defer fd.close()` for cleanup
- EINTR is handled automatically in most functions
- For sockets, use `uws.Socket` not `bun.sys`
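Putting the key points together, a hedged sketch of reading a whole file (the helper name is illustrative, and it assumes the `Maybe` patterns shown above):
```zig
const std = @import("std");
const File = bun.sys.File;

// Illustrative helper: open read-only, size the buffer from getEndPos(),
// then loop on read() until the buffer is full. Caller owns the result.
pub fn readWholeFile(allocator: std.mem.Allocator, path: [:0]const u8) ![]u8 {
    const file = switch (File.open(path, bun.O.RDONLY, 0)) {
        .result => |f| f,
        .err => |err| return err.toError(),
    };
    defer file.close();

    const size: usize = @intCast(try file.getEndPos().unwrap());
    const buf = try allocator.alloc(u8, size);
    errdefer allocator.free(buf);

    var total: usize = 0;
    while (total < buf.len) {
        const n = try file.read(buf[total..]).unwrap();
        if (n == 0) break; // EOF before expected size
        total += n;
    }
    if (total != buf.len) return error.UnexpectedEof;
    return buf;
}
```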

View File

@@ -1,5 +1,9 @@
language: en-US
issue_enrichment:
auto_enrich:
enabled: false
reviews:
profile: assertive
request_changes_workflow: false

View File

@@ -1,41 +0,0 @@
---
description:
globs: src/**/*.cpp,src/**/*.zig
alwaysApply: false
---
### Build Commands
- **Build debug version**: `bun bd` or `bun run build:debug`
- Creates a debug build at `./build/debug/bun-debug`
- Compilation takes ~2.5 minutes
- **Run tests with your debug build**: `bun bd test <test-file>`
- **CRITICAL**: Never use `bun test` directly - it won't include your changes
- **Run any command with debug build**: `bun bd <command>`
### Run a file
To run a file, use:
```sh
bun bd <file> <...args>
```
**CRITICAL**: Never use `bun <file>` directly. It will not have your changes.
### Logging
`BUN_DEBUG_$(SCOPE)=1` enables debug logs for a specific debug log scope.
Debug logs look like this:
```zig
const log = bun.Output.scoped(.${SCOPE}, .hidden);
// ...later
log("MY DEBUG LOG", .{})
```
### Code Generation
Code generation happens automatically as part of the build process. There are no commands to run.

View File

@@ -1,139 +0,0 @@
---
description: Writing HMR/Dev Server tests
globs: test/bake/*
---
# Writing HMR/Dev Server tests
Dev server tests validate that hot-reloading is robust, correct, and reliable. Remember to write thorough, yet concise tests.
## File Structure
- `test/bake/bake-harness.ts` - shared utilities and test harness
- primary test functions `devTest` / `prodTest` / `devAndProductionTest`
- class `Dev` (controls subprocess for dev server)
- class `Client` (controls a happy-dom subprocess for having the page open)
- more helpers
- `test/bake/client-fixture.mjs` - subprocess for what `Client` controls. it loads a page and uses IPC to query parts of the page, run javascript, and much more.
- `test/bake/dev/*.test.ts` - these call `devTest` to test dev server and hot reloading
- `test/bake/dev-and-prod.ts` - these use `devAndProductionTest` to run the same test on dev and production mode. these tests cannot really test hot reloading for obvious reasons.
## Categories
- `bundle.test.ts` - bundling bugs that only occur in DevServer
- `css.test.ts` - bundling bugs with CSS files
- `plugins.test.ts` - plugins in development mode
- `ecosystem.test.ts` - ensuring certain libraries work correctly; prefer testing concrete bugs over entire packages
- `esm.test.ts` - various ESM features in development mode
- `html.test.ts` - tests relating to HTML files themselves
- `react-spa.test.ts` - React, our react-refresh transform, and basic server component transforms
- `sourcemap.test.ts` - verifying source maps are correct
## `devTest` Basics
A test takes two primary inputs: a `files` map and an `async test(dev) { ... }` callback.
```ts
import { devTest, emptyHtmlFile } from "../bake-harness";
devTest("html file is watched", {
files: {
"index.html": emptyHtmlFile({
scripts: ["/script.ts"],
body: "<h1>Hello</h1>",
}),
"script.ts": `
console.log("hello");
`,
},
async test(dev) {
await dev.fetch("/").expect.toInclude("<h1>Hello</h1>");
await dev.fetch("/").expect.toInclude("<h1>Hello</h1>");
await dev.patch("index.html", {
find: "Hello",
replace: "World",
});
await dev.fetch("/").expect.toInclude("<h1>World</h1>");
// Works
await using c = await dev.client("/");
await c.expectMessage("hello");
// Editing HTML reloads
await c.expectReload(async () => {
await dev.patch("index.html", {
find: "World",
replace: "Hello",
});
await dev.fetch("/").expect.toInclude("<h1>Hello</h1>");
});
await c.expectMessage("hello");
await c.expectReload(async () => {
await dev.patch("index.html", {
find: "Hello",
replace: "Bar",
});
await dev.fetch("/").expect.toInclude("<h1>Bar</h1>");
});
await c.expectMessage("hello");
await c.expectReload(async () => {
await dev.patch("script.ts", {
find: "hello",
replace: "world",
});
});
await c.expectMessage("world");
},
});
```
`files` holds the initial state, and the callback runs with the server running. `dev.fetch()` runs HTTP requests, while `dev.client()` opens a browser instance to the code.
Functions `dev.write`, `dev.patch`, and `dev.delete` mutate the filesystem. Do not use `node:fs` APIs; the dev server versions are hooked to wait for the hot reload and for all connected clients to receive changes.
When a change performs a hard reload, that must be explicitly annotated with `expectReload`. This tells `client-fixture.mjs` that the test is meant to reload the page once; all other hard reloads automatically fail the test.
Clients have `console.log` instrumented, so any unasserted logs fail the test. This makes it more obvious when an extra reload or re-evaluation occurs. Messages are awaited via `c.expectMessage("log")`, or with multiple arguments when there are multiple logs.
## Testing for bundling errors
By default, a client opening a page to an error will fail the test. This makes testing errors explicit.
```ts
devTest("import then create", {
files: {
"index.html": `
<!DOCTYPE html>
<html>
<head></head>
<body>
<script type="module" src="/script.ts"></script>
</body>
</html>
`,
"script.ts": `
import data from "./data";
console.log(data);
`,
},
async test(dev) {
const c = await dev.client("/", {
errors: ['script.ts:1:18: error: Could not resolve: "./data"'],
});
await c.expectReload(async () => {
await dev.write("data.ts", "export default 'data';");
});
await c.expectMessage("data");
},
});
```
Many functions take an options argument where you can specify that the operation will produce errors. For example, this delete causes a resolution failure:
```ts
await dev.delete("other.ts", {
  errors: ['index.ts:1:16: error: Could not resolve: "./other"'],
});
```

View File

@@ -1,413 +0,0 @@
---
description: JavaScript class implemented in C++
globs: *.cpp
alwaysApply: false
---
# Implementing JavaScript classes in C++
If there is a publicly accessible Constructor and Prototype, then there are 3 classes:
- If there are C++ class members, we need a destructor, so use `class Foo : public JSC::DestructibleObject`. If there are no C++ class fields (only JS properties), we usually don't need a class at all; we can instead use `JSC::constructEmptyObject(vm, structure)` and `putDirectOffset` like in [NodeFSStatBinding.cpp](mdc:src/bun.js/bindings/NodeFSStatBinding.cpp).
- class FooPrototype : public JSC::JSNonFinalObject
- class FooConstructor : public JSC::InternalFunction
If there is no publicly accessible Constructor, only the Prototype and the class are necessary. In some cases, we can avoid the prototype entirely (but that's rare).
If there are C++ fields on the Foo class, the Foo class will need an iso subspace added to [DOMClientIsoSubspaces.h](mdc:src/bun.js/bindings/webcore/DOMClientIsoSubspaces.h) and [DOMIsoSubspaces.h](mdc:src/bun.js/bindings/webcore/DOMIsoSubspaces.h). Prototype and Constructor do not need subspaces.
Usually you'll need to #include "root.h" at the top of C++ files or you'll get lint errors.
Generally, defining the subspace looks like this:
```c++
class Foo : public JSC::DestructibleObject {
    // ...
    template<typename MyClassT, JSC::SubspaceAccess mode>
    static JSC::GCClient::IsoSubspace* subspaceFor(JSC::VM& vm)
    {
        if constexpr (mode == JSC::SubspaceAccess::Concurrently)
            return nullptr;
        return WebCore::subspaceForImpl<MyClassT, WebCore::UseCustomHeapCellType::No>(
            vm,
            [](auto& spaces) { return spaces.m_clientSubspaceFor${MyClassT}.get(); },
            [](auto& spaces, auto&& space) { spaces.m_clientSubspaceFor${MyClassT} = std::forward<decltype(space)>(space); },
            [](auto& spaces) { return spaces.m_subspaceFor${MyClassT}.get(); },
            [](auto& spaces, auto&& space) { spaces.m_subspaceFor${MyClassT} = std::forward<decltype(space)>(space); });
    }
```
It's better to put it in the .cpp file instead of the .h file, when possible.
## Defining properties
Define properties on the prototype. Use a const HashTableValues like this:
```C++
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckEmail);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckHost);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckIP);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckIssued);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckPrivateKey);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncToJSON);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncToLegacyObject);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncToString);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncVerify);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_ca);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_fingerprint);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_fingerprint256);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_fingerprint512);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_subject);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_subjectAltName);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_infoAccess);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_keyUsage);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_issuer);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_issuerCertificate);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_publicKey);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_raw);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_serialNumber);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_validFrom);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_validTo);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_validFromDate);
static JSC_DECLARE_CUSTOM_GETTER(jsX509CertificateGetter_validToDate);
static const HashTableValue JSX509CertificatePrototypeTableValues[] = {
{ "ca"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_ca, 0 } },
{ "checkEmail"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncCheckEmail, 2 } },
{ "checkHost"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncCheckHost, 2 } },
{ "checkIP"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncCheckIP, 1 } },
{ "checkIssued"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncCheckIssued, 1 } },
{ "checkPrivateKey"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncCheckPrivateKey, 1 } },
{ "fingerprint"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_fingerprint, 0 } },
{ "fingerprint256"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_fingerprint256, 0 } },
{ "fingerprint512"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_fingerprint512, 0 } },
{ "infoAccess"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_infoAccess, 0 } },
{ "issuer"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_issuer, 0 } },
{ "issuerCertificate"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_issuerCertificate, 0 } },
{ "keyUsage"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_keyUsage, 0 } },
{ "publicKey"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_publicKey, 0 } },
{ "raw"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_raw, 0 } },
{ "serialNumber"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_serialNumber, 0 } },
{ "subject"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_subject, 0 } },
{ "subjectAltName"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_subjectAltName, 0 } },
{ "toJSON"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncToJSON, 0 } },
{ "toLegacyObject"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncToLegacyObject, 0 } },
{ "toString"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncToString, 0 } },
{ "validFrom"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_validFrom, 0 } },
{ "validFromDate"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessorOrValue), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_validFromDate, 0 } },
{ "validTo"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessor), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_validTo, 0 } },
{ "validToDate"_s, static_cast<unsigned>(PropertyAttribute::ReadOnly | PropertyAttribute::CustomAccessorOrValue), NoIntrinsic, { HashTableValue::GetterSetterType, jsX509CertificateGetter_validToDate, 0 } },
{ "verify"_s, static_cast<unsigned>(PropertyAttribute::Function), NoIntrinsic, { HashTableValue::NativeFunctionType, jsX509CertificateProtoFuncVerify, 1 } },
};
```
### Creating a prototype class
Follow a pattern like this:
```c++
class JSX509CertificatePrototype final : public JSC::JSNonFinalObject {
public:
    using Base = JSC::JSNonFinalObject;
    static constexpr unsigned StructureFlags = Base::StructureFlags;

    static JSX509CertificatePrototype* create(JSC::VM& vm, JSC::JSGlobalObject* globalObject, JSC::Structure* structure)
    {
        JSX509CertificatePrototype* prototype = new (NotNull, allocateCell<JSX509CertificatePrototype>(vm)) JSX509CertificatePrototype(vm, structure);
        prototype->finishCreation(vm);
        return prototype;
    }

    template<typename, JSC::SubspaceAccess>
    static JSC::GCClient::IsoSubspace* subspaceFor(JSC::VM& vm)
    {
        return &vm.plainObjectSpace();
    }

    DECLARE_INFO;

    static JSC::Structure* createStructure(JSC::VM& vm, JSC::JSGlobalObject* globalObject, JSC::JSValue prototype)
    {
        auto* structure = JSC::Structure::create(vm, globalObject, prototype, JSC::TypeInfo(JSC::ObjectType, StructureFlags), info());
        structure->setMayBePrototype(true);
        return structure;
    }

private:
    JSX509CertificatePrototype(JSC::VM& vm, JSC::Structure* structure)
        : Base(vm, structure)
    {
    }

    void finishCreation(JSC::VM& vm);
};
const ClassInfo JSX509CertificatePrototype::s_info = { "X509Certificate"_s, &Base::s_info, nullptr, nullptr, CREATE_METHOD_TABLE(JSX509CertificatePrototype) };
void JSX509CertificatePrototype::finishCreation(VM& vm)
{
    Base::finishCreation(vm);
    reifyStaticProperties(vm, JSX509Certificate::info(), JSX509CertificatePrototypeTableValues, *this);
    JSC_TO_STRING_TAG_WITHOUT_TRANSITION();
}
} // namespace Bun
```
### Getter definition:
```C++
JSC_DEFINE_CUSTOM_GETTER(jsX509CertificateGetter_ca, (JSGlobalObject * globalObject, EncodedJSValue thisValue, PropertyName))
{
    VM& vm = globalObject->vm();
    auto scope = DECLARE_THROW_SCOPE(vm);
    JSX509Certificate* thisObject = jsDynamicCast<JSX509Certificate*>(JSValue::decode(thisValue));
    if (UNLIKELY(!thisObject)) {
        Bun::throwThisTypeError(*globalObject, scope, "JSX509Certificate"_s, "ca"_s);
        return {};
    }
    return JSValue::encode(jsBoolean(thisObject->view().isCA()));
}
```
### Setter definition
```C++
JSC_DEFINE_CUSTOM_SETTER(jsImportMetaObjectSetter_require, (JSGlobalObject * jsGlobalObject, JSC::EncodedJSValue thisValue, JSC::EncodedJSValue encodedValue, PropertyName propertyName))
{
    ImportMetaObject* thisObject = jsDynamicCast<ImportMetaObject*>(JSValue::decode(thisValue));
    if (UNLIKELY(!thisObject))
        return false;
    JSValue value = JSValue::decode(encodedValue);
    if (!value.isCell()) {
        // TODO:
        return true;
    }
    thisObject->requireProperty.set(thisObject->vm(), thisObject, value.asCell());
    return true;
}
```
### Function definition
```C++
JSC_DEFINE_HOST_FUNCTION(jsX509CertificateProtoFuncToJSON, (JSGlobalObject * globalObject, CallFrame* callFrame))
{
    VM& vm = globalObject->vm();
    auto scope = DECLARE_THROW_SCOPE(vm);
    auto* thisObject = jsDynamicCast<MyClassT*>(callFrame->thisValue());
    if (UNLIKELY(!thisObject)) {
        Bun::throwThisTypeError(*globalObject, scope, "MyClass"_s, "myFunctionName"_s);
        return {};
    }
    return JSValue::encode(functionThatReturnsJSValue(vm, globalObject, thisObject));
}
```
### Constructor definition
```C++
JSC_DECLARE_HOST_FUNCTION(callStats);
JSC_DECLARE_HOST_FUNCTION(constructStats);
class JSStatsConstructor final : public JSC::InternalFunction {
public:
    using Base = JSC::InternalFunction;
    static constexpr unsigned StructureFlags = Base::StructureFlags;

    static JSStatsConstructor* create(JSC::VM& vm, JSC::Structure* structure, JSC::JSObject* prototype)
    {
        JSStatsConstructor* constructor = new (NotNull, JSC::allocateCell<JSStatsConstructor>(vm)) JSStatsConstructor(vm, structure);
        constructor->finishCreation(vm, prototype);
        return constructor;
    }

    DECLARE_INFO;

    template<typename CellType, JSC::SubspaceAccess>
    static JSC::GCClient::IsoSubspace* subspaceFor(JSC::VM& vm)
    {
        return &vm.internalFunctionSpace();
    }

    static JSC::Structure* createStructure(JSC::VM& vm, JSC::JSGlobalObject* globalObject, JSC::JSValue prototype)
    {
        return JSC::Structure::create(vm, globalObject, prototype, JSC::TypeInfo(JSC::InternalFunctionType, StructureFlags), info());
    }

private:
    JSStatsConstructor(JSC::VM& vm, JSC::Structure* structure)
        : Base(vm, structure, callStats, constructStats)
    {
    }

    void finishCreation(JSC::VM& vm, JSC::JSObject* prototype)
    {
        Base::finishCreation(vm, 0, "Stats"_s);
        putDirectWithoutTransition(vm, vm.propertyNames->prototype, prototype, JSC::PropertyAttribute::DontEnum | JSC::PropertyAttribute::DontDelete | JSC::PropertyAttribute::ReadOnly);
    }
};
```
### Structure caching
If there's a class, prototype, and constructor:
1. Add the `JSC::LazyClassStructure` to [ZigGlobalObject.h](mdc:src/bun.js/bindings/ZigGlobalObject.h)
2. Initialize the class structure in [ZigGlobalObject.cpp](mdc:src/bun.js/bindings/ZigGlobalObject.cpp) in `void GlobalObject::finishCreation(VM& vm)`
3. Visit the class structure in visitChildren in [ZigGlobalObject.cpp](mdc:src/bun.js/bindings/ZigGlobalObject.cpp) in `void GlobalObject::visitChildrenImpl`
```c++#ZigGlobalObject.cpp
void GlobalObject::finishCreation(VM& vm) {
    // ...
    m_JSStatsBigIntClassStructure.initLater(
        [](LazyClassStructure::Initializer& init) {
            // Call the function to initialize our class structure.
            Bun::initJSBigIntStatsClassStructure(init);
        });
```
Then, implement the function that creates the structure:
```c++
void setupX509CertificateClassStructure(LazyClassStructure::Initializer& init)
{
    auto* prototypeStructure = JSX509CertificatePrototype::createStructure(init.vm, init.global, init.global->objectPrototype());
    auto* prototype = JSX509CertificatePrototype::create(init.vm, init.global, prototypeStructure);
    auto* constructorStructure = JSX509CertificateConstructor::createStructure(init.vm, init.global, init.global->functionPrototype());
    auto* constructor = JSX509CertificateConstructor::create(init.vm, init.global, constructorStructure, prototype);
    auto* structure = JSX509Certificate::createStructure(init.vm, init.global, prototype);
    init.setPrototype(prototype);
    init.setStructure(structure);
    init.setConstructor(constructor);
}
```
If there's only a class, use `JSC::LazyProperty<JSGlobalObject, Structure>` instead of `JSC::LazyClassStructure`:
1. Add the `JSC::LazyProperty<JSGlobalObject, Structure>` to @ZigGlobalObject.h
2. Initialize the class structure in @ZigGlobalObject.cpp in `void GlobalObject::finishCreation(VM& vm)`
3. Visit the lazy property in visitChildren in @ZigGlobalObject.cpp in `void GlobalObject::visitChildrenImpl`
```c++
void GlobalObject::finishCreation(VM& vm) {
    // ...
    m_myLazyProperty.initLater([](const JSC::LazyProperty<JSC::JSGlobalObject, JSC::Structure>::Initializer& init) {
        init.set(Bun::initMyStructure(init.vm, reinterpret_cast<Zig::GlobalObject*>(init.owner)));
    });
```
Then, implement the function that creates the structure:
```c++
Structure* setupX509CertificateStructure(JSC::VM& vm, Zig::GlobalObject* globalObject)
{
    // If there is a prototype:
    auto* prototypeStructure = JSX509CertificatePrototype::createStructure(vm, globalObject, globalObject->objectPrototype());
    auto* prototype = JSX509CertificatePrototype::create(vm, globalObject, prototypeStructure);
    // If there is no prototype, use globalObject->objectPrototype() in its place.
    return JSX509Certificate::createStructure(vm, globalObject, prototype);
}
```
Then, use the structure by calling `globalObject.m_myStructureName.get(globalObject)`
```C++
JSC_DEFINE_HOST_FUNCTION(x509CertificateConstructorConstruct, (JSGlobalObject * globalObject, CallFrame* callFrame))
{
    VM& vm = globalObject->vm();
    auto scope = DECLARE_THROW_SCOPE(vm);
    if (!callFrame->argumentCount()) {
        Bun::throwError(globalObject, scope, ErrorCode::ERR_MISSING_ARGS, "X509Certificate constructor requires at least one argument"_s);
        return {};
    }
    JSValue arg = callFrame->uncheckedArgument(0);
    if (!arg.isCell()) {
        Bun::throwError(globalObject, scope, ErrorCode::ERR_INVALID_ARG_TYPE, "X509Certificate constructor argument must be a Buffer, TypedArray, or string"_s);
        return {};
    }
    auto* zigGlobalObject = defaultGlobalObject(globalObject);
    Structure* structure = zigGlobalObject->m_JSX509CertificateClassStructure.get(zigGlobalObject);
    JSValue newTarget = callFrame->newTarget();
    if (UNLIKELY(zigGlobalObject->m_JSX509CertificateClassStructure.constructor(zigGlobalObject) != newTarget)) {
        if (!newTarget) {
            throwTypeError(globalObject, scope, "Class constructor X509Certificate cannot be invoked without 'new'"_s);
            return {};
        }
        auto* functionGlobalObject = defaultGlobalObject(getFunctionRealm(globalObject, newTarget.getObject()));
        RETURN_IF_EXCEPTION(scope, {});
        structure = InternalFunction::createSubclassStructure(globalObject, newTarget.getObject(), functionGlobalObject->m_JSX509CertificateClassStructure.get(functionGlobalObject));
        RETURN_IF_EXCEPTION(scope, {});
    }
    return JSValue::encode(createX509Certificate(vm, globalObject, structure, arg));
}
```
### Expose to Zig
To expose the constructor to zig:
```c++
extern "C" JSC::EncodedJSValue Bun__JSBigIntStatsObjectConstructor(Zig::GlobalObject* globalobject)
{
return JSValue::encode(globalobject->m_JSStatsBigIntClassStructure.constructor(globalobject));
}
```
Zig:
```zig
extern "c" fn Bun__JSBigIntStatsObjectConstructor(*JSC.JSGlobalObject) JSC.JSValue;
pub const getBigIntStatsConstructor = Bun__JSBigIntStatsObjectConstructor;
```
To create an object (instance) of a JS class defined in C++ from Zig, follow the `__toJS` convention like this:
```c++
// X509* is whatever we need to create the object
extern "C" EncodedJSValue Bun__X509__toJS(Zig::GlobalObject* globalObject, X509* cert)
{
    // ... implementation details
    auto* structure = globalObject->m_JSX509CertificateClassStructure.get(globalObject);
    return JSValue::encode(JSX509Certificate::create(globalObject->vm(), structure, globalObject, WTFMove(cert)));
}
```
And from Zig:
```zig
const X509 = opaque {
    // ... class
    extern fn Bun__X509__toJS(*JSC.JSGlobalObject, *X509) JSC.JSValue;

    pub fn toJS(this: *X509, globalObject: *JSC.JSGlobalObject) JSC.JSValue {
        return Bun__X509__toJS(globalObject, this);
    }
};
```

View File

@@ -1,203 +0,0 @@
# Registering Functions, Objects, and Modules in Bun
This guide documents the process of adding new functionality to the Bun global object and runtime.
## Overview
Bun's architecture exposes functionality to JavaScript through a set of carefully registered functions, objects, and modules. Most core functionality is implemented in Zig, with JavaScript bindings that make these features accessible to users.
There are several key ways to expose functionality in Bun:
1. **Global Functions**: Direct methods on the `Bun` object (e.g., `Bun.serve()`)
2. **Getter Properties**: Lazily initialized properties on the `Bun` object (e.g., `Bun.sqlite`)
3. **Constructor Classes**: Classes available through the `Bun` object (e.g., `Bun.ValkeyClient`)
4. **Global Modules**: Modules that can be imported directly (e.g., `import {X} from "bun:*"`)
## The Registration Process
Adding new functionality to Bun involves several coordinated steps across multiple files:
### 1. Implement the Core Functionality in Zig
First, implement your feature in Zig, typically in its own directory in `src/`. Examples:
- `src/valkey/` for Redis/Valkey client
- `src/semver/` for SemVer functionality
- `src/smtp/` for SMTP client
### 2. Create JavaScript Bindings
Create bindings that expose your Zig functionality to JavaScript:
- Create a class definition file (e.g., `js_bindings.classes.ts`) to define the JavaScript interface
- Implement `JSYourFeature` struct in a file like `js_your_feature.zig`
Example from a class definition file:
```typescript
// Example from a .classes.ts file
import { define } from "../../codegen/class-definitions";
export default [
  define({
    name: "YourFeature",
    construct: true,
    finalize: true,
    hasPendingActivity: true,
    memoryCost: true,
    klass: {},
    JSType: "0b11101110",
    proto: {
      yourMethod: {
        fn: "yourZigMethod",
        length: 1,
      },
      property: {
        getter: "getProperty",
      },
    },
    values: ["cachedValues"],
  }),
];
```
### 3. Register with BunObject in `src/bun.js/bindings/BunObject+exports.h`
Add an entry to the `FOR_EACH_GETTER` macro:
```c
// In BunObject+exports.h
#define FOR_EACH_GETTER(macro) \
macro(CSRF) \
macro(CryptoHasher) \
... \
macro(YourFeature) \
```
### 4. Create a Getter Function in `src/bun.js/api/BunObject.zig`
Implement a getter function in `BunObject.zig` that returns your feature:
```zig
// In BunObject.zig
pub const YourFeature = toJSGetter(Bun.getYourFeatureConstructor);
// In the exportAll() function:
@export(&BunObject.YourFeature, .{ .name = getterName("YourFeature") });
```
### 5. Implement the Getter Function in a Relevant Zig File
Implement the function that creates your object:
```zig
// In your main module file (e.g., src/your_feature/your_feature.zig)
pub fn getYourFeatureConstructor(globalThis: *JSC.JSGlobalObject, _: *JSC.JSObject) JSC.JSValue {
    return JSC.API.YourFeature.getConstructor(globalThis);
}
```
### 6. Add to Build System
Ensure your files are included in the build system by adding them to the appropriate targets.
## Example: Adding a New Module
Here's a comprehensive example of adding a hypothetical SMTP module:
1. Create implementation files in `src/smtp/`:
- `index.zig`: Main entry point that exports everything
- `SmtpClient.zig`: Core SMTP client implementation
- `js_smtp.zig`: JavaScript bindings
- `js_bindings.classes.ts`: Class definition
2. Define your JS class in `js_bindings.classes.ts`:
```typescript
import { define } from "../../codegen/class-definitions";
export default [
  define({
    name: "EmailClient",
    construct: true,
    finalize: true,
    hasPendingActivity: true,
    configurable: false,
    memoryCost: true,
    klass: {},
    JSType: "0b11101110",
    proto: {
      send: {
        fn: "send",
        length: 1,
      },
      verify: {
        fn: "verify",
        length: 0,
      },
      close: {
        fn: "close",
        length: 0,
      },
    },
    values: ["connectionPromise"],
  }),
];
```
3. Add getter to `BunObject+exports.h`:
```c
#define FOR_EACH_GETTER(macro) \
macro(CSRF) \
... \
macro(SMTP) \
```
4. Add getter function to `BunObject.zig`:
```zig
pub const SMTP = toJSGetter(Bun.getSmtpConstructor);
// In exportAll:
@export(&BunObject.SMTP, .{ .name = getterName("SMTP") });
```
5. Implement getter in your module:
```zig
pub fn getSmtpConstructor(globalThis: *JSC.JSGlobalObject, _: *JSC.JSObject) JSC.JSValue {
    return JSC.API.JSEmailClient.getConstructor(globalThis);
}
```
## Best Practices
1. **Follow Naming Conventions**: Align your naming with existing patterns
2. **Reference Existing Modules**: Study similar modules like Valkey or S3Client for guidance
3. **Memory Management**: Be careful with memory management and reference counting
4. **Error Handling**: Use `bun.JSError!JSValue` for proper error propagation
5. **Documentation**: Add JSDoc comments to your JavaScript bindings
6. **Testing**: Add tests for your new functionality
## Common Gotchas
- Be sure to handle reference counting properly with `ref()`/`deref()`
- Always implement proper cleanup in `deinit()` and `finalize()`
- For network operations, manage socket lifetimes correctly
- Use `JSC.Codegen` correctly to generate necessary binding code
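As a hedged sketch of the lifecycle these gotchas describe (names are illustrative, not a specific Bun API): a reference-counted native object whose JS finalizer drops one reference, with `deinit()` doing the actual cleanup:
```zig
pub const EmailClient = struct {
    ref_count: u32 = 1,

    pub fn ref(this: *EmailClient) void {
        this.ref_count += 1;
    }

    pub fn deref(this: *EmailClient) void {
        this.ref_count -= 1;
        if (this.ref_count == 0) this.deinit();
    }

    // Actual cleanup: release sockets/buffers here before freeing the object.
    fn deinit(this: *EmailClient) void {
        bun.destroy(this);
    }

    // Called by the GC when the JS wrapper is collected; it only drops the
    // wrapper's reference, since in-flight network work may still hold one.
    pub fn finalize(this: *EmailClient) void {
        this.deref();
    }
};
```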
## Related Files
- `src/bun.js/bindings/BunObject+exports.h`: Registration of getters and functions
- `src/bun.js/api/BunObject.zig`: Implementation of getters and object creation
- `src/bun.js/api/BunObject.classes.ts`: Class definitions
- `.cursor/rules/zig-javascriptcore-classes.mdc`: More details on class bindings
## Additional Resources
For more detailed information on specific topics:
- See `zig-javascriptcore-classes.mdc` for details on creating JS class bindings
- Review existing modules like `valkey`, `sqlite`, or `s3` for real-world examples

View File

@@ -1,91 +0,0 @@
---
description: Writing tests for Bun
globs:
---
# Writing tests for Bun
## Where tests are found
You'll find all of Bun's tests in the `test/` directory.
* `test/`
* `cli/` - CLI command tests, like `bun install` or `bun init`
* `js/` - JavaScript & TypeScript tests
* `bun/` - `Bun` APIs tests, separated by category, for example: `glob/` for `Bun.Glob` tests
* `node/` - Node.js module tests, separated by module, for example: `assert/` for `node:assert` tests
* `test/` - Vendored Node.js tests, taken from the Node.js repository (does not conform to Bun's test style)
* `web/` - Web API tests, separated by category, for example: `fetch/` for `Request` and `Response` tests
* `third_party/` - npm package tests, to validate that basic usage works in Bun
* `napi/` - N-API tests
* `v8/` - V8 C++ API tests
* `bundler/` - Bundler, transpiler, CSS, and `bun build` tests
* `regression/issue/[number]` - Regression tests, always make one when fixing a particular issue
## How tests are written
Bun's tests are written as JavaScript and TypeScript files with the Jest-style APIs, like `test`, `describe`, and `expect`. They are tested using Bun's own test runner, `bun test`.
```js
import { describe, test, expect } from "bun:test";
import assert, { AssertionError } from "assert";
describe("assert(expr)", () => {
test.each([true, 1, "foo"])(`assert(%p) does not throw`, expr => {
expect(() => assert(expr)).not.toThrow();
});
test.each([false, 0, "", null, undefined])(`assert(%p) throws`, expr => {
expect(() => assert(expr)).toThrow(AssertionError);
});
});
```
## Testing conventions
* See `test/harness.ts` for common test utilities and helpers
* Be rigorous and test for edge-cases and unexpected inputs
* Use data-driven tests, e.g. `test.each`, to reduce boilerplate when possible
* When you need to test Bun as a CLI, use the following pattern:
```js
import { test, expect } from "bun:test";
import { spawn } from "bun";
import { bunExe, bunEnv } from "harness";
test("bun --version", async () => {
const { exited, stdout: stdoutStream, stderr: stderrStream } = spawn({
cmd: [bunExe(), "--version"],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [ exitCode, stdout, stderr ] = await Promise.all([
exited,
new Response(stdoutStream).text(),
new Response(stderrStream).text(),
]);
expect({ exitCode, stdout, stderr }).toMatchObject({
exitCode: 0,
stdout: expect.stringContaining(Bun.version),
stderr: "",
});
});
```
## Before writing a test
* If you are fixing a bug, write the test first and make sure it fails (as expected) with the canary version of Bun
* If you are fixing a Node.js compatibility bug, create a throw-away snippet of code and test that it works as you expect in Node.js, then that it fails (as expected) with the canary version of Bun
* When the expected behaviour is ambiguous, defer to matching what happens in Node.js
* Always attempt to find related tests in an existing test file before creating a new test file

View File

@@ -1,509 +0,0 @@
---
description: How Zig works with JavaScriptCore bindings generator
globs:
alwaysApply: false
---
# Bun's JavaScriptCore Class Bindings Generator
This document explains how Bun's class bindings generator works to bridge Zig and JavaScript code through JavaScriptCore (JSC).
## Architecture Overview
Bun's binding system creates a seamless bridge between JavaScript and Zig, allowing Zig implementations to be exposed as JavaScript classes. The system has several key components:
1. **Zig Implementation** (.zig files)
2. **JavaScript Interface Definition** (.classes.ts files)
3. **Generated Code** (C++/Zig files that connect everything)
## Class Definition Files
### JavaScript Interface (.classes.ts)
The `.classes.ts` files define the JavaScript API using a declarative approach:
```typescript
// Example: encoding.classes.ts
define({
  name: "TextDecoder",
  constructor: true,
  JSType: "object",
  finalize: true,
  proto: {
    decode: {
      // Function definition
      args: 1,
    },
    encoding: {
      // Getter with caching
      getter: true,
      cache: true,
    },
    fatal: {
      // Read-only property
      getter: true,
    },
    ignoreBOM: {
      // Read-only property
      getter: true,
    },
  },
});
```
Each class definition specifies:
- The class name
- Whether it has a constructor
- JavaScript type (object, function, etc.)
- Properties and methods in the `proto` field
- Caching strategy for properties
- Finalization requirements
### Zig Implementation (.zig)
The Zig files implement the native functionality:
```zig
// Example: TextDecoder.zig
pub const TextDecoder = struct {
    // Expose generated bindings as `js` namespace with trait conversion methods
    pub const js = JSC.Codegen.JSTextDecoder;
    pub const toJS = js.toJS;
    pub const fromJS = js.fromJS;
    pub const fromJSDirect = js.fromJSDirect;

    // Internal state
    encoding: []const u8,
    fatal: bool,
    ignoreBOM: bool,

    // Constructor implementation - note use of globalObject
    pub fn constructor(
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!*TextDecoder {
        // Implementation
        return bun.new(TextDecoder, .{
            // Fields
        });
    }

    // Prototype methods - note return type includes JSError
    pub fn decode(
        this: *TextDecoder,
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!JSC.JSValue {
        // Implementation
    }

    // Getters
    pub fn getEncoding(this: *TextDecoder, globalObject: *JSGlobalObject) JSC.JSValue {
        return JSC.JSValue.createStringFromUTF8(globalObject, this.encoding);
    }

    pub fn getFatal(this: *TextDecoder, globalObject: *JSGlobalObject) JSC.JSValue {
        return JSC.JSValue.jsBoolean(this.fatal);
    }

    // Cleanup - note standard pattern of using deinit/deref
    fn deinit(this: *TextDecoder) void {
        // Release any retained resources
        // Free the pointer at the end.
        bun.destroy(this);
    }

    // Finalize - called by JS garbage collector. This should call deinit, or deref if reference counted.
    pub fn finalize(this: *TextDecoder) void {
        this.deinit();
    }
};
```
Key components in the Zig file:
- The struct containing native state
- `pub const js = JSC.Codegen.JS<ClassName>` to include generated code
- Constructor and methods using `bun.JSError!JSValue` return type for proper error handling
- Consistent use of `globalObject` parameter name instead of `ctx`
- Methods matching the JavaScript interface
- Getters/setters for properties
- Proper resource cleanup pattern with `deinit()` and `finalize()`
- Update `src/bun.js/bindings/generated_classes_list.zig` to include the new class
## Code Generation System
The binding generator produces C++ code that connects JavaScript and Zig:
1. **JSC Class Structure**: Creates C++ classes for the JS object, prototype, and constructor
2. **Memory Management**: Handles GC integration through JSC's WriteBarrier
3. **Method Binding**: Connects JS function calls to Zig implementations
4. **Type Conversion**: Converts between JS values and Zig types
5. **Property Caching**: Implements the caching system for properties
The generated C++ code includes:
- A JSC wrapper class (`JSTextDecoder`)
- A prototype class (`JSTextDecoderPrototype`)
- A constructor function (`JSTextDecoderConstructor`)
- Function bindings (`TextDecoderPrototype__decodeCallback`)
- Property getters/setters (`TextDecoderPrototype__encodingGetterWrap`)
## CallFrame Access
The `CallFrame` object provides access to JavaScript execution context:
```zig
pub fn decode(
    this: *TextDecoder,
    globalObject: *JSGlobalObject,
    callFrame: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
    // Get arguments
    const input = callFrame.argument(0);
    const options = callFrame.argument(1);
    // Get this value
    const thisValue = callFrame.thisValue();
    // Implementation with error handling
    if (input.isUndefinedOrNull()) {
        return globalObject.throw("Input cannot be null or undefined", .{});
    }
    // Return value or throw error
    return JSC.JSValue.jsString(globalObject, "result");
}
```
CallFrame methods include:
- `argument(i)`: Get the i-th argument
- `argumentCount()`: Get the number of arguments
- `thisValue()`: Get the `this` value
- `callee()`: Get the function being called
## Property Caching and GC-Owned Values
The `cache: true` option in property definitions enables JSC's WriteBarrier to efficiently store values:
```typescript
encoding: {
  getter: true,
  cache: true, // Enable caching
}
```
### C++ Implementation
In the generated C++ code, caching uses JSC's WriteBarrier:
```cpp
JSC_DEFINE_CUSTOM_GETTER(TextDecoderPrototype__encodingGetterWrap, (...)) {
    auto& vm = JSC::getVM(lexicalGlobalObject);
    Zig::GlobalObject* globalObject = reinterpret_cast<Zig::GlobalObject*>(lexicalGlobalObject);
    auto throwScope = DECLARE_THROW_SCOPE(vm);
    JSTextDecoder* thisObject = jsCast<JSTextDecoder*>(JSValue::decode(encodedThisValue));
    JSC::EnsureStillAliveScope thisArg = JSC::EnsureStillAliveScope(thisObject);

    // Check for cached value and return if present
    if (JSValue cachedValue = thisObject->m_encoding.get())
        return JSValue::encode(cachedValue);

    // Get value from Zig implementation
    JSC::JSValue result = JSC::JSValue::decode(
        TextDecoderPrototype__getEncoding(thisObject->wrapped(), globalObject));
    RETURN_IF_EXCEPTION(throwScope, {});

    // Store in cache for future access
    thisObject->m_encoding.set(vm, thisObject, result);
    RELEASE_AND_RETURN(throwScope, JSValue::encode(result));
}
```
### Zig Accessor Functions
For each cached property, the generator creates Zig accessor functions that allow Zig code to work with these GC-owned values:
```zig
// External function declarations
extern fn TextDecoderPrototype__encodingSetCachedValue(JSC.JSValue, *JSC.JSGlobalObject, JSC.JSValue) callconv(JSC.conv) void;
extern fn TextDecoderPrototype__encodingGetCachedValue(JSC.JSValue) callconv(JSC.conv) JSC.JSValue;

/// `TextDecoder.encoding` setter
/// This value will be visited by the garbage collector.
pub fn encodingSetCached(thisValue: JSC.JSValue, globalObject: *JSC.JSGlobalObject, value: JSC.JSValue) void {
    JSC.markBinding(@src());
    TextDecoderPrototype__encodingSetCachedValue(thisValue, globalObject, value);
}

/// `TextDecoder.encoding` getter
/// This value will be visited by the garbage collector.
pub fn encodingGetCached(thisValue: JSC.JSValue) ?JSC.JSValue {
    JSC.markBinding(@src());
    const result = TextDecoderPrototype__encodingGetCachedValue(thisValue);
    if (result == .zero)
        return null;
    return result;
}
```
### Benefits of GC-Owned Values
This system provides several key benefits:
1. **Automatic Memory Management**: The JavaScriptCore GC tracks and manages these values
2. **Proper Garbage Collection**: The WriteBarrier ensures values are properly visited during GC
3. **Consistent Access**: Zig code can easily get/set these cached JS values
4. **Performance**: Cached values avoid repeated computation or serialization
### Use Cases
GC-owned cached values are particularly useful for:
1. **Computed Properties**: Store expensive computation results
2. **Lazily Created Objects**: Create objects only when needed, then cache them
3. **References to Other Objects**: Store references to other JS objects that need GC tracking
4. **Memoization**: Cache results based on input parameters
The WriteBarrier mechanism ensures that any JS values stored in this way are properly tracked by the garbage collector.
## Memory Management and Finalization
The binding system handles memory management across the JavaScript/Zig boundary:
1. **Object Creation**: JavaScript `new TextDecoder()` creates both a JS wrapper and a Zig struct
2. **Reference Tracking**: JSC's GC tracks all JS references to the object
3. **Finalization**: When the JS object is collected, the finalizer releases Zig resources
Bun uses a consistent pattern for resource cleanup:
```zig
// Resource cleanup method - separate from finalization
pub fn deinit(this: *TextDecoder) void {
    // Release resources like strings
    this._encoding.deref(); // String deref pattern
    // Free any buffers
    if (this.buffer) |buffer| {
        bun.default_allocator.free(buffer);
    }
}

// Called by the GC when object is collected
pub fn finalize(this: *TextDecoder) void {
    JSC.markBinding(@src()); // For debugging
    this.deinit(); // Clean up resources
    bun.default_allocator.destroy(this); // Free the object itself
}
```
Some objects that hold references to other JS objects use `.deref()` instead:
```zig
pub fn finalize(this: *SocketAddress) void {
    JSC.markBinding(@src());
    this._presentation.deref(); // Release references
    this.destroy();
}
```
## Error Handling with JSError
Bun uses `bun.JSError!JSValue` return type for proper error handling:
```zig
pub fn decode(
    this: *TextDecoder,
    globalObject: *JSGlobalObject,
    callFrame: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
    // Throwing an error
    if (callFrame.argumentCount() < 1) {
        return globalObject.throw("Missing required argument", .{});
    }
    // Or returning a success value
    return JSC.JSValue.jsString(globalObject, "Success!");
}
```
This pattern allows Zig functions to:
1. Return JavaScript values on success
2. Throw JavaScript exceptions on error
3. Propagate errors automatically through the call stack
## Type Safety and Error Handling
The binding system includes robust error handling:
```cpp
// Example of type checking in generated code
JSTextDecoder* thisObject = jsDynamicCast<JSTextDecoder*>(callFrame->thisValue());
if (UNLIKELY(!thisObject)) {
    scope.throwException(lexicalGlobalObject,
        Bun::createInvalidThisError(lexicalGlobalObject, callFrame->thisValue(), "TextDecoder"_s));
    return {};
}
```
## Prototypal Inheritance
The binding system creates proper JavaScript prototype chains:
1. **Constructor**: JSTextDecoderConstructor with standard .prototype property
2. **Prototype**: JSTextDecoderPrototype with methods and properties
3. **Instances**: Each JSTextDecoder instance with `__proto__` pointing to the prototype
This ensures JavaScript inheritance works as expected:
```cpp
// From generated code
void JSTextDecoderConstructor::finishCreation(VM& vm, JSC::JSGlobalObject* globalObject, JSTextDecoderPrototype* prototype)
{
    Base::finishCreation(vm, 0, "TextDecoder"_s, PropertyAdditionMode::WithoutStructureTransition);
    // Set up the prototype chain
    putDirectWithoutTransition(vm, vm.propertyNames->prototype, prototype, PropertyAttribute::DontEnum | PropertyAttribute::DontDelete | PropertyAttribute::ReadOnly);
    ASSERT(inherits(info()));
}
```
## Performance Considerations
The binding system is optimized for performance:
1. **Direct Pointer Access**: JavaScript objects maintain a direct pointer to Zig objects
2. **Property Caching**: WriteBarrier caching avoids repeated native calls for stable properties
3. **Memory Management**: JSC garbage collection integrated with Zig memory management
4. **Type Conversion**: Fast paths for common JavaScript/Zig type conversions
## Creating a New Class Binding
To create a new class binding in Bun:
1. **Define the class interface** in a `.classes.ts` file:
```typescript
define({
  name: "MyClass",
  constructor: true,
  finalize: true,
  proto: {
    myMethod: {
      args: 1,
    },
    myProperty: {
      getter: true,
      cache: true,
    },
  },
});
```
2. **Implement the native functionality** in a `.zig` file:
```zig
pub const MyClass = struct {
    // Generated bindings
    pub const js = JSC.Codegen.JSMyClass;
    pub const toJS = js.toJS;
    pub const fromJS = js.fromJS;
    pub const fromJSDirect = js.fromJSDirect;

    // State
    value: []const u8,

    pub const new = bun.TrivialNew(@This());

    // Constructor
    pub fn constructor(
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!*MyClass {
        const arg = callFrame.argument(0);
        // Implementation
    }

    // Method
    pub fn myMethod(
        this: *MyClass,
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!JSC.JSValue {
        // Implementation
    }

    // Getter
    pub fn getMyProperty(this: *MyClass, globalObject: *JSGlobalObject) JSC.JSValue {
        return JSC.JSValue.jsString(globalObject, this.value);
    }

    // Resource cleanup
    pub fn deinit(this: *MyClass) void {
        // Clean up resources
    }

    pub fn finalize(this: *MyClass) void {
        this.deinit();
        bun.destroy(this);
    }
};
```
3. **The binding generator** creates all necessary C++ and Zig glue code to connect JavaScript and Zig, including:
- C++ class definitions
- Method and property bindings
- Memory management utilities
- GC integration code
## Generated Code Structure
The binding generator produces several components:
### 1. C++ Classes
For each Zig class, the system generates:
- **JS<Class>**: Main wrapper that holds a pointer to the Zig object (`JSTextDecoder`)
- **JS<Class>Prototype**: Contains methods and properties (`JSTextDecoderPrototype`)
- **JS<Class>Constructor**: Implementation of the JavaScript constructor (`JSTextDecoderConstructor`)
### 2. C++ Methods and Properties
- **Method Callbacks**: `TextDecoderPrototype__decodeCallback`
- **Property Getters/Setters**: `TextDecoderPrototype__encodingGetterWrap`
- **Initialization Functions**: `finishCreation` methods for setting up the class
### 3. Zig Bindings
- **External Function Declarations**:
```zig
extern fn TextDecoderPrototype__decode(*TextDecoder, *JSC.JSGlobalObject, *JSC.CallFrame) callconv(JSC.conv) JSC.EncodedJSValue;
```
- **Cached Value Accessors**:
```zig
pub fn encodingGetCached(thisValue: JSC.JSValue) ?JSC.JSValue { ... }
pub fn encodingSetCached(thisValue: JSC.JSValue, globalObject: *JSC.JSGlobalObject, value: JSC.JSValue) void { ... }
```
- **Constructor Helpers**:
```zig
pub fn create(globalObject: *JSC.JSGlobalObject) bun.JSError!JSC.JSValue { ... }
```
### 4. GC Integration
- **Memory Cost Calculation**: `estimatedSize` method
- **Child Visitor Methods**: `visitChildrenImpl` and `visitAdditionalChildren`
- **Heap Analysis**: `analyzeHeap` for debugging memory issues
This architecture makes it possible to implement high-performance native functionality in Zig while exposing a clean, idiomatic JavaScript API to users.

View File

@@ -1,58 +0,0 @@
name: Codex Test Sync

on:
  pull_request:
    types: [labeled, opened]

env:
  BUN_VERSION: "1.2.15"

jobs:
  sync-node-tests:
    runs-on: ubuntu-latest
    if: |
      (github.event.action == 'labeled' && github.event.label.name == 'codex') ||
      (github.event.action == 'opened' && contains(github.event.pull_request.labels.*.name, 'codex')) ||
      contains(github.head_ref, 'codex')
    permissions:
      contents: write
      pull-requests: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          fetch-depth: 0

      - name: Setup Bun
        uses: ./.github/actions/setup-bun
        with:
          bun-version: ${{ env.BUN_VERSION }}

      - name: Get changed files
        id: changed-files
        uses: tj-actions/changed-files@v44
        with:
          files: |
            test/js/node/test/parallel/**/*.{js,mjs,ts}
            test/js/node/test/sequential/**/*.{js,mjs,ts}

      - name: Sync tests
        if: steps.changed-files.outputs.any_changed == 'true'
        shell: bash
        run: |
          echo "Changed test files:"
          echo "${{ steps.changed-files.outputs.all_changed_files }}"

          # Process each changed test file
          for file in ${{ steps.changed-files.outputs.all_changed_files }}; do
            # Extract test name from file path
            test_name=$(basename "$file" | sed 's/\.[^.]*$//')
            echo "Syncing test: $test_name"
            bun node:test:cp "$test_name"
          done

      - name: Commit changes
        uses: stefanzweifel/git-auto-commit-action@v5
        with:
          commit_message: "Sync Node.js tests with upstream"

1
.gitignore vendored
View File

@@ -9,6 +9,7 @@
.ninja_deps
.ninja_log
.npm
.npmrc
.npm.gz
.parcel-cache
.swcrc

View File

@@ -94,7 +94,7 @@ test("(multi-file test) my feature", async () => {
- Always use `port: 0`. Do not hardcode ports. Do not use your own random port number function.
- Use `normalizeBunSnapshot` to normalize snapshot output of the test.
- NEVER write tests that check for no "panic" or "uncaught exception" or similar in the test output. That is NOT a valid test.
- NEVER write tests that check for no "panic" or "uncaught exception" or similar in the test output. These tests will never fail in CI.
- Use `tempDir` from `"harness"` to create a temporary directory. **Do not** use `tmpdirSync` or `fs.mkdtempSync` to create temporary directories.
- When spawning processes, tests should expect(stdout).toBe(...) BEFORE expect(exitCode).toBe(0). This gives you a more useful error message on test failure.
- **CRITICAL**: Do not write flaky tests. Do not use `setTimeout` in tests. Instead, `await` the condition to be met. You are not testing the TIME PASSING, you are testing the CONDITION.

View File

@@ -47,15 +47,7 @@ include(SetupEsbuild)
include(SetupZig)
include(SetupRust)
find_program(SCCACHE_PROGRAM sccache)
if(SCCACHE_PROGRAM AND NOT DEFINED ENV{NO_SCCACHE})
include(SetupSccache)
else()
find_program(CCACHE_PROGRAM ccache)
if(CCACHE_PROGRAM)
include(SetupCcache)
endif()
endif()
include(SetupCcache)
# Generate dependency versions header
include(GenerateDependencyVersions)

View File

@@ -23,7 +23,7 @@ Using your system's package manager, install Bun's dependencies:
{% codetabs group="os" %}
```bash#macOS (Homebrew)
$ brew install automake cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby sccache
$ brew install automake ccache cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby
```
```bash#Ubuntu/Debian
@@ -65,43 +65,28 @@ $ brew install bun
{% /codetabs %}
### Optional: Install `sccache`
### Optional: Install `ccache`
sccache is used to cache compilation artifacts, significantly speeding up builds. It must be installed with S3 support:
ccache is used to cache compilation artifacts, significantly speeding up builds:
```bash
# For macOS
$ brew install sccache
$ brew install ccache
# For Linux. Note that the version in your package manager may not have S3 support.
$ cargo install sccache --features=s3
# For Ubuntu/Debian
$ sudo apt install ccache
# For Arch
$ sudo pacman -S ccache
# For Fedora
$ sudo dnf install ccache
# For openSUSE
$ sudo zypper install ccache
```
This will install `sccache` with S3 support. Our build scripts will automatically detect and use `sccache` with our shared S3 cache. **Note**: Not all versions of `sccache` are compiled with S3 support, hence we recommend installing it via `cargo`.
#### Registering AWS Credentials for `sccache` (Core Developers Only)
Core developers have write access to the shared S3 cache. To enable write access, you must log in with AWS credentials. The easiest way to do this is to use the [`aws` CLI](https://aws.amazon.com/cli/) and invoke [`aws configure` to provide your AWS security info](https://docs.aws.amazon.com/cli/latest/reference/configure/).
The `cmake` scripts should automatically detect your AWS credentials from the environment or the `~/.aws/credentials` file.
<details>
<summary>Logging in to the `aws` CLI</summary>
1. Install the AWS CLI by following [the official guide](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).
2. Log in to your AWS account console. A team member should provide you with your credentials.
3. Click your name in the top right > Security credentials.
4. Scroll to "Access keys" and create a new access key.
5. Run `aws configure` in your terminal and provide the access key ID and secret access key when prompted.
</details>
<details>
<summary>Common Issues You May Encounter</summary>
- To confirm that the cache is being used, you can use the `sccache --show-stats` command right after a build. This will expose very useful statistics, including cache hits/misses.
- If you have multiple AWS profiles configured, ensure that the correct profile is set in the `AWS_PROFILE` environment variable.
- `sccache` follows a server-client model. If you run into weird issues where `sccache` refuses to use S3, even though you have AWS credentials configured, try killing any running `sccache` servers with `sccache --stop-server` and then re-running the build.
</details>
Our build scripts will automatically detect and use `ccache` if available. You can check cache statistics with `ccache --show-stats`.
## Install LLVM

2
LATEST
View File

@@ -1 +1 @@
1.3.3
1.3.6

51
TODO.md
View File

@@ -1,51 +0,0 @@
- [ ] issue-207.test.ts: update webkit to allow overridenDateNow to be negative. we will have to pick some other value to be none, eg NaN.
- [ ] process.hrtime (same as performance.now, resets to [0, 0] when fake timers are enabled)
- [ ] decimal ticks should be supported: https://github.com/sinonjs/fake-timers/blob/main/test/issue-207-test.js
- [ ] what happens when you add negative time?
- [ ] test abortsignal
- [ ] test spawnSync
- [ ] support performance.now
- [ ] support Date.now()
- [ ] support useFakeTimers in combination with setSystemTime
- [ ] support functions:
- [x] advanceTimersByTime
- [ ] advanceTimersByTimeAsync\*
- [x] advanceTimersToNextTimer
- [ ] advanceTimersToNextTimerAsync\*
- [ ] advanceTimersToNextFrame
- [x] getTimerCount
- [x] clearAllTimers
- [ ] getMockedSystemTime
- [ ] getRealSystemTime
- [ ] runAllTicks
- [x] runAllTimers
- [ ] runAllTimersAsync\*
- [x] runOnlyPendingTimers
- [x] make sure `[10, [20]], [30, [40]]` runs 10, 20, and 30
- [ ] runOnlyPendingTimersAsync
- [ ] setSystemTime
- [x] useFakeTimers
- [x] isFakeTimers
- [x] useRealTimers
- [ ] the result of 'arm' ('.disarm'/'.rearm') seems to be ignored? both disarm and rearm? we should change it to return void if it's actually ignored
- [ ] we can make this change separately in main
- [ ] see how fake timers works with setSystemTime
- [ ] handle the config argument for useFakeTimers
- vitest has the toFake parameter for enabling nextTick and queueMicrotask
- jest has a timerLimit parameter
- [ ] make sure there is no memory leak if you create a setTimeout/setInterval and clear or stop the fake timers before it fires repeatedly.
- [ ] audit the allowFakeTimers list (do we want subprocess timeouts to count? and others)
- we will also need some timespec.now() calls to exclude mocked time, ie `timespec.nowUnmocked()`
- since test timeouts must be unmocked, we need BunTest timeouts to use unmocked time
- [ ] audit all timespec.now() calls to replace some with timespec.nowUnmocked()
- [ ] add a test that fake timers do not break test duration calculations
- [ ] test the order of '0' timeouts, decide if we will match or not
- [ ] consider supporting edge-case where `timeout0(A, timeout0(B)), timeout0(C)` prints `A=0, C=0, B=1` (date.now())
- [ ] \* support async functions (maybe defer these for later? the first PR does not need to have everything. we can stub the unimplemented functions.)
- [ ] add types for all functions
- [ ] add docs for all functions (at minimum: documentation comments in the types)
- [ ] determine which tags should have timer-faking enabled (eg should subprocess timeout? probably not but maybe)
- [ ] find real projects using fake timers, see if they work
- [ ] make sure we match jest fake timers also

View File

@@ -1,5 +1,6 @@
{
"lockfileVersion": 1,
"configVersion": 0,
"workspaces": {
"": {
"name": "bench",
@@ -22,6 +23,7 @@
"react-dom": "^18.3.1",
"string-width": "7.1.0",
"strip-ansi": "^7.1.0",
"tar": "^7.4.3",
"tinycolor2": "^1.6.0",
"zx": "^7.2.3",
},
@@ -107,6 +109,8 @@
"@fastify/proxy-addr": ["@fastify/proxy-addr@5.0.0", "", { "dependencies": { "@fastify/forwarded": "^3.0.0", "ipaddr.js": "^2.1.0" } }, "sha512-37qVVA1qZ5sgH7KpHkkC4z9SK6StIsIcOmpjvMPXNb3vx2GQxhZocogVYbr2PbbeLCQxYIPDok307xEvRZOzGA=="],
"@isaacs/fs-minipass": ["@isaacs/fs-minipass@4.0.1", "", { "dependencies": { "minipass": "^7.0.4" } }, "sha512-wgm9Ehl2jpeqP3zw/7mo3kRHFp5MEDhqAdwy1fTGkHAwnkGOVsgpvQhL8B5n1qlb01jV3n/bI0ZfZp5lWA1k4w=="],
"@jridgewell/gen-mapping": ["@jridgewell/gen-mapping@0.1.1", "", { "dependencies": { "@jridgewell/set-array": "^1.0.0", "@jridgewell/sourcemap-codec": "^1.4.10" } }, "sha512-sQXCasFk+U8lWYEe66WxRDOE9PjVz4vSM51fTu3Hw+ClTpUSQb718772vH3pyS5pShp6lvQM7SxgIDXXXmOX7w=="],
"@jridgewell/resolve-uri": ["@jridgewell/resolve-uri@3.1.0", "", {}, "sha512-F2msla3tad+Mfht5cJq7LSXcdudKTWCVYUgw6pLFOOHSTtZlj6SWNYAp+AhuqLmWdBO2X5hPrLcu8cVP8fy28w=="],
@@ -181,6 +185,8 @@
"chalk": ["chalk@5.3.0", "", {}, "sha512-dLitG79d+GV1Nb/VYcCDFivJeK1hiukt9QjRNVOsUtTy1rR1YJsmpGGTZ3qJos+uw7WmWF4wUwBd9jxjocFC2w=="],
"chownr": ["chownr@3.0.0", "", {}, "sha512-+IxzY9BZOQd/XuYPRmrvEVjF/nqj5kgT4kEq7VofrDoM1MxoRjEWkrCC3EtLi59TVawxTAn+orJwFQcrqEN1+g=="],
"color": ["color@4.2.3", "", { "dependencies": { "color-convert": "^2.0.1", "color-string": "^1.9.0" } }, "sha512-1rXeuUUiGGrykh+CeBdu5Ie7OJwinCgQY0bc7GCRxy5xVHy+moaqkpL/jqQq0MtQOeYcrqEz4abc5f0KtU7W4A=="],
"color-convert": ["color-convert@2.0.1", "", { "dependencies": { "color-name": "~1.1.4" } }, "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ=="],
@@ -361,6 +367,10 @@
"minimist": ["minimist@1.2.8", "", {}, "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA=="],
"minipass": ["minipass@7.1.2", "", {}, "sha512-qOOzS1cBTWYF4BH8fVePDBOO9iptMnGUEZwNc/cMWnTV2nVLZ7VoNWEPHkYczZA0pdoA7dl6e7FL659nX9S2aw=="],
"minizlib": ["minizlib@3.1.0", "", { "dependencies": { "minipass": "^7.1.2" } }, "sha512-KZxYo1BUkWD2TVFLr0MQoM8vUUigWD3LlD83a/75BqC+4qE0Hb1Vo5v1FgcfaNXvfXzr+5EhQ6ing/CaBijTlw=="],
"mitata": ["mitata@1.0.25", "", {}, "sha512-0v5qZtVW5vwj9FDvYfraR31BMDcRLkhSFWPTLaxx/Z3/EvScfVtAAWtMI2ArIbBcwh7P86dXh0lQWKiXQPlwYA=="],
"ms": ["ms@2.1.2", "", {}, "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="],
@@ -457,6 +467,8 @@
"supports-color": ["supports-color@5.5.0", "", { "dependencies": { "has-flag": "^3.0.0" } }, "sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow=="],
"tar": ["tar@7.5.2", "", { "dependencies": { "@isaacs/fs-minipass": "^4.0.0", "chownr": "^3.0.0", "minipass": "^7.1.2", "minizlib": "^3.1.0", "yallist": "^5.0.0" } }, "sha512-7NyxrTE4Anh8km8iEy7o0QYPs+0JKBTj5ZaqHg6B39erLg0qYXN3BijtShwbsNSvQ+LN75+KV+C4QR/f6Gwnpg=="],
"thread-stream": ["thread-stream@3.1.0", "", { "dependencies": { "real-require": "^0.2.0" } }, "sha512-OqyPZ9u96VohAyMfJykzmivOrY2wfMSf3C5TtFJVgN+Hm6aj+voFhlK+kZEIv2FBh1X6Xp3DlnCOfEQ3B2J86A=="],
"through": ["through@2.3.8", "", {}, "sha512-w89qg7PI8wAdvX60bMDP+bFoD5Dvhm9oLheFp5O4a2QF0cSBGsBX4qZmadPMvVqlLJBBci+WqGGOAPvcDeNSVg=="],
@@ -481,7 +493,7 @@
"which": ["which@3.0.1", "", { "dependencies": { "isexe": "^2.0.0" }, "bin": { "node-which": "bin/which.js" } }, "sha512-XA1b62dzQzLfaEOSQFTCOd5KFf/1VSzZo7/7TUjnya6u0vGGKzU96UQBZTAThCb2j4/xjBAyii1OhRLJEivHvg=="],
"yallist": ["yallist@3.1.1", "", {}, "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g=="],
"yallist": ["yallist@5.0.0", "", {}, "sha512-YgvUTfwqyc7UXVMrB+SImsVYSmTS8X/tSrtdNZMImM+n7+QTriRXyXim0mBrTXNeqzVF0KWGgHPeiyViFFrNDw=="],
"yaml": ["yaml@2.3.4", "", {}, "sha512-8aAvwVUSHpfEqTQ4w/KMlf3HcRdt50E5ODIQJBw1fQ5RL34xabzxtUlzTXVqc4rkZsPbvrXKWnABCD7kWSmocA=="],
@@ -501,6 +513,8 @@
"light-my-request/process-warning": ["process-warning@4.0.1", "", {}, "sha512-3c2LzQ3rY9d0hc1emcsHhfT9Jwz0cChib/QN89oME2R451w5fy3f0afAhERFZAwrbDU43wk12d0ORBpDVME50Q=="],
"lru-cache/yallist": ["yallist@3.1.1", "", {}, "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g=="],
"npm-run-path/path-key": ["path-key@4.0.0", "", {}, "sha512-haREypq7xkM7ErfgIyA0z+Bj4AGKlMSdlQE2jvJo6huWD1EdkKYV+G/T4nq0YEF2vgTT8kqMFKo1uHn950r4SQ=="],
"ansi-styles/color-convert/color-name": ["color-name@1.1.3", "", {}, "sha512-72fSenhMw2HZMTVHeCA9KCmpEIbzWiQsjN+BHcBbS9vr1mtt+vJjPdksIBNUmKAW8TFUDPJK5SUU3QhE9NEXDw=="],

View File

@@ -19,6 +19,7 @@
"react-dom": "^18.3.1",
"string-width": "7.1.0",
"strip-ansi": "^7.1.0",
"tar": "^7.4.3",
"tinycolor2": "^1.6.0",
"zx": "^7.2.3"
},

477
bench/snippets/archive.mjs Normal file
View File

@@ -0,0 +1,477 @@
import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { Pack, Unpack } from "tar";
import { bench, group, run } from "../runner.mjs";
// Check if Bun.Archive is available
const hasBunArchive = typeof Bun !== "undefined" && typeof Bun.Archive !== "undefined";
// Test data sizes
const smallContent = "Hello, World!";
const mediumContent = Buffer.alloc(10 * 1024, "x").toString(); // 10KB
const largeContent = Buffer.alloc(100 * 1024, "x").toString(); // 100KB
// Create test files for node-tar (it reads from filesystem)
const setupDir = mkdtempSync(join(tmpdir(), "archive-bench-setup-"));
function setupNodeTarFiles(prefix, files) {
const dir = join(setupDir, prefix);
mkdirSync(dir, { recursive: true });
for (const [name, content] of Object.entries(files)) {
const filePath = join(dir, name);
const fileDir = join(filePath, "..");
mkdirSync(fileDir, { recursive: true });
writeFileSync(filePath, content);
}
return dir;
}
// Setup directories for different test cases
const smallFilesDir = setupNodeTarFiles("small", {
"file1.txt": smallContent,
"file2.txt": smallContent,
"file3.txt": smallContent,
});
const mediumFilesDir = setupNodeTarFiles("medium", {
"file1.txt": mediumContent,
"file2.txt": mediumContent,
"file3.txt": mediumContent,
});
const largeFilesDir = setupNodeTarFiles("large", {
"file1.txt": largeContent,
"file2.txt": largeContent,
"file3.txt": largeContent,
});
const manyFilesEntries = {};
for (let i = 0; i < 100; i++) {
manyFilesEntries[`file${i}.txt`] = smallContent;
}
const manyFilesDir = setupNodeTarFiles("many", manyFilesEntries);
// Pre-create archives for extraction benchmarks
let smallTarGzBuffer, mediumTarGzBuffer, largeTarGzBuffer, manyFilesTarGzBuffer;
let smallTarBuffer, mediumTarBuffer, largeTarBuffer, manyFilesTarBuffer;
let smallBunArchiveGz, mediumBunArchiveGz, largeBunArchiveGz, manyFilesBunArchiveGz;
let smallBunArchive, mediumBunArchive, largeBunArchive, manyFilesBunArchive;
// Create tar buffer using node-tar (with optional gzip)
async function createNodeTarBuffer(cwd, files, gzip = false) {
return new Promise(resolve => {
const pack = new Pack({ cwd, gzip });
const bufs = [];
pack.on("data", chunk => bufs.push(chunk));
pack.on("end", () => resolve(Buffer.concat(bufs)));
for (const file of files) {
pack.add(file);
}
pack.end();
});
}
// Extract tar buffer using node-tar
async function extractNodeTarBuffer(buffer, cwd) {
return new Promise((resolve, reject) => {
const unpack = new Unpack({ cwd });
unpack.on("end", resolve);
unpack.on("error", reject);
unpack.end(buffer);
});
}
// Initialize gzipped archives
smallTarGzBuffer = await createNodeTarBuffer(smallFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
mediumTarGzBuffer = await createNodeTarBuffer(mediumFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
largeTarGzBuffer = await createNodeTarBuffer(largeFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
manyFilesTarGzBuffer = await createNodeTarBuffer(manyFilesDir, Object.keys(manyFilesEntries), true);
// Initialize uncompressed archives
smallTarBuffer = await createNodeTarBuffer(smallFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
mediumTarBuffer = await createNodeTarBuffer(mediumFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
largeTarBuffer = await createNodeTarBuffer(largeFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
manyFilesTarBuffer = await createNodeTarBuffer(manyFilesDir, Object.keys(manyFilesEntries), false);
const smallFiles = { "file1.txt": smallContent, "file2.txt": smallContent, "file3.txt": smallContent };
const mediumFiles = { "file1.txt": mediumContent, "file2.txt": mediumContent, "file3.txt": mediumContent };
const largeFiles = { "file1.txt": largeContent, "file2.txt": largeContent, "file3.txt": largeContent };
if (hasBunArchive) {
smallBunArchiveGz = await Bun.Archive.from(smallFiles).bytes("gzip");
mediumBunArchiveGz = await Bun.Archive.from(mediumFiles).bytes("gzip");
largeBunArchiveGz = await Bun.Archive.from(largeFiles).bytes("gzip");
manyFilesBunArchiveGz = await Bun.Archive.from(manyFilesEntries).bytes("gzip");
smallBunArchive = await Bun.Archive.from(smallFiles).bytes();
mediumBunArchive = await Bun.Archive.from(mediumFiles).bytes();
largeBunArchive = await Bun.Archive.from(largeFiles).bytes();
manyFilesBunArchive = await Bun.Archive.from(manyFilesEntries).bytes();
}
// Create reusable extraction directories (overwriting is fine)
const extractDirNodeTar = mkdtempSync(join(tmpdir(), "archive-bench-extract-node-"));
const extractDirBun = mkdtempSync(join(tmpdir(), "archive-bench-extract-bun-"));
const writeDirNodeTar = mkdtempSync(join(tmpdir(), "archive-bench-write-node-"));
const writeDirBun = mkdtempSync(join(tmpdir(), "archive-bench-write-bun-"));
// ============================================================================
// Create .tar (uncompressed) benchmarks
// ============================================================================
group("create .tar (3 small files)", () => {
bench("node-tar", async () => {
await createNodeTarBuffer(smallFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(smallFiles).bytes();
});
}
});
group("create .tar (3 x 100KB files)", () => {
bench("node-tar", async () => {
await createNodeTarBuffer(largeFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(largeFiles).bytes();
});
}
});
group("create .tar (100 small files)", () => {
bench("node-tar", async () => {
await createNodeTarBuffer(manyFilesDir, Object.keys(manyFilesEntries), false);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(manyFilesEntries).bytes();
});
}
});
// ============================================================================
// Create .tar.gz (compressed) benchmarks
// ============================================================================
group("create .tar.gz (3 small files)", () => {
bench("node-tar", async () => {
await createNodeTarBuffer(smallFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(smallFiles).bytes("gzip");
});
}
});
group("create .tar.gz (3 x 100KB files)", () => {
bench("node-tar", async () => {
await createNodeTarBuffer(largeFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(largeFiles).bytes("gzip");
});
}
});
group("create .tar.gz (100 small files)", () => {
bench("node-tar", async () => {
await createNodeTarBuffer(manyFilesDir, Object.keys(manyFilesEntries), true);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(manyFilesEntries).bytes("gzip");
});
}
});
// ============================================================================
// Extract .tar (uncompressed) benchmarks
// ============================================================================
group("extract .tar (3 small files)", () => {
bench("node-tar", async () => {
await extractNodeTarBuffer(smallTarBuffer, extractDirNodeTar);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(smallBunArchive).extract(extractDirBun);
});
}
});
group("extract .tar (3 x 100KB files)", () => {
bench("node-tar", async () => {
await extractNodeTarBuffer(largeTarBuffer, extractDirNodeTar);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(largeBunArchive).extract(extractDirBun);
});
}
});
group("extract .tar (100 small files)", () => {
bench("node-tar", async () => {
await extractNodeTarBuffer(manyFilesTarBuffer, extractDirNodeTar);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(manyFilesBunArchive).extract(extractDirBun);
});
}
});
// ============================================================================
// Extract .tar.gz (compressed) benchmarks
// ============================================================================
group("extract .tar.gz (3 small files)", () => {
bench("node-tar", async () => {
await extractNodeTarBuffer(smallTarGzBuffer, extractDirNodeTar);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(smallBunArchiveGz).extract(extractDirBun);
});
}
});
group("extract .tar.gz (3 x 100KB files)", () => {
bench("node-tar", async () => {
await extractNodeTarBuffer(largeTarGzBuffer, extractDirNodeTar);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(largeBunArchiveGz).extract(extractDirBun);
});
}
});
group("extract .tar.gz (100 small files)", () => {
bench("node-tar", async () => {
await extractNodeTarBuffer(manyFilesTarGzBuffer, extractDirNodeTar);
});
if (hasBunArchive) {
bench("Bun.Archive", async () => {
await Bun.Archive.from(manyFilesBunArchiveGz).extract(extractDirBun);
});
}
});
// ============================================================================
// Write .tar to disk benchmarks
// ============================================================================
let writeCounter = 0;
group("write .tar to disk (3 small files)", () => {
bench("node-tar + writeFileSync", async () => {
const buffer = await createNodeTarBuffer(smallFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
writeFileSync(join(writeDirNodeTar, `archive-${writeCounter++}.tar`), buffer);
});
if (hasBunArchive) {
bench("Bun.Archive.write", async () => {
await Bun.Archive.write(join(writeDirBun, `archive-${writeCounter++}.tar`), smallFiles);
});
}
});
group("write .tar to disk (3 x 100KB files)", () => {
bench("node-tar + writeFileSync", async () => {
const buffer = await createNodeTarBuffer(largeFilesDir, ["file1.txt", "file2.txt", "file3.txt"], false);
writeFileSync(join(writeDirNodeTar, `archive-${writeCounter++}.tar`), buffer);
});
if (hasBunArchive) {
bench("Bun.Archive.write", async () => {
await Bun.Archive.write(join(writeDirBun, `archive-${writeCounter++}.tar`), largeFiles);
});
}
});
group("write .tar to disk (100 small files)", () => {
bench("node-tar + writeFileSync", async () => {
const buffer = await createNodeTarBuffer(manyFilesDir, Object.keys(manyFilesEntries), false);
writeFileSync(join(writeDirNodeTar, `archive-${writeCounter++}.tar`), buffer);
});
if (hasBunArchive) {
bench("Bun.Archive.write", async () => {
await Bun.Archive.write(join(writeDirBun, `archive-${writeCounter++}.tar`), manyFilesEntries);
});
}
});
// ============================================================================
// Write .tar.gz to disk benchmarks
// ============================================================================
group("write .tar.gz to disk (3 small files)", () => {
bench("node-tar + writeFileSync", async () => {
const buffer = await createNodeTarBuffer(smallFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
writeFileSync(join(writeDirNodeTar, `archive-${writeCounter++}.tar.gz`), buffer);
});
if (hasBunArchive) {
bench("Bun.Archive.write", async () => {
await Bun.Archive.write(join(writeDirBun, `archive-${writeCounter++}.tar.gz`), smallFiles, "gzip");
});
}
});
group("write .tar.gz to disk (3 x 100KB files)", () => {
bench("node-tar + writeFileSync", async () => {
const buffer = await createNodeTarBuffer(largeFilesDir, ["file1.txt", "file2.txt", "file3.txt"], true);
writeFileSync(join(writeDirNodeTar, `archive-${writeCounter++}.tar.gz`), buffer);
});
if (hasBunArchive) {
bench("Bun.Archive.write", async () => {
await Bun.Archive.write(join(writeDirBun, `archive-${writeCounter++}.tar.gz`), largeFiles, "gzip");
});
}
});
group("write .tar.gz to disk (100 small files)", () => {
bench("node-tar + writeFileSync", async () => {
const buffer = await createNodeTarBuffer(manyFilesDir, Object.keys(manyFilesEntries), true);
writeFileSync(join(writeDirNodeTar, `archive-${writeCounter++}.tar.gz`), buffer);
});
if (hasBunArchive) {
bench("Bun.Archive.write", async () => {
await Bun.Archive.write(join(writeDirBun, `archive-${writeCounter++}.tar.gz`), manyFilesEntries, "gzip");
});
}
});
// ============================================================================
// Get files array from archive (files() method) benchmarks
// ============================================================================
// Helper to get files array from node-tar (reads all entries into memory)
async function getFilesArrayNodeTar(buffer) {
return new Promise((resolve, reject) => {
const files = new Map();
let pending = 0;
let closed = false;
const maybeResolve = () => {
if (closed && pending === 0) {
resolve(files);
}
};
const unpack = new Unpack({
onReadEntry: entry => {
if (entry.type === "File") {
pending++;
const chunks = [];
entry.on("data", chunk => chunks.push(chunk));
entry.on("end", () => {
const content = Buffer.concat(chunks);
// Create a File-like object similar to Bun.Archive.files()
files.set(entry.path, new Blob([content]));
pending--;
maybeResolve();
});
}
entry.resume(); // Drain the entry
},
});
unpack.on("close", () => {
closed = true;
maybeResolve();
});
unpack.on("error", reject);
unpack.end(buffer);
});
}
group("files() - get all files as Map (3 small files)", () => {
bench("node-tar", async () => {
await getFilesArrayNodeTar(smallTarBuffer);
});
if (hasBunArchive) {
bench("Bun.Archive.files()", async () => {
await Bun.Archive.from(smallBunArchive).files();
});
}
});
group("files() - get all files as Map (3 x 100KB files)", () => {
bench("node-tar", async () => {
await getFilesArrayNodeTar(largeTarBuffer);
});
if (hasBunArchive) {
bench("Bun.Archive.files()", async () => {
await Bun.Archive.from(largeBunArchive).files();
});
}
});
group("files() - get all files as Map (100 small files)", () => {
bench("node-tar", async () => {
await getFilesArrayNodeTar(manyFilesTarBuffer);
});
if (hasBunArchive) {
bench("Bun.Archive.files()", async () => {
await Bun.Archive.from(manyFilesBunArchive).files();
});
}
});
group("files() - get all files as Map from .tar.gz (3 small files)", () => {
bench("node-tar", async () => {
await getFilesArrayNodeTar(smallTarGzBuffer);
});
if (hasBunArchive) {
bench("Bun.Archive.files()", async () => {
await Bun.Archive.from(smallBunArchiveGz).files();
});
}
});
group("files() - get all files as Map from .tar.gz (100 small files)", () => {
bench("node-tar", async () => {
await getFilesArrayNodeTar(manyFilesTarGzBuffer);
});
if (hasBunArchive) {
bench("Bun.Archive.files()", async () => {
await Bun.Archive.from(manyFilesBunArchiveGz).files();
});
}
});
await run();
// Cleanup
rmSync(setupDir, { recursive: true, force: true });
rmSync(extractDirNodeTar, { recursive: true, force: true });
rmSync(extractDirBun, { recursive: true, force: true });
rmSync(writeDirNodeTar, { recursive: true, force: true });
rmSync(writeDirBun, { recursive: true, force: true });

335
bench/snippets/array-of.js Normal file
View File

@@ -0,0 +1,335 @@
import { bench, run } from "../runner.mjs";
let sink;
// Integers
bench("int: Array.of(1,2,3,4,5)", () => {
sink = Array.of(1, 2, 3, 4, 5);
});
bench("int: Array.of(100 elements)", () => {
sink = Array.of(
0,
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30,
31,
32,
33,
34,
35,
36,
37,
38,
39,
40,
41,
42,
43,
44,
45,
46,
47,
48,
49,
50,
51,
52,
53,
54,
55,
56,
57,
58,
59,
60,
61,
62,
63,
64,
65,
66,
67,
68,
69,
70,
71,
72,
73,
74,
75,
76,
77,
78,
79,
80,
81,
82,
83,
84,
85,
86,
87,
88,
89,
90,
91,
92,
93,
94,
95,
96,
97,
98,
99,
);
});
// Doubles
bench("double: Array.of(1.1,2.2,3.3,4.4,5.5)", () => {
sink = Array.of(1.1, 2.2, 3.3, 4.4, 5.5);
});
bench("double: Array.of(100 elements)", () => {
sink = Array.of(
0.1,
1.1,
2.1,
3.1,
4.1,
5.1,
6.1,
7.1,
8.1,
9.1,
10.1,
11.1,
12.1,
13.1,
14.1,
15.1,
16.1,
17.1,
18.1,
19.1,
20.1,
21.1,
22.1,
23.1,
24.1,
25.1,
26.1,
27.1,
28.1,
29.1,
30.1,
31.1,
32.1,
33.1,
34.1,
35.1,
36.1,
37.1,
38.1,
39.1,
40.1,
41.1,
42.1,
43.1,
44.1,
45.1,
46.1,
47.1,
48.1,
49.1,
50.1,
51.1,
52.1,
53.1,
54.1,
55.1,
56.1,
57.1,
58.1,
59.1,
60.1,
61.1,
62.1,
63.1,
64.1,
65.1,
66.1,
67.1,
68.1,
69.1,
70.1,
71.1,
72.1,
73.1,
74.1,
75.1,
76.1,
77.1,
78.1,
79.1,
80.1,
81.1,
82.1,
83.1,
84.1,
85.1,
86.1,
87.1,
88.1,
89.1,
90.1,
91.1,
92.1,
93.1,
94.1,
95.1,
96.1,
97.1,
98.1,
99.1,
);
});
// Objects
bench("object: Array.of(obj x5)", () => {
sink = Array.of({ a: 1 }, { a: 2 }, { a: 3 }, { a: 4 }, { a: 5 });
});
bench("object: Array.of(100 elements)", () => {
sink = Array.of(
{ a: 0 },
{ a: 1 },
{ a: 2 },
{ a: 3 },
{ a: 4 },
{ a: 5 },
{ a: 6 },
{ a: 7 },
{ a: 8 },
{ a: 9 },
{ a: 10 },
{ a: 11 },
{ a: 12 },
{ a: 13 },
{ a: 14 },
{ a: 15 },
{ a: 16 },
{ a: 17 },
{ a: 18 },
{ a: 19 },
{ a: 20 },
{ a: 21 },
{ a: 22 },
{ a: 23 },
{ a: 24 },
{ a: 25 },
{ a: 26 },
{ a: 27 },
{ a: 28 },
{ a: 29 },
{ a: 30 },
{ a: 31 },
{ a: 32 },
{ a: 33 },
{ a: 34 },
{ a: 35 },
{ a: 36 },
{ a: 37 },
{ a: 38 },
{ a: 39 },
{ a: 40 },
{ a: 41 },
{ a: 42 },
{ a: 43 },
{ a: 44 },
{ a: 45 },
{ a: 46 },
{ a: 47 },
{ a: 48 },
{ a: 49 },
{ a: 50 },
{ a: 51 },
{ a: 52 },
{ a: 53 },
{ a: 54 },
{ a: 55 },
{ a: 56 },
{ a: 57 },
{ a: 58 },
{ a: 59 },
{ a: 60 },
{ a: 61 },
{ a: 62 },
{ a: 63 },
{ a: 64 },
{ a: 65 },
{ a: 66 },
{ a: 67 },
{ a: 68 },
{ a: 69 },
{ a: 70 },
{ a: 71 },
{ a: 72 },
{ a: 73 },
{ a: 74 },
{ a: 75 },
{ a: 76 },
{ a: 77 },
{ a: 78 },
{ a: 79 },
{ a: 80 },
{ a: 81 },
{ a: 82 },
{ a: 83 },
{ a: 84 },
{ a: 85 },
{ a: 86 },
{ a: 87 },
{ a: 88 },
{ a: 89 },
{ a: 90 },
{ a: 91 },
{ a: 92 },
{ a: 93 },
{ a: 94 },
{ a: 95 },
{ a: 96 },
{ a: 97 },
{ a: 98 },
{ a: 99 },
);
});
await run();

View File

@@ -0,0 +1,4 @@
// Child process for IPC benchmarks - echoes messages back to parent
process.on("message", message => {
process.send(message);
});

View File

@@ -0,0 +1,45 @@
import { fork } from "node:child_process";
import path from "node:path";
import { fileURLToPath } from "node:url";
import { bench, run } from "../runner.mjs";
const __dirname = path.dirname(fileURLToPath(import.meta.url));
const childPath = path.join(__dirname, "ipc-json-child.mjs");
const smallMessage = { type: "ping", id: 1 };
const largeString = Buffer.alloc(10 * 1024 * 1024, "A").toString();
const largeMessage = { type: "ping", id: 1, data: largeString };
async function runBenchmark(message, count) {
let received = 0;
const { promise, resolve } = Promise.withResolvers();
const child = fork(childPath, [], {
stdio: ["ignore", "ignore", "ignore", "ipc"],
serialization: "json",
});
child.on("message", () => {
received++;
if (received >= count) {
resolve();
}
});
for (let i = 0; i < count; i++) {
child.send(message);
}
await promise;
child.kill();
}
bench("ipc json - small messages (1000 roundtrips)", async () => {
await runBenchmark(smallMessage, 1000);
});
bench("ipc json - 10MB messages (10 roundtrips)", async () => {
await runBenchmark(largeMessage, 10);
});
await run();

View File

@@ -0,0 +1,57 @@
import { bench, run } from "../runner.mjs";
const obj = { a: 1, b: 2, c: 3 };
const objDeep = { a: 1, b: 2, c: 3, d: 4, e: 5, f: 6, g: 7, h: 8 };
const sym = Symbol("test");
const objWithSymbol = { [sym]: 1, a: 2 };
const objs = [
{ f: 50 },
{ f: 50, g: 70 },
{ g: 50, f: 70 },
{ h: 50, f: 70 },
{ z: 50, f: 70 },
{ k: 50, f: 70 },
];
bench("Object.hasOwn - hit", () => {
return Object.hasOwn(obj, "a");
});
bench("Object.hasOwn - miss", () => {
return Object.hasOwn(obj, "z");
});
bench("Object.hasOwn - symbol hit", () => {
return Object.hasOwn(objWithSymbol, sym);
});
bench("Object.hasOwn - symbol miss", () => {
return Object.hasOwn(objWithSymbol, Symbol("other"));
});
bench("Object.hasOwn - multiple shapes", () => {
let result = true;
for (let i = 0; i < objs.length; i++) {
result = Object.hasOwn(objs[i], "f") && result;
}
return result;
});
bench("Object.prototype.hasOwnProperty - hit", () => {
return obj.hasOwnProperty("a");
});
bench("Object.prototype.hasOwnProperty - miss", () => {
return obj.hasOwnProperty("z");
});
bench("in operator - hit", () => {
return "a" in obj;
});
bench("in operator - miss", () => {
return "z" in obj;
});
await run();

View File

@@ -0,0 +1,7 @@
import { bench, run } from "../runner.mjs";
bench("Promise.race([p1, p2])", async function () {
return await Promise.race([Promise.resolve(1), Promise.resolve(2)]);
});
await run();

View File

@@ -112,12 +112,40 @@ const obj = {
},
};
bench("Response.json(obj)", async () => {
const smallObj = { id: 1, name: "test" };
const arrayObj = {
items: Array.from({ length: 100 }, (_, i) => ({ id: i, value: `item-${i}` })),
};
bench("Response.json(obj)", () => {
return Response.json(obj);
});
bench("Response.json(obj).json()", async () => {
return await Response.json(obj).json();
bench("new Response(JSON.stringify(obj))", () => {
return new Response(JSON.stringify(obj), {
headers: { "Content-Type": "application/json" },
});
});
bench("Response.json(smallObj)", () => {
return Response.json(smallObj);
});
bench("new Response(JSON.stringify(smallObj))", () => {
return new Response(JSON.stringify(smallObj), {
headers: { "Content-Type": "application/json" },
});
});
bench("Response.json(arrayObj)", () => {
return Response.json(arrayObj);
});
bench("new Response(JSON.stringify(arrayObj))", () => {
return new Response(JSON.stringify(arrayObj), {
headers: { "Content-Type": "application/json" },
});
});
await run();

View File

@@ -0,0 +1,34 @@
import { bench, run } from "../runner.mjs";
const shortStr = "The quick brown fox jumps over the lazy dog";
const longStr = shortStr.repeat(100);
bench("String.includes - short, hit (middle)", () => {
return shortStr.includes("jumps");
});
bench("String.includes - short, hit (start)", () => {
return shortStr.includes("The");
});
bench("String.includes - short, hit (end)", () => {
return shortStr.includes("dog");
});
bench("String.includes - short, miss", () => {
return shortStr.includes("cat");
});
bench("String.includes - long, hit (middle)", () => {
return longStr.includes("jumps");
});
bench("String.includes - long, miss", () => {
return longStr.includes("cat");
});
bench("String.includes - with position", () => {
return shortStr.includes("fox", 10);
});
await run();

View File

@@ -36,6 +36,7 @@
},
"overrides": {
"@types/bun": "workspace:packages/@types/bun",
"@types/node": "25.0.0",
"bun-types": "workspace:packages/bun-types",
},
"packages": {
@@ -155,7 +156,7 @@
"@types/ms": ["@types/ms@2.1.0", "", {}, "sha512-GsCCIZDE/p3i96vtEqx+7dBUGXrc7zeSK3wwPHIaRThS+9OhWIXRqzs4d6k1SVU8g91DrNRWxWUGhp5KXQb2VA=="],
"@types/node": ["@types/node@24.10.1", "", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-GNWcUTRBgIRJD5zj+Tq0fKOJ5XZajIiBroOF0yvj2bSU1WvNdYS/dn9UxwsujGW4JX06dnHyjV2y9rRaybH0iQ=="],
"@types/node": ["@types/node@25.0.0", "", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-rl78HwuZlaDIUSeUKkmogkhebA+8K1Hy7tddZuJ3D0xV8pZSfsYGTsliGUol1JPzu9EKnTxPC4L1fiWouStRew=="],
"aggregate-error": ["aggregate-error@3.1.0", "", { "dependencies": { "clean-stack": "^2.0.0", "indent-string": "^4.0.0" } }, "sha512-4I7Td01quW/RpocfNayFdFVk1qSuoh0E7JrbRJ16nH01HhKFQ88INq9Sd+nd72zqRySlr9BmDA8xlEJ6vJMrYA=="],

View File

@@ -419,12 +419,9 @@ execute_process(
--command=list-outputs
--sources=${BUN_BINDGENV2_SOURCES_COMMA_SEPARATED}
--codegen-path=${CODEGEN_PATH}
RESULT_VARIABLE bindgen_result
OUTPUT_VARIABLE bindgen_outputs
COMMAND_ERROR_IS_FATAL ANY
)
if(${bindgen_result})
message(FATAL_ERROR "bindgenv2/script.ts exited with non-zero status")
endif()
foreach(output IN LISTS bindgen_outputs)
if(output MATCHES "\.cpp$")
list(APPEND BUN_BINDGENV2_CPP_OUTPUTS ${output})
@@ -872,6 +869,7 @@ target_include_directories(${bun} PRIVATE
${CODEGEN_PATH}
${VENDOR_PATH}
${VENDOR_PATH}/picohttpparser
${VENDOR_PATH}/zlib
${NODEJS_HEADERS_PATH}/include
${NODEJS_HEADERS_PATH}/include/node
)
@@ -1199,6 +1197,29 @@ set_target_properties(${bun} PROPERTIES LINK_DEPENDS ${BUN_SYMBOLS_PATH})
include(SetupWebKit)
if(BUN_LINK_ONLY)
register_command(
TARGET
${bun}
TARGET_PHASE
POST_BUILD
COMMENT
"Uploading link metadata"
COMMAND
${CMAKE_COMMAND} -E env
BUN_VERSION=${VERSION}
WEBKIT_DOWNLOAD_URL=${WEBKIT_DOWNLOAD_URL}
WEBKIT_VERSION=${WEBKIT_VERSION}
ZIG_COMMIT=${ZIG_COMMIT}
${BUN_EXECUTABLE} ${CWD}/scripts/create-link-metadata.mjs ${BUILD_PATH} ${bun}
SOURCES
${BUN_ZIG_OUTPUT}
${BUN_CPP_OUTPUT}
ARTIFACTS
${BUILD_PATH}/link-metadata.json
)
endif()
if(WIN32)
if(DEBUG)
target_link_libraries(${bun} PRIVATE

View File

@@ -4,7 +4,7 @@ register_repository(
REPOSITORY
c-ares/c-ares
COMMIT
d3a507e920e7af18a5efb7f9f1d8044ed4750013
3ac47ee46edd8ea40370222f91613fc16c434853
)
register_cmake_command(

View File

@@ -48,6 +48,9 @@ if(NOT BUILDKITE_BUILD_STATUS EQUAL 0)
endif()
file(READ ${BUILDKITE_BUILD_PATH}/build.json BUILDKITE_BUILD)
# Escape backslashes so CMake doesn't interpret JSON escape sequences (e.g., \n in commit messages)
string(REPLACE "\\" "\\\\" BUILDKITE_BUILD "${BUILDKITE_BUILD}")
string(JSON BUILDKITE_BUILD_UUID GET ${BUILDKITE_BUILD} id)
string(JSON BUILDKITE_JOBS GET ${BUILDKITE_BUILD} jobs)
string(JSON BUILDKITE_JOBS_COUNT LENGTH ${BUILDKITE_JOBS})

View File

@@ -5,18 +5,12 @@ if(NOT ENABLE_CCACHE OR CACHE_STRATEGY STREQUAL "none")
return()
endif()
if (CI AND NOT APPLE)
setenv(CCACHE_DISABLE 1)
return()
endif()
find_command(
VARIABLE
CCACHE_PROGRAM
COMMAND
ccache
REQUIRED
${CI}
)
if(NOT CCACHE_PROGRAM)

View File

@@ -1,123 +0,0 @@
# Setup sccache as the C and C++ compiler launcher to speed up builds by caching
if(CACHE_STRATEGY STREQUAL "none")
return()
endif()
set(SCCACHE_SHARED_CACHE_REGION "us-west-1")
set(SCCACHE_SHARED_CACHE_BUCKET "bun-build-sccache-store")
# Function to check if the system AWS credentials have access to the sccache S3 bucket.
function(check_aws_credentials OUT_VAR)
# Install dependencies first
execute_process(
COMMAND ${BUN_EXECUTABLE} install --frozen-lockfile
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/scripts/build-cache
RESULT_VARIABLE INSTALL_EXIT_CODE
OUTPUT_VARIABLE INSTALL_OUTPUT
ERROR_VARIABLE INSTALL_ERROR
)
if(NOT INSTALL_EXIT_CODE EQUAL 0)
message(FATAL_ERROR "Failed to install dependencies in scripts/build-cache\n"
"Exit code: ${INSTALL_EXIT_CODE}\n"
"Output: ${INSTALL_OUTPUT}\n"
"Error: ${INSTALL_ERROR}")
endif()
# Check AWS credentials
execute_process(
COMMAND
${BUN_EXECUTABLE}
run
have-access.ts
--bucket ${SCCACHE_SHARED_CACHE_BUCKET}
--region ${SCCACHE_SHARED_CACHE_REGION}
WORKING_DIRECTORY
${CMAKE_SOURCE_DIR}/scripts/build-cache
RESULT_VARIABLE HAVE_ACCESS_EXIT_CODE
)
if(HAVE_ACCESS_EXIT_CODE EQUAL 0)
set(HAS_CREDENTIALS TRUE)
else()
set(HAS_CREDENTIALS FALSE)
endif()
set(${OUT_VAR} ${HAS_CREDENTIALS} PARENT_SCOPE)
endfunction()
# Configure sccache to use the local cache only.
function(sccache_configure_local_filesystem)
unsetenv(SCCACHE_BUCKET)
unsetenv(SCCACHE_REGION)
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
endfunction()
# Configure sccache to use the distributed cache (S3 + local).
function(sccache_configure_distributed)
setenv(SCCACHE_BUCKET "${SCCACHE_SHARED_CACHE_BUCKET}")
setenv(SCCACHE_REGION "${SCCACHE_SHARED_CACHE_REGION}")
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
endfunction()
function(sccache_configure_environment_ci)
if(CACHE_STRATEGY STREQUAL "auto" OR CACHE_STRATEGY STREQUAL "distributed")
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(HAS_AWS_CREDENTIALS)
sccache_configure_distributed()
message(NOTICE "sccache: Using distributed cache strategy.")
else()
message(FATAL_ERROR "CI CACHE_STRATEGY is set to '${CACHE_STRATEGY}', but no valid AWS "
"credentials were found. Note that 'auto' requires AWS credentials to access the shared "
"cache in CI.")
endif()
elseif(CACHE_STRATEGY STREQUAL "local")
# We disallow this because we want our CI runs to always use the shared cache to accelerate
# builds.
# none, distributed and auto are all okay.
#
# If local is configured, it's as good as "none", so this is probably user error.
message(FATAL_ERROR "CI CACHE_STRATEGY is set to 'local', which is not allowed.")
endif()
endfunction()
function(sccache_configure_environment_developer)
# Local environments can use any strategy they like. S3 is set up to clean out old entries
# automatically.
if (CACHE_STRATEGY STREQUAL "auto" OR CACHE_STRATEGY STREQUAL "local")
# In the local environment, we prioritize using the local cache. This is because sccache takes
# into consideration the whole absolute path of the files being compiled, and it's very
# unlikely users will have the same absolute paths on their local machines.
sccache_configure_local_filesystem()
message(NOTICE "sccache: Using local cache strategy.")
elseif(CACHE_STRATEGY STREQUAL "distributed")
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(HAS_AWS_CREDENTIALS)
sccache_configure_distributed()
message(NOTICE "sccache: Using distributed cache strategy.")
else()
message(FATAL_ERROR "CACHE_STRATEGY is set to 'distributed', but no valid AWS credentials "
"were found.")
endif()
endif()
endfunction()
find_command(VARIABLE SCCACHE_PROGRAM COMMAND sccache REQUIRED ${CI})
if(NOT SCCACHE_PROGRAM)
message(WARNING "sccache not found. Your builds will be slower.")
return()
endif()
set(SCCACHE_ARGS CMAKE_C_COMPILER_LAUNCHER CMAKE_CXX_COMPILER_LAUNCHER)
foreach(arg ${SCCACHE_ARGS})
setx(${arg} ${SCCACHE_PROGRAM})
list(APPEND CMAKE_ARGS -D${arg}=${${arg}})
endforeach()
setenv(SCCACHE_LOG "info")
if (CI)
sccache_configure_environment_ci()
else()
sccache_configure_environment_developer()
endif()

View File

@@ -2,7 +2,7 @@ option(WEBKIT_VERSION "The version of WebKit to use")
option(WEBKIT_LOCAL "If a local version of WebKit should be used instead of downloading")
if(NOT WEBKIT_VERSION)
set(WEBKIT_VERSION 6d0f3aac0b817cc01a846b3754b21271adedac12)
set(WEBKIT_VERSION 1d0216219a3c52cb85195f48f19ba7d5db747ff7)
endif()
string(SUBSTRING ${WEBKIT_VERSION} 0 16 WEBKIT_VERSION_PREFIX)
@@ -28,6 +28,7 @@ if(WEBKIT_LOCAL)
# make jsc-compile-debug jsc-copy-headers
include_directories(
${WEBKIT_PATH}
${WEBKIT_PATH}/JavaScriptCore/Headers
${WEBKIT_PATH}/JavaScriptCore/Headers/JavaScriptCore
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders
${WEBKIT_PATH}/bmalloc/Headers
@@ -90,7 +91,14 @@ if(EXISTS ${WEBKIT_PATH}/package.json)
endif()
endif()
file(DOWNLOAD ${WEBKIT_DOWNLOAD_URL} ${CACHE_PATH}/${WEBKIT_FILENAME} SHOW_PROGRESS)
file(
DOWNLOAD ${WEBKIT_DOWNLOAD_URL} ${CACHE_PATH}/${WEBKIT_FILENAME} SHOW_PROGRESS
STATUS WEBKIT_DOWNLOAD_STATUS
)
if(NOT "${WEBKIT_DOWNLOAD_STATUS}" MATCHES "^0;")
message(FATAL_ERROR "Failed to download WebKit: ${WEBKIT_DOWNLOAD_STATUS}")
endif()
file(ARCHIVE_EXTRACT INPUT ${CACHE_PATH}/${WEBKIT_FILENAME} DESTINATION ${CACHE_PATH} TOUCH)
file(REMOVE ${CACHE_PATH}/${WEBKIT_FILENAME})
file(REMOVE_RECURSE ${WEBKIT_PATH})

View File

@@ -35,8 +35,8 @@ end
set -l bun_install_boolean_flags yarn production optional development no-save dry-run force no-cache silent verbose global
set -l bun_install_boolean_flags_descriptions "Write a yarn.lock file (yarn v1)" "Don't install devDependencies" "Add dependency to optionalDependencies" "Add dependency to devDependencies" "Don't update package.json or save a lockfile" "Don't install anything" "Always request the latest versions from the registry & reinstall all dependencies" "Ignore manifest cache entirely" "Don't output anything" "Excessively verbose logging" "Use global folder"
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add update init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x update
function __bun_complete_bins_scripts --inherit-variable bun_builtin_cmds_without_run -d "Emit bun completions for bins and scripts"
# Do nothing if we already have a builtin subcommand,
@@ -148,14 +148,14 @@ complete -c bun \
for i in (seq (count $bun_install_boolean_flags))
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
-n "__fish_seen_subcommand_from install add remove update" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
end
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cwd' -d 'Change working directory'
-n "__fish_seen_subcommand_from install add remove update" -l 'cwd' -d 'Change working directory'
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
-n "__fish_seen_subcommand_from install add remove update" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
complete -c bun \
-n "__fish_seen_subcommand_from add" -d 'Popular' -a '(__fish__get_bun_packages)'
@@ -183,4 +183,5 @@ complete -c bun -n "__fish_use_subcommand" -a "unlink" -d "Unregister a local np
complete -c bun -n "__fish_use_subcommand" -a "pm" -d "Additional package management utilities" -f
complete -c bun -n "__fish_use_subcommand" -a "x" -d "Execute a package binary, installing if needed" -f
complete -c bun -n "__fish_use_subcommand" -a "outdated" -d "Display the latest versions of outdated dependencies" -f
complete -c bun -n "__fish_use_subcommand" -a "update" -d "Update dependencies to their latest versions" -f
complete -c bun -n "__fish_use_subcommand" -a "publish" -d "Publish your package from local to npm" -f

View File

@@ -65,6 +65,7 @@ In Bun's CLI, simple boolean flags like `--minify` do not accept an argument. Ot
| `--chunk-names` | `--chunk-naming` | Renamed for consistency with naming in JS API |
| `--color` | n/a | Always enabled |
| `--drop` | `--drop` | |
| n/a | `--feature` | Bun-specific. Enable feature flags for compile-time dead-code elimination via `import { feature } from "bun:bundle"` |
| `--entry-names` | `--entry-naming` | Renamed for consistency with naming in JS API |
| `--global-name` | n/a | Not applicable, Bun does not support `iife` output at this time |
| `--ignore-annotations` | `--ignore-dce-annotations` | |

File diff suppressed because it is too large

View File

@@ -427,8 +427,8 @@ This will allow you to use TailwindCSS utility classes in your HTML and CSS file
<!doctype html>
<html>
<head>
<link rel="stylesheet" href="tailwindcss" />
<!-- [!code ++] -->
<link rel="stylesheet" href="tailwindcss" />
</head>
<!-- the rest of your HTML... -->
</html>
@@ -448,8 +448,8 @@ Alternatively, you can import TailwindCSS in your CSS file:
<!doctype html>
<html>
<head>
<link rel="stylesheet" href="./style.css" />
<!-- [!code ++] -->
<link rel="stylesheet" href="./style.css" />
</head>
<!-- the rest of your HTML... -->
</html>

View File

@@ -220,6 +220,78 @@ An array of paths corresponding to the entrypoints of our application. One bundl
</Tab>
</Tabs>
### files
A map of file paths to their contents for in-memory bundling. This allows you to bundle virtual files that don't exist on disk, or override the contents of files that do exist. This option is only available in the JavaScript API.
File contents can be provided as a `string`, `Blob`, `TypedArray`, or `ArrayBuffer`.
#### Bundle entirely from memory
You can bundle code without any files on disk by providing all sources via `files`:
```ts title="build.ts" icon="/icons/typescript.svg"
const result = await Bun.build({
entrypoints: ["/app/index.ts"],
files: {
"/app/index.ts": `
import { greet } from "./greet.ts";
console.log(greet("World"));
`,
"/app/greet.ts": `
export function greet(name: string) {
return "Hello, " + name + "!";
}
`,
},
});
const output = await result.outputs[0].text();
console.log(output);
```
When all entrypoints are in the `files` map, the current working directory is used as the root.
#### Override files on disk
In-memory files take priority over files on disk. This lets you override specific files while keeping the rest of your codebase unchanged:
```ts title="build.ts" icon="/icons/typescript.svg"
// Assume ./src/config.ts exists on disk with development settings
await Bun.build({
entrypoints: ["./src/index.ts"],
files: {
// Override config.ts with production values
"./src/config.ts": `
export const API_URL = "https://api.production.com";
export const DEBUG = false;
`,
},
outdir: "./dist",
});
```
#### Mix disk and virtual files
Real files on disk can import virtual files, and virtual files can import real files:
```ts title="build.ts" icon="/icons/typescript.svg"
// ./src/index.ts exists on disk and imports "./generated.ts"
await Bun.build({
entrypoints: ["./src/index.ts"],
files: {
// Provide a virtual file that index.ts imports
"./src/generated.ts": `
export const BUILD_ID = "${crypto.randomUUID()}";
export const BUILD_TIME = ${Date.now()};
`,
},
outdir: "./dist",
});
```
This is useful for code generation, injecting build-time constants, or testing with mock modules.
### outdir
The directory where output files will be written.
@@ -1141,6 +1213,157 @@ Remove function calls from a bundle. For example, `--drop=console` will remove a
</Tab>
</Tabs>
### features
Enable compile-time feature flags for dead-code elimination. This provides a way to conditionally include or exclude code paths at bundle time using `import { feature } from "bun:bundle"`.
```ts title="app.ts" icon="/icons/typescript.svg"
import { feature } from "bun:bundle";
if (feature("PREMIUM")) {
// Only included when PREMIUM flag is enabled
initPremiumFeatures();
}
if (feature("DEBUG")) {
// Only included when DEBUG flag is enabled
console.log("Debug mode");
}
```
<Tabs>
<Tab title="JavaScript">
```ts title="build.ts" icon="/icons/typescript.svg"
await Bun.build({
entrypoints: ['./app.ts'],
outdir: './out',
features: ["PREMIUM"], // PREMIUM=true, DEBUG=false
})
```
</Tab>
<Tab title="CLI">
```bash terminal icon="terminal"
bun build ./app.ts --outdir ./out --feature PREMIUM
```
</Tab>
</Tabs>
The `feature()` function is replaced with `true` or `false` at bundle time. Combined with minification, unreachable code is eliminated:
```ts title="Input" icon="/icons/typescript.svg"
import { feature } from "bun:bundle";
const mode = feature("PREMIUM") ? "premium" : "free";
```
```js title="Output (with --feature PREMIUM --minify)" icon="/icons/javascript.svg"
var mode = "premium";
```
```js title="Output (without --feature PREMIUM, with --minify)" icon="/icons/javascript.svg"
var mode = "free";
```
**Key behaviors:**
- `feature()` requires a string literal argument — dynamic values are not supported
- The `bun:bundle` import is completely removed from the output
- Works with `bun build`, `bun run`, and `bun test`
- Multiple flags can be enabled: `--feature FLAG_A --feature FLAG_B`
- For type safety, augment the `Registry` interface to restrict `feature()` to known flags (see below)
**Use cases:**
- Platform-specific code (`feature("SERVER")` vs `feature("CLIENT")`)
- Environment-based features (`feature("DEVELOPMENT")`)
- Gradual feature rollouts
- A/B testing variants
- Paid tier features
**Type safety:** By default, `feature()` accepts any string. To get autocomplete and catch typos at compile time, create an `env.d.ts` file (or add to an existing `.d.ts`) and augment the `Registry` interface:
```ts title="env.d.ts" icon="/icons/typescript.svg"
declare module "bun:bundle" {
interface Registry {
features: "DEBUG" | "PREMIUM" | "BETA_FEATURES";
}
}
```
Ensure the file is included in your `tsconfig.json` (e.g., `"include": ["src", "env.d.ts"]`). Now `feature()` only accepts those flags, and invalid strings like `feature("TYPO")` become type errors.
### metafile
Generate metadata about the build in a structured format. The metafile contains information about all input files, output files, their sizes, imports, and exports. This is useful for:
- **Bundle analysis**: Understand what's contributing to bundle size
- **Visualization**: Feed into tools like [esbuild's bundle analyzer](https://esbuild.github.io/analyze/) or other visualization tools
- **Dependency tracking**: See the full import graph of your application
- **CI integration**: Track bundle size changes over time
<Tabs>
<Tab title="JavaScript">
```ts title="build.ts" icon="/icons/typescript.svg"
const result = await Bun.build({
entrypoints: ['./src/index.ts'],
outdir: './dist',
metafile: true,
});
if (result.metafile) {
// Analyze inputs
for (const [path, meta] of Object.entries(result.metafile.inputs)) {
console.log(`${path}: ${meta.bytes} bytes`);
}
// Analyze outputs
for (const [path, meta] of Object.entries(result.metafile.outputs)) {
console.log(`${path}: ${meta.bytes} bytes`);
}
// Save for external analysis tools
await Bun.write('./dist/meta.json', JSON.stringify(result.metafile));
}
```
</Tab>
<Tab title="CLI">
```bash terminal icon="terminal"
bun build ./src/index.ts --outdir ./dist --metafile ./dist/meta.json
```
</Tab>
</Tabs>
The metafile structure contains:
```ts
interface BuildMetafile {
inputs: {
[path: string]: {
bytes: number;
imports: Array<{
path: string;
kind: ImportKind;
original?: string; // Original specifier before resolution
external?: boolean;
}>;
format?: "esm" | "cjs" | "json" | "css";
};
};
outputs: {
[path: string]: {
bytes: number;
inputs: {
[path: string]: { bytesInOutput: number };
};
imports: Array<{ path: string; kind: ImportKind }>;
exports: string[];
entryPoint?: string;
cssBundle?: string; // Associated CSS file for JS entry points
};
};
}
```
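As a sketch of the bundle-analysis use case, the largest inputs can be reported using only the fields defined in this structure:
```ts
// Report the N largest input files from a metafile (illustrative sketch).
function largestInputs(metafile: BuildMetafile, count = 10): string[] {
  return Object.entries(metafile.inputs)
    .sort(([, a], [, b]) => b.bytes - a.bytes)
    .slice(0, count)
    .map(([path, meta]) => `${path}: ${meta.bytes} bytes`);
}
```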
## Outputs
The `Bun.build` function returns a `Promise<BuildOutput>`, defined as:
@@ -1150,6 +1373,7 @@ interface BuildOutput {
outputs: BuildArtifact[];
success: boolean;
logs: Array<object>; // see docs for details
metafile?: BuildMetafile; // only when metafile: true
}
interface BuildArtifact extends Blob {

View File

@@ -121,6 +121,7 @@
"/runtime/file-io",
"/runtime/streams",
"/runtime/binary-data",
"/runtime/archive",
"/runtime/sql",
"/runtime/sqlite",
"/runtime/s3",

View File

@@ -33,7 +33,7 @@ Alternatively, you can create a PM2 configuration file. Create a file named `pm2
```js pm2.config.js icon="file-code"
module.exports = {
title: "app", // Name of your application
name: "app", // Name of your application
script: "index.ts", // Entry point of your application
interpreter: "bun", // Bun interpreter
env: {

View File

@@ -8,7 +8,9 @@ Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts for i
<Note>
Bun includes a default allowlist of popular packages containing `postinstall` scripts that are known to be safe. You
can see this list [here](https://github.com/oven-sh/bun/blob/main/src/install/default-trusted-dependencies.txt).
can see this list [here](https://github.com/oven-sh/bun/blob/main/src/install/default-trusted-dependencies.txt). This
default list only applies to packages installed from npm. For packages from other sources (such as `file:`, `link:`,
`git:`, or `github:` dependencies), you must explicitly add them to `trustedDependencies`.
</Note>
---

View File

@@ -14,16 +14,6 @@ process.on("SIGINT", () => {
---
If you don't know which signal to listen for, you listen to the umbrella `"exit"` event.
```ts
process.on("exit", code => {
console.log(`Process exited with code ${code}`);
});
```
---
If you don't know which signal to listen for, you can listen to the [`"beforeExit"`](https://nodejs.org/api/process.html#event-beforeexit) and [`"exit"`](https://nodejs.org/api/process.html#event-exit) events.
```ts

View File

@@ -60,7 +60,7 @@ test("random", async () => {
expect(random).toHaveBeenCalled();
expect(random).toHaveBeenCalledTimes(3);
expect(random.mock.args).toEqual([[1], [2], [3]]);
expect(random.mock.calls).toEqual([[1], [2], [3]]);
expect(random.mock.results[0]).toEqual({ type: "return", value: a });
});
```

View File

@@ -189,7 +189,7 @@ Isolated installs are conceptually similar to pnpm, so migration should be strai
```bash terminal icon="terminal"
# Remove pnpm files
$ rm -rf node_modules pnpm-lock.yaml
rm -rf node_modules pnpm-lock.yaml
# Install with Bun's isolated linker
bun install --linker isolated

View File

@@ -46,6 +46,13 @@ Once added to `trustedDependencies`, install/re-install the package. Bun will re
The top 500 npm packages with lifecycle scripts are allowed by default. You can see the full list [here](https://github.com/oven-sh/bun/blob/main/src/install/default-trusted-dependencies.txt).
<Note>
The default trusted dependencies list only applies to packages installed from npm. For packages from other sources
(such as `file:`, `link:`, `git:`, or `github:` dependencies), you must explicitly add them to `trustedDependencies`
to run their lifecycle scripts, even if the package name matches an entry in the default list. This prevents malicious
packages from spoofing trusted package names through local file paths or git repositories.
</Note>
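For example, a `github:` dependency must be listed explicitly even if the package's name appears on the default list (the package name below is illustrative):
```json package.json
{
  "dependencies": {
    "my-git-dep": "github:example/my-git-dep"
  },
  "trustedDependencies": ["my-git-dep"]
}
```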
---
## `--ignore-scripts`

View File

@@ -72,6 +72,7 @@ The following options are supported:
- `username`
- `_password` (base64 encoded password)
- `_auth` (base64 encoded username:password, e.g. `btoa(username + ":" + password)`)
- `email`
The equivalent `bunfig.toml` option is to add a key in [`install.scopes`](/runtime/bunfig#install-registry):
@@ -109,3 +110,136 @@ The equivalent `bunfig.toml` option is [`install.exact`](/runtime/bunfig#install
[install]
exact = true
```
### `ignore-scripts`: Skip lifecycle scripts
Prevents running lifecycle scripts during installation:
```ini .npmrc icon="npm"
ignore-scripts=true
```
This is equivalent to using the `--ignore-scripts` flag with `bun install`.
### `dry-run`: Preview changes without installing
Shows what would be installed without actually installing:
```ini .npmrc icon="npm"
dry-run=true
```
The equivalent `bunfig.toml` option is [`install.dryRun`](/runtime/bunfig#install-dryrun):
```toml bunfig.toml icon="settings"
[install]
dryRun = true
```
### `cache`: Configure cache directory
Set the cache directory path, or disable caching:
```ini .npmrc icon="npm"
# set a custom cache directory
cache=/path/to/cache
# or disable caching
cache=false
```
The equivalent `bunfig.toml` option is [`install.cache`](/runtime/bunfig#install-cache):
```toml bunfig.toml icon="settings"
[install.cache]
# set a custom cache directory
dir = "/path/to/cache"
# or disable caching
disable = true
```
### `ca` and `cafile`: Configure CA certificates
Configure custom CA certificates for registry connections:
```ini .npmrc icon="npm"
# single CA certificate
ca="-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----"
# multiple CA certificates
ca[]="-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----"
ca[]="-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----"
# or specify a path to a CA file
cafile=/path/to/ca-bundle.crt
```
### `omit` and `include`: Control dependency types
Control which dependency types are installed:
```ini .npmrc icon="npm"
# omit dev dependencies
omit=dev
# omit multiple types
omit[]=dev
omit[]=optional
# include specific types (overrides omit)
include=dev
```
Valid values: `dev`, `peer`, `optional`
### `install-strategy` and `node-linker`: Installation strategy
Control how packages are installed in `node_modules`. Bun supports two different configuration options for compatibility with different package managers.
**npm's `install-strategy`:**
```ini .npmrc icon="npm"
# flat node_modules structure (default)
install-strategy=hoisted
# symlinked structure
install-strategy=linked
```
**pnpm/yarn's `node-linker`:**
The `node-linker` option controls the installation mode. Bun supports values from both pnpm and yarn:
| Value | Description | Accepted by |
| -------------- | ----------------------------------------------- | ----------- |
| `isolated` | Symlinked structure with isolated dependencies | pnpm |
| `hoisted` | Flat node_modules structure | pnpm |
| `pnpm` | Symlinked structure (same as `isolated`) | yarn |
| `node-modules` | Flat node_modules structure (same as `hoisted`) | yarn |
```ini .npmrc icon="npm"
# symlinked/isolated mode
node-linker=isolated
node-linker=pnpm
# flat/hoisted mode
node-linker=hoisted
node-linker=node-modules
```
### `public-hoist-pattern` and `hoist-pattern`: Control hoisting
Control which packages are hoisted to the root `node_modules`:
```ini .npmrc icon="npm"
# packages matching this pattern will be hoisted to the root
public-hoist-pattern=*eslint*
# multiple patterns
public-hoist-pattern[]=*eslint*
public-hoist-pattern[]=*prettier*
# control general hoisting behavior
hoist-pattern=*
```

View File

@@ -14,7 +14,7 @@ It is strongly recommended to use [PowerShell 7 (`pwsh.exe`)](https://learn.micr
By default, running unverified scripts is blocked.
```ps1
> Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy Unrestricted
Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy Unrestricted
```
### System Dependencies
@@ -22,7 +22,7 @@ By default, running unverified scripts are blocked.
Bun v1.1 or later. We use Bun to run its own code generators.
```ps1
> irm bun.sh/install.ps1 | iex
irm bun.sh/install.ps1 | iex
```
[Visual Studio](https://visualstudio.microsoft.com) with the "Desktop Development with C++" workload. While installing, make sure to install Git as well, if Git for Windows is not already installed.
@@ -30,7 +30,7 @@ Bun v1.1 or later. We use Bun to run it's own code generators.
Visual Studio can be installed graphically using the wizard or through WinGet:
```ps1
> winget install "Visual Studio Community 2022" --override "--add Microsoft.VisualStudio.Workload.NativeDesktop Microsoft.VisualStudio.Component.Git " -s msstore
winget install "Visual Studio Community 2022" --override "--add Microsoft.VisualStudio.Workload.NativeDesktop Microsoft.VisualStudio.Component.Git " -s msstore
```
After Visual Studio, you need the following:
@@ -48,10 +48,10 @@ After Visual Studio, you need the following:
[Scoop](https://scoop.sh) can be used to install these remaining tools easily.
```ps1 Scoop
> irm https://get.scoop.sh | iex
> scoop install nodejs-lts go rust nasm ruby perl sccache
irm https://get.scoop.sh | iex
scoop install nodejs-lts go rust nasm ruby perl ccache
# scoop seems to be buggy if you install llvm and the rest at the same time
> scoop install llvm@19.1.7
scoop install llvm@19.1.7
```
<Note>
@@ -63,19 +63,19 @@ After Visual Studio, you need the following:
If you intend to build WebKit locally (optional), you should install these packages:
```ps1 Scoop
> scoop install make cygwin python
scoop install make cygwin python
```
From here on out, it is **expected that you use a PowerShell Terminal with `.\scripts\vs-shell.ps1` sourced**. This script is available in the Bun repository and can be loaded by executing it:
```ps1
> .\scripts\vs-shell.ps1
.\scripts\vs-shell.ps1
```
To verify, you can check for an MSVC-only command-line tool such as `mt.exe`:
```ps1
> Get-Command mt
Get-Command mt
```
<Note>
@@ -86,16 +86,16 @@ To verify, you can check for an MSVC-only command line such as `mt.exe`
## Building
```ps1
> bun run build
bun run build
# after the initial `bun run build` you can use the following to build
> ninja -Cbuild/debug
ninja -Cbuild/debug
```
If this was successful, you should have a `bun-debug.exe` in the `build/debug` folder.
```ps1
> .\build\debug\bun-debug.exe --revision
.\build\debug\bun-debug.exe --revision
```
You should add this to `$Env:PATH`. The simplest way to do so is to open the start menu, type "Path", and then navigate the environment variables menu to add `C:\.....\bun\build\debug` to the user environment variable `PATH`. You should then restart your editor (if it still does not update, log out and log back in).
@@ -111,15 +111,15 @@ You can run the test suite either using `bun test <path>` or by using the wrappe
```ps1
# Setup
> bun i --cwd packages\bun-internal-test
bun i --cwd packages\bun-internal-test
# Run the entire test suite with reporter
# the package.json script "test" uses "build/debug/bun-debug.exe" by default
> bun run test
bun run test
# Run an individual test file:
> bun-debug test node\fs
> bun-debug test "C:\bun\test\js\bun\resolve\import-meta.test.js"
bun-debug test node\fs
bun-debug test "C:\bun\test\js\bun\resolve\import-meta.test.js"
```
## Troubleshooting

View File

@@ -28,23 +28,23 @@ Using your system's package manager, install Bun's dependencies:
<CodeGroup>
```bash macOS (Homebrew)
$ brew install automake cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby sccache
brew install automake ccache cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby
```
```bash Ubuntu/Debian
$ sudo apt install curl wget lsb-release software-properties-common cargo cmake git golang libtool ninja-build pkg-config rustc ruby-full xz-utils
sudo apt install curl wget lsb-release software-properties-common cargo cmake git golang libtool ninja-build pkg-config rustc ruby-full xz-utils
```
```bash Arch
$ sudo pacman -S base-devel cmake git go libiconv libtool make ninja pkg-config python rust sed unzip ruby
sudo pacman -S base-devel cmake git go libiconv libtool make ninja pkg-config python rust sed unzip ruby
```
```bash Fedora
$ sudo dnf install cargo clang19 llvm19 lld19 cmake git golang libtool ninja-build pkg-config rustc ruby libatomic-static libstdc++-static sed unzip which libicu-devel 'perl(Math::BigInt)'
sudo dnf install cargo clang19 llvm19 lld19 cmake git golang libtool ninja-build pkg-config rustc ruby libatomic-static libstdc++-static sed unzip which libicu-devel 'perl(Math::BigInt)'
```
```bash openSUSE Tumbleweed
$ sudo zypper install go cmake ninja automake git icu rustup && rustup toolchain install stable
sudo zypper install go cmake ninja automake git icu rustup && rustup toolchain install stable
```
</CodeGroup>
@@ -56,59 +56,42 @@ Before starting, you will need to already have a release build of Bun installed,
<CodeGroup>
```bash Native
$ curl -fsSL https://bun.com/install | bash
curl -fsSL https://bun.com/install | bash
```
```bash npm
$ npm install -g bun
npm install -g bun
```
```bash Homebrew
$ brew tap oven-sh/bun
$ brew install bun
brew tap oven-sh/bun
brew install bun
```
</CodeGroup>
### Optional: Install `sccache`
### Optional: Install `ccache`
sccache is used to cache compilation artifacts, significantly speeding up builds. It must be installed with S3 support:
ccache is used to cache compilation artifacts, significantly speeding up builds:
```bash
# For macOS
$ brew install sccache
brew install ccache
# For Linux. Note that the version in your package manager may not have S3 support.
$ cargo install sccache --features=s3
# For Ubuntu/Debian
sudo apt install ccache
# For Arch
sudo pacman -S ccache
# For Fedora
sudo dnf install ccache
# For openSUSE
sudo zypper install ccache
```
This will install `sccache` with S3 support. Our build scripts will automatically detect and use `sccache` with our shared S3 cache. **Note**: Not all versions of `sccache` are compiled with S3 support, hence we recommend installing it via `cargo`.
#### Registering AWS Credentials for `sccache` (Core Developers Only)
Core developers have write access to the shared S3 cache. To enable write access, you must log in with AWS credentials. The easiest way to do this is to use the [`aws` CLI](https://aws.amazon.com/cli/) and invoke [`aws configure` to provide your AWS security info](https://docs.aws.amazon.com/cli/latest/reference/configure/).
The `cmake` scripts should automatically detect your AWS credentials from the environment or the `~/.aws/credentials` file.
<details>
<summary>Logging in to the `aws` CLI</summary>
1. Install the AWS CLI by following [the official guide](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).
2. Log in to your AWS account console. A team member should provide you with your credentials.
3. Click your name in the top right > Security credentials.
4. Scroll to "Access keys" and create a new access key.
5. Run `aws configure` in your terminal and provide the access key ID and secret access key when prompted.
</details>
<details>
<summary>Common Issues You May Encounter</summary>
- To confirm that the cache is being used, you can use the `sccache --show-stats` command right after a build. This will expose very useful statistics, including cache hits/misses.
- If you have multiple AWS profiles configured, ensure that the correct profile is set in the `AWS_PROFILE` environment variable.
- `sccache` follows a server-client model. If you run into weird issues where `sccache` refuses to use S3, even though you have AWS credentials configured, try killing any running `sccache` servers with `sccache --stop-server` and then re-running the build.
</details>
Our build scripts will automatically detect and use `ccache` if available. You can check cache statistics with `ccache --show-stats`.
## Install LLVM
@@ -117,24 +100,24 @@ Bun requires LLVM 19 (`clang` is part of LLVM). This version requirement is to m
<CodeGroup>
```bash macOS (Homebrew)
$ brew install llvm@19
brew install llvm@19
```
```bash Ubuntu/Debian
$ # LLVM has an automatic installation script that is compatible with all versions of Ubuntu
$ wget https://apt.llvm.org/llvm.sh -O - | sudo bash -s -- 19 all
# LLVM has an automatic installation script that is compatible with all versions of Ubuntu
wget https://apt.llvm.org/llvm.sh -O - | sudo bash -s -- 19 all
```
```bash Arch
$ sudo pacman -S llvm clang lld
sudo pacman -S llvm clang lld
```
```bash Fedora
$ sudo dnf install llvm clang lld-devel
sudo dnf install llvm clang lld-devel
```
```bash openSUSE Tumbleweed
$ sudo zypper install clang19 lld19 llvm19
sudo zypper install clang19 lld19 llvm19
```
</CodeGroup>
@@ -144,7 +127,7 @@ If none of the above solutions apply, you will have to install it [manually](htt
Make sure Clang/LLVM 19 is in your path:
```bash
$ which clang-19
which clang-19
```
If not, run this to manually add it:
@@ -154,12 +137,12 @@ If not, run this to manually add it:
```bash macOS (Homebrew)
# use fish_add_path if you're using fish
# use path+="$(brew --prefix llvm@19)/bin" if you are using zsh
$ export PATH="$(brew --prefix llvm@19)/bin:$PATH"
export PATH="$(brew --prefix llvm@19)/bin:$PATH"
```
```bash Arch
# use fish_add_path if you're using fish
$ export PATH="$PATH:/usr/lib/llvm19/bin"
export PATH="$PATH:/usr/lib/llvm19/bin"
```
</CodeGroup>
@@ -179,7 +162,7 @@ bun run build
The binary will be located at `./build/debug/bun-debug`. It is recommended to add this to your `$PATH`. To verify the build worked, let's print the version number of the development build of Bun.
```bash
$ build/debug/bun-debug --version
build/debug/bun-debug --version
x.y.z_debug
```
@@ -278,17 +261,17 @@ WebKit is not cloned by default (to save time and disk space). To clone and buil
```bash
# Clone WebKit into ./vendor/WebKit
$ git clone https://github.com/oven-sh/WebKit vendor/WebKit
git clone https://github.com/oven-sh/WebKit vendor/WebKit
# Check out the commit hash specified in `set(WEBKIT_VERSION <commit_hash>)` in cmake/tools/SetupWebKit.cmake
$ git -C vendor/WebKit checkout <commit_hash>
git -C vendor/WebKit checkout <commit_hash>
# Make a debug build of JSC. This will output build artifacts in ./vendor/WebKit/WebKitBuild/Debug
# Optionally, you can use `bun run jsc:build` for a release build
bun run jsc:build:debug && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
# After an initial run of `make jsc-debug`, you can rebuild JSC with:
$ cmake --build vendor/WebKit/WebKitBuild/Debug --target jsc && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
cmake --build vendor/WebKit/WebKitBuild/Debug --target jsc && rm vendor/WebKit/WebKitBuild/Debug/JavaScriptCore/DerivedSources/inspector/InspectorProtocolObjects.h
# Build bun with the local JSC build
bun run build:local
@@ -339,20 +322,20 @@ is not able to compile a simple test program.
To fix the error, we need to update the GCC version to 11. To do this, we'll need to check if the latest version is available in the distribution's official repositories or use a third-party repository that provides GCC 11 packages. Here are general steps:
```bash
$ sudo apt update
$ sudo apt install gcc-11 g++-11
sudo apt update
sudo apt install gcc-11 g++-11
# If the above command fails with `Unable to locate package gcc-11` we need
# to add the APT repository
$ sudo add-apt-repository -y ppa:ubuntu-toolchain-r/test
sudo add-apt-repository -y ppa:ubuntu-toolchain-r/test
# Now run `apt install` again
$ sudo apt install gcc-11 g++-11
sudo apt install gcc-11 g++-11
```
Now, we need to set GCC 11 as the default compiler:
```bash
$ sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-11 100
$ sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-11 100
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-11 100
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-11 100
```
### libarchive
@@ -360,7 +343,7 @@ $ sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-11 100
If you see an error on macOS when compiling `libarchive`, run:
```bash
$ brew install pkg-config
brew install pkg-config
```
### macOS `library not found for -lSystem`
@@ -368,7 +351,7 @@ $ brew install pkg-config
If you see this error when compiling, run:
```bash
$ xcode-select --install
xcode-select --install
```
### Cannot find `libatomic.a`

docs/runtime/archive.mdx Normal file
View File

@@ -0,0 +1,452 @@
---
title: Archive
description: Create and extract tar archives with Bun's fast native implementation
---
Bun provides a fast, native implementation for working with tar archives through `Bun.Archive`. It supports creating archives from in-memory data, extracting archives to disk, and reading archive contents without extraction.
## Quickstart
**Create an archive from files:**
```ts
const archive = new Bun.Archive({
"hello.txt": "Hello, World!",
"data.json": JSON.stringify({ foo: "bar" }),
"nested/file.txt": "Nested content",
});
// Write to disk
await Bun.write("bundle.tar", archive);
```
**Extract an archive:**
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = new Bun.Archive(tarball);
const entryCount = await archive.extract("./output");
console.log(`Extracted ${entryCount} entries`);
```
**Read archive contents without extracting:**
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
console.log(`${path}: ${await file.text()}`);
}
```
## Creating Archives
Use `new Bun.Archive()` to create an archive from an object where keys are file paths and values are file contents. By default, archives are uncompressed:
```ts
// Creates an uncompressed tar archive (default)
const archive = new Bun.Archive({
"README.md": "# My Project",
"src/index.ts": "console.log('Hello');",
"package.json": JSON.stringify({ name: "my-project" }),
});
```
File contents can be:
- **Strings** - Text content
- **Blobs** - Binary data
- **ArrayBufferViews** (e.g., `Uint8Array`) - Raw bytes
- **ArrayBuffers** - Raw binary data
```ts
const data = "binary data";
const arrayBuffer = new ArrayBuffer(8);
const archive = new Bun.Archive({
"text.txt": "Plain text",
"blob.bin": new Blob([data]),
"bytes.bin": new Uint8Array([1, 2, 3, 4]),
"buffer.bin": arrayBuffer,
});
```
### Writing Archives to Disk
Use `Bun.write()` to write an archive to disk:
```ts
// Write uncompressed tar (default)
const archive = new Bun.Archive({
"file1.txt": "content1",
"file2.txt": "content2",
});
await Bun.write("output.tar", archive);
// Write gzipped tar
const compressed = new Bun.Archive({ "src/index.ts": "console.log('Hello');" }, { compress: "gzip" });
await Bun.write("output.tar.gz", compressed);
```
### Getting Archive Bytes
Get the archive data as bytes or a Blob:
```ts
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// As Uint8Array
const bytes = await archive.bytes();
// As Blob
const blob = await archive.blob();
// With gzip compression (set at construction)
const gzipped = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
const gzippedBytes = await gzipped.bytes();
const gzippedBlob = await gzipped.blob();
```
## Extracting Archives
### From Existing Archive Data
Create an archive from existing tar/tar.gz data:
```ts
// From a file
const tarball = await Bun.file("package.tar.gz").bytes();
const archiveFromFile = new Bun.Archive(tarball);
```
```ts
// From a fetch response
const response = await fetch("https://example.com/archive.tar.gz");
const archiveFromFetch = new Bun.Archive(await response.blob());
```
### Extracting to Disk
Use `.extract()` to write all files to a directory:
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./extracted");
console.log(`Extracted ${count} entries`);
```
The target directory is created automatically if it doesn't exist. Existing files are overwritten. The returned count includes files, directories, and symlinks (on POSIX systems).
**Note**: On Windows, symbolic links in archives are always skipped during extraction. Bun does not attempt to create them regardless of privilege level. On Linux and macOS, symlinks are extracted normally.
**Security note**: Bun.Archive validates paths during extraction, rejecting absolute paths (POSIX `/`, Windows drive letters like `C:\` or `C:/`, and UNC paths like `\\server\share`). Path traversal components (`..`) are normalized away (e.g., `dir/sub/../file` becomes `dir/file`) to prevent directory escape attacks.
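A rough sketch of that normalization, assuming an archive containing such an entry (the paths are illustrative):
```ts
// An entry with a ".." component cannot escape the target directory.
const suspect = new Bun.Archive({
  "dir/sub/../file.txt": "contents",
});
await Bun.write("suspect.tar", suspect);

const archive = new Bun.Archive(await Bun.file("suspect.tar").bytes());
await archive.extract("./out");
// The entry is written to ./out/dir/file.txt, never outside ./out
```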
### Filtering Extracted Files
Use glob patterns to extract only specific files. Patterns are matched against archive entry paths normalized to use forward slashes (`/`). Positive patterns specify what to include, and negative patterns (prefixed with `!`) specify what to exclude. Negative patterns are applied after positive patterns, so **using only negative patterns will match nothing** (you must include a positive pattern like `**` first):
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = new Bun.Archive(tarball);
// Extract only TypeScript files
const tsCount = await archive.extract("./extracted", { glob: "**/*.ts" });
// Extract files from multiple directories
const multiCount = await archive.extract("./extracted", {
glob: ["src/**", "lib/**"],
});
```
Use negative patterns (prefixed with `!`) to exclude files. When mixing positive and negative patterns, entries must match at least one positive pattern and not match any negative pattern:
```ts
// Extract everything except node_modules
const distCount = await archive.extract("./extracted", {
glob: ["**", "!node_modules/**"],
});
// Extract source files but exclude tests
const srcCount = await archive.extract("./extracted", {
glob: ["src/**", "!**/*.test.ts", "!**/__tests__/**"],
});
```
## Reading Archive Contents
### Get All Files
Use `.files()` to get archive contents as a `Map` of `File` objects without extracting to disk. Unlike `extract()`, which processes all entry types, `files()` returns only regular files (no directories):
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
console.log(`${path}: ${file.size} bytes`);
console.log(await file.text());
}
```
Each `File` object includes:
- `name` - The file path within the archive (always uses forward slashes `/` as separators)
- `size` - File size in bytes
- `lastModified` - Modification timestamp
- Standard `Blob` methods: `text()`, `arrayBuffer()`, `stream()`, etc.
**Note**: `files()` loads file contents into memory. For large archives, consider using `extract()` to write directly to disk instead.
### Error Handling
Archive operations can fail due to corrupted data, I/O errors, or invalid paths. Use try/catch to handle these cases:
```ts
try {
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./output");
console.log(`Extracted ${count} entries`);
} catch (e: unknown) {
if (e instanceof Error) {
const error = e as Error & { code?: string };
if (error.code === "EACCES") {
console.error("Permission denied");
} else if (error.code === "ENOSPC") {
console.error("Disk full");
} else {
console.error("Archive error:", error.message);
}
} else {
console.error("Archive error:", String(e));
}
}
```
Common error scenarios:
- **Corrupted/truncated archives** - `new Archive()` loads the archive data; errors may be deferred until read/extract operations
- **Permission denied** - `extract()` throws if the target directory is not writable
- **Disk full** - `extract()` throws if there's insufficient space
- **Invalid paths** - Operations throw for malformed file paths
The count returned by `extract()` includes all successfully written entries (files, directories, and symlinks on POSIX systems).
**Security note**: Bun.Archive automatically validates paths during extraction. Absolute paths (POSIX `/`, Windows drive letters, UNC paths) and unsafe symlink targets are rejected. Path traversal components (`..`) are normalized away to prevent directory escape.
For additional security with untrusted archives, you can enumerate and validate paths before extraction:
```ts
const archive = new Bun.Archive(untrustedData);
const files = await archive.files();
// Optional: Custom validation for additional checks
for (const [path] of files) {
// Example: Reject hidden files
if (path.startsWith(".") || path.includes("/.")) {
throw new Error(`Hidden file rejected: ${path}`);
}
// Example: Whitelist specific directories
if (!path.startsWith("src/") && !path.startsWith("lib/")) {
throw new Error(`Unexpected path: ${path}`);
}
}
// Extract to a controlled destination
await archive.extract("./safe-output");
```
When using `files()` with a glob pattern, an empty `Map` is returned if no files match:
```ts
const matches = await archive.files("*.nonexistent");
if (matches.size === 0) {
console.log("No matching files found");
}
```
### Filtering with Glob Patterns
Pass a glob pattern to filter which files are returned:
```ts
// Get only TypeScript files
const tsFiles = await archive.files("**/*.ts");
// Get files in src directory
const srcFiles = await archive.files("src/*");
// Get all JSON files (recursive)
const jsonFiles = await archive.files("**/*.json");
// Get multiple file types with array of patterns
const codeFiles = await archive.files(["**/*.ts", "**/*.js"]);
```
Supported glob patterns (subset of [Bun.Glob](/docs/api/glob) syntax):
- `*` - Match any characters except `/`
- `**` - Match any characters including `/`
- `?` - Match single character
- `[abc]` - Match character set
- `{a,b}` - Match alternatives
- `!pattern` - Exclude files matching pattern (negation). Must be combined with positive patterns; using only negative patterns matches nothing.
See [Bun.Glob](/docs/api/glob) for the full glob syntax including escaping and advanced patterns.
## Compression
Bun.Archive creates uncompressed tar archives by default. Use `{ compress: "gzip" }` to enable gzip compression:
```ts
// Default: uncompressed tar
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// Reading: automatically detects gzip
const gzippedTarball = await Bun.file("archive.tar.gz").bytes();
const readArchive = new Bun.Archive(gzippedTarball);
// Enable gzip compression
const compressed = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip", level: 12 });
```
The options accept:
- No options or `undefined` - Uncompressed tar (default)
- `{ compress: "gzip" }` - Enable gzip compression at level 6
- `{ compress: "gzip", level: number }` - Gzip with custom level 1-12 (1 = fastest, 12 = smallest)
## Examples
### Bundle Project Files
```ts
import { Glob } from "bun";
// Collect source files
const files: Record<string, string> = {};
const glob = new Glob("src/**/*.ts");
for await (const path of glob.scan(".")) {
// Normalize path separators to forward slashes for cross-platform compatibility
const archivePath = path.replaceAll("\\", "/");
files[archivePath] = await Bun.file(path).text();
}
// Add package.json
files["package.json"] = await Bun.file("package.json").text();
// Create compressed archive and write to disk
const archive = new Bun.Archive(files, { compress: "gzip" });
await Bun.write("bundle.tar.gz", archive);
```
### Extract and Process npm Package
```ts
const response = await fetch("https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz");
const archive = new Bun.Archive(await response.blob());
// Get package.json
const files = await archive.files("package/package.json");
const packageJson = files.get("package/package.json");
if (packageJson) {
const pkg = JSON.parse(await packageJson.text());
console.log(`Package: ${pkg.name}@${pkg.version}`);
}
```
### Create Archive from Directory
```ts
import { readdir } from "node:fs/promises";
import { join } from "node:path";
async function archiveDirectory(dir: string, compress = false): Promise<Bun.Archive> {
const files: Record<string, Blob> = {};
async function walk(currentDir: string, prefix: string = "") {
const entries = await readdir(currentDir, { withFileTypes: true });
for (const entry of entries) {
const fullPath = join(currentDir, entry.name);
const archivePath = prefix ? `${prefix}/${entry.name}` : entry.name;
if (entry.isDirectory()) {
await walk(fullPath, archivePath);
} else {
files[archivePath] = Bun.file(fullPath);
}
}
}
await walk(dir);
return new Bun.Archive(files, compress ? { compress: "gzip" } : undefined);
}
const archive = await archiveDirectory("./my-project", true);
await Bun.write("my-project.tar.gz", archive);
```
## Reference
> **Note**: The following type signatures are simplified for documentation purposes. See [`packages/bun-types/bun.d.ts`](https://github.com/oven-sh/bun/blob/main/packages/bun-types/bun.d.ts) for the full type definitions.
```ts
type ArchiveInput =
| Record<string, string | Blob | Bun.ArrayBufferView | ArrayBufferLike>
| Blob
| Bun.ArrayBufferView
| ArrayBufferLike;
type ArchiveOptions = {
/** Compression algorithm. Currently only "gzip" is supported. */
compress?: "gzip";
/** Compression level 1-12 (default 6 when gzip is enabled). */
level?: number;
};
interface ArchiveExtractOptions {
/** Glob pattern(s) to filter extraction. Supports negative patterns with "!" prefix. */
glob?: string | readonly string[];
}
class Archive {
/**
* Create an Archive from input data
* @param data - Files to archive (as object) or existing archive data (as bytes/blob)
* @param options - Compression options. Uncompressed by default.
* Pass { compress: "gzip" } to enable compression.
*/
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Extract archive to a directory
* @returns Number of entries extracted (files, directories, and symlinks)
*/
extract(path: string, options?: ArchiveExtractOptions): Promise<number>;
/**
* Get archive as a Blob (uses compression setting from constructor)
*/
blob(): Promise<Blob>;
/**
* Get archive as a Uint8Array (uses compression setting from constructor)
*/
bytes(): Promise<Uint8Array<ArrayBuffer>>;
/**
* Get archive contents as File objects (regular files only, no directories)
*/
files(glob?: string | readonly string[]): Promise<Map<string, File>>;
}
```

View File

@@ -115,6 +115,26 @@ Currently we do not collect telemetry and this setting is only used for enabling
telemetry = false
```
### `env`
Configure automatic `.env` file loading. By default, Bun automatically loads `.env` files. To disable this behavior:
```toml title="bunfig.toml" icon="settings"
# Disable automatic .env file loading
env = false
```
You can also use object syntax with the `file` property:
```toml title="bunfig.toml" icon="settings"
[env]
file = false
```
This is useful in production environments or CI/CD pipelines where you want to rely solely on system environment variables.
Note: Explicitly provided environment files via `--env-file` will still be loaded even when default loading is disabled.
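For example, with automatic loading disabled, an explicitly passed file is still applied (the filename below is illustrative):
```bash terminal icon="terminal"
bun --env-file=.env.production run start
```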
### `console`
Configure console output behavior.

View File

@@ -315,6 +315,109 @@ if (typeof Bun !== "undefined") {
---
## Terminal (PTY) support
For interactive terminal applications, you can spawn a subprocess with a pseudo-terminal (PTY) attached using the `terminal` option. This makes the subprocess think it's running in a real terminal, enabling features like colored output, cursor movement, and interactive prompts.
```ts
const proc = Bun.spawn(["bash"], {
terminal: {
cols: 80,
rows: 24,
data(terminal, data) {
// Called when data is received from the terminal
process.stdout.write(data);
},
},
});
// Write to the terminal
proc.terminal.write("echo hello\n");
// Wait for the process to exit
await proc.exited;
// Close the terminal
proc.terminal.close();
```
When the `terminal` option is provided:
- The subprocess sees `process.stdout.isTTY` as `true`
- `stdin`, `stdout`, and `stderr` are all connected to the terminal
- `proc.stdin`, `proc.stdout`, and `proc.stderr` return `null` — use the terminal instead
- Access the terminal via `proc.terminal`
### Terminal options
| Option | Description | Default |
| ------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------ |
| `cols` | Number of columns | `80` |
| `rows` | Number of rows | `24` |
| `name` | Terminal type for PTY configuration (set `TERM` env var separately via `env` option) | `"xterm-256color"` |
| `data` | Callback when data is received `(terminal, data) => void` | — |
| `exit` | Callback when PTY stream closes (EOF or error). `exitCode` is PTY lifecycle status (0=EOF, 1=error), not subprocess exit code. Use `proc.exited` for process exit. | — |
| `drain` | Callback when ready for more data `(terminal) => void` | — |
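The distinction in the `exit` row matters in practice. A minimal sketch separating the PTY lifecycle status from the subprocess exit code:
```ts
const proc = Bun.spawn(["bash", "-c", "exit 3"], {
  terminal: {
    exit(terminal, exitCode, signal) {
      // PTY lifecycle status: 0 = EOF, 1 = error (not the shell's exit code)
      console.log("PTY closed:", exitCode, signal);
    },
  },
});

// The subprocess exit code comes from proc.exited
console.log("process exited with", await proc.exited); // 3
```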
### Terminal methods
The `Terminal` object returned by `proc.terminal` has the following methods:
```ts
// Write data to the terminal
proc.terminal.write("echo hello\n");
// Resize the terminal
proc.terminal.resize(120, 40);
// Set raw mode (disable line buffering and echo)
proc.terminal.setRawMode(true);
// Keep event loop alive while terminal is open
proc.terminal.ref();
proc.terminal.unref();
// Close the terminal
proc.terminal.close();
```
### Reusable Terminal
You can create a terminal independently and reuse it across multiple subprocesses:
```ts
await using terminal = new Bun.Terminal({
cols: 80,
rows: 24,
data(term, data) {
process.stdout.write(data);
},
});
// Spawn first process
const proc1 = Bun.spawn(["echo", "first"], { terminal });
await proc1.exited;
// Reuse terminal for another process
const proc2 = Bun.spawn(["echo", "second"], { terminal });
await proc2.exited;
// Terminal is closed automatically by `await using`
```
When passing an existing `Terminal` object:
- The terminal can be reused across multiple spawns
- You control when to close the terminal
- The `exit` callback fires when you call `terminal.close()`, not when each subprocess exits
- Use `proc.exited` to detect individual subprocess exits
This is useful for running multiple commands in sequence through the same terminal session.
<Note>Terminal support is only available on POSIX systems (Linux, macOS). It is not available on Windows.</Note>
---
## Blocking API (`Bun.spawnSync()`)
Bun provides a synchronous equivalent of `Bun.spawn` called `Bun.spawnSync`. This is a blocking API that supports the same inputs and parameters as `Bun.spawn`. It returns a `SyncSubprocess` object, which differs from `Subprocess` in a few ways.
@@ -407,6 +510,7 @@ namespace SpawnOptions {
timeout?: number;
killSignal?: string | number;
maxBuffer?: number;
terminal?: TerminalOptions; // PTY support (POSIX only)
}
type Readable =
@@ -435,10 +539,11 @@ namespace SpawnOptions {
}
interface Subprocess extends AsyncDisposable {
readonly stdin: FileSink | number | undefined;
readonly stdout: ReadableStream<Uint8Array> | number | undefined;
readonly stderr: ReadableStream<Uint8Array> | number | undefined;
readonly readable: ReadableStream<Uint8Array> | number | undefined;
readonly stdin: FileSink | number | undefined | null;
readonly stdout: ReadableStream<Uint8Array<ArrayBuffer>> | number | undefined | null;
readonly stderr: ReadableStream<Uint8Array<ArrayBuffer>> | number | undefined | null;
readonly readable: ReadableStream<Uint8Array<ArrayBuffer>> | number | undefined | null;
readonly terminal: Terminal | undefined;
readonly pid: number;
readonly exited: Promise<number>;
readonly exitCode: number | null;
@@ -465,6 +570,28 @@ interface SyncSubprocess {
pid: number;
}
interface TerminalOptions {
cols?: number;
rows?: number;
name?: string;
data?: (terminal: Terminal, data: Uint8Array<ArrayBuffer>) => void;
/** Called when PTY stream closes (EOF or error). exitCode is PTY lifecycle status (0=EOF, 1=error), not subprocess exit code. */
exit?: (terminal: Terminal, exitCode: number, signal: string | null) => void;
drain?: (terminal: Terminal) => void;
}
interface Terminal extends AsyncDisposable {
readonly stdin: number;
readonly stdout: number;
readonly closed: boolean;
write(data: string | BufferSource): number;
resize(cols: number, rows: number): void;
setRawMode(enabled: boolean): void;
ref(): void;
unref(): void;
close(): void;
}
interface ResourceUsage {
contextSwitches: {
voluntary: number;

View File

@@ -358,6 +358,8 @@ Bun represents [pointers](<https://en.wikipedia.org/wiki/Pointer_(computer_progr
**Why not `BigInt`?** `BigInt` is slower. JavaScript engines allocate a separate `BigInt` which means they can't fit into a regular JavaScript value. If you pass a `BigInt` to a function, it will be converted to a `number`.
**Windows Note**: The Windows API type HANDLE does not represent a virtual address, and using `ptr` for it will _not_ work as expected. Use `u64` to safely represent HANDLE values.
</Accordion>
To convert from a `TypedArray` to a pointer:
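A minimal sketch, assuming the `ptr` helper exported by `bun:ffi`:
```ts
import { ptr } from "bun:ffi";

const buffer = new Uint8Array(8);
const pointer = ptr(buffer); // a number representing the address of buffer's contents
```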

View File

@@ -193,15 +193,17 @@ This is the maximum amount of time a connection is allowed to be idle before the
Thus far, the examples on this page have used the explicit `Bun.serve` API. Bun also supports an alternate syntax.
```ts server.ts
import { type Serve } from "bun";
import type { Serve } from "bun";
export default {
fetch(req) {
return new Response("Bun!");
},
} satisfies Serve;
} satisfies Serve.Options<undefined>;
```
The type parameter `<undefined>` represents WebSocket data — if you add a `websocket` handler with custom data attached via `server.upgrade(req, { data: ... })`, replace `undefined` with your data type.
Instead of passing the server options into `Bun.serve`, `export default` it. This file can be executed as-is; when Bun sees a file with a `default` export containing a `fetch` handler, it passes it into `Bun.serve` under the hood.
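For instance, a sketch with a hypothetical `MyData` type attached during upgrade:
```ts server.ts
import type { Serve } from "bun";

type MyData = { userId: string };

export default {
  fetch(req, server) {
    // Attach MyData when upgrading the request to a WebSocket
    if (server.upgrade(req, { data: { userId: "abc123" } })) return;
    return new Response("Bun!");
  },
  websocket: {
    message(ws, message) {
      console.log(ws.data.userId, message); // ws.data is typed as MyData
    },
  },
} satisfies Serve.Options<MyData>;
```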
---

View File

@@ -127,3 +127,54 @@ const socket = await Bun.udpSocket({
},
});
```
### Socket options
UDP sockets support setting various socket options:
```ts
const socket = await Bun.udpSocket({});
// Enable broadcasting to send packets to a broadcast address
socket.setBroadcast(true);
// Set the IP TTL (time to live) for outgoing packets
socket.setTTL(64);
```
### Multicast
Bun supports multicast operations for UDP sockets. Use `addMembership` and `dropMembership` to join and leave multicast groups:
```ts
const socket = await Bun.udpSocket({});
// Join a multicast group
socket.addMembership("224.0.0.1");
// Join with a specific interface
socket.addMembership("224.0.0.1", "192.168.1.100");
// Leave a multicast group
socket.dropMembership("224.0.0.1");
```
Additional multicast options:
```ts
// Set TTL for multicast packets (number of network hops)
socket.setMulticastTTL(2);
// Control whether multicast packets loop back to the local socket
socket.setMulticastLoopback(true);
// Specify which interface to use for outgoing multicast packets
socket.setMulticastInterface("192.168.1.100");
```
For source-specific multicast (SSM), use `addSourceSpecificMembership` and `dropSourceSpecificMembership`:
```ts
socket.addSourceSpecificMembership("10.0.0.1", "232.0.0.1");
socket.dropSourceSpecificMembership("10.0.0.1", "232.0.0.1");
```

View File

@@ -303,7 +303,7 @@ Internally, this calls [`sqlite3_reset`](https://www.sqlite.org/capi3ref.html#sq
### `.run()`
Use `.run()` to run a query and get back `undefined`. This is useful for schema-modifying queries (e.g. `CREATE TABLE`) or bulk write operations.
Use `.run()` to run a query and get back an object with execution metadata. This is useful for schema-modifying queries (e.g. `CREATE TABLE`) or bulk write operations.
```ts db.ts icon="/icons/typescript.svg" highlight={2}
const query = db.query(`create table foo;`);

View File

@@ -19,7 +19,7 @@ If you're looking to create a brand new empty project, use [`bun init`](/runtime
`bun create ./MyComponent.tsx` turns an existing React component into a complete dev environment with hot reload and production builds in one command.
```bash
$ bun create ./MyComponent.jsx # .tsx also supported
bun create ./MyComponent.jsx # .tsx also supported
```
<Frame>

View File

@@ -149,25 +149,25 @@ div.callout .code-block {
margin-bottom: 0px;
}
.code-block[language="shellscript"] code span.line:not(:empty):has(span)::before {
[language="shellscript"] code span.line:not(:empty):has(span)::before {
content: "$ ";
color: #6272a4;
user-select: none;
}
.code-block[language="shellscript"] code span.line:has(> span:first-child[style*="color: rgb(98, 114, 164)"])::before,
.code-block[language="shellscript"] code span.line:has(> span:first-child[style*="#6272A4"])::before {
[language="shellscript"] code span.line:has(> span:first-child[style*="color: rgb(98, 114, 164)"])::before,
[language="shellscript"] code span.line:has(> span:first-child[style*="#6272A4"])::before {
content: "";
}
.code-block[language="powershell"] code span.line:not(:empty):has(span)::before {
[language="powershell"] code span.line:not(:empty):has(span)::before {
content: "> ";
color: #6272a4;
user-select: none;
}
.code-block[language="powershell"] code span.line:has(> span:first-child[style*="color: rgb(98, 114, 164)"])::before,
.code-block[language="powershell"] code span.line:has(> span:first-child[style*="#6272A4"])::before {
[language="powershell"] code span.line:has(> span:first-child[style*="color: rgb(98, 114, 164)"])::before,
[language="powershell"] code span.line:has(> span:first-child[style*="#6272A4"])::before {
content: "";
}

View File

@@ -376,16 +376,18 @@ timeout = 10000
## Environment Variables
You can also set environment variables in your configuration that affect test behavior:
Environment variables for tests should be set using `.env` files. Bun automatically loads `.env` files from your project root. For test-specific variables, create a `.env.test` file:
```toml title="bunfig.toml" icon="settings"
[env]
NODE_ENV = "test"
DATABASE_URL = "postgresql://localhost:5432/test_db"
LOG_LEVEL = "error"
```ini title=".env.test" icon="settings"
NODE_ENV=test
DATABASE_URL=postgresql://localhost:5432/test_db
LOG_LEVEL=error
```
[test]
coverage = true
Then load it with `--env-file`:
```bash terminal icon="terminal"
bun test --env-file=.env.test
```
## Complete Configuration Example
@@ -398,13 +400,6 @@ Here's a comprehensive example showing all available test configuration options:
registry = "https://registry.npmjs.org/"
exact = true
[env]
# Environment variables for tests
NODE_ENV = "test"
DATABASE_URL = "postgresql://localhost:5432/test_db"
API_URL = "http://localhost:3001"
LOG_LEVEL = "error"
[test]
# Test discovery
root = "src"

View File

@@ -428,26 +428,26 @@ test("foo, bar, baz", () => {
const barSpy = spyOn(barModule, "bar");
const bazSpy = spyOn(bazModule, "baz");
// Original values
expect(fooSpy).toBe("foo");
expect(barSpy).toBe("bar");
expect(bazSpy).toBe("baz");
// Original implementations still work
expect(fooModule.foo()).toBe("foo");
expect(barModule.bar()).toBe("bar");
expect(bazModule.baz()).toBe("baz");
// Mock implementations
fooSpy.mockImplementation(() => 42);
barSpy.mockImplementation(() => 43);
bazSpy.mockImplementation(() => 44);
expect(fooSpy()).toBe(42);
expect(barSpy()).toBe(43);
expect(bazSpy()).toBe(44);
expect(fooModule.foo()).toBe(42);
expect(barModule.bar()).toBe(43);
expect(bazModule.baz()).toBe(44);
// Restore all
mock.restore();
expect(fooSpy()).toBe("foo");
expect(barSpy()).toBe("bar");
expect(bazSpy()).toBe("baz");
expect(fooModule.foo()).toBe("foo");
expect(barModule.bar()).toBe("bar");
expect(bazModule.baz()).toBe("baz");
});
```
@@ -455,10 +455,10 @@ Using `mock.restore()` can reduce the amount of code in your tests by adding it
## Vitest Compatibility
For added compatibility with tests written for Vitest, Bun provides the `vi` global object as an alias for parts of the Jest mocking API:
For added compatibility with tests written for Vitest, Bun provides the `vi` object as an alias for parts of the Jest mocking API:
```ts title="test.ts" icon="/icons/typescript.svg"
import { test, expect } from "bun:test";
import { test, expect, vi } from "bun:test";
// Using the 'vi' alias similar to Vitest
test("vitest compatibility", () => {

View File

@@ -40,7 +40,7 @@
pkgs.cmake # Expected: 3.30+ on nixos-unstable as of 2025-10
pkgs.ninja
pkgs.pkg-config
pkgs.sccache
pkgs.ccache
# Compilers and toolchain - version pinned to LLVM 19
clang

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.3.4",
"version": "1.3.7",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"
@@ -23,7 +23,8 @@
},
"resolutions": {
"bun-types": "workspace:packages/bun-types",
"@types/bun": "workspace:packages/@types/bun"
"@types/bun": "workspace:packages/@types/bun",
"@types/node": "25.0.0"
},
"scripts": {
"build": "bun --silent run build:debug",

File diff suppressed because it is too large

packages/bun-types/bundle.d.ts vendored Normal file
View File

@@ -0,0 +1,74 @@
/**
* The `bun:bundle` module provides compile-time utilities for dead-code elimination.
*
* @example
* ```ts
* import { feature } from "bun:bundle";
*
* if (feature("SUPER_SECRET")) {
* console.log("Secret feature enabled!");
* } else {
* console.log("Normal mode");
* }
* ```
*
* Enable feature flags via CLI:
* ```bash
* # During build
* bun build --feature=SUPER_SECRET index.ts
*
* # At runtime
* bun run --feature=SUPER_SECRET index.ts
*
* # In tests
* bun test --feature=SUPER_SECRET
* ```
*
* @module bun:bundle
*/
declare module "bun:bundle" {
/**
* Registry for type-safe feature flags.
*
* Augment this interface to get autocomplete and type checking for your feature flags:
*
* @example
* ```ts
* // env.d.ts
* declare module "bun:bundle" {
* interface Registry {
* features: "DEBUG" | "PREMIUM" | "BETA";
* }
* }
* ```
*
* Now `feature()` only accepts `"DEBUG"`, `"PREMIUM"`, or `"BETA"`:
* ```ts
* feature("DEBUG"); // OK
* feature("TYPO"); // Type error
* ```
*/
interface Registry {}
/**
* Check if a feature flag is enabled at compile time.
*
* This function is replaced with a boolean literal (`true` or `false`) at bundle time,
* enabling dead-code elimination. The bundler will remove unreachable branches.
*
* @param flag - The name of the feature flag to check
* @returns `true` if the flag was passed via `--feature=FLAG`, `false` otherwise
*
* @example
* ```ts
* import { feature } from "bun:bundle";
*
* // With --feature=DEBUG, this becomes: if (true) { ... }
* // Without --feature=DEBUG, this becomes: if (false) { ... }
* if (feature("DEBUG")) {
* console.log("Debug mode enabled");
* }
* ```
*/
function feature(flag: Registry extends { features: infer Features extends string } ? Features : string): boolean;
}

View File

@@ -1,9 +1,12 @@
declare module "bun" {
namespace __internal {
type NodeCryptoWebcryptoSubtleCrypto = import("crypto").webcrypto.SubtleCrypto;
type NodeCryptoWebcryptoCryptoKey = import("crypto").webcrypto.CryptoKey;
type NodeCryptoWebcryptoCryptoKeyPair = import("crypto").webcrypto.CryptoKeyPair;
type LibEmptyOrNodeCryptoWebcryptoSubtleCrypto = LibDomIsLoaded extends true
? {}
: import("crypto").webcrypto.SubtleCrypto;
type LibWorkerOrBunWorker = LibDomIsLoaded extends true ? {} : Bun.Worker;
type LibEmptyOrBunWebSocket = LibDomIsLoaded extends true ? {} : Bun.WebSocket;
@@ -14,7 +17,9 @@ declare module "bun" {
? {}
: import("node:stream/web").DecompressionStream;
type LibPerformanceOrNodePerfHooksPerformance = LibDomIsLoaded extends true ? {} : import("perf_hooks").Performance;
type LibPerformanceOrNodePerfHooksPerformance = LibDomIsLoaded extends true
? {}
: import("node:perf_hooks").Performance;
type LibEmptyOrPerformanceEntry = LibDomIsLoaded extends true ? {} : import("node:perf_hooks").PerformanceEntry;
type LibEmptyOrPerformanceMark = LibDomIsLoaded extends true ? {} : import("node:perf_hooks").PerformanceMark;
type LibEmptyOrPerformanceMeasure = LibDomIsLoaded extends true ? {} : import("node:perf_hooks").PerformanceMeasure;
@@ -83,6 +88,24 @@ declare var WritableStream: Bun.__internal.UseLibDomIfAvailable<
}
>;
interface CompressionStream extends Bun.__internal.LibEmptyOrNodeStreamWebCompressionStream {}
declare var CompressionStream: Bun.__internal.UseLibDomIfAvailable<
"CompressionStream",
{
prototype: CompressionStream;
new (format: Bun.CompressionFormat): CompressionStream;
}
>;
interface DecompressionStream extends Bun.__internal.LibEmptyOrNodeStreamWebDecompressionStream {}
declare var DecompressionStream: Bun.__internal.UseLibDomIfAvailable<
"DecompressionStream",
{
prototype: DecompressionStream;
new (format: Bun.CompressionFormat): DecompressionStream;
}
>;
interface Worker extends Bun.__internal.LibWorkerOrBunWorker {}
declare var Worker: Bun.__internal.UseLibDomIfAvailable<
"Worker",
@@ -206,7 +229,7 @@ interface TextEncoder extends Bun.__internal.LibEmptyOrNodeUtilTextEncoder {
* @param src The text to encode.
* @param dest The array to hold the encode result.
*/
encodeInto(src?: string, dest?: Bun.BufferSource): import("util").EncodeIntoResult;
encodeInto(src?: string, dest?: Bun.BufferSource): import("node:util").TextEncoderEncodeIntoResult;
}
declare var TextEncoder: Bun.__internal.UseLibDomIfAvailable<
"TextEncoder",
@@ -278,30 +301,6 @@ declare var Event: {
new (type: string, eventInitDict?: Bun.EventInit): Event;
};
/**
* Unimplemented in Bun
*/
interface CompressionStream extends Bun.__internal.LibEmptyOrNodeStreamWebCompressionStream {}
/**
* Unimplemented in Bun
*/
declare var CompressionStream: Bun.__internal.UseLibDomIfAvailable<
"CompressionStream",
typeof import("node:stream/web").CompressionStream
>;
/**
* Unimplemented in Bun
*/
interface DecompressionStream extends Bun.__internal.LibEmptyOrNodeStreamWebCompressionStream {}
/**
* Unimplemented in Bun
*/
declare var DecompressionStream: Bun.__internal.UseLibDomIfAvailable<
"DecompressionStream",
typeof import("node:stream/web").DecompressionStream
>;
interface EventTarget {
/**
* Adds a new handler for the `type` event. Any given `listener` is added only once per `type` and per `capture` option value.
@@ -958,7 +957,7 @@ declare function alert(message?: string): void;
declare function confirm(message?: string): boolean;
declare function prompt(message?: string, _default?: string): string | null;
interface SubtleCrypto extends Bun.__internal.NodeCryptoWebcryptoSubtleCrypto {}
interface SubtleCrypto extends Bun.__internal.LibEmptyOrNodeCryptoWebcryptoSubtleCrypto {}
declare var SubtleCrypto: {
prototype: SubtleCrypto;
new (): SubtleCrypto;
@@ -1694,6 +1693,10 @@ declare var EventSource: Bun.__internal.UseLibDomIfAvailable<
interface Performance extends Bun.__internal.LibPerformanceOrNodePerfHooksPerformance {}
declare var performance: Bun.__internal.UseLibDomIfAvailable<"performance", Performance>;
declare var Performance: Bun.__internal.UseLibDomIfAvailable<
"Performance",
{ new (): Performance; prototype: Performance }
>;
interface PerformanceEntry extends Bun.__internal.LibEmptyOrPerformanceEntry {}
declare var PerformanceEntry: Bun.__internal.UseLibDomIfAvailable<

View File

@@ -23,6 +23,7 @@
/// <reference path="./serve.d.ts" />
/// <reference path="./sql.d.ts" />
/// <reference path="./security.d.ts" />
/// <reference path="./bundle.d.ts" />
/// <reference path="./bun.ns.d.ts" />

View File

@@ -86,7 +86,7 @@ declare global {
reallyExit(code?: number): never;
dlopen(module: { exports: any }, filename: string, flags?: number): void;
_exiting: boolean;
noDeprecation: boolean;
noDeprecation?: boolean | undefined;
binding(m: "constants"): {
os: typeof import("node:os").constants;
@@ -308,11 +308,11 @@ declare global {
}
}
declare module "fs/promises" {
declare module "node:fs/promises" {
function exists(path: Bun.PathLike): Promise<boolean>;
}
declare module "tls" {
declare module "node:tls" {
interface BunConnectionOptions extends Omit<ConnectionOptions, "key" | "ca" | "tls" | "cert"> {
/**
* Optionally override the trusted CA certificates. Default is to trust
@@ -359,3 +359,18 @@ declare module "tls" {
function connect(options: BunConnectionOptions, secureConnectListener?: () => void): TLSSocket;
}
declare module "console" {
interface Console {
/**
* Asynchronously read lines from standard input (fd 0)
*
* ```ts
* for await (const line of console) {
* console.log(line);
* }
* ```
*/
[Symbol.asyncIterator](): AsyncIterableIterator<string>;
}
}

View File

@@ -11,9 +11,9 @@ declare module "bun" {
* If the file descriptor is not writable yet, the data is buffered.
*
* @param chunk The data to write
* @returns Number of bytes written
* @returns Number of bytes written or, if the write is pending, a Promise resolving to the number of bytes
*/
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number | Promise<number>;
/**
* Flush the internal buffer, committing the data to disk or the pipe.
*
@@ -78,9 +78,9 @@ declare module "bun" {
* If the network is not writable yet, the data is buffered.
*
* @param chunk The data to write
* @returns Number of bytes written
* @returns Number of bytes written or, if the write is pending, a Promise resolving to the number of bytes
*/
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number | Promise<number>;
/**
* Flush the internal buffer, committing the data to the network.
*
@@ -281,6 +281,24 @@ declare module "bun" {
*/
type?: string;
/**
* The Content-Disposition header value.
* Controls how the file is presented when downloaded.
*
* @example
* // Setting attachment disposition with filename
* const file = s3.file("report.pdf", {
* contentDisposition: "attachment; filename=\"quarterly-report.pdf\""
* });
*
* @example
* // Setting inline disposition
* await s3.write("image.png", imageData, {
* contentDisposition: "inline"
* });
*/
contentDisposition?: string | undefined;
/**
* By default, Amazon S3 uses the STANDARD Storage Class to store newly created objects.
*
@@ -303,6 +321,30 @@ declare module "bun" {
| "SNOW"
| "STANDARD_IA";
/**
* When set to `true`, confirms that the requester knows they will be charged
* for the request and data transfer costs. Required for accessing objects
* in Requester Pays buckets.
*
* @see https://docs.aws.amazon.com/AmazonS3/latest/userguide/RequesterPaysBuckets.html
*
* @example
* // Accessing a file in a Requester Pays bucket
* const file = s3.file("data.csv", {
* bucket: "requester-pays-bucket",
* requestPayer: true
* });
* const content = await file.text();
*
* @example
* // Uploading to a Requester Pays bucket
* await s3.write("output.json", data, {
* bucket: "requester-pays-bucket",
* requestPayer: true
* });
*/
requestPayer?: boolean;
/**
* @deprecated The size of the internal buffer in bytes. Defaults to 5 MiB. use `partSize` and `queueSize` instead.
*/
@@ -567,7 +609,17 @@ declare module "bun" {
* });
*/
write(
data: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer | Request | Response | BunFile | S3File | Blob,
data:
| string
| ArrayBufferView
| ArrayBuffer
| SharedArrayBuffer
| Request
| Response
| BunFile
| S3File
| Blob
| Archive,
options?: S3Options,
): Promise<number>;
@@ -878,7 +930,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
| File,
| File
| Archive,
options?: S3Options,
): Promise<number>;
@@ -928,7 +981,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
| File,
| File
| Archive,
options?: S3Options,
): Promise<number>;

View File

@@ -154,12 +154,6 @@ declare module "bun:sqlite" {
* | `bigint` | `INTEGER` |
* | `null` | `NULL` |
*
* @example
* ```ts
* db.run("CREATE TABLE foo (bar TEXT)");
* db.run("INSERT INTO foo VALUES (?)", ["baz"]);
* ```
*
* Useful for queries like:
* - `CREATE TABLE`
* - `INSERT INTO`
@@ -180,8 +174,14 @@ declare module "bun:sqlite" {
*
* @param sql The SQL query to run
* @param bindings Optional bindings for the query
* @returns A `Changes` object with `changes` and `lastInsertRowid` properties
*
* @returns `Database` instance
* @example
* ```ts
* db.run("CREATE TABLE foo (bar TEXT)");
* db.run("INSERT INTO foo VALUES (?)", ["baz"]);
* // => { changes: 1, lastInsertRowid: 1 }
* ```
*/
run<ParamsType extends SQLQueryBindings[]>(sql: string, ...bindings: ParamsType[]): Changes;
@@ -670,18 +670,19 @@ declare module "bun:sqlite" {
* Execute the prepared statement.
*
* @param params optional values to bind to the statement. If omitted, the statement is run with the last bound values or no parameters if there are none.
* @returns A `Changes` object with `changes` and `lastInsertRowid` properties
*
* @example
* ```ts
* const stmt = db.prepare("UPDATE foo SET bar = ?");
* stmt.run("baz");
* // => undefined
* const insert = db.prepare("INSERT INTO users (name) VALUES (?)");
* insert.run("Alice");
* // => { changes: 1, lastInsertRowid: 1 }
* insert.run("Bob");
* // => { changes: 1, lastInsertRowid: 2 }
*
* stmt.run();
* // => undefined
*
* stmt.run("foo");
* // => undefined
* const update = db.prepare("UPDATE users SET name = ? WHERE id = ?");
* update.run("Charlie", 1);
* // => { changes: 1, lastInsertRowid: 2 }
* ```
*
* The following types can be used when binding parameters:

View File

@@ -428,6 +428,8 @@ declare module "bun:test" {
}
namespace __internal {
type IfNeverThenElse<T, Else> = [T] extends [never] ? Else : T;
type IsTuple<T> = T extends readonly unknown[]
? number extends T["length"]
? false // It's an array with unknown length, not a tuple
@@ -1097,8 +1099,8 @@ declare module "bun:test" {
*
* @param expected the expected value
*/
toContainKey(expected: keyof T): void;
toContainKey<X = T>(expected: NoInfer<keyof X>): void;
toContainKey(expected: __internal.IfNeverThenElse<keyof T, PropertyKey>): void;
toContainKey<X = T>(expected: __internal.IfNeverThenElse<NoInfer<keyof X>, PropertyKey>): void;
/**
* Asserts that an `object` contains all the provided keys.
@@ -1114,8 +1116,8 @@ declare module "bun:test" {
*
* @param expected the expected value
*/
toContainAllKeys(expected: Array<keyof T>): void;
toContainAllKeys<X = T>(expected: NoInfer<Array<keyof X>>): void;
toContainAllKeys(expected: Array<__internal.IfNeverThenElse<keyof T, PropertyKey>>): void;
toContainAllKeys<X = T>(expected: Array<__internal.IfNeverThenElse<NoInfer<keyof X>, PropertyKey>>): void;
/**
* Asserts that an `object` contains at least one of the provided keys.
@@ -1131,8 +1133,8 @@ declare module "bun:test" {
*
* @param expected the expected value
*/
toContainAnyKeys(expected: Array<keyof T>): void;
toContainAnyKeys<X = T>(expected: NoInfer<Array<keyof X>>): void;
toContainAnyKeys(expected: Array<__internal.IfNeverThenElse<keyof T, PropertyKey>>): void;
toContainAnyKeys<X = T>(expected: Array<__internal.IfNeverThenElse<NoInfer<keyof X>, PropertyKey>>): void;
/**
* Asserts that an `object` contain the provided value.
@@ -1224,8 +1226,8 @@ declare module "bun:test" {
*
* @param expected the expected value
*/
toContainKeys(expected: Array<keyof T>): void;
toContainKeys<X = T>(expected: NoInfer<Array<keyof X>>): void;
toContainKeys(expected: Array<__internal.IfNeverThenElse<keyof T, PropertyKey>>): void;
toContainKeys<X = T>(expected: Array<__internal.IfNeverThenElse<NoInfer<keyof X>, PropertyKey>>): void;
/**
* Asserts that a value contains and equals what is expected.

View File

@@ -54,8 +54,8 @@ void us_listen_socket_close(int ssl, struct us_listen_socket_t *ls) {
s->next = loop->data.closed_head;
loop->data.closed_head = s;
/* Any socket with prev = context is marked as closed */
s->prev = (struct us_socket_t *) context;
/* Mark the socket as closed */
s->flags.is_closed = 1;
}
/* We cannot immediately free a listen socket as we can be inside an accept loop */
@@ -154,7 +154,9 @@ void us_internal_socket_context_unlink_connecting_socket(int ssl, struct us_sock
/* We always add in the top, so we don't modify any s.next */
void us_internal_socket_context_link_listen_socket(int ssl, struct us_socket_context_t *context, struct us_listen_socket_t *ls) {
struct us_socket_t* s = &ls->s;
if(us_socket_is_closed(ssl, s)) return;
s->context = context;
s->next = (struct us_socket_t *) context->head_listen_sockets;
s->prev = 0;
@@ -166,6 +168,8 @@ void us_internal_socket_context_link_listen_socket(int ssl, struct us_socket_con
}
void us_internal_socket_context_link_connecting_socket(int ssl, struct us_socket_context_t *context, struct us_connecting_socket_t *c) {
if(c->closed) return;
c->context = context;
c->next_pending = context->head_connecting_sockets;
c->prev_pending = 0;
@@ -180,6 +184,8 @@ void us_internal_socket_context_link_connecting_socket(int ssl, struct us_socket
/* We always add in the top, so we don't modify any s.next */
void us_internal_socket_context_link_socket(int ssl, struct us_socket_context_t *context, struct us_socket_t *s) {
if(us_socket_is_closed(ssl,s)) return;
s->context = context;
s->next = context->head_sockets;
s->prev = 0;
@@ -386,6 +392,9 @@ struct us_listen_socket_t *us_socket_context_listen(int ssl, struct us_socket_co
s->flags.low_prio_state = 0;
s->flags.is_paused = 0;
s->flags.is_ipc = 0;
s->flags.is_closed = 0;
s->flags.adopted = 0;
s->flags.is_tls = ssl;
s->next = 0;
s->flags.allow_half_open = (options & LIBUS_SOCKET_ALLOW_HALF_OPEN);
us_internal_socket_context_link_listen_socket(ssl, context, ls);
@@ -422,6 +431,9 @@ struct us_listen_socket_t *us_socket_context_listen_unix(int ssl, struct us_sock
s->flags.allow_half_open = (options & LIBUS_SOCKET_ALLOW_HALF_OPEN);
s->flags.is_paused = 0;
s->flags.is_ipc = 0;
s->flags.is_closed = 0;
s->flags.adopted = 0;
s->flags.is_tls = ssl;
s->next = 0;
us_internal_socket_context_link_listen_socket(ssl, context, ls);
@@ -430,7 +442,7 @@ struct us_listen_socket_t *us_socket_context_listen_unix(int ssl, struct us_sock
return ls;
}
struct us_socket_t* us_socket_context_connect_resolved_dns(struct us_socket_context_t *context, struct sockaddr_storage* addr, int options, int socket_ext_size) {
struct us_socket_t* us_socket_context_connect_resolved_dns(int ssl, struct us_socket_context_t *context, struct sockaddr_storage* addr, int options, int socket_ext_size) {
LIBUS_SOCKET_DESCRIPTOR connect_socket_fd = bsd_create_connect_socket(addr, options);
if (connect_socket_fd == LIBUS_SOCKET_ERROR) {
return NULL;
@@ -453,6 +465,9 @@ struct us_socket_t* us_socket_context_connect_resolved_dns(struct us_socket_cont
socket->flags.allow_half_open = (options & LIBUS_SOCKET_ALLOW_HALF_OPEN);
socket->flags.is_paused = 0;
socket->flags.is_ipc = 0;
socket->flags.is_closed = 0;
socket->flags.adopted = 0;
socket->flags.is_tls = ssl;
socket->connect_state = NULL;
socket->connect_next = NULL;
@@ -514,7 +529,7 @@ void *us_socket_context_connect(int ssl, struct us_socket_context_t *context, co
struct sockaddr_storage addr;
if (try_parse_ip(host, port, &addr)) {
*has_dns_resolved = 1;
return us_socket_context_connect_resolved_dns(context, &addr, options, socket_ext_size);
return us_socket_context_connect_resolved_dns(ssl, context, &addr, options, socket_ext_size);
}
struct addrinfo_request* ai_req;
@@ -534,7 +549,7 @@ void *us_socket_context_connect(int ssl, struct us_socket_context_t *context, co
struct sockaddr_storage addr;
init_addr_with_port(&entries->info, port, &addr);
*has_dns_resolved = 1;
struct us_socket_t *s = us_socket_context_connect_resolved_dns(context, &addr, options, socket_ext_size);
struct us_socket_t *s = us_socket_context_connect_resolved_dns(ssl, context, &addr, options, socket_ext_size);
Bun__addrinfo_freeRequest(ai_req, s == NULL);
return s;
}
@@ -583,6 +598,9 @@ int start_connections(struct us_connecting_socket_t *c, int count) {
flags->allow_half_open = (c->options & LIBUS_SOCKET_ALLOW_HALF_OPEN);
flags->is_paused = 0;
flags->is_ipc = 0;
flags->is_closed = 0;
flags->adopted = 0;
flags->is_tls = c->ssl;
/* Link it into context so that timeout fires properly */
us_internal_socket_context_link_socket(0, context, s);
@@ -760,6 +778,9 @@ struct us_socket_t *us_socket_context_connect_unix(int ssl, struct us_socket_con
connect_socket->flags.allow_half_open = (options & LIBUS_SOCKET_ALLOW_HALF_OPEN);
connect_socket->flags.is_paused = 0;
connect_socket->flags.is_ipc = 0;
connect_socket->flags.is_closed = 0;
connect_socket->flags.adopted = 0;
connect_socket->flags.is_tls = ssl;
connect_socket->connect_state = NULL;
connect_socket->connect_next = NULL;
us_internal_socket_context_link_socket(ssl, context, connect_socket);
@@ -780,10 +801,10 @@ struct us_socket_context_t *us_create_child_socket_context(int ssl, struct us_so
}
/* Note: This will set timeout to 0 */
struct us_socket_t *us_socket_context_adopt_socket(int ssl, struct us_socket_context_t *context, struct us_socket_t *s, int ext_size) {
struct us_socket_t *us_socket_context_adopt_socket(int ssl, struct us_socket_context_t *context, struct us_socket_t *s, int old_ext_size, int ext_size) {
#ifndef LIBUS_NO_SSL
if (ssl) {
return (struct us_socket_t *) us_internal_ssl_socket_context_adopt_socket((struct us_internal_ssl_socket_context_t *) context, (struct us_internal_ssl_socket_t *) s, ext_size);
return (struct us_socket_t *) us_internal_ssl_socket_context_adopt_socket((struct us_internal_ssl_socket_context_t *) context, (struct us_internal_ssl_socket_t *) s, old_ext_size, ext_size);
}
#endif
@@ -807,7 +828,18 @@ struct us_socket_t *us_socket_context_adopt_socket(int ssl, struct us_socket_con
struct us_socket_t *new_s = s;
if (ext_size != -1) {
struct us_poll_t *pool_ref = &s->p;
new_s = (struct us_socket_t *) us_poll_resize(pool_ref, loop, sizeof(struct us_socket_t) + ext_size);
new_s = (struct us_socket_t *) us_poll_resize(pool_ref, loop, sizeof(struct us_socket_t) + old_ext_size, sizeof(struct us_socket_t) + ext_size);
if(new_s != s) {
/* Mark the old socket as closed */
s->flags.is_closed = 1;
/* Link this socket to the close-list and let it be deleted after this iteration */
s->next = s->context->loop->data.closed_head;
s->context->loop->data.closed_head = s;
/* Mark the old socket as adopted (reallocated) */
s->flags.adopted = 1;
/* Tell the event loop which socket replaced this one so events and callbacks (incoming data, EOF) are routed to the new socket */
s->prev = new_s;
}
if (c) {
c->connecting_head = new_s;
c->context = context;

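Review note: the forwarding trick in the hunk above is easy to miss. When adoption reallocates the socket, the stale struct is kept alive on the closed list with `adopted` set and `prev` pointing at its replacement, so any code still holding the old pointer can hop to the live one. A reduced sketch of that pattern (standalone structs, not the real `us_socket_t`):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Reduced stand-ins for us_socket_t and its new flag bits. */
struct sock {
    struct sock *prev;        /* after adoption: forwarding pointer */
    unsigned is_closed : 1;
    unsigned adopted   : 1;
    char ext[8];              /* per-socket extension data */
};

/* "Adopt" into a larger allocation, leaving a forwarding record behind,
 * as us_socket_context_adopt_socket now does when us_poll_resize moves. */
static struct sock *adopt(struct sock *s, size_t new_ext) {
    struct sock *n = calloc(1, sizeof(*s) + new_ext);
    memcpy(n, s, sizeof(*s));
    s->is_closed = 1;  /* old block is dead (real code parks it on the closed list) */
    s->adopted = 1;    /* ...but flagged so stale holders can detect the move */
    s->prev = n;       /* prev now points at the live replacement */
    return n;
}

int main(void) {
    struct sock *old = calloc(1, sizeof(*old));
    struct sock *s = old;                 /* a stale handle someone kept */
    struct sock *live = adopt(old, 64);

    /* The dispatch loop's re-resolution pattern after every callback: */
    if (s && s->adopted && s->prev)
        s = s->prev;

    printf("stale handle forwarded to live socket: %d\n", s == live); /* 1 */
    free(old);   /* the real code defers this via the closed list */
    free(live);
    return 0;
}
```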
View File

@@ -396,7 +396,7 @@ void us_internal_update_handshake(struct us_internal_ssl_socket_t *s) {
}
int result = SSL_do_handshake(s->ssl);
if (SSL_get_shutdown(s->ssl) & SSL_RECEIVED_SHUTDOWN) {
us_internal_ssl_socket_close(s, 0, NULL);
return;
@@ -417,7 +417,7 @@ void us_internal_update_handshake(struct us_internal_ssl_socket_t *s) {
}
s->handshake_state = HANDSHAKE_PENDING;
s->ssl_write_wants_read = 1;
s->s.flags.last_write_failed = 1;
return;
}
// success
@@ -434,6 +434,7 @@ ssl_on_close(struct us_internal_ssl_socket_t *s, int code, void *reason) {
struct us_internal_ssl_socket_t * ret = context->on_close(s, code, reason);
SSL_free(s->ssl); // free SSL after on_close
s->ssl = NULL; // set to NULL
return ret;
}
@@ -1855,15 +1856,16 @@ void us_internal_ssl_socket_shutdown(struct us_internal_ssl_socket_t *s) {
struct us_internal_ssl_socket_t *us_internal_ssl_socket_context_adopt_socket(
struct us_internal_ssl_socket_context_t *context,
struct us_internal_ssl_socket_t *s, int ext_size) {
struct us_internal_ssl_socket_t *s, int old_ext_size, int ext_size) {
// todo: this is completely untested
int new_old_ext_size = sizeof(struct us_internal_ssl_socket_t) - sizeof(struct us_socket_t) + old_ext_size;
int new_ext_size = ext_size;
if (ext_size != -1) {
new_ext_size = sizeof(struct us_internal_ssl_socket_t) - sizeof(struct us_socket_t) + ext_size;
}
return (struct us_internal_ssl_socket_t *)us_socket_context_adopt_socket(
0, &context->sc, &s->s,
new_ext_size);
new_old_ext_size, new_ext_size);
}
struct us_internal_ssl_socket_t *
@@ -1920,10 +1922,11 @@ ssl_wrapped_context_on_data(struct us_internal_ssl_socket_t *s, char *data,
struct us_wrapped_socket_context_t *wrapped_context =
(struct us_wrapped_socket_context_t *)us_internal_ssl_socket_context_ext(
context);
// raw data if needed
// raw data if needed
if (wrapped_context->old_events.on_data) {
wrapped_context->old_events.on_data((struct us_socket_t *)s, data, length);
}
// ssl wrapped data
return ssl_on_data(s, data, length);
}
@@ -2028,7 +2031,7 @@ us_internal_ssl_socket_open(struct us_internal_ssl_socket_t *s, int is_client,
// already opened
if (s->ssl)
return s;
// start SSL open
return ssl_on_open(s, is_client, ip, ip_length, NULL);
}
@@ -2040,6 +2043,7 @@ struct us_socket_t *us_socket_upgrade_to_tls(us_socket_r s, us_socket_context_r
struct us_internal_ssl_socket_t *socket =
(struct us_internal_ssl_socket_t *)us_socket_context_adopt_socket(
0, new_context, s,
sizeof(void*),
(sizeof(struct us_internal_ssl_socket_t) - sizeof(struct us_socket_t)) + sizeof(void*));
socket->ssl = NULL;
socket->ssl_write_wants_read = 0;
@@ -2058,7 +2062,7 @@ struct us_socket_t *us_socket_upgrade_to_tls(us_socket_r s, us_socket_context_r
struct us_internal_ssl_socket_t *us_internal_ssl_socket_wrap_with_tls(
struct us_socket_t *s, struct us_bun_socket_context_options_t options,
struct us_socket_events_t events, int socket_ext_size) {
struct us_socket_events_t events, int old_socket_ext_size, int socket_ext_size) {
/* Cannot wrap a closed socket */
if (us_socket_is_closed(0, s)) {
return NULL;
@@ -2163,6 +2167,7 @@ us_socket_context_on_socket_connect_error(
struct us_internal_ssl_socket_t *socket =
(struct us_internal_ssl_socket_t *)us_socket_context_adopt_socket(
0, context, s,
old_socket_ext_size,
sizeof(struct us_internal_ssl_socket_t) - sizeof(struct us_socket_t) +
socket_ext_size);
socket->ssl = NULL;

View File

@@ -228,8 +228,8 @@ void us_loop_run(struct us_loop_t *loop) {
// > Instead, the filter will aggregate the events into a single kevent struct
// Note: EV_ERROR only sets the error in data as part of changelist. Not in this call!
int events = 0
| ((filter & EVFILT_READ) ? LIBUS_SOCKET_READABLE : 0)
| ((filter & EVFILT_WRITE) ? LIBUS_SOCKET_WRITABLE : 0);
| ((filter == EVFILT_READ) ? LIBUS_SOCKET_READABLE : 0)
| ((filter == EVFILT_WRITE) ? LIBUS_SOCKET_WRITABLE : 0);
const int error = (flags & (EV_ERROR)) ? ((int)fflags || 1) : 0;
const int eof = (flags & (EV_EOF));
#endif
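
Review note: a quick sanity check on why this hunk swaps the bitwise test for an equality test. kevent filter identifiers are small negative integers, not bit flags, so ANDing them is almost always truthy. A minimal standalone sketch (the constants are hardcoded to the values `sys/event.h` defines on macOS/FreeBSD, so it compiles anywhere):

```c
#include <stdio.h>

/* kevent filters are identifiers, not bitmasks; these match sys/event.h on macOS/BSD. */
#define EVFILT_READ  (-1)
#define EVFILT_WRITE (-2)

int main(void) {
    short filter = EVFILT_WRITE; /* a pure write event */

    /* Buggy: -2 & -1 == -2 (truthy), so a write event also "looks" readable. */
    int readable_and = (filter & EVFILT_READ) ? 1 : 0;

    /* Fixed: equality matches exactly one filter. */
    int readable_eq = (filter == EVFILT_READ) ? 1 : 0;

    printf("AND test says readable=%d, EQ test says readable=%d\n",
           readable_and, readable_eq); /* prints 1 then 0 */
    return 0;
}
```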
@@ -325,7 +325,7 @@ void us_internal_loop_update_pending_ready_polls(struct us_loop_t *loop, struct
int num_entries_possibly_remaining = 1;
#else
/* Ready polls may contain same poll twice under kqueue, as one poll may hold two filters */
int num_entries_possibly_remaining = 2;//((old_events & LIBUS_SOCKET_READABLE) ? 1 : 0) + ((old_events & LIBUS_SOCKET_WRITABLE) ? 1 : 0);
int num_entries_possibly_remaining = 2;
#endif
/* Todo: for kqueue if we track things in us_change_poll it is possible to have a fast path with no seeking in cases of:
@@ -360,11 +360,11 @@ int kqueue_change(int kqfd, int fd, int old_events, int new_events, void *user_d
if(!is_readable && !is_writable) {
if(!(old_events & LIBUS_SOCKET_WRITABLE)) {
// if we are not reading or writing, we need to add writable to receive FIN
EV_SET64(&change_list[change_length++], fd, EVFILT_WRITE, EV_ADD, 0, 0, (uint64_t)(void*)user_data, 0, 0);
EV_SET64(&change_list[change_length++], fd, EVFILT_WRITE, EV_ADD | EV_ONESHOT, 0, 0, (uint64_t)(void*)user_data, 0, 0);
}
} else if ((new_events & LIBUS_SOCKET_WRITABLE) != (old_events & LIBUS_SOCKET_WRITABLE)) {
/* Do they differ in writable? */
EV_SET64(&change_list[change_length++], fd, EVFILT_WRITE, (new_events & LIBUS_SOCKET_WRITABLE) ? EV_ADD : EV_DELETE, 0, 0, (uint64_t)(void*)user_data, 0, 0);
EV_SET64(&change_list[change_length++], fd, EVFILT_WRITE, (new_events & LIBUS_SOCKET_WRITABLE) ? EV_ADD | EV_ONESHOT : EV_DELETE, 0, 0, (uint64_t)(void*)user_data, 0, 0);
}
int ret;
do {
@@ -377,22 +377,30 @@ int kqueue_change(int kqfd, int fd, int old_events, int new_events, void *user_d
}
#endif
struct us_poll_t *us_poll_resize(struct us_poll_t *p, struct us_loop_t *loop, unsigned int ext_size) {
int events = us_poll_events(p);
struct us_poll_t *us_poll_resize(struct us_poll_t *p, struct us_loop_t *loop, unsigned int old_ext_size, unsigned int ext_size) {
struct us_poll_t *new_p = us_realloc(p, sizeof(struct us_poll_t) + ext_size);
if (p != new_p) {
unsigned int old_size = sizeof(struct us_poll_t) + old_ext_size;
unsigned int new_size = sizeof(struct us_poll_t) + ext_size;
if(new_size <= old_size) return p;
struct us_poll_t *new_p = us_calloc(1, new_size);
memcpy(new_p, p, old_size);
/* Increment poll count for the new poll - the old poll will be freed separately
* which decrements the count, keeping the total correct */
loop->num_polls++;
int events = us_poll_events(p);
#ifdef LIBUS_USE_EPOLL
/* Hack: forcefully update poll by stripping away already set events */
new_p->state.poll_type = us_internal_poll_type(new_p);
us_poll_change(new_p, loop, events);
/* Hack: forcefully update poll by stripping away already set events */
new_p->state.poll_type = us_internal_poll_type(new_p);
us_poll_change(new_p, loop, events);
#else
/* Forcefully update poll by resetting them with new_p as user data */
kqueue_change(loop->fd, new_p->state.fd, 0, LIBUS_SOCKET_WRITABLE | LIBUS_SOCKET_READABLE, new_p);
#endif /* This is needed for epoll also (us_change_poll doesn't update the old poll) */
us_internal_loop_update_pending_ready_polls(loop, p, new_p, events, events);
}
/* Forcefully update poll by resetting them with new_p as user data */
kqueue_change(loop->fd, new_p->state.fd, 0, LIBUS_SOCKET_WRITABLE | LIBUS_SOCKET_READABLE, new_p);
#endif
/* This is needed for epoll also (us_change_poll doesn't update the old poll) */
us_internal_loop_update_pending_ready_polls(loop, p, new_p, events, events);
return new_p;
}
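
Review note: one subtlety in the rewritten `us_poll_resize` is that it no longer uses `realloc`. A grow-only copy into fresh zeroed memory keeps the old poll intact (so it can sit on the closed list and forward events), and a shrink request simply returns the existing allocation. It is also why every adopt/wrap signature in this diff grew an `old_ext_size` parameter: without `realloc`, the copier has to know how many bytes the old allocation actually held. A minimal sketch of the allocation strategy, detached from the event-loop bookkeeping:

```c
#include <stdlib.h>
#include <string.h>

struct poll_hdr { int fd; int events; };

/* Grow-only resize: never realloc in place, never shrink.
 * Returns the original pointer when no growth is needed, otherwise a
 * zero-initialized copy; the caller keeps the old block alive until it
 * is safe to free (the real code parks it on the loop's closed list). */
static struct poll_hdr *poll_resize(struct poll_hdr *p,
                                    size_t old_ext, size_t new_ext) {
    size_t old_size = sizeof(*p) + old_ext;
    size_t new_size = sizeof(*p) + new_ext;
    if (new_size <= old_size)
        return p;                       /* nothing to grow */
    struct poll_hdr *n = calloc(1, new_size);
    if (!n)
        return NULL;
    memcpy(n, p, old_size);             /* this is why old_ext must be passed in */
    return n;
}

int main(void) {
    struct poll_hdr *p = calloc(1, sizeof(*p) + 16);
    struct poll_hdr *q = poll_resize(p, 16, 64);
    if (q != p)
        free(p);   /* sketch only; the real code defers this */
    free(q);
    return 0;
}
```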
@@ -444,7 +452,7 @@ void us_poll_change(struct us_poll_t *p, struct us_loop_t *loop, int events) {
kqueue_change(loop->fd, p->state.fd, old_events, events, p);
#endif
/* Set all removed events to null-polls in pending ready poll list */
// us_internal_loop_update_pending_ready_polls(loop, p, p, old_events, events);
us_internal_loop_update_pending_ready_polls(loop, p, p, old_events, events);
}
}

View File

@@ -71,6 +71,11 @@ void us_poll_init(struct us_poll_t *p, LIBUS_SOCKET_DESCRIPTOR fd,
}
void us_poll_free(struct us_poll_t *p, struct us_loop_t *loop) {
// poll was resized and doesn't own the uv_poll_t anymore
if(!p->uv_p) {
free(p);
return;
}
/* The idea here is like so; in us_poll_stop we call uv_close after setting
* data of uv-poll to 0. This means that in close_cb_free we call free on 0
* which does nothing, since us_poll_stop should not really free the poll.
@@ -86,6 +91,7 @@ void us_poll_free(struct us_poll_t *p, struct us_loop_t *loop) {
}
void us_poll_start(struct us_poll_t *p, struct us_loop_t *loop, int events) {
if(!p->uv_p) return;
p->poll_type = us_internal_poll_type(p) |
((events & LIBUS_SOCKET_READABLE) ? POLL_TYPE_POLLING_IN : 0) |
((events & LIBUS_SOCKET_WRITABLE) ? POLL_TYPE_POLLING_OUT : 0);
@@ -99,6 +105,7 @@ void us_poll_start(struct us_poll_t *p, struct us_loop_t *loop, int events) {
}
void us_poll_change(struct us_poll_t *p, struct us_loop_t *loop, int events) {
if(!p->uv_p) return;
if (us_poll_events(p) != events) {
p->poll_type =
us_internal_poll_type(p) |
@@ -109,6 +116,7 @@ void us_poll_change(struct us_poll_t *p, struct us_loop_t *loop, int events) {
}
void us_poll_stop(struct us_poll_t *p, struct us_loop_t *loop) {
if(!p->uv_p) return;
uv_poll_stop(p->uv_p);
/* We normally only want to close the poll here, not free it. But if we stop
@@ -217,10 +225,20 @@ struct us_poll_t *us_create_poll(struct us_loop_t *loop, int fallthrough,
/* If we update our block position we have to update the uv_poll data to point
* to us */
struct us_poll_t *us_poll_resize(struct us_poll_t *p, struct us_loop_t *loop,
unsigned int ext_size) {
unsigned int old_ext_size, unsigned int ext_size) {
// cannot resize if we don't own the uv_poll_t
if(!p->uv_p) return p;
unsigned int old_size = sizeof(struct us_poll_t) + old_ext_size;
unsigned int new_size = sizeof(struct us_poll_t) + ext_size;
if(new_size <= old_size) return p;
struct us_poll_t *new_p = calloc(1, new_size);
memcpy(new_p, p, old_size);
struct us_poll_t *new_p = realloc(p, sizeof(struct us_poll_t) + ext_size);
new_p->uv_p->data = new_p;
p->uv_p = NULL;
return new_p;
}
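
Review note: the libuv variant has an extra wrinkle. The `uv_poll_t` handle is shared, so after copying, the new poll takes ownership (its `data` pointer is repointed) and the old one is disowned by nulling `uv_p`; the `us_poll_free` change above uses that null as the signal to plain-free instead of going through `uv_close`. A compressed sketch of the handoff, assuming nothing about libuv beyond a data pointer:

```c
#include <stdlib.h>
#include <string.h>

struct uv_handle { void *data; };             /* stand-in for uv_poll_t */
struct poll { struct uv_handle *uv_p; char ext[]; };

static struct poll *poll_resize(struct poll *p, size_t old_sz, size_t new_sz) {
    if (new_sz <= old_sz) return p;
    struct poll *n = calloc(1, new_sz);
    memcpy(n, p, old_sz);
    n->uv_p->data = n;   /* libuv callbacks now resolve to the new poll */
    p->uv_p = NULL;      /* old poll no longer owns the handle */
    return n;
}

static void poll_free(struct poll *p) {
    if (!p->uv_p) {      /* disowned by a resize: just release the memory */
        free(p);
        return;
    }
    /* ...otherwise close the uv handle first, as the real us_poll_free does */
}

int main(void) {
    struct uv_handle h = {0};
    struct poll *p = calloc(1, sizeof(*p) + 8);
    p->uv_p = &h;
    h.data = p;
    struct poll *n = poll_resize(p, sizeof(*p) + 8, sizeof(*p) + 32);
    poll_free(p);        /* p was disowned, so this just frees memory */
    free(n);
    return 0;
}
```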

View File

@@ -170,6 +170,14 @@ struct us_socket_flags {
unsigned char low_prio_state: 2;
/* If true, the socket should be read using readmsg to support receiving file descriptors */
bool is_ipc: 1;
/* If true, the socket has been closed */
bool is_closed: 1;
/* If true, the socket was reallocated during adoption */
bool adopted: 1;
/* If true, the socket is a TLS socket */
bool is_tls: 1;
/* If true, the last write to this socket failed (would block) */
bool last_write_failed: 1;
} __attribute__((packed));
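
Review note: these new bits are the backbone of the rest of the diff. `is_closed` replaces the old `prev == context` sentinel (compare `us_socket_is_closed` further down), `adopted` drives the pointer-forwarding after reallocation, and `is_tls` lets generic loop code recover the right context helpers without threading `ssl` through every call. A small sketch of why the explicit flag is needed once `prev` carries a forwarding pointer (reduced structs, not the real `us_socket_t`):

```c
#include <stdio.h>
#include <stdbool.h>

/* Abridged version of the packed flags struct from the diff. */
struct sock_flags {
    bool is_closed : 1;  /* explicit, instead of overloading s->prev */
    bool adopted   : 1;
    bool is_tls    : 1;
} __attribute__((packed));

struct ctx;                        /* opaque, like us_socket_context_t */
struct sock {
    struct sock *prev, *next;      /* prev is free to be a real pointer again */
    struct ctx  *context;
    struct sock_flags flags;
};

/* Old scheme: prev doubling as a "closed" sentinel. */
static int is_closed_sentinel(struct sock *s) {
    return s->prev == (struct sock *) s->context;
}

/* New scheme: a dedicated bit. */
static int is_closed_flag(struct sock *s) {
    return s->flags.is_closed;
}

int main(void) {
    struct sock live = {0}, old = {0};
    old.flags.is_closed = true;    /* closed... */
    old.flags.adopted = true;
    old.prev = &live;              /* ...yet prev carries the forwarding pointer */

    /* The sentinel can no longer detect closure once prev is repurposed: */
    printf("flag says closed=%d, sentinel would say %d\n",
           is_closed_flag(&old), is_closed_sentinel(&old)); /* 1 then 0 */
    return 0;
}
```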
@@ -435,11 +443,11 @@ void us_internal_ssl_socket_shutdown(us_internal_ssl_socket_r s);
struct us_internal_ssl_socket_t *us_internal_ssl_socket_context_adopt_socket(
us_internal_ssl_socket_context_r context,
us_internal_ssl_socket_r s, int ext_size);
us_internal_ssl_socket_r s, int old_ext_size, int ext_size);
struct us_internal_ssl_socket_t *us_internal_ssl_socket_wrap_with_tls(
us_socket_r s, struct us_bun_socket_context_options_t options,
struct us_socket_events_t events, int socket_ext_size);
struct us_socket_events_t events, int old_socket_ext_size, int socket_ext_size);
struct us_internal_ssl_socket_context_t *
us_internal_create_child_ssl_socket_context(
us_internal_ssl_socket_context_r context, int context_ext_size);

View File

@@ -37,7 +37,6 @@ struct us_internal_loop_data_t {
struct us_timer_t *sweep_timer;
int sweep_timer_count;
struct us_internal_async *wakeup_async;
int last_write_failed;
struct us_socket_context_t *head;
struct us_socket_context_t *iterator;
struct us_socket_context_t *closed_context_head;

View File

@@ -349,7 +349,7 @@ struct us_loop_t *us_socket_context_loop(int ssl, us_socket_context_r context) n
/* Invalidates passed socket, returning a new resized socket which belongs to a different socket context.
* Used mainly for "socket upgrades" such as when transitioning from HTTP to WebSocket. */
struct us_socket_t *us_socket_context_adopt_socket(int ssl, us_socket_context_r context, us_socket_r s, int ext_size);
struct us_socket_t *us_socket_context_adopt_socket(int ssl, us_socket_context_r context, us_socket_r s, int old_ext_size, int ext_size);
struct us_socket_t *us_socket_upgrade_to_tls(us_socket_r s, us_socket_context_r new_context, const char *sni);
@@ -411,7 +411,7 @@ void *us_poll_ext(us_poll_r p) nonnull_fn_decl;
LIBUS_SOCKET_DESCRIPTOR us_poll_fd(us_poll_r p) nonnull_fn_decl;
/* Resize an active poll */
struct us_poll_t *us_poll_resize(us_poll_r p, us_loop_r loop, unsigned int ext_size) nonnull_fn_decl;
struct us_poll_t *us_poll_resize(us_poll_r p, us_loop_r loop, unsigned int old_ext_size, unsigned int ext_size) nonnull_fn_decl;
/* Public interfaces for sockets */
@@ -470,7 +470,7 @@ void us_socket_local_address(int ssl, us_socket_r s, char *nonnull_arg buf, int
/* Bun extras */
struct us_socket_t *us_socket_pair(struct us_socket_context_t *ctx, int socket_ext_size, LIBUS_SOCKET_DESCRIPTOR* fds);
struct us_socket_t *us_socket_from_fd(struct us_socket_context_t *ctx, int socket_ext_size, LIBUS_SOCKET_DESCRIPTOR fd, int ipc);
struct us_socket_t *us_socket_wrap_with_tls(int ssl, us_socket_r s, struct us_bun_socket_context_options_t options, struct us_socket_events_t events, int socket_ext_size);
struct us_socket_t *us_socket_wrap_with_tls(int ssl, us_socket_r s, struct us_bun_socket_context_options_t options, struct us_socket_events_t events, int old_socket_ext_size, int socket_ext_size);
int us_socket_raw_write(int ssl, us_socket_r s, const char *data, int length);
struct us_socket_t* us_socket_open(int ssl, struct us_socket_t * s, int is_client, char* ip, int ip_length);
int us_raw_root_certs(struct us_cert_string_t**out);

View File

@@ -193,9 +193,17 @@ void us_internal_handle_low_priority_sockets(struct us_loop_t *loop) {
loop_data->low_prio_head = s->next;
if (s->next) s->next->prev = 0;
s->next = 0;
int ssl = s->flags.is_tls;
if(us_socket_is_closed(ssl, s)) {
s->flags.low_prio_state = 2;
us_socket_context_unref(ssl, s->context);
continue;
}
us_internal_socket_context_link_socket(0, s->context, s);
us_poll_change(&s->p, us_socket_context(0, s)->loop, us_poll_events(&s->p) | LIBUS_SOCKET_READABLE);
us_internal_socket_context_link_socket(ssl, s->context, s);
us_socket_context_unref(ssl, s->context);
us_poll_change(&s->p, us_socket_context(ssl, s)->loop, us_poll_events(&s->p) | LIBUS_SOCKET_READABLE);
s->flags.low_prio_state = 2;
}
@@ -243,6 +251,7 @@ void us_internal_free_closed_sockets(struct us_loop_t *loop) {
/* Free all closed sockets (maybe it is better to reverse order?) */
for (struct us_socket_t *s = loop->data.closed_head; s; ) {
struct us_socket_t *next = s->next;
s->prev = s->next = 0;
us_poll_free((struct us_poll_t *) s, loop);
s = next;
}
@@ -347,6 +356,9 @@ void us_internal_dispatch_ready_poll(struct us_poll_t *p, int error, int eof, in
s->flags.allow_half_open = listen_socket->s.flags.allow_half_open;
s->flags.is_paused = 0;
s->flags.is_ipc = 0;
s->flags.is_closed = 0;
s->flags.adopted = 0;
s->flags.is_tls = listen_socket->s.flags.is_tls;
/* We always use nodelay */
bsd_socket_nodelay(client_fd, 1);
@@ -354,7 +366,10 @@ void us_internal_dispatch_ready_poll(struct us_poll_t *p, int error, int eof, in
us_internal_socket_context_link_socket(0, listen_socket->s.context, s);
listen_socket->s.context->on_open(s, 0, bsd_addr_get_ip(&addr), bsd_addr_get_ip_length(&addr));
/* After socket adoption, track the new socket; the old one becomes invalid */
if(s && s->flags.adopted && s->prev) {
s = s->prev;
}
/* Exit accept loop if listen socket was closed in on_open handler */
if (us_socket_is_closed(0, &listen_socket->s)) {
break;
@@ -369,22 +384,37 @@ void us_internal_dispatch_ready_poll(struct us_poll_t *p, int error, int eof, in
case POLL_TYPE_SOCKET: {
/* We should only use s, no p after this point */
struct us_socket_t *s = (struct us_socket_t *) p;
/* After socket adoption, track the new socket; the old one becomes invalid */
if(s && s->flags.adopted && s->prev) {
s = s->prev;
}
/* The context can change after calling a callback but the loop is always the same */
struct us_loop_t* loop = s->context->loop;
if (events & LIBUS_SOCKET_WRITABLE && !error) {
/* Note: if we failed a write as a socket of one loop then adopted
* to another loop, this will be wrong. Absurd case though */
loop->data.last_write_failed = 0;
s->flags.last_write_failed = 0;
#ifdef LIBUS_USE_KQUEUE
/* Kqueue is one-shot so is not writable anymore */
p->state.poll_type = us_internal_poll_type(p) | ((events & LIBUS_SOCKET_READABLE) ? POLL_TYPE_POLLING_IN : 0);
#endif
s = s->context->on_writable(s);
/* After socket adoption, track the new socket; the old one becomes invalid */
if(s && s->flags.adopted && s->prev) {
s = s->prev;
}
if (!s || us_socket_is_closed(0, s)) {
return;
}
/* If we have no failed write or if we shut down, then stop polling for more writable */
if (!loop->data.last_write_failed || us_socket_is_shut_down(0, s)) {
if (!s->flags.last_write_failed || us_socket_is_shut_down(0, s)) {
us_poll_change(&s->p, loop, us_poll_events(&s->p) & LIBUS_SOCKET_READABLE);
} else {
#ifdef LIBUS_USE_KQUEUE
/* Kqueue one-shot writable needs to be re-enabled */
us_poll_change(&s->p, loop, us_poll_events(&s->p) | LIBUS_SOCKET_WRITABLE);
#endif
}
}
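
Review note: two moves are intertwined in this hunk. The write-failed bit migrates from the loop (`loop->data.last_write_failed`) onto the socket itself, and because the kqueue registration is now `EV_ONESHOT`, a still-blocked writer must explicitly re-arm writable interest. A schematic of that state machine, with the poll calls stubbed out (the shutdown check from the real code is omitted):

```c
#include <stdio.h>
#include <stdbool.h>

struct sock {
    struct { bool last_write_failed : 1; } flags;  /* per-socket, not per-loop */
};

/* Stub for us_poll_change: the real code re-adds EVFILT_WRITE with
 * EV_ONESHOT on kqueue, or adjusts epoll interest. */
static void poll_want_writable(struct sock *s, bool want) {
    printf("%s writable interest\n", want ? "re-arming" : "dropping");
    (void) s;
}

static void on_write(struct sock *s, int written, int length) {
    if (written != length) {            /* short write: kernel buffer full */
        s->flags.last_write_failed = 1;
        poll_want_writable(s, true);
    }
}

static void on_writable_event(struct sock *s) {
    s->flags.last_write_failed = 0;     /* clear before the user callback */
    /* ... user on_writable callback runs here and may write again ... */
    if (!s->flags.last_write_failed)
        poll_want_writable(s, false);   /* fully flushed: stop polling writable */
    else
        poll_want_writable(s, true);    /* one-shot fired; re-arm to hear again */
}

int main(void) {
    struct sock s = {0};
    on_write(&s, 100, 256);             /* pretend only 100 of 256 bytes went out */
    on_writable_event(&s);              /* later: the socket became writable */
    return 0;
}
```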
@@ -468,6 +498,10 @@ void us_internal_dispatch_ready_poll(struct us_poll_t *p, int error, int eof, in
if (length > 0) {
s = s->context->on_data(s, loop->data.recv_buf + LIBUS_RECV_BUFFER_PADDING, length);
/* After socket adoption, track the new socket; the old one becomes invalid */
if(s && s->flags.adopted && s->prev) {
s = s->prev;
}
// loop->num_ready_polls isn't accessible on Windows.
#ifndef WIN32
// rare case: we're reading a lot of data, there's more to be read, and either:

View File

@@ -125,7 +125,7 @@ int us_socket_is_closed(int ssl, struct us_socket_t *s) {
if(ssl) {
return us_internal_ssl_socket_is_closed((struct us_internal_ssl_socket_t *) s);
}
return s->prev == (struct us_socket_t *) s->context;
return s->flags.is_closed;
}
int us_connecting_socket_is_closed(int ssl, struct us_connecting_socket_t *c) {
@@ -159,8 +159,8 @@ void us_connecting_socket_close(int ssl, struct us_connecting_socket_t *c) {
s->next = s->context->loop->data.closed_head;
s->context->loop->data.closed_head = s;
/* Any socket with prev = context is marked as closed */
s->prev = (struct us_socket_t *) s->context;
/* Mark the socket as closed */
s->flags.is_closed = 1;
}
if(!c->error) {
// if we have no error, we have to set that we were aborted aka we called close
@@ -218,11 +218,10 @@ struct us_socket_t *us_socket_close(int ssl, struct us_socket_t *s, int code, vo
bsd_close_socket(us_poll_fd((struct us_poll_t *) s));
/* Mark the socket as closed */
s->flags.is_closed = 1;
/* Any socket with prev = context is marked as closed */
s->prev = (struct us_socket_t *) s->context;
/* mark it as closed and call the callback */
/* call the callback */
struct us_socket_t *res = s;
if (!(us_internal_poll_type(&s->p) & POLL_TYPE_SEMI_SOCKET)) {
res = s->context->on_close(s, code, reason);
@@ -268,8 +267,8 @@ struct us_socket_t *us_socket_detach(int ssl, struct us_socket_t *s) {
s->next = s->context->loop->data.closed_head;
s->context->loop->data.closed_head = s;
/* Any socket with prev = context is marked as closed */
s->prev = (struct us_socket_t *) s->context;
/* Mark the socket as closed */
s->flags.is_closed = 1;
return s;
}
@@ -321,8 +320,10 @@ struct us_socket_t *us_socket_from_fd(struct us_socket_context_t *ctx, int socke
s->flags.low_prio_state = 0;
s->flags.allow_half_open = 0;
s->flags.is_paused = 0;
s->flags.is_ipc = 0;
s->flags.is_ipc = ipc;
s->flags.is_closed = 0;
s->flags.adopted = 0;
s->flags.is_tls = 0;
s->connect_state = NULL;
/* We always use nodelay */
@@ -369,7 +370,7 @@ int us_socket_write(int ssl, struct us_socket_t *s, const char *data, int length
int written = bsd_send(us_poll_fd(&s->p), data, length);
if (written != length) {
s->context->loop->data.last_write_failed = 1;
s->flags.last_write_failed = 1;
us_poll_change(&s->p, s->context->loop, LIBUS_SOCKET_READABLE | LIBUS_SOCKET_WRITABLE);
}
@@ -406,7 +407,7 @@ int us_socket_ipc_write_fd(struct us_socket_t *s, const char* data, int length,
int sent = bsd_sendmsg(us_poll_fd(&s->p), &msg, 0);
if (sent != length) {
s->context->loop->data.last_write_failed = 1;
s->flags.last_write_failed = 1;
us_poll_change(&s->p, s->context->loop, LIBUS_SOCKET_READABLE | LIBUS_SOCKET_WRITABLE);
}
@@ -476,13 +477,13 @@ int us_connecting_socket_get_error(int ssl, struct us_connecting_socket_t *c) {
Note: this assumes that the socket is non-TLS and will be adopted and wrapped with a new TLS context
context ext will not be copied to the new context, new context will contain us_wrapped_socket_context_t on ext
*/
struct us_socket_t *us_socket_wrap_with_tls(int ssl, struct us_socket_t *s, struct us_bun_socket_context_options_t options, struct us_socket_events_t events, int socket_ext_size) {
struct us_socket_t *us_socket_wrap_with_tls(int ssl, struct us_socket_t *s, struct us_bun_socket_context_options_t options, struct us_socket_events_t events, int old_socket_ext_size, int socket_ext_size) {
// only accepts non-TLS sockets
if (ssl) {
return NULL;
}
return(struct us_socket_t *) us_internal_ssl_socket_wrap_with_tls(s, options, events, socket_ext_size);
return(struct us_socket_t *) us_internal_ssl_socket_wrap_with_tls(s, options, events, old_socket_ext_size, socket_ext_size);
}
// if a TLS socket calls this, it will start SSL call open event and TLS handshake if required

View File

@@ -84,6 +84,7 @@ struct AsyncSocketData {
/* Or empty */
AsyncSocketData() = default;
bool isIdle = false;
bool isAuthorized = false; // per-socket TLS authorization status
};
}

View File

@@ -124,15 +124,16 @@ private:
// if we are closing or already closed, we don't need to do anything
if (!us_socket_is_closed(SSL, s)) {
HttpContextData<SSL> *httpContextData = getSocketContextDataS(s);
httpContextData->flags.isAuthorized = success;
// Set per-socket authorization status
auto *httpResponseData = reinterpret_cast<HttpResponseData<SSL> *>(us_socket_ext(SSL, s));
if(httpContextData->flags.rejectUnauthorized) {
if(!success || verify_error.error != 0) {
// we failed to handshake, close the socket
us_socket_close(SSL, s, 0, nullptr);
return;
}
httpContextData->flags.isAuthorized = true;
}
httpResponseData->isAuthorized = success;
/* Any connected socket should timeout until it has a request */
((HttpResponse<SSL> *) s)->resetTimeout();
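
Review note: the shape of this fix is worth calling out. TLS authorization was previously recorded on the shared `HttpContextData`, so one client's handshake result leaked to every connection on that context; moving it to the per-socket `HttpResponseData` (backed by the new `isAuthorized` field on `AsyncSocketData` above) keys the state to the connection. A C rendering of the per-context versus per-socket distinction, with hypothetical stand-in structs:

```c
#include <stdio.h>
#include <stdbool.h>

/* Shared per-listener state: one instance serves many connections. */
struct context_data { bool is_authorized; /* buggy home for per-conn state */ };

/* Per-connection extension, like HttpResponseData in the socket ext. */
struct socket_data { bool is_authorized; };

struct socket { struct context_data *ctx; struct socket_data data; };

int main(void) {
    struct context_data shared = {0};
    struct socket a = { &shared, {0} }, b = { &shared, {0} };

    /* Old scheme: socket a's successful handshake flips the shared bit... */
    a.ctx->is_authorized = true;
    printf("context-level: b looks authorized too: %d\n", b.ctx->is_authorized);

    /* New scheme: the bit lives on each socket's own extension data. */
    a.data.is_authorized = true;
    printf("socket-level: b still unauthorized: %d\n", !b.data.is_authorized);
    return 0;
}
```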

View File

@@ -30,6 +30,7 @@
#include <algorithm>
#include <climits>
#include <string_view>
#include <span>
#include <map>
#include "MoveOnlyFunction.h"
#include "ChunkedEncoding.h"
@@ -160,6 +161,13 @@ namespace uWS
std::map<std::string, unsigned short, std::less<>> *currentParameterOffsets = nullptr;
public:
/* Any data pipelined after the HTTP headers (before response).
* Used for Node.js compatibility: 'connect' and 'upgrade' events
* pass this as the 'head' Buffer parameter.
* WARNING: This points to data in the receive buffer and may be stack-allocated.
* Must be cloned before the request handler returns. */
std::span<const char> head;
bool isAncient()
{
return ancientHttp;
@@ -883,6 +891,8 @@ namespace uWS
/* If returned socket is not what we put in we need
* to break here as we either have upgraded to
* WebSockets or otherwise closed the socket. */
/* Store any remaining data as head for Node.js compat (connect/upgrade events) */
req->head = std::span<const char>(data, length);
void *returnedUser = requestHandler(user, req);
if (returnedUser != user) {
/* We are upgraded to WebSocket or otherwise broken */
@@ -928,9 +938,13 @@ namespace uWS
consumedTotal += emittable;
}
} else if(isConnectRequest) {
// This only server to mark that the connect request read all headers
// and can starting emitting data
// This only serves to mark that the connect request read all headers
// and can start emitting data. Don't try to parse remaining data as HTTP -
// it's pipelined data that we've already captured in req->head.
remainingStreamingBytes = STATE_IS_CHUNKED;
// Mark remaining data as consumed and break - it's not HTTP
consumedTotal += length;
break;
} else {
/* If we came here without a body; emit an empty data chunk to signal no data */
dataHandler(user, {}, true);
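
Review note: the WARNING in the `head` declaration above is load-bearing. The span aliases the receive buffer, which may be stack-allocated, so a handler must copy it before returning. A C sketch of the failure mode, using a plain pointer+length pair in place of `std::span`:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct head { const char *data; size_t len; };  /* stand-in for std::span */

static struct head handler_saved;   /* what a careless handler might keep */

static void request_handler(const char *buf, size_t len) {
    /* WRONG: retains a pointer into the caller's receive buffer. */
    handler_saved = (struct head){ buf, len };

    /* RIGHT: clone before the handler returns. */
    char *copy = malloc(len);
    memcpy(copy, buf, len);
    printf("cloned %zu pipelined bytes\n", len);
    free(copy);
}

static void on_data(void) {
    char recv_buf[64] = "GET / HTTP/1.1\r\n\r\npipelined-bytes";
    request_handler(recv_buf + 18, strlen(recv_buf) - 18); /* past the headers */
}   /* recv_buf dies here; handler_saved.data now dangles */

int main(void) {
    on_data();
    /* touching handler_saved.data here would be use-after-scope */
    return 0;
}
```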

View File

@@ -331,7 +331,7 @@ public:
/* Adopting a socket invalidates it, do not rely on it directly to carry any data */
us_socket_t *usSocket = us_socket_context_adopt_socket(SSL, (us_socket_context_t *) webSocketContext, (us_socket_t *) this, sizeof(WebSocketData) + sizeof(UserData));
us_socket_t *usSocket = us_socket_context_adopt_socket(SSL, (us_socket_context_t *) webSocketContext, (us_socket_t *) this, sizeof(HttpResponseData<SSL>), sizeof(WebSocketData) + sizeof(UserData));
WebSocket<SSL, true, UserData> *webSocket = (WebSocket<SSL, true, UserData> *) usSocket;
/* For whatever reason we were corked, update cork to the new socket */

rust-toolchain.toml Normal file
View File

@@ -0,0 +1,2 @@
[toolchain]
channel = "nightly-2025-12-10"

View File

@@ -313,7 +313,7 @@ function Install-Build-Essentials {
strawberryperl `
mingw
Install-Rust
Install-Sccache
Install-Ccache
# Needed to remap stack traces
Install-PdbAddr2line
Install-Llvm
@@ -376,8 +376,8 @@ function Install-Rust {
Add-To-Path "$rustPath\cargo\bin"
}
function Install-Sccache {
Install-Package sccache -Version "0.12.0"
function Install-Ccache {
Install-Package ccache
}
function Install-PdbAddr2line {

View File

@@ -1,5 +1,5 @@
#!/bin/sh
# Version: 21
# Version: 26
# A script that installs the dependencies needed to build and test Bun.
# This should work on macOS and Linux with a POSIX shell.
@@ -281,9 +281,6 @@ check_operating_system() {
Darwin)
os="darwin"
;;
FreeBSD)
os="freebsd"
;;
*)
error "Unsupported operating system: $os"
;;
@@ -346,11 +343,6 @@ check_operating_system() {
;;
esac
;;
freebsd)
. /etc/os-release
distro="$ID"
release="$VERSION_ID"
;;
esac
if [ -n "$distro" ]; then
@@ -434,9 +426,6 @@ check_package_manager() {
error "No package manager found. (apt, dnf, yum, apk)"
fi
;;
freebsd)
pm="pkg"
;;
esac
print "Package manager: $pm"
@@ -449,13 +438,6 @@ check_package_manager() {
apk)
package_manager update
;;
pkg)
# may need to switch betwen 'latest' and 'quarterly' depending on which repo www/chromium is in. check https://www.freshports.org/www/chromium/.
# mkdir -p /usr/local/etc/pkg/repos
# echo 'FreeBSD: { url: "pkg+http://pkg.FreeBSD.org/${ABI}/quarterly" }' > /usr/local/etc/pkg/repos/FreeBSD.conf
export ASSUME_ALWAYS_YES=yes
package_manager update -f
;;
esac
}
@@ -816,9 +798,6 @@ install_nodejs() {
linux)
nodejs_platform="linux"
;;
freebsd)
nodejs_platform="freebsd"
;;
*)
error "Unsupported OS for Node.js download: $os"
;;
@@ -837,12 +816,6 @@ install_nodejs() {
;;
esac
if [ "$os" = "freebsd" ]; then
# TODO: use nodejs_version_exact
install_packages "www/node$(nodejs_version)" "www/npm-node$(nodejs_version)"
return
fi
case "$abi" in
musl)
nodejs_mirror="https://bun-nodejs-release.s3.us-west-1.amazonaws.com"
@@ -932,10 +905,6 @@ install_nodejs() {
}
install_nodejs_headers() {
if [ "$os" = "freebsd" ]; then
return
fi
nodejs_version="$(nodejs_version_exact)"
nodejs_headers_tar="$(download_file "https://nodejs.org/download/release/v$nodejs_version/node-v$nodejs_version-headers.tar.gz")"
nodejs_headers_dir="$(dirname "$nodejs_headers_tar")"
@@ -950,10 +919,6 @@ install_nodejs_headers() {
}
setup_node_gyp_cache() {
if [ "$os" = "freebsd" ]; then
return
fi
nodejs_version="$1"
headers_source="$2"
@@ -992,11 +957,6 @@ bun_version_exact() {
}
install_bun() {
if [ "$os" = "freebsd" ]; then
# TODO: need to complete bun bootstrap for for this work
return
fi
install_packages unzip
case "$pm" in
@@ -1048,9 +1008,6 @@ install_cmake() {
--skip-license \
--prefix=/usr
;;
freebsd-pkg)
install_packages devel/cmake
;;
esac
}
@@ -1122,17 +1079,6 @@ install_build_essentials() {
ruby \
perl \
;;
freebsd)
install_packages \
devel/ninja \
devel/pkgconf \
lang/go \
devel/gmake \
lang/python3 \
devel/libtool \
lang/ruby33 \
perl5 \
;;
esac
install_cmake
@@ -1140,7 +1086,7 @@ install_build_essentials() {
install_osxcross
install_gcc
install_rust
install_sccache
install_ccache
install_docker
}
@@ -1155,12 +1101,23 @@ llvm_version() {
install_llvm() {
case "$pm" in
apt)
bash="$(require bash)"
llvm_script="$(download_file "https://apt.llvm.org/llvm.sh")"
execute_sudo "$bash" "$llvm_script" "$(llvm_version)" all
# Debian 13 (Trixie) has LLVM 19 natively, and apt.llvm.org doesn't have a trixie repo
if [ "$distro" = "debian" ]; then
install_packages \
"llvm-$(llvm_version)" \
"clang-$(llvm_version)" \
"lld-$(llvm_version)" \
"llvm-$(llvm_version)-dev" \
"llvm-$(llvm_version)-tools" \
"libclang-rt-$(llvm_version)-dev"
else
bash="$(require bash)"
llvm_script="$(download_file "https://apt.llvm.org/llvm.sh")"
execute_sudo "$bash" "$llvm_script" "$(llvm_version)" all
# Install llvm-symbolizer explicitly to ensure it's available for ASAN
install_packages "llvm-$(llvm_version)-tools"
# Install llvm-symbolizer explicitly to ensure it's available for ASAN
install_packages "llvm-$(llvm_version)-tools"
fi
;;
brew)
install_packages "llvm@$(llvm_version)"
@@ -1174,13 +1131,6 @@ install_llvm() {
"llvm$(llvm_version)-dev" # Ensures llvm-symbolizer is installed
;;
esac
case "$os" in
freebsd)
# TODO: use llvm_version_exact
install_packages "devel/llvm$(llvm_version)"
;;
esac
}
install_gcc() {
@@ -1256,69 +1206,39 @@ install_gcc() {
execute_sudo ln -sf $(which llvm-symbolizer-$llvm_v) /usr/bin/llvm-symbolizer
}
install_sccache() {
case "$os" in
linux)
;;
freebsd)
cargo install sccache --locked --version 0.12.0
return
;;
*)
error "Unsupported platform: $os"
;;
install_ccache() {
case "$pm" in
apt)
install_packages ccache
;;
brew)
install_packages ccache
;;
apk)
install_packages ccache
;;
dnf|yum)
install_packages ccache
;;
zypper)
install_packages ccache
;;
esac
# Alright, look, this function is cobbled together but it's only as cobbled
# together as this whole script is.
#
# For some reason, move_to_bin doesn't work here due to permissions so I'm
# avoiding that function. It's also wrong with permissions and so on.
#
# Unfortunately, we cannot use install_packages since many package managers
# don't compile `sccache` with S3 support.
local opts=$-
set -ef
local sccache_http
sccache_http="https://github.com/mozilla/sccache/releases/download/v0.12.0/sccache-v0.12.0-$(uname -m)-unknown-linux-musl.tar.gz"
local file
file=$(download_file "$sccache_http")
local tmpdir
tmpdir=$(mktemp -d)
execute tar -xzf "$file" -C "$tmpdir"
execute_sudo install -m755 "$tmpdir/sccache-v0.12.0-$(uname -m)-unknown-linux-musl/sccache" "/usr/local/bin"
set +ef -"$opts"
}
install_rust() {
case "$distro" in
alpine)
install_packages \
rust \
cargo
;;
freebsd)
install_packages lang/rust
create_directory "$HOME/.cargo/bin"
append_to_path "$HOME/.cargo/bin"
;;
*)
rust_home="/opt/rust"
create_directory "$rust_home"
append_to_profile "export RUSTUP_HOME=$rust_home"
append_to_profile "export CARGO_HOME=$rust_home"
rust_home="/opt/rust"
create_directory "$rust_home"
append_to_profile "export RUSTUP_HOME=$rust_home"
append_to_profile "export CARGO_HOME=$rust_home"
sh="$(require sh)"
rustup_script=$(download_file "https://sh.rustup.rs")
execute "$sh" -lc "$rustup_script -y --no-modify-path"
append_to_path "$rust_home/bin"
;;
esac
sh="$(require sh)"
rustup_script=$(download_file "https://sh.rustup.rs")
execute "$sh" -lc "$rustup_script -y --no-modify-path"
append_to_path "$rust_home/bin"
# Ensure all rustup files are accessible (for CI builds where different users run builds)
grant_to_user "$rust_home"
case "$osxcross" in
1)
@@ -1435,9 +1355,6 @@ install_tailscale() {
execute_as_user go install tailscale.com/cmd/tailscale{,d}@latest
append_to_path "$home/go/bin"
;;
freebsd)
install_packages security/tailscale
;;
esac
}
@@ -1504,14 +1421,6 @@ create_buildkite_user() {
--home "$home" \
--disabled-password
;;
freebsd)
execute_sudo pw group add -n "$group"
execute_sudo pw user add \
-n "$user" \
-g "$group" \
-s "$(require sh)" \
-d "$home" \
;;
*)
execute_sudo useradd "$user" \
--system \
@@ -1537,9 +1446,7 @@ create_buildkite_user() {
done
# The following is necessary to configure buildkite to use a stable
# checkout directory. sccache hashes absolute paths into its cache keys,
# so if buildkite uses a different checkout path each time (which it does
# by default), sccache will be useless.
# checkout directory for ccache to be effective.
local opts=$-
set -ef
@@ -1694,17 +1601,6 @@ install_age() {
;;
esac
;;
freebsd)
case "$arch" in
x64)
age_arch="amd64"
age_hash="943a7510a9973a1e589b913a70228aa1361a63cde39e3ed581435a4d4802df29"
;;
*)
error "Unsupported platform: $os-$arch"
;;
esac
;;
*)
error "Unsupported platform: $os-$arch"
;;

View File

@@ -1,249 +0,0 @@
{
"lockfileVersion": 1,
"configVersion": 1,
"workspaces": {
"": {
"dependencies": {
"@aws-sdk/client-s3": "^3.928.0",
"@aws-sdk/property-provider": "^3.374.0",
},
},
},
"packages": {
"@aws-crypto/crc32": ["@aws-crypto/crc32@5.2.0", "", { "dependencies": { "@aws-crypto/util": "^5.2.0", "@aws-sdk/types": "^3.222.0", "tslib": "^2.6.2" } }, "sha512-nLbCWqQNgUiwwtFsen1AdzAtvuLRsQS8rYgMuxCrdKf9kOssamGLuPwyTY9wyYblNr9+1XM8v6zoDTPPSIeANg=="],
"@aws-crypto/crc32c": ["@aws-crypto/crc32c@5.2.0", "", { "dependencies": { "@aws-crypto/util": "^5.2.0", "@aws-sdk/types": "^3.222.0", "tslib": "^2.6.2" } }, "sha512-+iWb8qaHLYKrNvGRbiYRHSdKRWhto5XlZUEBwDjYNf+ly5SVYG6zEoYIdxvf5R3zyeP16w4PLBn3rH1xc74Rag=="],
"@aws-crypto/sha1-browser": ["@aws-crypto/sha1-browser@5.2.0", "", { "dependencies": { "@aws-crypto/supports-web-crypto": "^5.2.0", "@aws-crypto/util": "^5.2.0", "@aws-sdk/types": "^3.222.0", "@aws-sdk/util-locate-window": "^3.0.0", "@smithy/util-utf8": "^2.0.0", "tslib": "^2.6.2" } }, "sha512-OH6lveCFfcDjX4dbAvCFSYUjJZjDr/3XJ3xHtjn3Oj5b9RjojQo8npoLeA/bNwkOkrSQ0wgrHzXk4tDRxGKJeg=="],
"@aws-crypto/sha256-browser": ["@aws-crypto/sha256-browser@5.2.0", "", { "dependencies": { "@aws-crypto/sha256-js": "^5.2.0", "@aws-crypto/supports-web-crypto": "^5.2.0", "@aws-crypto/util": "^5.2.0", "@aws-sdk/types": "^3.222.0", "@aws-sdk/util-locate-window": "^3.0.0", "@smithy/util-utf8": "^2.0.0", "tslib": "^2.6.2" } }, "sha512-AXfN/lGotSQwu6HNcEsIASo7kWXZ5HYWvfOmSNKDsEqC4OashTp8alTmaz+F7TC2L083SFv5RdB+qU3Vs1kZqw=="],
"@aws-crypto/sha256-js": ["@aws-crypto/sha256-js@5.2.0", "", { "dependencies": { "@aws-crypto/util": "^5.2.0", "@aws-sdk/types": "^3.222.0", "tslib": "^2.6.2" } }, "sha512-FFQQyu7edu4ufvIZ+OadFpHHOt+eSTBaYaki44c+akjg7qZg9oOQeLlk77F6tSYqjDAFClrHJk9tMf0HdVyOvA=="],
"@aws-crypto/supports-web-crypto": ["@aws-crypto/supports-web-crypto@5.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-iAvUotm021kM33eCdNfwIN//F77/IADDSs58i+MDaOqFrVjZo9bAal0NK7HurRuWLLpF1iLX7gbWrjHjeo+YFg=="],
"@aws-crypto/util": ["@aws-crypto/util@5.2.0", "", { "dependencies": { "@aws-sdk/types": "^3.222.0", "@smithy/util-utf8": "^2.0.0", "tslib": "^2.6.2" } }, "sha512-4RkU9EsI6ZpBve5fseQlGNUWKMa1RLPQ1dnjnQoe07ldfIzcsGb5hC5W0Dm7u423KWzawlrpbjXBrXCEv9zazQ=="],
"@aws-sdk/client-s3": ["@aws-sdk/client-s3@3.928.0", "", { "dependencies": { "@aws-crypto/sha1-browser": "5.2.0", "@aws-crypto/sha256-browser": "5.2.0", "@aws-crypto/sha256-js": "5.2.0", "@aws-sdk/core": "3.928.0", "@aws-sdk/credential-provider-node": "3.928.0", "@aws-sdk/middleware-bucket-endpoint": "3.922.0", "@aws-sdk/middleware-expect-continue": "3.922.0", "@aws-sdk/middleware-flexible-checksums": "3.928.0", "@aws-sdk/middleware-host-header": "3.922.0", "@aws-sdk/middleware-location-constraint": "3.922.0", "@aws-sdk/middleware-logger": "3.922.0", "@aws-sdk/middleware-recursion-detection": "3.922.0", "@aws-sdk/middleware-sdk-s3": "3.928.0", "@aws-sdk/middleware-ssec": "3.922.0", "@aws-sdk/middleware-user-agent": "3.928.0", "@aws-sdk/region-config-resolver": "3.925.0", "@aws-sdk/signature-v4-multi-region": "3.928.0", "@aws-sdk/types": "3.922.0", "@aws-sdk/util-endpoints": "3.922.0", "@aws-sdk/util-user-agent-browser": "3.922.0", "@aws-sdk/util-user-agent-node": "3.928.0", "@aws-sdk/xml-builder": "3.921.0", "@smithy/config-resolver": "^4.4.2", "@smithy/core": "^3.17.2", "@smithy/eventstream-serde-browser": "^4.2.4", "@smithy/eventstream-serde-config-resolver": "^4.3.4", "@smithy/eventstream-serde-node": "^4.2.4", "@smithy/fetch-http-handler": "^5.3.5", "@smithy/hash-blob-browser": "^4.2.5", "@smithy/hash-node": "^4.2.4", "@smithy/hash-stream-node": "^4.2.4", "@smithy/invalid-dependency": "^4.2.4", "@smithy/md5-js": "^4.2.4", "@smithy/middleware-content-length": "^4.2.4", "@smithy/middleware-endpoint": "^4.3.6", "@smithy/middleware-retry": "^4.4.6", "@smithy/middleware-serde": "^4.2.4", "@smithy/middleware-stack": "^4.2.4", "@smithy/node-config-provider": "^4.3.4", "@smithy/node-http-handler": "^4.4.4", "@smithy/protocol-http": "^5.3.4", "@smithy/smithy-client": "^4.9.2", "@smithy/types": "^4.8.1", "@smithy/url-parser": "^4.2.4", "@smithy/util-base64": "^4.3.0", "@smithy/util-body-length-browser": "^4.2.0", "@smithy/util-body-length-node": "^4.2.1", "@smithy/util-defaults-mode-browser": "^4.3.5", "@smithy/util-defaults-mode-node": "^4.2.8", "@smithy/util-endpoints": "^3.2.4", "@smithy/util-middleware": "^4.2.4", "@smithy/util-retry": "^4.2.4", "@smithy/util-stream": "^4.5.5", "@smithy/util-utf8": "^4.2.0", "@smithy/util-waiter": "^4.2.4", "@smithy/uuid": "^1.1.0", "tslib": "^2.6.2" } }, "sha512-lXhhmcBjYa+ea0kRs00aq3WUwiXggwJkLwcMzOOsbW3CVYQaNpT7hztkfn2S6Qna7ETzd8M5+XZP+BmQRVE0Sg=="],
"@aws-sdk/client-sso": ["@aws-sdk/client-sso@3.928.0", "", { "dependencies": { "@aws-crypto/sha256-browser": "5.2.0", "@aws-crypto/sha256-js": "5.2.0", "@aws-sdk/core": "3.928.0", "@aws-sdk/middleware-host-header": "3.922.0", "@aws-sdk/middleware-logger": "3.922.0", "@aws-sdk/middleware-recursion-detection": "3.922.0", "@aws-sdk/middleware-user-agent": "3.928.0", "@aws-sdk/region-config-resolver": "3.925.0", "@aws-sdk/types": "3.922.0", "@aws-sdk/util-endpoints": "3.922.0", "@aws-sdk/util-user-agent-browser": "3.922.0", "@aws-sdk/util-user-agent-node": "3.928.0", "@smithy/config-resolver": "^4.4.2", "@smithy/core": "^3.17.2", "@smithy/fetch-http-handler": "^5.3.5", "@smithy/hash-node": "^4.2.4", "@smithy/invalid-dependency": "^4.2.4", "@smithy/middleware-content-length": "^4.2.4", "@smithy/middleware-endpoint": "^4.3.6", "@smithy/middleware-retry": "^4.4.6", "@smithy/middleware-serde": "^4.2.4", "@smithy/middleware-stack": "^4.2.4", "@smithy/node-config-provider": "^4.3.4", "@smithy/node-http-handler": "^4.4.4", "@smithy/protocol-http": "^5.3.4", "@smithy/smithy-client": "^4.9.2", "@smithy/types": "^4.8.1", "@smithy/url-parser": "^4.2.4", "@smithy/util-base64": "^4.3.0", "@smithy/util-body-length-browser": "^4.2.0", "@smithy/util-body-length-node": "^4.2.1", "@smithy/util-defaults-mode-browser": "^4.3.5", "@smithy/util-defaults-mode-node": "^4.2.8", "@smithy/util-endpoints": "^3.2.4", "@smithy/util-middleware": "^4.2.4", "@smithy/util-retry": "^4.2.4", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-Efenb8zV2fJJDXmp2NE4xj8Ymhp4gVJCkQ6ixhdrpfQXgd2PODO7a20C2+BhFM6aGmN3m6XWYJ64ZyhXF4pAyQ=="],
"@aws-sdk/core": ["@aws-sdk/core@3.928.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@aws-sdk/xml-builder": "3.921.0", "@smithy/core": "^3.17.2", "@smithy/node-config-provider": "^4.3.4", "@smithy/property-provider": "^4.2.4", "@smithy/protocol-http": "^5.3.4", "@smithy/signature-v4": "^5.3.4", "@smithy/smithy-client": "^4.9.2", "@smithy/types": "^4.8.1", "@smithy/util-base64": "^4.3.0", "@smithy/util-middleware": "^4.2.4", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-e28J2uKjy2uub4u41dNnmzAu0AN3FGB+LRcLN2Qnwl9Oq3kIcByl5sM8ZD+vWpNG+SFUrUasBCq8cMnHxwXZ4w=="],
"@aws-sdk/credential-provider-env": ["@aws-sdk/credential-provider-env@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/property-provider": "^4.2.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-tB8F9Ti0/NFyFVQX8UQtgRik88evtHpyT6WfXOB4bAY6lEnEHA0ubJZmk9y+aUeoE+OsGLx70dC3JUsiiCPJkQ=="],
"@aws-sdk/credential-provider-http": ["@aws-sdk/credential-provider-http@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/fetch-http-handler": "^5.3.5", "@smithy/node-http-handler": "^4.4.4", "@smithy/property-provider": "^4.2.4", "@smithy/protocol-http": "^5.3.4", "@smithy/smithy-client": "^4.9.2", "@smithy/types": "^4.8.1", "@smithy/util-stream": "^4.5.5", "tslib": "^2.6.2" } }, "sha512-67ynC/8UW9Y8Gn1ZZtC3OgcQDGWrJelHmkbgpmmxYUrzVhp+NINtz3wiTzrrBFhPH/8Uy6BxvhMfXhn0ptcMEQ=="],
"@aws-sdk/credential-provider-ini": ["@aws-sdk/credential-provider-ini@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/credential-provider-env": "3.928.0", "@aws-sdk/credential-provider-http": "3.928.0", "@aws-sdk/credential-provider-process": "3.928.0", "@aws-sdk/credential-provider-sso": "3.928.0", "@aws-sdk/credential-provider-web-identity": "3.928.0", "@aws-sdk/nested-clients": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/credential-provider-imds": "^4.2.4", "@smithy/property-provider": "^4.2.4", "@smithy/shared-ini-file-loader": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-WVWYyj+jox6mhKYp11mu8x1B6Xa2sLbXFHAv5K3Jg8CHvXYpePgTcYlCljq3d4XHC4Jl4nCcsdMtBahSpU9bAA=="],
"@aws-sdk/credential-provider-node": ["@aws-sdk/credential-provider-node@3.928.0", "", { "dependencies": { "@aws-sdk/credential-provider-env": "3.928.0", "@aws-sdk/credential-provider-http": "3.928.0", "@aws-sdk/credential-provider-ini": "3.928.0", "@aws-sdk/credential-provider-process": "3.928.0", "@aws-sdk/credential-provider-sso": "3.928.0", "@aws-sdk/credential-provider-web-identity": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/credential-provider-imds": "^4.2.4", "@smithy/property-provider": "^4.2.4", "@smithy/shared-ini-file-loader": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-SdXVjxZOIXefIR/NJx+lyXOrn4m0ScTAU2JXpLsFCkW2Cafo6vTqHUghyO8vak/XQ8PpPqpLXVpGbAYFuIPW6Q=="],
"@aws-sdk/credential-provider-process": ["@aws-sdk/credential-provider-process@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/property-provider": "^4.2.4", "@smithy/shared-ini-file-loader": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-XL0juran8yhqwn0mreV+NJeHJOkcRBaExsvVn9fXWW37A4gLh4esSJxM2KbSNh0t+/Bk3ehBI5sL9xad+yRDuw=="],
"@aws-sdk/credential-provider-sso": ["@aws-sdk/credential-provider-sso@3.928.0", "", { "dependencies": { "@aws-sdk/client-sso": "3.928.0", "@aws-sdk/core": "3.928.0", "@aws-sdk/token-providers": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/property-provider": "^4.2.4", "@smithy/shared-ini-file-loader": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-md/y+ePDsO1zqPJrsOyPs4ciKmdpqLL7B0dln1NhqZPnKIS5IBfTqZJ5tJ9eTezqc7Tn4Dbg6HiuemcGvZTeFA=="],
"@aws-sdk/credential-provider-web-identity": ["@aws-sdk/credential-provider-web-identity@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/nested-clients": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/property-provider": "^4.2.4", "@smithy/shared-ini-file-loader": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-rd97nLY5e/nGOr73ZfsXD+H44iZ9wyGZTKt/2QkiBN3hot/idhgT9+XHsWhRi+o/dThQbpL8RkpAnpF+0ZGthw=="],
"@aws-sdk/middleware-bucket-endpoint": ["@aws-sdk/middleware-bucket-endpoint@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@aws-sdk/util-arn-parser": "3.893.0", "@smithy/node-config-provider": "^4.3.4", "@smithy/protocol-http": "^5.3.4", "@smithy/types": "^4.8.1", "@smithy/util-config-provider": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-Dpr2YeOaLFqt3q1hocwBesynE3x8/dXZqXZRuzSX/9/VQcwYBFChHAm4mTAl4zuvArtDbLrwzWSxmOWYZGtq5w=="],
"@aws-sdk/middleware-expect-continue": ["@aws-sdk/middleware-expect-continue@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/protocol-http": "^5.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-xmnLWMtmHJHJBupSWMUEW1gyxuRIeQ1Ov2xa8Tqq77fPr4Ft2AluEwiDMaZIMHoAvpxWKEEt9Si59Li7GIA+bQ=="],
"@aws-sdk/middleware-flexible-checksums": ["@aws-sdk/middleware-flexible-checksums@3.928.0", "", { "dependencies": { "@aws-crypto/crc32": "5.2.0", "@aws-crypto/crc32c": "5.2.0", "@aws-crypto/util": "5.2.0", "@aws-sdk/core": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/is-array-buffer": "^4.2.0", "@smithy/node-config-provider": "^4.3.4", "@smithy/protocol-http": "^5.3.4", "@smithy/types": "^4.8.1", "@smithy/util-middleware": "^4.2.4", "@smithy/util-stream": "^4.5.5", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-9+aCRt7teItSIMbnGvOY+FhtJnW2ZBUbfD+ug29a/ZbobDfTwmtrmtgEIWdXryFaRbT03HHfaJ3a++lTw4osuw=="],
"@aws-sdk/middleware-host-header": ["@aws-sdk/middleware-host-header@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/protocol-http": "^5.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-HPquFgBnq/KqKRVkiuCt97PmWbKtxQ5iUNLEc6FIviqOoZTmaYG3EDsIbuFBz9C4RHJU4FKLmHL2bL3FEId6AA=="],
"@aws-sdk/middleware-location-constraint": ["@aws-sdk/middleware-location-constraint@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-T4iqd7WQ2DDjCH/0s50mnhdoX+IJns83ZE+3zj9IDlpU0N2aq8R91IG890qTfYkUEdP9yRm0xir/CNed+v6Dew=="],
"@aws-sdk/middleware-logger": ["@aws-sdk/middleware-logger@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-AkvYO6b80FBm5/kk2E636zNNcNgjztNNUxpqVx+huyGn9ZqGTzS4kLqW2hO6CBe5APzVtPCtiQsXL24nzuOlAg=="],
"@aws-sdk/middleware-recursion-detection": ["@aws-sdk/middleware-recursion-detection@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@aws/lambda-invoke-store": "^0.1.1", "@smithy/protocol-http": "^5.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-TtSCEDonV/9R0VhVlCpxZbp/9sxQvTTRKzIf8LxW3uXpby6Wl8IxEciBJlxmSkoqxh542WRcko7NYODlvL/gDA=="],
"@aws-sdk/middleware-sdk-s3": ["@aws-sdk/middleware-sdk-s3@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/types": "3.922.0", "@aws-sdk/util-arn-parser": "3.893.0", "@smithy/core": "^3.17.2", "@smithy/node-config-provider": "^4.3.4", "@smithy/protocol-http": "^5.3.4", "@smithy/signature-v4": "^5.3.4", "@smithy/smithy-client": "^4.9.2", "@smithy/types": "^4.8.1", "@smithy/util-config-provider": "^4.2.0", "@smithy/util-middleware": "^4.2.4", "@smithy/util-stream": "^4.5.5", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-LTkjS6cpJ2PEtsottTKq7JxZV0oH+QJ12P/dGNPZL4URayjEMBVR/dp4zh835X/FPXzijga3sdotlIKzuFy9FA=="],
"@aws-sdk/middleware-ssec": ["@aws-sdk/middleware-ssec@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-eHvSJZTSRJO+/tjjGD6ocnPc8q9o3m26+qbwQTu/4V6yOJQ1q+xkDZNqwJQphL+CodYaQ7uljp8g1Ji/AN3D9w=="],
"@aws-sdk/middleware-user-agent": ["@aws-sdk/middleware-user-agent@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/types": "3.922.0", "@aws-sdk/util-endpoints": "3.922.0", "@smithy/core": "^3.17.2", "@smithy/protocol-http": "^5.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-ESvcfLx5PtpdUM3ptCwb80toBTd3y5I4w5jaeOPHihiZr7jkRLE/nsaCKzlqscPs6UQ8xI0maav04JUiTskcHw=="],
"@aws-sdk/nested-clients": ["@aws-sdk/nested-clients@3.928.0", "", { "dependencies": { "@aws-crypto/sha256-browser": "5.2.0", "@aws-crypto/sha256-js": "5.2.0", "@aws-sdk/core": "3.928.0", "@aws-sdk/middleware-host-header": "3.922.0", "@aws-sdk/middleware-logger": "3.922.0", "@aws-sdk/middleware-recursion-detection": "3.922.0", "@aws-sdk/middleware-user-agent": "3.928.0", "@aws-sdk/region-config-resolver": "3.925.0", "@aws-sdk/types": "3.922.0", "@aws-sdk/util-endpoints": "3.922.0", "@aws-sdk/util-user-agent-browser": "3.922.0", "@aws-sdk/util-user-agent-node": "3.928.0", "@smithy/config-resolver": "^4.4.2", "@smithy/core": "^3.17.2", "@smithy/fetch-http-handler": "^5.3.5", "@smithy/hash-node": "^4.2.4", "@smithy/invalid-dependency": "^4.2.4", "@smithy/middleware-content-length": "^4.2.4", "@smithy/middleware-endpoint": "^4.3.6", "@smithy/middleware-retry": "^4.4.6", "@smithy/middleware-serde": "^4.2.4", "@smithy/middleware-stack": "^4.2.4", "@smithy/node-config-provider": "^4.3.4", "@smithy/node-http-handler": "^4.4.4", "@smithy/protocol-http": "^5.3.4", "@smithy/smithy-client": "^4.9.2", "@smithy/types": "^4.8.1", "@smithy/url-parser": "^4.2.4", "@smithy/util-base64": "^4.3.0", "@smithy/util-body-length-browser": "^4.2.0", "@smithy/util-body-length-node": "^4.2.1", "@smithy/util-defaults-mode-browser": "^4.3.5", "@smithy/util-defaults-mode-node": "^4.2.8", "@smithy/util-endpoints": "^3.2.4", "@smithy/util-middleware": "^4.2.4", "@smithy/util-retry": "^4.2.4", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-kXzfJkq2cD65KAHDe4hZCsnxcGGEWD5pjHqcZplwG4VFMa/iVn/mWrUY9QdadD2GBpXFNQbgOiKG3U2NkKu+4Q=="],
"@aws-sdk/property-provider": ["@aws-sdk/property-provider@3.374.0", "", { "dependencies": { "@smithy/property-provider": "^1.0.1", "tslib": "^2.5.0" } }, "sha512-UvsRnRtLD3g0tPMuiILJiGT9PIg4tgU3coRO4s+bAuT29mUiuViDAQB8DYPgaOtqA4qNFiWose+AMCpdxlOhRA=="],
"@aws-sdk/region-config-resolver": ["@aws-sdk/region-config-resolver@3.925.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/config-resolver": "^4.4.2", "@smithy/node-config-provider": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-FOthcdF9oDb1pfQBRCfWPZhJZT5wqpvdAS5aJzB1WDZ+6EuaAhLzLH/fW1slDunIqq1PSQGG3uSnVglVVOvPHQ=="],
"@aws-sdk/signature-v4-multi-region": ["@aws-sdk/signature-v4-multi-region@3.928.0", "", { "dependencies": { "@aws-sdk/middleware-sdk-s3": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/protocol-http": "^5.3.4", "@smithy/signature-v4": "^5.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-1+Ic8+MyqQy+OE6QDoQKVCIcSZO+ETmLLLpVS5yu0fihBU85B5HHU7iaKX1qX7lEaGPMpSN/mbHW0VpyQ0Xqaw=="],
"@aws-sdk/token-providers": ["@aws-sdk/token-providers@3.928.0", "", { "dependencies": { "@aws-sdk/core": "3.928.0", "@aws-sdk/nested-clients": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/property-provider": "^4.2.4", "@smithy/shared-ini-file-loader": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-533NpTdUJNDi98zBwRp4ZpZoqULrAVfc0YgIy+8AZHzk0v7N+v59O0d2Du3YO6zN4VU8HU8766DgKiyEag6Dzg=="],
"@aws-sdk/types": ["@aws-sdk/types@3.922.0", "", { "dependencies": { "@smithy/types": "^4.8.1", "tslib": "^2.6.2" } }, "sha512-eLA6XjVobAUAMivvM7DBL79mnHyrm+32TkXNWZua5mnxF+6kQCfblKKJvxMZLGosO53/Ex46ogim8IY5Nbqv2w=="],
"@aws-sdk/util-arn-parser": ["@aws-sdk/util-arn-parser@3.893.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-u8H4f2Zsi19DGnwj5FSZzDMhytYF/bCh37vAtBsn3cNDL3YG578X5oc+wSX54pM3tOxS+NY7tvOAo52SW7koUA=="],
"@aws-sdk/util-endpoints": ["@aws-sdk/util-endpoints@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/types": "^4.8.1", "@smithy/url-parser": "^4.2.4", "@smithy/util-endpoints": "^3.2.4", "tslib": "^2.6.2" } }, "sha512-4ZdQCSuNMY8HMlR1YN4MRDdXuKd+uQTeKIr5/pIM+g3TjInZoj8imvXudjcrFGA63UF3t92YVTkBq88mg58RXQ=="],
"@aws-sdk/util-locate-window": ["@aws-sdk/util-locate-window@3.893.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-T89pFfgat6c8nMmpI8eKjBcDcgJq36+m9oiXbcUzeU55MP9ZuGgBomGjGnHaEyF36jenW9gmg3NfZDm0AO2XPg=="],
"@aws-sdk/util-user-agent-browser": ["@aws-sdk/util-user-agent-browser@3.922.0", "", { "dependencies": { "@aws-sdk/types": "3.922.0", "@smithy/types": "^4.8.1", "bowser": "^2.11.0", "tslib": "^2.6.2" } }, "sha512-qOJAERZ3Plj1st7M4Q5henl5FRpE30uLm6L9edZqZXGR6c7ry9jzexWamWVpQ4H4xVAVmiO9dIEBAfbq4mduOA=="],
"@aws-sdk/util-user-agent-node": ["@aws-sdk/util-user-agent-node@3.928.0", "", { "dependencies": { "@aws-sdk/middleware-user-agent": "3.928.0", "@aws-sdk/types": "3.922.0", "@smithy/node-config-provider": "^4.3.4", "@smithy/types": "^4.8.1", "tslib": "^2.6.2" }, "peerDependencies": { "aws-crt": ">=1.0.0" }, "optionalPeers": ["aws-crt"] }, "sha512-s0jP67nQLLWVWfBtqTkZUkSWK5e6OI+rs+wFya2h9VLyWBFir17XSDI891s8HZKIVCEl8eBrup+hhywm4nsIAA=="],
"@aws-sdk/xml-builder": ["@aws-sdk/xml-builder@3.921.0", "", { "dependencies": { "@smithy/types": "^4.8.1", "fast-xml-parser": "5.2.5", "tslib": "^2.6.2" } }, "sha512-LVHg0jgjyicKKvpNIEMXIMr1EBViESxcPkqfOlT+X1FkmUMTNZEEVF18tOJg4m4hV5vxtkWcqtr4IEeWa1C41Q=="],
"@aws/lambda-invoke-store": ["@aws/lambda-invoke-store@0.1.1", "", {}, "sha512-RcLam17LdlbSOSp9VxmUu1eI6Mwxp+OwhD2QhiSNmNCzoDb0EeUXTD2n/WbcnrAYMGlmf05th6QYq23VqvJqpA=="],
"@smithy/abort-controller": ["@smithy/abort-controller@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-j7HwVkBw68YW8UmFRcjZOmssE77Rvk0GWAIN1oFBhsaovQmZWYCIcGa9/pwRB0ExI8Sk9MWNALTjftjHZea7VA=="],
"@smithy/chunked-blob-reader": ["@smithy/chunked-blob-reader@5.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-WmU0TnhEAJLWvfSeMxBNe5xtbselEO8+4wG0NtZeL8oR21WgH1xiO37El+/Y+H/Ie4SCwBy3MxYWmOYaGgZueA=="],
"@smithy/chunked-blob-reader-native": ["@smithy/chunked-blob-reader-native@4.2.1", "", { "dependencies": { "@smithy/util-base64": "^4.3.0", "tslib": "^2.6.2" } }, "sha512-lX9Ay+6LisTfpLid2zZtIhSEjHMZoAR5hHCR4H7tBz/Zkfr5ea8RcQ7Tk4mi0P76p4cN+Btz16Ffno7YHpKXnQ=="],
"@smithy/config-resolver": ["@smithy/config-resolver@4.4.3", "", { "dependencies": { "@smithy/node-config-provider": "^4.3.5", "@smithy/types": "^4.9.0", "@smithy/util-config-provider": "^4.2.0", "@smithy/util-endpoints": "^3.2.5", "@smithy/util-middleware": "^4.2.5", "tslib": "^2.6.2" } }, "sha512-ezHLe1tKLUxDJo2LHtDuEDyWXolw8WGOR92qb4bQdWq/zKenO5BvctZGrVJBK08zjezSk7bmbKFOXIVyChvDLw=="],
"@smithy/core": ["@smithy/core@3.18.0", "", { "dependencies": { "@smithy/middleware-serde": "^4.2.5", "@smithy/protocol-http": "^5.3.5", "@smithy/types": "^4.9.0", "@smithy/util-base64": "^4.3.0", "@smithy/util-body-length-browser": "^4.2.0", "@smithy/util-middleware": "^4.2.5", "@smithy/util-stream": "^4.5.6", "@smithy/util-utf8": "^4.2.0", "@smithy/uuid": "^1.1.0", "tslib": "^2.6.2" } }, "sha512-vGSDXOJFZgOPTatSI1ly7Gwyy/d/R9zh2TO3y0JZ0uut5qQ88p9IaWaZYIWSSqtdekNM4CGok/JppxbAff4KcQ=="],
"@smithy/credential-provider-imds": ["@smithy/credential-provider-imds@4.2.5", "", { "dependencies": { "@smithy/node-config-provider": "^4.3.5", "@smithy/property-provider": "^4.2.5", "@smithy/types": "^4.9.0", "@smithy/url-parser": "^4.2.5", "tslib": "^2.6.2" } }, "sha512-BZwotjoZWn9+36nimwm/OLIcVe+KYRwzMjfhd4QT7QxPm9WY0HiOV8t/Wlh+HVUif0SBVV7ksq8//hPaBC/okQ=="],
"@smithy/eventstream-codec": ["@smithy/eventstream-codec@4.2.5", "", { "dependencies": { "@aws-crypto/crc32": "5.2.0", "@smithy/types": "^4.9.0", "@smithy/util-hex-encoding": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-Ogt4Zi9hEbIP17oQMd68qYOHUzmH47UkK7q7Gl55iIm9oKt27MUGrC5JfpMroeHjdkOliOA4Qt3NQ1xMq/nrlA=="],
"@smithy/eventstream-serde-browser": ["@smithy/eventstream-serde-browser@4.2.5", "", { "dependencies": { "@smithy/eventstream-serde-universal": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-HohfmCQZjppVnKX2PnXlf47CW3j92Ki6T/vkAT2DhBR47e89pen3s4fIa7otGTtrVxmj7q+IhH0RnC5kpR8wtw=="],
"@smithy/eventstream-serde-config-resolver": ["@smithy/eventstream-serde-config-resolver@4.3.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-ibjQjM7wEXtECiT6my1xfiMH9IcEczMOS6xiCQXoUIYSj5b1CpBbJ3VYbdwDy8Vcg5JHN7eFpOCGk8nyZAltNQ=="],
"@smithy/eventstream-serde-node": ["@smithy/eventstream-serde-node@4.2.5", "", { "dependencies": { "@smithy/eventstream-serde-universal": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-+elOuaYx6F2H6x1/5BQP5ugv12nfJl66GhxON8+dWVUEDJ9jah/A0tayVdkLRP0AeSac0inYkDz5qBFKfVp2Gg=="],
"@smithy/eventstream-serde-universal": ["@smithy/eventstream-serde-universal@4.2.5", "", { "dependencies": { "@smithy/eventstream-codec": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-G9WSqbST45bmIFaeNuP/EnC19Rhp54CcVdX9PDL1zyEB514WsDVXhlyihKlGXnRycmHNmVv88Bvvt4EYxWef/Q=="],
"@smithy/fetch-http-handler": ["@smithy/fetch-http-handler@5.3.6", "", { "dependencies": { "@smithy/protocol-http": "^5.3.5", "@smithy/querystring-builder": "^4.2.5", "@smithy/types": "^4.9.0", "@smithy/util-base64": "^4.3.0", "tslib": "^2.6.2" } }, "sha512-3+RG3EA6BBJ/ofZUeTFJA7mHfSYrZtQIrDP9dI8Lf7X6Jbos2jptuLrAAteDiFVrmbEmLSuRG/bUKzfAXk7dhg=="],
"@smithy/hash-blob-browser": ["@smithy/hash-blob-browser@4.2.6", "", { "dependencies": { "@smithy/chunked-blob-reader": "^5.2.0", "@smithy/chunked-blob-reader-native": "^4.2.1", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8P//tA8DVPk+3XURk2rwcKgYwFvwGwmJH/wJqQiSKwXZtf/LiZK+hbUZmPj/9KzM+OVSwe4o85KTp5x9DUZTjw=="],
"@smithy/hash-node": ["@smithy/hash-node@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "@smithy/util-buffer-from": "^4.2.0", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-DpYX914YOfA3UDT9CN1BM787PcHfWRBB43fFGCYrZFUH0Jv+5t8yYl+Pd5PW4+QzoGEDvn5d5QIO4j2HyYZQSA=="],
"@smithy/hash-stream-node": ["@smithy/hash-stream-node@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-6+do24VnEyvWcGdHXomlpd0m8bfZePpUKBy7m311n+JuRwug8J4dCanJdTymx//8mi0nlkflZBvJe+dEO/O12Q=="],
"@smithy/invalid-dependency": ["@smithy/invalid-dependency@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-2L2erASEro1WC5nV+plwIMxrTXpvpfzl4e+Nre6vBVRR2HKeGGcvpJyyL3/PpiSg+cJG2KpTmZmq934Olb6e5A=="],
"@smithy/is-array-buffer": ["@smithy/is-array-buffer@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-DZZZBvC7sjcYh4MazJSGiWMI2L7E0oCiRHREDzIxi/M2LY79/21iXt6aPLHge82wi5LsuRF5A06Ds3+0mlh6CQ=="],
"@smithy/md5-js": ["@smithy/md5-js@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-Bt6jpSTMWfjCtC0s79gZ/WZ1w90grfmopVOWqkI2ovhjpD5Q2XRXuecIPB9689L2+cCySMbaXDhBPU56FKNDNg=="],
"@smithy/middleware-content-length": ["@smithy/middleware-content-length@4.2.5", "", { "dependencies": { "@smithy/protocol-http": "^5.3.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-Y/RabVa5vbl5FuHYV2vUCwvh/dqzrEY/K2yWPSqvhFUwIY0atLqO4TienjBXakoy4zrKAMCZwg+YEqmH7jaN7A=="],
"@smithy/middleware-endpoint": ["@smithy/middleware-endpoint@4.3.7", "", { "dependencies": { "@smithy/core": "^3.18.0", "@smithy/middleware-serde": "^4.2.5", "@smithy/node-config-provider": "^4.3.5", "@smithy/shared-ini-file-loader": "^4.4.0", "@smithy/types": "^4.9.0", "@smithy/url-parser": "^4.2.5", "@smithy/util-middleware": "^4.2.5", "tslib": "^2.6.2" } }, "sha512-i8Mi8OuY6Yi82Foe3iu7/yhBj1HBRoOQwBSsUNYglJTNSFaWYTNM2NauBBs/7pq2sqkLRqeUXA3Ogi2utzpUlQ=="],
"@smithy/middleware-retry": ["@smithy/middleware-retry@4.4.7", "", { "dependencies": { "@smithy/node-config-provider": "^4.3.5", "@smithy/protocol-http": "^5.3.5", "@smithy/service-error-classification": "^4.2.5", "@smithy/smithy-client": "^4.9.3", "@smithy/types": "^4.9.0", "@smithy/util-middleware": "^4.2.5", "@smithy/util-retry": "^4.2.5", "@smithy/uuid": "^1.1.0", "tslib": "^2.6.2" } }, "sha512-E7Vc6WHCHlzDRTx1W0jZ6J1L6ziEV0PIWcUdmfL4y+c8r7WYr6I+LkQudaD8Nfb7C5c4P3SQ972OmXHtv6m/OA=="],
"@smithy/middleware-serde": ["@smithy/middleware-serde@4.2.5", "", { "dependencies": { "@smithy/protocol-http": "^5.3.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-La1ldWTJTZ5NqQyPqnCNeH9B+zjFhrNoQIL1jTh4zuqXRlmXhxYHhMtI1/92OlnoAtp6JoN7kzuwhWoXrBwPqg=="],
"@smithy/middleware-stack": ["@smithy/middleware-stack@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-bYrutc+neOyWxtZdbB2USbQttZN0mXaOyYLIsaTbJhFsfpXyGWUxJpEuO1rJ8IIJm2qH4+xJT0mxUSsEDTYwdQ=="],
"@smithy/node-config-provider": ["@smithy/node-config-provider@4.3.5", "", { "dependencies": { "@smithy/property-provider": "^4.2.5", "@smithy/shared-ini-file-loader": "^4.4.0", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-UTurh1C4qkVCtqggI36DGbLB2Kv8UlcFdMXDcWMbqVY2uRg0XmT9Pb4Vj6oSQ34eizO1fvR0RnFV4Axw4IrrAg=="],
"@smithy/node-http-handler": ["@smithy/node-http-handler@4.4.5", "", { "dependencies": { "@smithy/abort-controller": "^4.2.5", "@smithy/protocol-http": "^5.3.5", "@smithy/querystring-builder": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-CMnzM9R2WqlqXQGtIlsHMEZfXKJVTIrqCNoSd/QpAyp+Dw0a1Vps13l6ma1fH8g7zSPNsA59B/kWgeylFuA/lw=="],
"@smithy/property-provider": ["@smithy/property-provider@1.2.0", "", { "dependencies": { "@smithy/types": "^1.2.0", "tslib": "^2.5.0" } }, "sha512-qlJd9gT751i4T0t/hJAyNGfESfi08Fek8QiLcysoKPgR05qHhG0OYhlaCJHhpXy4ECW0lHyjvFM1smrCLIXVfw=="],
"@smithy/protocol-http": ["@smithy/protocol-http@5.3.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-RlaL+sA0LNMp03bf7XPbFmT5gN+w3besXSWMkA8rcmxLSVfiEXElQi4O2IWwPfxzcHkxqrwBFMbngB8yx/RvaQ=="],
"@smithy/querystring-builder": ["@smithy/querystring-builder@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "@smithy/util-uri-escape": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-y98otMI1saoajeik2kLfGyRp11e5U/iJYH/wLCh3aTV/XutbGT9nziKGkgCaMD1ghK7p6htHMm6b6scl9JRUWg=="],
"@smithy/querystring-parser": ["@smithy/querystring-parser@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-031WCTdPYgiQRYNPXznHXof2YM0GwL6SeaSyTH/P72M1Vz73TvCNH2Nq8Iu2IEPq9QP2yx0/nrw5YmSeAi/AjQ=="],
"@smithy/service-error-classification": ["@smithy/service-error-classification@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0" } }, "sha512-8fEvK+WPE3wUAcDvqDQG1Vk3ANLR8Px979te96m84CbKAjBVf25rPYSzb4xU4hlTyho7VhOGnh5i62D/JVF0JQ=="],
"@smithy/shared-ini-file-loader": ["@smithy/shared-ini-file-loader@4.4.0", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-5WmZ5+kJgJDjwXXIzr1vDTG+RhF9wzSODQBfkrQ2VVkYALKGvZX1lgVSxEkgicSAFnFhPj5rudJV0zoinqS0bA=="],
"@smithy/signature-v4": ["@smithy/signature-v4@5.3.5", "", { "dependencies": { "@smithy/is-array-buffer": "^4.2.0", "@smithy/protocol-http": "^5.3.5", "@smithy/types": "^4.9.0", "@smithy/util-hex-encoding": "^4.2.0", "@smithy/util-middleware": "^4.2.5", "@smithy/util-uri-escape": "^4.2.0", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-xSUfMu1FT7ccfSXkoLl/QRQBi2rOvi3tiBZU2Tdy3I6cgvZ6SEi9QNey+lqps/sJRnogIS+lq+B1gxxbra2a/w=="],
"@smithy/smithy-client": ["@smithy/smithy-client@4.9.3", "", { "dependencies": { "@smithy/core": "^3.18.0", "@smithy/middleware-endpoint": "^4.3.7", "@smithy/middleware-stack": "^4.2.5", "@smithy/protocol-http": "^5.3.5", "@smithy/types": "^4.9.0", "@smithy/util-stream": "^4.5.6", "tslib": "^2.6.2" } }, "sha512-8tlueuTgV5n7inQCkhyptrB3jo2AO80uGrps/XTYZivv5MFQKKBj3CIWIGMI2fRY5LEduIiazOhAWdFknY1O9w=="],
"@smithy/types": ["@smithy/types@4.9.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-MvUbdnXDTwykR8cB1WZvNNwqoWVaTRA0RLlLmf/cIFNMM2cKWz01X4Ly6SMC4Kks30r8tT3Cty0jmeWfiuyHTA=="],
"@smithy/url-parser": ["@smithy/url-parser@4.2.5", "", { "dependencies": { "@smithy/querystring-parser": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-VaxMGsilqFnK1CeBX+LXnSuaMx4sTL/6znSZh2829txWieazdVxr54HmiyTsIbpOTLcf5nYpq9lpzmwRdxj6rQ=="],
"@smithy/util-base64": ["@smithy/util-base64@4.3.0", "", { "dependencies": { "@smithy/util-buffer-from": "^4.2.0", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-GkXZ59JfyxsIwNTWFnjmFEI8kZpRNIBfxKjv09+nkAWPt/4aGaEWMM04m4sxgNVWkbt2MdSvE3KF/PfX4nFedQ=="],
"@smithy/util-body-length-browser": ["@smithy/util-body-length-browser@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-Fkoh/I76szMKJnBXWPdFkQJl2r9SjPt3cMzLdOB6eJ4Pnpas8hVoWPYemX/peO0yrrvldgCUVJqOAjUrOLjbxg=="],
"@smithy/util-body-length-node": ["@smithy/util-body-length-node@4.2.1", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-h53dz/pISVrVrfxV1iqXlx5pRg3V2YWFcSQyPyXZRrZoZj4R4DeWRDo1a7dd3CPTcFi3kE+98tuNyD2axyZReA=="],
"@smithy/util-buffer-from": ["@smithy/util-buffer-from@4.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-kAY9hTKulTNevM2nlRtxAG2FQ3B2OR6QIrPY3zE5LqJy1oxzmgBGsHLWTcNhWXKchgA0WHW+mZkQrng/pgcCew=="],
"@smithy/util-config-provider": ["@smithy/util-config-provider@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-YEjpl6XJ36FTKmD+kRJJWYvrHeUvm5ykaUS5xK+6oXffQPHeEM4/nXlZPe+Wu0lsgRUcNZiliYNh/y7q9c2y6Q=="],
"@smithy/util-defaults-mode-browser": ["@smithy/util-defaults-mode-browser@4.3.6", "", { "dependencies": { "@smithy/property-provider": "^4.2.5", "@smithy/smithy-client": "^4.9.3", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-kbpuXbEf2YQ9zEE6eeVnUCQWO0e1BjMnKrXL8rfXgiWA0m8/E0leU4oSNzxP04WfCmW8vjEqaDeXWxwE4tpOjQ=="],
"@smithy/util-defaults-mode-node": ["@smithy/util-defaults-mode-node@4.2.9", "", { "dependencies": { "@smithy/config-resolver": "^4.4.3", "@smithy/credential-provider-imds": "^4.2.5", "@smithy/node-config-provider": "^4.3.5", "@smithy/property-provider": "^4.2.5", "@smithy/smithy-client": "^4.9.3", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-dgyribrVWN5qE5usYJ0m5M93mVM3L3TyBPZWe1Xl6uZlH2gzfQx3dz+ZCdW93lWqdedJRkOecnvbnoEEXRZ5VQ=="],
"@smithy/util-endpoints": ["@smithy/util-endpoints@3.2.5", "", { "dependencies": { "@smithy/node-config-provider": "^4.3.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-3O63AAWu2cSNQZp+ayl9I3NapW1p1rR5mlVHcF6hAB1dPZUQFfRPYtplWX/3xrzWthPGj5FqB12taJJCfH6s8A=="],
"@smithy/util-hex-encoding": ["@smithy/util-hex-encoding@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-CCQBwJIvXMLKxVbO88IukazJD9a4kQ9ZN7/UMGBjBcJYvatpWk+9g870El4cB8/EJxfe+k+y0GmR9CAzkF+Nbw=="],
"@smithy/util-middleware": ["@smithy/util-middleware@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-6Y3+rvBF7+PZOc40ybeZMcGln6xJGVeY60E7jy9Mv5iKpMJpHgRE6dKy9ScsVxvfAYuEX4Q9a65DQX90KaQ3bA=="],
"@smithy/util-retry": ["@smithy/util-retry@4.2.5", "", { "dependencies": { "@smithy/service-error-classification": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-GBj3+EZBbN4NAqJ/7pAhsXdfzdlznOh8PydUijy6FpNIMnHPSMO2/rP4HKu+UFeikJxShERk528oy7GT79YiJg=="],
"@smithy/util-stream": ["@smithy/util-stream@4.5.6", "", { "dependencies": { "@smithy/fetch-http-handler": "^5.3.6", "@smithy/node-http-handler": "^4.4.5", "@smithy/types": "^4.9.0", "@smithy/util-base64": "^4.3.0", "@smithy/util-buffer-from": "^4.2.0", "@smithy/util-hex-encoding": "^4.2.0", "@smithy/util-utf8": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-qWw/UM59TiaFrPevefOZ8CNBKbYEP6wBAIlLqxn3VAIo9rgnTNc4ASbVrqDmhuwI87usnjhdQrxodzAGFFzbRQ=="],
"@smithy/util-uri-escape": ["@smithy/util-uri-escape@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-igZpCKV9+E/Mzrpq6YacdTQ0qTiLm85gD6N/IrmyDvQFA4UnU3d5g3m8tMT/6zG/vVkWSU+VxeUyGonL62DuxA=="],
"@smithy/util-utf8": ["@smithy/util-utf8@4.2.0", "", { "dependencies": { "@smithy/util-buffer-from": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-zBPfuzoI8xyBtR2P6WQj63Rz8i3AmfAaJLuNG8dWsfvPe8lO4aCPYLn879mEgHndZH1zQ2oXmG8O1GGzzaoZiw=="],
"@smithy/util-waiter": ["@smithy/util-waiter@4.2.5", "", { "dependencies": { "@smithy/abort-controller": "^4.2.5", "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-Dbun99A3InifQdIrsXZ+QLcC0PGBPAdrl4cj1mTgJvyc9N2zf7QSxg8TBkzsCmGJdE3TLbO9ycwpY0EkWahQ/g=="],
"@smithy/uuid": ["@smithy/uuid@1.1.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-4aUIteuyxtBUhVdiQqcDhKFitwfd9hqoSDYY2KRXiWtgoWJ9Bmise+KfEPDiVHWeJepvF8xJO9/9+WDIciMFFw=="],
"bowser": ["bowser@2.12.1", "", {}, "sha512-z4rE2Gxh7tvshQ4hluIT7XcFrgLIQaw9X3A+kTTRdovCz5PMukm/0QC/BKSYPj3omF5Qfypn9O/c5kgpmvYUCw=="],
"fast-xml-parser": ["fast-xml-parser@5.2.5", "", { "dependencies": { "strnum": "^2.1.0" }, "bin": { "fxparser": "src/cli/cli.js" } }, "sha512-pfX9uG9Ki0yekDHx2SiuRIyFdyAr1kMIMitPvb0YBo8SUfKvia7w7FIyd/l6av85pFYRhZscS75MwMnbvY+hcQ=="],
"strnum": ["strnum@2.1.1", "", {}, "sha512-7ZvoFTiCnGxBtDqJ//Cu6fWtZtc7Y3x+QOirG15wztbdngGSkht27o2pyGWrVy0b4WAy3jbKmnoK6g5VlVNUUw=="],
"tslib": ["tslib@2.8.1", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="],
"@aws-crypto/sha1-browser/@smithy/util-utf8": ["@smithy/util-utf8@2.3.0", "", { "dependencies": { "@smithy/util-buffer-from": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-R8Rdn8Hy72KKcebgLiv8jQcQkXoLMOGGv5uI1/k0l+snqkOzQ1R0ChUBCxWMlBsFMekWjq0wRudIweFs7sKT5A=="],
"@aws-crypto/sha256-browser/@smithy/util-utf8": ["@smithy/util-utf8@2.3.0", "", { "dependencies": { "@smithy/util-buffer-from": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-R8Rdn8Hy72KKcebgLiv8jQcQkXoLMOGGv5uI1/k0l+snqkOzQ1R0ChUBCxWMlBsFMekWjq0wRudIweFs7sKT5A=="],
"@aws-crypto/util/@smithy/util-utf8": ["@smithy/util-utf8@2.3.0", "", { "dependencies": { "@smithy/util-buffer-from": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-R8Rdn8Hy72KKcebgLiv8jQcQkXoLMOGGv5uI1/k0l+snqkOzQ1R0ChUBCxWMlBsFMekWjq0wRudIweFs7sKT5A=="],
"@aws-sdk/core/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-env/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-http/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-ini/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-node/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-process/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-sso/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/credential-provider-web-identity/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-sdk/token-providers/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@smithy/credential-provider-imds/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@smithy/node-config-provider/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@smithy/property-provider/@smithy/types": ["@smithy/types@1.2.0", "", { "dependencies": { "tslib": "^2.5.0" } }, "sha512-z1r00TvBqF3dh4aHhya7nz1HhvCg4TRmw51fjMrh5do3h+ngSstt/yKlNbHeb9QxJmFbmN8KEVSWgb1bRvfEoA=="],
"@smithy/util-defaults-mode-browser/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@smithy/util-defaults-mode-node/@smithy/property-provider": ["@smithy/property-provider@4.2.5", "", { "dependencies": { "@smithy/types": "^4.9.0", "tslib": "^2.6.2" } }, "sha512-8iLN1XSE1rl4MuxvQ+5OSk/Zb5El7NJZ1td6Tn+8dQQHIjp59Lwl6bd0+nzw6SKm2wSSriH2v/I9LPzUic7EOg=="],
"@aws-crypto/sha1-browser/@smithy/util-utf8/@smithy/util-buffer-from": ["@smithy/util-buffer-from@2.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-IJdWBbTcMQ6DA0gdNhh/BwrLkDR+ADW5Kr1aZmd4k3DIF6ezMV4R2NIAmT08wQJ3yUK82thHWmC/TnK/wpMMIA=="],
"@aws-crypto/sha256-browser/@smithy/util-utf8/@smithy/util-buffer-from": ["@smithy/util-buffer-from@2.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-IJdWBbTcMQ6DA0gdNhh/BwrLkDR+ADW5Kr1aZmd4k3DIF6ezMV4R2NIAmT08wQJ3yUK82thHWmC/TnK/wpMMIA=="],
"@aws-crypto/util/@smithy/util-utf8/@smithy/util-buffer-from": ["@smithy/util-buffer-from@2.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-IJdWBbTcMQ6DA0gdNhh/BwrLkDR+ADW5Kr1aZmd4k3DIF6ezMV4R2NIAmT08wQJ3yUK82thHWmC/TnK/wpMMIA=="],
"@aws-crypto/sha1-browser/@smithy/util-utf8/@smithy/util-buffer-from/@smithy/is-array-buffer": ["@smithy/is-array-buffer@2.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-GGP3O9QFD24uGeAXYUjwSTXARoqpZykHadOmA8G5vfJPK0/DC67qa//0qvqrJzL1xc8WQWX7/yc7fwudjPHPhA=="],
"@aws-crypto/sha256-browser/@smithy/util-utf8/@smithy/util-buffer-from/@smithy/is-array-buffer": ["@smithy/is-array-buffer@2.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-GGP3O9QFD24uGeAXYUjwSTXARoqpZykHadOmA8G5vfJPK0/DC67qa//0qvqrJzL1xc8WQWX7/yc7fwudjPHPhA=="],
"@aws-crypto/util/@smithy/util-utf8/@smithy/util-buffer-from/@smithy/is-array-buffer": ["@smithy/is-array-buffer@2.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-GGP3O9QFD24uGeAXYUjwSTXARoqpZykHadOmA8G5vfJPK0/DC67qa//0qvqrJzL1xc8WQWX7/yc7fwudjPHPhA=="],
}
}

View File

@@ -1,82 +0,0 @@
/**
* Test whether the current user has access to the Bun build-cache.
*
* Exits with code 0 if access is available, or 1 otherwise.
*/
import { HeadBucketCommand, S3Client } from "@aws-sdk/client-s3";
import { CredentialsProviderError } from "@aws-sdk/property-provider";
class CacheConfig {
#bucket: string;
#region: string;
get bucket(): string {
return this.#bucket;
}
get region(): string {
return this.#region;
}
static fromArgv(): CacheConfig {
const cacheConfig = new CacheConfig();
const exitWithHelp = (reason: string) => {
console.error(`Error: ${reason}\n`);
console.error("Usage: have-access --bucket <bucket-name> --region <region>");
process.exit(1);
};
for (let i = 2; i < process.argv.length; i++) {
switch (process.argv[i]) {
case "-b":
case "--bucket":
cacheConfig.#bucket = process.argv[++i];
break;
case "-r":
case "--region":
cacheConfig.#region = process.argv[++i];
break;
default:
exitWithHelp(`Unknown argument: ${process.argv[i]}`);
}
}
if (!cacheConfig.#bucket) {
exitWithHelp("Missing required argument: --bucket");
}
if (!cacheConfig.#region) {
exitWithHelp("Missing required argument: --region");
}
return cacheConfig;
}
}
/**
* Test whether the current user has access to the Bun build-cache.
*/
async function currentUserHasAccess(cacheConfig: CacheConfig): Promise<boolean> {
const s3Client = new S3Client({ region: cacheConfig.region });
try {
await s3Client.send(new HeadBucketCommand({ Bucket: cacheConfig.bucket }));
return true;
} catch (error) {
if (
error.name === "NotFound" ||
error.$metadata?.httpStatusCode === 404 ||
error.name === "Forbidden" ||
error.$metadata?.httpStatusCode === 403 ||
error instanceof CredentialsProviderError
) {
return false;
}
throw error;
}
}
const ok = await currentUserHasAccess(CacheConfig.fromArgv());
process.exit(ok ? 0 : 1);
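For context, a sketch of how a build step might gate on this probe's exit code — the script path, bucket name, and region below are illustrative assumptions, not values from the repo:

```ts
// Hypothetical caller: run the access probe and choose a cache strategy based
// on its exit code. Path and argument values are assumptions for illustration.
const probe = Bun.spawnSync([
  "bun", "scripts/have-access.ts", // assumed location of the script above
  "--bucket", "bun-build-cache",   // illustrative bucket name
  "--region", "us-east-1",         // illustrative region
]);
// The script exits 0 only when the current credentials can reach the bucket.
const useRemoteCache = probe.exitCode === 0;
console.log(useRemoteCache ? "remote build cache available" : "falling back to local cache");
```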

View File

@@ -1,6 +0,0 @@
{
"dependencies": {
"@aws-sdk/client-s3": "^3.928.0",
"@aws-sdk/property-provider": "^3.374.0"
}
}

View File

@@ -108,8 +108,8 @@ async function build(args) {
   await startGroup("CMake Build", () => spawn("cmake", buildArgs, { env }));
   if (ciCppBuild) {
-    await startGroup("sccache stats", () => {
-      spawn("sccache", ["--show-stats"], { env });
+    await startGroup("ccache stats", () => {
+      spawn("ccache", ["--show-stats"], { env });
     });
   }
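The `startGroup` and `spawn` helpers above are defined elsewhere in this script; below is a minimal sketch of a `startGroup`-style wrapper, assuming GitHub-Actions-style log grouping (this implementation is an assumption, not the repo's actual helper):

```ts
// Sketch only: wrap an async build step in collapsible CI log-group markers.
// The ::group::/::endgroup:: directives assume a GitHub Actions-style runner.
async function startGroup<T>(name: string, fn: () => T | Promise<T>): Promise<T> {
  console.log(`::group::${name}`);
  try {
    return await fn();
  } finally {
    console.log("::endgroup::");
  }
}
```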

View File

@@ -0,0 +1,52 @@
#!/usr/bin/env bun
/**
* Creates link-metadata.json with link command and build metadata
*
* Usage: bun scripts/create-link-metadata.mjs <build-path> <bun-target>
*/
import { $ } from "bun";
import { dirname, join } from "path";
const [buildPath, bunTarget] = process.argv.slice(2);
if (!buildPath || !bunTarget) {
console.error("Usage: bun scripts/create-link-metadata.mjs <build-path> <bun-target>");
process.exit(1);
}
// Get the repo root (parent of scripts directory)
const repoRoot = dirname(import.meta.dir);
// Extract link command using ninja
console.log("Extracting link command...");
const linkCommandResult = await $`ninja -C ${buildPath} -t commands ${bunTarget}`.quiet();
const linkCommand = linkCommandResult.stdout.toString().trim();
// Read linker-related files from src/
console.log("Reading linker files...");
const linkerLds = await Bun.file(join(repoRoot, "src", "linker.lds")).text();
const symbolsDyn = await Bun.file(join(repoRoot, "src", "symbols.dyn")).text();
const symbolsTxt = await Bun.file(join(repoRoot, "src", "symbols.txt")).text();
const symbolsDef = await Bun.file(join(repoRoot, "src", "symbols.def")).text();
// Create metadata JSON with link command included
const metadata = {
bun_version: process.env.BUN_VERSION || "",
webkit_url: process.env.WEBKIT_DOWNLOAD_URL || "",
webkit_version: process.env.WEBKIT_VERSION || "",
zig_commit: process.env.ZIG_COMMIT || "",
target: bunTarget,
timestamp: new Date().toISOString(),
link_command: linkCommand,
linker_lds: linkerLds,
symbols_dyn: symbolsDyn,
symbols_txt: symbolsTxt,
symbols_def: symbolsDef,
};
const metadataPath = join(buildPath, "link-metadata.json");
await Bun.write(metadataPath, JSON.stringify(metadata, null, 2));
console.log(`Written to ${metadataPath}`);
console.log("Done");

View File

@@ -387,9 +387,6 @@ const aws = {
       owner = "amazon";
       name = `Windows_Server-${release || "*"}-English-Full-Base-*`;
     }
-  } else if (os === "freebsd") {
-    owner = "782442783595"; // upstream member of FreeBSD team, likely Colin Percival
-    name = `FreeBSD ${release}-STABLE-${{ "aarch64": "arm64", "x64": "amd64" }[arch] ?? "amd64"}-* UEFI-PREFERRED cloud-init UFS`;
   }
   if (!name) {
@@ -676,10 +673,6 @@ function getCloudInit(cloudInit) {
     case "windows":
       // handled above
       break;
-    case "freebsd":
-      sftpPath = "/usr/libexec/openssh/sftp-server";
-      shell = "/bin/csh";
-      break;
     default:
       throw new Error(`Unsupported os: ${cloudInit["os"]}`);
   }
@@ -1071,7 +1064,7 @@ function getCloud(name) {
 }
 /**
- * @typedef {"linux" | "darwin" | "windows" | "freebsd"} Os
+ * @typedef {"linux" | "darwin" | "windows"} Os
  * @typedef {"aarch64" | "x64"} Arch
  * @typedef {"macos" | "windowsserver" | "debian" | "ubuntu" | "alpine" | "amazonlinux"} Distro
  */

View File

@@ -1128,6 +1128,7 @@ async function spawnBun(execPath, { args, cwd, timeout, env, stdout, stderr }) {
     ...process.env,
     PATH: path,
     TMPDIR: tmpdirPath,
+    BUN_TMPDIR: tmpdirPath,
     USER: username,
     HOME: homedir,
     SHELL: shellPath,

View File

@@ -1538,7 +1538,7 @@ export function parseNumber(value) {
 /**
  * @param {string} string
- * @returns {"darwin" | "linux" | "windows" | "freebsd"}
+ * @returns {"darwin" | "linux" | "windows"}
  */
 export function parseOs(string) {
   if (/darwin|apple|mac/i.test(string)) {
@@ -1550,9 +1550,6 @@ export function parseOs(string) {
   if (/win/i.test(string)) {
     return "windows";
   }
-  if (/freebsd/i.test(string)) {
-    return "freebsd";
-  }
   throw new Error(`Unsupported operating system: ${string}`);
 }
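Illustrative calls showing the effect of this change (input strings are examples, not taken from the repo):

```ts
parseOs("aarch64-apple-darwin");   // "darwin" — /darwin|apple|mac/i matches first
parseOs("x86_64-unknown-freebsd"); // now throws: Unsupported operating system
```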
@@ -1915,10 +1912,6 @@ export function getUsernameForDistro(distro) {
   if (/amazon|amzn|al\d+|rhel/i.test(distro)) {
     return "ec2-user";
   }
-  if (/freebsd/i.test(distro)) {
-    return "root";
-  }
   throw new Error(`Unsupported distro: ${distro}`);
 }
@@ -2986,7 +2979,6 @@ const emojiMap = {
   gear: ["⚙️", "gear"],
   clipboard: ["📋", "clipboard"],
   rocket: ["🚀", "rocket"],
-  freebsd: ["😈", "freebsd"],
   openbsd: ["🐡", "openbsd"],
   netbsd: ["🚩", "netbsd"],
 };

View File

@@ -17,7 +17,7 @@ pkgs.mkShell rec {
     cargo
     go
     python3
-    sccache
+    ccache
     pkg-config
     gnumake
     libtool

View File

@@ -8,4 +8,5 @@ Syntax reminders:
 Conventions:
 - Prefer `@import` at the **bottom** of the file, but the auto formatter will move them so you don't need to worry about it.
 - **Never** use `@import()` inline inside of functions. **Always** put them at the bottom of the file or containing struct. Imports in Zig are free of side-effects, so there's no such thing as a "dynamic" import.
+- You must be patient with the build.

Some files were not shown because too many files have changed in this diff.