Compare commits

...

154 Commits

Author SHA1 Message Date
Claude Bot
535fabc46e Add Content-Range support for StaticRoute
Implements HTTP Range requests (RFC 7233) for static routes in Bun.serve(),
enabling partial content delivery for better streaming and download resumption.

Features:
- Parse Range header with support for:
  - Single ranges: "bytes=200-1000"
  - Open ranges: "bytes=200-"
  - Suffix ranges: "bytes=-500"
- Generate appropriate Content-Range response headers
- Return 206 Partial Content for valid ranges
- Return 416 Range Not Satisfiable for invalid ranges
- Add Accept-Ranges: bytes header to indicate range support
- Preserve ETag validation with If-None-Match
- Comprehensive test coverage

Implementation includes:
- New ContentRange.zig module with parsing and validation logic
- Integration with StaticRoute.zig for seamless range handling
- PendingRangeResponse for async streaming of large ranges
- Fallback to full content for unsupported scenarios (multipart ranges)
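The range grammar described above can be sketched as follows. This is an illustrative TypeScript model of RFC 7233 single-range parsing, not the Zig code in `ContentRange.zig`; the `parseRange` helper and its return shape are assumptions for illustration.

```typescript
// Illustrative sketch of single-range parsing per RFC 7233; not Bun's actual code.
type ByteRange = { start: number; end: number }; // inclusive byte offsets

function parseRange(header: string, size: number): ByteRange | null {
  const m = /^bytes=(\d*)-(\d*)$/.exec(header.trim());
  if (!m || (m[1] === "" && m[2] === "")) return null; // malformed
  if (m[1] === "") {
    // Suffix range "bytes=-500": the last N bytes of the resource.
    const suffix = Number(m[2]);
    if (suffix === 0) return null;
    return { start: Math.max(0, size - suffix), end: size - 1 };
  }
  const start = Number(m[1]);
  // Open range "bytes=200-" runs to the end of the resource.
  const end = m[2] === "" ? size - 1 : Math.min(Number(m[2]), size - 1);
  if (start > end || start >= size) return null; // unsatisfiable -> 416
  return { start, end };
}
```

A `null` result maps to 416 Range Not Satisfiable (or a full 200 response for malformed headers); a valid range yields 206 with `Content-Range: bytes ${start}-${end}/${size}`.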

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-07-31 01:38:21 +00:00
Jarred Sumner
245abb92fb Cleanup some code from recent PRs (#21451)
### What does this PR do?

Remove some duplicate code

### How did you verify your code works?

Ran the tests

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-28 23:35:46 -07:00
robobun
066a25ac40 Fix crash in Response.redirect with invalid arguments (#21440)
## Summary
- Fix crash in `Response.redirect()` when called with invalid arguments
like `Response.redirect(400, "a")`
- Add proper status code validation per Web API specification (301, 302,
303, 307, 308)
- Add comprehensive tests to prevent regression and ensure spec
compliance

## Issue
When `Response.redirect()` is called with invalid arguments (e.g.,
`Response.redirect(400, "a")`), the code crashes with a panic due to an
assertion failure in `fastGet()`. The second argument is passed to
`Response.Init.init()` which attempts to call `fastGet()` on non-object
values, triggering `bun.assert(this.isObject())` to fail.

Additionally, the original implementation didn't properly validate
redirect status codes according to the Web API specification.

## Fix
Enhanced the `constructRedirect()` function with:

1. **Proper status code validation**: Only allows valid redirect status
codes (301, 302, 303, 307, 308) as specified by the MDN Web API
documentation
2. **Crash prevention**: Only processes object init values to prevent
`fastGet()` crashes with non-object values
3. **Consistent behavior**: Throws `RangeError` for invalid status codes
in both number and object forms
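A minimal sketch of the validation flow described above, in TypeScript for readability; the real change is in `Response.zig`, and `validateRedirectInit` / `REDIRECT_STATUSES` are illustrative names, not Bun internals.

```typescript
// Illustrative model of the constructRedirect() validation; names are hypothetical.
const REDIRECT_STATUSES = new Set([301, 302, 303, 307, 308]);

function validateRedirectInit(init: unknown): number {
  // Number form: Response.redirect(url, 302)
  if (typeof init === "number") {
    if (!REDIRECT_STATUSES.has(init)) throw new RangeError(`Invalid redirect status: ${init}`);
    return init;
  }
  // Only object init values are inspected; anything else is safely ignored
  // instead of being passed to property lookups that assume an object.
  if (typeof init === "object" && init !== null && "status" in init) {
    const status = (init as { status: number }).status;
    if (!REDIRECT_STATUSES.has(status)) throw new RangeError(`Invalid redirect status: ${status}`);
    return status;
  }
  return 302; // spec default
}
```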

## Changes
- **`src/bun.js/webcore/Response.zig`**: Enhanced `constructRedirect()`
with validation logic
- **`test/js/web/fetch/response.test.ts`**: Added comprehensive tests
for crash prevention and status validation
- **`test/js/web/fetch/fetch.test.ts`**: Updated existing test to use
valid redirect status (307 instead of 408)

## Test Plan
- [x] Added test that reproduces the original crash scenario - now
passes without crashing
- [x] Added tests for proper status code validation (valid codes pass,
invalid codes throw RangeError)
- [x] Verified existing Response.redirect tests still pass
- [x] Confirmed Web API compliance with MDN specification
- [x] Tested various edge cases: `Response.redirect(400, "a")`,
`Response.redirect("url", 400)`, etc.

## Behavior Changes
- **Invalid status codes now throw RangeError** (spec compliant
behavior)
- **Non-object init values are safely ignored** (no more crashes)
- **Maintains backward compatibility** for valid use cases

Per [MDN Web API
specification](https://developer.mozilla.org/en-US/docs/Web/API/Response/redirect_static),
Response.redirect() should only accept status codes 301, 302, 303, 307,
or 308.

Fixes https://github.com/oven-sh/bun/issues/18414

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-28 21:16:47 -07:00
its-me-mhd
ab88317846 fix(install): Fix PATH mangling in Windows install.ps1 (#21446)
The install script was incorrectly setting $env:PATH by assigning an
array directly, which PowerShell converts to a space-separated string
instead of the required semicolon-separated format.

This caused the Windows PATH environment variable to be malformed,
making installed programs inaccessible.

Fixes #16811

### What does this PR do?

Fixes a bug in the Windows PowerShell install script where `$env:PATH`
was being set incorrectly, causing the PATH environment variable to be
malformed.

**The Problem:**
- The script assigns an array directly to `$env:PATH`
- PowerShell converts this to a space-separated string instead of
semicolon-separated
- This breaks the Windows PATH, making installed programs inaccessible

**The Fix:**
- Changed `$env:PATH = $Path;` to `$env:PATH = $Path -join ';'`
- Now properly creates semicolon-separated PATH entries as required by
Windows

### How did you verify your code works?

**Tested the bug reproduction:**
```powershell
$Path = @('C:\Windows', 'C:\Windows\System32', 'C:\test')
$env:PATH = $Path            # WRONG: becomes "C:\Windows C:\Windows\System32 C:\test"
$env:PATH = $Path -join ';'  # FIXED: becomes "C:\Windows;C:\Windows\System32;C:\test"
```
2025-07-28 20:23:34 -07:00
Jarred Sumner
e7373bbf32 when CI scripts/build.mjs is killed, also kill the processes it started (#21445)
### What does this PR do?

reduce number of zombie build processes that can happen in CI or when
building locally

### How did you verify your code works?
2025-07-28 19:54:14 -07:00
Meghan Denny
4687cc4f5e ci: only run the remap server when in CI (#21444) 2025-07-28 19:24:26 -07:00
Jarred Sumner
a5ff729665 Delete agent.mjs 2025-07-28 18:37:07 -07:00
Jarred Sumner
62e8a7fb01 Update pull_request_template.md 2025-07-28 18:28:54 -07:00
robobun
220807f3dc Add If-None-Match support to StaticRoute with automatic ETag generation (#21424)
## Summary

Fixes https://github.com/oven-sh/bun/issues/19198

This implements RFC 9110 Section 13.1.2 If-None-Match conditional
request support for static routes in Bun.serve().

**Key Features:**
- Automatic ETag generation for static content based on content hash
- If-None-Match header evaluation with weak entity tag comparison
- 304 Not Modified responses for cache efficiency
- Standards-compliant handling of wildcards (*), multiple ETags, and
weak ETags (W/)
- Method-specific application (GET/HEAD only) with proper 405 responses
for other methods

## Implementation Details

- ETags are generated using `bun.hash()` and formatted as strong ETags
(e.g., "abc123")
- Preserves existing ETag headers from Response objects
- Uses weak comparison semantics as defined in RFC 9110 Section 8.8.3.2
- Handles comma-separated ETag lists and malformed headers gracefully
- Only applies to GET/HEAD requests with 200 status codes
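The weak-comparison rules can be sketched like this. It is an illustrative model, not the `StaticRoute.zig` implementation: `ifNoneMatchMatches` is a hypothetical name, and splitting on commas is a simplification of full ETag-list parsing.

```typescript
// Illustrative sketch of If-None-Match evaluation (RFC 9110 Section 13.1.2).
function ifNoneMatchMatches(headerValue: string, currentETag: string): boolean {
  const header = headerValue.trim();
  if (header === "*") return true; // wildcard matches any current representation
  // Weak comparison: ignore a W/ prefix on either side, compare the opaque tags.
  const strip = (tag: string) => tag.trim().replace(/^W\//, "");
  const current = strip(currentETag);
  // Naive comma split; real list parsing must respect quoted-string boundaries.
  return header.split(",").some((candidate) => strip(candidate) === current);
}
```

A match on a GET/HEAD request means the client's cached copy is current, so the server responds 304 Not Modified with no body.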

## Files Changed

- `src/bun.js/api/server/StaticRoute.zig` - Core implementation (~100
lines)
- `test/js/bun/http/serve-if-none-match.test.ts` - Comprehensive test
suite (17 tests)

## Test Results

- All 17 new If-None-Match tests pass
- All 34 existing static route tests pass (no regressions)
- Debug build compiles successfully

## Test plan

- [ ] Run existing HTTP server tests to ensure no regressions
- [ ] Test ETag generation for various content types
- [ ] Verify 304 responses reduce bandwidth in real scenarios
- [ ] Test edge cases like malformed If-None-Match headers

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-28 16:22:21 -07:00
robobun
562f82d3f8 Fix Windows watcher index out of bounds crash when events exceed buffer size (#21400)
## Summary
Fixes a crash in the Windows file watcher that occurred when the number
of file system events exceeded the fixed `watch_events` buffer size
(128).

## Problem
The crash manifested as:
```
index out of bounds: index 128, len 128
```

This happened when:
1. More than 128 file system events were generated in a single watch
cycle
2. The code tried to access `this.watch_events[128]` on an array of
length 128 (valid indices: 0-127)
3. Later, `std.sort.pdq()` would operate on an invalid array slice

## Solution
Implemented a hybrid approach that preserves the original behavior while
handling overflow gracefully:

- **Fixed array for common case**: Uses the existing 128-element array
when possible for optimal performance
- **Dynamic allocation for overflow**: Switches to `ArrayList` only when
needed
- **Single-batch processing**: All events are still processed together
in one batch, preserving event coalescing
- **Graceful fallback**: Handles allocation failures with appropriate
fallbacks
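The hybrid buffer strategy above can be sketched as follows, in TypeScript for illustration only; the real code is Zig and deals with allocators directly, and `EventBuffer` is a hypothetical name.

```typescript
// Illustrative sketch of the fixed-buffer-with-overflow pattern described above.
class EventBuffer<T> {
  private readonly fixed: (T | undefined)[];
  private overflow: T[] | null = null; // allocated only when the fixed array fills
  private count = 0;

  constructor(capacity = 128) { this.fixed = new Array(capacity); }

  push(event: T): void {
    if (this.count < this.fixed.length) {
      this.fixed[this.count++] = event; // common case: no extra allocation
      return;
    }
    // Overflow case: switch to a growable list instead of indexing past the end.
    if (this.overflow === null) this.overflow = [];
    this.overflow.push(event);
    this.count++;
  }

  // All events are still handed off together in one batch, so coalescing of
  // events for the same file keeps working.
  batch(): T[] {
    const head = this.fixed.slice(0, Math.min(this.count, this.fixed.length)) as T[];
    return this.overflow ? head.concat(this.overflow) : head;
  }
}
```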

## Benefits
- **Fixes the crash** while maintaining existing performance characteristics
- **Preserves event coalescing** - events for the same file still get properly merged
- **Single consolidated callback** instead of multiple partial updates
- **Memory efficient** - no overhead for normal cases (≤128 events)
- **Backward compatible** - no API changes

## Test Plan
- [x] Compiles successfully with `bun run zig:check-windows`
- [x] Preserves existing behavior for common case (≤128 events)
- [x] Handles overflow case gracefully with dynamic allocation

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@bun.sh>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-28 15:43:54 -07:00
Dylan Conway
4580e11fc3 [PKG-517] fix(install): --linker=isolated should spawn scripts on the main thread (#21425)
### What does this PR do?
Fixes thread safety issues due to file poll code not being thread safe.

### How did you verify your code works?
Added tests for lifecycle scripts. The tests are unlikely to reproduce
the bug, but we'll know if it actually fixes the issue if
`test/package.json` doesn't show in flaky tests anymore.

---------

Co-authored-by: taylor.fish <contact@taylor.fish>
2025-07-28 12:29:47 -07:00
CountBleck
2956281845 Support WebAssembly.{instantiate,compile}Streaming() (#20503)
### What does this PR do?

<!-- **Please explain what your changes do** -->

This PR should fix #14219 and implement
`WebAssembly.instantiateStreaming()` and
`WebAssembly.compileStreaming()`.

This is a mixture of WebKit's implementation (using a helper,
`handleResponseOnStreamingAction`, also containing a fast-path for
blobs) and some of Node.js's validation (error messages) and its
builtin-based strategy to consume chunks from streams.

`src/bun.js/bindings/GlobalObject.zig` has a helper function
(`getBodyStreamOrBytesForWasmStreaming`), called by C++, to validate the
response (like
[Node.js](214e4db60e/lib/internal/wasm_web_api.js)
does) and to extract the data from the response, either as a slice/span
(if we can get the data synchronously), or as a `ReadableStream` body
(if the data is still pending or if it is a file/S3 `Blob`).

In C++, `handleResponseOnStreamingAction` is called by
`compileStreaming` and `instantiateStreaming` on the
`JSC::GlobalObjectMethodTable`, just like in
[WebKit](97ee3c598a/Source/WebCore/bindings/js/JSDOMGlobalObject.cpp (L517)).
It calls the aforementioned Zig helper for validation and getting the
response data. The data is then fed into `JSC::Wasm::StreamingCompiler`.

If the data is received as a `ReadableStream`, then we call a JS builtin
in `WasmStreaming.ts` to iterate over each chunk of the stream, like
[Node.js](214e4db60e/lib/internal/wasm_web_api.js (L50-L52))
does. The `JSC::Wasm::StreamingCompiler` is passed into JS through a new
wrapper object, `WebCore::WasmStreamingCompiler`, like
[Node.js](214e4db60e/src/node_wasm_web_api.h)
does. It has `addBytes`, `finalize`, `error`, and (unused) `cancel`
methods to mirror the underlying JSC class.

(If there's a simpler way to do this, please let me know...that would be
very much appreciated)

- [x] Code changes

### How did you verify your code works?

<!-- **For code changes, please include automated tests**. Feel free to
uncomment the line below -->

I wrote automated tests (`test/js/web/fetch/wasm-streaming.test`).

<!-- If JavaScript/TypeScript modules or builtins changed: -->

- [x] I included a test for the new code, or existing tests cover it
- [x] I ran my tests locally and they pass (`bun-debug test
test/js/web/fetch/wasm-streaming.test`)

<!-- If Zig files changed: -->

- [x] I checked the lifetime of memory allocated to verify it's (1)
freed and (2) only freed when it should be (NOTE: consumed `AnyBlob`
bodies are freed, and all other allocations are in C++ and either GCed
or ref-counted)
- [x] I included a test for the new code, or an existing test covers it
(NOTE: via JS/TS unit test)
- [x] JSValue used outside of the stack is either wrapped in a
JSC.Strong or is JSValueProtect'ed (NOTE: N/A, JSValue never used
outside the stack)
- [x] I wrote TypeScript/JavaScript tests and they pass locally
(`bun-debug test test/js/web/fetch/wasm-streaming.test`)

---------

Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>
2025-07-28 11:59:45 -07:00
Dylan Conway
9a2dfee3ca Fix env loader buffer overflow by using stack fallback allocator (#21416)
## Summary
- Fixed buffer overflow in env_loader when parsing large environment
variables with escape sequences
- Replaced fixed 4096-byte buffer with a stack fallback allocator that
automatically switches to heap allocation for larger values
- Added comprehensive tests to prevent regression

## Background
The env_loader previously used a fixed threadlocal buffer that could
overflow when parsing environment variables containing escape sequences.
This caused crashes when the parsed value exceeded 4KB.

## Changes
- Replaced fixed buffer with `StackFallbackAllocator` that uses 4KB
stack buffer for common cases and falls back to heap for larger values
- Updated all env parsing functions to accept a reusable buffer
parameter
- Added proper memory cleanup with defer statements
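The same pattern can be sketched in TypeScript as a rough analogue; Zig's `StackFallbackAllocator` semantics (ownership, freeing) don't map directly, and `FallbackBuffer` is a hypothetical name.

```typescript
// Rough analogue of a stack-fallback allocation strategy: a small preallocated
// buffer serves the common case, with transparent fallback for larger values.
class FallbackBuffer {
  private readonly stack = new Uint8Array(4096); // fast path, reused across calls

  // Returns a buffer of at least `needed` bytes, allocating only on overflow.
  get(needed: number): Uint8Array {
    return needed <= this.stack.length ? this.stack : new Uint8Array(needed);
  }
}
```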

## Test plan
- [x] Added test cases for large environment variables with escape
sequences
- [x] Added test for values larger than 4KB  
- [x] Added edge case tests (empty quotes, escape at EOF)
- [x] All existing env tests continue to pass

fixes #11627
fixes BAPI-1274

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-28 00:13:17 -07:00
Dylan Conway
7a47c945aa Fix bundler cyclic imports with async dependencies (#21423)
## Summary

This PR fixes a bug in Bun's bundler where cyclic imports with async
dependencies would produce invalid JavaScript with syntax errors.

## Problem

When modules have cyclic imports and one uses top-level await, the
bundler wasn't properly marking all modules in the cycle as async. This
resulted in non-async wrapper functions containing `await` statements,
causing syntax errors like:

```
error: "await" can only be used inside an "async" function
```

## Solution

The fix matches esbuild's approach by calling `validateTLA` for all
files before `scanImportsAndExports` begins. This ensures async status
is properly propagated through import chains before dependency
resolution.

Key changes:
1. Added a new phase that validates top-level await for all parsed
JavaScript files before import/export scanning
2. This matches esbuild's `finishScan` function which processes all
files in source index order
3. Ensures the `is_async_or_has_async_dependency` flag is properly set
for all modules in cyclic import chains
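The propagation step can be sketched as a fixed-point pass over the import graph; this is an illustrative model, not the bundler's actual data structures or traversal order.

```typescript
// Illustrative sketch: propagate "async or has async dependency" through an
// import graph, converging even in the presence of cycles.
function propagateAsync(
  imports: Map<string, string[]>, // module -> modules it imports
  hasTLA: Set<string>,            // modules that use top-level await directly
): Set<string> {
  const isAsync = new Set(hasTLA);
  let changed = true;
  while (changed) {
    changed = false;
    for (const [mod, deps] of imports) {
      if (!isAsync.has(mod) && deps.some((d) => isAsync.has(d))) {
        isAsync.add(mod); // importing an async module makes this wrapper async too
        changed = true;
      }
    }
  }
  return isAsync;
}
```

Any module in the returned set must get an async wrapper function, so an `await` inside it is legal in the bundled output.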

## Test Plan

- Fixed the reproduction case provided in
`/Users/dylan/clones/bun-esm-bug`
- All existing bundler tests pass, including
`test/bundler/esbuild/default.test.ts`
- The bundled output now correctly generates async wrapper functions
when needed

fixes #21113

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-28 00:09:16 -07:00
Dylan Conway
24b7835ecd Fix shell lexer error message handling (#21419)
## Summary
- Fixed shell lexer to properly store error messages using TextRange
instead of direct string slices
- This prevents potential use-after-free issues when error messages are
accessed after the lexer's string pool might have been reallocated
- Added test coverage for shell syntax error reporting

## Changes
- Changed `LexError.msg` from `[]const u8` to `Token.TextRange` to store
indices into the string pool
- Added `TextRange.slice()` helper method for converting ranges back to
string slices
- Updated error message concatenation logic to use the new range-based
approach
- Added test to verify syntax errors are reported correctly
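An illustrative TypeScript analogue of the range-based approach (the real types are Zig, and the names here are hypothetical): recording indices into the pool stays valid even if the pool's backing storage later moves or grows.

```typescript
// Store indices into the string pool rather than a slice, so reallocation of
// the pool cannot invalidate a previously recorded error message.
type TextRange = { start: number; end: number };

function sliceRange(pool: string, range: TextRange): string {
  return pool.slice(range.start, range.end);
}

// Appending to the pool (which in Zig may reallocate the backing memory)
// leaves previously recorded ranges valid, unlike a raw pointer/slice.
function appendToPool(pool: string, msg: string): { pool: string; range: TextRange } {
  const start = pool.length;
  return { pool: pool + msg, range: { start, end: start + msg.length } };
}
```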

## Test plan
- [x] Added test case for invalid shell syntax error reporting
- [x] Existing shell tests continue to pass
- [x] Manual testing of various shell syntax errors

closes BAPI-2232

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-07-27 23:32:06 -07:00
robobun
95990e7bd6 Give us way to know if crashing inside exit handler (#21401)
## Summary
- Add `exited: usize = 0` field to analytics Features struct 
- Increment the counter atomically in `Global.exit` when called
- Provides visibility into how often exit is being called

## Test plan
- [x] Syntax check passes for both modified files
- [x] Code compiles without errors

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-27 16:55:55 -07:00
Zack Radisic
f2e487b1e6 Remove ZigString.Slice.clone(...) (#21410)
### What does this PR do?

Removes `ZigString.Slice.clone(...)` and replaces all of its usages with
`.cloneIfNeeded(...)` which is what it did anyway (why did this alias
exist in the first place?)

Anyone reading code that sees `.clone(...)` would expect it to clone the
underlying string. This makes it _extremely_ easy to write code which
looks okay but actually results in a use-after-free:

```zig
const out: []const u8 = out: {
    const string = bun.String.cloneUTF8("hello friends!");
    defer string.deref();
    const utf8_slice = string.toUTF8(bun.default_allocator);
    defer utf8_slice.deinit();

    // doesn't actually clone
    const cloned = utf8_slice.clone(bun.default_allocator) catch bun.outOfMemory();
    break :out cloned.slice();
};

std.debug.print("Use after free: {s}!\n", .{out});
```
(This is a simplification of an actual example from the codebase)
2025-07-27 16:54:39 -07:00
robobun
3315ade0e9 fix: update hdrhistogram GitHub action workflow (#21412)
## Summary

Fixes the broken update hdrhistogram GitHub Action workflow that was
failing due to multiple issues.

## Issues Fixed

1. **Tag SHA resolution failure**: The workflow failed when trying to
resolve commit SHA from lightweight tags, causing the error "Could not
fetch SHA for tag 0.11.8 @ 8dcce8f68512fca460b171bccc3a5afce0048779"
2. **Branch naming bug**: The workflow was creating branches named
`deps/update-cares-*` instead of `deps/update-hdrhistogram-*`
3. **Wrong workflow link**: PR body was linking to `update-cares.yml`
instead of `update-hdrhistogram.yml`

## Fix Details

- **Improved tag SHA resolution**: Updated logic to handle both
lightweight and annotated tags:
  - Try to get commit SHA from tag object (for annotated tags)
  - If that fails, use the tag SHA directly (for lightweight tags)
- Uses jq's `// empty` operator and proper error handling with
`2>/dev/null`
- **Fixed branch naming**: Changed from `deps/update-cares-*` to
`deps/update-hdrhistogram-*`
- **Updated workflow link**: Fixed PR body to link to correct workflow
file
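The two-step resolution can be sketched against the GitHub REST API response shapes. `resolveTagSha` and the injected `fetchTagObject` are hypothetical names standing in for calls to `/git/ref/tags/{tag}` and `/git/tags/{sha}`; the real workflow does this in shell with `curl` and `jq`.

```typescript
// Illustrative sketch of resolving a Git tag ref to a commit SHA.
type RefObject = { type: "commit" | "tag"; sha: string };

async function resolveTagSha(
  ref: RefObject,
  fetchTagObject: (sha: string) => Promise<{ object: { sha: string } }>,
): Promise<string> {
  if (ref.type === "commit") return ref.sha; // lightweight tag: already the commit SHA
  const tag = await fetchTagObject(ref.sha); // annotated tag: dereference the tag object
  return tag.object.sha;
}
```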

## Test Plan

- [x] Verified current workflow logs show the exact error being fixed
- [x] Tested API calls locally to confirm the new logic works with
latest tag (0.11.8)
- [x] Confirmed the latest tag is a lightweight tag pointing directly to
commit

The workflow should now run successfully on the next scheduled execution
or manual trigger.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-27 16:43:38 -07:00
robobun
95e653e52b fix: handle both lightweight and annotated tags in update-lolhtml workflow (#21414)
## Summary

- Fixes the failing update-lolhtml GitHub Action that was unable to
handle lightweight Git tags
- The action was failing with "Could not fetch SHA for tag v2.6.0"
because it assumed all tags are annotated tag objects
- Updated the workflow to properly handle both lightweight tags (direct
commit refs) and annotated tags (tag objects)

## Root Cause

The lolhtml repository uses lightweight tags (like v2.6.0) which point
directly to commits, not to tag objects. The original workflow tried to
fetch a tag object for every tag, causing failures when the tag was
lightweight.

## Solution

The fix adds logic to:
1. Check the tag object type from the Git refs API response
2. For annotated tags: fetch the commit SHA from the tag object
(original behavior)
3. For lightweight tags: use the SHA directly from the ref (new
behavior)

## Test Plan

- [x] Verified the workflow logic handles both tag types correctly
- [ ] The next scheduled run should succeed (runs weekly on Sundays at 1
AM UTC)
- [ ] Manual workflow dispatch can be used to test immediately

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-27 16:43:17 -07:00
robobun
a8522b16af fix: handle both lightweight and annotated tags in lshpack update action (#21413)
## What does this PR do?

Fixes the failing `update-lshpack.yml` GitHub Action that has been
consistently failing with the error: "Could not fetch SHA for tag v2.3.4
@ {SHA}".

## Root Cause

The workflow assumed all Git tags are annotated tags that need to be
dereferenced via the GitHub API. However, some tags (like lightweight
tags) point directly to commits and don't need dereferencing. When the
script tried to dereference a lightweight tag, the API call failed.

## Fix

This PR updates the workflow to:

1. **Check the tag type** before attempting to dereference
2. **For annotated tags** (`type="tag"`): dereference to get the commit
SHA via `/git/tags/{sha}`
3. **For lightweight tags** (`type="commit"`): use the SHA directly
since it already points to the commit

## Changes Made

- Updated `.github/workflows/update-lshpack.yml` to properly handle both
lightweight and annotated Git tags
- Added proper tag type checking before attempting to dereference tags
- Improved error messages to distinguish between tag types

## Testing

The workflow will now handle both types of Git tags properly:
- Annotated tags: properly dereferences to get commit SHA
- Lightweight tags: uses the tag SHA directly as commit SHA

This should resolve the consistent failures in the lshpack update
automation.

## Files Changed

- `.github/workflows/update-lshpack.yml`: Updated tag SHA resolution
logic

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-27 16:43:10 -07:00
robobun
aba8c4efd2 fix: handle lightweight tags in update highway workflow (#21411)
## Summary

Fixes the broken update highway GitHub action that was failing with:
> Error: Could not fetch SHA for tag 1.2.0 @
457c891775a7397bdb0376bb1031e6e027af1c48

## Root Cause

The workflow assumed all Git tags are annotated tags, but Google Highway
uses lightweight tags. For lightweight tags, the GitHub API returns
`object.type: "commit"` and `object.sha` is already the commit SHA. For
annotated tags, `object.type: "tag"` and you need to fetch the tag
object to get the commit SHA.

## Changes

- Updated tag SHA fetching logic to handle both lightweight and
annotated tags
- Fixed incorrect branch name (`deps/update-cares` →
`deps/update-highway`)
- Fixed workflow URL in PR template

## Test Plan

- [x] Verified the API returns `type: "commit"` for highway tag 1.2.0
- [x] Confirmed the fix properly extracts the commit SHA:
`457c891775a7397bdb0376bb1031e6e027af1c48`
- [x] Manual testing shows current version (`12b325bc...`) != latest version (`457c891...`)

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-27 16:43:01 -07:00
Jarred Sumner
0bebdc9049 Add a comment 2025-07-26 21:59:21 -07:00
Ali
1058d0dee4 fix import.meta.url in hmr (#21386)
### What does this PR do?

use `window.location.origin` in the browser instead of `bun://`.
should fix [#19910](https://github.com/oven-sh/bun/issues/19910)

- [ ] Documentation or TypeScript types (it's okay to leave the rest
blank in this case)
- [x] Code changes

### How did you verify your code works?

2025-07-26 20:31:45 -07:00
Zack Radisic
679a07caef Fix assertion failure in dev server related to deleting files (#21304)
### What does this PR do?

Fixes an assertion failure in dev server which may happen if you delete
files. The issue was that `disconnectEdgeFromDependencyList(...)` was
wrong and too prematurely setting `g.first_deps[idx] = .none`.

Fixes #20529

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-26 16:16:55 -07:00
pfg
0bd73b4363 Fix toIncludeRepeated (#21366)
Fixes #12276: toIncludeRepeated should check for the exact repeat count, not >=.

This is a breaking change because some people may be relying on the
existing behaviour. Should it be feature-flagged for 1.3?
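The semantic change can be sketched as follows (illustrative, not the matcher's actual source): only the final comparison operator differs between the old and new behaviour.

```typescript
// Count non-overlapping occurrences of `needle` in `haystack`.
function countOccurrences(haystack: string, needle: string): number {
  let count = 0;
  for (let i = haystack.indexOf(needle); i !== -1; i = haystack.indexOf(needle, i + needle.length)) {
    count++;
  }
  return count;
}

// Old behaviour: passes when the substring appears AT LEAST n times.
const oldToIncludeRepeated = (s: string, sub: string, n: number) => countOccurrences(s, sub) >= n;
// New behaviour: passes only when the substring appears EXACTLY n times.
const newToIncludeRepeated = (s: string, sub: string, n: number) => countOccurrences(s, sub) === n;
```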

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-25 22:22:04 -07:00
Meghan Denny
bbdc3ae055 node: sync updated tests (#21147) 2025-07-25 19:14:22 -07:00
Meghan Denny
81e08d45d4 node: fix test-worker-uncaught-exception-async.js (#21150) 2025-07-25 19:13:48 -07:00
pfg
89eb48047f Auto reduce banned word count (#20929)
Co-authored-by: pfgithub <6010774+pfgithub@users.noreply.github.com>
2025-07-25 18:13:43 -07:00
taylor.fish
712d5be741 Add safety checks to MultiArrayList and BabyList (#21063)
Ensure we aren't using multiple allocators with the same list by storing
a pointer to the allocator in debug mode only.

This check is stricter than the bare minimum necessary to prevent
illegal behavior, so CI may reveal certain uses that fail the checks but
don't cause IB. Most of these cases should probably be updated to comply
with the new requirements—we want these types' invariants to be clear.

(For internal tracking: fixes ENG-14987)
2025-07-25 18:12:21 -07:00
taylor.fish
60823348c5 Optimize WaitGroup implementation (#21260)
`add` no longer locks a mutex, and `finish` no longer locks a mutex
except for the last task. This could meaningfully improve performance in
cases where we spawn a large number of tasks on a thread pool. This
change doesn't alter the semantics of the type, unlike the standard
library's `WaitGroup`, which also uses atomics but has to be explicitly
reset.

(For internal tracking: fixes ENG-19722)

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-25 18:12:12 -07:00
190n
2f0ddf3018 gamble.ts: print summary of runs so far before exit upon ctrl-c (#21296)
### What does this PR do?

lets you get useful info out of this script even if you set the number
of attempts too high and you don't want to wait

### How did you verify your code works?

local testing

(I cancelled CI because this script is not used anywhere in CI so it
wouldn't tell us anything useful)
2025-07-25 17:46:34 -07:00
190n
1ab76610cf [STAB-861] Suppress known-benign core dumps in CI (#21321)
### What does this PR do?

- for these kinds of aborts which we test in CI, introduce a feature flag to
suppress core dumps and crash reporting only from that abort, and set the flag
when running the test:
    - libuv stub functions
    - Node-API abort (used in particular when calling illegal functions during finalizers)
    - passing `process.kill` its own PID
- core dumps are suppressed with `setrlimit`, and crash reporting with
the new `suppress_reporting` field. these suppressions are only engaged
right before crashing, so we won't ignore new kinds of crashes that come
up in these tests.
- for the test bindings used to test the crash handler in
`run-crash-handler.test.ts`, disables core dumps but does not disable
crash reporting (because crashes get reported to a server that the test
is running to make sure they are reported)
- fixes a panic when printing source code around an error containing
`\n\r`
- updates the code where we clone vendor tests to checkout the right tag
- adds `vendor/elysia/test/path/plugin.test.ts` to
no-validate-exceptions
- this failure was exposed by starting to test the version of elysia we
have been intending to test. the crash trace suggests it may be fixed by
#21307.
- makes dumping core or uploading a crash report count as a failing test
- this ensures we don't miss a crash that occurred in a
subprocess when the main test doesn't adequately check the exit code. to
spawn a subprocess you expect to fail, prefer `expect(code).toBe(1)`
over `expect(code).not.toBe(0)`. if you really expect multiple possible
erroneous exit codes, you might try `expect(signal).toBeNull()` to still
disallow crashes.

### How did you verify your code works?

Running affected tests on a Linux machine with core dumps set up and
checking no new ones appear.

https://buildkite.com/bun/bun/builds/21465 has no core dumps.
2025-07-25 16:22:04 -07:00
taylor.fish
d3927a6e09 Fix isolated install atomics/synchronization (#21365)
Also fix a race condition with hardlinking on Windows during hoisted
installs, and a bug in the process waiter thread implementation causing
items to be skipped.

(For internal tracking: fixes STAB-850, STAB-873, STAB-881)
2025-07-25 16:17:50 -07:00
robobun
8424caa5fa Fix bounds check in lexer for single # character (#21341) 2025-07-25 13:59:33 -07:00
robobun
3e6d792b62 Fix PostgreSQL index out of bounds crash during batch inserts (#21311) (#21316)
## Summary

Fixes the "index out of bounds: index 0, len 0" crash that occurs during
large batch PostgreSQL inserts, particularly on Windows systems.

The issue occurred when PostgreSQL DataRow messages contained data but
the `statement.fields` array was empty (len=0), causing crashes in
`DataCell.Putter.putImpl()`. This typically happens during large batch
operations where there may be race conditions or timing issues between
RowDescription and DataRow message processing.

## Changes

- **Add bounds checking** in `DataCell.Putter.putImpl()` before
accessing `fields` and `list` arrays
(src/sql/postgres/DataCell.zig:1043-1050)
- **Graceful degradation** - return `false` to ignore extra fields
instead of crashing
- **Debug logging** to help diagnose field metadata issues
- **Comprehensive regression tests** covering batch inserts, empty
results, and concurrent operations

## Test Plan

- [x] Added regression tests in `test/regression/issue/21311.test.ts`
- [x] Tests pass with the fix: All 3 tests pass with 212 expect() calls
- [x] Existing PostgreSQL tests still work (no regressions)

The fix prevents the crash while maintaining safe operation, allowing
PostgreSQL batch operations to continue working reliably.

## Root Cause

The crash occurred when:
1. `statement.fields` array was empty (len=0) due to timing issues
2. PostgreSQL DataRow messages contained actual data 
3. Code tried to access `this.list[index]` and `this.fields[index]`
without bounds checking

This was particularly problematic on Windows during batch operations due
to potential differences in:
- Network stack message ordering
- Memory allocation behavior
- Threading/concurrency during batch operations
- Statement preparation timing

Fixes #21311

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2025-07-25 12:53:49 -07:00
aspizu
9bb2474adb fix: display of eq token in shell lexer (#21275)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Zack Radisic <56137411+zackradisic@users.noreply.github.com>
2025-07-25 12:47:48 -07:00
pfg
fe94a36dbc Make execArgv empty when in compiled executable (#21298)
```ts
// a.js
console.log({
    argv: process.argv,
    execArgv: process.execArgv,
});
```

```diff
$> node a.js -a --b
{
  argv: [
    '/opt/homebrew/Cellar/node/24.2.0/bin/node',
    '/tmp/a.js',
    '-a',
    '--b'
  ],
  execArgv: []
}

$> bun a.js -a --b
{
  argv: [ "/Users/pfg/.bun/bin/bun", "/tmp/a.js",
    "-a", "--b"
  ],
  execArgv: [],
}

$> bun build --compile a.js --outfile=a
   [5ms]  bundle  1 modules
  [87ms] compile  

$> ./a -a --b

{
  argv: [ "bun", "/$bunfs/root/a", "-a", "--b" ],
- execArgv: [ "-a", "--b" ],
+ execArgv: [],
}
```

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-25 12:47:10 -07:00
pfg
cb2887feee Fix Bun.resolve() returning a promise throwing a raw exception instead of an Error (#21302)
I haven't checked all uses of tryTakeException but this bug is probably
not the only one.

Caught by running fuzzy-wuzzy with debug logging enabled. It tried to
print the exception. Updates fuzzy-wuzzy to have improved logging that
can tell you what was last executed before a crash.

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-25 12:46:33 -07:00
Meghan Denny
64361eb964 zig: delete deprecated bun.jsc.Maybe (#21327)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: taylor.fish <contact@taylor.fish>
2025-07-25 12:38:06 -07:00
Meghan Denny
3413a2816f node:fs: fix leak of mkdir return value (#21360) 2025-07-25 01:00:20 -07:00
190n
97a530d832 [publish images] [BAPI-2103] use gdb and bun.report code to print stack traces upon crash in CI (#21143)
### What does this PR do?

Closes #13012

On Linux, when any Bun process spawned by `runner.node.mjs` crashes, we
run GDB in batch mode to print a backtrace from the core file.

And on all platforms, we run a mini `bun.report` server which collects
crashes reported by any Bun process executed during the tests, and after
each test `runner.node.mjs` fetches and prints any new crashes from the
server.

<details>
<summary>example 1</summary>

```
#0  crash_handler.crash () at crash_handler.zig:1513
#1  0x0000000002cf4020 in crash_handler.crashHandler (reason=..., error_return_trace=0x0, begin_addr=...) at crash_handler.zig:479
#2  0x0000000002cefe25 in crash_handler.handleSegfaultPosix (sig=<optimized out>, info=<optimized out>) at crash_handler.zig:800
#3  0x00000000045a1124 in WTF::jscSignalHandler (sig=11, info=0x7ffe044e30b0, ucontext=0x0) at vendor/WebKit/Source/WTF/wtf/threads/Signals.cpp:548
#4  <signal handler called>
#5  JSC::JSCell::type (this=0x0) at vendor/WebKit/Source/JavaScriptCore/runtime/JSCellInlines.h:137
#6  JSC::JSObject::getOwnNonIndexPropertySlot (this=0x150bc914fe18, vm=..., structure=0x150a0102de50, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSObject.h:1348
#7  JSC::JSObject::getPropertySlot<false> (this=0x150bc914fe18, globalObject=0x150b864e0088, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSObject.h:1433
#8  JSC::JSValue::getPropertySlot (this=0x7ffe044e4880, globalObject=0x150b864e0088, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSCJSValueInlines.h:1108
#9  JSC::JSValue::get (this=0x7ffe044e4880, globalObject=0x150b864e0088, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSCJSValueInlines.h:1065
#10 JSC::LLInt::performLLIntGetByID (bytecodeIndex=..., codeBlock=0x150b861e7740, globalObject=0x150b864e0088, baseValue=..., ident=..., metadata=...) at vendor/WebKit/Source/JavaScriptCore/llint/LLIntSlowPaths.cpp:878
#11 0x0000000004d7b055 in llint_slow_path_get_by_id (callFrame=0x7ffe044e4ab0, pc=0x150bc92ea0e7) at vendor/WebKit/Source/JavaScriptCore/llint/LLIntSlowPaths.cpp:946
#12 0x0000000003dd6042 in llint_op_get_by_id ()
#13 0x0000000000000000 in ?? ()
```

</details>

<details>
<summary>example 2</summary>

```
  #0  crash_handler.crash () at crash_handler.zig:1513
  #1  0x0000000002c5db80 in crash_handler.crashHandler (reason=..., error_return_trace=0x0, begin_addr=...) at crash_handler.zig:479
  #2  0x0000000002c59f60 in crash_handler.handleSegfaultPosix (sig=<optimized out>, info=<optimized out>) at crash_handler.zig:800
  #3  0x00000000042ecc88 in WTF::jscSignalHandler (sig=11, info=0xfffff60141b0, ucontext=0xfffff6014230) at vendor/WebKit/Source/WTF/wtf/threads/Signals.cpp:548
  #4  <signal handler called>
  #5  bun.js.api.FFIObject.Reader.u8 (globalObject=0x4000554e0088) at /var/lib/buildkite-agent/builds/ip-172-31-75-92/bun/bun/src/bun.js/api/FFIObject.zig:65
  #6  bun.js.jsc.host_fn.toJSHostCall__anon_1711576 (globalThis=0x4000554e0088, args=...) at /var/lib/buildkite-agent/builds/ip-172-31-75-92/bun/bun/src/bun.js/jsc/host_fn.zig:97
  #7  bun.js.jsc.host_fn.DOMCall("Reader"[0..6],bun.js.api.FFIObject.Reader,"u8"[0..2],.{ .reads = .{ ... }, .writes = .{ ... } }).slowpath (globalObject=0x4000554e0088, thisValue=70370172175040, arguments_ptr=0xfffff6015460, arguments_len=1) at /var/lib/buildkite-agent/builds/ip-172-31-75-92/bun/bun/src/bun.js/jsc/host_fn.zig:490
  #8  0x000040003419003c in ?? ()
  #9  0x0000400055173440 in ?? ()
```

</details>

I used GDB instead of LLDB (as the branch name suggests) because it
seems to produce more useful stack traces with musl libc.

- [x] on linux, use gdb to print from core dump of main bun process
crashed
- [x] on linux, use gdb to print from all new core dumps (so including
bun subprocesses spawned by the test that crashed)
- [x] on all platforms, use a mini bun.report server to print a
self-reported trace (depends on oven-sh/bun.report#15; for now our
package.json points to a commit on the branch of that repo)
- [x] fix trying to fetch stack traces too early on windows
- [x] use output groups so the traces show up alongside the log for the
specific test instead of having to find it in the logs from the entire
run
- [x] get oven-sh/bun.report#15 merged, and point to a bun.report commit
on the main branch instead of the PR branch in package.json

### How did you verify your code works?

Manually, and in CI with a crashing test.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-24 23:32:54 -07:00
Meghan Denny
72a6278b3f Delete test/js/node/test/parallel/test-http-url.parse-https.request.js
flaky on macos ci atm
2025-07-24 19:06:38 -07:00
Meghan Denny
bd232189b4 node: fix test-http-server-keepalive-req-gc.js (#21333) 2025-07-24 13:46:50 -07:00
Meghan Denny
87df7527bb node: fix test-http-url.parse-https.request.js (#21335) 2025-07-24 13:44:55 -07:00
190n
a1f756fea9 Fix running bun test on multiple node:test tests (#19354)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-24 11:48:55 -07:00
190n
03dfd7d96b Run callback passed to napi_module_register after dlopen returns instead of during call (#20478) 2025-07-24 11:46:56 -07:00
Meghan Denny
9fa8ae9a40 fixes #21274 (#21278)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-23 22:31:42 -07:00
Ciro Spaciari
e4dd780c2a fix(s3) ref before calling unlink (#21317)
Co-authored-by: Meghan Denny <meghan@bun.sh>
2025-07-23 22:23:58 -07:00
taylor.fish
76c623817f Fix Bake WatcherAtomics (#21328) 2025-07-23 22:23:35 -07:00
Dylan Conway
f90a007593 [PKG-513, ENG-19757] bun install: fix non-ascii edge case and simplify task lifetimes (#21320) 2025-07-23 19:23:54 -07:00
Meghan Denny
ea12cb5801 ci: show the retries count on flaky tests (#21325)
Co-authored-by: 190n <ben@bun.sh>
Co-authored-by: taylor.fish <contact@taylor.fish>
2025-07-23 19:11:18 -07:00
Meghan Denny
75027e9616 ci: give windows a bit more time for tests
noticed napi compiling slowly in the wild
2025-07-23 18:04:20 -07:00
Meghan Denny
e8d0935717 zig: delete some deprecated bun.jsc apis (#21309) 2025-07-23 17:10:58 -07:00
190n
ace81813fc [STAB-851] fix debug-coredump.ts (#21318) 2025-07-23 12:23:14 -07:00
Zack Radisic
71e2161591 Split DevServer.zig into multiple files (#21299)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-23 05:29:22 +00:00
taylor.fish
07cd45deae Refactor Zig imports and file structure (part 1) (#21270)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-22 17:51:38 -07:00
Meghan Denny
73d92c7518 Revert "Fix beforeAll hooks running for unmatched describe blocks when using test filters" (#21292)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-22 12:53:53 -07:00
Jarred Sumner
5c44553a02 Update vscode-release.yml 2025-07-21 23:25:57 -07:00
Michael H
f4116bfa7d followup for vscode test runner (#21024)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-21 23:21:05 -07:00
Jarred Sumner
5aeede1ac7 Quieter build commands 2025-07-21 21:56:41 -07:00
Meghan Denny
6d2a0e30f5 test: node: revert the previous tmpdirName commit 2025-07-21 21:34:00 -07:00
Jarred Sumner
382fe74fd0 Remove noisy copy_file logs on Linux 2025-07-21 21:24:35 -07:00
Jarred Sumner
aac646dbfe Suppress linker alignment warnings in debug build on macOS 2025-07-21 21:24:35 -07:00
Jarred Sumner
da90ad84d0 Fix windows build in git bash 2025-07-21 21:24:35 -07:00
robobun
6383c8f94c Fix beforeAll hooks running for unmatched describe blocks when using test filters (#21195)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-21 20:21:29 -07:00
robobun
718e7cdc43 Upgrade libarchive to v3.8.1 (#21250)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Zack Radisic <zack@theradisic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-21 20:08:00 -07:00
robobun
cd54db1e4b Fix Windows cross-compilation missing executable permissions (#21268)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-21 19:47:00 -07:00
Jarred Sumner
171169a237 Update CLAUDE.md 2025-07-21 19:14:24 -07:00
Jarred Sumner
5fbd99e0cb Fix parsing bug with non-sha256 integrity hashes when migrating lockfiles (#21220) 2025-07-21 17:10:06 -07:00
pfg
60faa8696f Auto cpp->zig bindings (#20881)
Co-authored-by: pfgithub <6010774+pfgithub@users.noreply.github.com>
Co-authored-by: Ben Grant <ben@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-21 16:26:07 -07:00
Meghan Denny
d2a4fb8124 test: node: much more robust tmpdirName for use with --parallel 2025-07-21 15:56:32 -07:00
Meghan Denny
a4d031a841 meta: add a --parallel flag to the runner for faster local testing (#21140)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-21 15:17:19 -07:00
Meghan Denny
56bc65932f ci: print memory usage in runner (#21163) 2025-07-21 15:12:30 -07:00
pfg
83760fc446 Sort imports in all files (#21119)
Co-authored-by: taylor.fish <contact@taylor.fish>
2025-07-21 13:26:47 -07:00
robobun
74d3610d41 Update lol-html to v2.6.0 (#21251)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-21 02:08:35 -07:00
Jarred Sumner
1d085cb4d4 CI: normalize glob-sources paths to posix paths 2025-07-21 01:24:59 -07:00
Jarred Sumner
a868e859d7 Run formatter 2025-07-21 01:19:09 -07:00
Zack Radisic
39dd5002c3 Fix CSS error with printing :is(...) pseudo class (#21249)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-21 00:21:33 -07:00
robobun
7940861b87 Fix extra bracket in template literal syntax highlighting (#17327) (#21187)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-20 23:38:24 -07:00
github-actions[bot]
f65f31b783 deps: update sqlite to 3.50.300 (#21222)
Co-authored-by: Jarred-Sumner <Jarred-Sumner@users.noreply.github.com>
2025-07-20 23:05:49 -07:00
robobun
cc5d8adcb5 Enable Windows long path support (#21244)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-20 23:04:17 -07:00
Jarred Sumner
bbc4f89c25 Deflake test-21049.test.ts 2025-07-20 23:02:10 -07:00
Zack Radisic
f4339df16b SSG stuff (#20998)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-20 22:37:50 -07:00
robobun
12dafa4f89 Add comprehensive documentation for bun ci command (#21231)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-20 03:46:44 -07:00
robobun
436be9f277 docs: add comprehensive documentation for bun update --interactive (#21232)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-20 03:39:31 -07:00
robobun
422991719d docs: add isolated installs to navigation (#21217)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-19 22:27:44 -07:00
Jarred Sumner
bc030d23b3 Read the same values as pnpm's .npmrc node-linker options 2025-07-19 22:20:11 -07:00
Meghan Denny
146fb2f7aa Bump 2025-07-19 11:55:14 -07:00
robobun
1e5f746f9b docs: add comprehensive isolated install documentation (#21198) 2025-07-19 06:12:46 -07:00
robobun
aad3abeadd Update interactive spacing (#21156)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: RiskyMH <git@riskymh.dev>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-19 04:11:28 -07:00
Jarred Sumner
6d5637b568 Remove debug logs 2025-07-19 04:04:24 -07:00
Jarred Sumner
eb0b0db8fd Rename IS_CODE_AGENT -> AGENT 2025-07-19 03:59:47 -07:00
Dylan Conway
1a9bc5da09 fix(install): isolated install aliased dependency name fix (#21138)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
Co-authored-by: Meghan Denny <meghan@bun.sh>
2025-07-19 01:59:52 -07:00
taylor.fish
a1c0f74037 Simplify/fix threading utilities (#21089)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-18 22:02:36 -07:00
robobun
b5a9d09009 Add comprehensive build-time constants guide for --define flag (#21171)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-18 15:40:17 -07:00
Alistair Smith
a280d15bdc Remove conflicting declaration breaking lint.yml (#21180) 2025-07-18 16:26:54 -04:00
Jarred Sumner
f380458bae Add remoteRoot/localRoot mapping for VSCode (#19884)
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: robobun <robobun@oven.sh>
2025-07-18 04:19:15 -07:00
Meghan Denny
6ed245b889 ci: add auto-updaters for highway and hdrhistogram (#21157) 2025-07-18 04:15:33 -07:00
Meghan Denny
89e322e3b5 cmake: set nodejs_version statically (#21164) 2025-07-18 04:14:11 -07:00
Jarred Sumner
45e97919e5 Add bun bd to test/ folder and also run formatter 2025-07-17 22:20:02 -07:00
robobun
1927b06c50 Add documentation for AI agent environment variables in test runner (#21159)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-17 21:45:47 -07:00
robobun
851fa7d3e6 Add support for AI agent environment variables to quiet test output (#21135)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-17 21:20:50 -07:00
Michael H
be03a537df make bun ci work as alias of --frozen-lockfile (like npm ci) (#20590)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-17 05:41:06 -07:00
Jarred Sumner
664506474a Reduce stack space usage in Cli.start function (#21133) 2025-07-17 05:34:37 -07:00
Jarred Sumner
804e76af22 Introduce bun update --interactive (#20850)
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-17 04:33:30 -07:00
Michael H
67bed87795 followups from recent merged pr's (#21109) 2025-07-17 03:14:26 -07:00
Michael H
d181e19952 bun bake templates: --source-map -> --sourcemap (#21130) 2025-07-17 03:13:48 -07:00
190n
6b14f77252 fix nonsense test name and elapsed time when beforeEach callback has thrown (#21118)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-17 02:02:57 -07:00
Meghan Denny
cc4f840e8b test: remove accidental uses of test.only (#20097) 2025-07-16 23:07:23 -07:00
Dylan Conway
0bb7132f61 [ENG-15045] add --linker and copyfile fallback for isolated installs (#21122)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-16 23:02:52 -07:00
Zack Radisic
45a0559374 Fix some shell things (#20649)
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
Co-authored-by: Zack Radisic <zackradisic@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: robobun <robobun@oven.sh>
2025-07-16 19:46:31 -07:00
Jarred Sumner
7bbb4e2ad9 Enable ASAN by default on Linux x64 & arm64 debug builds 2025-07-16 17:57:00 -07:00
Ben Grant
e273f7d122 [publish images] for #21095 2025-07-16 11:43:38 -07:00
Michael H
15b7cd8c18 node:fs.glob support array of excludes and globs (#20883) 2025-07-16 03:08:54 -07:00
190n
d3adc16524 do not silently fail in configure_core_dumps (#21095) 2025-07-16 03:06:46 -07:00
Meghan Denny
fae0a64d90 small 20956 followup (#21103) 2025-07-16 02:26:40 -07:00
Michael H
0ee633663e Implement bun why (#20847)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-16 02:00:53 -07:00
Michael H
a717679fb3 support $variables in test.each (#21061) 2025-07-16 01:42:19 -07:00
Dylan Conway
36ce7b0203 comment 2025-07-16 01:08:30 -07:00
Dylan Conway
2bc75a87f4 fix(install): fix resolving duplicate dependencies (#21059)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-16 01:01:10 -07:00
Jarred Sumner
fdec7fc6e3 Simpler version of #20813 (#21102)
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <Jarred-Sumner@users.noreply.github.com>
2025-07-16 00:14:33 -07:00
Meghan Denny
875604a42b safety: a lot more exception checker progress (#20956) 2025-07-16 00:11:19 -07:00
Jarred Sumner
d8b37bf408 Shrink MimeType list (#21004)
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Meghan Denny <meghan@bun.sh>
2025-07-15 22:37:09 -07:00
jarred-sumner-bot
32ce9a3890 Add Windows PE codesigning support for standalone executables (#21091)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-15 22:31:54 -07:00
Jarred Sumner
0ce70df96b Buffer stdout when printing summary + check for dir/package.json + npmrc link-workspace-packages + npmrc save-exact (#20907) 2025-07-15 22:15:41 -07:00
Jarred Sumner
b0a5728b37 Allow "catalog" and "catalogs" in top-level package.json if it wasn't provided in "workspaces" object (#21009) 2025-07-15 22:14:27 -07:00
Michael H
20db4b636e implement bun pm pkg (#21046) 2025-07-15 22:14:00 -07:00
jarred-sumner-bot
1cef9399e4 fix: CSS parser crashes with calc() NaN values and system colors in color-mix() (#21079)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Zack Radisic <56137411+zackradisic@users.noreply.github.com>
2025-07-15 22:02:03 -07:00
Michael H
f4444c0e4d fix --tsconfig-override (#21045) 2025-07-15 22:00:17 -07:00
Dylan Conway
577b327fe5 [ENG-15008] fix #21086 (#21093)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-15 21:59:41 -07:00
pfg
9d2eb78544 Fetch redirect fix (#21064)
Co-authored-by: pfgithub <6010774+pfgithub@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2025-07-15 20:51:52 -07:00
Meghan Denny
4be43e2c52 de-flake spawn.test.ts 2025-07-15 17:18:46 -07:00
Ciro Spaciari
93cc51c974 fix(s3) handle slashes at the start and end of bucket name (#20997) 2025-07-15 16:52:47 -07:00
190n
528ee8b673 cleanup main.zig (#21085) 2025-07-15 16:29:03 -07:00
Jarred Sumner
c2fc69302b Try to deflake bun-install-lifecycle-scripts test (#21070) 2025-07-15 16:27:56 -07:00
Jarred Sumner
df7b6b4813 Add NODE_NO_WARNINGS (#21075) 2025-07-15 16:27:21 -07:00
jarred-sumner-bot
81b4b8ca94 fix(s3): ensure correct alphabetical query parameter order in presigned URLs (#21050)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: jarred-sumner-bot <220441119+jarred-sumner-bot@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-15 16:20:34 -07:00
Meghan Denny
a242c878c3 [publish images] ci: add ubuntu 25.04 (#20996)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-07-15 01:22:41 -07:00
taylor.fish
77af8729c1 Fix potential buffer overflow in which.isValid (#21062)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-15 01:20:34 -07:00
taylor.fish
3b2289d76c Sync Mutex and Futex with upstream and port std.Thread.Condition (#21060)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-15 01:16:40 -07:00
jarred-sumner-bot
803181ae7c Add --quiet flag to bun pm pack command (#21053)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: jarred-sumner-bot <220441119+jarred-sumner-bot@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-15 01:14:58 -07:00
Jarred Sumner
89aae0bdc0 Add flag to disable sql auto pipelining (#21067)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-07-15 01:13:35 -07:00
Jarred Sumner
e62f9286c9 Fix formatter action 2025-07-15 00:16:30 -07:00
Alistair Smith
b75eb0c7c6 Updates to bun init React templates: Electric boogaloo (#21066) 2025-07-15 00:10:02 -07:00
Jarred Sumner
33fac76763 Delete flaky test that made incorrect assumptions about the event loop
We cannot assume that the next event loop cycle will yield the expected outcome from the operating system. We can assume certain things happen in a certain order, but the fact that timers run next does not mean kqueue/epoll/iocp will have the data available by then.
2025-07-14 23:58:37 -07:00
Jarred Sumner
40970aee3b Delete flaky test that made incorrect assumptions about the event loop
We cannot assume that the next event loop cycle will yield the expected outcome from the operating system. We can assume certain things happen in a certain order, but the fact that timers run next does not mean kqueue/epoll/iocp will have the data available by then.
2025-07-14 23:53:12 -07:00
Jarred Sumner
a6f09228af Fix 2025-07-14 22:33:59 -07:00
Ciro Spaciari
6efb346e68 feature(postgres) add pipelining support (#20986)
Co-authored-by: cirospaciari <6379399+cirospaciari@users.noreply.github.com>
2025-07-14 21:59:16 -07:00
jarred-sumner-bot
5fe0c034e2 fix: respect user-provided Connection header in fetch() requests (#21049)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: jarred-sumner-bot <220441119+jarred-sumner-bot@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <Jarred-Sumner@users.noreply.github.com>
2025-07-14 20:53:46 -07:00
jarred-sumner-bot
7f29446d9b fix(install): handle lifecycle script spawn failures gracefully instead of crashing (#21054)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: jarred-sumner-bot <220441119+jarred-sumner-bot@users.noreply.github.com>
2025-07-14 20:50:32 -07:00
Jose Angel
c5f64036a7 docs: Update docs for using prisma with bun (#21048)
Co-authored-by: jose angel gonzalez velazquez <angel@23LAP1CD208600C.indra.es>
2025-07-14 19:40:05 -07:00
jarred-sumner-bot
757096777b Document test.coveragePathIgnorePatterns option (#21036)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-07-14 17:36:10 -07:00
jarred-sumner-bot
e9ccc81e03 Add --sql-preconnect CLI flag for PostgreSQL startup connections (#21035)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: jarred-sumner-bot <220441119+jarred-sumner-bot@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-07-14 15:05:30 -07:00
Jarred Sumner
70ebe75e6c Fixes #21011 (#21014) 2025-07-14 14:18:18 -07:00
1157 changed files with 64393 additions and 35601 deletions

View File

@@ -1,78 +0,0 @@
import { spawnSync } from "node:child_process";
import { readFileSync, existsSync } from "node:fs";
import { parseArgs } from "node:util";
const { positionals, values } = parseArgs({
  allowPositionals: true,
  options: {
    help: {
      type: "boolean",
      short: "h",
      default: false,
    },
    interactive: {
      type: "boolean",
      short: "i",
      default: false,
    },
  },
});
if (values.help || positionals.length === 0) {
  console.log("Usage: node agent.mjs <prompt_name> [extra_args...]");
  console.log("Example: node agent.mjs triage fix bug in authentication");
  console.log("Options:");
  console.log("  -h, --help         Show this help message");
  console.log("  -i, --interactive  Run in interactive mode");
  process.exit(0);
}
const promptName = positionals[0].toUpperCase();
const promptFile = `.agent/${promptName}.md`;
const extraArgs = positionals.slice(1);
if (!existsSync(promptFile)) {
  console.error(`Error: Prompt file "${promptFile}" not found`);
  console.error(`Available prompts should be named like: .agent/triage.md, .agent/debug.md, etc.`);
  process.exit(1);
}
try {
  let prompt = readFileSync(promptFile, "utf-8");
  const githubEnvs = Object.entries(process.env)
    .filter(([key]) => key.startsWith("GITHUB_"))
    .sort(([a], [b]) => a.localeCompare(b));
  if (githubEnvs.length > 0) {
    const githubContext = `## GitHub Environment\n\n${githubEnvs
      .map(([key, value]) => `**${key}**: \`${value}\``)
      .join("\n")}\n\n---\n\n`;
    prompt = githubContext + prompt;
  }
  if (extraArgs.length > 0) {
    const extraArgsContext = `\n\n## Additional Arguments\n\n${extraArgs.join(" ")}\n\n---\n\n`;
    prompt = prompt + extraArgsContext;
  }
  const claudeArgs = [prompt, "--allowedTools=Edit,Write,Replace,Search", "--output-format=json"];
  if (!values.interactive) {
    claudeArgs.unshift("--print");
  }
  const { status, error } = spawnSync("claude", claudeArgs, {
    stdio: "inherit",
    encoding: "utf-8",
  });
  if (error) {
    console.error("Error running claude:", error);
    process.exit(1);
  }
  process.exit(status || 0);
} catch (error) {
  console.error(`Error reading prompt file "${promptFile}":`, error);
  process.exit(1);
}

View File

@@ -127,8 +127,11 @@ const testPlatforms = [
{ os: "linux", arch: "x64", distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "x64", profile: "asan", distro: "debian", release: "12", tier: "latest" },
{ os: "linux", arch: "aarch64", distro: "ubuntu", release: "25.04", tier: "latest" },
{ os: "linux", arch: "aarch64", distro: "ubuntu", release: "24.04", tier: "latest" },
{ os: "linux", arch: "x64", distro: "ubuntu", release: "25.04", tier: "latest" },
{ os: "linux", arch: "x64", distro: "ubuntu", release: "24.04", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "ubuntu", release: "25.04", tier: "latest" },
{ os: "linux", arch: "x64", baseline: true, distro: "ubuntu", release: "24.04", tier: "latest" },
{ os: "linux", arch: "aarch64", abi: "musl", distro: "alpine", release: "3.21", tier: "latest" },
{ os: "linux", arch: "x64", abi: "musl", distro: "alpine", release: "3.21", tier: "latest" },
@@ -566,7 +569,7 @@ function getTestBunStep(platform, options, testOptions = {}) {
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
parallelism: unifiedTests ? undefined : os === "darwin" ? 2 : 10,
timeout_in_minutes: profile === "asan" ? 45 : 30,
timeout_in_minutes: profile === "asan" || os === "windows" ? 45 : 30,
command:
os === "windows"
? `node .\\scripts\\runner.node.mjs ${args.join(" ")}`

View File

@@ -360,9 +360,8 @@ JSC_DEFINE_HOST_FUNCTION(x509CertificateConstructorConstruct, (JSGlobalObject *
auto* functionGlobalObject = defaultGlobalObject(getFunctionRealm(globalObject, newTarget.getObject()));
RETURN_IF_EXCEPTION(scope, {});
structure = InternalFunction::createSubclassStructure(
globalObject, newTarget.getObject(), functionGlobalObject->NodeVMScriptStructure());
scope.release();
structure = InternalFunction::createSubclassStructure(globalObject, newTarget.getObject(), functionGlobalObject->NodeVMScriptStructure());
RETURN_IF_EXCEPTION(scope, {});
}
return JSValue::encode(createX509Certificate(vm, globalObject, structure, arg));

View File

@@ -1,50 +1,3 @@
### What does this PR do?
<!-- **Please explain what your changes do**, example: -->
<!--
This adds a new flag --bail to bun test. When set, it will stop running tests after the first failure. This is useful for CI environments where you want to fail fast.
-->
- [ ] Documentation or TypeScript types (it's okay to leave the rest blank in this case)
- [ ] Code changes
### How did you verify your code works?
<!-- **For code changes, please include automated tests**. Feel free to uncomment the line below -->
<!-- I wrote automated tests -->
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I included a test for the new code, or existing tests cover it
- [ ] I ran my tests locally and they pass (`bun-debug test test-file-name.test`)
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1) freed and (2) only freed when it should be
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally (`bun-debug test test-file-name.test`)
-->
<!-- If new methods, getters, or setters were added to a publicly exposed class:
- [ ] I added TypeScript types for the new methods, getters, or setters
-->
<!-- If dependencies in tests changed:
- [ ] I made sure that specific versions of dependencies are used instead of ranged or tagged versions
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
- [ ] I added a test that imports the module
- [ ] I added a test that require() the module
-->

View File

@@ -1,28 +1,30 @@
name: format
name: autofix.ci
permissions:
contents: write
contents: read
on:
workflow_call:
workflow_dispatch:
pull_request:
merge_group:
push:
branches: ["main"]
env:
BUN_VERSION: "1.2.11"
LLVM_VERSION: "19.1.7"
LLVM_VERSION_MAJOR: "19"
jobs:
format:
autofix:
name: Format
runs-on: ubuntu-latest
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Configure Git
run: |
git config --global core.autocrlf true
@@ -44,25 +46,16 @@ jobs:
version: 0.14.0
- name: Zig Format
run: |
bun scripts/zig-remove-unreferenced-top-level-decls.ts src/
zig fmt src
bun scripts/sortImports src
./scripts/sort-imports.ts src
zig fmt src
- name: Commit
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: "`bun run zig-format`"
- name: Prettier Format
run: |
bun run prettier
- name: Commit
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: "`bun run prettier`"
- name: Clang Format
run: |
bun run clang-format
- name: Commit
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: "`bun run clang-format`"
- name: Ban Words
run: |
bun ./test/internal/ban-words.test.ts
- uses: autofix-ci/action@635ffb0c9798bd160680f18fd73371e355b85f27

View File

@@ -0,0 +1,102 @@
name: Update hdrhistogram
on:
schedule:
- cron: "0 4 * * 0"
workflow_dispatch:
jobs:
check-update:
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@v4
- name: Check hdrhistogram version
id: check-version
run: |
set -euo pipefail
# Extract the commit hash from the line after COMMIT
CURRENT_VERSION=$(awk '/[[:space:]]*COMMIT[[:space:]]*$/{getline; gsub(/^[[:space:]]+|[[:space:]]+$/,"",$0); print}' cmake/targets/BuildHdrHistogram.cmake)
if [ -z "$CURRENT_VERSION" ]; then
echo "Error: Could not find COMMIT line in BuildHdrHistogram.cmake"
exit 1
fi
# Validate that it looks like a git hash
if ! [[ $CURRENT_VERSION =~ ^[0-9a-f]{40}$ ]]; then
echo "Error: Invalid git hash format in BuildHdrHistogram.cmake"
echo "Found: $CURRENT_VERSION"
echo "Expected: 40 character hexadecimal string"
exit 1
fi
echo "current=$CURRENT_VERSION" >> $GITHUB_OUTPUT
LATEST_RELEASE=$(curl -sL https://api.github.com/repos/HdrHistogram/HdrHistogram_c/releases/latest)
if [ -z "$LATEST_RELEASE" ]; then
echo "Error: Failed to fetch latest release from GitHub API"
exit 1
fi
LATEST_TAG=$(echo "$LATEST_RELEASE" | jq -r '.tag_name')
if [ -z "$LATEST_TAG" ] || [ "$LATEST_TAG" = "null" ]; then
echo "Error: Could not extract tag name from GitHub API response"
exit 1
fi
LATEST_TAG_SHA=$(curl -sL "https://api.github.com/repos/HdrHistogram/HdrHistogram_c/git/refs/tags/$LATEST_TAG" | jq -r '.object.sha')
if [ -z "$LATEST_TAG_SHA" ] || [ "$LATEST_TAG_SHA" = "null" ]; then
echo "Error: Could not fetch SHA for tag $LATEST_TAG"
exit 1
fi
# Try to get commit SHA from tag object (for annotated tags)
# If it fails, assume it's a lightweight tag pointing directly to commit
LATEST_SHA=$(curl -sL "https://api.github.com/repos/HdrHistogram/HdrHistogram_c/git/tags/$LATEST_TAG_SHA" 2>/dev/null | jq -r '.object.sha // empty')
if [ -z "$LATEST_SHA" ]; then
# Lightweight tag - SHA points directly to commit
LATEST_SHA="$LATEST_TAG_SHA"
fi
if ! [[ $LATEST_SHA =~ ^[0-9a-f]{40}$ ]]; then
echo "Error: Invalid SHA format received from GitHub"
echo "Found: $LATEST_SHA"
echo "Expected: 40 character hexadecimal string"
exit 1
fi
echo "latest=$LATEST_SHA" >> $GITHUB_OUTPUT
echo "tag=$LATEST_TAG" >> $GITHUB_OUTPUT
- name: Update version if needed
if: success() && steps.check-version.outputs.current != steps.check-version.outputs.latest
run: |
set -euo pipefail
# Handle multi-line format where COMMIT and its value are on separate lines
sed -i -E '/[[:space:]]*COMMIT[[:space:]]*$/{n;s/[[:space:]]*([0-9a-f]+)[[:space:]]*$/ ${{ steps.check-version.outputs.latest }}/}' cmake/targets/BuildHdrHistogram.cmake
- name: Create Pull Request
if: success() && steps.check-version.outputs.current != steps.check-version.outputs.latest
uses: peter-evans/create-pull-request@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
add-paths: |
cmake/targets/BuildHdrHistogram.cmake
commit-message: "deps: update hdrhistogram to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update hdrhistogram to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
branch: deps/update-hdrhistogram-${{ github.run_number }}
body: |
## What does this PR do?
Updates hdrhistogram to version ${{ steps.check-version.outputs.tag }}
Compare: https://github.com/HdrHistogram/HdrHistogram_c/compare/${{ steps.check-version.outputs.current }}...${{ steps.check-version.outputs.latest }}
Auto-updated by [this workflow](https://github.com/oven-sh/bun/actions/workflows/update-hdrhistogram.yml)
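The extract/validate/update steps this workflow performs on the pinned COMMIT can be sketched as a standalone shell snippet. The CMake-style fragment and both hashes below are placeholders, not the real pinned values:

```shell
# A minimal sketch of the COMMIT-pinning logic, run against a hypothetical
# BuildHdrHistogram.cmake-style fragment (URL and hashes are placeholders).
cmake_fragment='  REPOSITORY
    https://github.com/HdrHistogram/HdrHistogram_c
  COMMIT
    0000000000000000000000000000000000000000'

# Extract: find the COMMIT keyword line, then print the next line trimmed.
current=$(printf '%s\n' "$cmake_fragment" |
  awk '/[[:space:]]*COMMIT[[:space:]]*$/{getline; gsub(/^[[:space:]]+|[[:space:]]+$/,"",$0); print}')

# Validate that it looks like a full 40-character git hash before trusting it.
printf '%s' "$current" | grep -Eq '^[0-9a-f]{40}$' || { echo "bad hash" >&2; exit 1; }

# Update: on the line after COMMIT, swap the old hash for the new one.
new_sha='1111111111111111111111111111111111111111'
updated=$(printf '%s\n' "$cmake_fragment" |
  sed -E '/[[:space:]]*COMMIT[[:space:]]*$/{n;s/[[:space:]]*([0-9a-f]+)[[:space:]]*$/    '"$new_sha"'/;}')
printf '%s\n' "$updated"
```

The keyword and its value sit on separate lines, which is why both the awk program (`getline`) and the sed script (`n`) step to the line after the match rather than editing the matched line itself.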

.github/workflows/update-highway.yml vendored Normal file
View File

@@ -0,0 +1,118 @@
name: Update highway
on:
schedule:
- cron: "0 4 * * 0"
workflow_dispatch:
jobs:
check-update:
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@v4
- name: Check highway version
id: check-version
run: |
set -euo pipefail
# Extract the commit hash from the line after COMMIT
CURRENT_VERSION=$(awk '/[[:space:]]*COMMIT[[:space:]]*$/{getline; gsub(/^[[:space:]]+|[[:space:]]+$/,"",$0); print}' cmake/targets/BuildHighway.cmake)
if [ -z "$CURRENT_VERSION" ]; then
echo "Error: Could not find COMMIT line in BuildHighway.cmake"
exit 1
fi
# Validate that it looks like a git hash
if ! [[ $CURRENT_VERSION =~ ^[0-9a-f]{40}$ ]]; then
echo "Error: Invalid git hash format in BuildHighway.cmake"
echo "Found: $CURRENT_VERSION"
echo "Expected: 40 character hexadecimal string"
exit 1
fi
echo "current=$CURRENT_VERSION" >> $GITHUB_OUTPUT
LATEST_RELEASE=$(curl -sL https://api.github.com/repos/google/highway/releases/latest)
if [ -z "$LATEST_RELEASE" ]; then
echo "Error: Failed to fetch latest release from GitHub API"
exit 1
fi
LATEST_TAG=$(echo "$LATEST_RELEASE" | jq -r '.tag_name')
if [ -z "$LATEST_TAG" ] || [ "$LATEST_TAG" = "null" ]; then
echo "Error: Could not extract tag name from GitHub API response"
exit 1
fi
TAG_REF=$(curl -sL "https://api.github.com/repos/google/highway/git/refs/tags/$LATEST_TAG")
if [ -z "$TAG_REF" ]; then
echo "Error: Could not fetch tag reference for $LATEST_TAG"
exit 1
fi
TAG_OBJECT_SHA=$(echo "$TAG_REF" | jq -r '.object.sha')
TAG_OBJECT_TYPE=$(echo "$TAG_REF" | jq -r '.object.type')
if [ -z "$TAG_OBJECT_SHA" ] || [ "$TAG_OBJECT_SHA" = "null" ]; then
echo "Error: Could not fetch SHA for tag $LATEST_TAG"
exit 1
fi
# Handle both lightweight tags (type: commit) and annotated tags (type: tag)
if [ "$TAG_OBJECT_TYPE" = "commit" ]; then
# Lightweight tag - object.sha is already the commit SHA
LATEST_SHA="$TAG_OBJECT_SHA"
elif [ "$TAG_OBJECT_TYPE" = "tag" ]; then
# Annotated tag - need to fetch the tag object to get the commit SHA
LATEST_SHA=$(curl -sL "https://api.github.com/repos/google/highway/git/tags/$TAG_OBJECT_SHA" | jq -r '.object.sha')
if [ -z "$LATEST_SHA" ] || [ "$LATEST_SHA" = "null" ]; then
echo "Error: Could not fetch commit SHA for annotated tag $LATEST_TAG @ $TAG_OBJECT_SHA"
exit 1
fi
else
echo "Error: Unexpected tag object type: $TAG_OBJECT_TYPE"
exit 1
fi
if ! [[ $LATEST_SHA =~ ^[0-9a-f]{40}$ ]]; then
echo "Error: Invalid SHA format received from GitHub"
echo "Found: $LATEST_SHA"
echo "Expected: 40 character hexadecimal string"
exit 1
fi
echo "latest=$LATEST_SHA" >> $GITHUB_OUTPUT
echo "tag=$LATEST_TAG" >> $GITHUB_OUTPUT
- name: Update version if needed
if: success() && steps.check-version.outputs.current != steps.check-version.outputs.latest
run: |
set -euo pipefail
# Handle multi-line format where COMMIT and its value are on separate lines
sed -i -E '/[[:space:]]*COMMIT[[:space:]]*$/{n;s/[[:space:]]*([0-9a-f]+)[[:space:]]*$/ ${{ steps.check-version.outputs.latest }}/}' cmake/targets/BuildHighway.cmake
- name: Create Pull Request
if: success() && steps.check-version.outputs.current != steps.check-version.outputs.latest
uses: peter-evans/create-pull-request@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
add-paths: |
cmake/targets/BuildHighway.cmake
commit-message: "deps: update highway to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update highway to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
branch: deps/update-highway-${{ github.run_number }}
body: |
## What does this PR do?
Updates highway to version ${{ steps.check-version.outputs.tag }}
Compare: https://github.com/google/highway/compare/${{ steps.check-version.outputs.current }}...${{ steps.check-version.outputs.latest }}
Auto-updated by [this workflow](https://github.com/oven-sh/bun/actions/workflows/update-highway.yml)

View File

@@ -50,15 +50,27 @@ jobs:
exit 1
fi
LATEST_TAG_SHA=$(curl -sL "https://api.github.com/repos/cloudflare/lol-html/git/refs/tags/$LATEST_TAG" | jq -r '.object.sha')
# Get the commit SHA that the tag points to
# This handles both lightweight tags (direct commit refs) and annotated tags (tag objects)
TAG_REF_RESPONSE=$(curl -sL "https://api.github.com/repos/cloudflare/lol-html/git/refs/tags/$LATEST_TAG")
LATEST_TAG_SHA=$(echo "$TAG_REF_RESPONSE" | jq -r '.object.sha')
TAG_OBJECT_TYPE=$(echo "$TAG_REF_RESPONSE" | jq -r '.object.type')
if [ -z "$LATEST_TAG_SHA" ] || [ "$LATEST_TAG_SHA" = "null" ]; then
echo "Error: Could not fetch SHA for tag $LATEST_TAG"
exit 1
fi
LATEST_SHA=$(curl -sL "https://api.github.com/repos/cloudflare/lol-html/git/tags/$LATEST_TAG_SHA" | jq -r '.object.sha')
if [ -z "$LATEST_SHA" ] || [ "$LATEST_SHA" = "null" ]; then
echo "Error: Could not fetch SHA for tag $LATEST_TAG @ $LATEST_TAG_SHA"
exit 1
if [ "$TAG_OBJECT_TYPE" = "tag" ]; then
# This is an annotated tag, we need to get the commit it points to
LATEST_SHA=$(curl -sL "https://api.github.com/repos/cloudflare/lol-html/git/tags/$LATEST_TAG_SHA" | jq -r '.object.sha')
if [ -z "$LATEST_SHA" ] || [ "$LATEST_SHA" = "null" ]; then
echo "Error: Could not fetch commit SHA for annotated tag $LATEST_TAG @ $LATEST_TAG_SHA"
exit 1
fi
else
# This is a lightweight tag pointing directly to a commit
LATEST_SHA="$LATEST_TAG_SHA"
fi
if ! [[ $LATEST_SHA =~ ^[0-9a-f]{40}$ ]]; then

View File

@@ -50,15 +50,32 @@ jobs:
exit 1
fi
LATEST_TAG_SHA=$(curl -sL "https://api.github.com/repos/litespeedtech/ls-hpack/git/refs/tags/$LATEST_TAG" | jq -r '.object.sha')
# Get the tag reference, which contains both SHA and type
TAG_REF=$(curl -sL "https://api.github.com/repos/litespeedtech/ls-hpack/git/refs/tags/$LATEST_TAG")
if [ -z "$TAG_REF" ]; then
echo "Error: Could not fetch tag reference for $LATEST_TAG"
exit 1
fi
LATEST_TAG_SHA=$(echo "$TAG_REF" | jq -r '.object.sha')
TAG_TYPE=$(echo "$TAG_REF" | jq -r '.object.type')
if [ -z "$LATEST_TAG_SHA" ] || [ "$LATEST_TAG_SHA" = "null" ]; then
echo "Error: Could not fetch SHA for tag $LATEST_TAG"
exit 1
fi
LATEST_SHA=$(curl -sL "https://api.github.com/repos/litespeedtech/ls-hpack/git/tags/$LATEST_TAG_SHA" | jq -r '.object.sha')
if [ -z "$LATEST_SHA" ] || [ "$LATEST_SHA" = "null" ]; then
echo "Error: Could not fetch SHA for tag $LATEST_TAG @ $LATEST_TAG_SHA"
exit 1
# If it's an annotated tag, we need to dereference it to get the commit SHA
# If it's a lightweight tag, the SHA already points to the commit
if [ "$TAG_TYPE" = "tag" ]; then
LATEST_SHA=$(curl -sL "https://api.github.com/repos/litespeedtech/ls-hpack/git/tags/$LATEST_TAG_SHA" | jq -r '.object.sha')
if [ -z "$LATEST_SHA" ] || [ "$LATEST_SHA" = "null" ]; then
echo "Error: Could not fetch commit SHA for annotated tag $LATEST_TAG"
exit 1
fi
else
# For lightweight tags, the SHA is already the commit SHA
LATEST_SHA="$LATEST_TAG_SHA"
fi
if ! [[ $LATEST_SHA =~ ^[0-9a-f]{40}$ ]]; then
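The branch these workflows converge on — dereference annotated tags, pass lightweight tags through — can be isolated as a small sketch. `resolve_tag_sha` is a hypothetical helper: the GitHub API lookups are replaced by plain arguments so the branching logic stands alone, and the SHA values are placeholders:

```shell
# Sketch of the tag-dereference branch shared by the update workflows.
resolve_tag_sha() {
  ref_object_type="$1"   # .object.type from git/refs/tags/<tag>
  ref_object_sha="$2"    # .object.sha from git/refs/tags/<tag>
  tag_commit_sha="$3"    # .object.sha from git/tags/<sha> (annotated tags only)
  if [ "$ref_object_type" = "tag" ]; then
    # Annotated tag: the ref points at a tag object; a second lookup
    # is needed to reach the commit it wraps.
    echo "$tag_commit_sha"
  else
    # Lightweight tag: the ref already points straight at the commit.
    echo "$ref_object_sha"
  fi
}

lightweight=$(resolve_tag_sha commit cccc1111 "")
annotated=$(resolve_tag_sha tag aaaa2222 cccc1111)
```

Both paths end at the same commit SHA; only the annotated-tag path needs the extra `git/tags/<sha>` request.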

.github/workflows/vscode-release.yml vendored Normal file
View File

@@ -0,0 +1,47 @@
name: VSCode Extension Publish
on:
workflow_dispatch:
inputs:
version:
description: "Version to publish (e.g. 0.0.25) - Check the marketplace for the latest version"
required: true
type: string
jobs:
publish:
name: "Publish to VS Code Marketplace"
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.2.18"
- name: Install dependencies (root)
run: bun install
- name: Install dependencies
run: bun install
working-directory: packages/bun-vscode
- name: Set Version
run: bun pm version ${{ github.event.inputs.version }} --no-git-tag-version --allow-same-version
working-directory: packages/bun-vscode
- name: Build (inspector protocol)
run: bun install && bun run build
working-directory: packages/bun-inspector-protocol
- name: Build (vscode extension)
run: bun run build
working-directory: packages/bun-vscode
- name: Publish
if: success()
run: bunx vsce publish
env:
VSCE_PAT: ${{ secrets.VSCODE_EXTENSION }}
working-directory: packages/bun-vscode/extension

View File

@@ -168,5 +168,5 @@
"WebKit/WebInspectorUI": true,
},
"git.detectSubmodules": false,
"bun.test.customScript": "bun-debug test"
// "bun.test.customScript": "./build/debug/bun-debug test"
}

View File

@@ -4,9 +4,9 @@ This is the Bun repository - an all-in-one JavaScript runtime & toolkit designed
### Build Commands
- **Build debug version**: `bun bd` or `bun run build:debug`
- **Build debug version**: `bun bd`
- Creates a debug build at `./build/debug/bun-debug`
- Compilation takes ~2.5 minutes
- Compilation takes ~5 minutes. Don't timeout, be patient.
- **Run tests with your debug build**: `bun bd test <test-file>`
- **CRITICAL**: Never use `bun test` directly - it won't include your changes
- **Run any command with debug build**: `bun bd <command>`

View File

@@ -160,6 +160,7 @@ In particular, these are:
- `./src/codegen/generate-jssink.ts` -- Generates `build/debug/codegen/JSSink.cpp`, `build/debug/codegen/JSSink.h` which implement various classes for interfacing with `ReadableStream`. This is internally how `FileSink`, `ArrayBufferSink`, `"type": "direct"` streams and other code related to streams works.
- `./src/codegen/generate-classes.ts` -- Generates `build/debug/codegen/ZigGeneratedClasses*`, which generates Zig & C++ bindings for JavaScriptCore classes implemented in Zig. In `**/*.classes.ts` files, we define the interfaces for various classes, methods, prototypes, getters/setters etc which the code generator reads to generate boilerplate code implementing the JavaScript objects in C++ and wiring them up to Zig
- `./src/codegen/cppbind.ts` -- Generates automatic Zig bindings for C++ functions marked with `[[ZIG_EXPORT]]` attributes.
- `./src/codegen/bundle-modules.ts` -- Bundles built-in modules like `node:fs`, `bun:ffi` into files we can include in the final binary. In development, these can be reloaded without rebuilding Zig (you still need to run `bun run build`, but it re-reads the transpiled files from disk afterwards). In release builds, these are embedded into the binary.
- `./src/codegen/bundle-functions.ts` -- Bundles globally-accessible functions implemented in JavaScript/TypeScript like `ReadableStream`, `WritableStream`, and a handful more. These are used similarly to the builtin modules, but the output more closely aligns with what WebKit/Safari does for Safari's built-in functions so that we can copy-paste the implementations from WebKit as a starting point.

LATEST
View File

@@ -1 +1 @@
1.2.18
1.2.19

Binary file not shown.

View File

@@ -28,9 +28,7 @@ if (+(existingUsers?.[0]?.count ?? existingUsers?.count) < 100) {
}));
// Insert all users
await sql`
INSERT INTO users_bun_bench (first_name, last_name, email, dob) ${sql(users)}
`;
await sql`INSERT INTO users_bun_bench ${sql(users)}`;
}
const type = isBun ? "Bun.sql" : "postgres";

View File

@@ -9,6 +9,6 @@
"typescript": "^5.0.0"
},
"dependencies": {
"postgres": "^3.4.5"
"postgres": "^3.4.7"
}
}

View File

@@ -752,6 +752,13 @@ fn addInternalImports(b: *Build, mod: *Module, opts: *BunBuildOptions) void {
});
}
}
{
const cppImport = b.createModule(.{
.root_source_file = (std.Build.LazyPath{ .cwd_relative = opts.codegen_path }).path(b, "cpp.zig"),
});
mod.addImport("cpp", cppImport);
cppImport.addImport("bun", mod);
}
inline for (.{
.{ .import = "completions-bash", .file = b.path("completions/bun.bash") },
.{ .import = "completions-zsh", .file = b.path("completions/bun.zsh") },

bun.lock
View File

@@ -4,6 +4,9 @@
"": {
"name": "bun",
"devDependencies": {
"@lezer/common": "^1.2.3",
"@lezer/cpp": "^1.1.3",
"bun-tracestrings": "github:oven-sh/bun.report#912ca63e26c51429d3e6799aa2a6ab079b188fd8",
"esbuild": "^0.21.4",
"mitata": "^0.1.11",
"peechy": "0.4.34",
@@ -87,41 +90,191 @@
"@esbuild/win32-x64": ["@esbuild/win32-x64@0.21.5", "", { "os": "win32", "cpu": "x64" }, "sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw=="],
"@lezer/common": ["@lezer/common@1.2.3", "", {}, "sha512-w7ojc8ejBqr2REPsWxJjrMFsA/ysDCFICn8zEOR9mrqzOu2amhITYuLD8ag6XZf0CFXDrhKqw7+tW8cX66NaDA=="],
"@lezer/cpp": ["@lezer/cpp@1.1.3", "", { "dependencies": { "@lezer/common": "^1.2.0", "@lezer/highlight": "^1.0.0", "@lezer/lr": "^1.0.0" } }, "sha512-ykYvuFQKGsRi6IcE+/hCSGUhb/I4WPjd3ELhEblm2wS2cOznDFzO+ubK2c+ioysOnlZ3EduV+MVQFCPzAIoY3w=="],
"@lezer/highlight": ["@lezer/highlight@1.2.1", "", { "dependencies": { "@lezer/common": "^1.0.0" } }, "sha512-Z5duk4RN/3zuVO7Jq0pGLJ3qynpxUVsh7IbUbGj88+uV2ApSAn6kWg2au3iJb+0Zi7kKtqffIESgNcRXWZWmSA=="],
"@lezer/lr": ["@lezer/lr@1.4.2", "", { "dependencies": { "@lezer/common": "^1.0.0" } }, "sha512-pu0K1jCIdnQ12aWNaAVU5bzi7Bd1w54J3ECgANPmYLtQKP0HBj2cE/5coBD66MT10xbtIuUr7tg0Shbsvk0mDA=="],
"@octokit/app": ["@octokit/app@14.1.0", "", { "dependencies": { "@octokit/auth-app": "^6.0.0", "@octokit/auth-unauthenticated": "^5.0.0", "@octokit/core": "^5.0.0", "@octokit/oauth-app": "^6.0.0", "@octokit/plugin-paginate-rest": "^9.0.0", "@octokit/types": "^12.0.0", "@octokit/webhooks": "^12.0.4" } }, "sha512-g3uEsGOQCBl1+W1rgfwoRFUIR6PtvB2T1E4RpygeUU5LrLvlOqcxrt5lfykIeRpUPpupreGJUYl70fqMDXdTpw=="],
"@octokit/auth-app": ["@octokit/auth-app@6.1.4", "", { "dependencies": { "@octokit/auth-oauth-app": "^7.1.0", "@octokit/auth-oauth-user": "^4.1.0", "@octokit/request": "^8.3.1", "@octokit/request-error": "^5.1.0", "@octokit/types": "^13.1.0", "deprecation": "^2.3.1", "lru-cache": "npm:@wolfy1339/lru-cache@^11.0.2-patch.1", "universal-github-app-jwt": "^1.1.2", "universal-user-agent": "^6.0.0" } }, "sha512-QkXkSOHZK4dA5oUqY5Dk3S+5pN2s1igPjEASNQV8/vgJgW034fQWR16u7VsNOK/EljA00eyjYF5mWNxWKWhHRQ=="],
"@octokit/auth-oauth-app": ["@octokit/auth-oauth-app@7.1.0", "", { "dependencies": { "@octokit/auth-oauth-device": "^6.1.0", "@octokit/auth-oauth-user": "^4.1.0", "@octokit/request": "^8.3.1", "@octokit/types": "^13.0.0", "@types/btoa-lite": "^1.0.0", "btoa-lite": "^1.0.0", "universal-user-agent": "^6.0.0" } }, "sha512-w+SyJN/b0l/HEb4EOPRudo7uUOSW51jcK1jwLa+4r7PA8FPFpoxEnHBHMITqCsc/3Vo2qqFjgQfz/xUUvsSQnA=="],
"@octokit/auth-oauth-device": ["@octokit/auth-oauth-device@6.1.0", "", { "dependencies": { "@octokit/oauth-methods": "^4.1.0", "@octokit/request": "^8.3.1", "@octokit/types": "^13.0.0", "universal-user-agent": "^6.0.0" } }, "sha512-FNQ7cb8kASufd6Ej4gnJ3f1QB5vJitkoV1O0/g6e6lUsQ7+VsSNRHRmFScN2tV4IgKA12frrr/cegUs0t+0/Lw=="],
"@octokit/auth-oauth-user": ["@octokit/auth-oauth-user@4.1.0", "", { "dependencies": { "@octokit/auth-oauth-device": "^6.1.0", "@octokit/oauth-methods": "^4.1.0", "@octokit/request": "^8.3.1", "@octokit/types": "^13.0.0", "btoa-lite": "^1.0.0", "universal-user-agent": "^6.0.0" } }, "sha512-FrEp8mtFuS/BrJyjpur+4GARteUCrPeR/tZJzD8YourzoVhRics7u7we/aDcKv+yywRNwNi/P4fRi631rG/OyQ=="],
"@octokit/auth-token": ["@octokit/auth-token@4.0.0", "", {}, "sha512-tY/msAuJo6ARbK6SPIxZrPBms3xPbfwBrulZe0Wtr/DIY9lje2HeV1uoebShn6mx7SjCHif6EjMvoREj+gZ+SA=="],
"@octokit/auth-unauthenticated": ["@octokit/auth-unauthenticated@5.0.1", "", { "dependencies": { "@octokit/request-error": "^5.0.0", "@octokit/types": "^12.0.0" } }, "sha512-oxeWzmBFxWd+XolxKTc4zr+h3mt+yofn4r7OfoIkR/Cj/o70eEGmPsFbueyJE2iBAGpjgTnEOKM3pnuEGVmiqg=="],
"@octokit/core": ["@octokit/core@5.2.2", "", { "dependencies": { "@octokit/auth-token": "^4.0.0", "@octokit/graphql": "^7.1.0", "@octokit/request": "^8.4.1", "@octokit/request-error": "^5.1.1", "@octokit/types": "^13.0.0", "before-after-hook": "^2.2.0", "universal-user-agent": "^6.0.0" } }, "sha512-/g2d4sW9nUDJOMz3mabVQvOGhVa4e/BN/Um7yca9Bb2XTzPPnfTWHWQg+IsEYO7M3Vx+EXvaM/I2pJWIMun1bg=="],
"@octokit/endpoint": ["@octokit/endpoint@9.0.6", "", { "dependencies": { "@octokit/types": "^13.1.0", "universal-user-agent": "^6.0.0" } }, "sha512-H1fNTMA57HbkFESSt3Y9+FBICv+0jFceJFPWDePYlR/iMGrwM5ph+Dd4XRQs+8X+PUFURLQgX9ChPfhJ/1uNQw=="],
"@octokit/graphql": ["@octokit/graphql@7.1.1", "", { "dependencies": { "@octokit/request": "^8.4.1", "@octokit/types": "^13.0.0", "universal-user-agent": "^6.0.0" } }, "sha512-3mkDltSfcDUoa176nlGoA32RGjeWjl3K7F/BwHwRMJUW/IteSa4bnSV8p2ThNkcIcZU2umkZWxwETSSCJf2Q7g=="],
"@octokit/oauth-app": ["@octokit/oauth-app@6.1.0", "", { "dependencies": { "@octokit/auth-oauth-app": "^7.0.0", "@octokit/auth-oauth-user": "^4.0.0", "@octokit/auth-unauthenticated": "^5.0.0", "@octokit/core": "^5.0.0", "@octokit/oauth-authorization-url": "^6.0.2", "@octokit/oauth-methods": "^4.0.0", "@types/aws-lambda": "^8.10.83", "universal-user-agent": "^6.0.0" } }, "sha512-nIn/8eUJ/BKUVzxUXd5vpzl1rwaVxMyYbQkNZjHrF7Vk/yu98/YDF/N2KeWO7uZ0g3b5EyiFXFkZI8rJ+DH1/g=="],
"@octokit/oauth-authorization-url": ["@octokit/oauth-authorization-url@6.0.2", "", {}, "sha512-CdoJukjXXxqLNK4y/VOiVzQVjibqoj/xHgInekviUJV73y/BSIcwvJ/4aNHPBPKcPWFnd4/lO9uqRV65jXhcLA=="],
"@octokit/oauth-methods": ["@octokit/oauth-methods@4.1.0", "", { "dependencies": { "@octokit/oauth-authorization-url": "^6.0.2", "@octokit/request": "^8.3.1", "@octokit/request-error": "^5.1.0", "@octokit/types": "^13.0.0", "btoa-lite": "^1.0.0" } }, "sha512-4tuKnCRecJ6CG6gr0XcEXdZtkTDbfbnD5oaHBmLERTjTMZNi2CbfEHZxPU41xXLDG4DfKf+sonu00zvKI9NSbw=="],
"@octokit/openapi-types": ["@octokit/openapi-types@24.2.0", "", {}, "sha512-9sIH3nSUttelJSXUrmGzl7QUBFul0/mB8HRYl3fOlgHbIWG+WnYDXU3v/2zMtAvuzZ/ed00Ei6on975FhBfzrg=="],
"@octokit/plugin-paginate-graphql": ["@octokit/plugin-paginate-graphql@4.0.1", "", { "peerDependencies": { "@octokit/core": ">=5" } }, "sha512-R8ZQNmrIKKpHWC6V2gum4x9LG2qF1RxRjo27gjQcG3j+vf2tLsEfE7I/wRWEPzYMaenr1M+qDAtNcwZve1ce1A=="],
"@octokit/plugin-paginate-rest": ["@octokit/plugin-paginate-rest@11.4.4-cjs.2", "", { "dependencies": { "@octokit/types": "^13.7.0" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-2dK6z8fhs8lla5PaOTgqfCGBxgAv/le+EhPs27KklPhm1bKObpu6lXzwfUEQ16ajXzqNrKMujsFyo9K2eaoISw=="],
"@octokit/plugin-rest-endpoint-methods": ["@octokit/plugin-rest-endpoint-methods@13.3.2-cjs.1", "", { "dependencies": { "@octokit/types": "^13.8.0" }, "peerDependencies": { "@octokit/core": "^5" } }, "sha512-VUjIjOOvF2oELQmiFpWA1aOPdawpyaCUqcEBc/UOUnj3Xp6DJGrJ1+bjUIIDzdHjnFNO6q57ODMfdEZnoBkCwQ=="],
"@octokit/plugin-retry": ["@octokit/plugin-retry@6.1.0", "", { "dependencies": { "@octokit/request-error": "^5.0.0", "@octokit/types": "^13.0.0", "bottleneck": "^2.15.3" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-WrO3bvq4E1Xh1r2mT9w6SDFg01gFmP81nIG77+p/MqW1JeXXgL++6umim3t6x0Zj5pZm3rXAN+0HEjmmdhIRig=="],
"@octokit/plugin-throttling": ["@octokit/plugin-throttling@8.2.0", "", { "dependencies": { "@octokit/types": "^12.2.0", "bottleneck": "^2.15.3" }, "peerDependencies": { "@octokit/core": "^5.0.0" } }, "sha512-nOpWtLayKFpgqmgD0y3GqXafMFuKcA4tRPZIfu7BArd2lEZeb1988nhWhwx4aZWmjDmUfdgVf7W+Tt4AmvRmMQ=="],
"@octokit/request": ["@octokit/request@8.4.1", "", { "dependencies": { "@octokit/endpoint": "^9.0.6", "@octokit/request-error": "^5.1.1", "@octokit/types": "^13.1.0", "universal-user-agent": "^6.0.0" } }, "sha512-qnB2+SY3hkCmBxZsR/MPCybNmbJe4KAlfWErXq+rBKkQJlbjdJeS85VI9r8UqeLYLvnAenU8Q1okM/0MBsAGXw=="],
"@octokit/request-error": ["@octokit/request-error@5.1.1", "", { "dependencies": { "@octokit/types": "^13.1.0", "deprecation": "^2.0.0", "once": "^1.4.0" } }, "sha512-v9iyEQJH6ZntoENr9/yXxjuezh4My67CBSu9r6Ve/05Iu5gNgnisNWOsoJHTP6k0Rr0+HQIpnH+kyammu90q/g=="],
"@octokit/types": ["@octokit/types@13.10.0", "", { "dependencies": { "@octokit/openapi-types": "^24.2.0" } }, "sha512-ifLaO34EbbPj0Xgro4G5lP5asESjwHracYJvVaPIyXMuiuXLlhic3S47cBdTb+jfODkTE5YtGCLt3Ay3+J97sA=="],
"@octokit/webhooks": ["@octokit/webhooks@12.3.2", "", { "dependencies": { "@octokit/request-error": "^5.0.0", "@octokit/webhooks-methods": "^4.1.0", "@octokit/webhooks-types": "7.6.1", "aggregate-error": "^3.1.0" } }, "sha512-exj1MzVXoP7xnAcAB3jZ97pTvVPkQF9y6GA/dvYC47HV7vLv+24XRS6b/v/XnyikpEuvMhugEXdGtAlU086WkQ=="],
"@octokit/webhooks-methods": ["@octokit/webhooks-methods@5.1.1", "", {}, "sha512-NGlEHZDseJTCj8TMMFehzwa9g7On4KJMPVHDSrHxCQumL6uSQR8wIkP/qesv52fXqV1BPf4pTxwtS31ldAt9Xg=="],
"@octokit/webhooks-types": ["@octokit/webhooks-types@7.6.1", "", {}, "sha512-S8u2cJzklBC0FgTwWVLaM8tMrDuDMVE4xiTK4EYXM9GntyvrdbSoxqDQa+Fh57CCNApyIpyeqPhhFEmHPfrXgw=="],
"@sentry/types": ["@sentry/types@7.120.3", "", {}, "sha512-C4z+3kGWNFJ303FC+FxAd4KkHvxpNFYAFN8iMIgBwJdpIl25KZ8Q/VdGn0MLLUEHNLvjob0+wvwlcRBBNLXOow=="],
"@types/aws-lambda": ["@types/aws-lambda@8.10.152", "", {}, "sha512-soT/c2gYBnT5ygwiHPmd9a1bftj462NWVk2tKCc1PYHSIacB2UwbTS2zYG4jzag1mRDuzg/OjtxQjQ2NKRB6Rw=="],
"@types/btoa-lite": ["@types/btoa-lite@1.0.2", "", {}, "sha512-ZYbcE2x7yrvNFJiU7xJGrpF/ihpkM7zKgw8bha3LNJSesvTtUNxbpzaT7WXBIryf6jovisrxTBvymxMeLLj1Mg=="],
"@types/bun": ["@types/bun@workspace:packages/@types/bun"],
"@types/node": ["@types/node@22.15.18", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-v1DKRfUdyW+jJhZNEI1PYy29S2YRxMV5AOO/x/SjKmW0acCIOqmbj6Haf9eHAhsPmrhlHSxEhv/1WszcLWV4cg=="],
"@types/jsonwebtoken": ["@types/jsonwebtoken@9.0.10", "", { "dependencies": { "@types/ms": "*", "@types/node": "*" } }, "sha512-asx5hIG9Qmf/1oStypjanR7iKTv0gXQ1Ov/jfrX6kS/EO0OFni8orbmGCn0672NHR3kXHwpAwR+B368ZGN/2rA=="],
"@types/ms": ["@types/ms@2.1.0", "", {}, "sha512-GsCCIZDE/p3i96vtEqx+7dBUGXrc7zeSK3wwPHIaRThS+9OhWIXRqzs4d6k1SVU8g91DrNRWxWUGhp5KXQb2VA=="],
"@types/node": ["@types/node@24.1.0", "", { "dependencies": { "undici-types": "~7.8.0" } }, "sha512-ut5FthK5moxFKH2T1CUOC6ctR67rQRvvHdFLCD2Ql6KXmMuCrjsSsRI9UsLCm9M18BMwClv4pn327UvB7eeO1w=="],
"@types/react": ["@types/react@19.1.8", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-AwAfQ2Wa5bCx9WP8nZL2uMZWod7J7/JSplxbTmBQ5ms6QpqNYm672H0Vu9ZVKVngQ+ii4R/byguVEUZQyeg44g=="],
"aggregate-error": ["aggregate-error@3.1.0", "", { "dependencies": { "clean-stack": "^2.0.0", "indent-string": "^4.0.0" } }, "sha512-4I7Td01quW/RpocfNayFdFVk1qSuoh0E7JrbRJ16nH01HhKFQ88INq9Sd+nd72zqRySlr9BmDA8xlEJ6vJMrYA=="],
"before-after-hook": ["before-after-hook@2.2.3", "", {}, "sha512-NzUnlZexiaH/46WDhANlyR2bXRopNg4F/zuSA3OpZnllCUgRaOF2znDioDWrmbNVsuZk6l9pMquQB38cfBZwkQ=="],
"bottleneck": ["bottleneck@2.19.5", "", {}, "sha512-VHiNCbI1lKdl44tGrhNfU3lup0Tj/ZBMJB5/2ZbNXRCPuRCO7ed2mgcK4r17y+KB2EfuYuRaVlwNbAeaWGSpbw=="],
"btoa-lite": ["btoa-lite@1.0.0", "", {}, "sha512-gvW7InbIyF8AicrqWoptdW08pUxuhq8BEgowNajy9RhiE86fmGAGl+bLKo6oB8QP0CkqHLowfN0oJdKC/J6LbA=="],
"buffer-equal-constant-time": ["buffer-equal-constant-time@1.0.1", "", {}, "sha512-zRpUiDwd/xk6ADqPMATG8vc9VPrkck7T07OIx0gnjmJAnHnTVXNQG3vfvWNuiZIkwu9KrKdA1iJKfsfTVxE6NA=="],
"bun-tracestrings": ["bun-tracestrings@github:oven-sh/bun.report#912ca63", { "dependencies": { "@octokit/webhooks-methods": "^5.1.0", "@sentry/types": "^7.112.2", "@types/bun": "^1.2.6", "html-minifier": "^4.0.0", "lightningcss": "^1.24.1", "marked": "^12.0.1", "octokit": "^3.2.0", "prettier": "^3.2.5", "typescript": "^5.0.0" }, "bin": { "ci-remap-server": "./bin/ci-remap-server.ts" } }, "oven-sh-bun.report-912ca63"],
"bun-types": ["bun-types@workspace:packages/bun-types"],
"camel-case": ["camel-case@4.1.2", "", { "dependencies": { "pascal-case": "^3.1.2", "tslib": "^2.0.3" } }, "sha512-gxGWBrTT1JuMx6R+o5PTXMmUnhnVzLQ9SNutD4YqKtI6ap897t3tKECYla6gCWEkplXnlNybEkZg9GEGxKFCgw=="],
"camel-case": ["camel-case@3.0.0", "", { "dependencies": { "no-case": "^2.2.0", "upper-case": "^1.1.1" } }, "sha512-+MbKztAYHXPr1jNTSKQF52VpcFjwY5RkR7fxksV8Doo4KAYc5Fl4UJRgthBbTmEx8C54DqahhbLJkDwjI3PI/w=="],
"capital-case": ["capital-case@1.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3", "upper-case-first": "^2.0.2" } }, "sha512-ds37W8CytHgwnhGGTi88pcPyR15qoNkOpYwmMMfnWqqWgESapLqvDx6huFjQ5vqWSn2Z06173XNA7LtMOeUh1A=="],
"change-case": ["change-case@4.1.2", "", { "dependencies": { "camel-case": "^4.1.2", "capital-case": "^1.0.4", "constant-case": "^3.0.4", "dot-case": "^3.0.4", "header-case": "^2.0.4", "no-case": "^3.0.4", "param-case": "^3.0.4", "pascal-case": "^3.1.2", "path-case": "^3.0.4", "sentence-case": "^3.0.4", "snake-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-bSxY2ws9OtviILG1EiY5K7NNxkqg/JnRnFxLtKQ96JaviiIxi7djMrSd0ECT9AC+lttClmYwKw53BWpOMblo7A=="],
"clean-css": ["clean-css@4.2.4", "", { "dependencies": { "source-map": "~0.6.0" } }, "sha512-EJUDT7nDVFDvaQgAo2G/PJvxmp1o/c6iXLbswsBbUFXi1Nr+AjA2cKmfbKDMjMvzEe75g3P6JkaDDAKk96A85A=="],
"clean-stack": ["clean-stack@2.2.0", "", {}, "sha512-4diC9HaTE+KRAMWhDhrGOECgWZxoevMc5TlkObMqNSsVU62PYzXZ/SMTjzyGAFF1YusgxGcSWTEXBhp0CPwQ1A=="],
"commander": ["commander@2.20.3", "", {}, "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ=="],
"constant-case": ["constant-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3", "upper-case": "^2.0.2" } }, "sha512-I2hSBi7Vvs7BEuJDr5dDHfzb/Ruj3FyvFyh7KLilAjNQw3Be+xgqUBA2W6scVEcL0hL1dwPRtIqEPVUCKkSsyQ=="],
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
"deprecation": ["deprecation@2.3.1", "", {}, "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="],
"detect-libc": ["detect-libc@2.0.4", "", {}, "sha512-3UDv+G9CsCKO1WKMGw9fwq/SWJYbI0c5Y7LU1AXYoDdbhE2AHQ6N6Nb34sG8Fj7T5APy8qXDCKuuIHd1BR0tVA=="],
"dot-case": ["dot-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-Kv5nKlh6yRrdrGvxeJ2e5y2eRUpkUosIW4A2AS38zwSz27zu7ufDwQPi5Jhs3XAlGNetl3bmnGhQsMtkKJnj3w=="],
"ecdsa-sig-formatter": ["ecdsa-sig-formatter@1.0.11", "", { "dependencies": { "safe-buffer": "^5.0.1" } }, "sha512-nagl3RYrbNv6kQkeJIpt6NJZy8twLB/2vtz6yN9Z4vRKHN4/QZJIEbqohALSgwKdnksuY3k5Addp5lg8sVoVcQ=="],
"esbuild": ["esbuild@0.21.5", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.21.5", "@esbuild/android-arm": "0.21.5", "@esbuild/android-arm64": "0.21.5", "@esbuild/android-x64": "0.21.5", "@esbuild/darwin-arm64": "0.21.5", "@esbuild/darwin-x64": "0.21.5", "@esbuild/freebsd-arm64": "0.21.5", "@esbuild/freebsd-x64": "0.21.5", "@esbuild/linux-arm": "0.21.5", "@esbuild/linux-arm64": "0.21.5", "@esbuild/linux-ia32": "0.21.5", "@esbuild/linux-loong64": "0.21.5", "@esbuild/linux-mips64el": "0.21.5", "@esbuild/linux-ppc64": "0.21.5", "@esbuild/linux-riscv64": "0.21.5", "@esbuild/linux-s390x": "0.21.5", "@esbuild/linux-x64": "0.21.5", "@esbuild/netbsd-x64": "0.21.5", "@esbuild/openbsd-x64": "0.21.5", "@esbuild/sunos-x64": "0.21.5", "@esbuild/win32-arm64": "0.21.5", "@esbuild/win32-ia32": "0.21.5", "@esbuild/win32-x64": "0.21.5" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-mg3OPMV4hXywwpoDxu3Qda5xCKQi+vCTZq8S9J/EpkhB2HzKXq4SNFZE3+NK93JYxc8VMSep+lOUSC/RVKaBqw=="],
"he": ["he@1.2.0", "", { "bin": { "he": "bin/he" } }, "sha512-F/1DnUGPopORZi0ni+CvrCgHQ5FyEAHRLSApuYWMmrbSwoN2Mn/7k+Gl38gJnR7yyDZk6WLXwiGod1JOWNDKGw=="],
"header-case": ["header-case@2.0.4", "", { "dependencies": { "capital-case": "^1.0.4", "tslib": "^2.0.3" } }, "sha512-H/vuk5TEEVZwrR0lp2zed9OCo1uAILMlx0JEMgC26rzyJJ3N1v6XkwHHXJQdR2doSjcGPM6OKPYoJgf0plJ11Q=="],
"html-minifier": ["html-minifier@4.0.0", "", { "dependencies": { "camel-case": "^3.0.0", "clean-css": "^4.2.1", "commander": "^2.19.0", "he": "^1.2.0", "param-case": "^2.1.1", "relateurl": "^0.2.7", "uglify-js": "^3.5.1" }, "bin": { "html-minifier": "./cli.js" } }, "sha512-aoGxanpFPLg7MkIl/DDFYtb0iWz7jMFGqFhvEDZga6/4QTjneiD8I/NXL1x5aaoCp7FSIT6h/OhykDdPsbtMig=="],
"indent-string": ["indent-string@4.0.0", "", {}, "sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg=="],
"js-tokens": ["js-tokens@4.0.0", "", {}, "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ=="],
"jsonwebtoken": ["jsonwebtoken@9.0.2", "", { "dependencies": { "jws": "^3.2.2", "lodash.includes": "^4.3.0", "lodash.isboolean": "^3.0.3", "lodash.isinteger": "^4.0.4", "lodash.isnumber": "^3.0.3", "lodash.isplainobject": "^4.0.6", "lodash.isstring": "^4.0.1", "lodash.once": "^4.0.0", "ms": "^2.1.1", "semver": "^7.5.4" } }, "sha512-PRp66vJ865SSqOlgqS8hujT5U4AOgMfhrwYIuIhfKaoSCZcirrmASQr8CX7cUg+RMih+hgznrjp99o+W4pJLHQ=="],
"jwa": ["jwa@1.4.2", "", { "dependencies": { "buffer-equal-constant-time": "^1.0.1", "ecdsa-sig-formatter": "1.0.11", "safe-buffer": "^5.0.1" } }, "sha512-eeH5JO+21J78qMvTIDdBXidBd6nG2kZjg5Ohz/1fpa28Z4CcsWUzJ1ZZyFq/3z3N17aZy+ZuBoHljASbL1WfOw=="],
"jws": ["jws@3.2.2", "", { "dependencies": { "jwa": "^1.4.1", "safe-buffer": "^5.0.1" } }, "sha512-YHlZCB6lMTllWDtSPHz/ZXTsi8S00usEV6v1tjq8tOUZzw7DpSDWVXjXDre6ed1w/pd495ODpHZYSdkRTsa0HA=="],
"lightningcss": ["lightningcss@1.30.1", "", { "dependencies": { "detect-libc": "^2.0.3" }, "optionalDependencies": { "lightningcss-darwin-arm64": "1.30.1", "lightningcss-darwin-x64": "1.30.1", "lightningcss-freebsd-x64": "1.30.1", "lightningcss-linux-arm-gnueabihf": "1.30.1", "lightningcss-linux-arm64-gnu": "1.30.1", "lightningcss-linux-arm64-musl": "1.30.1", "lightningcss-linux-x64-gnu": "1.30.1", "lightningcss-linux-x64-musl": "1.30.1", "lightningcss-win32-arm64-msvc": "1.30.1", "lightningcss-win32-x64-msvc": "1.30.1" } }, "sha512-xi6IyHML+c9+Q3W0S4fCQJOym42pyurFiJUHEcEyHS0CeKzia4yZDEsLlqOFykxOdHpNy0NmvVO31vcSqAxJCg=="],
"lightningcss-darwin-arm64": ["lightningcss-darwin-arm64@1.30.1", "", { "os": "darwin", "cpu": "arm64" }, "sha512-c8JK7hyE65X1MHMN+Viq9n11RRC7hgin3HhYKhrMyaXflk5GVplZ60IxyoVtzILeKr+xAJwg6zK6sjTBJ0FKYQ=="],
"lightningcss-darwin-x64": ["lightningcss-darwin-x64@1.30.1", "", { "os": "darwin", "cpu": "x64" }, "sha512-k1EvjakfumAQoTfcXUcHQZhSpLlkAuEkdMBsI/ivWw9hL+7FtilQc0Cy3hrx0AAQrVtQAbMI7YjCgYgvn37PzA=="],
"lightningcss-freebsd-x64": ["lightningcss-freebsd-x64@1.30.1", "", { "os": "freebsd", "cpu": "x64" }, "sha512-kmW6UGCGg2PcyUE59K5r0kWfKPAVy4SltVeut+umLCFoJ53RdCUWxcRDzO1eTaxf/7Q2H7LTquFHPL5R+Gjyig=="],
"lightningcss-linux-arm-gnueabihf": ["lightningcss-linux-arm-gnueabihf@1.30.1", "", { "os": "linux", "cpu": "arm" }, "sha512-MjxUShl1v8pit+6D/zSPq9S9dQ2NPFSQwGvxBCYaBYLPlCWuPh9/t1MRS8iUaR8i+a6w7aps+B4N0S1TYP/R+Q=="],
"lightningcss-linux-arm64-gnu": ["lightningcss-linux-arm64-gnu@1.30.1", "", { "os": "linux", "cpu": "arm64" }, "sha512-gB72maP8rmrKsnKYy8XUuXi/4OctJiuQjcuqWNlJQ6jZiWqtPvqFziskH3hnajfvKB27ynbVCucKSm2rkQp4Bw=="],
"lightningcss-linux-arm64-musl": ["lightningcss-linux-arm64-musl@1.30.1", "", { "os": "linux", "cpu": "arm64" }, "sha512-jmUQVx4331m6LIX+0wUhBbmMX7TCfjF5FoOH6SD1CttzuYlGNVpA7QnrmLxrsub43ClTINfGSYyHe2HWeLl5CQ=="],
"lightningcss-linux-x64-gnu": ["lightningcss-linux-x64-gnu@1.30.1", "", { "os": "linux", "cpu": "x64" }, "sha512-piWx3z4wN8J8z3+O5kO74+yr6ze/dKmPnI7vLqfSqI8bccaTGY5xiSGVIJBDd5K5BHlvVLpUB3S2YCfelyJ1bw=="],
"lightningcss-linux-x64-musl": ["lightningcss-linux-x64-musl@1.30.1", "", { "os": "linux", "cpu": "x64" }, "sha512-rRomAK7eIkL+tHY0YPxbc5Dra2gXlI63HL+v1Pdi1a3sC+tJTcFrHX+E86sulgAXeI7rSzDYhPSeHHjqFhqfeQ=="],
"lightningcss-win32-arm64-msvc": ["lightningcss-win32-arm64-msvc@1.30.1", "", { "os": "win32", "cpu": "arm64" }, "sha512-mSL4rqPi4iXq5YVqzSsJgMVFENoa4nGTT/GjO2c0Yl9OuQfPsIfncvLrEW6RbbB24WtZ3xP/2CCmI3tNkNV4oA=="],
"lightningcss-win32-x64-msvc": ["lightningcss-win32-x64-msvc@1.30.1", "", { "os": "win32", "cpu": "x64" }, "sha512-PVqXh48wh4T53F/1CCu8PIPCxLzWyCnn/9T5W1Jpmdy5h9Cwd+0YQS6/LwhHXSafuc61/xg9Lv5OrCby6a++jg=="],
"lodash.includes": ["lodash.includes@4.3.0", "", {}, "sha512-W3Bx6mdkRTGtlJISOvVD/lbqjTlPPUDTMnlXZFnVwi9NKJ6tiAk6LVdlhZMm17VZisqhKcgzpO5Wz91PCt5b0w=="],
"lodash.isboolean": ["lodash.isboolean@3.0.3", "", {}, "sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg=="],
"lodash.isinteger": ["lodash.isinteger@4.0.4", "", {}, "sha512-DBwtEWN2caHQ9/imiNeEA5ys1JoRtRfY3d7V9wkqtbycnAmTvRRmbHKDV4a0EYc678/dia0jrte4tjYwVBaZUA=="],
"lodash.isnumber": ["lodash.isnumber@3.0.3", "", {}, "sha512-QYqzpfwO3/CWf3XP+Z+tkQsfaLL/EnUlXWVkIk5FUPc4sBdTehEqZONuyRt2P67PXAk+NXmTBcc97zw9t1FQrw=="],
"lodash.isplainobject": ["lodash.isplainobject@4.0.6", "", {}, "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA=="],
"lodash.isstring": ["lodash.isstring@4.0.1", "", {}, "sha512-0wJxfxH1wgO3GrbuP+dTTk7op+6L41QCXbGINEmD+ny/G/eCqGzxyCsh7159S+mgDDcoarnBw6PC1PS5+wUGgw=="],
"lodash.once": ["lodash.once@4.1.1", "", {}, "sha512-Sb487aTOCr9drQVL8pIxOzVhafOjZN9UU54hiN8PU3uAiSV7lx1yYNpbNmex2PK6dSJoNTSJUUswT651yww3Mg=="],
"loose-envify": ["loose-envify@1.4.0", "", { "dependencies": { "js-tokens": "^3.0.0 || ^4.0.0" }, "bin": { "loose-envify": "cli.js" } }, "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q=="],
"lower-case": ["lower-case@2.0.2", "", { "dependencies": { "tslib": "^2.0.3" } }, "sha512-7fm3l3NAF9WfN6W3JOmf5drwpVqX78JtoGJ3A6W0a6ZnldM41w2fV5D490psKFTpMds8TJse/eHLFFsNHHjHgg=="],
"lru-cache": ["@wolfy1339/lru-cache@11.0.2-patch.1", "", {}, "sha512-BgYZfL2ADCXKOw2wJtkM3slhHotawWkgIRRxq4wEybnZQPjvAp71SPX35xepMykTw8gXlzWcWPTY31hlbnRsDA=="],
"marked": ["marked@12.0.2", "", { "bin": { "marked": "bin/marked.js" } }, "sha512-qXUm7e/YKFoqFPYPa3Ukg9xlI5cyAtGmyEIzMfW//m6kXwCy2Ps9DYf5ioijFKQ8qyuscrHoY04iJGctu2Kg0Q=="],
"mitata": ["mitata@0.1.14", "", {}, "sha512-8kRs0l636eT4jj68PFXOR2D5xl4m56T478g16SzUPOYgkzQU+xaw62guAQxzBPm+SXb15GQi1cCpDxJfkr4CSA=="],
"ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
"no-case": ["no-case@3.0.4", "", { "dependencies": { "lower-case": "^2.0.2", "tslib": "^2.0.3" } }, "sha512-fgAN3jGAh+RoxUGZHTSOLJIqUc2wmoBwGR4tbpNAKmmovFoWq0OdRkb0VkldReO2a2iBT/OEulG9XSUc10r3zg=="],
"param-case": ["param-case@3.0.4", "", { "dependencies": { "dot-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-RXlj7zCYokReqWpOPH9oYivUzLYZ5vAPIfEmCTNViosC78F8F0H9y7T7gG2M39ymgutxF5gcFEsyZQSph9Bp3A=="],
"octokit": ["octokit@3.2.2", "", { "dependencies": { "@octokit/app": "^14.0.2", "@octokit/core": "^5.0.0", "@octokit/oauth-app": "^6.0.0", "@octokit/plugin-paginate-graphql": "^4.0.0", "@octokit/plugin-paginate-rest": "11.4.4-cjs.2", "@octokit/plugin-rest-endpoint-methods": "13.3.2-cjs.1", "@octokit/plugin-retry": "^6.0.0", "@octokit/plugin-throttling": "^8.0.0", "@octokit/request-error": "^5.0.0", "@octokit/types": "^13.0.0", "@octokit/webhooks": "^12.3.1" } }, "sha512-7Abo3nADdja8l/aglU6Y3lpnHSfv0tw7gFPiqzry/yCU+2gTAX7R1roJ8hJrxIK+S1j+7iqRJXtmuHJ/UDsBhQ=="],
"once": ["once@1.4.0", "", { "dependencies": { "wrappy": "1" } }, "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w=="],
"param-case": ["param-case@2.1.1", "", { "dependencies": { "no-case": "^2.2.0" } }, "sha512-eQE845L6ot89sk2N8liD8HAuH4ca6Vvr7VWAWwt7+kvvG5aBcPmmphQ68JsEG2qa9n1TykS2DLeMt363AAH8/w=="],
"pascal-case": ["pascal-case@3.1.2", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-uWlGT3YSnK9x3BQJaOdcZwrnV6hPpd8jFH1/ucpiLRPh/2zCVJKS19E4GvYHvaCcACn3foXZ0cLB9Wrx1KGe5g=="],
@@ -129,30 +282,76 @@
"peechy": ["peechy@0.4.34", "", { "dependencies": { "change-case": "^4.1.2" }, "bin": { "peechy": "cli.js" } }, "sha512-Cpke/cCqqZHhkyxz7mdqS8ZAGJFUi5icu3ZGqxm9GC7g2VrhH0tmjPhZoWHAN5ghw1m1wq5+2YvfbDSqgC4+Zg=="],
"prettier": ["prettier@3.5.3", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-QQtaxnoDJeAkDvDKWCLiwIXkTgRhwYDEQCghU9Z6q03iyek/rxRh/2lC3HB7P8sWT2xC/y5JDctPLBIGzHKbhw=="],
"prettier": ["prettier@3.6.2", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-I7AIg5boAr5R0FFtJ6rCfD+LFsWHp81dolrFD8S79U9tb8Az2nGrJncnMSnys+bpQJfRUzqs9hnA81OAA3hCuQ=="],
"prettier-plugin-organize-imports": ["prettier-plugin-organize-imports@4.1.0", "", { "peerDependencies": { "prettier": ">=2.0", "typescript": ">=2.9", "vue-tsc": "^2.1.0" }, "optionalPeers": ["vue-tsc"] }, "sha512-5aWRdCgv645xaa58X8lOxzZoiHAldAPChljr/MT0crXVOWTZ+Svl4hIWlz+niYSlO6ikE5UXkN1JrRvIP2ut0A=="],
"prettier-plugin-organize-imports": ["prettier-plugin-organize-imports@4.2.0", "", { "peerDependencies": { "prettier": ">=2.0", "typescript": ">=2.9", "vue-tsc": "^2.1.0 || 3" }, "optionalPeers": ["vue-tsc"] }, "sha512-Zdy27UhlmyvATZi67BTnLcKTo8fm6Oik59Sz6H64PgZJVs6NJpPD1mT240mmJn62c98/QaL+r3kx9Q3gRpDajg=="],
"react": ["react@18.3.1", "", { "dependencies": { "loose-envify": "^1.1.0" } }, "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ=="],
"react-dom": ["react-dom@18.3.1", "", { "dependencies": { "loose-envify": "^1.1.0", "scheduler": "^0.23.2" }, "peerDependencies": { "react": "^18.3.1" } }, "sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw=="],
"relateurl": ["relateurl@0.2.7", "", {}, "sha512-G08Dxvm4iDN3MLM0EsP62EDV9IuhXPR6blNz6Utcp7zyV3tr4HVNINt6MpaRWbxoOHT3Q7YN2P+jaHX8vUbgog=="],
"safe-buffer": ["safe-buffer@5.2.1", "", {}, "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ=="],
"scheduler": ["scheduler@0.23.2", "", { "dependencies": { "loose-envify": "^1.1.0" } }, "sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ=="],
"semver": ["semver@7.7.2", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-RF0Fw+rO5AMf9MAyaRXI4AV0Ulj5lMHqVxxdSgiVbixSCXoEmmX/jk0CuJw4+3SqroYO9VoUh+HcuJivvtJemA=="],
"sentence-case": ["sentence-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3", "upper-case-first": "^2.0.2" } }, "sha512-8LS0JInaQMCRoQ7YUytAo/xUu5W2XnQxV2HI/6uM6U7CITS1RqPElr30V6uIqyMKM9lJGRVFy5/4CuzcixNYSg=="],
"snake-case": ["snake-case@3.0.4", "", { "dependencies": { "dot-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-LAOh4z89bGQvl9pFfNF8V146i7o7/CqFPbqzYgP+yYzDIDeS9HaNFtXABamRW+AQzEVODcvE79ljJ+8a9YSdMg=="],
"source-map": ["source-map@0.6.1", "", {}, "sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g=="],
"source-map-js": ["source-map-js@1.2.1", "", {}, "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA=="],
"tslib": ["tslib@2.8.1", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="],
"typescript": ["typescript@5.8.3", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-p1diW6TqL9L07nNxvRMM7hMMw4c5XOo/1ibL4aAIGmSAt9slTE1Xgw5KWuof2uTOvCg9BY7ZRi+GaF+7sfgPeQ=="],
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"uglify-js": ["uglify-js@3.19.3", "", { "bin": { "uglifyjs": "bin/uglifyjs" } }, "sha512-v3Xu+yuwBXisp6QYTcH4UbH+xYJXqnq2m/LtQVWKWzYc1iehYnLixoQDN9FH6/j9/oybfd6W9Ghwkl8+UMKTKQ=="],
"upper-case": ["upper-case@2.0.2", "", { "dependencies": { "tslib": "^2.0.3" } }, "sha512-KgdgDGJt2TpuwBUIjgG6lzw2GWFRCW9Qkfkiv0DxqHHLYJHmtmdUIKcZd8rHgFSjopVTlw6ggzCm1b8MFQwikg=="],
"undici-types": ["undici-types@7.8.0", "", {}, "sha512-9UJ2xGDvQ43tYyVMpuHlsgApydB8ZKfVYTsLDhXkFL/6gfkp+U8xTGdh8pMJv1SpZna0zxG1DwsKZsreLbXBxw=="],
"universal-github-app-jwt": ["universal-github-app-jwt@1.2.0", "", { "dependencies": { "@types/jsonwebtoken": "^9.0.0", "jsonwebtoken": "^9.0.2" } }, "sha512-dncpMpnsKBk0eetwfN8D8OUHGfiDhhJ+mtsbMl+7PfW7mYjiH8LIcqRmYMtzYLgSh47HjfdBtrBwIQ/gizKR3g=="],
"universal-user-agent": ["universal-user-agent@6.0.1", "", {}, "sha512-yCzhz6FN2wU1NiiQRogkTQszlQSlpWaw8SvVegAc+bDxbzHgh1vX8uIe8OYyMH6DwH+sdTJsgMl36+mSMdRJIQ=="],
"upper-case": ["upper-case@1.1.3", "", {}, "sha512-WRbjgmYzgXkCV7zNVpy5YgrHgbBv126rMALQQMrmzOVC4GM2waQ9x7xtm8VU+1yF2kWyPzI9zbZ48n4vSxwfSA=="],
"upper-case-first": ["upper-case-first@2.0.2", "", { "dependencies": { "tslib": "^2.0.3" } }, "sha512-514ppYHBaKwfJRK/pNC6c/OxfGa0obSnAl106u97Ed0I625Nin96KAjttZF6ZL3e1XLtphxnqrOi9iWgm+u+bg=="],
"wrappy": ["wrappy@1.0.2", "", {}, "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ=="],
"@octokit/app/@octokit/plugin-paginate-rest": ["@octokit/plugin-paginate-rest@9.2.2", "", { "dependencies": { "@octokit/types": "^12.6.0" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-u3KYkGF7GcZnSD/3UP0S7K5XUFT2FkOQdcfXZGZQPGv3lm4F2Xbf71lvjldr8c1H3nNbF+33cLEkWYbokGWqiQ=="],
"@octokit/app/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],
"@octokit/auth-unauthenticated/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],
"@octokit/plugin-throttling/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],
"@octokit/webhooks/@octokit/webhooks-methods": ["@octokit/webhooks-methods@4.1.0", "", {}, "sha512-zoQyKw8h9STNPqtm28UGOYFE7O6D4Il8VJwhAtMHFt2C4L0VQT1qGKLeefUOqHNs1mNRYSadVv7x0z8U2yyeWQ=="],
"camel-case/no-case": ["no-case@2.3.2", "", { "dependencies": { "lower-case": "^1.1.1" } }, "sha512-rmTZ9kz+f3rCvK2TD1Ue/oZlns7OGoIWP4fc3llxxRXlOkHKoWPPWJOfFYpITabSow43QJbRIoHQXtt10VldyQ=="],
"change-case/camel-case": ["camel-case@4.1.2", "", { "dependencies": { "pascal-case": "^3.1.2", "tslib": "^2.0.3" } }, "sha512-gxGWBrTT1JuMx6R+o5PTXMmUnhnVzLQ9SNutD4YqKtI6ap897t3tKECYla6gCWEkplXnlNybEkZg9GEGxKFCgw=="],
"change-case/param-case": ["param-case@3.0.4", "", { "dependencies": { "dot-case": "^3.0.4", "tslib": "^2.0.3" } }, "sha512-RXlj7zCYokReqWpOPH9oYivUzLYZ5vAPIfEmCTNViosC78F8F0H9y7T7gG2M39ymgutxF5gcFEsyZQSph9Bp3A=="],
"constant-case/upper-case": ["upper-case@2.0.2", "", { "dependencies": { "tslib": "^2.0.3" } }, "sha512-KgdgDGJt2TpuwBUIjgG6lzw2GWFRCW9Qkfkiv0DxqHHLYJHmtmdUIKcZd8rHgFSjopVTlw6ggzCm1b8MFQwikg=="],
"param-case/no-case": ["no-case@2.3.2", "", { "dependencies": { "lower-case": "^1.1.1" } }, "sha512-rmTZ9kz+f3rCvK2TD1Ue/oZlns7OGoIWP4fc3llxxRXlOkHKoWPPWJOfFYpITabSow43QJbRIoHQXtt10VldyQ=="],
"@octokit/app/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],
"@octokit/auth-unauthenticated/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],
"@octokit/plugin-throttling/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],
"camel-case/no-case/lower-case": ["lower-case@1.1.4", "", {}, "sha512-2Fgx1Ycm599x+WGpIYwJOvsjmXFzTSc34IwDWALRA/8AopUKAVPwfJ+h5+f85BCp0PWmmJcWzEpxOpoXycMpdA=="],
"param-case/no-case/lower-case": ["lower-case@1.1.4", "", {}, "sha512-2Fgx1Ycm599x+WGpIYwJOvsjmXFzTSc34IwDWALRA/8AopUKAVPwfJ+h5+f85BCp0PWmmJcWzEpxOpoXycMpdA=="],
}
}


@@ -7,3 +7,6 @@
# Instead, we can only scan the test directory for Bun's runtime tests
root = "test"
preload = "./test/preload.ts"
[install]
linker = "isolated"


@@ -95,7 +95,7 @@ if(LINUX)
optionx(ENABLE_VALGRIND BOOL "If Valgrind support should be enabled" DEFAULT OFF)
endif()
if(DEBUG AND APPLE AND ARCH STREQUAL "aarch64")
if(DEBUG AND ((APPLE AND ARCH STREQUAL "aarch64") OR LINUX))
set(DEFAULT_ASAN ON)
else()
set(DEFAULT_ASAN OFF)
@@ -139,10 +139,10 @@ endif()
optionx(REVISION STRING "The git revision of the build" DEFAULT ${DEFAULT_REVISION})
# Used in process.version, process.versions.node, napi, and elsewhere
optionx(NODEJS_VERSION STRING "The version of Node.js to report" DEFAULT "24.3.0")
setx(NODEJS_VERSION "24.3.0")
# Used in process.versions.modules and compared while loading V8 modules
optionx(NODEJS_ABI_VERSION STRING "The ABI version of Node.js to report" DEFAULT "137")
setx(NODEJS_ABI_VERSION "137")
if(APPLE)
set(DEFAULT_STATIC_SQLITE OFF)


@@ -1,4 +1,3 @@
src/bake/bake.bind.ts
src/bake/bake.d.ts
src/bake/bake.private.d.ts
src/bake/bun-framework-react/index.ts


@@ -1,4 +1,4 @@
src/bake/bake.bind.ts
src/bake.bind.ts
src/bake/DevServer.bind.ts
src/bun.js/api/BunObject.bind.ts
src/bun.js/bindgen_test.bind.ts


@@ -350,6 +350,7 @@ src/bun.js/bindings/webcore/JSTextEncoderStream.cpp
src/bun.js/bindings/webcore/JSTransformStream.cpp
src/bun.js/bindings/webcore/JSTransformStreamDefaultController.cpp
src/bun.js/bindings/webcore/JSURLSearchParams.cpp
src/bun.js/bindings/webcore/JSWasmStreamingCompiler.cpp
src/bun.js/bindings/webcore/JSWebSocket.cpp
src/bun.js/bindings/webcore/JSWorker.cpp
src/bun.js/bindings/webcore/JSWorkerOptions.cpp


@@ -8,11 +8,14 @@ src/codegen/bundle-functions.ts
src/codegen/bundle-modules.ts
src/codegen/class-definitions.ts
src/codegen/client-js.ts
src/codegen/cppbind.ts
src/codegen/create-hash-table.ts
src/codegen/generate-classes.ts
src/codegen/generate-compact-string-table.ts
src/codegen/generate-js2native.ts
src/codegen/generate-jssink.ts
src/codegen/generate-node-errors.ts
src/codegen/helpers.ts
src/codegen/internal-module-registry-scanner.ts
src/codegen/replacements.ts
src/codegen/shared-types.ts


@@ -29,6 +29,7 @@ src/js/builtins/TransformStream.ts
src/js/builtins/TransformStreamDefaultController.ts
src/js/builtins/TransformStreamInternals.ts
src/js/builtins/UtilInspect.ts
src/js/builtins/WasmStreaming.ts
src/js/builtins/WritableStreamDefaultController.ts
src/js/builtins/WritableStreamDefaultWriter.ts
src/js/builtins/WritableStreamInternals.ts


@@ -1,15 +1,16 @@
src/allocators.zig
src/allocators/AllocationScope.zig
src/allocators/linux_memfd_allocator.zig
src/allocators/max_heap_allocator.zig
src/allocators/memory_allocator.zig
src/allocators/basic.zig
src/allocators/LinuxMemFdAllocator.zig
src/allocators/MaxHeapAllocator.zig
src/allocators/MemoryReportingAllocator.zig
src/allocators/mimalloc_arena.zig
src/allocators/mimalloc.zig
src/allocators/MimallocArena.zig
src/allocators/NullableAllocator.zig
src/analytics/analytics_schema.zig
src/analytics/analytics_thread.zig
src/analytics.zig
src/analytics/schema.zig
src/api/schema.zig
src/ast.zig
src/ast/Ast.zig
src/ast/ASTMemoryAllocator.zig
src/ast/B.zig
@@ -33,20 +34,30 @@ src/ast/UseDirective.zig
src/async/posix_event_loop.zig
src/async/stub_event_loop.zig
src/async/windows_event_loop.zig
src/baby_list.zig
src/bake/bake.zig
src/bake.zig
src/bake/DevServer.zig
src/bake/DevServer/Assets.zig
src/bake/DevServer/DirectoryWatchStore.zig
src/bake/DevServer/ErrorReportRequest.zig
src/bake/DevServer/HmrSocket.zig
src/bake/DevServer/HotReloadEvent.zig
src/bake/DevServer/IncrementalGraph.zig
src/bake/DevServer/memory_cost.zig
src/bake/DevServer/PackedMap.zig
src/bake/DevServer/RouteBundle.zig
src/bake/DevServer/SerializedFailure.zig
src/bake/DevServer/SourceMapStore.zig
src/bake/DevServer/WatcherAtomics.zig
src/bake/FrameworkRouter.zig
src/bake/production.zig
src/base64/base64.zig
src/bit_set.zig
src/bits.zig
src/boringssl.zig
src/brotli.zig
src/btjs.zig
src/bun_js.zig
src/bun.js.zig
src/bun.js/api.zig
src/bun.js/api/bun/dns_resolver.zig
src/bun.js/api/bun/dns.zig
src/bun.js/api/bun/h2_frame_parser.zig
src/bun.js/api/bun/lshpack.zig
src/bun.js/api/bun/process.zig
@@ -96,6 +107,7 @@ src/bun.js/api/Timer/EventLoopTimer.zig
src/bun.js/api/Timer/ImmediateObject.zig
src/bun.js/api/Timer/TimeoutObject.zig
src/bun.js/api/Timer/TimerObjectInternals.zig
src/bun.js/api/Timer/WTFTimer.zig
src/bun.js/api/TOMLObject.zig
src/bun.js/api/UnsafeObject.zig
src/bun.js/bindgen_test.zig
@@ -242,7 +254,6 @@ src/bun.js/test/jest.zig
src/bun.js/test/pretty_format.zig
src/bun.js/test/snapshot.zig
src/bun.js/test/test.zig
src/bun.js/unbounded_queue.zig
src/bun.js/uuid.zig
src/bun.js/virtual_machine_exports.zig
src/bun.js/VirtualMachine.zig
@@ -281,7 +292,6 @@ src/bun.js/webcore/streams.zig
src/bun.js/webcore/TextDecoder.zig
src/bun.js/webcore/TextEncoder.zig
src/bun.js/webcore/TextEncoderStreamEncoder.zig
src/bun.js/WTFTimer.zig
src/bun.zig
src/bundler/AstBuilder.zig
src/bundler/bundle_v2.zig
@@ -305,12 +315,14 @@ src/bundler/linker_context/generateCodeForLazyExport.zig
src/bundler/linker_context/generateCompileResultForCssChunk.zig
src/bundler/linker_context/generateCompileResultForHtmlChunk.zig
src/bundler/linker_context/generateCompileResultForJSChunk.zig
src/bundler/linker_context/OutputFileListBuilder.zig
src/bundler/linker_context/postProcessCSSChunk.zig
src/bundler/linker_context/postProcessHTMLChunk.zig
src/bundler/linker_context/postProcessJSChunk.zig
src/bundler/linker_context/prepareCssAstsForChunk.zig
src/bundler/linker_context/renameSymbolsInChunk.zig
src/bundler/linker_context/scanImportsAndExports.zig
src/bundler/linker_context/StaticRouteVisitor.zig
src/bundler/linker_context/writeOutputFilesToDisk.zig
src/bundler/LinkerContext.zig
src/bundler/LinkerGraph.zig
@@ -343,9 +355,11 @@ src/cli/pack_command.zig
src/cli/package_manager_command.zig
src/cli/patch_command.zig
src/cli/patch_commit_command.zig
src/cli/pm_pkg_command.zig
src/cli/pm_trusted_command.zig
src/cli/pm_version_command.zig
src/cli/pm_view_command.zig
src/cli/pm_why_command.zig
src/cli/publish_command.zig
src/cli/remove_command.zig
src/cli/run_command.zig
@@ -354,8 +368,16 @@ src/cli/test_command.zig
src/cli/test/Scanner.zig
src/cli/unlink_command.zig
src/cli/update_command.zig
src/cli/update_interactive_command.zig
src/cli/upgrade_command.zig
src/cli/why_command.zig
src/codegen/process_windows_translate_c.zig
src/collections.zig
src/collections/baby_list.zig
src/collections/bit_set.zig
src/collections/hive_array.zig
src/collections/multi_array_list.zig
src/collections/safety.zig
src/compile_target.zig
src/comptime_string_map.zig
src/copy_file.zig
@@ -504,30 +526,28 @@ src/env.zig
src/errno/darwin_errno.zig
src/errno/linux_errno.zig
src/errno/windows_errno.zig
src/exact_size_matcher.zig
src/fd.zig
src/feature_flags.zig
src/fmt.zig
src/fs.zig
src/fs/stat_hash.zig
src/futex.zig
src/generated_perf_trace_events.zig
src/generated_versions_list.zig
src/glob.zig
src/glob/GlobWalker.zig
src/glob/match.zig
src/Global.zig
src/grapheme.zig
src/heap_breakdown.zig
src/highway.zig
src/hive_array.zig
src/hmac.zig
src/HTMLScanner.zig
src/http.zig
src/http/AsyncHTTP.zig
src/http/CertificateInfo.zig
src/http/ContentRange.zig
src/http/Decompressor.zig
src/http/Encoding.zig
src/http/ETag.zig
src/http/FetchRedirect.zig
src/http/HeaderBuilder.zig
src/http/Headers.zig
@@ -538,6 +558,7 @@ src/http/HTTPThread.zig
src/http/InitError.zig
src/http/InternalState.zig
src/http/Method.zig
src/http/mime_type_list_enum.zig
src/http/MimeType.zig
src/http/ProxyTunnel.zig
src/http/SendFile.zig
@@ -563,6 +584,7 @@ src/install/install_binding.zig
src/install/install.zig
src/install/integrity.zig
src/install/isolated_install.zig
src/install/isolated_install/FileCopier.zig
src/install/isolated_install/Hardlinker.zig
src/install/isolated_install/Installer.zig
src/install/isolated_install/Store.zig
@@ -613,6 +635,10 @@ src/install/resolvers/folder_resolver.zig
src/install/versioned_url.zig
src/install/windows-shim/BinLinkingShim.zig
src/install/windows-shim/bun_shim_impl.zig
src/interchange.zig
src/interchange/json.zig
src/interchange/toml.zig
src/interchange/toml/lexer.zig
src/io/heap.zig
src/io/io.zig
src/io/MaxBuf.zig
@@ -621,14 +647,12 @@ src/io/PipeReader.zig
src/io/pipes.zig
src/io/PipeWriter.zig
src/io/source.zig
src/js_ast.zig
src/js_lexer_tables.zig
src/js_lexer.zig
src/js_lexer/identifier.zig
src/js_parser.zig
src/js_printer.zig
src/jsc_stub.zig
src/json_parser.zig
src/libarchive/libarchive-bindings.zig
src/libarchive/libarchive.zig
src/linear_fifo.zig
@@ -640,8 +664,6 @@ src/main_test.zig
src/main_wasm.zig
src/main.zig
src/meta.zig
src/multi_array_list.zig
src/Mutex.zig
src/napi/napi.zig
src/node_fallbacks.zig
src/open.zig
@@ -653,6 +675,7 @@ src/paths.zig
src/paths/EnvPath.zig
src/paths/path_buffer_pool.zig
src/paths/Path.zig
src/pe.zig
src/perf.zig
src/pool.zig
src/Progress.zig
@@ -815,30 +838,36 @@ src/sql/postgres/types/PostgresString.zig
src/sql/postgres/types/Tag.zig
src/StandaloneModuleGraph.zig
src/StaticHashMap.zig
src/string_immutable.zig
src/string_types.zig
src/string.zig
src/string/escapeHTML.zig
src/string/HashedString.zig
src/string/immutable.zig
src/string/immutable/escapeHTML.zig
src/string/immutable/exact_size_matcher.zig
src/string/immutable/grapheme.zig
src/string/immutable/paths.zig
src/string/immutable/unicode.zig
src/string/immutable/visible.zig
src/string/MutableString.zig
src/string/paths.zig
src/string/PathString.zig
src/string/SmolStr.zig
src/string/StringBuilder.zig
src/string/StringJoiner.zig
src/string/unicode.zig
src/string/visible.zig
src/string/WTFStringImpl.zig
src/sync.zig
src/sys_uv.zig
src/sys.zig
src/system_timer.zig
src/test/fixtures.zig
src/test/recover.zig
src/thread_pool.zig
src/threading.zig
src/threading/channel.zig
src/threading/Condition.zig
src/threading/Futex.zig
src/threading/guarded_value.zig
src/threading/Mutex.zig
src/threading/ThreadPool.zig
src/threading/unbounded_queue.zig
src/threading/WaitGroup.zig
src/tmp.zig
src/toml/toml_lexer.zig
src/toml/toml_parser.zig
src/tracy.zig
src/trait.zig
src/transpiler.zig


@@ -255,6 +255,10 @@ set(BUN_ZIG_GENERATED_CLASSES_SCRIPT ${CWD}/src/codegen/generate-classes.ts)
absolute_sources(BUN_ZIG_GENERATED_CLASSES_SOURCES ${CWD}/cmake/sources/ZigGeneratedClassesSources.txt)
# hand written cpp source files. Full list of "source" code (including codegen) is in BUN_CPP_SOURCES
absolute_sources(BUN_CXX_SOURCES ${CWD}/cmake/sources/CxxSources.txt)
absolute_sources(BUN_C_SOURCES ${CWD}/cmake/sources/CSources.txt)
set(BUN_ZIG_GENERATED_CLASSES_OUTPUTS
${CODEGEN_PATH}/ZigGeneratedClasses.h
${CODEGEN_PATH}/ZigGeneratedClasses.cpp
@@ -308,6 +312,27 @@ set(BUN_JAVASCRIPT_OUTPUTS
${CWD}/src/bun.js/bindings/GeneratedJS2Native.zig
)
set(BUN_CPP_OUTPUTS
${CODEGEN_PATH}/cpp.zig
)
register_command(
TARGET
bun-cppbind
COMMENT
"Generating C++ --> Zig bindings"
COMMAND
${BUN_EXECUTABLE}
${CWD}/src/codegen/cppbind.ts
${CWD}/src
${CODEGEN_PATH}
SOURCES
${BUN_JAVASCRIPT_CODEGEN_SOURCES}
${BUN_CXX_SOURCES}
OUTPUTS
${BUN_CPP_OUTPUTS}
)
register_command(
TARGET
bun-js-modules
@@ -537,6 +562,7 @@ set(BUN_ZIG_GENERATED_SOURCES
${BUN_ERROR_CODE_OUTPUTS}
${BUN_ZIG_GENERATED_CLASSES_OUTPUTS}
${BUN_JAVASCRIPT_OUTPUTS}
${BUN_CPP_OUTPUTS}
)
# In debug builds, these are not embedded, but rather referenced at runtime.
@@ -606,6 +632,7 @@ register_command(
TARGETS
clone-zig
clone-zstd
bun-cppbind
SOURCES
${BUN_ZIG_SOURCES}
${BUN_ZIG_GENERATED_SOURCES}
@@ -618,10 +645,6 @@ set_property(DIRECTORY APPEND PROPERTY CMAKE_CONFIGURE_DEPENDS "build.zig")
set(BUN_USOCKETS_SOURCE ${CWD}/packages/bun-usockets)
# hand written cpp source files. Full list of "source" code (including codegen) is in BUN_CPP_SOURCES
absolute_sources(BUN_CXX_SOURCES ${CWD}/cmake/sources/CxxSources.txt)
absolute_sources(BUN_C_SOURCES ${CWD}/cmake/sources/CSources.txt)
if(WIN32)
list(APPEND BUN_CXX_SOURCES ${CWD}/src/bun.js/bindings/windows/rescle.cpp)
list(APPEND BUN_CXX_SOURCES ${CWD}/src/bun.js/bindings/windows/rescle-binding.cpp)
@@ -685,7 +708,7 @@ if(WIN32)
${CODEGEN_PATH}/windows-app-info.rc
@ONLY
)
set(WINDOWS_RESOURCES ${CODEGEN_PATH}/windows-app-info.rc)
set(WINDOWS_RESOURCES ${CODEGEN_PATH}/windows-app-info.rc ${CWD}/src/bun.exe.manifest)
endif()
# --- Executable ---
@@ -958,6 +981,16 @@ if(APPLE)
-Wl,-map,${bun}.linker-map
)
if(DEBUG)
target_link_options(${bun} PUBLIC
# Suppress ALL linker warnings on macOS.
# The intent is to only suppress linker alignment warnings.
# As of July 21st, 2025 there doesn't seem to be a more specific suppression just for linker alignment warnings.
# If you find one, please update this to only be for linker alignment.
-Wl,-w
)
endif()
# don't strip in debug, this seems to be needed so that the Zig std library
# `*dbHelper` DWARF symbols (used by LLDB for pretty printing) are in the
# output executable

View File

@@ -4,7 +4,7 @@ register_repository(
REPOSITORY
libarchive/libarchive
COMMIT
898dc8319355b7e985f68a9819f182aaed61b53a
7118f97c26bf0b2f426728b482f86508efc81d02
)
register_cmake_command(
@@ -20,11 +20,14 @@ register_cmake_command(
-DENABLE_WERROR=OFF
-DENABLE_BZip2=OFF
-DENABLE_CAT=OFF
-DENABLE_CPIO=OFF
-DENABLE_UNZIP=OFF
-DENABLE_EXPAT=OFF
-DENABLE_ICONV=OFF
-DENABLE_LIBB2=OFF
-DENABLE_LibGCC=OFF
-DENABLE_LIBXML2=OFF
-DENABLE_WIN32_XMLLITE=OFF
-DENABLE_LZ4=OFF
-DENABLE_LZMA=OFF
-DENABLE_LZO=OFF

View File

@@ -4,7 +4,7 @@ register_repository(
REPOSITORY
cloudflare/lol-html
COMMIT
67f1d4ffd6b74db7e053fb129dcce620193c180d
d64457d9ff0143deef025d5df7e8586092b9afb7
)
set(LOLHTML_CWD ${VENDOR_PATH}/lolhtml/c-api)

View File

@@ -20,7 +20,7 @@ else()
unsupported(CMAKE_SYSTEM_NAME)
endif()
set(ZIG_COMMIT "0a0120fa92cd7f6ab244865688b351df634f0707")
set(ZIG_COMMIT "edc6229b1fafb1701a25fb4e17114cc756991546")
optionx(ZIG_TARGET STRING "The zig target to use" DEFAULT ${DEFAULT_ZIG_TARGET})
if(CMAKE_BUILD_TYPE STREQUAL "Release")

View File

@@ -274,6 +274,23 @@ If no connection URL is provided, the system checks for the following individual
| `PGPASSWORD` | - | (empty) | Database password |
| `PGDATABASE` | - | username | Database name |
## Runtime Preconnection
Bun can preconnect to PostgreSQL at startup to improve performance by establishing database connections before your application code runs. This is useful for reducing connection latency on the first database query.
```bash
# Enable PostgreSQL preconnection
bun --sql-preconnect index.js
# Works with DATABASE_URL environment variable
DATABASE_URL=postgres://user:pass@localhost:5432/db bun --sql-preconnect index.js
# Can be combined with other runtime flags
bun --sql-preconnect --hot index.js
```
The `--sql-preconnect` flag will automatically establish a PostgreSQL connection using your configured environment variables at startup. If the connection fails, it won't crash your application; the error is handled gracefully.
## Connection Options
You can configure your database connection manually by passing options to the SQL constructor:

View File

@@ -88,6 +88,20 @@ The order of the `--target` flag does not matter, as long as they're delimited b
On x64 platforms, Bun uses SIMD optimizations that require a modern CPU supporting AVX2 instructions. The `-baseline` build of Bun is for older CPUs that don't support these optimizations. Normally, when you install Bun we automatically detect which version to use, but this is harder when cross-compiling since you might not know the target CPU. You usually don't need to worry about it on Darwin x64, but it is relevant for Windows x64 and Linux x64. If you or your users see `"Illegal instruction"` errors, you might need to use the baseline version.
## Build-time constants
Use the `--define` flag to inject build-time constants into your executable, such as version numbers, build timestamps, or configuration values:
```bash
$ bun build --compile --define BUILD_VERSION='"1.2.3"' --define BUILD_TIME='"2024-01-15T10:30:00Z"' src/cli.ts --outfile mycli
```
These constants are embedded directly into your compiled binary at build time, providing zero runtime overhead and enabling dead code elimination optimizations.
{% callout type="info" %}
For comprehensive examples and advanced patterns, see the [Build-time constants guide](/guides/runtime/build-time-constants).
{% /callout %}
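At runtime, a defined constant behaves like a literal baked into the code. A minimal sketch of how such a constant might be consumed (the `BUILD_VERSION` name and the fallback are illustrative, not part of Bun's API):

```ts
// BUILD_VERSION is substituted by the bundler at build time.
// The ambient declaration keeps TypeScript happy; the typeof guard
// lets this snippet also run without --define.
declare const BUILD_VERSION: string | undefined;

function versionBanner(): string {
  // typeof on an undeclared global returns "undefined" instead of throwing
  const v = typeof BUILD_VERSION !== "undefined" ? BUILD_VERSION : "dev";
  return `myapp ${v}`;
}
```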
## Deploying to production
Compiled executables reduce memory usage and improve Bun's start time.

View File

@@ -183,6 +183,30 @@ Bun supports installing dependencies from Git, GitHub, and local or remotely-hos
}
```
## Installation strategies
Bun supports two package installation strategies that determine how dependencies are organized in `node_modules`:
### Hoisted installs (default for single projects)
The traditional npm/Yarn approach that flattens dependencies into a shared `node_modules` directory:
```bash
$ bun install --linker hoisted
```
### Isolated installs
A pnpm-like approach that creates strict dependency isolation to prevent phantom dependencies:
```bash
$ bun install --linker isolated
```
Isolated installs create a central package store in `node_modules/.bun/` with symlinks in the top-level `node_modules`. This ensures packages can only access their declared dependencies.
For complete documentation on isolated installs, refer to [Package manager > Isolated installs](https://bun.com/docs/install/isolated).
## Configuration
The default behavior of `bun install` can be configured in `bunfig.toml`. The default values are shown below.
@@ -213,11 +237,15 @@ dryRun = false
# equivalent to `--concurrent-scripts` flag
concurrentScripts = 16 # (cpu count or GOMAXPROCS) x2
# installation strategy: "hoisted" or "isolated"
# default: "hoisted"
linker = "hoisted"
```
## CI/CD
Looking to speed up your CI? Use the official [`oven-sh/setup-bun`](https://github.com/oven-sh/setup-bun) action to install `bun` in a GitHub Actions pipeline.
Use the official [`oven-sh/setup-bun`](https://github.com/oven-sh/setup-bun) action to install `bun` in a GitHub Actions pipeline:
```yaml#.github/workflows/release.yml
name: bun-types
@@ -236,4 +264,31 @@ jobs:
run: bun run build
```
For CI/CD environments that want to enforce reproducible builds, use `bun ci` to fail the build if the package.json is out of sync with the lockfile:
```bash
$ bun ci
```
This is equivalent to `bun install --frozen-lockfile`. It installs exact versions from `bun.lock` and fails if `package.json` doesn't match the lockfile. To use `bun ci` or `bun install --frozen-lockfile`, you must commit `bun.lock` to version control.
In your workflow, run `bun ci` instead of `bun install`:
```yaml#.github/workflows/release.yml
name: bun-types
jobs:
build:
name: build-app
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Install bun
uses: oven-sh/setup-bun@v2
- name: Install dependencies
run: bun ci
- name: Build app
run: bun run build
```
{% bunCLIUsage command="install" /%}

View File

@@ -8,15 +8,70 @@ To create a tarball of the current workspace:
$ bun pm pack
```
Options for the `pack` command:
This command creates a `.tgz` file containing all files that would be published to npm, following the same rules as `npm pack`.
- `--dry-run`: Perform all tasks except writing the tarball to disk.
- `--destination`: Specify the directory where the tarball will be saved.
- `--filename`: Specify an exact file name for the tarball to be saved at.
## Examples
Basic usage:
```bash
$ bun pm pack
# Creates my-package-1.0.0.tgz in current directory
```
Quiet mode for scripting:
```bash
$ TARBALL=$(bun pm pack --quiet)
$ echo "Created: $TARBALL"
# Output: Created: my-package-1.0.0.tgz
```
Custom destination:
```bash
$ bun pm pack --destination ./dist
# Saves tarball in ./dist/ directory
```
## Options
- `--dry-run`: Perform all tasks except writing the tarball to disk. Shows what would be included.
- `--destination <dir>`: Specify the directory where the tarball will be saved.
- `--filename <name>`: Specify an exact file name for the tarball to be saved at.
- `--ignore-scripts`: Skip running pre/postpack and prepare scripts.
- `--gzip-level`: Set a custom compression level for gzip, ranging from 0 to 9 (default is 9).
- `--gzip-level <0-9>`: Set a custom compression level for gzip, ranging from 0 to 9 (default is 9).
- `--quiet`: Only output the tarball filename, suppressing verbose output. Ideal for scripts and automation.
> Note `--filename` and `--destination` cannot be used at the same time
> **Note:** `--filename` and `--destination` cannot be used at the same time.
## Output Modes
**Default output:**
```bash
$ bun pm pack
bun pack v1.2.19
packed 131B package.json
packed 40B index.js
my-package-1.0.0.tgz
Total files: 2
Shasum: f2451d6eb1e818f500a791d9aace80b394258a90
Unpacked size: 171B
Packed size: 249B
```
**Quiet output:**
```bash
$ bun pm pack --quiet
my-package-1.0.0.tgz
```
The `--quiet` flag is particularly useful for automation workflows where you need to capture the generated tarball filename for further processing.
## bin
@@ -193,3 +248,38 @@ v1.0.1
```
Supports `patch`, `minor`, `major`, `premajor`, `preminor`, `prepatch`, `prerelease`, `from-git`, or specific versions like `1.2.3`. By default, it creates a git commit and tag; pass `--no-git-tag-version` to skip them.
## pkg
Manage `package.json` data with get, set, delete, and fix operations.
All commands support dot and bracket notation:
```bash
scripts.build # dot notation
contributors[0] # array access
workspaces.0 # dot with numeric index
scripts[test:watch] # bracket for special chars
```
Examples:
```bash
# get
$ bun pm pkg get name # single property
$ bun pm pkg get name version # multiple properties
$ bun pm pkg get # entire package.json
$ bun pm pkg get scripts.build # nested property
# set
$ bun pm pkg set name="my-package" # simple property
$ bun pm pkg set scripts.test="jest" version=2.0.0 # multiple properties
$ bun pm pkg set {"private":"true"} --json # JSON values with --json flag
# delete
$ bun pm pkg delete description # single property
$ bun pm pkg delete scripts.test contributors[0] # multiple/nested
# fix
$ bun pm pkg fix # auto-fix common issues
```
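The dot/bracket notation above can be modeled with a small tokenizer. This is an illustrative sketch, not Bun's actual parser:

```ts
// Split a package.json accessor like "contributors[0]" or
// "scripts[test:watch]" into path segments.
function parsePkgPath(path: string): string[] {
  const segments: string[] = [];
  // Match either a bracketed segment [...] or a run of
  // non-dot, non-bracket characters.
  const re = /\[([^\]]*)\]|([^.\[\]]+)/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(path)) !== null) {
    segments.push(m[1] !== undefined ? m[1] : m[2]);
  }
  return segments;
}
```

For example, `parsePkgPath("scripts[test:watch]")` yields `["scripts", "test:watch"]`, which can then be walked against the parsed `package.json` object.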

View File

@@ -248,4 +248,33 @@ $ bun test foo
Any test file in the directory with an _absolute path_ that contains one of the targets will run. Glob patterns are not yet supported. -->
## AI Agent Integration
When using Bun's test runner with AI coding assistants, you can enable quieter output to improve readability and reduce context noise. This feature minimizes test output verbosity while preserving essential failure information.
### Environment Variables
Set any of the following environment variables to enable AI-friendly output:
- `CLAUDECODE=1` - For Claude Code
- `REPL_ID=1` - For Replit
- `AGENT=1` - Generic AI agent flag
### Behavior
When an AI agent environment is detected:
- Only test failures are displayed in detail
- Passing, skipped, and todo test indicators are hidden
- Summary statistics remain intact
```bash
# Example: Enable quiet output for Claude Code
$ CLAUDECODE=1 bun test
# Still shows failures and summary, but hides verbose passing test output
```
This feature is particularly useful in AI-assisted development workflows where reduced output verbosity improves context efficiency while maintaining visibility into test failures.
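The detection described above amounts to checking a few well-known environment variables. A hedged sketch of that logic (the exact variables Bun checks and how it treats their values may differ):

```ts
// Assumed list of agent-signaling environment variables from the docs above.
const AGENT_ENV_VARS = ["CLAUDECODE", "REPL_ID", "AGENT"];

// Treat any of them being set to a non-empty value as "running under an AI agent".
function isAgentEnvironment(env: Record<string, string | undefined>): boolean {
  return AGENT_ENV_VARS.some((name) => Boolean(env[name]));
}
```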
{% bunCLIUsage command="test" /%}

View File

@@ -10,6 +10,86 @@ To update a specific dependency to the latest version:
$ bun update [package]
```
## `--interactive`
For a more controlled update experience, use the `--interactive` flag to select which packages to update:
```sh
$ bun update --interactive
$ bun update -i
```
This launches an interactive terminal interface that shows all outdated packages with their current and target versions. You can then select which packages to update.
### Interactive Interface
The interface displays packages grouped by dependency type:
```
? Select packages to update - Space to toggle, Enter to confirm, a to select all, n to select none, i to invert, l to toggle latest
dependencies Current Target Latest
□ react 17.0.2 18.2.0 18.3.1
□ lodash 4.17.20 4.17.21 4.17.21
devDependencies Current Target Latest
□ typescript 4.8.0 5.0.0 5.3.3
□ @types/node 16.11.7 18.0.0 20.11.5
optionalDependencies Current Target Latest
□ some-optional-package 1.0.0 1.1.0 1.2.0
```
**Sections:**
- Packages are grouped under section headers: `dependencies`, `devDependencies`, `peerDependencies`, `optionalDependencies`
- Each section shows column headers aligned with the package data
**Columns:**
- **Package**: Package name (may have suffix like ` dev`, ` peer`, ` optional` for clarity)
- **Current**: Currently installed version
- **Target**: Version that would be installed (respects semver constraints)
- **Latest**: Latest available version
### Keyboard Controls
**Selection:**
- **Space**: Toggle package selection
- **Enter**: Confirm selections and update
- **a/A**: Select all packages
- **n/N**: Select none
- **i/I**: Invert selection
**Navigation:**
- **↑/↓ Arrow keys** or **j/k**: Move cursor
- **l/L**: Toggle between target and latest version for current package
**Exit:**
- **Ctrl+C** or **Ctrl+D**: Cancel without updating
### Visual Indicators
- **☑** Selected packages (will be updated)
- **□** Unselected packages
- **>** Current cursor position
- **Colors**: Red (major), yellow (minor), green (patch) version changes
- **Underlined**: Currently selected update target
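The red/yellow/green coloring corresponds to the semver component that changes between the current and target versions. A simplified sketch of that classification (not Bun's implementation, and ignoring pre-release tags):

```ts
// Classify the jump between two "major.minor.patch" versions.
function changeKind(current: string, target: string): "major" | "minor" | "patch" | "none" {
  const [cMaj, cMin, cPat] = current.split(".").map(Number);
  const [tMaj, tMin, tPat] = target.split(".").map(Number);
  if (tMaj !== cMaj) return "major"; // red
  if (tMin !== cMin) return "minor"; // yellow
  if (tPat !== cPat) return "patch"; // green
  return "none";
}
```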
### Package Grouping
Packages are organized in sections by dependency type:
- **dependencies** - Regular runtime dependencies
- **devDependencies** - Development dependencies
- **peerDependencies** - Peer dependencies
- **optionalDependencies** - Optional dependencies
Within each section, individual packages may have additional suffixes (` dev`, ` peer`, ` optional`) for extra clarity.
## `--latest`
By default, `bun update` will update to the latest version of a dependency that satisfies the version range specified in your `package.json`.
@@ -20,6 +100,8 @@ To update to the latest version, regardless of if it's compatible with the curre
$ bun update --latest
```
In interactive mode, you can toggle individual packages between their target version (respecting semver) and latest version using the **l** key.
For example, with the following `package.json`:
```json

67
docs/cli/why.md Normal file
View File

@@ -0,0 +1,67 @@
The `bun why` command explains why a package is installed in your project by showing the dependency chain that led to its installation.
## Usage
```bash
$ bun why <package>
```
## Arguments
- `<package>`: The name of the package to explain. Supports glob patterns like `@org/*` or `*-lodash`.
## Options
- `--top`: Show only the top-level dependencies instead of the complete dependency tree.
- `--depth <number>`: Maximum depth of the dependency tree to display.
## Examples
Check why a specific package is installed:
```bash
$ bun why react
react@18.2.0
└─ my-app@1.0.0 (requires ^18.0.0)
```
Check why all packages with a specific pattern are installed:
```bash
$ bun why "@types/*"
@types/react@18.2.15
└─ dev my-app@1.0.0 (requires ^18.0.0)
@types/react-dom@18.2.7
└─ dev my-app@1.0.0 (requires ^18.0.0)
```
Show only top-level dependencies:
```bash
$ bun why express --top
express@4.18.2
└─ my-app@1.0.0 (requires ^4.18.2)
```
Limit the dependency tree depth:
```bash
$ bun why express --depth 2
express@4.18.2
└─ express-polyfill@1.20.1 (requires ^4.18.2)
└─ body-parser@1.20.1 (requires ^1.20.1)
└─ accepts@1.3.8 (requires ^1.3.8)
└─ (deeper dependencies hidden)
```
## Understanding the Output
The output shows:
- The package name and version being queried
- The dependency chain that led to its installation
- The type of dependency (dev, peer, optional, or production)
- The version requirement specified in each package's dependencies
For nested dependencies, the command shows the complete dependency tree by default, with indentation indicating the relationship hierarchy.
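Conceptually, a "why" query walks the dependency graph in reverse: given a target package, find every package whose declared dependencies include it. A simplified sketch under an assumed adjacency-map representation (not Bun's internal data model):

```ts
// A dependency graph: dependent package -> its declared dependencies.
type DepGraph = Record<string, { name: string; range: string }[]>;

// Collect the reverse edges pointing at `target` — i.e. who requires it.
function whoRequires(graph: DepGraph, target: string): { from: string; range: string }[] {
  const result: { from: string; range: string }[] = [];
  for (const [pkg, deps] of Object.entries(graph)) {
    for (const d of deps) {
      if (d.name === target) result.push({ from: pkg, range: d.range });
    }
  }
  return result;
}
```

Applying `whoRequires` repeatedly from the target up to the workspace root produces the indented chains shown in the examples.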

View File

@@ -40,6 +40,7 @@ Open `prisma/schema.prisma` and add a simple `User` model.
```prisma-diff#prisma/schema.prisma
generator client {
provider = "prisma-client-js"
output = "../generated/prisma"
}
datasource db {
@@ -78,7 +79,7 @@ migrations/
Your database is now in sync with your schema.
✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 41ms
✔ Generated Prisma Client (v6.11.1) to ./generated/prisma in 41ms
```
---

View File

@@ -0,0 +1,293 @@
---
name: Build-time constants with --define
---
The `--define` flag can be used with `bun build` and `bun build --compile` to inject build-time constants into your application. This is especially useful for embedding metadata like build versions, timestamps, or configuration flags directly into your compiled executables.
```sh
$ bun build --compile --define BUILD_VERSION='"1.2.3"' --define BUILD_TIME='"2024-01-15T10:30:00Z"' src/index.ts --outfile myapp
```
---
## Why use build-time constants?
Build-time constants are embedded directly into your compiled code, making them:
- **Zero runtime overhead** - No environment variable lookups or file reads
- **Immutable** - Values are baked into the binary at compile time
- **Optimizable** - Dead code elimination can remove unused branches
- **Secure** - No external dependencies or configuration files to manage
This is similar to `gcc -D` or `#define` in C/C++, but for JavaScript/TypeScript.
---
## Basic usage
### With `bun build`
```sh
# Bundle with build-time constants
$ bun build --define BUILD_VERSION='"1.0.0"' --define NODE_ENV='"production"' src/index.ts --outdir ./dist
```
### With `bun build --compile`
```sh
# Compile to executable with build-time constants
$ bun build --compile --define BUILD_VERSION='"1.0.0"' --define BUILD_TIME='"2024-01-15T10:30:00Z"' src/cli.ts --outfile mycli
```
### JavaScript API
```ts
await Bun.build({
entrypoints: ["./src/index.ts"],
outdir: "./dist",
define: {
BUILD_VERSION: '"1.0.0"',
BUILD_TIME: '"2024-01-15T10:30:00Z"',
DEBUG: "false",
},
});
```
---
## Common use cases
### Version information
Embed version and build metadata directly into your executable:
{% codetabs %}
```ts#src/version.ts
// These constants are replaced at build time
declare const BUILD_VERSION: string;
declare const BUILD_TIME: string;
declare const GIT_COMMIT: string;
export function getVersion() {
return {
version: BUILD_VERSION,
buildTime: BUILD_TIME,
commit: GIT_COMMIT,
};
}
```
```sh#Build command
$ bun build --compile \
--define BUILD_VERSION='"1.2.3"' \
--define BUILD_TIME='"2024-01-15T10:30:00Z"' \
--define GIT_COMMIT='"abc123"' \
src/cli.ts --outfile mycli
```
{% /codetabs %}
### Feature flags
Use build-time constants to enable/disable features:
```ts
// Replaced at build time
declare const ENABLE_ANALYTICS: boolean;
declare const ENABLE_DEBUG: boolean;
function trackEvent(event: string) {
if (ENABLE_ANALYTICS) {
// This entire block is removed if ENABLE_ANALYTICS is false
console.log("Tracking:", event);
}
}
if (ENABLE_DEBUG) {
console.log("Debug mode enabled");
}
```
```sh
# Production build - analytics enabled, debug disabled
$ bun build --compile --define ENABLE_ANALYTICS=true --define ENABLE_DEBUG=false src/app.ts --outfile app-prod
# Development build - both enabled
$ bun build --compile --define ENABLE_ANALYTICS=false --define ENABLE_DEBUG=true src/app.ts --outfile app-dev
```
### Configuration
Replace configuration objects at build time:
```ts
declare const CONFIG: {
apiUrl: string;
timeout: number;
retries: number;
};
// CONFIG is replaced with the actual object at build time
const response = await fetch(CONFIG.apiUrl, {
timeout: CONFIG.timeout,
});
```
```sh
$ bun build --compile --define 'CONFIG={"apiUrl":"https://api.example.com","timeout":5000,"retries":3}' src/app.ts --outfile app
```
---
## Advanced patterns
### Environment-specific builds
Create different executables for different environments:
```json
{
"scripts": {
"build:dev": "bun build --compile --define NODE_ENV='\"development\"' --define API_URL='\"http://localhost:3000\"' src/app.ts --outfile app-dev",
"build:staging": "bun build --compile --define NODE_ENV='\"staging\"' --define API_URL='\"https://staging.example.com\"' src/app.ts --outfile app-staging",
"build:prod": "bun build --compile --define NODE_ENV='\"production\"' --define API_URL='\"https://api.example.com\"' src/app.ts --outfile app-prod"
}
}
```
### Using shell commands for dynamic values
Generate build-time constants from shell commands:
```sh
# Use git to get current commit and timestamp
$ bun build --compile \
--define BUILD_VERSION="\"$(git describe --tags --always)\"" \
--define BUILD_TIME="\"$(date -u +%Y-%m-%dT%H:%M:%SZ)\"" \
--define GIT_COMMIT="\"$(git rev-parse HEAD)\"" \
src/cli.ts --outfile mycli
```
### Build automation script
Create a build script that automatically injects build metadata:
```ts
// build.ts
import { $ } from "bun";
const version = await $`git describe --tags --always`.text();
const buildTime = new Date().toISOString();
const gitCommit = await $`git rev-parse HEAD`.text();
await Bun.build({
entrypoints: ["./src/cli.ts"],
outdir: "./dist",
define: {
BUILD_VERSION: JSON.stringify(version.trim()),
BUILD_TIME: JSON.stringify(buildTime),
GIT_COMMIT: JSON.stringify(gitCommit.trim()),
},
});
console.log(`Built with version ${version.trim()}`);
```
---
## Important considerations
### Value format
Values must be valid JSON that will be parsed and inlined as JavaScript expressions:
```sh
# ✅ Strings must be JSON-quoted
--define VERSION='"1.0.0"'
# ✅ Numbers are JSON literals
--define PORT=3000
# ✅ Booleans are JSON literals
--define DEBUG=true
# ✅ Objects and arrays (use single quotes to wrap the JSON)
--define 'CONFIG={"host":"localhost","port":3000}'
# ✅ Arrays work too
--define 'FEATURES=["auth","billing","analytics"]'
# ❌ This won't work - missing quotes around string
--define VERSION=1.0.0
```
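Since a `--define` value must be valid JSON, the rules above can be checked with a single `JSON.parse` attempt. A sketch of such a validator (illustrative only; Bun's actual handling of define values may be more permissive):

```ts
// A valid --define value is one that JSON.parse accepts, so it can be
// inlined as a JavaScript expression.
function isValidDefineValue(value: string): boolean {
  try {
    JSON.parse(value);
    return true;
  } catch {
    return false;
  }
}
```

This makes the failure mode in the last example concrete: `1.0.0` is not valid JSON, while `"1.0.0"` (with the inner quotes) is.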
### Property keys
You can use property access patterns as keys, not just simple identifiers:
```sh
# ✅ Replace process.env.NODE_ENV with "production"
--define 'process.env.NODE_ENV="production"'
# ✅ Replace process.env.API_KEY with the actual key
--define 'process.env.API_KEY="abc123"'
# ✅ Replace nested properties
--define 'window.myApp.version="1.0.0"'
# ✅ Replace array access
--define 'process.argv[2]="--production"'
```
This is particularly useful for environment variables:
```ts
// Before compilation
if (process.env.NODE_ENV === "production") {
console.log("Production mode");
}
// After compilation with --define 'process.env.NODE_ENV="production"'
if ("production" === "production") {
console.log("Production mode");
}
// After optimization
console.log("Production mode");
```
### TypeScript declarations
For TypeScript projects, declare your constants to avoid type errors:
```ts
// types/build-constants.d.ts
declare const BUILD_VERSION: string;
declare const BUILD_TIME: string;
declare const NODE_ENV: "development" | "staging" | "production";
declare const DEBUG: boolean;
```
### Cross-platform compatibility
When building for multiple platforms, constants work the same way:
```sh
# Linux
$ bun build --compile --target=bun-linux-x64 --define PLATFORM='"linux"' src/app.ts --outfile app-linux
# macOS
$ bun build --compile --target=bun-darwin-x64 --define PLATFORM='"darwin"' src/app.ts --outfile app-macos
# Windows
$ bun build --compile --target=bun-windows-x64 --define PLATFORM='"windows"' src/app.ts --outfile app-windows.exe
```
---
## Related
- [Define constants at runtime](/guides/runtime/define-constant) - Using `--define` with `bun run`
- [Building executables](/bundler/executables) - Complete guide to `bun build --compile`
- [Bundler API](/bundler) - Full bundler documentation including `define` option

View File

@@ -52,6 +52,8 @@ In your root-level `package.json`, add a `catalog` or `catalogs` field within th
}
```
If you put `catalog` or `catalogs` at the top level of the `package.json` file, that will work too.
### 2. Reference Catalog Versions in Workspace Packages
In your workspace packages, use the `catalog:` protocol to reference versions:

View File

@@ -81,6 +81,14 @@ $ bun install --verbose # debug logging
$ bun install --silent # no logging
```
To use isolated installs instead of the default hoisted strategy:
```bash
$ bun install --linker isolated
```
Isolated installs create strict dependency isolation similar to pnpm, preventing phantom dependencies and ensuring more deterministic builds. For complete documentation, see [Isolated installs](https://bun.com/docs/install/isolated).
{% details summary="Configuring behavior" %}
The default behavior of `bun install` can be configured in `bunfig.toml`:
@@ -110,6 +118,10 @@ dryRun = false
# equivalent to `--concurrent-scripts` flag
concurrentScripts = 16 # (cpu count or GOMAXPROCS) x2
# installation strategy: "hoisted" or "isolated"
# default: "hoisted"
linker = "hoisted"
```
{% /details %}

195
docs/install/isolated.md Normal file
View File

@@ -0,0 +1,195 @@
Bun provides an alternative package installation strategy called **isolated installs** that creates strict dependency isolation similar to pnpm's approach. This mode prevents phantom dependencies and ensures reproducible, deterministic builds.
## What are isolated installs?
Isolated installs create a non-hoisted dependency structure where packages can only access their explicitly declared dependencies. This differs from the traditional "hoisted" installation strategy used by npm and Yarn, where dependencies are flattened into a shared `node_modules` directory.
### Key benefits
- **Prevents phantom dependencies** — Packages cannot accidentally import dependencies they haven't declared
- **Deterministic resolution** — Same dependency tree regardless of what else is installed
- **Better for monorepos** — Workspace isolation prevents cross-contamination between packages
- **Reproducible builds** — More predictable resolution behavior across environments
## Using isolated installs
### Command line
Use the `--linker` flag to specify the installation strategy:
```bash
# Use isolated installs
$ bun install --linker isolated
# Use traditional hoisted installs
$ bun install --linker hoisted
```
### Configuration file
Set the default linker strategy in your `bunfig.toml`:
```toml
[install]
linker = "isolated"
```
### Default behavior
By default, Bun uses the **hoisted** installation strategy for all projects. To use isolated installs, you must explicitly specify the `--linker isolated` flag or set it in your configuration file.
## How isolated installs work
### Directory structure
Instead of hoisting dependencies, isolated installs create a two-tier structure:
```
node_modules/
├── .bun/ # Central package store
│ ├── package@1.0.0/ # Versioned package installations
│ │ └── node_modules/
│ │ └── package/ # Actual package files
│ ├── @scope+package@2.1.0/ # Scoped packages (+ replaces /)
│ │ └── node_modules/
│ │ └── @scope/
│ │ └── package/
│ └── ...
└── package-name -> .bun/package@1.0.0/node_modules/package # Symlinks
```
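The store paths above follow a mechanical naming rule: `name@version`, with the `/` in scoped package names replaced by `+`. A sketch of how those paths could be derived (an assumption based on the layout shown, not Bun's source):

```ts
// Store directory name: "/" in scoped names becomes "+",
// e.g. "@scope/package" at 2.1.0 -> "@scope+package@2.1.0".
function storeDirName(name: string, version: string): string {
  return `${name.replace("/", "+")}@${version}`;
}

// Full path to the actual package files inside the central store.
function storePath(name: string, version: string): string {
  return `node_modules/.bun/${storeDirName(name, version)}/node_modules/${name}`;
}
```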
### Resolution algorithm
1. **Central store** — All packages are installed in `node_modules/.bun/package@version/` directories
2. **Symlinks** — Top-level `node_modules` contains symlinks pointing to the central store
3. **Peer resolution** — Complex peer dependencies create specialized directory names
4. **Deduplication** — Packages with identical package IDs and peer dependency sets are shared
### Workspace handling
In monorepos, workspace dependencies are handled specially:
- **Workspace packages** — Symlinked directly to their source directories, not the store
- **Workspace dependencies** — Can access other workspace packages in the monorepo
- **External dependencies** — Installed in the isolated store with proper isolation
## Comparison with hoisted installs
| Aspect | Hoisted (npm/Yarn) | Isolated (pnpm-like) |
| ------------------------- | ------------------------------------------ | --------------------------------------- |
| **Dependency access** | Packages can access any hoisted dependency | Packages only see declared dependencies |
| **Phantom dependencies** | ❌ Possible | ✅ Prevented |
| **Disk usage** | ✅ Lower (shared installs) | ✅ Similar (uses symlinks) |
| **Determinism** | ❌ Less deterministic | ✅ More deterministic |
| **Node.js compatibility** | ✅ Standard behavior | ✅ Compatible via symlinks |
| **Best for** | Single projects, legacy code | Monorepos, strict dependency management |
## Advanced features
### Peer dependency handling
Isolated installs handle peer dependencies through sophisticated resolution:
```bash
# Package with peer dependencies creates specialized paths
node_modules/.bun/package@1.0.0_react@18.2.0/
```
The directory name encodes both the package version and its peer dependency versions, ensuring each unique combination gets its own installation.
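That encoding can be sketched as appending each resolved peer as a `name@version` suffix; sorting the peers keeps the directory name deterministic. This is an illustrative reconstruction of the naming shown above, not Bun's actual algorithm:

```ts
// Build a store directory name that includes resolved peer dependencies,
// e.g. ("package", "1.0.0", { react: "18.2.0" }) -> "package@1.0.0_react@18.2.0".
function storeDirWithPeers(
  name: string,
  version: string,
  peers: Record<string, string>,
): string {
  const base = `${name.replace("/", "+")}@${version}`;
  const suffix = Object.entries(peers)
    .sort(([a], [b]) => (a < b ? -1 : 1)) // deterministic ordering
    .map(([peer, v]) => `${peer.replace("/", "+")}@${v}`)
    .join("_");
  return suffix ? `${base}_${suffix}` : base;
}
```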
### Backend strategies
Bun uses different file operation strategies for performance:
- **Clonefile** (macOS) — Copy-on-write filesystem clones for maximum efficiency
- **Hardlink** (Linux/Windows) — Hardlinks to save disk space
- **Copyfile** (fallback) — Full file copies when other methods aren't available
### Debugging isolated installs
Enable verbose logging to understand the installation process:
```bash
$ bun install --linker isolated --verbose
```
This shows:
- Store entry creation
- Symlink operations
- Peer dependency resolution
- Deduplication decisions
## Troubleshooting
### Compatibility issues
Some packages may not work correctly with isolated installs due to:
- **Hardcoded paths** — Packages that assume a flat `node_modules` structure
- **Dynamic imports** — Runtime imports that don't follow Node.js resolution
- **Build tools** — Tools that scan `node_modules` directly
If you encounter issues, you can:
1. **Switch to hoisted mode** for specific projects:
```bash
$ bun install --linker hoisted
```
2. **Report compatibility issues** to help improve isolated install support
### Performance considerations
- **Install time** — May be slightly slower due to symlink operations
- **Disk usage** — Similar to hoisted (uses symlinks, not file copies)
- **Memory usage** — Higher during install due to complex peer resolution
## Migration guide
### From npm/Yarn
```bash
# Remove existing node_modules and lockfiles
$ rm -rf node_modules package-lock.json yarn.lock
# Install with isolated linker
$ bun install --linker isolated
```
### From pnpm
Isolated installs are conceptually similar to pnpm, so migration should be straightforward:
```bash
# Remove pnpm files
$ rm -rf node_modules pnpm-lock.yaml
# Install with Bun's isolated linker
$ bun install --linker isolated
```
The main difference is that Bun's store lives inside the project at `node_modules/.bun`, while pnpm links packages from a machine-wide global store.
## When to use isolated installs
**Use isolated installs when:**
- Working in monorepos with multiple packages
- Strict dependency management is required
- Preventing phantom dependencies is important
- Building libraries that need deterministic dependencies
**Use hoisted installs when:**
- Working with legacy code that assumes flat `node_modules`
- Compatibility with existing build tools is required
- Working in environments where symlinks aren't well supported
- You prefer the simpler traditional npm behavior
## Related documentation
- [Package manager > Workspaces](https://bun.com/docs/install/workspaces) — Monorepo workspace management
- [Package manager > Lockfile](https://bun.com/docs/install/lockfile) — Understanding Bun's lockfile format
- [CLI > install](https://bun.com/docs/cli/install) — Complete `bun install` command reference

View File

@@ -176,10 +176,16 @@ export default {
page("cli/pm", "`bun pm`", {
description: "Utilities relating to package management with Bun.",
}),
page("cli/why", "`bun why`", {
description: "Explains why a package is installed in your project.",
}),
page("install/cache", "Global cache", {
description:
"Bun's package manager installs all packages into a shared global cache to avoid redundant re-downloads.",
}),
page("install/isolated", "Isolated installs", {
description: "Create strict dependency isolation, preventing phantom dependencies.",
}),
page("install/workspaces", "Workspaces", {
description: "Bun's package manager supports workspaces and monorepo development workflows.",
}),

View File

@@ -20,7 +20,7 @@ this one:
Given a file implementing a simple function, such as `add`
```zig#src/bun.js/math.zig
pub fn add(global: *JSC.JSGlobalObject, a: i32, b: i32) !i32 {
pub fn add(global: *jsc.JSGlobalObject, a: i32, b: i32) !i32 {
return std.math.add(i32, a, b) catch {
// Binding functions can return `error.OutOfMemory` and `error.JSError`.
// Others like `error.Overflow` from `std.math.add` must be converted.
@@ -33,7 +33,7 @@ const gen = bun.gen.math; // "math" being this file's basename
const std = @import("std");
const bun = @import("bun");
const JSC = bun.JSC;
const jsc = bun.jsc;
```
Then describe the API schema in a `.bind.ts` file. The binding file goes next to the Zig file.

View File

@@ -195,6 +195,24 @@ Whether to skip test files when computing coverage statistics. Default `false`.
coverageSkipTestFiles = false
```
### `test.coveragePathIgnorePatterns`
Exclude specific files or file patterns from coverage reports using glob patterns. Can be a single string pattern or an array of patterns.
```toml
[test]
# Single pattern
coveragePathIgnorePatterns = "**/*.spec.ts"
# Multiple patterns
coveragePathIgnorePatterns = [
"**/*.spec.ts",
"**/*.test.ts",
"src/utils/**",
"*.config.js"
]
```
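Because the option accepts either a single string or an array, a tool consuming this config might normalize both shapes before matching — a minimal sketch, not Bun's actual implementation:

```typescript
// Normalize coveragePathIgnorePatterns, which may be a single glob
// string or an array of globs, into an array before matching.
function toPatternArray(value: string | string[]): string[] {
  return Array.isArray(value) ? value : [value];
}

console.log(toPatternArray("**/*.spec.ts")); // ["**/*.spec.ts"]
console.log(toPatternArray(["**/*.spec.ts", "dist/**"])); // ["**/*.spec.ts", "dist/**"]
```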
### `test.coverageReporter`
By default, coverage reports are printed to the console. For persistent code coverage reports in CI environments and for other tools, use `lcov`.

View File

@@ -71,6 +71,26 @@ coverageThreshold = { lines = 0.9, functions = 0.8, statements = 0.85 }
Setting any of these enables `fail_on_low_coverage`, causing the test run to fail if coverage is below the threshold.
#### coveragePathIgnorePatterns
Exclude specific files or file patterns from coverage reports using glob patterns:
```toml
[test]
# Single pattern
coveragePathIgnorePatterns = "**/*.spec.ts"
# Multiple patterns
coveragePathIgnorePatterns = [
"**/*.spec.ts",
"**/*.test.ts",
"src/utils/**",
"*.config.js"
]
```
Files matching any of these patterns will be excluded from coverage calculation and reporting. See the [coverage documentation](./coverage.md) for more details and examples.
#### coverageIgnoreSourcemaps
Internally, Bun transpiles every file, which means code coverage must pass through sourcemaps before it can be reported. This flag lets you opt out of that behavior, but the raw results can be confusing: during transpilation, Bun may move code around and rename variables. This option is mostly useful for debugging coverage issues.

View File

@@ -57,7 +57,18 @@ coverageThreshold = { lines = 0.9, functions = 0.9, statements = 0.9 }
Setting any of these thresholds enables `fail_on_low_coverage`, causing the test run to fail if coverage is below the threshold.
### Exclude test files from coverage
### Sourcemaps
Internally, Bun transpiles all files by default, so Bun automatically generates an internal [source map](https://web.dev/source-maps/) that maps lines of your original source code onto Bun's internal representation. If for any reason you want to disable this, set `test.coverageIgnoreSourcemaps` to `true`; this will rarely be desirable outside of advanced use cases.
```toml
[test]
coverageIgnoreSourcemaps = true # default false
```
### Exclude files from coverage
#### Skip test files
By default, test files themselves are included in coverage reports. You can exclude them with:
@@ -68,15 +79,33 @@ coverageSkipTestFiles = true # default false
This will exclude files matching test patterns (e.g., `*.test.ts`, `*_spec.js`) from the coverage report.
### Sourcemaps
#### Ignore specific paths and patterns
Internally, Bun transpiles all files by default, so Bun automatically generates an internal [source map](https://web.dev/source-maps/) that maps lines of your original source code onto Bun's internal representation. If for any reason you want to disable this, set `test.coverageIgnoreSourcemaps` to `true`; this will rarely be desirable outside of advanced use cases.
You can exclude specific files or file patterns from coverage reports using `coveragePathIgnorePatterns`:
```toml
[test]
coverageIgnoreSourcemaps = true # default false
# Single pattern
coveragePathIgnorePatterns = "**/*.spec.ts"
# Multiple patterns
coveragePathIgnorePatterns = [
"**/*.spec.ts",
"**/*.test.ts",
"src/utils/**",
"*.config.js"
]
```
This option accepts glob patterns and works similarly to Jest's `coveragePathIgnorePatterns` option. Files matching any of these patterns will be excluded from coverage calculation and reporting in both text and LCOV outputs.
Common use cases:
- Exclude utility files: `"src/utils/**"`
- Exclude configuration files: `"*.config.js"`
- Exclude specific test patterns: `"**/*.spec.ts"`
- Exclude build artifacts: `"dist/**"`
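To preview which files a given pattern set would exclude, the glob matching can be sketched with a simplified `**`/`*` matcher (illustrative only — not Bun's implementation, which may differ on edge cases):

```typescript
// Convert a simplified glob ("**" and "*" only) into a RegExp.
function globToRegExp(glob: string): RegExp {
  const escaped = glob
    .replace(/[.+^${}()|[\]\\]/g, "\\$&") // escape regex metacharacters
    .replace(/\*\*\//g, "(?:.*/)?")       // "**/" matches any directory depth
    .replace(/\*\*/g, ".*")               // bare "**" matches anything
    .replace(/(?<!\.)\*/g, "[^/]*");      // "*" matches within one path segment
  return new RegExp(`^${escaped}$`);
}

const ignored = (file: string, patterns: string[]) =>
  patterns.some(p => globToRegExp(p).test(file));

const patterns = ["**/*.spec.ts", "src/utils/**", "*.config.js"];
console.log(ignored("src/app.spec.ts", patterns));  // true
console.log(ignored("src/utils/fmt.ts", patterns)); // true
console.log(ignored("vite.config.js", patterns));   // true
console.log(ignored("src/index.ts", patterns));     // false
```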
### Coverage defaults
By default, coverage reports:
@@ -84,6 +113,7 @@ By default, coverage reports:
1. Exclude `node_modules` directories
2. Exclude files loaded via non-JS/TS loaders (e.g., .css, .txt) unless a custom JS loader is specified
3. Include test files themselves (can be disabled with `coverageSkipTestFiles = true` as shown above)
4. Can exclude additional files with `coveragePathIgnorePatterns` as shown above
### Coverage reporters

View File

@@ -2,16 +2,15 @@ const std = @import("std");
const path_handler = @import("../src/resolver/resolve_path.zig");
const bun = @import("bun");
const string = bun.string;
const string = []const u8;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const stringZ = [:0]const u8;
const default_allocator = bun.default_allocator;
const C = bun.C;
const Features = @import("../src/analytics/analytics_thread.zig").Features;
const Features = bun.analytics.Features;
// zig run --main-pkg-path ../ ./features.zig
pub fn main() anyerror!void {

View File

@@ -1,14 +1,13 @@
const std = @import("std");
const bun = @import("bun");
const string = bun.string;
const string = []const u8;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const stringZ = [:0]const u8;
const default_allocator = bun.default_allocator;
const C = bun.C;
const clap = @import("../src/deps/zig-clap/clap.zig");
const URL = @import("../src/url.zig").URL;
@@ -196,7 +195,7 @@ pub fn main() anyerror!void {
response_body: MutableString = undefined,
context: HTTP.HTTPChannelContext = undefined,
};
const Batch = @import("../src/thread_pool.zig").Batch;
const Batch = bun.ThreadPool.Batch;
var groups = try default_allocator.alloc(Group, args.count);
var repeat_i: usize = 0;
while (repeat_i < args.repeat + 1) : (repeat_i += 1) {

View File

@@ -1,15 +1,14 @@
// most of this file is copy pasted from other files in misctools
const std = @import("std");
const bun = @import("bun");
const string = bun.string;
const string = []const u8;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const stringZ = [:0]const u8;
const default_allocator = bun.default_allocator;
const C = bun.C;
const clap = @import("../src/deps/zig-clap/clap.zig");
const URL = @import("../src/url.zig").URL;

View File

@@ -2,15 +2,14 @@ const std = @import("std");
const path_handler = @import("../src/resolver/resolve_path.zig");
const bun = @import("bun");
const string = bun.string;
const string = []const u8;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const stringZ = [:0]const u8;
const default_allocator = bun.default_allocator;
const C = bun.C;
// zig build-exe -Doptimize=ReleaseFast --main-pkg-path ../ ./readlink-getfd.zig
pub fn main() anyerror!void {

View File

@@ -2,15 +2,14 @@ const std = @import("std");
const path_handler = @import("../src/resolver/resolve_path.zig");
const bun = @import("bun");
const string = bun.string;
const string = []const u8;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const stringZ = [:0]const u8;
const default_allocator = bun.default_allocator;
const C = bun.C;
// zig build-exe -Doptimize=ReleaseFast --main-pkg-path ../ ./readlink-getfd.zig
pub fn main() anyerror!void {

View File

@@ -2,15 +2,14 @@ const std = @import("std");
const path_handler = @import("../src/resolver/resolve_path.zig");
const bun = @import("bun");
const string = bun.string;
const string = []const u8;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const stringZ = [:0]const u8;
const default_allocator = bun.default_allocator;
const C = bun.C;
const Archive = @import("../src/libarchive/libarchive.zig").Archive;
const Zlib = @import("../src/zlib.zig");

View File

@@ -1,12 +1,15 @@
{
"private": true,
"name": "bun",
"version": "1.2.19",
"version": "1.2.20",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"
],
"devDependencies": {
"bun-tracestrings": "github:oven-sh/bun.report#912ca63e26c51429d3e6799aa2a6ab079b188fd8",
"@lezer/common": "^1.2.3",
"@lezer/cpp": "^1.1.3",
"esbuild": "^0.21.4",
"mitata": "^0.1.11",
"peechy": "0.4.34",
@@ -28,8 +31,8 @@
"watch-windows": "bun run zig build check-windows --watch -fincremental --prominent-compile-errors --global-cache-dir build/debug/zig-check-cache --zig-lib-dir vendor/zig/lib",
"bd:v": "(bun run --silent build:debug &> /tmp/bun.debug.build.log || (cat /tmp/bun.debug.build.log && rm -rf /tmp/bun.debug.build.log && exit 1)) && rm -f /tmp/bun.debug.build.log && ./build/debug/bun-debug",
"bd": "BUN_DEBUG_QUIET_LOGS=1 bun --silent bd:v",
"build:debug": "bun scripts/glob-sources.mjs > /dev/null && bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Debug -B build/debug",
"build:debug:asan": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Debug -DENABLE_ASAN=ON -B build/debug-asan",
"build:debug": "export COMSPEC=\"C:\\Windows\\System32\\cmd.exe\" && bun scripts/glob-sources.mjs > /dev/null && bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Debug -B build/debug --log-level=NOTICE",
"build:debug:asan": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Debug -DENABLE_ASAN=ON -B build/debug-asan --log-level=NOTICE",
"build:release": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Release -B build/release",
"build:ci": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Release -DCMAKE_VERBOSE_MAKEFILE=ON -DCI=true -B build/release-ci --verbose --fresh",
"build:assert": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=RelWithDebInfo -DENABLE_ASSERTIONS=ON -DENABLE_LOGS=ON -B build/release-assert",

View File

@@ -2512,6 +2512,19 @@ declare module "bun" {
* This defaults to `true`.
*/
throw?: boolean;
/**
* Custom tsconfig.json file path to use for path resolution.
* Equivalent to `--tsconfig-override` in the CLI.
* @example
* ```ts
* await Bun.build({
* entrypoints: ['./src/index.ts'],
* tsconfig: './custom-tsconfig.json'
* });
* ```
*/
tsconfig?: string;
}
/**

View File

@@ -27,11 +27,17 @@ At its core is the _Bun runtime_, a fast JavaScript runtime designed as a drop-i
- Run scripts from package.json
- Visual lockfile viewer for old binary lockfiles (`bun.lockb`)
## Bun test runner integration
Run and debug tests directly from VSCode's Testing panel. The extension automatically discovers test files, shows inline test status, and provides rich error messages with diffs.
![Test runner example](https://raw.githubusercontent.com/oven-sh/bun/refs/heads/main/packages/bun-vscode/assets/bun-test.gif)
## In-editor error messages
When running programs with Bun from a Visual Studio Code terminal, Bun will connect to the extension and report errors as they happen, at the exact location they happened. We recommend using this feature with `bun --watch` so you can see errors on every save.
![Error messages example](https://raw.githubusercontent.com/oven-sh/bun/refs/heads/main/packages/bun-vscode/error-messages.gif)
![Error messages example](https://raw.githubusercontent.com/oven-sh/bun/refs/heads/main/packages/bun-vscode/assets/error-messages.gif)
<div align="center">
<sup>In the example above VSCode is saving on every keypress. Under normal configuration you'd only see errors on every save.</sup>
@@ -95,6 +101,9 @@ You can use the following configurations to debug JavaScript and TypeScript file
// The URL of the WebSocket inspector to attach to.
// This value can be retrieved by using `bun --inspect`.
"url": "ws://localhost:6499/",
// Optional path mapping for remote debugging
"localRoot": "${workspaceFolder}",
"remoteRoot": "/app",
},
],
}

Binary file not shown (image added, 6.7 MiB).

View File

Binary file not shown (image: 462 KiB before and after).

View File

@@ -102,8 +102,6 @@
"@types/ws": ["@types/ws@8.5.12", "", { "dependencies": { "@types/node": "*" } }, "sha512-3tPRkv1EtkDpzlgyKyI8pGsGZAGPEaXeu0DOj5DI25Ja91bdAYddYHbADRYVrZMRbfW+1l5YwXVDKohDJNQxkQ=="],
"@types/xml2js": ["@types/xml2js@0.4.14", "", { "dependencies": { "@types/node": "*" } }, "sha512-4YnrRemBShWRO2QjvUin8ESA41rH+9nQGLUGZV/1IDhi3SL9OhdpNC/MrulTWuptXKwhx/aDxE7toV0f/ypIXQ=="],
"@vscode/debugadapter": ["@vscode/debugadapter@1.61.0", "", { "dependencies": { "@vscode/debugprotocol": "1.61.0" } }, "sha512-VDGLUFDVAdnftUebZe4uQCIFUbJ7rTc2Grps4D/CXl+qyzTZSQLv5VADEOZ6kBYG4SvlnMLql5vPQ0G6XvUCvQ=="],
"@vscode/debugadapter-testsupport": ["@vscode/debugadapter-testsupport@1.61.0", "", { "dependencies": { "@vscode/debugprotocol": "1.61.0" } }, "sha512-M/8aNX1aFvupd+SP0NLEVLKUK9y52BuCK5vKO2gzdpSoRUR2fR8oFbGkTie+/p2Yrcswnuf7hFx0xWkV9avRdg=="],

View File

@@ -10,15 +10,13 @@
"devDependencies": {
"@types/bun": "^1.1.10",
"@types/vscode": "^1.60.0",
"@types/xml2js": "^0.4.14",
"@vscode/debugadapter": "^1.56.0",
"@vscode/debugadapter-testsupport": "^1.56.0",
"@vscode/test-cli": "^0.0.10",
"@vscode/test-electron": "^2.4.1",
"@vscode/vsce": "^2.20.1",
"esbuild": "^0.19.2",
"typescript": "^5.0.0",
"xml2js": "^0.6.2"
"typescript": "^5.0.0"
},
"activationEvents": [
"onStartupFinished"
@@ -73,7 +71,7 @@
},
"bun.test.filePattern": {
"type": "string",
"default": "**/*{.test.,.spec.,_test_,_spec_}{js,ts,tsx,jsx,mts,cts}",
"default": "**/*{.test.,.spec.,_test_,_spec_}{js,ts,tsx,jsx,mts,cts,cjs,mjs}",
"description": "Glob pattern to find test files"
},
"bun.test.customFlag": {
@@ -83,8 +81,14 @@
},
"bun.test.customScript": {
"type": "string",
"default": "",
"default": "bun test",
"description": "Custom script to use instead of `bun test`, for example script from `package.json`"
},
"bun.test.enable": {
"type": "boolean",
"description": "If the test explorer should be enabled and integrated with your editor",
"scope": "window",
"default": true
}
}
},
@@ -279,6 +283,14 @@
"type": "boolean",
"description": "If the debugger should stop on the first line of the program.",
"default": false
},
"localRoot": {
"type": "string",
"description": "The local path that maps to \"remoteRoot\" when attaching to a remote Bun process."
},
"remoteRoot": {
"type": "string",
"description": "The remote path to the code when attaching. File paths reported by Bun that start with this path will be mapped back to 'localRoot'."
}
}
}

View File

@@ -1,5 +1,6 @@
import { DebugSession, OutputEvent } from "@vscode/debugadapter";
import { tmpdir } from "node:os";
import * as path from "node:path";
import { join } from "node:path";
import * as vscode from "vscode";
import {
@@ -220,7 +221,7 @@ class InlineDebugAdapterFactory implements vscode.DebugAdapterDescriptorFactory
session: vscode.DebugSession,
): Promise<vscode.ProviderResult<vscode.DebugAdapterDescriptor>> {
const { configuration } = session;
const { request, url, __untitledName } = configuration;
const { request, url, __untitledName, localRoot, remoteRoot } = configuration;
if (request === "attach") {
for (const [adapterUrl, adapter] of adapters) {
@@ -230,7 +231,10 @@ class InlineDebugAdapterFactory implements vscode.DebugAdapterDescriptorFactory
}
}
const adapter = new FileDebugSession(session.id, __untitledName);
const adapter = new FileDebugSession(session.id, __untitledName, {
localRoot,
remoteRoot,
});
await adapter.initialize();
return new vscode.DebugAdapterInlineImplementation(adapter);
}
@@ -275,6 +279,11 @@ interface RuntimeExceptionThrownEvent {
};
}
interface PathMapping {
localRoot?: string;
remoteRoot?: string;
}
class FileDebugSession extends DebugSession {
// If these classes are moved/published, we should make sure
// we remove these non-null assertions so consumers of
@@ -283,18 +292,60 @@ class FileDebugSession extends DebugSession {
sessionId?: string;
untitledDocPath?: string;
bunEvalPath?: string;
localRoot?: string;
remoteRoot?: string;
#isWindowsRemote = false;
constructor(sessionId?: string, untitledDocPath?: string) {
constructor(sessionId?: string, untitledDocPath?: string, mapping?: PathMapping) {
super();
this.sessionId = sessionId;
this.untitledDocPath = untitledDocPath;
if (mapping) {
this.localRoot = mapping.localRoot;
this.remoteRoot = mapping.remoteRoot;
if (typeof mapping.remoteRoot === "string") {
this.#isWindowsRemote = mapping.remoteRoot.includes("\\");
}
}
if (untitledDocPath) {
const cwd = vscode.workspace.workspaceFolders?.[0]?.uri?.fsPath ?? process.cwd();
this.bunEvalPath = join(cwd, "[eval]");
}
}
mapRemoteToLocal(p: string | undefined): string | undefined {
if (!p || !this.remoteRoot || !this.localRoot) return p;
const remoteModule = this.#isWindowsRemote ? path.win32 : path.posix;
let remoteRoot = remoteModule.normalize(this.remoteRoot);
if (!remoteRoot.endsWith(remoteModule.sep)) remoteRoot += remoteModule.sep;
let target = remoteModule.normalize(p);
const starts = this.#isWindowsRemote
? target.toLowerCase().startsWith(remoteRoot.toLowerCase())
: target.startsWith(remoteRoot);
if (starts) {
const rel = target.slice(remoteRoot.length);
const localRel = rel.split(remoteModule.sep).join(path.sep);
return path.join(this.localRoot, localRel);
}
return p;
}
mapLocalToRemote(p: string | undefined): string | undefined {
if (!p || !this.remoteRoot || !this.localRoot) return p;
let localRoot = path.normalize(this.localRoot);
if (!localRoot.endsWith(path.sep)) localRoot += path.sep;
let localPath = path.normalize(p);
if (localPath.startsWith(localRoot)) {
const rel = localPath.slice(localRoot.length);
const remoteModule = this.#isWindowsRemote ? path.win32 : path.posix;
const remoteRel = rel.split(path.sep).join(remoteModule.sep);
return remoteModule.join(this.remoteRoot, remoteRel);
}
return p;
}
async initialize() {
const uniqueId = this.sessionId ?? Math.random().toString(36).slice(2);
const url =
@@ -307,14 +358,20 @@ class FileDebugSession extends DebugSession {
if (untitledDocPath) {
this.adapter.on("Adapter.response", (response: DebugProtocolResponse) => {
if (response.body?.source?.path === bunEvalPath) {
response.body.source.path = untitledDocPath;
if (response.body?.source?.path) {
if (response.body.source.path === bunEvalPath) {
response.body.source.path = untitledDocPath;
} else {
response.body.source.path = this.mapRemoteToLocal(response.body.source.path);
}
}
if (Array.isArray(response.body?.breakpoints)) {
for (const bp of response.body.breakpoints) {
if (bp.source?.path === bunEvalPath) {
bp.source.path = untitledDocPath;
bp.verified = true;
} else if (bp.source?.path) {
bp.source.path = this.mapRemoteToLocal(bp.source.path);
}
}
}
@@ -322,14 +379,35 @@ class FileDebugSession extends DebugSession {
});
this.adapter.on("Adapter.event", (event: DebugProtocolEvent) => {
if (event.body?.source?.path === bunEvalPath) {
event.body.source.path = untitledDocPath;
if (event.body?.source?.path) {
if (event.body.source.path === bunEvalPath) {
event.body.source.path = untitledDocPath;
} else {
event.body.source.path = this.mapRemoteToLocal(event.body.source.path);
}
}
this.sendEvent(event);
});
} else {
this.adapter.on("Adapter.response", response => this.sendResponse(response));
this.adapter.on("Adapter.event", event => this.sendEvent(event));
this.adapter.on("Adapter.response", (response: DebugProtocolResponse) => {
if (response.body?.source?.path) {
response.body.source.path = this.mapRemoteToLocal(response.body.source.path);
}
if (Array.isArray(response.body?.breakpoints)) {
for (const bp of response.body.breakpoints) {
if (bp.source?.path) {
bp.source.path = this.mapRemoteToLocal(bp.source.path);
}
}
}
this.sendResponse(response);
});
this.adapter.on("Adapter.event", (event: DebugProtocolEvent) => {
if (event.body?.source?.path) {
event.body.source.path = this.mapRemoteToLocal(event.body.source.path);
}
this.sendEvent(event);
});
}
this.adapter.on("Adapter.reverseRequest", ({ command, arguments: args }) =>
@@ -345,11 +423,15 @@ class FileDebugSession extends DebugSession {
if (type === "request") {
const { untitledDocPath, bunEvalPath } = this;
const { command } = message;
if (untitledDocPath && (command === "setBreakpoints" || command === "breakpointLocations")) {
if (command === "setBreakpoints" || command === "breakpointLocations") {
const args = message.arguments as any;
if (args.source?.path === untitledDocPath) {
if (untitledDocPath && args.source?.path === untitledDocPath) {
args.source.path = bunEvalPath;
} else if (args.source?.path) {
args.source.path = this.mapLocalToRemote(args.source.path);
}
} else if (command === "source" && message.arguments?.source?.path) {
message.arguments.source.path = this.mapLocalToRemote(message.arguments.source.path);
}
this.adapter.emit("Adapter.request", message);
@@ -367,7 +449,7 @@ class TerminalDebugSession extends FileDebugSession {
signal!: TCPSocketSignal | UnixSignal;
constructor() {
super();
super(undefined, undefined);
}
async initialize() {

View File

@@ -0,0 +1,864 @@
import { describe, expect, test } from "bun:test";
import { MockTestController, MockWorkspaceFolder } from "./vscode-types.mock";
import "./vscode.mock";
import { makeTestController, makeWorkspaceFolder } from "./vscode.mock";
const { BunTestController } = await import("../bun-test-controller");
const mockTestController: MockTestController = makeTestController();
const mockWorkspaceFolder: MockWorkspaceFolder = makeWorkspaceFolder("/test/workspace");
const controller = new BunTestController(mockTestController, mockWorkspaceFolder, true);
const internal = controller._internal;
const { expandEachTests, parseTestBlocks, getBraceDepth } = internal;
describe("BunTestController (static file parser)", () => {
describe("expandEachTests", () => {
describe("$variable syntax", () => {
test("should not expand $variable patterns (Bun behavior)", () => {
const content = `test.each([
{ a: 1, b: 2, expected: 3 },
{ a: 5, b: 5, expected: 10 }
])('$a + $b = $expected', ({ a, b, expected }) => {})`;
const result = expandEachTests("test.each([", "$a + $b = $expected", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$a + $b = $expected");
});
test("should not expand string values with quotes", () => {
const content = `test.each([
{ name: "Alice", city: "NYC" },
{ name: "Bob", city: "LA" }
])('$name from $city', ({ name, city }) => {})`;
const result = expandEachTests("test.each([", "$name from $city", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$name from $city");
});
test("should not expand nested property access", () => {
const content = `test.each([
{ user: { name: "Alice", profile: { city: "NYC" } } },
{ user: { name: "Bob", profile: { city: "LA" } } }
])('$user.name from $user.profile.city', ({ user }) => {})`;
const result = expandEachTests("test.each([", "$user.name from $user.profile.city", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$user.name from $user.profile.city");
});
test("should not expand array indexing", () => {
const content = `test.each([
{ users: [{ name: "Alice" }, { name: "Bob" }] },
{ users: [{ name: "Carol" }, { name: "Dave" }] }
])('first user: $users.0.name', ({ users }) => {})`;
const result = expandEachTests("test.each([", "first user: $users.0.name", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("first user: $users.0.name");
});
test("should return template as-is for missing properties", () => {
const content = `test.each([
{ a: 1 },
{ a: 2 }
])('$a and $missing', ({ a }) => {})`;
const result = expandEachTests("test.each([", "$a and $missing", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$a and $missing");
});
test("should handle edge cases with special identifiers", () => {
const content = `test.each([
{ _valid: "ok", $dollar: "yes", _123mix: "mixed" }
])('$_valid | $$dollar | $_123mix', (obj) => {})`;
const result = expandEachTests("test.each([", "$_valid | $$dollar | $_123mix", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$_valid | $$dollar | $_123mix");
});
test("should handle invalid identifiers as literals", () => {
const content = `test.each([
{ valid: "test" }
])('$valid | $123invalid | $has-dash', (obj) => {})`;
const result = expandEachTests("test.each([", "$valid | $123invalid | $has-dash", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$valid | $123invalid | $has-dash");
});
});
describe("% formatters", () => {
test("should handle %i for integers", () => {
const content = `test.each([
[1, 2, 3],
[5, 5, 10]
])('%i + %i = %i', (a, b, expected) => {})`;
const result = expandEachTests("test.each([", "%i + %i = %i", content, 0, "test", 1);
expect(result).toHaveLength(2);
expect(result[0].name).toBe("1 + 2 = 3");
expect(result[1].name).toBe("5 + 5 = 10");
});
test("should handle %s for strings", () => {
const content = `test.each([
["hello", "world"],
["foo", "bar"]
])('%s %s', (a, b) => {})`;
const result = expandEachTests("test.each([", "%s %s", content, 0, "test", 1);
expect(result).toHaveLength(2);
expect(result[0].name).toBe("hello world");
expect(result[1].name).toBe("foo bar");
});
test("should handle %f and %d for numbers", () => {
const content = `test.each([
[1.5, 2.7],
[3.14, 2.71]
])('%f and %d', (a, b) => {})`;
const result = expandEachTests("test.each([", "%f and %d", content, 0, "test", 1);
expect(result).toHaveLength(2);
expect(result[0].name).toBe("1.5 and 2.7");
expect(result[1].name).toBe("3.14 and 2.71");
});
test("should handle %o and %j for objects", () => {
const content = `test.each([
[{ a: 1 }, { b: 2 }]
])('%o and %j', (obj1, obj2) => {})`;
const result = expandEachTests("test.each([", "%o and %j", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("%o and %j");
});
test("should handle %# for index", () => {
const content = `test.each([
[1, 2],
[3, 4],
[5, 6]
])('Test #%#: %i + %i', (a, b) => {})`;
const result = expandEachTests("test.each([", "Test #%#: %i + %i", content, 0, "test", 1);
expect(result).toHaveLength(3);
expect(result[0].name).toBe("Test #1: 1 + 2");
expect(result[1].name).toBe("Test #2: 3 + 4");
expect(result[2].name).toBe("Test #3: 5 + 6");
});
test("should handle %% for literal percent", () => {
const content = `test.each([
[50],
[100]
])('%i%% complete', (percent) => {})`;
const result = expandEachTests("test.each([", "%i%% complete", content, 0, "test", 1);
expect(result).toHaveLength(2);
expect(result[0].name).toBe("50% complete");
expect(result[1].name).toBe("100% complete");
});
});
describe("describe.each", () => {
test("should work with describe.each", () => {
const content = `describe.each([
{ module: "fs", method: "readFile" },
{ module: "path", method: "join" }
])('$module module', ({ module, method }) => {})`;
const result = expandEachTests("describe.each([", "$module module", content, 0, "describe", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$module module");
expect(result[0].type).toBe("describe");
});
});
describe("error handling", () => {
test("should handle non-.each tests", () => {
const result = expandEachTests("test", "regular test", "test('regular test', () => {})", 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("regular test");
});
test("should handle malformed JSON", () => {
const content = `test.each([
{ invalid json }
])('test', () => {})`;
const result = expandEachTests("test.each([", "test", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("test");
});
test("should handle non-array values", () => {
const content = `test.each({ not: "array" })('test', () => {})`;
const result = expandEachTests("test.each([", "test", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("test");
});
});
describe("mixed formatters", () => {
test("should handle both $ and % in objects", () => {
const content = `test.each([
{ name: "Test", index: 0 }
])('$name #%#', (obj) => {})`;
const result = expandEachTests("test.each([", "$name #%#", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$name #%#");
});
});
describe("edge cases", () => {
test("should handle complex nested objects", () => {
const content = `test.each([
{
user: {
profile: {
address: {
city: "NYC",
coords: { lat: 40.7128, lng: -74.0060 }
}
}
}
}
])('User from $user.profile.address.city at $user.profile.address.coords.lat', ({ user }) => {})`;
const result = expandEachTests(
"test.each([",
"User from $user.profile.address.city at $user.profile.address.coords.lat",
content,
0,
"test",
1,
);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("User from $user.profile.address.city at $user.profile.address.coords.lat");
});
test("should handle arrays with inline comments", () => {
const content = `test.each([
{ a: 1 }, // first test
{ a: 2 }, // second test
// { a: 3 }, // commented out test
{ a: 4 } /* final test */
])('test $a', ({ a }) => {})`;
const result = expandEachTests("test.each([", "test $a", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("test $a");
});
test("should handle arrays with multiline comments", () => {
const content = `test.each([
{ name: "test1" },
/* This is a
multiline comment
that spans several lines */
{ name: "test2" },
/**
* JSDoc style comment
* with multiple lines
*/
{ name: "test3" }
])('$name', ({ name }) => {})`;
const result = expandEachTests("test.each([", "$name", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("$name");
});
test("should handle malformed array syntax gracefully", () => {
const content = `test.each([
{ a: 1 },
{ a: 2,,, }, // extra commas
{ a: 3, }, // trailing comma
{ a: 4 },,, // extra trailing commas
])('test $a', ({ a }) => {})`;
const result = expandEachTests("test.each([", "test $a", content, 0, "test", 1);
expect(result.length).toBeGreaterThanOrEqual(1);
});
test("should handle strings with comment-like content", () => {
const content = `test.each([
{ comment: "// this is not a comment" },
{ comment: "/* neither is this */" },
{ url: "https://example.com/path" }
])('Test: $comment $url', (data) => {})`;
const result = expandEachTests("test.each([", "Test: $comment $url", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Test: $comment $url");
});
test("should handle special characters in strings", () => {
const content = `test.each([
{ char: "\\n" },
{ char: "\\t" },
{ char: "\\"" },
{ char: "\\'" },
{ char: "\\\\" },
{ char: "\`" }
])('Special char: $char', ({ char }) => {})`;
const result = expandEachTests("test.each([", "Special char: $char", content, 0, "test", 1);
expect(result.length).toBeGreaterThanOrEqual(1);
});
test("should handle empty arrays", () => {
const content = `test.each([])('should handle empty', () => {})`;
const result = expandEachTests("test.each([", "should handle empty", content, 0, "test", 1);
expect(result).toHaveLength(0);
});
test("should handle undefined and null values", () => {
const content = `test.each([
{ value: undefined },
{ value: null },
{ value: false },
{ value: 0 },
{ value: "" }
])('Value: $value', ({ value }) => {})`;
const result = expandEachTests("test.each([", "Value: $value", content, 0, "test", 1);
if (result.length === 1) {
expect(result[0].name).toBe("Value: $value");
} else {
expect(result).toHaveLength(5);
expect(result[0].name).toBe("Value: undefined");
expect(result[1].name).toBe("Value: null");
expect(result[2].name).toBe("Value: false");
expect(result[3].name).toBe("Value: 0");
expect(result[4].name).toBe("Value: ");
}
});
test("should handle circular references gracefully", () => {
const content = `test.each([
{ a: { b: "[Circular]" } },
{ a: { b: { c: "[Circular]" } } }
])('Circular: $a.b', ({ a }) => {})`;
const result = expandEachTests("test.each([", "Circular: $a.b", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Circular: $a.b");
});
test("should handle very long property paths", () => {
const content = `test.each([
{
a: {
b: {
c: {
d: {
e: {
f: {
g: "deeply nested"
}
}
}
}
}
}
}
])('Value: $a.b.c.d.e.f.g', (data) => {})`;
const result = expandEachTests("test.each([", "Value: $a.b.c.d.e.f.g", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Value: $a.b.c.d.e.f.g");
});
test("should handle syntax errors in array", () => {
const content = `test.each([
{ a: 1 }
{ a: 2 } // missing comma
{ a: 3 }
])('test $a', ({ a }) => {})`;
const result = expandEachTests("test.each([", "test $a", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("test $a");
});
test("should handle arrays with trailing commas", () => {
const content = `test.each([
{ a: 1 },
{ a: 2 },
])('test $a', ({ a }) => {})`;
const result = expandEachTests("test.each([", "test $a", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("test $a");
});
test("should handle mixed data types in arrays", () => {
const content = `test.each([
["string", 123, true, null, undefined],
[{ obj: true }, [1, 2, 3], new Date("2024-01-01")]
])('test %s %i %s %s %s', (...args) => {})`;
const result = expandEachTests("test.each([", "test %s %i %s %s %s", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("test %s %i %s %s %s");
});
test("should handle regex-like strings", () => {
const content = `test.each([
{ pattern: "/^test.*$/" },
{ pattern: "\\\\d{3}-\\\\d{4}" },
{ pattern: "[a-zA-Z]+" }
])('Pattern: $pattern', ({ pattern }) => {})`;
const result = expandEachTests("test.each([", "Pattern: $pattern", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Pattern: $pattern");
});
test("should handle invalid property access gracefully", () => {
const content = `test.each([
{ a: { b: null } },
{ a: null },
{ },
{ a: { } }
])('Access: $a.b.c.d', (data) => {})`;
const result = expandEachTests("test.each([", "Access: $a.b.c.d", content, 0, "test", 1);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Access: $a.b.c.d");
});
test("should handle object methods and computed properties", () => {
const content = `test.each([
{ fn: function() {}, method() {}, arrow: () => {} },
{ ["computed"]: "value", [Symbol.for("sym")]: "symbol" }
])('Object with methods', (obj) => {})`;
const result = expandEachTests("test.each([", "Object with methods", content, 0, "test", 1);
expect(result.length).toBeGreaterThanOrEqual(1);
});
});
});
describe("parseTestBlocks", () => {
test("should parse simple test blocks", () => {
const content = `
test("should add numbers", () => {
expect(1 + 1).toBe(2);
});
test("should multiply numbers", () => {
expect(2 * 3).toBe(6);
});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(2);
expect(result[0].name).toBe("should add numbers");
expect(result[0].type).toBe("test");
expect(result[1].name).toBe("should multiply numbers");
expect(result[1].type).toBe("test");
});
test("should parse describe blocks with nested tests", () => {
const content = `
describe("Math operations", () => {
test("addition", () => {
expect(1 + 1).toBe(2);
});
test("subtraction", () => {
expect(5 - 3).toBe(2);
});
});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Math operations");
expect(result[0].type).toBe("describe");
expect(result[0].children).toHaveLength(2);
expect(result[0].children[0].name).toBe("addition");
expect(result[0].children[1].name).toBe("subtraction");
});
test("should handle test modifiers", () => {
const content = `
test.skip("skipped test", () => {});
test.todo("todo test", () => {});
test.only("only test", () => {});
test.failing("failing test", () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(4);
expect(result[0].name).toBe("skipped test");
expect(result[1].name).toBe("todo test");
expect(result[2].name).toBe("only test");
expect(result[3].name).toBe("failing test");
});
test("should handle conditional tests", () => {
const content = `
test.if(true)("conditional test", () => {});
test.skipIf(false)("skip if test", () => {});
test.todoIf(true)("todo if test", () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(3);
expect(result[0].name).toBe("conditional test");
expect(result[1].name).toBe("skip if test");
expect(result[2].name).toBe("todo if test");
});
test("should ignore comments", () => {
const content = `
// This is a comment with test("fake test", () => {})
/* Multi-line comment
test("another fake test", () => {})
*/
test("real test", () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("real test");
});
test("should handle nested describe blocks", () => {
const content = `
describe("Outer", () => {
describe("Inner", () => {
test("deeply nested", () => {});
});
test("shallow test", () => {});
});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("Outer");
expect(result[0].children).toHaveLength(2);
expect(result[0].children[0].name).toBe("Inner");
expect(result[0].children[0].children).toHaveLength(1);
expect(result[0].children[0].children[0].name).toBe("deeply nested");
expect(result[0].children[1].name).toBe("shallow test");
});
test("should handle it() as alias for test()", () => {
const content = `
it("should work with it", () => {});
it.skip("should skip with it", () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(2);
expect(result[0].name).toBe("should work with it");
expect(result[0].type).toBe("test");
expect(result[1].name).toBe("should skip with it");
});
test("should handle different quote types", () => {
const content = `
test('single quotes', () => {});
test("double quotes", () => {});
test(\`template literals\`, () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(3);
expect(result[0].name).toBe("single quotes");
expect(result[1].name).toBe("double quotes");
expect(result[2].name).toBe("template literals");
});
test("should handle escaped quotes in test names", () => {
const content = `
test("test with \\"escaped\\" quotes", () => {});
test('test with \\'escaped\\' quotes', () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(2);
expect(result[0].name).toBe('test with "escaped" quotes');
expect(result[1].name).toBe("test with 'escaped' quotes");
});
test("should handle comments within test names", () => {
const content = `
test("test with // comment syntax", () => {});
test("test with /* comment */ syntax", () => {});
test("test with URL https://example.com", () => {});
`;
const result = parseTestBlocks(content);
expect(result.length).toBeGreaterThanOrEqual(1);
const hasCommentSyntax = result.some(r => r.name.includes("comment syntax"));
const hasURL = result.some(r => r.name.includes("https://example.com"));
expect(hasCommentSyntax || hasURL).toBe(true);
});
test("should ignore code that looks like tests in strings", () => {
const content = `
const str = "test('fake test', () => {})";
const template = \`describe("fake describe", () => {})\`;
// Real test
test("real test", () => {
const example = 'test("nested fake", () => {})';
});
`;
const result = parseTestBlocks(content);
expect(result.length).toBeGreaterThanOrEqual(1);
expect(result.some(r => r.name === "real test")).toBe(true);
});
test("should handle tests with complex modifier chains", () => {
const content = `
test.skip.failing("skipped failing test", () => {});
test.only.todo("only todo test", () => {});
describe.skip.each([1, 2])("skip each %i", (n) => {});
it.failing.each([{a: 1}])("failing each $a", ({a}) => {});
`;
const result = parseTestBlocks(content);
expect(result.length).toBeGreaterThan(0);
});
test("should handle weird spacing and formatting", () => {
const content = `
test ( "extra spaces" , ( ) => { } ) ;
test
(
"multiline test"
,
(
)
=>
{
}
)
;
test\t(\t"tabs"\t,\t()\t=>\t{}\t);
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(3);
expect(result[0].name).toBe("extra spaces");
expect(result[1].name).toBe("multiline test");
expect(result[2].name).toBe("tabs");
});
test("should handle test.each with complex patterns", () => {
const content = `
test.each([
[1, 2, 3],
[4, 5, 9]
])("when %i + %i, result should be %i", (a, b, expected) => {});
describe.each([
{ db: "postgres" },
{ db: "mysql" }
])("Database $db", ({ db }) => {
test("should connect", () => {});
});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(3);
expect(result[0].name).toBe("when 1 + 2, result should be 3");
expect(result[0].type).toBe("test");
expect(result[1].name).toBe("when 4 + 5, result should be 9");
expect(result[1].type).toBe("test");
expect(result[2].name).toBe("Database $db");
expect(result[2].type).toBe("describe");
});
test("should handle Unicode and emoji in test names", () => {
const content = `
test("测试中文", () => {});
test("テスト日本語", () => {});
test("тест русский", () => {});
test("🚀 rocket test", () => {});
test("Test with 🎉 celebration", () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(5);
expect(result[0].name).toBe("测试中文");
expect(result[1].name).toBe("テスト日本語");
expect(result[2].name).toBe("тест русский");
expect(result[3].name).toBe("🚀 rocket test");
expect(result[4].name).toBe("Test with 🎉 celebration");
});
test("should handle test names with interpolation-like syntax", () => {
const content = `
test("test with \${variable}", () => {});
test("test with \$dollar", () => {});
test("test with %percent", () => {});
test(\`template literal test\`, () => {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(4);
expect(result[0].name).toBe("test with ${variable}");
expect(result[1].name).toBe("test with $dollar");
expect(result[2].name).toBe("test with %percent");
expect(result[3].name).toBe("template literal test");
});
test("should handle async/await in test definitions", () => {
const content = `
test("sync test", () => {});
test("async test", async () => {});
test("test with await", async () => {
await something();
});
it("async it", async function() {});
`;
const result = parseTestBlocks(content);
expect(result).toHaveLength(4);
expect(result[0].name).toBe("sync test");
expect(result[1].name).toBe("async test");
expect(result[2].name).toBe("test with await");
expect(result[3].name).toBe("async it");
});
test("should handle generator functions and other ES6+ syntax", () => {
const content = `
test("generator test", function* () {
yield 1;
});
test.each\`
a | b | expected
\${1} | \${1} | \${2}
\${1} | \${2} | \${3}
\`('$a + $b = $expected', ({ a, b, expected }) => {});
`;
const result = parseTestBlocks(content);
expect(result.length).toBeGreaterThanOrEqual(1);
expect(result[0].name).toBe("generator test");
});
});
describe("getBraceDepth", () => {
test("should count braces correctly", () => {
const content = "{ { } }";
expect(getBraceDepth(content, 0, content.length)).toBe(0);
expect(getBraceDepth(content, 0, 3)).toBe(2);
expect(getBraceDepth(content, 0, 5)).toBe(1);
});
test("should ignore braces in strings", () => {
const content = '{ "string with { braces }" }';
expect(getBraceDepth(content, 0, content.length)).toBe(0);
});
test("should ignore braces in template literals", () => {
const content = "{ `template with { braces }` }";
expect(getBraceDepth(content, 0, content.length)).toBe(0);
});
test("should handle escaped quotes", () => {
const content = '{ "escaped \\" quote" }';
expect(getBraceDepth(content, 0, content.length)).toBe(0);
});
test("should handle mixed quotes", () => {
const content = `{ "double" + 'single' + \`template\` }`;
expect(getBraceDepth(content, 0, content.length)).toBe(0);
});
test("should handle nested braces", () => {
const content = "{ a: { b: { c: 1 } } }";
expect(getBraceDepth(content, 0, 10)).toBe(2);
expect(getBraceDepth(content, 0, 15)).toBe(3);
});
test("should handle complex template literals", () => {
const content = '{ `${foo({ bar: "baz" })} and ${nested.value}` }';
expect(getBraceDepth(content, 0, content.length)).toBe(0);
});
test("should handle edge cases", () => {
expect(getBraceDepth("", 0, 0)).toBe(0);
expect(getBraceDepth("{{{}}}", 0, 6)).toBe(0);
expect(getBraceDepth("{{{", 0, 3)).toBe(3);
expect(getBraceDepth("}}}", 0, 3)).toBe(-3);
const templateContent = "{ `${foo}` + `${bar}` }";
expect(getBraceDepth(templateContent, 0, templateContent.length)).toBe(0);
});
});
});
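For reference, the brace-counting behavior these `getBraceDepth` tests pin down can be sketched standalone. This is an illustrative re-implementation, not the extension's actual `getBraceDepth`; in particular the template-literal handling is simplified (the whole literal, including `${}` expressions, is skipped as a string, which matches the expectations above):

```typescript
// Illustrative sketch (not the extension's actual getBraceDepth): count the
// net {/} nesting between [start, end), ignoring brace characters that occur
// inside '...', "..." or `...` literals, and honoring escaped quotes.
function braceDepth(content: string, start: number, end: number): number {
  let depth = 0;
  let quote: string | null = null; // the open string delimiter, if any
  for (let i = start; i < end; i++) {
    const ch = content[i];
    if (quote !== null) {
      if (ch === "\\") i++; // skip the escaped character
      else if (ch === quote) quote = null; // string closed
      continue;
    }
    if (ch === '"' || ch === "'" || ch === "`") quote = ch;
    else if (ch === "{") depth++;
    else if (ch === "}") depth--;
  }
  return depth;
}

console.log(braceDepth("{ { } }", 0, 7)); // → 0
console.log(braceDepth("{ { } }", 0, 3)); // → 2
console.log(braceDepth('{ "string with { braces }" }', 0, 28)); // → 0
```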

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -0,0 +1,570 @@
/**
* Mock VSCode types and classes for testing
* These should be as close as possible to the real VSCode API
*/
export interface MockUri {
readonly scheme: string;
readonly authority: string;
readonly path: string;
readonly query: string;
readonly fragment: string;
readonly fsPath: string;
toString(): string;
}
export class MockUri implements MockUri {
constructor(
public readonly scheme: string,
public readonly authority: string,
public readonly path: string,
public readonly query: string,
public readonly fragment: string,
public readonly fsPath: string,
) {}
static file(path: string): MockUri {
return new MockUri("file", "", path, "", "", path);
}
toString(): string {
return `${this.scheme}://${this.authority}${this.path}`;
}
}
export class MockPosition {
constructor(
public readonly line: number,
public readonly character: number,
) {}
}
export class MockRange {
constructor(
public readonly start: MockPosition,
public readonly end: MockPosition,
) {}
}
export class MockLocation {
constructor(
public readonly uri: MockUri,
public readonly range: MockRange,
) {}
}
export class MockTestTag {
constructor(public readonly id: string) {}
}
export class MockTestMessage {
public location?: MockLocation;
public actualOutput?: string;
public expectedOutput?: string;
constructor(public message: string | MockMarkdownString) {}
static diff(message: string, expected: string, actual: string): MockTestMessage {
const msg = new MockTestMessage(message);
msg.expectedOutput = expected;
msg.actualOutput = actual;
return msg;
}
}
export class MockMarkdownString {
constructor(public value: string = "") {}
appendCodeblock(code: string, language?: string): MockMarkdownString {
this.value += `\n\`\`\`${language || ""}\n${code}\n\`\`\``;
return this;
}
appendMarkdown(value: string): MockMarkdownString {
this.value += value;
return this;
}
appendText(value: string): MockMarkdownString {
this.value += value.replace(/[\\`*_{}[\]()#+\-.!]/g, "\\$&");
return this;
}
}
export interface MockTestItem {
readonly id: string;
readonly uri?: MockUri;
readonly children: MockTestItemCollection;
readonly parent?: MockTestItem;
label: string;
description?: string;
tags: readonly MockTestTag[];
canResolveChildren: boolean;
busy: boolean;
range?: MockRange;
error?: string | MockMarkdownString;
}
export interface MockTestItemCollection {
readonly size: number;
add(item: MockTestItem): void;
replace(items: readonly MockTestItem[]): void;
forEach(callback: (item: MockTestItem, id: string, collection: MockTestItemCollection) => void): void;
get(itemId: string): MockTestItem | undefined;
delete(itemId: string): void;
[Symbol.iterator](): Iterator<[string, MockTestItem]>;
}
export class MockTestItemCollection implements MockTestItemCollection {
private items = new Map<string, MockTestItem>();
get size(): number {
return this.items.size;
}
add(item: MockTestItem): void {
this.items.set(item.id, item);
}
replace(items: readonly MockTestItem[]): void {
this.items.clear();
for (const item of items) {
this.items.set(item.id, item);
}
}
forEach(callback: (item: MockTestItem, id: string, collection: MockTestItemCollection) => void): void {
this.items.forEach((item, id) => callback(item, id, this));
}
get(itemId: string): MockTestItem | undefined {
return this.items.get(itemId);
}
delete(itemId: string): void {
this.items.delete(itemId);
}
[Symbol.iterator](): Iterator<[string, MockTestItem]> {
return this.items[Symbol.iterator]();
}
clear(): void {
this.items.clear();
}
set(id: string, item: MockTestItem): void {
this.items.set(id, item);
}
values(): IterableIterator<MockTestItem> {
return this.items.values();
}
keys(): IterableIterator<string> {
return this.items.keys();
}
entries(): IterableIterator<[string, MockTestItem]> {
return this.items.entries();
}
}
export class MockTestItem implements MockTestItem {
public canResolveChildren: boolean = false;
public busy: boolean = false;
public description?: string;
public range?: MockRange;
public error?: string | MockMarkdownString;
public readonly children: MockTestItemCollection;
constructor(
public readonly id: string,
public label: string,
public readonly uri?: MockUri,
public readonly parent?: MockTestItem,
public tags: readonly MockTestTag[] = [],
) {
this.children = new MockTestItemCollection();
}
}
export interface MockTestController {
readonly items: MockTestItemCollection;
createTestItem(id: string, label: string, uri?: MockUri): MockTestItem;
createRunProfile(
label: string,
kind: MockTestRunProfileKind,
runHandler: (request: MockTestRunRequest, token: MockCancellationToken) => void | Promise<void>,
isDefault?: boolean,
): MockTestRunProfile;
createTestRun(request: MockTestRunRequest, name?: string, persist?: boolean): MockTestRun;
invalidateTestResults(items?: readonly MockTestItem[]): void;
resolveHandler?: (item: MockTestItem | undefined) => Promise<void> | void;
refreshHandler?: (token?: MockCancellationToken) => Promise<void> | void;
}
export class MockTestController implements MockTestController {
public readonly items: MockTestItemCollection;
public resolveHandler?: (item: MockTestItem | undefined) => Promise<void> | void;
public refreshHandler?: (token?: MockCancellationToken) => Promise<void> | void;
constructor(
public readonly id: string,
public readonly label: string,
) {
this.items = new MockTestItemCollection();
}
createTestItem(id: string, label: string, uri?: MockUri): MockTestItem {
return new MockTestItem(id, label, uri);
}
createRunProfile(
label: string,
kind: MockTestRunProfileKind,
runHandler: (request: MockTestRunRequest, token: MockCancellationToken) => void | Promise<void>,
isDefault?: boolean,
): MockTestRunProfile {
return new MockTestRunProfile(label, kind, runHandler, isDefault);
}
createTestRun(request: MockTestRunRequest, name?: string, persist?: boolean): MockTestRun {
return new MockTestRun(name, persist);
}
invalidateTestResults(items?: readonly MockTestItem[]): void {
// Mock implementation - in real VSCode this would invalidate test results
}
dispose(): void {
this.items.clear();
}
}
export enum MockTestRunProfileKind {
Run = 1,
Debug = 2,
Coverage = 3,
}
export interface MockTestRunProfile {
readonly label: string;
readonly kind: MockTestRunProfileKind;
readonly isDefault: boolean;
readonly runHandler: (request: MockTestRunRequest, token: MockCancellationToken) => void | Promise<void>;
dispose(): void;
}
export class MockTestRunProfile implements MockTestRunProfile {
constructor(
public readonly label: string,
public readonly kind: MockTestRunProfileKind,
public readonly runHandler: (request: MockTestRunRequest, token: MockCancellationToken) => void | Promise<void>,
public readonly isDefault: boolean = false,
) {}
dispose(): void {
// No-op for mock
}
}
export interface MockTestRunRequest {
readonly include?: readonly MockTestItem[];
readonly exclude?: readonly MockTestItem[];
readonly profile?: MockTestRunProfile;
}
export class MockTestRunRequest implements MockTestRunRequest {
constructor(
public readonly include?: readonly MockTestItem[],
public readonly exclude?: readonly MockTestItem[],
public readonly profile?: MockTestRunProfile,
) {}
}
export interface MockTestRun {
readonly name?: string;
readonly token: MockCancellationToken;
appendOutput(output: string, location?: MockLocation, test?: MockTestItem): void;
end(): void;
enqueued(test: MockTestItem): void;
errored(test: MockTestItem, message: MockTestMessage | readonly MockTestMessage[], duration?: number): void;
failed(test: MockTestItem, message: MockTestMessage | readonly MockTestMessage[], duration?: number): void;
passed(test: MockTestItem, duration?: number): void;
skipped(test: MockTestItem): void;
started(test: MockTestItem): void;
}
export class MockTestRun implements MockTestRun {
public readonly token: MockCancellationToken;
private _ended: boolean = false;
constructor(
public readonly name?: string,
public readonly persist: boolean = true,
) {
this.token = new MockCancellationToken();
}
appendOutput(output: string, location?: MockLocation, test?: MockTestItem): void {
if (this._ended) return;
// Mock: output is discarded; in real VS Code it would appear in the test output
}
end(): void {
this._ended = true;
}
enqueued(test: MockTestItem): void {
if (this._ended) return;
// Mock implementation
}
errored(test: MockTestItem, message: MockTestMessage | readonly MockTestMessage[], duration?: number): void {
if (this._ended) return;
// Mock implementation
}
failed(test: MockTestItem, message: MockTestMessage | readonly MockTestMessage[], duration?: number): void {
if (this._ended) return;
// Mock implementation
}
passed(test: MockTestItem, duration?: number): void {
if (this._ended) return;
// Mock implementation
}
skipped(test: MockTestItem): void {
if (this._ended) return;
// Mock implementation
}
started(test: MockTestItem): void {
if (this._ended) return;
// Mock implementation
}
}
export interface MockCancellationToken {
readonly isCancellationRequested: boolean;
onCancellationRequested(listener: () => void): MockDisposable;
}
export class MockCancellationToken implements MockCancellationToken {
private _isCancellationRequested: boolean = false;
private _listeners: (() => void)[] = [];
get isCancellationRequested(): boolean {
return this._isCancellationRequested;
}
onCancellationRequested(listener: () => void): MockDisposable {
this._listeners.push(listener);
return new MockDisposable(() => {
const index = this._listeners.indexOf(listener);
if (index >= 0) {
this._listeners.splice(index, 1);
}
});
}
cancel(): void {
this._isCancellationRequested = true;
this._listeners.forEach(listener => listener());
}
}
export interface MockDisposable {
dispose(): void;
}
export class MockDisposable implements MockDisposable {
constructor(private readonly disposeFunc?: () => void) {}
dispose(): void {
this.disposeFunc?.();
}
}
export interface MockTextDocument {
readonly uri: MockUri;
readonly fileName: string;
readonly isUntitled: boolean;
readonly languageId: string;
readonly version: number;
readonly isDirty: boolean;
readonly isClosed: boolean;
readonly eol: MockEndOfLine;
readonly lineCount: number;
getText(range?: MockRange): string;
getWordRangeAtPosition(position: MockPosition, regex?: RegExp): MockRange | undefined;
lineAt(line: number | MockPosition): MockTextLine;
offsetAt(position: MockPosition): number;
positionAt(offset: number): MockPosition;
save(): Promise<boolean>;
validatePosition(position: MockPosition): MockPosition;
validateRange(range: MockRange): MockRange;
}
export enum MockEndOfLine {
LF = 1,
CRLF = 2,
}
export interface MockTextLine {
readonly lineNumber: number;
readonly text: string;
readonly range: MockRange;
readonly rangeIncludingLineBreak: MockRange;
readonly firstNonWhitespaceCharacterIndex: number;
readonly isEmptyOrWhitespace: boolean;
}
export interface MockWorkspaceFolder {
readonly uri: MockUri;
readonly name: string;
readonly index: number;
}
export class MockWorkspaceFolder implements MockWorkspaceFolder {
constructor(
public readonly uri: MockUri,
public readonly name: string,
public readonly index: number = 0,
) {}
}
export interface MockFileSystemWatcher extends MockDisposable {
readonly ignoreCreateEvents: boolean;
readonly ignoreChangeEvents: boolean;
readonly ignoreDeleteEvents: boolean;
onDidCreate(listener: (uri: MockUri) => void): MockDisposable;
onDidChange(listener: (uri: MockUri) => void): MockDisposable;
onDidDelete(listener: (uri: MockUri) => void): MockDisposable;
}
export class MockFileSystemWatcher implements MockFileSystemWatcher {
public readonly ignoreCreateEvents: boolean = false;
public readonly ignoreChangeEvents: boolean = false;
public readonly ignoreDeleteEvents: boolean = false;
private _createListeners: ((uri: MockUri) => void)[] = [];
private _changeListeners: ((uri: MockUri) => void)[] = [];
private _deleteListeners: ((uri: MockUri) => void)[] = [];
onDidCreate(listener: (uri: MockUri) => void): MockDisposable {
this._createListeners.push(listener);
return new MockDisposable(() => {
const index = this._createListeners.indexOf(listener);
if (index >= 0) this._createListeners.splice(index, 1);
});
}
onDidChange(listener: (uri: MockUri) => void): MockDisposable {
this._changeListeners.push(listener);
return new MockDisposable(() => {
const index = this._changeListeners.indexOf(listener);
if (index >= 0) this._changeListeners.splice(index, 1);
});
}
onDidDelete(listener: (uri: MockUri) => void): MockDisposable {
this._deleteListeners.push(listener);
return new MockDisposable(() => {
const index = this._deleteListeners.indexOf(listener);
if (index >= 0) this._deleteListeners.splice(index, 1);
});
}
dispose(): void {
this._createListeners.length = 0;
this._changeListeners.length = 0;
this._deleteListeners.length = 0;
}
// Helper methods for testing
triggerCreate(uri: MockUri): void {
this._createListeners.forEach(listener => listener(uri));
}
triggerChange(uri: MockUri): void {
this._changeListeners.forEach(listener => listener(uri));
}
triggerDelete(uri: MockUri): void {
this._deleteListeners.forEach(listener => listener(uri));
}
}
export interface MockRelativePattern {
readonly base: string;
readonly pattern: string;
}
export class MockRelativePattern implements MockRelativePattern {
constructor(
public readonly base: string | MockWorkspaceFolder,
public readonly pattern: string,
) {}
get baseUri(): MockUri {
if (typeof this.base === "string") {
return MockUri.file(this.base);
}
return this.base.uri;
}
}
export interface MockConfiguration {
get<T>(section: string, defaultValue?: T): T | undefined;
has(section: string): boolean;
inspect<T>(section: string): MockConfigurationInspect<T> | undefined;
update(section: string, value: any, configurationTarget?: MockConfigurationTarget): Promise<void>;
}
export interface MockConfigurationInspect<T> {
readonly key: string;
readonly defaultValue?: T;
readonly globalValue?: T;
readonly workspaceValue?: T;
readonly workspaceFolderValue?: T;
}
export enum MockConfigurationTarget {
Global = 1,
Workspace = 2,
WorkspaceFolder = 3,
}
export class MockConfiguration implements MockConfiguration {
private _values = new Map<string, any>();
get<T>(section: string, defaultValue?: T): T | undefined {
return this._values.get(section) ?? defaultValue;
}
has(section: string): boolean {
return this._values.has(section);
}
inspect<T>(section: string): MockConfigurationInspect<T> | undefined {
return {
key: section,
defaultValue: undefined,
globalValue: this._values.get(section),
workspaceValue: undefined,
workspaceFolderValue: undefined,
};
}
async update(section: string, value: any, configurationTarget?: MockConfigurationTarget): Promise<void> {
this._values.set(section, value);
}
// Helper for testing
setValue(section: string, value: any): void {
this._values.set(section, value);
}
}
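The watcher and cancellation-token mocks above both follow the same listener-registry shape: subscribing returns a disposable that splices the listener back out. A minimal standalone sketch of that pattern (the `Emitter` name is illustrative, not part of the mock file):

```typescript
type Listener<T> = (value: T) => void;

// Minimal event emitter mirroring the subscribe/dispose shape used by
// MockFileSystemWatcher and MockCancellationToken: on() registers a listener
// and returns a disposable that removes it; fire() notifies all listeners.
class Emitter<T> {
  private listeners: Listener<T>[] = [];
  on(listener: Listener<T>): { dispose(): void } {
    this.listeners.push(listener);
    return {
      dispose: () => {
        const i = this.listeners.indexOf(listener);
        if (i >= 0) this.listeners.splice(i, 1);
      },
    };
  }
  fire(value: T): void {
    this.listeners.forEach(l => l(value));
  }
}

const seen: string[] = [];
const emitter = new Emitter<string>();
const sub = emitter.on(path => seen.push(path));
emitter.fire("/a.test.ts");
sub.dispose();
emitter.fire("/b.test.ts"); // not recorded; the listener was disposed
console.log(seen); // seen is ["/a.test.ts"]
```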


@@ -0,0 +1,56 @@
import { mock } from "bun:test";
import {
MockConfiguration,
MockDisposable,
MockFileSystemWatcher,
MockLocation,
MockMarkdownString,
MockPosition,
MockRange,
MockRelativePattern,
MockTestController,
MockTestMessage,
MockTestRunProfileKind,
MockTestTag,
MockUri,
MockWorkspaceFolder,
} from "./vscode-types.mock";
mock.module("vscode", () => ({
window: {
createOutputChannel: () => ({
appendLine: () => {},
}),
visibleTextEditors: [],
},
workspace: {
getConfiguration: (section?: string) => new MockConfiguration(),
onDidOpenTextDocument: () => new MockDisposable(),
textDocuments: [],
createFileSystemWatcher: (pattern: string | MockRelativePattern) => new MockFileSystemWatcher(),
findFiles: async (include: string, exclude?: string, maxResults?: number, token?: any) => {
return []; // Mock implementation
},
},
Uri: MockUri,
TestTag: MockTestTag,
Position: MockPosition,
Range: MockRange,
Location: MockLocation,
TestMessage: MockTestMessage,
MarkdownString: MockMarkdownString,
TestRunProfileKind: MockTestRunProfileKind,
RelativePattern: MockRelativePattern,
debug: {
addBreakpoints: () => {},
startDebugging: async () => true,
},
}));
export function makeTestController(): MockTestController {
return new MockTestController("test-controller", "Test Controller");
}
export function makeWorkspaceFolder(path: string): MockWorkspaceFolder {
return new MockWorkspaceFolder(MockUri.file(path), path.split("/").pop() || "workspace", 0);
}


@@ -17,7 +17,7 @@ export const debug = vscode.window.createOutputChannel("Bun - Test Runner");
 export type TestNode = {
   name: string;
-  type: "describe" | "test" | "it";
+  type: "describe" | "test";
   line: number;
   children: TestNode[];
   parent?: TestNode;
@@ -51,11 +51,15 @@ export class BunTestController implements vscode.Disposable {
   private currentRunType: "file" | "individual" = "file";
   private requestedTestIds: Set<string> = new Set();
   private discoveredTestIds: Set<string> = new Set();
+  private executedTestCount: number = 0;
+  private totalTestsStarted: number = 0;
   constructor(
     private readonly testController: vscode.TestController,
     private readonly workspaceFolder: vscode.WorkspaceFolder,
+    readonly isTest: boolean = false,
   ) {
+    if (isTest) return;
     this.setupTestController();
     this.setupWatchers();
     this.setupOpenDocumentListener();
@@ -67,10 +71,7 @@ export class BunTestController implements vscode.Disposable {
     try {
       this.signal = await this.createSignal();
       await this.signal.ready;
-      debug.appendLine(`Signal initialized at: ${this.signal.url}`);
       this.signal.on("Signal.Socket.connect", (socket: net.Socket) => {
-        debug.appendLine("Bun connected to signal socket");
         this.handleSocketConnection(socket, this.currentRun!);
       });
@@ -89,8 +90,9 @@ export class BunTestController implements vscode.Disposable {
};
this.testController.refreshHandler = async token => {
const files = await this.discoverInitialTests(token);
const files = await this.discoverInitialTests(token, false);
if (!files?.length) return;
if (token.isCancellationRequested) return;
const filePaths = new Set(files.map(f => f.fsPath));
for (const [, testItem] of this.testController.items) {
@@ -134,15 +136,21 @@ export class BunTestController implements vscode.Disposable {
}
private isTestFile(document: vscode.TextDocument): boolean {
return document?.uri?.scheme === "file" && /\.(test|spec)\.(js|jsx|ts|tsx|cjs|mts)$/.test(document.uri.fsPath);
return (
document?.uri?.scheme === "file" && /\.(test|spec)\.(js|jsx|ts|tsx|cjs|mjs|mts|cts)$/.test(document.uri.fsPath)
);
}
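The widened suffix check above can be exercised as a standalone predicate. This is a paraphrase of the method for illustration (the `isTestFilePath` name is ours; the `uri.scheme === "file"` guard is elided):

```typescript
// Sketch of the broadened test-file check: the usual Jest/Bun suffixes,
// now also matching .mjs and .cts.
const isTestFilePath = (fsPath: string): boolean =>
  /\.(test|spec)\.(js|jsx|ts|tsx|cjs|mjs|mts|cts)$/.test(fsPath);
```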
private async discoverInitialTests(cancellationToken?: vscode.CancellationToken): Promise<vscode.Uri[] | undefined> {
private async discoverInitialTests(
cancellationToken?: vscode.CancellationToken,
reset: boolean = true,
): Promise<vscode.Uri[] | undefined> {
try {
const tests = await this.findTestFiles(cancellationToken);
this.createFileTestItems(tests);
this.createFileTestItems(tests, reset);
return tests;
} catch {
} catch (error) {
debug.appendLine(`Error in discoverInitialTests: ${error}`);
return undefined;
}
}
@@ -179,6 +187,8 @@ export class BunTestController implements vscode.Disposable {
const ignoreGlobs = new Set(["**/node_modules/**"]);
for (const ignore of ignores) {
if (cancellationToken?.isCancellationRequested) return [];
try {
const content = await fs.readFile(ignore.fsPath, { encoding: "utf8" });
const lines = content
@@ -195,13 +205,15 @@ export class BunTestController implements vscode.Disposable {
ignoreGlobs.add(path.join(cwd.trim(), line.trim()));
}
}
} catch {}
} catch {
debug.appendLine(`Error in buildIgnoreGlobs: ${ignore.fsPath}`);
}
}
return [...ignoreGlobs.values()];
}
private createFileTestItems(files: vscode.Uri[]): void {
private createFileTestItems(files: vscode.Uri[], reset: boolean = true): void {
if (files.length === 0) {
return;
}
@@ -214,7 +226,9 @@ export class BunTestController implements vscode.Disposable {
path.relative(this.workspaceFolder.uri.fsPath, file.fsPath) || file.fsPath,
file,
);
fileTestItem.children.replace([]);
if (reset) {
fileTestItem.children.replace([]);
}
fileTestItem.canResolveChildren = true;
this.testController.items.add(fileTestItem);
}
@@ -274,7 +288,13 @@ export class BunTestController implements vscode.Disposable {
return { bunCommand, testArgs };
}
private async discoverTests(testItem?: vscode.TestItem | false, filePath?: string): Promise<void> {
private async discoverTests(
testItem?: vscode.TestItem | false,
filePath?: string,
cancellationToken?: vscode.CancellationToken,
): Promise<void> {
if (cancellationToken?.isCancellationRequested) return;
let targetPath = filePath;
if (!targetPath && testItem) {
targetPath = testItem?.uri?.fsPath || this.workspaceFolder.uri.fsPath;
@@ -297,17 +317,24 @@ export class BunTestController implements vscode.Disposable {
);
this.testController.items.add(fileTestItem);
}
fileTestItem.children.replace([]);
if (!this.currentRun) {
fileTestItem.children.replace([]);
}
fileTestItem.canResolveChildren = false;
this.addTestNodes(testNodes, fileTestItem, targetPath);
} catch {}
} catch {
debug.appendLine(`Error in discoverTests: ${targetPath}`);
}
}
private parseTestBlocks(fileContent: string): TestNode[] {
const cleanContent = fileContent
.replace(/\/\*[\s\S]*?\*\//g, match => match.replace(/[^\n\r]/g, " "))
.replace(/\/\/.*$/gm, match => " ".repeat(match.length));
.replace(/('(?:[^'\\]|\\.)*'|"(?:[^"\\]|\\.)*"|`(?:[^`\\]|\\.)*`)|\/\/.*$/gm, (match, str) => {
if (str) return str;
return " ".repeat(match.length);
});
const testRegex =
/\b(describe|test|it)(?:\.(?:skip|todo|failing|only))?(?:\.(?:if|todoIf|skipIf)\s*\([^)]*\))?(?:\.each\s*\([^)]*\))?\s*\(\s*(['"`])((?:\\\2|.)*?)\2\s*(?:,|\))/g;
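The string-aware comment stripping above can be tried in isolation. This sketch reproduces the two replace passes (the `blankComments` helper name is ours); blanking with spaces rather than deleting keeps character offsets, and therefore computed line numbers, stable:

```typescript
// Blank block comments and line comments with spaces, but keep string
// literals verbatim so "//" inside a test name is not treated as a comment.
function blankComments(source: string): string {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, m => m.replace(/[^\n\r]/g, " "))
    .replace(/('(?:[^'\\]|\\.)*'|"(?:[^"\\]|\\.)*"|`(?:[^`\\]|\\.)*`)|\/\/.*$/gm, (m, str) =>
      str ? str : " ".repeat(m.length),
    );
}

const input = `test("a // not a comment", () => {}) // trailing`;
const out = blankComments(input);
```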
@@ -319,6 +346,7 @@ export class BunTestController implements vscode.Disposable {
match = testRegex.exec(cleanContent);
while (match !== null) {
const [full, type, , name] = match;
const _type = type === "it" ? "test" : type;
const line = cleanContent.slice(0, match.index).split("\n").length - 1;
while (
@@ -329,7 +357,14 @@ export class BunTestController implements vscode.Disposable {
stack.pop();
}
const expandedNodes = this.expandEachTests(full, name, cleanContent, match.index, type as TestNode["type"], line);
const expandedNodes = this.expandEachTests(
full,
name,
cleanContent,
match.index,
_type as TestNode["type"],
line,
);
for (const node of expandedNodes) {
if (stack.length === 0) {
@@ -433,16 +468,16 @@ export class BunTestController implements vscode.Disposable {
throw new Error("Not an array");
}
return eachValues.map(val => {
let testName = name;
return eachValues.map((val, testIndex) => {
let testName = name.replace(/%%/g, "%").replace(/%#/g, (testIndex + 1).toString());
if (Array.isArray(val)) {
let idx = 0;
testName = testName.replace(/%[isfd]/g, () => {
testName = testName.replace(/%[isfdojp#%]/g, () => {
const v = val[idx++];
return typeof v === "object" ? JSON.stringify(v) : String(v);
});
} else {
testName = testName.replace(/%[isfd]/g, () => {
testName = testName.replace(/%[isfdojp#%]/g, () => {
return typeof val === "object" ? JSON.stringify(val) : String(val);
});
}
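The `.each` title expansion above can be sketched as a standalone function mirroring the same replacements (the `expandEachName` name is ours): `%%` becomes a literal `%`, `%#` becomes the 1-based case index, and positional tokens such as `%i`/`%s` consume values in order.

```typescript
// Expand a test.each title for one case: %% -> %, %# -> index + 1,
// positional tokens consume array values (or repeat a scalar value).
function expandEachName(name: string, val: unknown, testIndex: number): string {
  let testName = name.replace(/%%/g, "%").replace(/%#/g, (testIndex + 1).toString());
  if (Array.isArray(val)) {
    let idx = 0;
    testName = testName.replace(/%[isfdojp#%]/g, () => {
      const v = val[idx++];
      return typeof v === "object" ? JSON.stringify(v) : String(v);
    });
  } else {
    testName = testName.replace(/%[isfdojp#%]/g, () =>
      typeof val === "object" ? JSON.stringify(val) : String(val),
    );
  }
  return testName;
}
```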
@@ -475,19 +510,22 @@ export class BunTestController implements vscode.Disposable {
: this.escapeTestName(node.name);
const testId = `${filePath}#${nodePath}`;
const testItem = this.testController.createTestItem(testId, this.stripAnsi(node.name), vscode.Uri.file(filePath));
let testItem = parent.children.get(testId);
if (!testItem) {
testItem = this.testController.createTestItem(testId, this.stripAnsi(node.name), vscode.Uri.file(filePath));
testItem.tags = [new vscode.TestTag(node.type === "describe" ? "describe" : "test")];
if (node.type) testItem.tags = [new vscode.TestTag(node.type)];
if (typeof node.line === "number") {
testItem.range = new vscode.Range(
new vscode.Position(node.line, 0),
new vscode.Position(node.line, node.name.length),
);
if (typeof node.line === "number") {
testItem.range = new vscode.Range(
new vscode.Position(node.line, 0),
new vscode.Position(node.line, node.name.length),
);
}
parent.children.add(testItem);
}
parent.children.add(testItem);
if (node.children.length > 0) {
this.addTestNodes(node.children, testItem, filePath, nodePath);
}
@@ -500,7 +538,7 @@ export class BunTestController implements vscode.Disposable {
}
private escapeTestName(source: string): string {
return source.replace(/[^a-zA-Z0-9_\ ]/g, "\\$&");
return source.replace(/[^\w \-\u0080-\uFFFF]/g, "\\$&");
}
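The relaxed escaping above can be tried on its own: regex metacharacters are escaped, while word characters, spaces, hyphens, and non-ASCII code points pass through unchanged (so test names in other scripts are not over-escaped).

```typescript
// Escape everything except \w, space, hyphen, and U+0080..U+FFFF.
const escapeTestName = (source: string): string =>
  source.replace(/[^\w \-\u0080-\uFFFF]/g, "\\$&");
```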
private async createSignal(): Promise<UnixSignal | TCPSocketSignal> {
@@ -517,6 +555,23 @@ export class BunTestController implements vscode.Disposable {
token: vscode.CancellationToken,
isDebug: boolean,
): Promise<void> {
if (this.currentRun) {
this.closeAllActiveProcesses();
this.disconnectInspector();
if (this.currentRun) {
this.currentRun.appendOutput("\n\x1b[33mCancelled: Starting new test run\x1b[0m\n");
this.currentRun.end();
this.currentRun = null;
}
}
this.totalTestsStarted++;
if (this.totalTestsStarted > 15) {
this.closeAllActiveProcesses();
this.disconnectInspector();
this.signal?.close();
this.signal = null;
}
const run = this.testController.createTestRun(request);
token.onCancellationRequested(() => {
@@ -525,6 +580,14 @@ export class BunTestController implements vscode.Disposable {
this.disconnectInspector();
});
if ("onDidDispose" in run) {
(run.onDidDispose as vscode.Event<void>)(() => {
run?.end?.();
this.closeAllActiveProcesses();
this.disconnectInspector();
});
}
const queue: vscode.TestItem[] = [];
if (request.include) {
@@ -547,7 +610,9 @@ export class BunTestController implements vscode.Disposable {
await this.runTestsWithInspector(queue, run, token);
} catch (error) {
for (const test of queue) {
run.errored(test, new vscode.TestMessage(`Error: ${error}`));
const msg = new vscode.TestMessage(`Error: ${error}`);
msg.location = new vscode.Location(test.uri!, test.range || new vscode.Range(0, 0, 0, 0));
run.errored(test, msg);
}
} finally {
run.end();
@@ -557,8 +622,11 @@ export class BunTestController implements vscode.Disposable {
private async runTestsWithInspector(
tests: vscode.TestItem[],
run: vscode.TestRun,
_token: vscode.CancellationToken,
token: vscode.CancellationToken,
): Promise<void> {
const time = performance.now();
if (token.isCancellationRequested) return;
this.disconnectInspector();
const allFiles = new Set<string>();
@@ -569,13 +637,20 @@ export class BunTestController implements vscode.Disposable {
}
if (allFiles.size === 0) {
run.appendOutput("No test files found to run.\n");
return;
const errorMsg = "No test files found to run.";
run.appendOutput(`\x1b[31mError: ${errorMsg}\x1b[0m\n`);
for (const test of tests) {
const msg = new vscode.TestMessage(errorMsg);
msg.location = new vscode.Location(test.uri!, test.range || new vscode.Range(0, 0, 0, 0));
run.errored(test, msg);
}
throw new Error(errorMsg);
}
for (const test of tests) {
if (token.isCancellationRequested) return;
if (test.uri && test.canResolveChildren) {
await this.discoverTests(test);
await this.discoverTests(test, undefined, token);
}
}
@@ -584,6 +659,7 @@ export class BunTestController implements vscode.Disposable {
this.requestedTestIds.clear();
this.discoveredTestIds.clear();
this.executedTestCount = 0;
for (const test of tests) {
this.requestedTestIds.add(test.id);
}
@@ -607,21 +683,38 @@ export class BunTestController implements vscode.Disposable {
resolve();
};
const handleCancel = () => {
clearTimeout(timeout);
this.signal!.off("Signal.Socket.connect", handleConnect);
reject(new Error("Test run cancelled"));
};
token.onCancellationRequested(handleCancel);
this.signal!.once("Signal.Socket.connect", handleConnect);
});
const { bunCommand, testArgs } = this.getBunExecutionConfig();
let args = [...testArgs, ...Array.from(allFiles)];
let args = [...testArgs, ...allFiles];
let printedArgs = `\x1b[34;1m>\x1b[0m \x1b[34;1m${bunCommand} ${testArgs.join(" ")}\x1b[2m`;
for (const file of allFiles) {
const f = path.relative(this.workspaceFolder.uri.fsPath, file) || file;
if (f.includes(" ")) {
printedArgs += ` ".${path.sep}${f}"`;
} else {
printedArgs += ` .${path.sep}${f}`;
}
}
if (isIndividualTestRun) {
const pattern = this.buildTestNamePattern(tests);
if (pattern) {
args.push("--test-name-pattern", process.platform === "win32" ? `"${pattern}"` : pattern);
args.push("--test-name-pattern", pattern);
printedArgs += `\x1b[0m\x1b[2m --test-name-pattern "${pattern}"\x1b[0m`;
}
}
run.appendOutput(`\r\n\x1b[34m>\x1b[0m \x1b[2m${bunCommand} ${args.join(" ")}\x1b[0m\r\n\r\n`);
args.push(`--inspect-wait=${this.signal!.url}`);
run.appendOutput(printedArgs + "\x1b[0m\r\n\r\n");
for (const test of tests) {
if (isIndividualTestRun || tests.length === 1) {
@@ -631,34 +724,52 @@ export class BunTestController implements vscode.Disposable {
}
}
let inspectorUrl: string | undefined =
this.signal.url.startsWith("ws") || this.signal.url.startsWith("tcp")
? `${this.signal!.url}?wait=1`
: `${this.signal!.url}`;
// right now there isn't a way to tell the socket method to wait for the connection
if (!inspectorUrl?.includes("?wait=1")) {
args.push(`--inspect-wait=${this.signal!.url}`);
inspectorUrl = undefined;
}
const proc = spawn(bunCommand, args, {
cwd: this.workspaceFolder.uri.fsPath,
env: {
...process.env,
BUN_DEBUG_QUIET_LOGS: "1",
FORCE_COLOR: "1",
NO_COLOR: "0",
BUN_INSPECT: inspectorUrl,
...process.env,
},
});
this.activeProcesses.add(proc);
let stdout = "";
proc.on("exit", (code, signal) => {
debug.appendLine(`Process exited with code ${code}, signal ${signal}`);
if (code !== 0 && code !== 1) {
debug.appendLine(`Test process failed: exit ${code}, signal ${signal}`);
}
});
proc.on("error", error => {
stdout += `Process error: ${error.message}\n`;
debug.appendLine(`Process error: ${error.message}`);
});
proc.stdout?.on("data", data => {
const dataStr = data.toString();
stdout += dataStr;
const formattedOutput = dataStr.replace(/\n/g, "\r\n");
run.appendOutput(formattedOutput);
});
proc.stderr?.on("data", data => {
const dataStr = data.toString();
stdout += dataStr;
const formattedOutput = dataStr.replace(/\n/g, "\r\n");
run.appendOutput(formattedOutput);
});
@@ -666,35 +777,57 @@ export class BunTestController implements vscode.Disposable {
try {
await socketPromise;
} catch (error) {
debug.appendLine(`Failed to establish inspector connection: ${error}`);
debug.appendLine(`Signal URL was: ${this.signal!.url}`);
debug.appendLine(`Command was: ${bunCommand} ${args.join(" ")}`);
debug.appendLine(`Connection failed: ${error} (URL: ${this.signal!.url})`);
throw error;
}
await new Promise<void>((resolve, reject) => {
proc.on("close", code => {
const handleClose = (code: number | null) => {
this.activeProcesses.delete(proc);
if (code === 0 || code === 1) {
resolve();
} else {
reject(new Error(`Process exited with code ${code}`));
reject(new Error(`Process exited with code ${code}. Please check the console for more details.`));
}
});
};
proc.on("error", error => {
const handleError = (error: Error) => {
this.activeProcesses.delete(proc);
reject(error);
});
};
const handleCancel = () => {
proc.kill("SIGTERM");
this.activeProcesses.delete(proc);
reject(new Error("Test run cancelled"));
};
proc.on("close", handleClose);
proc.on("error", handleError);
token.onCancellationRequested(handleCancel);
}).finally(() => {
if (isIndividualTestRun) {
this.applyPreviousResults(tests, run);
if (this.discoveredTestIds.size === 0) {
const errorMsg =
"No tests were executed. This could mean:\r\n- All tests were filtered out\r\n- The test runner crashed before running tests\r\n- No tests match the pattern";
run.appendOutput(`\n\x1b[31m\x1b[1mError:\x1b[0m\x1b[31m ${errorMsg}\x1b[0m\n`);
for (const test of tests) {
if (!this.testResultHistory.has(test.id)) {
const msg = new vscode.TestMessage(errorMsg + "\n\n----------\n" + stdout + "\n----------\n");
msg.location = new vscode.Location(test.uri!, test.range || new vscode.Range(0, 0, 0, 0));
run.errored(test, msg);
}
}
}
if (isIndividualTestRun) {
this.cleanupUndiscoveredTests(tests);
} else {
this.cleanupStaleTests(tests);
if (this.discoveredTestIds.size > 0 && this.executedTestCount > 0) {
if (isIndividualTestRun) {
this.applyPreviousResults(tests, run);
this.cleanupUndiscoveredTests(tests);
} else {
this.cleanupStaleTests(tests);
}
}
if (this.activeProcesses.has(proc)) {
@@ -704,6 +837,7 @@ export class BunTestController implements vscode.Disposable {
this.disconnectInspector();
this.currentRun = null;
debug.appendLine(`Test run completed in ${performance.now() - time}ms`);
});
}
@@ -725,7 +859,7 @@ export class BunTestController implements vscode.Disposable {
run.passed(item, previousResult.duration);
break;
case "failed":
run.failed(item, previousResult.message || new vscode.TestMessage("Test failed"), previousResult.duration);
run.failed(item, [], previousResult.duration);
break;
case "skipped":
run.skipped(item);
@@ -763,16 +897,11 @@ export class BunTestController implements vscode.Disposable {
this.handleLifecycleError(event, run);
});
this.debugAdapter.on("Inspector.event", e => {
debug.appendLine(`Received inspector event: ${e.method}`);
});
this.debugAdapter.on("Inspector.error", e => {
debug.appendLine(`Inspector error: ${e}`);
});
socket.on("close", () => {
debug.appendLine("Inspector connection closed");
this.debugAdapter = null;
});
@@ -799,7 +928,6 @@ export class BunTestController implements vscode.Disposable {
const { id: inspectorTestId, url: sourceURL, name, type, parentId, line } = params;
if (!sourceURL) {
debug.appendLine(`Warning: Test found without URL: ${name}`);
return;
}
@@ -814,8 +942,6 @@ export class BunTestController implements vscode.Disposable {
this.inspectorToVSCode.set(inspectorTestId, testItem);
this.vscodeToInspector.set(testItem.id, inspectorTestId);
this.discoveredTestIds.add(testItem.id);
} else {
debug.appendLine(`Could not find VS Code test item for: ${name} in ${path.basename(filePath)}`);
}
}
@@ -931,6 +1057,7 @@ export class BunTestController implements vscode.Disposable {
if (!testItem) return;
const duration = elapsed / 1000000;
this.executedTestCount++;
if (
this.currentRunType === "individual" &&
@@ -959,7 +1086,6 @@ export class BunTestController implements vscode.Disposable {
break;
case "skip":
case "todo":
case "skipped_because_label":
run.skipped(testItem);
this.testResultHistory.set(testItem.id, { status: "skipped" });
break;
@@ -970,6 +1096,8 @@ export class BunTestController implements vscode.Disposable {
run.failed(testItem, timeoutMsg, duration);
this.testResultHistory.set(testItem.id, { status: "failed", message: timeoutMsg, duration });
break;
case "skipped_because_label":
break;
}
}
@@ -1078,7 +1206,10 @@ export class BunTestController implements vscode.Disposable {
const lines = messageLinesRaw;
const errorLine = lines[0].trim();
const messageLines = lines.slice(1).join("\n");
const messageLines = lines
.slice(1)
.filter(line => line.trim())
.join("\n");
const errorType = errorLine.replace(/^(E|e)rror: /, "").trim();
@@ -1090,8 +1221,8 @@ export class BunTestController implements vscode.Disposable {
const regex = /^Expected:\s*([\s\S]*?)\nReceived:\s*([\s\S]*?)$/;
let testMessage = vscode.TestMessage.diff(
errorLine,
messageLines.match(regex)?.[1].trim() || "",
messageLines.match(regex)?.[2].trim() || "",
messageLines.trim().match(regex)?.[1].trim() || "",
messageLines.trim().match(regex)?.[2].trim() || "",
);
if (!messageLines.match(regex)) {
const code = messageLines
@@ -1153,7 +1284,7 @@ export class BunTestController implements vscode.Disposable {
lastEffortMsg = lastEffortMsg.reverse();
}
const msg = errorLine.startsWith("error: expect")
const msg = errorType.startsWith("expect")
? `${lastEffortMsg.join("\n")}\n${errorLine.trim()}`.trim()
: `${errorLine.trim()}\n${messageLines}`.trim();
@@ -1201,12 +1332,15 @@ export class BunTestController implements vscode.Disposable {
t = t.replaceAll(/\$\{[^}]+\}/g, ".*?");
t = t.replaceAll(/\\\$\\\{[^}]+\\\}/g, ".*?");
t = t.replaceAll(/\\%[isfd]/g, ".*?");
t = t.replaceAll(/\\%[isfdojp#%]|(\\%)|(\\#)/g, ".*?");
t = t.replaceAll(/\$[\w\.\[\]]+/g, ".*?");
if (test.tags.some(tag => tag.id === "test" || tag.id === "it")) {
if (test?.tags?.some(tag => tag.id === "test" || tag.id === "it")) {
testNames.push(`^ ${t}$`);
} else {
} else if (test?.tags?.some(tag => tag.id === "describe")) {
testNames.push(`^ ${t} `);
} else {
testNames.push(t);
}
}
@@ -1242,7 +1376,13 @@ export class BunTestController implements vscode.Disposable {
const isIndividualTestRun = this.shouldUseTestNamePattern(tests);
if (testFiles.size === 0) {
run.appendOutput("No test files found to debug.\n");
const errorMsg = "No test files found to debug.";
run.appendOutput(`\x1b[31mError: ${errorMsg}\x1b[0m\n`);
for (const test of tests) {
const msg = new vscode.TestMessage(errorMsg);
msg.location = new vscode.Location(test.uri!, test.range || new vscode.Range(0, 0, 0, 0));
run.errored(test, msg);
}
run.end();
return;
}
@@ -1268,7 +1408,7 @@ export class BunTestController implements vscode.Disposable {
const pattern = this.buildTestNamePattern(tests);
if (pattern) {
args.push("--test-name-pattern", process.platform === "win32" ? `"${pattern}"` : pattern);
args.push("--test-name-pattern", pattern);
}
}
@@ -1289,9 +1429,12 @@ export class BunTestController implements vscode.Disposable {
if (!res) throw new Error("Failed to start debugging session");
} catch (error) {
for (const test of tests) {
run.errored(test, new vscode.TestMessage(`Error starting debugger: ${error}`));
const msg = new vscode.TestMessage(`Error starting debugger: ${error}`);
msg.location = new vscode.Location(test.uri!, test.range || new vscode.Range(0, 0, 0, 0));
run.errored(test, msg);
}
}
run.appendOutput("\n\x1b[33mDebug session started. Please open the debug console to see its output.\x1b[0m\r\n");
run.end();
}
@@ -1318,6 +1461,32 @@ export class BunTestController implements vscode.Disposable {
}
this.disposables = [];
}
// a somewhat hacky way to expose internal functions to the test suite
public get _internal() {
return {
expandEachTests: this.expandEachTests.bind(this),
parseTestBlocks: this.parseTestBlocks.bind(this),
getBraceDepth: this.getBraceDepth.bind(this),
buildTestNamePattern: this.buildTestNamePattern.bind(this),
stripAnsi: this.stripAnsi.bind(this),
processErrorData: this.processErrorData.bind(this),
escapeTestName: this.escapeTestName.bind(this),
shouldUseTestNamePattern: this.shouldUseTestNamePattern.bind(this),
isTestFile: this.isTestFile.bind(this),
customFilePattern: this.customFilePattern.bind(this),
getBunExecutionConfig: this.getBunExecutionConfig.bind(this),
findTestByPath: this.findTestByPath.bind(this),
findTestByName: this.findTestByName.bind(this),
createTestItem: this.createTestItem.bind(this),
createErrorMessage: this.createErrorMessage.bind(this),
cleanupTestItem: this.cleanupTestItem.bind(this),
};
}
}
function windowsVscodeUri(uri: string): string {

View File

@@ -7,8 +7,14 @@ export async function registerTests(context: vscode.ExtensionContext) {
return;
}
const config = vscode.workspace.getConfiguration("bun.test");
const enable = config.get<boolean>("enable", true);
if (!enable) {
return;
}
try {
const controller = vscode.tests.createTestController("bun-tests", "Bun Tests");
const controller = vscode.tests.createTestController("bun", "Bun Tests");
context.subscriptions.push(controller);
const bunTestController = new BunTestController(controller, workspaceFolder);

View File

@@ -2,8 +2,8 @@
+++ CMakeLists.txt
@@ -1,5 +1,5 @@
#
-CMAKE_MINIMUM_REQUIRED(VERSION 2.8.12 FATAL_ERROR)
+CMAKE_MINIMUM_REQUIRED(VERSION 2.8.12...3.5 FATAL_ERROR)
if(POLICY CMP0065)
cmake_policy(SET CMP0065 NEW) #3.4 don't use `-rdynamic` with executables
endif()
-cmake_minimum_required(VERSION 3.17 FATAL_ERROR)
+cmake_minimum_required(VERSION 3.17...3.30 FATAL_ERROR)
PROJECT(libarchive C)
#

View File

@@ -1,22 +1,29 @@
--- a/libarchive/archive_write_add_filter_gzip.c
+++ b/libarchive/archive_write_add_filter_gzip.c
@@ -58,6 +58,7 @@ archive_write_set_compression_gzip(struct archive *a)
struct private_data {
--- a/libarchive/archive_write_add_filter_gzip.c 2025-07-21 06:29:58.505101515 +0000
+++ b/libarchive/archive_write_add_filter_gzip.c 2025-07-21 06:44:09.023676935 +0000
@@ -59,12 +59,13 @@
int compression_level;
int timestamp;
+ unsigned char os;
char *original_filename;
+ unsigned char os;
#ifdef HAVE_ZLIB_H
z_stream stream;
int64_t total_in;
@@ -106,6 +107,7 @@ archive_write_add_filter_gzip(struct archive *_a)
archive_set_error(&a->archive, ENOMEM, "Out of memory");
unsigned char *compressed;
size_t compressed_buffer_size;
- unsigned long crc;
+ uint32_t crc;
#else
struct archive_write_program_data *pdata;
#endif
@@ -108,6 +109,7 @@
return (ARCHIVE_FATAL);
}
+ data->os = 3; /* default Unix */
f->data = data;
+ data->os = 3; /* default Unix */
f->open = &archive_compressor_gzip_open;
f->options = &archive_compressor_gzip_options;
@@ -166,6 +168,30 @@ archive_compressor_gzip_options(struct archive_write_filter *f, const char *key,
f->close = &archive_compressor_gzip_close;
@@ -177,6 +179,30 @@
return (ARCHIVE_OK);
}
@@ -47,7 +54,7 @@
/* Note: The "warn" return is just to inform the options
* supervisor that we didn't handle it. It will generate
* a suitable error if no one used this option. */
@@ -226,7 +252,7 @@ archive_compressor_gzip_open(struct archive_write_filter *f)
@@ -236,7 +262,7 @@
data->compressed[8] = 4;
else
data->compressed[8] = 0;

View File

@@ -1,4 +1,4 @@
# Version: 9
# Version: 11
# A script that installs the dependencies needed to build and test Bun.
# This should work on Windows 10 or newer with PowerShell.
@@ -282,6 +282,8 @@ function Install-Build-Essentials {
strawberryperl `
mingw
Install-Rust
# Needed to remap stack traces
Install-PdbAddr2line
Install-Llvm
}
@@ -342,6 +344,10 @@ function Install-Rust {
Add-To-Path "$rustPath\cargo\bin"
}
function Install-PdbAddr2line {
Execute-Command cargo install --examples "pdb-addr2line@0.11.2"
}
function Install-Llvm {
Install-Package llvm `
-Command clang-cl `

View File

@@ -1,5 +1,5 @@
#!/bin/sh
# Version: 15
# Version: 18
# A script that installs the dependencies needed to build and test Bun.
# This should work on macOS and Linux with a POSIX shell.
@@ -1510,17 +1510,20 @@ configure_core_dumps() {
# disable apport.service if it exists since it will override the core_pattern
if which systemctl >/dev/null; then
if systemctl list-unit-files apport.service >/dev/null; then
execute_sudo "$systemctl" disable --now apport.service || true
execute_sudo "$systemctl" disable --now apport.service
fi
fi
# load the new configuration (ignore permission errors)
execute_sudo sysctl -p "$sysctl_file" || true
execute_sudo sysctl -p "$sysctl_file"
# ensure that a regular user will be able to run sysctl
if [ -d /sbin ]; then
append_to_path /sbin
fi
# install gdb for backtraces
install_packages gdb
;;
esac
}
@@ -1538,6 +1541,17 @@ clean_system() {
done
}
ensure_no_tmpfs() {
if ! [ "$os" = "linux" ]; then
return
fi
if ! [ "$distro" = "ubuntu" ]; then
return
fi
execute_sudo systemctl mask tmp.mount
}
main() {
check_features "$@"
check_operating_system
@@ -1555,6 +1569,7 @@ main() {
configure_core_dumps
fi
clean_system
ensure_no_tmpfs
}
main "$@"

View File

@@ -224,6 +224,31 @@ async function spawn(command, args, options, label) {
...options,
});
let killedManually = false;
function onKill() {
clearOnKill();
if (!subprocess.killed) {
killedManually = true;
subprocess.kill?.();
}
}
function clearOnKill() {
process.off("beforeExit", onKill);
process.off("SIGINT", onKill);
process.off("SIGTERM", onKill);
}
// Kill the entire process tree so everything gets cleaned up. On Windows, job
// control groups make this happen automatically, so we don't need to do this
// on Windows.
if (process.platform !== "win32") {
process.once("beforeExit", onKill);
process.once("SIGINT", onKill);
process.once("SIGTERM", onKill);
}
let timestamp;
subprocess.on("spawn", () => {
timestamp = Date.now();
@@ -253,8 +278,14 @@ async function spawn(command, args, options, label) {
}
const { error, exitCode, signalCode } = await new Promise(resolve => {
subprocess.on("error", error => resolve({ error }));
subprocess.on("exit", (exitCode, signalCode) => resolve({ exitCode, signalCode }));
subprocess.on("error", error => {
clearOnKill();
resolve({ error });
});
subprocess.on("exit", (exitCode, signalCode) => {
clearOnKill();
resolve({ exitCode, signalCode });
});
});
if (done) {
@@ -301,7 +332,9 @@ async function spawn(command, args, options, label) {
}
if (signalCode) {
console.error(`Command killed: ${signalCode}`);
if (!killedManually) {
console.error(`Command killed: ${signalCode}`);
}
} else {
console.error(`Command exited: code ${exitCode}`);
}

View File

@@ -44,17 +44,21 @@ if (!fs.existsSync(join(dir, "bun-profile")) || !fs.existsSync(join(dir, `bun-${
await Bun.$`bash -c ${`age -d -i <(echo "$AGE_CORES_IDENTITY")`} < ${cores} | tar -zxvC ${dir}`;
console.log("moving cores out of nested directory");
for await (const file of new Bun.Glob("bun-cores-*/bun-*.core").scan(dir)) {
for await (const file of new Bun.Glob("bun-cores-*/*.core").scan(dir)) {
fs.renameSync(join(dir, file), join(dir, basename(file)));
}
} else {
console.log(`already downloaded in ${dir}`);
}
console.log("launching debugger:");
console.log(`${debuggerPath} --core ${join(dir, `bun-${pid}.core`)} ${join(dir, "bun-profile")}`);
const desiredCore = join(dir, (await new Bun.Glob(`*${pid}.core`).scan(dir).next()).value);
const proc = await Bun.spawn([debuggerPath, "--core", join(dir, `bun-${pid}.core`), join(dir, "bun-profile")], {
const args = [debuggerPath, "--core", desiredCore, join(dir, "bun-profile")];
console.log("launching debugger:");
console.log(args.map(Bun.$.escape).join(" "));
const proc = Bun.spawn(args, {
stdin: "inherit",
stdout: "inherit",
stderr: "inherit",

View File

@@ -29,6 +29,27 @@ const formatTime = (ms: number): string => {
const start = Date.now();
let totalTimeEstimate = -1;
function report() {
process.stdout.write("\n");
const attemptsReached =
numOk + numTimedOut + signals.values().reduce((a, b) => a + b, 0) + codes.values().reduce((a, b) => a + b, 0);
green(`${pad(numOk)}/${attemptsReached} OK`);
if (numTimedOut > 0) {
red(`${pad(numTimedOut)}/${attemptsReached} timeout`);
}
for (const [signal, count] of signals.entries()) {
red(`${pad(count)}/${attemptsReached} ${signal}`);
}
for (const [code, count] of codes.entries()) {
red(`${pad(count)}/${attemptsReached} code ${code}`);
}
process.exit(numOk === attemptsReached ? 0 : 1);
}
process.on("SIGINT", report);
for (let i = 0; i < attempts; i++) {
const proc = Bun.spawn({
cmd: argv,
@@ -75,17 +96,5 @@ for (let i = 0; i < attempts; i++) {
const remaining = totalTimeEstimate - (now - start);
process.stdout.write(`\r\x1b[2K${pad(i + 1)}/${attempts} completed, ${formatTime(remaining)} remaining`);
}
process.stdout.write("\n");
green(`${pad(numOk)}/${attempts} OK`);
if (numTimedOut > 0) {
red(`${pad(numTimedOut)}/${attempts} timeout`);
}
for (const [signal, count] of signals.entries()) {
red(`${pad(count)}/${attempts} ${signal}`);
}
for (const [code, count] of codes.entries()) {
red(`${pad(count)}/${attempts} code ${code}`);
}
process.exit(numOk === attempts ? 0 : 1);
report();

View File

@@ -20,7 +20,7 @@ async function globSources(output, patterns, excludes = []) {
const sources =
paths
.map(path => normalize(relative(root, path)))
.map(path => normalize(relative(root, path).replaceAll("\\", "/")))
.sort((a, b) => a.localeCompare(b))
.join("\n")
.trim() + "\n";

scripts/p-limit.mjs Normal file
View File

@@ -0,0 +1,109 @@
/**
* p-limit@6.2.0
* https://github.com/sindresorhus/p-limit
* MIT (c) Sindre Sorhus
*/
import Queue from "./yocto-queue.mjs";
export default function pLimit(concurrency) {
validateConcurrency(concurrency);
const queue = new Queue();
let activeCount = 0;
const resumeNext = () => {
if (activeCount < concurrency && queue.size > 0) {
queue.dequeue()();
// Since `pendingCount` has been decreased by one, increase `activeCount` by one.
activeCount++;
}
};
const next = () => {
activeCount--;
resumeNext();
};
const run = async (function_, resolve, arguments_) => {
const result = (async () => function_(...arguments_))();
resolve(result);
try {
await result;
} catch {}
next();
};
const enqueue = (function_, resolve, arguments_) => {
// Queue `internalResolve` instead of the `run` function
// to preserve asynchronous context.
new Promise(internalResolve => {
queue.enqueue(internalResolve);
}).then(run.bind(undefined, function_, resolve, arguments_));
(async () => {
// This function needs to wait until the next microtask before comparing
// `activeCount` to `concurrency`, because `activeCount` is updated asynchronously
// after the `internalResolve` function is dequeued and called. The comparison in the if-statement
// needs to happen asynchronously as well to get an up-to-date value for `activeCount`.
await Promise.resolve();
if (activeCount < concurrency) {
resumeNext();
}
})();
};
const generator = (function_, ...arguments_) =>
new Promise(resolve => {
enqueue(function_, resolve, arguments_);
});
Object.defineProperties(generator, {
activeCount: {
get: () => activeCount,
},
pendingCount: {
get: () => queue.size,
},
clearQueue: {
value() {
queue.clear();
},
},
concurrency: {
get: () => concurrency,
set(newConcurrency) {
validateConcurrency(newConcurrency);
concurrency = newConcurrency;
queueMicrotask(() => {
// eslint-disable-next-line no-unmodified-loop-condition
while (activeCount < concurrency && queue.size > 0) {
resumeNext();
}
});
},
},
});
return generator;
}
export function limitFunction(function_, option) {
const { concurrency } = option;
const limit = pLimit(concurrency);
return (...arguments_) => limit(() => function_(...arguments_));
}
function validateConcurrency(concurrency) {
if (!((Number.isInteger(concurrency) || concurrency === Number.POSITIVE_INFINITY) && concurrency > 0)) {
throw new TypeError("Expected `concurrency` to be a number from 1 and up");
}
}
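The vendored `pLimit` module above enqueues callers past the concurrency cap and resumes them as active tasks finish. A stand-alone sketch of that contract (a hypothetical `tinyLimit`, not the vendored implementation, which also exposes `activeCount`/`pendingCount` and uses the yocto-queue class):

```javascript
// Minimal concurrency limiter illustrating the pLimit contract:
// at most `concurrency` wrapped functions run at once, the rest wait FIFO.
function tinyLimit(concurrency) {
  let active = 0;
  const waiting = [];
  const next = () => {
    active--;
    if (waiting.length > 0) waiting.shift()();
  };
  return fn =>
    new Promise((resolve, reject) => {
      const run = () => {
        active++;
        Promise.resolve()
          .then(fn)
          .then(
            value => { resolve(value); next(); },
            error => { reject(error); next(); },
          );
      };
      if (active < concurrency) run();
      else waiting.push(run);
    });
}

const limit = tinyLimit(2);
let inFlight = 0, peak = 0;
const task = () => {
  inFlight++;
  peak = Math.max(peak, inFlight);
  return new Promise(r => setTimeout(() => { inFlight--; r(); }, 10));
};
const done = Promise.all(Array.from({ length: 6 }, () => limit(task))).then(() => {
  console.log("peak concurrency:", peak); // stays at the limit of 2
});
```

Six tasks are submitted at once, but `peak` never exceeds the limit, which is the property the test runner relies on when `--parallel` maps tests through `pLimit(availableParallelism())`.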


@@ -20,6 +20,7 @@ import {
readdirSync,
readFileSync,
realpathSync,
rmSync,
statSync,
symlinkSync,
unlink,
@@ -27,9 +28,12 @@ import {
writeFileSync,
} from "node:fs";
import { readFile } from "node:fs/promises";
import { userInfo } from "node:os";
import { availableParallelism, userInfo } from "node:os";
import { basename, dirname, extname, join, relative, sep } from "node:path";
import { createInterface } from "node:readline";
import { setTimeout as setTimeoutPromise } from "node:timers/promises";
import { parseArgs } from "node:util";
import pLimit from "./p-limit.mjs";
import {
getAbi,
getAbiVersion,
@@ -62,6 +66,7 @@ import {
unzip,
uploadArtifact,
} from "./utils.mjs";
let isQuiet = false;
const cwd = import.meta.dirname ? dirname(import.meta.dirname) : process.cwd();
const testsPath = join(cwd, "test");
@@ -152,6 +157,10 @@ const { values: options, positionals: filters } = parseArgs({
type: "boolean",
default: isBuildkite && isLinux,
},
["parallel"]: {
type: "boolean",
default: false,
},
},
});
@@ -170,6 +179,25 @@ if (options["quiet"]) {
isQuiet = true;
}
let coresDir;
if (options["coredump-upload"]) {
// this sysctl is set in bootstrap.sh to /var/bun-cores-$distro-$release-$arch
const sysctl = await spawnSafe({ command: "sysctl", args: ["-n", "kernel.core_pattern"] });
coresDir = sysctl.stdout;
if (sysctl.ok) {
if (coresDir.startsWith("|")) {
throw new Error("cores are being piped not saved");
}
// change /foo/bar/%e-%p.core to /foo/bar
coresDir = dirname(sysctl.stdout);
} else {
throw new Error(`Failed to check core_pattern: ${sysctl.error}`);
}
}
let remapPort = undefined;
/**
* @typedef {Object} TestExpectation
* @property {string} filename
@@ -340,6 +368,10 @@ async function runTests() {
const failedResults = [];
const maxAttempts = 1 + (parseInt(options["retries"]) || 0);
const parallelism = options["parallel"] ? availableParallelism() : 1;
console.log("parallelism", parallelism);
const limit = pLimit(parallelism);
/**
* @param {string} title
* @param {function} fn
@@ -349,17 +381,21 @@ async function runTests() {
const index = ++i;
let result, failure, flaky;
for (let attempt = 1; attempt <= maxAttempts; attempt++) {
let attempt = 1;
for (; attempt <= maxAttempts; attempt++) {
if (attempt > 1) {
await new Promise(resolve => setTimeout(resolve, 5000 + Math.random() * 10_000));
}
result = await startGroup(
attempt === 1
? `${getAnsi("gray")}[${index}/${total}]${getAnsi("reset")} ${title}`
: `${getAnsi("gray")}[${index}/${total}]${getAnsi("reset")} ${title} ${getAnsi("gray")}[attempt #${attempt}]${getAnsi("reset")}`,
fn,
);
let grouptitle = `${getAnsi("gray")}[${index}/${total}]${getAnsi("reset")} ${title}`;
if (attempt > 1) grouptitle += ` ${getAnsi("gray")}[attempt #${attempt}]${getAnsi("reset")}`;
if (parallelism > 1) {
console.log(grouptitle);
result = await fn();
} else {
result = await startGroup(grouptitle, fn);
}
const { ok, stdoutPreview, error } = result;
if (ok) {
@@ -374,6 +410,7 @@ async function runTests() {
const color = attempt >= maxAttempts ? "red" : "yellow";
const label = `${getAnsi(color)}[${index}/${total}] ${title} - ${error}${getAnsi("reset")}`;
startGroup(label, () => {
if (parallelism > 1) return;
process.stderr.write(stdoutPreview);
});
@@ -394,14 +431,15 @@ async function runTests() {
// Group flaky tests together, regardless of the title
const context = flaky ? "flaky" : title;
const style = flaky || title.startsWith("vendor") ? "warning" : "error";
if (!flaky) attempt = 1; // no need to show the retries count on failures, we know it maxed out
if (title.startsWith("vendor")) {
const content = formatTestToMarkdown({ ...failure, testPath: title });
const content = formatTestToMarkdown({ ...failure, testPath: title }, false, attempt - 1);
if (content) {
reportAnnotationToBuildKite({ context, label: title, content, style });
}
} else {
const content = formatTestToMarkdown(failure);
const content = formatTestToMarkdown(failure, false, attempt - 1);
if (content) {
reportAnnotationToBuildKite({ context, label: title, content, style });
}
@@ -411,10 +449,10 @@ async function runTests() {
if (isGithubAction) {
const summaryPath = process.env["GITHUB_STEP_SUMMARY"];
if (summaryPath) {
const longMarkdown = formatTestToMarkdown(failure);
const longMarkdown = formatTestToMarkdown(failure, false, attempt - 1);
appendFileSync(summaryPath, longMarkdown);
}
const shortMarkdown = formatTestToMarkdown(failure, true);
const shortMarkdown = formatTestToMarkdown(failure, true, attempt - 1);
appendFileSync("comment.md", shortMarkdown);
}
@@ -433,48 +471,100 @@ async function runTests() {
}
if (!failedResults.length) {
for (const testPath of tests) {
const absoluteTestPath = join(testsPath, testPath);
const title = relative(cwd, absoluteTestPath).replaceAll(sep, "/");
if (isNodeTest(testPath)) {
const testContent = readFileSync(absoluteTestPath, "utf-8");
const runWithBunTest =
title.includes("needs-test") || testContent.includes("bun:test") || testContent.includes("node:test");
const subcommand = runWithBunTest ? "test" : "run";
const env = {
FORCE_COLOR: "0",
NO_COLOR: "1",
BUN_DEBUG_QUIET_LOGS: "1",
};
if ((basename(execPath).includes("asan") || !isCI) && shouldValidateExceptions(testPath)) {
env.BUN_JSC_validateExceptionChecks = "1";
}
await runTest(title, async () => {
const { ok, error, stdout } = await spawnBun(execPath, {
cwd: cwd,
args: [subcommand, "--config=" + join(import.meta.dirname, "../bunfig.node-test.toml"), absoluteTestPath],
timeout: getNodeParallelTestTimeout(title),
env,
stdout: chunk => pipeTestStdout(process.stdout, chunk),
stderr: chunk => pipeTestStdout(process.stderr, chunk),
});
const mb = 1024 ** 3;
const stdoutPreview = stdout.slice(0, mb).split("\n").slice(0, 50).join("\n");
return {
testPath: title,
ok: ok,
status: ok ? "pass" : "fail",
error: error,
errors: [],
tests: [],
stdout: stdout,
stdoutPreview: stdoutPreview,
};
});
// TODO: remove windows exclusion here
if (isCI && !isWindows) {
// bun install has succeeded
const { promise: portPromise, resolve: portResolve } = Promise.withResolvers();
const { promise: errorPromise, resolve: errorResolve } = Promise.withResolvers();
console.log("run in", cwd);
let exiting = false;
const server = spawn(execPath, ["run", "ci-remap-server", execPath, cwd, getCommit()], {
stdio: ["ignore", "pipe", "inherit"],
cwd, // run in main repo
env: { ...process.env, BUN_DEBUG_QUIET_LOGS: "1", NO_COLOR: "1" },
});
server.unref();
server.on("error", errorResolve);
server.on("exit", (code, signal) => {
if (!exiting && (code !== 0 || signal !== null)) errorResolve(signal ? signal : "code " + code);
});
process.on("exit", () => {
exiting = true;
server.kill();
});
const lines = createInterface(server.stdout);
lines.on("line", line => {
portResolve({ port: parseInt(line) });
});
const result = await Promise.race([portPromise, errorPromise, setTimeoutPromise(5000, "timeout")]);
if (typeof result.port != "number") {
server.kill();
console.warn("ci-remap server did not start:", result);
} else {
await runTest(title, async () => spawnBunTest(execPath, join("test", testPath)));
console.log("crash reports parsed on port", result.port);
remapPort = result.port;
}
}
await Promise.all(
tests.map(testPath =>
limit(() => {
const absoluteTestPath = join(testsPath, testPath);
const title = relative(cwd, absoluteTestPath).replaceAll(sep, "/");
if (isNodeTest(testPath)) {
const testContent = readFileSync(absoluteTestPath, "utf-8");
const runWithBunTest =
title.includes("needs-test") || testContent.includes("bun:test") || testContent.includes("node:test");
const subcommand = runWithBunTest ? "test" : "run";
const env = {
FORCE_COLOR: "0",
NO_COLOR: "1",
BUN_DEBUG_QUIET_LOGS: "1",
};
if ((basename(execPath).includes("asan") || !isCI) && shouldValidateExceptions(testPath)) {
env.BUN_JSC_validateExceptionChecks = "1";
}
return runTest(title, async () => {
const { ok, error, stdout, crashes } = await spawnBun(execPath, {
cwd: cwd,
args: [
subcommand,
"--config=" + join(import.meta.dirname, "../bunfig.node-test.toml"),
absoluteTestPath,
],
timeout: getNodeParallelTestTimeout(title),
env,
stdout: parallelism > 1 ? () => {} : chunk => pipeTestStdout(process.stdout, chunk),
stderr: parallelism > 1 ? () => {} : chunk => pipeTestStdout(process.stderr, chunk),
});
const mb = 1024 ** 3;
let stdoutPreview = stdout.slice(0, mb).split("\n").slice(0, 50).join("\n");
if (crashes) stdoutPreview += crashes;
return {
testPath: title,
ok: ok,
status: ok ? "pass" : "fail",
error: error,
errors: [],
tests: [],
stdout: stdout,
stdoutPreview: stdoutPreview,
};
});
} else {
return runTest(title, async () =>
spawnBunTest(execPath, join("test", testPath), {
cwd,
stdout: parallelism > 1 ? () => {} : chunk => pipeTestStdout(process.stdout, chunk),
stderr: parallelism > 1 ? () => {} : chunk => pipeTestStdout(process.stderr, chunk),
}),
);
}
}),
),
);
}
if (vendorTests?.length) {
@@ -516,7 +606,7 @@ async function runTests() {
if (isGithubAction) {
reportOutputToGitHubAction("failing_tests_count", failedResults.length);
const markdown = formatTestToMarkdown(failedResults);
const markdown = formatTestToMarkdown(failedResults, false, 0);
reportOutputToGitHubAction("failing_tests", markdown);
}
@@ -611,19 +701,6 @@ async function runTests() {
if (options["coredump-upload"]) {
try {
// this sysctl is set in bootstrap.sh to /var/bun-cores-$distro-$release-$arch
const sysctl = await spawnSafe({ command: "sysctl", args: ["-n", "kernel.core_pattern"] });
let coresDir = sysctl.stdout;
if (sysctl.ok) {
if (coresDir.startsWith("|")) {
throw new Error("cores are being piped not saved");
}
// change /foo/bar/%e-%p.core to /foo/bar
coresDir = dirname(sysctl.stdout);
} else {
throw new Error(`Failed to check core_pattern: ${sysctl.error}`);
}
const coresDirBase = dirname(coresDir);
const coresDirName = basename(coresDir);
const coreFileNames = readdirSync(coresDir);
@@ -729,6 +806,7 @@ async function runTests() {
* @property {number} timestamp
* @property {number} duration
* @property {string} stdout
* @property {number} [pid]
*/
/**
@@ -779,6 +857,25 @@ async function spawnSafe(options) {
};
await new Promise(resolve => {
try {
function unsafeBashEscape(str) {
if (!str) return "";
if (str.includes(" ")) return JSON.stringify(str);
return str;
}
if (process.env.SHOW_SPAWN_COMMANDS) {
console.log(
"SPAWNING COMMAND:\n" +
[
"echo -n | " +
Object.entries(env)
.map(([key, value]) => `${unsafeBashEscape(key)}=${unsafeBashEscape(value)}`)
.join(" "),
unsafeBashEscape(command),
...args.map(unsafeBashEscape),
].join(" ") +
" | cat",
);
}
subprocess = spawn(command, args, {
stdio: ["ignore", "pipe", "pipe"],
timeout,
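The `SHOW_SPAWN_COMMANDS` logging above reconstructs an approximate shell command line from the env and args. A stand-alone sketch of the escaping rule (deliberately "unsafe": only spaces trigger quoting, not `$`, backticks, or embedded quotes):

```javascript
// Values containing a space are JSON-quoted; everything else passes through.
function unsafeBashEscape(str) {
  if (!str) return "";
  if (str.includes(" ")) return JSON.stringify(str);
  return str;
}

// Hypothetical env/command, showing the rendered line:
const env = { PATH: "/usr/bin", NAME: "two words" };
const line =
  Object.entries(env)
    .map(([key, value]) => `${unsafeBashEscape(key)}=${unsafeBashEscape(value)}`)
    .join(" ") + " bun test";
console.log(line); // PATH=/usr/bin NAME="two words" bun test
```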
@@ -890,6 +987,7 @@ async function spawnSafe(options) {
stdout: buffer,
timestamp: timestamp || Date.now(),
duration: duration || 0,
pid: subprocess?.pid,
};
}
@@ -918,10 +1016,16 @@ function getCombinedPath(execPath) {
return _combinedPath;
}
/**
* @typedef {object} SpawnBunResult
* @extends SpawnResult
* @property {string} [crashes]
*/
/**
* @param {string} execPath Path to bun binary
* @param {SpawnOptions} options
* @returns {Promise<SpawnResult>}
* @returns {Promise<SpawnBunResult>}
*/
async function spawnBun(execPath, { args, cwd, timeout, env, stdout, stderr }) {
const path = getCombinedPath(execPath);
@@ -940,11 +1044,13 @@ async function spawnBun(execPath, { args, cwd, timeout, env, stdout, stderr }) {
BUN_DEBUG_QUIET_LOGS: "1",
BUN_GARBAGE_COLLECTOR_LEVEL: "1",
BUN_JSC_randomIntegrityAuditRate: "1.0",
BUN_ENABLE_CRASH_REPORTING: "0", // change this to '1' if https://github.com/oven-sh/bun/issues/13012 is implemented
BUN_RUNTIME_TRANSPILER_CACHE_PATH: "0",
BUN_INSTALL_CACHE_DIR: tmpdirPath,
SHELLOPTS: isWindows ? "igncr" : undefined, // ignore "\r" on Windows
TEST_TMPDIR: tmpdirPath, // Used in Node.js tests.
...(typeof remapPort == "number"
? { BUN_CRASH_REPORT_URL: `http://localhost:${remapPort}` }
: { BUN_ENABLE_CRASH_REPORTING: "0" }),
};
if (basename(execPath).includes("asan")) {
@@ -968,7 +1074,8 @@ async function spawnBun(execPath, { args, cwd, timeout, env, stdout, stderr }) {
bunEnv["TEMP"] = tmpdirPath;
}
try {
return await spawnSafe({
const existingCores = options["coredump-upload"] ? readdirSync(coresDir) : [];
const result = await spawnSafe({
command: execPath,
args,
cwd,
@@ -977,12 +1084,93 @@ async function spawnBun(execPath, { args, cwd, timeout, env, stdout, stderr }) {
stdout,
stderr,
});
const newCores = options["coredump-upload"] ? readdirSync(coresDir).filter(c => !existingCores.includes(c)) : [];
let crashes = "";
if (options["coredump-upload"] && (result.signalCode !== null || newCores.length > 0)) {
// warn if the main PID crashed and we don't have a core
if (result.signalCode !== null && !newCores.some(c => c.endsWith(`${result.pid}.core`))) {
crashes += `main process killed by ${result.signalCode} but no core file found\n`;
}
if (newCores.length > 0) {
result.ok = false;
if (!isAlwaysFailure(result.error)) result.error = "core dumped";
}
for (const coreName of newCores) {
const corePath = join(coresDir, coreName);
let out = "";
const gdb = await spawnSafe({
command: "gdb",
args: ["-batch", `--eval-command=bt`, "--core", corePath, execPath],
timeout: 240_000,
stderr: () => {},
stdout(text) {
out += text;
},
});
if (!gdb.ok) {
crashes += `failed to get backtrace from GDB: ${gdb.error}\n`;
} else {
crashes += `======== Stack trace from GDB for ${coreName}: ========\n`;
for (const line of out.split("\n")) {
// filter GDB output since it is pretty verbose
if (
line.startsWith("Program terminated") ||
line.startsWith("#") || // gdb backtrace lines start with #0, #1, etc.
line.startsWith("[Current thread is")
) {
crashes += line + "\n";
}
}
}
}
}
// Skip this if the remap server didn't work or if Bun exited normally
// (tests in which a subprocess crashed should at least set exit code 1)
if (typeof remapPort == "number" && result.exitCode !== 0) {
try {
// When Bun crashes, it exits before the subcommand it runs to upload the crash report has necessarily finished.
// So wait a little bit to make sure that the crash report has at least started uploading
// (once the server sees the /ack request then /traces will wait for any crashes to finish processing)
// There is a bug that if a test causes crash reports but exits with code 0, the crash reports will instead
// be attributed to the next test that fails. I'm not sure how to fix this without adding a sleep in between
// all tests (which would slow down CI a lot).
await setTimeoutPromise(500);
const response = await fetch(`http://localhost:${remapPort}/traces`);
if (!response.ok || response.status !== 200) throw new Error(`server responded with code ${response.status}`);
const traces = await response.json();
if (traces.length > 0) {
result.ok = false;
if (!isAlwaysFailure(result.error)) result.error = "crash reported";
crashes += `${traces.length} crashes reported during this test\n`;
for (const t of traces) {
if (t.failed_parse) {
crashes += "Trace string failed to parse:\n";
crashes += t.failed_parse + "\n";
} else if (t.failed_remap) {
crashes += "Parsed trace failed to remap:\n";
crashes += JSON.stringify(t.failed_remap, null, 2) + "\n";
} else {
crashes += "================\n";
crashes += t.remap + "\n";
}
}
}
} catch (e) {
crashes += "failed to fetch traces: " + e.toString() + "\n";
}
}
if (crashes.length > 0) result.crashes = crashes;
return result;
} finally {
// try {
// rmSync(tmpdirPath, { recursive: true, force: true });
// } catch (error) {
// console.warn(error);
// }
try {
rmSync(tmpdirPath, { recursive: true, force: true });
} catch (error) {
console.warn(error);
}
}
}
@@ -1058,19 +1246,20 @@ async function spawnBunTest(execPath, testPath, options = { cwd }) {
const env = {
GITHUB_ACTIONS: "true", // always true so annotations are parsed
};
if (basename(execPath).includes("asan") && shouldValidateExceptions(relative(cwd, absPath))) {
if ((basename(execPath).includes("asan") || !isCI) && shouldValidateExceptions(relative(cwd, absPath))) {
env.BUN_JSC_validateExceptionChecks = "1";
}
const { ok, error, stdout } = await spawnBun(execPath, {
const { ok, error, stdout, crashes } = await spawnBun(execPath, {
args: isReallyTest ? testArgs : [...args, absPath],
cwd: options["cwd"],
timeout: isReallyTest ? timeout : 30_000,
env,
stdout: chunk => pipeTestStdout(process.stdout, chunk),
stderr: chunk => pipeTestStdout(process.stderr, chunk),
stdout: options.stdout,
stderr: options.stderr,
});
const { tests, errors, stdout: stdoutPreview } = parseTestStdout(stdout, testPath);
let { tests, errors, stdout: stdoutPreview } = parseTestStdout(stdout, testPath);
if (crashes) stdoutPreview += crashes;
// If we generated a JUnit file and we're on BuildKite, upload it immediately
if (junitFilePath && isReallyTest && isBuildkite && cliOptions["junit-upload"]) {
@@ -1245,11 +1434,12 @@ function parseTestStdout(stdout, testPath) {
* @returns {Promise<TestResult>}
*/
async function spawnBunInstall(execPath, options) {
const { ok, error, stdout, duration } = await spawnBun(execPath, {
let { ok, error, stdout, duration, crashes } = await spawnBun(execPath, {
args: ["install"],
timeout: testTimeout,
...options,
});
if (crashes) stdout += crashes;
const relativePath = relative(cwd, options.cwd);
const testPath = join(relativePath, "package.json");
const status = ok ? "pass" : "fail";
@@ -1416,20 +1606,30 @@ async function getVendorTests(cwd) {
const vendorPath = join(cwd, "vendor", name);
if (!existsSync(vendorPath)) {
await spawnSafe({
const { ok, error } = await spawnSafe({
command: "git",
args: ["clone", "--depth", "1", "--single-branch", repository, vendorPath],
timeout: testTimeout,
cwd,
});
if (!ok) throw new Error(`failed to git clone vendor '${name}': ${error}`);
}
await spawnSafe({
let { ok, error } = await spawnSafe({
command: "git",
args: ["fetch", "--depth", "1", "origin", "tag", tag],
timeout: testTimeout,
cwd: vendorPath,
});
if (!ok) throw new Error(`failed to fetch tag ${tag} for vendor '${name}': ${error}`);
({ ok, error } = await spawnSafe({
command: "git",
args: ["checkout", tag],
timeout: testTimeout,
cwd: vendorPath,
}));
if (!ok) throw new Error(`failed to checkout tag ${tag} for vendor '${name}': ${error}`);
const packageJsonPath = join(vendorPath, "package.json");
if (!existsSync(packageJsonPath)) {
@@ -1730,9 +1930,10 @@ function getTestLabel() {
/**
* @param {TestResult | TestResult[]} result
* @param {boolean} concise
* @param {number} retries
* @returns {string}
*/
function formatTestToMarkdown(result, concise) {
function formatTestToMarkdown(result, concise, retries) {
const results = Array.isArray(result) ? result : [result];
const buildLabel = getTestLabel();
const buildUrl = getBuildUrl();
@@ -1776,6 +1977,9 @@ function formatTestToMarkdown(result, concise) {
if (platform) {
markdown += ` on ${platform}`;
}
if (retries > 0) {
markdown += ` (${retries} ${retries === 1 ? "retry" : "retries"})`;
}
if (concise) {
markdown += "</li>\n";
@@ -1980,7 +2184,9 @@ function isAlwaysFailure(error) {
error.includes("segmentation fault") ||
error.includes("illegal instruction") ||
error.includes("sigtrap") ||
error.includes("error: addresssanitizer")
error.includes("error: addresssanitizer") ||
error.includes("core dumped") ||
error.includes("crash reported")
);
}

scripts/sortImports.ts → scripts/sort-imports.ts Normal file → Executable file

@@ -1,3 +1,4 @@
#!/usr/bin/env bun
import { readdirSync } from "fs";
import path from "path";
@@ -16,10 +17,9 @@ const usage = String.raw`
Usage: bun scripts/sortImports [options] <files...>
Options:
--help Show this help message
--no-include-pub Exclude pub imports from sorting
--no-remove-unused Don't remove unused imports
--include-unsorted Process files even if they don't have @sortImports marker
--help Show this help message
--include-pub Also sort ${"`pub`"} imports
--keep-unused Don't remove unused imports
Examples:
bun scripts/sortImports src
@@ -34,9 +34,9 @@ if (filePaths.length === 0) {
}
const config = {
includePub: !args.includes("--no-include-pub"),
removeUnused: !args.includes("--no-remove-unused"),
includeUnsorted: args.includes("--include-unsorted"),
includePub: args.includes("--include-pub"),
removeUnused: !args.includes("--keep-unused"),
normalizePaths: "./",
};
// Type definitions
@@ -68,11 +68,11 @@ function parseDeclarations(
const line = lines[i];
if (line === "// @sortImports") {
lines[i] = "";
lines[i] = DELETED_LINE;
continue;
}
const inlineDeclPattern = /^(?:pub )?const ([a-zA-Z0-9_]+) = (.+);$/;
const inlineDeclPattern = /^(?:pub )?const ([a-zA-Z0-9_]+) = (.+);(\s*\/\/[^\n]*)?$/;
const match = line.match(inlineDeclPattern);
if (!match) continue;
@@ -102,6 +102,10 @@ function parseDeclarations(
continue;
}
if (declarations.has(name)) {
unusedLineIndices.push(i);
continue;
}
declarations.set(name, {
whole: line,
index: i,
@@ -275,8 +279,6 @@ function sortGroupsAndDeclarations(groups: Map<string, Group>): string[] {
// Generate the sorted output
function generateSortedOutput(lines: string[], groups: Map<string, Group>, sortedGroupKeys: string[]): string[] {
const outputLines = [...lines];
outputLines.push("");
outputLines.push("// @sortImports");
for (const groupKey of sortedGroupKeys) {
const groupDeclarations = groups.get(groupKey)!;
@@ -288,22 +290,36 @@ function generateSortedOutput(lines: string[], groups: Map<string, Group>, sorte
// Add declarations to output and mark original lines for removal
for (const declaration of groupDeclarations.declarations) {
outputLines.push(declaration.whole);
outputLines[declaration.index] = "";
outputLines[declaration.index] = DELETED_LINE;
}
}
return outputLines;
}
function extractThisDeclaration(declarations: Map<string, Declaration>): Declaration | null {
for (const declaration of declarations.values()) {
if (declaration.value === "@This()") {
declarations.delete(declaration.key);
return declaration;
}
}
return null;
}
const DELETED_LINE = "%DELETED_LINE%";
// Main execution function for a single file
async function processFile(filePath: string): Promise<void> {
const originalFileContents = await Bun.file(filePath).text();
let fileContents = originalFileContents;
if (!config.includeUnsorted && !originalFileContents.includes("// @sortImports")) {
return;
if (config.normalizePaths === "") {
fileContents = fileContents.replaceAll(`@import("./`, `@import("`);
} else if (config.normalizePaths === "./") {
fileContents = fileContents.replaceAll(/@import\("([A-Za-z0-9_-][^"]*\.zig)"\)/g, '@import("./$1")');
fileContents = fileContents.replaceAll(`@import("./../`, `@import("../`);
}
console.log(`Processing: ${filePath}`);
let needsRecurse = true;
while (needsRecurse) {
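The `normalizePaths: "./"` branch above rewrites bare relative `.zig` imports to carry an explicit `./` prefix, then collapses the `./../` artifacts that rewrite can meet. A stand-alone sketch using the same two `replaceAll` passes (the wrapper name `normalizeImports` is illustrative):

```javascript
// Pass 1: "@import(\"foo.zig\")" -> "@import(\"./foo.zig\")".
// The leading [A-Za-z0-9_-] in the capture skips paths already starting
// with "." (i.e. "./x.zig" and "../x.zig" are left alone).
// Pass 2: collapse "@import(\"./../" to "@import(\"../".
function normalizeImports(source) {
  let out = source.replaceAll(/@import\("([A-Za-z0-9_-][^"]*\.zig)"\)/g, '@import("./$1")');
  out = out.replaceAll(`@import("./../`, `@import("../`);
  return out;
}

console.log(normalizeImports('const foo = @import("foo.zig");'));
// const foo = @import("./foo.zig");
```

Note that module imports like `@import("bun")` are untouched because the pattern requires a `.zig` suffix.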
@@ -312,6 +328,7 @@ async function processFile(filePath: string): Promise<void> {
const lines = fileContents.split("\n");
const { declarations, unusedLineIndices } = parseDeclarations(lines, fileContents);
const thisDeclaration = extractThisDeclaration(declarations);
const groups = groupDeclarationsByImportPath(declarations);
promoteItemsWithChildGroups(groups);
@@ -323,13 +340,46 @@ async function processFile(filePath: string): Promise<void> {
// Remove unused declarations
if (config.removeUnused) {
for (const line of unusedLineIndices) {
sortedLines[line] = "";
sortedLines[line] = DELETED_LINE;
needsRecurse = true;
}
}
if (thisDeclaration) {
var onlyCommentsBeforeThis = true;
for (const [i, line] of sortedLines.entries()) {
if (i >= thisDeclaration.index) {
break;
}
if (line === "" || line === DELETED_LINE) {
continue;
}
if (!line.startsWith("//")) {
onlyCommentsBeforeThis = false;
break;
}
}
if (!onlyCommentsBeforeThis) {
sortedLines[thisDeclaration.index] = DELETED_LINE;
let firstNonFileCommentLine = 0;
for (const line of sortedLines) {
if (line.startsWith("//!")) {
firstNonFileCommentLine++;
} else {
break;
}
}
const insert = [thisDeclaration.whole, ""];
if (firstNonFileCommentLine > 0) insert.unshift("");
sortedLines.splice(firstNonFileCommentLine, 0, ...insert);
}
}
fileContents = sortedLines.join("\n");
}
// Remove deleted lines
fileContents = fileContents.replaceAll(DELETED_LINE + "\n", "");
// fileContents = fileContents.replaceAll(DELETED_LINE, ""); // any remaining lines
// Remove any leading newlines
fileContents = fileContents.replace(/^\n+/, "");
@@ -343,7 +393,6 @@ async function processFile(filePath: string): Promise<void> {
if (fileContents === "\n") fileContents = "";
if (fileContents === originalFileContents) {
console.log(`✓ No changes: ${filePath}`);
return;
}
@@ -369,7 +418,7 @@ async function main() {
successCount++;
} catch (error) {
errorCount++;
console.error(`Failed to process ${filePath}`);
console.error(`Failed to process ${path.join(filePath, file)}:\n`, error);
}
}
continue;
@@ -380,7 +429,7 @@ async function main() {
successCount++;
} catch (error) {
errorCount++;
console.error(`Failed to process ${filePath}`);
console.error(`Failed to process ${filePath}:\n`, error);
}
}


@@ -2702,7 +2702,14 @@ export function reportAnnotationToBuildKite({ context, label, content, style = "
source: "buildkite",
level: "error",
});
reportAnnotationToBuildKite({ label: `${label}-error`, content: errorContent, attempt: attempt + 1 });
reportAnnotationToBuildKite({
context,
label: `${label}-error`,
content: errorContent,
style,
priority,
attempt: attempt + 1,
});
}
/**
@@ -2850,6 +2857,14 @@ export function printEnvironment() {
}
});
}
if (isLinux) {
startGroup("Memory", () => {
const shell = which(["sh", "bash"]);
if (shell) {
spawnSync([shell, "-c", "free -m -w"], { stdio: "inherit" });
}
});
}
if (isWindows) {
startGroup("Disk (win)", () => {
const shell = which(["pwsh"]);
@@ -2857,6 +2872,14 @@ export function printEnvironment() {
spawnSync([shell, "-c", "get-psdrive"], { stdio: "inherit" });
}
});
startGroup("Memory", () => {
const shell = which(["pwsh"]);
if (shell) {
spawnSync([shell, "-c", "Get-Counter '\\Memory\\Available MBytes'"], { stdio: "inherit" });
console.log();
spawnSync([shell, "-c", "Get-CimInstance Win32_PhysicalMemory"], { stdio: "inherit" });
}
});
}
}

scripts/yocto-queue.mjs Normal file

@@ -0,0 +1,90 @@
/**
* yocto-queue@1.2.1
* https://github.com/sindresorhus/yocto-queue
* MIT (c) Sindre Sorhus
*/
/*
How it works:
`this.#head` is an instance of `Node` which keeps track of its current value and nests another instance of `Node` that keeps the value that comes after it. When a value is provided to `.enqueue()`, the code needs to iterate through `this.#head`, going deeper and deeper to find the last value. However, iterating through every single item is slow. This problem is solved by saving a reference to the last value as `this.#tail` so that it can reference it to add a new value.
*/
class Node {
value;
next;
constructor(value) {
this.value = value;
}
}
export default class Queue {
#head;
#tail;
#size;
constructor() {
this.clear();
}
enqueue(value) {
const node = new Node(value);
if (this.#head) {
this.#tail.next = node;
this.#tail = node;
} else {
this.#head = node;
this.#tail = node;
}
this.#size++;
}
dequeue() {
const current = this.#head;
if (!current) {
return;
}
this.#head = this.#head.next;
this.#size--;
return current.value;
}
peek() {
if (!this.#head) {
return;
}
return this.#head.value;
// TODO: Node.js 18.
// return this.#head?.value;
}
clear() {
this.#head = undefined;
this.#tail = undefined;
this.#size = 0;
}
get size() {
return this.#size;
}
*[Symbol.iterator]() {
let current = this.#head;
while (current) {
yield current.value;
current = current.next;
}
}
*drain() {
while (this.#head) {
yield this.dequeue();
}
}
}
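The "How it works" note in the vendored queue describes O(1) FIFO operations via head/tail pointers. A trimmed stand-alone sketch of that core (a hypothetical `MiniQueue`, so the snippet runs without importing `scripts/yocto-queue.mjs`; the vendored class additionally offers `peek`, `clear`, iteration, and `drain`):

```javascript
// FIFO queue with O(1) enqueue/dequeue: #tail lets enqueue append without
// walking the list, #head lets dequeue pop the oldest value directly.
class MiniQueue {
  #head; #tail; #size = 0;
  enqueue(value) {
    const node = { value, next: undefined };
    if (this.#head) this.#tail.next = node;
    else this.#head = node;
    this.#tail = node;
    this.#size++;
  }
  dequeue() {
    const current = this.#head;
    if (!current) return undefined;
    this.#head = current.next;
    this.#size--;
    return current.value;
  }
  get size() { return this.#size; }
}

const q = new MiniQueue();
q.enqueue("a"); q.enqueue("b"); q.enqueue("c");
console.log(q.dequeue(), q.dequeue(), q.size); // a b 1
```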


@@ -1,155 +0,0 @@
import * as fs from "fs";
import * as path from "path";
/**
* Removes unreferenced top-level const declarations from a Zig file
* Handles patterns like: const <IDENTIFIER> = @import(...) or const <IDENTIFIER> = ...
*/
export function removeUnreferencedImports(content: string): string {
let modified = true;
let result = content;
// Keep iterating until no more changes are made
while (modified) {
modified = false;
const lines = result.split("\n");
const newLines: string[] = [];
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
// Match top-level const declarations: const <IDENTIFIER> = ...
const constMatch = line.match(/^const\s+([a-zA-Z_][a-zA-Z0-9_]*)\s*=(.*)$/);
if (constMatch) {
const identifier = constMatch[1];
const assignmentPart = constMatch[2];
// Skip lines that contain '{' in the assignment (likely structs/objects)
if (assignmentPart.includes("{")) {
newLines.push(line);
continue;
}
// Check if this identifier is referenced anywhere else in the file
const isReferenced = isIdentifierReferenced(identifier, lines, i);
if (!isReferenced) {
// Skip this line (delete it)
modified = true;
console.log(`Removing unreferenced import: ${identifier}`);
continue;
}
}
newLines.push(line);
}
result = newLines.join("\n");
}
return result;
}
/**
* Check if an identifier is referenced anywhere in the file except at the declaration line
*/
function isIdentifierReferenced(identifier: string, lines: string[], declarationLineIndex: number): boolean {
// Create a regex that matches the identifier as a whole word
// This prevents matching partial words (e.g. "std" shouldn't match "stdx")
const identifierRegex = new RegExp(`\\b${escapeRegex(identifier)}\\b`);
for (let i = 0; i < lines.length; i++) {
// Skip the declaration line itself
if (i === declarationLineIndex) {
continue;
}
const line = lines[i];
// Check if the identifier appears in this line
if (identifierRegex.test(line)) {
return true;
}
}
return false;
}
/**
* Escape special regex characters in a string
*/
function escapeRegex(string: string): string {
return string.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}
/**
* Process a single Zig file
*/
export function processZigFile(filePath: string): void {
try {
const content = fs.readFileSync(filePath, "utf-8");
const cleaned = removeUnreferencedImports(content);
if (content !== cleaned) {
fs.writeFileSync(filePath, cleaned);
console.log(`Cleaned: ${filePath}`);
} else {
console.log(`No changes: ${filePath}`);
}
} catch (error) {
console.error(`Error processing ${filePath}:`, error);
}
}
/**
* Process multiple Zig files or directories
*/
export function processFiles(paths: string[]): void {
for (const inputPath of paths) {
const stat = fs.statSync(inputPath);
if (stat.isDirectory()) {
// Process all .zig files in directory recursively
processDirectory(inputPath);
} else if (inputPath.endsWith(".zig")) {
processZigFile(inputPath);
} else {
console.warn(`Skipping non-Zig file: ${inputPath}`);
}
}
}
/**
* Recursively process all .zig files in a directory
*/
function processDirectory(dirPath: string): void {
const entries = fs.readdirSync(dirPath, { withFileTypes: true });
for (const entry of entries) {
const fullPath = path.join(dirPath, entry.name);
if (entry.isDirectory()) {
processDirectory(fullPath);
} else if (entry.name.endsWith(".zig")) {
processZigFile(fullPath);
}
}
}
// CLI usage
if (require.main === module) {
const args = process.argv.slice(2);
if (args.length === 0) {
console.log("Usage: bun zig-remove-unreferenced-top-level-decls.ts <file1.zig> [file2.zig] [directory]...");
console.log("");
console.log("Examples:");
console.log(" bun zig-remove-unreferenced-top-level-decls.ts file.zig");
console.log(" bun zig-remove-unreferenced-top-level-decls.ts src/");
console.log(" bun zig-remove-unreferenced-top-level-decls.ts file1.zig file2.zig src/");
process.exit(1);
}
processFiles(args);
}


@@ -1,12 +1,4 @@
const std = @import("std");
const Environment = @import("./env.zig");
const Output = @import("output.zig");
const use_mimalloc = bun.use_mimalloc;
const Mimalloc = bun.Mimalloc;
const bun = @import("bun");
const version_string = Environment.version_string;
const Global = @This();
/// Does not have the canary tag, because it is exposed in `Bun.version`
/// "1.0.0" or "1.0.0-debug"
@@ -112,6 +104,7 @@ pub fn isExiting() bool {
/// Flushes stdout and stderr (in exit/quick_exit callback) and exits with the given code.
pub fn exit(code: u32) noreturn {
is_exiting.store(true, .monotonic);
_ = @atomicRmw(usize, &bun.analytics.Features.exited, .Add, 1, .monotonic);
// If we are crashing, allow the crash handler to finish its work.
bun.crash_handler.sleepForeverIfAnotherThreadIsCrashing();
@@ -174,10 +167,10 @@ pub const versions = @import("./generated_versions_list.zig");
// 2. if I want to configure allocator later
pub inline fn configureAllocator(_: AllocatorConfiguration) void {
// if (comptime !use_mimalloc) return;
// const Mimalloc = @import("./allocators/mimalloc.zig");
// Mimalloc.mi_option_set_enabled(Mimalloc.mi_option_verbose, config.verbose);
// Mimalloc.mi_option_set_enabled(Mimalloc.mi_option_large_os_pages, config.long_running);
// if (!config.long_running) Mimalloc.mi_option_set(Mimalloc.mi_option_reset_delay, 0);
// const mimalloc = bun.mimalloc;
// mimalloc.mi_option_set_enabled(mimalloc.mi_option_verbose, config.verbose);
// mimalloc.mi_option_set_enabled(mimalloc.mi_option_large_os_pages, config.long_running);
// if (!config.long_running) mimalloc.mi_option_set(mimalloc.mi_option_reset_delay, 0);
}
pub fn notimpl() noreturn {
@@ -191,20 +184,17 @@ pub fn crash() noreturn {
Global.exit(1);
}
const Global = @This();
const string = bun.string;
pub const BunInfo = struct {
bun_version: string,
platform: Analytics.GenerateHeader.GeneratePlatform.Platform,
platform: analytics.GenerateHeader.GeneratePlatform.Platform,
const Analytics = @import("./analytics/analytics_thread.zig");
const JSON = bun.JSON;
const JSAst = bun.JSAst;
const analytics = bun.analytics;
const JSON = bun.json;
const JSAst = bun.ast;
pub fn generate(comptime Bundler: type, _: Bundler, allocator: std.mem.Allocator) !JSAst.Expr {
const info = BunInfo{
.bun_version = Global.package_json_version,
.platform = Analytics.GenerateHeader.GeneratePlatform.forOS(),
.platform = analytics.GenerateHeader.GeneratePlatform.forOS(),
};
return try JSON.toAST(allocator, BunInfo, info);
@@ -219,7 +209,7 @@ comptime {
}
pub export fn Bun__onExit() void {
bun.JSC.Node.FSEvents.closeAndWait();
bun.jsc.Node.FSEvents.closeAndWait();
runExitCallbacks();
Output.flush();
@@ -231,3 +221,15 @@ pub export fn Bun__onExit() void {
comptime {
_ = Bun__onExit;
}
const string = []const u8;
const Output = @import("./output.zig");
const std = @import("std");
const Environment = @import("./env.zig");
const version_string = Environment.version_string;
const bun = @import("bun");
const Mimalloc = bun.mimalloc;
const use_mimalloc = bun.use_mimalloc;


@@ -1,10 +1,4 @@
const std = @import("std");
const bun = @import("bun");
const ImportRecord = @import("./import_record.zig").ImportRecord;
const ImportKind = @import("./import_record.zig").ImportKind;
const lol = @import("./deps/lol-html.zig");
const logger = bun.logger;
const fs = bun.fs;
const HTMLScanner = @This();
allocator: std.mem.Allocator,
import_records: ImportRecord.List = .{},
@@ -303,4 +297,12 @@ pub fn HTMLProcessor(
};
}
const HTMLScanner = @This();
const lol = @import("./deps/lol-html.zig");
const std = @import("std");
const ImportKind = @import("./import_record.zig").ImportKind;
const ImportRecord = @import("./import_record.zig").ImportRecord;
const bun = @import("bun");
const fs = bun.fs;
const logger = bun.logger;


@@ -1,3 +1,5 @@
const OutputFile = @This();
// Instead of keeping files in-memory, we:
// 1. Write directly to disk
// 2. (Optional) move the file to the destination
@@ -13,15 +15,31 @@ hash: u64 = 0,
is_executable: bool = false,
source_map_index: u32 = std.math.maxInt(u32),
bytecode_index: u32 = std.math.maxInt(u32),
output_kind: JSC.API.BuildArtifact.OutputKind,
output_kind: jsc.API.BuildArtifact.OutputKind,
/// Relative
dest_path: []const u8 = "",
side: ?bun.bake.Side,
/// This is only set for the JS bundle, and not files associated with an
/// entrypoint like sourcemaps and bytecode
entry_point_index: ?u32,
referenced_css_files: []const Index = &.{},
referenced_css_chunks: []const Index = &.{},
source_index: Index.Optional = .none,
bake_extra: BakeExtra = .{},
pub const zero_value = OutputFile{
.loader = .file,
.src_path = Fs.Path.init(""),
.value = .noop,
.output_kind = .chunk,
.side = null,
.entry_point_index = null,
};
pub const BakeExtra = struct {
is_route: bool = false,
fully_static: bool = false,
bake_is_runtime: bool = false,
};
pub const Index = bun.GenericIndex(u32, OutputFile);
@@ -30,7 +48,7 @@ pub fn deinit(this: *OutputFile) void {
bun.default_allocator.free(this.src_path.text);
bun.default_allocator.free(this.dest_path);
bun.default_allocator.free(this.referenced_css_files);
bun.default_allocator.free(this.referenced_css_chunks);
}
// Depending on:
@@ -99,6 +117,13 @@ pub const Value = union(Kind) {
}
}
pub fn asSlice(v: Value) []const u8 {
return switch (v) {
.buffer => |buf| buf.bytes,
else => "",
};
}
pub fn toBunString(v: Value) bun.String {
return switch (v) {
.noop => bun.String.empty,
@@ -129,14 +154,14 @@ pub const Value = union(Kind) {
pub const SavedFile = struct {
pub fn toJS(
globalThis: *JSC.JSGlobalObject,
globalThis: *jsc.JSGlobalObject,
path: []const u8,
byte_size: usize,
) JSC.JSValue {
) jsc.JSValue {
const mime_type = globalThis.bunVM().mimeType(path);
const store = JSC.WebCore.Blob.Store.initFile(
JSC.Node.PathOrFileDescriptor{
.path = JSC.Node.PathLike{
const store = jsc.WebCore.Blob.Store.initFile(
jsc.Node.PathOrFileDescriptor{
.path = jsc.Node.PathLike{
.string = bun.PathString.init(path),
},
},
@@ -144,12 +169,12 @@ pub const SavedFile = struct {
bun.default_allocator,
) catch unreachable;
var blob = bun.default_allocator.create(JSC.WebCore.Blob) catch unreachable;
blob.* = JSC.WebCore.Blob.initWithStore(store, globalThis);
var blob = bun.default_allocator.create(jsc.WebCore.Blob) catch unreachable;
blob.* = jsc.WebCore.Blob.initWithStore(store, globalThis);
if (mime_type) |mime| {
blob.content_type = mime.value;
}
blob.size = @as(JSC.WebCore.Blob.SizeType, @truncate(byte_size));
blob.size = @as(jsc.WebCore.Blob.SizeType, @truncate(byte_size));
blob.allocator = bun.default_allocator;
return blob.toJS(globalThis);
}
@@ -190,7 +215,7 @@ pub const Options = struct {
size: ?usize = null,
input_path: []const u8 = "",
display_size: u32 = 0,
output_kind: JSC.API.BuildArtifact.OutputKind,
output_kind: jsc.API.BuildArtifact.OutputKind,
is_executable: bool,
data: union(enum) {
buffer: struct {
@@ -206,7 +231,8 @@ pub const Options = struct {
},
side: ?bun.bake.Side,
entry_point_index: ?u32,
referenced_css_files: []const Index = &.{},
referenced_css_chunks: []const Index = &.{},
bake_extra: BakeExtra = .{},
};
pub fn init(options: Options) OutputFile {
@@ -240,7 +266,8 @@ pub fn init(options: Options) OutputFile {
},
.side = options.side,
.entry_point_index = options.entry_point_index,
.referenced_css_files = options.referenced_css_files,
.referenced_css_chunks = options.referenced_css_chunks,
.bake_extra = options.bake_extra,
};
}
@@ -262,7 +289,7 @@ pub fn writeToDisk(f: OutputFile, root_dir: std.fs.Dir, root_dir_path: []const u
}
var path_buf: bun.PathBuffer = undefined;
_ = try JSC.Node.fs.NodeFS.writeFileWithPathBuffer(&path_buf, .{
_ = try jsc.Node.fs.NodeFS.writeFileWithPathBuffer(&path_buf, .{
.data = .{ .buffer = .{
.buffer = .{
.ptr = @constCast(value.bytes.ptr),
@@ -317,20 +344,20 @@ pub fn copyTo(file: *const OutputFile, _: string, rel_path: []const u8, dir: Fil
pub fn toJS(
this: *OutputFile,
owned_pathname: ?[]const u8,
globalObject: *JSC.JSGlobalObject,
) bun.JSC.JSValue {
globalObject: *jsc.JSGlobalObject,
) bun.jsc.JSValue {
return switch (this.value) {
.move, .pending => @panic("Unexpected pending output file"),
.noop => .js_undefined,
.copy => |copy| brk: {
const file_blob = JSC.WebCore.Blob.Store.initFile(
const file_blob = jsc.WebCore.Blob.Store.initFile(
if (copy.fd.isValid())
JSC.Node.PathOrFileDescriptor{
jsc.Node.PathOrFileDescriptor{
.fd = copy.fd,
}
else
JSC.Node.PathOrFileDescriptor{
.path = JSC.Node.PathLike{ .string = bun.PathString.init(globalObject.allocator().dupe(u8, copy.pathname) catch unreachable) },
jsc.Node.PathOrFileDescriptor{
.path = jsc.Node.PathLike{ .string = bun.PathString.init(globalObject.allocator().dupe(u8, copy.pathname) catch unreachable) },
},
this.loader.toMimeType(&.{owned_pathname orelse ""}),
globalObject.allocator(),
@@ -338,8 +365,8 @@ pub fn toJS(
Output.panic("error: Unable to create file blob: \"{s}\"", .{@errorName(err)});
};
var build_output = bun.new(JSC.API.BuildArtifact, .{
.blob = JSC.WebCore.Blob.initWithStore(file_blob, globalObject),
var build_output = bun.new(jsc.API.BuildArtifact, .{
.blob = jsc.WebCore.Blob.initWithStore(file_blob, globalObject),
.hash = this.hash,
.loader = this.input_loader,
.output_kind = this.output_kind,
@@ -356,12 +383,12 @@ pub fn toJS(
break :brk build_output.toJS(globalObject);
},
.saved => brk: {
var build_output = bun.default_allocator.create(JSC.API.BuildArtifact) catch @panic("Unable to allocate Artifact");
var build_output = bun.default_allocator.create(jsc.API.BuildArtifact) catch @panic("Unable to allocate Artifact");
const path_to_use = owned_pathname orelse this.src_path.text;
const file_blob = JSC.WebCore.Blob.Store.initFile(
JSC.Node.PathOrFileDescriptor{
.path = JSC.Node.PathLike{ .string = bun.PathString.init(owned_pathname orelse (bun.default_allocator.dupe(u8, this.src_path.text) catch unreachable)) },
const file_blob = jsc.WebCore.Blob.Store.initFile(
jsc.Node.PathOrFileDescriptor{
.path = jsc.Node.PathLike{ .string = bun.PathString.init(owned_pathname orelse (bun.default_allocator.dupe(u8, this.src_path.text) catch unreachable)) },
},
this.loader.toMimeType(&.{owned_pathname orelse ""}),
globalObject.allocator(),
@@ -376,8 +403,8 @@ pub fn toJS(
},
};
build_output.* = JSC.API.BuildArtifact{
.blob = JSC.WebCore.Blob.initWithStore(file_blob, globalObject),
build_output.* = jsc.API.BuildArtifact{
.blob = jsc.WebCore.Blob.initWithStore(file_blob, globalObject),
.hash = this.hash,
.loader = this.input_loader,
.output_kind = this.output_kind,
@@ -387,7 +414,7 @@ pub fn toJS(
break :brk build_output.toJS(globalObject);
},
.buffer => |buffer| brk: {
var blob = JSC.WebCore.Blob.init(@constCast(buffer.bytes), buffer.allocator, globalObject);
var blob = jsc.WebCore.Blob.init(@constCast(buffer.bytes), buffer.allocator, globalObject);
if (blob.store) |store| {
store.mime_type = this.loader.toMimeType(&.{owned_pathname orelse ""});
blob.content_type = store.mime_type.value;
@@ -395,10 +422,10 @@ pub fn toJS(
blob.content_type = this.loader.toMimeType(&.{owned_pathname orelse ""}).value;
}
blob.size = @as(JSC.WebCore.Blob.SizeType, @truncate(buffer.bytes.len));
blob.size = @as(jsc.WebCore.Blob.SizeType, @truncate(buffer.bytes.len));
var build_output = bun.default_allocator.create(JSC.API.BuildArtifact) catch @panic("Unable to allocate Artifact");
build_output.* = JSC.API.BuildArtifact{
var build_output = bun.default_allocator.create(jsc.API.BuildArtifact) catch @panic("Unable to allocate Artifact");
build_output.* = jsc.API.BuildArtifact{
.blob = blob,
.hash = this.hash,
.loader = this.input_loader,
@@ -421,20 +448,20 @@ pub fn toJS(
pub fn toBlob(
this: *OutputFile,
allocator: std.mem.Allocator,
globalThis: *JSC.JSGlobalObject,
) !JSC.WebCore.Blob {
globalThis: *jsc.JSGlobalObject,
) !jsc.WebCore.Blob {
return switch (this.value) {
.move, .pending => @panic("Unexpected pending output file"),
.noop => @panic("Cannot convert noop output file to blob"),
.copy => |copy| brk: {
const file_blob = try JSC.WebCore.Blob.Store.initFile(
const file_blob = try jsc.WebCore.Blob.Store.initFile(
if (copy.fd.isValid())
JSC.Node.PathOrFileDescriptor{
jsc.Node.PathOrFileDescriptor{
.fd = copy.fd,
}
else
JSC.Node.PathOrFileDescriptor{
.path = JSC.Node.PathLike{ .string = bun.PathString.init(allocator.dupe(u8, copy.pathname) catch unreachable) },
jsc.Node.PathOrFileDescriptor{
.path = jsc.Node.PathLike{ .string = bun.PathString.init(allocator.dupe(u8, copy.pathname) catch unreachable) },
},
this.loader.toMimeType(&.{ this.dest_path, this.src_path.text }),
allocator,
@@ -447,12 +474,12 @@ pub fn toBlob(
},
};
break :brk JSC.WebCore.Blob.initWithStore(file_blob, globalThis);
break :brk jsc.WebCore.Blob.initWithStore(file_blob, globalThis);
},
.saved => brk: {
const file_blob = try JSC.WebCore.Blob.Store.initFile(
JSC.Node.PathOrFileDescriptor{
.path = JSC.Node.PathLike{ .string = bun.PathString.init(allocator.dupe(u8, this.src_path.text) catch unreachable) },
const file_blob = try jsc.WebCore.Blob.Store.initFile(
jsc.Node.PathOrFileDescriptor{
.path = jsc.Node.PathLike{ .string = bun.PathString.init(allocator.dupe(u8, this.src_path.text) catch unreachable) },
},
this.loader.toMimeType(&.{ this.dest_path, this.src_path.text }),
allocator,
@@ -465,10 +492,10 @@ pub fn toBlob(
},
};
break :brk JSC.WebCore.Blob.initWithStore(file_blob, globalThis);
break :brk jsc.WebCore.Blob.initWithStore(file_blob, globalThis);
},
.buffer => |buffer| brk: {
var blob = JSC.WebCore.Blob.init(@constCast(buffer.bytes), buffer.allocator, globalThis);
var blob = jsc.WebCore.Blob.init(@constCast(buffer.bytes), buffer.allocator, globalThis);
if (blob.store) |store| {
store.mime_type = this.loader.toMimeType(&.{ this.dest_path, this.src_path.text });
blob.content_type = store.mime_type.value;
@@ -483,22 +510,22 @@ pub fn toBlob(
},
};
blob.size = @as(JSC.WebCore.Blob.SizeType, @truncate(buffer.bytes.len));
blob.size = @as(jsc.WebCore.Blob.SizeType, @truncate(buffer.bytes.len));
break :brk blob;
},
};
}
const OutputFile = @This();
const string = []const u8;
const FileDescriptorType = bun.FileDescriptor;
const std = @import("std");
const bun = @import("bun");
const JSC = bun.JSC;
const Fs = bun.fs;
const Loader = @import("./options.zig").Loader;
const resolver = @import("./resolver/resolver.zig");
const resolve_path = @import("./resolver/resolve_path.zig");
const Output = @import("./Global.zig").Output;
const resolver = @import("./resolver/resolver.zig");
const std = @import("std");
const Loader = @import("./options.zig").Loader;
const bun = @import("bun");
const Environment = bun.Environment;
const FileDescriptorType = bun.FileDescriptor;
const Fs = bun.fs;
const jsc = bun.jsc;
const Output = bun.Global.Output;


@@ -14,12 +14,7 @@
//! * `refresh_rate_ms`
//! * `initial_delay_ms`
const std = @import("std");
const builtin = @import("builtin");
const windows = std.os.windows;
const assert = bun.assert;
const Progress = @This();
const bun = @import("bun");
/// `null` if the current node (and its children) should
/// not print on update()
@@ -453,3 +448,10 @@ test "basic functionality" {
node.end();
}
}
const builtin = @import("builtin");
const std = @import("std");
const windows = std.os.windows;
const bun = @import("bun");
const assert = bun.assert;


@@ -1,19 +1,6 @@
//! Originally, we tried using LIEF to inject the module graph into a MachO segment
//! But this incurred a fixed 350ms overhead on every build, which is unacceptable
//! so we give up on codesigning support on macOS for now until we can find a better solution
const bun = @import("bun");
const std = @import("std");
const Schema = bun.Schema.Api;
const strings = bun.strings;
const Output = bun.Output;
const Global = bun.Global;
const Environment = bun.Environment;
const Syscall = bun.sys;
const SourceMap = bun.sourcemap;
const StringPointer = bun.StringPointer;
const macho = bun.macho;
const w = std.os.windows;
pub const StandaloneModuleGraph = struct {
bytes: []const u8 = "",
@@ -137,6 +124,19 @@ pub const StandaloneModuleGraph = struct {
}
};
const PE = struct {
pub extern "C" fn Bun__getStandaloneModuleGraphPELength() u32;
pub extern "C" fn Bun__getStandaloneModuleGraphPEData() ?[*]u8;
pub fn getData() ?[]const u8 {
const length = Bun__getStandaloneModuleGraphPELength();
if (length == 0) return null;
const data_ptr = Bun__getStandaloneModuleGraphPEData() orelse return null;
return data_ptr[0..length];
}
};
pub const File = struct {
name: []const u8 = "",
loader: bun.options.Loader,
@@ -182,7 +182,7 @@ pub const StandaloneModuleGraph = struct {
return this.wtf_string.dupeRef();
}
pub fn blob(this: *File, globalObject: *bun.JSC.JSGlobalObject) *bun.webcore.Blob {
pub fn blob(this: *File, globalObject: *bun.jsc.JSGlobalObject) *bun.webcore.Blob {
if (this.cached_blob == null) {
const store = bun.webcore.Blob.Store.init(@constCast(this.contents), bun.default_allocator);
// make it never free
@@ -682,6 +682,48 @@ pub const StandaloneModuleGraph = struct {
}
return cloned_executable_fd;
},
.windows => {
const input_result = bun.sys.File.readToEnd(.{ .handle = cloned_executable_fd }, bun.default_allocator);
if (input_result.err) |err| {
Output.prettyErrorln("Error reading standalone module graph: {}", .{err});
cleanup(zname, cloned_executable_fd);
Global.exit(1);
}
var pe_file = bun.pe.PEFile.init(bun.default_allocator, input_result.bytes.items) catch |err| {
Output.prettyErrorln("Error initializing PE file: {}", .{err});
cleanup(zname, cloned_executable_fd);
Global.exit(1);
};
defer pe_file.deinit();
pe_file.addBunSection(bytes) catch |err| {
Output.prettyErrorln("Error adding Bun section to PE file: {}", .{err});
cleanup(zname, cloned_executable_fd);
Global.exit(1);
};
input_result.bytes.deinit();
switch (Syscall.setFileOffset(cloned_executable_fd, 0)) {
.err => |err| {
Output.prettyErrorln("Error seeking to start of temporary file: {}", .{err});
cleanup(zname, cloned_executable_fd);
Global.exit(1);
},
else => {},
}
var file = bun.sys.File{ .handle = cloned_executable_fd };
const writer = file.writer();
pe_file.write(writer) catch |err| {
Output.prettyErrorln("Error writing PE file: {}", .{err});
cleanup(zname, cloned_executable_fd);
Global.exit(1);
};
// Set executable permissions when running on POSIX hosts, even for Windows targets
if (comptime !Environment.isWindows) {
_ = bun.c.fchmod(cloned_executable_fd.native(), 0o777);
}
return cloned_executable_fd;
},
else => {
var total_byte_count: usize = undefined;
if (Environment.isWindows) {
@@ -888,6 +930,22 @@ pub const StandaloneModuleGraph = struct {
return try StandaloneModuleGraph.fromBytes(allocator, @constCast(macho_bytes), offsets);
}
if (comptime Environment.isWindows) {
const pe_bytes = PE.getData() orelse return null;
if (pe_bytes.len < @sizeOf(Offsets) + trailer.len) {
Output.debugWarn("bun standalone module graph is too small to be valid", .{});
return null;
}
const pe_bytes_slice = pe_bytes[pe_bytes.len - @sizeOf(Offsets) - trailer.len ..];
const trailer_bytes = pe_bytes[pe_bytes.len - trailer.len ..][0..trailer.len];
if (!bun.strings.eqlComptime(trailer_bytes, trailer)) {
Output.debugWarn("bun standalone module graph has invalid trailer", .{});
return null;
}
const offsets = std.mem.bytesAsValue(Offsets, pe_bytes_slice).*;
return try StandaloneModuleGraph.fromBytes(allocator, @constCast(pe_bytes), offsets);
}
// Do not invoke libuv here.
const self_exe = openSelf() catch return null;
defer self_exe.close();
@@ -1168,13 +1226,13 @@ pub const StandaloneModuleGraph = struct {
// the allocator given to the JS parser is not respected for all parts
// of the parse, so we need to remember to reset the ast store
bun.JSAst.Expr.Data.Store.reset();
bun.JSAst.Stmt.Data.Store.reset();
bun.ast.Expr.Data.Store.reset();
bun.ast.Stmt.Data.Store.reset();
defer {
bun.JSAst.Expr.Data.Store.reset();
bun.JSAst.Stmt.Data.Store.reset();
bun.ast.Expr.Data.Store.reset();
bun.ast.Stmt.Data.Store.reset();
}
var json = bun.JSON.parse(&json_src, &log, arena, false) catch
var json = bun.json.parse(&json_src, &log, arena, false) catch
return error.InvalidSourceMap;
const mappings_str = json.get("mappings") orelse
@@ -1252,3 +1310,18 @@ pub const StandaloneModuleGraph = struct {
bun.assert(header_list.items.len == string_payload_start_location);
}
};
const std = @import("std");
const w = std.os.windows;
const bun = @import("bun");
const Environment = bun.Environment;
const Global = bun.Global;
const Output = bun.Output;
const SourceMap = bun.sourcemap;
const StringPointer = bun.StringPointer;
const Syscall = bun.sys;
const macho = bun.macho;
const pe = bun.pe;
const strings = bun.strings;
const Schema = bun.schema.api;
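The Windows branch added above mirrors the suffix layout used by the other platforms in this file: the embedded data ends with a fixed trailer string, immediately preceded by an `Offsets` struct, and the code rejects the blob if it is too small or the trailer does not match. As a hedged, language-agnostic illustration of that validation, here is a small TypeScript sketch; the trailer string and offsets size below are made-up placeholders, not Bun's real values.

```typescript
// Hypothetical sketch of the suffix validation in the hunk above:
// data = [payload][offsets][trailer]. Reject if the data is too small
// or the trailer is wrong; otherwise return the raw offsets bytes.
const TRAILER = "---- Bun! ----"; // placeholder; the real trailer differs
const OFFSETS_SIZE = 8;           // placeholder for @sizeOf(Offsets)

function extractOffsets(data: Uint8Array): Uint8Array | null {
  if (data.length < OFFSETS_SIZE + TRAILER.length) return null; // too small to be valid
  const trailerBytes = data.subarray(data.length - TRAILER.length);
  if (new TextDecoder().decode(trailerBytes) !== TRAILER) return null; // invalid trailer
  const start = data.length - TRAILER.length - OFFSETS_SIZE;
  return data.subarray(start, start + OFFSETS_SIZE);
}
```

Reading the offsets from the *end* of the section means the payload in front can grow freely without invalidating the lookup, which is why both the Mach-O and PE paths share this shape.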


@@ -1,13 +1,5 @@
// https://github.com/lithdew/rheia/blob/162293d0f0e8d6572a8954c0add83f13f76b3cc6/hash_map.zig
// Apache License 2.0
const std = @import("std");
const mem = std.mem;
const math = std.math;
const testing = std.testing;
const bun = @import("bun");
const assert = bun.assert;
pub fn AutoHashMap(comptime K: type, comptime V: type, comptime max_load_percentage: comptime_int) type {
return HashMap(K, V, std.hash_map.AutoContext(K), max_load_percentage);
@@ -785,3 +777,11 @@ test "SortedHashMap: collision test" {
try testing.expectEqual(@as(usize, 1), map.delete(prefix ++ [_]u8{1}).?);
try testing.expectEqual(@as(usize, 2), map.delete(prefix ++ [_]u8{2}).?);
}
const bun = @import("bun");
const assert = bun.assert;
const std = @import("std");
const math = std.math;
const mem = std.mem;
const testing = std.testing;


@@ -1,5 +1,7 @@
//! Bun's cross-platform filesystem watcher. Runs on its own thread.
const Watcher = @This();
const DebugLogScope = bun.Output.Scoped(.watcher, false);
const log = DebugLogScope.log;
@@ -126,7 +128,6 @@ pub fn getHash(filepath: string) HashType {
pub const WatchItemIndex = u16;
pub const max_eviction_count = 8096;
const WindowsWatcher = @import("./watcher/WindowsWatcher.zig");
// TODO: some platform-specific behavior is implemented in
// this file instead of the platform-specific file.
// ideally, the constants above can be inlined
@@ -288,7 +289,7 @@ pub fn flushEvictions(this: *Watcher) void {
}
}
fn watchLoop(this: *Watcher) bun.JSC.Maybe(void) {
fn watchLoop(this: *Watcher) bun.sys.Maybe(void) {
while (this.running) {
// individual platform implementation will call onFileUpdate
switch (Platform.watchLoopCycle(this)) {
@@ -296,7 +297,7 @@ fn watchLoop(this: *Watcher) bun.JSC.Maybe(void) {
.result => |iter| iter,
}
}
return .{ .result = {} };
return .success;
}
fn appendFileAssumeCapacity(
@@ -308,13 +309,13 @@ fn appendFileAssumeCapacity(
parent_hash: HashType,
package_json: ?*PackageJSON,
comptime clone_file_path: bool,
) bun.JSC.Maybe(void) {
) bun.sys.Maybe(void) {
if (comptime Environment.isWindows) {
// on windows we can only watch items that are in the directory tree of the top level dir
const rel = bun.path.isParentOrEqual(this.fs.top_level_dir, file_path);
if (rel == .unrelated) {
Output.warn("File {s} is not in the project directory and will not be watched\n", .{file_path});
return .{ .result = {} };
return .success;
}
}
@@ -381,7 +382,7 @@ fn appendFileAssumeCapacity(
}
this.watchlist.appendAssumeCapacity(item);
return .{ .result = {} };
return .success;
}
fn appendDirectoryAssumeCapacity(
this: *Watcher,
@@ -389,7 +390,7 @@ fn appendDirectoryAssumeCapacity(
file_path: string,
hash: HashType,
comptime clone_file_path: bool,
) bun.JSC.Maybe(WatchItemIndex) {
) bun.sys.Maybe(WatchItemIndex) {
if (comptime Environment.isWindows) {
// on windows we can only watch items that are in the directory tree of the top level dir
const rel = bun.path.isParentOrEqual(this.fs.top_level_dir, file_path);
@@ -500,7 +501,7 @@ pub fn appendFileMaybeLock(
package_json: ?*PackageJSON,
comptime clone_file_path: bool,
comptime lock: bool,
) bun.JSC.Maybe(void) {
) bun.sys.Maybe(void) {
if (comptime lock) this.mutex.lock();
defer if (comptime lock) this.mutex.unlock();
bun.assert(file_path.len > 1);
@@ -560,7 +561,7 @@ pub fn appendFileMaybeLock(
});
}
return .{ .result = {} };
return .success;
}
inline fn isEligibleDirectory(this: *Watcher, dir: string) bool {
@@ -576,7 +577,7 @@ pub fn appendFile(
dir_fd: bun.FileDescriptor,
package_json: ?*PackageJSON,
comptime clone_file_path: bool,
) bun.JSC.Maybe(void) {
) bun.sys.Maybe(void) {
return appendFileMaybeLock(this, fd, file_path, hash, loader, dir_fd, package_json, clone_file_path, true);
}
@@ -586,7 +587,7 @@ pub fn addDirectory(
file_path: string,
hash: HashType,
comptime clone_file_path: bool,
) bun.JSC.Maybe(WatchItemIndex) {
) bun.sys.Maybe(WatchItemIndex) {
this.mutex.lock();
defer this.mutex.unlock();
@@ -608,7 +609,7 @@ pub fn addFile(
dir_fd: bun.FileDescriptor,
package_json: ?*PackageJSON,
comptime clone_file_path: bool,
) bun.JSC.Maybe(void) {
) bun.sys.Maybe(void) {
// This must lock due to concurrent transpiler
this.mutex.lock();
defer this.mutex.unlock();
@@ -621,7 +622,7 @@ pub fn addFile(
fds[index] = fd;
}
}
return .{ .result = {} };
return .success;
}
return this.appendFileMaybeLock(fd, file_path, hash, loader, dir_fd, package_json, clone_file_path, false);
@@ -673,13 +674,16 @@ pub fn onMaybeWatchDirectory(watch: *Watcher, file_path: string, dir_fd: bun.Sto
}
}
const std = @import("std");
const bun = @import("bun");
const string = bun.string;
const Output = bun.Output;
const Environment = bun.Environment;
const strings = bun.strings;
const FeatureFlags = bun.FeatureFlags;
const string = []const u8;
const WindowsWatcher = @import("./watcher/WindowsWatcher.zig");
const options = @import("./options.zig");
const Mutex = bun.Mutex;
const std = @import("std");
const PackageJSON = @import("./resolver/package_json.zig").PackageJSON;
const bun = @import("bun");
const Environment = bun.Environment;
const FeatureFlags = bun.FeatureFlags;
const Mutex = bun.Mutex;
const Output = bun.Output;
const strings = bun.strings;


@@ -1,8 +1,12 @@
const std = @import("std");
const Environment = @import("./env.zig");
const bun = @import("bun");
const OOM = bun.OOM;
pub const c_allocator = @import("./allocators/basic.zig").c_allocator;
pub const z_allocator = @import("./allocators/basic.zig").z_allocator;
pub const mimalloc = @import("./allocators/mimalloc.zig");
pub const MimallocArena = @import("./allocators/MimallocArena.zig");
pub const AllocationScope = @import("./allocators/AllocationScope.zig");
pub const NullableAllocator = @import("./allocators/NullableAllocator.zig");
pub const MaxHeapAllocator = @import("./allocators/MaxHeapAllocator.zig");
pub const MemoryReportingAllocator = @import("./allocators/MemoryReportingAllocator.zig");
pub const LinuxMemFdAllocator = @import("./allocators/LinuxMemFdAllocator.zig");
pub fn isSliceInBufferT(comptime T: type, slice: []const T, buffer: []const T) bool {
return (@intFromPtr(buffer.ptr) <= @intFromPtr(slice.ptr) and
@@ -80,9 +84,14 @@ fn OverflowGroup(comptime Block: type) type {
const max = 4095;
const UsedSize = std.math.IntFittingRange(0, max + 1);
const default_allocator = bun.default_allocator;
used: UsedSize = 0,
allocated: UsedSize = 0,
ptrs: [max]*Block = undefined,
used: UsedSize,
allocated: UsedSize,
ptrs: [max]*Block,
pub inline fn zero(this: *Overflow) void {
this.used = 0;
this.allocated = 0;
}
pub fn tail(this: *Overflow) *Block {
if (this.allocated > 0 and this.ptrs[this.used].isFull()) {
@@ -94,7 +103,7 @@ fn OverflowGroup(comptime Block: type) type {
if (this.allocated <= this.used) {
this.ptrs[this.allocated] = default_allocator.create(Block) catch unreachable;
this.ptrs[this.allocated].* = Block{};
this.ptrs[this.allocated].zero();
this.allocated +%= 1;
}
@@ -113,8 +122,12 @@ pub fn OverflowList(comptime ValueType: type, comptime count: comptime_int) type
const SizeType = std.math.IntFittingRange(0, count);
const Block = struct {
used: SizeType = 0,
items: [count]ValueType = undefined,
used: SizeType,
items: [count]ValueType,
pub inline fn zero(this: *Block) void {
this.used = 0;
}
pub inline fn isFull(block: *const Block) bool {
return block.used >= @as(SizeType, count);
@@ -129,8 +142,13 @@ pub fn OverflowList(comptime ValueType: type, comptime count: comptime_int) type
}
};
const Overflow = OverflowGroup(Block);
list: Overflow = Overflow{},
count: u31 = 0,
list: Overflow,
count: u31,
pub inline fn zero(this: *This) void {
this.list.zero();
this.count = 0;
}
pub inline fn len(this: *const This) u31 {
return this.count;
@@ -187,9 +205,17 @@ pub fn BSSList(comptime ValueType: type, comptime _count: anytype) type {
return struct {
const ChunkSize = 256;
const OverflowBlock = struct {
used: std.atomic.Value(u16) = std.atomic.Value(u16).init(0),
data: [ChunkSize]ValueType = undefined,
prev: ?*OverflowBlock = null,
used: std.atomic.Value(u16),
data: [ChunkSize]ValueType,
prev: ?*OverflowBlock,
pub inline fn zero(this: *OverflowBlock) void {
// Avoid struct initialization syntax.
// This makes Bun start about 1ms faster.
// https://github.com/ziglang/zig/issues/24313
this.used = std.atomic.Value(u16).init(0);
this.prev = null;
}
pub fn append(this: *OverflowBlock, item: ValueType) !*ValueType {
const index = this.used.fetchAdd(1, .acq_rel);
@@ -204,10 +230,10 @@ pub fn BSSList(comptime ValueType: type, comptime _count: anytype) type {
allocator: Allocator,
mutex: Mutex = .{},
head: *OverflowBlock = undefined,
tail: OverflowBlock = OverflowBlock{},
backing_buf: [count]ValueType = undefined,
used: u32 = 0,
head: *OverflowBlock,
tail: OverflowBlock,
backing_buf: [count]ValueType,
used: u32,
pub var instance: *Self = undefined;
pub var loaded = false;
@@ -219,11 +245,14 @@ pub fn BSSList(comptime ValueType: type, comptime _count: anytype) type {
pub fn init(allocator: std.mem.Allocator) *Self {
if (!loaded) {
instance = bun.default_allocator.create(Self) catch bun.outOfMemory();
instance.* = Self{
.allocator = allocator,
.tail = OverflowBlock{},
};
// Avoid struct initialization syntax.
// This makes Bun start about 1ms faster.
// https://github.com/ziglang/zig/issues/24313
instance.allocator = allocator;
instance.mutex = .{};
instance.tail.zero();
instance.head = &instance.tail;
instance.used = 0;
loaded = true;
}
@@ -242,7 +271,7 @@ pub fn BSSList(comptime ValueType: type, comptime _count: anytype) type {
instance.used += 1;
return self.head.append(value) catch brk: {
var new_block = try self.allocator.create(OverflowBlock);
new_block.* = OverflowBlock{};
new_block.zero();
new_block.prev = self.head;
self.head = new_block;
break :brk self.head.append(value);
@@ -265,8 +294,6 @@ pub fn BSSList(comptime ValueType: type, comptime _count: anytype) type {
};
}
const Mutex = bun.Mutex;
/// Append-only list.
/// Stores an initial count in .bss section of the object file
/// Overflows to heap when count is exceeded.
@@ -287,12 +314,12 @@ pub fn BSSStringList(comptime _count: usize, comptime _item_length: usize) type
const Allocator = std.mem.Allocator;
const Self = @This();
backing_buf: [count * item_length]u8 = undefined,
backing_buf_used: u64 = undefined,
overflow_list: Overflow = Overflow{},
backing_buf: [count * item_length]u8,
backing_buf_used: u64,
overflow_list: Overflow,
allocator: Allocator,
slice_buf: [count][]const u8 = undefined,
slice_buf_used: u16 = 0,
slice_buf: [count][]const u8,
slice_buf_used: u16,
mutex: Mutex = .{},
pub var instance: *Self = undefined;
var loaded: bool = false;
@@ -305,10 +332,14 @@ pub fn BSSStringList(comptime _count: usize, comptime _item_length: usize) type
pub fn init(allocator: std.mem.Allocator) *Self {
if (!loaded) {
instance = bun.default_allocator.create(Self) catch bun.outOfMemory();
instance.* = Self{
.allocator = allocator,
.backing_buf_used = 0,
};
// Avoid struct initialization syntax.
// This makes Bun start about 1ms faster.
// https://github.com/ziglang/zig/issues/24313
instance.allocator = allocator;
instance.backing_buf_used = 0;
instance.slice_buf_used = 0;
instance.overflow_list.zero();
instance.mutex = .{};
loaded = true;
}
@@ -469,11 +500,11 @@ pub fn BSSMap(comptime ValueType: type, comptime count: anytype, comptime store_
const Overflow = OverflowList(ValueType, count / 4);
index: IndexMap,
overflow_list: Overflow = Overflow{},
overflow_list: Overflow,
allocator: Allocator,
mutex: Mutex = .{},
backing_buf: [count]ValueType = undefined,
backing_buf_used: u16 = 0,
backing_buf: [count]ValueType,
backing_buf_used: u16,
pub var instance: *Self = undefined;
@@ -481,11 +512,15 @@ pub fn BSSMap(comptime ValueType: type, comptime count: anytype, comptime store_
pub fn init(allocator: std.mem.Allocator) *Self {
if (!loaded) {
// Avoid struct initialization syntax.
// This makes Bun start about 1ms faster.
// https://github.com/ziglang/zig/issues/24313
instance = bun.default_allocator.create(Self) catch bun.outOfMemory();
instance.* = Self{
.index = IndexMap{},
.allocator = allocator,
};
instance.index = IndexMap{};
instance.allocator = allocator;
instance.overflow_list.zero();
instance.backing_buf_used = 0;
instance.mutex = .{};
loaded = true;
}
@@ -634,9 +669,12 @@ pub fn BSSMap(comptime ValueType: type, comptime count: anytype, comptime store_
pub fn init(allocator: std.mem.Allocator) *Self {
if (!instance_loaded) {
instance = bun.default_allocator.create(Self) catch bun.outOfMemory();
instance.* = Self{
.map = BSSMapType.init(allocator),
};
// Avoid struct initialization syntax.
// This makes Bun start about 1ms faster.
// https://github.com/ziglang/zig/issues/24313
instance.map = BSSMapType.init(allocator);
instance.key_list_buffer_used = 0;
instance.key_list_overflow.zero();
instance_loaded = true;
}
@@ -733,3 +771,10 @@ pub fn BSSMap(comptime ValueType: type, comptime count: anytype, comptime store_
}
};
}
const Environment = @import("./env.zig");
const std = @import("std");
const bun = @import("bun");
const OOM = bun.OOM;
const Mutex = bun.threading.Mutex;

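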

@@ -1,5 +1,6 @@
//! AllocationScope wraps another allocator, providing leak and invalid free assertions.
//! It also allows measuring how much memory a scope has allocated.
const AllocationScope = @This();
pub const enabled = bun.Environment.enableAllocScopes;
@@ -253,6 +254,7 @@ pub inline fn downcast(a: Allocator) ?*AllocationScope {
const std = @import("std");
const Allocator = std.mem.Allocator;
const bun = @import("bun");
const Output = bun.Output;
const StoredTrace = bun.crash_handler.StoredTrace;


@@ -0,0 +1,193 @@
//! When cloning large amounts of data potentially multiple times, we can
//! leverage copy-on-write memory to avoid actually copying the data. To do that
//! on Linux, we need to use a memfd, which is a Linux-specific feature.
//!
//! The steps are roughly:
//!
//! 1. Create a memfd
//! 2. Write the data to the memfd
//! 3. Map the memfd into memory
//!
//! Then, to clone the data later, we can just call `mmap` again.
//!
//! The big catch is that mmap(), memfd_create(), write() all have overhead. And
//! often we will re-use virtual memory within the process. This does not reuse
//! the virtual memory. So we should only really use this for large blobs of
//! data that we expect to be cloned multiple times. Such as Blob in FormData.
const Self = @This();
const RefCount = bun.ptr.ThreadSafeRefCount(@This(), "ref_count", deinit, .{});
pub const new = bun.TrivialNew(@This());
pub const ref = RefCount.ref;
pub const deref = RefCount.deref;
ref_count: RefCount,
fd: bun.FileDescriptor = .invalid,
size: usize = 0,
var memfd_counter = std.atomic.Value(usize).init(0);
fn deinit(self: *Self) void {
self.fd.close();
bun.destroy(self);
}
pub fn allocator(self: *Self) std.mem.Allocator {
return .{
.ptr = self,
.vtable = AllocatorInterface.VTable,
};
}
pub fn from(allocator_: std.mem.Allocator) ?*Self {
if (allocator_.vtable == AllocatorInterface.VTable) {
return @alignCast(@ptrCast(allocator_.ptr));
}
return null;
}
const AllocatorInterface = struct {
fn alloc(_: *anyopaque, _: usize, _: std.mem.Alignment, _: usize) ?[*]u8 {
// it should perform no allocations or resizes
return null;
}
fn free(
ptr: *anyopaque,
buf: []u8,
_: std.mem.Alignment,
_: usize,
) void {
var self: *Self = @alignCast(@ptrCast(ptr));
defer self.deref();
bun.sys.munmap(@alignCast(@ptrCast(buf))).unwrap() catch |err| {
bun.Output.debugWarn("Failed to munmap memfd: {}", .{err});
};
}
pub const VTable = &std.mem.Allocator.VTable{
.alloc = &AllocatorInterface.alloc,
.resize = &std.mem.Allocator.noResize,
.remap = &std.mem.Allocator.noRemap,
.free = &free,
};
};
pub fn alloc(self: *Self, len: usize, offset: usize, flags: std.posix.MAP) bun.sys.Maybe(bun.webcore.Blob.Store.Bytes) {
var size = len;
// size rounded up to nearest page
size = std.mem.alignForward(usize, size, std.heap.pageSize());
var flags_mut = flags;
flags_mut.TYPE = .SHARED;
switch (bun.sys.mmap(
null,
@min(size, self.size),
std.posix.PROT.READ | std.posix.PROT.WRITE,
flags_mut,
self.fd,
offset,
)) {
.result => |slice| {
return .{
.result = bun.webcore.Blob.Store.Bytes{
.cap = @truncate(slice.len),
.ptr = slice.ptr,
.len = @truncate(len),
.allocator = self.allocator(),
},
};
},
.err => |errno| {
return .{ .err = errno };
},
}
}
pub fn shouldUse(bytes: []const u8) bool {
if (comptime !bun.Environment.isLinux) {
return false;
}
if (bun.jsc.VirtualMachine.is_smol_mode) {
return bytes.len >= 1024 * 1024 * 1;
}
// This is a net 2x - 4x slowdown to new Blob([huge])
// so we must be careful
return bytes.len >= 1024 * 1024 * 8;
}
pub fn create(bytes: []const u8) bun.sys.Maybe(bun.webcore.Blob.Store.Bytes) {
if (comptime !bun.Environment.isLinux) {
unreachable;
}
var label_buf: [128]u8 = undefined;
const label = std.fmt.bufPrintZ(&label_buf, "memfd-num-{d}", .{memfd_counter.fetchAdd(1, .monotonic)}) catch "";
// Using huge pages was slower.
const fd = switch (bun.sys.memfd_create(label, std.os.linux.MFD.CLOEXEC)) {
.err => |err| return .{ .err = bun.sys.Error.fromCode(err.getErrno(), .open) },
.result => |fd| fd,
};
if (bytes.len > 0)
// Hint at the size of the file
_ = bun.sys.ftruncate(fd, @intCast(bytes.len));
// Dump all the bytes in there
var written: isize = 0;
var remain = bytes;
while (remain.len > 0) {
switch (bun.sys.pwrite(fd, remain, written)) {
.err => |err| {
if (err.getErrno() == .AGAIN) {
continue;
}
bun.Output.debugWarn("Failed to write to memfd: {}", .{err});
fd.close();
return .{ .err = err };
},
.result => |result| {
if (result == 0) {
bun.Output.debugWarn("Failed to write to memfd: EOF", .{});
fd.close();
return .{ .err = bun.sys.Error.fromCode(.NOMEM, .write) };
}
written += @intCast(result);
remain = remain[result..];
},
}
}
var linux_memfd_allocator = Self.new(.{
.fd = fd,
.ref_count = .init(),
.size = bytes.len,
});
switch (linux_memfd_allocator.alloc(bytes.len, 0, .{ .TYPE = .SHARED })) {
.result => |res| {
return .{ .result = res };
},
.err => |err| {
linux_memfd_allocator.deref();
return .{ .err = err };
},
}
}
pub fn isInstance(allocator_: std.mem.Allocator) bool {
return allocator_.vtable == AllocatorInterface.VTable;
}
const bun = @import("bun");
const std = @import("std");


@@ -0,0 +1,58 @@
//! Single allocation only.
const Self = @This();
array_list: std.ArrayListAligned(u8, @alignOf(std.c.max_align_t)),
fn alloc(ptr: *anyopaque, len: usize, alignment: std.mem.Alignment, _: usize) ?[*]u8 {
bun.assert(alignment.toByteUnits() <= @alignOf(std.c.max_align_t));
var self = bun.cast(*Self, ptr);
self.array_list.items.len = 0;
self.array_list.ensureTotalCapacity(len) catch return null;
self.array_list.items.len = len;
return self.array_list.items.ptr;
}
fn resize(_: *anyopaque, buf: []u8, _: std.mem.Alignment, new_len: usize, _: usize) bool {
_ = new_len;
_ = buf;
@panic("not implemented");
}
fn free(
_: *anyopaque,
_: []u8,
_: std.mem.Alignment,
_: usize,
) void {}
pub fn reset(self: *Self) void {
self.array_list.items.len = 0;
}
pub fn deinit(self: *Self) void {
self.array_list.deinit();
}
const vtable = std.mem.Allocator.VTable{
.alloc = &alloc,
.free = &free,
.resize = &resize,
.remap = &std.mem.Allocator.noRemap,
};
pub fn init(self: *Self, allocator: std.mem.Allocator) std.mem.Allocator {
self.array_list = .init(allocator);
return std.mem.Allocator{
.ptr = self,
.vtable = &vtable,
};
}
pub fn isInstance(allocator: std.mem.Allocator) bool {
return allocator.vtable == &vtable;
}
const bun = @import("bun");
const std = @import("std");


@@ -1,4 +1,5 @@
const MemoryReportingAllocator = @This();
const log = bun.Output.scoped(.MEM, false);
child_allocator: std.mem.Allocator,
@@ -76,6 +77,10 @@ pub inline fn assert(this: *const MemoryReportingAllocator) void {
}
}
pub fn isInstance(allocator_: std.mem.Allocator) bool {
return allocator_.vtable == &VTable;
}
pub const VTable = std.mem.Allocator.VTable{
.alloc = &alloc,
.resize = &resize,
@@ -84,7 +89,8 @@ pub const VTable = std.mem.Allocator.VTable{
};
const std = @import("std");
const bun = @import("bun");
const jsc = bun.jsc;
const Environment = bun.Environment;
const Output = bun.Output;
const jsc = bun.jsc;


@@ -0,0 +1,176 @@
const Self = @This();
heap: ?*mimalloc.Heap = null,
const log = bun.Output.scoped(.mimalloc, true);
/// Internally, mimalloc calls mi_heap_get_default()
/// to get the default heap.
/// It uses pthread_getspecific to do that.
/// We can save those extra calls if we just do it once in here
pub fn getThreadlocalDefault() Allocator {
return Allocator{ .ptr = mimalloc.mi_heap_get_default(), .vtable = &c_allocator_vtable };
}
pub fn backingAllocator(self: Self) Allocator {
var arena = Self{ .heap = self.heap.?.backing() };
return arena.allocator();
}
pub fn allocator(self: Self) Allocator {
@setRuntimeSafety(false);
return Allocator{ .ptr = self.heap.?, .vtable = &c_allocator_vtable };
}
pub fn dumpThreadStats(self: *Self) void {
_ = self;
const dump_fn = struct {
pub fn dump(textZ: [*:0]const u8, _: ?*anyopaque) callconv(.C) void {
const text = bun.span(textZ);
bun.Output.errorWriter().writeAll(text) catch {};
}
}.dump;
mimalloc.mi_thread_stats_print_out(dump_fn, null);
bun.Output.flush();
}
pub fn dumpStats(self: *Self) void {
_ = self;
const dump_fn = struct {
pub fn dump(textZ: [*:0]const u8, _: ?*anyopaque) callconv(.C) void {
const text = bun.span(textZ);
bun.Output.errorWriter().writeAll(text) catch {};
}
}.dump;
mimalloc.mi_stats_print_out(dump_fn, null);
bun.Output.flush();
}
pub fn deinit(self: *Self) void {
mimalloc.mi_heap_destroy(bun.take(&self.heap).?);
}
pub fn init() !Self {
return .{ .heap = mimalloc.mi_heap_new() orelse return error.OutOfMemory };
}
pub fn gc(self: Self) void {
mimalloc.mi_heap_collect(self.heap orelse return, false);
}
pub inline fn helpCatchMemoryIssues(self: Self) void {
if (comptime FeatureFlags.help_catch_memory_issues) {
self.gc();
bun.mimalloc.mi_collect(false);
}
}
pub fn ownsPtr(self: Self, ptr: *const anyopaque) bool {
return mimalloc.mi_heap_check_owned(self.heap.?, ptr);
}
pub const supports_posix_memalign = true;
fn alignedAlloc(heap: *mimalloc.Heap, len: usize, alignment: mem.Alignment) ?[*]u8 {
log("Malloc: {d}\n", .{len});
const ptr: ?*anyopaque = if (mimalloc.canUseAlignedAlloc(len, alignment.toByteUnits()))
mimalloc.mi_heap_malloc_aligned(heap, len, alignment.toByteUnits())
else
mimalloc.mi_heap_malloc(heap, len);
if (comptime Environment.isDebug) {
const usable = mimalloc.mi_malloc_usable_size(ptr);
if (usable < len) {
std.debug.panic("mimalloc: allocated size is too small: {d} < {d}", .{ usable, len });
}
}
return if (ptr) |p|
@as([*]u8, @ptrCast(p))
else
null;
}
fn alignedAllocSize(ptr: [*]u8) usize {
return mimalloc.mi_malloc_usable_size(ptr);
}
fn alloc(arena: *anyopaque, len: usize, alignment: mem.Alignment, _: usize) ?[*]u8 {
const self = bun.cast(*mimalloc.Heap, arena);
return alignedAlloc(
self,
len,
alignment,
);
}
fn resize(_: *anyopaque, buf: []u8, _: mem.Alignment, new_len: usize, _: usize) bool {
return mimalloc.mi_expand(buf.ptr, new_len) != null;
}
fn free(
_: *anyopaque,
buf: []u8,
alignment: mem.Alignment,
_: usize,
) void {
// mi_free_size internally just asserts the size
// so it's faster if we don't pass that value through
// but its good to have that assertion
if (comptime Environment.isDebug) {
assert(mimalloc.mi_is_in_heap_region(buf.ptr));
if (mimalloc.canUseAlignedAlloc(buf.len, alignment.toByteUnits()))
mimalloc.mi_free_size_aligned(buf.ptr, buf.len, alignment.toByteUnits())
else
mimalloc.mi_free_size(buf.ptr, buf.len);
} else {
mimalloc.mi_free(buf.ptr);
}
}
/// Attempt to expand or shrink memory, allowing relocation.
///
/// `memory.len` must equal the length requested from the most recent
/// successful call to `alloc`, `resize`, or `remap`. `alignment` must
/// equal the same value that was passed as the `alignment` parameter to
/// the original `alloc` call.
///
/// A non-`null` return value indicates the resize was successful. The
/// allocation may have same address, or may have been relocated. In either
/// case, the allocation now has size of `new_len`. A `null` return value
/// indicates that the resize would be equivalent to allocating new memory,
/// copying the bytes from the old memory, and then freeing the old memory.
/// In such case, it is more efficient for the caller to perform the copy.
///
/// `new_len` must be greater than zero.
///
/// `ret_addr` is optionally provided as the first return address of the
/// allocation call stack. If the value is `0` it means no return address
/// has been provided.
fn remap(self: *anyopaque, buf: []u8, alignment: mem.Alignment, new_len: usize, _: usize) ?[*]u8 {
const aligned_size = alignment.toByteUnits();
const value = mimalloc.mi_heap_realloc_aligned(@ptrCast(self), buf.ptr, new_len, aligned_size);
return @ptrCast(value);
}
pub fn isInstance(allocator_: Allocator) bool {
return allocator_.vtable == &c_allocator_vtable;
}
const c_allocator_vtable = Allocator.VTable{
.alloc = &Self.alloc,
.resize = &Self.resize,
.remap = &Self.remap,
.free = &Self.free,
};
const Environment = @import("../env.zig");
const FeatureFlags = @import("../feature_flags.zig");
const std = @import("std");
const bun = @import("bun");
const assert = bun.assert;
const mimalloc = bun.mimalloc;
const mem = std.mem;
const Allocator = mem.Allocator;


@@ -1,6 +1,4 @@
//! A nullable allocator the same size as `std.mem.Allocator`.
const std = @import("std");
const bun = @import("bun");
const NullableAllocator = @This();
@@ -46,3 +44,6 @@ comptime {
@compileError("Expected the sizes to be the same.");
}
}
const bun = @import("bun");
const std = @import("std");


@@ -1,11 +1,4 @@
const mem = @import("std").mem;
const std = @import("std");
const bun = @import("bun");
const log = bun.Output.scoped(.mimalloc, true);
const assert = bun.assert;
const Allocator = mem.Allocator;
const mimalloc = @import("./mimalloc.zig");
const Environment = @import("../env.zig");
fn mimalloc_free(
_: *anyopaque,
@@ -150,3 +143,13 @@ const z_allocator_vtable = Allocator.VTable{
.remap = &std.mem.Allocator.noRemap,
.free = &ZAllocator.free_with_z_allocator,
};
const Environment = @import("../env.zig");
const std = @import("std");
const bun = @import("bun");
const assert = bun.assert;
const mimalloc = bun.mimalloc;
const mem = @import("std").mem;
const Allocator = mem.Allocator;


@@ -1,187 +0,0 @@
const bun = @import("bun");
const std = @import("std");
/// When cloning large amounts of data potentially multiple times, we can
/// leverage copy-on-write memory to avoid actually copying the data. To do that
/// on Linux, we need to use a memfd, which is a Linux-specific feature.
///
/// The steps are roughly:
///
/// 1. Create a memfd
/// 2. Write the data to the memfd
/// 3. Map the memfd into memory
///
/// Then, to clone the data later, we can just call `mmap` again.
///
/// The big catch is that mmap(), memfd_create(), write() all have overhead. And
/// often we will re-use virtual memory within the process. This does not reuse
/// the virtual memory. So we should only really use this for large blobs of
/// data that we expect to be cloned multiple times. Such as Blob in FormData.
pub const LinuxMemFdAllocator = struct {
const RefCount = bun.ptr.ThreadSafeRefCount(@This(), "ref_count", deinit, .{});
pub const new = bun.TrivialNew(@This());
pub const ref = RefCount.ref;
pub const deref = RefCount.deref;
ref_count: RefCount,
fd: bun.FileDescriptor = .invalid,
size: usize = 0,
var memfd_counter = std.atomic.Value(usize).init(0);
fn deinit(this: *LinuxMemFdAllocator) void {
this.fd.close();
bun.destroy(this);
}
pub fn allocator(this: *LinuxMemFdAllocator) std.mem.Allocator {
return .{
.ptr = this,
.vtable = AllocatorInterface.VTable,
};
}
pub fn from(allocator_: std.mem.Allocator) ?*LinuxMemFdAllocator {
if (allocator_.vtable == AllocatorInterface.VTable) {
return @alignCast(@ptrCast(allocator_.ptr));
}
return null;
}
const AllocatorInterface = struct {
fn alloc(_: *anyopaque, _: usize, _: std.mem.Alignment, _: usize) ?[*]u8 {
// it should perform no allocations or resizes
return null;
}
fn free(
ptr: *anyopaque,
buf: []u8,
_: std.mem.Alignment,
_: usize,
) void {
var this: *LinuxMemFdAllocator = @alignCast(@ptrCast(ptr));
defer this.deref();
bun.sys.munmap(@alignCast(@ptrCast(buf))).unwrap() catch |err| {
bun.Output.debugWarn("Failed to munmap memfd: {}", .{err});
};
}
pub const VTable = &std.mem.Allocator.VTable{
.alloc = &AllocatorInterface.alloc,
.resize = &std.mem.Allocator.noResize,
.remap = &std.mem.Allocator.noRemap,
.free = &free,
};
};
pub fn alloc(this: *LinuxMemFdAllocator, len: usize, offset: usize, flags: std.posix.MAP) bun.JSC.Maybe(bun.webcore.Blob.Store.Bytes) {
var size = len;
// size rounded up to nearest page
size = std.mem.alignForward(usize, size, std.heap.pageSize());
var flags_mut = flags;
flags_mut.TYPE = .SHARED;
switch (bun.sys.mmap(
null,
@min(size, this.size),
std.posix.PROT.READ | std.posix.PROT.WRITE,
flags_mut,
this.fd,
offset,
)) {
.result => |slice| {
return .{
.result = bun.webcore.Blob.Store.Bytes{
.cap = @truncate(slice.len),
.ptr = slice.ptr,
.len = @truncate(len),
.allocator = this.allocator(),
},
};
},
.err => |errno| {
return .{ .err = errno };
},
}
}
pub fn shouldUse(bytes: []const u8) bool {
if (comptime !bun.Environment.isLinux) {
return false;
}
if (bun.JSC.VirtualMachine.is_smol_mode) {
return bytes.len >= 1024 * 1024 * 1;
}
// This is a net 2x - 4x slowdown to new Blob([huge])
// so we must be careful
return bytes.len >= 1024 * 1024 * 8;
}
pub fn create(bytes: []const u8) bun.JSC.Maybe(bun.webcore.Blob.Store.Bytes) {
if (comptime !bun.Environment.isLinux) {
unreachable;
}
var label_buf: [128]u8 = undefined;
const label = std.fmt.bufPrintZ(&label_buf, "memfd-num-{d}", .{memfd_counter.fetchAdd(1, .monotonic)}) catch "";
// Using huge pages was slower.
const fd = switch (bun.sys.memfd_create(label, std.os.linux.MFD.CLOEXEC)) {
.err => |err| return .{ .err = bun.sys.Error.fromCode(err.getErrno(), .open) },
.result => |fd| fd,
};
if (bytes.len > 0)
// Hint at the size of the file
_ = bun.sys.ftruncate(fd, @intCast(bytes.len));
// Dump all the bytes in there
var written: isize = 0;
var remain = bytes;
while (remain.len > 0) {
switch (bun.sys.pwrite(fd, remain, written)) {
.err => |err| {
if (err.getErrno() == .AGAIN) {
continue;
}
bun.Output.debugWarn("Failed to write to memfd: {}", .{err});
fd.close();
return .{ .err = err };
},
.result => |result| {
if (result == 0) {
bun.Output.debugWarn("Failed to write to memfd: EOF", .{});
fd.close();
return .{ .err = bun.sys.Error.fromCode(.NOMEM, .write) };
}
written += @intCast(result);
remain = remain[result..];
},
}
}
var linux_memfd_allocator = LinuxMemFdAllocator.new(.{
.fd = fd,
.ref_count = .init(),
.size = bytes.len,
});
switch (linux_memfd_allocator.alloc(bytes.len, 0, .{ .TYPE = .SHARED })) {
.result => |res| {
return .{ .result = res };
},
.err => |err| {
linux_memfd_allocator.deref();
return .{ .err = err };
},
}
}
};

Some files were not shown because too many files have changed in this diff.