Compare commits

...

14 Commits

Author SHA1 Message Date
Claude Bot
46027c0b95 fix(server): destroy response stream sink in all code paths
The ResponseStream.JSSink allocated in doRenderStream was not being
destroyed in several code paths, causing a memory leak:

1. When isAbortedOrEnded() is true after stream assignment - only
   finalize() was called, not destroy()

2. When stream is in progress but no Promise is returned - the sink
   was never destroyed

3. When abort() is called on a request with an active sink - abort()
   only calls finalize(), not destroy()

Added destroy() calls in all these paths, and added a safety net in
finalizeWithoutDeinit() to clean up any leaked sinks before the
RequestContext is returned to the pool.
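The lifecycle can be sketched in plain JavaScript (illustrative names, not Bun's internals): `finalize()` only flushes, `destroy()` releases, so any path that calls one without the other leaks until the pool-return safety net runs.

```javascript
// Illustrative model: a sink must be destroy()ed exactly once.
// finalize() only flushes pending data; destroy() frees the allocation.
let liveSinks = 0;

class Sink {
  constructor() {
    this.destroyed = false;
    liveSinks++;
  }
  finalize() {
    // flush pending data; does NOT free
  }
  destroy() {
    if (this.destroyed) return; // idempotent: safe to call from the safety net
    this.destroyed = true;
    liveSinks--;
  }
}

class RequestContext {
  constructor() {
    this.sink = null;
  }
  // Mirrors the safety net added to finalizeWithoutDeinit(): clean up any
  // sink a code path forgot to destroy before the context is pooled.
  finalizeWithoutDeinit() {
    if (this.sink) {
      this.sink.finalize();
      this.sink.destroy();
      this.sink = null;
    }
  }
}

const ctx = new RequestContext();
ctx.sink = new Sink();
// Simulate an abort path that called finalize() but forgot destroy():
ctx.sink.finalize();
ctx.finalizeWithoutDeinit(); // safety net frees it anyway
```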

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-19 23:40:47 +00:00
robobun
9902039b1f fix: memory leaks in error-handling code for Brotli, Zstd, and Zlib compression state machines (#25592)
## Summary

Fix several memory leaks in the compression libraries:

- **NativeBrotli/NativeZstd reset()** - Each call to `reset()` allocated
a new encoder/decoder without freeing the previous one
- **NativeBrotli/NativeZstd init() error paths** - If `setParams()`
failed after `stream.init()` succeeded, the instance was leaked
- **NativeZstd init()** - If `setPledgedSrcSize()` failed after context
creation, the context was leaked
- **ZlibCompressorArrayList** - After `deflateInit2_()` succeeded, if
`ensureTotalCapacityPrecise()` failed with OOM, zlib internal state was
never freed
- **NativeBrotli close()** - Now sets state to null to prevent potential
double-free (defensive)
- **LibdeflateState** - Added `deinit()` for API consistency
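The `reset()` leak pattern is the same in all three libraries and can be modeled in a few lines of JavaScript (illustrative, not the actual Zig code): the buggy `reset()` was just `init()`, so every call orphaned the previous encoder/decoder state.

```javascript
// Track live native-state allocations.
let live = 0;
const allocState = () => { live++; return {}; };
const freeState = () => { live--; };

class Context {
  constructor() { this.state = null; }
  init() { this.state = allocState(); }
  // Buggy version: reset() allocated without freeing -> one leak per call.
  resetLeaky() { this.init(); }
  // Fixed version mirrors the patch: free existing state first.
  reset() {
    if (this.state !== null) { freeState(); this.state = null; }
    this.init();
  }
}

const ctx = new Context();
ctx.init();
for (let i = 0; i < 100000; i++) ctx.reset();
// With the fix, exactly one state is live regardless of reset count.
```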

## Test plan

- [x] Added regression test that calls `reset()` 100k times and measures
memory growth
- [x] Test shows memory growth dropped from ~600MB to ~10MB for Brotli
- [x] Verified no double-frees by tracing code paths
- [x] Existing zlib tests pass (except pre-existing timeout in debug
build)

Before fix (system bun 1.3.3):
```
Memory growth after 100000 reset() calls: 624.38 MB  (BrotliCompress)
Memory growth after 100000 reset() calls: 540.63 MB  (BrotliDecompress)
```

After fix:
```
Memory growth after 100000 reset() calls: 11.84 MB   (BrotliCompress)
Memory growth after 100000 reset() calls: 0.16 MB    (BrotliDecompress)
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 21:42:14 -08:00
Dylan Conway
f3fd7506ef fix(windows): handle UV_UNKNOWN and UV_EAI_* error codes in libuv errno mapping (#25596)
## Summary
- Add missing `UV_UNKNOWN` and `UV_EAI_*` error code mappings to the
`errno()` function in `ReturnCode`
- Fixes panic "integer does not fit in destination type" on Windows when
libuv returns unmapped error codes
- Speculative fix for BUN-131E

## Root Cause
The `errno()` function was missing mappings for `UV_UNKNOWN` (-4094) and
all `UV_EAI_*` address info errors (-3000 to -3014). When libuv returned
these codes, the switch fell through to `else => null`, and the caller
at `sys_uv.zig:317` assumed success and tried to cast the negative
return code to `usize`, causing a panic.

This was triggered in `readFileWithOptions` -> `preadv` when:
- Memory-mapped file operations encounter exceptions (file
modified/truncated by another process, network drive issues)
- Windows returns error codes that libuv cannot map to standard errno
values
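The failure mode can be sketched in JavaScript (simplified, hypothetical caller; not the actual Zig code): a `null` from the errno mapping means "success", so an unmapped negative return code was treated as a byte count and the cast to an unsigned type panicked.

```javascript
// Simplified model of the errno mapping with a null fallback.
const UV_UNKNOWN = -4094;

function errno(rc) {
  switch (rc) {
    case -4095: return "EOF";
    case -4094: return "UNKNOWN"; // mapping added by this fix
    default: return null;         // previously UV_UNKNOWN fell through here
  }
}

// Caller pattern: null means "no error", so before the fix the negative
// return code was cast to an unsigned size, which panicked.
function preadvResult(rc) {
  const err = errno(rc);
  if (err !== null) return { err };
  if (rc < 0) throw new Error("integer does not fit in destination type");
  return { bytesRead: rc };
}
```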

## Crash Report
```
Bun v1.3.5 (1e86ceb) on windows x86_64_baseline []
panic: integer does not fit in destination type
sys_uv.zig:294: preadv
node_fs.zig:5039: readFileWithOptions
```

## Test plan
- [ ] This fix prevents a panic, converting it to a proper error.
Testing would require triggering `UV_UNKNOWN` from libuv, which is
difficult to do reliably (requires memory-mapped file exceptions or
unusual Windows errors).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-18 21:41:41 -08:00
robobun
c21c51a0ff test(security-scanner): add TTY prompt tests using Bun.Terminal (#25587)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-19 05:21:44 +00:00
robobun
0bbf6c74b5 test: add describe blocks for grouping in bun-types.test.ts (#25598)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-19 04:53:18 +00:00
Dylan Conway
57cbbc09e4 fix: correct off-by-one bounds checks in bundler and package installer (#25582)
## Summary

- Fix two off-by-one bounds check errors that used `>` instead of `>=`
- Both bugs could cause undefined behavior (array out-of-bounds access)
when an index equals the array length

## The Bugs

### 1. `src/install/postinstall_optimizer.zig:62`

```zig
// Before (buggy):
if (resolution > metas.len) continue;
const meta: *const Meta = &metas[resolution];  // Out-of-bounds when resolution == metas.len

// After (fixed):
if (resolution >= metas.len) continue;
```

### 2. `src/bundler/linker_context/doStep5.zig:10`

```zig
// Before (buggy):
if (id > c.graph.meta.len) return;
const resolved_exports = &c.graph.meta.items(.resolved_exports)[id];  // Out-of-bounds when id == c.graph.meta.len

// After (fixed):
if (id >= c.graph.meta.len) return;
```

## Why These Are Bugs

Valid array indices are `0` to `len - 1`. When `index == len`:
- `index > len` evaluates to `false` → check passes
- `array[index]` accesses `array[len]` → out-of-bounds / undefined
behavior
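The two checks differ only at `index == len`, which is exactly the out-of-bounds case — a quick JavaScript illustration (JS returns `undefined` where Zig would hit undefined behavior):

```javascript
const buggyCheck = (i, len) => i > len;   // lets i === len through
const fixedCheck = (i, len) => i >= len;  // rejects i === len

function safeGet(arr, i) {
  if (fixedCheck(i, arr.length)) return undefined; // skip, as the patch does
  return arr[i];
}
```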

## Codebase Patterns

The rest of the codebase correctly uses `>=` for these checks:
- `lockfile.zig:484`: `if (old_resolution >= old.packages.len)
continue;`
- `lockfile.zig:522`: `if (old_resolution >= old.packages.len)
continue;`
- `LinkerContext.zig:389`: `if (source_index >= import_records_list.len)
continue;`
- `LinkerContext.zig:1667`: `if (source_index >= c.graph.ast.len) {`

## Test plan

- [x] Verified fix aligns with existing codebase patterns
- [ ] CI passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 18:04:28 -08:00
Jarred Sumner
7f589ffb4b Disable coderabbit enrichment 2025-12-18 18:03:23 -08:00
Francis F
cea59d7fc0 docs(sqlite): fix .run() return value documentation (#25060)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-12-18 20:44:35 +00:00
Jarred Sumner
4ea1454e4a Delete unused workflow 2025-12-18 12:04:28 -08:00
Dylan Conway
8941a363c3 fix: dupe ca string in .npmrc to prevent use-after-free (#25563)
## Summary

- Fix use-after-free bug when parsing `ca` option from `.npmrc`
- The `ca` string was being stored directly from the parser's arena
without duplication
- Since the parser arena is freed at the end of `loadNpmrc`, this
created a dangling pointer

## The Bug

In `src/ini.zig`, the `ca` string wasn't being duplicated like all other
string properties:

```zig
// Lines 983-986 explicitly warn about this:
// Need to be very, very careful here with strings.
// They are allocated in the Parser's arena, which of course gets
// deinitialized at the end of the scope.
// We need to dupe all strings

// Line 981: Parser arena is freed here
defer parser.deinit();

// Line 1016-1020: THE BUG - string not duped!
if (out.asProperty("ca")) |query| {
    if (query.expr.asUtf8StringLiteral()) |str| {
        install.ca = .{
            .str = str,  // ← Dangling pointer after parser.deinit()!
        };
```

All other string properties in the same function correctly duplicate:
- `registry` (line 996): `try allocator.dupe(u8, str)`
- `cache` (line 1002): `try allocator.dupe(u8, str)`
- `cafile` (line 1037): `asStringCloned(allocator)`
- `ca` array items (line 1026): `asStringCloned(allocator)`

## User Impact

When a user has `ca=<certificate>` in their `.npmrc` file:
1. The certificate string is parsed and stored
2. The parser arena is freed
3. `install.ca.str` becomes a dangling pointer
4. Later TLS/SSL operations access freed memory
5. Could cause crashes, undefined behavior, or security issues
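The arena lifetime issue can be modeled in JavaScript (illustrative only — an arena that invalidates its strings on deinit stands in for freed memory): any string kept past `deinit()` must be duplicated into caller-owned memory first.

```javascript
// Strings handed out by the arena become invalid once it is deinitialized.
class Arena {
  constructor() { this.strings = []; }
  alloc(s) {
    const ref = { value: s };
    this.strings.push(ref);
    return ref;
  }
  deinit() {
    for (const r of this.strings) r.value = undefined; // "freed" memory
  }
}

function loadConfig() {
  const arena = new Arena();
  const parsed = arena.alloc("-----BEGIN CERTIFICATE-----");
  // Buggy: keep the arena-owned reference directly.
  const buggyCa = parsed;
  // Fixed: dupe into caller-owned memory before the arena dies.
  const fixedCa = { value: parsed.value };
  arena.deinit(); // mirrors `defer parser.deinit()`
  return { buggyCa, fixedCa };
}
```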

## Test plan

- Code inspection confirms this matches the pattern used for all other
string properties
- The fix adds `try allocator.dupe(u8, str)` to match `cache`,
`registry`, etc.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 19:56:25 -08:00
Dylan Conway
722ac3aa5a fix: check correct variable in subprocess stdin cleanup (#25562)
## Summary

- Fix typo in `onProcessExit` where `existing_stdin_value.isCell()` was
checked instead of `existing_value.isCell()`
- Since `existing_stdin_value` is always `.zero` at that point, the
condition was always false, making the inner block dead code

## The Bug

In `src/bun.js/api/bun/subprocess.zig:593`:

```zig
var existing_stdin_value = jsc.JSValue.zero;  // Line 590 - always .zero
if (this_jsvalue != .zero) {
    if (jsc.Codegen.JSSubprocess.stdinGetCached(this_jsvalue)) |existing_value| {
        if (existing_stdin_value.isCell()) {  // BUG! Should be existing_value
            // This block was DEAD CODE - never executed
```

Compare with the correct pattern used elsewhere:
```zig
// shell/subproc.zig:251-252 (CORRECT)
if (jsc.Codegen.JSSubprocess.stdinGetCached(subprocess.this_jsvalue)) |existing_value| {
    jsc.WebCore.FileSink.JSSink.setDestroyCallback(existing_value, 0);  // Uses existing_value
}
```

## Impact

The dead code prevented:
- Recovery of stdin from cached JS value when `weak_file_sink_stdin_ptr`
is null
- Proper cleanup via `onAttachedProcessExit` on the FileSink  
- `setDestroyCallback` cleanup in `onProcessExit`

Note: The user-visible impact was mitigated by redundant cleanup paths
in `Writable.zig` that also call `setDestroyCallback`.
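The shape of the typo is easy to reproduce in JavaScript (illustrative names): the guard tested a variable that is always zero instead of the value just unwrapped, so the inner block could never run.

```javascript
let ran = false;
const ZERO = 0;

function onProcessExitBuggy(cached) {
  const existingStdinValue = ZERO; // always zero at this point
  if (cached !== undefined) {
    if (existingStdinValue !== ZERO) { // BUG: should test `cached`
      ran = true; // dead code - never executes
    }
  }
}

function onProcessExitFixed(cached) {
  if (cached !== undefined) {
    if (cached !== ZERO) { // checks the value that was actually unwrapped
      ran = true;
    }
  }
}

onProcessExitBuggy(42);
const deadCode = !ran;   // true: the buggy guard never fires
onProcessExitFixed(42);  // now the cleanup path runs
```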

## Test plan

- Code inspection confirms this is a straightforward typo fix
- Existing subprocess tests continue to pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 18:34:58 -08:00
Dylan Conway
a333d02f84 fix: correct inverted buffer allocation logic in Postgres array parsing (#25564)
## Summary

- Fix inverted buffer allocation logic when parsing strings in Postgres
arrays
- Strings larger than 16KB were incorrectly using the stack buffer
instead of dynamically allocating
- This caused spurious `InvalidByteSequence` errors for valid data

## The Bug

In `src/sql/postgres/DataCell.zig`, the condition for when to use
dynamic allocation was inverted:

```zig
// BEFORE (buggy):
const needs_dynamic_buffer = str_bytes.len < stack_buffer.len;  // TRUE when SMALL

// AFTER (fixed):
const needs_dynamic_buffer = str_bytes.len > stack_buffer.len;  // TRUE when LARGE
```

## What happened with large strings (>16KB)

1. `needs_dynamic_buffer` = false (e.g., 20000 < 16384 is false)
2. Uses `stack_buffer[0..]` which is only 16KB
3. `unescapePostgresString` hits bounds check and returns
`BufferTooSmall`
4. Error converted to `InvalidByteSequence`
5. User gets error even though data is valid

## User Impact

Users with Postgres arrays containing JSON or string elements larger
than 16KB would get spurious `InvalidByteSequence` errors even though
their data was perfectly valid.
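The inverted condition can be demonstrated directly in JavaScript (illustrative, mirroring the logic rather than the Zig code): dynamic allocation is needed only when the string is larger than the 16KB stack buffer, and the buggy comparison selected the opposite.

```javascript
const STACK_BUFFER_LEN = 16 * 1024;

// Buggy: true when the string is SMALL, so large strings used the stack buffer.
const needsDynamicBufferBuggy = len => len < STACK_BUFFER_LEN;
// Fixed: true when the string is LARGE, matching the intended behavior.
const needsDynamicBufferFixed = len => len > STACK_BUFFER_LEN;

// Mirrors the fixed code path: heap for large strings, stack otherwise.
function chosenBufferSize(len) {
  return needsDynamicBufferFixed(len) ? len : STACK_BUFFER_LEN;
}
```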

## Test plan

- Code inspection confirms the logic was inverted
- The fix aligns with the intended behavior: use stack buffer for small
strings, dynamic allocation for large strings

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-17 18:34:17 -08:00
Dylan Conway
c1acb0b9a4 fix(shell): prevent double-close of fd when using &> redirect with builtins (#25568)
## Summary

- Fix double-close of file descriptor when using `&>` redirect with
shell builtin commands
- Add `dupeRef()` helper for cleaner reference counting semantics
- Add tests for `&>` and `&>>` redirects with builtins

## Test plan

- [x] Added tests in `test/js/bun/shell/file-io.test.ts` that reproduce
the bug
- [x] All file-io tests pass

## The Bug

When using `&>` to redirect both stdout and stderr to the same file with
a shell builtin command (e.g., `pwd &> file.txt`), the code was creating
two separate `IOWriter` instances that shared the same file descriptor.
When both `IOWriter`s were destroyed, they both tried to close the same
fd, causing an `EBADF` (bad file descriptor) error.

```javascript
import { $ } from "bun";
await $`pwd &> output.txt`; // Would crash with EBADF
```

## The Fix

1. Share a single `IOWriter` between stdout and stderr when both are
redirected to the same file, with proper reference counting
2. Rename `refSelf` to `dupeRef` for clarity across `IOReader`,
`IOWriter`, `CowFd`, and add it to `Blob` for consistency
3. Fix the `Body.Value` blob case to also properly reference count when
the same blob is assigned to multiple outputs
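The fix's reference-counting scheme can be sketched in JavaScript (illustrative model, not the Zig implementation): with one shared writer, `dupeRef()` bumps the count per owner and the fd is closed exactly once, on the last `deref()`.

```javascript
// Count how many times the underlying fd gets closed.
let closes = 0;

class IOWriter {
  constructor(fd) { this.fd = fd; this.refs = 1; }
  dupeRef() { this.refs++; return this; }
  deref() {
    if (--this.refs === 0) {
      closes++; // close(this.fd) happens exactly once
    }
  }
}

// `pwd &> file.txt`: one writer shared between stdout and stderr.
const writer = new IOWriter(3);
const stdout = writer.dupeRef();
const stderr = writer.dupeRef();
writer.deref(); // drop the creating reference
stdout.deref();
stderr.deref(); // last owner closes the fd
```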

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Latest model <noreply@anthropic.com>
2025-12-17 18:33:53 -08:00
Jarred Sumner
ffd2240c31 Bump 2025-12-17 11:42:54 -08:00
29 changed files with 10551 additions and 4885 deletions

View File

@@ -3,8 +3,6 @@ language: en-US
issue_enrichment:
auto_enrich:
enabled: false
planning:
enabled: false
reviews:
profile: assertive

View File

@@ -1,58 +0,0 @@
name: Codex Test Sync
on:
pull_request:
types: [labeled, opened]
env:
BUN_VERSION: "1.2.15"
jobs:
sync-node-tests:
runs-on: ubuntu-latest
if: |
(github.event.action == 'labeled' && github.event.label.name == 'codex') ||
(github.event.action == 'opened' && contains(github.event.pull_request.labels.*.name, 'codex')) ||
contains(github.head_ref, 'codex')
permissions:
contents: write
pull-requests: write
steps:
- name: Checkout
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@v44
with:
files: |
test/js/node/test/parallel/**/*.{js,mjs,ts}
test/js/node/test/sequential/**/*.{js,mjs,ts}
- name: Sync tests
if: steps.changed-files.outputs.any_changed == 'true'
shell: bash
run: |
echo "Changed test files:"
echo "${{ steps.changed-files.outputs.all_changed_files }}"
# Process each changed test file
for file in ${{ steps.changed-files.outputs.all_changed_files }}; do
# Extract test name from file path
test_name=$(basename "$file" | sed 's/\.[^.]*$//')
echo "Syncing test: $test_name"
bun node:test:cp "$test_name"
done
- name: Commit changes
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: "Sync Node.js tests with upstream"

View File

@@ -1 +1 @@
1.3.4
1.3.5

View File

@@ -303,7 +303,7 @@ Internally, this calls [`sqlite3_reset`](https://www.sqlite.org/capi3ref.html#sq
### `.run()`
Use `.run()` to run a query and get back `undefined`. This is useful for schema-modifying queries (e.g. `CREATE TABLE`) or bulk write operations.
Use `.run()` to run a query and get back an object with execution metadata. This is useful for schema-modifying queries (e.g. `CREATE TABLE`) or bulk write operations.
```ts db.ts icon="/icons/typescript.svg" highlight={2}
const query = db.query(`create table foo;`);

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.3.5",
"version": "1.3.6",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"

View File

@@ -154,12 +154,6 @@ declare module "bun:sqlite" {
* | `bigint` | `INTEGER` |
* | `null` | `NULL` |
*
* @example
* ```ts
* db.run("CREATE TABLE foo (bar TEXT)");
* db.run("INSERT INTO foo VALUES (?)", ["baz"]);
* ```
*
* Useful for queries like:
* - `CREATE TABLE`
* - `INSERT INTO`
@@ -180,8 +174,14 @@ declare module "bun:sqlite" {
*
* @param sql The SQL query to run
* @param bindings Optional bindings for the query
* @returns A `Changes` object with `changes` and `lastInsertRowid` properties
*
* @returns `Database` instance
* @example
* ```ts
* db.run("CREATE TABLE foo (bar TEXT)");
* db.run("INSERT INTO foo VALUES (?)", ["baz"]);
* // => { changes: 1, lastInsertRowid: 1 }
* ```
*/
run<ParamsType extends SQLQueryBindings[]>(sql: string, ...bindings: ParamsType[]): Changes;
@@ -670,18 +670,19 @@ declare module "bun:sqlite" {
* Execute the prepared statement.
*
* @param params optional values to bind to the statement. If omitted, the statement is run with the last bound values or no parameters if there are none.
* @returns A `Changes` object with `changes` and `lastInsertRowid` properties
*
* @example
* ```ts
* const stmt = db.prepare("UPDATE foo SET bar = ?");
* stmt.run("baz");
* // => undefined
* const insert = db.prepare("INSERT INTO users (name) VALUES (?)");
* insert.run("Alice");
* // => { changes: 1, lastInsertRowid: 1 }
* insert.run("Bob");
* // => { changes: 1, lastInsertRowid: 2 }
*
* stmt.run();
* // => undefined
*
* stmt.run("foo");
* // => undefined
* const update = db.prepare("UPDATE users SET name = ? WHERE id = ?");
* update.run("Charlie", 1);
* // => { changes: 1, lastInsertRowid: 2 }
* ```
*
* The following types can be used when binding parameters:

View File

@@ -590,7 +590,7 @@ pub fn onProcessExit(this: *Subprocess, process: *Process, status: bun.spawn.Sta
var existing_stdin_value = jsc.JSValue.zero;
if (this_jsvalue != .zero) {
if (jsc.Codegen.JSSubprocess.stdinGetCached(this_jsvalue)) |existing_value| {
if (existing_stdin_value.isCell()) {
if (existing_value.isCell()) {
if (stdin == null) {
// TODO: review this cast
stdin = @ptrCast(@alignCast(jsc.WebCore.FileSink.JSSink.fromJS(existing_value)));

View File

@@ -752,6 +752,16 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
stream.unpipeWithoutDeref();
}
// Clean up response stream sink if it wasn't already destroyed
// This can happen if the connection was aborted during streaming
if (this.sink) |wrapper| {
ctxLog("finalizeWithoutDeinit: cleaning up leaked sink", .{});
wrapper.sink.finalize();
wrapper.detach(globalThis);
this.sink = null;
wrapper.sink.destroy();
}
this.response_body_readable_stream_ref.deinit();
if (!this.pathname.isEmpty()) {
@@ -1270,6 +1280,8 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
response_stream.sink.onFirstWrite = null;
response_stream.sink.finalize();
this.sink = null;
response_stream.sink.destroy();
return;
}
var response_body_readable_stream_ref = this.response_body_readable_stream_ref;
@@ -1295,6 +1307,9 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
response_stream.detach(globalThis);
stream.cancel(globalThis);
response_stream.sink.markDone();
response_stream.sink.finalize();
this.sink = null;
response_stream.sink.destroy();
this.renderMissing();
}

View File

@@ -93,6 +93,7 @@ pub fn init(this: *@This(), globalThis: *jsc.JSGlobalObject, callframe: *jsc.Cal
err = this.stream.setParams(@intCast(i), d);
if (err.isError()) {
// try impl.emitError(this, globalThis, this_value, err); //XXX: onerror isn't set yet
this.stream.close();
return .false;
}
}
@@ -179,9 +180,23 @@ const Context = struct {
}
pub fn reset(this: *Context) Error {
if (this.state != null) {
this.deinitState();
}
return this.init();
}
/// Frees the Brotli encoder/decoder state without changing mode.
/// Use close() for full cleanup that also sets mode to NONE.
fn deinitState(this: *Context) void {
switch (this.mode) {
.BROTLI_ENCODE => c.BrotliEncoderDestroyInstance(@ptrCast(@alignCast(this.state))),
.BROTLI_DECODE => c.BrotliDecoderDestroyInstance(@ptrCast(@alignCast(this.state))),
else => unreachable,
}
this.state = null;
}
pub fn setBuffers(this: *Context, in: ?[]const u8, out: ?[]u8) void {
this.next_in = if (in) |p| p.ptr else null;
this.next_out = if (out) |p| p.ptr else null;
@@ -238,11 +253,7 @@ const Context = struct {
}
pub fn close(this: *Context) void {
switch (this.mode) {
.BROTLI_ENCODE => c.BrotliEncoderDestroyInstance(@ptrCast(@alignCast(this.state))),
.BROTLI_DECODE => c.BrotliDecoderDestroyInstance(@ptrCast(@alignCast(this.state))),
else => unreachable,
}
this.deinitState();
this.mode = .NONE;
}

View File

@@ -93,7 +93,10 @@ pub fn init(this: *@This(), globalThis: *jsc.JSGlobalObject, callframe: *jsc.Cal
for (params_.asU32(), 0..) |x, i| {
if (x == std.math.maxInt(u32)) continue;
const err_ = this.stream.setParams(@intCast(i), x);
if (err_.isError()) return globalThis.ERR(.ZLIB_INITIALIZATION_FAILED, "{s}", .{std.mem.sliceTo(err_.msg.?, 0)}).throw();
if (err_.isError()) {
this.stream.close();
return globalThis.ERR(.ZLIB_INITIALIZATION_FAILED, "{s}", .{std.mem.sliceTo(err_.msg.?, 0)}).throw();
}
}
return .true;
@@ -135,7 +138,11 @@ const Context = struct {
if (state == null) return .init("Could not initialize zstd instance", -1, "ERR_ZLIB_INITIALIZATION_FAILED");
this.state = state.?;
const result = c.ZSTD_CCtx_setPledgedSrcSize(state, pledged_src_size);
if (c.ZSTD_isError(result) > 0) return .init("Could not set pledged src size", -1, "ERR_ZLIB_INITIALIZATION_FAILED");
if (c.ZSTD_isError(result) > 0) {
_ = c.ZSTD_freeCCtx(state);
this.state = null;
return .init("Could not set pledged src size", -1, "ERR_ZLIB_INITIALIZATION_FAILED");
}
return .ok;
},
.ZSTD_DECOMPRESS => {
@@ -165,9 +172,23 @@ const Context = struct {
}
pub fn reset(this: *Context) Error {
if (this.state != null) {
this.deinitState();
}
return this.init(this.pledged_src_size);
}
/// Frees the Zstd encoder/decoder state without changing mode.
/// Use close() for full cleanup that also sets mode to NONE.
fn deinitState(this: *Context) void {
_ = switch (this.mode) {
.ZSTD_COMPRESS => c.ZSTD_freeCCtx(@ptrCast(this.state)),
.ZSTD_DECOMPRESS => c.ZSTD_freeDCtx(@ptrCast(this.state)),
else => unreachable,
};
this.state = null;
}
pub fn setBuffers(this: *Context, in: ?[]const u8, out: ?[]u8) void {
this.input.src = if (in) |p| p.ptr else null;
this.input.size = if (in) |p| p.len else 0;
@@ -243,13 +264,8 @@ const Context = struct {
.ZSTD_DECOMPRESS => c.ZSTD_DCtx_reset(@ptrCast(this.state), c.ZSTD_reset_session_and_parameters),
else => unreachable,
};
_ = switch (this.mode) {
.ZSTD_COMPRESS => c.ZSTD_freeCCtx(@ptrCast(this.state)),
.ZSTD_DECOMPRESS => c.ZSTD_freeDCtx(@ptrCast(this.state)),
else => unreachable,
};
this.deinitState();
this.mode = .NONE;
this.state = null;
}
};

View File

@@ -7,7 +7,7 @@ pub fn doStep5(c: *LinkerContext, source_index_: Index, _: usize) void {
defer trace.end();
const id = source_index;
if (id > c.graph.meta.len) return;
if (id >= c.graph.meta.len) return;
const worker: *ThreadPool.Worker = ThreadPool.Worker.get(@fieldParentPtr("linker", c));
defer worker.unget();

View File

@@ -2880,6 +2880,21 @@ pub const ReturnCode = enum(c_int) {
UV_ECANCELED => @intFromEnum(bun.sys.E.CANCELED),
UV_ECHARSET => @intFromEnum(bun.sys.E.CHARSET),
UV_EOF => @intFromEnum(bun.sys.E.EOF),
UV_UNKNOWN => @intFromEnum(bun.sys.E.UNKNOWN),
UV_EAI_ADDRFAMILY => @intFromEnum(bun.sys.E.UV_EAI_ADDRFAMILY),
UV_EAI_AGAIN => @intFromEnum(bun.sys.E.UV_EAI_AGAIN),
UV_EAI_BADFLAGS => @intFromEnum(bun.sys.E.UV_EAI_BADFLAGS),
UV_EAI_BADHINTS => @intFromEnum(bun.sys.E.UV_EAI_BADHINTS),
UV_EAI_CANCELED => @intFromEnum(bun.sys.E.UV_EAI_CANCELED),
UV_EAI_FAIL => @intFromEnum(bun.sys.E.UV_EAI_FAIL),
UV_EAI_FAMILY => @intFromEnum(bun.sys.E.UV_EAI_FAMILY),
UV_EAI_MEMORY => @intFromEnum(bun.sys.E.UV_EAI_MEMORY),
UV_EAI_NODATA => @intFromEnum(bun.sys.E.UV_EAI_NODATA),
UV_EAI_NONAME => @intFromEnum(bun.sys.E.UV_EAI_NONAME),
UV_EAI_OVERFLOW => @intFromEnum(bun.sys.E.UV_EAI_OVERFLOW),
UV_EAI_PROTOCOL => @intFromEnum(bun.sys.E.UV_EAI_PROTOCOL),
UV_EAI_SERVICE => @intFromEnum(bun.sys.E.UV_EAI_SERVICE),
UV_EAI_SOCKTYPE => @intFromEnum(bun.sys.E.UV_EAI_SOCKTYPE),
else => null,
}
else

View File

@@ -103,6 +103,11 @@ pub const LibdeflateState = struct {
shared_buffer: [512 * 1024]u8 = undefined,
pub const new = bun.TrivialNew(@This());
pub fn deinit(this: *@This()) void {
this.decompressor.deinit();
bun.TrivialDeinit(@This())(this);
}
};
const request_body_send_stack_buffer_size = 32 * 1024;

View File

@@ -1016,7 +1016,7 @@ pub fn loadNpmrc(
if (out.asProperty("ca")) |query| {
if (query.expr.asUtf8StringLiteral()) |str| {
install.ca = .{
.str = str,
.str = try allocator.dupe(u8, str),
};
} else if (query.expr.isArray()) {
const arr = query.expr.data.e_array;

View File

@@ -59,7 +59,7 @@ pub const PostinstallOptimizer = enum {
// Loop through the list of optional dependencies with platform-specific constraints
// Find a matching target-specific dependency.
for (resolutions) |resolution| {
if (resolution > metas.len) continue;
if (resolution >= metas.len) continue;
const meta: *const Meta = &metas[resolution];
if (meta.arch == .all or meta.os == .all) continue;
if (meta.arch.isMatch(target_cpu) and meta.os.isMatch(target_os)) {

View File

@@ -264,6 +264,11 @@ pub const BuiltinIO = struct {
ref_count: RefCount,
blob: bun.webcore.Blob,
fn dupeRef(this: *Blob) *Blob {
this.ref();
return this;
}
fn deinit(this: *Blob) void {
this.blob.deinit();
bun.destroy(this);
@@ -340,16 +345,16 @@ pub fn init(
io: *IO,
) ?Yield {
const stdin: BuiltinIO.Input = switch (io.stdin) {
.fd => |fd| .{ .fd = fd.refSelf() },
.fd => |fd| .{ .fd = fd.dupeRef() },
.ignore => .ignore,
};
const stdout: BuiltinIO.Output = switch (io.stdout) {
.fd => |val| .{ .fd = .{ .writer = val.writer.refSelf(), .captured = val.captured } },
.fd => |val| .{ .fd = .{ .writer = val.writer.dupeRef(), .captured = val.captured } },
.pipe => .{ .buf = std.array_list.Managed(u8).init(cmd.base.allocator()) },
.ignore => .ignore,
};
const stderr: BuiltinIO.Output = switch (io.stderr) {
.fd => |val| .{ .fd = .{ .writer = val.writer.refSelf(), .captured = val.captured } },
.fd => |val| .{ .fd = .{ .writer = val.writer.dupeRef(), .captured = val.captured } },
.pipe => .{ .buf = std.array_list.Managed(u8).init(cmd.base.allocator()) },
.ignore => .ignore,
};
@@ -481,13 +486,26 @@ fn initRedirections(
cmd.exec.bltn.stdin.deref();
cmd.exec.bltn.stdin = .{ .fd = IOReader.init(redirfd, cmd.base.eventLoop()) };
}
if (!node.redirect.stdout and !node.redirect.stderr) {
return null;
}
const redirect_writer: *IOWriter = .init(
redirfd,
.{ .pollable = is_pollable, .nonblocking = is_nonblocking, .is_socket = is_socket },
cmd.base.eventLoop(),
);
defer redirect_writer.deref();
if (node.redirect.stdout) {
cmd.exec.bltn.stdout.deref();
cmd.exec.bltn.stdout = .{ .fd = .{ .writer = IOWriter.init(redirfd, .{ .pollable = is_pollable, .nonblocking = is_nonblocking, .is_socket = is_socket }, cmd.base.eventLoop()) } };
cmd.exec.bltn.stdout = .{ .fd = .{ .writer = redirect_writer.dupeRef() } };
}
if (node.redirect.stderr) {
cmd.exec.bltn.stderr.deref();
cmd.exec.bltn.stderr = .{ .fd = .{ .writer = IOWriter.init(redirfd, .{ .pollable = is_pollable, .nonblocking = is_nonblocking, .is_socket = is_socket }, cmd.base.eventLoop()) } };
cmd.exec.bltn.stderr = .{ .fd = .{ .writer = redirect_writer.dupeRef() } };
}
},
.jsbuf => |val| {
@@ -522,24 +540,29 @@ fn initRedirections(
var original_blob = body.use();
defer original_blob.deinit();
if (!node.redirect.stdin and !node.redirect.stdout and !node.redirect.stderr) {
return null;
}
const blob: *BuiltinIO.Blob = bun.new(BuiltinIO.Blob, .{
.ref_count = .init(),
.blob = original_blob.dupe(),
});
defer blob.deref();
if (node.redirect.stdin) {
cmd.exec.bltn.stdin.deref();
cmd.exec.bltn.stdin = .{ .blob = blob };
cmd.exec.bltn.stdin = .{ .blob = blob.dupeRef() };
}
if (node.redirect.stdout) {
cmd.exec.bltn.stdout.deref();
cmd.exec.bltn.stdout = .{ .blob = blob };
cmd.exec.bltn.stdout = .{ .blob = blob.dupeRef() };
}
if (node.redirect.stderr) {
cmd.exec.bltn.stderr.deref();
cmd.exec.bltn.stderr = .{ .blob = blob };
cmd.exec.bltn.stderr = .{ .blob = blob.dupeRef() };
}
} else if (interpreter.jsobjs[file.jsbuf.idx].as(jsc.WebCore.Blob)) |blob| {
if ((node.redirect.stdout or node.redirect.stderr) and !blob.needsToReadFile()) {

View File

@@ -170,7 +170,7 @@ pub const OutKind = union(enum) {
fn to_subproc_stdio(this: OutKind, shellio: *?*shell.IOWriter) bun.shell.subproc.Stdio {
return switch (this) {
.fd => |val| brk: {
shellio.* = val.writer.refSelf();
shellio.* = val.writer.dupeRef();
break :brk if (val.captured) |cap| .{
.capture = .{
.buf = cap,

View File

@@ -30,7 +30,7 @@ const InitFlags = packed struct(u8) {
__unused: u5 = 0,
};
pub fn refSelf(this: *IOReader) *IOReader {
pub fn dupeRef(this: *IOReader) *IOReader {
this.ref();
return this;
}

View File

@@ -70,7 +70,7 @@ pub const Poll = WriterImpl;
// pub fn __onClose(_: *IOWriter) void {}
// pub fn __flush(_: *IOWriter) void {}
pub fn refSelf(this: *IOWriter) *IOWriter {
pub fn dupeRef(this: *IOWriter) *IOWriter {
this.ref();
return this;
}

View File

@@ -177,7 +177,7 @@ pub const CowFd = struct {
this.refcount += 1;
}
pub fn refSelf(this: *CowFd) *CowFd {
pub fn dupeRef(this: *CowFd) *CowFd {
this.ref();
return this;
}

View File

@@ -1121,7 +1121,7 @@ pub const PipeReader = struct {
log("PipeReader(0x{x}, {s}) create()", .{ @intFromPtr(this), @tagName(this.out_type) });
if (capture) |cap| {
this.captured_writer.writer = cap.refSelf();
this.captured_writer.writer = cap.dupeRef();
this.captured_writer.dead = false;
}

View File

@@ -151,8 +151,8 @@ fn parseArray(bytes: []const u8, bigint: bool, comptime arrayType: types.Tag, gl
.jsonb_array,
=> {
const str_bytes = slice[1..current_idx];
const needs_dynamic_buffer = str_bytes.len < stack_buffer.len;
const buffer = if (needs_dynamic_buffer) try bun.default_allocator.alloc(u8, str_bytes.len) else stack_buffer[0..];
const needs_dynamic_buffer = str_bytes.len > stack_buffer.len;
const buffer = if (needs_dynamic_buffer) try bun.default_allocator.alloc(u8, str_bytes.len) else stack_buffer[0..str_bytes.len];
defer if (needs_dynamic_buffer) bun.default_allocator.free(buffer);
const unescaped = unescapePostgresString(str_bytes, buffer) catch return error.InvalidByteSequence;
try array.append(bun.default_allocator, SQLDataCell{ .tag = .json, .value = .{ .json = if (unescaped.len > 0) String.cloneUTF8(unescaped).value.WTFStringImpl else null }, .free_value = 1 });
@@ -168,8 +168,8 @@ fn parseArray(bytes: []const u8, bigint: bool, comptime arrayType: types.Tag, gl
slice = trySlice(slice, current_idx + 1);
continue;
}
const needs_dynamic_buffer = str_bytes.len < stack_buffer.len;
const buffer = if (needs_dynamic_buffer) try bun.default_allocator.alloc(u8, str_bytes.len) else stack_buffer[0..];
const needs_dynamic_buffer = str_bytes.len > stack_buffer.len;
const buffer = if (needs_dynamic_buffer) try bun.default_allocator.alloc(u8, str_bytes.len) else stack_buffer[0..str_bytes.len];
defer if (needs_dynamic_buffer) bun.default_allocator.free(buffer);
const string_bytes = unescapePostgresString(str_bytes, buffer) catch return error.InvalidByteSequence;
try array.append(bun.default_allocator, SQLDataCell{ .tag = .string, .value = .{ .string = if (string_bytes.len > 0) String.cloneUTF8(string_bytes).value.WTFStringImpl else null }, .free_value = 1 });

View File

@@ -840,7 +840,10 @@ pub const ZlibCompressorArrayList = struct {
@sizeOf(zStream_struct),
)) {
ReturnCode.Ok => {
try zlib_reader.list.ensureTotalCapacityPrecise(list_allocator, deflateBound(&zlib_reader.zlib, input.len));
zlib_reader.list.ensureTotalCapacityPrecise(list_allocator, deflateBound(&zlib_reader.zlib, input.len)) catch {
zlib_reader.deinit();
return error.OutOfMemory;
};
zlib_reader.list_ptr.* = zlib_reader.list;
zlib_reader.zlib.avail_out = @truncate(zlib_reader.list.capacity);
zlib_reader.zlib.next_out = zlib_reader.list.items.ptr;

View File

@@ -1,4 +1,4 @@
import { bunEnv, bunExe, runBunInstall, tempDirWithFiles } from "harness";
import { bunEnv, bunExe, isWindows, runBunInstall, tempDirWithFiles } from "harness";
import { rm } from "node:fs/promises";
import { join } from "node:path";
import { isCI } from "../../harness";
@@ -21,15 +21,19 @@ type TestName = ReturnType<typeof getTestName>;
// don't get totally lost.
const TESTS_TO_SKIP: Set<string> = new Set<TestName>([
// https://github.com/oven-sh/bun/issues/22255
"0289 (without modules)", "0292 (without modules)", "0295 (without modules)", "0298 (without modules)", "0307 (without modules)", "0310 (without modules)", "0313 (without modules)", "0316 (without modules)", // remove "is-even"
"0325 (without modules)", "0328 (without modules)", "0331 (without modules)", "0334 (without modules)", "0343 (without modules)", "0346 (without modules)", "0349 (without modules)", "0352 (without modules)", // remove "left-pad,is-even"
"0361 (without modules)", "0364 (without modules)", "0367 (without modules)", "0370 (without modules)", "0379 (without modules)", "0382 (without modules)", "0385 (without modules)", "0388 (without modules)", // uninstall "is-even"
"0397 (without modules)", "0400 (without modules)", "0403 (without modules)", "0406 (without modules)", "0415 (without modules)", "0418 (without modules)", "0421 (without modules)", "0424 (without modules)", // uninstall "left-pad,is-even"
// remove "is-even"
"0481 (without modules)", "0486 (without modules)", "0491 (without modules)", "0496 (without modules)", "0511 (without modules)", "0516 (without modules)", "0521 (without modules)", "0526 (without modules)",
// remove "left-pad,is-even"
"0541 (without modules)", "0546 (without modules)", "0551 (without modules)", "0556 (without modules)", "0571 (without modules)", "0576 (without modules)", "0581 (without modules)", "0586 (without modules)",
// uninstall "is-even"
"0601 (without modules)", "0606 (without modules)", "0611 (without modules)", "0616 (without modules)", "0631 (without modules)", "0636 (without modules)", "0641 (without modules)", "0646 (without modules)",
// uninstall "left-pad,is-even"
"0661 (without modules)", "0666 (without modules)", "0671 (without modules)", "0676 (without modules)", "0691 (without modules)", "0696 (without modules)", "0701 (without modules)", "0706 (without modules)",
]);
interface SecurityScannerTestOptions {
command: "install" | "update" | "add" | "remove" | "uninstall";
-args: string[];
+args: readonly string[];
hasExistingNodeModules: boolean;
linker: "hoisted" | "isolated";
scannerType: "local" | "npm" | "npm.bunfigonly";
@@ -38,6 +42,10 @@ interface SecurityScannerTestOptions {
hasLockfile: boolean;
scannerSyncronouslyThrows: boolean;
+// TTY options for testing interactive prompts
+hasTTY: boolean;
+ttyResponse: "y" | "n"; // Response to send when prompted (only used when hasTTY is true and scannerReturns is "warn")
}
const DO_TEST_DEBUG = process.env.SCANNER_TEST_DEBUG === "true";
@@ -70,6 +78,8 @@ async function runSecurityScannerTest(options: SecurityScannerTestOptions) {
scannerReturns,
shouldFail,
scannerSyncronouslyThrows,
+hasTTY,
+ttyResponse,
} = options;
const expectedExitCode = shouldFail ? 1 : 0;
@@ -224,51 +234,90 @@ scanner = "${scannerPath}"`,
console.log(`cd ${dir} && ${cmd.join(" ")}`);
}
-await using proc = Bun.spawn({
-cmd,
-cwd: dir,
-stdout: "pipe",
-stderr: "pipe",
-stdin: "pipe",
-env: bunEnv,
-});
let errAndOut = "";
+let exitCode: number;
-if (DO_TEST_DEBUG) {
-const write = (chunk: Uint8Array<ArrayBuffer>, stream: NodeJS.WriteStream, decoder: TextDecoder) => {
-const str = decoder.decode(chunk);
+if (hasTTY) {
+let responseSent = false;
-errAndOut += str;
+await using terminal = new Bun.Terminal({
+cols: 80,
+rows: 24,
+data(_term, data) {
+const text = new TextDecoder().decode(data);
+errAndOut += text;
-const lines = str.split("\n");
-for (const line of lines) {
-stream.write(redSubprocessPrefix);
-stream.write(" ");
-stream.write(line);
-stream.write("\n");
-}
-};
+if (DO_TEST_DEBUG) {
+const lines = text.split("\n");
+for (const line of lines) {
+process.stdout.write(redSubprocessPrefix);
+process.stdout.write(" ");
+process.stdout.write(line);
+process.stdout.write("\n");
+}
+}
-const outDecoder = new TextDecoder();
-const stdoutWriter = new WritableStream<Uint8Array<ArrayBuffer>>({
-write: chunk => write(chunk, process.stdout, outDecoder),
-close: () => void process.stdout.write(outDecoder.decode()),
+// When we see the prompt, send the configured response
+if (!responseSent && errAndOut.includes("Continue anyway? [y/N]")) {
+responseSent = true;
+terminal.write(ttyResponse + "\n");
+}
+},
+});
-const errDecoder = new TextDecoder();
-const stderrWriter = new WritableStream<Uint8Array<ArrayBuffer>>({
-write: chunk => write(chunk, process.stderr, errDecoder),
-close: () => void process.stderr.write(errDecoder.decode()),
+await using proc = Bun.spawn(cmd, {
+cwd: dir,
+env: bunEnv,
+terminal,
+});
-await Promise.all([proc.stdout.pipeTo(stdoutWriter), proc.stderr.pipeTo(stderrWriter)]);
+exitCode = await proc.exited;
} else {
-const [stdout, stderr] = await Promise.all([proc.stdout.text(), proc.stderr.text()]);
-errAndOut = stdout + stderr;
-}
+// Non-TTY mode: use piped stdin to ensure isatty(stdin) returns false
+await using proc = Bun.spawn({
+cmd,
+cwd: dir,
+stdout: "pipe",
+stderr: "pipe",
+stdin: "pipe",
+env: bunEnv,
+});
-const exitCode = await proc.exited;
+if (DO_TEST_DEBUG) {
+const write = (chunk: Uint8Array<ArrayBuffer>, stream: NodeJS.WriteStream, decoder: TextDecoder) => {
+const str = decoder.decode(chunk, { stream: true });
+errAndOut += str;
+const lines = str.split("\n");
+for (const line of lines) {
+stream.write(redSubprocessPrefix);
+stream.write(" ");
+stream.write(line);
+stream.write("\n");
+}
+};
+const outDecoder = new TextDecoder();
+const stdoutWriter = new WritableStream<Uint8Array<ArrayBuffer>>({
+write: chunk => write(chunk, process.stdout, outDecoder),
+close: () => void process.stdout.write(outDecoder.decode()),
+});
+const errDecoder = new TextDecoder();
+const stderrWriter = new WritableStream<Uint8Array<ArrayBuffer>>({
+write: chunk => write(chunk, process.stderr, errDecoder),
+close: () => void process.stderr.write(errDecoder.decode()),
+});
+await Promise.all([proc.stdout.pipeTo(stdoutWriter), proc.stderr.pipeTo(stderrWriter)]);
+} else {
+const [stdout, stderr] = await Promise.all([proc.stdout.text(), proc.stderr.text()]);
+errAndOut = stdout + stderr;
+}
+exitCode = await proc.exited;
+}
if (exitCode !== expectedExitCode) {
console.log("Command:", cmd.join(" "));
@@ -301,6 +350,20 @@ scanner = "${scannerPath}"`,
if (scannerReturns === "warn") {
expect(errAndOut).toContain("WARNING:");
expect(errAndOut).toContain("Test warning");
+if (hasTTY) {
+// In TTY mode, we should see the interactive prompt
+expect(errAndOut).toContain("Continue anyway? [y/N]");
+if (ttyResponse === "y") {
+expect(errAndOut).toContain("Continuing with installation...");
+} else {
+expect(errAndOut).toContain("Installation cancelled.");
+}
+} else {
+// In non-TTY mode, we should see the no-TTY error message
+expect(errAndOut).toContain("Security warnings found. Cannot prompt for confirmation (no TTY).");
+expect(errAndOut).toContain("Installation cancelled.");
+}
} else if (scannerReturns === "fatal") {
expect(errAndOut).toContain("FATAL:");
expect(errAndOut).toContain("Test fatal error");
@@ -309,14 +372,30 @@ scanner = "${scannerPath}"`,
if (scannerType !== "npm.bunfigonly" && !hasExistingNodeModules) {
switch (scannerReturns) {
case "fatal":
case "warn": {
// When there are fatal advisories OR warnings (with no TTY to prompt),
// the installation is cancelled and packages should NOT be installed
case "fatal": {
// Fatal advisories always cancel installation
expect(await Bun.file(join(dir, "node_modules", "left-pad", "package.json")).exists()).toBe(false);
break;
}
case "warn": {
if (hasTTY && ttyResponse === "y") {
// User accepted the warning in TTY mode, command proceeds normally
// For remove/uninstall without existing node_modules, nothing gets installed
// For other commands, packages should be installed
if (command === "remove" || command === "uninstall") {
// These commands don't install packages, they remove them
// Without existing node_modules, there's nothing to verify
} else {
expect(await Bun.file(join(dir, "node_modules", "left-pad", "package.json")).exists()).toBe(true);
}
} else {
// No TTY to prompt OR user rejected, installation is cancelled
expect(await Bun.file(join(dir, "node_modules", "left-pad", "package.json")).exists()).toBe(false);
}
break;
}
case "none": {
// When there are no security issues, packages should be installed normally
@@ -386,8 +465,12 @@ scanner = "${scannerPath}"`,
const requestedTarballs = registry.getRequestedTarballs();
// when we have no node modules and the scanner comes from npm, we must first install the scanner
-// but, if we expext the scanner to report failure then we should ONLY see the scanner tarball requested, no others
-if (scannerType === "npm" && !hasExistingNodeModules && (scannerReturns === "fatal" || scannerReturns === "warn")) {
+// but, if we expect the scanner to report failure then we should ONLY see the scanner tarball requested, no others
+// Exception: when hasTTY is true and ttyResponse is "y", the user accepts the warning and installation continues
+const installationWasCancelled =
+scannerReturns === "fatal" || (scannerReturns === "warn" && (!hasTTY || ttyResponse === "n"));
+if (scannerType === "npm" && !hasExistingNodeModules && installationWasCancelled) {
const doWeExpectToAlwaysTryToResolve =
// If there is no lockfile, we will resolve packages
!hasLockfile ||
@@ -410,40 +493,15 @@ scanner = "${scannerPath}"`,
const sortedPackages = [...requestedPackages].sort();
const sortedTarballs = [...requestedTarballs].sort();
if (command === "install") {
expect(sortedPackages).toMatchSnapshot("requested-packages: install");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: install");
} else if (command === "add") {
expect(sortedPackages).toMatchSnapshot("requested-packages: add");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: add");
} else if (command === "update") {
if (args.length > 0) {
expect(sortedPackages).toMatchSnapshot("requested-packages: update with args");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: update with args");
} else {
expect(sortedPackages).toMatchSnapshot("requested-packages: update without args");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: update without args");
}
} else if (command === "remove" || command === "uninstall") {
if (args.length > 0) {
expect(sortedPackages).toMatchSnapshot("requested-packages: remove with args");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: remove with args");
} else {
expect(sortedPackages).toMatchSnapshot("requested-packages: remove without args");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: remove without args");
}
} else {
expect(sortedPackages).toMatchSnapshot("requested-packages: unknown command");
expect(sortedTarballs).toMatchSnapshot("requested-tarballs: unknown command");
}
+const key = `${command} ${args.length > 0 ? "with args" : "without args"}` as const;
+expect(sortedPackages).toMatchSnapshot(`requested-packages: ${key}`);
+expect(sortedTarballs).toMatchSnapshot(`requested-tarballs: ${key}`);
}
export function runSecurityScannerTests(selfModuleName: string, hasExistingNodeModules: boolean) {
let i = 0;
-const bunTest = Bun.jest(selfModuleName);
-const { describe, beforeAll, afterAll } = bunTest;
+const { describe, beforeAll, afterAll, test } = Bun.jest(selfModuleName);
beforeAll(async () => {
registryUrl = await startRegistry(DO_TEST_DEBUG);
@@ -453,6 +511,13 @@ export function runSecurityScannerTests(selfModuleName: string, hasExistingNodeM
stopRegistry();
});
+const ttyConfigs = [
+{ hasTTY: false, ttyResponse: "n", ttyLabel: "no-TTY" } as const,
+{ hasTTY: true, ttyResponse: "y", ttyLabel: "TTY:y" } as const,
+{ hasTTY: true, ttyResponse: "n", ttyLabel: "TTY:n" } as const,
+];
+const ttyConfigsNoTTY = ttyConfigs.filter(c => !c.hasTTY);
describe.each(["install", "update", "add", "remove", "uninstall"] as const)("bun %s", command => {
describe.each([
{ args: [], name: "no args" },
@@ -463,58 +528,74 @@ export function runSecurityScannerTests(selfModuleName: string, hasExistingNodeM
describe.each(["local", "npm", "npm.bunfigonly"] as const)("(scanner: %s)", scannerType => {
describe.each([true, false] as const)("(bun.lock exists: %p)", hasLockfile => {
describe.each(["none", "warn", "fatal"] as const)("(advisories: %s)", scannerReturns => {
if ((command === "add" || command === "uninstall" || command === "remove") && args.length === 0) {
// TODO(@alii): Test this case:
// - Exit code 1
// - No changes to disk
// - Scanner does not run
return;
}
+// TTY tests only apply to "warn" cases - for "none" and "fatal", only test non-TTY
+const applicableTtyConfigs = scannerReturns === "warn" ? ttyConfigs : ttyConfigsNoTTY;
-const testName = getTestName(String(++i).padStart(4, "0"), hasExistingNodeModules);
+describe.each(applicableTtyConfigs)("($ttyLabel)", ({ hasTTY, ttyResponse }) => {
+if ((command === "add" || command === "uninstall" || command === "remove") && args.length === 0) {
+// TODO(@alii): Test this case:
+// - Exit code 1
+// - No changes to disk
+// - Scanner does not run
+return;
+}
if (TESTS_TO_SKIP.has(testName)) {
return test.skip(testName, async () => {
// TODO
});
}
+const testName = getTestName(String(++i).padStart(4, "0"), hasExistingNodeModules);
if (isCI) {
if (command === "uninstall") {
if (TESTS_TO_SKIP.has(testName)) {
return test.skip(testName, async () => {
// Same as `remove`, optimising for CI time here
// TODO
});
}
-const random = Math.random();
-if (random < (100 - CI_SAMPLE_PERCENT) / 100) {
+if (hasTTY && isWindows) {
+return test.skip(testName, async () => {
+// skipping this one for CI
+// PTY not supported on Windows
+});
+}
}
-// npm.bunfigonly is the case where a scanner is a valid npm package name identifier
-// but is not referenced in package.json anywhere and is not in the lockfile, so the only knowledge
-// of this package's existence is the fact that it was defined in as the value in bunfig.toml
-// Therefore, we should fail because we don't know where to install it from
-const shouldFail =
-scannerType === "npm.bunfigonly" || scannerReturns === "fatal" || scannerReturns === "warn";
+if (isCI) {
+if (command === "uninstall") {
+return test.skip(testName, async () => {
+// Same as `remove`, optimising for CI time here
+});
+}
-test(testName, async () => {
-await runSecurityScannerTest({
-command,
-args,
-hasExistingNodeModules,
-linker,
-scannerType,
-scannerReturns,
-shouldFail,
-hasLockfile,
+const random = Math.random();
-// TODO(@alii): Test this case
-scannerSyncronouslyThrows: false,
+if (random < (100 - CI_SAMPLE_PERCENT) / 100) {
+return test.skip(testName, async () => {
+// skipping this one for CI
+});
+}
}
+// npm.bunfigonly is the case where a scanner is a valid npm package name identifier
+// but is not referenced in package.json anywhere and is not in the lockfile, so the only knowledge
+// of this package's existence is the fact that it was defined in as the value in bunfig.toml
+// Therefore, we should fail because we don't know where to install it from
+const shouldFail =
+scannerType === "npm.bunfigonly" ||
+scannerReturns === "fatal" ||
+(scannerReturns === "warn" && (!hasTTY || ttyResponse === "n"));
+test(testName, async () => {
+await runSecurityScannerTest({
+command,
+args,
+hasExistingNodeModules,
+linker,
+scannerType,
+scannerReturns,
+shouldFail,
+hasLockfile,
+// TODO(@alii): Test this case
+scannerSyncronouslyThrows: false,
+hasTTY,
+ttyResponse,
+});
+});
});
});
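The TTY test path above watches accumulated terminal output and writes the configured response exactly once when the confirmation prompt appears, even if the prompt text arrives split across chunks. A self-contained sketch of that responder logic, decoupled from `Bun.Terminal` (function and variable names are illustrative, not part of the test harness):

```typescript
// Returns a chunk handler that replies once when `prompt` shows up in the
// accumulated output. Buffering is required because a PTY may deliver the
// prompt split across multiple data events.
function makePromptResponder(
  prompt: string,
  response: string,
  write: (s: string) => void,
): (chunk: string) => void {
  let buffered = "";
  let responded = false;
  return chunk => {
    buffered += chunk; // the prompt may arrive split across chunks
    if (!responded && buffered.includes(prompt)) {
      responded = true; // guard mirrors `responseSent` in the diff above
      write(response + "\n");
    }
  };
}
```

The single-shot guard matters: without it, a shell echoing the prompt back (or the prompt reappearing in later output) would trigger duplicate responses.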


@@ -264,14 +264,16 @@ afterAll(async () => {
});
describe("@types/bun integration test", () => {
test("checks without lib.dom.d.ts", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR);
describe("basic type checks", () => {
test("checks without lib.dom.d.ts", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR);
expect(emptyInterfaces).toEqual(expectedEmptyInterfacesWhenNoDOM);
expect(diagnostics).toEqual([]);
expect(emptyInterfaces).toEqual(expectedEmptyInterfacesWhenNoDOM);
expect(diagnostics).toEqual([]);
});
});
describe("test-globals reference", () => {
describe("Test Globals", () => {
const code = `
const test_shouldBeAFunction: Function = test;
const it_shouldBeAFunction: Function = it;
@@ -360,7 +362,7 @@ describe("@types/bun integration test", () => {
});
});
describe("bun:bundle feature() type safety with Registry", () => {
describe("bun:bundle feature()", () => {
test("Registry augmentation restricts feature() to known flags", async () => {
const testCode = `
// Augment the Registry to define known flags
@@ -454,294 +456,300 @@ describe("@types/bun integration test", () => {
});
});
test("checks with no lib at all", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR, {
options: {
lib: [],
},
describe("lib configuration", () => {
test("checks with no lib at all", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR, {
options: {
lib: [],
},
});
expect(emptyInterfaces).toEqual(expectedEmptyInterfacesWhenNoDOM);
expect(diagnostics).toEqual([]);
});
expect(emptyInterfaces).toEqual(expectedEmptyInterfacesWhenNoDOM);
expect(diagnostics).toEqual([]);
});
test("fails with types: [] and no jsx", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR, {
options: {
lib: [],
types: [],
jsx: ts.JsxEmit.None,
},
});
test("fails with types: [] and no jsx", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR, {
options: {
lib: [],
types: [],
jsx: ts.JsxEmit.None,
},
expect(emptyInterfaces).toEqual(expectedEmptyInterfacesWhenNoDOM);
expect(diagnostics).toEqual([
// // This is expected because we, of course, can't check that our tsx file is passing
// // when tsx is turned off...
// {
// "code": 17004,
// "line": "[slug].tsx:17:10",
// "message": "Cannot use JSX unless the '--jsx' flag is provided.",
// },
]);
});
expect(emptyInterfaces).toEqual(expectedEmptyInterfacesWhenNoDOM);
expect(diagnostics).toEqual([
// // This is expected because we, of course, can't check that our tsx file is passing
// // when tsx is turned off...
// {
// "code": 17004,
// "line": "[slug].tsx:17:10",
// "message": "Cannot use JSX unless the '--jsx' flag is provided.",
// },
]);
});
test("checks with lib.dom.d.ts", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR, {
options: {
lib: ["ESNext", "DOM", "DOM.Iterable", "DOM.AsyncIterable"].map(name => `lib.${name.toLowerCase()}.d.ts`),
},
});
test("checks with lib.dom.d.ts", async () => {
const { diagnostics, emptyInterfaces } = await diagnose(TEMP_FIXTURE_DIR, {
options: {
lib: ["ESNext", "DOM", "DOM.Iterable", "DOM.AsyncIterable"].map(name => `lib.${name.toLowerCase()}.d.ts`),
},
expect(emptyInterfaces).toEqual(
new Set([
"ThisType",
"RTCAnswerOptions",
"RTCOfferAnswerOptions",
"RTCSetParameterOptions",
"EXT_color_buffer_float",
"EXT_float_blend",
"EXT_frag_depth",
"EXT_shader_texture_lod",
"FragmentDirective",
"MediaSourceHandle",
"OES_element_index_uint",
"OES_fbo_render_mipmap",
"OES_texture_float",
"OES_texture_float_linear",
"OES_texture_half_float_linear",
"PeriodicWave",
"RTCRtpScriptTransform",
"WebGLBuffer",
"WebGLFramebuffer",
"WebGLProgram",
"WebGLQuery",
"WebGLRenderbuffer",
"WebGLSampler",
"WebGLShader",
"WebGLSync",
"WebGLTexture",
"WebGLTransformFeedback",
"WebGLUniformLocation",
"WebGLVertexArrayObject",
"WebGLVertexArrayObjectOES",
]),
);
expect(diagnostics).toEqual([
{
code: 2322,
line: "24154.ts:11:3",
message:
"Type 'Blob' is not assignable to type 'import(\"node:buffer\").Blob'.\nThe types returned by 'stream()' are incompatible between these types.\nType 'ReadableStream<Uint8Array<ArrayBuffer>>' is missing the following properties from type 'ReadableStream<NonSharedUint8Array>': blob, text, bytes, json",
},
{
code: 2769,
line: "fetch.ts:25:32",
message:
"No overload matches this call.\nOverload 1 of 3, '(input: string | Request | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is not assignable to type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 2 of 3, '(input: string | Request | URL, init?: BunFetchRequestInit | undefined): Promise<Response>', gave the following error.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is not assignable to type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 3 of 3, '(input: RequestInfo | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is not assignable to type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2769,
line: "fetch.ts:33:32",
message:
"No overload matches this call.\nOverload 1 of 3, '(input: string | Request | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is not assignable to type 'BodyInit | null | undefined'.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 2 of 3, '(input: string | Request | URL, init?: BunFetchRequestInit | undefined): Promise<Response>', gave the following error.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is not assignable to type 'BodyInit | null | undefined'.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 3 of 3, '(input: RequestInfo | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is not assignable to type 'BodyInit | null | undefined'.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2769,
line: "fetch.ts:168:34",
message:
"No overload matches this call.\nOverload 1 of 3, '(input: string | Request | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'SharedArrayBuffer' is not assignable to type 'BodyInit | null | undefined'.\nType 'SharedArrayBuffer' is missing the following properties from type 'ArrayBuffer': resizable, resize, detached, transfer, transferToFixedLength\nOverload 2 of 3, '(input: string | Request | URL, init?: BunFetchRequestInit | undefined): Promise<Response>', gave the following error.\nType 'SharedArrayBuffer' is not assignable to type 'BodyInit | null | undefined'.\nType 'SharedArrayBuffer' is missing the following properties from type 'ArrayBuffer': resizable, resize, detached, transfer, transferToFixedLength\nOverload 3 of 3, '(input: RequestInfo | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'SharedArrayBuffer' is not assignable to type 'BodyInit | null | undefined'.\nType 'SharedArrayBuffer' is missing the following properties from type 'ArrayBuffer': resizable, resize, detached, transfer, transferToFixedLength",
},
{
code: 2353,
line: "globals.ts:307:5",
message: "Object literal may only specify known properties, and 'headers' does not exist in type 'string[]'.",
},
{
code: 2345,
line: "http.ts:43:24",
message:
"Argument of type '() => AsyncGenerator<Uint8Array<ArrayBuffer> | \"hey\", void, unknown>' is not assignable to parameter of type 'BodyInit | null | undefined'.",
},
{
code: 2345,
line: "http.ts:55:24",
message:
"Argument of type 'AsyncGenerator<Uint8Array<ArrayBuffer> | \"it works!\", void, unknown>' is not assignable to parameter of type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<Uint8Array<ArrayBuffer> | \"it works!\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2345,
line: "index.ts:196:14",
message:
"Argument of type 'AsyncGenerator<Uint8Array<ArrayBuffer>, void, unknown>' is not assignable to parameter of type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<Uint8Array<ArrayBuffer>, void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2345,
line: "index.ts:322:29",
message:
"Argument of type '{ headers: { \"x-bun\": string; }; }' is not assignable to parameter of type 'number'.",
},
{
code: 2339,
line: "spawn.ts:62:38",
message: "Property 'text' does not exist on type 'ReadableStream<Uint8Array<ArrayBuffer>>'.",
},
{
code: 2339,
line: "spawn.ts:107:38",
message: "Property 'text' does not exist on type 'ReadableStream<Uint8Array<ArrayBuffer>>'.",
},
{
"code": 2769,
"line": "streams.ts:18:3",
"message":
"No overload matches this call.\nOverload 1 of 3, '(underlyingSource: UnderlyingByteSource, strategy?: { highWaterMark?: number | undefined; } | undefined): ReadableStream<Uint8Array<ArrayBuffer>>', gave the following error.\nType '\"direct\"' is not assignable to type '\"bytes\"'.",
},
{
"code": 2339,
"line": "streams.ts:20:16",
"message": "Property 'write' does not exist on type 'ReadableByteStreamController'.",
},
{
"code": 2339,
"line": "streams.ts:46:19",
"message": "Property 'json' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
"code": 2339,
"line": "streams.ts:47:19",
"message": "Property 'bytes' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
"code": 2339,
"line": "streams.ts:48:19",
"message": "Property 'text' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
"code": 2339,
"line": "streams.ts:49:19",
"message": "Property 'blob' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
code: 2345,
line: "streams.ts:63:66",
message: "Argument of type '\"brotli\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2345,
line: "streams.ts:63:113",
message: "Argument of type '\"brotli\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2345,
line: "streams.ts:64:66",
message: "Argument of type '\"zstd\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2345,
line: "streams.ts:64:111",
message: "Argument of type '\"zstd\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2353,
line: "websocket.ts:25:5",
message:
"Object literal may only specify known properties, and 'protocols' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:30:5",
message:
"Object literal may only specify known properties, and 'protocol' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:35:5",
message:
"Object literal may only specify known properties, and 'protocol' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:43:5",
message: "Object literal may only specify known properties, and 'headers' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:51:5",
message:
"Object literal may only specify known properties, and 'protocols' does not exist in type 'string[]'.",
},
{
code: 2554,
line: "websocket.ts:185:29",
message: "Expected 2 arguments, but got 0.",
},
{
code: 2551,
line: "websocket.ts:192:17",
message: "Property 'URL' does not exist on type 'WebSocket'. Did you mean 'url'?",
},
{
code: 2322,
line: "websocket.ts:196:3",
message: "Type '\"nodebuffer\"' is not assignable to type 'BinaryType'.",
},
{
code: 2339,
line: "websocket.ts:242:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:245:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:249:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:253:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:256:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:259:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:263:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:267:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:270:6",
message: "Property 'terminate' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "worker.ts:23:11",
message: "Property 'ref' does not exist on type 'Worker'.",
},
{
code: 2339,
line: "worker.ts:24:11",
message: "Property 'unref' does not exist on type 'Worker'.",
},
{
code: 2339,
line: "worker.ts:25:11",
message: "Property 'threadId' does not exist on type 'Worker'.",
},
]);
});
expect(emptyInterfaces).toEqual(
new Set([
"ThisType",
"RTCAnswerOptions",
"RTCOfferAnswerOptions",
"RTCSetParameterOptions",
"EXT_color_buffer_float",
"EXT_float_blend",
"EXT_frag_depth",
"EXT_shader_texture_lod",
"FragmentDirective",
"MediaSourceHandle",
"OES_element_index_uint",
"OES_fbo_render_mipmap",
"OES_texture_float",
"OES_texture_float_linear",
"OES_texture_half_float_linear",
"PeriodicWave",
"RTCRtpScriptTransform",
"WebGLBuffer",
"WebGLFramebuffer",
"WebGLProgram",
"WebGLQuery",
"WebGLRenderbuffer",
"WebGLSampler",
"WebGLShader",
"WebGLSync",
"WebGLTexture",
"WebGLTransformFeedback",
"WebGLUniformLocation",
"WebGLVertexArrayObject",
"WebGLVertexArrayObjectOES",
]),
);
expect(diagnostics).toEqual([
{
code: 2322,
line: "24154.ts:11:3",
message:
"Type 'Blob' is not assignable to type 'import(\"node:buffer\").Blob'.\nThe types returned by 'stream()' are incompatible between these types.\nType 'ReadableStream<Uint8Array<ArrayBuffer>>' is missing the following properties from type 'ReadableStream<NonSharedUint8Array>': blob, text, bytes, json",
},
{
code: 2769,
line: "fetch.ts:25:32",
message:
"No overload matches this call.\nOverload 1 of 3, '(input: string | Request | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is not assignable to type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 2 of 3, '(input: string | Request | URL, init?: BunFetchRequestInit | undefined): Promise<Response>', gave the following error.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is not assignable to type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 3 of 3, '(input: RequestInfo | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is not assignable to type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<\"chunk1\" | \"chunk2\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2769,
line: "fetch.ts:33:32",
message:
"No overload matches this call.\nOverload 1 of 3, '(input: string | Request | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is not assignable to type 'BodyInit | null | undefined'.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 2 of 3, '(input: string | Request | URL, init?: BunFetchRequestInit | undefined): Promise<Response>', gave the following error.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is not assignable to type 'BodyInit | null | undefined'.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.\nOverload 3 of 3, '(input: RequestInfo | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is not assignable to type 'BodyInit | null | undefined'.\nType '{ [Symbol.asyncIterator](): AsyncGenerator<\"data1\" | \"data2\", void, unknown>; }' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2769,
line: "fetch.ts:168:34",
message:
"No overload matches this call.\nOverload 1 of 3, '(input: string | Request | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'SharedArrayBuffer' is not assignable to type 'BodyInit | null | undefined'.\nType 'SharedArrayBuffer' is missing the following properties from type 'ArrayBuffer': resizable, resize, detached, transfer, transferToFixedLength\nOverload 2 of 3, '(input: string | Request | URL, init?: BunFetchRequestInit | undefined): Promise<Response>', gave the following error.\nType 'SharedArrayBuffer' is not assignable to type 'BodyInit | null | undefined'.\nType 'SharedArrayBuffer' is missing the following properties from type 'ArrayBuffer': resizable, resize, detached, transfer, transferToFixedLength\nOverload 3 of 3, '(input: RequestInfo | URL, init?: RequestInit | undefined): Promise<Response>', gave the following error.\nType 'SharedArrayBuffer' is not assignable to type 'BodyInit | null | undefined'.\nType 'SharedArrayBuffer' is missing the following properties from type 'ArrayBuffer': resizable, resize, detached, transfer, transferToFixedLength",
},
{
code: 2353,
line: "globals.ts:307:5",
message: "Object literal may only specify known properties, and 'headers' does not exist in type 'string[]'.",
},
{
code: 2345,
line: "http.ts:43:24",
message:
"Argument of type '() => AsyncGenerator<Uint8Array<ArrayBuffer> | \"hey\", void, unknown>' is not assignable to parameter of type 'BodyInit | null | undefined'.",
},
{
code: 2345,
line: "http.ts:55:24",
message:
"Argument of type 'AsyncGenerator<Uint8Array<ArrayBuffer> | \"it works!\", void, unknown>' is not assignable to parameter of type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<Uint8Array<ArrayBuffer> | \"it works!\", void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2345,
line: "index.ts:196:14",
message:
"Argument of type 'AsyncGenerator<Uint8Array<ArrayBuffer>, void, unknown>' is not assignable to parameter of type 'BodyInit | null | undefined'.\nType 'AsyncGenerator<Uint8Array<ArrayBuffer>, void, unknown>' is missing the following properties from type 'ReadableStream<any>': locked, cancel, getReader, pipeThrough, and 3 more.",
},
{
code: 2345,
line: "index.ts:322:29",
message:
"Argument of type '{ headers: { \"x-bun\": string; }; }' is not assignable to parameter of type 'number'.",
},
{
code: 2339,
line: "spawn.ts:62:38",
message: "Property 'text' does not exist on type 'ReadableStream<Uint8Array<ArrayBuffer>>'.",
},
{
code: 2339,
line: "spawn.ts:107:38",
message: "Property 'text' does not exist on type 'ReadableStream<Uint8Array<ArrayBuffer>>'.",
},
{
code: 2769,
line: "streams.ts:18:3",
message:
"No overload matches this call.\nOverload 1 of 3, '(underlyingSource: UnderlyingByteSource, strategy?: { highWaterMark?: number | undefined; } | undefined): ReadableStream<Uint8Array<ArrayBuffer>>', gave the following error.\nType '\"direct\"' is not assignable to type '\"bytes\"'.",
},
{
code: 2339,
line: "streams.ts:20:16",
message: "Property 'write' does not exist on type 'ReadableByteStreamController'.",
},
{
code: 2339,
line: "streams.ts:46:19",
message: "Property 'json' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
code: 2339,
line: "streams.ts:47:19",
message: "Property 'bytes' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
code: 2339,
line: "streams.ts:48:19",
message: "Property 'text' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
code: 2339,
line: "streams.ts:49:19",
message: "Property 'blob' does not exist on type 'ReadableStream<Uint8Array<ArrayBufferLike>>'.",
},
{
code: 2345,
line: "streams.ts:63:66",
message: "Argument of type '\"brotli\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2345,
line: "streams.ts:63:113",
message: "Argument of type '\"brotli\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2345,
line: "streams.ts:64:66",
message: "Argument of type '\"zstd\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2345,
line: "streams.ts:64:111",
message: "Argument of type '\"zstd\"' is not assignable to parameter of type 'CompressionFormat'.",
},
{
code: 2353,
line: "websocket.ts:25:5",
message: "Object literal may only specify known properties, and 'protocols' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:30:5",
message: "Object literal may only specify known properties, and 'protocol' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:35:5",
message: "Object literal may only specify known properties, and 'protocol' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:43:5",
message: "Object literal may only specify known properties, and 'headers' does not exist in type 'string[]'.",
},
{
code: 2353,
line: "websocket.ts:51:5",
message: "Object literal may only specify known properties, and 'protocols' does not exist in type 'string[]'.",
},
{
code: 2554,
line: "websocket.ts:185:29",
message: "Expected 2 arguments, but got 0.",
},
{
code: 2551,
line: "websocket.ts:192:17",
message: "Property 'URL' does not exist on type 'WebSocket'. Did you mean 'url'?",
},
{
code: 2322,
line: "websocket.ts:196:3",
message: "Type '\"nodebuffer\"' is not assignable to type 'BinaryType'.",
},
{
code: 2339,
line: "websocket.ts:242:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:245:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:249:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:253:6",
message: "Property 'ping' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:256:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:259:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:263:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:267:6",
message: "Property 'pong' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "websocket.ts:270:6",
message: "Property 'terminate' does not exist on type 'WebSocket'.",
},
{
code: 2339,
line: "worker.ts:23:11",
message: "Property 'ref' does not exist on type 'Worker'.",
},
{
code: 2339,
line: "worker.ts:24:11",
message: "Property 'unref' does not exist on type 'Worker'.",
},
{
code: 2339,
line: "worker.ts:25:11",
message: "Property 'threadId' does not exist on type 'Worker'.",
},
]);
});
});
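Several of the expected errors in the fixture above come from passing an `AsyncGenerator` directly where `BodyInit` is expected: the generator lacks `ReadableStream` members such as `locked`, `cancel`, and `getReader`. For code that hits these errors at the type level, a standard workaround (a sketch, not part of the fixture — the `toReadableStream` name is mine) is to wrap the generator in a `ReadableStream`, which `BodyInit` does accept:

```typescript
// Adapt an async generator into a ReadableStream so it satisfies BodyInit.
// Each pull() advances the generator by one step; cancel() closes it early.
function toReadableStream<T>(gen: AsyncGenerator<T>): ReadableStream<T> {
  return new ReadableStream<T>({
    async pull(controller) {
      const { value, done } = await gen.next();
      if (done) controller.close();
      else controller.enqueue(value);
    },
    cancel() {
      // Let the generator run its finally blocks if the consumer stops early.
      void gen.return(undefined);
    },
  });
}
```

This is purely a typing-level bridge; runtimes that already accept async iterables as bodies behave the same either way.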


@@ -140,4 +140,19 @@ describe("IOWriter file output redirection", () => {
.fileEquals("pipe_output.txt", "pipe test\n")
.runAsTest("pipe with file redirection");
});
describe("&> redirect (stdout and stderr to same file)", () => {
// This test verifies the fix for the bug where using &> with a builtin
// command caused the same file descriptor to be closed twice, resulting
// in an EBADF error. The issue was that two separate IOWriter instances
// were created for the same fd when both stdout and stderr were redirected.
TestBuilder.command`pwd &> pwd_output.txt`.exitCode(0).runAsTest("builtin pwd with &> redirect");

TestBuilder.command`echo "hello" &> echo_output.txt`
.exitCode(0)
.fileEquals("echo_output.txt", "hello\n")
.runAsTest("builtin echo with &> redirect");

TestBuilder.command`pwd &>> append_output.txt`.exitCode(0).runAsTest("builtin pwd with &>> append redirect");
});
});
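For context on what `&>` has to do (and why sharing one IOWriter per fd matters): in bash, `cmd &> file` is shorthand for `cmd > file 2>&1` — the shell opens the target once and both stdout and stderr write through that single descriptor, which is exactly why two independent writers closing the same fd produced EBADF. A small sketch that checks the equivalence by shelling out to bash (assumes `bash` is on PATH; file names are arbitrary):

```typescript
import { execFileSync } from "node:child_process";
import { readFileSync, rmSync } from "node:fs";

// bash's `&>` is equivalent to `> file 2>&1`: one open of the target,
// with stdout and stderr sharing the resulting descriptor.
execFileSync("bash", ["-c", "(echo out; echo err >&2) &> t1.txt"]);
execFileSync("bash", ["-c", "(echo out; echo err >&2) > t2.txt 2>&1"]);

const a = readFileSync("t1.txt", "utf8");
const b = readFileSync("t2.txt", "utf8");
console.log(a === b && a === "out\nerr\n"); // → true

rmSync("t1.txt");
rmSync("t2.txt");
```

The appending form `&>>` is likewise shorthand for `>> file 2>&1`, which is what the `pwd &>> append_output.txt` test above exercises.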


@@ -0,0 +1,65 @@
import { expect, test } from "bun:test";
import { createBrotliCompress, createBrotliDecompress } from "zlib";

// This test verifies that calling reset() on Brotli streams doesn't leak memory.
// Before the fix, each reset() call would allocate a new Brotli encoder/decoder
// without freeing the previous one.
test("Brotli reset() should not leak memory", { timeout: 30_000 }, async () => {
const iterations = 100_000;

// Get baseline memory
Bun.gc(true);
await Bun.sleep(10);
const baselineMemory = process.memoryUsage.rss();

const compressor = createBrotliCompress();

// Reset many times - before the fix, each reset leaks ~400KB (brotli encoder state)
for (let i = 0; i < iterations; i++) {
compressor.reset();
}
compressor.close();

// Force GC and measure
Bun.gc(true);
await Bun.sleep(10);
const finalMemory = process.memoryUsage.rss();
const memoryGrowth = finalMemory - baselineMemory;
const memoryGrowthMB = memoryGrowth / 1024 / 1024;
console.log(`Memory growth after ${iterations} reset() calls: ${memoryGrowthMB.toFixed(2)} MB`);

// With 100k iterations and ~400KB per leak, we'd expect ~40GB of leakage without the fix.
// With the fix, memory growth should be minimal (under 50MB accounting for test overhead).
expect(memoryGrowthMB).toBeLessThan(50);
});

test("BrotliDecompress reset() should not leak memory", { timeout: 30_000 }, async () => {
const iterations = 100_000;

Bun.gc(true);
await Bun.sleep(10);
const baselineMemory = process.memoryUsage.rss();

const decompressor = createBrotliDecompress();
for (let i = 0; i < iterations; i++) {
decompressor.reset();
}
decompressor.close();

Bun.gc(true);
await Bun.sleep(10);
const finalMemory = process.memoryUsage.rss();
const memoryGrowth = finalMemory - baselineMemory;
const memoryGrowthMB = memoryGrowth / 1024 / 1024;
console.log(`Memory growth after ${iterations} reset() calls: ${memoryGrowthMB.toFixed(2)} MB`);
expect(memoryGrowthMB).toBeLessThan(50);
});
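The RSS-delta technique both regression tests use — snapshot a baseline, exercise the suspect path in a tight loop, force a collection, and compare — generalizes into a small helper. This is a sketch only: `measureRssGrowthMB` is a hypothetical name, and it assumes a runtime exposing `process.memoryUsage.rss()` plus an optional collector hook (`Bun.gc(true)` in Bun, or `globalThis.gc` under Node's `--expose-gc`).

```typescript
// Hypothetical helper: report resident-set-size growth (in MB) caused by
// running `fn` `iterations` times. Forces GC around each measurement when
// a collector hook is available; otherwise measures raw RSS drift.
async function measureRssGrowthMB(fn: () => void, iterations: number): Promise<number> {
  const gc: ((full?: boolean) => void) | undefined =
    (globalThis as any).Bun?.gc ?? (globalThis as any).gc;

  gc?.(true);
  await new Promise(resolve => setTimeout(resolve, 10)); // let the allocator settle
  const baseline = process.memoryUsage.rss();

  for (let i = 0; i < iterations; i++) fn();

  gc?.(true);
  await new Promise(resolve => setTimeout(resolve, 10));
  return (process.memoryUsage.rss() - baseline) / 1024 / 1024;
}
```

A leak of even a few hundred bytes per call shows up unmistakably at 100k iterations, while a generous threshold (the 50 MB used above) absorbs allocator noise, so the assertion stays robust on CI.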