Compare commits

..

15 Commits

Author SHA1 Message Date
Claude Bot
76163e2020 feat(patch): add --preview flag to show diff without saving
Adds a `--preview` flag to `bun patch` and `bun patch-commit` commands that
prints the patch diff to stdout without saving the patch file or modifying
package.json.

This allows users to preview what a patch will contain before committing to
it, making it easier to create minimal patches when porting bug fixes from
upstream repositories.

Usage:
  bun patch --preview node_modules/package
  bun patch-commit --preview node_modules/package

Closes #26463

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-26 23:56:49 +00:00
robobun
bfe40e8760 fix(cmake): use BUILDKITE_BUILD_NUMBER to avoid 302 redirect (#26409)
## What does this PR do?

Fixes CMake "No jobs found" error during the build-bun step in CI by
using `BUILDKITE_BUILD_NUMBER` instead of `BUILDKITE_BUILD_ID` (UUID)
for the Buildkite API URL.

### Problem

When `BUN_LINK_ONLY=ON`, `SetupBuildkite.cmake` fetches build info from
the Buildkite API to download artifacts from earlier build steps
(build-cpp, build-zig).

The `BUILDKITE_BUILD_ID` environment variable contains a UUID (e.g.,
`019bee3e-da45-4e9f-b4d8-4bdb5aeac0ac`). When this UUID is used in the
URL, Buildkite returns a **302 redirect** to the numeric build number
URL (e.g., `/builds/35708`).

CMake's `file(DOWNLOAD)` command **does not follow HTTP redirects**, so
the downloaded file is empty. Parsing the empty JSON yields 0 jobs,
triggering the fatal error:

```
CMake Error at cmake/tools/SetupBuildkite.cmake:67 (message):
  No jobs found:
  https://buildkite.com/bun/bun/builds/019bee3e-da45-4e9f-b4d8-4bdb5aeac0ac
```

### Solution

Prefer `BUILDKITE_BUILD_NUMBER` (numeric, e.g., `35708`) when available,
which doesn't redirect. This environment variable is automatically set
by Buildkite.

## How did you verify your code works?

- Verified UUID URL returns 302: `curl -sS -w '%{http_code}'
"https://buildkite.com/bun/bun/builds/019bee3e-da45-4e9f-b4d8-4bdb5aeac0ac"`
→ `302`
- Verified numeric URL returns 200 with JSON: `curl -sS -H "Accept:
application/json" "https://buildkite.com/bun/bun/builds/35708"` → valid
JSON with jobs array

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-24 23:54:33 -08:00
github-actions[bot]
bcaae48a95 deps: update lolhtml to v2.7.1 (#26430)
## What does this PR do?

Updates lolhtml to version v2.7.1

Compare:
d64457d9ff...e9e16dca48

Auto-updated by [this
workflow](https://github.com/oven-sh/bun/actions/workflows/update-lolhtml.yml)

Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2026-01-24 23:54:09 -08:00
robobun
6b3403a2b4 ci: fix update workflows creating duplicate PRs (#26433)
## Summary
- Fixed all `update-*.yml` workflows that were creating duplicate PRs
every week

## Problem
The update workflows (libarchive, zstd, cares, etc.) were using `${{
github.run_number }}` in the branch name, e.g.:
```yaml
branch: deps/update-libarchive-${{ github.run_number }}
```

This caused a new unique branch to be created on every workflow run, so
the `peter-evans/create-pull-request` action couldn't detect existing
PRs and would create duplicates.

**Evidence:** There are currently 8+ open duplicate PRs for libarchive
alone:
- #26432 deps: update libarchive to v3.8.5 (deps/update-libarchive-56)
- #26209 deps: update libarchive to v3.8.5 (deps/update-libarchive-55)
- #25955 deps: update libarchive to v3.8.5 (deps/update-libarchive-54)
- etc.

## Solution
Changed all workflows to use static branch names, e.g.:
```yaml
branch: deps/update-libarchive
```

This allows the action to:
1. Detect if an existing branch/PR already exists
2. Update the existing PR with new changes instead of creating a new one
3. Properly use `delete-branch: true` when the PR is merged

## Files Changed
- `.github/workflows/update-cares.yml`
- `.github/workflows/update-hdrhistogram.yml`
- `.github/workflows/update-highway.yml`
- `.github/workflows/update-libarchive.yml`
- `.github/workflows/update-libdeflate.yml`
- `.github/workflows/update-lolhtml.yml`
- `.github/workflows/update-lshpack.yml`
- `.github/workflows/update-root-certs.yml`
- `.github/workflows/update-sqlite3.yml`
- `.github/workflows/update-vendor.yml`
- `.github/workflows/update-zstd.yml`

## Test plan
- [x] Verified the change is syntactically correct
- [ ] Wait for next scheduled run of any workflow to verify it updates
existing PR instead of creating a new one

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-24 23:53:50 -08:00
robobun
70fe76209b fix(readline): use symbol key for _refreshLine in tab completer (#26412) 2026-01-24 15:37:29 -08:00
Jarred Sumner
ab3df344b8 Delete slop test 2026-01-23 23:22:32 -08:00
robobun
4680e89a91 fix(bundler): add missing semicolons in minified bun module imports (#26372)
## Summary
- Fix missing semicolons in minified output when using both default and
named imports from `"bun"` module
- The issue occurred in `printInternalBunImport` when transitioning
between star_name, default_name, and items sections without flushing
pending semicolons

## Test plan
- Added regression tests in `test/regression/issue/26371.test.ts`
covering:
  - Default + named imports (`import bun, { embeddedFiles } from "bun"`)
  - Namespace + named imports (`import * as bun from "bun"; import { embeddedFiles } from "bun"`)
  - Namespace + default + named imports combination
- Verified test fails with `USE_SYSTEM_BUN=1` (reproduces bug)
- Verified test passes with `bun bd test` (fix works)

Fixes #26371

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-23 23:09:01 -08:00
robobun
f88f60af5a fix(bundler): throw error when Bun.build is called from macro during bundling (#26361)
## Summary
- Fixes #26360
- Detects when `Bun.build` is called from within macro mode during
bundling and throws a clear error instead of hanging indefinitely

## Problem
When `Bun.build` API is called to bundle a file that imports from a
macro which itself uses `Bun.build`, the process would hang indefinitely
due to a deadlock:

1. The bundler uses a singleton thread for processing `Bun.build` calls
2. During parsing, when a macro is encountered, it's evaluated on that
thread
3. If the macro calls `Bun.build`, it tries to enqueue to the same
singleton thread
4. The singleton is blocked waiting for macro completion → deadlock

## Solution
Added a check in `Bun.build` that detects when it's called from macro
mode (`vm.macro_mode`) and throws a clear error with guidance:

```
Bun.build cannot be called from within a macro during bundling.

This would cause a deadlock because the bundler is waiting for the macro to complete,
but the macro's Bun.build call is waiting for the bundler.

To bundle code at compile time in a macro, use Bun.spawnSync to invoke the CLI:
  const result = Bun.spawnSync(["bun", "build", entrypoint, "--format=esm"]);
```
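The workaround named in the error message can be sketched as follows. This is an illustrative sketch only: the macro context and the `./worker.ts` entrypoint are assumed, and `Bun.spawnSync` is the API the error message itself recommends.

```typescript
// Hypothetical helper that composes the CLI invocation suggested above.
function bundleCommand(entrypoint: string): string[] {
  return ["bun", "build", entrypoint, "--format=esm"];
}

// Inside a macro (which runs on the bundler thread), shell out to the CLI
// instead of re-entering the bundler's singleton thread:
//   const result = Bun.spawnSync(bundleCommand("./worker.ts"));
//   const bundledSource = result.stdout.toString();
```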

## Test plan
- [x] Added regression test in `test/regression/issue/26360.test.ts`
- [x] Verified test hangs/fails with system Bun (the bug exists)
- [x] Verified test passes with the fix applied
- [x] Verified regular `Bun.build` (not in macro context) still works

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-23 20:24:12 -08:00
robobun
232e0df956 fix(http): respect port numbers in NO_PROXY environment variable (#26347)
## Summary
- Fix NO_PROXY environment variable to properly respect port numbers
like Node.js and curl do
- Previously `NO_PROXY=localhost:1234` would bypass proxy for all
requests to localhost regardless of port
- Now entries with ports (e.g., `localhost:8080`) do exact host:port
matching, while entries without ports continue to use suffix matching
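The matching rule above can be sketched as a small helper. This is a hedged sketch of the described semantics, not Bun's internal implementation; `shouldBypassProxy` is a hypothetical name.

```typescript
// Sketch of NO_PROXY matching with optional ports (assumed semantics):
// entries with a port require an exact host:port match, entries without
// a port keep the pre-existing suffix matching.
function shouldBypassProxy(host: string, port: number, noProxy: string): boolean {
  for (const raw of noProxy.split(",")) {
    const entry = raw.trim();
    if (!entry) continue;
    const colon = entry.lastIndexOf(":");
    if (colon !== -1 && /^\d+$/.test(entry.slice(colon + 1))) {
      // Entry has a port: exact host:port match only.
      if (entry.slice(0, colon) === host && Number(entry.slice(colon + 1)) === port) {
        return true;
      }
    } else if (host === entry || host.endsWith("." + entry)) {
      // No port: suffix matching, the existing behavior.
      return true;
    }
  }
  return false;
}
```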

## Test plan
- Added tests in `test/js/bun/http/proxy.test.js` covering:
  - [x] Bypass proxy when NO_PROXY matches host:port exactly
  - [x] Use proxy when NO_PROXY has a different port
  - [x] Bypass proxy when NO_PROXY has host only (no port) - existing behavior preserved
  - [x] Handle NO_PROXY with multiple entries including port
- Verified existing proxy tests still pass

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-23 20:21:57 -08:00
robobun
9f0e78fc42 fix(serve): add missing return after handling invalid stream in response body (#26396)
## Summary
- Fix missing return after handling `.Invalid` stream case in response
body rendering
- Add regression test for Bun.serve() concurrent instances (#26394)

## Details

When a response body contains a locked value with an invalid readable
stream (`stream.ptr == .Invalid`), the code would:
1. Call `this.response_body_readable_stream_ref.deinit()` 
2. Fall through without returning

This missing `return` caused the code to fall through to subsequent
logic that could set up invalid callbacks on an already-used body value,
potentially causing undefined behavior.

The fix adds `this.doRenderBlob()` to render an empty response body for
the invalid stream case, then returns properly.

## Test plan
- [x] Added `test/regression/issue/26394.test.ts` with tests for
concurrent Bun.serve instances
- [x] Verified test passes with `bun bd test
test/regression/issue/26394.test.ts`
- [x] Verified test passes with system Bun (`USE_SYSTEM_BUN=1 bun test`)


🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-23 12:27:36 -08:00
robobun
043fafeefa fix(http): set #js_ref in Request.toJS for server-created requests (#26390)
## Summary

- Fix Request.text() failure with "TypeError: undefined is not a
function" after many requests on certain platforms
- Set `#js_ref` in `Request.toJS()` for server-created requests,
matching the existing pattern in `Response.toJS()`

## Root Cause

When Request objects are created by the server (via `Request.init()`),
the `#js_ref` field was never initialized. This caused
`checkBodyStreamRef()` to fail silently when called in `toJS()` because
`#js_ref.tryGet()` returned null.

The bug manifested on macOS after ~4,500 requests when GC conditions
were triggered, causing the weak reference lookup to fail and resulting
in "TypeError: undefined is not a function" when calling `req.text()`.

## Fix

The fix mirrors the existing pattern in `Response.toJS()`:
1. Create the JS value first via `js.toJSUnchecked()`
2. Set `#js_ref` with the JS wrapper reference
3. Then call `checkBodyStreamRef()` which can now properly access the JS
value

## Test plan

- [x] Added regression test that exercises Request.text() with 6000
requests and periodic GC
- [x] Existing request tests pass
- [x] HTTP server tests pass

Fixes #26387

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-23 12:09:03 -08:00
Alistair Smith
ce173b1112 Revert 0c3b5e501b 2026-01-23 11:02:26 -08:00
Alistair Smith
0c3b5e501b Merge branch 'main' of github.com:oven-sh/bun into ali/esm-bytecode 2026-01-23 10:48:40 -08:00
Alistair Smith
5dc72bc1d8 use esm module info in source provider 2026-01-23 10:48:34 -08:00
Alistair Smith
dfc36a8255 start laying groundwork 2026-01-23 10:43:22 -08:00
34 changed files with 760 additions and 228 deletions

View File

@@ -88,7 +88,7 @@ jobs:
commit-message: "deps: update c-ares to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update c-ares to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-cares-${{ github.run_number }}
+branch: deps/update-cares
body: |
## What does this PR do?

View File

@@ -91,7 +91,7 @@ jobs:
commit-message: "deps: update hdrhistogram to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update hdrhistogram to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-hdrhistogram-${{ github.run_number }}
+branch: deps/update-hdrhistogram
body: |
## What does this PR do?

View File

@@ -107,7 +107,7 @@ jobs:
commit-message: "deps: update highway to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update highway to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-highway-${{ github.run_number }}
+branch: deps/update-highway
body: |
## What does this PR do?

View File

@@ -88,7 +88,7 @@ jobs:
commit-message: "deps: update libarchive to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update libarchive to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-libarchive-${{ github.run_number }}
+branch: deps/update-libarchive
body: |
## What does this PR do?

View File

@@ -88,7 +88,7 @@ jobs:
commit-message: "deps: update libdeflate to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update libdeflate to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-libdeflate-${{ github.run_number }}
+branch: deps/update-libdeflate
body: |
## What does this PR do?

View File

@@ -100,7 +100,7 @@ jobs:
commit-message: "deps: update lolhtml to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update lolhtml to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-lolhtml-${{ github.run_number }}
+branch: deps/update-lolhtml
body: |
## What does this PR do?

View File

@@ -105,7 +105,7 @@ jobs:
commit-message: "deps: update lshpack to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update lshpack to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-lshpack-${{ github.run_number }}
+branch: deps/update-lshpack
body: |
## What does this PR do?

View File

@@ -74,7 +74,7 @@ jobs:
```
${{ env.changed_files }}
```
-branch: certs/update-root-certs-${{ github.run_number }}
+branch: certs/update-root-certs
base: main
delete-branch: true
labels:

View File

@@ -83,7 +83,7 @@ jobs:
commit-message: "deps: update sqlite to ${{ steps.check-version.outputs.latest }}"
title: "deps: update sqlite to ${{ steps.check-version.outputs.latest }}"
delete-branch: true
-branch: deps/update-sqlite-${{ steps.check-version.outputs.latest }}
+branch: deps/update-sqlite
body: |
## What does this PR do?

View File

@@ -68,7 +68,7 @@ jobs:
commit-message: "deps: update ${{ matrix.package }} to ${{ steps.check-version.outputs.latest }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update ${{ matrix.package }} to ${{ steps.check-version.outputs.latest }}"
delete-branch: true
-branch: deps/update-${{ matrix.package }}-${{ github.run_number }}
+branch: deps/update-${{ matrix.package }}
body: |
## What does this PR do?

View File

@@ -88,7 +88,7 @@ jobs:
commit-message: "deps: update zstd to ${{ steps.check-version.outputs.tag }} (${{ steps.check-version.outputs.latest }})"
title: "deps: update zstd to ${{ steps.check-version.outputs.tag }}"
delete-branch: true
-branch: deps/update-zstd-${{ github.run_number }}
+branch: deps/update-zstd
body: |
## What does this PR do?

View File

@@ -4,7 +4,7 @@ register_repository(
REPOSITORY
cloudflare/lol-html
COMMIT
-d64457d9ff0143deef025d5df7e8586092b9afb7
+e9e16dca48dd4a8ffbc77642bc4be60407585f11
)
set(LOLHTML_CWD ${VENDOR_PATH}/lolhtml/c-api)

View File

@@ -6,7 +6,8 @@ endif()
optionx(BUILDKITE_ORGANIZATION_SLUG STRING "The organization slug to use on Buildkite" DEFAULT "bun")
optionx(BUILDKITE_PIPELINE_SLUG STRING "The pipeline slug to use on Buildkite" DEFAULT "bun")
-optionx(BUILDKITE_BUILD_ID STRING "The build ID to use on Buildkite")
+optionx(BUILDKITE_BUILD_ID STRING "The build ID (UUID) to use on Buildkite")
+optionx(BUILDKITE_BUILD_NUMBER STRING "The build number to use on Buildkite")
optionx(BUILDKITE_GROUP_ID STRING "The group ID to use on Buildkite")
if(ENABLE_BASELINE)
@@ -32,7 +33,13 @@ if(NOT BUILDKITE_BUILD_ID)
return()
endif()
-setx(BUILDKITE_BUILD_URL https://buildkite.com/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}/builds/${BUILDKITE_BUILD_ID})
+# Use BUILDKITE_BUILD_NUMBER for the URL if available, as the UUID format causes a 302 redirect
+# that CMake's file(DOWNLOAD) doesn't follow, resulting in empty response.
+if(BUILDKITE_BUILD_NUMBER)
+setx(BUILDKITE_BUILD_URL https://buildkite.com/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}/builds/${BUILDKITE_BUILD_NUMBER})
+else()
+setx(BUILDKITE_BUILD_URL https://buildkite.com/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}/builds/${BUILDKITE_BUILD_ID})
+endif()
setx(BUILDKITE_BUILD_PATH ${BUILDKITE_BUILDS_PATH}/builds/${BUILDKITE_BUILD_ID})
file(

View File

@@ -1081,6 +1081,28 @@ pub const JSBundler = struct {
return globalThis.throwInvalidArguments("Expected a config object to be passed to Bun.build", .{});
}
const vm = globalThis.bunVM();
+// Detect and prevent calling Bun.build from within a macro during bundling.
+// This would cause a deadlock because:
+// 1. The bundler thread (singleton) is processing the outer Bun.build
+// 2. During parsing, it encounters a macro and evaluates it
+// 3. The macro calls Bun.build, which tries to enqueue to the same singleton thread
+// 4. The singleton thread is blocked waiting for the macro to complete -> deadlock
+if (vm.macro_mode) {
+return globalThis.throw(
+\\Bun.build cannot be called from within a macro during bundling.
+\\
+\\This would cause a deadlock because the bundler is waiting for the macro to complete,
+\\but the macro's Bun.build call is waiting for the bundler.
+\\
+\\To bundle code at compile time in a macro, use Bun.spawnSync to invoke the CLI:
+\\  const result = Bun.spawnSync(["bun", "build", entrypoint, "--format=esm"]);
+,
+.{},
+);
+}
var plugins: ?*Plugin = null;
const config = try Config.fromJS(globalThis, arguments[0], &plugins, bun.default_allocator);
@@ -1088,7 +1110,7 @@ pub const JSBundler = struct {
config,
plugins,
globalThis,
-globalThis.bunVM().eventLoop(),
+vm.eventLoop(),
bun.default_allocator,
);
}

View File

@@ -1489,7 +1489,7 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
const path = blob.store.?.data.s3.path();
const env = globalThis.bunVM().transpiler.env;
-S3.stat(credentials, path, @ptrCast(&onS3SizeResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, blob.store.?.data.s3.request_payer) catch {}; // TODO: properly propagate exception upwards
+S3.stat(credentials, path, @ptrCast(&onS3SizeResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, blob.store.?.data.s3.request_payer) catch {}; // TODO: properly propagate exception upwards
return;
}
this.renderMetadata();
@@ -1871,6 +1871,9 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
switch (stream.ptr) {
.Invalid => {
this.response_body_readable_stream_ref.deinit();
+// Stream is invalid, render empty body
+this.doRenderBlob();
+return;
},
// toBlobIfPossible will typically convert .Blob streams, or .File streams into a Blob object, but cannot always.
.Blob,

View File

@@ -960,7 +960,7 @@ fn writeFileWithEmptySourceToDestination(ctx: *jsc.JSGlobalObject, destination_b
const promise = jsc.JSPromise.Strong.init(ctx);
const promise_value = promise.value();
-const proxy = ctx.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = ctx.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
destination_store.ref();
try S3.upload(
@@ -1102,7 +1102,7 @@ pub fn writeFileWithSourceDestination(ctx: *jsc.JSGlobalObject, source_blob: *Bl
return jsc.JSPromise.dangerouslyCreateRejectedPromiseValueWithoutNotifyingVM(ctx, ctx.takeException(err));
};
defer aws_options.deinit();
-const proxy = ctx.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = ctx.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
switch (source_store.data) {
.bytes => |bytes| {
@@ -1390,7 +1390,7 @@ pub fn writeFileInternal(globalThis: *jsc.JSGlobalObject, path_or_blob_: *PathOr
destination_blob.detach();
return globalThis.throwInvalidArguments("ReadableStream has already been used", .{});
}
-const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
return S3.uploadStream(
@@ -1454,7 +1454,7 @@ pub fn writeFileInternal(globalThis: *jsc.JSGlobalObject, path_or_blob_: *PathOr
destination_blob.detach();
return globalThis.throwInvalidArguments("ReadableStream has already been used", .{});
}
-const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
return S3.uploadStream(
(if (options.extra_options != null) aws_options.credentials.dupe() else s3.getCredentials()),
@@ -2266,13 +2266,13 @@ const S3BlobDownloadTask = struct {
if (blob.offset > 0) {
const len: ?usize = if (blob.size != Blob.max_size) @intCast(blob.size) else null;
const offset: usize = @intCast(blob.offset);
-try S3.downloadSlice(credentials, path, offset, len, @ptrCast(&S3BlobDownloadTask.onS3DownloadResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, s3_store.request_payer);
+try S3.downloadSlice(credentials, path, offset, len, @ptrCast(&S3BlobDownloadTask.onS3DownloadResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, s3_store.request_payer);
} else if (blob.size == Blob.max_size) {
-try S3.download(credentials, path, @ptrCast(&S3BlobDownloadTask.onS3DownloadResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, s3_store.request_payer);
+try S3.download(credentials, path, @ptrCast(&S3BlobDownloadTask.onS3DownloadResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, s3_store.request_payer);
} else {
const len: usize = @intCast(blob.size);
const offset: usize = @intCast(blob.offset);
-try S3.downloadSlice(credentials, path, offset, len, @ptrCast(&S3BlobDownloadTask.onS3DownloadResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, s3_store.request_payer);
+try S3.downloadSlice(credentials, path, offset, len, @ptrCast(&S3BlobDownloadTask.onS3DownloadResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, s3_store.request_payer);
}
return promise;
}
@@ -2432,7 +2432,7 @@ pub fn pipeReadableStreamToBlob(this: *Blob, globalThis: *jsc.JSGlobalObject, re
defer aws_options.deinit();
const path = s3.path();
-const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
return S3.uploadStream(
@@ -2646,7 +2646,7 @@ pub fn getWriter(
if (this.isS3()) {
const s3 = &this.store.?.data.s3;
const path = s3.path();
-const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
if (arguments.len > 0) {
const options = arguments.ptr[0];

View File

@@ -332,7 +332,7 @@ pub fn fromBlobCopyRef(globalThis: *JSGlobalObject, blob: *const Blob, recommend
.s3 => |*s3| {
const credentials = s3.getCredentials();
const path = s3.path();
-const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy_url = if (proxy) |p| p.href else null;
return bun.S3.readableStream(credentials, path, blob.offset, if (blob.size != Blob.max_size) blob.size else null, proxy_url, s3.request_payer, globalThis);

View File

@@ -228,8 +228,11 @@ pub inline fn detachReadableStream(this: *Request, globalObject: *jsc.JSGlobalOb
pub fn toJS(this: *Request, globalObject: *JSGlobalObject) JSValue {
this.calculateEstimatedByteSize();
+const js_value = js.toJSUnchecked(globalObject, this);
+this.#js_ref = .initWeak(js_value);
this.checkBodyStreamRef(globalObject);
-return js.toJSUnchecked(globalObject, this);
+return js_value;
}
extern "C" fn Bun__JSRequest__createForBake(globalObject: *jsc.JSGlobalObject, requestPtr: *Request) callconv(jsc.conv) jsc.JSValue;

View File

@@ -421,7 +421,7 @@ pub const S3BlobStatTask = struct {
const path = s3_store.path();
const env = globalThis.bunVM().transpiler.env;
-try S3.stat(credentials, path, @ptrCast(&S3BlobStatTask.onS3ExistsResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, s3_store.request_payer);
+try S3.stat(credentials, path, @ptrCast(&S3BlobStatTask.onS3ExistsResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, s3_store.request_payer);
return promise;
}
pub fn stat(globalThis: *jsc.JSGlobalObject, blob: *Blob) bun.JSTerminated!JSValue {
@@ -437,7 +437,7 @@ pub const S3BlobStatTask = struct {
const path = s3_store.path();
const env = globalThis.bunVM().transpiler.env;
-try S3.stat(credentials, path, @ptrCast(&S3BlobStatTask.onS3StatResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, s3_store.request_payer);
+try S3.stat(credentials, path, @ptrCast(&S3BlobStatTask.onS3StatResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, s3_store.request_payer);
return promise;
}
pub fn size(globalThis: *jsc.JSGlobalObject, blob: *Blob) bun.JSTerminated!JSValue {
@@ -453,7 +453,7 @@ pub const S3BlobStatTask = struct {
const path = s3_store.path();
const env = globalThis.bunVM().transpiler.env;
-try S3.stat(credentials, path, @ptrCast(&S3BlobStatTask.onS3SizeResolved), this, if (env.getHttpProxy(true, null)) |proxy| proxy.href else null, s3_store.request_payer);
+try S3.stat(credentials, path, @ptrCast(&S3BlobStatTask.onS3SizeResolved), this, if (env.getHttpProxy(true, null, null)) |proxy| proxy.href else null, s3_store.request_payer);
return promise;
}

View File

@@ -356,7 +356,7 @@ pub const S3 = struct {
};
const promise = jsc.JSPromise.Strong.init(globalThis);
const value = promise.value();
-const proxy_url = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy_url = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy = if (proxy_url) |url| url.href else null;
var aws_options = try this.getCredentialsWithOptions(extra_options, globalThis);
defer aws_options.deinit();
@@ -414,7 +414,7 @@ pub const S3 = struct {
const promise = jsc.JSPromise.Strong.init(globalThis);
const value = promise.value();
-const proxy_url = globalThis.bunVM().transpiler.env.getHttpProxy(true, null);
+const proxy_url = globalThis.bunVM().transpiler.env.getHttpProxy(true, null, null);
const proxy = if (proxy_url) |url| url.href else null;
var aws_options = try this.getCredentialsWithOptions(extra_options, globalThis);
defer aws_options.deinit();

View File

@@ -143,15 +143,6 @@ pub const PackageManagerCommand = struct {
const is_direct_whoami = if (bun.argv.len > 1) strings.eqlComptime(bun.argv[1], "whoami") else false;
const cli = try PackageManager.CommandLineArguments.parse(ctx.allocator, .pm);
-// Handle "cache" subcommand before PackageManager.init since it doesn't require a package.json
-var cli_positionals = cli.positionals;
-const early_subcommand = getSubcommand(&cli_positionals);
-if (strings.eqlComptime(early_subcommand, "cache")) {
-execCacheSubcommand(ctx, cli.positionals);
-return;
-}
var pm, const cwd = PackageManager.init(ctx, cli, PackageManager.Subcommand.pm) catch |err| {
if (err == error.MissingPackageJSON) {
var cwd_buf: bun.PathBuffer = undefined;
@@ -257,6 +248,64 @@ pub const PackageManagerCommand = struct {
_ = try pm.lockfile.hasMetaHashChanged(true, pm.lockfile.packages.len);
Global.exit(0);
+} else if (strings.eqlComptime(subcommand, "cache")) {
+var dir: bun.PathBuffer = undefined;
+var fd = pm.getCacheDirectory();
+const outpath = bun.getFdPath(.fromStdDir(fd), &dir) catch |err| {
+Output.prettyErrorln("{s} getting cache directory", .{@errorName(err)});
+Global.crash();
+};
+if (pm.options.positionals.len > 1 and strings.eqlComptime(pm.options.positionals[1], "rm")) {
+fd.close();
+var had_err = false;
+std.fs.deleteTreeAbsolute(outpath) catch |err| {
+Output.err(err, "Could not delete {s}", .{outpath});
+had_err = true;
+};
+Output.prettyln("Cleared 'bun install' cache", .{});
+bunx: {
+const tmp = bun.fs.FileSystem.RealFS.platformTempDir();
+const tmp_dir = std.fs.openDirAbsolute(tmp, .{ .iterate = true }) catch |err| {
+Output.err(err, "Could not open {s}", .{tmp});
+had_err = true;
+break :bunx;
+};
+var iter = tmp_dir.iterate();
+// This is to match 'bunx_command.BunxCommand.exec's logic
+const prefix = try std.fmt.allocPrint(ctx.allocator, "bunx-{d}-", .{
+if (bun.Environment.isPosix) bun.c.getuid() else bun.windows.userUniqueId(),
+});
+var deleted: usize = 0;
+while (iter.next() catch |err| {
+Output.err(err, "Could not read {s}", .{tmp});
+had_err = true;
+break :bunx;
+}) |entry| {
+if (std.mem.startsWith(u8, entry.name, prefix)) {
+tmp_dir.deleteTree(entry.name) catch |err| {
+Output.err(err, "Could not delete {s}", .{entry.name});
+had_err = true;
+continue;
+};
+deleted += 1;
+}
+}
+Output.prettyln("Cleared {d} cached 'bunx' packages", .{deleted});
+}
+Global.exit(if (had_err) 1 else 0);
+}
+Output.writer().writeAll(outpath) catch {};
+Global.exit(0);
} else if (strings.eqlComptime(subcommand, "default-trusted")) {
try DefaultTrustedCommand.exec();
Global.exit(0);
@@ -408,129 +457,6 @@ pub const PackageManagerCommand = struct {
}
};
/// Get the cache directory path without requiring a PackageManager instance.
/// This uses the same logic as PackageManager.fetchCacheDirectoryPath but with direct env access.
fn getCachePathWithoutPackageManager() []const u8 {
// Check BUN_INSTALL_CACHE_DIR first (via system env since we don't have DotEnv loaded)
if (bun.getenvZ("BUN_INSTALL_CACHE_DIR")) |dir| {
return Fs.FileSystem.instance.abs(&[_]string{dir});
}
// Check BUN_INSTALL
if (bun.getenvZ("BUN_INSTALL")) |dir| {
var parts = [_]string{ dir, "install/", "cache/" };
return Fs.FileSystem.instance.abs(&parts);
}
// Check XDG_CACHE_HOME
if (bun.env_var.XDG_CACHE_HOME.get()) |dir| {
var parts = [_]string{ dir, ".bun/", "install/", "cache/" };
return Fs.FileSystem.instance.abs(&parts);
}
// Fall back to HOME
if (bun.env_var.HOME.get()) |dir| {
var parts = [_]string{ dir, ".bun/", "install/", "cache/" };
return Fs.FileSystem.instance.abs(&parts);
}
// Ultimate fallback to node_modules/.bun-cache
var fallback_parts = [_]string{"node_modules/.bun-cache"};
return Fs.FileSystem.instance.abs(&fallback_parts);
}
const ClearBunxCacheResult = struct {
deleted: usize,
had_err: bool,
};
/// Clear cached bunx packages from the temp directory.
/// Returns the number of deleted packages and whether any errors occurred.
fn clearBunxCache(allocator: std.mem.Allocator) ClearBunxCacheResult {
var result = ClearBunxCacheResult{ .deleted = 0, .had_err = false };
const tmp = bun.fs.FileSystem.RealFS.platformTempDir();
var tmp_dir = std.fs.openDirAbsolute(tmp, .{ .iterate = true }) catch |err| {
Output.err(err, "Could not open {s}", .{tmp});
result.had_err = true;
return result;
};
defer tmp_dir.close();
var iter = tmp_dir.iterate();
    // This must match the prefix logic in 'bunx_command.BunxCommand.exec'
const prefix = std.fmt.allocPrint(allocator, "bunx-{d}-", .{
if (bun.Environment.isPosix) bun.c.getuid() else bun.windows.userUniqueId(),
}) catch |err| {
Output.err(err, "Could not allocate prefix", .{});
result.had_err = true;
return result;
};
defer allocator.free(prefix);
while (iter.next() catch |err| {
Output.err(err, "Could not read {s}", .{tmp});
result.had_err = true;
return result;
}) |entry| {
if (std.mem.startsWith(u8, entry.name, prefix)) {
tmp_dir.deleteTree(entry.name) catch |err| {
Output.err(err, "Could not delete {s}", .{entry.name});
result.had_err = true;
continue;
};
result.deleted += 1;
}
}
return result;
}
/// Handle "bun pm cache" and "bun pm cache rm" without requiring a package.json.
/// This is a standalone function because the cache directory can be determined
/// independently from the project's package.json.
fn execCacheSubcommand(ctx: Command.Context, positionals: []const string) void {
// Get cache directory path without requiring a PackageManager instance.
const cache_path = getCachePathWithoutPackageManager();
// Check if this is "cache rm" (positionals would be ["pm", "cache", "rm"] or ["cache", "rm"])
// We need to find "rm" after "cache" in the positionals
const has_rm = for (positionals, 0..) |pos, i| {
if (strings.eqlComptime(pos, "cache")) {
if (i + 1 < positionals.len and strings.eqlComptime(positionals[i + 1], "rm")) {
break true;
}
}
} else false;
if (has_rm) {
var had_err = false;
std.fs.deleteTreeAbsolute(cache_path) catch |err| {
// FileNotFound is not an error - the cache may not exist yet
if (err != error.FileNotFound) {
Output.err(err, "Could not delete {s}", .{cache_path});
had_err = true;
}
};
Output.prettyln("Cleared 'bun install' cache", .{});
const bunx_result = clearBunxCache(ctx.allocator);
if (bunx_result.had_err) {
had_err = true;
}
Output.prettyln("Cleared {d} cached 'bunx' packages", .{bunx_result.deleted});
Global.exit(if (had_err) 1 else 0);
}
// Just print the cache path
Output.writer().writeAll(cache_path) catch {};
Global.exit(0);
}
fn printNodeModulesFolderStructure(
directory: *const NodeModulesFolder,
directory_package_id: ?PackageID,


@@ -156,14 +156,17 @@ pub const Loader = struct {
}
pub fn getHttpProxyFor(this: *Loader, url: URL) ?URL {
-        return this.getHttpProxy(url.isHTTP(), url.hostname);
+        return this.getHttpProxy(url.isHTTP(), url.hostname, url.host);
}
pub fn hasHTTPProxy(this: *const Loader) bool {
return this.has("http_proxy") or this.has("HTTP_PROXY") or this.has("https_proxy") or this.has("HTTPS_PROXY");
}
-    pub fn getHttpProxy(this: *Loader, is_http: bool, hostname: ?[]const u8) ?URL {
+    /// Get proxy URL for HTTP/HTTPS requests, respecting NO_PROXY.
+    /// `hostname` is the host without port (e.g., "localhost")
+    /// `host` is the host with port if present (e.g., "localhost:3000")
+    pub fn getHttpProxy(this: *Loader, is_http: bool, hostname: ?[]const u8, host: ?[]const u8) ?URL {
// TODO: When Web Worker support is added, make sure to intern these strings
var http_proxy: ?URL = null;
@@ -191,23 +194,54 @@ pub const Loader = struct {
var no_proxy_iter = std.mem.splitScalar(u8, no_proxy_text, ',');
while (no_proxy_iter.next()) |no_proxy_item| {
-            var host = strings.trim(no_proxy_item, &strings.whitespace_chars);
-            if (host.len == 0) {
+            var no_proxy_entry = strings.trim(no_proxy_item, &strings.whitespace_chars);
+            if (no_proxy_entry.len == 0) {
                 continue;
             }
-            if (strings.eql(host, "*")) {
+            if (strings.eql(no_proxy_entry, "*")) {
                 return null;
             }
             //strips .
-            if (strings.startsWithChar(host, '.')) {
-                host = host[1..];
-                if (host.len == 0) {
+            if (strings.startsWithChar(no_proxy_entry, '.')) {
+                no_proxy_entry = no_proxy_entry[1..];
+                if (no_proxy_entry.len == 0) {
                     continue;
                 }
             }
-            //hostname ends with suffix
-            if (strings.endsWith(hostname.?, host)) {
-                return null;
+            // Determine if the entry contains a port or is an IPv6 address.
+            // IPv6 addresses contain multiple colons (e.g., "::1", "2001:db8::1");
+            // bracketed IPv6 with port: "[::1]:8080";
+            // host with port: "localhost:8080" (single colon).
+            const colon_count = std.mem.count(u8, no_proxy_entry, ":");
+            const is_bracketed_ipv6 = strings.startsWithChar(no_proxy_entry, '[');
+            const has_port = blk: {
+                if (is_bracketed_ipv6) {
+                    // Bracketed IPv6: check for "]:port" pattern
+                    if (std.mem.indexOf(u8, no_proxy_entry, "]:")) |_| {
+                        break :blk true;
+                    }
+                    break :blk false;
+                } else if (colon_count == 1) {
+                    // A single colon means host:port (not IPv6)
+                    break :blk true;
+                }
+                // Multiple colons without brackets = bare IPv6 literal (no port)
+                break :blk false;
+            };
+            if (has_port) {
+                // Entry has a port; do an exact match against host:port
+                if (host) |h| {
+                    if (strings.eqlCaseInsensitiveASCII(h, no_proxy_entry, true)) {
+                        return null;
+                    }
+                }
+            } else {
+                // Entry is hostname/IPv6 only; suffix-match against the hostname
+                if (strings.endsWith(hostname.?, no_proxy_entry)) {
+                    return null;
+                }
             }
}
}
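The matching rules in the hunk above can be sketched in TypeScript; `noProxyBypasses` is a hypothetical helper mirroring the Zig logic (not Bun's actual API), returning `true` when a NO_PROXY entry says to skip the proxy.

```typescript
// Sketch of the NO_PROXY matching rules: port-bearing entries match host:port
// exactly; host-only entries suffix-match the hostname on any port.
function noProxyBypasses(noProxy: string, hostname: string, host: string): boolean {
  for (let entry of noProxy.split(",")) {
    entry = entry.trim();
    if (entry.length === 0) continue;
    if (entry === "*") return true; // wildcard disables the proxy entirely
    if (entry.startsWith(".")) {
      entry = entry.slice(1);
      if (entry.length === 0) continue;
    }
    const colonCount = (entry.match(/:/g) ?? []).length;
    const bracketedIpv6 = entry.startsWith("[");
    // "[::1]:8080" or "localhost:8080" carry a port; bare "::1" does not.
    const hasPort = bracketedIpv6 ? entry.includes("]:") : colonCount === 1;
    if (hasPort) {
      // Exact, case-insensitive match against host:port
      if (host.toLowerCase() === entry.toLowerCase()) return true;
    } else if (hostname.endsWith(entry)) {
      // Host-only entry: suffix match against the hostname, any port
      return true;
    }
  }
  return false;
}

console.log(noProxyBypasses("localhost:3000", "localhost", "localhost:3000")); // → true
console.log(noProxyBypasses("localhost:4000", "localhost", "localhost:3000")); // → false
console.log(noProxyBypasses("localhost", "localhost", "localhost:3000")); // → true
```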


@@ -119,11 +119,13 @@ pub const unlink_params: []const ParamType = &(shared_params ++ [_]ParamType{
const patch_params: []const ParamType = &(shared_params ++ [_]ParamType{
clap.parseParam("<POS> ... \"name\" of the package to patch") catch unreachable,
clap.parseParam("--commit Install a package containing modifications in `dir`") catch unreachable,
clap.parseParam("--preview Print the diff without saving it (implies --commit)") catch unreachable,
clap.parseParam("--patches-dir <dir> The directory to put the patch file in (only if --commit is used)") catch unreachable,
});
const patch_commit_params: []const ParamType = &(shared_params ++ [_]ParamType{
clap.parseParam("<POS> ... \"dir\" containing changes to a package") catch unreachable,
clap.parseParam("--preview Print the diff without saving it") catch unreachable,
clap.parseParam("--patches-dir <dir> The directory to put the patch file") catch unreachable,
});
@@ -281,6 +283,7 @@ const PatchOpts = union(enum) {
patch: struct {},
commit: struct {
patches_dir: []const u8 = "patches",
preview: bool = false,
},
};
@@ -380,6 +383,9 @@ pub fn printHelp(subcommand: Subcommand) void {
\\ <d>Generate a patch file for changes made to jquery<r>
\\ <b><green>bun patch --commit 'node_modules/jquery'<r>
\\
\\ <d>Preview a patch without saving it<r>
\\ <b><green>bun patch --preview 'node_modules/jquery'<r>
\\
\\ <d>Generate a patch file in a custom directory for changes made to jquery<r>
\\ <b><green>bun patch --patches-dir 'my-patches' 'node_modules/jquery'<r>
\\
@@ -408,6 +414,9 @@ pub fn printHelp(subcommand: Subcommand) void {
\\ <d>Generate a patch in the default "./patches" directory for changes in "./node_modules/jquery"<r>
\\ <b><green>bun patch-commit 'node_modules/jquery'<r>
\\
\\ <d>Preview a patch without saving it<r>
\\ <b><green>bun patch-commit --preview 'node_modules/jquery'<r>
\\
\\ <d>Generate a patch in a custom directory ("./my-patches")<r>
\\ <b><green>bun patch-commit --patches-dir 'my-patches' 'node_modules/jquery'<r>
\\
@@ -934,10 +943,12 @@ pub fn parse(allocator: std.mem.Allocator, comptime subcommand: Subcommand) !Com
if (subcommand == .patch) {
const patch_commit = args.flag("--commit");
-        if (patch_commit) {
+        const patch_preview = args.flag("--preview");
+        if (patch_commit or patch_preview) {
cli.patch = .{
.commit = .{
.patches_dir = args.option("--patches-dir") orelse "patches",
.preview = patch_preview,
},
};
} else {
@@ -950,6 +961,7 @@ pub fn parse(allocator: std.mem.Allocator, comptime subcommand: Subcommand) !Com
cli.patch = .{
.commit = .{
.patches_dir = args.option("--patches-dir") orelse "patches",
.preview = args.flag("--preview"),
},
};
}


@@ -33,6 +33,7 @@ patch_features: union(enum) {
patch: struct {},
commit: struct {
patches_dir: string,
preview: bool = false,
},
} = .{ .nothing = .{} },
@@ -648,10 +649,11 @@ pub fn load(
.patch => {
this.patch_features = .{ .patch = .{} };
},
-        .commit => {
+        .commit => |commit_opts| {
             this.patch_features = .{
                 .commit = .{
-                    .patches_dir = cli.patch.commit.patches_dir,
+                    .patches_dir = commit_opts.patches_dir,
+                    .preview = commit_opts.preview,
},
};
},


@@ -391,6 +391,16 @@ pub fn doPatchCommit(
};
defer patchfile_contents.deinit();
// In preview mode, print the diff to stdout and return without modifying any files
if (manager.options.patch_features.commit.preview) {
Output.writer().writeAll(patchfile_contents.items) catch |e| {
Output.err(e, "failed to write patch to stdout", .{});
Global.crash();
};
Output.flush();
return null;
}
// write the patch contents to temp file then rename
var tmpname_buf: [1024]u8 = undefined;
const tempfile_name = try bun.fs.FileSystem.tmpname("tmp", &tmpname_buf, bun.fastRandom());


@@ -1534,7 +1534,7 @@ var _Interface = class Interface extends InterfaceConstructor {
prefix +
StringPrototypeSlice.$call(this.line, this.cursor, this.line.length);
this.cursor = this.cursor - completeOn.length + prefix.length;
-      this._refreshLine();
+      this[kRefreshLine]();
return;
}


@@ -964,6 +964,7 @@ fn NewPrinter(
}
if (import.default_name) |default| {
p.printSemicolonIfNeeded();
p.print("var ");
p.printSymbol(default.ref.?);
if (comptime Statement == void) {
@@ -984,6 +985,7 @@ fn NewPrinter(
}
if (import.items.len > 0) {
p.printSemicolonIfNeeded();
p.printWhitespacer(ws("var {"));
if (!import.is_single_line) {


@@ -1109,15 +1109,15 @@ describe("bundler", () => {
"/entry.js": /* js */ `
// Test all equality operators with typeof undefined
console.log(typeof x !== 'undefined');
console.log(typeof x != 'undefined');
console.log(typeof x != 'undefined');
console.log('undefined' !== typeof x);
console.log('undefined' != typeof x);
console.log(typeof x === 'undefined');
console.log(typeof x == 'undefined');
console.log('undefined' === typeof x);
console.log('undefined' == typeof x);
// These should not be optimized
console.log(typeof x === 'string');
console.log(x === 'undefined');
@@ -1135,4 +1135,61 @@ describe("bundler", () => {
);
},
});
// https://github.com/oven-sh/bun/issues/26371
// Minified bundler output missing semicolon between statements when
// using both default and named imports from "bun" module
itBundled("minify/BunImportSemicolonInsertion", {
files: {
"/entry.js": /* js */ `
import bun, { embeddedFiles } from "bun"
console.log(typeof embeddedFiles)
console.log(typeof bun.argv)
`,
},
minifySyntax: true,
minifyWhitespace: true,
minifyIdentifiers: true,
target: "bun",
run: {
stdout: "object\nobject",
},
});
itBundled("minify/BunImportNamespaceAndNamed", {
files: {
"/entry.js": /* js */ `
import * as bun from "bun"
import { embeddedFiles } from "bun"
console.log(typeof embeddedFiles)
console.log(typeof bun.argv)
`,
},
minifySyntax: true,
minifyWhitespace: true,
minifyIdentifiers: true,
target: "bun",
run: {
stdout: "object\nobject",
},
});
itBundled("minify/BunImportDefaultNamespaceAndNamed", {
files: {
"/entry.js": /* js */ `
import bun, * as bunNs from "bun"
import { embeddedFiles } from "bun"
console.log(typeof embeddedFiles)
console.log(typeof bun.argv)
console.log(typeof bunNs.argv)
`,
},
minifySyntax: true,
minifyWhitespace: true,
minifyIdentifiers: true,
target: "bun",
run: {
stdout: "object\nobject\nobject",
},
});
});


@@ -816,4 +816,146 @@ module.exports = function isOdd() {
expect(stdout.toString()).toBe("true\n");
}
});
describe("--preview flag", async () => {
test("should print diff without saving patch file", async () => {
const tempdir = tempDirWithFiles("preview", {
"package.json": JSON.stringify({
name: "bun-patch-preview-test",
module: "index.ts",
type: "module",
dependencies: {
"is-even": "1.0.0",
},
}),
"index.ts": /* ts */ `import isEven from 'is-even'; console.log(isEven(420))`,
});
// Install dependencies
expectNoError(await $`${bunExe()} i`.env(bunEnv).cwd(tempdir));
// Prepare the package for patching
expectNoError(await $`${bunExe()} patch is-even@1.0.0`.env(bunEnv).cwd(tempdir));
// Make a change to the package
const patchedCode = /* ts */ `/*!
* is-even <https://github.com/jonschlinkert/is-even>
*
* Copyright (c) 2015, 2017, Jon Schlinkert.
* Released under the MIT License.
*/
'use strict';
var isOdd = require('is-odd');
module.exports = function isEven(i) {
console.log("Preview test!")
return !isOdd(i);
};
`;
await $`echo ${patchedCode} > node_modules/is-even/index.js`.env(bunEnv).cwd(tempdir);
// Run bun patch --preview
const { stdout, stderr } = await $`${bunExe()} patch --preview node_modules/is-even`.env(bunEnv).cwd(tempdir);
expect(stderr.toString()).not.toContain("error");
// The stdout should contain the diff
expect(stdout.toString()).toContain("diff --git");
expect(stdout.toString()).toContain("Preview test!");
expect(stdout.toString()).toContain("index.js");
// The patches directory should NOT exist (no file saved)
const { exitCode } = await $`test -d patches`.env(bunEnv).cwd(tempdir).throws(false);
expect(exitCode).not.toBe(0);
// package.json should NOT have patchedDependencies
const packageJson = await $`cat package.json`.cwd(tempdir).env(bunEnv).json();
expect(packageJson.patchedDependencies).toBeUndefined();
});
test("should work with bun patch-commit --preview", async () => {
const tempdir = tempDirWithFiles("preview2", {
"package.json": JSON.stringify({
name: "bun-patch-preview-test-2",
module: "index.ts",
type: "module",
dependencies: {
"is-even": "1.0.0",
},
}),
"index.ts": /* ts */ `import isEven from 'is-even'; console.log(isEven(420))`,
});
// Install dependencies
expectNoError(await $`${bunExe()} i`.env(bunEnv).cwd(tempdir));
// Prepare the package for patching
expectNoError(await $`${bunExe()} patch is-even@1.0.0`.env(bunEnv).cwd(tempdir));
// Make a change to the package
const patchedCode = /* ts */ `/*!
* is-even <https://github.com/jonschlinkert/is-even>
*
* Copyright (c) 2015, 2017, Jon Schlinkert.
* Released under the MIT License.
*/
'use strict';
var isOdd = require('is-odd');
module.exports = function isEven(i) {
console.log("patch-commit preview!")
return !isOdd(i);
};
`;
await $`echo ${patchedCode} > node_modules/is-even/index.js`.env(bunEnv).cwd(tempdir);
// Run bun patch-commit --preview
const { stdout, stderr } = await $`${bunExe()} patch-commit --preview node_modules/is-even`
.env(bunEnv)
.cwd(tempdir);
expect(stderr.toString()).not.toContain("error");
// The stdout should contain the diff
expect(stdout.toString()).toContain("diff --git");
expect(stdout.toString()).toContain("patch-commit preview!");
// The patches directory should NOT exist (no file saved)
const { exitCode } = await $`test -d patches`.env(bunEnv).cwd(tempdir).throws(false);
expect(exitCode).not.toBe(0);
// package.json should NOT have patchedDependencies
const packageJson = await $`cat package.json`.cwd(tempdir).env(bunEnv).json();
expect(packageJson.patchedDependencies).toBeUndefined();
});
test("--preview should show no changes when package is unmodified", async () => {
const tempdir = tempDirWithFiles("preview3", {
"package.json": JSON.stringify({
name: "bun-patch-preview-test-3",
module: "index.ts",
type: "module",
dependencies: {
"is-even": "1.0.0",
},
}),
"index.ts": /* ts */ `import isEven from 'is-even'; console.log(isEven(420))`,
});
// Install dependencies
expectNoError(await $`${bunExe()} i`.env(bunEnv).cwd(tempdir));
// Prepare the package for patching (but don't make any changes)
expectNoError(await $`${bunExe()} patch is-even@1.0.0`.env(bunEnv).cwd(tempdir));
// Run bun patch --preview without making changes
const { stdout, stderr } = await $`${bunExe()} patch --preview node_modules/is-even`.env(bunEnv).cwd(tempdir);
expect(stderr.toString()).not.toContain("error");
// Should indicate no changes
expect(stdout.toString()).toContain("No changes detected");
});
});
});


@@ -15,10 +15,17 @@ beforeAll(() => {
// simple http proxy
if (request.url.startsWith("http://")) {
-        return await fetch(request.url, {
+        const response = await fetch(request.url, {
           method: request.method,
           body: await request.text(),
         });
+        // Add a marker header to indicate the request went through the proxy
+        const headers = new Headers(response.headers);
+        headers.set("x-proxy-used", "1");
+        return new Response(response.body, {
+          status: response.status,
+          headers,
+        });
}
// no TLS support here
@@ -257,4 +264,129 @@ describe.concurrent(() => {
}
expect(exitCode).toBe(0);
});
// Test that NO_PROXY respects port numbers like Node.js and curl do
describe("NO_PROXY port handling", () => {
it("should bypass proxy when NO_PROXY matches host:port exactly", async () => {
// NO_PROXY includes the exact host:port, should bypass proxy
const {
exited,
stdout,
stderr: stderrStream,
} = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const resp = await fetch("http://localhost:${server.port}/test"); console.log(resp.headers.get("x-proxy-used") || "no-proxy");`,
],
env: {
...bunEnv,
http_proxy: `http://localhost:${proxy.port}`,
NO_PROXY: `localhost:${server.port}`,
},
stdout: "pipe",
stderr: "pipe",
});
const [exitCode, out, stderr] = await Promise.all([exited, stdout.text(), stderrStream.text()]);
if (exitCode !== 0) {
console.error("stderr:", stderr);
}
// Should connect directly, not through proxy (no x-proxy-used header)
expect(out.trim()).toBe("no-proxy");
expect(exitCode).toBe(0);
});
it("should use proxy when NO_PROXY has different port", async () => {
const differentPort = server.port + 1000;
// NO_PROXY includes a different port, should NOT bypass proxy
const {
exited,
stdout,
stderr: stderrStream,
} = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const resp = await fetch("http://localhost:${server.port}/test"); console.log(resp.headers.get("x-proxy-used") || "no-proxy");`,
],
env: {
...bunEnv,
http_proxy: `http://localhost:${proxy.port}`,
NO_PROXY: `localhost:${differentPort}`,
},
stdout: "pipe",
stderr: "pipe",
});
const [exitCode, out, stderr] = await Promise.all([exited, stdout.text(), stderrStream.text()]);
if (exitCode !== 0) {
console.error("stderr:", stderr);
}
// The proxy adds x-proxy-used header, verify it was used
expect(out.trim()).toBe("1");
expect(exitCode).toBe(0);
});
it("should bypass proxy when NO_PROXY has host only (no port)", async () => {
// NO_PROXY includes just the host (no port), should bypass proxy for all ports
const {
exited,
stdout,
stderr: stderrStream,
} = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const resp = await fetch("http://localhost:${server.port}/test"); console.log(resp.headers.get("x-proxy-used") || "no-proxy");`,
],
env: {
...bunEnv,
http_proxy: `http://localhost:${proxy.port}`,
NO_PROXY: `localhost`,
},
stdout: "pipe",
stderr: "pipe",
});
const [exitCode, out, stderr] = await Promise.all([exited, stdout.text(), stderrStream.text()]);
if (exitCode !== 0) {
console.error("stderr:", stderr);
}
// Should connect directly, not through proxy (no x-proxy-used header)
expect(out.trim()).toBe("no-proxy");
expect(exitCode).toBe(0);
});
it("should handle NO_PROXY with multiple entries including port", async () => {
const differentPort = server.port + 1000;
// NO_PROXY includes multiple entries, one of which matches exactly
const {
exited,
stdout,
stderr: stderrStream,
} = Bun.spawn({
cmd: [
bunExe(),
"-e",
`const resp = await fetch("http://localhost:${server.port}/test"); console.log(resp.headers.get("x-proxy-used") || "no-proxy");`,
],
env: {
...bunEnv,
http_proxy: `http://localhost:${proxy.port}`,
NO_PROXY: `example.com, localhost:${differentPort}, localhost:${server.port}`,
},
stdout: "pipe",
stderr: "pipe",
});
const [exitCode, out, stderr] = await Promise.all([exited, stdout.text(), stderrStream.text()]);
if (exitCode !== 0) {
console.error("stderr:", stderr);
}
// Should connect directly, not through proxy (no x-proxy-used header)
expect(out.trim()).toBe("no-proxy");
expect(exitCode).toBe(0);
});
});
});


@@ -0,0 +1,139 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
// https://github.com/oven-sh/bun/issues/26360
// Bug: Bun.build API hangs indefinitely when called from within a macro that is
// evaluated during another Bun.build call. The CLI `bun build` works correctly.
//
// Root cause: The bundler uses a singleton thread for processing Bun.build calls.
// When a macro is evaluated during bundling and that macro calls Bun.build:
// 1. The singleton bundler thread is processing the outer Bun.build
// 2. The macro runs on the bundler thread and calls Bun.build
// 3. The inner Bun.build tries to enqueue to the same singleton thread
// 4. The singleton thread is blocked waiting for the macro to complete -> deadlock
//
// Fix: Detect when Bun.build is called from within macro mode and throw a clear error.
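The deadlock described above reduces to re-entrancy on a singleton worker. A minimal TypeScript sketch of the guard the fix adds (names are assumed for illustration, not Bun internals):

```typescript
// Hypothetical sketch: a singleton "bundler" that rejects nested calls made
// from work it is currently running, instead of deadlocking on itself.
let bundlerBusy = false;

async function buildOnSingleton<T>(work: () => Promise<T>): Promise<T> {
  if (bundlerBusy) {
    // Without this guard the nested call would queue behind the very work
    // that is waiting on it -> deadlock.
    throw new Error("Bun.build cannot be called from within a macro");
  }
  bundlerBusy = true;
  try {
    return await work();
  } finally {
    bundlerBusy = false;
  }
}

// A "macro" running inside the outer build catches the nested-call error.
const msg = await buildOnSingleton(async () => {
  try {
    await buildOnSingleton(async () => "inner");
    return "no error";
  } catch (e) {
    return "CAUGHT: " + (e as Error).message;
  }
});
console.log(msg); // → CAUGHT: Bun.build cannot be called from within a macro
```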
test("Bun.build from macro during bundling throws instead of hanging", async () => {
using dir = tempDir("issue-26360", {
// A simple file that will be bundled by the macro
"browser.ts": `console.log("browser code");
export default "";
`,
// A macro that calls Bun.build and catches the error
// The error should indicate that Bun.build cannot be called from macro context
"macro.ts": `import browserCode from "./browser" with { type: "file" };
let errorMessage = "no error";
try {
const built = await Bun.build({
entrypoints: [browserCode],
format: "esm",
});
} catch (e) {
errorMessage = "CAUGHT: " + e.message;
}
export const getErrorMessage = (): string => errorMessage;
`,
// File that imports from the macro
"index.ts": `import { getErrorMessage } from "./macro" with { type: "macro" };
console.log("ERROR_MSG:", getErrorMessage());
`,
// Build script that uses Bun.build API (this would hang before the fix)
"build_script.ts": `const result = await Bun.build({
entrypoints: ["./index.ts"],
});
if (!result.success) {
console.log("BUILD_ERROR");
for (const log of result.logs) {
console.log(log.message);
}
} else {
console.log("BUILD_SUCCESS");
// Print the output to verify the macro caught the error
const text = await result.outputs[0].text();
console.log(text);
}
`,
});
// Run the build script - should complete (not hang) and the macro should have caught the error
await using proc = Bun.spawn({
cmd: [bunExe(), "build_script.ts"],
cwd: String(dir),
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// The build should succeed (the macro catches the error)
expect(stdout).toContain("BUILD_SUCCESS");
// The macro should have received the error message about Bun.build not being allowed
expect(stdout).toContain("Bun.build cannot be called from within a macro");
});
test("CLI bun build with macro that calls Bun.build also throws", async () => {
using dir = tempDir("issue-26360-cli", {
"browser.ts": `console.log("browser code");
export default "";
`,
// A macro that calls Bun.build and catches the error
"macro.ts": `import browserCode from "./browser" with { type: "file" };
let errorMessage = "";
try {
const built = await Bun.build({
entrypoints: [browserCode],
format: "esm",
});
} catch (e) {
errorMessage = e.message;
}
export const getErrorMessage = (): string => errorMessage;
`,
"index.ts": `import { getErrorMessage } from "./macro" with { type: "macro" };
console.log("ERROR_MSG:", getErrorMessage());
`,
});
// Run via CLI
await using proc = Bun.spawn({
cmd: [bunExe(), "build", "index.ts", "--target=node"],
cwd: String(dir),
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// The CLI build should also work and show the error message was caught
expect(stdout).toContain("Bun.build cannot be called from within a macro");
});
test("regular Bun.build (not in macro) still works", async () => {
using dir = tempDir("issue-26360-normal", {
"entry.ts": `
console.log("hello world");
export default "";
`,
});
const result = await Bun.build({
entrypoints: [`${dir}/entry.ts`],
format: "esm",
});
expect(result.success).toBe(true);
expect(result.outputs.length).toBeGreaterThan(0);
const text = await result.outputs[0].text();
expect(text).toContain("hello world");
});


@@ -0,0 +1,46 @@
import { expect, test } from "bun:test";
// https://github.com/oven-sh/bun/issues/26387
// Request.text() fails with "TypeError: undefined is not a function" after ~4500 requests
test("Request.text() should work after many requests", async () => {
// Create a server that reads the request body using req.text()
using server = Bun.serve({
port: 0,
async fetch(req) {
try {
const body = await req.text();
return new Response("ok:" + body.length);
} catch (e) {
return new Response(`error: ${e}`, { status: 500 });
}
},
});
const url = `http://localhost:${server.port}`;
// Send many requests to trigger the GC conditions that caused the bug
// The original bug occurred around 4500 requests, but we use a higher number
// to ensure we trigger any GC-related issues
const requestCount = 6000;
for (let i = 0; i < requestCount; i++) {
const body = Buffer.alloc(100, "x").toString() + `-request-${i}`;
const response = await fetch(url, {
method: "POST",
body: body,
});
if (!response.ok) {
const text = await response.text();
throw new Error(`Request ${i} failed: ${text}`);
}
const responseText = await response.text();
expect(responseText).toBe(`ok:${body.length}`);
// Periodically run GC to increase likelihood of triggering the bug
if (i % 500 === 0) {
Bun.gc(true);
}
}
}, 60000);


@@ -0,0 +1,40 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe } from "harness";
// https://github.com/oven-sh/bun/issues/26411
// Tab completion with node:readline/promises threw
// "TypeError: this._refreshLine is not a function"
test("tab completion works with node:readline/promises", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`
import readline from "node:readline/promises";
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
terminal: true,
completer: (line) => [["FOO", "FOOBAR"], line]
});
rl.line = "foo";
rl.cursor = 3;
setTimeout(() => {
rl.close();
console.log("OK");
process.exit(0);
}, 100);
rl.write("", { name: "tab" });
`,
],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stderr).not.toContain("this._refreshLine is not a function");
expect(stdout).toContain("OK");
expect(exitCode).toBe(0);
});


@@ -1,45 +0,0 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
// Test that "bun pm cache rm" works without a package.json
// https://github.com/oven-sh/bun/issues/26427
test("bun pm cache rm works without package.json", async () => {
// Use a temp directory without a package.json
using dir = tempDir("bun-test-26427", {});
await using proc = Bun.spawn({
cmd: [bunExe(), "pm", "cache", "rm"],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const [stdout, exitCode] = await Promise.all([proc.stdout.text(), proc.exited]);
// Should succeed and clear the cache without requiring -g flag
expect(stdout).toContain("Cleared");
expect(exitCode).toBe(0);
});
// Test that "bun pm cache" (print path) works without a package.json
test("bun pm cache works without package.json", async () => {
// Use a temp directory without a package.json
using dir = tempDir("bun-test-26427-cache", {});
await using proc = Bun.spawn({
cmd: [bunExe(), "pm", "cache"],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const [stdout, exitCode] = await Promise.all([proc.stdout.text(), proc.exited]);
// Should succeed and print an absolute path to the cache directory
const trimmedOutput = stdout.trim();
// Check that it's an absolute path (starts with / on Unix or drive letter on Windows)
expect(trimmedOutput.startsWith("/") || /^[A-Za-z]:/.test(trimmedOutput)).toBe(true);
expect(exitCode).toBe(0);
});