Commit Graph

14025 Commits

Author SHA1 Message Date
Claude Bot
ca44414247 Refactor MIME type compression check to use Category system
Replaces string prefix checks with Bun's MimeType.Category infrastructure.

Benefits:
- Cleaner code (category-based switch vs 20+ string checks)
- Better performance (comptime category vs runtime string comparisons)
- More maintainable (one place to update compression logic)
- More comprehensive (automatically handles 2310+ MIME types via categories)
- Handles +xml and +json structured suffixes (e.g., application/vnd.api+json)
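
The category check can be sketched roughly like this (the helper and category list below are illustrative, not Bun's actual `MimeType.Category` infrastructure):

```javascript
// Decide compressibility from broad categories plus structured suffixes,
// instead of a long chain of string prefix checks.
function isCompressible(mimeType) {
  const type = mimeType.split(";")[0].trim().toLowerCase();
  if (type.startsWith("text/")) return true;
  // Structured-suffix types like application/vnd.api+json are text-based:
  if (type.endsWith("+json") || type.endsWith("+xml")) return true;
  return ["application/json", "application/xml", "application/javascript"].includes(type);
}

console.log(isCompressible("application/vnd.api+json")); // true — structured suffix
console.log(isCompressible("text/html; charset=utf-8")); // true
console.log(isCompressible("image/png"));                // false
```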

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 15:32:22 +00:00
Claude Bot
ad3359dd8c Add comment explaining Host header localhost detection
The Host header includes the port (e.g., "localhost:3000") so we can't
use the isLocalhost() helper which expects just IP addresses.

The helper is still used for the socket address check where appropriate.
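
The distinction can be sketched like this (function bodies are illustrative; only the name `isLocalhost` comes from the commit):

```javascript
// The real isLocalhost() expects a bare IP address, so the Host header
// ("localhost:3000") must be handled separately by stripping the port.
function isLocalhost(ip) {
  return ip === "127.0.0.1" || ip === "::1";
}

function hostHeaderIsLocal(host) {
  const name = host.replace(/:\d+$/, ""); // strip port; real code must also handle "[::1]:3000"
  return name === "localhost" || isLocalhost(name);
}

console.log(hostHeaderIsLocal("localhost:3000")); // true
console.log(isLocalhost("localhost:3000"));       // false — the port breaks the IP check
```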

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 15:16:52 +00:00
Claude Bot
5cd677299d Add memoryCost method to CompressedVariant for consistency
Make CompressedVariant consistent with AnyBlob and Headers by giving it
its own memoryCost() method instead of accessing .data.len directly.

This makes the code more maintainable and consistent with how other
types in StaticRoute report their memory usage.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 15:04:03 +00:00
Claude Bot
8e397874f1 Simplify HTTP compression implementation
Major simplifications:
- Remove complex cache config (TTL, max sizes, min/max entry sizes)
- Store compressed variants directly on StaticRoute like the body
- Mark compression as "failed" if it doesn't save space to avoid retrying
- Remove all cache enforcement logic
- Compression is opt-in (disabled by default)
- Consolidate tests into single comprehensive test file

The cache is now just simple storage - compress once per encoding,
store it if it saves space, serve it when requested. No LRU, no TTL,
no complex size limits. Much simpler and easier to understand.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 14:53:16 +00:00
Claude Bot
3c96d08588 Update documentation to reflect complete implementation
All features now fully implemented and documented:
- Static and dynamic route compression
- Full cache enforcement (TTL, size limits)
- Memory safety defaults (50MB, 24hr)
- --smol mode support (5MB, 1hr)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 13:15:59 +00:00
Claude Bot
9129ff7c33 Implement cache enforcement for HTTP compression
Adds active enforcement of cache configuration limits:
- TTL checking: Expired variants are automatically recreated
- Max cache size: Per-route limit prevents unbounded growth
- Min/max entry size: Filter variants by compressed size
- Zero TTL means infinite (no expiration)

Implementation details:
- Added created_at_ms timestamp to CompressedVariant
- Check expiration before serving cached variant
- Check size constraints before and after compression
- All cache checks respect cache: false config
- 6 new tests covering all enforcement features

All tests passing (19 total):
- 13 original compression tests
- 6 new cache enforcement tests
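
The TTL rule described above (zero means infinite, otherwise compare against `created_at_ms`) can be sketched as:

```javascript
// Illustrative expiration check, not Bun's actual implementation.
function isExpired(variant, ttlMs, nowMs = Date.now()) {
  if (ttlMs === 0) return false;            // zero TTL = never expires
  return nowMs - variant.created_at_ms > ttlMs;
}

const variant = { created_at_ms: Date.now() - 5_000 }; // created 5s ago
console.log(isExpired(variant, 0));       // false — infinite TTL
console.log(isExpired(variant, 1_000));   // true  — older than 1s TTL
console.log(isExpired(variant, 60_000));  // false — within 60s TTL
```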

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 13:03:02 +00:00
Claude Bot
0579e7046c Add HTTP compression support for Bun.serve (static & dynamic routes)
Implements automatic HTTP response compression with:
- Brotli, Gzip, Zstd, Deflate support with configurable levels
- Static routes: lazy compression with caching (compress once, serve many)
- Dynamic routes: on-demand compression per request
- Per-algorithm control (enable/disable individual encodings)
- Smart defaults: OPT-IN (disabled by default), localhost detection
- RFC 9110 compliant Accept-Encoding negotiation with quality values
- ETag preservation (same ETag for all compressed variants)
- Vary: Accept-Encoding header for proper caching
- MIME type filtering (skip images, videos, archives)
- Already-encoded detection (skip if Content-Encoding exists)
- Configurable threshold, cache size, TTL, --smol mode support
- node:http compatibility (compression disabled)

Fixes:
- Localhost detection uses Host header check (getRemoteSocketInfo unreliable for loopback)
- Encoding selection respects client preferences (no fallback when client specifies encodings)
- All tests passing (13 tests covering static/dynamic routes, edge cases)
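
A simplified sketch of the q-value negotiation (not Bun's actual parser), including the "no fallback when the client specifies encodings" behavior:

```javascript
// Parse an Accept-Encoding header per the RFC 9110 q-value shape and
// pick the highest-quality encoding we support; return null otherwise.
function pickEncoding(acceptEncoding, supported) {
  const prefs = acceptEncoding.split(",").map((part) => {
    const [name, ...params] = part.trim().split(";");
    const q = params
      .map((p) => p.trim())
      .filter((p) => p.startsWith("q="))
      .map((p) => parseFloat(p.slice(2)))[0];
    return { name: name.trim().toLowerCase(), q: q === undefined ? 1 : q };
  });
  const viable = prefs
    .filter((p) => p.q > 0 && supported.includes(p.name))
    .sort((a, b) => b.q - a.q);
  return viable.length ? viable[0].name : null;
}

console.log(pickEncoding("br;q=0.9, gzip", ["gzip", "br"])); // "gzip" — higher q wins
console.log(pickEncoding("identity", ["gzip", "br"]));       // null — no silent fallback
```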

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Claude Bot
4187cef4b9 Add compression cache configuration with --smol mode support
Implemented cache control API:
- cache: false - Disables caching entirely (compress on-demand, not cached)
- cache: { maxSize, ttl, minEntrySize, maxEntrySize } - Configure limits
- --smol mode automatically uses conservative defaults

Cache Configuration:
- DEFAULT: 50MB max, 24h TTL, 128B-10MB per entry
- SMOL: 5MB max, 1h TTL, 512B-1MB per entry (for --smol flag)
- cache: false - Skip caching, return false from tryServeCompressed()

API Example:
```js
Bun.serve({
  compression: {
    brotli: 6,
    // Either disable caching entirely:
    cache: false,
    // ...or configure limits instead:
    cache: {
      maxSize: 100 * 1024 * 1024, // 100MB
      ttl: 3600, // 1 hour (seconds)
      minEntrySize: 512,
      maxEntrySize: 5 * 1024 * 1024,
    }
  }
})
```

Limitations (TODO):
- Cache limits are parsed but not enforced yet
- No TTL checking or eviction
- No total size tracking or LRU eviction
- cache: false works immediately

The configuration exists and --smol defaults are in place, ready for
enforcement implementation later.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Claude Bot
05bf4baf2b Document that streaming responses are already excluded from compression
Clarified in code comments and documentation that:
- Streaming responses (ReadableStream bodies) are rejected from StaticRoute
- They throw an error at StaticRoute.fromJS():160, which requires a buffered body
- Streams go through RequestContext, not StaticRoute
- Compression only applies to fully buffered static Response objects

This answers the "how does streaming work" question - it doesn't go through
StaticRoute at all, so compression is never applied to streams. No special
handling needed - the architecture naturally prevents it.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Claude Bot
570d3a394a Generate proper ETags for compressed variants by hashing compressed data
Previously: Appended encoding name to original ETag ("hash-gzip")
Now: Hash the actual compressed bytes for each variant

Benefits:
- RFC compliant: ETag accurately represents the bytes being sent
- Better caching: Different compression = different ETag
- Cache correctness: Browsers can properly validate cached responses
- Optimization: Reuses XxHash64 like original ETags

Also fixed duplicate ETag headers by excluding etag and content-length
from original headers when serving compressed responses.

Test results show proper ETags:
- Gzip: "9fda8793868c946a" (unique hash)
- Brotli: "f6cf23ab76d3053b" (different hash)
- Original: "3e18e94100461873" (also different)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Claude Bot
ba7a4d7048 Document memory implications of compression caching
Update documentation to be honest about memory usage:
- Each static route can store up to 4 compressed variants (lazy)
- Small files: negligible overhead (~200 bytes)
- Large files: significant overhead (~300-400KB per route)
- Example: 100 routes × 1MB files = ~40MB extra

Clarify this is for static routes only, not dynamic routes or streaming.
Dynamic routes would need proper LRU cache with TTL and size limits.

The current design is acceptable for static routes because:
1. Static routes are finite and user-controlled
2. Original data is already cached
3. Lazy compression - only cache what clients request
4. Users can disable algorithms: compression: { gzip: true, brotli: false }
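
The "100 routes × 1MB = ~40MB" figure follows from 4 variants per route at an assumed ~10:1 text compression ratio:

```javascript
// Back-of-envelope for the memory overhead quoted above.
const routes = 100;
const fileSize = 1 * 1024 * 1024; // 1MB original per route
const variants = 4;               // brotli, gzip, zstd, deflate
const ratio = 0.1;                // assumed compressed/original for text

const overhead = routes * variants * fileSize * ratio;
console.log(overhead / (1024 * 1024)); // 40 — MB of extra memory
```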

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Claude Bot
f58ce945a4 Remove redundant node:http compression check from Zig code
Set compression: false directly in the node:http JS code instead of
checking onNodeHTTPRequest in Zig. This is simpler and follows the
pattern of setting it at the source. Since compression is opt-in by
default (false), this also removes unnecessary special-case logic.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Claude Bot
4b6f043c78 Add opt-in HTTP response compression for Bun.serve static routes
## Summary
Implements automatic HTTP response compression for static routes in Bun.serve()
with support for Brotli, Gzip, Zstd, and Deflate algorithms. Compression is
opt-in (disabled by default) and only applies to static Response objects.

## Implementation

### Core Components
- **CompressionConfig.zig**: Configuration parsing and encoding selection
  - RFC 9110 compliant Accept-Encoding header parsing with quality values
  - Per-algorithm configuration (level, threshold, enable/disable)
  - Automatic localhost detection to skip compression
  - Default: opt-in (user must set compression: true)

- **Compressor.zig**: Compression utilities for all algorithms
  - Brotli (level 0-11, default 4)
  - Gzip (level 1-9, default 6)
  - Zstd (level 1-22, default 3)
  - Deflate (level 1-9, disabled by default)
  - MIME type filtering to skip already-compressed formats

### Static Route Integration
- Lazy compression with per-encoding caching
- Compressed variants stored inline (CompressedVariant struct)
- Separate ETags per encoding (format: "hash-encoding")
- Proper Vary: Accept-Encoding headers for cache correctness
- Memory-efficient: compress once, serve many times

### Configuration API
```js
Bun.serve({
  // Pick one of the following forms:
  compression: true,  // Use defaults
  compression: false, // Disable
  compression: {
    brotli: 6,        // Custom level
    gzip: false,      // Disable individual algorithm
    threshold: 2048,  // Min size to compress (bytes)
    disableForLocalhost: true, // Skip localhost (default)
  },
});
```

## Limitations
- **Static routes only**: Only applies to Response objects in routes
- **No dynamic routes**: Would require caching API (future work)
- **No streaming**: Streaming responses are not compressed
- **node:http disabled**: Compression force-disabled for node:http servers

## Testing
Verified with manual tests showing:
- Compression enabled when opt-in
- Proper gzip encoding applied
- 99% compression ratio on test data
- Disabled by default as expected
- Vary headers set correctly

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:46:55 +00:00
Jarred Sumner
6f8138b6e4 in build Add NO_SCCACHE env var 2025-11-07 04:40:29 -08:00
taylor.fish
23a2b2129c Use std.debug.captureStackTrace on all platforms (#24456)
In the crash reporter, we currently use glibc's `backtrace()` function
on glibc Linux targets. However, this has resulted in poor stack traces
in many scenarios, particularly when a JSC signal handlers is involved,
in which case the stack trace tends to have only one frame—the signal
handler itself. Considering that JSC installs a signal handler for SEGV,
this is particularly bad.

Zig's `std.debug.captureStackTrace` generates considerably more complete
stack traces, but it has an issue where the top frame is missing when a
signal handler is involved. This is unfortunate, but it's still the
better option for now. Note that our stack traces on macOS also have
this missing frame issue.

In the future, we will investigate backporting the changes to stack
trace capturing that were recently made in Zig's `master` branch, since
that seems to have fixed the missing frame issue.

This PR still uses the stack trace provided by `backtrace()` if it
returns more frames than `captureStackTrace`. In particular, ARM may
need this behavior.

(For internal tracking: fixes ENG-21406)
2025-11-07 04:07:53 -08:00
Jarred Sumner
8ec856124c Add ccache back, with fallback for sccache 2025-11-07 04:01:10 -08:00
Jarred Sumner
94bc68f72c Ignore maxBuffer when not piped (#24440)
### What does this PR do?

### How did you verify your code works?
2025-11-07 00:54:01 -08:00
Marko Vejnovic
75f271a306 ENG-21473: Fix installations without sccache (#24453) 2025-11-06 17:26:28 -08:00
Marko Vejnovic
267be9a54a ci(ENG-21474): Minor Cleanup (#24450) 2025-11-06 17:26:19 -08:00
Alistair Smith
44402ad27a Document & cover some missing spawn/spawnSync options (#24417) 2025-11-06 14:37:26 -08:00
pfg
e01f454635 Fix #23865 (#24355)
Fixes #23865, Fixes ENG-21446

Previously, a termination exception would be thrown. We didn't handle it
properly and eventually it got caught by a `catch @panic()` handler.
Now, no termination exception is thrown.

```
drainMicrotasksWithGlobal calls JSC__JSGlobalObject__drainMicrotasks
JSC__JSGlobalObject__drainMicrotasks returns m_terminationException
-> drainMicrotasksWithGlobal
-> event_loop.zig:exit, which catches the error and discards it
-> ...
```

For workers, we will need to handle termination exceptions in this
codepath.

~~Previously, it would see the exception, call
reportUncaughtExceptionAtEventLoop, but the exception would still
survive and return out from the catch scope. You're not supposed to
still have an exception signaled at the exit of a catch scope. Exception
checker may not have caught it because maybe the branch wasn't taken.~~

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-05 22:04:14 -08:00
Jarred Sumner
f56232a810 Move Bun.spawn & Bun.spawnSync into a separate file (#24425)
### What does this PR do?



### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-05 22:03:27 -08:00
Marko Vejnovic
cf0ae19c2a ENG-21468: RELEASE=1 disables sccache (#24428)
### What does this PR do?

What the title says

### How did you verify your code works?

Tested locally:

```bash
killall sccache
RELEASE=1 bun run build
sccache --show-stats
```

```
marko@fedora:~/Desktop/bun-2$ sccache --show-stats
Compile requests                      0
Compile requests executed             0
Cache hits                            0
Cache misses                          0
Cache hits rate                       -
Cache timeouts                        0
Cache read errors                     0
Forced recaches                       0
Cache write errors                    0
Cache errors                          0
Compilations                          0
Compilation failures                  0
Non-cacheable compilations            0
Non-cacheable calls                   0
Non-compilation calls                 0
Unsupported compiler calls            0
Average cache write               0.000 s
Average compiler                  0.000 s
Average cache read hit            0.000 s
Failed distributed compilations       0
Cache location                  Local disk: "/home/marko/.cache/sccache"
Use direct/preprocessor mode?   yes
Version (client)                0.12.0
Max cache size                       10 GiB
```

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-05 22:03:10 -08:00
Jarred Sumner
4ac293bf01 Add error when loading known unsupported v8 c++ api (#24384)
### What does this PR do?

### How did you verify your code works?
2025-11-05 19:17:03 -08:00
Jarred Sumner
314088ab37 Update no-validate-exceptions.txt 2025-11-05 19:07:42 -08:00
Marko Vejnovic
86a0ff442a build(ENG-21464): Remove sccache --show-stats on local builds (#24421)
Co-authored-by: Meghan Denny <meghan@bun.com>
2025-11-05 16:26:48 -08:00
Meghan Denny
f4404a55db misc tidyings from another branch (#24406)
pulled out of https://github.com/oven-sh/bun/pull/21809

- brings the ASAN behavior on linux closer in sync with macos
- fixes some tests to also pass in node

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2025-11-05 15:28:28 -08:00
Marko Vejnovic
1d4e3c0ab2 [publish images] 2025-11-05 14:34:31 -08:00
Marko Vejnovic
782f684b2e build(ENG-21330): Replace ccache with sccache (#24200) 2025-11-05 14:30:56 -08:00
Alistair Smith
995d988c73 Clear module cache when require'ing an es module with TLA throws (#24389)
### What does this PR do?

Fixes #24387

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Marko Vejnovic <marko@bun.com>
2025-11-05 13:55:49 -08:00
robobun
c7b9e0dc92 fix(node): prevent crash with null/undefined exports in process.dlopen (#24403)
## Summary

Fixes a segfault that occurred when calling `process.dlopen` with
`null`, `undefined`, or primitive values for `exports`.

Previously, this would cause a crash at address `0x00000000` in
`node_module_register` due to dereferencing an uninitialized
`strongExportsObject`.

## Changes

- Modified `src/bun.js/bindings/v8/node.cpp` to use JSC's `toObject()`
instead of manual type checking
- This matches Node.js `ToObject()` behavior:
  - Throws `TypeError` for `null`/`undefined`
  - Creates wrapper objects for primitives
  - Preserves existing objects
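
The `ToObject()` contract adopted here can be illustrated at the JS level (the helper below is a sketch, not the C++ change):

```javascript
// null/undefined reject with TypeError, primitives get wrapper objects,
// existing objects pass through unchanged.
function toObjectLike(value) {
  if (value === null || value === undefined) {
    throw new TypeError("Cannot convert undefined or null to object");
  }
  return Object(value); // wraps primitives, returns objects as-is
}

console.log(toObjectLike(42) instanceof Number); // true — wrapper object
const obj = {};
console.log(toObjectLike(obj) === obj);          // true — preserved
let threw = false;
try { toObjectLike(null); } catch (e) { threw = e instanceof TypeError; }
console.log(threw); // true
```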

## Test Plan

Added `test/js/node/process/dlopen-non-object-exports.test.ts` with
three test cases:
- Null exports (should throw)
- Undefined exports (should throw)  
- Primitive exports (should create wrapper)

All tests pass with the fix.

## Related Issue

Fixes the first bug discovered in the segfault investigation.

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-05 13:53:08 -08:00
robobun
df5d0fcfa1 fix: ensure EC private key JWK "d" field has correct length (#24400)
## Summary

Fixes incorrect JWK "d" field length for exported elliptic curve private
keys. The "d" field is now correctly padded to ensure RFC 7518
compliance.

## Problem

When exporting EC private keys to JWK format, the "d" field would
sometimes be shorter than required by RFC 7518 because
`convertToBytes()` doesn't pad the result when the BIGNUM has leading
zeros. This caused incompatibility with Chrome's strict validation,
though Node.js and Firefox would accept the malformed keys.

Expected lengths per RFC 7518:
- P-256: 32 bytes → 43 base64url characters
- P-384: 48 bytes → 64 base64url characters  
- P-521: 66 bytes → 88 base64url characters
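
The padding fix and the expected lengths can be checked with a small sketch (helper name is illustrative; the real change uses `convertToBytesExpand`):

```javascript
// Left-pad the raw "d" scalar with zero bytes to the curve's field size
// before base64url encoding, as RFC 7518 requires.
function padScalar(d, keySizeInBytes) {
  const padded = Buffer.alloc(keySizeInBytes);  // zero-filled
  d.copy(padded, keySizeInBytes - d.length);    // right-align the value
  return padded.toString("base64url");
}

// A P-256 scalar with a leading zero byte comes back 31 bytes long from
// the raw BIGNUM conversion; padding restores the required 43 characters:
const short = Buffer.alloc(31, 0xab);
console.log(padScalar(short, 32).length); // 43
console.log(Buffer.alloc(48).toString("base64url").length); // 64 — P-384
console.log(Buffer.alloc(66).toString("base64url").length); // 88 — P-521
```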

## Solution

Changed `src/bun.js/bindings/webcrypto/CryptoKeyECOpenSSL.cpp:420` to
use `convertToBytesExpand(privateKey, keySizeInBytes)` instead of
`convertToBytes(privateKey)`, ensuring the private key is padded with
leading zeros when necessary. This matches the behavior already used for
the x and y public key coordinates.

## Test plan

- Added regression test in `test/regression/issue/24399.test.ts` that
generates multiple keys for each curve and verifies correct "d" field
length
- Test fails with `USE_SYSTEM_BUN=1 bun test` (reproduces the bug)
- Test passes with `bun bd test` (verifies the fix)
- Existing crypto tests pass

Fixes #24399

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-05 13:49:13 -08:00
taylor.fish
4a326979f4 Fix bindgenv2 type annotations (#24139) 2025-11-05 12:45:43 -08:00
Marko Vejnovic
b3f8930c4a ENG-21460: Docs CLRF to LF (#24416) 2025-11-05 12:22:27 -08:00
Alistair Smith
126f4686af remove outdated docs ci workflows 2025-11-05 11:19:07 -08:00
Lydia Hallie
1606a9f24e Replace old docs with new docs repo (#24201) 2025-11-05 11:14:21 -08:00
Meghan Denny
550522e99b napi: unskip passing tests (#24359) 2025-11-04 16:59:23 -08:00
Alistair Smith
46d4ed3c33 Fix #24154 (#24382) 2025-11-04 13:11:52 -08:00
Meghan Denny
fa219a2f8e js: update node:_http_agent (#24275)
pulled out of https://github.com/oven-sh/bun/pull/21809

+7 node tests

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-04 11:56:33 -08:00
csvlad
4250ce6157 fix: vi typing in bun:test (#24248) 2025-11-04 08:27:30 -08:00
nkxxll
f8dce87f24 docs(bun-types): Replace depricated readableStreamToText in type docu… (#24372)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2025-11-04 07:43:46 -08:00
Jarred Sumner
359f04d81f Improve NAPI property and element handling (#24358)
### What does this PR do?

Refactored NAPI property and element access to use inline methods and
improved error handling. Added comprehensive tests for default value
behavior and numeric string key operations in NAPI, ensuring correct
handling of missing properties, integer keys, and property deletion.
Updated TypeScript tests to cover new scenarios.

### How did you verify your code works?

Tests

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2025-11-04 03:21:07 -08:00
robobun
9ce2504554 fix(node:http): unref poll_ref on WebSocket upgrade to prevent CPU spin (#24271)
## Summary

Fixes 100% CPU usage on idle WebSocket servers, a regression introduced
between bun-v1.2.23 and bun-v1.3.0.

Many users reported WebSocket server CPU usage jumping to 100% on idle
connections after upgrading to v1.3.0. Investigation revealed a missing
`poll_ref.unref()` call in the WebSocket upgrade path.

## Root Cause

In commit 625e537f5d (#23348), the `OnBeforeOpen` callback mechanism was
removed as part of refactoring the WebSocket upgrade process. However,
this callback contained a critical cleanup step:

```zig
defer ctx.this.poll_ref.unref(ctx.globalObject.bunVM());
```

When a `NodeHTTPResponse` is created, `poll_ref.ref()` is called (line
314) to keep the event loop alive while handling the HTTP request. After
a WebSocket upgrade, the HTTP response object is no longer relevant and
its `poll_ref` must be unref'd to indicate the request processing is
complete.

Without this unref, the event loop maintains an active reference even
after the upgrade completes, causing the CPU to spin at 100% waiting for
events on what should be an idle connection.

## Changes

- Added `poll_ref.unref()` call in `NodeHTTPResponse.upgrade()` after
setting the `upgraded` flag
- Added regression test to verify event loop properly exits after
WebSocket upgrade

## Test Plan

- [x] Code compiles successfully
- [x] Existing WebSocket tests pass
- [x] Manual testing confirms CPU usage returns to normal on idle
WebSocket connections

## Related Issues

Fixes issue reported by users between bun-v1.2.23 and bun-v1.3.0
regarding 100% CPU usage on idle WebSocket servers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-03 23:27:26 -08:00
Jarred Sumner
5d76a0b2f8 Revert incorrect remaining_in_buffer check 2025-11-03 23:02:22 -08:00
Ciro Spaciari
8a9249c216 fix(tls) undo some changes added in root_certs (#24350)
### What does this PR do?
Restore call to us_get_default_ca_certificates, and
X509_STORE_set_default_paths

Fixes https://github.com/oven-sh/bun/issues/23735
### How did you verify your code works?
Manually test running:
```bash
bun -e "await fetch('https://secure-api.eloview.com').then(res => res.text()).then(console.log);"
```
should not result in:
```js
error: unable to get local issuer certificate
  path: "https://secure-api.eloview.com/",
  errno: 0,
  code: "UNABLE_TO_GET_ISSUER_CERT_LOCALLY"
```

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Enhanced system root certificate handling to ensure consistent
validation across all secure connections.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2025-11-03 22:59:46 -08:00
Dylan Conway
aad4d800ff add "configVersion" to bun.lock(b) (#24236)
### What does this PR do?

Adds `"configVersion"` to bun.lock(b). The version will be used to keep
default settings the same if they would be breaking across bun versions.

fixes ENG-21389
fixes ENG-21388
### How did you verify your code works?
TODO:
- [ ] new project
- [ ] existing project without configVersion
- [ ] existing project with configVersion
- [ ] same as above but with bun.lockb
- [ ] configVersion@0 defaults to hoisted linker
- [ ] new projects use isolated linker

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-03 22:20:07 -08:00
Jarred Sumner
528620e9ae Add postinstall optimizer with native binlink support and script skipping (#24283)
## Summary

This PR introduces a new postinstall optimization system that
significantly reduces the need to run lifecycle scripts for certain
packages by intelligently handling their requirements at install time.

## Key Features

### 1. Native Binlink Optimization

When packages like `esbuild` ship platform-specific binaries as optional
dependencies, we now:
- Detect the native binlink pattern (enabled by default for `esbuild`)
- Find the matching platform-specific dependency based on target CPU/OS
- Link binaries directly from the platform-specific package (e.g.,
`@esbuild/darwin-arm64`)
- Fall back gracefully if the platform-specific package isn't found

**Result**: No postinstall scripts needed for esbuild and similar
packages.
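
The platform-specific match can be sketched as follows (the `@esbuild/<platform>-<arch>` naming is esbuild's convention; the helper itself is hypothetical):

```javascript
// Pick the optional dependency matching the target platform/arch,
// or null so the caller can fall back to running the postinstall script.
function matchPlatformPackage(optionalDeps, platform = process.platform, arch = process.arch) {
  const want = `@esbuild/${platform}-${arch}`;
  return optionalDeps.includes(want) ? want : null;
}

const deps = ["@esbuild/darwin-arm64", "@esbuild/linux-x64", "@esbuild/win32-x64"];
console.log(matchPlatformPackage(deps, "darwin", "arm64")); // "@esbuild/darwin-arm64"
console.log(matchPlatformPackage(deps, "freebsd", "arm"));  // null — falls back
```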

### 2. Lifecycle Script Skipping

For packages like `sharp` that run heavy postinstall scripts:
- Skip lifecycle scripts entirely (enabled by default for `sharp`)
- Prevents downloading large binaries or compiling native code
unnecessarily
- Reduces install time and potential failures in restricted environments

## Configuration

Both features can be configured via `package.json`:

```json
{
  "nativeDependencies": ["esbuild", "my-custom-package"],
  "ignoreScripts": ["sharp", "another-package"]
}
```

Set to empty arrays to disable defaults:
```json
{
  "nativeDependencies": [],
  "ignoreScripts": []
}
```

Environment variable overrides:
- `BUN_FEATURE_FLAG_DISABLE_NATIVE_DEPENDENCY_LINKER=1` - disable native
binlink
- `BUN_FEATURE_FLAG_DISABLE_IGNORE_SCRIPTS=1` - disable script ignoring

## Implementation Details

### Core Components

- **`postinstall_optimizer.zig`**: New file containing the optimizer
logic
- `PostinstallOptimizer` enum with `native_binlink` and `ignore`
variants
  - `List` type to track optimization strategies per package hash
  - Defaults for `esbuild` (native binlink) and `sharp` (ignore)
  
- **`Bin.Linker` changes**: Extended to support separate target paths
  - `target_node_modules_path`: Where to find the actual binary
  - `target_package_name`: Name of the package containing the binary
  - Fallback logic when native binlink optimization fails

### Modified Components

- **PackageInstaller.zig**: Checks optimizer before:
  - Enqueueing lifecycle scripts
  - Linking binaries (with platform-specific package resolution)
  
- **isolated_install/Installer.zig**: Similar checks for isolated linker
mode
  - `maybeReplaceNodeModulesPath()` resolves platform-specific packages
  - Retry logic without optimization on failure

- **Lockfile**: Added `postinstall_optimizer` field to persist
configuration

## Changes Included

- Updated `esbuild` from 0.21.5 to 0.25.11 (testing with latest)
- VS Code launch config updates for debugging install with new flags
- New feature flags in `env_var.zig`

## Test Plan

- [x] Existing install tests pass
- [ ] Test esbuild install without postinstall scripts running
- [ ] Test sharp install with scripts skipped
- [ ] Test custom package.json configuration
- [ ] Test fallback when platform-specific package not found
- [ ] Test feature flag overrides

🤖 Generated with [Claude Code](https://claude.com/claude-code)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Native binlink optimization: installs platform-specific binaries when
available, with a safe retry fallback and verbose logging option.
* Per-package postinstall controls to optionally skip lifecycle scripts.
* New feature flags to disable native binlink optimization and to
disable lifecycle-script ignoring.

* **Tests**
* End-to-end tests and test packages added to validate native binlink
behavior across install scenarios and linker modes.

* **Documentation**
  * Bench README and sample app migrated to a Next.js-based setup.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-11-03 20:36:22 -08:00
robobun
7197fb1f04 Fix Module._resolveFilename to pass options.paths to overridden functions (#24325)
Fixes Next.js 16 + React Compiler build failure when using Bun runtime.

## Issue
When `Module._resolveFilename` was overridden (e.g., by Next.js's
require-hook), Bun was not passing the `options` parameter (which
contains `paths`) to the override function. This caused resolution
failures when the override tried to use custom resolution paths.

Additionally, when `Module._resolveFilename` was called directly with
`options.paths`, Bun was ignoring the paths parameter and using default
resolution.

## Root Causes
1. In `ImportMetaObject.cpp`, when calling an overridden
`_resolveFilename` function, the options object with paths was not being
passed as the 4th argument.

2. In `NodeModuleModule.cpp`, `jsFunctionResolveFileName` was calling
`Bun__resolveSync` without extracting and using the `options.paths`
parameter.

## Solution
1. In `ImportMetaObject.cpp`: When `userPathList` is provided, construct
an options object with `{paths: userPathList}` and pass it as the 4th
argument to the overridden `_resolveFilename` function.

2. In `NodeModuleModule.cpp`: Extract `options.paths` from the 4th
argument and call `Bun__resolveSyncWithPaths` when paths are provided,
instead of always using `Bun__resolveSync`.

## Reproduction
Before this fix, running:
```bash
bun --bun next build --turbopack
```
on a Next.js 16 app with React Compiler enabled would fail with:
```
Cannot find module './node_modules/babel-plugin-react-compiler'
```

## Testing
- Added comprehensive tests for `Module._resolveFilename` with
`options.paths`
- Verified Next.js 16 + React Compiler + Turbopack builds successfully
with Bun
- All 5 new tests pass with the fix, 3 fail without it
- All existing tests continue to pass

## Files Changed
- `src/bun.js/bindings/ImportMetaObject.cpp` - Pass options to override
- `src/bun.js/modules/NodeModuleModule.cpp` - Handle options.paths in
_resolveFilename
- `test/js/node/module/module-resolve-filename-paths.test.js` - New test
suite

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-03 20:28:33 -08:00
robobun
946470dcd7 Refactor: move FetchTasklet to separate file (#24330)
## Summary

Extract `FetchTasklet` struct from `src/bun.js/webcore/fetch.zig` into
its own file at `src/bun.js/webcore/fetch/FetchTasklet.zig` to improve
code organization and modularity.

## Changes

- Moved `FetchTasklet` struct definition (1336 lines) to new file
`src/bun.js/webcore/fetch/FetchTasklet.zig`
- Added all necessary imports to the new file
- Updated `fetch.zig` line 61 to import `FetchTasklet` from the new
location: `pub const FetchTasklet =
@import("./fetch/FetchTasklet.zig").FetchTasklet;`
- Verified compilation succeeds with `bun bd`

## Impact

- No functional changes - this is a pure refactoring
- Improves code organization by separating the large `FetchTasklet`
implementation
- Makes the codebase more maintainable and easier to navigate
- Reduces `fetch.zig` from 2768 lines to 1433 lines

## Test plan

- [x] Built successfully with `bun bd`
- [x] No changes to functionality - pure code organization refactor

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-03 02:21:49 -08:00
Michael H
d76fad3618 fix update interactive to keep npm aliases (#23903)
### What does this PR do?

fixes #23901

### How did you verify your code works?

with a test

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-03 02:12:24 -08:00