Compare commits

18 Commits

Author SHA1 Message Date
Claude Bot
7af0c26c06 test: await stdout/stderr before proc.exited for better error messages
Address review feedback: await stdout and stderr first, then await
proc.exited so test failures show captured output.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 01:17:45 +00:00
Claude Bot
8473377733 fix(console): only display own properties in console.log
Objects created with Object.create() were incorrectly showing inherited
prototype properties in console.log output. This fix ensures console.log
only displays own enumerable properties, matching Node.js behavior.

Fixes #1713

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 00:59:53 +00:00
Ciro Spaciari
22bebfc467 respect agent options and connectOpts in https module (#25937) 2026-01-14 11:52:53 -08:00
robobun
1800093a64 fix(install): use scope-specific registry for scoped packages in frozen lockfile (#26047)
## Summary
- Fixed `bun install --frozen-lockfile` to use scope-specific registry
for scoped packages when the lockfile has an empty registry URL

When parsing a `bun.lock` file with an empty registry URL for a scoped
package (like `@example/test-package`), bun was unconditionally using
the default npm registry (`https://registry.npmjs.org/`) instead of
looking up the scope-specific registry from `bunfig.toml`.

For example, with this configuration in `bunfig.toml`:
```toml
[install.scopes]
example = { url = "https://npm.pkg.github.com" }
```

And this lockfile entry with an empty registry URL:
```json
"@example/test-package": ["@example/test-package@1.0.0", "", {}, "sha512-AAAA"]
```

bun would try to fetch from
`https://registry.npmjs.org/@example/test-package/-/...` instead of
`https://npm.pkg.github.com/@example/test-package/-/...`.

The fix uses `manager.scopeForPackageName()` (the same pattern used in
`pnpm.zig`) to look up the correct scope-specific registry URL.
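
The selection rule can be sketched in TypeScript (a hypothetical helper for illustration only; the real fix lives in Zig via `manager.scopeForPackageName()`):

```ts
const DEFAULT_REGISTRY = "https://registry.npmjs.org/";

// Pick the registry for a lockfile entry, given the package name, the
// (possibly empty) registry URL from bun.lock, and the scopes from bunfig.toml.
function registryFor(
  name: string,
  lockfileRegistry: string,
  scopes: Record<string, string>,
): string {
  if (lockfileRegistry !== "") return lockfileRegistry;
  if (name.startsWith("@")) {
    // Empty registry URL + scoped package: consult the scope config, not npm.
    const scope = name.slice(1, name.indexOf("/"));
    if (scopes[scope]) return scopes[scope];
  }
  return DEFAULT_REGISTRY;
}

console.log(registryFor("@example/test-package", "", { example: "https://npm.pkg.github.com" }));
// https://npm.pkg.github.com
console.log(registryFor("lodash", "", { example: "https://npm.pkg.github.com" }));
// https://registry.npmjs.org/
```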

## Test plan
- [x] Added regression test `test/regression/issue/026039.test.ts` that verifies:
  - Scoped packages use the scope-specific registry from `bunfig.toml`
  - Non-scoped packages continue to use the default registry
- [x] Verified test fails with system bun (without fix) and passes with debug build (with fix)

Fixes #26039

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-14 10:22:30 -08:00
Jarred Sumner
967a6a2021 Fix blocking realpathSync call (#26056)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-13 23:05:46 -08:00
Jarred Sumner
49d0fbd2de Update 25716.test.ts 2026-01-13 22:38:31 -08:00
Tommy D. Rossi
af2317deb4 fix(bundler): allow reactFastRefresh Bun.build option with non-browser targets (#26035)
### What does this PR do?

Previously, reactFastRefresh was silently ignored when target was not
'browser', even when explicitly enabled. This was confusing as there was
no warning or error.

This change removes the `target == .browser` check, trusting explicit
user intent. If users enable reactFastRefresh with a non-browser target,
the transform will now be applied. If `$RefreshReg$` is not defined at
runtime, it will fail fast with a clear error rather than silently doing
nothing.

Use case: Terminal UIs (like [termcast](https://termcast.app)) need
React Fast Refresh with target: 'bun' for hot reloading in non-browser
environments.
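
The now-permitted combination can be sketched as follows (an illustrative `Bun.build` call; the entrypoint and outdir are made up):

```ts
await Bun.build({
  entrypoints: ["./src/tui.tsx"], // illustrative
  outdir: "./dist",
  target: "bun",           // non-browser target
  reactFastRefresh: true,  // no longer silently ignored for non-browser targets
});
```

If `$RefreshReg$` is not defined at runtime, the transformed output fails fast rather than silently doing nothing.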

### How did you verify your code works?

Updated the existing test, removing the browser target restriction
2026-01-13 22:13:17 -08:00
robobun
ab009fe00d fix(init): respect --minimal flag for agent rule files (#26051)
## Summary
- Fixes `bun init --minimal` creating Cursor rules files and CLAUDE.md
when it shouldn't
- Adds regression test to verify `--minimal` only creates package.json
and tsconfig.json

## Test plan
- [x] Verify test fails with system bun (unfixed): `USE_SYSTEM_BUN=1 bun test test/cli/init/init.test.ts -t "bun init --minimal"`
- [x] Verify test passes with debug build: `bun bd test test/cli/init/init.test.ts -t "bun init --minimal"`
- [x] All existing init tests pass: `bun bd test test/cli/init/init.test.ts`

Fixes #26050

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 18:33:42 -08:00
Jarred Sumner
fbd800551b Bump 2026-01-13 15:06:36 -08:00
robobun
113cdd9648 fix(completions): add update command to Fish completions (#25978)
## Summary

- Add the `update` subcommand to Fish shell completions
- Apply the install/add/remove flags (--global, --dry-run, --force,
etc.) to the `update` command

Previously, Fish shell autocompletion for `bun update --gl<TAB>` would
not work because:
1. The `update` command was missing from the list of built-in commands
2. The install/add/remove flags were not being applied to `update`

Fixes #25953

## Test plan

- [x] Verify `update` appears in subcommand completions (`bun <TAB>`)
- [x] Verify `--global` flag completion works (`bun update --gl<TAB>`)
- [x] Verify other install flags work with update (--dry-run, --force,
etc.)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 15:05:00 -08:00
robobun
3196178fa7 fix(timers): add _idleStart property to Timeout object (#26021)
## Summary

- Add `_idleStart` property (getter/setter) to the Timeout object
returned by `setTimeout()` and `setInterval()`
- The property returns a monotonic timestamp (in milliseconds)
representing when the timer was created
- This mimics Node.js's behavior where `_idleStart` is the libuv
timestamp at timer creation time
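
Since the property mirrors Node.js, the behavior can be sketched with a plain snippet (runs identically under Node.js):

```js
const t = setTimeout(() => {}, 1000);

// _idleStart is a monotonic timestamp (in milliseconds) captured when
// the timer was created.
console.log(typeof t._idleStart); // "number"

clearTimeout(t);
```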

## Test plan

- [x] Verified test fails with `USE_SYSTEM_BUN=1 bun test test/regression/issue/25639.test.ts`
- [x] Verified test passes with `bun bd test test/regression/issue/25639.test.ts`
- [x] Manual verification:
  ```bash
  # Bun with fix - _idleStart exists
  ./build/debug/bun-debug -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number

  # Node.js reference - same behavior
  node -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number
  ```

Closes #25639

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 19:35:11 -08:00
robobun
d530ed993d fix(css): restore handler context after minifying nested rules (#25997)
## Summary
- Fixes handler context not being restored after minifying nested CSS
rules
- Adds regression test for the issue

## Test plan
- [x] Test fails with `USE_SYSTEM_BUN=1 bun test test/regression/issue/25794.test.ts`
- [x] Test passes with `bun bd test test/regression/issue/25794.test.ts`

Fixes #25794

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 14:55:27 -08:00
Dylan Conway
959169dfaf feat(archive): change API to constructor-based with S3 support (#25940)
## Summary
- Change Archive API from `Bun.Archive.from(data)` to `new
Bun.Archive(data, options?)`
- Change compression options from `{ gzip: true }` to `{ compress:
"gzip", level?: number }`
- Default to no compression when no options provided
- Use `{ compress: "gzip" }` to enable gzip compression (level 6 by
default)
- Add Archive support for S3 and local file writes via `Bun.write()`

## New API

```typescript
// Create archive - defaults to uncompressed tar
const archive = new Bun.Archive({
  "hello.txt": "Hello, World!",
  "data.json": JSON.stringify({ foo: "bar" }),
});

// Enable gzip compression
const compressed = new Bun.Archive(files, { compress: "gzip" });

// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive(files, { compress: "gzip", level: 12 });

// Write to local file
await Bun.write("archive.tar", archive);           // uncompressed by default
await Bun.write("archive.tar.gz", compressed);     // gzipped

// Write to S3
await client.write("archive.tar.gz", compressed);          // S3Client.write()
await Bun.write("s3://bucket/archive.tar.gz", compressed); // S3 URL
await s3File.write(compressed);                            // s3File.write()

// Get bytes/blob (uses compression setting from constructor)
const bytes = await archive.bytes();
const blob = await archive.blob();
```

## TypeScript Types

```typescript
type ArchiveCompression = "gzip";

type ArchiveOptions = {
  compress?: "gzip";
  level?: number;  // 1-12, default 6 when gzip enabled
};
```

## Test plan
- [x] 98 archive tests pass
- [x] S3 integration tests updated to new API
- [x] TypeScript types updated
- [x] Documentation updated with new examples

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-12 14:54:21 -08:00
SUZUKI Sosuke
461ad886bd fix(http): fix Strong reference leak in server response streaming (#25965)
## Summary

Fix a memory leak in `RequestContext.doRenderWithBody()` where
`Strong.Impl` memory was leaked when proxying streaming responses
through Bun's HTTP server.

## Problem

When a streaming response (e.g., from a proxied fetch request) was
forwarded through Bun's server:

1. `response_body_readable_stream_ref` was initialized at line 1836
(from `lock.readable`) or line 1841 (via `Strong.init()`)
2. For `.Bytes` streams with `has_received_last_chunk=false`, a **new**
Strong reference was created at line 1902
3. The old Strong reference was **never deinit'd**, causing
`Strong.Impl` memory to leak

This leak accumulated over time with every streaming response proxied
through the server.

## Solution

Add `this.response_body_readable_stream_ref.deinit()` before creating
the new Strong reference. This is safe because:

- `stream` exists as a stack-local variable
- JSC's conservative GC tracks stack-local JSValues
- No GC can occur between consecutive synchronous Zig statements
- Therefore, `stream` won't be collected between `deinit()` and
`Strong.init()`

## Test

Added `test/js/web/fetch/server-response-stream-leak.test.ts` which:
- Creates a backend server that returns delayed streaming responses
- Creates a proxy server that forwards the streaming responses
- Makes 200 requests and checks that ReadableStream objects don't
accumulate
- Fails on system Bun v1.3.5 (202 leaked), passes with the fix

## Related

Similar to the Strong reference leak fixes in:
- #23313 (fetch memory leak)
- #25846 (fetch cyclic reference leak)
2026-01-12 14:41:58 -08:00
Markus Schmidt
b6abbd50a0 fix(Bun.SQL): handle binary columns in MySQL correctly (#26011)
## What does this PR do?
Currently, binary columns are returned as strings, which means they get corrupted when encoded as UTF-8. This PR returns binary columns as Buffers, which is what users actually expect and is also consistent with PostgreSQL and SQLite.
### How did you verify your code works?
I added tests to verify the correct behavior. Previously, there were no tests for binary columns at all.

This fixes #23991
2026-01-12 11:56:02 -08:00
Alex Miller
beccd01647 fix(FileSink): add Promise<number> to FileSink.write() return type (#25962)
Co-authored-by: Alistair Smith <hi@alistair.sh>
2026-01-11 12:51:16 -08:00
github-actions[bot]
35eb53994a deps: update sqlite to 3.51.200 (#25957)
## What does this PR do?

Updates SQLite to version 3.51.200

Compare: https://sqlite.org/src/vdiff?from=3.51.1&to=3.51.200

Auto-updated by [this
workflow](https://github.com/oven-sh/bun/actions/workflows/update-sqlite3.yml)

Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2026-01-10 22:46:17 -08:00
robobun
ebf39e9811 fix(install): prevent use-after-free when retrying failed HTTP requests (#25949) 2026-01-10 18:03:24 -08:00
56 changed files with 2426 additions and 706 deletions

View File

@@ -6,7 +6,8 @@ To do that:
- git fetch upstream
- git merge upstream main
- Fix the merge conflicts
- bun build.ts debug
- cd ../../ (back to bun)
- make jsc-build (this will take about 7 minutes)
- While it compiles, in another task review the JSC commits between the last version of Webkit and the new version. Write up a summary of the webkit changes in a file called "webkit-changes.md"
- bun run build:local (build a build of Bun with the new Webkit, make sure it compiles)
- After making sure it compiles, run some code to make sure things work. something like ./build/debug-local/bun-debug --print '42' should be all you need
@@ -20,7 +21,3 @@ To do that:
- commit + push (without adding the webkit-changes.md file)
- create PR titled "Upgrade Webkit to the <commit-sha>", paste your webkit-changes.md into the PR description
- delete the webkit-changes.md file
Things to check for a successful upgrade:
- Did JSType in vendor/WebKit/Source/JavaScriptCore have any recent changes? Does the enum values align with whats present in src/bun.js/bindings/JSType.zig?
- Were there any changes to the webcore code generator? If there are C++ compilation errors, check for differences in some of the generated code in like vendor/WebKit/source/WebCore/bindings/scripts/test/JS/

LATEST
View File

@@ -1 +1 @@
1.3.5
1.3.6

View File

@@ -2,7 +2,7 @@ option(WEBKIT_VERSION "The version of WebKit to use")
option(WEBKIT_LOCAL "If a local version of WebKit should be used instead of downloading")
if(NOT WEBKIT_VERSION)
set(WEBKIT_VERSION preview-pr-135-a6fa914b)
set(WEBKIT_VERSION 1d0216219a3c52cb85195f48f19ba7d5db747ff7)
endif()
string(SUBSTRING ${WEBKIT_VERSION} 0 16 WEBKIT_VERSION_PREFIX)
@@ -33,8 +33,8 @@ if(WEBKIT_LOCAL)
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders
${WEBKIT_PATH}/bmalloc/Headers
${WEBKIT_PATH}/WTF/Headers
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders/JavaScriptCore
${WEBKIT_PATH}/JavaScriptCore/DerivedSources/inspector
${WEBKIT_PATH}/JavaScriptCore/PrivateHeaders/JavaScriptCore
)
endif()

View File

@@ -35,8 +35,8 @@ end
set -l bun_install_boolean_flags yarn production optional development no-save dry-run force no-cache silent verbose global
set -l bun_install_boolean_flags_descriptions "Write a yarn.lock file (yarn v1)" "Don't install devDependencies" "Add dependency to optionalDependencies" "Add dependency to devDependencies" "Don't update package.json or save a lockfile" "Don't install anything" "Always request the latest versions from the registry & reinstall all dependencies" "Ignore manifest cache entirely" "Don't output anything" "Excessively verbose logging" "Use global folder"
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add update init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x update
function __bun_complete_bins_scripts --inherit-variable bun_builtin_cmds_without_run -d "Emit bun completions for bins and scripts"
# Do nothing if we already have a builtin subcommand,
@@ -148,14 +148,14 @@ complete -c bun \
for i in (seq (count $bun_install_boolean_flags))
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
-n "__fish_seen_subcommand_from install add remove update" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
end
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cwd' -d 'Change working directory'
-n "__fish_seen_subcommand_from install add remove update" -l 'cwd' -d 'Change working directory'
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
-n "__fish_seen_subcommand_from install add remove update" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
complete -c bun \
-n "__fish_seen_subcommand_from add" -d 'Popular' -a '(__fish__get_bun_packages)'
@@ -183,4 +183,5 @@ complete -c bun -n "__fish_use_subcommand" -a "unlink" -d "Unregister a local np
complete -c bun -n "__fish_use_subcommand" -a "pm" -d "Additional package management utilities" -f
complete -c bun -n "__fish_use_subcommand" -a "x" -d "Execute a package binary, installing if needed" -f
complete -c bun -n "__fish_use_subcommand" -a "outdated" -d "Display the latest versions of outdated dependencies" -f
complete -c bun -n "__fish_use_subcommand" -a "update" -d "Update dependencies to their latest versions" -f
complete -c bun -n "__fish_use_subcommand" -a "publish" -d "Publish your package from local to npm" -f

View File

@@ -10,21 +10,21 @@ Bun provides a fast, native implementation for working with tar archives through
**Create an archive from files:**
```ts
const archive = Bun.Archive.from({
const archive = new Bun.Archive({
"hello.txt": "Hello, World!",
"data.json": JSON.stringify({ foo: "bar" }),
"nested/file.txt": "Nested content",
});
// Write to disk
await Bun.Archive.write("bundle.tar", archive);
await Bun.write("bundle.tar", archive);
```
**Extract an archive:**
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const entryCount = await archive.extract("./output");
console.log(`Extracted ${entryCount} entries`);
```
@@ -33,7 +33,7 @@ console.log(`Extracted ${entryCount} entries`);
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
@@ -43,10 +43,11 @@ for (const [path, file] of files) {
## Creating Archives
Use `Bun.Archive.from()` to create an archive from an object where keys are file paths and values are file contents:
Use `new Bun.Archive()` to create an archive from an object where keys are file paths and values are file contents. By default, archives are uncompressed:
```ts
const archive = Bun.Archive.from({
// Creates an uncompressed tar archive (default)
const archive = new Bun.Archive({
"README.md": "# My Project",
"src/index.ts": "console.log('Hello');",
"package.json": JSON.stringify({ name: "my-project" }),
@@ -64,7 +65,7 @@ File contents can be:
const data = "binary data";
const arrayBuffer = new ArrayBuffer(8);
const archive = Bun.Archive.from({
const archive = new Bun.Archive({
"text.txt": "Plain text",
"blob.bin": new Blob([data]),
"bytes.bin": new Uint8Array([1, 2, 3, 4]),
@@ -74,18 +75,19 @@ const archive = Bun.Archive.from({
### Writing Archives to Disk
Use `Bun.Archive.write()` to create and write an archive in one operation:
Use `Bun.write()` to write an archive to disk:
```ts
// Write uncompressed tar
await Bun.Archive.write("output.tar", {
// Write uncompressed tar (default)
const archive = new Bun.Archive({
"file1.txt": "content1",
"file2.txt": "content2",
});
await Bun.write("output.tar", archive);
// Write gzipped tar
const files = { "src/index.ts": "console.log('Hello');" };
await Bun.Archive.write("output.tar.gz", files, "gzip");
const compressed = new Bun.Archive({ "src/index.ts": "console.log('Hello');" }, { compress: "gzip" });
await Bun.write("output.tar.gz", compressed);
```
### Getting Archive Bytes
@@ -93,8 +95,7 @@ await Bun.Archive.write("output.tar.gz", files, "gzip");
Get the archive data as bytes or a Blob:
```ts
const files = { "hello.txt": "Hello, World!" };
const archive = Bun.Archive.from(files);
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// As Uint8Array
const bytes = await archive.bytes();
@@ -102,9 +103,10 @@ const bytes = await archive.bytes();
// As Blob
const blob = await archive.blob();
// With gzip compression
const gzippedBytes = await archive.bytes("gzip");
const gzippedBlob = await archive.blob("gzip");
// With gzip compression (set at construction)
const gzipped = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
const gzippedBytes = await gzipped.bytes();
const gzippedBlob = await gzipped.blob();
```
## Extracting Archives
@@ -116,13 +118,13 @@ Create an archive from existing tar/tar.gz data:
```ts
// From a file
const tarball = await Bun.file("package.tar.gz").bytes();
const archiveFromFile = Bun.Archive.from(tarball);
const archiveFromFile = new Bun.Archive(tarball);
```
```ts
// From a fetch response
const response = await fetch("https://example.com/archive.tar.gz");
const archiveFromFetch = Bun.Archive.from(await response.blob());
const archiveFromFetch = new Bun.Archive(await response.blob());
```
### Extracting to Disk
@@ -131,7 +133,7 @@ Use `.extract()` to write all files to a directory:
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./extracted");
console.log(`Extracted ${count} entries`);
```
@@ -148,7 +150,7 @@ Use glob patterns to extract only specific files. Patterns are matched against a
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
// Extract only TypeScript files
const tsCount = await archive.extract("./extracted", { glob: "**/*.ts" });
@@ -181,7 +183,7 @@ Use `.files()` to get archive contents as a `Map` of `File` objects without extr
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
@@ -206,7 +208,7 @@ Archive operations can fail due to corrupted data, I/O errors, or invalid paths.
```ts
try {
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./output");
console.log(`Extracted ${count} entries`);
} catch (e: unknown) {
@@ -227,7 +229,7 @@ try {
Common error scenarios:
- **Corrupted/truncated archives** - `Archive.from()` loads the archive data; errors may be deferred until read/extract operations
- **Corrupted/truncated archives** - `new Archive()` loads the archive data; errors may be deferred until read/extract operations
- **Permission denied** - `extract()` throws if the target directory is not writable
- **Disk full** - `extract()` throws if there's insufficient space
- **Invalid paths** - Operations throw for malformed file paths
@@ -239,7 +241,7 @@ The count returned by `extract()` includes all successfully written entries (fil
For additional security with untrusted archives, you can enumerate and validate paths before extraction:
```ts
const archive = Bun.Archive.from(untrustedData);
const archive = new Bun.Archive(untrustedData);
const files = await archive.files();
// Optional: Custom validation for additional checks
@@ -298,26 +300,28 @@ See [Bun.Glob](/docs/api/glob) for the full glob syntax including escaping and a
## Compression
Bun.Archive supports gzip compression for both reading and writing:
Bun.Archive creates uncompressed tar archives by default. Use `{ compress: "gzip" }` to enable gzip compression:
```ts
// Default: uncompressed tar
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// Reading: automatically detects gzip
const gzippedTarball = await Bun.file("archive.tar.gz").bytes();
const archive = Bun.Archive.from(gzippedTarball);
const readArchive = new Bun.Archive(gzippedTarball);
// Writing: specify compression
const files = { "hello.txt": "Hello, World!" };
await Bun.Archive.write("output.tar.gz", files, "gzip");
// Enable gzip compression
const compressed = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
// Getting bytes: specify compression
const gzippedBytes = await archive.bytes("gzip");
// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip", level: 12 });
```
The compression argument accepts:
The options accept:
- `"gzip"` - Enable gzip compression
- `true` - Same as `"gzip"`
- `false` or `undefined` - No compression
- No options or `undefined` - Uncompressed tar (default)
- `{ compress: "gzip" }` - Enable gzip compression at level 6
- `{ compress: "gzip", level: number }` - Gzip with custom level 1-12 (1 = fastest, 12 = smallest)
## Examples
@@ -339,15 +343,16 @@ for await (const path of glob.scan(".")) {
// Add package.json
files["package.json"] = await Bun.file("package.json").text();
// Create compressed archive
await Bun.Archive.write("bundle.tar.gz", files, "gzip");
// Create compressed archive and write to disk
const archive = new Bun.Archive(files, { compress: "gzip" });
await Bun.write("bundle.tar.gz", archive);
```
### Extract and Process npm Package
```ts
const response = await fetch("https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz");
const archive = Bun.Archive.from(await response.blob());
const archive = new Bun.Archive(await response.blob());
// Get package.json
const files = await archive.files("package/package.json");
@@ -365,7 +370,7 @@ if (packageJson) {
import { readdir } from "node:fs/promises";
import { join } from "node:path";
async function archiveDirectory(dir: string): Promise<Bun.Archive> {
async function archiveDirectory(dir: string, compress = false): Promise<Bun.Archive> {
const files: Record<string, Blob> = {};
async function walk(currentDir: string, prefix: string = "") {
@@ -384,11 +389,11 @@ async function archiveDirectory(dir: string): Promise<Bun.Archive> {
}
await walk(dir);
return Bun.Archive.from(files);
return new Bun.Archive(files, compress ? { compress: "gzip" } : undefined);
}
const archive = await archiveDirectory("./my-project");
await Bun.Archive.write("my-project.tar.gz", archive, "gzip");
const archive = await archiveDirectory("./my-project", true);
await Bun.write("my-project.tar.gz", archive);
```
## Reference
@@ -396,14 +401,19 @@ await Bun.Archive.write("my-project.tar.gz", archive, "gzip");
> **Note**: The following type signatures are simplified for documentation purposes. See [`packages/bun-types/bun.d.ts`](https://github.com/oven-sh/bun/blob/main/packages/bun-types/bun.d.ts) for the full type definitions.
```ts
type ArchiveCompression = "gzip" | boolean;
type ArchiveInput =
| Record<string, string | Blob | Bun.ArrayBufferView | ArrayBufferLike>
| Blob
| Bun.ArrayBufferView
| ArrayBufferLike;
type ArchiveOptions = {
/** Compression algorithm. Currently only "gzip" is supported. */
compress?: "gzip";
/** Compression level 1-12 (default 6 when gzip is enabled). */
level?: number;
};
interface ArchiveExtractOptions {
/** Glob pattern(s) to filter extraction. Supports negative patterns with "!" prefix. */
glob?: string | readonly string[];
@@ -412,13 +422,11 @@ interface ArchiveExtractOptions {
class Archive {
/**
* Create an Archive from input data
* @param data - Files to archive (as object) or existing archive data (as bytes/blob)
* @param options - Compression options. Uncompressed by default.
* Pass { compress: "gzip" } to enable compression.
*/
static from(data: ArchiveInput): Archive;
/**
* Write an archive directly to disk
*/
static write(path: string, data: ArchiveInput | Archive, compress?: ArchiveCompression): Promise<void>;
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Extract archive to a directory
@@ -427,14 +435,14 @@ class Archive {
extract(path: string, options?: ArchiveExtractOptions): Promise<number>;
/**
* Get archive as a Blob
* Get archive as a Blob (uses compression setting from constructor)
*/
blob(compress?: ArchiveCompression): Promise<Blob>;
blob(): Promise<Blob>;
/**
* Get archive as a Uint8Array
* Get archive as a Uint8Array (uses compression setting from constructor)
*/
bytes(compress?: ArchiveCompression): Promise<Uint8Array<ArrayBuffer>>;
bytes(): Promise<Uint8Array<ArrayBuffer>>;
/**
* Get archive contents as File objects (regular files only, no directories)

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.3.6",
"version": "1.3.7",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"

View File

@@ -750,7 +750,7 @@ declare module "bun" {
*/
function write(
destination: BunFile | S3File | PathLike,
input: Blob | NodeJS.TypedArray | ArrayBufferLike | string | BlobPart[],
input: Blob | NodeJS.TypedArray | ArrayBufferLike | string | BlobPart[] | Archive,
options?: {
/**
* If writing to a PathLike, set the permissions of the file.
@@ -6975,15 +6975,44 @@ declare module "bun" {
/**
* Compression format for archive output.
* - `"gzip"` - Compress with gzip
* - `true` - Same as `"gzip"`
* - `false` - Explicitly disable compression (no compression)
* - `undefined` - No compression (default behavior when omitted)
*
* Both `false` and `undefined` result in no compression; `false` can be used
* to explicitly indicate "no compression" in code where the intent should be clear.
* Currently only `"gzip"` is supported.
*/
type ArchiveCompression = "gzip" | boolean;
type ArchiveCompression = "gzip";
/**
* Options for creating an Archive instance.
*
* By default, archives are not compressed. Use `{ compress: "gzip" }` to enable compression.
*
* @example
* ```ts
* // No compression (default)
* new Bun.Archive(data);
*
* // Enable gzip with default level (6)
* new Bun.Archive(data, { compress: "gzip" });
*
* // Specify compression level
* new Bun.Archive(data, { compress: "gzip", level: 9 });
* ```
*/
interface ArchiveOptions {
/**
* Compression algorithm to use.
* Currently only "gzip" is supported.
* If not specified, no compression is applied.
*/
compress?: ArchiveCompression;
/**
* Compression level (1-12). Only applies when `compress` is set.
* - 1: Fastest compression, lowest ratio
* - 6: Default balance of speed and ratio
* - 12: Best compression ratio, slowest
*
* @default 6
*/
level?: number;
}
/**
* Options for extracting archive contents.
@@ -7031,7 +7060,7 @@ declare module "bun" {
* @example
* **Create an archive from an object:**
* ```ts
* const archive = Bun.Archive.from({
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* "data.json": JSON.stringify({ foo: "bar" }),
* "binary.bin": new Uint8Array([1, 2, 3, 4]),
@@ -7039,9 +7068,20 @@ declare module "bun" {
* ```
*
* @example
* **Create a gzipped archive:**
* ```ts
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* }, { compress: "gzip" });
*
* // Or with a specific compression level (1-12)
* const archive = new Bun.Archive(data, { compress: "gzip", level: 9 });
* ```
*
* @example
* **Extract an archive to disk:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const entryCount = await archive.extract("./output");
* console.log(`Extracted ${entryCount} entries`);
* ```
@@ -7049,7 +7089,7 @@ declare module "bun" {
* @example
* **Get archive contents as a Map of File objects:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const entries = await archive.files();
* for (const [path, file] of entries) {
* console.log(path, await file.text());
@@ -7062,36 +7102,50 @@ declare module "bun" {
* await Bun.Archive.write("bundle.tar.gz", {
* "src/index.ts": sourceCode,
* "package.json": packageJson,
* }, "gzip");
* }, { compress: "gzip" });
* ```
*/
export class Archive {
/**
* Create an `Archive` instance from input data.
*
* By default, archives are not compressed. Use `{ compress: "gzip" }` to enable compression.
*
* @param data - The input data for the archive:
* - **Object**: Creates a new tarball with the object's keys as file paths and values as file contents
* - **Blob/TypedArray/ArrayBuffer**: Wraps existing archive data (tar or tar.gz)
*
* @returns A new `Archive` instance
* @param options - Optional archive options including compression settings.
* Defaults to no compression if omitted.
*
* @example
* **From an object (creates new tarball):**
* **From an object (creates uncompressed tarball):**
* ```ts
* const archive = Bun.Archive.from({
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* "nested/file.txt": "Nested content",
* });
* ```
*
* @example
* **With gzip compression:**
* ```ts
* const archive = new Bun.Archive(data, { compress: "gzip" });
* ```
*
* @example
* **With explicit gzip compression level:**
* ```ts
* const archive = new Bun.Archive(data, { compress: "gzip", level: 12 });
* ```
*
* @example
* **From existing archive data:**
* ```ts
* const response = await fetch("https://example.com/package.tar.gz");
* const archive = Bun.Archive.from(await response.blob());
* const archive = new Bun.Archive(await response.blob());
* ```
*/
static from(data: ArchiveInput): Archive;
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Create and write an archive directly to disk in one operation.
@@ -7100,8 +7154,8 @@ declare module "bun" {
* as it streams the data directly to disk.
*
* @param path - The file path to write the archive to
* @param data - The input data for the archive (same as `Archive.from()`)
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* @param data - The input data for the archive (same as `new Archive()`)
* @param options - Optional archive options including compression settings
*
* @returns A promise that resolves when the write is complete
*
@@ -7117,10 +7171,10 @@ declare module "bun" {
* @example
* **Write gzipped tarball:**
* ```ts
* await Bun.Archive.write("output.tar.gz", files, "gzip");
* await Bun.Archive.write("output.tar.gz", files, { compress: "gzip" });
* ```
*/
static write(path: string, data: ArchiveInput | Archive, compress?: ArchiveCompression): Promise<void>;
static write(path: string, data: ArchiveInput | Archive, options?: ArchiveOptions): Promise<void>;
/**
* Extract the archive contents to a directory on disk.
@@ -7136,7 +7190,7 @@ declare module "bun" {
* @example
* **Extract all entries:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const count = await archive.extract("./extracted");
* console.log(`Extracted ${count} entries`);
* ```
@@ -7166,42 +7220,48 @@ declare module "bun" {
/**
* Get the archive contents as a `Blob`.
*
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* Uses the compression settings specified when the Archive was created.
*
* @returns A promise that resolves with the archive data as a Blob
*
* @example
* **Get uncompressed tarball:**
* **Get tarball as Blob:**
* ```ts
* const archive = new Bun.Archive(data);
* const blob = await archive.blob();
* ```
*
* @example
* **Get gzipped tarball:**
* **Get gzipped tarball as Blob:**
* ```ts
* const gzippedBlob = await archive.blob("gzip");
* const archive = new Bun.Archive(data, { compress: "gzip" });
* const gzippedBlob = await archive.blob();
* ```
*/
blob(compress?: ArchiveCompression): Promise<Blob>;
blob(): Promise<Blob>;
/**
* Get the archive contents as a `Uint8Array`.
*
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* Uses the compression settings specified when the Archive was created.
*
* @returns A promise that resolves with the archive data as a Uint8Array
*
* @example
* **Get uncompressed tarball bytes:**
* **Get tarball bytes:**
* ```ts
* const archive = new Bun.Archive(data);
* const bytes = await archive.bytes();
* ```
*
* @example
* **Get gzipped tarball bytes:**
* ```ts
* const gzippedBytes = await archive.bytes("gzip");
* const archive = new Bun.Archive(data, { compress: "gzip" });
* const gzippedBytes = await archive.bytes();
* ```
*/
bytes(compress?: ArchiveCompression): Promise<Uint8Array<ArrayBuffer>>;
bytes(): Promise<Uint8Array<ArrayBuffer>>;
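Since compression is now fixed at construction time, `blob()` and `bytes()` return gzip data whenever the archive was created with `{ compress: "gzip" }`. A quick way to recognize gzip output, sketched with Node's `zlib` as a stand-in (not `Bun.Archive` itself, so it runs anywhere):

```typescript
import { gzipSync } from "node:zlib";

const compressed = gzipSync(Buffer.from("hello, world"));
// Every gzip stream begins with the magic bytes 0x1f 0x8b.
const isGzip = compressed[0] === 0x1f && compressed[1] === 0x8b;
console.log(isGzip); // prints: true
```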
/**
* Get the archive contents as a `Map` of `File` objects.

View File

@@ -11,9 +11,9 @@ declare module "bun" {
* If the file descriptor is not writable yet, the data is buffered.
*
* @param chunk The data to write
* @returns Number of bytes written
* @returns Number of bytes written, or a Promise resolving to the number of bytes written if the write is pending
*/
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number | Promise<number>;
/**
* Flush the internal buffer, committing the data to disk or the pipe.
*
@@ -78,9 +78,9 @@ declare module "bun" {
* If the network is not writable yet, the data is buffered.
*
* @param chunk The data to write
* @returns Number of bytes written
* @returns Number of bytes written, or a Promise resolving to the number of bytes written if the write is pending
*/
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number | Promise<number>;
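With `write()` now typed as `number | Promise<number>`, callers need to handle both shapes. A pattern sketch against a hypothetical sink (a stub, not Bun's `FileSink`):

```typescript
// write() may complete synchronously (number) or be pending (Promise<number>).
type SinkLike = { write(chunk: string): number | Promise<number> };

async function bytesWritten(sink: SinkLike, chunk: string): Promise<number> {
  const result = sink.write(chunk);
  // Awaiting a plain number would also work, but this avoids a needless tick.
  return typeof result === "number" ? result : await result;
}

const syncSink: SinkLike = { write: (c) => c.length };
const asyncSink: SinkLike = { write: (c) => Promise.resolve(c.length) };

const a = await bytesWritten(syncSink, "abc");   // sync path
const b = await bytesWritten(asyncSink, "abcd"); // pending path
console.log(a, b);
```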
/**
* Flush the internal buffer, committing the data to the network.
*
@@ -609,7 +609,17 @@ declare module "bun" {
* });
*/
write(
data: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer | Request | Response | BunFile | S3File | Blob,
data:
| string
| ArrayBufferView
| ArrayBuffer
| SharedArrayBuffer
| Request
| Response
| BunFile
| S3File
| Blob
| Archive,
options?: S3Options,
): Promise<number>;
@@ -920,7 +930,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
| File,
| File
| Archive,
options?: S3Options,
): Promise<number>;
@@ -970,7 +981,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
| File,
| File
| Archive,
options?: S3Options,
): Promise<number>;

View File

@@ -8,10 +8,6 @@ export default [
configurable: false,
JSType: "0b11101110",
klass: {
from: {
fn: "from",
length: 1,
},
write: {
fn: "write",
length: 2,

View File

@@ -5,8 +5,19 @@ pub const toJS = js.toJS;
pub const fromJS = js.fromJS;
pub const fromJSDirect = js.fromJSDirect;
/// Compression options for the archive
pub const Compression = union(enum) {
none,
gzip: struct {
/// Compression level: 1 (fastest) to 12 (maximum compression). Default is 6.
level: u8 = 6,
},
};
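The tagged union above maps naturally onto a TypeScript discriminated union. A sketch (names illustrative, not part of Bun's API) of the shape the `ArchiveOptions` surface resolves to:

```typescript
// TypeScript analog of the Zig Compression union.
type Compression =
  | { kind: "none" }
  | { kind: "gzip"; level: number }; // level: 1 (fastest) .. 12 (max), default 6

function describe(c: Compression): string {
  switch (c.kind) {
    case "none":
      return "uncompressed";
    case "gzip":
      return `gzip level ${c.level}`;
  }
}

console.log(describe({ kind: "gzip", level: 6 }));
```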
/// The underlying data for the archive - uses Blob.Store for thread-safe ref counting
store: *jsc.WebCore.Blob.Store,
/// Compression settings for this archive
compress: Compression = .none,
pub fn finalize(this: *Archive) void {
jsc.markBinding(@src());
@@ -65,47 +76,95 @@ fn countFilesInArchive(data: []const u8) u32 {
return count;
}
/// Constructor: new Archive() - throws an error since users should use Archive.from()
pub fn constructor(globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!*Archive {
return globalThis.throwInvalidArguments("Archive cannot be constructed directly. Use Archive.from() instead.", .{});
}
/// Static method: Archive.from(data)
/// Constructor: new Archive(data, options?)
/// Creates an Archive from either:
/// - An object { [path: string]: Blob | string | ArrayBufferView | ArrayBufferLike }
/// - A Blob, ArrayBufferView, or ArrayBufferLike (assumes it's already a valid archive)
pub fn from(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const arg = callframe.argumentsAsArray(1)[0];
if (arg == .zero) {
return globalThis.throwInvalidArguments("Archive.from requires an argument", .{});
/// Options:
/// - compress: "gzip" - Enable gzip compression
/// - level: number (1-12) - Compression level (default 6)
/// When no options are provided, no compression is applied
pub fn constructor(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!*Archive {
const data_arg, const options_arg = callframe.argumentsAsArray(2);
if (data_arg == .zero) {
return globalThis.throwInvalidArguments("new Archive() requires an argument", .{});
}
// Parse compression options
const compress = try parseCompressionOptions(globalThis, options_arg);
// For Blob/Archive, ref the existing store (zero-copy)
if (arg.as(jsc.WebCore.Blob)) |blob_ptr| {
if (data_arg.as(jsc.WebCore.Blob)) |blob_ptr| {
if (blob_ptr.store) |store| {
store.ref();
return bun.new(Archive, .{ .store = store }).toJS(globalThis);
return bun.new(Archive, .{ .store = store, .compress = compress });
}
}
// For ArrayBuffer/TypedArray, copy the data
if (arg.asArrayBuffer(globalThis)) |array_buffer| {
if (data_arg.asArrayBuffer(globalThis)) |array_buffer| {
const data = try bun.default_allocator.dupe(u8, array_buffer.slice());
return createArchive(globalThis, data);
return createArchive(data, compress);
}
// For plain objects, build a tarball
if (arg.isObject()) {
const data = try buildTarballFromObject(globalThis, arg);
return createArchive(globalThis, data);
if (data_arg.isObject()) {
const data = try buildTarballFromObject(globalThis, data_arg);
return createArchive(data, compress);
}
return globalThis.throwInvalidArguments("Expected an object, Blob, TypedArray, or ArrayBuffer", .{});
}
fn createArchive(globalThis: *jsc.JSGlobalObject, data: []u8) jsc.JSValue {
/// Parse compression options from JS value
/// Returns .none when no compression is specified; the caller decides what .none falls back to
fn parseCompressionOptions(globalThis: *jsc.JSGlobalObject, options_arg: jsc.JSValue) bun.JSError!Compression {
// No options provided means no compression (caller handles defaults)
if (options_arg.isUndefinedOrNull()) {
return .none;
}
if (!options_arg.isObject()) {
return globalThis.throwInvalidArguments("Archive: options must be an object", .{});
}
// Check for compress option
if (try options_arg.getTruthy(globalThis, "compress")) |compress_val| {
// compress must be "gzip"
if (!compress_val.isString()) {
return globalThis.throwInvalidArguments("Archive: compress option must be a string", .{});
}
const compress_str = try compress_val.toSlice(globalThis, bun.default_allocator);
defer compress_str.deinit();
if (!bun.strings.eqlComptime(compress_str.slice(), "gzip")) {
return globalThis.throwInvalidArguments("Archive: compress option must be \"gzip\"", .{});
}
// Parse level option (1-12, default 6)
var level: u8 = 6;
if (try options_arg.getTruthy(globalThis, "level")) |level_val| {
if (!level_val.isNumber()) {
return globalThis.throwInvalidArguments("Archive: level must be a number", .{});
}
const level_num = level_val.toInt64();
if (level_num < 1 or level_num > 12) {
return globalThis.throwInvalidArguments("Archive: level must be between 1 and 12", .{});
}
level = @intCast(level_num);
}
return .{ .gzip = .{ .level = level } };
}
// No compress option specified in options object means no compression
return .none;
}
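The same validation rules, sketched in TypeScript (illustrative only; the real implementation is the Zig above): missing options mean no compression, `compress` must be the string `"gzip"`, and `level` defaults to 6 within 1-12.

```typescript
type Compression = { gzip: false } | { gzip: true; level: number };

function parseCompressionOptions(options?: { compress?: unknown; level?: unknown }): Compression {
  if (options == null) return { gzip: false }; // no options: no compression
  if (typeof options !== "object") throw new TypeError("options must be an object");
  if (!options.compress) return { gzip: false }; // no compress key: no compression
  if (options.compress !== "gzip") throw new TypeError('compress option must be "gzip"');
  let level = 6; // default level
  if (options.level) { // truthy check mirrors getTruthy in the Zig version
    if (typeof options.level !== "number") throw new TypeError("level must be a number");
    if (options.level < 1 || options.level > 12) throw new TypeError("level must be between 1 and 12");
    level = Math.trunc(options.level); // mirrors toInt64's truncation
  }
  return { gzip: true, level };
}
```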
fn createArchive(data: []u8, compress: Compression) *Archive {
const store = jsc.WebCore.Blob.Store.init(data, bun.default_allocator);
return bun.new(Archive, .{ .store = store }).toJS(globalThis);
return bun.new(Archive, .{ .store = store, .compress = compress });
}
/// Shared helper that builds tarball bytes from a JS object
@@ -212,12 +271,15 @@ fn getEntryData(globalThis: *jsc.JSGlobalObject, value: jsc.JSValue, allocator:
return value.toSlice(globalThis, allocator);
}
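For context on what `buildTarballFromObject` has to produce: each entry starts with a 512-byte ustar header whose checksum is computed with the checksum field filled with spaces. A minimal sketch (real tarball writing also pads file content to 512-byte blocks and appends two zero blocks):

```typescript
// Build one POSIX ustar header for a regular file (simplified sketch).
function ustarHeader(name: string, size: number): Uint8Array {
  const header = new Uint8Array(512); // tar headers are 512-byte blocks
  const write = (s: string, offset: number) => {
    for (let i = 0; i < s.length; i++) header[offset + i] = s.charCodeAt(i);
  };
  write(name, 0);                                        // file name (100 bytes)
  write("0000644\0", 100);                               // mode, octal
  write("0000000\0", 108);                               // uid
  write("0000000\0", 116);                               // gid
  write(size.toString(8).padStart(11, "0") + "\0", 124); // size, octal
  write("00000000000\0", 136);                           // mtime
  write("        ", 148);                                // checksum: spaces while summing
  header[156] = "0".charCodeAt(0);                       // typeflag: regular file
  write("ustar\0" + "00", 257);                          // magic + version
  const sum = header.reduce((acc, byte) => acc + byte, 0);
  write(sum.toString(8).padStart(6, "0") + "\0 ", 148);  // final checksum field
  return header;
}

const h = ustarHeader("hello.txt", 13);
console.log(h.length);
```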
/// Static method: Archive.write(path, data, compress?)
/// Creates and writes an archive to disk in one operation
/// Static method: Archive.write(path, data, options?)
/// Creates and writes an archive to disk in one operation.
/// For Archive instances, uses the archive's compression settings unless overridden by options.
/// Options:
/// - compress: "gzip", level?: number (1-12) - Override compression settings
pub fn write(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const path_arg, const data_arg, const compress_arg = callframe.argumentsAsArray(3);
const path_arg, const data_arg, const options_arg = callframe.argumentsAsArray(3);
if (data_arg == .zero) {
return globalThis.throwInvalidArguments("Archive.write requires at least 2 arguments (path, data)", .{});
return globalThis.throwInvalidArguments("Archive.write requires 2 arguments (path, data)", .{});
}
// Get the path
@@ -228,61 +290,37 @@ pub fn write(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSE
const path_slice = try path_arg.toSlice(globalThis, bun.default_allocator);
defer path_slice.deinit();
// Determine compression
const use_gzip = try parseCompressArg(globalThis, compress_arg);
// Parse options for compression override
const options_compress = try parseCompressionOptions(globalThis, options_arg);
// Try to use store reference (zero-copy) for Archive/Blob
// For Archive instances, use options override or archive's compression settings
if (fromJS(data_arg)) |archive| {
return startWriteTask(globalThis, .{ .store = archive.store }, path_slice.slice(), use_gzip);
const compress = if (options_compress != .none) options_compress else archive.compress;
return startWriteTask(globalThis, .{ .store = archive.store }, path_slice.slice(), compress);
}
// For Blobs, use store reference with options compression
if (data_arg.as(jsc.WebCore.Blob)) |blob_ptr| {
if (blob_ptr.store) |store| {
return startWriteTask(globalThis, .{ .store = store }, path_slice.slice(), use_gzip);
return startWriteTask(globalThis, .{ .store = store }, path_slice.slice(), options_compress);
}
}
// Fall back to copying data for ArrayBuffer/TypedArray/objects
const archive_data = try getArchiveData(globalThis, data_arg);
return startWriteTask(globalThis, .{ .owned = archive_data }, path_slice.slice(), use_gzip);
}
/// Get archive data from a value, returning owned bytes
fn getArchiveData(globalThis: *jsc.JSGlobalObject, arg: jsc.JSValue) bun.JSError![]u8 {
// Check if it's a typed array, ArrayBuffer, or similar
if (arg.asArrayBuffer(globalThis)) |array_buffer| {
return bun.default_allocator.dupe(u8, array_buffer.slice());
// For ArrayBuffer/TypedArray, copy the data with options compression
if (data_arg.asArrayBuffer(globalThis)) |array_buffer| {
const data = try bun.default_allocator.dupe(u8, array_buffer.slice());
return startWriteTask(globalThis, .{ .owned = data }, path_slice.slice(), options_compress);
}
// Check if it's an object with entries (plain object) - build tarball
if (arg.isObject()) {
return buildTarballFromObject(globalThis, arg);
// For plain objects, build a tarball with options compression
if (data_arg.isObject()) {
const data = try buildTarballFromObject(globalThis, data_arg);
return startWriteTask(globalThis, .{ .owned = data }, path_slice.slice(), options_compress);
}
return globalThis.throwInvalidArguments("Expected an object, Blob, TypedArray, ArrayBuffer, or Archive", .{});
}
fn parseCompressArg(globalThis: *jsc.JSGlobalObject, arg: jsc.JSValue) bun.JSError!bool {
if (arg.isUndefinedOrNull()) {
return false;
}
if (arg.isBoolean()) {
return arg.toBoolean();
}
if (arg.isString()) {
const str = try arg.toSlice(globalThis, bun.default_allocator);
defer str.deinit();
if (std.mem.eql(u8, str.slice(), "gzip")) {
return true;
}
return globalThis.throwInvalidArguments("Archive: compress argument must be 'gzip', a boolean, or undefined", .{});
}
return globalThis.throwInvalidArguments("Archive: compress argument must be 'gzip', a boolean, or undefined", .{});
}
/// Instance method: archive.extract(path, options?)
/// Extracts the archive to the given path
/// Options:
@@ -379,20 +417,16 @@ fn freePatterns(patterns: []const []const u8) void {
bun.default_allocator.free(patterns);
}
/// Instance method: archive.blob(compress?)
/// Returns Promise<Blob> with the archive data
pub fn blob(this: *Archive, globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const compress_arg = callframe.argumentsAsArray(1)[0];
const use_gzip = try parseCompressArg(globalThis, compress_arg);
return startBlobTask(globalThis, this.store, use_gzip, .blob);
/// Instance method: archive.blob()
/// Returns Promise<Blob> with the archive data (compressed if gzip was set in options)
pub fn blob(this: *Archive, globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!jsc.JSValue {
return startBlobTask(globalThis, this.store, this.compress, .blob);
}
/// Instance method: archive.bytes(compress?)
/// Returns Promise<Uint8Array> with the archive data
pub fn bytes(this: *Archive, globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const compress_arg = callframe.argumentsAsArray(1)[0];
const use_gzip = try parseCompressArg(globalThis, compress_arg);
return startBlobTask(globalThis, this.store, use_gzip, .bytes);
/// Instance method: archive.bytes()
/// Returns Promise<Uint8Array> with the archive data (compressed if gzip was set in options)
pub fn bytes(this: *Archive, globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!jsc.JSValue {
return startBlobTask(globalThis, this.store, this.compress, .bytes);
}
/// Instance method: archive.files(glob?)
@@ -578,15 +612,17 @@ const BlobContext = struct {
};
store: *jsc.WebCore.Blob.Store,
use_gzip: bool,
compress: Compression,
output_type: OutputType,
result: Result = .{ .uncompressed = {} },
fn run(this: *BlobContext) Result {
if (this.use_gzip) {
return .{ .compressed = compressGzip(this.store.sharedView()) catch |e| return .{ .err = e } };
switch (this.compress) {
.gzip => |opts| {
return .{ .compressed = compressGzip(this.store.sharedView(), opts.level) catch |e| return .{ .err = e } };
},
.none => return .{ .uncompressed = {} },
}
return .{ .uncompressed = {} };
}
fn runFromJS(this: *BlobContext, globalThis: *jsc.JSGlobalObject) bun.JSError!PromiseResult {
@@ -617,13 +653,13 @@ const BlobContext = struct {
pub const BlobTask = AsyncTask(BlobContext);
fn startBlobTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Store, use_gzip: bool, output_type: BlobContext.OutputType) bun.JSError!jsc.JSValue {
fn startBlobTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Store, compress: Compression, output_type: BlobContext.OutputType) bun.JSError!jsc.JSValue {
store.ref();
errdefer store.deref();
const task = try BlobTask.create(globalThis, .{
.store = store,
.use_gzip = use_gzip,
.compress = compress,
.output_type = output_type,
});
@@ -646,7 +682,7 @@ const WriteContext = struct {
data: Data,
path: [:0]const u8,
use_gzip: bool,
compress: Compression,
result: Result = .{ .success = {} },
fn run(this: *WriteContext) Result {
@@ -654,11 +690,11 @@ const WriteContext = struct {
.owned => |d| d,
.store => |s| s.sharedView(),
};
const data_to_write = if (this.use_gzip)
compressGzip(source_data) catch |e| return .{ .err = e }
else
source_data;
defer if (this.use_gzip) bun.default_allocator.free(data_to_write);
const data_to_write = switch (this.compress) {
.gzip => |opts| compressGzip(source_data, opts.level) catch |e| return .{ .err = e },
.none => source_data,
};
defer if (this.compress != .none) bun.default_allocator.free(data_to_write);
const file = switch (bun.sys.File.openat(.cwd(), this.path, bun.O.CREAT | bun.O.WRONLY | bun.O.TRUNC, 0o644)) {
.err => |err| return .{ .sys_err = err.clone(bun.default_allocator) },
@@ -699,7 +735,7 @@ fn startWriteTask(
globalThis: *jsc.JSGlobalObject,
data: WriteContext.Data,
path: []const u8,
use_gzip: bool,
compress: Compression,
) bun.JSError!jsc.JSValue {
const path_z = try bun.default_allocator.dupeZ(u8, path);
errdefer bun.default_allocator.free(path_z);
@@ -714,7 +750,7 @@ fn startWriteTask(
const task = try WriteTask.create(globalThis, .{
.data = data,
.path = path_z,
.use_gzip = use_gzip,
.compress = compress,
});
const promise_js = task.promise.value();
@@ -869,10 +905,10 @@ fn startFilesTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Stor
// Helpers
// ============================================================================
fn compressGzip(data: []const u8) ![]u8 {
fn compressGzip(data: []const u8, level: u8) ![]u8 {
libdeflate.load();
const compressor = libdeflate.Compressor.alloc(6) orelse return error.GzipInitFailed;
const compressor = libdeflate.Compressor.alloc(@intCast(level)) orelse return error.GzipInitFailed;
defer compressor.deinit();
const max_size = compressor.maxBytesNeeded(data, .gzip);
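The `level` parameter now flows straight into libdeflate's compressor (levels 1-12). For intuition about what higher levels buy, a stand-in demo with Node's `zlib` (whose levels run 0-9, so the numbers differ from libdeflate's):

```typescript
import { gzipSync } from "node:zlib";

const data = Buffer.from("the quick brown fox ".repeat(500));
const fastest = gzipSync(data, { level: 1 });
const best = gzipSync(data, { level: 9 });
// Higher levels spend more CPU for equal-or-smaller output on typical data.
console.log(best.length <= fastest.length);
```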

View File

@@ -118,6 +118,14 @@ pub fn set_repeat(_: *Self, thisValue: JSValue, globalThis: *JSGlobalObject, val
Self.js.repeatSetCached(thisValue, globalThis, value);
}
pub fn get_idleStart(_: *Self, thisValue: JSValue, _: *JSGlobalObject) JSValue {
return Self.js.idleStartGetCached(thisValue).?;
}
pub fn set_idleStart(_: *Self, thisValue: JSValue, globalThis: *JSGlobalObject, value: JSValue) void {
Self.js.idleStartSetCached(thisValue, globalThis, value);
}
pub fn dispose(self: *Self, globalThis: *JSGlobalObject, _: *jsc.CallFrame) bun.JSError!JSValue {
self.internals.cancel(globalThis.bunVM());
return .js_undefined;

View File

@@ -242,7 +242,7 @@ fn convertToInterval(this: *TimerObjectInternals, global: *JSGlobalObject, timer
this.strong_this.set(global, timer);
this.flags.kind = .setInterval;
this.interval = new_interval;
this.reschedule(timer, vm);
this.reschedule(timer, vm, global);
}
pub fn run(this: *TimerObjectInternals, globalThis: *jsc.JSGlobalObject, timer: JSValue, callback: JSValue, arguments: JSValue, async_id: u64, vm: *jsc.VirtualMachine) bool {
@@ -293,8 +293,8 @@ pub fn init(
TimeoutObject.js.idleTimeoutSetCached(timer, global, .jsNumber(interval));
TimeoutObject.js.repeatSetCached(timer, global, if (kind == .setInterval) .jsNumber(interval) else .null);
// this increments the refcount
this.reschedule(timer, vm);
// this increments the refcount and sets _idleStart
this.reschedule(timer, vm, global);
}
this.strong_this.set(global, timer);
@@ -328,7 +328,7 @@ pub fn doRefresh(this: *TimerObjectInternals, globalObject: *jsc.JSGlobalObject,
}
this.strong_this.set(globalObject, this_value);
this.reschedule(this_value, VirtualMachine.get());
this.reschedule(this_value, VirtualMachine.get(), globalObject);
return this_value;
}
@@ -371,7 +371,7 @@ fn shouldRescheduleTimer(this: *TimerObjectInternals, repeat: JSValue, idle_time
return true;
}
pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachine) void {
pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachine, globalThis: *JSGlobalObject) void {
if (this.flags.kind == .setImmediate) return;
const idle_timeout = TimeoutObject.js.idleTimeoutGetCached(timer).?;
@@ -380,7 +380,8 @@ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachi
// https://github.com/nodejs/node/blob/a7cbb904745591c9a9d047a364c2c188e5470047/lib/internal/timers.js#L612
if (!this.shouldRescheduleTimer(repeat, idle_timeout)) return;
const now = timespec.msFromNow(.allow_mocked_time, this.interval);
const now = timespec.now(.allow_mocked_time);
const scheduled_time = now.addMs(this.interval);
const was_active = this.eventLoopTimer().state == .ACTIVE;
if (was_active) {
vm.timer.remove(this.eventLoopTimer());
@@ -388,9 +389,13 @@ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachi
this.ref();
}
vm.timer.update(this.eventLoopTimer(), &now);
vm.timer.update(this.eventLoopTimer(), &scheduled_time);
this.flags.has_cleared_timer = false;
// Set _idleStart to the current monotonic timestamp in milliseconds
// This mimics Node.js's behavior where _idleStart is the libuv timestamp when the timer was scheduled
TimeoutObject.js.idleStartSetCached(timer, globalThis, .jsNumber(now.msUnsigned()));
if (this.flags.has_js_ref) {
this.setEnableKeepingEventLoopAlive(vm, true);
}
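The scheduling arithmetic introduced here is small but easy to misread: the timer is armed for `now + interval`, while `_idleStart` records `now` itself (both in milliseconds on a monotonic clock). A sketch:

```typescript
// Sketch of the reschedule() arithmetic above.
function reschedule(nowMs: number, intervalMs: number) {
  return {
    idleStart: nowMs,            // stored into _idleStart
    firesAt: nowMs + intervalMs, // scheduled_time passed to the timer wheel
  };
}

const t = reschedule(1000, 250);
console.log(t.idleStart, t.firesAt);
```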

View File

@@ -1896,6 +1896,9 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
}
this.ref();
byte_stream.pipe = jsc.WebCore.Pipe.Wrap(@This(), onPipe).init(this);
// Deinit the old Strong reference before creating a new one
// to avoid leaking the Strong.Impl memory
this.response_body_readable_stream_ref.deinit();
this.response_body_readable_stream_ref = jsc.WebCore.ReadableStream.Strong.init(stream, globalThis);
this.byte_stream = byte_stream;

View File

@@ -36,7 +36,7 @@ namespace WebCore {
class JSHeapData;
class DOMGCOutputConstraint : public JSC::MarkingConstraint {
WTF_DEPRECATED_MAKE_FAST_ALLOCATED(DOMGCOutputConstraint);
WTF_DEPRECATED_MAKE_FAST_ALLOCATED(DOMEGCOutputConstraint);
public:
DOMGCOutputConstraint(JSC::VM&, JSHeapData&);

View File

@@ -730,18 +730,13 @@ JSC::ScriptExecutionStatus Zig::GlobalObject::scriptExecutionStatus(JSC::JSGloba
void unsafeEvalNoop(JSGlobalObject*, const WTF::String&) {}
static void queueMicrotaskToEventLoop(JSGlobalObject& globalObject, QueuedTask&& task)
{
globalObject.vm().queueMicrotask(WTF::move(task));
}
const JSC::GlobalObjectMethodTable& GlobalObject::globalObjectMethodTable()
{
static const JSC::GlobalObjectMethodTable table = {
&supportsRichSourceInfo,
&shouldInterruptScript,
&javaScriptRuntimeFlags,
&queueMicrotaskToEventLoop,
nullptr, // &queueMicrotaskToEventLoop, // queueTaskToEventLoop
nullptr, // &shouldInterruptScriptBeforeTimeout,
&moduleLoaderImportModule, // moduleLoaderImportModule
&moduleLoaderResolve, // moduleLoaderResolve
@@ -770,7 +765,8 @@ const JSC::GlobalObjectMethodTable& EvalGlobalObject::globalObjectMethodTable()
&supportsRichSourceInfo,
&shouldInterruptScript,
&javaScriptRuntimeFlags,
&queueMicrotaskToEventLoop,
// &queueMicrotaskToEventLoop, // queueTaskToEventLoop
nullptr,
nullptr, // &shouldInterruptScriptBeforeTimeout,
&moduleLoaderImportModule, // moduleLoaderImportModule
&moduleLoaderResolve, // moduleLoaderResolve
@@ -1076,7 +1072,7 @@ JSC_DEFINE_HOST_FUNCTION(functionQueueMicrotask,
// BunPerformMicrotaskJob accepts a variable number of arguments (up to: performMicrotask, job, asyncContext, arg0, arg1).
// The runtime inspects argumentCount to determine which arguments are present, so callers may pass only the subset they need.
// Here we pass: function, callback, asyncContext.
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunPerformMicrotaskJob, 0, globalObject, function, callback, asyncContext };
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunPerformMicrotaskJob, globalObject, function, callback, asyncContext };
globalObject->vm().queueMicrotask(WTF::move(task));
return JSC::JSValue::encode(JSC::jsUndefined());
@@ -3107,7 +3103,7 @@ extern "C" void JSC__JSGlobalObject__queueMicrotaskCallback(Zig::GlobalObject* g
// Do not use JSCell* here because the GC will try to visit it.
// Use BunInvokeJobWithArguments to pass the two arguments (ptr and callback) to the trampoline function
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunInvokeJobWithArguments, 0, globalObject, function, JSValue(std::bit_cast<double>(reinterpret_cast<uintptr_t>(ptr))), JSValue(std::bit_cast<double>(reinterpret_cast<uintptr_t>(callback))) };
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunInvokeJobWithArguments, globalObject, function, JSValue(std::bit_cast<double>(reinterpret_cast<uintptr_t>(ptr))), JSValue(std::bit_cast<double>(reinterpret_cast<uintptr_t>(callback))) };
globalObject->vm().queueMicrotask(WTF::move(task));
}

View File

@@ -3540,7 +3540,7 @@ void JSC__JSPromise__rejectOnNextTickWithHandled(JSC::JSPromise* promise, JSC::J
value = jsUndefined();
}
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunPerformMicrotaskJob, 0, globalObject, microtaskFunction, rejectPromiseFunction, globalObject->m_asyncContextData.get()->getInternalField(0), promise, value };
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunPerformMicrotaskJob, globalObject, microtaskFunction, rejectPromiseFunction, globalObject->m_asyncContextData.get()->getInternalField(0), promise, value };
globalObject->vm().queueMicrotask(WTF::move(task));
RETURN_IF_EXCEPTION(scope, );
}
@@ -4989,34 +4989,24 @@ static void JSC__JSValue__forEachPropertyImpl(JSC::EncodedJSValue JSValue0, JSC:
return;
}
size_t prototypeCount = 0;
auto scope = DECLARE_CATCH_SCOPE(vm);
JSC::Structure* structure = object->structure();
bool fast = !nonIndexedOnly && canPerformFastPropertyEnumerationForIterationBun(structure);
JSValue prototypeObject = value;
if (fast) {
if (structure->outOfLineSize() == 0 && structure->inlineSize() == 0) {
// Object has no own properties - don't fall back to prototype properties.
// console.log should only show own enumerable properties, not inherited ones.
fast = false;
if (JSValue proto = object->getPrototype(globalObject)) {
if ((structure = proto.structureOrNull())) {
prototypeObject = proto;
fast = canPerformFastPropertyEnumerationForIterationBun(structure);
prototypeCount = 1;
}
}
}
}
auto* propertyNames = vm.propertyNames;
auto& builtinNames = WebCore::builtinNames(vm);
WTF::Vector<Identifier, 6> visitedProperties;
restart:
if (fast) {
bool anyHits = false;
JSC::JSObject* objectToUse = prototypeObject.getObject();
structure->forEachProperty(vm, [&](const PropertyTableEntry& entry) -> bool {
if ((entry.attributes() & (PropertyAttribute::Function)) == 0 && (entry.attributes() & (PropertyAttribute::Builtin)) != 0) {
return true;
@@ -5025,7 +5015,7 @@ restart:
if (prop == propertyNames->constructor
|| prop == propertyNames->underscoreProto
|| prop == propertyNames->toStringTagSymbol || (objectToUse != object && prop == propertyNames->__esModule))
|| prop == propertyNames->toStringTagSymbol)
return true;
if (builtinNames.bunNativePtrPrivateName() == prop)
@@ -5037,18 +5027,14 @@ restart:
visitedProperties.append(Identifier::fromUid(vm, prop));
ZigString key = toZigString(prop);
JSC::JSValue propertyValue = JSValue();
if (objectToUse == object) {
propertyValue = objectToUse->getDirect(entry.offset());
if (!propertyValue) {
scope.clearException();
return true;
}
JSC::JSValue propertyValue = object->getDirect(entry.offset());
if (!propertyValue) {
scope.clearException();
return true;
}
if (!propertyValue || propertyValue.isGetterSetter() && !((entry.attributes() & PropertyAttribute::Accessor) != 0)) {
propertyValue = objectToUse->getIfPropertyExists(globalObject, prop);
if (propertyValue.isGetterSetter() && !((entry.attributes() & PropertyAttribute::Accessor) != 0)) {
propertyValue = object->getIfPropertyExists(globalObject, prop);
}
// Ignore exceptions due to getters.
@@ -5074,134 +5060,106 @@ restart:
// Propagate exceptions from callbacks.
RETURN_IF_EXCEPTION(scope, );
// Only iterate own properties - do not walk up the prototype chain.
if (anyHits) {
if (prototypeCount++ < 5) {
if (JSValue proto = prototypeObject.getPrototype(globalObject)) {
if (!(proto == globalObject->objectPrototype() || proto == globalObject->functionPrototype() || (proto.inherits<JSGlobalProxy>() && jsCast<JSGlobalProxy*>(proto)->target() != globalObject))) {
if ((structure = proto.structureOrNull())) {
prototypeObject = proto;
fast = canPerformFastPropertyEnumerationForIterationBun(structure);
goto restart;
}
}
}
// Ignore exceptions from Proxy "getPrototype" trap.
CLEAR_IF_EXCEPTION(scope);
}
return;
}
}
// Slow path: iterate only own properties of the original object.
// Do not walk up the prototype chain - console.log should only show own enumerable properties.
JSC::PropertyNameArrayBuilder properties(vm, PropertyNameMode::StringsAndSymbols, PrivateSymbolMode::Exclude);
{
if constexpr (nonIndexedOnly) {
object->getOwnNonIndexPropertyNames(globalObject, properties, DontEnumPropertiesMode::Include);
} else {
object->methodTable()->getOwnPropertyNames(object, globalObject, properties, DontEnumPropertiesMode::Include);
}
JSObject* iterating = prototypeObject.getObject();
RETURN_IF_EXCEPTION(scope, void());
while (iterating && !(iterating == globalObject->objectPrototype() || iterating == globalObject->functionPrototype() || (iterating->inherits<JSGlobalProxy>() && jsCast<JSGlobalProxy*>(iterating)->target() != globalObject)) && prototypeCount++ < 5) {
if constexpr (nonIndexedOnly) {
iterating->getOwnNonIndexPropertyNames(globalObject, properties, DontEnumPropertiesMode::Include);
} else {
iterating->methodTable()->getOwnPropertyNames(iterating, globalObject, properties, DontEnumPropertiesMode::Include);
for (auto& property : properties) {
if (property.isEmpty() || property.isNull()) [[unlikely]]
continue;
// ignore constructor
if (property == propertyNames->constructor || builtinNames.bunNativePtrPrivateName() == property)
continue;
if constexpr (nonIndexedOnly) {
if (property == propertyNames->length) {
continue;
}
RETURN_IF_EXCEPTION(scope, void());
for (auto& property : properties) {
if (property.isEmpty() || property.isNull()) [[unlikely]]
continue;
// ignore constructor
if (property == propertyNames->constructor || builtinNames.bunNativePtrPrivateName() == property)
continue;
if constexpr (nonIndexedOnly) {
if (property == propertyNames->length) {
continue;
}
}
JSC::PropertySlot slot(object, PropertySlot::InternalMethodType::Get);
if (!object->getPropertySlot(globalObject, property, slot))
continue;
// Ignore exceptions from "Get" proxy traps.
CLEAR_IF_EXCEPTION(scope);
if ((slot.attributes() & PropertyAttribute::DontEnum) != 0) {
if (property == propertyNames->underscoreProto
|| property == propertyNames->toStringTagSymbol || property == propertyNames->__esModule)
continue;
}
if (visitedProperties.contains(property))
continue;
visitedProperties.append(property);
ZigString key = toZigString(property.isSymbol() && !property.isPrivateName() ? property.impl() : property.string());
if (key.len == 0)
continue;
JSC::JSValue propertyValue = jsUndefined();
if ((slot.attributes() & PropertyAttribute::DontEnum) != 0) {
if ((slot.attributes() & PropertyAttribute::Accessor) != 0) {
// If we can't use getPureResult, let's at least say it was a [Getter]
if (!slot.isCacheableGetter()) {
propertyValue = slot.getterSetter();
} else {
propertyValue = slot.getPureResult();
}
} else if (slot.attributes() & PropertyAttribute::BuiltinOrFunction) {
propertyValue = slot.getValue(globalObject, property);
} else if (slot.isCustom()) {
propertyValue = slot.getValue(globalObject, property);
} else if (slot.isValue()) {
propertyValue = slot.getValue(globalObject, property);
} else if (object->getOwnPropertySlot(object, globalObject, property, slot)) {
propertyValue = slot.getValue(globalObject, property);
}
} else if (slot.isAccessor()) {
// If we can't use getPureResult, let's at least say it was a [Getter]
if (!slot.isCacheableGetter()) {
propertyValue = slot.getterSetter();
} else {
propertyValue = slot.getPureResult();
}
} else {
propertyValue = slot.getValue(globalObject, property);
}
// Ignore exceptions from getters.
if (scope.exception()) [[unlikely]] {
scope.clearException();
propertyValue = jsUndefined();
}
JSC::EnsureStillAliveScope ensureStillAliveScope(propertyValue);
bool isPrivate = property.isPrivateName();
if (isPrivate && !JSC::Options::showPrivateScriptsInStackTraces())
continue;
iter(globalObject, arg2, &key, JSC::JSValue::encode(propertyValue), property.isSymbol(), isPrivate);
// Propagate exceptions from callbacks.
RETURN_IF_EXCEPTION(scope, void());
}
if constexpr (nonIndexedOnly) {
break;
}
// reuse memory
properties.data()->propertyNameVector().shrink(0);
if (iterating->isCallable())
break;
if (iterating == globalObject)
break;
iterating = iterating->getPrototype(globalObject).getObject();
}
JSC::PropertySlot slot(object, PropertySlot::InternalMethodType::Get);
if (!object->getPropertySlot(globalObject, property, slot))
continue;
// Ignore exceptions from "Get" proxy traps.
CLEAR_IF_EXCEPTION(scope);
if ((slot.attributes() & PropertyAttribute::DontEnum) != 0) {
if (property == propertyNames->underscoreProto
|| property == propertyNames->toStringTagSymbol || property == propertyNames->__esModule)
continue;
}
if (visitedProperties.contains(property))
continue;
visitedProperties.append(property);
ZigString key = toZigString(property.isSymbol() && !property.isPrivateName() ? property.impl() : property.string());
if (key.len == 0)
continue;
JSC::JSValue propertyValue = jsUndefined();
if ((slot.attributes() & PropertyAttribute::DontEnum) != 0) {
if ((slot.attributes() & PropertyAttribute::Accessor) != 0) {
// If we can't use getPureResult, let's at least say it was a [Getter]
if (!slot.isCacheableGetter()) {
propertyValue = slot.getterSetter();
} else {
propertyValue = slot.getPureResult();
}
} else if (slot.attributes() & PropertyAttribute::BuiltinOrFunction) {
propertyValue = slot.getValue(globalObject, property);
} else if (slot.isCustom()) {
propertyValue = slot.getValue(globalObject, property);
} else if (slot.isValue()) {
propertyValue = slot.getValue(globalObject, property);
} else if (object->getOwnPropertySlot(object, globalObject, property, slot)) {
propertyValue = slot.getValue(globalObject, property);
}
} else if (slot.isAccessor()) {
// If we can't use getPureResult, let's at least say it was a [Getter]
if (!slot.isCacheableGetter()) {
propertyValue = slot.getterSetter();
} else {
propertyValue = slot.getPureResult();
}
} else {
propertyValue = slot.getValue(globalObject, property);
}
// Ignore exceptions from getters.
if (scope.exception()) [[unlikely]] {
scope.clearException();
propertyValue = jsUndefined();
}
JSC::EnsureStillAliveScope ensureStillAliveScope(propertyValue);
bool isPrivate = property.isPrivateName();
if (isPrivate && !JSC::Options::showPrivateScriptsInStackTraces())
continue;
iter(globalObject, arg2, &key, JSC::JSValue::encode(propertyValue), property.isSymbol(), isPrivate);
// Propagate exceptions from callbacks.
RETURN_IF_EXCEPTION(scope, void());
}
properties.releaseData();
@@ -5428,7 +5386,7 @@ extern "C" void JSC__JSGlobalObject__queueMicrotaskJob(JSC::JSGlobalObject* arg0
#endif
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunPerformMicrotaskJob, 0, globalObject, microTaskFunction, WTF::move(microtaskArgs[0]), WTF::move(microtaskArgs[1]), WTF::move(microtaskArgs[2]), WTF::move(microtaskArgs[3]) };
JSC::QueuedTask task { nullptr, JSC::InternalMicrotask::BunPerformMicrotaskJob, globalObject, microTaskFunction, WTF::move(microtaskArgs[0]), WTF::move(microtaskArgs[1]), WTF::move(microtaskArgs[2]), WTF::move(microtaskArgs[3]) };
globalObject->vm().queueMicrotask(WTF::move(task));
}
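The own-property rule the comments above enforce can be observed from plain JavaScript (a minimal sketch of the behavior the console.log fix targets, not the binding code itself):

```typescript
// Objects created with Object.create() keep inherited properties on the
// prototype; only own enumerable properties should appear in console.log.
const proto = { inherited: 1 };
const obj: Record<string, number> = Object.create(proto);
obj.own = 2;

// Own enumerable keys only -- this is what console.log should display.
const shown = Object.keys(obj);
// The inherited property is still reachable, just not displayed.
const stillReachable = obj.inherited;
```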


@@ -73,7 +73,7 @@ extern "C" bool is_executable_file(const char* path)
{
#if defined(O_EXEC)
// O_EXEC is macOS specific
int fd = open(path, O_EXEC | O_CLOEXEC, 0);
int fd = open(path, O_EXEC | O_CLOEXEC | O_NONBLOCK | O_NOCTTY, 0);
if (fd < 0)
return false;
close(fd);
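The same flag combination is visible from Node's `fs.constants` (a sketch on POSIX assumptions; the path and flags mirror the C change above, not Bun's implementation):

```typescript
import fs from "node:fs";

// O_NONBLOCK keeps open() from blocking on a FIFO that has no writer yet;
// O_NOCTTY keeps a terminal device from becoming the process's
// controlling terminal as a side effect of the probe.
const { O_RDONLY, O_NONBLOCK, O_NOCTTY } = fs.constants;
const fd = fs.openSync("/dev/null", O_RDONLY | O_NONBLOCK | O_NOCTTY);
fs.closeSync(fd);
```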


@@ -11,7 +11,7 @@ JSC_DECLARE_HOST_FUNCTION(jsVerifyOneShot);
static const unsigned int NoDsaSignature = static_cast<unsigned int>(-1);
struct SignJobCtx {
WTF_MAKE_TZONE_ALLOCATED(SignJobCtx);
WTF_MAKE_TZONE_ALLOCATED(name);
public:
enum class Mode {


@@ -1,7 +1,7 @@
// clang-format off
/******************************************************************************
** This file is an amalgamation of many separate C source files from SQLite
** version 3.51.1. By combining all the individual C code files into this
** version 3.51.2. By combining all the individual C code files into this
** single large file, the entire code can be compiled as a single translation
** unit. This allows many compilers to do optimizations that would not be
** possible if the files were compiled separately. Performance improvements
@@ -19,7 +19,7 @@
** separate file. This file contains only code for the core SQLite library.
**
** The content in this amalgamation comes from Fossil check-in
** 281fc0e9afc38674b9b0991943b9e9d1e64c with changes in files:
** b270f8339eb13b504d0b2ba154ebca966b7d with changes in files:
**
**
*/
@@ -469,12 +469,12 @@ extern "C" {
** [sqlite3_libversion_number()], [sqlite3_sourceid()],
** [sqlite_version()] and [sqlite_source_id()].
*/
#define SQLITE_VERSION "3.51.1"
#define SQLITE_VERSION_NUMBER 3051001
#define SQLITE_SOURCE_ID "2025-11-28 17:28:25 281fc0e9afc38674b9b0991943b9e9d1e64c6cbdb133d35f6f5c87ff6af38a88"
#define SQLITE_VERSION "3.51.2"
#define SQLITE_VERSION_NUMBER 3051002
#define SQLITE_SOURCE_ID "2026-01-09 17:27:48 b270f8339eb13b504d0b2ba154ebca966b7dde08e40c3ed7d559749818cb2075"
#define SQLITE_SCM_BRANCH "branch-3.51"
#define SQLITE_SCM_TAGS "release version-3.51.1"
#define SQLITE_SCM_DATETIME "2025-11-28T17:28:25.933Z"
#define SQLITE_SCM_TAGS "release version-3.51.2"
#define SQLITE_SCM_DATETIME "2026-01-09T17:27:48.405Z"
/*
** CAPI3REF: Run-Time Library Version Numbers
@@ -41230,12 +41230,18 @@ static int unixLock(sqlite3_file *id, int eFileLock){
pInode->nLock++;
pInode->nShared = 1;
}
}else if( (eFileLock==EXCLUSIVE_LOCK && pInode->nShared>1)
|| unixIsSharingShmNode(pFile)
){
}else if( eFileLock==EXCLUSIVE_LOCK && pInode->nShared>1 ){
/* We are trying for an exclusive lock but another thread in this
** same process is still holding a shared lock. */
rc = SQLITE_BUSY;
}else if( unixIsSharingShmNode(pFile) ){
/* We are in WAL mode and attempting to delete the SHM and WAL
** files due to closing the connection or changing out of WAL mode,
** but another process still holds locks on the SHM file, thus
** indicating that database locks have been broken, perhaps due
** to a rogue close(open(dbFile)) or similar.
*/
rc = SQLITE_BUSY;
}else{
/* The request was for a RESERVED or EXCLUSIVE lock. It is
** assumed that there is a SHARED or greater lock on the file
@@ -43874,26 +43880,21 @@ static int unixFcntlExternalReader(unixFile *pFile, int *piOut){
** still not a disaster.
*/
static int unixIsSharingShmNode(unixFile *pFile){
int rc;
unixShmNode *pShmNode;
struct flock lock;
if( pFile->pShm==0 ) return 0;
if( pFile->ctrlFlags & UNIXFILE_EXCL ) return 0;
pShmNode = pFile->pShm->pShmNode;
rc = 1;
unixEnterMutex();
if( ALWAYS(pShmNode->nRef==1) ){
struct flock lock;
lock.l_whence = SEEK_SET;
lock.l_start = UNIX_SHM_DMS;
lock.l_len = 1;
lock.l_type = F_WRLCK;
osFcntl(pShmNode->hShm, F_GETLK, &lock);
if( lock.l_type==F_UNLCK ){
rc = 0;
}
}
unixLeaveMutex();
return rc;
#if SQLITE_ATOMIC_INTRINSICS
assert( AtomicLoad(&pShmNode->nRef)==1 );
#endif
memset(&lock, 0, sizeof(lock));
lock.l_whence = SEEK_SET;
lock.l_start = UNIX_SHM_DMS;
lock.l_len = 1;
lock.l_type = F_WRLCK;
osFcntl(pShmNode->hShm, F_GETLK, &lock);
return (lock.l_type!=F_UNLCK);
}
/*
@@ -115318,9 +115319,22 @@ SQLITE_PRIVATE int sqlite3CodeSubselect(Parse *pParse, Expr *pExpr){
pParse->nMem += nReg;
if( pExpr->op==TK_SELECT ){
dest.eDest = SRT_Mem;
dest.iSdst = dest.iSDParm;
if( (pSel->selFlags&SF_Distinct) && pSel->pLimit && pSel->pLimit->pRight ){
/* If there is both a DISTINCT and an OFFSET clause, then allocate
** a separate dest.iSdst array for sqlite3Select() and other
** routines to populate. In this case results will be copied over
** into the dest.iSDParm array only after OFFSET processing. This
** ensures that in the case where OFFSET excludes all rows, the
** dest.iSDParm array is not left populated with the contents of the
** last row visited - it should be all NULLs if all rows were
** excluded by OFFSET. */
dest.iSdst = pParse->nMem+1;
pParse->nMem += nReg;
}else{
dest.iSdst = dest.iSDParm;
}
dest.nSdst = nReg;
sqlite3VdbeAddOp3(v, OP_Null, 0, dest.iSDParm, dest.iSDParm+nReg-1);
sqlite3VdbeAddOp3(v, OP_Null, 0, dest.iSDParm, pParse->nMem);
VdbeComment((v, "Init subquery result"));
}else{
dest.eDest = SRT_Exists;
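The NULL-on-empty requirement spelled out in the comment can be sketched in plain TypeScript (a semantic illustration of the query behavior, not the VDBE codegen):

```typescript
// A scalar subquery with DISTINCT and OFFSET must yield NULL (modeled
// here as undefined) when OFFSET skips every row -- not the contents of
// the last row visited.
function scalarSubquery(rows: number[], offset: number): number | undefined {
  const distinct = [...new Set(rows)]; // DISTINCT, insertion order
  return distinct.slice(offset)[0];    // OFFSET, then take one value
}
```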
@@ -148188,9 +148202,14 @@ static void selectInnerLoop(
assert( nResultCol<=pDest->nSdst );
pushOntoSorter(
pParse, pSort, p, regResult, regOrig, nResultCol, nPrefixReg);
pDest->iSDParm = regResult;
}else{
assert( nResultCol==pDest->nSdst );
assert( regResult==iParm );
if( regResult!=iParm ){
/* This occurs in cases where the SELECT had both a DISTINCT and
** an OFFSET clause. */
sqlite3VdbeAddOp3(v, OP_Copy, regResult, iParm, nResultCol-1);
}
/* The LIMIT clause will jump out of the loop for us */
}
break;
@@ -154205,12 +154224,24 @@ static SQLITE_NOINLINE void existsToJoin(
&& (pSub->selFlags & SF_Aggregate)==0
&& !pSub->pSrc->a[0].fg.isSubquery
&& pSub->pLimit==0
&& pSub->pPrior==0
){
/* Before combining the sub-select with the parent, renumber the
** cursor used by the subselect. This is because the EXISTS expression
** might be a copy of another EXISTS expression from somewhere
** else in the tree, and in this case it is important that it use
** a unique cursor number. */
sqlite3 *db = pParse->db;
int *aCsrMap = sqlite3DbMallocZero(db, (pParse->nTab+2)*sizeof(int));
if( aCsrMap==0 ) return;
aCsrMap[0] = (pParse->nTab+1);
renumberCursors(pParse, pSub, -1, aCsrMap);
sqlite3DbFree(db, aCsrMap);
memset(pWhere, 0, sizeof(*pWhere));
pWhere->op = TK_INTEGER;
pWhere->u.iValue = 1;
ExprSetProperty(pWhere, EP_IntValue);
assert( p->pWhere!=0 );
pSub->pSrc->a[0].fg.fromExists = 1;
pSub->pSrc->a[0].fg.jointype |= JT_CROSS;
@@ -174003,6 +174034,9 @@ SQLITE_PRIVATE void sqlite3WhereEnd(WhereInfo *pWInfo){
sqlite3 *db = pParse->db;
int iEnd = sqlite3VdbeCurrentAddr(v);
int nRJ = 0;
#ifndef SQLITE_DISABLE_SKIPAHEAD_DISTINCT
int addrSeek = 0;
#endif
/* Generate loop termination code.
*/
@@ -174015,7 +174049,10 @@ SQLITE_PRIVATE void sqlite3WhereEnd(WhereInfo *pWInfo){
** the RIGHT JOIN table */
WhereRightJoin *pRJ = pLevel->pRJ;
sqlite3VdbeResolveLabel(v, pLevel->addrCont);
pLevel->addrCont = 0;
/* Replace addrCont with a new label that will never be used, just so
** the subsequent call to resolve pLevel->addrCont will have something
** to resolve. */
pLevel->addrCont = sqlite3VdbeMakeLabel(pParse);
pRJ->endSubrtn = sqlite3VdbeCurrentAddr(v);
sqlite3VdbeAddOp3(v, OP_Return, pRJ->regReturn, pRJ->addrSubrtn, 1);
VdbeCoverage(v);
@@ -174024,7 +174061,6 @@ SQLITE_PRIVATE void sqlite3WhereEnd(WhereInfo *pWInfo){
pLoop = pLevel->pWLoop;
if( pLevel->op!=OP_Noop ){
#ifndef SQLITE_DISABLE_SKIPAHEAD_DISTINCT
int addrSeek = 0;
Index *pIdx;
int n;
if( pWInfo->eDistinct==WHERE_DISTINCT_ORDERED
@@ -174047,25 +174083,26 @@ SQLITE_PRIVATE void sqlite3WhereEnd(WhereInfo *pWInfo){
sqlite3VdbeAddOp2(v, OP_Goto, 1, pLevel->p2);
}
#endif /* SQLITE_DISABLE_SKIPAHEAD_DISTINCT */
if( pTabList->a[pLevel->iFrom].fg.fromExists && i==pWInfo->nLevel-1 ){
/* If the EXISTS-to-JOIN optimization was applied, then the EXISTS
** loop(s) will be the inner-most loops of the join. There might be
** multiple EXISTS loops, but they will all be nested, and the join
** order will not have been changed by the query planner. If the
** inner-most EXISTS loop sees a single successful row, it should
** break out of *all* EXISTS loops. But only the inner-most of the
** nested EXISTS loops should do this breakout. */
int nOuter = 0; /* Nr of outer EXISTS that this one is nested within */
while( nOuter<i ){
if( !pTabList->a[pLevel[-nOuter-1].iFrom].fg.fromExists ) break;
nOuter++;
}
testcase( nOuter>0 );
sqlite3VdbeAddOp2(v, OP_Goto, 0, pLevel[-nOuter].addrBrk);
VdbeComment((v, "EXISTS break"));
}
if( pTabList->a[pLevel->iFrom].fg.fromExists && i==pWInfo->nLevel-1 ){
/* If the EXISTS-to-JOIN optimization was applied, then the EXISTS
** loop(s) will be the inner-most loops of the join. There might be
** multiple EXISTS loops, but they will all be nested, and the join
** order will not have been changed by the query planner. If the
** inner-most EXISTS loop sees a single successful row, it should
** break out of *all* EXISTS loops. But only the inner-most of the
** nested EXISTS loops should do this breakout. */
int nOuter = 0; /* Nr of outer EXISTS that this one is nested within */
while( nOuter<i ){
if( !pTabList->a[pLevel[-nOuter-1].iFrom].fg.fromExists ) break;
nOuter++;
}
/* The common case: Advance to the next row */
if( pLevel->addrCont ) sqlite3VdbeResolveLabel(v, pLevel->addrCont);
testcase( nOuter>0 );
sqlite3VdbeAddOp2(v, OP_Goto, 0, pLevel[-nOuter].addrBrk);
VdbeComment((v, "EXISTS break"));
}
sqlite3VdbeResolveLabel(v, pLevel->addrCont);
if( pLevel->op!=OP_Noop ){
sqlite3VdbeAddOp3(v, pLevel->op, pLevel->p1, pLevel->p2, pLevel->p3);
sqlite3VdbeChangeP5(v, pLevel->p5);
VdbeCoverage(v);
@@ -174078,10 +174115,11 @@ SQLITE_PRIVATE void sqlite3WhereEnd(WhereInfo *pWInfo){
VdbeCoverage(v);
}
#ifndef SQLITE_DISABLE_SKIPAHEAD_DISTINCT
if( addrSeek ) sqlite3VdbeJumpHere(v, addrSeek);
if( addrSeek ){
sqlite3VdbeJumpHere(v, addrSeek);
addrSeek = 0;
}
#endif
}else if( pLevel->addrCont ){
sqlite3VdbeResolveLabel(v, pLevel->addrCont);
}
if( (pLoop->wsFlags & WHERE_IN_ABLE)!=0 && pLevel->u.in.nIn>0 ){
struct InLoop *pIn;
@@ -219471,7 +219509,7 @@ static void rtreenode(sqlite3_context *ctx, int nArg, sqlite3_value **apArg){
if( node.zData==0 ) return;
nData = sqlite3_value_bytes(apArg[1]);
if( nData<4 ) return;
if( nData<NCELL(&node)*tree.nBytesPerCell ) return;
if( nData<4+NCELL(&node)*tree.nBytesPerCell ) return;
pOut = sqlite3_str_new(0);
for(ii=0; ii<NCELL(&node); ii++){
@@ -238552,7 +238590,13 @@ typedef sqlite3_uint64 u64;
# define FLEXARRAY 1
#endif
#endif
#endif /* SQLITE_AMALGAMATION */
/*
** Constants for the largest and smallest possible 32-bit signed integers.
*/
# define LARGEST_INT32 ((int)(0x7fffffff))
# define SMALLEST_INT32 ((int)((-1) - LARGEST_INT32))
/* Truncate very long tokens to this many bytes. Hard limit is
** (65536-1-1-4-9)==65521 bytes. The limiting factor is the 16-bit offset
@@ -253115,7 +253159,7 @@ static int sqlite3Fts5IndexMerge(Fts5Index *p, int nMerge){
fts5StructureRelease(pStruct);
pStruct = pNew;
nMin = 1;
nMerge = nMerge*-1;
nMerge = (nMerge==SMALLEST_INT32 ? LARGEST_INT32 : (nMerge*-1));
}
if( pStruct && pStruct->nLevel ){
if( fts5IndexMerge(p, &pStruct, nMerge, nMin) ){
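The `SMALLEST_INT32` guard matters because negating INT32_MIN overflows in 32-bit arithmetic; the wraparound can be reproduced with JavaScript's `|0` coercion (a sketch of the arithmetic hazard, not the fts5 code):

```typescript
const SMALLEST_INT32 = -2147483648;
const LARGEST_INT32 = 2147483647;

// Naive 32-bit negation wraps around: the "positive" result is negative.
const naive = (-SMALLEST_INT32) | 0; // wraps back to -2147483648

// The guarded form clamps to the largest representable value instead.
function negate32(n: number): number {
  return n === SMALLEST_INT32 ? LARGEST_INT32 : -n;
}
```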
@@ -260322,7 +260366,7 @@ static void fts5SourceIdFunc(
){
assert( nArg==0 );
UNUSED_PARAM2(nArg, apUnused);
sqlite3_result_text(pCtx, "fts5: 2025-11-28 17:28:25 281fc0e9afc38674b9b0991943b9e9d1e64c6cbdb133d35f6f5c87ff6af38a88", -1, SQLITE_TRANSIENT);
sqlite3_result_text(pCtx, "fts5: 2026-01-09 17:27:48 b270f8339eb13b504d0b2ba154ebca966b7dde08e40c3ed7d559749818cb2075", -1, SQLITE_TRANSIENT);
}
/*


@@ -147,12 +147,12 @@ extern "C" {
** [sqlite3_libversion_number()], [sqlite3_sourceid()],
** [sqlite_version()] and [sqlite_source_id()].
*/
#define SQLITE_VERSION "3.51.1"
#define SQLITE_VERSION_NUMBER 3051001
#define SQLITE_SOURCE_ID "2025-11-28 17:28:25 281fc0e9afc38674b9b0991943b9e9d1e64c6cbdb133d35f6f5c87ff6af38a88"
#define SQLITE_VERSION "3.51.2"
#define SQLITE_VERSION_NUMBER 3051002
#define SQLITE_SOURCE_ID "2026-01-09 17:27:48 b270f8339eb13b504d0b2ba154ebca966b7dde08e40c3ed7d559749818cb2075"
#define SQLITE_SCM_BRANCH "branch-3.51"
#define SQLITE_SCM_TAGS "release version-3.51.1"
#define SQLITE_SCM_DATETIME "2025-11-28T17:28:25.933Z"
#define SQLITE_SCM_TAGS "release version-3.51.2"
#define SQLITE_SCM_DATETIME "2026-01-09T17:27:48.405Z"
/*
** CAPI3REF: Run-Time Library Version Numbers


@@ -33,7 +33,7 @@ struct EventInit {
bool composed { false };
template<class Encoder> void encode(Encoder&) const;
template<class Decoder> [[nodiscard]] static bool decode(Decoder&, EventInit&);
template<class Decoder> WARN_UNUSED_RETURN static bool decode(Decoder&, EventInit&);
};
template<class Encoder>


@@ -261,7 +261,7 @@ public:
}
template<class Encoder> void encode(Encoder &) const;
template<class Decoder> [[nodiscard]] static bool decode(Decoder &, HTTPHeaderMap &);
template<class Decoder> WARN_UNUSED_RETURN static bool decode(Decoder &, HTTPHeaderMap &);
void setUncommonHeader(const String &name, const String &value);
void setUncommonHeaderCloneName(const StringView name, const String &value);


@@ -43,7 +43,7 @@ namespace WebCore {
using NavigationTimingFunction = unsigned long long (PerformanceTiming::*)() const;
static constexpr SortedArrayMap restrictedMarkFunctions { std::to_array<std::pair<ComparableASCIILiteral, NavigationTimingFunction>>({
static constexpr std::array<std::pair<ComparableASCIILiteral, NavigationTimingFunction>, 21> restrictedMarkMappings { {
{ "connectEnd"_s, &PerformanceTiming::connectEnd },
{ "connectStart"_s, &PerformanceTiming::connectStart },
{ "domComplete"_s, &PerformanceTiming::domComplete },
@@ -65,7 +65,8 @@ static constexpr SortedArrayMap restrictedMarkFunctions { std::to_array<std::pai
{ "secureConnectionStart"_s, &PerformanceTiming::secureConnectionStart },
{ "unloadEventEnd"_s, &PerformanceTiming::unloadEventEnd },
{ "unloadEventStart"_s, &PerformanceTiming::unloadEventStart },
}) };
} };
static constexpr SortedArrayMap restrictedMarkFunctions { restrictedMarkMappings };
bool PerformanceUserTiming::isRestrictedMarkName(const String& markName)
{


@@ -87,11 +87,12 @@ template<> JSString* convertEnumerationToJS(JSGlobalObject& lexicalGlobalObject,
template<> std::optional<CryptoKey::Type> parseEnumeration<CryptoKey::Type>(JSGlobalObject& lexicalGlobalObject, JSValue value)
{
auto stringValue = value.toWTFString(&lexicalGlobalObject);
static constexpr SortedArrayMap enumerationMapping { std::to_array<std::pair<ComparableASCIILiteral, CryptoKey::Type>>({
static constexpr std::array<std::pair<ComparableASCIILiteral, CryptoKey::Type>, 3> mappings { {
{ "private"_s, CryptoKey::Type::Private },
{ "public"_s, CryptoKey::Type::Public },
{ "secret"_s, CryptoKey::Type::Secret },
}) };
} };
static constexpr SortedArrayMap enumerationMapping { mappings };
if (auto* enumerationValue = enumerationMapping.tryGet(stringValue); enumerationValue) [[likely]]
return *enumerationValue;
return std::nullopt;


@@ -64,7 +64,7 @@ template<> JSString* convertEnumerationToJS(JSGlobalObject& lexicalGlobalObject,
template<> std::optional<CryptoKeyUsage> parseEnumeration<CryptoKeyUsage>(JSGlobalObject& lexicalGlobalObject, JSValue value)
{
auto stringValue = value.toWTFString(&lexicalGlobalObject);
static constexpr SortedArrayMap enumerationMapping { std::to_array<std::pair<ComparableASCIILiteral, CryptoKeyUsage>>({
static constexpr std::array<std::pair<ComparableASCIILiteral, CryptoKeyUsage>, 8> mappings { {
{ "decrypt"_s, CryptoKeyUsage::Decrypt },
{ "deriveBits"_s, CryptoKeyUsage::DeriveBits },
{ "deriveKey"_s, CryptoKeyUsage::DeriveKey },
@@ -73,7 +73,8 @@ template<> std::optional<CryptoKeyUsage> parseEnumeration<CryptoKeyUsage>(JSGlob
{ "unwrapKey"_s, CryptoKeyUsage::UnwrapKey },
{ "verify"_s, CryptoKeyUsage::Verify },
{ "wrapKey"_s, CryptoKeyUsage::WrapKey },
}) };
} };
static constexpr SortedArrayMap enumerationMapping { mappings };
if (auto* enumerationValue = enumerationMapping.tryGet(stringValue); enumerationValue) [[likely]]
return *enumerationValue;
return std::nullopt;


@@ -96,12 +96,13 @@ template<> JSString* convertEnumerationToJS(JSGlobalObject& lexicalGlobalObject,
template<> std::optional<SubtleCrypto::KeyFormat> parseEnumeration<SubtleCrypto::KeyFormat>(JSGlobalObject& lexicalGlobalObject, JSValue value)
{
auto stringValue = value.toWTFString(&lexicalGlobalObject);
static constexpr SortedArrayMap enumerationMapping { std::to_array<std::pair<ComparableASCIILiteral, SubtleCrypto::KeyFormat>>({
static constexpr std::array<std::pair<ComparableASCIILiteral, SubtleCrypto::KeyFormat>, 4> mappings { {
{ "jwk"_s, SubtleCrypto::KeyFormat::Jwk },
{ "pkcs8"_s, SubtleCrypto::KeyFormat::Pkcs8 },
{ "raw"_s, SubtleCrypto::KeyFormat::Raw },
{ "spki"_s, SubtleCrypto::KeyFormat::Spki },
}) };
} };
static constexpr SortedArrayMap enumerationMapping { mappings };
if (auto* enumerationValue = enumerationMapping.tryGet(stringValue); enumerationValue) [[likely]]
return *enumerationValue;
return std::nullopt;


@@ -184,13 +184,18 @@ export default [
setter: "set_repeat",
this: true,
},
_idleStart: {
getter: "get_idleStart",
setter: "set_idleStart",
this: true,
},
["@@dispose"]: {
fn: "dispose",
length: 0,
invalidThisBehavior: InvalidThisBehavior.NoOp,
},
},
values: ["arguments", "callback", "idleTimeout", "repeat"],
values: ["arguments", "callback", "idleTimeout", "repeat", "idleStart"],
}),
define({
name: "Immediate",


@@ -5499,7 +5499,9 @@ pub const NodeFS = struct {
// O_PATH is faster
bun.O.PATH
else
bun.O.RDONLY;
// O_NONBLOCK prevents blocking on a FIFO.
// O_NOCTTY prevents acquiring a controlling terminal.
bun.O.RDONLY | bun.O.NONBLOCK | bun.O.NOCTTY;
const fd = switch (bun.sys.open(path, flags, 0)) {
.err => |err| return .{ .err = err.withPath(path) },


@@ -1484,6 +1484,12 @@ pub fn writeFileInternal(globalThis: *jsc.JSGlobalObject, path_or_blob_: *PathOr
}
}
// Check for Archive - allows Bun.write() and S3 writes to accept Archive instances
if (data.as(Archive)) |archive| {
archive.store.ref();
break :brk Blob.initWithStore(archive.store, globalThis);
}
break :brk try Blob.get(
globalThis,
data,
@@ -4828,6 +4834,7 @@ const NewReadFileHandler = read_file.NewReadFileHandler;
const string = []const u8;
const Archive = @import("../api/Archive.zig");
const Environment = @import("../../env.zig");
const S3File = @import("./S3File.zig");
const std = @import("std");


@@ -1189,8 +1189,7 @@ fn runWithSourceCode(
opts.features.bundler_feature_flags = transpiler.options.bundler_feature_flags;
opts.features.hot_module_reloading = output_format == .internal_bake_dev and !source.index.isRuntime();
opts.features.auto_polyfill_require = output_format == .esm and !opts.features.hot_module_reloading;
opts.features.react_fast_refresh = target == .browser and
transpiler.options.react_fast_refresh and
opts.features.react_fast_refresh = transpiler.options.react_fast_refresh and
loader.isJSX() and
!source.path.isNodeModule();


@@ -791,7 +791,9 @@ pub const InitCommand = struct {
switch (template) {
.blank, .typescript_library => {
Template.createAgentRule();
if (!minimal) {
Template.createAgentRule();
}
if (package_json_file != null and !did_load_package_json) {
Output.prettyln(" + <r><d>package.json<r>", .{});


@@ -755,12 +755,13 @@ function emitConvertEnumFunction(w: CodeWriter, type: TypeImpl) {
w.line(`template<> std::optional<${name}> parseEnumerationFromString<${name}>(const String& stringValue)`);
w.line(`{`);
w.line(
` static constexpr SortedArrayMap enumerationMapping { std::to_array<std::pair<ComparableASCIILiteral, ${name}>>({`,
` static constexpr std::array<std::pair<ComparableASCIILiteral, ${name}>, ${type.data.length}> mappings { {`,
);
for (const value of type.data) {
w.line(` { ${str(value)}_s, ${name}::${pascal(value)} },`);
}
w.line(` }) };`);
w.line(` } };`);
w.line(` static constexpr SortedArrayMap enumerationMapping { mappings };`);
w.line(` if (auto* enumerationValue = enumerationMapping.tryGet(stringValue); enumerationValue) [[likely]]`);
w.line(` return *enumerationValue;`);
w.line(` return std::nullopt;`);


@@ -143,7 +143,7 @@ export function enumeration(
template<> std::optional<${qualifiedName}>
WebCore::parseEnumerationFromString<${qualifiedName}>(const WTF::String& stringVal)
{
static constexpr ::WTF::SortedArrayMap enumerationMapping { ::std::to_array<${pairType}>({
static constexpr ::std::array<${pairType}, ${valueMap.size}> mappings {
${joinIndented(
12,
Array.from(valueMap.entries())
@@ -155,7 +155,8 @@ export function enumeration(
},`;
}),
)}
}) };
};
static constexpr ::WTF::SortedArrayMap enumerationMapping { mappings };
if (auto* enumerationValue = enumerationMapping.tryGet(stringVal)) [[likely]] {
return *enumerationValue;
}


@@ -325,9 +325,7 @@ $$capture_start$$(${fn.async ? "async " : ""}${
directives: fn.directives,
source: finalReplacement,
params: fn.params,
// Async functions automatically get Private visibility because the parser
// upgrades them when they use await (see Parser.cpp parseFunctionBody)
visibility: fn.directives.visibility ?? (fn.directives.linkTimeConstant || fn.async ? "Private" : "Public"),
visibility: fn.directives.visibility ?? (fn.directives.linkTimeConstant ? "Private" : "Public"),
isGetter: !!fn.directives.getter,
constructAbility: fn.directives.ConstructAbility ?? "CannotConstruct",
constructKind: fn.directives.ConstructKind ?? "None",


@@ -216,6 +216,7 @@ pub fn StyleRule(comptime R: type) type {
var handler_context = context.handler_context.child(.style_rule);
std.mem.swap(css.PropertyHandlerContext, &context.handler_context, &handler_context);
try this.rules.minify(context, unused);
std.mem.swap(css.PropertyHandlerContext, &context.handler_context, &handler_context);
if (unused and this.rules.v.items.len == 0) {
return true;
}


@@ -431,6 +431,12 @@ fn drainEvents(this: *@This()) void {
.async_http = http.*,
});
cloned.async_http.real = http;
// Clear stale queue pointers - the clone inherited http.next and http.task.node.next
// which may point to other AsyncHTTP structs that could be freed before the callback
// copies data back to the original. If not cleared, retrying a failed request would
// re-queue with stale pointers causing use-after-free.
cloned.async_http.next = null;
cloned.async_http.task.node.next = null;
cloned.async_http.onStart();
if (comptime Environment.allow_assert) {
count += 1;
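The hazard the new comment describes can be sketched generically: a shallow struct copy carries intrusive-list pointers along, so they must be cleared before the clone is re-queued (the `Task` shape and names here are illustrative, not Bun's AsyncHTTP internals):

```typescript
interface Task {
  id: number;
  next: Task | null; // intrusive queue link
}

// A shallow copy inherits `next`; clearing it keeps the retry queue from
// linking into nodes owned by the old (possibly freed) queue.
function cloneForRetry(task: Task): Task {
  const clone = { ...task };
  clone.next = null;
  return clone;
}

const tail: Task = { id: 2, next: null };
const head: Task = { id: 1, next: tail };
const retry = cloneForRetry(head);
```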


@@ -1708,8 +1708,14 @@ pub fn parseIntoBinaryLockfile(
};
if (registry_str.len == 0) {
// Use scope-specific registry if available, otherwise fall back to default
const registry_url = if (manager) |mgr|
mgr.scopeForPackageName(name_str).url.href
else
Npm.Registry.default_url;
const url = try ExtractTarball.buildURL(
Npm.Registry.default_url,
registry_url,
strings.StringOrTinyString.init(name.slice(string_buf.bytes.items)),
res.value.npm.version,
string_buf.bytes.items,
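The lookup the fix performs can be sketched as follows (`DEFAULT_REGISTRY`, the `scopes` table, and `registryForPackage` are illustrative names standing in for `manager.scopeForPackageName()`, not Bun internals):

```typescript
const DEFAULT_REGISTRY = "https://registry.npmjs.org/";
// As configured under [install.scopes] in bunfig.toml.
const scopes: Record<string, string> = { example: "https://npm.pkg.github.com/" };

// When a lockfile entry carries an empty registry URL, resolve the
// registry from the package's scope before falling back to the default.
function registryForPackage(name: string, lockfileRegistry: string): string {
  if (lockfileRegistry.length > 0) return lockfileRegistry;
  const m = /^@([^/]+)\//.exec(name);
  return (m && scopes[m[1]]) || DEFAULT_REGISTRY;
}
```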


@@ -1,8 +1,14 @@
const { isIP, isIPv6 } = require("internal/net/isIP");
const { checkIsHttpToken, validateFunction, validateInteger, validateBoolean } = require("internal/validators");
const {
checkIsHttpToken,
validateFunction,
validateInteger,
validateBoolean,
validateString,
} = require("internal/validators");
const { urlToHttpOptions } = require("internal/url");
const { isValidTLSArray } = require("internal/tls");
const { throwOnInvalidTLSArray } = require("internal/tls");
const { validateHeaderName } = require("node:_http_common");
const { getTimerDuration } = require("internal/timers");
const { ConnResetException } = require("internal/shared");
@@ -728,53 +734,48 @@ function ClientRequest(input, options, cb) {
throw new Error("pfx is not supported");
}
if (options.rejectUnauthorized !== undefined) this._ensureTls().rejectUnauthorized = options.rejectUnauthorized;
else {
let agentRejectUnauthorized = agent?.options?.rejectUnauthorized;
if (agentRejectUnauthorized !== undefined) this._ensureTls().rejectUnauthorized = agentRejectUnauthorized;
else {
// popular https-proxy-agent uses connectOpts
agentRejectUnauthorized = agent?.connectOpts?.rejectUnauthorized;
if (agentRejectUnauthorized !== undefined) this._ensureTls().rejectUnauthorized = agentRejectUnauthorized;
}
}
if (options.ca) {
if (!isValidTLSArray(options.ca))
throw new TypeError(
"ca argument must be an string, Buffer, TypedArray, BunFile or an array containing string, Buffer, TypedArray or BunFile",
);
this._ensureTls().ca = options.ca;
}
if (options.cert) {
if (!isValidTLSArray(options.cert))
throw new TypeError(
"cert argument must be an string, Buffer, TypedArray, BunFile or an array containing string, Buffer, TypedArray or BunFile",
);
this._ensureTls().cert = options.cert;
}
if (options.key) {
if (!isValidTLSArray(options.key))
throw new TypeError(
"key argument must be an string, Buffer, TypedArray, BunFile or an array containing string, Buffer, TypedArray or BunFile",
);
this._ensureTls().key = options.key;
}
if (options.passphrase) {
if (typeof options.passphrase !== "string") throw new TypeError("passphrase argument must be a string");
this._ensureTls().passphrase = options.passphrase;
}
if (options.ciphers) {
if (typeof options.ciphers !== "string") throw new TypeError("ciphers argument must be a string");
this._ensureTls().ciphers = options.ciphers;
}
if (options.servername) {
if (typeof options.servername !== "string") throw new TypeError("servername argument must be a string");
this._ensureTls().servername = options.servername;
}
// Merge TLS options using spread operator, matching Node.js behavior in createSocket:
// options = { __proto__: null, ...options, ...this.options };
// https://github.com/nodejs/node/blob/v23.6.0/lib/_http_agent.js#L242
// With spread, the last one wins, so agent.options overwrites request options.
//
// agent.options: Stored by Node.js Agent constructor
// https://github.com/nodejs/node/blob/v23.6.0/lib/_http_agent.js#L96
//
// agent.connectOpts: Used by https-proxy-agent for TLS connection options (lowest priority)
// https://github.com/TooTallNate/proxy-agents/blob/main/packages/https-proxy-agent/src/index.ts#L110-L117
const mergedTlsOptions = { __proto__: null, ...agent?.connectOpts, ...options, ...agent?.options };
if (options.secureOptions) {
if (typeof options.secureOptions !== "number") throw new TypeError("secureOptions argument must be a string");
this._ensureTls().secureOptions = options.secureOptions;
if (mergedTlsOptions.rejectUnauthorized !== undefined) {
this._ensureTls().rejectUnauthorized = mergedTlsOptions.rejectUnauthorized;
}
if (mergedTlsOptions.ca) {
throwOnInvalidTLSArray("options.ca", mergedTlsOptions.ca);
this._ensureTls().ca = mergedTlsOptions.ca;
}
if (mergedTlsOptions.cert) {
throwOnInvalidTLSArray("options.cert", mergedTlsOptions.cert);
this._ensureTls().cert = mergedTlsOptions.cert;
}
if (mergedTlsOptions.key) {
throwOnInvalidTLSArray("options.key", mergedTlsOptions.key);
this._ensureTls().key = mergedTlsOptions.key;
}
if (mergedTlsOptions.passphrase) {
validateString(mergedTlsOptions.passphrase, "options.passphrase");
this._ensureTls().passphrase = mergedTlsOptions.passphrase;
}
if (mergedTlsOptions.ciphers) {
validateString(mergedTlsOptions.ciphers, "options.ciphers");
this._ensureTls().ciphers = mergedTlsOptions.ciphers;
}
if (mergedTlsOptions.servername) {
validateString(mergedTlsOptions.servername, "options.servername");
this._ensureTls().servername = mergedTlsOptions.servername;
}
if (mergedTlsOptions.secureOptions) {
validateInteger(mergedTlsOptions.secureOptions, "options.secureOptions");
this._ensureTls().secureOptions = mergedTlsOptions.secureOptions;
}
this[kPath] = options.path || "/";
if (cb) {

View File
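The merge-order comment in the hunk above (connectOpts lowest, then request options, then agent.options, with the last spread winning) can be sketched in isolation. This is an illustrative standalone function, not Bun's internal implementation; the names `mergeTlsOptions`, `connectOpts`, and `agentOptions` are placeholders for the three sources being merged.

```typescript
// Mirrors the spread order in Node.js Agent.createSocket:
//   { __proto__: null, ...options, ...this.options }
// extended here with agent.connectOpts (used by https-proxy-agent) at
// lowest priority. With spread, the LAST source wins on conflicts.
type TlsOpts = { rejectUnauthorized?: boolean; ca?: string; servername?: string };

function mergeTlsOptions(
  connectOpts: TlsOpts = {},   // agent.connectOpts - lowest priority
  options: TlsOpts = {},       // per-request options
  agentOptions: TlsOpts = {},  // agent.options - highest priority
): TlsOpts {
  return { ...connectOpts, ...options, ...agentOptions };
}

const merged = mergeTlsOptions(
  { rejectUnauthorized: false },         // from https-proxy-agent's connectOpts
  { ca: "request-ca", servername: "a" }, // passed directly to https.request
  { servername: "b" },                   // set on the https.Agent constructor
);
// merged.servername is "b" (agent.options wins), merged.ca is "request-ca"
// (unopposed request option survives), merged.rejectUnauthorized is false.
```

Keys set in only one source pass through untouched; only conflicting keys are resolved by spread order, which is why a request-level `ca` still applies when the agent never set one.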

@@ -1,4 +1,4 @@
pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.FieldType, column_length: u32, raw: bool, bigint: bool, unsigned: bool, comptime Context: type, reader: NewReader(Context)) !SQLDataCell {
pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.FieldType, column_length: u32, raw: bool, bigint: bool, unsigned: bool, binary: bool, comptime Context: type, reader: NewReader(Context)) !SQLDataCell {
debug("decodeBinaryValue: {s}", .{@tagName(field_type)});
return switch (field_type) {
.MYSQL_TYPE_TINY => {
@@ -131,6 +131,7 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
else => error.InvalidBinaryValue,
},
// When the column contains a binary string we return a Buffer otherwise a string
.MYSQL_TYPE_ENUM,
.MYSQL_TYPE_SET,
.MYSQL_TYPE_GEOMETRY,
@@ -138,7 +139,6 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
.MYSQL_TYPE_STRING,
.MYSQL_TYPE_VARCHAR,
.MYSQL_TYPE_VAR_STRING,
// We could return Buffer here BUT TEXT, LONGTEXT, MEDIUMTEXT, TINYTEXT, etc. are BLOB and the user expects a string
.MYSQL_TYPE_TINY_BLOB,
.MYSQL_TYPE_MEDIUM_BLOB,
.MYSQL_TYPE_LONG_BLOB,
@@ -151,7 +151,9 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
}
var string_data = try reader.encodeLenString();
defer string_data.deinit();
if (binary) {
return SQLDataCell.raw(&string_data);
}
const slice = string_data.slice();
return SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
},

View File

@@ -140,8 +140,12 @@ pub const Row = struct {
}
},
else => {
const slice = value.slice();
cell.* = SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
if (column.flags.BINARY) {
cell.* = SQLDataCell.raw(value);
} else {
const slice = value.slice();
cell.* = SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
}
},
};
}
@@ -226,7 +230,7 @@ pub const Row = struct {
}
const column = this.columns[i];
value.* = try decodeBinaryValue(this.globalObject, column.column_type, column.column_length, this.raw, this.bigint, column.flags.UNSIGNED, Context, reader);
value.* = try decodeBinaryValue(this.globalObject, column.column_type, column.column_length, this.raw, this.bigint, column.flags.UNSIGNED, column.flags.BINARY, Context, reader);
value.index = switch (column.name_or_index) {
// The indexed columns can be out of order.
.index => |idx| idx,

View File
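The two Zig hunks above both encode the same rule: when a column's BINARY flag is set, return the raw bytes as a Buffer; otherwise decode them as a UTF-8 string (TEXT-family columns are BLOB types without the BINARY flag, so they still come back as strings). A minimal TypeScript sketch of that decision, with `decodeCell` as a hypothetical stand-in for the Zig decode path:

```typescript
// Sketch of the decode rule from the diff: BINARY flag set -> raw bytes,
// otherwise -> UTF-8 string. Not Bun's actual API, just the branch logic.
function decodeCell(bytes: Uint8Array, flags: { BINARY: boolean }): Uint8Array | string {
  if (flags.BINARY) {
    // e.g. BINARY, VARBINARY, BLOB columns: preserve the bytes as-is
    return bytes;
  }
  // e.g. VARCHAR, TEXT, LONGTEXT: the user expects a string
  return new TextDecoder().decode(bytes);
}

const raw = new Uint8Array([0x68, 0x69]); // the bytes "hi"
decodeCell(raw, { BINARY: true });  // Uint8Array(2) - untouched bytes
decodeCell(raw, { BINARY: false }); // "hi"
```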

@@ -295,4 +295,34 @@ import path from "path";
expect(fs.existsSync(path.join(temp, "src/components"))).toBe(true);
expect(fs.existsSync(path.join(temp, "src/components/ui"))).toBe(true);
}, 30_000);
test("bun init --minimal only creates package.json and tsconfig.json", async () => {
// Regression test for https://github.com/oven-sh/bun/issues/26050
// --minimal should not create .cursor/, CLAUDE.md, .gitignore, or README.md
const temp = tempDirWithFiles("bun-init-minimal", {});
const { exited } = Bun.spawn({
cmd: [bunExe(), "init", "--minimal", "-y"],
cwd: temp,
stdio: ["ignore", "inherit", "inherit"],
env: {
...bunEnv,
// Simulate Cursor being installed via CURSOR_TRACE_ID env var
CURSOR_TRACE_ID: "test-trace-id",
},
});
expect(await exited).toBe(0);
// Should create package.json and tsconfig.json
expect(fs.existsSync(path.join(temp, "package.json"))).toBe(true);
expect(fs.existsSync(path.join(temp, "tsconfig.json"))).toBe(true);
// Should NOT create these extra files with --minimal
expect(fs.existsSync(path.join(temp, "index.ts"))).toBe(false);
expect(fs.existsSync(path.join(temp, ".gitignore"))).toBe(false);
expect(fs.existsSync(path.join(temp, "README.md"))).toBe(false);
expect(fs.existsSync(path.join(temp, "CLAUDE.md"))).toBe(false);
expect(fs.existsSync(path.join(temp, ".cursor"))).toBe(false);
});
});

File diff suppressed because it is too large

View File

@@ -1509,3 +1509,128 @@ describe.concurrent("s3 missing credentials", () => {
});
});
});
// Archive + S3 integration tests
describe.skipIf(!minioCredentials)("Archive with S3", () => {
const credentials = minioCredentials!;
it("writes archive to S3 via S3Client.write()", async () => {
const client = new Bun.S3Client(credentials);
const archive = new Bun.Archive({
"hello.txt": "Hello from Archive!",
"data.json": JSON.stringify({ test: true }),
});
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify by downloading and reading back
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(2);
expect(await files.get("hello.txt")!.text()).toBe("Hello from Archive!");
expect(await files.get("data.json")!.text()).toBe(JSON.stringify({ test: true }));
// Cleanup
await client.unlink(key);
});
it("writes archive to S3 via Bun.write() with s3:// URL", async () => {
const archive = new Bun.Archive({
"file1.txt": "content1",
"dir/file2.txt": "content2",
});
const key = randomUUIDv7() + ".tar";
const s3Url = `s3://${credentials.bucket}/${key}`;
await Bun.write(s3Url, archive, {
...credentials,
});
// Verify by downloading
const s3File = Bun.file(s3Url, credentials);
const downloaded = await s3File.bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(2);
expect(await files.get("file1.txt")!.text()).toBe("content1");
expect(await files.get("dir/file2.txt")!.text()).toBe("content2");
// Cleanup
await s3File.delete();
});
it("writes archive with binary content to S3", async () => {
const client = new Bun.S3Client(credentials);
const binaryData = new Uint8Array([0x00, 0x01, 0x02, 0xff, 0xfe, 0xfd, 0x80, 0x7f]);
const archive = new Bun.Archive({
"binary.bin": binaryData,
});
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify binary data is preserved
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
const extractedBinary = await files.get("binary.bin")!.bytes();
expect(extractedBinary).toEqual(binaryData);
// Cleanup
await client.unlink(key);
});
it("writes large archive to S3", async () => {
const client = new Bun.S3Client(credentials);
// Create archive with multiple files
const entries: Record<string, string> = {};
for (let i = 0; i < 50; i++) {
entries[`file${i.toString().padStart(3, "0")}.txt`] = `Content for file ${i}`;
}
const archive = new Bun.Archive(entries);
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(50);
expect(await files.get("file000.txt")!.text()).toBe("Content for file 0");
expect(await files.get("file049.txt")!.text()).toBe("Content for file 49");
// Cleanup
await client.unlink(key);
});
it("writes archive via s3File.write()", async () => {
const client = new Bun.S3Client(credentials);
const archive = new Bun.Archive({
"test.txt": "Hello via s3File.write()!",
});
const key = randomUUIDv7() + ".tar";
const s3File = client.file(key);
await s3File.write(archive);
// Verify
const downloaded = await s3File.bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(1);
expect(await files.get("test.txt")!.text()).toBe("Hello via s3File.write()!");
// Cleanup
await s3File.delete();
});
});

View File

@@ -1,5 +1,16 @@
import { describe, expect, it, spyOn } from "bun:test";
import { bunEnv, bunExe, gc, getMaxFD, isBroken, isIntelMacOS, isWindows, tempDirWithFiles, tmpdirSync } from "harness";
import {
bunEnv,
bunExe,
gc,
getMaxFD,
isBroken,
isIntelMacOS,
isPosix,
isWindows,
tempDirWithFiles,
tmpdirSync,
} from "harness";
import { isAscii } from "node:buffer";
import fs, {
closeSync,
@@ -51,6 +62,7 @@ import { tmpdir } from "node:os";
import { join } from "node:path";
import { spawnSync } from "bun";
import { mkfifo } from "mkfifo";
import { ReadStream as ReadStream_, WriteStream as WriteStream_ } from "./export-from.js";
import { ReadStream as ReadStreamStar_, WriteStream as WriteStreamStar_ } from "./export-star-from.js";
@@ -1540,6 +1552,13 @@ it("symlink", () => {
expect(realpathSync(actual)).toBe(realpathSync(import.meta.path));
});
it.if(isPosix)("realpathSync doesn't block on FIFO", () => {
const path = join(tmpdirSync(), "test-fs-fifo-block.fifo");
mkfifo(path, 0o666);
realpathSync(path);
unlinkSync(path);
});
it("readlink", () => {
const actual = join(tmpdirSync(), "fs-readlink.txt");
try {

View File

@@ -0,0 +1,30 @@
-----BEGIN ENCRYPTED PRIVATE KEY-----
MIIFNTBfBgkqhkiG9w0BBQ0wUjAxBgkqhkiG9w0BBQwwJAQQieLggVjbubz09mX5
GdRQAwICCAAwDAYIKoZIhvcNAgkFADAdBglghkgBZQMEASoEEJ++f2E23qU4mbP4
m3RnPasEggTQoS6zcBDvWURYyctw9Qma8L/ZnPg4SBclVzYbiZcvBPNRvCNLnYxQ
ysimU/8PTCP9m944dcsMolRqPjj0gOQCnBpqbZmnc7elwDFZIhePRfMKC2bPHZeo
ABonNOs2VstJ9gT3RA5x8Dj99dsoPdnV9rL6vkW0Gk86BPGgQq5i1ipJvYrpOtay
Bq5JgpptVX86azXZVriB8FUNfJuFOPQfxfXIY7ogHpQWZ7rIVa5ug7LlJ7sLjakj
ph/4corzRnRr88/eFfhYbV5rob/Lvoq8+I2Hgf25ypJ2XdOoWAgDOvl6+k01v/Ci
VAYAE1v9RgmiAXFIE9uYbSIyhiVibmLU6QK7Vcydv0ZaZLdP/9HwfZ6Q5u1a23rj
ltzRFOu5H7ipVXSoZU1ffw2EXi1RZJU2n5M3tU11qZsNpaDulEdcYZm74sUaqdjA
zkYSO+RBehptEUfgjXBrW8HJ42fCfd6IvQ7NtT3e3zJup105cHIEfO8IiSSt/oW3
SOupzjTpARHhAbPKSEmUVC1IXjGUvUuZs+NlN+byNkI4IhSTHp4vn5k87l22jccl
4NwW5ZIouqawvV5gyOGgBcwgSfvd4H8mcSeFfZhVmEtRDKtubREr8mqqcUWq5V/W
fEGR2LTQKRofhGGw56Jzw8FgNJNI0m6WBYIPQVtmwqqljPNPDuCQZ/icrhM6s0MR
7IyDiCUHzsz2JZxRJJO9pzItSABym/I57DTtRg1XQTEuSU+dTwhVzwkytWVldHx3
Rvbb6DUWrLtthoAs/LSDevjhrLYAdkLj4iaexqfYPcrRA22hj3KxxRpzV8zqMNvM
hI703HrjIPzlVhrqf6gMiKs7iZu2XQ4RRsQyKzWlro9bOprUvIg/abFtaJDXKqN0
sTJQ9rSpTJgUzG4sJEFiUeM0Wm2cLUO1w4N4/si89vOCcVJJUIjZgwsyFu8DpUIE
7E9rgAzuWByIBOJQ0f1hfF7zGUxAJ75qRdHm0q2aDkDPLiJk1alR1MpMs1tIcaBO
CAxnlZtORvq6QMQnERkpzuvX2PS5mtZ8w/qizPgb8GL3kU+Ex0lJHT8PBwspSXWV
Gc9AvCZ1z+YLnflUsRch/dI/suGhpIcLOX4M3pfW9qfo/i92uR52JWzIAkRKFTOi
fSiADLpar2WT2Kcz9aGfTB2swjhsL7Q6Tf8BWUCVYtfbf5FK07uPTCb9tyy+LxtU
qvtHe3XyZTO3guRBBDZotEOqNKzJw+ZUKIO7vX5JGtpMudBHL2J1KH80Qy4+uR/H
b9YyW0UFOyuOejmrMwHMP/iXkYyTsBiShETU0Uga33xvSuS10FhiCt87cXCI/WeZ
Jw1fk29QA3nx5vw9zDcVFiJRwOu9l6/JxXFpGm0ZjhYudS98yJkam3sbwJThJ+1C
fFzzCM69iUdPw/8JEPnD+Wd2okFiwjpEzHrZ+n1P5YGDF7UTyEB3gLpn3sgmBR9H
2z4yiL+ST/WI7n3ykXxzxjzcEgkDEwLfzHlguqh7jhYWuIhsDmcch7EgH8+gsyke
9lgUWJdoHXVfNZmWh4rMMkEUGi605WulXV8N9qQJJOJltN3lGdKZi+CBK6dTlPtJ
iAj5mvrk++pP/b0SplcQtq3pspGnWmjw+jw0aOVzSpn8qrco1/FZWdw=
-----END ENCRYPTED PRIVATE KEY-----

View File

@@ -1,28 +1,28 @@
-----BEGIN PRIVATE KEY-----
MIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQCIzOJskt6VkEJY
XKSJv/Gdil3XYkjk3NVc/+m+kzqnkTRbPtT9w+IGWgmJhuf9DJPLCwHFAEFarVwV
x16Q0PbU4ajXaLRHEYGhrH10oTMjQnJ24xVm26mxRXPQa5vaLpWJqNyIdNLIQLe+
UXUOzSGGsFTRMAjvYrkzjBe4ZUnaZV+aFY/ug0jfzeA1dJjzKZs6+yTJRbsuWUEb
8MsDmT4v+kBZDKdaDn7AFDWRVqx/38BnqsRzkM0CxpnyT2kRzw5zQajIE13gdTJo
1EHvYSUkkxrY5m30Rl9BuBBZBjhMzOHq0fYVVooHO+sf4XHPgvFTTxJum85u7J1J
oEUjrLKtAgMBAAECggEACInVNhaiqu4infZGVMy0rXMV8VwSlapM7O2SLtFsr0nK
XUmaLK6dvGzBPKK9dxdiYCFzPlMKQTkhzsAvYFWSmm3tRmikG+11TFyCRhXLpc8/
ark4vD9Io6ZkmKUmyKLwtXNjNGcqQtJ7RXc7Ga3nAkueN6JKZHqieZusXVeBGQ70
YH1LKyVNBeJggbj+g9rqaksPyNJQ8EWiNTJkTRQPazZ0o1VX/fzDFyr/a5npFtHl
4BHfafv9o1Xyr70Kie8CYYRJNViOCN+ylFs7Gd3XRaAkSkgMT/7DzrHdEM2zrrHK
yNg2gyDVX9UeEJG2X5UtU0o9BVW7WBshz/2hqIUHoQKBgQC8zsRFvC7u/rGr5vRR
mhZZG+Wvg03/xBSuIgOrzm+Qie6mAzOdVmfSL/pNV9EFitXt1yd2ROo31AbS7Evy
Bm/QVKr2mBlmLgov3B7O/e6ABteooOL7769qV/v+yo8VdEg0biHmsfGIIXDe3Lwl
OT0XwF9r/SeZLbw1zfkSsUVG/QKBgQC5fANM3Dc9LEek+6PHv5+eC1cKkyioEjUl
/y1VUD00aABI1TUcdLF3BtFN2t/S6HW0hrP3KwbcUfqC25k+GDLh1nM6ZK/gI3Yn
IGtCHxtE3S6jKhE9QcK/H+PzGVKWge9SezeYRP0GHJYDrTVTA8Kt9HgoZPPeReJl
+Ss9c8ThcQKBgECX6HQHFnNzNSufXtSQB7dCoQizvjqTRZPxVRoxDOABIGExVTYt
umUhPtu5AGyJ+/hblEeU+iBRbGg6qRzK8PPwE3E7xey8MYYAI5YjL7YjISKysBUL
AhM6uJ6Jg/wOBSnSx8xZ8kzlS+0izUda1rjKeprCSArSp8IsjlrDxPStAoGAEcPr
+P+altRX5Fhpvmb/Hb8OTif8G+TqjEIdkG9H/W38oP0ywg/3M2RGxcMx7txu8aR5
NjI7zPxZFxF7YvQkY3cLwEsGgVxEI8k6HLIoBXd90Qjlb82NnoqqZY1GWL4HMwo0
L/Rjm6M/Rwje852Hluu0WoIYzXA6F/Q+jPs6nzECgYAxx4IbDiGXuenkwSF1SUyj
NwJXhx4HDh7U6EO/FiPZE5BHE3BoTrFu3o1lzverNk7G3m+j+m1IguEAalHlukYl
rip9iUISlKYqbYZdLBoLwHAfHhszdrjqn8/v6oqbB5yR3HXjPFUWJo0WJ2pqJp56
ZshgmQQ/5Khoj6x0/dMPSg==
MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQDlYzosgRgXHL6v
Mh1V0ERFhsvlZrtRojSw6tafr3SQBphU793/rGiYZlL/lJ9HIlLkx9JMbuTjNm5U
2eRwHiTQIeWD4aCIESwPlkdaVYtC+IOj55bJN8xNa7h5GyJwF7PnPetAsKyE8DMB
n1gKMhaIis7HHOUtk4/K3Y4peU44d04z0yPt6JtY5Sbvi1E7pGX6T/2c9sHsdIDe
DctWnewpXXs8zkAla0KNWQfpDnpS53wxAfStTA4lSrA9daxC7hZopQlLxFIbJk+0
BLbEsXtrJ54T5iguHk+2MDVAy4MOqP9XbKV7eGHk73l6+CSwmHyHBxh4ChxRQeT5
BP0MUTn1AgMBAAECggEABtPvC5uVGr0DjQX2GxONsK8cOxoVec7U+C4pUMwBcXcM
yjxwlHdujpi/IDXtjsm+A2rSPu2vGPdKDfMFanPvPxW/Ne99noc6U0VzHsR8lnP8
wSB328nyJhzOeyZcXk9KTtgIPF7156gZsJLsZTNL+ej90i3xQWvKxCxXmrLuad5O
z/TrgZkC6wC3fgj1d3e8bMljQ7tLxbshJMYVI5o6RFTxy84DLI+rlvPkf7XbiMPf
2lsm4jcJKvfx+164HZJ9QVlx8ncqOHAnGvxb2xHHfqv4JAbz615t7yRvtaw4Paj5
6kQSf0VWnsVzgxNJWvnUZym/i/Qf5nQafjChCyKOEQKBgQD9f4SkvJrp/mFKWLHd
kDvRpSIIltfJsa5KShn1IHsQXFwc0YgyP4SKQb3Ckv+/9UFHK9EzM+WlPxZi7ZOS
hsWhIfkI4c4ORpxUQ+hPi0K2k+HIY7eYyONqDAzw5PGkKBo3mSGMHDXYywSqexhB
CCMHuHdMhwyHdz4PWYOK3C2VMQKBgQDnpsrHK7lM9aVb8wNhTokbK5IlTSzH/5oJ
lAVu6G6H3tM5YQeoDXztbZClvrvKU8DU5UzwaC+8AEWQwaram29QIDpAI3nVQQ0k
dmHHp/pCeADdRG2whaGcl418UJMMv8AUpWTRm+kVLTLqfTHBC0ji4NlCQMHCUCfd
U8TeUi5QBQKBgQDvJNd7mboDOUmLG7VgMetc0Y4T0EnuKsMjrlhimau/OYJkZX84
+BcPXwmnf4nqC3Lzs3B9/12L0MJLvZjUSHQ0mJoZOPxtF0vvasjEEbp0B3qe0wOn
DQ0NRCUJNNKJbJOfE8VEKnDZ/lx+f/XXk9eINwvElDrLqUBQtr+TxjbyYQKBgAxQ
lZ8Y9/TbajsFJDzcC/XhzxckjyjisbGoqNFIkfevJNN8EQgiD24f0Py+swUChtHK
jtiI8WCxMwGLCiYs9THxRKd8O1HW73fswy32BBvcfU9F//7OW9UTSXY+YlLfLrrq
P/3UqAN0L6y/kxGMJAfLpEEdaC+IS1Y8yc531/ZxAoGASYiasDpePtmzXklDxk3h
jEw64QAdXK2p/xTMjSeTtcqJ7fvaEbg+Mfpxq0mdTjfbTdR9U/nzAkwS7OoZZ4Du
ueMVls0IVqcNnBtikG8wgdxN27b5JPXS+GzQ0zDSpWFfRPZiIh37BAXr0D1voluJ
rEHkcals6p7hL98BoxjFIvA=
-----END PRIVATE KEY-----

View File

@@ -1,23 +1,23 @@
-----BEGIN CERTIFICATE-----
MIID5jCCAs6gAwIBAgIUN7coIsdMcLo9amZfkwogu0YkeLEwDQYJKoZIhvcNAQEL
BQAwfjELMAkGA1UEBhMCU0UxDjAMBgNVBAgMBVN0YXRlMREwDwYDVQQHDAhMb2Nh
dGlvbjEaMBgGA1UECgwRT3JnYW5pemF0aW9uIE5hbWUxHDAaBgNVBAsME09yZ2Fu
aXphdGlvbmFsIFVuaXQxEjAQBgNVBAMMCWxvY2FsaG9zdDAeFw0yMzA5MjExNDE2
MjNaFw0yNDA5MjAxNDE2MjNaMH4xCzAJBgNVBAYTAlNFMQ4wDAYDVQQIDAVTdGF0
ZTERMA8GA1UEBwwITG9jYXRpb24xGjAYBgNVBAoMEU9yZ2FuaXphdGlvbiBOYW1l
MRwwGgYDVQQLDBNPcmdhbml6YXRpb25hbCBVbml0MRIwEAYDVQQDDAlsb2NhbGhv
c3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCIzOJskt6VkEJYXKSJ
v/Gdil3XYkjk3NVc/+m+kzqnkTRbPtT9w+IGWgmJhuf9DJPLCwHFAEFarVwVx16Q
0PbU4ajXaLRHEYGhrH10oTMjQnJ24xVm26mxRXPQa5vaLpWJqNyIdNLIQLe+UXUO
zSGGsFTRMAjvYrkzjBe4ZUnaZV+aFY/ug0jfzeA1dJjzKZs6+yTJRbsuWUEb8MsD
mT4v+kBZDKdaDn7AFDWRVqx/38BnqsRzkM0CxpnyT2kRzw5zQajIE13gdTJo1EHv
YSUkkxrY5m30Rl9BuBBZBjhMzOHq0fYVVooHO+sf4XHPgvFTTxJum85u7J1JoEUj
rLKtAgMBAAGjXDBaMA4GA1UdDwEB/wQEAwIDiDATBgNVHSUEDDAKBggrBgEFBQcD
ATAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0OBBYEFNzx4Rfs9m8XR5ML0WsI
sorKmB4PMA0GCSqGSIb3DQEBCwUAA4IBAQB87iQy8R0fiOky9WTcyzVeMaavS3MX
iTe1BRn1OCyDq+UiwwoNz7zdzZJFEmRtFBwPNFOe4HzLu6E+7yLFR552eYRHlqIi
/fiLb5JiZfPtokUHeqwELWBsoXtU8vKxViPiLZ09jkWOPZWo7b/xXd6QYykBfV91
usUXLzyTD2orMagpqNksLDGS3p3ggHEJBZtRZA8R7kPEw98xZHznOQpr26iv8kYz
ZWdLFoFdwgFBSfxePKax5rfo+FbwdrcTX0MhbORyiu2XsBAghf8s2vKDkHg2UQE8
haonxFYMFaASfaZ/5vWKYDTCJkJ67m/BtkpRafFEO+ad1i1S61OjfxH4
MIID4jCCAsqgAwIBAgIUcaRq6J/YF++Bo01Zc+HeQvCbnWMwDQYJKoZIhvcNAQEL
BQAwaTELMAkGA1UEBhMCVVMxCzAJBgNVBAgMAkNBMRYwFAYDVQQHDA1TYW4gRnJh
bmNpc2NvMQ0wCwYDVQQKDARPdmVuMREwDwYDVQQLDAhUZWFtIEJ1bjETMBEGA1UE
AwwKc2VydmVyLWJ1bjAeFw0yNTA5MDYwMzAwNDlaFw0zNTA5MDQwMzAwNDlaMGkx
CzAJBgNVBAYTAlVTMQswCQYDVQQIDAJDQTEWMBQGA1UEBwwNU2FuIEZyYW5jaXNj
bzENMAsGA1UECgwET3ZlbjERMA8GA1UECwwIVGVhbSBCdW4xEzARBgNVBAMMCnNl
cnZlci1idW4wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDlYzosgRgX
HL6vMh1V0ERFhsvlZrtRojSw6tafr3SQBphU793/rGiYZlL/lJ9HIlLkx9JMbuTj
Nm5U2eRwHiTQIeWD4aCIESwPlkdaVYtC+IOj55bJN8xNa7h5GyJwF7PnPetAsKyE
8DMBn1gKMhaIis7HHOUtk4/K3Y4peU44d04z0yPt6JtY5Sbvi1E7pGX6T/2c9sHs
dIDeDctWnewpXXs8zkAla0KNWQfpDnpS53wxAfStTA4lSrA9daxC7hZopQlLxFIb
Jk+0BLbEsXtrJ54T5iguHk+2MDVAy4MOqP9XbKV7eGHk73l6+CSwmHyHBxh4ChxR
QeT5BP0MUTn1AgMBAAGjgYEwfzAdBgNVHQ4EFgQUw7nEnh4uOdZVZUapQzdAUaVa
An0wHwYDVR0jBBgwFoAUw7nEnh4uOdZVZUapQzdAUaVaAn0wDwYDVR0TAQH/BAUw
AwEB/zAsBgNVHREEJTAjgglsb2NhbGhvc3SHBH8AAAGHEAAAAAAAAAAAAAAAAAAA
AAEwDQYJKoZIhvcNAQELBQADggEBAEA8r1fvDLMSCb8bkAURpFk8chn8pl5MChzT
YUDaLdCCBjPXJkSXNdyuwS+T/ljAGyZbW5xuDccCNKltawO4CbyEXUEZbYr3w9eq
j8uqymJPhFf0O1rKOI2han5GBCgHwG13QwKI+4uu7390nD+TlzLOhxFfvOG7OadH
QNMNLNyldgF4Nb8vWdz0FtQiGUIrO7iq4LFhhd1lCxe0q+FAYSEYcc74WtF/Yo8V
JQauXuXyoP5FqLzNt/yeNQhceyIXJGKCsjr5/bASBmVlCwgRfsD3jpG37L8YCJs1
L4WEikcY4Lzb2NF9e94IyZdQsRqd9DFBF5zP013MSUiuhiow32k=
-----END CERTIFICATE-----

View File

@@ -0,0 +1,710 @@
/**
* All tests in this file run in both Bun and Node.js.
*
* Test that TLS options can be inherited from agent.options and agent.connectOpts.
* This is important for compatibility with libraries like https-proxy-agent.
*
* The HttpsProxyAgent tests verify that TLS options are properly passed through
* the proxy tunnel to the target HTTPS server.
*/
import { once } from "node:events";
import { readFileSync } from "node:fs";
import http from "node:http";
import https from "node:https";
import { createRequire } from "node:module";
import type { AddressInfo } from "node:net";
import net from "node:net";
import { dirname, join } from "node:path";
import { describe, test } from "node:test";
import { fileURLToPath } from "node:url";
// Use createRequire for ESM compatibility
const require = createRequire(import.meta.url);
const { HttpsProxyAgent } = require("https-proxy-agent") as {
HttpsProxyAgent: new (proxyUrl: string, options?: Record<string, unknown>) => http.Agent;
};
const __dirname = dirname(fileURLToPath(import.meta.url));
// Self-signed certificate with SANs for localhost and 127.0.0.1
// This cert is its own CA (self-signed)
const tlsCerts = {
cert: readFileSync(join(__dirname, "fixtures", "cert.pem"), "utf8"),
key: readFileSync(join(__dirname, "fixtures", "cert.key"), "utf8"),
encryptedKey: readFileSync(join(__dirname, "fixtures", "cert.encrypted.key"), "utf8"),
passphrase: "testpassword",
// Self-signed cert, so it's its own CA
get ca() {
return this.cert;
},
};
async function createHttpsServer(
options: https.ServerOptions = {},
): Promise<{ server: https.Server; port: number; hostname: string }> {
const server = https.createServer({ key: tlsCerts.key, cert: tlsCerts.cert, ...options }, (req, res) => {
res.writeHead(200);
res.end("OK");
});
await once(server.listen(0, "127.0.0.1"), "listening");
const { port } = server.address() as AddressInfo;
return { server, port, hostname: "127.0.0.1" };
}
async function createHttpServer(): Promise<{
server: http.Server;
port: number;
hostname: string;
}> {
const server = http.createServer((req, res) => {
res.writeHead(200);
res.end("OK");
});
await once(server.listen(0, "127.0.0.1"), "listening");
const { port } = server.address() as AddressInfo;
return { server, port, hostname: "127.0.0.1" };
}
/**
* Create an HTTP CONNECT proxy server.
* This proxy handles the CONNECT method to establish tunnels for HTTPS connections.
*/
function createConnectProxy(): net.Server {
return net.createServer(clientSocket => {
let buffer: Uint8Array = new Uint8Array(0);
let tunnelEstablished = false;
let targetSocket: net.Socket | null = null;
clientSocket.on("data", (data: Uint8Array) => {
// If tunnel is already established, forward data directly
if (tunnelEstablished && targetSocket) {
targetSocket.write(data);
return;
}
// Concatenate buffers
const newBuffer = new Uint8Array(buffer.length + data.length);
newBuffer.set(buffer);
newBuffer.set(data, buffer.length);
buffer = newBuffer;
const bufferStr = new TextDecoder().decode(buffer);
// Check if we have complete headers
const headerEnd = bufferStr.indexOf("\r\n\r\n");
if (headerEnd === -1) return;
const headerPart = bufferStr.substring(0, headerEnd);
const lines = headerPart.split("\r\n");
const requestLine = lines[0];
// Check for CONNECT method
const match = requestLine.match(/^CONNECT\s+([^:]+):(\d+)\s+HTTP/);
if (!match) {
clientSocket.write("HTTP/1.1 400 Bad Request\r\n\r\n");
clientSocket.end();
return;
}
const [, targetHost, targetPort] = match;
// Get any data after the headers (shouldn't be any for CONNECT)
// headerEnd is byte position in the string, need to account for UTF-8
const headerBytes = new TextEncoder().encode(bufferStr.substring(0, headerEnd + 4)).length;
const remainingData = buffer.subarray(headerBytes);
// Connect to target
targetSocket = net.connect(parseInt(targetPort, 10), targetHost, () => {
clientSocket.write("HTTP/1.1 200 Connection Established\r\n\r\n");
tunnelEstablished = true;
// Forward any remaining data
if (remainingData.length > 0) {
targetSocket!.write(remainingData);
}
// Set up bidirectional piping
targetSocket!.on("data", (chunk: Uint8Array) => {
clientSocket.write(chunk);
});
});
targetSocket.on("error", () => {
if (!tunnelEstablished) {
clientSocket.write("HTTP/1.1 502 Bad Gateway\r\n\r\n");
}
clientSocket.end();
});
targetSocket.on("close", () => clientSocket.destroy());
clientSocket.on("close", () => targetSocket?.destroy());
});
clientSocket.on("error", () => {
targetSocket?.destroy();
});
});
}
/**
* Helper to start a proxy server and get its port.
*/
async function startProxy(server: net.Server): Promise<number> {
return new Promise<number>(resolve => {
server.listen(0, "127.0.0.1", () => {
const addr = server.address() as AddressInfo;
resolve(addr.port);
});
});
}
describe("https.request agent TLS options inheritance", () => {
describe("agent.options", () => {
test("inherits ca from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with ca in options
const agent = new https.Agent({
ca: tlsCerts.ca,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO ca here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits rejectUnauthorized from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with rejectUnauthorized: false in options
const agent = new https.Agent({
rejectUnauthorized: false,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO rejectUnauthorized here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits cert and key from agent.options", async () => {
// Create a server that uses TLS
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with cert/key in options
const agent = new https.Agent({
rejectUnauthorized: false,
cert: tlsCerts.cert,
key: tlsCerts.key,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO cert/key here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
// Test HttpsProxyAgent compatibility - these tests use real HttpsProxyAgent
// to verify HTTPS requests work through the proxy tunnel with TLS options
describe("HttpsProxyAgent TLS options", () => {
test("HttpsProxyAgent with rejectUnauthorized: false", async () => {
const { server, port, hostname } = await createHttpsServer();
const proxy = createConnectProxy();
const proxyPort = await startProxy(proxy);
try {
// Create HttpsProxyAgent for the proxy connection
const agent = new HttpsProxyAgent(`http://127.0.0.1:${proxyPort}`, {
rejectUnauthorized: false,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// TLS options must also be passed here for Node.js compatibility
// https-proxy-agent doesn't propagate these to target connection in Node.js
// See: https://github.com/TooTallNate/node-https-proxy-agent/issues/35
rejectUnauthorized: false,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
proxy.close();
}
});
test("HttpsProxyAgent with ca option", async () => {
const { server, port, hostname } = await createHttpsServer();
const proxy = createConnectProxy();
const proxyPort = await startProxy(proxy);
try {
// Create HttpsProxyAgent for the proxy connection
const agent = new HttpsProxyAgent(`http://127.0.0.1:${proxyPort}`, {
ca: tlsCerts.ca,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// TLS options must also be passed here for Node.js compatibility
ca: tlsCerts.ca,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
proxy.close();
}
});
test("HttpsProxyAgent with cert and key options", async () => {
const { server, port, hostname } = await createHttpsServer();
const proxy = createConnectProxy();
const proxyPort = await startProxy(proxy);
try {
// Create HttpsProxyAgent for the proxy connection
const agent = new HttpsProxyAgent(`http://127.0.0.1:${proxyPort}`, {
rejectUnauthorized: false,
cert: tlsCerts.cert,
key: tlsCerts.key,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// TLS options must also be passed here for Node.js compatibility
rejectUnauthorized: false,
cert: tlsCerts.cert,
key: tlsCerts.key,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
proxy.close();
}
});
});
describe("option precedence (matches Node.js)", () => {
// In Node.js, options are merged via spread in createSocket:
// options = { __proto__: null, ...options, ...this.options };
// https://github.com/nodejs/node/blob/v23.6.0/lib/_http_agent.js#L365
// With spread, the last one wins, so agent.options overwrites request options.
test("agent.options takes precedence over direct options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with correct CA
const agent = new https.Agent({
ca: tlsCerts.ca, // Correct CA in agent.options - should be used
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
ca: "wrong-ca-that-would-fail", // Wrong CA in request - should be ignored
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("direct options used when agent.options not set", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent without ca
const agent = new https.Agent({});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
ca: tlsCerts.ca, // Direct option should be used since agent.options.ca is not set
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
describe("other TLS options", () => {
test("inherits servername from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
const agent = new https.Agent({
rejectUnauthorized: false,
servername: "localhost", // Should be passed to TLS
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits ciphers from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
const agent = new https.Agent({
rejectUnauthorized: false,
ciphers: "HIGH:!aNULL:!MD5", // Custom cipher suite
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits passphrase from agent.options", async () => {
// Create server that accepts connections with encrypted key
const { server, port, hostname } = await createHttpsServer({
key: tlsCerts.encryptedKey,
passphrase: tlsCerts.passphrase,
});
try {
// Create an agent with encrypted key and passphrase in options
const agent = new https.Agent({
ca: tlsCerts.ca,
cert: tlsCerts.cert,
key: tlsCerts.encryptedKey,
passphrase: tlsCerts.passphrase,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO passphrase here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("supports multiple CAs (array)", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with CA as an array
const agent = new https.Agent({
ca: [tlsCerts.ca], // Array of CAs
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
describe("TLS error handling", () => {
test("rejects self-signed cert when rejectUnauthorized is true", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent without CA and with rejectUnauthorized: true (default)
const agent = new https.Agent({
rejectUnauthorized: true,
// NO ca - should fail because cert is self-signed
});
const { promise, resolve, reject } = Promise.withResolvers<Error>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
() => {
reject(new Error("Expected request to fail"));
},
);
req.on("error", resolve);
req.end();
const error = await promise;
// Should get a certificate error (self-signed cert not trusted)
if (
!(
error.message.includes("self-signed") ||
error.message.includes("SELF_SIGNED") ||
error.message.includes("certificate") ||
error.message.includes("unable to verify")
)
) {
throw new Error(`Expected certificate error, got: ${error.message}`);
}
} finally {
server.close();
}
});
});
});
describe("http.request agent options", () => {
test("does not fail when agent has TLS options (they are ignored for HTTP)", async () => {
const { server, port, hostname } = await createHttpServer();
try {
// Create an agent - TLS options passed via constructor should be ignored for HTTP
// Using type assertion since http.Agent doesn't normally accept TLS options
const agent = new (http.Agent as any)({
rejectUnauthorized: false,
ca: "some-ca",
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = http.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
// Only run in Bun to avoid infinite loop when Node.js runs this file
if (typeof Bun !== "undefined") {
const { bunEnv, nodeExe } = await import("harness");
describe("Node.js compatibility", () => {
test("all tests pass in Node.js", async () => {
const node = nodeExe();
if (!node) {
throw new Error("Node.js not found in PATH");
}
const testFile = fileURLToPath(import.meta.url);
await using proc = Bun.spawn({
cmd: [node, "--test", testFile],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
if (exitCode !== 0) {
throw new Error(`Node.js tests failed with code ${exitCode}\n${stderr}\n${stdout}`);
}
});
});
}
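The tests above all exercise the same precedence rule: TLS options set on the agent act as defaults that per-request options can override. A minimal sketch of that merge order (`mergeConnectOptions` is an illustrative helper, not a Bun API):

```typescript
// Hypothetical sketch of the option precedence the tests above rely on:
// per-request connect options win, agent.options fill in anything missing.
interface TlsConnectOptions {
  servername?: string;
  ciphers?: string;
  passphrase?: string;
  ca?: string | string[];
  rejectUnauthorized?: boolean;
}

function mergeConnectOptions(
  agentOptions: TlsConnectOptions,
  requestOptions: TlsConnectOptions,
): TlsConnectOptions {
  // Spread order matters: request-level keys shadow agent-level keys.
  return { ...agentOptions, ...requestOptions };
}

const merged = mergeConnectOptions(
  { servername: "localhost", ciphers: "HIGH:!aNULL:!MD5" },
  { servername: "example.com" },
);
// merged.servername === "example.com"  (request wins)
// merged.ciphers === "HIGH:!aNULL:!MD5" (inherited from the agent)
```

This is why the passphrase test can omit `passphrase` on the request: the agent-level value fills the gap.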

View File

@@ -480,6 +480,25 @@ if (isDockerEnabled()) {
expect(b).toEqual({ b: 2 });
});
test("Binary", async () => {
const random_name = ("t_" + Bun.randomUUIDv7("hex").replaceAll("-", "")).toLowerCase();
await sql`CREATE TEMPORARY TABLE ${sql(random_name)} (a binary(1), b varbinary(1), c blob)`;
const values = [
{ a: Buffer.from([1]), b: Buffer.from([2]), c: Buffer.from([3]) },
];
await sql`INSERT INTO ${sql(random_name)} ${sql(values)}`;
const results = await sql`select * from ${sql(random_name)}`;
// return buffers
expect(results[0].a).toEqual(Buffer.from([1]));
expect(results[0].b).toEqual(Buffer.from([2]));
expect(results[0].c).toEqual(Buffer.from([3]));
// text protocol should behave the same
const results2 = await sql`select * from ${sql(random_name)}`.simple();
expect(results2[0].a).toEqual(Buffer.from([1]));
expect(results2[0].b).toEqual(Buffer.from([2]));
expect(results2[0].c).toEqual(Buffer.from([3]));
});
test("bulk insert nested sql()", async () => {
await using sql = new SQL({ ...getOptions(), max: 1 });
await sql`create temporary table test_users (name text, age int)`;

View File

@@ -138,7 +138,6 @@ NestedClass {
foo: FooWithProp {
a: 1,
},
test: [Function: test],
}
myCustomName {
[Symbol(Symbol.toStringTag)]: "myCustomName",

View File

@@ -0,0 +1,52 @@
import { heapStats } from "bun:jsc";
import { describe, expect, test } from "bun:test";
describe("Bun.serve response stream leak", () => {
test("proxy server forwarding streaming response should not leak", async () => {
// Backend server that returns a streaming response with delay
await using backend = Bun.serve({
port: 0,
fetch(req) {
const stream = new ReadableStream({
async start(controller) {
controller.enqueue(new TextEncoder().encode("chunk1"));
await Bun.sleep(10);
controller.enqueue(new TextEncoder().encode("chunk2"));
controller.close();
},
});
return new Response(stream);
},
});
// Proxy server that forwards the response body stream
await using proxy = Bun.serve({
port: 0,
async fetch(req) {
const backendResponse = await fetch(`http://localhost:${backend.port}/`);
return new Response(backendResponse.body);
},
});
const url = `http://localhost:${proxy.port}/`;
async function leak() {
const response = await fetch(url);
return await response.text();
}
for (let i = 0; i < 200; i++) {
await leak();
}
await Bun.sleep(10);
Bun.gc(true);
await Bun.sleep(10);
Bun.gc(true);
const counts = heapStats().objectTypeCounts;
const readableStreamCount = counts.ReadableStream || 0;
const responseCount = counts.Response || 0;
expect(readableStreamCount).toBeLessThanOrEqual(50);
expect(responseCount).toBeLessThanOrEqual(50);
});
});

View File

@@ -0,0 +1,105 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDirWithFiles } from "harness";
// Test for https://github.com/oven-sh/bun/issues/26039
// When parsing a bun.lock file with an empty registry URL for a scoped package,
// bun should use the scope-specific registry from bunfig.toml, not the default npm registry.
test("frozen lockfile should use scope-specific registry for scoped packages", async () => {
const dir = tempDirWithFiles("scoped-registry-test", {
"package.json": JSON.stringify({
name: "test-scoped-registry",
version: "1.0.0",
dependencies: {
"@example/test-package": "^1.0.0",
},
}),
"bunfig.toml": `
[install.scopes]
example = { url = "https://npm.pkg.github.com" }
`,
// bun.lock with empty string for registry URL - this should trigger the scope lookup
"bun.lock": JSON.stringify(
{
lockfileVersion: 1,
workspaces: {
"": {
dependencies: {
"@example/test-package": "^1.0.0",
},
},
},
packages: {
"@example/test-package": ["@example/test-package@1.0.0", "", {}, "sha512-AAAA"],
},
},
null,
2,
),
});
// Run bun install --frozen-lockfile. It will fail because the package doesn't exist,
// but the error message should show the correct registry URL (npm.pkg.github.com, not registry.npmjs.org)
const { stderr, exitCode } = Bun.spawnSync({
cmd: [bunExe(), "install", "--frozen-lockfile"],
cwd: dir,
env: bunEnv,
});
const stderrText = stderr.toString();
// Before the fix, this would try to fetch from https://registry.npmjs.org/@example/test-package/-/test-package-1.0.0.tgz
// After the fix, it should try to fetch from https://npm.pkg.github.com/@example/test-package/-/test-package-1.0.0.tgz
expect(stderrText).toContain("npm.pkg.github.com");
expect(stderrText).not.toContain("registry.npmjs.org");
// The install should fail because the package doesn't exist on the registry
expect(exitCode).not.toBe(0);
});
// Test that non-scoped packages still use the default registry when registry URL is empty
test("frozen lockfile should use default registry for non-scoped packages", async () => {
const dir = tempDirWithFiles("non-scoped-registry-test", {
"package.json": JSON.stringify({
name: "test-non-scoped-registry",
version: "1.0.0",
dependencies: {
"fake-nonexistent-package": "^1.0.0",
},
}),
"bunfig.toml": `
[install.scopes]
example = { url = "https://npm.pkg.github.com" }
`,
// bun.lock with empty string for registry URL for non-scoped package
"bun.lock": JSON.stringify(
{
lockfileVersion: 1,
workspaces: {
"": {
dependencies: {
"fake-nonexistent-package": "^1.0.0",
},
},
},
packages: {
"fake-nonexistent-package": ["fake-nonexistent-package@1.0.0", "", {}, "sha512-BBBB"],
},
},
null,
2,
),
});
const { stderr, exitCode } = Bun.spawnSync({
cmd: [bunExe(), "install", "--frozen-lockfile"],
cwd: dir,
env: bunEnv,
});
const stderrText = stderr.toString();
// Non-scoped packages should still use the default registry
expect(stderrText).toContain("registry.npmjs.org");
expect(stderrText).not.toContain("npm.pkg.github.com");
// The install should fail because the package doesn't exist on the registry
expect(exitCode).not.toBe(0);
});
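The two tests above pin down the lookup rule the fix implements. A rough TypeScript model of that rule (the real fix lives in Zig via `manager.scopeForPackageName`; `registryForPackage` is an illustrative name, not a Bun API):

```typescript
// Rough model of the scope lookup described above: when a lockfile entry has
// an empty registry URL, derive the scope from the package name and fall back
// to the default registry only for unscoped packages.
function registryForPackage(
  packageName: string,
  scopes: Record<string, string>,
  defaultRegistry: string,
): string {
  // Scoped packages look like "@scope/name"; the scope key in bunfig.toml
  // is stored without the leading "@".
  if (packageName.startsWith("@")) {
    const slash = packageName.indexOf("/");
    if (slash !== -1) {
      const scope = packageName.slice(1, slash);
      if (scope in scopes) return scopes[scope];
    }
  }
  return defaultRegistry;
}

const scopes = { example: "https://npm.pkg.github.com" };
registryForPackage("@example/test-package", scopes, "https://registry.npmjs.org");
// → "https://npm.pkg.github.com"
registryForPackage("fake-nonexistent-package", scopes, "https://registry.npmjs.org");
// → "https://registry.npmjs.org"
```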

View File

@@ -0,0 +1,119 @@
import { describe, expect, test } from "bun:test";
import { bunEnv, bunExe } from "harness";
describe("console.log should only display own properties", () => {
test("Object.create with prototype properties should not show inherited properties", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`
const obj = Object.create({ key: 123 });
console.log(obj);
obj.key = 456;
console.log(obj);
`,
],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const stdout = await proc.stdout.text();
const stderr = await proc.stderr.text();
const exitCode = await proc.exited;
expect(stderr).toBe("");
// First line: empty object (no own properties)
// Second line: object with own property key: 456
expect(stdout).toContain("{}");
expect(stdout).toContain("key: 456");
expect(stdout).not.toContain("key: 123");
expect(exitCode).toBe(0);
});
test("Object.create(null) with own properties should display them", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`
const obj = Object.create(null);
obj.foo = "bar";
console.log(obj);
`,
],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const stdout = await proc.stdout.text();
const stderr = await proc.stderr.text();
const exitCode = await proc.exited;
expect(stderr).toBe("");
expect(stdout).toContain("foo:");
expect(stdout).toContain('"bar"');
expect(exitCode).toBe(0);
});
test("regular object should display own properties normally", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`
const obj = { a: 1, b: 2 };
console.log(obj);
`,
],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const stdout = await proc.stdout.text();
const stderr = await proc.stderr.text();
const exitCode = await proc.exited;
expect(stderr).toBe("");
expect(stdout).toContain("a: 1");
expect(stdout).toContain("b: 2");
expect(exitCode).toBe(0);
});
test("class instances should display own properties, not inherited methods", async () => {
await using proc = Bun.spawn({
cmd: [
bunExe(),
"-e",
`
class Foo {
constructor() {
this.value = 42;
}
method() {
return this.value;
}
}
const foo = new Foo();
console.log(foo);
`,
],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const stdout = await proc.stdout.text();
const stderr = await proc.stderr.text();
const exitCode = await proc.exited;
expect(stderr).toBe("");
expect(stdout).toContain("value: 42");
// Should not display inherited method as own property
expect(stdout).not.toMatch(/method:\s*\[Function/);
expect(exitCode).toBe(0);
});
});
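The distinction these tests assert is the own-vs-inherited split in the language itself: `Object.keys` only sees own enumerable properties, while the `in` operator also walks the prototype chain. A minimal standalone illustration:

```typescript
// Own vs. inherited enumerable properties — the distinction the tests above
// assert console.log respects.
const proto = { key: 123 };
const obj: { key?: number } = Object.create(proto);

const reachableBefore = "key" in obj; // true — reachable via the prototype chain
const ownBefore = Object.keys(obj);   // [] — no own properties yet, prints as {}

obj.key = 456;                        // creates an own property shadowing the inherited one
const ownAfter = Object.keys(obj);    // ["key"]
// obj.key is now 456 — the own value, never the prototype's 123
```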

View File

@@ -0,0 +1,64 @@
import { expect, test } from "bun:test";
// GitHub Issue #25639: setTimeout Timeout object missing _idleStart property
// Next.js 16 uses _idleStart to coordinate timers for Cache Components
test("setTimeout returns Timeout object with _idleStart property", () => {
const timer = setTimeout(() => {}, 100);
try {
// Verify _idleStart exists and is a number
expect("_idleStart" in timer).toBe(true);
expect(typeof timer._idleStart).toBe("number");
// _idleStart should be a positive timestamp
expect(timer._idleStart).toBeGreaterThan(0);
} finally {
clearTimeout(timer);
}
});
test("setInterval returns Timeout object with _idleStart property", () => {
const timer = setInterval(() => {}, 100);
try {
// Verify _idleStart exists and is a number
expect("_idleStart" in timer).toBe(true);
expect(typeof timer._idleStart).toBe("number");
// _idleStart should be a positive timestamp
expect(timer._idleStart).toBeGreaterThan(0);
} finally {
clearInterval(timer);
}
});
test("_idleStart is writable (Next.js modifies it to coordinate timers)", () => {
const timer = setTimeout(() => {}, 100);
try {
const originalIdleStart = timer._idleStart;
expect(typeof originalIdleStart).toBe("number");
// Next.js sets _idleStart to coordinate timers
const newIdleStart = originalIdleStart - 100;
timer._idleStart = newIdleStart;
expect(timer._idleStart).toBe(newIdleStart);
} finally {
clearTimeout(timer);
}
});
test("timers created at different times have different _idleStart values", async () => {
const timer1 = setTimeout(() => {}, 100);
// Wait a bit to ensure different timestamp
await Bun.sleep(10);
const timer2 = setTimeout(() => {}, 100);
try {
expect(timer2._idleStart).toBeGreaterThanOrEqual(timer1._idleStart);
} finally {
clearTimeout(timer1);
clearTimeout(timer2);
}
});
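Outside the test harness, the property can be poked at directly in any Node-compatible runtime. A minimal sketch (cast to `any`, since `_idleStart` is an undocumented internal and absent from the `Timeout` type definitions):

```typescript
// _idleStart is undocumented, so it is not on the Timeout type; cast to `any`
// to read it, which is effectively what consumers like Next.js do.
const timer: any = setTimeout(() => {}, 100);

const start: number = timer._idleStart; // ms offset into the event loop's lifetime
timer._idleStart = start - 100;         // writable: shift the timer's apparent start
const shifted: number = timer._idleStart;

clearTimeout(timer);
```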

View File

@@ -4,7 +4,7 @@ import { expect, test } from "bun:test";
import { tempDirWithFiles } from "harness";
import { join } from "path";
-test("Bun.build reactFastRefresh option enables React Fast Refresh transform", async () => {
+test.each(["browser", "bun"] as const)("Bun.build reactFastRefresh works with target: %s", async target => {
const dir = tempDirWithFiles("react-fast-refresh-test", {
"component.tsx": `
import { useState } from "react";
@@ -24,7 +24,7 @@ test("Bun.build reactFastRefresh option enables React Fast Refresh transform", a
const buildEnabled = await Bun.build({
entrypoints: [join(dir, "component.tsx")],
reactFastRefresh: true,
-target: "browser",
+target,
external: ["react"],
});
@@ -38,7 +38,7 @@ test("Bun.build reactFastRefresh option enables React Fast Refresh transform", a
// Without reactFastRefresh (default), output should NOT contain refresh calls
const buildDisabled = await Bun.build({
entrypoints: [join(dir, "component.tsx")],
-target: "browser",
+target,
external: ["react"],
});

View File

@@ -0,0 +1,86 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDirWithFiles } from "harness";
test("CSS logical properties should not be stripped when nested rules are present", async () => {
// Test for regression of issue #25794: CSS logical properties (e.g., inset-inline-end)
// are stripped from bundler output when they appear in a nested selector that also
// contains further nested rules (like pseudo-elements).
const dir = tempDirWithFiles("css-logical-properties-nested", {
"input.css": `.test-longform {
background-color: teal;
&.test-longform--end {
inset-inline-end: 20px;
&:after {
content: "";
}
}
}
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "build", "input.css", "--outdir", "out"],
env: bunEnv,
cwd: dir,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// Verify the output CSS contains the logical property fallbacks
const outputContent = await Bun.file(`${dir}/out/input.css`).text();
// Helper function to normalize CSS output for snapshots
function normalizeCSSOutput(output: string): string {
return output
.replace(/\/\*.*?\*\//g, "/* [path] */") // Replace comment paths
.trim();
}
// The output should contain LTR/RTL fallback rules for inset-inline-end
// inset-inline-end: 20px should generate:
// - right: 20px for LTR languages
// - left: 20px for RTL languages
// The bundler generates vendor-prefixed variants for browser compatibility
expect(normalizeCSSOutput(outputContent)).toMatchInlineSnapshot(`
"/* [path] */
.test-longform {
background-color: teal;
}
.test-longform.test-longform--end:not(:-webkit-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:not(:-moz-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:not(:is(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:-webkit-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:-moz-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:is(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:after {
content: "";
}"
`);
// Should exit successfully
expect(exitCode).toBe(0);
});
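The snapshot above is the standard lowering for logical properties: one physical declaration for LTR contexts and one for RTL, selected by `:lang()` guards (plus vendor-prefixed `:-webkit-any`/`:-moz-any` variants for older browsers). A toy model of the mapping (illustrative only; the real transform lives in the CSS bundler):

```typescript
// Toy model of the logical-to-physical lowering shown in the snapshot above:
// inset-inline-end maps to `right` in LTR writing modes and `left` in RTL.
type PhysicalRule = { property: "left" | "right"; value: string; direction: "ltr" | "rtl" };

function lowerInsetInlineEnd(value: string): PhysicalRule[] {
  return [
    { property: "right", value, direction: "ltr" }, // guarded by :not(:is(:lang(ar), ...))
    { property: "left", value, direction: "rtl" },  // guarded by :is(:lang(ar), ...)
  ];
}

const rules = lowerInsetInlineEnd("20px");
```

The bug was that these generated fallback rules were dropped entirely when the declaring selector also contained nested rules such as `&:after`.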