Compare commits


14 Commits

Author SHA1 Message Date
Claude Bot
eada7f7fb6 fix(bundler): preserve sideEffects from package.json when onResolve plugin returns null
Fixes #2952

When an onResolve plugin returns null (no match), the bundler falls back to
default file resolution. Previously, the sideEffects field from package.json
was not being properly propagated in this code path, causing tree-shaking to
fail for packages like lodash-es that rely on sideEffects: false.

This fix:
1. Updates enqueueParseTask to use resolve_result.primary_side_effects_data
   instead of loader.sideEffects()
2. Adds lookupSideEffectsForPath helper to look up sideEffects from package.json
   when an onResolve plugin returns a path in the "file" namespace
3. Uses the looked-up sideEffects in the onResolve success handler
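The lookup added in step 2 can be sketched in TypeScript as follows. This is an illustrative model only (the real implementation is in Zig, and the name `lookupSideEffectsForPath` is the only thing taken from the commit): walk up from the resolved file to the nearest `package.json` and read its `sideEffects` field, with the file reader injected so the logic stays self-contained.

```typescript
// Hypothetical sketch of the sideEffects lookup. Given a resolved file path,
// walk up the directory tree to the nearest package.json and return its
// "sideEffects" field (a boolean or an array of glob patterns), or undefined
// if no package.json declares one.
type ReadFile = (path: string) => string | undefined;

function lookupSideEffectsForPath(
  filePath: string,
  readFile: ReadFile,
): boolean | string[] | undefined {
  let dir = filePath;
  while (true) {
    const slash = dir.lastIndexOf("/");
    if (slash <= 0) return undefined; // reached the filesystem root
    dir = dir.slice(0, slash);
    const raw = readFile(dir + "/package.json");
    if (raw === undefined) continue; // no package.json here; keep walking up
    try {
      const pkg = JSON.parse(raw);
      return "sideEffects" in pkg ? pkg.sideEffects : undefined;
    } catch {
      return undefined; // malformed package.json: treat as unknown
    }
  }
}
```

With `sideEffects: false` propagated this way, a package like lodash-es resolved through a non-matching onResolve plugin can still be tree-shaken.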

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 01:15:37 +00:00
Ciro Spaciari
22bebfc467 respect agent options and connectOpts in https module (#25937) 2026-01-14 11:52:53 -08:00
robobun
1800093a64 fix(install): use scope-specific registry for scoped packages in frozen lockfile (#26047)
## Summary
- Fixed `bun install --frozen-lockfile` to use scope-specific registry
for scoped packages when the lockfile has an empty registry URL

When parsing a `bun.lock` file with an empty registry URL for a scoped
package (like `@example/test-package`), bun was unconditionally using
the default npm registry (`https://registry.npmjs.org/`) instead of
looking up the scope-specific registry from `bunfig.toml`.

For example, with this configuration in `bunfig.toml`:
```toml
[install.scopes]
example = { url = "https://npm.pkg.github.com" }
```

And this lockfile entry with an empty registry URL:
```json
"@example/test-package": ["@example/test-package@1.0.0", "", {}, "sha512-AAAA"]
```

bun would try to fetch from
`https://registry.npmjs.org/@example/test-package/-/...` instead of
`https://npm.pkg.github.com/@example/test-package/-/...`.

The fix uses `manager.scopeForPackageName()` (the same pattern used in
`pnpm.zig`) to look up the correct scope-specific registry URL.
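The scope lookup described above can be sketched as a small pure function. This is not Bun's actual code (which lives in Zig as `manager.scopeForPackageName()`); the function name and shape here are illustrative, showing only the decision the fix makes: prefer a scope-specific registry from `[install.scopes]`, otherwise fall back to the default.

```typescript
// Illustrative sketch of scope-specific registry resolution.
const DEFAULT_REGISTRY = "https://registry.npmjs.org/";

function registryForPackage(
  name: string,
  scopes: Record<string, string>,
): string {
  if (name.startsWith("@")) {
    const slash = name.indexOf("/");
    if (slash > 1) {
      const scope = name.slice(1, slash); // "@example/pkg" -> "example"
      const url = scopes[scope];
      if (url) return url; // scope configured in bunfig.toml wins
    }
  }
  return DEFAULT_REGISTRY; // non-scoped, or no scope entry configured
}
```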

## Test plan
- [x] Added regression test `test/regression/issue/026039.test.ts` that
verifies:
  - Scoped packages use the scope-specific registry from `bunfig.toml`
  - Non-scoped packages continue to use the default registry
- [x] Verified test fails with system bun (without fix) and passes with
debug build (with fix)

Fixes #26039

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-14 10:22:30 -08:00
Jarred Sumner
967a6a2021 Fix blocking realpathSync call (#26056)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-13 23:05:46 -08:00
Jarred Sumner
49d0fbd2de Update 25716.test.ts 2026-01-13 22:38:31 -08:00
Tommy D. Rossi
af2317deb4 fix(bundler): allow reactFastRefresh Bun.build option with non-browser targets (#26035)
### What does this PR do?

Previously, reactFastRefresh was silently ignored when target was not
'browser', even when explicitly enabled. This was confusing as there was
no warning or error.

This change removes the `target == .browser` check, trusting explicit
user intent. If users enable reactFastRefresh with a non-browser target,
the transform will now be applied. If `$RefreshReg$` is not defined at
runtime, it will fail fast with a clear error rather than silently doing
nothing.

Use case: Terminal UIs (like [termcast](https://termcast.app)) need
React Fast Refresh with target: 'bun' for hot reloading in non-browser
environments.

### How did you verify your code works?

Updated the existing test, removing the browser target.
2026-01-13 22:13:17 -08:00
robobun
ab009fe00d fix(init): respect --minimal flag for agent rule files (#26051)
## Summary
- Fixes `bun init --minimal` creating Cursor rules files and CLAUDE.md
when it shouldn't
- Adds regression test to verify `--minimal` only creates package.json
and tsconfig.json

## Test plan
- [x] Verify test fails with system bun (unfixed): `USE_SYSTEM_BUN=1 bun
test test/cli/init/init.test.ts -t "bun init --minimal"`
- [x] Verify test passes with debug build: `bun bd test
test/cli/init/init.test.ts -t "bun init --minimal"`
- [x] All existing init tests pass: `bun bd test
test/cli/init/init.test.ts`

Fixes #26050

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 18:33:42 -08:00
Jarred Sumner
fbd800551b Bump 2026-01-13 15:06:36 -08:00
robobun
113cdd9648 fix(completions): add update command to Fish completions (#25978)
## Summary

- Add the `update` subcommand to Fish shell completions
- Apply the install/add/remove flags (--global, --dry-run, --force,
etc.) to the `update` command

Previously, Fish shell autocompletion for `bun update --gl<TAB>` would
not work because:
1. The `update` command was missing from the list of built-in commands
2. The install/add/remove flags were not being applied to `update`

Fixes #25953

## Test plan

- [x] Verify `update` appears in subcommand completions (`bun <TAB>`)
- [x] Verify `--global` flag completion works (`bun update --gl<TAB>`)
- [x] Verify other install flags work with update (--dry-run, --force,
etc.)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 15:05:00 -08:00
robobun
3196178fa7 fix(timers): add _idleStart property to Timeout object (#26021)
## Summary

- Add `_idleStart` property (getter/setter) to the Timeout object
returned by `setTimeout()` and `setInterval()`
- The property returns a monotonic timestamp (in milliseconds)
representing when the timer was created
- This mimics Node.js's behavior where `_idleStart` is the libuv
timestamp at timer creation time
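The semantics can be modeled with a toy class — Bun's real Timeout is implemented natively, so `TimeoutLike` below is purely illustrative: the monotonic timestamp is captured once, at construction time, and exposed as `_idleStart`.

```typescript
// Toy model of the _idleStart semantics: capture a monotonic timestamp
// (performance.now(), analogous to the libuv clock Node.js uses) when the
// timer object is created, and expose it via a getter.
class TimeoutLike {
  private readonly idleStart: number;

  constructor() {
    this.idleStart = performance.now(); // fixed at creation time
  }

  get _idleStart(): number {
    return this.idleStart;
  }
}
```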

## Test plan

- [x] Verified test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25639.test.ts`
- [x] Verified test passes with `bun bd test
test/regression/issue/25639.test.ts`
- [x] Manual verification:
  ```bash
  # Bun with fix - _idleStart exists
  ./build/debug/bun-debug -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number

  # Node.js reference - same behavior
  node -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number
  ```

Closes #25639

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 19:35:11 -08:00
robobun
d530ed993d fix(css): restore handler context after minifying nested rules (#25997)
## Summary
- Fixes handler context not being restored after minifying nested CSS
rules
- Adds regression test for the issue

## Test plan
- [x] Test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25794.test.ts`
- [x] Test passes with `bun bd test test/regression/issue/25794.test.ts`

Fixes #25794

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 14:55:27 -08:00
Dylan Conway
959169dfaf feat(archive): change API to constructor-based with S3 support (#25940)
## Summary
- Change Archive API from `Bun.Archive.from(data)` to `new
Bun.Archive(data, options?)`
- Change compression options from `{ gzip: true }` to `{ compress:
"gzip", level?: number }`
- Default to no compression when no options provided
- Use `{ compress: "gzip" }` to enable gzip compression (level 6 by
default)
- Add Archive support for S3 and local file writes via `Bun.write()`

## New API

```typescript
// Create archive - defaults to uncompressed tar
const archive = new Bun.Archive({
  "hello.txt": "Hello, World!",
  "data.json": JSON.stringify({ foo: "bar" }),
});

// Enable gzip compression
const compressed = new Bun.Archive(files, { compress: "gzip" });

// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive(files, { compress: "gzip", level: 12 });

// Write to local file
await Bun.write("archive.tar", archive);           // uncompressed by default
await Bun.write("archive.tar.gz", compressed);     // gzipped

// Write to S3
await client.write("archive.tar.gz", compressed);          // S3Client.write()
await Bun.write("s3://bucket/archive.tar.gz", compressed); // S3 URL
await s3File.write(compressed);                            // s3File.write()

// Get bytes/blob (uses compression setting from constructor)
const bytes = await archive.bytes();
const blob = await archive.blob();
```

## TypeScript Types

```typescript
type ArchiveCompression = "gzip";

type ArchiveOptions = {
  compress?: "gzip";
  level?: number;  // 1-12, default 6 when gzip enabled
};
```

## Test plan
- [x] 98 archive tests pass
- [x] S3 integration tests updated to new API
- [x] TypeScript types updated
- [x] Documentation updated with new examples

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-12 14:54:21 -08:00
SUZUKI Sosuke
461ad886bd fix(http): fix Strong reference leak in server response streaming (#25965)
## Summary

Fix a memory leak in `RequestContext.doRenderWithBody()` where
`Strong.Impl` memory was leaked when proxying streaming responses
through Bun's HTTP server.

## Problem

When a streaming response (e.g., from a proxied fetch request) was
forwarded through Bun's server:

1. `response_body_readable_stream_ref` was initialized at line 1836
(from `lock.readable`) or line 1841 (via `Strong.init()`)
2. For `.Bytes` streams with `has_received_last_chunk=false`, a **new**
Strong reference was created at line 1902
3. The old Strong reference was **never deinit'd**, causing
`Strong.Impl` memory to leak

This leak accumulated over time with every streaming response proxied
through the server.

## Solution

Add `this.response_body_readable_stream_ref.deinit()` before creating
the new Strong reference. This is safe because:

- `stream` exists as a stack-local variable
- JSC's conservative GC tracks stack-local JSValues
- No GC can occur between consecutive synchronous Zig statements
- Therefore, `stream` won't be collected between `deinit()` and
`Strong.init()`
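The leak and its fix can be modeled with a toy ownership class — the real code is Zig, and every name below is illustrative: a Strong reference owns an allocation (`Strong.Impl`) that must be released with `deinit()` before the field is pointed at a new reference, or the old allocation is orphaned.

```typescript
// Toy model of the Strong.Impl leak. The counter stands in for live
// Strong.Impl allocations.
let liveStrongImpls = 0;

class StrongRef<T> {
  constructor(public value: T) {
    liveStrongImpls++; // models allocating Strong.Impl
  }
  deinit(): void {
    liveStrongImpls--; // models freeing Strong.Impl
  }
}

class ResponseContext<T> {
  ref: StrongRef<T> | null = null;

  // Buggy path: overwrites `ref` without releasing the old allocation.
  replaceLeaky(stream: T): void {
    this.ref = new StrongRef(stream);
  }

  // Fixed path: deinit the old reference first, as the patch does.
  replaceFixed(stream: T): void {
    this.ref?.deinit();
    this.ref = new StrongRef(stream);
  }
}
```

Each call to `replaceLeaky` on a context that already holds a reference strands one allocation, which is the accumulation described above; `replaceFixed` keeps the live count at one per context.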

## Test

Added `test/js/web/fetch/server-response-stream-leak.test.ts` which:
- Creates a backend server that returns delayed streaming responses
- Creates a proxy server that forwards the streaming responses
- Makes 200 requests and checks that ReadableStream objects don't
accumulate
- Fails on system Bun v1.3.5 (202 leaked), passes with the fix

## Related

Similar to the Strong reference leak fixes in:
- #23313 (fetch memory leak)
- #25846 (fetch cyclic reference leak)
2026-01-12 14:41:58 -08:00
Markus Schmidt
b6abbd50a0 fix(Bun.SQL): handle binary columns in MySQL correctly (#26011)
## What does this PR do?
Currently, binary columns are returned as strings, which means they get
corrupted when encoded as UTF-8. This PR returns binary columns as
Buffers, which is what users actually expect and is also consistent with
PostgreSQL and SQLite.
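The corruption is easy to demonstrate: arbitrary bytes are not valid UTF-8, so a bytes-to-string-to-bytes round trip is lossy (invalid sequences become U+FFFD replacement characters), while a Buffer preserves the original bytes.

```typescript
// Why returning binary columns as strings corrupts them.
const original = Buffer.from([0xff, 0x00, 0xfe, 0x80]); // not valid UTF-8

const asString = original.toString("utf8"); // lossy decode
const roundTripped = Buffer.from(asString, "utf8");

console.log(original.equals(roundTripped)); // false: data was corrupted
```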
### How did you verify your code works?
I added tests to verify the correct behavior. Before there were no tests
for binary columns at all.

This fixes #23991
2026-01-12 11:56:02 -08:00
38 changed files with 2227 additions and 465 deletions

LATEST
View File

@@ -1 +1 @@
1.3.5
1.3.6

View File

@@ -35,8 +35,8 @@ end
set -l bun_install_boolean_flags yarn production optional development no-save dry-run force no-cache silent verbose global
set -l bun_install_boolean_flags_descriptions "Write a yarn.lock file (yarn v1)" "Don't install devDependencies" "Add dependency to optionalDependencies" "Add dependency to devDependencies" "Don't update package.json or save a lockfile" "Don't install anything" "Always request the latest versions from the registry & reinstall all dependencies" "Ignore manifest cache entirely" "Don't output anything" "Excessively verbose logging" "Use global folder"
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add update init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x update
function __bun_complete_bins_scripts --inherit-variable bun_builtin_cmds_without_run -d "Emit bun completions for bins and scripts"
# Do nothing if we already have a builtin subcommand,
@@ -148,14 +148,14 @@ complete -c bun \
for i in (seq (count $bun_install_boolean_flags))
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
-n "__fish_seen_subcommand_from install add remove update" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
end
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cwd' -d 'Change working directory'
-n "__fish_seen_subcommand_from install add remove update" -l 'cwd' -d 'Change working directory'
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
-n "__fish_seen_subcommand_from install add remove update" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
complete -c bun \
-n "__fish_seen_subcommand_from add" -d 'Popular' -a '(__fish__get_bun_packages)'
@@ -183,4 +183,5 @@ complete -c bun -n "__fish_use_subcommand" -a "unlink" -d "Unregister a local np
complete -c bun -n "__fish_use_subcommand" -a "pm" -d "Additional package management utilities" -f
complete -c bun -n "__fish_use_subcommand" -a "x" -d "Execute a package binary, installing if needed" -f
complete -c bun -n "__fish_use_subcommand" -a "outdated" -d "Display the latest versions of outdated dependencies" -f
complete -c bun -n "__fish_use_subcommand" -a "update" -d "Update dependencies to their latest versions" -f
complete -c bun -n "__fish_use_subcommand" -a "publish" -d "Publish your package from local to npm" -f

View File

@@ -10,21 +10,21 @@ Bun provides a fast, native implementation for working with tar archives through
**Create an archive from files:**
```ts
const archive = Bun.Archive.from({
const archive = new Bun.Archive({
"hello.txt": "Hello, World!",
"data.json": JSON.stringify({ foo: "bar" }),
"nested/file.txt": "Nested content",
});
// Write to disk
await Bun.Archive.write("bundle.tar", archive);
await Bun.write("bundle.tar", archive);
```
**Extract an archive:**
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const entryCount = await archive.extract("./output");
console.log(`Extracted ${entryCount} entries`);
```
@@ -33,7 +33,7 @@ console.log(`Extracted ${entryCount} entries`);
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
@@ -43,10 +43,11 @@ for (const [path, file] of files) {
## Creating Archives
Use `Bun.Archive.from()` to create an archive from an object where keys are file paths and values are file contents:
Use `new Bun.Archive()` to create an archive from an object where keys are file paths and values are file contents. By default, archives are uncompressed:
```ts
const archive = Bun.Archive.from({
// Creates an uncompressed tar archive (default)
const archive = new Bun.Archive({
"README.md": "# My Project",
"src/index.ts": "console.log('Hello');",
"package.json": JSON.stringify({ name: "my-project" }),
@@ -64,7 +65,7 @@ File contents can be:
const data = "binary data";
const arrayBuffer = new ArrayBuffer(8);
const archive = Bun.Archive.from({
const archive = new Bun.Archive({
"text.txt": "Plain text",
"blob.bin": new Blob([data]),
"bytes.bin": new Uint8Array([1, 2, 3, 4]),
@@ -74,18 +75,19 @@ const archive = Bun.Archive.from({
### Writing Archives to Disk
Use `Bun.Archive.write()` to create and write an archive in one operation:
Use `Bun.write()` to write an archive to disk:
```ts
// Write uncompressed tar
await Bun.Archive.write("output.tar", {
// Write uncompressed tar (default)
const archive = new Bun.Archive({
"file1.txt": "content1",
"file2.txt": "content2",
});
await Bun.write("output.tar", archive);
// Write gzipped tar
const files = { "src/index.ts": "console.log('Hello');" };
await Bun.Archive.write("output.tar.gz", files, "gzip");
const compressed = new Bun.Archive({ "src/index.ts": "console.log('Hello');" }, { compress: "gzip" });
await Bun.write("output.tar.gz", compressed);
```
### Getting Archive Bytes
@@ -93,8 +95,7 @@ await Bun.Archive.write("output.tar.gz", files, "gzip");
Get the archive data as bytes or a Blob:
```ts
const files = { "hello.txt": "Hello, World!" };
const archive = Bun.Archive.from(files);
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// As Uint8Array
const bytes = await archive.bytes();
@@ -102,9 +103,10 @@ const bytes = await archive.bytes();
// As Blob
const blob = await archive.blob();
// With gzip compression
const gzippedBytes = await archive.bytes("gzip");
const gzippedBlob = await archive.blob("gzip");
// With gzip compression (set at construction)
const gzipped = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
const gzippedBytes = await gzipped.bytes();
const gzippedBlob = await gzipped.blob();
```
## Extracting Archives
@@ -116,13 +118,13 @@ Create an archive from existing tar/tar.gz data:
```ts
// From a file
const tarball = await Bun.file("package.tar.gz").bytes();
const archiveFromFile = Bun.Archive.from(tarball);
const archiveFromFile = new Bun.Archive(tarball);
```
```ts
// From a fetch response
const response = await fetch("https://example.com/archive.tar.gz");
const archiveFromFetch = Bun.Archive.from(await response.blob());
const archiveFromFetch = new Bun.Archive(await response.blob());
```
### Extracting to Disk
@@ -131,7 +133,7 @@ Use `.extract()` to write all files to a directory:
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./extracted");
console.log(`Extracted ${count} entries`);
```
@@ -148,7 +150,7 @@ Use glob patterns to extract only specific files. Patterns are matched against a
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
// Extract only TypeScript files
const tsCount = await archive.extract("./extracted", { glob: "**/*.ts" });
@@ -181,7 +183,7 @@ Use `.files()` to get archive contents as a `Map` of `File` objects without extr
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
@@ -206,7 +208,7 @@ Archive operations can fail due to corrupted data, I/O errors, or invalid paths.
```ts
try {
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./output");
console.log(`Extracted ${count} entries`);
} catch (e: unknown) {
@@ -227,7 +229,7 @@ try {
Common error scenarios:
- **Corrupted/truncated archives** - `Archive.from()` loads the archive data; errors may be deferred until read/extract operations
- **Corrupted/truncated archives** - `new Archive()` loads the archive data; errors may be deferred until read/extract operations
- **Permission denied** - `extract()` throws if the target directory is not writable
- **Disk full** - `extract()` throws if there's insufficient space
- **Invalid paths** - Operations throw for malformed file paths
@@ -239,7 +241,7 @@ The count returned by `extract()` includes all successfully written entries (fil
For additional security with untrusted archives, you can enumerate and validate paths before extraction:
```ts
const archive = Bun.Archive.from(untrustedData);
const archive = new Bun.Archive(untrustedData);
const files = await archive.files();
// Optional: Custom validation for additional checks
@@ -298,26 +300,28 @@ See [Bun.Glob](/docs/api/glob) for the full glob syntax including escaping and a
## Compression
Bun.Archive supports gzip compression for both reading and writing:
Bun.Archive creates uncompressed tar archives by default. Use `{ compress: "gzip" }` to enable gzip compression:
```ts
// Default: uncompressed tar
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// Reading: automatically detects gzip
const gzippedTarball = await Bun.file("archive.tar.gz").bytes();
const archive = Bun.Archive.from(gzippedTarball);
const readArchive = new Bun.Archive(gzippedTarball);
// Writing: specify compression
const files = { "hello.txt": "Hello, World!" };
await Bun.Archive.write("output.tar.gz", files, "gzip");
// Enable gzip compression
const compressed = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
// Getting bytes: specify compression
const gzippedBytes = await archive.bytes("gzip");
// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip", level: 12 });
```
The compression argument accepts:
The options accept:
- `"gzip"` - Enable gzip compression
- `true` - Same as `"gzip"`
- `false` or `undefined` - No compression
- No options or `undefined` - Uncompressed tar (default)
- `{ compress: "gzip" }` - Enable gzip compression at level 6
- `{ compress: "gzip", level: number }` - Gzip with custom level 1-12 (1 = fastest, 12 = smallest)
## Examples
@@ -339,15 +343,16 @@ for await (const path of glob.scan(".")) {
// Add package.json
files["package.json"] = await Bun.file("package.json").text();
// Create compressed archive
await Bun.Archive.write("bundle.tar.gz", files, "gzip");
// Create compressed archive and write to disk
const archive = new Bun.Archive(files, { compress: "gzip" });
await Bun.write("bundle.tar.gz", archive);
```
### Extract and Process npm Package
```ts
const response = await fetch("https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz");
const archive = Bun.Archive.from(await response.blob());
const archive = new Bun.Archive(await response.blob());
// Get package.json
const files = await archive.files("package/package.json");
@@ -365,7 +370,7 @@ if (packageJson) {
import { readdir } from "node:fs/promises";
import { join } from "node:path";
async function archiveDirectory(dir: string): Promise<Bun.Archive> {
async function archiveDirectory(dir: string, compress = false): Promise<Bun.Archive> {
const files: Record<string, Blob> = {};
async function walk(currentDir: string, prefix: string = "") {
@@ -384,11 +389,11 @@ async function archiveDirectory(dir: string): Promise<Bun.Archive> {
}
await walk(dir);
return Bun.Archive.from(files);
return new Bun.Archive(files, compress ? { compress: "gzip" } : undefined);
}
const archive = await archiveDirectory("./my-project");
await Bun.Archive.write("my-project.tar.gz", archive, "gzip");
const archive = await archiveDirectory("./my-project", true);
await Bun.write("my-project.tar.gz", archive);
```
## Reference
@@ -396,14 +401,19 @@ await Bun.Archive.write("my-project.tar.gz", archive, "gzip");
> **Note**: The following type signatures are simplified for documentation purposes. See [`packages/bun-types/bun.d.ts`](https://github.com/oven-sh/bun/blob/main/packages/bun-types/bun.d.ts) for the full type definitions.
```ts
type ArchiveCompression = "gzip" | boolean;
type ArchiveInput =
| Record<string, string | Blob | Bun.ArrayBufferView | ArrayBufferLike>
| Blob
| Bun.ArrayBufferView
| ArrayBufferLike;
type ArchiveOptions = {
/** Compression algorithm. Currently only "gzip" is supported. */
compress?: "gzip";
/** Compression level 1-12 (default 6 when gzip is enabled). */
level?: number;
};
interface ArchiveExtractOptions {
/** Glob pattern(s) to filter extraction. Supports negative patterns with "!" prefix. */
glob?: string | readonly string[];
@@ -412,13 +422,11 @@ interface ArchiveExtractOptions {
class Archive {
/**
* Create an Archive from input data
* @param data - Files to archive (as object) or existing archive data (as bytes/blob)
* @param options - Compression options. Uncompressed by default.
* Pass { compress: "gzip" } to enable compression.
*/
static from(data: ArchiveInput): Archive;
/**
* Write an archive directly to disk
*/
static write(path: string, data: ArchiveInput | Archive, compress?: ArchiveCompression): Promise<void>;
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Extract archive to a directory
@@ -427,14 +435,14 @@ class Archive {
extract(path: string, options?: ArchiveExtractOptions): Promise<number>;
/**
* Get archive as a Blob
* Get archive as a Blob (uses compression setting from constructor)
*/
blob(compress?: ArchiveCompression): Promise<Blob>;
blob(): Promise<Blob>;
/**
* Get archive as a Uint8Array
* Get archive as a Uint8Array (uses compression setting from constructor)
*/
bytes(compress?: ArchiveCompression): Promise<Uint8Array<ArrayBuffer>>;
bytes(): Promise<Uint8Array<ArrayBuffer>>;
/**
* Get archive contents as File objects (regular files only, no directories)

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.3.6",
"version": "1.3.7",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"

View File

@@ -750,7 +750,7 @@ declare module "bun" {
*/
function write(
destination: BunFile | S3File | PathLike,
input: Blob | NodeJS.TypedArray | ArrayBufferLike | string | BlobPart[],
input: Blob | NodeJS.TypedArray | ArrayBufferLike | string | BlobPart[] | Archive,
options?: {
/**
* If writing to a PathLike, set the permissions of the file.
@@ -6975,15 +6975,44 @@ declare module "bun" {
/**
* Compression format for archive output.
* - `"gzip"` - Compress with gzip
* - `true` - Same as `"gzip"`
* - `false` - Explicitly disable compression (no compression)
* - `undefined` - No compression (default behavior when omitted)
*
* Both `false` and `undefined` result in no compression; `false` can be used
* to explicitly indicate "no compression" in code where the intent should be clear.
* Currently only `"gzip"` is supported.
*/
type ArchiveCompression = "gzip" | boolean;
type ArchiveCompression = "gzip";
/**
* Options for creating an Archive instance.
*
* By default, archives are not compressed. Use `{ compress: "gzip" }` to enable compression.
*
* @example
* ```ts
* // No compression (default)
* new Bun.Archive(data);
*
* // Enable gzip with default level (6)
* new Bun.Archive(data, { compress: "gzip" });
*
* // Specify compression level
* new Bun.Archive(data, { compress: "gzip", level: 9 });
* ```
*/
interface ArchiveOptions {
/**
* Compression algorithm to use.
* Currently only "gzip" is supported.
* If not specified, no compression is applied.
*/
compress?: ArchiveCompression;
/**
* Compression level (1-12). Only applies when `compress` is set.
* - 1: Fastest compression, lowest ratio
* - 6: Default balance of speed and ratio
* - 12: Best compression ratio, slowest
*
* @default 6
*/
level?: number;
}
/**
* Options for extracting archive contents.
@@ -7031,7 +7060,7 @@ declare module "bun" {
* @example
* **Create an archive from an object:**
* ```ts
* const archive = Bun.Archive.from({
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* "data.json": JSON.stringify({ foo: "bar" }),
* "binary.bin": new Uint8Array([1, 2, 3, 4]),
@@ -7039,9 +7068,20 @@ declare module "bun" {
* ```
*
* @example
* **Create a gzipped archive:**
* ```ts
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* }, { compress: "gzip" });
*
* // Or with a specific compression level (1-12)
* const archive = new Bun.Archive(data, { compress: "gzip", level: 9 });
* ```
*
* @example
* **Extract an archive to disk:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const entryCount = await archive.extract("./output");
* console.log(`Extracted ${entryCount} entries`);
* ```
@@ -7049,7 +7089,7 @@ declare module "bun" {
* @example
* **Get archive contents as a Map of File objects:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const entries = await archive.files();
* for (const [path, file] of entries) {
* console.log(path, await file.text());
@@ -7062,36 +7102,50 @@ declare module "bun" {
* await Bun.Archive.write("bundle.tar.gz", {
* "src/index.ts": sourceCode,
* "package.json": packageJson,
* }, "gzip");
* }, { compress: "gzip" });
* ```
*/
export class Archive {
/**
* Create an `Archive` instance from input data.
*
* By default, archives are not compressed. Use `{ compress: "gzip" }` to enable compression.
*
* @param data - The input data for the archive:
* - **Object**: Creates a new tarball with the object's keys as file paths and values as file contents
* - **Blob/TypedArray/ArrayBuffer**: Wraps existing archive data (tar or tar.gz)
*
* @returns A new `Archive` instance
* @param options - Optional archive options including compression settings.
* Defaults to no compression if omitted.
*
* @example
* **From an object (creates new tarball):**
* **From an object (creates uncompressed tarball):**
* ```ts
* const archive = Bun.Archive.from({
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* "nested/file.txt": "Nested content",
* });
* ```
*
* @example
* **With gzip compression:**
* ```ts
* const archive = new Bun.Archive(data, { compress: "gzip" });
* ```
*
* @example
* **With explicit gzip compression level:**
* ```ts
* const archive = new Bun.Archive(data, { compress: "gzip", level: 12 });
* ```
*
* @example
* **From existing archive data:**
* ```ts
* const response = await fetch("https://example.com/package.tar.gz");
* const archive = Bun.Archive.from(await response.blob());
* const archive = new Bun.Archive(await response.blob());
* ```
*/
static from(data: ArchiveInput): Archive;
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Create and write an archive directly to disk in one operation.
@@ -7100,8 +7154,8 @@ declare module "bun" {
* as it streams the data directly to disk.
*
* @param path - The file path to write the archive to
* @param data - The input data for the archive (same as `Archive.from()`)
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* @param data - The input data for the archive (same as `new Archive()`)
* @param options - Optional archive options including compression settings
*
* @returns A promise that resolves when the write is complete
*
@@ -7117,10 +7171,10 @@ declare module "bun" {
* @example
* **Write gzipped tarball:**
* ```ts
* await Bun.Archive.write("output.tar.gz", files, "gzip");
* await Bun.Archive.write("output.tar.gz", files, { compress: "gzip" });
* ```
*/
static write(path: string, data: ArchiveInput | Archive, compress?: ArchiveCompression): Promise<void>;
static write(path: string, data: ArchiveInput | Archive, options?: ArchiveOptions): Promise<void>;
/**
* Extract the archive contents to a directory on disk.
@@ -7136,7 +7190,7 @@ declare module "bun" {
* @example
* **Extract all entries:**
* ```ts
- * const archive = Bun.Archive.from(tarballBytes);
+ * const archive = new Bun.Archive(tarballBytes);
* const count = await archive.extract("./extracted");
* console.log(`Extracted ${count} entries`);
* ```
@@ -7166,42 +7220,48 @@ declare module "bun" {
/**
* Get the archive contents as a `Blob`.
*
- * @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
+ * Uses the compression settings specified when the Archive was created.
*
* @returns A promise that resolves with the archive data as a Blob
*
* @example
- * **Get uncompressed tarball:**
+ * **Get tarball as Blob:**
* ```ts
* const archive = new Bun.Archive(data);
* const blob = await archive.blob();
* ```
*
* @example
- * **Get gzipped tarball:**
+ * **Get gzipped tarball as Blob:**
 * ```ts
- * const gzippedBlob = await archive.blob("gzip");
+ * const archive = new Bun.Archive(data, { compress: "gzip" });
+ * const gzippedBlob = await archive.blob();
* ```
*/
- blob(compress?: ArchiveCompression): Promise<Blob>;
+ blob(): Promise<Blob>;
/**
* Get the archive contents as a `Uint8Array`.
*
- * @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
+ * Uses the compression settings specified when the Archive was created.
*
* @returns A promise that resolves with the archive data as a Uint8Array
*
* @example
- * **Get uncompressed tarball bytes:**
+ * **Get tarball bytes:**
* ```ts
* const archive = new Bun.Archive(data);
* const bytes = await archive.bytes();
* ```
*
* @example
* **Get gzipped tarball bytes:**
* ```ts
- * const gzippedBytes = await archive.bytes("gzip");
+ * const archive = new Bun.Archive(data, { compress: "gzip" });
+ * const gzippedBytes = await archive.bytes();
* ```
*/
- bytes(compress?: ArchiveCompression): Promise<Uint8Array<ArrayBuffer>>;
+ bytes(): Promise<Uint8Array<ArrayBuffer>>;
/**
* Get the archive contents as a `Map` of `File` objects.

View File
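The `level` option in the type definitions above maps onto libdeflate levels 1-12 (default 6) in the native implementation. `Bun.Archive` itself only exists inside Bun, but the size/CPU tradeoff the option exposes can be sketched with Node's `zlib` (levels 1-9), purely as an illustration:

```typescript
// Illustrative only: Bun's Archive uses libdeflate (levels 1-12); node:zlib
// (levels 1-9) shows the same tradeoff — a higher level trades CPU for size.
import { gzipSync, gunzipSync } from "node:zlib";

const payload = Buffer.from("hello world ".repeat(10_000));
const fast = gzipSync(payload, { level: 1 });
const best = gzipSync(payload, { level: 9 });

// Both levels round-trip losslessly; only output size and speed differ.
console.log(`level 1: ${fast.length} bytes, level 9: ${best.length} bytes`);
```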

@@ -609,7 +609,17 @@ declare module "bun" {
* });
*/
write(
- data: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer | Request | Response | BunFile | S3File | Blob,
+ data:
+ | string
+ | ArrayBufferView
+ | ArrayBuffer
+ | SharedArrayBuffer
+ | Request
+ | Response
+ | BunFile
+ | S3File
+ | Blob
+ | Archive,
options?: S3Options,
): Promise<number>;
@@ -920,7 +930,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
- | File,
+ | File
+ | Archive,
options?: S3Options,
): Promise<number>;
@@ -970,7 +981,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
- | File,
+ | File
+ | Archive,
options?: S3Options,
): Promise<number>;

View File

@@ -8,10 +8,6 @@ export default [
configurable: false,
JSType: "0b11101110",
klass: {
- from: {
- fn: "from",
- length: 1,
- },
write: {
fn: "write",
length: 2,

View File

@@ -5,8 +5,19 @@ pub const toJS = js.toJS;
pub const fromJS = js.fromJS;
pub const fromJSDirect = js.fromJSDirect;
/// Compression options for the archive
pub const Compression = union(enum) {
none,
gzip: struct {
/// Compression level: 1 (fastest) to 12 (maximum compression). Default is 6.
level: u8 = 6,
},
};
/// The underlying data for the archive - uses Blob.Store for thread-safe ref counting
store: *jsc.WebCore.Blob.Store,
/// Compression settings for this archive
compress: Compression = .none,
pub fn finalize(this: *Archive) void {
jsc.markBinding(@src());
@@ -65,47 +76,95 @@ fn countFilesInArchive(data: []const u8) u32 {
return count;
}
- /// Constructor: new Archive() - throws an error since users should use Archive.from()
- pub fn constructor(globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!*Archive {
- return globalThis.throwInvalidArguments("Archive cannot be constructed directly. Use Archive.from() instead.", .{});
- }
- /// Static method: Archive.from(data)
+ /// Constructor: new Archive(data, options?)
 /// Creates an Archive from either:
 /// - An object { [path: string]: Blob | string | ArrayBufferView | ArrayBufferLike }
 /// - A Blob, ArrayBufferView, or ArrayBufferLike (assumes it's already a valid archive)
- pub fn from(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
- const arg = callframe.argumentsAsArray(1)[0];
- if (arg == .zero) {
- return globalThis.throwInvalidArguments("Archive.from requires an argument", .{});
+ /// Options:
+ /// - compress: "gzip" - Enable gzip compression
+ /// - level: number (1-12) - Compression level (default 6)
+ /// When no options are provided, no compression is applied
+ pub fn constructor(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!*Archive {
+ const data_arg, const options_arg = callframe.argumentsAsArray(2);
+ if (data_arg == .zero) {
+ return globalThis.throwInvalidArguments("new Archive() requires an argument", .{});
}
+ // Parse compression options
+ const compress = try parseCompressionOptions(globalThis, options_arg);
 // For Blob/Archive, ref the existing store (zero-copy)
- if (arg.as(jsc.WebCore.Blob)) |blob_ptr| {
+ if (data_arg.as(jsc.WebCore.Blob)) |blob_ptr| {
 if (blob_ptr.store) |store| {
 store.ref();
- return bun.new(Archive, .{ .store = store }).toJS(globalThis);
+ return bun.new(Archive, .{ .store = store, .compress = compress });
 }
 }
 // For ArrayBuffer/TypedArray, copy the data
- if (arg.asArrayBuffer(globalThis)) |array_buffer| {
+ if (data_arg.asArrayBuffer(globalThis)) |array_buffer| {
 const data = try bun.default_allocator.dupe(u8, array_buffer.slice());
- return createArchive(globalThis, data);
+ return createArchive(data, compress);
 }
 // For plain objects, build a tarball
- if (arg.isObject()) {
- const data = try buildTarballFromObject(globalThis, arg);
- return createArchive(globalThis, data);
+ if (data_arg.isObject()) {
+ const data = try buildTarballFromObject(globalThis, data_arg);
+ return createArchive(data, compress);
 }
}
return globalThis.throwInvalidArguments("Expected an object, Blob, TypedArray, or ArrayBuffer", .{});
}
- fn createArchive(globalThis: *jsc.JSGlobalObject, data: []u8) jsc.JSValue {
+ /// Parse compression options from JS value
+ /// Returns .none if no compression specified, caller must handle defaults
+ fn parseCompressionOptions(globalThis: *jsc.JSGlobalObject, options_arg: jsc.JSValue) bun.JSError!Compression {
+ // No options provided means no compression (caller handles defaults)
+ if (options_arg.isUndefinedOrNull()) {
+ return .none;
+ }
+ if (!options_arg.isObject()) {
+ return globalThis.throwInvalidArguments("Archive: options must be an object", .{});
+ }
+ // Check for compress option
+ if (try options_arg.getTruthy(globalThis, "compress")) |compress_val| {
+ // compress must be "gzip"
+ if (!compress_val.isString()) {
+ return globalThis.throwInvalidArguments("Archive: compress option must be a string", .{});
+ }
+ const compress_str = try compress_val.toSlice(globalThis, bun.default_allocator);
+ defer compress_str.deinit();
+ if (!bun.strings.eqlComptime(compress_str.slice(), "gzip")) {
+ return globalThis.throwInvalidArguments("Archive: compress option must be \"gzip\"", .{});
+ }
+ // Parse level option (1-12, default 6)
+ var level: u8 = 6;
+ if (try options_arg.getTruthy(globalThis, "level")) |level_val| {
+ if (!level_val.isNumber()) {
+ return globalThis.throwInvalidArguments("Archive: level must be a number", .{});
+ }
+ const level_num = level_val.toInt64();
+ if (level_num < 1 or level_num > 12) {
+ return globalThis.throwInvalidArguments("Archive: level must be between 1 and 12", .{});
+ }
+ level = @intCast(level_num);
+ }
+ return .{ .gzip = .{ .level = level } };
+ }
+ // No compress option specified in options object means no compression
+ return .none;
+ }
+ fn createArchive(data: []u8, compress: Compression) *Archive {
 const store = jsc.WebCore.Blob.Store.init(data, bun.default_allocator);
- return bun.new(Archive, .{ .store = store }).toJS(globalThis);
+ return bun.new(Archive, .{ .store = store, .compress = compress });
}
/// Shared helper that builds tarball bytes from a JS object
@@ -212,12 +271,15 @@ fn getEntryData(globalThis: *jsc.JSGlobalObject, value: jsc.JSValue, allocator:
return value.toSlice(globalThis, allocator);
}
- /// Static method: Archive.write(path, data, compress?)
- /// Creates and writes an archive to disk in one operation
+ /// Static method: Archive.write(path, data, options?)
+ /// Creates and writes an archive to disk in one operation.
+ /// For Archive instances, uses the archive's compression settings unless overridden by options.
+ /// Options:
+ /// - gzip: { level?: number } - Override compression settings
pub fn write(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
- const path_arg, const data_arg, const compress_arg = callframe.argumentsAsArray(3);
+ const path_arg, const data_arg, const options_arg = callframe.argumentsAsArray(3);
 if (data_arg == .zero) {
- return globalThis.throwInvalidArguments("Archive.write requires at least 2 arguments (path, data)", .{});
+ return globalThis.throwInvalidArguments("Archive.write requires 2 arguments (path, data)", .{});
}
// Get the path
@@ -228,61 +290,37 @@ pub fn write(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSE
const path_slice = try path_arg.toSlice(globalThis, bun.default_allocator);
defer path_slice.deinit();
- // Determine compression
- const use_gzip = try parseCompressArg(globalThis, compress_arg);
+ // Parse options for compression override
+ const options_compress = try parseCompressionOptions(globalThis, options_arg);
- // Try to use store reference (zero-copy) for Archive/Blob
+ // For Archive instances, use options override or archive's compression settings
 if (fromJS(data_arg)) |archive| {
- return startWriteTask(globalThis, .{ .store = archive.store }, path_slice.slice(), use_gzip);
+ const compress = if (options_compress != .none) options_compress else archive.compress;
+ return startWriteTask(globalThis, .{ .store = archive.store }, path_slice.slice(), compress);
 }
+ // For Blobs, use store reference with options compression
 if (data_arg.as(jsc.WebCore.Blob)) |blob_ptr| {
 if (blob_ptr.store) |store| {
- return startWriteTask(globalThis, .{ .store = store }, path_slice.slice(), use_gzip);
+ return startWriteTask(globalThis, .{ .store = store }, path_slice.slice(), options_compress);
}
}
- // Fall back to copying data for ArrayBuffer/TypedArray/objects
- const archive_data = try getArchiveData(globalThis, data_arg);
- return startWriteTask(globalThis, .{ .owned = archive_data }, path_slice.slice(), use_gzip);
- }
- /// Get archive data from a value, returning owned bytes
- fn getArchiveData(globalThis: *jsc.JSGlobalObject, arg: jsc.JSValue) bun.JSError![]u8 {
- // Check if it's a typed array, ArrayBuffer, or similar
- if (arg.asArrayBuffer(globalThis)) |array_buffer| {
- return bun.default_allocator.dupe(u8, array_buffer.slice());
+ // For ArrayBuffer/TypedArray, copy the data with options compression
+ if (data_arg.asArrayBuffer(globalThis)) |array_buffer| {
+ const data = try bun.default_allocator.dupe(u8, array_buffer.slice());
+ return startWriteTask(globalThis, .{ .owned = data }, path_slice.slice(), options_compress);
 }
- // Check if it's an object with entries (plain object) - build tarball
- if (arg.isObject()) {
- return buildTarballFromObject(globalThis, arg);
+ // For plain objects, build a tarball with options compression
+ if (data_arg.isObject()) {
+ const data = try buildTarballFromObject(globalThis, data_arg);
+ return startWriteTask(globalThis, .{ .owned = data }, path_slice.slice(), options_compress);
}
return globalThis.throwInvalidArguments("Expected an object, Blob, TypedArray, ArrayBuffer, or Archive", .{});
}
- fn parseCompressArg(globalThis: *jsc.JSGlobalObject, arg: jsc.JSValue) bun.JSError!bool {
- if (arg.isUndefinedOrNull()) {
- return false;
- }
- if (arg.isBoolean()) {
- return arg.toBoolean();
- }
- if (arg.isString()) {
- const str = try arg.toSlice(globalThis, bun.default_allocator);
- defer str.deinit();
- if (std.mem.eql(u8, str.slice(), "gzip")) {
- return true;
- }
- return globalThis.throwInvalidArguments("Archive: compress argument must be 'gzip', a boolean, or undefined", .{});
- }
- return globalThis.throwInvalidArguments("Archive: compress argument must be 'gzip', a boolean, or undefined", .{});
- }
/// Instance method: archive.extract(path, options?)
/// Extracts the archive to the given path
/// Options:
@@ -379,20 +417,16 @@ fn freePatterns(patterns: []const []const u8) void {
bun.default_allocator.free(patterns);
}
- /// Instance method: archive.blob(compress?)
- /// Returns Promise<Blob> with the archive data
- pub fn blob(this: *Archive, globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
- const compress_arg = callframe.argumentsAsArray(1)[0];
- const use_gzip = try parseCompressArg(globalThis, compress_arg);
- return startBlobTask(globalThis, this.store, use_gzip, .blob);
+ /// Instance method: archive.blob()
+ /// Returns Promise<Blob> with the archive data (compressed if gzip was set in options)
+ pub fn blob(this: *Archive, globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!jsc.JSValue {
+ return startBlobTask(globalThis, this.store, this.compress, .blob);
 }
- /// Instance method: archive.bytes(compress?)
- /// Returns Promise<Uint8Array> with the archive data
- pub fn bytes(this: *Archive, globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
- const compress_arg = callframe.argumentsAsArray(1)[0];
- const use_gzip = try parseCompressArg(globalThis, compress_arg);
- return startBlobTask(globalThis, this.store, use_gzip, .bytes);
+ /// Instance method: archive.bytes()
+ /// Returns Promise<Uint8Array> with the archive data (compressed if gzip was set in options)
+ pub fn bytes(this: *Archive, globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!jsc.JSValue {
+ return startBlobTask(globalThis, this.store, this.compress, .bytes);
}
/// Instance method: archive.files(glob?)
@@ -578,15 +612,17 @@ const BlobContext = struct {
};
store: *jsc.WebCore.Blob.Store,
- use_gzip: bool,
+ compress: Compression,
output_type: OutputType,
result: Result = .{ .uncompressed = {} },
fn run(this: *BlobContext) Result {
- if (this.use_gzip) {
- return .{ .compressed = compressGzip(this.store.sharedView()) catch |e| return .{ .err = e } };
+ switch (this.compress) {
+ .gzip => |opts| {
+ return .{ .compressed = compressGzip(this.store.sharedView(), opts.level) catch |e| return .{ .err = e } };
+ },
+ .none => return .{ .uncompressed = {} },
 }
- return .{ .uncompressed = {} };
}
fn runFromJS(this: *BlobContext, globalThis: *jsc.JSGlobalObject) bun.JSError!PromiseResult {
@@ -617,13 +653,13 @@ const BlobContext = struct {
pub const BlobTask = AsyncTask(BlobContext);
- fn startBlobTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Store, use_gzip: bool, output_type: BlobContext.OutputType) bun.JSError!jsc.JSValue {
+ fn startBlobTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Store, compress: Compression, output_type: BlobContext.OutputType) bun.JSError!jsc.JSValue {
store.ref();
errdefer store.deref();
const task = try BlobTask.create(globalThis, .{
.store = store,
- .use_gzip = use_gzip,
+ .compress = compress,
.output_type = output_type,
});
@@ -646,7 +682,7 @@ const WriteContext = struct {
data: Data,
path: [:0]const u8,
- use_gzip: bool,
+ compress: Compression,
result: Result = .{ .success = {} },
fn run(this: *WriteContext) Result {
@@ -654,11 +690,11 @@ const WriteContext = struct {
.owned => |d| d,
.store => |s| s.sharedView(),
};
- const data_to_write = if (this.use_gzip)
- compressGzip(source_data) catch |e| return .{ .err = e }
- else
- source_data;
- defer if (this.use_gzip) bun.default_allocator.free(data_to_write);
+ const data_to_write = switch (this.compress) {
+ .gzip => |opts| compressGzip(source_data, opts.level) catch |e| return .{ .err = e },
+ .none => source_data,
+ };
+ defer if (this.compress != .none) bun.default_allocator.free(data_to_write);
const file = switch (bun.sys.File.openat(.cwd(), this.path, bun.O.CREAT | bun.O.WRONLY | bun.O.TRUNC, 0o644)) {
.err => |err| return .{ .sys_err = err.clone(bun.default_allocator) },
@@ -699,7 +735,7 @@ fn startWriteTask(
globalThis: *jsc.JSGlobalObject,
data: WriteContext.Data,
path: []const u8,
- use_gzip: bool,
+ compress: Compression,
) bun.JSError!jsc.JSValue {
const path_z = try bun.default_allocator.dupeZ(u8, path);
errdefer bun.default_allocator.free(path_z);
@@ -714,7 +750,7 @@ fn startWriteTask(
const task = try WriteTask.create(globalThis, .{
.data = data,
.path = path_z,
- .use_gzip = use_gzip,
+ .compress = compress,
});
const promise_js = task.promise.value();
@@ -869,10 +905,10 @@ fn startFilesTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Stor
// Helpers
// ============================================================================
- fn compressGzip(data: []const u8) ![]u8 {
+ fn compressGzip(data: []const u8, level: u8) ![]u8 {
 libdeflate.load();
- const compressor = libdeflate.Compressor.alloc(6) orelse return error.GzipInitFailed;
+ const compressor = libdeflate.Compressor.alloc(@intCast(level)) orelse return error.GzipInitFailed;
defer compressor.deinit();
const max_size = compressor.maxBytesNeeded(data, .gzip);

View File
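The validation rules that `parseCompressionOptions` implements above can be restated as a small TypeScript sketch. The names and the `Compression` shape here are illustrative, not Bun internals, and the real code checks truthiness rather than strict `undefined`:

```typescript
// Hypothetical TypeScript port of the option validation: no options means no
// compression; compress must be the string "gzip"; level must be 1..12 (default 6).
type Compression = { kind: "none" } | { kind: "gzip"; level: number };

function parseCompressionOptions(options?: unknown): Compression {
  if (options == null) return { kind: "none" };
  if (typeof options !== "object") throw new TypeError("Archive: options must be an object");
  const { compress, level } = options as { compress?: unknown; level?: unknown };
  if (compress === undefined) return { kind: "none" };
  if (compress !== "gzip") throw new TypeError('Archive: compress option must be "gzip"');
  if (level === undefined) return { kind: "gzip", level: 6 };
  if (typeof level !== "number") throw new TypeError("Archive: level must be a number");
  if (level < 1 || level > 12) throw new TypeError("Archive: level must be between 1 and 12");
  return { kind: "gzip", level };
}
```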

@@ -118,6 +118,14 @@ pub fn set_repeat(_: *Self, thisValue: JSValue, globalThis: *JSGlobalObject, val
Self.js.repeatSetCached(thisValue, globalThis, value);
}
pub fn get_idleStart(_: *Self, thisValue: JSValue, _: *JSGlobalObject) JSValue {
return Self.js.idleStartGetCached(thisValue).?;
}
pub fn set_idleStart(_: *Self, thisValue: JSValue, globalThis: *JSGlobalObject, value: JSValue) void {
Self.js.idleStartSetCached(thisValue, globalThis, value);
}
pub fn dispose(self: *Self, globalThis: *JSGlobalObject, _: *jsc.CallFrame) bun.JSError!JSValue {
self.internals.cancel(globalThis.bunVM());
return .js_undefined;

View File

@@ -242,7 +242,7 @@ fn convertToInterval(this: *TimerObjectInternals, global: *JSGlobalObject, timer
this.strong_this.set(global, timer);
this.flags.kind = .setInterval;
this.interval = new_interval;
- this.reschedule(timer, vm);
+ this.reschedule(timer, vm, global);
}
pub fn run(this: *TimerObjectInternals, globalThis: *jsc.JSGlobalObject, timer: JSValue, callback: JSValue, arguments: JSValue, async_id: u64, vm: *jsc.VirtualMachine) bool {
@@ -293,8 +293,8 @@ pub fn init(
TimeoutObject.js.idleTimeoutSetCached(timer, global, .jsNumber(interval));
TimeoutObject.js.repeatSetCached(timer, global, if (kind == .setInterval) .jsNumber(interval) else .null);
- // this increments the refcount
- this.reschedule(timer, vm);
+ // this increments the refcount and sets _idleStart
+ this.reschedule(timer, vm, global);
}
this.strong_this.set(global, timer);
@@ -328,7 +328,7 @@ pub fn doRefresh(this: *TimerObjectInternals, globalObject: *jsc.JSGlobalObject,
}
this.strong_this.set(globalObject, this_value);
- this.reschedule(this_value, VirtualMachine.get());
+ this.reschedule(this_value, VirtualMachine.get(), globalObject);
return this_value;
}
@@ -371,7 +371,7 @@ fn shouldRescheduleTimer(this: *TimerObjectInternals, repeat: JSValue, idle_time
return true;
}
- pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachine) void {
+ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachine, globalThis: *JSGlobalObject) void {
if (this.flags.kind == .setImmediate) return;
const idle_timeout = TimeoutObject.js.idleTimeoutGetCached(timer).?;
@@ -380,7 +380,8 @@ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachi
// https://github.com/nodejs/node/blob/a7cbb904745591c9a9d047a364c2c188e5470047/lib/internal/timers.js#L612
if (!this.shouldRescheduleTimer(repeat, idle_timeout)) return;
- const now = timespec.msFromNow(.allow_mocked_time, this.interval);
+ const now = timespec.now(.allow_mocked_time);
+ const scheduled_time = now.addMs(this.interval);
const was_active = this.eventLoopTimer().state == .ACTIVE;
if (was_active) {
vm.timer.remove(this.eventLoopTimer());
@@ -388,9 +389,13 @@ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachi
this.ref();
}
- vm.timer.update(this.eventLoopTimer(), &now);
+ vm.timer.update(this.eventLoopTimer(), &scheduled_time);
this.flags.has_cleared_timer = false;
// Set _idleStart to the current monotonic timestamp in milliseconds
// This mimics Node.js's behavior where _idleStart is the libuv timestamp when the timer was scheduled
TimeoutObject.js.idleStartSetCached(timer, globalThis, .jsNumber(now.msUnsigned()));
if (this.flags.has_js_ref) {
this.setEnableKeepingEventLoopAlive(vm, true);
}

View File
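The reschedule change above separates two values that were previously conflated: it takes one monotonic "now", schedules the timer at `now + interval`, and records `now` as `_idleStart`, matching Node's semantics. A minimal model, with illustrative names rather than Bun internals:

```typescript
// Minimal model of the reschedule fix: one monotonic timestamp is captured,
// the fire time is now + interval, and _idleStart records the schedule time.
class TimerModel {
  idleStart = 0;
  scheduledTime = 0;
  constructor(public interval: number) {}
  reschedule(now: number) {
    this.scheduledTime = now + this.interval; // when the timer should fire
    this.idleStart = now; // previously "now" already had the interval added in
  }
}

const t = new TimerModel(250);
t.reschedule(1_000);
console.log(t.idleStart, t.scheduledTime); // 1000 1250
```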

@@ -1896,6 +1896,9 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
}
this.ref();
byte_stream.pipe = jsc.WebCore.Pipe.Wrap(@This(), onPipe).init(this);
// Deinit the old Strong reference before creating a new one
// to avoid leaking the Strong.Impl memory
this.response_body_readable_stream_ref.deinit();
this.response_body_readable_stream_ref = jsc.WebCore.ReadableStream.Strong.init(stream, globalThis);
this.byte_stream = byte_stream;

View File

@@ -73,7 +73,7 @@ extern "C" bool is_executable_file(const char* path)
{
#if defined(O_EXEC)
// O_EXEC is macOS specific
- int fd = open(path, O_EXEC | O_CLOEXEC, 0);
+ int fd = open(path, O_EXEC | O_CLOEXEC | O_NONBLOCK | O_NOCTTY, 0);
if (fd < 0)
return false;
close(fd);

View File

@@ -184,13 +184,18 @@ export default [
setter: "set_repeat",
this: true,
},
_idleStart: {
getter: "get_idleStart",
setter: "set_idleStart",
this: true,
},
["@@dispose"]: {
fn: "dispose",
length: 0,
invalidThisBehavior: InvalidThisBehavior.NoOp,
},
},
- values: ["arguments", "callback", "idleTimeout", "repeat"],
+ values: ["arguments", "callback", "idleTimeout", "repeat", "idleStart"],
}),
define({
name: "Immediate",

View File

@@ -5499,7 +5499,9 @@ pub const NodeFS = struct {
// O_PATH is faster
bun.O.PATH
else
- bun.O.RDONLY;
+ // O_NONBLOCK prevents blocking on a FIFO.
+ // O_NOCTTY prevents acquiring a controlling terminal.
+ bun.O.RDONLY | bun.O.NONBLOCK | bun.O.NOCTTY;
const fd = switch (bun.sys.open(path, flags, 0)) {
.err => |err| return .{ .err = err.withPath(path) },

View File

@@ -1484,6 +1484,12 @@ pub fn writeFileInternal(globalThis: *jsc.JSGlobalObject, path_or_blob_: *PathOr
}
}
// Check for Archive - allows Bun.write() and S3 writes to accept Archive instances
if (data.as(Archive)) |archive| {
archive.store.ref();
break :brk Blob.initWithStore(archive.store, globalThis);
}
break :brk try Blob.get(
globalThis,
data,
@@ -4828,6 +4834,7 @@ const NewReadFileHandler = read_file.NewReadFileHandler;
const string = []const u8;
const Archive = @import("../api/Archive.zig");
const Environment = @import("../../env.zig");
const S3File = @import("./S3File.zig");
const std = @import("std");

View File

@@ -1189,8 +1189,7 @@ fn runWithSourceCode(
opts.features.bundler_feature_flags = transpiler.options.bundler_feature_flags;
opts.features.hot_module_reloading = output_format == .internal_bake_dev and !source.index.isRuntime();
opts.features.auto_polyfill_require = output_format == .esm and !opts.features.hot_module_reloading;
- opts.features.react_fast_refresh = target == .browser and
- transpiler.options.react_fast_refresh and
+ opts.features.react_fast_refresh = transpiler.options.react_fast_refresh and
loader.isJSX() and
!source.path.isNodeModule();

View File

@@ -1364,7 +1364,7 @@ pub const BundleV2 = struct {
this.graph.input_files.append(this.allocator(), .{
.source = source.*,
.loader = loader,
- .side_effects = loader.sideEffects(),
+ .side_effects = resolve_result.primary_side_effects_data,
}) catch |err| bun.handleOom(err);
var task = bun.handleOom(this.allocator().create(ParseTask));
task.* = ParseTask.init(resolve_result, source_index, this);
@@ -2467,6 +2467,30 @@ pub const BundleV2 = struct {
}
}
/// Look up the sideEffects field from package.json for a given file path.
/// This is used when an onResolve plugin returns a path in the "file" namespace
/// to ensure tree-shaking still works properly.
fn lookupSideEffectsForPath(this: *BundleV2, absolute_path: []const u8, target: options.Target) _resolver.SideEffects {
const dir_path = bun.path.dirname(absolute_path, .auto);
const resolver = &this.transpilerForTarget(target).resolver;
var dir_info: ?*const _resolver.DirInfo = resolver.readDirInfoIgnoreError(dir_path);
// Walk up directory tree to find package.json
while (dir_info) |info| {
if (info.package_json) |package_json| {
return switch (package_json.side_effects) {
.unspecified => .has_side_effects,
.false => .no_side_effects__package_json,
.map => |map| if (map.contains(bun.StringHashMapUnowned.Key.init(absolute_path))) .has_side_effects else .no_side_effects__package_json,
.glob, .mixed => if (package_json.side_effects.hasSideEffects(absolute_path)) .has_side_effects else .no_side_effects__package_json,
};
}
dir_info = info.getParent();
}
return .has_side_effects;
}
pub fn onResolveFromJsLoop(resolve: *bun.jsc.API.JSBundler.Resolve) void {
onResolve(resolve, resolve.bv2);
}
@@ -2556,6 +2580,12 @@ pub const BundleV2 = struct {
this.graph.ast.append(this.allocator(), JSAst.empty) catch unreachable;
const loader = path.loader(&this.transpiler.options.loaders) orelse options.Loader.file;
// Look up sideEffects from package.json for file namespace paths
const side_effects: _resolver.SideEffects = if (strings.eqlComptime(path.namespace, "file"))
this.lookupSideEffectsForPath(path.text, resolve.import_record.original_target)
else
.has_side_effects;
this.graph.input_files.append(this.allocator(), .{
.source = .{
.path = path,
@@ -2563,7 +2593,7 @@ pub const BundleV2 = struct {
.index = source_index,
},
.loader = loader,
- .side_effects = .has_side_effects,
+ .side_effects = side_effects,
}) catch unreachable;
var task = bun.default_allocator.create(ParseTask) catch unreachable;
task.* = ParseTask{
@@ -2576,7 +2606,7 @@ pub const BundleV2 = struct {
.file = bun.invalid_fd,
},
},
- .side_effects = .has_side_effects,
+ .side_effects = side_effects,
.jsx = this.transpilerForTarget(resolve.import_record.original_target).options.jsx,
.source_index = source_index,
.module_type = .unknown,

View File
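The switch arms in `lookupSideEffectsForPath` above interpret the package.json `sideEffects` field: `false` means everything is tree-shakable, a map of files means only the listed files have side effects, and anything unspecified is conservatively treated as having side effects. A sketch of that decision, with exact-path matching only (the real code also handles glob and mixed patterns):

```typescript
// Sketch of the sideEffects interpretation used for tree-shaking decisions.
// `sideEffects` is the raw package.json value; `file` is an absolute path.
function hasSideEffects(sideEffects: unknown, file: string): boolean {
  if (sideEffects === false) return false; // whole package is tree-shakable
  if (Array.isArray(sideEffects)) return sideEffects.includes(file); // listed files only
  return true; // unspecified or true: conservatively keep the module
}
```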

@@ -791,7 +791,9 @@ pub const InitCommand = struct {
switch (template) {
.blank, .typescript_library => {
- Template.createAgentRule();
+ if (!minimal) {
+ Template.createAgentRule();
+ }
if (package_json_file != null and !did_load_package_json) {
Output.prettyln(" + <r><d>package.json<r>", .{});

View File

@@ -216,6 +216,7 @@ pub fn StyleRule(comptime R: type) type {
var handler_context = context.handler_context.child(.style_rule);
std.mem.swap(css.PropertyHandlerContext, &context.handler_context, &handler_context);
try this.rules.minify(context, unused);
std.mem.swap(css.PropertyHandlerContext, &context.handler_context, &handler_context);
if (unused and this.rules.v.items.len == 0) {
return true;
}

View File
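The one-line css fix above adds a second `std.mem.swap` so the child handler context swapped in before minifying nested rules is swapped back out afterwards, instead of leaving the parent context pointing at the child's state. The save/restore pattern, modelled generically:

```typescript
// Generic model of the swap-in/swap-out fix: install a child context for the
// duration of `body`, then restore the parent's context even if `body` throws.
function withChildContext<T>(ctx: { current: T }, child: T, body: () => void): void {
  const saved = ctx.current;
  ctx.current = child; // first swap: nested rules see the child context
  try {
    body();
  } finally {
    ctx.current = saved; // second swap: parent context is restored
  }
}
```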

@@ -1708,8 +1708,14 @@ pub fn parseIntoBinaryLockfile(
};
if (registry_str.len == 0) {
// Use scope-specific registry if available, otherwise fall back to default
const registry_url = if (manager) |mgr|
mgr.scopeForPackageName(name_str).url.href
else
Npm.Registry.default_url;
const url = try ExtractTarball.buildURL(
- Npm.Registry.default_url,
+ registry_url,
strings.StringOrTinyString.init(name.slice(string_buf.bytes.items)),
res.value.npm.version,
string_buf.bytes.items,

View File
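The lockfile fix above resolves the registry from the package's scope when the lockfile's registry string is empty, falling back to the npm default otherwise. A sketch of that lookup (`registryFor` and the scopes map shape are illustrative, not Bun's actual API):

```typescript
// Sketch of scope-based registry resolution: "@example/test-package" should
// resolve through the "example" scope configured in bunfig.toml, not the default.
const DEFAULT_REGISTRY = "https://registry.npmjs.org/";

function registryFor(name: string, scopes: Record<string, string>): string {
  if (name.startsWith("@")) {
    const scope = name.slice(1, name.indexOf("/")); // "@example/pkg" -> "example"
    if (scope in scopes) return scopes[scope];
  }
  return DEFAULT_REGISTRY;
}

console.log(registryFor("@example/test-package", { example: "https://npm.pkg.github.com" }));
// -> https://npm.pkg.github.com
```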

@@ -1,8 +1,14 @@
const { isIP, isIPv6 } = require("internal/net/isIP");
- const { checkIsHttpToken, validateFunction, validateInteger, validateBoolean } = require("internal/validators");
+ const {
+ checkIsHttpToken,
+ validateFunction,
+ validateInteger,
+ validateBoolean,
+ validateString,
+ } = require("internal/validators");
const { urlToHttpOptions } = require("internal/url");
- const { isValidTLSArray } = require("internal/tls");
+ const { throwOnInvalidTLSArray } = require("internal/tls");
const { validateHeaderName } = require("node:_http_common");
const { getTimerDuration } = require("internal/timers");
const { ConnResetException } = require("internal/shared");
@@ -728,53 +734,48 @@ function ClientRequest(input, options, cb) {
throw new Error("pfx is not supported");
}
- if (options.rejectUnauthorized !== undefined) this._ensureTls().rejectUnauthorized = options.rejectUnauthorized;
- else {
- let agentRejectUnauthorized = agent?.options?.rejectUnauthorized;
- if (agentRejectUnauthorized !== undefined) this._ensureTls().rejectUnauthorized = agentRejectUnauthorized;
- else {
- // popular https-proxy-agent uses connectOpts
- agentRejectUnauthorized = agent?.connectOpts?.rejectUnauthorized;
- if (agentRejectUnauthorized !== undefined) this._ensureTls().rejectUnauthorized = agentRejectUnauthorized;
- }
- }
- if (options.ca) {
- if (!isValidTLSArray(options.ca))
- throw new TypeError(
- "ca argument must be an string, Buffer, TypedArray, BunFile or an array containing string, Buffer, TypedArray or BunFile",
- );
- this._ensureTls().ca = options.ca;
- }
- if (options.cert) {
- if (!isValidTLSArray(options.cert))
- throw new TypeError(
- "cert argument must be an string, Buffer, TypedArray, BunFile or an array containing string, Buffer, TypedArray or BunFile",
- );
- this._ensureTls().cert = options.cert;
- }
- if (options.key) {
- if (!isValidTLSArray(options.key))
- throw new TypeError(
- "key argument must be an string, Buffer, TypedArray, BunFile or an array containing string, Buffer, TypedArray or BunFile",
- );
- this._ensureTls().key = options.key;
- }
- if (options.passphrase) {
- if (typeof options.passphrase !== "string") throw new TypeError("passphrase argument must be a string");
- this._ensureTls().passphrase = options.passphrase;
- }
- if (options.ciphers) {
- if (typeof options.ciphers !== "string") throw new TypeError("ciphers argument must be a string");
- this._ensureTls().ciphers = options.ciphers;
- }
- if (options.servername) {
- if (typeof options.servername !== "string") throw new TypeError("servername argument must be a string");
- this._ensureTls().servername = options.servername;
- }
+ // Merge TLS options using spread operator, matching Node.js behavior in createSocket:
+ // options = { __proto__: null, ...options, ...this.options };
+ // https://github.com/nodejs/node/blob/v23.6.0/lib/_http_agent.js#L242
+ // With spread, the last one wins, so agent.options overwrites request options.
+ //
+ // agent.options: Stored by Node.js Agent constructor
+ // https://github.com/nodejs/node/blob/v23.6.0/lib/_http_agent.js#L96
+ //
+ // agent.connectOpts: Used by https-proxy-agent for TLS connection options (lowest priority)
+ // https://github.com/TooTallNate/proxy-agents/blob/main/packages/https-proxy-agent/src/index.ts#L110-L117
+ const mergedTlsOptions = { __proto__: null, ...agent?.connectOpts, ...options, ...agent?.options };
- if (options.secureOptions) {
- if (typeof options.secureOptions !== "number") throw new TypeError("secureOptions argument must be a string");
- this._ensureTls().secureOptions = options.secureOptions;
+ if (mergedTlsOptions.rejectUnauthorized !== undefined) {
+ this._ensureTls().rejectUnauthorized = mergedTlsOptions.rejectUnauthorized;
 }
+ if (mergedTlsOptions.ca) {
+ throwOnInvalidTLSArray("options.ca", mergedTlsOptions.ca);
+ this._ensureTls().ca = mergedTlsOptions.ca;
+ }
+ if (mergedTlsOptions.cert) {
+ throwOnInvalidTLSArray("options.cert", mergedTlsOptions.cert);
+ this._ensureTls().cert = mergedTlsOptions.cert;
+ }
+ if (mergedTlsOptions.key) {
+ throwOnInvalidTLSArray("options.key", mergedTlsOptions.key);
+ this._ensureTls().key = mergedTlsOptions.key;
+ }
+ if (mergedTlsOptions.passphrase) {
+ validateString(mergedTlsOptions.passphrase, "options.passphrase");
+ this._ensureTls().passphrase = mergedTlsOptions.passphrase;
+ }
+ if (mergedTlsOptions.ciphers) {
+ validateString(mergedTlsOptions.ciphers, "options.ciphers");
+ this._ensureTls().ciphers = mergedTlsOptions.ciphers;
+ }
+ if (mergedTlsOptions.servername) {
+ validateString(mergedTlsOptions.servername, "options.servername");
+ this._ensureTls().servername = mergedTlsOptions.servername;
+ }
+ if (mergedTlsOptions.secureOptions) {
+ validateInteger(mergedTlsOptions.secureOptions, "options.secureOptions");
+ this._ensureTls().secureOptions = mergedTlsOptions.secureOptions;
+ }
this[kPath] = options.path || "/";
if (cb) {

View File
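The merge order in the https change above relies on object-spread precedence: the last source wins, so `agent.options` overrides request options, which in turn override `agent.connectOpts`, matching Node's `{ __proto__: null, ...options, ...this.options }`. A self-contained demonstration with made-up option values:

```typescript
// Spread precedence demo: later sources win, so agent options override
// request options, which override the proxy agent's connectOpts.
const connectOpts = { rejectUnauthorized: false, servername: "proxy" };
const requestOptions = { servername: "example.com", ciphers: "DEFAULT" };
const agentOptions = { ciphers: "HIGH" };

const merged = { ...connectOpts, ...requestOptions, ...agentOptions };
console.log(merged);
// { rejectUnauthorized: false, servername: "example.com", ciphers: "HIGH" }
```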

@@ -1,4 +1,4 @@
- pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.FieldType, column_length: u32, raw: bool, bigint: bool, unsigned: bool, comptime Context: type, reader: NewReader(Context)) !SQLDataCell {
+ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.FieldType, column_length: u32, raw: bool, bigint: bool, unsigned: bool, binary: bool, comptime Context: type, reader: NewReader(Context)) !SQLDataCell {
debug("decodeBinaryValue: {s}", .{@tagName(field_type)});
return switch (field_type) {
.MYSQL_TYPE_TINY => {
@@ -131,6 +131,7 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
else => error.InvalidBinaryValue,
},
// When the column contains a binary string we return a Buffer, otherwise a string
.MYSQL_TYPE_ENUM,
.MYSQL_TYPE_SET,
.MYSQL_TYPE_GEOMETRY,
@@ -138,7 +139,6 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
.MYSQL_TYPE_STRING,
.MYSQL_TYPE_VARCHAR,
.MYSQL_TYPE_VAR_STRING,
// We could return Buffer here BUT TEXT, LONGTEXT, MEDIUMTEXT, TINYTEXT, etc. are BLOB and the user expects a string
.MYSQL_TYPE_TINY_BLOB,
.MYSQL_TYPE_MEDIUM_BLOB,
.MYSQL_TYPE_LONG_BLOB,
@@ -151,7 +151,9 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
}
var string_data = try reader.encodeLenString();
defer string_data.deinit();
if (binary) {
return SQLDataCell.raw(&string_data);
}
const slice = string_data.slice();
return SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
},
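
The hunks above thread the column's BINARY flag into the MySQL binary-protocol decoder so that binary columns come back as Buffers while text-like columns stay strings. A minimal standalone sketch of that decision (the function name is hypothetical, not Bun's actual API):

```typescript
// Sketch of the Buffer-vs-string decision the diff implements:
// columns flagged BINARY (binary/varbinary/blob) keep their raw bytes,
// while ENUM/SET and TEXT-like columns (which are BLOBs on the wire)
// are decoded to strings, since that is what users expect.
function decodeStringColumn(bytes: Uint8Array, isBinaryColumn: boolean): Buffer | string {
  if (isBinaryColumn) {
    // Preserve the raw bytes for binary columns.
    return Buffer.from(bytes);
  }
  // TEXT, TINYTEXT, etc. are wire-level BLOBs, but the user expects a string.
  return new TextDecoder().decode(bytes);
}
```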


@@ -140,8 +140,12 @@ pub const Row = struct {
}
},
else => {
const slice = value.slice();
cell.* = SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
if (column.flags.BINARY) {
cell.* = SQLDataCell.raw(value);
} else {
const slice = value.slice();
cell.* = SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
}
},
};
}
@@ -226,7 +230,7 @@ pub const Row = struct {
}
const column = this.columns[i];
value.* = try decodeBinaryValue(this.globalObject, column.column_type, column.column_length, this.raw, this.bigint, column.flags.UNSIGNED, Context, reader);
value.* = try decodeBinaryValue(this.globalObject, column.column_type, column.column_length, this.raw, this.bigint, column.flags.UNSIGNED, column.flags.BINARY, Context, reader);
value.index = switch (column.name_or_index) {
// The indexed columns can be out of order.
.index => |idx| idx,


@@ -295,4 +295,34 @@ import path from "path";
expect(fs.existsSync(path.join(temp, "src/components"))).toBe(true);
expect(fs.existsSync(path.join(temp, "src/components/ui"))).toBe(true);
}, 30_000);
test("bun init --minimal only creates package.json and tsconfig.json", async () => {
// Regression test for https://github.com/oven-sh/bun/issues/26050
// --minimal should not create .cursor/, CLAUDE.md, .gitignore, or README.md
const temp = tempDirWithFiles("bun-init-minimal", {});
const { exited } = Bun.spawn({
cmd: [bunExe(), "init", "--minimal", "-y"],
cwd: temp,
stdio: ["ignore", "inherit", "inherit"],
env: {
...bunEnv,
// Simulate Cursor being installed via CURSOR_TRACE_ID env var
CURSOR_TRACE_ID: "test-trace-id",
},
});
expect(await exited).toBe(0);
// Should create package.json and tsconfig.json
expect(fs.existsSync(path.join(temp, "package.json"))).toBe(true);
expect(fs.existsSync(path.join(temp, "tsconfig.json"))).toBe(true);
// Should NOT create these extra files with --minimal
expect(fs.existsSync(path.join(temp, "index.ts"))).toBe(false);
expect(fs.existsSync(path.join(temp, ".gitignore"))).toBe(false);
expect(fs.existsSync(path.join(temp, "README.md"))).toBe(false);
expect(fs.existsSync(path.join(temp, "CLAUDE.md"))).toBe(false);
expect(fs.existsSync(path.join(temp, ".cursor"))).toBe(false);
});
});

File diff suppressed because it is too large


@@ -1509,3 +1509,128 @@ describe.concurrent("s3 missing credentials", () => {
});
});
});
// Archive + S3 integration tests
describe.skipIf(!minioCredentials)("Archive with S3", () => {
const credentials = minioCredentials!;
it("writes archive to S3 via S3Client.write()", async () => {
const client = new Bun.S3Client(credentials);
const archive = new Bun.Archive({
"hello.txt": "Hello from Archive!",
"data.json": JSON.stringify({ test: true }),
});
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify by downloading and reading back
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(2);
expect(await files.get("hello.txt")!.text()).toBe("Hello from Archive!");
expect(await files.get("data.json")!.text()).toBe(JSON.stringify({ test: true }));
// Cleanup
await client.unlink(key);
});
it("writes archive to S3 via Bun.write() with s3:// URL", async () => {
const archive = new Bun.Archive({
"file1.txt": "content1",
"dir/file2.txt": "content2",
});
const key = randomUUIDv7() + ".tar";
const s3Url = `s3://${credentials.bucket}/${key}`;
await Bun.write(s3Url, archive, {
...credentials,
});
// Verify by downloading
const s3File = Bun.file(s3Url, credentials);
const downloaded = await s3File.bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(2);
expect(await files.get("file1.txt")!.text()).toBe("content1");
expect(await files.get("dir/file2.txt")!.text()).toBe("content2");
// Cleanup
await s3File.delete();
});
it("writes archive with binary content to S3", async () => {
const client = new Bun.S3Client(credentials);
const binaryData = new Uint8Array([0x00, 0x01, 0x02, 0xff, 0xfe, 0xfd, 0x80, 0x7f]);
const archive = new Bun.Archive({
"binary.bin": binaryData,
});
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify binary data is preserved
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
const extractedBinary = await files.get("binary.bin")!.bytes();
expect(extractedBinary).toEqual(binaryData);
// Cleanup
await client.unlink(key);
});
it("writes large archive to S3", async () => {
const client = new Bun.S3Client(credentials);
// Create archive with multiple files
const entries: Record<string, string> = {};
for (let i = 0; i < 50; i++) {
entries[`file${i.toString().padStart(3, "0")}.txt`] = `Content for file ${i}`;
}
const archive = new Bun.Archive(entries);
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(50);
expect(await files.get("file000.txt")!.text()).toBe("Content for file 0");
expect(await files.get("file049.txt")!.text()).toBe("Content for file 49");
// Cleanup
await client.unlink(key);
});
it("writes archive via s3File.write()", async () => {
const client = new Bun.S3Client(credentials);
const archive = new Bun.Archive({
"test.txt": "Hello via s3File.write()!",
});
const key = randomUUIDv7() + ".tar";
const s3File = client.file(key);
await s3File.write(archive);
// Verify
const downloaded = await s3File.bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(1);
expect(await files.get("test.txt")!.text()).toBe("Hello via s3File.write()!");
// Cleanup
await s3File.delete();
});
});


@@ -1,5 +1,16 @@
import { describe, expect, it, spyOn } from "bun:test";
import { bunEnv, bunExe, gc, getMaxFD, isBroken, isIntelMacOS, isWindows, tempDirWithFiles, tmpdirSync } from "harness";
import {
bunEnv,
bunExe,
gc,
getMaxFD,
isBroken,
isIntelMacOS,
isPosix,
isWindows,
tempDirWithFiles,
tmpdirSync,
} from "harness";
import { isAscii } from "node:buffer";
import fs, {
closeSync,
@@ -51,6 +62,7 @@ import { tmpdir } from "node:os";
import { join } from "node:path";
import { spawnSync } from "bun";
import { mkfifo } from "mkfifo";
import { ReadStream as ReadStream_, WriteStream as WriteStream_ } from "./export-from.js";
import { ReadStream as ReadStreamStar_, WriteStream as WriteStreamStar_ } from "./export-star-from.js";
@@ -1540,6 +1552,13 @@ it("symlink", () => {
expect(realpathSync(actual)).toBe(realpathSync(import.meta.path));
});
it.if(isPosix)("realpathSync doesn't block on FIFO", () => {
const path = join(tmpdirSync(), "test-fs-fifo-block.fifo");
mkfifo(path, 0o666);
realpathSync(path);
unlinkSync(path);
});
it("readlink", () => {
const actual = join(tmpdirSync(), "fs-readlink.txt");
try {


@@ -0,0 +1,30 @@
-----BEGIN ENCRYPTED PRIVATE KEY-----
MIIFNTBfBgkqhkiG9w0BBQ0wUjAxBgkqhkiG9w0BBQwwJAQQieLggVjbubz09mX5
GdRQAwICCAAwDAYIKoZIhvcNAgkFADAdBglghkgBZQMEASoEEJ++f2E23qU4mbP4
m3RnPasEggTQoS6zcBDvWURYyctw9Qma8L/ZnPg4SBclVzYbiZcvBPNRvCNLnYxQ
ysimU/8PTCP9m944dcsMolRqPjj0gOQCnBpqbZmnc7elwDFZIhePRfMKC2bPHZeo
ABonNOs2VstJ9gT3RA5x8Dj99dsoPdnV9rL6vkW0Gk86BPGgQq5i1ipJvYrpOtay
Bq5JgpptVX86azXZVriB8FUNfJuFOPQfxfXIY7ogHpQWZ7rIVa5ug7LlJ7sLjakj
ph/4corzRnRr88/eFfhYbV5rob/Lvoq8+I2Hgf25ypJ2XdOoWAgDOvl6+k01v/Ci
VAYAE1v9RgmiAXFIE9uYbSIyhiVibmLU6QK7Vcydv0ZaZLdP/9HwfZ6Q5u1a23rj
ltzRFOu5H7ipVXSoZU1ffw2EXi1RZJU2n5M3tU11qZsNpaDulEdcYZm74sUaqdjA
zkYSO+RBehptEUfgjXBrW8HJ42fCfd6IvQ7NtT3e3zJup105cHIEfO8IiSSt/oW3
SOupzjTpARHhAbPKSEmUVC1IXjGUvUuZs+NlN+byNkI4IhSTHp4vn5k87l22jccl
4NwW5ZIouqawvV5gyOGgBcwgSfvd4H8mcSeFfZhVmEtRDKtubREr8mqqcUWq5V/W
fEGR2LTQKRofhGGw56Jzw8FgNJNI0m6WBYIPQVtmwqqljPNPDuCQZ/icrhM6s0MR
7IyDiCUHzsz2JZxRJJO9pzItSABym/I57DTtRg1XQTEuSU+dTwhVzwkytWVldHx3
Rvbb6DUWrLtthoAs/LSDevjhrLYAdkLj4iaexqfYPcrRA22hj3KxxRpzV8zqMNvM
hI703HrjIPzlVhrqf6gMiKs7iZu2XQ4RRsQyKzWlro9bOprUvIg/abFtaJDXKqN0
sTJQ9rSpTJgUzG4sJEFiUeM0Wm2cLUO1w4N4/si89vOCcVJJUIjZgwsyFu8DpUIE
7E9rgAzuWByIBOJQ0f1hfF7zGUxAJ75qRdHm0q2aDkDPLiJk1alR1MpMs1tIcaBO
CAxnlZtORvq6QMQnERkpzuvX2PS5mtZ8w/qizPgb8GL3kU+Ex0lJHT8PBwspSXWV
Gc9AvCZ1z+YLnflUsRch/dI/suGhpIcLOX4M3pfW9qfo/i92uR52JWzIAkRKFTOi
fSiADLpar2WT2Kcz9aGfTB2swjhsL7Q6Tf8BWUCVYtfbf5FK07uPTCb9tyy+LxtU
qvtHe3XyZTO3guRBBDZotEOqNKzJw+ZUKIO7vX5JGtpMudBHL2J1KH80Qy4+uR/H
b9YyW0UFOyuOejmrMwHMP/iXkYyTsBiShETU0Uga33xvSuS10FhiCt87cXCI/WeZ
Jw1fk29QA3nx5vw9zDcVFiJRwOu9l6/JxXFpGm0ZjhYudS98yJkam3sbwJThJ+1C
fFzzCM69iUdPw/8JEPnD+Wd2okFiwjpEzHrZ+n1P5YGDF7UTyEB3gLpn3sgmBR9H
2z4yiL+ST/WI7n3ykXxzxjzcEgkDEwLfzHlguqh7jhYWuIhsDmcch7EgH8+gsyke
9lgUWJdoHXVfNZmWh4rMMkEUGi605WulXV8N9qQJJOJltN3lGdKZi+CBK6dTlPtJ
iAj5mvrk++pP/b0SplcQtq3pspGnWmjw+jw0aOVzSpn8qrco1/FZWdw=
-----END ENCRYPTED PRIVATE KEY-----


@@ -1,28 +1,28 @@
-----BEGIN PRIVATE KEY-----
MIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQCIzOJskt6VkEJY
XKSJv/Gdil3XYkjk3NVc/+m+kzqnkTRbPtT9w+IGWgmJhuf9DJPLCwHFAEFarVwV
x16Q0PbU4ajXaLRHEYGhrH10oTMjQnJ24xVm26mxRXPQa5vaLpWJqNyIdNLIQLe+
UXUOzSGGsFTRMAjvYrkzjBe4ZUnaZV+aFY/ug0jfzeA1dJjzKZs6+yTJRbsuWUEb
8MsDmT4v+kBZDKdaDn7AFDWRVqx/38BnqsRzkM0CxpnyT2kRzw5zQajIE13gdTJo
1EHvYSUkkxrY5m30Rl9BuBBZBjhMzOHq0fYVVooHO+sf4XHPgvFTTxJum85u7J1J
oEUjrLKtAgMBAAECggEACInVNhaiqu4infZGVMy0rXMV8VwSlapM7O2SLtFsr0nK
XUmaLK6dvGzBPKK9dxdiYCFzPlMKQTkhzsAvYFWSmm3tRmikG+11TFyCRhXLpc8/
ark4vD9Io6ZkmKUmyKLwtXNjNGcqQtJ7RXc7Ga3nAkueN6JKZHqieZusXVeBGQ70
YH1LKyVNBeJggbj+g9rqaksPyNJQ8EWiNTJkTRQPazZ0o1VX/fzDFyr/a5npFtHl
4BHfafv9o1Xyr70Kie8CYYRJNViOCN+ylFs7Gd3XRaAkSkgMT/7DzrHdEM2zrrHK
yNg2gyDVX9UeEJG2X5UtU0o9BVW7WBshz/2hqIUHoQKBgQC8zsRFvC7u/rGr5vRR
mhZZG+Wvg03/xBSuIgOrzm+Qie6mAzOdVmfSL/pNV9EFitXt1yd2ROo31AbS7Evy
Bm/QVKr2mBlmLgov3B7O/e6ABteooOL7769qV/v+yo8VdEg0biHmsfGIIXDe3Lwl
OT0XwF9r/SeZLbw1zfkSsUVG/QKBgQC5fANM3Dc9LEek+6PHv5+eC1cKkyioEjUl
/y1VUD00aABI1TUcdLF3BtFN2t/S6HW0hrP3KwbcUfqC25k+GDLh1nM6ZK/gI3Yn
IGtCHxtE3S6jKhE9QcK/H+PzGVKWge9SezeYRP0GHJYDrTVTA8Kt9HgoZPPeReJl
+Ss9c8ThcQKBgECX6HQHFnNzNSufXtSQB7dCoQizvjqTRZPxVRoxDOABIGExVTYt
umUhPtu5AGyJ+/hblEeU+iBRbGg6qRzK8PPwE3E7xey8MYYAI5YjL7YjISKysBUL
AhM6uJ6Jg/wOBSnSx8xZ8kzlS+0izUda1rjKeprCSArSp8IsjlrDxPStAoGAEcPr
+P+altRX5Fhpvmb/Hb8OTif8G+TqjEIdkG9H/W38oP0ywg/3M2RGxcMx7txu8aR5
NjI7zPxZFxF7YvQkY3cLwEsGgVxEI8k6HLIoBXd90Qjlb82NnoqqZY1GWL4HMwo0
L/Rjm6M/Rwje852Hluu0WoIYzXA6F/Q+jPs6nzECgYAxx4IbDiGXuenkwSF1SUyj
NwJXhx4HDh7U6EO/FiPZE5BHE3BoTrFu3o1lzverNk7G3m+j+m1IguEAalHlukYl
rip9iUISlKYqbYZdLBoLwHAfHhszdrjqn8/v6oqbB5yR3HXjPFUWJo0WJ2pqJp56
ZshgmQQ/5Khoj6x0/dMPSg==
MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQDlYzosgRgXHL6v
Mh1V0ERFhsvlZrtRojSw6tafr3SQBphU793/rGiYZlL/lJ9HIlLkx9JMbuTjNm5U
2eRwHiTQIeWD4aCIESwPlkdaVYtC+IOj55bJN8xNa7h5GyJwF7PnPetAsKyE8DMB
n1gKMhaIis7HHOUtk4/K3Y4peU44d04z0yPt6JtY5Sbvi1E7pGX6T/2c9sHsdIDe
DctWnewpXXs8zkAla0KNWQfpDnpS53wxAfStTA4lSrA9daxC7hZopQlLxFIbJk+0
BLbEsXtrJ54T5iguHk+2MDVAy4MOqP9XbKV7eGHk73l6+CSwmHyHBxh4ChxRQeT5
BP0MUTn1AgMBAAECggEABtPvC5uVGr0DjQX2GxONsK8cOxoVec7U+C4pUMwBcXcM
yjxwlHdujpi/IDXtjsm+A2rSPu2vGPdKDfMFanPvPxW/Ne99noc6U0VzHsR8lnP8
wSB328nyJhzOeyZcXk9KTtgIPF7156gZsJLsZTNL+ej90i3xQWvKxCxXmrLuad5O
z/TrgZkC6wC3fgj1d3e8bMljQ7tLxbshJMYVI5o6RFTxy84DLI+rlvPkf7XbiMPf
2lsm4jcJKvfx+164HZJ9QVlx8ncqOHAnGvxb2xHHfqv4JAbz615t7yRvtaw4Paj5
6kQSf0VWnsVzgxNJWvnUZym/i/Qf5nQafjChCyKOEQKBgQD9f4SkvJrp/mFKWLHd
kDvRpSIIltfJsa5KShn1IHsQXFwc0YgyP4SKQb3Ckv+/9UFHK9EzM+WlPxZi7ZOS
hsWhIfkI4c4ORpxUQ+hPi0K2k+HIY7eYyONqDAzw5PGkKBo3mSGMHDXYywSqexhB
CCMHuHdMhwyHdz4PWYOK3C2VMQKBgQDnpsrHK7lM9aVb8wNhTokbK5IlTSzH/5oJ
lAVu6G6H3tM5YQeoDXztbZClvrvKU8DU5UzwaC+8AEWQwaram29QIDpAI3nVQQ0k
dmHHp/pCeADdRG2whaGcl418UJMMv8AUpWTRm+kVLTLqfTHBC0ji4NlCQMHCUCfd
U8TeUi5QBQKBgQDvJNd7mboDOUmLG7VgMetc0Y4T0EnuKsMjrlhimau/OYJkZX84
+BcPXwmnf4nqC3Lzs3B9/12L0MJLvZjUSHQ0mJoZOPxtF0vvasjEEbp0B3qe0wOn
DQ0NRCUJNNKJbJOfE8VEKnDZ/lx+f/XXk9eINwvElDrLqUBQtr+TxjbyYQKBgAxQ
lZ8Y9/TbajsFJDzcC/XhzxckjyjisbGoqNFIkfevJNN8EQgiD24f0Py+swUChtHK
jtiI8WCxMwGLCiYs9THxRKd8O1HW73fswy32BBvcfU9F//7OW9UTSXY+YlLfLrrq
P/3UqAN0L6y/kxGMJAfLpEEdaC+IS1Y8yc531/ZxAoGASYiasDpePtmzXklDxk3h
jEw64QAdXK2p/xTMjSeTtcqJ7fvaEbg+Mfpxq0mdTjfbTdR9U/nzAkwS7OoZZ4Du
ueMVls0IVqcNnBtikG8wgdxN27b5JPXS+GzQ0zDSpWFfRPZiIh37BAXr0D1voluJ
rEHkcals6p7hL98BoxjFIvA=
-----END PRIVATE KEY-----


@@ -1,23 +1,23 @@
-----BEGIN CERTIFICATE-----
MIID5jCCAs6gAwIBAgIUN7coIsdMcLo9amZfkwogu0YkeLEwDQYJKoZIhvcNAQEL
BQAwfjELMAkGA1UEBhMCU0UxDjAMBgNVBAgMBVN0YXRlMREwDwYDVQQHDAhMb2Nh
dGlvbjEaMBgGA1UECgwRT3JnYW5pemF0aW9uIE5hbWUxHDAaBgNVBAsME09yZ2Fu
aXphdGlvbmFsIFVuaXQxEjAQBgNVBAMMCWxvY2FsaG9zdDAeFw0yMzA5MjExNDE2
MjNaFw0yNDA5MjAxNDE2MjNaMH4xCzAJBgNVBAYTAlNFMQ4wDAYDVQQIDAVTdGF0
ZTERMA8GA1UEBwwITG9jYXRpb24xGjAYBgNVBAoMEU9yZ2FuaXphdGlvbiBOYW1l
MRwwGgYDVQQLDBNPcmdhbml6YXRpb25hbCBVbml0MRIwEAYDVQQDDAlsb2NhbGhv
c3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCIzOJskt6VkEJYXKSJ
v/Gdil3XYkjk3NVc/+m+kzqnkTRbPtT9w+IGWgmJhuf9DJPLCwHFAEFarVwVx16Q
0PbU4ajXaLRHEYGhrH10oTMjQnJ24xVm26mxRXPQa5vaLpWJqNyIdNLIQLe+UXUO
zSGGsFTRMAjvYrkzjBe4ZUnaZV+aFY/ug0jfzeA1dJjzKZs6+yTJRbsuWUEb8MsD
mT4v+kBZDKdaDn7AFDWRVqx/38BnqsRzkM0CxpnyT2kRzw5zQajIE13gdTJo1EHv
YSUkkxrY5m30Rl9BuBBZBjhMzOHq0fYVVooHO+sf4XHPgvFTTxJum85u7J1JoEUj
rLKtAgMBAAGjXDBaMA4GA1UdDwEB/wQEAwIDiDATBgNVHSUEDDAKBggrBgEFBQcD
ATAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0OBBYEFNzx4Rfs9m8XR5ML0WsI
sorKmB4PMA0GCSqGSIb3DQEBCwUAA4IBAQB87iQy8R0fiOky9WTcyzVeMaavS3MX
iTe1BRn1OCyDq+UiwwoNz7zdzZJFEmRtFBwPNFOe4HzLu6E+7yLFR552eYRHlqIi
/fiLb5JiZfPtokUHeqwELWBsoXtU8vKxViPiLZ09jkWOPZWo7b/xXd6QYykBfV91
usUXLzyTD2orMagpqNksLDGS3p3ggHEJBZtRZA8R7kPEw98xZHznOQpr26iv8kYz
ZWdLFoFdwgFBSfxePKax5rfo+FbwdrcTX0MhbORyiu2XsBAghf8s2vKDkHg2UQE8
haonxFYMFaASfaZ/5vWKYDTCJkJ67m/BtkpRafFEO+ad1i1S61OjfxH4
MIID4jCCAsqgAwIBAgIUcaRq6J/YF++Bo01Zc+HeQvCbnWMwDQYJKoZIhvcNAQEL
BQAwaTELMAkGA1UEBhMCVVMxCzAJBgNVBAgMAkNBMRYwFAYDVQQHDA1TYW4gRnJh
bmNpc2NvMQ0wCwYDVQQKDARPdmVuMREwDwYDVQQLDAhUZWFtIEJ1bjETMBEGA1UE
AwwKc2VydmVyLWJ1bjAeFw0yNTA5MDYwMzAwNDlaFw0zNTA5MDQwMzAwNDlaMGkx
CzAJBgNVBAYTAlVTMQswCQYDVQQIDAJDQTEWMBQGA1UEBwwNU2FuIEZyYW5jaXNj
bzENMAsGA1UECgwET3ZlbjERMA8GA1UECwwIVGVhbSBCdW4xEzARBgNVBAMMCnNl
cnZlci1idW4wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDlYzosgRgX
HL6vMh1V0ERFhsvlZrtRojSw6tafr3SQBphU793/rGiYZlL/lJ9HIlLkx9JMbuTj
Nm5U2eRwHiTQIeWD4aCIESwPlkdaVYtC+IOj55bJN8xNa7h5GyJwF7PnPetAsKyE
8DMBn1gKMhaIis7HHOUtk4/K3Y4peU44d04z0yPt6JtY5Sbvi1E7pGX6T/2c9sHs
dIDeDctWnewpXXs8zkAla0KNWQfpDnpS53wxAfStTA4lSrA9daxC7hZopQlLxFIb
Jk+0BLbEsXtrJ54T5iguHk+2MDVAy4MOqP9XbKV7eGHk73l6+CSwmHyHBxh4ChxR
QeT5BP0MUTn1AgMBAAGjgYEwfzAdBgNVHQ4EFgQUw7nEnh4uOdZVZUapQzdAUaVa
An0wHwYDVR0jBBgwFoAUw7nEnh4uOdZVZUapQzdAUaVaAn0wDwYDVR0TAQH/BAUw
AwEB/zAsBgNVHREEJTAjgglsb2NhbGhvc3SHBH8AAAGHEAAAAAAAAAAAAAAAAAAA
AAEwDQYJKoZIhvcNAQELBQADggEBAEA8r1fvDLMSCb8bkAURpFk8chn8pl5MChzT
YUDaLdCCBjPXJkSXNdyuwS+T/ljAGyZbW5xuDccCNKltawO4CbyEXUEZbYr3w9eq
j8uqymJPhFf0O1rKOI2han5GBCgHwG13QwKI+4uu7390nD+TlzLOhxFfvOG7OadH
QNMNLNyldgF4Nb8vWdz0FtQiGUIrO7iq4LFhhd1lCxe0q+FAYSEYcc74WtF/Yo8V
JQauXuXyoP5FqLzNt/yeNQhceyIXJGKCsjr5/bASBmVlCwgRfsD3jpG37L8YCJs1
L4WEikcY4Lzb2NF9e94IyZdQsRqd9DFBF5zP013MSUiuhiow32k=
-----END CERTIFICATE-----


@@ -0,0 +1,710 @@
/**
* All tests in this file run in both Bun and Node.js.
*
* Test that TLS options can be inherited from agent.options and agent.connectOpts.
* This is important for compatibility with libraries like https-proxy-agent.
*
* The HttpsProxyAgent tests verify that TLS options are properly passed through
* the proxy tunnel to the target HTTPS server.
*/
import { once } from "node:events";
import { readFileSync } from "node:fs";
import http from "node:http";
import https from "node:https";
import { createRequire } from "node:module";
import type { AddressInfo } from "node:net";
import net from "node:net";
import { dirname, join } from "node:path";
import { describe, test } from "node:test";
import { fileURLToPath } from "node:url";
// Use createRequire for ESM compatibility
const require = createRequire(import.meta.url);
const { HttpsProxyAgent } = require("https-proxy-agent") as {
HttpsProxyAgent: new (proxyUrl: string, options?: Record<string, unknown>) => http.Agent;
};
const __dirname = dirname(fileURLToPath(import.meta.url));
// Self-signed certificate with SANs for localhost and 127.0.0.1
// This cert is its own CA (self-signed)
const tlsCerts = {
cert: readFileSync(join(__dirname, "fixtures", "cert.pem"), "utf8"),
key: readFileSync(join(__dirname, "fixtures", "cert.key"), "utf8"),
encryptedKey: readFileSync(join(__dirname, "fixtures", "cert.encrypted.key"), "utf8"),
passphrase: "testpassword",
// Self-signed cert, so it's its own CA
get ca() {
return this.cert;
},
};
async function createHttpsServer(
options: https.ServerOptions = {},
): Promise<{ server: https.Server; port: number; hostname: string }> {
const server = https.createServer({ key: tlsCerts.key, cert: tlsCerts.cert, ...options }, (req, res) => {
res.writeHead(200);
res.end("OK");
});
await once(server.listen(0, "127.0.0.1"), "listening");
const { port } = server.address() as AddressInfo;
return { server, port, hostname: "127.0.0.1" };
}
async function createHttpServer(): Promise<{
server: http.Server;
port: number;
hostname: string;
}> {
const server = http.createServer((req, res) => {
res.writeHead(200);
res.end("OK");
});
await once(server.listen(0, "127.0.0.1"), "listening");
const { port } = server.address() as AddressInfo;
return { server, port, hostname: "127.0.0.1" };
}
/**
* Create an HTTP CONNECT proxy server.
* This proxy handles the CONNECT method to establish tunnels for HTTPS connections.
*/
function createConnectProxy(): net.Server {
return net.createServer(clientSocket => {
let buffer: Uint8Array = new Uint8Array(0);
let tunnelEstablished = false;
let targetSocket: net.Socket | null = null;
clientSocket.on("data", (data: Uint8Array) => {
// If tunnel is already established, forward data directly
if (tunnelEstablished && targetSocket) {
targetSocket.write(data);
return;
}
// Concatenate buffers
const newBuffer = new Uint8Array(buffer.length + data.length);
newBuffer.set(buffer);
newBuffer.set(data, buffer.length);
buffer = newBuffer;
const bufferStr = new TextDecoder().decode(buffer);
// Check if we have complete headers
const headerEnd = bufferStr.indexOf("\r\n\r\n");
if (headerEnd === -1) return;
const headerPart = bufferStr.substring(0, headerEnd);
const lines = headerPart.split("\r\n");
const requestLine = lines[0];
// Check for CONNECT method
const match = requestLine.match(/^CONNECT\s+([^:]+):(\d+)\s+HTTP/);
if (!match) {
clientSocket.write("HTTP/1.1 400 Bad Request\r\n\r\n");
clientSocket.end();
return;
}
const [, targetHost, targetPort] = match;
// Get any data after the headers (shouldn't be any for CONNECT)
// headerEnd is byte position in the string, need to account for UTF-8
const headerBytes = new TextEncoder().encode(bufferStr.substring(0, headerEnd + 4)).length;
const remainingData = buffer.subarray(headerBytes);
// Connect to target
targetSocket = net.connect(parseInt(targetPort, 10), targetHost, () => {
clientSocket.write("HTTP/1.1 200 Connection Established\r\n\r\n");
tunnelEstablished = true;
// Forward any remaining data
if (remainingData.length > 0) {
targetSocket!.write(remainingData);
}
// Set up bidirectional piping
targetSocket!.on("data", (chunk: Uint8Array) => {
clientSocket.write(chunk);
});
});
targetSocket.on("error", () => {
if (!tunnelEstablished) {
clientSocket.write("HTTP/1.1 502 Bad Gateway\r\n\r\n");
}
clientSocket.end();
});
targetSocket.on("close", () => clientSocket.destroy());
clientSocket.on("close", () => targetSocket?.destroy());
});
clientSocket.on("error", () => {
targetSocket?.destroy();
});
});
}
/**
* Helper to start a proxy server and get its port.
*/
async function startProxy(server: net.Server): Promise<number> {
return new Promise<number>(resolve => {
server.listen(0, "127.0.0.1", () => {
const addr = server.address() as AddressInfo;
resolve(addr.port);
});
});
}
describe("https.request agent TLS options inheritance", () => {
describe("agent.options", () => {
test("inherits ca from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with ca in options
const agent = new https.Agent({
ca: tlsCerts.ca,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO ca here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits rejectUnauthorized from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with rejectUnauthorized: false in options
const agent = new https.Agent({
rejectUnauthorized: false,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO rejectUnauthorized here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits cert and key from agent.options", async () => {
// Create a server that uses TLS
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with cert/key in options
const agent = new https.Agent({
rejectUnauthorized: false,
cert: tlsCerts.cert,
key: tlsCerts.key,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO cert/key here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
// Test HttpsProxyAgent compatibility - these tests use real HttpsProxyAgent
// to verify HTTPS requests work through the proxy tunnel with TLS options
describe("HttpsProxyAgent TLS options", () => {
test("HttpsProxyAgent with rejectUnauthorized: false", async () => {
const { server, port, hostname } = await createHttpsServer();
const proxy = createConnectProxy();
const proxyPort = await startProxy(proxy);
try {
// Create HttpsProxyAgent for the proxy connection
const agent = new HttpsProxyAgent(`http://127.0.0.1:${proxyPort}`, {
rejectUnauthorized: false,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// TLS options must also be passed here for Node.js compatibility
// https-proxy-agent doesn't propagate these to target connection in Node.js
// See: https://github.com/TooTallNate/node-https-proxy-agent/issues/35
rejectUnauthorized: false,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
proxy.close();
}
});
test("HttpsProxyAgent with ca option", async () => {
const { server, port, hostname } = await createHttpsServer();
const proxy = createConnectProxy();
const proxyPort = await startProxy(proxy);
try {
// Create HttpsProxyAgent for the proxy connection
const agent = new HttpsProxyAgent(`http://127.0.0.1:${proxyPort}`, {
ca: tlsCerts.ca,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// TLS options must also be passed here for Node.js compatibility
ca: tlsCerts.ca,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
proxy.close();
}
});
test("HttpsProxyAgent with cert and key options", async () => {
const { server, port, hostname } = await createHttpsServer();
const proxy = createConnectProxy();
const proxyPort = await startProxy(proxy);
try {
// Create HttpsProxyAgent for the proxy connection
const agent = new HttpsProxyAgent(`http://127.0.0.1:${proxyPort}`, {
rejectUnauthorized: false,
cert: tlsCerts.cert,
key: tlsCerts.key,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// TLS options must also be passed here for Node.js compatibility
rejectUnauthorized: false,
cert: tlsCerts.cert,
key: tlsCerts.key,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
proxy.close();
}
});
});
describe("option precedence (matches Node.js)", () => {
// In Node.js, options are merged via spread in createSocket:
// options = { __proto__: null, ...options, ...this.options };
// https://github.com/nodejs/node/blob/v23.6.0/lib/_http_agent.js#L365
// With spread, the last one wins, so agent.options overwrites request options.
test("agent.options takes precedence over direct options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with correct CA
const agent = new https.Agent({
ca: tlsCerts.ca, // Correct CA in agent.options - should be used
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
ca: "wrong-ca-that-would-fail", // Wrong CA in request - should be ignored
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("direct options used when agent.options not set", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent without ca
const agent = new https.Agent({});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
ca: tlsCerts.ca, // Direct option should be used since agent.options.ca is not set
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
describe("other TLS options", () => {
test("inherits servername from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
const agent = new https.Agent({
rejectUnauthorized: false,
servername: "localhost", // Should be passed to TLS
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits ciphers from agent.options", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
const agent = new https.Agent({
rejectUnauthorized: false,
ciphers: "HIGH:!aNULL:!MD5", // Custom cipher suite
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("inherits passphrase from agent.options", async () => {
// Create server that accepts connections with encrypted key
const { server, port, hostname } = await createHttpsServer({
key: tlsCerts.encryptedKey,
passphrase: tlsCerts.passphrase,
});
try {
// Create an agent with encrypted key and passphrase in options
const agent = new https.Agent({
ca: tlsCerts.ca,
cert: tlsCerts.cert,
key: tlsCerts.encryptedKey,
passphrase: tlsCerts.passphrase,
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
// NO passphrase here - should inherit from agent.options
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
test("supports multiple CAs (array)", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent with CA as an array
const agent = new https.Agent({
ca: [tlsCerts.ca], // Array of CAs
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
describe("TLS error handling", () => {
test("rejects self-signed cert when rejectUnauthorized is true", async () => {
const { server, port, hostname } = await createHttpsServer();
try {
// Create an agent without CA and with rejectUnauthorized: true (default)
const agent = new https.Agent({
rejectUnauthorized: true,
// NO ca - should fail because cert is self-signed
});
const { promise, resolve, reject } = Promise.withResolvers<Error>();
const req = https.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
() => {
reject(new Error("Expected request to fail"));
},
);
req.on("error", resolve);
req.end();
const error = await promise;
// Should get a certificate error (self-signed cert not trusted)
if (
!(
error.message.includes("self-signed") ||
error.message.includes("SELF_SIGNED") ||
error.message.includes("certificate") ||
error.message.includes("unable to verify")
)
) {
throw new Error(`Expected certificate error, got: ${error.message}`);
}
} finally {
server.close();
}
});
});
});
describe("http.request agent options", () => {
test("does not fail when agent has TLS options (they are ignored for HTTP)", async () => {
const { server, port, hostname } = await createHttpServer();
try {
// Create an agent - TLS options passed via constructor should be ignored for HTTP
// Using type assertion since http.Agent doesn't normally accept TLS options
const agent = new (http.Agent as any)({
rejectUnauthorized: false,
ca: "some-ca",
});
const { promise, resolve, reject } = Promise.withResolvers<void>();
const req = http.request(
{
hostname,
port,
path: "/",
method: "GET",
agent,
},
res => {
res.on("data", () => {});
res.on("end", resolve);
},
);
req.on("error", reject);
req.end();
await promise;
} finally {
server.close();
}
});
});
// Only run in Bun to avoid infinite loop when Node.js runs this file
if (typeof Bun !== "undefined") {
const { bunEnv, nodeExe } = await import("harness");
describe("Node.js compatibility", () => {
test("all tests pass in Node.js", async () => {
const node = nodeExe();
if (!node) {
throw new Error("Node.js not found in PATH");
}
const testFile = fileURLToPath(import.meta.url);
await using proc = Bun.spawn({
cmd: [node, "--test", testFile],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
if (exitCode !== 0) {
throw new Error(`Node.js tests failed with code ${exitCode}\n${stderr}\n${stdout}`);
}
});
});
}


@@ -480,6 +480,25 @@ if (isDockerEnabled()) {
expect(b).toEqual({ b: 2 });
});
test("Binary", async () => {
const random_name = ("t_" + Bun.randomUUIDv7("hex").replaceAll("-", "")).toLowerCase();
await sql`CREATE TEMPORARY TABLE ${sql(random_name)} (a binary(1), b varbinary(1), c blob)`;
const values = [
{ a: Buffer.from([1]), b: Buffer.from([2]), c: Buffer.from([3]) },
];
await sql`INSERT INTO ${sql(random_name)} ${sql(values)}`;
const results = await sql`select * from ${sql(random_name)}`;
// return buffers
expect(results[0].a).toEqual(Buffer.from([1]));
expect(results[0].b).toEqual(Buffer.from([2]));
expect(results[0].c).toEqual(Buffer.from([3]));
// text protocol should behave the same
const results2 = await sql`select * from ${sql(random_name)}`.simple();
expect(results2[0].a).toEqual(Buffer.from([1]));
expect(results2[0].b).toEqual(Buffer.from([2]));
expect(results2[0].c).toEqual(Buffer.from([3]));
});
test("bulk insert nested sql()", async () => {
await using sql = new SQL({ ...getOptions(), max: 1 });
await sql`create temporary table test_users (name text, age int)`;


@@ -0,0 +1,52 @@
import { heapStats } from "bun:jsc";
import { describe, expect, test } from "bun:test";
describe("Bun.serve response stream leak", () => {
test("proxy server forwarding streaming response should not leak", async () => {
// Backend server that returns a streaming response with delay
await using backend = Bun.serve({
port: 0,
fetch(req) {
const stream = new ReadableStream({
async start(controller) {
controller.enqueue(new TextEncoder().encode("chunk1"));
await Bun.sleep(10);
controller.enqueue(new TextEncoder().encode("chunk2"));
controller.close();
},
});
return new Response(stream);
},
});
// Proxy server that forwards the response body stream
await using proxy = Bun.serve({
port: 0,
async fetch(req) {
const backendResponse = await fetch(`http://localhost:${backend.port}/`);
return new Response(backendResponse.body);
},
});
const url = `http://localhost:${proxy.port}/`;
async function leak() {
const response = await fetch(url);
return await response.text();
}
for (let i = 0; i < 200; i++) {
await leak();
}
await Bun.sleep(10);
Bun.gc(true);
await Bun.sleep(10);
Bun.gc(true);
const readableStreamCount = heapStats().objectTypeCounts.ReadableStream || 0;
const responseCount = heapStats().objectTypeCounts.Response || 0;
expect(readableStreamCount).toBeLessThanOrEqual(50);
expect(responseCount).toBeLessThanOrEqual(50);
});
});


@@ -0,0 +1,105 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDirWithFiles } from "harness";
// Test for https://github.com/oven-sh/bun/issues/26039
// When parsing a bun.lock file with an empty registry URL for a scoped package,
// bun should use the scope-specific registry from bunfig.toml, not the default npm registry.
test("frozen lockfile should use scope-specific registry for scoped packages", async () => {
const dir = tempDirWithFiles("scoped-registry-test", {
"package.json": JSON.stringify({
name: "test-scoped-registry",
version: "1.0.0",
dependencies: {
"@example/test-package": "^1.0.0",
},
}),
"bunfig.toml": `
[install.scopes]
example = { url = "https://npm.pkg.github.com" }
`,
// bun.lock with empty string for registry URL - this should trigger the scope lookup
"bun.lock": JSON.stringify(
{
lockfileVersion: 1,
workspaces: {
"": {
dependencies: {
"@example/test-package": "^1.0.0",
},
},
},
packages: {
"@example/test-package": ["@example/test-package@1.0.0", "", {}, "sha512-AAAA"],
},
},
null,
2,
),
});
// Run bun install --frozen-lockfile. It will fail because the package doesn't exist,
// but the error message should show the correct registry URL (npm.pkg.github.com, not registry.npmjs.org)
const { stderr, exitCode } = Bun.spawnSync({
cmd: [bunExe(), "install", "--frozen-lockfile"],
cwd: dir,
env: bunEnv,
});
const stderrText = stderr.toString();
// Before the fix, this would try to fetch from https://registry.npmjs.org/@example/test-package/-/test-package-1.0.0.tgz
// After the fix, it should try to fetch from https://npm.pkg.github.com/@example/test-package/-/test-package-1.0.0.tgz
expect(stderrText).toContain("npm.pkg.github.com");
expect(stderrText).not.toContain("registry.npmjs.org");
// The install should fail because the package doesn't exist on the registry
expect(exitCode).not.toBe(0);
});
// Test that non-scoped packages still use the default registry when registry URL is empty
test("frozen lockfile should use default registry for non-scoped packages", async () => {
const dir = tempDirWithFiles("non-scoped-registry-test", {
"package.json": JSON.stringify({
name: "test-non-scoped-registry",
version: "1.0.0",
dependencies: {
"fake-nonexistent-package": "^1.0.0",
},
}),
"bunfig.toml": `
[install.scopes]
example = { url = "https://npm.pkg.github.com" }
`,
// bun.lock with empty string for registry URL for non-scoped package
"bun.lock": JSON.stringify(
{
lockfileVersion: 1,
workspaces: {
"": {
dependencies: {
"fake-nonexistent-package": "^1.0.0",
},
},
},
packages: {
"fake-nonexistent-package": ["fake-nonexistent-package@1.0.0", "", {}, "sha512-BBBB"],
},
},
null,
2,
),
});
const { stderr, exitCode } = Bun.spawnSync({
cmd: [bunExe(), "install", "--frozen-lockfile"],
cwd: dir,
env: bunEnv,
});
const stderrText = stderr.toString();
// Non-scoped packages should still use the default registry
expect(stderrText).toContain("registry.npmjs.org");
expect(stderrText).not.toContain("npm.pkg.github.com");
// The install should fail because the package doesn't exist on the registry
expect(exitCode).not.toBe(0);
});
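The lookup these two tests pin down can be sketched as follows. This is an illustrative model, not Bun's actual code (per the commit message, Bun uses `manager.scopeForPackageName()` internally): when the lockfile's registry URL is empty, the scope taken from the package name selects the registry, and unscoped packages fall back to the default.

```typescript
// Illustrative sketch of scope-specific registry selection (not Bun's
// actual implementation): "@example/pkg" -> scopes["example"], anything
// unscoped or unknown -> the default npm registry.
const DEFAULT_REGISTRY = "https://registry.npmjs.org";

function registryForPackage(name: string, scopes: Record<string, string>): string {
  if (name.startsWith("@")) {
    const scope = name.slice(1, name.indexOf("/"));
    return scopes[scope] ?? DEFAULT_REGISTRY;
  }
  return DEFAULT_REGISTRY;
}

const scopes = { example: "https://npm.pkg.github.com" };
console.log(registryForPackage("@example/test-package", scopes)); // https://npm.pkg.github.com
console.log(registryForPackage("fake-nonexistent-package", scopes)); // https://registry.npmjs.org
```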


@@ -0,0 +1,135 @@
/**
* Regression test for GitHub issue #2952
* https://github.com/oven-sh/bun/issues/2952
*
* When an onResolve plugin returns null (no match), the bundler should still
* respect the sideEffects field from package.json for tree-shaking.
*
* This is a critical issue because packages like lodash-es use sideEffects: false
* to enable proper tree-shaking of unused exports.
*
* The bug manifested when:
* 1. A plugin's onResolve callback returned null for all resolutions
* 2. The resolved module had sideEffects: false in package.json
* 3. The module used barrel exports (re-exports from individual files)
*
* The fix ensures that when onResolve returns null and the bundler falls back
* to default resolution, the sideEffects field from package.json is properly
* propagated to the parse task.
*/
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
import * as path from "path";
test("issue#2952: onResolve plugin returning null should preserve sideEffects for tree-shaking", async () => {
using dir = tempDir("issue-2952", {
"entry.ts": `
import { isArray } from "tree-shakeable-lib";
export default function isArray2(value: any): boolean {
return isArray(value);
}
`,
"node_modules/tree-shakeable-lib/index.js": `
export { default as isArray } from './isArray.js';
export { default as isString } from './isString.js';
export { default as isNumber } from './isNumber.js';
export { default as isObject } from './isObject.js';
`,
"node_modules/tree-shakeable-lib/isArray.js": `
export default function isArray(value) {
return Array.isArray(value);
}
`,
"node_modules/tree-shakeable-lib/isString.js": `
export default function isString(value) {
console.log("TREESHAKE_FAILED_isString");
return typeof value === "string";
}
`,
"node_modules/tree-shakeable-lib/isNumber.js": `
export default function isNumber(value) {
console.log("TREESHAKE_FAILED_isNumber");
return typeof value === "number";
}
`,
"node_modules/tree-shakeable-lib/isObject.js": `
export default function isObject(value) {
console.log("TREESHAKE_FAILED_isObject");
return typeof value === "object" && value !== null;
}
`,
"node_modules/tree-shakeable-lib/package.json": JSON.stringify({
name: "tree-shakeable-lib",
main: "index.js",
sideEffects: false,
}),
"build-with-plugin.ts": `
import type { BunPlugin } from "bun";
const myPlugin: BunPlugin = {
name: "Test plugin",
setup(build) {
build.onResolve({ filter: /.*/ }, async (args) => {
return null; // Return null to let default resolution handle it
});
},
};
const result = await Bun.build({
entrypoints: ["entry.ts"],
minify: true,
outdir: "dist-with-plugin",
plugins: [myPlugin],
});
if (!result.success) {
console.error("Build failed");
process.exit(1);
}
`,
"build-without-plugin.ts": `
const result = await Bun.build({
entrypoints: ["entry.ts"],
minify: true,
outdir: "dist-without-plugin",
});
if (!result.success) {
console.error("Build failed");
process.exit(1);
}
`,
});
// Build without plugin
await using procWithout = Bun.spawn({
cmd: [bunExe(), "build-without-plugin.ts"],
cwd: String(dir),
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
await procWithout.exited;
// Build with plugin
await using procWith = Bun.spawn({
cmd: [bunExe(), "build-with-plugin.ts"],
cwd: String(dir),
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
await procWith.exited;
// Read outputs
const outputWithout = await Bun.file(path.join(String(dir), "dist-without-plugin/entry.js")).text();
const outputWith = await Bun.file(path.join(String(dir), "dist-with-plugin/entry.js")).text();
// Both should tree-shake correctly
expect(outputWithout).not.toContain("TREESHAKE_FAILED");
expect(outputWith).not.toContain("TREESHAKE_FAILED");
// Output sizes should be similar (both properly tree-shaken)
// Allow some variance for potential formatting differences
expect(Math.abs(outputWithout.length - outputWith.length)).toBeLessThan(100);
});
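The tree-shaking rule this regression test relies on can be sketched as a small decision function. This is an illustration of the `sideEffects` semantics from package.json, not the bundler's actual code: with `sideEffects: false` an unused module is safe to drop; with an array, only the listed files are treated as having side effects; with no field, the bundler must conservatively keep the module.

```typescript
// Illustrative sketch (not the bundler's code) of the package.json
// `sideEffects` rule for dropping an unused module during tree-shaking.
function canDropUnusedModule(sideEffects: boolean | string[] | undefined, file: string): boolean {
  if (sideEffects === false) return true; // "sideEffects": false -> safe to drop
  if (Array.isArray(sideEffects)) return !sideEffects.includes(file); // listed files are kept
  return false; // absent (or true): assume side effects, keep the module
}

console.log(canDropUnusedModule(false, "isString.js")); // true
console.log(canDropUnusedModule(["./styles.css"], "isString.js")); // true
console.log(canDropUnusedModule(undefined, "isString.js")); // false
```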


@@ -0,0 +1,64 @@
import { expect, test } from "bun:test";
// GitHub Issue #25639: setTimeout Timeout object missing _idleStart property
// Next.js 16 uses _idleStart to coordinate timers for Cache Components
test("setTimeout returns Timeout object with _idleStart property", () => {
const timer = setTimeout(() => {}, 100);
try {
// Verify _idleStart exists and is a number
expect("_idleStart" in timer).toBe(true);
expect(typeof timer._idleStart).toBe("number");
// _idleStart should be a positive timestamp
expect(timer._idleStart).toBeGreaterThan(0);
} finally {
clearTimeout(timer);
}
});
test("setInterval returns Timeout object with _idleStart property", () => {
const timer = setInterval(() => {}, 100);
try {
// Verify _idleStart exists and is a number
expect("_idleStart" in timer).toBe(true);
expect(typeof timer._idleStart).toBe("number");
// _idleStart should be a positive timestamp
expect(timer._idleStart).toBeGreaterThan(0);
} finally {
clearInterval(timer);
}
});
test("_idleStart is writable (Next.js modifies it to coordinate timers)", () => {
const timer = setTimeout(() => {}, 100);
try {
const originalIdleStart = timer._idleStart;
expect(typeof originalIdleStart).toBe("number");
// Next.js sets _idleStart to coordinate timers
const newIdleStart = originalIdleStart - 100;
timer._idleStart = newIdleStart;
expect(timer._idleStart).toBe(newIdleStart);
} finally {
clearTimeout(timer);
}
});
test("timers created at different times have different _idleStart values", async () => {
const timer1 = setTimeout(() => {}, 100);
// Wait a bit to ensure different timestamp
await Bun.sleep(10);
const timer2 = setTimeout(() => {}, 100);
try {
expect(timer2._idleStart).toBeGreaterThanOrEqual(timer1._idleStart);
} finally {
clearTimeout(timer1);
clearTimeout(timer2);
}
});
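A minimal consumer-side sketch of what these tests guarantee: code like Next.js reads the undocumented, Node-internal `_idleStart` field off a `Timeout`. Because the field is not in the TypeScript types, an `any` cast is needed.

```typescript
// Hypothetical consumer sketch: _idleStart is a Node-internal field on
// Timeout objects, absent from the public types, hence the `any` cast.
const timer = setTimeout(() => {}, 100) as any;
console.log(typeof timer._idleStart); // "number"
clearTimeout(timer);
```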


@@ -4,7 +4,7 @@ import { expect, test } from "bun:test";
import { tempDirWithFiles } from "harness";
import { join } from "path";
-test("Bun.build reactFastRefresh option enables React Fast Refresh transform", async () => {
+test.each(["browser", "bun"] as const)("Bun.build reactFastRefresh works with target: %s", async target => {
const dir = tempDirWithFiles("react-fast-refresh-test", {
"component.tsx": `
import { useState } from "react";
@@ -24,7 +24,7 @@ test("Bun.build reactFastRefresh option enables React Fast Refresh transform", a
const buildEnabled = await Bun.build({
entrypoints: [join(dir, "component.tsx")],
reactFastRefresh: true,
-target: "browser",
+target,
external: ["react"],
});
@@ -38,7 +38,7 @@ test("Bun.build reactFastRefresh option enables React Fast Refresh transform", a
// Without reactFastRefresh (default), output should NOT contain refresh calls
const buildDisabled = await Bun.build({
entrypoints: [join(dir, "component.tsx")],
-target: "browser",
+target,
external: ["react"],
});


@@ -0,0 +1,86 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDirWithFiles } from "harness";
test("CSS logical properties should not be stripped when nested rules are present", async () => {
// Test for regression of issue #25794: CSS logical properties (e.g., inset-inline-end)
// are stripped from bundler output when they appear in a nested selector that also
// contains further nested rules (like pseudo-elements).
const dir = tempDirWithFiles("css-logical-properties-nested", {
"input.css": `.test-longform {
background-color: teal;
&.test-longform--end {
inset-inline-end: 20px;
&:after {
content: "";
}
}
}
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "build", "input.css", "--outdir", "out"],
env: bunEnv,
cwd: dir,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// Verify the output CSS contains the logical property fallbacks
const outputContent = await Bun.file(`${dir}/out/input.css`).text();
// Helper function to normalize CSS output for snapshots
function normalizeCSSOutput(output: string): string {
return output
.replace(/\/\*.*?\*\//g, "/* [path] */") // Replace comment paths
.trim();
}
// The output should contain LTR/RTL fallback rules for inset-inline-end
// inset-inline-end: 20px should generate:
// - right: 20px for LTR languages
// - left: 20px for RTL languages
// The bundler generates vendor-prefixed variants for browser compatibility
expect(normalizeCSSOutput(outputContent)).toMatchInlineSnapshot(`
"/* [path] */
.test-longform {
background-color: teal;
}
.test-longform.test-longform--end:not(:-webkit-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:not(:-moz-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:not(:is(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:-webkit-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:-moz-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:is(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:after {
content: "";
}"
`);
// Should exit successfully
expect(exitCode).toBe(0);
});