Compare commits

...

8 Commits

Author SHA1 Message Date
Claude Bot
0ea50e1803 fix(vscode): respect bun.test.filePattern setting in isTestFile
The isTestFile() method used a hardcoded regex that only matched
.test. and .spec. patterns, ignoring the user's bun.test.filePattern
configuration. As a result, files that didn't match the custom pattern
were still added to the Test Explorer when manually opened.

Now isTestFile() uses the customFilePattern() method to respect the
user's configuration. This allows users to configure patterns like
**/*.bun.test.{js,ts} to avoid conflicts with other test runners
like Vitest in monorepo setups.
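
For illustration, a minimal sketch of the matching semantics, using `Bun.Glob` to approximate the extension's matcher (the file paths are hypothetical):

```ts
// Illustration only – Bun.Glob stands in for the extension's own glob matching.
const pattern = new Bun.Glob("**/*.bun.test.{js,ts}");

console.log(pattern.match("src/api.bun.test.ts")); // true  – picked up by Bun's Test Explorer
console.log(pattern.match("src/api.test.ts"));     // false – left to Vitest
```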

Also adds a length check on glob patterns to prevent potential ReDoS
attacks from pathological patterns.

Fixes #26067

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 14:10:56 +00:00
Jarred Sumner
fbd800551b Bump 2026-01-13 15:06:36 -08:00
robobun
113cdd9648 fix(completions): add update command to Fish completions (#25978)
## Summary

- Add the `update` subcommand to Fish shell completions
- Apply the install/add/remove flags (--global, --dry-run, --force,
etc.) to the `update` command

Previously, Fish shell autocompletion for `bun update --gl<TAB>` would
not work because:
1. The `update` command was missing from the list of built-in commands
2. The install/add/remove flags were not being applied to `update`

Fixes #25953

## Test plan

- [x] Verify `update` appears in subcommand completions (`bun <TAB>`)
- [x] Verify `--global` flag completion works (`bun update --gl<TAB>`)
- [x] Verify other install flags work with update (--dry-run, --force,
etc.)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 15:05:00 -08:00
robobun
3196178fa7 fix(timers): add _idleStart property to Timeout object (#26021)
## Summary

- Add `_idleStart` property (getter/setter) to the Timeout object
returned by `setTimeout()` and `setInterval()`
- The property returns a monotonic timestamp (in milliseconds)
representing when the timer was created
- This mimics Node.js's behavior where `_idleStart` is the libuv
timestamp at timer creation time
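
A minimal usage sketch of the property described above (the `as any` cast is only because `_idleStart` is not part of the public timer typings):

```ts
// Sketch: _idleStart is a monotonic millisecond timestamp recorded when the timer is scheduled.
const t = setTimeout(() => {}, 100);
console.log("_idleStart" in t);            // true
console.log(typeof (t as any)._idleStart); // "number"
clearTimeout(t);
```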

## Test plan

- [x] Verified test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25639.test.ts`
- [x] Verified test passes with `bun bd test
test/regression/issue/25639.test.ts`
- [x] Manual verification:
  ```bash
  # Bun with fix - _idleStart exists
  ./build/debug/bun-debug -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number

  # Node.js reference - same behavior
  node -e "const t = setTimeout(() => {}, 0); console.log('_idleStart' in t, typeof t._idleStart); clearTimeout(t)"
  # Output: true number
  ```

Closes #25639

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 19:35:11 -08:00
robobun
d530ed993d fix(css): restore handler context after minifying nested rules (#25997)
## Summary
- Fixes handler context not being restored after minifying nested CSS
rules
- Adds regression test for the issue

## Test plan
- [x] Test fails with `USE_SYSTEM_BUN=1 bun test
test/regression/issue/25794.test.ts`
- [x] Test passes with `bun bd test test/regression/issue/25794.test.ts`

Fixes #25794

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 14:55:27 -08:00
Dylan Conway
959169dfaf feat(archive): change API to constructor-based with S3 support (#25940)
## Summary
- Change Archive API from `Bun.Archive.from(data)` to `new
Bun.Archive(data, options?)`
- Change compression options from `{ gzip: true }` to `{ compress:
"gzip", level?: number }`
- Default to no compression when no options provided
- Use `{ compress: "gzip" }` to enable gzip compression (level 6 by
default)
- Add Archive support for S3 and local file writes via `Bun.write()`

## New API

```typescript
// Create archive - defaults to uncompressed tar
const archive = new Bun.Archive({
  "hello.txt": "Hello, World!",
  "data.json": JSON.stringify({ foo: "bar" }),
});

// Enable gzip compression
const compressed = new Bun.Archive(files, { compress: "gzip" });

// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive(files, { compress: "gzip", level: 12 });

// Write to local file
await Bun.write("archive.tar", archive);           // uncompressed by default
await Bun.write("archive.tar.gz", compressed);     // gzipped

// Write to S3
await client.write("archive.tar.gz", compressed);          // S3Client.write()
await Bun.write("s3://bucket/archive.tar.gz", compressed); // S3 URL
await s3File.write(compressed);                            // s3File.write()

// Get bytes/blob (uses compression setting from constructor)
const bytes = await archive.bytes();
const blob = await archive.blob();
```

## TypeScript Types

```typescript
type ArchiveCompression = "gzip";

type ArchiveOptions = {
  compress?: "gzip";
  level?: number;  // 1-12, default 6 when gzip enabled
};
```

## Test plan
- [x] 98 archive tests pass
- [x] S3 integration tests updated to new API
- [x] TypeScript types updated
- [x] Documentation updated with new examples

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-01-12 14:54:21 -08:00
SUZUKI Sosuke
461ad886bd fix(http): fix Strong reference leak in server response streaming (#25965)
## Summary

Fix a memory leak in `RequestContext.doRenderWithBody()` where
`Strong.Impl` memory was leaked when proxying streaming responses
through Bun's HTTP server.

## Problem

When a streaming response (e.g., from a proxied fetch request) was
forwarded through Bun's server:

1. `response_body_readable_stream_ref` was initialized at line 1836
(from `lock.readable`) or line 1841 (via `Strong.init()`)
2. For `.Bytes` streams with `has_received_last_chunk=false`, a **new**
Strong reference was created at line 1902
3. The old Strong reference was **never deinit'd**, causing
`Strong.Impl` memory to leak

This leak accumulated over time with every streaming response proxied
through the server.
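
A rough sketch of the proxying pattern that triggered the leak (ports, URLs, and chunk contents are illustrative, not taken from the test file):

```ts
// Illustrative only: each proxied streaming response created a Strong reference that was never freed.
const backend = Bun.serve({
  port: 0,
  fetch() {
    // Streaming body, so has_received_last_chunk is false when the proxy forwards it
    return new Response(
      new ReadableStream({
        async start(controller) {
          controller.enqueue(new TextEncoder().encode("chunk"));
          await Bun.sleep(10);
          controller.close();
        },
      }),
    );
  },
});

const proxy = Bun.serve({
  port: 0,
  async fetch() {
    const upstream = await fetch(`http://localhost:${backend.port}/`);
    return new Response(upstream.body); // forwarded stream flows through doRenderWithBody()
  },
});
```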

## Solution

Add `this.response_body_readable_stream_ref.deinit()` before creating
the new Strong reference. This is safe because:

- `stream` exists as a stack-local variable
- JSC's conservative GC tracks stack-local JSValues
- No GC can occur between consecutive synchronous Zig statements
- Therefore, `stream` won't be collected between `deinit()` and
`Strong.init()`

## Test

Added `test/js/web/fetch/server-response-stream-leak.test.ts` which:
- Creates a backend server that returns delayed streaming responses
- Creates a proxy server that forwards the streaming responses
- Makes 200 requests and checks that ReadableStream objects don't
accumulate
- Fails on system Bun v1.3.5 (202 leaked), passes with the fix

## Related

Similar to the Strong reference leak fixes in:
- #23313 (fetch memory leak)
- #25846 (fetch cyclic reference leak)
2026-01-12 14:41:58 -08:00
Markus Schmidt
b6abbd50a0 fix(Bun.SQL): handle binary columns in MySQL correctly (#26011)
## What does this PR do?
Currently, binary columns are returned as strings, which means they get
corrupted when re-encoded as UTF-8. This PR returns binary columns as
Buffers, which is what users actually expect and is also consistent with
PostgreSQL and SQLite.
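
A hedged sketch of the behavior change (the connection string, table, and column names are illustrative):

```ts
import { SQL } from "bun";

const sql = new SQL("mysql://user:pass@localhost:3306/app"); // illustrative connection string

const [row] = await sql`SELECT avatar FROM users WHERE id = ${1}`;
// Before: `avatar` came back as a string and was corrupted by UTF-8 re-encoding.
// After this PR: binary columns are returned as Buffers, matching PostgreSQL and SQLite.
console.log(Buffer.isBuffer(row.avatar)); // true
```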
### How did you verify your code works?
I added tests to verify the correct behavior. Previously, there were no
tests for binary columns at all.

This fixes #23991
2026-01-12 11:56:02 -08:00
24 changed files with 1181 additions and 362 deletions

LATEST
View File

@@ -1 +1 @@
1.3.5
1.3.6

View File

@@ -35,8 +35,8 @@ end
set -l bun_install_boolean_flags yarn production optional development no-save dry-run force no-cache silent verbose global
set -l bun_install_boolean_flags_descriptions "Write a yarn.lock file (yarn v1)" "Don't install devDependencies" "Add dependency to optionalDependencies" "Add dependency to devDependencies" "Don't update package.json or save a lockfile" "Don't install anything" "Always request the latest versions from the registry & reinstall all dependencies" "Ignore manifest cache entirely" "Don't output anything" "Excessively verbose logging" "Use global folder"
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add update init pm x
set -l bun_builtin_cmds_accepting_flags create help bun upgrade discord run init link unlink pm x update
function __bun_complete_bins_scripts --inherit-variable bun_builtin_cmds_without_run -d "Emit bun completions for bins and scripts"
# Do nothing if we already have a builtin subcommand,
@@ -148,14 +148,14 @@ complete -c bun \
for i in (seq (count $bun_install_boolean_flags))
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
-n "__fish_seen_subcommand_from install add remove update" -l "$bun_install_boolean_flags[$i]" -d "$bun_install_boolean_flags_descriptions[$i]"
end
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cwd' -d 'Change working directory'
-n "__fish_seen_subcommand_from install add remove update" -l 'cwd' -d 'Change working directory'
complete -c bun \
-n "__fish_seen_subcommand_from install add remove" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
-n "__fish_seen_subcommand_from install add remove update" -l 'cache-dir' -d 'Choose a cache directory (default: $HOME/.bun/install/cache)'
complete -c bun \
-n "__fish_seen_subcommand_from add" -d 'Popular' -a '(__fish__get_bun_packages)'
@@ -183,4 +183,5 @@ complete -c bun -n "__fish_use_subcommand" -a "unlink" -d "Unregister a local np
complete -c bun -n "__fish_use_subcommand" -a "pm" -d "Additional package management utilities" -f
complete -c bun -n "__fish_use_subcommand" -a "x" -d "Execute a package binary, installing if needed" -f
complete -c bun -n "__fish_use_subcommand" -a "outdated" -d "Display the latest versions of outdated dependencies" -f
complete -c bun -n "__fish_use_subcommand" -a "update" -d "Update dependencies to their latest versions" -f
complete -c bun -n "__fish_use_subcommand" -a "publish" -d "Publish your package from local to npm" -f

View File

@@ -10,21 +10,21 @@ Bun provides a fast, native implementation for working with tar archives through
**Create an archive from files:**
```ts
const archive = Bun.Archive.from({
const archive = new Bun.Archive({
"hello.txt": "Hello, World!",
"data.json": JSON.stringify({ foo: "bar" }),
"nested/file.txt": "Nested content",
});
// Write to disk
await Bun.Archive.write("bundle.tar", archive);
await Bun.write("bundle.tar", archive);
```
**Extract an archive:**
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const entryCount = await archive.extract("./output");
console.log(`Extracted ${entryCount} entries`);
```
@@ -33,7 +33,7 @@ console.log(`Extracted ${entryCount} entries`);
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
@@ -43,10 +43,11 @@ for (const [path, file] of files) {
## Creating Archives
Use `Bun.Archive.from()` to create an archive from an object where keys are file paths and values are file contents:
Use `new Bun.Archive()` to create an archive from an object where keys are file paths and values are file contents. By default, archives are uncompressed:
```ts
const archive = Bun.Archive.from({
// Creates an uncompressed tar archive (default)
const archive = new Bun.Archive({
"README.md": "# My Project",
"src/index.ts": "console.log('Hello');",
"package.json": JSON.stringify({ name: "my-project" }),
@@ -64,7 +65,7 @@ File contents can be:
const data = "binary data";
const arrayBuffer = new ArrayBuffer(8);
const archive = Bun.Archive.from({
const archive = new Bun.Archive({
"text.txt": "Plain text",
"blob.bin": new Blob([data]),
"bytes.bin": new Uint8Array([1, 2, 3, 4]),
@@ -74,18 +75,19 @@ const archive = Bun.Archive.from({
### Writing Archives to Disk
Use `Bun.Archive.write()` to create and write an archive in one operation:
Use `Bun.write()` to write an archive to disk:
```ts
// Write uncompressed tar
await Bun.Archive.write("output.tar", {
// Write uncompressed tar (default)
const archive = new Bun.Archive({
"file1.txt": "content1",
"file2.txt": "content2",
});
await Bun.write("output.tar", archive);
// Write gzipped tar
const files = { "src/index.ts": "console.log('Hello');" };
await Bun.Archive.write("output.tar.gz", files, "gzip");
const compressed = new Bun.Archive({ "src/index.ts": "console.log('Hello');" }, { compress: "gzip" });
await Bun.write("output.tar.gz", compressed);
```
### Getting Archive Bytes
@@ -93,8 +95,7 @@ await Bun.Archive.write("output.tar.gz", files, "gzip");
Get the archive data as bytes or a Blob:
```ts
const files = { "hello.txt": "Hello, World!" };
const archive = Bun.Archive.from(files);
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// As Uint8Array
const bytes = await archive.bytes();
@@ -102,9 +103,10 @@ const bytes = await archive.bytes();
// As Blob
const blob = await archive.blob();
// With gzip compression
const gzippedBytes = await archive.bytes("gzip");
const gzippedBlob = await archive.blob("gzip");
// With gzip compression (set at construction)
const gzipped = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
const gzippedBytes = await gzipped.bytes();
const gzippedBlob = await gzipped.blob();
```
## Extracting Archives
@@ -116,13 +118,13 @@ Create an archive from existing tar/tar.gz data:
```ts
// From a file
const tarball = await Bun.file("package.tar.gz").bytes();
const archiveFromFile = Bun.Archive.from(tarball);
const archiveFromFile = new Bun.Archive(tarball);
```
```ts
// From a fetch response
const response = await fetch("https://example.com/archive.tar.gz");
const archiveFromFetch = Bun.Archive.from(await response.blob());
const archiveFromFetch = new Bun.Archive(await response.blob());
```
### Extracting to Disk
@@ -131,7 +133,7 @@ Use `.extract()` to write all files to a directory:
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./extracted");
console.log(`Extracted ${count} entries`);
```
@@ -148,7 +150,7 @@ Use glob patterns to extract only specific files. Patterns are matched against a
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
// Extract only TypeScript files
const tsCount = await archive.extract("./extracted", { glob: "**/*.ts" });
@@ -181,7 +183,7 @@ Use `.files()` to get archive contents as a `Map` of `File` objects without extr
```ts
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const files = await archive.files();
for (const [path, file] of files) {
@@ -206,7 +208,7 @@ Archive operations can fail due to corrupted data, I/O errors, or invalid paths.
```ts
try {
const tarball = await Bun.file("package.tar.gz").bytes();
const archive = Bun.Archive.from(tarball);
const archive = new Bun.Archive(tarball);
const count = await archive.extract("./output");
console.log(`Extracted ${count} entries`);
} catch (e: unknown) {
@@ -227,7 +229,7 @@ try {
Common error scenarios:
- **Corrupted/truncated archives** - `Archive.from()` loads the archive data; errors may be deferred until read/extract operations
- **Corrupted/truncated archives** - `new Archive()` loads the archive data; errors may be deferred until read/extract operations
- **Permission denied** - `extract()` throws if the target directory is not writable
- **Disk full** - `extract()` throws if there's insufficient space
- **Invalid paths** - Operations throw for malformed file paths
@@ -239,7 +241,7 @@ The count returned by `extract()` includes all successfully written entries (fil
For additional security with untrusted archives, you can enumerate and validate paths before extraction:
```ts
const archive = Bun.Archive.from(untrustedData);
const archive = new Bun.Archive(untrustedData);
const files = await archive.files();
// Optional: Custom validation for additional checks
@@ -298,26 +300,28 @@ See [Bun.Glob](/docs/api/glob) for the full glob syntax including escaping and a
## Compression
Bun.Archive supports gzip compression for both reading and writing:
Bun.Archive creates uncompressed tar archives by default. Use `{ compress: "gzip" }` to enable gzip compression:
```ts
// Default: uncompressed tar
const archive = new Bun.Archive({ "hello.txt": "Hello, World!" });
// Reading: automatically detects gzip
const gzippedTarball = await Bun.file("archive.tar.gz").bytes();
const archive = Bun.Archive.from(gzippedTarball);
const readArchive = new Bun.Archive(gzippedTarball);
// Writing: specify compression
const files = { "hello.txt": "Hello, World!" };
await Bun.Archive.write("output.tar.gz", files, "gzip");
// Enable gzip compression
const compressed = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip" });
// Getting bytes: specify compression
const gzippedBytes = await archive.bytes("gzip");
// Gzip with custom level (1-12)
const maxCompression = new Bun.Archive({ "hello.txt": "Hello, World!" }, { compress: "gzip", level: 12 });
```
The compression argument accepts:
The options accept:
- `"gzip"` - Enable gzip compression
- `true` - Same as `"gzip"`
- `false` or `undefined` - No compression
- No options or `undefined` - Uncompressed tar (default)
- `{ compress: "gzip" }` - Enable gzip compression at level 6
- `{ compress: "gzip", level: number }` - Gzip with custom level 1-12 (1 = fastest, 12 = smallest)
## Examples
@@ -339,15 +343,16 @@ for await (const path of glob.scan(".")) {
// Add package.json
files["package.json"] = await Bun.file("package.json").text();
// Create compressed archive
await Bun.Archive.write("bundle.tar.gz", files, "gzip");
// Create compressed archive and write to disk
const archive = new Bun.Archive(files, { compress: "gzip" });
await Bun.write("bundle.tar.gz", archive);
```
### Extract and Process npm Package
```ts
const response = await fetch("https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz");
const archive = Bun.Archive.from(await response.blob());
const archive = new Bun.Archive(await response.blob());
// Get package.json
const files = await archive.files("package/package.json");
@@ -365,7 +370,7 @@ if (packageJson) {
import { readdir } from "node:fs/promises";
import { join } from "node:path";
async function archiveDirectory(dir: string): Promise<Bun.Archive> {
async function archiveDirectory(dir: string, compress = false): Promise<Bun.Archive> {
const files: Record<string, Blob> = {};
async function walk(currentDir: string, prefix: string = "") {
@@ -384,11 +389,11 @@ async function archiveDirectory(dir: string): Promise<Bun.Archive> {
}
await walk(dir);
return Bun.Archive.from(files);
return new Bun.Archive(files, compress ? { compress: "gzip" } : undefined);
}
const archive = await archiveDirectory("./my-project");
await Bun.Archive.write("my-project.tar.gz", archive, "gzip");
const archive = await archiveDirectory("./my-project", true);
await Bun.write("my-project.tar.gz", archive);
```
## Reference
@@ -396,14 +401,19 @@ await Bun.Archive.write("my-project.tar.gz", archive, "gzip");
> **Note**: The following type signatures are simplified for documentation purposes. See [`packages/bun-types/bun.d.ts`](https://github.com/oven-sh/bun/blob/main/packages/bun-types/bun.d.ts) for the full type definitions.
```ts
type ArchiveCompression = "gzip" | boolean;
type ArchiveInput =
| Record<string, string | Blob | Bun.ArrayBufferView | ArrayBufferLike>
| Blob
| Bun.ArrayBufferView
| ArrayBufferLike;
type ArchiveOptions = {
/** Compression algorithm. Currently only "gzip" is supported. */
compress?: "gzip";
/** Compression level 1-12 (default 6 when gzip is enabled). */
level?: number;
};
interface ArchiveExtractOptions {
/** Glob pattern(s) to filter extraction. Supports negative patterns with "!" prefix. */
glob?: string | readonly string[];
@@ -412,13 +422,11 @@ interface ArchiveExtractOptions {
class Archive {
/**
* Create an Archive from input data
* @param data - Files to archive (as object) or existing archive data (as bytes/blob)
* @param options - Compression options. Uncompressed by default.
* Pass { compress: "gzip" } to enable compression.
*/
static from(data: ArchiveInput): Archive;
/**
* Write an archive directly to disk
*/
static write(path: string, data: ArchiveInput | Archive, compress?: ArchiveCompression): Promise<void>;
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Extract archive to a directory
@@ -427,14 +435,14 @@ class Archive {
extract(path: string, options?: ArchiveExtractOptions): Promise<number>;
/**
* Get archive as a Blob
* Get archive as a Blob (uses compression setting from constructor)
*/
blob(compress?: ArchiveCompression): Promise<Blob>;
blob(): Promise<Blob>;
/**
* Get archive as a Uint8Array
* Get archive as a Uint8Array (uses compression setting from constructor)
*/
bytes(compress?: ArchiveCompression): Promise<Uint8Array<ArrayBuffer>>;
bytes(): Promise<Uint8Array<ArrayBuffer>>;
/**
* Get archive contents as File objects (regular files only, no directories)

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.3.6",
"version": "1.3.7",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"

View File

@@ -750,7 +750,7 @@ declare module "bun" {
*/
function write(
destination: BunFile | S3File | PathLike,
input: Blob | NodeJS.TypedArray | ArrayBufferLike | string | BlobPart[],
input: Blob | NodeJS.TypedArray | ArrayBufferLike | string | BlobPart[] | Archive,
options?: {
/**
* If writing to a PathLike, set the permissions of the file.
@@ -6975,15 +6975,44 @@ declare module "bun" {
/**
* Compression format for archive output.
* - `"gzip"` - Compress with gzip
* - `true` - Same as `"gzip"`
* - `false` - Explicitly disable compression (no compression)
* - `undefined` - No compression (default behavior when omitted)
*
* Both `false` and `undefined` result in no compression; `false` can be used
* to explicitly indicate "no compression" in code where the intent should be clear.
* Currently only `"gzip"` is supported.
*/
type ArchiveCompression = "gzip" | boolean;
type ArchiveCompression = "gzip";
/**
* Options for creating an Archive instance.
*
* By default, archives are not compressed. Use `{ compress: "gzip" }` to enable compression.
*
* @example
* ```ts
* // No compression (default)
* new Bun.Archive(data);
*
* // Enable gzip with default level (6)
* new Bun.Archive(data, { compress: "gzip" });
*
* // Specify compression level
* new Bun.Archive(data, { compress: "gzip", level: 9 });
* ```
*/
interface ArchiveOptions {
/**
* Compression algorithm to use.
* Currently only "gzip" is supported.
* If not specified, no compression is applied.
*/
compress?: ArchiveCompression;
/**
* Compression level (1-12). Only applies when `compress` is set.
* - 1: Fastest compression, lowest ratio
* - 6: Default balance of speed and ratio
* - 12: Best compression ratio, slowest
*
* @default 6
*/
level?: number;
}
/**
* Options for extracting archive contents.
@@ -7031,7 +7060,7 @@ declare module "bun" {
* @example
* **Create an archive from an object:**
* ```ts
* const archive = Bun.Archive.from({
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* "data.json": JSON.stringify({ foo: "bar" }),
* "binary.bin": new Uint8Array([1, 2, 3, 4]),
@@ -7039,9 +7068,20 @@ declare module "bun" {
* ```
*
* @example
* **Create a gzipped archive:**
* ```ts
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* }, { compress: "gzip" });
*
* // Or with a specific compression level (1-12)
* const archive = new Bun.Archive(data, { compress: "gzip", level: 9 });
* ```
*
* @example
* **Extract an archive to disk:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const entryCount = await archive.extract("./output");
* console.log(`Extracted ${entryCount} entries`);
* ```
@@ -7049,7 +7089,7 @@ declare module "bun" {
* @example
* **Get archive contents as a Map of File objects:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const entries = await archive.files();
* for (const [path, file] of entries) {
* console.log(path, await file.text());
@@ -7062,36 +7102,50 @@ declare module "bun" {
* await Bun.Archive.write("bundle.tar.gz", {
* "src/index.ts": sourceCode,
* "package.json": packageJson,
* }, "gzip");
* }, { compress: "gzip" });
* ```
*/
export class Archive {
/**
* Create an `Archive` instance from input data.
*
* By default, archives are not compressed. Use `{ compress: "gzip" }` to enable compression.
*
* @param data - The input data for the archive:
* - **Object**: Creates a new tarball with the object's keys as file paths and values as file contents
* - **Blob/TypedArray/ArrayBuffer**: Wraps existing archive data (tar or tar.gz)
*
* @returns A new `Archive` instance
* @param options - Optional archive options including compression settings.
* Defaults to no compression if omitted.
*
* @example
* **From an object (creates new tarball):**
* **From an object (creates uncompressed tarball):**
* ```ts
* const archive = Bun.Archive.from({
* const archive = new Bun.Archive({
* "hello.txt": "Hello, World!",
* "nested/file.txt": "Nested content",
* });
* ```
*
* @example
* **With gzip compression:**
* ```ts
* const archive = new Bun.Archive(data, { compress: "gzip" });
* ```
*
* @example
* **With explicit gzip compression level:**
* ```ts
* const archive = new Bun.Archive(data, { compress: "gzip", level: 12 });
* ```
*
* @example
* **From existing archive data:**
* ```ts
* const response = await fetch("https://example.com/package.tar.gz");
* const archive = Bun.Archive.from(await response.blob());
* const archive = new Bun.Archive(await response.blob());
* ```
*/
static from(data: ArchiveInput): Archive;
constructor(data: ArchiveInput, options?: ArchiveOptions);
/**
* Create and write an archive directly to disk in one operation.
@@ -7100,8 +7154,8 @@ declare module "bun" {
* as it streams the data directly to disk.
*
* @param path - The file path to write the archive to
* @param data - The input data for the archive (same as `Archive.from()`)
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* @param data - The input data for the archive (same as `new Archive()`)
* @param options - Optional archive options including compression settings
*
* @returns A promise that resolves when the write is complete
*
@@ -7117,10 +7171,10 @@ declare module "bun" {
* @example
* **Write gzipped tarball:**
* ```ts
* await Bun.Archive.write("output.tar.gz", files, "gzip");
* await Bun.Archive.write("output.tar.gz", files, { compress: "gzip" });
* ```
*/
static write(path: string, data: ArchiveInput | Archive, compress?: ArchiveCompression): Promise<void>;
static write(path: string, data: ArchiveInput | Archive, options?: ArchiveOptions): Promise<void>;
/**
* Extract the archive contents to a directory on disk.
@@ -7136,7 +7190,7 @@ declare module "bun" {
* @example
* **Extract all entries:**
* ```ts
* const archive = Bun.Archive.from(tarballBytes);
* const archive = new Bun.Archive(tarballBytes);
* const count = await archive.extract("./extracted");
* console.log(`Extracted ${count} entries`);
* ```
@@ -7166,42 +7220,48 @@ declare module "bun" {
/**
* Get the archive contents as a `Blob`.
*
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* Uses the compression settings specified when the Archive was created.
*
* @returns A promise that resolves with the archive data as a Blob
*
* @example
* **Get uncompressed tarball:**
* **Get tarball as Blob:**
* ```ts
* const archive = new Bun.Archive(data);
* const blob = await archive.blob();
* ```
*
* @example
* **Get gzipped tarball:**
* **Get gzipped tarball as Blob:**
* ```ts
* const gzippedBlob = await archive.blob("gzip");
* const archive = new Bun.Archive(data, { compress: "gzip" });
* const gzippedBlob = await archive.blob();
* ```
*/
blob(compress?: ArchiveCompression): Promise<Blob>;
blob(): Promise<Blob>;
/**
* Get the archive contents as a `Uint8Array`.
*
* @param compress - Optional compression: `"gzip"`, `true` for gzip, or `false`/`undefined` for none
* Uses the compression settings specified when the Archive was created.
*
* @returns A promise that resolves with the archive data as a Uint8Array
*
* @example
* **Get uncompressed tarball bytes:**
* **Get tarball bytes:**
* ```ts
* const archive = new Bun.Archive(data);
* const bytes = await archive.bytes();
* ```
*
* @example
* **Get gzipped tarball bytes:**
* ```ts
* const gzippedBytes = await archive.bytes("gzip");
* const archive = new Bun.Archive(data, { compress: "gzip" });
* const gzippedBytes = await archive.bytes();
* ```
*/
bytes(compress?: ArchiveCompression): Promise<Uint8Array<ArrayBuffer>>;
bytes(): Promise<Uint8Array<ArrayBuffer>>;
/**
* Get the archive contents as a `Map` of `File` objects.

View File

@@ -609,7 +609,17 @@ declare module "bun" {
* });
*/
write(
data: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer | Request | Response | BunFile | S3File | Blob,
data:
| string
| ArrayBufferView
| ArrayBuffer
| SharedArrayBuffer
| Request
| Response
| BunFile
| S3File
| Blob
| Archive,
options?: S3Options,
): Promise<number>;
@@ -920,7 +930,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
| File,
| File
| Archive,
options?: S3Options,
): Promise<number>;
@@ -970,7 +981,8 @@ declare module "bun" {
| BunFile
| S3File
| Blob
| File,
| File
| Archive,
options?: S3Options,
): Promise<number>;

View File

@@ -573,6 +573,95 @@ describe("BunTestController - Test Discovery and Management", () => {
});
});
describe("matchesGlobPattern", () => {
test("should match default test file patterns", () => {
const defaultPattern = "**/*{.test.,.spec.,_test_,_spec_}{js,ts,tsx,jsx,mts,cts,cjs,mjs}";
// Should match .test. files
expect(internal.matchesGlobPattern("/path/to/file.test.ts", defaultPattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/to/file.test.js", defaultPattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/to/file.test.tsx", defaultPattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/to/file.test.jsx", defaultPattern)).toBe(true);
// Should match .spec. files
expect(internal.matchesGlobPattern("/path/to/file.spec.ts", defaultPattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/to/component.spec.js", defaultPattern)).toBe(true);
// Should match _test_ files (pattern is _test_{extension} with no dot before extension)
expect(internal.matchesGlobPattern("/path/to/file_test_ts", defaultPattern)).toBe(true);
// Should match _spec_ files (pattern is _spec_{extension} with no dot before extension)
expect(internal.matchesGlobPattern("/path/to/file_spec_js", defaultPattern)).toBe(true);
});
test("should not match non-test files", () => {
const defaultPattern = "**/*{.test.,.spec.,_test_,_spec_}{js,ts,tsx,jsx,mts,cts,cjs,mjs}";
// Regular source files should not match
expect(internal.matchesGlobPattern("/path/to/component.ts", defaultPattern)).toBe(false);
expect(internal.matchesGlobPattern("/path/to/index.js", defaultPattern)).toBe(false);
expect(internal.matchesGlobPattern("/path/to/utils.tsx", defaultPattern)).toBe(false);
// Files without proper extension should not match
expect(internal.matchesGlobPattern("/path/to/test.txt", defaultPattern)).toBe(false);
expect(internal.matchesGlobPattern("/path/to/test", defaultPattern)).toBe(false);
});
test("should match custom bun test patterns (fixes #26067)", () => {
// Custom pattern for Bun-specific tests to avoid conflicts with Vitest
const bunPattern = "**/*.bun.test.{js,ts,tsx,jsx}";
// Should match Bun-specific test files
expect(internal.matchesGlobPattern("/backend/user.bun.test.ts", bunPattern)).toBe(true);
expect(internal.matchesGlobPattern("/src/api.bun.test.js", bunPattern)).toBe(true);
expect(internal.matchesGlobPattern("/components/Button.bun.test.tsx", bunPattern)).toBe(true);
// Should NOT match regular test files (Vitest files)
expect(internal.matchesGlobPattern("/frontend/component.test.ts", bunPattern)).toBe(false);
expect(internal.matchesGlobPattern("/src/utils.spec.js", bunPattern)).toBe(false);
});
test("should handle patterns with single wildcard", () => {
const pattern = "*.test.ts";
expect(internal.matchesGlobPattern("file.test.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/to/file.test.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("file.spec.ts", pattern)).toBe(false);
});
test("should handle patterns with double wildcard for nested paths", () => {
const pattern = "**/tests/**/*.test.ts";
expect(internal.matchesGlobPattern("/project/tests/unit/file.test.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/project/tests/integration/deep/file.test.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/project/src/file.test.ts", pattern)).toBe(false);
});
test("should handle patterns with question mark wildcard", () => {
const pattern = "**/*.test?.ts";
expect(internal.matchesGlobPattern("/path/file.test1.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/file.testA.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/file.test.ts", pattern)).toBe(false);
});
test("should handle case-insensitive matching", () => {
const pattern = "**/*.TEST.ts";
// Pattern matching should be case-insensitive
expect(internal.matchesGlobPattern("/path/file.test.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/file.TEST.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("/path/file.Test.ts", pattern)).toBe(true);
});
test("should handle Windows-style paths", () => {
const pattern = "**/*.test.ts";
expect(internal.matchesGlobPattern("C:\\project\\src\\file.test.ts", pattern)).toBe(true);
expect(internal.matchesGlobPattern("C:\\Users\\dev\\tests\\unit.test.ts", pattern)).toBe(true);
});
});
describe("getBunExecutionConfig", () => {
test("should return bun execution configuration", () => {
const config = internal.getBunExecutionConfig();
@@ -1004,6 +1093,7 @@ describe("BunTestController - Integration and Coverage", () => {
expect(internal).toHaveProperty("isTestFile");
expect(internal).toHaveProperty("customFilePattern");
expect(internal).toHaveProperty("matchesGlobPattern");
expect(internal).toHaveProperty("getBunExecutionConfig");
expect(internal).toHaveProperty("findTestByPath");
@@ -1026,6 +1116,7 @@ describe("BunTestController - Integration and Coverage", () => {
expect(typeof internal.shouldUseTestNamePattern).toBe("function");
expect(typeof internal.isTestFile).toBe("function");
expect(typeof internal.customFilePattern).toBe("function");
expect(typeof internal.matchesGlobPattern).toBe("function");
expect(typeof internal.getBunExecutionConfig).toBe("function");
expect(typeof internal.findTestByPath).toBe("function");
expect(typeof internal.findTestByName).toBe("function");
@@ -1039,7 +1130,7 @@ describe("BunTestController - Integration and Coverage", () => {
const functionCount = methodNames.filter(name => typeof internal[name] === "function").length;
expect(functionCount).toBe(methodNames.length);
expect(methodNames.length).toBeGreaterThanOrEqual(16);
expect(methodNames.length).toBeGreaterThanOrEqual(17);
});
});
@@ -1057,6 +1148,7 @@ describe("BunTestController - Integration and Coverage", () => {
expect(typeof internal.shouldUseTestNamePattern).toBe("function");
expect(typeof internal.isTestFile).toBe("function");
expect(typeof internal.customFilePattern).toBe("function");
expect(typeof internal.matchesGlobPattern).toBe("function");
expect(typeof internal.getBunExecutionConfig).toBe("function");
expect(typeof internal.findTestByPath).toBe("function");
expect(typeof internal.findTestByName).toBe("function");
@@ -1068,7 +1160,7 @@ describe("BunTestController - Integration and Coverage", () => {
expect(controller._internal).toBeDefined();
const internalMethods = Object.keys(internal);
expect(internalMethods.length).toBeGreaterThanOrEqual(16);
expect(internalMethods.length).toBeGreaterThanOrEqual(17);
});
test("should handle controller disposal", () => {

View File

@@ -136,9 +136,42 @@ export class BunTestController implements vscode.Disposable {
}
private isTestFile(document: vscode.TextDocument): boolean {
return (
document?.uri?.scheme === "file" && /\.(test|spec)\.(js|jsx|ts|tsx|cjs|mjs|mts|cts)$/.test(document.uri.fsPath)
);
if (document?.uri?.scheme !== "file") {
return false;
}
const pattern = this.customFilePattern();
return this.matchesGlobPattern(document.uri.fsPath, pattern);
}
private matchesGlobPattern(filePath: string, globPattern: string): boolean {
// Basic sanity check to prevent pathological patterns
if (globPattern.length > 500) {
debug.appendLine(`Warning: glob pattern too long (${globPattern.length} chars), skipping match`);
return false;
}
// Normalize file path: convert Windows backslashes to forward slashes
const normalizedPath = filePath.replace(/\\/g, "/");
// Convert glob pattern to regex
// Handle common glob patterns like **/*.test.{js,ts}
let regexPattern = globPattern
// Escape special regex characters (except those used in glob patterns)
.replace(/[.+^$|\\]/g, "\\$&")
// Convert ** to match any path
.replace(/\*\*/g, "<<<GLOBSTAR>>>")
// Convert * to match any filename characters (not path separators)
.replace(/\*/g, "[^/]*")
// Restore ** as .* to match anything including path separators
.replace(/<<<GLOBSTAR>>>/g, ".*")
// Convert ? to match single character
.replace(/\?/g, ".")
// Convert {a,b,c} to (a|b|c)
.replace(/\{([^}]+)\}/g, (_, group) => `(${group.split(",").join("|")})`);
// The pattern should match the end of the file path
const regex = new RegExp(`(^|/)${regexPattern}$`, "i");
return regex.test(normalizedPath);
}
private async discoverInitialTests(
@@ -1480,6 +1513,7 @@ export class BunTestController implements vscode.Disposable {
isTestFile: this.isTestFile.bind(this),
customFilePattern: this.customFilePattern.bind(this),
matchesGlobPattern: this.matchesGlobPattern.bind(this),
getBunExecutionConfig: this.getBunExecutionConfig.bind(this),
findTestByPath: this.findTestByPath.bind(this),

View File

@@ -8,10 +8,6 @@ export default [
configurable: false,
JSType: "0b11101110",
klass: {
from: {
fn: "from",
length: 1,
},
write: {
fn: "write",
length: 2,

View File

@@ -5,8 +5,19 @@ pub const toJS = js.toJS;
pub const fromJS = js.fromJS;
pub const fromJSDirect = js.fromJSDirect;
/// Compression options for the archive
pub const Compression = union(enum) {
none,
gzip: struct {
/// Compression level: 1 (fastest) to 12 (maximum compression). Default is 6.
level: u8 = 6,
},
};
/// The underlying data for the archive - uses Blob.Store for thread-safe ref counting
store: *jsc.WebCore.Blob.Store,
/// Compression settings for this archive
compress: Compression = .none,
pub fn finalize(this: *Archive) void {
jsc.markBinding(@src());
@@ -65,47 +76,95 @@ fn countFilesInArchive(data: []const u8) u32 {
return count;
}
/// Constructor: new Archive() - throws an error since users should use Archive.from()
pub fn constructor(globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!*Archive {
return globalThis.throwInvalidArguments("Archive cannot be constructed directly. Use Archive.from() instead.", .{});
}
/// Static method: Archive.from(data)
/// Constructor: new Archive(data, options?)
/// Creates an Archive from either:
/// - An object { [path: string]: Blob | string | ArrayBufferView | ArrayBufferLike }
/// - A Blob, ArrayBufferView, or ArrayBufferLike (assumes it's already a valid archive)
pub fn from(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const arg = callframe.argumentsAsArray(1)[0];
if (arg == .zero) {
return globalThis.throwInvalidArguments("Archive.from requires an argument", .{});
/// Options:
/// - compress: "gzip" - Enable gzip compression
/// - level: number (1-12) - Compression level (default 6)
/// When no options are provided, no compression is applied
pub fn constructor(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!*Archive {
const data_arg, const options_arg = callframe.argumentsAsArray(2);
if (data_arg == .zero) {
return globalThis.throwInvalidArguments("new Archive() requires an argument", .{});
}
// Parse compression options
const compress = try parseCompressionOptions(globalThis, options_arg);
// For Blob/Archive, ref the existing store (zero-copy)
if (arg.as(jsc.WebCore.Blob)) |blob_ptr| {
if (data_arg.as(jsc.WebCore.Blob)) |blob_ptr| {
if (blob_ptr.store) |store| {
store.ref();
return bun.new(Archive, .{ .store = store }).toJS(globalThis);
return bun.new(Archive, .{ .store = store, .compress = compress });
}
}
// For ArrayBuffer/TypedArray, copy the data
if (arg.asArrayBuffer(globalThis)) |array_buffer| {
if (data_arg.asArrayBuffer(globalThis)) |array_buffer| {
const data = try bun.default_allocator.dupe(u8, array_buffer.slice());
return createArchive(globalThis, data);
return createArchive(data, compress);
}
// For plain objects, build a tarball
if (arg.isObject()) {
const data = try buildTarballFromObject(globalThis, arg);
return createArchive(globalThis, data);
if (data_arg.isObject()) {
const data = try buildTarballFromObject(globalThis, data_arg);
return createArchive(data, compress);
}
return globalThis.throwInvalidArguments("Expected an object, Blob, TypedArray, or ArrayBuffer", .{});
}
fn createArchive(globalThis: *jsc.JSGlobalObject, data: []u8) jsc.JSValue {
/// Parse compression options from JS value
/// Returns .none if no compression specified, caller must handle defaults
fn parseCompressionOptions(globalThis: *jsc.JSGlobalObject, options_arg: jsc.JSValue) bun.JSError!Compression {
// No options provided means no compression (caller handles defaults)
if (options_arg.isUndefinedOrNull()) {
return .none;
}
if (!options_arg.isObject()) {
return globalThis.throwInvalidArguments("Archive: options must be an object", .{});
}
// Check for compress option
if (try options_arg.getTruthy(globalThis, "compress")) |compress_val| {
// compress must be "gzip"
if (!compress_val.isString()) {
return globalThis.throwInvalidArguments("Archive: compress option must be a string", .{});
}
const compress_str = try compress_val.toSlice(globalThis, bun.default_allocator);
defer compress_str.deinit();
if (!bun.strings.eqlComptime(compress_str.slice(), "gzip")) {
return globalThis.throwInvalidArguments("Archive: compress option must be \"gzip\"", .{});
}
// Parse level option (1-12, default 6)
var level: u8 = 6;
if (try options_arg.getTruthy(globalThis, "level")) |level_val| {
if (!level_val.isNumber()) {
return globalThis.throwInvalidArguments("Archive: level must be a number", .{});
}
const level_num = level_val.toInt64();
if (level_num < 1 or level_num > 12) {
return globalThis.throwInvalidArguments("Archive: level must be between 1 and 12", .{});
}
level = @intCast(level_num);
}
return .{ .gzip = .{ .level = level } };
}
// No compress option specified in options object means no compression
return .none;
}
fn createArchive(data: []u8, compress: Compression) *Archive {
const store = jsc.WebCore.Blob.Store.init(data, bun.default_allocator);
return bun.new(Archive, .{ .store = store }).toJS(globalThis);
return bun.new(Archive, .{ .store = store, .compress = compress });
}
/// Shared helper that builds tarball bytes from a JS object
@@ -212,12 +271,15 @@ fn getEntryData(globalThis: *jsc.JSGlobalObject, value: jsc.JSValue, allocator:
return value.toSlice(globalThis, allocator);
}
/// Static method: Archive.write(path, data, compress?)
/// Creates and writes an archive to disk in one operation
/// Static method: Archive.write(path, data, options?)
/// Creates and writes an archive to disk in one operation.
/// For Archive instances, uses the archive's compression settings unless overridden by options.
/// Options:
/// - gzip: { level?: number } - Override compression settings
pub fn write(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const path_arg, const data_arg, const compress_arg = callframe.argumentsAsArray(3);
const path_arg, const data_arg, const options_arg = callframe.argumentsAsArray(3);
if (data_arg == .zero) {
return globalThis.throwInvalidArguments("Archive.write requires at least 2 arguments (path, data)", .{});
return globalThis.throwInvalidArguments("Archive.write requires 2 arguments (path, data)", .{});
}
// Get the path
@@ -228,61 +290,37 @@ pub fn write(globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSE
const path_slice = try path_arg.toSlice(globalThis, bun.default_allocator);
defer path_slice.deinit();
// Determine compression
const use_gzip = try parseCompressArg(globalThis, compress_arg);
// Parse options for compression override
const options_compress = try parseCompressionOptions(globalThis, options_arg);
// Try to use store reference (zero-copy) for Archive/Blob
// For Archive instances, use options override or archive's compression settings
if (fromJS(data_arg)) |archive| {
return startWriteTask(globalThis, .{ .store = archive.store }, path_slice.slice(), use_gzip);
const compress = if (options_compress != .none) options_compress else archive.compress;
return startWriteTask(globalThis, .{ .store = archive.store }, path_slice.slice(), compress);
}
// For Blobs, use store reference with options compression
if (data_arg.as(jsc.WebCore.Blob)) |blob_ptr| {
if (blob_ptr.store) |store| {
return startWriteTask(globalThis, .{ .store = store }, path_slice.slice(), use_gzip);
return startWriteTask(globalThis, .{ .store = store }, path_slice.slice(), options_compress);
}
}
// Fall back to copying data for ArrayBuffer/TypedArray/objects
const archive_data = try getArchiveData(globalThis, data_arg);
return startWriteTask(globalThis, .{ .owned = archive_data }, path_slice.slice(), use_gzip);
}
/// Get archive data from a value, returning owned bytes
fn getArchiveData(globalThis: *jsc.JSGlobalObject, arg: jsc.JSValue) bun.JSError![]u8 {
// Check if it's a typed array, ArrayBuffer, or similar
if (arg.asArrayBuffer(globalThis)) |array_buffer| {
return bun.default_allocator.dupe(u8, array_buffer.slice());
// For ArrayBuffer/TypedArray, copy the data with options compression
if (data_arg.asArrayBuffer(globalThis)) |array_buffer| {
const data = try bun.default_allocator.dupe(u8, array_buffer.slice());
return startWriteTask(globalThis, .{ .owned = data }, path_slice.slice(), options_compress);
}
// Check if it's an object with entries (plain object) - build tarball
if (arg.isObject()) {
return buildTarballFromObject(globalThis, arg);
// For plain objects, build a tarball with options compression
if (data_arg.isObject()) {
const data = try buildTarballFromObject(globalThis, data_arg);
return startWriteTask(globalThis, .{ .owned = data }, path_slice.slice(), options_compress);
}
return globalThis.throwInvalidArguments("Expected an object, Blob, TypedArray, ArrayBuffer, or Archive", .{});
}
fn parseCompressArg(globalThis: *jsc.JSGlobalObject, arg: jsc.JSValue) bun.JSError!bool {
if (arg.isUndefinedOrNull()) {
return false;
}
if (arg.isBoolean()) {
return arg.toBoolean();
}
if (arg.isString()) {
const str = try arg.toSlice(globalThis, bun.default_allocator);
defer str.deinit();
if (std.mem.eql(u8, str.slice(), "gzip")) {
return true;
}
return globalThis.throwInvalidArguments("Archive: compress argument must be 'gzip', a boolean, or undefined", .{});
}
return globalThis.throwInvalidArguments("Archive: compress argument must be 'gzip', a boolean, or undefined", .{});
}
/// Instance method: archive.extract(path, options?)
/// Extracts the archive to the given path
/// Options:
@@ -379,20 +417,16 @@ fn freePatterns(patterns: []const []const u8) void {
bun.default_allocator.free(patterns);
}
/// Instance method: archive.blob(compress?)
/// Returns Promise<Blob> with the archive data
pub fn blob(this: *Archive, globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const compress_arg = callframe.argumentsAsArray(1)[0];
const use_gzip = try parseCompressArg(globalThis, compress_arg);
return startBlobTask(globalThis, this.store, use_gzip, .blob);
/// Instance method: archive.blob()
/// Returns Promise<Blob> with the archive data (compressed if gzip was set in options)
pub fn blob(this: *Archive, globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!jsc.JSValue {
return startBlobTask(globalThis, this.store, this.compress, .blob);
}
/// Instance method: archive.bytes(compress?)
/// Returns Promise<Uint8Array> with the archive data
pub fn bytes(this: *Archive, globalThis: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JSError!jsc.JSValue {
const compress_arg = callframe.argumentsAsArray(1)[0];
const use_gzip = try parseCompressArg(globalThis, compress_arg);
return startBlobTask(globalThis, this.store, use_gzip, .bytes);
/// Instance method: archive.bytes()
/// Returns Promise<Uint8Array> with the archive data (compressed if gzip was set in options)
pub fn bytes(this: *Archive, globalThis: *jsc.JSGlobalObject, _: *jsc.CallFrame) bun.JSError!jsc.JSValue {
return startBlobTask(globalThis, this.store, this.compress, .bytes);
}
/// Instance method: archive.files(glob?)
@@ -578,15 +612,17 @@ const BlobContext = struct {
};
store: *jsc.WebCore.Blob.Store,
use_gzip: bool,
compress: Compression,
output_type: OutputType,
result: Result = .{ .uncompressed = {} },
fn run(this: *BlobContext) Result {
if (this.use_gzip) {
return .{ .compressed = compressGzip(this.store.sharedView()) catch |e| return .{ .err = e } };
switch (this.compress) {
.gzip => |opts| {
return .{ .compressed = compressGzip(this.store.sharedView(), opts.level) catch |e| return .{ .err = e } };
},
.none => return .{ .uncompressed = {} },
}
return .{ .uncompressed = {} };
}
fn runFromJS(this: *BlobContext, globalThis: *jsc.JSGlobalObject) bun.JSError!PromiseResult {
@@ -617,13 +653,13 @@ const BlobContext = struct {
pub const BlobTask = AsyncTask(BlobContext);
fn startBlobTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Store, use_gzip: bool, output_type: BlobContext.OutputType) bun.JSError!jsc.JSValue {
fn startBlobTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Store, compress: Compression, output_type: BlobContext.OutputType) bun.JSError!jsc.JSValue {
store.ref();
errdefer store.deref();
const task = try BlobTask.create(globalThis, .{
.store = store,
.use_gzip = use_gzip,
.compress = compress,
.output_type = output_type,
});
@@ -646,7 +682,7 @@ const WriteContext = struct {
data: Data,
path: [:0]const u8,
use_gzip: bool,
compress: Compression,
result: Result = .{ .success = {} },
fn run(this: *WriteContext) Result {
@@ -654,11 +690,11 @@ const WriteContext = struct {
.owned => |d| d,
.store => |s| s.sharedView(),
};
const data_to_write = if (this.use_gzip)
compressGzip(source_data) catch |e| return .{ .err = e }
else
source_data;
defer if (this.use_gzip) bun.default_allocator.free(data_to_write);
const data_to_write = switch (this.compress) {
.gzip => |opts| compressGzip(source_data, opts.level) catch |e| return .{ .err = e },
.none => source_data,
};
defer if (this.compress != .none) bun.default_allocator.free(data_to_write);
const file = switch (bun.sys.File.openat(.cwd(), this.path, bun.O.CREAT | bun.O.WRONLY | bun.O.TRUNC, 0o644)) {
.err => |err| return .{ .sys_err = err.clone(bun.default_allocator) },
@@ -699,7 +735,7 @@ fn startWriteTask(
globalThis: *jsc.JSGlobalObject,
data: WriteContext.Data,
path: []const u8,
use_gzip: bool,
compress: Compression,
) bun.JSError!jsc.JSValue {
const path_z = try bun.default_allocator.dupeZ(u8, path);
errdefer bun.default_allocator.free(path_z);
@@ -714,7 +750,7 @@ fn startWriteTask(
const task = try WriteTask.create(globalThis, .{
.data = data,
.path = path_z,
.use_gzip = use_gzip,
.compress = compress,
});
const promise_js = task.promise.value();
@@ -869,10 +905,10 @@ fn startFilesTask(globalThis: *jsc.JSGlobalObject, store: *jsc.WebCore.Blob.Stor
// Helpers
// ============================================================================
fn compressGzip(data: []const u8) ![]u8 {
fn compressGzip(data: []const u8, level: u8) ![]u8 {
libdeflate.load();
const compressor = libdeflate.Compressor.alloc(6) orelse return error.GzipInitFailed;
const compressor = libdeflate.Compressor.alloc(@intCast(level)) orelse return error.GzipInitFailed;
defer compressor.deinit();
const max_size = compressor.maxBytesNeeded(data, .gzip);

View File

@@ -118,6 +118,14 @@ pub fn set_repeat(_: *Self, thisValue: JSValue, globalThis: *JSGlobalObject, val
Self.js.repeatSetCached(thisValue, globalThis, value);
}
pub fn get_idleStart(_: *Self, thisValue: JSValue, _: *JSGlobalObject) JSValue {
return Self.js.idleStartGetCached(thisValue).?;
}
pub fn set_idleStart(_: *Self, thisValue: JSValue, globalThis: *JSGlobalObject, value: JSValue) void {
Self.js.idleStartSetCached(thisValue, globalThis, value);
}
pub fn dispose(self: *Self, globalThis: *JSGlobalObject, _: *jsc.CallFrame) bun.JSError!JSValue {
self.internals.cancel(globalThis.bunVM());
return .js_undefined;

View File

@@ -242,7 +242,7 @@ fn convertToInterval(this: *TimerObjectInternals, global: *JSGlobalObject, timer
this.strong_this.set(global, timer);
this.flags.kind = .setInterval;
this.interval = new_interval;
this.reschedule(timer, vm);
this.reschedule(timer, vm, global);
}
pub fn run(this: *TimerObjectInternals, globalThis: *jsc.JSGlobalObject, timer: JSValue, callback: JSValue, arguments: JSValue, async_id: u64, vm: *jsc.VirtualMachine) bool {
@@ -293,8 +293,8 @@ pub fn init(
TimeoutObject.js.idleTimeoutSetCached(timer, global, .jsNumber(interval));
TimeoutObject.js.repeatSetCached(timer, global, if (kind == .setInterval) .jsNumber(interval) else .null);
// this increments the refcount
this.reschedule(timer, vm);
// this increments the refcount and sets _idleStart
this.reschedule(timer, vm, global);
}
this.strong_this.set(global, timer);
@@ -328,7 +328,7 @@ pub fn doRefresh(this: *TimerObjectInternals, globalObject: *jsc.JSGlobalObject,
}
this.strong_this.set(globalObject, this_value);
this.reschedule(this_value, VirtualMachine.get());
this.reschedule(this_value, VirtualMachine.get(), globalObject);
return this_value;
}
@@ -371,7 +371,7 @@ fn shouldRescheduleTimer(this: *TimerObjectInternals, repeat: JSValue, idle_time
return true;
}
pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachine) void {
pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachine, globalThis: *JSGlobalObject) void {
if (this.flags.kind == .setImmediate) return;
const idle_timeout = TimeoutObject.js.idleTimeoutGetCached(timer).?;
@@ -380,7 +380,8 @@ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachi
// https://github.com/nodejs/node/blob/a7cbb904745591c9a9d047a364c2c188e5470047/lib/internal/timers.js#L612
if (!this.shouldRescheduleTimer(repeat, idle_timeout)) return;
const now = timespec.msFromNow(.allow_mocked_time, this.interval);
const now = timespec.now(.allow_mocked_time);
const scheduled_time = now.addMs(this.interval);
const was_active = this.eventLoopTimer().state == .ACTIVE;
if (was_active) {
vm.timer.remove(this.eventLoopTimer());
@@ -388,9 +389,13 @@ pub fn reschedule(this: *TimerObjectInternals, timer: JSValue, vm: *VirtualMachi
this.ref();
}
vm.timer.update(this.eventLoopTimer(), &now);
vm.timer.update(this.eventLoopTimer(), &scheduled_time);
this.flags.has_cleared_timer = false;
// Set _idleStart to the current monotonic timestamp in milliseconds
// This mimics Node.js's behavior where _idleStart is the libuv timestamp when the timer was scheduled
TimeoutObject.js.idleStartSetCached(timer, globalThis, .jsNumber(now.msUnsigned()));
if (this.flags.has_js_ref) {
this.setEnableKeepingEventLoopAlive(vm, true);
}
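
The comment above is the crux of the change: `_idleStart` is captured as a monotonic millisecond timestamp every time the timer is (re)scheduled. A minimal sketch of the resulting JavaScript-visible behavior, assuming only what this diff and the regression test further down establish (that `refresh()` re-runs `reschedule()` is an inference from `doRefresh` above, not a documented guarantee):

```ts
// _idleStart mirrors Node.js: the monotonic time (ms) at which the timer was
// scheduled, so two timers created back to back report nearly equal values.
const a = setTimeout(() => {}, 1_000);
const b = setTimeout(() => {}, 1_000);

console.log(typeof a._idleStart); // "number"
console.log(Math.abs(a._idleStart - b._idleStart) < 5); // true (same tick)

// refresh() goes through doRefresh -> reschedule, so _idleStart is expected
// to move forward when the timer is rescheduled later.
setTimeout(() => {
  const before = a._idleStart;
  a.refresh();
  console.log(a._idleStart >= before); // true
  clearTimeout(a);
  clearTimeout(b);
}, 20);
```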

@@ -1896,6 +1896,9 @@ pub fn NewRequestContext(comptime ssl_enabled: bool, comptime debug_mode: bool,
}
this.ref();
byte_stream.pipe = jsc.WebCore.Pipe.Wrap(@This(), onPipe).init(this);
// Deinit the old Strong reference before creating a new one
// to avoid leaking the Strong.Impl memory
this.response_body_readable_stream_ref.deinit();
this.response_body_readable_stream_ref = jsc.WebCore.ReadableStream.Strong.init(stream, globalThis);
this.byte_stream = byte_stream;

@@ -184,13 +184,18 @@ export default [
setter: "set_repeat",
this: true,
},
_idleStart: {
getter: "get_idleStart",
setter: "set_idleStart",
this: true,
},
["@@dispose"]: {
fn: "dispose",
length: 0,
invalidThisBehavior: InvalidThisBehavior.NoOp,
},
},
values: ["arguments", "callback", "idleTimeout", "repeat"],
values: ["arguments", "callback", "idleTimeout", "repeat", "idleStart"],
}),
define({
name: "Immediate",

@@ -1484,6 +1484,12 @@ pub fn writeFileInternal(globalThis: *jsc.JSGlobalObject, path_or_blob_: *PathOr
}
}
// Check for Archive - allows Bun.write() and S3 writes to accept Archive instances
if (data.as(Archive)) |archive| {
archive.store.ref();
break :brk Blob.initWithStore(archive.store, globalThis);
}
break :brk try Blob.get(
globalThis,
data,
@@ -4828,6 +4834,7 @@ const NewReadFileHandler = read_file.NewReadFileHandler;
const string = []const u8;
const Archive = @import("../api/Archive.zig");
const Environment = @import("../../env.zig");
const S3File = @import("./S3File.zig");
const std = @import("std");
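
The `data.as(Archive)` branch above is what lets an `Archive` instance flow through `Bun.write()` (and, via the same path, S3 writes). A minimal local-file sketch of that behavior, assuming the same `Bun.Archive` API the new S3 tests further down exercise:

```ts
// Write an Archive straight to disk through Bun.write(), then read it back.
const archive = new Bun.Archive({
  "hello.txt": "Hello from Archive!",
  "data.json": JSON.stringify({ test: true }),
});

await Bun.write("example.tar", archive);

// Round-trip: re-open the tarball and inspect its entries.
const files = await new Bun.Archive(await Bun.file("example.tar").bytes()).files();
console.log(files.size); // 2
console.log(await files.get("hello.txt")!.text()); // "Hello from Archive!"
```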

@@ -216,6 +216,7 @@ pub fn StyleRule(comptime R: type) type {
var handler_context = context.handler_context.child(.style_rule);
std.mem.swap(css.PropertyHandlerContext, &context.handler_context, &handler_context);
try this.rules.minify(context, unused);
std.mem.swap(css.PropertyHandlerContext, &context.handler_context, &handler_context);
if (unused and this.rules.v.items.len == 0) {
return true;
}
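
The one-line addition above restores the parent handler context after the nested rules are minified. A generic sketch (plain TypeScript, not Bun's actual CSS types) of why the second swap matters:

```ts
type HandlerContext = { pending: string[] };

function minifyNested(parent: HandlerContext, minifyChildren: (ctx: HandlerContext) => void) {
  const child: HandlerContext = { pending: [] };
  // Swap in an empty child context before recursing into the nested rules...
  [parent.pending, child.pending] = [child.pending, parent.pending];
  minifyChildren(parent); // nested rules only see the child context
  // ...and swap it back afterwards; without this, the parent's pending
  // declarations (e.g. logical-property fallbacks) would be dropped.
  [parent.pending, child.pending] = [child.pending, parent.pending];
}

// Usage sketch:
const ctx: HandlerContext = { pending: ["inset-inline-end: 20px"] };
minifyNested(ctx, () => {});
console.log(ctx.pending); // ["inset-inline-end: 20px"] — still intact
```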

@@ -1,4 +1,4 @@
pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.FieldType, column_length: u32, raw: bool, bigint: bool, unsigned: bool, comptime Context: type, reader: NewReader(Context)) !SQLDataCell {
pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.FieldType, column_length: u32, raw: bool, bigint: bool, unsigned: bool, binary: bool, comptime Context: type, reader: NewReader(Context)) !SQLDataCell {
debug("decodeBinaryValue: {s}", .{@tagName(field_type)});
return switch (field_type) {
.MYSQL_TYPE_TINY => {
@@ -131,6 +131,7 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
else => error.InvalidBinaryValue,
},
// When the column contains a binary string we return a Buffer otherwise a string
.MYSQL_TYPE_ENUM,
.MYSQL_TYPE_SET,
.MYSQL_TYPE_GEOMETRY,
@@ -138,7 +139,6 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
.MYSQL_TYPE_STRING,
.MYSQL_TYPE_VARCHAR,
.MYSQL_TYPE_VAR_STRING,
// We could return Buffer here BUT TEXT, LONGTEXT, MEDIUMTEXT, TINYTEXT, etc. are BLOB and the user expects a string
.MYSQL_TYPE_TINY_BLOB,
.MYSQL_TYPE_MEDIUM_BLOB,
.MYSQL_TYPE_LONG_BLOB,
@@ -151,7 +151,9 @@ pub fn decodeBinaryValue(globalObject: *jsc.JSGlobalObject, field_type: types.Fi
}
var string_data = try reader.encodeLenString();
defer string_data.deinit();
if (binary) {
return SQLDataCell.raw(&string_data);
}
const slice = string_data.slice();
return SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
},
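
The two comments above encode the decoding rule: columns flagged BINARY (BINARY, VARBINARY, and the BLOB family) now decode to `Buffer`, while the TEXT family keeps decoding to strings. A minimal sketch of the user-visible effect, assuming a MySQL connection through Bun's SQL client; the connection string, table, and column names are placeholders, not part of this diff:

```ts
import { SQL } from "bun";

// Placeholder connection; the new "Binary" test below uses the same client
// against a temporary table with binary(1) / varbinary(1) / blob columns.
await using sql = new SQL("mysql://user:pass@localhost:3306/db");

const [row] = await sql`SELECT bin_col, text_col FROM example_table LIMIT 1`;
console.log(Buffer.isBuffer(row.bin_col)); // true  — BINARY/VARBINARY/BLOB -> Buffer
console.log(typeof row.text_col);          // "string" — TEXT family stays a string
```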

@@ -140,8 +140,12 @@ pub const Row = struct {
}
},
else => {
const slice = value.slice();
cell.* = SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
if (column.flags.BINARY) {
cell.* = SQLDataCell.raw(value);
} else {
const slice = value.slice();
cell.* = SQLDataCell{ .tag = .string, .value = .{ .string = if (slice.len > 0) bun.String.cloneUTF8(slice).value.WTFStringImpl else null }, .free_value = 1 };
}
},
};
}
@@ -226,7 +230,7 @@ pub const Row = struct {
}
const column = this.columns[i];
value.* = try decodeBinaryValue(this.globalObject, column.column_type, column.column_length, this.raw, this.bigint, column.flags.UNSIGNED, Context, reader);
value.* = try decodeBinaryValue(this.globalObject, column.column_type, column.column_length, this.raw, this.bigint, column.flags.UNSIGNED, column.flags.BINARY, Context, reader);
value.index = switch (column.name_or_index) {
// The indexed columns can be out of order.
.index => |idx| idx,

File diff suppressed because it is too large

@@ -1509,3 +1509,128 @@ describe.concurrent("s3 missing credentials", () => {
});
});
});
// Archive + S3 integration tests
describe.skipIf(!minioCredentials)("Archive with S3", () => {
const credentials = minioCredentials!;
it("writes archive to S3 via S3Client.write()", async () => {
const client = new Bun.S3Client(credentials);
const archive = new Bun.Archive({
"hello.txt": "Hello from Archive!",
"data.json": JSON.stringify({ test: true }),
});
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify by downloading and reading back
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(2);
expect(await files.get("hello.txt")!.text()).toBe("Hello from Archive!");
expect(await files.get("data.json")!.text()).toBe(JSON.stringify({ test: true }));
// Cleanup
await client.unlink(key);
});
it("writes archive to S3 via Bun.write() with s3:// URL", async () => {
const archive = new Bun.Archive({
"file1.txt": "content1",
"dir/file2.txt": "content2",
});
const key = randomUUIDv7() + ".tar";
const s3Url = `s3://${credentials.bucket}/${key}`;
await Bun.write(s3Url, archive, {
...credentials,
});
// Verify by downloading
const s3File = Bun.file(s3Url, credentials);
const downloaded = await s3File.bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(2);
expect(await files.get("file1.txt")!.text()).toBe("content1");
expect(await files.get("dir/file2.txt")!.text()).toBe("content2");
// Cleanup
await s3File.delete();
});
it("writes archive with binary content to S3", async () => {
const client = new Bun.S3Client(credentials);
const binaryData = new Uint8Array([0x00, 0x01, 0x02, 0xff, 0xfe, 0xfd, 0x80, 0x7f]);
const archive = new Bun.Archive({
"binary.bin": binaryData,
});
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify binary data is preserved
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
const extractedBinary = await files.get("binary.bin")!.bytes();
expect(extractedBinary).toEqual(binaryData);
// Cleanup
await client.unlink(key);
});
it("writes large archive to S3", async () => {
const client = new Bun.S3Client(credentials);
// Create archive with multiple files
const entries: Record<string, string> = {};
for (let i = 0; i < 50; i++) {
entries[`file${i.toString().padStart(3, "0")}.txt`] = `Content for file ${i}`;
}
const archive = new Bun.Archive(entries);
const key = randomUUIDv7() + ".tar";
await client.write(key, archive);
// Verify
const downloaded = await client.file(key).bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(50);
expect(await files.get("file000.txt")!.text()).toBe("Content for file 0");
expect(await files.get("file049.txt")!.text()).toBe("Content for file 49");
// Cleanup
await client.unlink(key);
});
it("writes archive via s3File.write()", async () => {
const client = new Bun.S3Client(credentials);
const archive = new Bun.Archive({
"test.txt": "Hello via s3File.write()!",
});
const key = randomUUIDv7() + ".tar";
const s3File = client.file(key);
await s3File.write(archive);
// Verify
const downloaded = await s3File.bytes();
const readArchive = new Bun.Archive(downloaded);
const files = await readArchive.files();
expect(files.size).toBe(1);
expect(await files.get("test.txt")!.text()).toBe("Hello via s3File.write()!");
// Cleanup
await s3File.delete();
});
});

@@ -480,6 +480,25 @@ if (isDockerEnabled()) {
expect(b).toEqual({ b: 2 });
});
test("Binary", async () => {
const random_name = ("t_" + Bun.randomUUIDv7("hex").replaceAll("-", "")).toLowerCase();
await sql`CREATE TEMPORARY TABLE ${sql(random_name)} (a binary(1), b varbinary(1), c blob)`;
const values = [
{ a: Buffer.from([1]), b: Buffer.from([2]), c: Buffer.from([3]) },
];
await sql`INSERT INTO ${sql(random_name)} ${sql(values)}`;
const results = await sql`select * from ${sql(random_name)}`;
// return buffers
expect(results[0].a).toEqual(Buffer.from([1]));
expect(results[0].b).toEqual(Buffer.from([2]));
expect(results[0].c).toEqual(Buffer.from([3]));
// text protocol should behave the same
const results2 = await sql`select * from ${sql(random_name)}`.simple();
expect(results2[0].a).toEqual(Buffer.from([1]));
expect(results2[0].b).toEqual(Buffer.from([2]));
expect(results2[0].c).toEqual(Buffer.from([3]));
})
test("bulk insert nested sql()", async () => {
await using sql = new SQL({ ...getOptions(), max: 1 });
await sql`create temporary table test_users (name text, age int)`;

@@ -0,0 +1,52 @@
import { heapStats } from "bun:jsc";
import { describe, expect, test } from "bun:test";
describe("Bun.serve response stream leak", () => {
test("proxy server forwarding streaming response should not leak", async () => {
// Backend server that returns a streaming response with delay
await using backend = Bun.serve({
port: 0,
fetch(req) {
const stream = new ReadableStream({
async start(controller) {
controller.enqueue(new TextEncoder().encode("chunk1"));
await Bun.sleep(10);
controller.enqueue(new TextEncoder().encode("chunk2"));
controller.close();
},
});
return new Response(stream);
},
});
// Proxy server that forwards the response body stream
await using proxy = Bun.serve({
port: 0,
async fetch(req) {
const backendResponse = await fetch(`http://localhost:${backend.port}/`);
return new Response(backendResponse.body);
},
});
const url = `http://localhost:${proxy.port}/`;
async function leak() {
const response = await fetch(url);
return await response.text();
}
for (let i = 0; i < 200; i++) {
await leak();
}
await Bun.sleep(10);
Bun.gc(true);
await Bun.sleep(10);
Bun.gc(true);
const readableStreamCount = heapStats().objectTypeCounts.ReadableStream || 0;
const responseCount = heapStats().objectTypeCounts.Response || 0;
expect(readableStreamCount).toBeLessThanOrEqual(50);
expect(responseCount).toBeLessThanOrEqual(50);
});
});

@@ -0,0 +1,64 @@
import { expect, test } from "bun:test";
// GitHub Issue #25639: setTimeout Timeout object missing _idleStart property
// Next.js 16 uses _idleStart to coordinate timers for Cache Components
test("setTimeout returns Timeout object with _idleStart property", () => {
const timer = setTimeout(() => {}, 100);
try {
// Verify _idleStart exists and is a number
expect("_idleStart" in timer).toBe(true);
expect(typeof timer._idleStart).toBe("number");
// _idleStart should be a positive timestamp
expect(timer._idleStart).toBeGreaterThan(0);
} finally {
clearTimeout(timer);
}
});
test("setInterval returns Timeout object with _idleStart property", () => {
const timer = setInterval(() => {}, 100);
try {
// Verify _idleStart exists and is a number
expect("_idleStart" in timer).toBe(true);
expect(typeof timer._idleStart).toBe("number");
// _idleStart should be a positive timestamp
expect(timer._idleStart).toBeGreaterThan(0);
} finally {
clearInterval(timer);
}
});
test("_idleStart is writable (Next.js modifies it to coordinate timers)", () => {
const timer = setTimeout(() => {}, 100);
try {
const originalIdleStart = timer._idleStart;
expect(typeof originalIdleStart).toBe("number");
// Next.js sets _idleStart to coordinate timers
const newIdleStart = originalIdleStart - 100;
timer._idleStart = newIdleStart;
expect(timer._idleStart).toBe(newIdleStart);
} finally {
clearTimeout(timer);
}
});
test("timers created at different times have different _idleStart values", async () => {
const timer1 = setTimeout(() => {}, 100);
// Wait a bit to ensure different timestamp
await Bun.sleep(10);
const timer2 = setTimeout(() => {}, 100);
try {
expect(timer2._idleStart).toBeGreaterThanOrEqual(timer1._idleStart);
} finally {
clearTimeout(timer1);
clearTimeout(timer2);
}
});

@@ -0,0 +1,86 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDirWithFiles } from "harness";
test("CSS logical properties should not be stripped when nested rules are present", async () => {
// Test for regression of issue #25794: CSS logical properties (e.g., inset-inline-end)
// are stripped from bundler output when they appear in a nested selector that also
// contains further nested rules (like pseudo-elements).
const dir = tempDirWithFiles("css-logical-properties-nested", {
"input.css": `.test-longform {
background-color: teal;
&.test-longform--end {
inset-inline-end: 20px;
&:after {
content: "";
}
}
}
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "build", "input.css", "--outdir", "out"],
env: bunEnv,
cwd: dir,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// Verify the output CSS contains the logical property fallbacks
const outputContent = await Bun.file(`${dir}/out/input.css`).text();
// Helper function to normalize CSS output for snapshots
function normalizeCSSOutput(output: string): string {
return output
.replace(/\/\*.*?\*\//g, "/* [path] */") // Replace comment paths
.trim();
}
// The output should contain LTR/RTL fallback rules for inset-inline-end
// inset-inline-end: 20px should generate:
// - right: 20px for LTR languages
// - left: 20px for RTL languages
// The bundler generates vendor-prefixed variants for browser compatibility
expect(normalizeCSSOutput(outputContent)).toMatchInlineSnapshot(`
"/* [path] */
.test-longform {
background-color: teal;
}
.test-longform.test-longform--end:not(:-webkit-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:not(:-moz-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:not(:is(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi))) {
right: 20px;
}
.test-longform.test-longform--end:-webkit-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:-moz-any(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:is(:lang(ae), :lang(ar), :lang(arc), :lang(bcc), :lang(bqi), :lang(ckb), :lang(dv), :lang(fa), :lang(glk), :lang(he), :lang(ku), :lang(mzn), :lang(nqo), :lang(pnb), :lang(ps), :lang(sd), :lang(ug), :lang(ur), :lang(yi)) {
left: 20px;
}
.test-longform.test-longform--end:after {
content: "";
}"
`);
// Should exit successfully
expect(exitCode).toBe(0);
});