Compare commits


8 Commits

Author SHA1 Message Date
Claude Bot
4681cbfeec Add streaming support for Bun Shell stdout/stderr
Implements a ReadableStream interface for shell stdout and stderr, allowing
real-time consumption of command output instead of waiting for the fully
buffered result after the command completes.

- Created ShellOutputStream.zig implementing ReadableStream source
- Added stdout/stderr getters to ShellInterpreter returning streams
- Exposed streams on ShellPromise in shell.ts
- Added data notification hooks in Cmd.zig and Builtin.zig
- Updated TypeScript definitions in shell.d.ts
- Added comprehensive tests in shell-streaming.test.ts

The streams are lazily created on first access and share the same
underlying ByteList buffer used for buffered output. Data notifications
wake pending reads when new output arrives.
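A minimal usage sketch of the new API (the command name here is just a placeholder):

```ts
import { $ } from "bun";

// Accessing .stdout lazily starts the command and returns a
// ReadableStream<Uint8Array> backed by the same ByteList used for buffered output.
const shell = $`some-long-running-command`;

for await (const chunk of shell.stdout) {
  console.log("Received:", new TextDecoder().decode(chunk));
}

// Awaiting the ShellPromise still resolves with the fully buffered result.
await shell;
```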

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-30 22:09:02 +00:00
Dylan Conway
52629145ca fix(parser): TSX arrow function bugfix (#23082)
### What does this PR do?
The `.t_equals` and `.t_slash` checks were missing. Adding them matches esbuild.

```go
// Returns true if the current less-than token is considered to be an arrow
// function under TypeScript's rules for files containing JSX syntax
func (p *parser) isTSArrowFnJSX() (isTSArrowFn bool) {
	oldLexer := p.lexer
	p.lexer.Next()

	// Look ahead to see if this should be an arrow function instead
	if p.lexer.Token == js_lexer.TConst {
		p.lexer.Next()
	}
	if p.lexer.Token == js_lexer.TIdentifier {
		p.lexer.Next()
		if p.lexer.Token == js_lexer.TComma || p.lexer.Token == js_lexer.TEquals {
			isTSArrowFn = true
		} else if p.lexer.Token == js_lexer.TExtends {
			p.lexer.Next()
			isTSArrowFn = p.lexer.Token != js_lexer.TEquals && p.lexer.Token != js_lexer.TGreaterThan && p.lexer.Token != js_lexer.TSlash
		}
	}

	// Restore the lexer
	p.lexer = oldLexer
	return
}
```

fixes #19697
### How did you verify your code works?
Added some tests.
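Inputs exercised by the new transpiler tests (shown further down in this diff) include type-parameterized arrows and JSX tags followed by `extends`:

```tsx
// Arrow function with a defaulted type parameter (the trailing comma disambiguates in TSX).
console.log(A = <T = unknown,>() => null);

// Arrow function with a constrained type parameter.
const B = <T extends string>() => null;

// JSX elements where `extends` is an attribute name, not a type-parameter constraint.
const element = <T extends />;
const element2 = <T extends={true} />;
const element3 = <T extends></T>;
```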
2025-09-29 05:10:16 -07:00
Dylan Conway
f4218ed40b fix(parser): possible crash with --minify-syntax and string -> dot conversions (#23078)
### What does this PR do?
Fixes code like `[(()=>{})()][''+'c']`.

We were calling `visitExpr` on a node that had already been visited. This
code path doesn't exist in esbuild, but we keep it because it's an
optimization.

fixes #18629
fixes #15926

### How did you verify your code works?
Manually and added a test.
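The regression test added later in this diff reproduces it by evaluating the expression with `bun -p`:

```ts
// Before the fix this could crash the parser; with the fix it prints "undefined".
const { stdout, exitCode } = Bun.spawnSync({
  cmd: [process.execPath, "-p", "[(()=>{})()][''+'c']"],
  stdout: "pipe",
  stderr: "pipe",
});
console.log(stdout.toString()); // "undefined\n"
console.log(exitCode); // 0
```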
2025-09-29 04:20:57 -07:00
Dylan Conway
9c75db45fa fix(parser): scope mismatch bug from parseSuffix (#23073)
### What does this PR do?
esbuild returns `left` from the inner loop, and this PR matches that
behavior. Previously we broke out of the inner loop and continued through the
outer loop, potentially parsing too far.

fixes #22013
fixes #22384

### How did you verify your code works?
Added some tests.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-29 03:03:18 -07:00
Dylan Conway
f6e722b594 fix(glob): fix index out of bounds in GlobWalker (#23055)
### What does this PR do?
Given the pattern input `"../."`, we might collapse all path components and index past the end of the pattern component list.
### How did you verify your code works?
Manually and added a test.
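A minimal sketch of the failing input, assuming the `Bun.Glob` scanning API:

```ts
import { Glob } from "bun";

// Every component of "../." is a special component ("." or ".."), so the walker
// could skip past the end of the pattern component list and index out of bounds.
const glob = new Glob("../.");

// With the fix this yields no matches instead of crashing.
for (const match of glob.scanSync({ cwd: "fixtures" })) {
  console.log(match);
}
```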

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2025-09-29 02:21:13 -07:00
Jarred Sumner
d9fdb67d70 Deflake bun-server.test.ts 2025-09-28 23:58:21 -07:00
Jarred Sumner
a09dc2f450 Update no-validate-leaksan.txt 2025-09-28 23:46:10 -07:00
Meghan Denny
39e48ed244 Bump 2025-09-28 11:28:14 -07:00
29 changed files with 814 additions and 135 deletions

LATEST
View File

@@ -1 +1 @@
1.2.22
1.2.23

View File

@@ -132,10 +132,6 @@ await redis.hmset("user:123", [
const userFields = await redis.hmget("user:123", ["name", "email"]);
console.log(userFields); // ["Alice", "alice@example.com"]
// Get a single field from a hash
const value = await redis.hget("user:123", "name");
console.log(value); // "Alice"
// Increment a numeric field in a hash
await redis.hincrby("user:123", "visits", 1);

View File

@@ -293,33 +293,6 @@ const socket = new WebSocket("ws://localhost:3000", {
});
```
### Subprotocol negotiation
WebSocket clients can request specific subprotocols during the connection handshake. The server can then choose which protocol to use from the client's list.
```js
// Request multiple protocols
const ws = new WebSocket("ws://localhost:3000", ["chat", "superchat"]);
ws.onopen = () => {
console.log(`Connected with protocol: ${ws.protocol}`); // Server's chosen protocol
};
```
### Custom headers
Bun allows you to set custom headers in the WebSocket constructor, including overriding standard WebSocket headers. This is useful for authentication, custom host headers, or other server requirements.
```js
const ws = new WebSocket("ws://localhost:3000", {
headers: {
"Host": "custom-host.example.com",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"X-Custom": "value"
}
});
```
To add event listeners to the socket:
```ts

View File

@@ -781,14 +781,6 @@ $ bun build ./index.tsx --outdir ./out --minify --keep-names
{% /codetabs %}
### Minification optimizations
The minifier applies several optimizations:
- **Constructor simplification**: `new Object()` → `{}`, `new Array(1,2)` → `[1,2]`
- **typeof checks**: `typeof x === "undefined"` → `typeof x > "u"`
- **Function names**: Unused function/class expression names are removed unless `--keep-names` is set
<!-- ### `treeshaking`
boolean; -->

View File

@@ -166,16 +166,6 @@ will execute `<script>` in both `bar` and `baz`, but not in `foo`.
Find more details in the docs page for [filter](https://bun.com/docs/cli/filter#running-scripts-with-filter).
### `--workspaces`
In monorepos with workspaces, you can use the `--workspaces` flag to execute a script in all workspace packages that have the script defined.
```bash
$ bun run --workspaces build
```
This will run the `build` script in all workspace packages that have a `build` script defined in their `package.json`. Packages without the specified script will be skipped.
## `bun run -` to pipe code from stdin
`bun run -` lets you read JavaScript, TypeScript, TSX, or JSX from stdin and execute it without writing to a temporary file first.

View File

@@ -323,24 +323,3 @@ Error: here!
at moduleEvaluation (native)
at <anonymous> (native)
```
### Async stack traces
Bun includes asynchronous call frames in stack traces, making debugging async/await code easier:
```js
async function foo() {
return await bar();
}
async function bar() {
throw new Error("oops");
}
await foo();
// error: oops
// at bar (async.js:6:9)
// at async foo (async.js:2:16)
```
The stack trace shows the complete async execution path with `async` prefixed to asynchronous frames.

View File

@@ -40,7 +40,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:http`](https://nodejs.org/api/http.html)
🟢 Fully implemented. Outgoing client request body is currently buffered instead of streamed. `closeIdleConnections()` is implemented.
🟢 Fully implemented. Outgoing client request body is currently buffered instead of streamed.
### [`node:https`](https://nodejs.org/api/https.html)
@@ -80,7 +80,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:tty`](https://nodejs.org/api/tty.html)
🟢 Fully implemented. Includes interactive TTY support after stdin closes.
🟢 Fully implemented.
### [`node:url`](https://nodejs.org/api/url.html)
@@ -124,7 +124,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:perf_hooks`](https://nodejs.org/api/perf_hooks.html)
🟡 Missing `createHistogram`. `monitorEventLoopDelay` is implemented. It's recommended to use `performance` global instead of `perf_hooks.performance`.
🟡 Missing `createHistogram` `monitorEventLoopDelay`. It's recommended to use `performance` global instead of `perf_hooks.performance`.
### [`node:process`](https://nodejs.org/api/process.html)
@@ -406,10 +406,6 @@ The table below lists all globals implemented by Node.js and Bun's current compa
🟢 Fully implemented.
### Performance
`structuredClone()` uses the same optimized serialization as `postMessage()`. For simple objects containing only primitives, it can be up to 240x faster than standard structured cloning.
### [`SubtleCrypto`](https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto)
🟢 Fully implemented.

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.2.23",
"version": "1.2.24",
"workspaces": [
"./packages/bun-types",
"./packages/@types/bun"

View File

@@ -88,6 +88,38 @@ declare module "bun" {
* ```
*/
class ShellPromise extends Promise<ShellOutput> {
/**
* Get a ReadableStream for stdout that streams data as the shell executes.
*
* This allows you to consume stdout incrementally rather than waiting for
* the command to complete. The stream will emit chunks as they're written.
*
* @example
* ```ts
* const shell = $`long-running-command`;
* for await (const chunk of shell.stdout) {
* console.log('Received:', new TextDecoder().decode(chunk));
* }
* ```
*/
get stdout(): ReadableStream<Uint8Array>;
/**
* Get a ReadableStream for stderr that streams data as the shell executes.
*
* This allows you to consume stderr incrementally rather than waiting for
* the command to complete. The stream will emit chunks as they're written.
*
* @example
* ```ts
* const shell = $`long-running-command`;
* for await (const chunk of shell.stderr) {
* console.error('Error:', new TextDecoder().decode(chunk));
* }
* ```
*/
get stderr(): ReadableStream<Uint8Array>;
get stdin(): WritableStream;
/**

View File

@@ -194,11 +194,11 @@ pub fn isTSArrowFnJSX(p: anytype) !bool {
}
if (p.lexer.token == .t_identifier) {
try p.lexer.next();
if (p.lexer.token == .t_comma) {
if (p.lexer.token == .t_comma or p.lexer.token == .t_equals) {
is_ts_arrow_fn = true;
} else if (p.lexer.token == .t_extends) {
try p.lexer.next();
is_ts_arrow_fn = p.lexer.token != .t_equals and p.lexer.token != .t_greater_than;
is_ts_arrow_fn = p.lexer.token != .t_equals and p.lexer.token != .t_greater_than and p.lexer.token != .t_slash;
}
}

View File

@@ -827,24 +827,25 @@ pub fn ParseSuffix(
const optional_chain = &optional_chain_;
while (true) {
if (p.lexer.loc().start == p.after_arrow_body_loc.start) {
while (true) {
switch (p.lexer.token) {
.t_comma => {
if (level.gte(.comma)) {
break;
}
defer left_and_out.* = left_value;
next_token: switch (p.lexer.token) {
.t_comma => {
if (level.gte(.comma)) {
return;
}
try p.lexer.next();
left.* = p.newExpr(E.Binary{
.op = .bin_comma,
.left = left.*,
.right = try p.parseExpr(.comma),
}, left.loc);
},
else => {
break;
},
}
try p.lexer.next();
left.* = p.newExpr(E.Binary{
.op = .bin_comma,
.left = left.*,
.right = try p.parseExpr(.comma),
}, left.loc);
continue :next_token p.lexer.token;
},
else => {
return;
},
}
}

View File

@@ -609,7 +609,8 @@ pub fn VisitExpr(
p.delete_target = dot.data;
}
return p.visitExprInOut(dot, in);
// don't call visitExprInOut on `dot` because we've already visited `target` above!
return dot;
}
// Handle property rewrites to ensure things

View File

@@ -23,6 +23,14 @@ export default [
fn: "getStarted",
length: 0,
},
stdout: {
getter: "getStdout",
cache: true,
},
stderr: {
getter: "getStderr",
cache: true,
},
},
}),
];

View File

@@ -40,9 +40,9 @@ function source(name) {
isClosed: {
getter: "getIsClosedFromJS",
},
...(name !== "File"
...(name !== "File" && name !== "ShellOutputStream"
? // Buffered versions
// not implemented in File, yet.
// not implemented in File and ShellOutputStream yet.
{
text: {
fn: "textFromJS",
@@ -80,6 +80,6 @@ function source(name) {
});
}
const sources = ["Blob", "File", "Bytes"];
const sources = ["Blob", "File", "Bytes", "ShellOutputStream"];
export default sources.map(source);

View File

@@ -70,6 +70,7 @@ pub const Classes = struct {
pub const FileInternalReadableStreamSource = webcore.FileReader.Source;
pub const BlobInternalReadableStreamSource = webcore.ByteBlobLoader.Source;
pub const BytesInternalReadableStreamSource = webcore.ByteStream.Source;
pub const ShellOutputStreamInternalReadableStreamSource = webcore.ShellOutputStream.Source;
pub const PostgresSQLConnection = api.Postgres.PostgresSQLConnection;
pub const MySQLConnection = api.MySQL.MySQLConnection;
pub const PostgresSQLQuery = api.Postgres.PostgresSQLQuery;

View File

@@ -40,6 +40,7 @@ pub const FetchHeaders = @import("./bindings/FetchHeaders.zig").FetchHeaders;
pub const ByteBlobLoader = @import("./webcore/ByteBlobLoader.zig");
pub const ByteStream = @import("./webcore/ByteStream.zig");
pub const FileReader = @import("./webcore/FileReader.zig");
pub const ShellOutputStream = @import("../shell/ShellOutputStream.zig");
pub const ScriptExecutionContext = @import("./webcore/ScriptExecutionContext.zig");
pub const streams = @import("./webcore/streams.zig");

View File

@@ -587,6 +587,16 @@ pub fn GlobWalker_(
var had_dot_dot = false;
const component_idx = this.walker.skipSpecialComponents(work_item.idx, &dir_path, &this.iter_state.directory.path, &had_dot_dot);
// If we've exhausted all pattern components (e.g., pattern was only dots like "../."),
// we're done with this work item
if (component_idx >= this.walker.patternComponents.items.len) {
if (work_item.fd) |fd| {
this.closeDisallowingCwd(fd);
}
this.iter_state = .get_next;
return .success;
}
const fd: Accessor.Handle = fd: {
if (work_item.fd) |fd| break :fd fd;
if (comptime root) {
@@ -705,6 +715,13 @@ pub fn GlobWalker_(
var has_dot_dot = false;
const component_idx = this.walker.skipSpecialComponents(work_item.idx, &symlink_full_path_z, scratch_path_buf, &has_dot_dot);
// If we've exhausted all pattern components, continue to next item
if (component_idx >= this.walker.patternComponents.items.len) {
this.iter_state = .get_next;
continue;
}
var pattern = this.walker.patternComponents.items[component_idx];
const next_pattern = if (component_idx + 1 < this.walker.patternComponents.items.len) &this.walker.patternComponents.items[component_idx + 1] else null;
const is_last = component_idx == this.walker.patternComponents.items.len - 1;
@@ -1173,28 +1190,32 @@ pub fn GlobWalker_(
) u32 {
var component_idx = work_item_idx;
// Skip `.` and `..` while also appending them to `dir_path`
component_idx = switch (this.patternComponents.items[component_idx].syntax_hint) {
.Dot => this.collapseDots(
component_idx,
dir_path,
scratch_path_buf,
encountered_dot_dot,
),
.DotBack => this.collapseDots(
component_idx,
dir_path,
scratch_path_buf,
encountered_dot_dot,
),
else => component_idx,
};
if (component_idx < this.patternComponents.items.len) {
// Skip `.` and `..` while also appending them to `dir_path`
component_idx = switch (this.patternComponents.items[component_idx].syntax_hint) {
.Dot => this.collapseDots(
component_idx,
dir_path,
scratch_path_buf,
encountered_dot_dot,
),
.DotBack => this.collapseDots(
component_idx,
dir_path,
scratch_path_buf,
encountered_dot_dot,
),
else => component_idx,
};
}
// Skip to the last `**` if there is a chain of them
component_idx = switch (this.patternComponents.items[component_idx].syntax_hint) {
.Double => this.collapseSuccessiveDoubleWildcards(component_idx),
else => component_idx,
};
if (component_idx < this.patternComponents.items.len) {
// Skip to the last `**` if there is a chain of them
component_idx = switch (this.patternComponents.items[component_idx].syntax_hint) {
.Double => this.collapseSuccessiveDoubleWildcards(component_idx),
else => component_idx,
};
}
return component_idx;
}

View File

@@ -109,6 +109,7 @@ export function createBunShellTemplateFunction(createShellInterpreter_, createPa
#throws: boolean = true;
#resolve: (code: number, stdout: Buffer, stderr: Buffer) => void;
#reject: (code: number, stdout: Buffer, stderr: Buffer) => void;
#interp: $ZigGeneratedClasses.ShellInterpreter | undefined = undefined;
constructor(args: $ZigGeneratedClasses.ParsedShellScript, throws: boolean) {
// Create the error immediately so it captures the stacktrace at the point
@@ -170,11 +171,22 @@ export function createBunShellTemplateFunction(createShellInterpreter_, createPa
this.#hasRun = true;
let interp = createShellInterpreter(this.#resolve, this.#reject, this.#args!);
this.#interp = interp;
this.#args = undefined;
interp.run();
}
}
get stdout(): ReadableStream {
this.#run();
return this.#interp!.stdout;
}
get stderr(): ReadableStream {
this.#run();
return this.#interp!.stderr;
}
#quiet(): this {
this.#throwIfRunning();
this.#args!.setQuiet();

View File

@@ -623,6 +623,7 @@ pub fn done(this: *Builtin, exit_code: anytype) Yield {
bun.default_allocator,
this.stdout.buf.items[0..],
));
cmd.base.shell.notifyStdoutData();
}
// Aggregate output data if shell state is piped and this cmd is piped
if (cmd.io.stderr == .pipe and cmd.io.stderr == .pipe and this.stderr == .buf) {
@@ -630,6 +631,7 @@ pub fn done(this: *Builtin, exit_code: anytype) Yield {
bun.default_allocator,
this.stderr.buf.items[0..],
));
cmd.base.shell.notifyStderrData();
}
return cmd.parent.childDone(cmd, this.exit_code.?);

View File

@@ -0,0 +1,207 @@
const std = @import("std");
const bun = @import("bun");
const jsc = bun.jsc;
const JSValue = jsc.JSValue;
const JSGlobalObject = jsc.JSGlobalObject;
const webcore = jsc.WebCore;
const Blob = webcore.Blob;
const streams = webcore.streams;
const Output = bun.Output;
/// ShellOutputStream provides a ReadableStream interface over a ByteList that is
/// being written to during shell execution. It allows streaming stdout/stderr
/// while the shell is still running, rather than waiting for completion.
const ShellOutputStream = @This();
/// Pointer to the ByteList being written to by the shell
buffer: *bun.ByteList,
/// Current read offset in the buffer
offset: usize = 0,
/// Whether the shell has finished and no more data will be written
done: bool = false,
/// Pending read operation
pending: streams.Result.Pending = .{ .result = .{ .done = {} } },
/// Buffer for pending read
pending_buffer: []u8 = &.{},
/// JSValue for the pending read
pending_value: jsc.Strong.Optional = .empty,
pub const Source = webcore.ReadableStream.NewSource(
@This(),
"ShellOutputStream",
onStart,
onPull,
onCancel,
deinit,
null,
null,
null,
null,
);
const log = Output.scoped(.ShellOutputStream, .visible);
pub fn init(buffer: *bun.ByteList) ShellOutputStream {
return .{
.buffer = buffer,
};
}
pub fn parent(this: *@This()) *Source {
return @fieldParentPtr("context", this);
}
pub fn onStart(this: *@This()) streams.Start {
// If we already have data, let the consumer know
if (this.buffer.len > 0 and this.done) {
return .{ .chunk_size = 16384 };
}
return .{ .ready = {} };
}
pub fn onPull(this: *@This(), buffer: []u8, view: jsc.JSValue) streams.Result {
jsc.markBinding(@src());
bun.assert(buffer.len > 0);
const available = this.buffer.len -| this.offset;
if (available > 0) {
const to_copy = @min(available, buffer.len);
@memcpy(buffer[0..to_copy], this.buffer.slice()[this.offset..][0..to_copy]);
this.offset += to_copy;
// If we've read everything and the shell is done, signal completion
if (this.done and this.offset >= this.buffer.len) {
return .{
.into_array_and_done = .{
.value = view,
.len = @as(Blob.SizeType, @truncate(to_copy)),
},
};
}
return .{
.into_array = .{
.value = view,
.len = @as(Blob.SizeType, @truncate(to_copy)),
},
};
}
// No data available yet
if (this.done) {
return .{ .done = {} };
}
// Wait for data
this.pending_buffer = buffer;
this.pending_value.set(this.parent().globalThis, view);
return .{
.pending = &this.pending,
};
}
pub fn onCancel(this: *@This()) void {
jsc.markBinding(@src());
this.done = true;
this.pending_value.deinit();
if (this.pending.state == .pending) {
this.pending_buffer = &.{};
this.pending.result.deinit();
this.pending.result = .{ .done = {} };
this.pending.run();
}
}
pub fn deinit(this: *@This()) void {
jsc.markBinding(@src());
this.pending_value.deinit();
if (!this.done) {
this.done = true;
if (this.pending.state == .pending) {
this.pending_buffer = &.{};
this.pending.result.deinit();
this.pending.result = .{ .done = {} };
if (this.pending.future == .promise) {
this.pending.runOnNextTick();
} else {
this.pending.run();
}
}
}
this.parent().deinit();
}
/// Called when new data has been written to the buffer.
/// Resumes any pending read operation.
pub fn onData(this: *@This()) void {
if (this.pending.state != .pending) {
return;
}
const available = this.buffer.len -| this.offset;
if (available == 0) {
return;
}
const to_copy = @min(available, this.pending_buffer.len);
@memcpy(
this.pending_buffer[0..to_copy],
this.buffer.slice()[this.offset..][0..to_copy]
);
this.offset += to_copy;
const view = this.pending_value.get() orelse {
return;
};
this.pending_value.clearWithoutDeallocation();
this.pending_buffer = &.{};
const is_done = this.done and this.offset >= this.buffer.len;
if (is_done) {
this.pending.result = .{
.into_array_and_done = .{
.value = view,
.len = @as(Blob.SizeType, @truncate(to_copy)),
},
};
} else {
this.pending.result = .{
.into_array = .{
.value = view,
.len = @as(Blob.SizeType, @truncate(to_copy)),
},
};
}
this.pending.run();
}
/// Called when the shell has finished and no more data will be written.
pub fn setDone(this: *@This()) void {
this.done = true;
// If we have a pending read and no more data, resolve it as done
if (this.pending.state == .pending) {
const available = this.buffer.len -| this.offset;
if (available == 0) {
this.pending_buffer = &.{};
const view = this.pending_value.get();
if (view) |v| {
_ = v;
this.pending_value.clearWithoutDeallocation();
}
this.pending.result.deinit();
this.pending.result = .{ .done = {} };
this.pending.run();
} else {
// We have data, let onData handle it
this.onData();
}
}
}

View File

@@ -113,6 +113,8 @@ pub const CallstackGuard = enum(u0) { __i_know_what_i_am_doing };
pub const ExitCode = u16;
pub const ShellOutputStream = @import("./ShellOutputStream.zig");
pub const StateKind = enum(u8) {
script,
stmt,
@@ -278,6 +280,9 @@ pub const Interpreter = struct {
__alloc_scope: if (bun.Environment.enableAllocScopes) bun.AllocationScope else void,
stdout_stream: ?*ShellOutputStream.Source = null,
stderr_stream: ?*ShellOutputStream.Source = null,
// Here are all the state nodes:
pub const State = @import("./states/Base.zig");
pub const Script = @import("./states/Script.zig");
@@ -351,6 +356,10 @@ pub const Interpreter = struct {
async_pids: SmolList(pid_t, 4) = SmolList(pid_t, 4).zeroes,
/// Reference to the interpreter for stream notifications
/// Only set for the root shell
interpreter: ?*ThisInterpreter = null,
__alloc_scope: if (bun.Environment.enableAllocScopes) *bun.AllocationScope else void,
const pid_t = if (bun.Environment.isPosix) std.posix.pid_t else uv.uv_pid_t;
@@ -383,6 +392,20 @@ pub const Interpreter = struct {
};
}
/// Notify streams that new stdout data is available
pub fn notifyStdoutData(this: *ShellExecEnv) void {
if (this.interpreter) |interp| {
interp.notifyStdoutData();
}
}
/// Notify streams that new stderr data is available
pub fn notifyStderrData(this: *ShellExecEnv) void {
if (this.interpreter) |interp| {
interp.notifyStderrData();
}
}
pub inline fn cwdZ(this: *ShellExecEnv) [:0]const u8 {
if (this.__cwd.items.len == 0) return "";
return this.__cwd.items[0..this.__cwd.items.len -| 1 :0];
@@ -872,6 +895,7 @@ pub const Interpreter = struct {
}
interpreter.root_shell.__alloc_scope = if (bun.Environment.enableAllocScopes) &interpreter.__alloc_scope else {};
interpreter.root_shell.interpreter = interpreter;
return .{ .result = interpreter };
}
@@ -1139,6 +1163,9 @@ pub const Interpreter = struct {
log("Interpreter(0x{x}) finish {d}", .{ @intFromPtr(this), exit_code });
defer decrPendingActivityFlag(&this.has_pending_activity);
// Mark streams as done before resolving
this.markStreamsDone();
if (this.event_loop == .js) {
defer this.deinitAfterJSRun();
this.exit_code = exit_code;
@@ -1281,6 +1308,72 @@ pub const Interpreter = struct {
return ioToJSValue(globalThis, this.root_shell.buffered_stderr());
}
pub fn getStdout(this: *ThisInterpreter, globalThis: *JSGlobalObject) JSValue {
if (this.stdout_stream) |stream| {
return stream.toReadableStream(globalThis) catch |err| {
globalThis.reportActiveExceptionAsUnhandled(err);
return .zero;
};
}
// Create the stream
var source = ShellOutputStream.Source.new(.{
.globalThis = globalThis,
.context = ShellOutputStream.init(this.root_shell.buffered_stdout()),
});
this.stdout_stream = source;
return source.toReadableStream(globalThis) catch |err| {
globalThis.reportActiveExceptionAsUnhandled(err);
return .zero;
};
}
pub fn getStderr(this: *ThisInterpreter, globalThis: *JSGlobalObject) JSValue {
if (this.stderr_stream) |stream| {
return stream.toReadableStream(globalThis) catch |err| {
globalThis.reportActiveExceptionAsUnhandled(err);
return .zero;
};
}
// Create the stream
var source = ShellOutputStream.Source.new(.{
.globalThis = globalThis,
.context = ShellOutputStream.init(this.root_shell.buffered_stderr()),
});
this.stderr_stream = source;
return source.toReadableStream(globalThis) catch |err| {
globalThis.reportActiveExceptionAsUnhandled(err);
return .zero;
};
}
/// Notify stdout stream that new data is available
pub fn notifyStdoutData(this: *ThisInterpreter) void {
if (this.stdout_stream) |stream| {
stream.context.onData();
}
}
/// Notify stderr stream that new data is available
pub fn notifyStderrData(this: *ThisInterpreter) void {
if (this.stderr_stream) |stream| {
stream.context.onData();
}
}
/// Mark streams as done when shell finishes
fn markStreamsDone(this: *ThisInterpreter) void {
if (this.stdout_stream) |stream| {
stream.context.setDone();
}
if (this.stderr_stream) |stream| {
stream.context.setDone();
}
}
pub fn finalize(this: *ThisInterpreter) void {
log("Interpreter(0x{x}) finalize", .{@intFromPtr(this)});
this.deinitFromFinalizer();

View File

@@ -155,6 +155,7 @@ const BufferedIoClosed = struct {
if (cmd.io.stdout == .pipe and cmd.io.stdout == .pipe and !cmd.node.redirect.redirectsElsewhere(.stdout)) {
const the_slice = readable.pipe.slice();
bun.handleOom(cmd.base.shell.buffered_stdout().appendSlice(bun.default_allocator, the_slice));
cmd.base.shell.notifyStdoutData();
}
var buffer = readable.pipe.takeBuffer();
@@ -169,6 +170,7 @@ const BufferedIoClosed = struct {
if (cmd.io.stderr == .pipe and cmd.io.stderr == .pipe and !cmd.node.redirect.redirectsElsewhere(.stderr)) {
const the_slice = readable.pipe.slice();
bun.handleOom(cmd.base.shell.buffered_stderr().appendSlice(bun.default_allocator, the_slice));
cmd.base.shell.notifyStderrData();
}
var buffer = readable.pipe.takeBuffer();

View File

@@ -0,0 +1,98 @@
import { describe, expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
describe("scope mismatch panic regression test", () => {
test("should not panic with scope mismatch when arrow function is followed by array literal", async () => {
// This test reproduces the exact panic that was fixed
// The bug caused: "panic(main thread): Scope mismatch while visiting"
using dir = tempDir("scope-mismatch", {
"index.tsx": `
const Layout = () => {
return (
<html>
</html>
)
}
['1', 'p'].forEach(i =>
app.get(\`/\${i === 'home' ? '' : i}\`, c => c.html(
<Layout selected={i}>
Hello {i}
</Layout>
))
)`,
});
// With the bug, this would panic with "Scope mismatch while visiting"
// With the fix, it should fail with a normal ReferenceError for 'app'
await using proc = Bun.spawn({
cmd: [bunExe(), "index.tsx"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
stdout: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// The key assertion: should NOT panic with scope mismatch
expect(stderr).not.toContain("panic");
expect(stderr).not.toContain("Scope mismatch");
// Should fail with a normal error instead (ReferenceError for undefined 'app')
expect(stderr).toContain("ReferenceError");
expect(stderr).toContain("app is not defined");
expect(exitCode).not.toBe(0);
});
test("should not panic with simpler arrow function followed by array", async () => {
using dir = tempDir("scope-mismatch-simple", {
"test.js": `
const fn = () => {
return 1
}
['a', 'b'].forEach(x => console.log(x))`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "test.js"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
stdout: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// Should not panic
expect(stderr).not.toContain("panic");
expect(stderr).not.toContain("Scope mismatch");
// Should successfully execute
expect(stdout).toBe("a\nb\n");
expect(exitCode).toBe(0);
});
test("correctly rejects direct indexing into block body arrow function", async () => {
using dir = tempDir("scope-mismatch-reject", {
"test.js": `const fn = () => {return 1}['x']`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "test.js"],
env: bunEnv,
cwd: String(dir),
stderr: "pipe",
stdout: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
// Should fail with a parse error, not a panic
expect(stderr).not.toContain("panic");
expect(stderr).not.toContain("Scope mismatch");
expect(stderr).toContain("error"); // Parse error or similar
expect(exitCode).not.toBe(0);
});
});

View File

@@ -1344,6 +1344,19 @@ console.log(<div {...obj} key="after" />);`),
);
});
it("parses TSX arrow functions correctly", () => {
var transpiler = new Bun.Transpiler({
loader: "tsx",
});
expect(transpiler.transformSync("console.log(A = <T = unknown,>() => null)")).toBe(
"console.log(A = () => null);\n",
);
expect(transpiler.transformSync("const B = <T extends string>() => null")).toBe("const B = () => null;\n");
expect(transpiler.transformSync("const element = <T extends/>")).toContain("jsxDEV");
expect(transpiler.transformSync("const element2 = <T extends={true}/>")).toContain("jsxDEV");
expect(transpiler.transformSync("const element3 = <T extends></T>")).toContain("jsxDEV");
});
it.todo("JSX", () => {
var bun = new Bun.Transpiler({
loader: "jsx",
@@ -3551,6 +3564,19 @@ it("does not crash with 9 comments and typescript type skipping", () => {
expect(exitCode).toBe(0);
});
it("does not crash with --minify-syntax and revisiting dot expressions", () => {
const { stdout, stderr, exitCode } = Bun.spawnSync({
cmd: [bunExe(), "-p", "[(()=>{})()][''+'c']"],
stdout: "pipe",
stderr: "pipe",
env: bunEnv,
});
expect(stderr.toString()).toBe("");
expect(stdout.toString()).toBe("undefined\n");
expect(exitCode).toBe(0);
});
it("runtime transpiler stack overflows", async () => {
expect(async () => await import("./fixtures/lots-of-for-loop.js")).toThrow(`Maximum call stack size exceeded`);
});
@@ -3566,3 +3592,132 @@ it("Bun.Transpiler.transform stack overflows", async () => {
const transpiler = new Bun.Transpiler();
expect(async () => await transpiler.transform(code)).toThrow(`Maximum call stack size exceeded`);
});
describe("arrow function parsing after const declaration (scope mismatch bug)", () => {
const transpiler = new Bun.Transpiler({ loader: "tsx" });
it("reproduces the original scope mismatch bug with JSX", () => {
// This is the exact pattern that caused the scope mismatch panic
const code = `
const Layout = () => {
return (
<html>
</html>
)
}
['1', 'p'].forEach(i =>
app.get(\`/\${i === 'home' ? '' : i}\`, c => c.html(
<Layout selected={i}>
Hello {i}
</Layout>
))
)`;
// Without the fix, this would parse the array as indexing into the arrow function
// causing a scope mismatch panic when visiting the AST
const result = transpiler.transformSync(code);
// The correct parse should have the array literal as a separate statement
expect(result).toContain("forEach");
// The bug would incorrectly parse as: })["1", "p"]
expect(result).not.toContain(')["');
expect(result).not.toContain('}["');
});
it("correctly parses array literal on next line after block body arrow function", () => {
const code = `const Layout = () => {
return 1
}
['1', 'p'].forEach(i => console.log(i))`;
const result = transpiler.transformSync(code);
expect(result).toContain("forEach");
// The bug would cause the array to be parsed as indexing: Layout[...
expect(result).not.toContain(')["');
});
it("correctly parses JSX arrow function followed by array literal", () => {
const code = `const Layout = () => {
return (
<html>
</html>
)
}
['1', 'p'].forEach(i => console.log(i))`;
const result = transpiler.transformSync(code);
expect(result).toContain("forEach");
expect(result).not.toContain("Layout[");
});
it("rejects indexing directly into block body arrow function without parens", () => {
const code = `const Layout = () => {return 1}['x']`;
// Should throw a parse error - either "Parse error" or the more specific message
expect(() => transpiler.transformSync(code)).toThrow();
});
it("allows indexing into parenthesized arrow function", () => {
const code = `const x = (() => {return {a: 1}})['a']`;
const result = transpiler.transformSync(code);
expect(result).toContain('["a"]');
});
it("correctly handles expression body arrow functions", () => {
const code = `const Layout = () => 1
['1', 'p'].forEach(i => console.log(i))`;
const result = transpiler.transformSync(code);
expect(result).toContain("forEach");
});
it("correctly handles arrow function with comma operator", () => {
const code = `const a = () => {return 1}, b = 2`;
const result = transpiler.transformSync(code);
expect(result).toContain("b = 2");
});
it("correctly handles multiple arrow functions in const declaration", () => {
const code = `const a = () => {return 1}, b = () => {return 2}
['1', '2'].forEach(x => console.log(x))`;
const result = transpiler.transformSync(code);
expect(result).toContain("forEach");
expect(result).not.toContain("b[");
});
it("preserves intentional array access with explicit semicolon", () => {
const code = `const Layout = () => {return 1};
['1', 'p'].forEach(i => console.log(i))`;
const result = transpiler.transformSync(code);
expect(result).toContain("forEach");
expect(result).not.toContain("Layout[");
});
it("handles nested arrow functions correctly", () => {
const code = `const outer = () => {
const inner = () => {
return 1
}
return inner
}
['test'].forEach(x => x)`;
const result = transpiler.transformSync(code);
expect(result).toContain("forEach");
});
it("handles arrow function followed by object literal", () => {
const code = `const fn = () => {return 1}
({a: 1, b: 2}).a`;
const result = transpiler.transformSync(code);
expect(result).toContain("a: 1");
expect(result).not.toContain("fn(");
});
});

View File

@@ -1104,3 +1104,7 @@ exports[`fast-glob e2e tests (absolute) only files (cwd) **/*: absolute: **/* 1`
"/fixtures/third/library/b/book.md",
]
`;
exports[`fast-glob e2e tests (absolute) patterns regular ../.: absolute: ../. 1`] = `[]`;
exports[`fast-glob e2e tests patterns regular ../.: ../. 1`] = `[]`;

View File

@@ -222,6 +222,7 @@ const regular = {
"fixtures/**/{nested,file.md}/*",
"./fixtures/*",
"../.",
],
cwd: [
{ pattern: "*", cwd: "fixtures" },

View File

@@ -184,7 +184,6 @@ describe.concurrent("Server", () => {
test("abort signal on server", async () => {
{
let abortPromise = Promise.withResolvers();
let responseAwaited = Promise.withResolvers();
let fetchAborted = false;
const abortController = new AbortController();
using server = Bun.serve({
@@ -193,8 +192,7 @@ describe.concurrent("Server", () => {
abortPromise.resolve();
});
abortController.abort();
await Bun.sleep(15);
responseAwaited.resolve();
await abortPromise.promise;
return new Response("Hello");
},
port: 0,
@@ -210,7 +208,7 @@ describe.concurrent("Server", () => {
fetchAborted = true;
}
// wait for the server to process the abort signal, fetch may throw before the server processes the signal
await Promise.all([abortPromise.promise, responseAwaited.promise]);
await abortPromise.promise;
expect(fetchAborted).toBe(true);
}
});

View File

@@ -0,0 +1,113 @@
import { test, expect, describe } from "bun:test";
import { $ } from "bun";
describe("Shell streaming stdout/stderr", () => {
test("stdout returns a ReadableStream", async () => {
const shell = $`echo "hello world"`;
const stdout = shell.stdout;
expect(stdout).toBeInstanceOf(ReadableStream);
// Consume the stream
const chunks: Uint8Array[] = [];
for await (const chunk of stdout) {
chunks.push(chunk);
}
const text = new TextDecoder().decode(Buffer.concat(chunks));
expect(text.trim()).toBe("hello world");
// Wait for shell to complete
await shell;
});
test("stderr returns a ReadableStream", async () => {
const shell = $`node -e "console.error('error message')"`.nothrow();
const stderr = shell.stderr;
expect(stderr).toBeInstanceOf(ReadableStream);
// Consume the stream
const chunks: Uint8Array[] = [];
for await (const chunk of stderr) {
chunks.push(chunk);
}
const text = new TextDecoder().decode(Buffer.concat(chunks));
expect(text.trim()).toBe("error message");
// Wait for shell to complete
await shell;
});
test("can read stdout stream while command is running", async () => {
const shell = $`node -e "
for (let i = 0; i < 3; i++) {
console.log('line ' + i);
}
"`;
const chunks: string[] = [];
const reader = shell.stdout.getReader();
const decoder = new TextDecoder();
try {
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(decoder.decode(value, { stream: true }));
}
} finally {
reader.releaseLock();
}
const output = chunks.join('');
expect(output).toContain("line 0");
expect(output).toContain("line 1");
expect(output).toContain("line 2");
await shell;
});
test("stdout and stderr work independently", async () => {
const shell = $`node -e "
console.log('stdout message');
console.error('stderr message');
"`.nothrow();
const stdoutPromise = (async () => {
const chunks: Uint8Array[] = [];
for await (const chunk of shell.stdout) {
chunks.push(chunk);
}
return new TextDecoder().decode(Buffer.concat(chunks));
})();
const stderrPromise = (async () => {
const chunks: Uint8Array[] = [];
for await (const chunk of shell.stderr) {
chunks.push(chunk);
}
return new TextDecoder().decode(Buffer.concat(chunks));
})();
const [stdoutText, stderrText] = await Promise.all([stdoutPromise, stderrPromise]);
expect(stdoutText.trim()).toBe("stdout message");
expect(stderrText.trim()).toBe("stderr message");
await shell;
});
test("can access stdout stream multiple times", async () => {
const shell = $`echo "test"`;
const stream1 = shell.stdout;
const stream2 = shell.stdout;
// Should return the same stream instance
expect(stream1).toBe(stream2);
await shell;
});
});

View File

@@ -71,14 +71,6 @@ test/js/node/test/parallel/test-fs-readfile-eof.js
test/js/node/test/parallel/test-child-process-promisified.js
test/js/node/test/parallel/test-child-process-exec-encoding.js
test/js/node/test/parallel/test-child-process-execfile.js
test/bake/dev-and-prod.test.ts
test/bake/dev/bundle.test.ts
test/bake/dev/css.test.ts
test/bake/dev/esm.test.ts
test/bake/dev/hot.test.ts
test/bake/dev/react-spa.test.ts
test/bake/dev/sourcemap.test.ts
test/bake/dev/ssg-pages-router.test.ts
test/bundler/bundler_compile.test.ts
test/bundler/bundler_plugin.test.ts
test/bundler/transpiler/bun-pragma.test.ts
@@ -403,3 +395,14 @@ test/js/third_party/resvg/bbox.test.js
test/regression/issue/10139.test.ts
test/js/bun/udp/udp_socket.test.ts
test/cli/init/init.test.ts
# Watcher Thread
test/bake/dev-and-prod.test.ts
test/bake/dev/bundle.test.ts
test/bake/dev/css.test.ts
test/bake/dev/esm.test.ts
test/bake/dev/hot.test.ts
test/bake/dev/react-spa.test.ts
test/bake/dev/sourcemap.test.ts
test/bake/dev/ssg-pages-router.test.ts
test/bake/dev/deinitialization.test.ts