bun.sh/src/transpiler.zig
Jarred Sumner 5a0705348b feat(transpiler): add replMode option for REPL transforms
Add a new `replMode` option to Bun.Transpiler that transforms code for
interactive REPL evaluation:

- Wraps expressions in `{ value: expr }` for result capture
- Uses sync/async IIFE wrappers to avoid parentheses around objects
- Hoists var/let/const declarations for persistence across REPL lines
- Converts const to let for REPL mutability
- Hoists function declarations with this.funcName assignment
- Hoists class declarations with var for vm context persistence
- Auto-detects object literals (starting with { without trailing ;)

This enables building a Node.js-compatible REPL using Bun.Transpiler
with vm.runInContext for persistent variable scope.

Usage:
```typescript
const transpiler = new Bun.Transpiler({
  loader: "tsx",
  replMode: true,
});
const transformed = transpiler.transformSync(userInput);
const result = await vm.runInContext(transformed, context);
console.log(result.value);
```

REPL transforms are extracted into a separate repl_transforms.zig module.

Claude-Generated-By: Claude Code (cli/claude-opus-4-5=100%)
Claude-Steers: 8
Claude-Permission-Prompts: 0
Claude-Escapes: 0
Claude-Plan:
<claude-plan>
# REPL Transform Fixes and Node.js Parity

## Current Status

The basic `replMode` option is implemented. This plan covers fixes and parity with Node.js REPL.

## Issues to Fix

### 1. Value Wrapper Has Extra Parentheses (CRITICAL)
**Current output:**
```js
({
  __proto__: null,
  value: 42
});
```

**Expected behavior (per Node.js):**
- For **non-async expressions**: Node.js returns `null` (no transform) - the REPL evaluates the expression directly
- For **async expressions**: `(async () => { return { __proto__: null, value: (expr) } })()`

**Solution:**
1. For non-async expressions: Don't wrap in `{ value: expr }` - just return the expression as-is
2. For async expressions: The `{ __proto__: null, value: expr }` is already inside the function after `return`, so no outer parens needed
3. Add inner parens around the expression value for clarity: `{ __proto__: null, value: (expr) }`
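A minimal sketch of why the IIFE sidesteps the parenthesization problem: a leading `{` in statement position starts a block, so the printer must emit `({ ... });`, but after `return` inside a function body the object literal is unambiguous and needs no outer parens.

```javascript
// Statement position would need `({ ... });`; after `return` it does not.
const wrapped = (() => {
  return { __proto__: null, value: 42 };
})();
console.log(wrapped.value); // 42
```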

### 2. Object Literal Disambiguation (CRITICAL)
**Input:** `{a: 1}` or `{foo: await fetch()}`

**Current:** Parsed as block with labeled statement, NOT object literal

**Solution:** Pre-check input at transpiler layer:
- If code starts with `{` and doesn't end with `;`, try parsing as `(_=CODE)`
- If valid, wrap input as `(CODE)` before processing
- This matches the Node.js approach in `repl.js` (lines 411-414)
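The pre-check can be sketched in plain JavaScript. The helper name and the `new Function` syntax probe here are illustrative only; the real pre-check would live in Zig and re-run Bun's parser instead, and unlike this sketch it must also cope with top-level `await` in the input.

```javascript
// Hypothetical helper (not Bun's API): wrap `{...}` inputs that parse as
// expressions, leave genuine blocks untouched.
function maybeWrapObjectLiteral(code) {
  if (/^\s*{/.test(code) && !/;\s*$/.test(code)) {
    try {
      new Function(`return (${code});`); // parse-only validity check
      return `(${code})`;
    } catch {
      // not a valid expression — it really is a block
    }
  }
  return code;
}

console.log(maybeWrapObjectLiteral("{a: 1}"));         // "({a: 1})"
console.log(maybeWrapObjectLiteral("{ let x = 1; }")); // unchanged — a block
```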

### 3. Class Declarations Don't Persist to VM Context
**Current:** Uses `let ClassName;` hoisting - doesn't become vm context property

**Node.js behavior:** Also uses `let` - this is a known limitation in Node.js too!

Looking at Node.js `await.js`, lines 31-37:
```js
ClassDeclaration(node, state, c) {
  state.prepend(node, `${node.id.name}=`);
  ArrayPrototypePush(state.hoistedDeclarationStatements, `let ${node.id.name}; `);
}
```

**Decision:** Use `var` instead of `let` for class hoisting. This makes classes persist to vm context, matching user expectations for REPL behavior. (Different from Node.js which uses `let`)

---

## Usage Example

```typescript
// REPL tool implementation
const transpiler = new Bun.Transpiler({
  loader: "tsx",
  replMode: true,  // NEW OPTION
});

// For each REPL input line:
const transformed = transpiler.transformSync(userInput);

// Execute in persistent VM context
const result = vm.runInContext(transformed, replContext);

// result.value contains the expression result (wrapped to prevent auto-await)
console.log(result.value);
```

## Design Decisions

- **Value wrapper**: Use `{ value: expr }` wrapper like Node.js to prevent auto-awaiting Promise results
- **Static imports**: Keep static imports as-is (Bun handles them natively)
- **Scope**: Full Node.js REPL transform parity
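The auto-await concern can be seen directly: `await` flattens promise chains, so without a wrapper a REPL could never hand back a Promise object itself. A small sketch:

```javascript
async function demo() {
  const inner = Promise.resolve(42);

  // Awaiting directly unwraps all the way down to the number:
  const direct = await Promise.resolve(inner); // 42, not a Promise

  // Awaiting the wrapper stops the unwrapping: `value` is still the Promise.
  const wrapped = await Promise.resolve({ __proto__: null, value: inner });

  return { direct, valueIsPromise: wrapped.value instanceof Promise };
}

demo().then(console.log); // { direct: 42, valueIsPromise: true }
```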

---

## Node.js REPL Transform Behavior (Reference)

From `vendor/node/lib/internal/repl/await.js`:

### 1. Object Literal Detection
```javascript
// {a:1} → ({a:1}) when starts with { and no trailing ;
if (/^\s*{/.test(code) && !/;\s*$/.test(code)) {
  code = `(${code})`;
}
```

### 2. Top-Level Await Transform
```javascript
// Input:  await x
// Output: (async () => { return { value: (await x) } })()

// Input:  var x = await 1
// Output: var x; (async () => { void (x = await 1) })()

// Input:  const x = await 1
// Output: let x; (async () => { void (x = await 1) })()  // const→let

// Input:  function foo() {}  (with await somewhere)
// Output: var foo; (async () => { this.foo = foo; function foo() {} })()

// Input:  class Foo {}  (with await somewhere)
// Output: let Foo; (async () => { Foo=class Foo {} })()
```

### 3. Transform Skipping
Returns `null` (no transform) when:
- No `await` expression present at top level
- Top-level `return` statement exists
- Code is inside async functions/arrow functions/class methods

---

## Implementation Plan

### Fix 1: Remove Extra Parentheses from Value Wrapper

**Problem:** The printer adds `()` around objects at statement start to disambiguate from blocks.

**Solution:** Always use an IIFE wrapper (sync or async) so the object is after `return`:

```js
// Non-async expression (current - BAD)
({ __proto__: null, value: 42 });

// Non-async expression (fixed - GOOD)
(() => { return { __proto__: null, value: 42 } })()

// Non-async with hoisting (fixed - GOOD)
var x;
(() => { void (x = 1); return { __proto__: null, value: x } })()

// Async (already correct)
var x;
(async () => { void (x = await 1); return { __proto__: null, value: x } })()
```

**Files to modify:**
1. `src/ast/P.zig` - `applyReplValueWrapper()` function

**Changes:**
- Remove the simple `{ value: expr }` wrapper approach
- Always use `applyReplAsyncTransform()` style IIFE wrapping, but with `is_async = false` for non-async code
- This ensures the object is always after `return`, avoiding the parentheses issue
- Hoisting still works for both cases

### Fix 2: Object Literal Disambiguation

**File:** `src/bun.js/api/JSTranspiler.zig` - Before parsing

Add pre-processing check:
```zig
// In transformSync, before parsing:
if (config.repl_mode) {
    // Check if input looks like object literal: starts with { and doesn't end with ;
    if (startsWithBrace(source) and !endsWithSemicolon(source)) {
        // Try parsing as expression by wrapping: _=(CODE)
        // If valid, wrap input as (CODE)
        source = wrapAsExpression(source);
    }
}
```

This matches Node.js `isObjectLiteral()` check in `repl/utils.js:786-789`:
```js
function isObjectLiteral(code) {
  return /^\s*{/.test(code) && !/;\s*$/.test(code);
}
```

### Fix 3: Class Declaration Persistence

**File:** `src/ast/P.zig` - `applyReplAsyncTransform()` in the class handling section

Change from:
```zig
// let Foo; (hoisted)
try hoisted_stmts.append(p.s(S.Local{ .kind = .k_let, ... }));
```

To:
```zig
// var Foo; (hoisted) - use var so it becomes context property
try hoisted_stmts.append(p.s(S.Local{ .kind = .k_var, ... }));
// Also add: this.Foo = Foo; assignment after class declaration
```

---

## Node.js REPL Test Cases to Match

From `vendor/node/test/parallel/test-repl-preprocess-top-level-await.js`:

| Input | Expected Output |
|-------|-----------------|
| `await 0` | `(async () => { return { value: (await 0) } })()` |
| `var a = await 1` | `var a; (async () => { void (a = await 1) })()` |
| `let a = await 1` | `let a; (async () => { void (a = await 1) })()` |
| `const a = await 1` | `let a; (async () => { void (a = await 1) })()` |
| `await 0; function foo() {}` | `var foo; (async () => { await 0; this.foo = foo; function foo() {} })()` |
| `await 0; class Foo {}` | `let Foo; (async () => { await 0; Foo=class Foo {} })()` |
| `var {a} = {a:1}, [b] = [1]` | `var a, b; (async () => { void ( ({a} = {a:1}), ([b] = [1])) })()` |

---

## Files to Modify

| File | Changes |
|------|---------|
| `src/ast/P.zig` | Fix value wrapper format, fix class hoisting to use var |
| `src/bun.js/api/JSTranspiler.zig` | Add object literal pre-check |
| `src/ast/js_printer.zig` | May need to check object literal printing |
| `test/js/bun/transpiler/repl-transform.test.ts` | Update tests for exact Node.js parity |

---

## Verification

1. Run Node.js preprocess test cases through Bun's transpiler
2. Verify output matches Node.js exactly (or functionally equivalent)
3. Test with vm.runInContext for variable persistence
4. Test object literal inputs: `{a: 1}`, `{foo: await bar()}`

---

## DEPRECATED - Previous Implementation (Already Done)

### 1. Add `replMode` to Bun.Transpiler API

**File**: `src/bun.js/api/JSTranspiler.zig`

Add to `Config` struct (around line 27-44):
```zig
pub const Config = struct {
    // ... existing fields ...
    repl_mode: bool = false,
```

Parse the option in `Config.fromJS()` (around line 420-430):
```zig
if (try object.getBooleanLoose(globalThis, "replMode")) |flag| {
    this.repl_mode = flag;
}
```

Apply the option in `constructor()` (around line 714-721):
```zig
transpiler.options.repl_mode = config.repl_mode;
```

### 2. Add Feature Flag to Runtime

**File**: `src/runtime.zig` (in `Runtime.Features`)

```zig
/// REPL mode: transforms code for interactive evaluation
/// - Wraps lone object literals `{...}` in parentheses
/// - Hoists variable declarations for REPL persistence
/// - Wraps last expression in { value: expr } for result capture
/// - Assigns functions to context for persistence
repl_mode: bool = false,
```

### 3. Add to BundleOptions

**File**: `src/options.zig`

Add to `BundleOptions` struct:
```zig
repl_mode: bool = false,
```

### 4. Implement REPL Transforms in Parser

**File**: `src/ast/P.zig`

#### 4a. Object Literal Detection (Parser-Level)

In REPL mode, the parser should prefer interpreting ambiguous `{...}` as object literals instead of blocks.

**Location**: `src/ast/parseStmt.zig` in statement parsing

When `repl_mode` is true and the parser sees `{` at the start of a statement:
1. Try parsing as expression statement (object literal) first
2. If that fails, fall back to block statement

This is similar to how JavaScript engines handle REPL input. The parser already has the infrastructure to do this - we just need to change the precedence in REPL mode.

```zig
// In parseStmt when repl_mode is true and we see '{'
if (p.options.features.repl_mode and p.token.tag == .t_open_brace) {
    // Try parsing as expression first
    const saved_state = p.saveState();
    if (p.tryParseExpressionStatement()) |expr_stmt| {
        return expr_stmt;
    }
    p.restoreState(saved_state);
    // Fall back to block statement
    return p.parseBlockStatement();
}
```

This handles:
- `{a: 1}` → parsed as object literal expression
- `{a: 1, b: 2}` → parsed as object literal expression
- `{ let x = 1; }` → fails as expression, parsed as block
- `{ label: break label; }` → fails as expression (break not valid in object), parsed as block
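The underlying ambiguity is visible with plain `eval`, which returns a script's completion value:

```javascript
// `{a: 1}` in statement position is a block containing the labeled statement
// `a: 1`; its completion value is 1, not an object. Parentheses force the
// object-literal reading.
console.log(eval("{a: 1}"));           // 1
console.log(eval("({a: 1})"));         // { a: 1 }
console.log(eval("{ let x = 1; x }")); // 1 — genuinely a block
```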

#### 4b. REPL Transform Pass (in toAST after visiting)

Add a new function `applyReplTransforms()` that:

1. **Detect if transform is needed**: Walk AST to check for top-level `await`
2. **Skip transform when**:
   - No `await` at top level
   - Top-level `return` statement exists
3. **When transform IS needed**:
   - Wrap entire code in `(async () => { ... })()`
   - Hoist variable declarations outside the async wrapper
   - Convert `const` to `let` for persistence
   - Wrap last expression in `return { value: (expr) }`
   - Handle function declarations (assign to `this`)
   - Handle class declarations (hoist as `let`)

**Key Logic:**
```zig
fn applyReplTransforms(p: *Parser, stmts: []Stmt) ![]Stmt {
    // 1. Check for top-level await
    const has_await = p.hasTopLevelAwait(stmts);
    const has_return = p.hasTopLevelReturn(stmts);

    if (!has_await or has_return) {
        // Just wrap last expression, no async wrapper needed
        return p.wrapLastExpression(stmts);
    }

    // 2. Collect declarations to hoist
    var hoisted = std.ArrayList(Stmt).init(p.allocator);
    var inner_stmts = std.ArrayList(Stmt).init(p.allocator);

    for (stmts) |stmt| {
        switch (stmt.data) {
            .s_local => |local| {
                // Hoist declaration, convert const→let
                try hoisted.append(p.createHoistedDecl(local));
                // Add assignment expression to inner
                try inner_stmts.append(p.createAssignmentExpr(local));
            },
            .s_function => |func| {
                // var foo; (hoisted)
                try hoisted.append(p.createVarDecl(func.name));
                // this.foo = foo; function foo() {} (inner)
                try inner_stmts.append(p.createThisAssignment(func.name));
                try inner_stmts.append(stmt);
            },
            .s_class => |class| {
                // let Foo; (hoisted)
                try hoisted.append(p.createLetDecl(class.name));
                // Foo = class Foo {} (inner)
                try inner_stmts.append(p.createClassAssignment(class));
            },
            else => try inner_stmts.append(stmt),
        }
    }

    // 3. Wrap last expression in return { value: expr }
    p.wrapLastExpressionWithReturn(&inner_stmts);

    // 4. Create async IIFE: (async () => { ...inner... })()
    const async_iife = p.createAsyncIIFE(inner_stmts.items);

    // 5. Combine: hoisted declarations + async IIFE
    try hoisted.append(async_iife);
    return hoisted.toOwnedSlice();
}
```

### 5. TypeScript Type Definitions

**File**: `packages/bun-types/bun.d.ts`

Add to `TranspilerOptions` interface (around line 1748):
```typescript
interface TranspilerOptions {
  // ... existing options ...

  /**
   * Enable REPL mode transforms:
   * - Wraps object literals in parentheses
   * - Hoists declarations for REPL persistence
   * - Wraps last expression in { value: expr } for result capture
   * - Wraps code with await in async IIFE
   */
  replMode?: boolean;
}
```

---

## Files to Modify

| File | Changes |
|------|---------|
| `src/bun.js/api/JSTranspiler.zig` | Add `repl_mode` to Config, parse from JS, apply to transpiler |
| `src/runtime.zig` | Add `repl_mode: bool` to `Runtime.Features` |
| `src/options.zig` | Add `repl_mode: bool` to `BundleOptions` |
| `src/ast/P.zig` | REPL transform pass in `toAST()` |
| `src/ast/parseStmt.zig` | Object literal vs block disambiguation in REPL mode |
| `packages/bun-types/bun.d.ts` | Add `replMode?: boolean` to `TranspilerOptions` |

---

## Test Cases

Create test file: `test/js/bun/transpiler/repl-transform.test.ts`

### Part 1: Transform Output Tests (Unit Tests)

Test exact transformation output matches expected patterns:

```typescript
import { expect, test, describe } from "bun:test";
import vm from "node:vm";

describe("Bun.Transpiler replMode - Transform Output", () => {
  const transpiler = new Bun.Transpiler({ loader: "tsx", replMode: true });

  // Based on Node.js test-repl-preprocess-top-level-await.js
  const testCases: [string, string | null][] = [
    // No await = null (no async transform, but still expression capture)
    ['0', null],

    // Basic await
    ['await 0', '(async () => { return { value: (await 0) } })()'],
    ['await 0;', '(async () => { return { value: (await 0) }; })()'],
    ['(await 0)', '(async () => { return ({ value: (await 0) }) })()'],

    // No transform for await inside async functions
    ['async function foo() { await 0; }', null],
    ['async () => await 0', null],
    ['class A { async method() { await 0 } }', null],

    // Top-level return = no transform
    ['await 0; return 0;', null],

    // Multiple await - last one gets return wrapper
    ['await 1; await 2;', '(async () => { await 1; return { value: (await 2) }; })()'],

    // Variable hoisting - var
    ['var a = await 1', 'var a; (async () => { void (a = await 1) })()'],

    // Variable hoisting - let
    ['let a = await 1', 'let a; (async () => { void (a = await 1) })()'],

    // Variable hoisting - const becomes let
    ['const a = await 1', 'let a; (async () => { void (a = await 1) })()'],

    // For loop with var - hoist var
    ['for (var i = 0; i < 1; ++i) { await i }',
     'var i; (async () => { for (void (i = 0); i < 1; ++i) { await i } })()'],

    // For loop with let - no hoist
    ['for (let i = 0; i < 1; ++i) { await i }',
     '(async () => { for (let i = 0; i < 1; ++i) { await i } })()'],

    // Destructuring with var
    ['var {a} = {a:1}, [b] = [1], {c:{d}} = {c:{d: await 1}}',
     'var a, b, d; (async () => { void ( ({a} = {a:1}), ([b] = [1]), ({c:{d}} = {c:{d: await 1}})) })()'],

    // Destructuring with let
    ['let [a, b, c] = await ([1, 2, 3])',
     'let a, b, c; (async () => { void ([a, b, c] = await ([1, 2, 3])) })()'],

    // Function declarations - assign to this
    ['await 0; function foo() {}',
     'var foo; (async () => { await 0; this.foo = foo; function foo() {} })()'],

    // Class declarations - hoist as let
    ['await 0; class Foo {}',
     'let Foo; (async () => { await 0; Foo=class Foo {} })()'],

    // Nested scopes
    ['if (await true) { var a = 1; }',
     'var a; (async () => { if (await true) { void (a = 1); } })()'],
    ['if (await true) { let a = 1; }',
     '(async () => { if (await true) { let a = 1; } })()'],

    // Mixed declarations
    ['var a = await 1; let b = 2; const c = 3;',
     'var a; let b; let c; (async () => { void (a = await 1); void (b = 2); void (c = 3); })()'],

    // for await
    ['for await (var i of asyncIterable) { i; }',
     'var i; (async () => { for await (i of asyncIterable) { i; } })()'],

    // for-of with var
    ['for (var i of [1,2,3]) { await 1; }',
     'var i; (async () => { for (i of [1,2,3]) { await 1; } })()'],

    // for-in with var
    ['for (var i in {x:1}) { await 1 }',
     'var i; (async () => { for (i in {x:1}) { await 1 } })()'],

    // Spread in destructuring
    ['var { ...rest } = await {}',
     'var rest; (async () => { void ({ ...rest } = await {}) })()'],
  ];

  for (const [input, expected] of testCases) {
    test(`transform: ${input.slice(0, 40)}...`, () => {
      const result = transpiler.transformSync(input);
      if (expected === null) {
        // No async transform expected, but expression capture may still happen
        expect(result).not.toMatch(/^\(async/);
      } else {
        expect(result.trim()).toBe(expected);
      }
    });
  }

  // Object literal detection - parser handles this automatically in REPL mode
  describe("object literal vs block disambiguation", () => {
    test("{a: 1} parsed as object literal", async () => {
      const ctx = vm.createContext({});
      const code = transpiler.transformSync("{a: 1}");
      const result = await vm.runInContext(code, ctx);
      // Should evaluate to object, not undefined (block with label)
      expect(result.value).toEqual({ a: 1 });
    });

    test("{a: 1, b: 2} parsed as object literal", async () => {
      const ctx = vm.createContext({});
      const code = transpiler.transformSync("{a: 1, b: 2}");
      const result = await vm.runInContext(code, ctx);
      expect(result.value).toEqual({ a: 1, b: 2 });
    });

    test("{ let x = 1; x } parsed as block", async () => {
      const ctx = vm.createContext({});
      const code = transpiler.transformSync("{ let x = 1; x }");
      const result = await vm.runInContext(code, ctx);
      // Block returns last expression value
      expect(result.value).toBe(1);
    });

    test("{ x: 1; y: 2 } parsed as block with labels", async () => {
      // Semicolons make this a block with labeled statements, not object
      const ctx = vm.createContext({});
      const code = transpiler.transformSync("{ x: 1; y: 2 }");
      const result = await vm.runInContext(code, ctx);
      // Block with labels returns last value
      expect(result.value).toBe(2);
    });
  });
});
```

### Part 2: Variable Persistence Tests (Integration with node:vm)

Test that variables persist across multiple REPL evaluations:

```typescript
import { expect, test, describe } from "bun:test";
import vm from "node:vm";

describe("REPL Variable Persistence", () => {
  const transpiler = new Bun.Transpiler({ loader: "tsx", replMode: true });

  // Helper to run multiple REPL lines in sequence
  async function runReplSession(lines: string[], context?: object) {
    const ctx = vm.createContext(context ?? { console });
    const results: any[] = [];

    for (const line of lines) {
      const transformed = transpiler.transformSync(line);
      const result = await vm.runInContext(transformed, ctx);
      results.push(result?.value ?? result);
    }

    return { results, context: ctx };
  }

  test("var persists across lines", async () => {
    const { results, context } = await runReplSession([
      "var x = 10",
      "x + 5",
      "x = 20",
      "x",
    ]);

    expect(results[1]).toBe(15);  // x + 5
    expect(results[3]).toBe(20);  // x after reassignment
    expect(context.x).toBe(20);   // x visible in context
  });

  test("let persists across lines (hoisted)", async () => {
    const { results } = await runReplSession([
      "let y = await Promise.resolve(100)",
      "y * 2",
    ]);

    expect(results[1]).toBe(200);
  });

  test("const becomes let, can be reassigned in later lines", async () => {
    const { results } = await runReplSession([
      "const z = await Promise.resolve(5)",
      "z",
      // In REPL, const becomes let, so a later line can reassign
      "z = 10",  // This works because const→let
      "z",
    ]);

    expect(results[1]).toBe(5);
    expect(results[3]).toBe(10);
  });

  test("function declarations persist", async () => {
    const { results, context } = await runReplSession([
      "await 1; function add(a, b) { return a + b; }",
      "add(2, 3)",
      "function multiply(a, b) { return a * b; }",  // no await
      "multiply(4, 5)",
    ]);

    expect(results[1]).toBe(5);
    expect(results[3]).toBe(20);
    expect(typeof context.add).toBe("function");
    expect(typeof context.multiply).toBe("function");
  });

  test("class declarations persist", async () => {
    const { results, context } = await runReplSession([
      "await 1; class Counter { constructor() { this.count = 0; } inc() { this.count++; } }",
      "const c = new Counter()",
      "c.inc(); c.inc(); c.count",
    ]);

    expect(results[2]).toBe(2);
    expect(typeof context.Counter).toBe("function");
  });

  test("complex session with mixed declarations", async () => {
    const { results } = await runReplSession([
      "var total = 0",
      "async function addAsync(n) { return total += await Promise.resolve(n); }",
      "await addAsync(10)",
      "await addAsync(20)",
      "total",
    ]);

    expect(results[2]).toBe(10);
    expect(results[3]).toBe(30);
    expect(results[4]).toBe(30);
  });

  test("destructuring assignment persists", async () => {
    const { results, context } = await runReplSession([
      "var { a, b } = await Promise.resolve({ a: 1, b: 2 })",
      "a + b",
      "var [x, y, z] = [10, 20, 30]",
      "x + y + z",
    ]);

    expect(results[1]).toBe(3);
    expect(results[3]).toBe(60);
    expect(context.a).toBe(1);
    expect(context.x).toBe(10);
  });
});
```

### Part 3: eval() Scoping Semantics Tests

Test that REPL behaves like eval() with proper scoping:

```typescript
import { expect, test, describe } from "bun:test";
import vm from "node:vm";

describe("REPL eval() Scoping Semantics", () => {
  const transpiler = new Bun.Transpiler({ loader: "tsx", replMode: true });

  test("var hoists to global context", async () => {
    const ctx = vm.createContext({});

    const code = transpiler.transformSync("var globalVar = 42");
    await vm.runInContext(code, ctx);

    expect(ctx.globalVar).toBe(42);
  });

  test("let/const hoisted for REPL but scoped correctly", async () => {
    const ctx = vm.createContext({});

    // With await, let is hoisted outside async wrapper
    const code1 = transpiler.transformSync("let x = await 1");
    await vm.runInContext(code1, ctx);
    expect(ctx.x).toBe(1);

    // Without await, let behavior depends on implementation
    const code2 = transpiler.transformSync("let y = 2");
    await vm.runInContext(code2, ctx);
    // y should still be accessible in REPL context
  });

  test("block-scoped let does NOT leak", async () => {
    const ctx = vm.createContext({});

    const code = transpiler.transformSync("if (await true) { let blockScoped = 1; }");
    await vm.runInContext(code, ctx);

    // blockScoped should NOT be visible in context
    expect(ctx.blockScoped).toBeUndefined();
  });

  test("function in block hoists with var (sloppy mode)", async () => {
    const ctx = vm.createContext({});

    const code = transpiler.transformSync("if (await true) { function blockFn() { return 42; } }");
    await vm.runInContext(code, ctx);

    // In sloppy mode, function in block hoists to function scope
    expect(typeof ctx.blockFn).toBe("function");
    expect(ctx.blockFn()).toBe(42);
  });

  test("this binding in function declarations", async () => {
    const ctx = vm.createContext({});

    const code = transpiler.transformSync("await 1; function greet() { return 'hello'; }");
    await vm.runInContext(code, ctx);

    // Function should be assigned to this (context) for REPL persistence
    expect(ctx.greet()).toBe("hello");
  });

  test("async function expression captures result", async () => {
    const ctx = vm.createContext({});

    const code = transpiler.transformSync("await (async () => { return 42; })()");
    const result = await vm.runInContext(code, ctx);

    expect(result.value).toBe(42);
  });

  test("Promise result NOT auto-awaited due to { value: } wrapper", async () => {
    const ctx = vm.createContext({});

    // Without wrapper, result would be auto-awaited
    // With { value: } wrapper, we get the Promise object
    const code = transpiler.transformSync("await Promise.resolve(Promise.resolve(42))");
    const result = await vm.runInContext(code, ctx);

    // The inner Promise should be in value, not auto-resolved
    expect(result.value).toBeInstanceOf(Promise);
    expect(await result.value).toBe(42);
  });
});
```

### Part 4: Edge Cases and Error Handling

```typescript
import { expect, test, describe } from "bun:test";
import vm from "node:vm";

describe("REPL Edge Cases", () => {
  const transpiler = new Bun.Transpiler({ loader: "tsx", replMode: true });

  test("empty input", () => {
    const result = transpiler.transformSync("");
    expect(result).toBe("");
  });

  test("whitespace only", () => {
    const result = transpiler.transformSync("   \n\t  ");
    expect(result.trim()).toBe("");
  });

  test("comment only", () => {
    const result = transpiler.transformSync("// just a comment");
    expect(result).toContain("// just a comment");
  });

  test("multiline input", () => {
    const input = `
      var x = await 1;
      var y = await 2;
      x + y
    `;
    const result = transpiler.transformSync(input);
    expect(result).toContain("var x");
    expect(result).toContain("var y");
    expect(result).toContain("async");
  });

  test("TypeScript syntax", () => {
    const input = "const x: number = await Promise.resolve(42)";
    const result = transpiler.transformSync(input);
    expect(result).not.toContain(": number"); // Types stripped
    expect(result).toContain("let x");
  });

  test("JSX in REPL", () => {
    const input = "await Promise.resolve(<div>Hello</div>)";
    const result = transpiler.transformSync(input);
    expect(result).toContain("async");
  });

  test("import expression (dynamic)", () => {
    // Dynamic imports should work fine
    const input = "await import('fs')";
    const result = transpiler.transformSync(input);
    expect(result).toContain("import");
  });

  test("nested await expressions", () => {
    const input = "await (await Promise.resolve(Promise.resolve(1)))";
    const result = transpiler.transformSync(input);
    expect(result).toContain("{ value:");
  });

  test("for-await-of", () => {
    const input = `
      async function* gen() { yield 1; yield 2; }
      for await (const x of gen()) { console.log(x); }
    `;
    const result = transpiler.transformSync(input);
    expect(result).toContain("var gen");
    expect(result).toContain("for await");
  });
});
```

---

## Verification Plan

### 1. Build and Basic Tests
```bash
# Build Bun with changes
bun bd

# Run the REPL transform tests
bun bd test test/js/bun/transpiler/repl-transform.test.ts
```

### 2. Manual Transform Output Verification
```typescript
// test-repl-manual.ts
const t = new Bun.Transpiler({ loader: "tsx", replMode: true });

// Object literal
console.log("Object literal:");
console.log(t.transformSync("{a: 1}"));
// Expected: contains "({a: 1})"

// Basic await
console.log("\nBasic await:");
console.log(t.transformSync("await 0"));
// Expected: (async () => { return { value: (await 0) } })()

// Variable hoisting
console.log("\nVar hoisting:");
console.log(t.transformSync("var x = await 1"));
// Expected: var x; (async () => { void (x = await 1) })()

// const → let
console.log("\nConst to let:");
console.log(t.transformSync("const x = await 1"));
// Expected: let x; (async () => { void (x = await 1) })()

// Function hoisting
console.log("\nFunction:");
console.log(t.transformSync("await 0; function foo() {}"));
// Expected: var foo; (async () => { await 0; this.foo = foo; function foo() {} })()
```

### 3. Full REPL Session Simulation
```typescript
// test-repl-session.ts
import vm from "node:vm";

const t = new Bun.Transpiler({ loader: "tsx", replMode: true });
const ctx = vm.createContext({ console, Promise });

async function repl(code: string) {
  const transformed = t.transformSync(code);
  console.log(`> ${code}`);
  console.log(`[transformed]: ${transformed}`);
  const result = await vm.runInContext(transformed, ctx);
  console.log(`= ${JSON.stringify(result?.value ?? result)}\n`);
  return result?.value ?? result;
}

// Test session
await repl("var counter = 0");
await repl("function increment() { return ++counter; }");
await repl("increment()");  // Should be 1
await repl("increment()");  // Should be 2
await repl("counter");      // Should be 2

await repl("const data = await Promise.resolve({ x: 10, y: 20 })");
await repl("data.x + data.y");  // Should be 30

await repl("class Point { constructor(x, y) { this.x = x; this.y = y; } }");
await repl("const p = new Point(3, 4)");
await repl("Math.sqrt(p.x**2 + p.y**2)");  // Should be 5
```

### 4. Verify No Regressions
```bash
# Run existing transpiler tests
bun bd test test/js/bun/transpiler/

# Run existing vm tests
bun bd test test/js/node/vm/
```

### 5. Cross-check with Node.js (Optional)
Compare transform outputs with Node.js's `processTopLevelAwait`:
```typescript
// Compare a few key transforms with Node.js output
const cases = [
  "await 0",
  "var x = await 1",
  "await 0; function foo() {}",
];
// Verify Bun output matches Node.js patterns
```
</claude-plan>
2026-01-19 00:22:36 -08:00


pub const options = @import("./options.zig");
pub const MacroJSValueType = jsc.JSValue;
pub const EntryPoints = @import("./bundler/entry_points.zig");
pub const ParseResult = struct {
source: logger.Source,
loader: options.Loader,
ast: js_ast.Ast,
already_bundled: AlreadyBundled = .none,
input_fd: ?StoredFileDescriptorType = null,
empty: bool = false,
pending_imports: _resolver.PendingResolution.List = .{},
runtime_transpiler_cache: ?*bun.jsc.RuntimeTranspilerCache = null,
pub const AlreadyBundled = union(enum) {
none: void,
source_code: void,
source_code_cjs: void,
bytecode: []u8,
bytecode_cjs: []u8,
pub fn bytecodeSlice(this: AlreadyBundled) []u8 {
return switch (this) {
inline .bytecode, .bytecode_cjs => |slice| slice,
else => &.{},
};
}
pub fn isBytecode(this: AlreadyBundled) bool {
return this == .bytecode or this == .bytecode_cjs;
}
pub fn isCommonJS(this: AlreadyBundled) bool {
return this == .source_code_cjs or this == .bytecode_cjs;
}
};
pub fn isPendingImport(this: *const ParseResult, id: u32) bool {
const import_record_ids = this.pending_imports.items(.import_record_id);
return std.mem.indexOfScalar(u32, import_record_ids, id) != null;
}
/// **DO NOT CALL THIS UNDER NORMAL CIRCUMSTANCES**
/// Normally, we allocate each AST in an arena and free all at once
/// So this function only should be used when we globally allocate an AST
pub fn deinit(this: *ParseResult) void {
_resolver.PendingResolution.deinitListItems(this.pending_imports, bun.default_allocator);
this.pending_imports.deinit(bun.default_allocator);
this.ast.deinit();
bun.default_allocator.free(@constCast(this.source.contents));
}
};
pub const PluginRunner = struct {
global_object: *jsc.JSGlobalObject,
allocator: std.mem.Allocator,
pub fn extractNamespace(specifier: string) string {
const colon = strings.indexOfChar(specifier, ':') orelse return "";
if (Environment.isWindows and
colon == 1 and
specifier.len > 3 and
bun.path.isSepAny(specifier[2]) and
((specifier[0] >= 'a' and specifier[0] <= 'z') or (specifier[0] >= 'A' and specifier[0] <= 'Z')))
return "";
return specifier[0..colon];
}
pub fn couldBePlugin(specifier: string) bool {
if (strings.lastIndexOfChar(specifier, '.')) |last_dot| {
const ext = specifier[last_dot + 1 ..];
// '.' followed by either a letter or a non-ascii character
// maybe there are non-ascii file extensions?
// we mostly want to cheaply rule out "../" and ".." and "./"
if (ext.len > 0 and ((ext[0] >= 'a' and ext[0] <= 'z') or (ext[0] >= 'A' and ext[0] <= 'Z') or ext[0] > 127))
return true;
}
return (!std.fs.path.isAbsolute(specifier) and strings.containsChar(specifier, ':'));
}
pub fn onResolve(
this: *PluginRunner,
specifier: []const u8,
importer: []const u8,
log: *logger.Log,
loc: logger.Loc,
target: jsc.JSGlobalObject.BunPluginTarget,
) bun.JSError!?Fs.Path {
var global = this.global_object;
const namespace_slice = extractNamespace(specifier);
const namespace = if (namespace_slice.len > 0 and !strings.eqlComptime(namespace_slice, "file"))
bun.String.init(namespace_slice)
else
bun.String.empty;
const on_resolve_plugin = try global.runOnResolvePlugins(
namespace,
bun.String.init(specifier).substring(if (namespace.length() > 0) namespace.length() + 1 else 0),
bun.String.init(importer),
target,
) orelse return null;
const path_value = try on_resolve_plugin.get(global, "path") orelse return null;
if (path_value.isEmptyOrUndefinedOrNull()) return null;
if (!path_value.isString()) {
log.addError(null, loc, "Expected \"path\" to be a string") catch unreachable;
return null;
}
const file_path = try path_value.toBunString(global);
defer file_path.deref();
if (file_path.length() == 0) {
log.addError(
null,
loc,
"Expected \"path\" to be a non-empty string in onResolve plugin",
) catch unreachable;
return null;
} else if
// TODO: validate this better
(file_path.eqlComptime(".") or
file_path.eqlComptime("..") or
file_path.eqlComptime("...") or
file_path.eqlComptime(" "))
{
log.addError(
null,
loc,
"Invalid file path from onResolve plugin",
) catch unreachable;
return null;
}
var static_namespace = true;
const user_namespace: bun.String = brk: {
if (try on_resolve_plugin.get(global, "namespace")) |namespace_value| {
if (!namespace_value.isString()) {
log.addError(null, loc, "Expected \"namespace\" to be a string") catch unreachable;
return null;
}
const namespace_str = try namespace_value.toBunString(global);
if (namespace_str.length() == 0) {
namespace_str.deref();
break :brk bun.String.init("file");
}
if (namespace_str.eqlComptime("file")) {
namespace_str.deref();
break :brk bun.String.init("file");
}
if (namespace_str.eqlComptime("bun")) {
namespace_str.deref();
break :brk bun.String.init("bun");
}
if (namespace_str.eqlComptime("node")) {
namespace_str.deref();
break :brk bun.String.init("node");
}
static_namespace = false;
break :brk namespace_str;
}
break :brk bun.String.init("file");
};
defer user_namespace.deref();
if (static_namespace) {
return Fs.Path.initWithNamespace(
std.fmt.allocPrint(this.allocator, "{f}", .{file_path}) catch unreachable,
user_namespace.byteSlice(),
);
} else {
return Fs.Path.initWithNamespace(
std.fmt.allocPrint(this.allocator, "{f}", .{file_path}) catch unreachable,
std.fmt.allocPrint(this.allocator, "{f}", .{user_namespace}) catch unreachable,
);
}
}
pub fn onResolveJSC(this: *const PluginRunner, namespace: bun.String, specifier: bun.String, importer: bun.String, target: jsc.JSGlobalObject.BunPluginTarget) bun.JSError!?jsc.ErrorableString {
var global = this.global_object;
const on_resolve_plugin = try global.runOnResolvePlugins(
if (namespace.length() > 0 and !namespace.eqlComptime("file"))
namespace
else
bun.String.static(""),
specifier,
importer,
target,
) orelse return null;
if (!on_resolve_plugin.isObject()) return null;
const path_value = try on_resolve_plugin.get(global, "path") orelse return null;
if (path_value.isEmptyOrUndefinedOrNull()) return null;
if (!path_value.isString()) {
return jsc.ErrorableString.err(
error.JSErrorObject,
bun.String.static("Expected \"path\" to be a string in onResolve plugin").toErrorInstance(this.global_object),
);
}
const file_path = try path_value.toBunString(global);
if (file_path.length() == 0) {
return jsc.ErrorableString.err(
error.JSErrorObject,
bun.String.static("Expected \"path\" to be a non-empty string in onResolve plugin").toErrorInstance(this.global_object),
);
} else if
// TODO: validate this better
(file_path.eqlComptime(".") or
file_path.eqlComptime("..") or
file_path.eqlComptime("...") or
file_path.eqlComptime(" "))
{
return jsc.ErrorableString.err(
error.JSErrorObject,
bun.String.static("\"path\" is invalid in onResolve plugin").toErrorInstance(this.global_object),
);
}
var static_namespace = true;
const user_namespace: bun.String = brk: {
if (try on_resolve_plugin.get(global, "namespace")) |namespace_value| {
if (!namespace_value.isString()) {
return jsc.ErrorableString.err(
error.JSErrorObject,
bun.String.static("Expected \"namespace\" to be a string").toErrorInstance(this.global_object),
);
}
const namespace_str = try namespace_value.toBunString(global);
if (namespace_str.length() == 0) {
break :brk bun.String.static("file");
}
if (namespace_str.eqlComptime("file")) {
defer namespace_str.deref();
break :brk bun.String.static("file");
}
if (namespace_str.eqlComptime("bun")) {
defer namespace_str.deref();
break :brk bun.String.static("bun");
}
if (namespace_str.eqlComptime("node")) {
defer namespace_str.deref();
break :brk bun.String.static("node");
}
static_namespace = false;
break :brk namespace_str;
}
break :brk bun.String.static("file");
};
defer user_namespace.deref();
// Our super slow way of cloning the string into memory owned by jsc
const combined_string = std.fmt.allocPrint(this.allocator, "{f}:{f}", .{ user_namespace, file_path }) catch unreachable;
var out_ = bun.String.init(combined_string);
const jsval = out_.toJS(this.global_object);
const out = jsval.toBunString(this.global_object) catch @panic("unreachable");
this.allocator.free(combined_string);
return jsc.ErrorableString.ok(out);
}
};
/// This structure was the JavaScript transpiler before bundle_v2 was written. It now
/// acts mostly as a configuration object, but it also contains stateful logic around
/// logging errors (.log) and module resolution (.resolve_queue)
///
/// This object is not exclusive to bundle_v2/Bun.build; one of these is stored
/// on every VM so that the options can be used for transpilation.
pub const Transpiler = struct {
options: options.BundleOptions,
log: *logger.Log,
allocator: std.mem.Allocator,
result: options.TransformResult,
resolver: Resolver,
fs: *Fs.FileSystem,
output_files: std.array_list.Managed(options.OutputFile),
resolve_results: *ResolveResults,
resolve_queue: ResolveQueue,
elapsed: u64 = 0,
needs_runtime: bool = false,
router: ?Router = null,
source_map: options.SourceMapOption = .none,
linker: Linker,
timer: SystemTimer,
env: *DotEnv.Loader,
macro_context: ?js_ast.Macro.MacroContext = null,
pub const isCacheEnabled = false;
pub inline fn getPackageManager(this: *Transpiler) *PackageManager {
return this.resolver.getPackageManager();
}
pub fn setLog(this: *Transpiler, log: *logger.Log) void {
this.log = log;
this.linker.log = log;
this.resolver.log = log;
}
// TODO: remove this method. it does not make sense
pub fn setAllocator(this: *Transpiler, allocator: std.mem.Allocator) void {
this.allocator = allocator;
this.linker.allocator = allocator;
this.resolver.allocator = allocator;
}
fn _resolveEntryPoint(transpiler: *Transpiler, entry_point: string) !_resolver.Result {
return transpiler.resolver.resolveWithFramework(transpiler.fs.top_level_dir, entry_point, .entry_point_build) catch |err| {
// Relative entry points that were not resolved to a node_modules package are
// interpreted as relative to the current working directory.
if (!std.fs.path.isAbsolute(entry_point) and
!(strings.hasPrefix(entry_point, "./") or strings.hasPrefix(entry_point, ".\\")))
{
brk: {
return transpiler.resolver.resolve(
transpiler.fs.top_level_dir,
try strings.append(transpiler.allocator, "./", entry_point),
.entry_point_build,
) catch {
// return the original error
break :brk;
};
}
}
return err;
};
}
pub fn resolveEntryPoint(transpiler: *Transpiler, entry_point: string) !_resolver.Result {
return _resolveEntryPoint(transpiler, entry_point) catch |err| {
var cache_bust_buf: bun.PathBuffer = undefined;
// Bust directory cache and try again
const buster_name = name: {
if (std.fs.path.isAbsolute(entry_point)) {
if (std.fs.path.dirname(entry_point)) |dir| {
// Normalized with trailing slash
break :name bun.strings.normalizeSlashesOnly(&cache_bust_buf, dir, std.fs.path.sep);
}
}
var parts = [_]string{
entry_point,
bun.pathLiteral(".."),
};
break :name bun.path.joinAbsStringBufZ(
transpiler.fs.top_level_dir,
&cache_bust_buf,
&parts,
.auto,
);
};
// Only re-query if we previously had something cached.
if (transpiler.resolver.bustDirCache(bun.strings.withoutTrailingSlashWindowsPath(buster_name))) {
if (_resolveEntryPoint(transpiler, entry_point)) |result|
return result
else |_| {
// ignore this error, we will print the original error
}
}
bun.handleOom(transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{s} resolving \"{s}\" (entry point)", .{ @errorName(err), entry_point }));
return err;
};
}
pub fn init(
allocator: std.mem.Allocator,
log: *logger.Log,
opts: api.TransformOptions,
env_loader_: ?*DotEnv.Loader,
) !Transpiler {
js_ast.Expr.Data.Store.create();
js_ast.Stmt.Data.Store.create();
const fs = try Fs.FileSystem.init(opts.absolute_working_dir);
const bundle_options = try options.BundleOptions.fromApi(
allocator,
fs,
log,
opts,
);
var env_loader: *DotEnv.Loader = env_loader_ orelse DotEnv.instance orelse brk: {
const map = try allocator.create(DotEnv.Map);
map.* = DotEnv.Map.init(allocator);
const loader = try allocator.create(DotEnv.Loader);
loader.* = DotEnv.Loader.init(map, allocator);
break :brk loader;
};
if (DotEnv.instance == null) {
DotEnv.instance = env_loader;
}
// hide elapsed time when loglevel is warn or error
env_loader.quiet = !log.level.atLeast(.info);
// var pool = try allocator.create(ThreadPool);
// try pool.init(ThreadPool.InitConfig{
// .allocator = allocator,
// });
const resolve_results = try allocator.create(ResolveResults);
resolve_results.* = ResolveResults.init(allocator);
return Transpiler{
.options = bundle_options,
.fs = fs,
.allocator = allocator,
.timer = SystemTimer.start() catch @panic("Timer fail"),
.resolver = Resolver.init1(allocator, log, fs, bundle_options),
.log = log,
// .thread_pool = pool,
.linker = undefined,
.result = options.TransformResult{ .outbase = bundle_options.output_dir },
.resolve_results = resolve_results,
.resolve_queue = ResolveQueue.init(allocator),
.output_files = std.array_list.Managed(options.OutputFile).init(allocator),
.env = env_loader,
};
}
pub fn deinit(this: *Transpiler) void {
this.options.deinit(this.allocator);
this.log.deinit();
this.resolver.deinit();
this.fs.deinit();
}
pub fn configureLinkerWithAutoJSX(transpiler: *Transpiler, auto_jsx: bool) void {
transpiler.linker = Linker.init(
transpiler.allocator,
transpiler.log,
&transpiler.resolve_queue,
&transpiler.options,
&transpiler.resolver,
transpiler.resolve_results,
transpiler.fs,
);
if (auto_jsx) {
// Most of the time, this will already be cached
if (transpiler.resolver.readDirInfo(transpiler.fs.top_level_dir) catch null) |root_dir| {
if (root_dir.tsconfig_json) |tsconfig| {
// If we don't explicitly pass JSX, try to get it from the root tsconfig
if (transpiler.options.transform_options.jsx == null) {
transpiler.options.jsx = tsconfig.jsx;
}
transpiler.options.emit_decorator_metadata = tsconfig.emit_decorator_metadata;
}
}
}
}
pub fn configureLinker(transpiler: *Transpiler) void {
transpiler.configureLinkerWithAutoJSX(true);
}
pub fn runEnvLoader(this: *Transpiler, skip_default_env: bool) !void {
switch (this.options.env.behavior) {
.prefix, .load_all, .load_all_without_inlining => {
// Step 1. Load the project root.
const dir_info = this.resolver.readDirInfo(this.fs.top_level_dir) catch return orelse return;
if (dir_info.tsconfig_json) |tsconfig| {
this.options.jsx = tsconfig.mergeJSX(this.options.jsx);
}
const dir = dir_info.getEntries(this.resolver.generation) orelse return;
// Process always has highest priority.
const was_production = this.options.production;
try this.env.loadProcess();
const has_production_env = this.env.isProduction();
if (!was_production and has_production_env) {
this.options.setProduction(true);
this.resolver.opts.setProduction(true);
}
if (this.options.isTest() or this.env.isTest()) {
try this.env.load(dir, this.options.env.files, .@"test", skip_default_env);
} else if (this.options.production) {
try this.env.load(dir, this.options.env.files, .production, skip_default_env);
} else {
try this.env.load(dir, this.options.env.files, .development, skip_default_env);
}
},
.disable => {
try this.env.loadProcess();
if (this.env.isProduction()) {
this.options.setProduction(true);
this.resolver.opts.setProduction(true);
}
},
else => {},
}
if (strings.eqlComptime(this.env.get("BUN_DISABLE_TRANSPILER") orelse "0", "1")) {
this.options.disable_transpilation = true;
}
}
// This must be run after a framework is configured, if a framework is enabled
pub fn configureDefines(this: *Transpiler) !void {
if (this.options.defines_loaded) {
return;
}
if (this.options.target == .bun_macro) {
this.options.env.behavior = .prefix;
this.options.env.prefix = "BUN_";
}
try this.runEnvLoader(this.options.env.disable_default_env_files);
var is_production = this.env.isProduction();
js_ast.Expr.Data.Store.create();
js_ast.Stmt.Data.Store.create();
defer js_ast.Expr.Data.Store.reset();
defer js_ast.Stmt.Data.Store.reset();
try this.options.loadDefines(this.allocator, this.env, &this.options.env);
var is_development = false;
if (this.options.define.dots.get("NODE_ENV")) |NODE_ENV| {
if (NODE_ENV.len > 0 and NODE_ENV[0].data.value == .e_string) {
if (NODE_ENV[0].data.value.e_string.eqlComptime("production")) {
is_production = true;
} else if (NODE_ENV[0].data.value.e_string.eqlComptime("development")) {
is_development = true;
}
}
}
if (is_development) {
this.options.setProduction(false);
this.resolver.opts.setProduction(false);
this.options.force_node_env = .development;
this.resolver.opts.force_node_env = .development;
} else if (is_production) {
this.options.setProduction(true);
this.resolver.opts.setProduction(true);
}
}
pub fn resetStore(_: *const Transpiler) void {
js_ast.Expr.Data.Store.reset();
js_ast.Stmt.Data.Store.reset();
}
pub noinline fn dumpEnvironmentVariables(transpiler: *const Transpiler) void {
@branchHint(.cold);
const opts = std.json.Stringify.Options{
.whitespace = .indent_2,
};
Output.flush();
var w: std.json.Stringify = .{ .writer = Output.writer(), .options = opts };
w.write(transpiler.env.map.*) catch unreachable;
Output.flush();
}
pub const BuildResolveResultPair = struct {
written: usize,
input_fd: ?StoredFileDescriptorType,
empty: bool = false,
};
fn buildWithResolveResultEager(
transpiler: *Transpiler,
resolve_result: _resolver.Result,
comptime import_path_format: options.BundleOptions.ImportPathFormat,
comptime Outstream: type,
outstream: Outstream,
client_entry_point_: ?*EntryPoints.ClientEntryPoint,
) !?options.OutputFile {
_ = outstream;
if (resolve_result.flags.is_external) {
return null;
}
var file_path = (resolve_result.pathConst() orelse return null).*;
// Step 1. Parse & scan
const loader = transpiler.options.loader(file_path.name.ext);
if (client_entry_point_) |client_entry_point| {
file_path = client_entry_point.source.path;
}
file_path.pretty = Linker.relative_paths_list.append(string, transpiler.fs.relativeTo(file_path.text)) catch unreachable;
var output_file = options.OutputFile{
.src_path = file_path,
.loader = loader,
.value = undefined,
.side = null,
.entry_point_index = null,
.output_kind = .chunk,
};
switch (loader) {
.jsx, .tsx, .js, .ts, .json, .jsonc, .toml, .yaml, .text => {
var result = transpiler.parse(
ParseOptions{
.allocator = transpiler.allocator,
.path = file_path,
.loader = loader,
.dirname_fd = resolve_result.dirname_fd,
.file_descriptor = null,
.file_hash = null,
.macro_remappings = transpiler.options.macro_remap,
.jsx = resolve_result.jsx,
.emit_decorator_metadata = resolve_result.flags.emit_decorator_metadata,
},
client_entry_point_,
) orelse {
return null;
};
if (!transpiler.options.transform_only) {
if (!transpiler.options.target.isBun())
try transpiler.linker.link(
file_path,
&result,
transpiler.options.origin,
import_path_format,
false,
false,
)
else
try transpiler.linker.link(
file_path,
&result,
transpiler.options.origin,
import_path_format,
false,
true,
);
}
const buffer_writer = js_printer.BufferWriter.init(transpiler.allocator);
var writer = js_printer.BufferPrinter.init(buffer_writer);
output_file.size = switch (transpiler.options.target) {
.browser, .node => try transpiler.print(
result,
*js_printer.BufferPrinter,
&writer,
.esm,
),
.bun, .bun_macro, .bake_server_components_ssr => try transpiler.print(
result,
*js_printer.BufferPrinter,
&writer,
.esm_ascii,
),
};
output_file.value = .{
.buffer = .{
.allocator = transpiler.allocator,
.bytes = writer.ctx.written,
},
};
},
.dataurl, .base64 => {
Output.panic("TODO: dataurl, base64", .{}); // TODO
},
.css => {
const alloc = transpiler.allocator;
const entry = transpiler.resolver.caches.fs.readFileWithAllocator(
transpiler.allocator,
transpiler.fs,
file_path.text,
resolve_result.dirname_fd,
false,
null,
) catch |err| {
transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{s} reading \"{s}\"", .{ @errorName(err), file_path.pretty }) catch {};
return null;
};
var opts = bun.css.ParserOptions.default(alloc, transpiler.log);
const css_module_suffix = ".module.css";
const enable_css_modules = file_path.text.len > css_module_suffix.len and
strings.eqlComptime(file_path.text[file_path.text.len - css_module_suffix.len ..], css_module_suffix);
if (enable_css_modules) {
opts.filename = bun.path.basename(file_path.text);
opts.css_modules = bun.css.CssModuleConfig{};
}
var sheet, var extra = switch (bun.css.StyleSheet(bun.css.DefaultAtRule).parse(
alloc,
entry.contents,
opts,
null,
// TODO: DO WE EVEN HAVE SOURCE INDEX IN THIS TRANSPILER.ZIG file??
bun.bundle_v2.Index.invalid,
)) {
.result => |v| v,
.err => |e| {
transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{f} parsing", .{e}) catch unreachable;
return null;
},
};
if (sheet.minify(alloc, bun.css.MinifyOptions.default(), &extra).asErr()) |e| {
bun.handleOom(transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{f} while minifying", .{e.kind}));
return null;
}
const symbols = bun.ast.Symbol.Map{};
const result = switch (sheet.toCss(
alloc,
bun.css.PrinterOptions{
.targets = bun.css.Targets.forBundlerTarget(transpiler.options.target),
.minify = transpiler.options.minify_whitespace,
},
null,
null,
&symbols,
)) {
.result => |v| v,
.err => |e| {
bun.handleOom(transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{f} while printing", .{e}));
return null;
},
};
output_file.value = .{ .buffer = .{ .allocator = alloc, .bytes = result.code } };
},
.html, .bunsh, .sqlite_embedded, .sqlite, .wasm, .file, .napi => {
const hashed_name = try transpiler.linker.getHashedFilename(file_path, null);
var pathname = try transpiler.allocator.alloc(u8, hashed_name.len + file_path.name.ext.len);
bun.copy(u8, pathname, hashed_name);
bun.copy(u8, pathname[hashed_name.len..], file_path.name.ext);
output_file.value = .{
.copy = options.OutputFile.FileOperation{
.pathname = pathname,
.dir = if (transpiler.options.output_dir_handle) |output_handle|
.fromStdDir(output_handle)
else
.invalid,
.is_outdir = true,
},
};
},
}
return output_file;
}
fn printWithSourceMapMaybe(
transpiler: *Transpiler,
ast: js_ast.Ast,
source: *const logger.Source,
comptime Writer: type,
writer: Writer,
comptime format: js_printer.Format,
comptime enable_source_map: bool,
source_map_context: ?js_printer.SourceMapHandler,
runtime_transpiler_cache: ?*bun.jsc.RuntimeTranspilerCache,
) !usize {
const tracer = if (enable_source_map)
bun.perf.trace("JSPrinter.printWithSourceMap")
else
bun.perf.trace("JSPrinter.print");
defer tracer.end();
const symbols = js_ast.Symbol.NestedList.fromBorrowedSliceDangerous(&.{ast.symbols});
return switch (format) {
.cjs => try js_printer.printCommonJS(
Writer,
writer,
ast,
js_ast.Symbol.Map.initList(symbols),
source,
false,
.{
.bundling = false,
.runtime_imports = ast.runtime_imports,
.require_ref = ast.require_ref,
.css_import_behavior = transpiler.options.cssImportBehavior(),
.source_map_handler = source_map_context,
.minify_whitespace = transpiler.options.minify_whitespace,
.minify_syntax = transpiler.options.minify_syntax,
.minify_identifiers = transpiler.options.minify_identifiers,
.transform_only = transpiler.options.transform_only,
.runtime_transpiler_cache = runtime_transpiler_cache,
.print_dce_annotations = transpiler.options.emit_dce_annotations,
.hmr_ref = ast.wrapper_ref,
.mangled_props = null,
},
enable_source_map,
),
.esm => try js_printer.printAst(
Writer,
writer,
ast,
js_ast.Symbol.Map.initList(symbols),
source,
false,
.{
.bundling = false,
.runtime_imports = ast.runtime_imports,
.require_ref = ast.require_ref,
.source_map_handler = source_map_context,
.css_import_behavior = transpiler.options.cssImportBehavior(),
.minify_whitespace = transpiler.options.minify_whitespace,
.minify_syntax = transpiler.options.minify_syntax,
.minify_identifiers = transpiler.options.minify_identifiers,
.transform_only = transpiler.options.transform_only,
.import_meta_ref = ast.import_meta_ref,
.runtime_transpiler_cache = runtime_transpiler_cache,
.print_dce_annotations = transpiler.options.emit_dce_annotations,
.hmr_ref = ast.wrapper_ref,
.mangled_props = null,
},
enable_source_map,
),
.esm_ascii => switch (transpiler.options.target.isBun()) {
inline else => |is_bun| try js_printer.printAst(
Writer,
writer,
ast,
js_ast.Symbol.Map.initList(symbols),
source,
is_bun,
.{
.bundling = false,
.runtime_imports = ast.runtime_imports,
.require_ref = ast.require_ref,
.css_import_behavior = transpiler.options.cssImportBehavior(),
.source_map_handler = source_map_context,
.minify_whitespace = transpiler.options.minify_whitespace,
.minify_syntax = transpiler.options.minify_syntax,
.minify_identifiers = transpiler.options.minify_identifiers,
.transform_only = transpiler.options.transform_only,
.module_type = if (is_bun and transpiler.options.transform_only)
// this is for when using `bun build --no-bundle`
// it should copy what was passed for the cli
transpiler.options.output_format
else if (ast.exports_kind == .cjs)
.cjs
else
.esm,
.inline_require_and_import_errors = false,
.import_meta_ref = ast.import_meta_ref,
.runtime_transpiler_cache = runtime_transpiler_cache,
.target = transpiler.options.target,
.print_dce_annotations = transpiler.options.emit_dce_annotations,
.hmr_ref = ast.wrapper_ref,
.mangled_props = null,
},
enable_source_map,
),
},
else => unreachable,
};
}
pub fn print(
transpiler: *Transpiler,
result: ParseResult,
comptime Writer: type,
writer: Writer,
comptime format: js_printer.Format,
) !usize {
return transpiler.printWithSourceMapMaybe(
result.ast,
&result.source,
Writer,
writer,
format,
false,
null,
null,
);
}
pub fn printWithSourceMap(
transpiler: *Transpiler,
result: ParseResult,
comptime Writer: type,
writer: Writer,
comptime format: js_printer.Format,
handler: js_printer.SourceMapHandler,
) !usize {
if (bun.feature_flag.BUN_FEATURE_FLAG_DISABLE_SOURCE_MAPS.get()) {
return transpiler.printWithSourceMapMaybe(
result.ast,
&result.source,
Writer,
writer,
format,
false,
handler,
result.runtime_transpiler_cache,
);
}
return transpiler.printWithSourceMapMaybe(
result.ast,
&result.source,
Writer,
writer,
format,
true,
handler,
result.runtime_transpiler_cache,
);
}
pub const ParseOptions = struct {
allocator: std.mem.Allocator,
dirname_fd: StoredFileDescriptorType,
file_descriptor: ?StoredFileDescriptorType = null,
file_hash: ?u32 = null,
/// On exception, we might still want to watch the file.
file_fd_ptr: ?*StoredFileDescriptorType = null,
path: Fs.Path,
loader: options.Loader,
jsx: options.JSX.Pragma,
macro_remappings: MacroRemap,
macro_js_ctx: MacroJSValueType = default_macro_js_value,
virtual_source: ?*const logger.Source = null,
replace_exports: runtime.Runtime.Features.ReplaceableExport.Map = .{},
inject_jest_globals: bool = false,
set_breakpoint_on_first_line: bool = false,
emit_decorator_metadata: bool = false,
remove_cjs_module_wrapper: bool = false,
dont_bundle_twice: bool = false,
allow_commonjs: bool = false,
/// `"type"` from `package.json`. Used to make sure the parser defaults
/// to CommonJS or ESM based on what the package.json says, when it
/// doesn't otherwise know from reading the source code.
///
/// See: https://nodejs.org/api/packages.html#type
module_type: options.ModuleType = .unknown,
runtime_transpiler_cache: ?*bun.jsc.RuntimeTranspilerCache = null,
keep_json_and_toml_as_one_statement: bool = false,
allow_bytecode_cache: bool = false,
};
pub fn parse(
transpiler: *Transpiler,
this_parse: ParseOptions,
client_entry_point_: anytype,
) ?ParseResult {
return parseMaybeReturnFileOnly(transpiler, this_parse, client_entry_point_, false);
}
pub fn parseMaybeReturnFileOnly(
transpiler: *Transpiler,
this_parse: ParseOptions,
client_entry_point_: anytype,
comptime return_file_only: bool,
) ?ParseResult {
return parseMaybeReturnFileOnlyAllowSharedBuffer(
transpiler,
this_parse,
client_entry_point_,
return_file_only,
false,
);
}
pub fn parseMaybeReturnFileOnlyAllowSharedBuffer(
transpiler: *Transpiler,
this_parse: ParseOptions,
client_entry_point_: anytype,
comptime return_file_only: bool,
comptime use_shared_buffer: bool,
) ?ParseResult {
var allocator = this_parse.allocator;
const dirname_fd = this_parse.dirname_fd;
const file_descriptor = this_parse.file_descriptor;
const file_hash = this_parse.file_hash;
const path = this_parse.path;
const loader = this_parse.loader;
var input_fd: ?StoredFileDescriptorType = null;
const source: *const logger.Source = &brk: {
if (this_parse.virtual_source) |virtual_source| {
break :brk virtual_source.*;
}
if (client_entry_point_) |client_entry_point| {
if (@hasField(std.meta.Child(@TypeOf(client_entry_point)), "source")) {
break :brk client_entry_point.source;
}
}
if (strings.eqlComptime(path.namespace, "node")) {
if (NodeFallbackModules.contentsFromPath(path.text)) |code| {
break :brk logger.Source.initPathString(path.text, code);
}
break :brk logger.Source.initPathString(path.text, "");
}
if (strings.startsWith(path.text, "data:")) {
const data_url = DataURL.parseWithoutCheck(path.text) catch |err| {
transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{s} parsing data url \"{s}\"", .{ @errorName(err), path.text }) catch {};
return null;
};
const body = data_url.decodeData(this_parse.allocator) catch |err| {
transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{s} decoding data \"{s}\"", .{ @errorName(err), path.text }) catch {};
return null;
};
break :brk logger.Source.initPathString(path.text, body);
}
const entry = transpiler.resolver.caches.fs.readFileWithAllocator(
if (use_shared_buffer) bun.default_allocator else this_parse.allocator,
transpiler.fs,
path.text,
dirname_fd,
use_shared_buffer,
file_descriptor,
) catch |err| {
transpiler.log.addErrorFmt(null, logger.Loc.Empty, transpiler.allocator, "{s} reading \"{s}\"", .{ @errorName(err), path.text }) catch {};
return null;
};
input_fd = entry.fd;
if (this_parse.file_fd_ptr) |file_fd_ptr| {
file_fd_ptr.* = entry.fd;
}
break :brk logger.Source.initRecycledFile(.{ .path = path, .contents = entry.contents }, transpiler.allocator) catch return null;
};
if (comptime return_file_only) {
return ParseResult{ .source = source.*, .input_fd = input_fd, .loader = loader, .empty = true, .ast = js_ast.Ast.empty };
}
if (source.contents.len == 0 or (source.contents.len < 33 and std.mem.trim(u8, source.contents, "\n\r ").len == 0)) {
if (!loader.handlesEmptyFile()) {
return ParseResult{ .source = source.*, .input_fd = input_fd, .loader = loader, .empty = true, .ast = js_ast.Ast.empty };
}
}
switch (loader) {
.js,
.jsx,
.ts,
.tsx,
=> {
// wasm magic number
if (source.isWebAssembly()) {
return ParseResult{
.source = source.*,
.input_fd = input_fd,
.loader = .wasm,
.empty = true,
.ast = js_ast.Ast.empty,
};
}
const target = transpiler.options.target;
var jsx = this_parse.jsx;
jsx.parse = loader.isJSX();
var opts = js_parser.Parser.Options.init(jsx, loader);
opts.features.emit_decorator_metadata = this_parse.emit_decorator_metadata;
opts.features.allow_runtime = transpiler.options.allow_runtime;
opts.features.set_breakpoint_on_first_line = this_parse.set_breakpoint_on_first_line;
opts.features.trim_unused_imports = transpiler.options.trim_unused_imports orelse loader.isTypeScript();
opts.features.no_macros = transpiler.options.no_macros;
opts.features.runtime_transpiler_cache = this_parse.runtime_transpiler_cache;
opts.transform_only = transpiler.options.transform_only;
opts.ignore_dce_annotations = transpiler.options.ignore_dce_annotations;
// @bun annotation
opts.features.dont_bundle_twice = this_parse.dont_bundle_twice;
opts.features.commonjs_at_runtime = this_parse.allow_commonjs;
opts.module_type = this_parse.module_type;
opts.tree_shaking = transpiler.options.tree_shaking;
opts.features.inlining = transpiler.options.inlining;
opts.filepath_hash_for_hmr = file_hash orelse 0;
opts.features.auto_import_jsx = transpiler.options.auto_import_jsx;
opts.warn_about_unbundled_modules = !target.isBun();
opts.features.inject_jest_globals = this_parse.inject_jest_globals;
opts.features.minify_syntax = transpiler.options.minify_syntax;
opts.features.minify_identifiers = transpiler.options.minify_identifiers;
opts.features.dead_code_elimination = transpiler.options.dead_code_elimination;
opts.features.remove_cjs_module_wrapper = this_parse.remove_cjs_module_wrapper;
opts.features.bundler_feature_flags = transpiler.options.bundler_feature_flags;
opts.features.repl_mode = transpiler.options.repl_mode;
opts.repl_mode = transpiler.options.repl_mode;
if (transpiler.macro_context == null) {
transpiler.macro_context = js_ast.Macro.MacroContext.init(transpiler);
}
// we'll just always enable top-level await
// this is incorrect for Node.js files which are CommonJS modules
opts.features.top_level_await = true;
opts.macro_context = &transpiler.macro_context.?;
if (target != .bun_macro) {
opts.macro_context.javascript_object = this_parse.macro_js_ctx;
}
opts.features.is_macro_runtime = target == .bun_macro;
opts.features.replace_exports = this_parse.replace_exports;
return switch ((transpiler.resolver.caches.js.parse(
allocator,
opts,
transpiler.options.define,
transpiler.log,
source,
) catch null) orelse return null) {
.ast => |value| .{
.ast = value,
.source = source.*,
.loader = loader,
.input_fd = input_fd,
.runtime_transpiler_cache = this_parse.runtime_transpiler_cache,
},
.cached => .{
.ast = undefined,
.runtime_transpiler_cache = this_parse.runtime_transpiler_cache,
.source = source.*,
.loader = loader,
.input_fd = input_fd,
},
.already_bundled => |already_bundled| .{
.ast = undefined,
.already_bundled = switch (already_bundled) {
.bun => .source_code,
.bun_cjs => .source_code_cjs,
.bytecode_cjs, .bytecode => brk: {
const default_value: ParseResult.AlreadyBundled = if (already_bundled == .bytecode_cjs) .source_code_cjs else .source_code;
if (this_parse.virtual_source == null and this_parse.allow_bytecode_cache) {
var path_buf2: bun.PathBuffer = undefined;
@memcpy(path_buf2[0..path.text.len], path.text);
path_buf2[path.text.len..][0..bun.bytecode_extension.len].* = bun.bytecode_extension.*;
const bytecode = bun.sys.File.toSourceAt(dirname_fd.unwrapValid() orelse bun.FD.cwd(), path_buf2[0 .. path.text.len + bun.bytecode_extension.len], bun.default_allocator, .{}).asValue() orelse break :brk default_value;
if (bytecode.contents.len == 0) {
break :brk default_value;
}
break :brk if (already_bundled == .bytecode_cjs) .{ .bytecode_cjs = @constCast(bytecode.contents) } else .{ .bytecode = @constCast(bytecode.contents) };
}
break :brk default_value;
},
},
.source = source.*,
.loader = loader,
.input_fd = input_fd,
},
};
},
// TODO: use lazy export AST
inline .toml, .yaml, .json, .jsonc => |kind| {
var expr = if (kind == .jsonc)
// We allow importing tsconfig.*.json or jsconfig.*.json with comments
// These files implicitly become JSONC files, which aligns with the behavior of text editors.
JSON.parseTSConfig(source, transpiler.log, allocator, false) catch return null
else if (kind == .json)
JSON.parse(source, transpiler.log, allocator, false) catch return null
else if (kind == .toml)
TOML.parse(source, transpiler.log, allocator, false) catch return null
else if (kind == .yaml)
YAML.parse(source, transpiler.log, allocator) catch return null
else
@compileError("unreachable");
var symbols: []js_ast.Symbol = &.{};
const parts = brk: {
if (this_parse.keep_json_and_toml_as_one_statement) {
var stmts = allocator.alloc(js_ast.Stmt, 1) catch unreachable;
stmts[0] = js_ast.Stmt.allocate(allocator, js_ast.S.SExpr, js_ast.S.SExpr{ .value = expr }, logger.Loc{ .start = 0 });
var parts_ = allocator.alloc(js_ast.Part, 1) catch unreachable;
parts_[0] = js_ast.Part{ .stmts = stmts };
break :brk parts_;
}
if (expr.data == .e_object) {
const properties: []js_ast.G.Property = expr.data.e_object.properties.slice();
if (properties.len > 0) {
var stmts = allocator.alloc(js_ast.Stmt, 3) catch return null;
var decls = std.ArrayListUnmanaged(js_ast.G.Decl).initCapacity(
allocator,
properties.len,
) catch |err| bun.handleOom(err);
decls.expandToCapacity();
symbols = allocator.alloc(js_ast.Symbol, properties.len) catch return null;
var export_clauses = allocator.alloc(js_ast.ClauseItem, properties.len) catch return null;
var duplicate_key_checker = bun.StringHashMap(u32).init(allocator);
defer duplicate_key_checker.deinit();
var count: usize = 0;
for (properties, decls.items, symbols, 0..) |*prop, *decl, *symbol, i| {
const name = prop.key.?.data.e_string.slice(allocator);
// Do not make named exports for "default" exports
if (strings.eqlComptime(name, "default"))
continue;
const visited = duplicate_key_checker.getOrPut(name) catch continue;
if (visited.found_existing) {
decls.items[visited.value_ptr.*].value = prop.value.?;
continue;
}
visited.value_ptr.* = @truncate(i);
symbol.* = js_ast.Symbol{
.original_name = MutableString.ensureValidIdentifier(name, allocator) catch return null,
};
const ref = Ref.init(@truncate(i), 0, false);
decl.* = js_ast.G.Decl{
.binding = js_ast.Binding.alloc(allocator, js_ast.B.Identifier{
.ref = ref,
}, prop.key.?.loc),
.value = prop.value.?,
};
export_clauses[i] = js_ast.ClauseItem{
.name = .{
.ref = ref,
.loc = prop.key.?.loc,
},
.alias = name,
.alias_loc = prop.key.?.loc,
};
prop.value = js_ast.Expr.initIdentifier(ref, prop.value.?.loc);
count += 1;
}
decls.shrinkRetainingCapacity(count);
stmts[0] = js_ast.Stmt.alloc(
js_ast.S.Local,
js_ast.S.Local{
.decls = js_ast.G.Decl.List.moveFromList(&decls),
.kind = .k_var,
},
logger.Loc{
.start = 0,
},
);
stmts[1] = js_ast.Stmt.alloc(
js_ast.S.ExportClause,
js_ast.S.ExportClause{
.items = export_clauses[0..count],
.is_single_line = false,
},
logger.Loc{
.start = 0,
},
);
stmts[2] = js_ast.Stmt.alloc(
js_ast.S.ExportDefault,
js_ast.S.ExportDefault{
.value = js_ast.StmtOrExpr{ .expr = expr },
.default_name = js_ast.LocRef{
.loc = logger.Loc{},
.ref = Ref.None,
},
},
logger.Loc{
.start = 0,
},
);
var parts_ = allocator.alloc(js_ast.Part, 1) catch unreachable;
parts_[0] = js_ast.Part{ .stmts = stmts };
break :brk parts_;
}
}
{
var stmts = allocator.alloc(js_ast.Stmt, 1) catch unreachable;
stmts[0] = js_ast.Stmt.alloc(js_ast.S.ExportDefault, js_ast.S.ExportDefault{
.value = js_ast.StmtOrExpr{ .expr = expr },
.default_name = js_ast.LocRef{
.loc = logger.Loc{},
.ref = Ref.None,
},
}, logger.Loc{ .start = 0 });
var parts_ = allocator.alloc(js_ast.Part, 1) catch unreachable;
parts_[0] = js_ast.Part{ .stmts = stmts };
break :brk parts_;
}
};
var ast = js_ast.Ast.fromParts(parts);
ast.symbols = js_ast.Symbol.List.fromOwnedSlice(symbols);
return ParseResult{
.ast = ast,
.source = source.*,
.loader = loader,
.input_fd = input_fd,
};
},
// TODO: use lazy export AST
.text => {
const expr = js_ast.Expr.init(js_ast.E.String, js_ast.E.String{
.data = source.contents,
}, logger.Loc.Empty);
const stmt = js_ast.Stmt.alloc(js_ast.S.ExportDefault, js_ast.S.ExportDefault{
.value = js_ast.StmtOrExpr{ .expr = expr },
.default_name = js_ast.LocRef{
.loc = logger.Loc{},
.ref = Ref.None,
},
}, logger.Loc{ .start = 0 });
var stmts = allocator.alloc(js_ast.Stmt, 1) catch unreachable;
stmts[0] = stmt;
var parts = allocator.alloc(js_ast.Part, 1) catch unreachable;
parts[0] = js_ast.Part{ .stmts = stmts };
return ParseResult{
.ast = js_ast.Ast.fromParts(parts),
.source = source.*,
.loader = loader,
.input_fd = input_fd,
};
},
.wasm => {
if (transpiler.options.target.isBun()) {
if (!source.isWebAssembly()) {
transpiler.log.addErrorFmt(
null,
logger.Loc.Empty,
transpiler.allocator,
"Invalid wasm file \"{s}\" (missing magic header)",
.{path.text},
) catch {};
return null;
}
return ParseResult{
.ast = js_ast.Ast.empty,
.source = source.*,
.loader = loader,
.input_fd = input_fd,
};
}
},
.css => {},
else => Output.panic("Unsupported loader {s} for path: {s}", .{ @tagName(loader), source.path.text }),
}
return null;
}
fn normalizeEntryPointPath(transpiler: *Transpiler, _entry: string) string {
var paths = [_]string{_entry};
var entry = transpiler.fs.abs(&paths);
std.fs.accessAbsolute(entry, .{}) catch
return _entry;
entry = transpiler.fs.relativeTo(entry);
if (!strings.startsWith(entry, "./")) {
// Entry point paths without a leading "./" are interpreted as package
// paths. This happens because they go through general path resolution
// like all other import paths so that plugins can run on them. Requiring
// a leading "./" for a relative path simplifies writing plugins because
// entry points aren't a special case.
//
// However, requiring a leading "./" also breaks backward compatibility
// and makes working with the CLI more difficult. So attempt to insert
// "./" automatically when needed. We don't want to unconditionally insert
// a leading "./" because the path may not be a file system path. For
// example, it may be a URL. So only insert a leading "./" when the path
// is an exact match for an existing file.
var __entry = transpiler.allocator.alloc(u8, "./".len + entry.len) catch unreachable;
__entry[0] = '.';
__entry[1] = '/';
bun.copy(u8, __entry[2..__entry.len], entry);
entry = __entry;
}
return entry;
}
fn enqueueEntryPoints(transpiler: *Transpiler, entry_points: []_resolver.Result, comptime normalize_entry_point: bool) usize {
var entry_point_i: usize = 0;
for (transpiler.options.entry_points) |_entry| {
const entry: string = if (comptime normalize_entry_point) transpiler.normalizeEntryPointPath(_entry) else _entry;
defer {
js_ast.Expr.Data.Store.reset();
js_ast.Stmt.Data.Store.reset();
}
const result = transpiler.resolver.resolve(transpiler.fs.top_level_dir, entry, .entry_point_build) catch |err| {
Output.prettyError("Error resolving \"{s}\": {s}\n", .{ entry, @errorName(err) });
continue;
};
if (result.pathConst() == null) {
Output.prettyError("\"{s}\" is disabled due to \"browser\" field in package.json.\n", .{
entry,
});
continue;
}
if (transpiler.linker.enqueueResolveResult(&result) catch unreachable) {
entry_points[entry_point_i] = result;
entry_point_i += 1;
}
}
return entry_point_i;
}
pub fn transform(
transpiler: *Transpiler,
allocator: std.mem.Allocator,
log: *logger.Log,
opts: api.TransformOptions,
) !options.TransformResult {
_ = opts;
var entry_points = try allocator.alloc(_resolver.Result, transpiler.options.entry_points.len);
entry_points = entry_points[0..transpiler.enqueueEntryPoints(entry_points, true)];
if (log.level.atLeast(.debug)) {
transpiler.resolver.debug_logs = try DebugLogs.init(allocator);
}
transpiler.options.transform_only = true;
const did_start = false;
if (transpiler.options.output_dir_handle == null) {
const outstream = bun.sys.File.from(std.fs.File.stdout());
if (!did_start) {
try switch (transpiler.options.import_path_format) {
.relative => transpiler.processResolveQueue(.relative, false, @TypeOf(outstream), outstream),
.absolute_url => transpiler.processResolveQueue(.absolute_url, false, @TypeOf(outstream), outstream),
.absolute_path => transpiler.processResolveQueue(.absolute_path, false, @TypeOf(outstream), outstream),
.package_path => transpiler.processResolveQueue(.package_path, false, @TypeOf(outstream), outstream),
};
}
} else {
const output_dir = transpiler.options.output_dir_handle orelse {
Output.printError("Invalid or missing output directory.", .{});
Global.crash();
};
if (!did_start) {
try switch (transpiler.options.import_path_format) {
.relative => transpiler.processResolveQueue(.relative, false, std.fs.Dir, output_dir),
.absolute_url => transpiler.processResolveQueue(.absolute_url, false, std.fs.Dir, output_dir),
.absolute_path => transpiler.processResolveQueue(.absolute_path, false, std.fs.Dir, output_dir),
.package_path => transpiler.processResolveQueue(.package_path, false, std.fs.Dir, output_dir),
};
}
}
if (FeatureFlags.tracing and transpiler.options.log.level.atLeast(.info)) {
Output.prettyErrorln(
"<r><d>\n---Tracing---\nResolve time: {d}\nParsing time: {d}\n---Tracing---\n\n<r>",
.{
transpiler.resolver.elapsed,
transpiler.elapsed,
},
);
}
var final_result = try options.TransformResult.init(try allocator.dupe(u8, transpiler.result.outbase), try transpiler.output_files.toOwnedSlice(), log, allocator);
final_result.root_dir = transpiler.options.output_dir_handle;
return final_result;
}
fn processResolveQueue(
transpiler: *Transpiler,
comptime import_path_format: options.BundleOptions.ImportPathFormat,
comptime wrap_entry_point: bool,
comptime Outstream: type,
outstream: Outstream,
) !void {
while (transpiler.resolve_queue.readItem()) |item| {
js_ast.Expr.Data.Store.reset();
js_ast.Stmt.Data.Store.reset();
if (comptime wrap_entry_point) {
const path = item.pathConst() orelse unreachable;
const loader = transpiler.options.loader(path.name.ext);
if (item.import_kind == .entry_point and loader.supportsClientEntryPoint()) {
var client_entry_point = try transpiler.allocator.create(EntryPoints.ClientEntryPoint);
client_entry_point.* = EntryPoints.ClientEntryPoint{};
try client_entry_point.generate(Transpiler, transpiler, path.name, transpiler.options.framework.?.client.path);
const entry_point_output_file = transpiler.buildWithResolveResultEager(
item,
import_path_format,
Outstream,
outstream,
client_entry_point,
) catch continue orelse continue;
transpiler.output_files.append(entry_point_output_file) catch unreachable;
js_ast.Expr.Data.Store.reset();
js_ast.Stmt.Data.Store.reset();
// The entry point has already been de-duplicated at this point,
// so build the original (non-client) version immediately.
var item_not_entrypointed = item;
item_not_entrypointed.import_kind = .stmt;
const original_output_file = transpiler.buildWithResolveResultEager(
item_not_entrypointed,
import_path_format,
Outstream,
outstream,
null,
) catch continue orelse continue;
transpiler.output_files.append(original_output_file) catch unreachable;
continue;
}
}
const output_file = transpiler.buildWithResolveResultEager(
item,
import_path_format,
Outstream,
outstream,
null,
) catch continue orelse continue;
transpiler.output_files.append(output_file) catch unreachable;
}
}
};
pub const ServeResult = struct {
file: options.OutputFile,
mime_type: MimeType,
};
pub const ResolveResults = std.AutoHashMap(
u64,
void,
);
pub const ResolveQueue = bun.LinearFifo(
_resolver.Result,
.Dynamic,
);
const string = []const u8;
const DotEnv = @import("./env_loader.zig");
const Fs = @import("./fs.zig");
const MimeType = @import("./http/MimeType.zig");
const NodeFallbackModules = @import("./node_fallbacks.zig");
const Router = @import("./router.zig");
const runtime = @import("./runtime.zig");
const std = @import("std");
const DataURL = @import("./resolver/data_url.zig").DataURL;
const MacroRemap = @import("./resolver/package_json.zig").MacroMap;
const PackageManager = @import("./install/install.zig").PackageManager;
const SystemTimer = @import("./system_timer.zig").Timer;
const URL = @import("./url.zig").URL;
const linker = @import("./linker.zig");
const Linker = linker.Linker;
const _resolver = @import("./resolver/resolver.zig");
const DebugLogs = _resolver.DebugLogs;
const Resolver = _resolver.Resolver;
const bun = @import("bun");
const Environment = bun.Environment;
const FeatureFlags = bun.FeatureFlags;
const Global = bun.Global;
const JSON = bun.json;
const MutableString = bun.MutableString;
const Output = bun.Output;
const StoredFileDescriptorType = bun.StoredFileDescriptorType;
const default_allocator = bun.default_allocator;
const js_parser = bun.js_parser;
const js_printer = bun.js_printer;
const jsc = bun.jsc;
const logger = bun.logger;
const strings = bun.strings;
const api = bun.schema.api;
const TOML = bun.interchange.toml.TOML;
const YAML = bun.interchange.yaml.YAML;
const default_macro_js_value = jsc.JSValue.zero;
const js_ast = bun.ast;
const Ref = bun.ast.Ref;