Compare commits

...

12 Commits

Author SHA1 Message Date
autofix-ci[bot]
317a4c4030 [autofix.ci] apply automated fixes 2026-02-13 21:46:59 +00:00
Jarred Sumner
c674cc644f feat(cron): add Bun.cron API for OS-level cron jobs and expression parsing
Adds `Bun.cron()`, `Bun.cron.remove()`, and `Bun.cron.parse()` — a
complete API for registering OS-level cron jobs and parsing cron
expressions from JavaScript.

**Bun.cron(path, schedule, title)** registers a cron job that runs a
JS/TS module on a schedule, using the platform's native scheduler:
- Linux: crontab
- macOS: launchd (plist + StartCalendarInterval)
- Windows: Task Scheduler (schtasks)

The target module exports a `scheduled(controller)` handler following
the Cloudflare Workers Cron Triggers API.

**Bun.cron.parse(expression, relativeDate?)** parses a cron expression
and returns the next matching UTC Date. Supports:
- Standard 5-field expressions (minute hour day month weekday)
- Named days: SUN-SAT, Sunday-Saturday (case-insensitive)
- Named months: JAN-DEC, January-December (case-insensitive)
- Sunday as 7 (normalized to 0)
- Predefined nicknames: @yearly, @annually, @monthly, @weekly, @daily,
  @midnight, @hourly
- POSIX OR logic: when both day-of-month and day-of-week are specified
  (neither is *), either matching is sufficient

The parser uses a bitset representation (u64/u32/u16/u8 per field) for
efficient matching. The next-occurrence algorithm advances by the
largest non-matching unit (month > day > hour > minute) and normalizes
via JSC's UTC date functions on each iteration.
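As a rough TypeScript illustration of the bitset idea (the real implementation is the Zig `CronExpression` in the diff below; the helper names here are made up):

```ts
// Sketch: a parsed minute field as a 60-bit set, where bit n means "minute n matches".
// For "*/15" the set bits would be 0, 15, 30 and 45.
function minuteMask(minutes: number[]): bigint {
  let mask = 0n;
  for (const m of minutes) mask |= 1n << BigInt(m);
  return mask;
}

function minuteMatches(mask: bigint, minute: number): boolean {
  return ((mask >> BigInt(minute)) & 1n) === 1n;
}

const everyQuarterHour = minuteMask([0, 15, 30, 45]);
console.log(minuteMatches(everyQuarterHour, 30)); // true
console.log(minuteMatches(everyQuarterHour, 31)); // false
```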

Expressions are normalized to numeric form before platform registration,
so named values like "Monday" or "@daily" produce valid crontab entries.

Also fixes a pre-existing bug in filterCrontab where substring matching
(indexOf) could cause removing job "test" to also remove "test-cleanup".
Changed to exact line matching.
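A hedged TypeScript sketch of the pitfall (job names and crontab lines are made up, and the real filtering also removes the command line below each marker):

```ts
const crontab = [
  "# bun-cron: test",
  "0 * * * * bun run --cron-title=test ./job.ts",
  "# bun-cron: test-cleanup",
  "0 0 * * * bun run --cron-title=test-cleanup ./cleanup.ts",
];

// Substring matching also removes the unrelated "test-cleanup" job:
console.log(crontab.filter((line) => !line.includes("test")).length); // 0

// Exact matching of the marker line keeps it:
console.log(crontab.filter((line) => line !== "# bun-cron: test").length); // 3
```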

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-02-13 22:45:03 +01:00
robobun
7a801fcf93 fix(ini): prevent OOB read and UB on truncated/invalid UTF-8 in INI parser (#26947)
## Summary

- Fix out-of-bounds read in the INI parser's `prepareStr` function when
a multi-byte UTF-8 lead byte appears at the end of a value with
insufficient continuation bytes
- Fix undefined behavior when bare continuation bytes (0x80-0xBF) cause
`utf8ByteSequenceLength` to return 0, hitting an `unreachable` branch
(UB in ReleaseFast builds)
- Add bounds checking before accessing `val[i+1]`, `val[i+2]`,
`val[i+3]` in both escaped and non-escaped code paths

The vulnerability could be triggered by a crafted `.npmrc` file
containing truncated UTF-8 sequences. In release builds, this could
cause OOB heap reads (potential info leak) or undefined behavior.
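The shape of the fix, sketched in TypeScript for illustration only (the actual change is the Zig diff below; helper names are hypothetical):

```ts
// Expected sequence length from a UTF-8 lead byte; 0 means an invalid lead
// (e.g. a bare continuation byte 0x80-0xBF).
function utf8SequenceLength(lead: number): number {
  if (lead < 0x80) return 1;
  if (lead >= 0xc0 && lead < 0xe0) return 2;
  if (lead >= 0xe0 && lead < 0xf0) return 3;
  if (lead >= 0xf0 && lead < 0xf8) return 4;
  return 0;
}

// Copy a sequence without reading past the end of the buffer.
function copySequence(val: Uint8Array, i: number, out: number[]): number {
  const len = utf8SequenceLength(val[i]);
  const available = Math.min(len === 0 ? 1 : len, val.length - i); // bounds check
  for (let k = 0; k < available; k++) out.push(val[i + k]);
  return i + available;
}

const truncated = new Uint8Array([0x61, 0xe2, 0x82]); // "a" + truncated 3-byte sequence
const out: number[] = [];
for (let i = 0; i < truncated.length; ) i = copySequence(truncated, i, out);
console.log(out.length); // 3 (nothing read out of bounds)
```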

## Test plan

- [x] Added 9 tests covering truncated 2/3/4-byte sequences, bare
continuation bytes, and escaped contexts
- [x] All 52 INI parser tests pass (`bun bd test
test/js/bun/ini/ini.test.ts`)
- [x] No regressions in npmrc tests (failures are pre-existing Verdaccio
connectivity issues)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-12 00:28:44 -08:00
robobun
44541eb574 fix(sql): reject null bytes in connection parameters to prevent protocol injection (#26952)
## Summary

- Reject null bytes in `username`, `password`, `database`, and `path`
connection parameters for both PostgreSQL and MySQL to prevent wire
protocol parameter injection
- Both the Postgres and MySQL wire protocols use null-terminated strings
in their startup/handshake messages, so embedded null bytes in these
fields act as field terminators, allowing injection of arbitrary
protocol parameters (e.g. `search_path` for schema hijacking)
- The fix validates these fields immediately after UTF-8 conversion and
throws `InvalidArguments` error with a clear message if null bytes are
found

## Test plan

- [x] New test
`test/regression/issue/postgres-null-byte-injection.test.ts` verifies:
  - Null bytes in username are rejected with an error before any data is sent
  - Null bytes in database are rejected with an error before any data is sent
  - Null bytes in password are rejected with an error before any data is sent
  - Normal connections without null bytes still work correctly
- [x] Test verified to fail with `USE_SYSTEM_BUN=1` (unfixed bun) and
pass with `bun bd test` (fixed build)
- [x] Existing SQL tests pass (`adapter-env-var-precedence.test.ts`,
`postgres-stringbuilder-assertion-aggressive.test.ts`)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-12 00:27:00 -08:00
robobun
993be3f931 fix(plugin): set virtualModules to nullptr after delete in clearAll (#26940)
## Summary

- Fix double-free in `Bun.plugin.clearAll()` by setting `virtualModules
= nullptr` after `delete`
- In `jsFunctionBunPluginClear` (`BunPlugin.cpp:956`), `delete
global->onLoadPlugins.virtualModules` freed the pointer without
nullifying it. When the `OnLoad` destructor later runs (during Worker
termination or VM destruction), it checks `if (virtualModules)` — the
dangling non-null pointer passes the check and is deleted again,
corrupting the heap allocator.

## Test plan

- [ ] New test
`test/regression/issue/plugin-clearall-double-free.test.ts` spawns a
subprocess that registers a virtual module, calls
`Bun.plugin.clearAll()`, and exits with `BUN_DESTRUCT_VM_ON_EXIT=1` to
trigger the destructor path
- [ ] Verified the test fails on the system bun (pre-fix) with `pas
panic: deallocation did fail ... Alloc bit not set`
- [ ] Verified the test passes with the debug build (post-fix)
- [ ] Existing plugin tests (`test/js/bun/plugin/plugins.test.ts`) all
pass (29/29)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-02-11 23:14:43 -08:00
robobun
a68393926b fix(ws): handle fragmented pong frames and validate control frame size (#26944)
## Summary

- Fix WebSocket client pong frame handler to properly handle payloads
split across TCP segments, preventing frame desync that could cause
protocol confusion
- Add missing RFC 6455 Section 5.5 validation: control frame payloads
must not exceed 125 bytes (pong handler lacked this check, unlike ping
and close handlers)

## Details

The pong handler (lines 652-663) had two issues:

1. **Frame desync on fragmented delivery**: When a pong payload was
split across TCP segments (`data.len < receive_body_remain`), the
handler consumed only the available bytes but unconditionally reset
`receive_state = .need_header` and `receive_body_remain = 0`. The
remaining payload bytes in the next TCP delivery were then
misinterpreted as WebSocket frame headers.

2. **Missing payload length validation**: Unlike the ping handler (line
615) and close handler (line 680), the pong handler did not validate the
7-bit payload length against the RFC 6455 limit of 125 bytes for control
frames.

The fix models the pong handler after the existing ping handler pattern:
track partial delivery state with a `pong_received` boolean, buffer
incoming data into `ping_frame_bytes`, and only reset to `need_header`
after the complete payload has been consumed.
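A minimal TypeScript sketch of the buffering pattern (state and names simplified; the actual handler is the Zig diff further down):

```ts
// Buffer a control-frame payload that may arrive split across TCP reads.
type PongState = { expected: number; buffered: Uint8Array; filled: number };

function onPongBytes(state: PongState, chunk: Uint8Array): Uint8Array | null {
  const take = Math.min(chunk.length, state.expected - state.filled);
  state.buffered.set(chunk.subarray(0, take), state.filled);
  state.filled += take;
  // Only report a complete pong (and go back to reading headers) once the
  // whole payload has arrived; otherwise keep waiting for more data.
  if (state.filled < state.expected) return null;
  return state.buffered.subarray(0, state.expected);
}

const state: PongState = { expected: 50, buffered: new Uint8Array(125), filled: 0 };
console.log(onPongBytes(state, new Uint8Array(2)));  // null (wait for more)
console.log(onPongBytes(state, new Uint8Array(48))); // the full 50-byte payload
```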

## Test plan

- [x] New test `websocket-pong-fragmented.test.ts` verifies:
  - Fragmented pong delivery (50-byte payload split into 2+48 bytes) does not cause frame desync, and a subsequent text frame is received correctly
  - Pong frames with >125 byte payloads are rejected as invalid control frames
- [x] Test fails with `USE_SYSTEM_BUN=1` (reproduces the bug) and passes
with `bun bd test`
- [x] Existing WebSocket tests pass: `websocket-client.test.ts`,
`websocket-close-fragmented.test.ts`,
`websocket-client-short-read.test.ts`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 23:12:28 -08:00
robobun
e8a5f23385 fix(s3): reject CRLF characters in header values to prevent header injection (#26942)
## Summary

- Fixes HTTP header injection vulnerability in S3 client where
user-controlled options (`contentDisposition`, `contentEncoding`,
`type`) were passed to HTTP headers without CRLF validation
- Adds input validation at the JS-to-Zig boundary in
`src/s3/credentials.zig` that throws a `TypeError` if `\r` or `\n`
characters are detected (see the sketch after this list)
- An attacker could previously inject arbitrary headers (e.g.
`X-Amz-Security-Token`) by embedding `\r\n` in these string fields

## Test plan

- [x] Added `test/regression/issue/s3-header-injection.test.ts` with 6
tests:
  - CRLF in `contentDisposition` throws
  - CRLF in `contentEncoding` throws
  - CRLF in `type` (content-type) throws
  - Lone CR in `contentDisposition` throws
  - Lone LF in `contentDisposition` throws
  - Valid `contentDisposition` without CRLF still works correctly
- [x] Tests fail with `USE_SYSTEM_BUN=1` (confirming vulnerability
exists in current release)
- [x] Tests pass with `bun bd test` (confirming fix works)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 23:02:39 -08:00
robobun
16b3e7cde7 fix(libarchive): use normalized path in mkdiratZ to prevent directory traversal (#26956)
## Summary

- Fix path traversal vulnerability in tarball directory extraction on
POSIX systems where `mkdiratZ` used the un-normalized `pathname` (raw
from tarball) instead of the normalized `path` variable, allowing `../`
components to escape the extraction root via kernel path resolution
- The Windows directory creation, symlink creation, and file creation
code paths already correctly used the normalized path — only the two
POSIX `mkdiratZ` calls were affected (lines 463 and 469)
- `bun install` is not affected because npm mode skips directory
entries; affected callers include `bun create`, GitHub tarball
extraction, and `compile_target`

## Test plan

- [x] Added regression test that crafts a tarball with
`safe_dir/../../escaped_dir/` directory entry and verifies it cannot
create directories outside the extraction root
- [x] Verified test **fails** with system bun (vulnerable) and
**passes** with debug build (fixed)
- [x] Full `archive.test.ts` suite passes (99/99 tests)
- [x] `symlink-path-traversal.test.ts` continues to pass (3/3 tests)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 22:47:41 -08:00
robobun
4c32f15339 fix(sql): use constant-time comparison for SCRAM server signature (#26937)
## Summary

- Replace `bun.strings.eqlLong` with BoringSSL's `CRYPTO_memcmp` for
SCRAM-SHA-256 server signature verification in the PostgreSQL client
- The previous comparison (`eqlLong`) returned early on the first
mismatching byte, potentially leaking information about the expected
server signature via timing side-channel
- `CRYPTO_memcmp` is already used elsewhere in the codebase for
constant-time comparisons (CSRF tokens, `crypto.timingSafeEqual`,
KeyObject comparison); see the sketch after this list
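The same idea at the JavaScript level, purely as an illustration of constant-time comparison (this is not the SCRAM code path, which is in Zig/BoringSSL):

```ts
import { timingSafeEqual } from "node:crypto";

// Both buffers must be the same length; every byte is compared regardless of
// where the first mismatch is, so timing does not leak the expected value.
const expected = Buffer.from("server-signature-bytes");
const received = Buffer.from("server-signature-bytes");
console.log(timingSafeEqual(expected, received)); // true
```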

## Test plan

- [x] `bun bd` compiles successfully
- [ ] Existing SCRAM-SHA-256 integration tests in
`test/js/sql/sql.test.ts` pass (require Docker/PostgreSQL)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 22:45:47 -08:00
robobun
635034ee33 fix(shell): use-after-free in runFromJS when setupIOBeforeRun fails (#26920)
## Summary

- Fixes #26918 — segfault at address `0x28189480080` caused by
use-after-free in the shell interpreter
- When `setupIOBeforeRun()` fails (e.g., stdout handle unavailable on
Windows), the `runFromJS` error path called `deinitFromExec()` which
directly freed the GC-managed interpreter object with
`allocator.destroy(this)`. When the GC later swept and called
`deinitFromFinalizer()` on the already-freed memory, it caused a
segfault.
- Replaced `deinitFromExec()` with `derefRootShellAndIOIfNeeded(true)`
which properly cleans up runtime resources (IO handles, shell
environment) while leaving final object destruction to the GC finalizer
— matching the pattern already used in `finish()`.

## Test plan

- [x] Added regression test in `test/regression/issue/26918.test.ts`
that verifies the shell interpreter handles closed stdout gracefully
without crashing
- [x] Test passes with `bun bd test test/regression/issue/26918.test.ts`
- [ ] The actual crash is primarily reproducible on Windows where stdout
handles can be truly unavailable — CI Windows tests should validate the
fix

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2026-02-11 17:51:10 -08:00
robobun
3e792d0d2e fix(test): write JUnit reporter outfile when --bail triggers early exit (#26852)
## Summary
- When `--bail` caused an early exit after a test failure, the JUnit
reporter output file (`--reporter-outfile`) was never written because
`Global.exit()` was called before the normal completion path
- Extracted the JUnit write logic into a `writeJUnitReportIfNeeded()`
method on `CommandLineReporter` and call it in both bail exit paths
(test failure and unhandled rejection) as well as the normal completion
path

Closes #26851
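For reference, the scenario the fix covers looks roughly like this (file paths are made up):

```ts
import { existsSync } from "node:fs";

// Run a failing test file with --bail and a JUnit outfile.
const result = Bun.spawnSync([
  "bun", "test", "--bail",
  "--reporter=junit", "--reporter-outfile=./junit.xml",
  "./failing.test.ts",
]);

// Before the fix, the bail-triggered early exit skipped writing the report.
console.log(result.exitCode !== 0, existsSync("./junit.xml"));
```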

## Test plan
- [x] Added regression test `test/regression/issue/26851.test.ts` with
two cases:
  - Single failing test file with `--bail` produces JUnit XML output
  - Multiple test files where bail triggers on second file still writes the report
- [x] Verified test fails with system bun (`USE_SYSTEM_BUN=1`)
- [x] Verified test passes with `bun bd test`

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 17:41:45 -08:00
robobun
b7d505b6c1 deflake: make HMR rapid edits test event-driven (#26890)
## Summary
- Add `expectMessageEventually(value)` to the bake test harness `Client`
class — waits for a specific message to appear, draining any
intermediate messages that arrived before it
- Rewrite "hmr handles rapid consecutive edits" test to use raw
`Bun.write` + sleep for intermediate edits and `expectMessageEventually`
for the final assertion, avoiding flaky failures when HMR batches
updates non-deterministically across platforms

Fixes flaky failure on Windows where an extra "render 10" message
arrived after `expectMessage` consumed its expected messages but before
client disposal.

## Test plan
- [x] `bun bd test test/bake/dev-and-prod.test.ts` — all 12 tests pass
- [x] Ran the specific test multiple times to confirm no flakiness

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Alistair Smith <alistair@anthropic.com>
2026-02-11 16:05:25 -08:00
31 changed files with 3719 additions and 54 deletions

View File

@@ -136,7 +136,7 @@
{
"group": "Process & System",
"icon": "computer",
"pages": ["/runtime/environment-variables", "/runtime/shell", "/runtime/child-process"]
"pages": ["/runtime/environment-variables", "/runtime/shell", "/runtime/child-process", "/runtime/cron"]
},
{
"group": "Interop & Tooling",

311
docs/runtime/cron.mdx Normal file
View File

@@ -0,0 +1,311 @@
---
title: Cron
description: Schedule and parse cron jobs with Bun
---
Bun has built-in support for registering OS-level cron jobs and parsing cron expressions.
## Quickstart
**Parse a cron expression to find the next matching time:**
```ts
// Next weekday at 9:30 AM UTC
const next = Bun.cron.parse("30 9 * * MON-FRI");
console.log(next); // => 2025-01-20T09:30:00.000Z
```
**Register a cron job that runs a script on a schedule:**
```ts
await Bun.cron("./worker.ts", "30 2 * * MON", "weekly-report");
```
---
## `Bun.cron.parse()`
Parse a cron expression and return the next matching UTC `Date`.
```ts
const next = Bun.cron.parse("*/15 * * * *");
console.log(next); // => next quarter-hour boundary
```
### Parameters
| Parameter | Type | Description |
| -------------- | ---------------- | -------------------------------------------------------- |
| `expression` | `string` | A 5-field cron expression or predefined nickname |
| `relativeDate` | `Date \| number` | Starting point for the search (defaults to `Date.now()`) |
### Returns
`Date | null` — the next matching UTC time, or `null` if no match exists within ~4 years (e.g. February 30th).
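For example, an expression that can never match returns `null`:
```ts
// February 30th never occurs, so there is no next match
const never = Bun.cron.parse("0 0 30 2 *");
console.log(never); // => null
```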
### Chaining calls
Call `parse()` repeatedly to get a sequence of upcoming times:
```ts
const from = Date.UTC(2025, 0, 15, 10, 0, 0);
const first = Bun.cron.parse("0 * * * *", from);
console.log(first); // => 2025-01-15T11:00:00.000Z
const second = Bun.cron.parse("0 * * * *", first);
console.log(second); // => 2025-01-15T12:00:00.000Z
```
---
## Cron expression syntax
Standard 5-field format: `minute hour day-of-month month day-of-week`
| Field | Values | Special characters |
| ------------ | ----------------------- | ------------------ |
| Minute | `0`–`59` | `*` `,` `-` `/` |
| Hour | `0`–`23` | `*` `,` `-` `/` |
| Day of month | `1`–`31` | `*` `,` `-` `/` |
| Month | `1`–`12` or `JAN`–`DEC` | `*` `,` `-` `/` |
| Day of week | `0`–`7` or `SUN`–`SAT` | `*` `,` `-` `/` |
### Special characters
| Character | Description | Example |
| --------- | ----------- | ------------------------------------- |
| `*` | All values | `* * * * *` — every minute |
| `,` | List | `1,15 * * * *` — minute 1 and 15 |
| `-` | Range | `9-17 * * * *` — minutes 9 through 17 |
| `/` | Step | `*/15 * * * *` — every 15 minutes |
### Named values
Month and weekday fields accept case-insensitive names:
```ts
// 3-letter abbreviations
Bun.cron.parse("0 9 * * MON-FRI"); // weekdays
Bun.cron.parse("0 0 1 JAN,JUN *"); // January and June
// Full names
Bun.cron.parse("0 9 * * Monday-Friday");
Bun.cron.parse("0 0 1 January *");
```
Both `0` and `7` mean Sunday in the weekday field.
### Predefined nicknames
| Nickname | Equivalent | Description |
| ----------------------- | ----------- | ------------------------- |
| `@yearly` / `@annually` | `0 0 1 1 *` | Once a year (January 1st) |
| `@monthly` | `0 0 1 * *` | Once a month (1st day) |
| `@weekly` | `0 0 * * 0` | Once a week (Sunday) |
| `@daily` / `@midnight` | `0 0 * * *` | Once a day (midnight) |
| `@hourly` | `0 * * * *` | Once an hour |
```ts
const next = Bun.cron.parse("@daily");
console.log(next); // => next midnight UTC
```
### Day-of-month and day-of-week interaction
When **both** day-of-month and day-of-week are specified (neither is `*`), the expression matches when **either** condition is true. This follows the [POSIX cron](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/crontab.html) standard.
```ts
// Fires on the 15th of every month OR every Friday
Bun.cron.parse("0 0 15 * FRI");
```
When only one is specified (the other is `*`), only that field is used for matching.
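For example:
```ts
// Day-of-week is *, so only day-of-month matters: the 15th of every month
Bun.cron.parse("0 0 15 * *");
// Day-of-month is *, so only day-of-week matters: every Friday
Bun.cron.parse("0 0 * * FRI");
```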
---
## `Bun.cron()`
Register an OS-level cron job that runs a JavaScript/TypeScript module on a schedule.
```ts
await Bun.cron("./worker.ts", "30 2 * * MON", "weekly-report");
```
### Parameters
| Parameter | Type | Description |
| ---------- | -------- | ---------------------------------------------------------- |
| `path` | `string` | Path to the script (resolved relative to caller) |
| `schedule` | `string` | Cron expression or nickname |
| `title` | `string` | Unique job identifier (alphanumeric, hyphens, underscores) |
Re-registering with the same `title` overwrites the existing job in-place — the old schedule is replaced, not duplicated.
```ts
await Bun.cron("./worker.ts", "0 * * * *", "my-job"); // every hour
await Bun.cron("./worker.ts", "*/15 * * * *", "my-job"); // replaces: every 15 min
```
### The `scheduled()` handler
The registered script must export a default object with a `scheduled()` method, following the [Cloudflare Workers Cron Triggers API](https://developers.cloudflare.com/workers/runtime-apis/handlers/scheduled/):
```ts worker.ts
export default {
scheduled(controller) {
console.log(controller.cron); // "30 2 * * 1"
console.log(controller.type); // "scheduled"
console.log(controller.scheduledTime); // 1737340200000
},
};
```
The handler can be `async`. Bun waits for the returned promise to settle before exiting.
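For example, a minimal async handler (the URL here is just a placeholder):
```ts worker.ts
export default {
  async scheduled(controller) {
    const res = await fetch("https://example.com/report");
    console.log(controller.cron, res.status);
  },
};
```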
---
## How it works per platform
### Linux
Bun uses [crontab](https://man7.org/linux/man-pages/man5/crontab.5.html) to register jobs. Each job is stored as a line in your user's crontab with a `# bun-cron: <title>` marker comment above it.
The crontab entry looks like:
```
<schedule> '<bun-path>' run --cron-title=<title> --cron-period='<schedule>' '<script-path>'
```
When the cron daemon fires the job, Bun imports your module and calls the `scheduled()` handler.
**Viewing registered jobs:**
```sh
crontab -l
```
**Logs:** On Linux, cron output goes to the system log. Check with:
```sh
# systemd-based (Ubuntu, Fedora, Arch, etc.)
journalctl -u cron # or crond on some distros
journalctl -u cron --since "1 hour ago"
# syslog-based (older systems)
grep CRON /var/log/syslog
```
To capture stdout/stderr to a file, redirect output in the crontab entry directly, or add logging inside your `scheduled()` handler.
**Manually uninstalling without code:**
```sh
# Edit your crontab and remove the "# bun-cron: <title>" comment
# and the command line below it
crontab -e
# Or remove ALL bun cron jobs at once by filtering them out:
crontab -l | grep -v "# bun-cron:" | grep -v "\-\-cron-title=" | crontab -
```
### macOS
Bun uses [launchd](https://developer.apple.com/library/archive/documentation/MacOSX/Conceptual/BPSystemStartup/Chapters/CreatingLaunchdJobs.html) to register jobs. Each job is installed as a plist file at:
```
~/Library/LaunchAgents/bun.cron.<title>.plist
```
The plist uses `StartCalendarInterval` to define the schedule. Only simple expressions (single values and `*`) are supported on macOS — complex patterns with ranges, lists, or steps will be rejected.
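For example:
```ts
// Works on macOS: single values and * map onto StartCalendarInterval fields
await Bun.cron("./worker.ts", "30 2 * * 1", "nightly-report");
// Rejected on macOS: steps, ranges, and lists can't be expressed this way
// await Bun.cron("./worker.ts", "*/15 * * * *", "every-15-min");
```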
**Viewing registered jobs:**
```sh
launchctl list | grep sh.bun.cron
```
**Logs:** stdout and stderr are written to:
```
/tmp/bun.cron.<title>.stdout.log
/tmp/bun.cron.<title>.stderr.log
```
For example, a job titled `weekly-report`:
```sh
cat /tmp/bun.cron.weekly-report.stdout.log
tail -f /tmp/bun.cron.weekly-report.stderr.log
```
**Manually uninstalling without code:**
```sh
# Unload the job from launchd
launchctl bootout gui/$(id -u)/bun.cron.<title>
# Delete the plist file
rm ~/Library/LaunchAgents/bun.cron.<title>.plist
# Example for a job titled "weekly-report":
launchctl bootout gui/$(id -u)/bun.cron.weekly-report
rm ~/Library/LaunchAgents/bun.cron.weekly-report.plist
```
### Windows
Bun uses [Task Scheduler](https://learn.microsoft.com/en-us/windows/win32/taskschd/task-scheduler-start-page) via `schtasks`. Each job is registered as a scheduled task named `bun-cron-<title>`.
Only simple schedule patterns are supported — `*/N` (every N minutes), `N * * * *` (hourly), `N N * * *` (daily), and `N N * * N` (weekly).
**Viewing registered jobs:**
```powershell
schtasks /query /tn "bun-cron-<title>"
# List all bun cron tasks
schtasks /query | findstr "bun-cron-"
```
**Logs:** Task Scheduler logs events to the Windows Event Log. View them with:
```powershell
# In PowerShell
Get-WinEvent -LogName Microsoft-Windows-TaskScheduler/Operational | Where-Object { $_.Message -like "*bun-cron*" }
```
Or open **Event Viewer** → **Applications and Services Logs** → **Microsoft** → **Windows** → **TaskScheduler** → **Operational**.
To capture stdout/stderr to a file, add logging inside your `scheduled()` handler.
**Manually uninstalling without code:**
```powershell
schtasks /delete /tn "bun-cron-<title>" /f
# Example:
schtasks /delete /tn "bun-cron-weekly-report" /f
```
Or open **Task Scheduler** (taskschd.msc), find the task named `bun-cron-<title>`, right-click, and delete it.
---
## `Bun.cron.remove()`
Remove a previously registered cron job by its title. Works on all platforms.
```ts
await Bun.cron.remove("weekly-report");
```
This reverses what `Bun.cron()` did:
| Platform | What `remove()` does |
| -------- | -------------------------------------------------------- |
| Linux | Edits crontab to remove the entry and its marker comment |
| macOS | Runs `launchctl bootout` and deletes the plist file |
| Windows | Runs `schtasks /delete` to remove the scheduled task |
Removing a job that doesn't exist resolves without error.

View File

@@ -7191,6 +7191,92 @@ declare module "bun" {
options?: SpawnOptions.SpawnSyncOptions<In, Out, Err>,
): SyncSubprocess<Out, Err>;
/**
* Register an OS-level cron job that runs a JavaScript/TypeScript module on a schedule.
*
* The module must export a `default` object with a `scheduled(controller)` method,
* conforming to the [Cloudflare Workers Cron Triggers API](https://developers.cloudflare.com/workers/runtime-apis/handlers/scheduled/).
*
* On Linux, registers with [crontab](https://man7.org/linux/man-pages/man5/crontab.5.html).
* On macOS, registers with [launchd](https://developer.apple.com/library/archive/documentation/MacOSX/Conceptual/BPSystemStartup/Chapters/CreatingLaunchdJobs.html).
* On Windows, registers with [Task Scheduler](https://learn.microsoft.com/en-us/windows/win32/taskschd/task-scheduler-start-page).
*
* **Cron expression syntax** (5 fields: `minute hour day month weekday`):
*
* | Field | Values | Special |
* |-------|--------|---------|
* | Minute | `0-59` | `*` `,` `-` `/` |
* | Hour | `0-23` | `*` `,` `-` `/` |
* | Day of month | `1-31` | `*` `,` `-` `/` |
* | Month | `1-12` or `JAN-DEC` | `*` `,` `-` `/` |
* | Day of week | `0-7` or `SUN-SAT` | `*` `,` `-` `/` |
*
* - `0` and `7` both mean Sunday in the weekday field.
* - Month/day names are case-insensitive (`MON`, `Mon`, `Monday` all work).
* - Predefined nicknames: `@yearly`, `@annually`, `@monthly`, `@weekly`, `@daily`, `@midnight`, `@hourly`.
* - When both day-of-month and day-of-week are specified (neither is `*`),
* the job runs when **either** field matches ([POSIX cron](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/crontab.html) behavior).
*
* @param path - Path to the script to run (resolved relative to caller)
* @param schedule - Cron expression or predefined nickname (e.g. `"30 2 * * MON"`, `"@daily"`)
* @param title - Unique identifier for this cron job (alphanumeric, hyphens, underscores only)
* @returns Promise that resolves when the cron job is registered
* @throws If the expression is invalid, the title contains invalid characters, or registration fails
*
* @example
* ```ts
* // Run every Monday at 2:30 AM
* await Bun.cron("./worker.ts", "30 2 * * MON", "weekly-report");
*
* // Run daily at midnight
* await Bun.cron("./cleanup.ts", "@daily", "daily-cleanup");
* ```
*/
const cron: {
(path: string, schedule: string, title: string): Promise<void>;
/**
* Remove a previously registered cron job by its title.
*
* @param title - The title of the cron job to remove
* @returns Promise that resolves when the cron job is removed
*
* @example
* ```ts
* await Bun.cron.remove("weekly-report");
* ```
*/
remove(title: string): Promise<void>;
/**
* Parse a cron expression and return the next matching UTC Date.
*
* Supports the same syntax as {@link Bun.cron} — 5-field expressions, named
* days/months, and predefined nicknames like `@daily`.
*
* When both day-of-month and day-of-week are specified (neither is `*`),
* matching uses OR logic per [POSIX cron](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/crontab.html):
* a date matches if **either** field matches.
*
* @param expression - A cron expression or nickname (e.g. `"0,15,30,45 * * * *"`, `"0 9 * * MON-FRI"`, `"@hourly"`)
* @param relativeDate - Starting point for the search (defaults to `Date.now()`). Accepts a `Date` or milliseconds since epoch.
* @returns The next `Date` matching the expression (UTC), or `null` if no match exists within ~4 years (e.g. `"0 0 30 2 *"` — Feb 30 never occurs)
* @throws If the expression is invalid or `relativeDate` is `NaN`/`Infinity`
*
* @example
* ```ts
* // Next weekday at 09:30 UTC
* const next = Bun.cron.parse("30 9 * * MON-FRI");
*
* // Chain calls to get a sequence
* const first = Bun.cron.parse("@hourly", from);
* const second = Bun.cron.parse("@hourly", first);
*
* // With a specific starting point
* const nextJan1 = Bun.cron.parse("0 0 1 JAN *", Date.UTC(2025, 0, 1));
* ```
*/
parse(expression: string, relativeDate?: Date | number): Date | null;
};
/** Utility type for any process from {@link Bun.spawn()} with both stdout and stderr set to `"pipe"` */
type ReadableSubprocess = Subprocess<any, "pipe", "pipe">;
/** Utility type for any process from {@link Bun.spawn()} with stdin set to `"pipe"` */

View File

@@ -208,6 +208,38 @@ pub const Run = struct {
if (ctx.runtime_options.eval.eval_and_print) {
b.options.dead_code_elimination = false;
}
} else if (ctx.runtime_options.cron_title.len > 0 and ctx.runtime_options.cron_period.len > 0) {
// Cron execution mode: wrap the entry point in a script that imports the
// module and calls default.scheduled(controller)
// Escape path for embedding in JS string literal (handle backslashes on Windows)
const escaped_path = try escapeForJSString(bun.default_allocator, entry_path);
defer bun.default_allocator.free(escaped_path);
const escaped_period = try escapeForJSString(bun.default_allocator, ctx.runtime_options.cron_period);
defer bun.default_allocator.free(escaped_period);
const cron_script = try std.fmt.allocPrint(bun.default_allocator,
\\const mod = await import("{s}");
\\const scheduled = (mod.default || mod).scheduled;
\\if (typeof scheduled !== "function") throw new Error("Module does not export default.scheduled()");
\\const controller = {{ cron: "{s}", type: "scheduled", scheduledTime: Date.now() }};
\\await scheduled(controller);
, .{ escaped_path, escaped_period });
// entry_path must end with /[eval] for the transpiler to use eval_source
const trigger = bun.pathLiteral("/[eval]");
var cwd_buf: bun.PathBuffer = undefined;
const cwd_slice = switch (bun.sys.getcwd(&cwd_buf)) {
.result => |cwd| cwd,
.err => return error.SystemResources,
};
var eval_path_buf: [bun.MAX_PATH_BYTES + trigger.len]u8 = undefined;
@memcpy(eval_path_buf[0..cwd_slice.len], cwd_slice);
@memcpy(eval_path_buf[cwd_slice.len..][0..trigger.len], trigger);
const eval_entry_path = eval_path_buf[0 .. cwd_slice.len + trigger.len];
// Heap-allocate the path so it outlives this stack frame
const heap_entry_path = try bun.default_allocator.dupe(u8, eval_entry_path);
const script_source = try bun.default_allocator.create(logger.Source);
script_source.* = logger.Source.initPathString(heap_entry_path, cron_script);
vm.module_loader.eval_source = script_source;
run.entry_path = heap_entry_path;
}
b.options.install = ctx.install;
@@ -564,6 +596,32 @@ const VirtualMachine = jsc.VirtualMachine;
const string = []const u8;
/// Escape a string for safe embedding in a JS double-quoted string literal.
/// Escapes backslashes, double quotes, newlines, etc.
fn escapeForJSString(allocator: std.mem.Allocator, input: []const u8) ![]const u8 {
var needs_escape = false;
for (input) |c| {
if (c == '\\' or c == '"' or c == '\n' or c == '\r' or c == '\t') {
needs_escape = true;
break;
}
}
if (!needs_escape) return allocator.dupe(u8, input);
var result = try std.array_list.Managed(u8).initCapacity(allocator, input.len + 16);
for (input) |c| {
switch (c) {
'\\' => try result.appendSlice("\\\\"),
'"' => try result.appendSlice("\\\""),
'\n' => try result.appendSlice("\\n"),
'\r' => try result.appendSlice("\\r"),
'\t' => try result.appendSlice("\\t"),
else => try result.append(c),
}
}
return result.toOwnedSlice();
}
const CPUProfiler = @import("./bun.js/bindings/BunCPUProfiler.zig");
const HeapProfiler = @import("./bun.js/bindings/BunHeapProfiler.zig");
const options = @import("./options.zig");

View File

@@ -69,6 +69,7 @@ pub const BunObject = struct {
pub const YAML = toJSLazyPropertyCallback(Bun.getYAMLObject);
pub const Transpiler = toJSLazyPropertyCallback(Bun.getTranspilerConstructor);
pub const argv = toJSLazyPropertyCallback(Bun.getArgv);
pub const cron = toJSLazyPropertyCallback(@import("./cron.zig").getCronObject);
pub const cwd = toJSLazyPropertyCallback(Bun.getCWD);
pub const embeddedFiles = toJSLazyPropertyCallback(Bun.getEmbeddedFiles);
pub const enableANSIColors = toJSLazyPropertyCallback(Bun.enableANSIColors);
@@ -139,6 +140,7 @@ pub const BunObject = struct {
@export(&BunObject.Glob, .{ .name = lazyPropertyCallbackName("Glob") });
@export(&BunObject.Transpiler, .{ .name = lazyPropertyCallbackName("Transpiler") });
@export(&BunObject.argv, .{ .name = lazyPropertyCallbackName("argv") });
@export(&BunObject.cron, .{ .name = lazyPropertyCallbackName("cron") });
@export(&BunObject.cwd, .{ .name = lazyPropertyCallbackName("cwd") });
@export(&BunObject.enableANSIColors, .{ .name = lazyPropertyCallbackName("enableANSIColors") });
@export(&BunObject.hash, .{ .name = lazyPropertyCallbackName("hash") });

View File

@@ -87,6 +87,8 @@ pub const ProcessExitHandler = struct {
MultiRunProcessHandle,
SecurityScanSubprocess,
SyncProcess,
CronRegisterJob,
CronRemoveJob,
},
);
@@ -124,6 +126,14 @@ pub const ProcessExitHandler = struct {
const subprocess = this.ptr.as(SecurityScanSubprocess);
subprocess.onProcessExit(process, status, rusage);
},
@field(TaggedPointer.Tag, @typeName(CronRegisterJob)) => {
const cron_job = this.ptr.as(CronRegisterJob);
cron_job.onProcessExit(process, status, rusage);
},
@field(TaggedPointer.Tag, @typeName(CronRemoveJob)) => {
const cron_job = this.ptr.as(CronRemoveJob);
cron_job.onProcessExit(process, status, rusage);
},
@field(TaggedPointer.Tag, @typeName(SyncProcess)) => {
const subprocess = this.ptr.as(SyncProcess);
if (comptime Environment.isPosix) {
@@ -2259,6 +2269,9 @@ const std = @import("std");
const MultiRunProcessHandle = @import("../../../cli/multi_run.zig").ProcessHandle;
const ProcessHandle = @import("../../../cli/filter_run.zig").ProcessHandle;
const CronRegisterJob = @import("../cron.zig").CronRegisterJob;
const CronRemoveJob = @import("../cron.zig").CronRemoveJob;
const bun = @import("bun");
const Environment = bun.Environment;
const Output = bun.Output;

1133
src/bun.js/api/cron.zig Normal file

File diff suppressed because it is too large

View File

@@ -0,0 +1,314 @@
/// Cron expression parser and next-occurrence calculator.
///
/// Parses standard 5-field cron expressions (minute hour day month weekday)
/// into a bitset representation, and computes the next matching UTC time.
///
/// Supports:
/// - Wildcards: *
/// - Lists: 1,3,5
/// - Ranges: 1-5
/// - Steps: */15, 1-30/2
/// - Named days: SUN-SAT, Sun-Sat, Sunday-Saturday (case-insensitive)
/// - Named months: JAN-DEC, Jan-Dec, January-December (case-insensitive)
/// - Sunday as 7: weekday field accepts 7 as alias for 0
/// - Nicknames: @yearly, @annually, @monthly, @weekly, @daily, @midnight, @hourly
pub const CronExpression = struct {
minutes: u64, // bits 0-59
hours: u32, // bits 0-23
days: u32, // bits 1-31
months: u16, // bits 1-12
weekdays: u8, // bits 0-6 (0=Sunday)
days_is_wildcard: bool, // true if day-of-month field was *
weekdays_is_wildcard: bool, // true if weekday field was *
pub const Error = error{
InvalidField,
InvalidStep,
InvalidRange,
InvalidNumber,
TooManyFields,
TooFewFields,
};
/// Parse a 5-field cron expression or predefined nickname into a CronExpression.
pub fn parse(input: []const u8) Error!CronExpression {
const expr = bun.strings.trim(input, " \t");
// Check for predefined nicknames
if (expr.len > 0 and expr[0] == '@') {
return parseNickname(expr) orelse error.InvalidField;
}
var count: usize = 0;
var fields: [5][]const u8 = undefined;
var iter = std.mem.tokenizeAny(u8, expr, " \t");
while (iter.next()) |field| {
if (count >= 5) return error.TooManyFields;
fields[count] = field;
count += 1;
}
if (count != 5) return error.TooFewFields;
return .{
.minutes = try parseField(u64, fields[0], 0, 59, .none),
.hours = try parseField(u32, fields[1], 0, 23, .none),
.days = try parseField(u32, fields[2], 1, 31, .none),
.months = try parseField(u16, fields[3], 1, 12, .month),
.weekdays = try parseField(u8, fields[4], 0, 6, .weekday),
.days_is_wildcard = bun.strings.eql(fields[2], "*"),
.weekdays_is_wildcard = bun.strings.eql(fields[4], "*"),
};
}
/// Validate a cron expression string without allocating.
pub fn validate(expr: []const u8) bool {
_ = parse(expr) catch return false;
return true;
}
/// Format the expression as a normalized numeric "M H D Mo W" string
/// suitable for crontab. Returns the written slice of `buf`.
pub fn formatNumeric(self: CronExpression, buf: *[512]u8) []const u8 {
var stream = std.io.fixedBufferStream(buf);
const w = stream.writer();
formatBitfield(w, u64, self.minutes, 0, 59);
w.writeByte(' ') catch unreachable;
formatBitfield(w, u32, self.hours, 0, 23);
w.writeByte(' ') catch unreachable;
formatBitfield(w, u32, self.days, 1, 31);
w.writeByte(' ') catch unreachable;
formatBitfield(w, u16, self.months, 1, 12);
w.writeByte(' ') catch unreachable;
formatBitfield(w, u8, self.weekdays, 0, 6);
return stream.getWritten();
}
/// Compute the next UTC time (in ms since epoch) that matches this expression,
/// starting from `from_ms`. Returns null if no match found within ~4 years.
pub fn next(self: CronExpression, globalObject: *jsc.JSGlobalObject, from_ms: f64) bun.JSError!?f64 {
var dt = globalObject.msToGregorianDateTimeUTC(from_ms);
// Advance by 1 minute, zero out seconds
dt.minute += 1;
if (dt.minute > 59) {
dt.minute = 0;
dt.hour += 1;
if (dt.hour > 23) {
dt.hour = 0;
dt.day += 1;
}
}
dt.second = 0;
// Loop up to ~4 years to prevent infinite iteration
var iterations: u32 = 0;
const max_iterations: u32 = 1500 * 24 * 60;
while (iterations < max_iterations) : (iterations += 1) {
// Normalize via round-trip to handle overflows and compute weekday
{
const ms = try globalObject.gregorianDateTimeToMSUTC(dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second, 0);
dt = globalObject.msToGregorianDateTimeUTC(ms);
}
// Check month
if (!bitSet(u16, self.months, @intCast(dt.month))) {
dt.month += 1;
dt.day = 1;
dt.hour = 0;
dt.minute = 0;
continue;
}
// POSIX cron day-of-month / day-of-week logic:
// - If both are restricted (neither was *): OR — either matching is enough
// - If only one is restricted: only that one matters (the * field matches all)
const day_ok = bitSet(u32, self.days, @intCast(dt.day));
const weekday_ok = bitSet(u8, self.weekdays, @intCast(dt.weekday));
const both_restricted = !self.days_is_wildcard and !self.weekdays_is_wildcard;
const day_match = if (both_restricted) (day_ok or weekday_ok) else (day_ok and weekday_ok);
if (!day_match) {
dt.day += 1;
dt.hour = 0;
dt.minute = 0;
continue;
}
// Check hour
if (!bitSet(u32, self.hours, @intCast(dt.hour))) {
dt.hour += 1;
dt.minute = 0;
continue;
}
// Check minute
if (!bitSet(u64, self.minutes, @intCast(dt.minute))) {
dt.minute += 1;
continue;
}
// All fields match
return try globalObject.gregorianDateTimeToMSUTC(dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second, 0);
}
return null;
}
};
// ============================================================================
// Name lookup tables
// ============================================================================
const all_hours: u32 = (1 << 24) - 1;
const all_days: u32 = ((1 << 32) - 1) & ~@as(u32, 1);
const all_months: u16 = ((1 << 13) - 1) & ~@as(u16, 1);
const all_weekdays: u8 = (1 << 7) - 1;
fn parseNickname(expr: []const u8) ?CronExpression {
const eql = bun.strings.eqlCaseInsensitiveASCIIICheckLength;
if (eql(expr, "@yearly") or eql(expr, "@annually"))
return .{ .minutes = 1, .hours = 1, .days = 1 << 1, .months = 1 << 1, .weekdays = all_weekdays, .days_is_wildcard = false, .weekdays_is_wildcard = true };
if (eql(expr, "@monthly"))
return .{ .minutes = 1, .hours = 1, .days = 1 << 1, .months = all_months, .weekdays = all_weekdays, .days_is_wildcard = false, .weekdays_is_wildcard = true };
if (eql(expr, "@weekly"))
return .{ .minutes = 1, .hours = 1, .days = all_days, .months = all_months, .weekdays = 1, .days_is_wildcard = true, .weekdays_is_wildcard = false };
if (eql(expr, "@daily") or eql(expr, "@midnight"))
return .{ .minutes = 1, .hours = 1, .days = all_days, .months = all_months, .weekdays = all_weekdays, .days_is_wildcard = true, .weekdays_is_wildcard = true };
if (eql(expr, "@hourly"))
return .{ .minutes = 1, .hours = all_hours, .days = all_days, .months = all_months, .weekdays = all_weekdays, .days_is_wildcard = true, .weekdays_is_wildcard = true };
return null;
}
const weekday_map = bun.ComptimeStringMap(u7, .{
.{ "sun", 0 }, .{ "mon", 1 }, .{ "tue", 2 },
.{ "wed", 3 }, .{ "thu", 4 }, .{ "fri", 5 },
.{ "sat", 6 }, .{ "sunday", 0 }, .{ "monday", 1 },
.{ "tuesday", 2 }, .{ "wednesday", 3 }, .{ "thursday", 4 },
.{ "friday", 5 }, .{ "saturday", 6 },
});
const month_map = bun.ComptimeStringMap(u7, .{
.{ "jan", 1 }, .{ "feb", 2 }, .{ "mar", 3 },
.{ "apr", 4 }, .{ "may", 5 }, .{ "jun", 6 },
.{ "jul", 7 }, .{ "aug", 8 }, .{ "sep", 9 },
.{ "oct", 10 }, .{ "nov", 11 }, .{ "dec", 12 },
.{ "january", 1 }, .{ "february", 2 }, .{ "march", 3 },
.{ "april", 4 }, .{ "may", 5 }, .{ "june", 6 },
.{ "july", 7 }, .{ "august", 8 }, .{ "september", 9 },
.{ "october", 10 }, .{ "november", 11 }, .{ "december", 12 },
});
// ============================================================================
// Field parsing
// ============================================================================
const NameKind = enum { none, weekday, month };
/// Parse a single cron field (e.g. "1,5-10,*/3") into a bitset.
fn parseField(comptime T: type, field: []const u8, min: u7, max: u7, kind: NameKind) CronExpression.Error!T {
if (field.len == 0) return error.InvalidField;
var result: T = 0;
var parts = std.mem.splitScalar(u8, field, ',');
while (parts.next()) |part| {
if (part.len == 0) return error.InvalidField;
// Split by / for step
var step_iter = std.mem.splitScalar(u8, part, '/');
const base = step_iter.next() orelse return error.InvalidField;
const step_str = step_iter.next();
if (step_iter.next() != null) return error.InvalidStep;
const step: u7 = if (step_str) |s| blk: {
if (s.len == 0) return error.InvalidStep;
break :blk std.fmt.parseInt(u7, s, 10) catch return error.InvalidStep;
} else 1;
if (step == 0) return error.InvalidStep;
var range_min: u7 = undefined;
var range_max: u7 = undefined;
if (bun.strings.eql(base, "*")) {
range_min = min;
range_max = max;
} else {
if (splitRange(base)) |range_parts| {
const lo = parseValue(range_parts[0], min, max, kind) catch return error.InvalidNumber;
const hi = parseValue(range_parts[1], min, max, kind) catch return error.InvalidNumber;
if (lo > hi) return error.InvalidRange;
range_min = lo;
range_max = hi;
} else {
const lo = parseValue(base, min, max, kind) catch return error.InvalidNumber;
range_min = lo;
range_max = if (step_str != null) max else lo;
}
}
// Set bits
var i: u7 = range_min;
while (i <= range_max) : (i += step) {
result |= @as(T, 1) << @intCast(i);
if (@as(u8, i) + @as(u8, step) > range_max) break;
}
}
return result;
}
/// Split a base expression on '-' for ranges, returning null if not a range.
fn splitRange(base: []const u8) ?[2][]const u8 {
const idx = bun.strings.indexOfChar(base, '-') orelse return null;
if (idx == 0 or idx == base.len - 1) return null;
const rest = base[idx + 1 ..];
if (bun.strings.indexOfChar(rest, '-') != null) return null;
return .{ base[0..idx], rest };
}
/// Parse a single value (number or name), validating range.
/// For weekday fields, 7 is normalized to 0 (Sunday).
fn parseValue(str: []const u8, min: u7, max: u7, kind: NameKind) error{InvalidNumber}!u7 {
// Try named value first via ComptimeStringMap case-insensitive lookup
switch (kind) {
.weekday => if (weekday_map.getASCIIICaseInsensitive(str)) |v| return v,
.month => if (month_map.getASCIIICaseInsensitive(str)) |v| return v,
.none => {},
}
const val = std.fmt.parseInt(u8, str, 10) catch return error.InvalidNumber;
if (kind == .weekday and val == 7) return 0;
if (val < min or val > max) return error.InvalidNumber;
return @intCast(val);
}
// ============================================================================
// Helpers
// ============================================================================
inline fn bitSet(comptime T: type, set: T, pos: std.math.Log2Int(T)) bool {
return (set >> pos) & 1 != 0;
}
/// Write a bitfield as a cron field string: "*" if all bits set, or comma-separated values.
fn formatBitfield(w: anytype, comptime T: type, bits: T, min: u8, max: u8) void {
var all_set = true;
for (min..max + 1) |i| {
if ((bits >> @intCast(i)) & 1 == 0) {
all_set = false;
break;
}
}
if (all_set) {
w.writeByte('*') catch unreachable;
return;
}
var first = true;
for (min..max + 1) |i| {
if ((bits >> @intCast(i)) & 1 != 0) {
if (!first) w.writeByte(',') catch unreachable;
std.fmt.format(w, "{d}", .{i}) catch unreachable;
first = false;
}
}
}
const std = @import("std");
const bun = @import("bun");
const jsc = bun.jsc;

View File

@@ -28,6 +28,7 @@
macro(ValkeyClient) \
macro(argv) \
macro(assetPrefix) \
macro(cron) \
macro(cwd) \
macro(embeddedFiles) \
macro(enableANSIColors) \

View File

@@ -934,6 +934,7 @@ JSC_DEFINE_HOST_FUNCTION(functionFileURLToPath, (JSC::JSGlobalObject * globalObj
build BunObject_callback_build DontDelete|Function 1
concatArrayBuffers functionConcatTypedArrays DontDelete|Function 3
connect BunObject_callback_connect DontDelete|Function 1
cron BunObject_lazyPropCb_wrap_cron DontDelete|PropertyCallback
cwd BunObject_lazyPropCb_wrap_cwd DontEnum|DontDelete|PropertyCallback
color BunObject_callback_color DontDelete|Function 2
deepEquals functionBunDeepEquals DontDelete|Function 2

View File

@@ -954,6 +954,7 @@ BUN_DEFINE_HOST_FUNCTION(jsFunctionBunPluginClear, (JSC::JSGlobalObject * global
global->onResolvePlugins.namespaces.clear();
delete global->onLoadPlugins.virtualModules;
global->onLoadPlugins.virtualModules = nullptr;
return JSC::JSValue::encode(JSC::jsUndefined());
}

View File

@@ -27,6 +27,28 @@ pub const JSGlobalObject = opaque {
return bun.cpp.Bun__gregorianDateTimeToMS(this, year, month, day, hour, minute, second, millisecond);
}
pub fn gregorianDateTimeToMSUTC(this: *jsc.JSGlobalObject, year: i32, month: i32, day: i32, hour: i32, minute: i32, second: i32, millisecond: i32) bun.JSError!f64 {
jsc.markBinding(@src());
return bun.cpp.Bun__gregorianDateTimeToMSUTC(this, year, month, day, hour, minute, second, millisecond);
}
pub const GregorianDateTime = struct {
year: i32,
month: i32,
day: i32,
hour: i32,
minute: i32,
second: i32,
weekday: i32,
};
pub fn msToGregorianDateTimeUTC(this: *jsc.JSGlobalObject, ms: f64) GregorianDateTime {
jsc.markBinding(@src());
var dt: GregorianDateTime = undefined;
bun.cpp.Bun__msToGregorianDateTimeUTC(this, ms, &dt.year, &dt.month, &dt.day, &dt.hour, &dt.minute, &dt.second, &dt.weekday);
return dt;
}
pub fn throwTODO(this: *JSGlobalObject, msg: []const u8) bun.JSError {
const err = this.createErrorInstance("{s}", .{msg});
err.put(this, ZigString.static("name"), (bun.String.static("TODOError").toJS(this)) catch return error.JSError);

View File

@@ -5641,6 +5641,34 @@ extern "C" [[ZIG_EXPORT(check_slow)]] double Bun__gregorianDateTimeToMS(JSC::JSG
return vm.dateCache.gregorianDateTimeToMS(dateTime, millisecond, WTF::TimeType::LocalTime);
}
extern "C" [[ZIG_EXPORT(check_slow)]] double Bun__gregorianDateTimeToMSUTC(JSC::JSGlobalObject* globalObject, int year, int month, int day, int hour, int minute, int second, int millisecond)
{
auto& vm = JSC::getVM(globalObject);
WTF::GregorianDateTime dateTime;
dateTime.setYear(year);
dateTime.setMonth(month - 1);
dateTime.setMonthDay(day);
dateTime.setHour(hour);
dateTime.setMinute(minute);
dateTime.setSecond(second);
return vm.dateCache.gregorianDateTimeToMS(dateTime, millisecond, WTF::TimeType::UTCTime);
}
extern "C" [[ZIG_EXPORT(nothrow)]] void Bun__msToGregorianDateTimeUTC(JSC::JSGlobalObject* globalObject, double ms,
int* year, int* month, int* day, int* hour, int* minute, int* second, int* weekday)
{
auto& vm = JSC::getVM(globalObject);
WTF::GregorianDateTime dt;
vm.dateCache.msToGregorianDateTime(ms, WTF::TimeType::UTCTime, dt);
*year = dt.year();
*month = dt.month() + 1;
*day = dt.monthDay();
*hour = dt.hour();
*minute = dt.minute();
*second = dt.second();
*weekday = dt.weekDay();
}
extern "C" EncodedJSValue JSC__JSValue__dateInstanceFromNumber(JSC::JSGlobalObject* globalObject, double unixTimestamp)
{
auto& vm = JSC::getVM(globalObject);

View File

@@ -389,6 +389,8 @@ pub const Command = struct {
expose_gc: bool = false,
preserve_symlinks_main: bool = false,
console_depth: ?u16 = null,
cron_title: []const u8 = "",
cron_period: []const u8 = "",
cpu_prof: struct {
enabled: bool = false,
name: []const u8 = "",

View File

@@ -124,6 +124,8 @@ pub const runtime_params_ = [_]ParamType{
clap.parseParam("--unhandled-rejections <STR> One of \"strict\", \"throw\", \"warn\", \"none\", or \"warn-with-error-code\"") catch unreachable,
clap.parseParam("--console-depth <NUMBER> Set the default depth for console.log object inspection (default: 2)") catch unreachable,
clap.parseParam("--user-agent <STR> Set the default User-Agent header for HTTP requests") catch unreachable,
clap.parseParam("--cron-title <STR> Title for cron execution mode") catch unreachable,
clap.parseParam("--cron-period <STR> Cron period for cron execution mode") catch unreachable,
};
pub const auto_or_run_params = [_]ParamType{
@@ -823,6 +825,17 @@ pub fn parse(allocator: std.mem.Allocator, ctx: Command.Context, comptime cmd: C
ctx.runtime_options.dns_result_order = order;
}
if (args.option("--cron-title")) |t| {
ctx.runtime_options.cron_title = t;
}
if (args.option("--cron-period")) |p| {
ctx.runtime_options.cron_period = p;
}
if ((ctx.runtime_options.cron_title.len > 0) != (ctx.runtime_options.cron_period.len > 0)) {
Output.errGeneric("--cron-title and --cron-period must be provided together", .{});
Global.exit(1);
}
if (args.option("--inspect")) |inspect_flag| {
ctx.runtime_options.debugger = if (inspect_flag.len == 0)
Command.Debugger{ .enable = .{} }

View File

@@ -948,6 +948,7 @@ pub const CommandLineReporter = struct {
this.printSummary();
Output.prettyError("\nBailed out after {d} failure{s}<r>\n", .{ this.jest.bail, if (this.jest.bail == 1) "" else "s" });
Output.flush();
this.writeJUnitReportIfNeeded();
Global.exit(1);
}
},
@@ -970,6 +971,20 @@ pub const CommandLineReporter = struct {
Output.printStartEnd(bun.start_time, std.time.nanoTimestamp());
}
/// Writes the JUnit reporter output file if a JUnit reporter is active and
/// an outfile path was configured. This must be called before any early exit
/// (e.g. bail) so that the report is not lost.
pub fn writeJUnitReportIfNeeded(this: *CommandLineReporter) void {
if (this.reporters.junit) |junit| {
if (this.jest.test_options.reporter_outfile) |outfile| {
if (junit.current_file.len > 0) {
junit.endTestSuite() catch {};
}
junit.writeToFile(outfile) catch {};
}
}
}
pub fn generateCodeCoverage(this: *CommandLineReporter, vm: *jsc.VirtualMachine, opts: *TestCommand.CodeCoverageOptions, comptime reporters: TestCommand.Reporters, comptime enable_ansi_colors: bool) !void {
if (comptime !reporters.text and !reporters.lcov) {
return;
@@ -1772,12 +1787,7 @@ pub const TestCommand = struct {
Output.prettyError("\n", .{});
Output.flush();
if (reporter.reporters.junit) |junit| {
if (junit.current_file.len > 0) {
junit.endTestSuite() catch {};
}
junit.writeToFile(ctx.test_options.reporter_outfile.?) catch {};
}
reporter.writeJUnitReportIfNeeded();
if (vm.hot_reload == .watch) {
vm.runWithAPILock(jsc.VirtualMachine, vm, runEventLoopForWatch);
@@ -1920,6 +1930,7 @@ pub const TestCommand = struct {
if (reporter.jest.bail == reporter.summary().fail) {
reporter.printSummary();
Output.prettyError("\nBailed out after {d} failure{s}<r>\n", .{ reporter.jest.bail, if (reporter.jest.bail == 1) "" else "s" });
reporter.writeJUnitReportIfNeeded();
vm.exit_handler.exit_code = 1;
vm.is_shutting_down = true;

View File

@@ -27,6 +27,7 @@ pub fn NewWebSocketClient(comptime ssl: bool) type {
ping_frame_bytes: [128 + 6]u8 = [_]u8{0} ** (128 + 6),
ping_len: u8 = 0,
ping_received: bool = false,
pong_received: bool = false,
close_received: bool = false,
close_frame_buffering: bool = false,
@@ -120,6 +121,7 @@ pub fn NewWebSocketClient(comptime ssl: bool) type {
this.clearReceiveBuffers(true);
this.clearSendBuffers(true);
this.ping_received = false;
this.pong_received = false;
this.ping_len = 0;
this.close_frame_buffering = false;
this.receive_pending_chunk_len = 0;
@@ -650,14 +652,38 @@ pub fn NewWebSocketClient(comptime ssl: bool) type {
if (data.len == 0) break;
},
.pong => {
const pong_len = @min(data.len, @min(receive_body_remain, this.ping_frame_bytes.len));
if (!this.pong_received) {
if (receive_body_remain > 125) {
this.terminate(ErrorCode.invalid_control_frame);
terminated = true;
break;
}
this.ping_len = @truncate(receive_body_remain);
receive_body_remain = 0;
this.pong_received = true;
}
const pong_len = this.ping_len;
this.dispatchData(data[0..pong_len], .Pong);
if (data.len > 0) {
const total_received = @min(pong_len, receive_body_remain + data.len);
const slice = this.ping_frame_bytes[6..][receive_body_remain..total_received];
@memcpy(slice, data[0..slice.len]);
receive_body_remain = total_received;
data = data[slice.len..];
}
const pending_body = pong_len - receive_body_remain;
if (pending_body > 0) {
// wait for more data - pong payload is fragmented across TCP segments
break;
}
const pong_data = this.ping_frame_bytes[6..][0..pong_len];
this.dispatchData(pong_data, .Pong);
data = data[pong_len..];
receive_state = .need_header;
receive_body_remain = 0;
receiving_type = last_receive_data_type;
this.pong_received = false;
if (data.len == 0) break;
},

View File

@@ -291,25 +291,32 @@ pub const Parser = struct {
}
},
else => {
try unesc.appendSlice(switch (bun.strings.utf8ByteSequenceLength(c)) {
1 => brk: {
break :brk &[_]u8{ '\\', c };
switch (bun.strings.utf8ByteSequenceLength(c)) {
0, 1 => try unesc.appendSlice(&[_]u8{ '\\', c }),
2 => if (val.len - i >= 2) {
try unesc.appendSlice(&[_]u8{ '\\', c, val[i + 1] });
i += 1;
} else {
try unesc.appendSlice(&[_]u8{ '\\', c });
},
2 => brk: {
defer i += 1;
break :brk &[_]u8{ '\\', c, val[i + 1] };
3 => if (val.len - i >= 3) {
try unesc.appendSlice(&[_]u8{ '\\', c, val[i + 1], val[i + 2] });
i += 2;
} else {
try unesc.append('\\');
try unesc.appendSlice(val[i..val.len]);
i = val.len - 1;
},
3 => brk: {
defer i += 2;
break :brk &[_]u8{ '\\', c, val[i + 1], val[i + 2] };
4 => if (val.len - i >= 4) {
try unesc.appendSlice(&[_]u8{ '\\', c, val[i + 1], val[i + 2], val[i + 3] });
i += 3;
} else {
try unesc.append('\\');
try unesc.appendSlice(val[i..val.len]);
i = val.len - 1;
},
4 => brk: {
defer i += 3;
break :brk &[_]u8{ '\\', c, val[i + 1], val[i + 2], val[i + 3] };
},
// this means invalid utf8
else => unreachable,
});
}
},
}
@@ -342,25 +349,30 @@ pub const Parser = struct {
try unesc.append('.');
}
},
else => try unesc.appendSlice(switch (bun.strings.utf8ByteSequenceLength(c)) {
1 => brk: {
break :brk &[_]u8{c};
else => switch (bun.strings.utf8ByteSequenceLength(c)) {
0, 1 => try unesc.append(c),
2 => if (val.len - i >= 2) {
try unesc.appendSlice(&[_]u8{ c, val[i + 1] });
i += 1;
} else {
try unesc.append(c);
},
2 => brk: {
defer i += 1;
break :brk &[_]u8{ c, val[i + 1] };
3 => if (val.len - i >= 3) {
try unesc.appendSlice(&[_]u8{ c, val[i + 1], val[i + 2] });
i += 2;
} else {
try unesc.appendSlice(val[i..val.len]);
i = val.len - 1;
},
3 => brk: {
defer i += 2;
break :brk &[_]u8{ c, val[i + 1], val[i + 2] };
4 => if (val.len - i >= 4) {
try unesc.appendSlice(&[_]u8{ c, val[i + 1], val[i + 2], val[i + 3] });
i += 3;
} else {
try unesc.appendSlice(val[i..val.len]);
i = val.len - 1;
},
4 => brk: {
defer i += 3;
break :brk &[_]u8{ c, val[i + 1], val[i + 2], val[i + 3] };
},
// this means invalid utf8
else => unreachable,
}),
},
}
}

View File

@@ -460,13 +460,13 @@ pub const Archiver = struct {
if (comptime Environment.isWindows) {
try bun.MakePath.makePath(u16, dir, path);
} else {
std.posix.mkdiratZ(dir_fd, pathname, @intCast(mode)) catch |err| {
std.posix.mkdiratZ(dir_fd, path, @intCast(mode)) catch |err| {
// It's possible for some tarballs to return a directory twice, with and
// without `./` in the beginning. So if it already exists, continue to the
// next entry.
if (err == error.PathAlreadyExists or err == error.NotDir) continue;
bun.makePath(dir, std.fs.path.dirname(path_slice) orelse return err) catch {};
std.posix.mkdiratZ(dir_fd, pathname, 0o777) catch {};
std.posix.mkdiratZ(dir_fd, path, 0o777) catch {};
};
}
},
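The fix is a one-identifier change with a large effect: mkdiratZ now receives the already-normalized `path` (with `..` components stripped) rather than the raw `pathname` stored in the tarball, so a crafted directory entry can no longer be created outside the extraction root. A minimal TypeScript sketch of that kind of lexical normalization (illustrative only, not Bun's implementation):

```ts
// Lexically normalize an archive entry path: drop "." segments and resolve
// ".." against previously seen segments, never climbing above the root.
function normalizeEntryPath(pathname: string): string {
  const parts: string[] = [];
  for (const segment of pathname.split("/")) {
    if (segment === "" || segment === ".") continue;
    if (segment === "..") {
      parts.pop(); // popping an empty stack is a no-op, so ".." cannot escape
      continue;
    }
    parts.push(segment);
  }
  return parts.join("/");
}

// "safe_dir/../../escaped_dir/" -> "escaped_dir": the directory is created
// inside the extraction root instead of two levels above it.
console.log(normalizeEntryPath("safe_dir/../../escaped_dir/"));
```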

View File

@@ -221,7 +221,11 @@ pub const S3Credentials = struct {
defer str.deref();
if (str.tag != .Empty and str.tag != .Dead) {
new_credentials._contentDispositionSlice = str.toUTF8(bun.default_allocator);
new_credentials.content_disposition = new_credentials._contentDispositionSlice.?.slice();
const slice = new_credentials._contentDispositionSlice.?.slice();
if (containsNewlineOrCR(slice)) {
return globalObject.throwInvalidArguments("contentDisposition must not contain newline characters (CR/LF)", .{});
}
new_credentials.content_disposition = slice;
}
} else {
return globalObject.throwInvalidArgumentTypeValue("contentDisposition", "string", js_value);
@@ -236,7 +240,11 @@ pub const S3Credentials = struct {
defer str.deref();
if (str.tag != .Empty and str.tag != .Dead) {
new_credentials._contentTypeSlice = str.toUTF8(bun.default_allocator);
new_credentials.content_type = new_credentials._contentTypeSlice.?.slice();
const slice = new_credentials._contentTypeSlice.?.slice();
if (containsNewlineOrCR(slice)) {
return globalObject.throwInvalidArguments("type must not contain newline characters (CR/LF)", .{});
}
new_credentials.content_type = slice;
}
} else {
return globalObject.throwInvalidArgumentTypeValue("type", "string", js_value);
@@ -251,7 +259,11 @@ pub const S3Credentials = struct {
defer str.deref();
if (str.tag != .Empty and str.tag != .Dead) {
new_credentials._contentEncodingSlice = str.toUTF8(bun.default_allocator);
new_credentials.content_encoding = new_credentials._contentEncodingSlice.?.slice();
const slice = new_credentials._contentEncodingSlice.?.slice();
if (containsNewlineOrCR(slice)) {
return globalObject.throwInvalidArguments("contentEncoding must not contain newline characters (CR/LF)", .{});
}
new_credentials.content_encoding = slice;
}
} else {
return globalObject.throwInvalidArgumentTypeValue("contentEncoding", "string", js_value);
@@ -1150,6 +1162,12 @@ const CanonicalRequest = struct {
}
};
/// Returns true if the given slice contains any CR (\r) or LF (\n) characters,
/// which would allow HTTP header injection if used in a header value.
fn containsNewlineOrCR(value: []const u8) bool {
return std.mem.indexOfAny(u8, value, "\r\n") != null;
}
const std = @import("std");
const ACL = @import("./acl.zig").ACL;
const MultiPartUploadOptions = @import("./multipart_options.zig").MultiPartUploadOptions;
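The new guard exists because signed S3 requests are ultimately serialized as raw header lines; a value containing CR or LF would end the current header early and let the rest of the string be parsed as an attacker-chosen header. An illustrative TypeScript sketch of the failure mode (not Bun's request serializer):

```ts
// Naive header serialization: a CRLF embedded in the value splits it into a
// second, attacker-controlled header line.
const serializeHeader = (name: string, value: string) => `${name}: ${value}\r\n`;

const malicious = 'attachment; filename="x"\r\nX-Injected: value';
console.log(serializeHeader("Content-Disposition", malicious));
// Content-Disposition: attachment; filename="x"
// X-Injected: value

// Equivalent of containsNewlineOrCR above: reject before serializing.
const containsNewlineOrCR = (v: string) => /[\r\n]/.test(v);
console.log(containsNewlineOrCR(malicious));                           // true
console.log(containsNewlineOrCR('attachment; filename="report.pdf"')); // false
```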

View File

@@ -1154,7 +1154,7 @@ pub const Interpreter = struct {
_ = callframe; // autofix
if (this.setupIOBeforeRun().asErr()) |e| {
defer this.#deinitFromExec();
defer this.#derefRootShellAndIOIfNeeded(true);
const shellerr = bun.shell.ShellErr.newSys(e);
return try throwShellErr(&shellerr, .{ .js = globalThis.bunVM().event_loop });
}

View File

@@ -422,6 +422,19 @@ pub fn createInstance(globalObject: *jsc.JSGlobalObject, callframe: *jsc.CallFra
break :brk b.allocatedSlice();
};
// Reject null bytes in connection parameters to prevent protocol injection
// (null bytes act as field terminators in the MySQL wire protocol).
inline for (.{ .{ username, "username" }, .{ password, "password" }, .{ database, "database" }, .{ path, "path" } }) |entry| {
if (entry[0].len > 0 and std.mem.indexOfScalar(u8, entry[0], 0) != null) {
bun.default_allocator.free(options_buf);
tls_config.deinit();
if (tls_ctx) |tls| {
tls.deinit(true);
}
return globalObject.throwInvalidArguments(entry[1] ++ " must not contain null bytes", .{});
}
}
const on_connect = arguments[9];
const on_close = arguments[10];
const idle_timeout = arguments[11].toInt32();

View File

@@ -680,6 +680,20 @@ pub fn call(globalObject: *jsc.JSGlobalObject, callframe: *jsc.CallFrame) bun.JS
break :brk b.allocatedSlice();
};
// Reject null bytes in connection parameters to prevent Postgres startup
// message parameter injection (null bytes act as field terminators in the
// wire protocol's key\0value\0 format).
inline for (.{ .{ username, "username" }, .{ password, "password" }, .{ database, "database" }, .{ path, "path" } }) |entry| {
if (entry[0].len > 0 and std.mem.indexOfScalar(u8, entry[0], 0) != null) {
bun.default_allocator.free(options_buf);
tls_config.deinit();
if (tls_ctx) |tls| {
tls.deinit(true);
}
return globalObject.throwInvalidArguments(entry[1] ++ " must not contain null bytes", .{});
}
}
const on_connect = arguments[9];
const on_close = arguments[10];
const idle_timeout = arguments[11].toInt32();
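To make the injection vector concrete: the startup message serializes each parameter as a NUL-terminated key followed by a NUL-terminated value, so a NUL embedded in a user-supplied value ends that value early and the remaining bytes are parsed as an extra parameter. A hypothetical TypeScript sketch of that encoding (illustrative only, not Bun's wire code):

```ts
// Postgres startup-message parameter block: key\0value\0 ... \0 (sketch).
function encodeStartupParams(params: Record<string, string>): Buffer {
  const parts: Buffer[] = [];
  for (const [key, value] of Object.entries(params)) {
    parts.push(Buffer.from(`${key}\0${value}\0`, "utf8"));
  }
  parts.push(Buffer.from([0])); // terminating NUL
  return Buffer.concat(parts);
}

// With username "alice\0search_path\0evil_schema,public", the encoded bytes
// read back as user=alice *plus* an injected search_path=evil_schema,public
// parameter, which is exactly what the check above now rejects up front.
```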
@@ -1626,7 +1640,10 @@ pub fn on(this: *PostgresSQLConnection, comptime MessageType: @Type(.enum_litera
// This will usually start with "v="
const comparison_signature = final.data.slice();
if (comparison_signature.len < 2 or !bun.strings.eqlLong(server_signature, comparison_signature[2..], true)) {
if (comparison_signature.len < 2 or
server_signature.len != comparison_signature.len - 2 or
BoringSSL.c.CRYPTO_memcmp(server_signature.ptr, comparison_signature[2..].ptr, server_signature.len) != 0)
{
debug("SASLFinal - SASL Server signature mismatch\nExpected: {s}\nActual: {s}", .{ server_signature, comparison_signature[2..] });
this.fail("The server did not return the correct signature", error.SASL_SIGNATURE_MISMATCH);
} else {
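The second hunk also hardens the SASL server-signature check: instead of a byte comparison that can exit early, it verifies the length and then compares with CRYPTO_memcmp, so the time taken no longer depends on how many leading bytes match. The same pattern in TypeScript, using Node's crypto.timingSafeEqual (which itself requires equal-length inputs):

```ts
import { timingSafeEqual } from "node:crypto";

// Constant-time signature comparison: check lengths first (timingSafeEqual
// throws on mismatched lengths), then compare without short-circuiting on
// the first differing byte.
function signaturesMatch(expected: Uint8Array, received: Uint8Array): boolean {
  if (expected.length !== received.length) return false;
  return timingSafeEqual(expected, received);
}
```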

View File

@@ -260,14 +260,35 @@ devTest("hmr handles rapid consecutive edits", {
await Bun.sleep(1);
}
// Wait for "render 10" to appear, driven by message events. Intermediate
// renders may be skipped (watcher coalescing) and the final render may fire
// multiple times (duplicate reloads), so we just listen for any occurrence.
const finalRender = "render 10";
while (true) {
const message = await client.getStringMessage();
if (message === finalRender) break;
if (typeof message === "string" && message.includes("HMR_ERROR")) {
throw new Error("Unexpected HMR error message: " + message);
}
}
await new Promise<void>((resolve, reject) => {
const check = () => {
for (const msg of client.messages) {
if (typeof msg === "string" && msg.includes("HMR_ERROR")) {
cleanup();
reject(new Error("Unexpected HMR error message: " + msg));
return;
}
if (msg === finalRender) {
cleanup();
resolve();
return;
}
}
};
const cleanup = () => {
client.off("message", check);
};
client.on("message", check);
// Check messages already buffered.
check();
});
// Drain all buffered messages — intermediate renders and possible
// duplicates of the final render are expected and harmless.
client.messages.length = 0;
const hmrErrors = await client.js`return globalThis.__hmrErrors ? [...globalThis.__hmrErrors] : [];`;
if (hmrErrors.length > 0) {

View File

@@ -611,6 +611,82 @@ describe("Bun.Archive", () => {
// Very deep paths might fail on some systems - that's acceptable
}
});
test("directory entries with path traversal components cannot escape extraction root", async () => {
// Manually craft a tar archive containing directory entries with "../" traversal
// components in their pathnames. This tests that the extraction code uses the
// normalized path (which strips "..") rather than the raw pathname from the tarball.
function createTarHeader(
name: string,
size: number,
type: "0" | "5", // 0=file, 5=directory
): Uint8Array {
const header = new Uint8Array(512);
const enc = new TextEncoder();
header.set(enc.encode(name).slice(0, 100), 0);
header.set(enc.encode(type === "5" ? "0000755 " : "0000644 "), 100);
header.set(enc.encode("0000000 "), 108);
header.set(enc.encode("0000000 "), 116);
header.set(enc.encode(size.toString(8).padStart(11, "0") + " "), 124);
const mtime = Math.floor(Date.now() / 1000)
.toString(8)
.padStart(11, "0");
header.set(enc.encode(mtime + " "), 136);
header.set(enc.encode("        "), 148); // checksum field is 8 spaces while the sum is computed
header[156] = type.charCodeAt(0);
header.set(enc.encode("ustar"), 257);
header[262] = 0;
header.set(enc.encode("00"), 263);
let checksum = 0;
for (let i = 0; i < 512; i++) checksum += header[i];
header.set(enc.encode(checksum.toString(8).padStart(6, "0") + "\0 "), 148);
return header;
}
const blocks: Uint8Array[] = [];
const enc = new TextEncoder();
// A legitimate directory
blocks.push(createTarHeader("safe_dir/", 0, "5"));
// A directory entry with traversal: "safe_dir/../../escaped_dir/"
// After normalization this becomes "escaped_dir" (safe), but if the raw
// pathname were passed to mkdirat, the kernel would resolve ".." and create
// the directory outside the extraction root.
blocks.push(createTarHeader("safe_dir/../../escaped_dir/", 0, "5"));
// A normal file
const content = enc.encode("hello");
blocks.push(createTarHeader("safe_dir/file.txt", content.length, "0"));
blocks.push(content);
const pad = 512 - (content.length % 512);
if (pad < 512) blocks.push(new Uint8Array(pad));
// End-of-archive markers
blocks.push(new Uint8Array(1024));
const totalLen = blocks.reduce((s, b) => s + b.length, 0);
const tarball = new Uint8Array(totalLen);
let offset = 0;
for (const b of blocks) {
tarball.set(b, offset);
offset += b.length;
}
// Create a parent directory so we can check whether "escaped_dir" appears outside the extraction directory
using parentDir = tempDir("archive-traversal-parent", {});
const extractPath = join(String(parentDir), "extract");
const { mkdirSync, existsSync } = require("fs");
mkdirSync(extractPath, { recursive: true });
const archive = new Bun.Archive(tarball);
await archive.extract(extractPath);
// The "escaped_dir" should NOT exist in the parent directory (outside extraction root)
const escapedOutside = join(String(parentDir), "escaped_dir");
expect(existsSync(escapedOutside)).toBe(false);
// The "safe_dir" should exist inside the extraction directory
expect(existsSync(join(extractPath, "safe_dir"))).toBe(true);
// The normalized "escaped_dir" may or may not exist inside extractPath
// (depending on whether normalization keeps it), but it must NOT appear
// outside the extraction root.
});
});
describe("Archive.write()", () => {

View File

@@ -0,0 +1,797 @@
import { afterEach, beforeEach, describe, expect, test } from "bun:test";
import { bunEnv, bunExe, isLinux, tempDir } from "harness";
import { unlinkSync, writeFileSync } from "node:fs";
function readCrontab(): string {
const result = Bun.spawnSync({
cmd: ["/usr/bin/crontab", "-l"],
stdout: "pipe",
stderr: "pipe",
});
return result.exitCode === 0 ? result.stdout.toString() : "";
}
function writeCrontab(content: string) {
const tmpFile = `/tmp/bun-cron-${Date.now()}-${Math.random().toString(36).slice(2)}.tmp`;
writeFileSync(tmpFile, content);
try {
Bun.spawnSync({ cmd: ["/usr/bin/crontab", tmpFile] });
} finally {
try {
unlinkSync(tmpFile);
} catch {}
}
}
let savedCrontab: string | null = null;
function saveCrontab() {
savedCrontab = readCrontab();
}
function restoreCrontab() {
if (savedCrontab !== null) {
writeCrontab(savedCrontab);
savedCrontab = null;
}
}
// ==========================================================================
// API shape
// ==========================================================================
describe("Bun.cron API", () => {
test("is a function", () => {
expect(typeof Bun.cron).toBe("function");
});
test("has .remove method", () => {
expect(typeof Bun.cron.remove).toBe("function");
});
test("has .parse method", () => {
expect(typeof Bun.cron.parse).toBe("function");
});
test("throws with no arguments", () => {
// @ts-ignore
expect(() => Bun.cron()).toThrow();
});
test("throws with non-string path", () => {
// @ts-ignore
expect(() => Bun.cron(123, "* * * * *", "test-bad")).toThrow();
});
test("throws with non-string schedule", () => {
// @ts-ignore
expect(() => Bun.cron("./test.ts", 123, "test-bad")).toThrow();
});
test("throws with non-string title", () => {
// @ts-ignore
expect(() => Bun.cron("./test.ts", "* * * * *", 123)).toThrow();
});
test("remove throws with non-string title", () => {
// @ts-ignore
expect(() => Bun.cron.remove(123)).toThrow();
});
test("throws with invalid title characters", () => {
expect(() => Bun.cron("./test.ts", "* * * * *", "bad title!")).toThrow(/alphanumeric/);
expect(() => Bun.cron("./test.ts", "* * * * *", "bad/title")).toThrow(/alphanumeric/);
expect(() => Bun.cron("./test.ts", "* * * * *", "")).toThrow(/alphanumeric/);
});
test("throws with invalid cron expression", () => {
expect(() => Bun.cron("./test.ts", "not a cron", "test-bad")).toThrow(/cron expression/i);
expect(() => Bun.cron("./test.ts", "* * *", "test-bad")).toThrow(/cron expression/i);
expect(() => Bun.cron("./test.ts", "* * * * * *", "test-bad")).toThrow(/cron expression/i);
expect(() => Bun.cron("./test.ts", "abc * * * *", "test-bad")).toThrow(/cron expression/i);
});
test("remove throws with invalid title characters", () => {
expect(() => Bun.cron.remove("bad title!")).toThrow(/alphanumeric/);
});
});
// ==========================================================================
// Registration (Linux only — uses crontab)
// ==========================================================================
describe.skipIf(!isLinux)("cron registration", () => {
beforeEach(saveCrontab);
afterEach(restoreCrontab);
test("accepts valid cron expressions", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
// Every minute
await Bun.cron(`${dir}/job.ts`, "* * * * *", "test-every-min");
// Ranges, steps, lists
await Bun.cron(`${dir}/job.ts`, "*/15 1-5 1,15 * 0-4", "test-complex");
// Named days/months get normalized to numeric form in crontab
await Bun.cron(`${dir}/job.ts`, "30 2 * * Monday", "test-named");
const crontab = readCrontab();
expect(crontab).toContain("# bun-cron: test-every-min");
expect(crontab).toContain("# bun-cron: test-complex");
expect(crontab).toContain("# bun-cron: test-named");
// Verify "Monday" was normalized to "1" in the crontab entry
const namedLine = crontab.split("\n").find((l: string) => l.includes("--cron-title=test-named"));
expect(namedLine).toBeDefined();
expect(namedLine).toStartWith("30 2 * * 1 ");
expect(namedLine).not.toContain("Monday");
});
test("registers a crontab entry with absolute path", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
const scriptPath = `${dir}/job.ts`;
await Bun.cron(scriptPath, "30 2 * * 1", "test-register");
const crontab = readCrontab();
expect(crontab).toContain("# bun-cron: test-register");
expect(crontab).toContain("30 2 * * 1");
expect(crontab).toContain(scriptPath);
});
test("crontab entry contains correct format", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/job.ts`, "15 3 * * 0", "test-format");
const crontab = readCrontab();
const lines = crontab.split("\n");
const markerIdx = lines.findIndex((l: string) => l.includes("# bun-cron: test-format"));
expect(markerIdx).toBeGreaterThanOrEqual(0);
const commandLine = lines[markerIdx + 1];
expect(commandLine).toStartWith("15 3 * * 0 ");
expect(commandLine).toContain("--cron-title=test-format");
expect(commandLine).toContain("--cron-period='15 3 * * 0'");
expect(commandLine).toContain(`${dir}/job.ts`);
});
test("replaces existing entry with same title", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/job.ts`, "0 * * * *", "test-replace");
await Bun.cron(`${dir}/job.ts`, "30 2 * * 1", "test-replace");
const crontab = readCrontab();
const count = (crontab.match(/# bun-cron: test-replace/g) || []).length;
expect(count).toBe(1);
expect(crontab).toContain("30 2 * * 1");
expect(crontab).not.toContain("0 * * * *");
});
test("registers multiple different cron jobs", async () => {
using dir = tempDir("bun-cron-test", {
"a.ts": `export default { scheduled() {} };`,
"b.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/a.ts`, "0 * * * *", "multi-a");
await Bun.cron(`${dir}/b.ts`, "30 12 * * 5", "multi-b");
const crontab = readCrontab();
expect(crontab).toContain("# bun-cron: multi-a");
expect(crontab).toContain("# bun-cron: multi-b");
expect(crontab).toContain("0 * * * *");
expect(crontab).toContain("30 12 * * 5");
});
test("preserves existing non-bun crontab entries", async () => {
// Add a manual crontab entry first
await using setup = Bun.spawn({
cmd: ["/usr/bin/crontab", "-"],
stdin: "pipe",
});
setup.stdin.write("0 0 * * * /usr/bin/some-other-job\n");
setup.stdin.end();
await setup.exited;
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/job.ts`, "*/5 * * * *", "test-preserve");
const crontab = readCrontab();
expect(crontab).toContain("/usr/bin/some-other-job");
expect(crontab).toContain("# bun-cron: test-preserve");
});
test("returns a promise that resolves", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
const result = await Bun.cron(`${dir}/job.ts`, "* * * * *", "test-promise");
expect(result).toBeUndefined();
});
});
// ==========================================================================
// Removal
// ==========================================================================
describe.skipIf(!isLinux)("cron removal", () => {
beforeEach(saveCrontab);
afterEach(restoreCrontab);
test("removes an existing cron entry", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/job.ts`, "30 2 * * 1", "rm-target");
let crontab = readCrontab();
expect(crontab).toContain("# bun-cron: rm-target");
await Bun.cron.remove("rm-target");
crontab = readCrontab();
expect(crontab).not.toContain("# bun-cron: rm-target");
expect(crontab).not.toContain("30 2 * * 1");
});
test("removing non-existent entry resolves without error", async () => {
const result = await Bun.cron.remove("rm-nonexistent");
expect(result).toBeUndefined();
});
test("removes only the targeted entry", async () => {
using dir = tempDir("bun-cron-test", {
"a.ts": `export default { scheduled() {} };`,
"b.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/a.ts`, "0 * * * *", "rm-keep");
await Bun.cron(`${dir}/b.ts`, "30 2 * * 1", "rm-delete");
await Bun.cron.remove("rm-delete");
const crontab = readCrontab();
expect(crontab).toContain("# bun-cron: rm-keep");
expect(crontab).not.toContain("# bun-cron: rm-delete");
});
test("register after remove works", async () => {
using dir = tempDir("bun-cron-test", {
"job.ts": `export default { scheduled() {} };`,
});
await Bun.cron(`${dir}/job.ts`, "0 * * * *", "rm-reregister");
await Bun.cron.remove("rm-reregister");
let crontab = readCrontab();
expect(crontab).not.toContain("# bun-cron: rm-reregister");
await Bun.cron(`${dir}/job.ts`, "30 6 * * *", "rm-reregister");
crontab = readCrontab();
expect(crontab).toContain("# bun-cron: rm-reregister");
expect(crontab).toContain("30 6 * * *");
});
});
// ==========================================================================
// Cron execution mode (--cron-title / --cron-period)
// ==========================================================================
describe("cron execution mode", () => {
test("calls default.scheduled with controller object", async () => {
using dir = tempDir("bun-cron-test", {
"scheduled.ts": `
export default {
scheduled(controller: any) {
console.log(JSON.stringify({
type: controller.type,
cron: controller.cron,
hasScheduledTime: typeof controller.scheduledTime === "number",
}));
}
};
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "run", "--cron-title=my-job", "--cron-period=30 2 * * 1", `${dir}/scheduled.ts`],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
const output = JSON.parse(stdout.trim());
expect(output).toEqual({
type: "scheduled",
cron: "30 2 * * 1",
hasScheduledTime: true,
});
expect(exitCode).toBe(0);
});
test("handles async scheduled handler", async () => {
using dir = tempDir("bun-cron-test", {
"async-scheduled.ts": `
export default {
async scheduled(controller: any) {
await Bun.sleep(10);
console.log("async-done");
}
};
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "run", "--cron-title=async-job", "--cron-period=* * * * *", `${dir}/async-scheduled.ts`],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout.trim()).toBe("async-done");
expect(exitCode).toBe(0);
});
test("exits with error when no scheduled method", async () => {
using dir = tempDir("bun-cron-test", {
"no-scheduled.ts": `export default { hello: "world" };`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "run", "--cron-title=bad-job", "--cron-period=* * * * *", `${dir}/no-scheduled.ts`],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(exitCode).not.toBe(0);
});
test("handles CJS module with default export", async () => {
using dir = tempDir("bun-cron-test", {
"cjs-scheduled.cjs": `
module.exports = {
scheduled(controller) {
console.log("cjs-" + controller.type);
}
};
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "run", "--cron-title=cjs-job", "--cron-period=* * * * *", `${dir}/cjs-scheduled.cjs`],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout.trim()).toBe("cjs-scheduled");
expect(exitCode).toBe(0);
});
test("scheduled handler receives scheduledTime as number", async () => {
using dir = tempDir("bun-cron-test", {
"time-check.ts": `
export default {
scheduled(controller: any) {
const now = Date.now();
const diff = Math.abs(now - controller.scheduledTime);
// scheduledTime should be within 5 seconds of now
console.log(diff < 5000 ? "ok" : "bad-" + diff);
}
};
`,
});
await using proc = Bun.spawn({
cmd: [bunExe(), "run", "--cron-title=time-job", "--cron-period=* * * * *", `${dir}/time-check.ts`],
env: bunEnv,
stdout: "pipe",
stderr: "pipe",
});
const [stdout, , exitCode] = await Promise.all([proc.stdout.text(), proc.stderr.text(), proc.exited]);
expect(stdout.trim()).toBe("ok");
expect(exitCode).toBe(0);
});
});
// ==========================================================================
// Bun.cron.parse
// ==========================================================================
/**
* Collect the next N occurrences by chaining parse() calls.
* This is the real test: if the pattern is parsed correctly,
* calling next() repeatedly should produce the right sequence.
*/
function nextN(expr: string, from: number, n: number): number[] {
const results: number[] = [];
let cursor = from;
for (let i = 0; i < n; i++) {
const d = Bun.cron.parse(expr, cursor);
if (!d) break;
results.push(d.getTime());
cursor = d.getTime();
}
return results;
}
describe("Bun.cron.parse", () => {
test("is a function that returns a Date", () => {
expect(typeof Bun.cron.parse).toBe("function");
const result = Bun.cron.parse("* * * * *", Date.UTC(2025, 0, 15, 10, 30, 0));
expect(result).toBeInstanceOf(Date);
});
// --- Verify patterns via sequential next() calls ---
test("*/15 produces :00, :15, :30, :45, :00 sequence", () => {
// Start at :58 so we can see the minute pattern roll over into the next hour
const from = Date.UTC(2025, 0, 15, 10, 58, 0);
expect(nextN("*/15 * * * *", from, 5)).toEqual([
Date.UTC(2025, 0, 15, 11, 0, 0),
Date.UTC(2025, 0, 15, 11, 15, 0),
Date.UTC(2025, 0, 15, 11, 30, 0),
Date.UTC(2025, 0, 15, 11, 45, 0),
Date.UTC(2025, 0, 15, 12, 0, 0),
]);
});
test("0 */6 produces 00:00, 06:00, 12:00, 18:00, 00:00 sequence", () => {
const from = Date.UTC(2025, 0, 15, 0, 0, 0);
expect(nextN("0 */6 * * *", from, 5)).toEqual([
Date.UTC(2025, 0, 15, 6, 0, 0),
Date.UTC(2025, 0, 15, 12, 0, 0),
Date.UTC(2025, 0, 15, 18, 0, 0),
Date.UTC(2025, 0, 16, 0, 0, 0),
Date.UTC(2025, 0, 16, 6, 0, 0),
]);
});
test("0 0 * * MON,WED,FRI produces correct weekday sequence", () => {
// Tue Jan 14 2025
const from = Date.UTC(2025, 0, 14, 0, 0, 0);
const results = nextN("0 0 * * MON,WED,FRI", from, 5);
expect(results).toEqual([
Date.UTC(2025, 0, 15, 0, 0, 0), // Wed
Date.UTC(2025, 0, 17, 0, 0, 0), // Fri
Date.UTC(2025, 0, 20, 0, 0, 0), // Mon
Date.UTC(2025, 0, 22, 0, 0, 0), // Wed
Date.UTC(2025, 0, 24, 0, 0, 0), // Fri
]);
// Verify actual weekdays (0=Sun, 1=Mon, 3=Wed, 5=Fri)
expect(results.map(t => new Date(t).getUTCDay())).toEqual([3, 5, 1, 3, 5]);
});
test("0 9 * * MON-FRI produces consecutive weekday mornings", () => {
// Fri Jan 17 2025 at noon
const from = Date.UTC(2025, 0, 17, 12, 0, 0);
const results = nextN("0 9 * * MON-FRI", from, 5);
// Should skip Sat+Sun, then Mon-Fri
expect(results).toEqual([
Date.UTC(2025, 0, 20, 9, 0, 0), // Mon
Date.UTC(2025, 0, 21, 9, 0, 0), // Tue
Date.UTC(2025, 0, 22, 9, 0, 0), // Wed
Date.UTC(2025, 0, 23, 9, 0, 0), // Thu
Date.UTC(2025, 0, 24, 9, 0, 0), // Fri
]);
expect(results.map(t => new Date(t).getUTCDay())).toEqual([1, 2, 3, 4, 5]);
});
test("@weekly produces consecutive Sundays", () => {
const from = Date.UTC(2025, 0, 12, 0, 0, 0); // Sun Jan 12
const results = nextN("@weekly", from, 4);
expect(results).toEqual([
Date.UTC(2025, 0, 19, 0, 0, 0),
Date.UTC(2025, 0, 26, 0, 0, 0),
Date.UTC(2025, 1, 2, 0, 0, 0),
Date.UTC(2025, 1, 9, 0, 0, 0),
]);
expect(results.every(t => new Date(t).getUTCDay() === 0)).toBe(true);
});
test("@monthly produces 1st of consecutive months", () => {
const from = Date.UTC(2025, 0, 15, 0, 0, 0);
expect(nextN("@monthly", from, 4)).toEqual([
Date.UTC(2025, 1, 1, 0, 0, 0),
Date.UTC(2025, 2, 1, 0, 0, 0),
Date.UTC(2025, 3, 1, 0, 0, 0),
Date.UTC(2025, 4, 1, 0, 0, 0),
]);
});
test("0 0 31 * * skips months without 31 days", () => {
const from = Date.UTC(2025, 0, 1, 0, 0, 0);
const results = nextN("0 0 31 * *", from, 4);
// Jan 31, Mar 31, May 31, Jul 31 (skips Feb, Apr, Jun)
expect(results).toEqual([
Date.UTC(2025, 0, 31, 0, 0, 0),
Date.UTC(2025, 2, 31, 0, 0, 0),
Date.UTC(2025, 4, 31, 0, 0, 0),
Date.UTC(2025, 6, 31, 0, 0, 0),
]);
});
// --- POSIX OR logic: the critical behavioral test ---
test("OR logic: '0 0 15 * FRI' matches BOTH the 15th AND every Friday", () => {
// This is the defining test for POSIX cron OR behavior.
// With AND logic, this would only match Fridays that fall on the 15th (~once a year).
// With OR logic, it matches the 15th of any month AND every Friday.
// Jan 2025: 15th is Wed, Fridays are 3,10,17,24,31
const from = Date.UTC(2025, 0, 1, 0, 0, 0);
const results = nextN("0 0 15 * FRI", from, 6);
expect(results).toEqual([
Date.UTC(2025, 0, 3, 0, 0, 0), // Fri Jan 3
Date.UTC(2025, 0, 10, 0, 0, 0), // Fri Jan 10
Date.UTC(2025, 0, 15, 0, 0, 0), // Wed Jan 15 (15th, not a Friday!)
Date.UTC(2025, 0, 17, 0, 0, 0), // Fri Jan 17
Date.UTC(2025, 0, 24, 0, 0, 0), // Fri Jan 24
Date.UTC(2025, 0, 31, 0, 0, 0), // Fri Jan 31
]);
// Verify: Jan 15 is NOT a Friday (it's Wednesday=3), proving OR logic
expect(new Date(results[2]).getUTCDay()).toBe(3); // Wednesday
expect(new Date(results[2]).getUTCDate()).toBe(15); // 15th
});
test("OR logic: '0 0 1 * MON' fires on 1st AND on Mondays", () => {
// Feb 2025: 1st is Saturday, Mondays are 3,10,17,24, Mar 1 is Saturday
// Need to see enough results to hit a 1st-of-month that ISN'T a Monday
const from = Date.UTC(2025, 1, 1, 0, 0, 0);
const results = nextN("0 0 1 * MON", from, 8);
expect(results).toEqual([
Date.UTC(2025, 1, 3, 0, 0, 0), // Mon Feb 3
Date.UTC(2025, 1, 10, 0, 0, 0), // Mon Feb 10
Date.UTC(2025, 1, 17, 0, 0, 0), // Mon Feb 17
Date.UTC(2025, 1, 24, 0, 0, 0), // Mon Feb 24
Date.UTC(2025, 2, 1, 0, 0, 0), // Sat Mar 1 — matches day-of-month, NOT a Monday!
Date.UTC(2025, 2, 3, 0, 0, 0), // Mon Mar 3
Date.UTC(2025, 2, 10, 0, 0, 0), // Mon Mar 10
Date.UTC(2025, 2, 17, 0, 0, 0), // Mon Mar 17
]);
// Mar 1 is Saturday (6), proving OR logic: it matched on day-of-month alone
expect(new Date(results[4]).getUTCDay()).toBe(6); // Saturday
expect(new Date(results[4]).getUTCDate()).toBe(1); // 1st
});
test("wildcard day + specific weekday: only weekday matters", () => {
// "0 0 * * 1" — day-of-month is *, only Monday matters
const from = Date.UTC(2025, 0, 14, 10, 0, 0); // Tue
const results = nextN("0 0 * * 1", from, 3);
expect(results).toEqual([
Date.UTC(2025, 0, 20, 0, 0, 0),
Date.UTC(2025, 0, 27, 0, 0, 0),
Date.UTC(2025, 1, 3, 0, 0, 0),
]);
expect(results.every(t => new Date(t).getUTCDay() === 1)).toBe(true);
});
test("specific day + wildcard weekday: only day matters", () => {
// "0 0 15 * *" — weekday is *, only 15th matters
const from = Date.UTC(2025, 0, 1, 0, 0, 0);
const results = nextN("0 0 15 * *", from, 3);
expect(results).toEqual([
Date.UTC(2025, 0, 15, 0, 0, 0),
Date.UTC(2025, 1, 15, 0, 0, 0),
Date.UTC(2025, 2, 15, 0, 0, 0),
]);
expect(results.every(t => new Date(t).getUTCDate() === 15)).toBe(true);
});
// --- Named days: verify via weekday sequences ---
test("SUN through SAT each map to the correct weekday", () => {
const from = Date.UTC(2025, 0, 11, 0, 0, 0); // Saturday Jan 11
const names = ["SUN", "MON", "TUE", "WED", "THU", "FRI", "SAT"];
for (let i = 0; i < 7; i++) {
const result = Bun.cron.parse(`0 0 * * ${names[i]}`, from)!;
expect(new Date(result).getUTCDay()).toBe(i);
}
});
test("full day names match 3-letter abbreviations", () => {
const from = Date.UTC(2025, 0, 14, 10, 0, 0);
const pairs: [string, string][] = [
["SUN", "Sunday"],
["MON", "Monday"],
["TUE", "Tuesday"],
["WED", "Wednesday"],
["THU", "Thursday"],
["FRI", "Friday"],
["SAT", "Saturday"],
];
for (const [abbr, full] of pairs) {
expect(Bun.cron.parse(`0 0 * * ${abbr}`, from)!.getTime()).toBe(
Bun.cron.parse(`0 0 * * ${full}`, from)!.getTime(),
);
}
});
test("MON-FRI/2 produces Mon, Wed, Fri", () => {
const from = Date.UTC(2025, 0, 18, 12, 0, 0); // Saturday
const results = nextN("0 0 * * MON-FRI/2", from, 6);
// Mon=1, Wed=3, Fri=5 repeating
expect(results.map(t => new Date(t).getUTCDay())).toEqual([1, 3, 5, 1, 3, 5]);
});
test("day 7 and SUN both schedule on Sundays", () => {
const from = Date.UTC(2025, 0, 13, 0, 0, 0); // Monday
expect(nextN("0 0 * * 7", from, 3)).toEqual(nextN("0 0 * * SUN", from, 3));
expect(nextN("0 0 * * 0", from, 3)).toEqual(nextN("0 0 * * 7", from, 3));
});
// --- Named months: verify via sequences ---
test("JAN through DEC each map to the correct month", () => {
const names = ["JAN", "FEB", "MAR", "APR", "MAY", "JUN", "JUL", "AUG", "SEP", "OCT", "NOV", "DEC"];
const from = Date.UTC(2024, 11, 1, 0, 0, 0); // Dec 2024
for (let i = 0; i < 12; i++) {
const result = Bun.cron.parse(`0 0 1 ${names[i]} *`, from)!;
expect(new Date(result).getUTCMonth()).toBe(i);
}
});
test("full month names match abbreviations", () => {
const from = Date.UTC(2025, 0, 1, 0, 0, 0);
const pairs: [string, string][] = [
["JAN", "January"],
["FEB", "February"],
["MAR", "March"],
["JUN", "June"],
["SEP", "September"],
["DEC", "December"],
];
for (const [abbr, full] of pairs) {
expect(Bun.cron.parse(`0 0 1 ${abbr} *`, from)!.getTime()).toBe(
Bun.cron.parse(`0 0 1 ${full} *`, from)!.getTime(),
);
}
});
test("JAN-MAR produces Jan, Feb, Mar sequence", () => {
const from = Date.UTC(2024, 11, 1, 0, 0, 0); // Dec 2024
const results = nextN("0 0 1 JAN-MAR *", from, 4);
expect(results).toEqual([
Date.UTC(2025, 0, 1, 0, 0, 0),
Date.UTC(2025, 1, 1, 0, 0, 0),
Date.UTC(2025, 2, 1, 0, 0, 0),
Date.UTC(2026, 0, 1, 0, 0, 0),
]);
});
// --- Nicknames verified against equivalent expressions ---
test("@yearly equals '0 0 1 1 *'", () => {
const from = Date.UTC(2025, 0, 1, 0, 0, 0);
expect(nextN("@yearly", from, 3)).toEqual(nextN("0 0 1 1 *", from, 3));
expect(nextN("@annually", from, 3)).toEqual(nextN("0 0 1 1 *", from, 3));
});
test("@daily equals '0 0 * * *'", () => {
const from = Date.UTC(2025, 0, 15, 0, 0, 0);
expect(nextN("@daily", from, 5)).toEqual(nextN("0 0 * * *", from, 5));
expect(nextN("@midnight", from, 5)).toEqual(nextN("0 0 * * *", from, 5));
});
test("@hourly equals '0 * * * *'", () => {
const from = Date.UTC(2025, 0, 15, 10, 0, 0);
expect(nextN("@hourly", from, 5)).toEqual(nextN("0 * * * *", from, 5));
});
test("nicknames with leading/trailing whitespace work", () => {
const from = Date.UTC(2025, 0, 15, 10, 0, 0);
const expected = Bun.cron.parse("@daily", from)!.getTime();
expect(Bun.cron.parse(" @daily", from)!.getTime()).toBe(expected);
expect(Bun.cron.parse("@daily ", from)!.getTime()).toBe(expected);
expect(Bun.cron.parse(" @DAILY ", from)!.getTime()).toBe(expected);
});
test("invalid nicknames throw", () => {
expect(() => Bun.cron.parse("@invalid")).toThrow(/cron expression/i);
expect(() => Bun.cron.parse("@")).toThrow(/cron expression/i);
});
// --- Boundary and edge cases ---
test("year boundary: Dec 31 → Jan 1", () => {
const from = Date.UTC(2025, 11, 31, 23, 30, 0);
expect(Bun.cron.parse("0 0 1 1 *", from)!.getTime()).toBe(Date.UTC(2026, 0, 1, 0, 0, 0));
});
test("leap year Feb 29 scheduling", () => {
// From Jan 1 2024, next Feb 29 should be 2024 (leap year)
const from = Date.UTC(2024, 0, 1, 0, 0, 0);
const results = nextN("0 0 29 2 *", from, 2);
expect(results[0]).toBe(Date.UTC(2024, 1, 29, 0, 0, 0));
// Next is 2028 (next leap year)
expect(results[1]).toBe(Date.UTC(2028, 1, 29, 0, 0, 0));
});
test("impossible expression (Feb 30) returns null", () => {
expect(Bun.cron.parse("0 0 30 2 *", Date.UTC(2025, 0, 1, 0, 0, 0))).toBeNull();
});
test("whitespace: multiple spaces, tabs, leading/trailing", () => {
const from = Date.UTC(2025, 0, 15, 10, 30, 0);
const expected = Date.UTC(2025, 0, 15, 10, 31, 0);
expect(Bun.cron.parse("* * * * *", from)!.getTime()).toBe(expected);
expect(Bun.cron.parse("*\t*\t*\t*\t*", from)!.getTime()).toBe(expected);
expect(Bun.cron.parse(" * * * * * ", from)!.getTime()).toBe(expected);
});
// --- Error cases ---
test("rejects invalid expressions", () => {
expect(() => Bun.cron.parse("not a cron")).toThrow(/cron expression/i);
expect(() => Bun.cron.parse("* * *")).toThrow(/cron expression/i);
expect(() => Bun.cron.parse("* * * * * *")).toThrow(/cron expression/i);
// @ts-ignore
expect(() => Bun.cron.parse(123)).toThrow();
});
test("rejects out-of-range values", () => {
expect(() => Bun.cron.parse("60 * * * *")).toThrow();
expect(() => Bun.cron.parse("* 24 * * *")).toThrow();
expect(() => Bun.cron.parse("* * 0 * *")).toThrow();
expect(() => Bun.cron.parse("* * 32 * *")).toThrow();
expect(() => Bun.cron.parse("* * * 0 *")).toThrow();
expect(() => Bun.cron.parse("* * * 13 *")).toThrow();
expect(() => Bun.cron.parse("* * * * 8")).toThrow(); // 7 is OK (Sunday), 8 is not
});
test("rejects malformed fields", () => {
expect(() => Bun.cron.parse("1,,3 * * * *")).toThrow();
expect(() => Bun.cron.parse(",1 * * * *")).toThrow();
expect(() => Bun.cron.parse("*/0 * * * *")).toThrow();
expect(() => Bun.cron.parse("* * * * FOO")).toThrow();
expect(() => Bun.cron.parse("* * * * Mond")).toThrow();
expect(() => Bun.cron.parse("* * * FOO *")).toThrow();
expect(() => Bun.cron.parse("* * * Janu *")).toThrow();
});
test("rejects invalid Date arguments", () => {
expect(() => Bun.cron.parse("* * * * *", NaN)).toThrow(/Invalid date/i);
expect(() => Bun.cron.parse("* * * * *", Infinity)).toThrow(/Invalid date/i);
// @ts-ignore
expect(() => Bun.cron.parse("* * * * *", "not a date")).toThrow();
});
test("null/undefined relativeDate uses current time", () => {
const before = Date.now();
const result1 = Bun.cron.parse("* * * * *")!;
// @ts-ignore
const result2 = Bun.cron.parse("* * * * *", null)!;
const after = Date.now();
for (const result of [result1, result2]) {
expect(result).toBeInstanceOf(Date);
expect(result.getTime()).toBeGreaterThanOrEqual(before);
expect(result.getTime()).toBeLessThanOrEqual(after + 2 * 60 * 1000);
}
});
test("Date object input works the same as number", () => {
const ms = Date.UTC(2025, 0, 15, 10, 30, 0);
const fromNumber = Bun.cron.parse("30 * * * *", ms)!;
const fromDate = Bun.cron.parse("30 * * * *", new Date(ms))!;
expect(fromNumber.getTime()).toBe(fromDate.getTime());
});
});

View File

@@ -489,6 +489,61 @@ brr = 3
"zr": ["deedee"],
});
});
describe("truncated/invalid utf-8", () => {
test("bare continuation byte (0x80) should not crash", () => {
// 0x80 is a continuation byte without a leading byte
// utf8ByteSequenceLength returns 0, which must not hit unreachable
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0x80])]).toString("latin1");
// Should not crash - just parse gracefully
expect(() => parse(ini)).not.toThrow();
});
test("truncated 2-byte sequence at end of value", () => {
// 0xC0 is a 2-byte lead byte, but there's no continuation byte following
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0xc0])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("truncated 3-byte sequence at end of value", () => {
// 0xE0 is a 3-byte lead byte, but only 0 continuation bytes follow
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0xe0])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("truncated 3-byte sequence with 1 continuation byte at end", () => {
// 0xE0 is a 3-byte lead byte, but only 1 continuation byte follows
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0xe0, 0x80])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("truncated 4-byte sequence at end of value", () => {
// 0xF0 is a 4-byte lead byte, but only 0 continuation bytes follow
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0xf0])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("truncated 4-byte sequence with 1 continuation byte at end", () => {
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0xf0, 0x80])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("truncated 4-byte sequence with 2 continuation bytes at end", () => {
const ini = Buffer.concat([Buffer.from("key = "), Buffer.from([0xf0, 0x80, 0x80])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("truncated 2-byte sequence in escaped context", () => {
// Backslash followed by a 2-byte lead byte at end of value
const ini = Buffer.concat([Buffer.from("key = \\"), Buffer.from([0xc0])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
test("bare continuation byte in escaped context", () => {
const ini = Buffer.concat([Buffer.from("key = \\"), Buffer.from([0x80])]).toString("latin1");
expect(() => parse(ini)).not.toThrow();
});
});
});
const wtf = {

View File

@@ -0,0 +1,222 @@
import { TCPSocketListener } from "bun";
import { describe, expect, test } from "bun:test";
const hostname = "127.0.0.1";
const MAX_HEADER_SIZE = 16 * 1024;
function doHandshake(
socket: any,
handshakeBuffer: Uint8Array,
data: Uint8Array,
): { buffer: Uint8Array; done: boolean } {
const newBuffer = new Uint8Array(handshakeBuffer.length + data.length);
newBuffer.set(handshakeBuffer);
newBuffer.set(data, handshakeBuffer.length);
if (newBuffer.length > MAX_HEADER_SIZE) {
socket.end();
throw new Error("Handshake headers too large");
}
const dataStr = new TextDecoder("utf-8").decode(newBuffer);
const endOfHeaders = dataStr.indexOf("\r\n\r\n");
if (endOfHeaders === -1) {
return { buffer: newBuffer, done: false };
}
if (!dataStr.startsWith("GET")) {
throw new Error("Invalid handshake");
}
const magic = /Sec-WebSocket-Key:\s*(.*)\r\n/i.exec(dataStr);
if (!magic) {
throw new Error("Missing Sec-WebSocket-Key");
}
const hasher = new Bun.CryptoHasher("sha1");
hasher.update(magic[1].trim());
hasher.update("258EAFA5-E914-47DA-95CA-C5AB0DC85B11");
const accept = hasher.digest("base64");
socket.write(
"HTTP/1.1 101 Switching Protocols\r\n" +
"Upgrade: websocket\r\n" +
"Connection: Upgrade\r\n" +
`Sec-WebSocket-Accept: ${accept}\r\n` +
"\r\n",
);
socket.flush();
return { buffer: newBuffer, done: true };
}
function makeTextFrame(text: string): Uint8Array {
const payload = new TextEncoder().encode(text);
const len = payload.length;
let header: Uint8Array;
if (len < 126) {
header = new Uint8Array([0x81, len]);
} else if (len < 65536) {
header = new Uint8Array([0x81, 126, (len >> 8) & 0xff, len & 0xff]);
} else {
throw new Error("Message too large for this test");
}
const frame = new Uint8Array(header.length + len);
frame.set(header);
frame.set(payload, header.length);
return frame;
}
describe("WebSocket", () => {
test("fragmented pong frame does not cause frame desync", async () => {
let server: TCPSocketListener | undefined;
let client: WebSocket | undefined;
let handshakeBuffer = new Uint8Array(0);
let handshakeComplete = false;
try {
const { promise, resolve, reject } = Promise.withResolvers<void>();
server = Bun.listen({
socket: {
data(socket, data) {
if (handshakeComplete) {
// After handshake, we just receive client frames (like close) - ignore them
return;
}
const result = doHandshake(socket, handshakeBuffer, new Uint8Array(data));
handshakeBuffer = result.buffer;
if (!result.done) return;
handshakeComplete = true;
// Build a pong frame with a 50-byte payload, but deliver it in two parts.
// Pong opcode = 0x8A, FIN=1
const pongPayload = new Uint8Array(50);
for (let i = 0; i < 50; i++) pongPayload[i] = 0x41 + (i % 26); // 'A'-'Z' repeated
const pongFrame = new Uint8Array(2 + 50);
pongFrame[0] = 0x8a; // FIN + Pong opcode
pongFrame[1] = 50; // payload length
pongFrame.set(pongPayload, 2);
// Part 1 of pong: header (2 bytes) + first 2 bytes of payload = 4 bytes
// This leaves 48 bytes of pong payload undelivered.
const pongPart1 = pongFrame.slice(0, 4);
// Part 2: remaining 48 bytes of pong payload
const pongPart2 = pongFrame.slice(4);
// A text message to send after the pong completes.
const textFrame = makeTextFrame("hello after pong");
// Send part 1 of pong
socket.write(pongPart1);
socket.flush();
// After a delay, send part 2 of pong + the follow-up text message
setTimeout(() => {
// Concatenate part2 + text frame to simulate them arriving together
const combined = new Uint8Array(pongPart2.length + textFrame.length);
combined.set(pongPart2);
combined.set(textFrame, pongPart2.length);
socket.write(combined);
socket.flush();
}, 50);
},
},
hostname,
port: 0,
});
const messages: string[] = [];
client = new WebSocket(`ws://${server.hostname}:${server.port}`);
client.addEventListener("error", event => {
reject(new Error("WebSocket error"));
});
client.addEventListener("close", event => {
// If the connection closes unexpectedly due to frame desync, the test should fail
reject(new Error(`WebSocket closed unexpectedly: code=${event.code} reason=${event.reason}`));
});
client.addEventListener("message", event => {
messages.push(event.data as string);
if (messages.length === 1) {
// We got the text message after the fragmented pong
try {
expect(messages[0]).toBe("hello after pong");
resolve();
} catch (err) {
reject(err);
}
}
});
await promise;
} finally {
client?.close();
server?.stop(true);
}
});
test("pong frame with payload > 125 bytes is rejected", async () => {
let server: TCPSocketListener | undefined;
let client: WebSocket | undefined;
let handshakeBuffer = new Uint8Array(0);
let handshakeComplete = false;
try {
const { promise, resolve, reject } = Promise.withResolvers<void>();
server = Bun.listen({
socket: {
data(socket, data) {
if (handshakeComplete) return;
const result = doHandshake(socket, handshakeBuffer, new Uint8Array(data));
handshakeBuffer = result.buffer;
if (!result.done) return;
handshakeComplete = true;
// Send a pong frame with a 126-byte payload. RFC 6455 Section 5.5 requires
// control frames to have a payload of 125 bytes or less, so this frame is
// invalid regardless of encoding. Since 126 exceeds the 7-bit length field's
// 0-125 range, the frame uses the 16-bit extended length encoding.
const pongFrame = new Uint8Array(4 + 126);
pongFrame[0] = 0x8a; // FIN + Pong
pongFrame[1] = 126; // Signals 16-bit extended length follows
pongFrame[2] = 0; // High byte of length
pongFrame[3] = 126; // Low byte of length = 126
// Fill payload with arbitrary data
for (let i = 0; i < 126; i++) pongFrame[4 + i] = 0x42;
socket.write(pongFrame);
socket.flush();
},
},
hostname,
port: 0,
});
client = new WebSocket(`ws://${server.hostname}:${server.port}`);
client.addEventListener("error", () => {
// Expected - the connection should error due to invalid control frame
resolve();
});
client.addEventListener("close", () => {
// Also acceptable - connection closes due to protocol error
resolve();
});
client.addEventListener("message", () => {
reject(new Error("Should not receive a message from an invalid pong frame"));
});
await promise;
} finally {
client?.close();
server?.stop(true);
}
});
});

View File

@@ -0,0 +1,77 @@
import { expect, test } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";
import { join } from "path";
test("--bail writes JUnit reporter outfile", async () => {
using dir = tempDir("bail-junit", {
"fail.test.ts": `
import { test, expect } from "bun:test";
test("failing test", () => { expect(1).toBe(2); });
`,
});
const outfile = join(String(dir), "results.xml");
await using proc = Bun.spawn({
cmd: [bunExe(), "test", "--bail", "--reporter=junit", `--reporter-outfile=${outfile}`, "fail.test.ts"],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const exitCode = await proc.exited;
// The test should fail and bail
expect(exitCode).not.toBe(0);
// The JUnit report file should still be written despite bail
const file = Bun.file(outfile);
expect(await file.exists()).toBe(true);
const xml = await file.text();
expect(xml).toContain("<?xml");
expect(xml).toContain("<testsuites");
expect(xml).toContain("</testsuites>");
expect(xml).toContain("failing test");
});
test("--bail writes JUnit reporter outfile with multiple files", async () => {
using dir = tempDir("bail-junit-multi", {
"a_pass.test.ts": `
import { test, expect } from "bun:test";
test("passing test", () => { expect(1).toBe(1); });
`,
"b_fail.test.ts": `
import { test, expect } from "bun:test";
test("another failing test", () => { expect(1).toBe(2); });
`,
});
const outfile = join(String(dir), "results.xml");
await using proc = Bun.spawn({
cmd: [bunExe(), "test", "--bail", "--reporter=junit", `--reporter-outfile=${outfile}`],
env: bunEnv,
cwd: String(dir),
stdout: "pipe",
stderr: "pipe",
});
const exitCode = await proc.exited;
// The test should fail and bail
expect(exitCode).not.toBe(0);
// The JUnit report file should still be written despite bail
const file = Bun.file(outfile);
expect(await file.exists()).toBe(true);
const xml = await file.text();
expect(xml).toContain("<?xml");
expect(xml).toContain("<testsuites");
expect(xml).toContain("</testsuites>");
// Both the passing and failing tests should be recorded
expect(xml).toContain("passing test");
expect(xml).toContain("another failing test");
});

View File

@@ -0,0 +1,187 @@
import { SQL } from "bun";
import { expect, test } from "bun:test";
import net from "net";
test("postgres connection rejects null bytes in username", async () => {
let serverReceivedData = false;
const server = net.createServer(socket => {
serverReceivedData = true;
socket.destroy();
});
await new Promise<void>(r => server.listen(0, "127.0.0.1", () => r()));
const port = (server.address() as net.AddressInfo).port;
try {
const sql = new SQL({
hostname: "127.0.0.1",
port,
username: "alice\x00search_path\x00evil_schema,public",
database: "testdb",
max: 1,
idleTimeout: 1,
connectionTimeout: 2,
});
await sql`SELECT 1`;
expect.unreachable();
} catch (e: any) {
expect(e.message).toContain("null bytes");
} finally {
server.close();
}
// The server should never have received any data because the null byte
// should be rejected before the connection is established.
expect(serverReceivedData).toBe(false);
});
test("postgres connection rejects null bytes in database", async () => {
let serverReceivedData = false;
const server = net.createServer(socket => {
serverReceivedData = true;
socket.destroy();
});
await new Promise<void>(r => server.listen(0, "127.0.0.1", () => r()));
const port = (server.address() as net.AddressInfo).port;
try {
const sql = new SQL({
hostname: "127.0.0.1",
port,
username: "alice",
database: "testdb\x00search_path\x00evil_schema,public",
max: 1,
idleTimeout: 1,
connectionTimeout: 2,
});
await sql`SELECT 1`;
expect.unreachable();
} catch (e: any) {
expect(e.message).toContain("null bytes");
} finally {
server.close();
}
expect(serverReceivedData).toBe(false);
});
test("postgres connection rejects null bytes in password", async () => {
let serverReceivedData = false;
const server = net.createServer(socket => {
serverReceivedData = true;
socket.destroy();
});
await new Promise<void>(r => server.listen(0, "127.0.0.1", () => r()));
const port = (server.address() as net.AddressInfo).port;
try {
const sql = new SQL({
hostname: "127.0.0.1",
port,
username: "alice",
password: "pass\x00search_path\x00evil_schema",
database: "testdb",
max: 1,
idleTimeout: 1,
connectionTimeout: 2,
});
await sql`SELECT 1`;
expect.unreachable();
} catch (e: any) {
expect(e.message).toContain("null bytes");
} finally {
server.close();
}
expect(serverReceivedData).toBe(false);
});
test("postgres connection does not use truncated path with null bytes", async () => {
// The JS layer's fs.existsSync() rejects paths containing null bytes,
// so the path is dropped before reaching the native layer. Verify that a
// path with null bytes doesn't silently connect via a truncated path.
let serverReceivedData = false;
const server = net.createServer(socket => {
serverReceivedData = true;
socket.destroy();
});
await new Promise<void>(r => server.listen(0, "127.0.0.1", () => r()));
const port = (server.address() as net.AddressInfo).port;
try {
const sql = new SQL({
hostname: "127.0.0.1",
port,
username: "alice",
database: "testdb",
path: "/tmp\x00injected",
max: 1,
idleTimeout: 1,
connectionTimeout: 2,
});
await sql`SELECT 1`;
} catch {
// Expected to fail
} finally {
server.close();
}
// The path had null bytes so it should have been dropped by the JS layer,
// falling back to TCP where it hits our mock server (not a truncated Unix socket).
expect(serverReceivedData).toBe(true);
});
test("postgres connection works with normal parameters (no null bytes)", async () => {
// Verify that normal connections without null bytes still work.
// Use a mock server that sends an auth error so we can verify the
// startup message is sent correctly.
let receivedData = false;
const server = net.createServer(socket => {
socket.once("data", () => {
receivedData = true;
const errMsg = Buffer.from("SFATAL\0VFATAL\0C28000\0Mauthentication failed\0\0");
const len = errMsg.length + 4;
const header = Buffer.alloc(5);
header.write("E", 0);
header.writeInt32BE(len, 1);
socket.write(Buffer.concat([header, errMsg]));
socket.destroy();
});
});
await new Promise<void>(r => server.listen(0, "127.0.0.1", () => r()));
const port = (server.address() as net.AddressInfo).port;
try {
const sql = new SQL({
hostname: "127.0.0.1",
port,
username: "alice",
database: "testdb",
max: 1,
idleTimeout: 1,
connectionTimeout: 2,
});
await sql`SELECT 1`;
} catch {
// Expected - mock server sends auth error
} finally {
server.close();
}
// Normal parameters should connect fine - the server should receive data
expect(receivedData).toBe(true);
});

View File

@@ -0,0 +1,148 @@
import { S3Client } from "bun";
import { describe, expect, test } from "bun:test";
// Test that CRLF characters in S3 options are rejected to prevent header injection.
// See: HTTP Header Injection via S3 Content-Disposition Value
describe("S3 header injection prevention", () => {
test("contentDisposition with CRLF should throw", () => {
using server = Bun.serve({
port: 0,
fetch() {
return new Response("OK", { status: 200 });
},
});
const client = new S3Client({
accessKeyId: "test-key",
secretAccessKey: "test-secret",
endpoint: server.url.href,
bucket: "test-bucket",
});
expect(() =>
client.write("test-file.txt", "Hello", {
contentDisposition: 'attachment; filename="evil"\r\nX-Injected: value',
}),
).toThrow(/CR\/LF/);
});
test("contentEncoding with CRLF should throw", () => {
using server = Bun.serve({
port: 0,
fetch() {
return new Response("OK", { status: 200 });
},
});
const client = new S3Client({
accessKeyId: "test-key",
secretAccessKey: "test-secret",
endpoint: server.url.href,
bucket: "test-bucket",
});
expect(() =>
client.write("test-file.txt", "Hello", {
contentEncoding: "gzip\r\nX-Injected: value",
}),
).toThrow(/CR\/LF/);
});
test("type (content-type) with CRLF should throw", () => {
using server = Bun.serve({
port: 0,
fetch() {
return new Response("OK", { status: 200 });
},
});
const client = new S3Client({
accessKeyId: "test-key",
secretAccessKey: "test-secret",
endpoint: server.url.href,
bucket: "test-bucket",
});
expect(() =>
client.write("test-file.txt", "Hello", {
type: "text/plain\r\nX-Injected: value",
}),
).toThrow(/CR\/LF/);
});
test("contentDisposition with only CR should throw", () => {
using server = Bun.serve({
port: 0,
fetch() {
return new Response("OK", { status: 200 });
},
});
const client = new S3Client({
accessKeyId: "test-key",
secretAccessKey: "test-secret",
endpoint: server.url.href,
bucket: "test-bucket",
});
expect(() =>
client.write("test-file.txt", "Hello", {
contentDisposition: "attachment\rinjected",
}),
).toThrow(/CR\/LF/);
});
test("contentDisposition with only LF should throw", () => {
using server = Bun.serve({
port: 0,
fetch() {
return new Response("OK", { status: 200 });
},
});
const client = new S3Client({
accessKeyId: "test-key",
secretAccessKey: "test-secret",
endpoint: server.url.href,
bucket: "test-bucket",
});
expect(() =>
client.write("test-file.txt", "Hello", {
contentDisposition: "attachment\ninjected",
}),
).toThrow(/CR\/LF/);
});
test("valid contentDisposition without CRLF should not throw", async () => {
const { promise: requestReceived, resolve: onRequestReceived } = Promise.withResolvers<Headers>();
using server = Bun.serve({
port: 0,
async fetch(req) {
onRequestReceived(req.headers);
return new Response("OK", { status: 200 });
},
});
const client = new S3Client({
accessKeyId: "test-key",
secretAccessKey: "test-secret",
endpoint: server.url.href,
bucket: "test-bucket",
});
// Valid content-disposition values should not throw synchronously.
// The write may eventually fail because the mock server doesn't speak the S3 protocol,
// but the option parsing should succeed and a request should be initiated.
expect(() =>
client.write("test-file.txt", "Hello", {
contentDisposition: 'attachment; filename="report.pdf"',
}),
).not.toThrow();
const receivedHeaders = await requestReceived;
expect(receivedHeaders.get("content-disposition")).toBe('attachment; filename="report.pdf"');
});
});