Compare commits

..

77 Commits

Author SHA1 Message Date
Meghan Denny
ccc1850190 runtime: fix small leak in Worker erroring exits 2025-11-17 16:54:10 -08:00
Marko Vejnovic
9513c1d1d9 chore: HttpContext.h cleanup (#24730) 2025-11-17 13:36:03 -08:00
robobun
509a97a435 Add --no-env-file flag to disable automatic .env loading (#24767)
## Summary

Implements `--no-env-file` CLI flag and bunfig configuration options to
disable automatic `.env` file loading at runtime and in the bundler.

## Motivation

Users may want to disable automatic `.env` file loading for:
- Production environments where env vars are managed externally
- CI/CD pipelines where .env files should be ignored
- Testing scenarios where explicit env control is needed
- Security contexts where .env files should not be trusted

## Changes

### CLI Flag
- Added `--no-env-file` flag that disables loading of default .env files
- Still respects explicit `--env-file` arguments for intentional env
loading

### Bunfig Configuration
Added support for disabling .env loading via `bunfig.toml`:
- `env = false` - disables default .env file loading
- `env = null` - disables default .env file loading  
- `env.file = false` - disables default .env file loading
- `env.file = null` - disables default .env file loading

### Implementation
- Added `disable_default_env_files` field to `api.TransformOptions` with
serialization support
- Added `disable_default_env_files` field to `options.Env` struct
- Implemented `loadEnvConfig` in bunfig parser to handle env
configuration
- Wired up flag throughout runtime and bundler code paths
- Preserved package.json script runner behavior (always skips default
.env files)

## Tests

Added comprehensive test suite (`test/cli/run/no-envfile.test.ts`) with
9 tests covering:
- `--no-env-file` flag with `.env`, `.env.local`,
`.env.development.local`
- Bunfig configurations: `env = false`, `env.file = false`, `env = true`
- `--no-env-file` with `-e` eval flag
- `--no-env-file` combined with `--env-file` (explicit files still load)
- Production mode behavior

All tests pass with debug bun and fail with system bun (as expected).
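For illustration, here is a hedged sketch of the kind of check the suite performs; the helper imports follow Bun's test harness conventions, but the actual tests in `test/cli/run/no-envfile.test.ts` may differ in detail.

```ts
// Hypothetical sketch, not the literal test: spawn bun with --no-env-file
// in a temp directory containing a .env file and assert it is ignored.
import { test, expect } from "bun:test";
import { bunEnv, bunExe, tempDir } from "harness";

test("--no-env-file skips default .env files", async () => {
  using dir = tempDir("no-envfile", {
    ".env": "SECRET=from-dotenv",
    "index.js": "console.log(process.env.SECRET ?? 'unset');",
  });
  await using proc = Bun.spawn({
    cmd: [bunExe(), "--no-env-file", "index.js"],
    env: bunEnv,
    cwd: String(dir),
  });
  expect((await proc.stdout.text()).trim()).toBe("unset");
  expect(await proc.exited).toBe(0);
});
```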

## Example Usage

```bash
# Disable all default .env files
bun --no-env-file index.js

# Disable defaults but load explicit file
bun --no-env-file --env-file .env.production index.js

# Disable via bunfig.toml
cat > bunfig.toml << 'CONFIG'
env = false
CONFIG
bun index.js
```

## Files Changed
- `src/cli/Arguments.zig` - CLI flag parsing
- `src/api/schema.zig` - API schema field with encode/decode
- `src/options.zig` - Env struct field and wiring
- `src/bunfig.zig` - Config parsing with loadEnvConfig
- `src/transpiler.zig` - Runtime wiring
- `src/bun.js.zig` - Runtime wiring
- `src/cli/exec_command.zig` - Runtime wiring
- `src/cli/run_command.zig` - Preserved package.json script runner
behavior
- `test/cli/run/no-envfile.test.ts` - Comprehensive test suite

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-17 15:04:42 -05:00
Dylan Conway
983bb52df7 fix #24550 (#24726)
### What does this PR do?
Fixes a regression introduced in Bun v1.3.2 with #24283.

We are not able to skip `sharp` lifecycle scripts before v0.33.0 because
previous versions did not use optional dependencies with prebuilds.

Fixes #24550
Fixes ENG-21519
### How did you verify your code works?
Manually

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-17 15:04:20 -05:00
Meghan Denny
8b5b36ec7a runtime: fix n-api ThreadSafeFunction finalizer (#24771)
Closes https://github.com/oven-sh/bun/issues/24552
Closes https://github.com/oven-sh/bun/issues/24664
Closes https://github.com/oven-sh/bun/issues/24702
Closes https://github.com/oven-sh/bun/issues/24703
Closes https://github.com/oven-sh/bun/issues/24768
2025-11-17 11:23:13 -08:00
Michael H
87eca6bbc7 docs: re-apply many recent changes that somehow aren't present (#24719)
lots of recent changes aren't present, so this reapplies them
2025-11-16 19:23:01 +11:00
Meghan Denny
2cb8d4eae8 cmake: remove GIT_CHANGED_SOURCES (#24737)
dead code
2025-11-15 16:45:37 -08:00
Meghan Denny
e53ceb62ec zig: fix missing uses of bun.callmod_inline (#24738)
results in better stack traces in debug mode
2025-11-15 16:36:15 -08:00
pfg
277fc558e2 only-failures fix (#24701)
### What does this PR do?

Removes these accidental blank lines

<img width="170" height="139" alt="image"
src="https://github.com/user-attachments/assets/b44d6496-a497-4be6-9666-8134a70d7324"
/>


### How did you verify your code works?
2025-11-14 19:52:43 -08:00
Dylan Conway
5908bfbfc6 fix(YAML.stringify): number-like strings prefixed with 0 (#24731)
### What does this PR do?
Ensures strings that would parse as a number with leading zeroes aren't
emitted without quotes.
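A minimal sketch of the behavior this guards, assuming the `Bun.YAML.stringify` API named in the title (exact output formatting may differ):

```ts
// Hypothetical illustration: a leading-zero string must be quoted,
// otherwise a YAML parser would read it back as the number 123.
console.log(Bun.YAML.stringify({ id: "0123" })); // id: "0123"
console.log(Bun.YAML.stringify({ id: 123 }));    // id: 123
```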

fixes #23691

### How did you verify your code works?
Added a test
2025-11-14 17:43:36 -08:00
Dylan Conway
19f21c00bd fix #24510 (#24563)
### What does this PR do?
The assertion was too strict.

This PR changes the assertion to allow multiple of the same dependency id
to be present. It also changes all the assertions to debug assertions.

fixes #24510
### How did you verify your code works?
Manually, and added a new test

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Marko Vejnovic <marko@bun.com>
2025-11-14 16:49:21 -08:00
Lydia Hallie
8650e7ace4 Docs: Add templates to guides (#24732)
Adds template cards to the TanStack Start and Next.js guides
2025-11-14 16:45:21 -08:00
robobun
b2c219a56c Implement retry and repeats options for bun:test (#23713)
Fixes #16051, Fixes ENG-21437

Implements retry/repeats

```ts
test("my test", () => {
    if (Math.random() < 0.1) throw new Error("uh oh!");
}, {repeats: 20});
```

```
Error: uh oh!
✗ my test
```

```ts
test("my test", () => {
    if (Math.random() < 0.1) throw new Error("uh oh!");
}, {retry: 5});
```

```
Error: uh oh!
✓ my test (attempt 2)
```

Also fixes a bug where onTestFinished inside a test would not run if the
test failed

```ts
test("abc", () => {
    onTestFinished(() => { console.log("hello"); });
    throw new Error("uh oh!");
});
```

```
Error: uh oh!
hello
```

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: pfg <pfg@pfg.pw>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-14 16:21:04 -08:00
Luke Parker
f216673f98 fix: Add missing SIGWINCH for windows (#24704)
### What does this PR do?
Fixes https://github.com/oven-sh/bun/issues/22288
Fixes #22402
Fixes https://github.com/oven-sh/bun/issues/23224
Fixes https://github.com/oven-sh/bun/issues/17803

cc: Should unblock opencode/opentui window resize on windows
https://github.com/sst/opentui/issues/152
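For context, a minimal sketch of how a TUI observes the signal this PR adds on Windows (standard Node-style API; the resize handler here is just illustrative):

```ts
// Listen for terminal resize events; this PR makes SIGWINCH fire on Windows.
process.on("SIGWINCH", () => {
  const { columns, rows } = process.stdout;
  console.log(`terminal resized to ${columns}x${rows}`);
});

// Keep the process alive so resizes can be observed.
setInterval(() => {}, 1 << 30);
```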

### How did you verify your code works?
Clone the linked repro, verified latest bun failed, node worked, then
iterated till my local bun worked.

Here is a screenshot of the branch working with bun on windows

<img width="1427" height="891" alt="image"
src="https://github.com/user-attachments/assets/18642db7-4cb6-4758-bb76-a38d277cbc23"
/>

Additionally using bun vs bun-debug on a little test for our downstream
package proves this works

<img width="1137" height="679" alt="image"
src="https://github.com/user-attachments/assets/4dbe7605-ced9-4bcb-84f0-ed793f8aa942"
/>
<img width="1138" height="684" alt="image"
src="https://github.com/user-attachments/assets/f658b3b9-e4bc-4bfa-84f0-e1eb3af83d89"
/>
2025-11-14 14:05:47 -08:00
Lydia Hallie
a70f2b7ff9 Docs: Add custom server instructions to TanStack guide (#24723)
Add docs on how to deploy a custom Bun server for TanStack Start. Based
on [this
example](https://github.com/TanStack/router/tree/main/examples/react/start-bun/server.ts)
2025-11-14 11:53:23 -08:00
Braden Wong
65a215bb4e docs(watch): use relativePath parameter name in recursive example (#24716)
This updates the documentation for `fs.watch()` to use `relativePath`
instead of `filename` in the recursive example, following the same
convention from PR #23990.

When `recursive: true` is set on `fs.watch()`, the callback receives a
relative path to the changed file rather than just a simple filename.
Using `relativePath` as the parameter name makes this distinction
clearer to users.
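A short sketch of the documented pattern (paths and directory names are illustrative):

```ts
import fs from "node:fs";

// With recursive: true, the callback receives a path relative to the
// watched root rather than a bare filename, hence "relativePath".
fs.watch("./src", { recursive: true }, (eventType, relativePath) => {
  console.log(`${eventType}: ${relativePath}`); // e.g. "change: utils/math.ts"
});
```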

**Related to:** https://github.com/oven-sh/bun/pull/23990

Co-authored-by: Michael H <git@riskymh.dev>
2025-11-15 01:10:35 +11:00
Michael H
c3c91442ac docs: fix custom loader example to be correct & other file-type doc updates (#24677) 2025-11-14 16:32:00 +11:00
Nino
93ab167a8d docs: fix environment variable syntax in executable example (#24706)
### What does this PR do?

Fixes a typo in the docs. 

`bun_BE_BUN=1` doesn't work, it has to be capitalized `BUN_BE_BUN=1`
2025-11-13 21:31:07 -08:00
pfg
d8ee26509c Fix progress showing kb for downloading packages instead of count (#24700)
- show bytes for upgrading bun
- show no unit for other progress bars

Fix for issue introduced in #24266
2025-11-13 19:29:16 -08:00
Ciro Spaciari
21d582a3cd fix(createEmptyObject) fix some createEmptyObject values (#22512)
### What does this PR do?
We must use the right number of properties (not more or less) or we
should set it to 0
### How did you verify your code works?
Read the code; this will avoid potential crashes and improve stability
2025-11-13 15:19:18 -08:00
Meghan Denny
d7bf4fb443 ci/format: update bun version (#24693) 2025-11-13 15:00:40 -08:00
Ciro Spaciari
263d1ab178 update(crypto) update root certificates to NSS 3.117 (#24607)
### What does this PR do?
This is the certdata.txt[0] from NSS 3.117, released on 2025-11-11.

This is the version of NSS that will ship in Firefox 145.0 on
2025-11-11.

Certificates added:
- OISTE Server Root ECC G1
- OISTE Server Root RSA G1

[0]
https://hg.mozilla.org/projects/nss/raw-file/NSS_3_117_RTM/lib/ckfw/builtins/certdata.txt
765c9e86-0a91-4dad-b410-801cd60f8b32

Fixes https://linear.app/oven/issue/ENG-21508/update-root-certs
### How did you verify your code works?
CI
2025-11-13 13:26:34 -08:00
Marko Vejnovic
08843030f5 [publish images] 2025-11-13 12:04:47 -08:00
Kristjan Broder Lund
9ccc8fb795 docs: format code blocks correctly (#24672)
### What does this PR do?

The code blocks were not properly formatted, and did not render
correctly

`main`:
<img width="699" height="196" alt="image"
src="https://github.com/user-attachments/assets/c08bc29e-9481-47ae-bafe-dd94b22d0c09"
/>

this pr:
<img width="691" height="306" alt="image"
src="https://github.com/user-attachments/assets/947fb9d7-04f3-42e8-aafe-0d70127fefd1"
/>

### How did you verify your code works?

ran docs locally with mint

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-13 20:36:45 +11:00
S.T.P
4c03d3b8b6 docs: remove the redundant tags (#24668)
### What does this PR do?

Remove the redundant code block tag

<img width="1469" height="918" alt="image"
src="https://github.com/user-attachments/assets/3eb3b499-3165-409c-9360-2fe1872162ed"
/>

After change

<img width="1458" height="1006" alt="image"
src="https://github.com/user-attachments/assets/69eac47c-28cd-4459-9478-0098b51f78fe"
/>


### How did you verify your code works?

Preview the documentation locally

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-13 18:37:22 +11:00
Marko Vejnovic
6d2ce3892b build(ENG-21514): Fix sccache invocation (#24651)
### What does this PR do?

Fixes some miswritten `cmake` steps so that `sccache` actually works

### How did you verify your code works?
2025-11-12 14:51:08 -08:00
Marko Vejnovic
e03d3bee10 ci(ENG-21502): Fix sccache not working inside Docker (#24597) 2025-11-12 14:40:12 -08:00
Caio Borghi
fff47f0267 docs: update EdgeDB references to Gel rebrand (#24487)
## Summary
EdgeDB has rebranded to Gel. This PR comprehensively updates all
documentation to reflect the rebrand.

## Changes Made

### Documentation & Branding
- **Guide title**: "Use EdgeDB with Bun" → "Use Gel with Bun"
- **File renamed**: `docs/guides/ecosystem/edgedb.mdx` → `gel.mdx`
- **Description**: Added "(formerly EdgeDB)" note
- **All path references**: Updated from `/guides/ecosystem/edgedb` to
`/guides/ecosystem/gel`

### CLI Commands
- `edgedb project init` → `gel project init`
- `edgedb` → `gel` (REPL)
- `edgedb migration create` → `gel migration create`
- `edgedb migrate` → `gel migrate`

### npm Packages
- `edgedb` → `gel`
- `@edgedb/generate` → `@gel/generate`

### Installation & Documentation URLs
- Installation link: `docs.geldata.com/learn/installation` (functional)
- Documentation reference: `docs.geldata.com/` (operational)
- Installation scripts: Verified working (`https://www.geldata.com/sh`
and `ps1`)
- Added Homebrew option: `brew install geldata/tap/gel-cli`

### Code Examples
- Updated all imports: `import { createClient } from "gel"`
- Updated codegen commands: `bunx @gel/generate`
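A hedged sketch combining the updated import with basic usage (the query and connection setup are placeholders, not part of this PR):

```ts
import { createClient } from "gel";

// Connection info is read from the project initialized with `gel project init`.
const client = createClient();
const greeting = await client.querySingle<string>(`select "hello from Gel"`);
console.log(greeting);
```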

## Verified
All commands verified against official Gel documentation at
https://docs.geldata.com/

Fixes #17721

---------

Co-authored-by: Lydia Hallie <lydiajuliettehallie@gmail.com>
2025-11-12 14:18:59 -08:00
Ciro Spaciari
4e1d9a2cbc remove dead code in src/bake/DevServer/SerializedFailure.zig (#24635)
### What does this PR do?
remove dead code in src/bake/DevServer/SerializedFailure.zig
### How did you verify your code works?
It builds
2025-11-12 13:39:36 -08:00
Ciro Spaciari
1f0c885e91 proper handle on_data if we receive null (#24624)
### What does this PR do?
If for some reason data is null, we should handle it as empty.
Fixes
https://linear.app/oven/issue/ENG-21511/panic-attempt-to-use-null-value-in-socket-on-data
### How did you verify your code works?
Ci
2025-11-12 12:42:06 -08:00
Ciro Spaciari
ab32a2fc4a fix(bun getcompletes) add windows support and remove TODO panic (#24620)
### What does this PR do?
Fixes https://linear.app/oven/issue/ENG-21509/panic-todo-in-completions
### How did you verify your code works?
Test
2025-11-12 12:41:47 -08:00
Ciro Spaciari
8912957aa5 compatibility(node:net) _handle.fd property (#24575)
### What does this PR do?
Expose fd property in _handle for node:net/node:tls
Fixes
https://linear.app/oven/issue/ENG-21507/expose-fd-in-nodenetnodetls
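A hedged sketch of reading the newly exposed property (`_handle` is an internal Node-compat field, so this is illustrative only):

```ts
import net from "node:net";

const socket = net.connect(80, "example.com", () => {
  // @ts-expect-error _handle is not part of the public typings
  console.log("fd:", socket._handle?.fd);
  socket.end();
});
```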

### How did you verify your code works?
Test

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2025-11-12 12:28:55 -08:00
Ciro Spaciari
df4e42bf1c fix(DevServer) remove panic in case of source type none (#24634)
### What does this PR do?
Remove the panic in the case of source type none so we can handle it more
gracefully. We can discuss whether this is the best solution, but it looks
sensible to me. This is really hard to repro but can happen when
deleting files referred to by dynamic imports.


Fixes
https://linear.app/oven/issue/ENG-21513/panic-missing-internal-precomputed-line-count-in-renderjson-on
Fixes https://github.com/oven-sh/bun/issues/21714
### How did you verify your code works?
CI

---------

Co-authored-by: taylor.fish <contact@taylor.fish>
2025-11-12 12:28:17 -08:00
Ciro Spaciari
f67bec90c5 refactor(us_socket_t.zig) safer use of intCast (#24622)
### What does this PR do?
make sure to always use a safe intCast in us_socket_t
### How did you verify your code works?
Compiles
2025-11-12 11:02:39 -08:00
Michael H
fa099336da docs: node does support "import path re-mapping" (#17133)
fixes #4545
2025-11-13 06:02:12 +11:00
Michael H
7f8dff64c4 docs: revert minifier doc's format (#24639) 2025-11-12 11:01:25 -08:00
Michael H
98a01e5d2a docs: fix some pages (#24632) 2025-11-13 06:00:05 +11:00
Michael H
d1fa27acce docs: document more loaders (#24616)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 10:48:03 -08:00
Michael H
cf6662d48f types: document configVersion in BunLockFile (#24641) 2025-11-12 10:47:30 -08:00
Marko Vejnovic
2563a9b3ad build(ENG-21491): Improve sccache behavior on developer machines (#24568)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 09:11:33 -08:00
Meghan Denny
e0aae8adc1 ci: remove unified-builds and unified-tests options (#24626) 2025-11-11 22:52:46 -08:00
Marko Vejnovic
6b8a75f6ab chore(ENG-21504): Remove bit-rotted scripts (#24606)
### What does this PR do?

Removes some scripts which haven't been tested in a while.

### How did you verify your code works?

CI passes
2025-11-11 22:39:20 -08:00
pfg
97c113d010 remove unused writer type parameters in src/css/ (#24571)
No longer needed after zig upgrade

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-11 21:09:50 -08:00
Meghan Denny
7f4e65464e zig: fix spurious dependency loop compile error in ResumableSink (#24618) 2025-11-11 20:22:24 -08:00
Meghan Denny
4b05629131 ci: update no-validate-exceptions.txt 2025-11-11 20:21:24 -08:00
Ciro Spaciari
d868c6019c fix(DevServer) unconditional unwrap in IncrementalGraph (#24608)
### What does this PR do?
Fixes
https://linear.app/oven/issue/ENG-21505/panic-attempt-to-use-null-value-at-incrementalgraph-by-misusing-jscode

When calling `takeJSBundleToList`/`takeJSBundle`, the desired behavior is
to get only JS chunks from the graph. Since the graph can also contain CSS
chunks, we can just continue and ignore them in this case, keeping the
desired behavior in a safe way instead of unconditionally unwrapping
something that is not guaranteed to have a `jsCode`.

### How did you verify your code works?
CI
2025-11-11 16:46:28 -08:00
robobun
0c42b46af3 docs: remove outdated version mentions (1.0.x and 1.1.x) (#24570)
## Summary

Remove outdated version mentions (1.0.x and 1.1.x) from documentation
for better consistency. These versions are over a year old - you should
be using a recent version of bun :).

## What changed

**Removed version mentions from:**
- `docs/pm/lifecycle.mdx` - v1.0.16 (trusted dependencies)
- `docs/bundler/executables.mdx` - v1.0.23, v1.1.25, v1.1.30 (various
features)
- `docs/guides/install/jfrog-artifactory.mdx` - v1.0.3+ (env var
comment)
- `docs/guides/install/azure-artifacts.mdx` - v1.0.3+ (env var comment)
- `docs/runtime/workers.mdx` - v1.1.13, v1.1.35 (blob URLs, preload)
- `docs/runtime/networking/dns.mdx` - v1.1.9 (DNS caching)
- `docs/guides/runtime/import-html.mdx` - v1.1.5
- `docs/guides/runtime/define-constant.mdx` - v1.1.5
- `docs/runtime/sqlite.mdx` - v1.1.31

**Kept version mentions in:**
- All 1.2.x versions (still recent, less than a year old)
- Benchmark version numbers (e.g., S3 performance comparison with
v1.1.44)
- `docs/guides/install/yarnlock.mdx` (bun.lock introduction context)
- `docs/project/building-windows.mdx` (build requirements)
- `docs/runtime/http/websockets.mdx` (performance benchmarks)

## Why

The docs lack consistency around version mentions - we don't document
every feature's version, so keeping scattered old version numbers looks
inconsistent. These changes represent a small percentage of features
added recently, and users on ancient versions have bigger problems than
needing to know exactly when a feature landed.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: RiskyMH <git@riskymh.dev>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 10:44:52 +11:00
Nathan Soares
1cee6cf36b docs: Change code block header from package.json to tsconfig.json (#24511)
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-12 10:25:23 +11:00
pfg
9671a98dca remove unused "recommended zig version" (#24611)
Dead code.
2025-11-11 15:10:57 -08:00
yinheli
c6aa5a97dc fix(docs): Remove duplicate sections in guides.jsx (#24595)
### What does this PR do?

This PR fixes an issue on the Guides page where duplicate sections were
being displayed. The problem was caused by a misplaced return statement
and a duplicated JSX block introduced in commit
[1606a9f24e](https://github.com/oven-sh/bun/blob/1606a9f24e/docs/snippets/guides.jsx#L504-L514).
2025-11-11 14:53:59 -08:00
robobun
925e8bcfe1 Format download sizes in human-readable format (#24266)
## Summary

- Use `std.fmt.fmtIntSizeBin` to format progress indicators with byte
sizes
- Improves readability during operations like `bun upgrade`
- Changes display from raw bytes (e.g., "23982378/2398284") to
human-readable format (e.g., "23.2MiB/100MiB")
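For illustration, a TypeScript sketch of the binary-unit formatting the progress bars now show (the actual implementation is Zig's `std.fmt.fmtIntSizeBin`, not this helper):

```ts
// Hypothetical helper mirroring binary (1024-based) size formatting.
function fmtSizeBin(bytes: number): string {
  const units = ["B", "KiB", "MiB", "GiB", "TiB"];
  let value = bytes;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i++;
  }
  return `${value.toFixed(1)}${units[i]}`;
}

console.log(fmtSizeBin(23982378)); // "22.9MiB"
console.log(`${fmtSizeBin(24326963)}/${fmtSizeBin(104857600)}`); // "23.2MiB/100.0MiB"
```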

## Changes

Modified `src/Progress.zig`:
- Updated progress formatting to use `std.fmt.fmtIntSizeBin` for both
current and total sizes
- Applied to both progress with total (`[current/total unit]`) and
without total (`[current unit]`)

## Test plan

- [x] Build succeeds with `bun bd`
- [ ] Manual verification with `bun upgrade` shows human-readable sizes

Fixes #24226 fixes #7826

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: pfg <pfg@pfg.pw>
2025-11-10 20:02:00 -08:00
robobun
b87ac4a781 Update ci_info with more CI detection (#23708)
Fixes ENG-21481

Updates ci_info to include more CIs. It makes it codegen the ci
detection based on the json from the ci-info package. Also it supports
setting CI=true to force ci detected.

---------

Co-authored-by: pfg <pfg@pfg.pw>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 19:58:02 -08:00
robobun
b876938f6d docs: update documentation for Bun v1.3.2 features (#24503)
## Summary
Updates documentation for all major features and changes introduced in
Bun v1.3.2 blog post.

## Changes

### Package Manager
-  Document `configVersion` system for controlling default linker
behavior
-  Clarify that "existing projects (made pre-v1.3.2)" use hoisted
installs for backward compatibility
-  Add smart postinstall script optimization with environment variable
flags
-  Document improved Git dependency resolution with HTTP tarball
optimization
-  Add `bun list` alias for `bun pm ls`

### Testing
-  Document new `onTestFinished` lifecycle hook with simple example
-  Add to lifecycle hooks table in test documentation

### Runtime & Performance
-  Add CPU profiling with `--cpu-prof` flag documentation
-  Place after memory usage section for better flow

### WebSockets
-  Add `subscriptions` getter to existing pub/sub example
-  Add TypeScript reference for the subscriptions property

## Documentation Improvements
All documentation now consistently:
- Uses "made pre-v1.3.2" to clarify existing project behavior
- Simplifies default linker explanations with clear references to
`/docs/pm/isolated-installs`
- Uses `/docs/pm/isolated-installs` for all internal references
- Avoids confusing technical details in favor of user-friendly summaries

## Files Modified
- `docs/guides/install/add-git.mdx` - Added GitHub tarball optimization
note
- `docs/pm/cli/install.mdx` - Added installation strategies and smart
postinstall docs
- `docs/pm/cli/pm.mdx` - Added bun list alias
- `docs/pm/isolated-installs.mdx` - Updated default behavior section
with configVersion table
- `docs/project/benchmarking.mdx` - Added CPU profiling section
- `docs/runtime/bunfig.mdx` - Clarified install.linker defaults
- `docs/runtime/http/websockets.mdx` - Added subscriptions to example
and TypeScript interface
- `docs/test/lifecycle.mdx` - Added onTestFinished hook documentation

## Diff

````diff
diff --git a/docs/guides/install/add-git.mdx b/docs/guides/install/add-git.mdx
index 70950e1a63..7f8f3c8d81 100644
--- a/docs/guides/install/add-git.mdx
+++ b/docs/guides/install/add-git.mdx
@@ -33,6 +33,8 @@ bun add git@github.com:lodash/lodash.git
 bun add github:colinhacks/zod
 ```
 
+**Note:** GitHub dependencies download via HTTP tarball when possible for faster installation.
+
 ---
 
 See [Docs > Package manager](https://bun.com/docs/cli/install) for complete documentation of Bun's package manager.
diff --git a/docs/pm/cli/install.mdx b/docs/pm/cli/install.mdx
index 7affb62646..dde268b7e5 100644
--- a/docs/pm/cli/install.mdx
+++ b/docs/pm/cli/install.mdx
@@ -88,6 +88,13 @@ Lifecycle scripts will run in parallel during installation. To adjust the maximu
 bun install --concurrent-scripts 5
 ```
 
+Bun automatically optimizes postinstall scripts for popular packages (like `esbuild`, `sharp`, etc.) by determining which scripts need to run. To disable these optimizations:
+
+```bash terminal icon="terminal"
+BUN_FEATURE_FLAG_DISABLE_NATIVE_DEPENDENCY_LINKER=1 bun install
+BUN_FEATURE_FLAG_DISABLE_IGNORE_SCRIPTS=1 bun install
+```
+
 ---
 
 ## Workspaces
@@ -231,7 +238,7 @@ Bun supports installing dependencies from Git, GitHub, and local or remotely-hos
 
 Bun supports two package installation strategies that determine how dependencies are organized in `node_modules`:
 
-### Hoisted installs (default for single projects)
+### Hoisted installs
 
 The traditional npm/Yarn approach that flattens dependencies into a shared `node_modules` directory:
 
@@ -249,7 +256,15 @@ bun install --linker isolated
 
 Isolated installs create a central package store in `node_modules/.bun/` with symlinks in the top-level `node_modules`. This ensures packages can only access their declared dependencies.
 
-For complete documentation on isolated installs, refer to [Package manager > Isolated installs](/pm/isolated-installs).
+### Default strategy
+
+The default linker strategy depends on whether you're starting fresh or have an existing project:
+
+- **New workspaces/monorepos**: `isolated` (prevents phantom dependencies)
+- **New single-package projects**: `hoisted` (traditional npm behavior)
+- **Existing projects (made pre-v1.3.2)**: `hoisted` (preserves backward compatibility)
+
+The default is controlled by a `configVersion` field in your lockfile. For a detailed explanation, see [Package manager > Isolated installs](/docs/pm/isolated-installs).
 
 ---
 
@@ -319,8 +334,7 @@ dryRun = false
 concurrentScripts = 16 # (cpu count or GOMAXPROCS) x2
 
 # installation strategy: "hoisted" or "isolated"
-# default: "hoisted" (for single-project projects)
-# default: "isolated" (for monorepo projects)
+# default varies by project type - see /docs/pm/isolated-installs
 linker = "hoisted"
 
 
diff --git a/docs/pm/cli/pm.mdx b/docs/pm/cli/pm.mdx
index fc297753d3..9c8faa7da1 100644
--- a/docs/pm/cli/pm.mdx
+++ b/docs/pm/cli/pm.mdx
@@ -115,6 +115,8 @@ To print a list of installed dependencies in the current project and their resol
 
 ```bash terminal icon="terminal"
 bun pm ls
+# or
+bun list
 ```
 
 ```txt
@@ -130,6 +132,8 @@ To print all installed dependencies, including nth-order dependencies.
 
 ```bash terminal icon="terminal"
 bun pm ls --all
+# or
+bun list --all
 ```
 
 ```txt
diff --git a/docs/pm/isolated-installs.mdx b/docs/pm/isolated-installs.mdx
index 73c6748b15..17afe02fe1 100644
--- a/docs/pm/isolated-installs.mdx
+++ b/docs/pm/isolated-installs.mdx
@@ -5,7 +5,7 @@ description: "Strict dependency isolation similar to pnpm's approach"
 
 Bun provides an alternative package installation strategy called **isolated installs** that creates strict dependency isolation similar to pnpm's approach. This mode prevents phantom dependencies and ensures reproducible, deterministic builds.
 
-This is the default installation strategy for monorepo projects.
+This is the default installation strategy for **new** workspace/monorepo projects (with `configVersion = 1` in the lockfile). Existing projects continue using hoisted installs unless explicitly configured.
 
 ## What are isolated installs?
 
@@ -43,8 +43,23 @@ linker = "isolated"
 
 ### Default behavior
 
-- For monorepo projects, Bun uses the **isolated** installation strategy by default.
-- For single-project projects, Bun uses the **hoisted** installation strategy by default.
+The default linker strategy depends on your project's lockfile `configVersion`:
+
+| `configVersion` | Using workspaces? | Default Linker |
+| --------------- | ----------------- | -------------- |
+| `1`             | Yes               | `isolated`     |
+| `1`             | No                | `hoisted`      |
+| `0`             | Yes               | `hoisted`      |
+| `0`             | No                | `hoisted`      |
+
+**New projects**: Default to `configVersion = 1`. In workspaces, v1 uses the isolated linker by default; otherwise it uses hoisted linking.
+
+**Existing Bun projects (made pre-v1.3.2)**: If your existing lockfile doesn't have a version yet, Bun sets `configVersion = 0` when you run `bun install`, preserving the previous hoisted linker default.
+
+**Migrations from other package managers**:
+
+- From pnpm: `configVersion = 1` (using isolated installs in workspaces)
+- From npm or yarn: `configVersion = 0` (using hoisted installs)
 
 You can override the default behavior by explicitly specifying the `--linker` flag or setting it in your configuration file.
 
diff --git a/docs/project/benchmarking.mdx b/docs/project/benchmarking.mdx
index 1263a06729..2ab8bcafc8 100644
--- a/docs/project/benchmarking.mdx
+++ b/docs/project/benchmarking.mdx
@@ -216,3 +216,26 @@ numa nodes:       1
    elapsed:       0.068 s
    process: user: 0.061 s, system: 0.014 s, faults: 0, rss: 57.4 MiB, commit: 64.0 MiB
 ```
+
+## CPU profiling
+
+Profile JavaScript execution to identify performance bottlenecks with the `--cpu-prof` flag.
+
+```sh terminal icon="terminal"
+bun --cpu-prof script.js
+```
+
+This generates a `.cpuprofile` file you can open in Chrome DevTools (Performance tab → Load profile) or VS Code's CPU profiler.
+
+### Options
+
+```sh terminal icon="terminal"
+bun --cpu-prof --cpu-prof-name my-profile.cpuprofile script.js
+bun --cpu-prof --cpu-prof-dir ./profiles script.js
+```
+
+| Flag                         | Description          |
+| ---------------------------- | -------------------- |
+| `--cpu-prof`                 | Enable profiling     |
+| `--cpu-prof-name <filename>` | Set output filename  |
+| `--cpu-prof-dir <dir>`       | Set output directory |
diff --git a/docs/runtime/bunfig.mdx b/docs/runtime/bunfig.mdx
index 91005c1607..5b7fe49823 100644
--- a/docs/runtime/bunfig.mdx
+++ b/docs/runtime/bunfig.mdx
@@ -497,9 +497,9 @@ print = "yarn"
 
 ### `install.linker`
 
-Configure the default linker strategy. Default `"hoisted"` for single-project projects, `"isolated"` for monorepo projects.
+Configure the linker strategy for installing dependencies. Defaults to `"isolated"` for new workspaces, `"hoisted"` for new single-package projects and existing projects (made pre-v1.3.2).
 
-For complete documentation refer to [Package manager > Isolated installs](/pm/isolated-installs).
+For complete documentation refer to [Package manager > Isolated installs](/docs/pm/isolated-installs).
 
 ```toml title="bunfig.toml" icon="settings"
 [install]
diff --git a/docs/runtime/http/websockets.mdx b/docs/runtime/http/websockets.mdx
index b33f37c29f..174043200d 100644
--- a/docs/runtime/http/websockets.mdx
+++ b/docs/runtime/http/websockets.mdx
@@ -212,6 +212,9 @@ const server = Bun.serve({
       // this is a group chat
       // so the server re-broadcasts incoming message to everyone
       server.publish("the-group-chat", `${ws.data.username}: ${message}`);
+
+      // inspect current subscriptions
+      console.log(ws.subscriptions); // ["the-group-chat"]
     },
     close(ws) {
       const msg = `${ws.data.username} has left the chat`;
@@ -393,6 +396,7 @@ interface ServerWebSocket {
   readonly data: any;
   readonly readyState: number;
   readonly remoteAddress: string;
+  readonly subscriptions: string[];
   send(message: string | ArrayBuffer | Uint8Array, compress?: boolean): number;
   close(code?: number, reason?: string): void;
   subscribe(topic: string): void;
diff --git a/docs/test/lifecycle.mdx b/docs/test/lifecycle.mdx
index 6427175df6..3837f0e948 100644
--- a/docs/test/lifecycle.mdx
+++ b/docs/test/lifecycle.mdx
@@ -6,11 +6,12 @@ description: "Learn how to use beforeAll, beforeEach, afterEach, and afterAll li
 The test runner supports the following lifecycle hooks. This is useful for loading test fixtures, mocking data, and configuring the test environment.
 
 | Hook             | Description                                                |
-| ------------ | --------------------------- |
+| ---------------- | ---------------------------------------------------------- |
 | `beforeAll`      | Runs once before all tests.                                |
 | `beforeEach`     | Runs before each test.                                     |
 | `afterEach`      | Runs after each test.                                      |
 | `afterAll`       | Runs once after all tests.                                 |
+| `onTestFinished` | Runs after a single test finishes (after all `afterEach`). |
 
 ## Per-Test Setup and Teardown
 
@@ -90,6 +91,23 @@ describe("test group", () => {
 });
 ```
 
+### `onTestFinished`
+
+Use `onTestFinished` to run a callback after a single test completes. It runs after all `afterEach` hooks.
+
+```ts title="test.ts" icon="/icons/typescript.svg"
+import { test, onTestFinished } from "bun:test";
+
+test("cleanup after test", () => {
+  onTestFinished(() => {
+    // runs after all afterEach hooks
+    console.log("test finished");
+  });
+});
+```
+
+Not supported in concurrent tests; use `test.serial` instead.
+
 ## Global Setup and Teardown
 
 To scope the hooks to an entire multi-file test run, define the hooks in a separate file.
````

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Michael H <git@riskymh.dev>
Co-authored-by: Lydia Hallie <lydiajuliettehallie@gmail.com>
2025-11-10 18:18:07 -08:00
Lydia Hallie
6b70b71895 Add TanStack Start guide (#24572)
Adds a guide on how to build and deploy TanStack Start with Bun
2025-11-10 17:38:48 -08:00
Marko Vejnovic
80a5b59fe5 bug(ENG-21501): Fix integer overflow in hosted_git_info.zig (#24561)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 16:30:47 -08:00
Alistair Smith
d87a928b94 Remove dependency on React's types in @types/bun 2025-11-10 15:50:45 -08:00
pfg
05d0475c6c Update to zig 0.15.2 (#24204)
Fixes ENG-21287

Build times, from `bun run build && echo '//' >> src/main.zig && time
bun run build`

|Platform|0.14.1|0.15.2|Speedup|
|-|-|-|-|
|macos debug asan|126.90s|106.27s|1.19x|
|macos debug noasan|60.62s|50.85s|1.19x|
|linux debug asan|292.77s|241.45s|1.21x|
|linux debug noasan|146.58s|130.94s|1.12x|
|linux debug use_llvm=false|n/a|78.27s|1.87x|
|windows debug asan|177.13s|142.55s|1.24x|

Runtime performance:

- next build memory usage may have gone up by 5%. Otherwise seems the
same. Some code with writers may have gotten slower, especially one
instance of a counting writer and a few instances of unbuffered writers
that now have vtable overhead.
- File size reduced by 800kb (from 100.2mb to 99.4mb)

Improvements:

- `@export` hack is no longer needed for watch
- native x86_64 backend for linux builds faster. to use it, set use_llvm
false and no_link_obj false. also set `ASAN_OPTIONS=detect_leaks=0`
otherwise it will spam the output with tens of thousands of lines of
debug info errors. may need to use the zig lldb fork for debugging.
- zig test-obj, which we will be able to use for zig unit tests

Still an issue:

- false 'dependency loop' errors remain in watch mode
- watch mode crashes observed

Follow-up:

- [ ] search `comptime Writer: type` and `comptime W: type` and remove
- [ ] remove format_mode in our zig fork
- [ ] remove deprecated.zig autoFormatLabelFallback
- [ ] remove deprecated.zig autoFormatLabel
- [ ] remove deprecated.BufferedWriter and BufferedReader
- [ ] remove override_no_export_cpp_apis as it is no longer needed
- [ ] css Parser(W) -> Parser, and remove all the comptime writer: type
params
- [ ] remove deprecated writer fully

Files that add lines:

```
649     src/deprecated.zig
167     scripts/pack-codegen-for-zig-team.ts
54      scripts/cleartrace-impl.js
46      scripts/cleartrace.ts
43      src/windows.zig
18      src/fs.zig
17      src/bun.js/ConsoleObject.zig
16      src/output.zig
12      src/bun.js/test/debug.zig
12      src/bun.js/node/node_fs.zig
8       src/env_loader.zig
7       src/css/printer.zig
7       src/cli/init_command.zig
7       src/bun.js/node.zig
6       src/string/escapeRegExp.zig
6       src/install/PnpmMatcher.zig
5       src/bun.js/webcore/Blob.zig
4       src/crash_handler.zig
4       src/bun.zig
3       src/install/lockfile/bun.lock.zig
3       src/cli/update_interactive_command.zig
3       src/cli/pack_command.zig
3       build.zig
2       src/Progress.zig
2       src/install/lockfile/lockfile_json_stringify_for_debugging.zig
2       src/css/small_list.zig
2       src/bun.js/webcore/prompt.zig
1       test/internal/ban-words.test.ts
1       test/internal/ban-limits.json
1       src/watcher/WatcherTrace.zig
1       src/transpiler.zig
1       src/shell/builtin/cp.zig
1       src/js_printer.zig
1       src/io/PipeReader.zig
1       src/install/bin.zig
1       src/css/selectors/selector.zig
1       src/cli/run_command.zig
1       src/bun.js/RuntimeTranspilerStore.zig
1       src/bun.js/bindings/JSRef.zig
1       src/bake/DevServer.zig
```

Files that remove lines:

```
-1      src/test/recover.zig
-1      src/sql/postgres/SocketMonitor.zig
-1      src/sql/mysql/MySQLRequestQueue.zig
-1      src/sourcemap/CodeCoverage.zig
-1      src/css/values/color_js.zig
-1      src/compile_target.zig
-1      src/bundler/linker_context/convertStmtsForChunk.zig
-1      src/bundler/bundle_v2.zig
-1      src/bun.js/webcore/blob/read_file.zig
-1      src/ast/base.zig
-2      src/sql/postgres/protocol/ArrayList.zig
-2      src/shell/builtin/mkdir.zig
-2      src/install/PackageManager/patchPackage.zig
-2      src/install/PackageManager/PackageManagerDirectories.zig
-2      src/fmt.zig
-2      src/css/declaration.zig
-2      src/css/css_parser.zig
-2      src/collections/baby_list.zig
-2      src/bun.js/bindings/ZigStackFrame.zig
-2      src/ast/E.zig
-3      src/StandaloneModuleGraph.zig
-3      src/deps/picohttp.zig
-3      src/deps/libuv.zig
-3      src/btjs.zig
-4      src/threading/Futex.zig
-4      src/shell/builtin/touch.zig
-4      src/meta.zig
-4      src/install/lockfile.zig
-4      src/css/selectors/parser.zig
-5      src/shell/interpreter.zig
-5      src/css/error.zig
-5      src/bun.js/web_worker.zig
-5      src/bun.js.zig
-6      src/cli/test_command.zig
-6      src/bun.js/VirtualMachine.zig
-6      src/bun.js/uuid.zig
-6      src/bun.js/bindings/JSValue.zig
-9      src/bun.js/test/pretty_format.zig
-9      src/bun.js/api/BunObject.zig
-14     src/install/install_binding.zig
-14     src/fd.zig
-14     src/bun.js/node/path.zig
-14     scripts/pack-codegen-for-zig-team.sh
-17     src/bun.js/test/diff_format.zig
```

`git diff --numstat origin/main...HEAD | awk '{ print ($1-$2)"\t"$3 }' |
sort -rn`

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Meghan Denny <meghan@bun.com>
Co-authored-by: tayor.fish <contact@taylor.fish>
2025-11-10 14:38:26 -08:00
Lydia Hallie
143ad2ea58 Remove bun version from Vercel guide (#24562) 2025-11-10 14:09:04 -08:00
Dylan Conway
6f9843ea9a fix(install): bun pm ls with unresolved dependencies (#24541)
### What does this PR do?
Fixes `bun pm ls --all` crash with unresolved optional peer
dependencies.
Fixes `bun pm ls` crash with empty lockfiles.

Fixes #24502 
### How did you verify your code works?
Added a test for both crashes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 11:19:57 -08:00
github-actions[bot]
0a307ed880 deps: update sqlite to 3.51.0 (#24530) 2025-11-09 01:09:25 -08:00
robobun
b4f85c8866 Update docs example versions to 1.3.2 (#24522)
## Summary

Updated all example version placeholders in documentation from 1.3.1 and
1.2.20 to 1.3.2.

## Changes

Updated version examples in:
- Installation examples (Linux/macOS and Windows install commands)
- Package manager output examples (`bun install`, `bun publish`, `bun
pm` commands)
- Test runner output examples
- Spawn/child process output examples
- Fetch User-Agent header examples in debugging docs
- `Bun.version` API example

## Notes

- Historical version references (e.g., "As of Bun v1.x.x..." or "Bun
v1.x.x+ required") were intentionally **preserved** as they document
when features were introduced
- Generic package.json version examples (non-Bun package versions) were
**preserved**
- Only example outputs and code snippets showing current Bun version
were updated

## Files Changed (13 total)

- `docs/installation.mdx`
- `docs/guides/install/from-npm-install-to-bun-install.mdx`
- `docs/guides/install/add-peer.mdx`
- `docs/bundler/html-static.mdx` (6 occurrences)
- `docs/test/dom.mdx`
- `docs/pm/cli/publish.mdx`
- `docs/pm/cli/pm.mdx`
- `docs/guides/test/snapshot.mdx` (2 occurrences)
- `docs/guides/ecosystem/nuxt.mdx`
- `docs/guides/util/version.mdx`
- `docs/runtime/debugger.mdx` (3 occurrences)
- `docs/runtime/networking/fetch.mdx`
- `docs/runtime/child-process.mdx`

**Total:** 23 version references updated

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Michael H <git@riskymh.dev>
2025-11-09 16:20:04 +11:00
Michael H
614e8292e3 docs: fix discord invite (#24498)
### What does this PR do?

we don't have the discord vanity invite

### How did you verify your code works?
2025-11-08 21:09:57 -08:00
Michael H
3829b6d0aa add .mdx to .gitattributes (#24525)
### What does this PR do?

### How did you verify your code works?
2025-11-08 20:56:38 -08:00
Meghan Denny
f30e3951a7 Bump 2025-11-07 23:58:34 -08:00
Michael H
b131639cc5 ci: run modified tests first (#24463)
Co-authored-by: Meghan Denny <meghan@bun.com>
2025-11-07 21:49:58 -08:00
Jarred Sumner
b9b07172aa Update package.json 2025-11-07 21:22:53 -08:00
Marko Vejnovic
02b474415d bug(ENG-21479): Fix Valkey URL Parsing (#24458)
### What does this PR do?

Fixes https://github.com/oven-sh/bun/issues/24385
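A hedged sketch of the kind of connection URL whose parsing this fixes (host and credentials are placeholders; `RedisClient` is Bun's built-in Redis/Valkey client):

```ts
import { RedisClient } from "bun";

// valkey:// URLs are parsed the same way as redis:// URLs.
const client = new RedisClient("valkey://user:secret@localhost:6379/0");
await client.set("greeting", "hello");
console.log(await client.get("greeting"));
client.close();
```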

### How did you verify your code works?

Confirmed that the test added in the first commit fails on mainline
`bun` and is fixed in this PR.

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-11-07 21:18:07 -08:00
Dylan Conway
de9a38bd11 fix(install): create bun.lock instead of bun.lockb if npm/yarn/pnpm migration fails (#24494)
### What does this PR do?

### How did you verify your code works?

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-07 20:58:44 -08:00
Marko Vejnovic
2e57c6bf95 fix(ENG-21492): Fix private git+ssh installs (#24490)
### What does this PR do?

This PR is the fix-only version of
https://github.com/oven-sh/bun/pull/24486. Unfortunately due to
complexity setting up all CI agents to ping private git repos, I was
unable to get CI passing there.

### How did you verify your code works?

I ran this:

```
marko@fedora:~/Desktop/bun-4$ bun add git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8
bun add v1.3.2-canary.108 (44402ad2)
  🔍 Resolving [1/1] error: "git clone" for "git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8" failed
error: InstallFailed cloning repository for git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8
error: git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8 failed to resolve
```

followed by

```
marko@fedora:~/Desktop/bun-4$ BUN_DEBUG_QUIET_LOGS=1 ./build/debug/bun-debug add git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8
bun add v1.3.2 (0db90b25)

installed private-install-test-repo@git+ssh://git@github.com:oven-sh/private-install-test-repo.git#5b37e644a2ef23fad0da4027042f01b194b179e8

[1.61s] done
```

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
2025-11-07 19:56:34 -08:00
Lydia Hallie
35a57ff008 Add Upstash guide with Bun Redis (#24493) 2025-11-07 18:32:19 -08:00
Meghan Denny
7a931d5b26 [publish images] 2025-11-07 16:41:57 -08:00
Meghan Denny
2b42be9dcc [publish images] 2025-11-07 16:40:13 -08:00
Meghan Denny
e6be28b8d4 [publish images] 2025-11-07 16:37:57 -08:00
Meghan Denny
d0a1984a20 ci: skip running tests when a PR only changes docs (#24459)
fixes https://linear.app/oven/issue/ENG-21489
2025-11-07 15:52:37 -08:00
Lydia Hallie
1896c75d78 Add deploy guides for AWS Lambda, Google Run, DigitalOcean (#24414)
Adds deployment guides for Bun apps on AWS Lambda, Google Cloud Run, and
DigitalOcean using a custom `Dockerfile`

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-07 15:03:36 -08:00
Marko Vejnovic
3a810da66c build(ENG-21466): Fix sccache not caching across builds (#24423) 2025-11-07 14:33:26 -08:00
Jarred Sumner
0db90b2526 Implement isolated event loop for spawnSync (#24436) 2025-11-07 05:28:33 -08:00
824 changed files with 19978 additions and 12055 deletions

View File

@@ -133,6 +133,20 @@ RUN ARCH=$(if [ "$TARGETARCH" = "arm64" ]; then echo "arm64"; else echo "amd64";
RUN mkdir -p /var/cache/buildkite-agent /var/log/buildkite-agent /var/run/buildkite-agent /etc/buildkite-agent /var/lib/buildkite-agent/cache/bun
# The following is necessary to configure buildkite to use a stable
# checkout directory. sccache hashes absolute paths into its cache keys,
# so if buildkite uses a different checkout path each time (which it does
# by default), sccache will be useless.
RUN mkdir -p -m 755 /var/lib/buildkite-agent/hooks && \
cat <<'EOF' > /var/lib/buildkite-agent/hooks/environment
#!/bin/sh
set -efu
export BUILDKITE_BUILD_CHECKOUT_PATH=/var/lib/buildkite-agent/build
EOF
RUN chmod 744 /var/lib/buildkite-agent/hooks/environment
COPY ../*/agent.mjs /var/bun/scripts/
ENV BUN_INSTALL_CACHE=/var/lib/buildkite-agent/cache/bun

View File

@@ -16,6 +16,7 @@ import {
getEmoji,
getEnv,
getLastSuccessfulBuild,
getSecret,
isBuildkite,
isBuildManual,
isFork,
@@ -555,7 +556,6 @@ function getBuildBunStep(platform, options) {
/**
* @typedef {Object} TestOptions
* @property {string} [buildId]
* @property {boolean} [unifiedTests]
* @property {string[]} [testFiles]
* @property {boolean} [dryRun]
*/
@@ -568,7 +568,7 @@ function getBuildBunStep(platform, options) {
*/
function getTestBunStep(platform, options, testOptions = {}) {
const { os, profile } = platform;
const { buildId, unifiedTests, testFiles } = testOptions;
const { buildId, testFiles } = testOptions;
const args = [`--step=${getTargetKey(platform)}-build-bun`];
if (buildId) {
@@ -590,7 +590,7 @@ function getTestBunStep(platform, options, testOptions = {}) {
agents: getTestAgent(platform, options),
retry: getRetry(),
cancel_on_build_failing: isMergeQueue(),
parallelism: unifiedTests ? undefined : os === "darwin" ? 2 : 10,
parallelism: os === "darwin" ? 2 : 10,
timeout_in_minutes: profile === "asan" || os === "windows" ? 45 : 30,
env: {
ASAN_OPTIONS: "allow_user_segv_handler=1:disable_coredump=0:detect_leaks=0",
@@ -772,8 +772,6 @@ function getBenchmarkStep() {
* @property {Platform[]} [buildPlatforms]
* @property {Platform[]} [testPlatforms]
* @property {string[]} [testFiles]
* @property {boolean} [unifiedBuilds]
* @property {boolean} [unifiedTests]
*/
/**
@@ -944,22 +942,6 @@ function getOptionsStep() {
default: "false",
options: booleanOptions,
},
{
key: "unified-builds",
select: "Do you want to build each platform in a single step?",
hint: "If true, builds will not be split into separate steps (this will likely slow down the build)",
required: false,
default: "false",
options: booleanOptions,
},
{
key: "unified-tests",
select: "Do you want to run tests in a single step?",
hint: "If true, tests will not be split into separate steps (this will be very slow)",
required: false,
default: "false",
options: booleanOptions,
},
],
};
}
@@ -1025,8 +1007,6 @@ async function getPipelineOptions() {
buildImages: parseBoolean(options["build-images"]),
publishImages: parseBoolean(options["publish-images"]),
testFiles: parseArray(options["test-files"]),
unifiedBuilds: parseBoolean(options["unified-builds"]),
unifiedTests: parseBoolean(options["unified-tests"]),
buildPlatforms: buildPlatformKeys?.length
? buildPlatformKeys.flatMap(key => buildProfiles.map(profile => ({ ...buildPlatformsMap.get(key), profile })))
: Array.from(buildPlatformsMap.values()),
@@ -1108,7 +1088,7 @@ async function getPipeline(options = {}) {
});
}
let { skipBuilds, forceBuilds, unifiedBuilds, dryRun } = options;
let { skipBuilds, forceBuilds, dryRun } = options;
dryRun = dryRun || !!buildImages;
/** @type {string | undefined} */
@@ -1139,13 +1119,16 @@ async function getPipeline(options = {}) {
dependsOn.push(`${imageKey}-build-image`);
}
const steps = [];
steps.push(getBuildCppStep(target, options));
steps.push(getBuildZigStep(target, options));
steps.push(getLinkBunStep(target, options));
return getStepWithDependsOn(
{
key: getTargetKey(target),
group: getTargetLabel(target),
steps: unifiedBuilds
? [getBuildBunStep(target, options)]
: [getBuildCppStep(target, options), getBuildZigStep(target, options), getLinkBunStep(target, options)],
steps,
},
...dependsOn,
);
@@ -1154,13 +1137,13 @@ async function getPipeline(options = {}) {
}
if (!isMainBranch()) {
const { skipTests, forceTests, unifiedTests, testFiles } = options;
const { skipTests, forceTests, testFiles } = options;
if (!skipTests || forceTests) {
steps.push(
...testPlatforms.map(target => ({
key: getTargetKey(target),
group: getTargetLabel(target),
steps: [getTestBunStep(target, options, { unifiedTests, testFiles, buildId })],
steps: [getTestBunStep(target, options, { testFiles, buildId })],
})),
);
}
@@ -1203,6 +1186,43 @@ async function main() {
console.log("Generated options:", options);
}
startGroup("Querying GitHub for files...");
if (options && isBuildkite && !isMainBranch()) {
/** @type {string[]} */
let allFiles = [];
/** @type {string[]} */
let newFiles = [];
let prFileCount = 0;
try {
console.log("on buildkite: collecting new files from PR");
const per_page = 50;
const { BUILDKITE_PULL_REQUEST } = process.env;
for (let i = 1; i <= 10; i++) {
const res = await fetch(
`https://api.github.com/repos/oven-sh/bun/pulls/${BUILDKITE_PULL_REQUEST}/files?per_page=${per_page}&page=${i}`,
{ headers: { Authorization: `Bearer ${getSecret("GITHUB_TOKEN")}` } },
);
const doc = await res.json();
console.log(`-> page ${i}, found ${doc.length} items`);
if (doc.length === 0) break;
for (const { filename, status } of doc) {
prFileCount += 1;
allFiles.push(filename);
if (status !== "added") continue;
newFiles.push(filename);
}
if (doc.length < per_page) break;
}
console.log(`- PR ${BUILDKITE_PULL_REQUEST}, ${prFileCount} files, ${newFiles.length} new files`);
} catch (e) {
console.error(e);
}
if (allFiles.every(filename => filename.startsWith("docs/"))) {
console.log(`- PR is only docs, skipping tests!`);
return;
}
}
startGroup("Generating pipeline...");
const pipeline = await getPipeline(options);
if (!pipeline) {

1
.gitattributes vendored
View File

@@ -16,6 +16,7 @@
*.map text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.md text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mdc text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mdx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mjs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2

View File

@@ -9,7 +9,7 @@ on:
pull_request:
merge_group:
env:
BUN_VERSION: "1.2.20"
BUN_VERSION: "1.3.2"
LLVM_VERSION: "19.1.7"
LLVM_VERSION_MAJOR: "19"

View File

@@ -9,3 +9,6 @@ test/snippets
test/js/node/test
test/napi/node-napi-tests
bun.lock
# the output codeblocks need to stay minified
docs/bundler/minifier.mdx

View File

@@ -38,16 +38,36 @@ If no valid issue number is provided, find the best existing file to modify inst
### Writing Tests
Tests use Bun's Jest-compatible test runner with proper test fixtures:
Tests use Bun's Jest-compatible test runner with proper test fixtures.
- For **single-file tests**, prefer `-e` over `tempDir`.
- For **multi-file tests**, prefer `tempDir` and `Bun.spawn`.
```typescript
import { test, expect } from "bun:test";
import { bunEnv, bunExe, normalizeBunSnapshot, tempDir } from "harness";
test("my feature", async () => {
test("(single-file test) my feature", async () => {
await using proc = Bun.spawn({
cmd: [bunExe(), "-e", "console.log('Hello, world!')"],
env: bunEnv,
});
const [stdout, stderr, exitCode] = await Promise.all([
proc.stdout.text(),
proc.stderr.text(),
proc.exited,
]);
expect(normalizeBunSnapshot(stdout)).toMatchInlineSnapshot(`"Hello, world!"`);
expect(exitCode).toBe(0);
});
test("(multi-file test) my feature", async () => {
// Create temp directory with test files
using dir = tempDir("test-prefix", {
"index.js": `console.log("hello");`,
"index.js": `import { foo } from "./foo.ts"; foo();`,
"foo.ts": `export function foo() { console.log("foo"); }`,
});
// Spawn Bun process

View File

@@ -25,16 +25,6 @@ if(CMAKE_HOST_APPLE)
endif()
include(SetupLLVM)
find_program(SCCACHE_PROGRAM sccache)
if(SCCACHE_PROGRAM AND NOT DEFINED ENV{NO_SCCACHE})
include(SetupSccache)
else()
find_program(CCACHE_PROGRAM ccache)
if(CCACHE_PROGRAM)
include(SetupCcache)
endif()
endif()
# --- Project ---
parse_package_json(VERSION_VARIABLE DEFAULT_VERSION)
@@ -57,6 +47,16 @@ include(SetupEsbuild)
include(SetupZig)
include(SetupRust)
find_program(SCCACHE_PROGRAM sccache)
if(SCCACHE_PROGRAM AND NOT DEFINED ENV{NO_SCCACHE})
include(SetupSccache)
else()
find_program(CCACHE_PROGRAM ccache)
if(CCACHE_PROGRAM)
include(SetupCcache)
endif()
endif()
# Generate dependency versions header
include(GenerateDependencyVersions)

View File

@@ -201,7 +201,7 @@ Bun generally takes about 2.5 minutes to compile a debug build when there are Zi
- Batch up your changes
- Ensure zls is running with incremental watching for LSP errors (if you use VSCode and install Zig and run `bun run build` once to download Zig, this should just work)
- Prefer using the debugger ("CodeLLDB" in VSCode) to step through the code.
- Use debug logs. `BUN_DEBUG_<scope>=1` will enable debug logging for the corresponding `Output.scoped(.<scope>, .hidden)` logs. You can also set `BUN_DEBUG_QUIET_LOGS=1` to disable all debug logging that isn't explicitly enabled. To dump debug lgos into a file, `BUN_DEBUG=<path-to-file>.log`. Debug logs are aggressively removed in release builds.
- Use debug logs. `BUN_DEBUG_<scope>=1` will enable debug logging for the corresponding `Output.scoped(.<scope>, .hidden)` logs. You can also set `BUN_DEBUG_QUIET_LOGS=1` to disable all debug logging that isn't explicitly enabled. To dump debug logs into a file, `BUN_DEBUG=<path-to-file>.log`. Debug logs are aggressively removed in release builds.
- src/js/\*\*.ts changes are pretty much instant to rebuild. C++ changes are a bit slower, but still much faster than the Zig code (Zig is one compilation unit, C++ is many).
## Code generation scripts

2
LATEST
View File

@@ -1 +1 @@
1.3.1
1.3.2

View File

@@ -230,7 +230,7 @@ bun upgrade --canary
- Ecosystem
- [Use React and JSX](https://bun.com/guides/ecosystem/react)
- [Use EdgeDB with Bun](https://bun.com/guides/ecosystem/edgedb)
- [Use Gel with Bun](https://bun.com/guides/ecosystem/gel)
- [Use Prisma with Bun](https://bun.com/guides/ecosystem/prisma)
- [Add Sentry to a Bun app](https://bun.com/guides/ecosystem/sentry)
- [Create a Discord bot](https://bun.com/guides/ecosystem/discordjs)

View File

@@ -18,22 +18,6 @@ const OperatingSystem = @import("src/env.zig").OperatingSystem;
const pathRel = fs.path.relative;
/// When updating this, make sure to adjust SetupZig.cmake
const recommended_zig_version = "0.14.0";
// comptime {
// if (!std.mem.eql(u8, builtin.zig_version_string, recommended_zig_version)) {
// @compileError(
// "" ++
// "Bun requires Zig version " ++ recommended_zig_version ++ ", but you have " ++
// builtin.zig_version_string ++ ". This is automatically configured via Bun's " ++
// "CMake setup. You likely meant to run `bun run build`. If you are trying to " ++
// "upgrade the Zig compiler, edit ZIG_COMMIT in cmake/tools/SetupZig.cmake or " ++
// "comment this error out.",
// );
// }
// }
const zero_sha = "0000000000000000000000000000000000000000";
const BunBuildOptions = struct {
@@ -99,7 +83,7 @@ const BunBuildOptions = struct {
opts.addOption(bool, "enable_asan", this.enable_asan);
opts.addOption(bool, "enable_valgrind", this.enable_valgrind);
opts.addOption(bool, "use_mimalloc", this.use_mimalloc);
opts.addOption([]const u8, "reported_nodejs_version", b.fmt("{}", .{this.reported_nodejs_version}));
opts.addOption([]const u8, "reported_nodejs_version", b.fmt("{f}", .{this.reported_nodejs_version}));
opts.addOption(bool, "zig_self_hosted_backend", this.no_llvm);
opts.addOption(bool, "override_no_export_cpp_apis", this.override_no_export_cpp_apis);
@@ -134,8 +118,8 @@ pub fn getOSVersionMin(os: OperatingSystem) ?Target.Query.OsVersion {
pub fn getOSGlibCVersion(os: OperatingSystem) ?Version {
return switch (os) {
// Compiling with a newer glibc than this will break certain cloud environments.
.linux => .{ .major = 2, .minor = 27, .patch = 0 },
// Compiling with a newer glibc than this will break certain cloud environments. See symbols.test.ts.
.linux => .{ .major = 2, .minor = 26, .patch = 0 },
else => null,
};
@@ -290,14 +274,16 @@ pub fn build(b: *Build) !void {
var o = build_options;
var unit_tests = b.addTest(.{
.name = "bun-test",
.optimize = build_options.optimize,
.root_source_file = b.path("src/unit_test.zig"),
.test_runner = .{ .path = b.path("src/main_test.zig"), .mode = .simple },
.target = build_options.target,
.root_module = b.createModule(.{
.optimize = build_options.optimize,
.root_source_file = b.path("src/unit_test.zig"),
.target = build_options.target,
.omit_frame_pointer = false,
.strip = false,
}),
.use_llvm = !build_options.no_llvm,
.use_lld = if (build_options.os == .mac) false else !build_options.no_llvm,
.omit_frame_pointer = false,
.strip = false,
});
configureObj(b, &o, unit_tests);
// Setting `linker_allow_shlib_undefined` causes the linker to ignore
@@ -331,6 +317,7 @@ pub fn build(b: *Build) !void {
var step = b.step("check", "Check for semantic analysis errors");
var bun_check_obj = addBunObject(b, &build_options);
bun_check_obj.generated_bin = null;
// bun_check_obj.use_llvm = false;
step.dependOn(&bun_check_obj.step);
// The default install step will run zig build check. This is so ZLS
@@ -616,7 +603,7 @@ fn configureObj(b: *Build, opts: *BunBuildOptions, obj: *Compile) void {
obj.llvm_codegen_threads = opts.llvm_codegen_threads orelse 0;
}
obj.no_link_obj = true;
obj.no_link_obj = opts.os != .windows;
if (opts.enable_asan and !enableFastBuild(b)) {
if (@hasField(Build.Module, "sanitize_address")) {
@@ -779,6 +766,13 @@ fn addInternalImports(b: *Build, mod: *Module, opts: *BunBuildOptions) void {
mod.addImport("cpp", cppImport);
cppImport.addImport("bun", mod);
}
{
const ciInfoImport = b.createModule(.{
.root_source_file = (std.Build.LazyPath{ .cwd_relative = opts.codegen_path }).path(b, "ci_info.zig"),
});
mod.addImport("ci_info", ciInfoImport);
ciInfoImport.addImport("bun", mod);
}
inline for (.{
.{ .import = "completions-bash", .file = b.path("completions/bun.bash") },
.{ .import = "completions-zsh", .file = b.path("completions/bun.zsh") },
@@ -804,7 +798,7 @@ fn addInternalImports(b: *Build, mod: *Module, opts: *BunBuildOptions) void {
fn propagateImports(source_mod: *Module) !void {
var seen = std.AutoHashMap(*Module, void).init(source_mod.owner.graph.arena);
defer seen.deinit();
var queue = std.ArrayList(*Module).init(source_mod.owner.graph.arena);
var queue = std.array_list.Managed(*Module).init(source_mod.owner.graph.arena);
defer queue.deinit();
try queue.appendSlice(source_mod.import_table.values());
while (queue.pop()) |mod| {

View File

@@ -1,5 +1,6 @@
{
"lockfileVersion": 1,
"configVersion": 0,
"workspaces": {
"": {
"name": "bun",
@@ -31,12 +32,6 @@
"dependencies": {
"@types/node": "*",
},
"devDependencies": {
"@types/react": "^19",
},
"peerDependencies": {
"@types/react": "^19",
},
},
},
"overrides": {
@@ -162,8 +157,6 @@
"@types/node": ["@types/node@24.2.1", "", { "dependencies": { "undici-types": "~7.10.0" } }, "sha512-DRh5K+ka5eJic8CjH7td8QpYEV6Zo10gfRkjHCO3weqZHWDtAaSTFtl4+VMqOJ4N5jcuhZ9/l+yy8rVgw7BQeQ=="],
"@types/react": ["@types/react@19.1.10", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-EhBeSYX0Y6ye8pNebpKrwFJq7BoQ8J5SO6NlvNwwHjSj6adXJViPQrKlsyPw7hLBLvckEMO1yxeGdR82YBBlDg=="],
"aggregate-error": ["aggregate-error@3.1.0", "", { "dependencies": { "clean-stack": "^2.0.0", "indent-string": "^4.0.0" } }, "sha512-4I7Td01quW/RpocfNayFdFVk1qSuoh0E7JrbRJ16nH01HhKFQ88INq9Sd+nd72zqRySlr9BmDA8xlEJ6vJMrYA=="],
"before-after-hook": ["before-after-hook@2.2.3", "", {}, "sha512-NzUnlZexiaH/46WDhANlyR2bXRopNg4F/zuSA3OpZnllCUgRaOF2znDioDWrmbNVsuZk6l9pMquQB38cfBZwkQ=="],
@@ -192,8 +185,6 @@
"constant-case": ["constant-case@3.0.4", "", { "dependencies": { "no-case": "^3.0.4", "tslib": "^2.0.3", "upper-case": "^2.0.2" } }, "sha512-I2hSBi7Vvs7BEuJDr5dDHfzb/Ruj3FyvFyh7KLilAjNQw3Be+xgqUBA2W6scVEcL0hL1dwPRtIqEPVUCKkSsyQ=="],
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
"deprecation": ["deprecation@2.3.1", "", {}, "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="],
"detect-libc": ["detect-libc@2.0.4", "", {}, "sha512-3UDv+G9CsCKO1WKMGw9fwq/SWJYbI0c5Y7LU1AXYoDdbhE2AHQ6N6Nb34sG8Fj7T5APy8qXDCKuuIHd1BR0tVA=="],

View File

@@ -125,7 +125,8 @@ setx(CWD ${CMAKE_SOURCE_DIR})
setx(BUILD_PATH ${CMAKE_BINARY_DIR})
optionx(CACHE_PATH FILEPATH "The path to the cache directory" DEFAULT ${BUILD_PATH}/cache)
optionx(CACHE_STRATEGY "read-write|read-only|none" "The strategy to use for caching" DEFAULT "read-write")
optionx(CACHE_STRATEGY "auto|distributed|local|none" "The strategy to use for caching" DEFAULT
"auto")
optionx(CI BOOL "If CI is enabled" DEFAULT OFF)
optionx(ENABLE_ANALYSIS BOOL "If static analysis targets should be enabled" DEFAULT OFF)
@@ -141,9 +142,39 @@ optionx(TMP_PATH FILEPATH "The path to the temporary directory" DEFAULT ${BUILD_
# --- Helper functions ---
# list_filter_out_regex()
#
# Description:
# Filters out elements from a list that match a regex pattern.
#
# Arguments:
# list - The list of strings to traverse
# pattern - The regex pattern to filter out
# touched - A variable to set if any items were removed
function(list_filter_out_regex list pattern touched)
set(result_list "${${list}}")
set(keep_list)
set(was_modified OFF)
foreach(line IN LISTS result_list)
if(line MATCHES "${pattern}")
set(was_modified ON)
else()
list(APPEND keep_list ${line})
endif()
endforeach()
set(${list} "${keep_list}" PARENT_SCOPE)
set(${touched} ${was_modified} PARENT_SCOPE)
endfunction()
# setenv()
# Description:
# Sets an environment variable during the build step, and writes it to a .env file.
#
# See Also:
# unsetenv()
#
# Arguments:
# variable string - The variable to set
# value string - The value to set the variable to
@@ -156,13 +187,7 @@ function(setenv variable value)
if(EXISTS ${ENV_PATH})
file(STRINGS ${ENV_PATH} ENV_FILE ENCODING UTF-8)
foreach(line ${ENV_FILE})
if(line MATCHES "^${variable}=")
list(REMOVE_ITEM ENV_FILE ${line})
set(ENV_MODIFIED ON)
endif()
endforeach()
list_filter_out_regex(ENV_FILE "^${variable}=" ENV_MODIFIED)
if(ENV_MODIFIED)
list(APPEND ENV_FILE "${variable}=${value}")
@@ -178,6 +203,28 @@ function(setenv variable value)
message(STATUS "Set ENV ${variable}: ${value}")
endfunction()
# unsetenv()
# Description:
# Exact opposite of setenv().
# Arguments:
# variable string - The variable to unset.
# See Also:
# setenv()
function(unsetenv variable)
set(ENV_PATH ${BUILD_PATH}/.env)
if(NOT EXISTS ${ENV_PATH})
return()
endif()
file(STRINGS ${ENV_PATH} ENV_FILE ENCODING UTF-8)
list_filter_out_regex(ENV_FILE "^${variable}=" ENV_MODIFIED)
if(ENV_MODIFIED)
list(JOIN ENV_FILE "\n" ENV_FILE)
file(WRITE ${ENV_PATH} ${ENV_FILE})
endif()
endfunction()
# satisfies_range()
# Description:
# Check if a version satisfies a version range or list of ranges

View File

@@ -34,26 +34,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(CLANG_FORMAT_CHANGED_SOURCES)
foreach(source ${CLANG_FORMAT_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND CLANG_FORMAT_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(CLANG_FORMAT_CHANGED_SOURCES)
set(CLANG_FORMAT_DIFF_COMMAND ${CLANG_FORMAT_PROGRAM}
-i # edits files in-place
--verbose
${CLANG_FORMAT_CHANGED_SOURCES}
)
else()
set(CLANG_FORMAT_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for clang-format")
endif()
register_command(
TARGET
clang-format-diff

View File

@@ -3,7 +3,7 @@
set(CLANG_TIDY_SOURCES ${BUN_C_SOURCES} ${BUN_CXX_SOURCES})
set(CLANG_TIDY_COMMAND ${CLANG_TIDY_PROGRAM}
-p ${BUILD_PATH}
-p ${BUILD_PATH}
--config-file=${CWD}/.clang-tidy
)
@@ -40,27 +40,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(CLANG_TIDY_CHANGED_SOURCES)
foreach(source ${CLANG_TIDY_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND CLANG_TIDY_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(CLANG_TIDY_CHANGED_SOURCES)
set(CLANG_TIDY_DIFF_COMMAND ${CLANG_TIDY_PROGRAM}
${CLANG_TIDY_CHANGED_SOURCES}
--fix
--fix-errors
--fix-notes
)
else()
set(CLANG_TIDY_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for clang-tidy")
endif()
register_command(
TARGET
clang-tidy-diff

View File

@@ -92,26 +92,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(PRETTIER_CHANGED_SOURCES)
foreach(source ${PRETTIER_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND PRETTIER_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(PRETTIER_CHANGED_SOURCES)
set(PRETTIER_DIFF_COMMAND ${PRETTIER_COMMAND}
--write
--plugin=prettier-plugin-organize-imports
${PRETTIER_CHANGED_SOURCES}
)
else()
set(PRETTIER_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for prettier")
endif()
register_command(
TARGET
prettier-diff

View File

@@ -25,25 +25,6 @@ register_command(
ALWAYS_RUN
)
if(GIT_CHANGED_SOURCES)
set(ZIG_FORMAT_CHANGED_SOURCES)
foreach(source ${ZIG_FORMAT_SOURCES})
list(FIND GIT_CHANGED_SOURCES ${source} index)
if(NOT ${index} EQUAL -1)
list(APPEND ZIG_FORMAT_CHANGED_SOURCES ${source})
endif()
endforeach()
endif()
if(ZIG_FORMAT_CHANGED_SOURCES)
set(ZIG_FORMAT_DIFF_COMMAND ${ZIG_EXECUTABLE}
fmt
${ZIG_FORMAT_CHANGED_SOURCES}
)
else()
set(ZIG_FORMAT_DIFF_COMMAND ${CMAKE_COMMAND} -E echo "No changed files for zig-format")
endif()
register_command(
TARGET
zig-format-diff

View File

@@ -317,6 +317,10 @@ set(BUN_CPP_OUTPUTS
${CODEGEN_PATH}/cpp.zig
)
set(BUN_CI_INFO_OUTPUTS
${CODEGEN_PATH}/ci_info.zig
)
register_command(
TARGET
bun-cppbind
@@ -334,6 +338,21 @@ register_command(
${BUN_CPP_OUTPUTS}
)
register_command(
TARGET
bun-ci-info
COMMENT
"Generating CI info"
COMMAND
${BUN_EXECUTABLE}
${CWD}/src/codegen/ci_info.ts
${CODEGEN_PATH}/ci_info.zig
SOURCES
${BUN_JAVASCRIPT_CODEGEN_SOURCES}
OUTPUTS
${BUN_CI_INFO_OUTPUTS}
)
register_command(
TARGET
bun-js-modules
@@ -612,6 +631,7 @@ set(BUN_ZIG_GENERATED_SOURCES
${BUN_ZIG_GENERATED_CLASSES_OUTPUTS}
${BUN_JAVASCRIPT_OUTPUTS}
${BUN_CPP_OUTPUTS}
${BUN_CI_INFO_OUTPUTS}
${BUN_BINDGENV2_ZIG_OUTPUTS}
)

View File

@@ -4,41 +4,9 @@ find_command(
COMMAND
git
REQUIRED
OFF
${CI}
)
if(NOT GIT_PROGRAM)
return()
endif()
set(GIT_DIFF_COMMAND ${GIT_PROGRAM} diff --no-color --name-only --diff-filter=AMCR origin/main HEAD)
execute_process(
COMMAND
${GIT_DIFF_COMMAND}
WORKING_DIRECTORY
${CWD}
OUTPUT_STRIP_TRAILING_WHITESPACE
OUTPUT_VARIABLE
GIT_DIFF
ERROR_STRIP_TRAILING_WHITESPACE
ERROR_VARIABLE
GIT_DIFF_ERROR
RESULT_VARIABLE
GIT_DIFF_RESULT
)
if(NOT GIT_DIFF_RESULT EQUAL 0)
message(WARNING "Command failed: ${GIT_DIFF_COMMAND} ${GIT_DIFF_ERROR}")
return()
endif()
string(REPLACE "\n" ";" GIT_CHANGED_SOURCES "${GIT_DIFF}")
if(CI)
set(GIT_CHANGED_SOURCES "${GIT_CHANGED_SOURCES}")
message(STATUS "Set GIT_CHANGED_SOURCES: ${GIT_CHANGED_SOURCES}")
endif()
list(TRANSFORM GIT_CHANGED_SOURCES PREPEND ${CWD}/)
list(LENGTH GIT_CHANGED_SOURCES GIT_CHANGED_SOURCES_COUNT)

View File

@@ -1,60 +1,108 @@
# Setup sccache as the C and C++ compiler launcher to speed up builds by caching
if(CACHE_STRATEGY STREQUAL "none")
return()
endif()
function(check_aws_credentials OUT_VAR)
set(HAS_CREDENTIALS FALSE)
set(SCCACHE_SHARED_CACHE_REGION "us-west-1")
set(SCCACHE_SHARED_CACHE_BUCKET "bun-build-sccache-store")
if(DEFINED ENV{AWS_ACCESS_KEY_ID} AND DEFINED ENV{AWS_SECRET_ACCESS_KEY})
set(HAS_CREDENTIALS TRUE)
message(NOTICE
"sccache: Using AWS credentials found in environment variables")
# Function to check if the system AWS credentials have access to the sccache S3 bucket.
function(check_aws_credentials OUT_VAR)
# Install dependencies first
execute_process(
COMMAND ${BUN_EXECUTABLE} install --frozen-lockfile
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/scripts/build-cache
RESULT_VARIABLE INSTALL_EXIT_CODE
OUTPUT_VARIABLE INSTALL_OUTPUT
ERROR_VARIABLE INSTALL_ERROR
)
if(NOT INSTALL_EXIT_CODE EQUAL 0)
message(FATAL_ERROR "Failed to install dependencies in scripts/build-cache\n"
"Exit code: ${INSTALL_EXIT_CODE}\n"
"Output: ${INSTALL_OUTPUT}\n"
"Error: ${INSTALL_ERROR}")
endif()
# Check for ~/.aws directory since sccache may use that.
if(NOT HAS_CREDENTIALS)
if(WIN32)
set(AWS_CONFIG_DIR "$ENV{USERPROFILE}/.aws")
else()
set(AWS_CONFIG_DIR "$ENV{HOME}/.aws")
endif()
# Check AWS credentials
execute_process(
COMMAND
${BUN_EXECUTABLE}
run
have-access.ts
--bucket ${SCCACHE_SHARED_CACHE_BUCKET}
--region ${SCCACHE_SHARED_CACHE_REGION}
WORKING_DIRECTORY
${CMAKE_SOURCE_DIR}/scripts/build-cache
RESULT_VARIABLE HAVE_ACCESS_EXIT_CODE
)
if(EXISTS "${AWS_CONFIG_DIR}/credentials")
set(HAS_CREDENTIALS TRUE)
message(NOTICE
"sccache: Using AWS credentials found in ${AWS_CONFIG_DIR}/credentials")
endif()
if(HAVE_ACCESS_EXIT_CODE EQUAL 0)
set(HAS_CREDENTIALS TRUE)
else()
set(HAS_CREDENTIALS FALSE)
endif()
set(${OUT_VAR} ${HAS_CREDENTIALS} PARENT_SCOPE)
endfunction()
function(check_running_in_ci OUT_VAR)
set(IS_CI FALSE)
# Query EC2 instance metadata service to check if running on buildkite-agent
# The IP address 169.254.169.254 is a well-known link-local address for querying EC2 instance
# metadata:
# https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-retrieval.html
execute_process(
COMMAND curl -s -m 0.5 http://169.254.169.254/latest/meta-data/tags/instance/Service
OUTPUT_VARIABLE METADATA_OUTPUT
ERROR_VARIABLE METADATA_ERROR
RESULT_VARIABLE METADATA_RESULT
OUTPUT_STRIP_TRAILING_WHITESPACE
ERROR_QUIET
)
# Check if the request succeeded and returned exactly "buildkite-agent"
if(METADATA_RESULT EQUAL 0 AND METADATA_OUTPUT STREQUAL "buildkite-agent")
set(IS_CI TRUE)
endif()
set(${OUT_VAR} ${IS_CI} PARENT_SCOPE)
# Configure sccache to use the local cache only.
function(sccache_configure_local_filesystem)
unsetenv(SCCACHE_BUCKET)
unsetenv(SCCACHE_REGION)
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
endfunction()
check_running_in_ci(IS_IN_CI)
find_command(VARIABLE SCCACHE_PROGRAM COMMAND sccache REQUIRED ${IS_IN_CI})
# Configure sccache to use the distributed cache (S3 + local).
function(sccache_configure_distributed)
setenv(SCCACHE_BUCKET "${SCCACHE_SHARED_CACHE_BUCKET}")
setenv(SCCACHE_REGION "${SCCACHE_SHARED_CACHE_REGION}")
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
endfunction()
function(sccache_configure_environment_ci)
if(CACHE_STRATEGY STREQUAL "auto" OR CACHE_STRATEGY STREQUAL "distributed")
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(HAS_AWS_CREDENTIALS)
sccache_configure_distributed()
message(NOTICE "sccache: Using distributed cache strategy.")
else()
message(FATAL_ERROR "CI CACHE_STRATEGY is set to '${CACHE_STRATEGY}', but no valid AWS "
"credentials were found. Note that 'auto' requires AWS credentials to access the shared "
"cache in CI.")
endif()
elseif(CACHE_STRATEGY STREQUAL "local")
# We disallow this because we want our CI runs to always use the shared cache to accelerate
# builds.
# none, distributed and auto are all okay.
#
# If local is configured, it's as good as "none", so this is probably user error.
message(FATAL_ERROR "CI CACHE_STRATEGY is set to 'local', which is not allowed.")
endif()
endfunction()
function(sccache_configure_environment_developer)
# Local environments can use any strategy they like. S3 is set up to clean out old entries
# automatically.
if (CACHE_STRATEGY STREQUAL "auto" OR CACHE_STRATEGY STREQUAL "local")
# In the local environment, we prioritize using the local cache. This is because sccache takes
# into consideration the whole absolute path of the files being compiled, and it's very
# unlikely users will have the same absolute paths on their local machines.
sccache_configure_local_filesystem()
message(NOTICE "sccache: Using local cache strategy.")
elseif(CACHE_STRATEGY STREQUAL "distributed")
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(HAS_AWS_CREDENTIALS)
sccache_configure_distributed()
message(NOTICE "sccache: Using distributed cache strategy.")
else()
message(FATAL_ERROR "CACHE_STRATEGY is set to 'distributed', but no valid AWS credentials "
"were found.")
endif()
endif()
endfunction()
find_command(VARIABLE SCCACHE_PROGRAM COMMAND sccache REQUIRED ${CI})
if(NOT SCCACHE_PROGRAM)
message(WARNING "sccache not found. Your builds will be slower.")
return()
@@ -66,25 +114,10 @@ foreach(arg ${SCCACHE_ARGS})
list(APPEND CMAKE_ARGS -D${arg}=${${arg}})
endforeach()
# Configure S3 bucket for distributed caching
setenv(SCCACHE_BUCKET "bun-build-sccache-store")
setenv(SCCACHE_REGION "us-west-1")
setenv(SCCACHE_DIR "${CACHE_PATH}/sccache")
# Handle credentials based on cache strategy
if (CACHE_STRATEGY STREQUAL "read-only")
setenv(SCCACHE_S3_NO_CREDENTIALS "1")
message(STATUS "sccache configured in read-only mode.")
else()
# Check for AWS credentials and enable anonymous access if needed
check_aws_credentials(HAS_AWS_CREDENTIALS)
if(NOT IS_IN_CI AND NOT HAS_AWS_CREDENTIALS)
setenv(SCCACHE_S3_NO_CREDENTIALS "1")
message(NOTICE "sccache: No AWS credentials found, enabling anonymous S3 "
"access. Writing to the cache will be disabled.")
endif()
endif()
setenv(SCCACHE_LOG "info")
message(STATUS "sccache configured for bun-build-sccache-store (us-west-1).")
if (CI)
sccache_configure_environment_ci()
else()
sccache_configure_environment_developer()
endif()

View File

@@ -20,7 +20,7 @@ else()
unsupported(CMAKE_SYSTEM_NAME)
endif()
set(ZIG_COMMIT "55fdbfa0c86be86b68d43a4ba761e6909eb0d7b2")
set(ZIG_COMMIT "c1423ff3fc7064635773a4a4616c5bf986eb00fe")
optionx(ZIG_TARGET STRING "The zig target to use" DEFAULT ${DEFAULT_ZIG_TARGET})
if(CMAKE_BUILD_TYPE STREQUAL "Release")

View File

@@ -34,7 +34,7 @@ By default, Bun's CSS bundler targets the following browsers:
The CSS Nesting specification allows you to write more concise and intuitive stylesheets by nesting selectors inside one another. Instead of repeating parent selectors across your CSS file, you can write child styles directly within their parent blocks.
```css title="styles.css" icon="file-code"
```scss title="styles.css" icon="file-code"
/* With nesting */
.card {
background: white;
@@ -100,7 +100,7 @@ This compiles to:
The `color-mix()` function gives you an easy way to blend two colors together according to a specified ratio in a chosen color space. This powerful feature lets you create color variations without manually calculating the resulting values.
```css title="styles.css" icon="file-code"
```scss title="styles.css" icon="file-code"
.button {
/* Mix blue and red in the RGB color space with a 30/70 proportion */
background-color: color-mix(in srgb, blue 30%, red);

View File

@@ -231,23 +231,67 @@ const myPlugin: BunPlugin = {
### onResolve
<Tabs>
<Tab title="options">- 🟢 `filter` - 🟢 `namespace`</Tab>
<Tab title="options">
- 🟢 `filter`
- 🟢 `namespace`
</Tab>
<Tab title="arguments">
- 🟢 `path` - 🟢 `importer` - 🔴 `namespace` - 🔴 `resolveDir` - 🔴 `kind` - 🔴 `pluginData`
- 🟢 `path`
- 🟢 `importer`
- 🔴 `namespace`
- 🔴 `resolveDir`
- 🔴 `kind`
- 🔴 `pluginData`
</Tab>
<Tab title="results">
- 🟢 `namespace` - 🟢 `path` - 🔴 `errors` - 🔴 `external` - 🔴 `pluginData` - 🔴 `pluginName` - 🔴 `sideEffects` -
🔴 `suffix` - 🔴 `warnings` - 🔴 `watchDirs` - 🔴 `watchFiles`
- 🟢 `namespace`
- 🟢 `path`
- 🔴 `errors`
- 🔴 `external`
- 🔴 `pluginData`
- 🔴 `pluginName`
- 🔴 `sideEffects`
- 🔴 `suffix`
- 🔴 `warnings`
- 🔴 `watchDirs`
- 🔴 `watchFiles`
</Tab>
</Tabs>
### onLoad
<Tabs>
<Tab title="options">- 🟢 `filter` - 🟢 `namespace`</Tab>
<Tab title="arguments">- 🟢 `path` - 🔴 `namespace` - 🔴 `suffix` - 🔴 `pluginData`</Tab>
<Tab title="options">
- 🟢 `filter`
- 🟢 `namespace`
</Tab>
<Tab title="arguments">
- 🟢 `path`
- 🔴 `namespace`
- 🔴 `suffix`
- 🔴 `pluginData`
</Tab>
<Tab title="results">
- 🟢 `contents` - 🟢 `loader` - 🔴 `errors` - 🔴 `pluginData` - 🔴 `pluginName` - 🔴 `resolveDir` - 🔴 `warnings` -
🔴 `watchDirs` - 🔴 `watchFiles`
- 🟢 `contents`
- 🟢 `loader`
- 🔴 `errors`
- 🔴 `pluginData`
- 🔴 `pluginName`
- 🔴 `resolveDir`
- 🔴 `warnings`
- 🔴 `watchDirs`
- 🔴 `watchFiles`
</Tab>
</Tabs>
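
As a quick illustration of the tables above, here is a minimal plugin sketch that sticks to the supported (🟢) options, arguments, and result fields; the virtual `env` module name is only an example, not part of Bun's API.

```ts
import type { BunPlugin } from "bun";

const envPlugin: BunPlugin = {
  name: "env-plugin",
  setup(build) {
    // onResolve: only the `filter`/`namespace` options and `path`/`importer`
    // arguments are relied on; only `path`/`namespace` are returned.
    build.onResolve({ filter: /^env$/ }, args => {
      return { path: args.path, namespace: "env-ns" };
    });

    // onLoad: only the `filter`/`namespace` options and `path` argument are
    // used; only `contents`/`loader` are returned.
    build.onLoad({ filter: /.*/, namespace: "env-ns" }, () => {
      return {
        contents: `export default ${JSON.stringify(process.env)}`,
        loader: "js",
      };
    });
  },
};
```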

View File

@@ -90,7 +90,7 @@ The order of the `--target` flag does not matter, as long as they're delimited b
| bun-linux-x64 | Linux | x64 | ✅ | ✅ | glibc |
| bun-linux-arm64 | Linux | arm64 | ✅ | N/A | glibc |
| bun-windows-x64 | Windows | x64 | ✅ | ✅ | - |
| ~~bun-windows-arm64~~ | Windows | arm64 | ❌ | ❌ | - |
| ~~bun-windows-arm64~~ | ~~Windows~~ | ~~arm64~~ | ❌ | ❌ | - |
| bun-darwin-x64 | macOS | x64 | ✅ | ✅ | - |
| bun-darwin-arm64 | macOS | arm64 | ✅ | N/A | - |
| bun-linux-x64-musl | Linux | x64 | ✅ | ✅ | musl |
@@ -154,8 +154,8 @@ Using bytecode compilation, `tsc` starts 2x faster:
Bytecode compilation moves parsing overhead for large input files from runtime to bundle time. Your app starts faster, in exchange for making the `bun build` command a little slower. It doesn't obscure source code.
<Warning>
**Experimental:** Bytecode compilation is an experimental feature introduced in Bun v1.1.30. Only `cjs` format is
supported (which means no top-level-await). Let us know if you run into any issues!
**Experimental:** Bytecode compilation is an experimental feature. Only `cjs` format is supported (which means no
top-level-await). Let us know if you run into any issues!
</Warning>
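
For reference, a minimal sketch of enabling bytecode through the `Bun.build` JS API rather than the CLI flag, assuming the same constraints noted above (bundle for `bun` with the `cjs` format):

```ts
await Bun.build({
  entrypoints: ["./index.ts"],
  outdir: "./dist",
  target: "bun",
  format: "cjs", // bytecode currently requires cjs (no top-level await)
  bytecode: true,
});
```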
### What do these flags do?
@@ -216,7 +216,7 @@ However, with the `BUN_BE_BUN=1` environment variable, it acts just like the `bu
```bash icon="terminal" terminal
# With the env var, the executable acts like the `bun` CLI
bun_BE_BUN=1 ./such-bun install
BUN_BE_BUN=1 ./such-bun install
```
```txt
@@ -320,7 +320,7 @@ new Worker(new URL("./my-worker.ts", import.meta.url));
new Worker(new URL("./my-worker.ts", import.meta.url).href);
```
As of Bun v1.1.25, when you add multiple entrypoints to a standalone executable, they will be bundled separately into the executable.
When you add multiple entrypoints to a standalone executable, they will be bundled separately into the executable.
In the future, we may automatically detect usages of statically-known paths in `new Worker(path)` and then bundle those into the executable, but for now, you'll need to add it to the shell command manually like the above example.
@@ -395,7 +395,7 @@ This database is read-write, but all changes are lost when the executable exits
### Embed N-API Addons
As of Bun v1.0.23, you can embed `.node` files into executables.
You can embed `.node` files into executables.
```ts index.ts icon="/icons/typescript.svg"
const addon = require("./addon.node");
@@ -524,12 +524,46 @@ codesign -vvv --verify ./myapp
---
## Code splitting
Standalone executables support code splitting. Use `--compile` with `--splitting` to create an executable that loads code-split chunks at runtime.
```bash
bun build --compile --splitting ./src/entry.ts --outdir ./build
```
<CodeGroup>
```ts src/entry.ts icon="/icons/typescript.svg"
console.log("Entrypoint loaded");
const lazy = await import("./lazy.ts");
lazy.hello();
```
```ts src/lazy.ts icon="/icons/typescript.svg"
export function hello() {
console.log("Lazy module loaded");
}
```
</CodeGroup>
```bash terminal icon="terminal"
./build/entry
```
```txt
Entrypoint loaded
Lazy module loaded
```
---
## Unsupported CLI arguments
Currently, the `--compile` flag can only accept a single entrypoint at a time and does not support the following flags:
- `--outdir` — use `outfile` instead.
- `--splitting`
- `--outdir` — use `outfile` instead (except when using with `--splitting`).
- `--public-path`
- `--target=node` or `--target=browser`
- `--no-bundle` - we always bundle everything into the executable.

View File

@@ -25,7 +25,7 @@ bun ./index.html
```
```
Bun v1.2.20
Bun v1.3.2
ready in 6.62ms
→ http://localhost:3000/
Press h + Enter to show shortcuts
@@ -51,7 +51,7 @@ bun index.html
```
```
Bun v1.2.20
Bun v1.3.2
ready in 6.62ms
→ http://localhost:3000/
Press h + Enter to show shortcuts
@@ -81,7 +81,7 @@ bun ./index.html ./about.html
```
```txt
Bun v1.2.20
Bun v1.3.2
ready in 6.62ms
→ http://localhost:3000/
Routes:
@@ -104,7 +104,7 @@ bun ./**/*.html
```
```
Bun v1.2.20
Bun v1.3.2
ready in 6.62ms
→ http://localhost:3000/
Routes:
@@ -122,7 +122,7 @@ bun ./index.html ./about/index.html ./about/foo/index.html
```
```
Bun v1.2.20
Bun v1.3.2
ready in 6.62ms
→ http://localhost:3000/
Routes:
@@ -237,13 +237,27 @@ Then, reference TailwindCSS in your HTML via `<link>` tag, `@import` in CSS, or
<Tabs>
<Tab title="index.html">
```html title="index.html" icon="file-code"
{/* Reference TailwindCSS in your HTML */}
<!-- Reference TailwindCSS in your HTML -->
<link rel="stylesheet" href="tailwindcss" />
```
</Tab>
<Tab title="styles.css">
```css title="styles.css" icon="file-code"
@import "tailwindcss";
```
</Tab>
<Tab title="app.ts">
```ts title="app.ts" icon="/icons/typescript.svg"
import "tailwindcss";
```
</Tab>
<Tab title="styles.css">```css title="styles.css" icon="file-code" @import "tailwindcss"; ```</Tab>
<Tab title="app.ts">```ts title="app.ts" icon="/icons/typescript.svg" import "tailwindcss"; ```</Tab>
</Tabs>
<Info>Only one of those is necessary, not all three.</Info>
@@ -259,7 +273,7 @@ bun ./index.html --console
```
```
Bun v1.2.20
Bun v1.3.2
ready in 6.62ms
→ http://localhost:3000/
Press h + Enter to show shortcuts

View File

@@ -160,8 +160,12 @@ Like the Bun runtime, the bundler supports an array of file types out of the box
| ----------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `.js` `.jsx` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` | Uses Bun's built-in transpiler to parse the file and transpile TypeScript/JSX syntax to vanilla JavaScript. The bundler executes a set of default transforms including dead code elimination and tree shaking. At the moment Bun does not attempt to down-convert syntax; if you use recent ECMAScript syntax, that will be reflected in the bundled code. |
| `.json` | JSON files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import pkg from "./package.json";<br/>pkg.name; // => "my-package"<br/>` |
| `.jsonc` | JSON with comments. Files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import config from "./config.jsonc";<br/>config.name; // => "my-config"<br/>` |
| `.toml` | TOML files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import config from "./bunfig.toml";<br/>config.logLevel; // => "debug"<br/>` |
| `.yaml` `.yml` | YAML files are parsed and inlined into the bundle as a JavaScript object.<br/><br/>`js<br/>import config from "./config.yaml";<br/>config.name; // => "my-app"<br/>` |
| `.txt` | The contents of the text file are read and inlined into the bundle as a string.<br/><br/>`js<br/>import contents from "./file.txt";<br/>console.log(contents); // => "Hello, world!"<br/>` |
| `.html` | HTML files are processed and any referenced assets (scripts, stylesheets, images) are bundled. |
| `.css` | CSS files are bundled together into a single `.css` file in the output directory. |
| `.node` `.wasm` | These files are supported by the Bun runtime, but during bundling they are treated as assets. |
### Assets
@@ -1372,7 +1376,7 @@ interface BuildConfig {
publicPath?: string;
define?: Record<string, string>;
loader?: { [k in string]: Loader };
sourcemap?: "none" | "linked" | "inline" | "external" | "linked" | boolean; // default: "none", true -> "inline"
sourcemap?: "none" | "linked" | "inline" | "external" | boolean; // default: "none", true -> "inline"
/**
* package.json `exports` conditions used when resolving imports
*
@@ -1462,7 +1466,21 @@ interface BuildArtifact extends Blob {
sourcemap: BuildArtifact | null;
}
type Loader = "js" | "jsx" | "ts" | "tsx" | "json" | "toml" | "file" | "napi" | "wasm" | "text";
type Loader =
| "js"
| "jsx"
| "ts"
| "tsx"
| "css"
| "json"
| "jsonc"
| "toml"
| "yaml"
| "text"
| "file"
| "napi"
| "wasm"
| "html";
interface BuildOutput {
outputs: BuildArtifact[];

View File

@@ -7,14 +7,16 @@ The Bun bundler implements a set of default loaders out of the box.
> As a rule of thumb: **the bundler and the runtime both support the same set of file types out of the box.**
`.js` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` `.jsx` `.toml` `.json` `.txt` `.wasm` `.node` `.html`
`.js` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` `.jsx` `.css` `.json` `.jsonc` `.toml` `.yaml` `.yml` `.txt` `.wasm` `.node` `.html` `.sh`
Bun uses the file extension to determine which built-in loader should be used to parse the file. Every loader has a name, such as `js`, `tsx`, or `json`. These names are used when building plugins that extend Bun with custom loaders.
You can explicitly specify which loader to use using the `'loader'` import attribute.
You can explicitly specify which loader to use with the `'type'` import attribute.
```ts title="index.ts" icon="/icons/typescript.svg"
import my_toml from "./my_file" with { loader: "toml" };
import my_toml from "./my_file" with { type: "toml" };
// or with dynamic imports
const { default: my_toml } = await import("./my_file", { with: { type: "toml" } });
```
## Built-in loaders
@@ -85,7 +87,7 @@ If a `.json` file is passed as an entrypoint to the bundler, it will be converte
}
```
```ts Output
```js Output
export default {
name: "John Doe",
age: 35,
@@ -97,7 +99,32 @@ export default {
---
### toml
### `jsonc`
**JSON with Comments loader.** Default for `.jsonc`.
JSONC (JSON with Comments) files can be directly imported. Bun will parse them, stripping out comments and trailing commas.
```js
import config from "./config.jsonc";
console.log(config);
```
During bundling, the parsed JSONC is inlined into the bundle as a JavaScript object, identical to the `json` loader.
```js
var config = {
option: "value",
};
```
<Note>
Bun automatically uses the `jsonc` loader for `tsconfig.json`, `jsconfig.json`, `package.json`, and `bun.lock` files.
</Note>
---
### `toml`
**TOML loader.** Default for `.toml`.
@@ -131,7 +158,7 @@ age = 35
email = "johndoe@example.com"
```
```ts Output
```js Output
export default {
name: "John Doe",
age: 35,
@@ -143,7 +170,53 @@ export default {
---
### text
### `yaml`
**YAML loader.** Default for `.yaml` and `.yml`.
YAML files can be directly imported. Bun will parse them with its fast native YAML parser.
```js
import config from "./config.yaml";
console.log(config);
// via import attribute:
import data from "./data.txt" with { type: "yaml" };
```
During bundling, the parsed YAML is inlined into the bundle as a JavaScript object.
```js
var config = {
name: "my-app",
version: "1.0.0",
// ...other fields
};
```
If a `.yaml` or `.yml` file is passed as an entrypoint, it will be converted to a `.js` module that `export default`s the parsed object.
<CodeGroup>
```yaml Input
name: John Doe
age: 35
email: johndoe@example.com
```
```js Output
export default {
name: "John Doe",
age: 35,
email: "johndoe@example.com",
};
```
</CodeGroup>
---
### `text`
**Text loader.** Default for `.txt`.
@@ -173,7 +246,7 @@ If a `.txt` file is passed as an entrypoint, it will be converted to a `.js` mod
Hello, world!
```
```ts Output
```js Output
export default "Hello, world!";
```
@@ -181,7 +254,7 @@ export default "Hello, world!";
---
### napi
### `napi`
**Native addon loader.** Default for `.node`.
@@ -196,7 +269,7 @@ console.log(addon);
---
### sqlite
### `sqlite`
**SQLite loader.** Requires `with { "type": "sqlite" }` import attribute.
@@ -226,7 +299,9 @@ Otherwise, the database to embed is copied into the `outdir` with a hashed filen
---
### html
### `html`
**HTML loader.** Default for `.html`.
The `html` loader processes HTML files and bundles any referenced assets. It will:
@@ -301,7 +376,27 @@ The `html` loader behaves differently depending on how it's used:
---
### sh
### `css`
**CSS loader.** Default for `.css`.
CSS files can be directly imported. The bundler will parse and bundle CSS files, handling `@import` statements and `url()` references.
```js
import "./styles.css";
```
During bundling, all imported CSS files are bundled together into a single `.css` file in the output directory.
```css
.my-class {
background: url("./image.png");
}
```
---
### `sh`
**Bun Shell loader.** Default for `.sh` files.
@@ -313,7 +408,7 @@ bun run ./script.sh
---
### file
### `file`
**File loader.** Default for all unrecognized file types.

File diff suppressed because it is too large

View File

@@ -42,7 +42,21 @@ type PluginBuilder = {
config: BuildConfig;
};
type Loader = "js" | "jsx" | "ts" | "tsx" | "css" | "json" | "toml";
type Loader =
| "js"
| "jsx"
| "ts"
| "tsx"
| "json"
| "jsonc"
| "toml"
| "yaml"
| "file"
| "napi"
| "wasm"
| "text"
| "css"
| "html";
```
## Usage

View File

@@ -188,7 +188,7 @@
{
"group": "Publishing & Analysis",
"icon": "upload",
"pages": ["/pm/cli/publish", "/pm/cli/outdated", "/pm/cli/why", "/pm/cli/audit"]
"pages": ["/pm/cli/publish", "/pm/cli/outdated", "/pm/cli/why", "/pm/cli/audit", "/pm/cli/info"]
},
{
"group": "Workspace Management",
@@ -298,7 +298,14 @@
{
"group": "Deployment",
"icon": "rocket",
"pages": ["/guides/deployment/vercel", "/guides/deployment/railway", "/guides/deployment/render"]
"pages": [
"/guides/deployment/vercel",
"/guides/deployment/railway",
"/guides/deployment/render",
"/guides/deployment/aws-lambda",
"/guides/deployment/digital-ocean",
"/guides/deployment/google-cloud-run"
]
},
{
"group": "Runtime & Debugging",
@@ -347,7 +354,7 @@
"/guides/ecosystem/discordjs",
"/guides/ecosystem/docker",
"/guides/ecosystem/drizzle",
"/guides/ecosystem/edgedb",
"/guides/ecosystem/gel",
"/guides/ecosystem/elysia",
"/guides/ecosystem/express",
"/guides/ecosystem/hono",
@@ -362,13 +369,15 @@
"/guides/ecosystem/qwik",
"/guides/ecosystem/react",
"/guides/ecosystem/remix",
"/guides/ecosystem/tanstack-start",
"/guides/ecosystem/sentry",
"/guides/ecosystem/solidstart",
"/guides/ecosystem/ssr-react",
"/guides/ecosystem/stric",
"/guides/ecosystem/sveltekit",
"/guides/ecosystem/systemd",
"/guides/ecosystem/vite"
"/guides/ecosystem/vite",
"/guides/ecosystem/upstash"
]
},
{
@@ -447,6 +456,7 @@
"pages": [
"/guides/test/run-tests",
"/guides/test/watch-mode",
"/guides/test/concurrent-test-glob",
"/guides/test/migrate-from-jest",
"/guides/test/mock-functions",
"/guides/test/spy-on",

View File

@@ -0,0 +1,204 @@
---
title: Deploy a Bun application on AWS Lambda
sidebarTitle: Deploy on AWS Lambda
mode: center
---
[AWS Lambda](https://aws.amazon.com/lambda/) is a serverless compute service that lets you run code without provisioning or managing servers.
In this guide, we will deploy a Bun HTTP server to AWS Lambda using a `Dockerfile`.
<Note>
Before continuing, make sure you have:
- A Bun application ready for deployment
- An [AWS account](https://aws.amazon.com/)
- [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html) installed and configured
- [Docker](https://docs.docker.com/get-started/get-docker/) installed and added to your `PATH`
</Note>
---
<Steps>
<Step title="Create a new Dockerfile">
Make sure you're in the directory containing your project, then create a new `Dockerfile` in the root of your project. This file contains the instructions to initialize the container, copy your local project files into it, install dependencies, and start the application.
```docker Dockerfile icon="docker"
# Use the official AWS Lambda adapter image to handle the Lambda runtime
FROM public.ecr.aws/awsguru/aws-lambda-adapter:0.9.0 AS aws-lambda-adapter
# Use the official Bun image to run the application
FROM oven/bun:debian AS bun_latest
# Copy the Lambda adapter into the container
COPY --from=aws-lambda-adapter /lambda-adapter /opt/extensions/lambda-adapter
# Set the port to 8080. This is required for the AWS Lambda adapter.
ENV PORT=8080
# Set the work directory to `/var/task`. This is the default work directory for Lambda.
WORKDIR "/var/task"
# Copy the package.json and bun.lock into the container
COPY package.json bun.lock ./
# Install the dependencies
RUN bun install --production --frozen-lockfile
# Copy the rest of the application into the container
COPY . /var/task
# Run the application.
CMD ["bun", "index.ts"]
```
<Note>
Make sure that the start command corresponds to your application's entry point. This can also be `CMD ["bun", "run", "start"]` if you have a start script in your `package.json`.
This image installs dependencies and runs your app with Bun inside a container. If your app doesn't have dependencies, you can omit the `RUN bun install --production --frozen-lockfile` line.
</Note>
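If you don't already have an entry point, a minimal `index.ts` sketch that works with this Dockerfile might look like the following; it listens on the `PORT` the adapter expects, and the greeting matches the test request at the end of this guide.

```ts index.ts icon="/icons/typescript.svg"
// Minimal HTTP server; the Lambda adapter forwards requests to this port
Bun.serve({
  port: Number(process.env.PORT ?? 8080),
  fetch() {
    return new Response("Hello from Bun on Lambda!");
  },
});
```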
Create a new `.dockerignore` file in the root of your project. This file contains the files and directories that should be _excluded_ from the container image, such as `node_modules`. This makes your builds faster and smaller:
```docker .dockerignore icon="Docker"
node_modules
Dockerfile*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
.env
# Any other files or directories you want to exclude
```
</Step>
<Step title="Build the Docker image">
Make sure you're in the directory containing your `Dockerfile`, then build the Docker image. In this case, we'll call the image `bun-lambda-demo` and tag it as `latest`.
```bash terminal icon="terminal"
# cd /path/to/your/app
docker build --provenance=false --platform linux/amd64 -t bun-lambda-demo:latest .
```
</Step>
<Step title="Create an ECR repository">
To push the image to AWS Lambda, we first need to create an [ECR repository](https://aws.amazon.com/ecr/) to push the image to.
By running the following command, we:
- Create an ECR repository named `bun-lambda-demo` in the `us-east-1` region
- Get the repository URI and export it as an environment variable. This is optional, but makes the next steps easier.
```bash terminal icon="terminal"
export ECR_URI=$(aws ecr create-repository --repository-name bun-lambda-demo --region us-east-1 --query 'repository.repositoryUri' --output text)
echo $ECR_URI
```
```txt
[id].dkr.ecr.us-east-1.amazonaws.com/bun-lambda-demo
```
<Note>
If you're using IAM Identity Center (SSO) or have configured AWS CLI with profiles, you'll need to add the `--profile` flag to your AWS CLI commands.
For example, if your profile is named `my-sso-app`, use `--profile my-sso-app`. Check your AWS CLI configuration with `aws configure list-profiles` to see available profiles.
```bash terminal icon="terminal"
export ECR_URI=$(aws ecr create-repository --repository-name bun-lambda-demo --region us-east-1 --profile my-sso-app --query 'repository.repositoryUri' --output text)
echo $ECR_URI
```
</Note>
</Step>
<Step title="Authenticate with the ECR repository">
Log in to the ECR repository:
```bash terminal icon="terminal"
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin $ECR_URI
```
```txt
Login Succeeded
```
<Note>
If using a profile, use the `--profile` flag:
```bash terminal icon="terminal"
aws ecr get-login-password --region us-east-1 --profile my-sso-app | docker login --username AWS --password-stdin $ECR_URI
```
</Note>
</Step>
<Step title="Tag and push the docker image to the ECR repository">
Make sure you're in the directory containing your `Dockerfile`, then tag the docker image with the ECR repository URI.
```bash terminal icon="terminal"
docker tag bun-lambda-demo:latest ${ECR_URI}:latest
```
Then, push the image to the ECR repository.
```bash terminal icon="terminal"
docker push ${ECR_URI}:latest
```
</Step>
<Step title="Create an AWS Lambda function">
Go to **AWS Console** > **Lambda** > [**Create Function**](https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1#/create/function?intent=authorFromImage) > Select **Container image**
<Warning>Make sure you've selected the right region; this URL defaults to `us-east-1`.</Warning>
<Frame>
![Create Function](/images/guides/lambda1.png)
</Frame>
Give the function a name, like `my-bun-function`.
</Step>
<Step title="Select the container image">
Then, go to the **Container image URI** section and click **Browse images**. Select the image we just pushed to the ECR repository.
<Frame>
![Select Container Repository](/images/guides/lambda2.png)
</Frame>
Then, select the `latest` image, and click on **Select image**.
<Frame>
![Select Container Image](/images/guides/lambda3.png)
</Frame>
</Step>
<Step title="Configure the function">
To get a public URL for the function, we need to go to **Additional configurations** > **Networking** > **Function URL**.
Set this to **Enable**, with Auth Type **NONE**.
<Frame>
![Set the Function URL](/images/guides/lambda4.png)
</Frame>
</Step>
<Step title="Create the function">
Click **Create function** at the bottom of the page to create the function.
<Frame>
![Create Function](/images/guides/lambda6.png)
</Frame>
</Step>
<Step title="Get the function URL">
Once the function has been created, you'll be redirected to the function's page, where you can see the function URL in the **"Function URL"** section.
<Frame>
![Function URL](/images/guides/lambda5.png)
</Frame>
</Step>
<Step title="Test the function">
🥳 Your app is now live! To test the function, you can either go to the **Test** tab, or call the function URL directly.
```bash terminal icon="terminal"
curl -X GET https://[your-function-id].lambda-url.us-east-1.on.aws/
```
```txt
Hello from Bun on Lambda!
```
</Step>
</Steps>

View File

@@ -0,0 +1,161 @@
---
title: Deploy a Bun application on DigitalOcean
sidebarTitle: Deploy on DigitalOcean
mode: center
---
[DigitalOcean](https://www.digitalocean.com/) is a cloud platform that provides a range of services for building and deploying applications.
In this guide, we will deploy a Bun HTTP server to DigitalOcean using a `Dockerfile`.
<Note>
Before continuing, make sure you have:
- A Bun application ready for deployment
- A [DigitalOcean account](https://www.digitalocean.com/)
- [DigitalOcean CLI](https://docs.digitalocean.com/reference/doctl/how-to/install/#step-1-install-doctl) installed and configured
- [Docker](https://docs.docker.com/get-started/get-docker/) installed and added to your `PATH`
</Note>
---
<Steps>
<Step title="Create a new DigitalOcean Container Registry">
Create a new Container Registry to store the Docker image.
<Tabs>
<Tab title="Through the DigitalOcean dashboard">
In the DigitalOcean dashboard, go to [**Container Registry**](https://cloud.digitalocean.com/registry), and enter the details for the new registry.
<Frame>
![DigitalOcean registry dashboard](/images/guides/digitalocean-7.png)
</Frame>
Make sure the details are correct, then click **Create Registry**.
</Tab>
<Tab title="Through the DigitalOcean CLI">
```bash terminal icon="terminal"
doctl registry create bun-digitalocean-demo
```
```txt
Name Endpoint Region slug
bun-digitalocean-demo registry.digitalocean.com/bun-digitalocean-demo sfo2
```
</Tab>
</Tabs>
You should see the new registry in the [**DigitalOcean registry dashboard**](https://cloud.digitalocean.com/registry):
<Frame>
![DigitalOcean registry dashboard](/images/guides/digitalocean-1.png)
</Frame>
</Step>
<Step title="Create a new Dockerfile">
Make sure you're in the directory containing your project, then create a new `Dockerfile` in the root of your project. This file contains the instructions to initialize the container, copy your local project files into it, install dependencies, and start the application.
```docker Dockerfile icon="docker"
# Use the official Bun image to run the application
FROM oven/bun:debian
# Set the work directory to `/app`
WORKDIR /app
# Copy the package.json and bun.lock into the container
COPY package.json bun.lock ./
# Install the dependencies
RUN bun install --production --frozen-lockfile
# Copy the rest of the application into the container
COPY . .
# Expose the port (DigitalOcean will set PORT env var)
EXPOSE 8080
# Run the application
CMD ["bun", "index.ts"]
```
<Note>
Make sure that the start command corresponds to your application's entry point. This can also be `CMD ["bun", "run", "start"]` if you have a start script in your `package.json`.
This image installs dependencies and runs your app with Bun inside a container. If your app doesn't have dependencies, you can omit the `RUN bun install --production --frozen-lockfile` line.
</Note>
Create a new `.dockerignore` file in the root of your project. This file contains the files and directories that should be _excluded_ from the container image, such as `node_modules`. This makes your builds faster and smaller:
```docker .dockerignore icon="Docker"
node_modules
Dockerfile*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
.env
# Any other files or directories you want to exclude
```
</Step>
<Step title="Authenticate Docker with DigitalOcean registry">
Before building and pushing the Docker image, authenticate Docker with the DigitalOcean Container Registry:
```bash terminal icon="terminal"
doctl registry login
```
```txt
Successfully authenticated with registry.digitalocean.com
```
<Note>
This command authenticates Docker with DigitalOcean's registry using your DigitalOcean credentials. Without this step, the build and push command will fail with a 401 authentication error.
</Note>
</Step>
<Step title="Build and push the Docker image to the DigitalOcean registry">
Make sure you're in the directory containing your `Dockerfile`, then build and push the Docker image to the DigitalOcean registry in one command:
```bash terminal icon="terminal"
docker buildx build --platform=linux/amd64 -t registry.digitalocean.com/bun-digitalocean-demo/bun-digitalocean-demo:latest --push .
```
<Note>
If you're building on an ARM Mac (M1/M2), you must use `docker buildx` with `--platform=linux/amd64` to ensure compatibility with DigitalOcean's infrastructure. Using `docker build` without the platform flag will create an ARM64 image that won't run on DigitalOcean.
</Note>
Once the image is pushed, you should see it in the [**DigitalOcean registry dashboard**](https://cloud.digitalocean.com/registry):
<Frame>
![DigitalOcean registry dashboard](/images/guides/digitalocean-2.png)
</Frame>
</Step>
<Step title="Create a new DigitalOcean App Platform project">
In the DigitalOcean dashboard, go to [**App Platform**](https://cloud.digitalocean.com/apps) > **Create App**. We can create a project directly from the container image.
<Frame>
![DigitalOcean App Platform project dashboard](/images/guides/digitalocean-3.png)
</Frame>
Make sure the details are correct, then click **Next**.
<Frame>
![DigitalOcean App Platform service dashboard](/images/guides/digitalocean-4.png)
</Frame>
Review and configure resource settings, then click **Create app**.
<Frame>
![DigitalOcean App Platform service dashboard](/images/guides/digitalocean-6.png)
</Frame>
</Step>
<Step title="Visit your live application">
🥳 Your app is now live! Once the app is created, you should see it in the App Platform dashboard with the public URL.
<Frame>
![DigitalOcean App Platform app dashboard](/images/guides/digitalocean-5.png)
</Frame>
</Step>
</Steps>

View File

@@ -0,0 +1,194 @@
---
title: Deploy a Bun application on Google Cloud Run
sidebarTitle: Deploy on Google Cloud Run
mode: center
---
[Google Cloud Run](https://cloud.google.com/run) is a managed platform for deploying and scaling serverless applications. Google handles the infrastructure for you.
In this guide, we will deploy a Bun HTTP server to Google Cloud Run using a `Dockerfile`.
<Note>
Before continuing, make sure you have:
- A Bun application ready for deployment
- A [Google Cloud account](https://cloud.google.com/) with billing enabled
- [Google Cloud CLI](https://cloud.google.com/sdk/docs/install) installed and configured
</Note>
---
<Steps>
<Step title={<span>Initialize <code>gcloud</code> by selecting or creating a project</span>}>
Make sure that you've initialized the Google Cloud CLI. This command logs you in, and prompts you to either select an existing project or create a new one.
For more help with the Google Cloud CLI, see the [official documentation](https://docs.cloud.google.com/sdk/gcloud/reference/init).
```bash terminal icon="terminal"
gcloud init
```
```txt
Welcome! This command will take you through the configuration of gcloud.
You must sign in to continue. Would you like to sign in (Y/n)? Y
You are signed in as [email@example.com].
Pick cloud project to use:
[1] existing-bun-app-1234
[2] Enter a project ID
[3] Create a new project
Please enter numeric choice or text value (must exactly match list item): 3
Enter a Project ID. my-bun-app
Your current project has been set to: [my-bun-app]
The Google Cloud CLI is configured and ready to use!
```
</Step>
<Step title="(Optional) Store your project info in environment variables">
Set variables for your project ID and number so they're easier to reuse in the following steps.
```bash terminal icon="terminal"
PROJECT_ID=$(gcloud projects list --format='value(projectId)' --filter='name="my bun app"')
PROJECT_NUMBER=$(gcloud projects list --format='value(projectNumber)' --filter='name="my bun app"')
echo $PROJECT_ID $PROJECT_NUMBER
```
```txt
my-bun-app-... [PROJECT_NUMBER]
```
</Step>
<Step title="Link a billing account">
List your available billing accounts and link one to your project:
```bash terminal icon="terminal"
gcloud billing accounts list
```
```txt
ACCOUNT_ID NAME OPEN MASTER_ACCOUNT_ID
[BILLING_ACCOUNT_ID] My Billing Account True
```
Link your billing account to your project. Replace `[BILLING_ACCOUNT_ID]` with the ID of your billing account.
```bash terminal icon="terminal"
gcloud billing projects link $PROJECT_ID --billing-account=[BILLING_ACCOUNT_ID]
```
```txt
billingAccountName: billingAccounts/[BILLING_ACCOUNT_ID]
billingEnabled: true
name: projects/my-bun-app-.../billingInfo
projectId: my-bun-app-...
```
</Step>
<Step title="Enable APIs and configure IAM roles">
Activate the necessary services and grant Cloud Build permissions:
```bash terminal icon="terminal"
gcloud services enable run.googleapis.com cloudbuild.googleapis.com
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=serviceAccount:$PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--role=roles/run.builder
```
<Note>
These commands enable Cloud Run (`run.googleapis.com`) and Cloud Build (`cloudbuild.googleapis.com`), which are required for deploying from source. Cloud Run runs your containerized app, while Cloud Build handles building and packaging it.
The IAM binding grants the Compute Engine service account (`$PROJECT_NUMBER-compute@developer.gserviceaccount.com`) permission to build and deploy images on your behalf.
</Note>
</Step>
<Step title="Add a Dockerfile">
Create a new `Dockerfile` in the root of your project. This file contains the instructions to initialize the container, copy your local project files into it, install dependencies, and start the application.
```docker Dockerfile icon="docker"
# Use the official Bun image to run the application
FROM oven/bun:latest
# Copy the package.json and bun.lock into the container
COPY package.json bun.lock ./
# Install the dependencies
RUN bun install --production --frozen-lockfile
# Copy the rest of the application into the container
COPY . .
# Run the application
CMD ["bun", "index.ts"]
```
<Note>
Make sure that the start command corresponds to your application's entry point. This can also be `CMD ["bun", "run", "start"]` if you have a start script in your `package.json`.
This image installs dependencies and runs your app with Bun inside a container. If your app doesn't have dependencies, you can omit the `RUN bun install --production --frozen-lockfile` line.
</Note>
Create a new `.dockerignore` file in the root of your project. This file contains the files and directories that should be _excluded_ from the container image, such as `node_modules`. This makes your builds faster and smaller:
```docker .dockerignore icon="Docker"
node_modules
Dockerfile*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
.env
# Any other files or directories you want to exclude
```
</Step>
<Step title="Deploy your service">
Make sure you're in the directory containing your `Dockerfile`, then deploy directly from your local source:
<Note>
Update the `--region` flag to your preferred region. You can also omit this flag to get an interactive prompt to
select a region.
</Note>
```bash terminal icon="terminal"
gcloud run deploy my-bun-app --source . --region=us-west1 --allow-unauthenticated
```
```txt
Deploying from source requires an Artifact Registry Docker repository to store built containers. A repository named
[cloud-run-source-deploy] in region [us-west1] will be created.
Do you want to continue (Y/n)? Y
Building using Dockerfile and deploying container to Cloud Run service [my-bun-app] in project [my-bun-app-...] region [us-west1]
✓ Building and deploying... Done.
✓ Validating Service...
✓ Uploading sources...
✓ Building Container... Logs are available at [https://console.cloud.google.com/cloud-build/builds...].
✓ Creating Revision...
✓ Routing traffic...
✓ Setting IAM Policy...
Done.
Service [my-bun-app] revision [my-bun-app-...] has been deployed and is serving 100 percent of traffic.
Service URL: https://my-bun-app-....us-west1.run.app
```
</Step>
<Step title="Visit your live application">
🎉 Your Bun application is now live!
Visit the Service URL (`https://my-bun-app-....us-west1.run.app`) to confirm everything works as expected.
</Step>
</Steps>

View File

@@ -32,7 +32,7 @@ import { ProductCard } from "/snippets/product-card.mdx";
Vercel automatically detects this configuration and runs your application on Bun. The value has to be `"1.x"`; Vercel handles the minor version internally.
For best results, match your local Bun version with the version used by Vercel. **Currently, Bun version `1.2.23` is supported**.
For best results, match your local Bun version with the version used by Vercel.
</Step>
<Step title="Next.js configuration">
@@ -81,7 +81,7 @@ import { ProductCard } from "/snippets/product-card.mdx";
console.log("runtime", process.versions.bun);
```
```txt
runtime 1.2.23
runtime 1.3.2
```
[See the Vercel Bun Runtime documentation for feature support →](https://vercel.com/docs/functions/runtimes/bun#feature-support)

View File

@@ -1,23 +1,27 @@
---
title: Use EdgeDB with Bun
sidebarTitle: EdgeDB with Bun
title: Use Gel with Bun
sidebarTitle: Gel with Bun
mode: center
---
EdgeDB is a graph-relational database powered by Postgres under the hood. It provides a declarative schema language, migrations system, and object-oriented query language, in addition to supporting raw SQL queries. It solves the object-relational mapping problem at the database layer, eliminating the need for an ORM library in your application code.
Gel (formerly EdgeDB) is a graph-relational database powered by Postgres under the hood. It provides a declarative schema language, migrations system, and object-oriented query language, in addition to supporting raw SQL queries. It solves the object-relational mapping problem at the database layer, eliminating the need for an ORM library in your application code.
---
First, [install EdgeDB](https://www.edgedb.com/install) if you haven't already.
First, [install Gel](https://docs.geldata.com/learn/installation) if you haven't already.
<CodeGroup>
```sh Linux/macOS terminal icon="terminal"
curl --proto '=https' --tlsv1.2 -sSf https://sh.edgedb.com | sh
curl https://www.geldata.com/sh --proto "=https" -sSf1 | sh
```
```sh Windows terminal icon="windows"
iwr https://ps1.edgedb.com -useb | iex
irm https://www.geldata.com/ps1 | iex
```
```sh Homebrew terminal icon="terminal"
brew install geldata/tap/gel-cli
```
</CodeGroup>
@@ -34,35 +38,35 @@ bun init -y
---
We'll use the EdgeDB CLI to initialize an EdgeDB instance for our project. This creates an `edgedb.toml` file in our project root.
We'll use the Gel CLI to initialize a Gel instance for our project. This creates a `gel.toml` file in our project root.
```sh terminal icon="terminal"
edgedb project init
gel project init
```
```txt
No `edgedb.toml` found in `/Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app` or above
No `gel.toml` found in `/Users/colinmcd94/Documents/bun/fun/examples/my-gel-app` or above
Do you want to initialize a new project? [Y/n]
> Y
Specify the name of EdgeDB instance to use with this project [default: my_edgedb_app]:
> my_edgedb_app
Checking EdgeDB versions...
Specify the version of EdgeDB to use with this project [default: x.y]:
Specify the name of Gel instance to use with this project [default: my_gel_app]:
> my_gel_app
Checking Gel versions...
Specify the version of Gel to use with this project [default: x.y]:
> x.y
┌─────────────────────┬────────────────────────────────────────────────────────────────────────
│ Project directory │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app
│ Project config │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/edgedb.toml
│ Schema dir (empty) │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/dbschema
│ Installation method │ portable package
│ Version │ x.y+6d5921b
│ Instance name │ my_edgedb_app
└─────────────────────┴────────────────────────────────────────────────────────────────────────
┌─────────────────────┬──────────────────────────────────────────────────────────────────┐
│ Project directory │ /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app
│ Project config │ /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app/gel.toml│
│ Schema dir (empty) │ /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app/dbschema│
│ Installation method │ portable package │
│ Version │ x.y+6d5921b │
│ Instance name │ my_gel_app │
└─────────────────────┴──────────────────────────────────────────────────────────────────┘
Version x.y+6d5921b is already downloaded
Initializing EdgeDB instance...
Initializing Gel instance...
Applying migrations...
Everything is up to date. Revision initial
Project initialized.
To connect to my_edgedb_app, run `edgedb`
To connect to my_gel_app, run `gel`
```
---
@@ -70,8 +74,8 @@ To connect to my_edgedb_app, run `edgedb`
To see if the database is running, let's open a REPL and run a simple query.
```sh terminal icon="terminal"
edgedb
edgedb> select 1 + 1;
gel
gel> select 1 + 1;
```
```txt
@@ -81,12 +85,12 @@ edgedb> select 1 + 1;
Then run `\quit` to exit the REPL.
```sh terminal icon="terminal"
edgedb> \quit
gel> \quit
```
---
With the project initialized, we can define a schema. The `edgedb project init` command already created a `dbschema/default.esdl` file to contain our schema.
With the project initialized, we can define a schema. The `gel project init` command already created a `dbschema/default.esdl` file to contain our schema.
```txt File Tree icon="folder-tree"
dbschema
@@ -112,15 +116,15 @@ module default {
Then generate and apply an initial migration.
```sh terminal icon="terminal"
edgedb migration create
gel migration create
```
```txt
Created /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/dbschema/migrations/00001.edgeql, id: m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq
Created /Users/colinmcd94/Documents/bun/fun/examples/my-gel-app/dbschema/migrations/00001.edgeql, id: m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq
```
```sh terminal icon="terminal"
edgedb migrate
gel migrate
```
```txt
@@ -129,11 +133,11 @@ Applied m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq (00001.edgeql)
---
With our schema applied, let's execute some queries using EdgeDB's JavaScript client library. We'll install the client library and EdgeDB's codegen CLI, and create a `seed.ts`.file.
With our schema applied, let's execute some queries using Gel's JavaScript client library. We'll install the client library and Gel's codegen CLI, and create a `seed.ts` file.
```sh terminal icon="terminal"
bun add edgedb
bun add -D @edgedb/generate
bun add gel
bun add -D @gel/generate
touch seed.ts
```
@@ -144,7 +148,7 @@ Paste the following code into `seed.ts`.
The client auto-connects to the database. We insert a couple movies using the `.execute()` method. We will use EdgeQL's `for` expression to turn this bulk insert into a single optimized query.
```ts seed.ts icon="/icons/typescript.svg"
import { createClient } from "edgedb";
import { createClient } from "gel";
const client = createClient();
@@ -184,10 +188,10 @@ Seeding complete.
---
EdgeDB implements a number of code generation tools for TypeScript. To query our newly seeded database in a typesafe way, we'll use `@edgedb/generate` to code-generate the EdgeQL query builder.
Gel implements a number of code generation tools for TypeScript. To query our newly seeded database in a typesafe way, we'll use `@gel/generate` to code-generate the EdgeQL query builder.
```sh terminal icon="terminal"
bunx @edgedb/generate edgeql-js
bunx @gel/generate edgeql-js
```
```txt
@@ -213,7 +217,7 @@ the query builder directory? The following line will be added:
In `index.ts`, we can import the generated query builder from `./dbschema/edgeql-js` and write a simple select query.
```ts index.ts icon="/icons/typescript.svg"
import { createClient } from "edgedb";
import { createClient } from "gel";
import e from "./dbschema/edgeql-js";
const client = createClient();
@@ -254,4 +258,4 @@ bun run index.ts
---
For complete documentation, refer to the [EdgeDB docs](https://www.edgedb.com/docs).
For complete documentation, refer to the [Gel docs](https://docs.geldata.com/).

View File

@@ -4,54 +4,100 @@ sidebarTitle: Next.js with Bun
mode: center
---
Initialize a Next.js app with `create-next-app`. This will scaffold a new Next.js project and automatically install dependencies.
```sh terminal icon="terminal"
bun create next-app
```
```txt
✔ What is your project named? … my-app
✔ Would you like to use TypeScript with this project? … No / Yes
✔ Would you like to use ESLint with this project? … No / Yes
✔ Would you like to use Tailwind CSS? ... No / Yes
✔ Would you like to use `src/` directory with this project? … No / Yes
✔ Would you like to use App Router? (recommended) ... No / Yes
✔ What import alias would you like configured? … @/*
Creating a new Next.js app in /path/to/my-app.
```
[Next.js](https://nextjs.org/) is a React framework for building full-stack web applications. It supports server-side rendering, static site generation, API routes, and more. Bun provides fast package installation and can run Next.js development and production servers.
---
You can specify a starter template using the `--example` flag.
<Steps>
<Step title="Create a new Next.js app">
Use the interactive CLI to create a new Next.js app. This will scaffold a new Next.js project and automatically install dependencies.
```sh
bun create next-app --example with-supabase
```
```sh terminal icon="terminal"
bun create next-app@latest my-bun-app
```
```txt
✔ What is your project named? … my-app
...
```
</Step>
<Step title="Start the dev server">
Change to the project directory and run the dev server with Bun.
```sh terminal icon="terminal"
cd my-bun-app
bun --bun run dev
```
This starts the Next.js dev server with Bun's runtime.
Open [`http://localhost:3000`](http://localhost:3000) with your browser to see the result. Any changes you make to `app/page.tsx` will be hot-reloaded in the browser.
</Step>
<Step title="Update scripts in package.json">
Modify the scripts field in your `package.json` by prefixing the Next.js CLI commands with `bun --bun`. This ensures that Bun executes the Next.js CLI for common tasks like `dev`, `build`, and `start`.
```json package.json icon="file-json"
{
"scripts": {
"dev": "bun --bun next dev", // [!code ++]
"build": "bun --bun next build", // [!code ++]
"start": "bun --bun next start", // [!code ++]
}
}
```
</Step>
</Steps>
---
To start the dev server with Bun, run `bun --bun run dev` from the project root.
## Hosting
```sh terminal icon="terminal"
cd my-app
bun --bun run dev
```
Next.js applications on Bun can be deployed to various platforms.
<Columns cols={3}>
<Card title="Vercel" href="/guides/deployment/vercel" icon="/icons/ecosystem/vercel.svg">
Deploy on Vercel
</Card>
<Card title="Railway" href="/guides/deployment/railway" icon="/icons/ecosystem/railway.svg">
Deploy on Railway
</Card>
<Card title="DigitalOcean" href="/guides/deployment/digital-ocean" icon="/icons/ecosystem/digitalocean.svg">
Deploy on DigitalOcean
</Card>
<Card title="AWS Lambda" href="/guides/deployment/aws-lambda" icon="/icons/ecosystem/aws.svg">
Deploy on AWS Lambda
</Card>
<Card title="Google Cloud Run" href="/guides/deployment/google-cloud-run" icon="/icons/ecosystem/gcp.svg">
Deploy on Google Cloud Run
</Card>
<Card title="Render" href="/guides/deployment/render" icon="/icons/ecosystem/render.svg">
Deploy on Render
</Card>
</Columns>
---
To run the dev server with Node.js instead, omit `--bun`.
## Templates
```sh terminal icon="terminal"
cd my-app
bun run dev
```
<Columns cols={2}>
<Card
title="Bun + Next.js Basic Starter"
img="/images/templates/bun-nextjs-basic.png"
href="https://github.com/bun-templates/bun-nextjs-basic"
arrow="true"
cta="Go to template"
>
A simple App Router starter with Bun, Next.js, and Tailwind CSS.
</Card>
<Card
title="Todo App with Next.js + Bun"
img="/images/templates/bun-nextjs-todo.png"
href="https://github.com/bun-templates/bun-nextjs-todo"
arrow="true"
cta="Go to template"
>
A full-stack todo application built with Bun, Next.js, and PostgreSQL.
</Card>
</Columns>
---
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result. Any changes you make to `(pages/app)/index.tsx` will be hot-reloaded in the browser.
[→ See Next.js's official documentation](https://nextjs.org/docs) for more information on building and deploying Next.js applications.

View File

@@ -14,7 +14,7 @@ bunx nuxi init my-nuxt-app
✔ Which package manager would you like to use?
bun
◐ Installing dependencies...
bun install v1.3.1 (16b4bf34)
bun install v1.3.2 (16b4bf34)
+ @nuxt/devtools@0.8.2
+ nuxt@3.7.0
785 packages installed [2.67s]

View File

@@ -0,0 +1,792 @@
---
title: Use TanStack Start with Bun
sidebarTitle: TanStack Start with Bun
mode: center
---
[TanStack Start](https://tanstack.com/start/latest) is a full-stack framework powered by TanStack Router. It supports full-document SSR, streaming, server functions, bundling and more, powered by TanStack Router and [Vite](https://vite.dev/).
---
<Steps>
<Step title="Create a new TanStack Start app">
Use the interactive CLI to create a new TanStack Start app.
```sh terminal icon="terminal"
bun create @tanstack/start@latest my-tanstack-app
```
</Step>
<Step title="Start the dev server">
Change to the project directory and run the dev server with Bun.
```sh terminal icon="terminal"
cd my-tanstack-app
bun --bun run dev
```
This starts the Vite dev server with Bun.
</Step>
<Step title="Update scripts in package.json">
Modify the scripts field in your `package.json` by prefixing the Vite CLI commands with `bun --bun`. This ensures that Bun executes the Vite CLI for common tasks like `dev`, `build`, and `preview`.
```json package.json icon="file-json"
{
"scripts": {
"dev": "bun --bun vite dev", // [!code ++]
"build": "bun --bun vite build", // [!code ++]
"serve": "bun --bun vite preview" // [!code ++]
}
}
```
</Step>
</Steps>
---
## Hosting
To host your TanStack Start app, you can use [Nitro](https://nitro.build/) or a custom Bun server for production deployments.
<Tabs>
<Tab title="Nitro">
<Steps>
<Step title="Add Nitro to your project">
Add [Nitro](https://nitro.build/) to your project. This tool allows you to deploy your TanStack Start app to different platforms.
```sh terminal icon="terminal"
bun add nitro
```
</Step>
<Step title={<span>Update your <code>vite.config.ts</code> file</span>}>
Update your `vite.config.ts` file to include the necessary plugins for TanStack Start with Bun.
```ts vite.config.ts icon="/icons/typescript.svg"
// other imports...
import { nitro } from "nitro/vite"; // [!code ++]
const config = defineConfig({
plugins: [
tanstackStart(),
nitro({ preset: "bun" }), // [!code ++]
// other plugins...
],
});
export default config;
```
<Note>
The `bun` preset is optional, but it configures the build output specifically for Bun's runtime.
</Note>
</Step>
<Step title="Update the start command">
Make sure `build` and `start` scripts are present in your `package.json` file:
```json package.json icon="file-json"
{
"scripts": {
"build": "bun --bun vite build", // [!code ++]
// The .output files are created by Nitro when you run `bun run build`.
// Not necessary when deploying to Vercel.
"start": "bun run .output/server/index.mjs" // [!code ++]
}
}
```
<Note>
You do **not** need the custom `start` script when deploying to Vercel.
</Note>
</Step>
<Step title="Deploy your app">
Check out one of our guides to deploy your app to a hosting provider.
<Note>
When deploying to Vercel, you can either add the `"bunVersion": "1.x"` to your `vercel.json` file, or add it to the `nitro` config in your `vite.config.ts` file:
<Warning>
Do **not** use the `bun` Nitro preset when deploying to Vercel.
</Warning>
```ts vite.config.ts icon="/icons/typescript.svg"
export default defineConfig({
plugins: [
tanstackStart(),
nitro({
preset: "bun", // [!code --]
vercel: { // [!code ++]
functions: { // [!code ++]
runtime: "bun1.x", // [!code ++]
}, // [!code ++]
}, // [!code ++]
}),
],
});
```
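Alternatively, a minimal sketch of the `vercel.json` approach (assuming the runtime pin sits at the top level of the file, per the note above):
```json vercel.json icon="file-json"
{
  // Assumption: top-level field, as described in the note above
  "bunVersion": "1.x"
}
```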
</Note>
</Step>
</Steps>
</Tab>
<Tab title="Custom Server">
<Note>
This custom server implementation is based on [TanStack's Bun template](https://github.com/TanStack/router/blob/main/examples/react/start-bun/server.ts). It provides fine-grained control over static asset serving, including configurable memory management that preloads small files into memory for fast serving while serving larger files on-demand. This approach is useful when you need precise control over resource usage and asset loading behavior in production deployments.
</Note>
<Steps>
<Step title="Create the production server">
Create a `server.ts` file in your project root with the following custom server implementation:
```ts server.ts icon="/icons/typescript.svg" expandable
/**
* TanStack Start Production Server with Bun
*
* A high-performance production server for TanStack Start applications that
* implements intelligent static asset loading with configurable memory management.
*
* Features:
* - Hybrid loading strategy (preload small files, serve large files on-demand)
* - Configurable file filtering with include/exclude patterns
* - Memory-efficient response generation
* - Production-ready caching headers
*
* Environment Variables:
*
* PORT (number)
* - Server port number
* - Default: 3000
*
* ASSET_PRELOAD_MAX_SIZE (number)
* - Maximum file size in bytes to preload into memory
* - Files larger than this will be served on-demand from disk
* - Default: 5242880 (5MB)
* - Example: ASSET_PRELOAD_MAX_SIZE=5242880 (5MB)
*
* ASSET_PRELOAD_INCLUDE_PATTERNS (string)
* - Comma-separated list of glob patterns for files to include
* - If specified, only matching files are eligible for preloading
* - Patterns are matched against filenames only, not full paths
* - Example: ASSET_PRELOAD_INCLUDE_PATTERNS="*.js,*.css,*.woff2"
*
* ASSET_PRELOAD_EXCLUDE_PATTERNS (string)
* - Comma-separated list of glob patterns for files to exclude
* - Applied after include patterns
* - Patterns are matched against filenames only, not full paths
* - Example: ASSET_PRELOAD_EXCLUDE_PATTERNS="*.map,*.txt"
*
* ASSET_PRELOAD_VERBOSE_LOGGING (boolean)
* - Enable detailed logging of loaded and skipped files
* - Default: false
* - Set to "true" to enable verbose output
*
* ASSET_PRELOAD_ENABLE_ETAG (boolean)
* - Enable ETag generation for preloaded assets
* - Default: true
* - Set to "false" to disable ETag support
*
* ASSET_PRELOAD_ENABLE_GZIP (boolean)
* - Enable Gzip compression for eligible assets
* - Default: true
* - Set to "false" to disable Gzip compression
*
* ASSET_PRELOAD_GZIP_MIN_SIZE (number)
* - Minimum file size in bytes required for Gzip compression
* - Files smaller than this will not be compressed
* - Default: 1024 (1KB)
*
* ASSET_PRELOAD_GZIP_MIME_TYPES (string)
* - Comma-separated list of MIME types eligible for Gzip compression
* - Supports partial matching for types ending with "/"
* - Default: text/,application/javascript,application/json,application/xml,image/svg+xml
*
* Usage:
* bun run server.ts
*/
import path from 'node:path'
// Configuration
const SERVER_PORT = Number(process.env.PORT ?? 3000)
const CLIENT_DIRECTORY = './dist/client'
const SERVER_ENTRY_POINT = './dist/server/server.js'
// Logging utilities for professional output
const log = {
info: (message: string) => {
console.log(`[INFO] ${message}`)
},
success: (message: string) => {
console.log(`[SUCCESS] ${message}`)
},
warning: (message: string) => {
console.log(`[WARNING] ${message}`)
},
error: (message: string) => {
console.log(`[ERROR] ${message}`)
},
header: (message: string) => {
console.log(`\n${message}\n`)
},
}
// Preloading configuration from environment variables
const MAX_PRELOAD_BYTES = Number(
process.env.ASSET_PRELOAD_MAX_SIZE ?? 5 * 1024 * 1024, // 5MB default
)
// Parse comma-separated include patterns (no defaults)
const INCLUDE_PATTERNS = (process.env.ASSET_PRELOAD_INCLUDE_PATTERNS ?? '')
.split(',')
.map((s) => s.trim())
.filter(Boolean)
.map((pattern: string) => convertGlobToRegExp(pattern))
// Parse comma-separated exclude patterns (no defaults)
const EXCLUDE_PATTERNS = (process.env.ASSET_PRELOAD_EXCLUDE_PATTERNS ?? '')
.split(',')
.map((s) => s.trim())
.filter(Boolean)
.map((pattern: string) => convertGlobToRegExp(pattern))
// Verbose logging flag
const VERBOSE = process.env.ASSET_PRELOAD_VERBOSE_LOGGING === 'true'
// Optional ETag feature
const ENABLE_ETAG = (process.env.ASSET_PRELOAD_ENABLE_ETAG ?? 'true') === 'true'
// Optional Gzip feature
const ENABLE_GZIP = (process.env.ASSET_PRELOAD_ENABLE_GZIP ?? 'true') === 'true'
const GZIP_MIN_BYTES = Number(process.env.ASSET_PRELOAD_GZIP_MIN_SIZE ?? 1024) // 1KB
const GZIP_TYPES = (
process.env.ASSET_PRELOAD_GZIP_MIME_TYPES ??
'text/,application/javascript,application/json,application/xml,image/svg+xml'
)
.split(',')
.map((v) => v.trim())
.filter(Boolean)
/**
* Convert a simple glob pattern to a regular expression
* Supports * wildcard for matching any characters
*/
function convertGlobToRegExp(globPattern: string): RegExp {
// Escape regex special chars except *, then replace * with .*
const escapedPattern = globPattern
.replace(/[-/\\^$+?.()|[\]{}]/g, '\\$&')
.replace(/\*/g, '.*')
return new RegExp(`^${escapedPattern}$`, 'i')
}
/**
* Compute ETag for a given data buffer
*/
function computeEtag(data: Uint8Array): string {
const hash = Bun.hash(data)
return `W/"${hash.toString(16)}-${data.byteLength.toString()}"`
}
/**
* Metadata for preloaded static assets
*/
interface AssetMetadata {
route: string
size: number
type: string
}
/**
* In-memory asset with ETag and Gzip support
*/
interface InMemoryAsset {
raw: Uint8Array
gz?: Uint8Array
etag?: string
type: string
immutable: boolean
size: number
}
/**
* Result of static asset preloading process
*/
interface PreloadResult {
routes: Record<string, (req: Request) => Response | Promise<Response>>
loaded: AssetMetadata[]
skipped: AssetMetadata[]
}
/**
* Check if a file is eligible for preloading based on configured patterns
*/
function isFileEligibleForPreloading(relativePath: string): boolean {
const fileName = relativePath.split(/[/\\]/).pop() ?? relativePath
// If include patterns are specified, file must match at least one
if (INCLUDE_PATTERNS.length > 0) {
if (!INCLUDE_PATTERNS.some((pattern) => pattern.test(fileName))) {
return false
}
}
// If exclude patterns are specified, file must not match any
if (EXCLUDE_PATTERNS.some((pattern) => pattern.test(fileName))) {
return false
}
return true
}
/**
* Check if a MIME type is compressible
*/
function isMimeTypeCompressible(mimeType: string): boolean {
return GZIP_TYPES.some((type) =>
type.endsWith('/') ? mimeType.startsWith(type) : mimeType === type,
)
}
/**
* Conditionally compress data based on size and MIME type
*/
function compressDataIfAppropriate(
data: Uint8Array,
mimeType: string,
): Uint8Array | undefined {
if (!ENABLE_GZIP) return undefined
if (data.byteLength < GZIP_MIN_BYTES) return undefined
if (!isMimeTypeCompressible(mimeType)) return undefined
try {
return Bun.gzipSync(data.buffer as ArrayBuffer)
} catch {
return undefined
}
}
/**
* Create response handler function with ETag and Gzip support
*/
function createResponseHandler(
asset: InMemoryAsset,
): (req: Request) => Response {
return (req: Request) => {
const headers: Record<string, string> = {
'Content-Type': asset.type,
'Cache-Control': asset.immutable
? 'public, max-age=31536000, immutable'
: 'public, max-age=3600',
}
if (ENABLE_ETAG && asset.etag) {
const ifNone = req.headers.get('if-none-match')
if (ifNone && ifNone === asset.etag) {
return new Response(null, {
status: 304,
headers: { ETag: asset.etag },
})
}
headers.ETag = asset.etag
}
if (
ENABLE_GZIP &&
asset.gz &&
req.headers.get('accept-encoding')?.includes('gzip')
) {
headers['Content-Encoding'] = 'gzip'
headers['Content-Length'] = String(asset.gz.byteLength)
const gzCopy = new Uint8Array(asset.gz)
return new Response(gzCopy, { status: 200, headers })
}
headers['Content-Length'] = String(asset.raw.byteLength)
const rawCopy = new Uint8Array(asset.raw)
return new Response(rawCopy, { status: 200, headers })
}
}
/**
* Create composite glob pattern from include patterns
*/
function createCompositeGlobPattern(): Bun.Glob {
const raw = (process.env.ASSET_PRELOAD_INCLUDE_PATTERNS ?? '')
.split(',')
.map((s) => s.trim())
.filter(Boolean)
if (raw.length === 0) return new Bun.Glob('**/*')
if (raw.length === 1) return new Bun.Glob(raw[0])
return new Bun.Glob(`{${raw.join(',')}}`)
}
/**
* Initialize static routes with intelligent preloading strategy
* Small files are loaded into memory, large files are served on-demand
*/
async function initializeStaticRoutes(
clientDirectory: string,
): Promise<PreloadResult> {
const routes: Record<string, (req: Request) => Response | Promise<Response>> =
{}
const loaded: AssetMetadata[] = []
const skipped: AssetMetadata[] = []
log.info(`Loading static assets from ${clientDirectory}...`)
if (VERBOSE) {
console.log(
`Max preload size: ${(MAX_PRELOAD_BYTES / 1024 / 1024).toFixed(2)} MB`,
)
if (INCLUDE_PATTERNS.length > 0) {
console.log(
`Include patterns: ${process.env.ASSET_PRELOAD_INCLUDE_PATTERNS ?? ''}`,
)
}
if (EXCLUDE_PATTERNS.length > 0) {
console.log(
`Exclude patterns: ${process.env.ASSET_PRELOAD_EXCLUDE_PATTERNS ?? ''}`,
)
}
}
let totalPreloadedBytes = 0
try {
const glob = createCompositeGlobPattern()
for await (const relativePath of glob.scan({ cwd: clientDirectory })) {
const filepath = path.join(clientDirectory, relativePath)
const route = `/${relativePath.split(path.sep).join(path.posix.sep)}`
try {
// Get file metadata
const file = Bun.file(filepath)
// Skip if file doesn't exist or is empty
if (!(await file.exists()) || file.size === 0) {
continue
}
const metadata: AssetMetadata = {
route,
size: file.size,
type: file.type || 'application/octet-stream',
}
// Determine if file should be preloaded
const matchesPattern = isFileEligibleForPreloading(relativePath)
const withinSizeLimit = file.size <= MAX_PRELOAD_BYTES
if (matchesPattern && withinSizeLimit) {
// Preload small files into memory with ETag and Gzip support
const bytes = new Uint8Array(await file.arrayBuffer())
const gz = compressDataIfAppropriate(bytes, metadata.type)
const etag = ENABLE_ETAG ? computeEtag(bytes) : undefined
const asset: InMemoryAsset = {
raw: bytes,
gz,
etag,
type: metadata.type,
immutable: true,
size: bytes.byteLength,
}
routes[route] = createResponseHandler(asset)
loaded.push({ ...metadata, size: bytes.byteLength })
totalPreloadedBytes += bytes.byteLength
} else {
// Serve large or filtered files on-demand
routes[route] = () => {
const fileOnDemand = Bun.file(filepath)
return new Response(fileOnDemand, {
headers: {
'Content-Type': metadata.type,
'Cache-Control': 'public, max-age=3600',
},
})
}
skipped.push(metadata)
}
} catch (error: unknown) {
if (error instanceof Error && error.name !== 'EISDIR') {
log.error(`Failed to load ${filepath}: ${error.message}`)
}
}
}
// Show detailed file overview only when verbose mode is enabled
if (VERBOSE && (loaded.length > 0 || skipped.length > 0)) {
const allFiles = [...loaded, ...skipped].sort((a, b) =>
a.route.localeCompare(b.route),
)
// Calculate max path length for alignment
const maxPathLength = Math.min(
Math.max(...allFiles.map((f) => f.route.length)),
60,
)
// Format file size with KB and actual gzip size
const formatFileSize = (bytes: number, gzBytes?: number) => {
const kb = bytes / 1024
const sizeStr = kb < 100 ? kb.toFixed(2) : kb.toFixed(1)
if (gzBytes !== undefined) {
const gzKb = gzBytes / 1024
const gzStr = gzKb < 100 ? gzKb.toFixed(2) : gzKb.toFixed(1)
return {
size: sizeStr,
gzip: gzStr,
}
}
// Rough gzip estimation (typically 30-70% compression) if no actual gzip data
const gzipKb = kb * 0.35
return {
size: sizeStr,
gzip: gzipKb < 100 ? gzipKb.toFixed(2) : gzipKb.toFixed(1),
}
}
if (loaded.length > 0) {
console.log('\n📁 Preloaded into memory:')
console.log(
'Path │ Size │ Gzip Size',
)
loaded
.sort((a, b) => a.route.localeCompare(b.route))
.forEach((file) => {
const { size, gzip } = formatFileSize(file.size)
const paddedPath = file.route.padEnd(maxPathLength)
const sizeStr = `${size.padStart(7)} kB`
const gzipStr = `${gzip.padStart(7)} kB`
console.log(`${paddedPath} │ ${sizeStr} │ ${gzipStr}`)
})
}
if (skipped.length > 0) {
console.log('\n💾 Served on-demand:')
console.log(
'Path │ Size │ Gzip Size',
)
skipped
.sort((a, b) => a.route.localeCompare(b.route))
.forEach((file) => {
const { size, gzip } = formatFileSize(file.size)
const paddedPath = file.route.padEnd(maxPathLength)
const sizeStr = `${size.padStart(7)} kB`
const gzipStr = `${gzip.padStart(7)} kB`
console.log(`${paddedPath} │ ${sizeStr} │ ${gzipStr}`)
})
}
}
// Show detailed verbose info if enabled
if (VERBOSE) {
if (loaded.length > 0 || skipped.length > 0) {
const allFiles = [...loaded, ...skipped].sort((a, b) =>
a.route.localeCompare(b.route),
)
console.log('\n📊 Detailed file information:')
console.log(
'Status │ Path │ MIME Type │ Reason',
)
allFiles.forEach((file) => {
const isPreloaded = loaded.includes(file)
const status = isPreloaded ? 'MEMORY' : 'ON-DEMAND'
const reason =
!isPreloaded && file.size > MAX_PRELOAD_BYTES
? 'too large'
: !isPreloaded
? 'filtered'
: 'preloaded'
const route =
file.route.length > 30
? file.route.substring(0, 27) + '...'
: file.route
console.log(
`${status.padEnd(12)} │ ${route.padEnd(30)} │ ${file.type.padEnd(28)} │ ${reason.padEnd(10)}`,
)
})
} else {
console.log('\n📊 No files found to display')
}
}
// Log summary after the file list
console.log() // Empty line for separation
if (loaded.length > 0) {
log.success(
`Preloaded ${String(loaded.length)} files (${(totalPreloadedBytes / 1024 / 1024).toFixed(2)} MB) into memory`,
)
} else {
log.info('No files preloaded into memory')
}
if (skipped.length > 0) {
const tooLarge = skipped.filter((f) => f.size > MAX_PRELOAD_BYTES).length
const filtered = skipped.length - tooLarge
log.info(
`${String(skipped.length)} files will be served on-demand (${String(tooLarge)} too large, ${String(filtered)} filtered)`,
)
}
} catch (error) {
log.error(
`Failed to load static files from ${clientDirectory}: ${String(error)}`,
)
}
return { routes, loaded, skipped }
}
/**
* Initialize the server
*/
async function initializeServer() {
log.header('Starting Production Server')
// Load TanStack Start server handler
let handler: { fetch: (request: Request) => Response | Promise<Response> }
try {
const serverModule = (await import(SERVER_ENTRY_POINT)) as {
default: { fetch: (request: Request) => Response | Promise<Response> }
}
handler = serverModule.default
log.success('TanStack Start application handler initialized')
} catch (error) {
log.error(`Failed to load server handler: ${String(error)}`)
process.exit(1)
}
// Build static routes with intelligent preloading
const { routes } = await initializeStaticRoutes(CLIENT_DIRECTORY)
// Create Bun server
const server = Bun.serve({
port: SERVER_PORT,
routes: {
// Serve static assets (preloaded or on-demand)
...routes,
// Fallback to TanStack Start handler for all other routes
'/*': (req: Request) => {
try {
return handler.fetch(req)
} catch (error) {
log.error(`Server handler error: ${String(error)}`)
return new Response('Internal Server Error', { status: 500 })
}
},
},
// Global error handler
error(error) {
log.error(
`Uncaught server error: ${error instanceof Error ? error.message : String(error)}`,
)
return new Response('Internal Server Error', { status: 500 })
},
})
log.success(`Server listening on http://localhost:${String(server.port)}`)
}
// Initialize the server
initializeServer().catch((error: unknown) => {
log.error(`Failed to start server: ${String(error)}`)
process.exit(1)
})
```
</Step>
<Step title="Update package.json scripts">
Add a `start` script to run the custom server:
```json package.json icon="file-json"
{
"scripts": {
"build": "bun --bun vite build",
"start": "bun run server.ts" // [!code ++]
}
}
```
</Step>
<Step title="Build and run">
Build your application and start the server:
```sh terminal icon="terminal"
bun run build
bun run start
```
The server will start on port 3000 by default (configurable via the `PORT` environment variable).
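For example, to start it on a different port:
```sh terminal icon="terminal"
PORT=8080 bun run start
```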
</Step>
</Steps>
</Tab>
</Tabs>
<Columns cols={3}>
<Card title="Vercel" href="/guides/deployment/vercel" icon="/icons/ecosystem/vercel.svg">
Deploy on Vercel
</Card>
<Card title="Render" href="/guides/deployment/render" icon="/icons/ecosystem/render.svg">
Deploy on Render
</Card>
<Card title="Railway" href="/guides/deployment/railway" icon="/icons/ecosystem/railway.svg">
Deploy on Railway
</Card>
<Card title="DigitalOcean" href="/guides/deployment/digital-ocean" icon="/icons/ecosystem/digitalocean.svg">
Deploy on DigitalOcean
</Card>
<Card title="AWS Lambda" href="/guides/deployment/aws-lambda" icon="/icons/ecosystem/aws.svg">
Deploy on AWS Lambda
</Card>
<Card title="Google Cloud Run" href="/guides/deployment/google-cloud-run" icon="/icons/ecosystem/gcp.svg">
Deploy on Google Cloud Run
</Card>
</Columns>
---
## Templates
<Columns cols={2}>
<Card
title="Todo App with Tanstack + Bun"
img="/images/templates/bun-tanstack-todo.png"
href="https://github.com/bun-templates/bun-tanstack-todo"
arrow="true"
cta="Go to template"
>
A Todo application built with Bun, TanStack Start, and PostgreSQL.
</Card>
<Card
title="Bun + TanStack Start Application"
img="/images/templates/bun-tanstack-basic.png"
href="https://github.com/bun-templates/bun-tanstack-basic"
arrow="true"
cta="Go to template"
>
A TanStack Start template using Bun with SSR and file-based routing.
</Card>
<Card
title="Basic Bun + Tanstack Starter"
img="/images/templates/bun-tanstack-start.png"
href="https://github.com/bun-templates/bun-tanstack-start"
arrow="true"
cta="Go to template"
>
The basic TanStack starter using the Bun runtime and Bun's file APIs.
</Card>
</Columns>
---
[→ See TanStack Start's official documentation](https://tanstack.com/start/latest/docs/framework/react/guide/hosting) for more information on hosting.

View File

@@ -0,0 +1,87 @@
---
title: Bun Redis with Upstash
sidebarTitle: Upstash with Bun
mode: center
---
[Upstash](https://upstash.com/) is a fully managed Redis database as a service. Upstash works with the Redis® API, which means you can use Bun's native Redis client to connect to your Upstash database.
<Note>TLS is enabled by default for all Upstash Redis databases.</Note>
---
<Steps>
<Step title="Create a new project">
Create a new project by running `bun init`:
```sh terminal icon="terminal"
bun init bun-upstash-redis
cd bun-upstash-redis
```
</Step>
<Step title="Create an Upstash Redis database">
Go to the [Upstash dashboard](https://console.upstash.com/) and create a new Redis database. After completing the [getting started guide](https://upstash.com/docs/redis/overall/getstarted), you'll see your database page with connection information.
The database page displays two connection methods: HTTP and TLS. For Bun's Redis client, you need the **TLS** connection URL, which starts with `rediss://`.
<Frame>
![Upstash Redis database page](/images/guides/upstash-1.png)
</Frame>
</Step>
<Step title="Connect using Bun's Redis client">
You can connect to Upstash by setting environment variables with Bun's default `redis` client.
Set the `REDIS_URL` environment variable in your `.env` file using the Redis endpoint (not the REST URL):
```env .env icon="settings"
REDIS_URL=rediss://********@********.upstash.io:6379
```
Bun's Redis client reads connection information from `REDIS_URL` by default:
```ts index.ts icon="/icons/typescript.svg"
import { redis } from "bun";
// Reads from process.env.REDIS_URL automatically
await redis.set("counter", "0"); // [!code ++]
```
Alternatively, you can create a custom client using `RedisClient`:
```ts index.ts icon="/icons/typescript.svg"
import { RedisClient } from "bun";
const redis = new RedisClient(process.env.REDIS_URL); // [!code ++]
```
</Step>
<Step title="Use the Redis client">
You can now use the Redis client to interact with your Upstash Redis database:
```ts index.ts icon="/icons/typescript.svg"
import { redis } from "bun";
// Get a value
let counter = await redis.get("counter");
// Set a value if it doesn't exist
if (!counter) {
await redis.set("counter", "0");
}
// Increment the counter
await redis.incr("counter");
// Get the updated value
counter = await redis.get("counter");
console.log(counter);
```
```txt
1
```
The Redis client automatically handles connections in the background. No need to manually connect or disconnect for basic operations.
</Step>
</Steps>

View File

@@ -37,6 +37,7 @@ await extractLinks("https://bun.com");
When scraping websites, you often want to convert relative URLs (like `/docs`) to absolute URLs. Here's how to handle URL resolution:
{/* prettier-ignore */}
```ts extract-links.ts icon="/icons/typescript.svg"
async function extractLinksFromURL(url: string) {
const response = await fetch(url);
@@ -47,13 +48,11 @@ async function extractLinksFromURL(url: string) {
const href = el.getAttribute("href");
if (href) {
// Convert relative URLs to absolute // [!code ++]
try {
// [!code ++]
try { // [!code ++]
const absoluteURL = new URL(href, url).href; // [!code ++]
links.add(absoluteURL); // [!code ++]
} catch {
// [!code ++]
links.add(href);
links.add(absoluteURL);
} catch { // [!code ++]
links.add(href); // [!code ++]
} // [!code ++]
}
},

View File

@@ -65,6 +65,7 @@ First we use the [`.formData()`](https://developer.mozilla.org/en-US/docs/Web/AP
Finally, we write the `Blob` to disk using [`Bun.write()`](https://bun.com/docs/api/file-io#writing-files-bun-write).
{/* prettier-ignore */}
```ts index.ts icon="/icons/typescript.svg"
const server = Bun.serve({
port: 4000,
@@ -80,8 +81,7 @@ const server = Bun.serve({
});
// parse formdata at /action // [!code ++]
if (url.pathname === "/action") {
// [!code ++]
if (url.pathname === "/action") { // [!code ++]
const formdata = await req.formData(); // [!code ++]
const name = formdata.get("name"); // [!code ++]
const profilePicture = formdata.get("profilePicture"); // [!code ++]

View File

@@ -33,6 +33,8 @@ bun add git@github.com:lodash/lodash.git
bun add github:colinhacks/zod
```
**Note:** GitHub dependencies download via HTTP tarball when possible for faster installation.
---
See [Docs > Package manager](https://bun.com/docs/cli/install) for complete documentation of Bun's package manager.

View File

@@ -17,7 +17,7 @@ This will add the package to `peerDependencies` in `package.json`.
```json package.json icon="file-json"
{
"peerDependencies": {
"@types/bun": "^1.3.1" // [!code ++]
"@types/bun": "^1.3.2" // [!code ++]
}
}
```
@@ -26,14 +26,14 @@ This will add the package to `peerDependencies` in `package.json`.
Running `bun install` will install peer dependencies by default, unless marked optional in `peerDependenciesMeta`.
{/* prettier-ignore */}
```json package.json icon="file-json"
{
"peerDependencies": {
"@types/bun": "^1.3.1"
"@types/bun": "^1.3.2"
},
"peerDependenciesMeta": {
"@types/bun": {
// [!code ++]
"@types/bun": { // [!code ++]
"optional": true // [!code ++]
} // [!code ++]
}

View File

@@ -25,7 +25,7 @@ To use it with `bun install`, add a `bunfig.toml` file to your project with the
[install.registry]
url = "https://pkgs.dev.azure.com/my-azure-artifacts-user/_packaging/my-azure-artifacts-user/npm/registry"
username = "my-azure-artifacts-user"
# Bun v1.0.3+ supports using an environment variable here
# You can use an environment variable here
password = "$NPM_PASSWORD"
```

View File

@@ -99,7 +99,7 @@ bun update
bun update @types/bun --latest
# Update a dependency to a specific version
bun update @types/bun@1.3.1
bun update @types/bun@1.3.2
# Update all dependencies to the latest versions
bun update --latest

View File

@@ -17,7 +17,7 @@ Make sure to replace `MY_SUBDOMAIN` with your JFrog Artifactory subdomain, such
```toml bunfig.toml icon="settings"
[install.registry]
url = "https://MY_SUBDOMAIN.jfrog.io/artifactory/api/npm/npm/_auth=MY_TOKEN"
# Bun v1.0.3+ supports using an environment variable here
# You can use an environment variable here
# url = "$NPM_CONFIG_REGISTRY"
```

View File

@@ -23,8 +23,8 @@ To listen to changes in subdirectories, pass the `recursive: true` option to `fs
```ts
import { watch } from "fs";
const watcher = watch(import.meta.dir, { recursive: true }, (event, filename) => {
console.log(`Detected ${event} in ${filename}`);
const watcher = watch(import.meta.dir, { recursive: true }, (event, relativePath) => {
console.log(`Detected ${event} in ${relativePath}`);
});
```
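Building on the watcher above, you can stop watching and exit cleanly by closing the watcher; a minimal sketch:
```ts
process.on("SIGINT", () => {
  // Stop watching and exit when Ctrl-C is pressed
  watcher.close();
  process.exit(0);
});
```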

View File

@@ -15,12 +15,12 @@ jobs:
steps:
# ...
- uses: actions/checkout@v4
- uses: oven-sh/setup-bun@v2 // [!code ++]
- uses: oven-sh/setup-bun@v2 # [!code ++]
# run any `bun` or `bunx` command
- run: bun install // [!code ++]
- run: bun index.ts // [!code ++]
- run: bun run build // [!code ++]
- run: bun install # [!code ++]
- run: bun index.ts # [!code ++]
- run: bun run build # [!code ++]
```
---
@@ -36,8 +36,8 @@ jobs:
steps:
# ...
- uses: oven-sh/setup-bun@v2
with: // [!code ++]
bun-version: 1.2.0 # or "latest", "canary", <sha> // [!code ++]
with: # [!code ++]
bun-version: 1.2.0 # or "latest", "canary", <sha> # [!code ++]
```
---

View File

@@ -27,9 +27,9 @@ if (process.env.NODE_ENV === "production") {
Before the code reaches the JavaScript engine, Bun replaces `process.env.NODE_ENV` with `"production"`.
{/* prettier-ignore */}
```ts
if ("production" === "production") {
// [!code ++]
if ("production" === "production") { // [!code ++]
console.log("Production mode");
} else {
console.log("Development mode");
@@ -42,9 +42,9 @@ It doesn't stop there. Bun's optimizing transpiler is smart enough to do some ba
Since `"production" === "production"` is always `true`, Bun replaces the entire expression with the `true` value.
{/* prettier-ignore */}
```ts
if (true) {
// [!code ++]
if (true) { // [!code ++]
console.log("Production mode");
} else {
console.log("Development mode");
@@ -120,7 +120,7 @@ console.log("abc");
You can also pass properties to the `--define` flag.
For example, to replace all usages of `console.write` with `console.log`, you can use the following command (requires Bun v1.1.5 or later)
For example, to replace all usages of `console.write` with `console.log`, you can use the following command
```sh
bun --define console.write=console.log src/index.ts

View File

@@ -13,5 +13,3 @@ console.log(html); // <!DOCTYPE html><html><head>...
```
This can also be used with hot module reloading and/or watch mode to force Bun to reload whenever the `./file.html` file changes.
This feature was added in Bun v1.1.5.
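For example, a run under hot-reload mode (a sketch assuming the entry file is `index.ts`):
```sh terminal icon="terminal"
bun --hot index.ts
```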

View File

@@ -14,7 +14,7 @@ bun add -d @types/bun # dev dependency
Below is the full set of recommended `compilerOptions` for a Bun project. With this `tsconfig.json`, you can use top-level await, extensioned or extensionless imports, and JSX.
```json package.json icon="file-json"
```json tsconfig.json icon="file-json"
{
"compilerOptions": {
// Environment setup & latest features

View File

@@ -0,0 +1,143 @@
---
title: Selectively run tests concurrently with glob patterns
sidebarTitle: Concurrent test glob
mode: center
---
This guide demonstrates how to use the `concurrentTestGlob` option to selectively run tests concurrently based on file naming patterns.
## Project Structure
```sh title="Project Structure" icon="folder-tree"
my-project/
├── bunfig.toml
├── tests/
│ ├── unit/
│ │ ├── math.test.ts # Sequential
│ │ └── utils.test.ts # Sequential
│ └── integration/
│ ├── concurrent-api.test.ts # Concurrent
│ └── concurrent-database.test.ts # Concurrent
```
## Configuration
Configure your `bunfig.toml` to run test files with "concurrent-" prefix concurrently:
```toml title="bunfig.toml" icon="settings"
[test]
# Run all test files with "concurrent-" prefix concurrently
concurrentTestGlob = "**/concurrent-*.test.ts"
```
## Test Files
### Unit Test (Sequential)
Sequential tests are good for tests that share state or have specific ordering requirements:
```ts title="tests/unit/math.test.ts" icon="/icons/typescript.svg"
import { test, expect } from "bun:test";
// These tests run sequentially by default
let sharedState = 0;
test("addition", () => {
sharedState = 5 + 3;
expect(sharedState).toBe(8);
});
test("uses previous state", () => {
// This test depends on the previous test's state
expect(sharedState).toBe(8);
});
```
### Integration Test (Concurrent)
Tests in files matching the glob pattern automatically run concurrently:
```ts title="tests/integration/concurrent-api.test.ts" icon="/icons/typescript.svg"
import { test, expect } from "bun:test";
// These tests automatically run concurrently due to filename matching the glob pattern.
// Using test() is equivalent to test.concurrent() when the file matches concurrentTestGlob.
// Each test is independent and can run in parallel.
test("fetch user data", async () => {
const response = await fetch("/api/user/1");
expect(response.ok).toBe(true);
});
test("fetch posts", async () => {
const response = await fetch("/api/posts");
expect(response.ok).toBe(true);
});
test("fetch comments", async () => {
const response = await fetch("/api/comments");
expect(response.ok).toBe(true);
});
```
## Running Tests
```bash terminal icon="terminal"
# Run all tests - concurrent-*.test.ts files will run concurrently
bun test
# Override: Force ALL tests to run concurrently
# Note: This overrides bunfig.toml and runs all tests concurrently, regardless of glob
bun test --concurrent
# Run only unit tests (sequential)
bun test tests/unit
# Run only integration tests (concurrent due to glob pattern)
bun test tests/integration
```
## Benefits
1. **Gradual Migration**: Migrate to concurrent tests file by file by renaming them
2. **Clear Organization**: File naming convention indicates execution mode
3. **Performance**: Integration tests run faster in parallel
4. **Safety**: Unit tests remain sequential where needed
5. **Flexibility**: Easy to change execution mode by renaming files
## Migration Strategy
To migrate existing tests to concurrent execution:
1. **Start with independent integration tests** - These typically don't share state
2. **Rename files to match the glob pattern**: `mv api.test.ts concurrent-api.test.ts`
3. **Verify tests still pass** - Run `bun test` to ensure no race conditions
4. **Monitor for shared state issues** - Watch for flaky tests or unexpected failures
5. **Continue migrating stable tests incrementally** - Don't rush the migration
## Tips
- **Use descriptive prefixes**: `concurrent-`, `parallel-`, `async-`
- **Keep related sequential tests together** in the same directory
- **Document why certain tests must remain sequential** with comments
- **Use `test.concurrent()` for fine-grained control** in sequential files, as shown in the sketch below
(Note: In files matched by `concurrentTestGlob`, plain `test()` already runs concurrently)
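A minimal sketch of that opt-in pattern, using a hypothetical sequential file and a placeholder endpoint:
```ts title="tests/unit/mixed.test.ts" icon="/icons/typescript.svg"
import { test, expect } from "bun:test";

// This file does not match concurrentTestGlob, so plain test() runs sequentially.
test("runs sequentially", () => {
  expect(1 + 1).toBe(2);
});

// Opt just this independent test into concurrent execution.
test.concurrent("fetch health check", async () => {
  const response = await fetch("/api/health");
  expect(response.ok).toBe(true);
});
```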
## Multiple Patterns
You can specify multiple patterns for different test categories:
```toml title="bunfig.toml" icon="settings"
[test]
concurrentTestGlob = [
"**/integration/*.test.ts",
"**/e2e/*.test.ts",
"**/concurrent-*.test.ts"
]
```
This configuration will run tests concurrently if they match any of these patterns:
- All tests in `integration/` directories
- All tests in `e2e/` directories
- All tests with `concurrent-` prefix anywhere in the project

View File

@@ -64,10 +64,10 @@ Later, when this test file is executed again, Bun will read the snapshot file an
```sh terminal icon="terminal"
bun test
bun test v1.3.1 (9c68abdb)
```
```txt
bun test v1.3.2 (9c68abdb)
test/snap.test.ts:
✓ snapshot [1.05ms]
@@ -83,10 +83,10 @@ To update snapshots, use the `--update-snapshots` flag.
```sh terminal icon="terminal"
bun test --update-snapshots
bun test v1.3.1 (9c68abdb)
```
```txt
bun test v1.3.2 (9c68abdb)
test/snap.test.ts:
✓ snapshot [0.86ms]

View File

@@ -23,6 +23,7 @@ const spy = spyOn(leo, "sayHi");
Once the spy is created, it can be used to write `expect` assertions relating to method calls.
{/* prettier-ignore */}
```ts
import { test, expect, spyOn } from "bun:test";
@@ -35,8 +36,7 @@ const leo = {
const spy = spyOn(leo, "sayHi");
test("turtles", () => {
// [!code ++]
test("turtles", () => { // [!code ++]
expect(spy).toHaveBeenCalledTimes(0); // [!code ++]
leo.sayHi("pizza"); // [!code ++]
expect(spy).toHaveBeenCalledTimes(1); // [!code ++]

View File

@@ -7,7 +7,7 @@ mode: center
Get the current version of Bun in a semver format.
```ts index.ts icon="/icons/typescript.svg"
Bun.version; // => "1.3.1"
Bun.version; // => "1.3.2"
```
---

View File

@@ -9,7 +9,7 @@ When building a WebSocket server, it's typically necessary to store some identif
With [Bun.serve()](https://bun.com/docs/api/websockets#contextual-data), this "contextual data" is set when the connection is initially upgraded by passing a `data` parameter in the `server.upgrade()` call.
```ts server.ts icon="/icons/typescript.svg"
Bun.serve<{ socketId: number }>({
Bun.serve({
fetch(req, server) {
const success = server.upgrade(req, {
data: {
@@ -22,6 +22,9 @@ Bun.serve<{ socketId: number }>({
// ...
},
websocket: {
// TypeScript: specify the type of ws.data like this
data: {} as { socketId: number },
// define websocket handlers
async message(ws, message) {
// the contextual data is available as the `data` property
@@ -43,8 +46,7 @@ type WebSocketData = {
userId: string;
};
// TypeScript: specify the type of `data`
Bun.serve<WebSocketData>({
Bun.serve({
async fetch(req, server) {
// use a library to parse cookies
const cookies = parseCookies(req.headers.get("Cookie"));
@@ -62,6 +64,9 @@ Bun.serve<WebSocketData>({
if (upgraded) return undefined;
},
websocket: {
// TypeScript: specify the type of ws.data like this
data: {} as WebSocketData,
async message(ws, message) {
// save the message to a database
await saveMessageToDatabase({

View File

@@ -9,7 +9,7 @@ Bun's server-side `WebSocket` API provides a native pub-sub API. Sockets can be
This code snippet implements a simple single-channel chat server.
```ts server.ts icon="/icons/typescript.svg"
const server = Bun.serve<{ username: string }>({
const server = Bun.serve({
fetch(req, server) {
const cookies = req.headers.get("cookie");
const username = getUsernameFromCookies(cookies);
@@ -19,6 +19,9 @@ const server = Bun.serve<{ username: string }>({
return new Response("Hello world");
},
websocket: {
// TypeScript: specify the type of ws.data like this
data: {} as { username: string },
open(ws) {
const msg = `${ws.data.username} has entered the chat`;
ws.subscribe("the-group-chat");

View File

@@ -9,7 +9,7 @@ Start a simple WebSocket server using [`Bun.serve`](https://bun.com/docs/api/htt
Inside `fetch`, we attempt to upgrade incoming `ws:` or `wss:` requests to WebSocket connections.
```ts server.ts icon="/icons/typescript.svg"
const server = Bun.serve<{ authToken: string }>({
const server = Bun.serve({
fetch(req, server) {
const success = server.upgrade(req);
if (success) {
@@ -22,6 +22,9 @@ const server = Bun.serve<{ authToken: string }>({
return new Response("Hello world!");
},
websocket: {
// TypeScript: specify the type of ws.data like this
data: {} as { authToken: string },
// this is called when a message is received
async message(ws, message) {
console.log(`Received ${message}`);

View File

@@ -0,0 +1,13 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" version="1.1" id="Layer_1" x="0px" y="0px" width="24" height="24" viewBox="0 0 304 182" style="enable-background:new 0 0 304 182;" xml:space="preserve">
<style type="text/css">
.st0{fill:#646464;}
.st1{fill-rule:evenodd;clip-rule:evenodd;fill:#FF9900;}
</style>
<g>
<path class="st0" d="M86.4,66.4c0,3.7,0.4,6.7,1.1,8.9c0.8,2.2,1.8,4.6,3.2,7.2c0.5,0.8,0.7,1.6,0.7,2.3c0,1-0.6,2-1.9,3l-6.3,4.2 c-0.9,0.6-1.8,0.9-2.6,0.9c-1,0-2-0.5-3-1.4C76.2,90,75,88.4,74,86.8c-1-1.7-2-3.6-3.1-5.9c-7.8,9.2-17.6,13.8-29.4,13.8 c-8.4,0-15.1-2.4-20-7.2c-4.9-4.8-7.4-11.2-7.4-19.2c0-8.5,3-15.4,9.1-20.6c6.1-5.2,14.2-7.8,24.5-7.8c3.4,0,6.9,0.3,10.6,0.8 c3.7,0.5,7.5,1.3,11.5,2.2v-7.3c0-7.6-1.6-12.9-4.7-16c-3.2-3.1-8.6-4.6-16.3-4.6c-3.5,0-7.1,0.4-10.8,1.3c-3.7,0.9-7.3,2-10.8,3.4 c-1.6,0.7-2.8,1.1-3.5,1.3c-0.7,0.2-1.2,0.3-1.6,0.3c-1.4,0-2.1-1-2.1-3.1v-4.9c0-1.6,0.2-2.8,0.7-3.5c0.5-0.7,1.4-1.4,2.8-2.1 c3.5-1.8,7.7-3.3,12.6-4.5c4.9-1.3,10.1-1.9,15.6-1.9c11.9,0,20.6,2.7,26.2,8.1c5.5,5.4,8.3,13.6,8.3,24.6V66.4z M45.8,81.6 c3.3,0,6.7-0.6,10.3-1.8c3.6-1.2,6.8-3.4,9.5-6.4c1.6-1.9,2.8-4,3.4-6.4c0.6-2.4,1-5.3,1-8.7v-4.2c-2.9-0.7-6-1.3-9.2-1.7 c-3.2-0.4-6.3-0.6-9.4-0.6c-6.7,0-11.6,1.3-14.9,4c-3.3,2.7-4.9,6.5-4.9,11.5c0,4.7,1.2,8.2,3.7,10.6 C37.7,80.4,41.2,81.6,45.8,81.6z M126.1,92.4c-1.8,0-3-0.3-3.8-1c-0.8-0.6-1.5-2-2.1-3.9L96.7,10.2c-0.6-2-0.9-3.3-0.9-4 c0-1.6,0.8-2.5,2.4-2.5h9.8c1.9,0,3.2,0.3,3.9,1c0.8,0.6,1.4,2,2,3.9l16.8,66.2l15.6-66.2c0.5-2,1.1-3.3,1.9-3.9c0.8-0.6,2.2-1,4-1 h8c1.9,0,3.2,0.3,4,1c0.8,0.6,1.5,2,1.9,3.9l15.8,67l17.3-67c0.6-2,1.3-3.3,2-3.9c0.8-0.6,2.1-1,3.9-1h9.3c1.6,0,2.5,0.8,2.5,2.5 c0,0.5-0.1,1-0.2,1.6c-0.1,0.6-0.3,1.4-0.7,2.5l-24.1,77.3c-0.6,2-1.3,3.3-2.1,3.9c-0.8,0.6-2.1,1-3.8,1h-8.6c-1.9,0-3.2-0.3-4-1 c-0.8-0.7-1.5-2-1.9-4L156,23l-15.4,64.4c-0.5,2-1.1,3.3-1.9,4c-0.8,0.7-2.2,1-4,1H126.1z M254.6,95.1c-5.2,0-10.4-0.6-15.4-1.8 c-5-1.2-8.9-2.5-11.5-4c-1.6-0.9-2.7-1.9-3.1-2.8c-0.4-0.9-0.6-1.9-0.6-2.8v-5.1c0-2.1,0.8-3.1,2.3-3.1c0.6,0,1.2,0.1,1.8,0.3 c0.6,0.2,1.5,0.6,2.5,1c3.4,1.5,7.1,2.7,11,3.5c4,0.8,7.9,1.2,11.9,1.2c6.3,0,11.2-1.1,14.6-3.3c3.4-2.2,5.2-5.4,5.2-9.5 c0-2.8-0.9-5.1-2.7-7c-1.8-1.9-5.2-3.6-10.1-5.2L246,52c-7.3-2.3-12.7-5.7-16-10.2c-3.3-4.4-5-9.3-5-14.5c0-4.2,0.9-7.9,2.7-11.1 c1.8-3.2,4.2-6,7.2-8.2c3-2.3,6.4-4,10.4-5.2c4-1.2,8.2-1.7,12.6-1.7c2.2,0,4.5,0.1,6.7,0.4c2.3,0.3,4.4,0.7,6.5,1.1 c2,0.5,3.9,1,5.7,1.6c1.8,0.6,3.2,1.2,4.2,1.8c1.4,0.8,2.4,1.6,3,2.5c0.6,0.8,0.9,1.9,0.9,3.3v4.7c0,2.1-0.8,3.2-2.3,3.2 c-0.8,0-2.1-0.4-3.8-1.2c-5.7-2.6-12.1-3.9-19.2-3.9c-5.7,0-10.2,0.9-13.3,2.8c-3.1,1.9-4.7,4.8-4.7,8.9c0,2.8,1,5.2,3,7.1 c2,1.9,5.7,3.8,11,5.5l14.2,4.5c7.2,2.3,12.4,5.5,15.5,9.6c3.1,4.1,4.6,8.8,4.6,14c0,4.3-0.9,8.2-2.6,11.6 c-1.8,3.4-4.2,6.4-7.3,8.8c-3.1,2.5-6.8,4.3-11.1,5.6C264.4,94.4,259.7,95.1,254.6,95.1z"/>
<g>
<path class="st1" d="M273.5,143.7c-32.9,24.3-80.7,37.2-121.8,37.2c-57.6,0-109.5-21.3-148.7-56.7c-3.1-2.8-0.3-6.6,3.4-4.4 c42.4,24.6,94.7,39.5,148.8,39.5c36.5,0,76.6-7.6,113.5-23.2C274.2,133.6,278.9,139.7,273.5,143.7z"/>
<path class="st1" d="M287.2,128.1c-4.2-5.4-27.8-2.6-38.5-1.3c-3.2,0.4-3.7-2.4-0.8-4.5c18.8-13.2,49.7-9.4,53.3-5 c3.6,4.5-1,35.4-18.6,50.2c-2.7,2.3-5.3,1.1-4.1-1.9C282.5,155.7,291.4,133.4,287.2,128.1z"/>
</g>
</g>
</svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 512 512"><rect width="512" height="512" rx="15%" fill="#0080ff"/><path fill="#fff" d="M78 373v-47h47v104h57V300h74v147A191 191 0 1 0 65 256h74a117 117 0 1 1 117 117"/></svg>


View File

@@ -0,0 +1,13 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="24" height="24" viewBox="0 -12.5 256 256" version="1.1" preserveAspectRatio="xMidYMid">
<g>
<path d="M75.390147,0 C67.1964365,0.144249443 59.6926147,4.61940312 55.670735,11.7594655 L55.670735,11.7594655 L3.05275724,102.995813 C-1.01758575,110.07943 -1.01758575,118.791982 3.05275724,125.875029 L3.05275724,125.875029 L55.6507795,217.871964 C59.631608,225.111234 67.1149042,229.733488 75.3696214,230.052205 L75.3696214,230.052205 L180.586192,230.052205 C188.840909,229.768267 196.337889,225.164829 200.325559,217.932401 L200.325559,217.932401 L252.923581,126.455448 C254.973292,122.857194 255.997862,118.851278 255.997862,114.845933 L255.997862,114.845933 C255.997862,110.840588 254.973292,106.834673 252.923581,103.235849 L252.923581,103.235849 L200.325559,11.7594655 C196.300829,4.6222539 188.799287,0.148240535 180.606147,0 L180.606147,0 L75.390147,0 Z" fill="#4285F4">
</path>
<path d="M236.495178,155.027249 L200.325559,217.932401 C196.337889,225.164829 188.840909,229.768267 180.586192,230.052205 L140.510158,230.052205 L82.0381078,171.057147 L99.2756312,115.473789 L82.0381078,59.2541363 L99.743159,71.1470183 L123.548878,95.1659759 L112.429128,59.2541363 L195.804166,115.261691 L236.495178,155.027249 Z" fill-opacity="0.07" fill="#000000" fill-rule="nonzero">
</path>
<path d="M82.0382788,59.2539082 L99.7433301,71.1473604 L113.622065,115.765481 L99.9902076,159.043164 L82.0382788,171.05749 L99.3807109,115.261463 L82.0382788,59.2539082 Z M127.385457,79.0899171 L135.977707,106.881596 L168.966927,106.881596 L127.385457,79.0899171 Z M195.804166,115.261748 L112.429128,171.057204 L129.77156,115.261748 L112.429128,59.2536232 L195.804166,115.261748 Z" fill="#FFFFFF">
</path>
</g>
</svg>


View File

@@ -0,0 +1,17 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 1024 1024" fill="none">
<style>
.railway-path {
fill: #000000;
}
@media (prefers-color-scheme: dark) {
.railway-path {
fill: #ffffff;
}
}
html.dark .railway-path {
fill: #ffffff;
}
</style>
<path class="railway-path" d="M4.756 438.175A520.713 520.713 0 0 0 0 489.735h777.799c-2.716-5.306-6.365-10.09-10.045-14.772-132.97-171.791-204.498-156.896-306.819-161.26-34.114-1.403-57.249-1.967-193.037-1.967-72.677 0-151.688.185-228.628.39-9.96 26.884-19.566 52.942-24.243 74.14h398.571v51.909H4.756ZM783.93 541.696H.399c.82 13.851 2.112 27.517 3.978 40.999h723.39c32.248 0 50.299-18.297 56.162-40.999ZM45.017 724.306S164.941 1018.77 511.46 1024c207.112 0 385.071-123.006 465.907-299.694H45.017Z"/>
<path class="railway-path" d="M511.454 0C319.953 0 153.311 105.16 65.31 260.612c68.771-.144 202.704-.226 202.704-.226h.031v-.051c158.309 0 164.193.707 195.118 1.998l19.149.706c66.7 2.224 148.683 9.384 213.19 58.19 35.015 26.471 85.571 84.896 115.708 126.52 27.861 38.499 35.876 82.756 16.933 125.158-17.436 38.97-54.952 62.215-100.383 62.215H16.69s4.233 17.944 10.58 37.751h970.632A510.385 510.385 0 0 0 1024 512.218C1024.01 229.355 794.532 0 511.454 0Z"/>
</svg>


View File

@@ -0,0 +1,16 @@
<svg width="24" height="24" viewBox="0 0 21 21" fill="none" xmlns="http://www.w3.org/2000/svg">
<style>
.render-path {
fill: #000000;
}
@media (prefers-color-scheme: dark) {
.render-path {
fill: #ffffff;
}
}
html.dark .render-path {
fill: #ffffff;
}
</style>
<path class="render-path" d="M15.6491 0.00582607C12.9679 -0.120371 10.7133 1.81847 10.3286 4.373C10.3134 4.49154 10.2905 4.60627 10.2715 4.72099C9.67356 7.90268 6.88955 10.3119 3.5457 10.3119C2.35364 10.3119 1.23395 10.006 0.258977 9.47058C0.140914 9.40557 0 9.4897 0 9.62354V10.3081V20.6218H10.2677V12.8894C10.2677 11.4668 11.4178 10.3119 12.8346 10.3119H15.4015C18.3074 10.3119 20.6458 7.89121 20.5315 4.94662C20.4287 2.29649 18.2884 0.132023 15.6491 0.00582607Z"/>
</svg>


View File

@@ -0,0 +1,16 @@
<svg width="24" height="24" viewBox="0 0 76 65" fill="none" xmlns="http://www.w3.org/2000/svg">
<style>
.vercel-path {
fill: #000000;
}
@media (prefers-color-scheme: dark) {
.vercel-path {
fill: #ffffff;
}
}
html.dark .vercel-path {
fill: #ffffff;
}
</style>
<path class="vercel-path" d="M37.5274 0L75.0548 65H0L37.5274 0Z"/>
</svg>

(Binary image files added; previews not shown.)

View File

@@ -38,7 +38,7 @@ Bun ships as a single, dependency-free executable. You can install it via script
</Warning>
For support and discussion, please join the **#windows** channel on our [Discord](https://discord.gg/bun).
For support and discussion, please join the **#windows** channel on our [Discord](https://bun.com/discord).
</Tab>
@@ -209,7 +209,7 @@ Since Bun is a single binary, you can install older versions by re-running the i
To install a specific version, pass the git tag to the install script:
```bash terminal icon="terminal"
curl -fsSL https://bun.com/install | bash -s "bun-v1.3.1"
curl -fsSL https://bun.com/install | bash -s "bun-v1.3.2"
```
</Tab>
@@ -217,7 +217,7 @@ Since Bun is a single binary, you can install older versions by re-running the i
On Windows, pass the version number to the PowerShell install script:
```powershell PowerShell icon="windows"
iex "& {$(irm https://bun.com/install.ps1)} -Version 1.3.1"
iex "& {$(irm https://bun.com/install.ps1)} -Version 1.3.2"
```
</Tab>

70
docs/pm/cli/info.mdx Normal file
View File

@@ -0,0 +1,70 @@
---
title: "bun info"
description: "Display package metadata from the npm registry"
---
`bun info` displays package metadata from the npm registry.
## Usage
```bash terminal icon="terminal"
bun info react
```
This will display information about the `react` package, including its latest version, description, homepage, dependencies, and more.
## Viewing specific versions
To view information about a specific version:
```bash terminal icon="terminal"
bun info react@18.0.0
```
## Viewing specific properties
You can also query specific properties from the package metadata:
```bash terminal icon="terminal"
bun info react version
bun info react dependencies
bun info react repository.url
```
## JSON output
To get the output in JSON format, use the `--json` flag:
```bash terminal icon="terminal"
bun info react --json
```
## Alias
`bun pm view` is an alias for `bun info`:
```bash terminal icon="terminal"
bun pm view react # equivalent to: bun info react
```
## Examples
```bash terminal icon="terminal"
# View basic package information
bun info is-number
# View a specific version
bun info is-number@7.0.0
# View all available versions
bun info is-number versions
# View package dependencies
bun info express dependencies
# View package homepage
bun info lodash homepage
# Get JSON output
bun info react --json
```

View File

@@ -88,6 +88,13 @@ Lifecycle scripts will run in parallel during installation. To adjust the maximu
bun install --concurrent-scripts 5
```
Bun automatically optimizes postinstall scripts for popular packages (like `esbuild`, `sharp`, etc.) by determining which scripts need to run. To disable these optimizations:
```bash terminal icon="terminal"
BUN_FEATURE_FLAG_DISABLE_NATIVE_DEPENDENCY_LINKER=1 bun install
BUN_FEATURE_FLAG_DISABLE_IGNORE_SCRIPTS=1 bun install
```
---
## Workspaces
@@ -127,14 +134,14 @@ For more information on filtering with `bun install`, refer to [Package Manager
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](/pm/overrides) for complete documentation.
{/* prettier-ignore */}
```json package.json file="file-json"
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
"overrides": {
// [!code ++]
"overrides": { // [!code ++]
"bar": "~4.4.0" // [!code ++]
} // [!code ++]
}
@@ -231,7 +238,7 @@ Bun supports installing dependencies from Git, GitHub, and local or remotely-hos
Bun supports two package installation strategies that determine how dependencies are organized in `node_modules`:
### Hoisted installs (default for single projects)
### Hoisted installs
The traditional npm/Yarn approach that flattens dependencies into a shared `node_modules` directory:
@@ -249,7 +256,15 @@ bun install --linker isolated
Isolated installs create a central package store in `node_modules/.bun/` with symlinks in the top-level `node_modules`. This ensures packages can only access their declared dependencies.
For complete documentation on isolated installs, refer to [Package manager > Isolated installs](/pm/isolated-installs).
### Default strategy
The default linker strategy depends on whether you're starting fresh or have an existing project:
- **New workspaces/monorepos**: `isolated` (prevents phantom dependencies)
- **New single-package projects**: `hoisted` (traditional npm behavior)
- **Existing projects (made pre-v1.3.2)**: `hoisted` (preserves backward compatibility)
The default is controlled by a `configVersion` field in your lockfile. For a detailed explanation, see [Package manager > Isolated installs](/pm/isolated-installs).
---
@@ -289,7 +304,16 @@ For more advanced security scanning, including integration with services & custo
## Configuration
The default behavior of `bun install` can be configured in `bunfig.toml`. The default values are shown below.
### Configuring `bun install` with `bunfig.toml`
`bunfig.toml` is searched for in the following paths on `bun install`, `bun remove`, and `bun add`:
1. `$XDG_CONFIG_HOME/.bunfig.toml` or `$HOME/.bunfig.toml`
2. `./bunfig.toml`
If both are found, the results are merged together.
Configuring with `bunfig.toml` is optional. Bun tries to be zero configuration in general, but that's not always possible. The default behavior of `bun install` can be configured in `bunfig.toml`. The default values are shown below.
```toml bunfig.toml icon="settings"
[install]
@@ -319,8 +343,9 @@ dryRun = false
concurrentScripts = 16 # (cpu count or GOMAXPROCS) x2
# installation strategy: "hoisted" or "isolated"
# default: "hoisted" (for single-project projects)
# default: "isolated" (for monorepo projects)
# default depends on lockfile configVersion and workspaces:
# - configVersion = 1: "isolated" if using workspaces, otherwise "hoisted"
# - configVersion = 0: "hoisted"
linker = "hoisted"
@@ -329,7 +354,29 @@ minimumReleaseAge = 259200 # seconds
minimumReleaseAgeExcludes = ["@types/node", "typescript"]
```
---
### Configuring with environment variables
Environment variables have a higher priority than `bunfig.toml`.
| Name | Description |
| ---------------------------------- | ------------------------------------------------------------- |
| `BUN_CONFIG_REGISTRY` | Set an npm registry (default: https://registry.npmjs.org) |
| `BUN_CONFIG_TOKEN` | Set an auth token (currently does nothing) |
| `BUN_CONFIG_YARN_LOCKFILE` | Save a Yarn v1-style yarn.lock |
| `BUN_CONFIG_LINK_NATIVE_BINS` | Point `bin` in package.json to a platform-specific dependency |
| `BUN_CONFIG_SKIP_SAVE_LOCKFILE`    | Don't save a lockfile                                          |
| `BUN_CONFIG_SKIP_LOAD_LOCKFILE`    | Don't load a lockfile                                          |
| `BUN_CONFIG_SKIP_INSTALL_PACKAGES` | Don't install any packages                                     |
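For example, a one-off install can point at a different registry and skip writing the lockfile. The exact values below are illustrative:
```bash
BUN_CONFIG_REGISTRY="https://registry.npmjs.org" BUN_CONFIG_SKIP_SAVE_LOCKFILE=1 bun install
```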
Bun always tries to use the fastest available installation method for the target platform. On macOS, that's `clonefile` and on Linux, that's `hardlink`. You can change which installation method is used with the `--backend` flag. When unavailable or on error, `clonefile` and `hardlink` fall back to a platform-specific implementation of copying files.
Bun stores installed packages from npm in `~/.bun/install/cache/${name}@${version}`. Note that if the semver version has a `build` or a `pre` tag, it is replaced with a hash of that value instead. This is to reduce the chances of errors from long file paths, but unfortunately complicates figuring out where a package was installed on disk.
When the `node_modules` folder exists, before installing, Bun checks if the `"name"` and `"version"` in `package/package.json` in the expected node_modules folder matches the expected `name` and `version`. This is how it determines whether it should install. It uses a custom JSON parser which stops parsing as soon as it finds `"name"` and `"version"`.
When a `bun.lock` doesn't exist or `package.json` has changed dependencies, tarballs are downloaded & extracted eagerly while resolving.
When a `bun.lock` exists and `package.json` hasn't changed, Bun downloads missing dependencies lazily. If the package with a matching `name` & `version` already exists in the expected location within `node_modules`, Bun won't attempt to download the tarball.
## CI/CD
@@ -379,6 +426,94 @@ jobs:
run: bun run build
```
## Platform-specific dependencies?
Bun stores normalized `cpu` and `os` values from npm in the lockfile, along with the resolved packages. It skips downloading, extracting, and installing packages disabled for the current target at runtime. This means the lockfile won't change between platforms/architectures even if the packages ultimately installed do change.
### `--cpu` and `--os` flags
You can override the target platform for package selection:
```bash
bun install --cpu=x64 --os=linux
```
This installs packages for the specified platform instead of the current system. Useful for cross-platform builds or when preparing deployments for different environments.
**Accepted values for `--cpu`**: `arm64`, `x64`, `ia32`, `ppc64`, `s390x`
**Accepted values for `--os`**: `linux`, `darwin`, `win32`, `freebsd`, `openbsd`, `sunos`, `aix`
## Peer dependencies?
Peer dependencies are handled similarly to yarn. `bun install` will automatically install peer dependencies. If the dependency is marked optional in `peerDependenciesMeta`, an existing dependency will be chosen if possible.
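For reference, a sketch of marking a peer dependency as optional in `package.json` (the package name and range here are illustrative):
```json package.json icon="file-json"
{
  "peerDependencies": {
    "react": "^18.0.0"
  },
  "peerDependenciesMeta": {
    "react": {
      "optional": true
    }
  }
}
```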
## Lockfile
`bun.lock` is Bun's lockfile format. See [our blogpost about the text lockfile](https://bun.com/blog/bun-lock-text-lockfile).
Prior to Bun 1.2, the lockfile was binary and called `bun.lockb`. Old lockfiles can be upgraded to the new format by running `bun install --save-text-lockfile --frozen-lockfile --lockfile-only`, and then deleting `bun.lockb`.
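Putting that together, a migration from the old binary lockfile looks like:
```bash
bun install --save-text-lockfile --frozen-lockfile --lockfile-only
rm bun.lockb
```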
## Cache
To delete the cache:
```bash
bun pm cache rm
# or
rm -rf ~/.bun/install/cache
```
## Platform-specific backends
`bun install` uses different system calls to install dependencies depending on the platform. This is a performance optimization. You can force a specific backend with the `--backend` flag.
**`hardlink`** is the default backend on Linux. Benchmarking showed it to be the fastest on Linux.
```bash
rm -rf node_modules
bun install --backend hardlink
```
**`clonefile`** is the default backend on macOS. Benchmarking showed it to be the fastest on macOS. It is only available on macOS.
```bash
rm -rf node_modules
bun install --backend clonefile
```
**`clonefile_each_dir`** is similar to `clonefile`, except it clones each file individually per directory. It is only available on macOS and tends to perform slower than `clonefile`. Unlike `clonefile`, this does not recursively clone subdirectories in one system call.
```bash
rm -rf node_modules
bun install --backend clonefile_each_dir
```
**`copyfile`** is the fallback used when any of the above fail, and is the slowest. On macOS, it uses `fcopyfile()` and on Linux it uses `copy_file_range()`.
```bash
rm -rf node_modules
bun install --backend copyfile
```
**`symlink`** is typically only used for `file:` dependencies (and eventually `link:`) internally. To prevent infinite loops, it skips symlinking the `node_modules` folder.
If you install with `--backend=symlink`, Node.js won't resolve node_modules of dependencies unless each dependency has its own node_modules folder or you pass `--preserve-symlinks` to `node` or `bun`. See [Node.js documentation on `--preserve-symlinks`](https://nodejs.org/api/cli.html#--preserve-symlinks).
```bash
rm -rf node_modules
bun install --backend symlink
bun --preserve-symlinks ./my-file.js
node --preserve-symlinks ./my-file.js # https://nodejs.org/api/cli.html#--preserve-symlinks
```
## npm registry metadata
Bun uses a binary format for caching npm registry responses. This loads much faster than JSON and tends to be smaller on disk.
You will see these files in `~/.bun/install/cache/*.npm`. The filename pattern is `${hash(packageName)}.npm`. It's a hash so that extra directories don't need to be created for scoped packages.
Bun's usage of `Cache-Control` ignores `Age`. This improves performance, but means Bun may be up to about 5 minutes behind the latest package version metadata from npm.
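To see what is cached locally, list the metadata files at the path described above:
```bash
ls ~/.bun/install/cache/*.npm
```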
## pnpm migration
Bun automatically migrates projects from pnpm to bun. When a `pnpm-lock.yaml` file is detected and no `bun.lock` file exists, Bun will automatically migrate the lockfile to `bun.lock` during installation. The original `pnpm-lock.yaml` file remains unmodified.
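No extra steps are required; a minimal sketch of the migration flow:
```bash
# In a project with pnpm-lock.yaml and no bun.lock:
bun install
# bun.lock is written; pnpm-lock.yaml is left unmodified
```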

View File

@@ -43,6 +43,19 @@ In addition, the `--save` flag can be used to add `cool-pkg` to the `dependencie
}
```
## Unlinking
Use `bun unlink` in the root directory to unregister a local package.
```bash terminal icon="terminal"
cd /path/to/cool-pkg
bun unlink
```
```txt
bun unlink v1.x (7416672e)
```
---
<Link />

View File

@@ -115,6 +115,8 @@ To print a list of installed dependencies in the current project and their resol
```bash terminal icon="terminal"
bun pm ls
# or
bun list
```
```txt
@@ -130,6 +132,8 @@ To print all installed dependencies, including nth-order dependencies.
```bash terminal icon="terminal"
bun pm ls --all
# or
bun list --all
```
```txt
@@ -244,7 +248,7 @@ bun pm version
```
```txt
bun pm version v1.3.1 (ca7428e9)
bun pm version v1.3.2 (ca7428e9)
Current package version: v1.0.0
Increment:
@@ -299,7 +303,7 @@ scripts[test:watch] # bracket for special chars
Examples:
```bash terminal icon="terminal"
# set
# get
bun pm pkg get name # single property
bun pm pkg get name version # multiple properties
bun pm pkg get # entire package.json

View File

@@ -13,7 +13,7 @@ bun publish
```
```txt
bun publish v1.3.1 (ca7428e9)
bun publish v1.3.2 (ca7428e9)
packed 203B package.json
packed 224B README.md
@@ -89,6 +89,14 @@ The `--dry-run` flag can be used to simulate the publish process without actuall
bun publish --dry-run
```
### `--tolerate-republish`
Exit with code 0 instead of 1 if the package version already exists. Useful in CI/CD where jobs may be re-run.
```sh terminal icon="terminal"
bun publish --tolerate-republish
```
### `--gzip-level`
Specify the level of gzip compression to use when packing the package. Only applies to `bun publish` without a tarball path argument. Values range from `0` to `9` (default is `9`).
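For example, to use a lower compression level (the value here is illustrative):
```sh terminal icon="terminal"
bun publish --gzip-level 6
```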

View File

@@ -5,7 +5,7 @@ description: "Strict dependency isolation similar to pnpm's approach"
Bun provides an alternative package installation strategy called **isolated installs** that creates strict dependency isolation similar to pnpm's approach. This mode prevents phantom dependencies and ensures reproducible, deterministic builds.
This is the default installation strategy for monorepo projects.
This is the default installation strategy for **new** workspace/monorepo projects (with `configVersion = 1` in the lockfile). Existing projects continue using hoisted installs unless explicitly configured.
## What are isolated installs?
@@ -43,8 +43,23 @@ linker = "isolated"
### Default behavior
- For monorepo projects, Bun uses the **isolated** installation strategy by default.
- For single-project projects, Bun uses the **hoisted** installation strategy by default.
The default linker strategy depends on your project's lockfile `configVersion`:
| `configVersion` | Using workspaces? | Default Linker |
| --------------- | ----------------- | -------------- |
| `1` | ✅ | `isolated` |
| `1` | ❌ | `hoisted` |
| `0` | ✅ | `hoisted` |
| `0` | ❌ | `hoisted` |
**New projects**: Default to `configVersion = 1`. In workspaces, v1 uses the isolated linker by default; otherwise it uses hoisted linking.
**Existing Bun projects (made pre-v1.3.2)**: If your existing lockfile doesn't have a version yet, Bun sets `configVersion = 0` when you run `bun install`, preserving the previous hoisted linker default.
**Migrations from other package managers**:
- From pnpm: `configVersion = 1` (using isolated installs in workspaces)
- From npm or yarn: `configVersion = 0` (using hoisted installs)
You can override the default behavior by explicitly specifying the `--linker` flag or setting it in your configuration file.
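For example, to force the hoisted strategy regardless of `configVersion`:
```bash terminal icon="terminal"
bun install --linker hoisted
```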

View File

@@ -44,7 +44,7 @@ Instead of executing arbitrary scripts, Bun uses a "default-secure" approach. Yo
Once added to `trustedDependencies`, install/re-install the package. Bun will read this field and run lifecycle scripts for `my-trusted-package`.
As of Bun v1.0.16, the top 500 npm packages with lifecycle scripts are allowed by default. You can see the full list [here](https://github.com/oven-sh/bun/blob/main/src/install/default-trusted-dependencies.txt).
The top 500 npm packages with lifecycle scripts are allowed by default. You can see the full list [here](https://github.com/oven-sh/bun/blob/main/src/install/default-trusted-dependencies.txt).
---

View File

@@ -5,14 +5,14 @@ description: "Control metadependency versions with npm overrides and Yarn resolu
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies.
{/* prettier-ignore */}
```json package.json icon="file-json"
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
"overrides": {
// [!code ++]
"overrides": { // [!code ++]
"bar": "~4.4.0" // [!code ++]
} // [!code ++]
}
@@ -50,14 +50,14 @@ Add `bar` to the `"overrides"` field in `package.json`. Bun will defer to the sp
overrides](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#overrides) are not supported.
</Note>
{/* prettier-ignore */}
```json package.json icon="file-json"
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
"overrides": {
// [!code ++]
"overrides": { // [!code ++]
"bar": "~4.4.0" // [!code ++]
} // [!code ++]
}
@@ -69,14 +69,14 @@ The syntax is similar for `"resolutions"`, which is Yarn's alternative to `"over
As with `"overrides"`, _nested resolutions_ are not currently supported.
{/* prettier-ignore */}
```json package.json icon="file-json"
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
"resolutions": {
// [!code ++]
"resolutions": { // [!code ++]
"bar": "~4.4.0" // [!code ++]
} // [!code ++]
}

View File

@@ -7,8 +7,7 @@ Bun supports [`workspaces`](https://docs.npmjs.com/cli/v9/using-npm/workspaces?v
It's common for a monorepo to have the following structure:
```
tree
```txt File Tree icon="folder-tree"
<root>
├── README.md
├── bun.lock
@@ -31,7 +30,7 @@ tree
In the root `package.json`, the `"workspaces"` key is used to indicate which subdirectories should be considered packages/workspaces within the monorepo. It's conventional to place all the workspaces in a directory called `packages`.
```json
```json package.json icon="file-json"
{
"name": "my-project",
"version": "1.0.0",
@@ -43,14 +42,22 @@ In the root `package.json`, the `"workspaces"` key is used to indicate which sub
```
<Note>
**Glob support** — Bun supports full glob syntax in `"workspaces"` (see [here](/runtime/glob#supported-glob-patterns)
for a comprehensive list of supported syntax), _except_ for exclusions (e.g. `!**/excluded/**`), which are not
implemented yet.
**Glob support** — Bun supports full glob syntax in `"workspaces"`, including negative patterns (e.g.
`!**/excluded/**`). See [here](https://bun.com/docs/api/glob#supported-glob-patterns) for a comprehensive list of
supported syntax.
</Note>
```json package.json icon="file-json"
{
"name": "my-project",
"version": "1.0.0",
"workspaces": ["packages/**", "!packages/**/test/**", "!packages/**/template/**"]
}
```
Each workspace has its own `package.json`. When referencing other packages in the monorepo, semver or workspace protocols (e.g. `workspace:*`) can be used as the version field in your `package.json`.
```json
```json packages/pkg-a/package.json icon="file-json"
{
"name": "pkg-a",
"version": "1.0.0",

View File

@@ -216,3 +216,26 @@ numa nodes: 1
elapsed: 0.068 s
process: user: 0.061 s, system: 0.014 s, faults: 0, rss: 57.4 MiB, commit: 64.0 MiB
```
## CPU profiling
Profile JavaScript execution to identify performance bottlenecks with the `--cpu-prof` flag.
```sh terminal icon="terminal"
bun --cpu-prof script.js
```
This generates a `.cpuprofile` file you can open in Chrome DevTools (Performance tab → Load profile) or VS Code's CPU profiler.
### Options
```sh terminal icon="terminal"
bun --cpu-prof --cpu-prof-name my-profile.cpuprofile script.js
bun --cpu-prof --cpu-prof-dir ./profiles script.js
```
| Flag | Description |
| ---------------------------- | -------------------- |
| `--cpu-prof` | Enable profiling |
| `--cpu-prof-name <filename>` | Set output filename |
| `--cpu-prof-dir <dir>` | Set output directory |

View File

@@ -7,26 +7,40 @@ Configuring a development environment for Bun can take 10-30 minutes depending o
If you are using Windows, please refer to [this guide](/project/building-windows)
## Install Dependencies
## Using Nix (Alternative)
A Nix flake is provided as an alternative to manual dependency installation:
```bash
nix develop
# or explicitly use the pure shell
# nix develop .#pure
export CMAKE_SYSTEM_PROCESSOR=$(uname -m)
bun bd
```
This provides all dependencies in an isolated, reproducible environment without requiring sudo.
## Install Dependencies (Manual)
Using your system's package manager, install Bun's dependencies:
<CodeGroup>
```bash macOS (Homebrew)
$ brew install automake ccache cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby
$ brew install automake cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby sccache
```
```bash Ubuntu/Debian
$ sudo apt install curl wget lsb-release software-properties-common cargo ccache cmake git golang libtool ninja-build pkg-config rustc ruby-full xz-utils
$ sudo apt install curl wget lsb-release software-properties-common cargo cmake git golang libtool ninja-build pkg-config rustc ruby-full xz-utils
```
```bash Arch
$ sudo pacman -S base-devel ccache cmake git go libiconv libtool make ninja pkg-config python rust sed unzip ruby
$ sudo pacman -S base-devel cmake git go libiconv libtool make ninja pkg-config python rust sed unzip ruby
```
```bash Fedora
$ sudo dnf install cargo ccache cmake git golang libtool ninja-build pkg-config rustc ruby libatomic-static libstdc++-static sed unzip which libicu-devel 'perl(Math::BigInt)'
$ sudo dnf install cargo clang19 llvm19 lld19 cmake git golang libtool ninja-build pkg-config rustc ruby libatomic-static libstdc++-static sed unzip which libicu-devel 'perl(Math::BigInt)'
```
```bash openSUSE Tumbleweed
@@ -56,6 +70,46 @@ $ brew install bun
</CodeGroup>
### Optional: Install `sccache`
sccache is used to cache compilation artifacts, significantly speeding up builds. It must be installed with S3 support:
```bash
# For macOS
$ brew install sccache
# For Linux. Note that the version in your package manager may not have S3 support.
$ cargo install sccache --features=s3
```
This will install `sccache` with S3 support. Our build scripts will automatically detect and use `sccache` with our shared S3 cache. **Note**: Not all versions of `sccache` are compiled with S3 support, hence we recommend installing it via `cargo`.
#### Registering AWS Credentials for `sccache` (Core Developers Only)
Core developers have write access to the shared S3 cache. To enable write access, you must log in with AWS credentials. The easiest way to do this is to use the [`aws` CLI](https://aws.amazon.com/cli/) and invoke [`aws configure` to provide your AWS security info](https://docs.aws.amazon.com/cli/latest/reference/configure/).
The `cmake` scripts should automatically detect your AWS credentials from the environment or the `~/.aws/credentials` file.
<details>
<summary>Logging in to the `aws` CLI</summary>
1. Install the AWS CLI by following [the official guide](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).
2. Log in to your AWS account console. A team member should provide you with your credentials.
3. Click your name in the top right > Security credentials.
4. Scroll to "Access keys" and create a new access key.
5. Run `aws configure` in your terminal and provide the access key ID and secret access key when prompted.
</details>
<details>
<summary>Common Issues You May Encounter</summary>
- To confirm that the cache is being used, you can use the `sccache --show-stats` command right after a build. This will expose very useful statistics, including cache hits/misses.
- If you have multiple AWS profiles configured, ensure that the correct profile is set in the `AWS_PROFILE` environment variable.
- `sccache` follows a server-client model. If you run into weird issues where `sccache` refuses to use S3, even though you have AWS credentials configured, try killing any running `sccache` servers with `sccache --stop-server` and then re-running the build.
</details>
## Install LLVM
Bun requires LLVM 19 (`clang` is part of LLVM). This version requirement is to match WebKit (precompiled), as mismatching versions will cause memory allocation failures at runtime. In most cases, you can install LLVM through your system package manager:
@@ -156,7 +210,7 @@ Bun generally takes about 2.5 minutes to compile a debug build when there are Zi
- Batch up your changes
- Ensure zls is running with incremental watching for LSP errors (if you use VSCode and install Zig and run `bun run build` once to download Zig, this should just work)
- Prefer using the debugger ("CodeLLDB" in VSCode) to step through the code.
- Use debug logs. `BUN_DEBUG_<scope>=1` will enable debug logging for the corresponding `Output.scoped(.<scope>, false)` logs. You can also set `BUN_DEBUG_QUIET_LOGS=1` to disable all debug logging that isn't explicitly enabled. To dump debug lgos into a file, `BUN_DEBUG=<path-to-file>.log`. Debug logs are aggressively removed in release builds.
- Use debug logs. `BUN_DEBUG_<scope>=1` will enable debug logging for the corresponding `Output.scoped(.<scope>, .hidden)` logs. You can also set `BUN_DEBUG_QUIET_LOGS=1` to disable all debug logging that isn't explicitly enabled. To dump debug logs into a file, `BUN_DEBUG=<path-to-file>.log`. Debug logs are aggressively removed in release builds.
- src/js/\*\*.ts changes are pretty much instant to rebuild. C++ changes are a bit slower, but still much faster than the Zig code (Zig is one compilation unit, C++ is many).
## Code generation scripts
@@ -327,15 +381,6 @@ bun run build -DUSE_STATIC_LIBATOMIC=OFF
The built version of Bun may not work on other systems if compiled this way.
### ccache conflicts with building TinyCC on macOS
If you run into issues with `ccache` when building TinyCC, try reinstalling ccache
```bash
brew uninstall ccache
brew install ccache
```
## Using bun-debug
- Disable logging: `BUN_DEBUG_QUIET_LOGS=1 bun-debug ...` (to disable all debug logging)

View File

@@ -12,198 +12,204 @@ Build a minimal HTTP server with `Bun.serve`, run it locally, then evolve it by
---
<Steps>
<Step title="Step 1">
Initialize a new project with `bun init`.
<Step title="Step 1">
```bash terminal icon="terminal"
bun init my-app
```
Initialize a new project with `bun init`.
It'll prompt you to pick a template, either `Blank`, `React`, or `Library`. For this guide, we'll pick `Blank`.
```bash terminal icon="terminal"
bun init my-app
```
```bash terminal icon="terminal"
bun init my-app
```
```txt
✓ Select a project template: Blank
It'll prompt you to pick a template, either `Blank`, `React`, or `Library`. For this guide, we'll pick `Blank`.
- .gitignore
- CLAUDE.md
- .cursor/rules/use-bun-instead-of-node-vite-npm-pnpm.mdc -> CLAUDE.md
- index.ts
- tsconfig.json (for editor autocomplete)
- README.md
```bash terminal icon="terminal"
bun init my-app
```
```txt
✓ Select a project template: Blank
````
- .gitignore
- CLAUDE.md
- .cursor/rules/use-bun-instead-of-node-vite-npm-pnpm.mdc -> CLAUDE.md
- index.ts
- tsconfig.json (for editor autocomplete)
- README.md
```
This automatically creates a `my-app` directory with a basic Bun app.
</Step>
<Step title="Step 2">
This automatically creates a `my-app` directory with a basic Bun app.
Run the `index.ts` file using `bun run index.ts`.
</Step>
<Step title="Step 2">
```bash terminal icon="terminal"
cd my-app
bun run index.ts
```
```txt
Hello via Bun!
```
Run the `index.ts` file using `bun run index.ts`.
You should see a console output saying `"Hello via Bun!"`.
</Step>
<Step title="Step 3">
Replace the contents of `index.ts` with the following code:
```bash terminal icon="terminal"
cd my-app
bun run index.ts
```
```txt
Hello via Bun!
```
```ts index.ts icon="/icons/typescript.svg"
const server = Bun.serve({
port: 3000,
routes: {
"/": () => new Response('Bun!'),
}
});
You should see a console output saying `"Hello via Bun!"`.
console.log(`Listening on ${server.url}`);
```
</Step>
<Step title="Step 3">
Run the `index.ts` file again using `bun run index.ts`.
Replace the contents of `index.ts` with the following code:
```bash terminal icon="terminal"
bun run index.ts
```
```txt
Listening on http://localhost:3000
```
```ts index.ts icon="/icons/typescript.svg"
const server = Bun.serve({
port: 3000,
routes: {
"/": () => new Response('Bun!'),
}
});
Visit [`http://localhost:3000`](http://localhost:3000) to test the server. You should see a simple page that says `"Bun!"`.
console.log(`Listening on ${server.url}`);
```
Run the `index.ts` file again using `bun run index.ts`.
<Accordion title="Seeing TypeScript errors on Bun?">
If you used `bun init`, Bun will have automatically installed Bun's TypeScript declarations and configured your `tsconfig.json`. If you're trying out Bun in an existing project, you may see a type error on the `Bun` global.
```bash terminal icon="terminal"
bun run index.ts
```
```txt
Listening on http://localhost:3000
```
To fix this, first install `@types/bun` as a dev dependency.
Visit [`http://localhost:3000`](http://localhost:3000) to test the server. You should see a simple page that says `"Bun!"`.
```bash terminal icon="terminal"
bun add -d @types/bun
```
<Accordion title="Seeing TypeScript errors on Bun?">
Then add the following to your `compilerOptions` in `tsconfig.json`:
If you used `bun init`, Bun will have automatically installed Bun's TypeScript declarations and configured your `tsconfig.json`. If you're trying out Bun in an existing project, you may see a type error on the `Bun` global.
```json tsconfig.json icon="file-code"
{
"compilerOptions": {
"lib": ["ESNext"],
"target": "ESNext",
"module": "Preserve",
"moduleDetection": "force",
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"verbatimModuleSyntax": true,
"noEmit": true
}
}
```
To fix this, first install `@types/bun` as a dev dependency.
</Accordion>
```bash terminal icon="terminal"
bun add -d @types/bun
```
</Step>
<Step title="Step 4">
Install the `figlet` package and its type declarations. Figlet is a utility for converting strings into ASCII art.
Then add the following to your `compilerOptions` in `tsconfig.json`:
```bash terminal icon="terminal"
bun add figlet
bun add -d @types/figlet # TypeScript users only
```
```json tsconfig.json icon="file-code"
{
"compilerOptions": {
"lib": ["ESNext"],
"target": "ESNext",
"module": "Preserve",
"moduleDetection": "force",
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"verbatimModuleSyntax": true,
"noEmit": true
}
}
```
Update `index.ts` to use `figlet` in `routes`.
</Accordion>
```ts index.ts icon="/icons/typescript.svg"
import figlet from 'figlet'; // [!code ++]
</Step>
<Step title="Step 4">
const server = Bun.serve({
port: 3000,
routes: {
"/": () => new Response('Bun!'),
"/figlet": () => { // [!code ++]
const body = figlet.textSync('Bun!'); // [!code ++]
return new Response(body); // [!code ++]
} // [!code ++]
}
});
Install the `figlet` package and its type declarations. Figlet is a utility for converting strings into ASCII art.
console.log(`Listening on ${server.url}`);
```
```bash terminal icon="terminal"
bun add figlet
bun add -d @types/figlet # TypeScript users only
```
Run the `index.ts` file again using `bun run index.ts`.
Update `index.ts` to use `figlet` in `routes`.
```bash terminal icon="terminal"
bun run index.ts
```
```txt
Listening on http://localhost:3000
```
```ts index.ts icon="/icons/typescript.svg"
import figlet from 'figlet'; // [!code ++]
Visit [`http://localhost:3000/figlet`](http://localhost:3000/figlet) to test the server. You should see a simple page that says `"Bun!"` in ASCII art.
const server = Bun.serve({
port: 3000,
routes: {
"/": () => new Response('Bun!'),
"/figlet": () => { // [!code ++]
const body = figlet.textSync('Bun!'); // [!code ++]
return new Response(body); // [!code ++]
} // [!code ++]
}
});
```txt
____ _
| __ ) _ _ _ __ | |
| _ \| | | | '_ \| |
| |_) | |_| | | | |_|
|____/ \__,_|_| |_(_)
```
</Step>
<Step title="Step 5">
Let's add some HTML. Create a new file called `index.html` and add the following code:
console.log(`Listening on ${server.url}`);
```
```html index.html icon="file-code"
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Bun</title>
</head>
<body>
<h1>Bun!</h1>
</body>
</html>
```
Run the `index.ts` file again using `bun run index.ts`.
Then, import this file in `index.ts` and serve it from the root `/` route.
```bash terminal icon="terminal"
bun run index.ts
```
```txt
Listening on http://localhost:3000
```
```ts index.ts icon="/icons/typescript.svg"
import figlet from 'figlet';
import index from './index.html'; // [!code ++]
Visit [`http://localhost:3000/figlet`](http://localhost:3000/figlet) to test the server. You should see a simple page that says `"Bun!"` in ASCII art.
const server = Bun.serve({
port: 3000,
routes: {
"/": index, // [!code ++]
"/figlet": () => {
const body = figlet.textSync('Bun!');
return new Response(body);
}
}
});
```txt
____ _
| __ ) _ _ _ __ | |
| _ \| | | | '_ \| |
| |_) | |_| | | | |_|
|____/ \__,_|_| |_(_)
```
console.log(`Listening on ${server.url}`);
```
</Step>
<Step title="Step 5">
Run the `index.ts` file again using `bun run index.ts`.
Let's add some HTML. Create a new file called `index.html` and add the following code:
```bash terminal icon="terminal"
bun run index.ts
```
```txt
Listening on http://localhost:3000
```
```html index.html icon="file-code"
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Bun</title>
</head>
<body>
<h1>Bun!</h1>
</body>
</html>
```
Visit [`http://localhost:3000`](http://localhost:3000) to test the server. You should see the static HTML page.
</Step>
Then, import this file in `index.ts` and serve it from the root `/` route.
</Steps>
````
```ts index.ts icon="/icons/typescript.svg"
import figlet from 'figlet';
import index from './index.html'; // [!code ++]
const server = Bun.serve({
port: 3000,
routes: {
"/": index, // [!code ++]
"/figlet": () => {
const body = figlet.textSync('Bun!');
return new Response(body);
}
}
});
console.log(`Listening on ${server.url}`);
```
Run the `index.ts` file again using `bun run index.ts`.
```bash terminal icon="terminal"
bun run index.ts
```
```txt
Listening on http://localhost:3000
```
Visit [`http://localhost:3000`](http://localhost:3000) to test the server. You should see the static HTML page.
</Step>
</Steps>
🎉 Congratulations! You've built a simple HTTP server with Bun and installed a package.
@@ -213,16 +219,21 @@ Build a minimal HTTP server with `Bun.serve`, run it locally, then evolve it by
Bun can also execute `"scripts"` from your `package.json`. Add the following script:
{/* prettier-ignore */}
```json package.json icon="file-code"
{
"name": "quickstart",
"module": "index.ts",
"type": "module",
"scripts": {
"start": "bun run index.ts"
},
"private": true,
"scripts": { // [!code ++]
"start": "bun run index.ts" // [!code ++]
}, // [!code ++]
"devDependencies": {
"@types/bun": "latest"
},
"peerDependencies": {
"typescript": "^5"
}
}
```
@@ -234,7 +245,7 @@ bun run start
```
```txt
Listening on http://localhost:3000
Listening on http://localhost:3000
```
<Note>⚡️ **Performance** — `bun run` is roughly 28x faster than `npm run` (6ms vs 170ms of overhead).</Note>

View File

@@ -276,6 +276,58 @@ This is useful for catching flaky tests or non-deterministic behavior. Each test
The `--rerun-each` CLI flag will override this setting when specified.
### `test.concurrentTestGlob`
Specify a glob pattern to automatically run matching test files with concurrent test execution enabled. Test files matching this pattern will behave as if the `--concurrent` flag was passed, running all tests within those files concurrently.
```toml title="bunfig.toml" icon="settings"
[test]
concurrentTestGlob = "**/concurrent-*.test.ts"
```
This is useful for:
- Gradually migrating test suites to concurrent execution
- Running integration tests concurrently while keeping unit tests sequential
- Separating fast concurrent tests from tests that require sequential execution
The `--concurrent` CLI flag will override this setting when specified.
### `test.onlyFailures`
When enabled, only failed tests are displayed in the output. This helps reduce noise in large test suites by hiding passing tests. Default `false`.
```toml title="bunfig.toml" icon="settings"
[test]
onlyFailures = true
```
This is equivalent to using the `--only-failures` flag when running `bun test`.
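The CLI equivalent, for comparison:
```bash terminal icon="terminal"
bun test --only-failures
```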
### `test.reporter`
Configure the test reporter settings.
#### `test.reporter.dots`
Enable the dots reporter, which displays a compact output showing a dot for each test. Default `false`.
```toml title="bunfig.toml" icon="settings"
[test.reporter]
dots = true
```
#### `test.reporter.junit`
Enable JUnit XML reporting and specify the output file path.
```toml title="bunfig.toml" icon="settings"
[test.reporter]
junit = "test-results.xml"
```
This generates a JUnit XML report that can be consumed by CI systems and other tools.
## Package manager
Package management is a complex issue; to support a range of use cases, the behavior of `bun install` can be configured under the `[install]` section.
@@ -401,6 +453,7 @@ Environment variable: `BUN_INSTALL_BIN`
```toml title="bunfig.toml" icon="settings"
# where globally-installed package bins are linked
[install]
globalBinDir = "~/.bun/bin"
```
@@ -497,7 +550,7 @@ print = "yarn"
### `install.linker`
Configure the default linker strategy. Default `"hoisted"` for single-project projects, `"isolated"` for monorepo projects.
Configure the linker strategy for installing dependencies. Defaults to `"isolated"` for new workspaces, `"hoisted"` for new single-package projects and existing projects (made pre-v1.3.2).
For complete documentation refer to [Package manager > Isolated installs](/pm/isolated-installs).
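As a quick reference, the setting looks like this in `bunfig.toml`:
```toml title="bunfig.toml" icon="settings"
[install]
linker = "isolated"
```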
@@ -550,7 +603,7 @@ For more details see [Minimum release age](/pm/cli/install#minimum-release-age)
The `bun run` command can be configured under the `[run]` section. These apply to the `bun run` command and the `bun` command when running a file or executable or script.
Currently, `bunfig.toml` isn't always automatically loaded for `bun run` in a local project (it does check for a global `bunfig.toml`), so you might still need to pass `-c` or `-c=bunfig.toml` to use these settings.
Currently, `bunfig.toml` is only automatically loaded for `bun run` in a local project (it doesn't check for a global `.bunfig.toml`).
### `run.shell` - use the system shell or Bun's shell

View File

@@ -100,7 +100,7 @@ You can read results from the subprocess via the `stdout` and `stderr` propertie
```ts
const proc = Bun.spawn(["bun", "--version"]);
const text = await proc.stdout.text();
console.log(text); // => "1.3.1\n"
console.log(text); // => "1.3.2\n"
```
Configure the output stream by passing one of the following values to `stdout/stderr`:

View File

@@ -146,11 +146,11 @@ await fetch("https://example.com", {
```
```txt
[fetch] $ curl --http1.1 "https://example.com/" -X POST -H "content-type: application/json" -H "Connection: keep-alive" -H "User-Agent: Bun/1.3.1" -H "Accept: */*" -H "Host: example.com" -H "Accept-Encoding: gzip, deflate, br" --compressed -H "Content-Length: 13" --data-raw "{\"foo\":\"bar\"}"
[fetch] $ curl --http1.1 "https://example.com/" -X POST -H "content-type: application/json" -H "Connection: keep-alive" -H "User-Agent: Bun/1.3.2" -H "Accept: */*" -H "Host: example.com" -H "Accept-Encoding: gzip, deflate, br" --compressed -H "Content-Length: 13" --data-raw "{\"foo\":\"bar\"}"
[fetch] > HTTP/1.1 POST https://example.com/
[fetch] > content-type: application/json
[fetch] > Connection: keep-alive
[fetch] > User-Agent: Bun/1.3.1
[fetch] > User-Agent: Bun/1.3.2
[fetch] > Accept: */*
[fetch] > Host: example.com
[fetch] > Accept-Encoding: gzip, deflate, br
@@ -190,7 +190,7 @@ await fetch("https://example.com", {
[fetch] > HTTP/1.1 POST https://example.com/
[fetch] > content-type: application/json
[fetch] > Connection: keep-alive
[fetch] > User-Agent: Bun/1.3.1
[fetch] > User-Agent: Bun/1.3.2
[fetch] > Accept: */*
[fetch] > Host: example.com
[fetch] > Accept-Encoding: gzip, deflate, br

View File

@@ -5,14 +5,16 @@ description: "File types and loaders supported by Bun's bundler and runtime"
The Bun bundler implements a set of default loaders out of the box. As a rule of thumb, the bundler and the runtime both support the same set of file types out of the box.
`.js` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` `.jsx` `.toml` `.json` `.txt` `.wasm` `.node` `.html`
`.js` `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx` `.jsx` `.css` `.json` `.jsonc` `.toml` `.yaml` `.yml` `.txt` `.wasm` `.node` `.html` `.sh`
Bun uses the file extension to determine which built-in _loader_ should be used to parse the file. Every loader has a name, such as `js`, `tsx`, or `json`. These names are used when building [plugins](/bundler/plugins) that extend Bun with custom loaders.
You can explicitly specify which loader to use using the 'loader' import attribute.
You can explicitly specify which loader to use using the `'type'` import attribute.
```ts
import my_toml from "./my_file" with { loader: "toml" };
import my_toml from "./my_file" with { type: "toml" };
// or with dynamic imports
const { default: my_toml } = await import("./my_file", { with: { type: "toml" } });
```
---
@@ -84,6 +86,29 @@ export default {
</CodeGroup>
### `jsonc`
**JSON with Comments loader**. Default for `.jsonc`.
JSONC (JSON with Comments) files can be directly imported. Bun will parse them, stripping out comments and trailing commas.
```ts
import config from "./config.jsonc";
console.log(config);
```
During bundling, the parsed JSONC is inlined into the bundle as a JavaScript object, identical to the `json` loader.
```ts
var config = {
option: "value",
};
```
<Note>
Bun automatically uses the `jsonc` loader for `tsconfig.json`, `jsconfig.json`, `package.json`, and `bun.lock` files.
</Note>
### `toml`
**TOML loader**. Default for `.toml`.
@@ -128,6 +153,50 @@ export default {
</CodeGroup>
### `yaml`
**YAML loader**. Default for `.yaml` and `.yml`.
YAML files can be directly imported. Bun will parse them with its fast native YAML parser.
```ts
import config from "./config.yaml";
console.log(config);
// via import attribute:
import data from "./data.txt" with { type: "yaml" };
```
During bundling, the parsed YAML is inlined into the bundle as a JavaScript object.
```ts
var config = {
name: "my-app",
version: "1.0.0",
// ...other fields
};
```
If a `.yaml` or `.yml` file is passed as an entrypoint, it will be converted to a `.js` module that `export default`s the parsed object.
<CodeGroup>
```yaml Input
name: John Doe
age: 35
email: johndoe@example.com
```
```ts Output
export default {
name: "John Doe",
age: 35,
email: "johndoe@example.com",
};
```
</CodeGroup>
### `text`
**Text loader**. Default for `.txt`.
@@ -283,6 +352,18 @@ The `html` loader behaves differently depending on how it's used:
</Note>
### `css`
**CSS loader**. Default for `.css`.
CSS files can be directly imported. This is primarily useful for [full-stack applications](/bundler/html-static) where CSS is bundled alongside HTML.
```ts
import "./styles.css";
```
There isn't any value returned from the import; it's only used for side effects.
### `sh` loader
**Bun Shell loader**. Default for `.sh` files
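A minimal sketch, assuming a file named `hello.sh` that is executed with `bun hello.sh`:
```sh
# hello.sh, interpreted by Bun Shell when run with: bun hello.sh
echo "Hello from Bun Shell"
```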

Some files were not shown because too many files have changed in this diff.