This would happen sometimes because it was appending base64 strings to each other. You can't do that.
Tested locally and it fixes the bug. Not sure how to make a regression
test for this.
### What does this PR do?
Split subprocess into more files
### How did you verify your code works?
check
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
- Adds a GitHub Action that automatically applies the 'claude' label to PRs created by the robobun user
- Triggers on `pull_request` `opened` events
- Only runs for PRs created by the `robobun` user account
- Uses `github-script` action to add the label
## Test plan
- [x] Created the workflow file with proper permissions
- [ ] Test by creating a new PR with robobun user (will happen
automatically on next Claude PR)
- [ ] Verify the label gets applied automatically
This ensures all future Claude-generated PRs are properly labeled for
tracking and organization.
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
There was a regression in 1.2.5 where it stopped supporting lowercase variants of the crypto keys. This broke the `mailauth` lib and probably many more.
simple code:
```ts
import { sign, constants } from 'crypto';
const DUMMY_PRIVATE_KEY = `-----BEGIN PRIVATE KEY-----\r\nMIICeAIBADANBgkqhkiG9w0BAQEFAASCAmIwggJeAgEAAoGBAMx5bEJhDzwNBG1m\r\nmIYn/V1HMK9g8WTVaHym4F4iPcTdZ4RYUrMa/xOUwPMAfrOJdf3joSUFWBx3ZPdW\r\nhrvpqjmcmgoYDRJzZwVKJ1uqTko6Anm3gplWl6JP3nGOL9Vt5K5xAJWif5fHPfCx\r\nLA2p/SnJDNmcyOWURUCRVCDlZgJRAgMBAAECgYEAt8a+ZZ7EyY1NmGJo3dMdZnPw\r\nrwArlhw08CwwZorSB5mTS6Dym2W9MsU08nNUbVs0AIBRumtmOReaWK+dI1GtmsT+\r\n/5YOrE8aU9xcTgMzZjr9AjI9cSc5J9etqqTjUplKfC5Ay0WBhPlx66MPAcTsq/u/\r\nIdPYvhvgXuJm6X3oDP0CQQDllIopSYXW+EzfpsdTsY1dW+xKM90NA7hUFLbIExwc\r\nvL9dowJcNvPNtOOA8Zrt0guVz0jZU/wPYZhvAm2/ab93AkEA5AFCfcAXrfC2lnDe\r\n9G5x/DGaB5jAsQXi9xv+/QECyAN3wzSlQNAZO8MaNr2IUpKuqMfxl0sPJSsGjOMY\r\ne8aOdwJBAIM7U3aiVmU5bgfyN8J5ncsd/oWz+8mytK0rYgggFFPA+Mq3oWPA7cBK\r\nhDly4hLLnF+4K3Y/cbgBG7do9f8SnaUCQQCLvfXpqp0Yv4q4487SUwrLff8gns+i\r\n76+uslry5/azbeSuIIsUETcV+LsNR9bQfRRNX9ZDWv6aUid+nAU6f3R7AkAFoONM\r\nmr4hjSGiU1o91Duatf4tny1Hp/hw2VoZAb5zxAlMtMifDg4Aqg4XFgptST7IUzTN\r\nK3P7zdJ30gregvjI\r\n-----END PRIVATE KEY-----`;
sign('rsa-sha256', Buffer.from('message'), {
  key: DUMMY_PRIVATE_KEY,
  padding: constants.RSA_PKCS1_PSS_PADDING,
});
// would throw invalid digest
```
### How did you verify your code works?
Added a test.
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Add an owned pointer type—a wrapper around a pointer and an allocator.
`Owned(*Foo)` and `Owned([]Foo)` contain both the pointer/slice and the
allocator that was used to allocate it. Calling `deinit` on these types
first calls `Foo.deinit` and then frees the memory. This makes it easier
to remember to free the memory, and hard to accidentally free it with
the wrong allocator.
Optional pointers are also supported (`Owned(?*Foo)`, `Owned(?[]Foo)`),
and an unmanaged variant which doesn't store the allocator
(`Owned(*Foo).Unmanaged`) is available for cases where space efficiency
is a concern.
A `MaybeOwned` type is also provided for representing data that could be
owned or borrowed. If the data is owned, `MaybeOwned.deinit` works like
`Owned.deinit`; otherwise, it's a no-op.
(For internal tracking: fixes STAB-920, STAB-921)
## Summary
Optimizes the `--lockfile-only` flag to skip downloading **npm package
tarballs** since they're not needed for lockfile generation. This saves
bandwidth and improves performance for lockfile-only operations while
preserving accuracy for non-npm dependencies.
## Changes
- **Add `prefetch_resolved_tarballs` flag** to
`PackageManagerOptions.Do` struct (defaults to `true`)
- **Set flag to `false`** when `--lockfile-only` is used
- **Skip tarball downloads for npm packages only** when flag is
disabled:
- `getOrPutResolvedPackageWithFindResult` - Main npm package resolution
(uses `Task.Id.forNPMPackage`)
- `enqueuePackageForDownload` - NPM package downloads (uses
`bun.Semver.Version`)
- **Preserve tarball downloads for non-npm dependencies** to maintain
lockfile accuracy:
- Remote tarball URLs (needed for lockfile generation)
- GitHub dependencies (needed for lockfile generation)
- Generic tarball downloads (may be remote)
- Patch-related downloads (needed for patch application)
- **Add comprehensive test** that verifies only package manifests are
fetched for npm packages with `--lockfile-only`
## Rationale
Only npm registry packages can safely skip tarball downloads during
lockfile generation because:
✅ **NPM packages**: Metadata is available from registry manifests,
tarball not needed for lockfile
❌ **Remote URLs**: Need tarball content to determine package metadata
and generate accurate lockfile
❌ **GitHub deps**: Need tarball content to extract package.json and
determine dependencies
❌ **Tarball URIs**: Need content to determine package structure and
dependencies
This selective approach maximizes bandwidth savings while ensuring
lockfile accuracy.
## Test Plan
- ✅ New test in `test/cli/install/lockfile-only.test.ts` verifies only
npm manifest URLs are requested
- ✅ Uses absolute package versions to ensure the npm resolution code
path is hit
- ✅ Test output normalized to work with both debug and non-debug builds
- ✅ All existing install/update tests still pass (including remote
dependency tests)
## Performance Impact
For `--lockfile-only` operations with npm packages, this eliminates
unnecessary tarball downloads, reducing:
- **Network bandwidth usage** (manifests only, not tarballs)
- **Installation time** (no tarball extraction/processing)
- **Cache storage requirements** (tarballs not cached)
The optimization only affects npm packages in `--lockfile-only` mode and
has zero impact on:
- Regular installs (npm packages still download tarballs)
- Remote dependencies (always download tarballs for accuracy)
- GitHub dependencies (always download tarballs for accuracy)
## Files Changed
- `src/install/PackageManager/PackageManagerOptions.zig` - Add flag and
configure for lockfile-only
- `src/install/PackageManager/PackageManagerEnqueue.zig` - Skip npm
tarball generation selectively
- `test/cli/install/lockfile-only.test.ts` - Test with dummy registry
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Alistair Smith <hi@alistair.sh>
### What does this PR do?
Defers exceptions thrown by NAPI code until execution returns/flows to
JS code.
### How did you verify your code works?
Ran existing NAPI tests and added to napi.test.ts.
We can't use `std.heap.c_allocator` as `z_allocator`; it doesn't
zero-initialize the memory. This PR adds a fallback implementation.
This PR also makes Bun compile successfully with `use_mimalloc` set to
false. More work is likely necessary to make it function correctly in
this case, but it should at least compile.
(For internal tracking: fixes STAB-978, STAB-979)
### What does this PR do?
Cases like `@prisma/engines-version`, with a version of `6.14.0-17.fba13060ef3cfbe5e95af3aaba61eabf2b8a8a20`, were having issues with the version and using a "corrupted" string instead.
### How did you verify your code works?
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
* Move `DebugThreadLock` to `bun.safety`
* Enable in `ci_assert` builds, but store stack traces only in debug
builds
* Reduce size of struct by making optional field non-optional
* Add `initLockedIfNonComptime` as a workaround for not being able to
call `initLocked` in comptime contexts
* Add `lockOrAssert` method to acquire the lock if unlocked, or else
assert that the current thread acquired the lock
* Add stack traces to `CriticalSection` and `AllocPtr` in debug builds
* Make `MimallocArena.init` infallible
* Make `MimallocArena.heap` non-nullable
* Rename `RefCount.active_counts` to `raw_count` and provide read-only
`get` method
* Add `bun.safety.alloc.assertEq` to assert that two allocators are
equal (avoiding comparison of undefined `ptr`s)
(For internal tracking: fixes STAB-917, STAB-918, STAB-962, STAB-963,
STAB-964, STAB-965)
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
`ban-words.test.ts` attempts to detect places where a struct field is
given a default value of `undefined`, but it fails to detect cases like
the following:
```zig
foo: *Foo align(1) = undefined,
bar: [16 * 64]Bar = undefined,
baz: Baz(u8, true) = undefined,
```
This PR updates the check to detect more occurrences, while still
avoiding (as far as I can tell) the inclusion of any false positives.
(For internal tracking: fixes STAB-971)
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Various types have a `deepClone` method, but there are two different
signatures in use. Some types, like those in the `css` directory, have
an infallible `deepClone` method that cannot return an error. Others,
like those in `ast`, are fallible and can return `error.OutOfMemory`.
Historically, `BabyList.deepClone` has only worked with the fallible
kind of `deepClone`, necessitating the addition of
`BabyList.deepClone2`, which only works with the *in*fallible kind.
This PR:
* Updates `BabyList.deepClone` so that it works with both kinds of
method
* Updates `BabyList.deepClone2` so that it works with both kinds of
method
* Renames `BabyList.deepClone2` to `BabyList.deepCloneInfallible`
* Adds `bun.handleOom(...)`, which is like `... catch bun.outOfMemory()`
but it can't accidentally catch non-OOM-related errors
* Replaces an occurrence of `anyerror` with a more specific error set
(For internal tracking: fixes STAB-969, STAB-970)
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Limit the initial scan to only 5k test files, and ignore node_modules in subdirectories.
### How did you verify your code works?
manual
## Summary
This PR fixes a panic that occurs when file operations use buffers
larger than 4GB on Windows.
## The Problem
When calling `fs.readSync()` or `fs.writeSync()` with buffers larger
than 4,294,967,295 bytes (u32::MAX), Bun panics with:
```
panic(main thread): integer cast truncated bits
```
## Root Cause
The Windows APIs `ReadFile()` and `WriteFile()` expect a `DWORD` (u32)
for the buffer length parameter. The code was using `@intCast` to
convert from `usize` to `u32`, which panics when the value exceeds
u32::MAX.
## The Fix
Changed `@intCast` to `@truncate` in four locations:
1. `sys.zig:1839` - ReadFile buffer length parameter
2. `sys.zig:1556` - WriteFile buffer length parameter
3. `bun.zig:230` - platformIOVecCreate length field
4. `bun.zig:240` - platformIOVecConstCreate length field
With these changes, operations with buffers > 4GB will read/write up to
4GB at a time instead of panicking.
## Test Plan
```js
// This previously caused a panic on Windows
const fs = require('fs');
const fd = fs.openSync('test.txt', 'r');
const buffer = Buffer.allocUnsafe(4_294_967_296); // 4 GiB (u32::MAX + 1 bytes)
fs.readSync(fd, buffer, 0, buffer.length, 0);
```
Fixes https://github.com/oven-sh/bun/issues/21699
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Jarred Sumner <jarred@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
### How did you verify your code works?
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Setting the background color on plaintext diffs makes the plaintext harder to read. This is particularly true when the input is longer.
This conservatively makes us only add the background color to the diff when the characters being highlighted are all whitespace, punctuation, or non-printable.
This branch:
<img width="748" height="388" alt="image"
src="https://github.com/user-attachments/assets/ceaf02ba-bf71-4207-a319-c041c8a887de"
/>
Canary:
<img width="742" height="404" alt="image"
src="https://github.com/user-attachments/assets/cc380f45-5540-48ed-aea1-07f4b0ab291e"
/>
### How did you verify your code works?
Updated test
## Summary
- Adds `Symbol.asyncIterator` to `process.stdout` and `process.stderr` when they are TTY or pipe/socket streams (see the sketch after this list)
- Matches Node.js behavior where these streams are Duplex-like and
support async iteration
- Does not add the iterator when streams are redirected to files
(matching Node.js SyncWriteStream behavior)
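For reference, a minimal feature-detection sketch of what this adds (illustrative only):
```ts
// On a TTY or pipe/socket, the async iterator protocol is now present,
// so Duplex-style consumers can treat these streams the way Node.js does.
console.log(Symbol.asyncIterator in process.stdout); // true for TTY/pipe, false when redirected to a file
console.log(Symbol.asyncIterator in process.stderr);
```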
## Test plan
- Added test in
`test/regression/issue/test-process-stdout-async-iterator.test.ts`
- Verified the fix works with Claude Code on Linux x64
- Test passes with `bun bd test
test/regression/issue/test-process-stdout-async-iterator.test.ts`
Fixes #21704
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
- Fix transpiler bug where comma expressions like `(0, obj.method)()`
were incorrectly optimized to `obj.method()`
- This preserved the `this` binding instead of stripping it as per
JavaScript semantics
- Add comprehensive regression test to prevent future issues
## Root Cause
The comma operator optimization in `src/js_parser.zig:7281` was directly
returning the right operand when the left operand had no side effects,
without checking if the expression was being used as a call target.
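For context, a plain JavaScript illustration of the semantics involved (not taken from the PR's test suite):
```ts
const obj = {
  name: "obj",
  method() {
    // what `this` refers to depends on how the function is called
    return this;
  },
};

obj.method() === obj;      // true: called as a member, `this` is obj
(0, obj.method)() === obj; // false: the comma expression strips the `this` binding
```
The buggy optimization rewrote the second call into the first form, silently changing which `this` the function observes.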
## Solution
- Added the same `is_call_target` check that other operators (nullish
coalescing, logical OR/AND) use
- When a comma expression is used as a call target AND the right operand
has a value for `this`, preserve the comma expression to strip the
`this` binding
- Follows existing patterns in the codebase for consistent behavior
## Test Plan
- [x] Reproduce the original bug: `(0, obj.method)()` incorrectly
preserved `this`
- [x] Verify fix: comma expressions now correctly strip `this` binding
in function calls
- [x] All existing transpiler tests continue to pass
- [x] Added regression test covering various comma expression scenarios
- [x] Tested edge cases: nested comma expressions, side effects,
different operand types
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
- Updates WebKit from 75f6499 to eb92990 (latest release from
oven-sh/webkit)
- This brings in the latest WebKit improvements and fixes
## Test plan
- [ ] Verify the build completes successfully
- [ ] Run existing test suite to ensure no regressions
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Reduce stack space usage of parseSuffix
### How did you verify your code works?
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
On Linux, AbortSignal.timeout created a file descriptor for each timeout
and did not keep the event loop alive when a timer was active. This is
fixed.
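For reference, the affected API (a minimal illustration, not code from the PR):
```ts
// Before this fix, each call like this opened a file descriptor on Linux,
// and a pending timer did not keep the event loop alive.
const signal = AbortSignal.timeout(5_000);
signal.addEventListener("abort", () => {
  console.log("timed out");
});
```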
### How did you verify your code works?
Fewer flaky tests
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude <claude@anthropic.ai>
## Summary
- Replace cmake-based clang-format with dedicated bash script that
directly processes source files
- Optimize CI to only install clang-format-19 instead of entire LLVM
toolchain
- Script enforces specific clang-format version with no fallbacks
## Changes
1. **New bash script** (`scripts/run-clang-format.sh`):
- Directly reads C++ files from `CxxSources.txt`
- Finds all header files in `src/` and `packages/` directories
- Respects existing `.clang-format` configuration files
- Requires specific clang-format version (no fallbacks)
- Defaults to format mode (modifies files in place)
2. **Optimized GitHub Action**:
- Only installs `clang-format-19` package with `--no-install-recommends`
- Avoids installing unnecessary components like manpages
- Uses new bash script instead of cmake targets
3. **Updated package.json scripts**:
- `clang-format`, `clang-format:check`, and `clang-format:diff` now use
the bash script
## Test plan
- [x] Verified script finds and processes all C++ source and header
files
- [x] Tested formatting works correctly by adding formatting issues and
running the script
- [x] Confirmed script respects `.clang-format` configuration files
- [x] Script correctly requires specific clang-format version
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude <claude@anthropic.ai>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
- Fixes `$.braces(...)` not working properly on non-ASCII inputs (see the sketch below)
- Switches braces code to use `SmallList` to support more deeply nested
brace expansion
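A minimal sketch of the helper being fixed (assuming `$.braces` expands a brace pattern into an array of strings, as described in the Bun Shell docs):
```ts
import { $ } from "bun";

// Brace expansion utility from the Bun Shell
console.log($.braces("echo {a,b,c}"));
// => ["echo a", "echo b", "echo c"]
```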
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
### How did you verify your code works?
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
This instance type was reported to be our most expensive per the AWS bill.
----
Build times before and after:

| Target | Before | After |
| --- | --- | --- |
| x64-linux | 19.5m | 20.3m |
| arm64-linux | 14m | 15.3m |
| x64-musl | 16.3m | 16m |
| arm64-musl | 13.3m | 13.5m |
| x64-windows | 2m | 2.5m |
<details>
<summary> observed in
https://buildkite.com/bun/bun/builds/22442#annotation-test/js/node/zlib/leak.test.ts
</summary>
```
==5045==ERROR: AddressSanitizer: heap-use-after-free on address 0x5220000243c0 at pc 0x00000dad671b bp 0x14f22d4a4990 sp 0x14f22d4a4988
READ of size 8 at 0x5220000243c0 thread T5 (HeapHelper)
======== Stack trace from GDB for HeapHelper-5045.core: ========
Program terminated with signal SIGABRT, Aborted.
#0 0x000014f2c3672eec in ?? () from /lib/x86_64-linux-gnu/libc.so.6
[Current thread is 1 (Thread 0x14f22d4f46c0 (LWP 5050))]
#0 0x000014f2c3672eec in ?? () from /lib/x86_64-linux-gnu/libc.so.6
#1 0x000014f2c3623fb2 in raise () from /lib/x86_64-linux-gnu/libc.so.6
#2 0x000014f2c360e472 in abort () from /lib/x86_64-linux-gnu/libc.so.6
#3 0x000000000e3b2ae2 in uw_init_context_1[cold] ()
#4 0x000000000e3b29fc in _Unwind_Backtrace ()
#5 0x00000000046a6bab in __sanitizer::BufferedStackTrace::UnwindSlow(unsigned long, unsigned int) ()
#6 0x00000000046a181d in __sanitizer::BufferedStackTrace::Unwind(unsigned int, unsigned long, unsigned long, void*, unsigned long, unsigned long, bool) ()
#7 0x00000000046885bd in __sanitizer::BufferedStackTrace::UnwindImpl(unsigned long, unsigned long, void*, bool, unsigned int) ()
#8 0x0000000004601127 in __asan::ErrorGeneric::Print() ()
#9 0x0000000004683180 in __asan::ScopedInErrorReport::~ScopedInErrorReport() ()
#10 0x0000000004686567 in __asan::ReportGenericError(unsigned long, unsigned long, unsigned long, unsigned long, bool, unsigned long, unsigned int, bool) ()
#11 0x0000000004686d46 in __asan_report_load8 ()
#12 0x000000000dad671b in ZSTD_sizeof_CCtx (cctx=<optimized out>) at ./build/release-asan/zstd/vendor/zstd/lib/compress/zstd_compress.c:210
#13 0x0000000006d2284d in bun.js.node.zlib.NativeZstd.estimatedSize () at /var/lib/buildkite-agent/builds/ip-172-31-72-121/bun/bun/src/bun.js/node/zlib/NativeZstd.zig:57
#14 ZigGeneratedClasses.JSNativeZstd.JavaScriptCoreBindings.NativeZstd__estimatedSize (thisValue=<optimized out>) at /var/lib/buildkite-agent/builds/ip-172-31-72-121/bun/bun/build/release-asan/codegen/ZigGeneratedClasses.zig:11122
#15 0x000000000852803b in WebCore::JSNativeZstd::visitChildrenImpl<JSC::SlotVisitor> (cell=0x14f22e190840, visitor=...) at ./build/release-asan/./build/release-asan/codegen/ZigGeneratedClasses.cpp:30728
#16 WebCore::JSNativeZstd::visitChildren (cell=0x14f22e190840, visitor=...) at ./build/release-asan/./build/release-asan/codegen/ZigGeneratedClasses.cpp:30734
#17 0x000000000aa99d6c in JSC::MethodTable::visitChildren (this=<optimized out>, cell=<optimized out>, visitor=...) at vendor/WebKit/Source/JavaScriptCore/runtime/ClassInfo.h:115
#18 0x000000000aa99d6c in JSC::SlotVisitor::visitChildren (this=0x14f277028300, cell=0x14f22e190840)
#19 JSC::SlotVisitor::drain(WTF::MonotonicTime)::$_0::operator()(JSC::MarkStackArray&) const (this=<optimized out>, stack=...) at vendor/WebKit/Source/JavaScriptCore/heap/SlotVisitor.cpp:509
#20 0x000000000aa8f130 in JSC::SlotVisitor::forEachMarkStack<JSC::SlotVisitor::drain(WTF::MonotonicTime)::$_0>(JSC::SlotVisitor::drain(WTF::MonotonicTime)::$_0 const&) (this=0x14f277028300, func=...) at vendor/WebKit/Source/JavaScriptCore/heap/SlotVisitorInlines.h:193
#21 JSC::SlotVisitor::drain (this=this@entry=0x14f277028300, timeout=<error reading variable: That operation is not available on integers of more than 8 bytes.>, timeout@entry=...) at vendor/WebKit/Source/JavaScriptCore/heap/SlotVisitor.cpp:499
#22 0x000000000aa90590 in JSC::SlotVisitor::drainFromShared (this=0x14f277028300, sharedDrainMode=JSC::SlotVisitor::HelperDrain, timeout=<error reading variable: That operation is not available on integers of more than 8 bytes.>) at vendor/WebKit/Source/JavaScriptCore/heap/SlotVisitor.cpp:699
#23 0x000000000aa08726 in JSC::Heap::runBeginPhase(JSC::GCConductor)::$_1::operator()() const (this=<optimized out>) at vendor/WebKit/Source/JavaScriptCore/heap/Heap.cpp:1508
#24 WTF::SharedTaskFunctor<void (), JSC::Heap::runBeginPhase(JSC::GCConductor)::$_1>::run() (this=<optimized out>) at .WTF/Headers/wtf/SharedTask.h:91
#25 0x000000000aa3b596 in WTF::ParallelHelperClient::runTask(WTF::RefPtr<WTF::SharedTask<void ()>, WTF::RawPtrTraits<WTF::SharedTask<void ()> >, WTF::DefaultRefDerefTraits<WTF::SharedTask<void ()> > > const&) (this=0x14f22e000428, task=...) at vendor/WebKit/Source/WTF/wtf/ParallelHelperPool.cpp:110
#26 0x000000000aa3d976 in WTF::ParallelHelperPool::Thread::work (this=<optimized out>) at vendor/WebKit/Source/WTF/wtf/ParallelHelperPool.cpp:201
#27 0x000000000aa4210d in WTF::AutomaticThread::start(WTF::AbstractLocker const&)::$_0::operator()() const (this=<optimized out>) at vendor/WebKit/Source/WTF/wtf/AutomaticThread.cpp:225
#28 WTF::Detail::CallableWrapper<WTF::AutomaticThread::start(WTF::AbstractLocker const&)::$_0, void>::call() (this=<optimized out>) at vendor/WebKit/Source/WTF/wtf/Function.h:53
#29 0x0000000008958ada in WTF::Function<void ()>::operator()() const (this=<optimized out>) at vendor/WebKit/Source/WTF/wtf/Function.h:82
#30 WTF::Thread::entryPoint (newThreadContext=<optimized out>) at vendor/WebKit/Source/WTF/wtf/Threading.cpp:272
#31 0x0000000008a65689 in WTF::wtfThreadEntryPoint (context=0x13b5) at vendor/WebKit/Source/WTF/wtf/posix/ThreadingPOSIX.cpp:255
#32 0x000000000467d347 in asan_thread_start(void*) ()
#33 0x000014f2c36711f5 in ?? () from /lib/x86_64-linux-gnu/libc.so.6
#34 0x000014f2c36f189c in ?? () from /lib/x86_64-linux-gnu/libc.so.6
```
</details>
`ZSTD_sizeof_CCtx` and `ZSTD_sizeof_DCtx` cannot be relied upon to be thread-safe, and `estimatedSize` may be called from any thread.
### What does this PR do?
After 10s of inactivity in the thread pool, this releases memory more aggressively back to the operating system.
### How did you verify your code works?
### What does this PR do?
Fixes a crash related to pipelines
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
## Summary
Fixes a bug in napi_get_value_bigint_words where the function would
return the number of words copied instead of the actual word count
needed when the provided buffer is smaller than required.
## The Problem
When napi_get_value_bigint_words was called with a buffer smaller than
the actual BigInt size, it would incorrectly return the buffer size
instead of the actual word count needed. This doesn't match Node.js
behavior.
### Example
For a BigInt that requires 2 words (e.g. `0x123456789ABCDEF0123456789ABCDEFn`), called with a buffer large enough for only 1 word:
- Before fix: `word_count = 1` (buffer size)
- After fix: `word_count = 2` (actual words needed)
## The Fix
Changed napi_get_value_bigint_words to always set word_count to the
actual number of words in the BigInt, regardless of buffer size.
## Test Plan
- Added test test_bigint_word_count that verifies the word count is
correctly returned
- Added test test_ref_unref_underflow for the existing
napi_reference_unref underflow protection
- Both tests pass with the fix and match Node.js behavior
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
This PR adds lldb pretty printing support for `bun.String`, `ZigString`, and `WTFStringImpl` so you don't have to click through so many fields to see what the actual string value is.
### What does this PR do?
Removes the unused `capturedError`, fixing the oxlint error. This variable is never assigned, so the block on L1094 can never run.
### How did you verify your code works?
Existing tests
### What does this PR do?
It is easy to confuse the `lines` and `columns` fields of the `LineColumnOffset` struct in `src/sourcemap/sourcemap.zig` as being either one- or zero-based. The sourcemap spec says line and column offsets are zero-based. There was a place that incorrectly assumed they were one-based.
This PR switches it to use `bun.Ordinal` instead of bare `u32` integers to prevent this kind of bug from happening again.
## Summary
This PR adds a comprehensive TypeScript CLI flag parser that reads the
`--help` menu for every Bun command and generates structured JSON data
for shell completion generators.
### Features
- **🔍 Complete command discovery**: Automatically discovers all 22 Bun
commands
- **📋 Comprehensive flag parsing**: Extracts 388+ flags with
descriptions, types, defaults, and choices
- **🌳 Nested subcommand support**: Handles complex cases like `bun pm
cache rm`, `bun pm pkg set`
- **🔗 Command aliases**: Supports `bun i` = `bun install`, `bun a` =
`bun add`, etc.
- **🎯 Dynamic completions**: Integrates with `bun getcompletes` for
scripts, packages, files, binaries
- **📂 File type awareness**: Knows when to complete `.js/.ts` files vs
test files vs packages
- **⚡ Special case handling**: Handles bare `bun` vs `bun run` and other
edge cases
### Generated Output
The script generates `completions/bun-cli.json` with:
- 21 commands with full metadata
- 47 global flags
- 16 pm subcommands (including nested ones)
- 54+ examples
- Dynamic completion hints
- Integration info for existing shell completions
### Usage
```bash
bun run scripts/generate-cli-completions.ts
```
Output saved to `completions/bun-cli.json` for use by future shell
completion generators.
### Perfect Shell Completions Ready
This JSON structure provides everything needed to generate perfect shell
completions for fish, bash, and zsh with full feature parity to the
existing hand-crafted completions. It captures all the complex cases
that make Bun's CLI completions work seamlessly.
The generated data structure includes:
- Context-aware flag suggestions
- Proper file type filtering
- Package name completions
- Script and binary discovery
- Subcommand nesting
- Alias handling
- Dynamic completion integration
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>
### What does this PR do?
The `then` function in `transpiler.transform` can cause GC, which means
it can cause the `Transpiler` to become freed, which means that if that
same transpiler is in use by another run on the other thread, it could
have pointers to invalid memory.
Also, `ESMCondition` has unnecessary memory allocations, and there is a very tiny memory leak in `optionsFromLoaders`.
### How did you verify your code works?
Existing tests
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Removes the `DevServer.relative_path_buf` field and replaces its usages with `bun.path_buffer_pool`, which is better than the existing debug-lock approach.
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Splits up js_parser.zig into multiple files. Also changes `visitExprInOut` to use function calls rather than a switch.
Not ready:
- [ ] P.zig is ~70,000 tokens, still needs to get smaller
- [x] ~~measure zig build time before & after (is it slower?)~~ no
significant impact
---------
Co-authored-by: pfgithub <6010774+pfgithub@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Fixes #21189
`.pause()` should unref, but it should still continue to emit `readable` events (although it should not send `data` events).
Also, `stdin.unref()` should not pause input; it should only prevent stdin from keeping the process alive.
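A sketch of the intended behavior (illustrative, not from the PR):
```ts
process.stdin.on("readable", () => {
  // 'readable' should still fire after pause(); 'data' events should not
  const chunk = process.stdin.read();
});

process.stdin.pause();  // unrefs, and suppresses 'data' events
process.stdin.unref();  // only stops stdin from keeping the process alive; does not pause input
```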
DRAFT:
- [x] ~~this causes a bug where `process.stdin.on("readable", () => {});
process.stdin.pause()` will allow the process to exit when it
shouldn't.~~ fixed
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
We should not call `.deinit()` after `.toJS`, otherwise `hasPendingActivity` will access invalid memory.
### How did you verify your code works?
Run the test with a debug build on macOS, or with ASAN enabled, and it will catch the bug.
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
The DevServer's `IncrementalGraph` uses a data-oriented-design style of memory management, storing data in lists and using indices instead of pointers.
With conventional memory management, freeing a pointer and then accidentally using it will trip up ASAN. Obviously this doesn't apply when using lists and indices, so this PR adds a check in debug & ASAN builds: every time we free an `Edge`, we make sure there are no dangling references left to that slot.
This caught a case where we weren't setting `g.first_import[file_index] = .none` when deleting a file's imports, causing a dangling reference and an out-of-bounds access.
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
`dropAllLocks` causes `Thread::yield` several times, which means more system calls, which means more thread switches, which means slower execution.
### How did you verify your code works?
### What does this PR do?
Releasing heap access causes all the heap helper threads to wake up and
lock and then unlock futexes, but it's important to do that to ensure
finalizers run quickly.
That means releasing heap access is a balance between:
1. CPU usage
2. Memory usage
Not releasing heap access causes benchmarks like
https://github.com/oven-sh/bun/pull/14885 to regress due to finalizers
not being called quickly enough.
Releasing heap access too often causes high idle CPU usage.
For the following code:
```
setTimeout(() => {}, 10 * 1000)
```
`command time -v` with different values of `defaultRemainingRunsUntilSkipReleaseAccess`:

| `defaultRemainingRunsUntilSkipReleaseAccess` | Involuntary context switches |
| --- | --- |
| 0 | 605 |
| 5 | 350 |
| 10 | 241 |

Also compare the #14885 benchmark with different values.
The idea here is that if you entered JS "recently", running any finalizers that might've been waiting is a good idea. But if you haven't, for example if the process is just waiting on I/O, then don't bother.
### How did you verify your code works?
### What does this PR do?
Instead of holding a strong reference to the options object passed with the handlers, we make each of the callbacks kept alive by the handlers, and they detach once the `detachFromJS` function is called.
This should fix #21570, which looks like it was caused by wrapper functions for AsyncLocalStorage getting collected prematurely.
Fixes #21254
Fixes #21553
Fixes #21422
### How did you verify your code works?
Ran test/js/node/http2/node-http2.test.js
## Summary
This PR optimizes the uSockets sweep timer to only run when there are
active connections that need timeout checking, rather than running
continuously even when no connections exist.
**Problem**: The sweep timer was running every 4 seconds
(LIBUS_TIMEOUT_GRANULARITY) regardless of whether there were any active
connections, causing unnecessary CPU usage when Bun applications are
idle.
**Solution**: Implement reference counting for active sockets so the
timer is only enabled when needed.
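The counting scheme can be sketched roughly as follows (TypeScript for readability; the real change lives in the uSockets C code, and these names are hypothetical):
```ts
let activeSocketCount = 0;
let sweepTimer: ReturnType<typeof setInterval> | null = null;

function onSocketLinked() {
  if (activeSocketCount++ === 0) {
    // first socket (0 -> 1): enable the sweep timer
    sweepTimer = setInterval(sweepTimedOutConnections, 4_000); // LIBUS_TIMEOUT_GRANULARITY is ~4s
  }
}

function onSocketUnlinked() {
  if (--activeSocketCount === 0 && sweepTimer !== null) {
    // last socket (1 -> 0): disable the sweep timer
    clearInterval(sweepTimer);
    sweepTimer = null;
  }
}

function sweepTimedOutConnections() {
  // walk active sockets and close the ones that exceeded their timeout
}
```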
## Changes
- **Add sweep_timer_count field** to both C and Zig loop data structures
- **Implement helper functions** `us_internal_enable_sweep_timer()` and
`us_internal_disable_sweep_timer()`
- **Timer lifecycle management**:
- Timer starts disabled when loop is initialized
- Timer enables when first socket is linked (count 0→1)
- Timer disables when last socket is unlinked (count 1→0)
- **Socket coverage**: Applied to both regular sockets and connecting
sockets that need timeout sweeping
- **Listen socket exclusion**: Listen sockets don't increment the
counter as they don't need timeout checking
## Files Modified
- `packages/bun-usockets/src/internal/loop_data.h` - Added
sweep_timer_count field
- `src/deps/uws/InternalLoopData.zig` - Updated Zig struct to match C
struct
- `packages/bun-usockets/src/loop.c` - Helper functions and
initialization
- `packages/bun-usockets/src/internal/internal.h` - Function
declarations
- `packages/bun-usockets/src/context.c` - Socket link/unlink
modifications
## Test Results
Verified the optimization works correctly:
1. **✅ No timer during idle**: 5-second idle test showed no sweep timer
activity
2. **✅ Timer activates with connections**: Timer enables when server
starts listening
3. **✅ Timer runs periodically**: Sweep timer callbacks occur every ~4
seconds when connections are active
4. **✅ Timer deactivates**: Timer disables when connections are closed
## Performance Impact
This change significantly reduces CPU usage for idle Bun applications
with no active HTTP connections by eliminating unnecessary timer
callbacks. The optimization maintains all existing timeout functionality
while only running the sweep when actually needed.
## Test plan
- [x] Verify no timer activity during idle periods
- [x] Verify timer enables when connections are created
- [x] Verify timer runs at expected intervals when active
- [x] Verify timer disables when connections are closed
- [x] Test with HTTP server scenarios
- [ ] Run existing HTTP server test suite to ensure no regressions
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
Add missing check for .write() on a data-backed blob
### How did you verify your code works?
There is a test
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Alistair Smith <hi@alistair.sh>
Previously it accepted `property: anytype`, but now it's `[]const u8` because that was the only allowed value; this makes it easier to see what type it accepts in autocomplete.
Also updates the doc comment, switches it to use `ZIG_EXPORT`, and updates the cppbind doc comment.
Fixes #7569
This adds `expectTypeOf`, but not the experimental `--typecheck` flag from vitest. To use it, you need to typecheck manually with `bunx tsc --noEmit` in addition to `bun test`.
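A minimal usage sketch, assuming the vitest-style API exported from `bun:test` (the assertions below only fail during type checking, not at runtime):
```ts
import { expectTypeOf } from "bun:test";

expectTypeOf<string>().toEqualTypeOf<string>();
expectTypeOf("hello").toBeString();
expectTypeOf({ a: 1 }).toHaveProperty("a");
```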
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
You updated the types but not the comment in Bun 1.1.14
### What does this PR do?
Removes the sentence "This returns `undefined`." in the SQLite
Statement.run function
### How did you verify your code works?
It says this on the website
<img width="787" height="90" alt="Screenshot 2025-08-01 at 2 05 09 PM"
src="https://github.com/user-attachments/assets/63259ab3-b5fd-4392-bf69-8e297f4922f2"
/>
### What does this PR do?
Fix: https://github.com/oven-sh/bun/issues/21351
Relevant changes:
- Fix `advance` to properly clean up successful and failed queries that could still be in the queue
- Always ref before executing
- Use stronger atomics for ref/deref and `hasPendingActivity`
- Fall back when `thisValue` is freed/null/zero, and check if the VM is being shut down
- The `--hot` bug in `resolveRopeIfNeeded` is not meant to be fixed in this PR; this is a fix for the postgres regression
- Added assertions so this bug is easier to catch in CI
### How did you verify your code works?
Test added
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Resolves
```js
Bun v1.2.13 ([64ed68c](64ed68c9e0)) on windows x86_64 [TestCommand]
panic: ComptimeStringMap.fromJS: input is not a string
[comptime_string_map.zig:268](64ed68c9e0/src/comptime_string_map.zig (L268)): getWithEql
[Response.zig:682](64ed68c9e0/src/bun.js/webcore/Response.zig (L682)): init
[Request.zig:679](64ed68c9e0/src/bun.js/webcore/Request.zig (L679)): constructInto
[ZigGeneratedClasses.cpp:37976](64ed68c9e0/C:/buildkite-agent/builds/EC2AMAZ-Q4V5GV4/bun/bun/build/release/codegen/ZigGeneratedClasses.cpp#L37976): WebCore::JSRequestConstructor::construct
2 unknown/js code
llint_entry
Features: tsconfig, Bun.stdout, dotenv, jsc
```
### How did you verify your code works?
There is a test.
Before:
```
failing-test-passes.fixture.ts:
^ this test is marked as failing but it passed. Remove \`.failing\` if tested behavior now works
(fail) This should fail but it doesnt [0.24ms]
^ this test is marked as failing but it passed. Remove \`.failing\` if tested behavior now works
(fail) This should fail but it doesnt (async) [0.23ms]
```
After:
```
failing-test-passes.fixture.ts:
(fail) This should fail but it doesnt [0.24ms]
^ this test is marked as failing but it passed. Remove \`.failing\` if tested behavior now works
(fail) This should fail but it doesnt (async) [0.23ms]
^ this test is marked as failing but it passed. Remove \`.failing\` if tested behavior now works
```
Adds a snapshot test
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
if you spam the refresh button in `next dev`, we print this error:
```
⨯ Error: Stream is already ended
at writeHead (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at processTicksAndRejections (null) {
digest: '2259044225',
code: 'ERR_STREAM_ALREADY_FINISHED',
toString: [Function: toString]
}
⨯ Error: failed to pipe response
at processTicksAndRejections (unknown:7:39) {
[cause]: Error: Stream is already ended
at writeHead (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at processTicksAndRejections (null) {
digest: '2259044225',
code: 'ERR_STREAM_ALREADY_FINISHED',
toString: [Function: toString]
}
}
⨯ Error: failed to pipe response
at processTicksAndRejections (unknown:7:39) {
page: '/',
[cause]: Error: Stream is already ended
at writeHead (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at <anonymous> (null)
at processTicksAndRejections (null) {
digest: '2259044225',
code: 'ERR_STREAM_ALREADY_FINISHED',
toString: [Function: toString]
}
}
```
If the socket is already closed when writeHead is called, we're supposed to silently ignore it instead of throwing an error. The close event is supposed to be emitted on the next tick. Now, I think there are also cases where we do not emit the close event, which is similarly bad.
### How did you verify your code works?
Need to go through the node http server tests and see if any new ones
pass. Also maybe some will fail on this PR, let's see.
### What does this PR do?
We had `bun.strings.assertIsValidWindowsPath(...)` in the resolver, but
we can't do this because the path may come from the user. Instead, let
our error handling code handle it.
Also fixes #21065
---------
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
### What does this PR do?
We have our own `MultiArrayList(...)` in
`src/collections/multi_array_list.zig` (is this really necessary?) and
this does not work with the existing lldb pretty printing functions
because they are under a different symbol name:
`collections.multi_array_list.MultiArrayList*` instead of
`multi_array_list.MultiArrayList*`
### What does this PR do?
### How did you verify your code works?
ran fuzzy-wuzzy.test.ts
This slightly reduces memory use.
Maximum memory use of `bun test html-rewriter`, averaged across 100
iterations:
* 101975 kB without this change
* 101634 kB with this change
I also tried changing the code to always use the aligned allocation
functions, but this slightly increased memory use, to 102160 kB.
(For internal tracking: fixes ENG-19866)
Add a helper type to help detect race conditions. There's no performance
or memory use penalty in release builds.
Actually adding the type to various places will be left for future PRs.
(For internal tracking: fixes STAB-852)
### What does this PR do?
Replaces an if statement with an assertion that the condition is false.
The allocator in question should never be null.
### How did you verify your code works?
### What does this PR do?
Fixes the error printed:
```js
❯ bun --bun dev
$ next dev --turbopack
▲ Next.js 15.4.5 (Turbopack)
- Local: http://localhost:3000
- Network: http://192.168.1.250:3000
✓ Starting...
✓ Ready in 637ms
○ Compiling / ...
✓ Compiled / in 1280ms
/private/tmp/empty/my-app/.next/server/chunks/ssr/[root-of-the-server]__012ba519._.js: Invalid source map. Only conformant source maps can be used to filter stack frames. Cause: TypeError: payload is not an Object. (evaluating '"sections" in payload')
/private/tmp/empty/my-app/.next/server/chunks/ssr/[root-of-the-server]__93bf7db5._.js: Invalid source map. Only conformant source maps can be used to filter stack frames. Cause: TypeError: payload is not an Object. (evaluating '"sections" in payload')
GET / 200 in 1416ms
^C^[[A
```
### How did you verify your code works?
### What does this PR do?
- [ ] Documentation or TypeScript types (it's okay to leave the rest
blank in this case)
- [x] Code changes
### How did you verify your code works?
Tests added for padding support
The socket timeout was being fired earlier due to backpressure or lack of precision in usockets timers (it now matches Node.js behavior).
Added a check for `owner_symbol` so the error shown in https://github.com/oven-sh/bun/issues/21055 is handled.
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
We have to use the existing code for handling aborted requests instead
of immediately calling deinit.
Also made the underlying uws.Response an optional pointer to mark when
the request has already been aborted to make it clear it's no longer
accessible.
### How did you verify your code works?
This needs a test
---------
Co-authored-by: Zack Radisic <56137411+zackradisic@users.noreply.github.com>
### What does this PR do?
Fixes #6409
This PR implements `bun install` automatic migration from yarn.lock
files to bun.lock, preserving versions exactly. The migration happens
automatically when:
1. A project has a `yarn.lock` file
2. No `bun.lock` or `bun.lockb` file exists
3. User runs `bun install`
### Current Status: ✅ Complete and Working
The yarn.lock migration feature is **fully functional and
comprehensively tested**. All dependency types are supported:
- ✅ Regular npm dependencies (`package@^1.0.0`)
- ✅ Git dependencies (`git+https://github.com/user/repo.git`,
`github:user/repo`)
- ✅ NPM alias dependencies (`alias@npm:package@version`)
- ✅ File dependencies (`file:./path`)
- ✅ Remote tarball URLs (`https://registry.npmjs.org/package.tgz`)
- ✅ Local tarball files (`file:package.tgz`)
### Test Results
```bash
$ bun bd test test/cli/install/migration/yarn-lock-migration.test.ts
✅ 4 pass, 0 fail
- yarn-lock-mkdirp (basic npm dependency)
- yarn-lock-mkdirp-no-resolved (npm dependency without resolved field)
- yarn-lock-mkdirp-file-dep (file dependency)
- yarn-stuff (all complex dependency types: git, npm aliases, file, remote tarballs)
```
### How did you verify your code works?
1. **Comprehensive test suite**: Added 4 test cases covering all
dependency types
2. **Version preservation**: Verified that package versions are
preserved exactly during migration
3. **Real-world scenarios**: Tested with complex yarn.lock files
containing git deps, npm aliases, file deps, and remote tarballs
4. **Migration logging**: Confirms migration with log message `[X.XXms]
migrated lockfile from yarn.lock`
### Key Implementation Details
- **Core parser**: `src/install/yarn.zig` handles all yarn.lock parsing
and dependency type resolution
- **Integration**: Migration is built into existing lockfile loading
infrastructure
- **Performance**: Migration typically completes in ~1ms for most
projects
- **Compatibility**: Preserves exact dependency versions and resolution
behavior
The implementation correctly handles edge cases like npm aliases, git
dependencies with commits, file dependencies with transitive deps, and
remote tarballs.
---------
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: RiskyMH <git@riskymh.dev>
Co-authored-by: RiskyMH <56214343+RiskyMH@users.noreply.github.com>
### What does this PR do?
Remove some duplicate code
### How did you verify your code works?
Ran the tests
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
- Fix crash in `Response.redirect()` when called with invalid arguments
like `Response.redirect(400, "a")`
- Add proper status code validation per Web API specification (301, 302,
303, 307, 308)
- Add comprehensive tests to prevent regression and ensure spec
compliance
## Issue
When `Response.redirect()` is called with invalid arguments (e.g.,
`Response.redirect(400, "a")`), the code crashes with a panic due to an
assertion failure in `fastGet()`. The second argument is passed to
`Response.Init.init()` which attempts to call `fastGet()` on non-object
values, triggering `bun.assert(this.isObject())` to fail.
Additionally, the original implementation didn't properly validate
redirect status codes according to the Web API specification.
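A sketch of the resulting behavior described in this PR (illustrative, not copied from the added tests):
```ts
// Valid redirect status codes per the spec: 301, 302, 303, 307, 308
Response.redirect("https://example.com/", 307); // ok

// Invalid status codes now throw a RangeError instead of being accepted
Response.redirect("https://example.com/", 400); // throws RangeError

// This call previously hit the fastGet() assertion and panicked; it no longer crashes
Response.redirect(400, "a");
```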
## Fix
Enhanced the `constructRedirect()` function with:
1. **Proper status code validation**: Only allows valid redirect status
codes (301, 302, 303, 307, 308) as specified by the MDN Web API
documentation
2. **Crash prevention**: Only processes object init values to prevent
`fastGet()` crashes with non-object values
3. **Consistent behavior**: Throws `RangeError` for invalid status codes
in both number and object forms
## Changes
- **`src/bun.js/webcore/Response.zig`**: Enhanced `constructRedirect()`
with validation logic
- **`test/js/web/fetch/response.test.ts`**: Added comprehensive tests
for crash prevention and status validation
- **`test/js/web/fetch/fetch.test.ts`**: Updated existing test to use
valid redirect status (307 instead of 408)
## Test Plan
- [x] Added test that reproduces the original crash scenario - now
passes without crashing
- [x] Added tests for proper status code validation (valid codes pass,
invalid codes throw RangeError)
- [x] Verified existing Response.redirect tests still pass
- [x] Confirmed Web API compliance with MDN specification
- [x] Tested various edge cases: `Response.redirect(400, "a")`,
`Response.redirect("url", 400)`, etc.
## Behavior Changes
- **Invalid status codes now throw RangeError** (spec compliant
behavior)
- **Non-object init values are safely ignored** (no more crashes)
- **Maintains backward compatibility** for valid use cases
Per [MDN Web API
specification](https://developer.mozilla.org/en-US/docs/Web/API/Response/redirect_static),
Response.redirect() should only accept status codes 301, 302, 303, 307,
or 308.
Fixes https://github.com/oven-sh/bun/issues/18414
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
The install script was incorrectly setting $env:PATH by assigning an
array directly, which PowerShell converts to a space-separated string
instead of the required semicolon-separated format.
This caused the Windows PATH environment variable to be malformed,
making installed programs inaccessible.
Fixes #16811
### What does this PR do?
Fixes a bug in the Windows PowerShell install script where `$env:PATH`
was being set incorrectly, causing the PATH environment variable to be
malformed.
**The Problem:**
- The script assigns an array directly to `$env:PATH`
- PowerShell converts this to a space-separated string instead of
semicolon-separated
- This breaks the Windows PATH, making installed programs inaccessible
**The Fix:**
- Changed `$env:PATH = $Path;` to `$env:PATH = $Path -join ';'`
- Now properly creates semicolon-separated PATH entries as required by
Windows
### How did you verify your code works?
✅ **Tested the bug reproduction:**
```powershell
$Path = @('C:\Windows', 'C:\Windows\System32', 'C:\test')
$env:PATH = $Path           # WRONG: results in "C:\Windows C:\Windows\System32 C:\test"
$env:PATH = $Path -join ';' # FIX: results in "C:\Windows;C:\Windows\System32;C:\test"
```
### What does this PR do?
Reduce the number of zombie build processes that can happen in CI or when building locally.
### How did you verify your code works?
## Summary
Fixes https://github.com/oven-sh/bun/issues/19198
This implements RFC 9110 Section 13.1.2 If-None-Match conditional
request support for static routes in Bun.serve().
**Key Features** (a usage sketch follows this list):
- Automatic ETag generation for static content based on content hash
- If-None-Match header evaluation with weak entity tag comparison
- 304 Not Modified responses for cache efficiency
- Standards-compliant handling of wildcards (*), multiple ETags, and
weak ETags (W/)
- Method-specific application (GET/HEAD only) with proper 405 responses
for other methods
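A rough usage sketch of the behavior above (assuming the `static` routes option of `Bun.serve`; illustrative, not taken from the PR's test suite):
```ts
const server = Bun.serve({
  static: {
    // An ETag is generated automatically from a hash of this content
    "/hello.txt": new Response("Hello, world!"),
  },
  fetch() {
    return new Response("not found", { status: 404 });
  },
});

const first = await fetch(new URL("/hello.txt", server.url));
const etag = first.headers.get("ETag");

const second = await fetch(new URL("/hello.txt", server.url), {
  headers: { "If-None-Match": etag! },
});
console.log(second.status); // 304 Not Modified
```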
## Implementation Details
- ETags are generated using `bun.hash()` and formatted as strong ETags
(e.g., "abc123")
- Preserves existing ETag headers from Response objects
- Uses weak comparison semantics as defined in RFC 9110 Section 8.8.3.2
- Handles comma-separated ETag lists and malformed headers gracefully
- Only applies to GET/HEAD requests with 200 status codes
## Files Changed
- `src/bun.js/api/server/StaticRoute.zig` - Core implementation (~100
lines)
- `test/js/bun/http/serve-if-none-match.test.ts` - Comprehensive test
suite (17 tests)
## Test Results
- ✅ All 17 new If-None-Match tests pass
- ✅ All 34 existing static route tests pass (no regressions)
- ✅ Debug build compiles successfully
## Test plan
- [ ] Run existing HTTP server tests to ensure no regressions
- [ ] Test ETag generation for various content types
- [ ] Verify 304 responses reduce bandwidth in real scenarios
- [ ] Test edge cases like malformed If-None-Match headers
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
Fixes a crash in the Windows file watcher that occurred when the number
of file system events exceeded the fixed `watch_events` buffer size
(128).
## Problem
The crash manifested as:
```
index out of bounds: index 128, len 128
```
This happened when:
1. More than 128 file system events were generated in a single watch
cycle
2. The code tried to access `this.watch_events[128]` on an array of
length 128 (valid indices: 0-127)
3. Later, `std.sort.pdq()` would operate on an invalid array slice
## Solution
Implemented a hybrid approach that preserves the original behavior while
handling overflow gracefully:
- **Fixed array for common case**: Uses the existing 128-element array
when possible for optimal performance
- **Dynamic allocation for overflow**: Switches to `ArrayList` only when
needed
- **Single-batch processing**: All events are still processed together
in one batch, preserving event coalescing
- **Graceful fallback**: Handles allocation failures with appropriate
fallbacks
## Benefits
- ✅ **Fixes the crash** while maintaining existing performance
characteristics
- ✅ **Preserves event coalescing** - events for the same file still get
properly merged
- ✅ **Single consolidated callback** instead of multiple partial updates
- ✅ **Memory efficient** - no overhead for normal cases (≤128 events)
- ✅ **Backward compatible** - no API changes
## Test Plan
- [x] Compiles successfully with `bun run zig:check-windows`
- [x] Preserves existing behavior for common case (≤128 events)
- [x] Handles overflow case gracefully with dynamic allocation
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@bun.sh>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
### What does this PR do?
Fixes thread safety issues due to file poll code being not thread safe.
<!-- **Please explain what your changes do**, example: -->
<!--
This adds a new flag --bail to bun test. When set, it will stop running
tests after the first failure. This is useful for CI environments where
you want to fail fast.
-->
### How did you verify your code works?
Added tests for lifecycle scripts. The tests are unlikely to reproduce
the bug, but we'll know if it actually fixes the issue if
`test/package.json` doesn't show in flaky tests anymore.
<!-- **For code changes, please include automated tests**. Feel free to
uncomment the line below -->
<!-- I wrote automated tests -->
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I included a test for the new code, or existing tests cover it
- [ ] I ran my tests locally and they pass (`bun-debug test
test-file-name.test`)
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1)
freed and (2) only freed when it should be
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a
JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally
(`bun-debug test test-file-name.test`)
-->
<!-- If new methods, getters, or setters were added to a publicly
exposed class:
- [ ] I added TypeScript types for the new methods, getters, or setters
-->
<!-- If dependencies in tests changed:
- [ ] I made sure that specific versions of dependencies are used
instead of ranged or tagged versions
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
- [ ] I added a test that imports the module
- [ ] I added a test that require() the module
-->
---------
Co-authored-by: taylor.fish <contact@taylor.fish>
### What does this PR do?
<!-- **Please explain what your changes do** -->
This PR should fix #14219 and implement
`WebAssembly.instantiateStreaming()` and
`WebAssembly.compileStreaming()`.
This is a mixture of WebKit's implementation (using a helper,
`handleResponseOnStreamingAction`, also containing a fast-path for
blobs) and some of Node.js's validation (error messages) and its
builtin-based strategy to consume chunks from streams.
`src/bun.js/bindings/GlobalObject.zig` has a helper function
(`getBodyStreamOrBytesForWasmStreaming`), called by C++, to validate the
response (like
[Node.js](214e4db60e/lib/internal/wasm_web_api.js)
does) and to extract the data from the response, either as a slice/span
(if we can get the data synchronously), or as a `ReadableStream` body
(if the data is still pending or if it is a file/S3 `Blob`).
In C++, `handleResponseOnStreamingAction` is called by
`compileStreaming` and `instantiateStreaming` on the
`JSC::GlobalObjectMethodTable`, just like in
[WebKit](97ee3c598a/Source/WebCore/bindings/js/JSDOMGlobalObject.cpp (L517)).
It calls the aforementioned Zig helper for validation and getting the
response data. The data is then fed into `JSC::Wasm::StreamingCompiler`.
If the data is received as a `ReadableStream`, then we call a JS builtin
in `WasmStreaming.ts` to iterate over each chunk of the stream, like
[Node.js](214e4db60e/lib/internal/wasm_web_api.js (L50-L52))
does. The `JSC::Wasm::StreamingCompiler` is passed into JS through a new
wrapper object, `WebCore::WasmStreamingCompiler`, like
[Node.js](214e4db60e/src/node_wasm_web_api.h)
does. It has `addBytes`, `finalize`, `error`, and (unused) `cancel`
methods to mirror the underlying JSC class.
(If there's a simpler way to do this, please let me know...that would be
very much appreciated)
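For reference, a small usage sketch of the newly supported APIs (the module URL and the exported `add` function are placeholders):
```ts
const response = fetch("https://example.com/add.wasm");
const { instance } = await WebAssembly.instantiateStreaming(response, {});

// Assuming the module exports an `add(a, b)` function:
const add = instance.exports.add as (a: number, b: number) => number;
console.log(add(2, 3));
```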
- [x] Code changes
### How did you verify your code works?
<!-- **For code changes, please include automated tests**. Feel free to
uncomment the line below -->
I wrote automated tests (`test/js/web/fetch/wasm-streaming.test`).
<!-- If JavaScript/TypeScript modules or builtins changed: -->
- [x] I included a test for the new code, or existing tests cover it
- [x] I ran my tests locally and they pass (`bun-debug test
test/js/web/fetch/wasm-streaming.test`)
<!-- If Zig files changed: -->
- [x] I checked the lifetime of memory allocated to verify it's (1)
freed and (2) only freed when it should be (NOTE: consumed `AnyBlob`
bodies are freed, and all other allocations are in C++ and either GCed
or ref-counted)
- [x] I included a test for the new code, or an existing test covers it
(NOTE: via JS/TS unit test)
- [x] JSValue used outside of the stack is either wrapped in a
JSC.Strong or is JSValueProtect'ed (NOTE: N/A, JSValue never used
outside the stack)
- [x] I wrote TypeScript/JavaScript tests and they pass locally
(`bun-debug test test/js/web/fetch/wasm-streaming.test`)
---------
Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>
## Summary
- Fixed buffer overflow in env_loader when parsing large environment
variables with escape sequences
- Replaced fixed 4096-byte buffer with a stack fallback allocator that
automatically switches to heap allocation for larger values
- Added comprehensive tests to prevent regression
## Background
The env_loader previously used a fixed threadlocal buffer that could
overflow when parsing environment variables containing escape sequences.
This caused crashes when the parsed value exceeded 4KB.
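A hedged sketch of the kind of regression test this enables, assuming Bun's `--env-file` and `-e` flags (the file name is illustrative):
```ts
import { expect, test } from "bun:test";
import { writeFileSync } from "node:fs";

test("env value larger than 4KB with escape sequences", async () => {
  // A value well past the old 4KB buffer, including escape sequences the parser expands.
  const big = "start\\n" + "x".repeat(8 * 1024) + "\\tend";
  writeFileSync(".env.large", `BIG="${big}"\n`);

  const proc = Bun.spawn(["bun", "--env-file=.env.large", "-e", "console.log(process.env.BIG.length)"]);
  const out = await new Response(proc.stdout).text();
  expect(Number(out.trim())).toBeGreaterThan(8 * 1024);
});
```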
## Changes
- Replaced fixed buffer with `StackFallbackAllocator` that uses 4KB
stack buffer for common cases and falls back to heap for larger values
- Updated all env parsing functions to accept a reusable buffer
parameter
- Added proper memory cleanup with defer statements
## Test plan
- [x] Added test cases for large environment variables with escape
sequences
- [x] Added test for values larger than 4KB
- [x] Added edge case tests (empty quotes, escape at EOF)
- [x] All existing env tests continue to pass
fixes #11627
fixes BAPI-1274
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
This PR fixes a bug in Bun's bundler where cyclic imports with async
dependencies would produce invalid JavaScript with syntax errors.
## Problem
When modules have cyclic imports and one uses top-level await, the
bundler wasn't properly marking all modules in the cycle as async. This
resulted in non-async wrapper functions containing `await` statements,
causing syntax errors like:
```
error: "await" can only be used inside an "async" function
```
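A minimal repro of this shape (hypothetical file names) is a two-module cycle where one side uses top-level await:
```ts
// a.ts — one side of the cycle.
import { b } from "./b.ts";
export const a = "a";
console.log("a sees", b);

// b.ts — imports a.ts back and uses top-level await, which requires every
// module in the cycle to be wrapped in an async function when bundled.
import { a } from "./a.ts";
export const b = await Promise.resolve(`b sees ${a ?? "nothing yet"}`);
```
Bundling the entrypoint of such a cycle previously emitted a non-async wrapper that still contained `await`, producing the error above.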
## Solution
The fix matches esbuild's approach by calling `validateTLA` for all
files before `scanImportsAndExports` begins. This ensures async status
is properly propagated through import chains before dependency
resolution.
Key changes:
1. Added a new phase that validates top-level await for all parsed
JavaScript files before import/export scanning
2. This matches esbuild's `finishScan` function which processes all
files in source index order
3. Ensures the `is_async_or_has_async_dependency` flag is properly set
for all modules in cyclic import chains
## Test Plan
- Fixed the reproduction case provided in
`/Users/dylan/clones/bun-esm-bug`
- All existing bundler tests pass, including
`test/bundler/esbuild/default.test.ts`
- The bundled output now correctly generates async wrapper functions
when needed
fixes #21113
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
## Summary
- Fixed shell lexer to properly store error messages using TextRange
instead of direct string slices
- This prevents potential use-after-free issues when error messages are
accessed after the lexer's string pool might have been reallocated
- Added test coverage for shell syntax error reporting
## Changes
- Changed `LexError.msg` from `[]const u8` to `Token.TextRange` to store
indices into the string pool
- Added `TextRange.slice()` helper method for converting ranges back to
string slices
- Updated error message concatenation logic to use the new range-based
approach
- Added test to verify syntax errors are reported correctly
## Test plan
- [x] Added test case for invalid shell syntax error reporting
- [x] Existing shell tests continue to pass
- [x] Manual testing of various shell syntax errors
closes BAPI-2232
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
- Add `exited: usize = 0` field to analytics Features struct
- Increment the counter atomically in `Global.exit` when called
- Provides visibility into how often exit is being called
## Test plan
- [x] Syntax check passes for both modified files
- [x] Code compiles without errors
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
Removes `ZigString.Slice.clone(...)` and replaces all of its usages with
`.cloneIfNeeded(...)` which is what it did anyway (why did this alias
exist in the first place?)
Anyone reading code that sees `.clone(...)` would expect it to clone the
underlying string. This makes it _extremely_ easy to write code which
looks okay but actually results in a use-after-free:
```zig
const out: []const u8 = out: {
const string = bun.String.cloneUTF8("hello friends!");
defer string.deref();
const utf8_slice = string.toUTF8(bun.default_allocator);
defer utf8_slice.deinit();
// doesn't actually clone
const cloned = utf8_slice.clone(bun.default_allocator) catch bun.outOfMemory();
break :out cloned.slice();
};
std.debug.print("Use after free: {s}!\n", .{out});
```
(This is a simplification of an actual example from the codebase)
## Summary
Fixes the broken update hdrhistogram GitHub Action workflow that was
failing due to multiple issues.
## Issues Fixed
1. **Tag SHA resolution failure**: The workflow failed when trying to
resolve commit SHA from lightweight tags, causing the error "Could not
fetch SHA for tag 0.11.8 @ 8dcce8f68512fca460b171bccc3a5afce0048779"
2. **Branch naming bug**: The workflow was creating branches named
`deps/update-cares-*` instead of `deps/update-hdrhistogram-*`
3. **Wrong workflow link**: PR body was linking to `update-cares.yml`
instead of `update-hdrhistogram.yml`
## Fix Details
- **Improved tag SHA resolution**: Updated logic to handle both
lightweight and annotated tags:
- Try to get commit SHA from tag object (for annotated tags)
- If that fails, use the tag SHA directly (for lightweight tags)
- Uses jq's `// empty` operator and proper error handling with
`2>/dev/null`
- **Fixed branch naming**: Changed from `deps/update-cares-*` to
`deps/update-hdrhistogram-*`
- **Updated workflow link**: Fixed PR body to link to correct workflow
file
## Test Plan
- [x] Verified current workflow logs show the exact error being fixed
- [x] Tested API calls locally to confirm the new logic works with
latest tag (0.11.8)
- [x] Confirmed the latest tag is a lightweight tag pointing directly to
commit
The workflow should now run successfully on the next scheduled execution
or manual trigger.
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
- Fixes the failing update-lolhtml GitHub Action that was unable to
handle lightweight Git tags
- The action was failing with "Could not fetch SHA for tag v2.6.0"
because it assumed all tags are annotated tag objects
- Updated the workflow to properly handle both lightweight tags (direct
commit refs) and annotated tags (tag objects)
## Root Cause
The lolhtml repository uses lightweight tags (like v2.6.0) which point
directly to commits, not to tag objects. The original workflow tried to
fetch a tag object for every tag, causing failures when the tag was
lightweight.
## Solution
The fix adds logic to:
1. Check the tag object type from the Git refs API response
2. For annotated tags: fetch the commit SHA from the tag object
(original behavior)
3. For lightweight tags: use the SHA directly from the ref (new
behavior)
## Test Plan
- [x] Verified the workflow logic handles both tag types correctly
- [ ] The next scheduled run should succeed (runs weekly on Sundays at 1
AM UTC)
- [ ] Manual workflow dispatch can be used to test immediately
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## What does this PR do?
Fixes the failing `update-lshpack.yml` GitHub Action that has been
consistently failing with the error: "Could not fetch SHA for tag v2.3.4
@ {SHA}".
## Root Cause
The workflow assumed all Git tags are annotated tags that need to be
dereferenced via the GitHub API. However, some tags (like lightweight
tags) point directly to commits and don't need dereferencing. When the
script tried to dereference a lightweight tag, the API call failed.
## Fix
This PR updates the workflow to:
1. **Check the tag type** before attempting to dereference
2. **For annotated tags** (`type="tag"`): dereference to get the commit
SHA via `/git/tags/{sha}`
3. **For lightweight tags** (`type="commit"`): use the SHA directly
since it already points to the commit
## Changes Made
- Updated `.github/workflows/update-lshpack.yml` to properly handle both
lightweight and annotated Git tags
- Added proper tag type checking before attempting to dereference tags
- Improved error messages to distinguish between tag types
## Testing
The workflow will now handle both types of Git tags properly:
- ✅ Annotated tags: properly dereferences to get commit SHA
- ✅ Lightweight tags: uses the tag SHA directly as commit SHA
This should resolve the consistent failures in the lshpack update
automation.
## Files Changed
- `.github/workflows/update-lshpack.yml`: Updated tag SHA resolution
logic
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
## Summary
Fixes the broken update highway GitHub action that was failing with:
> Error: Could not fetch SHA for tag 1.2.0 @
457c891775a7397bdb0376bb1031e6e027af1c48
## Root Cause
The workflow assumed all Git tags are annotated tags, but Google Highway
uses lightweight tags. For lightweight tags, the GitHub API returns
`object.type: "commit"` and `object.sha` is already the commit SHA. For
annotated tags, `object.type: "tag"` and you need to fetch the tag
object to get the commit SHA.
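A sketch of the two-step resolution against the GitHub refs API, using the repository and tag named above (error handling omitted):
```ts
const repo = "google/highway";
const tag = "1.2.0";

const ref = await (await fetch(`https://api.github.com/repos/${repo}/git/ref/tags/${tag}`)).json();

let commitSha: string = ref.object.sha;
if (ref.object.type === "tag") {
  // Annotated tag: dereference the tag object to reach the commit.
  const tagObj = await (await fetch(`https://api.github.com/repos/${repo}/git/tags/${ref.object.sha}`)).json();
  commitSha = tagObj.object.sha;
}
// Lightweight tag: object.sha already is the commit SHA.
console.log(commitSha);
```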
## Changes
- Updated tag SHA fetching logic to handle both lightweight and
annotated tags
- Fixed incorrect branch name (`deps/update-cares` →
`deps/update-highway`)
- Fixed workflow URL in PR template
## Test Plan
- [x] Verified the API returns `type: "commit"` for highway tag 1.2.0
- [x] Confirmed the fix properly extracts the commit SHA:
`457c891775a7397bdb0376bb1031e6e027af1c48`
- [x] Manual testing shows current version (`12b325bc...`) != latest
version (`457c891...`)
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
### What does this PR do?
<!-- **Please explain what your changes do**, example: -->
use `window.location.origin` in browser instead of `bun://` .
should fix [#19910](https://github.com/oven-sh/bun/issues/19910)
<!--
This adds a new flag --bail to bun test. When set, it will stop running
tests after the first failure. This is useful for CI environments where
you want to fail fast.
-->
- [ ] Documentation or TypeScript types (it's okay to leave the rest
blank in this case)
- [x] Code changes
### How did you verify your code works?
<!-- **For code changes, please include automated tests**. Feel free to
uncomment the line below -->
<!-- I wrote automated tests -->
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I included a test for the new code, or existing tests cover it
- [ ] I ran my tests locally and they pass (`bun-debug test
test-file-name.test`)
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1)
freed and (2) only freed when it should be
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a
JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally
(`bun-debug test test-file-name.test`)
-->
<!-- If new methods, getters, or setters were added to a publicly
exposed class:
- [ ] I added TypeScript types for the new methods, getters, or setters
-->
<!-- If dependencies in tests changed:
- [ ] I made sure that specific versions of dependencies are used
instead of ranged or tagged versions
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
- [ ] I added a test that imports the module
- [ ] I added a test that require() the module
-->
### What does this PR do?
Fixes an assertion failure in the dev server that could happen when you
delete files. The issue was that `disconnectEdgeFromDependencyList(...)`
was wrong and prematurely set `g.first_deps[idx] = .none`.
Fixes #20529
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Fixes #12276: toIncludeRepeated should check for the exact repeat count, not >=.
This is a breaking change because some people may be relying on the
existing behaviour. Should it be feature-flagged for 1.3?
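For illustration, the difference in a test (using the matcher as it exists today):
```ts
import { expect, test } from "bun:test";

test("toIncludeRepeated counts exactly", () => {
  const s = "foo foo foo"; // "foo" appears exactly 3 times
  expect(s).toIncludeRepeated("foo", 3);     // passes: exact match
  expect(s).not.toIncludeRepeated("foo", 2); // previously 2 <= 3 passed; with exact counting it no longer does
});
```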
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Ensure we aren't using multiple allocators with the same list by storing
a pointer to the allocator in debug mode only.
This check is stricter than the bare minimum necessary to prevent
illegal behavior, so CI may reveal certain uses that fail the checks but
don't cause IB. Most of these cases should probably be updated to comply
with the new requirements—we want these types' invariants to be clear.
(For internal tracking: fixes ENG-14987)
`add` no longer locks a mutex, and `finish` no longer locks a mutex
except for the last task. This could meaningfully improve performance in
cases where we spawn a large number of tasks on a thread pool. This
change doesn't alter the semantics of the type, unlike the standard
library's `WaitGroup`, which also uses atomics but has to be explicitly
reset.
(For internal tracking: fixes ENG-19722)
---------
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
### What does this PR do?
lets you get useful info out of this script even if you set the number
of attempts too high and you don't want to wait
### How did you verify your code works?
local testing
(I cancelled CI because this script is not used anywhere in CI so it
wouldn't tell us anything useful)
### What does this PR do?
- for these kinds of aborts which we test in CI, introduce a feature
flag to suppress core dumps and crash reporting only from that abort,
and set the flag when running the test:
- libuv stub functions
- Node-API abort (used in particular when calling illegal functions
during finalizers)
- passing `process.kill` its own PID
- core dumps are suppressed with `setrlimit`, and crash reporting with
the new `suppress_reporting` field. these suppressions are only engaged
right before crashing, so we won't ignore new kinds of crashes that come
up in these tests.
- for the test bindings used to test the crash handler in
`run-crash-handler.test.ts`, disables core dumps but does not disable
crash reporting (because crashes get reported to a server that the test
is running to make sure they are reported)
- fixes a panic when printing source code around an error containing
`\n\r`
- updates the code where we clone vendor tests to checkout the right tag
- adds `vendor/elysia/test/path/plugin.test.ts` to
no-validate-exceptions
- this failure was exposed by starting to test the version of elysia we
have been intending to test. the crash trace suggests it may be fixed by
#21307.
- makes dumping core or uploading a crash report count as a failing test
- this ensures we notice crashes even when they happen in a subprocess and
the main test doesn't adequately check the exit code. To spawn a
subprocess you expect to fail, prefer `expect(code).toBe(1)` over
`expect(code).not.toBe(0)`. If you really expect multiple possible
erroneous exit codes, you might try `expect(signal).toBeNull()` to still
disallow crashes (see the sketch after this list).
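A minimal sketch of that guidance in a test (the spawned script path is a placeholder):
```ts
import { expect, test } from "bun:test";

test("subprocess fails without crashing", async () => {
  const proc = Bun.spawn(["bun", "run", "./expected-to-fail.ts"]);
  const code = await proc.exited;
  expect(proc.signalCode).toBeNull(); // still disallow crashes
  expect(code).toBe(1);               // prefer a specific code over .not.toBe(0)
});
```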
### How did you verify your code works?
Running affected tests on a Linux machine with core dumps set up and
checking no new ones appear.
https://buildkite.com/bun/bun/builds/21465 has no core dumps.
Also fix a race condition with hardlinking on Windows during hoisted
installs, and a bug in the process waiter thread implementation causing
items to be skipped.
(For internal tracking: fixes STAB-850, STAB-873, STAB-881)
## Summary
Fixes the "index out of bounds: index 0, len 0" crash that occurs during
large batch PostgreSQL inserts, particularly on Windows systems.
The issue occurred when PostgreSQL DataRow messages contained data but
the `statement.fields` array was empty (len=0), causing crashes in
`DataCell.Putter.putImpl()`. This typically happens during large batch
operations where there may be race conditions or timing issues between
RowDescription and DataRow message processing.
## Changes
- **Add bounds checking** in `DataCell.Putter.putImpl()` before
accessing `fields` and `list` arrays
(src/sql/postgres/DataCell.zig:1043-1050)
- **Graceful degradation** - return `false` to ignore extra fields
instead of crashing
- **Debug logging** to help diagnose field metadata issues
- **Comprehensive regression tests** covering batch inserts, empty
results, and concurrent operations
## Test Plan
- [x] Added regression tests in `test/regression/issue/21311.test.ts`
- [x] Tests pass with the fix: All 3 tests pass with 212 expect() calls
- [x] Existing PostgreSQL tests still work (no regressions)
The fix prevents the crash while maintaining safe operation, allowing
PostgreSQL batch operations to continue working reliably.
## Root Cause
The crash occurred when:
1. `statement.fields` array was empty (len=0) due to timing issues
2. PostgreSQL DataRow messages contained actual data
3. Code tried to access `this.list[index]` and `this.fields[index]`
without bounds checking
This was particularly problematic on Windows during batch operations due
to potential differences in:
- Network stack message ordering
- Memory allocation behavior
- Threading/concurrency during batch operations
- Statement preparation timing
Fixes #21311
🤖 Generated with [Claude Code](https://claude.ai/code)
---------
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
I haven't checked all uses of tryTakeException but this bug is probably
not the only one.
Caught by running fuzzy-wuzzy with debug logging enabled. It tried to
print the exception. Updates fuzzy-wuzzy to have improved logging that
can tell you what was last executed before a crash.
---------
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
### What does this PR do?
Closes #13012
On Linux, when any Bun process spawned by `runner.node.mjs` crashes, we
run GDB in batch mode to print a backtrace from the core file.
And on all platforms, we run a mini `bun.report` server which collects
crashes reported by any Bun process executed during the tests, and after
each test `runner.node.mjs` fetches and prints any new crashes from the
server.
<details>
<summary>example 1</summary>
```
#0 crash_handler.crash () at crash_handler.zig:1513
#1 0x0000000002cf4020 in crash_handler.crashHandler (reason=..., error_return_trace=0x0, begin_addr=...) at crash_handler.zig:479
#2 0x0000000002cefe25 in crash_handler.handleSegfaultPosix (sig=<optimized out>, info=<optimized out>) at crash_handler.zig:800
#3 0x00000000045a1124 in WTF::jscSignalHandler (sig=11, info=0x7ffe044e30b0, ucontext=0x0) at vendor/WebKit/Source/WTF/wtf/threads/Signals.cpp:548
#4 <signal handler called>
#5 JSC::JSCell::type (this=0x0) at vendor/WebKit/Source/JavaScriptCore/runtime/JSCellInlines.h:137
#6 JSC::JSObject::getOwnNonIndexPropertySlot (this=0x150bc914fe18, vm=..., structure=0x150a0102de50, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSObject.h:1348
#7 JSC::JSObject::getPropertySlot<false> (this=0x150bc914fe18, globalObject=0x150b864e0088, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSObject.h:1433
#8 JSC::JSValue::getPropertySlot (this=0x7ffe044e4880, globalObject=0x150b864e0088, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSCJSValueInlines.h:1108
#9 JSC::JSValue::get (this=0x7ffe044e4880, globalObject=0x150b864e0088, propertyName=..., slot=...) at vendor/WebKit/Source/JavaScriptCore/runtime/JSCJSValueInlines.h:1065
#10 JSC::LLInt::performLLIntGetByID (bytecodeIndex=..., codeBlock=0x150b861e7740, globalObject=0x150b864e0088, baseValue=..., ident=..., metadata=...) at vendor/WebKit/Source/JavaScriptCore/llint/LLIntSlowPaths.cpp:878
#11 0x0000000004d7b055 in llint_slow_path_get_by_id (callFrame=0x7ffe044e4ab0, pc=0x150bc92ea0e7) at vendor/WebKit/Source/JavaScriptCore/llint/LLIntSlowPaths.cpp:946
#12 0x0000000003dd6042 in llint_op_get_by_id ()
#13 0x0000000000000000 in ?? ()
```
</details>
<details>
<summary>example 2</summary>
```
#0 crash_handler.crash () at crash_handler.zig:1513
#1 0x0000000002c5db80 in crash_handler.crashHandler (reason=..., error_return_trace=0x0, begin_addr=...) at crash_handler.zig:479
#2 0x0000000002c59f60 in crash_handler.handleSegfaultPosix (sig=<optimized out>, info=<optimized out>) at crash_handler.zig:800
#3 0x00000000042ecc88 in WTF::jscSignalHandler (sig=11, info=0xfffff60141b0, ucontext=0xfffff6014230) at vendor/WebKit/Source/WTF/wtf/threads/Signals.cpp:548
#4 <signal handler called>
#5 bun.js.api.FFIObject.Reader.u8 (globalObject=0x4000554e0088) at /var/lib/buildkite-agent/builds/ip-172-31-75-92/bun/bun/src/bun.js/api/FFIObject.zig:65
#6 bun.js.jsc.host_fn.toJSHostCall__anon_1711576 (globalThis=0x4000554e0088, args=...) at /var/lib/buildkite-agent/builds/ip-172-31-75-92/bun/bun/src/bun.js/jsc/host_fn.zig:97
#7 bun.js.jsc.host_fn.DOMCall("Reader"[0..6],bun.js.api.FFIObject.Reader,"u8"[0..2],.{ .reads = .{ ... }, .writes = .{ ... } }).slowpath (globalObject=0x4000554e0088, thisValue=70370172175040, arguments_ptr=0xfffff6015460, arguments_len=1) at /var/lib/buildkite-agent/builds/ip-172-31-75-92/bun/bun/src/bun.js/jsc/host_fn.zig:490
#8 0x000040003419003c in ?? ()
#9 0x0000400055173440 in ?? ()
```
</details>
I used GDB instead of LLDB (as the branch name suggests) because it
seems to produce more useful stack traces with musl libc.
- [x] on linux, use gdb to print from core dump of main bun process
crashed
- [x] on linux, use gdb to print from all new core dumps (so including
bun subprocesses spawned by the test that crashed)
- [x] on all platforms, use a mini bun.report server to print a
self-reported trace (depends on oven-sh/bun.report#15; for now our
package.json points to a commit on the branch of that repo)
- [x] fix trying to fetch stack traces too early on windows
- [x] use output groups so the traces show up alongside the log for the
specific test instead of having to find it in the logs from the entire
run
- [x] get oven-sh/bun.report#15 merged, and point to a bun.report commit
on the main branch instead of the PR branch in package.json
### How did you verify your code works?
Manually, and in CI with a crashing test.
---------
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
We cannot assume that the next event loop cycle will yield the expected outcome from the operating system. We can assume certain things happen in a certain order, but the fact that timers run on the next cycle does not mean kqueue/epoll/IOCP will already have the data available.
There are many situations where using `catch unreachable` is a reasonable or sometimes necessary decision. This rule causes many, many merge conflicts.
@@ -2,44 +2,44 @@ name: 🇹 TypeScript Type Bug Report
description: Report an issue with TypeScript types
labels: [bug, types]
body:
  - type: markdown
    attributes:
      value: |
        Thank you for submitting a bug report. It helps make Bun better.
        If you need help or support using Bun, and are not reporting a bug, please
        join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can ask questions in the [`#help`](https://discord.gg/32EtH6p7HN) forum.
        Make sure you are running the [latest](https://bun.sh/docs/installation#upgrading) version of Bun.
        Make sure you are running the [latest](https://bun.com/docs/installation#upgrading) version of Bun.
        The bug you are experiencing may already have been fixed.
        Please try to include as much information as possible.
  - type: input
    attributes:
      label: What version of Bun is running?
      description: Copy the output of `bun --revision`
  - type: input
    attributes:
      label: What platform is your computer?
      description: |
        For MacOS and Linux: copy the output of `uname -mprs`
        For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
  - type: textarea
    attributes:
      label: What steps can reproduce the bug?
      description: Explain the bug and provide a code snippet that can reproduce it.
    validations:
      required: true
  - type: textarea
    attributes:
      label: What is the expected behavior?
      description: If possible, please provide text instead of a screenshot.
  - type: textarea
    attributes:
      label: What do you see instead?
      description: If possible, please provide text instead of a screenshot.
  - type: textarea
    attributes:
      label: Additional information
      description: Is there anything else you think we should know?
<!-- **Please explain what your changes do**, example: -->
<!--
This adds a new flag --bail to bun test. When set, it will stop running tests after the first failure. This is useful for CI environments where you want to fail fast.
-->
- [ ] Documentation or TypeScript types (it's okay to leave the rest blank in this case)
- [ ] Code changes
### How did you verify your code works?
<!-- **For code changes, please include automated tests**. Feel free to uncomment the line below -->
<!-- I wrote automated tests -->
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I included a test for the new code, or existing tests cover it
- [ ] I ran my tests locally and they pass (`bun-debug test test-file-name.test`)
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1) freed and (2) only freed when it should be
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally (`bun-debug test test-file-name.test`)
-->
<!-- If new methods, getters, or setters were added to a publicly exposed class:
- [ ] I added TypeScript types for the new methods, getters, or setters
-->
<!-- If dependencies in tests changed:
- [ ] I made sure that specific versions of dependencies are used instead of ranged or tagged versions
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
This document provides guidance for maintaining the GitHub Actions workflows in this repository.
## format.yml Workflow
### Overview
The `format.yml` workflow runs code formatters (Prettier, clang-format, and Zig fmt) on pull requests and pushes to main. It's optimized for speed by running all formatters in parallel.
Configuring a development environment for Bun can take 10-30 minutes depending on your internet connection and computer speed. You will need ~10GB of free disk space for the repository and build artifacts.
If you are using Windows, please refer to [this guide](https://bun.sh/docs/project/building-windows)
If you are using Windows, please refer to [this guide](https://bun.com/docs/project/building-windows)
## Install Dependencies
@@ -37,7 +37,7 @@ Before starting, you will need to already have a release build of Bun installed,
{% codetabs %}
```bash#Native
$ curl -fsSL https://bun.sh/install | bash
$ curl -fsSL https://bun.com/install | bash
```
```bash#npm
@@ -160,6 +160,7 @@ In particular, these are:
- `./src/codegen/generate-jssink.ts` -- Generates `build/debug/codegen/JSSink.cpp`, `build/debug/codegen/JSSink.h` which implement various classes for interfacing with `ReadableStream`. This is internally how `FileSink`, `ArrayBufferSink`, `"type": "direct"` streams and other code related to streams works.
- `./src/codegen/generate-classes.ts` -- Generates `build/debug/codegen/ZigGeneratedClasses*`, which generates Zig & C++ bindings for JavaScriptCore classes implemented in Zig. In `**/*.classes.ts` files, we define the interfaces for various classes, methods, prototypes, getters/setters etc which the code generator reads to generate boilerplate code implementing the JavaScript objects in C++ and wiring them up to Zig
- `./src/codegen/cppbind.ts` -- Generates automatic Zig bindings for C++ functions marked with `[[ZIG_EXPORT]]` attributes.
- `./src/codegen/bundle-modules.ts` -- Bundles built-in modules like `node:fs`, `bun:ffi` into files we can include in the final binary. In development, these can be reloaded without rebuilding Zig (you still need to run `bun run build`, but it re-reads the transpiled files from disk afterwards). In release builds, these are embedded into the binary.
- `./src/codegen/bundle-functions.ts` -- Bundles globally-accessible functions implemented in JavaScript/TypeScript like `ReadableStream`, `WritableStream`, and a handful more. These are used similarly to the builtin modules, but the output more closely aligns with what WebKit/Safari does for Safari's built-in functions so that we can copy-paste the implementations from WebKit as a starting point.
gh release create --repo=$(BUN_AUTO_UPDATER_REPO) --title "bun v$(PACKAGE_JSON_VERSION)" "$(BUN_BUILD_TAG)" -n "See https://github.com/oven-sh/bun/releases/tag/$(BUN_BUILD_TAG) for release notes. Using the install script or bun upgrade is the recommended way to install bun. Join bun's Discord to get access https://bun.sh/discord"
gh release create --repo=$(BUN_AUTO_UPDATER_REPO) --title "bun v$(PACKAGE_JSON_VERSION)" "$(BUN_BUILD_TAG)" -n "See https://github.com/oven-sh/bun/releases/tag/$(BUN_BUILD_TAG) for release notes. Using the install script or bun upgrade is the recommended way to install bun. Join bun's Discord to get access https://bun.com/discord"
release-bin-entitlements:
@@ -1977,7 +1977,7 @@ integration-test-dev: # to run integration tests
@@ -47,14 +47,14 @@ Bun supports Linux (x64 & arm64), macOS (x64 & Apple Silicon) and Windows (x64).
> **Linux users** — Kernel version 5.6 or higher is strongly recommended, but the minimum is 5.1.
> **x64 users** — if you see "illegal instruction" or similar errors, check our [CPU requirements](https://bun.sh/docs/installation#cpu-requirements-and-baseline-builds)
> **x64 users** — if you see "illegal instruction" or similar errors, check our [CPU requirements](https://bun.com/docs/installation#cpu-requirements-and-baseline-builds)
Report any discovered vulnerabilities to the Bun team by emailing `security@bun.sh`. Your report will be acknowledged within 5 days, and a team member will be assigned as the primary handler. To the greatest extent possible, the security team will endeavor to keep you informed of the progress being made towards a fix and full announcement, and may ask for additional information or guidance surrounding the reported issue.
Report any discovered vulnerabilities to the Bun team by emailing `security@bun.com`. Your report will be acknowledged within 5 days, and a team member will be assigned as the primary handler. To the greatest extent possible, the security team will endeavor to keep you informed of the progress being made towards a fix and full announcement, and may ask for additional information or guidance surrounding the reported issue.
**Note** — Bun provides a browser- and Node.js-compatible [console](https://developer.mozilla.org/en-US/docs/Web/API/console) global. This page only documents Bun-native APIs.
{% /callout %}
## Object inspection depth
Bun allows you to configure how deeply nested objects are displayed in `console.log()` output:
- **CLI flag**: Use `--console-depth <number>` to set the depth for a single run
- **Configuration**: Set `console.depth` in your `bunfig.toml` for persistent configuration
- **Default**: Objects are inspected to a depth of `2` levels
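For example, a small sketch using the flag and `bunfig.toml` key listed above (values are illustrative):
```ts
// With the default depth of 2, the innermost levels collapse to [Object ...].
const nested = { a: { b: { c: { d: "deep" } } } };
console.log(nested);

// To see deeper levels, either run with the CLI flag:
//   bun --console-depth 4 script.ts
// or set it persistently in bunfig.toml:
//   [console]
//   depth = 4
```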
Note: Only PUT and POST methods support request bodies when using S3. For uploads, Bun automatically uses multipart upload for streaming bodies.
You can read more about Bun's S3 support in the [S3](https://bun.sh/docs/api/s3) documentation.
You can read more about Bun's S3 support in the [S3](https://bun.com/docs/api/s3) documentation.
#### File URLs - `file://`
@@ -320,7 +320,6 @@ Bun automatically sets the `Content-Type` header for request bodies when not exp
- For `Blob` objects, uses the blob's `type`
- For `FormData`, sets appropriate multipart boundary
- For JSON objects, sets `application/json`
## Debugging
@@ -376,14 +375,14 @@ To prefetch a DNS entry, you can use the `dns.prefetch` API. This API is useful
```ts
import { dns } from "bun";
dns.prefetch("bun.sh");
dns.prefetch("bun.com");
```
#### DNS caching
By default, Bun caches and deduplicates DNS queries in-memory for up to 30 seconds. You can see the cache stats by calling `dns.getCacheStats()`:
To learn more about DNS caching in Bun, see the [DNS caching](https://bun.sh/docs/api/dns) documentation.
To learn more about DNS caching in Bun, see the [DNS caching](https://bun.com/docs/api/dns) documentation.
### Preconnect to a host
@@ -392,7 +391,7 @@ To preconnect to a host, you can use the `fetch.preconnect` API. This API is use
```ts
import { fetch } from "bun";
fetch.preconnect("https://bun.sh");
fetch.preconnect("https://bun.com");
```
Note: calling `fetch` immediately after `fetch.preconnect` will not make your request faster. Preconnecting only helps if you know you'll need to connect to a host soon, but you're not ready to make the request yet.
@@ -402,7 +401,7 @@ Note: calling `fetch` immediately after `fetch.preconnect` will not make your re
To preconnect to a host at startup, you can pass `--fetch-preconnect`:
```sh
$ bun --fetch-preconnect https://bun.sh ./my-script.ts
$ bun --fetch-preconnect https://bun.com ./my-script.ts
```
This is sort of like `<link rel="preconnect">` in HTML.
<!-- **Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. Existing Node.js projects may use Bun's [nearly complete](https://bun.sh/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module. -->
<!-- **Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. Existing Node.js projects may use Bun's [nearly complete](https://bun.com/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module. -->
**Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. For operations that are not yet available with `Bun.file`, such as `mkdir` or `readdir`, you can use Bun's [nearly complete](https://bun.sh/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module.
**Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. For operations that are not yet available with `Bun.file`, such as `mkdir` or `readdir`, you can use Bun's [nearly complete](https://bun.com/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module.
The page primarily documents the Bun-native `Bun.serve` API. Bun also implements [`fetch`](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) and the Node.js [`http`](https://nodejs.org/api/http.html) and [`https`](https://nodejs.org/api/https.html) modules.
{% callout %}
These modules have been re-implemented to use Bun's fast internal HTTP infrastructure. Feel free to use these modules directly; frameworks like [Express](https://expressjs.com/) that depend on these modules should work out of the box. For granular compatibility information, see [Runtime > Node.js APIs](https://bun.sh/docs/runtime/nodejs-apis).
These modules have been re-implemented to use Bun's fast internal HTTP infrastructure. Feel free to use these modules directly; frameworks like [Express](https://expressjs.com/) that depend on these modules should work out of the box. For granular compatibility information, see [Runtime > Node.js APIs](https://bun.com/docs/runtime/nodejs-apis).
{% /callout %}
To start a high-performance HTTP server with a clean API, the recommended approach is [`Bun.serve`](#start-a-server-bun-serve).
- **File routes only**: Missing or inaccessible files return `404 Not Found`
```ts
const server = Bun.serve({
  static: {
    // ...
  },
});
```
HTML imports don't just serve HTML — it's a full-featured frontend bundler, transpiler, and toolkit built using Bun's [bundler](https://bun.sh/docs/bundler), JavaScript transpiler and CSS parser. You can use this to build full-featured frontends with React, TypeScript, Tailwind CSS, and more.
HTML imports don't just serve HTML — it's a full-featured frontend bundler, transpiler, and toolkit built using Bun's [bundler](https://bun.com/docs/bundler), JavaScript transpiler and CSS parser. You can use this to build full-featured frontends with React, TypeScript, Tailwind CSS, and more.
For a complete guide on building full-stack applications with HTML imports, including detailed examples and best practices, see [/docs/bundler/fullstack](https://bun.sh/docs/bundler/fullstack).
For a complete guide on building full-stack applications with HTML imports, including detailed examples and best practices, see [/docs/bundler/fullstack](https://bun.com/docs/bundler/fullstack).
### Practical example: REST API
@@ -605,7 +669,7 @@ Bun.serve({
```
{% callout %}
[Learn more about debugging in Bun](https://bun.sh/docs/runtime/debugger)
[Learn more about debugging in Bun](https://bun.com/docs/runtime/debugger)
{% /callout %}
The call to `Bun.serve` returns a `Server` object. To stop the server, call the `.stop()` method.
@@ -772,7 +836,7 @@ Instead of passing the server options into `Bun.serve`, `export default` it. Thi
$ bun --hot server.ts
``` -->
<!-- It's possible to configure hot reloading while using the explicit `Bun.serve` API; for details refer to [Runtime > Hot reloading](https://bun.sh/docs/runtime/hot). -->
<!-- It's possible to configure hot reloading while using the explicit `Bun.serve` API; for details refer to [Runtime > Hot reloading](https://bun.com/docs/runtime/hot). -->
Bun can preconnect to PostgreSQL at startup to improve performance by establishing database connections before your application code runs. This is useful for reducing connection latency on the first database query.
```bash
# Enable PostgreSQL preconnection
bun --sql-preconnect index.js
# Works with DATABASE_URL environment variable
DATABASE_URL=postgres://user:pass@localhost:5432/db bun --sql-preconnect index.js
# Can be combined with other runtime flags
bun --sql-preconnect --hot index.js
```
The `--sql-preconnect` flag will automatically establish a PostgreSQL connection using your configured environment variables at startup. If the connection fails, it won't crash your application - the error will be handled gracefully.
## Connection Options
You can configure your database connection manually by passing options to the SQL constructor:
@@ -88,6 +88,20 @@ The order of the `--target` flag does not matter, as long as they're delimited b
On x64 platforms, Bun uses SIMD optimizations which require a modern CPU supporting AVX2 instructions. The `-baseline` build of Bun is for older CPUs that don't support these optimizations. Normally, when you install Bun we automatically detect which version to use but this can be harder to do when cross-compiling since you might not know the target CPU. You usually don't need to worry about it on Darwin x64, but it is relevant for Windows x64 and Linux x64. If you or your users see `"Illegal instruction"` errors, you might need to use the baseline version.
## Build-time constants
Use the `--define` flag to inject build-time constants into your executable, such as version numbers, build timestamps, or configuration values:
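For example, a hedged sketch (constant name and values are illustrative):
```ts
// Built with something like:
//   bun build --compile --define 'BUILD_VERSION="1.2.3"' ./cli.ts --outfile mycli
// The constant is referenced like a global and inlined at build time.
declare const BUILD_VERSION: string;
console.log(`running version ${BUILD_VERSION}`);
```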
These constants are embedded directly into your compiled binary at build time, providing zero runtime overhead and enabling dead code elimination optimizations.
{% callout type="info" %}
For comprehensive examples and advanced patterns, see the [Build-time constants guide](/guides/runtime/build-time-constants).
{% /callout %}
## Deploying to production
Compiled executables reduce memory usage and improve Bun's start time.
@@ -126,6 +140,42 @@ The `--sourcemap` argument embeds a sourcemap compressed with zstd, so that erro
The `--bytecode` argument enables bytecode compilation. Every time you run JavaScript code in Bun, JavaScriptCore (the engine) will compile your source code into bytecode. We can move this parsing work from runtime to bundle time, saving you startup time.
## Act as the Bun CLI
{% note %}
New in Bun v1.2.16
{% /note %}
You can run a standalone executable as if it were the `bun` CLI itself by setting the `BUN_BE_BUN=1` environment variable. When this variable is set, the executable will ignore its bundled entrypoint and instead expose all the features of Bun's CLI.
For example, consider an executable compiled from a simple script:
```sh
$ cat such-bun.js
console.log("you shouldn't see this");
$ bun build --compile ./such-bun.js
[3ms] bundle 1 modules
[89ms] compile such-bun
```
Normally, running `./such-bun` with arguments would execute the script. However, with the `BUN_BE_BUN=1` environment variable, it acts just like the `bun` binary:
```sh
# Executable runs its own entrypoint by default
$ ./such-bun install
you shouldn't see this
# With the env var, the executable acts like the `bun` CLI
$ BUN_BE_BUN=1 ./such-bun install
bun install v1.2.16-canary.1 (1d1db811)
Checked 63 installs across 64 packages (no changes) [5.00ms]
```
This is useful for building CLI tools on top of Bun that may need to install packages, bundle dependencies, run different or local files and more without needing to download a separate binary or install bun.
@@ -325,7 +325,7 @@ When adding a build step is too complicated, you can set `development: false` in
## Plugins
Bun's [bundler plugins](https://bun.sh/docs/bundler/plugins) are also supported when bundling static routes.
Bun's [bundler plugins](https://bun.com/docs/bundler/plugins) are also supported when bundling static routes.
To configure plugins for `Bun.serve`, add a `plugins` array in the `[serve.static]` section of your `bunfig.toml`.
@@ -365,7 +365,7 @@ Or in your CSS:
### Custom plugins
Any JS file or module which exports a [valid bundler plugin object](https://bun.sh/docs/bundler/plugins#usage) (essentially an object with a `name` and `setup` field) can be placed inside the `plugins` array:
Any JS file or module which exports a [valid bundler plugin object](https://bun.com/docs/bundler/plugins#usage) (essentially an object with a `name` and `setup` field) can be placed inside the `plugins` array:
Like the Bun runtime, the bundler supports an array of file types out of the box. The following table breaks down the bundler's set of standard "loaders". Refer to [Bundler > File types](https://bun.sh/docs/runtime/loaders) for full documentation.
Like the Bun runtime, the bundler supports an array of file types out of the box. The following table breaks down the bundler's set of standard "loaders". Refer to [Bundler > File types](https://bun.com/docs/runtime/loaders) for full documentation.
{% table %}
@@ -220,11 +220,11 @@ console.log(logo);
The exact behavior of the file loader is also impacted by [`naming`](#naming) and [`publicPath`](#publicpath).
{% /callout %}
Refer to the [Bundler > Loaders](https://bun.sh/docs/bundler/loaders#file) page for more complete documentation on the file loader.
Refer to the [Bundler > Loaders](https://bun.com/docs/bundler/loaders#file) page for more complete documentation on the file loader.
### Plugins
The behavior described in this table can be overridden or extended with [plugins](https://bun.sh/docs/bundler/plugins). Refer to the [Bundler > Loaders](https://bun.sh/docs/bundler/plugins) page for complete documentation.
The behavior described in this table can be overridden or extended with [plugins](https://bun.com/docs/bundler/plugins). Refer to the [Bundler > Loaders](https://bun.com/docs/bundler/plugins) page for complete documentation.
## API
@@ -484,7 +484,7 @@ n/a
{% /codetabs %}
Bun implements a universal plugin system for both Bun's runtime and bundler. Refer to the [plugin documentation](https://bun.sh/docs/bundler/plugins) for complete documentation.
Bun implements a universal plugin system for both Bun's runtime and bundler. Refer to the [plugin documentation](https://bun.com/docs/bundler/plugins) for complete documentation.
<!-- ### `manifest`
@@ -1102,7 +1102,7 @@ A prefix to be appended to any import paths in bundled code.
In many cases, generated bundles will contain no `import` statements. After all, the goal of bundling is to combine all of the code into a single file. However, there are a number of cases in which the generated bundles will contain `import` statements.
- **Asset imports** — When importing an unrecognized file type like `*.svg`, the bundler defers to the [`file` loader](https://bun.sh/docs/bundler/loaders#file), which copies the file into `outdir` as is. The import is converted into a variable
- **Asset imports** — When importing an unrecognized file type like `*.svg`, the bundler defers to the [`file` loader](https://bun.com/docs/bundler/loaders#file), which copies the file into `outdir` as is. The import is converted into a variable
- **External modules** — Files and modules can be marked as [`external`](#external), in which case they will not be included in the bundle. Instead, the `import` statement will be left in the final bundle.
- **Chunking**. When [`splitting`](#splitting) is enabled, the bundler may generate separate "chunk" files that represent code that is shared among multiple entrypoints.
A map of file extensions to [built-in loader names](https://bun.sh/docs/bundler/loaders#built-in-loaders). This can be used to quickly customize how certain files are loaded.
A map of file extensions to [built-in loader names](https://bun.com/docs/bundler/loaders#built-in-loaders). This can be used to quickly customize how certain files are loaded.
{% codetabs %}
@@ -1310,7 +1310,7 @@ Each artifact also contains the following properties:
---
- `loader`
- The loader was used to interpret the file. See [Bundler > Loaders](https://bun.sh/docs/bundler/loaders) to see how Bun maps file extensions to the appropriate built-in loader.
- The loader was used to interpret the file. See [Bundler > Loaders](https://bun.com/docs/bundler/loaders) to see how Bun maps file extensions to the appropriate built-in loader.
Bun uses the file extension to determine which built-in _loader_ should be used to parse the file. Every loader has a name, such as `js`, `tsx`, or `json`. These names are used when building [plugins](https://bun.sh/docs/bundler/plugins) that extend Bun with custom loaders.
Bun uses the file extension to determine which built-in _loader_ should be used to parse the file. Every loader has a name, such as `js`, `tsx`, or `json`. These names are used when building [plugins](https://bun.com/docs/bundler/plugins) that extend Bun with custom loaders.
You can explicitly specify which loader to use using the 'loader' import attribute.
@@ -175,7 +175,7 @@ In the bundler, `.node` files are handled using the [`file`](#file) loader.
In the runtime and bundler, SQLite databases can be directly imported. This will load the database using [`bun:sqlite`](https://bun.sh/docs/api/sqlite).
In the runtime and bundler, SQLite databases can be directly imported. This will load the database using [`bun:sqlite`](https://bun.com/docs/api/sqlite).
```ts
import db from "./my.db" with { type: "sqlite" };
@@ -192,7 +192,7 @@ You can change this behavior with the `"embed"` attribute:
import db from "./my.db" with { type: "sqlite", embed: "true" };
```
When using a [standalone executable](https://bun.sh/docs/bundler/executables), the database is embedded into the single-file executable.
When using a [standalone executable](https://bun.com/docs/bundler/executables), the database is embedded into the single-file executable.
Otherwise, the database to embed is copied into the `outdir` with a hashed filename.
@@ -280,7 +280,7 @@ The `html` loader behaves differently depending on how it's used:
**Bun Shell loader**. Default for `.sh` files
This loader is used to parse [Bun Shell](https://bun.com/docs/runtime/shell) scripts. It's only supported when starting Bun itself, so it's not available in the bundler or in the runtime.
```sh
$ bun run ./script.sh
@@ -336,7 +336,7 @@ If a value is specified for `publicPath`, the import will use value as a prefix
{% /table %}
{% callout %}
The location and file name of the copied file is determined by the value of [`naming.asset`](https://bun.com/docs/bundler#naming).
{% /callout %}
With this loader, the file is copied into the `outdir` as-is. The name of the copied file is determined using the value of `naming.asset`.
@@ -291,7 +291,7 @@ One of the reasons why Bun's bundler is so fast is that it is written in native
However, one limitation of plugins written in JavaScript is that JavaScript itself is single-threaded.
Native plugins are written as [NAPI](https://bun.com/docs/api/node-api) modules and can be run on multiple threads. This allows native plugins to run much faster than JavaScript plugins.
In addition, native plugins can skip unnecessary work such as the UTF-8 -> UTF-16 conversion needed to pass strings to JavaScript.
@@ -2,7 +2,7 @@ Bun's bundler API is inspired heavily by [esbuild](https://esbuild.github.io/).
There are a few behavioral differences to note.
- **Bundling by default**. Unlike esbuild, Bun _always bundles by default_. This is why the `--bundle` flag isn't necessary in the Bun example. To transpile each file individually, use [`Bun.Transpiler`](https://bun.com/docs/api/transpiler).
- **It's just a bundler**. Unlike esbuild, Bun's bundler does not include a built-in development server or file watcher. It's just a bundler. The bundler is intended for use in conjunction with `Bun.serve` and other runtime APIs to achieve the same effect. As such, all options relating to HTTP/file watching are not applicable.
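As a rough sketch of that difference (paths and the inline source string are made up): `Bun.build` bundles its entrypoints with no extra flag, while `Bun.Transpiler` transforms a single source without bundling.

```ts
// Bundling is the default; there is no `bundle: true` switch to set.
await Bun.build({
  entrypoints: ["./src/index.ts"], // hypothetical entrypoint
  outdir: "./out",
});

// To transpile code one unit at a time instead, use Bun.Transpiler.
const transpiler = new Bun.Transpiler({ loader: "tsx" });
const js = transpiler.transformSync("const x: number = 1; export default x;");
console.log(js);
```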
## Performance
@@ -65,7 +65,7 @@ In Bun's CLI, simple boolean flags like `--minify` do not accept an argument. Ot
- `--loader:.ext=loader`
- `--loader .ext:loader`
- Bun supports a different set of built-in loaders than esbuild; see [Bundler > Loaders](https://bun.com/docs/bundler/loaders) for a complete reference. The esbuild loaders `dataurl`, `binary`, `base64`, `copy`, and `empty` are not yet implemented.
The syntax for `--loader` is slightly different.
@@ -474,7 +474,7 @@ In Bun's CLI, simple boolean flags like `--minify` do not accept an argument. Ot
- `bundle`
- n/a
- Always `true`. Use [`Bun.Transpiler`](https://bun.com/docs/api/transpiler) to transpile without bundling.
---
@@ -635,7 +635,7 @@ In Bun's CLI, simple boolean flags like `--minify` do not accept an argument. Ot
- `loader`
- `loader`
- Bun supports a different set of built-in loaders than esbuild; see [Bundler > Loaders](https://bun.com/docs/bundler/loaders) for a complete reference. The esbuild loaders `dataurl`, `binary`, `base64`, `copy`, and `empty` are not yet implemented.
---
@@ -891,7 +891,7 @@ const myPlugin: BunPlugin = {
};
```
The `builder` object provides some methods for hooking into parts of the bundling process. Bun implements `onResolve` and `onLoad`; it does not yet implement the esbuild hooks `onStart`, `onEnd`, and `onDispose`, or the `resolve` utilities. `initialOptions` is partially implemented: it is read-only and exposes only a subset of esbuild's options; use [`config`](https://bun.com/docs/bundler/plugins) (the same thing, but in Bun's `BuildConfig` format) instead.
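For reference, a minimal sketch of a plugin that uses the two implemented hooks (the virtual module name and its contents are made up for illustration):

```ts
import type { BunPlugin } from "bun";

const envPlugin: BunPlugin = {
  name: "virtual-env",
  setup(builder) {
    // Resolve imports of "env" into a virtual namespace.
    builder.onResolve({ filter: /^env$/ }, () => ({
      path: "env",
      namespace: "virtual",
    }));

    // Provide the module contents for that namespace.
    builder.onLoad({ filter: /.*/, namespace: "virtual" }, () => ({
      contents: `export default ${JSON.stringify(process.env)}`,
      loader: "js",
    }));
  },
};

await Bun.build({
  entrypoints: ["./src/index.ts"], // hypothetical entrypoint
  outdir: "./out",
  plugins: [envPlugin],
});
```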
Template a new Bun project with `bun create`. This is a flexible command that can be used to create a new project from a React component, a `create-<template>` npm package, a GitHub repo, or a local template.
If you're looking to create a brand new empty project, use [`bun init`](https://bun.com/docs/cli/init).
## From a React component
@@ -30,11 +30,11 @@ $ bun create ./MyComponent.jsx # .tsx also supported
When you run `bun create <component>`, Bun:
1. Uses [Bun's JavaScript bundler](https://bun.com/docs/bundler) to analyze your module graph.
2. Collects all the dependencies needed to run the component.
3. Scans the exports of the entry point for a React component.
4. Generates a `package.json` file with the dependencies and scripts needed to run the component.
5. Installs any missing dependencies using [`bun install --only-missing`](https://bun.com/docs/cli/install).
6. Generates the following files:
- `${component}.html`
- `${component}.client.tsx` (entry point for the frontend)
@@ -184,7 +184,7 @@ Peer dependencies are handled similarly to yarn. `bun install` will automaticall
## Lockfile
`bun.lock` is Bun’s lockfile format. See [our blogpost about the text lockfile](https://bun.com/blog/bun-lock-text-lockfile).
Prior to Bun 1.2, the lockfile was binary and called `bun.lockb`. Old lockfiles can be upgraded to the new format by running `bun install --save-text-lockfile --frozen-lockfile --lockfile-only`, and then deleting `bun.lockb`.
For more information on both these commands, see [`bun install`](https://bun.com/docs/cli/install) and [`bun outdated`](https://bun.com/docs/cli/outdated).
## Running scripts with `--filter`
@@ -73,7 +73,7 @@ Both commands will be run in parallel, and you will see a nice terminal UI showi
### Running scripts in workspaces
Filters respect your [workspace configuration](https://bun.com/docs/install/workspaces): If you have a `package.json` file that specifies which packages are part of the workspace,
`--filter` will be restricted to only these packages. Also, in a workspace you can use `--filter` to run scripts in packages that are located anywhere in the workspace:
@@ -68,7 +68,7 @@ $ bun install --concurrent-scripts 5
## Workspaces
Bun supports `"workspaces"` in package.json. For complete documentation refer to [Package manager > Workspaces](https://bun.sh/docs/install/workspaces).
Bun supports `"workspaces"` in package.json. For complete documentation refer to [Package manager > Workspaces](https://bun.com/docs/install/workspaces).
```json#package.json
{
@@ -93,11 +93,11 @@ $ bun install --filter '!pkg-c'
$ bun install --filter './packages/pkg-a'
```
For more information on filtering with `bun install`, refer to [Package Manager > Filtering](https://bun.com/docs/cli/filter#bun-install-and-bun-outdated).
## Overrides and resolutions
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](https://bun.sh/docs/install/overrides) for complete documentation.
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](https://bun.com/docs/install/overrides) for complete documentation.
```json-diff#package.json
{
@@ -142,7 +142,7 @@ For reproducible installs, use `--frozen-lockfile`. This will install the exact
$ bun install --frozen-lockfile
```
For more information on Bun's lockfile `bun.lock`, refer to [Package manager > Lockfile](https://bun.com/docs/install/lockfile).
## Omitting dependencies
@@ -168,7 +168,7 @@ $ bun install --dry-run
## Non-npm dependencies
Bun supports installing dependencies from Git, GitHub, and local or remotely-hosted tarballs. For complete documentation refer to [Package manager > Git, GitHub, and tarball dependencies](https://bun.com/docs/cli/add).
```json#package.json
{
@@ -183,6 +183,30 @@ Bun supports installing dependencies from Git, GitHub, and local or remotely-hos
}
```
## Installation strategies
Bun supports two package installation strategies that determine how dependencies are organized in `node_modules`:
### Hoisted installs (default for single projects)
The traditional npm/Yarn approach that flattens dependencies into a shared `node_modules` directory:
```bash
$ bun install --linker hoisted
```
### Isolated installs
A pnpm-like approach that creates strict dependency isolation to prevent phantom dependencies:
```bash
$ bun install --linker isolated
```
Isolated installs create a central package store in `node_modules/.bun/` with symlinks in the top-level `node_modules`. This ensures packages can only access their declared dependencies.
For complete documentation on isolated installs, refer to [Package manager > Isolated installs](https://bun.com/docs/install/isolated).
## Configuration
The default behavior of `bun install` can be configured in `bunfig.toml`. The default values are shown below.
@@ -213,11 +237,15 @@ dryRun = false
# equivalent to `--concurrent-scripts` flag
concurrentScripts = 16 # (cpu count or GOMAXPROCS) x2
# installation strategy: "hoisted" or "isolated"
# default: "hoisted"
linker = "hoisted"
```
## CI/CD
Use the official [`oven-sh/setup-bun`](https://github.com/oven-sh/setup-bun) action to install `bun` in a GitHub Actions pipeline:
```yaml#.github/workflows/release.yml
name: bun-types
@@ -236,4 +264,31 @@ jobs:
run: bun run build
```
For CI/CD environments that want to enforce reproducible builds, use `bun ci` to fail the build if `package.json` is out of sync with the lockfile:
```bash
$ bun ci
```
This is equivalent to `bun install --frozen-lockfile`. It installs exact versions from `bun.lock` and fails if `package.json` doesn't match the lockfile. To use `bun ci` or `bun install --frozen-lockfile`, you must commit `bun.lock` to version control.
In your workflow, run `bun ci` instead of `bun install`.
@@ -8,15 +8,70 @@ To create a tarball of the current workspace:
$ bun pm pack
```
This command creates a `.tgz` file containing all files that would be published to npm, following the same rules as `npm pack`.
## Examples
Basic usage:
```bash
$ bun pm pack
# Creates my-package-1.0.0.tgz in current directory
```
Quiet mode for scripting:
```bash
$ TARBALL=$(bun pm pack --quiet)
$ echo"Created: $TARBALL"
# Output: Created: my-package-1.0.0.tgz
```
Custom destination:
```bash
$ bun pm pack --destination ./dist
# Saves tarball in ./dist/ directory
```
## Options
- `--dry-run`: Perform all tasks except writing the tarball to disk. Shows what would be included.
- `--destination <dir>`: Specify the directory where the tarball will be saved.
- `--filename <name>`: Specify an exact file name for the tarball to be saved at.
- `--ignore-scripts`: Skip running pre/postpack and prepare scripts.
- `--gzip-level <0-9>`: Set a custom compression level for gzip, ranging from 0 to 9 (default is 9).
- `--quiet`: Only output the tarball filename, suppressing verbose output. Ideal for scripts and automation.
> **Note:** `--filename` and `--destination` cannot be used at the same time.
## Output Modes
**Default output:**
```bash
$ bun pm pack
bun pack v1.2.19
packed 131B package.json
packed 40B index.js
my-package-1.0.0.tgz
Total files: 2
Shasum: f2451d6eb1e818f500a791d9aace80b394258a90
Unpacked size: 171B
Packed size: 249B
```
**Quiet output:**
```bash
$ bun pm pack --quiet
my-package-1.0.0.tgz
```
The `--quiet` flag is particularly useful for automation workflows where you need to capture the generated tarball filename for further processing.
## bin
@@ -193,3 +248,38 @@ v1.0.1
```
Supports `patch`, `minor`, `major`, `premajor`, `preminor`, `prepatch`, `prerelease`, `from-git`, or specific versions like `1.2.3`. By default, a git commit and tag are created; pass `--no-git-tag-version` to skip them.
## pkg
Manage `package.json` data with get, set, delete, and fix operations.
All commands support dot and bracket notation:
```bash
scripts.build        # dot notation
contributors[0]      # array access
workspaces.0         # dot with numeric index
scripts[test:watch]  # bracket for special chars
```
Examples:
```bash
# get
$ bun pm pkg get name # single property
$ bun pm pkg get name version # multiple properties
$ bun pm pkg get # entire package.json
$ bun pm pkg get scripts.build # nested property
# set
$ bun pm pkg set name="my-package" # simple property
$ bun pm pkg set scripts.test="jest" version=2.0.0 # multiple properties
$ bun pm pkg set {"private":"true"} --json # JSON values with --json flag
# delete
$ bun pm pkg delete description # single property
$ bun pm pkg delete scripts.test contributors[0] # multiple/nested
```
Bun executes the script command in a subshell. On Linux & macOS, it checks for the following shells in order, using the first one it finds: `bash`, `sh`, `zsh`. On Windows, it uses the [Bun Shell](https://bun.com/docs/runtime/shell) to support bash-like syntax and many common commands.
{% callout %}
⚡️ The startup time for `npm run` on Linux is roughly 170ms; with Bun it is `6ms`.
@@ -164,7 +164,7 @@ bun run --filter 'ba*' <script>
will execute `<script>` in both `bar` and `baz`, but not in `foo`.
Find more details in the docs page for [filter](https://bun.com/docs/cli/filter#running-scripts-with-filter).
## `bun run -` to pipe code from stdin
@@ -185,6 +185,23 @@ This is TypeScript!
For convenience, all code is treated as TypeScript with JSX support when using `bun run -`.
## `bun run --console-depth`
Control the depth of object inspection in console output with the `--console-depth` flag.
```bash
$ bun --console-depth 5 run index.tsx
```
This sets how deeply nested objects are displayed in `console.log()` output. The default depth is `2`. Higher values show more nested properties but may produce verbose output for complex objects.
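To illustrate (the object shape is arbitrary): at the default depth of `2`, the inner objects below collapse to `[Object ...]`, while `--console-depth 5` prints them in full.

```ts
// Run with: bun --console-depth 5 run example.ts
const nested = { a: { b: { c: { d: { e: "deep value" } } } } };

// Default depth (2) truncates the inner levels; a higher
// --console-depth reveals them.
console.log(nested);
```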
@@ -17,7 +17,7 @@ Bun aims for compatibility with Jest, but not everything is implemented. To trac
$ bun test
```
Tests are written in JavaScript or TypeScript with a Jest-like API. Refer to [Writing tests](https://bun.com/docs/test/writing) for full documentation.
```ts#math.test.ts
import { expect, test } from "bun:test";
@@ -53,7 +53,7 @@ To run a specific file in the test runner, make sure the path starts with `./` o
$ bun test ./test/specific-file.test.ts
```
The test runner runs all tests in a single process. It loads all `--preload` scripts (see [Lifecycle](https://bun.com/docs/test/lifecycle) for details), then runs all tests. If a test fails, the test runner will exit with a non-zero exit code.
## CI/CD integration
@@ -154,11 +154,11 @@ These hooks can be defined inside test files, or in a separate file that is prel
$ bun test --preload ./setup.ts
```
See [Test > Lifecycle](https://bun.com/docs/test/lifecycle) for complete documentation.
## Mocks
Create mock functions with the `mock` function.
```ts
import { test, expect, mock } from "bun:test";
@@ -182,7 +182,7 @@ Alternatively, you can use `jest.fn()`, it behaves identically.
+ const random = jest.fn(() => Math.random());
```
See [Test > Mocks](https://bun.com/docs/test/mocks) for complete documentation.
## Snapshot testing
@@ -203,7 +203,7 @@ To update snapshots, use the `--update-snapshots` flag.
$ bun test --update-snapshots
```
See [Test > Snapshots](https://bun.com/docs/test/snapshots) for complete documentation.
## UI & DOM testing
@@ -213,7 +213,7 @@ Bun is compatible with popular UI testing libraries:
See [Test > DOM Testing](https://bun.com/docs/test/dom) for complete documentation.
## Performance
@@ -248,4 +248,33 @@ $ bun test foo
Any test file in the directory with an _absolute path_ that contains one of the targets will run. Glob patterns are not yet supported. -->
## AI Agent Integration
When using Bun's test runner with AI coding assistants, you can enable quieter output to improve readability and reduce context noise. This feature minimizes test output verbosity while preserving essential failure information.
### Environment Variables
Set any of the following environment variables to enable AI-friendly output:
- `CLAUDECODE=1` - For Claude Code
- `REPL_ID=1` - For Replit
- `AGENT=1` - Generic AI agent flag
### Behavior
When an AI agent environment is detected:
- Only test failures are displayed in detail
- Passing, skipped, and todo test indicators are hidden
- Summary statistics remain intact
```bash
# Example: Enable quiet output for Claude Code
$ CLAUDECODE=1 bun test
# Still shows failures and summary, but hides verbose passing test output
```
This feature is particularly useful in AI-assisted development workflows where reduced output verbosity improves context efficiency while maintaining visibility into test failures.
@@ -10,6 +10,86 @@ To update a specific dependency to the latest version:
$ bun update [package]
```
## `--interactive`
For a more controlled update experience, use the `--interactive` flag to select which packages to update:
```sh
$ bun update --interactive
$ bun update -i
```
This launches an interactive terminal interface that shows all outdated packages with their current and target versions. You can then select which packages to update.
### Interactive Interface
The interface displays packages grouped by dependency type:
```
? Select packages to update - Space to toggle, Enter to confirm, a to select all, n to select none, i to invert, l to toggle latest
dependencies Current Target Latest
□ react 17.0.2 18.2.0 18.3.1
□ lodash 4.17.20 4.17.21 4.17.21
devDependencies Current Target Latest
□ typescript 4.8.0 5.0.0 5.3.3
□ @types/node 16.11.7 18.0.0 20.11.5
optionalDependencies Current Target Latest
□ some-optional-package 1.0.0 1.1.0 1.2.0
```
**Sections:**
- Packages are grouped under section headers: `dependencies`, `devDependencies`, `peerDependencies`, `optionalDependencies`
- Each section shows column headers aligned with the package data
**Columns:**
- **Package**: Package name (may have suffix like ` dev`, ` peer`, ` optional` for clarity)
- **Current**: Currently installed version
- **Target**: Version that would be installed (respects semver constraints)
- **Latest**: Latest available version
### Keyboard Controls
**Selection:**
- **Space**: Toggle package selection
- **Enter**: Confirm selections and update
- **a/A**: Select all packages
- **n/N**: Select none
- **i/I**: Invert selection
**Navigation:**
- **↑/↓ Arrow keys** or **j/k**: Move cursor
- **l/L**: Toggle between target and latest version for the current package
**Exit:**
- **Ctrl+C** or **Ctrl+D**: Cancel without updating
### Visual Indicators
- **☑** Selected packages (will be updated)
- **□** Unselected packages
- **>** Current cursor position
- **Colors**: Red (major), yellow (minor), green (patch) version changes
- **Underlined**: Currently selected update target
### Package Grouping
Packages are organized in sections by dependency type:
Projects that use Express and other major Node.js HTTP libraries should work out of the box.
{% callout %}
If you run into bugs, [please file an issue](https://bun.com/issues) _in Bun's repo_, not the library. It is Bun's responsibility to address Node.js compatibility issues.
{% /callout %}
```ts
@@ -19,10 +19,10 @@ app.listen(port, () => {
});
```
Bun implements the [`node:http`](https://nodejs.org/api/http.html) and [`node:https`](https://nodejs.org/api/https.html) modules that these libraries rely on. These modules can also be used directly, though [`Bun.serve`](https://bun.com/docs/api/http) is recommended for most use cases.
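For comparison, a minimal `Bun.serve` sketch (the port and response body are arbitrary):

```ts
// Minimal HTTP server with Bun's native API.
const server = Bun.serve({
  port: 3000,
  fetch(req) {
    return new Response("Hello from Bun.serve");
  },
});

console.log(`Listening on http://localhost:${server.port}`);
```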
{% callout %}
**Note** — Refer to the [Runtime > Node.js APIs](https://bun.com/docs/runtime/nodejs-apis#node-http) page for more detailed compatibility information.
See [Docs > API > Binary Data](https://bun.com/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.