Compare commits

...

5 Commits

Author SHA1 Message Date
Claude Bot
b72329bfef fix(bundler): default NODE_ENV to production for --compile builds
When using `bun build --compile`, default `process.env.NODE_ENV` to
`"production"` to enable dead code elimination for conditional requires
like React's:

```javascript
if (process.env.NODE_ENV === 'production') {
  module.exports = require('./cjs/react.production.js');
} else {
  module.exports = require('./cjs/react.development.js');
}
```

Without this fix, the bundler cannot evaluate the condition as a constant
and tries to resolve both branches, causing failures when development
files don't exist (e.g., in Next.js standalone output).
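
For illustration, this is roughly the folded form once `process.env.NODE_ENV` is inlined as
`"production"` (a sketch of the effect, not actual bundler output):

```javascript
// With the condition constant-folded to true, only the production branch remains:
module.exports = require('./cjs/react.production.js');
// The development branch is eliminated, so './cjs/react.development.js' never has to resolve.
```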

This also sets `jsx.development = false` for --compile builds, matching
the behavior of --production.

Users can still override this with:
- CLI: `--define 'process.env.NODE_ENV="development"'`
- API: `define: { 'process.env.NODE_ENV': '"development"' }` (see the sketch below)
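
A minimal sketch of the API override (the entrypoint and outfile names are hypothetical;
`define`, `compile`, and `result.success` are the same `Bun.build` options and fields used in
the test added by this commit):

```javascript
// Sketch: force a development bundle despite the new --compile default.
const result = await Bun.build({
  entrypoints: ["./index.js"], // hypothetical entrypoint
  compile: { outfile: "./app" }, // hypothetical output name
  define: { "process.env.NODE_ENV": '"development"' },
});
console.log(result.success); // expected: true
```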

Fixes #26244

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 08:15:05 +00:00
wovw
716801e92d gitignore: add .direnv dir (#26198)
### What does this PR do?

The `.direnv` folder is created by [direnv](https://direnv.net/) when
using `use flake` in `.envrc` to automatically load the Nix development
shell. Since the repo already includes a `flake.nix`, developers on NixOS
commonly use direnv (via nix-direnv) to auto-load the environment. This
folder contains cached environment data and should not be committed.
2026-01-18 00:17:14 -08:00
wovw
939f5cf7af fix(nix): disable fortify hardening for debug builds (#26199)
### What does this PR do?

NixOS enables security hardening flags by default in `mkShell` /
`devShells`, e.g. `_FORTIFY_SOURCE=2`. This flag adds runtime buffer
overflow checks but requires compiler optimization (`-O1` or higher) to
work, since it needs to inline functions to insert checks.
Debug builds use `-O0` (no optimization), which causes this compilation
error:
`error: _FORTIFY_SOURCE requires compiling with optimization (-O)
[-Werror,-W#warnings]`

This patch disables that specific flag in the standard Nix way (via
`hardeningDisable`) while keeping the other hardening features intact. It
doesn't affect release builds, since the change is scoped to `devShells`.

### How did you verify your code works?

`bun bd test` successfully runs test cases.
2026-01-18 00:17:01 -08:00
SUZUKI Sosuke
496aeb97f9 refactor(wrapAnsi): use WTF::find for character searches (#26200)
## Summary

This PR addresses the review feedback from #26061
([comment](https://github.com/oven-sh/bun/pull/26061#discussion_r2697257836))
requesting the use of `WTF::find` for newline searches in
`wrapAnsi.cpp`.

## Changes

### 1. CRLF Normalization (lines 628-639)
Replaced manual loop with `WTF::findNextNewline` which provides
SIMD-optimized detection for `\r`, `\n`, and `\r\n` sequences.

**Before:**
```cpp
for (size_t i = 0; i < input.size(); ++i) {
    if (i + 1 < input.size() && input[i] == '\r' && input[i + 1] == '\n') {
        normalized.append(static_cast<Char>('\n'));
        i++;
    } else {
        normalized.append(input[i]);
    }
}
```

**After:**
```cpp
size_t pos = 0;
while (pos < input.size()) {
    auto newline = WTF::findNextNewline(input, pos);
    if (newline.position == WTF::notFound) {
        normalized.append(std::span { input.data() + pos, input.size() - pos });
        break;
    }
    if (newline.position > pos)
        normalized.append(std::span { input.data() + pos, newline.position - pos });
    normalized.append(static_cast<Char>('\n'));
    pos = newline.position + newline.length;
}
```

### 2. Word Length Calculation (lines 524-533)
Replaced manual loop with `WTF::find` for space character detection.

**Before:**
```cpp
for (const Char* it = lineStart; it <= lineEnd; ++it) {
    if (it == lineEnd || *it == ' ') {
        // word boundary logic
    }
}
```

**After:**
```cpp
auto lineSpan = std::span<const Char>(lineStart, lineEnd);
size_t wordStartIdx = 0;
while (wordStartIdx <= lineSpan.size()) {
    size_t spacePos = WTF::find(lineSpan, static_cast<Char>(' '), wordStartIdx);
    // word boundary logic using spacePos
}
```

## Benchmark Results

Tested on Apple M4 Max. No performance regression was observed; most
benchmarks show slight improvements.

| Benchmark | Before | After | Change |
|-----------|--------|-------|--------|
| Short text (45 chars) | 613 ns | 583 ns | -4.9% |
| Medium text (810 chars) | 10.85 µs | 10.31 µs | -5.0% |
| Long text (8100 chars) | 684 µs | 102 µs | -85% * |
| Colored short | 1.26 µs | 806 ns | -36% |
| Colored medium | 19.24 µs | 13.80 µs | -28% |
| Japanese (full-width) | 7.74 µs | 7.43 µs | -4.0% |
| Emoji text | 9.35 µs | 9.27 µs | -0.9% |
| Hyperlink (OSC 8) | 5.73 µs | 5.58 µs | -2.6% |

\* Large variance in baseline measurement

## Testing

- All 35 existing tests pass
- Manual verification of CRLF normalization and word wrapping edge cases

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-01-17 23:43:02 -08:00
robobun
3b5f2fe756 chore(deps): update BoringSSL fork to latest upstream (#26212)
## Summary

Updates the BoringSSL fork to the latest upstream (337 commits since
last update) with bug fixes for Node.js crypto compatibility.

### Upstream BoringSSL Changes (337 commits)

| Category | Count |
|----------|-------|
| API Changes (including namespacing) | 42 |
| Code Cleanup/Refactoring | 35 |
| Testing/CI | 32 |
| Build System (Bazel, CMake) | 27 |
| Bug Fixes | 25 |
| Post-Quantum Cryptography | 14 |
| TLS/SSL Changes | 12 |
| Rust Bindings/Wrappers | 9 |
| Performance Improvements | 8 |
| Documentation | 8 |

#### Highlights

**Post-Quantum Cryptography**
- ML-DSA (Module-Lattice Digital Signature Algorithm): Full EVP
integration, Wycheproof tests, external mu verification
- SLH-DSA: Implementation of pure SLH-DSA-SHAKE-256f
- Merkle Tree Certificates: New support for verifying signatureless MTCs

**Major API Changes**
- New `CRYPTO_IOVEC` based AEAD APIs for zero-copy I/O across all
ciphers
- Massive namespacing effort moving internal symbols into `bssl`
namespace
- `bssl::Span` modernization to match `std::span` behavior

**TLS/SSL**
- Added `TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256` support
- HMAC on SHA-384 for TLS 1.3
- Improved Lucky 13 mitigation

**Build System**
- Bazel 8.x and 9.0.0 compatibility
- CI upgrades: Ubuntu 24.04, Android NDK r29

---

### Bun-specific Patches (in oven-sh/boringssl)

1. **Fix SHA512-224 EVP final buffer size** (`digests.cc.inc`)
   - `BCM_sha512_224_final` writes 32 bytes but `EVP_MD.md_size` is 28 bytes
   - Now uses a temp buffer to avoid the buffer overwrite

2. **Fix `EVP_do_all_sorted` to return only lowercase names** (`evp_do_all.cc`)
   - `EVP_CIPHER_do_all_sorted` and `EVP_MD_do_all_sorted` now return only lowercase names
   - Matches Node.js behavior for `crypto.getCiphers()` and `crypto.getHashes()` (see the sketch below)
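
As a rough illustration of the behavior being matched (a sketch, not part of the patch; it uses
only the standard `node:crypto` APIs `getHashes()` and `getCiphers()`):

```javascript
// Sketch: after the patch, the sorted lists should contain lowercase names only.
const { getCiphers, getHashes } = require("node:crypto");

console.log(getHashes().includes("sha256")); // expected: true
console.log(getHashes().every(name => name === name.toLowerCase())); // expected: true
console.log(getCiphers().every(name => name === name.toLowerCase())); // expected: true
```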

---

### Changes in Bun

- Updated BoringSSL commit hash to
`4f4f5ef8ebc6e23cbf393428f0ab1b526773f7ac`
- Removed `ignoreSHA512_224` parameter from `ncrypto::getDigestByName()`
to enable SHA512-224 support
- Removed special SHA512-224 buffer handling in `JSHash.cpp` (no longer
needed after BoringSSL fix)

## Test plan
- [x] `crypto.createHash('sha512-224')` works correctly (see the sketch after this list)
- [x] `crypto.getHashes()` returns lowercase names (md4, md5, sha1,
sha256, etc.)
- [x] `crypto.getCiphers()` returns lowercase names (aes-128-cbc,
aes-256-gcm, etc.)
- [x] `test/regression/issue/crypto-names.test.ts` passes
- [x] All CI tests pass
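
A quick manual check of the SHA-512/224 item above (a sketch; it only asserts the digest
length, 224 bits = 28 bytes = 56 hex characters):

```javascript
// Sketch: SHA-512/224 should finalize without the old temp-buffer workaround.
const { createHash } = require("node:crypto");
const hex = createHash("sha512-224").update("abc").digest("hex");
console.log(hex.length); // 56 hex characters == 28 bytes (224 bits)
```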

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Bot <claude-bot@bun.sh>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 23:39:04 -08:00
10 changed files with 219 additions and 41 deletions

.gitignore vendored

```diff
@@ -1,4 +1,5 @@
 .claude/settings.local.json
+.direnv
 .DS_Store
 .env
 .envrc
```


```diff
@@ -4,7 +4,7 @@ register_repository(
   REPOSITORY
     oven-sh/boringssl
   COMMIT
-    f1ffd9e83d4f5c28a9c70d73f9a4e6fcf310062f
+    4f4f5ef8ebc6e23cbf393428f0ab1b526773f7ac
 )
 register_cmake_command(
```


```diff
@@ -131,6 +131,7 @@
   stdenv = pkgs.clangStdenv;
 }) {
   inherit packages;
+  hardeningDisable = [ "fortify" ];
   shellHook = ''
     # Set up build environment
```


```diff
@@ -1901,7 +1901,7 @@ DataPointer DHPointer::stateless(const EVPKeyPointer& ourKey,
 // ============================================================================
 // KDF
-const EVP_MD* getDigestByName(const WTF::StringView name, bool ignoreSHA512_224)
+const EVP_MD* getDigestByName(const WTF::StringView name)
 {
     // Historically, "dss1" and "DSS1" were DSA aliases for SHA-1
     // exposed through the public API.
@@ -1955,9 +1955,6 @@ const EVP_MD* getDigestByName(const WTF::StringView name, bool ignoreSHA512_224)
             return EVP_sha512();
         }
         if (WTF::equalIgnoringASCIICase(moreBits, "/224"_s)) {
-            if (ignoreSHA512_224) {
-                return nullptr;
-            }
             return EVP_sha512_224();
         }
         if (WTF::equalIgnoringASCIICase(moreBits, "/256"_s)) {
@@ -1979,10 +1976,6 @@ const EVP_MD* getDigestByName(const WTF::StringView name, bool ignoreSHA512_224)
         }
     }
-    if (ignoreSHA512_224 && WTF::equalIgnoringASCIICase(name, "sha512-224"_s)) {
-        return nullptr;
-    }
     // if (name == "ripemd160WithRSA"_s || name == "RSA-RIPEMD160"_s) {
     //     return EVP_ripemd160();
     // }
```


```diff
@@ -1575,7 +1575,7 @@ Buffer<char> ExportChallenge(const char* input, size_t length);
 // ============================================================================
 // KDF
-const EVP_MD* getDigestByName(const WTF::StringView name, bool ignoreSHA512_224 = false);
+const EVP_MD* getDigestByName(const WTF::StringView name);
 const EVP_CIPHER* getCipherByName(const WTF::StringView name);
 // Verify that the specified HKDF output length is valid for the given digest.
```


```diff
@@ -251,15 +251,7 @@ JSC_DEFINE_HOST_FUNCTION(jsHashProtoFuncDigest, (JSC::JSGlobalObject * lexicalGl
     // Only compute the digest if it hasn't been cached yet
     if (!hash->m_digest && len > 0) {
-        const EVP_MD* md = hash->m_ctx.getDigest();
-        uint32_t bufLen = len;
-        if (md == EVP_sha512_224()) {
-            // SHA-512/224 expects buffer length of length % 8. can be truncated afterwards
-            bufLen = SHA512_224_DIGEST_BUFFER_LENGTH;
-        }
-        auto data = hash->m_ctx.digestFinal(bufLen);
+        auto data = hash->m_ctx.digestFinal(len);
         if (!data) {
             throwCryptoError(lexicalGlobalObject, scope, ERR_get_error(), "Failed to finalize digest"_s);
             return {};
@@ -325,7 +317,7 @@ JSC_DEFINE_HOST_FUNCTION(constructHash, (JSC::JSGlobalObject * globalObject, JSC
     WTF::String algorithm = algorithmOrHashInstanceValue.toWTFString(globalObject);
     RETURN_IF_EXCEPTION(scope, {});
-    md = ncrypto::getDigestByName(algorithm, true);
+    md = ncrypto::getDigestByName(algorithm);
     if (!md) {
         zigHasher = ExternZigHash::getByName(zigGlobalObject, algorithm);
     }
```


```diff
@@ -518,25 +518,32 @@ static void processLine(const Char* lineStart, const Char* lineEnd, size_t colum
         return;
     }
-    // Calculate word lengths
+    // Calculate word lengths using WTF::find for space detection
     Vector<size_t> wordLengths;
-    const Char* wordStart = lineStart;
-    for (const Char* it = lineStart; it <= lineEnd; ++it) {
-        if (it == lineEnd || *it == ' ') {
-            if (wordStart < it) {
-                wordLengths.append(stringWidth(wordStart, it, options.ambiguousIsNarrow));
-            } else {
-                wordLengths.append(0);
-            }
-            wordStart = it + 1;
+    auto lineSpan = std::span<const Char>(lineStart, lineEnd);
+    size_t wordStartIdx = 0;
+    while (wordStartIdx <= lineSpan.size()) {
+        size_t spacePos = WTF::find(lineSpan, static_cast<Char>(' '), wordStartIdx);
+        size_t wordEndIdx = (spacePos == WTF::notFound) ? lineSpan.size() : spacePos;
+        if (wordStartIdx < wordEndIdx) {
+            wordLengths.append(stringWidth(lineSpan.data() + wordStartIdx,
+                lineSpan.data() + wordEndIdx,
+                options.ambiguousIsNarrow));
+        } else {
+            wordLengths.append(0);
+        }
+        if (spacePos == WTF::notFound)
+            break;
+        wordStartIdx = wordEndIdx + 1;
     }
     // Start with empty first row
     rows.append(Row<Char>());
     // Process each word
-    wordStart = lineStart;
+    const Char* wordStart = lineStart;
     size_t wordIndex = 0;
     for (const Char* it = lineStart; it <= lineEnd; ++it) {
@@ -625,17 +632,24 @@ static WTF::String wrapAnsiImpl(std::span<const Char> input, size_t columns, con
         return result.toString();
     }
-    // Normalize \r\n to \n
+    // Normalize \r\n to \n using WTF::findNextNewline
     Vector<Char> normalized;
     normalized.reserveCapacity(input.size());
-    for (size_t i = 0; i < input.size(); ++i) {
-        if (i + 1 < input.size() && input[i] == '\r' && input[i + 1] == '\n') {
-            normalized.append(static_cast<Char>('\n'));
-            i++; // Skip next char
-        } else {
-            normalized.append(input[i]);
+    size_t pos = 0;
+    while (pos < input.size()) {
+        auto newline = WTF::findNextNewline(input, pos);
+        if (newline.position == WTF::notFound) {
+            // Append remaining content
+            normalized.append(std::span { input.data() + pos, input.size() - pos });
+            break;
+        }
+        // Append content before newline
+        if (newline.position > pos)
+            normalized.append(std::span { input.data() + pos, newline.position - pos });
+        // Always append \n regardless of original (\r, \n, or \r\n)
+        normalized.append(static_cast<Char>('\n'));
+        pos = newline.position + newline.length;
     }
     // Process each line separately
```


```diff
@@ -1988,6 +1988,10 @@ pub const BundleV2 = struct {
         if (transpiler.options.compile) {
             // Emitting DCE annotations is nonsensical in --compile.
             transpiler.options.emit_dce_annotations = false;
+            // Default to production mode for --compile builds to enable dead code elimination
+            // for conditional requires like React's process.env.NODE_ENV checks.
+            // Users can override this with define: { 'process.env.NODE_ENV': '"development"' }
+            try transpiler.env.map.put("NODE_ENV", "production");
         }
         transpiler.configureLinker();
@@ -1996,6 +2000,12 @@ pub const BundleV2 = struct {
         if (!transpiler.options.production) {
             try transpiler.options.conditions.appendSlice(&.{"development"});
         }
+        // Allow tsconfig.json overriding, but always set it to false for --compile builds.
+        if (transpiler.options.compile) {
+            transpiler.options.jsx.development = false;
+        }
         transpiler.resolver.env_loader = transpiler.env;
         transpiler.resolver.opts = transpiler.options;
     }
```


```diff
@@ -196,7 +196,10 @@ pub const BuildCommand = struct {
         this_transpiler.options.env.behavior = ctx.bundler_options.env_behavior;
         this_transpiler.options.env.prefix = ctx.bundler_options.env_prefix;
-        if (ctx.bundler_options.production) {
+        // Default to production mode for --compile builds to enable dead code elimination
+        // for conditional requires like React's process.env.NODE_ENV checks.
+        // Users can override this with --define 'process.env.NODE_ENV="development"'
+        if (ctx.bundler_options.production or ctx.bundler_options.compile) {
             try this_transpiler.env.map.put("NODE_ENV", "production");
         }
@@ -210,8 +213,8 @@ pub const BuildCommand = struct {
         this_transpiler.resolver.opts = this_transpiler.options;
         this_transpiler.resolver.env_loader = this_transpiler.env;
-        // Allow tsconfig.json overriding, but always set it to false if --production is passed.
-        if (ctx.bundler_options.production) {
+        // Allow tsconfig.json overriding, but always set it to false if --production or --compile is passed.
+        if (ctx.bundler_options.production or ctx.bundler_options.compile) {
             this_transpiler.options.jsx.development = false;
             this_transpiler.resolver.opts.jsx.development = false;
         }
```


@@ -0,0 +1,164 @@

```ts
import { describe, expect, test } from "bun:test";
import { bunEnv, bunExe, isWindows, tempDir } from "harness";
import { join } from "path";

// https://github.com/oven-sh/bun/issues/26244
// bun build --compile should default NODE_ENV to production for dead code elimination
describe("Issue #26244", () => {
  test("--compile defaults NODE_ENV to production (CLI)", async () => {
    using dir = tempDir("compile-node-env-cli", {
      // This simulates React's conditional require pattern
      "index.js": `
        if (process.env.NODE_ENV === 'production') {
          module.exports = require('./prod.js');
        } else {
          module.exports = require('./dev.js');
        }
      `,
      "prod.js": `module.exports = { mode: "production" };`,
      // Note: dev.js intentionally not created to simulate Next.js standalone output
      // where development files are stripped
    });

    const outfile = join(dir + "", isWindows ? "app.exe" : "app");

    // This should succeed because NODE_ENV defaults to production,
    // enabling dead code elimination of the dev.js branch
    const buildProc = Bun.spawn({
      cmd: [bunExe(), "build", "--compile", join(dir + "", "index.js"), "--outfile", outfile],
      cwd: dir + "",
      env: bunEnv,
      stdout: "pipe",
      stderr: "pipe",
    });
    const [buildStdout, buildStderr, buildExitCode] = await Promise.all([
      new Response(buildProc.stdout).text(),
      new Response(buildProc.stderr).text(),
      buildProc.exited,
    ]);

    // Build should succeed - the dead branch with dev.js should be eliminated
    expect(buildStderr).not.toContain("Could not resolve");
    expect(buildExitCode).toBe(0);
  });

  test("--compile defaults NODE_ENV to production (API)", async () => {
    using dir = tempDir("compile-node-env-api", {
      // This simulates React's conditional require pattern
      "index.js": `
        if (process.env.NODE_ENV === 'production') {
          module.exports = require('./prod.js');
        } else {
          module.exports = require('./dev.js');
        }
      `,
      "prod.js": `module.exports = { mode: "production" };`,
      // Note: dev.js intentionally not created to simulate Next.js standalone output
      // where development files are stripped
    });

    const outfile = join(dir + "", isWindows ? "app.exe" : "app");

    // This should succeed because NODE_ENV defaults to production,
    // enabling dead code elimination of the dev.js branch
    const result = await Bun.build({
      entrypoints: [join(dir + "", "index.js")],
      compile: {
        outfile,
      },
    });

    // Build should succeed - the dead branch with dev.js should be eliminated
    expect(result.success).toBe(true);
    expect(result.outputs.length).toBe(1);
  });

  test("--compile with conditional require eliminates dead branch (CLI)", async () => {
    using dir = tempDir("compile-dead-code-cli", {
      "entry.js": `
        // This is the pattern used by React
        if (process.env.NODE_ENV === 'production') {
          console.log("Using production build");
        } else {
          // This branch references a non-existent file
          // and should be eliminated by dead code elimination
          require('./non-existent-dev-file.js');
        }
      `,
    });

    const outfile = join(dir + "", isWindows ? "app.exe" : "app");

    // Should succeed - the require('./non-existent-dev-file.js') should be
    // eliminated because NODE_ENV defaults to 'production'
    const buildProc = Bun.spawn({
      cmd: [bunExe(), "build", "--compile", join(dir + "", "entry.js"), "--outfile", outfile],
      cwd: dir + "",
      env: bunEnv,
      stdout: "pipe",
      stderr: "pipe",
    });
    const [buildStdout, buildStderr, buildExitCode] = await Promise.all([
      new Response(buildProc.stdout).text(),
      new Response(buildProc.stderr).text(),
      buildProc.exited,
    ]);

    expect(buildStderr).not.toContain("Could not resolve");
    expect(buildExitCode).toBe(0);
  });

  test("--compile can override NODE_ENV with --define", async () => {
    using dir = tempDir("compile-define-override", {
      "entry.js": `console.log(process.env.NODE_ENV);`,
    });

    const outfile = join(dir + "", isWindows ? "app.exe" : "app");

    // Use CLI to test --define override
    const buildProc = Bun.spawn({
      cmd: [
        bunExe(),
        "build",
        "--compile",
        join(dir + "", "entry.js"),
        "--outfile",
        outfile,
        "--define",
        'process.env.NODE_ENV="development"',
      ],
      cwd: dir + "",
      env: bunEnv,
      stdout: "pipe",
      stderr: "pipe",
    });
    const [buildStdout, buildStderr, buildExitCode] = await Promise.all([
      new Response(buildProc.stdout).text(),
      new Response(buildProc.stderr).text(),
      buildProc.exited,
    ]);
    expect(buildExitCode).toBe(0);

    // Run the compiled binary
    const runProc = Bun.spawn({
      cmd: [outfile],
      cwd: dir + "",
      env: bunEnv,
      stdout: "pipe",
      stderr: "pipe",
    });
    const [stdout, stderr, exitCode] = await Promise.all([
      new Response(runProc.stdout).text(),
      new Response(runProc.stderr).text(),
      runProc.exited,
    ]);
    expect(stdout.trim()).toBe("development");
    expect(exitCode).toBe(0);
  });
});
```