Compare commits


3 Commits

Author          SHA1        Message                   Date
Ashcon Partovi  968a7cd314  Format                    2023-07-05 13:27:45 -07:00
Ashcon Partovi  1e1fe22be0  Fix incorrect signatures  2023-07-05 13:26:51 -07:00
Ashcon Partovi  17bacd85f9  Fix detect-libc           2023-07-05 13:21:04 -07:00
30485 changed files with 95718 additions and 419312 deletions


@@ -1,8 +0,0 @@
# https://EditorConfig.org
root = true
[*]
charset = utf-8
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf

22
.gitattributes vendored

@@ -1,22 +1,3 @@
*.css text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.js text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.jsx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.tsx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.ts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.c text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.cpp text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.cc text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.yml text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.zig text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.rs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.h text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.json text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.lock text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.map text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.md text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mjs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
.vscode/launch.json linguist-generated
src/api/schema.d.ts linguist-generated
fixture.*.c linguist-generated
@@ -24,6 +5,7 @@ src/api/schema.js linguist-generated
src/bun.js/bindings/sqlite/sqlite3.c linguist-vendored
src/bun.js/bindings/sqlite/sqlite3_local.h linguist-vendored
*.lockb binary diff=lockb
*.zig text eol=lf
src/bun.js/bindings/simdutf.cpp linguist-vendored
src/bun.js/bindings/simdutf.h linguist-vendored
@@ -49,5 +31,3 @@ src/bun.js/bindings/ZigGeneratedClasses+lazyStructureHeader.h linguist-generated
src/bun.js/bindings/ZigGeneratedClasses+lazyStructureImpl.h linguist-generated
docs/**/* linguist-documentation
packages/bun-uws/fuzzing/seed-corpus/**/* linguist-generated


@@ -14,7 +14,7 @@ body:
- type: input
attributes:
label: What version of Bun is running?
description: Copy the output of `bun --revision`
description: Copy the output of `bun -v`
- type: input
attributes:
label: What platform is your computer?


@@ -1,62 +0,0 @@
### What does this PR do?
<!-- **Please explain what your changes do**, example: -->
<!--
This adds a new flag --bail to bun test. When set, it will stop running tests after the first failure. This is useful for CI environments where you want to fail fast.
-->
- [ ] Documentation or TypeScript types (it's okay to leave the rest blank in this case)
- [ ] Code changes
### How did you verify your code works?
<!-- **For code changes, please include automated tests**. Feel free to uncomment the line below -->
<!-- I wrote automated tests -->
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I ran `make js` and committed the transpiled changes
- [ ] I or my editor ran Prettier on the changed files (or I ran `bun fmt`)
- [ ] I included a test for the new code, or an existing test covers it
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1) freed and (2) only freed when it should be
- [ ] I or my editor ran `zig fmt` on the changed files
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
-->
<!-- If new methods, getters, or setters were added to a publicly exposed class:
- [ ] I added TypeScript types for the new methods, getters, or setters
-->
<!-- If dependencies in tests changed:
- [ ] I made sure that specific versions of dependencies are used instead of ranged or tagged versions
-->
<!-- If functions were added to exports.zig or bindings.zig
- [ ] I ran `make headers` to regenerate the C header file
-->
<!-- If \*.classes.ts files were added or changed:
- [ ] I ran `make codegen` to regenerate the C++ and Zig code
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
- [ ] I added a test that imports the module
- [ ] I added a test that require() the module
-->


@@ -36,7 +36,7 @@ jobs:
arch: aarch64
build_arch: arm64
runner: linux-arm64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-linux-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-linux-arm64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-arm64-lto"
build_machine_arch: aarch64


@@ -46,7 +46,7 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-linux-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
- cpu: nehalem
@@ -54,7 +54,7 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-linux-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
@@ -193,11 +193,11 @@ jobs:
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
PRISMA_POSTGRES_DATABASE_URL: ${{ secrets.PRISMA_POSTGRES_DATABASE_URL }}
PRISMA_MONGODB_DATABASE_URL: ${{ secrets.PRISMA_MONGODB_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
sudo apt-get update && sudo apt-get install -y openssl
bun install
bun install --cwd test
bun install --cwd packages/bun-internal-test


@@ -117,7 +117,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: haswell
@@ -126,7 +126,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: nehalem
@@ -135,7 +135,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: haswell
@@ -144,7 +144,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: native
@@ -152,7 +152,7 @@ jobs:
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
dependencies: true
compile_obj: true
@@ -257,7 +257,7 @@ jobs:
# package: bun-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
@@ -265,14 +265,14 @@ jobs:
# package: bun-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
package: bun-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
steps:
- uses: actions/checkout@v3
@@ -432,8 +432,9 @@ jobs:
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
PRISMA_POSTGRES_DATABASE_URL: ${{ secrets.PRISMA_POSTGRES_DATABASE_URL }}
PRISMA_MONGODB_DATABASE_URL: ${{ secrets.PRISMA_MONGODB_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install


@@ -117,7 +117,7 @@ jobs:
obj: bun-obj-darwin-x64-baseline
runner: macos-11
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: haswell
@@ -126,7 +126,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: nehalem
@@ -135,7 +135,7 @@ jobs:
obj: bun-obj-darwin-x64-baseline
runner: macos-11
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: haswell
@@ -144,7 +144,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: native
@@ -152,7 +152,7 @@ jobs:
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
@@ -258,7 +258,7 @@ jobs:
package: bun-darwin-x64
runner: macos-11
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
@@ -266,14 +266,14 @@ jobs:
# package: bun-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3
@@ -436,8 +436,9 @@ jobs:
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
PRISMA_POSTGRES_DATABASE_URL: ${{ secrets.PRISMA_POSTGRES_DATABASE_URL }}
PRISMA_MONGODB_DATABASE_URL: ${{ secrets.PRISMA_MONGODB_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install


@@ -117,7 +117,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: haswell
@@ -126,7 +126,7 @@ jobs:
obj: bun-obj-darwin-x64
runner: macos-11
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: nehalem
@@ -135,7 +135,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: haswell
@@ -144,7 +144,7 @@ jobs:
obj: bun-obj-darwin-x64
runner: macos-11
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: native
@@ -152,7 +152,7 @@ jobs:
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-arm64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
@@ -260,7 +260,7 @@ jobs:
# package: bun-darwin-x64
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
@@ -268,14 +268,14 @@ jobs:
package: bun-darwin-x64
runner: macos-11
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-aug3-5/bun-webkit-macos-arm64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/may20-3/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3
@@ -438,8 +438,9 @@ jobs:
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
PRISMA_POSTGRES_DATABASE_URL: ${{ secrets.PRISMA_POSTGRES_DATABASE_URL }}
PRISMA_MONGODB_DATABASE_URL: ${{ secrets.PRISMA_MONGODB_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install


@@ -1,7 +1,7 @@
name: zig-fmt
env:
ZIG_VERSION: 0.12.0-dev.163+6780a6bbf
ZIG_VERSION: 0.11.0-dev.3737+9eb008717
on:
pull_request:

12
.gitignore vendored

@@ -96,8 +96,6 @@ packages/bun-wasm/*.cjs
packages/bun-wasm/*.map
packages/bun-wasm/*.js
packages/bun-wasm/*.d.ts
packages/bun-wasm/*.d.cts
packages/bun-wasm/*.d.mts
*.bc
src/fallback.version
@@ -124,12 +122,4 @@ cold-jsc-start.d
/test.ts
src/js/out/modules*
src/js/out/functions*
src/js/out/tmp
src/js/out/DebugPath.h
make-dev-stats.csv
.uuid
tsconfig.tsbuildinfo
src/js/out/modules_dev

7
.gitmodules vendored

@@ -48,6 +48,13 @@ ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/uws"]
path = src/deps/uws
url = https://github.com/Jarred-Sumner/uWebSockets
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = true
[submodule "src/deps/tinycc"]
path = src/deps/tinycc
url = https://github.com/Jarred-Sumner/tinycc.git


@@ -1,21 +0,0 @@
// I would have made this a bash script but there isn't an easy way to track
// time in bash sub-second cross platform.
import fs from "fs";
const start = Date.now() + 5;
const result = Bun.spawnSync(process.argv.slice(2), {
stdio: ["inherit", "inherit", "inherit"],
});
const end = Date.now();
const diff = (Math.max(Math.round(end - start), 0) / 1000).toFixed(3);
const success = result.exitCode === 0;
try {
const line = `${new Date().toISOString()}, ${success ? "success" : "fail"}, ${diff}\n`;
if (fs.existsSync(".scripts/make-dev-stats.csv")) {
fs.appendFileSync(".scripts/make-dev-stats.csv", line);
} else {
fs.writeFileSync(".scripts/make-dev-stats.csv", line);
}
} catch {
// Ignore
}
process.exit(result.exitCode);

3
.scripts/write-versions.sh Executable file → Normal file

@@ -7,9 +7,11 @@ LIBARCHIVE_VERSION=$(git rev-parse HEAD:./src/deps/libarchive)
PICOHTTPPARSER_VERSION=$(git rev-parse HEAD:./src/deps/picohttpparser)
BORINGSSL_VERSION=$(git rev-parse HEAD:./src/deps/boringssl)
ZLIB_VERSION=$(git rev-parse HEAD:./src/deps/zlib)
UWS_VERSION=$(git rev-parse HEAD:./src/deps/uws)
LOLHTML=$(git rev-parse HEAD:./src/deps/lol-html)
TINYCC=$(git rev-parse HEAD:./src/deps/tinycc)
C_ARES=$(git rev-parse HEAD:./src/deps/c-ares)
USOCKETS=$(cd src/deps/uws/uSockets && git rev-parse HEAD)
rm -rf src/generated_versions_list.zig
echo "// AUTO-GENERATED FILE. Created via .scripts/write-versions.sh" >src/generated_versions_list.zig
@@ -18,6 +20,7 @@ echo "pub const boringssl = \"$BORINGSSL_VERSION\";" >>src/generated_versions_li
echo "pub const libarchive = \"$LIBARCHIVE_VERSION\";" >>src/generated_versions_list.zig
echo "pub const mimalloc = \"$MIMALLOC_VERSION\";" >>src/generated_versions_list.zig
echo "pub const picohttpparser = \"$PICOHTTPPARSER_VERSION\";" >>src/generated_versions_list.zig
echo "pub const uws = \"$UWS_VERSION\";" >>src/generated_versions_list.zig
echo "pub const webkit = \"$WEBKIT_VERSION\";" >>src/generated_versions_list.zig
echo "pub const zig = @import(\"std\").fmt.comptimePrint(\"{}\", .{@import(\"builtin\").zig_version});" >>src/generated_versions_list.zig
echo "pub const zlib = \"$ZLIB_VERSION\";" >>src/generated_versions_list.zig


@@ -15,14 +15,11 @@
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/modules/",
"${workspaceFolder}/src/js/builtins/",
"${workspaceFolder}/src/js/out",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/"
"${workspaceFolder}/src/deps/uws/uSockets/src"
],
"browse": {
"path": [
@@ -34,8 +31,6 @@
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
@@ -44,9 +39,7 @@
"${workspaceFolder}/src/bun.js/modules/*",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/packages/bun-usockets/",
"${workspaceFolder}/packages/bun-uws/",
"${workspaceFolder}/src/napi"
"${workspaceFolder}/src/deps/uws/uSockets/src"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb"

24
.vscode/launch.json generated vendored

@@ -4,6 +4,8 @@
// it makes our tests very slow
// But it helps catch memory bugs
// SIGHUP must be ignored or the debugger will pause when a spawned subprocess exits:
// { "initCommands": ["process handle -p false -s false -n false SIGHUP"] }
"version": "0.2.0",
"configurations": [
{
@@ -19,6 +21,7 @@
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -33,6 +36,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
@@ -47,6 +51,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -61,6 +66,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -75,6 +81,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -89,6 +96,7 @@
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -103,6 +111,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -117,6 +126,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -124,12 +134,13 @@
"request": "launch",
"name": "bun run [file]",
"program": "bun-debug",
"args": ["run", "${file}", "${file}"],
"args": ["run", "${file}"],
"cwd": "${fileDirname}",
"env": {
"FORCE_COLOR": "1",
"NODE_ENV": "development"
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -144,6 +155,7 @@
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -156,6 +168,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -168,6 +181,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -180,6 +194,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -201,7 +216,9 @@
"console": "internalConsole",
"env": {
"BUN_CONFIG_MINIFY_WHITESPACE": "1"
}
},
// SIGHUP must be ignored or the debugger will pause when a spawned subprocess exits.
"initCommands": ["process handle -p false -s false -n false SIGHUP"]
},
{
"type": "lldb",
@@ -225,6 +242,7 @@
"console": "internalConsole",
"env": {}
},
{
"type": "lldb",
"request": "launch",


@@ -7,8 +7,6 @@
"search.followSymlinks": false,
"search.useIgnoreFiles": true,
"zig.buildOnSave": false,
// We do this until we upgrade to latest Zig so that zls doesn't break our code.
"zig.formattingProvider": "extension",
"zig.buildArgs": ["obj", "-Dfor-editor"],
"zig.buildOption": "build",
"zig.buildFilePath": "${workspaceFolder}/build.zig",
@@ -79,8 +77,7 @@
"src/deps/tinycc": true,
"src/deps/zstd": true,
"test/snippets/package-json-exports/_node_modules_copy": true,
"src/js/out": true,
"src/packages/bun-uws/fuzzing/seed-corpus/": true
"src/js/out": true
},
"C_Cpp.files.exclude": {
"**/.vscode": true,


@@ -47,18 +47,34 @@ TODO: document this (see [`bindings.zig`](src/bun.js/bindings/bindings.zig) and
Copy from examples like `Subprocess` or `Response`.
### ESM Modules and Builtins JS
### ESM modules
Bun implements ESM modules in a mix of native code and JavaScript.
Several Node.js modules are implemented in JavaScript and loosely based on browserify polyfills.
Builtin modules in Bun are located in [`src/js`](src/js/). These files are transpiled and support a JavaScriptCore-only syntax for internal slots, which is explained further in [`src/js/README.md`](src/js/README.md).
Native C++ modules are in `src/bun.js/modules/`.
The ESM modules in Bun are located in [`src/bun.js/*.exports.js`](src/bun.js/). Unlike other code in Bun, these files are NOT transpiled. They are loaded directly into the JavaScriptCore VM. That means `require` does not work in these files. Instead, you must use `import.meta.require`, or ideally, not use require/import other files at all.
The module loader is in [`src/bun.js/module_loader.zig`](src/bun.js/module_loader.zig).
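For illustration only, here is a hedged sketch of what such an `*.exports.js` module might look like; the module name and exports below are invented and are not taken from the repository:

```js
// Hypothetical sketch of an *.exports.js module. These files are loaded
// directly into JavaScriptCore without transpilation, so CommonJS `require`
// is unavailable and `import.meta.require` is used instead.
var { EventEmitter } = import.meta.require("node:events");

export class ExampleEmitter extends EventEmitter {
  // invented class, purely illustrative
}

export default { ExampleEmitter };
```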
### JavaScript Builtins
TODO: update this with the new build process that uses TypeScript and `$` instead of `@`.
JavaScript builtins are located in [`src/js/builtins/*.ts`](src/js/builtins).
These files support a JavaScriptCore-only syntax for internal slots. `@` is used to access an internal slot. For example: `new @Array(123)` will create a new `Array` similar to `new Array(123)`, except if a library modifies the `Array` global, it will not affect the internal slot (`@Array`). These names must be allow-listed in `BunBuiltinNames.h` (though JavaScriptCore allowlists some names by default).
They can not use or reference ESM-modules. The files that end with `*Internals.js` are automatically loaded globally. Most usage of internals right now are the stream implementations (which share a lot of code from Safari/WebKit) and ImportMetaObject (which is how `require` is implemented in the runtime)
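As a minimal, hypothetical sketch of the `@` internal-slot syntax described above (not a file from the repository, and only compilable through the builtins code generator):

```js
// Hypothetical builtin sketch. `@Array` refers to the JSC internal slot, so
// even if user code overwrites the global `Array`, this still calls the real
// Array constructor.
function copyToArray(items) {
  "use strict";
  const copy = new @Array(items.length);
  for (let i = 0; i < items.length; i++) copy[i] = items[i];
  return copy;
}
```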
To regenerate the builtins:
```sh
make clean-bindings && make generate-builtins && make bindings -j10
```
It is recommended that you have ccache installed or else you will spend a lot of time waiting for the bindings to compile.
### Memory management in Bun's JavaScript runtime
TODO: fill this out (for now, use `JSC.Strong` in most cases)


@@ -10,9 +10,9 @@ ARG ARCH=x86_64
ARG BUILD_MACHINE_ARCH=x86_64
ARG TRIPLET=${ARCH}-linux-gnu
ARG BUILDARCH=amd64
ARG WEBKIT_TAG=2023-aug3-5
ARG WEBKIT_TAG=may20-3
ARG ZIG_TAG=jul1
ARG ZIG_VERSION="0.12.0-dev.163+6780a6bbf"
ARG ZIG_VERSION="0.11.0-dev.3737+9eb008717"
ARG WEBKIT_BASENAME="bun-webkit-linux-$BUILDARCH"
ARG ZIG_FOLDERNAME=zig-linux-${BUILD_MACHINE_ARCH}-${ZIG_VERSION}
@@ -20,7 +20,7 @@ ARG ZIG_FILENAME=${ZIG_FOLDERNAME}.tar.xz
ARG WEBKIT_URL="https://github.com/oven-sh/WebKit/releases/download/$WEBKIT_TAG/${WEBKIT_BASENAME}.tar.gz"
ARG ZIG_URL="https://ziglang.org/builds/${ZIG_FILENAME}"
ARG GIT_SHA=""
ARG BUN_BASE_VERSION=0.8
ARG BUN_BASE_VERSION=0.6
FROM bitnami/minideb:bullseye as bun-base
@@ -284,8 +284,7 @@ ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY packages/bun-uws ${BUN_DIR}/packages/bun-uws
COPY packages/bun-usockets ${BUN_DIR}/packages/bun-usockets
COPY src/deps/uws ${BUN_DIR}/src/deps/uws
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
COPY src/deps/libuwsockets.cpp ${BUN_DIR}/src/deps/libuwsockets.cpp
@@ -294,7 +293,7 @@ COPY src/deps/_libusockets.h ${BUN_DIR}/src/deps/_libusockets.h
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make uws && rm -rf packages/bun-uws Makefile
make uws && rm -rf src/deps/uws Makefile
FROM bun-base as base64

196
Makefile

@@ -6,6 +6,8 @@ BUN_AUTO_UPDATER_REPO = Jarred-Sumner/bun-releases-for-updater
CMAKE_CXX_COMPILER_LAUNCHER_FLAG :=
# 'make' command will trigger the help target
.DEFAULT_GOAL := help
@@ -18,7 +20,7 @@ CPU_TARGET ?= native
MARCH_NATIVE = -mtune=$(CPU_TARGET)
NATIVE_OR_OLD_MARCH =
MMD_IF_LOCAL =
MMD_IF_LOCAL =
DEFAULT_MIN_MACOS_VERSION=
ARCH_NAME :=
DOCKER_BUILDARCH =
@@ -38,7 +40,7 @@ NATIVE_OR_OLD_MARCH = -march=nehalem
endif
MIN_MACOS_VERSION ?= $(DEFAULT_MIN_MACOS_VERSION)
BUN_BASE_VERSION = 0.8
BUN_BASE_VERSION = 0.6
CI ?= false
@@ -50,8 +52,6 @@ endif
BUN_OR_NODE = $(shell which bun 2>/dev/null || which node 2>/dev/null)
CXX_VERSION=c++2a
TRIPLET = $(OS_NAME)-$(ARCH_NAME)
PACKAGE_NAME = bun-$(TRIPLET)
@@ -349,10 +349,10 @@ LINUX_INCLUDE_DIRS := $(ALL_JSC_INCLUDE_DIRS) \
-I$(ZLIB_INCLUDE_DIR)
UWS_INCLUDE_DIR := -I$(BUN_DIR)/packages/bun-usockets/src -I$(BUN_DIR)/packages -I$(BUN_DEPS_DIR)
UWS_INCLUDE_DIR := -I$(BUN_DEPS_DIR)/uws/uSockets/src -I$(BUN_DEPS_DIR)/uws/src -I$(BUN_DEPS_DIR)
INCLUDE_DIRS := $(UWS_INCLUDE_DIR) -I$(BUN_DEPS_DIR)/mimalloc/include -I$(BUN_DEPS_DIR)/zstd/include -Isrc/napi -I$(BUN_DEPS_DIR)/boringssl/include -I$(BUN_DEPS_DIR)/c-ares/include -Isrc/bun.js/modules
INCLUDE_DIRS := $(UWS_INCLUDE_DIR) -I$(BUN_DEPS_DIR)/mimalloc/include -I$(BUN_DEPS_DIR)/zstd/include -Isrc/napi -I$(BUN_DEPS_DIR)/boringssl/include -I$(BUN_DEPS_DIR)/c-ares/include
ifeq ($(OS_NAME),linux)
@@ -401,7 +401,6 @@ CLANG_FLAGS = $(INCLUDE_DIRS) \
-DSTATICALLY_LINKED_WITH_BMALLOC=1 \
-DBUILDING_WITH_CMAKE=1 \
-DBUN_SINGLE_THREADED_PER_VM_ENTRY_SCOPE=1 \
-DNAPI_EXPERIMENTAL=ON \
-DNDEBUG=1 \
-DNOMINMAX \
-DIS_BUILD \
@@ -555,13 +554,22 @@ tinycc:
PYTHON=$(shell which python 2>/dev/null || which python3 2>/dev/null || which python2 2>/dev/null)
.PHONY: builtins
builtins:
NODE_ENV=production bun src/js/builtins/codegen/index.ts --minify
.PHONY: esm
js: # to rebundle js (rebuilding binary not needed to reload js code)
NODE_ENV=production bun src/js/_codegen/index.ts
esm:
NODE_ENV=production bun src/js/build-esm.ts
esm-debug:
BUN_DEBUG_QUIET_LOGS=1 NODE_ENV=production bun-debug src/js/build-esm.ts
.PHONY: generate-builtins
generate-builtins: builtins
BUN_TYPES_REPO_PATH ?= $(realpath packages/bun-types)
ifeq ($(DEBUG),true)
@@ -661,8 +669,8 @@ else
PKGNAME_NINJA := ninja-build
endif
.PHONY: assert-deps
assert-deps:
.PHONY: require
require:
@echo "Checking if the required utilities are available..."
@if [ $(CLANG_VERSION) -lt "15" ]; then echo -e "ERROR: clang version >=15 required, found: $(CLANG_VERSION). Install with:\n\n $(POSIX_PKG_MANAGER) install llvm@15"; exit 1; fi
@cmake --version >/dev/null 2>&1 || (echo -e "ERROR: cmake is required."; exit 1)
@@ -674,24 +682,10 @@ assert-deps:
@which $(LIBTOOL) > /dev/null || (echo -e "ERROR: libtool is required. Install with:\n\n $(POSIX_PKG_MANAGER) install libtool"; exit 1)
@which ninja > /dev/null || (echo -e "ERROR: Ninja is required. Install with:\n\n $(POSIX_PKG_MANAGER) install $(PKGNAME_NINJA)"; exit 1)
@which pkg-config > /dev/null || (echo -e "ERROR: pkg-config is required. Install with:\n\n $(POSIX_PKG_MANAGER) install pkg-config"; exit 1)
@which rustc > /dev/null || (echo -e "ERROR: rustc is required." exit 1)
@which cargo > /dev/null || (echo -e "ERROR: cargo is required." exit 1)
@test $(shell cargo --version | awk '{print $$2}' | cut -d. -f2) -gt 57 || (echo -e "ERROR: cargo version must be at least 1.57."; exit 1)
@echo "You have the dependencies installed! Woo"
# the following allows you to run `make submodule` to update or init submodules. but we will exclude webkit
# unless you explicitly clone it yourself (a huge download)
SUBMODULE_NAMES=$(shell cat .gitmodules | grep 'path = ' | awk '{print $$3}')
ifeq ("$(wildcard src/bun.js/WebKit/.git)", "")
SUBMODULE_NAMES := $(filter-out src/bun.js/WebKit, $(SUBMODULE_NAMES))
endif
.PHONY: init-submodules
init-submodules: submodule # (backwards-compatibility alias)
.PHONY: submodule
submodule: ## to init or update all submodules
git submodule update --init --recursive --progress --depth=1 --checkout $(SUBMODULE_NAMES)
init-submodules:
git submodule update --init --recursive --progress --depth=1 --checkout
.PHONY: build-obj
build-obj:
@@ -707,46 +701,44 @@ dev-build-obj-wasm:
.PHONY: dev-wasm
dev-wasm: dev-build-obj-wasm
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init', '_getTests']" \
-g2 -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/$(MIMALLOC_FILE).wasm \
packages/debug-bun-freestanding-wasm32/bun-wasm.o --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init']" \
-g -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/libmimalloc.a.wasm \
packages/debug-bun-freestanding-wasm32/bun-wasm.o $(OPTIMIZATION_LEVEL) --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
-o packages/debug-bun-freestanding-wasm32/bun-wasm.wasm
cp packages/debug-bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
cp packages/debug-bun-freestanding-wasm32/bun-wasm.wasm src/api/demo/public/bun-wasm.wasm
.PHONY: build-obj-wasm
build-obj-wasm:
$(ZIG) build bun-wasm -Doptimize=ReleaseFast -Dtarget=wasm32-freestanding
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init', '_getTests']" \
-s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/$(MIMALLOC_FILE).wasm \
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init']" \
-g -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/libmimalloc.a.wasm \
packages/bun-freestanding-wasm32/bun-wasm.o $(OPTIMIZATION_LEVEL) --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
-o packages/bun-freestanding-wasm32/bun-wasm.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm src/api/demo/public/bun-wasm.wasm
.PHONY: build-obj-wasm-small
build-obj-wasm-small:
$(ZIG) build bun-wasm -Doptimize=ReleaseFast -Dtarget=wasm32-freestanding
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init', '_getTests']" \
-Oz -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/$(MIMALLOC_FILE).wasm \
$(ZIG) build bun-wasm -Doptimize=ReleaseSmall -Dtarget=wasm32-freestanding
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init']" \
-g -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/libmimalloc.a.wasm \
packages/bun-freestanding-wasm32/bun-wasm.o -Oz --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
-o packages/bun-freestanding-wasm32/bun-wasm.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm src/api/demo/public/bun-wasm.wasm
.PHONY: wasm
wasm: api mimalloc-wasm build-obj-wasm-small
@rm -rf packages/bun-wasm/*.{d.ts,d.cts,d.mts,js,wasm,cjs,mjs,tsbuildinfo}
wasm: api build-obj-wasm-small
@rm -rf packages/bun-wasm/*.{d.ts,js,wasm,cjs,mjs,tsbuildinfo}
@cp packages/bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
@cp src/api/schema.d.ts packages/bun-wasm/schema.d.ts
@cp src/api/schema.js packages/bun-wasm/schema.js
@cd packages/bun-wasm && $(NPM_CLIENT) run tsc -- -p .
@cp packages/bun-wasm/index.d.ts packages/bun-wasm/index.d.cts
@mv packages/bun-wasm/index.d.ts packages/bun-wasm/index.d.mts
@bun build --sourcemap=external --external=fs --outdir=packages/bun-wasm --target=browser --minify ./packages/bun-wasm/index.ts
@$(ESBUILD) --sourcemap=external --external:fs --define:process.env.NODE_ENV='"production"' --outdir=packages/bun-wasm --target=esnext --bundle packages/bun-wasm/index.ts --format=esm --minify 2> /dev/null
@mv packages/bun-wasm/index.js packages/bun-wasm/index.mjs
@mv packages/bun-wasm/index.js.map packages/bun-wasm/index.mjs.map
@$(ESBUILD) --sourcemap=external --external:fs --outdir=packages/bun-wasm --target=esnext --bundle packages/bun-wasm/index.ts --format=cjs --minify --platform=node 2> /dev/null
@$(ESBUILD) --sourcemap=external --external:fs --define:process.env.NODE_ENV='"production"' --outdir=packages/bun-wasm --target=esnext --bundle packages/bun-wasm/index.ts --format=cjs --minify --platform=node 2> /dev/null
@mv packages/bun-wasm/index.js packages/bun-wasm/index.cjs
@mv packages/bun-wasm/index.js.map packages/bun-wasm/index.cjs.map
@rm -rf packages/bun-wasm/*.tsbuildinfo
@@ -760,17 +752,17 @@ build-obj-safe:
UWS_CC_FLAGS = -pthread -DLIBUS_USE_OPENSSL=1 -DUWS_HTTPRESPONSE_NO_WRITEMARK=1 -DLIBUS_USE_BORINGSSL=1 -DWITH_BORINGSSL=1 -Wpedantic -Wall -Wextra -Wsign-conversion -Wconversion $(UWS_INCLUDE) -DUWS_WITH_PROXY
UWS_CXX_FLAGS = $(UWS_CC_FLAGS) -std=$(CXX_VERSION) -fno-exceptions -fno-rtti
UWS_LDFLAGS = -I$(BUN_DEPS_DIR)/boringssl/include -I$(ZLIB_INCLUDE_DIR)
USOCKETS_DIR = $(BUN_DIR)/packages/bun-usockets
USOCKETS_SRC_DIR = $(USOCKETS_DIR)/src
USOCKETS_DIR = $(BUN_DEPS_DIR)/uws/uSockets/
USOCKETS_SRC_DIR = $(BUN_DEPS_DIR)/uws/uSockets/src/
usockets:
rm -rf $(USOCKETS_DIR)/*.i $(USOCKETS_DIR)/*.bc $(USOCKETS_DIR)/*.o $(USOCKETS_DIR)/*.s $(USOCKETS_DIR)/*.ii $(USOCKETS_DIR)/*.s
cd $(USOCKETS_DIR) && $(CC_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(UWS_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.c) $(wildcard $(USOCKETS_SRC_DIR)/**/*.c)
cd $(USOCKETS_DIR) && $(CXX_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CXXFLAGS) $(UWS_CXX_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.cpp) $(wildcard $(USOCKETS_SRC_DIR)/**/*.cpp)
rm -rf $(BUN_DEPS_DIR)/uws/uSockets/*.o $(BUN_DEPS_DIR)/uws/uSockets/**/*.o $(BUN_DEPS_DIR)/uws/uSockets/*.a $(BUN_DEPS_DIR)/uws/uSockets/*.bc
cd $(USOCKETS_DIR) && $(CC_WITH_CCACHE) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(UWS_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.c) $(wildcard $(USOCKETS_SRC_DIR)/**/*.c)
cd $(USOCKETS_DIR) && $(CXX_WITH_CCACHE) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CXXFLAGS) $(UWS_CXX_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.cpp) $(wildcard $(USOCKETS_SRC_DIR)/**/*.cpp)
cd $(USOCKETS_DIR) && $(AR) rcvs $(BUN_DEPS_OUT_DIR)/libusockets.a $(USOCKETS_DIR)/*.{o,bc}
uws: usockets
$(CXX_WITH_CCACHE) -O2 $(EMIT_LLVM_FOR_RELEASE) -fPIC -I$(USOCKETS_SRC_DIR) $(CLANG_FLAGS) $(CFLAGS) $(UWS_CXX_FLAGS) $(UWS_LDFLAGS) $(PLATFORM_LINKER_FLAGS) -c -I$(BUN_DEPS_DIR) $(BUN_DEPS_OUT_DIR)/libusockets.a $(BUN_DEPS_DIR)/libuwsockets.cpp -o $(BUN_DEPS_OUT_DIR)/libuwsockets.o
$(CXX_WITH_CCACHE) -O2 $(EMIT_LLVM_FOR_RELEASE) -fPIC -I$(BUN_DEPS_DIR)/uws/uSockets/src $(CLANG_FLAGS) $(CFLAGS) $(UWS_CXX_FLAGS) $(UWS_LDFLAGS) $(PLATFORM_LINKER_FLAGS) -c -I$(BUN_DEPS_DIR) $(BUN_DEPS_OUT_DIR)/libusockets.a $(BUN_DEPS_DIR)/libuwsockets.cpp -o $(BUN_DEPS_OUT_DIR)/libuwsockets.o
.PHONY: sign-macos-x64
sign-macos-x64:
@@ -900,8 +892,7 @@ check-glibc-version-dependency:
ifeq ($(OS_NAME),darwin)
zig-win32:
$(ZIG) build -Dtarget=x86_64-windows
# Hardened runtime will not work with debugging
bun-codesign-debug:
@@ -945,7 +936,6 @@ headers:
$(ZIG) translate-c src/bun.js/bindings/headers.h > src/bun.js/bindings/headers.zig
$(BUN_OR_NODE) misctools/headers-cleaner.js
$(ZIG) fmt src/bun.js/bindings/headers.zig
$(CLANG_FORMAT) -i src/bun.js/bindings/ZigGeneratedCode.cpp
.PHONY: jsc-bindings-headers
jsc-bindings-headers: headers
@@ -1090,30 +1080,17 @@ test/wiptest/run: test/wiptest/run.o
release-bin-dir:
echo $(PACKAGE_DIR)
.PHONY: dev-obj-track
dev-obj-track:
bun .scripts/make-dev-timer.ts $(ZIG) build obj -freference-trace -Dcpu="$(CPU_TARGET)"
.PHONY: dev-obj-notrack
dev-obj-notrack:
$(ZIG) build obj -freference-trace -Dcpu="$(CPU_TARGET)"
.PHONY: dev-obj
dev-obj:
ifeq ($(shell which bun),)
dev-obj : dev-obj-notrack
else
dev-obj : dev-obj-track
endif
$(ZIG) build obj -freference-trace -Dcpu="$(CPU_TARGET)"
.PHONY: dev-obj-linux
dev-obj-linux:
$(ZIG) build obj -Dtarget=x86_64-linux-gnu -Dcpu="$(CPU_TARGET)"
.PHONY: dev
dev: mkdir-dev esm dev-obj link
mkdir-dev:
mkdir -p $(DEBUG_PACKAGE_DIR)
@@ -1197,12 +1174,10 @@ jsc-build-mac-compile:
-DPORT="JSCOnly" \
-DENABLE_STATIC_JSC=ON \
-DENABLE_SINGLE_THREADED_VM_ENTRY_SCOPE=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-DCMAKE_BUILD_TYPE=Release \
-DCMAKE_BUILD_TYPE=relwithdebuginfo \
-DUSE_THIN_ARCHIVES=OFF \
-DBUN_FAST_TLS=ON \
-DENABLE_FTL_JIT=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-G Ninja \
$(CMAKE_FLAGS_WITHOUT_RELEASE) \
-DPTHREAD_JIT_PERMISSIONS_API=1 \
@@ -1221,11 +1196,9 @@ jsc-build-mac-compile-lto:
-DPORT="JSCOnly" \
-DENABLE_STATIC_JSC=ON \
-DENABLE_SINGLE_THREADED_VM_ENTRY_SCOPE=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_THIN_ARCHIVES=OFF \
-DBUN_FAST_TLS=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DCMAKE_C_FLAGS="-flto=full" \
-DCMAKE_CXX_FLAGS="-flto=full" \
-DENABLE_FTL_JIT=ON \
@@ -1250,8 +1223,6 @@ jsc-build-mac-compile-debug:
-DUSE_THIN_ARCHIVES=OFF \
-DENABLE_FTL_JIT=ON \
-DCMAKE_EXPORT_COMPILE_COMMANDS=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-G Ninja \
$(CMAKE_FLAGS_WITHOUT_RELEASE) \
-DPTHREAD_JIT_PERMISSIONS_API=1 \
@@ -1270,13 +1241,11 @@ jsc-build-linux-compile-config:
cmake \
-DPORT="JSCOnly" \
-DENABLE_STATIC_JSC=ON \
-DCMAKE_BUILD_TYPE=Release \
-DCMAKE_BUILD_TYPE=relwithdebuginfo \
-DUSE_THIN_ARCHIVES=OFF \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DENABLE_FTL_JIT=ON \
-DENABLE_REMOTE_INSPECTOR=ON \
-DJSEXPORT_PRIVATE=WTF_EXPORT_DECLARATION \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-USE_VISIBILITY_ATTRIBUTE=1 \
-DCMAKE_EXPORT_COMPILE_COMMANDS=ON \
-G Ninja \
@@ -1291,7 +1260,7 @@ jsc-build-linux-compile-config:
jsc-build-linux-compile-build:
mkdir -p $(WEBKIT_RELEASE_DIR) && \
cd $(WEBKIT_RELEASE_DIR) && \
CFLAGS="$(CFLAGS) -Wl,--whole-archive -ffat-lto-objects" CXXFLAGS="$(CXXFLAGS) -Wl,--whole-archive -ffat-lto-objects" -DUSE_BUN_JSC_ADDITIONS=ON \
CFLAGS="$(CFLAGS) -Wl,--whole-archive -ffat-lto-objects" CXXFLAGS="$(CXXFLAGS) -Wl,--whole-archive -ffat-lto-objects" \
cmake --build $(WEBKIT_RELEASE_DIR) --config relwithdebuginfo --target jsc
@@ -1388,16 +1357,16 @@ mimalloc:
mimalloc-wasm:
rm -rf $(BUN_DEPS_DIR)/mimalloc/CMakeCache* $(BUN_DEPS_DIR)/mimalloc/CMakeFiles
cd $(BUN_DEPS_DIR)/mimalloc; emcmake cmake -DMI_BUILD_SHARED=OFF -DMI_BUILD_STATIC=ON -DMI_BUILD_TESTS=OFF -GNinja -DMI_BUILD_OBJECT=ON ${MIMALLOC_OVERRIDE_FLAG} -DMI_USE_CXX=OFF .; emmake cmake --build .;
cd $(BUN_DEPS_DIR)/mimalloc; emcmake cmake -DMI_BUILD_SHARED=OFF -DMI_BUILD_STATIC=ON -DMI_BUILD_TESTS=OFF -DMI_BUILD_OBJECT=ON ${MIMALLOC_OVERRIDE_FLAG} -DMI_USE_CXX=ON .; emmake make;
cp $(BUN_DEPS_DIR)/mimalloc/$(MIMALLOC_INPUT_PATH) $(BUN_DEPS_OUT_DIR)/$(MIMALLOC_FILE).wasm
# alias for link, in case anyone still types that
.PHONY: bun-link-lld-debug
bun-link-lld-debug: link
# link a debug build of bun
.PHONY: link
link: ## link a debug build of bun
link:
$(CXX) $(BUN_LLD_FLAGS_DEBUG) $(DEBUG_FLAGS) $(SYMBOLS) \
-g \
$(DEBUG_BIN)/bun-debug.o \
@@ -1492,7 +1461,7 @@ generate-classes:
generate-sink:
bun src/bun.js/scripts/generate-jssink.js
$(CLANG_FORMAT) -i src/bun.js/bindings/JSSink.cpp src/bun.js/bindings/JSSink.h
./src/bun.js/scripts/create_hash_table src/bun.js/bindings/JSSink.cpp > src/bun.js/bindings/JSSinkLookupTable.h
$(WEBKIT_DIR)/Source/JavaScriptCore/create_hash_table src/bun.js/bindings/JSSink.cpp > src/bun.js/bindings/JSSinkLookupTable.h
$(SED) -i -e 's/#include "Lookup.h"//' src/bun.js/bindings/JSSinkLookupTable.h
$(SED) -i -e 's/namespace JSC {//' src/bun.js/bindings/JSSinkLookupTable.h
$(SED) -i -e 's/} \/\/ namespace JSC//' src/bun.js/bindings/JSSinkLookupTable.h
@@ -1822,7 +1791,7 @@ endif
endif
.PHONY: build-unit
build-unit: # to build your unit tests
build-unit: ## to build your unit tests
@rm -rf zig-out/bin/$(testname)
@mkdir -p zig-out/bin
zig test $(realpath $(testpath)) \
@@ -1840,7 +1809,7 @@ build-unit: # to build your unit tests
cp zig-out/bin/$(testname) $(testbinpath)
.PHONY: run-all-unit-tests
run-all-unit-tests: # to run your unit tests
run-all-unit-tests: ## to run your unit tests
@rm -rf zig-out/bin/__main_test
@mkdir -p zig-out/bin
zig test src/main.zig \
@@ -1860,11 +1829,15 @@ run-all-unit-tests: # to run your unit tests
run-unit:
@zig-out/bin/$(testname) $(ZIG)
.PHONY: help
help: ## to print this help
@awk 'BEGIN {FS = ":.*?## "} /^[a-zA-Z0-9_-]+:.*?## / {gsub("\\\\n",sprintf("\n%22c",""), $$2);printf "\033[36m%-20s\033[0m \t\t%s\n", $$1, $$2}' $(MAKEFILE_LIST)
.PHONY: test
test: build-unit run-unit
.PHONY: integration-test-dev
integration-test-dev: # to run integration tests
integration-test-dev: ## to run integration tests
USE_EXISTING_PROCESS=true TEST_SERVER_URL=http://localhost:3000 node test/scripts/browser.js
copy-install:
@@ -1907,47 +1880,24 @@ vendor-without-npm: node-fallbacks runtime_js fallback_decoder bun_error mimallo
vendor-without-check: npm-install vendor-without-npm
.PHONY: vendor
vendor: assert-deps submodule vendor-without-check
vendor: require init-submodules vendor-without-check
.PHONY: vendor-dev
vendor-dev: assert-deps submodule npm-install-dev vendor-without-npm
vendor-dev: require init-submodules npm-install-dev vendor-without-npm
.PHONY: bun
bun: vendor identifier-cache build-obj bun-link-lld-release bun-codesign-release-local
.PHONY: cpp
cpp: ## compile src/js/builtins + all c++ code then link
@make clean-bindings js
.PHONY: regenerate-bindings
regenerate-bindings:
@make clean-bindings builtins
@make bindings -j$(CPU_COUNT)
@make link
.PHONY: cpp
cpp-no-link:
@make clean-bindings js
@make bindings -j$(CPU_COUNT)
.PHONY: zig
zig: ## compile zig code then link
@make mkdir-dev dev-obj link
.PHONY: zig-no-link
zig-no-link:
@make mkdir-dev dev-obj
.PHONY: dev
dev: # combo of `make cpp` and `make zig`
@make cpp-no-link zig-no-link -j2
@make link
.PHONY: setup
setup: vendor-dev identifier-cache clean-bindings
make jsc-check dev
make jsc-check
make bindings -j$(CPU_COUNT)
@echo ""
@echo "First build complete!"
@echo "\"bun-debug\" is available at $(DEBUG_BIN)/bun-debug"
@echo "Development environment setup complete"
@echo "Run \`make dev\` to build \`bun-debug\`"
@echo ""
.PHONY: help
help: ## to print this help
@echo "For detailed build instructions, see https://bun.sh/docs/project/development"
@awk 'BEGIN {FS = ":.*?## "} /^[a-zA-Z0-9_-]+:.*?## / {gsub("\\\\n",sprintf("\n%22c",""), $$2);printf "\033[36m%-20s\033[0m \t\t%s\n", $$1, $$2}' $(MAKEFILE_LIST)


@@ -24,7 +24,7 @@
## What is Bun?
> **Bun is still under development.** Use it to speed up your development workflows or run simpler production code in resource-constrained environments like serverless functions. We're working on more complete Node.js compatibility and integration with existing frameworks. Join the [Discord](https://bun.sh/discord) and watch the [GitHub repository](https://github.com/oven-sh/bun) to keep tabs on future releases.
> **Bun is still under development.** Use it to speed up your development workflows or run simpler production code in resource-constrained environments like serverless functions. We're working on more complete Node.js compatibility and integration with existing frameworks. Join the [Discord](https://bun.sh/discord) and watch the [GitHub repository](https://github.com/oven-sh/bun) to keeps tabs on future releases.
Bun is an all-in-one toolkit for JavaScript and TypeScript apps. It ships as a single executable called `bun`.
@@ -123,6 +123,7 @@ bun upgrade --canary
- [HTMLRewriter](https://bun.sh/docs/api/html-rewriter)
- [Testing](https://bun.sh/docs/api/test)
- [Utils](https://bun.sh/docs/api/utils)
- [DNS](https://bun.sh/docs/api/dns)
- [Node-API](https://bun.sh/docs/api/node-api)
## Contributing


@@ -1,30 +0,0 @@
// https://github.com/nodejs/node/issues/34493
import { AsyncLocalStorage } from "async_hooks";
const asyncLocalStorage = new AsyncLocalStorage();
// let fn = () => Promise.resolve(2).then(() => new Promise(resolve => queueMicrotask(resolve)));
let fn = () => /test/.test("test");
let runWithExpiry = async (expiry, fn) => {
let iterations = 0;
while (Date.now() < expiry) {
await fn();
iterations++;
}
return iterations;
};
console.log(`Performed ${await runWithExpiry(Date.now() + 1000, fn)} iterations to warmup`);
let withAls;
await asyncLocalStorage.run(123, async () => {
withAls = await runWithExpiry(Date.now() + 45000, fn);
console.log(`Performed ${withAls} iterations (with ALS enabled)`);
});
asyncLocalStorage.disable();
let withoutAls = await runWithExpiry(Date.now() + 45000, fn);
console.log(`Performed ${withoutAls} iterations (with ALS disabled)`);
console.log("ALS penalty: " + Math.round((1 - withAls / withoutAls) * 10000) / 100 + "%");


@@ -3,6 +3,6 @@
"module": "index.ts",
"type": "module",
"devDependencies": {
"bun-types": "^0.7.0"
"bun-types": "^0.5.0"
}
}


@@ -43,7 +43,7 @@ pub fn main() anyerror!void {
var position = try std.fmt.parseInt(u32, position_str, 10);
const filepath = try std.fs.path.resolve(allocator, &.{basepath});
var file = try std.fs.openFileAbsolute(filepath, .{ .write = true });
var ms = @as(u64, @truncate((try std.fmt.parseInt(u128, args[args.len - 1], 10)) * std.time.ns_per_ms));
var ms = @truncate(u64, (try std.fmt.parseInt(u128, args[args.len - 1], 10)) * std.time.ns_per_ms);
std.debug.assert(ms > 0);
// std.debug.assert(std.math.isFinite(position));
var prng = std.rand.DefaultPrng.init(0);
@@ -125,30 +125,30 @@ pub fn main() anyerror!void {
);
};
counters[counter].timestamp = @as(u64, @truncate(@as(u128, @intCast(std.time.nanoTimestamp())) / (std.time.ns_per_ms / 10)));
counters[counter].timestamp = @truncate(u64, @intCast(u128, std.time.nanoTimestamp()) / (std.time.ns_per_ms / 10));
counters[counter].rotate = rotate % 360;
counters[counter].percent = std.math.mod(f64, std.math.round(((progress_bar + 1.0) / destination_count) * 1000) / 1000, 100) catch 0;
counters[counter].color_values[0] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[0][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[1] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[0][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[2] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[0][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[0] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[0][0] + 1) % 256)) * 0.8));
counters[counter].color_values[1] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[0][1] + 1) % 256)) * 0.8));
counters[counter].color_values[2] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[0][2] + 1) % 256)) * 0.8));
counters[counter].color_values[3] = (colors[0][0] + 1) % 256;
counters[counter].color_values[4] = (colors[0][1] + 1) % 256;
counters[counter].color_values[5] = (colors[0][2] + 1) % 256;
counters[counter].color_values[6] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[1][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[7] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[1][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[8] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[1][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[6] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[1][0] + 1) % 256)) * 0.8));
counters[counter].color_values[7] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[1][1] + 1) % 256)) * 0.8));
counters[counter].color_values[8] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[1][2] + 1) % 256)) * 0.8));
counters[counter].color_values[9] = (colors[1][0] + 1) % 256;
counters[counter].color_values[10] = (colors[1][1] + 1) % 256;
counters[counter].color_values[11] = (colors[1][2] + 1) % 256;
counters[counter].color_values[12] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[2][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[13] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[2][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[14] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[2][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[12] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[2][0] + 1) % 256)) * 0.8));
counters[counter].color_values[13] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[2][1] + 1) % 256)) * 0.8));
counters[counter].color_values[14] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[2][2] + 1) % 256)) * 0.8));
counters[counter].color_values[15] = (colors[2][0] + 1) % 256;
counters[counter].color_values[16] = (colors[2][1] + 1) % 256;
counters[counter].color_values[17] = (colors[2][2] + 1) % 256;
counters[counter].color_values[18] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[3][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[19] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[3][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[20] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[3][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[18] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[3][0] + 1) % 256)) * 0.8));
counters[counter].color_values[19] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[3][1] + 1) % 256)) * 0.8));
counters[counter].color_values[20] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[3][2] + 1) % 256)) * 0.8));
counters[counter].color_values[21] = (colors[3][0] + 1) % 256;
counters[counter].color_values[22] = (colors[3][1] + 1) % 256;
counters[counter].color_values[23] = (colors[3][2] + 1) % 256;


@@ -43,7 +43,7 @@ pub fn main() anyerror!void {
var position = try std.fmt.parseInt(u32, position_str, 10);
const filepath = try std.fs.path.resolve(allocator, &.{basepath});
var file = try std.fs.openFileAbsolute(filepath, .{ .write = true });
var ms = @as(u64, @truncate((try std.fmt.parseInt(u128, args[args.len - 1], 10)) * std.time.ns_per_ms));
var ms = @truncate(u64, (try std.fmt.parseInt(u128, args[args.len - 1], 10)) * std.time.ns_per_ms);
std.debug.assert(ms > 0);
// std.debug.assert(std.math.isFinite(position));
var prng = std.rand.DefaultPrng.init(0);
@@ -112,30 +112,30 @@ pub fn main() anyerror!void {
\\
++ SIMULATE_LONG_FILE;
counters[counter].timestamp = @as(u64, @truncate(@as(u128, @intCast(std.time.nanoTimestamp())) / (std.time.ns_per_ms / 10)));
counters[counter].timestamp = @truncate(u64, @intCast(u128, std.time.nanoTimestamp()) / (std.time.ns_per_ms / 10));
counters[counter].rotate = rotate % 360;
counters[counter].percent = std.math.mod(f64, std.math.round(((progress_bar + 1.0) / destination_count) * 1000) / 1000, 100) catch 0;
counters[counter].color_values[0] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[0][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[1] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[0][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[2] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[0][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[0] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[0][0] + 1) % 256)) * 0.8));
counters[counter].color_values[1] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[0][1] + 1) % 256)) * 0.8));
counters[counter].color_values[2] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[0][2] + 1) % 256)) * 0.8));
counters[counter].color_values[3] = (colors[0][0] + 1) % 256;
counters[counter].color_values[4] = (colors[0][1] + 1) % 256;
counters[counter].color_values[5] = (colors[0][2] + 1) % 256;
counters[counter].color_values[6] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[1][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[7] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[1][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[8] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[1][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[6] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[1][0] + 1) % 256)) * 0.8));
counters[counter].color_values[7] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[1][1] + 1) % 256)) * 0.8));
counters[counter].color_values[8] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[1][2] + 1) % 256)) * 0.8));
counters[counter].color_values[9] = (colors[1][0] + 1) % 256;
counters[counter].color_values[10] = (colors[1][1] + 1) % 256;
counters[counter].color_values[11] = (colors[1][2] + 1) % 256;
counters[counter].color_values[12] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[2][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[13] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[2][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[14] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[2][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[12] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[2][0] + 1) % 256)) * 0.8));
counters[counter].color_values[13] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[2][1] + 1) % 256)) * 0.8));
counters[counter].color_values[14] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[2][2] + 1) % 256)) * 0.8));
counters[counter].color_values[15] = (colors[2][0] + 1) % 256;
counters[counter].color_values[16] = (colors[2][1] + 1) % 256;
counters[counter].color_values[17] = (colors[2][2] + 1) % 256;
counters[counter].color_values[18] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[3][0] + 1) % 256))) * 0.8)));
counters[counter].color_values[19] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[3][1] + 1) % 256))) * 0.8)));
counters[counter].color_values[20] = @as(u32, @intFromFloat(std.math.round(@as(f64, @floatFromInt(((colors[3][2] + 1) % 256))) * 0.8)));
counters[counter].color_values[18] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[3][0] + 1) % 256)) * 0.8));
counters[counter].color_values[19] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[3][1] + 1) % 256)) * 0.8));
counters[counter].color_values[20] = @intFromFloat(u32, std.math.round(@floatFromInt(f64, ((colors[3][2] + 1) % 256)) * 0.8));
counters[counter].color_values[21] = (colors[3][0] + 1) % 256;
counters[counter].color_values[22] = (colors[3][1] + 1) % 256;
counters[counter].color_values[23] = (colors[3][2] + 1) % 256;

View File

@@ -5,4 +5,4 @@
"jsx": "react-jsx",
"paths": {}
}
}
}

View File

@@ -1,4 +1,4 @@
import { bench, run } from "mitata";
import { bench, run } from "../node_modules/mitata/src/cli.mjs";
bench("noop", function () {});
bench("async function(){}", async function () {});

View File

@@ -0,0 +1,14 @@
import { bench, run } from "./runner.mjs";
import { Buffer } from "node:buffer";
const bigBuffer = Buffer.from("hello world".repeat(10000));
const converted = bigBuffer.toString("base64");
bench("Buffer.toString('base64')", () => {
return bigBuffer.toString("base64");
});
// bench("Buffer.from(str, 'base64')", () => {
// return Buffer.from(converted, "base64");
// });
await run();

View File

@@ -1,29 +0,0 @@
import { bench, run } from "./runner.mjs";
import { Buffer } from "node:buffer";
import crypto from "node:crypto";
const bigBuffer = Buffer.from("hello world".repeat(10000));
const converted = bigBuffer.toString("base64");
const uuid = crypto.randomBytes(16);
bench(`Buffer(${bigBuffer.byteLength}).toString('base64')`, () => {
return bigBuffer.toString("base64");
});
bench(`Buffer(${uuid.byteLength}).toString('base64')`, () => {
return uuid.toString("base64");
});
bench(`Buffer(${bigBuffer.byteLength}).toString('hex')`, () => {
return bigBuffer.toString("hex");
});
bench(`Buffer(${uuid.byteLength}).toString('hex')`, () => {
return uuid.toString("hex");
});
bench(`Buffer(${bigBuffer.byteLength}).toString('ascii')`, () => {
return bigBuffer.toString("ascii");
});
await run();

View File

@@ -1,3 +0,0 @@
import { cp } from "fs/promises";
await cp(process.argv[2], process.argv[3], { recursive: true });

View File

@@ -1,31 +0,0 @@
import { mkdirSync, writeFileSync } from "fs";
import { bench, run } from "./runner.mjs";
import { cp } from "fs/promises";
import { join } from "path";
import { tmpdir } from "os";
const hugeDirectory = (() => {
const root = join(tmpdir(), "huge");
const base = join(root, "directory", "for", "benchmarks", "1", "2", "3", "4", "5", "6", "7", "8", "9", "10");
mkdirSync(base, {
recursive: true,
});
for (let i = 0; i < 1000; i++) {
writeFileSync(join(base, "file-" + i + ".txt"), "Hello, world! " + i);
}
return root;
})();
const hugeFilePath = join(tmpdir(), "huge-file-0.txt");
const hugeText = "Hello, world!".repeat(1000000);
writeFileSync(hugeFilePath, hugeText);
var hugeCopyI = 0;
bench("cp -r (1000 files)", async b => {
await cp(hugeDirectory, join(tmpdir(), "huge-copy" + hugeCopyI++), { recursive: true });
});
bench("cp 1 " + ((hugeText.length / 1024) | 0) + " KB file", async b => {
await cp(hugeFilePath, join(tmpdir(), "huge-file" + hugeCopyI++));
});
await run();

View File

@@ -1,12 +0,0 @@
import { bench, run } from "./runner.mjs";
var err = new Error();
bench("Error.captureStackTrace(err)", () => {
Error.captureStackTrace(err);
});
bench("Error.prototype.stack", () => {
new Error().stack;
});
await run();

View File

@@ -1,80 +0,0 @@
import { bench, run } from "../node_modules/mitata/src/cli.mjs";
// This is a benchmark of the performance impact of using private properties.
bench("Polyfillprivate", () => {
"use strict";
var __classPrivateFieldGet =
(this && this.__classPrivateFieldGet) ||
function (receiver, state, kind, f) {
if (kind === "a" && !f) throw new TypeError("Private accessor was defined without a getter");
if (typeof state === "function" ? receiver !== state || !f : !state.has(receiver))
throw new TypeError("Cannot read private member from an object whose class did not declare it");
return kind === "m" ? f : kind === "a" ? f.call(receiver) : f ? f.value : state.get(receiver);
};
var __classPrivateFieldSet =
(this && this.__classPrivateFieldSet) ||
function (receiver, state, value, kind, f) {
if (kind === "m") throw new TypeError("Private method is not writable");
if (kind === "a" && !f) throw new TypeError("Private accessor was defined without a setter");
if (typeof state === "function" ? receiver !== state || !f : !state.has(receiver))
throw new TypeError("Cannot write private member to an object whose class did not declare it");
return kind === "a" ? f.call(receiver, value) : f ? (f.value = value) : state.set(receiver, value), value;
};
var _Foo_state, _Foo_inc;
class Foo {
constructor() {
_Foo_state.set(this, 1);
_Foo_inc.set(this, 13);
}
run() {
let n = 1000000;
while (n-- > 0) {
__classPrivateFieldSet(
this,
_Foo_state,
__classPrivateFieldGet(this, _Foo_state, "f") + __classPrivateFieldGet(this, _Foo_inc, "f"),
"f",
);
}
return n;
}
}
(_Foo_state = new WeakMap()), (_Foo_inc = new WeakMap());
new Foo().run();
});
bench("NativePrivates", () => {
class Foo {
#state = 1;
#inc = 13;
run() {
let n = 1000000;
while (n-- > 0) {
this.#state += this.#inc;
}
return n;
}
}
new Foo().run();
});
bench("ConventionalPrivates", () => {
class Foo {
_state = 1;
_inc = 13;
run() {
let n = 1000000;
while (n-- > 0) {
this._state += this._inc;
}
return n;
}
}
new Foo().run();
});
await run();

View File

@@ -1,33 +0,0 @@
import { bench, run } from "./runner.mjs";
import { performance } from "perf_hooks";
bench("process.memoryUsage()", () => {
process.memoryUsage();
});
bench("process.memoryUsage.rss()", () => {
process.memoryUsage.rss();
});
bench("process.cpuUsage()", () => {
process.cpuUsage();
});
const init = process.cpuUsage();
bench("process.cpuUsage(delta)", () => {
process.cpuUsage(init);
});
bench("performance.now()", () => {
performance.now();
});
bench("process.hrtime()", () => {
process.hrtime();
});
bench("process.hrtime.bigint()", () => {
process.hrtime.bigint();
});
await run();

View File

@@ -1,15 +0,0 @@
// This mostly exists to check for a memory leak in response.clone()
import { bench, run } from "./runner.mjs";
const req = new Request("http://localhost:3000/");
const resp = await fetch("http://example.com");
bench("req.clone().url", () => {
return req.clone().url;
});
bench("resp.clone().url", () => {
return resp.clone().url;
});
await run();

View File

@@ -1,136 +0,0 @@
// This snippet mostly exists to reproduce a memory leak
//
import { bench, run } from "mitata";
const obj = {
"id": 1296269,
"node_id": "MDEwOlJlcG9zaXRvcnkxMjk2MjY5",
"name": "Hello-World",
"full_name": "octocat/Hello-World",
"owner": {
"login": "octocat",
"id": 1,
"node_id": "MDQ6VXNlcjE=",
"avatar_url": "https://github.com/images/error/octocat_happy.gif",
"gravatar_id": "",
"url": "https://api.github.com/users/octocat",
"html_url": "https://github.com/octocat",
"followers_url": "https://api.github.com/users/octocat/followers",
"following_url": "https://api.github.com/users/octocat/following{/other_user}",
"gists_url": "https://api.github.com/users/octocat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/octocat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/octocat/subscriptions",
"organizations_url": "https://api.github.com/users/octocat/orgs",
"repos_url": "https://api.github.com/users/octocat/repos",
"events_url": "https://api.github.com/users/octocat/events{/privacy}",
"received_events_url": "https://api.github.com/users/octocat/received_events",
"type": "User",
"site_admin": false,
},
"private": false,
"html_url": "https://github.com/octocat/Hello-World",
"description": "This your first repo!",
"fork": false,
"url": "https://api.github.com/repos/octocat/Hello-World",
"archive_url": "https://api.github.com/repos/octocat/Hello-World/{archive_format}{/ref}",
"assignees_url": "https://api.github.com/repos/octocat/Hello-World/assignees{/user}",
"blobs_url": "https://api.github.com/repos/octocat/Hello-World/git/blobs{/sha}",
"branches_url": "https://api.github.com/repos/octocat/Hello-World/branches{/branch}",
"collaborators_url": "https://api.github.com/repos/octocat/Hello-World/collaborators{/collaborator}",
"comments_url": "https://api.github.com/repos/octocat/Hello-World/comments{/number}",
"commits_url": "https://api.github.com/repos/octocat/Hello-World/commits{/sha}",
"compare_url": "https://api.github.com/repos/octocat/Hello-World/compare/{base}...{head}",
"contents_url": "https://api.github.com/repos/octocat/Hello-World/contents/{+path}",
"contributors_url": "https://api.github.com/repos/octocat/Hello-World/contributors",
"deployments_url": "https://api.github.com/repos/octocat/Hello-World/deployments",
"downloads_url": "https://api.github.com/repos/octocat/Hello-World/downloads",
"events_url": "https://api.github.com/repos/octocat/Hello-World/events",
"forks_url": "https://api.github.com/repos/octocat/Hello-World/forks",
"git_commits_url": "https://api.github.com/repos/octocat/Hello-World/git/commits{/sha}",
"git_refs_url": "https://api.github.com/repos/octocat/Hello-World/git/refs{/sha}",
"git_tags_url": "https://api.github.com/repos/octocat/Hello-World/git/tags{/sha}",
"git_url": "git:github.com/octocat/Hello-World.git",
"issue_comment_url": "https://api.github.com/repos/octocat/Hello-World/issues/comments{/number}",
"issue_events_url": "https://api.github.com/repos/octocat/Hello-World/issues/events{/number}",
"issues_url": "https://api.github.com/repos/octocat/Hello-World/issues{/number}",
"keys_url": "https://api.github.com/repos/octocat/Hello-World/keys{/key_id}",
"labels_url": "https://api.github.com/repos/octocat/Hello-World/labels{/name}",
"languages_url": "https://api.github.com/repos/octocat/Hello-World/languages",
"merges_url": "https://api.github.com/repos/octocat/Hello-World/merges",
"milestones_url": "https://api.github.com/repos/octocat/Hello-World/milestones{/number}",
"notifications_url": "https://api.github.com/repos/octocat/Hello-World/notifications{?since,all,participating}",
"pulls_url": "https://api.github.com/repos/octocat/Hello-World/pulls{/number}",
"releases_url": "https://api.github.com/repos/octocat/Hello-World/releases{/id}",
"ssh_url": "git@github.com:octocat/Hello-World.git",
"stargazers_url": "https://api.github.com/repos/octocat/Hello-World/stargazers",
"statuses_url": "https://api.github.com/repos/octocat/Hello-World/statuses/{sha}",
"subscribers_url": "https://api.github.com/repos/octocat/Hello-World/subscribers",
"subscription_url": "https://api.github.com/repos/octocat/Hello-World/subscription",
"tags_url": "https://api.github.com/repos/octocat/Hello-World/tags",
"teams_url": "https://api.github.com/repos/octocat/Hello-World/teams",
"trees_url": "https://api.github.com/repos/octocat/Hello-World/git/trees{/sha}",
"clone_url": "https://github.com/octocat/Hello-World.git",
"mirror_url": "git:git.example.com/octocat/Hello-World",
"hooks_url": "https://api.github.com/repos/octocat/Hello-World/hooks",
"svn_url": "https://svn.github.com/octocat/Hello-World",
"homepage": "https://github.com",
"language": null,
"forks_count": 9,
"stargazers_count": 80,
"watchers_count": 80,
"size": 108,
"default_branch": "master",
"open_issues_count": 0,
"is_template": false,
"topics": ["octocat", "atom", "electron", "api"],
"has_issues": true,
"has_projects": true,
"has_wiki": true,
"has_pages": false,
"has_downloads": true,
"has_discussions": false,
"archived": false,
"disabled": false,
"visibility": "public",
"pushed_at": "2011-01-26T19:06:43Z",
"created_at": "2011-01-26T19:01:12Z",
"updated_at": "2011-01-26T19:14:43Z",
"permissions": {
"admin": false,
"push": false,
"pull": true,
},
"security_and_analysis": {
"advanced_security": {
"status": "enabled",
},
"secret_scanning": {
"status": "enabled",
},
"secret_scanning_push_protection": {
"status": "disabled",
},
},
};
// Force the string to be 8bit
const str = String.fromCharCode(
...JSON.stringify(obj)
.split("")
.map(a => a.charCodeAt(0)),
);
var i = 0;
bench("new Response().arrayBuffer() (new string each call, latin1)", async () => {
return await new Response(str + i++).arrayBuffer();
});
bench("new Response().arrayBuffer() (new string each call, utf16)", async () => {
return await new Response(str + i++ + "😊").arrayBuffer();
});
bench("new Response().arrayBuffer() (existing string, latin1)", async () => {
return await new Response(str).arrayBuffer();
});
await run();

View File

@@ -1,123 +0,0 @@
// This snippet mostly exists to reproduce a memory leak
import { bench, run } from "mitata";
const obj = {
"id": 1296269,
"node_id": "MDEwOlJlcG9zaXRvcnkxMjk2MjY5",
"name": "Hello-World",
"full_name": "octocat/Hello-World",
"owner": {
"login": "octocat",
"id": 1,
"node_id": "MDQ6VXNlcjE=",
"avatar_url": "https://github.com/images/error/octocat_happy.gif",
"gravatar_id": "",
"url": "https://api.github.com/users/octocat",
"html_url": "https://github.com/octocat",
"followers_url": "https://api.github.com/users/octocat/followers",
"following_url": "https://api.github.com/users/octocat/following{/other_user}",
"gists_url": "https://api.github.com/users/octocat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/octocat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/octocat/subscriptions",
"organizations_url": "https://api.github.com/users/octocat/orgs",
"repos_url": "https://api.github.com/users/octocat/repos",
"events_url": "https://api.github.com/users/octocat/events{/privacy}",
"received_events_url": "https://api.github.com/users/octocat/received_events",
"type": "User",
"site_admin": false,
},
"private": false,
"html_url": "https://github.com/octocat/Hello-World",
"description": "This your first repo!",
"fork": false,
"url": "https://api.github.com/repos/octocat/Hello-World",
"archive_url": "https://api.github.com/repos/octocat/Hello-World/{archive_format}{/ref}",
"assignees_url": "https://api.github.com/repos/octocat/Hello-World/assignees{/user}",
"blobs_url": "https://api.github.com/repos/octocat/Hello-World/git/blobs{/sha}",
"branches_url": "https://api.github.com/repos/octocat/Hello-World/branches{/branch}",
"collaborators_url": "https://api.github.com/repos/octocat/Hello-World/collaborators{/collaborator}",
"comments_url": "https://api.github.com/repos/octocat/Hello-World/comments{/number}",
"commits_url": "https://api.github.com/repos/octocat/Hello-World/commits{/sha}",
"compare_url": "https://api.github.com/repos/octocat/Hello-World/compare/{base}...{head}",
"contents_url": "https://api.github.com/repos/octocat/Hello-World/contents/{+path}",
"contributors_url": "https://api.github.com/repos/octocat/Hello-World/contributors",
"deployments_url": "https://api.github.com/repos/octocat/Hello-World/deployments",
"downloads_url": "https://api.github.com/repos/octocat/Hello-World/downloads",
"events_url": "https://api.github.com/repos/octocat/Hello-World/events",
"forks_url": "https://api.github.com/repos/octocat/Hello-World/forks",
"git_commits_url": "https://api.github.com/repos/octocat/Hello-World/git/commits{/sha}",
"git_refs_url": "https://api.github.com/repos/octocat/Hello-World/git/refs{/sha}",
"git_tags_url": "https://api.github.com/repos/octocat/Hello-World/git/tags{/sha}",
"git_url": "git:github.com/octocat/Hello-World.git",
"issue_comment_url": "https://api.github.com/repos/octocat/Hello-World/issues/comments{/number}",
"issue_events_url": "https://api.github.com/repos/octocat/Hello-World/issues/events{/number}",
"issues_url": "https://api.github.com/repos/octocat/Hello-World/issues{/number}",
"keys_url": "https://api.github.com/repos/octocat/Hello-World/keys{/key_id}",
"labels_url": "https://api.github.com/repos/octocat/Hello-World/labels{/name}",
"languages_url": "https://api.github.com/repos/octocat/Hello-World/languages",
"merges_url": "https://api.github.com/repos/octocat/Hello-World/merges",
"milestones_url": "https://api.github.com/repos/octocat/Hello-World/milestones{/number}",
"notifications_url": "https://api.github.com/repos/octocat/Hello-World/notifications{?since,all,participating}",
"pulls_url": "https://api.github.com/repos/octocat/Hello-World/pulls{/number}",
"releases_url": "https://api.github.com/repos/octocat/Hello-World/releases{/id}",
"ssh_url": "git@github.com:octocat/Hello-World.git",
"stargazers_url": "https://api.github.com/repos/octocat/Hello-World/stargazers",
"statuses_url": "https://api.github.com/repos/octocat/Hello-World/statuses/{sha}",
"subscribers_url": "https://api.github.com/repos/octocat/Hello-World/subscribers",
"subscription_url": "https://api.github.com/repos/octocat/Hello-World/subscription",
"tags_url": "https://api.github.com/repos/octocat/Hello-World/tags",
"teams_url": "https://api.github.com/repos/octocat/Hello-World/teams",
"trees_url": "https://api.github.com/repos/octocat/Hello-World/git/trees{/sha}",
"clone_url": "https://github.com/octocat/Hello-World.git",
"mirror_url": "git:git.example.com/octocat/Hello-World",
"hooks_url": "https://api.github.com/repos/octocat/Hello-World/hooks",
"svn_url": "https://svn.github.com/octocat/Hello-World",
"homepage": "https://github.com",
"language": null,
"forks_count": 9,
"stargazers_count": 80,
"watchers_count": 80,
"size": 108,
"default_branch": "master",
"open_issues_count": 0,
"is_template": false,
"topics": ["octocat", "atom", "electron", "api"],
"has_issues": true,
"has_projects": true,
"has_wiki": true,
"has_pages": false,
"has_downloads": true,
"has_discussions": false,
"archived": false,
"disabled": false,
"visibility": "public",
"pushed_at": "2011-01-26T19:06:43Z",
"created_at": "2011-01-26T19:01:12Z",
"updated_at": "2011-01-26T19:14:43Z",
"permissions": {
"admin": false,
"push": false,
"pull": true,
},
"security_and_analysis": {
"advanced_security": {
"status": "enabled",
},
"secret_scanning": {
"status": "enabled",
},
"secret_scanning_push_protection": {
"status": "disabled",
},
},
};
bench("Response.json(obj)", async () => {
return Response.json(obj);
});
bench("Response.json(obj).json()", async () => {
return await Response.json(obj).json();
});
await run();

View File

@@ -1 +0,0 @@
for (let i = 0; i < 9999999; i++) new Request("http://aaaaaaaaaaaaaaaaaaaaa");

View File

@@ -1,37 +0,0 @@
import { bench, run } from "./runner.mjs";
const blob = new Blob(["<p id='foo'>Hello</p>"]);
bench("prepend", async () => {
await new HTMLRewriter()
.on("p", {
element(element) {
element.prepend("Hello");
},
})
.transform(new Response(blob))
.text();
});
bench("append", async () => {
await new HTMLRewriter()
.on("p", {
element(element) {
element.append("Hello");
},
})
.transform(new Response(blob))
.text();
});
bench("getAttribute", async () => {
await new HTMLRewriter()
.on("p", {
element(element) {
element.getAttribute("id");
},
})
.transform(new Response(blob))
.text();
});
await run();

View File

@@ -1,128 +0,0 @@
import { serialize, deserialize } from "node:v8";
import { bench, run } from "./runner.mjs";
const obj = {
"id": 1296269,
"node_id": "MDEwOlJlcG9zaXRvcnkxMjk2MjY5",
"name": "Hello-World",
"full_name": "octocat/Hello-World",
"owner": {
"login": "octocat",
"id": 1,
"node_id": "MDQ6VXNlcjE=",
"avatar_url": "https://github.com/images/error/octocat_happy.gif",
"gravatar_id": "",
"url": "https://api.github.com/users/octocat",
"html_url": "https://github.com/octocat",
"followers_url": "https://api.github.com/users/octocat/followers",
"following_url": "https://api.github.com/users/octocat/following{/other_user}",
"gists_url": "https://api.github.com/users/octocat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/octocat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/octocat/subscriptions",
"organizations_url": "https://api.github.com/users/octocat/orgs",
"repos_url": "https://api.github.com/users/octocat/repos",
"events_url": "https://api.github.com/users/octocat/events{/privacy}",
"received_events_url": "https://api.github.com/users/octocat/received_events",
"type": "User",
"site_admin": false,
},
"private": false,
"html_url": "https://github.com/octocat/Hello-World",
"description": "This your first repo!",
"fork": false,
"url": "https://api.github.com/repos/octocat/Hello-World",
"archive_url": "https://api.github.com/repos/octocat/Hello-World/{archive_format}{/ref}",
"assignees_url": "https://api.github.com/repos/octocat/Hello-World/assignees{/user}",
"blobs_url": "https://api.github.com/repos/octocat/Hello-World/git/blobs{/sha}",
"branches_url": "https://api.github.com/repos/octocat/Hello-World/branches{/branch}",
"collaborators_url": "https://api.github.com/repos/octocat/Hello-World/collaborators{/collaborator}",
"comments_url": "https://api.github.com/repos/octocat/Hello-World/comments{/number}",
"commits_url": "https://api.github.com/repos/octocat/Hello-World/commits{/sha}",
"compare_url": "https://api.github.com/repos/octocat/Hello-World/compare/{base}...{head}",
"contents_url": "https://api.github.com/repos/octocat/Hello-World/contents/{+path}",
"contributors_url": "https://api.github.com/repos/octocat/Hello-World/contributors",
"deployments_url": "https://api.github.com/repos/octocat/Hello-World/deployments",
"downloads_url": "https://api.github.com/repos/octocat/Hello-World/downloads",
"events_url": "https://api.github.com/repos/octocat/Hello-World/events",
"forks_url": "https://api.github.com/repos/octocat/Hello-World/forks",
"git_commits_url": "https://api.github.com/repos/octocat/Hello-World/git/commits{/sha}",
"git_refs_url": "https://api.github.com/repos/octocat/Hello-World/git/refs{/sha}",
"git_tags_url": "https://api.github.com/repos/octocat/Hello-World/git/tags{/sha}",
"git_url": "git:github.com/octocat/Hello-World.git",
"issue_comment_url": "https://api.github.com/repos/octocat/Hello-World/issues/comments{/number}",
"issue_events_url": "https://api.github.com/repos/octocat/Hello-World/issues/events{/number}",
"issues_url": "https://api.github.com/repos/octocat/Hello-World/issues{/number}",
"keys_url": "https://api.github.com/repos/octocat/Hello-World/keys{/key_id}",
"labels_url": "https://api.github.com/repos/octocat/Hello-World/labels{/name}",
"languages_url": "https://api.github.com/repos/octocat/Hello-World/languages",
"merges_url": "https://api.github.com/repos/octocat/Hello-World/merges",
"milestones_url": "https://api.github.com/repos/octocat/Hello-World/milestones{/number}",
"notifications_url": "https://api.github.com/repos/octocat/Hello-World/notifications{?since,all,participating}",
"pulls_url": "https://api.github.com/repos/octocat/Hello-World/pulls{/number}",
"releases_url": "https://api.github.com/repos/octocat/Hello-World/releases{/id}",
"ssh_url": "git@github.com:octocat/Hello-World.git",
"stargazers_url": "https://api.github.com/repos/octocat/Hello-World/stargazers",
"statuses_url": "https://api.github.com/repos/octocat/Hello-World/statuses/{sha}",
"subscribers_url": "https://api.github.com/repos/octocat/Hello-World/subscribers",
"subscription_url": "https://api.github.com/repos/octocat/Hello-World/subscription",
"tags_url": "https://api.github.com/repos/octocat/Hello-World/tags",
"teams_url": "https://api.github.com/repos/octocat/Hello-World/teams",
"trees_url": "https://api.github.com/repos/octocat/Hello-World/git/trees{/sha}",
"clone_url": "https://github.com/octocat/Hello-World.git",
"mirror_url": "git:git.example.com/octocat/Hello-World",
"hooks_url": "https://api.github.com/repos/octocat/Hello-World/hooks",
"svn_url": "https://svn.github.com/octocat/Hello-World",
"homepage": "https://github.com",
"language": null,
"forks_count": 9,
"stargazers_count": 80,
"watchers_count": 80,
"size": 108,
"default_branch": "master",
"open_issues_count": 0,
"is_template": false,
"topics": ["octocat", "atom", "electron", "api"],
"has_issues": true,
"has_projects": true,
"has_wiki": true,
"has_pages": false,
"has_downloads": true,
"has_discussions": false,
"archived": false,
"disabled": false,
"visibility": "public",
"pushed_at": "2011-01-26T19:06:43Z",
"created_at": "2011-01-26T19:01:12Z",
"updated_at": "2011-01-26T19:14:43Z",
"permissions": {
"admin": false,
"push": false,
"pull": true,
},
"security_and_analysis": {
"advanced_security": {
"status": "enabled",
},
"secret_scanning": {
"status": "enabled",
},
"secret_scanning_push_protection": {
"status": "disabled",
},
},
};
bench("serialize", () => {
serialize(obj);
});
const serialized = serialize(obj);
bench("deserialize", () => {
deserialize(serialized);
});
if (typeof Bun !== "undefined") {
if (!Bun.deepEquals(obj, deserialize(serialized))) {
throw new Error("not equal");
}
}
await run();

View File

@@ -1,39 +0,0 @@
var testArray = [
{
description: "Random description.",
testNumber: 123456789,
testBoolean: true,
testObject: {
testString: "test string",
testNumber: 12345,
},
testArray: [
{
myName: "test name",
myNumber: 123245,
},
],
},
{
description: "Random description.",
testNumber: 123456789,
testBoolean: true,
testObject: {
testString: "test string",
testNumber: 12345,
},
testArray: [
{
myName: "test name",
myNumber: 123245,
},
],
},
];
import { bench, run } from "./runner.mjs";
bench("structuredClone(array)", () => structuredClone(testArray));
bench("structuredClone(123)", () => structuredClone(123));
bench("structuredClone({a: 123})", () => structuredClone({ a: 123 }));
await run();

View File

@@ -1,20 +0,0 @@
import { group } from "mitata";
import { bench, run } from "./runner.mjs";
const sizes = [
["small (63 bytes)", 63],
["medium (4096 bytes)", 4096],
["large (64 MB)", 64 * 1024 * 1024],
];
for (let [name, size] of sizes) {
group(name, () => {
var buf = new Uint8Array(size);
for (let algorithm of ["SHA-1", "SHA-256", "SHA-384", "SHA-512"]) {
bench(algorithm, async () => {
await crypto.subtle.digest(algorithm, buf);
});
}
});
}
await run();

Binary file not shown.

View File

@@ -1,7 +1,7 @@
{
"name": "bench",
"dependencies": {
"better-sqlite3": "8.5.0"
"better-sqlite3": "^8.0.1"
},
"scripts": {
"build": "exit 0",

View File

@@ -2,7 +2,7 @@
This benchmarks a websocket server intended as a simple but very active chat room.
First, start the server. By default, it will wait for 32 clients which the client script will handle.
First, start the server. By default, it will wait for 16 clients which the client script will handle.
Run in Bun (`Bun.serve`):
@@ -19,10 +19,10 @@ node ./chat-server.node.mjs
Run in Deno (`Deno.serve`):
```bash
deno run -A ./chat-server.deno.mjs
deno run -A --unstable ./chat-server.deno.mjs
```
Then, run the client script. By default, it will connect 32 clients. This client script can run in Bun, Node, or Deno
Then, run the client script. By default, it will connect 16 clients. This client script can run in Bun, Node, or Deno
```bash
node ./chat-client.mjs
@@ -34,4 +34,6 @@ For example, when the client sends `"foo"`, the server sends back `"John: foo"`
The client script waits until it receives all the messages for each client before sending the next batch of messages.
TODO: once Deno lands their performance improvements, increase the client count (it was originally going to be 32 or 64, but that would've excluded Deno from the benchmark)
This project was created using `bun init` in bun v0.2.1. [Bun](https://bun.sh) is a fast all-in-one JavaScript runtime.

Binary file not shown.

View File

@@ -3,7 +3,7 @@ const env = "process" in globalThis ? process.env : "Deno" in globalThis ? Deno.
const SERVER = env.SERVER || "ws://0.0.0.0:4001";
const WebSocket = globalThis.WebSocket || (await import("ws")).WebSocket;
const LOG_MESSAGES = env.LOG_MESSAGES === "1";
const CLIENTS_TO_WAIT_FOR = parseInt(env.CLIENTS_COUNT || "", 10) || 32;
const CLIENTS_TO_WAIT_FOR = parseInt(env.CLIENTS_COUNT || "", 10) || 16;
const DELAY = 64;
const MESSAGES_TO_SEND = Array.from({ length: 32 }, () => [
"Hello World!",

View File

@@ -1,5 +1,5 @@
// See ./README.md for instructions on how to run this benchmark.
const CLIENTS_TO_WAIT_FOR = parseInt(process.env.CLIENTS_COUNT || "", 10) || 32;
const CLIENTS_TO_WAIT_FOR = parseInt(process.env.CLIENTS_COUNT || "", 10) || 16;
var remainingClients = CLIENTS_TO_WAIT_FOR;
const COMPRESS = process.env.COMPRESS === "1";
const port = process.PORT || 4001;

View File

@@ -1,6 +1,6 @@
// See ./README.md for instructions on how to run this benchmark.
const port = Deno.env.get("PORT") || 4001;
const CLIENTS_TO_WAIT_FOR = parseInt(Deno.env.get("CLIENTS_COUNT") || "", 10) || 32;
const CLIENTS_TO_WAIT_FOR = parseInt(Deno.env.get("CLIENTS_COUNT") || "", 10) || 16;
var clients = [];
async function reqHandler(req) {

View File

@@ -1,6 +1,6 @@
// See ./README.md for instructions on how to run this benchmark.
const port = process.env.PORT || 4001;
const CLIENTS_TO_WAIT_FOR = parseInt(process.env.CLIENTS_COUNT || "", 10) || 32;
const CLIENTS_TO_WAIT_FOR = parseInt(process.env.CLIENTS_COUNT || "", 10) || 16;
import { createRequire } from "module";
const require = createRequire(import.meta.url);

View File

@@ -3,8 +3,8 @@
"module": "index.ts",
"type": "module",
"dependencies": {
"bufferutil": "4.0.7",
"utf-8-validate": "6.0.3",
"ws": "8.13.0"
"bufferutil": "^4.0.7",
"utf-8-validate": "^5.0.10",
"ws": "^8.9.0"
}
}

183
build.zig
View File

@@ -1,7 +1,6 @@
const std = @import("std");
const pathRel = std.fs.path.relative;
const Wyhash = @import("./src/wyhash.zig").Wyhash;
var is_debug_build = false;
fn moduleSource(comptime out: []const u8) FileSource {
if (comptime std.fs.path.dirname(@src().file)) |base| {
const outpath = comptime base ++ std.fs.path.sep_str ++ out;
@@ -10,6 +9,27 @@ fn moduleSource(comptime out: []const u8) FileSource {
return FileSource.relative(out);
}
}
pub fn addPicoHTTP(step: *CompileStep, comptime with_obj: bool) void {
step.addIncludePath("src/deps");
if (with_obj) {
step.addObjectFile("src/deps/picohttpparser.o");
}
step.addIncludePath("src/deps");
if (with_obj) {
step.addObjectFile(panicIfNotFound("src/deps/picohttpparser.o"));
step.addObjectFile(panicIfNotFound("src/deps/libssl.a"));
step.addObjectFile(panicIfNotFound("src/deps/libcrypto.a"));
}
// step.add("/Users/jarred/Code/WebKit/WebKitBuild/Release/lib/libWTF.a");
// ./Tools/Scripts/build-jsc --jsc-only --cmakeargs="-DENABLE_STATIC_JSC=ON"
// set -gx ICU_INCLUDE_DIRS "/usr/local/opt/icu4c/include"
// homebrew-provided icu4c
}
const color_map = std.ComptimeStringMap([]const u8, .{
&.{ "black", "30m" },
@@ -34,10 +54,6 @@ fn addInternalPackages(b: *Build, step: *CompileStep, _: std.mem.Allocator, _: [
break :brk b.createModule(.{
.source_file = FileSource.relative("src/io/io_linux.zig"),
});
} else if (target.isWindows()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/io/io_windows.zig"),
});
}
break :brk b.createModule(.{
@@ -60,31 +76,18 @@ const BunBuildOptions = struct {
fallback_html_version: u64 = 0,
pub fn updateRuntime(this: *BunBuildOptions) anyerror!void {
if (std.fs.cwd().openFile("src/runtime.out.js", .{ .mode = .read_only })) |file| {
defer file.close();
const runtime_hash = Wyhash.hash(
0,
try file.readToEndAlloc(std.heap.page_allocator, try file.getEndPos()),
);
this.runtime_js_version = runtime_hash;
} else |_| {
if (!is_debug_build) {
@panic("Runtime file was not read successfully. Please run `make setup`");
}
}
if (std.fs.cwd().openFile("src/fallback.out.js", .{ .mode = .read_only })) |file| {
defer file.close();
const fallback_hash = Wyhash.hash(
0,
try file.readToEndAlloc(std.heap.page_allocator, try file.getEndPos()),
);
this.fallback_html_version = fallback_hash;
} else |_| {
if (!is_debug_build) {
@panic("Fallback file was not read successfully. Please run `make setup`");
}
}
var runtime_out_file = try std.fs.cwd().openFile("src/runtime.out.js", .{ .mode = .read_only });
const runtime_hash = Wyhash.hash(
0,
try runtime_out_file.readToEndAlloc(std.heap.page_allocator, try runtime_out_file.getEndPos()),
);
this.runtime_js_version = runtime_hash;
var fallback_out_file = try std.fs.cwd().openFile("src/fallback.out.js", .{ .mode = .read_only });
const fallback_hash = Wyhash.hash(
0,
try fallback_out_file.readToEndAlloc(std.heap.page_allocator, try fallback_out_file.getEndPos()),
);
this.fallback_html_version = fallback_hash;
}
pub fn step(this: BunBuildOptions, b: anytype) *std.build.OptionsStep {
@@ -101,7 +104,6 @@ const BunBuildOptions = struct {
}
};
// relative to the prefix
var output_dir: []const u8 = "";
fn panicIfNotFound(comptime filepath: []const u8) []const u8 {
var file = std.fs.cwd().openFile(filepath, .{ .optimize = .read_only }) catch |err| {
@@ -136,16 +138,6 @@ const Module = std.build.Module;
const fs = std.fs;
pub fn build(b: *Build) !void {
build_(b) catch |err| {
if (@errorReturnTrace()) |trace| {
std.debug.dumpStackTrace(trace.*);
}
return err;
};
}
pub fn build_(b: *Build) !void {
// Standard target options allows the person running `zig build` to choose
// what target to build for. Here we do not override the defaults, which
// means any target is allowed, and the default is native. Other options
@@ -188,25 +180,25 @@ pub fn build_(b: *Build) !void {
var triplet = triplet_buf[0 .. osname.len + cpuArchName.len + 1];
if (b.option([]const u8, "output-dir", "target to install to") orelse std.os.getenv("OUTPUT_DIR")) |output_dir_| {
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_);
output_dir = b.pathFromRoot(output_dir_);
} else {
const output_dir_base = try std.fmt.bufPrint(&output_dir_buf, "{s}{s}", .{ bin_label, triplet });
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_base);
output_dir = b.pathFromRoot(output_dir_base);
}
is_debug_build = optimize == OptimizeMode.Debug;
std.fs.cwd().makePath(output_dir) catch {};
const bun_executable_name = if (optimize == std.builtin.OptimizeMode.Debug) "bun-debug" else "bun";
const root_src = if (target.getOsTag() == std.Target.Os.Tag.freestanding)
"root_wasm.zig"
"src/main_wasm.zig"
else
"root.zig";
const min_version: std.SemanticVersion = if (!(target.isWindows() or target.getOsTag() == .freestanding))
const min_version: std.SemanticVersion = if (target.getOsTag() != .freestanding)
target.getOsVersionMin().semver
else
.{ .major = 0, .minor = 0, .patch = 0 };
const max_version: std.SemanticVersion = if (!(target.isWindows() or target.getOsTag() == .freestanding))
const max_version: std.SemanticVersion = if (target.getOsTag() != .freestanding)
target.getOsVersionMax().semver
else
.{ .major = 0, .minor = 0, .patch = 0 };
@@ -217,11 +209,8 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative(root_src),
.target = target,
.optimize = optimize,
.main_pkg_path = .{ .cwd_relative = b.pathFromRoot(".") },
});
b.reference_trace = 16;
var default_build_options: BunBuildOptions = brk: {
const is_baseline = arch.isX86() and (target.cpu_model == .baseline or
!std.Target.x86.featureSetHas(target.getCpuFeatures(), .avx2));
@@ -236,6 +225,7 @@ pub fn build_(b: *Build) !void {
.argv = &.{
"git",
"rev-parse",
"--short",
"HEAD",
},
.cwd = b.pathFromRoot("."),
@@ -257,6 +247,9 @@ pub fn build_(b: *Build) !void {
};
{
addPicoHTTP(obj, false);
obj.setMainPkgPath(b.pathFromRoot("."));
try addInternalPackages(
b,
obj,
@@ -287,15 +280,9 @@ pub fn build_(b: *Build) !void {
std.io.getStdErr().writer().print("Output: {s}/{s}\n\n", .{ output_dir, bun_executable_name }) catch unreachable;
defer obj_step.dependOn(&obj.step);
var install = b.addInstallFileWithDir(
obj.getEmittedBin(),
.{ .custom = output_dir },
b.fmt("{s}.o", .{bun_executable_name}),
);
install.step.dependOn(&obj.step);
obj_step.dependOn(&install.step);
obj.emit_bin = .{
.emit_to = b.fmt("{s}/{s}.o", .{ output_dir, bun_executable_name }),
};
var actual_build_options = default_build_options;
if (b.option(bool, "generate-sizes", "Generate sizes of things") orelse false) {
actual_build_options.sizegen = true;
@@ -312,8 +299,7 @@ pub fn build_(b: *Build) !void {
if (target.getCpuArch().isX86()) obj.disable_stack_probing = true;
if (b.option(bool, "for-editor", "Do not emit bin, just check for errors") orelse false) {
// obj.emit_bin = .no_emit;
obj.generated_bin = null;
obj.emit_bin = .no_emit;
}
if (target.getOsTag() == .linux) {
@@ -331,10 +317,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bindgen.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
var headers_build_options = default_build_options;
headers_build_options.bindgen = true;
headers_obj.addOptions("build_options", default_build_options.step(b));
@@ -342,22 +327,19 @@ pub fn build_(b: *Build) !void {
}
{
const wasm_step = b.step("bun-wasm", "Build WASM");
var wasm = b.addStaticLibrary(.{
const wasm = b.step("bun-wasm", "Build WASM");
var wasm_step = b.addStaticLibrary(.{
.name = "bun-wasm",
.root_source_file = FileSource.relative("root_wasm.zig"),
.root_source_file = FileSource.relative("src/main_wasm.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer wasm_step.dependOn(&wasm.step);
wasm.strip = false;
defer wasm.dependOn(&wasm_step.step);
wasm_step.strip = false;
// wasm_step.link_function_sections = true;
// wasm_step.link_emit_relocs = true;
// wasm_step.single_threaded = true;
try configureObjectStep(b, wasm, wasm_step, @TypeOf(target), target);
var build_opts = default_build_options;
wasm.addOptions("build_options", build_opts.step(b));
try configureObjectStep(b, wasm_step, @TypeOf(target), target, obj.main_pkg_path.?);
}
{
@@ -367,10 +349,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/http_bench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -381,10 +362,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/machbench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -395,10 +375,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/fetch.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -409,10 +388,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bench/string-handling.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -423,10 +401,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sha.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -437,10 +414,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sourcemap/vlq_bench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -451,10 +427,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/tgz.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -468,23 +443,16 @@ pub fn build_(b: *Build) !void {
var headers_obj: *CompileStep = b.addTest(.{
.root_source_file = FileSource.relative(test_file orelse "src/main.zig"),
.target = target,
.main_pkg_path = obj.main_pkg_path,
});
headers_obj.filter = test_filter;
if (test_bin_) |test_bin| {
headers_obj.name = std.fs.path.basename(test_bin);
if (std.fs.path.dirname(test_bin)) |dir| {
var install = b.addInstallFileWithDir(
headers_obj.getEmittedBin(),
.{ .custom = try std.fs.path.relative(b.allocator, output_dir, dir) },
headers_obj.name,
);
install.step.dependOn(&headers_obj.step);
headers_step.dependOn(&install.step);
}
if (std.fs.path.dirname(test_bin)) |dir| headers_obj.emit_bin = .{
.emit_to = b.fmt("{s}/{s}", .{ dir, headers_obj.name }),
};
}
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_step.dependOn(&headers_obj.step);
headers_obj.addOptions("build_options", default_build_options.step(b));
@@ -495,24 +463,23 @@ pub fn build_(b: *Build) !void {
pub var original_make_fn: ?*const fn (step: *std.build.Step) anyerror!void = null;
pub fn configureObjectStep(b: *std.build.Builder, obj: *CompileStep, obj_step: *std.build.Step, comptime Target: type, target: Target) !void {
pub fn configureObjectStep(b: *std.build.Builder, obj: *CompileStep, comptime Target: type, target: Target, main_pkg_path: []const u8) !void {
obj.setMainPkgPath(main_pkg_path);
// obj.setTarget(target);
try addInternalPackages(b, obj, std.heap.page_allocator, b.zig_exe, target);
if (target.getOsTag() != .freestanding)
addPicoHTTP(obj, false);
obj.strip = false;
// obj.setBuildMode(optimize);
obj.bundle_compiler_rt = false;
if (obj.emit_directory == null) {
var install = b.addInstallFileWithDir(
obj.getEmittedBin(),
.{ .custom = output_dir },
b.fmt("{s}.o", .{obj.name}),
);
if (obj.emit_bin == .default)
obj.emit_bin = .{
.emit_to = b.fmt("{s}/{s}.o", .{ output_dir, obj.name }),
};
install.step.dependOn(&obj.step);
obj_step.dependOn(&install.step);
}
if (target.getOsTag() != .freestanding) obj.linkLibC();
if (target.getOsTag() != .freestanding) obj.bundle_compiler_rt = false;

BIN
bun.lockb

Binary file not shown.

View File

@@ -49,7 +49,7 @@ _bun() {
'--production[Don'"'"'t install devDependencies]' \
'--frozen-lockfile[Disallow changes to lockfile]' \
'--optional[Add dependency to optionalDependencies]' \
'--dev[Add dependency to devDependencies]' \
'--development[Add dependency to devDependencies]' \
'-d[Add dependency to devDependencies]' \
'-p[Don'"'"'t install devDependencies]' \
'--no-save[]' \
@@ -91,7 +91,7 @@ _bun() {
'--production[Don'"'"'t install devDependencies]' \
'--frozen-lockfile[Disallow changes to lockfile]' \
'--optional[Add dependency to optionalDependencies]' \
'--dev[Add dependency to devDependencies]' \
'--development[Add dependency to devDependencies]' \
'-d[Add dependency to devDependencies]' \
'-p[Don'"'"'t install devDependencies]' \
'--no-save[]' \
@@ -127,7 +127,7 @@ _bun() {
'--production[Don'"'"'t install devDependencies]' \
'--frozen-lockfile[Disallow changes to lockfile]' \
'--optional[Add dependency to optionalDependencies]' \
'--dev[Add dependency to devDependencies]' \
'--development[Add dependency to devDependencies]' \
'-d[Add dependency to devDependencies]' \
'-p[Don'"'"'t install devDependencies]' \
'--no-save[]' \

View File

@@ -1,5 +1,4 @@
FROM debian:bullseye-slim AS build
# Not officially supported (yet)
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest

View File

@@ -927,7 +927,7 @@ Buffer.from(Bun.readableStreamToArrayBuffer(stream));
new Response(stream).text();
// with Bun function
await Bun.readableStreamToText(stream);
await Bun.readableStreamToString(stream);
```
#### To `number[]`

View File

@@ -1,8 +1,8 @@
{% callout %}
<!-- **Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. Existing Node.js projects may use Bun's [nearly complete](/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module. -->
<!-- **Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. Existing Node.js projects may use Bun's [nearly complete](/docs/runtime/nodejs-apis#node_fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module. -->
**Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. For operations that are not yet available with `Bun.file`, such as `mkdir`, you can use Bun's [nearly complete](/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module.
**Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. For operations that are not yet available with `Bun.file`, such as `mkdir`, you can use Bun's [nearly complete](/docs/runtime/nodejs-apis#node_fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module.
{% /callout %}
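As a quick illustration of the pattern the callout describes (file names here are hypothetical), a copy can be expressed entirely with `Bun.file` and `Bun.write`:
```ts
// reference the source lazily; no bytes are read until needed
const input = Bun.file("input.txt");

// Bun.write accepts a BunFile (among other inputs) and resolves to the number of bytes written
const bytes = await Bun.write("output.txt", input);
console.log(`copied ${bytes} bytes`);
```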
@@ -285,13 +285,7 @@ interface Bun {
write(
destination: string | number | BunFile | URL,
input:
| string
| Blob
| ArrayBuffer
| SharedArrayBuffer
| TypedArray
| Response,
input: string | Blob | ArrayBuffer | SharedArrayBuffer | TypedArray | Response,
): Promise<number>;
}
@@ -307,9 +301,7 @@ interface BunFile {
}
export interface FileSink {
write(
chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer,
): number;
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
flush(): number | Promise<number>;
end(error?: Error): number | Promise<number>;
start(options?: { highWaterMark?: number }): void;

View File

@@ -34,7 +34,7 @@ Bun implements the following globals.
- [`Buffer`](https://nodejs.org/api/buffer.html#class-buffer)
- Node.js
- See [Node.js > `Buffer`](/docs/runtime/nodejs-apis#node-buffer)
- See [Node.js > `Buffer`](/docs/runtime/nodejs-apis#node_buffer)
---
@@ -172,7 +172,7 @@ Bun implements the following globals.
- [`global`](https://nodejs.org/api/globals.html#global)
- Node.js
- See [Node.js > `global`](/docs/runtime/nodejs-apis#global).
- See [Node.js > `global`](/docs/runtime/nodejs-apis#node_global).
---
@@ -220,7 +220,7 @@ Bun implements the following globals.
- [`process`](https://nodejs.org/api/process.html)
- Node.js
- See [Node.js > `process`](/docs/runtime/nodejs-apis#node-process)
- See [Node.js > `process`](/docs/runtime/nodejs-apis#node_process)
---

View File

@@ -77,7 +77,7 @@ The standard `Bun.hash` function uses [Wyhash](https://github.com/wangyi-fudan/
```ts
Bun.hash("some data here");
// 11562320457524636935n
// 976213160445840
```
The input can be a string, `TypedArray`, `DataView`, `ArrayBuffer`, or `SharedArrayBuffer`.
@@ -91,14 +91,14 @@ Bun.hash(arr.buffer);
Bun.hash(new DataView(arr.buffer));
```
Optionally, an integer seed can be specified as the second parameter. For 64-bit hashes seeds above `Number.MAX_SAFE_INTEGER` should be given as BigInt to avoid loss of precision.
Optionally, an integer seed can be specified as the second parameter.
```ts
Bun.hash("some data here", 1234);
// 15724820720172937558n
// 1173484059023252
```
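The note above about large seeds implies a BigInt literal for anything past `Number.MAX_SAFE_INTEGER`; a minimal sketch of what that might look like:
```ts
// seeds larger than Number.MAX_SAFE_INTEGER lose precision as a number,
// so pass them as a BigInt instead
Bun.hash("some data here", 2n ** 60n);
```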
Additional hashing algorithms are available as properties on `Bun.hash`. The API is the same for each, only changing the return type from number for 32-bit hashes to bigint for 64-bit hashes.
Additional hashing algorithms are available as properties on `Bun.hash`. The API is the same for each.
```ts
Bun.hash.wyhash("data", 1234); // equivalent to Bun.hash()
@@ -107,7 +107,6 @@ Bun.hash.adler32("data", 1234);
Bun.hash.cityHash32("data", 1234);
Bun.hash.cityHash64("data", 1234);
Bun.hash.murmur32v3("data", 1234);
Bun.hash.murmur32v2("data", 1234);
Bun.hash.murmur64v2("data", 1234);
```

View File

@@ -35,7 +35,7 @@ To configure which port and hostname the server will listen on:
```ts
Bun.serve({
port: 8080, // defaults to $BUN_PORT, $PORT, $NODE_PORT otherwise 3000
port: 8080, // defaults to $PORT, then 3000
hostname: "mydomain.com", // defaults to "0.0.0.0"
fetch(req) {
return new Response(`404!`);
@@ -43,17 +43,6 @@ Bun.serve({
});
```
To listen on a [unix domain socket](https://en.wikipedia.org/wiki/Unix_domain_socket):
```ts
Bun.serve({
unix: "/tmp/my-socket.sock", // path to socket
fetch(req) {
return new Response(`404!`);
},
});
```
## Error handling
To activate development mode, set `development: true`. By default, development mode is _enabled_ unless `NODE_ENV` is `production`.
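A minimal sketch of setting it explicitly:
```ts
Bun.serve({
  development: true, // serve detailed in-browser error pages
  fetch(req) {
    return new Response("Hello!");
  },
});
```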
@@ -78,7 +67,7 @@ Bun.serve({
fetch(req) {
throw new Error("woops!");
},
error(error) {
error(error: Error) {
return new Response(`<pre>${error}\n${error.stack}</pre>`, {
headers: {
"Content-Type": "text/html",
@@ -89,7 +78,7 @@ Bun.serve({
```
{% callout %}
[Learn more about debugging in Bun](https://bun.sh/docs/runtime/debugger)
**Note** — Full debugger support is planned.
{% /callout %}
The call to `Bun.serve` returns a `Server` object. To stop the server, call the `.stop()` method.
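For example:
```ts
const server = Bun.serve({
  fetch(req) {
    return new Response("Hello!");
  },
});

// later, during shutdown
server.stop();
```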
@@ -106,37 +95,37 @@ server.stop();
## TLS
Bun supports TLS out of the box, powered by [BoringSSL](https://boringssl.googlesource.com/boringssl). Enable TLS by passing in a value for `key` and `cert`; both are required to enable TLS.
Bun supports TLS out of the box, powered by [OpenSSL](https://www.openssl.org/). Enable TLS by passing in a value for `key` and `cert`; both are required to enable TLS. If needed, supply a `passphrase` to decrypt the `keyFile`.
```ts-diff
Bun.serve({
fetch(req) {
return new Response("Hello!!!");
},
```ts
Bun.serve({
fetch(req) {
return new Response("Hello!!!");
},
+ tls: {
+ key: Bun.file("./key.pem"),
+ cert: Bun.file("./cert.pem"),
+ }
});
// can be string, BunFile, TypedArray, Buffer, or array thereof
key: Bun.file("./key.pem"),
cert: Bun.file("./cert.pem"),
// passphrase, only required if key is encrypted
passphrase: "super-secret",
});
```
The `key` and `cert` fields expect the _contents_ of your TLS key and certificate, _not a path to it_. This can be a string, `BunFile`, `TypedArray`, or `Buffer`.
The `key` and `cert` fields expect the _contents_ of your TLS key and certificate. This can be a string, `BunFile`, `TypedArray`, or `Buffer`.
```ts
Bun.serve({
fetch() {},
tls: {
// BunFile
key: Bun.file("./key.pem"),
// Buffer
key: fs.readFileSync("./key.pem"),
// string
key: fs.readFileSync("./key.pem", "utf8"),
// array of above
key: [Bun.file("./key1.pem"), Bun.file("./key2.pem")],
},
// BunFile
key: Bun.file("./key.pem"),
// Buffer
key: fs.readFileSync("./key.pem"),
// string
key: fs.readFileSync("./key.pem", "utf8"),
// array of above
key: [Bun.file('./key1.pem'), Bun.file('./key2.pem')],
});
```
@@ -146,35 +135,17 @@ Bun.serve({
{% /callout %}
If your private key is encrypted with a passphrase, provide a value for `passphrase` to decrypt it.
```ts-diff
Bun.serve({
fetch(req) {
return new Response("Hello!!!");
},
tls: {
key: Bun.file("./key.pem"),
cert: Bun.file("./cert.pem"),
+ passphrase: "my-secret-passphrase",
}
});
```
Optionally, you can override the trusted CA certificates by passing a value for `ca`. By default, the server will trust the list of well-known CAs curated by Mozilla. When `ca` is specified, the Mozilla list is overwritten.
```ts-diff
Bun.serve({
fetch(req) {
return new Response("Hello!!!");
},
tls: {
key: Bun.file("./key.pem"), // path to TLS key
cert: Bun.file("./cert.pem"), // path to TLS cert
+ ca: Bun.file("./ca.pem"), // path to root CA certificate
}
});
```ts
Bun.serve({
fetch(req) {
return new Response("Hello!!!");
},
key: Bun.file("./key.pem"), // path to TLS key
cert: Bun.file("./cert.pem"), // path to TLS cert
ca: Bun.file("./ca.pem"), // path to root CA certificate
});
```
To override Diffie-Hellman parameters:
@@ -182,14 +153,11 @@ To override Diffie-Helman parameters:
```ts
Bun.serve({
// ...
tls: {
// other config
dhParamsFile: "/path/to/dhparams.pem", // path to Diffie Helman parameters
},
dhParamsFile: "./dhparams.pem", // path to Diffie Helman parameters
});
```
## Object syntax
## Hot reloading
Thus far, the examples on this page have used the explicit `Bun.serve` API. Bun also supports an alternate syntax.
@@ -203,15 +171,15 @@ export default {
} satisfies Serve;
```
Instead of passing the server options into `Bun.serve`, `export default` it. This file can be executed as-is; when Bun sees a file with a `default` export containing a `fetch` handler, it passes it into `Bun.serve` under the hood.
Instead of passing the server options into `Bun.serve`, export it. This file can be executed as-is; when Bun runs a file with a `default` export containing a `fetch` handler, it passes it into `Bun.serve` under the hood.
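A minimal sketch of this object syntax (the file name is illustrative):
```ts#server.ts
export default {
  port: 3000,
  fetch(req: Request) {
    return new Response("Hello from the default export!");
  },
};
```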
<!-- This syntax has one major advantage: it is hot-reloadable out of the box. When any source file is changed, Bun will reload the server with the updated code _without restarting the process_. This makes hot reloads nearly instantaneous. Use the `--hot` flag when starting the server to enable hot reloading. -->
This syntax has one major advantage: it is hot-reloadable out of the box. When any source file is changed, Bun will reload the server with the updated code _without restarting the process_. This makes hot reloads nearly instantaneous. Use the `--hot` flag when starting the server to enable hot reloading.
<!-- ```bash
```bash
$ bun --hot server.ts
``` -->
```
<!-- It's possible to configure hot reloading while using the explicit `Bun.serve` API; for details refer to [Runtime > Hot reloading](/docs/runtime/hot). -->
It's possible to configure hot reloading while using the explicit `Bun.serve` API; for details refer to [Runtime > Hot reloading](/docs/runtime/hot).
## Streaming files
@@ -306,21 +274,11 @@ interface Bun {
port?: number;
development?: boolean;
error?: (error: Error) => Response | Promise<Response>;
tls?: {
key?:
| string
| TypedArray
| BunFile
| Array<string | TypedArray | BunFile>;
cert?:
| string
| TypedArray
| BunFile
| Array<string | TypedArray | BunFile>;
ca?: string | TypedArray | BunFile | Array<string | TypedArray | BunFile>;
passphrase?: string;
dhParamsFile?: string;
};
keyFile?: string;
certFile?: string;
caFile?: string;
dhParamsFile?: string;
passphrase?: string;
maxRequestBodySize?: number;
lowMemoryMode?: boolean;
}): Server;

View File

@@ -19,17 +19,17 @@ import.meta.resolveSync("zod")
---
- `import.meta.dir`
- Absolute path to the directory containing the current file, e.g. `/path/to/project`. Equivalent to `__dirname` in CommonJS modules (and Node.js)
- Absolute path to the directory containing the current fil, e.g. `/path/to/project`. Equivalent to `__dirname` in Node.js.
---
- `import.meta.file`
- The name of the current file, e.g. `index.tsx`
- The name of the current file, e.g. `index.tsx`. Equivalent to `__filename` in Node.js.
---
- `import.meta.path`
- Absolute path to the current file, e.g. `/path/to/project/index.tsx`. Equivalent to `__filename` in CommonJS modules (and Node.js)
- Absolute path to the current file, e.g. `/path/to/project/index.tsx`.
---

View File

@@ -12,7 +12,7 @@ The second argument to `Bun.spawn` is a parameters object that can be used to co
```ts
const proc = Bun.spawn(["echo", "hello"], {
cwd: "./path/to/subdir", // specify a working directory
cwd: "./path/to/subdir", // specify a working direcory
env: { ...process.env, FOO: "bar" }, // specify environment variables
onExit(proc, exitCode, signalCode, error) {
// exit handler

View File

@@ -75,7 +75,7 @@ Note: `close()` is called automatically when the database is garbage collected.
```ts
const olddb = new Database("mydb.sqlite");
const contents = olddb.serialize(); // => Uint8Array
const newdb = Database.deserialize(contents);
const newdb = new Database(contents);
```
Internally, `.serialize()` calls [`sqlite3_serialize`](https://www.sqlite.org/c3ref/serialize.html).
@@ -250,7 +250,7 @@ Transactions are a mechanism for executing multiple queries in an _atomic_ way;
```ts
const insertCat = db.prepare("INSERT INTO cats (name) VALUES ($name)");
const insertCats = db.transaction(cats => {
const insertCats = db.transaction((cats) => {
for (const cat of cats) insertCat.run(cat);
});
```
@@ -261,7 +261,7 @@ To execute the transaction, call this function. All arguments will be passed thr
```ts
const insert = db.prepare("INSERT INTO cats (name) VALUES ($name)");
const insertCats = db.transaction(cats => {
const insertCats = db.transaction((cats) => {
for (const cat of cats) insert.run(cat);
return cats.length;
});
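For instance, the returned function might be invoked like this (a sketch; the cat names are placeholders, bound via the `$name` parameter of the prepared statement):
```ts
const count = insertCats([
  { $name: "Keanu" },
  { $name: "Salem" },
  { $name: "Crookshanks" },
]);
console.log(count); // 3
```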
@@ -296,11 +296,11 @@ const insertExpense = db.prepare(
"INSERT INTO expenses (note, dollars) VALUES (?, ?)",
);
const insert = db.prepare("INSERT INTO cats (name, age) VALUES ($name, $age)");
const insertCats = db.transaction(cats => {
const insertCats = db.transaction((cats) => {
for (const cat of cats) insert.run(cat);
});
const adopt = db.transaction(cats => {
const adopt = db.transaction((cats) => {
insertExpense.run("adoption fees", 20);
insertCats(cats); // nested transaction
});

View File

@@ -1,6 +1,6 @@
Streams are an important abstraction for working with binary data without loading it all into memory at once. They are commonly used for reading and writing files, sending and receiving network requests, and processing large amounts of data.
Bun implements the Web APIs [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) and [`WritableStream`](https://developer.mozilla.org/en-US/docs/Web/API/WritableStream).
Bun implements the Web APIs [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) and [`WritableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream).
{% callout %}
Bun also implements the `node:stream` module, including [`Readable`](https://nodejs.org/api/stream.html#stream_readable_streams), [`Writable`](https://nodejs.org/api/stream.html#stream_writable_streams), and [`Duplex`](https://nodejs.org/api/stream.html#stream_duplex_and_transform_streams). For complete documentation, refer to the [Node.js docs](https://nodejs.org/api/stream.html).
@@ -28,6 +28,8 @@ for await (const chunk of stream) {
}
```
For a more complete discussion of streams in Bun, see [API > Streams](/docs/api/streams).
## Direct `ReadableStream`
Bun implements an optimized version of `ReadableStream` that avoids unnecessary data copying & queue management logic. With a traditional `ReadableStream`, chunks of data are _enqueued_. Each chunk is copied into a queue, where it sits until the stream is ready to send more data.
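A minimal sketch of the direct form (assuming the `type: "direct"` option and `controller.write` described in [API > Streams](/docs/api/streams)):
```ts
const stream = new ReadableStream({
  type: "direct",
  pull(controller) {
    // chunks are written straight to the sink, with no intermediate queue
    controller.write("hello");
    controller.write(" world");
  },
});
```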
@@ -152,9 +154,7 @@ export class ArrayBufferSink {
stream?: boolean;
}): void;
write(
chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer,
): number;
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
/**
* Flush the internal buffer
*

View File

@@ -95,7 +95,7 @@ Bun.listen({
// string
key: fs.readFileSync("./key.pem", "utf8"),
// array of above
key: [Bun.file('./key1.pem'), Bun.file('./key2.pem')]
key: [Bun.file('./key1.pem'), Bun.file('./key2.pem']
},
});
```

View File

@@ -149,9 +149,7 @@ test("peek", () => {
// If we peek a rejected promise, it:
// - returns the error
// - does not mark the promise as handled
const rejected = Promise.reject(
new Error("Successfully tested promise rejection"),
);
const rejected = Promise.reject(new Error("Successfully tested promise rejection"));
expect(peek(rejected).message).toBe("Successfully tested promise rejection");
});
```
@@ -247,7 +245,7 @@ Bun.deepEquals(new Foo(), { a: 1 }, true); // false
## `Bun.escapeHTML()`
`Bun.escapeHTML(value: string | object | number | boolean): string`
`Bun.escapeHTML(value: string | object | number | boolean): boolean`
Escapes the following characters from an input string:
@@ -285,7 +283,7 @@ console.log(url); // "file:///foo/bar.txt"
## `Bun.gzipSync()`
Compresses a `Uint8Array` using zlib's GZIP algorithm.
Compresses a `Uint8Array` using zlib's DEFLATE algorithm.
```ts
const buf = Buffer.from("hello".repeat(100)); // Buffer extends Uint8Array
@@ -374,7 +372,7 @@ export type ZlibCompressionOptions = {
## `Bun.gunzipSync()`
Decompresses a `Uint8Array` using zlib's GUNZIP algorithm.
Uncompresses a `Uint8Array` using zlib's INFLATE algorithm.
```ts
const buf = Buffer.from("hello".repeat(100)); // Buffer extends Uint8Array
@@ -402,15 +400,15 @@ The second argument supports the same set of configuration options as [`Bun.gzip
## `Bun.inflateSync()`
Decompresses a `Uint8Array` using zlib's INFLATE algorithm.
Uncompresses a `Uint8Array` using zlib's INFLATE algorithm.
```ts
const buf = Buffer.from("hello".repeat(100));
const compressed = Bun.deflateSync(buf);
const dec = new TextDecoder();
const decompressed = Bun.inflateSync(compressed);
dec.decode(decompressed);
const uncompressed = Bun.inflateSync(compressed);
dec.decode(uncompressed);
// => "hellohellohello..."
```
@@ -428,21 +426,6 @@ const str = Bun.inspect(arr);
// => "Uint8Array(3) [ 1, 2, 3 ]"
```
## `Bun.inspect.custom`
This is the symbol that Bun uses to implement `Bun.inspect`. You can override this to customize how your objects are printed. It is identical to `util.inspect.custom` in Node.js.
```ts
class Foo {
[Bun.inspect.custom]() {
return "foo";
}
}
const foo = new Foo();
console.log(foo); // => "foo"
```
## `Bun.nanoseconds()`
Returns the number of nanoseconds since the current `bun` process started, as a `number`. Useful for high-precision timing and benchmarking.
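For example:
```ts
const start = Bun.nanoseconds();
// ... do some work ...
const elapsed = Bun.nanoseconds() - start;
console.log(`took ${elapsed} nanoseconds`);
```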
@@ -475,12 +458,6 @@ await Bun.readableStreamToText(stream);
// returns all chunks as an array
await Bun.readableStreamToArray(stream);
// => unknown[]
// returns all chunks as a FormData object (encoded as x-www-form-urlencoded)
await Bun.readableStreamToFormData(stream);
// returns all chunks as a FormData object (encoded as multipart/form-data)
await Bun.readableStreamToFormData(stream, multipartFormBoundary);
```
## `Bun.resolveSync()`
@@ -507,17 +484,3 @@ To resolve relative to the directory containing the current file, pass `import.m
```ts
Bun.resolveSync("./foo.ts", import.meta.dir);
```
## `serialize` & `deserialize` in `bun:jsc`
To save a JavaScript value into an ArrayBuffer & back, use `serialize` and `deserialize` from the `"bun:jsc"` module.
```js
import { serialize, deserialize } from "bun:jsc";
const buf = serialize({ foo: "bar" });
const obj = deserialize(buf);
console.log(obj); // => { foo: "bar" }
```
Internally, [`structuredClone`](https://developer.mozilla.org/en-US/docs/Web/API/structuredClone) and [`postMessage`](https://developer.mozilla.org/en-US/docs/Web/API/Window/postMessage) serialize and deserialize the same way. This exposes the underlying [HTML Structured Clone Algorithm](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm) to JavaScript as an ArrayBuffer.
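For comparison, the same algorithm backs the global `structuredClone`:
```ts
const original = { foo: "bar", nested: { count: 1 } };
const copy = structuredClone(original);

copy.nested.count = 2;
console.log(original.nested.count); // 1, because the clone is deep
```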

View File

@@ -1,168 +0,0 @@
{% callout %}
`Worker` support was added in Bun v0.7.0.
{% /callout %}
[`Worker`](https://developer.mozilla.org/en-US/docs/Web/API/Worker) lets you start and communicate with a new JavaScript instance running on a separate thread while sharing I/O resources with the main thread.
Bun implements a minimal version of the [Web Workers API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API) with extensions that make it work better for server-side use cases. Like the rest of Bun, `Worker` supports CommonJS, ES Modules, TypeScript, JSX, TSX, and more out of the box. No extra build steps are necessary.
## Creating a `Worker`
Like in browsers, [`Worker`](https://developer.mozilla.org/en-US/docs/Web/API/Worker) is a global. Use it to create a new worker thread.
From the main thread:
```js#Main_thread
const workerURL = new URL("worker.ts", import.meta.url).href;
const worker = new Worker(workerURL);
worker.postMessage("hello");
worker.onmessage = event => {
console.log(event.data);
};
```
Worker thread:
```ts#worker.ts_(Worker_thread)
self.onmessage = (event: MessageEvent) => {
console.log(event.data);
postMessage("world");
};
```
You can use `import`/`export` syntax in your worker code. Unlike in browsers, there's no need to specify `{type: "module"}` to use ES Modules.
To simplify error handling, the initial script to load is resolved at the time `new Worker(url)` is called.
```js
const worker = new Worker("/not-found.js");
// throws an error immediately
```
The specifier passed to `Worker` is resolved relative to the project root (like typing `bun ./path/to/file.js`).
### `"open"`
The `"open"` event is emitted when a worker is created and ready to receive messages. This can be used to send an initial message to a worker once it's ready. (This event does not exist in browsers.)
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.addEventListener("open", () => {
console.log("worker is ready");
});
```
Messages are automatically enqueued until the worker is ready, so there is no need to wait for the `"open"` event to send messages.
## Messages with `postMessage`
To send messages, use [`worker.postMessage`](https://developer.mozilla.org/en-US/docs/Web/API/Worker/postMessage) and [`self.postMessage`](https://developer.mozilla.org/en-US/docs/Web/API/Window/postMessage). This leverages the [HTML Structured Clone Algorithm](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm).
```js
// On the worker thread, `postMessage` is automatically "routed" to the parent thread.
postMessage({ hello: "world" });
// On the main thread
worker.postMessage({ hello: "world" });
```
To receive messages, use the [`message` event handler](https://developer.mozilla.org/en-US/docs/Web/API/Worker/message_event) on the worker and main thread.
```js
// Worker thread:
self.addEventListener("message", event => {
console.log(event.data);
});
// or use the setter:
// self.onmessage = fn
// if on the main thread
worker.addEventListener("message", event => {
console.log(event.data);
});
// or use the setter:
// worker.onmessage = fn
```
## Terminating a worker
A `Worker` instance terminates automatically once its event loop has no work left to do. Attaching a `"message"` listener on the global or any `MessagePort`s will keep the event loop alive. To forcefully terminate a `Worker`, call `worker.terminate()`.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
// ...some time later
worker.terminate();
```
This will cause the worker to exit as soon as possible.
### `process.exit()`
A worker can terminate itself with `process.exit()`. This does not terminate the main process. Like in Node.js, `process.on('beforeExit', callback)` and `process.on('exit', callback)` are emitted on the worker thread (and not on the main thread), and the exit code is passed to the `"close"` event.
### `"close"`
The `"close"` event is emitted when a worker has been terminated. It can take some time for the worker to actually terminate, so this event is emitted when the worker has been marked as terminated. The `CloseEvent` will contain the exit code passed to `process.exit()`, or 0 if closed for other reasons.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.addEventListener("close", event => {
console.log("worker is being closed");
});
```
This event does not exist in browsers.
## Managing lifetime
By default, an active `Worker` will keep the main (spawning) process alive, so async tasks like `setTimeout` and promises will keep the process alive. Attaching `message` listeners will also keep the `Worker` alive.
### `worker.unref()`
To stop a running worker from keeping the process alive, call `worker.unref()`. This decouples the lifetime of the worker from the lifetime of the main process, and is equivalent to what Node.js' `worker_threads` does.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.unref();
```
Note: `worker.unref()` is not available in browsers.
### `worker.ref()`
To keep the process alive until the `Worker` terminates, call `worker.ref()`. A ref'd worker is the default behavior, and still needs something going on in the event loop (such as a `"message"` listener) for the worker to continue running.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.unref();
// later...
worker.ref();
```
Alternatively, you can also pass an `options` object to `Worker`:
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href, {
ref: false,
});
```
Note: `worker.ref()` is not available in browsers.
## Memory usage with `smol`
JavaScript instances can use a lot of memory. Bun's `Worker` supports a `smol` mode that reduces memory usage, at a cost of performance. To enable `smol` mode, pass `smol: true` to the `options` object in the `Worker` constructor.
```js
const worker = new Worker("./i-am-smol.ts", {
smol: true,
});
```
{% details summary="What does `smol` mode actually do?" %}
Setting `smol: true` sets `JSC::HeapSize` to be `Small` instead of the default `Large`.
{% /details %}

View File

@@ -31,30 +31,3 @@ All imported files and packages are bundled into the executable, along with a co
- `--publicPath`
{% /callout %}
## Embedding files
Standalone executables support embedding files.
To embed files into an executable with `bun build --compile`, import the file in your code
```js
// this becomes an internal file path
import icon from "./icon.png";
import { file } from "bun";
export default {
fetch(req) {
return new Response(file(icon));
},
};
```
You may need to specify a `--loader` for it to be treated as a `"file"` loader (so you get back a file path).
Embedded files can be read using `Bun.file`'s functions or the Node.js `fs.readFile` function (in `"node:fs"`).
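A minimal sketch of reading an embedded asset at runtime (the `./icon.png` path is a placeholder; as noted above, the import resolves to an internal file path):
```ts
import { file } from "bun";
// resolves to an internal path inside the compiled executable
import iconPath from "./icon.png";

const icon = file(iconPath);
console.log((await icon.arrayBuffer()).byteLength);
```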
## Minification
To trim down the size of the executable a little, pass `--minify` to `bun build --compile`. This uses Bun's minifier to reduce the code size. Overall though, Bun's binary is still way too big and we need to make it smaller.

View File

@@ -307,7 +307,7 @@ Depending on the target, Bun will apply different module resolution rules and op
---
- `browser`
- _Default._ For generating bundles that are intended for execution by a browser. Prioritizes the `"browser"` export condition when resolving imports. Importing any built-in modules, like `node:events` or `node:path` will work, but calling some functions, like `fs.readFile` will not work.
- _Default._ For generating bundles that are intended for execution by a browser. Prioritizes the `"browser"` export condition when resolving imports. An error will be thrown if any Node.js or Bun built-ins are imported or used, e.g. `node:fs` or `Bun.serve`.
---
@@ -482,7 +482,7 @@ n/a
{% /codetabs %}
Bun implements a universal plugin system for both Bun's runtime and bundler. Refer to the [plugin documentation](/docs/bundler/plugins) for complete documentation.
Bun implements a univeral plugin system for both Bun's runtime and bundler. Refer to the [plugin documentation](/docs/bundler/plugins) for complete documentation.
<!-- ### `manifest`
@@ -1272,17 +1272,7 @@ interface BuildArtifact extends Blob {
sourcemap?: BuildArtifact;
}
type Loader =
| "js"
| "jsx"
| "ts"
| "tsx"
| "json"
| "toml"
| "file"
| "napi"
| "wasm"
| "text";
type Loader = "js" | "jsx" | "ts" | "tsx" | "json" | "toml" | "file" | "napi" | "wasm" | "text";
interface BuildOutput {
outputs: BuildArtifact[];

View File

@@ -10,7 +10,7 @@ Bun uses the file extension to determine which built-in _loader_ should be used
**JavaScript**. Default for `.cjs` and `.mjs`.
Parses the code and applies a set of default transforms, like dead-code elimination, tree shaking, and environment variable inlining. Note that Bun does not attempt to down-convert syntax at the moment.
Parses the code and applies a set if default transforms, like dead-code elimination, tree shaking, and environment variable inlining. Note that Bun does not attempt to down-convert syntax at the moment.
### `jsx`

View File

@@ -33,7 +33,7 @@ As you can see, the source code of the `random` function occurs nowhere in the b
## When to use macros
If you have several build scripts for small things where you would otherwise have a one-off build script, bundle-time code execution can be easier to maintain. It lives with the rest of your code, it runs with the rest of the build, it is automatically parallelized, and if it fails, the build fails too.
If you have several build scripts for small things where you would otherwise have a one-off build script, bundle-time code execution can be easier to maintain. It lives with the rest of your code, it runs with the rest of the build, it is automatically paralellized, and if it fails, the build fails too.
If you find yourself running a lot of code at bundle-time though, consider running a server instead.

View File

@@ -6,17 +6,15 @@ Bun provides a universal plugin API that can be used to extend both the _runtime
Plugins intercept imports and perform custom loading logic: reading files, transpiling code, etc. They can be used to add support for additional file types, like `.scss` or `.yaml`. In the context of Bun's bundler, plugins can be used to implement framework-level features like CSS extraction, macros, and client-server code co-location.
For more complete documentation of the Plugin API, see [Runtime > Plugins](/docs/runtime/plugins).
## Usage
A plugin is defined as simple JavaScript object containing a `name` property and a `setup` function. Register a plugin with Bun using the `plugin` function.
```tsx#myPlugin.ts
```tsx#yamlPlugin.ts
import type { BunPlugin } from "bun";
const myPlugin: BunPlugin = {
name: "Custom loader",
name: "YAML loader",
setup(build) {
// implementation
},
@@ -32,3 +30,304 @@ Bun.build({
plugins: [myPlugin],
});
```
<!-- It can also be "registered" with the Bun runtime using the `Bun.plugin()` function. Once registered, the currently executing `bun` process will incorporate the plugin into its module resolution algorithm.
```ts
import {plugin} from "bun";
plugin(myPlugin);
``` -->
## `--preload`
To consume this plugin, add this file to the `preload` option in your [`bunfig.toml`](/docs/runtime/configuration). Bun automatically loads the files/modules specified in `preload` before running a file.
```toml
preload = ["./yamlPlugin.ts"]
```
To preload files during `bun test`:
```toml
[test]
preload = ["./loader.ts"]
```
{% details summary="Usage without preload" %}
Alternatively, you can import this file manually at the top of your project's entrypoint, before any application code is imported.
```ts#app.ts
import "./yamlPlugin.ts";
import { config } from "./config.yml";
console.log(config);
```
{% /details %}
## Third-party plugins
By convention, third-party plugins intended for consumption should export a factory function that accepts some configuration and returns a plugin object.
```ts
import { plugin } from "bun";
import fooPlugin from "bun-plugin-foo";
plugin(
fooPlugin({
// configuration
}),
);
// application code
```
Bun's plugin API is based on [esbuild](https://esbuild.github.io/plugins). Only [a subset](/docs/bundler/vs-esbuild#plugin-api) of the esbuild API is implemented, but some esbuild plugins "just work" in Bun, like the official [MDX loader](https://mdxjs.com/packages/esbuild/):
```jsx
import { plugin } from "bun";
import mdx from "@mdx-js/esbuild";
plugin(mdx());
import { renderToStaticMarkup } from "react-dom/server";
import Foo from "./bar.mdx";
console.log(renderToStaticMarkup(<Foo />));
```
## Loaders
<!-- The plugin logic is implemented in the `setup` function using the builder provided as the first argument (`build` in the example above). The `build` variable provides two methods: `onResolve` and `onLoad`. -->
<!-- ## `onResolve` -->
<!-- The `onResolve` method lets you intercept imports that match a particular regex and modify the resolution behavior, such as re-mapping the import to another file. In the simplest case, you can simply remap the matched import to a new path.
```ts
import { plugin } from "bun";
plugin({
name: "YAML loader",
setup(build) {
build.onResolve();
// implementation
},
});
``` -->
<!--
Internally, Bun's transpiler automatically turns `plugin()` calls into separate files (at most 1 per file). This lets loaders activate before the rest of your application runs with zero configuration. -->
Plugins are primarily used to extend Bun with loaders for additional file types. Let's look at a simple plugin that implements a loader for `.yaml` files.
```ts#yamlPlugin.ts
import { plugin } from "bun";
plugin({
name: "YAML",
async setup(build) {
const { load } = await import("js-yaml");
const { readFileSync } = await import("fs");
// when a .yaml file is imported...
build.onLoad({ filter: /\.(yaml|yml)$/ }, (args) => {
// read and parse the file
const text = readFileSync(args.path, "utf8");
const exports = load(text) as Record<string, any>;
// and returns it as a module
return {
exports,
loader: "object", // special loader for JS objects
};
});
},
});
```
With this plugin, data can be directly imported from `.yaml` files.
{% codetabs %}
```ts#index.ts
import "./yamlPlugin.ts"
import {name, releaseYear} from "./data.yml"
console.log(name, releaseYear);
```
```yaml#data.yml
name: Fast X
releaseYear: 2023
```
{% /codetabs %}
Note that the returned object has a `loader` property. This tells Bun which of its internal loaders should be used to handle the result. Even though we're implementing a loader for `.yaml`, the result must still be understandable by one of Bun's built-in loaders. It's loaders all the way down.
In this case we're using `"object"`—a built-in loader (intended for use by plugins) that converts a plain JavaScript object to an equivalent ES module. Any of Bun's built-in loaders are supported; these same loaders are used by Bun internally for handling files of various kinds. The table below is a quick reference; refer to [Bundler > Loaders](/docs/bundler/loaders) for complete documentation.
{% table %}
- Loader
- Extensions
- Output
---
- `js`
- `.mjs` `.cjs`
- Transpile to JavaScript files
---
- `jsx`
- `.js` `.jsx`
- Transform JSX then transpile
---
- `ts`
- `.ts` `.mts` `.cts`
- Transform TypeScript then transpile
---
- `tsx`
- `.tsx`
- Transform TypeScript, JSX, then transpile
---
- `toml`
- `.toml`
- Parse using Bun's built-in TOML parser
---
- `json`
- `.json`
- Parse using Bun's built-in JSON parser
---
- `napi`
- `.node`
- Import a native Node.js addon
---
- `wasm`
- `.wasm`
- Import a WebAssembly module
---
- `object`
- _none_
- A special loader intended for plugins that converts a plain JavaScript object to an equivalent ES module. Each key in the object corresponds to a named export.
{% /table %}
Loading a YAML file is useful, but plugins support more than just data loading. Let's look at a plugin that lets Bun import `*.svelte` files.
```ts#sveltePlugin.ts
import { plugin } from "bun";
await plugin({
name: "svelte loader",
async setup(build) {
const { compile } = await import("svelte/compiler");
const { readFileSync } = await import("fs");
// when a .svelte file is imported...
build.onLoad({ filter: /\.svelte$/ }, ({ path }) => {
// read and compile it with the Svelte compiler
const file = readFileSync(path, "utf8");
const contents = compile(file, {
filename: path,
generate: "ssr",
}).js.code;
// and return the compiled source code as "js"
return {
contents,
loader: "js",
};
});
},
});
```
> Note: in a production implementation, you'd want to cache the compiled output and include additional error handling.
The object returned from `build.onLoad` contains the compiled source code in `contents` and specifies `"js"` as its loader. That tells Bun to consider the returned `contents` to be a JavaScript module and transpile it using Bun's built-in `js` loader.
With this plugin, Svelte components can now be directly imported and consumed.
```js
import "./sveltePlugin.ts";
import MySvelteComponent from "./component.svelte";
console.log(MySvelteComponent.render());
```
## Reading `Bun.build`'s config
Plugins can read and write to the [build config](/docs/bundler#api) with `build.config`.
```ts
Bun.build({
entrypoints: ["./app.ts"],
outdir: "./dist",
sourcemap: "external",
plugins: [
{
name: "demo",
setup(build) {
console.log(build.config.sourcemap); // "external"
build.config.minify = true; // enable minification
// `plugins` is readonly
console.log(`Number of plugins: ${build.config.plugins.length}`);
},
},
],
});
```
## Reference
```ts
namespace Bun {
function plugin(plugin: { name: string; setup: (build: PluginBuilder) => void }): void;
}
type PluginBuilder = {
onResolve: (
args: { filter: RegExp; namespace?: string },
callback: (args: { path: string; importer: string }) => {
path: string;
namespace?: string;
} | void,
) => void;
onLoad: (
args: { filter: RegExp; namespace?: string },
callback: (args: { path: string }) => {
loader?: Loader;
contents?: string;
exports?: Record<string, any>;
},
) => void;
config: BuildConfig;
};
type Loader = "js" | "jsx" | "ts" | "tsx" | "json" | "toml" | "object";
```
The `onLoad` method optionally accepts a `namespace` in addition to the `filter` regex. This namespace will be used to prefix the import in transpiled code; for instance, a loader with a `filter: /\.yaml$/` and `namespace: "yaml:"` will transform an import from `./myfile.yaml` into `yaml:./myfile.yaml`.
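A sketch of how a namespace might be wired across `onResolve` and `onLoad` (illustrative names; see the reference above for the callback shapes):
```ts
import { plugin } from "bun";

plugin({
  name: "YAML (namespaced)",
  setup(build) {
    // tag matching imports with a custom namespace...
    build.onResolve({ filter: /\.yaml$/ }, args => ({
      path: args.path,
      namespace: "yaml",
    }));

    // ...then only handle modules in that namespace
    build.onLoad({ filter: /.*/, namespace: "yaml" }, args => ({
      exports: { path: args.path },
      loader: "object",
    }));
  },
});
```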

View File

@@ -897,7 +897,7 @@ const myPlugin: BunPlugin = {
};
```
The `builder` object provides some methods for hooking into parts of the bundling process. Bun implements `onResolve` and `onLoad`; it does not yet implement the esbuild hooks `onStart`, `onEnd`, and `onDispose`, and `resolve` utilities. `initialOptions` is partially implemented, being read-only and only having a subset of esbuild's options; use [`config`](/docs/bundler/plugins) (same thing but with Bun's `BuildConfig` format) instead.
The `builder` object provides some methods for hooking into parts of the bundling process. Bun implements `onResolve` and `onLoad`; it does not yet implement the esbuild hooks `onStart`, `onEnd`, and `onDispose`, and `resolve` utilities. `initialOptions` is partially implemented, being read-only and only having a subset of esbuild's options; use [`config`](/docs/bundler/plugins#reading-bunbuilds-config) (same thing but with Bun's `BuildConfig` format) instead.
```ts
import type { BunPlugin } from "bun";

View File

@@ -40,7 +40,7 @@ On Linux, `bun install` tends to install packages 20-100x faster than `npm insta
Running `bun install` will:
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun does not install `peerDependencies` by default.
- **Run** your project's `{pre|post}install` and `{pre|post}prepare` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Run** your project's `{pre|post}install` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Write** a `bun.lockb` lockfile to the project root.
To install in production mode (i.e. without `devDependencies`):
@@ -49,7 +49,7 @@ To install in production mode (i.e. without `devDependencies`):
$ bun install --production
```
To install with reproducible dependencies, use `--frozen-lockfile`. If your `package.json` disagrees with `bun.lockb`, Bun will exit with an error. This is useful for production builds and CI environments.
To install dependencies without allowing changes to lockfile (useful on CI):
```bash
$ bun install --frozen-lockfile
@@ -114,7 +114,7 @@ $ bun add zod@latest
To add a package as a dev dependency (`"devDependencies"`):
```bash
$ bun add --dev @types/react
$ bun add --development @types/react
$ bun add -d @types/react
```
@@ -124,26 +124,6 @@ To add a package as an optional dependency (`"optionalDependencies"`):
$ bun add --optional lodash
```
To add a package and pin to the resolved version, use `--exact`. This will resolve the version of the package and add it to your `package.json` with an exact version number instead of a version range.
```bash
$ bun add react --exact
```
This will add the following to your `package.json`:
```jsonc
{
"dependencies": {
// without --exact
"react": "^18.2.0", // this matches >= 18.2.0 < 19.0.0
// with --exact
"react": "18.2.0" // this matches only 18.2.0 exactly
}
}
```
To install a package globally:
```bash
@@ -226,46 +206,6 @@ In addition, the `--save` flag can be used to add `cool-pkg` to the `dependencie
}
```
## Trusted dependencies
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts for installed dependencies, such as `postinstall`. These scripts represent a potential security risk, as they can execute arbitrary code on your machine.
<!-- Bun maintains an allow-list of popular packages containing `postinstall` scripts that are known to be safe. To run lifecycle scripts for packages that aren't on this list, add the package to `trustedDependencies` in your package.json. -->
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
<!-- ```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": {
+ "my-trusted-package": "*"
+ }
}
``` -->
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
Bun reads this field and will run lifecycle scripts for `my-trusted-package`.
<!-- If you specify a version range, Bun will only execute lifecycle scripts if the resolved package version matches the range. -->
<!--
```json
{
"name": "my-app",
"version": "1.0.0",
"trustedDependencies": {
"my-trusted-package": "^1.0.0"
}
}
``` -->
## Git dependencies
To add a dependency from a git repository:

View File

@@ -32,26 +32,6 @@ The "naked" `bun` command is equivalent to `bun run`.
$ bun index.tsx
```
### `--watch`
To run a file in watch mode, use the `--watch` flag.
```bash
$ bun --watch run index.tsx
```
### `--smol`
{% callout %}
Added in Bun v0.7.0.
{% /callout %}
In memory-constrained environments, use the `--smol` flag to reduce memory usage at a cost to performance.
```bash
$ bun --smol run index.tsx
```
## Run a `package.json` script
{% note %}
@@ -126,4 +106,4 @@ Under the hood Bun uses the [JavaScriptCore engine](https://developer.apple.com/
{% image src="/images/bun-run-speed.jpeg" caption="Bun vs Node.js vs Deno running Hello World" /%}
<!-- If no `node_modules` directory is found in the working directory or above, Bun will abandon Node.js-style module resolution in favor of the `Bun module resolution algorithm`. Under Bun-style module resolution, all packages are _auto-installed_ on the fly into a [global module cache](/docs/install/cache). For full details on this algorithm, refer to [Runtime > Modules](/docs/runtime/modules). -->
<!-- If no `node_modules` directory is found in the working directory or above, Bun will abandon Node.js-style module resolution in favor of the `Bun module resolution algorithm`. Under Bun-style module resolution, all packages are _auto-installed_ on the fly into a [global module cache](/docs/cli/install#global-cache). For full details on this algorithm, refer to [Runtime > Modules](/docs/runtime/modules). -->

View File

@@ -30,50 +30,14 @@ The runner recursively searches the working directory for files that match the f
- `*.spec.{js|jsx|ts|tsx}`
- `*_spec.{js|jsx|ts|tsx}`
You can filter the set of _test files_ to run by passing additional positional arguments to `bun test`. Any test file with a path that matches one of the filters will run. Commonly, these filters will be file or directory names; glob patterns are not yet supported.
You can filter the set of tests to run by passing additional positional arguments to `bun test`. Any file in the directory with an _absolute path_ that contains one of the filters will run. Commonly, these filters will be file or directory names; glob patterns are not yet supported.
```bash
$ bun test <filter> <filter> ...
```
To filter by _test name_, use the `-t`/`--test-name-pattern` flag.
```sh
# run all tests or test suites with "addition" in the name
$ bun test --test-name-pattern addition
```
The test runner runs all tests in a single process. It loads all `--preload` scripts (see [Lifecycle](/docs/test/lifecycle) for details), then runs all tests. If a test fails, the test runner will exit with a non-zero exit code.
## Timeouts
Use the `--timeout` flag to specify a _per-test_ timeout in milliseconds. If a test times out, it will be marked as failed. The default value is `5000`.
```bash
# default value is 5000
$ bun test --timeout 20
```
## Rerun tests
Use the `--rerun-each` flag to run each test multiple times. This is useful for detecting flaky or non-deterministic test failures.
```sh
$ bun test --rerun-each 100
```
## Bail out with `--bail`
Use the `--bail` flag to abort the test run early after a pre-determined number of test failures. By default Bun will run all tests and report all failures, but sometimes in CI environments it's preferable to terminate earlier to reduce CPU usage.
```sh
# bail after 1 failure
$ bun test --bail
# bail after 10 failures
$ bun test --bail 10
```
## Watch mode
Similar to `bun run`, you can pass the `--watch` flag to `bun test` to watch for changes and re-run tests.
@@ -101,24 +65,6 @@ $ bun test --preload ./setup.ts
See [Test > Lifecycle](/docs/test/lifecycle) for complete documentation.
## Mocks
Create mocks with the `mock` function. Mocks are automatically reset between tests.
```ts
import { test, expect, mock } from "bun:test";
const random = mock(() => Math.random());
test("random", async () => {
const val = random();
expect(val).toBeGreaterThan(0);
expect(random).toHaveBeenCalled();
expect(random).toHaveBeenCalledTimes(1);
});
```
See [Test > Mocks](/docs/test/mocks) for complete documentation.
## Snapshot testing
Snapshots are supported by `bun test`. See [Test > Snapshots](/docs/test/snapshots) for complete documentation.
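A minimal sketch of a snapshot assertion:
```ts
import { test, expect } from "bun:test";

test("snapshot", () => {
  expect({ a: 1, b: 2 }).toMatchSnapshot();
});
```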

View File

@@ -1,7 +1,7 @@
[Elysia](https://elysiajs.com) is a Bun-first performance focused web framework that takes full advantage of Bun's HTTP, file system, and hot reloading APIs.
Designed with TypeScript in mind, Elysia doesn't require you to understand TypeScript to get its benefits. The library understands what you want and automatically infers types from your code.
⚡️ Elysia is [one of the fastest Bun web frameworks](https://github.com/SaltyAom/bun-http-framework-benchmark)
:zap: Elysia is [one of the fastest Bun web frameworks](https://github.com/SaltyAom/bun-http-framework-benchmark)
```ts#server.ts
import { Elysia } from 'elysia'
@@ -9,7 +9,7 @@ import { Elysia } from 'elysia'
const app = new Elysia()
.get('/', () => 'Hello Elysia')
.listen(8080)
console.log(`🦊 Elysia is running on port ${app.server.port}...`)
```

View File

@@ -22,7 +22,7 @@ app.listen(port, () => {
Bun implements the [`node:http`](https://nodejs.org/api/http.html) and [`node:https`](https://nodejs.org/api/https.html) modules that these libraries rely on. These modules can also be used directly, though [`Bun.serve`](/docs/api/http) is recommended for most use cases.
{% callout %}
**Note** — Refer to the [Runtime > Node.js APIs](/docs/runtime/nodejs-apis#node-http) page for more detailed compatibility information.
**Note** — Refer to the [Runtime > Node.js APIs](/docs/runtime/nodejs-apis#node_http) page for more detailed compatibility information.
{% /callout %}
```ts

View File

@@ -1,16 +1,15 @@
[Hono](https://github.com/honojs/hono) is a lightweight ultrafast web framework designed for the edge.
```ts
import { Hono } from "hono";
const app = new Hono();
import { Hono } from 'hono'
const app = new Hono()
app.get("/", c => c.text("Hono!"));
app.get('/', (c) => c.text('Hono!'))
export default app;
export default app
```
Get started with `bun create` or follow Hono's [Bun quickstart](https://hono.dev/getting-started/bun).
```bash
$ bun create hono ./myapp
$ cd myapp

View File

@@ -1,27 +0,0 @@
---
name: Convert an ArrayBuffer to an array of numbers
---
To retrieve the contents of an `ArrayBuffer` as an array of numbers, create a [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) over the buffer, then use the [`Array.from()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/from) method to convert it to an array.
```ts
const buf = new ArrayBuffer(64);
const arr = new Uint8Array(buf);
arr.length; // 64
arr[0]; // 0 (instantiated with all zeros)
```
---
The `Uint8Array` class supports array indexing and iteration. However if you wish to convert the instance to a regular `Array`, use `Array.from()`. (This will likely be slower than using the `Uint8Array` directly.)
```ts
const buf = new ArrayBuffer(64);
const uintArr = new Uint8Array(buf);
const regularArr = Array.from(uintArr);
// number[]
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,24 +0,0 @@
---
name: Convert an ArrayBuffer to a Blob
---
A [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) can be constructed from an array of "chunks", where each chunk is a string, binary data structure, or another `Blob`.
```ts
const buf = new ArrayBuffer(64);
const blob = new Blob([buf]);
```
---
By default the `type` of the resulting `Blob` will be unset. This can be set manually.
```ts
const buf = new ArrayBuffer(64);
const blob = new Blob([buf], { type: "application/octet-stream" });
blob.type; // => "application/octet-stream"
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,25 +0,0 @@
---
name: Convert an ArrayBuffer to a Buffer
---
The Node.js [`Buffer`](https://nodejs.org/api/buffer.html) API predates the introduction of `ArrayBuffer` into the JavaScript language. Bun implements both.
Use the static `Buffer.from()` method to create a `Buffer` from an `ArrayBuffer`.
```ts
const arrBuffer = new ArrayBuffer(64);
const nodeBuffer = Buffer.from(arrBuffer);
```
---
To create a `Buffer` that only views a portion of the underlying buffer, pass the offset and length to the constructor.
```ts
const arrBuffer = new ArrayBuffer(64);
const nodeBuffer = Buffer.from(arrBuffer, 0, 16); // view first 16 bytes
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,15 +0,0 @@
---
name: Convert an ArrayBuffer to a string
---
Bun implements the Web-standard [`TextDecoder`](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder) class for converting between binary data types and strings.
```ts
const buf = new ArrayBuffer(64);
const decoder = new TextDecoder();
const str = decoder.decode(buf);
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,39 +0,0 @@
---
name: Convert an ArrayBuffer to a Uint8Array
---
A `Uint8Array` is a _typed array_, meaning it is a mechanism for viewing the data in an underlying `ArrayBuffer`.
```ts
const buffer = new ArrayBuffer(64);
const arr = new Uint8Array(buffer);
```
---
Instances of other typed arrays can be created similarly.
```ts
const buffer = new ArrayBuffer(64);
const arr1 = new Uint8Array(buffer);
const arr2 = new Uint16Array(buffer);
const arr3 = new Uint32Array(buffer);
const arr4 = new Float32Array(buffer);
const arr5 = new Float64Array(buffer);
const arr6 = new BigInt64Array(buffer);
const arr7 = new BigUint64Array(buffer);
```
---
To create a typed array that only views a portion of the underlying buffer, pass the offset and length to the constructor.
```ts
const buffer = new ArrayBuffer(64);
const arr = new Uint8Array(buffer, 0, 16); // view first 16 bytes
```
---
See [Docs > API > Utils](/docs/api/utils) for more useful utilities.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Blob to an ArrayBuffer
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, including `.arrayBuffer()`.
```ts
const blob = new Blob(["hello world"]);
const buf = await blob.arrayBuffer();
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Blob to a DataView
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats. This snippet reads the contents into an `ArrayBuffer`, then creates a `DataView` from the buffer.
```ts
const blob = new Blob(["hello world"]);
const arr = new DataView(await blob.arrayBuffer());
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Blob to a ReadableStream
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, including `.stream()`. This returns a `ReadableStream`.
```ts
const blob = new Blob(["hello world"]);
const stream = blob.stream();
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,15 +0,0 @@
---
name: Convert a Blob to a string
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, including `.text()`.
```ts
const blob = new Blob(["hello world"]);
const str = await blob.text();
// => "hello world"
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Blob to a Uint8Array
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats. This snippet reads the contents into an `ArrayBuffer`, then creates a `Uint8Array` from the buffer.
```ts
const blob = new Blob(["hello world"]);
const arr = new Uint8Array(await blob.arrayBuffer());
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Buffer to an ArrayBuffer
---
The Node.js [`Buffer`](https://nodejs.org/api/buffer.html) class provides a way to view and manipulate data in an underlying `ArrayBuffer`, which is available via the `buffer` property.
```ts
const nodeBuf = Buffer.alloc(64);
const arrBuf = nodeBuf.buffer;
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Buffer to a blob
---
A [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) can be constructed from an array of "chunks", where each chunk is a string, binary data structure (including `Buffer`), or another `Blob`.
```ts
const buf = Buffer.from("hello");
const blob = new Blob([buf]);
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,41 +0,0 @@
---
name: Convert a Buffer to a ReadableStream
---
The naive approach to creating a [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) from a [`Buffer`](https://nodejs.org/api/buffer.html) is to use the `ReadableStream` constructor and enqueue the entire buffer as a single chunk. For a large buffer, this may be undesirable because it does not "stream" the data in smaller chunks.
```ts
const buf = Buffer.from("hello world");
const stream = new ReadableStream({
start(controller) {
controller.enqueue(buf);
controller.close();
},
});
```
---
To stream the data in smaller chunks, first create a `Blob` instance from the `Buffer`. Then use the [`Blob.stream()`](https://developer.mozilla.org/en-US/docs/Web/API/Blob/stream) method to create a `ReadableStream` that streams the data in chunks of a specified size.
```ts
const buf = Buffer.from("hello world");
const blob = new Blob([buf]);
const stream = blob.stream();
```
---
The chunk size can be set by passing a number to the `.stream()` method.
```ts
const buf = Buffer.from("hello world");
const blob = new Blob([buf]);
// set chunk size of 1024 bytes
const stream = blob.stream(1024);
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,25 +0,0 @@
---
name: Convert a Buffer to a string
---
The [`Buffer`](https://nodejs.org/api/buffer.html) class provides a built-in `.toString()` method that converts a `Buffer` to a string.
```ts
const buf = Buffer.from("hello");
const str = buf.toString();
// => "hello"
```
---
You can optionally specify an encoding and byte range.
```ts
const buf = Buffer.from("hello world!");
const str = buf.toString("utf8", 0, 5);
// => "hello"
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Buffer to a Uint8Array
---
The Node.js [`Buffer`](https://nodejs.org/api/buffer.html) class extends [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array), so no conversion is needed. All properties and methods on `Uint8Array` are available on `Buffer`.
```ts
const buf = Buffer.alloc(64);
buf instanceof Uint8Array; // => true
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,15 +0,0 @@
---
name: Convert a Uint8Array to a string
---
If a [`DataView`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView) contains ASCII-encoded text, you can convert it to a string using the [`TextDecoder`](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder) class.
```ts
const dv: DataView = ...;
const decoder = new TextDecoder();
const str = decoder.decode(dv);
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,4 +0,0 @@
{
"name": "Binary data",
"description": "A collection of guides for converting between binary data formats with Bun"
}

View File

@@ -1,25 +0,0 @@
---
name: Convert a Uint8Array to an ArrayBuffer
---
A [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) is a _typed array_ class, meaning it is a mechanism for viewing data in an underlying `ArrayBuffer`. The underlying `ArrayBuffer` is accessible via the `buffer` property.
```ts
const arr = new Uint8Array(64);
arr.buffer; // => ArrayBuffer(64)
```
---
The `Uint8Array` may be a view over a _subset_ of the data in the underlying `ArrayBuffer`. In this case, the `buffer` property will return the entire buffer, and the `byteOffset` and `byteLength` properties will indicate the subset.
```ts
const buf = new ArrayBuffer(64);
const arr = new Uint8Array(buf, 16, 32);
arr.buffer; // => ArrayBuffer(64)
arr.byteOffset; // => 16
arr.byteLength; // => 32
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,16 +0,0 @@
---
name: Convert a Uint8Array to a Blob
---
A [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) can be constructed from an array of "chunks", where each chunk is a string, binary data structure (including `Uint8Array`), or another `Blob`.
```ts
const arr = new Uint8Array([0x68, 0x65, 0x6c, 0x6c, 0x6f]);
const blob = new Blob([arr]);
console.log(await blob.text());
// => "hello"
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Uint8Array to a Buffer
---
The [`Buffer`](https://nodejs.org/api/buffer.html) class extends [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) with a number of additional methods. Use `Buffer.from()` to create a `Buffer` instance from a `Uint8Array`.
```ts
const arr: Uint8Array = ...
const buf = Buffer.from(arr);
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

View File

@@ -1,14 +0,0 @@
---
name: Convert a Uint8Array to a DataView
---
A [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) is a _typed array_ class, meaning it is a mechanism for viewing data in an underlying `ArrayBuffer`. The following snippet creates a [`DataView`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView) instance over the same range of data as the `Uint8Array`.
```ts
const arr: Uint8Array = ...
const dv = new DataView(arr.buffer, arr.byteOffset, arr.byteLength);
```
---
See [Docs > API > Binary Data](/docs/api/binary-data#conversion) for complete documentation on manipulating binary data with Bun.

Some files were not shown because too many files have changed in this diff