Compare commits


3 Commits

Author         SHA1         Message                                            Date
Dylan Conway   e9470121d2   more docs                                          2023-10-10 18:50:46 -07:00
Dylan Conway   ee2e34866e   Merge branch 'main' into dylan/github-api-option   2023-10-10 15:28:08 -07:00
Dylan Conway   e6d97f2581   add install.github.api option                      2023-10-05 16:39:12 -07:00
1386 changed files with 90801 additions and 128036 deletions

View File

@@ -1,15 +1,17 @@
**/*.a
**/*.o
**/.next
**/CMakeCache.txt
**/node_modules
.git
examples
node_modules
**/node_modules
src/bun.js/WebKit/LayoutTests
zig-out
zig-build
**/*.o
**/*.a
examples
**/.next
.git
src/bun.js/WebKit
**/CMakeCache.txt
packages/**/bun
packages/**/bun-profile
src/bun.js/WebKit
src/bun.js/WebKit/LayoutTests
zig-build
zig-cache
zig-out

.gitattributes (vendored): 37 changes
View File

@@ -17,28 +17,37 @@
*.mjs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.lockb binary diff=lockb
.vscode/launch.json linguist-generated
src/api/schema.d.ts linguist-generated
fixture.*.c linguist-generated
src/api/schema.js linguist-generated
*-fixture* linguist-generated
src/bun.js/bindings/ZigGeneratedCode.h linguist-generated
src/bun.js/bindings/ZigGeneratedCode.cpp linguist-generated
src/bun.js/bindings/headers.h linguist-generated
src/bun.js/bindings/headers.zig linguist-generated
packages/bun-uws/fuzzing/seed-corpus/**/* linguist-generated
src/bun.js/bindings/sqlite/sqlite3.c linguist-vendored
src/bun.js/bindings/sqlite/sqlite3_local.h linguist-vendored
*.lockb binary diff=lockb
src/bun.js/bindings/simdutf.cpp linguist-vendored
src/bun.js/bindings/simdutf.h linguist-vendored
src/js/out/WebCoreJSBuiltins.cpp linguist-generated
src/js/out/WebCoreJSBuiltins.h linguist-generated
src/js/out/WebCoreJSBuiltins.d.ts linguist-generated
src/bun.js/bindings/ZigGeneratedClasses.h linguist-generated
src/bun.js/bindings/ZigGeneratedClasses.cpp linguist-generated
src/bun.js/bindings/ZigGeneratedCode.h linguist-generated
src/bun.js/bindings/ZigGeneratedCode.cpp linguist-generated
src/bun.js/bindings/headers.h linguist-generated
src/bun.js/bindings/headers.zig linguist-generated
src/bun.js/bindings/JSSink.h linguist-generated
src/bun.js/bindings/JSSink.zig linguist-generated
src/bun.js/bindings/ZigGeneratedClasses+DOMClientIsoSubspaces.h linguist-generated
src/bun.js/bindings/ZigGeneratedClasses+DOMIsoSubspaces.h linguist-generated
src/bun.js/bindings/ZigGeneratedClasses+lazyStructureHeader.h linguist-generated
src/bun.js/bindings/ZigGeneratedClasses+lazyStructureImpl.h linguist-generated
docs/**/* linguist-documentation
# Don't count tests in the language stats - https://github.com/github-linguist/linguist/blob/master/docs/overrides.md
test/**/* linguist-documentation
bench/**/* linguist-documentation
examples/**/* linguist-documentation
packages/bun-uws/fuzzing/seed-corpus/**/* linguist-generated

View File

@@ -19,17 +19,18 @@ This adds a new flag --bail to bun test. When set, it will stop running tests af
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I included a test for the new code, or existing tests cover it
- [ ] I ran my tests locally and they pass (`bun-debug test test-file-name.test`)
- [ ] I ran `make js` and committed the transpiled changes
- [ ] I or my editor ran Prettier on the changed files (or I ran `bun fmt`)
- [ ] I included a test for the new code, or an existing test covers it
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1) freed and (2) only freed when it should be
- [ ] I or my editor ran `zig fmt` on the changed files
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally (`bun-debug test test-file-name.test`)
-->
<!-- If new methods, getters, or setters were added to a publicly exposed class:
@@ -42,6 +43,17 @@ This adds a new flag --bail to bun test. When set, it will stop running tests af
- [ ] I made sure that specific versions of dependencies are used instead of ranged or tagged versions
-->
<!-- If functions were added to exports.zig or bindings.zig
- [ ] I ran `make headers` to regenerate the C header file
-->
<!-- If \*.classes.ts files were added or changed:
- [ ] I ran `make codegen` to regenerate the C++ and Zig code
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
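
The checklist hunks above name a handful of repository commands (`make js`, `make headers`, `make codegen`, `zig fmt`, `bun fmt`, `bun-debug test`). Purely as an illustration, they could be strung together into one job like the sketch below; the job name, runner, paths, and the assumption that a debug bun binary is already on PATH are mine, not part of this diff.

# Illustrative sketch only: a hypothetical job running the commands named in
# the checklist above. Job name, runner, and paths are assumptions, and a
# debug build of bun (`bun-debug`) is assumed to be on PATH already.
jobs:
  pr-checklist:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          submodules: recursive
      - name: Regenerate committed codegen output
        run: |
          make js        # transpiled JS/TS builtins
          make headers   # C headers for exports.zig / bindings.zig changes
          make codegen   # C++/Zig code for *.classes.ts changes
      - name: Formatting
        run: |
          zig fmt --check src   # assumes Zig sources live under src/
          bun fmt               # Prettier pass named in the checklist
      - name: Run the affected tests
        run: bun-debug test test-file-name.test   # placeholder name from the template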

View File

@@ -4,6 +4,11 @@ concurrency:
group: bun-linux-aarch64-${{ github.ref }}
cancel-in-progress: true
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
TEST_TAG: bun-test'
on:
push:
branches:
@@ -11,19 +16,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches:
- main
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -45,7 +37,10 @@ jobs:
arch: aarch64
build_arch: arm64
runner: linux-arm64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-arm64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-arm64-lto"
build_machine_arch: aarch64
steps:
- uses: actions/checkout@v3
with:
@@ -77,7 +72,9 @@ jobs:
BUILDARCH=${{matrix.build_arch}}
BUILD_MACHINE_ARCH=${{matrix.build_machine_arch}}
CPU_TARGET=${{matrix.cpu}}
WEBKIT_URL=${{matrix.webkit_url}}
GIT_SHA=${{github.sha}}
WEBKIT_BASENAME=${{matrix.webkit_basename}}
platforms: linux/${{matrix.build_arch}}
target: artifact
outputs: type=local,dest=${{runner.temp}}/release
@@ -113,6 +110,14 @@ jobs:
with:
name: bun-${{matrix.tag}}
path: ${{runner.temp}}/release/bun-${{matrix.tag}}.zip
- uses: actions/upload-artifact@v3
with:
name: bun-obj-${{matrix.tag}}
path: ${{runner.temp}}/release/bun-obj
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}-dependencies
path: ${{runner.temp}}/release/bun-dependencies
- name: Release
id: release
uses: ncipollo/release-action@v1
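
Setting the changed lines aside, the job above follows one pattern end to end: build a Dockerfile stage with Buildx, export that stage's filesystem to the runner via a local output, then upload the exported files as workflow artifacts. A condensed sketch of that pattern follows; the step names and the bun-linux-aarch64 artifact name are illustrative rather than copied from the workflow.

# Sketch of the build-in-Docker / export-locally / upload-artifact pattern.
# Artifact names and build-arg values here are illustrative.
steps:
  - uses: actions/checkout@v3
    with:
      submodules: recursive
  - uses: docker/setup-buildx-action@v2
  - name: Build the artifact stage and export it to the runner
    uses: docker/build-push-action@v3
    with:
      context: .
      push: false            # nothing is pushed to a registry
      target: artifact       # Dockerfile stage that holds the built binaries
      platforms: linux/arm64
      build-args: |
        CPU_TARGET=native
        GIT_SHA=${{ github.sha }}
      outputs: type=local,dest=${{ runner.temp }}/release   # copy the stage's files to disk
  - name: Upload the zipped build
    uses: actions/upload-artifact@v3
    with:
      name: bun-linux-aarch64
      path: ${{ runner.temp }}/release/bun-linux-aarch64.zip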

View File

@@ -4,6 +4,11 @@ concurrency:
group: bun-linux-build-${{ github.ref }}
cancel-in-progress: true
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
TEST_TAG: bun-test'
on:
push:
branches:
@@ -11,8 +16,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -22,8 +25,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -46,44 +47,22 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
assertions: "OFF"
zig_optimize: "ReleaseFast"
target: "artifact"
- cpu: nehalem
tag: linux-x64-baseline
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
assertions: "OFF"
zig_optimize: "ReleaseFast"
target: "artifact"
# - cpu: haswell
# tag: linux-x64-assertions
# arch: x86_64
# build_arch: amd64
# runner: big-ubuntu
# build_machine_arch: x86_64
# assertions: "ON"
# zig_optimize: "ReleaseSafe"
# target: "artifact-assertions"
# - cpu: nehalem
# tag: linux-x64-baseline-assertions
# arch: x86_64
# build_arch: amd64
# runner: big-ubuntu
# build_machine_arch: x86_64
# assertions: "ON"
# zig_optimize: "ReleaseSafe"
# target: "artifact-assertions"
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v3
with:
submodules: recursive
ref: ${{github.sha}}
clean: true
- uses: docker/setup-buildx-action@v2
id: buildx
with:
@@ -111,23 +90,12 @@ jobs:
BUILDARCH=${{matrix.build_arch}}
BUILD_MACHINE_ARCH=${{matrix.build_machine_arch}}
CPU_TARGET=${{matrix.cpu}}
WEBKIT_URL=${{matrix.webkit_url}}
GIT_SHA=${{github.sha}}
ASSERTIONS=${{matrix.assertions}}
ZIG_OPTIMIZE=${{matrix.zig_optimize}}
SCCACHE_BUCKET=bun
SCCACHE_REGION=auto
SCCACHE_S3_USE_SSL=true
SCCACHE_ENDPOINT=${{ secrets.CACHE_S3_ENDPOINT }}
AWS_ACCESS_KEY_ID=${{ secrets.CACHE_S3_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }}
WEBKIT_BASENAME=${{matrix.webkit_basename}}
platforms: linux/${{matrix.build_arch}}
target: ${{matrix.target}}
target: artifact
outputs: type=local,dest=${{runner.temp}}/release
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
${{runner.temp}}/release/bun-profile --version
- name: Zip
run: |
# if zip is not found
@@ -221,15 +189,12 @@ jobs:
include:
- tag: linux-x64
- tag: linux-x64-baseline
- tag: linux-x64-assertions
- tag: linux-x64-baseline-assertions
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
submodules: false
clean: true
- id: download
name: Download
uses: actions/download-artifact@v3
@@ -244,11 +209,7 @@ jobs:
cd bun-${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
./bun --version
- id: install-dependnecies
name: Install dependencies
run: |

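These hunks also shuffle a small smoke test: the freshly built binary is run with --version right after the Docker build exports it (and again in the test job), so a completely broken build fails fast instead of hanging later in the test suite. Condensed from the hunks above, the build-side step looks like this:

# Smoke-test step, condensed from the hunks above; it runs the exported
# profile binary immediately after the Docker build.
- id: bun-version-check
  name: Bun version check
  run: |
    # If this hangs, it means something is seriously wrong with the build
    ${{ runner.temp }}/release/bun-profile --version
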
View File

@@ -5,8 +5,9 @@ concurrency:
cancel-in-progress: true
env:
LLVM_VERSION: 16
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
TEST_TAG: bun-test'
on:
push:
@@ -14,8 +15,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -24,8 +23,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -33,134 +30,79 @@ on:
workflow_dispatch:
jobs:
macOS-zig:
name: macOS Zig Object
macos-object-files:
name: macOS Object
runs-on: med-ubuntu
if: github.repository_owner == 'oven-sh'
strategy:
matrix:
include:
# - cpu: nehalem
# arch: x86_64
# tag: bun-obj-darwin-x64-baseline
# - cpu: haswell
# arch: x86_64
# tag: bun-obj-darwin-x64
- cpu: native
arch: aarch64
tag: bun-obj-darwin-aarch64
steps:
- uses: actions/checkout@v4
# - name: Checkout submodules
# run: git submodule update --init --recursive --depth=1 --progress --force
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
- uses: actions/checkout@v3
with:
submodules: recursive
- uses: docker/setup-buildx-action@v2
id: buildx
with:
install: true
- name: Run
run: |
rm -rf ${{runner.temp}}/release
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Compile Zig Object
- name: Build and push
uses: docker/build-push-action@v3
if: runner.arch == 'X64'
with:
context: .
push: false
# This doesnt seem to work
# cache-from: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# cache-to: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
cache-from: type=gha
cache-to: type=gha,mode=min
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
BUILDARCH=amd64
BUILD_MACHINE_ARCH=x86_64
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-macos-none
GIT_SHA=${{ github.sha }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
TRIPLET=${{matrix.arch}}-macos-none
GIT_SHA=${{github.sha}}
platforms: linux/amd64
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v3
- name: Build and push
uses: docker/build-push-action@v3
if: runner.arch == 'ARM64'
with:
context: .
push: false
cache-from: type=gha
cache-to: type=gha,mode=min
build-args: |
ARCH=${{ matrix.arch }}
BUILDARCH=arm64
BUILD_MACHINE_ARCH=aarch64
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{matrix.arch}}-macos-none
GIT_SHA=${{github.sha}}
platforms: linux/arm64
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}
path: ${{runner.temp}}/release/bun-zig.o
macOS-dependencies:
name: macOS Dependencies
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 15
strategy:
matrix:
include:
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
runner: macos-arm64
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Hash submodule versions
run: |
print_data() {
git submodule | grep -v WebKit
llvm-config --version
rustc --version
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "sha=$(print_data | sha1sum | cut -c 1-10)" >> $GITHUB_OUTPUT
id: submodule-versions
- name: Cache submodule dependencies
id: cache-deps-restore
uses: actions/cache/restore@v3
with:
path: ${{runner.temp}}/bun-deps
key: bun-deps-${{ matrix.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Compile submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
env:
CPU_TARGET: ${{ matrix.cpu }}
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
bash ./scripts/clean-dependencies.sh
bash ./scripts/all-dependencies.sh
- name: Cache submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v3
with:
path: ${{runner.temp}}/bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
- name: Upload submodule dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
path: ${{runner.temp}}/release/bun.o
macOS-cpp:
name: macOS C++
runs-on: ${{ matrix.runner }}
@@ -169,145 +111,257 @@ jobs:
strategy:
matrix:
include:
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
dependencies: true
compile_obj: true
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
brew install ccache rust llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
# TODO: replace with sccache
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}
- name: Compile C++
- name: Download WebKit
if: matrix.compile_obj
env:
CPU_TARGET: ${{ matrix.cpu }}
SOURCE_DIR: ${{ github.workspace }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
bash compile-cpp-only.sh -v
rm -rf $JSC_BASE_DIR
mkdir -p $JSC_BASE_DIR
curl -L ${{ matrix.webkit_url }} | tar -xz -C $JSC_BASE_DIR --strip-components=1
- name: Compile dependencies
if: matrix.dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $BUN_DEPS_OUT_DIR
make vendor-without-check
- name: Compile C++
if: matrix.compile_obj
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $OBJ_DIR $BUN_DEPS_OUT_DIR
make clean-bindings
make -j $(sysctl -n hw.ncpu) release-bindings
- name: Upload C++
if: matrix.compile_obj
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
macOS-link:
path: ${{ runner.temp }}/bun-cpp-obj
- name: Upload Dependencies
if: matrix.dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
macOS:
name: macOS Link
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
needs: [macOS-zig, macOS-cpp, macOS-dependencies]
timeout-minutes: 60
needs: [macOS-cpp, macos-object-files]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
package: bun-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
steps:
- uses: actions/checkout@v4
with:
submodules: recursive
ref: ${{github.sha}}
clean: true
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install ccache llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv openssl@1.1 ninja --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
brew install rust ccache llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-${{matrix.arch}}.zip"
unzip bun-darwin-${{matrix.arch}}.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-${{matrix.arch}}/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-link
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-link
- name: Download WebKit
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
rm -rf $JSC_BASE_DIR
mkdir -p $JSC_BASE_DIR
curl -L ${{ matrix.webkit_url }} | tar -xz -C $JSC_BASE_DIR --strip-components=1
- name: Download C++
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download Zig Object
- name: Download Dependencies
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{ runner.temp }}/bun-deps
- name: Download Object
uses: actions/download-artifact@v3
with:
name: ${{ matrix.obj }}
path: ${{ runner.temp }}/release
- name: Downloaded submodule dependencies
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
- name: Link
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
SRC_DIR=$PWD
mkdir ${{runner.temp}}/link-build
cd ${{runner.temp}}/link-build
cmake $SRC_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{runner.temp}}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
rm -rf packages/${{ matrix.package }}
mkdir -p packages/${{ matrix.package }}
mv ${{ runner.temp }}/release/* packages/${{ matrix.package }}/
make bun-link-lld-release copy-to-bun-release-dir-bin
- name: Zip
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
cd ${{runner.temp}}/link-build
cd ${{runner.temp}}/release
chmod +x bun-profile bun
mkdir -p ${{matrix.tag}}-profile/ ${{matrix.tag}}/
mkdir ${{matrix.tag}}-profile
mkdir ${{matrix.tag}}
/usr/bin/strip -S bun
mv bun-profile ${{matrix.tag}}-profile/bun-profile
mv bun ${{matrix.tag}}/bun
@@ -317,11 +371,11 @@ jobs:
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}-profile
path: ${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip
path: ${{runner.temp}}/release/${{matrix.tag}}-profile.zip
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/link-build/${{matrix.tag}}.zip
path: ${{runner.temp}}/release/${{matrix.tag}}.zip
- name: Release
id: release
uses: ncipollo/release-action@v1
@@ -338,7 +392,7 @@ jobs:
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/link-build/${{matrix.tag}}.zip,${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip"
artifacts: "${{runner.temp}}/release/${{matrix.tag}}.zip,${{runner.temp}}/release/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
@@ -361,7 +415,7 @@ jobs:
macOS-test:
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS-link]
needs: [macOS]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
@@ -395,11 +449,7 @@ jobs:
cd ${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
./bun --version
- id: install
name: Install dependencies
run: |

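The link jobs contrasted above assemble the final binary in two different ways: one side runs `make bun-link-lld-release copy-to-bun-release-dir-bin`, the other configures a link-only CMake build that combines the Zig object, the C++ archive, and the vendored dependencies produced by the upstream jobs. The CMake variant, condensed from the Link step above into a standalone sketch (the paths are the same runner.temp locations used there):

# Condensed from the Link step above; a sketch, not a drop-in replacement.
# Inputs come from the three upstream jobs:
#   bun-zig.o           - Zig object artifact
#   bun-cpp-objects.a   - macOS C++ artifact
#   bun-deps/           - vendored dependencies artifact
- name: Link (CMake link-only build)
  run: |
    SRC_DIR=$PWD
    mkdir ${{ runner.temp }}/link-build
    cd ${{ runner.temp }}/link-build
    cmake $SRC_DIR -G Ninja \
      -DCMAKE_BUILD_TYPE=Release \
      -DBUN_LINK_ONLY=1 \
      -DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
      -DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
      -DBUN_DEPS_OUT_DIR="${{ runner.temp }}/bun-deps" \
      -DNO_CONFIGURE_DEPENDS=1
    ninja -v
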
View File

@@ -5,8 +5,9 @@ concurrency:
cancel-in-progress: true
env:
LLVM_VERSION: 16
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
TEST_TAG: bun-test'
on:
push:
@@ -14,8 +15,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -24,8 +23,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -50,129 +47,62 @@ jobs:
# arch: aarch64
# tag: bun-obj-darwin-aarch64
steps:
- uses: actions/checkout@v4
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
- uses: actions/checkout@v3
with:
submodules: recursive
- uses: docker/setup-buildx-action@v2
id: buildx
with:
install: true
- name: Run
run: |
rm -rf ${{runner.temp}}/release
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Compile Zig Object
- name: Build and push
uses: docker/build-push-action@v3
if: runner.arch == 'X64'
with:
context: .
push: false
# This doesnt seem to work
# cache-from: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# cache-to: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# This was used before, but also does not really work
cache-from: type=local,src=/tmp/.buildx-cache-${{matrix.tag}}
cache-to: type=local,dest=/tmp/.buildx-cache-${{matrix.tag}}
cache-from: type=gha
cache-to: type=gha,mode=min
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
BUILDARCH=amd64
BUILD_MACHINE_ARCH=x86_64
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-macos-none
GIT_SHA=${{ github.sha }}
SCCACHE_BUCKET=bun
SCCACHE_REGION=auto
SCCACHE_S3_USE_SSL=true
SCCACHE_ENDPOINT=${{ secrets.CACHE_S3_ENDPOINT }}
AWS_ACCESS_KEY_ID=${{ secrets.CACHE_S3_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
TRIPLET=${{matrix.arch}}-macos-none
GIT_SHA=${{github.sha}}
platforms: linux/amd64
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v3
- name: Build and push
uses: docker/build-push-action@v3
if: runner.arch == 'ARM64'
with:
context: .
push: false
cache-from: type=gha
cache-to: type=gha,mode=min
build-args: |
ARCH=${{ matrix.arch }}
BUILDARCH=arm64
BUILD_MACHINE_ARCH=aarch64
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{matrix.arch}}-macos-none
GIT_SHA=${{github.sha}}
platforms: linux/arm64
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}
path: ${{runner.temp}}/release/bun-zig.o
macOS-dependencies:
name: macOS Dependencies
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 15
strategy:
matrix:
include:
- cpu: nehalem
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Hash submodule versions
run: |
print_data() {
git submodule | grep -v WebKit
llvm-config --version
rustc --version
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "sha=$(print_data | sha1sum | cut -c 1-10)" >> $GITHUB_OUTPUT
id: submodule-versions
- name: Cache submodule dependencies
id: cache-deps-restore
uses: actions/cache/restore@v3
with:
path: ${{runner.temp}}/bun-deps
key: bun-deps-${{ matrix.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Compile submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
env:
CPU_TARGET: ${{ matrix.cpu }}
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
bash ./scripts/clean-dependencies.sh
bash ./scripts/all-dependencies.sh
- name: Cache submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v3
with:
path: ${{runner.temp}}/bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
- name: Upload submodule dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
path: ${{runner.temp}}/release/bun.o
macOS-cpp:
name: macOS C++
runs-on: ${{ matrix.runner }}
@@ -187,66 +117,136 @@ jobs:
obj: bun-obj-darwin-x64-baseline
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: nehalem
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
unzip bun-darwin-x64-baseline.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
# TODO: replace with sccache
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}
- name: Compile C++
env:
CPU_TARGET: ${{ matrix.cpu }}
SOURCE_DIR: ${{ github.workspace }}
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
bash compile-cpp-only.sh -v
brew install ccache rust llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
- name: ccache (dependencies)
uses: hendrikmuhs/ccache-action@v1.2
if: matrix.dependencies
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-dependencies
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-dependencies
- name: ccache (c++)
uses: hendrikmuhs/ccache-action@v1.2
if: matrix.compile_obj
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-obj
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-obj
- name: Download WebKit
if: matrix.compile_obj
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
rm -rf $JSC_BASE_DIR
mkdir -p $JSC_BASE_DIR
curl -L ${{ matrix.webkit_url }} | tar -xz -C $JSC_BASE_DIR --strip-components=1
- name: Compile dependencies
if: matrix.dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $OBJ_DIR $BUN_DEPS_OUT_DIR
make vendor-without-check
- name: Compile C++
if: matrix.compile_obj
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $OBJ_DIR $BUN_DEPS_OUT_DIR
make -j $(sysctl -n hw.ncpu) release-bindings
- name: Upload C++
if: matrix.compile_obj
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
path: ${{ runner.temp }}/bun-cpp-obj
- name: Upload Dependencies
if: matrix.dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
macOS:
name: macOS Link
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macos-object-files, macOS-dependencies]
needs: [macOS-cpp, macos-object-files]
timeout-minutes: 90
permissions: write-all
strategy:
@@ -259,70 +259,113 @@ jobs:
package: bun-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install ccache llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv openssl@1.1 ninja --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
brew install ccache rust llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
unzip bun-darwin-x64-baseline.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
- name: ccache (link)
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-link
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-link
- name: Download WebKit
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
rm -rf $JSC_BASE_DIR
mkdir -p $JSC_BASE_DIR
curl -L ${{ matrix.webkit_url }} | tar -xz -C $JSC_BASE_DIR --strip-components=1
- name: Download C++
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download Zig Object
- name: Download Dependencies
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{ runner.temp }}/bun-deps
- name: Download Object
uses: actions/download-artifact@v3
with:
name: ${{ matrix.obj }}
path: ${{ runner.temp }}/release
- name: Downloaded submodule dependencies
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
- name: Link
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
SRC_DIR=$PWD
mkdir ${{runner.temp}}/link-build
cd ${{runner.temp}}/link-build
cmake $SRC_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{runner.temp}}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
rm -rf packages/${{ matrix.package }}
mkdir -p packages/${{ matrix.package }}
mv ${{ runner.temp }}/release/* packages/${{ matrix.package }}/
make bun-link-lld-release copy-to-bun-release-dir-bin
- name: Zip
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
cd ${{runner.temp}}/link-build
cd ${{runner.temp}}/release
chmod +x bun-profile bun
mkdir -p ${{matrix.tag}}-profile/ ${{matrix.tag}}/
mkdir ${{matrix.tag}}-profile
mkdir ${{matrix.tag}}
/usr/bin/strip -S bun
mv bun-profile ${{matrix.tag}}-profile/bun-profile
mv bun ${{matrix.tag}}/bun
@@ -332,11 +375,11 @@ jobs:
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}-profile
path: ${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip
path: ${{runner.temp}}/release/${{matrix.tag}}-profile.zip
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/link-build/${{matrix.tag}}.zip
path: ${{runner.temp}}/release/${{matrix.tag}}.zip
- name: Release
id: release
uses: ncipollo/release-action@v1
@@ -353,7 +396,7 @@ jobs:
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/link-build/${{matrix.tag}}.zip,${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip"
artifacts: "${{runner.temp}}/release/${{matrix.tag}}.zip,${{runner.temp}}/release/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
@@ -374,7 +417,7 @@ jobs:
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: macOS Test
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
@@ -410,11 +453,7 @@ jobs:
cd ${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
./bun --version
- id: install
name: Install dependencies
run: |

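A pattern repeated in the Install dependencies steps of these macOS jobs is bootstrapping the build with a previously published bun binary: it is downloaded from BUN_DOWNLOAD_URL_BASE (defined in the env block at the top of the workflow), unpacked, and put on PATH before anything is compiled. Condensed from the hunks above, using the same x64-baseline package name those steps use:

# Bootstrap step condensed from the Install dependencies hunks above; the
# step name is illustrative, the commands mirror the workflow.
- name: Fetch a prebuilt bun and put it on PATH
  run: |
    curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
    unzip bun-darwin-x64-baseline.zip
    mkdir -p ${{ runner.temp }}/.bun/bin
    mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
    chmod +x ${{ runner.temp }}/.bun/bin/bun
    echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
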
View File

@@ -5,8 +5,9 @@ concurrency:
cancel-in-progress: true
env:
LLVM_VERSION: 16
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
TEST_TAG: bun-test'
on:
push:
@@ -14,8 +15,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -24,8 +23,6 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -33,8 +30,8 @@ on:
workflow_dispatch:
jobs:
macOS-zig:
name: macOS Zig Object
macos-object-files:
name: macOS Object
runs-on: med-ubuntu
if: github.repository_owner == 'oven-sh'
strategy:
@@ -46,131 +43,66 @@ jobs:
- cpu: haswell
arch: x86_64
tag: bun-obj-darwin-x64
# - cpu: native
# arch: aarch64
# tag: bun-obj-darwin-aarch64
steps:
- uses: actions/checkout@v4
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
- uses: actions/checkout@v3
with:
submodules: recursive
- uses: docker/setup-buildx-action@v2
id: buildx
with:
install: true
- name: Run
run: |
rm -rf ${{runner.temp}}/release
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Compile Zig Object
- name: Build and push
uses: docker/build-push-action@v3
if: runner.arch == 'X64'
with:
context: .
push: false
# This doesnt seem to work
# cache-from: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# cache-to: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# This was used before, but also does not really work
cache-from: type=local,src=/tmp/.buildx-cache-${{matrix.tag}}
cache-to: type=local,dest=/tmp/.buildx-cache-${{matrix.tag}}
cache-from: type=gha
cache-to: type=gha,mode=min
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
BUILDARCH=amd64
BUILD_MACHINE_ARCH=x86_64
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-macos-none
GIT_SHA=${{ github.sha }}
SCCACHE_BUCKET=bun
SCCACHE_REGION=auto
SCCACHE_S3_USE_SSL=true
SCCACHE_ENDPOINT=${{ secrets.CACHE_S3_ENDPOINT }}
AWS_ACCESS_KEY_ID=${{ secrets.CACHE_S3_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
TRIPLET=${{matrix.arch}}-macos-none
GIT_SHA=${{github.sha}}
platforms: linux/amd64
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v3
- name: Build and push
uses: docker/build-push-action@v3
if: runner.arch == 'ARM64'
with:
context: .
push: false
cache-from: type=gha
cache-to: type=gha,mode=min
build-args: |
ARCH=${{ matrix.arch }}
BUILDARCH=arm64
BUILD_MACHINE_ARCH=aarch64
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{matrix.arch}}-macos-none
GIT_SHA=${{github.sha}}
platforms: linux/arm64
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}
path: ${{runner.temp}}/release/bun-zig.o
macOS-dependencies:
name: macOS Dependencies
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 15
strategy:
matrix:
include:
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Hash submodule versions
run: |
print_data() {
git submodule | grep -v WebKit
llvm-config --version
rustc --version
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "sha=$(print_data | sha1sum | cut -c 1-10)" >> $GITHUB_OUTPUT
id: submodule-versions
- name: Cache submodule dependencies
id: cache-deps-restore
uses: actions/cache/restore@v3
with:
path: ${{runner.temp}}/bun-deps
key: bun-deps-${{ matrix.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Compile submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
env:
CPU_TARGET: ${{ matrix.cpu }}
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
bash ./scripts/clean-dependencies.sh
bash ./scripts/all-dependencies.sh
- name: Cache submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v3
with:
path: ${{runner.temp}}/bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
- name: Upload submodule dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
path: ${{runner.temp}}/release/bun.o
macOS-cpp:
name: macOS C++
runs-on: ${{ matrix.runner }}
@@ -179,77 +111,157 @@ jobs:
strategy:
matrix:
include:
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64.zip"
unzip bun-darwin-x64.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
# TODO: replace with sccache
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}
- name: Compile C++
env:
CPU_TARGET: ${{ matrix.cpu }}
SOURCE_DIR: ${{ github.workspace }}
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
bash compile-cpp-only.sh -v
brew install rust ccache llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
- name: Download WebKit
if: matrix.compile_obj
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
rm -rf $JSC_BASE_DIR
mkdir -p $JSC_BASE_DIR
curl -L ${{ matrix.webkit_url }} | tar -xz -C $JSC_BASE_DIR --strip-components=1
- name: ccache (dependencies)
uses: hendrikmuhs/ccache-action@v1.2
if: matrix.dependencies
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-dependencies
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-dependencies
- name: ccache (c++)
uses: hendrikmuhs/ccache-action@v1.2
if: matrix.compile_obj
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-obj
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-obj
- name: Compile dependencies
if: matrix.dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $OBJ_DIR $BUN_DEPS_OUT_DIR
make vendor-without-check
- name: Compile C++
if: matrix.compile_obj
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
mkdir -p $OBJ_DIR $BUN_DEPS_OUT_DIR
make -j $(sysctl -n hw.ncpu) release-bindings
- name: Upload C++
if: matrix.compile_obj
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
path: ${{ runner.temp }}/bun-cpp-obj
- name: Upload Dependencies
if: matrix.dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
macOS:
name: macOS Link
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macOS-zig, macOS-dependencies]
needs: [macOS-cpp, macos-object-files]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
@@ -257,71 +269,105 @@ jobs:
package: bun-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-4/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install ccache llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv openssl@1.1 ninja --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
brew install rust ccache llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
unzip bun-darwin-x64-baseline.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
- name: Download WebKit
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
rm -rf $JSC_BASE_DIR
mkdir -p $JSC_BASE_DIR
curl -L ${{ matrix.webkit_url }} | tar -xz -C $JSC_BASE_DIR --strip-components=1
- name: Download C++
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download Zig Object
- name: Download Dependencies
uses: actions/download-artifact@v3
with:
name: ${{ matrix.tag }}-deps
path: ${{ runner.temp }}/bun-deps
- name: Download Object
uses: actions/download-artifact@v3
with:
name: ${{ matrix.obj }}
path: ${{ runner.temp }}/release
- name: Downloaded submodule dependencies
uses: actions/download-artifact@v3
- name: ccache (link)
uses: hendrikmuhs/ccache-action@v1.2
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
key: ${{ runner.os }}-ccache-${{ matrix.tag }}-link
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}-link
- name: Link
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
SRC_DIR=$PWD
mkdir ${{runner.temp}}/link-build
cd ${{runner.temp}}/link-build
cmake $SRC_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{runner.temp}}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
rm -rf packages/${{ matrix.package }}
mkdir -p packages/${{ matrix.package }}
mv ${{ runner.temp }}/release/* packages/${{ matrix.package }}/
make bun-link-lld-release copy-to-bun-release-dir-bin
- name: Zip
env:
CPU_TARGET: ${{ matrix.cpu }}
JSC_BASE_DIR: ${{ runner.temp }}/bun-webkit
JSC_LIB: ${{ runner.temp }}/bun-webkit/lib
BUN_DEPLOY_DIR: ${{ runner.temp }}/release/bun
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
BUN_RELEASE_DIR: ${{ runner.temp }}/release
WEBKIT_RELEASE_DIR: ${{ runner.temp }}/bun-webkit
WEBKIT_RELEASE_DIR_LTO: ${{ runner.temp }}/bun-webkit
run: |
cd ${{runner.temp}}/link-build
cd ${{runner.temp}}/release
chmod +x bun-profile bun
mkdir -p ${{matrix.tag}}-profile/ ${{matrix.tag}}/
mkdir ${{matrix.tag}}-profile
mkdir ${{matrix.tag}}
/usr/bin/strip -S bun
mv bun-profile ${{matrix.tag}}-profile/bun-profile
mv bun ${{matrix.tag}}/bun
@@ -331,11 +377,11 @@ jobs:
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}-profile
path: ${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip
path: ${{runner.temp}}/release/${{matrix.tag}}-profile.zip
- uses: actions/upload-artifact@v3
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/link-build/${{matrix.tag}}.zip
path: ${{runner.temp}}/release/${{matrix.tag}}.zip
- name: Release
id: release
uses: ncipollo/release-action@v1
@@ -352,7 +398,7 @@ jobs:
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/link-build/${{matrix.tag}}.zip,${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip"
artifacts: "${{runner.temp}}/release/${{matrix.tag}}.zip,${{runner.temp}}/release/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
@@ -409,11 +455,7 @@ jobs:
cd ${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
./bun --version
- id: install
name: Install dependencies
run: |

View File

@@ -2,7 +2,7 @@ name: bun-release
concurrency: release
env:
BUN_VERSION: ${{ github.event.inputs.tag || github.event.release.tag_name || 'canary' }}
BUN_LATEST: ${{ (github.event.inputs.is-latest || github.event.release.tag_name) && 'true' || 'false' }}
BUN_LATEST: ${{ github.event.inputs.is-latest || github.event.release.prerelease == 'false' }}
on:
release:
types:
@@ -152,7 +152,7 @@ jobs:
matrix:
include:
- variant: debian
suffix: ""
suffix: ''
- variant: debian
suffix: -debian
- variant: slim

View File

@@ -1,339 +0,0 @@
name: bun-windows-x64
concurrency:
group: bun-windows-x64-${{ github.ref }}
cancel-in-progress: true
env:
# note: in other files, this version is only the major version, but for windows it is the full version
LLVM_VERSION: 16.0.6
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
cpu: native
arch: x86_64
tag: bun-windows-x64
# TODO: wire this up to workflow_dispatch.
# GitHub's expression syntax makes it hard to default this to true
canary: true
on:
push:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# inputs:
# is-canary:
# type: boolean
# description: Is Canary Build?
# default: true
jobs:
windows-zig:
name: Zig Build
runs-on: med-ubuntu
timeout-minutes: 60
if: github.repository_owner == 'oven-sh'
steps:
- uses: actions/checkout@v4
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
id: buildx
with:
install: true
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Calculate Canary Revision
if: ${{ env.canary == 'true' }}
id: canary
run: |
echo "canary_revision=$(GITHUB_TOKEN="${{ secrets.GITHUB_TOKEN }}" bash ./scripts/calculate-canary-revision.sh --raw)" >> $GITHUB_OUTPUT
- name: Compile Zig Object
uses: docker/build-push-action@v3
if: runner.arch == 'X64'
with:
context: .
push: false
# This doesn't seem to work
# cache-from: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# cache-to: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ env.arch }}
CPU_TARGET=${{ env.cpu }}
TRIPLET=${{ env.arch }}-windows-msvc
GIT_SHA=${{ github.sha }}
CANARY=${{ env.canary == 'true' && steps.canary.outputs.canary_revision || '0' }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v3
with:
name: ${{ env.tag }}-zig
path: ${{runner.temp}}/release/bun-zig.o
windows-dependencies:
name: Dependencies
runs-on: windows
timeout-minutes: 60
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Clone Submodules
run: .\scripts\update-submodules.ps1
- name: Hash submodule versions
shell: pwsh
run: |
$data = "$(& {
git submodule | Where-Object { $_ -notmatch 'WebKit' }
clang --version
rustc --version
Get-Content -Path (Get-ChildItem -Path 'scripts/build*.sh', 'scripts/all-dependencies.sh' | Sort-Object -Property Name).FullName | Out-String
})"
$hash = ( -join ((New-Object -TypeName System.Security.Cryptography.SHA1CryptoServiceProvider).ComputeHash([System.Text.Encoding]::UTF8.GetBytes($data)) | ForEach-Object { $_.ToString("x2") } )).Substring(0, 10)
echo "sha=${hash}" >> $env:GITHUB_OUTPUT
id: submodule-versions
- name: Try fetch dependencies
id: cache-deps-restore
uses: actions/cache/restore@v3
with:
path: bun-deps
key: bun-deps-${{ env.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Install LLVM ${{ env.LLVM_VERSION }}
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- name: Install Ninja
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
run: choco install -y ninja
- name: Build Dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
run: |
.\scripts\env.ps1
Invoke-WebRequest -Uri "https://www.nasm.us/pub/nasm/releasebuilds/2.16.01/win64/nasm-2.16.01-win64.zip" -OutFile nasm.zip
Expand-Archive nasm.zip (mkdir -Force "nasm")
$Nasm = (Get-ChildItem "nasm")
$env:Path += ";${Nasm}"
$env:BUN_DEPS_OUT_DIR = (mkdir -Force "./bun-deps")
.\scripts\all-dependencies.ps1
- name: Upload Dependencies
uses: actions/upload-artifact@v3
with:
name: ${{ env.tag }}-deps
path: bun-deps/
- name: Cache Dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v3
with:
path: bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
windows-codegen:
name: Codegen
runs-on: ubuntu-latest
timeout-minutes: 10
if: github.repository_owner == 'oven-sh'
steps:
- uses: actions/checkout@v4
- run: |
curl -fsSL $BUN_DOWNLOAD_URL_BASE/bun-linux-x64.zip > bun.zip
unzip bun.zip
export PATH="$PWD/bun-linux-x64:$PATH"
./scripts/cross-compile-codegen.sh win32 x64
# Sort of a hack to do this step in the codegen stage
- name: Calculate Canary Revision
if: ${{ env.canary == 'true' }}
run: |
echo "canary_revision=$(GITHUB_TOKEN="${{ secrets.GITHUB_TOKEN }}" bash ./scripts/calculate-canary-revision.sh --raw)" > build-codegen-win32-x64/.canary_revision
- uses: actions/upload-artifact@v3
with:
name: ${{ env.tag }}-codegen
path: build-codegen-win32-x64/
windows-cpp:
name: C++ Build
needs: [windows-codegen]
runs-on: windows
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
steps:
- uses: actions/checkout@v4
- uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- run: choco install -y ninja
- name: Download Codegen
uses: actions/download-artifact@v3
with:
name: ${{ env.tag }}-codegen
path: build
- name: Build C++
run: |
# Using SCCache is blocked by
# https://github.com/mozilla/sccache/issues/1843
# https://github.com/mozilla/sccache/pull/1856
# $sczip = "sccache-v0.6.0-x86_64-pc-windows-msvc"
# Invoke-WebRequest -Uri "https://github.com/mozilla/sccache/releases/download/v0.6.0/${sczip}.zip" -OutFile "${sczip}.zip"
# Expand-Archive "${sczip}.zip"
# $env:SCCACHE_BUCKET="bun"
# $env:SCCACHE_REGION="auto"
# $env:SCCACHE_S3_USE_SSL="true"
# $env:SCCACHE_ENDPOINT="${{ secrets.CACHE_S3_ENDPOINT }}"
# $env:AWS_ACCESS_KEY_ID="${{ secrets.CACHE_S3_ACCESS_KEY_ID }}"
# $env:AWS_SECRET_ACCESS_KEY="${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }}"
# $SCCACHE="$PWD/${sczip}/${sczip}/sccache.exe"
$CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
.\scripts\env.ps1
.\scripts\update-submodules.ps1
.\scripts\build-libuv.ps1 -CloneOnly $True
cd build
# "-DCCACHE_PROGRAM=${SCCACHE}"
# TODO(@paperdave): pass the proper revision of canary here. Without it,
# the properties window will display the wrong version.
# Not really a big deal for the time being; should be resolved before release.
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
-DNO_CODEGEN=1 `
-DNO_CONFIGURE_DEPENDS=1 `
"-DCANARY=${CANARY_REVISION}" `
-DBUN_CPP_ONLY=1
if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
.\compile-cpp-only.ps1 -v
if ($LASTEXITCODE -ne 0) { throw "C++ compilation failed" }
- uses: actions/upload-artifact@v3
with:
name: ${{ env.tag }}-cpp
path: build/bun-cpp-objects.a
windows-link:
name: Link
needs: [windows-dependencies, windows-codegen, windows-cpp, windows-zig]
runs-on: windows-latest
if: github.repository_owner == 'oven-sh'
timeout-minutes: 30
permissions: write-all
steps:
- uses: actions/checkout@v4
- uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- run: choco install -y ninja
- name: Download Codegen
uses: actions/download-artifact@v3
with:
name: ${{ env.tag }}-codegen
path: build
- name: Download Dependencies
uses: actions/download-artifact@v3
with:
name: ${{ env.tag }}-deps
path: bun-deps
- name: Download Zig Object
uses: actions/download-artifact@v3
with:
name: ${{ env.tag }}-zig
path: bun-zig
- name: Download C++ Objects
uses: actions/download-artifact@v3
with:
name: ${{ env.tag }}-cpp
path: bun-cpp
- name: Link
run: |
.\scripts\update-submodules.ps1
.\scripts\env.ps1
Set-Location build
$CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
-DNO_CODEGEN=1 `
-DNO_CONFIGURE_DEPENDS=1 `
"-DCANARY=${CANARY_REVISION}" `
-DBUN_LINK_ONLY=1 `
"-DBUN_DEPS_OUT_DIR=$(Resolve-Path ../bun-deps)" `
"-DBUN_CPP_ARCHIVE=$(Resolve-Path ../bun-cpp/bun-cpp-objects.a)" `
"-DBUN_ZIG_OBJ=$(Resolve-Path ../bun-zig/bun-zig.o)"
if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
ninja -v
if ($LASTEXITCODE -ne 0) { throw "Link failed!" }
- name: Package
run: |
$Dist = mkdir -Force "${{ env.tag }}"
cp -r build\bun.exe "$Dist\bun.exe"
Compress-Archive $Dist ${{ env.tag }}.zip
- uses: actions/upload-artifact@v3
with:
name: ${{ env.tag }}
path: ${{ env.tag }}.zip
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{env.tag}}.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ env.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})

View File

@@ -1,45 +0,0 @@
name: autofix.ci # Must be named this for autofix.ci to work
on:
workflow_dispatch:
pull_request:
push:
branches:
- main
env:
ZIG_VERSION: 0.12.0-dev.1604+caae40c21
permissions:
contents: read
jobs:
format:
name: format
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
sparse-checkout: |
src
packages
test
bench
- name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: latest
- name: Setup Zig
uses: goto-bus-stop/setup-zig@c7b6cdd3adba8f8b96984640ff172c37c93f73ee
with:
version: ${{ env.ZIG_VERSION }}
- name: Install Dependencies
run: |
bun install
- name: Format
run: |
bun fmt
bun fmt:zig
- name: Commit # https://autofix.ci/
uses: autofix-ci/action@d3e591514b99d0fca6779455ff8338516663f7cc

78
.github/workflows/prettier-fmt.yml vendored Normal file
View File

@@ -0,0 +1,78 @@
name: prettier
on:
pull_request:
branches:
- main
- jarred/test-actions
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
prettier-fmt:
name: prettier
runs-on: ubuntu-latest
permissions:
pull-requests: write
outputs:
prettier_fmt_errs: ${{ steps.fmt.outputs.prettier_fmt_errs }}
steps:
- uses: actions/checkout@v3
with:
submodules: recursive
- id: setup
name: Setup
uses: oven-sh/setup-bun@v1
with:
bun-version: latest
- id: install
name: Install prettier
run: bun install
- name: Run prettier
id: fmt
run: |
rm -f .failed
bun prettier --check "./bench/**/*.{ts,tsx,js,jsx,mjs}" "./test/**/*.{ts,tsx,js,jsx,mjs}" "./src/**/*.{ts,tsx,js,jsx}" --config .prettierrc.cjs 2> prettier-fmt.err > prettier-fmt1.err || echo 'failed' > .failed
if [ -s .failed ]; then
delimiter="$(openssl rand -hex 8)"
echo "prettier_fmt_errs<<${delimiter}" >> "${GITHUB_OUTPUT}"
cat prettier-fmt.err >> "${GITHUB_OUTPUT}"
cat prettier-fmt1.err >> "${GITHUB_OUTPUT}"
echo "${delimiter}" >> "${GITHUB_OUTPUT}"
fi
- name: Comment on PR
if: steps.fmt.outputs.prettier_fmt_errs != ''
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: prettier-fmt
message: |
❌ @${{ github.actor }} `prettier` reported errors
```js
${{ steps.fmt.outputs.prettier_fmt_errs }}
```
To fix this manually as a one-off, run:
```sh
bun fmt
```
You might need to run `bun install` locally and configure your text editor to [auto-format on save](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode).
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- name: Uncomment on PR
if: steps.fmt.outputs.prettier_fmt_errs == ''
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: prettier-fmt
mode: upsert
create_if_not_exists: false
message: |
✅ `prettier` errors have been resolved. Thank you.
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- name: Fail the job
if: steps.fmt.outputs.prettier_fmt_errs != ''
run: exit 1

89
.github/workflows/zig-fmt.yml vendored Normal file
View File

@@ -0,0 +1,89 @@
name: zig-fmt
env:
ZIG_VERSION: 0.12.0-dev.163+6780a6bbf
on:
pull_request:
branches:
- main
- jarred/test-actions
paths:
- "src/**/*.zig"
- "src/*.zig"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
zig-fmt:
name: zig fmt
runs-on: ubuntu-latest
permissions:
pull-requests: write
outputs:
zig_fmt_errs: ${{ steps.fmt.outputs.zig_fmt_errs }}
steps:
- uses: actions/checkout@v3
with:
submodules: recursive
- name: Install zig
run: |
curl https://ziglang.org/builds/zig-linux-x86_64-${{env.ZIG_VERSION}}.tar.xz -L -o zig.tar.xz
tar -xf zig.tar.xz
echo "$(pwd)/zig-linux-x86_64-${{env.ZIG_VERSION}}" >> $GITHUB_PATH
- name: Run zig fmt
id: fmt
run: |
zig fmt --check src/*.zig src/**/*.zig 2> zig-fmt.err > zig-fmt.err2 || echo "Failed"
delimiter="$(openssl rand -hex 8)"
echo "zig_fmt_errs<<${delimiter}" >> "${GITHUB_OUTPUT}"
if [ -s zig-fmt.err ]; then
echo "// The following errors occurred:" >> "${GITHUB_OUTPUT}"
cat zig-fmt.err >> "${GITHUB_OUTPUT}"
fi
if [ -s zig-fmt.err2 ]; then
echo "// The following files were not formatted:" >> "${GITHUB_OUTPUT}"
cat zig-fmt.err2 >> "${GITHUB_OUTPUT}"
fi
echo "${delimiter}" >> "${GITHUB_OUTPUT}"
- name: Comment on PR
if: steps.fmt.outputs.zig_fmt_errs != ''
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: zig-fmt
message: |
❌ @${{ github.actor }} `zig fmt` reported errors. Consider configuring your text editor to [auto-format on save](https://github.com/ziglang/vscode-zig)
```zig
// # zig fmt --check src/*.zig src/**/*.zig
${{ steps.fmt.outputs.zig_fmt_errs }}
```
To fix this manually as a one-off, run:
```sh
zig fmt src/*.zig src/**/*.zig
```
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
<sup>zig v${{env.ZIG_VERSION}}</sup>
- name: Uncomment on PR
if: steps.fmt.outputs.zig_fmt_errs == ''
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: zig-fmt
mode: upsert
create_if_not_exists: false
message: |
✅ `zig fmt` errors have been resolved. Thank you.
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
<sup>zig v${{env.ZIG_VERSION}}</sup>
- name: Fail the job
if: steps.fmt.outputs.zig_fmt_errs != ''
run: exit 1

27
.gitignore vendored
View File

@@ -110,7 +110,7 @@ misctools/machbench
*.big
.eslintcache
/bun-webkit
bun-webkit
src/deps/c-ares/build
src/bun.js/bindings-obj
@@ -135,28 +135,3 @@ make-dev-stats.csv
.uuid
tsconfig.tsbuildinfo
test/js/bun/glob/fixtures
*.lib
*.pdb
CMakeFiles
build.ninja
.ninja_deps
.ninja_log
CMakeCache.txt
cmake_install.cmake
compile_commands.json
*.lib
x64
**/*.vcxproj*
**/*.sln*
**/*.dir
**/*.pdb
/.webkit-cache
/.cache
/src/deps/libuv
/build-*/
.vs

49
.gitmodules vendored
View File

@@ -1,3 +1,10 @@
[submodule "src/deps/picohttpparser"]
path = src/deps/picohttpparser
url = https://github.com/h2o/picohttpparser.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/javascript/jsc/WebKit"]
path = src/bun.js/WebKit
url = https://github.com/oven-sh/WebKit.git
@@ -6,13 +13,6 @@ depth = 1
update = none
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/picohttpparser"]
path = src/deps/picohttpparser
url = https://github.com/h2o/picohttpparser.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/mimalloc"]
path = src/deps/mimalloc
url = https://github.com/Jarred-Sumner/mimalloc.git
@@ -56,30 +56,15 @@ depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/c-ares"]
path = src/deps/c-ares
url = https://github.com/c-ares/c-ares.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/c-ares
url = https://github.com/c-ares/c-ares.git
[submodule "src/deps/zstd"]
path = src/deps/zstd
url = https://github.com/facebook/zstd.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/zstd
url = https://github.com/facebook/zstd.git
ignore = dirty
[submodule "src/deps/base64"]
path = src/deps/base64
url = https://github.com/aklomp/base64.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/ls-hpack"]
path = src/deps/ls-hpack
url = https://github.com/litespeedtech/ls-hpack.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/base64
url = https://github.com/aklomp/base64.git
ignore = dirty
depth = 1
shallow = true

View File

@@ -0,0 +1,21 @@
// I would have made this a bash script, but there isn't an easy cross-platform way
// to track sub-second time in bash.
import fs from "fs";
const start = Date.now() + 5;
const result = Bun.spawnSync(process.argv.slice(2), {
stdio: ["inherit", "inherit", "inherit"],
});
const end = Date.now();
const diff = (Math.max(Math.round(end - start), 0) / 1000).toFixed(3);
const success = result.exitCode === 0;
try {
const line = `${new Date().toISOString()}, ${success ? "success" : "fail"}, ${diff}\n`;
if (fs.existsSync(".scripts/make-dev-stats.csv")) {
fs.appendFileSync(".scripts/make-dev-stats.csv", line);
} else {
fs.writeFileSync(".scripts/make-dev-stats.csv", line);
}
} catch {
// Ignore
}
process.exit(result.exitCode);

13
.scripts/postinstall.sh Executable file
View File

@@ -0,0 +1,13 @@
#!/bin/bash
set -euxo pipefail
# if bun-webkit node_modules directory exists
if [ -d ./node_modules/bun-webkit ]; then
rm -f bun-webkit
# get the first matching bun-webkit-* directory name
ln -s ./node_modules/$(ls ./node_modules | grep bun-webkit- | head -n 1) ./bun-webkit
fi
# sets up vscode C++ intellisense
rm -f .vscode/clang++
ln -s $(which clang++-16 || which clang++) .vscode/clang++ 2>/dev/null

View File

@@ -1,64 +1,16 @@
{
"configurations": [
{
"name": "Debug",
"name": "Mac",
"forcedInclude": ["${workspaceFolder}/src/bun.js/bindings/root.h"],
"includePath": [
"${workspaceFolder}/build/bun-webkit/include",
"${workspaceFolder}/build/codegen",
"${workspaceFolder}/src/bun.js/bindings/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/modules/",
"${workspaceFolder}/src/js/builtins/",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/"
],
"browse": {
"path": [
"${workspaceFolder}/build/bun-webkit/include",
"${workspaceFolder}/src/bun.js/bindings",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/js/builtins/*",
"${workspaceFolder}/src/bun.js/modules/*",
"${workspaceFolder}/src/deps/*",
"${workspaceFolder}/src/deps/boringssl/include/*",
"${workspaceFolder}/packages/bun-usockets/*",
"${workspaceFolder}/packages/bun-uws/*",
"${workspaceFolder}/src/napi/*"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb"
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
"STATICALLY_LINKED_WITH_WTF=1",
"BUILDING_WITH_CMAKE=1",
"NOMINMAX",
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1"
],
"macFrameworkPath": [],
"compilerPath": "${workspaceFolder}/.vscode/clang++",
"cStandard": "c17",
"cppStandard": "c++20"
},
{
"name": "BunWithJSCDebug",
"forcedInclude": ["${workspaceFolder}/src/bun.js/bindings/root.h"],
"includePath": [
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/",
"${workspaceFolder}/../webkit-build/include/",
"${workspaceFolder}/bun-webkit/include/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/WTF/Headers",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/bmalloc/Headers/",
"${workspaceFolder}/src/bun.js/bindings/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
@@ -74,11 +26,13 @@
],
"browse": {
"path": [
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/**",
"${workspaceFolder}/../webkit-build/include/",
"${workspaceFolder}/bun-webkit/include/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/napi/*",
@@ -95,7 +49,7 @@
"${workspaceFolder}/src/napi"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb_debug"
"databaseFilename": ".vscode/cppdb"
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
@@ -105,7 +59,7 @@
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"ASSERT_ENABLED=0",
"DU_DISABLE_RENAMING=1"
],
"macFrameworkPath": [],

13
.vscode/launch.json generated vendored
View File

@@ -82,7 +82,7 @@
"request": "launch",
"name": "bun test [*]",
"program": "bun-debug",
"args": ["test", "js/node"],
"args": ["test"],
"cwd": "${workspaceFolder}/test",
"env": {
"FORCE_COLOR": "1",
@@ -96,7 +96,7 @@
"request": "launch",
"name": "bun test [*] (fast)",
"program": "bun-debug",
"args": ["test", "js"],
"args": ["test"],
// The cwd here must be the same as in CI, or you will cause test failures that only happen in CI.
"cwd": "${workspaceFolder}/test",
"env": {
@@ -124,7 +124,7 @@
"request": "launch",
"name": "bun run [file]",
"program": "bun-debug",
"args": ["run", "${file}"],
"args": ["run", "${file}", "${file}"],
"cwd": "${fileDirname}",
"env": {
"FORCE_COLOR": "1",
@@ -307,10 +307,13 @@
"name": "bun install",
"program": "bun-debug",
"args": ["install"],
"cwd": "/Users/jarred/Build/worky",
"cwd": "${fileDirname}",
"console": "internalConsole",
"env": {}
"env": {
"BUN_DEBUG_QUIET_LOGS": "1"
}
},
{
"type": "lldb",
"request": "launch",

93
.vscode/settings.json vendored
View File

@@ -7,36 +7,45 @@
"search.followSymlinks": false,
"search.useIgnoreFiles": true,
"zig.buildOnSave": false,
"zig.formattingProvider": "zls",
// We do this until we upgrade to latest Zig so that zls doesn't break our code.
"zig.formattingProvider": "extension",
"zig.buildArgs": ["obj", "-Dfor-editor"],
"zig.buildOption": "build",
"zig.buildFilePath": "${workspaceFolder}/build.zig",
"zig.initialSetupDone": true,
"editor.formatOnSave": true,
"[zig]": {
"editor.tabSize": 4,
"editor.useTabStops": false,
"editor.defaultFormatter": "ziglang.vscode-zig"
"editor.defaultFormatter": "ziglang.vscode-zig",
"editor.formatOnSave": true
},
"[ts]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[js]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"zig.zls.enableInlayHints": false,
"zig.zls.enabled": true,
"git.ignoreSubmodules": true,
"[jsx]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[tsx]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[yaml]": {
"editor.formatOnSave": true
},
"[yaml]": {},
"[markdown]": {
"editor.unicodeHighlight.ambiguousCharacters": false,
"editor.unicodeHighlight.invisibleCharacters": false,
"diffEditor.ignoreTrimWhitespace": false,
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true,
"editor.wordWrap": "on",
"editor.quickSuggestions": {
"comments": "off",
@@ -56,6 +65,8 @@
"**/*.xcscheme": true,
"**/*.pem": true,
"**/*.xcodeproj": true,
"test/snapshots": true,
"test/snapshots-no-hmr": true,
"src/bun.js/WebKit": true,
"src/deps/libarchive": true,
"src/deps/mimalloc": true,
@@ -68,26 +79,29 @@
"src/deps/c-ares": true,
"src/deps/tinycc": true,
"src/deps/zstd": true,
"**/*.i": true,
"packages/bun-uws/fuzzing/seed-corpus/**/*": true
"test/snippets/package-json-exports/_node_modules_copy": true,
"src/js/out": true,
"packages/bun-uws/fuzzing/seed-corpus/": true,
"**/*.dep": true,
"**/CMakeFiles": true
},
"C_Cpp.files.exclude": {
"**/.vscode": true,
"WebKit/JSTests": true,
"WebKit/Tools": true,
"WebKit/WebDriverTests": true,
"WebKit/WebKit.xcworkspace": true,
"WebKit/WebKitLibraries": true,
"WebKit/Websites": true,
"WebKit/resources": true,
"WebKit/LayoutTests": true,
"WebKit/ManualTests": true,
"WebKit/PerformanceTests": true,
"WebKit/WebKitLegacy": true,
"WebKit/WebCore": true,
"WebKit/WebDriver": true,
"WebKit/WebKitBuild": true,
"WebKit/WebInspectorUI": true
"src/bun.js/WebKit/JSTests": true,
"src/bun.js/WebKit/Tools": true,
"src/bun.js/WebKit/WebDriverTests": true,
"src/bun.js/WebKit/WebKit.xcworkspace": true,
"src/bun.js/WebKit/WebKitLibraries": true,
"src/bun.js/WebKit/Websites": true,
"src/bun.js/WebKit/resources": true,
"src/bun.js/WebKit/LayoutTests": true,
"src/bun.js/WebKit/ManualTests": true,
"src/bun.js/WebKit/PerformanceTests": true,
"src/bun.js/WebKit/WebKitLegacy": true,
"src/bun.js/WebKit/WebCore": true,
"src/bun.js/WebKit/WebDriver": true,
"src/bun.js/WebKit/WebKitBuild": true,
"src/bun.js/WebKit/WebInspectorUI": true
},
"[cpp]": {
"editor.defaultFormatter": "xaver.clang-format"
@@ -178,12 +192,20 @@
"set": "cpp",
"__memory": "cpp",
"memory_resource": "cpp",
"resource.h": "c",
"sysinfo.h": "c",
"*.tcc": "cpp",
"list": "cpp",
"shared_mutex": "cpp",
"cinttypes": "cpp",
"variant": "cpp",
"sysctl.h": "c",
"interface_adresses.h": "c",
"interface_addresses.h": "c",
"ctype.h": "c",
"ethernet.h": "c",
"inet.h": "c",
"packet.h": "c",
"queue": "cpp",
"compare": "cpp",
"concepts": "cpp",
@@ -199,24 +221,9 @@
"regex": "cpp",
"span": "cpp",
"valarray": "cpp",
"codecvt": "cpp",
"xtr1common": "cpp",
"stop_token": "cpp",
"xfacet": "cpp",
"xhash": "cpp",
"xiosbase": "cpp",
"xlocale": "cpp",
"xlocbuf": "cpp",
"xlocinfo": "cpp",
"xlocmes": "cpp",
"xlocmon": "cpp",
"xlocnum": "cpp",
"xloctime": "cpp",
"xmemory": "cpp",
"xstring": "cpp",
"xtree": "cpp",
"xutility": "cpp"
"codecvt": "cpp"
},
"cmake.configureOnOpen": false,
"C_Cpp.errorSquiggles": "enabled",
"eslint.workingDirectories": ["packages/bun-types"],
"typescript.tsdk": "node_modules/typescript/lib"

29
.vscode/tasks.json vendored
View File

@@ -2,10 +2,33 @@
"version": "2.0.0",
"tasks": [
{
"label": "Rebuild Debug",
"command": "ninja",
"args": ["-Cbuild"],
"label": "build",
"type": "process",
"command": "zig",
"args": ["build"],
"presentation": {
"echo": true,
"reveal": "silent",
"focus": false,
"panel": "shared",
"showReuseMessage": false,
"clear": false
},
"group": {
"kind": "build",
"isDefault": true
}
},
{
"label": "run",
"type": "process",
"command": "zig",
"args": ["run", "${file}"],
"group": "build",
"presentation": {
"showReuseMessage": false,
"clear": true
}
}
]
}

File diff suppressed because it is too large

View File

@@ -10,7 +10,7 @@ Today (February 2023), Bun's codebase has five distinct parts:
- JavaScript, JSX, & TypeScript transpiler, module resolver, and related code
- JavaScript runtime ([`src/bun.js/`](src/bun.js/))
- JavaScript runtime bindings ([`src/bun.js/bindings/**/*.cpp`](src/bun.js/bindings/))
- JavaScript runtime bindings ([`src/bun.zig/bindings/**/*.cpp`](src/bun.zig/bindings/))
- Package manager ([`src/install/`](src/install/))
- Shared utilities ([`src/string_immutable.zig`](src/string_immutable.zig))
@@ -18,7 +18,7 @@ The JavaScript transpiler & module resolver is mostly independent from the runti
## Getting started
Please refer to [Bun's Development Guide](https://bun.sh/docs/project/contributing) to get your dev environment setup!
Please refer to [Bun's Development Guide](https://bun.sh/docs/project/development) to get your dev environment setup!
## Memory management in Bun

File diff suppressed because it is too large

111
Makefile
View File

@@ -39,6 +39,7 @@ endif
MIN_MACOS_VERSION ?= $(DEFAULT_MIN_MACOS_VERSION)
BUN_BASE_VERSION = 1.0
CI ?= false
AR=
@@ -65,7 +66,7 @@ PACKAGE_JSON_VERSION = $(BUN_BASE_VERSION).$(BUILD_ID)
BUN_BUILD_TAG = bun-v$(PACKAGE_JSON_VERSION)
BUN_RELEASE_BIN = $(PACKAGE_DIR)/bun
PRETTIER ?= $(shell which prettier 2>/dev/null || echo "./node_modules/.bin/prettier")
ESBUILD = "$(shell which esbuild 2>/dev/null || echo "./node_modules/.bin/esbuild")"
ESBUILD = $(shell which esbuild 2>/dev/null || echo "./node_modules/.bin/esbuild")
DSYMUTIL ?= $(shell which dsymutil 2>/dev/null || which dsymutil-15 2>/dev/null)
WEBKIT_DIR ?= $(realpath src/bun.js/WebKit)
WEBKIT_RELEASE_DIR ?= $(WEBKIT_DIR)/WebKitBuild/Release
@@ -73,7 +74,7 @@ WEBKIT_DEBUG_DIR ?= $(WEBKIT_DIR)/WebKitBuild/Debug
WEBKIT_RELEASE_DIR_LTO ?= $(WEBKIT_DIR)/WebKitBuild/ReleaseLTO
NPM_CLIENT = "$(shell which bun 2>/dev/null || which npm 2>/dev/null)"
NPM_CLIENT ?= $(shell which bun 2>/dev/null || which npm 2>/dev/null)
ZIG ?= $(shell which zig 2>/dev/null || echo -e "error: Missing zig. Please make sure zig is in PATH. Or set ZIG=/path/to-zig-executable")
# We must use the same compiler version for the JavaScriptCore bindings and JavaScriptCore
@@ -186,6 +187,11 @@ BUN_CFLAGS = $(MACOS_MIN_FLAG) $(MARCH_NATIVE) $(OPTIMIZATION_LEVEL) -fno-excep
BUN_TMP_DIR := /tmp/make-bun
CFLAGS=$(CFLAGS_WITHOUT_MARCH) $(MARCH_NATIVE)
DEFAULT_USE_BMALLOC := 1
USE_BMALLOC ?= DEFAULT_USE_BMALLOC
# Set via postinstall
ifeq (,$(realpath $(JSC_BASE_DIR)))
JSC_BASE_DIR = $(realpath $(firstword $(wildcard bun-webkit)))
@@ -374,7 +380,9 @@ ICU_FLAGS ?=
# Ideally, we could just look up the linker search paths
ifeq ($(OS_NAME),linux)
LIB_ICU_PATH ?= $(JSC_LIB)
ICU_FLAGS += $(LIB_ICU_PATH)/libicuuc.a $(LIB_ICU_PATH)/libicudata.a $(LIB_ICU_PATH)/libicui18n.a
ICU_FLAGS += $(LIB_ICU_PATH)/libicuuc.a $(LIB_ICU_PATH)/libicudata.a $(LIB_ICU_PATH)/libicui18n.a
else
LIB_ICU_PATH ?= $(BUN_DEPS_DIR)
endif
ifeq ($(OS_NAME),darwin)
@@ -455,8 +463,7 @@ ARCHIVE_FILES_WITHOUT_LIBCRYPTO = $(MINIMUM_ARCHIVE_FILES) \
-lusockets \
-lcares \
-lzstd \
$(BUN_DEPS_OUT_DIR)/libuwsockets.o \
$(BUN_DEPS_OUT_DIR)/liblshpack.a
$(BUN_DEPS_OUT_DIR)/libuwsockets.o
ARCHIVE_FILES = $(ARCHIVE_FILES_WITHOUT_LIBCRYPTO)
@@ -750,24 +757,14 @@ wasm: api mimalloc-wasm build-obj-wasm-small
build-obj-safe:
$(ZIG) build obj -Doptimize=ReleaseSafe -Dcpu="$(CPU_TARGET)"
UWS_CC_FLAGS = -pthread -DLIBUS_USE_OPENSSL=1 -DUWS_HTTPRESPONSE_NO_WRITEMARK=1 -DLIBUS_USE_BORINGSSL=1 -DWITH_BORINGSSL=1 -Wpedantic -Wall -Wextra -Wsign-conversion -Wconversion $(UWS_INCLUDE) -DUWS_WITH_PROXY
UWS_CC_FLAGS = -pthread -DLIBUS_USE_OPENSSL=1 -DUWS_HTTPRESPONSE_NO_WRITEMARK=1 -DLIBUS_USE_BORINGSSL=1 -DWITH_BORINGSSL=1 -Wpedantic -Wall -Wextra -Wsign-conversion -Wconversion $(UWS_INCLUDE) -DUWS_WITH_PROXY
UWS_CXX_FLAGS = $(UWS_CC_FLAGS) -std=$(CXX_VERSION) -fno-exceptions -fno-rtti
UWS_LDFLAGS = -I$(BUN_DEPS_DIR)/boringssl/include -I$(ZLIB_INCLUDE_DIR)
USOCKETS_DIR = $(BUN_DIR)/packages/bun-usockets
USOCKETS_SRC_DIR = $(USOCKETS_DIR)/src
LSHPACK_SRC_DIR = $(BUN_DEPS_DIR)/ls-hpack
LSHPACK_CC_FLAGS = -DXXH_HEADER_NAME="<xxhash.h>"
LSHPACK_LDFLAGS = -I$(LSHPACK_SRC_DIR) -I$(LSHPACK_SRC_DIR)/deps/xxhash
lshpack:
rm -rf $(LSHPACK_SRC_DIR)/*.i $(LSHPACK_SRC_DIR)/*.bc $(LSHPACK_SRC_DIR)/*.o $(LSHPACK_SRC_DIR)/*.s $(LSHPACK_SRC_DIR)/*.ii $(LSHPACK_SRC_DIR)/*.s
cd $(LSHPACK_SRC_DIR) && $(CC_WITH_CCACHE) -I$(LSHPACK_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(LSHPACK_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/lshpack/src $(LSHPACK_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(LSHPACK_SRC_DIR)/lshpack.c) $(wildcard $(LSHPACK_SRC_DIR)/deps/**/*.c)
cd $(LSHPACK_SRC_DIR) && $(AR) rcvs $(BUN_DEPS_OUT_DIR)/liblshpack.a $(LSHPACK_SRC_DIR)/*.{o,bc}
usockets:
rm -rf $(USOCKETS_DIR)/*.i $(USOCKETS_DIR)/*.bc $(USOCKETS_DIR)/*.o $(USOCKETS_DIR)/*.s $(USOCKETS_DIR)/*.ii $(USOCKETS_DIR)/*.s $(BUN_DEPS_OUT_DIR)/libusockets.a
rm -rf $(USOCKETS_DIR)/*.i $(USOCKETS_DIR)/*.bc $(USOCKETS_DIR)/*.o $(USOCKETS_DIR)/*.s $(USOCKETS_DIR)/*.ii $(USOCKETS_DIR)/*.s
cd $(USOCKETS_DIR) && $(CC_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(UWS_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.c) $(wildcard $(USOCKETS_SRC_DIR)/**/*.c)
cd $(USOCKETS_DIR) && $(CXX_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CXXFLAGS) $(UWS_CXX_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.cpp) $(wildcard $(USOCKETS_SRC_DIR)/**/*.cpp)
cd $(USOCKETS_DIR) && $(AR) rcvs $(BUN_DEPS_OUT_DIR)/libusockets.a $(USOCKETS_DIR)/*.{o,bc}
@@ -836,10 +833,10 @@ fallback_decoder:
.PHONY: runtime_js
runtime_js:
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV=\"production\" --target=esnext --bundle src/runtime/index.ts --format=iife --platform=browser --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.out.js; cat src/runtime.footer.js >> src/runtime.out.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV=\"production\" --target=esnext --bundle src/runtime/index-with-refresh.ts --format=iife --platform=browser --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.out.refresh.js; cat src/runtime.footer.with-refresh.js >> src/runtime.out.refresh.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV=\"production\" --target=esnext --bundle src/runtime/index-without-hmr.ts --format=iife --platform=node --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.node.pre.out.js; cat src/runtime.node.pre.out.js src/runtime.footer.node.js > src/runtime.node.out.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV=\"production\" --target=esnext --bundle src/runtime/index-without-hmr.ts --format=iife --platform=node --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.bun.pre.out.js; cat src/runtime.bun.pre.out.js src/runtime.footer.bun.js > src/runtime.bun.out.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV="production" --target=esnext --bundle src/runtime/index.ts --format=iife --platform=browser --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.out.js; cat src/runtime.footer.js >> src/runtime.out.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV="production" --target=esnext --bundle src/runtime/index-with-refresh.ts --format=iife --platform=browser --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.out.refresh.js; cat src/runtime.footer.with-refresh.js >> src/runtime.out.refresh.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV="production" --target=esnext --bundle src/runtime/index-without-hmr.ts --format=iife --platform=node --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.node.pre.out.js; cat src/runtime.node.pre.out.js src/runtime.footer.node.js > src/runtime.node.out.js
@NODE_ENV=production $(ESBUILD) --define:process.env.NODE_ENV="production" --target=esnext --bundle src/runtime/index-without-hmr.ts --format=iife --platform=node --global-name=BUN_RUNTIME --minify --external:/bun:* > src/runtime.bun.pre.out.js; cat src/runtime.bun.pre.out.js src/runtime.footer.bun.js > src/runtime.bun.out.js
.PHONY: runtime_js_dev
runtime_js_dev:
@@ -940,9 +937,6 @@ clone-submodules:
.PHONY: headers
headers:
echo please don't run the headers generator anymore. i don't think it works.
echo if you really need it, run make headers2
headers2:
rm -f /tmp/build-jsc-headers src/bun.js/bindings/headers.zig
touch src/bun.js/bindings/headers.zig
$(ZIG) build headers-obj
@@ -1257,7 +1251,6 @@ jsc-build-mac-compile-debug:
-DENABLE_FTL_JIT=ON \
-DCMAKE_EXPORT_COMPILE_COMMANDS=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DENABLE_BUN_SKIP_FAILING_ASSERTIONS=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-G Ninja \
$(CMAKE_FLAGS_WITHOUT_RELEASE) \
@@ -1333,7 +1326,7 @@ release-bindings: $(OBJ_DIR) $(OBJ_FILES) $(WEBCORE_OBJ_FILES) $(SQLITE_OBJ_FILE
# Do not add $(DEBUG_DIR) to this list
# It will break caching, causing you to have to wait for every .cpp file to rebuild.
.PHONY: bindings
bindings-real: $(DEBUG_OBJ_DIR) $(DEBUG_OBJ_FILES) $(DEBUG_WEBCORE_OBJ_FILES) $(DEBUG_SQLITE_OBJ_FILES) $(DEBUG_NODE_OS_OBJ_FILES) $(DEBUG_BUILTINS_OBJ_FILES) $(DEBUG_IO_FILES) $(DEBUG_MODULES_OBJ_FILES) $(DEBUG_WEBCRYPTO_OBJ_FILES)
bindings: $(DEBUG_OBJ_DIR) $(DEBUG_OBJ_FILES) $(DEBUG_WEBCORE_OBJ_FILES) $(DEBUG_SQLITE_OBJ_FILES) $(DEBUG_NODE_OS_OBJ_FILES) $(DEBUG_BUILTINS_OBJ_FILES) $(DEBUG_IO_FILES) $(DEBUG_MODULES_OBJ_FILES) $(DEBUG_WEBCRYPTO_OBJ_FILES)
.PHONY: jsc-bindings-mac
jsc-bindings-mac: bindings
@@ -1367,7 +1360,7 @@ mimalloc-debug:
-GNinja \
. \
&& ninja
cp $(BUN_DEPS_DIR)/mimalloc/$(_MIMALLOC_DEBUG_FILE) $(BUN_DEPS_OUT_DIR)/$(_MIMALLOC_DEBUG_FILE)
cp $(BUN_DEPS_DIR)/mimalloc/$(_MIMALLOC_DEBUG_FILE) $(BUN_DEPS_OUT_DIR)/$(MIMALLOC_FILE)
# mimalloc is built as object files so that it can overload the system malloc on linux
@@ -1492,12 +1485,12 @@ wasm-return1:
$(ZIG) build-lib -OReleaseSmall test/bun.js/wasm-return-1-test.zig -femit-bin=test/bun.js/wasm-return-1-test.wasm -target wasm32-freestanding
generate-classes:
bun src/codegen/generate-classes.ts
bun src/bun.js/scripts/generate-classes.ts
$(ZIG) fmt src/bun.js/bindings/generated_classes.zig
$(CLANG_FORMAT) -i src/bun.js/bindings/ZigGeneratedClasses.h src/bun.js/bindings/ZigGeneratedClasses.cpp
generate-sink:
bun src/codegen/generate-jssink.js
bun src/bun.js/scripts/generate-jssink.js
$(CLANG_FORMAT) -i src/bun.js/bindings/JSSink.cpp src/bun.js/bindings/JSSink.h
./src/bun.js/scripts/create_hash_table src/bun.js/bindings/JSSink.cpp > src/bun.js/bindings/JSSinkLookupTable.h
$(SED) -i -e 's/#include "Lookup.h"//' src/bun.js/bindings/JSSinkLookupTable.h
@@ -1907,7 +1900,7 @@ cold-jsc-start:
misctools/cold-jsc-start.cpp -o cold-jsc-start
.PHONY: vendor-without-npm
vendor-without-npm: node-fallbacks runtime_js fallback_decoder bun_error mimalloc picohttp zlib boringssl libarchive lolhtml sqlite usockets uws lshpack tinycc c-ares zstd base64
vendor-without-npm: node-fallbacks runtime_js fallback_decoder bun_error mimalloc picohttp zlib boringssl libarchive lolhtml sqlite usockets uws tinycc c-ares zstd base64
.PHONY: vendor-without-check
@@ -1920,30 +1913,46 @@ vendor: assert-deps submodule vendor-without-check
vendor-dev: assert-deps submodule npm-install-dev vendor-without-npm
.PHONY: bun
bun:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
bun: vendor identifier-cache build-obj bun-link-lld-release bun-codesign-release-local
cpp:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
.PHONY: static-hash-table
static-hash-table:
bun src/js/_codegen/static-hash-tables.ts
zig:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
.PHONY: cpp
cpp: ## compile src/js/builtins + all c++ code then link
@make clean-bindings js
@make static-hash-table
@make bindings -j$(CPU_COUNT)
@make link
dev:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
.PHONY: cpp
cpp-no-link:
@make clean-bindings js
@make bindings -j$(CPU_COUNT)
setup:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
.PHONY: zig
zig: ## compile zig code then link
@make mkdir-dev dev-obj link
bindings:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
.PHONY: zig-no-link
zig-no-link:
@make mkdir-dev dev-obj
help:
@echo 'makefile is deprecated - use `cmake` / `bun run build`'
@echo 'See https://bun.sh/docs/project/contributing for more details'
.PHONY: dev
dev: # combo of `make cpp` and `make zig`
@make cpp-no-link zig-no-link -j2
@make link
.PHONY: setup
setup: vendor-dev identifier-cache clean-bindings
make jsc-check dev
@echo ""
@echo "First build complete!"
@echo "\"bun-debug\" is available at $(DEBUG_BIN)/bun-debug"
@echo ""
.PHONY: help
help: ## to print this help
@echo "For detailed build instructions, see https://bun.sh/docs/project/development"
@awk 'BEGIN {FS = ":.*?## "} /^[a-zA-Z0-9_-]+:.*?## / {gsub("\\\\n",sprintf("\n%22c",""), $$2);printf "\033[36m%-20s\033[0m \t\t%s\n", $$1, $$2}' $(MAKEFILE_LIST)

View File

@@ -93,8 +93,8 @@ bun upgrade --canary
- [`bun run`](https://bun.sh/docs/cli/run)
- [`bun install`](https://bun.sh/docs/cli/install)
- [`bun test`](https://bun.sh/docs/cli/test)
- [`bun init`](https://bun.sh/docs/cli/init)
- [`bun create`](https://bun.sh/docs/cli/bun-create)
- [`bun init`](https://bun.sh/docs/templates#bun-init)
- [`bun create`](https://bun.sh/docs/templates#bun-create)
- [`bunx`](https://bun.sh/docs/cli/bunx)
- Runtime
- [Runtime](https://bun.sh/docs/runtime/index)
@@ -128,7 +128,7 @@ bun upgrade --canary
## Contributing
Refer to the [Project > Contributing](https://bun.sh/docs/project/contributing) guide to start contributing to Bun.
Refer to the [Project > Development](https://bun.sh/docs/project/development) guide to start contributing to Bun.
## License

Binary file not shown.

View File

@@ -1,19 +0,0 @@
import micromatch from "micromatch";
import { bench, run } from "mitata";
const Glob = typeof Bun !== "undefined" ? Bun.Glob : undefined;
const doMatch = typeof Bun === "undefined" ? micromatch.isMatch : (a, b) => new Glob(b).match(a);
bench((Glob ? "Bun.Glob - " : "micromatch - ") + "**/*.js", () => {
doMatch("foo/bar.js", "**/*.js");
});
bench((Glob ? "Bun.Glob - " : "micromatch - ") + "*.js", () => {
doMatch("bar.js", "*.js");
});
await run({
avg: true,
min_max: true,
percentiles: true,
});

View File

@@ -1,113 +0,0 @@
import { run, bench, group } from "mitata";
import fg from "fast-glob";
import { fdir } from "fdir";
const normalPattern = "*.ts";
const recursivePattern = "**/*.ts";
const nodeModulesPattern = "**/node_modules/**/*.js";
const benchFdir = false;
const cwd = undefined;
const bunOpts = {
cwd,
followSymlinks: false,
absolute: true,
};
const fgOpts = {
cwd,
followSymbolicLinks: false,
onlyFiles: false,
absolute: true,
};
const Glob = "Bun" in globalThis ? globalThis.Bun.Glob : undefined;
group({ name: `async pattern="${normalPattern}"`, summary: true }, () => {
bench("fast-glob", async () => {
const entries = await fg.glob([normalPattern], fgOpts);
});
if (Glob)
bench("Bun.Glob", async () => {
const entries = await Array.fromAsync(new Glob(normalPattern).scan(bunOpts));
});
if (benchFdir)
bench("fdir", async () => {
const entries = await new fdir().withFullPaths().glob(normalPattern).crawl(process.cwd()).withPromise();
});
});
group({ name: `async-recursive pattern="${recursivePattern}"`, summary: true }, () => {
bench("fast-glob", async () => {
const entries = await fg.glob([recursivePattern], fgOpts);
});
if (Glob)
bench("Bun.Glob", async () => {
const entries = await Array.fromAsync(new Glob(recursivePattern).scan(bunOpts));
});
if (benchFdir)
bench("fdir", async () => {
const entries = await new fdir().withFullPaths().glob(recursivePattern).crawl(process.cwd()).withPromise();
});
});
group({ name: `sync pattern="${normalPattern}"`, summary: true }, () => {
bench("fast-glob", () => {
const entries = fg.globSync([normalPattern], fgOpts);
});
if (Glob)
bench("Bun.Glob", () => {
const entries = [...new Glob(normalPattern).scanSync(bunOpts)];
});
if (benchFdir)
bench("fdir", async () => {
const entries = new fdir().withFullPaths().glob(normalPattern).crawl(process.cwd()).sync();
});
});
group({ name: `sync-recursive pattern="${recursivePattern}"`, summary: true }, () => {
bench("fast-glob", () => {
const entries = fg.globSync([recursivePattern], fgOpts);
});
if (Glob)
bench("Bun.Glob", () => {
const entries = [...new Glob(recursivePattern).scanSync(bunOpts)];
});
if (benchFdir)
bench("fdir", async () => {
const entries = new fdir().withFullPaths().glob(recursivePattern).crawl(process.cwd()).sync();
});
});
group({ name: `node_modules pattern="${nodeModulesPattern}"`, summary: true }, () => {
bench("fast-glob", async () => {
const entries = await fg.glob([nodeModulesPattern], fgOpts);
});
if (Glob)
bench("Bun.Glob", async () => {
const entries = await Array.fromAsync(new Glob(nodeModulesPattern).scan(bunOpts));
});
if (benchFdir)
bench("fdir", async () => {
const entries = await new fdir().withFullPaths().glob(nodeModulesPattern).crawl(process.cwd()).withPromise();
});
});
await run({
avg: true,
colors: false,
min_max: true,
collect: true,
percentiles: true,
});

View File

@@ -7,8 +7,6 @@
"benchmark": "^2.1.4",
"esbuild": "^0.14.12",
"eventemitter3": "^5.0.0",
"fast-glob": "3.3.1",
"fdir": "^6.1.0",
"mitata": "^0.1.6"
},
"scripts": {

View File

@@ -6,30 +6,23 @@ bench("await 1", async function () {
return await 1;
});
if (typeof process !== "undefined") {
bench("process.nextTick x 100", async function () {
var remaining = 100;
var cb, promise;
promise = new Promise(resolve => {
cb = resolve;
});
for (let i = 0; i < 100; i++) {
process.nextTick(() => {
if (--remaining === 0) cb();
});
}
return promise;
});
bench("await 1 x 100", async function () {
for (let i = 0; i < 100; i++) await 1;
});
function callnextTick(resolve) {
process.nextTick(resolve);
}
function awaitNextTick() {
return new Promise(callnextTick);
}
bench("promise.nextTick", async function () {
return awaitNextTick();
});
bench("await new Promise(resolve => resolve())", async function () {
await new Promise(resolve => resolve());
});
bench("Promise.all(Array.from({length: 100}, () => new Promise((resolve) => resolve())))", async function () {
return Promise.all(Array.from({ length: 100 }, () => Promise.resolve(1)));
});
await run();

View File

@@ -4,9 +4,5 @@ var i = 0;
const server = createServer((req, res) => {
res.writeHead(200);
res.end("Hello, World!" + i);
if (i++ === 200_000 - 1)
setTimeout(() => {
console.log("RSS", (process.memoryUsage().rss / 1024 / 1024) | 0, "MB");
process.exit(0);
}, 0);
if (i++ === 200_000 - 1) queueMicrotask(() => process.exit(0));
}).listen(parseInt(process.env.PORT || "3000", 10));

View File

@@ -1,52 +0,0 @@
import { bench, run } from "./runner.mjs";
import { IncomingMessage } from "node:http";
const headers = {
date: "Mon, 06 Nov 2023 05:12:49 GMT",
expires: "-1",
"cache-control": "private, max-age=0",
"content-type": "text/html; charset=ISO-8859-1",
"content-security-policy-report-only":
"object-src 'none';base-uri 'self';script-src 'nonce-lcrU7l9xScCq4urW13K9gw' 'strict-dynamic' 'report-sample' 'unsafe-eval' 'unsafe-inline' https: http:;report-uri https://csp.withgoogle.com/csp/gws/other-hp",
"x-xss-protection": "0",
"x-frame-options": "SAMEORIGIN",
"accept-ranges": "none",
vary: "Accept-Encoding",
"transfer-encoding": "chunked",
"set-cookie": [
"1P_JAR=2023-11-06-05; expires=Wed, 06-Dec-2023 05:12:49 GMT; path=/; domain=.google.com; Secure",
"AEC=Ackid1TiuGtRsmu1yaDCAdL1u1J4eM4S67simzDHfWaMPQzH-UB4DZkRwm8; expires=Sat, 04-May-2024 05:12:49 GMT; path=/; domain=.google.com; Secure; HttpOnly; SameSite=lax",
"NID=511=jQcg9cM7vjKawWnf6f3qhs3WDIIN2gaRq3i4bdMiVRWFkaFNYmiI-Xquf1kAmWGcmDN0skldS7uHheru3CMJrWjMt56VaaqO6Pilb54jFjQS_ZJRfG3Uc7dGV5WXGV-slUGE1Bicxlajdn0E_R8tZOoWiFzFDQW7YGmyfRqWQ2k; expires=Tue, 07-May-2024 05:12:49 GMT; path=/; domain=.google.com; HttpOnly",
],
p3p: 'CP="This is not a P3P policy! See g.co/p3phelp for more info."',
server: "gws",
"alt-svc": 'h3=":443"; ma=2592000,h3-29=":443"; ma=2592000',
};
const request = new Request("https://www.google.com/", {
headers: new Headers(headers),
method: "GET",
});
// const server = Bun.serve({
// port: 8080,
// async fetch(request) {
// // bench("new IncomingMessage()", b => {
// // for (let i = 0; i < 1000; i++) {
// // new IncomingMessage(request);
// // }
// // });
// const msg = new IncomingMessage(request);
// console.log(msg.headers, msg.rawHeaders, msg.url);
// // await run();
// return new Response("Hello, world!");
// },
// });
bench("new IncomingMessage()", b => {
for (let i = 0; i < 1000; i++) {
new IncomingMessage(request);
}
});
await run();

View File

@@ -1,50 +1,13 @@
import { readdirSync, readdir as readdirCb } from "fs";
import { readdir } from "fs/promises";
import { readdirSync } from "fs";
import { bench, run } from "./runner.mjs";
import { argv } from "process";
import { fileURLToPath } from "url";
import { relative, resolve } from "path";
import { createHash } from "crypto";
let dir = resolve(argv.length > 2 ? argv[2] : fileURLToPath(new URL("../../node_modules", import.meta.url)));
if (dir.includes(process.cwd())) {
dir = relative(process.cwd(), dir);
}
const dir = argv.length > 2 ? argv[2] : "/tmp";
const result = await readdir(dir, { recursive: true });
const count = result.length;
const syncCount = readdirSync(dir, { recursive: true }).length;
const hash = createHash("sha256").update(result.sort().join("\n")).digest("hex");
bench(`await readdir("${dir}", {recursive: true})`, async () => {
await readdir(dir, { recursive: true });
});
bench(`await readdir("${dir}", {recursive: true}) x 10`, async () => {
const promises = [
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
readdir(dir, { recursive: true }),
];
await Promise.all(promises);
});
bench(`await readdir("${dir}", {recursive: false})`, async () => {
await readdir(dir, { recursive: false });
const count = readdirSync(dir).length;
bench(`readdir("${dir}")`, () => {
readdirSync(dir, { withFileTypes: true });
});
await run();
console.log("\n", count, "files/dirs in", dir, "\n", "SHA256:", hash, "\n");
if (count !== syncCount) {
throw new Error(`Mismatched file counts: ${count} async !== ${syncCount} sync`);
}
console.log("\n\nFor", count, "files/dirs in", dir);

View File

@@ -1,25 +0,0 @@
import { bench, run } from "./runner.mjs";
import { builtinModules } from "node:module";
import { writeFile } from "node:fs/promises";
import { spawnSync } from "child_process";
for (let builtin of builtinModules) {
const path = `/tmp/require.${builtin.replaceAll("/", "_")}.cjs`;
await writeFile(
path,
`
const builtin = ${JSON.stringify(builtin)};
const now = require("perf_hooks").performance.now();
require(builtin);
const end = require("perf_hooks").performance.now();
process.stdout.write(JSON.stringify({builtin, time: end - now}) + "\\n");
`,
);
const result = spawnSync(typeof Bun !== "undefined" ? "bun" : "node", [path], {
stdio: ["inherit", "inherit", "inherit"],
env: {
...process.env,
NODE_NO_WARNINGS: "1",
},
});
}

View File

@@ -1,143 +0,0 @@
import { satisfies } from "semver";
import { bench, run } from "./runner.mjs";
const tests = [
["~1.2.3", "1.2.3", true],
["~1.2", "1.2.0", true],
["~1", "1.0.0", true],
["~1", "1.2.0", true],
["~1", "1.2.999", true],
["~0.2.3", "0.2.3", true],
["~0.2", "0.2.0", true],
["~0.2", "0.2.1", true],
["~0 ", "0.0.0", true],
["~1.2.3", "1.3.0", false],
["~1.2", "1.3.0", false],
["~1", "2.0.0", false],
["~0.2.3", "0.3.0", false],
["~0.2.3", "1.0.0", false],
["~0 ", "1.0.0", false],
["~0.2", "0.1.0", false],
["~0.2", "0.3.0", false],
["~3.0.5", "3.3.0", false],
["^1.1.4", "1.1.4", true],
[">=3", "3.5.0", true],
[">=3", "2.999.999", false],
[">=3", "3.5.1", true],
[">=3.x.x", "3.x.x", false],
["<6 >= 5", "5.0.0", true],
["<6 >= 5", "4.0.0", false],
["<6 >= 5", "6.0.0", false],
["<6 >= 5", "6.0.1", false],
[">2", "3", false],
[">2", "2.1", false],
[">2", "2", false],
[">2", "1.0", false],
[">1.3", "1.3.1", false],
[">1.3", "2.0.0", true],
[">2.1.0", "2.2.0", true],
["<=2.2.99999", "2.2.0", true],
[">=2.1.99999", "2.2.0", true],
["<2.2.99999", "2.2.0", true],
[">2.1.99999", "2.2.0", true],
[">1.0.0", "2.0.0", true],
["1.0.0", "1.0.0", true],
["1.0.0", "2.0.0", false],
["1.0.0 || 2.0.0", "1.0.0", true],
["2.0.0 || 1.0.0", "1.0.0", true],
["1.0.0 || 2.0.0", "2.0.0", true],
["2.0.0 || 1.0.0", "2.0.0", true],
["2.0.0 || >1.0.0", "2.0.0", true],
[">1.0.0 <2.0.0 <2.0.1 >1.0.1", "1.0.2", true],
["2.x", "2.0.0", true],
["2.x", "2.1.0", true],
["2.x", "2.2.0", true],
["2.x", "2.3.0", true],
["2.x", "2.1.1", true],
["2.x", "2.2.2", true],
["2.x", "2.3.3", true],
["<2.0.1 >1.0.0", "2.0.0", true],
["<=2.0.1 >=1.0.0", "2.0.0", true],
["^2", "2.0.0", true],
["^2", "2.9.9", true],
["~2", "2.0.0", true],
["~2", "2.1.0", true],
["~2.2", "2.2.1", true],
["2.1.0 || > 2.2 || >3", "2.1.0", true],
[" > 2.2 || >3 || 2.1.0", "2.1.0", true],
[" > 2.2 || 2.1.0 || >3", "2.1.0", true],
["> 2.2 || 2.1.0 || >3", "2.3.0", true],
["> 2.2 || 2.1.0 || >3", "2.2.1", false],
["> 2.2 || 2.1.0 || >3", "2.2.0", false],
["> 2.2 || 2.1.0 || >3", "2.3.0", true],
["> 2.2 || 2.1.0 || >3", "3.0.1", true],
["~2", "2.0.0", true],
["~2", "2.1.0", true],
["1.2.0 - 1.3.0", "1.2.2", true],
["1.2 - 1.3", "1.2.2", true],
["1 - 1.3", "1.2.2", true],
["1 - 1.3", "1.3.0", true],
["1.2 - 1.3", "1.3.1", true],
["1.2 - 1.3", "1.4.0", false],
["1 - 1.3", "1.3.1", true],
["1.2 - 1.3 || 5.0", "6.4.0", false],
["1.2 - 1.3 || 5.0", "1.2.1", true],
["5.0 || 1.2 - 1.3", "1.2.1", true],
["1.2 - 1.3 || 5.0", "5.0", false],
["5.0 || 1.2 - 1.3", "5.0", false],
["1.2 - 1.3 || 5.0", "5.0.2", true],
["5.0 || 1.2 - 1.3", "5.0.2", true],
["1.2 - 1.3 || 5.0", "5.0.2", true],
["5.0 || 1.2 - 1.3", "5.0.2", true],
["5.0 || 1.2 - 1.3 || >8", "9.0.2", true],
];
bench("semver.satisfies x " + tests.length, () => {
for (const [range, version, expected] of tests) {
if (satisfies(version, range) !== expected) {
throw new Error("Unexpected result for " + range + " " + version);
}
}
});
if (typeof Bun !== "undefined") {
const satisfies = Bun.semver.satisfies;
bench("Bun.semver.satisfies x " + tests.length, () => {
for (const [range, version, expected] of tests) {
if (satisfies(version, range) !== expected) {
throw new Error("Unexpected result for " + range + " " + version);
}
}
});
}
bench("semver.satisfies", () => {
const [range, version, expected] = tests[0];
if (satisfies(version, range) !== expected) {
throw new Error("Unexpected result for " + range + " " + version);
}
});
if (typeof Bun !== "undefined") {
const satisfies = Bun.semver.satisfies;
bench("Bun.semver.satisfies", () => {
const [range, version, expected] = tests[0];
if (satisfies(version, range) !== expected) {
throw new Error("Unexpected result for " + range + " " + version);
}
});
}
await run();

237
build.zig
View File

@@ -1,22 +1,31 @@
const std = @import("std");
const pathRel = std.fs.path.relative;
const builtin = @import("builtin");
const Wyhash = @import("./src/wyhash.zig").Wyhash;
const zig_version = builtin.zig_version;
/// Do not rename this constant. It is scanned by some scripts to determine which zig version to install.
const recommended_zig_version = "0.12.0-dev.1604+caae40c21";
var is_debug_build = false;
fn exists(path: []const u8) bool {
_ = std.fs.openFileAbsolute(path, .{ .mode = .read_only }) catch return false;
return true;
fn moduleSource(comptime out: []const u8) FileSource {
if (comptime std.fs.path.dirname(@src().file)) |base| {
const outpath = comptime base ++ std.fs.path.sep_str ++ out;
return FileSource.relative(outpath);
} else {
return FileSource.relative(out);
}
}
const color_map = std.ComptimeStringMap([]const u8, .{
&.{ "black", "30m" },
&.{ "blue", "34m" },
&.{ "b", "1m" },
&.{ "d", "2m" },
&.{ "cyan", "36m" },
&.{ "green", "32m" },
&.{ "magenta", "35m" },
&.{ "red", "31m" },
&.{ "white", "37m" },
&.{ "yellow", "33m" },
});
fn addInternalPackages(b: *Build, step: *CompileStep, _: std.mem.Allocator, _: []const u8, target: anytype) !void {
const io: *Module = brk: {
var io: *Module = brk: {
if (target.isDarwin()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/io/io_darwin.zig"),
@@ -37,38 +46,11 @@ fn addInternalPackages(b: *Build, step: *CompileStep, _: std.mem.Allocator, _: [
};
step.addModule("async_io", io);
step.addModule("zlib-internal", brk: {
if (target.isWindows()) {
break :brk b.createModule(.{ .source_file = FileSource.relative("src/deps/zlib.win32.zig") });
}
break :brk b.createModule(.{ .source_file = FileSource.relative("src/deps/zlib.posix.zig") });
});
const async_: *Module = brk: {
if (target.isDarwin() or target.isLinux() or target.isFreeBSD()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/async/posix_event_loop.zig"),
});
} else if (target.isWindows()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/async/windows_event_loop.zig"),
});
}
break :brk b.createModule(.{
.source_file = FileSource.relative("src/async/stub_event_loop.zig"),
});
};
step.addModule("async", async_);
}
const BunBuildOptions = struct {
is_canary: bool = false,
canary_revision: u32 = 0,
canary: bool = false,
sha: [:0]const u8 = "",
version: []const u8 = "",
baseline: bool = false,
bindgen: bool = false,
sizegen: bool = false,
@@ -77,8 +59,6 @@ const BunBuildOptions = struct {
runtime_js_version: u64 = 0,
fallback_html_version: u64 = 0,
tinycc: bool = true,
pub fn updateRuntime(this: *BunBuildOptions) anyerror!void {
if (std.fs.cwd().openFile("src/runtime.out.js", .{ .mode = .read_only })) |file| {
defer file.close();
@@ -109,13 +89,7 @@ const BunBuildOptions = struct {
pub fn step(this: BunBuildOptions, b: anytype) *std.build.OptionsStep {
var opts = b.addOptions();
opts.addOption(@TypeOf(this.is_canary), "is_canary", this.is_canary);
opts.addOption(@TypeOf(this.canary_revision), "canary_revision", this.canary_revision);
opts.addOption(
std.SemanticVersion,
"version",
std.SemanticVersion.parse(this.version) catch @panic(b.fmt("Invalid version: {s}", .{this.version})),
);
opts.addOption(@TypeOf(this.canary), "is_canary", this.canary);
opts.addOption(@TypeOf(this.sha), "sha", this.sha);
opts.addOption(@TypeOf(this.baseline), "baseline", this.baseline);
opts.addOption(@TypeOf(this.bindgen), "bindgen", this.bindgen);
@@ -123,15 +97,35 @@ const BunBuildOptions = struct {
opts.addOption(@TypeOf(this.base_path), "base_path", this.base_path);
opts.addOption(@TypeOf(this.runtime_js_version), "runtime_js_version", this.runtime_js_version);
opts.addOption(@TypeOf(this.fallback_html_version), "fallback_html_version", this.fallback_html_version);
opts.addOption(@TypeOf(this.tinycc), "tinycc", this.tinycc);
return opts;
}
};
// relative to the prefix
var output_dir: []const u8 = "";
fn panicIfNotFound(comptime filepath: []const u8) []const u8 {
var file = std.fs.cwd().openFile(filepath, .{ .optimize = .read_only }) catch |err| {
std.debug.panic("error: {s} opening {s}. Please ensure you've downloaded git submodules, and ran `make vendor`, `make jsc`.", .{ filepath, @errorName(err) });
};
file.close();
var optimize: std.builtin.OptimizeMode = .Debug;
return filepath;
}
const fmt = struct {
pub usingnamespace @import("std").fmt;
pub fn hexInt(value: anytype) @TypeOf(std.fmt.fmtSliceHexLower("")) {
return std.fmt.fmtSliceHexLower(std.mem.asBytes(&value));
}
pub fn hexIntUp(value: anytype) @TypeOf(std.fmt.fmtSliceHexUpper("")) {
return std.fmt.fmtSliceHexUpper(std.mem.asBytes(&value));
}
};
var x64 = "x64";
var optimize: std.builtin.OptimizeMode = undefined;
const Build = std.Build;
const CrossTarget = std.zig.CrossTarget;
@@ -152,26 +146,6 @@ pub fn build(b: *Build) !void {
}
pub fn build_(b: *Build) !void {
switch (comptime zig_version.order(std.SemanticVersion.parse(recommended_zig_version) catch unreachable)) {
.eq => {},
.lt => {
@compileError("The minimum version of Zig required to compile Bun is " ++ recommended_zig_version ++ ", found " ++ @import("builtin").zig_version_string ++ ". Please follow the instructions at https://bun.sh/docs/project/contributing. You may need to re-run `bun setup`.");
},
.gt => {
const colors = std.io.getStdErr().supportsAnsiEscapeCodes();
std.debug.print(
"{s}WARNING:\nBun recommends Zig version '{s}', but found '{s}', build may fail...\nMake sure you are following the instructions at https://bun.sh/docs/project/contributing\n{s}You can update to the right version using 'zigup {s}'\n\n",
.{
if (colors) "\x1b[1;33m" else "",
recommended_zig_version,
builtin.zig_version_string,
if (colors) "\x1b[0m" else "",
recommended_zig_version,
},
);
},
}
// Standard target options allows the person running `zig build` to choose
// what target to build for. Here we do not override the defaults, which
// means any target is allowed, and the default is native. Other options
@@ -181,14 +155,8 @@ pub fn build_(b: *Build) !void {
// between Debug, ReleaseSafe, ReleaseFast, and ReleaseSmall.
optimize = b.standardOptimizeOption(.{});
var generated_code_directory = b.option([]const u8, "generated-code", "Set the generated code directory") orelse "";
if (generated_code_directory.len == 0) {
generated_code_directory = b.pathFromRoot("build/codegen");
}
var output_dir_buf = std.mem.zeroes([4096]u8);
const bin_label = if (optimize == std.builtin.OptimizeMode.Debug) "packages/debug-bun-" else "packages/bun-";
var bin_label = if (optimize == std.builtin.OptimizeMode.Debug) "packages/debug-bun-" else "packages/bun-";
var triplet_buf: [64]u8 = undefined;
var os_tagname = @tagName(target.getOs().tag);
@@ -206,7 +174,7 @@ pub fn build_(b: *Build) !void {
&triplet_buf,
os_tagname,
);
const osname = triplet_buf[0..os_tagname.len];
var osname = triplet_buf[0..os_tagname.len];
triplet_buf[osname.len] = '-';
std.mem.copy(u8, triplet_buf[osname.len + 1 ..], @tagName(target.getCpuArch()));
@@ -217,19 +185,17 @@ pub fn build_(b: *Build) !void {
cpuArchName = cpuArchName[0..3];
}
const triplet = triplet_buf[0 .. osname.len + cpuArchName.len + 1];
var triplet = triplet_buf[0 .. osname.len + cpuArchName.len + 1];
const outfile_maybe = b.option([]const u8, "output-file", "target to install to");
if (outfile_maybe) |outfile| {
output_dir = try pathRel(b.allocator, b.install_prefix, std.fs.path.dirname(outfile) orelse "");
if (b.option([]const u8, "output-dir", "target to install to") orelse std.os.getenv("OUTPUT_DIR")) |output_dir_| {
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_);
} else {
const output_dir_base = try std.fmt.bufPrint(&output_dir_buf, "{s}{s}", .{ bin_label, triplet });
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_base);
}
is_debug_build = optimize == OptimizeMode.Debug;
const bun_executable_name = if (outfile_maybe) |outfile| std.fs.path.basename(outfile[0 .. outfile.len - std.fs.path.extension(outfile).len]) else if (is_debug_build) "bun-debug" else "bun";
const bun_executable_name = if (optimize == std.builtin.OptimizeMode.Debug) "bun-debug" else "bun";
const root_src = if (target.getOsTag() == std.Target.Os.Tag.freestanding)
"root_wasm.zig"
else
@@ -251,31 +217,10 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative(root_src),
.target = target,
.optimize = optimize,
.main_mod_path = .{ .cwd_relative = b.pathFromRoot(".") },
.main_pkg_path = .{ .cwd_relative = b.pathFromRoot(".") },
});
if (!exists(b.pathFromRoot(try std.fs.path.join(b.allocator, &.{
"src",
"js_lexer",
"id_continue_bitset.blob",
})))) {
const identifier_data = b.pathFromRoot(try std.fs.path.join(b.allocator, &.{ "src", "js_lexer", "identifier_data.zig" }));
var run_step = b.addSystemCommand(&.{
b.zig_exe,
"run",
identifier_data,
});
run_step.has_side_effects = true;
obj.step.dependOn(&run_step.step);
}
b.reference_trace = if (b.option(u32, "reference-trace", "Set the reference trace")) |trace|
if (trace == 0)
null
else
trace
else
16;
b.reference_trace = 16;
var default_build_options: BunBuildOptions = brk: {
const is_baseline = arch.isX86() and (target.cpu_model == .baseline or
@@ -286,7 +231,7 @@ pub fn build_(b: *Build) !void {
git_sha = b.allocator.dupeZ(u8, sha) catch unreachable;
} else {
sha: {
const result = std.ChildProcess.run(.{
const result = std.ChildProcess.exec(.{
.allocator = b.allocator,
.argv = &.{
"git",
@@ -301,17 +246,9 @@ pub fn build_(b: *Build) !void {
}
}
const is_canary, const canary_revision = if (b.option(u32, "canary", "Treat this as a canary build")) |rev|
if (rev == 0)
.{ false, 0 }
else
.{ true, rev }
else
.{ false, 0 };
const is_canary = (std.os.getenvZ("BUN_CANARY") orelse "0")[0] == '1';
break :brk .{
.is_canary = is_canary,
.canary_revision = canary_revision,
.version = b.option([]const u8, "version", "Value of `Bun.version`") orelse "0.0.0",
.canary = is_canary,
.sha = git_sha,
.baseline = is_baseline,
.bindgen = false,
@@ -346,8 +283,8 @@ pub fn build_(b: *Build) !void {
min_version,
max_version,
obj.target.getCpuModel().name,
}) catch {};
std.io.getStdErr().writer().print("Zig v{s}\n", .{builtin.zig_version_string}) catch {};
}) catch unreachable;
std.io.getStdErr().writer().print("Output: {s}/{s}\n\n", .{ output_dir, bun_executable_name }) catch unreachable;
defer obj_step.dependOn(&obj.step);
@@ -366,23 +303,13 @@ pub fn build_(b: *Build) !void {
obj.addOptions("build_options", actual_build_options.step(b));
// Generated Code
// TODO: exit with a better error early if these files do not exist. it is an indication someone ran `zig build` directly without the code generators.
obj.addModule("ZigGeneratedClasses", b.createModule(.{
.source_file = .{ .path = b.pathJoin(&.{ generated_code_directory, "ZigGeneratedClasses.zig" }) },
}));
obj.addModule("ResolvedSourceTag", b.createModule(.{
.source_file = .{ .path = b.pathJoin(&.{ generated_code_directory, "ResolvedSourceTag.zig" }) },
}));
obj.linkLibC();
obj.dll_export_fns = true;
obj.strip = false;
obj.omit_frame_pointer = optimize != .Debug;
obj.subsystem = .Console;
obj.strip = false;
obj.bundle_compiler_rt = false;
obj.omit_frame_pointer = optimize != .Debug;
// Disable stack probing on x86 so we don't need to include compiler_rt
if (target.getCpuArch().isX86() or target.isWindows()) obj.disable_stack_probing = true;
if (target.getCpuArch().isX86()) obj.disable_stack_probing = true;
if (b.option(bool, "for-editor", "Do not emit bin, just check for errors") orelse false) {
// obj.emit_bin = .no_emit;
@@ -404,7 +331,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bindgen.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -421,7 +348,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("root_wasm.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer wasm_step.dependOn(&wasm.step);
wasm.strip = false;
@@ -440,7 +367,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/http_bench.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -454,7 +381,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/machbench.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -468,7 +395,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/fetch.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -482,7 +409,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bench/string-handling.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -496,7 +423,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sha.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -510,7 +437,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sourcemap/vlq_bench.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -524,7 +451,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/tgz.zig"),
.target = target,
.optimize = optimize,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -534,14 +461,14 @@ pub fn build_(b: *Build) !void {
{
const headers_step = b.step("test", "Build test");
const test_file = b.option([]const u8, "test-file", "Input file for test");
const test_bin_ = b.option([]const u8, "test-bin", "Emit bin to");
const test_filter = b.option([]const u8, "test-filter", "Filter for test");
var test_file = b.option([]const u8, "test-file", "Input file for test");
var test_bin_ = b.option([]const u8, "test-bin", "Emit bin to");
var test_filter = b.option([]const u8, "test-filter", "Filter for test");
var headers_obj: *CompileStep = b.addTest(.{
.root_source_file = FileSource.relative(test_file orelse "src/main.zig"),
.target = target,
.main_mod_path = obj.main_mod_path,
.main_pkg_path = obj.main_pkg_path,
});
headers_obj.filter = test_filter;
if (test_bin_) |test_bin| {
@@ -563,19 +490,7 @@ pub fn build_(b: *Build) !void {
headers_obj.addOptions("build_options", default_build_options.step(b));
}
// Running `zig build` with no arguments is almost always a mistake.
const mistake_message = b.addSystemCommand(&.{
"echo",
\\
\\error: To build Bun from source, please use `bun run setup` instead of `zig build`"
\\
\\If you want to build the zig code only, run:
\\ 'zig build obj -Dgenerated-code=./build/codegen [...opts]'
\\
\\For more info, see https://bun.sh/docs/project/contributing
\\
});
b.default_step.dependOn(&mistake_message.step);
b.default_step.dependOn(obj_step);
}
pub var original_make_fn: ?*const fn (step: *std.build.Step) anyerror!void = null;

BIN
bun.lockb

Binary file not shown.

View File

@@ -84,7 +84,7 @@ _bun_completions() {
local SUBCOMMANDS="dev bun create run install add remove upgrade completions discord help init pm x";
GLOBAL_OPTIONS[LONG_OPTIONS]="--use --cwd --bunfile --server-bunfile --config --disable-react-fast-refresh --disable-hmr --env-file --extension-order --jsx-factory --jsx-fragment --extension-order --jsx-factory --jsx-fragment --jsx-import-source --jsx-production --jsx-runtime --main-fields --no-summary --version --platform --public-dir --tsconfig-override --define --external --help --inject --loader --origin --port --dump-environment-variables --dump-limits --disable-bun-js";
GLOBAL_OPTIONS[LONG_OPTIONS]="--use --cwd --bunfile --server-bunfile --config --disable-react-fast-refresh --disable-hmr --extension-order --jsx-factory --jsx-fragment --extension-order --jsx-factory --jsx-fragment --jsx-import-source --jsx-production --jsx-runtime --main-fields --no-summary --version --platform --public-dir --tsconfig-override --define --external --help --inject --loader --origin --port --dump-environment-variables --dump-limits --disable-bun-js";
GLOBAL_OPTIONS[SHORT_OPTIONS]="-c -v -d -e -h -i -l -u -p";
PACKAGE_OPTIONS[ADD_OPTIONS_LONG]="--development --optional";

View File

@@ -51,7 +51,7 @@ function __bun_last_cmd --argument-names n
end
set -l bun_install_boolean_flags yarn production optional development no-save dry-run force no-cache silent verbose global
set -l bun_install_boolean_flags_descriptions "Write a yarn.lock file (yarn v1)" "Don't install devDependencies" "Add dependency to optionalDependencies" "Add dependency to devDependencies" "Don't install devDependencies" "Don't install anything" "Always request the latest versions from the registry & reinstall all dependencies" "Ignore manifest cache entirely" "Don't output anything" "Excessively verbose logging" "Use global folder"
set -l bun_install_boolean_flags_descriptions "Write a yarn.lock file (yarn v1)" "Don't install devDependencies" "Add dependency to optionalDependencies" "Add dependency to devDependencies" "Don't install devDependencies" "Don't install anything" "Always request the latest versions from the registry & reinstall all dependenices" "Ignore manifest cache entirely" "Don't output anything" "Excessively verbose logging" "Use global folder"
set -l bun_builtin_cmds dev create help bun upgrade discord run install remove add init link unlink pm x
set -l bun_builtin_cmds_without_run dev create help bun upgrade discord install remove add init pm x

View File

@@ -406,7 +406,6 @@ _bun_run_completion() {
'--cwd[Absolute path to resolve files & entry points from. This just changes the process cwd]:cwd' \
'--config[Config file to load bun from (e.g. -c bunfig.toml]: :->config' \
'-c[Config file to load bun from (e.g. -c bunfig.toml]: :->config' \
'--env-file[Load environment variables from the specified file(s)]:env-file' \
'--extension-order[Defaults to: .tsx,.ts,.jsx,.js,.json]:extension-order' \
'--jsx-factory[Changes the function called when compiling JSX elements using the classic JSX runtime]:jsx-factory' \
'--jsx-fragment[Changes the function called when compiling JSX fragments]:jsx-fragment' \
@@ -573,7 +572,6 @@ _bun_test_completion() {
'--cwd[Set a specific cwd]:cwd' \
'-c[Load config(bunfig.toml)]: :->config' \
'--config[Load config(bunfig.toml)]: :->config' \
'--env-file[Load environment variables from the specified file(s)]:env-file' \
'--extension-order[Defaults to: .tsx,.ts,.jsx,.js,.json]:extension-order' \
'--jsx-factory[Changes the function called when compiling JSX elements using the classic JSX runtime]:jsx-factory' \
'--jsx-fragment[Changes the function called when compiling JSX fragments]:jsx-fragment' \

View File

@@ -78,9 +78,6 @@ subcommands:
- name: server-bunfile
type: string
summary: "Use a specific .bun file for SSR in bun dev (default: node_modules.server.bun)"
- name: env-file
type: string
summary: "Load environment variables from the specified file(s)"
- name: extension-order
type: string
summary: "defaults to: .tsx,.ts,.jsx,.js,.json"
@@ -121,7 +118,7 @@ subcommands:
- frozen-lockfile -- "Disallow changes to lockfile"
- no-save --
- dry-run -- "Don't install anything"
- force -- "Always request the latest versions from the registry & reinstall all dependencies"
- force -- "Always request the latest versions from the registry & reinstall all dependenices"
- name: cache-dir
type: string
summary: "Store & load cached data from a specific directory path"
@@ -156,7 +153,7 @@ subcommands:
- frozen-lockfile -- "Disallow changes to lockfile"
- no-save --
- dry-run -- "Don't install anything"
- force -- "Always request the latest versions from the registry & reinstall all dependencies"
- force -- "Always request the latest versions from the registry & reinstall all dependenices"
- no-cache -- "Ignore manifest cache entirely"
- silent -- "Don't output anything"
- verbose -- "Excessively verbose logging"
@@ -194,7 +191,7 @@ subcommands:
- frozen-lockfile -- "Disallow changes to lockfile"
- no-save --
- dry-run -- "Don't install anything"
- force -- "Always request the latest versions from the registry & reinstall all dependencies"
- force -- "Always request the latest versions from the registry & reinstall all dependenices"
- name: cache-dir
type: string
summary: "Store & load cached data from a specific directory path"

View File

@@ -91,11 +91,6 @@ RUN apk --no-cache add \
FROM alpine:3.18
# Disable the runtime transpiler cache by default inside Docker containers.
# On ephemeral containers, the cache is not useful
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
COPY --from=build /tmp/glibc.apk /tmp/
COPY --from=build /tmp/glibc-bin.apk /tmp/
COPY --from=build /usr/local/bin/bun /usr/local/bin/

View File

@@ -57,11 +57,6 @@ RUN apt-get update -qq \
FROM debian:bullseye-slim
# Disable the runtime transpiler cache by default inside Docker containers.
# On ephemeral containers, the cache is not useful
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin/bun

View File

@@ -58,11 +58,6 @@ FROM debian:bullseye
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin/bun
# Disable the runtime transpiler cache by default inside Docker containers.
# On ephemeral containers, the cache is not useful
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
RUN groupadd bun \
--gid 1000 \
&& useradd bun \

View File

@@ -57,11 +57,6 @@ RUN apt-get update -qq \
FROM gcr.io/distroless/base-nossl-debian11
# Disable the runtime transpiler cache by default inside Docker containers.
# On ephemeral containers, the cache is not useful
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
COPY --from=build /usr/local/bin/bun /usr/local/bin/
# Temporarily use the `build`-stage image binaries to create a symlink:

View File

@@ -26,10 +26,10 @@ Below is a quick "cheat sheet" that doubles as a table of contents. Click an ite
---
- [`File`](#file)
- A subclass of `Blob` that represents a file. Has a `name` and `lastModified` timestamp. There is experimental support in Node.js v20.
<!-- - [`File`](#file)
- _Browser only_. A subclass of `Blob` that represents a file. Has a `name` and `lastModified` timestamp. There is experimental support in Node.js v20; Bun does not support `File` yet; most of its functionality is provided by `BunFile`.
---
--- -->
- [`BunFile`](#bunfile)
- _Bun only_. A subclass of `Blob` that represents a lazily-loaded file on disk. Created with `Bun.file(path)`.

View File

@@ -300,7 +300,7 @@ interface BunFile {
readonly type: string;
text(): Promise<string>;
stream(): ReadableStream;
stream(): Promise<ReadableStream>;
arrayBuffer(): Promise<ArrayBuffer>;
json(): Promise<any>;
writer(params: { highWaterMark?: number }): FileSink;
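To make the `stream()` signature change above concrete, here is a minimal sketch of consuming a `BunFile` as a stream. The `package.json` path is an assumption, and `await` is used so the sketch works whether `stream()` returns a `ReadableStream` directly or a `Promise<ReadableStream>`:

```ts
// Minimal sketch: count the bytes of a file by reading its stream.
// "package.json" is an assumed path; adjust to a file that exists.
const file = Bun.file("package.json");
const stream = await file.stream(); // works with either typing noted above
const reader = stream.getReader();

let total = 0;
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  total += value.length;
}
console.log(`read ${total} bytes`);
```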

View File

@@ -1,110 +0,0 @@
Bun includes a fast native implementation of file globbing.
## Quickstart
**Scan a directory for files matching `*.ts`**:
```ts
import { Glob } from "bun";
const glob = new Glob("*.ts");
for await (const file of glob.scan(".")) {
console.log(file); // => "index.ts"
}
```
**Match a string against a glob pattern**:
```ts
import { Glob } from "bun";
const glob = new Glob("*.ts");
glob.match("index.ts"); // => true
glob.match("index.js"); // => false
```
`Glob` is a class which implements the following interface:
```ts
class Glob {
scan(root: string | ScanOptions): AsyncIterable<string>;
scanSync(root: string | ScanOptions): Iterable<string>;
match(path: string): boolean;
}
interface ScanOptions {
/**
* The root directory to start matching from. Defaults to `process.cwd()`
*/
cwd?: string;
/**
* Allow patterns to match entries that begin with a period (`.`).
*
* @default false
*/
dot?: boolean;
/**
* Return the absolute path for entries.
*
* @default false
*/
absolute?: boolean;
/**
* Indicates whether to traverse descendants of symbolic link directories.
*
* @default false
*/
followSymlinks?: boolean;
/**
* Throw an error when symbolic link is broken
*
* @default false
*/
throwErrorOnBrokenSymlink?: boolean;
/**
* Return only files.
*
* @default true
*/
onlyFiles?: boolean;
}
```
## Supported Glob Patterns
Bun supports the following glob patterns:
### `*` - Match any number of characters except `/`
```ts
const glob = new Glob("*.ts");
glob.match("index.ts"); // => true
glob.match("src/index.ts"); // => false
```
### `**` - Match any number of characters including `/`
```ts
const glob = new Glob("**/*.ts");
glob.match("index.ts"); // => true
glob.match("src/index.ts"); // => true
glob.match("src/index.js"); // => false
```
### `{a,b,c}` - Match any of the given patterns
```ts
const glob = new Glob("{a,b,c}.ts");
glob.match("a.ts"); // => true
glob.match("b.ts"); // => true
glob.match("c.ts"); // => true
glob.match("d.ts"); // => false
```
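The `ScanOptions` interface above can be passed to `scan`/`scanSync` in place of a root string. A short sketch, assuming a `src` directory exists:

```ts
import { Glob } from "bun";

// Sketch: collect absolute paths of TypeScript sources under ./src.
// The "src" directory and the example output paths are assumptions.
const glob = new Glob("**/*.ts");
const files = [
  ...glob.scanSync({ cwd: "src", absolute: true, onlyFiles: true }),
];
console.log(files); // e.g. ["/abs/path/src/index.ts", ...]
```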

View File

@@ -38,11 +38,6 @@ import.meta.resolveSync("zod")
---
- `import.meta.env`
- An alias to `process.env`.
---
- `import.meta.resolve{Sync}`
- Resolve a module specifier (e.g. `"zod"` or `"./file.tsx"`) to an absolute path. Which file would be imported if the specifier were imported from this file?

View File

@@ -1,52 +0,0 @@
Bun implements a semantic versioning API which can be used to compare versions and determine if a version is compatible with another range of versions. The versions and ranges are designed to be compatible with `node-semver`, which is used by npm clients.
It's about 20x faster than `node-semver`.
![Benchmark](https://github.com/oven-sh/bun/assets/709451/94746adc-8aba-4baf-a143-3c355f8e0f78)
Currently, this API is two functions.
#### `Bun.semver.satisfies(version: string, range: string): boolean`
Returns `true` if `version` satisfies `range`, otherwise `false`.
Example:
```typescript
import { semver } from "bun";
semver.satisfies("1.0.0", "^1.0.0"); // true
semver.satisfies("1.0.0", "^1.0.1"); // false
semver.satisfies("1.0.0", "~1.0.0"); // true
semver.satisfies("1.0.0", "~1.0.1"); // false
semver.satisfies("1.0.0", "1.0.0"); // true
semver.satisfies("1.0.0", "1.0.1"); // false
semver.satisfies("1.0.1", "1.0.0"); // false
semver.satisfies("1.0.0", "1.0.x"); // true
semver.satisfies("1.0.0", "1.x.x"); // true
semver.satisfies("1.0.0", "x.x.x"); // true
semver.satisfies("1.0.0", "1.0.0 - 2.0.0"); // true
semver.satisfies("1.0.0", "1.0.0 - 1.0.1"); // true
```
If `range` is invalid, it returns false. If `version` is invalid, it returns false.
#### `Bun.semver.order(versionA: string, versionB: string): 0 | 1 | -1`
Returns `0` if `versionA` and `versionB` are equal, `1` if `versionA` is greater than `versionB`, and `-1` if `versionA` is less than `versionB`.
Example:
```typescript
import { semver } from "bun";
semver.order("1.0.0", "1.0.0"); // 0
semver.order("1.0.0", "1.0.1"); // -1
semver.order("1.0.1", "1.0.0"); // 1
const unsorted = ["1.0.0", "1.0.1", "1.0.0-alpha", "1.0.0-beta", "1.0.0-rc"];
unsorted.sort(semver.order); // ["1.0.0-alpha", "1.0.0-beta", "1.0.0-rc", "1.0.0", "1.0.1"]
console.log(unsorted);
```
If you need other semver functions, feel free to open an issue or pull request.
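As a quick illustration of combining the two functions, here is a hedged sketch that picks the highest version satisfying a range; the version list is invented:

```ts
import { semver } from "bun";

// Pick the highest version that satisfies a range.
// The versions below are made up for illustration.
const versions = ["1.0.0", "1.2.3", "1.5.0", "2.0.0"];
const range = "^1.0.0";

const best = versions
  .filter(v => semver.satisfies(v, range))
  .sort(semver.order)
  .at(-1);

console.log(best); // "1.5.0"
```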

View File

@@ -183,60 +183,6 @@ const proc = Bun.spawn(["echo", "hello"]);
proc.unref();
```
## Inter-process communication (IPC)
Bun supports a direct inter-process communication channel between two `bun` processes. To receive messages from a spawned Bun subprocess, specify an `ipc` handler.
{%callout%}
**Note** — This API is only compatible with other `bun` processes. Use `process.execPath` to get a path to the currently running `bun` executable.
{%/callout%}
```ts#parent.ts
const child = Bun.spawn(["bun", "child.ts"], {
ipc(message) {
/**
* The message received from the sub process
**/
},
});
```
The parent process can send messages to the subprocess using the `.send()` method on the returned `Subprocess` instance. A reference to the sending subprocess is also available as the second argument in the `ipc` handler.
```ts#parent.ts
const childProc = Bun.spawn(["bun", "child.ts"], {
ipc(message, childProc) {
/**
* The message received from the sub process
**/
childProc.send("Respond to child")
},
});
childProc.send("I am your father"); // The parent can send messages to the child as well
```
Meanwhile, the child process can send messages to its parent with `process.send()` and receive messages with `process.on("message")`. This is the same API used for `child_process.fork()` in Node.js.
```ts#child.ts
process.send("Hello from child as string");
process.send({ message: "Hello from child as object" });
process.on("message", (message) => {
// print message from parent
console.log(message);
});
```
All messages are serialized using the JSC `serialize` API, which allows for the same set of [transferrable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) supported by `postMessage` and `structuredClone`, including strings, typed arrays, streams, and objects.
```ts#child.ts
// send a string
process.send("Hello from child as string");
// send an object
process.send({ message: "Hello from child as object" });
```
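Because messages go through structured serialization, structured data survives the round trip unchanged. A small sketch, with an invented payload shape:

```ts
// child.ts — sketch of sending non-string data over IPC.
// The payload shape below is invented for illustration.
process.send({
  kind: "chunk",
  bytes: new Uint8Array([1, 2, 3, 4]),
  meta: { createdAt: Date.now() },
});

process.on("message", message => {
  // Messages from the parent arrive already deserialized.
  console.log("parent said:", message);
});
```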
## Blocking API (`Bun.spawnSync()`)
Bun provides a synchronous equivalent of `Bun.spawn` called `Bun.spawnSync`. This is a blocking API that supports the same inputs and parameters as `Bun.spawn`. It returns a `SyncSubprocess` object, which differs from `Subprocess` in a few ways.

View File

@@ -398,7 +398,7 @@ buf; // => Uint8Array(25)
compressed; // => Uint8Array(10)
```
The second argument supports the same set of configuration options as [`Bun.gzipSync`](#bungzipsync).
The second argument supports the same set of configuration options as [`Bun.gzipSync`](#bun.gzipSync).
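As a hedged sketch of passing those options, assuming the shared option set includes a zlib-style `level` field:

```ts
// Sketch: trade speed for compression ratio with a zlib-style level.
// The `level` option is an assumption based on the shared option set above.
const data = new TextEncoder().encode("hello ".repeat(100));

const compressed = Bun.deflateSync(data, { level: 9 });
const restored = Bun.inflateSync(compressed);

console.log(compressed.byteLength, restored.byteLength); // compressed is much smaller
```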
## `Bun.inflateSync()`

View File

@@ -247,6 +247,10 @@ This gives you better control over backpressure in your server.
## Connect to a `Websocket` server
{% callout %}
**🚧** — The `WebSocket` client still does not pass the full [Autobahn test suite](https://github.com/crossbario/autobahn-testsuite) and should not be considered ready for production.
{% /callout %}
Bun implements the `WebSocket` class. To create a WebSocket client that connects to a `ws://` or `wss://` server, create an instance of `WebSocket`, as you would in the browser.
```ts

View File

@@ -328,7 +328,7 @@ Depending on the target, Bun will apply different module resolution rules and op
All bundles generated with `target: "bun"` are marked with a special `// @bun` pragma, which indicates to the Bun runtime that there's no need to re-transpile the file before execution.
If any entrypoints contains a Bun shebang (`#!/usr/bin/env bun`) the bundler will default to `target: "bun"` instead of `"browser"`.
If any entrypoints contains a Bun shebang (`#!/usr/bin/env bun`) the bundler will default to `target: "bun"` instead of `"browser`.
---
@@ -1052,7 +1052,7 @@ $ bun build ./index.tsx --outdir ./out --define 'STRING="value"' --define "neste
### `loader`
A map of file extensions to [built-in loader names](https://bun.sh/docs/bundler/loaders#built-in-loaders). This can be used to quickly customize how certain files are loaded.
A map of file extensions to [built-in loader names](https://bun.sh/docs/bundler/loaders#built-in-loaders). This can be used to quickly customize how certain file files are loaded.
{% codetabs %}

View File

@@ -1,155 +0,0 @@
To add a particular package:
```bash
$ bun add preact
```
To specify a version, version range, or tag:
```bash
$ bun add zod@3.20.0
$ bun add zod@^3.0.0
$ bun add zod@latest
```
## `--dev`
{% callout %}
**Alias**`--development`, `-d`, `-D`
{% /callout %}
To add a package as a dev dependency (`"devDependencies"`):
```bash
$ bun add --dev @types/react
$ bun add -d @types/react
```
## `--optional`
To add a package as an optional dependency (`"optionalDependencies"`):
```bash
$ bun add --optional lodash
```
## `--exact`
To add a package and pin to the resolved version, use `--exact`. This will resolve the version of the package and add it to your `package.json` with an exact version number instead of a version range.
```bash
$ bun add react --exact
$ bun add react -E
```
This will add the following to your `package.json`:
```jsonc
{
"dependencies": {
// without --exact
"react": "^18.2.0", // this matches >= 18.2.0 < 19.0.0
// with --exact
"react": "18.2.0" // this matches only 18.2.0 exactly
}
}
```
To view a complete list of options for this command:
```bash
$ bun add --help
```
## `--global`
{% callout %}
**Note** — This would not modify package.json of your current project folder.
**Alias** - `bun add --global`, `bun add -g`, `bun install --global` and `bun install -g`
{% /callout %}
To install a package globally, use the `-g`/`--global` flag. This will not modify the `package.json` of your current project. Typically this is used for installing command-line tools.
```bash
$ bun add --global cowsay # or `bun add -g cowsay`
$ cowsay "Bun!"
______
< Bun! >
------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
```
{% details summary="Configuring global installation behavior" %}
```toml
[install]
# where `bun add --global` installs packages
globalDir = "~/.bun/install/global"
# where globally-installed package bins are linked
globalBinDir = "~/.bun/bin"
```
{% /details %}
## Trusted dependencies
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts for installed dependencies, such as `postinstall`. These scripts represent a potential security risk, as they can execute arbitrary code on your machine.
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
Bun reads this field and will run lifecycle scripts for `my-trusted-package`.
<!-- Bun maintains an allow-list of popular packages containing `postinstall` scripts that are known to be safe. To run lifecycle scripts for packages that aren't on this list, add the package to `trustedDependencies` in your package.json. -->
## Git dependencies
To add a dependency from a git repository:
```bash
$ bun add git@github.com:moment/moment.git
```
Bun supports a variety of protocols, including [`github`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#github-urls), [`git`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#git-urls-as-dependencies), `git+ssh`, `git+https`, and many more.
```json
{
"dependencies": {
"dayjs": "git+https://github.com/iamkun/dayjs.git",
"lodash": "git+ssh://github.com/lodash/lodash.git#4.17.21",
"moment": "git@github.com:moment/moment.git",
"zod": "github:colinhacks/zod"
}
}
```
## Tarball dependencies
A package name can correspond to a publicly hosted `.tgz` file. During installation, Bun will download and install the package from the specified tarball URL, rather than from the package registry.
```sh
$ bun add zod@https://registry.npmjs.org/zod/-/zod-3.21.4.tgz
```
This will add the following line to your `package.json`:
```json#package.json
{
"dependencies": {
"zod": "https://registry.npmjs.org/zod/-/zod-3.21.4.tgz"
}
}
```

View File

@@ -1,27 +1,3 @@
Scaffold an empty Bun project with the interactive `bun init` command.
```bash
$ bun init
bun init helps you get started with a minimal project and tries to
guess sensible defaults. Press ^C anytime to quit.
package name (quickstart):
entry point (index.ts):
Done! A package.json file was saved in the current directory.
+ index.ts
+ .gitignore
+ tsconfig.json (for editor auto-complete)
+ README.md
To get started, run:
bun run index.ts
```
Press `enter` to accept the default answer for each prompt, or pass the `-y` flag to auto-accept the defaults.
{% details summary="How `bun init` works" %}
`bun init` is a quick way to start a blank project with Bun. It guesses with sane defaults and is non-destructive when run multiple times.
![Demo](https://user-images.githubusercontent.com/709451/183006613-271960a3-ff22-4f7c-83f5-5e18f684c836.gif)
@@ -37,4 +13,6 @@ If you pass `-y` or `--yes`, it will assume you want to continue without asking
At the end, it runs `bun install` to install `bun-types`.
{% /details %}
#### How is `bun init` different than `bun create`?
`bun init` is for blank projects. `bun create` applies templates.

View File

@@ -59,8 +59,11 @@ optional = true
# Install local devDependencies (default: true)
dev = true
# Install peerDependencies (default: true)
peer = true
# Install peerDependencies (default: false)
peer = false
# Whether to use the github REST api (unauthenticated)
github.api = true
# When using `bun install -g`, install packages here
globalDir = "~/.bun/install/global"
@@ -170,7 +173,7 @@ bun stores normalized `cpu` and `os` values from npm in the lockfile, along with
## Peer dependencies?
Peer dependencies are handled similarly to yarn. `bun install` will automatically install peer dependencies. If the dependency is marked optional in `peerDependenciesMeta`, an existing dependency will be chosen if possible.
Peer dependencies are handled similarly to yarn. `bun install` does not automatically install peer dependencies and will try to choose an existing dependency.
## Lockfile

256
docs/cli/create.md Normal file
View File

@@ -0,0 +1,256 @@
## `bun init`
Scaffold an empty project with `bun init`. It's an interactive tool.
```bash
$ bun init
bun init helps you get started with a minimal project and tries to
guess sensible defaults. Press ^C anytime to quit.
package name (quickstart):
entry point (index.ts):
Done! A package.json file was saved in the current directory.
+ index.ts
+ .gitignore
+ tsconfig.json (for editor auto-complete)
+ README.md
To get started, run:
bun run index.ts
```
Press `enter` to accept the default answer for each prompt, or pass the `-y` flag to auto-accept the defaults.
## `bun create`
Template a new Bun project with `bun create`.
```bash
$ bun create <template> <destination>
```
{% callout %}
**Note** You don't need `bun create` to use Bun. You don't need any configuration at all. This command exists to make getting started a bit quicker and easier.
{% /callout %}
A template can take a number of forms:
```bash
$ bun create <template> # an official template (remote)
$ bun create <username>/<repo> # a GitHub repo (remote)
$ bun create <local-template> # a custom template (local)
```
Running `bun create` performs the following steps:
- Download the template (remote templates only)
- Copy all template files into the destination folder. By default Bun will _not overwrite_ any existing files. Use the `--force` flag to overwrite existing files.
- Install dependencies with `bun install`.
- Initialize a fresh Git repo. Opt out with the `--no-git` flag.
- Run the template's configured `start` script, if defined.
<!-- ## Official templates
The following official templates are available.
```bash
bun create next ./myapp
bun create react ./myapp
bun create svelte-kit ./myapp
bun create elysia ./myapp
bun create hono ./myapp
bun create kingworld ./myapp
```
Each of these corresponds to a directory in the [bun-community/create-templates](https://github.com/bun-community/create-templates) repo. If you think a major framework is missing, please open a PR there. This list will change over time as additional examples are added. To see an up-to-date list, run `bun create` with no arguments.
```bash
$ bun create
Welcome to bun! Create a new project by pasting any of the following:
<list of templates>
```
{% callout %}
⚡️ **Speed** — At the time of writing, `bun create react app` runs ~11x faster on a M1 Macbook Pro than `yarn create react-app app`.
{% /callout %} -->
## GitHub repos
A template of the form `<username>/<repo>` will be downloaded from GitHub.
```bash
$ bun create ahfarmer/calculator ./myapp
```
Complete GitHub URLs will also work:
```bash
$ bun create github.com/ahfarmer/calculator ./myapp
$ bun create https://github.com/ahfarmer/calculator ./myapp
```
Bun installs the files as they currently exist on the default branch (usually `main`). Unlike `git clone`, it doesn't download the commit history or configure a remote.
## Local templates
{% callout %}
**⚠️ Warning** — Unlike remote templates, running `bun create` with a local template will delete the entire destination folder if it already exists! Be careful.
{% /callout %}
Bun's templater can be extended to support custom templates defined on your local file system. These templates should live in one of the following directories:
- `$HOME/.bun-create/<name>`: global templates
- `<project root>/.bun-create/<name>`: project-specific templates
{% callout %}
**Note** — You can customize the global template path by setting the `BUN_CREATE_DIR` environment variable.
{% /callout %}
To create a local template, navigate to `$HOME/.bun-create` and create a new directory with the desired name of your template.
```bash
$ cd $HOME/.bun-create
$ mkdir foo
$ cd foo
```
Then, create a `package.json` file in that directory with the following contents:
```json
{
"name": "foo"
}
```
You can run `bun create foo` elsewhere on your file system to verify that Bun is correctly finding your local template.
{% table %}
---
- `postinstall`
- runs after installing dependencies
---
- `preinstall`
- runs before installing dependencies
<!-- ---
- `start`
- a command to auto-start the application -->
{% /table %}
Each of these can correspond to a string or array of strings. An array of commands will be executed in order. Here is an example:
```json
{
"name": "@bun-examples/simplereact",
"version": "0.0.1",
"main": "index.js",
"dependencies": {
"react": "^17.0.2",
"react-dom": "^17.0.2"
},
"bun-create": {
"preinstall": "echo 'Installing...'", // a single command
"postinstall": ["echo 'Done!'"], // an array of commands
"start": "bun run echo 'Hello world!'"
}
}
```
When cloning a template, `bun create` will automatically remove the `"bun-create"` section from `package.json` before writing it to the destination folder.
## Reference
### CLI flags
{% table %}
- Flag
- Description
---
- `--force`
- Overwrite existing files
---
- `--no-install`
- Skip installing `node_modules` & tasks
---
- `--no-git`
- Don't initialize a git repository
---
- `--open`
- Start & open in-browser after finish
{% /table %}
### Environment variables
{% table %}
- Name
- Description
---
- `GITHUB_API_DOMAIN`
- If you're using a GitHub enterprise or a proxy, you can customize the GitHub domain Bun pings for downloads
---
- `GITHUB_API_TOKEN`
- This lets `bun create` work with private repositories or if you get rate-limited
{% /table %}
{% details summary="How `bun create` works" %}
When you run `bun create ${template} ${destination}`, here's what happens:
IF remote template
1. GET `registry.npmjs.org/@bun-examples/${template}/latest` and parse it
2. GET `registry.npmjs.org/@bun-examples/${template}/-/${template}-${latestVersion}.tgz`
3. Decompress & extract `${template}-${latestVersion}.tgz` into `${destination}`
- If there are files that would overwrite, warn and exit unless `--force` is passed
IF GitHub repo
1. Download the tarball from GitHub's API
2. Decompress & extract into `${destination}`
- If there are files that would overwrite, warn and exit unless `--force` is passed
ELSE IF local template
1. Open local template folder
2. Delete destination directory recursively
3. Copy files recursively using the fastest system calls available (on macOS `fcopyfile` and Linux, `copy_file_range`). Do not copy or traverse into `node_modules` folder if exists (this alone makes it faster than `cp`)
4. Parse the `package.json` (again!), update `name` to be `${basename(destination)}`, remove the `bun-create` section from the `package.json` and save the updated `package.json` to disk.
- IF Next.js is detected, add `bun-framework-next` to the list of dependencies
- IF Create React App is detected, add the entry point in /src/index.{js,jsx,ts,tsx} to `public/index.html`
- IF Relay is detected, add `bun-macro-relay` so that Relay works
5. Auto-detect the npm client, preferring `pnpm`, `yarn` (v1), and lastly `npm`
6. Run any tasks defined in `"bun-create": { "preinstall" }` with the npm client
7. Run `${npmClient} install` unless `--no-install` is passed OR no dependencies are in package.json
8. Run any tasks defined in `"bun-create": { "postinstall" }` with the npm client
9. Run `git init; git add -A .; git commit -am "Initial Commit";`
- Rename `gitignore` to `.gitignore`. NPM automatically removes `.gitignore` files from appearing in packages.
- If there are dependencies, this runs in a separate thread concurrently while node_modules are being installed
- Using libgit2 if available was tested and performed 3x slower in microbenchmarks
{% /details %}

View File

@@ -9,7 +9,7 @@ The `bun` CLI contains a Node.js-compatible package manager designed to be a dra
{% /callout %}
{% details summary="For Linux users" %}
The recommended minimum Linux Kernel version is 5.6. If you're on Linux kernel 5.1 - 5.5, `bun install` will work, but HTTP requests will be slow due to a lack of support for io_uring's `connect()` operation.
The minimum Linux Kernel version is 5.1. If you're on Linux kernel 5.1 - 5.5, `bun install` should still work, but HTTP requests will be slow due to a lack of support for io_uring's `connect()` operation.
If you're using Ubuntu 20.04, here's how to install a [newer kernel](https://wiki.ubuntu.com/Kernel/LTSEnablementStack):
@@ -23,19 +23,41 @@ sudo apt install --install-recommends linux-generic-hwe-20.04
{% /details %}
## `bun install`
To install all dependencies of a project:
```bash
$ bun install
```
On Linux, `bun install` tends to install packages 20-100x faster than `npm install`. On macOS, it's more like 4-80x.
![package install benchmark](https://user-images.githubusercontent.com/709451/147004342-571b6123-17a9-49a2-8bfd-dcfc5204047e.png)
Running `bun install` will:
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun will install `peerDependencies` by default.
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun does not install `peerDependencies` by default.
- **Run** your project's `{pre|post}install` and `{pre|post}prepare` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Write** a `bun.lockb` lockfile to the project root.
## Logging
To install in production mode (i.e. without `devDependencies` or `optionalDependencies`):
```bash
$ bun install --production
```
To install with reproducible dependencies, use `--frozen-lockfile`. If your `package.json` disagrees with `bun.lockb`, Bun will exit with an error. This is useful for production builds and CI environments.
```bash
$ bun install --frozen-lockfile
```
To perform a dry run (i.e. don't actually install anything):
```bash
$ bun install --dry-run
```
To modify logging verbosity:
@@ -44,113 +66,8 @@ $ bun install --verbose # debug logging
$ bun install --silent # no logging
```
## Lifecycle scripts
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts like `postinstall` for installed dependencies. Executing arbitrary scripts represents a potential security risk.
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
Then re-install the package. Bun will read this field and run lifecycle scripts for `my-trusted-package`.
## Workspaces
Bun supports `"workspaces"` in package.json. For complete documentation refer to [Package manager > Workspaces](/docs/install/workspaces).
```json#package.json
{
"name": "my-app",
"version": "1.0.0",
"workspaces": ["packages/*"],
"dependencies": {
"preact": "^10.5.13"
}
}
```
## Overrides and resolutions
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](/docs/install/overrides) for complete documentation.
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "overrides": {
+ "bar": "~4.4.0"
+ }
}
```
## Global packages
To install a package globally, use the `-g`/`--global` flag. Typically this is used for installing command-line tools.
```bash
$ bun install --global cowsay # or `bun install -g cowsay`
$ cowsay "Bun!"
______
< Bun! >
------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
```
## Production mode
To install in production mode (i.e. without `devDependencies` or `optionalDependencies`):
```bash
$ bun install --production
```
For reproducible installs, use `--frozen-lockfile`. This will install the exact versions of each package specified in the lockfile. If your `package.json` disagrees with `bun.lockb`, Bun will exit with an error. The lockfile will not be updated.
```bash
$ bun install --frozen-lockfile
```
For more information on Bun's binary lockfile `bun.lockb`, refer to [Package manager > Lockfile](/docs/install/lockfile).
## Dry run
To perform a dry run (i.e. don't actually install anything):
```bash
$ bun install --dry-run
```
## Non-npm dependencies
Bun supports installing dependencies from Git, GitHub, and local or remotely-hosted tarballs. For complete documentation refer to [Package manager > Git, GitHub, and tarball dependencies](/docs/cli/add).
```json#package.json
{
"dependencies": {
"dayjs": "git+https://github.com/iamkun/dayjs.git",
"lodash": "git+ssh://github.com/lodash/lodash.git#4.17.21",
"moment": "git@github.com:moment/moment.git",
"zod": "github:colinhacks/zod",
"react": "https://registry.npmjs.org/react/-/react-18.2.0.tgz"
}
}
```
## Configuration
The default behavior of `bun install` can be configured in `bunfig.toml`. The default values are shown below.
{% details summary="Configuring behavior" %}
The default behavior of `bun install` can be configured in `bunfig.toml`:
```toml
[install]
@@ -162,7 +79,7 @@ optional = true
dev = true
# whether to install peerDependencies
peer = true
peer = false
# equivalent to `--production` flag
production = false
@@ -172,11 +89,230 @@ frozenLockfile = false
# equivalent to `--dry-run` flag
dryRun = false
# whether to use the github REST api (unauthenticated)
github.api = true
```
{% /details %}
## `bun add`
To add a particular package:
```bash
$ bun add preact
```
To specify a version, version range, or tag:
```bash
$ bun add zod@3.20.0
$ bun add zod@^3.0.0
$ bun add zod@latest
```
To add a package as a dev dependency (`"devDependencies"`):
```bash
$ bun add --dev @types/react
$ bun add -d @types/react
```
To add a package as an optional dependency (`"optionalDependencies"`):
```bash
$ bun add --optional lodash
```
To add a package and pin to the resolved version, use `--exact`. This will resolve the version of the package and add it to your `package.json` with an exact version number instead of a version range.
```bash
$ bun add react --exact
```
This will add the following to your `package.json`:
```jsonc
{
"dependencies": {
// without --exact
"react": "^18.2.0", // this matches >= 18.2.0 < 19.0.0
// with --exact
"react": "18.2.0" // this matches only 18.2.0 exactly
}
}
```
To install a package globally:
```bash
$ bun add --global cowsay # or `bun add -g cowsay`
$ cowsay "Bun!"
______
< Bun! >
------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
```
{% details summary="Configuring global installation behavior" %}
```toml
[install]
# where `bun install --global` installs packages
globalDir = "~/.bun/install/global"
# where globally-installed package bins are linked
globalBinDir = "~/.bun/bin"
```
{% /details %}
To view a complete list of options for a given command:
```bash
$ bun add --help
```
## `bun remove`
To remove a dependency:
```bash
$ bun remove preact
```
## `bun update`
To update all dependencies to the latest version _that's compatible with the version range specified in your `package.json`_:
```sh
$ bun update
```
This will not edit your `package.json`. There's currently no command to force-update all dependencies to the latest version regardless of version ranges.
## `bun link`
Use `bun link` in a local directory to register the current package as a "linkable" package.
```bash
$ cd /path/to/cool-pkg
$ cat package.json
{
"name": "cool-pkg",
"version": "1.0.0"
}
$ bun link
bun link v1.x (7416672e)
Success! Registered "cool-pkg"
To use cool-pkg in a project, run:
bun link cool-pkg
Or add it in dependencies in your package.json file:
"cool-pkg": "link:cool-pkg"
```
This package can now be "linked" into other projects using `bun link cool-pkg`. This will create a symlink in the `node_modules` directory of the target project, pointing to the local directory.
```bash
$ cd /path/to/my-app
$ bun link cool-pkg
```
In addition, the `--save` flag can be used to add `cool-pkg` to the `dependencies` field of your app's package.json with a special version specifier that tells Bun to load from the registered local directory instead of installing from `npm`:
```json-diff
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
+ "cool-pkg": "link:cool-pkg"
}
}
```
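Once linked and added to `dependencies`, the package resolves like any other import in your app's code. A minimal sketch, assuming `cool-pkg` provides a default export (hypothetical):
```ts
// my-app/index.ts
// "cool-pkg" resolves to the symlinked /path/to/cool-pkg directory
import coolPkg from "cool-pkg"; // hypothetical default export

console.log(coolPkg);
```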
## Trusted dependencies
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts for installed dependencies, such as `postinstall`. These scripts represent a potential security risk, as they can execute arbitrary code on your machine.
<!-- Bun maintains an allow-list of popular packages containing `postinstall` scripts that are known to be safe. To run lifecycle scripts for packages that aren't on this list, add the package to `trustedDependencies` in your package.json. -->
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
<!-- ```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": {
+ "my-trusted-package": "*"
+ }
}
``` -->
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
Bun reads this field and will run lifecycle scripts for `my-trusted-package`.
<!-- If you specify a version range, Bun will only execute lifecycle scripts if the resolved package version matches the range. -->
<!--
```json
{
"name": "my-app",
"version": "1.0.0",
"trustedDependencies": {
"my-trusted-package": "^1.0.0"
}
}
``` -->
## Git dependencies
To add a dependency from a git repository:
```bash
$ bun install git@github.com:moment/moment.git
```
Bun supports a variety of protocols, including [`github`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#github-urls), [`git`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#git-urls-as-dependencies), `git+ssh`, `git+https`, and many more.
```json
{
"dependencies": {
"dayjs": "git+https://github.com/iamkun/dayjs.git",
"lodash": "git+ssh://github.com/lodash/lodash.git#4.17.21",
"moment": "git@github.com:moment/moment.git",
"zod": "github:colinhacks/zod"
}
}
```
## Tarball dependencies
A package name can correspond to a publicly hosted `.tgz` file. During `bun install`, Bun will download and install the package from the specified tarball URL, rather than from the package registry.
```json#package.json
{
"dependencies": {
"zod": "https://registry.npmjs.org/zod/-/zod-3.21.4.tgz"
}
}
```
## CI/CD
Looking to speed up your CI? Use the official [`oven-sh/setup-bun`](https://github.com/oven-sh/setup-bun) action to install `bun` in a GitHub Actions pipeline.
Looking to speed up your CI? Use the official `oven-sh/setup-bun` action to install `bun` in a GitHub Actions pipeline.
```yaml#.github/workflows/release.yml
name: bun-types

View File

@@ -1,46 +0,0 @@
Use `bun link` in a local directory to register the current package as a "linkable" package.
```bash
$ cd /path/to/cool-pkg
$ cat package.json
{
"name": "cool-pkg",
"version": "1.0.0"
}
$ bun link
bun link v1.x (7416672e)
Success! Registered "cool-pkg"
To use cool-pkg in a project, run:
bun link cool-pkg
Or add it in dependencies in your package.json file:
"cool-pkg": "link:cool-pkg"
```
This package can now be "linked" into other projects using `bun link cool-pkg`. This will create a symlink in the `node_modules` directory of the target project, pointing to the local directory.
```bash
$ cd /path/to/my-app
$ bun link cool-pkg
```
In addition, the `--save` flag can be used to add `cool-pkg` to the `dependencies` field of your app's package.json with a special version specifier that tells Bun to load from the registered local directory instead of installing from `npm`:
```json-diff
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
+ "cool-pkg": "link:cool-pkg"
}
}
```
To _unregister_ a local package, navigate to the package's root directory and run `bun unlink`.
```bash
$ cd /path/to/cool-pkg
$ bun unlink
bun unlink v1.x (7416672e)
```

View File

@@ -1,5 +0,0 @@
To remove a dependency:
```bash
$ bun remove ts-node
```

View File

@@ -1,17 +0,0 @@
To update all dependencies to the latest version _that's compatible with the version range specified in your `package.json`_:
```sh
$ bun update
```
## `--force`
{% callout %}
**Alias**: `-f`
{% /callout %}
By default, Bun respects the version ranges defined in your `package.json`. To ignore these ranges and update every dependency to its latest version, pass the `--force` flag.
```sh
$ bun update --force
```

View File

@@ -1,140 +0,0 @@
---
name: Containerize a Bun application with Docker
---
{% callout %}
This guide assumes you already have [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed.
{% /callout %}
[Docker](https://www.docker.com) is a platform for packaging and running an application as a lightweight, portable _container_ that encapsulates all the necessary dependencies.
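---
This guide assumes you already have a Bun HTTP server listening on port 3000 in `index.ts`. A minimal sketch of such a file (illustrative only, not part of the original guide):
```ts#index.ts
// a minimal Bun HTTP server listening on port 3000
const server = Bun.serve({
  port: 3000,
  fetch() {
    return new Response("Hello, World!");
  },
});

console.log(`Listening on http://localhost:${server.port}`);
```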
---
To _containerize_ our application, we define a `Dockerfile`. This file contains a list of instructions to initialize the container, copy our local project files into it, install dependencies, and start the application.
```docker#Dockerfile
# use the official Bun image
# see all versions at https://hub.docker.com/r/oven/bun/tags
FROM oven/bun:1 as base
WORKDIR /usr/src/app
# install dependencies into temp directory
# this will cache them and speed up future builds
FROM base AS install
RUN mkdir -p /temp/dev
COPY package.json bun.lockb /temp/dev/
RUN cd /temp/dev && bun install --frozen-lockfile
# install with --production (exclude devDependencies)
RUN mkdir -p /temp/prod
COPY package.json bun.lockb /temp/prod/
RUN cd /temp/prod && bun install --frozen-lockfile --production
# copy node_modules from temp directory
# then copy all (non-ignored) project files into the image
FROM install AS prerelease
COPY --from=install /temp/dev/node_modules node_modules
COPY . .
# [optional] tests & build
ENV NODE_ENV=production
RUN bun test
RUN bun run build
# copy production dependencies and source code into final image
FROM base AS release
COPY --from=install /temp/prod/node_modules node_modules
COPY --from=prerelease /usr/src/app/index.ts .
COPY --from=prerelease /usr/src/app/package.json .
# run the app
USER bun
EXPOSE 3000/tcp
ENTRYPOINT [ "bun", "run", "index.ts" ]
```
---
Now that you have your `Dockerfile`, let's add a `.dockerignore`, which has the same syntax as `.gitignore`. Here you specify the files and directories that should not be copied into any stage of the Docker build. An example ignore file:
```txt#.dockerignore
node_modules
Dockerfile*
docker-compose*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
Makefile
helm-charts
.env
.editorconfig
.idea
coverage*
```
---
We'll now use `docker build` to convert this `Dockerfile` into a _Docker image_: a self-contained template containing all the dependencies and configuration required to run the application.
The `-t` flag lets us specify a name for the image, and `--pull` tells Docker to automatically download the latest version of the base image (`oven/bun`). The initial build will take longer, as Docker will download all the base images and dependencies.
```bash
$ docker build --pull -t bun-hello-world .
[+] Building 0.9s (21/21) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 37B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 35B 0.0s
=> [internal] load metadata for docker.io/oven/bun:1 0.8s
=> [auth] oven/bun:pull token for registry-1.docker.io 0.0s
=> [base 1/2] FROM docker.io/oven/bun:1@sha256:373265748d3cd3624cb3f3ee6004f45b1fc3edbd07a622aeeec17566d2756997 0.0s
=> [internal] load build context 0.0s
=> => transferring context: 155B 0.0s
# ...lots of commands...
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:360663f7fdcd6f11e8e94761d5592e2e4dfc8d167f034f15cd5a863d5dc093c4 0.0s
=> => naming to docker.io/library/bun-hello-world 0.0s
```
---
We've built a new _Docker image_. Now let's use that image to spin up an actual, running _container_.
We'll use `docker run` to start a new container using the `bun-hello-world` image. It will be run in _detached_ mode (`-d`) and we'll map the container's port 3000 to our local machine's port 3000 (`-p 3000:3000`).
The `run` command prints a string representing the _container ID_.
```sh
$ docker run -d -p 3000:3000 bun-hello-world
7f03e212a15ede8644379bce11a13589f563d3909a9640446c5bbefce993678d
```
---
The container is now running in the background. Visit [localhost:3000](http://localhost:3000). You should see a `Hello, World!` message.
---
To stop the container, we'll use `docker stop <container-id>`.
```sh
$ docker stop 7f03e212a15ede8644379bce11a13589f563d3909a9640446c5bbefce993678d
```
---
If you can't find the container ID, you can use `docker ps` to list all running containers.
```sh
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
7f03e212a15e bun-hello-world "bun run index.ts" 2 minutes ago Up 2 minutes 0.0.0.0:3000->3000/tcp flamboyant_cerf
```
---
That's it! Refer to the [Docker documentation](https://docs.docker.com/) for more advanced usage.

View File

@@ -1,185 +0,0 @@
---
name: Use Drizzle ORM with Bun
---
Drizzle is an ORM that supports both a SQL-like "query builder" API and an ORM-like [Queries API](https://orm.drizzle.team/docs/rqb). It supports the `bun:sqlite` built-in module.
---
Let's get started by creating a fresh project with `bun init` and installing Drizzle.
```sh
$ bun init -y
$ bun add drizzle-orm
$ bun add -D drizzle-kit
```
---
Then we'll connect to a SQLite database using the `bun:sqlite` module and create the Drizzle database instance.
```ts#db.ts
import { drizzle } from "drizzle-orm/bun-sqlite";
import { Database } from "bun:sqlite";
const sqlite = new Database("sqlite.db");
export const db = drizzle(sqlite);
```
---
To see the database in action, add these lines to `index.ts`.
```ts#index.ts
import { db } from "./db";
import { sql } from "drizzle-orm";
const query = sql`select "hello world" as text`;
const result = db.get<{ text: string }>(query);
console.log(result);
```
---
Then run `index.ts` with Bun. Bun will automatically create `sqlite.db` and execute the query.
```sh
$ bun run index.ts
{
text: "hello world"
}
```
---
Let's give our database a proper schema. Create a `schema.ts` file and define a `movies` table.
```ts#schema.ts
import { sqliteTable, text, integer } from "drizzle-orm/sqlite-core";
export const movies = sqliteTable("movies", {
id: integer("id").primaryKey(),
title: text("name"),
releaseYear: integer("release_year"),
});
```
---
We can use the `drizzle-kit` CLI to generate an initial SQL migration.
```sh
$ bunx drizzle-kit generate:sqlite --schema ./schema.ts
```
---
This creates a new `drizzle` directory containing a `.sql` migration file and `meta` directory.
```txt
drizzle
├── 0000_ordinary_beyonder.sql
└── meta
├── 0000_snapshot.json
└── _journal.json
```
---
We can execute these migrations with a simple `migrate.ts` script.
This script creates a new connection to a SQLite database that writes to `sqlite.db`, then executes all unexecuted migrations in the `drizzle` directory.
```ts#migrate.ts
import { migrate } from "drizzle-orm/bun-sqlite/migrator";
import { drizzle } from "drizzle-orm/bun-sqlite";
import { Database } from "bun:sqlite";
const sqlite = new Database("sqlite.db");
const db = drizzle(sqlite);
await migrate(db, { migrationsFolder: "./drizzle" });
```
---
We can run this script with `bun` to execute the migration.
```sh
$ bun run migrate.ts
```
---
Now that we have a database, let's add some data to it. Create a `seed.ts` file with the following contents.
```ts#seed.ts
import { db } from "./db";
import * as schema from "./schema";
await db.insert(schema.movies).values([
{
title: "The Matrix",
releaseYear: 1999,
},
{
title: "The Matrix Reloaded",
releaseYear: 2003,
},
{
title: "The Matrix Revolutions",
releaseYear: 2003,
},
]);
console.log(`Seeding complete.`);
```
---
Then run this file.
```sh
$ bun run seed.ts
Seeding complete.
```
---
We finally have a database with a schema and some sample data. Let's use Drizzle to query it. Replace the contents of `index.ts` with the following.
```ts#index.ts
import * as schema from "./schema";
import { db } from "./db";
const result = await db.select().from(schema.movies);
console.log(result);
```
---
Then run the file. You should see the three movies we inserted.
```sh
$ bun run index.ts
bun run index.ts
[
{
id: 1,
title: "The Matrix",
releaseYear: 1999
}, {
id: 2,
title: "The Matrix Reloaded",
releaseYear: 2003
}, {
id: 3,
title: "The Matrix Revolutions",
releaseYear: 2003
}
]
```
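---
To filter the results, combine `.where()` with Drizzle's comparison helpers. A minimal sketch using the schema above (illustrative, not part of the original guide):
```ts
import { eq } from "drizzle-orm";
import { db } from "./db";
import * as schema from "./schema";

// select only the movies released in 2003
const sequels = await db
  .select()
  .from(schema.movies)
  .where(eq(schema.movies.releaseYear, 2003));

console.log(sequels);
```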
---
Refer to the [Drizzle website](https://orm.drizzle.team/docs/overview) for complete documentation.

View File

@@ -1,227 +0,0 @@
---
name: Use EdgeDB with Bun
---
EdgeDB is a graph-relational database powered by Postgres under the hood. It provides a declarative schema language, migrations system, and object-oriented query language, in addition to supporting raw SQL queries. It solves the object-relational mapping problem at the database layer, eliminating the need for an ORM library in your application code.
---
First, [install EdgeDB](https://www.edgedb.com/install) if you haven't already.
{% codetabs %}
```sh#Linux/macOS
$ curl --proto '=https' --tlsv1.2 -sSf https://sh.edgedb.com | sh
```
```sh#Windows
$ iwr https://ps1.edgedb.com -useb | iex
```
{% /codetabs %}
---
Use `bun init` to create a fresh project.
```sh
$ mkdir my-edgedb-app
$ cd my-edgedb-app
$ bun init -y
```
---
We'll use the EdgeDB CLI to initialize an EdgeDB instance for our project. This creates an `edgedb.toml` file in our project root.
```sh
$ edgedb project init
No `edgedb.toml` found in `/Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app` or above
Do you want to initialize a new project? [Y/n]
> Y
Specify the name of EdgeDB instance to use with this project [default: my_edgedb_app]:
> my_edgedb_app
Checking EdgeDB versions...
Specify the version of EdgeDB to use with this project [default: x.y]:
> x.y
┌─────────────────────┬────────────────────────────────────────────────────────────────────────┐
│ Project directory │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app │
│ Project config │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/edgedb.toml │
│ Schema dir (empty) │ /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/dbschema │
│ Installation method │ portable package │
│ Version │ x.y+6d5921b │
│ Instance name │ my_edgedb_app │
└─────────────────────┴────────────────────────────────────────────────────────────────────────┘
Version x.y+6d5921b is already downloaded
Initializing EdgeDB instance...
Applying migrations...
Everything is up to date. Revision initial
Project initialized.
To connect to my_edgedb_app, run `edgedb`
```
---
To see if the database is running, let's open a REPL and run a simple query.
Then run `\quit` to exit the REPL.
```sh
$ edgedb
edgedb> select 1 + 1;
2
edgedb> \quit
```
---
With the project initialized, we can define a schema. The `edgedb project init` command already created a `dbschema/default.esdl` file to contain our schema.
```txt
dbschema
├── default.esdl
└── migrations
```
---
Open that file and paste the following contents.
```txt
module default {
type Movie {
title: str;
releaseYear: int64;
}
};
```
---
Then generate and apply an initial migration.
```sh
$ edgedb migration create
Created /Users/colinmcd94/Documents/bun/fun/examples/my-edgedb-app/dbschema/migrations/00001.edgeql, id: m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq
$ edgedb migrate
Applied m1uwekrn4ni4qs7ul7hfar4xemm5kkxlpswolcoyqj3xdhweomwjrq (00001.edgeql)
```
---
With our schema applied, let's execute some queries using EdgeDB's JavaScript client library. We'll install the client library and EdgeDB's codegen CLI, and create a `seed.ts` file.
```sh
$ bun add edgedb
$ bun add -D @edgedb/generate
$ touch seed.ts
```
---
Paste the following code into `seed.ts`.
The client auto-connects to the database. We insert a couple movies using the `.execute()` method.
```ts
import { createClient } from "edgedb";
const client = createClient();
const INSERT_MOVIE = `
insert Movie {
title := <str>$title,
releaseYear := <int64>$year,
}
`;
const movies = [
{ title: "The Matrix", year: 1999 },
{ title: "The Matrix Reloaded", year: 2003 },
{ title: "The Matrix Revolutions", year: 2003 },
];
for (const movie of movies) {
await client.execute(INSERT_MOVIE, movie);
}
console.log(`Seeding complete.`);
process.exit();
```
---
Then run this file with Bun.
```sh
$ bun run seed.ts
Seeding complete.
```
---
EdgeDB implements a number of code generation tools for TypeScript. To query our newly seeded database in a typesafe way, we'll use `@edgedb/generate` to code-generate the EdgeQL query builder.
```sh
$ bunx @edgedb/generate edgeql-js
Generating query builder...
Detected tsconfig.json, generating TypeScript files.
To override this, use the --target flag.
Run `npx @edgedb/generate --help` for full options.
Introspecting database schema...
Writing files to ./dbschema/edgeql-js
Generation complete! 🤘
Checking the generated query builder into version control
is not recommended. Would you like to update .gitignore to ignore
the query builder directory? The following line will be added:
dbschema/edgeql-js
[y/n] (leave blank for "y")
> y
```
---
In `index.ts`, we can import the generated query builder from `./dbschema/edgeql-js` and write a simple select query.
```ts
import { createClient } from "edgedb";
import e from "./dbschema/edgeql-js";
const client = createClient();
const query = e.select(e.Movie, () => ({
title: true,
releaseYear: true,
}));
const results = await query.run(client);
console.log(results);
results; // { title: string, releaseYear: number | null }[]
```
---
Running the file with Bun, we can see the list of movies we inserted.
```sh
$ bun run index.ts
[
{
title: "The Matrix",
releaseYear: 1999
}, {
title: "The Matrix Reloaded",
releaseYear: 2003
}, {
title: "The Matrix Revolutions",
releaseYear: 2003
}
]
```
---
For complete documentation, refer to the [EdgeDB docs](https://www.edgedb.com/docs).

View File

@@ -1,54 +0,0 @@
---
name: Run Bun as a daemon with PM2
---
[PM2](https://pm2.keymetrics.io/) is a popular process manager that manages and runs your applications as daemons (background processes).
It offers features like process monitoring, automatic restarts, and easy scaling. Using a process manager is common when deploying a Bun application on a cloud-hosted virtual private server (VPS), as it:
- Keeps your application running continuously.
- Ensures high availability and reliability of your application.
- Monitors and manages multiple processes with ease.
- Simplifies the deployment process.
---
You can use PM2 with Bun in two ways: as a CLI option or in a configuration file.
### With `--interpreter`
---
To start your application with PM2 and Bun as the interpreter, open your terminal and run the following command:
```bash
pm2 start --interpreter ~/.bun/bin/bun index.ts
```
---
### With a configuration file
---
Alternatively, you can create a PM2 configuration file. Create a file named `pm2.config.js` in your project directory and add the following content.
```javascript
module.exports = {
name: "app", // Name of your application
script: "index.ts", // Entry point of your application
interpreter: "~/.bun/bin/bun", // Path to the Bun interpreter
};
```
---
After saving the file, you can start your application with PM2:
```bash
pm2 start pm2.config.js
```
---
That's it! Your JavaScript/TypeScript web server is now running as a daemon with PM2, using Bun as the interpreter.

View File

@@ -1,5 +1,5 @@
---
name: Use Prisma with Bun
name: Get started using Prisma
---
{% callout %}

View File

@@ -4,7 +4,7 @@ name: Use React and JSX
React just works with Bun. Bun supports `.jsx` and `.tsx` files out of the box.
Remember that JSX is just a special syntax for including HTML-like syntax in JavaScript files. React uses JSX syntax, as do alternatives like [Preact](https://preactjs.com/) and [Solid](https://www.solidjs.com/). Bun's internal transpiler converts JSX syntax into vanilla JavaScript before execution.
Remember that JSX is just a special syntax for including HTML-like syntax in JavaScript files. It's commonReact uses JSX syntax, as do other React alternatives like [Preact](https://preactjs.com/) and [Solid](https://www.solidjs.com/). Bun's internal transpiler converts JSX syntax into vanilla JavaScript before execution.
---
@@ -27,7 +27,7 @@ const element = jsx("h1", { children: "Hello, world!" });
---
This code requires `react` to run, so make sure you've installed React.
This code requires `react` to run, so make sure you you've installed React.
```bash
$ bun install react

View File

@@ -1,113 +0,0 @@
---
name: Run Bun as a daemon with systemd
---
[systemd](https://systemd.io) is an init system and service manager for Linux operating systems that manages the startup and control of system processes and services.
<!-- systemd provides aggressive parallelization capabilities, uses socket and D-Bus activation for starting services, offers on-demand starting of daemons, keeps track of processes using Linux control groups, maintains mount and auto mount points, and implements an elaborate transactional dependency-based service control logic. systemd supports SysV and LSB init scripts and works as a replacement for sysvinit. -->
<!-- Other parts include a logging daemon, utilities to control basic system configuration like the hostname, date, locale, maintain a list of logged-in users and running containers and virtual machines, system accounts, runtime directories and settings, and daemons to manage simple network configuration, network time synchronization, log forwarding, and name resolution. -->
---
To run a Bun application as a daemon using **systemd** you'll need to create a _service file_ in `/lib/systemd/system/`.
```sh
$ cd /lib/systemd/system
$ touch my-app.service
```
---
Here is a typical service file that runs an application on system start. You can use this as a template for your own service. Replace `YOUR_USER` with the name of the user you want to run the application as. To run as `root`, replace `YOUR_USER` with `root`, though this is generally not recommended for security reasons.
Refer to the [systemd documentation](https://www.freedesktop.org/software/systemd/man/systemd.service.html) for more information on each setting.
```ini#my-app.service
[Unit]
# describe the app
Description=My App
# start the app after the network is available
After=network.target
[Service]
# usually you'll use 'simple'
# one of https://www.freedesktop.org/software/systemd/man/systemd.service.html#Type=
Type=simple
# which user to use when starting the app
User=YOUR_USER
# path to your application's root directory
WorkingDirectory=/home/YOUR_USER/path/to/my-app
# the command to start the app
# requires absolute paths
ExecStart=/home/YOUR_USER/.bun/bin/bun run index.ts
# restart policy
# one of {no|on-success|on-failure|on-abnormal|on-watchdog|on-abort|always}
Restart=always
[Install]
# start the app automatically
WantedBy=multi-user.target
```
---
If your application starts a webserver, note that non-`root` users are not able to listen on ports 80 or 443 by default. To permanently allow Bun to listen on these ports when executed by a non-`root` user, use the following command. This step isn't necessary when running as `root`.
```bash
$ sudo setcap CAP_NET_BIND_SERVICE=+eip ~/.bun/bin/bun
```
---
With the service file configured, you can now _enable_ the service. Once enabled, it will start automatically on reboot. This requires `sudo` permissions.
```bash
$ sudo systemctl enable my-app
```
---
To start the service without rebooting, you can manually _start_ it.
```bash
$ sudo systemctl start my-app
```
---
Check the status of your application with `systemctl status`. If you've started your app successfully, you should see something like this:
```bash
$ sudo systemctl status my-app
● my-app.service - My App
Loaded: loaded (/lib/systemd/system/my-app.service; enabled; preset: enabled)
Active: active (running) since Thu 2023-10-12 11:34:08 UTC; 1h 8min ago
Main PID: 309641 (bun)
Tasks: 3 (limit: 503)
Memory: 40.9M
CPU: 1.093s
CGroup: /system.slice/my-app.service
└─309641 /home/YOUR_USER/.bun/bin/bun run /home/YOUR_USER/application/index.ts
```
---
To update the service, edit the contents of the service file, then reload the daemon.
```bash
$ sudo systemctl daemon-reload
```
---
For a complete guide on the service unit configuration, you can check [this page](https://www.freedesktop.org/software/systemd/man/systemd.service.html). Or refer to this cheatsheet of common commands:
```bash
$ sudo systemctl daemon-reload # tell systemd that some files got changed
$ sudo systemctl enable my-app # enable the app (to allow auto-start)
$ sudo systemctl disable my-app # disable the app (turns off auto-start)
$ sudo systemctl start my-app # start the app if is stopped
$ sudo systemctl stop my-app # stop the app
$ sudo systemctl restart my-app # restart the app
```

View File

@@ -30,7 +30,8 @@ bun install
Start the development server with the `vite` CLI using `bunx`.
The `--bun` flag tells Bun to run Vite's CLI using `bun` instead of `node`; by default Bun respects Vite's `#!/usr/bin/env node` [shebang line](<https://en.wikipedia.org/wiki/Shebang_(Unix)>).
The `--bun` flag tells Bun to run Vite's CLI using `bun` instead of `node`; by default Bun respects Vite's `#!/usr/bin/env node` [shebang line](<https://en.wikipedia.org/wiki/Shebang_(Unix)>). After Bun 1.0 this flag will no longer be necessary.
```bash
bunx --bun vite
```

View File

@@ -2,7 +2,7 @@
name: Add a peer dependency
---
To add an npm package as a peer dependency, directly modify the `peerDependencies` object in your package.json. Running `bun install` will install peer dependencies by default, unless marked optional in `peerDependenciesMeta`.
To add an npm package as a peer dependency, directly modify the `peerDependencies` object in your package.json. Running `bun install` will not install peer dependencies.
```json-diff
{

View File

@@ -12,7 +12,7 @@ jobs:
runs-on: ubuntu-latest
steps:
# ...
- uses: actions/checkout@v4
- uses: actions/checkout@v3
+ - uses: oven-sh/setup-bun@v1
# run any `bun` or `bunx` command

View File

@@ -47,4 +47,4 @@ Note that this only allows lifecycle scripts for the specific package listed in
---
See [Docs > Package manager > Trusted dependencies](/docs/install/lifecycle) for complete documentation of trusted dependencies.
See [Docs > Package manager > Trusted dependencies](/docs/cli/install#trusted-dependencies) for complete documentation of trusted dependencies.

View File

@@ -1,66 +0,0 @@
---
name: Spawn a child process and communicate using IPC
---
Use [`Bun.spawn()`](/docs/api/spawn) to spawn a child process. When spawning a second `bun` process, you can open a direct inter-process communication (IPC) channel between the two processes.
{%callout%}
**Note** — This API is only compatible with other `bun` processes. Use `process.execPath` to get a path to the currently running `bun` executable.
{%/callout%}
```ts#parent.ts
const child = Bun.spawn(["bun", "child.ts"], {
ipc(message) {
/**
* The message received from the sub process
**/
},
});
```
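---
As the callout above notes, you can pass `process.execPath` instead of the string `"bun"` so the child is spawned with the same executable that's running the parent. A minimal sketch:
```ts#parent.ts
// spawn the child with the currently running bun executable
const child = Bun.spawn([process.execPath, "child.ts"], {
  ipc(message) {
    console.log("message from child:", message);
  },
});
```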
---
The parent process can send messages to the subprocess using the `.send()` method on the returned `Subprocess` instance. A reference to the sending subprocess is also available as the second argument in the `ipc` handler.
```ts#parent.ts
const childProc = Bun.spawn(["bun", "child.ts"], {
ipc(message, childProc) {
/**
* The message received from the sub process
**/
childProc.send("Respond to child")
},
});
childProc.send("I am your father"); // The parent can send messages to the child as well
```
---
Meanwhile, the child process can send messages to its parent with `process.send()` and receive messages with `process.on("message")`. This is the same API used for `child_process.fork()` in Node.js.
```ts#child.ts
process.send("Hello from child as string");
process.send({ message: "Hello from child as object" });
process.on("message", (message) => {
// print message from parent
console.log(message);
});
```
---
All messages are serialized using the JSC `serialize` API, which allows for the same set of [transferrable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) supported by `postMessage` and `structuredClone`, including strings, typed arrays, streams, and objects.
```ts#child.ts
// send a string
process.send("Hello from child as string");
// send an object
process.send({ message: "Hello from child as object" });
```
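---
Because typed arrays are supported, binary payloads can be sent directly as well. A minimal sketch (not part of the original guide):
```ts#child.ts
// send binary data as a typed array
const bytes = new TextEncoder().encode("Hello from child as bytes");
process.send(bytes);
```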
---
See [Docs > API > Child processes](/docs/api/spawn) for complete documentation.

View File

@@ -35,7 +35,7 @@ jobs:
# ...
- uses: oven-sh/setup-bun@v1
+ with:
+ bun-version: 1.0.11 # or "latest", "canary", <sha>
+ version: 0.7.0 # or "canary"
```
---

View File

@@ -32,6 +32,7 @@ Some notable missing features:
- `expect.extend()`
- `expect().toMatchInlineSnapshot()`
- `expect().toHaveBeenCalledWith()`
- `expect().toHaveReturned()`
---

View File

@@ -20,7 +20,7 @@ test("party like it's 1999", () => {
---
The `setSystemTime` function is commonly used on conjunction with [Lifecycle Hooks](/docs/test/lifecycle) to configure a testing environment with a deterministic "fake clock".
The `setSystemTime` function is commonly used on conjunction with [Lifecycle Hooks](/docs/test/lifecycle) to configure a testing environment with a determinstic "fake clock".
```ts
import { test, expect, beforeAll, setSystemTime } from "bun:test";

View File

@@ -64,7 +64,7 @@ Ran 2 tests across 1 files. [15.00ms]
All tests have a name, defined using the first parameter to the `test` function. Tests can also be grouped into suites with `describe`.
```ts
import { test, expect, describe } from "bun:test";
import { test, expect } from "bun:test";
describe("math", () => {
test("add", () => {

View File

@@ -2,7 +2,7 @@
name: Get the file name of the current file
---
Bun provides a handful of module-specific utilities on the [`import.meta`](/docs/api/import-meta) object. Use `import.meta.file` to retrieve the name of the current file.
Bun provides a handful of module-specific utilities on the [`import.meta`](/docs/api/import-meta) object. Use `import.meta.file` to retreive the name of the current file.
```ts#/a/b/c.ts
import.meta.file; // => "c.ts"

View File

@@ -2,7 +2,7 @@
name: Get the absolute path of the current file
---
Bun provides a handful of module-specific utilities on the [`import.meta`](/docs/api/import-meta) object. Use `import.meta.path` to retrieve the absolute path of the current file.
Bun provides a handful of module-specific utilities on the [`import.meta`](/docs/api/import-meta) object. Use `import.meta.path` to retreive the absolute path of the current file.
```ts#/a/b/c.ts
import.meta.path; // => "/a/b/c.ts"

View File

@@ -28,7 +28,7 @@ const server = Bun.serve<{ username: string }>({
},
close(ws) {
const msg = `${ws.data.username} has left the chat`;
server.publish("the-group-chat", msg);
ws.publish("the-group-chat", msg);
ws.unsubscribe("the-group-chat");
},
},

View File

@@ -1,52 +0,0 @@
---
name: Append content to a file
---
Bun implements the `node:fs` module, which includes the `fs.appendFile` and `fs.appendFileSync` functions for appending content to files.
---
You can use `fs.appendFile` to asynchronously append data to a file, creating the file if it does not yet exist. The content can be a string or a `Buffer`.
```ts
import { appendFile } from "node:fs/promises";
await appendFile("message.txt", "data to append");
```
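---
The content can also be a `Buffer` instead of a string. A minimal sketch (not part of the original guide):
```ts
import { appendFile } from "node:fs/promises";
import { Buffer } from "node:buffer";

// append binary content using a Buffer
await appendFile("message.txt", Buffer.from("\ndata to append as bytes"));
```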
---
To use the non-`Promise` API:
```ts
import { appendFile } from "node:fs";
appendFile("message.txt", "data to append", err => {
if (err) throw err;
console.log('The "data to append" was appended to file!');
});
```
---
To specify the encoding of the content:
```js
import { appendFile } from "node:fs";
appendFile("message.txt", "data to append", "utf8", callback);
```
---
To append the data synchronously, use `fs.appendFileSync`:
```ts
import { appendFileSync } from "node:fs";
appendFileSync("message.txt", "data to append", "utf8");
```
---
See the [Node.js documentation](https://nodejs.org/api/fs.html#fspromisesappendfilepath-data-options) for more information.

View File

@@ -26,7 +26,7 @@ Get started with one of the quick links below, or read on to learn more about Bu
{% arrowbutton href="/docs/installation" text="Install Bun" /%}
{% arrowbutton href="/docs/quickstart" text="Do the quickstart" /%}
{% arrowbutton href="/docs/cli/install" text="Install a package" /%}
{% arrowbutton href="/docs/cli/bun-create" text="Use a project template" /%}
{% arrowbutton href="/docs/templates" text="Use a project template" /%}
{% arrowbutton href="/docs/bundler" text="Bundle code for production" /%}
{% arrowbutton href="/docs/api/http" text="Build an HTTP server" /%}
{% arrowbutton href="/docs/api/websockets" text="Build a Websocket server" /%}
@@ -37,14 +37,11 @@ Get started with one of the quick links below, or read on to learn more about Bu
## What is a runtime?
JavaScript (or, more formally, ECMAScript) is just a _specification_ for a programming language. Anyone can write a JavaScript _engine_ that ingests a valid JavaScript program and executes it. The two most popular engines in use today are V8 (developed by Google)
and JavaScriptCore (developed by Apple). Both are open source.
But most JavaScript programs don't run in a vacuum. They need a way to access the outside world to perform useful tasks. This is where _runtimes_ come in. They implement additional APIs that are then made available to the JavaScript programs they execute.
JavaScript (or, more formally, ECMAScript) is just a _specification_ for a programming language. Anyone can write a JavaScript _engine_ that ingests a valid JavaScript program and executes it. The two most popular engines in use today are V8 (developed by Google) and JavaScriptCore (developed by Apple). Both are open source.
### Browsers
Notably, browsers ship with JavaScript runtimes that implement a set of Web-specific APIs that are exposed via the global `window` object. Any JavaScript code executed by the browser can use these APIs to implement interactive or dynamic behavior in the context of the current webpage.
But most JavaScript programs don't run in a vacuum. They need a way to access the outside world to perform useful tasks. This is where _runtimes_ come in. They implement additional APIs that are then made available to the JavaScript programs they execute. Notably, browsers ship with JavaScript runtimes that implement a set of Web-specific APIs that are exposed via the global `window` object. Any JavaScript code executed by the browser can use these APIs to implement interactive or dynamic behavior in the context of the current webpage.
<!-- JavaScript runtime that exposes JavaScript engines are designed to run "vanilla" JavaScript programs, but it's often JavaScript _runtimes_ use an engine internally to execute the code and implement additional APIs that are then made available to executed programs.
JavaScript was [initially designed](https://en.wikipedia.org/wiki/JavaScript) as a language to run in web browsers to implement interactivity and dynamic behavior in web pages. Browsers are the first JavaScript runtimes. JavaScript programs that are executed in browsers have access to a set of Web-specific global APIs on the `window` object. -->

View File

@@ -39,7 +39,7 @@ On Linux, `bun install` tends to install packages 20-100x faster than `npm insta
Running `bun install` will:
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun will install `peerDependencies` by default.
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun does not install `peerDependencies` by default.
- **Run** your project's `{pre|post}install` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Write** a `bun.lockb` lockfile to the project root.
@@ -81,7 +81,7 @@ optional = true
dev = true
# whether to install peerDependencies
peer = true
peer = false
# equivalent to `--production` flag
production = false
@@ -91,6 +91,9 @@ frozenLockfile = false
# equivalent to `--dry-run` flag
dryRun = false
# whether to use the github REST api (unauthenticated)
github.api = true
```
{% /details %}

View File

@@ -1,44 +0,0 @@
Packages on `npm` can define _lifecycle scripts_ in their `package.json`. Some of the most common are below, but there are [many others](https://docs.npmjs.com/cli/v10/using-npm/scripts).
- `preinstall`: Runs before the package is installed
- `postinstall`: Runs after the package is installed
- `preuninstall`: Runs before the package is uninstalled
- `prepublishOnly`: Runs before the package is published
These scripts are arbitrary shell commands that the package manager is expected to read and execute at the appropriate time. But executing arbitrary scripts represents a potential security risk, so—unlike other `npm` clients—Bun does not execute arbitrary lifecycle scripts by default.
## `postinstall`
The `postinstall` script is particularly important. It's widely used to build or install platform-specific binaries for packages that are implemented as [native Node.js add-ons](https://nodejs.org/api/addons.html). For example, `node-sass` is a popular package that uses `postinstall` to build a native binary for Sass.
```json
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
"node-sass": "^6.0.1"
}
}
```
## `trustedDependencies`
Instead of executing arbitrary scripts, Bun uses a "default-secure" approach. You can add certain packages to an allow list, and Bun will execute lifecycle scripts for those packages. To tell Bun to allow lifecycle scripts for a particular package, add the package name to the `trustedDependencies` array in your `package.json`.
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["node-sass"]
}
```
Once added to `trustedDependencies`, install/re-install the package. Bun will read this field and run lifecycle scripts for `node-sass`.
## `--ignore-scripts`
To disable lifecycle scripts for all packages, use the `--ignore-scripts` flag.
```bash
$ bun install --ignore-scripts
```

View File

@@ -1,73 +0,0 @@
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](/docs/install/overrides) for complete documentation.
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "overrides": {
+ "bar": "~4.4.0"
+ }
}
```
By default, Bun will install the latest version of all dependencies and metadependencies, according to the ranges specified in each package's `package.json`. Let's say you have a project with one dependency, `foo`, which in turn has a dependency on `bar`. This means `bar` is a _metadependency_ of our project.
```json#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
}
}
```
When you run `bun install`, Bun will install the latest versions of each package.
```
# tree layout of node_modules
node_modules
├── foo@1.2.3
└── bar@4.5.6
```
But what if a security vulnerability was introduced in `bar@4.5.6`? We may want a way to pin `bar` to an older version that doesn't have the vulnerability. This is where `"overrides"`/`"resolutions"` come in.
## `"overrides"`
Add `bar` to the `"overrides"` field in `package.json`. Bun will defer to the specified version range when determining which version of `bar` to install, whether it's a dependency or a metadependency.
{% callout %}
**Note** — Bun currently only supports top-level `"overrides"`. [Nested overrides](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#overrides) are not supported.
{% /callout %}
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "overrides": {
+ "bar": "~4.4.0"
+ }
}
```
## `"resolutions"`
The syntax is similar for `"resolutions"`, which is Yarn's alternative to `"overrides"`. Bun supports this feature to make migration from Yarn easier.
As with `"overrides"`, _nested resolutions_ are not currently supported.
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "resolutions": {
+ "bar": "~4.4.0"
+ }
}
```

View File

@@ -24,7 +24,7 @@ tree
   └── tsconfig.json
```
In the root `package.json`, the `"workspaces"` key is used to indicate which subdirectories should be considered packages/workspaces within the monorepo. It's conventional to place all the workspace in a directory called `packages`.
In the root `package.json`, the `"workspaces"` key is used to indicate which subdirectories should be considered packages/workspaces within the monorepo. It conventional to place all the workspace in a directory called `packages`.
```json
{

View File

@@ -1,8 +1,6 @@
Bun ships as a single executable that can be installed a few different ways.
## Installing
### macOS and Linux
## macOS and Linux
{% callout %}
**Linux users** — The `unzip` package is required to install Bun. Use `sudo apt install unzip` to install the `unzip` package.
@@ -37,7 +35,7 @@ $ proto install bun
{% /codetabs %}
### Windows
## Windows
Bun provides a _limited, experimental_ native build for Windows. At the moment, only the Bun runtime is supported.
@@ -69,59 +67,6 @@ $ docker pull oven/bun:alpine
$ docker pull oven/bun:distroless
```
## Checking installation
To check that Bun was installed successfully, open a new terminal window and run `bun --version`.
```sh
$ bun --version
1.x.y
```
To see the precise commit of [oven-sh/bun](https://github.com/oven-sh/bun) that you're using, run `bun --revision`.
```sh
$ bun --revision
1.x.y+b7982ac13189
```
If you've installed Bun but are seeing a `command not found` error, you may have to manually add the installation directory (`~/.bun/bin`) to your `PATH`.
{% details summary="How to add to your `PATH`" %}
First, determine what shell you're using:
```sh
$ echo $SHELL
/bin/zsh # or /bin/bash or /bin/fish
```
Then add the lines below to the bottom of your shell's configuration file.
{% codetabs %}
```bash#~/.zshrc
# add to ~/.zshrc
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```
```bash#~/.bashrc
# add to ~/.bashrc
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```
```sh#~/.config/fish/config.fish
# add to ~/.config/fish/config.fish
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```
{% /codetabs %}
Save the file. You'll need to open a new shell/terminal window for the changes to take effect.
{% /details %}
## Upgrading
Once installed, the binary can upgrade itself.

View File

@@ -38,13 +38,12 @@ export default {
page("typescript", "TypeScript", {
description: "Install and configure type declarations for Bun's APIs",
}),
divider("Templating"),
page("cli/init", "`bun init`", {
description: "Scaffold an empty Bun project.",
page("templates", "Templates", {
description: "Hit the ground running with one of Bun's official templates, or download a template from GitHub.",
}),
page("cli/bun-create", "`bun create`", {
description: "Scaffold a new Bun project from an official template or GitHub repo.",
page("guides", "Guides", {
description: "A set of walkthrough guides and code snippets for performing common tasks with Bun",
href: "/guides",
}),
// page("typescript", "TypeScript"),
@@ -82,6 +81,7 @@ export default {
// page("bundev", "Dev server"),
// page("benchmarks", "Benchmarks"),
// divider("Runtime"),
divider("Runtime"),
page("cli/run", "`bun run`", {
description: "Use `bun run` to execute JavaScript/TypeScript files and package.json scripts.",
@@ -152,21 +152,6 @@ export default {
description:
"Install all dependencies with `bun install`, or manage dependencies with `bun add` and `bun remove`.",
}),
page("cli/add", "`bun add`", {
description: "Add dependencies to your project.",
}),
page("cli/remove", "`bun remove`", {
description: "Remove dependencies from your project.",
}),
page("cli/update", "`bun update`", {
description: "Update your project's dependencies.",
}),
page("cli/link", "`bun link`", {
description: "Install local packages as dependencies in your project.",
}),
page("cli/pm", "`bun pm`", {
description: "Utilities relating to package management with Bun.",
}),
page("install/cache", "Global cache", {
description:
"Bun's package manager installs all packages into a shared global cache to avoid redundant re-downloads.",
@@ -174,9 +159,6 @@ export default {
page("install/workspaces", "Workspaces", {
description: "Bun's package manager supports workspaces and mono-repo development workflows.",
}),
page("install/lifecycle", "Lifecycle scripts", {
description: "How Bun handles package lifecycle scripts with trustedDependencies",
}),
page("install/lockfile", "Lockfile", {
description:
"Bun's binary lockfile `bun.lockb` tracks your resolved dependency tree, making future installs fast and repeatable.",
@@ -184,12 +166,9 @@ export default {
page("install/registries", "Scopes and registries", {
description: "How to configure private scopes and custom package registries.",
}),
page("install/overrides", "Overrides and resolutions", {
description: "Specify version ranges for nested dependencies",
page("install/utilities", "Utilities", {
description: "Use `bun pm` to introspect your global module cache or project dependency tree.",
}),
// page("install/utilities", "Utilities", {
// description: "Use `bun pm` to introspect your global module cache or project dependency tree.",
// }),
divider("Bundler"),
page("bundler", "`Bun.build`", {
@@ -340,14 +319,6 @@ export default {
description: `Bun implements the Node-API spec for building native addons.`,
}), // "`Node-API`"),
page("api/glob", "Glob", {
description: `Bun includes a fast native Glob implementation for matching file paths.`,
}), // "`Glob`"),
page("api/semver", "Semver", {
description: `Bun's native Semver implementation is 20x faster than the popular \`node-semver\` package.`,
}), // "`Semver`"),
// divider("Dev Server"),
// page("bun-dev", "Vanilla"),
// page("dev/css", "CSS"),
@@ -363,7 +334,7 @@ export default {
page("project/benchmarking", "Benchmarking", {
description: `Bun is designed for performance. Learn how to benchmark Bun yourself.`,
}),
page("project/contributing", "Contributing", {
page("project/development", "Development", {
description: "Learn how to contribute to Bun and get your local development environment up and running.",
}),
page("project/licensing", "License", {

View File

@@ -1,75 +0,0 @@
## Prerequisites
### System Dependencies
- [Visual Studio](https://visualstudio.microsoft.com) with the "Desktop Development with C++" workload. You should install Git and CMake from here, if not already installed.
- Ninja
- Go
- Rust
- NASM
- Perl
- Ruby
- Node.js (until bun runs stably on windows)
<!--
TODO: missing the rest of the things
```
winget install OpenJS.NodeJS.LTS
``` -->
### Enable Scripts
By default, scripts are blocked.
```ps1
Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy Unrestricted
```
### Zig
Bun pins a version of Zig. As the compiler is still in development, breaking changes land frequently and will break the build. It is recommended to use [Zigup](https://github.com/marler8997/zigup/releases), as it can quickly switch to any version by name, but you can also [manually download Zig](https://ziglang.org/download/).
```bash
$ zigup 0.12.0-dev.1604+caae40c21
```
{% callout %}
We last updated Zig on **October 26th, 2023**
{% /callout %}
### Codegen
On Unix platforms, we depend on an existing build of Bun to generate code for itself. Since the Windows branch is not stable enough for this to pass, you currently need to generate this code on a separate machine.
On a system with Bun installed, run:
```bash
$ bash ./scripts/cross-compile-codegen.sh win32 x64
# -> build-codegen-win32-x64
```
Copy the contents of the resulting `build-codegen-win32-x64` directory to the Windows machine, into a folder named `build`.
TODO: Use WSL to automatically run codegen without a separate machine.
## Building
```ps1
npm install
.\scripts\env.ps1
.\scripts\update-submodules.ps1
.\scripts\all-dependencies.ps1
cd build # this was created by the codegen script in the prerequisites
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Debug
ninja
```
If this was successful, you should have a `bun-debug.exe` in the `build` folder.
```ps1
.\bun-debug.exe --version
```

View File

@@ -2,31 +2,7 @@ Configuring a development environment for Bun can take 10-30 minutes depending o
If you are using Windows, you must use a WSL environment as Bun does not yet compile on Windows natively.
## Install Dependencies
Using your system's package manager, install the Bun's dependencies:
{% codetabs %}
```bash#macOS (Homebrew)
$ brew install automake ccache cmake coreutils gnu-sed go libiconv libtool ninja pkg-config rust
```
```bash#Ubuntu/Debian
$ sudo apt install cargo ccache cmake git golang libtool ninja-build pkg-config rustc ruby-full xz-utils
```
```bash#Arch
$ sudo pacman -S base-devel ccache cmake git go libiconv libtool make ninja pkg-config python rust sed unzip ruby
```
```bash#Fedora
$ sudo dnf install cargo ccache cmake git golang libtool ninja-build pkg-config rustc libatomic-static libstdc++-static sed unzip which libicu-devel
```
{% /codetabs %}
Before starting, you will need to already have a release build of Bun installed, as we use our bundler to transpile and minify our code, as well as for code generation scripts.
Before starting, you will need to already have a release build of Bun installed, as we use our bundler to transpile and minify our code.
{% codetabs %}
@@ -95,59 +71,247 @@ If not, run this to manually link it:
```bash#macOS (Homebrew)
# use fish_add_path if you're using fish
$ export PATH="$(brew --prefix llvm@16)/bin:$PATH"
$ export PATH="$PATH:$(brew --prefix llvm@16)/bin"
$ export LDFLAGS="$LDFLAGS -L$(brew --prefix llvm@16)/lib"
$ export CPPFLAGS="$CPPFLAGS -I$(brew --prefix llvm@16)/include"
```
```bash#Arch
# use fish_add_path if you're using fish
$ export PATH="$PATH:/usr/lib/llvm16/bin"
$ export LDFLAGS="$LDFLAGS -L/usr/lib/llvm16/lib"
$ export CPPFLAGS="$CPPFLAGS -I/usr/lib/llvm16/include"
```
{% /codetabs %}
## Building Bun
## Install Dependencies
Using your system's package manager, install the rest of Bun's dependencies:
{% codetabs %}
```bash#macOS (Homebrew)
$ brew install automake ccache cmake coreutils esbuild gnu-sed go libiconv libtool ninja pkg-config rust
```
```bash#Ubuntu/Debian
$ sudo apt install cargo ccache cmake git golang libtool ninja-build pkg-config rustc esbuild
```
```bash#Arch
$ sudo pacman -S base-devel ccache cmake esbuild git go libiconv libtool make ninja pkg-config python rust sed unzip
```
```bash#Fedora
$ sudo dnf install cargo ccache cmake git golang libtool ninja-build pkg-config rustc golang-github-evanw-esbuild libatomic-static libstdc++-static sed unzip
```
{% /codetabs %}
{% details summary="Ubuntu — Unable to locate package esbuild" %}
The `apt install esbuild` command may fail with an `Unable to locate package` error if you are using an Ubuntu mirror that does not contain an exact copy of the original Ubuntu server. The same error may also occur if you are not using any mirror but have the Ubuntu Universe repository enabled in `sources.list`. In either case, you can install esbuild manually:
```bash
$ curl -fsSL https://esbuild.github.io/dl/latest | sh
$ chmod +x ./esbuild
$ sudo mv ./esbuild /usr/local/bin
```
{% /details %}
In addition to this, you will need an npm-compatible package manager (`bun`, `npm`, etc.) to install the dependencies listed in `package.json`.
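For example, from the repository root (assuming a release build of Bun is already on your `PATH`):
```bash
$ bun install
```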
## Install Zig
Zig can be installed either with our npm package [`@oven/zig`](https://www.npmjs.com/package/@oven/zig), or by using [zigup](https://github.com/marler8997/zigup).
```bash
$ bun install -g @oven/zig
$ zigup 0.12.0-dev.163+6780a6bbf
```
{% callout %}
We last updated Zig on **July 18th, 2023**
{% /callout %}
## First Build
After cloning the repository, run the following command to run the first build. This may take a while as it will clone submodules and build dependencies.
```bash
$ bun setup
$ make setup
```
The binary will be located at `./build/bun-debug`. It is recommended to add this to your `$PATH`. To verify the build worked, let's print the version number on the development build of Bun.
The binary will be located at `packages/debug-bun-{platform}-{arch}/bun-debug`. It is recommended to add this to your `$PATH`. To verify the build worked, let's print the version number on the development build of Bun.
```bash
$ build/bun-debug --version
x.y.z_debug
$ packages/debug-bun-*/bun-debug --version
bun 1.x.y__dev
```
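To put it on your `PATH` as recommended above, a minimal sketch for the current shell session (the `linux-x64` suffix is an assumption; match it to your build output):
```bash
$ export PATH="$PWD/packages/debug-bun-linux-x64:$PATH"
$ bun-debug --version
```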
To rebuild, you can invoke `bun run build`.
Note: `make setup` is just an alias for the following:
```bash
$ bun run build
$ make assert-deps submodule npm-install-dev node-fallbacks runtime_js fallback_decoder bun_error mimalloc picohttp zlib boringssl libarchive lolhtml sqlite usockets uws tinycc c-ares zstd base64 cpp zig link
```
These two scripts, `setup` and `build`, are aliases for roughly the following:
## Rebuilding
```bash
$ ./scripts/setup.sh
$ cmake -S . -G Ninja -B build -DCMAKE_BUILD_TYPE=Debug
$ ninja -C build # 'bun run build' runs just this
```
Bun uses a series of make commands to rebuild parts of the codebase. The general rule for rebuilding is that `make link` reruns the linker, and separate make targets rebuild the different parts of the codebase. Do not pass `-j` to make; these scripts break if run out of order, and the builds already use multiple cores where possible.
Advanced users can pass CMake flags to customize the build.
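As a sketch (the `-DUSE_DEBUG_JSC=1` flag is mentioned in the WebKit section below; treat this particular combination as an assumption):
```bash
$ cmake -S . -G Ninja -B build -DCMAKE_BUILD_TYPE=Debug -DUSE_DEBUG_JSC=1
$ ninja -C build
```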
{% table %}
- What changed
- Run this command
---
- Zig Code
- `make zig`
---
- C++ Code
- `make cpp`
---
- Zig + C++ Code
- `make dev` (combination of the above two)
---
- JS/TS Code in `src/js`
- `make js` (in `bun-debug`, JS is loaded from disk without a recompile). If you rename any file or add/remove anything, you must also run `make dev`.
---
- `*.classes.ts`
- `make generate-classes dev`
---
- JSSink
- `make generate-sink cpp`
---
- `src/node_fallbacks/*`
- `make node-fallbacks zig`
---
- `identifier_data.zig`
- `make identifier-cache zig`
---
- Code using `cppFn`/`JSC.markBinding`
- `make headers` (TODO: explain what this is used for and why it's useful)
{% /table %}
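As a sketch of a typical edit/rebuild loop using the targets above (whether `make dev` also relinks is an assumption here):
```bash
# after editing Zig and/or C++ code
$ make dev
# if the binary was not relinked, rerun the linker explicitly
$ make link
# run the freshly built debug binary
$ packages/debug-bun-*/bun-debug --version
```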
`make setup` cloned a bunch of submodules and built the subprojects. When a submodule is out of date, run `make submodule` to quickly reset/update all your submodules, then rebuild individual submodules with their respective commands.
{% table %}
- Dependency
- Run this command
---
- WebKit
- `bun install` (it is a prebuilt package)
---
- uWebSockets
- `make uws`
---
- Mimalloc
- `make mimalloc`
---
- PicoHTTPParser
- `make picohttp`
---
- zlib
- `make zlib`
---
- BoringSSL
- `make boringssl`
---
- libarchive
- `make libarchive`
---
- lolhtml
- `make lolhtml`
---
- sqlite
- `make sqlite`
---
- TinyCC
- `make tinycc`
---
- c-ares
- `make c-ares`
---
- zstd
- `make zstd`
---
- Base64
- `make base64`
{% /table %}
The above will probably also need Zig and/or C++ code rebuilt.
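For example, a minimal sketch of refreshing and rebuilding a single out-of-date dependency (uWebSockets here), followed by the C++ rebuild and relink:
```bash
$ make submodule   # reset/update all submodules
$ make uws         # rebuild the uWebSockets dependency
$ make cpp link    # rebuild the C++ objects, then rerun the linker
```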
## VSCode
VSCode is the recommended IDE for working on Bun, as the repository is already configured for it. Once it is open, you can run `Extensions: Show Recommended Extensions` to install the recommended extensions for Zig and C++. ZLS is automatically configured.
## Code generation scripts
## JavaScript builtins
When you change anything in `src/js/builtins/*` or switch branches, run this:
```bash
$ make js cpp
```
That inlines the TypeScript code into C++ headers.
{% callout %}
**Note**: This section is outdated. The code generators are run automatically by ninja, instead of by `make`.
Make sure you have `ccache` installed, otherwise regeneration will take much longer than it should.
{% /callout %}
For more information on how `src/js` works, see `src/js/README.md` in the codebase.
## Code generation scripts
Bun leverages a lot of code generation scripts.
The [./src/bun.js/bindings/headers.h](https://github.com/oven-sh/bun/blob/main/src/bun.js/bindings/headers.h) file has bindings to & from Zig <> C++ code. This file is generated by running the following:
@@ -180,17 +344,26 @@ You probably won't need to run that one much.
## Modifying ESM modules
Certain modules like `node:fs`, `node:stream`, `bun:sqlite`, and `ws` are implemented in JavaScript. These live in `src/js/{node,bun,thirdparty}` files and are pre-bundled using Bun. In debug builds, Bun automatically loads these from the filesystem, wherever it was compiled, so no need to re-run `make dev`.
Certain modules like `node:fs`, `node:stream`, `bun:sqlite`, and `ws` are implemented in JavaScript. These live in `src/js/{node,bun,thirdparty}` files and are pre-bundled using Bun. The bundled code is committed so CI builds can run without needing a copy of Bun.
When these are changed, run:
```
$ make js
```
In debug builds, Bun automatically loads these from the filesystem, wherever it was compiled, so no need to re-run `make dev`.
## Release build
To build a release build of Bun, run:
```bash
$ bun run build:release
$ make release-bindings -j12
$ make release
```
The binary will be located at `./build-release/bun` and `./build-release/bun-profile`.
The binary will be located at `packages/bun-{platform}-{arch}/bun`.
## Valgrind
@@ -210,32 +383,15 @@ You'll need a very recent version of Valgrind due to DWARF 5 debug symbols. You
$ valgrind --fair-sched=try --track-origins=yes bun-debug <args>
```
## Building WebKit locally + Debug mode of JSC
## Updating `WebKit`
{% callout %}
**TODO**: This is out of date. The TLDR is to pass `-DUSE_DEBUG_JSC=1` or `-DWEBKIT_DIR=...` to CMake. It will probably need more fiddling; ask @paperdave if you need this.
{% /callout %}
WebKit is not cloned by default (to save time and disk space). To clone and build WebKit locally, run:
The Bun team will occasionally bump the version of WebKit used in Bun. When this happens, you may see errors in `src/bun.js/bindings` during builds. When you see this, install the latest version of `bun-webkit` and re-compile.
```bash
# once you run this, `make submodule` can be used to automatically
# update WebKit and the other submodules
$ git submodule update --init --depth 1 --checkout src/bun.js/WebKit
# to make a jsc release build
$ make jsc
# JSC debug build does not work perfectly with Bun yet, this is actively being
# worked on and will eventually become the default.
$ make jsc-build-linux-compile-debug cpp
$ make jsc-build-mac-compile-debug cpp
$ bun install
$ make cpp
```
Note that the WebKit folder, including build artifacts, is 8GB+ in size.
If you are using a JSC debug build and using VScode, make sure to run the `C/C++: Select a Configuration` command to configure intellisense to find the debug headers.
## Troubleshooting
### 'span' file not found on Ubuntu
@@ -279,6 +435,29 @@ If you see an error when compiling `libarchive`, run this:
$ brew install pkg-config
```
### missing files on `zig build obj`
If you see an error about missing files on `zig build obj`, make sure you built the headers.
```bash
$ make headers
```
### cmakeconfig.h not found
If you see an error about `cmakeconfig.h` not being found, this is because the precompiled WebKit did not install properly.
```bash
$ bun install
```
Check that the command installed WebKit; you can manually look for `node_modules/bun-webkit-{platform}-{arch}`:
```bash
# this should reveal two directories. if not, something went wrong
$ echo node_modules/bun-webkit*
```
### macOS `library not found for -lSystem`
If you see this error when compiling, run:
@@ -295,4 +474,4 @@ Bun requires `libatomic` to be statically linked. On Arch Linux, it is only give
$ sudo ln -s /lib/libatomic.so /lib/libatomic.a
```
The built version of Bun may not work on other systems if compiled this way.
The built version of bun may not work on other systems if compiled this way.

View File

@@ -1,75 +0,0 @@
There are four parts to the CI build:
- Dependencies: should be cached across builds as much as possible, it depends on git submodule hashes
- Zig Object: depends on \*.zig and potentially src/js
- C++ Object: depends on \*.cpp and src/js
- Linking: depends on the above three
Using multiple GitHub Actions runners allows us to do a lot of the work in parallel.
## Dependencies
```sh
BUN_DEPS_OUT_DIR="/optional/out/dir" bash ./scripts/all-dependencies.sh
```
## Zig Object
This has no dependency on WebKit or any of the other dependencies at all. It can be compiled without checking out submodules, but you will need to have run `bun install` first. It can be cross-compiled very easily.
```sh
BUN_REPO=/path/to/oven-sh/bun
cd tmp1
cmake $BUN_REPO \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DCPU_TARGET="native" \
-DZIG_TARGET="native" \
-DBUN_ZIG_OBJ="./bun-zig.o"
ninja ./bun-zig.o
# -> bun-zig.o
```
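Since this object can be cross-compiled, a sketch of targeting another platform might look like the following; the exact values accepted by `CPU_TARGET` and `ZIG_TARGET` are assumptions here, not verified:
```sh
cd tmp1
cmake $BUN_REPO \
  -G Ninja \
  -DCMAKE_BUILD_TYPE=Release \
  -DCPU_TARGET="x86_64" \
  -DZIG_TARGET="x86_64-windows-msvc" \
  -DBUN_ZIG_OBJ="./bun-zig.o"
ninja ./bun-zig.o
# -> bun-zig.o (for the target platform)
```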
## C++ Object
Note: if `WEBKIT_DIR` is not passed, a prebuilt WebKit is automatically downloaded from GitHub releases. This step depends on the headers from the submodules, but not necessarily on their built artifacts (`.a` files, etc.).
```sh
cd tmp2
cmake $BUN_REPO \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_CPP_ONLY=1
bash compile-cpp-only.sh
# -> bun-cpp-objects.a
```
## Linking
The goal is to run both stages above on different machines so that they can build in parallel. The Zig build is slow, and macOS build runners are slower on average than the Linux ones. With both artifacts from above, you can link them together:
```sh
cd tmp3
cmake $BUN_REPO \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="/path/to/bun-zig.o" \
-DBUN_CPP_ARCHIVE="/path/to/bun-cpp-objects.a"
ninja
# optional:
# -DBUN_DEPS_OUT_DIR=... custom deps dir; use this to cache the built deps between rebuilds
# -DWEBKIT_DIR=... same idea, but it's probably fast enough to pull from GitHub releases
# -> bun
# -> bun-profile
# -> bun.dSYM/
```

View File

@@ -42,10 +42,7 @@ const server = Bun.serve({
console.log(`Listening on http://localhost:${server.port} ...`);
```
{% details summary="Seeing TypeScript errors on `Bun`?" %}
If you used `bun init`, Bun will have automatically installed Bun's TypeScript declarations and configured your `tsconfig.json`. If you're trying out Bun in an existing project, you may see a type error on the `Bun` global.
To fix this, first install `bun-types` as a dev dependency.
If you're using TypeScript, you may see a type error on the `Bun` global. To fix this, install `bun-types`.
```sh
$ bun add -d bun-types
@@ -61,8 +58,6 @@ Then add the following line to your `compilerOptions` in `tsconfig.json`.
}
```
{% /details %}
Run the file from your shell.
```bash

View File

@@ -54,7 +54,7 @@ Click the link in the right column to jump to the associated documentation.
---
- Streaming HTML Transformations
- HTML Rewriting
- [`HTMLRewriter`](/docs/api/html-rewriter)
---
@@ -94,11 +94,6 @@ Click the link in the right column to jump to the associated documentation.
---
- Glob
- [`Bun.Glob`](/docs/api/glob)
---
- Utilities
- [`Bun.version`](/docs/api/utils#bun-version) [`Bun.revision`](/docs/api/utils#bun-revision) [`Bun.env`](/docs/api/utils#bun-env) [`Bun.main`](/docs/api/utils#bun-main) [`Bun.sleep()`](/docs/api/utils#bun-sleep) [`Bun.sleepSync()`](/docs/api/utils#bun-sleepsync) [`Bun.which()`](/docs/api/utils#bun-which) [`Bun.peek()`](/docs/api/utils#bun-peek) [`Bun.openInEditor()`](/docs/api/utils#bun-openineditor) [`Bun.deepEquals()`](/docs/api/utils#bun-deepequals) [`Bun.escapeHTML()`](/docs/api/utils#bun-escapehtml) [`Bun.fileURLToPath()`](/docs/api/utils#bun-fileurltopath) [`Bun.pathToFileURL()`](/docs/api/utils#bun-pathtofileurl) [`Bun.gzipSync()`](/docs/api/utils#bun-gzipsync) [`Bun.gunzipSync()`](/docs/api/utils#bun-gunzipsync) [`Bun.deflateSync()`](/docs/api/utils#bun-deflatesync) [`Bun.inflateSync()`](/docs/api/utils#bun-inflatesync) [`Bun.inspect()`](/docs/api/utils#bun-inspect) [`Bun.nanoseconds()`](/docs/api/utils#bun-nanoseconds) [`Bun.readableStreamTo*()`](/docs/api/utils#bun-readablestreamto) [`Bun.resolveSync()`](/docs/api/utils#bun-resolvesync)

View File

@@ -209,11 +209,22 @@ dev = true
### `install.peer`
Whether to install peer dependencies. Default `true`.
Whether to install peer dependencies. Default `false`.
```toml
[install]
peer = true
peer = false
```
### `install.github.api`
Enable using the GitHub REST API to install GitHub dependencies. Default `true`.
Private GitHub repositories will fail to install if this option is `true`, because the REST API is unauthenticated.
```toml
[install]
github.api = true
```
### `install.production`

View File

@@ -25,16 +25,6 @@ Or programmatically by assigning a property to `process.env`.
process.env.FOO = "hello";
```
### Manually specifying `.env` files
Bun supports `--env-file` to override which specific `.env` file to load. You can use `--env-file` when running scripts in bun's runtime, or when running package.json scripts.
```sh
bun --env-file=.env.1 src/index.ts
bun --env-file=.env.abc --env-file=.env.def run build
```
### Quotation marks
Bun supports double quotes, single quotes, and
@@ -85,11 +75,10 @@ The current environment variables can be accessed via `process.env`.
process.env.API_TOKEN; // => "secret"
```
Bun also exposes these variables via `Bun.env` and `import.meta.env`, which is a simple alias of `process.env`.
Bun also exposes these variables via `Bun.env`, which is a simple alias of `process.env`.
```ts
Bun.env.API_TOKEN; // => "secret"
import.meta.env.API_TOKEN; // => "secret"
```
To print all currently-set environment variables to the command line, run `bun run env`. This is useful for debugging.
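For instance, to check the example variable from above (the `KEY=value` output format is assumed):
```bash
$ bun run env | grep API_TOKEN
```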
@@ -137,11 +126,6 @@ These environment variables are read by Bun and configure aspects of its behavio
---
- `BUN_RUNTIME_TRANSPILER_CACHE_PATH`
- The runtime transpiler caches the transpiled output of source files larger than 50 kb. This makes CLIs using Bun load faster. If `BUN_RUNTIME_TRANSPILER_CACHE_PATH` is set, then the runtime transpiler will cache transpiled output to the specified directory. If `BUN_RUNTIME_TRANSPILER_CACHE_PATH` is set to an empty string or the string `"0"`, then the runtime transpiler will not cache transpiled output. If `BUN_RUNTIME_TRANSPILER_CACHE_PATH` is unset, then the runtime transpiler will cache transpiled output to the platform-specific cache directory.
---
- `TMPDIR`
- Bun occasionally requires a directory to store intermediate assets during bundling or other operations. If unset, defaults to the platform-specific temporary directory: `/tmp` on Linux, `/private/tmp` on macOS.
@@ -158,31 +142,6 @@ These environment variables are read by Bun and configure aspects of its behavio
---
- `DO_NOT_TRACK`
- Telemetry is not sent yet as of November 28th, 2023, but we are planning to add telemetry in the coming months. If `DO_NOT_TRACK=1`, then analytics are [disabled](https://do-not-track.dev/). Bun records bundle timings (so we can answer with data, "is Bun getting faster?") and feature usage (e.g., "are people actually using macros?"). The request body size is about 60 bytes, so it's not a lot of data. Equivalent of `telemetry=false` in bunfig.
- If `DO_NOT_TRACK=1`, then analytics are [disabled](https://do-not-track.dev/). Bun records bundle timings (so we can answer with data, "is Bun getting faster?") and feature usage (e.g., "are people actually using macros?"). The request body size is about 60 bytes, so it's not a lot of data. Equivalent of `telemetry=false` in bunfig.
{% /table %}
## Runtime transpiler caching
For files larger than 50 KB, Bun caches transpiled output into `$BUN_RUNTIME_TRANSPILER_CACHE_PATH` or the platform-specific cache directory. This makes CLIs using Bun load faster.
This transpiler cache is global and shared across all projects. It is safe to delete the cache at any time. It is a content-addressable cache, so it will never contain duplicate entries. It is also safe to delete the cache while a Bun process is running.
It is recommended to disable this cache when using ephemeral filesystems like Docker. Bun's Docker images automatically disable this cache.
### Disable the runtime transpiler cache
To disable the runtime transpiler cache, set `BUN_RUNTIME_TRANSPILER_CACHE_PATH` to an empty string or the string `"0"`.
```sh
BUN_RUNTIME_TRANSPILER_CACHE_PATH=0 bun run dev
```
### What does it cache?
It caches:
- The transpiled output of source files larger than 50 KB.
- The sourcemap for the transpiled output of the file.
The file extension `.pile` is used for these cached files.
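As a quick way to see this in action (the directory and entrypoint names below are placeholders):
```bash
# point the cache at a scratch directory, run a large entrypoint, then inspect it
$ BUN_RUNTIME_TRANSPILER_CACHE_PATH=/tmp/bun-transpiler-cache bun ./large-entrypoint.ts
$ ls /tmp/bun-transpiler-cache
# expect .pile files only for source files larger than 50 KB
```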

View File

@@ -30,7 +30,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:crypto`](https://nodejs.org/api/crypto.html)
🟡 Missing `Certificate` `ECDH` `X509Certificate` `checkPrime` `checkPrimeSync` `diffieHellman` `generatePrime` `generatePrimeSync` `getCipherInfo` `getFips` `hkdf` `hkdfSync` `secureHeapUsed` `setEngine` `setFips`
🟡 Missing `Certificate` `ECDH` `KeyObject` `X509Certificate` `checkPrime` `checkPrimeSync` `createPrivateKey` `createPublicKey` `createSecretKey` `diffieHellman` `generateKey` `generateKeyPair` `generateKeyPairSync` `generateKeySync` `generatePrime` `generatePrimeSync` `getCipherInfo` `getFips` `hkdf` `hkdfSync` `secureHeapUsed` `setEngine` `setFips` `sign` `verify`
Some methods are not optimized yet.
@@ -64,7 +64,7 @@ Some methods are not optimized yet.
### [`node:http2`](https://nodejs.org/api/http2.html)
🟡 Client is supported, but server isn't yet.
🔴 Not implemented.
### [`node:https`](https://nodejs.org/api/https.html)
@@ -148,7 +148,7 @@ Some methods are not optimized yet.
### [`node:util`](https://nodejs.org/api/util.html)
🟡 Missing `MIMEParams` `MIMEType` `aborted` `debug` `getSystemErrorMap` `getSystemErrorName` `transferableAbortController` `transferableAbortSignal` `stripVTControlCharacters`
🟡 Missing `MIMEParams` `MIMEType` `aborted` `debug` `getSystemErrorMap` `getSystemErrorName` `parseArgs` `transferableAbortController` `transferableAbortSignal` `stripVTControlCharacters`
### [`node:v8`](https://nodejs.org/api/v8.html)
@@ -156,7 +156,7 @@ Some methods are not optimized yet.
### [`node:vm`](https://nodejs.org/api/vm.html)
🟡 Core functionality works, but experimental VM ES modules are not implemented, including `vm.Module`, `vm.SourceTextModule`, `vm.SyntheticModule`,`importModuleDynamically`, and `vm.measureMemory`. Options like `timeout`, `breakOnSigint`, `cachedData` are not implemented yet. There is a bug with `this` value for contextified options not having the correct prototype.
🟡 Core functionality works, but VM modules are not implemented. Missing `createScript`. `ShadowRealm` can be used.
### [`node:wasi`](https://nodejs.org/api/wasi.html)

Some files were not shown because too many files have changed in this diff.