Compare commits


1 commit

Author: Jarred Sumner
SHA1: 0b80ca57d2
Message: Implement bun start
Date: 2023-07-31 06:51:09 -07:00
30513 changed files with 98076 additions and 430429 deletions


@@ -1,8 +0,0 @@
# https://EditorConfig.org
root = true
[*]
charset = utf-8
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf

.gitattributes vendored (22 lines changed)

@@ -1,22 +1,3 @@
*.css text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.js text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.jsx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.tsx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.ts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.c text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.cpp text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.cc text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.yml text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.zig text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.rs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.h text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.json text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.lock text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.map text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.md text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mjs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
.vscode/launch.json linguist-generated
src/api/schema.d.ts linguist-generated
fixture.*.c linguist-generated
@@ -24,6 +5,7 @@ src/api/schema.js linguist-generated
src/bun.js/bindings/sqlite/sqlite3.c linguist-vendored
src/bun.js/bindings/sqlite/sqlite3_local.h linguist-vendored
*.lockb binary diff=lockb
*.zig text eol=lf
src/bun.js/bindings/simdutf.cpp linguist-vendored
src/bun.js/bindings/simdutf.h linguist-vendored
@@ -49,5 +31,3 @@ src/bun.js/bindings/ZigGeneratedClasses+lazyStructureHeader.h linguist-generated
src/bun.js/bindings/ZigGeneratedClasses+lazyStructureImpl.h linguist-generated
docs/**/* linguist-documentation
packages/bun-uws/fuzzing/seed-corpus/**/* linguist-generated


@@ -0,0 +1,35 @@
name: 📥 Install Problem
description: Report an issue during install or upgrade
labels: [bug, install]
body:
- type: markdown
attributes:
value: |
Thank you for submitting a bug report. It helps make Bun better.
If you need help or support using Bun, and are not reporting an issue, please
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can ask questions in the [`#help`](https://discord.gg/32EtH6p7HN) forum.
Please try to include as much information as possible.
- type: input
attributes:
label: What platform is your computer?
description: |
For MacOS and Linux: copy the output of `uname -mprs`
For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
- type: textarea
attributes:
label: How did you attempt to install or upgrade?
description: Please provide the commands you ran to install or upgrade.
validations:
required: true
- type: textarea
attributes:
label: What do you see instead?
description: If possible, please provide text instead of a screenshot.
validations:
required: true
- type: textarea
attributes:
label: Additional information
description: Is there anything else you think we should know?


@@ -10,15 +10,11 @@ body:
If you need help or support using Bun, and are not reporting a bug, please
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can ask questions in the [`#help`](https://discord.gg/32EtH6p7HN) forum.
Make sure you are running the [latest](https://bun.sh/docs/installation#upgrading) version of Bun.
The bug you are experiencing may already have been fixed.
Please try to include as much information as possible.
- type: input
attributes:
label: What version of Bun is running?
description: Copy the output of `bun --revision`
description: Copy the output of `bun -v`
- type: input
attributes:
label: What platform is your computer?


@@ -8,7 +8,7 @@ body:
Thank you for submitting an idea. It helps make Bun better.
If you want to discuss Bun, or learn how others are using Bun, please
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can share in the [`#feedback`](https://discord.gg/unwUnHBNqy) channel.
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can share in the [`#feedback-ideas`](https://discord.gg/unwUnHBNqy) channel.
- type: textarea
attributes:
label: What is the problem this feature would solve?


@@ -0,0 +1,50 @@
name: bun-ecosystem-test
on:
schedule:
- cron: "0 15 * * *" # every day at 7am PST
workflow_dispatch:
inputs:
version:
description: "The version of Bun to run"
required: true
default: "canary"
type: string
jobs:
test:
name: ${{ matrix.tag }}
runs-on: ${{ matrix.os }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 10
strategy:
fail-fast: false
matrix:
include:
- os: ubuntu-latest
tag: linux-x64
url: linux/x64?avx2=true
- os: ubuntu-latest
tag: linux-x64-baseline
url: linux/x64?baseline=true
# FIXME: runner fails with "No tests found"?
#- os: macos-latest
# tag: darwin-x64
# url: darwin/x64?avx2=true
- os: macos-latest
tag: darwin-x64-baseline
url: darwin/x64?baseline=true
steps:
- id: checkout
name: Checkout
uses: Bhacaz/checkout-files@v2
with:
files: packages/bun-internal-test
- id: setup
name: Setup
uses: oven-sh/setup-bun@v1
with:
bun-download-url: https://bun.sh/download/${{ github.event.inputs.version }}/${{ matrix.url }}
- id: test
name: Test
working-directory: packages/bun-internal-test
run: bun run test:ecosystem


@@ -0,0 +1,41 @@
name: bun-framework-next
on:
push:
paths:
- packages/bun-framework-next/**/*
branches: [main, bun-framework-next-actions]
pull_request:
paths:
- packages/bun-framework-next/**/*
branches: [main]
jobs:
build:
name: lint, test and build on Node ${{ matrix.node }} and ${{ matrix.os }}
runs-on: ${{ matrix.os }}
strategy:
matrix:
node: ["14.x"]
os: [macOS-latest]
steps:
- name: Checkout repo
uses: actions/checkout@v2
- name: Use Node ${{ matrix.node }}
uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node }}
- name: Install PNPM
uses: pnpm/action-setup@v2.0.1
with:
version: 6.21.0
- name: Install dependencies
run: cd packages/bun-framework-next && pnpm install
- name: Type check bun-framework-next
run: cd packages/bun-framework-next && pnpm check


@@ -28,7 +28,6 @@ jobs:
runs-on: ${{matrix.runner}}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
@@ -37,7 +36,7 @@ jobs:
arch: aarch64
build_arch: arm64
runner: linux-arm64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-linux-arm64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-arm64-lto"
build_machine_arch: aarch64


@@ -37,7 +37,6 @@ jobs:
runs-on: ${{matrix.runner}}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
permissions: write-all
strategy:
fail-fast: false
matrix:
@@ -47,7 +46,7 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
- cpu: nehalem
@@ -55,7 +54,7 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
@@ -153,33 +152,13 @@ jobs:
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/release/bun-${{matrix.tag}}.zip,${{runner.temp}}/release/bun-${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
linux-test:
name: Tests ${{matrix.tag}}
runs-on: ubuntu-latest
needs: [linux]
if: github.event_name == 'pull_request'
timeout-minutes: 20
permissions:
pull-requests: write
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
failing_tests_count: ${{ steps.test.outputs.failing_tests_count }}
@@ -201,8 +180,8 @@ jobs:
with:
name: bun-${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
- id: install
name: Install
run: |
cd ${{runner.temp}}/release
unzip bun-${{matrix.tag}}.zip
@@ -210,13 +189,6 @@ jobs:
chmod +x bun
pwd >> $GITHUB_PATH
./bun --version
- id: install-dependnecies
name: Install dependencies
run: |
sudo apt-get update && sudo apt-get install -y openssl
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
@@ -225,24 +197,10 @@ jobs:
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install
bun install --cwd test
bun install --cwd packages/bun-internal-test
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2


@@ -115,36 +115,36 @@ jobs:
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: native
@@ -152,14 +152,14 @@ jobs:
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
dependencies: true
compile_obj: true
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu)
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -172,11 +172,11 @@ jobs:
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install ccache rust llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
brew install ccache rust llvm@15 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
echo "$(brew --prefix llvm@15)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@15
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
@@ -247,7 +247,6 @@ jobs:
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macos-object-files]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
@@ -256,29 +255,29 @@ jobs:
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# package: bun-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# package: bun-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
package: bun-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu)
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -291,10 +290,10 @@ jobs:
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install rust ccache llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
brew install rust ccache llvm@15 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
echo "$(brew --prefix llvm@15)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@15
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
@@ -393,32 +392,11 @@ jobs:
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/release/${{matrix.tag}}.zip,${{runner.temp}}/release/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
timeout-minutes: 30
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
@@ -441,8 +419,8 @@ jobs:
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
- id: install
name: Install
run: |
cd ${{runner.temp}}/release
unzip ${{matrix.tag}}.zip
@@ -450,12 +428,6 @@ jobs:
chmod +x bun
pwd >> $GITHUB_PATH
./bun --version
- id: install
name: Install dependencies
run: |
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
@@ -464,24 +436,10 @@ jobs:
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install
bun install --cwd test
bun install --cwd packages/bun-internal-test
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2


@@ -115,36 +115,36 @@ jobs:
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
runner: macos-12
runner: macos-11
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: nehalem
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
runner: macos-12
runner: macos-11
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: native
@@ -152,14 +152,14 @@ jobs:
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu)
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -172,11 +172,11 @@ jobs:
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install ccache rust llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
brew install ccache rust llvm@15 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
echo "$(brew --prefix llvm@15)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@15
- name: ccache (dependencies)
uses: hendrikmuhs/ccache-action@v1.2
if: matrix.dependencies
@@ -248,7 +248,6 @@ jobs:
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macos-object-files]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
@@ -257,29 +256,29 @@ jobs:
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
package: bun-darwin-x64
runner: macos-12
runner: macos-11
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
# obj: bun-obj-darwin-x64
# package: bun-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu)
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -292,10 +291,10 @@ jobs:
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install ccache rust llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
brew install ccache rust llvm@15 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
echo "$(brew --prefix llvm@15)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@15
- name: ccache (link)
uses: hendrikmuhs/ccache-action@v1.2
with:
@@ -397,32 +396,11 @@ jobs:
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/release/${{matrix.tag}}.zip,${{runner.temp}}/release/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
timeout-minutes: 30
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
@@ -432,7 +410,7 @@ jobs:
matrix:
include:
- tag: bun-darwin-x64-baseline
runner: macos-12
runner: macos-11
steps:
- id: checkout
name: Checkout
@@ -445,8 +423,8 @@ jobs:
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
- id: install
name: Install
run: |
cd ${{runner.temp}}/release
unzip ${{matrix.tag}}.zip
@@ -454,12 +432,6 @@ jobs:
chmod +x bun
pwd >> $GITHUB_PATH
./bun --version
- id: install
name: Install dependencies
run: |
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
@@ -468,27 +440,10 @@ jobs:
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install
bun install --cwd test
bun install --cwd packages/bun-internal-test
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
Hey @${{ github.actor }},
${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2


@@ -115,36 +115,36 @@ jobs:
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12
runner: macos-11
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: nehalem
# arch: x86_64
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12
runner: macos-11
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: native
@@ -152,14 +152,14 @@ jobs:
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu)
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -172,10 +172,10 @@ jobs:
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install rust ccache llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
brew install rust ccache llvm@15 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
echo "$(brew --prefix llvm@15)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@15
- name: Download WebKit
if: matrix.compile_obj
env:
@@ -250,7 +250,6 @@ jobs:
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macos-object-files]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
@@ -259,29 +258,29 @@ jobs:
# tag: bun-darwin-x64-baseline
# obj: bun-obj-darwin-x64-baseline
# package: bun-darwin-x64
# runner: macos-12
# runner: macos-11
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
package: bun-darwin-x64
runner: macos-12
runner: macos-11
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-july23/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu) --force
run: git submodule update --init --recursive --depth=1 --progress -j $(sysctl -n hw.ncpu)
- name: Install dependencies
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -294,10 +293,10 @@ jobs:
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
brew install rust ccache llvm@16 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
brew install rust ccache llvm@15 pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config esbuild --force
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@16)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@16
echo "$(brew --prefix llvm@15)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@15
- name: Download WebKit
env:
CPU_TARGET: ${{ matrix.cpu }}
@@ -399,32 +398,11 @@ jobs:
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/release/${{matrix.tag}}.zip,${{runner.temp}}/release/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
timeout-minutes: 30
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
@@ -434,7 +412,7 @@ jobs:
matrix:
include:
- tag: bun-darwin-x64
runner: macos-12
runner: macos-11
steps:
- id: checkout
name: Checkout
@@ -447,8 +425,8 @@ jobs:
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
- id: install
name: Install
run: |
cd ${{runner.temp}}/release
unzip ${{matrix.tag}}.zip
@@ -456,12 +434,6 @@ jobs:
chmod +x bun
pwd >> $GITHUB_PATH
./bun --version
- id: install
name: Install dependencies
run: |
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
@@ -470,24 +442,10 @@ jobs:
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
bun install
bun install --cwd test
bun install --cwd packages/bun-internal-test
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2

.github/workflows/bun-release-canary.yml vendored Normal file (179 lines changed)

@@ -0,0 +1,179 @@
name: bun-release-canary
concurrency: release-canary
on:
schedule:
- cron: "0 14 * * *" # every day at 6am PST
workflow_dispatch:
jobs:
sign:
name: Sign Release
runs-on: ubuntu-latest
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-release
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v3
- id: setup-gpg
name: Setup GPG
uses: crazy-max/ghaction-import-gpg@v5
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- id: bun-run
name: Sign Release
run: |
echo "$GPG_PASSPHRASE" | bun upload-assets -- "canary"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GPG_PASSPHRASE: ${{ secrets.GPG_PASSPHRASE }}
npm:
name: Release to NPM
runs-on: ubuntu-latest
needs: sign
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-release
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v3
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- id: bun-run
name: Release
run: bun upload-npm -- canary publish
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
# npm-types:
# name: Release types to NPM
# runs-on: ubuntu-latest
# defaults:
# run:
# working-directory: packages/bun-types
# steps:
# - id: checkout
# name: Checkout
# uses: actions/checkout@v3
# - id: setup-node
# name: Setup Node.js
# uses: actions/setup-node@v3
# with:
# node-version: latest
# - id: setup-bun
# name: Setup Bun
# uses: oven-sh/setup-bun@v1
# with:
# bun-version: canary
# - id: bun-install
# name: Install Dependencies
# run: bun install
# - id: setup-env
# name: Setup Environment
# run: |
# SHA=$(git rev-parse --short "$GITHUB_SHA")
# VERSION=$(bun --version)
# TAG="${VERSION}-canary.$(date '+%Y%m%d').1+${SHA}"
# echo "Setup tag: ${TAG}"
# echo "TAG=${TAG}" >> ${GITHUB_ENV}
# - id: bun-run
# name: Build
# run: bun run build
# env:
# BUN_VERSION: ${{ env.TAG }}
# - id: npm-publish
# name: Release
# uses: JS-DevTools/npm-publish@v1
# with:
# package: packages/bun-types/dist/package.json
# token: ${{ secrets.NPM_TOKEN }}
# tag: canary
docker:
name: Release to Dockerhub
runs-on: ubuntu-latest
needs: sign
if: github.repository_owner == 'oven-sh'
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v3
- id: qemu
name: Setup Docker QEMU
uses: docker/setup-qemu-action@v2
- id: buildx
name: Setup Docker buildx
uses: docker/setup-buildx-action@v2
with:
platforms: linux/amd64,linux/arm64
- id: metadata
name: Setup Docker metadata
uses: docker/metadata-action@v4
with:
images: oven/bun
tags: canary
- id: login
name: Login to Docker
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- id: push
name: Push to Docker
uses: docker/build-push-action@v3
with:
context: ./dockerhub
file: ./dockerhub/Dockerfile-debian
platforms: linux/amd64,linux/arm64
builder: ${{ steps.buildx.outputs.name }}
push: true
tags: ${{ steps.metadata.outputs.tags }}
labels: ${{ steps.metadata.outputs.labels }}
build-args: |
BUN_VERSION=canary
s3:
name: Upload to S3
runs-on: ubuntu-latest
needs: sign
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-release
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v3
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- id: bun-run
name: Release
run: bun upload-s3 -- canary
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY}}
AWS_ENDPOINT: ${{ secrets.AWS_ENDPOINT }}
AWS_BUCKET: bun


@@ -0,0 +1,54 @@
name: bun-release-types-canary
concurrency: release-canary
on:
push:
branches:
- main
paths:
- "packages/bun-types/**"
workflow_dispatch:
jobs:
npm-types:
name: Release types to NPM
runs-on: ubuntu-latest
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-types
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v3
- id: setup-node
name: Setup Node.js
uses: actions/setup-node@v3
with:
node-version: latest
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- id: setup-env
name: Setup Environment
run: |
SHA=$(git rev-parse --short "$GITHUB_SHA")
VERSION=$(bun --version)
TAG="${VERSION}-canary.$(date +'%Y%m%dT%H%M%S')"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: bun-run
name: Build
run: bun run build
env:
BUN_VERSION: ${{ env.TAG }}
- id: npm-publish
name: Release
uses: JS-DevTools/npm-publish@v1
with:
package: packages/bun-types/dist/package.json
token: ${{ secrets.NPM_TOKEN }}
tag: canary


@@ -1,71 +1,52 @@
name: bun-release
concurrency: release
env:
BUN_VERSION: ${{ github.event.inputs.tag || github.event.release.tag_name || 'canary' }}
BUN_LATEST: ${{ github.event.inputs.is-latest || github.event.release.prerelease == 'false' }}
on:
release:
types:
- published
schedule:
- cron: "0 14 * * *" # every day at 6am PST
workflow_dispatch:
inputs:
is-latest:
description: Is this the latest release?
type: boolean
default: false
tag:
type: string
description: What is the release tag? (e.g. "1.0.2", "canary")
description: The tag to publish
required: true
use-docker:
description: Should Docker images be released?
type: boolean
default: false
use-npm:
description: Should npm packages be published?
type: boolean
default: false
use-homebrew:
description: Should binaries be released to Homebrew?
type: boolean
default: false
use-s3:
description: Should binaries be uploaded to S3?
type: boolean
default: false
use-types:
description: Should types be released to npm?
type: boolean
default: false
jobs:
sign:
name: Sign Release
runs-on: ubuntu-latest
if: ${{ github.repository_owner == 'oven-sh' }}
permissions:
contents: write
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-release
steps:
- name: Checkout
- id: checkout
name: Checkout
uses: actions/checkout@v3
- name: Setup GPG
- id: setup-env
name: Setup Environment
run: |
TAG="${{ github.event.inputs.tag }}"
TAG="${TAG:-"${{ github.event.release.tag_name }}"}"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: setup-gpg
name: Setup GPG
uses: crazy-max/ghaction-import-gpg@v5
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
- name: Setup Bun
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: latest
- name: Install Dependencies
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- name: Sign Release
- id: bun-run
name: Sign Release
run: |
echo "$GPG_PASSPHRASE" | bun upload-assets -- "${{ env.BUN_VERSION }}"
echo "$GPG_PASSPHRASE" | bun upload-assets -- "${{ env.TAG }}"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GPG_PASSPHRASE: ${{ secrets.GPG_PASSPHRASE }}
@@ -73,23 +54,32 @@ jobs:
name: Release to NPM
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-npm == 'true' }}
permissions:
contents: read
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-release
steps:
- name: Checkout
- id: checkout
name: Checkout
uses: actions/checkout@v3
- name: Setup Bun
- id: setup-env
name: Setup Environment
run: |
TAG="${{ github.event.inputs.tag }}"
TAG="${TAG:-"${{ github.event.release.tag_name }}"}"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: latest
- name: Install Dependencies
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- name: Release
run: bun upload-npm -- "${{ env.BUN_VERSION }}" publish
- id: bun-run
name: Release
run: bun upload-npm -- "${{ env.TAG }}" publish
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
@@ -97,45 +87,41 @@ jobs:
name: Release types to NPM
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-types == 'true' }}
permissions:
contents: read
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-types
steps:
- name: Checkout
- id: checkout
name: Checkout
uses: actions/checkout@v3
- name: Setup Node.js
- id: setup-env
name: Setup Environment
run: |
TAG="${{ github.event.inputs.tag }}"
TAG="${TAG:-"${{ github.event.release.tag_name }}"}"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: setup-node
name: Setup Node.js
uses: actions/setup-node@v3
with:
node-version: latest
- name: Setup Bun
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: latest
- name: Install Dependencies
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- name: Setup Tag
if: ${{ env.BUN_VERSION == 'canary' }}
run: |
VERSION=$(bun --version)
TAG="${VERSION}-canary.$(date +'%Y%m%dT%H%M%S')"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- name: Build
- id: bun-run
name: Build
run: bun run build
env:
BUN_VERSION: ${{ env.TAG || env.BUN_VERSION }}
- name: Release (canary)
if: ${{ env.BUN_VERSION == 'canary' }}
uses: JS-DevTools/npm-publish@v1
with:
package: packages/bun-types/dist/package.json
token: ${{ secrets.NPM_TOKEN }}
tag: canary
- name: Release (latest)
if: ${{ env.BUN_LATEST == 'true' }}
BUN_VERSION: ${{ env.TAG }}
- id: npm-publish
name: Release
uses: JS-DevTools/npm-publish@v1
with:
package: packages/bun-types/dist/package.json
@@ -144,28 +130,20 @@ jobs:
name: Release to Dockerhub
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-docker == 'true' }}
permissions:
contents: read
strategy:
fail-fast: false
matrix:
include:
- variant: debian
suffix: ''
- variant: debian
suffix: -debian
- variant: slim
suffix: -slim
dir: debian-slim
- variant: alpine
suffix: -alpine
- variant: distroless
suffix: -distroless
if: github.repository_owner == 'oven-sh'
steps:
- name: Checkout
- id: checkout
name: Checkout
uses: actions/checkout@v3
- name: Setup Docker emulator
- id: environment
name: Setup Environment
run: |
TAG="${{ github.event.inputs.tag }}"
TAG="${TAG:-"${{ github.event.release.tag_name }}"}"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: qemu
name: Setup Docker QEMU
uses: docker/setup-qemu-action@v2
- id: buildx
name: Setup Docker buildx
@@ -177,60 +155,67 @@ jobs:
uses: docker/metadata-action@v4
with:
images: oven/bun
flavor: |
latest=false
tags: |
type=raw,value=latest,enable=${{ env.BUN_LATEST == 'true' && matrix.suffix == '' }}
type=raw,value=${{ matrix.variant }},enable=${{ env.BUN_LATEST == 'true' }}
type=match,pattern=(bun-v)?(canary|\d+.\d+.\d+),group=2,value=${{ env.BUN_VERSION }},suffix=${{ matrix.suffix }}
type=match,pattern=(bun-v)?(canary|\d+.\d+),group=2,value=${{ env.BUN_VERSION }},suffix=${{ matrix.suffix }}
type=match,pattern=(bun-v)?(canary|\d+),group=2,value=${{ env.BUN_VERSION }},suffix=${{ matrix.suffix }}
- name: Login to Docker
type=match,pattern=(bun-v)?(\d+.\d+.\d+),group=2,value=${{ env.TAG }}
type=match,pattern=(bun-v)?(\d+.\d+),group=2,value=${{ env.TAG }}
- id: login
name: Login to Docker
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Push to Docker
- id: push
name: Push to Docker
uses: docker/build-push-action@v3
with:
context: ./dockerhub/${{ matrix.dir || matrix.variant }}
context: ./dockerhub
file: ./dockerhub/Dockerfile-debian
platforms: linux/amd64,linux/arm64
builder: ${{ steps.buildx.outputs.name }}
push: true
tags: ${{ steps.metadata.outputs.tags }}
labels: ${{ steps.metadata.outputs.labels }}
build-args: |
BUN_VERSION=${{ env.BUN_VERSION }}
BUN_VERSION=${{ env.TAG }}
homebrew:
name: Release to Homebrew
runs-on: ubuntu-latest
needs: sign
permissions:
contents: read
if: ${{ github.event_name == 'release' || github.event.inputs.use-homebrew == 'true' }}
if: github.repository_owner == 'oven-sh'
steps:
- name: Checkout
- id: checkout
name: Checkout
uses: actions/checkout@v3
with:
repository: oven-sh/homebrew-bun
token: ${{ secrets.ROBOBUN_TOKEN }}
- id: gpg
- id: setup-gpg
name: Setup GPG
uses: crazy-max/ghaction-import-gpg@v5
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
- name: Setup Ruby
- id: setup-env
name: Setup Environment
run: |
TAG="${{ github.event.inputs.tag }}"
TAG="${TAG:-"${{ github.event.release.tag_name }}"}"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: setup-ruby
name: Setup Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: "2.6"
- name: Update Tap
run: ruby scripts/release.rb "${{ env.BUN_VERSION }}"
- name: Commit Tap
- id: update-tap
name: Update Tap
run: ruby scripts/release.rb "${{ env.TAG }}"
- id: commit-tap
name: Commit Tap
uses: stefanzweifel/git-auto-commit-action@v4
with:
commit_options: --gpg-sign=${{ steps.gpg.outputs.keyid }}
commit_message: Release ${{ env.BUN_VERSION }}
commit_options: --gpg-sign=${{ steps.setup-gpg.outputs.keyid }}
commit_message: Release ${{ env.TAG }}
commit_user_name: robobun
commit_user_email: robobun@oven.sh
commit_author: robobun <robobun@oven.sh>
@@ -238,23 +223,32 @@ jobs:
name: Upload to S3
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-s3 == 'true' }}
permissions:
contents: read
if: github.repository_owner == 'oven-sh'
defaults:
run:
working-directory: packages/bun-release
steps:
- name: Checkout
- id: checkout
name: Checkout
uses: actions/checkout@v3
- name: Setup Bun
- id: setup-env
name: Setup Environment
run: |
TAG="${{ github.event.inputs.tag }}"
TAG="${TAG:-"${{ github.event.release.tag_name }}"}"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- id: setup-bun
name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: latest
- name: Install Dependencies
bun-version: canary
- id: bun-install
name: Install Dependencies
run: bun install
- name: Release
run: bun upload-s3 -- "${{ env.BUN_VERSION }}"
- id: bun-run
name: Release
run: bun upload-s3 -- "${{ env.TAG }}"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}


@@ -12,8 +12,6 @@ jobs:
prettier-fmt:
name: prettier
runs-on: ubuntu-latest
permissions:
pull-requests: write
outputs:
prettier_fmt_errs: ${{ steps.fmt.outputs.prettier_fmt_errs }}
steps:


@@ -1,7 +1,7 @@
name: zig-fmt
env:
ZIG_VERSION: 0.12.0-dev.163+6780a6bbf
ZIG_VERSION: 0.11.0-dev.4006+bf827d0b5
on:
pull_request:
@@ -18,8 +18,6 @@ jobs:
zig-fmt:
name: zig fmt
runs-on: ubuntu-latest
permissions:
pull-requests: write
outputs:
zig_fmt_errs: ${{ steps.fmt.outputs.zig_fmt_errs }}
steps:

.gitignore vendored (12 lines changed)

@@ -6,7 +6,6 @@ packages/*/*.wasm
profile.json
node_modules
.envrc
.swcrc
yarn.lock
dist
@@ -97,8 +96,6 @@ packages/bun-wasm/*.cjs
packages/bun-wasm/*.map
packages/bun-wasm/*.js
packages/bun-wasm/*.d.ts
packages/bun-wasm/*.d.cts
packages/bun-wasm/*.d.mts
*.bc
src/fallback.version
@@ -124,14 +121,7 @@ cold-jsc-start
cold-jsc-start.d
/test.ts
/test.js
src/js/out/modules*
src/js/out/functions*
src/js/out/tmp
src/js/out/DebugPath.h
src/js/out/modules_dev
make-dev-stats.csv
.uuid
tsconfig.tsbuildinfo

.gitmodules vendored (7 lines changed)

@@ -48,6 +48,13 @@ ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/uws"]
path = src/deps/uws
url = https://github.com/Jarred-Sumner/uWebSockets
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = true
[submodule "src/deps/tinycc"]
path = src/deps/tinycc
url = https://github.com/Jarred-Sumner/tinycc.git


@@ -10,4 +10,4 @@ fi
# sets up vscode C++ intellisense
rm -f .vscode/clang++
ln -s $(which clang++-16 || which clang++) .vscode/clang++ 2>/dev/null
ln -s $(which clang++-15 || which clang++) .vscode/clang++ 2>/dev/null

.scripts/write-versions.sh Executable file → Normal file (3 lines changed)

@@ -7,9 +7,11 @@ LIBARCHIVE_VERSION=$(git rev-parse HEAD:./src/deps/libarchive)
PICOHTTPPARSER_VERSION=$(git rev-parse HEAD:./src/deps/picohttpparser)
BORINGSSL_VERSION=$(git rev-parse HEAD:./src/deps/boringssl)
ZLIB_VERSION=$(git rev-parse HEAD:./src/deps/zlib)
UWS_VERSION=$(git rev-parse HEAD:./src/deps/uws)
LOLHTML=$(git rev-parse HEAD:./src/deps/lol-html)
TINYCC=$(git rev-parse HEAD:./src/deps/tinycc)
C_ARES=$(git rev-parse HEAD:./src/deps/c-ares)
USOCKETS=$(cd src/deps/uws/uSockets && git rev-parse HEAD)
rm -rf src/generated_versions_list.zig
echo "// AUTO-GENERATED FILE. Created via .scripts/write-versions.sh" >src/generated_versions_list.zig
@@ -18,6 +20,7 @@ echo "pub const boringssl = \"$BORINGSSL_VERSION\";" >>src/generated_versions_li
echo "pub const libarchive = \"$LIBARCHIVE_VERSION\";" >>src/generated_versions_list.zig
echo "pub const mimalloc = \"$MIMALLOC_VERSION\";" >>src/generated_versions_list.zig
echo "pub const picohttpparser = \"$PICOHTTPPARSER_VERSION\";" >>src/generated_versions_list.zig
echo "pub const uws = \"$UWS_VERSION\";" >>src/generated_versions_list.zig
echo "pub const webkit = \"$WEBKIT_VERSION\";" >>src/generated_versions_list.zig
echo "pub const zig = @import(\"std\").fmt.comptimePrint(\"{}\", .{@import(\"builtin\").zig_version});" >>src/generated_versions_list.zig
echo "pub const zlib = \"$ZLIB_VERSION\";" >>src/generated_versions_list.zig


@@ -15,14 +15,11 @@
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/modules/",
"${workspaceFolder}/src/js/builtins/",
"${workspaceFolder}/src/js/out",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/"
"${workspaceFolder}/src/deps/uws/uSockets/src"
],
"browse": {
"path": [
@@ -34,8 +31,6 @@
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
@@ -44,9 +39,7 @@
"${workspaceFolder}/src/bun.js/modules/*",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/packages/bun-usockets/",
"${workspaceFolder}/packages/bun-uws/",
"${workspaceFolder}/src/napi"
"${workspaceFolder}/src/deps/uws/uSockets/src"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb"

.vscode/launch.json generated vendored (23 lines changed)

@@ -4,6 +4,8 @@
// it makes our tests very slow
// But it helps catch memory bugs
// SIGHUP must be ignored or the debugger will pause when a spawned subprocess exits:
// { "initCommands": ["process handle -p false -s false -n false SIGHUP"] }
"version": "0.2.0",
"configurations": [
{
@@ -19,6 +21,7 @@
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -33,6 +36,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
@@ -47,6 +51,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -61,6 +66,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -75,6 +81,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -89,6 +96,7 @@
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -103,6 +111,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -117,6 +126,7 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -124,12 +134,13 @@
"request": "launch",
"name": "bun run [file]",
"program": "bun-debug",
"args": ["run", "${file}", "${file}"],
"args": ["run", "${file}"],
"cwd": "${fileDirname}",
"env": {
"FORCE_COLOR": "1",
"NODE_ENV": "development"
"BUN_DEBUG_QUIET_LOGS": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -144,6 +155,7 @@
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -156,6 +168,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -168,6 +181,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -180,6 +194,7 @@
"env": {
"FORCE_COLOR": "1"
},
"initCommands": ["process handle -p false -s false -n false SIGHUP"],
"console": "internalConsole"
},
{
@@ -201,7 +216,9 @@
"console": "internalConsole",
"env": {
"BUN_CONFIG_MINIFY_WHITESPACE": "1"
}
},
// SIGHUP must be ignored or the debugger will pause when a spawned subprocess exits.
"initCommands": ["process handle -p false -s false -n false SIGHUP"]
},
{
"type": "lldb",

View File

@@ -27,8 +27,7 @@
"editor.formatOnSave": true
},
"zig.zls.enableInlayHints": false,
"zig.zls.enabled": true,
"git.ignoreSubmodules": true,
"[jsx]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
@@ -80,10 +79,7 @@
"src/deps/tinycc": true,
"src/deps/zstd": true,
"test/snippets/package-json-exports/_node_modules_copy": true,
"src/js/out": true,
"packages/bun-uws/fuzzing/seed-corpus/": true,
"**/*.dep": true,
"**/CMakeFiles": true
"src/js/out": true
},
"C_Cpp.files.exclude": {
"**/.vscode": true,

View File

@@ -47,18 +47,34 @@ TODO: document this (see [`bindings.zig`](src/bun.js/bindings/bindings.zig) and
Copy from examples like `Subprocess` or `Response`.
### ESM Modules and Builtins JS
### ESM modules
Bun implements ESM modules in a mix of native code and JavaScript.
Several Node.js modules are implemented in JavaScript and loosely based on browserify polyfills.
Builtin modules in Bun are located in [`src/js`](src/js/). These files are transpiled and support a JavaScriptCore-only syntax for internal slots, which is explained further in [`src/js/README.md`](src/js/README.md).
Native C++ modules are in `src/bun.js/modules/`.
The ESM modules in Bun are located in [`src/bun.js/*.exports.js`](src/bun.js/). Unlike other code in Bun, these files are NOT transpiled. They are loaded directly into the JavaScriptCore VM. That means `require` does not work in these files. Instead, you must use `import.meta.require`, or ideally, not use require/import other files at all.
The module loader is in [`src/bun.js/module_loader.zig`](src/bun.js/module_loader.zig).
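As a rough illustration of that older pattern (this is a hypothetical file, not an actual Bun module — the name and exports are made up), an `*.exports.js` file looks roughly like this:
```js
// hypothetical_example.exports.js — loaded directly by JavaScriptCore, never transpiled.
// `require` is not defined in these files, so `import.meta.require` is used instead.
var EventEmitter = import.meta.require("node:events").EventEmitter;

function createEmitter() {
  return new EventEmitter();
}

export { createEmitter };
export default { createEmitter };
```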
### JavaScript Builtins
TODO: update this with the new build process that uses TypeScript and `$` instead of `@`.
JavaScript builtins are located in [`src/js/builtins/*.ts`](src/js/builtins).
These files support a JavaScriptCore-only syntax for internal slots. `@` is used to access an internal slot. For example: `new @Array(123)` will create a new `Array` similar to `new Array(123)`, except if a library modifies the `Array` global, it will not affect the internal slot (`@Array`). These names must be allow-listed in `BunBuiltinNames.h` (though JavaScriptCore allowlists some names by default).
They cannot use or reference ESM modules. Files that end with `*Internals.js` are automatically loaded globally. Most usage of internals right now is in the stream implementations (which share a lot of code with Safari/WebKit) and ImportMetaObject (which is how `require` is implemented in the runtime).
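For a concrete feel of this syntax, here is a hypothetical builtin sketch (not an actual file in `src/js/builtins`; the function name is made up) using the `@` internal-slot form described above:
```js
// Hypothetical builtin sketch. `@Array` resolves through an internal slot,
// so overwriting the global `Array` from user code does not affect it.
function toArraySnapshot(iterable) {
  "use strict";
  var result = new @Array(0);
  var i = 0;
  for (var item of iterable) result[i++] = item;
  return result;
}
```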
To regenerate the builtins:
```sh
make clean-bindings && make generate-builtins && make bindings -j10
```
It is recommended that you have ccache installed or else you will spend a lot of time waiting for the bindings to compile.
### Memory management in Bun's JavaScript runtime
TODO: fill this out (for now, use `JSC.Strong` in most cases)

View File

@@ -10,9 +10,9 @@ ARG ARCH=x86_64
ARG BUILD_MACHINE_ARCH=x86_64
ARG TRIPLET=${ARCH}-linux-gnu
ARG BUILDARCH=amd64
ARG WEBKIT_TAG=2023-oct3
ARG WEBKIT_TAG=2023-july23
ARG ZIG_TAG=jul1
ARG ZIG_VERSION="0.12.0-dev.163+6780a6bbf"
ARG ZIG_VERSION="0.11.0-dev.4006+bf827d0b5"
ARG WEBKIT_BASENAME="bun-webkit-linux-$BUILDARCH"
ARG ZIG_FOLDERNAME=zig-linux-${BUILD_MACHINE_ARCH}-${ZIG_VERSION}
@@ -20,17 +20,19 @@ ARG ZIG_FILENAME=${ZIG_FOLDERNAME}.tar.xz
ARG WEBKIT_URL="https://github.com/oven-sh/WebKit/releases/download/$WEBKIT_TAG/${WEBKIT_BASENAME}.tar.gz"
ARG ZIG_URL="https://ziglang.org/builds/${ZIG_FILENAME}"
ARG GIT_SHA=""
ARG BUN_BASE_VERSION=1.0
ARG BUN_BASE_VERSION=0.7
FROM bitnami/minideb:bullseye as bun-base
RUN install_packages ca-certificates curl wget lsb-release software-properties-common gnupg gnupg1 gnupg2 && \
echo "deb https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-16 main" > /etc/apt/sources.list.d/llvm.list && \
echo "deb-src https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-16 main" >> /etc/apt/sources.list.d/llvm.list && \
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | apt-key add - && \
curl -fsSL https://deb.nodesource.com/setup_lts.x | bash - && \
install_packages \
RUN install_packages ca-certificates curl wget lsb-release software-properties-common gnupg gnupg1 gnupg2
RUN wget https://apt.llvm.org/llvm.sh && \
chmod +x llvm.sh && \
./llvm.sh 15
RUN install_packages \
cmake \
curl \
file \
git \
gnupg \
@@ -44,16 +46,16 @@ RUN install_packages ca-certificates curl wget lsb-release software-properties-c
rsync \
ruby \
unzip \
clang-16 \
lld-16 \
lldb-16 \
clangd-16 \
xz-utils \
bash tar gzip ccache nodejs && \
bash tar gzip ccache
ENV CXX=clang++-15
ENV CC=clang-15
RUN curl -fsSL https://deb.nodesource.com/setup_lts.x | bash - && \
install_packages nodejs && \
npm install -g esbuild
ENV CXX=clang++-16
ENV CC=clang-16
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
@@ -70,10 +72,10 @@ ARG ZIG_FILENAME
ENV WEBKIT_OUT_DIR=${WEBKIT_DIR}
ENV BUILDARCH=${BUILDARCH}
ENV AR=/usr/bin/llvm-ar-16
ENV AR=/usr/bin/llvm-ar-15
ENV ZIG "${ZIG_PATH}/zig"
ENV PATH="$ZIG/bin:$PATH"
ENV LD=lld-16
ENV LD=lld-15
RUN mkdir -p $BUN_DIR $BUN_DEPS_OUT_DIR
@@ -155,7 +157,7 @@ COPY src/deps/lol-html ${BUN_DIR}/src/deps/lol-html
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache export PATH=$PATH:$HOME/.cargo/bin && export CC=$(which clang-16) && cd ${BUN_DIR} && \
RUN --mount=type=cache,target=/ccache export PATH=$PATH:$HOME/.cargo/bin && export CC=$(which clang-15) && cd ${BUN_DIR} && \
make lolhtml && rm -rf src/deps/lol-html Makefile
FROM bun-base as mimalloc
@@ -282,18 +284,16 @@ ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY packages/bun-uws ${BUN_DIR}/packages/bun-uws
COPY packages/bun-usockets ${BUN_DIR}/packages/bun-usockets
COPY src/deps/uws ${BUN_DIR}/src/deps/uws
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
COPY src/deps/c-ares/include ${BUN_DIR}/src/deps/c-ares/include
COPY src/deps/libuwsockets.cpp ${BUN_DIR}/src/deps/libuwsockets.cpp
COPY src/deps/_libusockets.h ${BUN_DIR}/src/deps/_libusockets.h
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make uws && rm -rf packages/bun-uws Makefile
make uws && rm -rf src/deps/uws Makefile
FROM bun-base as base64

206
Makefile
View File

@@ -6,6 +6,8 @@ BUN_AUTO_UPDATER_REPO = Jarred-Sumner/bun-releases-for-updater
CMAKE_CXX_COMPILER_LAUNCHER_FLAG :=
# 'make' command will trigger the help target
.DEFAULT_GOAL := help
@@ -18,7 +20,7 @@ CPU_TARGET ?= native
MARCH_NATIVE = -mtune=$(CPU_TARGET)
NATIVE_OR_OLD_MARCH =
MMD_IF_LOCAL =
MMD_IF_LOCAL =
DEFAULT_MIN_MACOS_VERSION=
ARCH_NAME :=
DOCKER_BUILDARCH =
@@ -38,7 +40,7 @@ NATIVE_OR_OLD_MARCH = -march=nehalem
endif
MIN_MACOS_VERSION ?= $(DEFAULT_MIN_MACOS_VERSION)
BUN_BASE_VERSION = 1.0
BUN_BASE_VERSION = 0.7
CI ?= false
@@ -82,9 +84,9 @@ ZIG ?= $(shell which zig 2>/dev/null || echo -e "error: Missing zig. Please make
# This is easier to happen than you'd expect.
# Using realpath here causes issues because clang uses clang++ as a symlink
# so if that's resolved, it won't build for C++
REAL_CC = $(shell which clang-16 2>/dev/null || which clang 2>/dev/null)
REAL_CXX = $(shell which clang++-16 2>/dev/null || which clang++ 2>/dev/null)
CLANG_FORMAT = $(shell which clang-format-16 2>/dev/null || which clang-format 2>/dev/null)
REAL_CC = $(shell which clang-15 2>/dev/null || which clang 2>/dev/null)
REAL_CXX = $(shell which clang++-15 2>/dev/null || which clang++ 2>/dev/null)
CLANG_FORMAT = $(shell which clang-format-15 2>/dev/null || which clang-format 2>/dev/null)
CC = $(REAL_CC)
CXX = $(REAL_CXX)
@@ -108,14 +110,14 @@ CC_WITH_CCACHE = $(CCACHE_PATH) $(CC)
ifeq ($(OS_NAME),darwin)
# Find LLVM
ifeq ($(wildcard $(LLVM_PREFIX)),)
LLVM_PREFIX = $(shell brew --prefix llvm@16)
LLVM_PREFIX = $(shell brew --prefix llvm@15)
endif
ifeq ($(wildcard $(LLVM_PREFIX)),)
LLVM_PREFIX = $(shell brew --prefix llvm)
endif
ifeq ($(wildcard $(LLVM_PREFIX)),)
# This is kinda ugly, but I can't find a better way to error :(
LLVM_PREFIX = $(shell echo -e "error: Unable to find llvm. Please run 'brew install llvm@16' or set LLVM_PREFIX=/path/to/llvm")
LLVM_PREFIX = $(shell echo -e "error: Unable to find llvm. Please run 'brew install llvm@15' or set LLVM_PREFIX=/path/to/llvm")
endif
LDFLAGS += -L$(LLVM_PREFIX)/lib
@@ -155,7 +157,7 @@ CMAKE_FLAGS_WITHOUT_RELEASE = -DCMAKE_C_COMPILER=$(CC) \
-DCMAKE_OSX_DEPLOYMENT_TARGET=$(MIN_MACOS_VERSION) \
$(CMAKE_CXX_COMPILER_LAUNCHER_FLAG) \
-DCMAKE_AR=$(AR) \
-DCMAKE_RANLIB=$(which llvm-16-ranlib 2>/dev/null || which llvm-ranlib 2>/dev/null)
-DCMAKE_RANLIB=$(which llvm-15-ranlib 2>/dev/null || which llvm-ranlib 2>/dev/null)
@@ -177,7 +179,7 @@ endif
ifeq ($(OS_NAME),linux)
LIBICONV_PATH =
AR = $(shell which llvm-ar-16 2>/dev/null || which llvm-ar 2>/dev/null || which ar 2>/dev/null)
AR = $(shell which llvm-ar-15 2>/dev/null || which llvm-ar 2>/dev/null || which ar 2>/dev/null)
endif
OPTIMIZATION_LEVEL=-O3 $(MARCH_NATIVE)
@@ -274,7 +276,7 @@ STRIP=/usr/bin/strip
endif
ifeq ($(OS_NAME),linux)
STRIP=$(shell which llvm-strip 2>/dev/null || which llvm-strip-16 2>/dev/null || which strip 2>/dev/null || echo "Missing strip")
STRIP=$(shell which llvm-strip 2>/dev/null || which llvm-strip-15 2>/dev/null || which strip 2>/dev/null || echo "Missing strip")
endif
@@ -349,10 +351,10 @@ LINUX_INCLUDE_DIRS := $(ALL_JSC_INCLUDE_DIRS) \
-I$(ZLIB_INCLUDE_DIR)
UWS_INCLUDE_DIR := -I$(BUN_DIR)/packages/bun-usockets/src -I$(BUN_DIR)/packages -I$(BUN_DEPS_DIR)
UWS_INCLUDE_DIR := -I$(BUN_DEPS_DIR)/uws/uSockets/src -I$(BUN_DEPS_DIR)/uws/src -I$(BUN_DEPS_DIR)
INCLUDE_DIRS := $(UWS_INCLUDE_DIR) -I$(BUN_DEPS_DIR)/mimalloc/include -I$(BUN_DEPS_DIR)/zstd/include -Isrc/napi -I$(BUN_DEPS_DIR)/boringssl/include -I$(BUN_DEPS_DIR)/c-ares/include -Isrc/bun.js/modules
INCLUDE_DIRS := $(UWS_INCLUDE_DIR) -I$(BUN_DEPS_DIR)/mimalloc/include -I$(BUN_DEPS_DIR)/zstd/include -Isrc/napi -I$(BUN_DEPS_DIR)/boringssl/include -I$(BUN_DEPS_DIR)/c-ares/include
ifeq ($(OS_NAME),linux)
@@ -401,7 +403,6 @@ CLANG_FLAGS = $(INCLUDE_DIRS) \
-DSTATICALLY_LINKED_WITH_BMALLOC=1 \
-DBUILDING_WITH_CMAKE=1 \
-DBUN_SINGLE_THREADED_PER_VM_ENTRY_SCOPE=1 \
-DNAPI_EXPERIMENTAL=ON \
-DNDEBUG=1 \
-DNOMINMAX \
-DIS_BUILD \
@@ -550,18 +551,27 @@ tinycc:
cd $(TINYCC_DIR) && \
make clean && \
AR=$(AR) $(CCACHE_CC_FLAG) CFLAGS='$(CFLAGS_WITHOUT_MARCH) $(NATIVE_OR_OLD_MARCH) -mtune=native $(TINYCC_CFLAGS)' ./configure --enable-static --cc=$(CCACHE_CC_OR_CC) --ar=$(AR) --config-predefs=yes && \
make libtcc.a -j10 && \
make -j10 && \
cp $(TINYCC_DIR)/*.a $(BUN_DEPS_OUT_DIR)
PYTHON=$(shell which python 2>/dev/null || which python3 2>/dev/null || which python2 2>/dev/null)
.PHONY: builtins
builtins:
NODE_ENV=production bun src/js/builtins/codegen/index.ts --minify
.PHONY: esm
js: # to rebundle js (rebuilding binary not needed to reload js code)
NODE_ENV=production bun src/js/_codegen/index.ts
esm:
NODE_ENV=production bun src/js/build-esm.ts
esm-debug:
BUN_DEBUG_QUIET_LOGS=1 NODE_ENV=production bun-debug src/js/build-esm.ts
.PHONY: generate-builtins
generate-builtins: builtins
BUN_TYPES_REPO_PATH ?= $(realpath packages/bun-types)
ifeq ($(DEBUG),true)
@@ -661,10 +671,10 @@ else
PKGNAME_NINJA := ninja-build
endif
.PHONY: assert-deps
assert-deps:
.PHONY: require
require:
@echo "Checking if the required utilities are available..."
@if [ $(CLANG_VERSION) -lt "15" ]; then echo -e "ERROR: clang version >=15 required, found: $(CLANG_VERSION). Install with:\n\n $(POSIX_PKG_MANAGER) install llvm@16"; exit 1; fi
@if [ $(CLANG_VERSION) -lt "15" ]; then echo -e "ERROR: clang version >=15 required, found: $(CLANG_VERSION). Install with:\n\n $(POSIX_PKG_MANAGER) install llvm@15"; exit 1; fi
@cmake --version >/dev/null 2>&1 || (echo -e "ERROR: cmake is required."; exit 1)
@$(PYTHON) --version >/dev/null 2>&1 || (echo -e "ERROR: python is required."; exit 1)
@$(ESBUILD) --version >/dev/null 2>&1 || (echo -e "ERROR: esbuild is required."; exit 1)
@@ -674,9 +684,6 @@ assert-deps:
@which $(LIBTOOL) > /dev/null || (echo -e "ERROR: libtool is required. Install with:\n\n $(POSIX_PKG_MANAGER) install libtool"; exit 1)
@which ninja > /dev/null || (echo -e "ERROR: Ninja is required. Install with:\n\n $(POSIX_PKG_MANAGER) install $(PKGNAME_NINJA)"; exit 1)
@which pkg-config > /dev/null || (echo -e "ERROR: pkg-config is required. Install with:\n\n $(POSIX_PKG_MANAGER) install pkg-config"; exit 1)
@which rustc > /dev/null || (echo -e "ERROR: rustc is required." exit 1)
@which cargo > /dev/null || (echo -e "ERROR: cargo is required." exit 1)
@test $(shell cargo --version | awk '{print $$2}' | cut -d. -f2) -gt 57 || (echo -e "ERROR: cargo version must be at least 1.57."; exit 1)
@echo "You have the dependencies installed! Woo"
# the following allows you to run `make submodule` to update or init submodules. but we will exclude webkit
@@ -707,46 +714,44 @@ dev-build-obj-wasm:
.PHONY: dev-wasm
dev-wasm: dev-build-obj-wasm
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init', '_getTests']" \
-g2 -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/$(MIMALLOC_FILE).wasm \
packages/debug-bun-freestanding-wasm32/bun-wasm.o --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init']" \
-g -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/libmimalloc.a.wasm \
packages/debug-bun-freestanding-wasm32/bun-wasm.o $(OPTIMIZATION_LEVEL) --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
-o packages/debug-bun-freestanding-wasm32/bun-wasm.wasm
cp packages/debug-bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
cp packages/debug-bun-freestanding-wasm32/bun-wasm.wasm src/api/demo/public/bun-wasm.wasm
.PHONY: build-obj-wasm
build-obj-wasm:
$(ZIG) build bun-wasm -Doptimize=ReleaseFast -Dtarget=wasm32-freestanding
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init', '_getTests']" \
-s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/$(MIMALLOC_FILE).wasm \
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init']" \
-g -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/libmimalloc.a.wasm \
packages/bun-freestanding-wasm32/bun-wasm.o $(OPTIMIZATION_LEVEL) --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
-o packages/bun-freestanding-wasm32/bun-wasm.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm src/api/demo/public/bun-wasm.wasm
.PHONY: build-obj-wasm-small
build-obj-wasm-small:
$(ZIG) build bun-wasm -Doptimize=ReleaseFast -Dtarget=wasm32-freestanding
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init', '_getTests']" \
-Oz -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/$(MIMALLOC_FILE).wasm \
$(ZIG) build bun-wasm -Doptimize=ReleaseSmall -Dtarget=wasm32-freestanding
emcc -sEXPORTED_FUNCTIONS="['_bun_free', '_cycleStart', '_cycleEnd', '_bun_malloc', '_scan', '_transform', '_init']" \
-g -s ERROR_ON_UNDEFINED_SYMBOLS=0 -DNDEBUG \
$(BUN_DEPS_DIR)/libmimalloc.a.wasm \
packages/bun-freestanding-wasm32/bun-wasm.o -Oz --no-entry --allow-undefined -s ASSERTIONS=0 -s ALLOW_MEMORY_GROWTH=1 -s WASM_BIGINT=1 \
-o packages/bun-freestanding-wasm32/bun-wasm.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
cp packages/bun-freestanding-wasm32/bun-wasm.wasm src/api/demo/public/bun-wasm.wasm
.PHONY: wasm
wasm: api mimalloc-wasm build-obj-wasm-small
@rm -rf packages/bun-wasm/*.{d.ts,d.cts,d.mts,js,wasm,cjs,mjs,tsbuildinfo}
wasm: api build-obj-wasm-small
@rm -rf packages/bun-wasm/*.{d.ts,js,wasm,cjs,mjs,tsbuildinfo}
@cp packages/bun-freestanding-wasm32/bun-wasm.wasm packages/bun-wasm/bun.wasm
@cp src/api/schema.d.ts packages/bun-wasm/schema.d.ts
@cp src/api/schema.js packages/bun-wasm/schema.js
@cd packages/bun-wasm && $(NPM_CLIENT) run tsc -- -p .
@cp packages/bun-wasm/index.d.ts packages/bun-wasm/index.d.cts
@mv packages/bun-wasm/index.d.ts packages/bun-wasm/index.d.mts
@bun build --sourcemap=external --external=fs --outdir=packages/bun-wasm --target=browser --minify ./packages/bun-wasm/index.ts
@$(ESBUILD) --sourcemap=external --external:fs --define:process.env.NODE_ENV='"production"' --outdir=packages/bun-wasm --target=esnext --bundle packages/bun-wasm/index.ts --format=esm --minify 2> /dev/null
@mv packages/bun-wasm/index.js packages/bun-wasm/index.mjs
@mv packages/bun-wasm/index.js.map packages/bun-wasm/index.mjs.map
@$(ESBUILD) --sourcemap=external --external:fs --outdir=packages/bun-wasm --target=esnext --bundle packages/bun-wasm/index.ts --format=cjs --minify --platform=node 2> /dev/null
@$(ESBUILD) --sourcemap=external --external:fs --define:process.env.NODE_ENV='"production"' --outdir=packages/bun-wasm --target=esnext --bundle packages/bun-wasm/index.ts --format=cjs --minify --platform=node 2> /dev/null
@mv packages/bun-wasm/index.js packages/bun-wasm/index.cjs
@mv packages/bun-wasm/index.js.map packages/bun-wasm/index.cjs.map
@rm -rf packages/bun-wasm/*.tsbuildinfo
@@ -760,17 +765,17 @@ build-obj-safe:
UWS_CC_FLAGS = -pthread -DLIBUS_USE_OPENSSL=1 -DUWS_HTTPRESPONSE_NO_WRITEMARK=1 -DLIBUS_USE_BORINGSSL=1 -DWITH_BORINGSSL=1 -Wpedantic -Wall -Wextra -Wsign-conversion -Wconversion $(UWS_INCLUDE) -DUWS_WITH_PROXY
UWS_CXX_FLAGS = $(UWS_CC_FLAGS) -std=$(CXX_VERSION) -fno-exceptions -fno-rtti
UWS_LDFLAGS = -I$(BUN_DEPS_DIR)/boringssl/include -I$(ZLIB_INCLUDE_DIR)
USOCKETS_DIR = $(BUN_DIR)/packages/bun-usockets
USOCKETS_SRC_DIR = $(USOCKETS_DIR)/src
USOCKETS_DIR = $(BUN_DEPS_DIR)/uws/uSockets/
USOCKETS_SRC_DIR = $(BUN_DEPS_DIR)/uws/uSockets/src/
usockets:
rm -rf $(USOCKETS_DIR)/*.i $(USOCKETS_DIR)/*.bc $(USOCKETS_DIR)/*.o $(USOCKETS_DIR)/*.s $(USOCKETS_DIR)/*.ii $(USOCKETS_DIR)/*.s
cd $(USOCKETS_DIR) && $(CC_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(UWS_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.c) $(wildcard $(USOCKETS_SRC_DIR)/**/*.c)
cd $(USOCKETS_DIR) && $(CXX_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CXXFLAGS) $(UWS_CXX_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.cpp) $(wildcard $(USOCKETS_SRC_DIR)/**/*.cpp)
rm -rf $(BUN_DEPS_DIR)/uws/uSockets/*.o $(BUN_DEPS_DIR)/uws/uSockets/**/*.o $(BUN_DEPS_DIR)/uws/uSockets/*.a $(BUN_DEPS_DIR)/uws/uSockets/*.bc
cd $(USOCKETS_DIR) && $(CC_WITH_CCACHE) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(UWS_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.c) $(wildcard $(USOCKETS_SRC_DIR)/**/*.c)
cd $(USOCKETS_DIR) && $(CXX_WITH_CCACHE) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CXXFLAGS) $(UWS_CXX_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.cpp) $(wildcard $(USOCKETS_SRC_DIR)/**/*.cpp)
cd $(USOCKETS_DIR) && $(AR) rcvs $(BUN_DEPS_OUT_DIR)/libusockets.a $(USOCKETS_DIR)/*.{o,bc}
uws: usockets
$(CXX_WITH_CCACHE) -O2 $(EMIT_LLVM_FOR_RELEASE) -fPIC -I$(USOCKETS_SRC_DIR) $(CLANG_FLAGS) $(CFLAGS) $(UWS_CXX_FLAGS) $(UWS_LDFLAGS) $(PLATFORM_LINKER_FLAGS) -c -I$(BUN_DEPS_DIR) $(BUN_DEPS_OUT_DIR)/libusockets.a $(BUN_DEPS_DIR)/libuwsockets.cpp -o $(BUN_DEPS_OUT_DIR)/libuwsockets.o
$(CXX_WITH_CCACHE) -O2 $(EMIT_LLVM_FOR_RELEASE) -fPIC -I$(BUN_DEPS_DIR)/uws/uSockets/src $(CLANG_FLAGS) $(CFLAGS) $(UWS_CXX_FLAGS) $(UWS_LDFLAGS) $(PLATFORM_LINKER_FLAGS) -c -I$(BUN_DEPS_DIR) $(BUN_DEPS_OUT_DIR)/libusockets.a $(BUN_DEPS_DIR)/libuwsockets.cpp -o $(BUN_DEPS_OUT_DIR)/libuwsockets.o
.PHONY: sign-macos-x64
sign-macos-x64:
@@ -811,7 +816,7 @@ fmt-cpp:
.PHONY: fmt-zig
fmt-zig:
cd src && $(ZIG) fmt **/*.zig
cd src && zig fmt **/*.zig
.PHONY: fmt
fmt: fmt-cpp fmt-zig
@@ -900,8 +905,7 @@ check-glibc-version-dependency:
ifeq ($(OS_NAME),darwin)
zig-win32:
$(ZIG) build -Dtarget=x86_64-windows
# Hardened runtime will not work with debugging
bun-codesign-debug:
@@ -945,7 +949,6 @@ headers:
$(ZIG) translate-c src/bun.js/bindings/headers.h > src/bun.js/bindings/headers.zig
$(BUN_OR_NODE) misctools/headers-cleaner.js
$(ZIG) fmt src/bun.js/bindings/headers.zig
$(CLANG_FORMAT) -i src/bun.js/bindings/ZigGeneratedCode.cpp
.PHONY: jsc-bindings-headers
jsc-bindings-headers: headers
@@ -1114,6 +1117,9 @@ endif
dev-obj-linux:
$(ZIG) build obj -Dtarget=x86_64-linux-gnu -Dcpu="$(CPU_TARGET)"
.PHONY: dev
dev: mkdir-dev esm dev-obj link ## compile zig changes + link bun
mkdir-dev:
mkdir -p $(DEBUG_PACKAGE_DIR)
@@ -1197,12 +1203,10 @@ jsc-build-mac-compile:
-DPORT="JSCOnly" \
-DENABLE_STATIC_JSC=ON \
-DENABLE_SINGLE_THREADED_VM_ENTRY_SCOPE=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_THIN_ARCHIVES=OFF \
-DBUN_FAST_TLS=ON \
-DENABLE_FTL_JIT=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-G Ninja \
$(CMAKE_FLAGS_WITHOUT_RELEASE) \
-DPTHREAD_JIT_PERMISSIONS_API=1 \
@@ -1221,11 +1225,9 @@ jsc-build-mac-compile-lto:
-DPORT="JSCOnly" \
-DENABLE_STATIC_JSC=ON \
-DENABLE_SINGLE_THREADED_VM_ENTRY_SCOPE=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_THIN_ARCHIVES=OFF \
-DBUN_FAST_TLS=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DCMAKE_C_FLAGS="-flto=full" \
-DCMAKE_CXX_FLAGS="-flto=full" \
-DENABLE_FTL_JIT=ON \
@@ -1250,8 +1252,6 @@ jsc-build-mac-compile-debug:
-DUSE_THIN_ARCHIVES=OFF \
-DENABLE_FTL_JIT=ON \
-DCMAKE_EXPORT_COMPILE_COMMANDS=ON \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-G Ninja \
$(CMAKE_FLAGS_WITHOUT_RELEASE) \
-DPTHREAD_JIT_PERMISSIONS_API=1 \
@@ -1272,11 +1272,9 @@ jsc-build-linux-compile-config:
-DENABLE_STATIC_JSC=ON \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_THIN_ARCHIVES=OFF \
-DUSE_BUN_JSC_ADDITIONS=ON \
-DENABLE_FTL_JIT=ON \
-DENABLE_REMOTE_INSPECTOR=ON \
-DJSEXPORT_PRIVATE=WTF_EXPORT_DECLARATION \
-DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON \
-USE_VISIBILITY_ATTRIBUTE=1 \
-DCMAKE_EXPORT_COMPILE_COMMANDS=ON \
-G Ninja \
@@ -1291,7 +1289,7 @@ jsc-build-linux-compile-config:
jsc-build-linux-compile-build:
mkdir -p $(WEBKIT_RELEASE_DIR) && \
cd $(WEBKIT_RELEASE_DIR) && \
CFLAGS="$(CFLAGS) -Wl,--whole-archive -ffat-lto-objects" CXXFLAGS="$(CXXFLAGS) -Wl,--whole-archive -ffat-lto-objects -DUSE_BUN_JSC_ADDITIONS=ON" \
CFLAGS="$(CFLAGS) -Wl,--whole-archive -ffat-lto-objects" CXXFLAGS="$(CXXFLAGS) -Wl,--whole-archive -ffat-lto-objects" \
cmake --build $(WEBKIT_RELEASE_DIR) --config relwithdebuginfo --target jsc
@@ -1388,8 +1386,7 @@ mimalloc:
mimalloc-wasm:
rm -rf $(BUN_DEPS_DIR)/mimalloc/CMakeCache* $(BUN_DEPS_DIR)/mimalloc/CMakeFiles
cd $(BUN_DEPS_DIR)/mimalloc; emcmake cmake -DMI_BUILD_SHARED=OFF -DMI_BUILD_STATIC=ON -DMI_BUILD_TESTS=OFF -GNinja -DMI_BUILD_OBJECT=ON ${MIMALLOC_OVERRIDE_FLAG} -DMI_USE_CXX=OFF .; emmake cmake --build .;
cd $(BUN_DEPS_DIR)/mimalloc; emcmake cmake -DMI_BUILD_SHARED=OFF -DMI_BUILD_STATIC=ON -DMI_BUILD_TESTS=OFF -DMI_BUILD_OBJECT=ON ${MIMALLOC_OVERRIDE_FLAG} -DMI_USE_CXX=ON .; emmake make;
cp $(BUN_DEPS_DIR)/mimalloc/$(MIMALLOC_INPUT_PATH) $(BUN_DEPS_OUT_DIR)/$(MIMALLOC_FILE).wasm
# alias for link, incase anyone still types that
@@ -1482,7 +1479,7 @@ bun-relink: bun-relink-copy bun-link-lld-release bun-link-lld-release-dsym
bun-relink-fast: bun-relink-copy bun-link-lld-release-no-lto
wasm-return1:
$(ZIG) build-lib -OReleaseSmall test/bun.js/wasm-return-1-test.zig -femit-bin=test/bun.js/wasm-return-1-test.wasm -target wasm32-freestanding
zig build-lib -OReleaseSmall test/bun.js/wasm-return-1-test.zig -femit-bin=test/bun.js/wasm-return-1-test.wasm -target wasm32-freestanding
generate-classes:
bun src/bun.js/scripts/generate-classes.ts
@@ -1518,7 +1515,7 @@ $(OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1529,7 +1526,7 @@ $(OBJ_DIR)/%.o: src/bun.js/modules/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1540,7 +1537,7 @@ $(OBJ_DIR)/%.o: $(SRC_DIR)/webcore/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1551,7 +1548,7 @@ $(OBJ_DIR)/%.o: $(SRC_DIR)/sqlite/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1562,7 +1559,7 @@ $(OBJ_DIR)/%.o: src/io/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1573,7 +1570,7 @@ $(OBJ_DIR)/%.o: $(SRC_DIR)/node_os/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1584,7 +1581,7 @@ $(OBJ_DIR)/%.o: src/js/out/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1596,7 +1593,7 @@ $(OBJ_DIR)/%.o: src/bun.js/bindings/webcrypto/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM) \
-c -o $@ $<
@@ -1610,7 +1607,7 @@ $(DEBUG_OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
-DBUN_DEBUG \
$(EMIT_LLVM_FOR_DEBUG) \
-g3 -c -o $@ $<
@@ -1625,7 +1622,7 @@ $(DEBUG_OBJ_DIR)/%.o: $(SRC_DIR)/webcore/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM_FOR_DEBUG) \
-DBUN_DEBUG \
-g3 -c -o $@ $<
@@ -1638,7 +1635,7 @@ $(DEBUG_OBJ_DIR)/%.o: src/io/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
-DBUN_DEBUG \
$(EMIT_LLVM_FOR_DEBUG) \
-g3 -c -o $@ $<
@@ -1654,7 +1651,7 @@ $(DEBUG_OBJ_DIR)/%.o: $(SRC_DIR)/sqlite/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM_FOR_DEBUG) \
-DBUN_DEBUG \
-g3 -c -o $@ $<
@@ -1669,7 +1666,7 @@ $(DEBUG_OBJ_DIR)/%.o: $(SRC_DIR)/node_os/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM_FOR_DEBUG) \
-DBUN_DEBUG \
-g3 -c -o $@ $<
@@ -1684,7 +1681,7 @@ $(DEBUG_OBJ_DIR)/%.o: src/js/out/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM_FOR_DEBUG) \
-DBUN_DEBUG \
-g3 -c -o $@ $<
@@ -1697,7 +1694,7 @@ $(DEBUG_OBJ_DIR)/%.o: src/bun.js/modules/%.cpp
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM_FOR_DEBUG) \
-DBUN_DEBUG \
-g3 -c -o $@ $<
@@ -1712,7 +1709,7 @@ $(DEBUG_OBJ_DIR)/%.o: src/bun.js/bindings/webcrypto/%.cpp
-fno-exceptions \
-I$(SRC_DIR) \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(EMIT_LLVM_FOR_DEBUG) \
-DBUN_DEBUG \
-g3 -c -o $@ $<
@@ -1825,7 +1822,7 @@ endif
build-unit: # to build your unit tests
@rm -rf zig-out/bin/$(testname)
@mkdir -p zig-out/bin
$(ZIG) test $(realpath $(testpath)) \
zig test $(realpath $(testpath)) \
$(testfilterflag) \
$(PACKAGE_MAP) \
--main-pkg-path $(BUN_DIR) \
@@ -1843,7 +1840,7 @@ build-unit: # to build your unit tests
run-all-unit-tests: # to run your unit tests
@rm -rf zig-out/bin/__main_test
@mkdir -p zig-out/bin
$(ZIG) test src/main.zig \
zig test src/main.zig \
$(PACKAGE_MAP) \
--main-pkg-path $(BUN_DIR) \
--test-no-exec \
@@ -1880,7 +1877,7 @@ PACKAGE_MAP = --pkg-begin async_io $(BUN_DIR)/src/io/io_darwin.zig --pkg-begin b
.PHONY: base64
base64:
cd $(BUN_DEPS_DIR)/base64 && make clean && rm -rf CMakeCache.txt CMakeFiles && cmake $(CMAKE_FLAGS) . && make
cd $(BUN_DEPS_DIR)/base64 && make clean && cmake $(CMAKE_FLAGS) . && make
cp $(BUN_DEPS_DIR)/base64/libbase64.a $(BUN_DEPS_OUT_DIR)/libbase64.a
.PHONY: cold-jsc-start
@@ -1891,7 +1888,7 @@ cold-jsc-start:
${MMD_IF_LOCAL} \
-fno-exceptions \
-fno-rtti \
-ferror-limit=10 \
-ferror-limit=1000 \
$(LIBICONV_PATH) \
$(DEFAULT_LINKER_FLAGS) \
$(PLATFORM_LINKER_FLAGS) \
@@ -1907,49 +1904,26 @@ vendor-without-npm: node-fallbacks runtime_js fallback_decoder bun_error mimallo
vendor-without-check: npm-install vendor-without-npm
.PHONY: vendor
vendor: assert-deps submodule vendor-without-check
vendor: require submodule vendor-without-check
.PHONY: vendor-dev
vendor-dev: assert-deps submodule npm-install-dev vendor-without-npm
vendor-dev: require submodule npm-install-dev vendor-without-npm
.PHONY: bun
bun: vendor identifier-cache build-obj bun-link-lld-release bun-codesign-release-local
.PHONY: static-hash-table
static-hash-table:
bun src/js/_codegen/static-hash-tables.ts
.PHONY: cpp
cpp: ## compile src/js/builtins + all c++ code then link
@make clean-bindings js
@make static-hash-table
.PHONY: regenerate-bindings
regenerate-bindings: ## compile src/js/builtins + all c++ code, does not link
@make clean-bindings builtins
@make bindings -j$(CPU_COUNT)
@make link
.PHONY: cpp
cpp-no-link:
@make clean-bindings js
@make bindings -j$(CPU_COUNT)
.PHONY: zig
zig: ## compile zig code then link
@make mkdir-dev dev-obj link
.PHONY: zig-no-link
zig-no-link:
@make mkdir-dev dev-obj
.PHONY: dev
dev: # combo of `make cpp` and `make zig`
@make cpp-no-link zig-no-link -j2
@make link
.PHONY: setup
setup: vendor-dev identifier-cache clean-bindings
make jsc-check dev
make jsc-check
make bindings -j$(CPU_COUNT)
@echo ""
@echo "First build complete!"
@echo "\"bun-debug\" is available at $(DEBUG_BIN)/bun-debug"
@echo "Development environment setup complete"
@echo "Run \`make dev\` to build \`bun-debug\`"
@echo ""
.PHONY: help

View File

@@ -24,14 +24,14 @@
## What is Bun?
> **Bun is under active development.** Use it to speed up your development workflows or run simpler production code in resource-constrained environments like serverless functions. We're working on more complete Node.js compatibility and integration with existing frameworks. Join the [Discord](https://bun.sh/discord) and watch the [GitHub repository](https://github.com/oven-sh/bun) to keep tabs on future releases.
> **Bun is still under development.** Use it to speed up your development workflows or run simpler production code in resource-constrained environments like serverless functions. We're working on more complete Node.js compatibility and integration with existing frameworks. Join the [Discord](https://bun.sh/discord) and watch the [GitHub repository](https://github.com/oven-sh/bun) to keeps tabs on future releases.
Bun is an all-in-one toolkit for JavaScript and TypeScript apps. It ships as a single executable called `bun`.
At its core is the _Bun runtime_, a fast JavaScript runtime designed as a drop-in replacement for Node.js. It's written in Zig and powered by JavaScriptCore under the hood, dramatically reducing startup times and memory usage.
```bash
bun run index.tsx # TS and JSX supported out-of-the-box
bun run index.tsx # TS and JSX supported out of the box
```
The `bun` command-line tool also implements a test runner, script runner, and Node.js-compatible package manager. Instead of 1,000 node_modules for development, you only need `bun`. Bun's built-in tools are significantly faster than existing options and usable in existing Node.js projects with little to no changes.
@@ -93,8 +93,7 @@ bun upgrade --canary
- [`bun run`](https://bun.sh/docs/cli/run)
- [`bun install`](https://bun.sh/docs/cli/install)
- [`bun test`](https://bun.sh/docs/cli/test)
- [`bun init`](https://bun.sh/docs/templates#bun-init)
- [`bun create`](https://bun.sh/docs/templates#bun-create)
- [`bun create`](https://bun.sh/docs/cli/create)
- [`bunx`](https://bun.sh/docs/cli/bunx)
- Runtime
- [Runtime](https://bun.sh/docs/runtime/index)
@@ -124,6 +123,7 @@ bun upgrade --canary
- [HTMLRewriter](https://bun.sh/docs/api/html-rewriter)
- [Testing](https://bun.sh/docs/api/test)
- [Utils](https://bun.sh/docs/api/utils)
- [DNS](https://bun.sh/docs/api/dns)
- [Node-API](https://bun.sh/docs/api/node-api)
## Contributing

View File

@@ -1,12 +0,0 @@
# Security Policy
## Supported Versions
| Version | Supported |
| ------- | ------------------ |
| 1.x.x | :white_check_mark: |
## Reporting a Vulnerability
Report any discovered vulnerabilities to the Bun team by emailing `security@bun.sh`. Your report will be acknowledged within 5 days, and a team member will be assigned as the primary handler. To the greatest extent possible, the security team will endeavor to keep you informed of the progress being made towards a fix and full announcement, and may ask for additional information or guidance surrounding the reported issue.

View File

@@ -10,7 +10,7 @@ To run in Bun:
```bash
# so it doesn't run the vitest one
bun test expect-to-equal.test.js
bun wiptest expect-to-equal.test.js
```
To run in Jest:

View File

@@ -1,33 +1,11 @@
# `install` benchmark
Requires [`hyperfine`](https://github.com/sharkdp/hyperfine). The goal of this benchmark is to compare installation performance of Bun with other package managers _when caches are hot_.
Requires [`hyperfine`](https://github.com/sharkdp/hyperfine)
### With lockfile, online mode
To run the benchmark with the standard "install" command for each package manager:
```sh
```
$ hyperfine --prepare 'rm -rf node_modules' --warmup 1 --runs 3 'bun install' 'pnpm install' 'yarn' 'npm install'
```
### With lockfile, offline mode
Even though all packages are cached, some tools may hit the npm API during the version resolution step. (This is not the same as re-downloading a package.) To entirely avoid network calls, the other package managers require the `--prefer-offline`/`--offline` flags. To run the benchmark using "offline" mode:
```sh
$ hyperfine --prepare 'rm -rf node_modules' --runs 1 'bun install' 'pnpm install --prefer-offline' 'yarn --offline' 'npm install --prefer-offline'
```
### Without lockfile, offline mode
To run the benchmark with offline mode but without lockfiles:
```sh
$ hyperfine --prepare 'rm -rf node_modules' --warmup 1 'rm bun.lockb && bun install' 'rm pnpm-lock.yaml && pnpm install --prefer-offline' 'rm yarn.lock && yarn --offline' 'rm package-lock.json && npm install --prefer-offline'
```
##
To check that the app is working as expected:
```

View File

@@ -0,0 +1,14 @@
import { bench, run } from "./runner.mjs";
import { Buffer } from "node:buffer";
const bigBuffer = Buffer.from("hello world".repeat(10000));
const converted = bigBuffer.toString("base64");
bench("Buffer.toString('base64')", () => {
return bigBuffer.toString("base64");
});
// bench("Buffer.from(str, 'base64')", () => {
// return Buffer.from(converted, "base64");
// });
await run();

View File

@@ -1,29 +0,0 @@
import { bench, run } from "./runner.mjs";
import { Buffer } from "node:buffer";
import crypto from "node:crypto";
const bigBuffer = Buffer.from("hello world".repeat(10000));
const converted = bigBuffer.toString("base64");
const uuid = crypto.randomBytes(16);
bench(`Buffer(${bigBuffer.byteLength}).toString('base64')`, () => {
return bigBuffer.toString("base64");
});
bench(`Buffer(${uuid.byteLength}).toString('base64')`, () => {
return uuid.toString("base64");
});
bench(`Buffer(${bigBuffer.byteLength}).toString('hex')`, () => {
return bigBuffer.toString("hex");
});
bench(`Buffer(${uuid.byteLength}).toString('hex')`, () => {
return uuid.toString("hex");
});
bench(`Buffer(${bigBuffer.byteLength}).toString('ascii')`, () => {
return bigBuffer.toString("ascii");
});
await run();

View File

@@ -1,3 +0,0 @@
import { cp } from "fs/promises";
await cp(process.argv[2], process.argv[3], { recursive: true });

View File

@@ -1,31 +0,0 @@
import { mkdirSync, writeFileSync } from "fs";
import { bench, run } from "./runner.mjs";
import { cp } from "fs/promises";
import { join } from "path";
import { tmpdir } from "os";
const hugeDirectory = (() => {
const root = join(tmpdir(), "huge");
const base = join(root, "directory", "for", "benchmarks", "1", "2", "3", "4", "5", "6", "7", "8", "9", "10");
mkdirSync(base, {
recursive: true,
});
for (let i = 0; i < 1000; i++) {
writeFileSync(join(base, "file-" + i + ".txt"), "Hello, world! " + i);
}
return root;
})();
const hugeFilePath = join(tmpdir(), "huge-file-0.txt");
const hugeText = "Hello, world!".repeat(1000000);
writeFileSync(hugeFilePath, hugeText);
var hugeCopyI = 0;
bench("cp -r (1000 files)", async b => {
await cp(hugeDirectory, join(tmpdir(), "huge-copy" + hugeCopyI++), { recursive: true });
});
bench("cp 1 " + ((hugeText.length / 1024) | 0) + " KB file", async b => {
await cp(hugeFilePath, join(tmpdir(), "huge-file" + hugeCopyI++));
});
await run();

View File

@@ -1,100 +0,0 @@
import { bench, run } from "../node_modules/mitata/src/cli.mjs";
// This is a benchmark of the performance impact of using private properties.
bench("Polyfillprivate", () => {
"use strict";
var __classPrivateFieldGet =
(this && this.__classPrivateFieldGet) ||
function (receiver, state, kind, f) {
if (kind === "a" && !f) throw new TypeError("Private accessor was defined without a getter");
if (typeof state === "function" ? receiver !== state || !f : !state.has(receiver))
throw new TypeError("Cannot read private member from an object whose class did not declare it");
return kind === "m" ? f : kind === "a" ? f.call(receiver) : f ? f.value : state.get(receiver);
};
var __classPrivateFieldSet =
(this && this.__classPrivateFieldSet) ||
function (receiver, state, value, kind, f) {
if (kind === "m") throw new TypeError("Private method is not writable");
if (kind === "a" && !f) throw new TypeError("Private accessor was defined without a setter");
if (typeof state === "function" ? receiver !== state || !f : !state.has(receiver))
throw new TypeError("Cannot write private member to an object whose class did not declare it");
return kind === "a" ? f.call(receiver, value) : f ? (f.value = value) : state.set(receiver, value), value;
};
var _Foo_state, _Foo_inc;
class Foo {
constructor() {
_Foo_state.set(this, 1);
_Foo_inc.set(this, 13);
}
run() {
let n = 1000000;
while (n-- > 0) {
__classPrivateFieldSet(
this,
_Foo_state,
__classPrivateFieldGet(this, _Foo_state, "f") + __classPrivateFieldGet(this, _Foo_inc, "f"),
"f",
);
}
return n;
}
}
(_Foo_state = new WeakMap()), (_Foo_inc = new WeakMap());
new Foo().run();
});
bench("NativePrivates", () => {
class Foo {
#state = 1;
#inc = 13;
run() {
let n = 1000000;
while (n-- > 0) {
this.#state += this.#inc;
}
return n;
}
}
new Foo().run();
});
bench("ConventionalPrivates", () => {
class Foo {
_state = 1;
_inc = 13;
run() {
let n = 1000000;
while (n-- > 0) {
this._state += this._inc;
}
return n;
}
}
new Foo().run();
});
const _state = Symbol("state");
const _inc = Symbol("inc");
bench("SymbolPrivates", () => {
class Foo {
[_state] = 1;
[_inc] = 13;
run() {
let n = 1000000;
while (n-- > 0) {
this[_state] += this[_inc];
}
return n;
}
}
new Foo().run();
});
await run();

View File

@@ -1,37 +0,0 @@
import { bench, run } from "./runner.mjs";
const blob = new Blob(["<p id='foo'>Hello</p>"]);
bench("prepend", async () => {
await new HTMLRewriter()
.on("p", {
element(element) {
element.prepend("Hello");
},
})
.transform(new Response(blob))
.text();
});
bench("append", async () => {
await new HTMLRewriter()
.on("p", {
element(element) {
element.append("Hello");
},
})
.transform(new Response(blob))
.text();
});
bench("getAttribute", async () => {
await new HTMLRewriter()
.on("p", {
element(element) {
element.getAttribute("id");
},
})
.transform(new Response(blob))
.text();
});
await run();

View File

@@ -1,22 +0,0 @@
import { tmpdir } from "node:os";
import { promises, existsSync, mkdirSync } from "node:fs";
const count = 1024 * 12;
var queue = new Array(count);
var paths = new Array(count);
for (let i = 0; i < count; i++) {
const path = `${tmpdir()}/${Date.now()}.rm.dir${i}`;
try {
mkdirSync(path);
} catch (e) {}
paths[i] = path;
queue[i] = promises.rmdir(path);
}
await Promise.all(queue);
for (let i = 0; i < count; i++) {
if (existsSync(paths[i])) {
throw new Error(`Path ${paths[i]} was not removed`);
}
}

View File

@@ -9,8 +9,9 @@ bench("writeFile(/tmp/foo.txt, short string)", async () => {
await writeFile("/tmp/foo.txt", "short string", "utf8");
});
const buffer = Buffer.from("short string");
bench("writeFile(/tmp/foo.txt, Buffer.from(short string))", async () => {
await writeFile("/tmp/foo.txt", Buffer.from("short string"));
await writeFile("/tmp/foo.txt", buffer);
});
const fd = openSync("/tmp/foo.txt", "w");
@@ -21,7 +22,7 @@ bench("write(fd, short string)", () => {
});
bench("write(fd, Uint8Array(short string))", () => {
const bytesWritten = write(fd, Buffer.from("short string"));
const bytesWritten = write(fd, buffer);
if (bytesWritten !== 12) throw new Error("wrote !== 12");
});

119
build.zig
View File

@@ -1,5 +1,4 @@
const std = @import("std");
const pathRel = std.fs.path.relative;
const Wyhash = @import("./src/wyhash.zig").Wyhash;
var is_debug_build = false;
fn moduleSource(comptime out: []const u8) FileSource {
@@ -34,10 +33,6 @@ fn addInternalPackages(b: *Build, step: *CompileStep, _: std.mem.Allocator, _: [
break :brk b.createModule(.{
.source_file = FileSource.relative("src/io/io_linux.zig"),
});
} else if (target.isWindows()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/io/io_windows.zig"),
});
}
break :brk b.createModule(.{
@@ -101,7 +96,6 @@ const BunBuildOptions = struct {
}
};
// relative to the prefix
var output_dir: []const u8 = "";
fn panicIfNotFound(comptime filepath: []const u8) []const u8 {
var file = std.fs.cwd().openFile(filepath, .{ .optimize = .read_only }) catch |err| {
@@ -136,16 +130,6 @@ const Module = std.build.Module;
const fs = std.fs;
pub fn build(b: *Build) !void {
build_(b) catch |err| {
if (@errorReturnTrace()) |trace| {
std.debug.dumpStackTrace(trace.*);
}
return err;
};
}
pub fn build_(b: *Build) !void {
// Standard target options allows the person running `zig build` to choose
// what target to build for. Here we do not override the defaults, which
// means any target is allowed, and the default is native. Other options
@@ -188,25 +172,26 @@ pub fn build_(b: *Build) !void {
var triplet = triplet_buf[0 .. osname.len + cpuArchName.len + 1];
if (b.option([]const u8, "output-dir", "target to install to") orelse std.os.getenv("OUTPUT_DIR")) |output_dir_| {
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_);
output_dir = b.pathFromRoot(output_dir_);
} else {
const output_dir_base = try std.fmt.bufPrint(&output_dir_buf, "{s}{s}", .{ bin_label, triplet });
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_base);
output_dir = b.pathFromRoot(output_dir_base);
}
std.fs.cwd().makePath(output_dir) catch {};
is_debug_build = optimize == OptimizeMode.Debug;
const bun_executable_name = if (optimize == std.builtin.OptimizeMode.Debug) "bun-debug" else "bun";
const root_src = if (target.getOsTag() == std.Target.Os.Tag.freestanding)
"root_wasm.zig"
"src/main_wasm.zig"
else
"root.zig";
const min_version: std.SemanticVersion = if (!(target.isWindows() or target.getOsTag() == .freestanding))
const min_version: std.SemanticVersion = if (target.getOsTag() != .freestanding)
target.getOsVersionMin().semver
else
.{ .major = 0, .minor = 0, .patch = 0 };
const max_version: std.SemanticVersion = if (!(target.isWindows() or target.getOsTag() == .freestanding))
const max_version: std.SemanticVersion = if (target.getOsTag() != .freestanding)
target.getOsVersionMax().semver
else
.{ .major = 0, .minor = 0, .patch = 0 };
@@ -217,11 +202,8 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative(root_src),
.target = target,
.optimize = optimize,
.main_pkg_path = .{ .cwd_relative = b.pathFromRoot(".") },
});
b.reference_trace = 16;
var default_build_options: BunBuildOptions = brk: {
const is_baseline = arch.isX86() and (target.cpu_model == .baseline or
!std.Target.x86.featureSetHas(target.getCpuFeatures(), .avx2));
@@ -236,6 +218,7 @@ pub fn build_(b: *Build) !void {
.argv = &.{
"git",
"rev-parse",
"--short",
"HEAD",
},
.cwd = b.pathFromRoot("."),
@@ -257,6 +240,8 @@ pub fn build_(b: *Build) !void {
};
{
obj.setMainPkgPath(b.pathFromRoot("."));
try addInternalPackages(
b,
obj,
@@ -287,15 +272,9 @@ pub fn build_(b: *Build) !void {
std.io.getStdErr().writer().print("Output: {s}/{s}\n\n", .{ output_dir, bun_executable_name }) catch unreachable;
defer obj_step.dependOn(&obj.step);
var install = b.addInstallFileWithDir(
obj.getEmittedBin(),
.{ .custom = output_dir },
b.fmt("{s}.o", .{bun_executable_name}),
);
install.step.dependOn(&obj.step);
obj_step.dependOn(&install.step);
obj.emit_bin = .{
.emit_to = b.fmt("{s}/{s}.o", .{ output_dir, bun_executable_name }),
};
var actual_build_options = default_build_options;
if (b.option(bool, "generate-sizes", "Generate sizes of things") orelse false) {
actual_build_options.sizegen = true;
@@ -312,8 +291,7 @@ pub fn build_(b: *Build) !void {
if (target.getCpuArch().isX86()) obj.disable_stack_probing = true;
if (b.option(bool, "for-editor", "Do not emit bin, just check for errors") orelse false) {
// obj.emit_bin = .no_emit;
obj.generated_bin = null;
obj.emit_bin = .no_emit;
}
if (target.getOsTag() == .linux) {
@@ -331,10 +309,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bindgen.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
var headers_build_options = default_build_options;
headers_build_options.bindgen = true;
headers_obj.addOptions("build_options", default_build_options.step(b));
@@ -342,22 +319,19 @@ pub fn build_(b: *Build) !void {
}
{
const wasm_step = b.step("bun-wasm", "Build WASM");
var wasm = b.addStaticLibrary(.{
const wasm = b.step("bun-wasm", "Build WASM");
var wasm_step = b.addStaticLibrary(.{
.name = "bun-wasm",
.root_source_file = FileSource.relative("root_wasm.zig"),
.root_source_file = FileSource.relative("src/main_wasm.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer wasm_step.dependOn(&wasm.step);
wasm.strip = false;
defer wasm.dependOn(&wasm_step.step);
wasm_step.strip = false;
// wasm_step.link_function_sections = true;
// wasm_step.link_emit_relocs = true;
// wasm_step.single_threaded = true;
try configureObjectStep(b, wasm, wasm_step, @TypeOf(target), target);
var build_opts = default_build_options;
wasm.addOptions("build_options", build_opts.step(b));
try configureObjectStep(b, wasm_step, @TypeOf(target), target, obj.main_pkg_path.?);
}
{
@@ -367,10 +341,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/http_bench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -381,10 +354,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/machbench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -395,10 +367,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/fetch.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -409,10 +380,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bench/string-handling.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -423,10 +393,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sha.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -437,10 +406,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sourcemap/vlq_bench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -451,10 +419,9 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/tgz.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_obj.addOptions("build_options", default_build_options.step(b));
}
@@ -468,23 +435,16 @@ pub fn build_(b: *Build) !void {
var headers_obj: *CompileStep = b.addTest(.{
.root_source_file = FileSource.relative(test_file orelse "src/main.zig"),
.target = target,
.main_pkg_path = obj.main_pkg_path,
});
headers_obj.filter = test_filter;
if (test_bin_) |test_bin| {
headers_obj.name = std.fs.path.basename(test_bin);
if (std.fs.path.dirname(test_bin)) |dir| {
var install = b.addInstallFileWithDir(
headers_obj.getEmittedBin(),
.{ .custom = try std.fs.path.relative(b.allocator, output_dir, dir) },
headers_obj.name,
);
install.step.dependOn(&headers_obj.step);
headers_step.dependOn(&install.step);
}
if (std.fs.path.dirname(test_bin)) |dir| headers_obj.emit_bin = .{
.emit_to = b.fmt("{s}/{s}", .{ dir, headers_obj.name }),
};
}
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
try configureObjectStep(b, headers_obj, @TypeOf(target), target, obj.main_pkg_path.?);
headers_step.dependOn(&headers_obj.step);
headers_obj.addOptions("build_options", default_build_options.step(b));
@@ -495,7 +455,9 @@ pub fn build_(b: *Build) !void {
pub var original_make_fn: ?*const fn (step: *std.build.Step) anyerror!void = null;
pub fn configureObjectStep(b: *std.build.Builder, obj: *CompileStep, obj_step: *std.build.Step, comptime Target: type, target: Target) !void {
pub fn configureObjectStep(b: *std.build.Builder, obj: *CompileStep, comptime Target: type, target: Target, main_pkg_path: []const u8) !void {
obj.setMainPkgPath(main_pkg_path);
// obj.setTarget(target);
try addInternalPackages(b, obj, std.heap.page_allocator, b.zig_exe, target);
@@ -503,16 +465,11 @@ pub fn configureObjectStep(b: *std.build.Builder, obj: *CompileStep, obj_step: *
// obj.setBuildMode(optimize);
obj.bundle_compiler_rt = false;
if (obj.emit_directory == null) {
var install = b.addInstallFileWithDir(
obj.getEmittedBin(),
.{ .custom = output_dir },
b.fmt("{s}.o", .{obj.name}),
);
if (obj.emit_bin == .default)
obj.emit_bin = .{
.emit_to = b.fmt("{s}/{s}.o", .{ output_dir, obj.name }),
};
install.step.dependOn(&obj.step);
obj_step.dependOn(&install.step);
}
if (target.getOsTag() != .freestanding) obj.linkLibC();
if (target.getOsTag() != .freestanding) obj.bundle_compiler_rt = false;

BIN
bun.lockb

Binary file not shown.

View File

@@ -1,8 +1,8 @@
[test]
# Large monorepos (like Bun) may want to specify the test directory more specifically
# By default, `bun test` scans every single folder recursively which, if you
# have a gigantic submodule (like WebKit), requires lots of directory
# By default, `bun test` scans every single folder recurisvely which, if you
# have a gigantic submodule (like WebKit), it has to do lots of directory
# traversals
#
# Instead, we can only scan the test directory for Bun's runtime tests
# Instead, we can just make it scan only the test directory for Bun's runtime tests
root = "test"

View File

@@ -92,12 +92,12 @@ _bun_completions() {
PACKAGE_OPTIONS[REMOVE_OPTIONS_LONG]="";
PACKAGE_OPTIONS[REMOVE_OPTIONS_SHORT]="";
PACKAGE_OPTIONS[SHARED_OPTIONS_LONG]="--config --yarn --production --frozen-lockfile --no-save --dry-run --force --cache-dir --no-cache --silent --verbose --global --cwd --backend --link-native-bins --help";
PACKAGE_OPTIONS[SHARED_OPTIONS_LONG]="--config --yarn --production --frozen-lockfile --no-save --dry-run --lockfile --force --cache-dir --no-cache --silent --verbose --global --cwd --backend --link-native-bins --help";
PACKAGE_OPTIONS[SHARED_OPTIONS_SHORT]="-c -y -p -f -g";
PM_OPTIONS[LONG_OPTIONS]="--config --yarn --production --frozen-lockfile --no-save --dry-run --force --cache-dir --no-cache --silent --verbose --no-progress --no-summary --no-verify --ignore-scripts --global --cwd --backend --link-native-bins --help"
PM_OPTIONS[LONG_OPTIONS]="--config --yarn --production --frozen-lockfile --no-save --dry-run --lockfile --force --cache-dir --no-cache --silent --verbose --no-progress --no-summary --no-verify --ignore-scripts --global --cwd --backend --link-native-bins --help"
PM_OPTIONS[SHORT_OPTIONS]="-c -y -p -f -g"
local cur_word="${COMP_WORDS[${COMP_CWORD}]}";
local prev="${COMP_WORDS[$(( COMP_CWORD - 1 ))]}";

File diff suppressed because it is too large

View File

@@ -119,6 +119,9 @@ subcommands:
- no-save --
- dry-run -- "Don't install anything"
- force -- "Always request the latest versions from the registry & reinstall all dependenices"
- name: lockfile
type: string
summary: "Store & load a lockfile at a specific filepath"
- name: cache-dir
type: string
summary: "Store & load cached data from a specific directory path"
@@ -157,6 +160,9 @@ subcommands:
- no-cache -- "Ignore manifest cache entirely"
- silent -- "Don't output anything"
- verbose -- "Excessively verbose logging"
- name: lockfile
type: string
summary: "Store & load a lockfile at a specific filepath"
- name: cache-dir
type: string
summary: "Store & load cached data from a specific directory path"
@@ -192,6 +198,9 @@ subcommands:
- no-save --
- dry-run -- "Don't install anything"
- force -- "Always request the latest versions from the registry & reinstall all dependenices"
- name: lockfile
type: string
summary: "Store & load a lockfile at a specific filepath"
- name: cache-dir
type: string
summary: "Store & load cached data from a specific directory path"

View File

@@ -0,0 +1,29 @@
# bun:alpine
# Not officially supported (yet)
ARG GLIBC_RELEASE=2.35-r0
FROM alpine:latest AS build
WORKDIR /tmp
RUN apk --no-cache add unzip
ARG GLIBC_RELEASE
RUN wget https://alpine-pkgs.sgerrand.com/sgerrand.rsa.pub && \
wget https://github.com/sgerrand/alpine-pkg-glibc/releases/download/${GLIBC_RELEASE}/glibc-${GLIBC_RELEASE}.apk
ADD https://github.com/oven-sh/bun/releases/latest/download/bun-linux-x64.zip bun-linux-x64.zip
RUN unzip bun-linux-x64.zip
FROM alpine:latest
ARG GLIBC_RELEASE
COPY --from=build /tmp/sgerrand.rsa.pub /etc/apk/keys
COPY --from=build /tmp/glibc-${GLIBC_RELEASE}.apk /tmp
COPY --from=build /tmp/bun-linux-x64/bun /usr/local/bin
RUN apk --no-cache --force-overwrite add /tmp/glibc-${GLIBC_RELEASE}.apk \
&& rm /etc/apk/keys/sgerrand.rsa.pub \
&& rm /tmp/glibc-${GLIBC_RELEASE}.apk
RUN bun --version

View File

@@ -0,0 +1,77 @@
FROM debian:bullseye-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
RUN apt-get update -qq \
&& apt-get install -qq --no-install-recommends \
ca-certificates \
curl \
dirmngr \
gpg \
gpg-agent \
unzip \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& arch="$(dpkg --print-architecture)" \
&& case "${arch##*-}" in \
amd64) build="x64-baseline";; \
arm64) build="aarch64";; \
*) echo "error: unsupported architecture: ($arch)"; exit 1 ;; \
esac \
&& version="$BUN_VERSION" \
&& case "$version" in \
latest | canary | bun-v*) tag="$version"; ;; \
v*) tag="bun-$version"; ;; \
*) tag="bun-v$version"; ;; \
esac \
&& case "$tag" in \
latest) release="latest/download"; ;; \
*) release="download/$tag"; ;; \
esac \
&& curl "https://github.com/oven-sh/bun/releases/$release/bun-linux-$build.zip" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: unknown release: ($tag)" && exit 1) \
&& for key in \
"F3DCC08A8572C0749B3E18888EAB4D40A7B22B59" \
; do \
gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$key" \
|| gpg --batch --keyserver keyserver.ubuntu.com --recv-keys "$key" ; \
done \
&& gpg --update-trustdb \
&& curl "https://github.com/oven-sh/bun/releases/$release/SHASUMS256.txt.asc" \
-fsSLO \
--compressed \
--retry 5 \
&& gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& grep " bun-linux-$build.zip\$" SHASUMS256.txt | sha256sum -c - \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& unzip "bun-linux-$build.zip" \
&& mv "bun-linux-$build/bun" /usr/local/bin/bun \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
FROM debian:bullseye-slim
RUN groupadd bun \
--gid 1000 \
&& useradd bun \
--uid 1000 \
--gid bun \
--shell /bin/sh \
--create-home
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin
COPY --from=build /usr/local/bin/bunx /usr/local/bin
WORKDIR /home/bun/app
ENTRYPOINT ["/usr/local/bin/docker-entrypoint.sh"]
CMD ["/usr/local/bin/bun"]

View File

@@ -0,0 +1,68 @@
FROM debian:bullseye-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
RUN apt-get update -qq \
&& apt-get install -qq --no-install-recommends \
ca-certificates \
curl \
dirmngr \
gpg \
gpg-agent \
unzip \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& arch="$(dpkg --print-architecture)" \
&& case "${arch##*-}" in \
amd64) build="x64-baseline";; \
arm64) build="aarch64";; \
*) echo "error: unsupported architecture: ($arch)"; exit 1 ;; \
esac \
&& version="$BUN_VERSION" \
&& case "$version" in \
latest | canary | bun-v*) tag="$version"; ;; \
v*) tag="bun-$version"; ;; \
*) tag="bun-v$version"; ;; \
esac \
&& case "$tag" in \
latest) release="latest/download"; ;; \
*) release="download/$tag"; ;; \
esac \
&& curl "https://github.com/oven-sh/bun/releases/$release/bun-linux-$build.zip" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: unknown release: ($tag)" && exit 1) \
&& for key in \
"F3DCC08A8572C0749B3E18888EAB4D40A7B22B59" \
; do \
gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$key" \
|| gpg --batch --keyserver keyserver.ubuntu.com --recv-keys "$key" ; \
done \
&& gpg --update-trustdb \
&& curl "https://github.com/oven-sh/bun/releases/$release/SHASUMS256.txt.asc" \
-fsSLO \
--compressed \
--retry 5 \
&& gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& grep " bun-linux-$build.zip\$" SHASUMS256.txt | sha256sum -c - \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& unzip "bun-linux-$build.zip" \
&& mv "bun-linux-$build/bun" /usr/local/bin/bun \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
FROM gcr.io/distroless/base-nossl-debian11
COPY --from=build /usr/local/bin/bun /usr/local/bin
COPY --from=build /usr/local/bin/bunx /usr/local/bin
WORKDIR /app
ENTRYPOINT ["/usr/local/bin/bun"]
CMD ["/usr/local/bin/bun"]

View File

@@ -1,113 +0,0 @@
FROM alpine:3.18 AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
# TODO: Instead of downloading glibc from a third-party source, we should
# build it from source. This is a temporary solution.
# See: https://github.com/sgerrand/alpine-pkg-glibc
# https://github.com/sgerrand/alpine-pkg-glibc/releases
# https://github.com/sgerrand/alpine-pkg-glibc/issues/176
ARG GLIBC_VERSION=2.34-r0
# https://github.com/oven-sh/bun/issues/5545#issuecomment-1722461083
ARG GLIBC_VERSION_AARCH64=2.26-r1
RUN apk --no-cache add \
ca-certificates \
curl \
dirmngr \
gpg \
gpg-agent \
unzip \
&& arch="$(apk --print-arch)" \
&& case "${arch##*-}" in \
x86_64) build="x64-baseline";; \
aarch64) build="aarch64";; \
*) echo "error: unsupported architecture: $arch"; exit 1 ;; \
esac \
&& version="$BUN_VERSION" \
&& case "$version" in \
latest | canary | bun-v*) tag="$version"; ;; \
v*) tag="bun-$version"; ;; \
*) tag="bun-v$version"; ;; \
esac \
&& case "$tag" in \
latest) release="latest/download"; ;; \
*) release="download/$tag"; ;; \
esac \
&& curl "https://github.com/oven-sh/bun/releases/$release/bun-linux-$build.zip" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: $tag" && exit 1) \
&& for key in \
"F3DCC08A8572C0749B3E18888EAB4D40A7B22B59" \
; do \
gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$key" \
|| gpg --batch --keyserver keyserver.ubuntu.com --recv-keys "$key" ; \
done \
&& curl "https://github.com/oven-sh/bun/releases/$release/SHASUMS256.txt.asc" \
-fsSLO \
--compressed \
--retry 5 \
&& gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc \
|| (echo "error: failed to verify: $tag" && exit 1) \
&& grep " bun-linux-$build.zip\$" SHASUMS256.txt | sha256sum -c - \
|| (echo "error: failed to verify: $tag" && exit 1) \
&& unzip "bun-linux-$build.zip" \
&& mv "bun-linux-$build/bun" /usr/local/bin/bun \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun \
&& cd /tmp \
&& case "${arch##*-}" in \
x86_64) curl "https://github.com/sgerrand/alpine-pkg-glibc/releases/download/${GLIBC_VERSION}/glibc-${GLIBC_VERSION}.apk" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: glibc v${GLIBC_VERSION}" && exit 1) \
&& mv "glibc-${GLIBC_VERSION}.apk" glibc.apk \
&& curl "https://github.com/sgerrand/alpine-pkg-glibc/releases/download/${GLIBC_VERSION}/glibc-bin-${GLIBC_VERSION}.apk" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: glibc-bin v${GLIBC_VERSION}" && exit 1) \
&& mv "glibc-bin-${GLIBC_VERSION}.apk" glibc-bin.apk ;; \
aarch64) curl "https://raw.githubusercontent.com/squishyu/alpine-pkg-glibc-aarch64-bin/master/glibc-${GLIBC_VERSION_AARCH64}.apk" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: glibc v${GLIBC_VERSION_AARCH64}" && exit 1) \
&& mv "glibc-${GLIBC_VERSION_AARCH64}.apk" glibc.apk \
&& curl "https://raw.githubusercontent.com/squishyu/alpine-pkg-glibc-aarch64-bin/master/glibc-bin-${GLIBC_VERSION_AARCH64}.apk" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: glibc-bin v${GLIBC_VERSION_AARCH64}" && exit 1) \
&& mv "glibc-bin-${GLIBC_VERSION_AARCH64}.apk" glibc-bin.apk ;; \
*) echo "error: unsupported architecture '$arch'"; exit 1 ;; \
esac
FROM alpine:3.18
COPY --from=build /tmp/glibc.apk /tmp/
COPY --from=build /tmp/glibc-bin.apk /tmp/
COPY --from=build /usr/local/bin/bun /usr/local/bin/
COPY docker-entrypoint.sh /usr/local/bin/
RUN addgroup -g 1000 bun \
&& adduser -u 1000 -G bun -s /bin/sh -D bun \
&& apk --no-cache --force-overwrite --allow-untrusted add \
/tmp/glibc.apk \
/tmp/glibc-bin.apk \
&& rm /tmp/glibc.apk \
&& rm /tmp/glibc-bin.apk \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
WORKDIR /home/bun/app
ENTRYPOINT ["/usr/local/bin/docker-entrypoint.sh"]
CMD ["/usr/local/bin/bun"]

View File

@@ -1,7 +1,7 @@
FROM debian:bullseye-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
ARG BUN_VERSION=0.5.7
RUN apt-get update -qq \
&& apt-get install -qq --no-install-recommends \
@@ -17,7 +17,7 @@ RUN apt-get update -qq \
&& case "${arch##*-}" in \
amd64) build="x64-baseline";; \
arm64) build="aarch64";; \
*) echo "error: unsupported architecture: $arch"; exit 1 ;; \
*) echo "error: unsupported architecture: ($arch)"; exit 1 ;; \
esac \
&& version="$BUN_VERSION" \
&& case "$version" in \
@@ -33,44 +33,44 @@ RUN apt-get update -qq \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: $tag" && exit 1) \
|| (echo "error: unknown release: ($tag)" && exit 1) \
&& for key in \
"F3DCC08A8572C0749B3E18888EAB4D40A7B22B59" \
; do \
gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$key" \
|| gpg --batch --keyserver keyserver.ubuntu.com --recv-keys "$key" ; \
done \
&& gpg --update-trustdb \
&& curl "https://github.com/oven-sh/bun/releases/$release/SHASUMS256.txt.asc" \
-fsSLO \
--compressed \
--retry 5 \
&& gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc \
|| (echo "error: failed to verify: $tag" && exit 1) \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& grep " bun-linux-$build.zip\$" SHASUMS256.txt | sha256sum -c - \
|| (echo "error: failed to verify: $tag" && exit 1) \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& unzip "bun-linux-$build.zip" \
&& mv "bun-linux-$build/bun" /usr/local/bin/bun \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
FROM debian:bullseye-slim
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin/bun
RUN groupadd bun \
--gid 1000 \
&& useradd bun \
--uid 1000 \
--gid bun \
--shell /bin/sh \
--create-home \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
--create-home
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin
COPY --from=build /usr/local/bin/bunx /usr/local/bin
WORKDIR /home/bun/app
ENTRYPOINT ["/usr/local/bin/docker-entrypoint.sh"]

View File

@@ -1,7 +1,7 @@
FROM debian:bullseye-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
ARG BUN_VERSION=0.5.7
RUN apt-get update -qq \
&& apt-get install -qq --no-install-recommends \
@@ -17,7 +17,7 @@ RUN apt-get update -qq \
&& case "${arch##*-}" in \
amd64) build="x64-baseline";; \
arm64) build="aarch64";; \
*) echo "error: unsupported architecture: $arch"; exit 1 ;; \
*) echo "error: unsupported architecture: ($arch)"; exit 1 ;; \
esac \
&& version="$BUN_VERSION" \
&& case "$version" in \
@@ -33,42 +33,44 @@ RUN apt-get update -qq \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: $tag" && exit 1) \
|| (echo "error: unknown release: ($tag)" && exit 1) \
&& for key in \
"F3DCC08A8572C0749B3E18888EAB4D40A7B22B59" \
; do \
gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$key" \
|| gpg --batch --keyserver keyserver.ubuntu.com --recv-keys "$key" ; \
done \
&& gpg --update-trustdb \
&& curl "https://github.com/oven-sh/bun/releases/$release/SHASUMS256.txt.asc" \
-fsSLO \
--compressed \
--retry 5 \
&& gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc \
|| (echo "error: failed to verify: $tag" && exit 1) \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& grep " bun-linux-$build.zip\$" SHASUMS256.txt | sha256sum -c - \
|| (echo "error: failed to verify: $tag" && exit 1) \
|| (echo "error: failed to verify release: ($tag)" && exit 1) \
&& unzip "bun-linux-$build.zip" \
&& mv "bun-linux-$build/bun" /usr/local/bin/bun \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun
&& chmod +x /usr/local/bin/bun \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
FROM debian:bullseye
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin/bun
RUN groupadd bun \
--gid 1000 \
&& useradd bun \
--uid 1000 \
--gid bun \
--shell /bin/sh \
--create-home \
&& ln -s /usr/local/bin/bun /usr/local/bin/bunx \
&& which bun \
&& which bunx \
&& bun --version
--create-home
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin
COPY --from=build /usr/local/bin/bunx /usr/local/bin
WORKDIR /home/bun/app
ENTRYPOINT ["/usr/local/bin/docker-entrypoint.sh"]

View File

@@ -1,69 +0,0 @@
FROM debian:bullseye-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
RUN apt-get update -qq \
&& apt-get install -qq --no-install-recommends \
ca-certificates \
curl \
dirmngr \
gpg \
gpg-agent \
unzip \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& arch="$(dpkg --print-architecture)" \
&& case "${arch##*-}" in \
amd64) build="x64-baseline";; \
arm64) build="aarch64";; \
*) echo "error: unsupported architecture: $arch"; exit 1 ;; \
esac \
&& version="$BUN_VERSION" \
&& case "$version" in \
latest | canary | bun-v*) tag="$version"; ;; \
v*) tag="bun-$version"; ;; \
*) tag="bun-v$version"; ;; \
esac \
&& case "$tag" in \
latest) release="latest/download"; ;; \
*) release="download/$tag"; ;; \
esac \
&& curl "https://github.com/oven-sh/bun/releases/$release/bun-linux-$build.zip" \
-fsSLO \
--compressed \
--retry 5 \
|| (echo "error: failed to download: $tag" && exit 1) \
&& for key in \
"F3DCC08A8572C0749B3E18888EAB4D40A7B22B59" \
; do \
gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$key" \
|| gpg --batch --keyserver keyserver.ubuntu.com --recv-keys "$key" ; \
done \
&& curl "https://github.com/oven-sh/bun/releases/$release/SHASUMS256.txt.asc" \
-fsSLO \
--compressed \
--retry 5 \
&& gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc \
|| (echo "error: failed to verify: $tag" && exit 1) \
&& grep " bun-linux-$build.zip\$" SHASUMS256.txt | sha256sum -c - \
|| (echo "error: failed to verify: $tag" && exit 1) \
&& unzip "bun-linux-$build.zip" \
&& mv "bun-linux-$build/bun" /usr/local/bin/bun \
&& rm -f "bun-linux-$build.zip" SHASUMS256.txt.asc SHASUMS256.txt \
&& chmod +x /usr/local/bin/bun \
&& which bun \
&& bun --version
FROM gcr.io/distroless/base-nossl-debian11
COPY --from=build /usr/local/bin/bun /usr/local/bin/
# Temporarily use the `build`-stage image binaries to create a symlink:
RUN --mount=type=bind,from=build,source=/usr/bin,target=/usr/bin \
--mount=type=bind,from=build,source=/bin,target=/bin <<EOF
ln -s /usr/local/bin/bun /usr/local/bin/bunx
which bunx
EOF
ENTRYPOINT ["/usr/local/bin/bun"]

View File

@@ -74,7 +74,7 @@ dv.getUint8(0); // => 3
// [0x11, 0x0, 0x0, 0x0]
```
Now let's write a `Uint16` at byte offset `1`. This requires two bytes. We're using the value `513`, which is `2 * 256 + 1`; in bytes, that's `00000010 00000001`.
Now lets write a `Uint16` at byte offset `1`. This requires two bytes. We're using the value `513`, which is `2 * 256 + 1`; in bytes, that's `00000010 00000001`.
```ts
dv.setUint16(1, 513);
@@ -90,7 +90,7 @@ console.log(dv.getUint8(1)); // => 2
console.log(dv.getUint8(2)); // => 1
```
Attempting to write a value that requires more space than is available in the underlying `ArrayBuffer` will cause an error. Below we attempt to write a `Float64` (which requires 8 bytes) at byte offset `0`, but there are only four total bytes in the buffer.
Attempting to write a value that requires more space than is available in the underlying `ArrayBuffer` will cuase an error. Below we attempt to write a `Float64` (which requires 8 bytes) at byte offset `0`, but there are only four total bytes in the buffer.
```ts
dv.setFloat64(0, 3.1415);
@@ -412,7 +412,7 @@ For complete documentation, refer to the [Node.js documentation](https://nodejs.
`Blob` is a Web API commonly used for representing files. `Blob` was initially implemented in browsers (unlike `ArrayBuffer` which is part of JavaScript itself), but it is now supported in Node and Bun.
It isn't common to directly create `Blob` instances. More often, you'll receive instances of `Blob` from an external source (like an `<input type="file">` element in the browser) or library. That said, it is possible to create a `Blob` from one or more string or binary "blob parts".
It isn't common to directly create `Blob` instances. More often, you'll recieve instances of `Blob` from an external source (like an `<input type="file">` element in the browser) or library. That said, it is possible to create a `Blob` from one or more string or binary "blob parts".
```ts
const blob = new Blob(["<html>Hello</html>"], {
@@ -507,7 +507,7 @@ for await (const chunk of stream) {
}
```
For a more complete discussion of streams in Bun, see [API > Streams](/docs/api/streams).
For a more complete discusson of streams in Bun, see [API > Streams](/docs/api/streams).
## Conversion

View File

@@ -229,11 +229,7 @@ const lib = linkSymbols({
},
});
const [major, minor, patch] = [
lib.symbols.getMajor(),
lib.symbols.getMinor(),
lib.symbols.getPatch(),
];
const [major, minor, patch] = [lib.symbols.getMajor(), lib.symbols.getMinor(), lib.symbols.getPatch()];
```
## Callbacks
@@ -253,13 +249,10 @@ const {
},
});
const searchIterator = new JSCallback(
(ptr, length) => /hello/.test(new CString(ptr, length)),
{
returns: "bool",
args: ["ptr", "usize"],
},
);
const searchIterator = new JSCallback((ptr, length) => /hello/.test(new CString(ptr, length)), {
returns: "bool",
args: ["ptr", "usize"],
});
const str = Buffer.from("wwutwutwutwutwutwutwutwutwutwutut\0", "utf8");
if (search(ptr(str), searchIterator)) {
@@ -383,6 +376,10 @@ If you want to track when a `TypedArray` is no longer in use from JavaScript, yo
#### From C, Rust, Zig, etc
{% callout %}
**Note** — Available in Bun v0.1.8 and later.
{% /callout %}
If you want to track when a `TypedArray` is no longer in use from C or FFI, you can pass a callback and an optional context pointer to `toArrayBuffer` or `toBuffer`. This function is called at some point later, once the garbage collector frees the underlying `ArrayBuffer` JavaScript object.
The expected signature is the same as in [JavaScriptCore's C API](https://developer.apple.com/documentation/javascriptcore/jstypedarraybytesdeallocator?language=objc):

View File

@@ -1,8 +1,8 @@
{% callout %}
<!-- **Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. Existing Node.js projects may use Bun's [nearly complete](/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module. -->
<!-- **Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. Existing Node.js projects may use Bun's [nearly complete](/docs/runtime/nodejs-apis#node_fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module. -->
**Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. For operations that are not yet available with `Bun.file`, such as `mkdir` or `readdir`, you can use Bun's [nearly complete](/docs/runtime/nodejs-apis#node-fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module.
**Note** — The `Bun.file` and `Bun.write` APIs documented on this page are heavily optimized and represent the recommended way to perform file-system tasks using Bun. For operations that are not yet available with `Bun.file`, such as `mkdir`, you can use Bun's [nearly complete](/docs/runtime/nodejs-apis#node_fs) implementation of the [`node:fs`](https://nodejs.org/api/fs.html) module.
{% /callout %}
@@ -195,7 +195,7 @@ const input = Bun.file("input.txt");
await Bun.write(Bun.stdout, input);
```
To write the body of an HTTP response to disk:
To write an HTTP response to disk:
```ts
const response = await fetch("https://bun.sh");

View File

@@ -93,7 +93,6 @@ interface Bun {
style: "nextjs";
origin?: string;
assetPrefix?: string;
fileExtensions?: string[];
});
reload(): void;

View File

@@ -34,7 +34,7 @@ Bun implements the following globals.
- [`Buffer`](https://nodejs.org/api/buffer.html#class-buffer)
- Node.js
- See [Node.js > `Buffer`](/docs/runtime/nodejs-apis#node-buffer)
- See [Node.js > `Buffer`](/docs/runtime/nodejs-apis#node_buffer)
---
@@ -172,7 +172,7 @@ Bun implements the following globals.
- [`global`](https://nodejs.org/api/globals.html#global)
- Node.js
- See [Node.js > `global`](/docs/runtime/nodejs-apis#global).
- See [Node.js > `global`](/docs/runtime/nodejs-apis#node_global).
---
@@ -220,7 +220,7 @@ Bun implements the following globals.
- [`process`](https://nodejs.org/api/process.html)
- Node.js
- See [Node.js > `process`](/docs/runtime/nodejs-apis#node-process)
- See [Node.js > `process`](/docs/runtime/nodejs-apis#node_process)
---

View File

@@ -6,6 +6,10 @@ Bun implements the `createHash` and `createHmac` functions from [`node:crypto`](
## `Bun.password`
{% callout %}
**Note** — Added in Bun 0.6.8.
{% /callout %}
`Bun.password` is a collection of utility functions for hashing and verifying passwords with various cryptographically secure algorithms.
```ts
@@ -73,7 +77,7 @@ The standard `Bun.hash` functions uses [Wyhash](https://github.com/wangyi-fudan/
```ts
Bun.hash("some data here");
// 11562320457524636935n
// 976213160445840
```
The input can be a string, `TypedArray`, `DataView`, `ArrayBuffer`, or `SharedArrayBuffer`.
@@ -87,14 +91,14 @@ Bun.hash(arr.buffer);
Bun.hash(new DataView(arr.buffer));
```
Optionally, an integer seed can be specified as the second parameter. For 64-bit hashes seeds above `Number.MAX_SAFE_INTEGER` should be given as BigInt to avoid loss of precision.
Optionally, an integer seed can be specified as the second parameter.
```ts
Bun.hash("some data here", 1234);
// 15724820720172937558n
// 1173484059023252
```
Additional hashing algorithms are available as properties on `Bun.hash`. The API is the same for each, only changing the return type from number for 32-bit hashes to bigint for 64-bit hashes.
Additional hashing algorithms are available as properties on `Bun.hash`. The API is the same for each.
```ts
Bun.hash.wyhash("data", 1234); // equivalent to Bun.hash()
@@ -103,7 +107,6 @@ Bun.hash.adler32("data", 1234);
Bun.hash.cityHash32("data", 1234);
Bun.hash.cityHash64("data", 1234);
Bun.hash.murmur32v3("data", 1234);
Bun.hash.murmur32v2("data", 1234);
Bun.hash.murmur64v2("data", 1234);
```
@@ -132,7 +135,7 @@ hasher.digest();
Once initialized, data can be incrementally fed to the hasher using `.update()`. This method accepts `string`, `TypedArray`, and `ArrayBuffer`.
```ts
const hasher = new Bun.CryptoHasher("sha256");
const hasher = new Bun.CryptoHasher();
hasher.update("hello world");
hasher.update(new Uint8Array([1, 2, 3]));
@@ -170,7 +173,7 @@ hasher.update("hello world", "latin1");
After the data has been fed into the hasher, a final hash can be computed using `.digest()`. By default, this method returns a `Uint8Array` containing the hash.
```ts
const hasher = new Bun.CryptoHasher("sha256");
const hasher = new Bun.CryptoHasher();
hasher.update("hello world");
hasher.digest();

View File

@@ -13,7 +13,7 @@ Start an HTTP server in Bun with `Bun.serve`.
```ts
Bun.serve({
fetch(req) {
return new Response("Bun!");
return new Response(`Bun!`);
},
});
```
@@ -24,9 +24,9 @@ The `fetch` handler handles incoming requests. It receives a [`Request`](https:/
Bun.serve({
fetch(req) {
const url = new URL(req.url);
if (url.pathname === "/") return new Response("Home page!");
if (url.pathname === "/") return new Response(`Home page!`);
if (url.pathname === "/blog") return new Response("Blog!");
return new Response("404!");
return new Response(`404!`);
},
});
```
@@ -35,19 +35,8 @@ To configure which port and hostname the server will listen on:
```ts
Bun.serve({
port: 8080, // defaults to $BUN_PORT, $PORT, $NODE_PORT otherwise 3000
port: 8080, // defaults to $PORT, then 3000
hostname: "mydomain.com", // defaults to "0.0.0.0"
fetch(req) {
return new Response("404!");
},
});
```
To listen on a [unix domain socket](https://en.wikipedia.org/wiki/Unix_domain_socket):
```ts
Bun.serve({
unix: "/tmp/my-socket.sock", // path to socket
fetch(req) {
return new Response(`404!`);
},
@@ -71,7 +60,7 @@ In development mode, Bun will surface errors in-browser with a built-in error pa
{% image src="/images/exception_page.png" caption="Bun's built-in 500 page" /%}
To handle server-side errors, implement an `error` handler. This function should return a `Response` to serve to the client when an error occurs. This response will supersede Bun's default error page in `development` mode.
To handle server-side errors, implement an `error` handler. This function should return a `Response` to served to the client when an error occurs. This response will supercede Bun's default error page in `development` mode.
```ts
Bun.serve({
@@ -89,7 +78,7 @@ Bun.serve({
```
{% callout %}
[Learn more about debugging in Bun](https://bun.sh/docs/runtime/debugger)
**Note** — Full debugger support is planned.
{% /callout %}
The call to `Bun.serve` returns a `Server` object. To stop the server, call the `.stop()` method.
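For illustration, a minimal sketch (not part of this diff) of shutting a server down:

```ts
const server = Bun.serve({
  fetch(req) {
    return new Response("Bun!");
  },
});

// ...later, stop accepting new connections
server.stop();
```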
@@ -140,6 +129,12 @@ Bun.serve({
});
```
{% callout %}
**Note** — Earlier versions of Bun supported passing a file path as `keyFile` and `certFile`; this has been deprecated as of `v0.6.3`.
{% /callout %}
If your private key is encrypted with a passphrase, provide a value for `passphrase` to decrypt it.
```ts-diff
@@ -183,7 +178,7 @@ Bun.serve({
});
```
## Object syntax
## Hot reloading
Thus far, the examples on this page have used the explicit `Bun.serve` API. Bun also supports an alternate syntax.
@@ -192,27 +187,29 @@ import {type Serve} from "bun";
export default {
fetch(req) {
return new Response("Bun!");
return new Response(`Bun!`);
},
} satisfies Serve;
```
Instead of passing the server options into `Bun.serve`, `export default` it. This file can be executed as-is; when Bun sees a file with a `default` export containing a `fetch` handler, it passes it into `Bun.serve` under the hood.
Instead of passing the server options into `Bun.serve`, export it. This file can be executed as-is; when Bun runs a file with a `default` export containing a `fetch` handler, it passes it into `Bun.serve` under the hood.
<!-- This syntax has one major advantage: it is hot-reloadable out of the box. When any source file is changed, Bun will reload the server with the updated code _without restarting the process_. This makes hot reloads nearly instantaneous. Use the `--hot` flag when starting the server to enable hot reloading. -->
This syntax has one major advantage: it is hot-reloadable out of the box. When any source file is changed, Bun will reload the server with the updated code _without restarting the process_. This makes hot reloads nearly instantaneous. Use the `--hot` flag when starting the server to enable hot reloading.
<!-- ```bash
```bash
$ bun --hot server.ts
``` -->
```
<!-- It's possible to configure hot reloading while using the explicit `Bun.serve` API; for details refer to [Runtime > Hot reloading](/docs/runtime/hot). -->
It's possible to configure hot reloading while using the explicit `Bun.serve` API; for details refer to [Runtime > Hot reloading](/docs/runtime/hot).
## Streaming files
To stream a file, return a `Response` object with a `BunFile` object as the body.
```ts
Bun.serve({
import { serve, file } from "bun";
serve({
fetch(req) {
return new Response(Bun.file("./hello.txt"));
},
@@ -223,7 +220,7 @@ Bun.serve({
⚡️ **Speed** — Bun automatically uses the [`sendfile(2)`](https://man7.org/linux/man-pages/man2/sendfile.2.html) system call when possible, enabling zero-copy file transfers in the kernel—the fastest way to send files.
{% /callout %}
You can send part of a file using the [`slice(start, end)`](https://developer.mozilla.org/en-US/docs/Web/API/Blob/slice) method on the `Bun.file` object. This automatically sets the `Content-Range` and `Content-Length` headers on the `Response` object.
**[v0.3.0+]** You can send part of a file using the [`slice(start, end)`](https://developer.mozilla.org/en-US/docs/Web/API/Blob/slice) method on the `Bun.file` object. This automatically sets the `Content-Range` and `Content-Length` headers on the `Response` object.
```ts
Bun.serve({
@@ -252,7 +249,7 @@ Below are Bun and Node.js implementations of a simple HTTP server that responds
```ts#Bun
Bun.serve({
fetch(req: Request) {
return new Response("Bun!");
return new Response(`Bun!`);
},
port: 3000,
});

View File

@@ -19,17 +19,17 @@ import.meta.resolveSync("zod")
---
- `import.meta.dir`
- Absolute path to the directory containing the current file, e.g. `/path/to/project`. Equivalent to `__dirname` in CommonJS modules (and Node.js)
- Absolute path to the directory containing the current fil, e.g. `/path/to/project`. Equivalent to `__dirname` in Node.js.
---
- `import.meta.file`
- The name of the current file, e.g. `index.tsx`
- The name of the current file, e.g. `index.tsx`. Equivalent to `__filename` in Node.js.
---
- `import.meta.path`
- Absolute path to the current file, e.g. `/path/to/project/index.tx`. Equivalent to `__filename` in CommonJS modules (and Node.js)
- Absolute path to the current file, e.g. `/path/to/project/index.tx`.
---
@@ -39,7 +39,7 @@ import.meta.resolveSync("zod")
---
- `import.meta.resolve{Sync}`
- Resolve a module specifier (e.g. `"zod"` or `"./file.tsx"`) to an absolute path. Which file would be imported if the specifier were imported from this file?
- Resolve a module specifier (e.g. `"zod"` or `"./file.tsx`) to an absolute path. Which file would be imported if the specifier were imported from this file?
```ts
import.meta.resolveSync("zod");

View File

@@ -28,9 +28,7 @@ By default, the input stream of the subprocess is undefined; it can be configure
```ts
const proc = Bun.spawn(["cat"], {
stdin: await fetch(
"https://raw.githubusercontent.com/oven-sh/bun/main/examples/hashing.js",
),
stdin: await fetch("https://raw.githubusercontent.com/oven-sh/bun/main/examples/hashing.js"),
});
const text = await new Response(proc.stdout).text();
@@ -211,7 +209,7 @@ Bun's `spawnSync` spawns processes 60% faster than the Node.js `child_process` m
```bash
$ bun spawn.mjs
cpu: Apple M1 Max
runtime: bun 1.x (arm64-darwin)
runtime: bun 0.2.0 (arm64-darwin)
benchmark time (avg) (min … max) p75 p99 p995
--------------------------------------------------------- -----------------------------
@@ -232,15 +230,10 @@ A simple reference of the Spawn API and types are shown below. The real types ha
```ts
interface Bun {
spawn(command: string[], options?: SpawnOptions.OptionsObject): Subprocess;
spawnSync(
command: string[],
options?: SpawnOptions.OptionsObject,
): SyncSubprocess;
spawnSync(command: string[], options?: SpawnOptions.OptionsObject): SyncSubprocess;
spawn(options: { cmd: string[] } & SpawnOptions.OptionsObject): Subprocess;
spawnSync(
options: { cmd: string[] } & SpawnOptions.OptionsObject,
): SyncSubprocess;
spawnSync(options: { cmd: string[] } & SpawnOptions.OptionsObject): SyncSubprocess;
}
namespace SpawnOptions {
@@ -250,12 +243,7 @@ namespace SpawnOptions {
stdin?: SpawnOptions.Readable;
stdout?: SpawnOptions.Writable;
stderr?: SpawnOptions.Writable;
onExit?: (
proc: Subprocess,
exitCode: number | null,
signalCode: string | null,
error: Error | null,
) => void;
onExit?: (proc: Subprocess, exitCode: number | null, signalCode: string | null, error: Error | null) => void;
}
type Readable =

View File

@@ -99,20 +99,6 @@ const query = db.prepare("SELECT * FROM foo WHERE bar = ?");
{% /callout %}
## WAL mode
SQLite supports [write-ahead log mode](https://www.sqlite.org/wal.html) (WAL) which dramatically improves performance, especially in situations with many concurrent writes. It's broadly recommended to enable WAL mode for most typical applications.
To enable WAL mode, run this pragma query at the beginning of your application:
```ts
db.exec("PRAGMA journal_mode = WAL;");
```
{% details summary="What is WAL mode" %}
In WAL mode, writes to the database are written directly to a separate file called the "WAL file" (write-ahead log). This file will be later integrated into the main database file. Think of it as a buffer for pending writes. Refer to the [SQLite docs](https://www.sqlite.org/wal.html) for a more detailed overview.
{% /details %}
## Statements
A `Statement` is a _prepared query_, which means it's been parsed and compiled into an efficient binary form. It can be executed multiple times in a performant way.
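As a minimal sketch (the database file name and the `foo` table with a `bar` column are assumed from the earlier snippet):

```ts
import { Database } from "bun:sqlite";

const db = new Database("mydb.sqlite");
const query = db.prepare("SELECT * FROM foo WHERE bar = ?");

// the compiled statement can be reused with different parameters
console.log(query.all("hello"));
console.log(query.get("world"));
```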

View File

@@ -1,6 +1,6 @@
Streams are an important abstraction for working with binary data without loading it all into memory at once. They are commonly used for reading and writing files, sending and receiving network requests, and processing large amounts of data.
Bun implements the Web APIs [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) and [`WritableStream`](https://developer.mozilla.org/en-US/docs/Web/API/WritableStream).
Bun implements the Web APIs [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) and [`WritableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream).
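As a quick sketch (not part of this diff), a `ReadableStream` can be constructed directly and consumed with `for await`:

```ts
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

for await (const chunk of stream) {
  console.log(chunk); // "hello", then "world"
}
```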
{% callout %}
Bun also implements the `node:stream` module, including [`Readable`](https://nodejs.org/api/stream.html#stream_readable_streams), [`Writable`](https://nodejs.org/api/stream.html#stream_writable_streams), and [`Duplex`](https://nodejs.org/api/stream.html#stream_duplex_and_transform_streams). For complete documentation, refer to the [Node.js docs](https://nodejs.org/api/stream.html).

View File

@@ -76,6 +76,12 @@ Bun.listen({
});
```
{% callout %}
**Note** Earlier versions of Bun supported passing a file path as `keyFile` and `certFile`; this has been deprecated as of `v0.6.3`.
{% /callout %}
The `key` and `cert` fields expect the _contents_ of your TLS key and certificate. This can be a string, `BunFile`, `TypedArray`, or `Buffer`.
```ts
@@ -89,7 +95,7 @@ Bun.listen({
// string
key: fs.readFileSync("./key.pem", "utf8"),
// array of above
key: [Bun.file("./key1.pem"), Bun.file("./key2.pem")],
key: [Bun.file('./key1.pem'), Bun.file('./key2.pem']
},
});
```

View File

@@ -76,7 +76,7 @@ await transpiler.transform("<div>hi!</div>", "tsx");
```
{% details summary="Nitty gritty" %}
The `.transform()` method runs the transpiler in Bun's worker threadpool, so if you run it 100 times, it will run it across `Math.floor($cpu_count * 0.8)` threads, without blocking the main JavaScript thread.
The `.tranform()` method runs the transpiler in Bun's worker threadpool, so if you run it 100 times, it will run it across `Math.floor($cpu_count * 0.8)` threads, without blocking the main JavaScript thread.
If your code uses a macro, it will potentially spawn a new copy of Bun's JavaScript runtime environment in that new thread.
{% /details %}
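As an illustrative sketch (not from the original docs), many transforms can be started at once and spread across that threadpool:

```ts
const transpiler = new Bun.Transpiler({ loader: "tsx" });

// each .transform() call is dispatched to Bun's worker threadpool
const results = await Promise.all(
  Array.from({ length: 100 }, () => transpiler.transform("<div>hi!</div>", "tsx")),
);
console.log(results.length); // 100
```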
@@ -160,6 +160,7 @@ export const name = "hello";
`;
const result = transpiler.scanImports(code);
`);
```
```json#Results

View File

@@ -43,7 +43,7 @@ This is analogous to the [`require.main = module` trick](https://stackoverflow.c
## `Bun.sleep()`
`Bun.sleep(ms: number)`
`Bun.sleep(ms: number)` (added in Bun v0.5.6)
Returns a `Promise` that resolves after the given number of milliseconds.
@@ -65,7 +65,7 @@ console.log("hello one second later!");
## `Bun.sleepSync()`
`Bun.sleepSync(ms: number)`
`Bun.sleepSync(ms: number)` (added in Bun v0.5.6)
A blocking synchronous version of `Bun.sleep`.
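A minimal sketch of the blocking variant:

```ts
console.log("before");
Bun.sleepSync(50); // blocks the current thread for ~50ms
console.log("after");
```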
@@ -108,7 +108,7 @@ console.log(ls); // null
## `Bun.peek()`
`Bun.peek(prom: Promise)`
`Bun.peek(prom: Promise)` (added in Bun v0.2.2)
Reads a promise's result without `await` or `.then`, but only if the promise has already fulfilled or rejected.
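A minimal sketch (not from the original docs):

```ts
const fulfilled = Promise.resolve("hi");
Bun.peek(fulfilled); // => "hi", read synchronously

const pending = new Promise(() => {});
Bun.peek(pending); // => the same Promise, still pending
```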
@@ -183,7 +183,7 @@ const currentFile = import.meta.url;
Bun.openInEditor(currentFile);
```
You can override this via the `debug.editor` setting in your [`bunfig.toml`](/docs/runtime/bunfig)
You can override this via the `debug.editor` setting in your [`bunfig.toml`](/docs/runtime/configuration)
```toml-diff#bunfig.toml
+ [debug]
@@ -204,7 +204,7 @@ Bun.ArrayBufferSink;
## `Bun.deepEquals()`
Recursively checks if two objects are equivalent. This is used internally by `expect().toEqual()` in `bun:test`.
Nestedly checks if two objects are equivalent. This is used internally by `expect().toEqual()` in `bun:test`.
```ts
const foo = { a: 1, b: 2, c: { d: 3 } };
@@ -428,21 +428,6 @@ const str = Bun.inspect(arr);
// => "Uint8Array(3) [ 1, 2, 3 ]"
```
## `Bun.inspect.custom`
This is the symbol that Bun uses to implement `Bun.inspect`. You can override this to customize how your objects are printed. It is identical to `util.inspect.custom` in Node.js.
```ts
class Foo {
[Bun.inspect.custom]() {
return "foo";
}
}
const foo = new Foo();
console.log(foo); // => "foo"
```
## `Bun.nanoseconds()`
Returns the number of nanoseconds since the current `bun` process started, as a `number`. Useful for high-precision timing and benchmarking.
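As a quick sketch:

```ts
const start = Bun.nanoseconds();
await Bun.sleep(10);
const elapsedMs = (Bun.nanoseconds() - start) / 1e6;
console.log(`~${elapsedMs.toFixed(2)} ms elapsed`);
```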

View File

@@ -87,7 +87,7 @@ ws.send(new Uint8Array([1, 2, 3])); // TypedArray | DataView
### Headers
Once the upgrade succeeds, Bun will send a `101 Switching Protocols` response per the [spec](https://developer.mozilla.org/en-US/docs/Web/HTTP/Protocol_upgrade_mechanism). Additional `headers` can be attached to this `Response` in the call to `server.upgrade()`.
Once the upgrade succeeds, Bun will send a `101 Switching Protocols` response per the [spec](https://developer.mozilla.org/en-US/docs/Web/HTTP/Protocol_upgrade_mechanism). Additional `headers` can be attched to this `Response` in the call to `server.upgrade()`.
```ts
Bun.serve({
@@ -161,7 +161,7 @@ socket.addEventListener("message", event => {
### Pub/Sub
Bun's `ServerWebSocket` implementation implements a native publish-subscribe API for topic-based broadcasting. Individual sockets can `.subscribe()` to a topic (specified with a string identifier) and `.publish()` messages to all other subscribers to that topic (excluding itself). This topic-based broadcast API is similar to [MQTT](https://en.wikipedia.org/wiki/MQTT) and [Redis Pub/Sub](https://redis.io/topics/pubsub).
Bun's `ServerWebSocket` implementation implements a native publish-subscribe API for topic-based broadcasting. Individual sockets can `.subscribe()` to a topic (specified with a string identifier) and `.publish()` messages to all other subscribers to that topic. This topic-based broadcast API is similar to [MQTT](https://en.wikipedia.org/wiki/MQTT) and [Redis Pub/Sub](https://redis.io/topics/pubsub).
```ts
const server = Bun.serve<{ username: string }>({
@@ -192,7 +192,7 @@ const server = Bun.serve<{ username: string }>({
close(ws) {
const msg = `${ws.data.username} has left the chat`;
ws.unsubscribe("the-group-chat");
server.publish("the-group-chat", msg);
ws.publish("the-group-chat", msg);
},
},
});
@@ -200,18 +200,7 @@ const server = Bun.serve<{ username: string }>({
console.log(`Listening on ${server.hostname}:${server.port}`);
```
Calling `.publish(data)` will send the message to all subscribers of a topic _except_ the socket that called `.publish()`. To send a message to all subscribers of a topic, use the `.publish()` method on the `Server` instance.
```ts
const server = Bun.serve({
websocket: {
// ...
},
});
// listen for some external event
server.publish("the-group-chat", "Hello world");
```
Calling `.publish(data)` will send the message to all subscribers of a topic _except_ the socket that called `.publish()`.
### Compression
@@ -247,11 +236,7 @@ This gives you better control over backpressure in your server.
## Connect to a `Websocket` server
{% callout %}
**🚧** — The `WebSocket` client still does not pass the full [Autobahn test suite](https://github.com/crossbario/autobahn-testsuite) and should not be considered ready for production.
{% /callout %}
Bun implements the `WebSocket` class. To create a WebSocket client that connects to a `ws://` or `wss://` server, create an instance of `WebSocket`, as you would in the browser.
To connect to an external socket server, either from a browser or from Bun, create an instance of `WebSocket` with the constructor.
```ts
const socket = new WebSocket("ws://localhost:3000");

View File

@@ -1,5 +1,5 @@
{% callout %}
**🚧** — The `Worker` API is still experimental and should not be considered ready for production.
`Worker` support was added in Bun v0.7.0.
{% /callout %}
[`Worker`](https://developer.mozilla.org/en-US/docs/Web/API/Worker) lets you start and communicate with a new JavaScript instance running on a separate thread while sharing I/O resources with the main thread.
@@ -10,7 +10,7 @@ Bun implements a minimal version of the [Web Workers API](https://developer.mozi
Like in browsers, [`Worker`](https://developer.mozilla.org/en-US/docs/Web/API/Worker) is a global. Use it to create a new worker thread.
### From the main thread
From the main thread:
```js#Main_thread
const workerURL = new URL("worker.ts", import.meta.url).href;
@@ -22,25 +22,16 @@ worker.onmessage = event => {
};
```
### Worker thread
Worker thread:
```ts#worker.ts_(Worker_thread)
// prevents TS errors
declare var self: Worker;
self.onmessage = (event: MessageEvent) => {
console.log(event.data);
postMessage("world");
};
```
To prevent TypeScript errors when using `self`, add this line to the top of your worker file.
```ts
declare var self: Worker;
```
You can use `import` and `export` syntax in your worker code. Unlike in browsers, there's no need to specify `{type: "module"}` to use ES Modules.
You can use `import`/`export` syntax in your worker code. Unlike in browsers, there's no need to specify `{type: "module"}` to use ES Modules.
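As a hedged sketch (the `./greet.ts` module and its `hello` export are hypothetical):

```ts
// worker.ts — ES module syntax, no {type: "module"} needed
import { hello } from "./greet.ts"; // hypothetical local module

declare var self: Worker; // prevents TS errors

self.onmessage = (event: MessageEvent) => {
  postMessage(hello(event.data));
};
```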
To simplify error handling, the initial script to load is resolved at the time `new Worker(url)` is called.
@@ -97,7 +88,7 @@ worker.addEventListener("message", event => {
## Terminating a worker
A `Worker` instance terminates automatically once it's event loop has no work left to do. Attaching a `"message"` listener on the global or any `MessagePort`s will keep the event loop alive. To forcefully terminate a `Worker`, call `worker.terminate()`.
A `Worker` instance terminate automatically when Bun's process exits. To terminate a `Worker` sooner, call `worker.terminate()`.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
@@ -106,20 +97,18 @@ const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.terminate();
```
This will cause the worker to exit as soon as possible.
### `process.exit()`
A worker can terminate itself with `process.exit()`. This does not terminate the main process. Like in Node.js, `process.on('beforeExit', callback)` and `process.on('exit', callback)` are emitted on the worker thread (and not on the main thread), and the exit code is passed to the `"close"` event.
A worker can terminate itself with `process.exit()`. This does not terminate the main process. Like in Node.js, `process.on('beforeExit', callback)` and `process.on('exit', callback)` are emitted on the worker thread (and not on the main thread).
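A minimal sketch (not from the original docs) of a worker shutting itself down:

```ts
// worker.ts
declare var self: Worker;

process.on("exit", code => {
  console.log(`worker exiting with code ${code}`); // runs on the worker thread
});

process.exit(0); // terminates this worker only; the main process keeps running
```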
### `"close"`
The `"close"` event is emitted when a worker has been terminated. It can take some time for the worker to actually terminate, so this event is emitted when the worker has been marked as terminated. The `CloseEvent` will contain the exit code passed to `process.exit()`, or 0 if closed for other reasons.
The `"close"` event is emitted when a worker has been terminated. It can take some time for the worker to actually terminate, so this event is emitted when the worker has been marked as terminated.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.addEventListener("close", event => {
worker.addEventListener("close", () => {
console.log("worker is being closed");
});
```
@@ -128,27 +117,14 @@ This event does not exist in browsers.
## Managing lifetime
By default, an active `Worker` will keep the main (spawning) process alive, so async tasks like `setTimeout` and promises will keep the process alive. Attaching `message` listeners will also keep the `Worker` alive.
By default, an active `Worker` will _not_ keep the main (spawning) process alive. Once the main script finishes, the main thread will terminate, shutting down any workers it created.
### `worker.unref()`
### `worker.ref`
To stop a running worker from keeping the process alive, call `worker.unref()`. This decouples the lifetime of the worker to the lifetime of the main process, and is equivalent to what Node.js' `worker_threads` does.
To keep the process alive until the `Worker` terminates, call `worker.ref()`. This couples the lifetime of the worker to the lifetime of the main process.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.unref();
```
Note: `worker.unref()` is not available in browsers.
### `worker.ref()`
To keep the process alive until the `Worker` terminates, call `worker.ref()`. A ref'd worker is the default behavior, and still needs something going on in the event loop (such as a `"message"` listener) for the worker to continue running.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.unref();
// later...
worker.ref();
```
@@ -156,11 +132,22 @@ Alternatively, you can also pass an `options` object to `Worker`:
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href, {
ref: false,
ref: true,
});
```
Note: `worker.ref()` is not available in browsers.
### `worker.unref`
To stop keeping the process alive, call `worker.unref()`.
```ts
const worker = new Worker(new URL("worker.ts", import.meta.url).href);
worker.ref();
// ...later on
worker.unref();
```
Note: `worker.ref()` and `worker.unref()` do not exist in browsers.
## Memory usage with `smol`

View File

@@ -52,6 +52,7 @@ Run this with `bun cat.js /path/to/big/file`.
## Reading from standard input
```ts
// As of Bun v0.3.0, console is an AsyncIterable
for await (const line of console) {
// line of text from stdin
console.log(line);

View File

@@ -28,33 +28,6 @@ All imported files and packages are bundled into the executable, along with a co
- `--outdir` — use `outfile` instead.
- `--external`
- `--splitting`
- `--public-path`
- `--publicPath`
{% /callout %}
## Embedding files
Standalone executables support embedding files.
To embed files into an executable with `bun build --compile`, import the file in your code
```js
// this becomes an internal file path
import icon from "./icon.png";
import { file } from "bun";
export default {
fetch(req) {
return new Response(file(icon));
},
};
```
You may need to specify a `--loader` for it to be treated as a `"file"` loader (so you get back a file path).
Embedded files can be read using `Bun.file`'s functions or the Node.js `fs.readFile` function (in `"node:fs"`).
## Minification
To trim down the size of the executable a little, pass `--minify` to `bun build --compile`. This uses Bun's minifier to reduce the code size. Overall though, Bun's binary is still way too big and we need to make it smaller.

View File

@@ -29,10 +29,6 @@ The bundler is a key piece of infrastructure in the JavaScript ecosystem. As a b
Let's jump into the bundler API.
{% callout %}
Note that the Bun bundler is not intended to replace `tsc` for typechecking or generating type declarations.
{% /callout %}
## Basic example
Let's build our first bundle. You have the following two files, which implement a simple client-side rendered React app.
@@ -136,14 +132,6 @@ Visit `http://localhost:5000` to see your bundled app in action.
{% /details %}
## Watch mode
Like the runtime and test runner, the bundler supports watch mode natively.
```sh
$ bun build ./index.tsx --outdir ./out --watch
```
## Content types
Like the Bun runtime, the bundler supports an array of file types out of the box. The following table breaks down the bundler's set of standard "loaders". Refer to [Bundler > File types](/docs/runtime/loaders) for full documentation.
@@ -972,7 +960,7 @@ By specifying `.` as `root`, the generated file structure will look like this:
A prefix to be appended to any import paths in bundled code.
<!-- $ bun build ./index.tsx --outdir ./out --public-path https://cdn.example.com -->
<!-- $ bun build ./index.tsx --outdir ./out --publicPath https://cdn.example.com -->
In many cases, generated bundles will contain no `import` statements. After all, the goal of bundling is to combine all of the code into a single file. However, there are a number of cases in which the generated bundles will contain `import` statements.
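For illustration (the CDN URL is a placeholder), a sketch of setting this option through the JavaScript API:

```ts
await Bun.build({
  entrypoints: ["./index.tsx"],
  outdir: "./out",
  // import paths in the generated bundles are prefixed with this value
  publicPath: "https://cdn.example.com/",
});
```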
@@ -1096,7 +1084,7 @@ interface BuildArtifact extends Blob {
The `outputs` array contains all the files that were generated by the build. Each artifact implements the `Blob` interface.
```ts
const build = await Bun.build({
const build = Bun.build({
/* */
});
@@ -1140,7 +1128,7 @@ Each artifact also contains the following properties:
Similar to `BunFile`, `BuildArtifact` objects can be passed directly into `new Response()`.
```ts
const build = await Bun.build({
const build = Bun.build({
/* */
});
@@ -1156,7 +1144,7 @@ The Bun runtime implements special pretty-printing of `BuildArtifact` object to
```ts#Build_script
// build.ts
const build = await Bun.build({/* */});
const build = Bun.build({/* */});
const artifact = build.outputs[0];
console.log(artifact);

View File

@@ -40,7 +40,7 @@ When a user visits this website, the files are loaded in the following order:
This approach works, but it requires three round-trip HTTP requests before the browser is ready to render the page. On slow internet connections, this may add up to a non-trivial delay.
This example is extremely simplistic. A modern app may be loading dozens of modules from `node_modules`, each consisting of hundred of files. Loading each of these files with a separate HTTP request becomes untenable very quickly. While most of these requests will be running in parallel, the number of round-trip requests can still be very high; plus, there are limits on how many simultaneous requests a browser can make.
This example is extremely simplistic. A modern app may be loading dozens of modules from `node_modules`, each consisting of hundrends of files. Loading each of these files with a separate HTTP request becomes untenable very quickly. While most of these requests will be running in parallel, the number of round-trip requests can still be very high; plus, there are limits on how many simultaneous requests a browser can make.
{% callout %}
Some recent advances like modulepreload and HTTP/3 are intended to solve some of these problems, but at the moment bundling is still the most performant approach.

View File

@@ -16,7 +16,7 @@ Parses the code and applies a set of default transforms, like dead-code eliminat
**JavaScript + JSX**. Default for `.js` and `.jsx`.
Same as the `js` loader, but JSX syntax is supported. By default, JSX is down-converted to plain JavaScript; the details of how this is done depends on the `jsx*` compiler options in your `tsconfig.json`. Refer to the TypeScript documentation [on JSX](https://www.typescriptlang.org/docs/handbook/jsx.html) for more information.
Same as the `js` loader, but JSX syntax is supported. By default, JSX is downconverted to plain JavaScript; the details of how this is done depends on the `jsx*` compiler options in your `tsconfig.json`. Refer to the TypeScript documentation [on JSX](https://www.typescriptlang.org/docs/handbook/jsx.html) for more information.
### `ts`

View File

@@ -129,7 +129,7 @@ if (returnFalse()) {
}
```
## Serializability
## Serializablility
Bun's transpiler needs to be able to serialize the result of the macro so it can be inlined into the AST. All JSON-compatible data structures are supported:
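The code sample that follows this sentence in the docs is not included in this hunk; as a hedged sketch (file name hypothetical, and the import-attribute syntax for macros has varied across Bun versions):

```ts
// buildInfo.macro.ts — the macro's return value must be JSON-serializable
// so the transpiler can inline it into the AST at bundle time
export function buildInfo() {
  return { name: "my-app", builtAt: new Date().toISOString() };
}

// usage elsewhere (attribute syntax depends on Bun version):
//   import { buildInfo } from "./buildInfo.macro.ts" assert { type: "macro" };
//   console.log(buildInfo()); // the object literal is inlined at build time
```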

View File

@@ -1,18 +1,20 @@
{% callout %}
**Note** — Introduced in Bun v0.1.11.
{% /callout %}
Bun provides a universal plugin API that can be used to extend both the _runtime_ and _bundler_.
Plugins intercept imports and perform custom loading logic: reading files, transpiling code, etc. They can be used to add support for additional file types, like `.scss` or `.yaml`. In the context of Bun's bundler, plugins can be used to implement framework-level features like CSS extraction, macros, and client-server code co-location.
For more complete documentation of the Plugin API, see [Runtime > Plugins](/docs/runtime/plugins).
## Usage
A plugin is defined as a simple JavaScript object containing a `name` property and a `setup` function. Register a plugin with Bun using the `plugin` function.
```tsx#myPlugin.ts
```tsx#yamlPlugin.ts
import type { BunPlugin } from "bun";
const myPlugin: BunPlugin = {
name: "Custom loader",
name: "YAML loader",
setup(build) {
// implementation
},
@@ -28,3 +30,307 @@ Bun.build({
plugins: [myPlugin],
});
```
<!-- It can also be "registered" with the Bun runtime using the `Bun.plugin()` function. Once registered, the currently executing `bun` process will incorporate the plugin into its module resolution algorithm.
```ts
import {plugin} from "bun";
plugin(myPlugin);
``` -->
## `--preload`
To consume this plugin, add this file to the `preload` option in your [`bunfig.toml`](/docs/runtime/configuration). Bun automatically loads the files/modules specified in `preload` before running a file.
```toml
preload = ["./yamlPlugin.ts"]
```
To preload files during `bun test`:
```toml
[test]
preload = ["./loader.ts"]
```
{% details summary="Usage without preload" %}
Alternatively, you can import this file manually at the top of your project's entrypoint, before any application code is imported.
```ts#app.ts
import "./yamlPlugin.ts";
import { config } from "./config.yml";
console.log(config);
```
{% /details %}
## Third-party plugins
By convention, third-party plugins intended for consumption should export a factory function that accepts some configuration and returns a plugin object.
```ts
import { plugin } from "bun";
import fooPlugin from "bun-plugin-foo";
plugin(
fooPlugin({
// configuration
}),
);
// application code
```
Bun's plugin API is based on [esbuild](https://esbuild.github.io/plugins). Only [a subset](/docs/bundler/vs-esbuild#plugin-api) of the esbuild API is implemented, but some esbuild plugins "just work" in Bun, like the official [MDX loader](https://mdxjs.com/packages/esbuild/):
```jsx
import { plugin } from "bun";
import mdx from "@mdx-js/esbuild";
plugin(mdx());
import { renderToStaticMarkup } from "react-dom/server";
import Foo from "./bar.mdx";
console.log(renderToStaticMarkup(<Foo />));
```
## Loaders
<!-- The plugin logic is implemented in the `setup` function using the builder provided as the first argument (`build` in the example above). The `build` variable provides two methods: `onResolve` and `onLoad`. -->
<!-- ## `onResolve` -->
<!-- The `onResolve` method lets you intercept imports that match a particular regex and modify the resolution behavior, such as re-mapping the import to another file. In the simplest case, you can simply remap the matched import to a new path.
```ts
import { plugin } from "bun";
plugin({
name: "YAML loader",
setup(build) {
build.onResolve();
// implementation
},
});
``` -->
<!--
Internally, Bun's transpiler automatically turns `plugin()` calls into separate files (at most 1 per file). This lets loaders activate before the rest of your application runs with zero configuration. -->
Plugins are primarily used to extend Bun with loaders for additional file types. Let's look at a simple plugin that implements a loader for `.yaml` files.
```ts#yamlPlugin.ts
import { plugin } from "bun";
plugin({
name: "YAML",
async setup(build) {
const { load } = await import("js-yaml");
const { readFileSync } = await import("fs");
// when a .yaml file is imported...
build.onLoad({ filter: /\.(yaml|yml)$/ }, (args) => {
// read and parse the file
const text = readFileSync(args.path, "utf8");
const exports = load(text) as Record<string, any>;
// and return it as a module
return {
exports,
loader: "object", // special loader for JS objects
};
});
},
});
```
With this plugin, data can be directly imported from `.yaml` files.
{% codetabs %}
```ts#index.ts
import "./yamlPlugin.ts"
import {name, releaseYear} from "./data.yml"
console.log(name, releaseYear);
```
```yaml#data.yml
name: Fast X
releaseYear: 2023
```
{% /codetabs %}
Note that the returned object has a `loader` property. This tells Bun which of its internal loaders should be used to handle the result. Even though we're implementing a loader for `.yaml`, the result must still be understandable by one of Bun's built-in loaders. It's loaders all the way down.
In this case we're using `"object"`—a built-in loader (intended for use by plugins) that converts a plain JavaScript object to an equivalent ES module. Any of Bun's built-in loaders are supported; these same loaders are used by Bun internally for handling files of various kinds. The table below is a quick reference; refer to [Bundler > Loaders](/docs/bundler/loaders) for complete documentation.
{% table %}
- Loader
- Extensions
- Output
---
- `js`
- `.mjs` `.cjs`
- Transpile to JavaScript files
---
- `jsx`
- `.js` `.jsx`
- Transform JSX then transpile
---
- `ts`
- `.ts` `.mts` `.cts`
- Transform TypeScript then transpile
---
- `tsx`
- `.tsx`
- Transform TypeScript, JSX, then transpile
---
- `toml`
- `.toml`
- Parse using Bun's built-in TOML parser
---
- `json`
- `.json`
- Parse using Bun's built-in JSON parser
---
- `napi`
- `.node`
- Import a native Node.js addon
---
- `wasm`
- `.wasm`
- Import a WebAssembly module
---
- `object`
- _none_
- A special loader intended for plugins that converts a plain JavaScript object to an equivalent ES module. Each key in the object corresponds to a named export.
{% /table %}
Loading a YAML file is useful, but plugins support more than just data loading. Let's look at a plugin that lets Bun import `*.svelte` files.
```ts#sveltePlugin.ts
import { plugin } from "bun";
await plugin({
name: "svelte loader",
async setup(build) {
const { compile } = await import("svelte/compiler");
const { readFileSync } = await import("fs");
// when a .svelte file is imported...
build.onLoad({ filter: /\.svelte$/ }, ({ path }) => {
// read and compile it with the Svelte compiler
const file = readFileSync(path, "utf8");
const contents = compile(file, {
filename: path,
generate: "ssr",
}).js.code;
// and return the compiled source code as "js"
return {
contents,
loader: "js",
};
});
},
});
```
> Note: in a production implementation, you'd want to cache the compiled output and include additional error handling.
The object returned from `build.onLoad` contains the compiled source code in `contents` and specifies `"js"` as its loader. That tells Bun to consider the returned `contents` to be a JavaScript module and transpile it using Bun's built-in `js` loader.
With this plugin, Svelte components can now be directly imported and consumed.
```js
import "./sveltePlugin.ts";
import MySvelteComponent from "./component.svelte";
console.log(MySvelteComponent.render());
```
## Reading the config
Plugins can read and write to the [build config](/docs/bundler#api) with `build.config`.
```ts
Bun.build({
entrypoints: ["./app.ts"],
outdir: "./dist",
sourcemap: "external",
plugins: [
{
name: "demo",
setup(build) {
console.log(build.config.sourcemap); // "external"
build.config.minify = true; // enable minification
// `plugins` is readonly
console.log(`Number of plugins: ${build.config.plugins.length}`);
},
},
],
});
```
## Reference
```ts
namespace Bun {
function plugin(plugin: {
name: string;
setup: (build: PluginBuilder) => void;
}): void;
}
type PluginBuilder = {
onResolve: (
args: { filter: RegExp; namespace?: string },
callback: (args: { path: string; importer: string }) => {
path: string;
namespace?: string;
} | void,
) => void;
onLoad: (
args: { filter: RegExp; namespace?: string },
callback: (args: { path: string }) => {
loader?: Loader;
contents?: string;
exports?: Record<string, any>;
},
) => void;
config: BuildConfig;
};
type Loader = "js" | "jsx" | "ts" | "tsx" | "json" | "toml" | "object";
```
The `onLoad` method optionally accepts a `namespace` in addition to the `filter` regex. This namespace will be used to prefix the import in transpiled code; for instance, a loader with a `filter: /\.yaml$/` and `namespace: "yaml:"` will transform an import from `./myfile.yaml` into `yaml:./myfile.yaml`.
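As a rough sketch of how `onResolve` and `onLoad` compose with a namespace (the `env:config` specifier and the exported value below are made up for illustration):
```ts
import { plugin } from "bun";

plugin({
  name: "env virtual module",
  setup(build) {
    // claim the made-up "env:config" specifier and move it into the "env" namespace
    build.onResolve({ filter: /^env:config$/ }, args => {
      return { path: args.path, namespace: "env" };
    });
    // anything resolved into the "env" namespace is loaded as a plain object module
    build.onLoad({ filter: /.*/, namespace: "env" }, () => {
      return {
        exports: { mode: process.env.NODE_ENV ?? "development" },
        loader: "object",
      };
    });
  },
});
```
With that registered, `import { mode } from "env:config"` would resolve through the plugin rather than the filesystem.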

View File

@@ -1,3 +1,7 @@
{% callout %}
**Note** — Available in Bun v0.6.0 and later.
{% /callout %}
Bun's bundler API is inspired heavily by [esbuild](https://esbuild.github.io/). Migrating to Bun's bundler from esbuild should be relatively painless. This guide will briefly explain why you might consider migrating to Bun's bundler and provide a side-by-side API comparison reference for those who are already familiar with esbuild's API.
There are a few behavioral differences to note.
@@ -125,13 +129,13 @@ In Bun's CLI, simple boolean flags like `--minify` do not accept an argument. Ot
- `--target`
- n/a
- Not supported. Bun's bundler performs no syntactic down-leveling at this time.
- Not supported. Bun's bundler performs no syntactic downleveling at this time.
---
- `--watch`
- `--watch`
- No differences
- n/a
- Not applicable
---
@@ -893,7 +897,7 @@ const myPlugin: BunPlugin = {
};
```
The `builder` object provides some methods for hooking into parts of the bundling process. Bun implements `onResolve` and `onLoad`; it does not yet implement the esbuild hooks `onStart`, `onEnd`, and `onDispose`, and `resolve` utilities. `initialOptions` is partially implemented, being read-only and only having a subset of esbuild's options; use [`config`](/docs/bundler/plugins) (same thing but with Bun's `BuildConfig` format) instead.
The `builder` object provides some methods for hooking into parts of the bundling process. Bun implements `onResolve` and `onLoad`; it does not yet implement the esbuild hooks `onStart`, `onEnd`, and `onDispose`, and `resolve` utilities. `initialOptions` is partially implemented, being read-only and only having a subset of esbuild's options; use [`config`](/docs/bundler/plugins#reading-the-config) (same thing but with Bun's `BuildConfig` format) instead.
```ts
import type { BunPlugin } from "bun";

View File

@@ -13,6 +13,8 @@ If you pass `-y` or `--yes`, it will assume you want to continue without asking
At the end, it runs `bun install` to install `bun-types`.
Added in Bun v0.1.7.
#### How is `bun init` different than `bun create`?
`bun init` is for blank projects. `bun create` applies templates.

View File

@@ -21,7 +21,7 @@ Configuring with `bunfig.toml` is optional. Bun tries to be zero configuration i
# Scope name The value can be a URL string or an object
"@mybigcompany" = { token = "123456", url = "https://registry.mybigcompany.com" }
# URL is optional and falls back to the default registry
# URL is optional and fallsback to the default registry
# The "@" in the scope is optional
mybigcompany2 = { token = "123456" }
@@ -47,7 +47,7 @@ registry = "https://registry.yarnpkg.com/"
# Install for production? This is the equivalent to the "--production" CLI argument
production = false
# Disallow changes to lockfile? This is the equivalent to the "--frozen-lockfile" CLI argument
# Disallow changes to lockfile? This is the equivalent to the "--fozen-lockfile" CLI argument
frozenLockfile = false
# Don't actually install
@@ -62,9 +62,6 @@ dev = true
# Install peerDependencies (default: false)
peer = false
# Whether to use the github REST api (unauthenticated)
github.api = true
# When using `bun install -g`, install packages here
globalDir = "~/.bun/install/global"
@@ -92,6 +89,12 @@ disableManifest = false
# Note: it does not load the lockfile, it just converts bun.lockb into a yarn.lock
print = "yarn"
# Path to read bun.lockb from
path = "bun.lockb"
# Path to save bun.lockb to
savePath = "bun.lockb"
# Save the lockfile to disk
save = true
@@ -139,6 +142,8 @@ export interface Cache {
export interface Lockfile {
print?: "yarn";
path: string;
savePath: string;
save: boolean;
}
```
@@ -151,6 +156,7 @@ Environment variables have a higher priority than `bunfig.toml`.
| -------------------------------- | ------------------------------------------------------------- |
| BUN_CONFIG_REGISTRY | Set an npm registry (default: <https://registry.npmjs.org>) |
| BUN_CONFIG_TOKEN | Set an auth token (currently does nothing) |
| BUN_CONFIG_LOCKFILE_SAVE_PATH | File path to save the lockfile to (default: bun.lockb) |
| BUN_CONFIG_YARN_LOCKFILE | Save a Yarn v1-style yarn.lock |
| BUN_CONFIG_LINK_NATIVE_BINS | Point `bin` in package.json to a platform-specific dependency |
| BUN_CONFIG_SKIP_SAVE_LOCKFILE | Don't save a lockfile |

View File

@@ -1,6 +1,6 @@
Bundling is currently an important mechanism for building complex web apps.
Modern apps typically consist of a large number of files and package dependencies. Despite the fact that modern browsers support [ES Module](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules) imports, it's still too slow to fetch each file via individual HTTP requests. _Bundling_ is the process of concatenating several source files into a single large file that can be loaded in a single request.
Modern apps typically consist of a large number of files and package dependencies. Despite the fact that modern browsers support [ES Module](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules) imports, it's still too slow to fetch each file via inidividual HTTP requests. _Bundling_ is the process of concatenating several source files into a single large file that can be loaded in a single request.
{% callout %}
**On bundling** — Despite recent advances like [`modulepreload`](https://developer.mozilla.org/en-US/docs/Web/HTML/Attributes/rel/modulepreload) and [HTTP/3](https://en.wikipedia.org/wiki/HTTP/3), bundling is still the most performant approach.

View File

@@ -50,7 +50,7 @@ Running `bun create` performs the following steps:
- Initialize a fresh Git repo. Opt out with the `--no-git` flag.
- Run the template's configured `start` script, if defined.
<!-- ## Official templates
## Official templates
The following official templates are available.
@@ -73,7 +73,7 @@ Welcome to bun! Create a new project by pasting any of the following:
{% callout %}
⚡️ **Speed** — At the time of writing, `bun create react app` runs ~11x faster on a M1 Macbook Pro than `yarn create react-app app`.
{% /callout %} -->
{% /callout %}
## GitHub repos

View File

@@ -23,7 +23,9 @@ sudo apt install --install-recommends linux-generic-hwe-20.04
{% /details %}
## `bun install`
## Manage dependencies
### `bun install`
To install all dependencies of a project:
@@ -41,7 +43,7 @@ Running `bun install` will:
- **Run** your project's `{pre|post}install` and `{pre|post}prepare` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Write** a `bun.lockb` lockfile to the project root.
To install in production mode (i.e. without `devDependencies` or `optionalDependencies`):
To install in production mode (i.e. without `devDependencies`):
```bash
$ bun install --production
@@ -67,7 +69,7 @@ $ bun install --silent # no logging
```
{% details summary="Configuring behavior" %}
The default behavior of `bun install` can be configured in `bunfig.toml`:
The default behavior of `bun install` can be configured in `bun.toml`:
```toml
[install]
@@ -89,14 +91,11 @@ frozenLockfile = false
# equivalent to `--dry-run` flag
dryRun = false
# whether to use the github REST api (unauthenticated)
github.api = true
```
{% /details %}
## `bun add`
### `bun add`
To add a particular package:
@@ -115,7 +114,7 @@ $ bun add zod@latest
To add a package as a dev dependency (`"devDependencies"`):
```bash
$ bun add --dev @types/react
$ bun add --development @types/react
$ bun add -d @types/react
```
@@ -178,7 +177,7 @@ To view a complete list of options for a given command:
$ bun add --help
```
## `bun remove`
### `bun remove`
To remove a dependency:
@@ -186,17 +185,7 @@ To remove a dependency:
$ bun remove preact
```
## `bun update`
To update all dependencies to the latest version _that's compatible with the version range specified in your `package.json`_:
```sh
$ bun update
```
This will not edit your `package.json`. There's currently no command to force-update all dependencies to the latest version regardless of version ranges.
## `bun link`
## Local packages (`bun link`)
Use `bun link` in a local directory to register the current package as a "linkable" package.
@@ -208,7 +197,7 @@ $ cat package.json
"version": "1.0.0"
}
$ bun link
bun link v1.x (7416672e)
bun link v0.5.7 (7416672e)
Success! Registered "cool-pkg"
To use cool-pkg in a project, run:
@@ -300,7 +289,7 @@ Bun supports a variety of protocols, including [`github`](https://docs.npmjs.com
## Tarball dependencies
A package name can correspond to a publicly hosted `.tgz` file. During `bun install`, Bun will download and install the package from the specified tarball URL, rather than from the package registry.
A package name can correspond to a publically hosted `.tgz` file. During `bun install`, Bun will download and install the package from the specified tarball URL, rather than from the package registry.
```json#package.json
{

View File

@@ -1,28 +1,5 @@
The `bun` CLI can be used to execute JavaScript/TypeScript files, `package.json` scripts, and [executable packages](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#bin).
## Performance
Bun is designed to start fast and run fast.
Under the hood Bun uses the [JavaScriptCore engine](https://developer.apple.com/documentation/javascriptcore), which is developed by Apple for Safari. In most cases, the startup and running performance is faster than V8, the engine used by Node.js and Chromium-based browsers. Its transpiler and runtime are written in Zig, a modern, high-performance language. On Linux, this translates into startup times [4x faster](https://twitter.com/jarredsumner/status/1499225725492076544) than Node.js.
{% table %}
---
- `bun hello.js`
- `5.2ms`
---
- `node hello.js`
- `25.1ms`
{% /table %}
{% caption content="Running a simple Hello World script on Linux" /%}
<!-- {% image src="/images/bun-run-speed.jpeg" caption="Bun vs Node.js vs Deno running Hello World" /%} -->
<!-- ## Speed -->
<!--
@@ -49,11 +26,10 @@ $ bun run index.ts
$ bun run index.tsx
```
Alternatively, you can omit the `run` keyword and use the "naked" command; it behaves identically.
The "naked" `bun` command is equivalent to `bun run`.
```bash
$ bun index.tsx
$ bun index.js
```
### `--watch`
@@ -64,19 +40,12 @@ To run a file in watch mode, use the `--watch` flag.
$ bun --watch run index.tsx
```
{% callout %}
**Note** — When using `bun run`, put Bun flags like `--watch` immediately after `bun`.
```bash
$ bun --watch run dev # ✔️ do this
$ bun run dev --watch # ❌ don't do this
```
Flags that occur at the end of the command will be ignored and passed through to the `"dev"` script itself.
{% /callout %}
### `--smol`
{% callout %}
Added in Bun v0.7.0.
{% /callout %}
In memory-constrained environments, use the `--smol` flag to reduce memory usage at a cost to performance.
```bash
@@ -89,10 +58,6 @@ $ bun --smol run index.tsx
Compare to `npm run <script>` or `yarn <script>`
{% /note %}
```sh
$ bun [bun flags] run <script> [script flags]
```
Your `package.json` can define a number of named `"scripts"` that correspond to shell commands.
```jsonc
@@ -105,10 +70,10 @@ Your `package.json` can define a number of named `"scripts"` that correspond to
}
```
Use `bun run <script>` to execute these scripts.
Use `bun <script>` to execute these scripts.
```bash
$ bun run clean
$ bun clean
$ rm -rf dist && echo 'Done.'
Cleaning...
Done.
@@ -143,18 +108,22 @@ quickstart scripts:
Bun respects lifecycle hooks. For instance, `bun run clean` will execute `preclean` and `postclean`, if defined. If the `pre<script>` fails, Bun will not execute the script itself.
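For example, a `package.json` along these lines (the echo commands are placeholders) would print a message before and after `clean` runs:
```jsonc
{
  "scripts": {
    "preclean": "echo 'Cleaning...'",
    "clean": "rm -rf dist",
    "postclean": "echo 'Done.'"
  }
}
```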
### `--bun`
## Environment variables
It's common for `package.json` scripts to reference locally-installed CLIs like `vite` or `next`. These CLIs are often JavaScript files marked with a [shebang](<https://en.wikipedia.org/wiki/Shebang_(Unix)>) to indicate that they should be executed with `node`.
Bun automatically loads environment variables from `.env` files before running a file, script, or executable. The following files are checked, in order:
```js
#!/usr/bin/env node
1. `.env.local` (first)
2. `NODE_ENV` === `"production"` ? `.env.production` : `.env.development`
3. `.env`
// do stuff
```
To debug environment variables, run `bun run env` to view a list of resolved environment variables.
By default, Bun respects this shebang and executes the script with `node`. However, you can override this behavior with the `--bun` flag. For Node.js-based CLIs, this will run the CLI with Bun instead of Node.js.
## Performance
```bash
$ bun run --bun vite
```
Bun is designed to start fast and run fast.
Under the hood Bun uses the [JavaScriptCore engine](https://developer.apple.com/documentation/javascriptcore), which is developed by Apple for Safari. In most cases, the startup and running performance is faster than V8, the engine used by Node.js and Chromium-based browsers. Its transpiler and runtime are written in Zig, a modern, high-performance language. On Linux, this translates into startup times [4x faster](https://twitter.com/jarredsumner/status/1499225725492076544) than Node.js.
{% image src="/images/bun-run-speed.jpeg" caption="Bun vs Node.js vs Deno running Hello World" /%}
<!-- If no `node_modules` directory is found in the working directory or above, Bun will abandon Node.js-style module resolution in favor of the `Bun module resolution algorithm`. Under Bun-style module resolution, all packages are _auto-installed_ on the fly into a [global module cache](/docs/install/cache). For full details on this algorithm, refer to [Runtime > Modules](/docs/runtime/modules). -->

View File

@@ -1,4 +1,4 @@
Bun ships with a fast, built-in, Jest-compatible test runner. Tests are executed with the Bun runtime, and support the following features.
Bun ships with a fast built-in test runner. Tests are executed with the Bun runtime, and support the following features.
- TypeScript and JSX
- Lifecycle hooks
@@ -7,10 +7,6 @@ Bun ships with a fast, built-in, Jest-compatible test runner. Tests are executed
- Watch mode with `--watch`
- Script pre-loading with `--preload`
{% callout %}
Bun aims for compatibility with Jest, but not everything is implemented. To track compatibility, see [this tracking issue](https://github.com/oven-sh/bun/issues/1825).
{% /callout %}
## Run tests
```bash
@@ -34,50 +30,14 @@ The runner recursively searches the working directory for files that match the f
- `*.spec.{js|jsx|ts|tsx}`
- `*_spec.{js|jsx|ts|tsx}`
You can filter the set of _test files_ to run by passing additional positional arguments to `bun test`. Any test file with a path that matches one of the filters will run. Commonly, these filters will be file or directory names; glob patterns are not yet supported.
You can filter the set of tests to run by passing additional positional arguments to `bun test`. Any file in the directory with an _absolute path_ that contains one of the filters will run. Commonly, these filters will be file or directory names; glob patterns are not yet supported.
```bash
$ bun test <filter> <filter> ...
```
To filter by _test name_, use the `-t`/`--test-name-pattern` flag.
```sh
# run all tests or test suites with "addition" in the name
$ bun test --test-name-pattern addition
```
The test runner runs all tests in a single process. It loads all `--preload` scripts (see [Lifecycle](/docs/test/lifecycle) for details), then runs all tests. If a test fails, the test runner will exit with a non-zero exit code.
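For reference, a test file is plain TypeScript (or JavaScript) that imports from `bun:test`; a minimal file matching the `addition` name filter above might look like this:
```ts
import { test, expect } from "bun:test";

test("addition", () => {
  // a trivial assertion, just to show the shape of a test
  expect(2 + 2).toBe(4);
});
```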
## Timeouts
Use the `--timeout` flag to specify a _per-test_ timeout in milliseconds. If a test times out, it will be marked as failed. The default value is `5000`.
```bash
# default value is 5000
$ bun test --timeout 20
```
## Rerun tests
Use the `--rerun-each` flag to run each test multiple times. This is useful for detecting flaky or non-deterministic test failures.
```sh
$ bun test --rerun-each 100
```
## Bail out with `--bail`
Use the `--bail` flag to abort the test run early after a pre-determined number of test failures. By default Bun will run all tests and report all failures, but sometimes in CI environments it's preferable to terminate earlier to reduce CPU usage.
```sh
# bail after 1 failure
$ bun test --bail
# bail after 10 failures
$ bun test --bail 10
```
## Watch mode
Similar to `bun run`, you can pass the `--watch` flag to `bun test` to watch for changes and re-run tests.
@@ -107,11 +67,7 @@ See [Test > Lifecycle](/docs/test/lifecycle) for complete documentation.
## Mocks
{% callout %}
Module mocking (`jest.mock()`) is not yet supported. Track support for it [here](https://github.com/oven-sh/bun/issues/5394).
{% /callout %}
Create mock functions with the `mock` function. Mocks are automatically reset between tests.
Create mocks with the `mock` function. Mocks are automatically reset between tests.
```ts
import { test, expect, mock } from "bun:test";
@@ -125,38 +81,11 @@ test("random", async () => {
});
```
Alternatively, you can use `jest.fn()`, it behaves identically.
```ts-diff
- import { test, expect, mock } from "bun:test";
+ import { test, expect, jest } from "bun:test";
- const random = mock(() => Math.random());
+ const random = jest.fn(() => Math.random());
```
See [Test > Mocks](/docs/test/mocks) for complete documentation.
## Snapshot testing
Snapshots are supported by `bun test`.
```ts
// example usage of toMatchSnapshot
import { test, expect } from "bun:test";
test("snapshot", async () => {
expect({ a: 1 }).toMatchSnapshot();
});
```
To update snapshots, use the `--update-snapshots` flag.
```sh
$ bun test --update-snapshots
```
See [Test > Snapshots](/docs/test/snapshots) for complete documentation.
Snapshots are supported by `bun test`. See [Test > Snapshots](/docs/test/snapshots) for complete documentation.
## UI & DOM testing

View File

@@ -49,7 +49,7 @@ This is useful for preventing flash of unstyled content.
## With `bun bun`
Bun bundles `.css` files imported via `@import` into a single file. It doesn't auto-prefix or minify CSS today. Multiple `.css` files imported in one JavaScript file will _not_ be bundled into one file. You'll have to import those from a `.css` file.
Bun bundles `.css` files imported via `@import` into a single file. It doesn't autoprefix or minify CSS today. Multiple `.css` files imported in one JavaScript file will _not_ be bundled into one file. You'll have to import those from a `.css` file.
This input:

151
docs/dev/frameworks.md Normal file
View File

@@ -0,0 +1,151 @@
{% callout %}
**Warning** — This will soon have breaking changes. It was designed when Bun was mostly a dev server and not a JavaScript runtime.
{% /callout %}
Frameworks preconfigure Bun to enable developers to use Bun with their existing tooling.
Frameworks are configured via the `framework` object in the `package.json` of the framework (not in the application's `package.json`):
Here is an example:
```json
{
"name": "bun-framework-next",
"version": "0.0.0-18",
"description": "",
"framework": {
"displayName": "Next.js",
"static": "public",
"assetPrefix": "_next/",
"router": {
"dir": ["pages", "src/pages"],
"extensions": [".js", ".ts", ".tsx", ".jsx"]
},
"css": "onimportcss",
"development": {
"client": "client.development.tsx",
"fallback": "fallback.development.tsx",
"server": "server.development.tsx",
"css": "onimportcss",
"define": {
"client": {
".env": "NEXT_PUBLIC_",
"defaults": {
"process.env.__NEXT_TRAILING_SLASH": "false",
"process.env.NODE_ENV": "\"development\"",
"process.env.__NEXT_ROUTER_BASEPATH": "''",
"process.env.__NEXT_SCROLL_RESTORATION": "false",
"process.env.__NEXT_I18N_SUPPORT": "false",
"process.env.__NEXT_HAS_REWRITES": "false",
"process.env.__NEXT_ANALYTICS_ID": "null",
"process.env.__NEXT_OPTIMIZE_CSS": "false",
"process.env.__NEXT_CROSS_ORIGIN": "''",
"process.env.__NEXT_STRICT_MODE": "false",
"process.env.__NEXT_IMAGE_OPTS": "null"
}
},
"server": {
".env": "NEXT_",
"defaults": {
"process.env.__NEXT_TRAILING_SLASH": "false",
"process.env.__NEXT_OPTIMIZE_FONTS": "false",
"process.env.NODE_ENV": "\"development\"",
"process.env.__NEXT_OPTIMIZE_IMAGES": "false",
"process.env.__NEXT_OPTIMIZE_CSS": "false",
"process.env.__NEXT_ROUTER_BASEPATH": "''",
"process.env.__NEXT_SCROLL_RESTORATION": "false",
"process.env.__NEXT_I18N_SUPPORT": "false",
"process.env.__NEXT_HAS_REWRITES": "false",
"process.env.__NEXT_ANALYTICS_ID": "null",
"process.env.__NEXT_CROSS_ORIGIN": "''",
"process.env.__NEXT_STRICT_MODE": "false",
"process.env.__NEXT_IMAGE_OPTS": "null",
"global": "globalThis",
"window": "undefined"
}
}
}
}
}
}
```
Here are type definitions:
```ts
type Framework = Environment & {
// This changes what's printed in the console on load
displayName?: string;
// This allows a prefix to be added (and ignored) to requests.
// Useful for integrating an existing framework that expects internal routes to have a prefix
// e.g. "_next"
assetPrefix?: string;
development?: Environment;
production?: Environment;
// The directory used for serving unmodified assets like fonts and images
// Defaults to "public" if exists, else "static", else disabled.
static?: string;
// "onimportcss" disables the automatic "onimportcss" feature
// If the framework does routing, you may want to handle CSS manually
// "facade" removes CSS imports from JavaScript files,
// and replaces an imported object with a proxy that mimics CSS module support without doing any class renaming.
css?: "onimportcss" | "facade";
// Bun's filesystem router
router?: Router;
};
type Define = {
// By passing ".env", Bun will automatically load .env.local, .env.development, and .env if exists in the project root
// (in addition to the processes environment variables)
// When "*", all environment variables will be automatically injected into the JavaScript loader
// When a string like "NEXT_PUBLIC_", only environment variables starting with that prefix will be injected
".env": string | "*";
// These environment variables will be injected into the JavaScript loader
// These are the equivalent of Webpack's resolve.alias and esbuild's --define.
// Values are parsed as JSON, so they must be valid JSON. The only exception is that '' is a valid string, to simplify writing stringified JSON in JSON.
// If not set, `process.env.NODE_ENV` will be transformed into "development".
"defaults": Record<string, string>;
};
type Environment = {
// This is a wrapper for the client-side entry point for a route.
// This allows frameworks to run initialization code on pages.
client: string;
// This is a wrapper for the server-side entry point for a route.
// This allows frameworks to run initialization code on pages.
server: string;
// This runs when "server" code fails to load due to an exception.
fallback: string;
// This is how environment variables and .env is configured.
define?: Define;
};
// Bun's filesystem router
// Currently, Bun supports pages by either an absolute match or a parameter match.
// pages/index.tsx will be executed on navigation to "/" and "/index"
// pages/posts/[id].tsx will be executed on navigation to "/posts/123"
// Routes & parameters are automatically passed to `fallback` and `server`.
type Router = {
// This determines the folder to look for pages
dir: string[];
// These are the allowed file extensions for pages.
extensions?: string[];
};
```
To use a framework, you pass `bun bun --use package-name`.
Your framework's `package.json` `name` should start with `bun-framework-`. This is so that people can type something like `bun bun --use next` and it will check `bun-framework-next` first. This is similar to how Babel plugins tend to start with `babel-plugin-`.
For developing frameworks, you can also do `bun bun --use ./relative-path-to-framework`.
If youre interested in adding a framework integration, please reach out. Theres a lot here, and its not entirely documented yet.

33
docs/dev/nextjs.md Normal file
View File

@@ -0,0 +1,33 @@
To create a new Next.js app with bun:
```bash
$ bun create next ./app
$ cd app
$ bun dev # start dev server
```
To use an existing Next.js app with bun:
```bash
$ bun add bun-framework-next
$ echo "framework = 'next'" > bunfig.toml
$ bun bun # bundle dependencies
$ bun dev # start dev server
```
Many Next.js features are supported, but not all.
Here's what doesn't work yet:
- `getStaticPaths`
- same-origin `fetch` inside of `getStaticProps` or `getServerSideProps`
- locales, zones, `assetPrefix` (workaround: change `--origin \"http://localhost:3000/assetPrefixInhere\"`)
- `next/image` is polyfilled to a regular `<img src>` tag.
- `proxy` and anything else in `next.config.js`
- API routes, middleware (middleware is easier to support, though! Similar SSR API)
- styled-jsx (technically not Next.js, but often used with it)
- React Server Components
When using Next.js, Bun automatically reads configuration from `.env.local`, `.env.development`, and `.env` (in that order). `process.env.NEXT_PUBLIC_` and `process.env.NEXT_` are automatically replaced via `--define`.
Currently, any time you import new dependencies from `node_modules`, you will need to re-run `bun bun --use next`. This will eventually be automatic.

36
docs/ecosystem/buchta.md Normal file
View File

@@ -0,0 +1,36 @@
[Buchta](https://buchtajs.com) is a fullstack framework designed to take full advantage of Bun's strengths. It currently supports Preact and Svelte.
To get started:
```bash
$ bunx buchta init myapp
Project templates:
- svelte
- default
- preact
Name of template: preact
Do you want TSX? y
Do you want SSR? y
Enable livereload? y
Buchta Preact project was setup successfully!
$ cd myapp
$ bun install
$ bunx buchta serve
```
To implement a simple HTTP server with Buchta:
```ts#server.ts
import { Buchta, type BuchtaRequest, type BuchtaResponse } from "buchta";
const app = new Buchta();
app.get("/api/hello/", (req: BuchtaRequest, res: BuchtaResponse) => {
res.send("Hello, World!");
});
app.run();
```
For more information, refer to Buchta's [documentation](https://buchtajs.com/docs/).

View File

@@ -1,7 +1,7 @@
[Elysia](https://elysiajs.com) is a Bun-first, performance-focused web framework that takes full advantage of Bun's HTTP, file system, and hot reloading APIs.
Designed with TypeScript in mind, you don't need to understand TypeScript to gain the benefit of TypeScript with Elysia. The library understands what you want and automatically infers the type from your code.
⚡️ Elysia is [one of the fastest Bun web frameworks](https://github.com/SaltyAom/bun-http-framework-benchmark)
:zap: Elysia is [one of the fastest Bun web frameworks](https://github.com/SaltyAom/bun-http-framework-benchmark)
```ts#server.ts
import { Elysia } from 'elysia'
@@ -9,7 +9,7 @@ import { Elysia } from 'elysia'
const app = new Elysia()
.get('/', () => 'Hello Elysia')
.listen(8080)
console.log(`🦊 Elysia is running at on port ${app.server.port}...`)
```

View File

@@ -22,7 +22,7 @@ app.listen(port, () => {
Bun implements the [`node:http`](https://nodejs.org/api/http.html) and [`node:https`](https://nodejs.org/api/https.html) modules that these libraries rely on. These modules can also be used directly, though [`Bun.serve`](/docs/api/http) is recommended for most use cases.
{% callout %}
**Note** — Refer to the [Runtime > Node.js APIs](/docs/runtime/nodejs-apis#node-http) page for more detailed compatibility information.
**Note** — Refer to the [Runtime > Node.js APIs](/docs/runtime/nodejs-apis#node_http) page for more detailed compatibility information.
{% /callout %}
```ts

View File

@@ -1,16 +1,15 @@
[Hono](https://github.com/honojs/hono) is a lightweight ultrafast web framework designed for the edge.
```ts
import { Hono } from "hono";
const app = new Hono();
import { Hono } from 'hono'
const app = new Hono()
app.get("/", c => c.text("Hono!"));
app.get('/', (c) => c.text('Hono!'))
export default app;
export default app
```
Get started with `bun create` or follow Hono's [Bun quickstart](https://hono.dev/getting-started/bun).
```bash
$ bun create hono ./myapp
$ cd myapp

View File

@@ -2,7 +2,7 @@
name: Convert a Blob to a ReadableStream
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, including `.stream()`. This returns `Promise<ReadableStream>`.
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, inluding `.stream()`. This returns `Promise<ReadableStream>`.
```ts
const blob = new Blob(["hello world"]);

View File

@@ -2,7 +2,7 @@
name: Convert a Blob to a string
---
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, including `.text()`.
The [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) class provides a number of methods for consuming its contents in different formats, inluding `.text()`.
```ts
const blob = new Blob(["hello world"]);

View File

@@ -1,72 +0,0 @@
---
name: Build an app with Astro and Bun
---
Initialize a fresh Astro app with `bun create astro`. The `create-astro` package detects when you are using `bunx` and will automatically install dependencies using `bun`.
```sh
$ bun create astro
╭─────╮ Houston:
│ ◠ ◡ ◠ We're glad to have you on board.
╰─────╯
astro v3.1.4 Launch sequence initiated.
dir Where should we create your new project?
./fumbling-field
tmpl How would you like to start your new project?
Use blog template
✔ Template copied
deps Install dependencies?
Yes
✔ Dependencies installed
ts Do you plan to write TypeScript?
Yes
use How strict should TypeScript be?
Strict
✔ TypeScript customized
git Initialize a new git repository?
Yes
✔ Git initialized
next Liftoff confirmed. Explore your project!
Enter your project directory using cd ./fumbling-field
Run `bun run dev` to start the dev server. CTRL+C to stop.
Add frameworks like react or tailwind using astro add.
Stuck? Join us at https://astro.build/chat
╭─────╮ Houston:
│ ◠ ◡ ◠ Good luck out there, astronaut! 🚀
╰─────╯
```
---
Start the dev server with `bunx`.
By default, Bun will run the dev server with Node.js. To use the Bun runtime instead, use the `--bun` flag.
```sh
$ bunx --bun astro dev
🚀 astro v3.1.4 started in 200ms
┃ Local http://localhost:4321/
┃ Network use --host to expose
```
---
Open [http://localhost:4321](http://localhost:4321) with your browser to see the result. Astro will hot-reload your app as you edit your source files.
{% image src="https://i.imgur.com/Dswiu6w.png" caption="An Astro v3 starter app running on Bun" %}
---
Refer to the [Astro docs](https://docs.astro.build/en/getting-started/) for complete documentation.

View File

@@ -2,7 +2,7 @@
name: Create a Discord bot
---
Discord.js works out of the box with Bun. Let's write a simple bot. First create a directory and initialize it with `bun init`.
Discord.js works [out of the box](https://bun.sh/blog/bun-v0.6.7) with Bun. Let's write a simple bot. First create a directory and initialize it with `bun init`.
```bash
mkdir my-bot
@@ -74,4 +74,4 @@ Ready! Logged in as my-bot#1234
---
You're up and running with a bare-bones Discord.js bot! This is a basic guide to setting up your bot with Bun; we recommend the [official discord.js docs](https://discordjs.guide/) for complete information on the `discord.js` API.
You're up and running with a bare-bones Discord.js bot! This is a basic guide to setting up your bot with Bun; we recommend the [official Discord docs](https://discordjs.guide/) for complete information on the `discord.js` API.

View File

@@ -1,31 +0,0 @@
---
name: Build an HTTP server using Elysia and Bun
---
[Elysia](https://elysiajs.com) is a Bun-first, performance-focused web framework that takes full advantage of Bun's HTTP, file system, and hot reloading APIs. Get started with `bun create`.
```bash
$ bun create elysia myapp
$ cd myapp
$ bun run dev
```
---
To define a simple HTTP route and start a server with Elysia:
```ts#server.ts
import { Elysia } from 'elysia'
const app = new Elysia()
.get('/', () => 'Hello Elysia')
.listen(8080)
console.log(`🦊 Elysia is running at on port ${app.server?.port}...`)
```
---
Elysia is a full-featured server framework with Express-like syntax, type inference, middleware, file uploads, and plugins for JWT authentication, tRPC, and more. It's also one of the [fastest Bun web frameworks](https://github.com/SaltyAom/bun-http-framework-benchmark).
Refer to the Elysia [documentation](https://elysiajs.com/quick-start.html) for more information.

View File

@@ -1,40 +0,0 @@
---
name: Build an HTTP server using Express and Bun
---
Express and other major Node.js HTTP libraries should work out of the box. Bun implements the [`node:http`](https://nodejs.org/api/http.html) and [`node:https`](https://nodejs.org/api/https.html) modules that these libraries rely on.
{% callout %}
Refer to the [Runtime > Node.js APIs](/docs/runtime/nodejs-apis#node-http) page for more detailed compatibility information.
{% /callout %}
```sh
$ bun add express
```
---
To define a simple HTTP route and start a server with Express:
```ts#server.ts
import express from "express";
const app = express();
const port = 8080;
app.get("/", (req, res) => {
res.send("Hello World!");
});
app.listen(port, () => {
console.log(`Listening on port ${port}...`);
});
```
---
To start the server on `localhost`:
```sh
$ bun server.ts
```

View File

@@ -1,39 +0,0 @@
---
name: Build an HTTP server using Hono and Bun
---
[Hono](https://github.com/honojs/hono) is a lightweight ultrafast web framework designed for the edge.
```ts
import { Hono } from "hono";
const app = new Hono();
app.get("/", c => c.text("Hono!"));
export default app;
```
---
Use `create-hono` to get started with one of Hono's project templates. Select `bun` when prompted for a template.
```bash
$ bun create hono myapp
✔ Which template do you want to use? bun
cloned honojs/starter#main to /path/to/myapp
✔ Copied project files
$ cd myapp
$ bun install
```
---
Then start the dev server and visit [localhost:3000](http://localhost:3000).
```bash
$ bun run dev
```
---
Refer to Hono's guide on [getting started with Bun](https://hono.dev/getting-started/bun) for more information.

View File

@@ -1,5 +1,5 @@
---
name: Read and write data to MongoDB using Mongoose and Bun
name: Use MongoDB and Mongoose
---
MongoDB and Mongoose work out of the box with Bun. This guide assumes you've already installed MongoDB and are running it as a background process/service on your development machine. Follow [this guide](https://www.mongodb.com/docs/manual/installation/) for details.
@@ -33,18 +33,11 @@ const animalSchema = new mongoose.Schema(
{
name: {type: String, required: true},
sound: {type: String, required: true},
},
{
methods: {
speak() {
console.log(`${this.sound}!`);
},
},
}
);
export type Animal = mongoose.InferSchemaType<typeof animalSchema>;
export const Animal = mongoose.model('Animal', animalSchema);
export const Animal = mongoose.model('Kitten', animalSchema);
```
---
@@ -69,13 +62,13 @@ await cow.save(); // saves to the database
const animals = await Animal.find();
animals[0].speak(); // logs "Moo!"
// disconnect
// disconect
await mongoose.disconnect();
```
---
Let's run this with `bun run`.
Lets run this with `bun run`.
```bash
$ bun run index.ts

View File

@@ -1,44 +0,0 @@
---
name: Build an app with Next.js and Bun
---
{% callout %}
The Next.js [App Router](https://nextjs.org/docs/app) currently relies on Node.js APIs that Bun does not yet implement. The guide below uses Bun to initialize a project and install dependencies, but it uses Node.js to run the dev server.
{% /callout %}
---
Initialize a Next.js app with `create-next-app`. This automatically installs dependencies using `npm`.
```sh
$ bun create next-app
✔ What is your project named? … my-app
✔ Would you like to use TypeScript with this project? … No / Yes
✔ Would you like to use ESLint with this project? … No / Yes
✔ Would you like to use `src/` directory with this project? … No / Yes
✔ Would you like to use experimental `app/` directory with this project? … No / Yes
✔ What import alias would you like configured? … @/*
Creating a new Next.js app in /path/to/my-app.
```
---
To start the dev server with Bun, run `bun --bun run dev` from the project root.
```sh
$ cd my-app
$ bun --bun run dev
```
---
To run the dev server with Node.js instead, omit `--bun`.
```sh
$ cd my-app
$ bun run dev
```
---
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result. Any changes you make to `(pages/app)/index.tsx` will be hot-reloaded in the browser.

Some files were not shown because too many files have changed in this diff Show More