Mirror of https://github.com/oven-sh/bun, synced 2026-03-02 05:21:05 +01:00

Compare commits: 7 commits, dave/reada ... jarred/fas
| Author | SHA1 | Date |
|---|---|---|
| | 69b3653b7b | |
| | 628de36ad3 | |
| | dd39c1a0f6 | |
| | 47ce51c43b | |
| | 5dcc621224 | |
| | 98ae204cf7 | |
| | 4113da1b2e | |
@@ -1,30 +0,0 @@
# Uploads the latest CI workflow to Buildkite.
# https://buildkite.com/docs/pipelines/defining-steps
#
# Changes to this file must be manually edited here:
# https://buildkite.com/bun/bun/settings/steps
steps:
  - if: "build.pull_request.repository.fork"
    block: ":eyes:"
    prompt: "Did you review the PR?"
    blocked_state: "running"

  - label: ":pipeline:"
    command: "buildkite-agent pipeline upload .buildkite/ci.yml"
    agents:
      queue: "build-linux"

  - if: "build.branch == 'main' && !build.pull_request.repository.fork"
    label: ":github:"
    agents:
      queue: "test-darwin"
    depends_on:
      - "darwin-aarch64-build-bun"
      - "darwin-x64-build-bun"
      - "linux-aarch64-build-bun"
      - "linux-x64-build-bun"
      - "linux-x64-baseline-build-bun"
      - "windows-x64-build-bun"
      - "windows-x64-baseline-build-bun"
    command:
      - ".buildkite/scripts/upload-release.sh"
@@ -1,63 +0,0 @@
|
||||
## CI
|
||||
|
||||
How does CI work?
|
||||
|
||||
### Building
|
||||
|
||||
Bun is built on macOS, Linux, and Windows. The process is split into the following steps, the first 3 of which are able to run in parallel:
|
||||
|
||||
#### 1. `build-deps`
|
||||
|
||||
Builds the static libaries in `src/deps` and outputs a directory: `build/bun-deps`.
|
||||
|
||||
- on Windows, this runs the script: [`scripts/all-dependencies.ps1`](scripts/all-dependencies.ps1)
|
||||
- on macOS and Linux, this runs the script: [`scripts/all-dependencies.sh`](scripts/all-dependencies.sh)
|
||||
|
||||
#### 2. `build-zig`
|
||||
|
||||
Builds the Zig object file: `build/bun-zig.o`. Since `zig build` supports cross-compiling, this step is run on macOS aarch64 since we have observed it to be the fastest.
|
||||
|
||||
- on macOS and Linux, this runs the script: [`scripts/build-bun-zig.sh`](scripts/build-bun-zig.sh)
|
||||
|
||||
#### 3. `build-cpp`
|
||||
|
||||
Builds the C++ object file: `build/bun-cpp-objects.a`.
|
||||
|
||||
- on Windows, this runs the script: [`scripts/build-bun-cpp.ps1`](scripts/build-bun-cpp.ps1)
|
||||
- on macOS and Linux, this runs the script: [`scripts/build-bun-cpp.sh`](scripts/build-bun-cpp.sh)
|
||||
|
||||
#### 4. `link` / `build-bun`
|
||||
|
||||
After the `build-deps`, `build-zig`, and `build-cpp` steps have completed, this step links the Zig object file and C++ object file into a single binary: `bun-<os>-<arch>.zip`.
|
||||
|
||||
- on Windows, this runs the script: [`scripts/buildkite-link-bun.ps1`](scripts/buildkite-link-bun.ps1)
|
||||
- on macOS and Linux, this runs the script: [`scripts/buildkite-link-bun.sh`](scripts/buildkite-link-bun.sh)
|
||||
|
||||
To speed up the build, thare are two options:
|
||||
|
||||
- `--fast`: This disables the LTO (link-time optimization) step.
|
||||
- without `--fast`: This runs the LTO step, which is the default. The binaries that are release to Github are always built with LTO.
|
||||
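As a rough sketch, the dependency ordering of the four steps looks like this in plain shell. `run_step` here is a hypothetical stand-in for the per-platform scripts listed above; only the ordering is taken from the doc:

```shell
#!/bin/bash
# Sketch of the CI build ordering: build-deps, build-zig, and build-cpp
# are independent and run in parallel; link waits for all three.
set -euo pipefail
cd "$(mktemp -d)"

run_step() { touch "$1.done"; }  # stand-in for the real build scripts

run_step build-deps &  # -> build/bun-deps
run_step build-zig &   # -> build/bun-zig.o
run_step build-cpp &   # -> build/bun-cpp-objects.a
wait                   # all three must finish before linking

run_step link          # -> bun-<os>-<arch>.zip
```

Each completed step leaves a `<step>.done` marker; in the real pipeline, the `--fast` flag would simply skip the LTO pass inside the `link` step.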

### Testing

### FAQ

> How do I add a new CI agent?

> How do I add/modify system dependencies?

> How do I SSH into a CI agent?

### Known issues

These are things that we know about, but haven't fixed or optimized yet.

- There is no `scripts/build-bun-zig.ps1` for Windows.

- The `build-deps` step does not cache in CI, so it re-builds each time (though it does use ccache). It attempts to check the `BUN_DEPS_CACHE_DIR` environment variable, but for some reason that doesn't work.

- Windows and Linux machines sometimes take 1-2 minutes to start tests, because robobun waits until the job is scheduled before provisioning the VM. Instead, it could start provisioning during the link step, or keep a pool of idle VMs around (though it's unclear how much more expensive that would be).

- There are a limited number of macOS VMs, because they are expensive and manually provisioned, mostly through MacStadium. If wait times get too long, we can provision or buy more.

- To prevent idle machines, robobun periodically checks for idle machines and terminates them. Before doing so, it checks whether the machine is connected as an agent to Buildkite. However, the machine sometimes picks up a job in the window between the check and termination, and that job gets killed.
1523
.buildkite/ci.yml
File diff suppressed because it is too large
@@ -1,94 +0,0 @@
#!/bin/bash

set -eo pipefail

function assert_main() {
  if [[ "$BUILDKITE_PULL_REQUEST_REPO" && "$BUILDKITE_REPO" != "$BUILDKITE_PULL_REQUEST_REPO" ]]; then
    echo "error: Cannot upload release from a fork"
    exit 1
  fi
  if [ "$BUILDKITE_PULL_REQUEST" != "false" ]; then
    echo "error: Cannot upload release from a pull request"
    exit 1
  fi
  if [ "$BUILDKITE_BRANCH" != "main" ]; then
    echo "error: Cannot upload release from a branch other than main"
    exit 1
  fi
}

function assert_buildkite_agent() {
  if ! command -v buildkite-agent &> /dev/null; then
    echo "error: Cannot find buildkite-agent, please install it:"
    echo "https://buildkite.com/docs/agent/v3/install"
    exit 1
  fi
}

function assert_gh() {
  if ! command -v gh &> /dev/null; then
    echo "warning: gh is not installed, installing..."
    if command -v brew &> /dev/null; then
      brew install gh
    else
      echo "error: Cannot install gh, please install it:"
      echo "https://github.com/cli/cli#installation"
      exit 1
    fi
  fi
}

function assert_gh_token() {
  local token
  token=$(buildkite-agent secret get GITHUB_TOKEN)
  if [ -z "$token" ]; then
    echo "error: Cannot find GITHUB_TOKEN secret"
    echo ""
    echo "hint: Create a secret named GITHUB_TOKEN with a GitHub access token:"
    echo "https://buildkite.com/docs/pipelines/buildkite-secrets"
    exit 1
  fi
  export GH_TOKEN="$token"
}

function download_artifact() {
  local name=$1
  buildkite-agent artifact download "$name" .
  if [ ! -f "$name" ]; then
    echo "error: Cannot find Buildkite artifact: $name"
    exit 1
  fi
}

function upload_assets() {
  local tag=$1
  local files=("${@:2}")
  gh release upload "$tag" "${files[@]}" --clobber --repo "$BUILDKITE_REPO"
}

assert_main
assert_buildkite_agent
assert_gh
assert_gh_token

declare artifacts=(
  bun-darwin-aarch64.zip
  bun-darwin-aarch64-profile.zip
  bun-darwin-x64.zip
  bun-darwin-x64-profile.zip
  bun-linux-aarch64.zip
  bun-linux-aarch64-profile.zip
  bun-linux-x64.zip
  bun-linux-x64-profile.zip
  bun-linux-x64-baseline.zip
  bun-linux-x64-baseline-profile.zip
  bun-windows-x64.zip
  bun-windows-x64-profile.zip
  bun-windows-x64-baseline.zip
  bun-windows-x64-baseline-profile.zip
)

for artifact in "${artifacts[@]}"; do
  download_artifact "$artifact"
done

upload_assets "canary" "${artifacts[@]}"
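The release script above follows a fetch-verify-publish pattern. A minimal, self-contained sketch of that flow, where `fetch` and `publish` are hypothetical stand-ins for `buildkite-agent artifact download` and `gh release upload --clobber`:

```shell
#!/bin/bash
# Generic sketch of the upload-release flow: fetch each artifact,
# fail fast if one is missing, then publish the whole batch at once.
# `fetch` and `publish` are stand-ins, not real commands.
set -euo pipefail
cd "$(mktemp -d)"

fetch() { touch "$1"; }
publish() {
  local tag=$1
  shift
  echo "publish $# files to $tag"
}

artifacts=(bun-linux-x64.zip bun-darwin-aarch64.zip)
for artifact in "${artifacts[@]}"; do
  fetch "$artifact"
  if [ ! -f "$artifact" ]; then
    echo "error: missing artifact: $artifact" >&2
    exit 1
  fi
done

result=$(publish canary "${artifacts[@]}")
echo "$result"  # -> publish 2 files to canary
```

Verifying each artifact before publishing anything keeps the release all-or-nothing: a missing zip aborts the run instead of producing a partial release.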
4
.github/ISSUE_TEMPLATE/2-bug-report.yml
vendored
@@ -1,8 +1,6 @@
 name: 🐛 Bug Report
 description: Report an issue that should be fixed
-labels:
-  - bug
-  - needs triage
+labels: [bug]
 body:
   - type: markdown
     attributes:
11
.github/ISSUE_TEMPLATE/6-crash-report.yml
vendored
@@ -6,12 +6,17 @@ body:
   - type: markdown
     attributes:
       value: |
-        **Thank you so much** for submitting a crash report. You're helping us make Bun more reliable for everyone!
+        Thank you so much for submitting a crash report. You're helping us make Bun more reliable for everyone!
+  - type: textarea
+    attributes:
+      label: How can we reproduce the crash?
+      description: Please provide instructions on how to reproduce the crash.
   - type: textarea
     id: code
     attributes:
-      label: How can we reproduce the crash?
-      description: Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository, [Replit](https://replit.com/@replit/Bun) or [CodeSandbox](https://codesandbox.io/templates/bun)
+      label: JavaScript/TypeScript code that reproduces the crash?
+      description: If this crash happened in the Bun runtime, can you paste code we can run to reproduce the crash?
+      render: shell
   - type: textarea
     id: logs
     attributes:
@@ -1,27 +0,0 @@
name: bun install crash report
description: Report a crash in bun install
labels:
  - npm
body:
  - type: markdown
    attributes:
      value: |
        **Thank you so much** for submitting a crash report. You're helping us make Bun more reliable for everyone!
  - type: textarea
    id: repro
    attributes:
      label: How can we reproduce the crash?
      description: Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository, [Replit](https://replit.com/@replit/Bun) or [CodeSandbox](https://codesandbox.io/templates/bun)
  - type: textarea
    id: logs
    attributes:
      label: Relevant log output
      description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
      render: shell
  - type: textarea
    id: remapped_trace
    attributes:
      label: Stack Trace (bun.report)
    validations:
      required: true
43
.github/actions/bump/action.yml
vendored
@@ -1,43 +0,0 @@
name: Bump version
description: Bump the version of Bun

inputs:
  version:
    description: The most recent version of Bun.
    required: true
    type: string
  token:
    description: The GitHub token to use for creating a pull request.
    required: true
    type: string
    default: ${{ github.token }}

runs:
  using: composite
  steps:
    - name: Run Bump
      shell: bash
      id: bump
      run: |
        set -euo pipefail
        MESSAGE=$(bun ./scripts/bump.ts patch --last-version=${{ inputs.version }})
        LATEST=$(cat LATEST)
        echo "version=$LATEST" >> $GITHUB_OUTPUT
        echo "message=$MESSAGE" >> $GITHUB_OUTPUT
    - name: Create Pull Request
      uses: peter-evans/create-pull-request@v4
      with:
        add-paths: |
          CMakeLists.txt
          LATEST
        token: ${{ inputs.token }}
        commit-message: Bump version to ${{ steps.bump.outputs.version }}
        title: Bump to ${{ steps.bump.outputs.version }}
        delete-branch: true
        branch: github-actions/bump-version-${{ steps.bump.outputs.version }}--${{ github.run_id }}
        body: |
          ## What does this PR do?

          ${{ steps.bump.outputs.message }}

          Auto-bumped by [this workflow](https://github.com/oven-sh/bun/actions/workflows/release.yml)
2
.github/workflows/build-darwin.yml
vendored
@@ -265,7 +265,7 @@ jobs:
             -DCMAKE_BUILD_TYPE=Release \
             -DUSE_LTO=ON \
             -DBUN_LINK_ONLY=1 \
-            -DBUN_ZIG_OBJ_DIR="${{ runner.temp }}/release" \
+            -DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
             -DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
             -DBUN_DEPS_OUT_DIR="${{ runner.temp }}/bun-deps" \
             -DNO_CONFIGURE_DEPENDS=1
34
.github/workflows/build-windows.yml
vendored
@@ -36,15 +36,12 @@ env:
  BUN_GARBAGE_COLLECTOR_LEVEL: 1
  BUN_FEATURE_FLAG_INTERNAL_FOR_TESTING: 1
  CI: true
  USE_LTO: 1

jobs:
  build-submodules:
    name: Build Submodules
    runs-on: ${{ inputs.runs-on }}
    steps:
      - name: Install VS2022 BuildTools 17.9.7
        run: choco install -y visualstudio2022buildtools --version=117.9.7.0 --params "--add Microsoft.VisualStudio.Component.VC.Tools.x86.x64 --installChannelUri https://aka.ms/vs/17/release/180911598_-255012421/channel"
      - name: Setup Git
        run: |
          git config --global core.autocrlf false
@@ -74,11 +71,15 @@
        with:
          path: bun-deps
          key: bun-${{ inputs.tag }}-deps-${{ steps.hash.outputs.hash }}
      - if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
        name: Install LLVM
        uses: KyleMayes/install-llvm-action@8b37482c5a2997a3ab5dbf6561f8109e2eaa7d3b
        with:
          version: ${{ env.LLVM_VERSION }}
      - if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
        name: Install Ninja
        run: |
          choco install -y ninja
          choco install -y llvm --version=${{ env.LLVM_VERSION }} --force
      - if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
        name: Clone Submodules
        run: |
@@ -88,10 +89,12 @@
        env:
          CPU_TARGET: ${{ inputs.cpu }}
          CCACHE_DIR: ccache
          USE_LTO: 1
        run: |
          .\scripts\env.ps1 ${{ contains(inputs.tag, '-baseline') && '-Baseline' || '' }}
          choco install -y nasm --version=2.16.01
          Invoke-WebRequest -Uri "https://www.nasm.us/pub/nasm/releasebuilds/2.16.01/win64/nasm-2.16.01-win64.zip" -OutFile nasm.zip
          Expand-Archive nasm.zip (mkdir -Force "nasm")
          $Nasm = (Get-ChildItem "nasm")
          $env:Path += ";${Nasm}"
          $env:BUN_DEPS_OUT_DIR = (mkdir -Force "./bun-deps")
          .\scripts\all-dependencies.ps1
      - name: Save Cache
@@ -139,8 +142,6 @@
    needs: codegen
    runs-on: ${{ inputs.runs-on }}
    steps:
      - name: Install VS2022 BuildTools 17.9.7
        run: choco install -y visualstudio2022buildtools --version=117.9.7.0 --params "--add Microsoft.VisualStudio.Component.VC.Tools.x86.x64 --installChannelUri https://aka.ms/vs/17/release/180911598_-255012421/channel"
      - name: Setup Git
        run: |
          git config --global core.autocrlf false
@@ -149,10 +150,13 @@
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Install LLVM
        uses: KyleMayes/install-llvm-action@8b37482c5a2997a3ab5dbf6561f8109e2eaa7d3b
        with:
          version: ${{ env.LLVM_VERSION }}
      - name: Install Ninja
        run: |
          choco install -y ninja
          choco install -y llvm --version=${{ env.LLVM_VERSION }} --force
      - name: Setup Bun
        uses: ./.github/actions/setup-bun
        with:
@@ -174,7 +178,6 @@
        env:
          CPU_TARGET: ${{ inputs.cpu }}
          CCACHE_DIR: ccache
          USE_LTO: 1
        run: |
          # $CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
          $CANARY_REVISION = 0
@@ -184,7 +187,6 @@
          cd build
          cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
            -DNO_CODEGEN=1 `
            -DUSE_LTO=1 `
            -DNO_CONFIGURE_DEPENDS=1 `
            "-DCANARY=${CANARY_REVISION}" `
            -DBUN_CPP_ONLY=1 ${{ contains(inputs.tag, '-baseline') && '-DUSE_BASELINE_BUILD=1' || '' }}
@@ -219,8 +221,6 @@
      - build-zig
      - codegen
    steps:
      - name: Install VS2022 BuildTools 17.9.7
        run: choco install -y visualstudio2022buildtools --version=117.9.7.0 --params "--add Microsoft.VisualStudio.Component.VC.Tools.x86.x64 --installChannelUri https://aka.ms/vs/17/release/180911598_-255012421/channel"
      - name: Setup Git
        run: |
          git config --global core.autocrlf false
@@ -229,10 +229,13 @@
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Install LLVM
        uses: KyleMayes/install-llvm-action@8b37482c5a2997a3ab5dbf6561f8109e2eaa7d3b
        with:
          version: ${{ env.LLVM_VERSION }}
      - name: Install Ninja
        run: |
          choco install -y ninja
          choco install -y llvm --version=${{ env.LLVM_VERSION }} --force
      - name: Setup Bun
        uses: ./.github/actions/setup-bun
        with:
@@ -280,10 +283,9 @@
            -DNO_CONFIGURE_DEPENDS=1 `
            "-DCANARY=${CANARY_REVISION}" `
            -DBUN_LINK_ONLY=1 `
            -DUSE_LTO=1 `
            "-DBUN_DEPS_OUT_DIR=$(Resolve-Path ../bun-deps)" `
            "-DBUN_CPP_ARCHIVE=$(Resolve-Path ../bun-cpp/bun-cpp-objects.a)" `
            "-DBUN_ZIG_OBJ_DIR=$(Resolve-Path ../bun-zig)" `
            "-DBUN_ZIG_OBJ=$(Resolve-Path ../bun-zig/bun-zig.o)" `
            ${{ contains(inputs.tag, '-baseline') && '-DUSE_BASELINE_BUILD=1' || '' }}
          if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
          ninja -v
2
.github/workflows/ci.yml
vendored
@@ -34,7 +34,7 @@ jobs:
     uses: ./.github/workflows/run-format.yml
     secrets: inherit
     with:
-      zig-version: 0.13.0
+      zig-version: 0.12.0-dev.1828+225fe6ddb
     permissions:
       contents: write
   lint:
81
.github/workflows/labeled.yml
vendored
@@ -1,81 +0,0 @@
name: Issue Labeled

env:
  BUN_VERSION: 1.1.13

on:
  issues:
    types: [labeled]

jobs:
  on-labeled:
    runs-on: ubuntu-latest
    if: github.event.label.name == 'crash' || github.event.label.name == 'needs repro'
    permissions:
      issues: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          sparse-checkout: |
            scripts
            .github
            CMakeLists.txt
      - name: Setup Bun
        uses: ./.github/actions/setup-bun
        with:
          bun-version: "1.1.13"
      - name: "add platform and command label"
        id: add-labels
        if: github.event.label.name == 'crash'
        env:
          GITHUB_ISSUE_BODY: ${{ github.event.issue.body }}
          GITHUB_ISSUE_TITLE: ${{ github.event.issue.title }}
        shell: bash
        run: |
          LABELS=$(bun scripts/read-issue.ts)
          echo "labels=$LABELS" >> $GITHUB_OUTPUT
          bun scripts/is-outdated.ts

          if [[ -f "is-outdated.txt" ]]; then
            echo "is-outdated=true" >> $GITHUB_OUTPUT
          fi

          if [[ -f "outdated.txt" ]]; then
            echo "outdated=$(cat outdated.txt)" >> $GITHUB_OUTPUT
          fi

          echo "latest=$(cat LATEST)" >> $GITHUB_OUTPUT

          rm -rf is-outdated.txt outdated.txt latest.txt
      - name: Add labels
        uses: actions-cool/issues-helper@v3
        if: github.event.label.name == 'crash'
        with:
          actions: "add-labels"
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.issue.number }}
          labels: ${{ steps.add-labels.outputs.labels }}
      - name: Comment outdated
        if: steps.add-labels.outputs.is-outdated == 'true' && github.event.label.name == 'crash'
        uses: actions-cool/issues-helper@v3
        with:
          actions: "create-comment"
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.issue.number }}
          body: |
            @${{ github.event.issue.user.login }}, the latest version of Bun is v${{ steps.add-labels.outputs.latest }}, but this crash was reported on Bun v${{ steps.add-labels.outputs.outdated }}.

            Are you able to reproduce this crash on the latest version of Bun?

            ```sh
            bun upgrade
            ```
      - name: Comment needs repro
        if: github.event.label.name == 'needs repro'
        uses: actions-cool/issues-helper@v3
        with:
          actions: "create-comment"
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.issue.number }}
          body: |
            Hello @${{ github.event.issue.user.login }}. Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository, [Replit](https://replit.com/@replit/Bun), or [CodeSandbox](https://codesandbox.io/templates/bun). Issues marked with `needs repro` will be closed if they have no activity within 3 days.
9
.github/workflows/lint-cpp.yml
vendored
@@ -14,11 +14,10 @@ on:
        type: string
        description: The workflow ID to download artifacts (skips the build step)
  pull_request:
    paths:
      - ".github/workflows/lint-cpp.yml"
      - "**/*.cpp"
      - "src/deps/**/*"
      - "CMakeLists.txt"
    paths-ignore:
      - .vscode/**/*
      - docs/**/*
      - examples/**/*

jobs:
  lint-cpp:
89
.github/workflows/on-submodule-update.yml
vendored
@@ -1,89 +0,0 @@
name: Comment on updated submodule

on:
  pull_request_target:
    paths:
      - "src/generated_versions_list.zig"
      - ".github/workflows/on-submodule-update.yml"

jobs:
  comment:
    name: Comment
    runs-on: ubuntu-latest
    if: ${{ github.repository_owner == 'oven-sh' }}
    permissions:
      contents: read
      pull-requests: write
      issues: write
    steps:
      - name: Checkout current
        uses: actions/checkout@v4
        with:
          sparse-checkout: |
            src
      - name: Hash generated versions list
        id: hash
        run: |
          echo "hash=$(sha256sum src/generated_versions_list.zig | cut -d ' ' -f 1)" >> $GITHUB_OUTPUT
      - name: Checkout base
        uses: actions/checkout@v4
        with:
          ref: ${{ github.base_ref }}
          sparse-checkout: |
            src
      - name: Hash base
        id: base
        run: |
          echo "base=$(sha256sum src/generated_versions_list.zig | cut -d ' ' -f 1)" >> $GITHUB_OUTPUT
      - name: Compare
        id: compare
        run: |
          if [ "${{ steps.hash.outputs.hash }}" != "${{ steps.base.outputs.base }}" ]; then
            echo "changed=true" >> $GITHUB_OUTPUT
          else
            echo "changed=false" >> $GITHUB_OUTPUT
          fi
      - name: Find Comment
        id: comment
        uses: peter-evans/find-comment@v3
        with:
          issue-number: ${{ github.event.pull_request.number }}
          comment-author: github-actions[bot]
          body-includes: <!-- generated-comment submodule-updated -->
      - name: Write Warning Comment
        uses: peter-evans/create-or-update-comment@v4
        if: steps.compare.outputs.changed == 'true'
        with:
          comment-id: ${{ steps.comment.outputs.comment-id }}
          issue-number: ${{ github.event.pull_request.number }}
          edit-mode: replace
          body: |
            ⚠️ **Warning:** @${{ github.actor }}, this PR has changes to submodule versions.

            If this change was intentional, please ignore this message. If not, please undo changes to submodules and rebase your branch.

            <!-- generated-comment submodule-updated -->
      - name: Add labels
        uses: actions-cool/issues-helper@v3
        if: steps.compare.outputs.changed == 'true'
        with:
          actions: "add-labels"
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.pull_request.number }}
          labels: "changed-submodules"
      - name: Remove labels
        uses: actions-cool/issues-helper@v3
        if: steps.compare.outputs.changed == 'false'
        with:
          actions: "remove-labels"
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.pull_request.number }}
          labels: "changed-submodules"
      - name: Delete outdated comment
        uses: actions-cool/issues-helper@v3
        if: steps.compare.outputs.changed == 'false' && steps.comment.outputs.comment-id != ''
        with:
          actions: "delete-comment"
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.pull_request.number }}
          comment-id: ${{ steps.comment.outputs.comment-id }}
22
.github/workflows/release.yml
vendored
@@ -270,25 +270,3 @@ jobs:
       AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
       AWS_ENDPOINT: ${{ secrets.AWS_ENDPOINT }}
       AWS_BUCKET: bun
-  bump:
-    name: "Bump version"
-    runs-on: ubuntu-latest
-    if: ${{ github.event_name != 'schedule' }}
-    permissions:
-      pull-requests: write
-      contents: write
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        if: ${{ env.BUN_LATEST == 'true' }}
-      - name: Setup Bun
-        uses: ./.github/actions/setup-bun
-        if: ${{ env.BUN_LATEST == 'true' }}
-        with:
-          bun-version: "1.1.12"
-      - name: Bump version
-        uses: ./.github/actions/bump
-        if: ${{ env.BUN_LATEST == 'true' }}
-        with:
-          version: ${{ env.BUN_VERSION }}
-          token: ${{ github.token }}
4
.github/workflows/run-format.yml
vendored
@@ -22,7 +22,6 @@ jobs:
          sparse-checkout: |
            .github
            src
            scripts
            packages
            test
            bench
@@ -43,9 +42,6 @@
      - name: Format Zig
        run: |
          bun fmt:zig
      - name: Generate submodule versions
        run: |
          bash ./scripts/write-versions.sh
      - name: Commit
        uses: stefanzweifel/git-auto-commit-action@v5
        with:
8
.github/workflows/run-test.yml
vendored
@@ -78,20 +78,14 @@ jobs:
          node-version: 20
      - name: Install Dependencies
        timeout-minutes: 5
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          bun install
      - name: Install Dependencies (test)
        timeout-minutes: 5
        run: |
          bun install --cwd test
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Install Dependencies (runner)
        timeout-minutes: 5
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          bun install --cwd packages/bun-internal-test
      - name: Run Tests
@@ -99,7 +93,6 @@
        timeout-minutes: 90
        shell: bash
        env:
          IS_BUN_CI: 1
          TMPDIR: ${{ runner.temp }}
          BUN_TAG: ${{ inputs.tag }}
          BUN_FEATURE_FLAG_INTERNAL_FOR_TESTING: "true"
@@ -109,7 +102,6 @@
          TEST_INFO_STRIPE: ${{ secrets.TEST_INFO_STRIPE }}
          TEST_INFO_AZURE_SERVICE_BUS: ${{ secrets.TEST_INFO_AZURE_SERVICE_BUS }}
          SHELLOPTS: igncr
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          node packages/bun-internal-test/src/runner.node.mjs $(which bun)
      - if: ${{ always() }}
29
.github/workflows/test-bump.yml
vendored
@@ -1,29 +0,0 @@
name: Test Bump version

on:
  workflow_dispatch:
    inputs:
      version:
        type: string
        description: What is the release tag? (e.g. "1.0.2", "canary")
        required: true

jobs:
  bump:
    name: "Bump version"
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
      contents: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Bun
        uses: ./.github/actions/setup-bun
        with:
          bun-version: "1.1.12"
      - name: Bump version
        uses: ./.github/actions/bump
        with:
          version: ${{ inputs.version }}
          token: ${{ github.token }}
3
.gitignore
vendored
@@ -15,7 +15,6 @@
.vs
.vscode/clang*
.vscode/cpp*
.zig-cache
*.a
*.bc
*.big
@@ -55,7 +54,6 @@
/test.js
/test.ts
/testdir
/test.zig
build
build.ninja
bun-binary
@@ -144,4 +142,3 @@ yarn.lock
zig-cache
zig-out
test/node.js/upstream
.zig-cache
8
.gitmodules
vendored
@@ -69,6 +69,13 @@ ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/base64"]
path = src/deps/base64
url = https://github.com/aklomp/base64.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/ls-hpack"]
path = src/deps/ls-hpack
url = https://github.com/litespeedtech/ls-hpack.git
@@ -79,6 +86,7 @@ fetchRecurseSubmodules = false
[submodule "zig"]
path = src/deps/zig
url = https://github.com/oven-sh/zig
branch = bun
depth = 1
shallow = true
fetchRecurseSubmodules = false
88
.vscode/launch.json
generated
vendored
@@ -17,7 +17,8 @@
      "cwd": "${workspaceFolder}/test",
      "env": {
        "FORCE_COLOR": "1",
        "BUN_GARBAGE_COLLECTOR_LEVEL": "1",
        "BUN_DEBUG_QUIET_LOGS": "1",
        "BUN_GARBAGE_COLLECTOR_LEVEL": "2",
      },
      "console": "internalConsole",
    },
@@ -33,16 +34,9 @@
        "BUN_DEBUG_QUIET_LOGS": "1",
        "BUN_GARBAGE_COLLECTOR_LEVEL": "1",
        "BUN_DEBUG_FileReader": "1",
        "BUN_DEBUG_jest": "1",
      },
      "console": "internalConsole",
    },
    {
      "type": "lldb",
      "name": "Attach",
      "request": "attach",
      "pid": "${command:pickMyProcess}",
    },
    {
      "type": "lldb",
      "request": "launch",
@@ -150,7 +144,6 @@
      "env": {
        "FORCE_COLOR": "0",
        "BUN_DEBUG_QUIET_LOGS": "1",
        "BUN_DEBUG_EventLoop": "1",
        "BUN_GARBAGE_COLLECTOR_LEVEL": "2",
      },
      "console": "internalConsole",
@@ -445,16 +438,13 @@
      "request": "launch",
      "name": "bun test [*] (ci)",
      "program": "node",
      "args": ["test/runner.node.mjs"],
      "cwd": "${workspaceFolder}",
      "args": ["src/runner.node.mjs"],
      "cwd": "${workspaceFolder}/packages/bun-internal-test",
      "console": "internalConsole",
    },
    // Windows: bun test [file]
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun test [file]",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -482,9 +472,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun test --only [file]",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -523,9 +510,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun test [file] (fast)",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -548,9 +532,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun test [file] (verbose)",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -573,9 +554,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun test [file] --inspect",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -607,9 +585,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun test [file] --inspect-brk",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -642,9 +617,6 @@
    // Windows: bun run [file]
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun run [file]",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -667,9 +639,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun install",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -689,9 +658,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
      },
      "request": "launch",
      "name": "Windows: bun run [file] (verbose)",
      "program": "${workspaceFolder}/build/bun-debug.exe",
@@ -714,9 +680,6 @@
    },
    {
      "type": "cppvsdbg",
      "sourceFileMap": {
        "D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun run [file] --inspect",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -748,9 +711,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun run [file] --inspect-brk",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -783,9 +743,6 @@
|
||||
// Windows: bun test [...]
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...]",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -808,9 +765,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...] (fast)",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -833,9 +787,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...] (verbose)",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -858,9 +809,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...] --watch",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -883,9 +831,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...] --hot",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -908,9 +853,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...] --inspect",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -942,9 +884,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [...] --inspect-brk",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -977,9 +916,6 @@
|
||||
// Windows: bun exec [...]
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun exec [...]",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -1003,9 +939,6 @@
|
||||
// Windows: bun test [*]
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [*]",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -1028,9 +961,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [*] (fast)",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -1053,9 +983,6 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [*] --inspect",
|
||||
"program": "${workspaceFolder}/build/bun-debug.exe",
|
||||
@@ -1087,14 +1014,11 @@
|
||||
},
|
||||
{
|
||||
"type": "cppvsdbg",
|
||||
"sourceFileMap": {
|
||||
"D:\\a\\WebKit\\WebKit\\Source": "${workspaceFolder}\\src\\bun.js\\WebKit\\Source",
|
||||
},
|
||||
"request": "launch",
|
||||
"name": "Windows: bun test [*] (ci)",
|
||||
"program": "node",
|
||||
"args": ["test/runner.node.mjs"],
|
||||
"cwd": "${workspaceFolder}",
|
||||
"args": ["src/runner.node.mjs"],
|
||||
"cwd": "${workspaceFolder}/packages/bun-internal-test",
|
||||
"console": "internalConsole",
|
||||
},
|
||||
],
|
||||
|
||||
7 .vscode/settings.json (vendored)
@@ -26,12 +26,8 @@

// Zig
"zig.initialSetupDone": true,
"zig.buildOnSave": false,
"zig.buildOption": "build",
"zig.zls.zigLibPath": "${workspaceFolder}/src/deps/zig/lib",
"zig.buildArgs": ["-Dgenerated-code=./build/codegen"],
"zig.zls.buildOnSaveStep": "check",
// "zig.zls.enableBuildOnSave": true,
// "zig.buildOnSave": true,
"zig.buildFilePath": "${workspaceFolder}/build.zig",
"zig.path": "${workspaceFolder}/.cache/zig/zig.exe",
"zig.formattingProvider": "zls",
@@ -151,5 +147,4 @@
"WebKit/WebKitBuild": true,
"WebKit/WebInspectorUI": true,
},
"git.detectSubmodules": false,
}

@@ -2,9 +2,8 @@ cmake_minimum_required(VERSION 3.22)
cmake_policy(SET CMP0091 NEW)
cmake_policy(SET CMP0067 NEW)

set(CMAKE_POLICY_DEFAULT_CMP0069 NEW)
set(Bun_VERSION "1.1.19")
set(WEBKIT_TAG 615e8585f96aa718b0f5158210259b83fe8440ea)
set(Bun_VERSION "1.1.10")
set(WEBKIT_TAG 2c4f31e10974404bc8316a70d491ec0f400c880d)

set(BUN_WORKDIR "${CMAKE_CURRENT_BINARY_DIR}")
message(STATUS "Configuring Bun ${Bun_VERSION} in ${BUN_WORKDIR}")
@@ -15,10 +14,6 @@ set(CMAKE_C_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
set(CMAKE_C_STANDARD_REQUIRED ON)

# Should not start with v
# Used in process.version, process.versions.node, napi, and elsewhere
set(REPORTED_NODEJS_VERSION "22.3.0")

# WebKit uses -std=gnu++20 on non-macOS non-Windows
# If we do not set this, it will crash at startup on the first memory allocation.
if(NOT WIN32 AND NOT APPLE)
@@ -306,6 +301,7 @@ option(USE_CUSTOM_LIBARCHIVE "Use Bun's recommended version of libarchive" ON)
option(USE_CUSTOM_MIMALLOC "Use Bun's recommended version of Mimalloc" ON)
option(USE_CUSTOM_ZSTD "Use Bun's recommended version of zstd" ON)
option(USE_CUSTOM_CARES "Use Bun's recommended version of c-ares" ON)
option(USE_CUSTOM_BASE64 "Use Bun's recommended version of libbase64" ON)
option(USE_CUSTOM_LOLHTML "Use Bun's recommended version of lolhtml" ON)
option(USE_CUSTOM_TINYCC "Use Bun's recommended version of tinycc" ON)
option(USE_CUSTOM_LIBUV "Use Bun's recommended version of libuv (Windows only)" ON)
@@ -322,11 +318,6 @@ option(USE_STATIC_LIBATOMIC "Statically link libatomic, requires the presence of

option(USE_LTO "Enable Link-Time Optimization" ${DEFAULT_LTO})

if(WIN32 AND USE_LTO)
set(CMAKE_LINKER_TYPE LLD)
set(CMAKE_INTERPROCEDURAL_OPTIMIZATION OFF)
endif()

option(BUN_TIDY_ONLY "Only run clang-tidy" OFF)
option(BUN_TIDY_ONLY_EXTRA "Only run clang-tidy, with extra checks for local development" OFF)

@@ -349,10 +340,6 @@ if(NOT CANARY)
set(CANARY 0)
endif()

if(NOT ENABLE_LOGS)
set(ENABLE_LOGS false)
endif()

if(NOT ZIG_OPTIMIZE)
set(ZIG_OPTIMIZE ${DEFAULT_ZIG_OPTIMIZE})
endif()
@@ -620,7 +607,7 @@ set(BUN_DEPS_DIR "${BUN_SRC}/deps")
set(BUN_CODEGEN_SRC "${BUN_SRC}/codegen")

if(NOT BUN_DEPS_OUT_DIR)
set(BUN_DEPS_OUT_DIR "${CMAKE_CURRENT_SOURCE_DIR}/build/bun-deps")
set(BUN_DEPS_OUT_DIR "${BUN_DEPS_DIR}")
endif()

set(BUN_RAW_SOURCES, "")
@@ -861,13 +848,11 @@ file(GLOB ZIG_FILES
"${BUN_SRC}/*/*/*/*/*.zig"
)

if(NOT BUN_ZIG_OBJ_DIR)
set(BUN_ZIG_OBJ_DIR "${BUN_WORKDIR}/CMakeFiles")
if(NOT BUN_ZIG_OBJ)
set(BUN_ZIG_OBJ "${BUN_WORKDIR}/CMakeFiles/bun-zig.o")
endif()

get_filename_component(BUN_ZIG_OBJ_DIR "${BUN_ZIG_OBJ_DIR}" REALPATH BASE_DIR "${CMAKE_BINARY_DIR}")

set(BUN_ZIG_OBJ "${BUN_ZIG_OBJ_DIR}/bun-zig.o")
get_filename_component(BUN_ZIG_OBJ "${BUN_ZIG_OBJ}" REALPATH BASE_DIR "${CMAKE_BINARY_DIR}")

set(USES_TERMINAL_NOT_IN_CI "")

@@ -881,7 +866,7 @@ if(NOT BUN_LINK_ONLY AND NOT BUN_CPP_ONLY)
COMMAND
"${ZIG_COMPILER}" "build" "obj"
"--zig-lib-dir" "${ZIG_LIB_DIR}"
"--prefix" "${BUN_ZIG_OBJ_DIR}"
"-Doutput-file=${BUN_ZIG_OBJ}"
"-Dgenerated-code=${BUN_WORKDIR}/codegen"
"-freference-trace=10"
"-Dversion=${Bun_VERSION}"
@@ -889,8 +874,6 @@
"-Doptimize=${ZIG_OPTIMIZE}"
"-Dcpu=${CPU_TARGET}"
"-Dtarget=${ZIG_TARGET}"
"-Denable_logs=${ENABLE_LOGS}"
"-Dreported_nodejs_version=${REPORTED_NODEJS_VERSION}"
DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/build.zig"
"${ZIG_FILES}"
@@ -954,15 +937,12 @@ set_target_properties(${bun} PROPERTIES
VISIBILITY_INLINES_HIDDEN YES
)

if(APPLE)
add_compile_definitions("__DARWIN_NON_CANCELABLE=1")
endif()

add_compile_definitions(

# TODO: are all of these variables strictly necessary?
"_HAS_EXCEPTIONS=0"
"LIBUS_USE_OPENSSL=1"
"UWS_HTTPRESPONSE_NO_WRITEMARK=1"
"LIBUS_USE_BORINGSSL=1"
"WITH_BORINGSSL=1"
"STATICALLY_LINKED_WITH_JavaScriptCore=1"
@@ -976,7 +956,6 @@ add_compile_definitions(
"IS_BUILD"
"BUILDING_JSCONLY__"
"BUN_DYNAMIC_JS_LOAD_PATH=\"${BUN_WORKDIR}/js\""
"REPORTED_NODEJS_VERSION=\"${REPORTED_NODEJS_VERSION}\""
)

if(NOT ASSERT_ENABLED)
@@ -1086,14 +1065,12 @@ elseif(CMAKE_BUILD_TYPE STREQUAL "Release")
set(LTO_LINK_FLAG "")

if(USE_LTO)
target_compile_options(${bun} PUBLIC -Xclang -emit-llvm-bc)

# -emit-llvm seems to not be supported or under a different name on Windows.
list(APPEND LTO_FLAG "-flto=full")
list(APPEND LTO_LINK_FLAG "/LTCG")
endif()

target_compile_options(${bun} PUBLIC /O2 ${LTO_FLAG})
target_compile_options(${bun} PUBLIC /O2 ${LTO_FLAG} /DEBUG:FULL)
target_link_options(${bun} PUBLIC ${LTO_LINK_FLAG} /DEBUG:FULL)
endif()
endif()
@@ -1129,8 +1106,7 @@ if(WIN32)
set_property(TARGET ${bun} PROPERTY MSVC_RUNTIME_LIBRARY "MultiThreadedDLL")

target_compile_options(${bun} PUBLIC "/EHsc" "/GR-")

target_link_options(${bun} PUBLIC "/STACK:0x1200000,0x100000" "/DEF:${BUN_SRC}/symbols.def" "/errorlimit:0")
target_link_options(${bun} PUBLIC "/STACK:0x1200000,0x100000")
else()
target_compile_options(${bun} PUBLIC
-fPIC
@@ -1152,6 +1128,7 @@ if(APPLE)
target_link_options(${bun} PUBLIC "-Wl,-stack_size,0x1200000")
target_link_options(${bun} PUBLIC "-exported_symbols_list" "${BUN_SRC}/symbols.txt")
set_target_properties(${bun} PROPERTIES LINK_DEPENDS "${BUN_SRC}/symbols.txt")

target_link_options(${bun} PUBLIC "-fno-keep-static-consts")
target_link_libraries(${bun} PRIVATE "resolv")
endif()
@@ -1306,11 +1283,11 @@ if(USE_CUSTOM_MIMALLOC)
elseif(APPLE)
if(USE_DEBUG_JSC OR CMAKE_BUILD_TYPE STREQUAL "Debug")
message(STATUS "Using debug mimalloc")
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/libmimalloc-debug.o")
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/libmimalloc-debug.a")
else()
# Note: https://github.com/microsoft/mimalloc/issues/512
# It may have been a bug in our code at the time.
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/libmimalloc.o")
# https://github.com/microsoft/mimalloc/issues/512
# Linking mimalloc via object file on macOS x64 can cause heap corruption
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/libmimalloc.a")
endif()
else()
if(USE_DEBUG_JSC OR CMAKE_BUILD_TYPE STREQUAL "Debug")
@@ -1351,6 +1328,19 @@ else()
target_link_libraries(${bun} PRIVATE c-ares::cares)
endif()

if(USE_CUSTOM_BASE64)
include_directories(${BUN_DEPS_DIR}/base64/include)

if(WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/base64.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/libbase64.a")
endif()
else()
find_package(base64 REQUIRED)
target_link_libraries(${bun} PRIVATE base64::base64)
endif()

if(USE_CUSTOM_TINYCC)
if(WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_OUT_DIR}/tcc.lib")
@@ -1440,6 +1430,10 @@ else()
)
endif()

if(WIN32)
# delayimp -delayload:shell32.dll -delayload:ole32.dll
endif()

if(BUN_LINK_ONLY)
message(STATUS "NOTE: BUN_LINK_ONLY is ON, this build config will only link the Bun executable")
endif()
@@ -1462,4 +1456,4 @@ if(BUN_TIDY_ONLY_EXTRA)
find_program(CLANG_TIDY_EXE NAMES "clang-tidy")
set(CLANG_TIDY_COMMAND "${CLANG_TIDY_EXE}" "-checks=-*,clang-analyzer-*,performance-*,-clang-analyzer-webkit.UncountedLambdaCapturesChecker" "--fix" "--fix-errors" "--format-style=webkit" "--warnings-as-errors=*")
set_target_properties(${bun} PROPERTIES CXX_CLANG_TIDY "${CLANG_TIDY_COMMAND}")
endif()
endif()
@@ -2,10 +2,6 @@ Configuring a development environment for Bun can take 10-30 minutes depending o

If you are using Windows, please refer to [this guide](/docs/project/building-windows)

{% details summary="For Ubuntu users" %}
TL;DR: Ubuntu 22.04 is suggested.
Bun currently requires `glibc >= 2.32` in development, which means that on Ubuntu 20.04 (glibc 2.31) you will likely hit `error: undefined symbol: __libc_single_threaded` and need extra configuration to work around it. Also, according to this [issue](https://github.com/llvm/llvm-project/issues/97314), LLVM 16 is no longer maintained on Ubuntu 24.04 (noble); on 24.04 you may want to install LLVM 16 via `brew` instead.
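A quick way to check whether your system meets the glibc requirement before configuring is sketched below. Note this is an illustration, not part of the official setup: `GNU_LIBC_VERSION` is a glibc-specific `getconf` key, so the check assumes a glibc-based Linux distribution (it will not work on musl-based systems).

```shell
# Check the installed C library version against Bun's requirement (glibc >= 2.32).
installed="$(getconf GNU_LIBC_VERSION | awk '{ print $2 }')"
required="2.32"
# sort -V orders version strings numerically; if the required version sorts
# first (or ties), the installed glibc is at least 2.32.
if [ "$(printf '%s\n%s\n' "$required" "$installed" | sort -V | head -n 1)" = "$required" ]; then
    echo "glibc $installed is new enough"
else
    echo "glibc $installed is older than $required; expect __libc_single_threaded errors"
fi
```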

## Install Dependencies

Using your system's package manager, install Bun's dependencies:
@@ -111,7 +107,7 @@ $ export PATH="$PATH:/usr/lib/llvm16/bin"

{% /codetabs %}

> ⚠️ Ubuntu distributions (<= 20.04) may require installation of the C++ standard library independently. See the [troubleshooting section](#span-file-not-found-on-ubuntu) for more information.
> ⚠️ Ubuntu distributions may require installation of the C++ standard library independently. See the [troubleshooting section](#span-file-not-found-on-ubuntu) for more information.

## Building Bun

@@ -315,12 +311,3 @@ $ bun setup -DUSE_STATIC_LIBATOMIC=OFF
```

The built version of Bun may not work on other systems if compiled this way.

## ccache conflicts with building TinyCC on macOS

If you run into issues with `ccache` when building TinyCC, try reinstalling ccache

```bash
brew uninstall ccache
brew install ccache
```

31 Dockerfile
@@ -25,9 +25,7 @@ ARG CMAKE_BUILD_TYPE=Release

ARG NODE_VERSION="20"
ARG LLVM_VERSION="16"

ARG ZIG_VERSION="0.13.0"
ARG ZIG_VERSION_SHORT="0.13.0"
ARG ZIG_VERSION="0.12.0-dev.1828+225fe6ddb"

ARG SCCACHE_BUCKET
ARG SCCACHE_REGION
@@ -141,7 +139,6 @@ RUN install_packages \
FROM bun-base as bun-base-with-zig

ARG ZIG_VERSION
ARG ZIG_VERSION_SHORT
ARG BUILD_MACHINE_ARCH
ARG ZIG_FOLDERNAME=zig-linux-${BUILD_MACHINE_ARCH}-${ZIG_VERSION}
ARG ZIG_FILENAME=${ZIG_FOLDERNAME}.tar.xz
@@ -259,14 +256,15 @@ ENV CCACHE_DIR=${CCACHE_DIR}

RUN install_packages autoconf automake libtool pkg-config

COPY scripts ${BUN_DIR}/scripts
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/libarchive ${BUN_DIR}/src/deps/libarchive

WORKDIR $BUN_DIR

RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-libarchive.sh && rm -rf src/deps/libarchive .scripts
&& make libarchive \
&& rm -rf src/deps/libarchive Makefile

FROM bun-base as tinycc

@@ -298,6 +296,19 @@ RUN --mount=type=cache,target=${CCACHE_DIR} \
&& make boringssl \
&& rm -rf src/deps/boringssl Makefile

FROM bun-base as base64

ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}

COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/base64 ${BUN_DIR}/src/deps/base64

WORKDIR $BUN_DIR

RUN cd $BUN_DIR && \
make base64 && rm -rf src/deps/base64 Makefile

FROM bun-base as zstd

@@ -460,7 +471,7 @@ RUN --mount=type=cache,target=${CCACHE_DIR} \
-DWEBKIT_DIR="omit" \
-DNO_CONFIGURE_DEPENDS=1 \
-DNO_CODEGEN=1 \
-DBUN_ZIG_OBJ_DIR="/tmp" \
-DBUN_ZIG_OBJ="/tmp/bun-zig.o" \
-DCANARY="${CANARY}" \
-DZIG_COMPILER=system \
-DZIG_LIB_DIR=$BUN_DIR/src/deps/zig/lib \
@@ -496,6 +507,7 @@ COPY src/symbols.dyn src/linker.lds ${BUN_DIR}/src/

COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY --from=zlib ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=base64 ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=libarchive ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=boringssl ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=lolhtml ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
@@ -516,7 +528,7 @@ RUN --mount=type=cache,target=${CCACHE_DIR} \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ_DIR="${BUN_DIR}/build" \
-DBUN_ZIG_OBJ="${BUN_DIR}/build/bun-zig.o" \
-DUSE_LTO=ON \
-DUSE_DEBUG_JSC=${ASSERTIONS} \
-DBUN_CPP_ARCHIVE="${BUN_DIR}/build/bun-cpp-objects.a" \
@@ -559,6 +571,7 @@ COPY src/symbols.dyn src/linker.lds ${BUN_DIR}/src/

COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY --from=zlib ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=base64 ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=libarchive ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=boringssl ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=lolhtml ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
@@ -578,7 +591,7 @@ RUN --mount=type=cache,target=${CCACHE_DIR} \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ_DIR="${BUN_DIR}/build" \
-DBUN_ZIG_OBJ="${BUN_DIR}/build/bun-zig.o" \
-DUSE_DEBUG_JSC=ON \
-DBUN_CPP_ARCHIVE="${BUN_DIR}/build/bun-cpp-objects.a" \
-DWEBKIT_DIR="${BUN_DIR}/bun-webkit" \

229 LICENSE (new file)
@@ -0,0 +1,229 @@
Bun itself is MIT-licensed.

## JavaScriptCore

Bun statically links JavaScriptCore (and WebKit) which is LGPL-2 licensed. WebCore files from WebKit are also licensed under LGPL2. Per LGPL2:

> (1) If you statically link against an LGPL’d library, you must also provide your application in an object (not necessarily source) format, so that a user has the opportunity to modify the library and relink the application.

You can find the patched version of WebKit used by Bun here: <https://github.com/oven-sh/webkit>. If you would like to relink Bun with changes:

- `git submodule update --init --recursive`
- `make jsc`
- `zig build`

This compiles JavaScriptCore, compiles Bun’s `.cpp` bindings for JavaScriptCore (which are the object files using JavaScriptCore) and outputs a new `bun` binary with your changes.

## Linked libraries

Bun statically links these libraries:

{% table %}

- Library
- License

---

- [`boringssl`](https://boringssl.googlesource.com/boringssl/)
- [several licenses](https://boringssl.googlesource.com/boringssl/+/refs/heads/master/LICENSE)

---

- [`brotli`](https://github.com/google/brotli)
- MIT

---

- [`libarchive`](https://github.com/libarchive/libarchive)
- [several licenses](https://github.com/libarchive/libarchive/blob/master/COPYING)

---

- [`lol-html`](https://github.com/cloudflare/lol-html/tree/master/c-api)
- BSD 3-Clause

---

- [`mimalloc`](https://github.com/microsoft/mimalloc)
- MIT

---

- [`picohttp`](https://github.com/h2o/picohttpparser)
- dual-licensed under the Perl License or the MIT License

---

- [`zstd`](https://github.com/facebook/zstd)
- dual-licensed under the BSD License or GPLv2 license

---

- [`simdutf`](https://github.com/simdutf/simdutf)
- Apache 2.0

---

- [`tinycc`](https://github.com/tinycc/tinycc)
- LGPL v2.1

---

- [`uSockets`](https://github.com/uNetworking/uSockets)
- Apache 2.0

---

- [`zlib-cloudflare`](https://github.com/cloudflare/zlib)
- zlib

---

- [`c-ares`](https://github.com/c-ares/c-ares)
- MIT licensed

---

- [`libicu`](https://github.com/unicode-org/icu) 72
- [license here](https://github.com/unicode-org/icu/blob/main/icu4c/LICENSE)

---

- [`libbase64`](https://github.com/aklomp/base64/blob/master/LICENSE)
- BSD 2-Clause

---

- A fork of [`uWebsockets`](https://github.com/jarred-sumner/uwebsockets)
- Apache 2.0 licensed

---

- Parts of [Tigerbeetle's IO code](https://github.com/tigerbeetle/tigerbeetle/blob/532c8b70b9142c17e07737ab6d3da68d7500cbca/src/io/windows.zig#L1)
- Apache 2.0 licensed

{% /table %}

## Polyfills

For compatibility reasons, the following packages are embedded into Bun's binary and injected if imported.

{% table %}

- Package
- License

---

- [`assert`](https://npmjs.com/package/assert)
- MIT

---

- [`browserify-zlib`](https://npmjs.com/package/browserify-zlib)
- MIT

---

- [`buffer`](https://npmjs.com/package/buffer)
- MIT

---

- [`constants-browserify`](https://npmjs.com/package/constants-browserify)
- MIT

---

- [`crypto-browserify`](https://npmjs.com/package/crypto-browserify)
- MIT

---

- [`domain-browser`](https://npmjs.com/package/domain-browser)
- MIT

---

- [`events`](https://npmjs.com/package/events)
- MIT

---

- [`https-browserify`](https://npmjs.com/package/https-browserify)
- MIT

---

- [`os-browserify`](https://npmjs.com/package/os-browserify)
- MIT

---

- [`path-browserify`](https://npmjs.com/package/path-browserify)
- MIT

---

- [`process`](https://npmjs.com/package/process)
- MIT

---

- [`punycode`](https://npmjs.com/package/punycode)
- MIT

---

- [`querystring-es3`](https://npmjs.com/package/querystring-es3)
- MIT

---

- [`stream-browserify`](https://npmjs.com/package/stream-browserify)
- MIT

---

- [`stream-http`](https://npmjs.com/package/stream-http)
- MIT

---

- [`string_decoder`](https://npmjs.com/package/string_decoder)
- MIT

---

- [`timers-browserify`](https://npmjs.com/package/timers-browserify)
- MIT

---

- [`tty-browserify`](https://npmjs.com/package/tty-browserify)
- MIT

---

- [`url`](https://npmjs.com/package/url)
- MIT

---

- [`util`](https://npmjs.com/package/util)
- MIT

---

- [`vm-browserify`](https://npmjs.com/package/vm-browserify)
- MIT

{% /table %}

## Additional credits

- Bun's JS transpiler, CSS lexer, and Node.js module resolver source code is a Zig port of [@evanw](https://github.com/evanw)’s [esbuild](https://github.com/evanw/esbuild) project.
- Credit to [@kipply](https://github.com/kipply) for the name "Bun"!
71 LICENSE.md
@@ -1,71 +0,0 @@
Bun itself is MIT-licensed.

## JavaScriptCore

Bun statically links JavaScriptCore (and WebKit) which is LGPL-2 licensed. WebCore files from WebKit are also licensed under LGPL2. Per LGPL2:

> (1) If you statically link against an LGPL’d library, you must also provide your application in an object (not necessarily source) format, so that a user has the opportunity to modify the library and relink the application.

You can find the patched version of WebKit used by Bun here: <https://github.com/oven-sh/webkit>. If you would like to relink Bun with changes:

- `git submodule update --init --recursive`
- `make jsc`
- `zig build`

This compiles JavaScriptCore, compiles Bun’s `.cpp` bindings for JavaScriptCore (which are the object files using JavaScriptCore) and outputs a new `bun` binary with your changes.

## Linked libraries

Bun statically links these libraries:

| Library | License |
|---------|---------|
| [`boringssl`](https://boringssl.googlesource.com/boringssl/) | [several licenses](https://boringssl.googlesource.com/boringssl/+/refs/heads/master/LICENSE) |
| [`brotli`](https://github.com/google/brotli) | MIT |
| [`libarchive`](https://github.com/libarchive/libarchive) | [several licenses](https://github.com/libarchive/libarchive/blob/master/COPYING) |
| [`lol-html`](https://github.com/cloudflare/lol-html/tree/master/c-api) | BSD 3-Clause |
| [`mimalloc`](https://github.com/microsoft/mimalloc) | MIT |
| [`picohttp`](https://github.com/h2o/picohttpparser) | dual-licensed under the Perl License or the MIT License |
| [`zstd`](https://github.com/facebook/zstd) | dual-licensed under the BSD License or GPLv2 license |
| [`simdutf`](https://github.com/simdutf/simdutf) | Apache 2.0 |
| [`tinycc`](https://github.com/tinycc/tinycc) | LGPL v2.1 |
| [`uSockets`](https://github.com/uNetworking/uSockets) | Apache 2.0 |
| [`zlib-cloudflare`](https://github.com/cloudflare/zlib) | zlib |
| [`c-ares`](https://github.com/c-ares/c-ares) | MIT licensed |
| [`libicu`](https://github.com/unicode-org/icu) 72 | [license here](https://github.com/unicode-org/icu/blob/main/icu4c/LICENSE) |
| [`libbase64`](https://github.com/aklomp/base64/blob/master/LICENSE) | BSD 2-Clause |
| A fork of [`uWebsockets`](https://github.com/jarred-sumner/uwebsockets) | Apache 2.0 licensed |
| Parts of [Tigerbeetle's IO code](https://github.com/tigerbeetle/tigerbeetle/blob/532c8b70b9142c17e07737ab6d3da68d7500cbca/src/io/windows.zig#L1) | Apache 2.0 licensed |

## Polyfills

For compatibility reasons, the following packages are embedded into Bun's binary and injected if imported.

| Package | License |
|---------|---------|
| [`assert`](https://npmjs.com/package/assert) | MIT |
| [`browserify-zlib`](https://npmjs.com/package/browserify-zlib) | MIT |
| [`buffer`](https://npmjs.com/package/buffer) | MIT |
| [`constants-browserify`](https://npmjs.com/package/constants-browserify) | MIT |
| [`crypto-browserify`](https://npmjs.com/package/crypto-browserify) | MIT |
| [`domain-browser`](https://npmjs.com/package/domain-browser) | MIT |
| [`events`](https://npmjs.com/package/events) | MIT |
| [`https-browserify`](https://npmjs.com/package/https-browserify) | MIT |
| [`os-browserify`](https://npmjs.com/package/os-browserify) | MIT |
| [`path-browserify`](https://npmjs.com/package/path-browserify) | MIT |
| [`process`](https://npmjs.com/package/process) | MIT |
| [`punycode`](https://npmjs.com/package/punycode) | MIT |
| [`querystring-es3`](https://npmjs.com/package/querystring-es3) | MIT |
| [`stream-browserify`](https://npmjs.com/package/stream-browserify) | MIT |
| [`stream-http`](https://npmjs.com/package/stream-http) | MIT |
| [`string_decoder`](https://npmjs.com/package/string_decoder) | MIT |
| [`timers-browserify`](https://npmjs.com/package/timers-browserify) | MIT |
| [`tty-browserify`](https://npmjs.com/package/tty-browserify) | MIT |
| [`url`](https://npmjs.com/package/url) | MIT |
| [`util`](https://npmjs.com/package/util) | MIT |
| [`vm-browserify`](https://npmjs.com/package/vm-browserify) | MIT |

## Additional credits

- Bun's JS transpiler, CSS lexer, and Node.js module resolver source code is a Zig port of [@evanw](https://github.com/evanw)’s [esbuild](https://github.com/evanw/esbuild) project.
- Credit to [@kipply](https://github.com/kipply) for the name "Bun"!
`Makefile` (12 changes):

````diff
@@ -129,7 +129,7 @@ SED = $(shell which gsed 2>/dev/null || which sed 2>/dev/null)
 BUN_DIR ?= $(shell dirname $(realpath $(firstword $(MAKEFILE_LIST))))
 BUN_DEPS_DIR ?= $(shell pwd)/src/deps
-BUN_DEPS_OUT_DIR ?= $(shell pwd)/build/bun-deps
+BUN_DEPS_OUT_DIR ?= $(BUN_DEPS_DIR)
 CPU_COUNT = 2
 ifeq ($(OS_NAME),darwin)
 CPU_COUNT = $(shell sysctl -n hw.logicalcpu)
@@ -449,7 +449,8 @@ MINIMUM_ARCHIVE_FILES = -L$(BUN_DEPS_OUT_DIR) \
 -ldecrepit \
 -lssl \
 -lcrypto \
--llolhtml
+-llolhtml \
+-lbase64

 ARCHIVE_FILES_WITHOUT_LIBCRYPTO = $(MINIMUM_ARCHIVE_FILES) \
 -larchive \
@@ -1970,6 +1971,11 @@ copy-to-bun-release-dir-bin:

 PACKAGE_MAP = --pkg-begin async_io $(BUN_DIR)/src/io/io_darwin.zig --pkg-begin bun $(BUN_DIR)/src/bun_redirect.zig --pkg-end --pkg-end --pkg-begin javascript_core $(BUN_DIR)/src/jsc.zig --pkg-begin bun $(BUN_DIR)/src/bun_redirect.zig --pkg-end --pkg-end --pkg-begin bun $(BUN_DIR)/src/bun_redirect.zig --pkg-end

+.PHONY: base64
+base64:
+	cd $(BUN_DEPS_DIR)/base64 && make clean && rm -rf CMakeCache.txt CMakeFiles && cmake $(CMAKE_FLAGS) . && make
+	cp $(BUN_DEPS_DIR)/base64/libbase64.a $(BUN_DEPS_OUT_DIR)/libbase64.a
+
 .PHONY: cold-jsc-start
 cold-jsc-start:
 	$(CXX_WITH_CCACHE) $(CLANG_FLAGS) \
@@ -1987,7 +1993,7 @@ cold-jsc-start:
 	misctools/cold-jsc-start.cpp -o cold-jsc-start

 .PHONY: vendor-without-npm
-vendor-without-npm: node-fallbacks runtime_js fallback_decoder bun_error mimalloc picohttp zlib boringssl libarchive lolhtml sqlite usockets uws lshpack tinycc c-ares zstd
+vendor-without-npm: node-fallbacks runtime_js fallback_decoder bun_error mimalloc picohttp zlib boringssl libarchive lolhtml sqlite usockets uws lshpack tinycc c-ares zstd base64

 .PHONY: vendor-without-check
````
````diff
@@ -234,7 +234,6 @@ bun upgrade --canary
 - [Use Neon's Serverless Postgres with Bun](https://bun.sh/guides/ecosystem/neon-serverless-postgres)
 - [Use Prisma with Bun](https://bun.sh/guides/ecosystem/prisma)
 - [Use React and JSX](https://bun.sh/guides/ecosystem/react)
-- [Add Sentry to a Bun app](https://bun.sh/guides/ecosystem/sentry)

 - HTTP
   - [Common HTTP server usage](https://bun.sh/guides/http/server)
````

`bench/bun.lockb` (BIN): binary file not shown.

````diff
@@ -3,7 +3,6 @@
   "dependencies": {
     "@babel/core": "^7.16.10",
     "@babel/preset-react": "^7.16.7",
-    "@babel/standalone": "^7.24.7",
     "@swc/core": "^1.2.133",
     "benchmark": "^2.1.4",
     "braces": "^3.0.2",
````

````diff
@@ -6,7 +6,6 @@ const App = () => (
   <html>
     <body>
       <h1>Hello World</h1>
-      <p>This is an example.</p>
     </body>
   </html>
 );
````
````diff
@@ -1,21 +0,0 @@
-import { bench, run } from "./runner.mjs";
-
-function makeBenchmark(size, isToString) {
-  const base64Input = Buffer.alloc(size, "latin1").toString("base64");
-  const base64From = Buffer.from(base64Input, "base64");
-
-  if (!isToString)
-    bench(`Buffer.from(${size} bytes, 'base64')`, () => {
-      Buffer.from(base64Input, "base64");
-    });
-
-  if (isToString)
-    bench(`Buffer(${size}).toString('base64')`, () => {
-      base64From.toString("base64");
-    });
-}
-
-[32, 512, 64 * 1024, 512 * 1024, 1024 * 1024 * 8].forEach(s => makeBenchmark(s, true));
-[32, 512, 64 * 1024, 512 * 1024, 1024 * 1024 * 8].forEach(s => makeBenchmark(s, false));
-
-await run();
````
````diff
@@ -1,29 +0,0 @@
-import { pbkdf2, pbkdf2Sync } from "node:crypto";
-
-import { bench, run } from "./runner.mjs";
-
-const password = "password";
-const salt = "salt";
-const iterations = 1000;
-const keylen = 32;
-const hash = "sha256";
-
-bench("pbkdf2(iterations = 1000, 'sha256') -> 32", async () => {
-  return new Promise((resolve, reject) => {
-    pbkdf2(password, salt, iterations, keylen, hash, (err, key) => {
-      if (err) return reject(err);
-      resolve(key);
-    });
-  });
-});
-
-bench("pbkdf2(iterations = 500_000, 'sha256') -> 32", async () => {
-  return new Promise((resolve, reject) => {
-    pbkdf2(password, salt, 500_000, keylen, hash, (err, key) => {
-      if (err) return reject(err);
-      resolve(key);
-    });
-  });
-});
-
-await run();
````
````diff
@@ -1,42 +1,16 @@
 import { bench, run } from "../node_modules/mitata/src/cli.mjs";

 let count = 20_000_000;
 const batchSize = 1_000_000;
 console.time("Run");

 let { promise, resolve, reject } = Promise.withResolvers();
 let remaining = count;

 if (batchSize === 0) {
   for (let i = 0; i < count; i++) {
     setTimeout(() => {
       remaining--;
       if (remaining === 0) {
         resolve();
       }
     }, 0);
   }
   await promise;
 } else {
   for (let i = 0; i < count; i += batchSize) {
     let batch = Math.min(batchSize, count - i);
     console.time("Batch " + i + " - " + (i + batch));
     let { promise: batchPromise, resolve: batchResolve } = Promise.withResolvers();
     let remaining = batch;
     for (let j = 0; j < batch; j++) {
 bench("setTimeout(, 4) 100 times", async () => {
   var i = 100;
   while (--i >= 0) {
     await new Promise((resolve, reject) => {
       setTimeout(() => {
         remaining--;
         if (remaining === 0) {
           batchResolve();
         }
       }, 0);
     }
     await batchPromise;
     console.timeEnd("Batch " + i + " - " + (i + batch));
       resolve();
     }, 4);
   });
   }
 }
 });

 const fmt = new Intl.NumberFormat();
 console.log("Executed", fmt.format(count), "timers");
 console.timeEnd("Run");
 process.exit(0);
 setTimeout(() => {
   run({}).then(() => {});
 }, 1);
````
````diff
@@ -1,14 +0,0 @@
-import { bench, run } from "mitata";
-import { join } from "path";
-
-const code = require("fs").readFileSync(
-  process.argv[2] || join(import.meta.dir, "../node_modules/@babel/standalone/babel.min.js"),
-);
-
-const transpiler = new Bun.Transpiler({ minify: true });
-
-bench("transformSync", () => {
-  transpiler.transformSync(code);
-});
-
-await run();
````
````diff
@@ -39,7 +39,7 @@ _read_scripts_in_package_json() {
 [[ "${COMP_WORDS[${line}]}" == "--cwd" ]] && working_dir="${COMP_WORDS[$((line + 1))]}";
 done

-[[ -f "${working_dir}/package.json" ]] && package_json=$(<"${working_dir}/package.json");
+[[ -f "${working_dir}/package.json" ]] && package_json=$(<${working_dir}/package.json);

 [[ "${package_json}" =~ "\"scripts\""[[:space:]]*":"[[:space:]]*\{(.*)\} ]] && {
 local package_json_compreply;
@@ -82,7 +82,7 @@ _bun_completions() {
 declare -A PACKAGE_OPTIONS;
 declare -A PM_OPTIONS;

-local SUBCOMMANDS="dev bun create run install add remove upgrade completions discord help init pm x test repl update link unlink build";
+local SUBCOMMANDS="dev bun create run install add remove upgrade completions discord help init pm x";

 GLOBAL_OPTIONS[LONG_OPTIONS]="--use --cwd --bunfile --server-bunfile --config --disable-react-fast-refresh --disable-hmr --env-file --extension-order --jsx-factory --jsx-fragment --extension-order --jsx-factory --jsx-fragment --jsx-import-source --jsx-production --jsx-runtime --main-fields --no-summary --version --platform --public-dir --tsconfig-override --define --external --help --inject --loader --origin --port --dump-environment-variables --dump-limits --disable-bun-js";
 GLOBAL_OPTIONS[SHORT_OPTIONS]="-c -v -d -e -h -i -l -u -p";
@@ -53,7 +53,7 @@ function __bun_complete_bins_scripts --inherit-variable bun_builtin_cmds_without
 # Scripts have descriptions appended with a tab separator.
 # Strip off descriptions for the purposes of subcommand testing.
 set -l scripts (__fish__get_bun_scripts)
-if __fish_seen_subcommand_from (string split \t -f 1 -- $scripts)
+if __fish_seen_subcommand_from $(string split \t -f 1 -- $scripts)
 return
 end
 # Emit scripts.
````
````diff
@@ -61,7 +61,7 @@ To do anything interesting we need a construct known as a "view". A view is a cl

 The `DataView` class is a lower-level interface for reading and manipulating the data in an `ArrayBuffer`.

-Below we create a new `DataView` and set the first byte to 3.
+Below we create a new `DataView` and set the first byte to 5.

 ```ts
 const buf = new ArrayBuffer(4);
@@ -395,7 +395,7 @@ Bun implements `Buffer`, a Node.js API for working with binary data that pre-dat

 ```ts
 const buf = Buffer.from("hello world");
-// => Buffer(11) [ 104, 101, 108, 108, 111, 32, 119, 111, 114, 108, 100 ]
+// => Buffer(16) [ 116, 104, 105, 115, 32, 105, 115, 32, 97, 32, 115, 116, 114, 105, 110, 103 ]

 buf.length; // => 11
 buf[0]; // => 104, ascii for 'h'
````
````diff
@@ -16,10 +16,7 @@ Features include:
 - Parameters (named & positional)
 - Prepared statements
 - Datatype conversions (`BLOB` becomes `Uint8Array`)
 - Map query results to classes without an ORM - `query.as(MyClass)`
 - The fastest performance of any SQLite driver for JavaScript
 - `bigint` support
 - Multi-query statements (e.g. `SELECT 1; SELECT 2;`) in a single call to database.run(query)

 The `bun:sqlite` module is roughly 3-6x faster than `better-sqlite3` and 8-9x faster than `deno.land/x/sqlite` for read queries. Each driver was benchmarked against the [Northwind Traders](https://github.com/jpwhite3/northwind-SQLite3/blob/46d5f8a64f396f87cd374d1600dbf521523980e8/Northwind_large.sqlite.zip) dataset. View and run the [benchmark source](https://github.com/oven-sh/bun/tree/main/bench/sqlite).
````
````diff
@@ -60,39 +57,6 @@ import { Database } from "bun:sqlite";
 const db = new Database("mydb.sqlite", { create: true });
 ```

-### Strict mode
-
-{% callout %}
-Added in Bun v1.1.14
-{% /callout %}
-
-By default, `bun:sqlite` requires binding parameters to include the `$`, `:`, or `@` prefix, and does not throw an error if a parameter is missing.
-
-To instead throw an error when a parameter is missing and allow binding without a prefix, set `strict: true` on the `Database` constructor:
-
-<!-- prettier-ignore -->
-```ts
-import { Database } from "bun:sqlite";
-
-const strict = new Database(
-  ":memory:",
-  { strict: true }
-);
-
-// throws error because of the typo:
-const query = strict
-  .query("SELECT $message;")
-  .all({ messag: "Hello world" });
-
-const notStrict = new Database(
-  ":memory:"
-);
-// does not throw error:
-notStrict
-  .query("SELECT $message;")
-  .all({ messag: "Hello world" });
-```
-
 ### Load via ES module import

 You can also use an import attribute to load a database.
````
````diff
@@ -210,47 +174,6 @@ const query = db.query(`SELECT $param1, $param2;`);

 Values are bound to these parameters when the query is executed. A `Statement` can be executed with several different methods, each returning the results in a different form.

 ### Binding values

 To bind values to a statement, pass an object to the `.all()`, `.get()`, `.run()`, or `.values()` method.

 ```ts
 const query = db.query(`select $message;`);
 query.all({ $message: "Hello world" });
 ```

 You can bind using positional parameters too:

 ```ts
 const query = db.query(`select ?1;`);
 query.all("Hello world");
 ```

-#### `strict: true` lets you bind values without prefixes
-
-{% callout %}
-Added in Bun v1.1.14
-{% /callout %}
-
-By default, the `$`, `:`, and `@` prefixes are **included** when binding values to named parameters. To bind without these prefixes, use the `strict` option in the `Database` constructor.
-
-```ts
-import { Database } from "bun:sqlite";
-
-const db = new Database(":memory:", {
-  // bind values without prefixes
-  strict: true,
-});
-
-const query = db.query(`select $message;`);
-
-// strict: true
-query.all({ message: "Hello world" });
-
-// strict: false
-// query.all({ $message: "Hello world" });
-```
-
 ### `.all()`

 Use `.all()` to run a query and get back the results as an array of objects.
@@ -282,49 +205,11 @@ Use `.run()` to run a query and get back `undefined`. This is useful for schema-
 ```ts
 const query = db.query(`create table foo;`);
 query.run();
-// {
-//   lastInsertRowid: 0,
-//   changes: 0,
-// }
+// => undefined
 ```

 Internally, this calls [`sqlite3_reset`](https://www.sqlite.org/capi3ref.html#sqlite3_reset) and calls [`sqlite3_step`](https://www.sqlite.org/capi3ref.html#sqlite3_step) once. Stepping through all the rows is not necessary when you don't care about the results.

-{% callout %}
-Since Bun v1.1.14, `.run()` returns an object with two properties: `lastInsertRowid` and `changes`.
-{% /callout %}
-
-The `lastInsertRowid` property returns the ID of the last row inserted into the database. The `changes` property is the number of rows affected by the query.
-
-### `.as(Class)` - Map query results to a class
-
-{% callout %}
-Added in Bun v1.1.14
-{% /callout %}
-
-Use `.as(Class)` to run a query and get back the results as instances of a class. This lets you attach methods & getters/setters to results.
-
-```ts
-class Movie {
-  title: string;
-  year: number;
-
-  get isMarvel() {
-    return this.title.includes("Marvel");
-  }
-}
-
-const query = db.query("SELECT title, year FROM movies").as(Movie);
-const movies = query.all();
-const first = query.get();
-console.log(movies[0].isMarvel); // => true
-console.log(first.isMarvel); // => true
-```
-
-As a performance optimization, the class constructor is not called, default initializers are not run, and private fields are not accessible. This is more like using `Object.create` than `new`. The class's prototype is assigned to the object, methods are attached, and getters/setters are set up, but the constructor is not called.
-
-The database columns are set as properties on the class instance.
-
 ### `.values()`

 Use `values()` to run a query and get back all results as an array of arrays.
````
````diff
@@ -415,65 +300,6 @@ const results = query.all("hello", "goodbye");

 {% /codetabs %}

 ## Integers

 sqlite supports signed 64 bit integers, but JavaScript only supports signed 52 bit integers or arbitrary precision integers with `bigint`.

-`bigint` input is supported everywhere, but by default `bun:sqlite` returns integers as `number` types. If you need to handle integers larger than 2^53, set `safeInteger` option to `true` when creating a `Database` instance. This also validates that `bigint` passed to `bun:sqlite` do not exceed 64 bits.
+By default, `bun:sqlite` returns integers as `number` types. If you need to handle integers larger than 2^53, you can use the `bigint` type.

-### `safeIntegers: true`
-
-{% callout %}
-Added in Bun v1.1.14
-{% /callout %}
-
-When `safeIntegers` is `true`, `bun:sqlite` will return integers as `bigint` types:
-
-```ts
-import { Database } from "bun:sqlite";
-
-const db = new Database(":memory:", { safeIntegers: true });
-const query = db.query(
-  `SELECT ${BigInt(Number.MAX_SAFE_INTEGER) + 102n} as max_int`,
-);
-const result = query.get();
-console.log(result.max_int); // => 9007199254741093n
-```
-
-When `safeIntegers` is `true`, `bun:sqlite` will throw an error if a `bigint` value in a bound parameter exceeds 64 bits:
-
-```ts
-import { Database } from "bun:sqlite";
-
-const db = new Database(":memory:", { safeIntegers: true });
-db.run("CREATE TABLE test (id INTEGER PRIMARY KEY, value INTEGER)");
-
-const query = db.query("INSERT INTO test (value) VALUES ($value)");
-
-try {
-  query.run({ $value: BigInt(Number.MAX_SAFE_INTEGER) ** 2n });
-} catch (e) {
-  console.log(e.message); // => BigInt value '81129638414606663681390495662081' is out of range
-}
-```
-
-### `safeIntegers: false` (default)
-
-When `safeIntegers` is `false`, `bun:sqlite` will return integers as `number` types and truncate any bits beyond 53:
-
-```ts
-import { Database } from "bun:sqlite";
-
-const db = new Database(":memory:", { safeIntegers: false });
-const query = db.query(
-  `SELECT ${BigInt(Number.MAX_SAFE_INTEGER) + 102n} as max_int`,
-);
-const result = query.get();
-console.log(result.max_int); // => 9007199254741092
-```
-
 ## Transactions

 Transactions are a mechanism for executing multiple queries in an _atomic_ way; that is, either all of the queries succeed or none of them do. Create a transaction with the `db.transaction()` method:
@@ -621,20 +447,12 @@ class Database {
   );

   query<Params, ReturnType>(sql: string): Statement<Params, ReturnType>;
-  run(
-    sql: string,
-    params?: SQLQueryBindings,
-  ): { lastInsertRowid: number; changes: number };
-  exec = this.run;
 }

 class Statement<Params, ReturnType> {
   all(params: Params): ReturnType[];
   get(params: Params): ReturnType | undefined;
-  run(params: Params): {
-    lastInsertRowid: number;
-    changes: number;
-  };
+  run(params: Params): void;
   values(params: Params): unknown[][];

   finalize(): void; // destroy statement and clean up resources
@@ -643,8 +461,6 @@
 columnNames: string[]; // the column names of the result set
 paramsCount: number; // the number of parameters expected by the statement
 native: any; // the native object representing the statement
-
-as(Class: new () => ReturnType): this;
 }

 type SQLQueryBindings =
````
````diff
@@ -183,7 +183,7 @@ const currentFile = import.meta.url;
 Bun.openInEditor(currentFile);
 ```

-You can override this via the `debug.editor` setting in your [`bunfig.toml`](/docs/runtime/bunfig).
+You can override this via the `debug.editor` setting in your [`bunfig.toml`](/docs/runtime/bunfig)

 ```toml-diff#bunfig.toml
 + [debug]
@@ -200,6 +200,8 @@ Bun.openInEditor(import.meta.url, {
 });
 ```

+Bun.ArrayBufferSink;
+
 ## `Bun.deepEquals()`

 Recursively checks if two objects are equivalent. This is used internally by `expect().toEqual()` in `bun:test`.
@@ -249,11 +251,11 @@
 Escapes the following characters from an input string:

-- `"` becomes `&quot;`
-- `&` becomes `&amp;`
-- `'` becomes `&#x27;`
-- `<` becomes `&lt;`
-- `>` becomes `&gt;`
+- `"` becomes `"&quot;"`
+- `&` becomes `"&amp;"`
+- `'` becomes `"&#x27;"`
+- `<` becomes `"&lt;"`
+- `>` becomes `"&gt;"`

 This function is optimized for large input. On an M1X, it processes 480 MB/s -
 20 GB/s, depending on how much data is being escaped and whether there is non-ascii
````
````diff
@@ -13,7 +13,8 @@ Like in browsers, [`Worker`](https://developer.mozilla.org/en-US/docs/Web/API/Wo
 ### From the main thread

 ```js#Main_thread
-const worker = new Worker("./worker.ts");
+const workerURL = new URL("worker.ts", import.meta.url).href;
+const worker = new Worker(workerURL);

 worker.postMessage("hello");
 worker.onmessage = event => {
@@ -50,38 +51,6 @@ const worker = new Worker("/not-found.js");

 The specifier passed to `Worker` is resolved relative to the project root (like typing `bun ./path/to/file.js`).

-### `blob:` URLs
-
-As of Bun v1.1.13, you can also pass a `blob:` URL to `Worker`. This is useful for creating workers from strings or other sources.
-
-```js
-const blob = new Blob(
-  [
-    `
-    self.onmessage = (event: MessageEvent) => postMessage(event.data)`,
-  ],
-  {
-    type: "application/typescript",
-  },
-);
-const url = URL.createObjectURL(blob);
-const worker = new Worker(url);
-```
-
-Like the rest of Bun, workers created from `blob:` URLs support TypeScript, JSX, and other file types out of the box. You can communicate it should be loaded via typescript either via `type` or by passing a `filename` to the `File` constructor.
-
-```js
-const file = new File(
-  [
-    `
-    self.onmessage = (event: MessageEvent) => postMessage(event.data)`,
-  ],
-  "worker.ts",
-);
-const url = URL.createObjectURL(file);
-const worker = new Worker(url);
-```
-
 ### `"open"`

 The `"open"` event is emitted when a worker is created and ready to receive messages. This can be used to send an initial message to a worker once it's ready. (This event does not exist in browsers.)
````
````diff
@@ -563,12 +563,12 @@ Specifies the type of sourcemap to generate.
 await Bun.build({
   entrypoints: ['./index.tsx'],
   outdir: './out',
-  sourcemap: 'linked', // default 'none'
+  sourcemap: "external", // default "none"
 })
 ```

 ```bash#CLI
-$ bun build ./index.tsx --outdir ./out --sourcemap=linked
+$ bun build ./index.tsx --outdir ./out --sourcemap=external
 ```

 {% /codetabs %}
@@ -582,19 +582,19 @@ $ bun build ./index.tsx --outdir ./out --sourcemap=linked

 ---

 - `"linked"`
   - A separate `*.js.map` file is created alongside each `*.js` bundle using a `//# sourceMappingURL` comment to link the two. Requires `--outdir` to be set. The base URL of this can be customized with `--public-path`.
 - `"inline"`
   - A sourcemap is generated and appended to the end of the generated bundle as a base64 payload.

   ```ts
   // <bundled code here>

-  //# sourceMappingURL=bundle.js.map
+  //# sourceMappingURL=data:application/json;base64,<encoded sourcemap here>
   ```

 ---

 - `"external"`
-  - A separate `*.js.map` file is created alongside each `*.js` bundle without inserting a `//# sourceMappingURL` comment.
+  - A separate `*.js.map` file is created alongside each `*.js` bundle.

 {% /table %}
@@ -608,18 +608,7 @@ Generated bundles contain a [debug id](https://sentry.engineering/blog/the-case-
 //# debugId=<DEBUG ID>
 ```

 ---

-- `"inline"`
-  - A sourcemap is generated and appended to the end of the generated bundle as a base64 payload.
-
-  ```ts
-  // <bundled code here>
-
-  //# sourceMappingURL=data:application/json;base64,<encoded sourcemap here>
-  ```
-
 The associated `*.js.map` sourcemap will be a JSON file containing an equivalent `debugId` property.

 {% /callout %}
@@ -1257,7 +1246,7 @@ interface BuildOptions {
   loader?: { [k in string]: Loader }; // See https://bun.sh/docs/bundler/loaders
   manifest?: boolean; // false
   external?: string[]; // []
-  sourcemap?: "none" | "inline" | "linked" | "external" | boolean; // "none"
+  sourcemap?: "none" | "inline" | "external"; // "none"
   root?: string; // computed from entrypoints
   naming?:
     | string
````
````diff
@@ -35,10 +35,6 @@ $ bun add --optional lodash
 ## `--exact`

-{% callout %}
-**Alias** — `-E`
-{% /callout %}
-
 To add a package and pin to the resolved version, use `--exact`. This will resolve the version of the package and add it to your `package.json` with an exact version number instead of a version range.

 ```bash
@@ -121,16 +117,12 @@ Bun reads this field and will run lifecycle scripts for `my-trusted-package`.
 ## Git dependencies

-To add a dependency from a public or private git repository:
+To add a dependency from a git repository:

 ```bash
 $ bun add git@github.com:moment/moment.git
 ```

-{% callout %}
-**Note** — To install private repositories, your system needs the appropriate SSH credentials to access the repository.
-{% /callout %}
-
 Bun supports a variety of protocols, including [`github`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#github-urls), [`git`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#git-urls-as-dependencies), `git+ssh`, `git+https`, and many more.

 ```json
````
````diff
@@ -1,9 +0,0 @@
-An alias for `bun patch --commit` to maintain compatibility with pnpm.
-
-You must prepare the package for patching with [`bun patch <pkg>`](/docs/cli/patch) first.
-
-### `--patches-dir`
-
-By default, `bun patch-commit` will use the `patches` directory in the temporary directory.
-
-You can specify a different directory with the `--patches-dir` flag.
````

````diff
@@ -56,17 +56,3 @@ To clear Bun's global module cache:
 ```bash
 $ bun pm cache rm
 ```
-
-## List global installs
-
-To list all globally installed packages:
-
-```bash
-$ bun pm ls -g
-```
-
-To list all globally installed packages, including nth-order dependencies:
-
-```bash
-$ bun pm ls -g --all
-```
````
````diff
@@ -1,34 +1,17 @@
-To update all dependencies to the latest version:
+To update all dependencies to the latest version _that's compatible with the version range specified in your `package.json`_:

 ```sh
 $ bun update
 ```

-To update a specific dependency to the latest version:
+## `--force`
+
+{% callout %}
+**Alias** — `-f`
+{% /callout %}
+
+By default, Bun respects the version range defined in your package.json. To ignore this and update to the latest version, you can pass in the `force` flag.

 ```sh
-$ bun update [package]
+$ bun update --force
 ```

-## `--latest`
-
-By default, `bun update` will update to the latest version of a dependency that satisfies the version range specified in your `package.json`.
-
-To update to the latest version, regardless of if it's compatible with the current version range, use the `--latest` flag:
-
-```sh
-$ bun update --latest
-```
-
-For example, with the following `package.json`:
-
-```json
-{
-  "dependencies": {
-    "react": "^17.0.2"
-  }
-}
-```
-
-- `bun update` would update to a version that matches `17.x`.
-- `bun update --latest` would update to a version that matches `18.x` or later.
````
````diff
@@ -15,7 +15,7 @@ To _containerize_ our application, we define a `Dockerfile`. This file contains
 ```docker#Dockerfile
 # use the official Bun image
 # see all versions at https://hub.docker.com/r/oven/bun/tags
-FROM oven/bun:1 AS base
+FROM oven/bun:1 as base
 WORKDIR /usr/src/app

 # install dependencies into temp directory
````

````diff
@@ -69,7 +69,7 @@ export const movies = sqliteTable("movies", {
 We can use the `drizzle-kit` CLI to generate an initial SQL migration.

 ```sh
-$ bunx drizzle-kit generate --dialect sqlite --schema ./schema.ts
+$ bunx drizzle-kit generate:sqlite --schema ./schema.ts
 ```

 ---
````

````diff
@@ -20,7 +20,7 @@ $ bun add @neondatabase/serverless
 Create a `.env.local` file and add your [Neon Postgres connection string](https://neon.tech/docs/connect/connect-from-any-app) to it.

 ```sh
-DATABASE_URL=postgresql://username:password@ep-adj-noun-guid.us-east-1.aws.neon.tech/neondb?sslmode=require
+DATBASE_URL=postgresql://username:password@ep-adj-noun-guid.us-east-1.aws.neon.tech/neondb?sslmode=require
 ```

 ---
````
````diff
@@ -30,10 +30,9 @@ bun add express
 ```

 ---

-Define a simple server with Express:

-```ts#app.ts
+```app.ts
 import express from "express";

 const app = express();
@@ -49,7 +48,6 @@ app.listen(port, () => {
 ```

----

 Commit your changes and push to GitHub.

 ```bash
@@ -66,11 +64,11 @@ In your [Render Dashboard](https://dashboard.render.com/), click `New` > `Web Se
 In the Render UI, provide the following values during web service creation:

-| | |
-| ----------------- | ------------- |
-| **Runtime** | `Node` |
+| | |
+| ----------- | --------- |
+| **Runtime** | `Node` |
 | **Build Command** | `bun install` |
-| **Start Command** | `bun app.js` |
+| **Start Command** | `bun app.js` |

 ---
````

````diff
@@ -1,52 +0,0 @@
----
-name: Add Sentry to a Bun app
----
-
-[Sentry](https://sentry.io) is a developer-first error tracking and performance monitoring platform. Sentry has a first-class SDK for Bun, `@sentry/bun`, that instruments your Bun application to automatically collect error and performance data.
-
-Don't already have an account and Sentry project established? Head over to [sentry.io](https://sentry.io/signup/), then return to this page.
-
----
-
-To start using Sentry with Bun, first install the Sentry Bun SDK.
-
-```bash
-bun add @sentry/bun
-```
-
----
-
-Then, initialize the Sentry SDK with your Sentry DSN in your app's entry file. You can find your DSN in your Sentry project settings.
-
-```js
-import * as Sentry from "@sentry/bun";
-
-// Ensure to call this before importing any other modules!
-Sentry.init({
-  dsn: "__SENTRY_DSN__",
-
-  // Add Performance Monitoring by setting tracesSampleRate
-  // We recommend adjusting this value in production
-  tracesSampleRate: 1.0,
-});
-```
-
----
-
-You can verify that Sentry is working by capturing a test error:
-
-```js
-setTimeout(() => {
-  try {
-    foo();
-  } catch (e) {
-    Sentry.captureException(e);
-  }
-}, 99);
-```
-
----
-
-To view and resolve the recorded error, log into [sentry.io](https://sentry.io/) and open your project. Clicking on the error's title will open a page where you can see detailed information and mark it as resolved.
-
----
-
-To learn more about Sentry and using the Sentry Bun SDK, view the [Sentry documentation](https://docs.sentry.io/platforms/javascript/guides/bun).
````
@@ -13,7 +13,7 @@ console.log(Bun.argv);

Running this file with arguments results in the following:

```sh
$ bun run cli.ts --flag1 --flag2 value
[ '/path/to/bun', '/path/to/cli.ts', '--flag1', '--flag2', 'value' ]
```

@@ -47,7 +47,7 @@ console.log(positionals);

then it outputs

```
$ bun run cli.ts --flag1 --flag2 value
{
  flag1: true,
  flag2: "value",

@@ -16,7 +16,7 @@ await proc.exited;

The second argument accepts a configuration object.

```ts
const proc = Bun.spawn(["echo", "Hello, world!"], {
  cwd: "/tmp",
  env: { FOO: "bar" },
  onExit(proc, exitCode, signalCode, error) {

@@ -13,7 +13,7 @@ jobs:
    steps:
      # ...
      - uses: actions/checkout@v4
+     - uses: oven-sh/setup-bun@v2

      # run any `bun` or `bunx` command
+     - run: bun install
@@ -33,7 +33,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      # ...
      - uses: oven-sh/setup-bun@v2
+       with:
+         bun-version: 1.0.11 # or "latest", "canary", <sha>
```

@@ -1,5 +1,5 @@
---
name: Import HTML file as text
---

To import a `.html` file in Bun as a text file, use the `type: "text"` import attribute in the import statement.

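A minimal sketch of this, assuming a sibling `index.html` file (the `with { type: "text" }` import attribute is Bun-specific, so this runs under Bun only):

```ts
// index.html is loaded as a plain string rather than being parsed.
import html from "./index.html" with { type: "text" };

console.log(html); // the raw contents of index.html
```
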
@@ -52,7 +52,7 @@ Different thresholds can be set for line-level and function-level coverage.

```toml
[test]
# to set different thresholds for lines and functions
coverageThreshold = { lines = 0.5, functions = 0.7 }
```

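If a single threshold should apply to both metrics, the bunfig documentation also accepts a plain number (the value here is illustrative):

```toml
[test]
# fail `bun test --coverage` when total coverage drops below 90%
coverageThreshold = 0.9
```
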
---
@@ -22,7 +22,7 @@ Bun.serve<{ socketId: number }>({
  websocket: {
    // define websocket handlers
    async message(ws, message) {
      // the contextual data is available as the `data` property
      // on the WebSocket instance
      console.log(`Received ${message} from ${ws.data.socketId}`);
    },

@@ -1,6 +1,6 @@
All packages downloaded from the registry are stored in a global cache at `~/.bun/install/cache`. They are stored in subdirectories named like `${name}@${version}`, so multiple versions of a package can be cached.

{% details summary="Configuring cache behavior (bunfig.toml)" %}

```toml
[install.cache]
@@ -15,6 +15,8 @@ disable = false
disableManifest = false
```

{% /details %}

## Minimizing re-downloads

Bun strives to avoid re-downloading packages multiple times. When installing a package, if the cache already contains a version in the range specified by `package.json`, Bun will use the cached package instead of downloading it again.

@@ -1,69 +0,0 @@
Bun supports loading configuration options from [`.npmrc`](https://docs.npmjs.com/cli/v10/configuring-npm/npmrc) files, allowing you to reuse existing registry/scope configurations.

{% callout %}

**NOTE**: We recommend migrating your `.npmrc` file to Bun's [`bunfig.toml`](/docs/runtime/bunfig) format, as it provides more flexible options and lets you configure Bun-specific options.

{% /callout %}

# Supported options

### `registry`: Set the default registry

The default registry is used to resolve packages. Its default value is `npm`'s official registry (`https://registry.npmjs.org/`).

To change it, you can set the `registry` option in `.npmrc`:

```ini
registry=http://localhost:4873/
```

The equivalent `bunfig.toml` option is [`install.registry`](/docs/runtime/bunfig#install-registry):

```toml
install.registry = "http://localhost:4873/"
```

### `@<scope>:registry`: Set the registry for a specific scope

Allows you to set the registry for a specific scope:

```ini
@myorg:registry=http://localhost:4873/
```

The equivalent `bunfig.toml` option is to add a key in [`install.scopes`](/docs/runtime/bunfig#install-registry):

```toml
[install.scopes]
myorg = "http://localhost:4873/"
```

### `//<registry_url>/:<key>=<value>`: Configure options for a specific registry

Allows you to set options for a specific registry:

```ini
# set an auth token for the registry
# ${...} is a placeholder for environment variables
//http://localhost:4873/:_authToken=${NPM_TOKEN}

# or you could set a username and password
//http://localhost:4873/:username=myusername
//http://localhost:4873/:_password=${NPM_PASSWORD}
```

The following options are supported:

- `_authToken`
- `username`
- `_password`

The equivalent `bunfig.toml` option is to add a key in [`install.scopes`](/docs/runtime/bunfig#install-registry):

```toml
[install.scopes]
myorg = { url = "http://localhost:4873/", username = "myusername", password = "$NPM_PASSWORD" }
```

@@ -1,57 +0,0 @@
`bun patch` lets you persistently patch node_modules in a maintainable, git-friendly way.

Sometimes, you need to make a small change to a package in `node_modules/` to fix a bug or add a feature. `bun patch` makes it easy to do this without vendoring the entire package, and to reuse the patch across multiple installs, multiple projects, and multiple machines.

Features:

- Generates `.patch` files applied to dependencies in `node_modules` on install
- `.patch` files can be committed to your repository, reused across multiple installs, projects, and machines
- `"patchedDependencies"` in `package.json` keeps track of patched packages
- `bun patch` lets you patch packages in `node_modules/` while preserving the integrity of Bun's [Global Cache](https://bun.sh/docs/install/cache)
- Test your changes locally before committing them with `bun patch --commit <pkg>`
- To preserve disk space and keep `bun install` fast, patched packages are committed to the Global Cache and shared across projects where possible

#### Step 1. Prepare the package for patching

To get started, use `bun patch <pkg>` to prepare the package for patching:

```bash
# you can supply the package name
$ bun patch react

# ...and a precise version in case multiple versions are installed
$ bun patch react@17.0.2

# or the path to the package
$ bun patch node_modules/react
```

{% callout %}
**Note** — Don't forget to call `bun patch <pkg>`! This ensures the package folder in `node_modules/` contains a fresh copy of the package with no symlinks/hardlinks to Bun's cache.

If you forget to do this, you might end up editing the package globally in the cache!
{% /callout %}

#### Step 2. Test your changes locally

`bun patch <pkg>` makes it safe to edit the `<pkg>` in `node_modules/` directly, while preserving the integrity of Bun's [Global Cache](https://bun.sh/docs/install/cache). This works by re-creating an unlinked clone of the package in `node_modules/` and diffing it against the original package in the Global Cache.

#### Step 3. Commit your changes

Once you're happy with your changes, run `bun patch --commit <path or pkg>`.

Bun will generate a patch file in `patches/`, update your `package.json` and lockfile, and start using the patched package:

```bash
# you can supply the path to the patched package
$ bun patch --commit node_modules/react

# ...or the package name and optionally the version
$ bun patch --commit react@17.0.2

# choose the directory to store the patch files
$ bun patch --commit react --patches-dir=mypatches

# `patch-commit` is available for compatibility with pnpm
$ bun patch-commit react
```

@@ -30,6 +30,10 @@ $ docker pull oven/bun
$ docker run --rm --init --ulimit memlock=-1:-1 oven/bun
```

```bash#Proto
$ proto install bun
```

{% /codetabs %}

### Windows

@@ -140,8 +144,9 @@ $ bun upgrade
{% callout %}
**Homebrew users** — To avoid conflicts with Homebrew, use `brew upgrade bun` instead.

**Scoop users** — To avoid conflicts with Scoop, use `scoop update bun` instead.

**proto users** — Use `proto install bun --pin` instead.
{% /callout %}

## Canary builds

@@ -286,4 +291,8 @@ $ npm uninstall -g bun
$ brew uninstall bun
```

```bash#Proto
$ proto uninstall bun
```

{% /codetabs %}

@@ -193,13 +193,6 @@ export default {
  page("install/overrides", "Overrides and resolutions", {
    description: "Specify version ranges for nested dependencies",
  }),
  page("install/patch", "Patch dependencies", {
    description:
      "Patch dependencies in your project to fix bugs or add features without vendoring the entire package.",
  }),
  page("install/npmrc", ".npmrc support", {
    description: "Bun supports loading some configuration options from .npmrc",
  }),
  // page("install/utilities", "Utilities", {
  //   description: "Use `bun pm` to introspect your global module cache or project dependency tree.",
  // }),

@@ -1,7 +1,7 @@
There are four parts to the CI build:

- Dependencies: should be cached across builds as much as possible; depends on git submodule hashes
- Zig Object: depends on \*.zig and src/js
- C++ Object: depends on \*.cpp and src/js
- Linking: depends on the above three

@@ -15,7 +15,7 @@ BUN_DEPS_OUT_DIR="/optional/out/dir" bash ./scripts/all-dependencies.sh

## Zig Object

This does not have a dependency on WebKit or any of the other dependencies at all. It can be compiled without checking out submodules, but you will need to have run `bun install`. It can very easily be cross-compiled. Note that the zig object is always named `bun-zig.o`.

```sh
BUN_REPO=/path/to/oven-sh/bun
@@ -27,9 +27,9 @@ cmake $BUN_REPO \
  -DCMAKE_BUILD_TYPE=Release \
  -DCPU_TARGET="native" \
  -DZIG_TARGET="native" \
  -DBUN_ZIG_OBJ_DIR="./build"

ninja ./build/bun-zig.o
# -> bun-zig.o
```

@@ -60,12 +60,12 @@ cmake $BUN_REPO \
  -G Ninja \
  -DCMAKE_BUILD_TYPE=Release \
  -DBUN_LINK_ONLY=1 \
  -DBUN_ZIG_OBJ_DIR="/path/to/bun-zig-dir" \
  -DBUN_CPP_ARCHIVE="/path/to/bun-cpp-objects.a"

ninja

# optional:
# -DBUN_DEPS_OUT_DIR=... custom deps dir, use this to cache the built deps between rebuilds
# -DWEBKIT_DIR=... same thing, but it's probably fast enough to pull from github releases

@@ -1 +1 @@
../../LICENSE.md

@@ -1,10 +1,8 @@
---
name: Debugging
---

## Debugging JavaScript and TypeScript

Bun speaks the [WebKit Inspector Protocol](https://github.com/oven-sh/bun/blob/main/packages/bun-vscode/types/jsc.d.ts), so you can debug your code with an interactive debugger. For demonstration purposes, consider the following simple web server.

```ts#server.ts
Bun.serve({

@@ -90,236 +88,3 @@ Here's a cheat sheet explaining the functions of the control flow buttons.
- _Step out_ — If the current statement is a function call, the debugger will finish executing the call, then "step out" of the function to the location where it was called.

{% image src="https://github-production-user-asset-6210df.s3.amazonaws.com/3084745/261510346-6a94441c-75d3-413a-99a7-efa62365f83d.png" /%}

### Visual Studio Code Debugger

Experimental support for debugging Bun scripts is available in Visual Studio Code. To use it, you'll need to install the [Bun VSCode extension](https://bun.sh/guides/runtime/vscode-debugger).

## Debugging Network Requests

The `BUN_CONFIG_VERBOSE_FETCH` environment variable lets you log network requests made with `fetch()` or `node:http` automatically.

| Value   | Description                        |
| ------- | ---------------------------------- |
| `curl`  | Print requests as `curl` commands. |
| `true`  | Print request & response info.     |
| `false` | Don't print anything (default).    |

### Print fetch & node:http requests as curl commands

Bun also supports printing `fetch()` and `node:http` network requests as `curl` commands by setting the environment variable `BUN_CONFIG_VERBOSE_FETCH` to `curl`.

```ts
process.env.BUN_CONFIG_VERBOSE_FETCH = "curl";

await fetch("https://example.com", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ foo: "bar" }),
});
```

This prints the `fetch` request as a single-line `curl` command so you can copy-paste it into your terminal to replicate the request.

```sh
[fetch] $ curl --http1.1 "https://example.com/" -X POST -H "content-type: application/json" -H "Connection: keep-alive" -H "User-Agent: Bun/1.1.14" -H "Accept: */*" -H "Host: example.com" -H "Accept-Encoding: gzip, deflate, br" --compressed -H "Content-Length: 13" --data-raw "{\"foo\":\"bar\"}"
[fetch] > HTTP/1.1 POST https://example.com/
[fetch] > content-type: application/json
[fetch] > Connection: keep-alive
[fetch] > User-Agent: Bun/1.1.14
[fetch] > Accept: */*
[fetch] > Host: example.com
[fetch] > Accept-Encoding: gzip, deflate, br
[fetch] > Content-Length: 13

[fetch] < 200 OK
[fetch] < Accept-Ranges: bytes
[fetch] < Cache-Control: max-age=604800
[fetch] < Content-Type: text/html; charset=UTF-8
[fetch] < Date: Tue, 18 Jun 2024 05:12:07 GMT
[fetch] < Etag: "3147526947"
[fetch] < Expires: Tue, 25 Jun 2024 05:12:07 GMT
[fetch] < Last-Modified: Thu, 17 Oct 2019 07:18:26 GMT
[fetch] < Server: EOS (vny/044F)
[fetch] < Content-Length: 1256
```

The lines with `[fetch] >` are the request from your local code, and the lines with `[fetch] <` are the response from the remote server.

The `BUN_CONFIG_VERBOSE_FETCH` environment variable is supported in both `fetch()` and `node:http` requests, so it should just work.

To print without the `curl` command, set `BUN_CONFIG_VERBOSE_FETCH` to `true`.

```ts
process.env.BUN_CONFIG_VERBOSE_FETCH = "true";

await fetch("https://example.com", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ foo: "bar" }),
});
```

This prints the following to the console:

```sh
[fetch] > HTTP/1.1 POST https://example.com/
[fetch] > content-type: application/json
[fetch] > Connection: keep-alive
[fetch] > User-Agent: Bun/1.1.14
[fetch] > Accept: */*
[fetch] > Host: example.com
[fetch] > Accept-Encoding: gzip, deflate, br
[fetch] > Content-Length: 13

[fetch] < 200 OK
[fetch] < Accept-Ranges: bytes
[fetch] < Cache-Control: max-age=604800
[fetch] < Content-Type: text/html; charset=UTF-8
[fetch] < Date: Tue, 18 Jun 2024 05:12:07 GMT
[fetch] < Etag: "3147526947"
[fetch] < Expires: Tue, 25 Jun 2024 05:12:07 GMT
[fetch] < Last-Modified: Thu, 17 Oct 2019 07:18:26 GMT
[fetch] < Server: EOS (vny/044F)
[fetch] < Content-Length: 1256
```

## Stacktraces & sourcemaps

Bun transpiles every file, which sounds like it would mean that the stack traces you see in the console would unhelpfully point to the transpiled output. To address this, Bun automatically generates and serves sourcemapped files for every file it transpiles. When you see a stack trace in the console, you can click on the file path and be taken to the original source code, even though it was written in TypeScript or JSX, or has some other transformation applied.

<!-- TODO: uncomment once v1.1.13 regression is fixed (cc @paperdave) -->
<!-- In Bun, each `Error` object gets four additional properties:

- `line` — the source-mapped line number. This number points to the input source code, not the transpiled output.
- `column` — the source-mapped column number. This number points to the input source code, not the transpiled output.
- `originalColumn` — the column number pointing to transpiled source code, without sourcemaps. This number comes from JavaScriptCore.
- `originalLine` — the line number pointing to transpiled source code, without sourcemaps. This number comes from JavaScriptCore.

These properties are populated lazily when `error.stack` is accessed. -->

Bun automatically loads sourcemaps both at runtime when transpiling files on-demand, and when using `bun build` to precompile files ahead of time.

### Syntax-highlighted source code preview

To help with debugging, Bun automatically prints a small source-code preview when an unhandled exception or rejection occurs. You can simulate this behavior by calling `Bun.inspect(error)`:

```ts
// Create an error
const err = new Error("Something went wrong");
console.log(Bun.inspect(err, { colors: true }));
```

This prints a syntax-highlighted preview of the source code where the error occurred, along with the error message and stack trace.

```js
1 | // Create an error
2 | const err = new Error("Something went wrong");
                ^
error: Something went wrong
      at file.js:2:13
```

### V8 Stack Traces

Bun uses JavaScriptCore as its engine, but much of the Node.js ecosystem & npm expects V8. JavaScript engines differ in `error.stack` formatting. Bun intends to be a drop-in replacement for Node.js, and that means it's our job to make sure that even though the engine is different, the stack traces are as similar as possible.

That's why when you log `error.stack` in Bun, the formatting of `error.stack` is the same as in Node.js's V8 engine. This is especially useful when you're using libraries that expect V8 stack traces.

#### V8 Stack Trace API

Bun implements the [V8 Stack Trace API](https://v8.dev/docs/stack-trace-api), which is a set of functions that allow you to manipulate stack traces.

##### Error.prepareStackTrace

The `Error.prepareStackTrace` function is a global function that lets you customize the stack trace output. This function is called with the error object and an array of `CallSite` objects and lets you return a custom stack trace.

```ts
Error.prepareStackTrace = (err, stack) => {
  return stack.map(callSite => {
    return callSite.getFileName();
  });
};

const err = new Error("Something went wrong");
console.log(err.stack);
// [ "error.js" ]
```

The `CallSite` object has the following methods:

| Method                     | Returns                                                |
| -------------------------- | ------------------------------------------------------ |
| `getThis`                  | `this` value of the function call                      |
| `getTypeName`              | typeof `this`                                          |
| `getFunction`              | function object                                        |
| `getFunctionName`          | function name as a string                              |
| `getMethodName`            | method name as a string                                |
| `getFileName`              | file name or URL                                       |
| `getLineNumber`            | line number                                            |
| `getColumnNumber`          | column number                                          |
| `getEvalOrigin`            | `undefined`                                            |
| `getScriptNameOrSourceURL` | source URL                                             |
| `isToplevel`               | returns `true` if the function is in the global scope  |
| `isEval`                   | returns `true` if the function is an `eval` call       |
| `isNative`                 | returns `true` if the function is native               |
| `isConstructor`            | returns `true` if the function is a constructor        |
| `isAsync`                  | returns `true` if the function is `async`              |
| `isPromiseAll`             | Not implemented yet.                                   |
| `getPromiseIndex`          | Not implemented yet.                                   |
| `toString`                 | returns a string representation of the call site       |

In some cases, the `Function` object may have already been garbage collected, so some of these methods may return `undefined`.


##### Error.captureStackTrace(error, startFn)

The `Error.captureStackTrace` function lets you capture a stack trace at a specific point in your code, rather than at the point where the error was thrown.

This can be helpful when you have callbacks or asynchronous code that makes it difficult to determine where an error originated. The 2nd argument to `Error.captureStackTrace` is the function where you want the stack trace to start.

For example, the below code will make `err.stack` point to the code calling `fn()`, even though the error was thrown at `myInner`.

```ts
const fn = () => {
  function myInner() {
    throw new Error("here!");
  }

  try {
    myInner();
  } catch (err) {
    console.log(err.stack);
    console.log("");
    console.log("-- captureStackTrace --");
    console.log("");
    Error.captureStackTrace(err, fn);
    console.log(err.stack);
  }
};

fn();
```

This logs the following:

```sh
Error: here!
    at myInner (file.js:4:15)
    at fn (file.js:8:5)
    at module code (file.js:17:1)
    at moduleEvaluation (native)
    at moduleEvaluation (native)
    at <anonymous> (native)

-- captureStackTrace --

Error: here!
    at module code (file.js:17:1)
    at moduleEvaluation (native)
    at moduleEvaluation (native)
    at <anonymous> (native)
```

@@ -143,16 +143,6 @@ These environment variables are read by Bun and configure aspects of its behavio

---

- `NODE_TLS_REJECT_UNAUTHORIZED`
  - `NODE_TLS_REJECT_UNAUTHORIZED=0` disables SSL certificate validation. This is useful for testing and debugging, but you should be very hesitant to use this in production. Note: This environment variable was originally introduced by Node.js and we kept the name for compatibility.

---

- `BUN_CONFIG_VERBOSE_FETCH`
  - If `BUN_CONFIG_VERBOSE_FETCH=curl`, then fetch requests will log the URL, method, request headers, and response headers to the console. This is useful for debugging network requests. This also works with `node:http`. `BUN_CONFIG_VERBOSE_FETCH=1` is equivalent to `BUN_CONFIG_VERBOSE_FETCH=curl`, except without the `curl` output.

---

- `BUN_RUNTIME_TRANSPILER_CACHE_PATH`
  - The runtime transpiler caches the transpiled output of source files larger than 50 KB. This makes CLIs using Bun load faster. If `BUN_RUNTIME_TRANSPILER_CACHE_PATH` is set, the runtime transpiler will cache transpiled output to the specified directory. If it is set to an empty string or the string `"0"`, the runtime transpiler will not cache transpiled output. If it is unset, the runtime transpiler will cache transpiled output to the platform-specific cache directory.

@@ -179,7 +169,7 @@ These environment variables are read by Bun and configure aspects of its behavio

---

- `BUN_CONFIG_NO_CLEAR_TERMINAL_ON_RELOAD`
  - If `BUN_CONFIG_NO_CLEAR_TERMINAL_ON_RELOAD=true`, then `bun --watch` will not clear the console on reload

---

@@ -48,6 +48,14 @@ In this case, we are importing from `./hello`, a relative path with no extension

- `./hello/index.cjs`
- `./hello/index.json`

Import paths are case-insensitive, meaning these are all valid imports:

```ts#index.ts
import { hello } from "./hello";
import { hello } from "./HELLO";
import { hello } from "./hElLo";
```

Import paths can optionally include extensions. If an extension is present, Bun will only check for a file with that exact extension.

```ts#index.ts

@@ -18,7 +18,7 @@ This page is updated regularly to reflect compatibility status of the latest ver

### [`node:child_process`](https://nodejs.org/api/child_process.html)

🟡 Missing `proc.gid` `proc.uid`. `Stream` class not exported. IPC cannot send socket handles. Node.js <> Bun IPC can be used with JSON serialization.

### [`node:cluster`](https://nodejs.org/api/cluster.html)

@@ -53,7 +53,7 @@ Some methods are not optimized yet.

### [`node:events`](https://nodejs.org/api/events.html)

🟡 `events.addAbortListener` & `events.getMaxListeners` do not support (web api) `EventTarget`

### [`node:fs`](https://nodejs.org/api/fs.html)

@@ -61,7 +61,7 @@ Some methods are not optimized yet.

### [`node:http`](https://nodejs.org/api/http.html)

🟢 Fully implemented. Outgoing client request body is currently buffered instead of streamed.

### [`node:http2`](https://nodejs.org/api/http2.html)

@@ -69,7 +69,7 @@ Some methods are not optimized yet.

### [`node:https`](https://nodejs.org/api/https.html)

🟢 APIs are implemented, but `Agent` is not always used yet.

### [`node:inspector`](https://nodejs.org/api/inspector.html)

@@ -169,7 +169,7 @@ Some methods are not optimized yet.

### [`node:worker_threads`](https://nodejs.org/api/worker_threads.html)

🟡 `Worker` doesn't support the following options: `stdin` `stdout` `stderr` `trackedUnmanagedFds` `resourceLimits`. Missing `markAsUntransferable` `moveMessagePortToContext` `getHeapSnapshot`.

### [`node:zlib`](https://nodejs.org/api/zlib.html)

@@ -193,7 +193,7 @@ The table below lists all globals implemented by Node.js and Bun's current compa

### [`Buffer`](https://nodejs.org/api/buffer.html#class-buffer)

🟢 Fully implemented.

### [`ByteLengthQueuingStrategy`](https://developer.mozilla.org/en-US/docs/Web/API/ByteLengthQueuingStrategy)

@@ -433,7 +433,7 @@ The table below lists all globals implemented by Node.js and Bun's current compa

### [`URL`](https://developer.mozilla.org/en-US/docs/Web/API/URL)

🟢 Fully implemented.

### [`URLSearchParams`](https://developer.mozilla.org/en-US/docs/Web/API/URLSearchParams)

@@ -418,7 +418,7 @@ For cross-platform compatibility, Bun Shell implements a set of builtin commands

**Not** implemented yet, but planned:

- See [Issue #9716](https://github.com/oven-sh/bun/issues/9716) for the full list.

## Utilities

@@ -63,29 +63,3 @@ Internally, Bun transpiles all files by default, so Bun automatically generates

[test]
coverageIgnoreSourcemaps = true # default false
```

### Coverage reporters

By default, coverage reports will be printed to the console.

For persistent code coverage reports in CI environments and for other tools, you can pass the `--coverage-reporter=lcov` CLI option or set the `coverageReporter` option in `bunfig.toml`.

```toml
[test]
coverageReporter = ["text", "lcov"]  # default ["text"]
coverageDir = "path/to/somewhere"    # default "coverage"
```

| Reporter | Description                                                                  |
| -------- | ---------------------------------------------------------------------------- |
| `text`   | Prints a text summary of the coverage to the console.                        |
| `lcov`   | Save coverage in [lcov](https://github.com/linux-test-project/lcov) format.  |

#### lcov coverage reporter

To generate an lcov report, you can use the `lcov` reporter. This will generate an `lcov.info` file in the `coverage` directory.

```toml
[test]
coverageReporter = "lcov"
```

@@ -305,30 +305,6 @@ Bun implements the following matchers. Full Jest compatibility is on the roadmap

---

- ✅
- [`.toContainAllKeys()`](https://jest-extended.jestcommunity.dev/docs/matchers/Object#tocontainallkeyskeys)

---

- ✅
- [`.toContainValue()`](https://jest-extended.jestcommunity.dev/docs/matchers/Object#tocontainvaluevalue)

---

- ✅
- [`.toContainValues()`](https://jest-extended.jestcommunity.dev/docs/matchers/Object#tocontainvaluesvalues)

---

- ✅
- [`.toContainAllValues()`](https://jest-extended.jestcommunity.dev/docs/matchers/Object#tocontainallvaluesvalues)

---

- ✅
- [`.toContainAnyValues()`](https://jest-extended.jestcommunity.dev/docs/matchers/Object#tocontainanyvaluesvalues)

---

- ✅
- [`.toStrictEqual()`](https://jestjs.io/docs/expect#tostrictequalvalue)

@@ -3,7 +3,7 @@ const std = @import("std");
 const CompressionFramework = struct {
     var handle: ?*anyopaque = null;
     pub fn load() !void {
-        handle = std.posix.darwin.dlopen("libcompression.dylib", 1);
+        handle = std.os.darwin.dlopen("libcompression.dylib", 1);

         if (handle == null)
             return error.@"failed to load Compression.framework";

@@ -247,7 +247,7 @@ pub fn main() anyerror!void {

     if (algorithm == null or operation == null) {
         try std.io.getStdErr().writer().print("to compress: {s} ./file ./out.{{br,gz,lz4,lzfse}}\nto decompress: {s} ./out.{{br,gz,lz4,lzfse}} ./out\n", .{ argv0, argv0 });
-        std.posix.exit(1);
+        std.os.exit(1);
     }

     var output_file: std.fs.File = undefined;

@@ -182,7 +182,7 @@ pub fn main() anyerror!void {

     try channel.buffer.ensureTotalCapacity(1);

-    HTTPThread.init();
+    try HTTPThread.init();

     var ctx = try default_allocator.create(HTTP.HTTPChannelContext);
     ctx.* = .{

@@ -198,7 +198,7 @@ pub fn main() anyerror!void {
     try channel.buffer.ensureTotalCapacity(args.count);

     try NetworkThread.init();
-    if (args.concurrency > 0) HTTP.AsyncHTTP.max_simultaneous_requests.store(args.concurrency, .monotonic);
+    if (args.concurrency > 0) HTTP.AsyncHTTP.max_simultaneous_requests.store(args.concurrency, .Monotonic);
     const Group = struct {
         response_body: MutableString = undefined,
         context: HTTP.HTTPChannelContext = undefined,

@@ -126,11 +126,11 @@ pub fn main() anyerror!void {
     Output.prettyErrorln("For {d} messages and {d} threads:", .{ count, thread_count });
     Output.flush();
     defer Output.flush();
-    const runs = if (std.posix.getenv("RUNS")) |run_count| try std.fmt.parseInt(usize, run_count, 10) else 1;
+    const runs = if (std.os.getenv("RUNS")) |run_count| try std.fmt.parseInt(usize, run_count, 10) else 1;

-    if (std.posix.getenv("NO_MACH") == null)
+    if (std.os.getenv("NO_MACH") == null)
         try machMain(runs);

-    if (std.posix.getenv("NO_USER") == null)
+    if (std.os.getenv("NO_USER") == null)
         try userMain(runs);
 }

@@ -42,11 +42,11 @@ pub fn main() anyerror!void {
         .loose,
     );
     joined_buf[joined.len] = 0;
-    const os = std.posix;
+    const os = std.os;
     const joined_z: [:0]const u8 = joined_buf[0..joined.len :0];
-    const O_PATH = if (@hasDecl(bun.O, "PATH")) bun.O.PATH else 0;
+    const O_PATH = if (@hasDecl(os.O, "PATH")) os.O.PATH else 0;

-    var file = std.posix.openZ(joined_z, O_PATH | bun.O.CLOEXEC, 0) catch |err| {
+    var file = std.os.openZ(joined_z, O_PATH | std.os.O.CLOEXEC, 0) catch |err| {
         switch (err) {
             error.NotDir, error.FileNotFound => {
                 Output.prettyError("<r><red>404 Not Found<r>: <b>\"{s}\"<r>", .{joined_z});

@@ -32,7 +32,7 @@ pub fn main() anyerror!void {

     var j: usize = 0;
     while (j < 1000) : (j += 1) {
-        path = try std.posix.realpathZ(to_resolve, &out_buffer);
+        path = try std.os.realpathZ(to_resolve, &out_buffer);
     }

     Output.print("{s}", .{path});

@@ -88,9 +88,8 @@ pub fn main() anyerror!void {
         null,
         void,
         void{},
         .{
             .depth_to_skip = 1,
             .close_handles = false,
         },
         1,
         false,
         false,
     );
 }

package.json (37 lines changed)
@@ -5,23 +5,23 @@
     "./packages/bun-types"
   ],
   "dependencies": {
-    "@vscode/debugadapter": "^1.65.0",
-    "esbuild": "^0.21.4",
-    "eslint": "^9.4.0",
-    "eslint-config-prettier": "^9.1.0",
-    "mitata": "^0.1.11",
+    "@vscode/debugadapter": "^1.61.0",
+    "esbuild": "^0.17.15",
+    "eslint": "^8.20.0",
+    "eslint-config-prettier": "^8.5.0",
+    "mitata": "^0.1.3",
     "peechy": "0.4.34",
     "prettier": "^3.2.5",
-    "react": "^18.3.1",
-    "react-dom": "^18.3.1",
-    "source-map-js": "^1.2.0",
-    "typescript": "^5.4.5"
+    "react": "^18.2.0",
+    "react-dom": "^18.2.0",
+    "source-map-js": "^1.0.2",
+    "typescript": "^5.0.2"
   },
   "devDependencies": {
-    "@types/bun": "^1.1.3",
-    "@types/react": "^18.3.3",
-    "@typescript-eslint/eslint-plugin": "^7.11.0",
-    "@typescript-eslint/parser": "^7.11.0"
+    "@types/bun": "^1.1.2",
+    "@types/react": "^18.0.25",
+    "@typescript-eslint/eslint-plugin": "^5.31.0",
+    "@typescript-eslint/parser": "^5.31.0"
   },
   "resolutions": {
     "bun-types": "workspace:packages/bun-types"

@@ -34,8 +34,6 @@
     "build:tidy": "BUN_SILENT=1 cmake --log-level=WARNING . -DZIG_OPTIMIZE=Debug -DUSE_DEBUG_JSC=ON -DBUN_TIDY_ONLY=ON -DCMAKE_BUILD_TYPE=Debug -GNinja -Bbuild-tidy >> ${GITHUB_STEP_SUMMARY:-/dev/stdout} && BUN_SILENT=1 ninja -Cbuild-tidy >> ${GITHUB_STEP_SUMMARY:-/dev/stdout}",
     "build:tidy-extra": "cmake . -DZIG_OPTIMIZE=Debug -DUSE_DEBUG_JSC=ON -DBUN_TIDY_ONLY_EXTRA=ON -DCMAKE_BUILD_TYPE=Debug -GNinja -Bbuild-tidy && ninja -Cbuild-tidy",
     "build:release": "cmake . -DCMAKE_BUILD_TYPE=Release -GNinja -Bbuild-release && ninja -Cbuild-release",
     "build:release:local": "cmake . -DCMAKE_BUILD_TYPE=Release -DWEBKIT_DIR=$(pwd)/src/bun.js/WebKit/WebKitBuild/Release -GNinja -Bbuild-release-local && ninja -Cbuild-release-local",
     "build:release:with_logs": "cmake . -DCMAKE_BUILD_TYPE=Release -DENABLE_LOGS=true -GNinja -Bbuild-release && ninja -Cbuild-release",
     "build:debug-zig-release": "cmake . -DCMAKE_BUILD_TYPE=Release -DZIG_OPTIMIZE=Debug -GNinja -Bbuild-debug-zig-release && ninja -Cbuild-debug-zig-release",
     "build:safe": "cmake . -DZIG_OPTIMIZE=ReleaseSafe -DUSE_DEBUG_JSC=ON -DCMAKE_BUILD_TYPE=Release -GNinja -Bbuild-safe && ninja -Cbuild-safe",
     "build:windows": "cmake -B build -S . -G Ninja -DCMAKE_BUILD_TYPE=Debug && ninja -Cbuild",

@@ -44,12 +42,7 @@
     "fmt:zig": "zig fmt src/*.zig src/*/*.zig src/*/*/*.zig src/*/*/*/*.zig",
     "lint": "eslint './**/*.d.ts' --cache",
     "lint:fix": "eslint './**/*.d.ts' --cache --fix",
-    "test": "node scripts/runner.node.mjs ./build/bun-debug",
-    "test:release": "node scripts/runner.node.mjs ./build-release/bun",
-    "banned": "bun packages/bun-internal-test/src/linter.ts",
-    "zig-check": ".cache/zig/zig.exe build check --summary new",
-    "zig-check-all": ".cache/zig/zig.exe build check-all --summary new",
-    "zig-check-windows": ".cache/zig/zig.exe build check-windows --summary new",
-    "zig": ".cache/zig/zig.exe "
+    "test": "node packages/bun-internal-test/src/runner.node.mjs ./build/bun-debug",
+    "test:release": "node packages/bun-internal-test/src/runner.node.mjs ./build-release/bun"
   }
 }

@@ -511,7 +511,13 @@ const SourceLines = ({
   );
 };

-const BuildErrorSourceLines = ({ location, filename }: { location: Location; filename: string }) => {
+const BuildErrorSourceLines = ({
+  location,
+  filename,
+}: {
+  location: Location;
+  filename: string;
+}) => {
   const { line, line_text, column } = location;
   const sourceLines: SourceLine[] = [{ line, text: line_text }];
   const buildURL = React.useCallback((line, column) => srcFileURL(filename, line, column), [srcFileURL, filename]);

@@ -606,7 +612,7 @@ const NativeStackFrame = ({
   const {
     file,
     function_name: functionName,
-    position: { line, column },
+    position: { line, column_start: column },
     scope,
   } = frame;
   const fileName = normalizedFilename(file, cwd);

@@ -683,21 +689,21 @@ const NativeStackTrace = ({
   return (
     <div ref={ref} className={`BunError-NativeStackTrace`}>
       <a
-        href={urlBuilder(filename, position.line, position.column)}
+        href={urlBuilder(filename, position.line, position.column_start)}
         data-line={position.line}
-        data-column={position.column}
+        data-column={position.column_start}
         data-is-client="true"
         target="_blank"
         onClick={openWithoutFlashOfNewTab}
         className="BunError-NativeStackTrace-filename"
       >
-        (unknown):{position.line}:{position.column}
+        (unknown):{position.line}:{position.column_start}
       </a>
       {sourceLines.length > 0 && (
         <SourceLines
           highlight={position.line}
           sourceLines={sourceLines}
-          highlightColumnStart={position.column}
+          highlightColumnStart={position.column_start}
           buildURL={buildURL}
           highlightColumnEnd={position.column_stop}
         >

@@ -709,7 +715,7 @@ const NativeStackTrace = ({
           highlight={position.line}
           sourceLines={sourceLines}
           setSourceLines={setSourceLines}
-          highlightColumnStart={position.column}
+          highlightColumnStart={position.column_start}
           buildURL={buildURL}
           highlightColumnEnd={position.column_stop}
         >

@@ -731,7 +737,13 @@ const Indent = ({ by, children }) => {
   );
 };

-const JSException = ({ value, isClient = false }: { value: JSExceptionType; isClient: boolean }) => {
+const JSException = ({
+  value,
+  isClient = false,
+}: {
+  value: JSExceptionType;
+  isClient: boolean;
+}) => {
   const tag = isClient ? ErrorTagType.client : ErrorTagType.server;
   const [sourceLines, _setSourceLines] = React.useState(value?.stack?.source_lines ?? []);
   var message = value.message || "";

@@ -779,7 +791,7 @@ const JSException = ({ value, isClient = false }: { value: JSExceptionType; isCl
           sourceLines={sourceLines}
           setSourceLines={setSourceLines}
         >
-          <Indent by={value.stack.frames[0].position.column}>
+          <Indent by={value.stack.frames[0].position.column_start}>
             <span className="BunError-error-typename">{fancyTypeError.runtimeTypeName}</span>
           </Indent>
         </NativeStackTrace>

@@ -841,7 +853,13 @@ const JSException = ({ value, isClient = false }: { value: JSExceptionType; isCl
   }
 };

-const Summary = ({ errorCount, onClose }: { errorCount: number; onClose: () => void }) => {
+const Summary = ({
+  errorCount,
+  onClose,
+}: {
+  errorCount: number;
+  onClose: () => void;
+}) => {
   return (
     <div className="BunError-Summary">
       <div className="BunError-Summary-ErrorIcon"></div>

@@ -983,7 +1001,11 @@ const Footer = ({ toMarkdown, data }) => (
   </div>
 );

-const BuildFailureMessageContainer = ({ messages }: { messages: Message[] }) => {
+const BuildFailureMessageContainer = ({
+  messages,
+}: {
+  messages: Message[];
+}) => {
   return (
     <div id="BunErrorOverlay-container">
       <div className="BunError-content">

@@ -1131,14 +1153,14 @@ export function renderRuntimeError(error: Error) {
         file: error[fileNameProperty] || "",
         position: {
           line: +error[lineNumberProperty] || 1,
-          column: +error[columnNumberProperty] || 1,
+          column_start: +error[columnNumberProperty] || 1,
         },
       } as StackFrame);
     } else if (exception.stack && exception.stack.frames.length > 0) {
       exception.stack.frames[0].position.line = error[lineNumberProperty];

       if (Number.isFinite(error[columnNumberProperty])) {
-        exception.stack.frames[0].position.column = error[columnNumberProperty];
+        exception.stack.frames[0].position.column_start = error[columnNumberProperty];
       }
     }
   }

@@ -1192,27 +1214,27 @@ export function renderRuntimeError(error: Error) {
       }
       var frame = exception.stack.frames[frameIndex];

-      const { line, column } = frame.position;
-      const remapped = remapPosition(mappings, line, column);
+      const { line, column_start } = frame.position;
+      const remapped = remapPosition(mappings, line, column_start);
       if (!remapped) return null;
       frame.position.line_start = frame.position.line = remapped[0];
       frame.position.column_stop =
         frame.position.expression_stop =
         frame.position.expression_start =
-          frame.position.column =
+          frame.position.column_start =
             remapped[1];
     }, console.error);
   } else {
     if (!mappings) return null;
     var frame = exception.stack.frames[frameIndex];
-    const { line, column } = frame.position;
-    const remapped = remapPosition(mappings, line, column);
+    const { line, column_start } = frame.position;
+    const remapped = remapPosition(mappings, line, column_start);
     if (!remapped) return null;
     frame.position.line_start = frame.position.line = remapped[0];
     frame.position.column_stop =
       frame.position.expression_stop =
       frame.position.expression_start =
-        frame.position.column =
+        frame.position.column_start =
           remapped[1];
   }
 });

@@ -1518,10 +1518,7 @@
       "id": "EventMetadata",
       "description": "A key-value pair for additional event information to pass along.",
       "type": "object",
-      "properties": [
-        { "name": "key", "type": "string" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "key", "type": "string" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "BackgroundServiceEvent",

@@ -1573,10 +1570,7 @@
     {
       "name": "setRecording",
       "description": "Set the recording state for the service.",
-      "parameters": [
-        { "name": "shouldRecord", "type": "boolean" },
-        { "name": "service", "$ref": "ServiceName" }
-      ]
+      "parameters": [{ "name": "shouldRecord", "type": "boolean" }, { "name": "service", "$ref": "ServiceName" }]
     },
     {
       "name": "clearEvents",

@@ -1588,10 +1582,7 @@
     {
       "name": "recordingStateChanged",
       "description": "Called when the recording state for the service has been updated.",
-      "parameters": [
-        { "name": "isRecording", "type": "boolean" },
-        { "name": "service", "$ref": "ServiceName" }
-      ]
+      "parameters": [{ "name": "isRecording", "type": "boolean" }, { "name": "service", "$ref": "ServiceName" }]
     },
     {
       "name": "backgroundServiceEventReceived",

@@ -2081,10 +2072,7 @@
     {
       "id": "Header",
       "type": "object",
-      "properties": [
-        { "name": "name", "type": "string" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "name", "type": "string" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "CachedResponse",

@@ -3454,10 +3442,7 @@
     {
       "name": "setStyleSheetText",
       "description": "Sets the new stylesheet text.",
-      "parameters": [
-        { "name": "styleSheetId", "$ref": "StyleSheetId" },
-        { "name": "text", "type": "string" }
-      ],
+      "parameters": [{ "name": "styleSheetId", "$ref": "StyleSheetId" }, { "name": "text", "type": "string" }],
       "returns": [
         {
           "name": "sourceMapURL",

@@ -3582,10 +3567,7 @@
     },
     {
       "name": "executeSQL",
-      "parameters": [
-        { "name": "databaseId", "$ref": "DatabaseId" },
-        { "name": "query", "type": "string" }
-      ],
+      "parameters": [{ "name": "databaseId", "$ref": "DatabaseId" }, { "name": "query", "type": "string" }],
       "returns": [
         { "name": "columnNames", "optional": true, "type": "array", "items": { "type": "string" } },
         { "name": "values", "optional": true, "type": "array", "items": { "type": "any" } },

@@ -3626,10 +3608,7 @@
     {
       "name": "selectPrompt",
       "description": "Select a device in response to a DeviceAccess.deviceRequestPrompted event.",
-      "parameters": [
-        { "name": "id", "$ref": "RequestId" },
-        { "name": "deviceId", "$ref": "DeviceId" }
-      ]
+      "parameters": [{ "name": "id", "$ref": "RequestId" }, { "name": "deviceId", "$ref": "DeviceId" }]
     },
     {
       "name": "cancelPrompt",

@@ -5677,10 +5656,7 @@
     },
     {
       "name": "removeDOMStorageItem",
-      "parameters": [
-        { "name": "storageId", "$ref": "StorageId" },
-        { "name": "key", "type": "string" }
-      ]
+      "parameters": [{ "name": "storageId", "$ref": "StorageId" }, { "name": "key", "type": "string" }]
     },
     {
       "name": "setDOMStorageItem",

@@ -5702,10 +5678,7 @@
     },
     {
       "name": "domStorageItemRemoved",
-      "parameters": [
-        { "name": "storageId", "$ref": "StorageId" },
-        { "name": "key", "type": "string" }
-      ]
+      "parameters": [{ "name": "storageId", "$ref": "StorageId" }, { "name": "key", "type": "string" }]
     },
     {
       "name": "domStorageItemUpdated",

@@ -5775,10 +5748,7 @@
     {
       "id": "MediaFeature",
       "type": "object",
-      "properties": [
-        { "name": "name", "type": "string" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "name", "type": "string" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "VirtualTimePolicy",

@@ -5792,10 +5762,7 @@
       "description": "Used to specify User Agent Cient Hints to emulate. See https://wicg.github.io/ua-client-hints",
       "experimental": true,
       "type": "object",
-      "properties": [
-        { "name": "brand", "type": "string" },
-        { "name": "version", "type": "string" }
-      ]
+      "properties": [{ "name": "brand", "type": "string" }, { "name": "version", "type": "string" }]
     },
     {
       "id": "UserAgentMetadata",

@@ -6153,10 +6120,7 @@
       "name": "setSensorOverrideReadings",
       "description": "Updates the sensor readings reported by a sensor type previously overriden\nby setSensorOverrideEnabled.",
       "experimental": true,
-      "parameters": [
-        { "name": "type", "$ref": "SensorType" },
-        { "name": "reading", "$ref": "SensorReading" }
-      ]
+      "parameters": [{ "name": "type", "$ref": "SensorType" }, { "name": "reading", "$ref": "SensorReading" }]
     },
     {
       "name": "setIdleOverride",

@@ -6441,17 +6405,11 @@
     { "name": "disable" },
     {
       "name": "selectAccount",
-      "parameters": [
-        { "name": "dialogId", "type": "string" },
-        { "name": "accountIndex", "type": "integer" }
-      ]
+      "parameters": [{ "name": "dialogId", "type": "string" }, { "name": "accountIndex", "type": "integer" }]
     },
     {
       "name": "clickDialogButton",
-      "parameters": [
-        { "name": "dialogId", "type": "string" },
-        { "name": "dialogButton", "$ref": "DialogButton" }
-      ]
+      "parameters": [{ "name": "dialogId", "type": "string" }, { "name": "dialogButton", "$ref": "DialogButton" }]
     },
     {
       "name": "dismissDialog",

@@ -6506,10 +6464,7 @@
       "id": "HeaderEntry",
       "description": "Response HTTP header entry",
       "type": "object",
-      "properties": [
-        { "name": "name", "type": "string" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "name", "type": "string" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "AuthChallenge",

@@ -8346,28 +8301,19 @@
       "id": "PlayerProperty",
       "description": "Corresponds to kMediaPropertyChange",
       "type": "object",
-      "properties": [
-        { "name": "name", "type": "string" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "name", "type": "string" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "PlayerEvent",
       "description": "Corresponds to kMediaEventTriggered",
       "type": "object",
-      "properties": [
-        { "name": "timestamp", "$ref": "Timestamp" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "timestamp", "$ref": "Timestamp" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "PlayerErrorSourceLocation",
       "description": "Represents logged source line numbers reported in an error.\nNOTE: file and line are from chromium c++ implementation code, not js.",
       "type": "object",
-      "properties": [
-        { "name": "file", "type": "string" },
-        { "name": "line", "type": "integer" }
-      ]
+      "properties": [{ "name": "file", "type": "string" }, { "name": "line", "type": "integer" }]
     },
     {
       "id": "PlayerError",

@@ -12411,10 +12357,7 @@
       "description": "Pair of issuer origin and number of available (signed, but not used) Trust\nTokens from that issuer.",
       "experimental": true,
       "type": "object",
-      "properties": [
-        { "name": "issuerOrigin", "type": "string" },
-        { "name": "count", "type": "number" }
-      ]
+      "properties": [{ "name": "issuerOrigin", "type": "string" }, { "name": "count", "type": "number" }]
     },
     {
       "id": "InterestGroupAccessType",

@@ -12477,10 +12420,7 @@
       "id": "SharedStorageEntry",
       "description": "Struct for a single key-value pair in an origin's shared storage.",
       "type": "object",
-      "properties": [
-        { "name": "key", "type": "string" },
-        { "name": "value", "type": "string" }
-      ]
+      "properties": [{ "name": "key", "type": "string" }, { "name": "value", "type": "string" }]
     },
     {
       "id": "SharedStorageMetadata",

@@ -12496,10 +12436,7 @@
       "id": "SharedStorageReportingMetadata",
       "description": "Pair of reporting metadata details for a candidate URL for `selectURL()`.",
       "type": "object",
-      "properties": [
-        { "name": "eventType", "type": "string" },
-        { "name": "reportingUrl", "type": "string" }
-      ]
+      "properties": [{ "name": "eventType", "type": "string" }, { "name": "reportingUrl", "type": "string" }]
     },
     {
       "id": "SharedStorageUrlWithMetadata",

@@ -12631,10 +12568,7 @@
       "id": "AttributionReportingAggregationKeysEntry",
       "experimental": true,
       "type": "object",
-      "properties": [
-        { "name": "key", "type": "string" },
-        { "name": "value", "$ref": "UnsignedInt128AsBase16" }
-      ]
+      "properties": [{ "name": "key", "type": "string" }, { "name": "value", "$ref": "UnsignedInt128AsBase16" }]
     },
     {
       "id": "AttributionReportingEventReportWindows",

@@ -13009,10 +12943,7 @@
       "name": "getInterestGroupDetails",
       "description": "Gets details for a named interest group.",
       "experimental": true,
-      "parameters": [
-        { "name": "ownerOrigin", "type": "string" },
-        { "name": "name", "type": "string" }
-      ],
+      "parameters": [{ "name": "ownerOrigin", "type": "string" }, { "name": "name", "type": "string" }],
       "returns": [{ "name": "details", "$ref": "InterestGroupDetails" }]
     },
     {

@@ -13055,10 +12986,7 @@
       "name": "deleteSharedStorageEntry",
       "description": "Deletes entry for `key` (if it exists) for a given origin's shared storage.",
       "experimental": true,
-      "parameters": [
-        { "name": "ownerOrigin", "type": "string" },
-        { "name": "key", "type": "string" }
-      ]
+      "parameters": [{ "name": "ownerOrigin", "type": "string" }, { "name": "key", "type": "string" }]
     },
     {
       "name": "clearSharedStorageEntries",

@@ -13082,10 +13010,7 @@
       "name": "setStorageBucketTracking",
       "description": "Set tracking for a storage key's buckets.",
       "experimental": true,
-      "parameters": [
-        { "name": "storageKey", "type": "string" },
-        { "name": "enable", "type": "boolean" }
-      ]
+      "parameters": [{ "name": "storageKey", "type": "string" }, { "name": "enable", "type": "boolean" }]
     },
     {
       "name": "deleteStorageBucket",

@@ -13531,10 +13456,7 @@
       "id": "RemoteLocation",
       "experimental": true,
       "type": "object",
-      "properties": [
-        { "name": "host", "type": "string" },
-        { "name": "port", "type": "integer" }
-      ]
+      "properties": [{ "name": "host", "type": "string" }, { "name": "port", "type": "integer" }]
     }
   ],
   "commands": [

@@ -5,13 +5,9 @@
   "std.debug.assert": "Use bun.assert instead",
   "std.debug.dumpStackTrace": "Use bun.handleErrorReturnTrace or bun.crash_handler.dumpStackTrace instead",
   "std.debug.print": "Don't let this be committed",
-  "std.mem.indexOfAny(": "Use bun.strings.indexOfAny",
+  "std.mem.indexOfAny": "Use bun.strings.indexAny or bun.strings.indexAnyComptime",
   "undefined != ": "This is by definition Undefined Behavior.",
   "undefined == ": "This is by definition Undefined Behavior.",
   "bun.toFD(std.fs.cwd().fd)": "Use bun.FD.cwd()",
   "std.StringArrayHashMapUnmanaged(": "bun.StringArrayHashMapUnmanaged has a faster `eql`",
   "std.StringArrayHashMap(": "bun.StringArrayHashMap has a faster `eql`",
   "std.StringHashMapUnmanaged(": "bun.StringHashMapUnmanaged has a faster `eql`",
   "std.StringHashMap(": "bun.StringHashMap has a faster `eql`",
   "": ""
 }

@@ -19,7 +19,9 @@ for (const [banned, suggestion] of Object.entries(BANNED)) {
   if (banned.length === 0) continue;
   // Run git grep to find occurrences of std.debug.assert in .zig files
   // .nothrow() is here since git will exit with non-zero if no matches are found.
-  let stdout = await $`git grep -n -F "${banned}" "src/**.zig" | grep -v -F '//' | grep -v -F bench`.nothrow().text();
+  let stdout = await $`git grep -n -F "${banned}" "src/**/**.zig" | grep -v -F '//' | grep -v -F bench`
+    .nothrow()
+    .text();

   stdout = stdout.trim();
   if (stdout.length === 0) continue;

@@ -152,9 +152,9 @@ function getMaxFileDescriptor(path) {

   hasInitialMaxFD = true;

-  if (process.platform === "linux" || process.platform === "darwin") {
+  if (process.platform === "linux") {
     try {
-      readdirSync(process.platform === "darwin" ? "/dev/fd" : "/proc/self/fd").forEach(name => {
+      readdirSync("/proc/self/fd").forEach(name => {
         const fd = parseInt(name.trim(), 10);
         if (Number.isSafeInteger(fd) && fd >= 0) {
           maxFd = Math.max(maxFd, fd);

@@ -290,7 +290,7 @@ function formatBody(body?: string, isBase64Encoded?: boolean): string | null {
   if (!isBase64Encoded) {
     return body;
   }
-  return Buffer.from(body, "base64").toString("utf8");
+  return Buffer.from(body).toString("base64");
 }

 type HttpEventV1 = {

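The two variants differ in direction, which is the substance of the change: `Buffer.from(body, "base64").toString("utf8")` decodes a base64 payload back to text, while `Buffer.from(body).toString("base64")` encodes text into base64. A minimal illustration with the plain `Buffer` API:

```javascript
// Decode: interpret the input as base64 and recover the original UTF-8 text.
const decoded = Buffer.from("aGVsbG8=", "base64").toString("utf8"); // "hello"

// Encode: take raw text and produce its base64 representation.
const encoded = Buffer.from("hello").toString("base64"); // "aGVsbG8="

console.log(decoded, encoded);
```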
@@ -35,7 +35,7 @@ if (latest.tag_name === release.tag_name) {
 } else if (release.tag_name === "canary") {
   try {
     const build = await getSemver("canary", await getBuild());
-    paths = ["releases/canary", `releases/${build}`, `releases/${full_commit_hash}-canary`];
+    paths = ["releases/canary", `releases/${build}`, `releases/${full_commit_hash}`];
   } catch (error) {
     console.warn(error);
     paths = ["releases/canary"];

@@ -97,7 +97,7 @@ for (const asset of release.assets) {
   let data = Bun.spawnSync({
     cmd: [
       join(temp, local.replace(".zip", ""), "bun"),
-      "--print",
+      "-e",
       'JSON.stringify(require("bun:internal-for-testing").crash_handler.getFeatureData())',
     ],
     cwd: temp,

packages/bun-types/bun.d.ts (vendored, 196 lines changed)
@@ -15,7 +15,7 @@
  */
 declare module "bun" {
-  import type { Encoding as CryptoEncoding } from "crypto";
+  import type { CipherNameAndProtocol, EphemeralKeyInfo, PeerCertificate } from "tls";

   interface Env {
     NODE_ENV?: string;
     /**

@@ -1520,7 +1520,7 @@ declare module "bun" {
     define?: Record<string, string>;
     // origin?: string; // e.g. http://mydomain.com
     loader?: { [k in string]: Loader };
-    sourcemap?: "none" | "linked" | "inline" | "external"; // default: "none", true -> "inline"
+    sourcemap?: "none" | "inline" | "external"; // default: "none"
     /**
      * package.json `exports` conditions used when resolving imports
      *

@@ -2968,7 +2968,7 @@ declare module "bun" {
      * Returns 0 if the versions are equal, 1 if `v1` is greater, or -1 if `v2` is greater.
      * Throws an error if either version is invalid.
      */
-    order(this: void, v1: StringLike, v2: StringLike): -1 | 0 | 1;
+    order(v1: StringLike, v2: StringLike): -1 | 0 | 1;
   }
   var semver: Semver;

@@ -3216,8 +3216,7 @@ declare module "bun" {
      *
      * @param hashInto `TypedArray` to write the hash into. Faster than creating a new one each time
      */
-    digest(): Buffer;
-    digest(hashInto: NodeJS.TypedArray): NodeJS.TypedArray;
+    digest(hashInto?: NodeJS.TypedArray): NodeJS.TypedArray;

     /**
      * Run the hash over the given data

@@ -3226,11 +3225,10 @@ declare module "bun" {
      *
      * @param hashInto `TypedArray` to write the hash into. Faster than creating a new one each time
      */
-    static hash(algorithm: SupportedCryptoAlgorithms, input: Bun.BlobOrStringOrBuffer): Buffer;
     static hash(
       algorithm: SupportedCryptoAlgorithms,
       input: Bun.BlobOrStringOrBuffer,
-      hashInto: NodeJS.TypedArray,
+      hashInto?: NodeJS.TypedArray,
     ): NodeJS.TypedArray;

     /**

@@ -3873,15 +3871,6 @@ declare module "bun" {
|
||||
*/
|
||||
timeout(seconds: number): void;
|
||||
|
||||
/**
|
||||
* Forcefully close the socket. The other end may not receive all data, and
|
||||
* the socket will be closed immediately.
|
||||
*
|
||||
* This passes `SO_LINGER` with `l_onoff` set to `1` and `l_linger` set to
|
||||
* `0` and then calls `close(2)`.
|
||||
*/
|
||||
terminate(): void;
|
||||
|
||||
/**
|
||||
* Shutdown writes to a socket
|
||||
*
|
||||
@@ -3928,181 +3917,6 @@ declare module "bun" {
 * local port connected to the socket
 */
readonly localPort: number;

/**
 * This property is `true` if the peer certificate was signed by one of the CAs
 * specified when creating the `Socket` instance, otherwise `false`.
 */
readonly authorized: boolean;

/**
 * String containing the selected ALPN protocol.
 * Before a handshake has completed, this value is always null.
 * When a handshake is completed but not ALPN protocol was selected, socket.alpnProtocol equals false.
 */
readonly alpnProtocol: string | false | null;

/**
 * Disables TLS renegotiation for this `Socket` instance. Once called, attempts
 * to renegotiate will trigger an `error` handler on the `Socket`.
 *
 * There is no support for renegotiation as a server. (Attempts by clients will result in a fatal alert so that ClientHello messages cannot be used to flood a server and escape higher-level limits.)
 */
disableRenegotiation(): void;

/**
 * Keying material is used for validations to prevent different kind of attacks in
 * network protocols, for example in the specifications of IEEE 802.1X.
 *
 * Example
 *
 * ```js
 * const keyingMaterial = socket.exportKeyingMaterial(
 *   128,
 *   'client finished');
 *
 * /*
 *  Example return value of keyingMaterial:
 *  <Buffer 76 26 af 99 c5 56 8e 42 09 91 ef 9f 93 cb ad 6c 7b 65 f8 53 f1 d8 d9
 *  12 5a 33 b8 b5 25 df 7b 37 9f e0 e2 4f b8 67 83 a3 2f cd 5d 41 42 4c 91
 *  74 ef 2c ... 78 more bytes>
 *
 * ```
 *
 * @param length number of bytes to retrieve from keying material
 * @param label an application specific label, typically this will be a value from the [IANA Exporter Label
 * Registry](https://www.iana.org/assignments/tls-parameters/tls-parameters.xhtml#exporter-labels).
 * @param context Optionally provide a context.
 * @return requested bytes of the keying material
 */
exportKeyingMaterial(length: number, label: string, context: Buffer): Buffer;

/**
 * Returns the reason why the peer's certificate was not been verified. This
 * property is set only when `socket.authorized === false`.
 */
getAuthorizationError(): Error | null;

/**
 * Returns an object representing the local certificate. The returned object has
 * some properties corresponding to the fields of the certificate.
 *
 * If there is no local certificate, an empty object will be returned. If the
 * socket has been destroyed, `null` will be returned.
 */
getCertificate(): PeerCertificate | object | null;

/**
 * Returns an object containing information on the negotiated cipher suite.
 *
 * For example, a TLSv1.2 protocol with AES256-SHA cipher:
 *
 * ```json
 * {
 *   "name": "AES256-SHA",
 *   "standardName": "TLS_RSA_WITH_AES_256_CBC_SHA",
 *   "version": "SSLv3"
 * }
 * ```
 *
 */
getCipher(): CipherNameAndProtocol;

/**
 * Returns an object representing the type, name, and size of parameter of
 * an ephemeral key exchange in `perfect forward secrecy` on a client
 * connection. It returns an empty object when the key exchange is not
 * ephemeral. As this is only supported on a client socket; `null` is returned
 * if called on a server socket. The supported types are `'DH'` and `'ECDH'`. The`name` property is available only when type is `'ECDH'`.
 *
 * For example: `{ type: 'ECDH', name: 'prime256v1', size: 256 }`.
 */
getEphemeralKeyInfo(): EphemeralKeyInfo | object | null;

/**
 * Returns an object representing the peer's certificate. If the peer does not
 * provide a certificate, an empty object will be returned. If the socket has been
 * destroyed, `null` will be returned.
 *
 * If the full certificate chain was requested, each certificate will include an`issuerCertificate` property containing an object representing its issuer's
 * certificate.
 * @return A certificate object.
 */
getPeerCertificate(): PeerCertificate;

/**
 * See [SSL\_get\_shared\_sigalgs](https://www.openssl.org/docs/man1.1.1/man3/SSL_get_shared_sigalgs.html) for more information.
 * @since v12.11.0
 * @return List of signature algorithms shared between the server and the client in the order of decreasing preference.
 */
getSharedSigalgs(): string[];

/**
 * As the `Finished` messages are message digests of the complete handshake
 * (with a total of 192 bits for TLS 1.0 and more for SSL 3.0), they can
 * be used for external authentication procedures when the authentication
 * provided by SSL/TLS is not desired or is not enough.
 *
 * @return The latest `Finished` message that has been sent to the socket as part of a SSL/TLS handshake, or `undefined` if no `Finished` message has been sent yet.
 */
getTLSFinishedMessage(): Buffer | undefined;

/**
 * As the `Finished` messages are message digests of the complete handshake
 * (with a total of 192 bits for TLS 1.0 and more for SSL 3.0), they can
 * be used for external authentication procedures when the authentication
 * provided by SSL/TLS is not desired or is not enough.
 *
 * @return The latest `Finished` message that is expected or has actually been received from the socket as part of a SSL/TLS handshake, or `undefined` if there is no `Finished` message so
 * far.
 */
getTLSPeerFinishedMessage(): Buffer | undefined;

/**
 * For a client, returns the TLS session ticket if one is available, or`undefined`. For a server, always returns `undefined`.
 *
 * It may be useful for debugging.
 *
 * See `Session Resumption` for more information.
 */
getTLSTicket(): Buffer | undefined;

/**
 * Returns a string containing the negotiated SSL/TLS protocol version of the
 * current connection. The value `'unknown'` will be returned for connected
 * sockets that have not completed the handshaking process. The value `null` will
 * be returned for server sockets or disconnected client sockets.
 *
 * Protocol versions are:
 *
 * * `'SSLv3'`
 * * `'TLSv1'`
 * * `'TLSv1.1'`
 * * `'TLSv1.2'`
 * * `'TLSv1.3'`
 *
 */
getTLSVersion(): string;

/**
 * See `Session Resumption` for more information.
 * @return `true` if the session was reused, `false` otherwise.
 */
isSessionReused(): boolean;

/**
 * The `socket.setMaxSendFragment()` method sets the maximum TLS fragment size.
 * Returns `true` if setting the limit succeeded; `false` otherwise.
 *
 * Smaller fragment sizes decrease the buffering latency on the client: larger
 * fragments are buffered by the TLS layer until the entire fragment is received
 * and its integrity is verified; large fragments can span multiple roundtrips
 * and their processing can be delayed due to packet loss or reordering. However,
 * smaller fragments add extra TLS framing bytes and CPU overhead, which may
 * decrease overall server throughput.
 * @param [size=16384] The maximum TLS fragment size. The maximum value is `16384`.
 */
setMaxSendFragment(size: number): boolean;
}

interface SocketListener<Data = undefined> {

packages/bun-types/jsc.d.ts (vendored): 35 lines changed
@@ -78,7 +78,21 @@ declare module "bun:jsc" {
 */
function setTimeZone(timeZone: string): string;

interface SamplingProfile {
/**
 * Run JavaScriptCore's sampling profiler for a particular function
 *
 * This is pretty low-level.
 *
 * Things to know:
 * - LLint means "Low Level Interpreter", which is the interpreter that runs before any JIT compilation
 * - Baseline is the first JIT compilation tier. It's the least optimized, but the fastest to compile
 * - DFG means "Data Flow Graph", which is the second JIT compilation tier. It has some optimizations, but is slower to compile
 * - FTL means "Faster Than Light", which is the third JIT compilation tier. It has the most optimizations, but is the slowest to compile
 */
function profile(
  callback: CallableFunction,
  sampleInterval?: number,
): {
/**
 * A formatted summary of the top functions
 *
@@ -169,24 +183,7 @@ declare module "bun:jsc" {
 * Stack traces of the top functions
 */
stackTraces: string[];
}

/**
 * Run JavaScriptCore's sampling profiler for a particular function
 *
 * This is pretty low-level.
 *
 * Things to know:
 * - LLint means "Low Level Interpreter", which is the interpreter that runs before any JIT compilation
 * - Baseline is the first JIT compilation tier. It's the least optimized, but the fastest to compile
 * - DFG means "Data Flow Graph", which is the second JIT compilation tier. It has some optimizations, but is slower to compile
 * - FTL means "Faster Than Light", which is the third JIT compilation tier. It has the most optimizations, but is the slowest to compile
 */
function profile<T extends (...args: any[]) => any>(
  callback: T,
  sampleInterval?: number,
  ...args: Parameters<T>
): ReturnType<T> extends Promise<infer U> ? Promise<SamplingProfile> : SamplingProfile;
};

/**
 * This returns objects which native code has explicitly protected from being

packages/bun-types/sqlite.d.ts (vendored): 105 lines changed
@@ -36,7 +36,7 @@ declare module "bun:sqlite" {
 * ```ts
 * const db = new Database("mydb.sqlite");
 * db.run("CREATE TABLE foo (bar TEXT)");
 * db.run("INSERT INTO foo VALUES (?)", ["baz"]);
 * db.run("INSERT INTO foo VALUES (?)", "baz");
 * console.log(db.query("SELECT * FROM foo").all());
 * ```
 *
@@ -47,7 +47,7 @@ declare module "bun:sqlite" {
 * ```ts
 * const db = new Database(":memory:");
 * db.run("CREATE TABLE foo (bar TEXT)");
 * db.run("INSERT INTO foo VALUES (?)", ["hiiiiii"]);
 * db.run("INSERT INTO foo VALUES (?)", "hiiiiii");
 * console.log(db.query("SELECT * FROM foo").all());
 * ```
 *
@@ -82,40 +82,6 @@ declare module "bun:sqlite" {
 * Equivalent to {@link constants.SQLITE_OPEN_READWRITE}
 */
readwrite?: boolean;

/**
 * When set to `true`, integers are returned as `bigint` types.
 *
 * When set to `false`, integers are returned as `number` types and truncated to 52 bits.
 *
 * @default false
 * @since v1.1.14
 */
safeInteger?: boolean;

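The truncation note above is the crux of `safeInteger`: SQLite `INTEGER` columns are 64-bit, but a JavaScript `number` is an IEEE-754 double, so large values (row IDs, snowflake-style IDs) silently lose precision unless they come back as `bigint`. A self-contained illustration:

```typescript
// 2^53 + 1 fits comfortably in a 64-bit SQLite INTEGER, but not in a JS number.
const big = 9007199254740993n;

const asNumber = Number(big); // what a plain `number` result would give you
const asBigInt = big;         // what `safeInteger: true` preserves

console.log(asNumber === 9007199254740992);  // true: silently rounded, off by one
console.log(asBigInt === 9007199254740993n); // true: exact
console.log(Number.MAX_SAFE_INTEGER);        // 9007199254740991
```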
/**
 * When set to `false` or `undefined`:
 * - Queries missing bound parameters will NOT throw an error
 * - Bound named parameters in JavaScript need to exactly match the SQL query.
 *
 * @example
 * ```ts
 * const db = new Database(":memory:", { strict: false });
 * db.run("INSERT INTO foo (name) VALUES ($name)", { $name: "foo" });
 * ```
 *
 * When set to `true`:
 * - Queries missing bound parameters will throw an error
 * - Bound named parameters in JavaScript no longer need to be `$`, `:`, or `@`. The SQL query will remain prefixed.
 *
 * @example
 * ```ts
 * const db = new Database(":memory:", { strict: true });
 * db.run("INSERT INTO foo (name) VALUES ($name)", { name: "foo" });
 * ```
 * @since v1.1.14
 */
strict?: boolean;
},
);

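The `strict: true` naming rule above (JS keys without the `$`/`:`/`@` prefix, SQL placeholders still prefixed, missing parameters throwing) can be sketched as a simple key-mapping step. This helper is illustrative only, not bun:sqlite's implementation:

```typescript
// Map unprefixed JS keys onto prefixed SQL placeholders, throwing on gaps,
// mirroring the strict-mode binding behavior documented above.
function bindStrict(
  params: Record<string, unknown>,
  placeholders: string[], // e.g. ["$name"], as parsed from the SQL text
): Record<string, unknown> {
  const bound: Record<string, unknown> = {};
  for (const ph of placeholders) {
    const key = ph.replace(/^[$:@]/, ""); // strip the SQL-side prefix
    if (!(key in params)) {
      // strict mode: a missing bound parameter is an error
      throw new Error(`Missing parameter "${key}"`);
    }
    bound[ph] = params[key];
  }
  return bound;
}

console.log(bindStrict({ name: "foo" }, ["$name"])); // { "$name": "foo" }
```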
@@ -158,7 +124,7 @@ declare module "bun:sqlite" {
 * @example
 * ```ts
 * db.run("CREATE TABLE foo (bar TEXT)");
 * db.run("INSERT INTO foo VALUES (?)", ["baz"]);
 * db.run("INSERT INTO foo VALUES (?)", "baz");
 * ```
 *
 * Useful for queries like:
@@ -199,11 +165,11 @@ declare module "bun:sqlite" {
 * | `bigint` | `INTEGER` |
 * | `null` | `NULL` |
 */
run<ParamsType extends SQLQueryBindings[]>(sqlQuery: string, ...bindings: ParamsType[]): Changes;
run<ParamsType extends SQLQueryBindings[]>(sqlQuery: string, ...bindings: ParamsType[]): void;
/**
 This is an alias of {@link Database.prototype.run}
 */
exec<ParamsType extends SQLQueryBindings[]>(sqlQuery: string, ...bindings: ParamsType[]): Changes;
exec<ParamsType extends SQLQueryBindings[]>(sqlQuery: string, ...bindings: ParamsType[]): void;

/**
 * Compile a SQL query and return a {@link Statement} object. This is the
@@ -268,9 +234,9 @@ declare module "bun:sqlite" {
 * @example
 * ```ts
 * db.run("CREATE TABLE foo (bar TEXT)");
 * db.run("INSERT INTO foo VALUES (?)", ["baz"]);
 * db.run("INSERT INTO foo VALUES (?)", "baz");
 * db.run("BEGIN");
 * db.run("INSERT INTO foo VALUES (?)", ["qux"]);
 * db.run("INSERT INTO foo VALUES (?)", "qux");
 * console.log(db.inTransaction());
 * ```
 */
@@ -609,7 +575,7 @@ declare module "bun:sqlite" {
 * | `bigint` | `INTEGER` |
 * | `null` | `NULL` |
 */
run(...params: ParamsType): Changes;
run(...params: ParamsType): void;

/**
 * Execute the prepared statement and return the results as an array of arrays.
@@ -714,44 +680,6 @@ declare module "bun:sqlite" {
 */
toString(): string;

/**
 *
 * Make {@link get} and {@link all} return an instance of the provided
 * `Class` instead of the default `Object`.
 *
 * @param Class A class to use
 * @returns The same statement instance, modified to return an instance of `Class`
 *
 * This lets you attach methods, getters, and setters to the returned
 * objects.
 *
 * For performance reasons, constructors for classes are not called, which means
 * initializers will not be called and private fields will not be
 * accessible.
 *
 * @example
 *
 * ## Custom class
 * ```ts
 * class User {
 *   rawBirthdate: string;
 *   get birthdate() {
 *     return new Date(this.rawBirthdate);
 *   }
 * }
 *
 * const db = new Database(":memory:");
 * db.exec("CREATE TABLE users (id INTEGER PRIMARY KEY, rawBirthdate TEXT)");
 * db.run("INSERT INTO users (rawBirthdate) VALUES ('1995-12-19')");
 * const query = db.query("SELECT * FROM users");
 * query.as(User);
 * const user = query.get();
 * console.log(user.birthdate);
 * // => Date(1995, 11, 19)
 * ```
 */
as<T = unknown>(Class: new (...args: any[]) => T): Statement<T, ParamsType>;

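The note above that constructors are not called can be reproduced in plain TypeScript: attach the class prototype with `Object.create` and copy the row's columns on, so getters and methods work but field initializers never run. This is a sketch of the technique, not bun:sqlite's native implementation:

```typescript
class User {
  rawBirthdate!: string;
  get birthdate() {
    return new Date(this.rawBirthdate);
  }
}

// Object.create attaches the prototype without invoking the constructor,
// then the row's columns are copied onto the instance.
function rowAs<T>(Class: new (...args: any[]) => T, row: object): T {
  return Object.assign(Object.create(Class.prototype), row) as T;
}

const user = rowAs(User, { rawBirthdate: "1995-12-19" });
console.log(user instanceof User);          // true
console.log(user.birthdate instanceof Date); // true: the getter is available
```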
/**
 * Native object representing the underlying `sqlite3_stmt`
 *
@@ -1112,21 +1040,4 @@ declare module "bun:sqlite" {
 */
readonly byteOffset: number;
}

/**
 * An object representing the changes made to the database since the last `run` or `exec` call.
 *
 * @since Bun v1.1.14
 */
interface Changes {
  /**
   * The number of rows changed by the last `run` or `exec` call.
   */
  changes: number;

  /**
   * If `safeIntegers` is `true`, this is a `bigint`. Otherwise, it is a `number`.
   */
  lastInsertRowid: number | bigint;
}
}

packages/bun-types/test.d.ts (vendored): 104 lines changed
@@ -94,7 +94,6 @@ declare module "bun:test" {
clearAllMocks(): void;
fn<T extends (...args: any[]) => any>(func?: T): Mock<T>;
setSystemTime(now?: number | Date): void;
setTimeout(milliseconds: number): void;
}
export const jest: Jest;
export namespace jest {
@@ -294,13 +293,6 @@ declare module "bun:test" {
 * @param fn the function to run
 */
export function afterEach(fn: (() => void | Promise<unknown>) | ((done: (err?: unknown) => void) => void)): void;
/**
 * Sets the default timeout for all tests in the current file. If a test specifies a timeout, it will
 * override this value. The default timeout is 5000ms (5 seconds).
 *
 * @param milliseconds the number of milliseconds for the default timeout
 */
export function setDefaultTimeout(milliseconds: number): void;
export interface TestOptions {
/**
 * Sets the timeout for the test in milliseconds.
@@ -490,12 +482,7 @@ declare module "bun:test" {

export interface Expect extends AsymmetricMatchers {
// the `expect()` callable signature
/**
 * @param actual the actual value
 * @param customFailMessage an optional custom message to display if the test fails.
 * */

<T = unknown>(actual?: T, customFailMessage?: string): Matchers<T>;
<T = unknown>(actual?: T): Matchers<T>;

/**
 * Access to negated asymmetric matchers.
@@ -955,21 +942,6 @@ declare module "bun:test" {
 * @param expected the expected value
 */
toContainKey(expected: unknown): void;
/**
 * Asserts that an `object` contains all the provided keys.
 *
 * The value must be an object
 *
 * @example
 * expect({ a: 'hello', b: 'world' }).toContainAllKeys(['a','b']);
 * expect({ a: 'hello', b: 'world' }).toContainAllKeys(['b','a']);
 * expect({ 1: 'hello', b: 'world' }).toContainAllKeys([1,'b']);
 * expect({ a: 'hello', b: 'world' }).not.toContainAllKeys(['c']);
 * expect({ a: 'hello', b: 'world' }).not.toContainAllKeys(['a']);
 *
 * @param expected the expected value
 */
toContainAllKeys(expected: unknown): void;
/**
 * Asserts that an `object` contains at least one of the provided keys.
 * Asserts that an `object` contains all the provided keys.
@@ -986,78 +958,6 @@ declare module "bun:test" {
 */
toContainAnyKeys(expected: unknown): void;

/**
 * Asserts that an `object` contain the provided value.
 *
 * The value must be an object
 *
 * @example
 * const shallow = { hello: "world" };
 * const deep = { message: shallow };
 * const deepArray = { message: [shallow] };
 * const o = { a: "foo", b: [1, "hello", true], c: "baz" };

 * expect(shallow).toContainValue("world");
 * expect({ foo: false }).toContainValue(false);
 * expect(deep).toContainValue({ hello: "world" });
 * expect(deepArray).toContainValue([{ hello: "world" }]);

 * expect(o).toContainValue("foo", "barr");
 * expect(o).toContainValue([1, "hello", true]);
 * expect(o).not.toContainValue("qux");

 // NOT
 * expect(shallow).not.toContainValue("foo");
 * expect(deep).not.toContainValue({ foo: "bar" });
 * expect(deepArray).not.toContainValue([{ foo: "bar" }]);
 *
 * @param expected the expected value
 */
toContainValue(expected: unknown): void;

/**
 * Asserts that an `object` contain the provided value.
 *
 * The value must be an object
 *
 * @example
 * const o = { a: 'foo', b: 'bar', c: 'baz' };
 * expect(o).toContainValues(['foo']);
 * expect(o).toContainValues(['baz', 'bar']);
 * expect(o).not.toContainValues(['qux', 'foo']);
 * @param expected the expected value
 */
toContainValues(expected: unknown): void;

/**
 * Asserts that an `object` contain all the provided values.
 *
 * The value must be an object
 *
 * @example
 * const o = { a: 'foo', b: 'bar', c: 'baz' };
 * expect(o).toContainAllValues(['foo', 'bar', 'baz']);
 * expect(o).toContainAllValues(['baz', 'bar', 'foo']);
 * expect(o).not.toContainAllValues(['bar', 'foo']);
 * @param expected the expected value
 */
toContainAllValues(expected: unknown): void;

/**
 * Asserts that an `object` contain any provided value.
 *
 * The value must be an object
 *
 * @example
 * const o = { a: 'foo', b: 'bar', c: 'baz' };
 * expect(o).toContainAnyValues(['qux', 'foo']);
 * expect(o).toContainAnyValues(['qux', 'bar']);
 * expect(o).toContainAnyValues(['qux', 'baz']);
 * expect(o).not.toContainAnyValues(['qux']);
 * @param expected the expected value
 */
toContainAnyValues(expected: unknown): void;

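The `toContainAllValues` / `toContainAnyValues` matchers documented above boil down to set-style checks over `Object.values`. Minimal predicates consistent with the documented examples (illustrative only; the real matchers also perform deep equality on object and array values):

```typescript
// "All values": the expected list must account for every value of the object.
function containsAllValues(obj: Record<string, unknown>, expected: unknown[]): boolean {
  const values = Object.values(obj);
  return expected.length === values.length && expected.every(v => values.includes(v));
}

// "Any values": at least one expected entry appears among the object's values.
function containsAnyValues(obj: Record<string, unknown>, expected: unknown[]): boolean {
  const values = Object.values(obj);
  return expected.some(v => values.includes(v));
}

const o = { a: "foo", b: "bar", c: "baz" };
console.log(containsAllValues(o, ["baz", "bar", "foo"])); // true: order does not matter
console.log(containsAllValues(o, ["bar", "foo"]));        // false: must cover all values
console.log(containsAnyValues(o, ["qux", "foo"]));        // true
console.log(containsAnyValues(o, ["qux"]));               // false
```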
/**
 * Asserts that an `object` contains all the provided keys.
 * expect({ a: 'foo', b: 'bar', c: 'baz' }).toContainKeys(['a', 'b']);
@@ -1230,7 +1130,7 @@ declare module "bun:test" {
 * - If expected is a `string` or `RegExp`, it will check the `message` property.
 * - If expected is an `Error` object, it will check the `name` and `message` properties.
 * - If expected is an `Error` constructor, it will check the class of the `Error`.
 * - If expected is not provided, it will check if anything has thrown.
 * - If expected is not provided, it will check if anything as thrown.
 *
 * @example
 * function fail() {

(File diff suppressed because it is too large)
@@ -28,7 +28,7 @@ const getCertdataURL = version => {
  return certdataURL;
};

const normalizeTD = (text = "") => {
const normalizeTD = text => {
  // Remove whitespace and any HTML tags.
  return text?.trim().replace(/<.*?>/g, "");
};

@@ -42,7 +42,6 @@
#define HAS_MSGX
#endif


/* We need to emulate sendmmsg, recvmmsg on platform who don't have it */
int bsd_sendmmsg(LIBUS_SOCKET_DESCRIPTOR fd, struct udp_sendbuf* sendbuf, int flags) {
#if defined(_WIN32)// || defined(__APPLE__)
@@ -398,9 +397,7 @@ int bsd_addr_get_port(struct bsd_addr_t *addr) {
// called by dispatch_ready_poll
LIBUS_SOCKET_DESCRIPTOR bsd_accept_socket(LIBUS_SOCKET_DESCRIPTOR fd, struct bsd_addr_t *addr) {
  LIBUS_SOCKET_DESCRIPTOR accepted_fd;

  while (1) {
  addr->len = sizeof(addr->mem);
  addr->len = sizeof(addr->mem);

#if defined(SOCK_CLOEXEC) && defined(SOCK_NONBLOCK)
  // Linux, FreeBSD
@@ -408,18 +405,12 @@ LIBUS_SOCKET_DESCRIPTOR bsd_accept_socket(LIBUS_SOCKET_DESCRIPTOR fd, struct bsd
#else
  // Windows, OS X
  accepted_fd = accept(fd, (struct sockaddr *) addr, &addr->len);

#endif

  if (UNLIKELY(IS_EINTR(accepted_fd))) {
    continue;
  }

  /* We cannot rely on addr since it is not initialized if failed */
  if (accepted_fd == LIBUS_SOCKET_ERROR) {
    return LIBUS_SOCKET_ERROR;
  }

  break;
  /* We cannot rely on addr since it is not initialized if failed */
  if (accepted_fd == LIBUS_SOCKET_ERROR) {
    return LIBUS_SOCKET_ERROR;
  }

  internal_finalize_bsd_addr(addr);
@@ -432,22 +423,14 @@ LIBUS_SOCKET_DESCRIPTOR bsd_accept_socket(LIBUS_SOCKET_DESCRIPTOR fd, struct bsd
#endif
}

ssize_t bsd_recv(LIBUS_SOCKET_DESCRIPTOR fd, void *buf, int length, int flags) {
  while (1) {
    ssize_t ret = recv(fd, buf, length, flags);

    if (UNLIKELY(IS_EINTR(ret))) {
      continue;
    }

    return ret;
  }
int bsd_recv(LIBUS_SOCKET_DESCRIPTOR fd, void *buf, int length, int flags) {
  return recv(fd, buf, length, flags);
}

#if !defined(_WIN32)
#include <sys/uio.h>

ssize_t bsd_write2(LIBUS_SOCKET_DESCRIPTOR fd, const char *header, int header_length, const char *payload, int payload_length) {
int bsd_write2(LIBUS_SOCKET_DESCRIPTOR fd, const char *header, int header_length, const char *payload, int payload_length) {
  struct iovec chunks[2];

  chunks[0].iov_base = (char *)header;
@@ -455,21 +438,13 @@ ssize_t bsd_write2(LIBUS_SOCKET_DESCRIPTOR fd, const char *header, int header_le
  chunks[1].iov_base = (char *)payload;
  chunks[1].iov_len = payload_length;

  while (1) {
    ssize_t written = writev(fd, chunks, 2);

    if (UNLIKELY(IS_EINTR(written))) {
      continue;
    }

    return written;
  }
  return writev(fd, chunks, 2);
}
#else
ssize_t bsd_write2(LIBUS_SOCKET_DESCRIPTOR fd, const char *header, int header_length, const char *payload, int payload_length) {
  ssize_t written = bsd_send(fd, header, header_length, 0);
int bsd_write2(LIBUS_SOCKET_DESCRIPTOR fd, const char *header, int header_length, const char *payload, int payload_length) {
  int written = bsd_send(fd, header, header_length, 0);
  if (written == header_length) {
    ssize_t second_write = bsd_send(fd, payload, payload_length, 0);
    int second_write = bsd_send(fd, payload, payload_length, 0);
    if (second_write > 0) {
      written += second_write;
    }
@@ -478,28 +453,26 @@ ssize_t bsd_write2(LIBUS_SOCKET_DESCRIPTOR fd, const char *header, int header_le
}
#endif

ssize_t bsd_send(LIBUS_SOCKET_DESCRIPTOR fd, const char *buf, int length, int msg_more) {
|
||||
while (1) {
|
||||
int bsd_send(LIBUS_SOCKET_DESCRIPTOR fd, const char *buf, int length, int msg_more) {
|
||||
|
||||
// MSG_MORE (Linux), MSG_PARTIAL (Windows), TCP_NOPUSH (BSD)
|
||||
|
||||
#ifndef MSG_NOSIGNAL
|
||||
#define MSG_NOSIGNAL 0
|
||||
#endif
|
||||
|
||||
#ifdef MSG_MORE
|
||||
// for Linux we do not want signals
|
||||
ssize_t rc = send(fd, buf, length, ((msg_more != 0) * MSG_MORE) | MSG_NOSIGNAL | MSG_DONTWAIT);
|
||||
#else
|
||||
// use TCP_NOPUSH
|
||||
ssize_t rc = send(fd, buf, length, MSG_NOSIGNAL | MSG_DONTWAIT);
|
||||
#endif
|
||||
#ifdef MSG_MORE
|
||||
|
||||
if (UNLIKELY(IS_EINTR(rc))) {
|
||||
continue;
|
||||
}
|
||||
// for Linux we do not want signals
|
||||
return send(fd, buf, length, ((msg_more != 0) * MSG_MORE) | MSG_NOSIGNAL | MSG_DONTWAIT);
|
||||
|
||||
return rc;
|
||||
}
|
||||
#else
|
||||
|
||||
// use TCP_NOPUSH
|
||||
|
||||
return send(fd, buf, length, MSG_NOSIGNAL | MSG_DONTWAIT);
|
||||
|
||||
#endif
|
||||
}
|
||||
|
||||
int bsd_would_block() {
|
||||
@@ -510,23 +483,6 @@ int bsd_would_block() {
|
||||
#endif
|
||||
}
|
||||
|
static int us_internal_bind_and_listen(LIBUS_SOCKET_DESCRIPTOR listenFd, struct sockaddr *listenAddr, socklen_t listenAddrLength, int backlog) {
  int result;
  do
    result = bind(listenFd, listenAddr, listenAddrLength);
  while (IS_EINTR(result));

  if (result == -1) {
    return -1;
  }

  do
    result = listen(listenFd, backlog);
  while (IS_EINTR(result));

  return result;
}

inline __attribute__((always_inline)) LIBUS_SOCKET_DESCRIPTOR bsd_bind_listen_fd(
  LIBUS_SOCKET_DESCRIPTOR listenFd,
  struct addrinfo *listenAddr,
@@ -534,29 +490,35 @@ inline __attribute__((always_inline)) LIBUS_SOCKET_DESCRIPTOR bsd_bind_listen_fd
  int options
) {

  if ((options & LIBUS_LISTEN_EXCLUSIVE_PORT)) {
#if _WIN32
    int optval2 = 1;
    setsockopt(listenFd, SOL_SOCKET, SO_EXCLUSIVEADDRUSE, (void *) &optval2, sizeof(optval2));
  if (port != 0) {
    /* Otherwise, always enable SO_REUSEPORT and SO_REUSEADDR _unless_ options specify otherwise */
#ifdef _WIN32
    if (options & LIBUS_LISTEN_EXCLUSIVE_PORT) {
      int optval2 = 1;
      setsockopt(listenFd, SOL_SOCKET, SO_EXCLUSIVEADDRUSE, (void *) &optval2, sizeof(optval2));
    } else {
      int optval3 = 1;
      setsockopt(listenFd, SOL_SOCKET, SO_REUSEADDR, (void *) &optval3, sizeof(optval3));
    }
#else
#if /*defined(__linux__) &&*/ defined(SO_REUSEPORT)
    if (!(options & LIBUS_LISTEN_EXCLUSIVE_PORT)) {
      int optval = 1;
      setsockopt(listenFd, SOL_SOCKET, SO_REUSEPORT, (void *) &optval, sizeof(optval));
    }
#endif
    int enabled = 1;
    setsockopt(listenFd, SOL_SOCKET, SO_REUSEADDR, (void *) &enabled, sizeof(enabled));
#endif
  } else {
#if defined(SO_REUSEPORT)
    int optval2 = 1;
    setsockopt(listenFd, SOL_SOCKET, SO_REUSEPORT, (void *) &optval2, sizeof(optval2));
#endif
  }

#if defined(SO_REUSEADDR)
  int optval3 = 1;
  setsockopt(listenFd, SOL_SOCKET, SO_REUSEADDR, (void *) &optval3, sizeof(optval3));
#endif
  }

#ifdef IPV6_V6ONLY
  int disabled = 0;
  setsockopt(listenFd, IPPROTO_IPV6, IPV6_V6ONLY, (void *) &disabled, sizeof(disabled));
#endif

  if (us_internal_bind_and_listen(listenFd, listenAddr->ai_addr, (socklen_t) listenAddr->ai_addrlen, 512)) {
  if (bind(listenFd, listenAddr->ai_addr, (socklen_t) listenAddr->ai_addrlen) || listen(listenFd, 512)) {
    return LIBUS_SOCKET_ERROR;
  }

@@ -734,7 +696,7 @@ static LIBUS_SOCKET_DESCRIPTOR internal_bsd_create_listen_socket_unix(const char
  unlink(path);
#endif

  if (us_internal_bind_and_listen(listenFd, (struct sockaddr *) server_address, (socklen_t) addrlen, 512)) {
  if (bind(listenFd, (struct sockaddr *)server_address, addrlen) || listen(listenFd, 512)) {
#if defined(_WIN32)
    int shouldSimulateENOENT = WSAGetLastError() == WSAENETDOWN;
#endif
@@ -965,23 +927,13 @@ static int bsd_do_connect_raw(LIBUS_SOCKET_DESCRIPTOR fd, struct sockaddr *addr,


#else
  int r;
  do {
    errno = 0;
    r = connect(fd, (struct sockaddr *)addr, namelen);
  } while (IS_EINTR(r));

  // connect() can return -1 with an errno of 0.
  // the errno is the correct one in that case.
  if (r == -1 && errno != 0) {
    if (errno == EINPROGRESS) {
    if (connect(fd, (struct sockaddr *)addr, namelen) == 0 || errno == EINPROGRESS || errno == EAGAIN) {
      return 0;
    }
  } while (errno == EINTR);

    return errno;
  }

  return 0;
  return errno;
#endif
}

@@ -597,21 +597,13 @@ void us_internal_socket_after_open(struct us_socket_t *s, int error) {
        break;
      }
    }
    us_socket_close(0, s, LIBUS_SOCKET_CLOSE_CODE_CONNECTION_RESET, 0);

    // Since CONCURRENT_CONNECTIONS is 2, we know there is room for at least 1 more active connection
    // now that we've closed the current socket.
    //
    // Three possible cases:
    // 1. The list of addresses to try is now empty -> throw an error
    // 2. There is a next address to try -> start the next one
    // 3. There are 2 or more addresses to try -> start the next two.
    if (c->connecting_head == NULL || c->connecting_head->connect_next == NULL) {
    us_socket_close(0, s, 0, 0);
    // there are no further attempting to connect
    if (!c->connecting_head) {
      // start opening the next batch of connections
      int opened = start_connections(c, c->connecting_head == NULL ? CONCURRENT_CONNECTIONS : 1);
      int opened = start_connections(c, CONCURRENT_CONNECTIONS);
      // we have run out of addresses to attempt, signal the connection error
      // but only if there are no other sockets in the list
      if (opened == 0 && c->connecting_head == NULL) {
      if (opened == 0) {
        c->error = ECONNREFUSED;
        c->context->on_connect_error(c, error);
        Bun__addrinfo_freeRequest(c->addrinfo_req, ECONNREFUSED);
@@ -639,7 +631,7 @@ void us_internal_socket_after_open(struct us_socket_t *s, int error) {
    if (c) {
      for (struct us_socket_t *next = c->connecting_head; next; next = next->connect_next) {
        if (next != s) {
          us_socket_close(0, next, LIBUS_SOCKET_CLOSE_CODE_CONNECTION_RESET, 0);
          us_socket_close(0, next, 0, 0);
        }
      }
      // now that the socket is open, we can release the associated us_connecting_socket_t if it exists

||||
@@ -150,9 +150,8 @@ int BIO_s_custom_write(BIO *bio, const char *data, int length) {
int written = us_socket_write(0, loop_ssl_data->ssl_socket, data, length,
loop_ssl_data->last_write_was_msg_more);

BIO_clear_retry_flags(bio);
if (!written) {
BIO_set_retry_write(bio);
BIO_set_flags(bio, BIO_FLAGS_SHOULD_RETRY | BIO_FLAGS_WRITE);
return -1;
}

@@ -163,9 +162,8 @@ int BIO_s_custom_read(BIO *bio, char *dst, int length) {
struct loop_ssl_data *loop_ssl_data =
(struct loop_ssl_data *)BIO_get_data(bio);

BIO_clear_retry_flags(bio);
if (!loop_ssl_data->ssl_read_input_length) {
BIO_set_retry_read(bio);
BIO_set_flags(bio, BIO_FLAGS_SHOULD_RETRY | BIO_FLAGS_READ);
return -1;
}

@@ -446,7 +444,6 @@ struct us_internal_ssl_socket_t *ssl_on_data(struct us_internal_ssl_socket_t *s,
// no further processing of data when in shutdown state
return s;
}

// bug checking: this loop needs a lot of attention and clean-ups and
// check-ups
int read = 0;

@@ -610,9 +607,7 @@ ssl_on_writable(struct us_internal_ssl_socket_t *s) {
return 0;
}

if (s->handshake_state == HANDSHAKE_COMPLETED) {
s = context->on_writable(s);
}
s = context->on_writable(s);

return s;
}

@@ -722,8 +717,6 @@ create_ssl_context_from_options(struct us_socket_context_options_t options) {

/* Default options we rely on - changing these will break our logic */
SSL_CTX_set_read_ahead(ssl_context, 1);
/* we should always accept moving write buffer so we can retry writes with a
 * buffer allocated in a different address */
SSL_CTX_set_mode(ssl_context, SSL_MODE_ACCEPT_MOVING_WRITE_BUFFER);

/* Anything below TLS 1.2 is disabled */

@@ -1077,8 +1070,6 @@ SSL_CTX *create_ssl_context_from_bun_options(

/* Default options we rely on - changing these will break our logic */
SSL_CTX_set_read_ahead(ssl_context, 1);
/* we should always accept moving write buffer so we can retry writes with a
 * buffer allocated in a different address */
SSL_CTX_set_mode(ssl_context, SSL_MODE_ACCEPT_MOVING_WRITE_BUFFER);

/* Anything below TLS 1.2 is disabled */

@@ -1533,6 +1524,7 @@ struct us_connecting_socket_t *us_internal_ssl_socket_context_connect(
sizeof(struct us_internal_ssl_socket_t) - sizeof(struct us_socket_t) +
socket_ext_size, is_connected);
}

struct us_internal_ssl_socket_t *us_internal_ssl_socket_context_connect_unix(
struct us_internal_ssl_socket_context_t *context, const char *server_path,
size_t pathlen, int options, int socket_ext_size) {
@@ -211,6 +211,26 @@ static struct us_cert_string_t root_certs[] = {
"zTSMmfXK4SVhM7JZG+Ju1zdXtg2pEto=\n"
"-----END CERTIFICATE-----",.len=2349},

/* Security Communication Root CA */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIDWjCCAkKgAwIBAgIBADANBgkqhkiG9w0BAQUFADBQMQswCQYDVQQGEwJKUDEYMBYGA1UE\n"
"ChMPU0VDT00gVHJ1c3QubmV0MScwJQYDVQQLEx5TZWN1cml0eSBDb21tdW5pY2F0aW9uIFJv\n"
"b3RDQTEwHhcNMDMwOTMwMDQyMDQ5WhcNMjMwOTMwMDQyMDQ5WjBQMQswCQYDVQQGEwJKUDEY\n"
"MBYGA1UEChMPU0VDT00gVHJ1c3QubmV0MScwJQYDVQQLEx5TZWN1cml0eSBDb21tdW5pY2F0\n"
"aW9uIFJvb3RDQTEwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCzs/5/022x7xZ8\n"
"V6UMbXaKL0u/ZPtM7orw8yl89f/uKuDp6bpbZCKamm8sOiZpUQWZJtzVHGpxxpp9Hp3dfGzG\n"
"jGdnSj74cbAZJ6kJDKaVv0uMDPpVmDvY6CKhS3E4eayXkmmziX7qIWgGmBSWh9JhNrxtJ1ae\n"
"V+7AwFb9Ms+k2Y7CI9eNqPPYJayX5HA49LY6tJ07lyZDo6G8SVlyTCMwhwFY9k6+HGhWZq/N\n"
"QV3Is00qVUarH9oe4kA92819uZKAnDfdDJZkndwi92SL32HeFZRSFaB9UslLqCHJxrHty8OV\n"
"YNEP8Ktw+N/LTX7s1vqr2b1/VPKl6Xn62dZ2JChzAgMBAAGjPzA9MB0GA1UdDgQWBBSgc0mZ\n"
"aNyFW2XjmygvV5+9M7wHSDALBgNVHQ8EBAMCAQYwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG\n"
"9w0BAQUFAAOCAQEAaECpqLvkT115swW1F7NgE+vGkl3g0dNq/vu+m22/xwVtWSDEHPC32oRY\n"
"AmP6SBbvT6UL90qY8j+eG61Ha2POCEfrUj94nK9NrvjVT8+amCoQQTlSxN3Zmw7vkwGusi7K\n"
"aEIkQmywszo+zenaSMQVy+n5Bw+SUEmK3TGXX8npN6o7WWWXlDLJs58+OmJYxUmtYg5xpTKq\n"
"L8aJdkNAExNnPaJUJRDL8Try2frbSVa7pv6nQTXD4IhhyYjH3zYQIphZ6rBK+1YWc26sTfci\n"
"oU+tHXotRSflMMFe8toTyyVCUZVHA4xsIcx0Qu1T/zOLjw9XARYvz6buyXAiFL39vmwLAw==\n"
"-----END CERTIFICATE-----",.len=1221},

/* XRamp Global CA Root */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIEMDCCAxigAwIBAgIQUJRs7Bjq1ZxN1ZfvdY+grTANBgkqhkiG9w0BAQUFADCBgjELMAkG\n"
@@ -662,6 +682,39 @@ static struct us_cert_string_t root_certs[] = {
"WD9f\n"
"-----END CERTIFICATE-----",.len=1226},

/* Autoridad de Certificacion Firmaprofesional CIF A62634068 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIGFDCCA/ygAwIBAgIIU+w77vuySF8wDQYJKoZIhvcNAQEFBQAwUTELMAkGA1UEBhMCRVMx\n"
"QjBABgNVBAMMOUF1dG9yaWRhZCBkZSBDZXJ0aWZpY2FjaW9uIEZpcm1hcHJvZmVzaW9uYWwg\n"
"Q0lGIEE2MjYzNDA2ODAeFw0wOTA1MjAwODM4MTVaFw0zMDEyMzEwODM4MTVaMFExCzAJBgNV\n"
"BAYTAkVTMUIwQAYDVQQDDDlBdXRvcmlkYWQgZGUgQ2VydGlmaWNhY2lvbiBGaXJtYXByb2Zl\n"
"c2lvbmFsIENJRiBBNjI2MzQwNjgwggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAwggIKAoICAQDK\n"
"lmuO6vj78aI14H9M2uDDUtd9thDIAl6zQyrET2qyyhxdKJp4ERppWVevtSBC5IsP5t9bpgOS\n"
"L/UR5GLXMnE42QQMcas9UX4PB99jBVzpv5RvwSmCwLTaUbDBPLutN0pcyvFLNg4kq7/DhHf9\n"
"qFD0sefGL9ItWY16Ck6WaVICqjaY7Pz6FIMMNx/Jkjd/14Et5cS54D40/mf0PmbR0/RAz15i\n"
"NA9wBj4gGFrO93IbJWyTdBSTo3OxDqqHECNZXyAFGUftaI6SEspd/NYrspI8IM/hX68gvqB2\n"
"f3bl7BqGYTM+53u0P6APjqK5am+5hyZvQWyIplD9amML9ZMWGxmPsu2bm8mQ9QEM3xk9Dz44\n"
"I8kvjwzRAv4bVdZO0I08r0+k8/6vKtMFnXkIoctXMbScyJCyZ/QYFpM6/EfY0XiWMR+6Kwxf\n"
"XZmtY4laJCB22N/9q06mIqqdXuYnin1oKaPnirjaEbsXLZmdEyRG98Xi2J+Of8ePdG1asuhy\n"
"9azuJBCtLxTa/y2aRnFHvkLfuwHb9H/TKI8xWVvTyQKmtFLKbpf7Q8UIJm+K9Lv9nyiqDdVF\n"
"8xM6HdjAeI9BZzwelGSuewvF6NkBiDkal4ZkQdU7hwxu+g/GvUgUvzlN1J5Bto+WHWOWk9mV\n"
"BngxaJ43BjuAiUVhOSPHG0SjFeUc+JIwuwIDAQABo4HvMIHsMBIGA1UdEwEB/wQIMAYBAf8C\n"
"AQEwDgYDVR0PAQH/BAQDAgEGMB0GA1UdDgQWBBRlzeurNR4APn7VdMActHNHDhpkLzCBpgYD\n"
"VR0gBIGeMIGbMIGYBgRVHSAAMIGPMC8GCCsGAQUFBwIBFiNodHRwOi8vd3d3LmZpcm1hcHJv\n"
"ZmVzaW9uYWwuY29tL2NwczBcBggrBgEFBQcCAjBQHk4AUABhAHMAZQBvACAAZABlACAAbABh\n"
"ACAAQgBvAG4AYQBuAG8AdgBhACAANAA3ACAAQgBhAHIAYwBlAGwAbwBuAGEAIAAwADgAMAAx\n"
"ADcwDQYJKoZIhvcNAQEFBQADggIBABd9oPm03cXF661LJLWhAqvdpYhKsg9VSytXjDvlMd3+\n"
"xDLx51tkljYyGOylMnfX40S2wBEqgLk9am58m9Ot/MPWo+ZkKXzR4Tgegiv/J2Wv+xYVxC5x\n"
"hOW1//qkR71kMrv2JYSiJ0L1ILDCExARzRAVukKQKtJE4ZYm6zFIEv0q2skGz3QeqUvVhyj5\n"
"eTSSPi5E6PaPT481PyWzOdxjKpBrIF/EUhJOlywqrJ2X3kjyo2bbwtKDlaZmp54lD+kLM5Fl\n"
"ClrD2VQS3a/DTg4fJl4N3LON7NWBcN7STyQF82xO9UxJZo3R/9ILJUFI/lGExkKvgATP0H5k\n"
"SeTy36LssUzAKh3ntLFlosS88Zj0qnAHY7S42jtM+kAiMFsRpvAFDsYCA0irhpuF3dvd6qJ2\n"
"gHN99ZwExEWN57kci57q13XRcrHedUTnQn3iV2t93Jm8PYMo6oCTjcVMZcFwgbg4/EMxsvYD\n"
"NEeyrPsiBsse3RdHHF9mudMaotoRsaS8I8nkvof/uZS2+F0gStRf571oe2XyFR7SOqkt6dhr\n"
"JKyXWERHrVkY8SFlcN7ONGCoQPHzPKTDKCOM/iczQ0CgFzzr6juwcqajuUpLXhZI9LK8yIyS\n"
"xZ2frHI2vDSANGupi5LAuBft7HZT9SQBjLMi6Et8Vcad+qMUu2WFbm5PEn4KPJ2V\n"
"-----END CERTIFICATE-----",.len=2162},

/* Izenpe.com */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIF8TCCA9mgAwIBAgIQALC3WhZIX7/hy/WL1xnmfTANBgkqhkiG9w0BAQsFADA4MQswCQYD\n"
@@ -3414,188 +3467,4 @@ static struct us_cert_string_t root_certs[] = {
"dDTedk+SKlOxJTnbPP/lPqYO5Wue/9vsL3SD3460s6neFE3/MaNFcyT6lSnMEpcEoji2jbDw\n"
"N/zIIX8/syQbPYtuzE2wFg2WHYMfRsCbvUOZ58SWLs5fyQ==\n"
"-----END CERTIFICATE-----",.len=1927},

/* TrustAsia Global Root CA G3 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIFpTCCA42gAwIBAgIUZPYOZXdhaqs7tOqFhLuxibhxkw8wDQYJKoZIhvcNAQEMBQAwWjEL\n"
"MAkGA1UEBhMCQ04xJTAjBgNVBAoMHFRydXN0QXNpYSBUZWNobm9sb2dpZXMsIEluYy4xJDAi\n"
"BgNVBAMMG1RydXN0QXNpYSBHbG9iYWwgUm9vdCBDQSBHMzAeFw0yMTA1MjAwMjEwMTlaFw00\n"
"NjA1MTkwMjEwMTlaMFoxCzAJBgNVBAYTAkNOMSUwIwYDVQQKDBxUcnVzdEFzaWEgVGVjaG5v\n"
"bG9naWVzLCBJbmMuMSQwIgYDVQQDDBtUcnVzdEFzaWEgR2xvYmFsIFJvb3QgQ0EgRzMwggIi\n"
"MA0GCSqGSIb3DQEBAQUAA4ICDwAwggIKAoICAQDAMYJhkuSUGwoqZdC+BqmHO1ES6nBBruL7\n"
"dOoKjbmzTNyPtxNST1QY4SxzlZHFZjtqz6xjbYdT8PfxObegQ2OwxANdV6nnRM7EoYNl9lA+\n"
"sX4WuDqKAtCWHwDNBSHvBm3dIZwZQ0WhxeiAysKtQGIXBsaqvPPW5vxQfmZCHzyLpnl5hkA1\n"
"nyDvP+uLRx+PjsXUjrYsyUQE49RDdT/VP68czH5GX6zfZBCK70bwkPAPLfSIC7Epqq+FqklY\n"
"qL9joDiR5rPmd2jE+SoZhLsO4fWvieylL1AgdB4SQXMeJNnKziyhWTXAyB1GJ2Faj/lN03J5\n"
"Zh6fFZAhLf3ti1ZwA0pJPn9pMRJpxx5cynoTi+jm9WAPzJMshH/x/Gr8m0ed262IPfN2dTPX\n"
"S6TIi/n1Q1hPy8gDVI+lhXgEGvNz8teHHUGf59gXzhqcD0r83ERoVGjiQTz+LISGNzzNPy+i\n"
"2+f3VANfWdP3kXjHi3dqFuVJhZBFcnAvkV34PmVACxmZySYgWmjBNb9Pp1Hx2BErW+Canig7\n"
"CjoKH8GB5S7wprlppYiU5msTf9FkPz2ccEblooV7WIQn3MSAPmeamseaMQ4w7OYXQJXZRe0B\n"
"lqq/DPNL0WP3E1jAuPP6Z92bfW1K/zJMtSU7/xxnD4UiWQWRkUF3gdCFTIcQcf+eQxuulXUt\n"
"gQIDAQABo2MwYTAPBgNVHRMBAf8EBTADAQH/MB8GA1UdIwQYMBaAFEDk5PIj7zjKsK5Xf/Ih\n"
"MBY027ySMB0GA1UdDgQWBBRA5OTyI+84yrCuV3/yITAWNNu8kjAOBgNVHQ8BAf8EBAMCAQYw\n"
"DQYJKoZIhvcNAQEMBQADggIBACY7UeFNOPMyGLS0XuFlXsSUT9SnYaP4wM8zAQLpw6o1D/GU\n"
"E3d3NZ4tVlFEbuHGLige/9rsR82XRBf34EzC4Xx8MnpmyFq2XFNFV1pF1AWZLy4jVe5jaN/T\n"
"G3inEpQGAHUNcoTpLrxaatXeL1nHo+zSh2bbt1S1JKv0Q3jbSwTEb93mPmY+KfJLaHEih6D4\n"
"sTNjduMNhXJEIlU/HHzp/LgV6FL6qj6jITk1dImmasI5+njPtqzn59ZW/yOSLlALqbUHM/Q4\n"
"X6RJpstlcHboCoWASzY9M/eVVHUl2qzEc4Jl6VL1XP04lQJqaTDFHApXB64ipCz5xUG3uOyf\n"
"T0gA+QEEVcys+TIxxHWVBqB/0Y0n3bOppHKH/lmLmnp0Ft0WpWIp6zqW3IunaFnT63eROfjX\n"
"y9mPX1onAX1daBli2MjN9LdyR75bl87yraKZk62Uy5P2EgmVtqvXO9A/EcswFi55gORngS1d\n"
"7XB4tmBZrOFdRWOPyN9yaFvqHbgB8X7754qz41SgOAngPN5C8sLtLpvzHzW2NtjjgKGLzZlk\n"
"D8Kqq7HK9W+eQ42EVJmzbsASZthwEPEGNTNDqJwuuhQxzhB/HIbjj9LV+Hfsm6vxL2PZQl/g\n"
"Z4FkkfGXL/xuJvYz+NO1+MRiqzFRJQJ6+N1rZdVtTTDIZbpoFGWsJwt0ivKH\n"
"-----END CERTIFICATE-----",.len=2012},

/* TrustAsia Global Root CA G4 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIICVTCCAdygAwIBAgIUTyNkuI6XY57GU4HBdk7LKnQV1tcwCgYIKoZIzj0EAwMwWjELMAkG\n"
"A1UEBhMCQ04xJTAjBgNVBAoMHFRydXN0QXNpYSBUZWNobm9sb2dpZXMsIEluYy4xJDAiBgNV\n"
"BAMMG1RydXN0QXNpYSBHbG9iYWwgUm9vdCBDQSBHNDAeFw0yMTA1MjAwMjEwMjJaFw00NjA1\n"
"MTkwMjEwMjJaMFoxCzAJBgNVBAYTAkNOMSUwIwYDVQQKDBxUcnVzdEFzaWEgVGVjaG5vbG9n\n"
"aWVzLCBJbmMuMSQwIgYDVQQDDBtUcnVzdEFzaWEgR2xvYmFsIFJvb3QgQ0EgRzQwdjAQBgcq\n"
"hkjOPQIBBgUrgQQAIgNiAATxs8045CVD5d4ZCbuBeaIVXxVjAd7Cq92zphtnS4CDr5nLrBfb\n"
"K5bKfFJV4hrhPVbwLxYI+hW8m7tH5j/uqOFMjPXTNvk4XatwmkcN4oFBButJ+bAp3TPsUKV/\n"
"eSm4IJijYzBhMA8GA1UdEwEB/wQFMAMBAf8wHwYDVR0jBBgwFoAUpbtKl86zK3+kMd6Xg1mD\n"
"pm9xy94wHQYDVR0OBBYEFKW7SpfOsyt/pDHel4NZg6ZvccveMA4GA1UdDwEB/wQEAwIBBjAK\n"
"BggqhkjOPQQDAwNnADBkAjBe8usGzEkxn0AAbbd+NvBNEU/zy4k6LHiRUKNbwMp1JvK/kF0L\n"
"goxgKJ/GcJpo5PECMFxYDlZ2z1jD1xCMuo6u47xkdUfFVZDj/bpV6wfEU6s3qe4hsiFbYI89\n"
"MvHVI5TWWA==\n"
"-----END CERTIFICATE-----",.len=869},

/* CommScope Public Trust ECC Root-01 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIICHTCCAaOgAwIBAgIUQ3CCd89NXTTxyq4yLzf39H91oJ4wCgYIKoZIzj0EAwMwTjELMAkG\n"
"A1UEBhMCVVMxEjAQBgNVBAoMCUNvbW1TY29wZTErMCkGA1UEAwwiQ29tbVNjb3BlIFB1Ymxp\n"
"YyBUcnVzdCBFQ0MgUm9vdC0wMTAeFw0yMTA0MjgxNzM1NDNaFw00NjA0MjgxNzM1NDJaME4x\n"
"CzAJBgNVBAYTAlVTMRIwEAYDVQQKDAlDb21tU2NvcGUxKzApBgNVBAMMIkNvbW1TY29wZSBQ\n"
"dWJsaWMgVHJ1c3QgRUNDIFJvb3QtMDEwdjAQBgcqhkjOPQIBBgUrgQQAIgNiAARLNumuV16o\n"
"cNfQj3Rid8NeeqrltqLxeP0CflfdkXmcbLlSiFS8LwS+uM32ENEp7LXQoMPwiXAZu1FlxUOc\n"
"w5tjnSCDPgYLpkJEhRGnSjot6dZoL0hOUysHP029uax3OVejQjBAMA8GA1UdEwEB/wQFMAMB\n"
"Af8wDgYDVR0PAQH/BAQDAgEGMB0GA1UdDgQWBBSOB2LAUN3GGQYARnQE9/OufXVNMDAKBggq\n"
"hkjOPQQDAwNoADBlAjEAnDPfQeMjqEI2Jpc1XHvr20v4qotzVRVcrHgpD7oh2MSg2NED3W3R\n"
"OT3Ek2DS43KyAjB8xX6I01D1HiXo+k515liWpDVfG2XqYZpwI7UNo5uSUm9poIyNStDuiw7L\n"
"R47QjRE=\n"
"-----END CERTIFICATE-----",.len=792},

/* CommScope Public Trust ECC Root-02 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIICHDCCAaOgAwIBAgIUKP2ZYEFHpgE6yhR7H+/5aAiDXX0wCgYIKoZIzj0EAwMwTjELMAkG\n"
"A1UEBhMCVVMxEjAQBgNVBAoMCUNvbW1TY29wZTErMCkGA1UEAwwiQ29tbVNjb3BlIFB1Ymxp\n"
"YyBUcnVzdCBFQ0MgUm9vdC0wMjAeFw0yMTA0MjgxNzQ0NTRaFw00NjA0MjgxNzQ0NTNaME4x\n"
"CzAJBgNVBAYTAlVTMRIwEAYDVQQKDAlDb21tU2NvcGUxKzApBgNVBAMMIkNvbW1TY29wZSBQ\n"
"dWJsaWMgVHJ1c3QgRUNDIFJvb3QtMDIwdjAQBgcqhkjOPQIBBgUrgQQAIgNiAAR4MIHoYx7l\n"
"63FRD/cHB8o5mXxO1Q/MMDALj2aTPs+9xYa9+bG3tD60B8jzljHz7aRP+KNOjSkVWLjVb3/u\n"
"bCK1sK9IRQq9qEmUv4RDsNuESgMjGWdqb8FuvAY5N9GIIvejQjBAMA8GA1UdEwEB/wQFMAMB\n"
"Af8wDgYDVR0PAQH/BAQDAgEGMB0GA1UdDgQWBBTmGHX/72DehKT1RsfeSlXjMjZ59TAKBggq\n"
"hkjOPQQDAwNnADBkAjAmc0l6tqvmSfR9Uj/UQQSugEODZXW5hYA4O9Zv5JOGq4/nich/m35r\n"
"ChJVYaoR4HkCMHfoMXGsPHED1oQmHhS48zs73u1Z/GtMMH9ZzkXpc2AVmkzw5l4lIhVtwodZ\n"
"0LKOag==\n"
"-----END CERTIFICATE-----",.len=792},

/* CommScope Public Trust RSA Root-01 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIFbDCCA1SgAwIBAgIUPgNJgXUWdDGOTKvVxZAplsU5EN0wDQYJKoZIhvcNAQELBQAwTjEL\n"
"MAkGA1UEBhMCVVMxEjAQBgNVBAoMCUNvbW1TY29wZTErMCkGA1UEAwwiQ29tbVNjb3BlIFB1\n"
"YmxpYyBUcnVzdCBSU0EgUm9vdC0wMTAeFw0yMTA0MjgxNjQ1NTRaFw00NjA0MjgxNjQ1NTNa\n"
"ME4xCzAJBgNVBAYTAlVTMRIwEAYDVQQKDAlDb21tU2NvcGUxKzApBgNVBAMMIkNvbW1TY29w\n"
"ZSBQdWJsaWMgVHJ1c3QgUlNBIFJvb3QtMDEwggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAwggIK\n"
"AoICAQCwSGWjDR1C45FtnYSkYZYSwu3D2iM0GXb26v1VWvZVAVMP8syMl0+5UMuzAURWlv2b\n"
"KOx7dAvnQmtVzslhsuitQDy6uUEKBU8bJoWPQ7VAtYXR1HHcg0Hz9kXHgKKEUJdGzqAMxGBW\n"
"BB0HW0alDrJLpA6lfO741GIDuZNqihS4cPgugkY4Iw50x2tBt9Apo52AsH53k2NC+zSDO3Oj\n"
"WiE260f6GBfZumbCk6SP/F2krfxQapWsvCQz0b2If4b19bJzKo98rwjyGpg/qYFlP8GMicWW\n"
"MJoKz/TUyDTtnS+8jTiGU+6Xn6myY5QXjQ/cZip8UlF1y5mO6D1cv547KI2DAg+pn3LiLCuz\n"
"3GaXAEDQpFSOm117RTYm1nJD68/A6g3czhLmfTifBSeolz7pUcZsBSjBAg/pGG3svZwG1KdJ\n"
"9FQFa2ww8esD1eo9anbCyxooSU1/ZOD6K9pzg4H/kQO9lLvkuI6cMmPNn7togbGEW682v3fu\n"
"HX/3SZtS7NJ3Wn2RnU3COS3kuoL4b/JOHg9O5j9ZpSPcPYeoKFgo0fEbNttPxP/hjFtyjMcm\n"
"AyejOQoBqsCyMWCDIqFPEgkBEa801M/XrmLTBQe0MXXgDW1XT2mH+VepuhX2yFJtocucH+X8\n"
"eKg1mp9BFM6ltM6UCBwJrVbl2rZJmkrqYxhTnCwuwwIDAQABo0IwQDAPBgNVHRMBAf8EBTAD\n"
"AQH/MA4GA1UdDwEB/wQEAwIBBjAdBgNVHQ4EFgQUN12mmnQywsL5x6YVEFm45P3luG0wDQYJ\n"
"KoZIhvcNAQELBQADggIBAK+nz97/4L1CjU3lIpbfaOp9TSp90K09FlxD533Ahuh6NWPxzIHI\n"
"xgvoLlI1pKZJkGNRrDSsBTtXAOnTYtPZKdVUvhwQkZyybf5Z/Xn36lbQnmhUQo8mUuJM3y+X\n"
"pi/SB5io82BdS5pYV4jvguX6r2yBS5KPQJqTRlnLX3gWsWc+QgvfKNmwrZggvkN80V4aCRck\n"
"jXtdlemrwWCrWxhkgPut4AZ9HcpZuPN4KWfGVh2vtrV0KnahP/t1MJ+UXjulYPPLXAziDslg\n"
"+MkfFoom3ecnf+slpoq9uC02EJqxWE2aaE9gVOX2RhOOiKy8IUISrcZKiX2bwdgt6ZYD9KJ0\n"
"DLwAHb/WNyVntHKLr4W96ioDj8z7PEQkguIBpQtZtjSNMgsSDesnwv1B10A8ckYpwIzqug/x\n"
"BpMu95yo9GA+o/E4Xo4TwbM6l4c/ksp4qRyv0LAbJh6+cOx69TOY6lz/KwsETkPdY34Op054\n"
"A5U+1C0wlREQKC6/oAI+/15Z0wUOlV9TRe9rh9VIzRamloPh37MG88EU26fsHItdkJANclHn\n"
"YfkUyq+Dj7+vsQpZXdxc1+SWrVtgHdqul7I52Qb1dgAT+GhMIbA1xNxVssnBQVocicCMb3Sg\n"
"azNNtQEo/a2tiRc7ppqEvOuM6sRxJKi6KfkIsidWNTJf6jn7MZrVGczw\n"
"-----END CERTIFICATE-----",.len=1935},

/* CommScope Public Trust RSA Root-02 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIFbDCCA1SgAwIBAgIUVBa/O345lXGN0aoApYYNK496BU4wDQYJKoZIhvcNAQELBQAwTjEL\n"
"MAkGA1UEBhMCVVMxEjAQBgNVBAoMCUNvbW1TY29wZTErMCkGA1UEAwwiQ29tbVNjb3BlIFB1\n"
"YmxpYyBUcnVzdCBSU0EgUm9vdC0wMjAeFw0yMTA0MjgxNzE2NDNaFw00NjA0MjgxNzE2NDJa\n"
"ME4xCzAJBgNVBAYTAlVTMRIwEAYDVQQKDAlDb21tU2NvcGUxKzApBgNVBAMMIkNvbW1TY29w\n"
"ZSBQdWJsaWMgVHJ1c3QgUlNBIFJvb3QtMDIwggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAwggIK\n"
"AoICAQDh+g77aAASyE3VrCLENQE7xVTlWXZjpX/rwcRqmL0yjReA61260WI9JSMZNRTpf4mn\n"
"G2I81lDnNJUDMrG0kyI9p+Kx7eZ7Ti6Hmw0zdQreqjXnfuU2mKKuJZ6VszKWpCtYHu8//mI0\n"
"SFHRtI1CrWDaSWqVcN3SAOLMV2MCe5bdSZdbkk6V0/nLKR8YSvgBKtJjCW4k6YnS5cciTNxz\n"
"hkcAqg2Ijq6FfUrpuzNPDlJwnZXjfG2WWy09X6GDRl224yW4fKcZgBzqZUPckXk2LHR88mcG\n"
"yYnJ27/aaL8j7dxrrSiDeS/sOKUNNwFnJ5rpM9kzXzehxfCrPfp4sOcsn/Y+n2Dg70jpkEUe\n"
"BVF4GiwSLFworA2iI540jwXmojPOEXcT1A6kHkIfhs1w/tkuFT0du7jyU1fbzMZ0KZwYszZ1\n"
"OC4PVKH4kh+Jlk+71O6d6Ts2QrUKOyrUZHk2EOH5kQMreyBUzQ0ZGshBMjTRsJnhkB4BQDa1\n"
"t/qp5Xd1pCKBXbCL5CcSD1SIxtuFdOa3wNemKfrb3vOTlycEVS8KbzfFPROvCgCpLIscgSjX\n"
"74Yxqa7ybrjKaixUR9gqiC6vwQcQeKwRoi9C8DfF8rhW3Q5iLc4tVn5V8qdE9isy9COoR+jU\n"
"KgF4z2rDN6ieZdIs5fq6M8EGRPbmz6UNp2YINIos8wIDAQABo0IwQDAPBgNVHRMBAf8EBTAD\n"
"AQH/MA4GA1UdDwEB/wQEAwIBBjAdBgNVHQ4EFgQUR9DnsSL/nSz12Vdgs7GxcJXvYXowDQYJ\n"
"KoZIhvcNAQELBQADggIBAIZpsU0v6Z9PIpNojuQhmaPORVMbc0RTAIFhzTHjCLqBKCh6krm2\n"
"qMhDnscTJk3C2OVVnJJdUNjCK9v+5qiXz1I6JMNlZFxHMaNlNRPDk7n3+VGXu6TwYofF1gbT\n"
"l4MgqX67tiHCpQ2EAOHyJxCDut0DgdXdaMNmEMjRdrSzbymeAPnCKfWxkxlSaRosTKCL4BWa\n"
"MS/TiJVZbuXEs1DIFAhKm4sTg7GkcrI7djNB3NyqpgdvHSQSn8h2vS/ZjvQs7rfSOBAkNlEv\n"
"41xdgSGn2rtO/+YHqP65DSdsu3BaVXoT6fEqSWnHX4dXTEN5bTpl6TBcQe7rd6VzEojov32u\n"
"5cSoHw2OHG1QAk8mGEPej1WFsQs3BWDJVTkSBKEqz3EWnzZRSb9wO55nnPt7eck5HHisd5FU\n"
"mrh1CoFSl+NmYWvtPjgelmFV4ZFUjO2MJB+ByRCac5krFk5yAD9UG/iNuovnFNa2RU9g7Jau\n"
"wy8CTl2dlklyALKrdVwPaFsdZcJfMw8eD/A7hvWwTruc9+olBdytoptLFwG+Qt81IR2tq670\n"
"v64fG9PiO/yzcnMcmyiQiRM9HcEARwmWmjgb3bHPDcK0RPOWlc4yOo80nOAXx17Org3bhzjl\n"
"P1v9mxnhMUF6cKojawHhRUzNlM47ni3niAIi9G7oyOzWPPO5std3eqx7\n"
"-----END CERTIFICATE-----",.len=1935},

/* Telekom Security TLS ECC Root 2020 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIICQjCCAcmgAwIBAgIQNjqWjMlcsljN0AFdxeVXADAKBggqhkjOPQQDAzBjMQswCQYDVQQG\n"
"EwJERTEnMCUGA1UECgweRGV1dHNjaGUgVGVsZWtvbSBTZWN1cml0eSBHbWJIMSswKQYDVQQD\n"
"DCJUZWxla29tIFNlY3VyaXR5IFRMUyBFQ0MgUm9vdCAyMDIwMB4XDTIwMDgyNTA3NDgyMFoX\n"
"DTQ1MDgyNTIzNTk1OVowYzELMAkGA1UEBhMCREUxJzAlBgNVBAoMHkRldXRzY2hlIFRlbGVr\n"
"b20gU2VjdXJpdHkgR21iSDErMCkGA1UEAwwiVGVsZWtvbSBTZWN1cml0eSBUTFMgRUNDIFJv\n"
"b3QgMjAyMDB2MBAGByqGSM49AgEGBSuBBAAiA2IABM6//leov9Wq9xCazbzREaK9Z0LMkOsV\n"
"GJDZos0MKiXrPk/OtdKPD/M12kOLAoC+b1EkHQ9rK8qfwm9QMuU3ILYg/4gND21Ju9sGpIeQ\n"
"kpT0CdDPf8iAC8GXs7s1J8nCG6NCMEAwHQYDVR0OBBYEFONyzG6VmUex5rNhTNHLq+O6zd6f\n"
"MA8GA1UdEwEB/wQFMAMBAf8wDgYDVR0PAQH/BAQDAgEGMAoGCCqGSM49BAMDA2cAMGQCMHVS\n"
"i7ekEE+uShCLsoRbQuHmKjYC2qBuGT8lv9pZMo7k+5Dck2TOrbRBR2Diz6fLHgIwN0GMZt9B\n"
"a9aDAEH9L1r3ULRn0SyocddDypwnJJGDSA3PzfdUga/sf+Rn27iQ7t0l\n"
"-----END CERTIFICATE-----",.len=840},

/* Telekom Security TLS RSA Root 2023 */
{.str="-----BEGIN CERTIFICATE-----\n"
"MIIFszCCA5ugAwIBAgIQIZxULej27HF3+k7ow3BXlzANBgkqhkiG9w0BAQwFADBjMQswCQYD\n"
"VQQGEwJERTEnMCUGA1UECgweRGV1dHNjaGUgVGVsZWtvbSBTZWN1cml0eSBHbWJIMSswKQYD\n"
"VQQDDCJUZWxla29tIFNlY3VyaXR5IFRMUyBSU0EgUm9vdCAyMDIzMB4XDTIzMDMyODEyMTY0\n"
"NVoXDTQ4MDMyNzIzNTk1OVowYzELMAkGA1UEBhMCREUxJzAlBgNVBAoMHkRldXRzY2hlIFRl\n"
"bGVrb20gU2VjdXJpdHkgR21iSDErMCkGA1UEAwwiVGVsZWtvbSBTZWN1cml0eSBUTFMgUlNB\n"
"IFJvb3QgMjAyMzCCAiIwDQYJKoZIhvcNAQEBBQADggIPADCCAgoCggIBAO01oYGA88tKaVvC\n"
"+1GDrib94W7zgRJ9cUD/h3VCKSHtgVIs3xLBGYSJwb3FKNXVS2xE1kzbB5ZKVXrKNoIENqil\n"
"/Cf2SfHVcp6R+SPWcHu79ZvB7JPPGeplfohwoHP89v+1VmLhc2o0mD6CuKyVU/QBoCcHcqMA\n"
"U6DksquDOFczJZSfvkgdmOGjup5czQRxUX11eKvzWarE4GC+j4NSuHUaQTXtvPM6Y+mpFEXX\n"
"5lLRbtLevOP1Czvm4MS9Q2QTps70mDdsipWol8hHD/BeEIvnHRz+sTugBTNoBUGCwQMrAcjn\n"
"j02r6LX2zWtEtefdi+zqJbQAIldNsLGyMcEWzv/9FIS3R/qy8XDe24tsNlikfLMR0cN3f1+2\n"
"JeANxdKz+bi4d9s3cXFH42AYTyS2dTd4uaNir73Jco4vzLuu2+QVUhkHM/tqty1LkCiCc/4Y\n"
"izWN26cEar7qwU02OxY2kTLvtkCJkUPg8qKrBC7m8kwOFjQgrIfBLX7JZkcXFBGk8/ehJImr\n"
"2BrIoVyxo/eMbcgByU/J7MT8rFEz0ciD0cmfHdRHNCk+y7AO+oMLKFjlKdw/fKifybYKu6bo\n"
"RhYPluV75Gp6SG12mAWl3G0eQh5C2hrgUve1g8Aae3g1LDj1H/1Joy7SWWO/gLCMk3PLNaaZ\n"
"lSJhZQNg+y+TS/qanIA7AgMBAAGjYzBhMA4GA1UdDwEB/wQEAwIBBjAdBgNVHQ4EFgQUtqeX\n"
"gj10hZv3PJ+TmpV5dVKMbUcwDwYDVR0TAQH/BAUwAwEB/zAfBgNVHSMEGDAWgBS2p5eCPXSF\n"
"m/c8n5OalXl1UoxtRzANBgkqhkiG9w0BAQwFAAOCAgEAqMxhpr51nhVQpGv7qHBFfLp+sVr8\n"
"WyP6Cnf4mHGCDG3gXkaqk/QeoMPhk9tLrbKmXauw1GLLXrtm9S3ul0A8Yute1hTWjOKWi0Fp\n"
"kzXmuZlrYrShF2Y0pmtjxrlO8iLpWA1WQdH6DErwM807u20hOq6OcrXDSvvpfeWxm4bu4uB9\n"
"tPcy/SKE8YXJN3nptT+/XOR0so8RYgDdGGah2XsjX/GO1WfoVNpbOms2b/mBsTNHM3dA+VKq\n"
"3dSDz4V4mZqTuXNnQkYRIer+CqkbGmVps4+uFrb2S1ayLfmlyOw7YqPta9BO1UAJpB+Y1zql\n"
"klkg5LB9zVtzaL1txKITDmcZuI1CfmwMmm6gJC3VRRvcxAIU/oVbZZfKTpBQCHpCNfnqwmbU\n"
"+AGuHrS+w6jv/naaoqYfRvaE7fzbzsQCzndILIyy7MMAo+wsVRjBfhnu4S/yrYObnqsZ38aK\n"
"L4x35bcF7DvB7L6Gs4a8wPfc5+pbrrLMtTWGS9DiP7bY+A4A7l3j941Y/8+LN+ljX273CXE2\n"
"whJdV/LItM3z7gLfEdxquVeEHVlNjM7IDiPCtyaaEBRx/pOyiriA8A4QntOoUAw3gi/q4Iqd\n"
"4Sw5/7W0cwDk90imc6y/st53BIe0o82bNSQ3+pCTE4FCxpgmdTdmQRCsu/WU48IxK63nI1bM\n"
"NSWSs1A=\n"
"-----END CERTIFICATE-----",.len=2033},
};