Compare commits

..

33 Commits

Author  SHA1  Message  Date
Jarred Sumner  ab04e82f55  good enough for now  2022-06-05 04:44:05 -07:00
Jarred Sumner  5aa196b361  take two  2022-06-04 20:01:33 -07:00
Jarred Sumner  9f640ffb51  impl #1  2022-06-03 18:49:12 -07:00
Jarred Sumner  af6859acc2  temp  2022-06-03 04:45:50 -07:00
Jarred Sumner  e5322eb63b  Move streams to it's own file  2022-06-03 04:44:11 -07:00
Jarred Sumner  102553dca6  Update streams.test.js  2022-06-03 04:43:38 -07:00
Jarred Sumner  549e5bbbcd  Handle empty files/blobs  2022-06-02 18:36:10 -07:00
Jarred Sumner  57ad68a4b0  Fix off by one & exceptions  2022-06-02 17:55:03 -07:00
Jarred Sumner  e5eabc0658  Faster ReadableStream  2022-06-02 03:00:45 -07:00
Jarred Sumner  121c2960de  Create bun-jsc.test.js  2022-06-01 07:31:14 -07:00
Jarred Sumner  a96a2a8b12  Update WebKit  2022-06-01 07:29:11 -07:00
Jarred Sumner  6e8a30e999  Update mimalloc  2022-06-01 07:29:05 -07:00
Jarred Sumner  7b455492d3  Update Makefile  2022-06-01 07:29:00 -07:00
Jarred Sumner  f9710e270a  Update streams.test.js  2022-06-01 07:28:56 -07:00
Jarred Sumner  515da14621  More streams  2022-06-01 07:28:53 -07:00
Jarred Sumner  b735de0d3f  bun:jsc  2022-06-01 07:28:35 -07:00
Jarred Sumner  bf28afad1d  Add releaseWEakrefs binding  2022-06-01 07:28:19 -07:00
Jarred Sumner  73ad5c4f67  More WebKit updates  2022-06-01 07:27:59 -07:00
Jarred Sumner  73ec79fa8e  WebKit updates  2022-06-01 07:27:39 -07:00
Jarred Sumner  3083718e3f  faster async/await + readablestream optimizations  2022-05-30 17:11:39 -07:00
Jarred Sumner  17cdc0fee6  fix segfault  2022-05-30 17:11:39 -07:00
Jarred Sumner  26c890d3a1  make jsc bindings a little easier to work with  2022-05-30 17:11:39 -07:00
Jarred Sumner  564b6d12f0  [fs.readFile] Improve performance  2022-05-30 17:11:39 -07:00
Jarred Sumner  f4aa3a8c34  Add an extra function  2022-05-30 17:11:39 -07:00
Jarred Sumner  ed63b22f7a  Implement Bun.file(pathOrFd).stream()  2022-05-30 17:11:39 -07:00
Jarred Sumner  b18b0efb8b  Implement Blob.stream()  2022-05-30 17:11:39 -07:00
Jarred Sumner  7f3bc2b9e6  start to implement native streams  2022-05-30 17:11:36 -07:00
Jarred Sumner  c362729186  reading almost works  2022-05-30 17:11:14 -07:00
Jarred Sumner  f0e58fec49  [TextEncoder] 3x faster in hot loops  2022-05-30 17:11:14 -07:00
Jarred Sumner  9fcc5f27e8  Fix sourcemaps crash  2022-05-30 17:11:14 -07:00
Jarred Sumner  ab2d25bfec  Update streams.test.js  2022-05-30 17:11:14 -07:00
Jarred Sumner  3e55023819  Implement Blob.stream()  2022-05-30 17:11:12 -07:00
Jarred Sumner  4fb9354439  [bun.js] WritableStream, ReadableStream, TransformStream, WritableStreamDefaultController, ReadableStreamDefaultController & more  2022-05-30 17:10:47 -07:00
34236 changed files with 243897 additions and 1401956 deletions

View File

@@ -1,31 +0,0 @@
# Uploads the latest CI workflow to Buildkite.
# https://buildkite.com/docs/pipelines/defining-steps
#
# Changes to this file must be manually edited here:
# https://buildkite.com/bun/bun/settings/steps
steps:
- if: "build.pull_request.repository.fork"
block: ":eyes:"
prompt: "Did you review the PR?"
blocked_state: "running"
- label: ":pipeline:"
agents:
queue: "build-darwin"
command:
- ".buildkite/scripts/prepare-build.sh"
- if: "build.branch == 'main' && !build.pull_request.repository.fork"
label: ":github:"
agents:
queue: "test-darwin"
depends_on:
- "darwin-aarch64-build-bun"
- "darwin-x64-build-bun"
- "linux-aarch64-build-bun"
- "linux-x64-build-bun"
- "linux-x64-baseline-build-bun"
- "windows-x64-build-bun"
- "windows-x64-baseline-build-bun"
command:
- ".buildkite/scripts/upload-release.sh"

View File

@@ -1,63 +0,0 @@
## CI
How does CI work?
### Building
Bun is built on macOS, Linux, and Windows. The process is split into the following steps, the first three of which can run in parallel:
#### 1. `build-deps`
Builds the static libraries in `src/deps` and outputs a directory: `build/bun-deps`.
- on Windows, this runs the script: [`scripts/all-dependencies.ps1`](scripts/all-dependencies.ps1)
- on macOS and Linux, this runs the script: [`scripts/all-dependencies.sh`](scripts/all-dependencies.sh)
#### 2. `build-zig`
Builds the Zig object file: `build/bun-zig.o`. Because `zig build` supports cross-compiling, this step runs on macOS aarch64, which we have observed to be the fastest.
- on macOS and Linux, this runs the script: [`scripts/build-bun-zig.sh`](scripts/build-bun-zig.sh)
#### 3. `build-cpp`
Builds the C++ object file: `build/bun-cpp-objects.a`.
- on Windows, this runs the script: [`scripts/build-bun-cpp.ps1`](scripts/build-bun-cpp.ps1)
- on macOS and Linux, this runs the script: [`scripts/build-bun-cpp.sh`](scripts/build-bun-cpp.sh)
#### 4. `link` / `build-bun`
After the `build-deps`, `build-zig`, and `build-cpp` steps have completed, this step links the Zig and C++ object files into a single binary, packaged as `bun-<os>-<arch>.zip`.
- on Windows, this runs the script: [`scripts/buildkite-link-bun.ps1`](scripts/buildkite-link-bun.ps1)
- on macOS and Linux, this runs the script: [`scripts/buildkite-link-bun.sh`](scripts/buildkite-link-bun.sh)
To speed up the build, there are two options:
- `--fast`: This disables the LTO (link-time optimization) step.
- without `--fast`: This runs the LTO step, which is the default. The binaries that are released to GitHub are always built with LTO.
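Putting steps 1-4 together, a rough local approximation of the macOS/Linux flow is sketched below. This is illustrative only: in CI each step runs on its own agent with artifacts passed around via `buildkite-agent`, steps 1-3 run in parallel rather than sequentially, and the scripts may expect Buildkite environment variables (such as `BUILDKITE_STEP_KEY`) to be set.

```bash
#!/bin/bash
# Illustrative sketch of the build flow described above -- not the actual CI entrypoint.
set -euo pipefail

# Steps 1-3 are independent (run in parallel in CI); shown sequentially here.
./scripts/all-dependencies.sh   # 1. build-deps -> build/bun-deps/
./scripts/build-bun-zig.sh      # 2. build-zig  -> build/bun-zig.o
./scripts/build-bun-cpp.sh      # 3. build-cpp  -> build/bun-cpp-objects.a

# 4. link / build-bun: combine the Zig and C++ objects into bun-<os>-<arch>.zip.
./scripts/buildkite-link-bun.sh
```

In CI, the LTO choice reaches the link step through the `USE_LTO` variable (see `env.sh` and `buildkite-link-bun.sh` below).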
### Testing
### FAQ
> How do I add a new CI agent?
> How do I add/modify system dependencies?
> How do I SSH into a CI agent?
### Known issues
These are things that we know about, but haven't fixed or optimized yet.
- There is no `scripts/build-bun-zig.ps1` for Windows.
- The `build-deps` step is not cached in CI, so it rebuilds each time (though it does use ccache). It checks the `BUN_DEPS_CACHE_DIR` environment variable, but for an as-yet-unknown reason the cache is not reused.
- Windows and Linux machines sometimes take 1-2 minutes to start tests, because robobun waits until the job is scheduled before provisioning the VM. Instead, it could start provisioning during the link step, or keep a pool of idle VMs around (though it's unclear how much more expensive that would be).
- There are a limited number of macOS VMs, because they are expensive and manually provisioned, mostly through MacStadium. If wait times are too long we can provision or buy more.
- To prevent idle machines, robobun periodically checks for idle machines and terminates them. Before doing this, it checks whether the machine is connected as an agent to Buildkite. However, the machine sometimes picks up a job in that window, and the job is terminated.

View File

@@ -1,790 +0,0 @@
# Build and test Bun on macOS, Linux, and Windows.
# https://buildkite.com/docs/pipelines/defining-steps
#
# If a step has the `robobun: true` label, robobun will listen
# to webhooks from Buildkite and provision a VM to run the step.
#
# Changes to this file will be automatically uploaded on the next run
# for a particular commit.
#
# Future test machines to be added:
# - macOS 12
# - Windows Server 2016 & 2019
# - Amazon Linux 2 & 2023
# - CentOS / RHEL / Fedora / other Linux distros
# - Docker containers
# - Raspberry Pi?
steps:
# macOS aarch64
- key: "darwin-aarch64"
group: ":darwin: aarch64"
steps:
- key: "darwin-aarch64-build-deps"
label: ":darwin: aarch64 - build-deps"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-deps.sh"
- key: "darwin-aarch64-build-zig"
label: ":darwin: aarch64 - build-zig"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh darwin aarch64"
- key: "darwin-aarch64-build-cpp"
label: ":darwin: aarch64 - build-cpp"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-cpp.sh"
- key: "darwin-aarch64-build-bun"
label: ":darwin: aarch64 - build-bun"
depends_on:
- "darwin-aarch64-build-deps"
- "darwin-aarch64-build-zig"
- "darwin-aarch64-build-cpp"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-bun.sh"
- key: "darwin-aarch64-test-macos-14"
label: ":darwin: 14 aarch64 - test-bun"
if: "build.branch != 'main'"
parallelism: 3
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "darwin-aarch64-build-bun"
agents:
queue: "test-darwin"
os: "darwin"
arch: "aarch64"
release: "14"
command:
- "./scripts/runner.node.mjs --step darwin-aarch64-build-bun"
- key: "darwin-aarch64-test-macos-13"
label: ":darwin: 13 aarch64 - test-bun"
if: "build.branch != 'main'"
parallelism: 3
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "darwin-aarch64-build-bun"
agents:
queue: "test-darwin"
os: "darwin"
arch: "aarch64"
release: "13"
command:
- "./scripts/runner.node.mjs --step darwin-aarch64-build-bun"
# macOS x64
- key: "darwin-x64"
group: ":darwin: x64"
steps:
- key: "darwin-x64-build-deps"
label: ":darwin: x64 - build-deps"
agents:
queue: "build-darwin"
os: "darwin"
arch: "x64"
command:
- "./.buildkite/scripts/build-deps.sh"
- key: "darwin-x64-build-zig"
label: ":darwin: x64 - build-zig"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh darwin x64"
- key: "darwin-x64-build-cpp"
label: ":darwin: x64 - build-cpp"
agents:
queue: "build-darwin"
os: "darwin"
arch: "x64"
command:
- "./.buildkite/scripts/build-cpp.sh"
- key: "darwin-x64-build-bun"
label: ":darwin: x64 - build-bun"
depends_on:
- "darwin-x64-build-deps"
- "darwin-x64-build-zig"
- "darwin-x64-build-cpp"
agents:
queue: "build-darwin"
os: "darwin"
arch: "x64"
command:
- "./.buildkite/scripts/build-bun.sh"
- key: "darwin-x64-test-macos-14"
label: ":darwin: 14 x64 - test-bun"
if: "build.branch != 'main'"
parallelism: 2
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "darwin-x64-build-bun"
agents:
queue: "test-darwin"
os: "darwin"
arch: "x64"
release: "14"
command:
- "./scripts/runner.node.mjs --step darwin-x64-build-bun"
- key: "darwin-x64-test-macos-13"
label: ":darwin: 13 x64 - test-bun"
if: "build.branch != 'main'"
parallelism: 2
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "darwin-x64-build-bun"
agents:
queue: "test-darwin"
os: "darwin"
arch: "x64"
release: "13"
command:
- "./scripts/runner.node.mjs --step darwin-x64-build-bun"
# Linux aarch64
- key: "linux-aarch64"
group: ":linux: aarch64"
steps:
- key: "linux-aarch64-build-deps"
label: ":linux: aarch64 - build-deps"
agents:
queue: "build-linux"
os: "linux"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-deps.sh"
- key: "linux-aarch64-build-zig"
label: ":linux: aarch64 - build-zig"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh linux aarch64"
- key: "linux-aarch64-build-cpp"
label: ":linux: aarch64 - build-cpp"
agents:
queue: "build-linux"
os: "linux"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-cpp.sh"
- key: "linux-aarch64-build-bun"
label: ":linux: aarch64 - build-bun"
depends_on:
- "linux-aarch64-build-deps"
- "linux-aarch64-build-zig"
- "linux-aarch64-build-cpp"
agents:
queue: "build-linux"
os: "linux"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-bun.sh"
- key: "linux-aarch64-test-debian-12"
label: ":debian: 12 aarch64 - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-aarch64-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "aarch64"
distro: "debian"
release: "12"
command:
- "./scripts/runner.node.mjs --step linux-aarch64-build-bun"
- key: "linux-aarch64-test-ubuntu-2204"
label: ":ubuntu: 22.04 aarch64 - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-aarch64-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "aarch64"
distro: "ubuntu"
release: "22.04"
command:
- "./scripts/runner.node.mjs --step linux-aarch64-build-bun"
- key: "linux-aarch64-test-ubuntu-2004"
label: ":ubuntu: 20.04 aarch64 - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-aarch64-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "aarch64"
distro: "ubuntu"
release: "20.04"
command:
- "./scripts/runner.node.mjs --step linux-aarch64-build-bun"
# Linux x64
- key: "linux-x64"
group: ":linux: x64"
steps:
- key: "linux-x64-build-deps"
label: ":linux: x64 - build-deps"
agents:
queue: "build-linux"
os: "linux"
arch: "x64"
command:
- "./.buildkite/scripts/build-deps.sh"
- key: "linux-x64-build-zig"
label: ":linux: x64 - build-zig"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh linux x64"
- key: "linux-x64-build-cpp"
label: ":linux: x64 - build-cpp"
agents:
queue: "build-linux"
os: "linux"
arch: "x64"
command:
- "./.buildkite/scripts/build-cpp.sh"
- key: "linux-x64-build-bun"
label: ":linux: x64 - build-bun"
depends_on:
- "linux-x64-build-deps"
- "linux-x64-build-zig"
- "linux-x64-build-cpp"
agents:
queue: "build-linux"
os: "linux"
arch: "x64"
command:
- "./.buildkite/scripts/build-bun.sh"
- key: "linux-x64-test-debian-12"
label: ":debian: 12 x64 - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-x64-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "x64"
distro: "debian"
release: "12"
command:
- "./scripts/runner.node.mjs --step linux-x64-build-bun"
- key: "linux-x64-test-ubuntu-2204"
label: ":ubuntu: 22.04 x64 - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-x64-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "x64"
distro: "ubuntu"
release: "22.04"
command:
- "./scripts/runner.node.mjs --step linux-x64-build-bun"
- key: "linux-x64-test-ubuntu-2004"
label: ":ubuntu: 20.04 x64 - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-x64-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "x64"
distro: "ubuntu"
release: "20.04"
command:
- "./scripts/runner.node.mjs --step linux-x64-build-bun"
# Linux x64-baseline
- key: "linux-x64-baseline"
group: ":linux: x64-baseline"
steps:
- key: "linux-x64-baseline-build-deps"
label: ":linux: x64-baseline - build-deps"
agents:
queue: "build-linux"
os: "linux"
arch: "x64"
command:
- "./.buildkite/scripts/build-deps.sh"
- key: "linux-x64-baseline-build-zig"
label: ":linux: x64-baseline - build-zig"
agents:
queue: "build-darwin"
os: "darwin"
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh linux x64"
- key: "linux-x64-baseline-build-cpp"
label: ":linux: x64-baseline - build-cpp"
agents:
queue: "build-linux"
os: "linux"
arch: "x64"
command:
- "./.buildkite/scripts/build-cpp.sh"
- key: "linux-x64-baseline-build-bun"
label: ":linux: x64-baseline - build-bun"
depends_on:
- "linux-x64-baseline-build-deps"
- "linux-x64-baseline-build-zig"
- "linux-x64-baseline-build-cpp"
agents:
queue: "build-linux"
os: "linux"
arch: "x64"
command:
- "./.buildkite/scripts/build-bun.sh"
- key: "linux-x64-baseline-test-debian-12"
label: ":debian: 12 x64-baseline - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-x64-baseline-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "x64"
distro: "debian"
release: "12"
command:
- "./scripts/runner.node.mjs --step linux-x64-baseline-build-bun"
- key: "linux-x64-baseline-test-ubuntu-2204"
label: ":ubuntu: 22.04 x64-baseline - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-x64-baseline-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "x64"
distro: "ubuntu"
release: "22.04"
command:
- "./scripts/runner.node.mjs --step linux-x64-baseline-build-bun"
- key: "linux-x64-baseline-test-ubuntu-2004"
label: ":ubuntu: 20.04 x64-baseline - test-bun"
if: "build.branch != 'main'"
parallelism: 5
soft_fail:
- exit_status: 2
retry:
automatic:
- exit_status: 1
limit: 1
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "linux-x64-baseline-build-bun"
agents:
robobun: "true"
os: "linux"
arch: "x64"
distro: "ubuntu"
release: "20.04"
command:
- "./scripts/runner.node.mjs --step linux-x64-baseline-build-bun"
# Windows x64
- key: "windows-x64"
group: ":windows: x64"
steps:
- key: "windows-x64-build-deps"
label: ":windows: x64 - build-deps"
agents:
queue: "build-windows"
os: "windows"
arch: "x64"
artifact_paths:
- "build\\bun-deps\\*.lib"
env:
SCCACHE_DIR: "$$HOME\\.cache\\sccache"
ZIG_LOCAL_CACHE_DIR: "$$HOME\\.cache\\zig-cache"
SCCACHE_IGNORE_SERVER_IO_ERROR: "1"
command:
- ".\\scripts\\all-dependencies.ps1"
- key: "windows-x64-build-zig"
label: ":windows: x64 - build-zig"
agents:
queue: "build-darwin"
os: "darwin" # cross-compile on Linux or Darwin
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh windows x64"
- key: "windows-x64-build-cpp"
label: ":windows: x64 - build-cpp"
agents:
queue: "build-windows"
os: "windows"
arch: "x64"
artifact_paths:
# HACK: See scripts/build-bun-cpp.ps1
# - "build\\bun-cpp-objects.a"
- "build\\bun-cpp-objects.a.*"
command:
- ".\\scripts\\build-bun-cpp.ps1"
- key: "windows-x64-build-bun"
label: ":windows: x64 - build-bun"
depends_on:
- "windows-x64-build-deps"
- "windows-x64-build-zig"
- "windows-x64-build-cpp"
agents:
queue: "build-windows"
os: "windows"
arch: "x64"
artifact_paths:
- "bun-windows-x64.zip"
- "bun-windows-x64-profile.zip"
- "features.json"
env:
SCCACHE_DIR: "$$HOME\\.cache\\sccache"
ZIG_LOCAL_CACHE_DIR: "$$HOME\\.cache\\zig-cache"
SCCACHE_IGNORE_SERVER_IO_ERROR: "1"
command:
- ".\\scripts\\buildkite-link-bun.ps1"
- key: "windows-x64-test-bun"
label: ":windows: x64 - test-bun"
if: "build.branch != 'main'"
parallelism: 10
soft_fail:
- exit_status: 1
retry:
automatic:
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "windows-x64-build-bun"
agents:
robobun: "true"
os: "windows"
arch: "x64"
command:
- "node .\\scripts\\runner.node.mjs --step windows-x64-build-bun"
# Windows x64-baseline
- key: "windows-x64-baseline"
group: ":windows: x64-baseline"
steps:
- key: "windows-x64-baseline-build-deps"
label: ":windows: x64-baseline - build-deps"
agents:
queue: "build-windows"
os: "windows"
arch: "x64"
artifact_paths:
- "build\\bun-deps\\*.lib"
env:
SCCACHE_DIR: "$$HOME\\.cache\\sccache"
ZIG_LOCAL_CACHE_DIR: "$$HOME\\.cache\\zig-cache"
SCCACHE_IGNORE_SERVER_IO_ERROR: "1"
USE_BASELINE_BUILD: "1"
command:
- ".\\scripts\\all-dependencies.ps1"
- key: "windows-x64-baseline-build-zig"
label: ":windows: x64-baseline - build-zig"
agents:
queue: "build-darwin"
os: "darwin" # cross-compile on Linux or Darwin
arch: "aarch64"
command:
- "./.buildkite/scripts/build-zig.sh windows x64"
- key: "windows-x64-baseline-build-cpp"
label: ":windows: x64-baseline - build-cpp"
agents:
queue: "build-windows"
os: "windows"
arch: "x64"
artifact_paths:
# HACK: See scripts/build-bun-cpp.ps1
# - "build\\bun-cpp-objects.a"
- "build\\bun-cpp-objects.a.*"
env:
SCCACHE_DIR: "$$HOME\\.cache\\sccache"
ZIG_LOCAL_CACHE_DIR: "$$HOME\\.cache\\zig-cache"
SCCACHE_IGNORE_SERVER_IO_ERROR: "1"
USE_BASELINE_BUILD: "1"
command:
- ".\\scripts\\build-bun-cpp.ps1"
- key: "windows-x64-baseline-build-bun"
label: ":windows: x64-baseline - build-bun"
depends_on:
- "windows-x64-baseline-build-deps"
- "windows-x64-baseline-build-zig"
- "windows-x64-baseline-build-cpp"
agents:
queue: "build-windows"
os: "windows"
arch: "x64"
artifact_paths:
- "bun-windows-x64-baseline.zip"
- "bun-windows-x64-baseline-profile.zip"
- "features.json"
env:
SCCACHE_DIR: "$$HOME\\.cache\\sccache"
ZIG_LOCAL_CACHE_DIR: "$$HOME\\.cache\\zig-cache"
SCCACHE_IGNORE_SERVER_IO_ERROR: "1"
USE_BASELINE_BUILD: "1"
command:
- ".\\scripts\\buildkite-link-bun.ps1 -Baseline $$True"
- key: "windows-x64-baseline-test-bun"
label: ":windows: x64-baseline - test-bun"
if: "build.branch != 'main'"
parallelism: 10
soft_fail:
- exit_status: 1
retry:
automatic:
- exit_status: -1
limit: 3
- exit_status: 255
limit: 3
- signal_reason: agent_stop
limit: 3
- signal: SIGTERM
limit: 3
depends_on:
- "windows-x64-baseline-build-bun"
agents:
robobun: "true"
os: "windows"
arch: "x64"
command:
- "node .\\scripts\\runner.node.mjs --step windows-x64-baseline-build-bun"

View File

@@ -1,55 +0,0 @@
#!/bin/bash
set -eo pipefail
source "$(dirname "$0")/env.sh"
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
cwd="$(pwd)"
mkdir -p build
source "$(dirname "$0")/download-artifact.sh" "build/bun-deps/**" --step "$BUILDKITE_GROUP_KEY-build-deps"
source "$(dirname "$0")/download-artifact.sh" "build/bun-zig.o" --step "$BUILDKITE_GROUP_KEY-build-zig"
source "$(dirname "$0")/download-artifact.sh" "build/bun-cpp-objects.a" --step "$BUILDKITE_GROUP_KEY-build-cpp" --split
cd build
run_command cmake .. "${CMAKE_FLAGS[@]}" \
-GNinja \
-DBUN_LINK_ONLY="1" \
-DNO_CONFIGURE_DEPENDS="1" \
-DBUN_ZIG_OBJ_DIR="$cwd/build" \
-DBUN_CPP_ARCHIVE="$cwd/build/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="$cwd/build/bun-deps" \
-DCMAKE_BUILD_TYPE="$CMAKE_BUILD_TYPE" \
-DCPU_TARGET="$CPU_TARGET" \
-DUSE_LTO="$USE_LTO" \
-DUSE_DEBUG_JSC="$USE_DEBUG_JSC" \
-DCANARY="$CANARY" \
-DGIT_SHA="$GIT_SHA"
run_command ninja -v -j "$CPUS"
run_command ls
tag="bun-$BUILDKITE_GROUP_KEY"
if [ "$USE_LTO" == "OFF" ]; then
# Remove OS check when LTO is enabled on macOS again
if [[ "$tag" == *"darwin"* ]]; then
tag="$tag-nolto"
fi
fi
for name in bun bun-profile; do
dir="$tag"
if [ "$name" == "bun-profile" ]; then
dir="$tag-profile"
fi
run_command chmod +x "$name"
run_command "./$name" --revision
run_command mkdir -p "$dir"
run_command mv "$name" "$dir/$name"
run_command zip -r "$dir.zip" "$dir"
source "$cwd/.buildkite/scripts/upload-artifact.sh" "$dir.zip"
done

View File

@@ -1,35 +0,0 @@
#!/bin/bash
set -eo pipefail
source "$(dirname "$0")/env.sh"
export FORCE_UPDATE_SUBMODULES=1
source "$(realpath $(dirname "$0")/../../scripts/update-submodules.sh)"
{ set +x; } 2>/dev/null
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
mkdir -p build
cd build
mkdir -p tmp_modules tmp_functions js codegen
run_command cmake .. "${CMAKE_FLAGS[@]}" \
-GNinja \
-DBUN_CPP_ONLY="1" \
-DNO_CONFIGURE_DEPENDS="1" \
-DCMAKE_BUILD_TYPE="$CMAKE_BUILD_TYPE" \
-DCPU_TARGET="$CPU_TARGET" \
-DUSE_LTO="$USE_LTO" \
-DUSE_DEBUG_JSC="$USE_DEBUG_JSC" \
-DCANARY="$CANARY" \
-DGIT_SHA="$GIT_SHA"
chmod +x compile-cpp-only.sh
source compile-cpp-only.sh -v -j "$CPUS"
{ set +x; } 2>/dev/null
cd ..
source "$(dirname "$0")/upload-artifact.sh" "build/bun-cpp-objects.a" --split

View File

@@ -1,22 +0,0 @@
#!/bin/bash
set -eo pipefail
source "$(dirname "$0")/env.sh"
source "$(realpath $(dirname "$0")/../../scripts/all-dependencies.sh)"
artifacts=(
libcrypto.a libssl.a libdecrepit.a
libcares.a
libarchive.a
liblolhtml.a
libmimalloc.a libmimalloc.o
libtcc.a
libz.a
libzstd.a
libdeflate.a
liblshpack.a
)
for artifact in "${artifacts[@]}"; do
source "$(dirname "$0")/upload-artifact.sh" "build/bun-deps/$artifact"
done

View File

@@ -1,40 +0,0 @@
#!/bin/bash
set -eo pipefail
source "$(dirname "$0")/env.sh"
function assert_bun() {
if ! command -v bun &>/dev/null; then
echo "error: bun is not installed" 1>&2
exit 1
fi
}
function assert_make() {
if ! command -v make &>/dev/null; then
echo "error: make is not installed" 1>&2
exit 1
fi
}
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
function build_node_fallbacks() {
local cwd="src/node-fallbacks"
run_command bun install --cwd "$cwd" --frozen-lockfile
run_command bun run --cwd "$cwd" build
}
function build_old_js() {
run_command bun install --frozen-lockfile
run_command make runtime_js fallback_decoder bun_error
}
assert_bun
assert_make
build_node_fallbacks
build_old_js

View File

@@ -1,80 +0,0 @@
#!/bin/bash
set -eo pipefail
source "$(dirname "$0")/env.sh"
function assert_target() {
local arch="${2-$(uname -m)}"
case "$(echo "$arch" | tr '[:upper:]' '[:lower:]')" in
x64 | x86_64 | amd64)
export ZIG_ARCH="x86_64"
if [[ "$BUILDKITE_STEP_KEY" == *"baseline"* ]]; then
export ZIG_CPU_TARGET="nehalem"
else
export ZIG_CPU_TARGET="haswell"
fi
;;
aarch64 | arm64)
export ZIG_ARCH="aarch64"
export ZIG_CPU_TARGET="native"
;;
*)
echo "error: Unsupported architecture: $arch" 1>&2
exit 1
;;
esac
local os="${1-$(uname -s)}"
case "$(echo "$os" | tr '[:upper:]' '[:lower:]')" in
linux)
export ZIG_TARGET="$ZIG_ARCH-linux-gnu" ;;
darwin)
export ZIG_TARGET="$ZIG_ARCH-macos-none" ;;
windows)
export ZIG_TARGET="$ZIG_ARCH-windows-msvc" ;;
*)
echo "error: Unsupported operating system: $os" 1>&2
exit 1
;;
esac
}
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
assert_target "$@"
# Since the zig build depends on files from the zig submodule,
# make sure to update the submodule before building.
run_command git submodule update --init --recursive --progress --depth=1 --checkout src/deps/zig
# TODO: Move these to be part of the CMake build
source "$(dirname "$0")/build-old-js.sh"
cwd="$(pwd)"
mkdir -p build
cd build
run_command cmake .. "${CMAKE_FLAGS[@]}" \
-GNinja \
-DNO_CONFIGURE_DEPENDS="1" \
-DNO_CODEGEN="0" \
-DWEBKIT_DIR="omit" \
-DBUN_ZIG_OBJ_DIR="$cwd/build" \
-DZIG_LIB_DIR="$cwd/src/deps/zig/lib" \
-DCMAKE_BUILD_TYPE="$CMAKE_BUILD_TYPE" \
-DARCH="$ZIG_ARCH" \
-DCPU_TARGET="$ZIG_CPU_TARGET" \
-DZIG_TARGET="$ZIG_TARGET" \
-DUSE_LTO="$USE_LTO" \
-DUSE_DEBUG_JSC="$USE_DEBUG_JSC" \
-DCANARY="$CANARY" \
-DGIT_SHA="$GIT_SHA"
export ONLY_ZIG="1"
run_command ninja "$cwd/build/bun-zig.o" -v -j "$CPUS"
cd ..
source "$(dirname "$0")/upload-artifact.sh" "build/bun-zig.o"

View File

@@ -1,47 +0,0 @@
param (
[Parameter(Mandatory=$true)]
[string[]] $Paths,
[switch] $Split
)
$ErrorActionPreference = "Stop"
function Assert-Buildkite-Agent() {
if (-not (Get-Command "buildkite-agent" -ErrorAction SilentlyContinue)) {
Write-Error "Cannot find buildkite-agent, please install it: https://buildkite.com/docs/agent/v3/install"
exit 1
}
}
function Assert-Join-File() {
if (-not (Get-Command "Join-File" -ErrorAction SilentlyContinue)) {
Write-Error "Cannot find Join-File, please install it: https://www.powershellgallery.com/packages/FileSplitter/1.3"
exit 1
}
}
function Download-Buildkite-Artifact() {
param (
[Parameter(Mandatory=$true)]
[string] $Path
)
if ($Split) {
& buildkite-agent artifact download "$Path.*" --debug --debug-http
Join-File -Path "$(Resolve-Path .)\$Path" -Verbose -DeletePartFiles
} else {
& buildkite-agent artifact download "$Path" --debug --debug-http
}
if (-not (Test-Path $Path)) {
Write-Error "Could not find artifact: $Path"
exit 1
}
}
Assert-Buildkite-Agent
if ($Split) {
Assert-Join-File
}
foreach ($Path in $Paths) {
Download-Buildkite-Artifact $Path
}

View File

@@ -1,46 +0,0 @@
#!/bin/bash
set -eo pipefail
function assert_buildkite_agent() {
if ! command -v buildkite-agent &> /dev/null; then
echo "error: Cannot find buildkite-agent, please install it:"
echo "https://buildkite.com/docs/agent/v3/install"
exit 1
fi
}
function download_buildkite_artifact() {
local path="$1"; shift
local split="0"
local args=()
while true; do
if [ -z "$1" ]; then
break
fi
case "$1" in
--split) split="1"; shift ;;
*) args+=("$1"); shift ;;
esac
done
if [ "$split" == "1" ]; then
run_command buildkite-agent artifact download "$path.*" . "${args[@]}"
run_command cat $path.?? > "$path"
run_command rm -f $path.??
else
run_command buildkite-agent artifact download "$path" . "${args[@]}"
fi
if [[ "$path" != *"*"* ]] && [ ! -f "$path" ]; then
echo "error: Could not find artifact: $path"
exit 1
fi
}
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
assert_buildkite_agent
download_buildkite_artifact "$@"

View File

@@ -1,119 +0,0 @@
#!/bin/bash
set -eo pipefail
function assert_os() {
local os="$(uname -s)"
case "$os" in
Linux)
echo "linux" ;;
Darwin)
echo "darwin" ;;
*)
echo "error: Unsupported operating system: $os" 1>&2
exit 1
;;
esac
}
function assert_arch() {
local arch="$(uname -m)"
case "$arch" in
aarch64 | arm64)
echo "aarch64" ;;
x86_64 | amd64)
echo "x64" ;;
*)
echo "error: Unknown architecture: $arch" 1>&2
exit 1
;;
esac
}
function assert_build() {
if [ -z "$BUILDKITE_REPO" ]; then
echo "error: Cannot find repository for this build"
exit 1
fi
if [ -z "$BUILDKITE_COMMIT" ]; then
echo "error: Cannot find commit for this build"
exit 1
fi
if [ -z "$BUILDKITE_STEP_KEY" ]; then
echo "error: Cannot find step key for this build"
exit 1
fi
if [ -n "$BUILDKITE_GROUP_KEY" ] && [[ "$BUILDKITE_STEP_KEY" != "$BUILDKITE_GROUP_KEY"* ]]; then
echo "error: Build step '$BUILDKITE_STEP_KEY' does not start with group key '$BUILDKITE_GROUP_KEY'"
exit 1
fi
# Skip os and arch checks for Zig, since it's cross-compiled on macOS
if [[ "$BUILDKITE_STEP_KEY" != *"zig"* ]]; then
local os="$(assert_os)"
if [[ "$BUILDKITE_STEP_KEY" != *"$os"* ]]; then
echo "error: Build step '$BUILDKITE_STEP_KEY' does not match operating system '$os'"
exit 1
fi
local arch="$(assert_arch)"
if [[ "$BUILDKITE_STEP_KEY" != *"$arch"* ]]; then
echo "error: Build step '$BUILDKITE_STEP_KEY' does not match architecture '$arch'"
exit 1
fi
fi
}
function assert_buildkite_agent() {
if ! command -v buildkite-agent &> /dev/null; then
echo "error: Cannot find buildkite-agent, please install it:"
echo "https://buildkite.com/docs/agent/v3/install"
exit 1
fi
}
function export_environment() {
source "$(realpath $(dirname "$0")/../../scripts/env.sh)"
source "$(realpath $(dirname "$0")/../../scripts/update-submodules.sh)"
{ set +x; } 2>/dev/null
export GIT_SHA="$BUILDKITE_COMMIT"
export CCACHE_DIR="$HOME/.cache/ccache/$BUILDKITE_STEP_KEY"
export SCCACHE_DIR="$HOME/.cache/sccache/$BUILDKITE_STEP_KEY"
export ZIG_LOCAL_CACHE_DIR="$HOME/.cache/zig-cache/$BUILDKITE_STEP_KEY"
export BUN_DEPS_CACHE_DIR="$HOME/.cache/bun-deps/$BUILDKITE_STEP_KEY"
if [ "$(assert_arch)" == "aarch64" ]; then
export CPU_TARGET="native"
elif [[ "$BUILDKITE_STEP_KEY" == *"baseline"* ]]; then
export CPU_TARGET="nehalem"
else
export CPU_TARGET="haswell"
fi
if [[ "$BUILDKITE_STEP_KEY" == *"nolto"* ]]; then
export USE_LTO="OFF"
else
export USE_LTO="ON"
fi
if $(buildkite-agent meta-data exists release &> /dev/null); then
export CMAKE_BUILD_TYPE="$(buildkite-agent meta-data get release)"
else
export CMAKE_BUILD_TYPE="Release"
fi
if $(buildkite-agent meta-data exists canary &> /dev/null); then
export CANARY="$(buildkite-agent meta-data get canary)"
else
export CANARY="1"
fi
if $(buildkite-agent meta-data exists assertions &> /dev/null); then
export USE_DEBUG_JSC="$(buildkite-agent meta-data get assertions)"
else
export USE_DEBUG_JSC="OFF"
fi
if [ "$BUILDKITE_CLEAN_CHECKOUT" == "true" ]; then
rm -rf "$CCACHE_DIR"
rm -rf "$SCCACHE_DIR"
rm -rf "$ZIG_LOCAL_CACHE_DIR"
rm -rf "$BUN_DEPS_CACHE_DIR"
fi
}
assert_build
assert_buildkite_agent
export_environment

View File

@@ -1,97 +0,0 @@
#!/bin/bash
set -eo pipefail
function assert_build() {
if [ -z "$BUILDKITE_REPO" ]; then
echo "error: Cannot find repository for this build"
exit 1
fi
if [ -z "$BUILDKITE_COMMIT" ]; then
echo "error: Cannot find commit for this build"
exit 1
fi
}
function assert_buildkite_agent() {
if ! command -v buildkite-agent &> /dev/null; then
echo "error: Cannot find buildkite-agent, please install it:"
echo "https://buildkite.com/docs/agent/v3/install"
exit 1
fi
}
function assert_jq() {
assert_command "jq" "jq" "https://stedolan.github.io/jq/"
}
function assert_curl() {
assert_command "curl" "curl" "https://curl.se/download.html"
}
function assert_command() {
local command="$1"
local package="$2"
local help_url="$3"
if ! command -v "$command" &> /dev/null; then
echo "warning: $command is not installed, installing..."
if command -v brew &> /dev/null; then
HOMEBREW_NO_AUTO_UPDATE=1 brew install "$package"
else
echo "error: Cannot install $command, please install it"
if [ -n "$help_url" ]; then
echo ""
echo "hint: See $help_url for help"
fi
exit 1
fi
fi
}
function assert_release() {
if [ "$RELEASE" == "1" ]; then
run_command buildkite-agent meta-data set canary "0"
fi
}
function assert_canary() {
local canary="$(buildkite-agent meta-data get canary 2>/dev/null)"
if [ -z "$canary" ]; then
local repo=$(echo "$BUILDKITE_REPO" | sed -E 's#https://github.com/([^/]+)/([^/]+).git#\1/\2#g')
local tag="$(curl -sL "https://api.github.com/repos/$repo/releases/latest" | jq -r ".tag_name")"
if [ "$tag" == "null" ]; then
canary="1"
else
local revision=$(curl -sL "https://api.github.com/repos/$repo/compare/$tag...$BUILDKITE_COMMIT" | jq -r ".ahead_by")
if [ "$revision" == "null" ]; then
canary="1"
else
canary="$revision"
fi
fi
run_command buildkite-agent meta-data set canary "$canary"
fi
}
function upload_buildkite_pipeline() {
local path="$1"
if [ ! -f "$path" ]; then
echo "error: Cannot find pipeline: $path"
exit 1
fi
run_command buildkite-agent pipeline upload "$path"
}
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
assert_build
assert_buildkite_agent
assert_jq
assert_curl
assert_release
assert_canary
upload_buildkite_pipeline ".buildkite/ci.yml"

View File

@@ -1,47 +0,0 @@
param (
[Parameter(Mandatory=$true)]
[string[]] $Paths,
[switch] $Split
)
$ErrorActionPreference = "Stop"
function Assert-Buildkite-Agent() {
if (-not (Get-Command "buildkite-agent" -ErrorAction SilentlyContinue)) {
Write-Error "Cannot find buildkite-agent, please install it: https://buildkite.com/docs/agent/v3/install"
exit 1
}
}
function Assert-Split-File() {
if (-not (Get-Command "Split-File" -ErrorAction SilentlyContinue)) {
Write-Error "Cannot find Split-File, please install it: https://www.powershellgallery.com/packages/FileSplitter/1.3"
exit 1
}
}
function Upload-Buildkite-Artifact() {
param (
[Parameter(Mandatory=$true)]
[string] $Path
)
if (-not (Test-Path $Path)) {
Write-Error "Could not find artifact: $Path"
exit 1
}
if ($Split) {
Remove-Item -Path "$Path.*" -Force
Split-File -Path (Resolve-Path $Path) -PartSizeBytes "50MB" -Verbose
$Path = "$Path.*"
}
& buildkite-agent artifact upload "$Path" --debug --debug-http
}
Assert-Buildkite-Agent
if ($Split) {
Assert-Split-File
}
foreach ($Path in $Paths) {
Upload-Buildkite-Artifact $Path
}

View File

@@ -1,54 +0,0 @@
#!/bin/bash
set -eo pipefail
function assert_buildkite_agent() {
if ! command -v buildkite-agent &> /dev/null; then
echo "error: Cannot find buildkite-agent, please install it:"
echo "https://buildkite.com/docs/agent/v3/install"
exit 1
fi
}
function assert_split() {
if ! command -v split &> /dev/null; then
echo "error: Cannot find split, please install it:"
echo "https://www.gnu.org/software/coreutils/split"
exit 1
fi
}
function upload_buildkite_artifact() {
local path="$1"; shift
local split="0"
local args=()
while true; do
if [ -z "$1" ]; then
break
fi
case "$1" in
--split) split="1"; shift ;;
*) args+=("$1"); shift ;;
esac
done
if [ ! -f "$path" ]; then
echo "error: Could not find artifact: $path"
exit 1
fi
if [ "$split" == "1" ]; then
run_command rm -f "$path."*
run_command split -b 50MB -d "$path" "$path."
run_command buildkite-agent artifact upload "$path.*" "${args[@]}"
else
run_command buildkite-agent artifact upload "$path" "${args[@]}"
fi
}
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
assert_buildkite_agent
upload_buildkite_artifact "$@"

View File

@@ -1,211 +0,0 @@
#!/bin/bash
set -eo pipefail
function assert_main() {
if [ -z "$BUILDKITE_REPO" ]; then
echo "error: Cannot find repository for this build"
exit 1
fi
if [ -z "$BUILDKITE_COMMIT" ]; then
echo "error: Cannot find commit for this build"
exit 1
fi
if [ -n "$BUILDKITE_PULL_REQUEST_REPO" ] && [ "$BUILDKITE_REPO" != "$BUILDKITE_PULL_REQUEST_REPO" ]; then
echo "error: Cannot upload release from a fork"
exit 1
fi
if [ "$BUILDKITE_PULL_REQUEST" != "false" ]; then
echo "error: Cannot upload release from a pull request"
exit 1
fi
if [ "$BUILDKITE_BRANCH" != "main" ]; then
echo "error: Cannot upload release from a branch other than main"
exit 1
fi
}
function assert_buildkite_agent() {
if ! command -v "buildkite-agent" &> /dev/null; then
echo "error: Cannot find buildkite-agent, please install it:"
echo "https://buildkite.com/docs/agent/v3/install"
exit 1
fi
}
function assert_github() {
assert_command "gh" "gh" "https://github.com/cli/cli#installation"
assert_buildkite_secret "GITHUB_TOKEN"
# gh expects the token in $GH_TOKEN
export GH_TOKEN="$GITHUB_TOKEN"
}
function assert_aws() {
assert_command "aws" "awscli" "https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html"
for secret in "AWS_ACCESS_KEY_ID" "AWS_SECRET_ACCESS_KEY" "AWS_ENDPOINT"; do
assert_buildkite_secret "$secret"
done
assert_buildkite_secret "AWS_BUCKET" --skip-redaction
}
function assert_sentry() {
assert_command "sentry-cli" "getsentry/tools/sentry-cli" "https://docs.sentry.io/cli/installation/"
for secret in "SENTRY_AUTH_TOKEN" "SENTRY_ORG" "SENTRY_PROJECT"; do
assert_buildkite_secret "$secret"
done
}
function run_command() {
set -x
"$@"
{ set +x; } 2>/dev/null
}
function assert_command() {
local command="$1"
local package="$2"
local help_url="$3"
if ! command -v "$command" &> /dev/null; then
echo "warning: $command is not installed, installing..."
if command -v brew &> /dev/null; then
HOMEBREW_NO_AUTO_UPDATE=1 run_command brew install "$package"
else
echo "error: Cannot install $command, please install it"
if [ -n "$help_url" ]; then
echo ""
echo "hint: See $help_url for help"
fi
exit 1
fi
fi
}
function assert_buildkite_secret() {
local key="$1"
local value=$(buildkite-agent secret get "$key" ${@:2})
if [ -z "$value" ]; then
echo "error: Cannot find $key secret"
echo ""
echo "hint: Create a secret named $key with a value:"
echo "https://buildkite.com/docs/pipelines/buildkite-secrets"
exit 1
fi
export "$key"="$value"
}
function release_tag() {
local version="$1"
if [ "$version" == "canary" ]; then
echo "canary"
else
echo "bun-v$version"
fi
}
function create_sentry_release() {
local version="$1"
local release="$version"
if [ "$version" == "canary" ]; then
release="$BUILDKITE_COMMIT-canary"
fi
run_command sentry-cli releases new "$release" --finalize
run_command sentry-cli releases set-commits "$release" --auto --ignore-missing
if [ "$version" == "canary" ]; then
run_command sentry-cli deploys new --env="canary" --release="$release"
fi
}
function download_buildkite_artifact() {
local name="$1"
local dir="$2"
if [ -z "$dir" ]; then
dir="."
fi
run_command buildkite-agent artifact download "$name" "$dir"
if [ ! -f "$dir/$name" ]; then
echo "error: Cannot find Buildkite artifact: $name"
exit 1
fi
}
function upload_github_asset() {
local version="$1"
local tag="$(release_tag "$version")"
local file="$2"
run_command gh release upload "$tag" "$file" --clobber --repo "$BUILDKITE_REPO"
# Sometimes the upload fails, maybe this is a race condition in the gh CLI?
while [ "$(gh release view "$tag" --repo "$BUILDKITE_REPO" | grep -c "$file")" -eq 0 ]; do
echo "warn: Uploading $file to $tag failed, retrying..."
sleep "$((RANDOM % 5 + 1))"
run_command gh release upload "$tag" "$file" --clobber --repo "$BUILDKITE_REPO"
done
}
function update_github_release() {
local version="$1"
local tag="$(release_tag "$version")"
if [ "$tag" == "canary" ]; then
run_command gh release edit "$tag" --repo "$BUILDKITE_REPO" \
--notes "This release of Bun corresponds to the commit: $BUILDKITE_COMMIT"
fi
}
function upload_s3_file() {
local folder="$1"
local file="$2"
run_command aws --endpoint-url="$AWS_ENDPOINT" s3 cp "$file" "s3://$AWS_BUCKET/$folder/$file"
}
function create_release() {
assert_main
assert_buildkite_agent
assert_github
assert_aws
assert_sentry
local tag="$1" # 'canary' or 'x.y.z'
local artifacts=(
bun-darwin-aarch64.zip
bun-darwin-aarch64-profile.zip
bun-darwin-x64.zip
bun-darwin-x64-profile.zip
bun-linux-aarch64.zip
bun-linux-aarch64-profile.zip
bun-linux-x64.zip
bun-linux-x64-profile.zip
bun-linux-x64-baseline.zip
bun-linux-x64-baseline-profile.zip
bun-windows-x64.zip
bun-windows-x64-profile.zip
bun-windows-x64-baseline.zip
bun-windows-x64-baseline-profile.zip
)
function upload_artifact() {
local artifact="$1"
download_buildkite_artifact "$artifact"
upload_s3_file "releases/$BUILDKITE_COMMIT" "$artifact" &
upload_s3_file "releases/$tag" "$artifact" &
upload_github_asset "$tag" "$artifact" &
}
for artifact in "${artifacts[@]}"; do
upload_artifact "$artifact" &
done
wait
update_github_release "$tag"
create_sentry_release "$tag"
}
function assert_canary() {
local canary="$(buildkite-agent meta-data get canary 2>/dev/null)"
if [ -z "$canary" ] || [ "$canary" == "0" ]; then
echo "warn: Skipping release because this is not a canary build"
exit 0
fi
}
assert_canary
create_release "canary"

View File

@@ -1,3 +0,0 @@
Index:
Background: Skip # Disable slow background indexing of these files.

View File

@@ -0,0 +1,65 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
// https://github.com/microsoft/vscode-dev-containers/tree/v0.209.6/containers/docker-existing-dockerfile
{
"name": "bun (Ubuntu)",
// Sets the run context to one level up instead of the .devcontainer folder.
"context": "..",
// Update the 'dockerFile' property if you aren't using the standard 'Dockerfile' filename.
"dockerFile": "../Dockerfile",
// Set *default* container specific settings.json values on container create.
"settings": {
"terminal.integrated.shell.linux": "/bin/zsh",
"zigLanguageClient.path": "/home/ubuntu/zls/zig-out/bin/zls",
"zig.zigPath": "/build/zig/zig",
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
// Add the IDs of extensions you want installed when the container is created.
"extensions": [
"AugusteRame.zls-vscode",
"ms-vscode.cpptools",
"/home/ubuntu/vscode-zig.vsix",
"vadimcn.vscode-lldb",
"esbenp.prettier-vscode",
"xaver.clang-format"
],
"postCreateCommand": "cd /build/bun; bash /build/getting-started.sh; zsh",
"build": {
"target": "bun.devcontainer",
"cacheFrom": ["bun.devcontainer:latest"],
"args": {}
},
"runArgs": [
"--ulimit",
"memlock=-1:-1",
"--ulimit",
"nofile=65536:65536",
"--cap-add=SYS_PTRACE",
"--security-opt",
"seccomp=unconfined"
],
"workspaceMount": "source=bun,target=/build/bun,type=volume",
"workspaceFolder": "/build/bun",
"mounts": [
"source=bun-install,target=/home/ubuntu/.bun,type=volume",
"source=bun-config,target=/home/ubuntu/.config,type=volume"
],
// Use 'forwardPorts' to make a list of ports inside the container available locally.
"forwardPorts": [3000, 8081, 8080]
// Uncomment the next line to run commands after the container is created - for example installing curl.
// "postCreateCommand": "apt-get update && apt-get install -y curl",
// Uncomment when using a ptrace-based debugger like C++, Go, and Rust
// Uncomment to use the Docker CLI from inside the container. See https://aka.ms/vscode-remote/samples/docker-from-docker.
// "mounts": [ "source=/var/run/docker.sock,target=/var/run/docker.sock,type=bind" ],
// Uncomment to connect as a non-root user if you've added one. See https://aka.ms/vscode-remote/containers/non-root.
// "remoteUser": "vscode"
}

.devcontainer/limits.conf (new file, 61 lines)
View File

@@ -0,0 +1,61 @@
# /etc/security/limits.conf
#
#Each line describes a limit for a user in the form:
#
#<domain> <type> <item> <value>
#
#Where:
#<domain> can be:
# - a user name
# - a group name, with @group syntax
# - the wildcard *, for default entry
# - the wildcard %, can be also used with %group syntax,
# for maxlogin limit
# - NOTE: group and wildcard limits are not applied to root.
# To apply a limit to the root user, <domain> must be
# the literal username root.
#
#<type> can have the two values:
# - "soft" for enforcing the soft limits
# - "hard" for enforcing hard limits
#
#<item> can be one of the following:
# - core - limits the core file size (KB)
# - data - max data size (KB)
# - fsize - maximum filesize (KB)
# - memlock - max locked-in-memory address space (KB)
# - nofile - max number of open file descriptors
# - rss - max resident set size (KB)
# - stack - max stack size (KB)
# - cpu - max CPU time (MIN)
# - nproc - max number of processes
# - as - address space limit (KB)
# - maxlogins - max number of logins for this user
# - maxsyslogins - max number of logins on the system
# - priority - the priority to run user process with
# - locks - max number of file locks the user can hold
# - sigpending - max number of pending signals
# - msgqueue - max memory used by POSIX message queues (bytes)
# - nice - max nice priority allowed to raise to values: [-20, 19]
# - rtprio - max realtime priority
# - chroot - change root to directory (Debian-specific)
#
#<domain> <type> <item> <value>
#
* soft memlock 33554432
* hard memlock 33554432
* soft nofile 33554432
* hard nofile 33554432
#* soft core 0
#root hard core 100000
#* hard rss 10000
#@student hard nproc 20
#@faculty soft nproc 20
#@faculty hard nproc 50
#ftp hard nproc 0
#ftp - chroot /ftp
#@student - maxlogins 4
# End of file

View File

@@ -0,0 +1,445 @@
#!/usr/bin/env bash
#-------------------------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------
#
# Docs: https://github.com/microsoft/vscode-dev-containers/blob/main/script-library/docs/common.md
# Maintainer: The VS Code and Codespaces Teams
#
# Syntax: ./common-debian.sh [install zsh flag] [username] [user UID] [user GID] [upgrade packages flag] [install Oh My Zsh! flag] [Add non-free packages]
set -e
INSTALL_ZSH=${1:-"true"}
USERNAME=${2:-"automatic"}
USER_UID=${3:-"automatic"}
USER_GID=${4:-"automatic"}
UPGRADE_PACKAGES=${5:-"true"}
INSTALL_OH_MYS=${6:-"true"}
ADD_NON_FREE_PACKAGES=${7:-"false"}
SCRIPT_DIR="$(cd $(dirname "${BASH_SOURCE[0]}") && pwd)"
MARKER_FILE="/usr/local/etc/vscode-dev-containers/common"
if [ "$(id -u)" -ne 0 ]; then
echo -e 'Script must be run as root. Use sudo, su, or add "USER root" to your Dockerfile before running this script.'
exit 1
fi
# Ensure that login shells get the correct path if the user updated the PATH using ENV.
rm -f /etc/profile.d/00-restore-env.sh
echo "export PATH=${PATH//$(sh -lc 'echo $PATH')/\$PATH}" >/etc/profile.d/00-restore-env.sh
chmod +x /etc/profile.d/00-restore-env.sh
# If in automatic mode, determine if a user already exists, if not use vscode
if [ "${USERNAME}" = "auto" ] || [ "${USERNAME}" = "automatic" ]; then
USERNAME=""
POSSIBLE_USERS=("vscode" "node" "codespace" "$(awk -v val=1000 -F ":" '$3==val{print $1}' /etc/passwd)")
for CURRENT_USER in ${POSSIBLE_USERS[@]}; do
if id -u ${CURRENT_USER} >/dev/null 2>&1; then
USERNAME=${CURRENT_USER}
break
fi
done
if [ "${USERNAME}" = "" ]; then
USERNAME=vscode
fi
elif [ "${USERNAME}" = "none" ]; then
USERNAME=root
USER_UID=0
USER_GID=0
fi
# Load markers to see which steps have already run
if [ -f "${MARKER_FILE}" ]; then
echo "Marker file found:"
cat "${MARKER_FILE}"
source "${MARKER_FILE}"
fi
# Ensure apt is in non-interactive to avoid prompts
export DEBIAN_FRONTEND=noninteractive
# Function to call apt-get if needed
apt_get_update_if_needed() {
if [ ! -d "/var/lib/apt/lists" ] || [ "$(ls /var/lib/apt/lists/ | wc -l)" = "0" ]; then
echo "Running apt-get update..."
apt-get update
else
echo "Skipping apt-get update."
fi
}
# Run install apt-utils to avoid debconf warning then verify presence of other common developer tools and dependencies
if [ "${PACKAGES_ALREADY_INSTALLED}" != "true" ]; then
package_list="apt-utils \
openssh-client \
gnupg2 \
dirmngr \
iproute2 \
procps \
lsof \
htop \
net-tools \
psmisc \
curl \
wget \
rsync \
ca-certificates \
unzip \
zip \
nano \
vim-tiny \
less \
jq \
lsb-release \
apt-transport-https \
dialog \
libc6 \
libgcc1 \
libkrb5-3 \
libgssapi-krb5-2 \
libicu[0-9][0-9] \
liblttng-ust0 \
libstdc++6 \
zlib1g \
locales \
sudo \
ncdu \
man-db \
strace \
manpages \
manpages-dev \
init-system-helpers"
# Needed for adding manpages-posix and manpages-posix-dev which are non-free packages in Debian
if [ "${ADD_NON_FREE_PACKAGES}" = "true" ]; then
# Bring in variables from /etc/os-release like VERSION_CODENAME
. /etc/os-release
sed -i -E "s/deb http:\/\/(deb|httpredir)\.debian\.org\/debian ${VERSION_CODENAME} main/deb http:\/\/\1\.debian\.org\/debian ${VERSION_CODENAME} main contrib non-free/" /etc/apt/sources.list
sed -i -E "s/deb-src http:\/\/(deb|httredir)\.debian\.org\/debian ${VERSION_CODENAME} main/deb http:\/\/\1\.debian\.org\/debian ${VERSION_CODENAME} main contrib non-free/" /etc/apt/sources.list
sed -i -E "s/deb http:\/\/(deb|httpredir)\.debian\.org\/debian ${VERSION_CODENAME}-updates main/deb http:\/\/\1\.debian\.org\/debian ${VERSION_CODENAME}-updates main contrib non-free/" /etc/apt/sources.list
sed -i -E "s/deb-src http:\/\/(deb|httpredir)\.debian\.org\/debian ${VERSION_CODENAME}-updates main/deb http:\/\/\1\.debian\.org\/debian ${VERSION_CODENAME}-updates main contrib non-free/" /etc/apt/sources.list
sed -i "s/deb http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}\/updates main/deb http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}\/updates main contrib non-free/" /etc/apt/sources.list
sed -i "s/deb-src http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}\/updates main/deb http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}\/updates main contrib non-free/" /etc/apt/sources.list
sed -i "s/deb http:\/\/deb\.debian\.org\/debian ${VERSION_CODENAME}-backports main/deb http:\/\/deb\.debian\.org\/debian ${VERSION_CODENAME}-backports main contrib non-free/" /etc/apt/sources.list
sed -i "s/deb-src http:\/\/deb\.debian\.org\/debian ${VERSION_CODENAME}-backports main/deb http:\/\/deb\.debian\.org\/debian ${VERSION_CODENAME}-backports main contrib non-free/" /etc/apt/sources.list
# Handle bullseye location for security https://www.debian.org/releases/bullseye/amd64/release-notes/ch-information.en.html
sed -i "s/deb http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}-security main/deb http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}-security main contrib non-free/" /etc/apt/sources.list
sed -i "s/deb-src http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}-security main/deb http:\/\/security\.debian\.org\/debian-security ${VERSION_CODENAME}-security main contrib non-free/" /etc/apt/sources.list
echo "Running apt-get update..."
apt-get update
package_list="${package_list} manpages-posix manpages-posix-dev"
else
apt_get_update_if_needed
fi
# Install libssl1.1 if available
if [[ ! -z $(apt-cache --names-only search ^libssl1.1$) ]]; then
package_list="${package_list} libssl1.1"
fi
# Install appropriate version of libssl1.0.x if available
libssl_package=$(dpkg-query -f '${db:Status-Abbrev}\t${binary:Package}\n' -W 'libssl1\.0\.?' 2>&1 || echo '')
if [ "$(echo "$LIlibssl_packageBSSL" | grep -o 'libssl1\.0\.[0-9]:' | uniq | sort | wc -l)" -eq 0 ]; then
if [[ ! -z $(apt-cache --names-only search ^libssl1.0.2$) ]]; then
# Debian 9
package_list="${package_list} libssl1.0.2"
elif [[ ! -z $(apt-cache --names-only search ^libssl1.0.0$) ]]; then
# Ubuntu 18.04, 16.04, earlier
package_list="${package_list} libssl1.0.0"
fi
fi
echo "Packages to verify are installed: ${package_list}"
apt-get -y install --no-install-recommends ${package_list} 2> >(grep -v 'debconf: delaying package configuration, since apt-utils is not installed' >&2)
# Install git if not already installed (may be more recent than distro version)
if ! type git >/dev/null 2>&1; then
apt-get -y install --no-install-recommends git
fi
PACKAGES_ALREADY_INSTALLED="true"
fi
# Get to latest versions of all packages
if [ "${UPGRADE_PACKAGES}" = "true" ]; then
apt_get_update_if_needed
apt-get -y upgrade --no-install-recommends
apt-get autoremove -y
fi
# Ensure at least the en_US.UTF-8 UTF-8 locale is available.
# Common need for both applications and things like the agnoster ZSH theme.
if [ "${LOCALE_ALREADY_SET}" != "true" ] && ! grep -o -E '^\s*en_US.UTF-8\s+UTF-8' /etc/locale.gen >/dev/null; then
echo "en_US.UTF-8 UTF-8" >>/etc/locale.gen
locale-gen
LOCALE_ALREADY_SET="true"
fi
# Create or update a non-root user to match UID/GID.
group_name="${USERNAME}"
if id -u ${USERNAME} >/dev/null 2>&1; then
# User exists, update if needed
if [ "${USER_GID}" != "automatic" ] && [ "$USER_GID" != "$(id -g $USERNAME)" ]; then
group_name="$(id -gn $USERNAME)"
groupmod --gid $USER_GID ${group_name}
usermod --gid $USER_GID $USERNAME
fi
if [ "${USER_UID}" != "automatic" ] && [ "$USER_UID" != "$(id -u $USERNAME)" ]; then
usermod --uid $USER_UID $USERNAME
fi
else
# Create user
if [ "${USER_GID}" = "automatic" ]; then
groupadd $USERNAME
else
groupadd --gid $USER_GID $USERNAME
fi
if [ "${USER_UID}" = "automatic" ]; then
useradd -s /bin/bash --gid $USERNAME -m $USERNAME
else
useradd -s /bin/bash --uid $USER_UID --gid $USERNAME -m $USERNAME
fi
fi
# Add sudo support for non-root user
if [ "${USERNAME}" != "root" ] && [ "${EXISTING_NON_ROOT_USER}" != "${USERNAME}" ]; then
echo $USERNAME ALL=\(root\) NOPASSWD:ALL >/etc/sudoers.d/$USERNAME
chmod 0440 /etc/sudoers.d/$USERNAME
EXISTING_NON_ROOT_USER="${USERNAME}"
fi
# ** Shell customization section **
if [ "${USERNAME}" = "root" ]; then
user_rc_path="/root"
else
user_rc_path="/home/${USERNAME}"
fi
# Restore user .bashrc defaults from skeleton file if it doesn't exist or is empty
if [ ! -f "${user_rc_path}/.bashrc" ] || [ ! -s "${user_rc_path}/.bashrc" ]; then
cp /etc/skel/.bashrc "${user_rc_path}/.bashrc"
fi
# Restore user .profile defaults from skeleton file if it doesn't exist or is empty
if [ ! -f "${user_rc_path}/.profile" ] || [ ! -s "${user_rc_path}/.profile" ]; then
cp /etc/skel/.profile "${user_rc_path}/.profile"
fi
# .bashrc/.zshrc snippet
rc_snippet="$(
cat <<'EOF'
if [ -z "${USER}" ]; then export USER=$(whoami); fi
if [[ "${PATH}" != *"$HOME/.local/bin"* ]]; then export PATH="${PATH}:$HOME/.local/bin"; fi
# Display optional first run image specific notice if configured and terminal is interactive
if [ -t 1 ] && [[ "${TERM_PROGRAM}" = "vscode" || "${TERM_PROGRAM}" = "codespaces" ]] && [ ! -f "$HOME/.config/vscode-dev-containers/first-run-notice-already-displayed" ]; then
if [ -f "/usr/local/etc/vscode-dev-containers/first-run-notice.txt" ]; then
cat "/usr/local/etc/vscode-dev-containers/first-run-notice.txt"
elif [ -f "/workspaces/.codespaces/shared/first-run-notice.txt" ]; then
cat "/workspaces/.codespaces/shared/first-run-notice.txt"
fi
mkdir -p "$HOME/.config/vscode-dev-containers"
# Mark first run notice as displayed after 10s to avoid problems with fast terminal refreshes hiding it
((sleep 10s; touch "$HOME/.config/vscode-dev-containers/first-run-notice-already-displayed") &)
fi
# Set the default git editor if not already set
if [ -z "$(git config --get core.editor)" ] && [ -z "${GIT_EDITOR}" ]; then
if [ "${TERM_PROGRAM}" = "vscode" ]; then
if [[ -n $(command -v code-insiders) && -z $(command -v code) ]]; then
export GIT_EDITOR="code-insiders --wait"
else
export GIT_EDITOR="code --wait"
fi
fi
fi
EOF
)"
# code shim, it fallbacks to code-insiders if code is not available
cat <<'EOF' >/usr/local/bin/code
#!/bin/sh
get_in_path_except_current() {
which -a "$1" | grep -A1 "$0" | grep -v "$0"
}
code="$(get_in_path_except_current code)"
if [ -n "$code" ]; then
exec "$code" "$@"
elif [ "$(command -v code-insiders)" ]; then
exec code-insiders "$@"
else
echo "code or code-insiders is not installed" >&2
exit 127
fi
EOF
chmod +x /usr/local/bin/code
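# Illustrative behavior of the shim above (not part of the upstream script): with
# only code-insiders installed, running `code my-file.ts` execs
# `code-insiders my-file.ts`; `get_in_path_except_current` filters out
# /usr/local/bin/code itself, so the shim never invokes itself recursively.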
# systemctl shim - tells people to use 'service' if systemd is not running
cat <<'EOF' >/usr/local/bin/systemctl
#!/bin/sh
set -e
if [ -d "/run/systemd/system" ]; then
exec /bin/systemctl "$@"
else
echo '\n"systemd" is not running in this container due to its overhead.\nUse the "service" command to start services intead. e.g.: \n\nservice --status-all'
fi
EOF
chmod +x /usr/local/bin/systemctl
# Codespaces bash and OMZ themes - partly inspired by https://github.com/ohmyzsh/ohmyzsh/blob/master/themes/robbyrussell.zsh-theme
codespaces_bash="$(
cat \
<<'EOF'
# Codespaces bash prompt theme
__bash_prompt() {
local userpart='`export XIT=$? \
&& [ ! -z "${GITHUB_USER}" ] && echo -n "\[\033[0;32m\]@${GITHUB_USER} " || echo -n "\[\033[0;32m\]\u " \
&& [ "$XIT" -ne "0" ] && echo -n "\[\033[1;31m\]➜" || echo -n "\[\033[0m\]➜"`'
local gitbranch='`\
if [ "$(git config --get codespaces-theme.hide-status 2>/dev/null)" != 1 ]; then \
export BRANCH=$(git symbolic-ref --short HEAD 2>/dev/null || git rev-parse --short HEAD 2>/dev/null); \
if [ "${BRANCH}" != "" ]; then \
echo -n "\[\033[0;36m\](\[\033[1;31m\]${BRANCH}" \
&& if git ls-files --error-unmatch -m --directory --no-empty-directory -o --exclude-standard ":/*" > /dev/null 2>&1; then \
echo -n " \[\033[1;33m\]✗"; \
fi \
&& echo -n "\[\033[0;36m\]) "; \
fi; \
fi`'
local lightblue='\[\033[1;34m\]'
local removecolor='\[\033[0m\]'
PS1="${userpart} ${lightblue}\w ${gitbranch}${removecolor}\$ "
unset -f __bash_prompt
}
__bash_prompt
EOF
)"
codespaces_zsh="$(
cat \
<<'EOF'
# Codespaces zsh prompt theme
__zsh_prompt() {
local prompt_username
if [ ! -z "${GITHUB_USER}" ]; then
prompt_username="@${GITHUB_USER}"
else
prompt_username="%n"
fi
PROMPT="%{$fg[green]%}${prompt_username} %(?:%{$reset_color%}➜ :%{$fg_bold[red]%}➜ )" # User/exit code arrow
PROMPT+='%{$fg_bold[blue]%}%(5~|%-1~/…/%3~|%4~)%{$reset_color%} ' # cwd
PROMPT+='$([ "$(git config --get codespaces-theme.hide-status 2>/dev/null)" != 1 ] && git_prompt_info)' # Git status
PROMPT+='%{$fg[white]%}$ %{$reset_color%}'
unset -f __zsh_prompt
}
ZSH_THEME_GIT_PROMPT_PREFIX="%{$fg_bold[cyan]%}(%{$fg_bold[red]%}"
ZSH_THEME_GIT_PROMPT_SUFFIX="%{$reset_color%} "
ZSH_THEME_GIT_PROMPT_DIRTY=" %{$fg_bold[yellow]%}✗%{$fg_bold[cyan]%})"
ZSH_THEME_GIT_PROMPT_CLEAN="%{$fg_bold[cyan]%})"
__zsh_prompt
EOF
)"
# Add RC snippet and custom bash prompt
if [ "${RC_SNIPPET_ALREADY_ADDED}" != "true" ]; then
echo "${rc_snippet}" >>/etc/bash.bashrc
echo "${codespaces_bash}" >>"${user_rc_path}/.bashrc"
echo 'export PROMPT_DIRTRIM=4' >>"${user_rc_path}/.bashrc"
if [ "${USERNAME}" != "root" ]; then
echo "${codespaces_bash}" >>"/root/.bashrc"
echo 'export PROMPT_DIRTRIM=4' >>"/root/.bashrc"
fi
chown ${USERNAME}:${group_name} "${user_rc_path}/.bashrc"
RC_SNIPPET_ALREADY_ADDED="true"
fi
# Optionally install and configure zsh and Oh My Zsh!
if [ "${INSTALL_ZSH}" = "true" ]; then
if ! type zsh >/dev/null 2>&1; then
apt_get_update_if_needed
apt-get install -y zsh
fi
if [ "${ZSH_ALREADY_INSTALLED}" != "true" ]; then
echo "${rc_snippet}" >>/etc/zsh/zshrc
ZSH_ALREADY_INSTALLED="true"
fi
# Adapted, simplified inline Oh My Zsh! install steps that add and default to a codespaces theme.
# See https://github.com/ohmyzsh/ohmyzsh/blob/master/tools/install.sh for official script.
oh_my_install_dir="${user_rc_path}/.oh-my-zsh"
if [ ! -d "${oh_my_install_dir}" ] && [ "${INSTALL_OH_MYS}" = "true" ]; then
template_path="${oh_my_install_dir}/templates/zshrc.zsh-template"
user_rc_file="${user_rc_path}/.zshrc"
umask g-w,o-w
mkdir -p ${oh_my_install_dir}
git clone --depth=1 \
-c core.eol=lf \
-c core.autocrlf=false \
-c fsck.zeroPaddedFilemode=ignore \
-c fetch.fsck.zeroPaddedFilemode=ignore \
-c receive.fsck.zeroPaddedFilemode=ignore \
"https://github.com/ohmyzsh/ohmyzsh" "${oh_my_install_dir}" 2>&1
echo -e "$(cat "${template_path}")\nDISABLE_AUTO_UPDATE=true\nDISABLE_UPDATE_PROMPT=true" >${user_rc_file}
sed -i -e 's/ZSH_THEME=.*/ZSH_THEME="codespaces"/g' ${user_rc_file}
mkdir -p ${oh_my_install_dir}/custom/themes
echo "${codespaces_zsh}" >"${oh_my_install_dir}/custom/themes/codespaces.zsh-theme"
# Shrink git while still enabling updates
cd "${oh_my_install_dir}"
git repack -a -d -f --depth=1 --window=1
# Copy to non-root user if one is specified
if [ "${USERNAME}" != "root" ]; then
cp -rf "${user_rc_file}" "${oh_my_install_dir}" /root
chown -R ${USERNAME}:${group_name} "${user_rc_path}"
fi
fi
fi
# Persist image metadata info and install an info script if meta.env is found in the same directory
meta_info_script="$(
cat <<'EOF'
#!/bin/sh
. /usr/local/etc/vscode-dev-containers/meta.env
# Minimal output
if [ "$1" = "version" ] || [ "$1" = "image-version" ]; then
echo "${VERSION}"
exit 0
elif [ "$1" = "release" ]; then
echo "${GIT_REPOSITORY_RELEASE}"
exit 0
elif [ "$1" = "content" ] || [ "$1" = "content-url" ] || [ "$1" = "contents" ] || [ "$1" = "contents-url" ]; then
echo "${CONTENTS_URL}"
exit 0
fi
# Full output
echo
echo "Development container image information"
echo
if [ ! -z "${VERSION}" ]; then echo "- Image version: ${VERSION}"; fi
if [ ! -z "${DEFINITION_ID}" ]; then echo "- Definition ID: ${DEFINITION_ID}"; fi
if [ ! -z "${VARIANT}" ]; then echo "- Variant: ${VARIANT}"; fi
if [ ! -z "${GIT_REPOSITORY}" ]; then echo "- Source code repository: ${GIT_REPOSITORY}"; fi
if [ ! -z "${GIT_REPOSITORY_RELEASE}" ]; then echo "- Source code release/branch: ${GIT_REPOSITORY_RELEASE}"; fi
if [ ! -z "${BUILD_TIMESTAMP}" ]; then echo "- Timestamp: ${BUILD_TIMESTAMP}"; fi
if [ ! -z "${CONTENTS_URL}" ]; then echo && echo "More info: ${CONTENTS_URL}"; fi
echo
EOF
)"
if [ -f "${SCRIPT_DIR}/meta.env" ]; then
mkdir -p /usr/local/etc/vscode-dev-containers/
cp -f "${SCRIPT_DIR}/meta.env" /usr/local/etc/vscode-dev-containers/meta.env
echo "${meta_info_script}" >/usr/local/bin/devcontainer-info
chmod +x /usr/local/bin/devcontainer-info
fi
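# Hedged usage sketch (assumes meta.env was baked into the image at build time):
#   devcontainer-info version   # prints only ${VERSION}
#   devcontainer-info release   # prints only ${GIT_REPOSITORY_RELEASE}
#   devcontainer-info           # prints the full metadata block above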
# Write marker file
mkdir -p "$(dirname "${MARKER_FILE}")"
echo -e "\
PACKAGES_ALREADY_INSTALLED=${PACKAGES_ALREADY_INSTALLED}\n\
LOCALE_ALREADY_SET=${LOCALE_ALREADY_SET}\n\
EXISTING_NON_ROOT_USER=${EXISTING_NON_ROOT_USER}\n\
RC_SNIPPET_ALREADY_ADDED=${RC_SNIPPET_ALREADY_ADDED}\n\
ZSH_ALREADY_INSTALLED=${ZSH_ALREADY_INSTALLED}" >"${MARKER_FILE}"
echo "Done!"

View File

@@ -0,0 +1,16 @@
#!/bin/bash
echo "To get started, login to GitHub and clone bun's GitHub repo into /workspaces/bun"
echo "Make sure to login with a Personal Access Token"
echo "# First time setup"
echo "gh auth login"
echo "gh repo clone Jarred-Sumner/bun . -- --depth=1 --progress -j8"
echo ""
echo "# Compile bun dependencies (zig is already compiled)"
echo "make devcontainer"
echo ""
echo "# Build bun for development"
echo "make dev"
echo ""
echo "# Run bun"
echo "bun-debug"

View File

@@ -0,0 +1,185 @@
#!/usr/bin/env bash
#-------------------------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------
#
# Docs: https://github.com/microsoft/vscode-dev-containers/blob/main/script-library/docs/github.md
# Maintainer: The VS Code and Codespaces Teams
#
# Syntax: ./github-debian.sh [version]
CLI_VERSION=${1:-"latest"}
GITHUB_CLI_ARCHIVE_GPG_KEY=C99B11DEB97541F0
GPG_KEY_SERVERS="keyserver hkp://keyserver.ubuntu.com:80
keyserver hkps://keys.openpgp.org
keyserver hkp://keyserver.pgp.com"
set -e
if [ "$(id -u)" -ne 0 ]; then
echo -e 'Script must be run as root. Use sudo, su, or add "USER root" to your Dockerfile before running this script.'
exit 1
fi
# Get central common setting
get_common_setting() {
if [ "${common_settings_file_loaded}" != "true" ]; then
curl -sfL "https://aka.ms/vscode-dev-containers/script-library/settings.env" -o /tmp/vsdc-settings.env 2>/dev/null || echo "Could not download settings file. Skipping."
common_settings_file_loaded=true
fi
if [ -f "/tmp/vsdc-settings.env" ]; then
local multi_line=""
if [ "$2" = "true" ]; then multi_line="-z"; fi
local result="$(grep ${multi_line} -oP "$1=\"?\K[^\"]+" /tmp/vsdc-settings.env | tr -d '\0')"
if [ ! -z "${result}" ]; then declare -g $1="${result}"; fi
fi
echo "$1=${!1}"
}
# Import the GPG key(s) stored in the variable whose name is passed in as $1
receive_gpg_keys() {
get_common_setting $1
local keys=${!1}
get_common_setting GPG_KEY_SERVERS true
# Use a temporary location for gpg keys to avoid polluting image
export GNUPGHOME="/tmp/tmp-gnupg"
mkdir -p ${GNUPGHOME}
chmod 700 ${GNUPGHOME}
echo -e "disable-ipv6\n${GPG_KEY_SERVERS}" >${GNUPGHOME}/dirmngr.conf
# GPG key download sometimes fails for some reason and retrying fixes it.
local retry_count=0
local gpg_ok="false"
set +e
until [ "${gpg_ok}" = "true" ] || [ "${retry_count}" -eq "5" ]; do
echo "(*) Downloading GPG key..."
(echo "${keys}" | xargs -n 1 gpg --recv-keys) 2>&1 && gpg_ok="true"
if [ "${gpg_ok}" != "true" ]; then
echo "(*) Failed getting key, retring in 10s..."
((retry_count++))
sleep 10s
fi
done
set -e
if [ "${gpg_ok}" = "false" ]; then
echo "(!) Failed to get gpg key."
exit 1
fi
}
# Figure out the correct version if a three part version number is not passed
find_version_from_git_tags() {
local variable_name=$1
local requested_version=${!variable_name}
if [ "${requested_version}" = "none" ]; then return; fi
local repository=$2
local prefix=${3:-"tags/v"}
local separator=${4:-"."}
local last_part_optional=${5:-"false"}
if [ "$(echo "${requested_version}" | grep -o "." | wc -l)" != "2" ]; then
local escaped_separator=${separator//./\\.}
local last_part
if [ "${last_part_optional}" = "true" ]; then
last_part="(${escaped_separator}[0-9]+)?"
else
last_part="${escaped_separator}[0-9]+"
fi
local regex="${prefix}\\K[0-9]+${escaped_separator}[0-9]+${last_part}$"
local version_list="$(git ls-remote --tags ${repository} | grep -oP "${regex}" | tr -d ' ' | tr "${separator}" "." | sort -rV)"
if [ "${requested_version}" = "latest" ] || [ "${requested_version}" = "current" ] || [ "${requested_version}" = "lts" ]; then
declare -g ${variable_name}="$(echo "${version_list}" | head -n 1)"
else
set +e
declare -g ${variable_name}="$(echo "${version_list}" | grep -E -m 1 "^${requested_version//./\\.}([\\.\\s]|$)")"
set -e
fi
fi
if [ -z "${!variable_name}" ] || ! echo "${version_list}" | grep "^${!variable_name//./\\.}$" >/dev/null 2>&1; then
echo -e "Invalid ${variable_name} value: ${requested_version}\nValid values:\n${version_list}" >&2
exit 1
fi
echo "${variable_name}=${!variable_name}"
}
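# Illustrative example (not part of the upstream script): with CLI_VERSION="2.32",
#   find_version_from_git_tags CLI_VERSION "https://github.com/cli/cli"
# lists remote tags matching "tags/v<major>.<minor>.<patch>", sorts them with
# `sort -rV`, and resolves CLI_VERSION to the newest 2.32.x release; "latest",
# "current", and "lts" resolve to the first entry of the sorted list.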
# Import the GPG key(s) stored in the variable whose name is passed in as $1
receive_gpg_keys() {
get_common_setting $1
local keys=${!1}
get_common_setting GPG_KEY_SERVERS true
local keyring_args=""
if [ ! -z "$2" ]; then
keyring_args="--no-default-keyring --keyring $2"
fi
# Use a temporary location for gpg keys to avoid polluting image
export GNUPGHOME="/tmp/tmp-gnupg"
mkdir -p ${GNUPGHOME}
chmod 700 ${GNUPGHOME}
echo -e "disable-ipv6\n${GPG_KEY_SERVERS}" >${GNUPGHOME}/dirmngr.conf
# GPG key download sometimes fails for some reason and retrying fixes it.
local retry_count=0
local gpg_ok="false"
set +e
until [ "${gpg_ok}" = "true" ] || [ "${retry_count}" -eq "5" ]; do
echo "(*) Downloading GPG key..."
(echo "${keys}" | xargs -n 1 gpg -q ${keyring_args} --recv-keys) 2>&1 && gpg_ok="true"
if [ "${gpg_ok}" != "true" ]; then
echo "(*) Failed getting key, retring in 10s..."
((retry_count++))
sleep 10s
fi
done
set -e
if [ "${gpg_ok}" = "false" ]; then
echo "(!) Failed to get gpg key."
exit 1
fi
}
# Function to run apt-get if needed
apt_get_update_if_needed() {
if [ ! -d "/var/lib/apt/lists" ] || [ "$(ls /var/lib/apt/lists/ | wc -l)" = "0" ]; then
echo "Running apt-get update..."
apt-get update
else
echo "Skipping apt-get update."
fi
}
# Checks if packages are installed and installs them if not
check_packages() {
if ! dpkg -s "$@" >/dev/null 2>&1; then
apt_get_update_if_needed
apt-get -y install --no-install-recommends "$@"
fi
}
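# Illustrative usage (not part of the upstream script):
#   check_packages curl git   # runs apt-get install only if any listed package is missing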
export DEBIAN_FRONTEND=noninteractive
# Install curl, ca-certificates, apt-transport-https, dirmngr, gnupg2, and git if missing
check_packages curl ca-certificates apt-transport-https dirmngr gnupg2
if ! type git >/dev/null 2>&1; then
apt_get_update_if_needed
apt-get -y install --no-install-recommends git
fi
# Soft version matching
if [ "${CLI_VERSION}" != "latest" ] && [ "${CLI_VERSION}" != "lts" ] && [ "${CLI_VERSION}" != "stable" ]; then
find_version_from_git_tags CLI_VERSION "https://github.com/cli/cli"
version_suffix="=${CLI_VERSION}"
else
version_suffix=""
fi
# Install the GitHub CLI
echo "Downloading github CLI..."
# Import key safely (new method rather than deprecated apt-key approach) and install
. /etc/os-release
receive_gpg_keys GITHUB_CLI_ARCHIVE_GPG_KEY /usr/share/keyrings/githubcli-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages ${VERSION_CODENAME} main" >/etc/apt/sources.list.d/github-cli.list
apt-get update
apt-get -y install "gh${version_suffix}"
rm -rf "/tmp/gh/gnupg"
echo "Done!"

View File

@@ -0,0 +1,7 @@
#!/bin/bash
chsh -s $(which zsh)
sh -c "$(curl -fsSL https://starship.rs/install.sh) -- --platform linux_musl"
echo "eval \"$(starship init zsh)\"" >>~/.zshrc
curl -L https://github.com/Jarred-Sumner/vscode-zig/releases/download/fork-v1/zig-0.2.5.vsix >/home/ubuntu/vscode-zig.vsix

View File

@@ -0,0 +1,7 @@
#!/bin/bash
curl -L https://github.com/Jarred-Sumner/vscode-zig/releases/download/fork-v1/zig-0.2.5.vsix >/home/ubuntu/vscode-zig.vsix
git clone https://github.com/zigtools/zls /home/ubuntu/zls
cd /home/ubuntu/zls
git submodule update --init --recursive --progress --depth=1
zig build -Drelease-fast

View File

@@ -0,0 +1,9 @@
{
"folders": [
{
// Source code
"name": "bun",
"path": "bun"
},
]
}

9
.devcontainer/zls.json Normal file
View File

@@ -0,0 +1,9 @@
{
"zig_exe_path": "/build/zig/zig",
"enable_snippets": true,
"warn_style": false,
"enable_semantic_tokens": true,
"operator_completions": true,
"include_at_in_builtins": false,
"max_detail_length": 1048576
}

View File

@@ -0,0 +1,15 @@
#!/bin/bash
set -euxo pipefail
export DOCKER_BUILDKIT=1
docker login --username bunbunbunbun
docker build -f Dockerfile.base -t bunbunbunbun/bun-test-base --target bun-test-base . --platform=linux/$BUILDARCH --build-arg BUILDARCH=$BUILDARCH
docker build -f Dockerfile.base -t bunbunbunbun/bun-base-with-zig-and-webkit --target bun-base-with-zig-and-webkit . --platform=linux/$BUILDARCH --build-arg BUILDARCH=$BUILDARCH
docker build -f Dockerfile.base -t bunbunbunbun/bun-base --target bun-base --platform=linux/$BUILDARCH . --build-arg BUILDARCH=$BUILDARCH
docker push bunbunbunbun/bun-test-base:latest
docker push bunbunbunbun/bun-base-with-zig-and-webkit:latest
docker push bunbunbunbun/bun-base:latest

24
.docker/build-base.sh Normal file
View File

@@ -0,0 +1,24 @@
#!/bin/bash
set -euxo pipefail
export DOCKER_BUILDKIT=1
docker buildx build \
-t bunbunbunbun/bun-test-base:latest -f Dockerfile.base \
--target bun-test-base \
--platform=linux/$BUILDARCH --build-arg BUILDARCH=$BUILDARCH .
docker buildx build \
--target bun-base \
-f Dockerfile.base \
-t bunbunbunbun/bun-base:latest --platform=linux/$BUILDARCH \
--build-arg BUILDARCH=$BUILDARCH .
docker buildx build \
-t bunbunbunbun/bun-base-with-zig-and-webkit:latest \
-f Dockerfile.base \
--target bun-base-with-zig-and-webkit \
--platform=linux/$BUILDARCH --build-arg BUILDARCH=$BUILDARCH .
docker push bunbunbunbun/bun-test-base:latest
docker push bunbunbunbun/bun-base:latest
docker push bunbunbunbun/bun-base-with-zig-and-webkit:latest

View File

@@ -3,7 +3,7 @@
set -euxo pipefail
bun install
bun install --cwd ./test/snippets
bun install --cwd ./test/scripts
bun install --cwd ./integration/snippets
bun install --cwd ./integration/scripts
make $BUN_TEST_NAME

View File

@@ -1,15 +1,17 @@
**/*.a
**/*.o
**/.next
**/CMakeCache.txt
**/node_modules
.git
examples
node_modules
**/node_modules
src/javascript/jsc/WebKit/LayoutTests
zig-out
zig-build
**/*.o
**/*.a
examples
**/.next
.git
src/javascript/jsc/WebKit
**/CMakeCache.txt
packages/**/bun
packages/**/bun-profile
src/bun.js/WebKit
src/bun.js/WebKit/LayoutTests
zig-build
zig-cache
zig-out

View File

@@ -1,8 +0,0 @@
# https://EditorConfig.org
root = true
[*]
charset = utf-8
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf

48
.gitattributes vendored
View File

@@ -1,47 +1,7 @@
*.css text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.js text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.jsx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.tsx text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.ts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.c text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.cpp text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.cc text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.yml text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.zig text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.rs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.h text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.json text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.lock text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.map text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.md text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mjs text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.mts text eol=lf whitespace=blank-at-eol,-blank-at-eof,-space-before-tab,tab-in-indent,tabwidth=2
*.lockb binary diff=lockb
.vscode/launch.json linguist-generated
src/api/schema.d.ts linguist-generated
fixture.*.c linguist-generated
src/api/schema.js linguist-generated
*-fixture* linguist-generated
src/bun.js/bindings/ZigGeneratedCode.h linguist-generated
src/bun.js/bindings/ZigGeneratedCode.cpp linguist-generated
src/bun.js/bindings/headers.h linguist-generated
src/bun.js/bindings/headers.zig linguist-generated
packages/bun-uws/fuzzing/seed-corpus/**/* linguist-generated
src/bun.js/bindings/sqlite/sqlite3.c linguist-vendored
src/bun.js/bindings/sqlite/sqlite3_local.h linguist-vendored
src/bun.js/bindings/simdutf.cpp linguist-vendored
src/bun.js/bindings/simdutf.h linguist-vendored
docs/**/* linguist-documentation
# Don't count tests in the language stats - https://github.com/github-linguist/linguist/blob/master/docs/overrides.md
test/**/* linguist-documentation
bench/**/* linguist-documentation
examples/**/* linguist-documentation
src/deps/*.c linguist-vendored
src/deps/brotli/** linguist-vendored
src/javascript/jsc/bindings/sqlite/sqlite3.c linguist-vendored
src/javascript/jsc/bindings/sqlite/sqlite3_local.h linguist-vendored
*.lockb binary diff=lockb
*.zig text eol=lf

View File

@@ -1,47 +0,0 @@
name: 🐛 Bug Report
description: Report an issue that should be fixed
labels:
- bug
- needs triage
body:
- type: markdown
attributes:
value: |
Thank you for submitting a bug report. It helps make Bun better.
If you need help or support using Bun, and are not reporting a bug, please
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can ask questions in the [`#help`](https://discord.gg/32EtH6p7HN) forum.
Make sure you are running the [latest](https://bun.sh/docs/installation#upgrading) version of Bun.
The bug you are experiencing may already have been fixed.
Please try to include as much information as possible.
- type: input
attributes:
label: What version of Bun is running?
description: Copy the output of `bun --revision`
- type: input
attributes:
label: What platform is your computer?
description: |
For MacOS and Linux: copy the output of `uname -mprs`
For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
- type: textarea
attributes:
label: What steps can reproduce the bug?
description: Explain the bug and provide a code snippet that can reproduce it.
validations:
required: true
- type: textarea
attributes:
label: What is the expected behavior?
description: If possible, please provide text instead of a screenshot.
- type: textarea
attributes:
label: What do you see instead?
description: If possible, please provide text instead of a screenshot.
- type: textarea
attributes:
label: Additional information
description: Is there anything else you think we should know?

View File

@@ -1,45 +0,0 @@
name: 🇹 TypeScript Type Bug Report
description: Report an issue with TypeScript types
labels: [bug, typescript]
body:
- type: markdown
attributes:
value: |
Thank you for submitting a bug report. It helps make Bun better.
If you need help or support using Bun, and are not reporting a bug, please
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can ask questions in the [`#help`](https://discord.gg/32EtH6p7HN) forum.
Make sure you are running the [latest](https://bun.sh/docs/installation#upgrading) version of Bun.
The bug you are experiencing may already have been fixed.
Please try to include as much information as possible.
- type: input
attributes:
label: What version of Bun is running?
description: Copy the output of `bun --revision`
- type: input
attributes:
label: What platform is your computer?
description: |
For MacOS and Linux: copy the output of `uname -mprs`
For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
- type: textarea
attributes:
label: What steps can reproduce the bug?
description: Explain the bug and provide a code snippet that can reproduce it.
validations:
required: true
- type: textarea
attributes:
label: What is the expected behavior?
description: If possible, please provide text instead of a screenshot.
- type: textarea
attributes:
label: What do you see instead?
description: If possible, please provide text instead of a screenshot.
- type: textarea
attributes:
label: Additional information
description: Is there anything else you think we should know?

View File

@@ -1,24 +0,0 @@
name: 🚀 Feature Request
description: Suggest an idea, feature, or enhancement
labels: [enhancement]
body:
- type: markdown
attributes:
value: |
Thank you for submitting an idea. It helps make Bun better.
If you want to discuss Bun, or learn how others are using Bun, please
join our [Discord](https://discord.gg/CXdq2DP29u) server, where you can share in the [`#feedback`](https://discord.gg/unwUnHBNqy) channel.
- type: textarea
attributes:
label: What is the problem this feature would solve?
validations:
required: true
- type: textarea
attributes:
label: What is the feature you are proposing to solve the problem?
validations:
required: true
- type: textarea
attributes:
label: What alternatives have you considered?

View File

@@ -1,29 +0,0 @@
name: 📗 Documentation Issue
description: Tell us if there is missing or incorrect documentation
labels: [docs]
body:
- type: markdown
attributes:
value: |
Thank you for submitting a documentation request. It helps make Bun better.
We are working on moving documentation from the [README](https://github.com/oven-sh/bun#table-of-contents) to a documentation website. Please report as many issues or missing content requests as you can so we can incorporate them into the new documentation.
- type: dropdown
attributes:
label: What is the type of issue?
multiple: true
options:
- Documentation is missing
- Documentation is incorrect
- Documentation is confusing
- Example code is not working
- Something else
- type: textarea
attributes:
label: What is the issue?
validations:
required: true
- type: textarea
attributes:
label: Where did you find it?
description: If possible, please provide the URL(s) where you found this issue.

View File

@@ -1,27 +0,0 @@
name: Prefilled crash report
description: Report a crash in Bun
labels:
- crash
body:
- type: markdown
attributes:
value: |
**Thank you so much** for submitting a crash report. You're helping us make Bun more reliable for everyone!
- type: textarea
id: code
attributes:
label: How can we reproduce the crash?
description: Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository, [Replit](https://replit.com/@replit/Bun) or [CodeSandbox](https://codesandbox.io/templates/bun)
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output. This will be
automatically formatted into code, so no need for backticks.
render: shell
- type: textarea
id: remapped_trace
attributes:
label: Stack Trace (bun.report)
validations:
required: true

View File

@@ -1,27 +0,0 @@
name: bun install crash report
description: Report a crash in bun install
labels:
- npm
body:
- type: markdown
attributes:
value: |
**Thank you so much** for submitting a crash report. You're helping us make Bun more reliable for everyone!
- type: textarea
id: repro
attributes:
label: How can we reproduce the crash?
description: Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository, [Replit](https://replit.com/@replit/Bun) or [CodeSandbox](https://codesandbox.io/templates/bun)
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output. This will be
automatically formatted into code, so no need for backticks.
render: shell
- type: textarea
id: remapped_trace
attributes:
label: Stack Trace (bun.report)
validations:
required: true

View File

@@ -1,5 +0,0 @@
blank_issues_enabled: true
contact_links:
- name: 💬 Ask a Question
url: https://discord.com/invite/CXdq2DP29u
about: Join our Discord server for questions, support requests, or just to chat.

View File

@@ -1,43 +0,0 @@
name: Bump version
description: Bump the version of Bun
inputs:
version:
description: The most recent version of Bun.
required: true
type: string
token:
description: The GitHub token to use for creating a pull request.
required: true
type: string
default: ${{ github.token }}
runs:
using: composite
steps:
- name: Run Bump
shell: bash
id: bump
run: |
set -euo pipefail
MESSAGE=$(bun ./scripts/bump.ts patch --last-version=${{ inputs.version }})
LATEST=$(cat LATEST)
echo "version=$LATEST" >> $GITHUB_OUTPUT
echo "message=$MESSAGE" >> $GITHUB_OUTPUT
- name: Create Pull Request
uses: peter-evans/create-pull-request@v4
with:
add-paths: |
CMakeLists.txt
LATEST
token: ${{ inputs.token }}
commit-message: Bump version to ${{ steps.bump.outputs.version }}
title: Bump to ${{ steps.bump.outputs.version }}
delete-branch: true
branch: github-actions/bump-version-${{ steps.bump.outputs.version }}--${{ github.run_id }}
body: |
## What does this PR do?
${{ steps.bump.outputs.message }}
Auto-bumped by [this workflow](https://github.com/oven-sh/bun/actions/workflows/release.yml)

View File

@@ -1,50 +0,0 @@
name: Setup Bun
description: An internal version of the 'oven-sh/setup-bun' action.
inputs:
bun-version:
type: string
description: "The version of bun to install: 'latest', 'canary', 'bun-v1.0.0', etc."
default: latest
required: false
baseline:
type: boolean
description: "Whether to use the baseline version of bun."
default: false
required: false
download-url:
type: string
description: "The base URL to download bun from."
default: "https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases"
required: false
runs:
using: composite
steps:
- name: Setup Bun
shell: bash
run: |
case "$(uname -s)" in
Linux*) os=linux;;
Darwin*) os=darwin;;
*) os=windows;;
esac
case "$(uname -m)" in
arm64 | aarch64) arch=aarch64;;
*) arch=x64;;
esac
case "${{ inputs.baseline }}" in
true | 1) target="bun-${os}-${arch}-baseline";;
*) target="bun-${os}-${arch}";;
esac
case "${{ inputs.bun-version }}" in
latest) release="latest";;
canary) release="canary";;
*) release="bun-v${{ inputs.bun-version }}";;
esac
curl -LO "${{ inputs.download-url }}/${release}/${target}.zip"
unzip ${target}.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv ${target}/bun* ${{ runner.temp }}/.bun/bin/
chmod +x ${{ runner.temp }}/.bun/bin/*
echo "${{ runner.temp }}/.bun/bin" >> ${GITHUB_PATH}

View File

@@ -1,50 +0,0 @@
### What does this PR do?
<!-- **Please explain what your changes do**, example: -->
<!--
This adds a new flag --bail to bun test. When set, it will stop running tests after the first failure. This is useful for CI environments where you want to fail fast.
-->
- [ ] Documentation or TypeScript types (it's okay to leave the rest blank in this case)
- [ ] Code changes
### How did you verify your code works?
<!-- **For code changes, please include automated tests**. Feel free to uncomment the line below -->
<!-- I wrote automated tests -->
<!-- If JavaScript/TypeScript modules or builtins changed:
- [ ] I included a test for the new code, or existing tests cover it
- [ ] I ran my tests locally and they pass (`bun-debug test test-file-name.test`)
-->
<!-- If Zig files changed:
- [ ] I checked the lifetime of memory allocated to verify it's (1) freed and (2) only freed when it should be
- [ ] I included a test for the new code, or an existing test covers it
- [ ] JSValue used outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally (`bun-debug test test-file-name.test`)
-->
<!-- If new methods, getters, or setters were added to a publicly exposed class:
- [ ] I added TypeScript types for the new methods, getters, or setters
-->
<!-- If dependencies in tests changed:
- [ ] I made sure that specific versions of dependencies are used instead of ranged or tagged versions
-->
<!-- If a new builtin ESM/CJS module was added:
- [ ] I updated Aliases in `module_loader.zig` to include the new module
- [ ] I added a test that imports the module
- [ ] I added a test that require() the module
-->

View File

@@ -0,0 +1,41 @@
name: bun-framework-next
on:
push:
paths:
- packages/bun-framework-next/**/*
branches: [main, bun-framework-next-actions]
pull_request:
paths:
- packages/bun-framework-next/**/*
branches: [main]
jobs:
build:
name: lint, test and build on Node ${{ matrix.node }} and ${{ matrix.os }}
runs-on: ${{ matrix.os }}
strategy:
matrix:
node: ["14.x"]
os: [macOS-latest]
steps:
- name: Checkout repo
uses: actions/checkout@v2
- name: Use Node ${{ matrix.node }}
uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node }}
- name: Install PNPM
uses: pnpm/action-setup@v2.0.1
with:
version: 6.21.0
- name: Install dependencies
run: cd packages/bun-framework-next && pnpm install
- name: Type check bun-framework-next
run: cd packages/bun-framework-next && pnpm check

167
.github/workflows/bun.yml vendored Normal file
View File

@@ -0,0 +1,167 @@
name: bun
on:
push:
branches: [main, bun-actions]
paths-ignore:
- "examples/**"
- "bench/**"
- "README.*"
- "LICENSE"
- ".vscode"
- ".devcontainer"
pull_request:
branches: [main]
paths-ignore:
- "examples/**"
- "bench/**"
- README.*
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
TEST_TAG: bun-test
jobs:
e2e:
runs-on: self-hosted
name: "Integration tests"
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Checkout submodules
run: git -c submodule."src/javascript/jsc/WebKit".update=none submodule update --init --recursive --depth=1 --progress -j 8
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Dockerhub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_PASSWORD }}
- name: Pull Base Image
run: bash .docker/pull.sh
- name: Build tests
uses: docker/build-push-action@v2
with:
context: .
target: test_base
tags: bun-test:latest
load: true
cache-from: type=gha
cache-to: type=gha,mode=max
builder: ${{ steps.buildx.outputs.name }}
- name: Run test-with-hmr
env:
BUN_TEST_NAME: test-with-hmr
GITHUB_WORKSPACE: $GITHUB_WORKSPACE
RUNNER_TEMP: ${RUNNER_TEMP}
run: bash .docker/runner.sh
- name: Run test-no-hmr
env:
BUN_TEST_NAME: test-no-hmr
GITHUB_WORKSPACE: $GITHUB_WORKSPACE
RUNNER_TEMP: ${RUNNER_TEMP}
run: bash .docker/runner.sh
- name: Run test-bun-create-next
env:
RUNNER_TEMP: ${RUNNER_TEMP}
BUN_TEST_NAME: test-create-next
GITHUB_WORKSPACE: $GITHUB_WORKSPACE
run: bash .docker/runner.sh
- name: Run test-bun-create-react
env:
RUNNER_TEMP: ${RUNNER_TEMP}
BUN_TEST_NAME: test-create-react
GITHUB_WORKSPACE: $GITHUB_WORKSPACE
run: bash .docker/runner.sh
- name: Run test-bun-run
env:
RUNNER_TEMP: ${RUNNER_TEMP}
BUN_TEST_NAME: test-bun-run
GITHUB_WORKSPACE: $GITHUB_WORKSPACE
run: bash .docker/runner.sh
- name: Run test-bun-install
env:
RUNNER_TEMP: ${RUNNER_TEMP}
BUN_TEST_NAME: test-bun-install
GITHUB_WORKSPACE: $GITHUB_WORKSPACE
run: bash .docker/runner.sh
# This is commented out because zig test does not work on the CI
# Which sucks
# zig-unit-tests:
# runs-on: self-hosted
# name: "Unit tests (Zig)"
# steps:
# - name: Checkout
# uses: actions/checkout@v2
# - name: Checkout submodules
# run: git -c submodule."src/javascript/jsc/WebKit".update=none submodule update --init --recursive --depth=1 --progress -j 8
# - name: Set up Docker Buildx
# uses: docker/setup-buildx-action@v1
# - name: Login to Dockerhub
# uses: docker/login-action@v1
# with:
# username: ${{ secrets.DOCKERHUB_USERNAME }}
# password: ${{ secrets.DOCKERHUB_PASSWORD }}
# - name: Pull Base Image
# run: bash .docker/pull.sh
# - name: Build tests
# uses: docker/build-push-action@v2
# with:
# context: .
# target: build_unit
# tags: bun-unit-tests:latest
# load: true
# cache-from: type=gha
# cache-to: type=gha,mode=max
# builder: ${{ steps.buildx.outputs.name }}
# - name: Run tests
# env:
# GITHUB_WORKSPACE: $GITHUB_WORKSPACE
# RUNNER_TEMP: ${RUNNER_TEMP}
# run: bash .docker/unit-tests.sh
release:
runs-on: self-hosted
needs: ["e2e"]
if: github.ref == 'refs/heads/main'
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Checkout submodules
run: git -c submodule."src/javascript/jsc/WebKit".update=none submodule update --init --recursive --depth=1 --progress -j 8
- name: Login to GitHub Container Registry
uses: docker/login-action@v1
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: jarredsumner
password: ${{ secrets.DOCKERHUB_ALT }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
with:
install: true
- name: Pull Base Image
run: bash .docker/pull.sh
- name: Build release image
uses: docker/build-push-action@v2
with:
context: .
target: release
tags: |
ghcr.io/jarred-sumner/bun:${{github.sha}}
ghcr.io/jarred-sumner/bun:edge
jarredsumner/bun:${{github.sha}}
jarredsumner/bun:edge
platforms: |
linux/amd64
labels: |
org.opencontainers.image.title=bun
org.opencontainers.image.description=bun is a fast bundler, transpiler, JavaScript Runtime environment and package manager for web software. The image is an Ubuntu 20.04 image with bun preinstalled into /opt/bun.
org.opencontainers.image.vendor=bun
org.opencontainers.image.source=https://github.com/Jarred-Sumner/bun
org.opencontainers.image.url=https://bun.sh
builder: ${{ steps.buildx.outputs.name }}
push: true

View File

@@ -1,85 +0,0 @@
name: C++ Linter comment
permissions:
actions: read
pull-requests: write
on:
workflow_run:
workflows:
- lint-cpp
types:
- completed
jobs:
comment-lint:
if: ${{ github.repository_owner == 'oven-sh' }}
name: Comment
runs-on: ubuntu-latest
steps:
- name: Download Comment
uses: actions/download-artifact@v4
with:
name: format.log
github-token: ${{ github.token }}
run-id: ${{ github.event.workflow_run.id }}
- name: PR Number
uses: actions/download-artifact@v4
with:
name: pr-number.txt
github-token: ${{ github.token }}
run-id: ${{ github.event.workflow_run.id }}
- name: Did Fail
uses: actions/download-artifact@v4
with:
name: did_fail.txt
github-token: ${{ github.token }}
run-id: ${{ github.event.workflow_run.id }}
- name: Setup Environment
id: env
shell: bash
run: |
# Copy to outputs
echo "pr-number=$(cat pr-number.txt)" >> $GITHUB_OUTPUT
{
echo 'text_output<<EOF'
cat format.log
echo EOF
} >> "$GITHUB_OUTPUT"
echo "did_fail=$(cat did_fail.txt)" >> $GITHUB_OUTPUT
- name: Find Comment
id: comment
uses: peter-evans/find-comment@v3
with:
issue-number: ${{ steps.env.outputs.pr-number }}
comment-author: github-actions[bot]
body-includes: <!-- generated-comment lint-cpp-workflow=${{ github.workflow }} -->
- name: Update Comment
uses: peter-evans/create-or-update-comment@v4
if: steps.env.outputs.did_fail != '0'
with:
comment-id: ${{ steps.comment.outputs.comment-id }}
issue-number: ${{ steps.env.outputs.pr-number }}
body: |
@${{ github.actor }}, `clang-tidy` had something to share with you about your code:
```cpp
${{ steps.env.outputs.text_output }}
```
Commit: ${{ github.event.workflow_run.head_sha || github.sha }}
<!-- generated-comment lint-cpp-workflow=${{ github.workflow }} -->
edit-mode: replace
- name: Update Previous Comment
uses: peter-evans/create-or-update-comment@v4
if: steps.env.outputs.did_fail == '0' && steps.comment.outputs.comment-id != ''
with:
comment-id: ${{ steps.comment.outputs.comment-id }}
issue-number: ${{ steps.env.outputs.pr-number }}
body: |
clang-tidy nits are fixed! Thank you.
<!-- generated-comment lint-cpp-workflow=${{ github.workflow }} -->
edit-mode: replace

View File

@@ -1,21 +0,0 @@
name: Docs
on:
push:
paths:
- "docs/**"
- "CONTRIBUTING.md"
branches:
- main
jobs:
deploy:
name: Deploy
runs-on: ubuntu-latest
if: ${{ github.repository_owner == 'oven-sh' }}
steps:
# redeploy Vercel site when a file in `docs` changes
# using VERCEL_DEPLOY_HOOK environment variable
- name: Trigger Webhook
run: |
curl -v ${{ secrets.VERCEL_DEPLOY_HOOK }}

View File

@@ -1,81 +0,0 @@
name: Issue Labeled
env:
BUN_VERSION: 1.1.13
on:
issues:
types: [labeled]
jobs:
on-labeled:
runs-on: ubuntu-latest
if: github.event.label.name == 'crash' || github.event.label.name == 'needs repro'
permissions:
issues: write
steps:
- name: Checkout
uses: actions/checkout@v4
with:
sparse-checkout: |
scripts
.github
CMakeLists.txt
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.13"
- name: "add platform and command label"
id: add-labels
if: github.event.label.name == 'crash'
env:
GITHUB_ISSUE_BODY: ${{ github.event.issue.body }}
GITHUB_ISSUE_TITLE: ${{ github.event.issue.title }}
shell: bash
run: |
LABELS=$(bun scripts/read-issue.ts)
echo "labels=$LABELS" >> $GITHUB_OUTPUT
bun scripts/is-outdated.ts
if [[ -f "is-outdated.txt" ]]; then
echo "is-outdated=true" >> $GITHUB_OUTPUT
fi
if [[ -f "outdated.txt" ]]; then
echo "oudated=$(cat outdated.txt)" >> $GITHUB_OUTPUT
fi
echo "latest=$(cat LATEST)" >> $GITHUB_OUTPUT
rm -rf is-outdated.txt outdated.txt latest.txt
- name: Add labels
uses: actions-cool/issues-helper@v3
if: github.event.label.name == 'crash'
with:
actions: "add-labels"
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
labels: ${{ steps.add-labels.outputs.labels }}
- name: Comment outdated
if: steps.add-labels.outputs.is-outdated == 'true' && github.event.label.name == 'crash'
uses: actions-cool/issues-helper@v3
with:
actions: "create-comment"
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
body: |
@${{ github.event.issue.user.login }}, the latest version of Bun is v${{ steps.add-labels.outputs.latest }}, but this crash was reported on Bun v${{ steps.add-labels.outputs.outdated }}.
Are you able to reproduce this crash on the latest version of Bun?
```sh
bun upgrade
```
- name: Comment needs repro
if: github.event.label.name == 'needs repro'
uses: actions-cool/issues-helper@v3
with:
actions: "create-comment"
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
body: |
Hello @${{ github.event.issue.user.login }}. Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository, [Replit](https://replit.com/@replit/Bun), or [CodeSandbox](https://codesandbox.io/templates/bun). Issues marked with `needs repro` will be closed if they have no activity within 3 days.

View File

@@ -1,30 +0,0 @@
name: lint-cpp
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name == 'workflow_dispatch' && inputs.run-id || github.ref }}
cancel-in-progress: true
on:
workflow_dispatch:
inputs:
run-id:
type: string
description: The workflow ID to download artifacts (skips the build step)
pull_request:
paths:
- ".github/workflows/lint-cpp.yml"
- "**/*.cpp"
- "src/deps/**/*"
- "CMakeLists.txt"
jobs:
lint-cpp:
if: ${{ !inputs.run-id }}
name: Lint C++
uses: ./.github/workflows/run-lint-cpp.yml
secrets: inherit
with:
pr-number: ${{ github.event.number }}

View File

@@ -1,89 +0,0 @@
name: Comment on updated submodule
on:
pull_request_target:
paths:
- "src/generated_versions_list.zig"
- ".github/workflows/on-submodule-update.yml"
jobs:
comment:
name: Comment
runs-on: ubuntu-latest
if: ${{ github.repository_owner == 'oven-sh' }}
permissions:
contents: read
pull-requests: write
issues: write
steps:
- name: Checkout current
uses: actions/checkout@v4
with:
sparse-checkout: |
src
- name: Hash generated versions list
id: hash
run: |
echo "hash=$(sha256sum src/generated_versions_list.zig | cut -d ' ' -f 1)" >> $GITHUB_OUTPUT
- name: Checkout base
uses: actions/checkout@v4
with:
ref: ${{ github.base_ref }}
sparse-checkout: |
src
- name: Hash base
id: base
run: |
echo "base=$(sha256sum src/generated_versions_list.zig | cut -d ' ' -f 1)" >> $GITHUB_OUTPUT
- name: Compare
id: compare
run: |
if [ "${{ steps.hash.outputs.hash }}" != "${{ steps.base.outputs.base }}" ]; then
echo "changed=true" >> $GITHUB_OUTPUT
else
echo "changed=false" >> $GITHUB_OUTPUT
fi
- name: Find Comment
id: comment
uses: peter-evans/find-comment@v3
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: github-actions[bot]
body-includes: <!-- generated-comment submodule-updated -->
- name: Write Warning Comment
uses: peter-evans/create-or-update-comment@v4
if: steps.compare.outputs.changed == 'true'
with:
comment-id: ${{ steps.comment.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }}
edit-mode: replace
body: |
⚠️ **Warning:** @${{ github.actor }}, this PR has changes to submodule versions.
If this change was intentional, please ignore this message. If not, please undo changes to submodules and rebase your branch.
<!-- generated-comment submodule-updated -->
- name: Add labels
uses: actions-cool/issues-helper@v3
if: steps.compare.outputs.changed == 'true'
with:
actions: "add-labels"
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.pull_request.number }}
labels: "changed-submodules"
- name: Remove labels
uses: actions-cool/issues-helper@v3
if: steps.compare.outputs.changed == 'false'
with:
actions: "remove-labels"
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.pull_request.number }}
labels: "changed-submodules"
- name: Delete outdated comment
uses: actions-cool/issues-helper@v3
if: steps.compare.outputs.changed == 'false' && steps.comment.outputs.comment-id != ''
with:
actions: "delete-comment"
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.comment.outputs.comment-id }}

View File

@@ -1,315 +0,0 @@
# TODO: Move this to bash scripts instead of GitHub Actions
# so it can be run from Buildkite, see: .buildkite/scripts/release.sh
name: Release
concurrency: release
env:
BUN_VERSION: ${{ github.event.inputs.tag || github.event.release.tag_name || 'canary' }}
BUN_LATEST: ${{ (github.event.inputs.is-latest || github.event.release.tag_name) && 'true' || 'false' }}
on:
release:
types:
- published
schedule:
- cron: "0 14 * * *" # every day at 6am PST
workflow_dispatch:
inputs:
is-latest:
description: Is this the latest release?
type: boolean
default: false
tag:
type: string
description: What is the release tag? (e.g. "1.0.2", "canary")
required: true
use-docker:
description: Should Docker images be released?
type: boolean
default: false
use-npm:
description: Should npm packages be published?
type: boolean
default: false
use-homebrew:
description: Should binaries be released to Homebrew?
type: boolean
default: false
use-s3:
description: Should binaries be uploaded to S3?
type: boolean
default: false
use-types:
description: Should types be released to npm?
type: boolean
default: false
jobs:
sign:
name: Sign Release
runs-on: ubuntu-latest
if: ${{ github.repository_owner == 'oven-sh' }}
permissions:
contents: write
defaults:
run:
working-directory: packages/bun-release
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup GPG
uses: crazy-max/ghaction-import-gpg@v5
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.20"
- name: Install Dependencies
run: bun install
- name: Sign Release
run: |
echo "$GPG_PASSPHRASE" | bun upload-assets -- "${{ env.BUN_VERSION }}"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GPG_PASSPHRASE: ${{ secrets.GPG_PASSPHRASE }}
npm:
name: Release to NPM
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-npm == 'true' }}
permissions:
contents: read
defaults:
run:
working-directory: packages/bun-release
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.20"
- name: Install Dependencies
run: bun install
- name: Release
run: bun upload-npm -- "${{ env.BUN_VERSION }}" publish
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
npm-types:
name: Release types to NPM
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-types == 'true' }}
permissions:
contents: read
defaults:
run:
working-directory: packages/bun-types
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v3
with:
node-version: latest
- name: Setup Bun
if: ${{ env.BUN_VERSION != 'canary' }}
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.20"
- name: Setup Bun
if: ${{ env.BUN_VERSION == 'canary' }}
uses: ./.github/actions/setup-bun
with:
bun-version: "canary" # Must be 'canary' so tag is correct
- name: Install Dependencies
run: bun install
- name: Setup Tag
if: ${{ env.BUN_VERSION == 'canary' }}
run: |
VERSION=$(bun --version)
TAG="${VERSION}-canary.$(date +'%Y%m%dT%H%M%S')"
echo "Setup tag: ${TAG}"
echo "TAG=${TAG}" >> ${GITHUB_ENV}
- name: Build
run: bun run build
env:
BUN_VERSION: ${{ env.TAG || env.BUN_VERSION }}
- name: Release (canary)
if: ${{ env.BUN_VERSION == 'canary' }}
uses: JS-DevTools/npm-publish@v1
with:
package: packages/bun-types/package.json
token: ${{ secrets.NPM_TOKEN }}
tag: canary
- name: Release (latest)
if: ${{ env.BUN_LATEST == 'true' }}
uses: JS-DevTools/npm-publish@v1
with:
package: packages/bun-types/package.json
token: ${{ secrets.NPM_TOKEN }}
docker:
name: Release to Dockerhub
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-docker == 'true' }}
permissions:
contents: read
strategy:
fail-fast: false
matrix:
include:
- variant: debian
suffix: ""
- variant: debian
suffix: -debian
- variant: slim
suffix: -slim
dir: debian-slim
- variant: alpine
suffix: -alpine
- variant: distroless
suffix: -distroless
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Docker emulator
uses: docker/setup-qemu-action@v2
- id: buildx
name: Setup Docker buildx
uses: docker/setup-buildx-action@v3
with:
platforms: linux/amd64,linux/arm64
- id: metadata
name: Setup Docker metadata
uses: docker/metadata-action@v4
with:
images: oven/bun
flavor: |
latest=false
tags: |
type=raw,value=latest,enable=${{ env.BUN_LATEST == 'true' && matrix.suffix == '' }}
type=raw,value=${{ matrix.variant }},enable=${{ env.BUN_LATEST == 'true' }}
type=match,pattern=(bun-v)?(canary|\d+.\d+.\d+),group=2,value=${{ env.BUN_VERSION }},suffix=${{ matrix.suffix }}
type=match,pattern=(bun-v)?(canary|\d+.\d+),group=2,value=${{ env.BUN_VERSION }},suffix=${{ matrix.suffix }}
type=match,pattern=(bun-v)?(canary|\d+),group=2,value=${{ env.BUN_VERSION }},suffix=${{ matrix.suffix }}
- name: Login to Docker
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Push to Docker
uses: docker/build-push-action@v5
with:
context: ./dockerhub/${{ matrix.dir || matrix.variant }}
platforms: linux/amd64,linux/arm64
builder: ${{ steps.buildx.outputs.name }}
push: true
tags: ${{ steps.metadata.outputs.tags }}
labels: ${{ steps.metadata.outputs.labels }}
build-args: |
BUN_VERSION=${{ env.BUN_VERSION }}
homebrew:
name: Release to Homebrew
runs-on: ubuntu-latest
needs: sign
permissions:
contents: read
if: ${{ github.event_name == 'release' || github.event.inputs.use-homebrew == 'true' }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
repository: oven-sh/homebrew-bun
token: ${{ secrets.ROBOBUN_TOKEN }}
- id: gpg
name: Setup GPG
uses: crazy-max/ghaction-import-gpg@v5
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
- name: Setup Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: "2.6"
- name: Update Tap
run: ruby scripts/release.rb "${{ env.BUN_VERSION }}"
- name: Commit Tap
uses: stefanzweifel/git-auto-commit-action@v4
with:
commit_options: --gpg-sign=${{ steps.gpg.outputs.keyid }}
commit_message: Release ${{ env.BUN_VERSION }}
commit_user_name: robobun
commit_user_email: robobun@oven.sh
commit_author: robobun <robobun@oven.sh>
s3:
name: Upload to S3
runs-on: ubuntu-latest
needs: sign
if: ${{ github.event_name != 'workflow_dispatch' || github.event.inputs.use-s3 == 'true' }}
permissions:
contents: read
defaults:
run:
working-directory: packages/bun-release
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.20"
- name: Install Dependencies
run: bun install
- name: Release
run: bun upload-s3 -- "${{ env.BUN_VERSION }}"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY}}
AWS_ENDPOINT: ${{ secrets.AWS_ENDPOINT }}
AWS_BUCKET: bun
notify-sentry:
name: Notify Sentry
runs-on: ubuntu-latest
needs: s3
steps:
- name: Notify Sentry
uses: getsentry/action-release@v1.7.0
env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_ORG: ${{ secrets.SENTRY_ORG }}
SENTRY_PROJECT: ${{ secrets.SENTRY_PROJECT }}
with:
ignore_missing: true
ignore_empty: true
version: ${{ env.BUN_VERSION }}
environment: production
bump:
name: "Bump version"
runs-on: ubuntu-latest
if: ${{ github.event_name != 'schedule' }}
permissions:
pull-requests: write
contents: write
steps:
- name: Checkout
uses: actions/checkout@v4
if: ${{ env.BUN_LATEST == 'true' }}
- name: Setup Bun
uses: ./.github/actions/setup-bun
if: ${{ env.BUN_LATEST == 'true' }}
with:
bun-version: "1.1.12"
- name: Bump version
uses: ./.github/actions/bump
if: ${{ env.BUN_LATEST == 'true' }}
with:
version: ${{ env.BUN_VERSION }}
token: ${{ github.token }}

View File

@@ -1,52 +0,0 @@
name: Format
permissions:
contents: write
on:
workflow_call:
inputs:
zig-version:
type: string
required: true
jobs:
format:
name: Format
runs-on: ubuntu-latest
if: ${{ github.ref != 'refs/heads/main' }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
sparse-checkout: |
.github
src
scripts
packages
test
bench
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.20"
- name: Setup Zig
uses: mlugg/setup-zig@v1
with:
version: ${{ inputs.zig-version }}
- name: Install Dependencies
run: |
bun install
- name: Format
run: |
bun fmt
- name: Format Zig
run: |
bun fmt:zig
- name: Generate submodule versions
run: |
bash ./scripts/write-versions.sh
- name: Commit
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: Apply formatting changes

View File

@@ -1,84 +0,0 @@
name: lint-cpp
permissions:
contents: read
env:
LLVM_VERSION: 16
LC_CTYPE: "en_US.UTF-8"
LC_ALL: "en_US.UTF-8"
on:
workflow_call:
inputs:
pr-number:
required: true
type: number
jobs:
lint-cpp:
name: Lint C++
runs-on: ${{ github.repository_owner == 'oven-sh' && 'macos-13-xlarge' || 'macos-13' }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
submodules: recursive
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: latest
- name: Install Dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install \
llvm@${{ env.LLVM_VERSION }} \
ninja \
coreutils \
openssl@1.1 \
libiconv \
gnu-sed --force --overwrite
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Bun install
run: |
bun install
- name: clang-tidy
id: format
env:
CPU_TARGET: native
BUN_SILENT: 1
run: |
rm -f did_fail format.log
echo "${{ inputs.pr-number }}" > pr-number.txt
echo "pr_number=$(cat pr-number.txt)" >> $GITHUB_OUTPUT
bun run --silent build:tidy &> >(tee -p format.log) && echo 0 > did_succeed.txt
# Upload format.log as github artifact for the workflow
if [ -f did_succeed.txt ]; then
echo "0" > did_fail.txt
else
echo "1" > did_fail.txt
fi
echo "did_fail=$(cat did_fail.txt)" >> $GITHUB_OUTPUT
- name: Upload format.log
uses: actions/upload-artifact@v2
with:
name: format.log
path: format.log
- name: Upload PR
uses: actions/upload-artifact@v2
with:
name: pr-number.txt
path: pr-number.txt
- name: Upload PR
uses: actions/upload-artifact@v2
with:
name: did_fail.txt
path: did_fail.txt
- name: Fail if formatting failed
if: ${{ steps.format.outputs.did_fail == '1' }}
run: exit 1

View File

@@ -1,32 +0,0 @@
name: Lint
permissions:
contents: read
env:
LLVM_VERSION: 16
on:
workflow_call:
jobs:
lint:
name: Lint
runs-on: ubuntu-latest
outputs:
text_output: ${{ steps.lint.outputs.text_output }}
json_output: ${{ steps.lint.outputs.json_output }}
count: ${{ steps.lint.outputs.count }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.3"
- name: Install Dependencies
run: |
bun --cwd=packages/bun-internal-test install
- name: Lint
id: lint
run: |
bun packages/bun-internal-test/src/linter.ts

View File

@@ -1,29 +0,0 @@
name: Test Bump version
on:
workflow_dispatch:
inputs:
version:
type: string
description: What is the release tag? (e.g. "1.0.2", "canary")
required: true
jobs:
bump:
name: "Bump version"
runs-on: ubuntu-latest
permissions:
pull-requests: write
contents: write
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.12"
- name: Bump version
uses: ./.github/actions/bump
with:
version: ${{ inputs.version }}
token: ${{ github.token }}

216
.gitignore vendored
View File

@@ -1,148 +1,104 @@
.DS_Store
.env
.envrc
.eslintcache
.idea
.next
.ninja_deps
.ninja_log
.npm
.npm.gz
.parcel-cache
.swcrc
.trace
.uuid
.vs
.vscode/clang*
.vscode/cpp*
.zig-cache
*.a
*.bc
*.big
*.blob
*.bun
*.crash
*.database
*.db
*.dmg
*.dSYM
*.jsb
*.lib
*.log
zig-cache
*.wasm
*.o
*.a
profile.json
node_modules
.swcrc
yarn.lock
dist
*.log
*.out.js
*.out.refresh.js
*.pdb
*.sqlite
*.tmp
*.trace
*.wat
*.zip
**/.verdaccio-db.json
**/*.dir
**/*.pdb
**/*.sln*
**/*.vcxproj*
**/package-lock.json
/.cache
/.webkit-cache
/build-*/
/bun-webkit
/kcov-out
/src/deps/libuv
/test-report.json
/test-report.md
/test.js
/test.ts
/testdir
/test.zig
/package-lock.json
build
build.ninja
bun-binary
bun-mimalloc
bun-nomimalloc
bun-singlehtreaded
bun-test-scratch
bun-zigld
cmake_install.cmake
CMakeCache.txt
CMakeFiles
cold-jsc-start
cold-jsc-start.d
compile_commands.json
*.wat
zig-out
pnpm-lock.yaml
README.md.template
src/deps/zig-clap/example
src/deps/zig-clap/README.md
src/deps/zig-clap/.github
src/deps/zig-clap/.gitattributes
out
outdir
.trace
cover
coverage
coverv
dist
esbuilddir
examples/lotta-modules/bun-nofscache
examples/lotta-modules/bun-old
examples/lotta-modules/bun-yday
failing-tests.txt
*.trace
github
make-dev-stats.csv
misctools/fetch
misctools/machbench
misctools/sha
myscript.sh
node_modules
node_modules_*
out
out.*
outcss
outdir
out
.parcel-cache
esbuilddir
*.bun
parceldist
esbuilddir
outdir/
packages/*/*.wasm
packages/bun-*/*.o
outcss
.next
txt.js
.idea
.vscode/cpp*
node_modules_*
*.jsb
*.zip
bun-zigld
bun-singlehtreaded
bun-nomimalloc
bun-mimalloc
examples/lotta-modules/bun-yday
examples/lotta-modules/bun-old
examples/lotta-modules/bun-nofscache
src/node-fallbacks/out/*
src/node-fallbacks/node_modules
sign.json
release/
*.dmg
sign.*.json
packages/debug-*
packages/bun-cli/postinstall.js
packages/bun-*/bun
packages/bun-*/bun-profile
packages/bun-*/debug-bun
packages/bun-cli/bin/*
packages/bun-*/*.o
packages/bun-cli/postinstall.js
packages/bun-wasm/*.cjs
packages/bun-wasm/*.d.cts
packages/bun-wasm/*.d.mts
packages/bun-wasm/*.d.ts
packages/bun-wasm/*.js
packages/bun-wasm/*.map
packages/bun-wasm/*.mjs
packages/debug-*
parceldist
pnpm-lock.yaml
profile.json
README.md.template
release/
sign.*.json
sign.json
src/bun.js/bindings-obj
src/bun.js/bindings/GeneratedJS2Native.zig
src/bun.js/debug-bindings-obj
src/deps/c-ares/build
packages/bun-cli/bin/*
bun-test-scratch
misctools/fetch
src/deps/libiconv
src/deps/openssl
src/deps/PLCrashReporter/
src/deps/s2n-tls
src/deps/zig-clap/.gitattributes
src/deps/zig-clap/.github
src/deps/zig-clap/example
src/deps/zig-clap/README.md
src/fallback.version
src/js/out/DebugPath.h
src/js/out/functions*
src/js/out/modules*
src/js/out/tmp
src/node-fallbacks/node_modules
src/node-fallbacks/out/*
src/runtime.version
src/tests.zig
test.txt
test/js/bun/glob/fixtures
tsconfig.tsbuildinfo
txt.js
x64
yarn.lock
zig-cache
zig-out
test/node.js/upstream
.zig-cache
scripts/env.local
*.blob
src/deps/s2n-tls
.npm
.npm.gz
bun-binary
src/deps/PLCrashReporter/
*.dSYM
*.crash
misctools/sha
packages/bun-wasm/*.mjs
packages/bun-wasm/*.cjs
packages/bun-wasm/*.map
packages/bun-wasm/*.js
packages/bun-wasm/*.d.ts
*.bc
src/fallback.version
src/runtime.version
*.sqlite
*.database
*.db

127
.gitmodules vendored
View File

@@ -1,88 +1,53 @@
[submodule "src/javascript/jsc/WebKit"]
path = src/bun.js/WebKit
url = https://github.com/oven-sh/WebKit.git
ignore = dirty
depth = 1
update = none
shallow = true
fetchRecurseSubmodules = false
# [submodule "src/deps/zig-clap"]
# path = src/deps/zig-clap
# url = https://github.com/Hejsil/zig-clap
[submodule "src/deps/picohttpparser"]
path = src/deps/picohttpparser
url = https://github.com/h2o/picohttpparser.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/picohttpparser
url = https://github.com/h2o/picohttpparser.git
ignore = dirty
depth = 1
[submodule "src/javascript/jsc/WebKit"]
path = src/javascript/jsc/WebKit
url = https://github.com/Jarred-Sumner/WebKit.git
ignore = dirty
depth = 1
[submodule "src/deps/mimalloc"]
path = src/deps/mimalloc
url = https://github.com/Jarred-Sumner/mimalloc.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/mimalloc
url = https://github.com/Jarred-Sumner/mimalloc.git
ignore = dirty
depth = 1
[submodule "src/deps/zlib"]
path = src/deps/zlib
url = https://github.com/cloudflare/zlib.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/zlib
url = https://github.com/cloudflare/zlib.git
ignore = dirty
depth = 1
[submodule "src/deps/libarchive"]
path = src/deps/libarchive
url = https://github.com/libarchive/libarchive.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/libarchive
url = https://github.com/libarchive/libarchive.git
ignore = dirty
depth = 1
[submodule "src/deps/boringssl"]
path = src/deps/boringssl
url = https://github.com/oven-sh/boringssl.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/boringssl
url = https://github.com/google/boringssl.git
ignore = dirty
depth = 1
[submodule "src/deps/libbacktrace"]
path = src/deps/libbacktrace
url = https://github.com/ianlancetaylor/libbacktrace
ignore = dirty
depth = 1
[submodule "src/deps/lol-html"]
path = src/deps/lol-html
url = https://github.com/cloudflare/lol-html
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
path = src/deps/lol-html
url = https://github.com/cloudflare/lol-html
ignore = dirty
depth = 1
[submodule "src/deps/uws"]
path = src/deps/uws
url = https://github.com/Jarred-Sumner/uWebSockets
ignore = dirty
depth = 1
[submodule "src/deps/tinycc"]
path = src/deps/tinycc
url = https://github.com/Jarred-Sumner/tinycc.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/c-ares"]
path = src/deps/c-ares
url = https://github.com/c-ares/c-ares.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/zstd"]
path = src/deps/zstd
url = https://github.com/facebook/zstd.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/ls-hpack"]
path = src/deps/ls-hpack
url = https://github.com/litespeedtech/ls-hpack.git
ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "zig"]
path = src/deps/zig
url = https://github.com/oven-sh/zig
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "src/deps/libdeflate"]
path = src/deps/libdeflate
url = https://github.com/ebiggers/libdeflate
ignore = "dirty"
path = src/deps/tinycc
url = https://github.com/Jarred-Sumner/tinycc.git
ignore = dirty
depth = 1

View File

@@ -1,4 +0,0 @@
command script import src/deps/zig/tools/lldb_pretty_printers.py
command script import src/bun.js/WebKit/Tools/lldb/lldb_webkit.py
# type summary add --summary-string "${var} | inner=${var[0-30]}, source=${var[33-64]}, tag=${var[31-32]}" "unsigned long"

View File

@@ -1,7 +0,0 @@
src/bun.js/WebKit
src/deps
test/snapshots
test/js/deno
test/node.js
src/react-refresh.js
*.min.js

View File

@@ -1,24 +0,0 @@
{
"arrowParens": "avoid",
"printWidth": 120,
"trailingComma": "all",
"useTabs": false,
"quoteProps": "preserve",
"overrides": [
{
"files": [".vscode/*.json"],
"options": {
"parser": "jsonc",
"quoteProps": "preserve",
"singleQuote": false,
"trailingComma": "all"
}
},
{
"files": ["*.md"],
"options": {
"printWidth": 80
}
}
]
}

View File

@@ -0,0 +1,25 @@
#!/bin/bash
set -euxo pipefail
WEBKIT_VERSION=$(git rev-parse HEAD:./src/javascript/jsc/WebKit)
MIMALLOC_VERSION=$(git rev-parse HEAD:./src/deps/mimalloc)
LIBARCHIVE_VERSION=$(git rev-parse HEAD:./src/deps/libarchive)
PICOHTTPPARSER_VERSION=$(git rev-parse HEAD:./src/deps/picohttpparser)
BORINGSSL_VERSION=$(git rev-parse HEAD:./src/deps/boringssl)
ZLIB_VERSION=$(git rev-parse HEAD:./src/deps/zlib)
UWS_VERSION=$(git rev-parse HEAD:./src/deps/uws)
rm -rf src/generated_versions_list.zig
echo "// AUTO-GENERATED FILE. Created via .scripts/write-versions.sh" >src/generated_versions_list.zig
echo "" >>src/generated_versions_list.zig
echo "pub const boringssl = \"$BORINGSSL_VERSION\";" >>src/generated_versions_list.zig
echo "pub const libarchive = \"$LIBARCHIVE_VERSION\";" >>src/generated_versions_list.zig
echo "pub const mimalloc = \"$MIMALLOC_VERSION\";" >>src/generated_versions_list.zig
echo "pub const picohttpparser = \"$PICOHTTPPARSER_VERSION\";" >>src/generated_versions_list.zig
echo "pub const uws = \"$UWS_VERSION\";" >>src/generated_versions_list.zig
echo "pub const webkit = \"$WEBKIT_VERSION\";" >>src/generated_versions_list.zig
echo "pub const zig = @import(\"std\").fmt.comptimePrint(\"{}\", .{@import(\"builtin\").zig_version});" >>src/generated_versions_list.zig
echo "pub const zlib = \"$ZLIB_VERSION\";" >>src/generated_versions_list.zig
echo "" >>src/generated_versions_list.zig
zig fmt src/generated_versions_list.zig

View File

@@ -1,39 +1,30 @@
{
"configurations": [
{
"name": "Debug",
"forcedInclude": ["${workspaceFolder}/src/bun.js/bindings/root.h"],
"compileCommands": "${workspaceFolder}/build/compile_commands.json",
"name": "Mac",
"forcedInclude": [
"${workspaceFolder}/src/javascript/jsc/bindings/root.h"
],
"includePath": [
"${workspaceFolder}/build/bun-webkit/include",
"${workspaceFolder}/build/codegen",
"${workspaceFolder}/src/bun.js/bindings/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/modules/",
"${workspaceFolder}/src/js/builtins/",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/WTF/Headers",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/*",
"${workspaceFolder}/src/JavaScript/jsc/bindings/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/Source/bmalloc/",
"${workspaceFolder}/src/javascript/jsc/WebKit/WebKitBuild/Release/ICU/Headers/"
],
"browse": {
"path": [
"${workspaceFolder}/build/bun-webkit/include",
"${workspaceFolder}/src/bun.js/bindings",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/js/builtins/*",
"${workspaceFolder}/src/bun.js/modules/*",
"${workspaceFolder}/src/deps/*",
"${workspaceFolder}/src/deps/boringssl/include/*",
"${workspaceFolder}/packages/bun-usockets/*",
"${workspaceFolder}/packages/bun-uws/*",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/javascript/jsc/bindings/*",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/WTF/Headers/**",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/*",
"${workspaceFolder}/src/JavaScript/jsc/bindings/**",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/Source/bmalloc/**",
"${workspaceFolder}/src/javascript/jsc/WebKit/WebKitBuild/Release/ICU/Headers/"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb",
"databaseFilename": ".vscode/cppdb"
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
@@ -43,78 +34,15 @@
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1",
"ASSERT_ENABLED=0",
"DU_DISABLE_RENAMING=1"
],
"macFrameworkPath": [],
"compilerPath": "${workspaceFolder}/.vscode/clang++",
"compilerPath": "/usr/local/opt/llvm/bin/clang",
"cStandard": "c17",
"cppStandard": "c++20",
},
{
"name": "BunWithJSCDebug",
"forcedInclude": ["${workspaceFolder}/src/bun.js/bindings/root.h"],
"includePath": [
"${workspaceFolder}/build/codegen",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/",
"${workspaceFolder}/src/bun.js/bindings/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/modules/",
"${workspaceFolder}/src/js/builtins/",
"${workspaceFolder}/src/js/out",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/",
],
"browse": {
"path": [
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/js/builtins/*",
"${workspaceFolder}/src/js/out/*",
"${workspaceFolder}/src/bun.js/modules/*",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/packages/bun-usockets/",
"${workspaceFolder}/packages/bun-uws/",
"${workspaceFolder}/src/napi",
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb_debug",
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
"STATICALLY_LINKED_WITH_WTF=1",
"BUILDING_WITH_CMAKE=1",
"NOMINMAX",
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1",
],
"macFrameworkPath": [],
"compilerPath": "${workspaceFolder}/.vscode/clang++",
"cStandard": "c17",
"cppStandard": "c++20",
},
"cppStandard": "c++11",
"intelliSenseMode": "macos-clang-x64"
}
],
"version": 4,
"version": 4
}

View File

@@ -1,33 +1,8 @@
{
"recommendations": [
// Zig
"ziglang.vscode-zig",
// C/C++
"clang.clangd",
"ms-vscode.cmake-tools",
"xaver.clang-format",
"vadimcn.vscode-lldb",
// JavaScript
"oven.bun-vscode",
"AugusteRame.zls-vscode",
"esbenp.prettier-vscode",
// TypeScript
"better-ts-errors.better-ts-errors",
"MylesMurphy.prettify-ts",
// Markdown
"bierner.markdown-preview-github-styles",
"bierner.markdown-emoji",
"bierner.emojisense",
"bierner.markdown-checkbox",
"bierner.jsdoc-markdown-highlighting",
// TOML
"tamasfe.even-better-toml",
// Other
"bierner.comment-tagged-templates",
],
"xaver.clang-format",
"vadimcn.vscode-lldb"
]
}

1845
.vscode/launch.json generated vendored

File diff suppressed because it is too large

275
.vscode/settings.json vendored
View File

@@ -1,123 +1,36 @@
{
// Editor
"editor.tabSize": 2,
"editor.insertSpaces": true,
"editor.formatOnSave": true,
"editor.formatOnSaveMode": "file",
// Search
"git.autoRepositoryDetection": "openEditors",
"search.quickOpen.includeSymbols": false,
"search.seedWithNearestWord": true,
"search.smartCase": true,
"search.exclude": {
"node_modules": true,
".git": true,
"src/bun.js/WebKit": true,
"src/deps/*/**": true,
"test/node.js/upstream": true,
},
"search.exclude": {},
"search.followSymlinks": false,
"search.useIgnoreFiles": true,
// Git
"git.autoRepositoryDetection": "openEditors",
"git.ignoreSubmodules": true,
"git.ignoreLimitWarning": true,
// Zig
"zig.initialSetupDone": true,
"zig.buildOption": "build",
"zig.zls.zigLibPath": "${workspaceFolder}/src/deps/zig/lib",
"zig.buildArgs": ["-Dgenerated-code=./build/codegen"],
"zig.zls.buildOnSaveStep": "check",
// "zig.zls.enableBuildOnSave": true,
// "zig.buildOnSave": true,
"zig.buildFilePath": "${workspaceFolder}/build.zig",
"zig.path": "${workspaceFolder}/.cache/zig/zig.exe",
"zig.formattingProvider": "zls",
"zig.zls.enableInlayHints": false,
"zig.buildOnSave": false,
"[zig]": {
"editor.tabSize": 4,
"editor.useTabStops": false,
"editor.defaultFormatter": "ziglang.vscode-zig",
"editor.defaultFormatter": "AugusteRame.zls-vscode",
"editor.formatOnSave": true
},
"[ts]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[js]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[jsx]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[tsx]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"zig.beforeDebugCmd": "make build-unit ${file} ${filter} ${bin}",
"zig.testCmd": "make test ${file} ${filter} ${bin}",
// lldb
"lldb.launch.initCommands": ["command source ${workspaceFolder}/.lldbinit"],
"lldb.verboseLogging": false,
// C++
"cmake.configureOnOpen": false,
"C_Cpp.errorSquiggles": "enabled",
"[cpp]": {
"editor.defaultFormatter": "xaver.clang-format",
},
"[c]": {
"editor.defaultFormatter": "xaver.clang-format",
},
"[h]": {
"editor.defaultFormatter": "xaver.clang-format",
},
// JavaScript
"prettier.enable": true,
"prettier.configPath": ".prettierrc",
"eslint.workingDirectories": ["${workspaceFolder}/packages/bun-types"],
"[javascript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"[javascriptreact]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"prettier.prettierPath": "./node_modules/prettier",
// TypeScript
"typescript.tsdk": "${workspaceFolder}/node_modules/typescript/lib",
"[typescript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"[typescriptreact]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// JSON
"[json]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"[jsonc]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// Markdown
"[markdown]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.unicodeHighlight.ambiguousCharacters": true,
"editor.unicodeHighlight.invisibleCharacters": true,
"diffEditor.ignoreTrimWhitespace": false,
"editor.wordWrap": "on",
"editor.quickSuggestions": {
"comments": "off",
"strings": "off",
"other": "off",
},
},
// TOML
"[toml]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// YAML
"[yaml]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// Docker
"[dockerfile]": {
"editor.formatOnSave": false,
},
// Files
"files.exclude": {
"**/.git": true,
"**/.svn": true,
@@ -127,32 +40,130 @@
"**/Thumbs.db": true,
"**/*.xcworkspacedata": true,
"**/*.xcscheme": true,
"**/*.pem": true,
"**/*.xcodeproj": true,
"**/*.i": true,
"packages/bun-types/*.d.ts": true,
// uws WebSocket.cpp conflicts with webcore WebSocket.cpp
"packages/bun-uws/fuzzing": true,
},
"files.associations": {
"*.idl": "cpp",
"integration/snapshots": true,
"integration/snapshots-no-hmr": true,
"src/javascript/jsc/WebKit": true,
"src/deps/libarchive": true,
"src/deps/mimalloc": true,
"src/deps/s2n-tls": true,
"src/deps/boringssl": true,
"src/deps/openssl": true,
"src/deps/uws": true,
"src/deps/zlib": true,
"src/deps/lol-html": true,
"integration/snippets/package-json-exports/_node_modules_copy": true
},
"C_Cpp.files.exclude": {
"**/.vscode": true,
"WebKit/JSTests": true,
"WebKit/Tools": true,
"WebKit/WebDriverTests": true,
"WebKit/WebKit.xcworkspace": true,
"WebKit/WebKitLibraries": true,
"WebKit/Websites": true,
"WebKit/resources": true,
"WebKit/LayoutTests": true,
"WebKit/ManualTests": true,
"WebKit/PerformanceTests": true,
"WebKit/WebKitLegacy": true,
"WebKit/WebCore": true,
"WebKit/WebDriver": true,
"WebKit/WebKitBuild": true,
"WebKit/WebInspectorUI": true,
"src/javascript/jsc/WebKit/JSTests": true,
"src/javascript/jsc/WebKit/Tools": true,
"src/javascript/jsc/WebKit/WebDriverTests": true,
"src/javascript/jsc/WebKit/WebKit.xcworkspace": true,
"src/javascript/jsc/WebKit/WebKitLibraries": true,
"src/javascript/jsc/WebKit/Websites": true,
"src/javascript/jsc/WebKit/resources": true,
"src/javascript/jsc/WebKit/LayoutTests": true,
"src/javascript/jsc/WebKit/ManualTests": true,
"src/javascript/jsc/WebKit/PerformanceTests": true,
"src/javascript/jsc/WebKit/WebKitLegacy": true,
"src/javascript/jsc/WebKit/WebCore": true,
"src/javascript/jsc/WebKit/WebDriver": true,
"src/javascript/jsc/WebKit/WebKitBuild": true,
"src/javascript/jsc/WebKit/WebInspectorUI": true
},
"git.detectSubmodules": false,
"[cpp]": {
"editor.defaultFormatter": "xaver.clang-format"
},
"[h]": {
"editor.defaultFormatter": "xaver.clang-format"
},
"[c]": {
"editor.defaultFormatter": "xaver.clang-format"
},
"files.associations": {
"*.idl": "cpp",
"memory": "cpp",
"iostream": "cpp",
"algorithm": "cpp",
"random": "cpp",
"ios": "cpp",
"filesystem": "cpp",
"__locale": "cpp",
"type_traits": "cpp",
"__mutex_base": "cpp",
"__string": "cpp",
"string": "cpp",
"string_view": "cpp",
"typeinfo": "cpp",
"__config": "cpp",
"__nullptr": "cpp",
"exception": "cpp",
"__bit_reference": "cpp",
"atomic": "cpp",
"utility": "cpp",
"sstream": "cpp",
"__functional_base": "cpp",
"new": "cpp",
"__debug": "cpp",
"__errc": "cpp",
"__hash_table": "cpp",
"__node_handle": "cpp",
"__split_buffer": "cpp",
"__threading_support": "cpp",
"__tuple": "cpp",
"array": "cpp",
"bit": "cpp",
"bitset": "cpp",
"cctype": "cpp",
"chrono": "cpp",
"clocale": "cpp",
"cmath": "cpp",
"complex": "cpp",
"condition_variable": "cpp",
"cstdarg": "cpp",
"cstddef": "cpp",
"cstdint": "cpp",
"cstdio": "cpp",
"cstdlib": "cpp",
"cstring": "cpp",
"ctime": "cpp",
"cwchar": "cpp",
"cwctype": "cpp",
"deque": "cpp",
"fstream": "cpp",
"functional": "cpp",
"initializer_list": "cpp",
"iomanip": "cpp",
"iosfwd": "cpp",
"istream": "cpp",
"iterator": "cpp",
"limits": "cpp",
"locale": "cpp",
"mutex": "cpp",
"optional": "cpp",
"ostream": "cpp",
"ratio": "cpp",
"stack": "cpp",
"stdexcept": "cpp",
"streambuf": "cpp",
"system_error": "cpp",
"thread": "cpp",
"tuple": "cpp",
"unordered_map": "cpp",
"unordered_set": "cpp",
"vector": "cpp",
"__bits": "cpp",
"__tree": "cpp",
"map": "cpp",
"numeric": "cpp",
"set": "cpp",
"__memory": "cpp",
"memory_resource": "cpp"
},
"go.logging.level": "off",
"cmake.configureOnOpen": false
}

66
.vscode/tasks.json vendored
View File

@@ -2,51 +2,33 @@
"version": "2.0.0",
"tasks": [
{
"label": "build",
"type": "process",
"label": "Install Dependencies",
"command": "scripts/all-dependencies.sh",
"windows": {
"command": "scripts/all-dependencies.ps1",
},
"icon": {
"id": "arrow-down",
},
"options": {
"cwd": "${workspaceFolder}",
"command": "zig",
"args": ["build"],
"presentation": {
"echo": true,
"reveal": "silent",
"focus": false,
"panel": "shared",
"showReuseMessage": false,
"clear": false
},
"group": {
"kind": "build",
"isDefault": true
}
},
{
"label": "run",
"type": "process",
"label": "Setup Environment",
"dependsOn": ["Install Dependencies"],
"command": "scripts/setup.sh",
"windows": {
"command": "scripts/setup.ps1",
},
"icon": {
"id": "check",
},
"options": {
"cwd": "${workspaceFolder}",
},
},
{
"type": "process",
"label": "Build Bun",
"dependsOn": ["Setup Environment"],
"command": "bun",
"args": ["run", "build"],
"icon": {
"id": "gear",
},
"options": {
"cwd": "${workspaceFolder}",
},
"isBuildCommand": true,
"runOptions": {
"instanceLimit": 1,
"reevaluateOnRerun": true,
},
},
],
"command": "zig",
"args": ["run", "${file}"],
"group": "build",
"presentation": {
"showReuseMessage": false,
"clear": true
}
}
]
}

File diff suppressed because it is too large

View File

@@ -1,12 +0,0 @@
## Code of conduct
- We are committed to providing a friendly, safe and welcoming environment for all, regardless of level of experience, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, nationality, or other similar characteristic.
- Please avoid using overtly sexual aliases or other nicknames that might detract from a friendly, safe and welcoming environment for all.
- Please be kind and courteous. There's no need to be mean or rude.
- Respect that people have differences of opinion and that every design or implementation choice carries a trade-off and numerous costs. There is seldom a right answer.
- Please keep unstructured critique to a minimum. If you have solid ideas you want to experiment with, make a fork and see how it works.
- We will exclude you from interaction if you insult, demean or harass anyone. That is not welcome behavior. We interpret the term “harassment” as including the definition in the [Citizen Code of Conduct](https://github.com/stumpsyn/policies/blob/master/citizen_code_of_conduct.md); if you have any lack of clarity about what might be included in that concept, please read their definition. In particular, we don't tolerate behavior that excludes people in socially marginalized groups.
- Private harassment is also unacceptable. No matter who you are, if you feel you have been or are being harassed or made uncomfortable by a community member, please contact one of the channel ops or an employee of Oven immediately. Whether you're a regular contributor or a newcomer, we care about making this community a safe place for you and we've got your back.
- Likewise any spamming, trolling, flaming, baiting or other attention-stealing behavior is not welcome.
This code of conduct is adapted from the [Rust Code of Conduct](https://www.rust-lang.org/policies/code-of-conduct).

View File

@@ -1,327 +0,0 @@
Configuring a development environment for Bun can take 10-30 minutes depending on your internet connection and computer speed. You will need ~10GB of free disk space for the repository and build artifacts.
If you are using Windows, please refer to [this guide](/docs/project/building-windows).
{% details summary="For Ubuntu users" %}
TL;DR: Ubuntu 22.04 is suggested.
Bun currently requires `glibc >= 2.32` in development, which means that on Ubuntu 20.04 (glibc == 2.31) you will likely hit `error: undefined symbol: __libc_single_threaded` and will need additional configuration. Also, according to this [issue](https://github.com/llvm/llvm-project/issues/97314), LLVM 16 is no longer maintained on Ubuntu 24.04 (noble); on 24.04 you may want to install LLVM 16 via `brew` instead.
{% /details %}
## Install Dependencies
Using your system's package manager, install Bun's dependencies:
{% codetabs %}
```bash#macOS (Homebrew)
$ brew install automake ccache cmake coreutils gnu-sed go icu4c libiconv libtool ninja pkg-config rust ruby
```
```bash#Ubuntu/Debian
$ sudo apt install curl wget lsb-release software-properties-common cargo ccache cmake git golang libtool ninja-build pkg-config rustc ruby-full xz-utils
```
```bash#Arch
$ sudo pacman -S base-devel ccache cmake git go libiconv libtool make ninja pkg-config python rust sed unzip ruby
```
```bash#Fedora
$ sudo dnf install cargo ccache cmake git golang libtool ninja-build pkg-config rustc ruby libatomic-static libstdc++-static sed unzip which libicu-devel 'perl(Math::BigInt)'
```
```bash#openSUSE Tumbleweed
$ sudo zypper install go cmake ninja automake git rustup && rustup toolchain install stable
```
{% /codetabs %}
> **Note**: The Zig compiler is automatically installed and updated by the build scripts. Manual installation is not required.
Before starting, you will need a release build of Bun already installed, as we use our bundler to transpile and minify our code, as well as for code generation scripts.
{% codetabs %}
```bash#Native
$ curl -fsSL https://bun.sh/install | bash
```
```bash#npm
$ npm install -g bun
```
```bash#Homebrew
$ brew tap oven-sh/bun
$ brew install bun
```
{% /codetabs %}
## Install LLVM
Bun requires LLVM 16 (`clang` is part of LLVM). This version requirement exists to match the precompiled WebKit, as mismatched versions will cause memory allocation failures at runtime. In most cases, you can install LLVM through your system package manager:
{% codetabs %}
```bash#macOS (Homebrew)
$ brew install llvm@16
```
```bash#Ubuntu/Debian
$ # LLVM has an automatic installation script that is compatible with all versions of Ubuntu
$ wget https://apt.llvm.org/llvm.sh -O - | sudo bash -s -- 16 all
```
```bash#Arch
$ sudo pacman -S llvm clang lld
```
```bash#Fedora
$ sudo dnf install 'dnf-command(copr)'
$ sudo dnf copr enable -y @fedora-llvm-team/llvm-snapshots
$ sudo dnf install llvm clang lld
```
```bash#openSUSE Tumbleweed
$ sudo zypper install clang16 lld16 llvm16
```
{% /codetabs %}
If none of the above solutions apply, you will have to install it [manually](https://github.com/llvm/llvm-project/releases/tag/llvmorg-16.0.6).
Make sure Clang/LLVM 16 is in your path:
```bash
$ which clang-16
```
If not, run this to manually add it:
{% codetabs %}
```bash#macOS (Homebrew)
# use fish_add_path if you're using fish
# use path+="$(brew --prefix llvm@16)/bin" if you are using zsh
$ export PATH="$(brew --prefix llvm@16)/bin:$PATH"
```
```bash#Arch
# use fish_add_path if you're using fish
$ export PATH="$PATH:/usr/lib/llvm16/bin"
```
{% /codetabs %}
> ⚠️ Ubuntu distributions (<= 20.04) may require installation of the C++ standard library independently. See the [troubleshooting section](#span-file-not-found-on-ubuntu) for more information.
## Building Bun
After cloning the repository, run the following command to run the first build. This may take a while as it will clone submodules and build dependencies.
```bash
$ bun setup
```
The binary will be located at `./build/bun-debug`. It is recommended to add this directory to your `$PATH`. To verify the build worked, let's print the version number of the development build of Bun.
```bash
$ build/bun-debug --version
x.y.z_debug
```
To rebuild, you can invoke `bun run build`:
```bash
$ bun run build
```
These two scripts, `setup` and `build`, are aliases for roughly the following:
```bash
$ ./scripts/setup.sh
$ cmake -S . -B build -G Ninja -DCMAKE_BUILD_TYPE=Debug
$ ninja -C build # 'bun run build' runs just this
```
Advanced users can pass CMake flags to customize the build.
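For example, flags can be forwarded through `bun setup` or passed directly to CMake when configuring the build directory. A minimal sketch, using only flags mentioned elsewhere in this guide:

```bash
# forward a CMake flag through the setup script
$ bun setup -DUSE_STATIC_LIBATOMIC=OFF

# or configure the build directory yourself with extra options, then build
$ cmake -S . -B build -G Ninja -DCMAKE_BUILD_TYPE=Debug -DUSE_STATIC_LIBATOMIC=OFF
$ ninja -C build
```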
## VSCode
VSCode is the recommended IDE for working on Bun, as the repository ships preconfigured settings for it. Once the project is open, you can run `Extensions: Show Recommended Extensions` to install the recommended extensions for Zig and C++. ZLS is automatically configured.
If you use a different editor, make sure that you tell ZLS to use the automatically installed Zig compiler, which is located at `./.cache/zig/zig.exe`. The filename is `zig.exe` so that it works as expected on Windows, but it still works on macOS/Linux (it just has a surprising file extension).
We recommend adding `./build` to your `$PATH` so that you can run `bun-debug` in your terminal:
```sh
$ bun-debug
```
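One way to add it to your `$PATH` for the current shell session, assuming bash or zsh run from the repository root (fish users can use `fish_add_path`; put the equivalent line in your shell profile to make it permanent):

```bash
$ export PATH="$PWD/build:$PATH"
$ bun-debug --version
```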
## Code generation scripts
Several code generation scripts are used during Bun's build process. These are run automatically when changes are made to certain files.
In particular, these are:
- `./src/codegen/generate-jssink.ts` -- Generates `build/codegen/JSSink.cpp`, `build/codegen/JSSink.h` which implement various classes for interfacing with `ReadableStream`. This is internally how `FileSink`, `ArrayBufferSink`, `"type": "direct"` streams and other code related to streams works.
- `./src/codegen/generate-classes.ts` -- Generates `build/codegen/ZigGeneratedClasses*`, which generates Zig & C++ bindings for JavaScriptCore classes implemented in Zig. In `**/*.classes.ts` files, we define the interfaces for various classes, methods, prototypes, getters/setters etc which the code generator reads to generate boilerplate code implementing the JavaScript objects in C++ and wiring them up to Zig
- `./src/codegen/bundle-modules.ts` -- Bundles built-in modules like `node:fs`, `bun:ffi` into files we can include in the final binary. In development, these can be reloaded without rebuilding Zig (you still need to run `bun run build`, but it re-reads the transpiled files from disk afterwards). In release builds, these are embedded into the binary. A sketch of invoking this script by hand follows this list.
- `./src/codegen/bundle-functions.ts` -- Bundles globally-accessible functions implemented in JavaScript/TypeScript like `ReadableStream`, `WritableStream`, and a handful more. These are used similarly to the builtin modules, but the output more closely aligns with what WebKit/Safari does for Safari's built-in functions so that we can copy-paste the implementations from WebKit as a starting point.
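You normally never run these scripts directly; `bun run build` drives them for you. Still, as a rough sketch of what the build does under the hood, the module bundling step can be invoked by hand. The command below mirrors the CI Dockerfile elsewhere in this changeset; the exact paths and flags are assumptions and may drift as the build evolves:

```bash
# bundle the built-in JS modules into ./build (normally done for you by `bun run build`)
$ mkdir -p build
$ bun run ./src/codegen/bundle-modules.ts --debug=OFF ./build
```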
## Modifying ESM modules
Certain modules like `node:fs`, `node:stream`, `bun:sqlite`, and `ws` are implemented in JavaScript. These live in `src/js/{node,bun,thirdparty}` files and are pre-bundled using Bun.
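As a concrete sketch of the edit-and-rebuild loop (the filename below is only an illustration; check `src/js/node/` for the actual layout):

```bash
# edit the JavaScript half of a built-in module (illustrative path)
$ $EDITOR src/js/node/fs.ts
# re-run the build so bun-debug picks up the re-bundled module
$ bun run build
```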
## Release build
To compile a release build of Bun, run:
```bash
$ bun run build:release
```
The binaries will be located at `./build-release/bun` and `./build-release/bun-profile`.
### Download release build from pull requests
To save you the time spent building a release build locally, we provide a way to run release builds from pull requests. This is useful for manually testing changes in a release build before they are merged.
To run a release build from a pull request, you can use the `bun-pr` npm package:
```sh
bunx bun-pr pr-number
bunx bun-pr branch/branch-name
bunx bun-pr "https://github.com/oven-sh/bun/pull/1234566"
```
This will download the release build from the pull request and add it to `$PATH` as `bun-${pr-number}`. You can then run the build with `bun-${pr-number}`.
```sh
bun-1234566 --version
```
This works by downloading the release build from the GitHub Actions artifacts on the linked pull request. You may need the `gh` CLI installed to authenticate with GitHub.
## Valgrind
On Linux, valgrind can help find memory issues.
Keep in mind:
- JavaScriptCore doesn't support valgrind. It will report spurious errors.
- Valgrind is slow.
- Mimalloc will sometimes cause spurious errors when a debug build is enabled.
You'll need a very recent version of Valgrind due to DWARF 5 debug symbols. You may need to manually compile Valgrind instead of using it from your Linux package manager.
`--fair-sched=try` is necessary if running multithreaded code in Bun (such as the bundler). Otherwise it will hang.
```bash
$ valgrind --fair-sched=try --track-origins=yes bun-debug <args>
```
## Building WebKit locally + Debug mode of JSC
{% callout %}
**TODO**: This is out of date. The TL;DR is to pass `-DUSE_DEBUG_JSC=1` or `-DWEBKIT_DIR=...` to CMake. It will probably need more fiddling; ask @paperdave if you need this.
{% /callout %}
WebKit is not cloned by default (to save time and disk space). To clone and build WebKit locally, run:
```bash
# once you run this, `make submodule` can be used to automatically
# update WebKit and the other submodules
$ git submodule update --init --depth 1 --checkout src/bun.js/WebKit
# to make a jsc release build
$ make jsc
# JSC debug build does not work perfectly with Bun yet, this is actively being
# worked on and will eventually become the default.
$ make jsc-build-linux-compile-debug cpp
$ make jsc-build-mac-compile-debug cpp
```
Note that the WebKit folder, including build artifacts, is 8GB+ in size.
If you are using a JSC debug build with VSCode, make sure to run the `C/C++: Select a Configuration` command so that IntelliSense can find the debug headers.
## Troubleshooting
### 'span' file not found on Ubuntu
> ⚠️ Please note that the instructions below are specific to issues occurring on Ubuntu. It is unlikely that the same issues will occur on other Linux distributions.
The Clang compiler typically uses `libstdc++` by default, the C++ standard library implementation provided by the GNU Compiler Collection (GCC). Clang can link against `libc++` instead, but this requires explicitly passing the `-stdlib` flag when running Clang.
Bun relies on C++20 features like `std::span`, which are not available in GCC versions lower than 11 (GCC 10 does not implement all of C++20). As a result, running `make setup` may fail with the following error:
```
fatal error: 'span' file not found
#include <span>
^~~~~~
```
The issue may manifest when initially running `bun setup`, with Clang being unable to compile a simple test program:
```
The C++ compiler
"/usr/bin/clang++-16"
is not able to compile a simple test program.
```
To fix the error, we need to update GCC to version 11. To do this, check whether GCC 11 is available in the distribution's official repositories, or use a third-party repository that provides GCC 11 packages. Here are the general steps:
```bash
$ sudo apt update
$ sudo apt install gcc-11 g++-11
# If the above command fails with `Unable to locate package gcc-11` we need
# to add the APT repository
$ sudo add-apt-repository -y ppa:ubuntu-toolchain-r/test
# Now run `apt install` again
$ sudo apt install gcc-11 g++-11
```
Now, we need to set GCC 11 as the default compiler:
```bash
$ sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-11 100
$ sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-11 100
```
### libarchive
If you see an error on macOS when compiling `libarchive`, run:
```bash
$ brew install pkg-config
```
### macOS `library not found for -lSystem`
If you see this error when compiling, run:
```bash
$ xcode-select --install
```
### Cannot find `libatomic.a`
Bun defaults to linking `libatomic` statically, as not all systems have it. If you are building on a distro that does not have a static libatomic available, you can run the following command to enable dynamic linking:
```bash
$ bun setup -DUSE_STATIC_LIBATOMIC=OFF
```
The built version of Bun may not work on other systems if compiled this way.
### ccache conflicts with building TinyCC on macOS
If you run into issues with `ccache` when building TinyCC, try reinstalling ccache:
```bash
brew uninstall ccache
brew install ccache
```

View File

@@ -1,645 +1,447 @@
# This Dockerfile is used by CI workflows to build Bun. It is not intended as a development
# environment, or to be used as a base image for other projects.
#
# You likely want this image instead: https://hub.docker.com/r/oven/bun
#
# TODO: move this file to reduce confusion
FROM bunbunbunbun/bun-base:latest as lolhtml
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
ARG CPU_TARGET=native
ARG ARCH=x86_64
ARG BUILD_MACHINE_ARCH=x86_64
ARG BUILDARCH=amd64
ARG TRIPLET=${ARCH}-linux-gnu
ARG GIT_SHA=""
ARG BUN_VERSION="bun-v1.1.4"
ARG BUN_DOWNLOAD_URL_BASE="https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/${BUN_VERSION}"
ARG CANARY=0
ARG ASSERTIONS=OFF
ARG ZIG_OPTIMIZE=ReleaseFast
ARG CMAKE_BUILD_TYPE=Release
ARG NODE_VERSION="20"
ARG LLVM_VERSION="16"
ARG ZIG_VERSION="0.13.0"
ARG ZIG_VERSION_SHORT="0.13.0"
ARG SCCACHE_BUCKET
ARG SCCACHE_REGION
ARG SCCACHE_S3_USE_SSL
ARG SCCACHE_ENDPOINT
ARG AWS_ACCESS_KEY_ID
ARG AWS_SECRET_ACCESS_KEY
FROM bitnami/minideb:bullseye as bun-base
ARG BUN_DOWNLOAD_URL_BASE
ARG DEBIAN_FRONTEND
ARG BUN_VERSION
ARG NODE_VERSION
ARG LLVM_VERSION
ARG BUILD_MACHINE_ARCH
ARG BUN_DIR
ARG BUN_DEPS_OUT_DIR
ARG CPU_TARGET
ENV CI 1
ENV CPU_TARGET=${CPU_TARGET}
ENV BUILDARCH=${BUILDARCH}
ENV BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR}
ENV BUN_ENABLE_LTO 1
ENV LC_CTYPE=en_US.UTF-8
ENV LC_ALL=en_US.UTF-8
ENV SCCACHE_BUCKET=${SCCACHE_BUCKET}
ENV SCCACHE_REGION=${SCCACHE_REGION}
ENV SCCACHE_S3_USE_SSL=${SCCACHE_S3_USE_SSL}
ENV SCCACHE_ENDPOINT=${SCCACHE_ENDPOINT}
ENV AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
ENV AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
RUN install_packages \
ca-certificates \
curl \
gnupg \
&& echo "deb https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-${LLVM_VERSION} main" > /etc/apt/sources.list.d/llvm.list \
&& echo "deb-src https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-${LLVM_VERSION} main" >> /etc/apt/sources.list.d/llvm.list \
&& curl -fsSL "https://apt.llvm.org/llvm-snapshot.gpg.key" | apt-key add - \
&& echo "deb https://deb.nodesource.com/node_${NODE_VERSION}.x nodistro main" > /etc/apt/sources.list.d/nodesource.list \
&& curl -fsSL "https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key" | apt-key add - \
&& echo "deb https://apt.kitware.com/ubuntu/ focal main" > /etc/apt/sources.list.d/kitware.list \
&& curl -fsSL "https://apt.kitware.com/keys/kitware-archive-latest.asc" | apt-key add - \
&& install_packages \
wget \
bash \
software-properties-common \
build-essential \
autoconf \
automake \
libtool \
pkg-config \
clang-${LLVM_VERSION} \
lld-${LLVM_VERSION} \
lldb-${LLVM_VERSION} \
clangd-${LLVM_VERSION} \
libc++-${LLVM_VERSION}-dev \
libc++abi-${LLVM_VERSION}-dev \
llvm-${LLVM_VERSION}-runtime \
llvm-${LLVM_VERSION}-dev \
make \
cmake \
ninja-build \
file \
libc-dev \
libxml2 \
libxml2-dev \
xz-utils \
git \
tar \
rsync \
gzip \
unzip \
perl \
python3 \
ruby \
ruby-dev \
golang \
nodejs && \
for f in /usr/lib/llvm-${LLVM_VERSION}/bin/*; do ln -sf "$f" /usr/bin; done \
&& ln -sf /usr/bin/clang-${LLVM_VERSION} /usr/bin/clang \
&& ln -sf /usr/bin/clang++-${LLVM_VERSION} /usr/bin/clang++ \
&& ln -sf /usr/bin/lld-${LLVM_VERSION} /usr/bin/lld \
&& ln -sf /usr/bin/lldb-${LLVM_VERSION} /usr/bin/lldb \
&& ln -sf /usr/bin/clangd-${LLVM_VERSION} /usr/bin/clangd \
&& ln -sf /usr/bin/llvm-ar-${LLVM_VERSION} /usr/bin/llvm-ar \
&& ln -sf /usr/bin/ld.lld /usr/bin/ld \
&& ln -sf /usr/bin/llvm-ranlib-${LLVM_VERSION} /usr/bin/ranlib \
&& ln -sf /usr/bin/clang /usr/bin/cc \
&& ln -sf /usr/bin/clang /usr/bin/c89 \
&& ln -sf /usr/bin/clang /usr/bin/c99 \
&& ln -sf /usr/bin/clang++ /usr/bin/c++ \
&& ln -sf /usr/bin/clang++ /usr/bin/g++ \
&& ln -sf /usr/bin/llvm-ar /usr/bin/ar \
&& ln -sf /usr/bin/clang /usr/bin/gcc \
&& arch="$(dpkg --print-architecture)" \
&& case "${arch##*-}" in \
amd64) variant="x64";; \
arm64) variant="aarch64";; \
*) echo "unsupported architecture: $arch"; exit 1 ;; \
esac \
&& wget "${BUN_DOWNLOAD_URL_BASE}/bun-linux-${variant}.zip" \
&& unzip bun-linux-${variant}.zip \
&& mv bun-linux-${variant}/bun /usr/bin/bun \
&& ln -s /usr/bin/bun /usr/bin/bunx \
&& rm -rf bun-linux-${variant} bun-linux-${variant}.zip \
&& mkdir -p ${BUN_DIR} ${BUN_DEPS_OUT_DIR}
# && if [ -n "${SCCACHE_BUCKET}" ]; then \
# echo "Setting up sccache" \
# && wget https://github.com/mozilla/sccache/releases/download/v0.5.4/sccache-v0.5.4-${BUILD_MACHINE_ARCH}-unknown-linux-musl.tar.gz \
# && tar xf sccache-v0.5.4-${BUILD_MACHINE_ARCH}-unknown-linux-musl.tar.gz \
# && mv sccache-v0.5.4-${BUILD_MACHINE_ARCH}-unknown-linux-musl/sccache /usr/bin/sccache \
# && rm -rf sccache-v0.5.4-${BUILD_MACHINE_ARCH}-unknown-linux-musl.tar.gz sccache-v0.5.4-${BUILD_MACHINE_ARCH}-unknown-linux-musl \
FROM bun-base as bun-base-with-zig
ARG ZIG_VERSION
ARG ZIG_VERSION_SHORT
ARG BUILD_MACHINE_ARCH
ARG ZIG_FOLDERNAME=zig-linux-${BUILD_MACHINE_ARCH}-${ZIG_VERSION}
ARG ZIG_FILENAME=${ZIG_FOLDERNAME}.tar.xz
ARG ZIG_URL="https://ziglang.org/builds/${ZIG_FILENAME}"
ARG ZIG_LOCAL_CACHE_DIR=/zig-cache
ENV ZIG_LOCAL_CACHE_DIR=${ZIG_LOCAL_CACHE_DIR}
WORKDIR $GITHUB_WORKSPACE
ADD $ZIG_URL .
RUN tar xf ${ZIG_FILENAME} \
&& mv ${ZIG_FOLDERNAME}/lib /usr/lib/zig \
&& mv ${ZIG_FOLDERNAME}/zig /usr/bin/zig \
&& rm -rf ${ZIG_FILENAME} ${ZIG_FOLDERNAME}
FROM bun-base as c-ares
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/c-ares ${BUN_DIR}/src/deps/c-ares
COPY scripts ${BUN_DIR}/scripts
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-cares.sh \
&& rm -rf ${BUN_DIR}/src/deps/c-ares ${BUN_DIR}/Makefile ${BUN_DIR}/scripts
FROM bun-base as lolhtml
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/lol-html ${BUN_DIR}/src/deps/lol-html
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN cd ${BUN_DIR} && \
make lolhtml && rm -rf src/deps/lol-html Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
export PATH=$PATH:$HOME/.cargo/bin \
&& cd ${BUN_DIR} \
&& make lolhtml \
&& rm -rf src/deps/lol-html Makefile
FROM bunbunbunbun/bun-base:latest as mimalloc
FROM bun-base as mimalloc
ARG BUN_DIR
ARG CPU_TARGET
ARG ASSERTIONS
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/mimalloc ${BUN_DIR}/src/deps/mimalloc
COPY scripts ${BUN_DIR}/scripts
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd ${BUN_DIR} \
&& bash ./scripts/build-mimalloc.sh \
&& rm -rf src/deps/mimalloc Makefile
FROM bun-base as mimalloc-debug
ARG BUN_DIR
ARG CPU_TARGET
ARG ASSERTIONS
ENV CPU_TARGET=${CPU_TARGET}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/mimalloc ${BUN_DIR}/src/deps/mimalloc
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN cd ${BUN_DIR} && \
make mimalloc && rm -rf src/deps/mimalloc Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd ${BUN_DIR} \
&& make mimalloc-debug \
&& rm -rf src/deps/mimalloc Makefile
FROM bunbunbunbun/bun-base:latest as zlib
FROM bun-base as zlib
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY scripts ${BUN_DIR}/scripts
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
COPY package.json bun.lockb Makefile .gitmodules ${BUN_DIR}/
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-zlib.sh && rm -rf src/deps/zlib scripts
RUN cd $BUN_DIR && \
make zlib && rm -rf src/deps/zlib Makefile
FROM bunbunbunbun/bun-base:latest as libarchive
FROM bun-base as libdeflate
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY scripts ${BUN_DIR}/scripts
COPY src/deps/libdeflate ${BUN_DIR}/src/deps/libdeflate
COPY package.json bun.lockb Makefile .gitmodules ${BUN_DIR}/
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-libdeflate.sh && rm -rf src/deps/libdeflate scripts
FROM bun-base as libarchive
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN install_packages autoconf automake libtool pkg-config
COPY scripts ${BUN_DIR}/scripts
COPY src/deps/libarchive ${BUN_DIR}/src/deps/libarchive
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-libarchive.sh && rm -rf src/deps/libarchive .scripts
RUN cd $BUN_DIR && \
make libarchive && rm -rf src/deps/libarchive Makefile
FROM bun-base as tinycc
FROM bunbunbunbun/bun-base:latest as tinycc
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
RUN install_packages libtcc-dev && cp /usr/lib/$(uname -m)-linux-gnu/libtcc.a ${BUN_DEPS_OUT_DIR}
FROM bun-base as boringssl
RUN install_packages golang
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/tinycc ${BUN_DIR}/src/deps/tinycc
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make tinycc && rm -rf src/deps/tinycc Makefile
FROM bunbunbunbun/bun-base:latest as libbacktrace
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/libbacktrace ${BUN_DIR}/src/deps/libbacktrace
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make libbacktrace && rm -rf src/deps/libbacktrace Makefile
FROM bunbunbunbun/bun-base:latest as boringssl
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY scripts ${BUN_DIR}/scripts
COPY src/deps/boringssl ${BUN_DIR}/src/deps/boringssl
WORKDIR $BUN_DIR
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN make boringssl && rm -rf src/deps/boringssl Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd ${BUN_DIR} \
&& bash ./scripts/build-boringssl.sh \
&& rm -rf src/deps/boringssl Makefile
FROM bunbunbunbun/bun-base:latest as base64
FROM bun-base as zstd
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/zstd ${BUN_DIR}/src/deps/zstd
COPY scripts ${BUN_DIR}/scripts
COPY src/base64 ${BUN_DIR}/src/base64
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-zstd.sh \
&& rm -rf src/deps/zstd scripts
RUN make base64 && rm -rf src/base64 Makefile
FROM bun-base as ls-hpack
FROM bunbunbunbun/bun-base:latest as uws
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/ls-hpack ${BUN_DIR}/src/deps/ls-hpack
COPY scripts ${BUN_DIR}/scripts
COPY src/deps/uws ${BUN_DIR}/src/deps/uws
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
COPY src/deps/libuwsockets.cpp ${BUN_DIR}/src/deps/libuwsockets.cpp
COPY src/deps/_libusockets.h ${BUN_DIR}/src/deps/_libusockets.h
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& bash ./scripts/build-lshpack.sh \
&& rm -rf src/deps/ls-hpack scripts
RUN cd $BUN_DIR && \
make uws && rm -rf src/deps/uws Makefile
FROM bun-base-with-zig as bun-identifier-cache
FROM bunbunbunbun/bun-base:latest as picohttp
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG CPU_TARGET
ARG BUN_DIR
ENV CPU_TARGET=${CPU_TARGET}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/picohttpparser ${BUN_DIR}/src/deps/picohttpparser
COPY src/deps/*.c ${BUN_DIR}/src/deps/
COPY src/deps/*.h ${BUN_DIR}/src/deps/
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make picohttp
FROM bunbunbunbun/bun-base-with-zig-and-webkit:latest as identifier_cache
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
WORKDIR $BUN_DIR
COPY Makefile ${BUN_DIR}/Makefile
COPY src/js_lexer/identifier_data.zig ${BUN_DIR}/src/js_lexer/identifier_data.zig
COPY src/js_lexer/identifier_cache.zig ${BUN_DIR}/src/js_lexer/identifier_cache.zig
RUN --mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
cd $BUN_DIR \
&& zig run src/js_lexer/identifier_data.zig
RUN cd $BUN_DIR && \
make identifier-cache && rm -rf zig-cache Makefile
FROM bun-base as bun-node-fallbacks
FROM bunbunbunbun/bun-base-with-zig-and-webkit:latest as node_fallbacks
ARG BUN_DIR
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
WORKDIR $BUN_DIR
COPY Makefile ${BUN_DIR}/Makefile
COPY src/node-fallbacks ${BUN_DIR}/src/node-fallbacks
RUN cd $BUN_DIR && \
make node-fallbacks && rm -rf src/node-fallbacks/node_modules Makefile
RUN cd $BUN_DIR/src/node-fallbacks \
&& bun install --frozen-lockfile \
&& bun run build \
&& rm -rf src/node-fallbacks/node_modules
FROM bunbunbunbun/bun-base-with-zig-and-webkit:latest as prepare_release
FROM bun-base as bun-webkit
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
ARG BUILDARCH
ARG ASSERTIONS
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
RUN mkdir ${BUN_DIR}/bun-webkit \
&& WEBKIT_TAG=$(grep 'set(WEBKIT_TAG' "${BUN_DIR}/CMakeLists.txt" | awk '{print $2}' | cut -f 1 -d ')') \
&& WEBKIT_SUFFIX=$(if [ "${ASSERTIONS}" = "ON" ]; then echo "debug"; else echo "lto"; fi) \
&& WEBKIT_URL="https://github.com/oven-sh/WebKit/releases/download/autobuild-${WEBKIT_TAG}/bun-webkit-linux-${BUILDARCH}-${WEBKIT_SUFFIX}.tar.gz" \
&& echo "Downloading ${WEBKIT_URL}" \
&& curl -fsSL "${WEBKIT_URL}" | tar -xz -C ${BUN_DIR}/bun-webkit --strip-components=1
FROM bun-base as bun-cpp-objects
ARG CANARY
ARG ASSERTIONS
COPY --from=bun-webkit ${BUN_DIR}/bun-webkit ${BUN_DIR}/bun-webkit
COPY packages ${BUN_DIR}/packages
COPY src ${BUN_DIR}/src
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
# for uWebSockets
COPY src/deps/libdeflate ${BUN_DIR}/src/deps/libdeflate
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=${CCACHE_DIR} mkdir ${BUN_DIR}/build \
&& cd ${BUN_DIR}/build \
&& mkdir -p tmp_modules tmp_functions js codegen \
&& cmake .. -GNinja -DCMAKE_BUILD_TYPE=Release -DUSE_LTO=ON -DUSE_DEBUG_JSC=${ASSERTIONS} -DBUN_CPP_ONLY=1 -DWEBKIT_DIR=/build/bun/bun-webkit -DCANARY=${CANARY} -DZIG_COMPILER=system \
&& bash compile-cpp-only.sh -v
FROM bun-base-with-zig as bun-codegen-for-zig
COPY package.json bun.lockb Makefile .gitmodules ${BUN_DIR}/
COPY src/runtime ${BUN_DIR}/src/runtime
COPY src/runtime.js src/runtime.bun.js ${BUN_DIR}/src/
COPY packages/bun-error ${BUN_DIR}/packages/bun-error
COPY packages/bun-types ${BUN_DIR}/packages/bun-types
COPY src/fallback.ts ${BUN_DIR}/src/fallback.ts
COPY src/api ${BUN_DIR}/src/api
WORKDIR $BUN_DIR
# TODO: move away from Makefile entirely
RUN --mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
bun install --frozen-lockfile \
&& make runtime_js fallback_decoder bun_error \
&& rm -rf src/runtime src/fallback.ts node_modules bun.lockb package.json Makefile
COPY ./src ${BUN_DIR}/src
COPY ./build.zig ${BUN_DIR}/build.zig
COPY ./completions ${BUN_DIR}/completions
COPY ./packages ${BUN_DIR}/packages
COPY ./build-id ${BUN_DIR}/build-id
COPY ./package.json ${BUN_DIR}/package.json
COPY ./misctools ${BUN_DIR}/misctools
COPY Makefile ${BUN_DIR}/Makefile
FROM bun-base-with-zig as bun-compile-zig-obj
COPY --from=lolhtml ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=mimalloc ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=libarchive ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=picohttp ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=boringssl ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=uws ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=uws ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=libbacktrace ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=zlib ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=tinycc ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=base64 ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=identifier_cache ${BUN_DIR}/src/js_lexer/*.blob ${BUN_DIR}/src/js_lexer/
COPY --from=node_fallbacks ${BUN_DIR}/src/node-fallbacks/out ${BUN_DIR}/src/node-fallbacks/out
ARG ZIG_PATH
ARG TRIPLET
ARG GIT_SHA
ARG CPU_TARGET
ARG CANARY=0
ARG ASSERTIONS=OFF
ARG ZIG_OPTIMIZE=ReleaseFast
WORKDIR ${BUN_DIR}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY *.zig package.json CMakeLists.txt ${BUN_DIR}/
COPY completions ${BUN_DIR}/completions
COPY packages ${BUN_DIR}/packages
COPY src ${BUN_DIR}/src
FROM prepare_release as build_release
COPY --from=bun-identifier-cache ${BUN_DIR}/src/js_lexer/*.blob ${BUN_DIR}/src/js_lexer/
COPY --from=bun-node-fallbacks ${BUN_DIR}/src/node-fallbacks/out ${BUN_DIR}/src/node-fallbacks/out
COPY --from=bun-codegen-for-zig ${BUN_DIR}/src/*.out.js ${BUN_DIR}/src/*.out.refresh.js ${BUN_DIR}/src/
COPY --from=bun-codegen-for-zig ${BUN_DIR}/packages/bun-error/dist ${BUN_DIR}/packages/bun-error/dist
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY Makefile ${BUN_DIR}/Makefile
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=${CCACHE_DIR} \
--mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
mkdir -p build \
&& bun run $BUN_DIR/src/codegen/bundle-modules.ts --debug=OFF $BUN_DIR/build \
&& cd build \
&& cmake .. \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_LTO=ON \
-DZIG_OPTIMIZE="${ZIG_OPTIMIZE}" \
-DCPU_TARGET="${CPU_TARGET}" \
-DZIG_TARGET="${TRIPLET}" \
-DWEBKIT_DIR="omit" \
-DNO_CONFIGURE_DEPENDS=1 \
-DNO_CODEGEN=1 \
-DBUN_ZIG_OBJ_DIR="/tmp" \
-DCANARY="${CANARY}" \
-DZIG_COMPILER=system \
-DZIG_LIB_DIR=$BUN_DIR/src/deps/zig/lib \
&& ONLY_ZIG=1 ninja "/tmp/bun-zig.o" -v
RUN cd $BUN_DIR && rm -rf $HOME/.cache zig-cache && make \
jsc-bindings-headers \
api \
analytics \
bun_error \
fallback_decoder && rm -rf $HOME/.cache zig-cache && \
mkdir -p $BUN_RELEASE_DIR && \
make jsc-bindings-mac -j10 && \
make sqlite release copy-to-bun-release-dir && \
    rm -rf $HOME/.cache zig-cache misctools package.json build-id completions build.zig ${BUN_DIR}/packages
FROM scratch as build_release_obj
FROM prepare_release as build_unit
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY --from=bun-compile-zig-obj /tmp/bun-zig.o /
FROM bun-base as bun-link
ARG CPU_TARGET
ARG CANARY
ARG ASSERTIONS
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG ZIG_LOCAL_CACHE_DIR=/zig-cache
ENV ZIG_LOCAL_CACHE_DIR=${ZIG_LOCAL_CACHE_DIR}
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
WORKDIR $BUN_DIR
RUN mkdir -p build bun-webkit
ENV PATH "$ZIG_PATH:$PATH"
# lol
COPY src/bun.js/bindings/sqlite/sqlite3.c ${BUN_DIR}/src/bun.js/bindings/sqlite/sqlite3.c
COPY src/deps/brotli ${BUN_DIR}/src/deps/brotli
CMD make jsc-bindings-headers \
api \
analytics \
bun_error \
fallback_decoder \
jsc-bindings-mac -j10 && \
make \
run-all-unit-tests
COPY src/symbols.dyn src/linker.lds ${BUN_DIR}/src/
FROM bunbunbunbun/bun-base-with-zig-and-webkit:latest as bun.devcontainer
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY --from=zlib ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=libdeflate ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=libarchive ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=boringssl ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=lolhtml ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=mimalloc ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=zstd ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=tinycc ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=c-ares ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=ls-hpack ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=bun-compile-zig-obj /tmp/bun-zig.o ${BUN_DIR}/build/bun-zig.o
COPY --from=bun-cpp-objects ${BUN_DIR}/build/*.a ${BUN_DIR}/build/
COPY --from=bun-cpp-objects ${BUN_DIR}/build/*.o ${BUN_DIR}/build/
COPY --from=bun-cpp-objects ${BUN_DIR}/bun-webkit/lib ${BUN_DIR}/bun-webkit/lib
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
WORKDIR $BUN_DIR/build
ENV WEBKIT_OUT_DIR ${WEBKIT_DIR}
ENV PATH "$ZIG_PATH:$PATH"
ENV JSC_BASE_DIR $WEBKIT_OUT_DIR
ENV LIB_ICU_PATH ${GITHUB_WORKSPACE}/icu/source/lib
ENV BUN_RELEASE_DIR ${BUN_RELEASE_DIR}
ENV PATH "${GITHUB_WORKSPACE}/packages/bun-linux-x64:${GITHUB_WORKSPACE}/packages/bun-linux-aarch64:${GITHUB_WORKSPACE}/packages/debug-bun-linux-x64:${GITHUB_WORKSPACE}/packages/debug-bun-linux-aarch64:$PATH"
ENV PATH "/home/ubuntu/zls/zig-out/bin:$PATH"
RUN --mount=type=cache,target=${CCACHE_DIR} \
--mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
cmake .. \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ_DIR="${BUN_DIR}/build" \
-DUSE_LTO=ON \
-DUSE_DEBUG_JSC=${ASSERTIONS} \
-DBUN_CPP_ARCHIVE="${BUN_DIR}/build/bun-cpp-objects.a" \
-DWEBKIT_DIR="${BUN_DIR}/bun-webkit" \
-DBUN_DEPS_OUT_DIR="${BUN_DEPS_OUT_DIR}" \
-DCPU_TARGET="${CPU_TARGET}" \
-DNO_CONFIGURE_DEPENDS=1 \
-DCANARY="${CANARY}" \
-DZIG_COMPILER=system \
&& ninja -v \
&& ./bun --revision \
&& mkdir -p /build/out \
&& mv bun bun-profile /build/out \
&& rm -rf ${BUN_DIR} ${BUN_DEPS_OUT_DIR}
ENV BUN_INSTALL /home/ubuntu/.bun
ENV XDG_CONFIG_HOME /home/ubuntu/.config
FROM scratch as artifact
RUN apt-get -y update && update-alternatives --install /usr/bin/lldb lldb /usr/bin/lldb-13 90
COPY --from=bun-link /build/out /
COPY .devcontainer/workspace.code-workspace $GITHUB_WORKSPACE/workspace.code-workspace
COPY .devcontainer/zls.json $GITHUB_WORKSPACE/zls.json
COPY .devcontainer/limits.conf /etc/security/limits.conf
COPY ".devcontainer/scripts/" /scripts/
COPY ".devcontainer/scripts/getting-started.sh" $GITHUB_WORKSPACE/getting-started.sh
RUN mkdir -p /home/ubuntu/.bun /home/ubuntu/.config $GITHUB_WORKSPACE/bun && \
bash /scripts/common-debian.sh && \
bash /scripts/github.sh && \
bash /scripts/nice.sh && \
bash /scripts/zig-env.sh
COPY .devcontainer/zls.json /home/ubuntu/.config/zls.json
FROM bun-base as bun-link-assertions
FROM ubuntu:20.04 as release_with_debug_info
ARG CPU_TARGET
ARG CANARY
ARG ASSERTIONS
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG ZIG_LOCAL_CACHE_DIR=/zig-cache
ENV ZIG_LOCAL_CACHE_DIR=${ZIG_LOCAL_CACHE_DIR}
COPY .devcontainer/limits.conf /etc/security/limits.conf
ENV BUN_INSTALL /opt/bun
ENV PATH "/opt/bun/bin:$PATH"
ARG BUILDARCH=amd64
LABEL org.opencontainers.image.title="bun ${BUILDARCH} (glibc)"
LABEL org.opencontainers.image.source=https://github.com/jarred-sumner/bun
COPY --from=build_release ${BUN_RELEASE_DIR}/bun /opt/bun/bin/bun
COPY --from=build_release ${BUN_RELEASE_DIR}/bun-profile /opt/bun/bin/bun-profile
WORKDIR /opt/bun
ENTRYPOINT [ "/opt/bun/bin/bun" ]
FROM ubuntu:20.04 as release
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
COPY .devcontainer/limits.conf /etc/security/limits.conf
ENV BUN_INSTALL /opt/bun
ENV PATH "/opt/bun/bin:$PATH"
ARG BUILDARCH=amd64
LABEL org.opencontainers.image.title="bun ${BUILDARCH} (glibc)"
LABEL org.opencontainers.image.source=https://github.com/jarred-sumner/bun
COPY --from=build_release ${BUN_RELEASE_DIR}/bun /opt/bun/bin/bun
WORKDIR /opt/bun
ENTRYPOINT [ "/opt/bun/bin/bun" ]
FROM bunbunbunbun/bun-test-base as test_base
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
ARG BUILDARCH=amd64
RUN groupadd -r chromium && useradd -d ${BUN_DIR} -M -r -g chromium -G audio,video chromium \
&& mkdir -p /home/chromium/Downloads && chown -R chromium:chromium /home/chromium
USER chromium
WORKDIR $BUN_DIR
RUN mkdir -p build bun-webkit
ENV NPM_CLIENT bun
ENV PATH "${BUN_DIR}/packages/bun-linux-x64:${BUN_DIR}/packages/bun-linux-aarch64:$PATH"
ENV CI 1
ENV BROWSER_EXECUTABLE /usr/bin/chromium
# lol
COPY src/bun.js/bindings/sqlite/sqlite3.c ${BUN_DIR}/src/bun.js/bindings/sqlite/sqlite3.c
COPY ./integration ${BUN_DIR}/integration
COPY Makefile ${BUN_DIR}/Makefile
COPY package.json ${BUN_DIR}/package.json
COPY .docker/run-test.sh ${BUN_DIR}/run-test.sh
COPY ./bun.lockb ${BUN_DIR}/bun.lockb
COPY src/symbols.dyn src/linker.lds ${BUN_DIR}/src/
# # We don't want to worry about architecture differences in this image
COPY --from=release /opt/bun/bin/bun ${BUN_DIR}/packages/bun-linux-aarch64/bun
COPY --from=release /opt/bun/bin/bun ${BUN_DIR}/packages/bun-linux-x64/bun
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY --from=zlib ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=libarchive ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=boringssl ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=lolhtml ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=mimalloc-debug ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=zstd ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=tinycc ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=c-ares ${BUN_DEPS_OUT_DIR}/* ${BUN_DEPS_OUT_DIR}/
COPY --from=bun-compile-zig-obj /tmp/bun-zig.o ${BUN_DIR}/build/bun-zig.o
COPY --from=bun-cpp-objects ${BUN_DIR}/build/bun-cpp-objects.a ${BUN_DIR}/build/bun-cpp-objects.a
COPY --from=bun-cpp-objects ${BUN_DIR}/bun-webkit/lib ${BUN_DIR}/bun-webkit/lib
USER root
RUN chgrp -R chromium ${BUN_DIR} && chmod g+rwx ${BUN_DIR} && chown -R chromium:chromium ${BUN_DIR}
USER chromium
WORKDIR $BUN_DIR/build
CMD [ "bash", "run-test.sh" ]
RUN --mount=type=cache,target=${CCACHE_DIR} \
--mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
cmake .. \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ_DIR="${BUN_DIR}/build" \
-DUSE_DEBUG_JSC=ON \
-DBUN_CPP_ARCHIVE="${BUN_DIR}/build/bun-cpp-objects.a" \
-DWEBKIT_DIR="${BUN_DIR}/bun-webkit" \
-DBUN_DEPS_OUT_DIR="${BUN_DEPS_OUT_DIR}" \
-DCPU_TARGET="${CPU_TARGET}" \
-DNO_CONFIGURE_DEPENDS=1 \
-DCANARY="${CANARY}" \
-DZIG_COMPILER=system \
-DUSE_LTO=ON \
&& ninja -v \
&& ./bun --revision \
&& mkdir -p /build/out \
&& mv bun bun-profile /build/out \
&& rm -rf ${BUN_DIR} ${BUN_DEPS_OUT_DIR}
FROM scratch as artifact-assertions
COPY --from=bun-link-assertions /build/out /
FROM release

Dockerfile.base

@@ -0,0 +1,147 @@
FROM ubuntu:20.04 as bun-base-with-args
FROM bun-base-with-args as bun-base
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
WORKDIR ${GITHUB_WORKSPACE}
RUN apt-get update && \
apt-get install --no-install-recommends -y wget gnupg2 curl lsb-release wget software-properties-common && \
add-apt-repository ppa:longsleep/golang-backports && \
wget https://apt.llvm.org/llvm.sh --no-check-certificate && \
chmod +x llvm.sh && \
./llvm.sh 13 && \
apt-get update && \
apt-get install --no-install-recommends -y \
ca-certificates \
curl \
gnupg2 \
software-properties-common \
cmake \
build-essential \
git \
libssl-dev \
ruby \
liblld-13-dev \
libclang-13-dev \
nodejs \
gcc \
g++ \
npm \
clang-13 \
clang-format-13 \
libc++-13-dev \
libc++abi-13-dev \
lld-13 \
libicu-dev \
wget \
rustc \
cargo \
unzip \
tar \
golang-go ninja-build pkg-config automake autoconf libtool curl && \
update-alternatives --install /usr/bin/cc cc /usr/bin/clang-13 90 && \
update-alternatives --install /usr/bin/cpp cpp /usr/bin/clang++-13 90 && \
update-alternatives --install /usr/bin/c++ c++ /usr/bin/clang++-13 90 && \
npm install -g esbuild
ENV CC=clang-13
ENV CXX=clang++-13
ARG BUILDARCH=amd64
WORKDIR $GITHUB_WORKSPACE
ENV WEBKIT_OUT_DIR ${WEBKIT_DIR}
ENV JSC_BASE_DIR $WEBKIT_OUT_DIR
ENV LIB_ICU_PATH ${GITHUB_WORKSPACE}/icu/source/lib
ENV BUN_RELEASE_DIR ${BUN_RELEASE_DIR}
ENV BUN_DEPS_OUT_DIR ${BUN_DEPS_OUT_DIR}
RUN cd / && mkdir -p $BUN_RELEASE_DIR $BUN_DEPS_OUT_DIR ${BUN_DIR} ${BUN_DEPS_OUT_DIR}
LABEL org.opencontainers.image.title="bun base image ${BUILDARCH} (glibc)"
LABEL org.opencontainers.image.source=https://github.com/jarred-sumner/bun
FROM bun-base as bun-base-with-zig-and-webkit
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
ARG BUILDARCH=amd64
WORKDIR $GITHUB_WORKSPACE
RUN cd $GITHUB_WORKSPACE && \
curl -o zig-linux-$BUILDARCH.zip -L https://github.com/Jarred-Sumner/zig/releases/download/mar4/zig-linux-$BUILDARCH.zip && \
unzip -q zig-linux-$BUILDARCH.zip && \
rm zig-linux-$BUILDARCH.zip;
RUN cd $GITHUB_WORKSPACE && \
curl -o bun-webkit-linux-$BUILDARCH.tar.gz -L https://github.com/Jarred-Sumner/WebKit/releases/download/May8/bun-webkit-linux-$BUILDARCH.tar.gz && \
tar -xzf bun-webkit-linux-$BUILDARCH.tar.gz && \
rm bun-webkit-linux-$BUILDARCH.tar.gz && \
cat $WEBKIT_OUT_DIR/include/cmakeconfig.h > /dev/null
RUN cd $GITHUB_WORKSPACE && \
curl -o icu4c-66_1-src.tgz -L https://github.com/unicode-org/icu/releases/download/release-66-1/icu4c-66_1-src.tgz && \
tar -xzf icu4c-66_1-src.tgz && \
rm icu4c-66_1-src.tgz && \
cd icu/source && \
./configure --enable-static --disable-shared && \
make -j$(nproc)
ENV ZIG "${ZIG_PATH}/zig"
LABEL org.opencontainers.image.title="bun base image with zig & webkit ${BUILDARCH} (glibc)"
LABEL org.opencontainers.image.source=https://github.com/jarred-sumner/bun
FROM debian:bullseye-slim as bun-test-base
# Original creator:
# LABEL maintainer "Jessie Frazelle <jess@linux.com>"
# Install Chromium
# Yes, including the Google API Keys sucks but even debian does the same: https://packages.debian.org/stretch/amd64/chromium/filelist
RUN apt-get update && apt-get install -y \
chromium \
chromium-l10n \
fonts-liberation \
fonts-roboto \
hicolor-icon-theme \
libcanberra-gtk-module \
libexif-dev \
libgl1-mesa-dri \
libgl1-mesa-glx \
libpangox-1.0-0 \
libv4l-0 \
fonts-symbola \
bash \
make \
psmisc \
curl \
--no-install-recommends \
&& rm -rf /var/lib/apt/lists/* \
&& mkdir -p /etc/chromium.d/ \
&& /bin/echo -e 'export GOOGLE_API_KEY="AIzaSyCkfPOPZXDKNn8hhgu3JrA62wIgC93d44k"\nexport GOOGLE_DEFAULT_CLIENT_ID="811574891467.apps.googleusercontent.com"\nexport GOOGLE_DEFAULT_CLIENT_SECRET="kdloedMFGdGla2P1zacGjAQh"' > /etc/chromium.d/googleapikeys && \
curl -L https://deb.nodesource.com/setup_16.x | bash - && \
apt-get update && \
apt-get install -y nodejs npm

Dockerfile.musl

@@ -0,0 +1,139 @@
# This doesn't work
# Specifically: there are a number of crashes and segfaults when using musl
# The cause is likely related to differences in pthreads implementations
# It is not just the stack size thing. It's something more complicated and, importantly,
# there was no meaningful file size difference between musl and glibc.
# ARG BUILDARCH=aarch64
# ARG zig_base_image=ghcr.io/jarred-sumner/zig-linux-musl-${BUILDARCH}
# ARG webkit_base_image=ghcr.io/jarred-sumner/bun-webkit-musl-${BUILDARCH}
# FROM ${zig_base_image}:latest AS zig
# FROM ${webkit_base_image}:latest AS webkit
# FROM zig as bun_base
# COPY --from=webkit /webkit /webkit
# ENV PATH "/zig/bin:$PATH"
# ENV JSC_BASE_DIR=/webkit
# ENV LIB_ICU_PATH=/webkit/lib
# ENV BUN_DEPS_OUT_DIR /bun-deps
# ENV STATIC_MUSL_FLAG=-static
# ENV MIMALLOC_OVERRIDE_FLAG="-DMI_OVERRIDE=OFF"
# RUN apk add --no-cache nodejs npm go libtool autoconf pkgconfig automake ninja
# RUN mkdir -p $BUN_DEPS_OUT_DIR;
# WORKDIR /bun
# COPY Makefile /bun/Makefile
# FROM bun_base as mimalloc
# COPY src/deps/mimalloc /bun/src/deps/mimalloc
# RUN make mimalloc;
# FROM bun_base as zlib
# COPY src/deps/zlib /bun/src/deps/zlib
# RUN make zlib;
# FROM bun_base as libarchive
# COPY src/deps/libarchive /bun/src/deps/libarchive
# RUN make libarchive;
# FROM bun_base as boringssl
# COPY src/deps/boringssl /bun/src/deps/boringssl
# RUN make boringssl;
# FROM bun_base as picohttp
# COPY src/deps/picohttpparser /bun/src/deps/picohttpparser
# COPY src/deps/*.c /bun/src/deps/
# COPY src/deps/*.h /bun/src/deps/
# RUN make picohttp
# FROM bun_base as identifier_cache
# COPY src/js_lexer/identifier_data.zig /bun/src/js_lexer/identifier_data.zig
# COPY src/js_lexer/identifier_cache.zig /bun/src/js_lexer/identifier_cache.zig
# RUN make identifier-cache
# FROM bun_base as node_fallbacks
# COPY src/node-fallbacks /bun/src/node-fallbacks
# RUN make node-fallbacks
# FROM bun_base as prebuild
# WORKDIR /bun
# COPY ./src /bun/src
# COPY ./build.zig /bun/build.zig
# COPY ./completions /bun/completions
# COPY ./packages /bun/packages
# COPY ./build-id /bun/build-id
# COPY ./package.json /bun/package.json
# COPY ./misctools /bun/misctools
# COPY --from=mimalloc /bun-deps/*.o /bun-deps
# COPY --from=libarchive /bun-deps/*.a /bun-deps
# COPY --from=picohttp /bun-deps/*.o /bun-deps
# COPY --from=boringssl /bun-deps/*.a /bun-deps
# COPY --from=zlib /bun-deps/*.a /bun-deps
# COPY --from=node_fallbacks /bun/src/node-fallbacks /bun/src/node-fallbacks
# COPY --from=identifier_cache /bun/src/js_lexer/*.blob /bun/src/js_lexer/
# ENV ICU_FLAGS="-I/webkit/include/wtf $ICU_FLAGS"
# RUN apk add --no-cache chromium && npm install -g esbuild && make \
# jsc-bindings-headers \
# api \
# analytics \
# bun_error \
# fallback_decoder
# FROM prebuild as release
# ENV BUN_RELEASE_DIR /opt/bun
# ENV LIB_ICU_PATH /usr/lib
# RUN apk add icu-static icu-dev && mkdir -p $BUN_RELEASE_DIR; make release \
# copy-to-bun-release-dir
# FROM alpine:3.15 as bun
# COPY --from=release /opt/bun/bun /opt/bun/bin/bun
# ENV BUN_INSTALL /opt/bun
# ENV PATH /opt/bun/bin:$PATH
# LABEL org.opencontainers.image.title="bun - Linux ${BUILDARCH} (musl)"
# LABEL org.opencontainers.image.source=https://github.com/jarred-sumner/bun
# FROM release as test
# ENV PATH /opt/bun/bin:$PATH
# ENV PATH /bun/packages/bun-linux-aarch64:/bun/packages/bun-linux-x64:$PATH
# ENV BUN_INSTALL /opt/bun
# WORKDIR /bun
# COPY ./integration /bun/integration
# COPY ./integration/snippets/package-json-exports/_node_modules_copy /bun/integration/snippets/package-json-exports/_node_modules_copy
# CMD [ "bash", "-c", "npm install && cd /bun/integration/snippets && npm install && cd /bun && make copy-test-node-modules test-all"]
# # FROM bun

LATEST

@@ -1 +0,0 @@
1.1.21


@@ -1,73 +0,0 @@
Bun itself is MIT-licensed.
## JavaScriptCore
Bun statically links JavaScriptCore (and WebKit) which is LGPL-2 licensed. WebCore files from WebKit are also licensed under LGPL-2. Per LGPL-2:
> (1) If you statically link against an LGPL'd library, you must also provide your application in an object (not necessarily source) format, so that a user has the opportunity to modify the library and relink the application.
You can find the patched version of WebKit used by Bun here: <https://github.com/oven-sh/webkit>. If you would like to relink Bun with changes:
- `git submodule update --init --recursive`
- `make jsc`
- `zig build`
This compiles JavaScriptCore, compiles Bun's `.cpp` bindings for JavaScriptCore (which are the object files using JavaScriptCore), and outputs a new `bun` binary with your changes.
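For reference, here is that relink flow as a single shell session. This is a minimal sketch assuming a clone of the Bun repository with its submodules reachable; it is just the three commands listed above with commentary:
```bash
# Fetch the patched WebKit checkout and the other vendored sources.
git submodule update --init --recursive

# Build JavaScriptCore from the patched WebKit submodule.
make jsc

# Compile Bun's .cpp bindings against JavaScriptCore and link a new `bun` binary.
zig build
```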
## Linked libraries
Bun statically links these libraries:
| Library | License |
|---------|---------|
| [`boringssl`](https://boringssl.googlesource.com/boringssl/) | [several licenses](https://boringssl.googlesource.com/boringssl/+/refs/heads/master/LICENSE) |
| [`brotli`](https://github.com/google/brotli) | MIT |
| [`libarchive`](https://github.com/libarchive/libarchive) | [several licenses](https://github.com/libarchive/libarchive/blob/master/COPYING) |
| [`lol-html`](https://github.com/cloudflare/lol-html/tree/master/c-api) | BSD 3-Clause |
| [`mimalloc`](https://github.com/microsoft/mimalloc) | MIT |
| [`picohttp`](https://github.com/h2o/picohttpparser) | dual-licensed under the Perl License or the MIT License |
| [`zstd`](https://github.com/facebook/zstd) | dual-licensed under the BSD License or GPLv2 license |
| [`simdutf`](https://github.com/simdutf/simdutf) | Apache 2.0 |
| [`tinycc`](https://github.com/tinycc/tinycc) | LGPL v2.1 |
| [`uSockets`](https://github.com/uNetworking/uSockets) | Apache 2.0 |
| [`zlib-cloudflare`](https://github.com/cloudflare/zlib) | zlib |
| [`c-ares`](https://github.com/c-ares/c-ares) | MIT licensed |
| [`libicu`](https://github.com/unicode-org/icu) 72 | [license here](https://github.com/unicode-org/icu/blob/main/icu4c/LICENSE) |
| [`libbase64`](https://github.com/aklomp/base64/blob/master/LICENSE) | BSD 2-Clause |
| [`libuv`](https://github.com/libuv/libuv) (on Windows) | MIT |
| [`libdeflate`](https://github.com/ebiggers/libdeflate) | MIT |
| A fork of [`uWebsockets`](https://github.com/jarred-sumner/uwebsockets) | Apache 2.0 licensed |
| Parts of [Tigerbeetle's IO code](https://github.com/tigerbeetle/tigerbeetle/blob/532c8b70b9142c17e07737ab6d3da68d7500cbca/src/io/windows.zig#L1) | Apache 2.0 licensed |
## Polyfills
For compatibility reasons, the following packages are embedded into Bun's binary and injected if imported.
| Package | License |
|---------|---------|
| [`assert`](https://npmjs.com/package/assert) | MIT |
| [`browserify-zlib`](https://npmjs.com/package/browserify-zlib) | MIT |
| [`buffer`](https://npmjs.com/package/buffer) | MIT |
| [`constants-browserify`](https://npmjs.com/package/constants-browserify) | MIT |
| [`crypto-browserify`](https://npmjs.com/package/crypto-browserify) | MIT |
| [`domain-browser`](https://npmjs.com/package/domain-browser) | MIT |
| [`events`](https://npmjs.com/package/events) | MIT |
| [`https-browserify`](https://npmjs.com/package/https-browserify) | MIT |
| [`os-browserify`](https://npmjs.com/package/os-browserify) | MIT |
| [`path-browserify`](https://npmjs.com/package/path-browserify) | MIT |
| [`process`](https://npmjs.com/package/process) | MIT |
| [`punycode`](https://npmjs.com/package/punycode) | MIT |
| [`querystring-es3`](https://npmjs.com/package/querystring-es3) | MIT |
| [`stream-browserify`](https://npmjs.com/package/stream-browserify) | MIT |
| [`stream-http`](https://npmjs.com/package/stream-http) | MIT |
| [`string_decoder`](https://npmjs.com/package/string_decoder) | MIT |
| [`timers-browserify`](https://npmjs.com/package/timers-browserify) | MIT |
| [`tty-browserify`](https://npmjs.com/package/tty-browserify) | MIT |
| [`url`](https://npmjs.com/package/url) | MIT |
| [`util`](https://npmjs.com/package/util) | MIT |
| [`vm-browserify`](https://npmjs.com/package/vm-browserify) | MIT |
## Additional credits
- Bun's JS transpiler, CSS lexer, and Node.js module resolver source code is a Zig port of [@evanw](https://github.com/evanw)'s [esbuild](https://github.com/evanw/esbuild) project.
- Credit to [@kipply](https://github.com/kipply) for the name "Bun"!

Makefile (1567 changed lines)

File diff suppressed because it is too large.

README.md (3477 changed lines)

File diff suppressed because it is too large.


@@ -1,12 +0,0 @@
# Security Policy
## Supported Versions
| Version | Supported |
| ------- | ------------------ |
| 1.x.x | :white_check_mark: |
## Reporting a Vulnerability
Report any discovered vulnerabilities to the Bun team by emailing `security@bun.sh`. Your report will be acknowledged within 5 days, and a team member will be assigned as the primary handler. To the greatest extent possible, the security team will endeavor to keep you informed of the progress being made towards a fix and full announcement, and may ask for additional information or guidance surrounding the reported issue.


@@ -1,3 +0,0 @@
BUN=bun
DENO=deno
NODE=node

bench/.gitignore

@@ -1,2 +0,0 @@
ffi/src/target
ffi/src/*.node


@@ -1,13 +0,0 @@
```bash
npm install
bun run ffi
bun run log
bun run gzip
bun run async
bun run sqlite
# to use custom version of bun/deno/node binary
BUN=path/to/bun bun run ffi
# or edit .env file
```


@@ -1,3 +0,0 @@
BUN=bun
DENO=deno
NODE=node


@@ -1,30 +0,0 @@
// https://github.com/nodejs/node/issues/34493
import { AsyncLocalStorage } from "async_hooks";
const asyncLocalStorage = new AsyncLocalStorage();
// let fn = () => Promise.resolve(2).then(() => new Promise(resolve => queueMicrotask(resolve)));
let fn = () => /test/.test("test");
let runWithExpiry = async (expiry, fn) => {
let iterations = 0;
while (Date.now() < expiry) {
await fn();
iterations++;
}
return iterations;
};
console.log(`Performed ${await runWithExpiry(Date.now() + 1000, fn)} iterations to warmup`);
let withAls;
await asyncLocalStorage.run(123, async () => {
withAls = await runWithExpiry(Date.now() + 45000, fn);
console.log(`Performed ${withAls} iterations (with ALS enabled)`);
});
asyncLocalStorage.disable();
let withoutAls = await runWithExpiry(Date.now() + 45000, fn);
console.log(`Performed ${withoutAls} iterations (with ALS disabled)`);
console.log("ALS penalty: " + Math.round((1 - withAls / withoutAls) * 10000) / 100 + "%");


@@ -1,7 +0,0 @@
import { run, bench } from "mitata";
bench("sync", () => {});
bench("async", async () => {});
bench("await 1", async () => await 1);
await run();


@@ -1,7 +0,0 @@
import { run, bench } from "../node_modules/mitata/src/cli.mjs";
bench("sync", () => {});
bench("async", async () => {});
bench("await 1", async () => await 1);
await run();


@@ -1,7 +0,0 @@
import { run, bench } from "mitata";
bench("sync", () => {});
bench("async", async () => {});
bench("await 1", async () => await 1);
await run();


@@ -1,11 +0,0 @@
{
"name": "bench",
"scripts": {
"deps": "exit 0",
"build": "exit 0",
"bench:bun": "$BUN bun.js",
"bench:node": "$NODE node.mjs",
"bench:deno": "$DENO run -A --unstable deno.js",
"bench": "bun run bench:bun && bun run bench:node && bun run bench:deno"
}
}



@@ -1,171 +0,0 @@
# Based on https://raw.githubusercontent.com/github/gitignore/main/Node.gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Snowpack dependency directory (https://snowpack.dev/)
web_modules/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional stylelint cache
.stylelintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local
# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache
# Next.js build output
.next
out
# Nuxt.js build / generate output
.nuxt
dist
# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public
# vuepress build output
.vuepress/dist
# vuepress v2.x temp and cache directory
.temp
.cache
# Docusaurus cache and generated files
.docusaurus
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# Stores VSCode versions used for testing VSCode extensions
.vscode-test
# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*
esbuild


@@ -1,40 +0,0 @@
# Bundler benchmark
This is a performance benchmark of the following bundlers:
- Bun
- esbuild
- Parcel 2
- Rollup + Terser
- Webpack
It is an exact copy of [`esbuild`'s benchmark](https://github.com/evanw/esbuild/blob/main/Makefile), aside from the fact that Bun [has been added](https://github.com/colinhacks/esbuild/commit/1b928b7981aa7edfadf77fcf8931bb8d6f38cd96). The benchmark bundles 10 copies of the large [three.js](https://threejs.org/), with minification and source maps enabled.
To run the benchmark:
```sh
$ chmod +x run-bench.sh
$ ./run-bench.sh
```
Various output will be written to the console by each bundler. Scan through the results for lines that look like this underneath each bundler output:
```sh
real <number>
user <number>
sys <number>
```
These lines are generated by the `time` command, which is used to benchmark each build.
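As a hedged illustration of what to look for, the snippet below wraps a build in `time`. The bundler invocation is hypothetical (use whatever command `run-bench.sh` ends up running for the tool you are inspecting), and the numbers in the comments are placeholders:
```sh
# Hypothetical example: wrap one of the benchmarked build commands in `time`.
time ./node_modules/.bin/esbuild entry.js --bundle --minify --sourcemap --outfile=out.js

# `time` then prints the three lines this README refers to:
#   real    0m0.33s   # wall-clock duration; this is the number compared in the table below
#   user    0m0.52s   # CPU time spent in user mode (can exceed `real` on multiple cores)
#   sys     0m0.05s   # CPU time spent in the kernel
```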
## Results
The `real` results, as run on a 16-inch M1 MacBook Pro:
| Bundler | Time |
| ------- | ------ |
| Bun | 0.17s |
| esbuild | 0.33s |
| Rollup | 18.82s |
| Webpack | 26.21s |
| Parcel | 17.95s |



@@ -1 +0,0 @@
console.log("Hello via Bun!");


@@ -1,8 +0,0 @@
{
"name": "bundle",
"module": "index.ts",
"type": "module",
"devDependencies": {
"bun-types": "^0.7.0"
}
}


@@ -1,3 +0,0 @@
git clone git@github.com:colinhacks/esbuild.git
cd esbuild
make bench-three


@@ -1,20 +0,0 @@
{
"compilerOptions": {
"lib": [
"ESNext"
],
"module": "esnext",
"target": "esnext",
"moduleResolution": "bundler",
"strict": true,
"downlevelIteration": true,
"skipLibCheck": true,
"jsx": "react-jsx",
"allowSyntheticDefaultImports": true,
"forceConsistentCasingInFileNames": true,
"allowJs": true,
"types": [
"bun-types" // add Bun global
]
}
}


@@ -1,9 +0,0 @@
// works in both bun & node
import { readFileSync } from "node:fs";
const count = parseInt(process.env.ITERATIONS || "1", 10) || 1;
const arg = process.argv.slice(1);
// TODO: remove Buffer.from() when readFileSync() returns Buffer
for (let i = 0; i < count; i++) console.log(arg.map(file => Buffer.from(readFileSync(file, "utf8"))).join(""));


@@ -1,6 +0,0 @@
const fs = require("fs");
const path = require("path");
const input = path.resolve(process.argv[process.argv.length - 1]);
fs.createReadStream(input).pipe(process.stdout);


@@ -1,5 +0,0 @@
import path from "path";
const input = path.resolve(process.argv[process.argv.length - 2]);
const output = path.resolve(process.argv[process.argv.length - 1]);
await Bun.write(Bun.file(output), Bun.file(input));


@@ -1,4 +0,0 @@
import { createReadStream, createWriteStream } from "node:fs";
const arg = process.argv.slice(2);
createReadStream(arg[0]).pipe(createWriteStream(arg[1]));


@@ -1,34 +0,0 @@
import { copyFileSync, writeFileSync, readFileSync, statSync } from "node:fs";
import { bench, run } from "mitata";
function runner(ready) {
for (let size of [1, 10, 100, 1000, 10000, 100000, 1000000, 10000000]) {
const rand = new Int32Array(size);
for (let i = 0; i < size; i++) {
rand[i] = (Math.random() * 1024 * 1024) | 0;
}
const dest = `/tmp/fs-test-copy-file-${((Math.random() * 10000000 + 100) | 0).toString(32)}`;
const src = `/tmp/fs-test-copy-file-${((Math.random() * 10000000 + 100) | 0).toString(32)}`;
writeFileSync(src, Buffer.from(rand.buffer), { encoding: "buffer" });
const { size: fileSize } = statSync(src);
if (fileSize !== rand.byteLength) {
throw new Error("size mismatch");
}
ready(src, dest, new Uint8Array(rand.buffer));
}
}
runner((src, dest, rand) =>
bench(`copyFileSync(${rand.buffer.byteLength} bytes)`, () => {
copyFileSync(src, dest);
// const output = readFileSync(dest).buffer;
// for (let i = 0; i < output.length; i++) {
// if (output[i] !== rand[i]) {
// throw new Error(
// "Files are not equal" + " " + output[i] + " " + rand[i] + " " + i
// );
// }
// }
}),
);
await run();


@@ -1,5 +0,0 @@
import { copyFileSync } from "node:fs";
const arg = process.argv.slice(2);
copyFileSync(arg[0], arg[1]);


@@ -1,31 +0,0 @@
import EventEmitter3 from "eventemitter3";
import { group } from "mitata";
import EventEmitterNative from "node:events";
export const implementations = [
{
EventEmitter: EventEmitterNative,
name: process.isBun ? (EventEmitterNative.init ? "bun" : "C++") : "node:events",
monkey: true,
},
// { EventEmitter: EventEmitter3, name: "EventEmitter3" },
].filter(Boolean);
for (const impl of implementations) {
impl.EventEmitter?.setMaxListeners?.(Infinity);
}
export function groupForEmitter(name, cb) {
if (implementations.length === 1) {
return cb({
...implementations[0],
name: `${name}: ${implementations[0].name}`,
});
} else {
return group(name, () => {
for (let impl of implementations) {
cb(impl);
}
});
}
}


@@ -1,96 +0,0 @@
import { bench, run } from "mitata";
import { groupForEmitter } from "./implementations.mjs";
var id = 0;
groupForEmitter("single emit", ({ EventEmitter, name }) => {
const emitter = new EventEmitter();
emitter.on("hello", event => {
event.preventDefault();
});
bench(name, () => {
emitter.emit("hello", {
preventDefault() {
id++;
},
});
});
});
groupForEmitter("on x 10_000 (handler)", ({ EventEmitter, name }) => {
const emitter = new EventEmitter();
bench(name, () => {
var cb = event => {
event.preventDefault();
};
emitter.on("hey", cb);
var called = false;
for (let i = 0; i < 10_000; i++)
emitter.emit("hey", {
preventDefault() {
id++;
called = true;
},
});
if (!called) throw new Error("not called");
});
});
// for (let { impl: EventEmitter, name, monkey } of []) {
// if (monkey) {
// var monkeyEmitter = Object.assign({}, EventEmitter.prototype);
// monkeyEmitter.on("hello", event => {
// event.preventDefault();
// });
// bench(`[monkey] ${className}.emit`, () => {
// var called = false;
// monkeyEmitter.emit("hello", {
// preventDefault() {
// id++;
// called = true;
// },
// });
// if (!called) {
// throw new Error("monkey failed");
// }
// });
// bench(`[monkey] ${className}.on x 10_000 (handler)`, () => {
// var cb = () => {
// event.preventDefault();
// };
// monkeyEmitter.on("hey", cb);
// for (let i = 0; i < 10_000; i++)
// monkey.emit("hey", {
// preventDefault() {
// id++;
// },
// });
// monkeyEmitter.off("hey", cb);
// });
// }
// }
// var target = new EventTarget();
// target.addEventListener("hello", event => {});
// bench("EventTarget.dispatch", () => {
// target.dispatchEvent(event);
// });
// var hey = new Event("hey");
// bench("EventTarget.on x 10_000 (handler)", () => {
// var handler = event => {};
// target.addEventListener("hey", handler);
// for (let i = 0; i < 10_000; i++) target.dispatchEvent(hey);
// target.removeEventListener("hey", handler);
// });
await run();

Some files were not shown because too many files have changed in this diff.