Compare commits

...

211 Commits

Author SHA1 Message Date
Ashcon Partovi
d20a145f00 Fix syntax error from workflow [no ci] 2024-04-19 14:05:15 -07:00
Ashcon Partovi
abaa505374 Set git config when running tests on Windows [no ci] 2024-04-19 13:50:23 -07:00
Jarred Sumner
c5270f8121 Fix typo 2024-04-18 22:57:36 -07:00
Jarred Sumner
c4d69146c8 [doc] Disambiguate the title 2024-04-18 22:46:43 -07:00
Jarred Sumner
2ce83953e5 [docs] Add a few more guides 2024-04-18 22:20:29 -07:00
Jarred Sumner
b6aebb58c2 [doc] Add guide for streaming with async iterators 2024-04-18 22:09:09 -07:00
Jarred Sumner
26428d5e1c [CI] Fixup windows 2024-04-18 20:52:37 -07:00
Jarred Sumner
05ff620d4d [Ci] Always cancel-in-progress 2024-04-18 19:12:51 -07:00
Jarred Sumner
e134ed253f [CI] Use bigger windows runners 2024-04-18 18:55:09 -07:00
Jarred Sumner
f663472d5f [CI] Normalize filepaths relative to cwd in output 2024-04-18 18:52:47 -07:00
Jarred Sumner
6f67c63873 Formatting tweak 2024-04-18 18:44:49 -07:00
Ashcon Partovi
f460d39298 Increase timeout for tests 2024-04-18 17:19:49 -07:00
Dylan Conway
246df1f43e check without .exe (#10362) 2024-04-18 17:03:38 -07:00
Ashcon Partovi
213461adc6 Fix Discord message on test failure 2024-04-18 13:05:23 -07:00
Jarred Sumner
a78668eb4c fix(bundler): fix a crash while computing character frequencies
* Fixes #10344

* Update bundler_compile.test.ts

* Apply formatting changes

* Track comments when bundling

* Fix embedded files and add test

* Make this const

* Update runner.node.mjs

* Prefill process arch/platform in bun build --compile

* nitpick

---------

Co-authored-by: Jarred-Sumner <Jarred-Sumner@users.noreply.github.com>
Co-authored-by: dave caruso <me@paperdave.net>
2024-04-18 12:35:59 -07:00
Ashcon Partovi
452dd68253 Fix comment not upserting 2024-04-18 10:56:54 -07:00
Ashcon Partovi
de7985b5a6 Fix unzip location 2024-04-17 19:02:46 -07:00
Ashcon Partovi
a1f86bf3f3 Add temporary SSH into workflow 2024-04-17 19:00:00 -07:00
Ashcon Partovi
df23f18461 Fix glob that unzips bun 2024-04-17 18:58:10 -07:00
Ashcon Partovi
1d7f80c73c Use always() to maybe fix trigger 2024-04-17 18:54:58 -07:00
Ashcon Partovi
6d6b2e8bc5 Add ability to manually trigger tests 2024-04-17 18:52:10 -07:00
Ashcon Partovi
accfff0271 Attempt to fix download artifact issue 2024-04-17 18:28:28 -07:00
Ashcon Partovi
aa1174df69 Probably fix permissions issues with CI 2024-04-17 17:51:41 -07:00
Jarred Sumner
997f57b97f Fix generate comment in CI 2024-04-17 17:41:59 -07:00
Jarred Sumner
074205d963 Fix generate comment in CI 2024-04-17 17:37:33 -07:00
Ashcon Partovi
cfce166a9b Use different GitHub action to download Bun 2024-04-17 17:13:39 -07:00
Ashcon Partovi
13f0188fec Allow concurrent CI runs on main, but only cancel-in-progress if not-main 2024-04-17 16:29:48 -07:00
Ashcon Partovi
97761cba67 Fix GIT_SHA not being populated in builds 2024-04-17 16:23:50 -07:00
Ashcon Partovi
492211f499 Tweak comment from PRs 2024-04-17 16:08:04 -07:00
Jarred Sumner
192577141b Update 6-crash-report.yml 2024-04-17 15:53:24 -07:00
Jarred Sumner
6e71dca5c2 Tweak crash report template 2024-04-17 15:52:09 -07:00
dave caruso
c99d7ed331 feat: overhaul the crash handler (#10203)
* some work

* linux things

* linux things

* feat: tracestrings on Windows

* bwaa

* more work on the crash handler

* okay

* adgadsgbcxcv

* ya

* dsafds

* a

* wuh

* a

* bru h

* ok

* yay

* window

* alright

* oops

* yeah

* a

* a

* OOM handling

* fix on window
2024-04-17 15:32:25 -07:00
Ashcon Partovi
f764c1233b Fix permissions in workflows, part 2 2024-04-17 15:09:26 -07:00
Ashcon Partovi
20d8261405 Fix permissions in workflows 2024-04-17 15:07:54 -07:00
Ashcon Partovi
a7273802a8 Debug comment workflow 2024-04-17 14:18:25 -07:00
Ashcon Partovi
303bf4d9f1 Fix comment workflow 2024-04-17 11:47:05 -07:00
Ashcon Partovi
d4c31d3c9e Maybe fix test workflow 2024-04-17 11:36:40 -07:00
Ashcon Partovi
d5e6ff4c97 Fix artifact uploads for canary builds 2024-04-17 10:04:57 -07:00
liudonghua
51bb5f3a04 Update platform.ts to fix isWindowsAVX2 implementation. (#10313)
The isWindowsAVX2 function is not working as expected because the stdout ends with `\r\n`, so the simple `stdout == "True"` comparison is never true.
2024-04-17 00:26:57 -07:00
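The fix described in that commit comes down to trimming the trailing `\r\n` before comparing. A minimal TypeScript sketch of the idea; the PowerShell command is a hypothetical stand-in, not the actual query used in platform.ts:

```ts
import { execSync } from "node:child_process";

// Hypothetical command: platform.ts asks PowerShell whether AVX2 is available.
const stdout = execSync(`powershell -Command "Get-Avx2Support"`).toString();

// On Windows the output ends with "\r\n", so `stdout === "True"` never matches.
// Trimming first makes the comparison reliable.
const isWindowsAVX2 = stdout.trim() === "True";
```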
Ashcon Partovi
fdaa01287a Maybe fix Windows tests 2024-04-16 22:53:42 -07:00
Ashcon Partovi
f8a28ad37e Probably fix comment workflow 2024-04-16 22:12:48 -07:00
Ashcon Partovi
c18c25f390 Testing workflows (#10157)
* Testing workflows

* Testing workflows

* Testing workflows

* Testing workflows

* Testing workflows

* Testing workflows

* Update .github/workflows/run-test.yml

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-16 19:39:06 -07:00
Stefano
3df202f91f Fix(windows) search correct path for esbuild(.exe|.cmd) (#10302) 2024-04-16 15:43:47 -07:00
Meghan Denny
df190815df node: convert remaining js packages to ts (#10289) 2024-04-16 15:42:24 -07:00
Meghan Denny
2ae48f3314 lint: ban comparing against undefined in Zig (#10288) 2024-04-16 14:37:09 -07:00
Jarred Sumner
291a39bd3f Do not run shell tests outside test scope (#10199)
* Do not run tests outside test scope

* Fix tests

* Fix type errors and remove potentially precarious uses of unreachable

* yoops

* Remove all instances of "Ruh roh"

---------

Co-authored-by: Zack Radisic <56137411+zackradisic@users.noreply.github.com>
2024-04-16 14:03:02 -07:00
Jarred Sumner
fbe2fe0c3f Bump 2024-04-15 21:32:17 -07:00
Dylan Conway
5a81dc8e33 fix(install): fix dependency install order (#10240)
* packages wait for parent trees before install

* use `directoryExistsAt`

* missing increment

* fix faccessat

* swap destination and target

* update

* force

* only create destination dir before installing package

* fix windows symlink/junction

* increment, false on extracting

* done
2024-04-15 18:29:34 -07:00
Georgijs
24a411f904 Correctly handle duplicate column names in sqlite joins (#10285)
* add tests

* working

* cleanup

* fix compile

* fix naming and comment

* fix lints in test

* apply suggested fixes
2024-04-15 14:02:28 -07:00
Jarred Sumner
3f10d5250a [bun:sqlite] Support sqlite3_file_control, better closing behavior, implement Disposable (#10262)
* [bun:sqlite] Support `sqlite_file_control`, better closing behavior, support `using` statements

* docs+flaky test

* Simplify the implementation
2024-04-15 13:06:30 -07:00
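The `using` support mentioned in this commit comes from the TC39 explicit-resource-management proposal: the database exposes `Symbol.dispose`, so it is closed automatically when the block exits. A minimal sketch, assuming the disposer simply calls `close()`:

```ts
import { Database } from "bun:sqlite";

{
  // Closed automatically when this block exits; no manual db.close() needed.
  using db = new Database(":memory:");
  db.run("CREATE TABLE logs (msg TEXT)");
  db.run("INSERT INTO logs (msg) VALUES ('hello')");
}
```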
Jarred Sumner
dd6beb66d8 [doc] Simplify this guide slightly 2024-04-15 07:45:07 -07:00
Grigory
233624b6ff fix(which/windows): ignore file extension case (#10102)
* fix(which/windows): ignore file extension case

* feat(which): add test for `endsWithExtension` fun

* Revert "feat(which): add test for `endsWithExtension` fun"

This reverts commit fb3ad51de7.

* add test

---------

Co-authored-by: Georgijs <48869301+gvilums@users.noreply.github.com>
2024-04-15 05:03:44 -07:00
Georgijs
fdcc844027 fix path resolution for writeFile in nodefs (#10179)
* fix path resolution for writeFile in nodefs

* add test

* [autofix.ci] apply automated fixes

* use force copy

* fix build

* fix test on windows

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-15 05:02:09 -07:00
Ciro Spaciari
74d91f6b51 feat(SSL_renegociate) (#10256)
* allow client renegotiation and allow server renegotiation with limits matching nodejs behavior

* wip before the refactoring and context separation

* investigate if BoringSSL can send a SSL_renegotiate request or only accept

* format-off

* option to disable server renegotiation

* allow tls options on https

* dead_socket when connectError

* propagate cert error

* test

* move the logic to the right place

* cleanup

* Update test/js/node/tls/renegotiation.test.ts

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Georgijs <48869301+gvilums@users.noreply.github.com>
2024-04-15 05:00:41 -07:00
Jarred Sumner
9f81a6268e Fixes #10259 (#10260) 2024-04-14 03:07:18 -07:00
Jarred Sumner
7b5065c1c9 Use internal setup-bun action
We do not want metrics to come from internal usage. CC @Electroid please remember this going forward.
2024-04-13 05:03:37 -07:00
Jarred Sumner
21ad40e86c Allow SSL negotiation for clients (#10239)
@lithdew
2024-04-13 02:43:10 -07:00
Jarred Sumner
c59f49385f Make Command.Context a pointer (#10237) 2024-04-13 01:53:31 -07:00
Jarred Sumner
8d49a3ee37 Better way to check if a directory exists (#10235)
* Better way to check if a directory exists

* Update sys.zig

* Fix windows build

* Add missing file
2024-04-12 22:19:29 -07:00
Ciro Spaciari
f6b9c0c909 fix(socket) fix error in case of failure/returning a error in the open handler (#10154)
* fix socket

* one more test

* always clean callback on deinit

* Update src/bun.js/api/bun/socket.zig

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>

* make context close private

* keep old logic

* move clean step to SocketContext.close

* add comment

* wait for close on stop

* cleanup

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-12 20:58:45 -07:00
Jarred Sumner
65d8288d81 Revert "fix create with github URL on windows (#10231)" (#10236)
This reverts commit 1820d08d25.
2024-04-12 20:55:51 -07:00
Ciro Spaciari
4627af5893 fix(stream) fix http body-stream sending duplicate data (#10221)
* some fixes

* cleanup

* more complete test

* fix test + use same server

* opsie

* incremental steps
2024-04-12 19:58:13 -07:00
Dylan Conway
176af5cf58 reachable errors (#10190)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-12 19:02:45 -07:00
Georgijs
1820d08d25 fix create with github URL on windows (#10231)
* correctly ignore error on windows to match posix behavior

* replace zig accessat with bun.sys.existsAt

* fix posix build
2024-04-12 17:12:44 -07:00
Georgijs
472bd6c7de Allow fs.close with no callback (#10229)
* allow fs.close to only take one argument

* add test

* fix tests on windows
2024-04-12 17:11:58 -07:00
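A small sketch of what the change above permits: calling `fs.close` with only a file descriptor, matching Node's optional-callback behavior.

```ts
import { openSync, close } from "node:fs";

const fd = openSync("package.json", "r");
// Previously Bun required the callback argument here; after this change it
// can be omitted, as in Node.
close(fd);
```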
Georgijs Vilums
d785d30eaf return correct error code on overlong paths 2024-04-12 15:52:06 -07:00
Georgijs
22d6227a3a fix wrong truncation on fs.writeFileSync with fd argument (#10225)
* fix wrong truncate

* close fd in test

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-12 13:02:24 -07:00
Meghan Denny
70023bc4ed linter: ergonomics and new rules (#10197)
* linter: allow a trailing field

* linter: dont fail if no matches are found

* lint: only import 'bun' once

* lint: ban std.mem.indexOfAny

* linter: ignore commented out code and ignore benchmarks

* this was testing nothing

* lint: ban std.debug.print

* this wasnt testing anything either
2024-04-11 22:26:23 -07:00
Jarred Sumner
19da72fe34 Truncate failing output in internal test runner 2024-04-11 21:30:07 -07:00
Meghan Denny
7d673dd7d8 node:child_process: fix propagation of windowsHide and windowsVerbatimArguments option (#10193) 2024-04-11 20:24:47 -07:00
Meghan Denny
ca98138936 add 'build:windows' package.json script for easier local dev (#10194) 2024-04-11 20:24:03 -07:00
Jarred Sumner
d00b5b94ea Make receiving data over TCP faster on Windows (#10191) 2024-04-11 20:10:09 -07:00
Georgijs
ff5ef512c7 correctly handle --cwd flag (#10187)
* actually change cwd on posix

* add test

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-11 19:22:29 -07:00
Georgijs
545cb546cc feat(cli): --filter flag (#8185)
* Skeleton code for `bun run --workspace`

* Update run_command.zig

* implement directory traversal to find workspace root

* finish --workspace implementation

* clean up changes in run_command.zig

* add workspace tests, update harness to handle nested dirs

* [autofix.ci] apply automated fixes

* basic filtering

* [autofix.ci] apply automated fixes

* working filter without patterns

* update tests, filter mostly working

* simplify package name parsing, commit tests

* support filter even without workspace setup

* move filter arg handling to separate source file

* use bun.sys.chdir, match root package for scripts

* fix exit code handling

* ignore node_modules and directories starting with . in --filter

* progress converting --filter to use iterators

* convert filtering to use iterators

* cleanup

* implement DirEntry access method for glob (currently crashing)

* cleanup and fixes

* run js files in subprocess when filter flag passed

* clean up dead code

* fix fd leak in run_command.zig

* [autofix.ci] apply automated fixes

* fix issues after merge

* use posix-spawn in runBinary, fix resource PATH variable resource leak

* move filter argument to runtime category

* fix test harness

* add js and binary tests to filter-workspace

* [autofix.ci] apply automated fixes

* fix compile after merge

* [autofix.ci] apply automated fixes

* clean up filter-workspace test

* [autofix.ci] apply automated fixes

* fixes to running binaries

* fix actually setting cwd_override

* windows fixes

* address some review comments

* handle malformed JSON

* add various tests

* [autofix.ci] apply automated fixes

* update docs for filter

* [autofix.ci] apply automated fixes

* reset tinycc commit

* filtered run prototype

* make pretty

* implement abort handler (not working)

* make prettier

* prep for windows

* windows path and printing fixes

* implement log-style output (not tui)

* fix issues when logging to file

* revert a bunch of unnecessary changes

* cleanup

* implement dependency order execution

* detect  circular dependencies, fix cancel hang

* Fix `$PATH`

* ignore dep order on loop, stream on linux, sort pkgs

* support pre and post scripts

* add more filter tests, print elapsed time

* enable 'bun --filter' without run

* fix harness after merge

* [autofix.ci] apply automated fixes

* print number of scripts we're waiting for

* update docs, fix windows build

* fix tests on windows

* [autofix.ci] apply automated fixes

* fix uninitialized memory

* use terminal synchronized update sequences

* Add skip list

* Preallocate

* Use current bun in tests

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-11 19:06:50 -07:00
Jarred Sumner
688844b472 refactor: ban std.debug.assert (#10168)
* Ban `std.debug.assert`

* Create .clangd

* Update lint.yml

* Update linter.ts

* update

* lint

* Update linter.ts

* Update linter.ts

* update

* Update linter.ts

* update

* Update linter.ts

* more

* Update install.zig

* words

* Remove UB
2024-04-11 17:52:29 -07:00
Georgijs
0f10d4f1be correctly ignore error on windows to match posix behavior (#10186) 2024-04-11 15:10:52 -07:00
Georgijs
c2ac5d4d18 fix example in bun add help text (#10185) 2024-04-11 15:10:33 -07:00
David Ferguson
edeb75a84a Reference .exists() in File-IO Docs (#9957)
* add mention of .exists()

* show that the exists method returns a promise in the docs

* remove unnecessary white space

* update type ref to show that exists returns a promise
2024-04-11 13:24:49 -07:00
Dale Seo
57208cb02e fix typos (#10131) 2024-04-10 12:32:47 -07:00
Evan Shortiss
5b4b6931c4 docs: add guide for neon serverless postgres driver (#10126) 2024-04-10 05:25:27 -07:00
Jarred Sumner
257f4c1b3e Bump zig std lib 2024-04-10 05:21:05 -07:00
Jarred Sumner
459bcdc5ac Concurrent uninstalls (#10111)
* Concurrent uninstalls

* Try disabling concurrency

* Get `rm` tests to pass on Windows

* Fix more things

* Undisable concurrency

* handle error

* Deflake

* [autofix.ci] apply automated fixes

* Undo

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-10 05:09:14 -07:00
Jarred Sumner
f5c8914c8a Re-sync URL from WebKit + set ERR_MISSING_ARGS (#10129)
* Update URL from WebKit

* Set `ERR_MISSING_ARGS` code on all Error objects from C++

* Fix the `code`

* [autofix.ci] apply automated fixes

* Micro optimize URL

* [autofix.ci] apply automated fixes

* Update url.mjs

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-09 23:21:11 -07:00
Jarred Sumner
1e20f618c9 [bundler] Do not generate sourceContents for non-javascript assets (#10140) 2024-04-09 23:18:09 -07:00
Meghan Denny
cd52f42148 windows: fs/promises: fix when writing to file opened in append mode (#10134)
* windows: fs/promises: fix when writing to file opened in append mode

* add default values since we're using one now

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-09 22:57:10 -07:00
Jarred Sumner
21fc1f7295 Add test for #10132 (#10136)
* Add test for #10132

* Update 010132.test.ts

* Update 010132.test.ts
2024-04-09 22:51:22 -07:00
Meghan Denny
e209ae81dd meta: ensure there's a single 'bun' import per file in zig (#10137)
* meta: ensure there's a single 'bun' import per file in zig

* undo this change in codegen
2024-04-09 22:41:07 -07:00
Meghan Denny
698d0f7c87 fix vscode json handling in prettier (#10133) 2024-04-09 20:42:42 -07:00
Zack Radisic
baf0d7c40f shell: Fix escaped newlines and add more tests (#10122)
* Fix multiline args and add more tests

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-09 17:25:10 -07:00
Dale Seo
81d021794e stdout and sterr are in err (#10130) 2024-04-09 16:42:23 -07:00
John-David Dalton
769d7a1680 fix: null is not an object at readableStreamCancel (#10091)
* fix: null is not an object at readableStreamCancel

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-09 14:02:23 -07:00
Jarred Sumner
df49a5a8e4 Upgrade AbortSignal & AbortController to latest from WebKit (#10106)
Fixes https://github.com/oven-sh/bun/issues/9977
Closes https://github.com/oven-sh/bun/pull/10086

Thank you @lithdew for investigating and for most of the fixes. This adds more of the changes we missed from WebKit into Bun, like the ability to follow other signals.

Co-authored-by: Kenta Iwasaki <63115601+lithdew@users.noreply.github.com>
2024-04-09 00:49:13 -07:00
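"The ability to follow other signals" most likely refers to the standard `AbortSignal.any()` behavior pulled in from WebKit, where a derived signal aborts as soon as any of its source signals abort; that reading is an assumption based on the PR description. A short example of that API:

```ts
const user = new AbortController();
const timeout = new AbortController();

// Aborts as soon as either source signal aborts.
const signal = AbortSignal.any([user.signal, timeout.signal]);
signal.addEventListener("abort", () => {
  console.log("aborted:", signal.reason);
});

timeout.abort(new Error("timed out"));
```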
Meghan Denny
0dc0919119 vscode: dont hide submodules from file tree (#10104) 2024-04-08 23:54:53 -07:00
Meghan Denny
e30a848c4c vscode/settings: force prettier to use workspace config file 2024-04-08 19:02:40 -07:00
Dylan Conway
c739c4adeb unset ENABLE_VIRTUAL_TERMINAL_INPUT (#10089) 2024-04-08 18:19:52 -07:00
Grigory
00933d597a docs(contributing): add link to guide for windows (#10095)
* docs(contributing): add link to guide for windows

* fix broken link

---------

Co-authored-by: dave caruso <me@paperdave.net>
2024-04-08 15:43:40 -07:00
Jarred Sumner
5baa2fbb87 Use a different cache dir in each test file 2024-04-08 07:39:21 -07:00
Jarred Sumner
2615dc742e Partial fix for #10028 (#10030)
* Partial fix for #10028

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-08 07:32:43 -07:00
Jarred Sumner
a4e8534779 Use a separate cache dir for this test 2024-04-08 07:31:05 -07:00
Jarred Sumner
9380e99e2b Fixes #9952 (#10069)
* Fixes #9952

* Update CMakeLists.txt

* Update CMakeLists.txt

* linux

* isolate this test
2024-04-08 07:27:07 -07:00
Yoav Balasiano
9898e0a731 Improve bun shell docs examples (#10052) 2024-04-08 06:48:21 -07:00
Juan Pablo Rinaldi
ad6aadf7b2 Fix coverage documentation (#10059) 2024-04-08 06:47:43 -07:00
Jarred Sumner
ee05bae2be Make bun install 60% faster on Windows, improve reliability, reduce memory usage (#10037)
* [bundows] Make bun install 60% faster

* [autofix.ci] apply automated fixes

* Do not keep node_modules folder open between async tasks. Make sure we call runTasks on every event loop wakeup.

* Update install.zig

* Fix deadlock

* Make that deadlock impossible

* a little less repetitive

* Fix test failure with local tarball

* Get those tests to pass

* Normalize absolutely

* lets see how many times we call GetFinalPathNameByHandle

* Workaround https://github.com/ziglang/zig/issues/19586

https://github.com/ziglang/zig/issues/19586

* Is the dev-server-100 test failure a hash table collision?

* Give it its own cache dir

* We cannot change the git task ids

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-08 06:25:24 -07:00
Jarred Sumner
d615c11a57 Force non-zero exit code whenever bun install has any packages which failed to install (#10041)
* If any failed to install, always exit with non-zero

* [autofix.ci] apply automated fixes

* This test should fail

* Update bun-link.test.ts

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-07 16:12:16 -07:00
Jarred Sumner
3679f69b70 Disable assertion on Windows 2024-04-06 15:19:52 -07:00
Tomer Horowitz
0b0bf353fa docs: Updated Bun.nanoseconds documentation (#9986) 2024-04-06 02:47:47 -07:00
Giorgio Bellisario
c4847f464e fix typo: missing "v" prefix on installed Bun version (#9941) 2024-04-05 19:17:32 -07:00
Jarred Sumner
c8d072c2a9 Fixes #9978 (#9995) 2024-04-05 17:42:34 -07:00
dave caruso
f014f35531 fix(windows): use bun.spawnSync for bun upgrade + different check for bun (#10006)
* small changes

* [autofix.ci] apply automated fixes

* fxifsdahjfkdsahjk

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-05 17:30:32 -07:00
Meghan Denny
fd3cd05647 shell: more builtin commands (#9908)
* remove asString and improve fromString

* make writeNoIO return Maybe

* shell: add builtin command 'yes'

* shell: add builtin command 'seq'

* shell: yes+seq: fix usage string

* shell: add builtin command 'dirname'

* shell: add builtin command 'basename'

* add more tests

* update shell docs with list of commands

* add 'bun exec' launch configurations

* fix AsyncDeinitReader name

* fix 'yes' command IO

* shell: rewrite 'bun' to 'bun-debug' when self is bun-debug

* make the docs not lie about bun being a shell builtin

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-05 16:29:50 -07:00
Jarred Sumner
20085d8ddc Merge conflict fix 2024-04-05 15:41:45 -07:00
Jarred Sumner
5735feac5d Redo file watcher + Fix EBUSY when saving lockfile on Windows (#9972)
* Fix `EBUSY` when saving lockfile on Windows

* Redo file watcher wrapper on Windows

* Update lockfile.zig

* Update win_watcher.zig

* Update src/bun.js/node/node_fs.zig

Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>

* Add retry logic

* Comments

* more careful

* smaller

* Fix garbage

* Normalize the paths

* hmmm

* [autofix.ci] apply automated fixes

* try

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-05 15:40:05 -07:00
dave caruso
4ba993be7e fix(install.ps1): change cpu check (#9921) 2024-04-05 15:35:46 -07:00
sitiom
0b2bb1fdc1 docs: add scoop installation method (#9818)
* docs: add scoop installation method

* Update installation.md

* Add upgrade and uninstall instructions

* Update installation.md

* add ps prompt to code blocks

---------

Co-authored-by: dave caruso <me@paperdave.net>
2024-04-05 14:55:19 -07:00
Dylan Conway
b29cf75a24 fix(install): allow installing without lockfile with --production (#9923)
* check for not_found lockfile load result

* Fix tests

* update tests

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-05 14:54:57 -07:00
Eric L. Goldstein
05fb044577 Add types for node:util styleText() (#9945) 2024-04-05 13:31:50 -07:00
Jarred Sumner
8825b29529 bump webkit (#9997)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-05 12:09:27 -07:00
Dylan Conway
182b90896f fix(parser): handle empty type parameters and conditional union/intersection bugfix (#9964)
* tests

* allow_empty_type_parameters

* pass options through unions and intersections
2024-04-05 00:45:43 -07:00
Zack Radisic
40e33da4b4 Fixes (#9940) 2024-04-04 19:46:53 -07:00
Jarred Sumner
f393f8a065 bun install launch.json 2024-04-04 19:31:10 -07:00
Jarred Sumner
a09c421f2a ```sh-diff doesn't work 2024-04-04 08:48:51 -07:00
Jarred Sumner
ca1dbb4eb2 Revert "remove ENABLE_VIRTUAL_TERMINAL_INPUT (#9913)" (#9935)
This reverts commit 06ec233ebe.
2024-04-04 07:34:33 -07:00
Jarred Sumner
8a3b6f0439 Fixes #6730 (#9930)
* Fixes #6730

* Fix test

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-04 07:26:53 -07:00
Jarred Sumner
e7d8abb263 Don't recommend something that doesn't work on windows 2024-04-04 04:51:53 -07:00
Dylan Conway
013bc79f62 ignore EndOfStream error (#9926) 2024-04-04 04:31:47 -07:00
Jarred Sumner
8326235ecc Ask for fewer permissions when opening directories (#9928)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-04 04:18:15 -07:00
Jarred Sumner
7543bf936a Add missing HasStaticPropertyTable structure flag 2024-04-04 01:36:18 -07:00
Dylan Conway
06ec233ebe remove ENABLE_VIRTUAL_TERMINAL_INPUT (#9913) 2024-04-04 00:30:20 -07:00
dave caruso
0cdad4bebb fix(bun-release): support windows in npm package (#9873)
* fix npm install on windows

* try again

* again

* copy less file

* revert changes

* remove package.json from git

* okay

* now?
2024-04-03 23:16:48 -07:00
Ashcon Partovi
14c23cc429 Define BUN_INSTALL_BIN in Dockerfiles
Fixes #8753
2024-04-04 13:24:53 +09:00
Ashcon Partovi
0dfbdc711a Remove python3 from slim and alpine Dockerfiles to match Node.js 2024-04-04 13:22:41 +09:00
Ashcon Partovi
3cde2365ea Do not format Dockerfiles 2024-04-04 13:22:11 +09:00
dave caruso
3cfb2816ac docs 2024-04-03 21:11:15 -07:00
Jarred Sumner
c8f5c9f29c Fixes #9851 (#9886)
* Fixes #9851

* Fix

* Fix
2024-04-03 21:02:02 -07:00
Jarred Sumner
00f27fbeec Get bunx tests to pass on Windows (#9729)
* Get bunx tests to pass on Windows

* wip

* WIP

* wip

* wip

* ads

* asdf

* makeOpenPath

* almost revert

* fix build

* enoent

* fix bun install git repos

* cleanup

* use custom zig stdlib from submodule

* update dockerfile to copy zig stdlib sources

* fix dockerfile, update gitmodules

* fix dockerfile

* fix build

* fix build

* fix symlinkat

* fix build

* fix build

* Remove usages of unreachable

* Fixup

* Fixup

* wip

* fixup

* Fix one of the bugs

* asd

* Normalize BUN_INSTALL_CACHE_DIR var

* Set iterable to false when we're about to delete

* Update bun.zig

* I still can't repro this outside CI

* i think that fixes it?

* fix posix compile

* factor out directory creation

* update all install methods to use InstallDirState

* move walker creation to init function

* fix error cleanup

* fix posix compile

* all install tests pass locally

* cleanup

* [autofix.ci] apply automated fixes

* Fix posix regressions

---------

Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Meghan Denny <hello@nektro.net>
Co-authored-by: Georgijs Vilums <georgijs.vilums@gmail.com>
Co-authored-by: Georgijs <48869301+gvilums@users.noreply.github.com>
Co-authored-by: Georgijs Vilums <georgijs@bun.sh>
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-04-03 20:53:28 -07:00
Jarred Sumner
76795af695 Fixes https://github.com/oven-sh/bun/issues/9807 (#9875)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-03 20:27:50 -07:00
Zack Radisic
a4b151962a feat: Support subshells in Bun shell (#9905)
* fix #9823

* subshell

* Refactor a bit and add a lot of tests

* delete random code

* make tests pass on windows

* Cleanup

* add sharp test

* Resolve comments

---------

Co-authored-by: Georgijs Vilums <georgijs.vilums@gmail.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-03 20:27:20 -07:00
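A minimal sketch of the subshell syntax this PR adds to Bun shell; the `src` path is only illustrative:

```ts
import { $ } from "bun";

// Parentheses run the grouped commands in a subshell (the feature added here),
// so the `cd` inside does not change the outer working directory.
await $`(cd src && pwd) && pwd`;
```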
Ciro Spaciari
1cde9bcdac fix(server) fix body-stream (#9898)
* some fixes

* WIP

* wip wip

* more debug

* closer

* sending a really big payload at once is still broken

* now we need to avoid segfault happening inside onWritable after destroy

* opsie

* cleanup

* more cleanup

* more WIP, closer need to fix cork

* fix cork actually not writing non-optional data

* make onWritable return actually do something

* actually clean the on writable handler

* remove unreachable condition

* we are not looping anymore

* little revert

* fix possible fault

* inform backpressure on chunked encoding

* just queue when tryEnd

* remove unreachable code
2024-04-03 20:25:05 -07:00
Eric L. Goldstein
0bd7265e8f Remove documentation references to environment variable inlining because the bundler does not do so (#9901) 2024-04-03 18:14:20 -07:00
Dylan Conway
c831dd8db8 Upgrade webkit (#9885)
* span

* remove JSStringIsEqualToString

* bump webkit tag

* span literal

* undo

* fix windows build

* Update JSStringDecoder.cpp

* Update JSStringDecoder.cpp

* Update JSStringDecoder.cpp

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-03 17:10:39 -07:00
Jarred Sumner
390441327f Fixes #9778 (#9834)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-03 02:47:31 -07:00
Jarred Sumner
2e0e9f135b Fixes #9878 (#9883)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-03 02:47:13 -07:00
Jarred Sumner
36f1bd3694 Truncate source lines in error messages (#9832)
* Truncate source lines in error messages

* Update .prettierignore

* trim

* fix

* try

* fix

* 1 more time

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-03 02:46:10 -07:00
Ciro Spaciari
289d23b377 fix(uws) uWS uintmax_t > uint64_t (#9866)
* wip check this on Posix, probably better to use fixed types on uWS instead of uintmax_t here

* uintmax_t > u64
2024-04-03 01:57:33 -07:00
Meghan Denny
bb483e8479 shell: implement $0, $1, argv accessors (#9740)
* shell: organize imports

* shell: dont allocate when printing errors

* shell: implement $0, $1, argv accessors

* add more tests

* oops need this commit too

* make these logs listen to silencing logs

* expand switch else statements

* align behavior with bash

* this isnt referenced anywhere

* add missing test file

* add another test

* revert this change

* cache utf8 converted version of positionals

* rebase fixes

---------

Co-authored-by: Georgijs Vilums <georgijs.vilums@gmail.com>
2024-04-02 23:07:27 -07:00
Meghan Denny
268f13765c ci: windows: use bun install (#9730)
* ci: windows: use bun install

* run the workflow

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-02 17:57:44 -07:00
Georgijs Vilums
801e475c72 update readme again 2024-04-01 09:37:48 -07:00
Georgijs Vilums
a073c85fdb update readme to include windows install command 2024-04-01 09:35:03 -07:00
dave caruso
8cb9f59753 update installation docs 2024-04-01 09:30:47 -07:00
dave caruso
5903a61410 Bun 1.1 2024-04-01 08:57:05 -07:00
dave caruso
b4941cdb0c not canary 2024-04-01 08:55:21 -07:00
Jarred Sumner
58417217d6 Tweak cleanup code in PipeReader for files (#9746)
* WIP: some fixes and improvements

* cleanup

* WIP: some fixes and improvements

* cleanup

* dont pause

---------

Co-authored-by: cirospaciari <ciro.spaciari@gmail.com>
2024-04-01 08:53:39 -07:00
Jarred Sumner
2d57f25637 Bump 2024-04-01 08:52:24 -07:00
cirospaciari
83a99bf190 revert 2024-04-01 12:14:47 -03:00
cirospaciari
e2ffa66bf7 dont pause 2024-04-01 12:12:44 -03:00
Meghan Denny
8980dc026d shell: fix crash in 'ls' and other misc improvements (#9772)
* shell: ls: fix crash when passing argument

* shell: pwd: output was missing newline

* shell: exit: output was missing newline

* shell: pwd: make sure output goes to proper stdout/stderr

* add test ensuring all those work

* fix build error

* fix

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: dave caruso <me@paperdave.net>
2024-04-01 07:54:42 -07:00
Jarred Sumner
4192728592 fix build (#9773)
Co-authored-by: cirospaciari <ciro.spaciari@gmail.com>
2024-04-01 07:11:02 -07:00
Meghan Denny
bdfbcb1898 use bun shell for lifecycle scripts on windows [v3] (#9771)
* these comments were redundant

* better windows support here

* slightly better error message

* didnt realize this variable already existed

* fix node-gyp shim script

* move 'windows bin linking shim should work' to its own file

* run all lifecycle scripts on windows with bun shell

* tidy

* clean imports

* this seemed missing

* remove these comments

* fix the shim again

* fix posix release ensureTempNodeGypScript

* revert this change, it was correct before
2024-04-01 06:48:44 -07:00
Dylan Conway
6e07f9477c fix(streams): don't lose bytes on drain (#9768)
* fix

* clear

* update

* test

* fix test

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-04-01 05:23:47 -07:00
dave caruso
2dd2fc6ed0 install script 2024-04-01 04:02:26 -07:00
dave caruso
9e6e8b0234 feat(runtime): align import.meta.resolve with node.js's implementation (#5827)
* works

* works

* a

* fix zig compiler error

* fix things

* [autofix.ci] apply automated fixes

* a

* not done

* finish this

* [autofix.ci] apply automated fixes

* self check

* delete committed generated file (#9717)

* Fix bug with PipeWriter (#9714)

* fix!: do not lookup cwd in which (#9691)

* do not lookup cwd in which

* fix webkit submodule

* fix compilation on linux

* feedback

* default process.env.NODE_ENV to undefined (#9695)

* small changes

* [autofix.ci] apply automated fixes

* fix(windows) fix node-stream tests/ windows file reader/writer (#9718)

* fix canceled onFileRead

* report continue errors and fix closing

* also fix pipe writer

* avoid possible memory leaks

* Propagate errors in open

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>

* posix failures

* windows fixes

* avoid using c++ labels

* add a resolver

* fix compile test AGAIN

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Meghan Denny <hello@nektro.net>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2024-04-01 02:21:34 -07:00
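After this change, `import.meta.resolve` behaves like Node's: it is synchronous and returns a URL string for the resolved specifier without loading it. A small sketch; the specifier is illustrative:

```ts
// Resolves relative to the importing module, without importing the target.
const url = import.meta.resolve("./config.ts");
console.log(url); // e.g. "file:///home/me/project/config.ts"
```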
Zack Radisic
d53e6d6323 feat: Shell if-else, conditional expressions, running commands in background (#9631)
* rename conditional -> binary

* Parse if clauses

* `if` works

* Conditional expressions

* Support If clause condition and branches multi-statements

* cond expr tests

* more

* Fix parse tests

* `&` commands

* clean up

* Make it compile for windows

* Fix test

* Remove If/Else/Elif/Then/Fi tokens

* Fix parsing ambiguities

* Resolve some comments

* More tests fix bugs

* Fix parsing and add more tests ported from GNU bash

* Fix `&`on left side of `&&` error message

* leak test fix hopefully

* todo some tests because `wait` is not implemented

* Disable background commands for now

* Resolve additional comments

* Fix merge conflicts

* Fix broken tests from merge

* Add `==` and `!=` and fix parsing bug

* wow

* fix 09401 test failing... forgot to update `this.inlined.len`

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-04-01 02:07:52 -07:00
Ashcon Partovi
1edacc6e49 Prepare npm i -g bun for Windows 2024-04-01 18:00:16 +09:00
dave caruso
81badbac4c fix(ipc): add json ipc type + buffer incoming messages until a listener is attached. (#8733)
* fix a few ipc issues

* a

* my own revisions

* remove none as a valid type

* a

* fix windows build

* remove comment

* make it work !!!!!!!!

* a

* formatter nonsense

* blah

* huge update refactor

* awa

* wow

* okay
2024-04-01 01:51:15 -07:00
Zack Radisic
7531bfbfe0 add bun exec (#9762)
* add `bun exec`

* Add tests for writing a lot of data for bun exec

* Resolve some comments

* fix on windows
2024-04-01 00:57:19 -07:00
Georgijs
1a989c9ad2 ref tls socket on upgrade (#9766)
Co-authored-by: Zack Radisic <zack@theradisic.com>
2024-03-31 21:32:25 -07:00
Chawye Hsu
ab7825cca5 windows: fix bun pm bin -g path not added complaining (#9763)
Signed-off-by: Chawye Hsu <su+git@chawyehsu.com>
2024-03-31 17:55:36 -07:00
dave caruso
f02752577b fix: which should use cwd if given a relative filepath (#9761)
* Revert "fix!: do not lookup cwd in which (#9691)"

This reverts commit 4869ebff24.

* fix which implementation to be more accurate

* t

* which tests windows

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-31 16:50:16 -07:00
dave caruso
c177e054f5 feat!: shell will now throw on error by default (#9720)
* make the shell throw by default

* make shell default to throws(true)

* ok

* mv tests

* a

* a

* [autofix.ci] apply automated fixes

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-31 16:13:59 -07:00
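With this breaking change, a non-zero exit code makes the awaited shell promise reject instead of resolving silently. A minimal sketch, assuming `.nothrow()` as the per-command opt-out:

```ts
import { $ } from "bun";

try {
  await $`exit 1`; // now rejects by default on a non-zero exit code
} catch (err) {
  console.error("command failed:", err);
}

// Opt out for a single command and inspect the exit code yourself:
const { exitCode } = await $`exit 1`.nothrow();
console.log(exitCode); // 1
```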
dave caruso
a01b01ae72 chore!: enable 1.1's breaking changes (#9724)
* root scripts in foreground

* ignore if silent

* test for breaking changes

* move back to installPackages

* [autofix.ci] apply automated fixes

* boolean variable, comptime, 1_1_0

* flip the 1.1 flag

* add for the next batch of breakings

* make it buidl

* enable breaking changes tests

* fix version fmt

* silent node-gyp

* comment change

---------

Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2024-03-31 16:12:59 -07:00
dave caruso
456a32344e windows: skip cleaning up old binary (#9696) 2024-03-31 16:12:21 -07:00
Jarred Sumner
6164fac256 Revert "pipe.signal.ptr == subprocess.stdin, not subprocess"
This reverts commit 4bbcc39d2f.
2024-03-30 22:55:28 -07:00
Jarred Sumner
4bbcc39d2f pipe.signal.ptr == subprocess.stdin, not subprocess 2024-03-30 22:10:26 -07:00
Meghan Denny
62c8c97e24 add test.todoIf and fix bun-install-registry.test.ts on windows (#9723)
* bun:test: implement test.todoIf and describe.todoIf

* fix bun-install-registry.test.ts and mark some as todo

* add even more tests

* remove todoIf from this file

* [autofix.ci] apply automated fixes

* fix regression

* this extra expect was incorrect

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-03-30 21:25:05 -07:00
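A minimal sketch of the new accessor, assuming `todoIf` follows the same curried shape as the existing `skipIf`; the condition is illustrative:

```ts
import { test, expect } from "bun:test";

// Marked as todo only when the condition is true; runs normally elsewhere.
test.todoIf(process.platform === "win32")("symlinks work", () => {
  expect(1 + 1).toBe(2);
});
```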
Jarred Sumner
eb708d34ae Fixes #9748 (#9751)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-30 21:23:34 -07:00
Jarred Sumner
c3ba60eef5 Fixes #9739 (#9752)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-30 20:57:40 -07:00
Meghan Denny
7f71f10ad1 import-meta.test.js: isolate the query param test into separate cases for esm and cjs (#9750)
* import-meta.test.js: isolate the query param test into separate cases for esm and cjs

* make name more accurate
2024-03-30 20:09:01 -07:00
Jarred Sumner
9939049b85 Fixes #5319 (#9745)
* Fixes #5319

* Make this test better

* another test

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-30 20:08:34 -07:00
Meghan Denny
a5c5b5dc61 console-iterator.test.ts: add a case that only uses latin1 characters (#9749) 2024-03-30 18:16:10 -07:00
Ciro Spaciari
a2835ef098 fix(websockets) fix socket/websockets (#9645)
* repro

* cleanup

* avoid shutdownRead on SSL

* still dont fix

* more

* some ssl

* cleanup

* handle shutdown

* make actually pass the tests

* fix STATUS_STACK_BUFFER_OVERRUN?

* revert some, cleanup fetch.tls.test

* make clear why we need on_handshake when closing

* more

* revert

* cleanup

* cleanup + less Bun.gc

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-03-30 16:32:19 -07:00
Jarred Sumner
31c4c59740 Make duplicate simultaneous bun install work better (#9738)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-30 14:06:37 -07:00
Jarred Sumner
0248e3c2b7 Add NODE_API_EXPERIMENTAL_NOGC_ENV_OPT_OUT=1 (#9742)
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-30 14:03:52 -07:00
Jarred Sumner
d869fcee21 Fixes #7896 (#9712)
* Fixes #7896

* Update ws.test.ts

* Delete the old one

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-30 04:27:00 -07:00
dave caruso
55f8ae5aea feat(windows): properly implement setRawMode (#9734)
* setRawMode rewrite for Windows

* work on posix using old approach

* [autofix.ci] apply automated fixes

* no print

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-03-30 04:06:23 -07:00
Meghan Denny
e414d107e6 ci: windows: show all failing files (#9736)
* ci: windows: show all failing files

* fix workflow variables

* fix workflow v2
2024-03-30 02:00:24 -07:00
Meghan Denny
0103e2df73 windows: pass bunshell.test.ts (#9733) 2024-03-30 01:58:28 -07:00
Jarred Sumner
02ad501f9e Add missing globs 2024-03-29 23:40:13 -07:00
saklani
d433a1ada0 fix: Define missing crypto.constants defined on Node (#9511)
* define crypto.constants

* requested changes

* fix: missing jsNumber wrap

---------

Co-authored-by: Georgijs <48869301+gvilums@users.noreply.github.com>
2024-03-29 21:53:52 -07:00
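The constants in question are the numeric OpenSSL/engine flags Node exposes on `crypto.constants`. A quick check of the kind of value this commit fills in; the specific constant is just an example:

```ts
import { constants } from "node:crypto";

// Flags like these are plain numbers mirroring OpenSSL's #defines.
console.log(typeof constants.SSL_OP_NO_TLSv1); // "number"
```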
dave caruso
d712254128 internal: remove secret hidden internals and introduce new way to call native code from js (#8166)
* oooooh magic

* stuff

* run format

* ok

* yippee

* run the formatter back

* finish things up

* fix webkit

* more

* [autofix.ci] apply automated fixes

* fix compile

* fix compilation on windows, it seems to not work though :(

* update

* a

* v

* ok

* [autofix.ci] apply automated fixes

* OOPS

* bump bun to reduce ci bugs

* a

* js2native is done!

* improve array binding

* rebase

* some final stuff

* wasi fixes

* os

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-29 21:47:11 -07:00
Meghan Denny
a500c69728 shell: implement 'true' and 'false' builtin commands (#9728) 2024-03-29 21:36:13 -07:00
Dylan Conway
d30b53591f fix(napi): fix finalizer callback (#9732)
* fix finalize callback

* fix test
2024-03-29 21:33:48 -07:00
Meghan Denny
b8389f32ce shell: add 'exit' builtin command (#9705)
* shell: add 'exit' builtin command

* remove loop here

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-03-29 18:20:44 -07:00
Dylan Conway
7172013a72 fix(windows): use extended max path prefix for hardlinks during install (#9721)
* uncomment code

* use GetFinalPathNameByHandleW

* add packages with large names

* delete

* test large package name
2024-03-29 18:13:39 -07:00
Jarred Sumner
8ff7ee03d2 stdio tweaks (#9726) 2024-03-29 18:11:47 -07:00
dave caruso
5296c26dab fix bunx-bins verdaccio package (#9697)
* fix bunx-bins verdaccio package

* env suck

* [autofix.ci] apply automated fixes

* ugh

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-29 17:55:06 -07:00
Jarred Sumner
da6826e2b7 Unmark known failing 2024-03-29 17:49:29 -07:00
Jarred Sumner
a637b4c880 Unmark known failing 2024-03-29 17:49:03 -07:00
Ciro Spaciari
d9074dfa5d fix(windows) fix node-stream tests/ windows file reader/writer (#9718)
* fix canceled onFileRead

* report continue errors and fix closing

* also fix pipe writer

* avoid possible memory leaks

* Propagate errors in open

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2024-03-29 16:58:55 -07:00
dave caruso
ba9834d746 default process.env.NODE_ENV to undefined (#9695) 2024-03-29 16:42:50 -07:00
dave caruso
4869ebff24 fix!: do not lookup cwd in which (#9691)
* do not lookup cwd in which

* fix webkit submodule

* fix compilation on linux

* feedback
2024-03-29 16:42:17 -07:00
Jarred Sumner
a9804a3a11 Fix bug with PipeWriter (#9714) 2024-03-29 16:11:24 -07:00
Meghan Denny
6bedc23992 delete committed generated file (#9717) 2024-03-29 15:53:11 -07:00
dave caruso
093e9c2499 ci: does this fix the windows build (#9715)
* does this fix the windows build

* [autofix.ci] apply automated fixes

* a

* enable tar oop

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-29 15:20:13 -07:00
dave caruso
3047c9005e fix: add a better error message for fetch when it fails with an unknown code (#9663)
* add a better error message for fetch when it fails with an unknown code

* Update src/bun.js/webcore/response.zig

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>

* [autofix.ci] apply automated fixes

* fix compilation

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-29 14:18:11 -07:00
Jarred Sumner
e80e61c9a3 Allow 0-length ArrayBuffer & Blob in Bun.spawn stdio (#9557)
Co-authored-by: Zack Radisic <zack@theradisic.com>
2024-03-29 13:51:45 -07:00
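A sketch of the case this change allows: passing a zero-length Blob as stdin instead of having it rejected. The spawned command is illustrative:

```ts
// An empty Blob as stdin is now accepted by Bun.spawn's stdio options.
const proc = Bun.spawn(["cat"], {
  stdin: new Blob([]),
  stdout: "pipe",
});

const text = await new Response(proc.stdout).text();
console.log(JSON.stringify(text)); // ""
```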
Meghan Denny
e3bf906127 memoize all calls to selfExePath (#9703)
* memoize all calls to selfExePath

* Fix threadsafety issue

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-29 13:29:42 -07:00
Walter
4e7ed173ef fix!: remove worker from default conditions (#9256)
Co-authored-by: Walter Blacke <walter.blacke@vegabyte.studio>
2024-03-29 13:22:42 -07:00
Jarred Sumner
31befad163 Workaround for #9041 (#9580)
* Workaround for #9041

* Fix crash with auto install

* Fixup this test

* Update 09041.test.ts

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2024-03-29 13:17:12 -07:00
Meghan Denny
94b01b2f45 test: pm: don't delete temporary directories (#9649) 2024-03-29 12:29:50 -07:00
PondWader
9ecb691380 Fix URL.canParse.length (#9710)
* Fix URL.canParse.length

* Add URL.canParse.length test

---------

Co-authored-by: John-David Dalton <john.david.dalton@gmail.com>
2024-03-29 12:21:49 -07:00
Meghan Denny
fb8a299765 shell: windows: make EnvMap case-insensitive (#9704)
* shell: windows: make EnvMap case-insensitive

* [autofix.ci] apply automated fixes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2024-03-29 12:21:09 -07:00
657 changed files with 26689 additions and 14067 deletions

.clangd (new file, +3)

@@ -0,0 +1,3 @@
Index:
  Background: Skip # Disable slow background indexing of these files.

@@ -0,0 +1,27 @@
name: Prefilled crash report
description: Report a crash in Bun
labels:
  - bug
  - crash
body:
  - type: markdown
    attributes:
      value: |
        Thank you for submitting a crash report. It helps make Bun better.
  - type: textarea
    attributes:
      label: How can we reproduce the crash?
      description: Please provide instructions on how to reproduce the crash.
  - type: textarea
    id: logs
    attributes:
      label: Relevant log output
      description: Please copy and paste any relevant log output. This will be
        automatically formatted into code, so no need for backticks.
      render: shell
  - type: textarea
    id: remapped_trace
    attributes:
      label: Stack Trace (bun.report)
    validations:
      required: true

@@ -30,7 +30,7 @@ runs:
*) os=windows;;
esac
case "$(uname -m)" in
-arm64 | aarch64) arch=arm64;;
+arm64 | aarch64) arch=aarch64;;
*) arch=x64;;
esac
case "${{ inputs.baseline }}" in

.github/workflows/build-darwin.yml (new file, +307)

@@ -0,0 +1,307 @@
name: Build Darwin
permissions:
contents: read
actions: write
on:
workflow_call:
inputs:
runs-on:
type: string
default: macos-12-large
tag:
type: string
required: true
arch:
type: string
required: true
cpu:
type: string
required: true
assertions:
type: boolean
canary:
type: boolean
no-cache:
type: boolean
env:
LLVM_VERSION: 16
BUN_VERSION: 1.1.2
jobs:
build-submodules:
name: Build Submodules
runs-on: ${{ inputs.runs-on }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
sparse-checkout: |
.gitmodules
src/deps
scripts
- name: Hash Submodules
id: hash
run: |
print_versions() {
git submodule | grep -v WebKit
echo "LLVM_VERSION=${{ env.LLVM_VERSION }}"
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "hash=$(print_versions | shasum)" >> $GITHUB_OUTPUT
- if: ${{ !inputs.no-cache }}
name: Restore Cache
id: cache
uses: actions/cache/restore@v4
with:
path: ${{ runner.temp }}/bun-deps
key: bun-${{ inputs.tag }}-deps-${{ steps.hash.outputs.hash }}
# TODO: Figure out how to cache homebrew dependencies
- if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
name: Install Dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install \
llvm@${{ env.LLVM_VERSION }} \
ccache \
rust \
pkg-config \
coreutils \
libtool \
cmake \
libiconv \
automake \
openssl@1.1 \
ninja \
gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
name: Clone Submodules
run: |
./scripts/update-submodules.sh
- name: Build Submodules
if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
env:
CPU_TARGET: ${{ inputs.cpu }}
BUN_DEPS_OUT_DIR: ${{ runner.temp }}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
./scripts/all-dependencies.sh
- name: Save Cache
if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
uses: actions/cache/save@v4
with:
path: ${{ runner.temp }}/bun-deps
key: ${{ steps.cache.outputs.cache-primary-key }}
- name: Upload bun-${{ inputs.tag }}-deps
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-deps
path: ${{ runner.temp }}/bun-deps
if-no-files-found: error
build-cpp:
name: Build C++
runs-on: ${{ inputs.runs-on }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
submodules: recursive
# TODO: Figure out how to cache homebrew dependencies
- name: Install Dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install \
llvm@${{ env.LLVM_VERSION }} \
ccache \
rust \
pkg-config \
coreutils \
libtool \
cmake \
libiconv \
automake \
openssl@1.1 \
ninja \
gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- if: ${{ !inputs.no-cache }}
name: Restore Cache
uses: actions/cache@v4
with:
path: ${{ runner.temp }}/ccache
key: bun-${{ inputs.tag }}-cpp-${{ hashFiles('Dockerfile', 'Makefile', 'CMakeLists.txt', 'build.zig', 'scripts/**', 'src/**', 'packages/bun-usockets/src/**', 'packages/bun-uws/src/**') }}
restore-keys: |
bun-${{ inputs.tag }}-cpp-
- name: Compile
env:
CPU_TARGET: ${{ inputs.cpu }}
SOURCE_DIR: ${{ github.workspace }}
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{ runner.temp }}/bun-deps
CCACHE_DIR: ${{ runner.temp }}/ccache
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_LTO=ON \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
chmod +x compile-cpp-only.sh
./compile-cpp-only.sh -v
- name: Upload bun-${{ inputs.tag }}-cpp
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
if-no-files-found: error
build-zig:
name: Build Zig
uses: ./.github/workflows/build-zig.yml
with:
os: darwin
only-zig: true
tag: ${{ inputs.tag }}
arch: ${{ inputs.arch }}
cpu: ${{ inputs.cpu }}
assertions: ${{ inputs.assertions }}
canary: ${{ inputs.canary }}
no-cache: ${{ inputs.no-cache }}
link:
name: Link
runs-on: ${{ inputs.runs-on }}
needs:
- build-submodules
- build-cpp
- build-zig
steps:
- uses: actions/checkout@v4
# TODO: Figure out how to cache homebrew dependencies
- name: Install Dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install \
llvm@${{ env.LLVM_VERSION }} \
ccache \
rust \
pkg-config \
coreutils \
libtool \
cmake \
libiconv \
automake \
openssl@1.1 \
ninja \
gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Download bun-${{ inputs.tag }}-deps
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-deps
path: ${{ runner.temp }}/bun-deps
- name: Download bun-${{ inputs.tag }}-cpp
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download bun-${{ inputs.tag }}-zig
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-zig
path: ${{ runner.temp }}/release
- if: ${{ !inputs.no-cache }}
name: Restore Cache
uses: actions/cache@v4
with:
path: ${{ runner.temp }}/ccache
key: bun-${{ inputs.tag }}-cpp-${{ hashFiles('Dockerfile', 'Makefile', 'CMakeLists.txt', 'build.zig', 'scripts/**', 'src/**', 'packages/bun-usockets/src/**', 'packages/bun-uws/src/**') }}
restore-keys: |
bun-${{ inputs.tag }}-cpp-
- name: Link
env:
CPU_TARGET: ${{ inputs.cpu }}
CCACHE_DIR: ${{ runner.temp }}/ccache
run: |
SRC_DIR=$PWD
mkdir ${{ runner.temp }}/link-build
cd ${{ runner.temp }}/link-build
cmake $SRC_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_LTO=ON \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{ runner.temp }}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
- name: Prepare
run: |
cd ${{ runner.temp }}/link-build
chmod +x bun-profile bun
mkdir -p bun-${{ inputs.tag }}-profile/ bun-${{ inputs.tag }}/
mv bun-profile bun-${{ inputs.tag }}-profile/bun-profile
mv bun bun-${{ inputs.tag }}/bun
zip -r bun-${{ inputs.tag }}-profile.zip bun-${{ inputs.tag }}-profile
zip -r bun-${{ inputs.tag }}.zip bun-${{ inputs.tag }}
- name: Upload bun-${{ inputs.tag }}
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}
path: ${{ runner.temp }}/link-build/bun-${{ inputs.tag }}.zip
if-no-files-found: error
- name: Upload bun-${{ inputs.tag }}-profile
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-profile
path: ${{ runner.temp }}/link-build/bun-${{ inputs.tag }}-profile.zip
if-no-files-found: error
on-failure:
if: ${{ github.repository_owner == 'oven-sh' && failure() }}
name: On Failure
needs: link
runs-on: ubuntu-latest
steps:
- name: Send Message
uses: sarisia/actions-status-discord@v1
with:
webhook: ${{ secrets.DISCORD_WEBHOOK }}
nodetail: true
color: "#FF0000"
title: ""
description: |
### ❌ [${{ github.event.pull_request.title }}](${{ github.event.pull_request.html_url }})
@${{ github.actor }}, the build for bun-${{ inputs.tag }} failed.
**[View logs](${{ github.event.workflow_run.html_url }})**

.github/workflows/build-linux.yml (new file, +64)

@@ -0,0 +1,64 @@
name: Build Linux
permissions:
contents: read
actions: write
on:
workflow_call:
inputs:
runs-on:
type: string
required: true
tag:
type: string
required: true
arch:
type: string
required: true
cpu:
type: string
required: true
assertions:
type: boolean
zig-optimize:
type: string
canary:
type: boolean
no-cache:
type: boolean
jobs:
build:
name: Build Linux
uses: ./.github/workflows/build-zig.yml
with:
os: linux
only-zig: false
runs-on: ${{ inputs.runs-on }}
tag: ${{ inputs.tag }}
arch: ${{ inputs.arch }}
cpu: ${{ inputs.cpu }}
assertions: ${{ inputs.assertions }}
zig-optimize: ${{ inputs.zig-optimize }}
canary: ${{ inputs.canary }}
no-cache: ${{ inputs.no-cache }}
on-failure:
if: ${{ github.repository_owner == 'oven-sh' && failure() }}
name: On Failure
needs: build
runs-on: ubuntu-latest
steps:
- name: Send Message
uses: sarisia/actions-status-discord@v1
with:
webhook: ${{ secrets.DISCORD_WEBHOOK }}
nodetail: true
color: "#FF0000"
title: ""
description: |
### ❌ [${{ github.event.pull_request.title }}](${{ github.event.pull_request.html_url }})
@${{ github.actor }}, the build for bun-${{ inputs.tag }} failed.
**[View logs](${{ github.event.workflow_run.html_url }})**
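Because build-linux.yml is only triggered via workflow_call, it is meant to be invoked from a parent workflow rather than run on its own. A minimal caller sketch, assuming a hypothetical ci.yml; the runner label and CPU value are illustrative, not taken from this diff:

jobs:
  linux-x64:
    # Hypothetical caller of the reusable workflow defined above
    uses: ./.github/workflows/build-linux.yml
    secrets: inherit   # the on-failure job above references secrets.DISCORD_WEBHOOK
    with:
      runs-on: namespace-profile-bun-ci-linux-x64  # assumed runner label
      tag: linux-x64
      arch: x64
      cpu: haswell       # assumed CPU target
      canary: true
      no-cache: false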

327
.github/workflows/build-windows.yml vendored Normal file

@@ -0,0 +1,327 @@
name: Build Windows
permissions:
contents: read
actions: write
on:
workflow_call:
inputs:
runs-on:
type: string
default: windows-latest
tag:
type: string
required: true
arch:
type: string
required: true
cpu:
type: string
required: true
assertions:
type: boolean
canary:
type: boolean
no-cache:
type: boolean
env:
# Must specify exact version of LLVM for Windows
LLVM_VERSION: 16.0.6
BUN_VERSION: 1.1.2
jobs:
build-submodules:
name: Build Submodules
runs-on: ${{ inputs.runs-on }}
steps:
- name: Setup Git
run: |
git config --global core.autocrlf false
git config --global core.eol lf
- name: Checkout
uses: actions/checkout@v4
with:
sparse-checkout: |
.gitmodules
src/deps
scripts
- name: Hash Submodules
id: hash
run: |
$data = "$(& {
git submodule | Where-Object { $_ -notmatch 'WebKit' }
echo "LLVM_VERSION=${{ env.LLVM_VERSION }}"
Get-Content -Path (Get-ChildItem -Path 'scripts/build*.ps1', 'scripts/all-dependencies.ps1', 'scripts/env.ps1' | Sort-Object -Property Name).FullName | Out-String
echo 1
})"
$hash = ( -join ((New-Object -TypeName System.Security.Cryptography.SHA1CryptoServiceProvider).ComputeHash([System.Text.Encoding]::UTF8.GetBytes($data)) | ForEach-Object { $_.ToString("x2") } )).Substring(0, 10)
echo "hash=${hash}" >> $env:GITHUB_OUTPUT
- if: ${{ !inputs.no-cache }}
name: Restore Cache
id: cache
uses: actions/cache/restore@v4
with:
path: bun-deps
key: bun-${{ inputs.tag }}-deps-${{ steps.hash.outputs.hash }}
- if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
name: Install LLVM
uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
name: Install Ninja
run: |
choco install -y ninja
- if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
name: Clone Submodules
run: |
.\scripts\update-submodules.ps1
- if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
name: Build Dependencies
env:
CPU_TARGET: ${{ inputs.cpu }}
CCACHE_DIR: ccache
run: |
.\scripts\env.ps1 ${{ contains(inputs.tag, '-baseline') && '-Baseline' || '' }}
Invoke-WebRequest -Uri "https://www.nasm.us/pub/nasm/releasebuilds/2.16.01/win64/nasm-2.16.01-win64.zip" -OutFile nasm.zip
Expand-Archive nasm.zip (mkdir -Force "nasm")
$Nasm = (Get-ChildItem "nasm")
$env:Path += ";${Nasm}"
$env:BUN_DEPS_OUT_DIR = (mkdir -Force "./bun-deps")
.\scripts\all-dependencies.ps1
- name: Save Cache
if: ${{ inputs.no-cache || !steps.cache.outputs.cache-hit }}
uses: actions/cache/save@v4
with:
path: bun-deps
key: ${{ steps.cache.outputs.cache-primary-key }}
- name: Upload bun-${{ inputs.tag }}-deps
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-deps
path: bun-deps
if-no-files-found: error
codegen:
name: Codegen
runs-on: ubuntu-latest
steps:
- name: Setup Git
run: |
git config --global core.autocrlf false
git config --global core.eol lf
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Codegen
run: |
./scripts/cross-compile-codegen.sh win32 x64
- if: ${{ inputs.canary }}
name: Calculate Revision
run: |
echo "canary_revision=$(GITHUB_TOKEN="${{ github.token }}"
bash ./scripts/calculate-canary-revision.sh --raw)" > build-codegen-win32-x64/.canary_revision
- name: Upload bun-${{ inputs.tag }}-codegen
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-codegen
path: build-codegen-win32-x64
if-no-files-found: error
build-cpp:
name: Build C++
needs: codegen
runs-on: ${{ inputs.runs-on }}
steps:
- name: Setup Git
run: |
git config --global core.autocrlf false
git config --global core.eol lf
- name: Checkout
uses: actions/checkout@v4
with:
submodules: recursive
- name: Install LLVM
uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- name: Install Ninja
run: |
choco install -y ninja
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- if: ${{ !inputs.no-cache }}
name: Restore Cache
uses: actions/cache@v4
with:
path: ccache
key: bun-${{ inputs.tag }}-cpp-${{ hashFiles('Dockerfile', 'Makefile', 'CMakeLists.txt', 'build.zig', 'scripts/**', 'src/**', 'packages/bun-usockets/src/**', 'packages/bun-uws/src/**') }}
restore-keys: |
bun-${{ inputs.tag }}-cpp-
- name: Download bun-${{ inputs.tag }}-codegen
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-codegen
path: build
- name: Compile
env:
CPU_TARGET: ${{ inputs.cpu }}
CCACHE_DIR: ccache
run: |
# $CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
$CANARY_REVISION = 0
.\scripts\env.ps1 ${{ contains(inputs.tag, '-baseline') && '-Baseline' || '' }}
.\scripts\update-submodules.ps1
.\scripts\build-libuv.ps1 -CloneOnly $True
cd build
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
-DNO_CODEGEN=1 `
-DNO_CONFIGURE_DEPENDS=1 `
"-DCANARY=${CANARY_REVISION}" `
-DBUN_CPP_ONLY=1 ${{ contains(inputs.tag, '-baseline') && '-DUSE_BASELINE_BUILD=1' || '' }}
if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
.\compile-cpp-only.ps1 -v
if ($LASTEXITCODE -ne 0) { throw "C++ compilation failed" }
- name: Upload bun-${{ inputs.tag }}-cpp
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-cpp
path: build/bun-cpp-objects.a
if-no-files-found: error
build-zig:
name: Build Zig
uses: ./.github/workflows/build-zig.yml
with:
os: windows
zig-optimize: ReleaseSafe
only-zig: true
tag: ${{ inputs.tag }}
arch: ${{ inputs.arch }}
cpu: ${{ inputs.cpu }}
assertions: ${{ inputs.assertions }}
canary: ${{ inputs.canary }}
no-cache: ${{ inputs.no-cache }}
link:
name: Link
runs-on: ${{ inputs.runs-on }}
needs:
- build-submodules
- build-cpp
- build-zig
- codegen
steps:
- name: Setup Git
run: |
git config --global core.autocrlf false
git config --global core.eol lf
- name: Checkout
uses: actions/checkout@v4
with:
submodules: recursive
- name: Install LLVM
uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- name: Install Ninja
run: |
choco install -y ninja
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Download bun-${{ inputs.tag }}-deps
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-deps
path: bun-deps
- name: Download bun-${{ inputs.tag }}-cpp
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-cpp
path: bun-cpp
- name: Download bun-${{ inputs.tag }}-zig
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-zig
path: bun-zig
- name: Download bun-${{ inputs.tag }}-codegen
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}-codegen
path: build
- if: ${{ !inputs.no-cache }}
name: Restore Cache
uses: actions/cache@v4
with:
path: ccache
key: bun-${{ inputs.tag }}-cpp-${{ hashFiles('Dockerfile', 'Makefile', 'CMakeLists.txt', 'build.zig', 'scripts/**', 'src/**', 'packages/bun-usockets/src/**', 'packages/bun-uws/src/**') }}
restore-keys: |
bun-${{ inputs.tag }}-cpp-
- name: Link
env:
CPU_TARGET: ${{ inputs.cpu }}
CCACHE_DIR: ccache
run: |
.\scripts\update-submodules.ps1
.\scripts\env.ps1 ${{ contains(inputs.tag, '-baseline') && '-Baseline' || '' }}
Set-Location build
# $CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
$CANARY_REVISION = 0
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
-DNO_CODEGEN=1 `
-DNO_CONFIGURE_DEPENDS=1 `
"-DCANARY=${CANARY_REVISION}" `
-DBUN_LINK_ONLY=1 `
"-DBUN_DEPS_OUT_DIR=$(Resolve-Path ../bun-deps)" `
"-DBUN_CPP_ARCHIVE=$(Resolve-Path ../bun-cpp/bun-cpp-objects.a)" `
"-DBUN_ZIG_OBJ=$(Resolve-Path ../bun-zig/bun-zig.o)" `
${{ contains(inputs.tag, '-baseline') && '-DUSE_BASELINE_BUILD=1' || '' }}
if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
ninja -v
if ($LASTEXITCODE -ne 0) { throw "Link failed!" }
- name: Prepare
run: |
$Dist = mkdir -Force "bun-${{ inputs.tag }}"
cp -r build\bun.exe "$Dist\bun.exe"
Compress-Archive -Force "$Dist" "${Dist}.zip"
$Dist = "$Dist-profile"
MkDir -Force "$Dist"
cp -r build\bun.exe "$Dist\bun.exe"
cp -r build\bun.pdb "$Dist\bun.pdb"
Compress-Archive -Force "$Dist" "$Dist.zip"
- name: Upload bun-${{ inputs.tag }}
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}
path: bun-${{ inputs.tag }}.zip
if-no-files-found: error
- name: Upload bun-${{ inputs.tag }}-profile
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-profile
path: bun-${{ inputs.tag }}-profile.zip
if-no-files-found: error
on-failure:
if: ${{ github.repository_owner == 'oven-sh' && failure() }}
name: On Failure
needs: link
runs-on: ubuntu-latest
steps:
- name: Send Message
uses: sarisia/actions-status-discord@v1
with:
webhook: ${{ secrets.DISCORD_WEBHOOK }}
nodetail: true
color: "#FF0000"
title: ""
description: |
### ❌ [${{ github.event.pull_request.title }}](${{ github.event.pull_request.html_url }})
@${{ github.actor }}, the build for bun-${{ inputs.tag }} failed.
**[View logs](${{ github.event.workflow_run.html_url }})**
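Note that the baseline and non-baseline Windows builds share this single workflow: the only switch is the tag suffix, since contains(inputs.tag, '-baseline') selects the -Baseline argument for scripts\env.ps1 and adds -DUSE_BASELINE_BUILD=1 to both CMake invocations. A hedged caller sketch; the job name, tag, and cpu value are illustrative:

jobs:
  windows-x64-baseline:
    uses: ./.github/workflows/build-windows.yml
    secrets: inherit
    with:
      tag: windows-x64-baseline   # the '-baseline' suffix is what enables -Baseline / -DUSE_BASELINE_BUILD=1
      arch: x64
      cpu: nehalem                # assumed; nehalem is the baseline CPU target used elsewhere in this diff
      canary: false
      no-cache: false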

122
.github/workflows/build-zig.yml vendored Normal file

@@ -0,0 +1,122 @@
name: Build Zig
permissions:
contents: read
actions: write
on:
workflow_call:
inputs:
runs-on:
type: string
default: ${{ github.repository_owner != 'oven-sh' && 'ubuntu-latest' || inputs.arch == 'x64' && 'namespace-profile-bun-ci-linux-x64' || 'namespace-profile-bun-ci-linux-aarch64' }}
tag:
type: string
required: true
os:
type: string
required: true
arch:
type: string
required: true
cpu:
type: string
required: true
assertions:
type: boolean
default: false
zig-optimize:
type: string # 'ReleaseSafe' or 'ReleaseFast'
default: ReleaseFast
canary:
type: boolean
default: ${{ github.ref == 'refs/heads/main' }}
only-zig:
type: boolean
default: true
no-cache:
type: boolean
default: false
jobs:
build-zig:
name: ${{ inputs.only-zig && 'Build Zig' || 'Build & Link' }}
runs-on: ${{ inputs.runs-on }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Calculate Cache Key
id: cache
run: |
echo "key=${{ hashFiles('Dockerfile', 'Makefile', 'CMakeLists.txt', 'build.zig', 'scripts/**', 'src/**', 'packages/bun-usockets/src/**', 'packages/bun-uws/src/**') }}" >> $GITHUB_OUTPUT
- if: ${{ !inputs.no-cache }}
name: Restore Cache
uses: actions/cache@v4
with:
key: bun-${{ inputs.tag }}-docker-${{ steps.cache.outputs.key }}
restore-keys: |
bun-${{ inputs.tag }}-docker-
path: |
${{ runner.temp }}/dockercache
- name: Setup Docker
uses: docker/setup-buildx-action@v3
with:
install: true
platforms: |
linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
- name: Build
uses: docker/build-push-action@v5
with:
push: false
target: ${{ inputs.only-zig && 'build_release_obj' || 'artifact' }}
cache-from: |
type=local,src=${{ runner.temp }}/dockercache
cache-to: |
type=local,dest=${{ runner.temp }}/dockercache,mode=max
outputs: |
type=local,dest=${{ runner.temp }}/release
platforms: |
linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
build-args: |
GIT_SHA=${{ github.event.workflow_run.head_sha || github.sha }}
TRIPLET=${{ inputs.os == 'darwin' && format('{0}-macos-none', inputs.arch == 'x64' && 'x86_64' || 'aarch64') || inputs.os == 'windows' && format('{0}-windows-msvc', inputs.arch == 'x64' && 'x86_64' || 'aarch64') || format('{0}-linux-gnu', inputs.arch == 'x64' && 'x86_64' || 'aarch64') }}
ARCH=${{ inputs.arch == 'x64' && 'x86_64' || 'aarch64' }}
BUILDARCH=${{ inputs.arch == 'x64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ inputs.arch == 'x64' && 'x86_64' || 'aarch64' }}
CPU_TARGET=${{ inputs.arch == 'x64' && inputs.cpu || 'native' }}
ASSERTIONS=${{ inputs.assertions && 'ON' || 'OFF' }}
ZIG_OPTIMIZE=${{ inputs.zig-optimize }}
CANARY=${{ inputs.canary && '1' || '0' }}
- if: ${{ inputs.only-zig }}
name: Upload bun-${{ inputs.tag }}-zig
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-zig
path: ${{ runner.temp }}/release/bun-zig.o
if-no-files-found: error
- if: ${{ !inputs.only-zig }}
name: Prepare
run: |
cd ${{ runner.temp }}/release
chmod +x bun-profile bun
mkdir bun-${{ inputs.tag }}-profile
mkdir bun-${{ inputs.tag }}
strip bun
mv bun-profile bun-${{ inputs.tag }}-profile/bun-profile
mv bun bun-${{ inputs.tag }}/bun
zip -r bun-${{ inputs.tag }}-profile.zip bun-${{ inputs.tag }}-profile
zip -r bun-${{ inputs.tag }}.zip bun-${{ inputs.tag }}
- if: ${{ !inputs.only-zig }}
name: Upload bun-${{ inputs.tag }}
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}
path: ${{ runner.temp }}/release/bun-${{ inputs.tag }}.zip
if-no-files-found: error
- if: ${{ !inputs.only-zig }}
name: Upload bun-${{ inputs.tag }}-profile
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-profile
path: ${{ runner.temp }}/release/bun-${{ inputs.tag }}-profile.zip
if-no-files-found: error
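The TRIPLET build-arg above folds three platforms into one expression; spelled out, it resolves to the following values (a restatement of the expression in the Build step, not new behavior):

  # os: darwin   ->  x86_64-macos-none   (arch: x64)  or  aarch64-macos-none   (arch: aarch64)
  # os: windows  ->  x86_64-windows-msvc (arch: x64)  or  aarch64-windows-msvc (arch: aarch64)
  # otherwise    ->  x86_64-linux-gnu    (arch: x64)  or  aarch64-linux-gnu    (arch: aarch64)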


@@ -1,18 +0,0 @@
# redeploy Vercel site when a file in `docs` changes
# using VERCEL_DEPLOY_HOOK environment variable
name: Deploy site
on:
push:
paths:
- "docs/**"
branches: [main]
jobs:
deploy:
name: Deploy site
runs-on: ubuntu-latest
if: github.repository_owner == 'oven-sh'
steps:
- name: Trigger Vercel build
run: curl ${{ secrets.VERCEL_DEPLOY_HOOK }}


@@ -1,142 +0,0 @@
name: bun-linux
concurrency:
group: bun-linux-aarch64-${{ github.ref }}
cancel-in-progress: true
on:
push:
branches:
- main
paths:
- ".github/workflows/bun-linux-aarch64.yml"
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches:
- main
paths:
- ".github/workflows/bun-linux-aarch64.yml"
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
linux:
name: ${{matrix.tag}}
runs-on: ${{matrix.runner}}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
- cpu: native
tag: linux-aarch64
arch: aarch64
build_arch: arm64
runner: linux-arm64
build_machine_arch: aarch64
steps:
- uses: actions/checkout@v4
with:
submodules: false
ref: ${{github.sha}}
clean: true
- run: |
bash ./scripts/update-submodules.sh
- uses: docker/setup-buildx-action@v3
id: buildx
with:
install: true
- name: Run
run: |
rm -rf ${{runner.temp}}/release
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- run: |
mkdir -p /tmp/.buildx-cache-${{matrix.tag}}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
push: false
cache-from: type=local,src=/tmp/.buildx-cache-${{matrix.tag}}
cache-to: type=local,dest=/tmp/.buildx-cache-${{matrix.tag}}
build-args: |
ARCH=${{matrix.arch}}
BUILDARCH=${{matrix.build_arch}}
BUILD_MACHINE_ARCH=${{matrix.build_machine_arch}}
CPU_TARGET=${{matrix.cpu}}
GIT_SHA=${{github.sha}}
platforms: linux/${{matrix.build_arch}}
target: artifact
outputs: type=local,dest=${{runner.temp}}/release
- name: Zip
run: |
# if zip is not found
if [ ! -x "$(command -v zip)" ]; then
sudo apt-get update && sudo apt-get install -y zip --no-install-recommends
fi
if [ ! -x "$(command -v strip)" ]; then
sudo apt-get update && sudo apt-get install -y binutils --no-install-recommends
fi
cd ${{runner.temp}}/release
chmod +x bun-profile bun
mkdir bun-${{matrix.tag}}-profile
mkdir bun-${{matrix.tag}}
strip bun
mv bun-profile bun-${{matrix.tag}}-profile/bun-profile
mv bun bun-${{matrix.tag}}/bun
zip -r bun-${{matrix.tag}}-profile.zip bun-${{matrix.tag}}-profile
zip -r bun-${{matrix.tag}}.zip bun-${{matrix.tag}}
- uses: actions/upload-artifact@v4
with:
name: bun-${{matrix.tag}}-profile
path: ${{runner.temp}}/release/bun-${{matrix.tag}}-profile.zip
if-no-files-found: "error"
- uses: actions/upload-artifact@v4
with:
name: bun-${{matrix.tag}}
path: ${{runner.temp}}/release/bun-${{matrix.tag}}.zip
if-no-files-found: "error"
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/release/bun-${{matrix.tag}}.zip,${{runner.temp}}/release/bun-${{matrix.tag}}-profile.zip"


@@ -1,316 +0,0 @@
name: bun-linux
concurrency:
group: bun-linux-build-${{ github.ref }}
cancel-in-progress: true
on:
push:
branches:
- main
paths:
- ".github/workflows/bun-linux-build.yml"
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches:
- main
paths:
- ".github/workflows/bun-linux-build.yml"
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
linux:
name: ${{matrix.tag}}
runs-on: ${{matrix.runner}}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
permissions: write-all
strategy:
fail-fast: false
matrix:
include:
- cpu: haswell
tag: linux-x64
arch: x86_64
build_arch: amd64
runner: namespace-profile-bun-linux-x64
build_machine_arch: x86_64
assertions: "OFF"
zig_optimize: "ReleaseFast"
target: "artifact"
- cpu: nehalem
tag: linux-x64-baseline
arch: x86_64
build_arch: amd64
runner: namespace-profile-bun-linux-x64
build_machine_arch: x86_64
assertions: "OFF"
zig_optimize: "ReleaseFast"
target: "artifact"
# - cpu: haswell
# tag: linux-x64-assertions
# arch: x86_64
# build_arch: amd64
# runner: big-ubuntu
# build_machine_arch: x86_64
# assertions: "ON"
# zig_optimize: "ReleaseSafe"
# target: "artifact-assertions"
# - cpu: nehalem
# tag: linux-x64-baseline-assertions
# arch: x86_64
# build_arch: amd64
# runner: big-ubuntu
# build_machine_arch: x86_64
# assertions: "ON"
# zig_optimize: "ReleaseSafe"
# target: "artifact-assertions"
steps:
- uses: actions/checkout@v4
with:
submodules: recursive
ref: ${{github.sha}}
clean: true
- name: Run
run: |
rm -rf ${{runner.temp}}/release
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
push: false
build-args: |
ARCH=${{matrix.arch}}
BUILDARCH=${{matrix.build_arch}}
BUILD_MACHINE_ARCH=${{matrix.build_machine_arch}}
CPU_TARGET=${{matrix.cpu}}
GIT_SHA=${{github.sha}}
ASSERTIONS=${{matrix.assertions}}
ZIG_OPTIMIZE=${{matrix.zig_optimize}}
platforms: linux/${{matrix.build_arch}}
target: ${{matrix.target}}
outputs: type=local,dest=${{runner.temp}}/release
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
${{runner.temp}}/release/bun-profile --version
- name: Zip
run: |
# if zip is not found
if [ ! -x "$(command -v zip)" ]; then
sudo apt-get update && sudo apt-get install -y zip --no-install-recommends
fi
if [ ! -x "$(command -v strip)" ]; then
sudo apt-get update && sudo apt-get install -y binutils --no-install-recommends
fi
cd ${{runner.temp}}/release
chmod +x bun-profile bun
mkdir bun-${{matrix.tag}}-profile
mkdir bun-${{matrix.tag}}
strip bun
mv bun-profile bun-${{matrix.tag}}-profile/bun-profile
mv bun bun-${{matrix.tag}}/bun
zip -r bun-${{matrix.tag}}-profile.zip bun-${{matrix.tag}}-profile
zip -r bun-${{matrix.tag}}.zip bun-${{matrix.tag}}
- uses: actions/upload-artifact@v4
with:
name: bun-${{matrix.tag}}-profile
path: ${{runner.temp}}/release/bun-${{matrix.tag}}-profile.zip
if-no-files-found: "error"
- uses: actions/upload-artifact@v4
with:
name: bun-${{matrix.tag}}
path: ${{runner.temp}}/release/bun-${{matrix.tag}}.zip
if-no-files-found: "error"
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/release/bun-${{matrix.tag}}.zip,${{runner.temp}}/release/bun-${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
linux-test:
name: Tests ${{matrix.tag}}
runs-on: namespace-profile-bun-linux-x64
needs: [linux]
if: github.event_name == 'pull_request'
timeout-minutes: 20
permissions:
pull-requests: write
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
failing_tests_count: ${{ steps.test.outputs.failing_tests_count }}
strategy:
fail-fast: false
matrix:
include:
- tag: linux-x64
- tag: linux-x64-baseline
# - tag: linux-x64-assertions
# - tag: linux-x64-baseline-assertions
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v4
with:
submodules: false
clean: true
- id: download
name: Download
uses: actions/download-artifact@v4
with:
name: bun-${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
run: |
cd ${{runner.temp}}/release
unzip bun-${{matrix.tag}}.zip
cd bun-${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
- id: install-dependencies
name: Install dependencies
run: |
sudo apt-get update && sudo apt-get install -y openssl
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
bun install --cwd=test/js/third_party/prisma --verbose
# This is disabled because the cores are ~5.5gb each
# so it is easy to hit 50gb coredump downloads. Only enable this if you need to retrieve one
# - name: Set core dumps to get stored in /cores
# run: |
# sudo mkdir /cores
# sudo chmod 777 /cores
# # Core filenames will be of the form executable.pid.timestamp:
# sudo bash -c 'echo "/cores/%e.%p.%t" > /proc/sys/kernel/core_pattern'
- id: test
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TMPDIR: ${{runner.temp}}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
node packages/bun-internal-test/src/runner.node.mjs || true
# - uses: actions/upload-artifact@v4
# if: steps.test.outputs.failing_tests != ''
# with:
# name: cores
# path: /cores
# if-no-files-found: "error"
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
message: |
❌ @${{ github.actor }} ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- name: Uncomment on PR
if: steps.test.outputs.failing_tests == '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
mode: upsert
create_if_not_exists: false
message: |
✅ test failures on ${{ matrix.tag }} have been resolved.
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- id: fail
name: Fail the build
if: steps.test.outputs.failing_tests != ''
run: exit 1
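The linux-test job above reduces to installing the workspace dependencies and running the internal node-based runner; a minimal sketch of the same sequence as a standalone step, assuming a freshly built bun is already on PATH (the secrets-backed env vars are only needed by the tests that use them):

- name: Run internal tests (illustrative sketch, not part of this diff)
  run: |
    bun install --verbose
    bun install --cwd=test --verbose
    bun install --cwd=packages/bun-internal-test --verbose
    node packages/bun-internal-test/src/runner.node.mjs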


@@ -1,479 +0,0 @@
name: bun-macOS-aarch64
concurrency:
group: bun-macOS-aarch64-${{ github.ref }}
cancel-in-progress: true
env:
LLVM_VERSION: 16
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
on:
push:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
macOS-zig:
name: macOS Zig Object
runs-on: namespace-profile-zig-build
if: github.repository_owner == 'oven-sh'
strategy:
matrix:
include:
- cpu: native
arch: aarch64
tag: bun-obj-darwin-aarch64
steps:
- uses: actions/checkout@v4
# - name: Checkout submodules
# run: git submodule update --init --recursive --depth=1 --progress --force
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Compile Zig Object
uses: docker/build-push-action@v5
with:
context: .
push: false
# This doesn't seem to work
# cache-from: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# cache-to: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-macos-none
GIT_SHA=${{ github.sha }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}
path: ${{runner.temp}}/release/bun-zig.o
if-no-files-found: "error"
macOS-dependencies:
name: macOS Dependencies
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 15
strategy:
matrix:
include:
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
runner: macos-13-xlarge
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install go sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-aarch64.zip"
unzip bun-darwin-aarch64.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-aarch64/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
- name: Hash submodule versions
run: |
print_data() {
git submodule | grep -v WebKit
llvm-config --version
rustc --version
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "sha=$(print_data | sha1sum | cut -c 1-10)" >> $GITHUB_OUTPUT
id: submodule-versions
- name: Cache submodule dependencies
id: cache-deps-restore
uses: actions/cache/restore@v4
with:
path: ${{runner.temp}}/bun-deps
key: bun-deps-${{ matrix.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Compile submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
env:
CPU_TARGET: ${{ matrix.cpu }}
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
bash ./scripts/clean-dependencies.sh
bash ./scripts/all-dependencies.sh
- name: Cache submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v4
with:
path: ${{runner.temp}}/bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
- name: Upload submodule dependencies
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
if-no-files-found: "error"
macOS-cpp:
name: macOS C++
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
strategy:
matrix:
include:
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
runner: macos-13-xlarge
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install go sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-aarch64.zip"
unzip bun-darwin-aarch64.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-aarch64/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
# TODO: replace with sccache
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}
- name: Compile C++
env:
CPU_TARGET: ${{ matrix.cpu }}
SOURCE_DIR: ${{ github.workspace }}
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DUSE_LTO=ON \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
bash compile-cpp-only.sh -v
- name: Upload C++
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
if-no-files-found: "error"
macOS-link:
name: macOS Link
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
needs: [macOS-zig, macOS-cpp, macOS-dependencies]
timeout-minutes: 60
permissions: write-all
strategy:
matrix:
include:
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
package: bun-darwin-aarch64
artifact: bun-obj-darwin-aarch64
runner: macos-13-xlarge
steps:
- uses: actions/checkout@v4
with:
submodules: recursive
ref: ${{github.sha}}
clean: true
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install ccache llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv openssl@1.1 ninja --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-${{matrix.arch}}.zip"
unzip bun-darwin-${{matrix.arch}}.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-${{matrix.arch}}/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
- name: Download C++
uses: actions/download-artifact@v4
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download Zig Object
uses: actions/download-artifact@v4
with:
name: ${{ matrix.obj }}
path: ${{ runner.temp }}/release
- name: Download submodule dependencies
uses: actions/download-artifact@v4
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
- name: Link
env:
CPU_TARGET: ${{ matrix.cpu }}
run: |
SRC_DIR=$PWD
mkdir ${{runner.temp}}/link-build
cd ${{runner.temp}}/link-build
cmake $SRC_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_LTO=ON \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{runner.temp}}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
- name: Zip
run: |
cd ${{runner.temp}}/link-build
chmod +x bun-profile bun
mkdir -p ${{matrix.tag}}-profile/ ${{matrix.tag}}/
mv bun-profile ${{matrix.tag}}-profile/bun-profile
mv bun ${{matrix.tag}}/bun
zip -r ${{matrix.tag}}-profile.zip ${{matrix.tag}}-profile
zip -r ${{matrix.tag}}.zip ${{matrix.tag}}
- uses: actions/upload-artifact@v4
with:
name: ${{matrix.tag}}-profile
path: ${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip
if-no-files-found: "error"
- uses: actions/upload-artifact@v4
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/link-build/${{matrix.tag}}.zip
if-no-files-found: "error"
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/link-build/${{matrix.tag}}.zip,${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS-link]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
timeout-minutes: 30
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
failing_tests_count: ${{ steps.test.outputs.failing_tests_count }}
strategy:
fail-fast: false
matrix:
include:
- tag: bun-darwin-aarch64
runner: macos-13-xlarge
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v4
with:
submodules: false
- id: download
name: Download
uses: actions/download-artifact@v4
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
run: |
cd ${{runner.temp}}/release
unzip ${{matrix.tag}}.zip
cd ${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
- id: install
name: Install dependencies
run: |
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TMPDIR: ${{runner.temp}}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
message: |
❌ @${{ github.actor }} ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- name: Uncomment on PR
if: steps.test.outputs.failing_tests == '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
mode: upsert
create_if_not_exists: false
message: |
✅ test failures on ${{ matrix.tag }} have been resolved.
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- id: fail
name: Fail the build
if: steps.test.outputs.failing_tests != ''
run: exit 1


@@ -1,469 +0,0 @@
name: bun-macOS-x64-baseline
concurrency:
group: bun-macOS-x64-baseline-${{ github.ref }}
cancel-in-progress: true
env:
LLVM_VERSION: 16
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
on:
push:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
macos-object-files:
name: macOS Object
runs-on: namespace-profile-zig-build
if: github.repository_owner == 'oven-sh'
strategy:
matrix:
include:
- cpu: nehalem
arch: x86_64
tag: bun-obj-darwin-x64-baseline
# - cpu: haswell
# arch: x86_64
# tag: bun-obj-darwin-x64
# - cpu: native
# arch: aarch64
# tag: bun-obj-darwin-aarch64
steps:
- uses: actions/checkout@v4
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Compile Zig Object
uses: docker/build-push-action@v5
with:
context: .
push: false
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-macos-none
GIT_SHA=${{ github.sha }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}
path: ${{runner.temp}}/release/bun-zig.o
if-no-files-found: "error"
macOS-dependencies:
name: macOS Dependencies
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 15
strategy:
matrix:
include:
- cpu: nehalem
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
runner: macos-12-large
artifact: bun-obj-darwin-x64-baseline
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Hash submodule versions
run: |
print_data() {
git submodule | grep -v WebKit
llvm-config --version
rustc --version
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "sha=$(print_data | sha1sum | cut -c 1-10)" >> $GITHUB_OUTPUT
id: submodule-versions
- name: Cache submodule dependencies
id: cache-deps-restore
uses: actions/cache/restore@v4
with:
path: ${{runner.temp}}/bun-deps
key: bun-deps-${{ matrix.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Compile submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
env:
CPU_TARGET: ${{ matrix.cpu }}
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
bash ./scripts/clean-dependencies.sh
bash ./scripts/all-dependencies.sh
- name: Cache submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v4
with:
path: ${{runner.temp}}/bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
- name: Upload submodule dependencies
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
if-no-files-found: "error"
macOS-cpp:
name: macOS C++
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
strategy:
matrix:
include:
- cpu: nehalem
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
runner: macos-12-large
artifact: bun-obj-darwin-x64-baseline
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
unzip bun-darwin-x64-baseline.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
# TODO: replace with sccache
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}
- name: Compile C++
env:
CPU_TARGET: ${{ matrix.cpu }}
SOURCE_DIR: ${{ github.workspace }}
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DUSE_LTO=ON \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
bash compile-cpp-only.sh -v
- name: Upload C++
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
if-no-files-found: "error"
macOS:
name: macOS Link
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macos-object-files, macOS-dependencies]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
- cpu: nehalem
arch: x86_64
tag: bun-darwin-x64-baseline
obj: bun-obj-darwin-x64-baseline
package: bun-darwin-x64
runner: macos-12-large
artifact: bun-obj-darwin-x64-baseline
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install ccache llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv openssl@1.1 ninja --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
unzip bun-darwin-x64-baseline.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
- name: Download C++
uses: actions/download-artifact@v4
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download Zig Object
uses: actions/download-artifact@v4
with:
name: ${{ matrix.obj }}
path: ${{ runner.temp }}/release
- name: Download submodule dependencies
uses: actions/download-artifact@v4
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
- name: Link
env:
CPU_TARGET: ${{ matrix.cpu }}
run: |
SRC_DIR=$PWD
mkdir ${{runner.temp}}/link-build
cd ${{runner.temp}}/link-build
cmake $SRC_DIR \
-G Ninja \
-DUSE_LTO=ON \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{runner.temp}}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
- name: Zip
run: |
cd ${{runner.temp}}/link-build
chmod +x bun-profile bun
mkdir -p ${{matrix.tag}}-profile/ ${{matrix.tag}}/
mv bun-profile ${{matrix.tag}}-profile/bun-profile
mv bun ${{matrix.tag}}/bun
zip -r ${{matrix.tag}}-profile.zip ${{matrix.tag}}-profile
zip -r ${{matrix.tag}}.zip ${{matrix.tag}}
- uses: actions/upload-artifact@v4
with:
name: ${{matrix.tag}}-profile
path: ${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip
if-no-files-found: "error"
- uses: actions/upload-artifact@v4
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/link-build/${{matrix.tag}}.zip
if-no-files-found: "error"
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/link-build/${{matrix.tag}}.zip,${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: macOS Test
runs-on: ${{ matrix.runner }}
needs: [macOS]
# if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
if: false
permissions:
pull-requests: write
timeout-minutes: 30
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
failing_tests_count: ${{ steps.test.outputs.failing_tests_count }}
strategy:
fail-fast: false
matrix:
include:
- tag: bun-darwin-x64-baseline
runner: macos-12-large
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v4
with:
submodules: false
- id: download
name: Download
uses: actions/download-artifact@v4
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
run: |
cd ${{runner.temp}}/release
unzip ${{matrix.tag}}.zip
cd ${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
- id: install
name: Install dependencies
run: |
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TMPDIR: ${{runner.temp}}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
Hey @${{ github.actor }},
${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
message: |
❌ @${{ github.actor }} ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- name: Uncomment on PR
if: steps.test.outputs.failing_tests == '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
mode: upsert
create_if_not_exists: false
message: |
✅ test failures on ${{ matrix.tag }} have been resolved.
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- id: fail
name: Fail the build
if: steps.test.outputs.failing_tests != ''
run: exit 1


@@ -1,463 +0,0 @@
name: bun-macOS-x64
concurrency:
group: bun-macOS-x64-${{ github.ref }}
cancel-in-progress: true
env:
LLVM_VERSION: 16
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
on:
push:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
macOS-zig:
name: macOS Zig Object
runs-on: namespace-profile-zig-build
if: github.repository_owner == 'oven-sh'
strategy:
matrix:
include:
# - cpu: nehalem
# arch: x86_64
# tag: bun-obj-darwin-x64-baseline
- cpu: haswell
arch: x86_64
tag: bun-obj-darwin-x64
steps:
- uses: actions/checkout@v4
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Compile Zig Object
uses: docker/build-push-action@v5
with:
context: .
push: false
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-macos-none
GIT_SHA=${{ github.sha }}
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}
path: ${{runner.temp}}/release/bun-zig.o
if-no-files-found: "error"
macOS-dependencies:
name: macOS Dependencies
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 15
strategy:
matrix:
include:
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12-large
artifact: bun-obj-darwin-x64
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
- name: Hash submodule versions
run: |
print_data() {
git submodule | grep -v WebKit
llvm-config --version
rustc --version
cat $(echo scripts/build*.sh scripts/all-dependencies.sh | tr " " "\n" | sort)
}
echo "sha=$(print_data | sha1sum | cut -c 1-10)" >> $GITHUB_OUTPUT
id: submodule-versions
- name: Cache submodule dependencies
id: cache-deps-restore
uses: actions/cache/restore@v4
with:
path: ${{runner.temp}}/bun-deps
key: bun-deps-${{ matrix.tag }}-${{ steps.submodule-versions.outputs.sha }}
- name: Compile submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
env:
CPU_TARGET: ${{ matrix.cpu }}
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $BUN_DEPS_OUT_DIR
bash ./scripts/clean-dependencies.sh
bash ./scripts/all-dependencies.sh
- name: Cache submodule dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v4
with:
path: ${{runner.temp}}/bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
- name: Upload submodule dependencies
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
if-no-files-found: "error"
macOS-cpp:
name: macOS C++
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
strategy:
matrix:
include:
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
runner: macos-12-large
artifact: bun-obj-darwin-x64
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install sccache ccache rust llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv automake openssl@1.1 ninja gnu-sed pkg-config --force
# echo "$(brew --prefix sccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64.zip"
unzip bun-darwin-x64.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
# TODO: replace with sccache
- name: ccache
uses: hendrikmuhs/ccache-action@v1.2
with:
key: ${{ runner.os }}-ccache-${{ matrix.tag }}
restore-keys: ${{ runner.os }}-ccache-${{ matrix.tag }}
- name: Compile C++
env:
CPU_TARGET: ${{ matrix.cpu }}
SOURCE_DIR: ${{ github.workspace }}
OBJ_DIR: ${{ runner.temp }}/bun-cpp-obj
BUN_DEPS_OUT_DIR: ${{runner.temp}}/bun-deps
run: |
mkdir -p $OBJ_DIR
cd $OBJ_DIR
cmake -S $SOURCE_DIR -B $OBJ_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_LTO=ON \
-DBUN_CPP_ONLY=1 \
-DNO_CONFIGURE_DEPENDS=1
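# BUN_CPP_ONLY configures a C++-only build; the Zig object is produced in a separate job and everything is linked later in the macOS Link job.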
bash compile-cpp-only.sh -v
- name: Upload C++
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a
if-no-files-found: "error"
macOS:
name: macOS Link
runs-on: ${{ matrix.runner }}
if: github.repository_owner == 'oven-sh'
needs: [macOS-cpp, macOS-zig, macOS-dependencies]
timeout-minutes: 90
permissions: write-all
strategy:
matrix:
include:
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
obj: bun-obj-darwin-x64
package: bun-darwin-x64
runner: macos-12-large
artifact: bun-obj-darwin-x64
steps:
- uses: actions/checkout@v4
- name: Checkout submodules
run: git submodule update --init --recursive --depth=1 --progress --force
- name: Install system dependencies
env:
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK: 1
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
run: |
brew install ccache llvm@$LLVM_VERSION pkg-config coreutils libtool cmake libiconv openssl@1.1 ninja --force
echo "$(brew --prefix ccache)/bin" >> $GITHUB_PATH
echo "$(brew --prefix coreutils)/libexec/gnubin" >> $GITHUB_PATH
echo "$(brew --prefix llvm@$LLVM_VERSION)/bin" >> $GITHUB_PATH
brew link --overwrite llvm@$LLVM_VERSION
curl -LO "$BUN_DOWNLOAD_URL_BASE/bun-darwin-x64-baseline.zip"
unzip bun-darwin-x64-baseline.zip
mkdir -p ${{ runner.temp }}/.bun/bin
mv bun-darwin-x64-baseline/bun ${{ runner.temp }}/.bun/bin/bun
chmod +x ${{ runner.temp }}/.bun/bin/bun
echo "${{ runner.temp }}/.bun/bin" >> $GITHUB_PATH
- name: Download C++
uses: actions/download-artifact@v4
with:
name: ${{ matrix.tag }}-cpp
path: ${{ runner.temp }}/bun-cpp-obj
- name: Download Zig Object
uses: actions/download-artifact@v4
with:
name: ${{ matrix.obj }}
path: ${{ runner.temp }}/release
- name: Download submodule dependencies
uses: actions/download-artifact@v4
with:
name: ${{ matrix.tag }}-deps
path: ${{runner.temp}}/bun-deps
- name: Link
env:
CPU_TARGET: ${{ matrix.cpu }}
run: |
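# Link the final binary from the prebuilt Zig object, C++ archive, and dependency libraries downloaded above.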
SRC_DIR=$PWD
mkdir ${{runner.temp}}/link-build
cd ${{runner.temp}}/link-build
cmake $SRC_DIR \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DUSE_LTO=ON \
-DBUN_LINK_ONLY=1 \
-DBUN_ZIG_OBJ="${{ runner.temp }}/release/bun-zig.o" \
-DBUN_CPP_ARCHIVE="${{ runner.temp }}/bun-cpp-obj/bun-cpp-objects.a" \
-DBUN_DEPS_OUT_DIR="${{runner.temp}}/bun-deps" \
-DNO_CONFIGURE_DEPENDS=1
ninja -v
- name: Zip
run: |
cd ${{runner.temp}}/link-build
chmod +x bun-profile bun
mkdir -p ${{matrix.tag}}-profile/ ${{matrix.tag}}/
mv bun-profile ${{matrix.tag}}-profile/bun-profile
mv bun ${{matrix.tag}}/bun
zip -r ${{matrix.tag}}-profile.zip ${{matrix.tag}}-profile
zip -r ${{matrix.tag}}.zip ${{matrix.tag}}
- uses: actions/upload-artifact@v4
with:
name: ${{matrix.tag}}-profile
path: ${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip
if-no-files-found: "error"
- uses: actions/upload-artifact@v4
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/link-build/${{matrix.tag}}.zip
if-no-files-found: "error"
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{runner.temp}}/link-build/${{matrix.tag}}.zip,${{runner.temp}}/link-build/${{matrix.tag}}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
Pull Request
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on ${{ matrix.tag }}:
**[View build output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
[Commit ${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})
macOS-test:
name: Tests ${{matrix.tag}}
runs-on: ${{ matrix.runner }}
needs: [macOS]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
timeout-minutes: 30
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
failing_tests_count: ${{ steps.test.outputs.failing_tests_count }}
strategy:
fail-fast: false
matrix:
include:
- tag: bun-darwin-x64
runner: macos-12-large
steps:
- id: checkout
name: Checkout
uses: actions/checkout@v4
with:
submodules: false
- id: download
name: Download
uses: actions/download-artifact@v4
with:
name: ${{matrix.tag}}
path: ${{runner.temp}}/release
- id: install-bun
name: Install Bun
run: |
cd ${{runner.temp}}/release
unzip ${{matrix.tag}}.zip
cd ${{matrix.tag}}
chmod +x bun
pwd >> $GITHUB_PATH
- id: bun-version-check
name: Bun version check
run: |
# If this hangs, it means something is seriously wrong with the build
bun --version
- id: install
name: Install dependencies
run: |
bun install --verbose
bun install --cwd=test --verbose
bun install --cwd=packages/bun-internal-test --verbose
- id: test
name: Test (node runner)
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TMPDIR: ${{runner.temp}}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
# if: ${{github.event.inputs.use_bun == 'false'}}
run: |
node packages/bun-internal-test/src/runner.node.mjs || true
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
Pull Request
### ❌ [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
- name: Comment on PR
if: steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
message: |
❌ @${{ github.actor }} ${{ steps.test.outputs.failing_tests_count }} files with test failures on ${{ matrix.tag }}:
${{ steps.test.outputs.failing_tests }}
**[View test output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})**
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- name: Uncomment on PR
if: steps.test.outputs.failing_tests == '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-failures-${{matrix.tag}}
mode: upsert
create_if_not_exists: false
message: |
✅ test failures on ${{ matrix.tag }} have been resolved.
<sup>[#${{github.sha}}](https://github.com/oven-sh/bun/commits/${{github.sha}})</sup>
- id: fail
name: Fail the build
if: steps.test.outputs.failing_tests != ''
run: exit 1


@@ -1,41 +0,0 @@
name: bun-types
on:
push:
paths:
- "packages/bun-types/**"
branches: [main]
pull_request:
paths:
- "packages/bun-types/**"
jobs:
tests:
name: type-tests
runs-on: ubuntu-latest
defaults:
run:
working-directory: packages/bun-types
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Install bun
uses: oven-sh/setup-bun@v1
with:
bun-version: canary
- name: Install node
uses: actions/setup-node@v3
with:
node-version: latest
- name: Install dependencies
run: |
bun install
- name: Generate package
run: bun run build
- name: Tests
run: bun run test


@@ -1,501 +0,0 @@
name: bun-windows
concurrency:
group: bun-windows-${{ github.ref }}
cancel-in-progress: true
env:
# note: in other files, this version is only the major version, but for Windows it is the full version
LLVM_VERSION: 16.0.6
BUN_DOWNLOAD_URL_BASE: https://pub-5e11e972747a44bf9aaf9394f185a982.r2.dev/releases/latest
tag: bun-windows
# TODO: wire this up to workflow_dispatch.
# GitHub's expression syntax makes it hard to default this to true
canary: true
on:
push:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
pull_request:
branches: [main]
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "packages/bun-uws/src/**/*"
- "CMakeLists.txt"
- "build.zig"
- "Makefile"
- "Dockerfile"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# inputs:
# is-canary:
# type: boolean
# description: Is Canary Build?
# default: true
jobs:
windows-zig:
strategy:
fail-fast: false
matrix:
cpu: [haswell, nehalem]
arch: [x86_64]
name: Zig Build
runs-on: namespace-profile-zig-build
timeout-minutes: 60
if: github.repository_owner == 'oven-sh'
steps:
- run: git config --global core.autocrlf false && git config --global core.eol lf
- uses: actions/checkout@v4
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Calculate Canary Revision
if: ${{ env.canary == 'true' }}
id: canary
run: |
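# Compute the canary revision using the repository's helper script; the value is passed to the Docker build as CANARY below.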
echo "canary_revision=$(GITHUB_TOKEN="${{ secrets.GITHUB_TOKEN }}" bash ./scripts/calculate-canary-revision.sh --raw)" >> $GITHUB_OUTPUT
- name: Compile Zig Object
uses: docker/build-push-action@v5
with:
context: .
push: false
# This doesn't seem to work
# cache-from: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
# cache-to: type=s3,endpoint_url=${{ secrets.CACHE_S3_ENDPOINT }},blobs_prefix=docker_blobs/,manifests_prefix=docker_manifests/,access_key_id=${{ secrets.CACHE_S3_ACCESS_KEY_ID }},secret_access_key=${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }},bucket=bun,region=auto
build-args: |
BUILDARCH=${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
BUILD_MACHINE_ARCH=${{ runner.arch == 'X64' && 'x86_64' || 'aarch64' }}
ARCH=${{ matrix.arch }}
CPU_TARGET=${{ matrix.cpu }}
TRIPLET=${{ matrix.arch }}-windows-msvc
GIT_SHA=${{ github.sha }}
CANARY=${{ env.canary == 'true' && steps.canary.outputs.canary_revision || '0' }}
ZIG_OPTIMIZE=ReleaseSafe
# TODO(@paperdave): enable ASSERTIONS=1
platforms: linux/${{ runner.arch == 'X64' && 'amd64' || 'arm64' }}
target: build_release_obj
outputs: type=local,dest=${{runner.temp}}/release
- name: Upload Zig Object
uses: actions/upload-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-zig${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: ${{runner.temp}}/release/bun-zig.o
if-no-files-found: "error"
windows-dependencies:
name: Dependencies
runs-on: windows
timeout-minutes: 60
strategy:
fail-fast: false
matrix:
cpu: [haswell, nehalem]
arch: [x86_64]
steps:
- run: git config --global core.autocrlf false && git config --global core.eol lf
- name: Checkout
uses: actions/checkout@v4
- name: Clone Submodules
run: .\scripts\update-submodules.ps1
- name: Hash submodule versions
shell: pwsh
run: |
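# Same idea as the macOS job: hash submodule revisions, toolchain versions, and build scripts into a short key for the dependency cache.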
$data = "$(& {
git submodule | Where-Object { $_ -notmatch 'WebKit' }
clang --version
rustc --version
Get-Content -Path (Get-ChildItem -Path 'scripts/build*.ps1', 'scripts/all-dependencies.ps1', 'scripts/env.ps1' | Sort-Object -Property Name).FullName | Out-String
echo 1
})"
$hash = ( -join ((New-Object -TypeName System.Security.Cryptography.SHA1CryptoServiceProvider).ComputeHash([System.Text.Encoding]::UTF8.GetBytes($data)) | ForEach-Object { $_.ToString("x2") } )).Substring(0, 10)
echo "sha=${hash}" >> $env:GITHUB_OUTPUT
id: submodule-versions
- name: Try fetch dependencies
id: cache-deps-restore
uses: actions/cache/restore@v4
with:
path: bun-deps
key: bun-deps-${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-${{ steps.submodule-versions.outputs.sha }}
- name: Install LLVM ${{ env.LLVM_VERSION }}
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- name: Install Ninja
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
run: choco install -y ninja
- name: Build Dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
run: |
.\scripts\env.ps1 ${{ matrix.cpu == 'nehalem' && '-Baseline' || '' }}
Invoke-WebRequest -Uri "https://www.nasm.us/pub/nasm/releasebuilds/2.16.01/win64/nasm-2.16.01-win64.zip" -OutFile nasm.zip
Expand-Archive nasm.zip (mkdir -Force "nasm")
$Nasm = (Get-ChildItem "nasm")
$env:Path += ";${Nasm}"
$env:BUN_DEPS_OUT_DIR = (mkdir -Force "./bun-deps")
.\scripts\all-dependencies.ps1
- name: Upload Dependencies
uses: actions/upload-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-deps${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: bun-deps/
if-no-files-found: "error"
- name: Cache Dependencies
if: ${{ !steps.cache-deps-restore.outputs.cache-hit }}
id: cache-deps-save
uses: actions/cache/save@v4
with:
path: bun-deps
key: ${{ steps.cache-deps-restore.outputs.cache-primary-key }}
# TODO(@paperdave): stop relying on this and use bun.exe to build itself.
# we can't do that now because there isn't a tagged release to use.
#
# and at the time of writing, the minimum canary required to work is not
# yet released, as it is the one from *this* commit.
windows-codegen:
name: Codegen
runs-on: ubuntu-latest
timeout-minutes: 10
if: github.repository_owner == 'oven-sh'
strategy:
fail-fast: false
matrix:
arch: [x86_64]
steps:
- uses: actions/checkout@v4
- run: |
curl -fsSL $BUN_DOWNLOAD_URL_BASE/bun-linux-x64.zip > bun.zip
unzip bun.zip
export PATH="$PWD/bun-linux-x64:$PATH"
./scripts/cross-compile-codegen.sh win32 x64
# Sort of a hack to do this step in the codegen stage
- name: Calculate Canary Revision
if: ${{ env.canary == 'true' }}
run: |
echo "canary_revision=$(GITHUB_TOKEN="${{ secrets.GITHUB_TOKEN }}" bash ./scripts/calculate-canary-revision.sh --raw)" > build-codegen-win32-x64/.canary_revision
- uses: actions/upload-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-codegen
path: build-codegen-win32-x64/
if-no-files-found: "error"
windows-cpp:
name: C++ Build
needs: [windows-codegen]
runs-on: windows
if: github.repository_owner == 'oven-sh'
timeout-minutes: 90
strategy:
fail-fast: false
matrix:
cpu: [haswell, nehalem]
arch: [x86_64]
steps:
- run: git config --global core.autocrlf false && git config --global core.eol lf
- uses: actions/checkout@v4
- uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- run: choco install -y ninja
- name: Download Codegen
uses: actions/download-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-codegen
path: build
- name: Build C++
run: |
# Using sccache was blocked by an issue that is fixed in a newer version.
# TODO: update sccache and re-enable the commented-out setup below.
# $sczip = "sccache-v0.6.0-x86_64-pc-windows-msvc"
# Invoke-WebRequest -Uri "https://github.com/mozilla/sccache/releases/download/v0.6.0/${sczip}.zip" -OutFile "${sczip}.zip"
# Expand-Archive "${sczip}.zip"
# $env:SCCACHE_BUCKET="bun"
# $env:SCCACHE_REGION="auto"
# $env:SCCACHE_S3_USE_SSL="true"
# $env:SCCACHE_ENDPOINT="${{ secrets.CACHE_S3_ENDPOINT }}"
# $env:AWS_ACCESS_KEY_ID="${{ secrets.CACHE_S3_ACCESS_KEY_ID }}"
# $env:AWS_SECRET_ACCESS_KEY="${{ secrets.CACHE_S3_SECRET_ACCESS_KEY }}"
# $SCCACHE="$PWD/${sczip}/${sczip}/sccache.exe"
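# Read the canary revision produced by the codegen job; fall back to 0 when it is absent.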
$CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
.\scripts\env.ps1 ${{ matrix.cpu == 'nehalem' && '-Baseline' || '' }}
.\scripts\update-submodules.ps1
.\scripts\build-libuv.ps1 -CloneOnly $True
cd build
# "-DCCACHE_PROGRAM=${SCCACHE}"
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
-DNO_CODEGEN=1 `
-DNO_CONFIGURE_DEPENDS=1 `
"-DCANARY=${CANARY_REVISION}" `
-DBUN_CPP_ONLY=1 ${{ matrix.cpu == 'nehalem' && '-DUSE_BASELINE_BUILD=1' || '' }}
if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
.\compile-cpp-only.ps1 -v
if ($LASTEXITCODE -ne 0) { throw "C++ compilation failed" }
- uses: actions/upload-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-cpp${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: build/bun-cpp-objects.a
if-no-files-found: "error"
windows-link:
strategy:
fail-fast: false
matrix:
cpu: [haswell, nehalem]
arch: [x86_64]
name: Link
needs: [windows-dependencies, windows-codegen, windows-cpp, windows-zig]
runs-on: windows-small
if: github.repository_owner == 'oven-sh'
timeout-minutes: 30
permissions: write-all
steps:
- run: git config --global core.autocrlf false && git config --global core.eol lf
- uses: actions/checkout@v4
- uses: KyleMayes/install-llvm-action@1a3da29f56261a1e1f937ec88f0856a9b8321d7e
with:
version: ${{ env.LLVM_VERSION }}
- run: choco install -y ninja
- name: Download Codegen
uses: actions/download-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-codegen
path: build
- name: Download Dependencies
uses: actions/download-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-deps${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: bun-deps
- name: Download Zig Object
uses: actions/download-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-zig${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: bun-zig
- name: Download C++ Objects
uses: actions/download-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}-cpp${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: bun-cpp
- name: Link
run: |
.\scripts\update-submodules.ps1
.\scripts\env.ps1 ${{ matrix.cpu == 'nehalem' && '-Baseline' || '' }}
Set-Location build
$CANARY_REVISION = if (Test-Path build/.canary_revision) { Get-Content build/.canary_revision } else { "0" }
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release `
-DNO_CODEGEN=1 `
-DNO_CONFIGURE_DEPENDS=1 `
"-DCANARY=${CANARY_REVISION}" `
-DBUN_LINK_ONLY=1 `
"-DBUN_DEPS_OUT_DIR=$(Resolve-Path ../bun-deps)" `
"-DBUN_CPP_ARCHIVE=$(Resolve-Path ../bun-cpp/bun-cpp-objects.a)" `
"-DBUN_ZIG_OBJ=$(Resolve-Path ../bun-zig/bun-zig.o)" `
${{ matrix.cpu == 'nehalem' && '-DUSE_BASELINE_BUILD=1' || '' }}
if ($LASTEXITCODE -ne 0) { throw "CMake configuration failed" }
ninja -v
if ($LASTEXITCODE -ne 0) { throw "Link failed!" }
- name: Package
run: |
$Dist = mkdir -Force "${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}"
cp -r build\bun.exe "$Dist\bun.exe"
Compress-Archive "$Dist" "${Dist}.zip"
$Dist = "$Dist-profile"
MkDir -Force "$Dist"
cp -r build\bun.exe "$Dist\bun.exe"
cp -r build\bun.pdb "$Dist\bun.pdb"
Compress-Archive "$Dist" "$Dist.zip"
- uses: actions/upload-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}
path: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}.zip
if-no-files-found: "error"
- uses: actions/upload-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile
path: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile.zip
if-no-files-found: "error"
- name: Release
id: release
uses: ncipollo/release-action@v1
if: |
github.repository_owner == 'oven-sh'
&& github.ref == 'refs/heads/main'
with:
prerelease: true
body: "This canary release of Bun corresponds to the commit [${{ github.sha }}]"
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
token: ${{ secrets.GITHUB_TOKEN }}
name: "Canary (${{github.sha}})"
tag: "canary"
artifacts: "${{env.tag}}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}.zip,${{env.tag}}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile.zip"
- uses: sarisia/actions-status-discord@v1
if: failure() && github.repository_owner == 'oven-sh' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ job.status }}
noprefix: true
nocontext: true
description: |
### [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}
Build failed on Windows ${{ matrix.arch }}${{ matrix.cpu == 'nehalem' && ' Baseline' || '' }}
**[Build Output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})** | [Commit](https://github.com/oven-sh/bun/commits/${{github.sha}})
windows-test:
name: Test
runs-on: windows-small
needs: [windows-link]
if: github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'
permissions:
pull-requests: write
timeout-minutes: 180
outputs:
failing_tests: ${{ steps.test.outputs.failing_tests }}
failing_tests_count: ${{ steps.test.outputs.failing_tests_count }}
strategy:
fail-fast: false
matrix:
# TODO: test baseline, disabled due to noise
cpu: [haswell]
arch: [x86_64]
steps:
- run: git config --global core.autocrlf false && git config --global core.eol lf
- id: checkout
name: Checkout
uses: actions/checkout@v4
with:
submodules: false
- id: download
name: Download Release
uses: actions/download-artifact@v4
with:
name: ${{ env.tag }}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile
path: ${{runner.temp}}/release
- name: Install Bun
run: |
cd ${{runner.temp}}/release
unzip ${{env.tag}}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile.zip
cd ${{env.tag}}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile
pwd >> $env:GITHUB_PATH
- name: Install Node
uses: actions/setup-node@v4
with:
node-version: 20
- uses: secondlife/setup-cygwin@v1
with:
packages: bash
- name: Install dependencies
run: |
# bun install --verbose
# bun install --cwd=test --verbose
# bun install --cwd=packages/bun-internal-test --verbose
npm install
cd test && npm install
cd ../packages/bun-internal-test && npm install
cd ../..
- id: test
name: Run tests
env:
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TMPDIR: ${{runner.temp}}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
SHELLOPTS: igncr
BUN_PATH_BASE: ${{runner.temp}}
BUN_PATH: release/${{env.tag}}-${{ matrix.arch == 'x86_64' && 'x64' || 'aarch64' }}${{ matrix.cpu == 'nehalem' && '-baseline' || '' }}-profile/bun.exe
run: |
node packages/bun-internal-test/src/runner.node.mjs || true
shell: bash
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.failing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK_WINTEST }}
status: "failure"
noprefix: true
nocontext: true
description: |
### ❌🪟 [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are **${{ steps.test.outputs.failing_tests_count }} failing tests** on Windows ${{ matrix.arch }}${{ matrix.cpu == 'nehalem' && ' Baseline' || '' }}
${{ steps.test.outputs.failing_tests }}
[Full Test Output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})
- uses: sarisia/actions-status-discord@v1
if: always() && steps.test.outputs.regressing_tests != '' && github.event_name == 'pull_request'
with:
title: ""
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: "failure"
noprefix: true
nocontext: true
description: |
### ❌🪟 [${{github.event.pull_request.title}}](https://github.com/oven-sh/bun/pull/${{github.event.number}})
@${{ github.actor }}, there are **${{ steps.test.outputs.regressing_test_count }} test regressions** on Windows ${{ matrix.arch }}${{ matrix.cpu == 'nehalem' && ' Baseline' || '' }}
${{ steps.test.outputs.regressing_tests }}
[Full Test Output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})
- name: Comment on PR
if: always() && steps.test.outputs.regressing_tests != '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-windows-${{ matrix.arch }}-${{ matrix.cpu }}
message: |
### ❌🪟 @${{ github.actor }}, there are **${{ steps.test.outputs.regressing_test_count }} test regressions** on Windows ${{ matrix.arch }}${{ matrix.cpu == 'nehalem' && ' Baseline' || '' }}
${{ steps.test.outputs.regressing_tests }}
[Full Test Output](https://github.com/oven-sh/bun/actions/runs/${{github.run_id}})
- name: Uncomment on PR
if: steps.test.outputs.regressing_tests == '' && github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v2
with:
comment_tag: test-windows-${{ matrix.arch }}-${{ matrix.cpu }}
mode: upsert
create_if_not_exists: false
message: |
✅🪟 Test regressions on Windows ${{ matrix.arch }}${{ matrix.cpu == 'nehalem' && ' Baseline' || '' }} have been resolved.
- id: fail
name: Fail the build
if: steps.test.outputs.regressing_tests != '' && github.event_name == 'pull_request'
run: exit 1

.github/workflows/ci.yml

@@ -0,0 +1,226 @@
name: CI
permissions:
contents: read
actions: write
concurrency:
group: ${{ github.workflow }}-${{ github.ref == 'refs/heads/main' && github.run_id || github.ref }}
cancel-in-progress: true
on:
workflow_dispatch:
inputs:
run-id:
type: string
description: The workflow ID to download artifacts (skips the build step)
pull_request:
push:
branches:
- main
jobs:
format:
if: ${{ !github.event.inputs.run-id }}
name: Format
uses: ./.github/workflows/run-format.yml
secrets: inherit
with:
zig-version: 0.12.0-dev.1828+225fe6ddb
permissions:
contents: write
lint:
if: ${{ !github.event.inputs.run-id }}
name: Lint
uses: ./.github/workflows/run-lint.yml
secrets: inherit
linux-x64:
if: ${{ !github.event.inputs.run-id }}
name: Build linux-x64
uses: ./.github/workflows/build-linux.yml
secrets: inherit
with:
runs-on: ${{ github.repository_owner == 'oven-sh' && 'namespace-profile-bun-ci-linux-x64' || 'ubuntu-latest' }}
tag: linux-x64
arch: x64
cpu: haswell
linux-x64-baseline:
if: ${{ !github.event.inputs.run-id }}
name: Build linux-x64-baseline
uses: ./.github/workflows/build-linux.yml
secrets: inherit
with:
runs-on: ${{ github.repository_owner == 'oven-sh' && 'namespace-profile-bun-ci-linux-x64' || 'ubuntu-latest' }}
tag: linux-x64-baseline
arch: x64
cpu: nehalem
linux-aarch64:
if: ${{ !github.event.inputs.run-id && github.repository_owner == 'oven-sh' }}
name: Build linux-aarch64
uses: ./.github/workflows/build-linux.yml
secrets: inherit
with:
runs-on: namespace-profile-bun-ci-linux-aarch64
tag: linux-aarch64
arch: aarch64
cpu: native
darwin-x64:
if: ${{ !github.event.inputs.run-id }}
name: Build darwin-x64
uses: ./.github/workflows/build-darwin.yml
secrets: inherit
with:
runs-on: ${{ github.repository_owner == 'oven-sh' && 'macos-12-large' || 'macos-12' }}
tag: darwin-x64
arch: x64
cpu: haswell
darwin-x64-baseline:
if: ${{ !github.event.inputs.run-id }}
name: Build darwin-x64-baseline
uses: ./.github/workflows/build-darwin.yml
secrets: inherit
with:
runs-on: ${{ github.repository_owner == 'oven-sh' && 'macos-12-large' || 'macos-12' }}
tag: darwin-x64-baseline
arch: x64
cpu: nehalem
darwin-aarch64:
if: ${{ !github.event.inputs.run-id }}
name: Build darwin-aarch64
uses: ./.github/workflows/build-darwin.yml
secrets: inherit
with:
runs-on: ${{ github.repository_owner == 'oven-sh' && 'namespace-profile-bun-ci-darwin-aarch64' || 'macos-14' }}
tag: darwin-aarch64
arch: aarch64
cpu: native
windows-x64:
if: ${{ !github.event.inputs.run-id }}
name: Build windows-x64
uses: ./.github/workflows/build-windows.yml
secrets: inherit
with:
runs-on: windows
tag: windows-x64
arch: x64
cpu: haswell
windows-x64-baseline:
if: ${{ !github.event.inputs.run-id }}
name: Build windows-x64-baseline
uses: ./.github/workflows/build-windows.yml
secrets: inherit
with:
runs-on: windows
tag: windows-x64-baseline
arch: x64
cpu: nehalem
linux-x64-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test linux-x64
needs: linux-x64
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: ${{ github.repository_owner == 'oven-sh' && 'namespace-profile-bun-ci-linux-x64' || 'ubuntu-latest' }}
tag: linux-x64
linux-x64-baseline-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test linux-x64-baseline
needs: linux-x64-baseline
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: ${{ github.repository_owner == 'oven-sh' && 'namespace-profile-bun-ci-linux-x64' || 'ubuntu-latest' }}
tag: linux-x64-baseline
linux-aarch64-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' && github.repository_owner == 'oven-sh'}}
name: Test linux-aarch64
needs: linux-aarch64
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: namespace-profile-bun-ci-linux-aarch64
tag: linux-aarch64
darwin-x64-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test darwin-x64
needs: darwin-x64
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: ${{ github.repository_owner == 'oven-sh' && 'macos-12-large' || 'macos-12' }}
tag: darwin-x64
darwin-x64-baseline-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test darwin-x64-baseline
needs: darwin-x64-baseline
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: ${{ github.repository_owner == 'oven-sh' && 'macos-12-large' || 'macos-12' }}
tag: darwin-x64-baseline
darwin-aarch64-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test darwin-aarch64
needs: darwin-aarch64
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: ${{ github.repository_owner == 'oven-sh' && 'namespace-profile-bun-ci-darwin-aarch64' || 'macos-14' }}
tag: darwin-aarch64
windows-x64-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test windows-x64
needs: windows-x64
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: windows
tag: windows-x64
windows-x64-baseline-test:
if: ${{ github.event.inputs.run-id && always() || github.event_name == 'pull_request' }}
name: Test windows-x64-baseline
needs: windows-x64-baseline
uses: ./.github/workflows/run-test.yml
secrets: inherit
with:
run-id: ${{ inputs.run-id }}
pr-number: ${{ github.event.number }}
runs-on: windows
tag: windows-x64-baseline
cleanup:
if: ${{ always() }}
name: Cleanup
needs:
- linux-x64
- linux-x64-baseline
- linux-aarch64
- darwin-x64
- darwin-x64-baseline
- darwin-aarch64
- windows-x64
- windows-x64-baseline
runs-on: ubuntu-latest
steps:
- name: Cleanup Artifacts
uses: geekyeggo/delete-artifact@v5
with:
name: |
bun-*-cpp
bun-*-zig
bun-*-deps
bun-*-codegen

.github/workflows/comment.yml

@@ -0,0 +1,55 @@
name: Comment
permissions:
actions: read
pull-requests: write
on:
workflow_run:
workflows:
- CI
types:
- completed
jobs:
comment:
if: ${{ github.repository_owner == 'oven-sh' }}
name: Comment
runs-on: ubuntu-latest
steps:
- name: Download Tests
uses: actions/download-artifact@v4
with:
path: bun
pattern: bun-*-tests
github-token: ${{ github.token }}
run-id: ${{ github.event.workflow_run.id }}
- name: Setup Environment
id: env
shell: bash
run: |
echo "pr-number=$(<bun/bun-linux-x64-tests/pr-number.txt)" >> $GITHUB_OUTPUT
- name: Generate Comment
run: |
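# Concatenate the per-platform comment fragments; post a failure summary if any tests failed, otherwise a success message.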
cat bun/bun-*-tests/comment.md > comment.md
if [ -s comment.md ]; then
echo -e "❌ @${{ github.actor }}, your commit has failing tests :(\n\n$(cat comment.md)" > comment.md
else
echo -e "✅ @${{ github.actor }}, all tests passed!" > comment.md
fi
echo -e "\n**[View logs](${{ github.event.workflow_run.html_url }})**" >> comment.md
echo -e "<!-- generated-comment workflow=${{ github.workflow }} -->" >> comment.md
- name: Find Comment
id: comment
uses: peter-evans/find-comment@v3
with:
issue-number: ${{ steps.env.outputs.pr-number }}
comment-author: github-actions[bot]
body-includes: <!-- generated-comment workflow=${{ github.workflow }} -->
- name: Write Comment
uses: peter-evans/create-or-update-comment@v4
with:
comment-id: ${{ steps.comment.outputs.comment-id }}
issue-number: ${{ steps.env.outputs.pr-number }}
body-path: comment.md
edit-mode: replace

.github/workflows/docs.yml

@@ -0,0 +1,20 @@
name: Docs
on:
push:
paths:
- "docs/**"
branches:
- main
jobs:
deploy:
name: Deploy
runs-on: ubuntu-latest
if: ${{ github.repository_owner == 'oven-sh' }}
steps:
# redeploy the Vercel site when a file in `docs` changes,
# using the VERCEL_DEPLOY_HOOK webhook URL stored in repository secrets
- name: Trigger Webhook
run: |
curl -v ${{ secrets.VERCEL_DEPLOY_HOOK }}


@@ -1,8 +1,10 @@
name: bun-release
name: Release
concurrency: release
env:
BUN_VERSION: ${{ github.event.inputs.tag || github.event.release.tag_name || 'canary' }}
BUN_LATEST: ${{ (github.event.inputs.is-latest || github.event.release.tag_name) && 'true' || 'false' }}
on:
release:
types:
@@ -39,6 +41,7 @@ on:
description: Should types be released to npm?
type: boolean
default: false
jobs:
sign:
name: Sign Release
@@ -58,7 +61,7 @@ jobs:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
- name: Setup Bun
uses: oven-sh/setup-bun@v1
uses: ./.github/actions/setup-bun
with:
bun-version: "1.0.21"
- name: Install Dependencies
@@ -83,7 +86,7 @@ jobs:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: oven-sh/setup-bun@v1
uses: ./.github/actions/setup-bun
with:
bun-version: "1.0.21"
- name: Install Dependencies
@@ -112,12 +115,12 @@ jobs:
node-version: latest
- name: Setup Bun
if: ${{ env.BUN_VERSION != 'canary' }}
uses: oven-sh/setup-bun@v1
uses: ./.github/actions/setup-bun
with:
bun-version: "1.0.21"
- name: Setup Bun
if: ${{ env.BUN_VERSION == 'canary' }}
uses: oven-sh/setup-bun@v1
uses: ./.github/actions/setup-bun
with:
bun-version: "canary" # Must be 'canary' so tag is correct
- name: Install Dependencies
@@ -254,7 +257,7 @@ jobs:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: oven-sh/setup-bun@v1
uses: ./.github/actions/setup-bun
with:
bun-version: "1.0.21"
- name: Install Dependencies


@@ -1,22 +1,19 @@
name: autofix.ci # Must be named this for autofix.ci to work
name: Format
permissions:
contents: read
contents: write
on:
workflow_dispatch:
pull_request:
push:
branches:
- main
env:
ZIG_VERSION: 0.12.0-dev.1828+225fe6ddb
workflow_call:
inputs:
zig-version:
type: string
required: true
jobs:
format:
name: format
runs-on: ${{ vars.RUNNER_LINUX_X64 || 'ubuntu-latest' }}
name: Format
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
@@ -34,13 +31,17 @@ jobs:
- name: Setup Zig
uses: goto-bus-stop/setup-zig@c7b6cdd3adba8f8b96984640ff172c37c93f73ee
with:
version: ${{ env.ZIG_VERSION }}
version: ${{ inputs.zig-version }}
- name: Install Dependencies
run: |
bun install
- name: Format
run: |
bun fmt
- name: Format Zig
run: |
bun fmt:zig
- name: Commit # https://autofix.ci/
uses: autofix-ci/action@d3e591514b99d0fca6779455ff8338516663f7cc
- name: Commit
uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: Apply formatting changes

.github/workflows/run-lint.yml

@@ -0,0 +1,30 @@
name: Lint
permissions:
contents: read
on:
workflow_call:
jobs:
lint:
name: Lint
runs-on: ubuntu-latest
outputs:
text_output: ${{ steps.lint.outputs.text_output }}
json_output: ${{ steps.lint.outputs.json_output }}
count: ${{ steps.lint.outputs.count }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Bun
uses: ./.github/actions/setup-bun
with:
bun-version: "1.1.3"
- name: Install Dependencies
run: |
bun --cwd=packages/bun-internal-test install
- name: Lint
id: lint
run: |
bun packages/bun-internal-test/src/linter.ts

.github/workflows/run-test.yml

@@ -0,0 +1,126 @@
name: Test
permissions:
contents: read
actions: read
on:
workflow_call:
inputs:
runs-on:
type: string
required: true
tag:
type: string
required: true
pr-number:
type: string
required: true
run-id:
type: string
default: ${{ github.run_id }}
jobs:
test:
name: Run Tests
runs-on: ${{ inputs.runs-on }}
steps:
- if: ${{ runner.os == 'Windows' }}
name: Setup Git
run: |
git config --global core.autocrlf false
git config --global core.eol lf
- name: Checkout
uses: actions/checkout@v4
with:
sparse-checkout: |
package.json
bun.lockb
test
packages/bun-internal-test
- name: Setup Environment
shell: bash
run: |
echo "${{ inputs.pr-number }}" > pr-number.txt
- name: Download Bun
uses: actions/download-artifact@v4
with:
name: bun-${{ inputs.tag }}
path: bun
github-token: ${{ github.token }}
run-id: ${{ inputs.run-id || github.run_id }}
- if: ${{ runner.os == 'Windows' }}
name: Setup Cygwin
uses: secondlife/setup-cygwin@v3
with:
packages: bash
- name: Setup Bun
shell: bash
run: |
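# Unzip the freshly built bun artifact and put it on PATH so the test runner uses it.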
unzip bun/bun-*.zip
cd bun-*
pwd >> $GITHUB_PATH
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: Install Dependencies
timeout-minutes: 5
shell: bash
run: |
bun install
- name: Install Dependencies (test)
timeout-minutes: 5
shell: bash
run: |
bun install --cwd test
- name: Install Dependencies (runner)
timeout-minutes: 5
shell: bash
run: |
bun install --cwd packages/bun-internal-test
- name: Run Tests
id: test
timeout-minutes: 90
shell: bash
env:
TMPDIR: ${{ runner.temp }}
BUN_TAG: ${{ inputs.tag }}
BUN_FEATURE_FLAG_INTERNAL_FOR_TESTING: "true"
SMTP_SENDGRID_SENDER: ${{ secrets.SMTP_SENDGRID_SENDER }}
TLS_MONGODB_DATABASE_URL: ${{ secrets.TLS_MONGODB_DATABASE_URL }}
TLS_POSTGRES_DATABASE_URL: ${{ secrets.TLS_POSTGRES_DATABASE_URL }}
run: |
node packages/bun-internal-test/src/runner.node.mjs $(which bun)
- if: ${{ always() }}
name: Upload Results
uses: actions/upload-artifact@v4
with:
name: bun-${{ inputs.tag }}-tests
path: |
test-report.*
comment.md
pr-number.txt
if-no-files-found: error
overwrite: true
- if: ${{ always() && steps.test.outputs.failing_tests != '' && github.event.pull_request && github.repository_owner == 'oven-sh' }}
name: Send Message
uses: sarisia/actions-status-discord@v1
with:
webhook: ${{ secrets.DISCORD_WEBHOOK }}
nodetail: true
color: "#FF0000"
title: ""
description: |
### ❌ [${{ github.event.pull_request.title }}](${{ github.event.pull_request.html_url }})
@${{ github.actor }}, there are ${{ steps.test.outputs.failing_tests_count || 'some' }} failing tests on bun-${{ inputs.tag }}.
${{ steps.test.outputs.failing_tests }}
**[View logs](${{ github.event.workflow_run.html_url }})**
- name: Fail
if: ${{ failure() || always() && steps.test.outputs.failing_tests != '' }}
run: |
echo "There are ${{ steps.test.outputs.failing_tests_count || 'some' }} failing tests on bun-${{ inputs.tag }}."
exit 1

.github/workflows/upload.yml

@@ -0,0 +1,59 @@
name: Upload Artifacts
permissions:
contents: write
on:
workflow_run:
workflows:
- CI
types:
- completed
branches:
- main
jobs:
upload:
if: ${{ github.repository_owner == 'oven-sh' }}
name: Upload Artifacts
runs-on: ubuntu-latest
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
path: bun
pattern: bun-*
merge-multiple: true
github-token: ${{ github.token }}
run-id: ${{ github.event.workflow_run.id }}
- name: Upload to GitHub Releases
uses: ncipollo/release-action@v1
with:
tag: canary
name: Canary (${{ github.sha }})
prerelease: true
body: This canary release of Bun corresponds to the commit [${{ github.sha }}]
allowUpdates: true
replacesArtifacts: true
generateReleaseNotes: true
artifactErrorsFailBuild: true
artifacts: bun/**/bun-*.zip
token: ${{ github.token }}
- name: Upload to S3 (using SHA)
uses: shallwefootball/s3-upload-action@4350529f410221787ccf424e50133cbc1b52704e
with:
endpoint: ${{ secrets.AWS_ENDPOINT }}
aws_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY}}
aws_bucket: ${{ secrets.AWS_BUCKET }}
source_dir: bun
destination_dir: releases/${{ github.event.workflow_run.head_sha || github.sha }}
- name: Upload to S3 (using tag)
uses: shallwefootball/s3-upload-action@4350529f410221787ccf424e50133cbc1b52704e
with:
endpoint: ${{ secrets.AWS_ENDPOINT }}
aws_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY}}
aws_bucket: ${{ secrets.AWS_BUCKET }}
source_dir: bun
destination_dir: releases/canary

.gitignore

@@ -1,169 +1,143 @@
.DS_Store
zig-cache
packages/*/*.wasm
*.o
*.a
profile.json
.env
node_modules
.envrc
.swcrc
yarn.lock
dist
*.tmp
*.log
*.out.js
*.out.refresh.js
**/package-lock.json
build
*.wat
zig-out
pnpm-lock.yaml
README.md.template
src/deps/zig-clap/example
src/deps/zig-clap/README.md
src/deps/zig-clap/.github
src/deps/zig-clap/.gitattributes
out
outdir
.trace
cover
coverage
coverv
*.trace
github
out.*
out
.parcel-cache
esbuilddir
*.bun
parceldist
esbuilddir
outdir/
outcss
.next
txt.js
.idea
.vscode/cpp*
.vscode/clang*
node_modules_*
*.jsb
*.zip
bun-zigld
bun-singlehtreaded
bun-nomimalloc
bun-mimalloc
examples/lotta-modules/bun-yday
examples/lotta-modules/bun-old
examples/lotta-modules/bun-nofscache
src/node-fallbacks/out/*
src/node-fallbacks/node_modules
sign.json
release/
*.dmg
sign.*.json
packages/debug-*
packages/bun-cli/postinstall.js
packages/bun-*/bun
packages/bun-*/bun-profile
packages/bun-*/debug-bun
packages/bun-*/*.o
packages/bun-cli/postinstall.js
packages/bun-cli/bin/*
bun-test-scratch
misctools/fetch
src/deps/libiconv
src/deps/openssl
src/tests.zig
*.blob
src/deps/s2n-tls
.npm
.npm.gz
bun-binary
src/deps/PLCrashReporter/
*.dSYM
*.crash
misctools/sha
packages/bun-wasm/*.mjs
packages/bun-wasm/*.cjs
packages/bun-wasm/*.map
packages/bun-wasm/*.js
packages/bun-wasm/*.d.ts
packages/bun-wasm/*.d.cts
packages/bun-wasm/*.d.mts
*.bc
src/fallback.version
src/runtime.version
*.sqlite
*.database
*.db
misctools/machbench
*.big
.eslintcache
/bun-webkit
src/deps/c-ares/build
src/bun.js/bindings-obj
src/bun.js/debug-bindings-obj
failing-tests.txt
test.txt
myscript.sh
cold-jsc-start
cold-jsc-start.d
/testdir
/test.ts
/test.js
src/js/out/modules*
src/js/out/functions*
src/js/out/tmp
src/js/out/DebugPath.h
make-dev-stats.csv
.uuid
tsconfig.tsbuildinfo
test/js/bun/glob/fixtures
*.lib
*.pdb
CMakeFiles
build.ninja
.ninja_deps
.ninja_log
CMakeCache.txt
cmake_install.cmake
compile_commands.json
*.lib
x64
**/*.vcxproj*
**/*.sln*
**/*.dir
**/*.pdb
/.webkit-cache
/.cache
/src/deps/libuv
/build-*/
/kcov-out
.vs
**/.verdaccio-db.json
/test-report.md
/test-report.json
.DS_Store
.env
.envrc
.eslintcache
.idea
.next
.ninja_deps
.ninja_log
.npm
.npm.gz
.parcel-cache
.swcrc
.trace
.uuid
.vs
.vscode/clang*
.vscode/cpp*
*.a
*.bc
*.big
*.blob
*.bun
*.crash
*.database
*.db
*.dmg
*.dSYM
*.jsb
*.lib
*.log
*.o
*.out.js
*.out.refresh.js
*.pdb
*.sqlite
*.tmp
*.trace
*.wat
*.zip
**/.verdaccio-db.json
**/*.dir
**/*.pdb
**/*.sln*
**/*.vcxproj*
**/package-lock.json
/.cache
/.webkit-cache
/build-*/
/bun-webkit
/kcov-out
/src/deps/libuv
/test-report.json
/test-report.md
/test.js
/test.ts
/testdir
build
build.ninja
bun-binary
bun-mimalloc
bun-nomimalloc
bun-singlehtreaded
bun-test-scratch
bun-zigld
cmake_install.cmake
CMakeCache.txt
CMakeFiles
cold-jsc-start
cold-jsc-start.d
compile_commands.json
cover
coverage
coverv
dist
esbuilddir
examples/lotta-modules/bun-nofscache
examples/lotta-modules/bun-old
examples/lotta-modules/bun-yday
failing-tests.txt
github
make-dev-stats.csv
misctools/fetch
misctools/machbench
misctools/sha
myscript.sh
node_modules
node_modules_*
out
out.*
outcss
outdir
outdir/
packages/*/*.wasm
packages/bun-*/*.o
packages/bun-*/bun
packages/bun-*/bun-profile
packages/bun-*/debug-bun
packages/bun-cli/bin/*
packages/bun-cli/postinstall.js
packages/bun-wasm/*.cjs
packages/bun-wasm/*.d.cts
packages/bun-wasm/*.d.mts
packages/bun-wasm/*.d.ts
packages/bun-wasm/*.js
packages/bun-wasm/*.map
packages/bun-wasm/*.mjs
packages/debug-*
parceldist
pnpm-lock.yaml
profile.json
README.md.template
release/
sign.*.json
sign.json
src/bun.js/bindings-obj
src/bun.js/bindings/GeneratedJS2Native.zig
src/bun.js/debug-bindings-obj
src/deps/c-ares/build
src/deps/libiconv
src/deps/openssl
src/deps/PLCrashReporter/
src/deps/s2n-tls
src/deps/zig-clap/.gitattributes
src/deps/zig-clap/.github
src/deps/zig-clap/example
src/deps/zig-clap/README.md
src/fallback.version
src/js/out/DebugPath.h
src/js/out/functions*
src/js/out/modules*
src/js/out/tmp
src/node-fallbacks/node_modules
src/node-fallbacks/out/*
src/runtime.version
src/tests.zig
test.txt
test/js/bun/glob/fixtures
tsconfig.tsbuildinfo
txt.js
x64
yarn.lock
zig-cache
zig-out

.gitmodules

@@ -83,3 +83,10 @@ ignore = dirty
depth = 1
shallow = true
fetchRecurseSubmodules = false
[submodule "zig"]
path = src/deps/zig
url = https://github.com/oven-sh/zig
branch = bun
depth = 1
shallow = true
fetchRecurseSubmodules = false


@@ -3,3 +3,4 @@ src/deps
test/snapshots
test/js/deno
src/react-refresh.js
*.min.js


@@ -5,6 +5,15 @@
"useTabs": false,
"quoteProps": "preserve",
"overrides": [
{
"files": [".vscode/*.json"],
"options": {
"parser": "jsonc",
"quoteProps": "preserve",
"singleQuote": false,
"trailingComma": "all"
}
},
{
"files": ["*.md"],
"options": {


@@ -17,7 +17,7 @@
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/"
"${workspaceFolder}/packages/",
],
"browse": {
"path": [
@@ -30,10 +30,10 @@
"${workspaceFolder}/src/deps/boringssl/include/*",
"${workspaceFolder}/packages/bun-usockets/*",
"${workspaceFolder}/packages/bun-uws/*",
"${workspaceFolder}/src/napi/*"
"${workspaceFolder}/src/napi/*",
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb"
"databaseFilename": ".vscode/cppdb",
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
@@ -44,17 +44,18 @@
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1"
"DU_DISABLE_RENAMING=1",
],
"macFrameworkPath": [],
"compilerPath": "${workspaceFolder}/.vscode/clang++",
"cStandard": "c17",
"cppStandard": "c++20"
"cppStandard": "c++20",
},
{
"name": "BunWithJSCDebug",
"forcedInclude": ["${workspaceFolder}/src/bun.js/bindings/root.h"],
"includePath": [
"${workspaceFolder}/build/codegen",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/",
@@ -71,7 +72,7 @@
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/"
"${workspaceFolder}/packages/",
],
"browse": {
"path": [
@@ -93,10 +94,10 @@
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/packages/bun-usockets/",
"${workspaceFolder}/packages/bun-uws/",
"${workspaceFolder}/src/napi"
"${workspaceFolder}/src/napi",
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb_debug"
"databaseFilename": ".vscode/cppdb_debug",
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
@@ -107,13 +108,13 @@
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1"
"DU_DISABLE_RENAMING=1",
],
"macFrameworkPath": [],
"compilerPath": "${workspaceFolder}/.vscode/clang++",
"cStandard": "c17",
"cppStandard": "c++20"
}
"cppStandard": "c++20",
},
],
"version": 4
"version": 4,
}


@@ -28,6 +28,6 @@
"tamasfe.even-better-toml",
// Other
"bierner.comment-tagged-templates"
]
"bierner.comment-tagged-templates",
],
}

.vscode/launch.json

@@ -18,9 +18,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -33,9 +33,9 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "1",
"BUN_DEBUG_FileReader": "1"
"BUN_DEBUG_FileReader": "1",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -47,9 +47,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "0"
"BUN_GARBAGE_COLLECTOR_LEVEL": "0",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -61,9 +61,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "0",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -75,9 +75,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -89,9 +89,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -104,14 +104,14 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/?wait=1"
"BUN_INSPECT": "ws://localhost:0/?wait=1",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "lldb",
@@ -124,14 +124,14 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/?break=1"
"BUN_INSPECT": "ws://localhost:0/?break=1",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
// bun run [file]
{
@@ -144,9 +144,9 @@
"env": {
"FORCE_COLOR": "0",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -158,9 +158,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "0"
"BUN_GARBAGE_COLLECTOR_LEVEL": "0",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -172,9 +172,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "0",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -186,9 +186,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -200,9 +200,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -215,14 +215,14 @@
"FORCE_COLOR": "0",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/?wait=1"
"BUN_INSPECT": "ws://localhost:0/?wait=1",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "lldb",
@@ -235,14 +235,14 @@
"FORCE_COLOR": "0",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/?break=1"
"BUN_INSPECT": "ws://localhost:0/?break=1",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
// bun test [...]
{
@@ -255,9 +255,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -269,9 +269,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "0"
"BUN_GARBAGE_COLLECTOR_LEVEL": "0",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -283,9 +283,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "0",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -297,9 +297,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -311,9 +311,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -326,14 +326,14 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/?wait=1"
"BUN_INSPECT": "ws://localhost:0/?wait=1",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "lldb",
@@ -346,14 +346,29 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/?break=1"
"BUN_INSPECT": "ws://localhost:0/?break=1",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
// bun exec [...]
{
"type": "lldb",
"request": "launch",
"name": "bun exec [...]",
"program": "${workspaceFolder}/build/bun-debug",
"args": ["exec", "${input:testName}"],
"cwd": "${workspaceFolder}",
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole",
},
// bun test [*]
{
@@ -366,9 +381,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -380,9 +395,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "0"
"BUN_GARBAGE_COLLECTOR_LEVEL": "0",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -395,14 +410,14 @@
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
"BUN_INSPECT": "ws://localhost:0/"
"BUN_INSPECT": "ws://localhost:0/",
},
"console": "internalConsole",
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "lldb",
@@ -414,9 +429,9 @@
"env": {
"FORCE_COLOR": "1",
"BUN_DEBUG_QUIET_LOGS": "1",
"BUN_GARBAGE_COLLECTOR_LEVEL": "2"
"BUN_GARBAGE_COLLECTOR_LEVEL": "2",
},
"console": "internalConsole"
"console": "internalConsole",
},
{
"type": "lldb",
@@ -425,7 +440,7 @@
"program": "node",
"args": ["src/runner.node.mjs"],
"cwd": "${workspaceFolder}/packages/bun-internal-test",
"console": "internalConsole"
"console": "internalConsole",
},
// Windows: bun test [file]
{
@@ -438,22 +453,22 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_jest",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "1"
}
]
"value": "1",
},
],
},
{
"type": "cppvsdbg",
@@ -465,33 +480,33 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_EventLoop",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_uv",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_SYS",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_PipeWriter",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -503,17 +518,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "0"
}
]
"value": "0",
},
],
},
{
"type": "cppvsdbg",
@@ -525,17 +540,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "0"
"value": "0",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -547,26 +562,26 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/?wait=1"
}
"value": "ws://localhost:0/?wait=1",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "cppvsdbg",
@@ -578,26 +593,26 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/?break=1"
}
"value": "ws://localhost:0/?break=1",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
// Windows: bun run [file]
{
@@ -610,39 +625,36 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
"request": "launch",
"name": "Windows: bun run [file] (fast)",
"name": "Windows: bun install",
"program": "${workspaceFolder}/build/bun-debug.exe",
"args": ["run", "${fileBasename}"],
"args": ["install"],
"cwd": "${fileDirname}",
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "0"
}
]
"value": "0",
},
],
},
{
"type": "cppvsdbg",
@@ -654,17 +666,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "0"
"value": "0",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -676,26 +688,26 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/?wait=1"
}
"value": "ws://localhost:0/?wait=1",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "cppvsdbg",
@@ -707,26 +719,26 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/?break=1"
}
"value": "ws://localhost:0/?break=1",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
// Windows: bun test [...]
{
@@ -739,17 +751,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -761,17 +773,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "0"
}
]
"value": "0",
},
],
},
{
"type": "cppvsdbg",
@@ -783,17 +795,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "0"
"value": "0",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -805,17 +817,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -827,17 +839,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -849,26 +861,26 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/?wait=1"
}
"value": "ws://localhost:0/?wait=1",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "cppvsdbg",
@@ -880,26 +892,49 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/?break=1"
}
"value": "ws://localhost:0/?break=1",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
// Windows: bun exec [...]
{
"type": "cppvsdbg",
"request": "launch",
"name": "Windows: bun exec [...]",
"program": "${workspaceFolder}/build/bun-debug.exe",
"args": ["exec", "${input:testName}"],
"cwd": "${workspaceFolder}",
"environment": [
{
"name": "FORCE_COLOR",
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2",
},
],
},
// Windows: bun test [*]
{
@@ -912,17 +947,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
}
]
"value": "2",
},
],
},
{
"type": "cppvsdbg",
@@ -934,17 +969,17 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "1"
"value": "1",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "0"
}
]
"value": "0",
},
],
},
{
"type": "cppvsdbg",
@@ -956,26 +991,26 @@
"environment": [
{
"name": "FORCE_COLOR",
"value": "1"
"value": "1",
},
{
"name": "BUN_DEBUG_QUIET_LOGS",
"value": "0"
"value": "0",
},
{
"name": "BUN_GARBAGE_COLLECTOR_LEVEL",
"value": "2"
"value": "2",
},
{
"name": "BUN_INSPECT",
"value": "ws://localhost:0/"
}
"value": "ws://localhost:0/",
},
],
"serverReadyAction": {
"pattern": "https://debug.bun.sh/#localhost:([0-9]+)/",
"uriFormat": "https://debug.bun.sh/#ws://localhost:%s/",
"action": "openExternally"
}
"action": "openExternally",
},
},
{
"type": "cppvsdbg",
@@ -984,19 +1019,19 @@
"program": "node",
"args": ["src/runner.node.mjs"],
"cwd": "${workspaceFolder}/packages/bun-internal-test",
"console": "internalConsole"
}
"console": "internalConsole",
},
],
"inputs": [
{
"id": "commandLine",
"type": "promptString",
"description": "Usage: bun [...]"
"description": "Usage: bun [...]",
},
{
"id": "testName",
"type": "promptString",
"description": "Usage: bun test [...]"
}
]
"description": "Usage: bun test [...]",
},
],
}

.vscode/settings.json vendored

@@ -13,7 +13,7 @@
"node_modules": true,
".git": true,
"src/bun.js/WebKit": true,
"src/deps/*/**": true
"src/deps/*/**": true,
},
"search.followSymlinks": false,
"search.useIgnoreFiles": true,
@@ -34,40 +34,41 @@
"[zig]": {
"editor.tabSize": 4,
"editor.useTabStops": false,
"editor.defaultFormatter": "ziglang.vscode-zig"
"editor.defaultFormatter": "ziglang.vscode-zig",
},
// C++
"lldb.verboseLogging": false,
"cmake.configureOnOpen": false,
"C_Cpp.errorSquiggles": "enabled",
"C_Cpp.errorSquiggles": "enabled",
"[cpp]": {
"editor.defaultFormatter": "xaver.clang-format"
"editor.defaultFormatter": "xaver.clang-format",
},
"[c]": {
"editor.defaultFormatter": "xaver.clang-format"
"editor.defaultFormatter": "xaver.clang-format",
},
"[h]": {
"editor.defaultFormatter": "xaver.clang-format"
"editor.defaultFormatter": "xaver.clang-format",
},
// JavaScript
"prettier.enable": true,
"prettier.configPath": ".prettierrc",
"eslint.workingDirectories": ["${workspaceFolder}/packages/bun-types"],
"[javascript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"[javascriptreact]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// TypeScript
"typescript.tsdk": "${workspaceFolder}/node_modules/typescript/lib",
"[typescript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"[typescriptreact]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// JSON
@@ -77,7 +78,7 @@
"[jsonc]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// Markdown
"[markdown]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
@@ -88,8 +89,8 @@
"editor.quickSuggestions": {
"comments": "off",
"strings": "off",
"other": "off"
}
"other": "off",
},
},
// TOML
@@ -102,6 +103,11 @@
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
// Docker
"[dockerfile]": {
"editor.formatOnSave": false,
},
// Files
"files.exclude": {
"**/.git": true,
@@ -113,23 +119,10 @@
"**/*.xcworkspacedata": true,
"**/*.xcscheme": true,
"**/*.xcodeproj": true,
"src/bun.js/WebKit": true,
"src/deps/libarchive": true,
"src/deps/mimalloc": true,
"src/deps/s2n-tls": true,
"src/deps/boringssl": true,
"src/deps/openssl": true,
"src/deps/uws": true,
"src/deps/zlib": true,
"src/deps/lol-html": true,
"src/deps/c-ares": true,
"src/deps/tinycc": true,
"src/deps/zstd": true,
"**/*.i": true,
"packages/bun-uws/fuzzing/seed-corpus": true
},
"files.associations": {
"*.idl": "cpp"
"*.idl": "cpp",
},
"C_Cpp.files.exclude": {
"**/.vscode": true,
@@ -147,6 +140,6 @@
"WebKit/WebCore": true,
"WebKit/WebDriver": true,
"WebKit/WebKitBuild": true,
"WebKit/WebInspectorUI": true
"WebKit/WebInspectorUI": true,
},
}

.vscode/tasks.json vendored

@@ -6,13 +6,13 @@
"label": "Install Dependencies",
"command": "scripts/all-dependencies.sh",
"windows": {
"command": "scripts/all-dependencies.ps1"
"command": "scripts/all-dependencies.ps1",
},
"icon": {
"id": "arrow-down"
"id": "arrow-down",
},
"options": {
"cwd": "${workspaceFolder}"
"cwd": "${workspaceFolder}",
},
},
{
@@ -21,13 +21,13 @@
"dependsOn": ["Install Dependencies"],
"command": "scripts/setup.sh",
"windows": {
"command": "scripts/setup.ps1"
"command": "scripts/setup.ps1",
},
"icon": {
"id": "check"
"id": "check",
},
"options": {
"cwd": "${workspaceFolder}"
"cwd": "${workspaceFolder}",
},
},
{
@@ -37,10 +37,10 @@
"command": "bun",
"args": ["run", "build"],
"icon": {
"id": "gear"
"id": "gear",
},
"options": {
"cwd": "${workspaceFolder}"
"cwd": "${workspaceFolder}",
},
"isBuildCommand": true,
"runOptions": {
@@ -48,5 +48,5 @@
"reevaluateOnRerun": true,
},
},
]
],
}


@@ -2,8 +2,8 @@ cmake_minimum_required(VERSION 3.22)
cmake_policy(SET CMP0091 NEW)
cmake_policy(SET CMP0067 NEW)
set(Bun_VERSION "1.0.36")
set(WEBKIT_TAG 089023cc9078b3aa173869fd6685f3e7bed2a994)
set(Bun_VERSION "1.1.4")
set(WEBKIT_TAG e3a2d89a0b1644cc8d5c245bd2ffee4d4bd6c1d5)
set(BUN_WORKDIR "${CMAKE_CURRENT_BINARY_DIR}")
message(STATUS "Configuring Bun ${Bun_VERSION} in ${BUN_WORKDIR}")
@@ -315,6 +315,10 @@ option(USE_STATIC_LIBATOMIC "Statically link libatomic, requires the presence of
option(USE_LTO "Enable Link-Time Optimization" ${DEFAULT_LTO})
if(NOT ZIG_LIB_DIR)
cmake_path(SET ZIG_LIB_DIR NORMALIZE "${CMAKE_CURRENT_SOURCE_DIR}/src/deps/zig/lib")
endif()
if(USE_VALGRIND)
# Disable SIMD
set(USE_BASELINE_BUILD ON)
@@ -551,6 +555,8 @@ else()
add_compile_definitions("BUN_DEBUG=1")
set(ASSERT_ENABLED "1")
endif()
message(STATUS "Using WebKit from ${WEBKIT_DIR}")
else()
if(NOT EXISTS "${WEBKIT_DIR}/lib/${libWTF}.${STATIC_LIB_EXT}" OR NOT EXISTS "${WEBKIT_DIR}/lib/${libJavaScriptCore}.${STATIC_LIB_EXT}")
if(WEBKIT_DIR MATCHES "src/bun.js/WebKit$")
@@ -732,7 +738,7 @@ if(NOT NO_CODEGEN)
OUTPUT ${BUN_IDENTIFIER_CACHE_OUT}
MAIN_DEPENDENCY "${BUN_SRC}/js_lexer/identifier_data.zig"
DEPENDS "${BUN_SRC}/js_lexer/identifier_cache.zig"
COMMAND ${ZIG_COMPILER} run "${BUN_SRC}/js_lexer/identifier_data.zig"
COMMAND ${ZIG_COMPILER} run "--zig-lib-dir" "${ZIG_LIB_DIR}" "${BUN_SRC}/js_lexer/identifier_data.zig"
VERBATIM
COMMENT "Building Identifier Cache"
)
@@ -745,19 +751,27 @@ if(NOT NO_CODEGEN)
file(GLOB BUN_TS_MODULES ${CONFIGURE_DEPENDS}
"${BUN_SRC}/js/node/*.ts"
"${BUN_SRC}/js/node/*.js"
"${BUN_SRC}/js/bun/*.js"
"${BUN_SRC}/js/bun/*.ts"
"${BUN_SRC}/js/bun/*.js"
"${BUN_SRC}/js/builtins/*.ts"
"${BUN_SRC}/js/builtins/*.js"
"${BUN_SRC}/js/thirdparty/*.js"
"${BUN_SRC}/js/thirdparty/*.ts"
"${BUN_SRC}/js/internal/*.js"
"${BUN_SRC}/js/internal/*.ts"
"${BUN_SRC}/js/node/*.js"
"${BUN_SRC}/js/node/*.ts"
"${BUN_SRC}/js/thirdparty/*.js"
"${BUN_SRC}/js/thirdparty/*.ts"
"${BUN_SRC}/js/internal-for-testing.ts"
)
file(GLOB BUN_TS_FUNCTIONS ${CONFIGURE_DEPENDS} "${BUN_SRC}/js/builtins/*.ts")
file(GLOB CODEGEN_FILES ${CONFIGURE_DEPENDS} "${BUN_CODEGEN_SRC}/*.ts")
add_custom_command(
OUTPUT
"${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.cpp"
"${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.h"
"${BUN_WORKDIR}/codegen/InternalModuleRegistryConstants.h"
"${BUN_WORKDIR}/codegen/InternalModuleRegistry+createInternalModuleById.h"
"${BUN_WORKDIR}/codegen/InternalModuleRegistry+enum.h"
@@ -765,10 +779,12 @@ if(NOT NO_CODEGEN)
"${BUN_WORKDIR}/codegen/NativeModuleImpl.h"
"${BUN_WORKDIR}/codegen/ResolvedSourceTag.zig"
"${BUN_WORKDIR}/codegen/SyntheticModuleType.h"
"${BUN_WORKDIR}/codegen/GeneratedJS2Native.h"
"${BUN_SRC}/bun.js/bindings/GeneratedJS2Native.zig"
COMMAND ${BUN_EXECUTABLE} run "${BUN_SRC}/codegen/bundle-modules.ts" "--debug=${DEBUG}" "${BUN_WORKDIR}"
DEPENDS ${BUN_TS_MODULES} ${CODEGEN_FILES}
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Bundling JS modules"
COMMENT "Bundling JS"
)
endif()
@@ -776,15 +792,6 @@ WEBKIT_ADD_SOURCE_DEPENDENCIES(
"${BUN_SRC}/bun.js/bindings/InternalModuleRegistry.cpp"
"${BUN_WORKDIR}/codegen/InternalModuleRegistryConstants.h"
)
add_custom_command(
OUTPUT "${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.cpp"
"${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.h"
COMMAND ${BUN_EXECUTABLE} run "${BUN_SRC}/codegen/bundle-functions.ts" "--debug=${DEBUG}" "${BUN_WORKDIR}"
DEPENDS ${BUN_TS_FUNCTIONS} ${CODEGEN_FILES}
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Bundling JS builtin functions"
)
list(APPEND BUN_RAW_SOURCES "${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.cpp")
# --- Peechy API ---
@@ -852,8 +859,10 @@ if(NOT BUN_LINK_ONLY AND NOT BUN_CPP_ONLY)
OUTPUT "${BUN_ZIG_OBJ}"
COMMAND
"${ZIG_COMPILER}" "build" "obj"
"--zig-lib-dir" "${ZIG_LIB_DIR}"
"-Doutput-file=${BUN_ZIG_OBJ}"
"-Dgenerated-code=${BUN_WORKDIR}/codegen"
"-freference-trace=10"
"-Dversion=${Bun_VERSION}"
"-Dcanary=${CANARY}"
"-Doptimize=${ZIG_OPTIMIZE}"
@@ -866,6 +875,7 @@ if(NOT BUN_LINK_ONLY AND NOT BUN_CPP_ONLY)
"${BUN_WORKDIR}/codegen/ResolvedSourceTag.zig"
"${BUN_IDENTIFIER_CACHE_OUT}"
"${BUN_SRC}/api/schema.zig"
"${BUN_SRC}/bun.js/bindings/GeneratedJS2Native.zig"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Building zig code"
VERBATIM
@@ -1005,7 +1015,7 @@ endif()
# --- clang and linker flags ---
if(CMAKE_BUILD_TYPE STREQUAL "Debug")
if(NOT WIN32)
target_compile_options(${bun} PUBLIC -g3 -O0 -gdwarf-4
target_compile_options(${bun} PUBLIC -O0 -g -g3 -ggdb -gdwarf-4
-Werror=return-type
-Werror=return-stack-address
-Werror=implicit-function-declaration
@@ -1052,8 +1062,8 @@ elseif(CMAKE_BUILD_TYPE STREQUAL "Release")
list(APPEND LTO_LINK_FLAG "/LTCG")
endif()
target_compile_options(${bun} PUBLIC /O2 ${LTO_FLAG} /DEBUG /Z7)
target_link_options(${bun} PUBLIC ${LTO_LINK_FLAG} /DEBUG)
target_compile_options(${bun} PUBLIC /O2 ${LTO_FLAG} /DEBUG:FULL)
target_link_options(${bun} PUBLIC ${LTO_LINK_FLAG} /DEBUG:FULL)
endif()
endif()


@@ -116,7 +116,7 @@ RUN apt-get update -y \
&& case "${arch##*-}" in \
amd64) variant="x64";; \
arm64) variant="aarch64";; \
*) echo "error: unsupported architecture: $arch"; exit 1 ;; \
*) echo "unsupported architecture: $arch"; exit 1 ;; \
esac \
&& wget "${BUN_DOWNLOAD_URL_BASE}/bun-linux-${variant}.zip" \
&& unzip bun-linux-${variant}.zip \
@@ -138,6 +138,8 @@ ARG BUILD_MACHINE_ARCH
ARG ZIG_FOLDERNAME=zig-linux-${BUILD_MACHINE_ARCH}-${ZIG_VERSION}
ARG ZIG_FILENAME=${ZIG_FOLDERNAME}.tar.xz
ARG ZIG_URL="https://ziglang.org/builds/${ZIG_FILENAME}"
ARG ZIG_LOCAL_CACHE_DIR=/zig-cache
ENV ZIG_LOCAL_CACHE_DIR=${ZIG_LOCAL_CACHE_DIR}
WORKDIR $GITHUB_WORKSPACE
@@ -152,14 +154,18 @@ FROM bun-base as c-ares
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/c-ares ${BUN_DIR}/src/deps/c-ares
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && make c-ares && rm -rf ${BUN_DIR}/src/deps/c-ares ${BUN_DIR}/Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& make c-ares \
&& rm -rf ${BUN_DIR}/src/deps/c-ares ${BUN_DIR}/Makefile
FROM bun-base as lolhtml
@@ -172,10 +178,14 @@ ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/lol-html ${BUN_DIR}/src/deps/lol-html
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=/ccache export PATH=$PATH:$HOME/.cargo/bin && cd ${BUN_DIR} && \
make lolhtml && rm -rf src/deps/lol-html Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
export PATH=$PATH:$HOME/.cargo/bin \
&& cd ${BUN_DIR} \
&& make lolhtml \
&& rm -rf src/deps/lol-html Makefile
FROM bun-base as mimalloc
@@ -187,10 +197,13 @@ ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/mimalloc ${BUN_DIR}/src/deps/mimalloc
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=/ccache cd ${BUN_DIR} && \
make mimalloc && rm -rf src/deps/mimalloc Makefile;
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd ${BUN_DIR} \
&& make mimalloc \
&& rm -rf src/deps/mimalloc Makefile
FROM bun-base as mimalloc-debug
@@ -202,32 +215,39 @@ ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/mimalloc ${BUN_DIR}/src/deps/mimalloc
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=/ccache cd ${BUN_DIR} && \
make mimalloc-debug && rm -rf src/deps/mimalloc Makefile;
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd ${BUN_DIR} \
&& make mimalloc-debug \
&& rm -rf src/deps/mimalloc Makefile
FROM bun-base as zlib
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && \
make zlib && rm -rf src/deps/zlib Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& make zlib \
&& rm -rf src/deps/zlib Makefile
FROM bun-base as libarchive
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN install_packages autoconf automake libtool pkg-config
@@ -236,8 +256,10 @@ COPY src/deps/libarchive ${BUN_DIR}/src/deps/libarchive
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && \
make libarchive && rm -rf src/deps/libarchive Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& make libarchive \
&& rm -rf src/deps/libarchive Makefile
FROM bun-base as tinycc
@@ -261,9 +283,13 @@ COPY src/deps/boringssl ${BUN_DIR}/src/deps/boringssl
WORKDIR $BUN_DIR
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=/ccache cd ${BUN_DIR} && make boringssl && rm -rf src/deps/boringssl Makefile
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd ${BUN_DIR} \
&& make boringssl \
&& rm -rf src/deps/boringssl Makefile
FROM bun-base as base64
@@ -286,14 +312,17 @@ ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/zstd ${BUN_DIR}/src/deps/zstd
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && make zstd
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& make zstd
FROM bun-base as ls-hpack
@@ -302,14 +331,17 @@ ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/ls-hpack ${BUN_DIR}/src/deps/ls-hpack
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && make lshpack
RUN --mount=type=cache,target=${CCACHE_DIR} \
cd $BUN_DIR \
&& make lshpack
FROM bun-base-with-zig as bun-identifier-cache
@@ -324,9 +356,9 @@ WORKDIR $BUN_DIR
COPY src/js_lexer/identifier_data.zig ${BUN_DIR}/src/js_lexer/identifier_data.zig
COPY src/js_lexer/identifier_cache.zig ${BUN_DIR}/src/js_lexer/identifier_cache.zig
RUN cd $BUN_DIR \
&& zig run src/js_lexer/identifier_data.zig \
&& rm -rf zig-cache
RUN --mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
cd $BUN_DIR \
&& zig run src/js_lexer/identifier_data.zig
FROM bun-base as bun-node-fallbacks
@@ -367,9 +399,10 @@ COPY src ${BUN_DIR}/src
COPY CMakeLists.txt ${BUN_DIR}/CMakeLists.txt
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
ENV CCACHE_DIR=/ccache
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
RUN --mount=type=cache,target=/ccache mkdir ${BUN_DIR}/build \
RUN --mount=type=cache,target=${CCACHE_DIR} mkdir ${BUN_DIR}/build \
&& cd ${BUN_DIR}/build \
&& mkdir -p tmp_modules tmp_functions js codegen \
&& cmake .. -GNinja -DCMAKE_BUILD_TYPE=Release -DUSE_LTO=ON -DUSE_DEBUG_JSC=${ASSERTIONS} -DBUN_CPP_ONLY=1 -DWEBKIT_DIR=/build/bun/bun-webkit -DCANARY=${CANARY} -DZIG_COMPILER=system \
@@ -387,7 +420,8 @@ COPY src/api ${BUN_DIR}/src/api
WORKDIR $BUN_DIR
# TODO: move away from Makefile entirely
RUN bun install --frozen-lockfile \
RUN --mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
bun install --frozen-lockfile \
&& make runtime_js fallback_decoder bun_error \
&& rm -rf src/runtime src/fallback.ts node_modules bun.lockb package.json Makefile
@@ -401,6 +435,9 @@ ARG CANARY=0
ARG ASSERTIONS=OFF
ARG ZIG_OPTIMIZE=ReleaseFast
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
COPY *.zig package.json CMakeLists.txt ${BUN_DIR}/
COPY completions ${BUN_DIR}/completions
COPY packages ${BUN_DIR}/packages
@@ -413,8 +450,10 @@ COPY --from=bun-codegen-for-zig ${BUN_DIR}/packages/bun-error/dist ${BUN_DIR}/pa
WORKDIR $BUN_DIR
RUN mkdir -p build \
&& bun run $BUN_DIR/src/codegen/bundle-modules-fast.ts $BUN_DIR/build \
RUN --mount=type=cache,target=${CCACHE_DIR} \
--mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
mkdir -p build \
&& bun run $BUN_DIR/src/codegen/bundle-modules.ts --debug=OFF $BUN_DIR/build \
&& cd build \
&& cmake .. \
-G Ninja \
@@ -429,6 +468,7 @@ RUN mkdir -p build \
-DBUN_ZIG_OBJ="/tmp/bun-zig.o" \
-DCANARY="${CANARY}" \
-DZIG_COMPILER=system \
-DZIG_LIB_DIR=$BUN_DIR/src/deps/zig/lib \
&& ONLY_ZIG=1 ninja "/tmp/bun-zig.o" -v
FROM scratch as build_release_obj
@@ -445,6 +485,10 @@ ARG CANARY
ARG ASSERTIONS
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG ZIG_LOCAL_CACHE_DIR=/zig-cache
ENV ZIG_LOCAL_CACHE_DIR=${ZIG_LOCAL_CACHE_DIR}
WORKDIR $BUN_DIR
@@ -472,7 +516,9 @@ COPY --from=bun-cpp-objects ${BUN_DIR}/bun-webkit/lib ${BUN_DIR}/bun-webkit/lib
WORKDIR $BUN_DIR/build
RUN cmake .. \
RUN --mount=type=cache,target=${CCACHE_DIR} \
--mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
cmake .. \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \
@@ -503,6 +549,10 @@ ARG CANARY
ARG ASSERTIONS
ENV CPU_TARGET=${CPU_TARGET}
ARG CCACHE_DIR=/ccache
ENV CCACHE_DIR=${CCACHE_DIR}
ARG ZIG_LOCAL_CACHE_DIR=/zig-cache
ENV ZIG_LOCAL_CACHE_DIR=${ZIG_LOCAL_CACHE_DIR}
WORKDIR $BUN_DIR
@@ -529,7 +579,9 @@ COPY --from=bun-cpp-objects ${BUN_DIR}/bun-webkit/lib ${BUN_DIR}/bun-webkit/lib
WORKDIR $BUN_DIR/build
RUN cmake .. \
RUN --mount=type=cache,target=${CCACHE_DIR} \
--mount=type=cache,target=${ZIG_LOCAL_CACHE_DIR} \
cmake .. \
-G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DBUN_LINK_ONLY=1 \


@@ -45,16 +45,17 @@ bunx cowsay 'Hello, world!' # execute a package
## Install
Bun supports Linux (x64 & arm64) and macOS (x64 & Apple Silicon).
Bun supports Linux (x64 & arm64), macOS (x64 & Apple Silicon) and Windows (x64).
> **Linux users** — Kernel version 5.6 or higher is strongly recommended, but the minimum is 5.1.
>
> **Windows users** — Bun does not currently provide a native Windows build. We're working on this; progress can be tracked at [this issue](https://github.com/oven-sh/bun/issues/43). In the meantime, use one of the installation methods below for Windows Subsystem for Linux.
```sh
# with install script (recommended)
curl -fsSL https://bun.sh/install | bash
# on windows
powershell -c "irm bun.sh/install.ps1 | iex"
# with npm
npm install -g bun

bench/snippets/url.mjs Normal file

@@ -0,0 +1,19 @@
import { bench, run } from "./runner.mjs";
bench(`new URL('https://example.com/')`, () => {
const url = new URL("https://example.com/");
});
bench(`new URL('https://example.com')`, () => {
const url = new URL("https://example.com");
});
bench(`new URL('https://www.example.com')`, () => {
const url = new URL("https://www.example.com");
});
bench(`new URL('https://www.example.com/')`, () => {
const url = new URL("https://www.example.com/");
});
await run();


@@ -1,4 +1,4 @@
import { Database } from "https://deno.land/x/sqlite3@0.9.1/mod.ts";
import { Database } from "https://deno.land/x/sqlite3@0.11.1/mod.ts";
import { run, bench } from "../node_modules/mitata/src/cli.mjs";
const db = new Database("./src/northwind.sqlite");


@@ -146,7 +146,7 @@ const fs = std.fs;
pub fn build(b: *Build) !void {
build_(b) catch |err| {
if (@errorReturnTrace()) |trace| {
std.debug.dumpStackTrace(trace.*);
(std.debug).dumpStackTrace(trace.*);
}
return err;
@@ -394,7 +394,7 @@ pub fn build_(b: *Build) !void {
obj.linkLibC();
obj.dll_export_fns = true;
obj.strip = false;
obj.omit_frame_pointer = optimize != .Debug;
obj.omit_frame_pointer = false;
obj.subsystem = .Console;
// Disable stack probing on x86 so we don't need to include compiler_rt


@@ -96,6 +96,10 @@ FROM alpine:3.18
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
# Ensure `bun install -g` works
ARG BUN_INSTALL_BIN=/usr/local/bin
ENV BUN_INSTALL_BIN=${BUN_INSTALL_BIN}
COPY --from=build /usr/local/bin/bun /usr/local/bin/
COPY docker-entrypoint.sh /usr/local/bin/


@@ -62,6 +62,10 @@ FROM debian:bullseye-slim
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
# Ensure `bun install -g` works
ARG BUN_INSTALL_BIN=/usr/local/bin
ENV BUN_INSTALL_BIN=${BUN_INSTALL_BIN}
COPY docker-entrypoint.sh /usr/local/bin
COPY --from=build /usr/local/bin/bun /usr/local/bin/bun


@@ -3,6 +3,8 @@ FROM debian:bullseye-slim AS build
# https://github.com/oven-sh/bun/releases
ARG BUN_VERSION=latest
# Node.js includes python3 for node-gyp, see https://github.com/oven-sh/bun/issues/9807
# Though, not on slim and alpine images.
RUN apt-get update -qq \
&& apt-get install -qq --no-install-recommends \
ca-certificates \
@@ -11,6 +13,7 @@ RUN apt-get update -qq \
gpg \
gpg-agent \
unzip \
python3 \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& arch="$(dpkg --print-architecture)" \
@@ -63,6 +66,10 @@ COPY --from=build /usr/local/bin/bun /usr/local/bin/bun
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
# Ensure `bun install -g` works
ARG BUN_INSTALL_BIN=/usr/local/bin
ENV BUN_INSTALL_BIN=${BUN_INSTALL_BIN}
RUN groupadd bun \
--gid 1000 \
&& useradd bun \


@@ -62,6 +62,10 @@ FROM gcr.io/distroless/base-nossl-debian11
ARG BUN_RUNTIME_TRANSPILER_CACHE_PATH=0
ENV BUN_RUNTIME_TRANSPILER_CACHE_PATH=${BUN_RUNTIME_TRANSPILER_CACHE_PATH}
# Ensure `bun install -g` works
ARG BUN_INSTALL_BIN=/usr/local/bin
ENV BUN_INSTALL_BIN=${BUN_INSTALL_BIN}
COPY --from=build /usr/local/bin/bun /usr/local/bin/
# Temporarily use the `build`-stage image binaries to create a symlink:


@@ -43,6 +43,7 @@ A `BunFile` can point to a location on disk where a file does not exist.
const notreal = Bun.file("notreal.txt");
notreal.size; // 0
notreal.type; // "text/plain;charset=utf-8"
const exists = await notreal.exists(); // false
```
The default MIME type is `text/plain;charset=utf-8`, but it can be overridden by passing a second argument to `Bun.file`.
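As a minimal sketch of that override (the filename and MIME type here are placeholders):
```ts
// Pass a `type` in the second argument to override the inferred MIME type
const file = Bun.file("data.bin", { type: "application/octet-stream" });
file.type; // => "application/octet-stream"
```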
@@ -340,6 +341,7 @@ interface BunFile {
arrayBuffer(): Promise<ArrayBuffer>;
json(): Promise<any>;
writer(params: { highWaterMark?: number }): FileSink;
exists(): Promise<boolean>;
}
export interface FileSink {


@@ -6,12 +6,12 @@ Bun implements the following properties.
import.meta.dir; // => "/path/to/project"
import.meta.file; // => "file.ts"
import.meta.path; // => "/path/to/project/file.ts"
import.meta.url; // => "file:///path/to/project/file.ts"
import.meta.main; // `true` if this file is directly executed by `bun run`
// `false` otherwise
import.meta.resolveSync("zod")
// resolve an import specifier relative to the directory
import.meta.resolve("zod"); // => "file:///path/to/project/node_modules/zod/index.js"
```
{% table %}
@@ -28,13 +28,18 @@ import.meta.resolveSync("zod")
---
- `import.meta.env`
- An alias to `process.env`.
---
- `import.meta.file`
- The name of the current file, e.g. `index.tsx`
---
- `import.meta.path`
- Absolute path to the current file, e.g. `/path/to/project/index.tx`. Equivalent to `__filename` in CommonJS modules (and Node.js)
- Absolute path to the current file, e.g. `/path/to/project/index.ts`. Equivalent to `__filename` in CommonJS modules (and Node.js)
---
@@ -43,30 +48,22 @@ import.meta.resolveSync("zod")
---
- `import.meta.url`
- A string url to the current file, e.g. `file:///path/to/project/index.tx`
---
- `import.meta.main`
- `boolean` Indicates whether the current file is the entrypoint to the current `bun` process. Is the file being directly executed by `bun run` or is it being imported?
- Indicates whether the current file is the entrypoint to the current `bun` process. Is the file being directly executed by `bun run` or is it being imported?
---
- `import.meta.env`
- An alias to `process.env`.
---
- `import.meta.resolve{Sync}`
- Resolve a module specifier (e.g. `"zod"` or `"./file.tsx"`) to an absolute path. While file would be imported if the specifier were imported from this file?
- `import.meta.resolve`
- Resolve a module specifier (e.g. `"zod"` or `"./file.tsx"`) to a url. Equivalent to [`import.meta.resolve` in browsers](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import.meta#resolve)
```ts
import.meta.resolveSync("zod");
// => "/path/to/project/node_modules/zod/index.ts"
import.meta.resolveSync("./file.tsx");
// => "/path/to/project/file.tsx"
import.meta.resolve("zod");
// => "file:///path/to/project/node_modules/zod/index.ts"
```
---
- `import.meta.url`
- A `string` url to the current file, e.g. `file:///path/to/project/index.ts`. Equivalent to [`import.meta.url` in browsers](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import.meta#url)
{% /table %}


@@ -186,6 +186,7 @@ proc.unref();
## Inter-process communication (IPC)
Bun supports a direct inter-process communication channel between two `bun` processes. To receive messages from a spawned Bun subprocess, specify an `ipc` handler.
{%callout%}
**Note** — This API is only compatible with other `bun` processes. Use `process.execPath` to get a path to the currently running `bun` executable.
{%/callout%}
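A minimal sketch of wiring up that handler (the `child.ts` entrypoint is a placeholder):
```ts
// parent.ts — spawn a bun subprocess and receive its messages
const child = Bun.spawn([process.execPath, "child.ts"], {
  ipc(message, childProc) {
    // called whenever the child invokes process.send(...)
    console.log("received from child:", message);
  },
});

// messages can also flow from parent to child over the same channel
child.send("hello from the parent");
```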
@@ -227,8 +228,6 @@ process.on("message", (message) => {
});
```
All messages are serialized using the JSC `serialize` API, which allows for the same set of [transferrable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) supported by `postMessage` and `structuredClone`, including strings, typed arrays, streams, and objects.
```ts#child.ts
// send a string
process.send("Hello from child as string");
@@ -237,6 +236,11 @@ process.send("Hello from child as string");
process.send({ message: "Hello from child as object" });
```
The `ipcMode` option controls the underlying communication format between the two processes:
- `advanced`: (default) Messages are serialized using the JSC `serialize` API, which supports cloning [everything `structuredClone` supports](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm). This does not support transferring ownership of objects.
- `json`: Messages are serialized using `JSON.stringify` and `JSON.parse`, which does not support as many object types as `advanced` does.
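A hedged sketch of choosing the format, assuming `ipcMode` is passed alongside the `ipc` handler in the `Bun.spawn` options as the list above describes (`child.ts` is again a placeholder):
```ts
const child = Bun.spawn([process.execPath, "child.ts"], {
  // "advanced" (default) uses the JSC serializer; "json" uses JSON.stringify/JSON.parse
  ipcMode: "json",
  ipc(message) {
    console.log("child said:", message);
  },
});
```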
## Blocking API (`Bun.spawnSync()`)
Bun provides a synchronous equivalent of `Bun.spawn` called `Bun.spawnSync`. This is a blocking API that supports the same inputs and parameters as `Bun.spawn`. It returns a `SyncSubprocess` object, which differs from `Subprocess` in a few ways.


@@ -62,7 +62,7 @@ const db = new Database("mydb.sqlite", { create: true });
You can also use an import attribute to load a database.
```ts
import db from "./mydb.sqlite" with {"type": "sqlite"};
import db from "./mydb.sqlite" with { "type": "sqlite" };
console.log(db.query("select * from users LIMIT 1").get());
```
@@ -74,16 +74,39 @@ import { Database } from "bun:sqlite";
const db = new Database("./mydb.sqlite");
```
### `.close()`
### `.close(throwOnError: boolean = false)`
To close a database:
To close a database connection, but allow existing queries to finish, call `.close(false)`:
```ts
const db = new Database();
db.close();
// ... do stuff
db.close(false);
```
Note: `close()` is called automatically when the database is garbage collected. It is safe to call multiple times but has no effect after the first.
To close the database and throw an error if there are any pending queries, call `.close(true)`:
```ts
const db = new Database();
// ... do stuff
db.close(true);
```
Note: `close(false)` is called automatically when the database is garbage collected. It is safe to call multiple times but has no effect after the first.
### `using` statement
You can use the `using` statement to ensure that a database connection is closed when the `using` block is exited.
```ts
import { Database } from "bun:sqlite";
{
using db = new Database("mydb.sqlite");
using query = db.query("select 'Hello world' as message;");
console.log(query.get()); // => { message: "Hello world" }
}
```
### `.serialize()`
@@ -128,6 +151,8 @@ db.exec("PRAGMA journal_mode = WAL;");
{% details summary="What is WAL mode" %}
In WAL mode, writes to the database are written directly to a separate file called the "WAL file" (write-ahead log). This file will later be integrated into the main database file. Think of it as a buffer for pending writes. Refer to the [SQLite docs](https://www.sqlite.org/wal.html) for a more detailed overview.
On macOS, WAL files may be persistent by default. This is not a bug; it is how macOS configured the system version of SQLite.
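As a small sketch, the journal mode can be read back after enabling WAL (the exact shape of the returned row is an assumption):
```ts
import { Database } from "bun:sqlite";

const db = new Database("mydb.sqlite");
db.exec("PRAGMA journal_mode = WAL;");
// PRAGMA statements return rows, so the current mode can be queried
console.log(db.query("PRAGMA journal_mode;").get()); // e.g. { journal_mode: "wal" }
```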
{% /details %}
## Statements
@@ -387,6 +412,25 @@ db.loadExtension("myext");
{% /details %}
### .fileControl(cmd: number, value: any)
To use the advanced `sqlite3_file_control` API, call `.fileControl(cmd, value)` on your `Database` instance.
```ts
import { Database, constants } from "bun:sqlite";
const db = new Database();
// Ensure WAL mode is NOT persistent
// this prevents wal files from lingering after the database is closed
db.fileControl(constants.SQLITE_FCNTL_PERSIST_WAL, 0);
```
`value` can be:
- `number`
- `TypedArray`
- `undefined` or `null`
## Reference
```ts


@@ -635,7 +635,7 @@ Bun.resolveSync("zod", "/path/to/project");
// => "/path/to/project/node_modules/zod/index.ts"
```
To resolve relative to the current working directory, pass `process.cwd` or `"."` as the root.
To resolve relative to the current working directory, pass `process.cwd()` or `"."` as the root.
```ts
Bun.resolveSync("./foo.ts", process.cwd());


@@ -156,7 +156,7 @@ Like the Bun runtime, the bundler supports an array of file types out of the box
---
- `.js` `.jsx`, `.cjs` `.mjs` `.mts` `.cts` `.ts` `.tsx`
- Uses Bun's built-in transpiler to parse the file and transpile TypeScript/JSX syntax to vanilla JavaScript. The bundler executes a set of default transforms, including dead code elimination, tree shaking, and environment variable inlining. At the moment Bun does not attempt to down-convert syntax; if you use recently ECMAScript syntax, that will be reflected in the bundled code.
- Uses Bun's built-in transpiler to parse the file and transpile TypeScript/JSX syntax to vanilla JavaScript. The bundler executes a set of default transforms, including dead code elimination and tree shaking. At the moment Bun does not attempt to down-convert syntax; if you use recent ECMAScript syntax, that will be reflected in the bundled code.
---


@@ -10,7 +10,7 @@ Bun uses the file extension to determine which built-in _loader_ should be used
**JavaScript**. Default for `.cjs` and `.mjs`.
Parses the code and applies a set of default transforms, like dead-code elimination, tree shaking, and environment variable inlining. Note that Bun does not attempt to down-convert syntax at the moment.
Parses the code and applies a set of default transforms like dead-code elimination and tree shaking. Note that Bun does not attempt to down-convert syntax at the moment.
### `jsx`
@@ -178,7 +178,7 @@ In the bundler, `.node` files are handled using the [`file`](#file) loader.
In the runtime and bundler, SQLite databases can be directly imported. This will load the database using [`bun:sqlite`](/docs/api/sqlite.md).
```ts
import db from "./my.db" with {type: "sqlite"};
import db from "./my.db" with { type: "sqlite" };
```
This is only supported when the `target` is `bun`.
@@ -189,21 +189,21 @@ You can change this behavior with the `"embed"` attribute:
```ts
// embed the database into the bundle
import db from "./my.db" with {type: "sqlite", embed: "true"};
import db from "./my.db" with { type: "sqlite", embed: "true" };
```
When using a [standalone executable](/docs/bundler/executables), the database is embedded into the single-file executable.
Otherwise, the database to embed is copied into the `outdir` with a hashed filename.
### `bunshell` loader
### `sh` loader
**Bun Shell loader**. Default for `.bun.sh` files
**Bun Shell loader**. Default for `.sh` files
This loader is used to parse [Bun Shell](/docs/runtime/shell) scripts. It's only supported when starting bun itself, so it's not available in the bundler or in the runtime.
This loader is used to parse [Bun Shell](/docs/runtime/shell) scripts. It's only supported when starting Bun itself, so it's not available in the bundler or in the runtime.
```sh
$ bun run ./script.bun.sh
$ bun run ./script.sh
```
### `file`


@@ -77,4 +77,4 @@ Bun automatically loads environment variables from `.env` files before running a
2. `NODE_ENV` === `"production"` ? `.env.production` : `.env.development`
3. `.env`
To debug environment variables, run `bun run env` to view a list of resolved environment variables. -->
To debug environment variables, run `bun --print process.env` to view a list of resolved environment variables. -->

docs/cli/filter.md Normal file

@@ -0,0 +1,58 @@
Use the `--filter` flag to execute lifecycle scripts in multiple packages at once:
```bash
bun --filter <pattern> <script>
```
Say you have a monorepo with two packages: `packages/api` and `packages/frontend`, both with a `dev` script that will start a local development server. Normally, you would have to open two separate terminal tabs, cd into each package directory, and run `bun dev`:
```bash
cd packages/api
bun dev
# in another terminal
cd packages/frontend
bun dev
```
Using `--filter`, you can run the `dev` script in both packages at once:
```bash
bun --filter '*' dev
```
Both commands will be run in parallel, and you will see a nice terminal UI showing their respective outputs:
![Terminal Output](https://github.com/oven-sh/bun/assets/48869301/2a103e42-9921-4c33-948f-a1ad6e6bac71)
## Matching
`--filter` accepts a pattern to match specific packages, either by name or by path. Patterns have full support for glob syntax.
### Package Name `--filter <pattern>`
Name patterns select packages based on the package name, as specified in `package.json`. For example, if you have packages `pkga`, `pkgb` and `other`, you can match all packages with `*`, only `pkga` and `pkgb` with `pkg*`, and a specific package by providing the full name of the package.
### Package Path `--filter ./<glob>`
Path patterns are specified by starting the pattern with `./`, and will select all packages in directories that match the pattern. For example, to match all packages in subdirectories of `packages`, you can use `--filter './packages/**'`. To match a package located in `pkgs/foo`, use `--filter ./pkgs/foo`.
## Workspaces
Filters respect your [workspace configuration](/docs/install/workspaces.md): if your `package.json` specifies which packages are part of the workspace,
`--filter` is restricted to those packages. In a workspace, you can also use `--filter` to run scripts in packages located anywhere in the workspace:
```bash
# Packages
# src/foo
# src/bar
# in src/bar: runs myscript in src/foo, no need to cd!
bun run --filter foo myscript
```
## Dependency Order
Bun will respect package dependency order when running scripts. Say you have a package `foo` that depends on another package `bar` in your workspace, and both packages have a `build` script. When you run `bun --filter '*' build`, you will notice that `foo` will only start running once `bar` is done.
### Cyclic Dependencies


@@ -151,6 +151,19 @@ By default, Bun respects this shebang and executes the script with `node`. Howev
$ bun run --bun vite
```
### Filtering
In monorepos containing multiple packages, you can use the `--filter` argument to execute scripts in many packages at once.
Use `bun run --filter <name_pattern> <script>` to execute `<script>` in all packages whose name matches `<name_pattern>`.
For example, if you have subdirectories containing packages named `foo`, `bar` and `baz`, running
```bash
bun run --filter 'ba*' <script>
```
will execute `<script>` in both `bar` and `baz`, but not in `foo`.
Find more details in the docs page for [filter](/docs/cli/filter.md).
## `bun run -` to pipe code from stdin
`bun run -` lets you read JavaScript, TypeScript, TSX, or JSX from stdin and execute it without writing to a temporary file first.


@@ -0,0 +1,55 @@
---
name: Use Neon's Serverless Postgres with Bun
---
[Neon](https://neon.tech/) is a fully managed serverless Postgres. Neon separates compute and storage to offer modern developer features such as autoscaling, branching, bottomless storage, and more.
---
Get started by creating a project directory, initializing the directory using `bun init`, and adding the [Neon serverless driver](https://github.com/neondatabase/serverless/) as a project dependency.
```sh
$ mkdir bun-neon-postgres
$ cd bun-neon-postgres
$ bun init -y
$ bun add @neondatabase/serverless
```
---
Create a `.env.local` file and add your [Neon Postgres connection string](https://neon.tech/docs/connect/connect-from-any-app) to it.
```sh
DATABASE_URL=postgresql://username:password@ep-adj-noun-guid.us-east-1.aws.neon.tech/neondb?sslmode=require
```
---
Paste the following code into your project's `index.ts` file.
```ts
import { neon } from "@neondatabase/serverless";
// Bun automatically loads the DATABASE_URL from .env.local
// Refer to: https://bun.sh/docs/runtime/env for more information
const sql = neon(process.env.DATABASE_URL);
const rows = await sql`SELECT version()`;
console.log(rows[0].version);
```
---
Start the program using `bun ./index.ts`. The Postgres version should be printed to the console.
```sh
$ bun ./index.ts
PostgreSQL 16.2 on x86_64-pc-linux-gnu, compiled by gcc (Debian 10.2.1-6) 10.2.1 20210110, 64-bit
```
---
This example used the Neon serverless driver's SQL-over-HTTP functionality. Neon's serverless driver also exposes `Client` and `Pool` constructors to enable sessions, interactive transactions, and node-postgres compatibility.
Refer to [Neon's documentation](https://neon.tech/docs/serverless/serverless-driver) for a complete overview of the serverless driver.


@@ -0,0 +1,47 @@
---
name: Streaming HTTP Server with Async Iterators
---
In Bun, [`Response`](https://developer.mozilla.org/en-US/docs/Web/API/Response) objects can accept an async generator function as their body. This allows you to stream data to the client as it becomes available, rather than waiting for the entire response to be ready.
```ts
Bun.serve({
port: 3000,
fetch(req) {
return new Response(
// An async generator function
async function* () {
yield "Hello, ";
await Bun.sleep(100);
yield "world!";
// you can also yield a TypedArray or Buffer
yield new Uint8Array(["\n".charCodeAt(0)]);
},
{ headers: { "Content-Type": "text/plain" } },
);
},
});
```
---
You can pass any async iterable directly to `Response`:
```ts
Bun.serve({
port: 3000,
fetch(req) {
return new Response(
{
[Symbol.asyncIterator]: async function* () {
yield "Hello, ";
await Bun.sleep(100);
yield "world!";
},
},
{ headers: { "Content-Type": "text/plain" } },
);
},
});
```


@@ -0,0 +1,20 @@
---
name: Streaming HTTP Server with Node.js Streams
---
In Bun, [`Response`](https://developer.mozilla.org/en-US/docs/Web/API/Response) objects can accept a Node.js [`Readable`](https://nodejs.org/api/stream.html#stream_readable_streams).
This works because Bun's `Response` object allows any async iterable as its body. Node.js streams are async iterables, so you can pass them directly to `Response`.
```ts
import { Readable } from "stream";
import { serve } from "bun";
serve({
port: 3000,
fetch(req) {
return new Response(Readable.from(["Hello, ", "world!"]), {
headers: { "Content-Type": "text/plain" },
});
},
});
```


@@ -18,10 +18,10 @@ Bun.env.API_TOKEN; // => "secret"
---
To print all currently-set environment variables to the command line, run `bun run env`. This is useful for debugging.
To print all currently-set environment variables to the command line, run `bun --print process.env`. This is useful for debugging.
```sh
$ bun run env
$ bun --print process.env
BAZ=stuff
FOOBAR=aaaaaa
<lots more lines>


@@ -0,0 +1,11 @@
---
name: Convert a Node.js Readable to an ArrayBuffer
---
To convert a Node.js `Readable` stream to an `ArrayBuffer` in Bun, you can create a new `Response` object with the stream as the body, then use `arrayBuffer()` to read the stream into an `ArrayBuffer`.
```ts
import { Readable } from "stream";
const stream = Readable.from(["Hello, ", "world!"]);
const buf = await new Response(stream).arrayBuffer();
```


@@ -0,0 +1,11 @@
---
name: Convert a Node.js Readable to a Blob
---
To convert a Node.js `Readable` stream to a [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) in Bun, you can create a new [`Response`](https://developer.mozilla.org/en-US/docs/Web/API/Response) object with the stream as the body, then use [`response.blob()`](https://developer.mozilla.org/en-US/docs/Web/API/Response/blob) to read the stream into a [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob).
```ts
import { Readable } from "stream";
const stream = Readable.from(["Hello, ", "world!"]);
const blob = await new Response(stream).blob();
```


@@ -0,0 +1,12 @@
---
name: Convert a Node.js Readable to JSON
---
To convert a Node.js `Readable` stream to a JSON object in Bun, you can create a new [`Response`](https://developer.mozilla.org/en-US/docs/Web/API/Response) object with the stream as the body, then use [`response.json()`](https://developer.mozilla.org/en-US/docs/Web/API/Response/json) to read the stream into a JSON object.
```ts
import { Readable } from "stream";
const stream = Readable.from([JSON.stringify({ hello: "world" })]);
const json = await new Response(stream).json();
console.log(json); // { hello: "world" }
```


@@ -0,0 +1,12 @@
---
name: Convert a Node.js Readable to a string
---
To convert a Node.js `Readable` stream to a string in Bun, you can create a new [`Response`](https://developer.mozilla.org/en-US/docs/Web/API/Response) object with the stream as the body, then use [`response.text()`](https://developer.mozilla.org/en-US/docs/Web/API/Response/text) to read the stream into a string.
```ts
import { Readable } from "stream";
const stream = Readable.from([Buffer.from("Hello, world!")]);
const text = await new Response(stream).text();
console.log(text); // "Hello, world!"
```


@@ -4,7 +4,7 @@ name: Migrate from Jest to Bun's test runner
In many cases, Bun's test runner can run Jest test suites with no code changes. Just run `bun test` instead of `npx jest`, `yarn test`, etc.
```sh-diff
```sh
- $ npx jest
- $ yarn test
+ $ bun test
@@ -57,7 +57,7 @@ Replace `bail` in your Jest config with the `--bail` CLI flag.
- };
``` -->
```sh-diff
```sh
$ bun test --bail 3
```


@@ -61,6 +61,7 @@ Workspaces have a couple major benefits.
- **Code can be split into logical parts.** If one package relies on another, you can simply add it as a dependency in `package.json`. If package `b` depends on `a`, `bun install` will install your local `packages/a` directory into `node_modules` instead of downloading it from the npm registry.
- **Dependencies can be de-duplicated.** If `a` and `b` share a common dependency, it will be _hoisted_ to the root `node_modules` directory. This reduces redundant disk usage and minimizes "dependency hell" issues associated with having multiple versions of a package installed simultaneously.
- **Run scripts in multiple packages.** You can use the [`--filter` flag](/docs/cli/filter.md) to easily run `package.json` scripts in multiple packages in your workspace.
{% callout %}
⚡️ **Speed** — Installs are fast, even for big monorepos. Bun installs the [Remix](https://github.com/remix-run/remix) monorepo in about `500ms` on Linux.


@@ -42,21 +42,20 @@ $ proto install bun
Bun requires a minimum of Windows 10 version 1809
{% /callout %}
Bun provides a _limited, experimental_ native build for Windows. It is recommended to use Bun within [Windows Subsystem for Linux](https://learn.microsoft.com/en-us/windows/wsl/install) and follow the above instructions. To help catch bugs, the experimental build enables many debugging assertions, which will make the binary slower than what the stable version will be.
To install, paste this into a terminal:
{% codetabs %}
```powershell#PowerShell/cmd.exe
# WARNING: No stability is guaranteed on the experimental Windows builds
powershell -c "irm bun.sh/install.ps1|iex"
> powershell -c "irm bun.sh/install.ps1|iex"
```
```powershell#npm
> npm install -g bun # the last `npm` command you'll ever need
```
```powershell#Scoop
# WARNING: No stability is guaranteed on the experimental Windows builds
scoop bucket add versions
scoop install bun-canary
> scoop install bun
```
{% /codetabs %}
@@ -145,6 +144,8 @@ $ bun upgrade
{% callout %}
**Homebrew users** — To avoid conflicts with Homebrew, use `brew upgrade bun` instead.
**Scoop users** — To avoid conflicts with Scoop, use `scoop upgrade bun` instead.
**proto users** - Use `proto install bun --pin` instead.
{% /callout %}
@@ -233,10 +234,14 @@ $ rm -rf ~/.bun # for macOS, Linux, and WSL
```
```powershell#Windows
powershell -c ~\.bun\uninstall.ps1
> powershell -c ~\.bun\uninstall.ps1
```
```bash#NPM
```powershell#Scoop
> scoop uninstall bun
```
```bash#npm
$ npm uninstall -g bun
```


@@ -180,6 +180,9 @@ export default {
page("install/lifecycle", "Lifecycle scripts", {
description: "How Bun handles package lifecycle scripts with trustedDependencies",
}),
page("cli/filter", "Filter", {
description: "Run scripts in multiple packages in parallel",
}),
page("install/lockfile", "Lockfile", {
description:
"Bun's binary lockfile `bun.lockb` tracks your resolved dependency tree, making future installs fast and repeatable.",


@@ -39,7 +39,7 @@ I recommend using VSCode through SSH instead of Tunnels or the Tailscale extensi
By default, running unverified scripts are blocked.
```ps1
Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy Unrestricted
> Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy Unrestricted
```
### System Dependencies
@@ -47,7 +47,7 @@ Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy Unrestricted
- Bun 1.1 or later. We use Bun to run its own code generators.
```ps1
irm bun.sh/install.ps1 | iex
> irm bun.sh/install.ps1 | iex
```
- [Visual Studio](https://visualstudio.microsoft.com) with the "Desktop Development with C++" workload.
@@ -70,28 +70,28 @@ The Zig compiler is automatically downloaded, installed, and updated by the buil
[Scoop](https://scoop.sh) can be used to install these remaining tools easily:
```ps1
irm https://get.scoop.sh | iex
scoop install nodejs-lts go rust nasm ruby perl
scoop llvm@16.0.4 # scoop bug if you install llvm and the rest at the same time
> irm https://get.scoop.sh | iex
> scoop install nodejs-lts go rust nasm ruby perl
# scoop seems to be buggy if you install llvm and the rest at the same time
> scoop install llvm@16.0.4
```
If you intend on building WebKit locally (optional), you should install these packages:
```ps1
scoop install make cygwin python
> scoop install make cygwin python
```
From here on out, it is **expected that you use a PowerShell terminal with `.\scripts\env.ps1` sourced**. This script is available in the Bun repository and can be loaded by executing it:
```ps1
.\scripts\env.ps1
> .\scripts\env.ps1
```
To verify, you can check for an MSVC-only command-line tool such as `mt.exe`:
```ps1
Get-Command mt
> Get-Command mt
```
{% callout %}
@@ -101,24 +101,24 @@ It is not recommended to install `ninja` / `cmake` into your global path, becaus
## Building
```ps1
bun install
> bun install
.\scripts\env.ps1
.\scripts\update-submodules.ps1 # this syncs git submodule state
.\scripts\all-dependencies.ps1 # this builds all dependencies
.\scripts\make-old-js.ps1 # runs some old code generators
> .\scripts\env.ps1
> .\scripts\update-submodules.ps1 # this syncs git submodule state
> .\scripts\all-dependencies.ps1 # this builds all dependencies
> .\scripts\make-old-js.ps1 # runs some old code generators
# Configure build environment
cmake -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Debug
> cmake -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Debug
# Build bun
ninja -Cbuild
> ninja -Cbuild
```
If this was successful, you should have a `bun-debug.exe` in the `build` folder.
```ps1
.\build\bun-debug.exe --revision
> .\build\bun-debug.exe --revision
```
You should add this to `$Env:PATH`. The simplest way to do so is to open the start menu, type "Path", and then navigate the environment variables menu to add `C:\.....\bun\build` to the user environment variable `PATH`. You should then restart your editor (if it still does not update, log out and log back in).
@@ -134,15 +134,15 @@ You can run the test suite either using `bun test`, or by using the wrapper scri
```ps1
# Setup
bun i --cwd packages\bun-internal-test
> bun i --cwd packages\bun-internal-test
# Run the entire test suite with reporter
# the package.json script "test" uses "build/bun-debug.exe" by default
bun run test
> bun run test
# Run an individual test file:
bun-debug test node\fs
bun-debug test "C:\bun\test\js\bun\resolve\import-meta.test.js"
> bun-debug test node\fs
> bun-debug test "C:\bun\test\js\bun\resolve\import-meta.test.js"
```
## Troubleshooting

View File

@@ -1,6 +1,6 @@
Configuring a development environment for Bun can take 10-30 minutes depending on your internet connection and computer speed. You will need ~10GB of free disk space for the repository and build artifacts.
If you are using Windows, you must use a WSL environment as Bun does not yet compile on Windows natively.
If you are using Windows, please refer to [this guide](/docs/project/building-windows).
## Install Dependencies

View File

@@ -98,10 +98,10 @@ Bun.env.API_TOKEN; // => "secret"
import.meta.env.API_TOKEN; // => "secret"
```
To print all currently-set environment variables to the command line, run `bun run env`. This is useful for debugging.
To print all currently-set environment variables to the command line, run `bun --print process.env`. This is useful for debugging.
```sh
$ bun run env
$ bun --print process.env
BAZ=stuff
FOOBAR=aaaaaa
<lots more lines>

View File

@@ -18,7 +18,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:child_process`](https://nodejs.org/api/child_process.html)
🟡 Missing `Stream` stdio, `proc.gid` `proc.uid`. IPC has partial support and only current only works with other `bun` processes.
🟡 Missing `Stream` stdio, `proc.gid` `proc.uid`. IPC cannot send socket handles and only works with other `bun` processes.
### [`node:cluster`](https://nodejs.org/api/cluster.html)

View File

@@ -1,9 +1,5 @@
Bun Shell makes shell scripting with JavaScript & TypeScript fun. It's a cross-platform bash-like shell with seamless JavaScript interop.
{% callout type="note" %}
**Alpha-quality software**: Bun Shell is an unstable API still under development. If you have feature requests or run into bugs, please open an issue. There may be breaking changes in the future.
{% /callout %}
Quickstart:
```js
@@ -23,6 +19,8 @@ await $`cat < ${response} | wc -c`; // 1256
- **Template literals**: Template literals are used to execute shell commands. This allows for easy interpolation of variables and expressions.
- **Safety**: Bun Shell escapes all strings by default, preventing shell injection attacks (see the sketch after this list).
- **JavaScript interop**: Use `Response`, `ArrayBuffer`, `Blob`, `Bun.file(path)` and other JavaScript objects as stdin, stdout, and stderr.
- **Shell scripting**: Bun Shell can be used to run shell scripts (`.bun.sh` files).
- **Custom interpreter**: Bun Shell is written in Zig, along with its lexer, parser, and interpreter. Bun Shell is a small programming language.
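As a minimal sketch of the safety point above (assuming the default escaping behavior described in this list), an interpolated string is passed to the command as literal text rather than being re-parsed by the shell:

```js
import { $ } from "bun";

// The interpolated value is escaped, so the `;` does not start a second command.
const untrusted = "snacks; echo pwned";
await $`echo ${untrusted}`; // prints: snacks; echo pwned
```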
## Getting started
@@ -53,16 +51,66 @@ const welcome = await $`echo "Hello World!"`.text();
console.log(welcome); // Hello World!\n
```
To get stdout, stderr, and the exit code, use await or `.run`:
By default, `await`ing will return stdout and stderr as `Buffer`s.
```js
import { $ } from "bun";
const { stdout, stderr, exitCode } = await $`echo "Hello World!"`.quiet();
const { stdout, stderr } = await $`echo "Hello World!"`.quiet();
console.log(stdout); // Buffer(6) [ 72, 101, 108, 108, 111, 32 ]
console.log(stderr); // Buffer(0) []
console.log(exitCode); // 0
```
## Error handling
By default, non-zero exit codes will throw an error. This `ShellError` contains information about the command run.
```js
import { $ } from "bun";
try {
const output = await $`something-that-may-fail`.text();
console.log(output);
} catch (err) {
console.log(`Failed with code ${err.exitCode}`);
console.log(err.stdout.toString());
console.log(err.stderr.toString());
}
```
Throwing can be disabled with `.nothrow()`. The result's `exitCode` will need to be checked manually.
```js
import { $ } from "bun";
const { stdout, stderr, exitCode } = await $`something-that-may-fail`
.nothrow()
.quiet();
if (exitCode !== 0) {
console.log(`Non-zero exit code ${exitCode}`);
}
console.log(stdout);
console.log(stderr);
```
The default handling of non-zero exit codes can be configured by calling `.nothrow()` or `.throws(boolean)` on the `$` function itself.
```js
import { $ } from "bun";
// shell promises will not throw, meaning you will have to
// check for `exitCode` manually on every shell command.
$.nothrow(); // equivalent to $.throws(false)
// default behavior, non-zero exit codes will throw an error
$.throws(true);
// alias for $.nothrow()
$.throws(false);
await $`something-that-may-fail`; // No exception thrown
```
## Redirection
@@ -89,9 +137,8 @@ To redirect stdout to a JavaScript object, use the `>` operator:
import { $ } from "bun";
const buffer = Buffer.alloc(100);
const result = await $`echo "Hello World!" > ${buffer}`;
await $`echo "Hello World!" > ${buffer}`;
console.log(result.exitCode); // 0
console.log(buffer.toString()); // Hello World!\n
```
@@ -105,7 +152,7 @@ The following JavaScript objects are supported for redirection to:
To redirect the output from JavaScript objects to stdin, use the `<` operator:
```js
import { $, file } from "bun";
import { $ } from "bun";
const response = new Response("hello i am a response body");
@@ -144,7 +191,7 @@ import { $ } from "bun";
await $`bun run index.ts 2> errors.txt`;
```
### Example: Redirect stdout -> stderr
### Example: Redirect stderr -> stdout
```js
import { $ } from "bun";
@@ -154,7 +201,7 @@ import { $ } from "bun";
await $`bun run ./index.ts 2>&1`;
```
### Example: Redirect stderr -> stdout
### Example: Redirect stdout -> stderr
```js
import { $ } from "bun";
@@ -352,6 +399,18 @@ For cross-platform compatibility, Bun Shell implements a set of builtin commands
- `echo`: print text
- `pwd`: print the working directory
- `bun`: run bun in bun
- `cat`
- `touch`
- `mkdir`
- `which`
- `mv`
- `exit`
- `true`
- `false`
- `yes`
- `seq`
- `dirname`
- `basename`
**Partially** implemented:
@@ -359,9 +418,7 @@ For cross-platform compatibility, Bun Shell implements a set of builtin commands
**Not** implemented yet, but planned:
- `mkdir`: create directories
- `cp`: copy files and directories
- `cat`: concatenate files
- See https://github.com/oven-sh/bun/issues/9716 for the full list.
## Utilities
@@ -404,24 +461,28 @@ await $`echo ${{ raw: '$(foo) `bar` "baz"' }}`;
For simple shell scripts, instead of `/bin/sh`, you can use Bun Shell to run shell scripts.
To do so, just run the script with `bun` on a file with the `.bun.sh` extension.
To do so, just run the script with `bun` on a file with the `.sh` extension.
```sh#script.bun.sh
```sh#script.sh
echo "Hello World! pwd=$(pwd)"
```
```sh
$ bun ./script.bun.sh
$ bun ./script.sh
Hello World! pwd=/home/demo
```
Scripts with Bun Shell are cross-platform, which means they work on Windows:
```
PS C:\Users\Demo> bun .\script.bun.sh
```powershell
> bun .\script.sh
Hello World! pwd=C:\Users\Demo
```
## Implementation notes
Bun Shell is a small programming language in Bun that is implemented in Zig. It includes a handwritten lexer, parser, and interpreter. Unlike bash, zsh, and other shells, Bun Shell runs operations concurrently.
## Credits
Large parts of this API were inspired by [zx](https://github.com/google/zx), [dax](https://github.com/dsherret/dax), and [bnx](https://github.com/wobsoriano/bnx). Thank you to the authors of those projects.

View File

@@ -57,7 +57,7 @@ coverageThreshold = { lines = 0.9, functions = 0.9 }
### Sourcemaps
Internally, Bun transpiles all files by default, so Bun automatically generates an internal [source map](https://web.dev/source-maps/) that maps lines of your original source code onto Bun's internal representation. If for any reason you want to disable this, set `test.coverageIgnoreSourcemaps` to `false`; this will rarely be desirable outside of advanced use cases.
Internally, Bun transpiles all files by default, so Bun automatically generates an internal [source map](https://web.dev/source-maps/) that maps lines of your original source code onto Bun's internal representation. If for any reason you want to disable this, set `test.coverageIgnoreSourcemaps` to `true`; this will rarely be desirable outside of advanced use cases.
```toml
[test]

View File

@@ -156,6 +156,8 @@ test.if(macOS)("runs on macOS", () => {
});
```
## `test.skipIf`
To instead skip a test based on some condition, use `test.skipIf()` or `describe.skipIf()`.
```ts
@@ -166,16 +168,32 @@ test.skipIf(macOS)("runs on non-macOS", () => {
});
```
## `test.todoIf`
If instead you want to mark the test as TODO, use `test.todoIf()` or `describe.todoIf()`. Choosing carefully between `skipIf` and `todoIf` communicates intent, for example the difference between "invalid for this target" and "planned but not implemented yet."
```ts
const macOS = process.platform === "darwin";
// TODO: we've only implemented this for Linux so far.
test.todoIf(macOS)("runs on posix", () => {
// runs if *not* macOS
});
```
## `test.each`
To return a function for multiple cases in a table of tests, use `test.each`.
```ts
const cases = [[1, 2, 3], [3, 4, 5]];
const cases = [
[1, 2, 3],
[3, 4, 5],
];
test.each(cases)("%p + %p should be %p", (a, b, expected) => {
// runs once for each test case provided
})
// runs once for each test case provided
});
```
There are a number of options available for formatting the case label depending on its type.
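For example, here is a sketch assuming printf-style specifiers such as `%s` (string) and `%i` (integer), in addition to the `%p` shown above:

```ts
import { test, expect } from "bun:test";

test.each([
  ["bun", 3],
  ["hello", 5],
])("length of %s is %i", (word, length) => {
  expect(word.length).toBe(length);
});
```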

View File

@@ -14,13 +14,13 @@ pub usingnamespace @import("root").bun;
const clap = bun.clap;
const URL = @import("../src/url.zig").URL;
const Headers = @import("root").bun.http.Headers;
const Headers = bun.http.Headers;
const Method = @import("../src/http/method.zig").Method;
const ColonListType = @import("../src/cli/colon_list_type.zig").ColonListType;
const HeadersTuple = ColonListType(string, noop_resolver);
const path_handler = @import("../src/resolver/resolve_path.zig");
const HTTPThread = @import("root").bun.http.HTTPThread;
const HTTP = @import("root").bun.http;
const HTTPThread = bun.http.HTTPThread;
const HTTP = bun.http;
fn noop_resolver(in: string) !string {
return in;
}

View File

@@ -177,7 +177,7 @@ pub const Arguments = struct {
}
};
const HTTP = @import("root").bun.http;
const HTTP = bun.http;
const NetworkThread = HTTP.NetworkThread;
var stdout_: std.fs.File = undefined;

View File

@@ -13,13 +13,13 @@ const C = bun.C;
const clap = @import("../src/deps/zig-clap/clap.zig");
const URL = @import("../src/url.zig").URL;
const Headers = @import("root").bun.http.Headers;
const Headers = bun.http.Headers;
const Method = @import("../src/http/method.zig").Method;
const ColonListType = @import("../src/cli/colon_list_type.zig").ColonListType;
const HeadersTuple = ColonListType(string, noop_resolver);
const path_handler = @import("../src/resolver/resolve_path.zig");
const NetworkThread = @import("root").bun.http.NetworkThread;
const HTTP = @import("root").bun.http;
const NetworkThread = bun.http.NetworkThread;
const HTTP = bun.http;
fn noop_resolver(in: string) !string {
return in;
}

View File

@@ -26,13 +26,13 @@
"build:release": "cmake . -DCMAKE_BUILD_TYPE=Release -GNinja -Bbuild-release && ninja -Cbuild-release",
"build:debug-zig-release": "cmake . -DCMAKE_BUILD_TYPE=Release -DZIG_OPTIMIZE=Debug -GNinja -Bbuild-debug-zig-release && ninja -Cbuild-debug-zig-release",
"build:safe": "cmake . -DZIG_OPTIMIZE=ReleaseSafe -DUSE_DEBUG_JSC=ON -DCMAKE_BUILD_TYPE=Release -GNinja -Bbuild-safe && ninja -Cbuild-safe",
"build:windows": "cmake -B build -S . -G Ninja -DCMAKE_BUILD_TYPE=Debug && ninja -Cbuild",
"typecheck": "tsc --noEmit && cd test && bun run typecheck",
"fmt": "prettier --write --cache './{.vscode,src,test,bench,packages/{bun-types,bun-inspector-*,bun-vscode,bun-debug-adapter-protocol}}/**/*.{mjs,ts,tsx,js,jsx}'",
"fmt:zig": "zig fmt src/*.zig src/*/*.zig src/*/*/*.zig src/*/*/*/*.zig",
"lint": "eslint './**/*.d.ts' --cache",
"lint:fix": "eslint './**/*.d.ts' --cache --fix",
"test": "node packages/bun-internal-test/src/runner.node.mjs ./build/bun-debug",
"test:release": "node packages/bun-internal-test/src/runner.node.mjs ./build-release/bun",
"update-known-failures": "node packages/bun-internal-test/src/update-known-windows-failures.mjs"
"test:release": "node packages/bun-internal-test/src/runner.node.mjs ./build-release/bun"
}
}

View File

@@ -0,0 +1,12 @@
{
" != undefined": "This is by definition Undefined Behavior.",
" == undefined": "This is by definition Undefined Behavior.",
"@import(\"root\").bun.": "Only import 'bun' once",
"std.debug.assert": "Use bun.assert instead",
"std.debug.dumpStackTrace": "Use bun.handleErrorReturnTrace or bun.crash_handler.dumpStackTrace instead",
"std.debug.print": "Don't let this be committed",
"std.mem.indexOfAny": "Use bun.strings.indexAny or bun.strings.indexAnyComptime",
"undefined != ": "This is by definition Undefined Behavior.",
"undefined == ": "This is by definition Undefined Behavior.",
"": ""
}

View File

@@ -0,0 +1,72 @@
import { $ } from "bun";
import BANNED from "./banned.json";
import * as action from "@actions/core";
const IGNORED_FOLDERS = [
// list of folders to ignore
"windows-shim",
];
const ci = !!process.env["GITHUB_ACTIONS"];
process.chdir(require("path").join(import.meta.dir, "../../../"));
let bad = [];
let report = "";
const write = (text: string) => {
process.stdout.write(text);
report += text;
};
for (const [banned, suggestion] of Object.entries(BANNED)) {
if (banned.length === 0) continue;
// Run git grep to find occurrences of std.debug.assert in .zig files
// .nothrow() is here since git will exit with non-zero if no matches are found.
let stdout = await $`git grep -n -F "${banned}" "src/**/**.zig" | grep -v -F '//' | grep -v -F bench`
.nothrow()
.text();
stdout = stdout.trim();
if (stdout.length === 0) continue;
let lines = stdout.split("\n");
// Parse each line to extract filename and line number
const matches = lines
.filter(line => !IGNORED_FOLDERS.some(folder => line.includes(folder)))
.map(line => {
const [path, lineNumber, ...text] = line.split(":");
return { path, lineNumber, banned, suggestion, text: text.join(":") };
});
// Check if we got any output
// Split the output into lines
if (matches.length === 0) continue;
write(`Banned **'${banned}'** found in the following locations:` + "\n");
matches.forEach(match => {
write(`${match.path}:${match.lineNumber}: ${match.text.trim()}` + "\n");
});
bad = bad.concat(matches);
}
if (report.length === 0) {
process.exit(0);
}
function link({ path, lineNumber, suggestion, banned }) {
action.error(`Lint failure: ${banned} is banned, ${suggestion}`, {
file: path,
startLine: Number(lineNumber),
endLine: Number(lineNumber),
});
return `[\`${path}:${lineNumber}\`](https://github.com/oven-sh/bun/blob/${process.env.GITHUB_SHA}/${path}#L${lineNumber})`;
}
if (ci) {
if (report.length > 0) {
action.setFailed(`${bad.length} lint failures`);
}
action.setOutput("count", bad.length);
action.setOutput("text_output", bad.map(m => `- ${link(m)}: ${m.banned} is banned, ${m.suggestion}`).join("\n"));
action.setOutput("json_output", JSON.stringify(bad));
action.summary.addRaw(report);
await action.summary.write();
}
process.exit(1);

View File

@@ -1,11 +1,10 @@
import * as action from "@actions/core";
import { spawn, spawnSync } from "child_process";
import { rmSync, writeFileSync, readFileSync, mkdirSync, openSync, close, closeSync } from "fs";
import { readFile, rm } from "fs/promises";
import { rmSync, writeFileSync, readFileSync, mkdirSync, openSync, closeSync } from "fs";
import { readdirSync } from "node:fs";
import { resolve, basename } from "node:path";
import { constants, cpus, hostname, tmpdir, totalmem, userInfo } from "os";
import { join, normalize } from "path";
import { cpus, hostname, tmpdir, totalmem, userInfo } from "os";
import { join, normalize, posix, relative } from "path";
import { fileURLToPath } from "url";
import PQueue from "p-queue";
@@ -22,9 +21,7 @@ function defaultConcurrency() {
return Math.min(Math.floor((cpus().length - 2) / 2), 2);
}
const windows = process.platform === "win32";
const KEEP_TMPDIR = process.env["BUN_KEEP_TMPDIR"] === "1";
const nativeMemory = totalmem();
const force_ram_size_input = parseInt(process.env["BUN_JSC_forceRAMSize"] || "0", 10);
let force_ram_size = Number(BigInt(nativeMemory) >> BigInt(2)) + "";
@@ -147,8 +144,6 @@ function lookupWindowsError(code) {
const failing_tests = [];
const passing_tests = [];
const fixes = [];
const regressions = [];
let maxFd = -1;
function getMaxFileDescriptor(path) {
if (process.platform === "win32") {
@@ -210,21 +205,15 @@ function checkSlowTests() {
setInterval(checkSlowTests, SHORT_TIMEOUT_DURATION).unref();
var currentTestNumber = 0;
async function runTest(path) {
const pathOnDisk = resolve(path);
const thisTestNumber = currentTestNumber++;
const name = path.replace(cwd, "").slice(1);
const testFileName = posix.normalize(relative(cwd, path).replaceAll("\\", "/"));
let exitCode, signal, err, output;
const expected_crash_reason = windows
? await readFile(resolve(path), "utf-8").then(data => {
const match = data.match(/@known-failing-on-windows:(.*)\n/);
return match ? match[1].trim() : null;
})
: null;
const start = Date.now();
const activeTestObject = { start, proc: undefined };
activeTests.set(path, activeTestObject);
activeTests.set(testFileName, activeTestObject);
try {
await new Promise((finish, reject) => {
@@ -234,12 +223,12 @@ async function runTest(path) {
at ${((start - run_start.getTime()) / 1000).toFixed(2)}s, file ${thisTestNumber
.toString()
.padStart(total.toString().length, "0")}/${total}, ${failing_tests.length} failing files
Starting "${name}"
Starting "${testFileName}"
`,
);
const TMPDIR = maketemp();
const proc = spawn(bunExe, ["test", resolve(path)], {
const proc = spawn(bunExe, ["test", pathOnDisk], {
stdio: ["ignore", "pipe", "pipe"],
env: {
...process.env,
@@ -249,6 +238,7 @@ Starting "${name}"
BUN_RUNTIME_TRANSPILER_CACHE_PATH: "0",
GITHUB_ACTIONS: process.env.GITHUB_ACTIONS ?? "true",
BUN_DEBUG_QUIET_LOGS: "1",
BUN_INSTALL_CACHE_DIR: join(TMPDIR, ".bun-install-cache"),
[windows ? "TEMP" : "TMPDIR"]: TMPDIR,
},
});
@@ -309,7 +299,7 @@ Starting "${name}"
});
});
} finally {
activeTests.delete(path);
activeTests.delete(testFileName);
}
if (!hasInitialMaxFD) {
@@ -319,7 +309,7 @@ Starting "${name}"
maxFd = getMaxFileDescriptor();
if (maxFd > prevMaxFd + queue.concurrency * 2) {
process.stderr.write(
`\n\x1b[31mewarn\x1b[0;2m:\x1b[0m file descriptor leak in ${name}, delta: ${
`\n\x1b[31mewarn\x1b[0;2m:\x1b[0m file descriptor leak in ${testFileName}, delta: ${
maxFd - prevMaxFd
}, current: ${maxFd}, previous: ${prevMaxFd}\n`,
);
@@ -370,8 +360,8 @@ Starting "${name}"
console.log(
`\x1b[2m${formatTime(duration).padStart(6, " ")}\x1b[0m ${
passed ? "\x1b[32m✔" : expected_crash_reason ? "\x1b[33m⚠" : "\x1b[31m✖"
} ${name}\x1b[0m${reason ? ` (${reason})` : ""}`,
passed ? "\x1b[32m✔" : "\x1b[31m✖"
} ${testFileName}\x1b[0m${reason ? ` (${reason})` : ""}`,
);
finished++;
@@ -385,21 +375,11 @@ Starting "${name}"
}
if (!passed) {
if (reason) {
if (windows && !expected_crash_reason) {
regressions.push({ path: name, reason, output });
}
}
failing_tests.push({ path: name, reason, output, expected_crash_reason });
failing_tests.push({ path: testFileName, reason, output });
process.exitCode = 1;
if (err) console.error(err);
} else {
if (windows && expected_crash_reason !== null) {
fixes.push({ path: name, output, expected_crash_reason });
}
passing_tests.push(name);
passing_tests.push(testFileName);
}
return passed;
@@ -496,30 +476,6 @@ ${header}
`;
if (fixes.length > 0) {
report += `## Fixes\n\n`;
report += "The following tests had @known-failing-on-windows but now pass:\n\n";
report += fixes
.map(
({ path, expected_crash_reason }) => `- [\`${path}\`](${sectionLink(path)}) (before: ${expected_crash_reason})`,
)
.join("\n");
report += "\n\n";
}
if (regressions.length > 0) {
report += `## Regressions\n\n`;
report += regressions
.map(
({ path, reason, expected_crash_reason }) =>
`- [\`${path}\`](${sectionLink(path)}) ${reason}${
expected_crash_reason ? ` (expected: ${expected_crash_reason})` : ""
}`,
)
.join("\n");
report += "\n\n";
}
if (failingTestDisplay.length > 0) {
report += `## Failing tests\n\n`;
report += failingTestDisplay;
@@ -534,21 +490,22 @@ if (failingTestDisplay.length > 0) {
if (failing_tests.length) {
report += `## Failing tests log output\n\n`;
for (const { path, output, reason, expected_crash_reason } of failing_tests) {
for (const { path, output, reason } of failing_tests) {
report += `### ${path}\n\n`;
report += "[Link to file](" + linkToGH(path) + ")\n\n";
if (windows && reason !== expected_crash_reason) {
report += `To mark this as a known failing test, add this to the start of the file:\n`;
report += `\`\`\`ts\n`;
report += `// @known-failing-on-windows: ${reason}\n`;
report += `\`\`\`\n\n`;
} else {
report += `${reason}\n\n`;
}
report += `${reason}\n\n`;
report += "```\n";
report += output
let failing_output = output
.replace(/\x1b\[[0-9;]*m/g, "")
.replace(/^::(group|endgroup|error|warning|set-output|add-matcher|remove-matcher).*$/gm, "");
if (failing_output.length > 1024 * 64) {
failing_output = failing_output.slice(0, 1024 * 64) + `\n\n[truncated output (length: ${failing_output.length})]`;
}
report += failing_output;
report += "```\n\n";
}
}
@@ -559,35 +516,86 @@ writeFileSync(
JSON.stringify({
failing_tests,
passing_tests,
fixes,
regressions,
}),
);
function mabeCapitalize(str) {
str = str.toLowerCase();
if (str.includes("arm64") || str.includes("aarch64")) {
return str.toUpperCase();
}
if (str.includes("x64")) {
return "x64";
}
if (str.includes("baseline")) {
return str;
}
return str[0].toUpperCase() + str.slice(1);
}
console.log("-> test-report.md, test-report.json");
if (ci) {
if (windows) {
action.setOutput("regressing_tests", regressions.map(({ path }) => `- \`${path}\``).join("\n"));
action.setOutput("regressing_test_count", regressions.length);
}
if (failing_tests.length > 0) {
action.setFailed(`${failing_tests.length} files with failing tests`);
}
action.setOutput("failing_tests", failingTestDisplay);
action.setOutput("failing_tests_count", failing_tests.length);
if (failing_tests.length) {
const tag = process.env.BUN_TAG || "unknown";
let comment = `## ${emojiTag(tag)}${failing_tests.length} failing tests ${tag
.split("-")
.map(mabeCapitalize)
.join(" ")}
${failingTestDisplay}
`;
writeFileSync("comment.md", comment);
}
let truncated_report = report;
if (truncated_report.length > 512 * 1000) {
truncated_report = truncated_report.slice(0, 512 * 1000) + "\n\n...truncated...";
}
action.summary.addRaw(truncated_report);
await action.summary.write();
} else {
if (windows && (regressions.length > 0 || fixes.length > 0)) {
console.log(
"\n\x1b[34mnote\x1b[0;2m:\x1b[0m If you would like to update the @known-failing-on-windows annotations, run `bun update-known-failures`",
);
}
function emojiTag(tag) {
let emojiText = "";
tag = tag.toLowerCase();
if (tag.includes("win32") || tag.includes("windows")) {
emojiText += "🪟";
}
if (tag.includes("linux")) {
emojiText += "🐧";
}
if (tag.includes("macos") || tag.includes("darwin")) {
emojiText += "";
}
if (tag.includes("x86") || tag.includes("x64") || tag.includes("_64") || tag.includes("amd64")) {
if (!tag.includes("linux")) {
emojiText += "💻";
} else {
emojiText += "🖥";
}
}
if (tag.includes("arm64") || tag.includes("aarch64")) {
emojiText += "💪";
}
if (emojiText) {
emojiText += " ";
}
return emojiText;
}
process.exit(failing_tests.length ? 1 : process.exitCode);

View File

@@ -1,49 +0,0 @@
import assert from "assert";
import { existsSync, readFileSync, writeFileSync } from "fs";
import { join } from "path";
import { fileURLToPath } from "url";
if (process.platform !== "win32") {
console.log("This script is only intended to be run on Windows.");
process.exit(1);
}
process.chdir(join(fileURLToPath(import.meta.url), "../../../../"));
if (!existsSync("test-report.json")) {
console.log("No test report found. Please run `bun run test` first.");
process.exit(1);
}
const test_report = JSON.parse(readFileSync("test-report.json", "utf8"));
assert(Array.isArray(test_report.failing_tests));
for (const { path, reason, expected_crash_reason } of test_report.failing_tests) {
assert(path);
assert(reason);
if (expected_crash_reason !== reason) {
const old_content = readFileSync(path, "utf8");
if (!old_content.includes("// @known-failing-on-windows")) {
let content = old_content.replace(/\/\/\s*@known-failing-on-windows:.*\n/, "");
if (reason) {
content = `// @known-failing-on-windows: ${reason}\n` + content;
}
writeFileSync(path, content, "utf8");
console.log(path);
}
}
}
for (const { path } of test_report.fixes) {
assert(path);
const old_content = readFileSync(path, "utf8");
let content = old_content.replace(/\/\/\s*@known-failing-on-windows:.*\n/, "");
if (content !== old_content) {
writeFileSync(path, content, "utf8");
console.log(path);
}
}

View File

@@ -19,9 +19,7 @@ export default function polyfillImportMeta(metaIn: ImportMeta) {
dir: path.dirname(metapath),
file: path.basename(metapath),
require: require2,
async resolve(id: string, parent?: string) {
return this.resolveSync(id, parent);
},
resolve: metaIn.resolve,
resolveSync(id: string, parent?: string) {
return require2.resolve(id, {
paths: typeof parent === 'string' ? [

View File

@@ -1,6 +1,8 @@
.DS_Store
.env
node_modules
/npm/**/bin
/npm/**/*.js
/npm/**/.npmrc
.DS_Store
.env
node_modules
/npm/**/bin
/npm/**/*.js
/npm/**/package.json
/npm/**/.npmrc
*.tgz

Binary file not shown.

View File

@@ -1,16 +0,0 @@
{
"name": "@oven/bun-darwin-aarch64",
"version": "0.5.3",
"description": "This is the macOS arm64 binary for Bun, a fast all-in-one JavaScript runtime.",
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"repository": "https://github.com/oven-sh/bun",
"preferUnplugged": true,
"os": [
"darwin"
],
"cpu": [
"arm64"
]
}

View File

@@ -1,16 +0,0 @@
{
"name": "@oven/bun-darwin-x64-baseline",
"version": "0.5.3",
"description": "This is the macOS x64 binary for Bun, a fast all-in-one JavaScript runtime.",
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"repository": "https://github.com/oven-sh/bun",
"preferUnplugged": true,
"os": [
"darwin"
],
"cpu": [
"x64"
]
}

View File

@@ -1,16 +0,0 @@
{
"name": "@oven/bun-darwin-x64",
"version": "0.5.3",
"description": "This is the macOS x64 binary for Bun, a fast all-in-one JavaScript runtime.",
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"repository": "https://github.com/oven-sh/bun",
"preferUnplugged": true,
"os": [
"darwin"
],
"cpu": [
"x64"
]
}

View File

@@ -1,16 +0,0 @@
{
"name": "@oven/bun-linux-aarch64",
"version": "0.5.3",
"description": "This is the Linux arm64 binary for Bun, a fast all-in-one JavaScript runtime.",
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"repository": "https://github.com/oven-sh/bun",
"preferUnplugged": true,
"os": [
"linux"
],
"cpu": [
"arm64"
]
}

View File

@@ -1,16 +0,0 @@
{
"name": "@oven/bun-linux-x64-baseline",
"version": "0.5.3",
"description": "This is the Linux x64 binary for Bun, a fast all-in-one JavaScript runtime.",
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"repository": "https://github.com/oven-sh/bun",
"preferUnplugged": true,
"os": [
"linux"
],
"cpu": [
"x64"
]
}

View File

@@ -1,16 +0,0 @@
{
"name": "@oven/bun-linux-x64",
"version": "0.5.3",
"description": "This is the Linux x64 binary for Bun, a fast all-in-one JavaScript runtime.",
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"repository": "https://github.com/oven-sh/bun",
"preferUnplugged": true,
"os": [
"linux"
],
"cpu": [
"x64"
]
}

View File

@@ -0,0 +1,5 @@
# Bun
This is the Windows x64 binary for Bun, a fast all-in-one JavaScript runtime. https://bun.sh
_Note: "Baseline" builds are for machines that do not support [AVX2](https://en.wikipedia.org/wiki/Advanced_Vector_Extensions) instructions._

View File

@@ -0,0 +1,3 @@
# Bun
This is the Windows x64 binary for Bun, a fast all-in-one JavaScript runtime. https://bun.sh

View File

@@ -1,42 +0,0 @@
{
"name": "bun",
"version": "0.5.3",
"description": "Bun is a fast all-in-one JavaScript runtime.",
"keywords": [
"bun",
"bun.js",
"node",
"node.js",
"runtime",
"bundler",
"transpiler",
"typescript"
],
"homepage": "https://bun.sh",
"bugs": "https://github.com/oven-sh/issues",
"license": "MIT",
"bin": {
"bun": "bin/bun",
"bunx": "bin/bun"
},
"repository": "https://github.com/oven-sh/bun",
"scripts": {
"postinstall": "node install.js"
},
"optionalDependencies": {
"@oven/bun-darwin-aarch64": "0.5.3",
"@oven/bun-darwin-x64": "0.5.3",
"@oven/bun-darwin-x64-baseline": "0.5.3",
"@oven/bun-linux-aarch64": "0.5.3",
"@oven/bun-linux-x64": "0.5.3",
"@oven/bun-linux-x64-baseline": "0.5.3"
},
"os": [
"darwin",
"linux"
],
"cpu": [
"arm64",
"x64"
]
}

View File

@@ -9,7 +9,7 @@
},
"devDependencies": {
"@octokit/types": "^8.1.1",
"bun-types": "^0.4.0",
"bun-types": "^1.1.0",
"prettier": "^2.8.2"
},
"scripts": {

View File

@@ -1,4 +1,4 @@
import { importBun } from "../src/npm/install";
import { importBun, optimizeBun } from "../src/npm/install";
import { execFileSync } from "child_process";
importBun()

View File

@@ -1,4 +1,8 @@
import { join, copy, exists, chmod, write, writeJson } from "../src/fs";
import { mkdtemp } from "fs/promises";
import { rmSync, mkdirSync } from "fs";
import { tmpdir } from "os";
import { dirname } from "path";
import { fetch } from "../src/fetch";
import { spawn } from "../src/spawn";
import type { Platform } from "../src/platform";
@@ -10,41 +14,51 @@ import { buildSync, formatMessagesSync } from "esbuild";
import type { JSZipObject } from "jszip";
import { loadAsync } from "jszip";
import { debug, log, error } from "../src/console";
import { expect } from "bun:test";
const module = "bun";
const owner = "@oven";
let version: string;
const [tag, action] = process.argv.slice(2);
await build(tag);
const release = await getRelease(tag);
const version = await getSemver(release.tag_name);
if (action !== "test-only") await build();
if (action === "publish") {
await publish();
} else if (action === "dry-run") {
await publish(true);
} else if (action === "test") {
await publish(true);
await test();
} else if (action === "test-only") {
await test();
} else if (action) {
throw new Error(`Unknown action: ${action}`);
}
process.exit(0); // HACK
async function build(tag?: string): Promise<void> {
const release = await getRelease(tag);
version = await getSemver(release.tag_name);
async function build(): Promise<void> {
await buildRootModule();
for (const platform of platforms) {
if (action !== "publish" && (platform.os !== process.platform || platform.arch !== process.arch)) continue;
await buildModule(release, platform);
}
}
async function publish(dryRun?: boolean): Promise<void> {
const modules = platforms.map(({ bin }) => `${owner}/${bin}`);
const modules = platforms
.filter(({ os, arch }) => action === "publish" || (os === process.platform && arch === process.arch))
.map(({ bin }) => `${owner}/${bin}`);
modules.push(module);
for (const module of modules) {
publishModule(module, dryRun);
}
}
async function buildRootModule() {
async function buildRootModule(dryRun?: boolean) {
log("Building:", `${module}@${version}`);
const cwd = join("npm", module);
const define = {
@@ -54,28 +68,53 @@ async function buildRootModule() {
};
bundle(join("scripts", "npm-postinstall.ts"), join(cwd, "install.js"), {
define,
});
bundle(join("scripts", "npm-exec.ts"), join(cwd, "bin", "bun"), {
define,
banner: {
js: "#!/usr/bin/env node",
js: "// Source code: https://github.com/oven-sh/bun/blob/main/packages/bun-release/scripts/npm-postinstall.ts",
},
});
write(join(cwd, "bin", "bun.exe"), "");
write(
join(cwd, "bin", "README.txt"),
`The 'bun.exe' file is a placeholder for the binary file, which
is replaced by Bun's 'postinstall' script. For this to work, make
sure that you do not use --ignore-scripts while installing.
The postinstall script is responsible for linking the binary file
directly into 'node_modules/.bin' and avoiding a Node.js wrapper
script being called on every invocation of 'bun'. If this wasn't
done, Bun would seem to be slower than Node.js, because it would
be executing a copy of Node.js every time!
Unfortunately, it is not possible to fix all cases on all platforms
without *requiring* a postinstall script.
`,
);
const os = [...new Set(platforms.map(({ os }) => os))];
const cpu = [...new Set(platforms.map(({ arch }) => arch))];
writeJson(join(cwd, "package.json"), {
name: module,
description: "Bun is a fast all-in-one JavaScript runtime.",
version: version,
scripts: {
postinstall: "node install.js",
},
optionalDependencies: Object.fromEntries(platforms.map(({ bin }) => [`${owner}/${bin}`, version])),
optionalDependencies: Object.fromEntries(
platforms.map(({ bin }) => [
`${owner}/${bin}`,
dryRun ? `file:./oven-${bin.replaceAll("/", "-") + "-" + version + ".tgz"}` : version,
]),
),
bin: {
bun: "bin/bun",
bunx: "bin/bun",
bun: "bin/bun.exe",
bunx: "bin/bun.exe",
},
os,
cpu,
keywords: ["bun", "bun.js", "node", "node.js", "runtime", "bundler", "transpiler", "typescript"],
homepage: "https://bun.sh",
bugs: "https://github.com/oven-sh/issues",
license: "MIT",
repository: "https://github.com/oven-sh/bun",
});
if (exists(".npmrc")) {
copy(".npmrc", join(cwd, ".npmrc"));
@@ -95,11 +134,17 @@ async function buildModule(
}
const bun = await extractFromZip(asset.browser_download_url, `${bin}/bun`);
const cwd = join("npm", module);
mkdirSync(dirname(join(cwd, exe)), { recursive: true });
write(join(cwd, exe), await bun.async("arraybuffer"));
chmod(join(cwd, exe), 0o755);
writeJson(join(cwd, "package.json"), {
name: module,
version: version,
description: "This is the macOS arm64 binary for Bun, a fast all-in-one JavaScript runtime.",
homepage: "https://bun.sh",
bugs: "https://github.com/oven-sh/issues",
license: "MIT",
repository: "https://github.com/oven-sh/bun",
preferUnplugged: true,
os: [os],
cpu: [arch],
@@ -111,22 +156,33 @@ async function buildModule(
function publishModule(name: string, dryRun?: boolean): void {
log(dryRun ? "Dry-run Publishing:" : "Publishing:", `${name}@${version}`);
const { exitCode, stdout, stderr } = spawn(
"npm",
[
"publish",
"--access",
"public",
"--tag",
version.includes("canary") ? "canary" : "latest",
...(dryRun ? ["--dry-run"] : []),
],
{
cwd: join("npm", name),
},
);
if (exitCode === 0) {
if (!dryRun) {
const { exitCode, stdout, stderr } = spawn(
"npm",
[
"publish",
"--access",
"public",
"--tag",
version.includes("canary") ? "canary" : "latest",
...(dryRun ? ["--dry-run"] : []),
],
{
cwd: join("npm", name),
},
);
error(stderr || stdout);
if (exitCode !== 0) {
throw new Error("npm publish failed with code " + exitCode);
}
} else {
const { exitCode, stdout, stderr } = spawn("npm", ["pack"], {
cwd: join("npm", name),
});
error(stderr || stdout);
if (exitCode !== 0) {
throw new Error("npm pack failed with code " + exitCode);
}
}
}
@@ -162,3 +218,86 @@ function bundle(src: string, dst: string, options: BuildOptions = {}): void {
throw new Error(messages.join("\n"));
}
}
async function test() {
const root = await mkdtemp(join(tmpdir(), "bun-release-test-"));
const $ = new Bun.$.Shell().cwd(root);
for (const platform of platforms) {
if (platform.os !== process.platform) continue;
if (platform.arch !== process.arch) continue;
copy(
join(
import.meta.dir,
"../npm/@oven/",
platform.bin,
"oven-" + platform.bin.replaceAll("/", "-") + `-${version}.tgz`,
),
join(root, `${platform.bin}-${version}.tgz`),
);
}
copy(join(import.meta.dir, "../npm", "bun", "bun-" + version + ".tgz"), join(root, "bun-" + version + ".tgz"));
console.log(root);
for (const [install, exec] of [
["npm i", "npm exec"],
["yarn set version berry; yarn add", "yarn"],
["yarn set version latest; yarn add", "yarn"],
["pnpm i", "pnpm"],
["bun i", "bun run"],
]) {
rmSync(join(root, "node_modules"), { recursive: true, force: true });
rmSync(join(root, "package-lock.json"), { recursive: true, force: true });
rmSync(join(root, "package.json"), { recursive: true, force: true });
rmSync(join(root, "pnpm-lock.yaml"), { recursive: true, force: true });
rmSync(join(root, "yarn.lock"), { recursive: true, force: true });
writeJson(join(root, "package.json"), {
name: "bun-release-test",
});
console.log("Testing", install + " bun");
await $`${{ raw: install }} ./bun-${version}.tgz`;
console.log("Running " + exec + " bun");
// let output = await $`${{
// raw: exec,
// }} bun -- -e "console.log(JSON.stringify([Bun.version, process.platform, process.arch, process.execPath]))"`.text();
const split = exec.split(" ");
let {
stdout: output,
stderr,
exitCode,
} = spawn(
split[0],
[
...split.slice(1),
"--",
"bun",
"-e",
"console.log(JSON.stringify([Bun.version, process.platform, process.arch, process.execPath]))",
],
{
cwd: root,
},
);
if (exitCode !== 0) {
console.error(stderr);
throw new Error("Failed to run " + exec + " bun, exit code: " + exitCode);
}
try {
output = JSON.parse(output);
} catch (e) {
console.log({ output });
throw e;
}
expect(output[0]).toBe(version);
expect(output[1]).toBe(process.platform);
expect(output[2]).toBe(process.arch);
expect(output[3]).toStartWith(root);
expect(output[3]).toInclude("bun");
}
}

View File

@@ -121,24 +121,14 @@ async function downloadBun(platform: Platform, dst: string): Promise<void> {
}
export function optimizeBun(path: string): void {
if (os === "win32") {
throw new Error(
"You must use Windows Subsystem for Linux, aka. WSL, to run bun. Learn more: https://learn.microsoft.com/en-us/windows/wsl/install",
);
}
const { npm_config_user_agent } = process.env;
if (npm_config_user_agent && /\byarn\//.test(npm_config_user_agent)) {
throw new Error(
"Yarn does not support bun, because it does not allow linking to binaries. To use bun, install using the following command: curl -fsSL https://bun.sh/install | bash",
);
}
const installScript = os === "win32" ? 'powershell -c "irm bun.sh/install.ps1 | iex"' : "curl -fsSL https://bun.sh/install | bash";
try {
rename(path, join(__dirname, "bin", "bun"));
rename(path, join(__dirname, "bin", "bun.exe"));
return;
} catch (error) {
debug("optimizeBun failed", error);
}
throw new Error(
"Your package manager doesn't seem to support bun. To use bun, install using the following command: curl -fsSL https://bun.sh/install | bash",
`Your package manager doesn't seem to support bun. To use bun, install using the following command: ${installScript}`,
);
}

View File

@@ -6,7 +6,9 @@ export const os = process.platform;
export const arch = os === "darwin" && process.arch === "x64" && isRosetta2() ? "arm64" : process.arch;
export const avx2 = (arch === "x64" && os === "linux" && isLinuxAVX2()) || (os === "darwin" && isDarwinAVX2());
export const avx2 =
arch === "x64" &&
((os === "linux" && isLinuxAVX2()) || (os === "darwin" && isDarwinAVX2()) || (os === "win32" && isWindowsAVX2()));
export type Platform = {
os: string;
@@ -55,6 +57,19 @@ export const platforms: Platform[] = [
bin: "bun-linux-x64-baseline",
exe: "bin/bun",
},
{
os: "win32",
arch: "x64",
avx2: true,
bin: "bun-windows-x64",
exe: "bin/bun.exe",
},
{
os: "win32",
arch: "x64",
bin: "bun-windows-x64-baseline",
exe: "bin/bun.exe",
},
];
export const supportedPlatforms: Platform[] = platforms
@@ -89,3 +104,17 @@ function isRosetta2(): boolean {
return false;
}
}
function isWindowsAVX2(): boolean {
try {
return (
spawn("powershell", [
"-c",
`(Add-Type -MemberDefinition '[DllImport("kernel32.dll")] public static extern bool IsProcessorFeaturePresent(int ProcessorFeature);' -Name 'Kernel32' -Namespace 'Win32' -PassThru)::IsProcessorFeaturePresent(40);`,
]).stdout.trim() === "True"
);
} catch (error) {
debug("isWindowsAVX2 failed", error);
return false;
}
}

View File

@@ -43,7 +43,7 @@ declare module "bun" {
*
* @param {string} command The name of the executable or script
* @param {string} options.PATH Overrides the PATH environment variable
* @param {string} options.cwd Limits the search to a particular directory in which to searc
* @param {string} options.cwd When given a relative path, use this path to join it.
*/
function which(command: string, options?: { PATH?: string; cwd?: string }): string | null;
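A brief usage sketch of the declaration above (the paths and the `mytool` binary are hypothetical):

```ts
// Looks up an executable on PATH; returns the full path, or null if not found.
const node = Bun.which("node"); // e.g. "/usr/bin/node"

// Per the option descriptions above, PATH overrides the search path and a
// relative result is joined against cwd (hypothetical values).
const tool = Bun.which("mytool", { PATH: "./bin", cwd: "/home/demo/project" });
console.log(node, tool);
```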
@@ -277,12 +277,16 @@ declare module "bun" {
blob(): Promise<Blob>;
/**
* Configure the shell to not throw an exception on non-zero exit codes.
* Configure the shell to not throw an exception on non-zero exit codes. Throwing can be re-enabled with `.throws(true)`.
*
* By default, the shell will throw an exception on commands which return non-zero exit codes.
*/
nothrow(): this;
/**
* Configure whether or not the shell should throw an exception on non-zero exit codes.
*
* By default, this is configured to `true`.
*/
throws(shouldThrow: boolean): this;
}
@@ -2988,12 +2992,19 @@ declare module "bun" {
}
/**
* Nanoseconds since Bun.js was started as an integer.
* Returns the number of nanoseconds since the process was started.
*
* This uses a high-resolution monotonic system timer.
* This function uses a high-resolution monotonic system timer to provide precise time measurements.
* In JavaScript, numbers are represented as double-precision floating-point values (IEEE 754),
* which can safely represent integers up to 2^53 - 1 (Number.MAX_SAFE_INTEGER).
*
* After 14 weeks of consecutive uptime, this function
* wraps
* Due to this limitation, while the internal counter may continue beyond this point,
* the precision of the returned value will degrade after 14.8 weeks of uptime (when the nanosecond
* count exceeds Number.MAX_SAFE_INTEGER). Beyond this point, the function will continue to count but
* with reduced precision, which might affect time calculations and comparisons in long-running applications.
*
* @returns {number} The number of nanoseconds since the process was started, with precise values up to
* Number.MAX_SAFE_INTEGER.
*/
function nanoseconds(): number;
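A quick back-of-the-envelope check of the ~14.8-week figure mentioned in the comment above:

```ts
// 2^53 - 1 nanoseconds, converted: ns -> s -> min -> h -> days -> weeks
const weeks = Number.MAX_SAFE_INTEGER / 1e9 / 60 / 60 / 24 / 7;
console.log(weeks.toFixed(1)); // ≈ 14.9, in line with the precision limit described above
```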
@@ -4134,6 +4145,11 @@ declare module "bun" {
*/
windowsHide?: boolean;
/**
* If true, no quoting or escaping of arguments is done on Windows.
*/
windowsVerbatimArguments?: boolean;
/**
* Path to the executable to run in the subprocess. This defaults to `cmds[0]`.
*

View File

@@ -1758,21 +1758,10 @@ declare global {
* ```
*/
readonly env: NodeJS.ProcessEnv;
/**
* Resolve a module ID the same as if you imported it
*
* On failure, throws a `ResolveMessage`
*/
resolve(moduleId: string): Promise<string>;
/**
* Resolve a `moduleId` as though it were imported from `parent`
*
* On failure, throws a `ResolveMessage`
*/
// tslint:disable-next-line:unified-signatures
resolve(moduleId: string, parent: string): Promise<string>;
/**
* @deprecated Use `require.resolve` or `Bun.resolveSync(moduleId, path.dirname(parent))` instead
*
* Resolve a module ID the same as if you imported it
*
* The `parent` argument is optional, and defaults to the current module's path.
@@ -1780,17 +1769,12 @@ declare global {
resolveSync(moduleId: string, parent?: string): string;
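A small sketch of the replacements suggested above, resolving a hypothetical sibling module from the current file:

```ts
import path from "path";

// CommonJS-style resolution relative to this file
const viaRequire = require.resolve("./sibling");

// Bun's synchronous resolver, with the parent directory passed explicitly
const viaBun = Bun.resolveSync("./sibling", path.dirname(import.meta.path));

console.log(viaRequire, viaBun);
```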
/**
* Load a CommonJS module
* Load a CommonJS module within an ES Module. Bun's transpiler rewrites all
* calls to `require` with `import.meta.require` when transpiling ES Modules
* for the runtime.
*
* Internally, this is a synchronous version of ESModule's `import()`, with extra code for handling:
* - CommonJS modules
* - *.node files
* - *.json files
*
* Warning: **This API is not stable** and may change in the future. Use at your
* own risk. Usually, you should use `require` instead and Bun's transpiler
* will automatically rewrite your code to use `import.meta.require` if
* relevant.
* Warning: **This API is not stable** and may change or be removed in the
* future. Use at your own risk.
*/
require: NodeJS.Require;
@@ -1814,17 +1798,15 @@ declare global {
readonly main: boolean;
/** Alias of `import.meta.dir`. Exists for Node.js compatibility */
dirname: string;
readonly dirname: string;
/** Alias of `import.meta.path`. Exists for Node.js compatibility */
filename: string;
readonly filename: string;
}
/**
* NodeJS-style `require` function
*
* Internally, uses `import.meta.require`
*
* @param moduleId - The module ID to resolve
*/
var require: NodeJS.Require;

View File

@@ -62,3 +62,64 @@ declare module "tls" {
function connect(options: BunConnectionOptions, secureConnectListener?: () => void): TLSSocket;
}
declare module "util" {
// https://nodejs.org/docs/latest/api/util.html#foreground-colors
type ForegroundColors =
| "black"
| "blackBright"
| "blue"
| "blueBright"
| "cyan"
| "cyanBright"
| "gray"
| "green"
| "greenBright"
| "grey"
| "magenta"
| "magentaBright"
| "red"
| "redBright"
| "white"
| "whiteBright"
| "yellow"
| "yellowBright";
// https://nodejs.org/docs/latest/api/util.html#background-colors
type BackgroundColors =
| "bgBlack"
| "bgBlackBright"
| "bgBlue"
| "bgBlueBright"
| "bgCyan"
| "bgCyanBright"
| "bgGray"
| "bgGreen"
| "bgGreenBright"
| "bgGrey"
| "bgMagenta"
| "bgMagentaBright"
| "bgRed"
| "bgRedBright"
| "bgWhite"
| "bgWhiteBright"
| "bgYellow"
| "bgYellowBright";
// https://nodejs.org/docs/latest/api/util.html#modifiers
type Modifiers =
| "blink"
| "bold"
| "dim"
| "doubleunderline"
| "framed"
| "hidden"
| "inverse"
| "italic"
| "overlined"
| "reset"
| "strikethrough"
| "underline";
function styleText(format: ForegroundColors | BackgroundColors | Modifiers, text: string): string;
}
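A short usage sketch of the `styleText` declaration above:

```ts
import { styleText } from "util";

// Wraps the text in the ANSI escape codes for the given format.
console.log(styleText("red", "error:"), "something went wrong");
console.log(styleText("bold", "done"));
```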

View File

@@ -24,7 +24,7 @@
* | `null` | `NULL` |
*/
declare module "bun:sqlite" {
export class Database {
export class Database implements Disposable {
/**
* Open or create a SQLite3 database
*
@@ -257,7 +257,20 @@ declare module "bun:sqlite" {
*
* Internally, this calls `sqlite3_close_v2`.
*/
close(): void;
close(
/**
* If `true`, then the database will throw an error if it is in use
* @default false
*
* When true, this calls `sqlite3_close` instead of `sqlite3_close_v2`.
*
* Learn more about this in the [sqlite3 documentation](https://www.sqlite.org/c3ref/close.html).
*
* Bun will automatically call close by default when the database instance is garbage collected.
* In the future, Bun may default `throwOnError` to `true`, but for backwards compatibility it is `false` by default.
*/
throwOnError?: boolean,
): void;
/**
* The filename passed when `new Database()` was called
@@ -304,6 +317,8 @@ declare module "bun:sqlite" {
*/
static setCustomSQLite(path: string): boolean;
[Symbol.dispose](): void;
/**
* Creates a function that always runs inside a transaction. When the
* function is invoked, it will begin a new transaction. When the function
@@ -427,6 +442,17 @@ declare module "bun:sqlite" {
* ```
*/
static deserialize(serialized: NodeJS.TypedArray | ArrayBufferLike, isReadOnly?: boolean): Database;
/**
* See `sqlite3_file_control` for more information.
* @link https://www.sqlite.org/c3ref/file_control.html
*/
fileControl(op: number, arg?: ArrayBufferView | number): number;
/**
* See `sqlite3_file_control` for more information.
* @link https://www.sqlite.org/c3ref/file_control.html
*/
fileControl(zDbName: string, op: number, arg?: ArrayBufferView | number): number;
}
/**
@@ -455,7 +481,7 @@ declare module "bun:sqlite" {
* // => undefined
* ```
*/
export class Statement<ReturnType = unknown, ParamsType extends SQLQueryBindings[] = any[]> {
export class Statement<ReturnType = unknown, ParamsType extends SQLQueryBindings[] = any[]> implements Disposable {
/**
* Creates a new prepared statement from native code.
*
@@ -633,6 +659,11 @@ declare module "bun:sqlite" {
*/
finalize(): void;
/**
* Calls {@link finalize} if it wasn't already called.
*/
[Symbol.dispose](): void;
/**
* Return the expanded SQL string for the prepared statement.
*
@@ -766,6 +797,187 @@ declare module "bun:sqlite" {
* @constant 0x04
*/
SQLITE_PREPARE_NO_VTAB: number;
/**
* @constant 1
*/
SQLITE_FCNTL_LOCKSTATE: number;
/**
* @constant 2
*/
SQLITE_FCNTL_GET_LOCKPROXYFILE: number;
/**
* @constant 3
*/
SQLITE_FCNTL_SET_LOCKPROXYFILE: number;
/**
* @constant 4
*/
SQLITE_FCNTL_LAST_ERRNO: number;
/**
* @constant 5
*/
SQLITE_FCNTL_SIZE_HINT: number;
/**
* @constant 6
*/
SQLITE_FCNTL_CHUNK_SIZE: number;
/**
* @constant 7
*/
SQLITE_FCNTL_FILE_POINTER: number;
/**
* @constant 8
*/
SQLITE_FCNTL_SYNC_OMITTED: number;
/**
* @constant 9
*/
SQLITE_FCNTL_WIN32_AV_RETRY: number;
/**
* @constant 10
*
* Control whether or not the WAL is persisted
* Some versions of macOS configure WAL to be persistent by default.
*
* You can change this with code like the below:
* ```ts
* import { Database } from "bun:sqlite";
*
* const db = Database.open("mydb.sqlite");
* db.fileControl(constants.SQLITE_FCNTL_PERSIST_WAL, 0);
* // enable WAL
* db.exec("PRAGMA journal_mode = WAL");
* // .. do some work
* db.close();
* ```
*
*/
SQLITE_FCNTL_PERSIST_WAL: number;
/**
* @constant 11
*/
SQLITE_FCNTL_OVERWRITE: number;
/**
* @constant 12
*/
SQLITE_FCNTL_VFSNAME: number;
/**
* @constant 13
*/
SQLITE_FCNTL_POWERSAFE_OVERWRITE: number;
/**
* @constant 14
*/
SQLITE_FCNTL_PRAGMA: number;
/**
* @constant 15
*/
SQLITE_FCNTL_BUSYHANDLER: number;
/**
* @constant 16
*/
SQLITE_FCNTL_TEMPFILENAME: number;
/**
* @constant 18
*/
SQLITE_FCNTL_MMAP_SIZE: number;
/**
* @constant 19
*/
SQLITE_FCNTL_TRACE: number;
/**
* @constant 20
*/
SQLITE_FCNTL_HAS_MOVED: number;
/**
* @constant 21
*/
SQLITE_FCNTL_SYNC: number;
/**
* @constant 22
*/
SQLITE_FCNTL_COMMIT_PHASETWO: number;
/**
* @constant 23
*/
SQLITE_FCNTL_WIN32_SET_HANDLE: number;
/**
* @constant 24
*/
SQLITE_FCNTL_WAL_BLOCK: number;
/**
* @constant 25
*/
SQLITE_FCNTL_ZIPVFS: number;
/**
* @constant 26
*/
SQLITE_FCNTL_RBU: number;
/**
* @constant 27
*/
SQLITE_FCNTL_VFS_POINTER: number;
/**
* @constant 28
*/
SQLITE_FCNTL_JOURNAL_POINTER: number;
/**
* @constant 29
*/
SQLITE_FCNTL_WIN32_GET_HANDLE: number;
/**
* @constant 30
*/
SQLITE_FCNTL_PDB: number;
/**
* @constant 31
*/
SQLITE_FCNTL_BEGIN_ATOMIC_WRITE: number;
/**
* @constant 32
*/
SQLITE_FCNTL_COMMIT_ATOMIC_WRITE: number;
/**
* @constant 33
*/
SQLITE_FCNTL_ROLLBACK_ATOMIC_WRITE: number;
/**
* @constant 34
*/
SQLITE_FCNTL_LOCK_TIMEOUT: number;
/**
* @constant 35
*/
SQLITE_FCNTL_DATA_VERSION: number;
/**
* @constant 36
*/
SQLITE_FCNTL_SIZE_LIMIT: number;
/**
* @constant 37
*/
SQLITE_FCNTL_CKPT_DONE: number;
/**
* @constant 38
*/
SQLITE_FCNTL_RESERVE_BYTES: number;
/**
* @constant 39
*/
SQLITE_FCNTL_CKPT_START: number;
/**
* @constant 40
*/
SQLITE_FCNTL_EXTERNAL_READER: number;
/**
* @constant 41
*/
SQLITE_FCNTL_CKSM_FILE: number;
/**
* @constant 42
*/
SQLITE_FCNTL_RESET_CACHE: number;
};
/**

Some files were not shown because too many files have changed in this diff.