Compare commits

...

275 Commits

Author SHA1 Message Date
Ashcon Partovi
83d4d6264e Make executable 2023-10-23 15:14:07 -07:00
dave caruso
20234bc147 stuff 2023-10-20 21:47:55 -07:00
dave caruso
5e72f7f640 add unified sources code (does not work) 2023-10-20 21:05:19 -07:00
dave caruso
d37bb8efd5 more cmake nonsense 2023-10-20 20:01:24 -07:00
dave caruso
f22b56e64b a 2023-10-20 19:21:49 -07:00
dave caruso
2407143c9e Merge remote-tracking branch 'origin/main' into jarred/prepare-for-libuv 2023-10-20 18:37:18 -07:00
dave caruso
9f79fdea57 Merge remote-tracking branch 'origin/ci-test' into jarred/prepare-for-libuv 2023-10-20 18:36:42 -07:00
dave caruso
c271db39e4 okay 2023-10-20 17:35:53 -08:00
dave caruso
bcf027c35c chaos chaos chaos 2023-10-20 18:28:05 -07:00
Ashcon Partovi
071165b737 More changes to Dockerfile 2023-10-20 16:46:28 -07:00
Ashcon Partovi
9cdf695df3 Build 2023-10-20 16:46:28 -07:00
Ashcon Partovi
b6527820f3 New dockerfile 2023-10-20 16:45:55 -07:00
Ashcon Partovi
0646f106f0 Skip llvm 2023-10-20 16:45:55 -07:00
Ashcon Partovi
929c7ae09a Fix postinstal 2023-10-20 16:45:55 -07:00
Ashcon Partovi
4351966b5d add more to ssh 2023-10-20 16:45:55 -07:00
Ashcon Partovi
7055728cfb ssh into github actions 2023-10-20 16:45:55 -07:00
Ashcon Partovi
aed8be20ec bun install 2023-10-20 16:45:55 -07:00
Ashcon Partovi
ef5c2a29e7 Tweak script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
93f530c0ba Tweak script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
55444cc434 Tweak script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
5af35fbbcb Tweak script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
f9b7c24015 Tweak script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
560e5f9948 16.0 -> 16 2023-10-20 16:45:55 -07:00
Ashcon Partovi
47f3ace597 Tweak script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
35bb6bf2cf Another sudo attempt 2023-10-20 16:45:55 -07:00
Ashcon Partovi
8536313906 Tweak dependencies 2023-10-20 16:45:55 -07:00
Ashcon Partovi
c5a4060df3 More sudo 2023-10-20 16:45:55 -07:00
Ashcon Partovi
43cc370c9e Use sudo 2023-10-20 16:45:55 -07:00
Ashcon Partovi
a3381ee9b9 Fix script 2023-10-20 16:45:55 -07:00
Ashcon Partovi
00d28e09c0 New build workflow 2023-10-20 16:45:55 -07:00
Pedro Nogueira
074534b292 revert: back the test/README.md file (#6626)
Co-authored-by: pedromdsn <pedromdsn@hotmail.com>
2023-10-20 16:38:06 -07:00
dave caruso
b72eec9e9a colorterm 2023-10-20 14:49:48 -07:00
Dylan Conway
b0393fba62 Update InternalModuleRegistryConstants.h 2023-10-20 14:15:05 -07:00
Dmitry Nourell
7166fe10b5 Fixes IV calculation for AES-GCM mode (#6590)
* fix(crypto): fix the error in IV calculation for AES-GCM mode

* chore(crypto): add basic unit tests for Cipher & Decipher
2023-10-20 14:01:58 -07:00
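
For context on the AES-GCM fix above, here is a hedged round-trip sketch using the standard `node:crypto` API. The key and IV sizes are the conventional 256-bit key / 96-bit IV; nothing here is taken from the patch itself.

```ts
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Encrypt and decrypt a small payload with AES-256-GCM.
const key = randomBytes(32);
const iv = randomBytes(12); // 96-bit IV, the usual size for GCM
const cipher = createCipheriv("aes-256-gcm", key, iv);
const encrypted = Buffer.concat([cipher.update("secret", "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();

const decipher = createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]).toString("utf8");
console.log(decrypted); // "secret"
```
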
chandi Langecker
2890ad53c4 fix(napi): incorrect refCount with napi_wrap() (#6598)
While trying to get [`node-usb`](https://github.com/node-usb/node-usb) running with Bun, it always failed because `close()` is only allowed when there are no open references.

7e0182df8c/src/node_usb.h (L39-L41)
```c++
    inline void ref(){ refs_ = Ref();}
    inline void unref(){ refs_ = Unref();}
    inline bool canClose(){return refs_ == 0;}
```

`Ref()` and `Unref()` are both called once, with Node.js resulting in `refs_ == 0` (which is expected), but with Bun `refs_ == 1`.

I've made this small script to reproduce the bug:
https://github.com/alangecker/bun-ref-bug/blob/main/binding.cc
```
run with bun 1.0.6:
 - refcount: 2 (expected: 1)
run with node 20.8.1:
 - refcount: 1 (expected: 1)
```

During a long debugging journey I found out that Bun's `NapiRef::ref()` is also called just once (as expected), but within `napi_wrap()` the `NapiRef` gets initialized with the refCount already set to 1:

378385ba60/src/bun.js/bindings/napi.cpp (L669)
```c++
extern "C" napi_status napi_wrap(napi_env env,
    napi_value js_object,
    void* native_object,
    napi_finalize finalize_cb,
    void* finalize_hint,
    napi_ref* result)
{
    // [...]
    auto* ref = new NapiRef(globalObject, 1);
    // [...]
}
```

After changing it to `new NapiRef(globalObject, 0)`, it shows the expected behavior (same as with Node.js) and node-usb works.
As far as I understand it, a `NapiRef` with refCount=0 should then be weak instead of strong, which is why I have changed this too.
2023-10-20 13:02:35 -07:00
Paula Burghelea
01e3474600 Update quickstart.md - removed the part for editing compilerOptions… (#6620)
* Update quickstart.md - removed the part for editing `compilerOptions` in `tsconfig.json`

The line is already added to `compilerOptions` in `tsconfig.json`, so there is no need to edit the file.

    "types": [
      "bun-types" // add Bun global
    ]

It seems this was already added when the project was initialized.

* Typescript section

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-20 11:26:19 -07:00
Jarred Sumner
756eee087a Sort list of dependencies and fix test (#6616)
* fix findBestMatch so it finds the best match and not the first match

* update complex-workspaces to include lines-and-columns ^1.1.6

* PR feedback

* PR feedback

* This test doesn't reproduce the original issue

* Support pre release versions the same way

* Add a test that reproduces the original issue

* Sort the list of package versions before serializing to disk

* Remove test that didnt reproduce it

* Fix the count

* Fix 0 and 1 and sorting order

* Fix assertions and ordering

---------

Co-authored-by: Dylan Greene <dgreene@medallia.com>
Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2023-10-20 03:55:40 -07:00
Dylan Conway
4b2cdc4fc1 respect optional peer dependencies and update docs (#6615)
* update docs

* optional peer dependencies

* rename offset variable name, cache invalidation time

* Update install.zig

* install more peer dependencies
2023-10-20 03:27:10 -07:00
Dylan Greene
184528e4eb fix findBestMatch so it finds the best match and not the first match (#6611)
* fix findBestMatch so it finds the best match and not the first match

* update complex-workspaces to include lines-and-columns ^1.1.6

* PR feedback

* PR feedback
2023-10-20 02:18:37 -07:00
dave caruso
7a7c85d05c okay 2023-10-19 23:02:13 -07:00
dave caruso
6158d5986d cool 2023-10-19 21:51:58 -08:00
dave caruso
de3a0f63e9 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-10-19 21:40:17 -08:00
dave caruso
d1c9f073df asdf 2023-10-19 21:40:15 -08:00
dave caruso
0c98256e64 theoretical -DCANARY flag we can use 2023-10-19 22:39:52 -07:00
Jarred Sumner
8cf7d6157a Fix missing function names in console.log and Bun.inspect (#6612)
* Fix missing function names in Bun.inspect

* Fix failing tests

* Handle @@toStringTag

* Update bindings.cpp

* Revert breaking changes to snapshots until a minor version

* Fix test

---------

Co-authored-by: Jarred Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2023-10-19 22:38:05 -07:00
Vladimir Vlach
68324daf78 String response for lambda function - no need to strinfigy string (#6208) 2023-10-19 22:27:36 -07:00
Liz
66debb1ce4 fix: support custom file type in Bun.file (#6512)
* fix: support custom file type in Bun.file

In the docs it seemed to suggest this is something supported,
but it seemed to be only supported in JSDOMFiles or Blob.
This adds the two properties `type` and `lastModified` to be supported on `Bun.file`.

Fixes: https://github.com/oven-sh/bun/issues/6507

* fix: implement changes requested in review

Add changes requested in the review and add a test for a non-standard
MIME type
2023-10-19 22:26:27 -07:00
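
A hedged usage sketch of the `type` and `lastModified` properties described in the commit above; the file name is hypothetical and exact behavior may differ between versions.

```ts
// Pass a custom MIME type instead of relying on extension-based detection.
const file = Bun.file("data.bin", { type: "application/x-custom" });
console.log(file.type);         // "application/x-custom"
console.log(file.lastModified); // last modification time, in milliseconds since the epoch
```
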
Ashcon Partovi
d5d9fc4684 Fix websocket upgrade (#6564)
* Remove ancient changelog

* Fix `Host` header excluding port in WebSocket upgrade

* `byteSlice()`

* Revert `byteSlice()`
2023-10-19 22:24:45 -07:00
dave caruso
bb3a11035f asdfg 2023-10-19 22:01:38 -07:00
dave caruso
c4934352db update build dot zig 2023-10-19 21:50:34 -07:00
Liz
f6b694ee2c fix(install): dont replace git urls when already present (#6607)
* fix: dont replace git urls when already present

* fix: set request e_string

* test: add test for git url duplication
2023-10-19 21:28:59 -07:00
dave caruso
939e8be057 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-10-19 20:15:42 -08:00
dave caruso
18d368b2cc fgh 2023-10-19 20:15:38 -08:00
dave caruso
5f30a71f8d a 2023-10-19 21:07:52 -07:00
dave caruso
14be6f71c1 adfhjskfjdhkas 2023-10-19 20:07:33 -08:00
dave caruso
74a3582d11 sxdcvbnmk, 2023-10-19 20:26:33 -07:00
dave caruso
5eabbdc27b getting farther 2023-10-19 18:25:13 -08:00
Dylan Conway
bb623196a3 fix install add (#6609)
* fix add package

* update test

* initWithCLI once

* skip searching for workspaces if package json was created
2023-10-19 19:17:38 -07:00
dave caruso
c6f29fb0ec Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-10-19 17:39:17 -08:00
dave caruso
0ed8153798 g 2023-10-19 17:39:14 -08:00
dave caruso
89e7eefd27 sdfadsf 2023-10-19 18:38:46 -07:00
nygma
e9948f1291 Add append content to a file guide (#6581)
* Add append content guide

Resolve #6559

* Update guide

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-19 17:29:31 -07:00
dave caruso
5150c089a0 now it only has linker errors on mac 2023-10-19 17:20:45 -07:00
dave caruso
0e3dab76a6 Merge remote-tracking branch 'origin/main' into jarred/prepare-for-libuv 2023-10-19 16:14:06 -07:00
dave caruso
60be6b2497 finalize merge 2023-10-19 16:13:42 -07:00
dave caruso
42ac78ed49 Merge remote-tracking branch 'origin/jarred/prepare-for-libuv-2' into jarred/prepare-for-libuv 2023-10-19 16:13:36 -07:00
dave caruso
95662b07e6 blah 2023-10-19 14:54:18 -07:00
Jarred Sumner
378385ba60 Bump Zig 2023-10-19 00:19:21 -07:00
dave caruso
019bf2cd41 hmm 2023-10-18 22:55:50 -07:00
dave caruso
f794febca2 Merge remote-tracking branch 'origin/main' into jarred/prepare-for-libuv 2023-10-18 22:51:24 -07:00
dave caruso
11d4e85aa1 Process -> BunProcess 2023-10-18 22:34:13 -07:00
dave caruso
c9fa15d0c2 wow 2023-10-18 21:31:03 -08:00
dave caruso
51e0770476 git ignore 2023-10-18 19:45:18 -08:00
dave caruso
a6a3850d56 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-10-18 19:45:04 -08:00
dave caruso
5e100745d4 fghjkl 2023-10-18 19:41:36 -08:00
dave caruso
98f8a45f8b asdfasfdafdsafda 2023-10-18 20:34:31 -07:00
Ai Hoshino
ef5930e8bc fix(serve): When IPv6 is not enabled, attempt to bind to IPv4 address under the same hostname. (#6533)
* fix(serve): When IPv6 configuration is incorrect, attempt to bind to IPv4 address under the same hostname.
Close: #5315

* fix review

* fix review again

---------

Co-authored-by: Ashcon Partovi <ashcon@partovi.net>
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2023-10-18 17:40:26 -07:00
dave caruso
80b4047aaa oops 2023-10-18 17:19:38 -07:00
dave caruso
c29ec25586 ok 2023-10-18 17:09:11 -07:00
dave caruso
4c6c617cf8 fix 2023-10-18 16:59:08 -07:00
dave caruso
07ed4d87d4 bun run build 2023-10-18 16:51:24 -07:00
Ai Hoshino
0173571b19 fix(node:buffer): fix the behavior of totalLength in Buffer.concat (#6574)
* fix(node:buffer): fix the behavior of `totalLength` in `Buffer.concat`
Close: #6570
Close: #3639

* fix buffer totalLength type

---------

Co-authored-by: Ashcon Partovi <ashcon@partovi.net>
2023-10-18 14:30:53 -07:00
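
The `totalLength` semantics that the fix above targets, assuming Node-compatible behavior: the result is always exactly `totalLength` bytes, truncating or zero-filling as needed.

```ts
import { Buffer } from "node:buffer";

const parts = [Buffer.from("ab"), Buffer.from("cd")];

console.log(Buffer.concat(parts, 3).toString()); // "abc" — truncated to totalLength
console.log(Buffer.concat(parts, 6));            // <Buffer 61 62 63 64 00 00> — zero-filled to totalLength
```
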
Dawid Sowa
35259c0c1d fix: change --no-scripts to --ignore-scripts (#6587) 2023-10-18 14:00:04 -07:00
Mountain/\Ash
e7cba822e4 fix: online docs moved (#6579) 2023-10-18 12:57:46 -07:00
Kevin Latka
0d34e7a141 Fix minimum kernel version in docs (#6153)
* Fix minimum kernel version in docs

* Update install.md

* Update install.md

* Update install.md

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-18 11:49:09 -07:00
Dylan Conway
aedc8c0ead build-id++ 2023-10-18 11:02:44 -07:00
Liz
9c0cd5c030 fix(web): stub performance.getEntriesByName (#6542) 2023-10-18 10:04:45 -07:00
Dylan Conway
2f10398c74 update root package variable 2023-10-17 23:10:10 -07:00
Ashcon Partovi
49ef5bccec Fix missing {port: 0} causing flaky test 2023-10-17 21:23:13 -07:00
dave caruso
cb5c4c71c8 yay 2023-10-17 19:42:37 -07:00
Dylan Conway
dcbcf9803a test changes in usockets in ci 2023-10-17 19:38:13 -07:00
dave caruso
bf12268274 progress 2023-10-17 17:44:30 -07:00
Pierre CM
e731eff382 fix #4766 (#6563) 2023-10-17 16:56:27 -07:00
Dylan Conway
a57d7ecb5b Update ZigGeneratedClasses.cpp 2023-10-17 16:46:31 -07:00
Dylan Conway
d187563d36 use npm alias in dependencies (#6545)
* aliased package in dependencies

* other buf

* make sure version works

* make sure overrides don't override alias

* tests

* update

* comments
2023-10-17 16:34:03 -07:00
dave caruso
602a526723 Merge remote-tracking branch 'origin/main' into jarred/prepare-for-libuv 2023-10-17 15:12:59 -07:00
dave caruso
6ff8c406b1 cmake on mac works 2023-10-17 15:09:01 -07:00
dave caruso
89edf5ef4b windows zig compiles 2023-10-17 14:21:03 -07:00
Ai Hoshino
e91436e524 fix(node:http): fix server.address() (#6442)
Closes #6413, #5850
2023-10-17 13:18:14 -07:00
Aral Roca Gomez
bbc2e96090 docs: fix ws.publish (#6558)
In this example there is no `server` variable in scope, so it makes more sense to use `ws.publish`. It is explained below that once `Bun.serve()` has been set up, `server.publish` can be used.
2023-10-17 09:23:10 -07:00
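
A minimal sketch of the pattern the docs fix above describes: inside a `websocket` handler only `ws` is in scope, so publish via `ws.publish()`; `server.publish()` becomes available once `Bun.serve()` has returned.

```ts
const server = Bun.serve({
  fetch(req, server) {
    if (server.upgrade(req)) return; // upgraded to a WebSocket
    return new Response("expected a WebSocket upgrade", { status: 400 });
  },
  websocket: {
    open(ws) {
      ws.subscribe("chat");
    },
    message(ws, message) {
      ws.publish("chat", message); // no `server` variable in scope here
    },
  },
});

server.publish("chat", "hello from outside the handlers");
```
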
Mikhail
f53eb7cd59 perf(bun-types): remove needless some call (#6550) 2023-10-17 08:59:05 -07:00
dave caruso
98d19fa624 fix(runtime): make some things more stable (partial jsc debug build) (#5881)
* make our debug assertions work

* install bun-webkit-debug

* more progress

* ok

* progress...

* more debug build stuff

* ok

* a

* asdfghjkl

* fix(runtime): fix bad assertion failure in JSBufferList

* ok

* stuff

* upgrade webkit

* Update src/bun.js/bindings/JSDOMWrapperCache.h

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>

* fix message for colin's changes

* okay

* fix cjs prototype

* implement mainModule

* i think this fixes it all

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2023-10-16 21:22:43 -07:00
dave caruso
a3958190e8 fix(runtime): improve IPC reliability + organization pass on that code (#6475)
* dfghj

* Handle messages that did not finish

* tidy

* ok

* a

* Merge remote-tracking branch 'origin/main' into dave/ipc-fixes

* test failures

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2023-10-16 20:01:24 -07:00
Mikhail
6504bfef74 Simplify getting Set of extentions (#4975) 2023-10-16 17:16:10 -07:00
Ashcon Partovi
220cb0eb94 Fix formatting 2023-10-16 17:11:04 -07:00
Igor Shapiro
01e04e3341 fix(test): when tests run with --only the nested describe blocks `.on… (#5616) 2023-10-16 16:33:02 -07:00
Yannik Schröder
0853e19f53 perf(node:events): optimize emit(...) function (#5485) 2023-10-16 16:18:40 -07:00
Liz
a9b8e3ecc8 fix: don't remove content-encoding header from header table (#5743)
Closes #5668
2023-10-16 16:11:44 -07:00
Hugo Galan
bec6161dce fix(sqlite) Insert .all() does not return an array #5872 (#5946)
* fixing #5872

* removing useless comment
2023-10-16 16:08:58 -07:00
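
A hedged illustration of the `bun:sqlite` behavior fixed above, assuming a `RETURNING` clause so the insert produces rows.

```ts
import { Database } from "bun:sqlite";

const db = new Database(":memory:");
db.run("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)");

// Per the fix, .all() should always return an array, even for an INSERT statement.
const rows = db.query("INSERT INTO users (name) VALUES (?) RETURNING *").all("alice");
console.log(Array.isArray(rows)); // true
console.log(rows);                // [ { id: 1, name: "alice" } ]
```
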
Ashcon Partovi
8c580e6764 Fix formatting 2023-10-16 16:02:11 -07:00
Chris Toshok
c5354951ba Fix Response.statusText (#6151) 2023-10-16 15:57:16 -07:00
Nicolae-Rares Ailincai
f1658e2e58 fix-subprocess-argument-missing (#6407)
* fix-subprocess-argument-missing

* fix-tests

* nitpick, these should === not just be undefined

---------

Co-authored-by: dave caruso <me@paperdave.net>
2023-10-16 15:31:14 -07:00
Voldemat
90d7f33522 Add type parameter to expect (#6128) 2023-10-16 15:24:56 -07:00
Jérôme Benoit
d9c0273421 fix(node:worker_threads): ensure threadId property is exposed on worker_threads instance (#6521)
* fix: ensure threadId property is exposed on worker_threads instance

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>

* fix: rename lazy worker_threads module properties

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>

* fix: add getter for threadId

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>

* test: improve worker_threads UTs

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>

* test: fix lazy loading

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>

* test: fix worker_threads test

Signed-off-by: Jérôme Benoit <jerome.benoit@piment-noir.org>

* fix: return the worker threadId

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>

* test: refine worker_threads expectation on threadId

Signed-off-by: Jérôme Benoit <jerome.benoit@piment-noir.org>

---------

Signed-off-by: Jérôme Benoit <jerome.benoit@sap.com>
Signed-off-by: Jérôme Benoit <jerome.benoit@piment-noir.org>
2023-10-16 15:19:38 -07:00
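
For reference, the `threadId` surface the fix above exposes, using the standard `node:worker_threads` API (the worker file path is hypothetical).

```ts
import { Worker, threadId } from "node:worker_threads";

console.log(threadId); // 0 on the main thread

const worker = new Worker(new URL("./worker.ts", import.meta.url));
console.log(worker.threadId); // a non-zero id assigned to the spawned worker
```
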
Ashcon Partovi
d65b1fd80b Fix use before define bug in sqlite
Fixes #6481
2023-10-16 15:14:15 -07:00
João Alisson
7becb5ec74 fix(jest): fix toStrictEqual on same URLs (#6528)
Fixes #6492
2023-10-16 15:14:15 -07:00
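
A minimal `bun:test` case for the fix above (#6492).

```ts
import { expect, test } from "bun:test";

test("URLs with the same href are strictly equal", () => {
  expect(new URL("https://bun.sh/docs")).toStrictEqual(new URL("https://bun.sh/docs"));
});
```
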
Ashcon Partovi
c3f5baa091 Fix toHaveBeenCalled having wrong error signature
Fixes #6527
2023-10-16 15:14:15 -07:00
Ashcon Partovi
800ad150ff Fix formatting 2023-10-16 15:14:15 -07:00
Ashcon Partovi
5608e59270 Add reusePort to Bun.serve types 2023-10-16 15:14:15 -07:00
Ashcon Partovi
e31ed84b1b Fix request.url having incorrect port
Fixes #6443
2023-10-16 15:14:15 -07:00
Ashcon Partovi
548b1d02f2 Remove uWebSockets header from Bun.serve responses 2023-10-16 15:14:15 -07:00
Ashcon Partovi
f63955a01f Rename some tests 2023-10-16 15:14:15 -07:00
Ashcon Partovi
2996ef7156 Fix #6467 2023-10-16 15:14:15 -07:00
Dylan Conway
2b1f3438e6 Update InternalModuleRegistryConstants.h 2023-10-16 14:21:39 -07:00
Colin McDonnell
2a8f3a3b4e Development -> Contributing (#6538)
Co-authored-by: Colin McDonnell <colin@KennyM1.local>
2023-10-16 16:11:03 -04:00
Ciro Spaciari
a87aa2fafe fix(net/tls) fix pg hang on end + hanging on query (#6487)
* fix pg hang on end + hanging on query

* remove dummy function

* fix node-stream

* add test

* fix test

* return error in test

* fix test use once instead of on

* fix OOM

* generated

* 💅

* 💅
2023-10-14 16:16:49 -07:00
Dylan Conway
9b5e66453b fix installing dependencies that match workspace versions (#6494)
* check if dependency matches workspace version

* test

* Update lockfile.zig

* set resolution to workspace package id
2023-10-13 20:37:48 -07:00
Dylan Conway
46f978838d fix lockfile struct padding (#6495)
* integrity padding

* error message for bytes at end of struct
2023-10-13 20:37:06 -07:00
Nicolae-Rares Ailincai
21576589c6 Guide to containerize a bun application using Docker (#6478)
* docker.md

* use-debian

* Updates

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-13 18:03:32 -07:00
Jarred Sumner
d7062eb367 [node:dns] Fix unnecessary array creation + prettier 2023-10-13 17:57:43 -07:00
João Alisson
851dc9aadc fix(node): dns lookup deprecated behavior (#6391)
Co-authored-by: alisson <alisson@Ubuntu.myguest.virtualbox.org>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2023-10-13 17:52:57 -07:00
Ai Hoshino
d08e112d41 fix(error): correct the path field in syscall error message. (#6370)
* fix(error): correct the `path` field in syscall error message.
Close: #6336

* fix pathlike union case
2023-10-13 17:51:36 -07:00
Ashcon Partovi
77d7e47019 Fix dns.lookup returning wrong address for family (#6474)
* Fix #6452

* Fix formatting
2023-10-13 17:47:05 -07:00
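
The `family` behavior the fix above (#6452) targets, shown with the standard `node:dns/promises` API; the exact address returned depends on the resolver.

```ts
import { lookup } from "node:dns/promises";

// Requesting an IPv4 result should yield an IPv4 address and family: 4.
const { address, family } = await lookup("localhost", { family: 4 });
console.log(address, family); // e.g. "127.0.0.1" 4
```
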
Dylan Conway
1bad64bc5e Update settings.json 2023-10-13 16:29:06 -07:00
Nicolae-Rares Ailincai
0794767291 Adds systemd guide to run a bun application as a daemon (#6451)
* systemd-guide

* remove-root-from-example

* add-more-description

* Updates

* Updates

* Updates

* Update

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-13 11:14:57 -07:00
Clay Curry
d7c8a58453 fix obvious typo in CONTRIBUTING.md (#6479)
Co-authored-by: Clay Curry <me@claycurry.com>
2023-10-13 09:33:44 -07:00
Colin McDonnell
4fab8fee21 Update descriptions 2023-10-12 23:19:53 -07:00
Colin McDonnell
3f2df4526e Fix links 2023-10-12 23:17:51 -07:00
Colin McDonnell
d6d4ead438 Tweaks to pm docs 2023-10-12 23:08:52 -07:00
Colin McDonnell
4e67862753 Add overrides/resolutions docs (#6476) 2023-10-12 23:05:20 -07:00
dave caruso
584e6dd1c2 Upgrade zig to 0.12.0-dev.888+130227491 (#6471)
* update build.zig

* save

* works?

* better workaround

* fix install

* Fix compiler crash
2023-10-12 19:38:33 -07:00
Dylan Conway
4bb753295d use a different package 2023-10-12 19:35:00 -07:00
Dylan Conway
892593c73b fix install test 2023-10-12 15:17:03 -07:00
Dylan Conway
691cf338c2 fix editing package json when adding github dependency (#6432)
* fix package name added to package json

* check for github tag

* remove alloc

* some tests

* fix test
2023-10-12 15:02:05 -07:00
Colin McDonnell
beb746e5ea Update installation.md 2023-10-12 14:12:53 -07:00
Colin McDonnell
89faee2522 Update installation.md 2023-10-12 13:43:45 -07:00
dave caruso
969da088f5 fix(install): re-evaluate overrides when removed 2023-10-12 02:03:02 -07:00
Luna
c50be68790 chore: add missing ending quote (#6436) 2023-10-12 01:00:27 -07:00
dave caruso
2fbb95142a feat(install): support npm overrides/yarn resolutions, one level deep only (#6435)
* disable zig fmt on generated ResolvedSourceTag.zig

* overrides

* it works

* ok

* a

---------

Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2023-10-12 00:44:15 -07:00
Dylan Conway
755e16d962 fix #6416 (#6430)
* make sure latest is checked after prerelease

* test and fix

* test for when latest matches prerelease
2023-10-11 20:41:12 -07:00
Jarred Sumner
b1063edf3e Bump WebKit 2023-10-11 20:04:15 -07:00
Jarred Sumner
edb4cbac2b Bump! 2023-10-11 19:05:52 -07:00
Dylan Conway
a59a69e21b Update JSCUSocketsLoopIntegration.cpp 2023-10-11 15:26:55 -07:00
Colin McDonnell
4c9e009971 Update installation.md 2023-10-11 14:31:39 -07:00
h2210316651
4531cf18c2 Docs : Added instructions to run bun apps in daemon (PM2) to address … (#5931)
* Docs: Added instructions to run Bun apps as a daemon (PM2) to address issue #4734

Added instructions to set Bun as the PM2 interpreter to extend the same functionality as Node.js apps.

* Add pm2 guide

* Add pm2 file

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-11 14:22:43 -07:00
Ashcon Partovi
31bda68f24 Update bun-release.yml 2023-10-11 12:05:16 -07:00
dave caruso
1bf28e0d77 feat(install): automatically migrate package-lock.json to bun.lockb (#6352)
* work so far

* stuff

* a

* basics work

* stuff

* yoo

* build lockfile

* correct

* f

* a

* install fixture havent tested

* i made it worse

* lol

* be more reasonable

* make the test easier to pass because bun install doesn't handle obscure lockfile edge cases :/

* a

* works now

* ok

* a

* a

* cool

* nah

* fix stuff

* l

* a

* idfk

* LAME

* prettier errors

* does this fix tests?

* Add more safety checks to Integrity

* Add another check

* More careful lifetime handling

* Fix linux debugger issue

* a

* tmp dir and snapshot test

---------

Co-authored-by: Jarred SUmner <jarred@jarredsumner.com>
2023-10-11 02:27:07 -07:00
Jarred Sumner
6a17ebe669 Update nodejs-apis.md 2023-10-11 01:59:56 -07:00
Arden Sinclair
39446ebdb8 Fix lifecycle scripts not running on reinstallation (#6376)
* Include trusted dependencies in lockfile

* Add a remote dependency to lifecycle script test
2023-10-10 21:13:42 -07:00
Elad Bezalel
c2c3b0d4a9 feat(test): implement toEqualIgnoringWhitespace (#6293)
* feat(test): implement `toEqualIgnoringWhitespace`

* equality check in matcher & incorrect arg error
2023-10-10 20:27:19 -07:00
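
A usage sketch of the matcher added above; the name comes from the commit title, and the semantics assumed here are that all whitespace differences are ignored.

```ts
import { expect, test } from "bun:test";

test("whitespace-insensitive equality", () => {
  expect("SELECT *\n  FROM users").toEqualIgnoringWhitespace("SELECT * FROM users");
});
```
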
saurabh
9a90d90966 fix: form data content type (#6380)
* fix: form data content type

* fix: condition if no extension for file
2023-10-10 20:08:32 -07:00
Dylan Conway
05781dd91e make peer dependencies install by default (#6396)
* peer dependencies

* default true

* add test

* cleanup

* some tests

* skip peer deps if they are non optional

* remove debug print, fix build

* iterate peer dependencies
2023-10-10 20:05:58 -07:00
Aaron Dewes
a6a474a83f Add File to binary data TOC (#6025) 2023-10-10 16:47:35 -07:00
cyfung1031
44dd744f0a docs: rearranged cli/runtime related sections (#6275)
* docs: rearranged cli/runtime related sections

* docs: update README.md for the updated docs path

* Updates

* Rearrange

* Rearrange

* Add files

* readme

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-10 16:34:35 -07:00
Ashcon Partovi
df4ec8aaad Update inspector-protocol 2023-10-10 16:14:46 -07:00
Ashcon Partovi
0348b169d6 Update debug-adapter-protocol 2023-10-10 16:14:46 -07:00
Vasilis Themelis
54dbf3ba21 Add missing ws declarations (#6307) 2023-10-10 15:39:20 -07:00
Clément P
5f09a4dd0a Update vite.md (#6399)
remove outdated information
2023-10-10 15:36:28 -07:00
Nicolae-Rares Ailincai
e58e85cd5c Documentation for the IPC of Bun.spawn (#6400)
* doc/ipc.md

* update/spawn.md

* improved-documentation-and-added-send-type

* Updates

* Updates

---------

Co-authored-by: Colin McDonnell <colinmcd94@gmail.com>
2023-10-10 15:35:05 -07:00
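
A minimal sketch of the IPC flow documented above; the file names are hypothetical, and both sides are assumed to be Bun processes.

```ts
// parent.ts
const child = Bun.spawn(["bun", "child.ts"], {
  ipc(message) {
    console.log("from child:", message); // "pong: ping"
  },
});
child.send("ping");
```

```ts
// child.ts
process.on("message", (message) => {
  process.send?.(`pong: ${message}`);
});
```
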
Jarred Sumner
52d47c24dc Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-09 15:13:58 -08:00
Jarred Sumner
99c92d2b81 Update index.ts 2023-09-09 16:13:45 -07:00
Jarred Sumner
242ab4445b bump 2023-09-09 16:04:00 -07:00
Jarred Sumner
7f6bbce191 Merge branch 'main' into jarred/prepare-for-libuv 2023-09-09 15:03:47 -08:00
Jarred SUmner
fb86c1bdfe [dave]: fix webcrypto crash 2023-09-07 09:03:18 -07:00
Jarred Sumner
f44c2b65a3 hm 2023-09-07 08:58:47 -07:00
Jarred Sumner
7177cb0803 Merge branch 'main' into jarred/prepare-for-libuv 2023-09-07 06:14:50 -07:00
Jarred Sumner
69696793d5 Make sure we remove libusockets is removed 2023-09-07 05:54:45 -07:00
Jarred SUmner
605477f49a Fix some test failures 2023-09-07 05:52:36 -07:00
Jarred SUmner
ce16029183 Fix sqlite test failures 2023-09-07 04:46:53 -07:00
Jarred Sumner
c645342a74 Update mimalloc 2023-09-07 01:54:24 -08:00
Jarred Sumner
48cdfd6685 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-07 01:53:49 -08:00
Jarred Sumner
9452ade912 more 2023-09-07 01:53:41 -08:00
Jarred Sumner
f7ce24895a Set permissions 2023-09-07 00:38:41 -07:00
Jarred Sumner
cffdacc5fa Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-06 23:09:49 -08:00
Jarred Sumner
dbd57f426c Update feature_flags.zig 2023-09-06 23:47:02 -07:00
Jarred Sumner
497d546048 Revert "avoid duplicate symbols"
This reverts commit 4ac6ca8700.
2023-09-06 23:06:47 -07:00
Jarred Sumner
19a4df3c52 Revert "avoid undefined symbols"
This reverts commit ca835b726f.
2023-09-06 23:06:41 -07:00
Jarred Sumner
c7e93c1376 Update response.zig 2023-09-06 22:44:09 -07:00
Jarred Sumner
9e82e040dc Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-06 21:41:36 -08:00
Jarred Sumner
b6a4609a73 Update feature_flags.zig 2023-09-06 22:38:10 -07:00
Jarred Sumner
0a968a66c2 churn 2023-09-06 22:37:41 -07:00
Jarred Sumner
61bb92b7bc Merge remote-tracking branch 'origin' into jarred/prepare-for-libuv 2023-09-06 22:37:24 -07:00
Jarred Sumner
4b5233fc3a feat(fetch) rejectUnauthorized and checkServerIdentity (#4514)
* enable root certs on fetch

* rebase

* fix lookup

* some fixes and improvements

* fmt

* more fixes

* more fixes

* check detached onHandshake

* fix promise case

* fix cert non-Native

* add fetch tls tests

* more one test
2023-09-06 22:33:55 -07:00
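
A hedged sketch of the two options named in the commit above, assuming they are passed through a `tls` object on Bun's `fetch` options; the URL is a placeholder and the exact option placement may differ between versions.

```ts
const res = await fetch("https://self-signed.example.com", {
  tls: {
    rejectUnauthorized: false, // accept certificates that fail verification
    checkServerIdentity(hostname: string, cert: unknown) {
      // Node convention: return undefined to accept, or an Error to reject.
      return undefined;
    },
  },
});
```
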
Birk Skyum
99219d5e1c Update nodejs compat docs cp/cpSync/watchFile/unwatchFile (#4525) 2023-09-06 22:32:27 -07:00
Ashcon Partovi
70ec9afa46 Add bun-types to 'bun fmt' script 2023-09-06 22:32:26 -07:00
Ashcon Partovi
eb316e7197 Add types for watchFile and unwatchFile 2023-09-06 22:32:26 -07:00
Ashcon Partovi
02ba25c103 Add types for cp and cpSync 2023-09-06 22:32:26 -07:00
Ashcon Partovi
2b6f297c0d Remove issue template for install
It's not used; use the bug issue instead.
2023-09-06 22:32:26 -07:00
Jarred Sumner
dc8d70ab02 Merge branch 'main' into jarred/prepare-for-libuv 2023-09-06 21:27:16 -08:00
Jarred Sumner
b38b345184 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-06 20:49:26 -08:00
Jarred Sumner
b7241f77fe bump 2023-09-06 20:47:06 -08:00
Dylan Conway
ca835b726f avoid undefined symbols 2023-09-06 14:24:04 -07:00
Dylan Conway
4ac6ca8700 avoid duplicate symbols 2023-09-06 14:12:06 -07:00
Jarred Sumner
bd7a2619e8 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-06 11:20:35 -08:00
Jarred Sumner
258615a43f fixup 2023-09-06 11:20:01 -08:00
Jarred Sumner
33d83ad6ca Update bun.zig 2023-09-06 12:09:28 -07:00
Jarred Sumner
bb9e0c2043 windows 2023-09-06 11:02:41 -08:00
Jarred Sumner
724a83de53 some clenaup 2023-09-06 10:51:32 -08:00
Jarred Sumner
db42fd3b1d Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-06 10:31:17 -08:00
Jarred Sumner
9a485be954 ok 2023-09-06 10:29:42 -08:00
Jarred Sumner
dca6ffd4cd We have to bump the version of Debian because libarchive has a higher minimum requirement 2023-09-06 09:28:20 -07:00
Jarred Sumner
dae3887bed Undo that change 2023-09-06 09:20:30 -07:00
Jarred Sumner
6979855f42 Bummp 2023-09-06 08:59:06 -07:00
Jarred Sumner
ce4c1351bb Bump 2023-09-06 08:57:53 -07:00
Jarred Sumner
1a275c6337 small fixes 2023-09-06 08:56:16 -07:00
Jarred Sumner
666fbead09 Fixes 2023-09-06 08:42:43 -07:00
Jarred Sumner
b8329f293e Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-06 06:48:27 -08:00
Jarred Sumner
4601b80d7c use less std.os 2023-09-06 06:48:08 -08:00
Jarred Sumner
4b58698136 Merge branch 'main' into jarred/prepare-for-libuv 2023-09-06 07:40:51 -07:00
Jarred Sumner
400bc949e6 more things work 2023-09-06 06:37:59 -08:00
Jarred Sumner
5fb23b9296 bun install progress 2023-09-06 04:18:14 -08:00
Jarred Sumner
caa3ff71fe Make require() work 2023-09-06 01:19:45 -08:00
Jarred Sumner
c33e86e1a4 Bun.serve() and much of the event loop works now 2023-09-06 00:57:25 -08:00
Jarred Sumner
e44911d4eb fetch works 2023-09-05 22:19:48 -08:00
Jarred Sumner
e7c6b1d683 fixup 2023-09-05 21:25:54 -07:00
Jarred Sumner
38849b566d Update JSSink.h 2023-09-05 21:08:50 -07:00
Jarred Sumner
831919d12a regenaret 2023-09-05 21:08:36 -07:00
Jarred Sumner
547b8453ec reb 2023-09-05 21:05:14 -07:00
Jarred Sumner
b9e5758a86 Merge branch 'main' into jarred/prepare-for-libuv 2023-09-05 19:26:45 -08:00
Jarred Sumner
27c82a6763 Many more things are starting to work. 2023-09-05 17:43:23 -08:00
Jarred Sumner
4b1d1a272a Merge branch 'main' into jarred/prepare-for-libuv 2023-09-05 05:11:46 -08:00
Jarred Sumner
ebc64f2b4d zig fmt 2023-09-05 05:11:39 -08:00
Jarred Sumner
e2d3bdd6b3 Update CMakeLists.txt 2023-09-05 05:08:44 -08:00
Jarred Sumner
f9d325cb56 quite a lot of fixes 2023-09-05 05:08:37 -08:00
Jarred Sumner
5b808d6c5c hm 2023-09-05 00:49:37 -08:00
Jarred Sumner
e84a85b020 Update .gitignore 2023-09-05 00:48:00 -08:00
Jarred Sumner
a66b766ced more 2023-09-05 00:45:51 -08:00
Jarred Sumner
04a2944eb3 further! 2023-09-05 00:44:22 -08:00
Jarred Sumner
c5ea98ae56 Update src/bun.js/bindings/headers-handwritten.h
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2023-09-04 19:53:38 -07:00
Dylan Conway
5aa1324938 Update wtf-bindings.cpp 2023-09-04 19:43:38 -07:00
Jarred Sumner
fd97e2f17a Fixups 2023-09-04 19:17:01 -07:00
Jarred Sumner
35fdc690b7 Rename file 2023-09-04 19:17:01 -07:00
Jarred Sumner
32084dac41 hmmm 2023-09-04 10:54:50 -08:00
Jarred Sumner
6adb60197e it works 2023-09-04 10:19:29 -08:00
Jarred Sumner
a113950d41 Merge branch 'jarred/prepare-for-libuv' of https://github.com/oven-sh/bun into jarred/prepare-for-libuv 2023-09-04 08:58:06 -08:00
Jarred Sumner
dfa0bf6f5b Fix build issue 2023-09-04 08:58:04 -08:00
Jarred Sumner
c63b9aa548 Update windows.zig 2023-09-04 09:55:55 -07:00
Jarred Sumner
f40b07ad65 Update windows.zig 2023-09-04 09:55:49 -07:00
Jarred Sumner
f76f15c5e2 Add fast-ish path for bun install on Windows 2023-09-04 09:55:18 -07:00
Jarred Sumner
37d8593713 fix variosu issues 2023-09-04 05:58:03 -08:00
Jarred Sumner
3b9318794d Fix getenvZ 2023-09-04 03:07:45 -07:00
Jarred Sumner
60f1a81d06 12 mb 2023-09-04 02:42:21 -07:00
Jarred Sumner
03334de41e More 2023-09-04 00:02:19 -08:00
Jarred Sumner
36840f8ab0 Add musl polyfill for memmem on Windows 2023-09-04 00:02:07 -08:00
Jarred Sumner
2ef51cadb0 Rename Process -> BunProcess
Works around a Windows issue
2023-09-04 00:01:53 -08:00
Jarred Sumner
c9f614a2f2 theres more 2023-09-04 00:01:29 -08:00
Jarred Sumner
5b1f211110 draw the rest of the owl 2023-09-04 00:01:00 -08:00
Jarred Sumner
0ff9a10355 chunk 2023-09-03 23:57:55 -08:00
Jarred Sumner
c267a422ad Fix one of the compiler errors 2023-09-03 22:21:35 -07:00
Jarred Sumner
3de075bbae Fixup 2023-09-03 04:18:51 -07:00
Jarred Sumner
b142009fc9 Fix usockets warnings 2023-09-03 04:14:29 -07:00
Jarred Sumner
8f7f75bb2a put it in the zig file 2023-09-03 03:56:51 -07:00
Jarred Sumner
80e45451cf Update JSSQLStatement.h 2023-09-03 02:24:07 -07:00
Jarred Sumner
1d5ea13825 cmake works 2023-09-03 02:14:02 -07:00
Jarred Sumner
d5d0ffe43c more warnings 2023-09-03 00:46:46 -07:00
Jarred Sumner
8a5f27c1a7 Remove more warnings 2023-09-03 00:32:55 -07:00
Jarred Sumner
40193f8846 Fix a bunch of compiler warnings 2023-09-03 00:29:25 -07:00
Jarred Sumner
24582cc99b Update settings.json 2023-09-02 21:30:16 -07:00
Jarred Sumner
668e5ce368 Add the build scripts 2023-09-01 21:33:52 -07:00
Jarred Sumner
9c8e17e300 Bump mimalloc 2023-09-01 20:40:29 -07:00
Jarred Sumner
cbdf380493 Make compiling each dependency a shell script 2023-09-01 18:15:19 -07:00
Jarred Sumner
3c4015de26 wip 2023-08-31 03:29:03 -07:00
Jarred Sumner
69f85b59ad Update libuv.zig 2023-08-31 02:16:58 -07:00
Jarred Sumner
3775081e5a More progress 2023-08-31 02:11:04 -07:00
Jarred Sumner
a81ca0d9cc Prepare for windows event loop 2023-08-31 00:22:50 -07:00
588 changed files with 60640 additions and 61778 deletions

22
.build/base64.bash Executable file

@@ -0,0 +1,22 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CFLAGS=${CFLAGS:-}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/base64
echo "1: $(which make)"
echo "2: $(which cmake)"
make clean
cmake $CMAKE_FLAGS .
make
cp libbase64.a $BUN_DEPS_OUT_DIR/libbase64.a

22
.build/base64.ps1 Normal file

@@ -0,0 +1,22 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
# Create the output directory if it doesn't exist
if (-not (Test-Path $BUN_DEPS_OUT_DIR)) {
New-Item -ItemType Directory -Path $BUN_DEPS_OUT_DIR
}
Set-Location (Join-Path $BUN_DEPS_DIR 'base64')
cmake $CMAKE_FLAGS .
cmake --build . --clean-first --config Release
Copy-Item **/*.lib $BUN_DEPS_OUT_DIR

22
.build/boringssl.bash Executable file

@@ -0,0 +1,22 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CFLAGS=${CFLAGS:-}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/boringssl
rm -rf build
mkdir -p build
cd build
CFLAGS="$CFLAGS" cmake $CMAKE_FLAGS -DCMAKE_EXE_LINKER_FLAGS="-fuse-ld=lld" -GNinja ..
ninja libcrypto.a libssl.a libdecrepit.a
cp **/libcrypto.a $BUN_DEPS_OUT_DIR/libcrypto.a
cp **/libssl.a $BUN_DEPS_OUT_DIR/libssl.a
cp **/libdecrepit.a $BUN_DEPS_OUT_DIR/libdecrepit.a

19
.build/boringssl.ps1 Normal file

@@ -0,0 +1,19 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
mkdir -p $BUN_DEPS_OUT_DIR -Force
Set-Location $BUN_DEPS_DIR/boringssl
cmake $CMAKE_FLAGS .
cmake --build . --target crypto --target ssl --target decrepit --clean-first --config Release
Copy-Item crypto/Release/crypto.lib $BUN_DEPS_OUT_DIR
Copy-Item ssl/Release/ssl.lib $BUN_DEPS_OUT_DIR
Copy-Item decrepit/Release/decrepit.lib $BUN_DEPS_OUT_DIR

24
.build/cares.ps1 Normal file

@@ -0,0 +1,24 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
# Create the output directory if it doesn't exist
if (-not (Test-Path $BUN_DEPS_OUT_DIR)) {
New-Item -ItemType Directory -Path $BUN_DEPS_OUT_DIR
}
Set-Location (Join-Path $BUN_DEPS_DIR 'c-ares')
rm -r build -ErrorAction SilentlyContinue
mkdir build -ErrorAction SilentlyContinue
cd build
cmake $CMAKE_FLAGS -DCMAKE_BUILD_TYPE=Release -G "Visual Studio 17 2022" -DCARES_STATIC=ON -DCARES_SHARED=OFF ..
cmake --build . --clean-first --config Release
cp ./lib/Release/*.lib $BUN_DEPS_OUT_DIR

22
.build/libarchive.bash Executable file

@@ -0,0 +1,22 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CPUS=${CPUS:-$(nproc || sysctl -n hw.ncpu || echo 1)}
CFLAGS=${CFLAGS:-}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/libarchive
make clean || echo ""
./build/clean.sh || echo ""
./build/autogen.sh
CFLAGS="$CFLAGS" $CCACHE_CC_FLAG ./configure --disable-shared --enable-static --with-pic --disable-bsdtar --disable-bsdcat --disable-rpath --enable-posix-regex-lib --without-xml2 --without-expat --without-openssl --without-iconv --without-zlib
make -j$CPUS
cp ./.libs/libarchive.a $BUN_DEPS_OUT_DIR/libarchive.a

16
.build/libarchive.ps1 Normal file

@@ -0,0 +1,16 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
Set-Location $BUN_DEPS_DIR/libarchive
cmake -DBUILD_SHARED_LIBS=OFF -DENABLE_TEST=OFF -DENABLE_INSTALL=OFF --compile-no-warning-as-error $CMAKE_FLAGS .
cmake --build . --target ALL_BUILD --clean-first --config Release -- /p:WarningLevel=0
Copy-Item libarchive/Release/archive.lib $BUN_DEPS_OUT_DIR

16
.build/lolhtml.bash Executable file

@@ -0,0 +1,16 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CFLAGS=${CFLAGS:-}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/lol-html/c-api
cargo build --release
cp target/release/liblolhtml.a $BUN_DEPS_OUT_DIR

16
.build/lolhtml.ps1 Normal file

@@ -0,0 +1,16 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
Set-Location $BUN_DEPS_DIR/lol-html/c-api
cargo build --release --target x86_64-pc-windows-msvc
Copy-Item target/x86_64-pc-windows-msvc/release/lolhtml.lib $BUN_DEPS_OUT_DIR
Copy-Item target/x86_64-pc-windows-msvc/release/lolhtml.pdb $BUN_DEPS_OUT_DIR

38
.build/mimalloc-debug.bash Executable file

@@ -0,0 +1,38 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CPUS=${CPUS:-$(nproc || sysctl -n hw.ncpu || echo 1)}
CFLAGS=${CFLAGS:-}
MIMALLOC_OVERRIDE_FLAG=${MIMALLOC_OVERRIDE_FLAG:-}
MIMALLOC_VALGRIND_ENABLED_FLAG=${MIMALLOC_VALGRIND_ENABLED_FLAG:-}
mkdir -p $BUN_DEPS_OUT_DIR
rm -rf $BUN_DEPS_DIR/mimalloc/CMakeCache* $BUN_DEPS_DIR/mimalloc/CMakeFiles
cd $BUN_DEPS_DIR/mimalloc
make clean || echo ""
CFLAGS="$CFLAGS" cmake $CMAKE_FLAGS $MIMALLOC_OVERRIDE_FLAG $MIMALLOC_VALGRIND_ENABLED_FLAG \
-DCMAKE_BUILD_TYPE=Debug \
-DMI_DEBUG_FULL=1 \
-DMI_SKIP_COLLECT_ON_EXIT=1 \
-DMI_BUILD_SHARED=OFF \
-DMI_BUILD_STATIC=ON \
-DMI_BUILD_TESTS=OFF \
-DMI_OSX_ZONE=OFF \
-DMI_OSX_INTERPOSE=OFF \
-DMI_BUILD_OBJECT=ON \
-DMI_USE_CXX=ON \
-DMI_OVERRIDE=OFF \
-DCMAKE_C_FLAGS="$CFLAGS" \
-DCMAKE_CXX_FLAGS="$CFLAGS" \
-GNinja .
ninja
cp $BUN_DEPS_DIR/mimalloc/libmimalloc-debug.a $BUN_DEPS_OUT_DIR/libmimalloc.a

37
.build/mimalloc.bash Executable file

@@ -0,0 +1,37 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CPUS=${CPUS:-$(nproc || sysctl -n hw.ncpu || echo 1)}
CFLAGS=${CFLAGS:-}
MIMALLOC_OVERRIDE_FLAG=${MIMALLOC_OVERRIDE_FLAG:-}
MIMALLOC_VALGRIND_ENABLED_FLAG=${MIMALLOC_VALGRIND_ENABLED_FLAG:-}
mkdir -p $BUN_DEPS_OUT_DIR
rm -rf $BUN_DEPS_DIR/mimalloc/CMakeCache* $BUN_DEPS_DIR/mimalloc/CMakeFiles
cd $BUN_DEPS_DIR/mimalloc
make clean || echo ""
CFLAGS="$CFLAGS" cmake $CMAKE_FLAGS $MIMALLOC_OVERRIDE_FLAG \
-DMI_SKIP_COLLECT_ON_EXIT=1 \
-DMI_BUILD_SHARED=OFF \
-DMI_BUILD_STATIC=ON \
-DMI_BUILD_TESTS=OFF \
-DMI_OSX_ZONE=OFF \
-DMI_OSX_INTERPOSE=OFF \
-DMI_BUILD_OBJECT=ON \
-DMI_USE_CXX=ON \
-DMI_OVERRIDE=OFF \
-DMI_OSX_ZONE=OFF \
-DCMAKE_C_FLAGS="$CFLAGS" \
-GNinja .
ninja
cp $BUN_DEPS_DIR/mimalloc/libmimalloc.a $BUN_DEPS_OUT_DIR/libmimalloc.a

22
.build/mimalloc.ps1 Normal file

@@ -0,0 +1,22 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
# Create the output directory if it doesn't exist
if (-not (Test-Path $BUN_DEPS_OUT_DIR)) {
New-Item -ItemType Directory -Path $BUN_DEPS_OUT_DIR
}
Set-Location (Join-Path $BUN_DEPS_DIR 'mimalloc')
cmake $CMAKE_FLAGS -DMI_SKIP_COLLECT_ON_EXIT=1 -DMI_BUILD_SHARED=OFF -DMI_BUILD_STATIC=ON -DMI_BUILD_TESTS=OFF -DMI_OSX_ZONE=OFF -DMI_OSX_INTERPOSE=OFF -DMI_BUILD_OBJECT=ON -DMI_USE_CXX=ON -DMI_OVERRIDE=OFF -DMI_OSX_ZONE=OFF -DCMAKE_C_FLAGS="$CFLAGS" .
cmake --build . --clean-first --config Release
Copy-Item **/*.lib $BUN_DEPS_OUT_DIR

22
.build/tinycc.bash Executable file

@@ -0,0 +1,22 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CPUS=${CPUS:-$(nproc || sysctl -n hw.ncpu || echo 1)}
AR=${AR:-ar}
CC=${CC:-cc}
CFLAGS=${CFLAGS:-}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/tinycc
make clean
AR=$AR CC=$CC CFLAGS="$CFLAGS" ./configure --enable-static --cc=$CC --ar=$AR --config-predefs=yes
make -j10
cp *.a $BUN_DEPS_OUT_DIR

18
.build/zlib.bash Executable file

@@ -0,0 +1,18 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CPUS=${CPUS:-$(nproc || sysctl -n hw.ncpu || echo 1)}
CFLAGS=${CFLAGS:-}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/zlib
make clean
$CCACHE_CC_FLAG CFLAGS="$CFLAGS" ./configure --static
make -j${CPUS}
cp ./libz.a $BUN_DEPS_OUT_DIR/libz.a

19
.build/zstd.bash Executable file

@@ -0,0 +1,19 @@
#!/usr/bin/env bash
set -euxo pipefail
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)
CMAKE_FLAGS=${CMAKE_FLAGS:-}
BUN_BASE_DIR=${BUN_BASE_DIR:-$(cd $SCRIPT_DIR && cd ../ && pwd)}
BUN_DEPS_OUT_DIR=${BUN_DEPS_OUT_DIR:-$BUN_BASE_DIR/src/deps/}
BUN_DEPS_DIR=${BUN_DEPS_DIR:-$BUN_BASE_DIR/src/deps}
CCACHE_CC_FLAG=${CCACHE_CC_FLAG:-}
CPUS=${CPUS:-$(nproc || sysctl -n hw.ncpu || echo 1)}
mkdir -p $BUN_DEPS_OUT_DIR
cd $BUN_DEPS_DIR/zstd
rm -rf Release CMakeCache.txt CMakeFiles
cmake $CMAKE_FLAGS -DZSTD_BUILD_STATIC=ON -B Release -S build/cmake -G Ninja
ninja -C Release
cp Release/lib/libzstd.a $BUN_DEPS_OUT_DIR/libzstd.a

21
.build/zstd.ps1 Normal file

@@ -0,0 +1,21 @@
$ErrorActionPreference = 'Stop' # Setting strict mode, similar to 'set -euo pipefail' in bash
$SCRIPT_DIR = Split-Path $PSScriptRoot -Parent
$CMAKE_FLAGS = $env:CMAKE_FLAGS
$BUN_BASE_DIR = if ($env:BUN_BASE_DIR) { $env:BUN_BASE_DIR } else { $SCRIPT_DIR }
$BUN_DEPS_OUT_DIR = if ($env:BUN_DEPS_OUT_DIR) { $env:BUN_DEPS_OUT_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$BUN_DEPS_DIR = if ($env:BUN_DEPS_DIR) { $env:BUN_DEPS_DIR } else { Join-Path $BUN_BASE_DIR 'src\deps' }
$CCACHE_CC_FLAG = $env:CCACHE_CC_FLAG
$CPUS = if ($env:CPUS) { $env:CPUS } else { (Get-WmiObject -Class Win32_ComputerSystem).NumberOfLogicalProcessors }
$CFLAGS = $env:CFLAGS
$CXXFLAGS = $env:CXXFLAGS
# Create the output directory if it doesn't exist
if (-not (Test-Path $BUN_DEPS_OUT_DIR)) {
New-Item -ItemType Directory -Path $BUN_DEPS_OUT_DIR
}
Set-Location (Join-Path $BUN_DEPS_DIR 'zstd\build\cmake')
cmake $CMAKE_FLAGS -DZSTD_BUILD_STATIC=ON -DCMAKE_BUILD_TYPE=Release
cmake --build . --clean-first --config Release
Copy-Item lib\*\**.lib $BUN_DEPS_OUT_DIR

89
.github/workflows/bun-build.yml vendored Normal file

@@ -0,0 +1,89 @@
name: bun-build
env:
  ZIG_VERSION: 0.12.0-dev.888+130227491
concurrency:
  group: bun-build-${{ github.ref || github.run_id }}
  cancel-in-progress: true
on: [push]
# push:
#   branches:
#     - main
#     - ci/*
#   paths:
#     # Build files
#     # Source files
#     - "src/**/*"
#     - "packages/{bun-usockets,bun-uws}/src/**/*"
# pull_request:
#   branches:
#     - main
#   paths:
#     # Build files
#     - "a"
#     # Source files
#     - "src/**/*"
#     - "packages/{bun-usockets,bun-uws}/src/**/*"
# workflow_dispatch: {}
permissions:
  contents: read
jobs:
  build:
    name: Build (${{ matrix.id }})
    timeout-minutes: 60
    strategy:
      fail-fast: false
      matrix:
        include:
          - id: linux-x64
            runner: ubuntu-latest
    runs-on: ${{ matrix.runner }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Restore Cache
        uses: actions/cache@v3
        with:
          key: ${{ runner.os }}-build-${{ matrix.id }}
          path: |
            build
      - name: Setup APT
        uses: awalsh128/cache-apt-pkgs-action@latest
        with:
          version: "1" # increment when packages change
          packages: |
            make
            cmake
            ccache
            ninja-build
      - name: Setup Zig
        uses: goto-bus-stop/setup-zig@v2
        with:
          version: ${{ env.ZIG_VERSION }}
      # - name: Setup LLVM
      #   uses: KyleMayes/install-llvm-action@v1
      #   with:
      #     version: "16"
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: "20"
      - name: Setup Bun
        uses: oven-sh/setup-bun@v1
        with:
          bun-version: latest
      - name: Setup Dependencies
        run: |
          bun install
          bash .scripts/postinstall.sh
          bun install -g esbuild
      - name: Setup upterm session
        uses: lhotari/action-upterm@v1
        with:
          limit-access-to-users: Electroid,paperdave,Jarred-Sumner

View File

@@ -16,6 +16,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -37,7 +38,7 @@ jobs:
arch: aarch64
build_arch: arm64
runner: linux-arm64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-linux-arm64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-arm64-lto"
build_machine_arch: aarch64

View File

@@ -16,6 +16,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -25,6 +26,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -47,7 +49,7 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64
- cpu: nehalem
@@ -55,7 +57,7 @@ jobs:
arch: x86_64
build_arch: amd64
runner: big-ubuntu
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-linux-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-linux-amd64-lto.tar.gz"
webkit_basename: "bun-webkit-linux-amd64-lto"
build_machine_arch: x86_64

View File

@@ -15,6 +15,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -23,6 +24,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -117,7 +119,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: haswell
@@ -126,7 +128,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
# - cpu: nehalem
@@ -135,7 +137,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: haswell
@@ -144,7 +146,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: native
@@ -152,7 +154,7 @@ jobs:
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
dependencies: true
compile_obj: true
@@ -258,7 +260,7 @@ jobs:
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
@@ -266,14 +268,14 @@ jobs:
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: native
arch: aarch64
tag: bun-darwin-aarch64
obj: bun-obj-darwin-aarch64
package: bun-darwin-aarch64
artifact: bun-obj-darwin-aarch64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-arm64-lto.tar.gz"
runner: macos-arm64
steps:
- uses: actions/checkout@v3

View File

@@ -15,6 +15,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -23,6 +24,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -117,7 +119,7 @@ jobs:
obj: bun-obj-darwin-x64-baseline
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: haswell
@@ -126,7 +128,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: nehalem
@@ -135,7 +137,7 @@ jobs:
obj: bun-obj-darwin-x64-baseline
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: haswell
@@ -144,7 +146,7 @@ jobs:
# obj: bun-obj-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
# - cpu: native
@@ -152,7 +154,7 @@ jobs:
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
@@ -259,7 +261,7 @@ jobs:
package: bun-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64-baseline
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: haswell
# arch: x86_64
# tag: bun-darwin-x64
@@ -267,14 +269,14 @@ jobs:
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3

@@ -15,6 +15,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -23,6 +24,7 @@ on:
paths:
- "src/**/*"
- "test/**/*"
- "packages/bun-usockets/src/**/*"
- "build.zig"
- "Makefile"
- "Dockerfile"
@@ -117,7 +119,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: true
# compile_obj: false
- cpu: haswell
@@ -126,7 +128,7 @@ jobs:
obj: bun-obj-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: true
compile_obj: false
# - cpu: nehalem
@@ -135,7 +137,7 @@ jobs:
# obj: bun-obj-darwin-x64-baseline
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# dependencies: false
# compile_obj: true
- cpu: haswell
@@ -144,7 +146,7 @@ jobs:
obj: bun-obj-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
dependencies: false
compile_obj: true
# - cpu: native
@@ -152,7 +154,7 @@ jobs:
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
# dependencies: true
# compile_obj: true
@@ -261,7 +263,7 @@ jobs:
# package: bun-darwin-x64
# runner: macos-12
# artifact: bun-obj-darwin-x64-baseline
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
- cpu: haswell
arch: x86_64
tag: bun-darwin-x64
@@ -269,14 +271,14 @@ jobs:
package: bun-darwin-x64
runner: macos-12
artifact: bun-obj-darwin-x64
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-amd64-lto.tar.gz"
webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-amd64-lto.tar.gz"
# - cpu: native
# arch: aarch64
# tag: bun-darwin-aarch64
# obj: bun-obj-darwin-aarch64
# package: bun-darwin-aarch64
# artifact: bun-obj-darwin-aarch64
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3/bun-webkit-macos-arm64-lto.tar.gz"
# webkit_url: "https://github.com/oven-sh/WebKit/releases/download/2023-oct3-2/bun-webkit-macos-arm64-lto.tar.gz"
# runner: macos-arm64
steps:
- uses: actions/checkout@v3

@@ -2,7 +2,7 @@ name: bun-release
concurrency: release
env:
BUN_VERSION: ${{ github.event.inputs.tag || github.event.release.tag_name || 'canary' }}
BUN_LATEST: ${{ github.event.inputs.is-latest || github.event.release.prerelease == 'false' }}
BUN_LATEST: ${{ (github.event.inputs.is-latest || github.event.release.tag_name) && 'true' || 'false' }}
on:
release:
types:

@@ -1,7 +1,7 @@
name: zig-fmt
env:
ZIG_VERSION: 0.12.0-dev.163+6780a6bbf
ZIG_VERSION: 0.12.0-dev.899+027aabf49
on:
pull_request:

18
.gitignore vendored

@@ -135,3 +135,21 @@ make-dev-stats.csv
.uuid
tsconfig.tsbuildinfo
build-release
*.lib
*.pdb
CMakeFiles
build.ninja
.ninja_deps
.ninja_log
CMakeCache.txt
cmake_install.cmake
compile_commands.json
*.lib
x64
**/*.vcxproj*
**/*.sln*
**/*.dir
**/*.pdb

@@ -1,21 +0,0 @@
// I would have made this a bash script but there isn't an easy way to track
// time in bash sub-second cross platform.
import fs from "fs";
const start = Date.now() + 5;
const result = Bun.spawnSync(process.argv.slice(2), {
stdio: ["inherit", "inherit", "inherit"],
});
const end = Date.now();
const diff = (Math.max(Math.round(end - start), 0) / 1000).toFixed(3);
const success = result.exitCode === 0;
try {
const line = `${new Date().toISOString()}, ${success ? "success" : "fail"}, ${diff}\n`;
if (fs.existsSync(".scripts/make-dev-stats.csv")) {
fs.appendFileSync(".scripts/make-dev-stats.csv", line);
} else {
fs.writeFileSync(".scripts/make-dev-stats.csv", line);
}
} catch {
// Ignore
}
process.exit(result.exitCode);

@@ -1,13 +1,6 @@
#!/bin/bash
set -euxo pipefail
# if bun-webkit node_modules directory exists
if [ -d ./node_modules/bun-webkit ]; then
rm -f bun-webkit
# get the first matching bun-webkit-* directory name
ln -s ./node_modules/$(ls ./node_modules | grep bun-webkit- | head -n 1) ./bun-webkit
fi
# sets up vscode C++ intellisense
rm -f .vscode/clang++
ln -s $(which clang++-16 || which clang++) .vscode/clang++ 2>/dev/null

@@ -6,11 +6,11 @@
"includePath": [
"${workspaceFolder}/../webkit-build/include/",
"${workspaceFolder}/bun-webkit/include/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/WTF/Headers",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/bmalloc/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/",
"${workspaceFolder}/src/bun.js/bindings/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
@@ -28,11 +28,11 @@
"path": [
"${workspaceFolder}/../webkit-build/include/",
"${workspaceFolder}/bun-webkit/include/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Release/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/napi/*",
@@ -59,7 +59,70 @@
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=0",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1"
],
"macFrameworkPath": [],
"compilerPath": "${workspaceFolder}/.vscode/clang++",
"cStandard": "c17",
"cppStandard": "c++20"
},
{
"name": "BunWithJSCDebug",
"forcedInclude": ["${workspaceFolder}/src/bun.js/bindings/root.h"],
"includePath": [
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/",
"${workspaceFolder}/src/bun.js/bindings/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/modules/",
"${workspaceFolder}/src/js/builtins/",
"${workspaceFolder}/src/js/out",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/packages/bun-usockets/src",
"${workspaceFolder}/packages/"
],
"browse": {
"path": [
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/ICU/Headers/",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/JavaScriptCore/PrivateHeaders/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/WTF/Headers/**",
"${workspaceFolder}/src/bun.js/WebKit/WebKitBuild/Debug/bmalloc/Headers/**",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/bun.js/bindings/*",
"${workspaceFolder}/src/napi/*",
"${workspaceFolder}/src/bun.js/bindings/sqlite/",
"${workspaceFolder}/src/bun.js/bindings/webcrypto/",
"${workspaceFolder}/src/bun.js/bindings/webcore/",
"${workspaceFolder}/src/js/builtins/*",
"${workspaceFolder}/src/js/out/*",
"${workspaceFolder}/src/bun.js/modules/*",
"${workspaceFolder}/src/deps",
"${workspaceFolder}/src/deps/boringssl/include/",
"${workspaceFolder}/packages/bun-usockets/",
"${workspaceFolder}/packages/bun-uws/",
"${workspaceFolder}/src/napi"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb_debug"
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
"STATICALLY_LINKED_WITH_WTF=1",
"BUILDING_WITH_CMAKE=1",
"NOMINMAX",
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=1",
"DU_DISABLE_RENAMING=1"
],
"macFrameworkPath": [],

13
.vscode/launch.json generated vendored

@@ -82,7 +82,7 @@
"request": "launch",
"name": "bun test [*]",
"program": "bun-debug",
"args": ["test"],
"args": ["test", "js/node"],
"cwd": "${workspaceFolder}/test",
"env": {
"FORCE_COLOR": "1",
@@ -96,7 +96,7 @@
"request": "launch",
"name": "bun test [*] (fast)",
"program": "bun-debug",
"args": ["test"],
"args": ["test", "js"],
// The cwd here must be the same as in CI. Or you will cause test failures that only happen in CI.
"cwd": "${workspaceFolder}/test",
"env": {
@@ -123,7 +123,7 @@
"type": "lldb",
"request": "launch",
"name": "bun run [file]",
"program": "bun-debug",
"program": "./Build/debug/bun.exe",
"args": ["run", "${file}", "${file}"],
"cwd": "${fileDirname}",
"env": {
@@ -307,13 +307,10 @@
"name": "bun install",
"program": "bun-debug",
"args": ["install"],
"cwd": "${fileDirname}",
"cwd": "/Users/jarred/Build/worky",
"console": "internalConsole",
"env": {
"BUN_DEBUG_QUIET_LOGS": "1"
}
"env": {}
},
{
"type": "lldb",
"request": "launch",

89
.vscode/settings.json vendored

@@ -7,9 +7,7 @@
"search.followSymlinks": false,
"search.useIgnoreFiles": true,
"zig.buildOnSave": false,
// We do this until we upgrade to latest Zig so that zls doesn't break our code.
"zig.formattingProvider": "extension",
"zig.buildArgs": ["obj", "-Dfor-editor"],
"zig.formattingProvider": "zls",
"zig.buildOption": "build",
"zig.buildFilePath": "${workspaceFolder}/build.zig",
"[zig]": {
@@ -68,40 +66,40 @@
"test/snapshots": true,
"test/snapshots-no-hmr": true,
"src/bun.js/WebKit": true,
"src/deps/libarchive": true,
"src/deps/mimalloc": true,
"src/deps/s2n-tls": true,
"src/deps/boringssl": true,
"src/deps/openssl": true,
"src/deps/uws": true,
"src/deps/zlib": true,
"src/deps/lol-html": true,
"src/deps/c-ares": true,
"src/deps/tinycc": true,
"src/deps/zstd": true,
// "src/deps/libarchive": true,
// "src/deps/mimalloc": true,
// "src/deps/s2n-tls": true,
// "src/deps/boringssl": true,
// "src/deps/openssl": true,
// "src/deps/uws": true,
// "src/deps/zlib": true,
// "src/deps/lol-html": true,
// "src/deps/c-ares": true,
// "src/deps/tinycc": true,
// "src/deps/zstd": true,
"test/snippets/package-json-exports/_node_modules_copy": true,
"src/js/out": true,
"packages/bun-uws/fuzzing/seed-corpus/": true,
"**/*.dep": true,
"**/CMakeFiles": true
"**/*.dep": true
// "**/CMakeFiles": true
},
"C_Cpp.files.exclude": {
"**/.vscode": true,
"src/bun.js/WebKit/JSTests": true,
"src/bun.js/WebKit/Tools": true,
"src/bun.js/WebKit/WebDriverTests": true,
"src/bun.js/WebKit/WebKit.xcworkspace": true,
"src/bun.js/WebKit/WebKitLibraries": true,
"src/bun.js/WebKit/Websites": true,
"src/bun.js/WebKit/resources": true,
"src/bun.js/WebKit/LayoutTests": true,
"src/bun.js/WebKit/ManualTests": true,
"src/bun.js/WebKit/PerformanceTests": true,
"src/bun.js/WebKit/WebKitLegacy": true,
"src/bun.js/WebKit/WebCore": true,
"src/bun.js/WebKit/WebDriver": true,
"src/bun.js/WebKit/WebKitBuild": true,
"src/bun.js/WebKit/WebInspectorUI": true
"WebKit/JSTests": true,
"WebKit/Tools": true,
"WebKit/WebDriverTests": true,
"WebKit/WebKit.xcworkspace": true,
"WebKit/WebKitLibraries": true,
"WebKit/Websites": true,
"WebKit/resources": true,
"WebKit/LayoutTests": true,
"WebKit/ManualTests": true,
"WebKit/PerformanceTests": true,
"WebKit/WebKitLegacy": true,
"WebKit/WebCore": true,
"WebKit/WebDriver": true,
"WebKit/WebKitBuild": true,
"WebKit/WebInspectorUI": true
},
"[cpp]": {
"editor.defaultFormatter": "xaver.clang-format"
@@ -221,10 +219,33 @@
"regex": "cpp",
"span": "cpp",
"valarray": "cpp",
"codecvt": "cpp"
"codecvt": "cpp",
"types.h": "c",
"bsd.h": "c",
"xtr1common": "cpp",
"stop_token": "cpp",
"xfacet": "cpp",
"xhash": "cpp",
"xiosbase": "cpp",
"xlocale": "cpp",
"xlocbuf": "cpp",
"xlocinfo": "cpp",
"xlocmes": "cpp",
"xlocmon": "cpp",
"xlocnum": "cpp",
"xloctime": "cpp",
"xmemory": "cpp",
"xstring": "cpp",
"xtree": "cpp",
"xutility": "cpp",
"string.h": "c",
"zutil.h": "c",
"gzguts.h": "c",
"stdatomic.h": "c",
"root_certs.h": "c"
},
"cmake.configureOnOpen": false,
"C_Cpp.errorSquiggles": "enabled",
"eslint.workingDirectories": ["packages/bun-types"],
"typescript.tsdk": "node_modules/typescript/lib"
"typescript.tsdk": "node_modules/typescript/lib",
"zig.initialSetupDone": true
}

974
CMakeLists.txt Normal file

@@ -0,0 +1,974 @@
cmake_minimum_required(VERSION 3.22)
cmake_policy(SET CMP0091 NEW)
cmake_policy(SET CMP0067 NEW)
set(Bun_VERSION "1.0.7")
set(WEBKIT_TAG 1a49a1f94bf42ab4f8c6b11d7bbbb21e491d2d62)
set(BUN_WORKDIR "${CMAKE_CURRENT_BINARY_DIR}")
message(STATUS "Configuring Bun ${Bun_VERSION} in ${BUN_WORKDIR}")
# --- Build Type ---
# This is done at the start simply so this is the first message printed
if(NOT CMAKE_BUILD_TYPE)
message(WARNING "No CMAKE_BUILD_TYPE value specified, defaulting to Debug.\nSet a build type with -DCMAKE_BUILD_TYPE=<Debug|Release>")
set(CMAKE_BUILD_TYPE "Debug" CACHE STRING "Choose the type of build (Debug, Release)" FORCE)
else()
if(NOT CMAKE_BUILD_TYPE MATCHES "^(Debug|Release)$")
message(FATAL_ERROR
"Invalid CMAKE_BUILD_TYPE value specified: ${CMAKE_BUILD_TYPE}\n"
"CMAKE_BUILD_TYPE must be Debug or Release.")
endif()
message(STATUS "The CMake build type is: ${CMAKE_BUILD_TYPE}")
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Debug")
set(DEBUG ON)
set(ZIG_OPTIMIZE "Debug")
set(bun "bun-debug")
elseif(CMAKE_BUILD_TYPE STREQUAL "Release")
set(DEBUG OFF)
set(ZIG_OPTIMIZE "ReleaseFast")
set(bun "bun-profile")
endif()
# --- LLVM ---
# This detection is a little overkill, but it ensures that the configured LLVM_VERSION is matched in
# every possible case. Sorry for the complexity...
#
# Bun and WebKit must be compiled with the same compiler, so we do as much as we can to ensure that
# the compiler used for the prebuilt WebKit, LLVM 16, is the one that we detect in this process.
#
# It has to be done before project() is called, so that CMake doesn't pick a compiler for us, but even then
# we do some extra work afterwards to double-check, and we will rerun BUN_FIND_LLVM if the compiler did not match.
#
# If the user passes -DLLVM_PREFIX, most of this logic is skipped, but we still warn if invalid.
set(LLVM_VERSION 16)
macro(BUN_FIND_LLVM)
find_program(
_LLVM_CLANG_PATH
NAMES clang++-${LLVM_VERSION} clang-${LLVM_VERSION} clang++ clang
PATHS ENV PATH ${PLATFORM_LLVM_SEARCH_PATHS}
DOC "Path to LLVM ${LLVM_VERSION}'s clang++ binary. Please pass -DLLVM_PREFIX with the path to LLVM"
)
if(NOT _LLVM_CLANG_PATH)
message(FATAL_ERROR "Could not find LLVM ${LLVM_VERSION}, search paths: ${PLATFORM_LLVM_SEARCH_PATHS}")
endif()
set(CMAKE_CXX_COMPILER ${_LLVM_CLANG_PATH})
set(CMAKE_C_COMPILER ${_LLVM_CLANG_PATH})
find_program(
STRIP
NAMES llvm-strip
PATHS ENV PATH ${PLATFORM_LLVM_SEARCH_PATHS}
DOC "Path to LLVM ${LLVM_VERSION}'s llvm-strip binary"
)
find_program(
DSYMUTIL
NAMES dsymutil
PATHS ENV PATH ${PLATFORM_LLVM_SEARCH_PATHS}
DOC "Path to LLVM ${LLVM_VERSION}'s dsymutil binary"
)
find_program(
AR
NAMES llvm-ar
PATHS ENV PATH ${PLATFORM_LLVM_SEARCH_PATHS}
DOC "Path to LLVM ${LLVM_VERSION}'s llvm-ar binary"
)
find_program(
RANLIB
NAMES llvm-ranlib
PATHS ENV PATH ${PLATFORM_LLVM_SEARCH_PATHS}
DOC "Path to LLVM ${LLVM_VERSION}'s llvm-ranlib binary"
)
execute_process(COMMAND ${CMAKE_CXX_COMPILER} --version OUTPUT_VARIABLE _tmp)
string(REGEX MATCH "version ([0-9]+)\\.([0-9]+)\\.([0-9]+)" CMAKE_CXX_COMPILER_VERSION "${_tmp}")
set(CMAKE_CXX_COMPILER_VERSION "${CMAKE_MATCH_1}.${CMAKE_MATCH_2}.${CMAKE_MATCH_3}")
endmacro()
if(UNIX)
if(LLVM_PREFIX)
set(PLATFORM_LLVM_SEARCH_PATHS ${LLVM_PREFIX}/bin)
else()
set(PLATFORM_LLVM_SEARCH_PATHS /usr/lib/llvm-${LLVM_VERSION}/bin /lib/llvm-${LLVM_VERSION}/bin /usr/bin /usr/local/bin)
if(APPLE)
set(PLATFORM_LLVM_SEARCH_PATHS /opt/homebrew/opt/llvm@${LLVM_VERSION}/bin /opt/homebrew/bin ${PLATFORM_LLVM_SEARCH_PATHS})
endif()
endif()
if(CMAKE_CXX_COMPILER)
set(_LLVM_CLANG_PATH "${CMAKE_CXX_COMPILER}")
endif()
BUN_FIND_LLVM()
else()
# On Windows, MSVC is expected; until we get a better build configuration, compiler selection is a free-for-all
endif()
set(CMAKE_COLOR_DIAGNOSTICS ON)
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_C_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
set(CMAKE_C_STANDARD_REQUIRED ON)
project(Bun VERSION "${Bun_VERSION}")
# More effort to prevent using the wrong C++ compiler
if(UNIX)
if((NOT CMAKE_CXX_COMPILER_ID STREQUAL "Clang") OR (NOT CMAKE_CXX_COMPILER_VERSION MATCHES "^${LLVM_VERSION}\."))
# Attempt to auto-correct the compiler
message(STATUS "Compiler mismatch, attempting to auto-correct")
unset(_LLVM_CLANG_PATH)
BUN_FIND_LLVM()
if((NOT CMAKE_CXX_COMPILER_ID STREQUAL "Clang") OR (NOT CMAKE_CXX_COMPILER_VERSION MATCHES "^${LLVM_VERSION}\."))
message(WARNING "Expected LLVM ${LLVM_VERSION} as the C++ compiler, build may fail or break at runtime.")
endif()
endif()
endif()
message(STATUS "C++ Compiler: ${CMAKE_CXX_COMPILER_ID} ${CMAKE_CXX_COMPILER_VERSION} at ${CMAKE_CXX_COMPILER}")
# --- End LLVM ---
set(DEFAULT_ON_UNLESS_WINDOWS ON)
set(REQUIRED_IF_NOT_WINDOWS "REQUIRED")
if(WIN32)
set(DEFAULT_ON_UNLESS_WINDOWS OFF)
set(REQUIRED_IF_NOT_WINDOWS OFF)
endif()
set(DEFAULT_ON_UNLESS_APPLE ON)
if(APPLE)
set(DEFAULT_ON_UNLESS_APPLE OFF)
endif()
set(CI OFF)
if(DEFINED ENV{CI} OR DEFINED ENV{GITHUB_ACTIONS})
set(CI ON)
endif()
# -- Build Flags --
option(USE_STATIC_SQLITE "Statically link SQLite?" ${DEFAULT_ON_UNLESS_APPLE})
option(USE_CUSTOM_ZLIB "Use Bun's recommended version of zlib" ${DEFAULT_ON_UNLESS_WINDOWS})
option(USE_CUSTOM_BORINGSSL "Use Bun's recommended version of BoringSSL" ON)
option(USE_CUSTOM_LIBARCHIVE "Use Bun's recommended version of libarchive" ON)
option(USE_CUSTOM_MIMALLOC "Use Bun's recommended version of Mimalloc" ON)
option(USE_CUSTOM_ZSTD "Use Bun's recommended version of zstd" ON)
option(USE_CUSTOM_CARES "Use Bun's recommended version of c-ares" ${DEFAULT_ON_UNLESS_WINDOWS})
option(USE_CUSTOM_BASE64 "Use Bun's recommended version of libbase64" ON)
option(USE_CUSTOM_LOLHTML "Use Bun's recommended version of lolhtml" ON)
option(USE_CUSTOM_TINYCC "Use Bun's recommended version of tinycc" ON)
option(USE_CUSTOM_LIBUV "Use Bun's recommended version of libuv (Windows only)" OFF)
option(USE_BASELINE_BUILD "Build Bun for baseline (older) CPUs" OFF)
option(USE_DEBUG_JSC "Enable assertions and use a debug build of JavaScriptCore" OFF)
option(USE_UNIFIED_SOURCES "Use unified sources to speed up the build" OFF)
option(CANARY "Make `bun --revision` report a canary release" OFF)
set(ERROR_LIMIT 100 CACHE STRING "Maximum number of errors to show when compiling C++ code")
set(ARCH x86_64)
if(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64|arm64|arm")
set(ARCH aarch64)
endif()
if(NOT CPU_TARGET)
set(CPU_TARGET "native" CACHE STRING "CPU target for the compiler" FORCE)
if (ARCH STREQUAL "x86_64")
if (NOT MSVC)
if (USE_BASELINE_BUILD)
set(CPU_TARGET "nehalem")
else()
set(CPU_TARGET "haswell")
endif()
endif()
endif()
endif()
message(STATUS "Building for CPU Target: ${CPU_TARGET}")
set(ZIG_TARGET "native")
if(WIN32)
set(ZIG_TARGET "${ARCH}-windows-msvc")
endif()
# set(CONFIGURE_DEPENDS "")
set(CONFIGURE_DEPENDS "CONFIGURE_DEPENDS")
# --- CLI Paths ---
# Zig Compiler
function(validate_zig validator_result_var item)
set(${validator_result_var} FALSE PARENT_SCOPE)
# We will allow any valid zig compiler, as long as it contains some text from `zig zen`
# Ideally we would do a version or feature check, but that would be quite slow
execute_process(COMMAND ${item} zen OUTPUT_VARIABLE ZIG_ZEN_OUTPUT)
if(ZIG_ZEN_OUTPUT MATCHES "Together we serve the users")
set(${validator_result_var} TRUE PARENT_SCOPE)
else()
set(${validator_result_var} FALSE PARENT_SCOPE)
endif()
endfunction()
find_program(ZIG_COMPILER zig REQUIRED DOC "Path to the Zig compiler" VALIDATOR validate_zig)
message(STATUS "Found Zig Compiler: ${ZIG_COMPILER}")
# Bun
if(NOT WIN32)
find_program(BUN_EXECUTABLE bun REQUIRED DOC "Path to an already built release of Bun")
message(STATUS "Found Bun: ${BUN_EXECUTABLE}")
else()
set(BUN_EXECUTABLE "echo")
endif()
# Prettier
find_program(PRETTIER prettier DOC "Path to prettier" PATHS ./node_modules/.bin ENV PATH)
# Esbuild (TODO: switch these to "bun build")
find_program(ESBUILD esbuild DOC "Path to esbuild" PATHS ./node_modules/.bin ENV PATH)
# Ruby (only needed for unified sources)
if(USE_UNIFIED_SOURCES)
# ruby 'WebKit/Source/WTF/Scripts/generate-unified-source-bundles.rb' source_list.txt --source-tree-path . --derived-sources-path build/unified-sources
find_program(RUBY ruby DOC "Path to ruby")
endif()
# CCache
find_program(CCACHE_PROGRAM ccache)
if(CCACHE_PROGRAM)
set(CMAKE_CXX_COMPILER_LAUNCHER "${CCACHE_PROGRAM}")
set(CMAKE_C_COMPILER_LAUNCHER "${CCACHE_PROGRAM}")
message(STATUS "Using ccache: ${CCACHE_PROGRAM}")
endif()
# --- WebKit ---
# WebKit is either prebuilt and distributed via NPM, or you can pass WEBKIT_DIR to use a local build.
# We cannot include their CMake build files (TODO: explain why, for now ask @paperdave why)
#
# On Unix, this will pull from NPM the single package that is needed and use that
if(WIN32)
set(STATIC_LIB_EXT "lib")
set(libJavaScriptCore "JavaScriptCore")
set(libWTF "WTF")
else()
set(STATIC_LIB_EXT "a")
set(libJavaScriptCore "libJavaScriptCore")
set(libWTF "libWTF")
endif()
if(NOT WEBKIT_DIR)
if(WIN32)
message(FATAL_ERROR "Windows does not have prebuilt webkit yet. Please run release-windows.ps1 and pass the path to the built webkit with -DWEBKIT_DIR")
endif()
set(BUN_WEBKIT_PACKAGE_NAME_SUFFIX "")
set(ASSERT_ENABLED "0")
if(USE_DEBUG_JSC)
set(BUN_WEBKIT_PACKAGE_NAME_SUFFIX "-debug")
set(ASSERT_ENABLED "1")
elseif(NOT DEBUG)
set(BUN_WEBKIT_PACKAGE_NAME_SUFFIX "-lto")
set(ASSERT_ENABLED "0")
endif()
if (WIN32)
set(BUN_WEBKIT_PACKAGE_PLATFORM "win32")
elseif(APPLE)
set(BUN_WEBKIT_PACKAGE_PLATFORM "macos")
else()
set(BUN_WEBKIT_PACKAGE_PLATFORM "linux")
endif()
if(ARCH STREQUAL "x86_64")
set(BUN_WEBKIT_PACKAGE_ARCH "amd64")
elseif(ARCH MATCHES "aarch64|arm64|arm")
set(BUN_WEBKIT_PACKAGE_ARCH "arm64")
endif()
set(BUN_WEBKIT_PACKAGE_NAME "bun-webkit-${BUN_WEBKIT_PACKAGE_PLATFORM}-${BUN_WEBKIT_PACKAGE_ARCH}${BUN_WEBKIT_PACKAGE_NAME_SUFFIX}")
message(STATUS "Using Pre-built WebKit: ${BUN_WEBKIT_PACKAGE_NAME}")
execute_process(
COMMAND ${BUN_EXECUTABLE}
"${CMAKE_CURRENT_SOURCE_DIR}/src/codegen/download-webkit.ts"
"--outdir=${BUN_WORKDIR}/bun-webkit"
"--tag=${WEBKIT_TAG}"
"--package=${BUN_WEBKIT_PACKAGE_NAME}"
WORKING_DIRECTORY ${BUN_WORKDIR}
)
if(NOT EXISTS "${BUN_WORKDIR}/bun-webkit")
message(FATAL_ERROR "Prebuilt WebKit package ${BUN_WEBKIT_PACKAGE_NAME} failed to install")
endif()
set(WEBKIT_INCLUDE_DIR "${BUN_WORKDIR}/bun-webkit/include")
set(WEBKIT_LIB_DIR "${BUN_WORKDIR}/bun-webkit/lib")
else()
# Setting WEBKIT_DIR means you either have a path to the WebKit repo, or you have a path to packaged webkit
# Non-packaged webkit has CMakeLists.txt
if(EXISTS "${WEBKIT_DIR}/CMakeLists.txt")
# Since we may be doing a Debug build of Bun but with a Release build of JSC, we can't
# include their CMakeLists directly here, but rather we need to run `cmake` as a dependency
# of our build. It'll still have decent caching which is what really matters.
# cmake WEBKIT_DIR -B WEBKIT_DIR/WebKitBuild/WEBKIT_BUILD_TYPE
# -DPORT=JSCOnly
# -DENABLE_STATIC_JSC=ON
# -DENABLE_SINGLE_THREADED_VM_ENTRY_SCOPE=ON
# -DCMAKE_BUILD_TYPE=Debug
# -DENABLE_BUN_SKIP_FAILING_ASSERTIONS=ON
# -DUSE_THIN_ARCHIVES=OFF
# -DENABLE_FTL_JIT=ON
# -DCMAKE_C_COMPILER=(which clang-16)
# -DCMAKE_CXX_COMPILER=(which clang++-16)
# -DUSE_BUN_JSC_ADDITIONS=1
# -DCMAKE_EXE_LINKER_FLAGS="-fuse-ld=lld"
# -DCMAKE_AR=$(which llvm-ar)
# -DCMAKE_RANLIB=$(which llvm-ranlib)
# -DALLOW_LINE_AND_COLUMN_NUMBER_IN_BUILTINS=ON
# -G Ninja
# -DCMAKE_OSX_DEPLOYMENT_TARGET=11.0
# -DPTHREAD_JIT_PERMISSIONS_API=1
# -DUSE_PTHREAD_JIT_PERMISSIONS_API=ON
# -DENABLE_REMOTE_INSPECTOR=ON
message(FATAL_ERROR "TODO: support setting WEBKIT_DIR to the WebKit repository to enable automatic builds. For now, run the release script and point WEBKIT_DIR at the packaged directory.")
else()
if(NOT EXISTS "${WEBKIT_DIR}/lib/${libWTF}.${STATIC_LIB_EXT}" OR NOT EXISTS "${WEBKIT_DIR}/lib/${libJavaScriptCore}.${STATIC_LIB_EXT}")
if(WEBKIT_DIR MATCHES "src/bun.js/WebKit$")
message(FATAL_ERROR "WebKit directory ${WEBKIT_DIR} does not contain all the required files for Bun. Did you forget to init submodules?")
endif()
message(FATAL_ERROR "WebKit directory ${WEBKIT_DIR} does not contain all the required files for Bun. Expected a path to the oven-sh/WebKit repository, or a path to a folder containing `include` and `lib`.")
endif()
set(WEBKIT_INCLUDE_DIR "${WEBKIT_DIR}/include")
set(WEBKIT_LIB_DIR "${WEBKIT_DIR}/lib")
message(STATUS "Using specified WebKit directory: ${WEBKIT_DIR}")
set(ASSERT_ENABLED "0")
message(STATUS "WebKit assertions: OFF")
endif()
endif()
# --- CMake Macros ---
# Append the given dependencies to the source file
macro(WEBKIT_ADD_SOURCE_DEPENDENCIES _source _deps)
set(_tmp)
get_source_file_property(_tmp ${_source} OBJECT_DEPENDS)
if(NOT _tmp)
set(_tmp "")
endif()
foreach(f ${_deps})
list(APPEND _tmp "${f}")
endforeach()
set_source_files_properties(${_source} PROPERTIES OBJECT_DEPENDS "${_tmp}")
unset(_tmp)
endmacro()
# --- BUILD ---
set(BUN_SRC "${CMAKE_CURRENT_SOURCE_DIR}/src")
set(BUN_DEPS_DIR "${BUN_SRC}/deps")
set(BUN_CODEGEN_SRC "${BUN_SRC}/codegen")
file(GLOB BUN_CPP ${CONFIGURE_DEPENDS}
"${BUN_SRC}/deps/*.cpp"
"${BUN_SRC}/io/*.cpp"
"${BUN_SRC}/bun.js/modules/*.cpp"
"${BUN_SRC}/bun.js/bindings/*.cpp"
"${BUN_SRC}/bun.js/bindings/webcore/*.cpp"
"${BUN_SRC}/bun.js/bindings/sqlite/*.cpp"
"${BUN_SRC}/bun.js/bindings/webcrypto/*.cpp"
"${BUN_SRC}/bun.js/bindings/webcrypto/*/*.cpp"
"${BUN_SRC}/deps/picohttpparser/picohttpparser.c"
)
set(USOCKETS_SRC "${CMAKE_CURRENT_SOURCE_DIR}/packages/bun-usockets/src")
file(GLOB USOCKETS_FILES ${CONFIGURE_DEPENDS}
"${USOCKETS_SRC}/*.c"
"${USOCKETS_SRC}/eventing/*.c"
"${USOCKETS_SRC}/internal/*.c"
"${USOCKETS_SRC}/crypto/*.c"
"${USOCKETS_SRC}/crypto/*.cpp"
)
# --- Classes Generator ---
file(GLOB BUN_CLASSES_TS ${CONFIGURE_DEPENDS}
"${BUN_SRC}/bun.js/*.classes.ts"
"${BUN_SRC}/bun.js/api/*.classes.ts"
"${BUN_SRC}/bun.js/test/*.classes.ts"
"${BUN_SRC}/bun.js/webcore/*.classes.ts"
"${BUN_SRC}/bun.js/node/*.classes.ts"
)
add_custom_command(
OUTPUT "${BUN_WORKDIR}/codegen/ZigGeneratedClasses.h"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses.cpp"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses+lazyStructureHeader.h"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses+DOMClientIsoSubspaces.h"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses+DOMIsoSubspaces.h"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses+lazyStructureImpl.h"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses.zig"
COMMAND ${BUN_EXECUTABLE} "${BUN_CODEGEN_SRC}/generate-classes.ts" ${BUN_CLASSES_TS} "${BUN_WORKDIR}/codegen"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
MAIN_DEPENDENCY "${BUN_CODEGEN_SRC}/generate-classes.ts"
DEPENDS ${BUN_CLASSES_TS}
VERBATIM
COMMENT "Generating *.classes.ts bindings"
)
# --- JSSink Generator ---
add_custom_command(
OUTPUT "${BUN_WORKDIR}/codegen/JSSink.cpp"
"${BUN_WORKDIR}/codegen/JSSink.h"
COMMAND ${BUN_EXECUTABLE} "src/codegen/generate-jssink.ts" "${BUN_WORKDIR}/codegen"
VERBATIM
MAIN_DEPENDENCY "src/codegen/generate-jssink.ts"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Generating JSSink"
)
# --- .lut.h Generator ---
set(BUN_OBJECT_LUT_SOURCES
bun.js/bindings/BunObject.cpp
bun.js/bindings/ZigGlobalObject.lut.txt
bun.js/bindings/JSBuffer.cpp
bun.js/bindings/BunProcess.cpp
bun.js/bindings/ProcessBindingConstants.cpp
bun.js/bindings/ProcessBindingNatives.cpp
)
set(BUN_HASH_LUT_GENERATOR "${BUN_CODEGEN_SRC}/create-hash-table.ts")
macro(GENERATE_HASH_LUT _input _output _display_name)
add_custom_command(
OUTPUT ${_output}
MAIN_DEPENDENCY ${BUN_HASH_LUT_GENERATOR}
DEPENDS ${_input}
COMMAND ${BUN_EXECUTABLE} ${BUN_HASH_LUT_GENERATOR} ${_input} ${_output}
VERBATIM
COMMENT "Generating ${_display_name}"
)
# list(APPEND JavaScriptCore_HEADERS ${_output})
WEBKIT_ADD_SOURCE_DEPENDENCIES(${_input} ${_output})
endmacro()
foreach(_file ${BUN_OBJECT_LUT_SOURCES})
if(NOT EXISTS "${BUN_SRC}/${_file}")
message(FATAL_ERROR "Could not find ${_file} needed for LUT generation")
endif()
get_filename_component(_name ${_file} NAME_WE)
# workaround for ZigGlobalObject
if(_name MATCHES "ZigGlobalObject")
set(_name "ZigGlobalObject")
endif()
GENERATE_HASH_LUT(${BUN_SRC}/${_file} ${BUN_WORKDIR}/codegen/${_name}.lut.h ${_name}.lut.h)
endforeach()
WEBKIT_ADD_SOURCE_DEPENDENCIES(${BUN_SRC}/bun.js/bindings/ZigGlobalObject.cpp ${BUN_WORKDIR}/codegen/ZigGlobalObject.lut.h)
# --- Identifier Cache ---
set(BUN_IDENTIFIER_CACHE_OUT
"${BUN_SRC}/js_lexer/id_continue_bitset.blob"
"${BUN_SRC}/js_lexer/id_continue_bitset.meta.blob"
"${BUN_SRC}/js_lexer/id_start_bitset.blob"
"${BUN_SRC}/js_lexer/id_start_bitset.meta.blob")
add_custom_command(
OUTPUT ${BUN_IDENTIFIER_CACHE_OUT}
MAIN_DEPENDENCY "${BUN_SRC}/js_lexer/identifier_data.zig"
DEPENDS "${BUN_SRC}/js_lexer/identifier_cache.zig"
COMMAND ${ZIG_COMPILER} run "${BUN_SRC}/js_lexer/identifier_data.zig"
VERBATIM
COMMENT "Building Identifier Cache"
)
# --- Bundled TS/JS ---
# Note: It's not worth doing this in parallel at the CMake/Ninja level, because this bundling
# requires all the JS files to be known up front, and Bun will use all cores during bundling anyway.
file(GLOB BUN_TS_MODULES ${CONFIGURE_DEPENDS}
"${BUN_SRC}/js/node/*.ts"
"${BUN_SRC}/js/node/*.js"
"${BUN_SRC}/js/bun/*.js"
"${BUN_SRC}/js/bun/*.ts"
"${BUN_SRC}/js/thirdparty/*.js"
"${BUN_SRC}/js/thirdparty/*.ts"
"${BUN_SRC}/js/internal/*.js"
"${BUN_SRC}/js/internal/*.ts"
)
file(GLOB BUN_TS_FUNCTIONS ${CONFIGURE_DEPENDS} "${BUN_SRC}/js/builtins/*.ts")
add_custom_command(
OUTPUT
"${BUN_WORKDIR}/codegen/InternalModuleRegistryConstants.h"
"${BUN_WORKDIR}/codegen/InternalModuleRegistry+createInternalModuleById.h"
"${BUN_WORKDIR}/codegen/InternalModuleRegistry+enum.h"
"${BUN_WORKDIR}/codegen/InternalModuleRegistry+numberOfModules.h"
"${BUN_WORKDIR}/codegen/NativeModuleImpl.h"
"${BUN_WORKDIR}/codegen/ResolvedSourceTag.zig"
"${BUN_WORKDIR}/codegen/SyntheticModuleType.h"
COMMAND ${BUN_EXECUTABLE} "${BUN_SRC}/codegen/bundle-modules.ts" "${BUN_WORKDIR}"
DEPENDS ${BUN_TS_MODULES}
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Bundling JS modules"
)
WEBKIT_ADD_SOURCE_DEPENDENCIES(
"${BUN_SRC}/bun.js/bindings/InternalModuleRegistry.cpp"
"${BUN_WORKDIR}/codegen/InternalModuleRegistryConstants.h"
)
add_custom_command(
OUTPUT "${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.cpp"
"${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.h"
COMMAND ${BUN_EXECUTABLE} "${BUN_SRC}/codegen/bundle-functions.ts" "${BUN_WORKDIR}"
DEPENDS ${BUN_TS_FUNCTIONS} "${BUN_SRC}/codegen/bundle-functions.ts"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Bundling JS builtin functions"
)
# --- Peechy API ---
add_custom_command(
OUTPUT "${BUN_SRC}/api/schema.js"
"${BUN_SRC}/api/schema.d.ts"
"${BUN_SRC}/api/schema.zig"
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
COMMAND "${CMAKE_CURRENT_SOURCE_DIR}/node_modules/.bin/peechy"
"--schema" "${BUN_SRC}/api/schema.peechy"
"--esm" "${BUN_SRC}/api/schema.js"
"--ts" "${BUN_SRC}/api/schema.d.ts"
"--zig" "${BUN_SRC}/api/schema.zig"
COMMAND "${ZIG_COMPILER}" "fmt" "src/api/schema.zig"
COMMAND "${PRETTIER}" "--config=.prettierrc.cjs" "--write" "src/api/schema.js" "src/api/schema.d.ts"
DEPENDS "${BUN_SRC}/api/schema.peechy"
COMMENT "Building schema"
)
add_custom_command(
OUTPUT "${BUN_SRC}/analytics/analytics_schema.zig"
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
COMMAND "${CMAKE_CURRENT_SOURCE_DIR}/node_modules/.bin/peechy"
"--schema" "${BUN_SRC}/analytics/schema.peechy"
"--zig" "${BUN_SRC}/analytics/analytics_schema.zig"
COMMAND "${ZIG_COMPILER}" "fmt" "${BUN_SRC}/analytics/analytics_schema.zig"
DEPENDS "${BUN_SRC}/analytics/schema.peechy"
COMMENT "Building analytics_schema.zig"
)
# --- Zig Object ---
file(GLOB ZIG_FILES
"${BUN_SRC}/*.zig"
"${BUN_SRC}/**/*.zig"
"${BUN_SRC}/**/**/*.zig"
"${BUN_SRC}/**/**/**/*.zig"
)
if(DEBUG)
set(BUN_ZIG_OBJ "${BUN_WORKDIR}/CMakeFiles/bun-debug.o")
else()
set(BUN_ZIG_OBJ "${BUN_WORKDIR}/CMakeFiles/bun.o")
endif()
add_custom_command(
OUTPUT "${BUN_ZIG_OBJ}"
COMMAND
"${ZIG_COMPILER}" "build" "obj"
"-Doutput-dir=${BUN_WORKDIR}/CMakeFiles"
"-Dgenerated-code=${BUN_WORKDIR}/codegen"
"-Dversion=${Bun_VERSION}"
"-Dcanary=$<IF:$<BOOL:${CANARY}>,true,false>"
"-Doptimize=${ZIG_OPTIMIZE}"
"-Dcpu=${CPU_TARGET}"
"-Dtarget=${ZIG_TARGET}"
DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/build.zig"
"${ZIG_FILES}"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses.zig"
"${BUN_WORKDIR}/codegen/ResolvedSourceTag.zig"
"${BUN_IDENTIFIER_CACHE_OUT}"
"${BUN_SRC}/api/schema.zig"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Building zig code"
VERBATIM
# This is here to show Zig's progress indicator
USES_TERMINAL
)
set(BUN_EXTRA_SOURCES "")
if(WIN32)
set(BUN_EXTRA_SOURCES "${BUN_SRC}/bun.js/bindings/windows/musl-memmem.c")
include_directories("${BUN_SRC}/bun.js/bindings/windows")
endif()
# --- The Buntime™ ---
add_executable(
${bun}
${BUN_CPP}
${USOCKETS_FILES}
${BUN_ZIG_OBJ}
"${BUN_WORKDIR}/codegen/JSSink.cpp"
"${BUN_WORKDIR}/codegen/ZigGeneratedClasses.cpp"
"${BUN_WORKDIR}/codegen/WebCoreJSBuiltins.cpp"
"${BUN_ZIG_OBJ}"
"${BUN_EXTRA_SOURCES}"
)
set_target_properties(${bun} PROPERTIES
CXX_STANDARD 20
CXX_STANDARD_REQUIRED YES
CXX_EXTENSIONS YES
CXX_VISIBILITY_PRESET hidden
C_STANDARD 17
C_STANDARD_REQUIRED YES
VISIBILITY_INLINES_HIDDEN YES
)
# Set /subsystem:console on bun for windows
if(WIN32)
set_target_properties(${bun} PROPERTIES LINK_FLAGS " /SUBSYSTEM:CONSOLE ")
endif()
add_compile_definitions(
# TODO: are all of these variables strictly necessary?
"_HAS_EXCEPTIONS=0"
"LIBUS_USE_OPENSSL=1"
"UWS_HTTPRESPONSE_NO_WRITEMARK=1"
"LIBUS_USE_BORINGSSL=1"
"WITH_BORINGSSL=1"
"STATICALLY_LINKED_WITH_JavaScriptCore=1"
"STATICALLY_LINKED_WITH_WTF=1"
"STATICALLY_LINKED_WITH_BMALLOC=1"
"BUILDING_WITH_CMAKE=1"
"JSC_OBJC_API_ENABLED=0"
"BUN_SINGLE_THREADED_PER_VM_ENTRY_SCOPE=1"
"NAPI_EXPERIMENTAL=ON"
"NOMINMAX"
"IS_BUILD"
"BUILDING_JSCONLY__"
"ASSERT_ENABLED=$<IF:$<CONFIG:ASSERT_ENABLED>,1,0>"
"BUN_DYNAMIC_JS_LOAD_PATH=\"${BUN_WORKDIR}/js\""
)
if(NOT ASSERT_ENABLED)
add_compile_definitions("NDEBUG=1")
endif()
include_directories(
${CMAKE_CURRENT_SOURCE_DIR}/packages/
${CMAKE_CURRENT_SOURCE_DIR}/packages/bun-usockets
${CMAKE_CURRENT_SOURCE_DIR}/packages/bun-usockets/src
${CMAKE_CURRENT_SOURCE_DIR}/src/bun.js/bindings
${CMAKE_CURRENT_SOURCE_DIR}/src/bun.js/bindings/webcore
${CMAKE_CURRENT_SOURCE_DIR}/src/bun.js/bindings/webcrypto
${CMAKE_CURRENT_SOURCE_DIR}/src/bun.js/bindings/sqlite
${CMAKE_CURRENT_SOURCE_DIR}/src/bun.js/modules
${CMAKE_CURRENT_SOURCE_DIR}/src/js/builtins
${CMAKE_CURRENT_SOURCE_DIR}/src/napi
${CMAKE_CURRENT_SOURCE_DIR}/src/deps
${CMAKE_CURRENT_SOURCE_DIR}/src/deps/picohttpparser
${WEBKIT_INCLUDE_DIR}
"${BUN_WORKDIR}/codegen"
)
# --- clang and linker flags ---
if(CMAKE_BUILD_TYPE STREQUAL "Debug")
if(NOT MSVC)
target_compile_options(${bun} PUBLIC -g3 -O1)
endif()
add_compile_definitions("BUN_DEBUG=1")
elseif(CMAKE_BUILD_TYPE STREQUAL "Release")
if (MSVC)
target_compile_options(${bun} PUBLIC /O2)
else()
target_compile_options(${bun} PUBLIC -O3 -flto=full -emit-llvm)
endif()
endif()
if(NOT MSVC)
if(NOT CI)
target_compile_options(${bun} PRIVATE -fdiagnostics-color=always)
endif()
target_compile_options(${bun} PUBLIC
-march=${CPU_TARGET}
-mtune=${CPU_TARGET}
-fconstexpr-steps=1271242
-fconstexpr-depth=27
-fno-exceptions
-fvisibility=hidden
-fvisibility-inlines-hidden
-fno-rtti
-ferror-limit=${ERROR_LIMIT}
-fPIC
-fno-omit-frame-pointer
)
string(APPEND CMAKE_CXX_FLAGS " -std=c++2a ")
else() # MSVC
string(APPEND SUPPRESS_WARNING_NUMBERS
# JSC deletes operator delete to prevent accidental use
"/wd4291 "
# we use #pragma mark in some places
"/wd4068"
)
string(APPEND CMAKE_CXX_FLAGS " /EHsc /GR-")
string(APPEND CMAKE_C_FLAGS " /EHsc /GR- ${SUPPRESS_WARNING_NUMBERS} /experimental:c11atomics /std:c17")
string(APPEND CMAKE_CXX_FLAGS " /Zc:__cplusplus /Zc:inline /bigobj ${SUPPRESS_WARNING_NUMBERS}")
endif()
if(APPLE)
if(ARCH STREQUAL "x86_64")
set(CMAKE_OSX_DEPLOYMENT_TARGET "10.14")
else()
set(CMAKE_OSX_DEPLOYMENT_TARGET "11.0")
endif()
target_link_options(${bun} PUBLIC "-dead_strip")
target_link_options(${bun} PUBLIC "-dead_strip_dylibs")
target_link_options(${bun} PUBLIC "-exported_symbols_list" "${BUN_SRC}/symbols.txt")
set_target_properties(${bun} PROPERTIES LINK_DEPENDS "${BUN_SRC}/symbols.txt")
target_link_options(${bun} PUBLIC "-fno-keep-static-consts")
target_link_libraries(${bun} PRIVATE "resolv")
endif()
if(UNIX AND NOT APPLE)
target_link_options(${bun} PUBLIC
"-static-libstdc++"
"-static-libgcc"
"-fuse-ld=lld"
"-Wl,-z,now"
"-Wl,--as-needed"
"-Wl,--gc-sections"
"-Wl,-z,stack-size=12800000"
"-Wl,--wrap=fcntl"
"-Wl,--wrap=fcntl64"
"-Wl,--wrap=stat64"
"-Wl,--wrap=pow"
"-Wl,--wrap=exp"
"-Wl,--wrap=log"
"-Wl,--wrap=log2"
"-Wl,--wrap=lstat"
"-Wl,--wrap=stat"
"-Wl,--wrap=fstat"
"-Wl,--wrap=fstatat"
"-Wl,--wrap=lstat64"
"-Wl,--wrap=stat64"
"-Wl,--wrap=fstat64"
"-Wl,--wrap=fstatat64"
"-Wl,--wrap=mknod"
"-Wl,--wrap=mknodat"
"-Wl,--wrap=statx "
"-Wl,--compress-debug-sections=zlib"
"-Bsymbolics-functions"
"-rdynamic"
"-Wl,--dynamic-list=${BUN_SRC}/symbols.dyn"
"-Wl,--version-script=${BUN_SRC}/linker.lds"
)
target_link_libraries(${bun} PRIVATE "c")
target_link_libraries(${bun} PRIVATE "libatomic.a")
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/libicudata.a")
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/libicui18n.a")
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/libicuuc.a")
set_target_properties(${bun} PROPERTIES LINK_DEPENDS "${BUN_SRC}/linker.lds")
set_target_properties(${bun} PROPERTIES LINK_DEPENDS "${BUN_SRC}/symbols.dyn")
endif()
if(WIN32)
add_compile_definitions(
"WIN32"
"_WINDOWS"
"_CRT_SECURE_NO_WARNINGS"
"WIN32_LEAN_AND_MEAN=1"
)
set_property(TARGET ${bun} PROPERTY MSVC_RUNTIME_LIBRARY "MultiThreadedDLL")
endif()
if(APPLE)
# TODO: a much better check can be done to find this path
find_path(
ICU4C_DIR NAMES lib/libicudata.a
PATHS ENV PATH /usr/local/opt/icu4c /opt/homebrew/opt/icu4c
)
find_path(
ICONV_DIR NAMES lib/libiconv.a
PATHS ENV PATH /usr/local/opt/libiconv /opt/homebrew/opt/libiconv
)
target_link_libraries(${bun} PRIVATE "icucore")
target_link_libraries(${bun} PRIVATE "${ICONV_DIR}/lib/libiconv.a")
target_link_libraries(${bun} PRIVATE "${ICU4C_DIR}/lib/libicudata.a")
target_link_libraries(${bun} PRIVATE "${ICU4C_DIR}/lib/libicui18n.a")
target_link_libraries(${bun} PRIVATE "${ICU4C_DIR}/lib/libicuuc.a")
include_directories(${ICU4C_DIR}/include)
endif()
# --- Stripped Binary "bun" ---
if(CMAKE_BUILD_TYPE STREQUAL "Release" AND NOT WIN32)
add_custom_command(
TARGET ${bun}
POST_BUILD
COMMAND ${DSYMUTIL} -o ${BUN_WORKDIR}/bun.dSYM ${BUN_WORKDIR}/${bun}
COMMENT "Extracting debug symbols (dSYM)"
)
add_custom_command(
TARGET ${bun}
POST_BUILD
COMMAND ${STRIP} -s -x -S -o ${BUN_WORKDIR}/bun ${BUN_WORKDIR}/${bun}
COMMENT "Stripping Symbols"
)
endif()
# --- Dependencies ---
if(USE_CUSTOM_ZLIB AND (NOT WIN32))
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libz.a")
include_directories(${BUN_DEPS_DIR}/zlib/include)
else()
find_package(ZLIB REQUIRED)
target_link_libraries(${bun} PRIVATE ZLIB::ZLIB)
endif()
if(USE_CUSTOM_BORINGSSL)
include_directories(src/deps/boringssl/include)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/crypto.lib")
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/ssl.lib")
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/decrepit.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libcrypto.a")
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libssl.a")
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libdecrepit.a")
endif()
else()
include(FindBoringSSL)
FindBoringSSL(${bun})
endif()
if(USE_CUSTOM_LIBARCHIVE)
include_directories(src/deps/libarchive/include)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/archive.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libarchive.a")
endif()
else()
find_package(LibArchive REQUIRED)
target_link_libraries(${bun} PRIVATE LibArchive::LibArchive)
endif()
if(USE_CUSTOM_MIMALLOC)
include_directories(src/deps/mimalloc/include)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/mimalloc-static.lib")
elseif(APPLE)
# https://github.com/microsoft/mimalloc/issues/512
# Linking mimalloc via object file on macOS x64 can cause heap corruption
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libmimalloc.a")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libmimalloc.o")
endif()
else()
find_package(mimalloc REQUIRED)
target_link_libraries(${bun} PRIVATE mimalloc)
endif()
if(USE_CUSTOM_ZSTD)
include_directories(src/deps/zstd/include)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/zstd.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libzstd.a")
endif()
else()
find_package(zstd CONFIG REQUIRED)
target_link_libraries(${bun} PRIVATE zstd::libzstd)
endif()
if(USE_CUSTOM_CARES)
include_directories(src/deps/c-ares/include)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/cares.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libcares.a")
endif()
else()
find_package(c-ares CONFIG REQUIRED)
target_link_libraries(${bun} PRIVATE c-ares::cares)
endif()
if(USE_CUSTOM_BASE64)
include_directories(src/deps/base64/include)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/base64.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libbase64.a")
endif()
else()
find_package(base64 REQUIRED)
target_link_libraries(${bun} PRIVATE base64::base64)
endif()
if(NOT WIN32)
if (USE_CUSTOM_TINYCC)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/tcc.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/libtcc.a")
endif()
else()
find_package(tinycc REQUIRED)
target_link_libraries(${bun} PRIVATE tinycc::tinycc)
endif()
endif()
if(USE_CUSTOM_LOLHTML)
if (WIN32)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/lolhtml.lib")
else()
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/liblolhtml.a")
endif()
else()
find_package(lolhtml REQUIRED)
target_link_libraries(${bun} PRIVATE lolhtml::lolhtml)
endif()
if(WIN32)
if (USE_CUSTOM_LIBUV)
target_link_libraries(${bun} PRIVATE "${BUN_DEPS_DIR}/uv.lib")
else()
find_package(libuv CONFIG REQUIRED )
target_link_libraries(${bun} PRIVATE $<IF:$<TARGET_EXISTS:libuv::uv_a>,libuv::uv_a,libuv::uv>)
endif()
message(STATUS "Found libuv: ${libuv_LIBRARIES}")
endif()
if(USE_STATIC_SQLITE)
add_library(sqlite3 STATIC src/bun.js/bindings/sqlite/sqlite3.c)
target_include_directories(sqlite3 PUBLIC src/bun.js/bindings/sqlite)
target_link_libraries(${bun} PRIVATE sqlite3)
message(STATUS "Using static sqlite3")
target_compile_definitions(${bun} PRIVATE "LAZY_LOAD_SQLITE=0")
else()
message(STATUS "Using dynamically linked sqlite3")
target_compile_definitions(${bun} PRIVATE "LAZY_LOAD_SQLITE=1")
endif()
if(NOT MSVC)
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/libWTF.a")
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/libJavaScriptCore.a")
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/libbmalloc.a")
else()
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/WTF.lib")
target_link_libraries(${bun} PRIVATE "${WEBKIT_LIB_DIR}/JavaScriptCore.lib")
if (WIN32)
string (APPEND CMAKE_CXX_FLAGS
" /external:anglebrackets /Gs- /Zi"
)
string (APPEND CMAKE_FLAGS
" /external:anglebrackets /Gs- /Zi"
)
set_target_properties(${bun} PROPERTIES LINK_FLAGS " /SUBSYSTEM:CONSOLE /STACK:4194304,2097152")
endif()
if (DEFINED ENV{VCPKG_ROOT})
include_directories($ENV{VCPKG_ROOT}/installed/x64-windows/include)
endif()
# include_directories(C:/Users/windo/Build/WebKit/WebKitBuild/WTF/DerivedSources)
# include_directories(C:/Users/windo/Build/WebKit/WebKitBuild/WTF/Headers)
target_include_directories(${bun} PUBLIC C:/Users/windo/Code/WebKit/WebKitLibraries/win/include)
target_link_directories(${bun} PUBLIC C:/Users/windo/Code/WebKit/WebKitLibraries/win/lib64)
target_link_directories(${bun} PUBLIC C:/Users/windo/Code/lib64)
target_link_libraries(${bun} PUBLIC icuuc icudt icutu icuio icuin icutest)
target_link_libraries(${bun} PUBLIC winmm ws2_32 bcrypt ntdll kernel32 shell32 shlwapi advapi32 vcruntime ucrt legacy_stdio_definitions)
endif()
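
A rough, untested sketch (not part of this diff) of how the new CMakeLists.txt above might be driven locally, assuming clang/LLVM 16, Zig, Bun, and Ninja are already on PATH; the -D options are the ones the file itself defines, and the paths are placeholders:

cmake -B build -G Ninja -DCMAKE_BUILD_TYPE=Debug
# Optional knobs defined above (values here are hypothetical):
#   -DLLVM_PREFIX=/usr/lib/llvm-16          use a specific LLVM 16 install instead of auto-detection
#   -DWEBKIT_DIR=/path/to/packaged-webkit   use a locally packaged WebKit instead of the prebuilt download
#   -DUSE_DEBUG_JSC=ON                      pull the assertion-enabled debug JavaScriptCore package
#   -DCANARY=ON                             make `bun --revision` report a canary release
ninja -C build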

@@ -10,7 +10,7 @@ Today (February 2023), Bun's codebase has five distinct parts:
- JavaScript, JSX, & TypeScript transpiler, module resolver, and related code
- JavaScript runtime ([`src/bun.js/`](src/bun.js/))
- JavaScript runtime bindings ([`src/bun.zig/bindings/**/*.cpp`](src/bun.zig/bindings/))
- JavaScript runtime bindings ([`src/bun.js/bindings/**/*.cpp`](src/bun.js/bindings/))
- Package manager ([`src/install/`](src/install/))
- Shared utilities ([`src/string_immutable.zig`](src/string_immutable.zig))

@@ -1,685 +1,90 @@
ARG DEBIAN_FRONTEND=noninteractive
ARG GITHUB_WORKSPACE=/build
ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
ARG CPU_TARGET=native
ARG ARCH=x86_64
ARG BUILD_MACHINE_ARCH=x86_64
ARG TRIPLET=${ARCH}-linux-gnu
ARG BUILDARCH=amd64
ARG WEBKIT_TAG=2023-oct3
ARG ZIG_TAG=jul1
ARG ZIG_VERSION="0.12.0-dev.163+6780a6bbf"
ARG WEBKIT_BASENAME="bun-webkit-linux-$BUILDARCH"
FROM bitnami/minideb:bullseye as base
ARG CLANG_VERSION="16"
ARG NODE_VERSION="20"
ARG ZIG_VERSION="0.12.0-dev.1114+e8f3c4c4b"
ARG DEBIAN_FRONTEND="noninteractive"
RUN apt-get update -y \
&& install_packages \
ca-certificates \
curl \
gnupg \
&& echo "deb https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-${CLANG_VERSION} main" > /etc/apt/sources.list.d/llvm.list \
&& echo "deb-src https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-${CLANG_VERSION} main" >> /etc/apt/sources.list.d/llvm.list \
&& curl -fsSL "https://apt.llvm.org/llvm-snapshot.gpg.key" | apt-key add - \
&& echo "deb https://deb.nodesource.com/node_${NODE_VERSION}.x nodistro main" > /etc/apt/sources.list.d/nodesource.list \
&& curl -fsSL "https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key" | apt-key add - \
&& echo "deb https://apt.kitware.com/ubuntu/ focal main" > /etc/apt/sources.list.d/kitware.list \
&& curl -fsSL "https://apt.kitware.com/keys/kitware-archive-latest.asc" | apt-key add - \
&& install_packages \
wget \
bash \
lsb-release \
software-properties-common \
build-essential \
autoconf \
automake \
libtool \
pkg-config \
clang-${CLANG_VERSION} \
lld-${CLANG_VERSION} \
lldb-${CLANG_VERSION} \
clangd-${CLANG_VERSION} \
make \
cmake \
ccache \
ninja-build \
file \
gnupg \
libc-dev \
libxml2 \
libxml2-dev \
xz-utils \
libtcc-dev \
git \
tar \
rsync \
gzip \
unzip \
perl \
python3 \
ruby \
golang \
nodejs \
&& ln -s /usr/bin/clang-${CLANG_VERSION} /usr/bin/clang \
&& ln -s /usr/bin/clang++-${CLANG_VERSION} /usr/bin/clang++ \
&& ln -s /usr/bin/lld-${CLANG_VERSION} /usr/bin/lld \
&& ln -s /usr/bin/lldb-${CLANG_VERSION} /usr/bin/lldb \
&& ln -s /usr/bin/clangd-${CLANG_VERSION} /usr/bin/clangd \
&& ln -s /usr/bin/llvm-ar-${CLANG_VERSION} /usr/bin/llvm-ar \
&& arch="$(dpkg --print-architecture)" \
&& case "${arch##*-}" in \
amd64) variant="x86_64";; \
arm64) variant="aarch64";; \
*) echo "error: unsupported architecture: $arch"; exit 1 ;; \
esac \
&& echo "https://ziglang.org/builds/zig-linux-${variant}-${ZIG_VERSION}.tar.xz" \
&& curl -fsSL "https://ziglang.org/builds/zig-linux-${variant}-${ZIG_VERSION}.tar.xz" | tar xJ --strip-components=1 \
&& mv zig /usr/bin/zig \
&& curl "https://sh.rustup.rs" -sSf | sh -s -- -y \
&& mv ${HOME}/.cargo/bin/* /usr/bin/ \
&& npm install -g bun esbuild
ARG CXX="clang++-${CLANG_VERSION}"
ARG CC="clang-${CLANG_VERSION}"
ARG LD="lld-${CLANG_VERSION}"
ARG AR="/usr/bin/llvm-ar-${CLANG_VERSION}"
COPY package.json package.json
COPY Makefile Makefile
COPY CMakeLists.txt CMakeLists.txt
COPY src/ src/
COPY packages/bun-usockets/ packages/bun-usockets/
COPY packages/bun-uws/ packages/bun-uws/
COPY .scripts/ .scripts/
COPY .build/ .build/
COPY *.zig ./
RUN ./.build/base64.bash
ARG ZIG_FOLDERNAME=zig-linux-${BUILD_MACHINE_ARCH}-${ZIG_VERSION}
ARG ZIG_FILENAME=${ZIG_FOLDERNAME}.tar.xz
ARG WEBKIT_URL="https://github.com/oven-sh/WebKit/releases/download/$WEBKIT_TAG/${WEBKIT_BASENAME}.tar.gz"
ARG ZIG_URL="https://ziglang.org/builds/${ZIG_FILENAME}"
ARG GIT_SHA=""
ARG BUN_BASE_VERSION=1.0
FROM bitnami/minideb:bullseye as bun-base
RUN install_packages ca-certificates curl wget lsb-release software-properties-common gnupg gnupg1 gnupg2 && \
echo "deb https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-16 main" > /etc/apt/sources.list.d/llvm.list && \
echo "deb-src https://apt.llvm.org/bullseye/ llvm-toolchain-bullseye-16 main" >> /etc/apt/sources.list.d/llvm.list && \
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | apt-key add - && \
curl -fsSL https://deb.nodesource.com/setup_lts.x | bash - && \
install_packages \
cmake \
file \
git \
gnupg \
libc-dev \
libxml2 \
libxml2-dev \
make \
ninja-build \
perl \
python3 \
rsync \
ruby \
unzip \
clang-16 \
lld-16 \
lldb-16 \
clangd-16 \
xz-utils \
bash tar gzip ccache nodejs && \
npm install -g esbuild
ENV CXX=clang++-16
ENV CC=clang-16
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG BUILDARCH
ARG ZIG_PATH
ARG WEBKIT_URL
ARG ZIG_URL
ARG ZIG_FOLDERNAME
ARG ZIG_FILENAME
ENV WEBKIT_OUT_DIR=${WEBKIT_DIR}
ENV BUILDARCH=${BUILDARCH}
ENV AR=/usr/bin/llvm-ar-16
ENV ZIG "${ZIG_PATH}/zig"
ENV PATH="$ZIG/bin:$PATH"
ENV LD=lld-16
RUN mkdir -p $BUN_DIR $BUN_DEPS_OUT_DIR
FROM bun-base as bun-base-with-zig-and-webkit
WORKDIR $GITHUB_WORKSPACE
ADD $ZIG_URL .
RUN tar xf ${ZIG_FILENAME} && \
rm ${ZIG_FILENAME} && mv ${ZIG_FOLDERNAME} zig;
WORKDIR $GITHUB_WORKSPACE
ARG GITHUB_WORKSPACE
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG BUILDARCH
ARG ZIG_PATH
ARG WEBKIT_URL
ARG ZIG_URL
ARG WEBKIT_BASENAME
ADD ${WEBKIT_URL} .
RUN mkdir -p ${WEBKIT_DIR} && cd ${GITHUB_WORKSPACE} && \
gunzip ${WEBKIT_BASENAME}.tar.gz && tar -xf ${WEBKIT_BASENAME}.tar && \
cat ${WEBKIT_DIR}/include/cmakeconfig.h > /dev/null
LABEL org.opencontainers.image.title="bun base image with zig & webkit ${BUILDARCH} (glibc)"
LABEL org.opencontainers.image.source=https://github.com/oven-sh/bun
FROM bun-base as c-ares
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
ENV JSC_BASE_DIR=${WEBKIT_DIR}
ENV LIB_ICU_PATH=${WEBKIT_DIR}/lib
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/c-ares ${BUN_DIR}/src/deps/c-ares
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && make c-ares && rm -rf ${BUN_DIR}/src/deps/c-ares ${BUN_DIR}/Makefile
FROM bun-base as lolhtml
RUN install_packages build-essential && curl https://sh.rustup.rs -sSf | sh -s -- -y
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/lol-html ${BUN_DIR}/src/deps/lol-html
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache export PATH=$PATH:$HOME/.cargo/bin && export CC=$(which clang-16) && cd ${BUN_DIR} && \
make lolhtml && rm -rf src/deps/lol-html Makefile
FROM bun-base as mimalloc
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/mimalloc ${BUN_DIR}/src/deps/mimalloc
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache cd ${BUN_DIR} && \
make mimalloc && rm -rf src/deps/mimalloc Makefile
FROM bun-base as zlib
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
WORKDIR $BUN_DIR
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && \
make zlib && rm -rf src/deps/zlib Makefile
FROM bun-base as libarchive
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
RUN install_packages autoconf automake libtool pkg-config
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/libarchive ${BUN_DIR}/src/deps/libarchive
ENV CCACHE_DIR=/ccache
WORKDIR $BUN_DIR
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && \
make libarchive && rm -rf src/deps/libarchive Makefile
FROM bun-base as tinycc
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
RUN install_packages libtcc-dev && cp /usr/lib/$(uname -m)-linux-gnu/libtcc.a ${BUN_DEPS_OUT_DIR}
FROM bun-base as boringssl
RUN install_packages golang
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/boringssl ${BUN_DIR}/src/deps/boringssl
WORKDIR $BUN_DIR
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache cd ${BUN_DIR} && make boringssl && rm -rf src/deps/boringssl Makefile
FROM bun-base as uws
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY packages/bun-uws ${BUN_DIR}/packages/bun-uws
COPY packages/bun-usockets ${BUN_DIR}/packages/bun-usockets
COPY src/deps/zlib ${BUN_DIR}/src/deps/zlib
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
COPY src/deps/c-ares/include ${BUN_DIR}/src/deps/c-ares/include
COPY src/deps/libuwsockets.cpp ${BUN_DIR}/src/deps/libuwsockets.cpp
COPY src/deps/_libusockets.h ${BUN_DIR}/src/deps/_libusockets.h
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make uws && rm -rf packages/bun-uws Makefile
FROM bun-base as base64
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/base64 ${BUN_DIR}/src/deps/base64
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make base64 && rm -rf src/deps/base64 Makefile
FROM bun-base as picohttp
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/picohttpparser ${BUN_DIR}/src/deps/picohttpparser
COPY src/deps/*.c ${BUN_DIR}/src/deps/
COPY src/deps/*.h ${BUN_DIR}/src/deps/
WORKDIR $BUN_DIR
RUN cd $BUN_DIR && \
make picohttp
FROM bun-base-with-zig-and-webkit as identifier_cache
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
WORKDIR $BUN_DIR
COPY Makefile ${BUN_DIR}/Makefile
COPY src/js_lexer/identifier_data.zig ${BUN_DIR}/src/js_lexer/identifier_data.zig
COPY src/js_lexer/identifier_cache.zig ${BUN_DIR}/src/js_lexer/identifier_cache.zig
RUN cd $BUN_DIR && \
make identifier-cache && rm -rf zig-cache Makefile
FROM bun-base-with-zig-and-webkit as node_fallbacks
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
WORKDIR $BUN_DIR
COPY Makefile ${BUN_DIR}/Makefile
COPY src/node-fallbacks ${BUN_DIR}/src/node-fallbacks
RUN cd $BUN_DIR && \
make node-fallbacks && rm -rf src/node-fallbacks/node_modules Makefile
FROM bun-base-with-zig-and-webkit as prepare_release
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
WORKDIR $BUN_DIR
COPY ./root.zig ${BUN_DIR}/root.zig
COPY ./src ${BUN_DIR}/src
COPY ./build.zig ${BUN_DIR}/build.zig
COPY ./completions ${BUN_DIR}/completions
COPY ./packages ${BUN_DIR}/packages
COPY ./src/build-id ${BUN_DIR}/src/build-id
COPY ./package.json ${BUN_DIR}/package.json
COPY ./misctools ${BUN_DIR}/misctools
COPY Makefile ${BUN_DIR}/Makefile
FROM prepare_release as compile_release_obj
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY .prettierrc.cjs ${BUN_DIR}/.prettierrc.cjs
WORKDIR $BUN_DIR
ENV JSC_BASE_DIR=${WEBKIT_DIR}
ENV LIB_ICU_PATH=${WEBKIT_DIR}/lib
ARG ARCH
ARG TRIPLET
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ARG GIT_SHA
ARG BUN_BASE_VERSION
ENV BUN_BASE_VERSION=${BUN_BASE_VERSION}
ENV GIT_SHA=${GIT_SHA}
COPY --from=identifier_cache ${BUN_DIR}/src/js_lexer/*.blob ${BUN_DIR}/src/js_lexer/
COPY --from=node_fallbacks ${BUN_DIR}/src/node-fallbacks/out ${BUN_DIR}/src/node-fallbacks/out
COPY ./src/build-id ${BUN_DIR}/src/build-id
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && mkdir -p src/bun.js/bindings-obj && rm -rf $HOME/.cache zig-cache && make prerelease && \
mkdir -p $BUN_RELEASE_DIR && \
OUTPUT_DIR=/tmp/bun-${TRIPLET}-${GIT_SHA} $ZIG_PATH/zig build obj -Doutput-dir=/tmp/bun-${TRIPLET}-${GIT_SHA} -Doptimize=ReleaseFast -Dtarget="${TRIPLET}" -Dcpu="${CPU_TARGET}" && \
cp /tmp/bun-${TRIPLET}-${GIT_SHA}/bun.o /tmp/bun-${TRIPLET}-${GIT_SHA}/bun-${BUN_BASE_VERSION}.$(cat ${BUN_DIR}/src/build-id).o && cd / && rm -rf $BUN_DIR
FROM scratch as build_release_obj
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG GIT_SHA
ARG TRIPLET
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY --from=compile_release_obj /tmp/bun-${TRIPLET}-${GIT_SHA}/*.o /
FROM prepare_release as compile_cpp
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY .prettierrc.cjs ${BUN_DIR}/.prettierrc.cjs
WORKDIR $BUN_DIR
ENV JSC_BASE_DIR=${WEBKIT_DIR}
ENV LIB_ICU_PATH=${WEBKIT_DIR}/lib
# Required for webcrypto bindings
COPY src/deps/boringssl/include ${BUN_DIR}/src/deps/boringssl/include
ENV CCACHE_DIR=/ccache
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && mkdir -p src/bun.js/bindings-obj && rm -rf $HOME/.cache zig-cache && mkdir -p $BUN_RELEASE_DIR && \
make release-bindings -j10 && mv src/bun.js/bindings-obj/* /tmp
FROM bun-base as sqlite
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
COPY Makefile ${BUN_DIR}/Makefile
COPY src/bun.js/bindings/sqlite ${BUN_DIR}/src/bun.js/bindings/sqlite
COPY .prettierrc.cjs ${BUN_DIR}/.prettierrc.cjs
WORKDIR $BUN_DIR
ENV JSC_BASE_DIR=${WEBKIT_DIR}
ENV LIB_ICU_PATH=${WEBKIT_DIR}/lib
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && make sqlite
FROM bun-base as zstd
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
ENV CCACHE_DIR=/ccache
COPY Makefile ${BUN_DIR}/Makefile
COPY src/deps/zstd ${BUN_DIR}/src/deps/zstd
COPY .prettierrc.cjs ${BUN_DIR}/.prettierrc.cjs
WORKDIR $BUN_DIR
ENV JSC_BASE_DIR=${WEBKIT_DIR}
ENV LIB_ICU_PATH=${WEBKIT_DIR}/lib
RUN --mount=type=cache,target=/ccache cd $BUN_DIR && make zstd
FROM scratch as build_release_cpp
COPY --from=compile_cpp /tmp/*.o /
FROM prepare_release as build_release
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
ARG CPU_TARGET
ENV CPU_TARGET=${CPU_TARGET}
COPY Makefile ${BUN_DIR}/Makefile
COPY .prettierrc.cjs ${BUN_DIR}/.prettierrc.cjs
WORKDIR $BUN_DIR
ENV JSC_BASE_DIR=${WEBKIT_DIR}
ENV LIB_ICU_PATH=${WEBKIT_DIR}/lib
COPY --from=zlib ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=base64 ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=libarchive ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=boringssl ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=lolhtml ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=mimalloc ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=picohttp ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=sqlite ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=zstd ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=tinycc ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=uws ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=uws ${BUN_DEPS_OUT_DIR}/*.o ${BUN_DEPS_OUT_DIR}/
COPY --from=c-ares ${BUN_DEPS_OUT_DIR}/*.a ${BUN_DEPS_OUT_DIR}/
COPY --from=build_release_obj /*.o /tmp
COPY --from=build_release_cpp /*.o ${BUN_DIR}/src/bun.js/bindings-obj/
COPY --from=build_release_cpp /*.a ${BUN_DEPS_OUT_DIR}/
RUN cd $BUN_DIR && mkdir -p ${BUN_RELEASE_DIR} && make bun-relink copy-to-bun-release-dir && \
rm -rf $HOME/.cache zig-cache misctools package.json build-id completions build.zig ${BUN_DIR}/packages
FROM scratch as artifact
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
COPY --from=build_release ${BUN_RELEASE_DIR}/bun /bun
COPY --from=build_release ${BUN_RELEASE_DIR}/bun-profile /bun-profile
COPY --from=build_release ${BUN_DEPS_OUT_DIR}/* /bun-dependencies
COPY --from=build_release_obj /*.o /bun-obj
FROM prepare_release as build_unit
ARG DEBIAN_FRONTEND
ARG GITHUB_WORKSPACE
ARG ZIG_PATH
# Directory extracts to "bun-webkit"
ARG WEBKIT_DIR
ARG BUN_RELEASE_DIR
ARG BUN_DEPS_OUT_DIR
ARG BUN_DIR
WORKDIR $BUN_DIR
ENV PATH "$ZIG_PATH:$PATH"
ENV LIB_ICU_PATH "${WEBKIT_DIR}/lib"
CMD make headers \
api \
analytics \
bun_error \
fallback_decoder \
bindings -j10 && \
make \
run-all-unit-tests
# FROM bun-test-base as test_base
# ARG DEBIAN_FRONTEND=noninteractive
# ARG GITHUB_WORKSPACE=/build
# ARG ZIG_PATH=${GITHUB_WORKSPACE}/zig
# # Directory extracts to "bun-webkit"
# ARG WEBKIT_DIR=${GITHUB_WORKSPACE}/bun-webkit
# ARG BUN_RELEASE_DIR=${GITHUB_WORKSPACE}/bun-release
# ARG BUN_DEPS_OUT_DIR=${GITHUB_WORKSPACE}/bun-deps
# ARG BUN_DIR=${GITHUB_WORKSPACE}/bun
# ARG BUILDARCH=amd64
# RUN groupadd -r chromium && useradd -d ${BUN_DIR} -M -r -g chromium -G audio,video chromium \
# && mkdir -p /home/chromium/Downloads && chown -R chromium:chromium /home/chromium
# USER chromium
# WORKDIR $BUN_DIR
# ENV NPM_CLIENT bun
# ENV PATH "${BUN_DIR}/packages/bun-linux-x64:${BUN_DIR}/packages/bun-linux-aarch64:$PATH"
# ENV CI 1
# ENV BROWSER_EXECUTABLE /usr/bin/chromium
# COPY ./test ${BUN_DIR}/test
# COPY Makefile ${BUN_DIR}/Makefile
# COPY package.json ${BUN_DIR}/package.json
# COPY .docker/run-test.sh ${BUN_DIR}/run-test.sh
# COPY ./bun.lockb ${BUN_DIR}/bun.lockb
# # # We don't want to worry about architecture differences in this image
# COPY --from=release /opt/bun/bin/bun ${BUN_DIR}/packages/bun-linux-aarch64/bun
# COPY --from=release /opt/bun/bin/bun ${BUN_DIR}/packages/bun-linux-x64/bun
# USER root
# RUN chgrp -R chromium ${BUN_DIR} && chmod g+rwx ${BUN_DIR} && chown -R chromium:chromium ${BUN_DIR}
# USER chromium
# CMD [ "bash", "run-test.sh" ]
# FROM release


@@ -39,7 +39,6 @@ endif
MIN_MACOS_VERSION ?= $(DEFAULT_MIN_MACOS_VERSION)
BUN_BASE_VERSION = 1.0
CI ?= false
AR=
@@ -66,7 +65,7 @@ PACKAGE_JSON_VERSION = $(BUN_BASE_VERSION).$(BUILD_ID)
BUN_BUILD_TAG = bun-v$(PACKAGE_JSON_VERSION)
BUN_RELEASE_BIN = $(PACKAGE_DIR)/bun
PRETTIER ?= $(shell which prettier 2>/dev/null || echo "./node_modules/.bin/prettier")
ESBUILD = $(shell which esbuild 2>/dev/null || echo "./node_modules/.bin/esbuild")
ESBUILD = "$(shell which esbuild 2>/dev/null || echo "./node_modules/.bin/esbuild")"
DSYMUTIL ?= $(shell which dsymutil 2>/dev/null || which dsymutil-15 2>/dev/null)
WEBKIT_DIR ?= $(realpath src/bun.js/WebKit)
WEBKIT_RELEASE_DIR ?= $(WEBKIT_DIR)/WebKitBuild/Release
@@ -74,7 +73,7 @@ WEBKIT_DEBUG_DIR ?= $(WEBKIT_DIR)/WebKitBuild/Debug
WEBKIT_RELEASE_DIR_LTO ?= $(WEBKIT_DIR)/WebKitBuild/ReleaseLTO
NPM_CLIENT ?= $(shell which bun 2>/dev/null || which npm 2>/dev/null)
NPM_CLIENT = "$(shell which bun 2>/dev/null || which npm 2>/dev/null)"
ZIG ?= $(shell which zig 2>/dev/null || echo -e "error: Missing zig. Please make sure zig is in PATH. Or set ZIG=/path/to-zig-executable")
# We must use the same compiler version for the JavaScriptCore bindings and JavaScriptCore
@@ -187,11 +186,6 @@ BUN_CFLAGS = $(MACOS_MIN_FLAG) $(MARCH_NATIVE) $(OPTIMIZATION_LEVEL) -fno-excep
BUN_TMP_DIR := /tmp/make-bun
CFLAGS=$(CFLAGS_WITHOUT_MARCH) $(MARCH_NATIVE)
DEFAULT_USE_BMALLOC := 1
USE_BMALLOC ?= DEFAULT_USE_BMALLOC
# Set via postinstall
ifeq (,$(realpath $(JSC_BASE_DIR)))
JSC_BASE_DIR = $(realpath $(firstword $(wildcard bun-webkit)))
@@ -380,9 +374,7 @@ ICU_FLAGS ?=
# Ideally, we could just look up the linker search paths
ifeq ($(OS_NAME),linux)
LIB_ICU_PATH ?= $(JSC_LIB)
ICU_FLAGS += $(LIB_ICU_PATH)/libicuuc.a $(LIB_ICU_PATH)/libicudata.a $(LIB_ICU_PATH)/libicui18n.a
else
LIB_ICU_PATH ?= $(BUN_DEPS_DIR)
ICU_FLAGS += $(LIB_ICU_PATH)/libicuuc.a $(LIB_ICU_PATH)/libicudata.a $(LIB_ICU_PATH)/libicui18n.a
endif
ifeq ($(OS_NAME),darwin)
@@ -764,7 +756,7 @@ USOCKETS_DIR = $(BUN_DIR)/packages/bun-usockets
USOCKETS_SRC_DIR = $(USOCKETS_DIR)/src
usockets:
rm -rf $(USOCKETS_DIR)/*.i $(USOCKETS_DIR)/*.bc $(USOCKETS_DIR)/*.o $(USOCKETS_DIR)/*.s $(USOCKETS_DIR)/*.ii $(USOCKETS_DIR)/*.s
rm -rf $(USOCKETS_DIR)/*.i $(USOCKETS_DIR)/*.bc $(USOCKETS_DIR)/*.o $(USOCKETS_DIR)/*.s $(USOCKETS_DIR)/*.ii $(USOCKETS_DIR)/*.s $(BUN_DEPS_OUT_DIR)/libusockets.a
cd $(USOCKETS_DIR) && $(CC_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CFLAGS) $(UWS_CC_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.c) $(wildcard $(USOCKETS_SRC_DIR)/**/*.c)
cd $(USOCKETS_DIR) && $(CXX_WITH_CCACHE) -I$(USOCKETS_SRC_DIR) -fno-builtin-malloc -fno-builtin-free -fno-builtin-realloc $(EMIT_LLVM_FOR_RELEASE) $(MACOS_MIN_FLAG) -fPIC $(CXXFLAGS) $(UWS_CXX_FLAGS) -save-temps -I$(BUN_DEPS_DIR)/uws/uSockets/src $(UWS_LDFLAGS) -g $(DEFAULT_LINKER_FLAGS) $(PLATFORM_LINKER_FLAGS) $(OPTIMIZATION_LEVEL) -c $(wildcard $(USOCKETS_SRC_DIR)/*.cpp) $(wildcard $(USOCKETS_SRC_DIR)/**/*.cpp)
cd $(USOCKETS_DIR) && $(AR) rcvs $(BUN_DEPS_OUT_DIR)/libusockets.a $(USOCKETS_DIR)/*.{o,bc}
@@ -1485,12 +1477,12 @@ wasm-return1:
$(ZIG) build-lib -OReleaseSmall test/bun.js/wasm-return-1-test.zig -femit-bin=test/bun.js/wasm-return-1-test.wasm -target wasm32-freestanding
generate-classes:
bun src/bun.js/scripts/generate-classes.ts
bun src/codegen/generate-classes.ts
$(ZIG) fmt src/bun.js/bindings/generated_classes.zig
$(CLANG_FORMAT) -i src/bun.js/bindings/ZigGeneratedClasses.h src/bun.js/bindings/ZigGeneratedClasses.cpp
generate-sink:
bun src/bun.js/scripts/generate-jssink.js
bun src/codegen/generate-jssink.js
$(CLANG_FORMAT) -i src/bun.js/bindings/JSSink.cpp src/bun.js/bindings/JSSink.h
./src/bun.js/scripts/create_hash_table src/bun.js/bindings/JSSink.cpp > src/bun.js/bindings/JSSinkLookupTable.h
$(SED) -i -e 's/#include "Lookup.h"//' src/bun.js/bindings/JSSinkLookupTable.h
@@ -1954,5 +1946,9 @@ setup: vendor-dev identifier-cache clean-bindings
.PHONY: help
help: ## to print this help
@echo "For detailed build instructions, see https://bun.sh/docs/project/development"
@echo "For detailed build instructions, see https://bun.sh/docs/project/contributing"
@awk 'BEGIN {FS = ":.*?## "} /^[a-zA-Z0-9_-]+:.*?## / {gsub("\\\\n",sprintf("\n%22c",""), $$2);printf "\033[36m%-20s\033[0m \t\t%s\n", $$1, $$2}' $(MAKEFILE_LIST)
print_linker_flags:
@echo $(CLANG_FLAGS)


@@ -93,8 +93,8 @@ bun upgrade --canary
- [`bun run`](https://bun.sh/docs/cli/run)
- [`bun install`](https://bun.sh/docs/cli/install)
- [`bun test`](https://bun.sh/docs/cli/test)
- [`bun init`](https://bun.sh/docs/templates#bun-init)
- [`bun create`](https://bun.sh/docs/templates#bun-create)
- [`bun init`](https://bun.sh/docs/cli/init)
- [`bun create`](https://bun.sh/docs/cli/bun-create)
- [`bunx`](https://bun.sh/docs/cli/bunx)
- Runtime
- [Runtime](https://bun.sh/docs/runtime/index)


@@ -6,23 +6,30 @@ bench("await 1", async function () {
return await 1;
});
function callnextTick(resolve) {
process.nextTick(resolve);
}
if (typeof process !== "undefined") {
bench("process.nextTick x 100", async function () {
var remaining = 100;
var cb, promise;
promise = new Promise(resolve => {
cb = resolve;
});
function awaitNextTick() {
return new Promise(callnextTick);
}
for (let i = 0; i < 100; i++) {
process.nextTick(() => {
if (--remaining === 0) cb();
});
}
bench("promise.nextTick", async function () {
return awaitNextTick();
});
return promise;
});
bench("await 1 x 100", async function () {
for (let i = 0; i < 100; i++) await 1;
});
}
bench("await new Promise(resolve => resolve())", async function () {
await new Promise(resolve => resolve());
});
bench("Promise.all(Array.from({length: 100}, () => new Promise((resolve) => resolve())))", async function () {
return Promise.all(Array.from({ length: 100 }, () => Promise.resolve(1)));
});
await run();

build.zig

@@ -1,3 +1,5 @@
const required_zig_version = "0.12.0-dev.899+027aabf49";
const std = @import("std");
const pathRel = std.fs.path.relative;
const Wyhash = @import("./src/wyhash.zig").Wyhash;
@@ -11,6 +13,11 @@ fn moduleSource(comptime out: []const u8) FileSource {
}
}
fn exists(path: []const u8) bool {
_ = std.fs.openFileAbsolute(path, .{ .mode = .read_only }) catch return false;
return true;
}
const color_map = std.ComptimeStringMap([]const u8, .{
&.{ "black", "30m" },
&.{ "blue", "34m" },
@@ -46,11 +53,37 @@ fn addInternalPackages(b: *Build, step: *CompileStep, _: std.mem.Allocator, _: [
};
step.addModule("async_io", io);
step.addModule("zlib-internal", brk: {
if (target.isWindows()) {
break :brk b.createModule(.{ .source_file = FileSource.relative("src/deps/zlib.win32.zig") });
}
break :brk b.createModule(.{ .source_file = FileSource.relative("src/deps/zlib.posix.zig") });
});
var async_: *Module = brk: {
if (target.isDarwin() or target.isLinux() or target.isFreeBSD()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/async/posix_event_loop.zig"),
});
} else if (target.isWindows()) {
break :brk b.createModule(.{
.source_file = FileSource.relative("src/async/windows_event_loop.zig"),
});
}
break :brk b.createModule(.{
.source_file = FileSource.relative("src/async/stub_event_loop.zig"),
});
};
step.addModule("async", async_);
}
const BunBuildOptions = struct {
canary: bool = false,
sha: [:0]const u8 = "",
version: []const u8 = "",
baseline: bool = false,
bindgen: bool = false,
sizegen: bool = false,
@@ -59,6 +92,8 @@ const BunBuildOptions = struct {
runtime_js_version: u64 = 0,
fallback_html_version: u64 = 0,
tinycc: bool = true,
pub fn updateRuntime(this: *BunBuildOptions) anyerror!void {
if (std.fs.cwd().openFile("src/runtime.out.js", .{ .mode = .read_only })) |file| {
defer file.close();
@@ -90,6 +125,11 @@ const BunBuildOptions = struct {
pub fn step(this: BunBuildOptions, b: anytype) *std.build.OptionsStep {
var opts = b.addOptions();
opts.addOption(@TypeOf(this.canary), "is_canary", this.canary);
opts.addOption(
std.SemanticVersion,
"version",
std.SemanticVersion.parse(this.version) catch @panic(b.fmt("Invalid version: {s}", .{this.version})),
);
opts.addOption(@TypeOf(this.sha), "sha", this.sha);
opts.addOption(@TypeOf(this.baseline), "baseline", this.baseline);
opts.addOption(@TypeOf(this.bindgen), "bindgen", this.bindgen);
@@ -97,6 +137,7 @@ const BunBuildOptions = struct {
opts.addOption(@TypeOf(this.base_path), "base_path", this.base_path);
opts.addOption(@TypeOf(this.runtime_js_version), "runtime_js_version", this.runtime_js_version);
opts.addOption(@TypeOf(this.fallback_html_version), "fallback_html_version", this.fallback_html_version);
opts.addOption(@TypeOf(this.tinycc), "tinycc", this.tinycc);
return opts;
}
};
@@ -146,6 +187,20 @@ pub fn build(b: *Build) !void {
}
pub fn build_(b: *Build) !void {
if (!std.mem.eql(u8, @import("builtin").zig_version_string, required_zig_version)) {
const colors = std.io.getStdErr().supportsAnsiEscapeCodes();
std.debug.print(
"{s}WARNING:\nBun requires Zig version '{s}', but found '{s}', build may fail...\nMake sure you installed the right version as per https://bun.sh/docs/project/contributing#install-zig\n{s}You can update to the right version using 'zigup {s}'\n\n",
.{
if (colors) "\x1b[1;33m" else "",
required_zig_version,
@import("builtin").zig_version_string,
if (colors) "\x1b[0m" else "",
required_zig_version,
},
);
}
// Standard target options allows the person running `zig build` to choose
// what target to build for. Here we do not override the defaults, which
// means any target is allowed, and the default is native. Other options
@@ -155,6 +210,8 @@ pub fn build_(b: *Build) !void {
// between Debug, ReleaseSafe, ReleaseFast, and ReleaseSmall.
optimize = b.standardOptimizeOption(.{});
const generated_code_directory = b.option([]const u8, "generated-code", "Set the generated code directory") orelse "./";
var output_dir_buf = std.mem.zeroes([4096]u8);
var bin_label = if (optimize == std.builtin.OptimizeMode.Debug) "packages/debug-bun-" else "packages/bun-";
@@ -187,7 +244,7 @@ pub fn build_(b: *Build) !void {
var triplet = triplet_buf[0 .. osname.len + cpuArchName.len + 1];
if (b.option([]const u8, "output-dir", "target to install to") orelse std.os.getenv("OUTPUT_DIR")) |output_dir_| {
if (b.option([]const u8, "output-dir", "target to install to") orelse b.env_map.get("OUTPUT_DIR")) |output_dir_| {
output_dir = try pathRel(b.allocator, b.install_prefix, output_dir_);
} else {
const output_dir_base = try std.fmt.bufPrint(&output_dir_buf, "{s}{s}", .{ bin_label, triplet });
@@ -217,9 +274,24 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative(root_src),
.target = target,
.optimize = optimize,
.main_pkg_path = .{ .cwd_relative = b.pathFromRoot(".") },
.main_mod_path = .{ .cwd_relative = b.pathFromRoot(".") },
});
if (!exists(b.pathFromRoot(try std.fs.path.join(b.allocator, &.{
"src",
"js_lexer",
"id_continue_bitset.blob",
})))) {
const identifier_data = b.pathFromRoot(try std.fs.path.join(b.allocator, &.{ "src", "js_lexer", "identifier_data.zig" }));
var run_step = b.addSystemCommand(&.{
b.zig_exe,
"run",
identifier_data,
});
run_step.has_side_effects = true;
obj.step.dependOn(&run_step.step);
}
b.reference_trace = 16;
var default_build_options: BunBuildOptions = brk: {
@@ -246,9 +318,12 @@ pub fn build_(b: *Build) !void {
}
}
const is_canary = (std.os.getenvZ("BUN_CANARY") orelse "0")[0] == '1';
const is_canary =
b.option(bool, "canary", "Treat this as a canary build") orelse
((b.env_map.get("BUN_CANARY") orelse "0")[0] == '1');
break :brk .{
.canary = is_canary,
.version = b.option([]const u8, "version", "Value of `Bun.version`") orelse "0.0.0",
.sha = git_sha,
.baseline = is_baseline,
.bindgen = false,
@@ -303,13 +378,21 @@ pub fn build_(b: *Build) !void {
obj.addOptions("build_options", actual_build_options.step(b));
obj.linkLibC();
// Generated Code
obj.addModule("generated/ZigGeneratedClasses.zig", b.createModule(.{
.source_file = .{ .path = b.fmt("{s}/ZigGeneratedClasses.zig", .{generated_code_directory}) },
}));
obj.addModule("generated/ResolvedSourceTag.zig", b.createModule(.{
.source_file = .{ .path = b.fmt("{s}/ResolvedSourceTag.zig", .{generated_code_directory}) },
}));
obj.linkLibC();
obj.dll_export_fns = true;
obj.strip = false;
obj.bundle_compiler_rt = false;
obj.omit_frame_pointer = optimize != .Debug;
obj.subsystem = .Console;
// Disable stack probing on x86 so we don't need to include compiler_rt
if (target.getCpuArch().isX86()) obj.disable_stack_probing = true;
if (target.getCpuArch().isX86() or target.isWindows()) obj.disable_stack_probing = true;
if (b.option(bool, "for-editor", "Do not emit bin, just check for errors") orelse false) {
// obj.emit_bin = .no_emit;
@@ -331,7 +414,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bindgen.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -348,7 +431,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("root_wasm.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer wasm_step.dependOn(&wasm.step);
wasm.strip = false;
@@ -367,7 +450,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/http_bench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -381,7 +464,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/machbench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -395,7 +478,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/fetch.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -409,7 +492,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/bench/string-handling.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -423,7 +506,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sha.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -437,7 +520,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("src/sourcemap/vlq_bench.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -451,7 +534,7 @@ pub fn build_(b: *Build) !void {
.root_source_file = FileSource.relative("misctools/tgz.zig"),
.target = target,
.optimize = optimize,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
defer headers_step.dependOn(&headers_obj.step);
try configureObjectStep(b, headers_obj, headers_step, @TypeOf(target), target);
@@ -468,7 +551,7 @@ pub fn build_(b: *Build) !void {
var headers_obj: *CompileStep = b.addTest(.{
.root_source_file = FileSource.relative(test_file orelse "src/main.zig"),
.target = target,
.main_pkg_path = obj.main_pkg_path,
.main_mod_path = obj.main_mod_path,
});
headers_obj.filter = test_filter;
if (test_bin_) |test_bin| {

bun.lockb (binary file, not shown)

@@ -26,10 +26,10 @@ Below is a quick "cheat sheet" that doubles as a table of contents. Click an ite
---
<!-- - [`File`](#file)
- _Browser only_. A subclass of `Blob` that represents a file. Has a `name` and `lastModified` timestamp. There is experimental support in Node.js v20; Bun does not support `File` yet; most of its functionality is provided by `BunFile`.
- [`File`](#file)
- A subclass of `Blob` that represents a file. Has a `name` and `lastModified` timestamp. There is experimental support in Node.js v20.
--- -->
---
- [`BunFile`](#bunfile)
- _Bun only_. A subclass of `Blob` that represents a lazily-loaded file on disk. Created with `Bun.file(path)`.


@@ -183,6 +183,60 @@ const proc = Bun.spawn(["echo", "hello"]);
proc.unref();
```
## Inter-process communication (IPC)
Bun supports a direct inter-process communication channel between two `bun` processes. To receive messages from a spawned Bun subprocess, specify an `ipc` handler.
{%callout%}
**Note** — This API is only compatible with other `bun` processes. Use `process.execPath` to get a path to the currently running `bun` executable.
{%/callout%}
```ts#parent.ts
const child = Bun.spawn(["bun", "child.ts"], {
ipc(message) {
/**
* The message received from the sub process
**/
},
});
```
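As the callout above notes, `process.execPath` resolves to the currently running `bun` binary, so a variant of this snippet that does not assume `bun` is on `PATH` might look like the following sketch:
```ts
const child = Bun.spawn([process.execPath, "child.ts"], {
  ipc(message) {
    // Handle the message received from the subprocess
    console.log("child says:", message);
  },
});
```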
The parent process can send messages to the subprocess using the `.send()` method on the returned `Subprocess` instance. A reference to the sending subprocess is also available as the second argument in the `ipc` handler.
```ts#parent.ts
const childProc = Bun.spawn(["bun", "child.ts"], {
ipc(message, childProc) {
/**
* The message received from the sub process
**/
childProc.send("Respond to child")
},
});
childProc.send("I am your father"); // The parent can send messages to the child as well
```
Meanwhile, the child process can send messages to its parent with `process.send()` and receive messages with `process.on("message")`. This is the same API used for `child_process.fork()` in Node.js.
```ts#child.ts
process.send("Hello from child as string");
process.send({ message: "Hello from child as object" });
process.on("message", (message) => {
// print message from parent
console.log(message);
});
```
All messages are serialized using the JSC `serialize` API, which allows for the same set of [transferable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) supported by `postMessage` and `structuredClone`, including strings, typed arrays, streams, and objects.
```ts#child.ts
// send a string
process.send("Hello from child as string");
// send an object
process.send({ message: "Hello from child as object" });
```
## Blocking API (`Bun.spawnSync()`)
Bun provides a synchronous equivalent of `Bun.spawn` called `Bun.spawnSync`. This is a blocking API that supports the same inputs and parameters as `Bun.spawn`. It returns a `SyncSubprocess` object, which differs from `Subprocess` in a few ways.
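As a rough sketch of the synchronous form (assuming the default stdio settings, where output is buffered onto the returned object):
```ts
const result = Bun.spawnSync(["echo", "hello"]);

// With the defaults, stdout/stderr are buffered and available once the call returns.
console.log(result.stdout.toString()); // "hello\n"
console.log(result.exitCode); // 0
console.log(result.success); // true when the process exited with code 0
```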


@@ -192,7 +192,7 @@ const server = Bun.serve<{ username: string }>({
close(ws) {
const msg = `${ws.data.username} has left the chat`;
ws.unsubscribe("the-group-chat");
server.publish("the-group-chat", msg);
ws.publish("the-group-chat", msg);
},
},
});


@@ -328,7 +328,7 @@ Depending on the target, Bun will apply different module resolution rules and op
All bundles generated with `target: "bun"` are marked with a special `// @bun` pragma, which indicates to the Bun runtime that there's no need to re-transpile the file before execution.
If any entrypoints contains a Bun shebang (`#!/usr/bin/env bun`) the bundler will default to `target: "bun"` instead of `"browser`.
If any entrypoint contains a Bun shebang (`#!/usr/bin/env bun`), the bundler will default to `target: "bun"` instead of `"browser"`.
---

docs/cli/add.md (new file)

@@ -0,0 +1,155 @@
To add a particular package:
```bash
$ bun add preact
```
To specify a version, version range, or tag:
```bash
$ bun add zod@3.20.0
$ bun add zod@^3.0.0
$ bun add zod@latest
```
## `--dev`
{% callout %}
**Alias** - `--development`, `-d`, `-D`
{% /callout %}
To add a package as a dev dependency (`"devDependencies"`):
```bash
$ bun add --dev @types/react
$ bun add -d @types/react
```
## `--optional`
To add a package as an optional dependency (`"optionalDependencies"`):
```bash
$ bun add --optional lodash
```
## `--exact`
To add a package and pin to the resolved version, use `--exact`. This will resolve the version of the package and add it to your `package.json` with an exact version number instead of a version range.
```bash
$ bun add react --exact
$ bun add react -E
```
This will add the following to your `package.json`:
```jsonc
{
"dependencies": {
// without --exact
"react": "^18.2.0", // this matches >= 18.2.0 < 19.0.0
// with --exact
"react": "18.2.0" // this matches only 18.2.0 exactly
}
}
```
To view a complete list of options for this command:
```bash
$ bun add --help
```
## `--global`
{% callout %}
**Note** — This will not modify the `package.json` of your current project folder.
**Alias** - `bun add --global`, `bun add -g`, `bun install --global` and `bun install -g`
{% /callout %}
To install a package globally, use the `-g`/`--global` flag. This will not modify the `package.json` of your current project. Typically this is used for installing command-line tools.
```bash
$ bun add --global cowsay # or `bun add -g cowsay`
$ cowsay "Bun!"
______
< Bun! >
------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
```
{% details summary="Configuring global installation behavior" %}
```toml
[install]
# where `bun add --global` installs packages
globalDir = "~/.bun/install/global"
# where globally-installed package bins are linked
globalBinDir = "~/.bun/bin"
```
{% /details %}
## Trusted dependencies
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts for installed dependencies, such as `postinstall`. These scripts represent a potential security risk, as they can execute arbitrary code on your machine.
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
Bun reads this field and will run lifecycle scripts for `my-trusted-package`.
<!-- Bun maintains an allow-list of popular packages containing `postinstall` scripts that are known to be safe. To run lifecycle scripts for packages that aren't on this list, add the package to `trustedDependencies` in your package.json. -->
## Git dependencies
To add a dependency from a git repository:
```bash
$ bun add git@github.com:moment/moment.git
```
Bun supports a variety of protocols, including [`github`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#github-urls), [`git`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#git-urls-as-dependencies), `git+ssh`, `git+https`, and many more.
```json
{
"dependencies": {
"dayjs": "git+https://github.com/iamkun/dayjs.git",
"lodash": "git+ssh://github.com/lodash/lodash.git#4.17.21",
"moment": "git@github.com:moment/moment.git",
"zod": "github:colinhacks/zod"
}
}
```
## Tarball dependencies
A package name can correspond to a publicly hosted `.tgz` file. During installation, Bun will download and install the package from the specified tarball URL, rather than from the package registry.
```sh
$ bun add zod@https://registry.npmjs.org/zod/-/zod-3.21.4.tgz
```
This will add the following line to your `package.json`:
```json#package.json
{
"dependencies": {
"zod": "https://registry.npmjs.org/zod/-/zod-3.21.4.tgz"
}
}
```


@@ -1,36 +1,12 @@
## `bun init`
Scaffold an empty project with the interactive `bun init` command.
```bash
$ bun init
bun init helps you get started with a minimal project and tries to
guess sensible defaults. Press ^C anytime to quit.
package name (quickstart):
entry point (index.ts):
Done! A package.json file was saved in the current directory.
+ index.ts
+ .gitignore
+ tsconfig.json (for editor auto-complete)
+ README.md
To get started, run:
bun run index.ts
```
Press `enter` to accept the default answer for each prompt, or pass the `-y` flag to auto-accept the defaults.
## `bun create`
{% callout %}
**Note** — You don't need `bun create` to use Bun. You don't need any configuration at all. This command exists to make getting started a bit quicker and easier.
{% /callout %}
Template a new Bun project with `bun create`. This is a flexible command that can be used to create a new project with a `create-<template>` npm package, a GitHub repo, or a local template.
### From `npm`
If you're looking to create a brand new empty project, use [`bun init`](/docs/cli/init).
## From `npm`
```sh
$ bun create <template> [<destination>]
@@ -45,7 +21,7 @@ $ bunx create-remix
Refer to the documentation of the associated `create-<template>` package for complete documentation and usage instructions.
### From GitHub
## From GitHub
This will download the contents of the GitHub repo to disk.
@@ -115,7 +91,7 @@ $ bun create https://github.com/ahfarmer/calculator ./myapp
Bun installs the files as they currently exist on the current default branch (usually `main` or `master`). Unlike `git clone`, it doesn't download the commit history or configure a remote. -->
### From a local template
## From a local template
{% callout %}
**⚠️ Warning** — Unlike remote templates, running `bun create` with a local template will delete the entire destination folder if it already exists! Be careful.


@@ -59,8 +59,8 @@ optional = true
# Install local devDependencies (default: true)
dev = true
# Install peerDependencies (default: false)
peer = false
# Install peerDependencies (default: true)
peer = true
# When using `bun install -g`, install packages here
globalDir = "~/.bun/install/global"
@@ -170,7 +170,7 @@ bun stores normalized `cpu` and `os` values from npm in the lockfile, along with
## Peer dependencies?
Peer dependencies are handled similarly to yarn. `bun install` does not automatically install peer dependencies and will try to choose an existing dependency.
Peer dependencies are handled similarly to yarn. `bun install` will automatically install peer dependencies. If the dependency is marked optional in `peerDependenciesMeta`, an existing dependency will be chosen if possible.
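For illustration, a hypothetical `package.json` of such a dependency (the package names are placeholders): the plain peer would be installed automatically, while the one marked `"optional": true` in `peerDependenciesMeta` is not required.
```json
{
  "name": "some-library",
  "version": "1.0.0",
  "peerDependencies": {
    "react": "^18.0.0",
    "some-optional-peer": "^1.0.0"
  },
  "peerDependenciesMeta": {
    "some-optional-peer": {
      "optional": true
    }
  }
}
```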
## Lockfile


@@ -1,256 +0,0 @@
## `bun init`
Scaffold an empty project with `bun init`. It's an interactive tool.
```bash
$ bun init
bun init helps you get started with a minimal project and tries to
guess sensible defaults. Press ^C anytime to quit.
package name (quickstart):
entry point (index.ts):
Done! A package.json file was saved in the current directory.
+ index.ts
+ .gitignore
+ tsconfig.json (for editor auto-complete)
+ README.md
To get started, run:
bun run index.ts
```
Press `enter` to accept the default answer for each prompt, or pass the `-y` flag to auto-accept the defaults.
## `bun create`
Template a new Bun project with `bun create`.
```bash
$ bun create <template> <destination>
```
{% callout %}
**Note** You don't need `bun create` to use Bun. You don't need any configuration at all. This command exists to make getting started a bit quicker and easier.
{% /callout %}
A template can take a number of forms:
```bash
$ bun create <template> # an official template (remote)
$ bun create <username>/<repo> # a GitHub repo (remote)
$ bun create <local-template> # a custom template (local)
```
Running `bun create` performs the following steps:
- Download the template (remote templates only)
- Copy all template files into the destination folder. By default Bun will _not overwrite_ any existing files. Use the `--force` flag to overwrite existing files.
- Install dependencies with `bun install`.
- Initialize a fresh Git repo. Opt out with the `--no-git` flag.
- Run the template's configured `start` script, if defined.
<!-- ## Official templates
The following official templates are available.
```bash
bun create next ./myapp
bun create react ./myapp
bun create svelte-kit ./myapp
bun create elysia ./myapp
bun create hono ./myapp
bun create kingworld ./myapp
```
Each of these corresponds to a directory in the [bun-community/create-templates](https://github.com/bun-community/create-templates) repo. If you think a major framework is missing, please open a PR there. This list will change over time as additional examples are added. To see an up-to-date list, run `bun create` with no arguments.
```bash
$ bun create
Welcome to bun! Create a new project by pasting any of the following:
<list of templates>
```
{% callout %}
⚡️ **Speed** — At the time of writing, `bun create react app` runs ~11x faster on a M1 Macbook Pro than `yarn create react-app app`.
{% /callout %} -->
## GitHub repos
A template of the form `<username>/<repo>` will be downloaded from GitHub.
```bash
$ bun create ahfarmer/calculator ./myapp
```
Complete GitHub URLs will also work:
```bash
$ bun create github.com/ahfarmer/calculator ./myapp
$ bun create https://github.com/ahfarmer/calculator ./myapp
```
Bun installs the files as they currently exist on the current default branch (usually `main`). Unlike `git clone`, it doesn't download the commit history or configure a remote.
## Local templates
{% callout %}
**⚠️ Warning** — Unlike remote templates, running `bun create` with a local template will delete the entire destination folder if it already exists! Be careful.
{% /callout %}
Bun's templater can be extended to support custom templates defined on your local file system. These templates should live in one of the following directories:
- `$HOME/.bun-create/<name>`: global templates
- `<project root>/.bun-create/<name>`: project-specific templates
{% callout %}
**Note** — You can customize the global template path by setting the `BUN_CREATE_DIR` environment variable.
{% /callout %}
To create a local template, navigate to `$HOME/.bun-create` and create a new directory with the desired name of your template.
```bash
$ cd $HOME/.bun-create
$ mkdir foo
$ cd foo
```
Then, create a `package.json` file in that directory with the following contents:
```json
{
"name": "foo"
}
```
You can run `bun create foo` elsewhere on your file system to verify that Bun is correctly finding your local template.
{% table %}
---
- `postinstall`
- runs after installing dependencies
---
- `preinstall`
- runs before installing dependencies
<!-- ---
- `start`
- a command to auto-start the application -->
{% /table %}
Each of these can correspond to a string or array of strings. An array of commands will be executed in order. Here is an example:
```json
{
"name": "@bun-examples/simplereact",
"version": "0.0.1",
"main": "index.js",
"dependencies": {
"react": "^17.0.2",
"react-dom": "^17.0.2"
},
"bun-create": {
"preinstall": "echo 'Installing...'", // a single command
"postinstall": ["echo 'Done!'"], // an array of commands
"start": "bun run echo 'Hello world!'"
}
}
```
When cloning a template, `bun create` will automatically remove the `"bun-create"` section from `package.json` before writing it to the destination folder.
## Reference
### CLI flags
{% table %}
- Flag
- Description
---
- `--force`
- Overwrite existing files
---
- `--no-install`
- Skip installing `node_modules` & tasks
---
- `--no-git`
- Don't initialize a git repository
---
- `--open`
- Start & open in-browser after finish
{% /table %}
### Environment variables
{% table %}
- Name
- Description
---
- `GITHUB_API_DOMAIN`
- If you're using a GitHub enterprise or a proxy, you can customize the GitHub domain Bun pings for downloads
---
- `GITHUB_API_TOKEN`
- This lets `bun create` work with private repositories or if you get rate-limited
{% /table %}
{% details summary="How `bun create` works" %}
When you run `bun create ${template} ${destination}`, here's what happens:
IF remote template
1. GET `registry.npmjs.org/@bun-examples/${template}/latest` and parse it
2. GET `registry.npmjs.org/@bun-examples/${template}/-/${template}-${latestVersion}.tgz`
3. Decompress & extract `${template}-${latestVersion}.tgz` into `${destination}`
- If there are files that would overwrite, warn and exit unless `--force` is passed
IF GitHub repo
1. Download the tarball from GitHub's API
2. Decompress & extract into `${destination}`
- If there are files that would overwrite, warn and exit unless `--force` is passed
ELSE IF local template
1. Open local template folder
2. Delete destination directory recursively
3. Copy files recursively using the fastest system calls available (on macOS `fcopyfile` and Linux, `copy_file_range`). Do not copy or traverse into `node_modules` folder if exists (this alone makes it faster than `cp`)
4. Parse the `package.json` (again!), update `name` to be `${basename(destination)}`, remove the `bun-create` section from the `package.json` and save the updated `package.json` to disk.
- IF Next.js is detected, add `bun-framework-next` to the list of dependencies
- IF Create React App is detected, add the entry point in /src/index.{js,jsx,ts,tsx} to `public/index.html`
- IF Relay is detected, add `bun-macro-relay` so that Relay works
5. Auto-detect the npm client, preferring `pnpm`, `yarn` (v1), and lastly `npm`
6. Run any tasks defined in `"bun-create": { "preinstall" }` with the npm client
7. Run `${npmClient} install` unless `--no-install` is passed OR no dependencies are in package.json
8. Run any tasks defined in `"bun-create": { "postinstall" }` with the npm client
9. Run `git init; git add -A .; git commit -am "Initial Commit";`
- Rename `gitignore` to `.gitignore`. NPM automatically removes `.gitignore` files from appearing in packages.
- If there are dependencies, this runs in a separate thread concurrently while node_modules are being installed
- Using libgit2 if available was tested and performed 3x slower in microbenchmarks
{% /details %}


@@ -1,3 +1,27 @@
Scaffold an empty Bun project with the interactive `bun init` command.
```bash
$ bun init
bun init helps you get started with a minimal project and tries to
guess sensible defaults. Press ^C anytime to quit.
package name (quickstart):
entry point (index.ts):
Done! A package.json file was saved in the current directory.
+ index.ts
+ .gitignore
+ tsconfig.json (for editor auto-complete)
+ README.md
To get started, run:
bun run index.ts
```
Press `enter` to accept the default answer for each prompt, or pass the `-y` flag to auto-accept the defaults.
{% details summary="How `bun init` works" %}
`bun init` is a quick way to start a blank project with Bun. It guesses with sane defaults and is non-destructive when run multiple times.
![Demo](https://user-images.githubusercontent.com/709451/183006613-271960a3-ff22-4f7c-83f5-5e18f684c836.gif)
@@ -13,6 +37,4 @@ If you pass `-y` or `--yes`, it will assume you want to continue without asking
At the end, it runs `bun install` to install `bun-types`.
#### How is `bun init` different than `bun create`?
`bun init` is for blank projects. `bun create` applies templates.
{% /details %}


@@ -9,7 +9,7 @@ The `bun` CLI contains a Node.js-compatible package manager designed to be a dra
{% /callout %}
{% details summary="For Linux users" %}
The minimum Linux Kernel version is 5.1. If you're on Linux kernel 5.1 - 5.5, `bun install` should still work, but HTTP requests will be slow due to a lack of support for io_uring's `connect()` operation.
The recommended minimum Linux Kernel version is 5.6. If you're on Linux kernel 5.1 - 5.5, `bun install` will work, but HTTP requests will be slow due to a lack of support for io_uring's `connect()` operation.
If you're using Ubuntu 20.04, here's how to install a [newer kernel](https://wiki.ubuntu.com/Kernel/LTSEnablementStack):
@@ -23,41 +23,19 @@ sudo apt install --install-recommends linux-generic-hwe-20.04
{% /details %}
## `bun install`
To install all dependencies of a project:
```bash
$ bun install
```
On Linux, `bun install` tends to install packages 20-100x faster than `npm install`. On macOS, it's more like 4-80x.
![package install benchmark](https://user-images.githubusercontent.com/709451/147004342-571b6123-17a9-49a2-8bfd-dcfc5204047e.png)
Running `bun install` will:
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun does not install `peerDependencies` by default.
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun will install `peerDependencies` by default.
- **Run** your project's `{pre|post}install` and `{pre|post}prepare` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Write** a `bun.lockb` lockfile to the project root.
To install in production mode (i.e. without `devDependencies` or `optionalDependencies`):
```bash
$ bun install --production
```
To install with reproducible dependencies, use `--frozen-lockfile`. If your `package.json` disagrees with `bun.lockb`, Bun will exit with an error. This is useful for production builds and CI environments.
```bash
$ bun install --frozen-lockfile
```
To perform a dry run (i.e. don't actually install anything):
```bash
$ bun install --dry-run
```
## Logging
To modify logging verbosity:
@@ -66,86 +44,59 @@ $ bun install --verbose # debug logging
$ bun install --silent # no logging
```
{% details summary="Configuring behavior" %}
The default behavior of `bun install` can be configured in `bunfig.toml`:
## Lifecycle scripts
```toml
[install]
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts like `postinstall` for installed dependencies. Executing arbitrary scripts represents a potential security risk.
# whether to install optionalDependencies
optional = true
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
# whether to install devDependencies
dev = true
# whether to install peerDependencies
peer = false
# equivalent to `--production` flag
production = false
# equivalent to `--frozen-lockfile` flag
frozenLockfile = false
# equivalent to `--dry-run` flag
dryRun = false
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
{% /details %}
Then re-install the package. Bun will read this field and run lifecycle scripts for `my-trusted-package`.
## `bun add`
## Workspaces
To add a particular package:
Bun supports `"workspaces"` in package.json. For complete documentation refer to [Package manager > Workspaces](/docs/install/workspaces).
```bash
$ bun add preact
```
To specify a version, version range, or tag:
```bash
$ bun add zod@3.20.0
$ bun add zod@^3.0.0
$ bun add zod@latest
```
To add a package as a dev dependency (`"devDependencies"`):
```bash
$ bun add --dev @types/react
$ bun add -d @types/react
```
To add a package as an optional dependency (`"optionalDependencies"`):
```bash
$ bun add --optional lodash
```
To add a package and pin to the resolved version, use `--exact`. This will resolve the version of the package and add it to your `package.json` with an exact version number instead of a version range.
```bash
$ bun add react --exact
```
This will add the following to your `package.json`:
```jsonc
```json#package.json
{
"name": "my-app",
"version": "1.0.0",
"workspaces": ["packages/*"],
"dependencies": {
// without --exact
"react": "^18.2.0", // this matches >= 18.2.0 < 19.0.0
// with --exact
"react": "18.2.0" // this matches only 18.2.0 exactly
"preact": "^10.5.13"
}
}
```
To install a package globally:
## Overrides and resolutions
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](/docs/install/overrides) for complete documentation.
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "overrides": {
+ "bar": "~4.4.0"
+ }
}
```
## Global packages
To install a package globally, use the `-g`/`--global` flag. Typically this is used for installing command-line tools.
```bash
$ bun add --global cowsay # or `bun add -g cowsay`
$ bun install --global cowsay # or `bun install -g cowsay`
$ cowsay "Bun!"
______
< Bun! >
@@ -157,159 +108,75 @@ $ cowsay "Bun!"
|| ||
```
{% details summary="Configuring global installation behavior" %}
## Production mode
```toml
[install]
# where `bun install --global` installs packages
globalDir = "~/.bun/install/global"
# where globally-installed package bins are linked
globalBinDir = "~/.bun/bin"
```
{% /details %}
To view a complete list of options for a given command:
To install in production mode (i.e. without `devDependencies` or `optionalDependencies`):
```bash
$ bun add --help
$ bun install --production
```
## `bun remove`
To remove a dependency:
For reproducible installs, use `--frozen-lockfile`. This will install the exact versions of each package specified in the lockfile. If your `package.json` disagrees with `bun.lockb`, Bun will exit with an error. The lockfile will not be updated.
```bash
$ bun remove preact
$ bun install --frozen-lockfile
```
## `bun update`
For more information on Bun's binary lockfile `bun.lockb`, refer to [Package manager > Lockfile](/docs/install/lockfile).
To update all dependencies to the latest version _that's compatible with the version range specified in your `package.json`_:
## Dry run
```sh
$ bun update
```
This will not edit your `package.json`. There's currently no command to force-update all dependencies to the latest version regardless of version ranges.
## `bun link`
Use `bun link` in a local directory to register the current package as a "linkable" package.
To perform a dry run (i.e. don't actually install anything):
```bash
$ cd /path/to/cool-pkg
$ cat package.json
{
"name": "cool-pkg",
"version": "1.0.0"
}
$ bun link
bun link v1.x (7416672e)
Success! Registered "cool-pkg"
To use cool-pkg in a project, run:
bun link cool-pkg
Or add it in dependencies in your package.json file:
"cool-pkg": "link:cool-pkg"
$ bun install --dry-run
```
This package can now be "linked" into other projects using `bun link cool-pkg`. This will create a symlink in the `node_modules` directory of the target project, pointing to the local directory.
## Non-npm dependencies
```bash
$ cd /path/to/my-app
$ bun link cool-pkg
```
Bun supports installing dependencies from Git, GitHub, and local or remotely-hosted tarballs. For complete documentation refer to [Package manager > Git, GitHub, and tarball dependencies](/docs/cli/add).
In addition, the `--save` flag can be used to add `cool-pkg` to the `dependencies` field of your app's package.json with a special version specifier that tells Bun to load from the registered local directory instead of installing from `npm`:
```json-diff
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
+ "cool-pkg": "link:cool-pkg"
}
}
```
## Trusted dependencies
Unlike other npm clients, Bun does not execute arbitrary lifecycle scripts for installed dependencies, such as `postinstall`. These scripts represent a potential security risk, as they can execute arbitrary code on your machine.
<!-- Bun maintains an allow-list of popular packages containing `postinstall` scripts that are known to be safe. To run lifecycle scripts for packages that aren't on this list, add the package to `trustedDependencies` in your package.json. -->
To tell Bun to allow lifecycle scripts for a particular package, add the package to `trustedDependencies` in your package.json.
<!-- ```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": {
+ "my-trusted-package": "*"
+ }
}
``` -->
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["my-trusted-package"]
}
```
Bun reads this field and will run lifecycle scripts for `my-trusted-package`.
<!-- If you specify a version range, Bun will only execute lifecycle scripts if the resolved package version matches the range. -->
<!--
```json
{
"name": "my-app",
"version": "1.0.0",
"trustedDependencies": {
"my-trusted-package": "^1.0.0"
}
}
``` -->
## Git dependencies
To add a dependency from a git repository:
```bash
$ bun install git@github.com:moment/moment.git
```
Bun supports a variety of protocols, including [`github`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#github-urls), [`git`](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#git-urls-as-dependencies), `git+ssh`, `git+https`, and many more.
```json
```json#package.json
{
"dependencies": {
"dayjs": "git+https://github.com/iamkun/dayjs.git",
"lodash": "git+ssh://github.com/lodash/lodash.git#4.17.21",
"moment": "git@github.com:moment/moment.git",
"zod": "github:colinhacks/zod"
"zod": "github:colinhacks/zod",
"react": "https://registry.npmjs.org/react/-/react-18.2.0.tgz"
}
}
```
## Tarball dependencies
A package name can correspond to a publicly hosted `.tgz` file. During `bun install`, Bun will download and install the package from the specified tarball URL, rather than from the package registry.
```json#package.json
{
"dependencies": {
"zod": "https://registry.npmjs.org/zod/-/zod-3.21.4.tgz"
}
}
```
## Configuration
The default behavior of `bun install` can be configured in `bunfig.toml`. The default values are shown below.
```toml
[install]
# whether to install optionalDependencies
optional = true
# whether to install devDependencies
dev = true
# whether to install peerDependencies
peer = true
# equivalent to `--production` flag
production = false
# equivalent to `--frozen-lockfile` flag
frozenLockfile = false
# equivalent to `--dry-run` flag
dryRun = false
```
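Each of these settings corresponds to a CLI flag mentioned above, so the same behavior can be selected per-invocation without editing `bunfig.toml`. A couple of examples:
```bash
# exclude devDependencies and install exactly what the lockfile specifies
$ bun install --production --frozen-lockfile

# resolve everything but don't actually install anything
$ bun install --dry-run
```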
## CI/CD
Looking to speed up your CI? Use the official [`oven-sh/setup-bun`](https://github.com/oven-sh/setup-bun) action to install `bun` in a GitHub Actions pipeline.
```yaml#.github/workflows/release.yml
name: bun-types

docs/cli/link.md Normal file

@@ -0,0 +1,46 @@
Use `bun link` in a local directory to register the current package as a "linkable" package.
```bash
$ cd /path/to/cool-pkg
$ cat package.json
{
"name": "cool-pkg",
"version": "1.0.0"
}
$ bun link
bun link v1.x (7416672e)
Success! Registered "cool-pkg"
To use cool-pkg in a project, run:
bun link cool-pkg
Or add it in dependencies in your package.json file:
"cool-pkg": "link:cool-pkg"
```
This package can now be "linked" into other projects using `bun link cool-pkg`. This will create a symlink in the `node_modules` directory of the target project, pointing to the local directory.
```bash
$ cd /path/to/my-app
$ bun link cool-pkg
```
In addition, the `--save` flag can be used to add `cool-pkg` to the `dependencies` field of your app's package.json with a special version specifier that tells Bun to load from the registered local directory instead of installing from `npm`:
```json-diff
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
+ "cool-pkg": "link:cool-pkg"
}
}
```
To _unregister_ a local package, navigate to the package's root directory and run `bun unlink`.
```bash
$ cd /path/to/cool-pkg
$ bun unlink
bun unlink v1.x (7416672e)
```

docs/cli/remove.md Normal file

@@ -0,0 +1,5 @@
To remove a dependency:
```bash
$ bun remove ts-node
```

docs/cli/update.md Normal file

@@ -0,0 +1,7 @@
To update all dependencies to the latest version _that's compatible with the version range specified in your `package.json`_:
```sh
$ bun update
```
This will not edit your `package.json`. There's currently no command to force-update all dependencies to the latest version regardless version ranges.


@@ -0,0 +1,140 @@
---
name: Containerize a Bun application with Docker
---
{% callout %}
This guide assumes you already have [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed.
{% /callout %}
[Docker](https://www.docker.com) is a platform for packaging and running an application as a lightweight, portable _container_ that encapsulates all the necessary dependencies.
---
To _containerize_ our application, we define a `Dockerfile`. This file contains a list of instructions to initialize the container, copy our local project files into it, install dependencies, and start the application.
```docker#Dockerfile
# use the official Bun image
# see all versions at https://hub.docker.com/r/oven/bun/tags
FROM oven/bun:1 as base
WORKDIR /usr/src/app
# install dependencies into temp directory
# this will cache them and speed up future builds
FROM base AS install
RUN mkdir -p /temp/dev
COPY package.json bun.lockb /temp/dev/
RUN cd /temp/dev && bun install --frozen-lockfile
# install with --production (exclude devDependencies)
RUN mkdir -p /temp/prod
COPY package.json bun.lockb /temp/prod/
RUN cd /temp/prod && bun install --frozen-lockfile --production
# copy node_modules from temp directory
# then copy all (non-ignored) project files into the image
FROM install AS prerelease
COPY --from=install /temp/dev/node_modules node_modules
COPY . .
# [optional] tests & build
ENV NODE_ENV=production
RUN bun test
RUN bun run build
# copy production dependencies and source code into final image
FROM base AS release
COPY --from=install /temp/prod/node_modules node_modules
COPY --from=prerelease /usr/src/app/index.ts .
COPY --from=prerelease /usr/src/app/package.json .
# run the app
USER bun
EXPOSE 3000/tcp
ENTRYPOINT [ "bun", "run", "index.ts" ]
```
---
Now that you have your `Dockerfile`, let's add a `.dockerignore`, which uses the same syntax as `.gitignore`. It specifies the files and directories that should be excluded from every stage of the Docker build. An example ignore file:
```txt#.dockerignore
node_modules
Dockerfile*
docker-compose*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
Makefile
helm-charts
.env
.editorconfig
.idea
coverage*
```
---
We'll now use `docker build` to convert this `Dockerfile` into a _Docker image_: a self-contained template containing all the dependencies and configuration required to run the application.
The `-t` flag lets us specify a name for the image, and `--pull` tells Docker to automatically download the latest version of the base image (`oven/bun`). The initial build will take longer, as Docker will download all the base images and dependencies.
```bash
$ docker build --pull -t bun-hello-world .
[+] Building 0.9s (21/21) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 37B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 35B 0.0s
=> [internal] load metadata for docker.io/oven/bun:1 0.8s
=> [auth] oven/bun:pull token for registry-1.docker.io 0.0s
=> [base 1/2] FROM docker.io/oven/bun:1@sha256:373265748d3cd3624cb3f3ee6004f45b1fc3edbd07a622aeeec17566d2756997 0.0s
=> [internal] load build context 0.0s
=> => transferring context: 155B 0.0s
# ...lots of commands...
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:360663f7fdcd6f11e8e94761d5592e2e4dfc8d167f034f15cd5a863d5dc093c4 0.0s
=> => naming to docker.io/library/bun-hello-world 0.0s
```
---
We've built a new _Docker image_. Now let's use that image to spin up an actual, running _container_.
We'll use `docker run` to start a new container using the `bun-hello-world` image. It will be run in _detached_ mode (`-d`) and we'll map the container's port 3000 to our local machine's port 3000 (`-p 3000:3000`).
The `run` command prints a string representing the _container ID_.
```sh
$ docker run -d -p 3000:3000 bun-hello-world
7f03e212a15ede8644379bce11a13589f563d3909a9640446c5bbefce993678d
```
---
The container is now running in the background. Visit [localhost:3000](http://localhost:3000). You should see a `Hello, World!` message.
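You can also check from the terminal with `curl` (assuming the server responds with plain text at the root path):
```bash
$ curl http://localhost:3000
Hello, World!
```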
---
To stop the container, we'll use `docker stop <container-id>`.
```sh
$ docker stop 7f03e212a15ede8644379bce11a13589f563d3909a9640446c5bbefce993678d
```
---
If you can't find the container ID, you can use `docker ps` to list all running containers.
```sh
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
7f03e212a15e bun-hello-world "bun run index.ts" 2 minutes ago Up 2 minutes 0.0.0.0:3000->3000/tcp flamboyant_cerf
```
---
That's it! Refer to the [Docker documentation](https://docs.docker.com/) for more advanced usage.


@@ -0,0 +1,54 @@
---
name: Run Bun as a daemon with PM2
---
[PM2](https://pm2.keymetrics.io/) is a popular process manager that manages and runs your applications as daemons (background processes).
It offers features like process monitoring, automatic restarts, and easy scaling. Using a process manager is common when deploying a Bun application on a cloud-hosted virtual private server (VPS), as it:
- Keeps your application running continuously.
- Ensures high availability and reliability of your application.
- Monitors and manages multiple processes with ease.
- Simplifies the deployment process.
---
You can use PM2 with Bun in two ways: as a CLI option or in a configuration file.
### With `--interpreter`
---
To start your application with PM2 and Bun as the interpreter, open your terminal and run the following command:
```bash
pm2 start --interpreter ~/.bun/bin/bun index.ts
```
---
### With a configuration file
---
Alternatively, you can create a PM2 configuration file. Create a file named `pm2.config.js` in your project directory and add the following content.
```javascript
module.exports = {
name: "app", // Name of your application
script: "index.ts", // Entry point of your application
interpreter: "~/.bun/bin/bun", // Path to the Bun interpreter
};
```
---
After saving the file, you can start your application with PM2:
```bash
pm2 start pm2.config.js
```
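Once started, PM2's usual inspection commands apply (here `app` is the name set in `pm2.config.js`):
```bash
# list managed processes and their status
pm2 list

# stream logs for the app process
pm2 logs app

# restart after deploying new code
pm2 restart app
```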
---
That's it! Your JavaScript/TypeScript web server is now running as a daemon with PM2, using Bun as the interpreter.


@@ -0,0 +1,113 @@
---
name: Run Bun as a daemon with systemd
---
[systemd](https://systemd.io) is an init system and service manager for Linux operating systems that manages the startup and control of system processes and services.
<!-- systemd provides aggressive parallelization capabilities, uses socket and D-Bus activation for starting services, offers on-demand starting of daemons, keeps track of processes using Linux control groups, maintains mount and auto mount points, and implements an elaborate transactional dependency-based service control logic. systemd supports SysV and LSB init scripts and works as a replacement for sysvinit. -->
<!-- Other parts include a logging daemon, utilities to control basic system configuration like the hostname, date, locale, maintain a list of logged-in users and running containers and virtual machines, system accounts, runtime directories and settings, and daemons to manage simple network configuration, network time synchronization, log forwarding, and name resolution. -->
---
To run a Bun application as a daemon using **systemd** you'll need to create a _service file_ in `/lib/systemd/system/`.
```sh
$ cd /lib/systemd/system
$ touch my-app.service
```
---
Here is a typical service file that runs an application on system start. You can use this as a template for your own service. Replace `YOUR_USER` with the name of the user you want to run the application as. To run as `root`, replace `YOUR_USER` with `root`, though this is generally not recommended for security reasons.
Refer to the [systemd documentation](https://www.freedesktop.org/software/systemd/man/systemd.service.html) for more information on each setting.
```ini#my-app.service
[Unit]
# describe the app
Description=My App
# start the app after the network is available
After=network.target
[Service]
# usually you'll use 'simple'
# one of https://www.freedesktop.org/software/systemd/man/systemd.service.html#Type=
Type=simple
# which user to use when starting the app
User=YOUR_USER
# path to your application's root directory
WorkingDirectory=/home/YOUR_USER/path/to/my-app
# the command to start the app
# requires absolute paths
ExecStart=/home/YOUR_USER/.bun/bin/bun run index.ts
# restart policy
# one of {no|on-success|on-failure|on-abnormal|on-watchdog|on-abort|always}
Restart=always
[Install]
# start the app automatically
WantedBy=multi-user.target
```
---
If your application starts a webserver, note that non-`root` users are not able to listen on ports 80 or 443 by default. To permanently allow Bun to listen on these ports when executed by a non-`root` user, use the following command. This step isn't necessary when running as `root`.
```bash
$ sudo setcap CAP_NET_BIND_SERVICE=+eip ~/.bun/bin/bun
```
---
With the service file configured, you can now _enable_ the service. Once enabled, it will start automatically on reboot. This requires `sudo` permissions.
```bash
$ sudo systemctl enable my-app
```
---
To start the service without rebooting, you can manually _start_ it.
```bash
$ sudo systemctl start my-app
```
---
Check the status of your application with `systemctl status`. If you've started your app successfully, you should see something like this:
```bash
$ sudo systemctl status my-app
● my-app.service - My App
Loaded: loaded (/lib/systemd/system/my-app.service; enabled; preset: enabled)
Active: active (running) since Thu 2023-10-12 11:34:08 UTC; 1h 8min ago
Main PID: 309641 (bun)
Tasks: 3 (limit: 503)
Memory: 40.9M
CPU: 1.093s
CGroup: /system.slice/my-app.service
└─309641 /home/YOUR_USER/.bun/bin/bun run /home/YOUR_USER/application/index.ts
```
---
To update the service, edit the contents of the service file, then reload the daemon.
```bash
$ sudo systemctl daemon-reload
```
---
For a complete guide on the service unit configuration, you can check [this page](https://www.freedesktop.org/software/systemd/man/systemd.service.html). Or refer to this cheatsheet of common commands:
```bash
$ sudo systemctl daemon-reload # tell systemd that some files got changed
$ sudo systemctl enable my-app # enable the app (to allow auto-start)
$ sudo systemctl disable my-app # disable the app (turns off auto-start)
$ sudo systemctl start my-app # start the app if it is stopped
$ sudo systemctl stop my-app # stop the app
$ sudo systemctl restart my-app # restart the app
```
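Since the app now runs under systemd, its stdout/stderr go to the journal by default; `journalctl` is the usual way to read them:
```bash
$ sudo journalctl -u my-app -f # follow live logs for the unit
$ sudo journalctl -u my-app -b # show logs since the last boot
```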


@@ -30,8 +30,7 @@ bun install
Start the development server with the `vite` CLI using `bunx`.
The `--bun` flag tells Bun to run Vite's CLI using `bun` instead of `node`; by default Bun respects Vite's `#!/usr/bin/env node` [shebang line](<https://en.wikipedia.org/wiki/Shebang_(Unix)>).
```bash
bunx --bun vite
```


@@ -2,7 +2,7 @@
name: Add a peer dependency
---
To add an npm package as a peer dependency, directly modify the `peerDependencies` object in your package.json. Running `bun install` will install peer dependencies by default, unless marked optional in `peerDependenciesMeta`.
```json-diff
{


@@ -47,4 +47,4 @@ Note that this only allows lifecycle scripts for the specific package listed in
---
See [Docs > Package manager > Trusted dependencies](/docs/install/lifecycle) for complete documentation of trusted dependencies.


@@ -0,0 +1,66 @@
---
name: Spawn a child process and communicate using IPC
---
Use [`Bun.spawn()`](/docs/api/spawn) to spawn a child process. When spawning a second `bun` process, you can open a direct inter-process communication (IPC) channel between the two processes.
{%callout%}
**Note** — This API is only compatible with other `bun` processes. Use `process.execPath` to get a path to the currently running `bun` executable.
{%/callout%}
```ts#parent.ts
const child = Bun.spawn(["bun", "child.ts"], {
ipc(message) {
/**
* The message received from the sub process
**/
},
});
```
---
The parent process can send messages to the subprocess using the `.send()` method on the returned `Subprocess` instance. A reference to the sending subprocess is also available as the second argument in the `ipc` handler.
```ts#parent.ts
const childProc = Bun.spawn(["bun", "child.ts"], {
ipc(message, childProc) {
/**
* The message received from the sub process
**/
childProc.send("Respond to child")
},
});
childProc.send("I am your father"); // The parent can send messages to the child as well
```
---
Meanwhile, the child process can send messages to its parent with `process.send()` and receive messages with `process.on("message")`. This is the same API used for `child_process.fork()` in Node.js.
```ts#child.ts
process.send("Hello from child as string");
process.send({ message: "Hello from child as object" });
process.on("message", (message) => {
// print message from parent
console.log(message);
});
```
---
All messages are serialized using the JSC `serialize` API, which allows for the same set of [transferrable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) supported by `postMessage` and `structuredClone`, including strings, typed arrays, streams, and objects.
```ts#child.ts
// send a string
process.send("Hello from child as string");
// send an object
process.send({ message: "Hello from child as object" });
```
---
See [Docs > API > Child processes](/docs/api/spawn) for complete documentation.


@@ -0,0 +1,52 @@
---
name: Append content to a file
---
Bun implements the `node:fs` module, which includes the `fs.appendFile` and `fs.appendFileSync` functions for appending content to files.
---
You can use `fs.appendFile` to asynchronously append data to a file, creating the file if it does not yet exist. The content can be a string or a `Buffer`.
```ts
import { appendFile } from "node:fs/promises";
await appendFile("message.txt", "data to append");
```
---
To use the non-`Promise` API:
```ts
import { appendFile } from "node:fs";
appendFile("message.txt", "data to append", err => {
if (err) throw err;
console.log('The "data to append" was appended to file!');
});
```
---
To specify the encoding of the content:
```js
import { appendFile } from "node:fs";
appendFile("message.txt", "data to append", "utf8", callback);
```
---
To append the data synchronously, use `fs.appendFileSync`:
```ts
import { appendFileSync } from "node:fs";
appendFileSync("message.txt", "data to append", "utf8");
```
---
See the [Node.js documentation](https://nodejs.org/api/fs.html#fspromisesappendfilepath-data-options) for more information.


@@ -26,7 +26,7 @@ Get started with one of the quick links below, or read on to learn more about Bu
{% arrowbutton href="/docs/installation" text="Install Bun" /%}
{% arrowbutton href="/docs/quickstart" text="Do the quickstart" /%}
{% arrowbutton href="/docs/cli/install" text="Install a package" /%}
{% arrowbutton href="/docs/templates" text="Use a project template" /%}
{% arrowbutton href="/docs/cli/bun-create" text="Use a project template" /%}
{% arrowbutton href="/docs/bundler" text="Bundle code for production" /%}
{% arrowbutton href="/docs/api/http" text="Build an HTTP server" /%}
{% arrowbutton href="/docs/api/websockets" text="Build a Websocket server" /%}
@@ -37,11 +37,14 @@ Get started with one of the quick links below, or read on to learn more about Bu
## What is a runtime?
JavaScript (or, more formally, ECMAScript) is just a _specification_ for a programming language. Anyone can write a JavaScript _engine_ that ingests a valid JavaScript program and executes it. The two most popular engines in use today are V8 (developed by Google) and JavaScriptCore (developed by Apple). Both are open source.
But most JavaScript programs don't run in a vacuum. They need a way to access the outside world to perform useful tasks. This is where _runtimes_ come in. They implement additional APIs that are then made available to the JavaScript programs they execute.
### Browsers
Notably, browsers ship with JavaScript runtimes that implement a set of Web-specific APIs that are exposed via the global `window` object. Any JavaScript code executed by the browser can use these APIs to implement interactive or dynamic behavior in the context of the current webpage.
<!-- JavaScript runtime that exposes JavaScript engines are designed to run "vanilla" JavaScript programs, but it's often JavaScript _runtimes_ use an engine internally to execute the code and implement additional APIs that are then made available to executed programs.
JavaScript was [initially designed](https://en.wikipedia.org/wiki/JavaScript) as a language to run in web browsers to implement interactivity and dynamic behavior in web pages. Browsers are the first JavaScript runtimes. JavaScript programs that are executed in browsers have access to a set of Web-specific global APIs on the `window` object. -->


@@ -39,7 +39,7 @@ On Linux, `bun install` tends to install packages 20-100x faster than `npm insta
Running `bun install` will:
- **Install** all `dependencies`, `devDependencies`, and `optionalDependencies`. Bun will install `peerDependencies` by default.
- **Run** your project's `{pre|post}install` scripts at the appropriate time. For security reasons Bun _does not execute_ lifecycle scripts of installed dependencies.
- **Write** a `bun.lockb` lockfile to the project root.
@@ -81,7 +81,7 @@ optional = true
dev = true
# whether to install peerDependencies
peer = true
# equivalent to `--production` flag
production = false

docs/install/lifecycle.md Normal file

@@ -0,0 +1,44 @@
Packages on `npm` can define _lifecycle scripts_ in their `package.json`. Some of the most common are below, but there are [many others](https://docs.npmjs.com/cli/v10/using-npm/scripts).
- `preinstall`: Runs before the package is installed
- `postinstall`: Runs after the package is installed
- `preuninstall`: Runs before the package is uninstalled
- `prepublishOnly`: Runs before the package is published
These scripts are arbitrary shell commands that the package manager is expected to read and execute at the appropriate time. But executing arbitrary scripts represents a potential security risk, so—unlike other `npm` clients—Bun does not execute arbitrary lifecycle scripts by default.
## `postinstall`
The `postinstall` script is particularly important. It's widely used to build or install platform-specific binaries for packages that are implemented as [native Node.js add-ons](https://nodejs.org/api/addons.html). For example, `node-sass` is a popular package that uses `postinstall` to build a native binary for Sass.
```json
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
"node-sass": "^6.0.1"
}
}
```
## `trustedDependencies`
Instead of executing arbitrary scripts, Bun uses a "default-secure" approach. You can add certain packages to an allow list, and Bun will execute lifecycle scripts for those packages. To tell Bun to allow lifecycle scripts for a particular package, add the package name to the `trustedDependencies` array in your `package.json`.
```json-diff
{
"name": "my-app",
"version": "1.0.0",
+ "trustedDependencies": ["node-sass"]
}
```
Once it has been added to `trustedDependencies`, install or re-install the package. Bun will read this field and run the lifecycle scripts for `node-sass`.
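A minimal sketch of that flow (if the scripts still don't run, removing and re-adding the package forces a fresh install):
```bash
# re-run the install so Bun picks up the new trustedDependencies entry
$ bun install

# or force a fresh install of the package
$ bun remove node-sass
$ bun add node-sass
```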
## `--ignore-scripts`
To disable lifecycle scripts for all packages, use the `--ignore-scripts` flag.
```bash
$ bun install --ignore-scripts
```

docs/install/overrides.md Normal file

@@ -0,0 +1,73 @@
Bun supports npm's `"overrides"` and Yarn's `"resolutions"` in `package.json`. These are mechanisms for specifying a version range for _metadependencies_—the dependencies of your dependencies. Refer to [Package manager > Overrides and resolutions](/docs/install/overrides) for complete documentation.
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "overrides": {
+ "bar": "~4.4.0"
+ }
}
```
By default, Bun will install the latest version of all dependencies and metadependencies, according to the ranges specified in each package's `package.json`. Let's say you have a project with one dependency, `foo`, which in turn has a dependency on `bar`. This means `bar` is a _metadependency_ of our project.
```json#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
}
}
```
When you run `bun install`, Bun will install the latest versions of each package.
```
# tree layout of node_modules
node_modules
├── foo@2.1.0
└── bar@4.5.6
```
But what if a security vulnerability was introduced in `bar@4.5.6`? We may want a way to pin `bar` to an older version that doesn't have the vulnerability. This is where `"overrides"`/`"resolutions"` come in.
## `"overrides"`
Add `bar` to the `"overrides"` field in `package.json`. Bun will defer to the specified version range when determining which version of `bar` to install, whether it's a dependency or a metadependency.
{% callout %}
**Note** — Bun currently only supports top-level `"overrides"`. [Nested overrides](https://docs.npmjs.com/cli/v9/configuring-npm/package-json#overrides) are not supported.
{% /callout %}
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "overrides": {
+ "bar": "~4.4.0"
+ }
}
```
## `"resolutions"`
The syntax is similar for `"resolutions"`, which is Yarn's alternative to `"overrides"`. Bun supports this feature to make migration from Yarn easier.
As with `"overrides"`, _nested resolutions_ are not currently supported.
```json-diff#package.json
{
"name": "my-app",
"dependencies": {
"foo": "^2.0.0"
},
+ "resolutions": {
+ "bar": "~4.4.0"
+ }
}
```


@@ -1,6 +1,8 @@
Bun ships as a single executable that can be installed a few different ways.
## macOS and Linux
## Installing
### macOS and Linux
{% callout %}
**Linux users** — The `unzip` package is required to install Bun. Use `sudo apt install unzip` to install the `unzip` package.
@@ -35,7 +37,7 @@ $ proto install bun
{% /codetabs %}
## Windows
### Windows
Bun provides a _limited, experimental_ native build for Windows. At the moment, only the Bun runtime is supported.
@@ -67,6 +69,59 @@ $ docker pull oven/bun:alpine
$ docker pull oven/bun:distroless
```
## Checking installation
To check that Bun was installed successfully, open a new terminal window and run `bun --version`.
```sh
$ bun --version
1.x.y
```
To see the precise commit of [oven-sh/bun](https://github.com/oven-sh/bun) that you're using, run `bun --revision`.
```sh
$ bun --revision
1.x.y+b7982ac1318937560f38e0f8eb18f45eaa43480f
```
If you've installed Bun but are seeing a `command not found` error, you may have to manually add the installation directory (`~/.bun/bin`) to your `PATH`.
{% details summary="How to add to your `PATH`" %}
First, determine what shell you're using:
```sh
$ echo $SHELL
/bin/zsh # or /bin/bash or /bin/fish
```
Then add the lines below to the bottom of your shell's configuration file.
{% codetabs %}
```bash#~/.zshrc
# add to ~/.zshrc
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```
```bash#~/.bashrc
# add to ~/.bashrc
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```
```sh#~/.config/fish/config.fish
# add to ~/.config/fish/config.fish
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```
{% /codetabs %}
Save the file. You'll need to open a new shell/terminal window for the changes to take effect.
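Alternatively, you can apply the change to the current session without opening a new window by re-sourcing the config file (zsh shown; adjust the filename for bash or fish):
```bash
$ source ~/.zshrc
```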
{% /details %}
## Upgrading
Once installed, the binary can upgrade itself.
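To get the latest release, run `bun upgrade`:
```bash
$ bun upgrade
```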


@@ -38,12 +38,13 @@ export default {
page("typescript", "TypeScript", {
description: "Install and configure type declarations for Bun's APIs",
}),
page("templates", "Templates", {
description: "Hit the ground running with one of Bun's official templates, or download a template from GitHub.",
divider("Templating"),
page("cli/init", "`bun init`", {
description: "Scaffold an empty Bun project.",
}),
page("guides", "Guides", {
description: "A set of walkthrough guides and code snippets for performing common tasks with Bun",
href: "/guides",
page("cli/bun-create", "`bun create`", {
description: "Scaffold a new Bun project from an official template or GitHub repo.",
}),
// page("typescript", "TypeScript"),
@@ -81,7 +82,6 @@ export default {
// page("bundev", "Dev server"),
// page("benchmarks", "Benchmarks"),
// divider("Runtime"),
divider("Runtime"),
page("cli/run", "`bun run`", {
description: "Use `bun run` to execute JavaScript/TypeScript files and package.json scripts.",
@@ -152,6 +152,21 @@ export default {
description:
"Install all dependencies with `bun install`, or manage dependencies with `bun add` and `bun remove`.",
}),
page("cli/add", "`bun add`", {
description: "Add dependencies to your project.",
}),
page("cli/remove", "`bun remove`", {
description: "Remove dependencies from your project.",
}),
page("cli/update", "`bun update`", {
description: "Update your project's dependencies.",
}),
page("cli/link", "`bun link`", {
description: "Install local packages as dependencies in your project.",
}),
page("cli/pm", "`bun pm`", {
description: "Utilities relating to package management with Bun.",
}),
page("install/cache", "Global cache", {
description:
"Bun's package manager installs all packages into a shared global cache to avoid redundant re-downloads.",
@@ -159,6 +174,9 @@ export default {
page("install/workspaces", "Workspaces", {
description: "Bun's package manager supports workspaces and mono-repo development workflows.",
}),
page("install/lifecycle", "Lifecycle scripts", {
description: "How Bun handles package lifecycle scripts with trustedDependencies",
}),
page("install/lockfile", "Lockfile", {
description:
"Bun's binary lockfile `bun.lockb` tracks your resolved dependency tree, making future installs fast and repeatable.",
@@ -166,9 +184,12 @@ export default {
page("install/registries", "Scopes and registries", {
description: "How to configure private scopes and custom package registries.",
}),
page("install/utilities", "Utilities", {
description: "Use `bun pm` to introspect your global module cache or project dependency tree.",
page("install/overrides", "Overrides and resolutions", {
description: "Specify version ranges for nested dependencies",
}),
// page("install/utilities", "Utilities", {
// description: "Use `bun pm` to introspect your global module cache or project dependency tree.",
// }),
divider("Bundler"),
page("bundler", "`Bun.build`", {
@@ -334,7 +355,7 @@ export default {
page("project/benchmarking", "Benchmarking", {
description: `Bun is designed for performance. Learn how to benchmark Bun yourself.`,
}),
page("project/development", "Development", {
page("project/contributing", "Contributing", {
description: "Learn how to contribute to Bun and get your local development environment up and running.",
}),
page("project/licensing", "License", {


@@ -130,11 +130,11 @@ Zig can be installed either with our npm package [`@oven/zig`](https://www.npmjs
```bash
$ bun install -g @oven/zig
$ zigup 0.12.0-dev.899+027aabf49
```
{% callout %}
We last updated Zig on **October 12th, 2023**
{% /callout %}
## First Build
@@ -392,6 +392,26 @@ $ bun install
$ make cpp
```
## Building WebKit locally + Debug mode of JSC
WebKit is not cloned by default (to save time and disk space). To clone and build WebKit locally, run:
```bash
# once you run this, `make submodule` can be used to automatically
# update WebKit and the other submodules
$ git submodule update --init --depth 1 --checkout src/bun.js/WebKit
# to make a jsc release build
$ make jsc
# JSC debug build does not work perfectly with Bun yet, this is actively being
# worked on and will eventually become the default.
$ make jsc-build-linux-compile-debug cpp
$ make jsc-build-mac-compile-debug cpp
```
Note that the WebKit folder, including build artifacts, is 8GB+ in size.
If you are using a JSC debug build with VS Code, run the `C/C++: Select a Configuration` command so that IntelliSense can find the debug headers.
## Troubleshooting
### 'span' file not found on Ubuntu


@@ -42,7 +42,10 @@ const server = Bun.serve({
console.log(`Listening on http://localhost:${server.port} ...`);
```
{% details summary="Seeing TypeScript errors on `Bun`?" %}
If you used `bun init`, Bun will have automatically installed Bun's TypeScript declarations and configured your `tsconfig.json`. If you're trying out Bun in an existing project, you may see a type error on the `Bun` global.
To fix this, first install `bun-types` as a dev dependency.
```sh
$ bun add -d bun-types
@@ -58,6 +61,8 @@ Then add the following line to your `compilerOptions` in `tsconfig.json`.
}
```
{% /details %}
Run the file from your shell.
```bash


@@ -209,11 +209,11 @@ dev = true
### `install.peer`
Whether to install peer dependencies. Default `true`.
```toml
[install]
peer = true
```
### `install.production`


@@ -30,7 +30,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:crypto`](https://nodejs.org/api/crypto.html)
🟡 Missing `Certificate` `ECDH` `X509Certificate` `checkPrime` `checkPrimeSync` `diffieHellman` `generatePrime` `generatePrimeSync` `getCipherInfo` `getFips` `hkdf` `hkdfSync` `secureHeapUsed` `setEngine` `setFips`
Some methods are not optimized yet.


@@ -15,8 +15,8 @@ for (let key of Object.keys(json).sort()) {
}
const withExtensions = [
...new Set(
Object.keys(json)
.filter(key => {
return !!json[key]?.extensions?.length;
})
@@ -26,7 +26,7 @@ const withExtensions = [
});
})
.sort(),
),
];
all += "\n";


@@ -27,7 +27,7 @@ pub fn main() anyerror!void {
var args = std.mem.bytesAsSlice([]u8, try std.process.argsAlloc(allocator));
const to_resolve = args[args.len - 1];
const cwd = try bun.getcwdAlloc(allocator);
var path: []u8 = undefined;
var out_buffer: [bun.MAX_PATH_BYTES]u8 = undefined;


@@ -47,7 +47,7 @@ pub fn main() anyerror!void {
bun.asByteSlice(args[args.len - 1]),
};
const tarball_path = path_handler.joinAbsStringBuf(try bun.getcwdAlloc(std.heap.c_allocator), &tarball_path_buf, &parts, .auto);
Output.prettyErrorln("Tarball Path: {s}", .{tarball_path});
var folder = basename;


@@ -1,4 +1,5 @@
{
"name": "bun",
"dependencies": {
"@vscode/debugadapter": "^1.61.0",
"esbuild": "^0.17.15",
@@ -14,9 +15,11 @@
},
"private": true,
"scripts": {
"build": "cmake . -DCMAKE_BUILD_TYPE=Debug -GNinja -Bbuild && ninja -Cbuild",
"build:release": "cmake . -DCMAKE_BUILD_TYPE=Release -GNinja -Bbuild-release && ninja -Cbuild-release",
"build-runtime": "esbuild --target=esnext --bundle src/runtime/index.ts --format=iife --platform=browser --global-name=BUN_RUNTIME > src/runtime.out.js; cat src/runtime.footer.js >> src/runtime.out.js",
"build-fallback": "esbuild --target=esnext --bundle src/fallback.ts --format=iife --platform=browser --minify > src/fallback.out.js",
"postinstall": "bash .scripts/postinstall.sh",
"postinstall_not_anymore": "bash .scripts/postinstall.sh",
"typecheck": "tsc --noEmit && cd test && bun run typecheck",
"fmt": "prettier --write --cache './{src,test,bench,packages/{bun-types,bun-inspector-*,bun-vscode,bun-debug-adapter-protocol}}/**/*.{mjs,ts,tsx,js,jsx}'",
"lint": "eslint './**/*.d.ts' --cache",
@@ -25,8 +28,7 @@
"devDependencies": {
"@types/react": "^18.0.25",
"@typescript-eslint/eslint-plugin": "^5.31.0",
"@typescript-eslint/parser": "^5.31.0",
"bun-webkit": "0.0.1-2c4d07c9499a65f36f554dc4cd4e9e0640632b8c"
"@typescript-eslint/parser": "^5.31.0"
},
"version": "0.0.0",
"prettier": "./.prettierrc.cjs"


@@ -5,7 +5,7 @@ import { spawnSync } from "node:child_process";
run().catch(console.error);
async function run() {
const cwd = new URL("../protocol/", import.meta.url);
const cwd = new URL("../src/protocol/", import.meta.url);
const runner = "Bun" in globalThis ? "bunx" : "npx";
const write = (name: string, data: string) => {
const path = new URL(name, cwd);


@@ -638,7 +638,7 @@ export namespace DAP {
*/
export type BreakpointLocationsRequest = {
/**
* The source location of the breakpoints; either `source.path` or `source.sourceReference` must be specified.
*/
source: Source;
/**
@@ -1139,6 +1139,12 @@ export namespace DAP {
* The value should be less than or equal to 2147483647 (2^31-1).
*/
indexedVariables?: number;
/**
* A memory reference to a location appropriate for this result.
* For pointer type eval results, this is generally a reference to the memory address contained in the pointer.
* This attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true.
*/
memoryReference?: string;
};
/**
* Arguments for `source` request.
@@ -1286,7 +1292,7 @@ export namespace DAP {
/**
* A memory reference to a location appropriate for this result.
* For pointer type eval results, this is generally a reference to the memory address contained in the pointer.
* This attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true.
*/
memoryReference?: string;
};
@@ -1344,6 +1350,12 @@ export namespace DAP {
* The value should be less than or equal to 2147483647 (2^31-1).
*/
indexedVariables?: number;
/**
* A memory reference to a location appropriate for this result.
* For pointer type eval results, this is generally a reference to the memory address contained in the pointer.
* This attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true.
*/
memoryReference?: string;
};
/**
* Arguments for `stepInTargets` request.
@@ -2064,8 +2076,10 @@ export namespace DAP {
*/
indexedVariables?: number;
/**
* A memory reference associated with this variable.
* For pointer type variables, this is generally a reference to the memory address contained in the pointer.
* For executable data, this reference may later be used in a `disassemble` request.
* This attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true.
*/
memoryReference?: string;
};


@@ -108,7 +108,7 @@
{ "$ref": "#/definitions/Request" },
{
"type": "object",
"description": "The `cancel` request is used by the client in two situations:\n- to indicate that it is no longer interested in the result produced by a specific request issued earlier\n- to cancel a progress sequence. Clients should only call this request if the corresponding capability `supportsCancelRequest` is true.\nThis request has a hint characteristic: a debug adapter can only be expected to make a 'best effort' in honoring this request but there are no guarantees.\nThe `cancel` request may return an error if it could not cancel an operation but a client should refrain from presenting this error to end users.\nThe request that got cancelled still needs to send a response back. This can either be a normal result (`success` attribute true) or an error response (`success` attribute false and the `message` set to `cancelled`).\nReturning partial results from a cancelled request is possible but please note that a client has no generic way for detecting that a response is partial or not.\nThe progress that got cancelled still needs to send a `progressEnd` event back.\n A client should not assume that progress just got cancelled after sending the `cancel` request.",
"description": "The `cancel` request is used by the client in two situations:\n- to indicate that it is no longer interested in the result produced by a specific request issued earlier\n- to cancel a progress sequence.\nClients should only call this request if the corresponding capability `supportsCancelRequest` is true.\nThis request has a hint characteristic: a debug adapter can only be expected to make a 'best effort' in honoring this request but there are no guarantees.\nThe `cancel` request may return an error if it could not cancel an operation but a client should refrain from presenting this error to end users.\nThe request that got cancelled still needs to send a response back. This can either be a normal result (`success` attribute true) or an error response (`success` attribute false and the `message` set to `cancelled`).\nReturning partial results from a cancelled request is possible but please note that a client has no generic way for detecting that a response is partial or not.\nThe progress that got cancelled still needs to send a `progressEnd` event back.\n A client should not assume that progress just got cancelled after sending the `cancel` request.",
"properties": {
"command": { "type": "string", "enum": ["cancel"] },
"arguments": { "$ref": "#/definitions/CancelArguments" }
@@ -1074,7 +1074,7 @@
"properties": {
"source": {
"$ref": "#/definitions/Source",
"description": "The source location of the breakpoints; either `source.path` or `source.reference` must be specified."
"description": "The source location of the breakpoints; either `source.path` or `source.sourceReference` must be specified."
},
"line": {
"type": "integer",
@@ -2035,6 +2035,10 @@
"indexedVariables": {
"type": "integer",
"description": "The number of indexed child variables.\nThe client can use this information to present the variables in a paged UI and fetch them in chunks.\nThe value should be less than or equal to 2147483647 (2^31-1)."
},
"memoryReference": {
"type": "string",
"description": "A memory reference to a location appropriate for this result.\nFor pointer type eval results, this is generally a reference to the memory address contained in the pointer.\nThis attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true."
}
},
"required": ["value"]
@@ -2326,7 +2330,7 @@
},
"memoryReference": {
"type": "string",
"description": "A memory reference to a location appropriate for this result.\nFor pointer type eval results, this is generally a reference to the memory address contained in the pointer.\nThis attribute should be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true."
"description": "A memory reference to a location appropriate for this result.\nFor pointer type eval results, this is generally a reference to the memory address contained in the pointer.\nThis attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true."
}
},
"required": ["result", "variablesReference"]
@@ -2397,6 +2401,10 @@
"indexedVariables": {
"type": "integer",
"description": "The number of indexed child variables.\nThe client can use this information to present the variables in a paged UI and fetch them in chunks.\nThe value should be less than or equal to 2147483647 (2^31-1)."
},
"memoryReference": {
"type": "string",
"description": "A memory reference to a location appropriate for this result.\nFor pointer type eval results, this is generally a reference to the memory address contained in the pointer.\nThis attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true."
}
},
"required": ["value"]
@@ -3240,7 +3248,7 @@
},
"memoryReference": {
"type": "string",
"description": "The memory reference for the variable if the variable represents executable code, such as a function pointer.\nThis attribute is only required if the corresponding capability `supportsMemoryReferences` is true."
"description": "A memory reference associated with this variable.\nFor pointer type variables, this is generally a reference to the memory address contained in the pointer.\nFor executable data, this reference may later be used in a `disassemble` request.\nThis attribute may be returned by a debug adapter if corresponding capability `supportsMemoryReferences` is true."
}
},
"required": ["name", "value", "variablesReference"]
@@ -3299,8 +3307,8 @@
"Indicates that the object is a constant.",
"Indicates that the object is read only.",
"Indicates that the object is a raw string.",
"Indicates that the object can have an Object ID created for it.",
"Indicates that the object has an Object ID associated with it.",
"Indicates that the object can have an Object ID created for it. This is a vestigial attribute that is used by some clients; 'Object ID's are not specified in the protocol.",
"Indicates that the object has an Object ID associated with it. This is a vestigial attribute that is used by some clients; 'Object ID's are not specified in the protocol.",
"Indicates that the evaluation had side effects.",
"Indicates that the object has its value tracked by a data breakpoint."
]


@@ -5,7 +5,7 @@ import { spawnSync } from "node:child_process";
run().catch(console.error);
async function run() {
const cwd = new URL("../protocol/", import.meta.url);
const cwd = new URL("../src/protocol/", import.meta.url);
const runner = "Bun" in globalThis ? "bunx" : "npx";
const write = (name: string, data: string) => {
const path = new URL(name, cwd);


@@ -1136,7 +1136,8 @@ export namespace V8 {
| "Canceled"
| "RpPageNotVisible"
| "SilentMediationFailure"
| "ThirdPartyCookiesBlocked";
| "ThirdPartyCookiesBlocked"
| "NotSignedInWithIdp";
export type FederatedAuthUserInfoRequestIssueDetails = {
federatedAuthUserInfoRequestIssueReason: FederatedAuthUserInfoRequestIssueReason;
};
@@ -1192,6 +1193,25 @@ export namespace V8 {
*/
failedRequestInfo?: FailedRequestInfo | undefined;
};
export type PropertyRuleIssueReason = "InvalidSyntax" | "InvalidInitialValue" | "InvalidInherits" | "InvalidName";
/**
* This issue warns about errors in property rules that lead to property
* registrations being ignored.
*/
export type PropertyRuleIssueDetails = {
/**
* Source code position of the property rule.
*/
sourceCodeLocation: SourceCodeLocation;
/**
* Reason why the property rule was discarded.
*/
propertyRuleIssueReason: PropertyRuleIssueReason;
/**
* The value of the property rule property that failed to parse
*/
propertyValue?: string | undefined;
};
/**
* A unique identifier for the type of issue. Each type may use one of the
* optional fields in InspectorIssueDetails to convey more specific
@@ -1215,7 +1235,8 @@ export namespace V8 {
| "FederatedAuthRequestIssue"
| "BounceTrackingIssue"
| "StylesheetLoadingIssue"
| "FederatedAuthUserInfoRequestIssue";
| "FederatedAuthUserInfoRequestIssue"
| "PropertyRuleIssue";
/**
* This struct holds a list of optional fields with additional information
* specific to the kind of issue. When adding a new issue code, please also
@@ -1239,6 +1260,7 @@ export namespace V8 {
federatedAuthRequestIssueDetails?: FederatedAuthRequestIssueDetails | undefined;
bounceTrackingIssueDetails?: BounceTrackingIssueDetails | undefined;
stylesheetLoadingIssueDetails?: StylesheetLoadingIssueDetails | undefined;
propertyRuleIssueDetails?: PropertyRuleIssueDetails | undefined;
federatedAuthUserInfoRequestIssueDetails?: FederatedAuthUserInfoRequestIssueDetails | undefined;
};
/**
@@ -1390,16 +1412,82 @@ export namespace V8 {
*/
name: string;
/**
* address field value, for example Jon Doe.
*/
value: string;
};
/**
* A list of address fields.
*/
export type AddressFields = {
fields: AddressField[];
};
export type Address = {
/**
* fields and values defining an address.
*/
fields: AddressField[];
};
/**
* Defines how an address can be displayed like in chrome://settings/addresses.
* Address UI is a two dimensional array, each inner array is an "address information line", and when rendered in a UI surface should be displayed as such.
* The following address UI for instance:
* [[{name: "GIVE_NAME", value: "Jon"}, {name: "FAMILY_NAME", value: "Doe"}], [{name: "CITY", value: "Munich"}, {name: "ZIP", value: "81456"}]]
* should allow the receiver to render:
* Jon Doe
* Munich 81456
*/
export type AddressUI = {
/**
* A two dimension array containing the repesentation of values from an address profile.
*/
addressFields: AddressFields[];
};
/**
* Specified whether a filled field was done so by using the html autocomplete attribute or autofill heuristics.
*/
export type FillingStrategy = "autocompleteAttribute" | "autofillInferred";
export type FilledField = {
/**
* The type of the field, e.g text, password etc.
*/
htmlType: string;
/**
* the html id
*/
id: string;
/**
* the html name
*/
name: string;
/**
* the field value
*/
value: string;
/**
* The actual field type, e.g FAMILY_NAME
*/
autofillType: string;
/**
* The filling strategy
*/
fillingStrategy: FillingStrategy;
};
/**
* Emitted when an address form is filled.
* @event `Autofill.addressFormFilled`
*/
export type AddressFormFilledEvent = {
/**
* Information about the fields that were filled
*/
filledFields: FilledField[];
/**
* An UI representation of the address used to fill the form.
* Consists of a 2D array where each child represents an address/profile line.
*/
addressUi: AddressUI;
};
/**
* Trigger autofill on a form identified by the fieldId.
* If the field and related form cannot be autofilled, returns an error.
@@ -1437,6 +1525,26 @@ export namespace V8 {
* @response `Autofill.setAddresses`
*/
export type SetAddressesResponse = {};
/**
* Disables autofill domain notifications.
* @request `Autofill.disable`
*/
export type DisableRequest = {};
/**
* Disables autofill domain notifications.
* @response `Autofill.disable`
*/
export type DisableResponse = {};
/**
* Enables autofill domain notifications.
* @request `Autofill.enable`
*/
export type EnableRequest = {};
/**
* Enables autofill domain notifications.
* @response `Autofill.enable`
*/
export type EnableResponse = {};
}
export namespace BackgroundService {
/**
@@ -3573,6 +3681,25 @@ export namespace V8 {
* @response `CSS.setEffectivePropertyValueForNode`
*/
export type SetEffectivePropertyValueForNodeResponse = {};
/**
* Modifies the property rule property name.
* @request `CSS.setPropertyRulePropertyName`
*/
export type SetPropertyRulePropertyNameRequest = {
styleSheetId: StyleSheetId;
range: SourceRange;
propertyName: string;
};
/**
* Modifies the property rule property name.
* @response `CSS.setPropertyRulePropertyName`
*/
export type SetPropertyRulePropertyNameResponse = {
/**
* The resulting key text after modification.
*/
propertyName: Value;
};
/**
* Modifies the keyframe rule key text.
* @request `CSS.setKeyframeKey`
@@ -7168,6 +7295,16 @@ export namespace V8 {
* @response `EventBreakpoints.removeInstrumentationBreakpoint`
*/
export type RemoveInstrumentationBreakpointResponse = {};
/**
* Removes all breakpoints
* @request `EventBreakpoints.disable`
*/
export type DisableRequest = {};
/**
* Removes all breakpoints
* @response `EventBreakpoints.disable`
*/
export type DisableResponse = {};
}
export namespace FedCm {
/**
@@ -7178,7 +7315,7 @@ export namespace V8 {
/**
* Whether the dialog shown is an account chooser or an auto re-authentication dialog.
*/
export type DialogType = "AccountChooser" | "AutoReauthn" | "ConfirmIdpSignin";
export type DialogType = "AccountChooser" | "AutoReauthn" | "ConfirmIdpLogin";
/**
* Corresponds to IdentityRequestAccount
*/
@@ -7189,7 +7326,7 @@ export namespace V8 {
givenName: string;
pictureUrl: string;
idpConfigUrl: string;
idpLoginUrl: string;
loginState: LoginState;
/**
* These two are only set if the loginState is signUp
@@ -7252,6 +7389,20 @@ export namespace V8 {
* @response `FedCm.selectAccount`
*/
export type SelectAccountResponse = {};
/**
* Only valid if the dialog type is ConfirmIdpLogin. Acts as if the user had
* clicked the continue button.
* @request `FedCm.confirmIdpLogin`
*/
export type ConfirmIdpLoginRequest = {
dialogId: string;
};
/**
* Only valid if the dialog type is ConfirmIdpLogin. Acts as if the user had
* clicked the continue button.
* @response `FedCm.confirmIdpLogin`
*/
export type ConfirmIdpLoginResponse = {};
/**
* undefined
* @request `FedCm.dismissDialog`
@@ -10477,6 +10628,7 @@ export namespace V8 {
| "ch-ect"
| "ch-prefers-color-scheme"
| "ch-prefers-reduced-motion"
| "ch-prefers-reduced-transparency"
| "ch-rtt"
| "ch-save-data"
| "ch-ua"
@@ -13109,7 +13261,6 @@ export namespace V8 {
| "LowEndDevice"
| "InvalidSchemeRedirect"
| "InvalidSchemeNavigation"
| "InProgressNavigation"
| "NavigationRequestBlockedByCsp"
| "MainFrameNavigation"
| "MojoBinderPolicy"
@@ -13121,7 +13272,6 @@ export namespace V8 {
| "NavigationBadHttpStatus"
| "ClientCertRequested"
| "NavigationRequestNetworkError"
| "MaxNumOfRunningPrerendersExceeded"
| "CancelAllHostsForTesting"
| "DidFailLoad"
| "Stop"
@@ -13133,9 +13283,8 @@ export namespace V8 {
| "MixedContent"
| "TriggerBackgrounded"
| "MemoryLimitExceeded"
| "FailToGetMemoryUsage"
| "DataSaverEnabled"
| "HasEffectiveUrl"
| "TriggerUrlHasEffectiveUrl"
| "ActivatedBeforeStarted"
| "InactivePageRestriction"
| "StartFailed"
@@ -13166,7 +13315,13 @@ export namespace V8 {
| "PrerenderingDisabledByDevTools"
| "ResourceLoadBlockedByClient"
| "SpeculationRuleRemoved"
| "ActivatedWithAuxiliaryBrowsingContexts";
| "ActivatedWithAuxiliaryBrowsingContexts"
| "MaxNumOfRunningEagerPrerendersExceeded"
| "MaxNumOfRunningNonEagerPrerendersExceeded"
| "MaxNumOfRunningEmbedderPrerendersExceeded"
| "PrerenderingUrlHasEffectiveUrl"
| "RedirectedPrerenderingUrlHasEffectiveUrl"
| "ActivationUrlHasEffectiveUrl";
/**
* Preloading status values, see also PreloadingTriggeringOutcome. This
* status is shared by prefetchStatusUpdated and prerenderStatusUpdated.
@@ -13221,24 +13376,6 @@ export namespace V8 {
export type RuleSetRemovedEvent = {
id: RuleSetId;
};
/**
* Fired when a prerender attempt is completed.
* @event `Preload.prerenderAttemptCompleted`
*/
export type PrerenderAttemptCompletedEvent = {
key: PreloadingAttemptKey;
/**
* The frame id of the frame initiating prerendering.
*/
initiatingFrameId: Page.FrameId;
prerenderingUrl: string;
finalStatus: PrerenderFinalStatus;
/**
* This is used to give users more information about the name of the API call
* that is incompatible with prerender and has caused the cancellation of the attempt
*/
disallowedApiMethod?: string | undefined;
};
/**
* Fired when a preload enabled state is updated.
* @event `Preload.preloadEnabledStateUpdated`
@@ -13935,12 +14072,21 @@ export namespace V8 {
/**
* Enum of interest group access types.
*/
export type InterestGroupAccessType = "join" | "leave" | "update" | "loaded" | "bid" | "win";
export type InterestGroupAccessType =
| "join"
| "leave"
| "update"
| "loaded"
| "bid"
| "win"
| "additionalBid"
| "additionalBidWin"
| "clear";
/**
* Ad advertising element inside an interest group.
*/
export type InterestGroupAd = {
renderURL: string;
metadata?: string | undefined;
};
/**
@@ -13951,10 +14097,10 @@ export namespace V8 {
name: string;
expirationTime: Network.TimeSinceEpoch;
joiningOrigin: string;
biddingLogicURL?: string | undefined;
biddingWasmHelperURL?: string | undefined;
updateURL?: string | undefined;
trustedBiddingSignalsURL?: string | undefined;
trustedBiddingSignalsKeys: string[];
userBiddingSignals?: string | undefined;
ads: InterestGroupAd[];
@@ -14099,20 +14245,27 @@ export namespace V8 {
key: string;
value: UnsignedInt128AsBase16;
};
export type AttributionReportingEventReportWindows = {
/**
* duration in seconds
*/
start: number;
/**
* duration in seconds
*/
ends: number[];
};
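For orientation, an illustrative value matching the new shape (all durations in seconds; the variable name is ours, not part of the generated types):

```ts
// Illustrative only: the report window opens immediately and closes
// once after one hour and again after one day.
const eventReportWindows: { start: number; ends: number[] } = {
  start: 0,
  ends: [60 * 60, 24 * 60 * 60],
};
```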
export type AttributionReportingSourceRegistration = {
time: Network.TimeSinceEpoch;
/**
* duration in seconds
*/
expiry?: number | undefined;
expiry: number;
eventReportWindows: AttributionReportingEventReportWindows;
/**
* duration in seconds
*/
eventReportWindow?: number | undefined;
/**
* duration in seconds
*/
aggregatableReportWindow?: number | undefined;
aggregatableReportWindow: number;
type: AttributionReportingSourceType;
sourceOrigin: string;
reportingOrigin: string;
@@ -16381,6 +16534,7 @@ export namespace V8 {
"Animation.animationCreated": Animation.AnimationCreatedEvent;
"Animation.animationStarted": Animation.AnimationStartedEvent;
"Audits.issueAdded": Audits.IssueAddedEvent;
"Autofill.addressFormFilled": Autofill.AddressFormFilledEvent;
"BackgroundService.recordingStateChanged": BackgroundService.RecordingStateChangedEvent;
"BackgroundService.backgroundServiceEventReceived": BackgroundService.BackgroundServiceEventReceivedEvent;
"Browser.downloadWillBegin": Browser.DownloadWillBeginEvent;
@@ -16460,7 +16614,6 @@ export namespace V8 {
"PerformanceTimeline.timelineEventAdded": PerformanceTimeline.TimelineEventAddedEvent;
"Preload.ruleSetUpdated": Preload.RuleSetUpdatedEvent;
"Preload.ruleSetRemoved": Preload.RuleSetRemovedEvent;
"Preload.prerenderAttemptCompleted": Preload.PrerenderAttemptCompletedEvent;
"Preload.preloadEnabledStateUpdated": Preload.PreloadEnabledStateUpdatedEvent;
"Preload.prefetchStatusUpdated": Preload.PrefetchStatusUpdatedEvent;
"Preload.prerenderStatusUpdated": Preload.PrerenderStatusUpdatedEvent;
@@ -16533,6 +16686,8 @@ export namespace V8 {
"Audits.checkFormsIssues": Audits.CheckFormsIssuesRequest;
"Autofill.trigger": Autofill.TriggerRequest;
"Autofill.setAddresses": Autofill.SetAddressesRequest;
"Autofill.disable": Autofill.DisableRequest;
"Autofill.enable": Autofill.EnableRequest;
"BackgroundService.startObserving": BackgroundService.StartObservingRequest;
"BackgroundService.stopObserving": BackgroundService.StopObservingRequest;
"BackgroundService.setRecording": BackgroundService.SetRecordingRequest;
@@ -16583,6 +16738,7 @@ export namespace V8 {
"CSS.trackComputedStyleUpdates": CSS.TrackComputedStyleUpdatesRequest;
"CSS.takeComputedStyleUpdates": CSS.TakeComputedStyleUpdatesRequest;
"CSS.setEffectivePropertyValueForNode": CSS.SetEffectivePropertyValueForNodeRequest;
"CSS.setPropertyRulePropertyName": CSS.SetPropertyRulePropertyNameRequest;
"CSS.setKeyframeKey": CSS.SetKeyframeKeyRequest;
"CSS.setMediaText": CSS.SetMediaTextRequest;
"CSS.setContainerQueryText": CSS.SetContainerQueryTextRequest;
@@ -16705,9 +16861,11 @@ export namespace V8 {
"Emulation.setAutomationOverride": Emulation.SetAutomationOverrideRequest;
"EventBreakpoints.setInstrumentationBreakpoint": EventBreakpoints.SetInstrumentationBreakpointRequest;
"EventBreakpoints.removeInstrumentationBreakpoint": EventBreakpoints.RemoveInstrumentationBreakpointRequest;
"EventBreakpoints.disable": EventBreakpoints.DisableRequest;
"FedCm.enable": FedCm.EnableRequest;
"FedCm.disable": FedCm.DisableRequest;
"FedCm.selectAccount": FedCm.SelectAccountRequest;
"FedCm.confirmIdpLogin": FedCm.ConfirmIdpLoginRequest;
"FedCm.dismissDialog": FedCm.DismissDialogRequest;
"FedCm.resetCooldown": FedCm.ResetCooldownRequest;
"Fetch.disable": Fetch.DisableRequest;
@@ -16979,6 +17137,8 @@ export namespace V8 {
"Audits.checkFormsIssues": Audits.CheckFormsIssuesResponse;
"Autofill.trigger": Autofill.TriggerResponse;
"Autofill.setAddresses": Autofill.SetAddressesResponse;
"Autofill.disable": Autofill.DisableResponse;
"Autofill.enable": Autofill.EnableResponse;
"BackgroundService.startObserving": BackgroundService.StartObservingResponse;
"BackgroundService.stopObserving": BackgroundService.StopObservingResponse;
"BackgroundService.setRecording": BackgroundService.SetRecordingResponse;
@@ -17029,6 +17189,7 @@ export namespace V8 {
"CSS.trackComputedStyleUpdates": CSS.TrackComputedStyleUpdatesResponse;
"CSS.takeComputedStyleUpdates": CSS.TakeComputedStyleUpdatesResponse;
"CSS.setEffectivePropertyValueForNode": CSS.SetEffectivePropertyValueForNodeResponse;
"CSS.setPropertyRulePropertyName": CSS.SetPropertyRulePropertyNameResponse;
"CSS.setKeyframeKey": CSS.SetKeyframeKeyResponse;
"CSS.setMediaText": CSS.SetMediaTextResponse;
"CSS.setContainerQueryText": CSS.SetContainerQueryTextResponse;
@@ -17151,9 +17312,11 @@ export namespace V8 {
"Emulation.setAutomationOverride": Emulation.SetAutomationOverrideResponse;
"EventBreakpoints.setInstrumentationBreakpoint": EventBreakpoints.SetInstrumentationBreakpointResponse;
"EventBreakpoints.removeInstrumentationBreakpoint": EventBreakpoints.RemoveInstrumentationBreakpointResponse;
"EventBreakpoints.disable": EventBreakpoints.DisableResponse;
"FedCm.enable": FedCm.EnableResponse;
"FedCm.disable": FedCm.DisableResponse;
"FedCm.selectAccount": FedCm.SelectAccountResponse;
"FedCm.confirmIdpLogin": FedCm.ConfirmIdpLoginResponse;
"FedCm.dismissDialog": FedCm.DismissDialogResponse;
"FedCm.resetCooldown": FedCm.ResetCooldownResponse;
"Fetch.disable": Fetch.DisableResponse;

View File

@@ -1091,7 +1091,8 @@
"Canceled",
"RpPageNotVisible",
"SilentMediationFailure",
"ThirdPartyCookiesBlocked"
"ThirdPartyCookiesBlocked",
"NotSignedInWithIdp"
]
},
{
@@ -1163,6 +1164,34 @@
}
]
},
{
"id": "PropertyRuleIssueReason",
"type": "string",
"enum": ["InvalidSyntax", "InvalidInitialValue", "InvalidInherits", "InvalidName"]
},
{
"id": "PropertyRuleIssueDetails",
"description": "This issue warns about errors in property rules that lead to property\nregistrations being ignored.",
"type": "object",
"properties": [
{
"name": "sourceCodeLocation",
"description": "Source code position of the property rule.",
"$ref": "SourceCodeLocation"
},
{
"name": "propertyRuleIssueReason",
"description": "Reason why the property rule was discarded.",
"$ref": "PropertyRuleIssueReason"
},
{
"name": "propertyValue",
"description": "The value of the property rule property that failed to parse",
"optional": true,
"type": "string"
}
]
},
{
"id": "InspectorIssueCode",
"description": "A unique identifier for the type of issue. Each type may use one of the\noptional fields in InspectorIssueDetails to convey more specific\ninformation about the kind of issue.",
@@ -1185,7 +1214,8 @@
"FederatedAuthRequestIssue",
"BounceTrackingIssue",
"StylesheetLoadingIssue",
"FederatedAuthUserInfoRequestIssue"
"FederatedAuthUserInfoRequestIssue",
"PropertyRuleIssue"
]
},
{
@@ -1227,6 +1257,7 @@
},
{ "name": "bounceTrackingIssueDetails", "optional": true, "$ref": "BounceTrackingIssueDetails" },
{ "name": "stylesheetLoadingIssueDetails", "optional": true, "$ref": "StylesheetLoadingIssueDetails" },
{ "name": "propertyRuleIssueDetails", "optional": true, "$ref": "PropertyRuleIssueDetails" },
{
"name": "federatedAuthUserInfoRequestIssueDetails",
"optional": true,
@@ -1344,20 +1375,76 @@
"type": "object",
"properties": [
{ "name": "name", "description": "address field name, for example GIVEN_NAME.", "type": "string" },
{ "name": "value", "description": "address field name, for example Jon Doe.", "type": "string" }
{ "name": "value", "description": "address field value, for example Jon Doe.", "type": "string" }
]
},
{
"id": "AddressFields",
"description": "A list of address fields.",
"type": "object",
"properties": [{ "name": "fields", "type": "array", "items": { "$ref": "AddressField" } }]
},
{
"id": "Address",
"type": "object",
"properties": [
{
"name": "fields",
"description": "fields and values defining a test address.",
"description": "fields and values defining an address.",
"type": "array",
"items": { "$ref": "AddressField" }
}
]
},
{
"id": "AddressUI",
"description": "Defines how an address can be displayed like in chrome://settings/addresses.\nAddress UI is a two dimensional array, each inner array is an \"address information line\", and when rendered in a UI surface should be displayed as such.\nThe following address UI for instance:\n[[{name: \"GIVE_NAME\", value: \"Jon\"}, {name: \"FAMILY_NAME\", value: \"Doe\"}], [{name: \"CITY\", value: \"Munich\"}, {name: \"ZIP\", value: \"81456\"}]]\nshould allow the receiver to render:\nJon Doe\nMunich 81456",
"type": "object",
"properties": [
{
"name": "addressFields",
"description": "A two dimension array containing the repesentation of values from an address profile.",
"type": "array",
"items": { "$ref": "AddressFields" }
}
]
},
{
"id": "FillingStrategy",
"description": "Specified whether a filled field was done so by using the html autocomplete attribute or autofill heuristics.",
"type": "string",
"enum": ["autocompleteAttribute", "autofillInferred"]
},
{
"id": "FilledField",
"type": "object",
"properties": [
{ "name": "htmlType", "description": "The type of the field, e.g text, password etc.", "type": "string" },
{ "name": "id", "description": "the html id", "type": "string" },
{ "name": "name", "description": "the html name", "type": "string" },
{ "name": "value", "description": "the field value", "type": "string" },
{ "name": "autofillType", "description": "The actual field type, e.g FAMILY_NAME", "type": "string" },
{ "name": "fillingStrategy", "description": "The filling strategy", "$ref": "FillingStrategy" }
]
}
],
"events": [
{
"name": "addressFormFilled",
"description": "Emitted when an address form is filled.",
"parameters": [
{
"name": "filledFields",
"description": "Information about the fields that were filled",
"type": "array",
"items": { "$ref": "FilledField" }
},
{
"name": "addressUi",
"description": "An UI representation of the address used to fill the form.\nConsists of a 2D array where each child represents an address/profile line.",
"$ref": "AddressUI"
}
]
}
],
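A small sketch of consuming the new `addressFormFilled` payload: the local types below are hypothetical stand-ins mirroring the schema above, and `renderAddressUi` simply joins each inner array into one display line, as the description suggests:

```ts
// Hypothetical local shapes mirroring the Autofill schema above.
type AddressField = { name: string; value: string };
type AddressFields = { fields: AddressField[] };
type AddressUI = { addressFields: AddressFields[] };

// Each inner array is rendered as one "address information line".
function renderAddressUi(ui: AddressUI): string {
  return ui.addressFields
    .map(line => line.fields.map(field => field.value).join(" "))
    .join("\n");
}

const ui: AddressUI = {
  addressFields: [
    { fields: [{ name: "GIVEN_NAME", value: "Jon" }, { name: "FAMILY_NAME", value: "Doe" }] },
    { fields: [{ name: "CITY", value: "Munich" }, { name: "ZIP", value: "81456" }] },
  ],
};

// renderAddressUi(ui) === "Jon Doe\nMunich 81456"
```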
"commands": [
@@ -1387,7 +1474,9 @@
"name": "setAddresses",
"description": "Set addresses so that developers can verify their forms implementation.",
"parameters": [{ "name": "addresses", "type": "array", "items": { "$ref": "Address" } }]
}
},
{ "name": "disable", "description": "Disables autofill domain notifications." },
{ "name": "enable", "description": "Enables autofill domain notifications." }
]
},
{
@@ -3211,6 +3300,18 @@
{ "name": "value", "type": "string" }
]
},
{
"name": "setPropertyRulePropertyName",
"description": "Modifies the property rule property name.",
"parameters": [
{ "name": "styleSheetId", "$ref": "StyleSheetId" },
{ "name": "range", "$ref": "SourceRange" },
{ "name": "propertyName", "type": "string" }
],
"returns": [
{ "name": "propertyName", "description": "The resulting key text after modification.", "$ref": "Value" }
]
},
{
"name": "setKeyframeKey",
"description": "Modifies the keyframe rule key text.",
@@ -4628,7 +4729,7 @@
{
"domain": "DOMDebugger",
"description": "DOM debugging allows setting breakpoints on particular DOM operations and events. JavaScript\nexecution will stop on these operations as if there was a regular breakpoint set.",
"dependencies": ["DOM", "Debugger", "Runtime"],
"dependencies": ["DOM", "Runtime"],
"types": [
{
"id": "DOMBreakpointType",
@@ -4738,6 +4839,8 @@
"name": "removeInstrumentationBreakpoint",
"description": "Removes breakpoint on particular native event.",
"experimental": true,
"deprecated": true,
"redirect": "EventBreakpoints",
"parameters": [{ "name": "eventName", "description": "Instrumentation name to stop on.", "type": "string" }]
},
{
@@ -4788,6 +4891,8 @@
"name": "setInstrumentationBreakpoint",
"description": "Sets breakpoint on particular native event.",
"experimental": true,
"deprecated": true,
"redirect": "EventBreakpoints",
"parameters": [{ "name": "eventName", "description": "Instrumentation name to stop on.", "type": "string" }]
},
{
@@ -6069,7 +6174,7 @@
},
{
"domain": "EventBreakpoints",
"description": "EventBreakpoints permits setting breakpoints on particular operations and\nevents in targets that run JavaScript but do not have a DOM.\nJavaScript execution will stop on these operations as if there was a regular\nbreakpoint set.",
"description": "EventBreakpoints permits setting JavaScript breakpoints on operations and events\noccurring in native code invoked from JavaScript. Once breakpoint is hit, it is\nreported through Debugger domain, similarly to regular breakpoints being hit.",
"experimental": true,
"commands": [
{
@@ -6081,7 +6186,8 @@
"name": "removeInstrumentationBreakpoint",
"description": "Removes breakpoint on particular native event.",
"parameters": [{ "name": "eventName", "description": "Instrumentation name to stop on.", "type": "string" }]
}
},
{ "name": "disable", "description": "Removes all breakpoints" }
]
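A hedged sketch of the domain's life cycle over a generic inspector session; the `session.send` helper and the instrumentation name are illustrative, not part of the schema:

```ts
interface InspectorSession {
  send(method: string, params?: unknown): Promise<unknown>;
}

async function instrumentAndReset(session: InspectorSession): Promise<void> {
  // Break whenever the named native instrumentation point is hit.
  await session.send("EventBreakpoints.setInstrumentationBreakpoint", {
    eventName: "someInstrumentationName", // illustrative name only
  });
  // The new `disable` command removes all instrumentation breakpoints at once.
  await session.send("EventBreakpoints.disable");
}
```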
},
{
@@ -6099,7 +6205,7 @@
"id": "DialogType",
"description": "Whether the dialog shown is an account chooser or an auto re-authentication dialog.",
"type": "string",
"enum": ["AccountChooser", "AutoReauthn", "ConfirmIdpSignin"]
"enum": ["AccountChooser", "AutoReauthn", "ConfirmIdpLogin"]
},
{
"id": "Account",
@@ -6112,7 +6218,7 @@
{ "name": "givenName", "type": "string" },
{ "name": "pictureUrl", "type": "string" },
{ "name": "idpConfigUrl", "type": "string" },
{ "name": "idpSigninUrl", "type": "string" },
{ "name": "idpLoginUrl", "type": "string" },
{ "name": "loginState", "$ref": "LoginState" },
{
"name": "termsOfServiceUrl",
@@ -6160,6 +6266,11 @@
{ "name": "accountIndex", "type": "integer" }
]
},
{
"name": "confirmIdpLogin",
"description": "Only valid if the dialog type is ConfirmIdpLogin. Acts as if the user had\nclicked the continue button.",
"parameters": [{ "name": "dialogId", "type": "string" }]
},
{
"name": "dismissDialog",
"parameters": [
@@ -6987,16 +7098,14 @@
{
"name": "tiltX",
"description": "The plane angle between the Y-Z plane and the plane containing both the stylus axis and the Y axis, in degrees of the range [-90,90], a positive tiltX is to the right (default: 0)",
"experimental": true,
"optional": true,
"type": "integer"
"type": "number"
},
{
"name": "tiltY",
"description": "The plane angle between the X-Z plane and the plane containing both the stylus axis and the X axis, in degrees of the range [-90,90], a positive tiltY is towards the user (default: 0).",
"experimental": true,
"optional": true,
"type": "integer"
"type": "number"
},
{
"name": "twist",
@@ -7280,16 +7389,14 @@
{
"name": "tiltX",
"description": "The plane angle between the Y-Z plane and the plane containing both the stylus axis and the Y axis, in degrees of the range [-90,90], a positive tiltX is to the right (default: 0).",
"experimental": true,
"optional": true,
"type": "integer"
"type": "number"
},
{
"name": "tiltY",
"description": "The plane angle between the X-Z plane and the plane containing both the stylus axis and the X axis, in degrees of the range [-90,90], a positive tiltY is towards the user (default: 0).",
"experimental": true,
"optional": true,
"type": "integer"
"type": "number"
},
{
"name": "twist",
@@ -9158,6 +9265,7 @@
"ch-ect",
"ch-prefers-color-scheme",
"ch-prefers-reduced-motion",
"ch-prefers-reduced-transparency",
"ch-rtt",
"ch-save-data",
"ch-ua",
@@ -11403,7 +11511,6 @@
"LowEndDevice",
"InvalidSchemeRedirect",
"InvalidSchemeNavigation",
"InProgressNavigation",
"NavigationRequestBlockedByCsp",
"MainFrameNavigation",
"MojoBinderPolicy",
@@ -11415,7 +11522,6 @@
"NavigationBadHttpStatus",
"ClientCertRequested",
"NavigationRequestNetworkError",
"MaxNumOfRunningPrerendersExceeded",
"CancelAllHostsForTesting",
"DidFailLoad",
"Stop",
@@ -11427,9 +11533,8 @@
"MixedContent",
"TriggerBackgrounded",
"MemoryLimitExceeded",
"FailToGetMemoryUsage",
"DataSaverEnabled",
"HasEffectiveUrl",
"TriggerUrlHasEffectiveUrl",
"ActivatedBeforeStarted",
"InactivePageRestriction",
"StartFailed",
@@ -11460,7 +11565,13 @@
"PrerenderingDisabledByDevTools",
"ResourceLoadBlockedByClient",
"SpeculationRuleRemoved",
"ActivatedWithAuxiliaryBrowsingContexts"
"ActivatedWithAuxiliaryBrowsingContexts",
"MaxNumOfRunningEagerPrerendersExceeded",
"MaxNumOfRunningNonEagerPrerendersExceeded",
"MaxNumOfRunningEmbedderPrerendersExceeded",
"PrerenderingUrlHasEffectiveUrl",
"RedirectedPrerenderingUrlHasEffectiveUrl",
"ActivationUrlHasEffectiveUrl"
]
},
{
@@ -11515,26 +11626,6 @@
"parameters": [{ "name": "ruleSet", "$ref": "RuleSet" }]
},
{ "name": "ruleSetRemoved", "parameters": [{ "name": "id", "$ref": "RuleSetId" }] },
{
"name": "prerenderAttemptCompleted",
"description": "Fired when a prerender attempt is completed.",
"parameters": [
{ "name": "key", "$ref": "PreloadingAttemptKey" },
{
"name": "initiatingFrameId",
"description": "The frame id of the frame initiating prerendering.",
"$ref": "Page.FrameId"
},
{ "name": "prerenderingUrl", "type": "string" },
{ "name": "finalStatus", "$ref": "PrerenderFinalStatus" },
{
"name": "disallowedApiMethod",
"description": "This is used to give users more information about the name of the API call\nthat is incompatible with prerender and has caused the cancellation of the attempt",
"optional": true,
"type": "string"
}
]
},
{
"name": "preloadEnabledStateUpdated",
"description": "Fired when a preload enabled state is updated.",
@@ -12077,14 +12168,14 @@
"id": "InterestGroupAccessType",
"description": "Enum of interest group access types.",
"type": "string",
"enum": ["join", "leave", "update", "loaded", "bid", "win"]
"enum": ["join", "leave", "update", "loaded", "bid", "win", "additionalBid", "additionalBidWin", "clear"]
},
{
"id": "InterestGroupAd",
"description": "Ad advertising element inside an interest group.",
"type": "object",
"properties": [
{ "name": "renderUrl", "type": "string" },
{ "name": "renderURL", "type": "string" },
{ "name": "metadata", "optional": true, "type": "string" }
]
},
@@ -12097,10 +12188,10 @@
{ "name": "name", "type": "string" },
{ "name": "expirationTime", "$ref": "Network.TimeSinceEpoch" },
{ "name": "joiningOrigin", "type": "string" },
{ "name": "biddingUrl", "optional": true, "type": "string" },
{ "name": "biddingWasmHelperUrl", "optional": true, "type": "string" },
{ "name": "updateUrl", "optional": true, "type": "string" },
{ "name": "trustedBiddingSignalsUrl", "optional": true, "type": "string" },
{ "name": "biddingLogicURL", "optional": true, "type": "string" },
{ "name": "biddingWasmHelperURL", "optional": true, "type": "string" },
{ "name": "updateURL", "optional": true, "type": "string" },
{ "name": "trustedBiddingSignalsURL", "optional": true, "type": "string" },
{ "name": "trustedBiddingSignalsKeys", "type": "array", "items": { "type": "string" } },
{ "name": "userBiddingSignals", "optional": true, "type": "string" },
{ "name": "ads", "type": "array", "items": { "$ref": "InterestGroupAd" } },
@@ -12275,20 +12366,24 @@
{ "name": "value", "$ref": "UnsignedInt128AsBase16" }
]
},
{
"id": "AttributionReportingEventReportWindows",
"experimental": true,
"type": "object",
"properties": [
{ "name": "start", "description": "duration in seconds", "type": "integer" },
{ "name": "ends", "description": "duration in seconds", "type": "array", "items": { "type": "integer" } }
]
},
{
"id": "AttributionReportingSourceRegistration",
"experimental": true,
"type": "object",
"properties": [
{ "name": "time", "$ref": "Network.TimeSinceEpoch" },
{ "name": "expiry", "description": "duration in seconds", "optional": true, "type": "integer" },
{ "name": "eventReportWindow", "description": "duration in seconds", "optional": true, "type": "integer" },
{
"name": "aggregatableReportWindow",
"description": "duration in seconds",
"optional": true,
"type": "integer"
},
{ "name": "expiry", "description": "duration in seconds", "type": "integer" },
{ "name": "eventReportWindows", "$ref": "AttributionReportingEventReportWindows" },
{ "name": "aggregatableReportWindow", "description": "duration in seconds", "type": "integer" },
{ "name": "type", "$ref": "AttributionReportingSourceType" },
{ "name": "sourceOrigin", "type": "string" },
{ "name": "reportingOrigin", "type": "string" },

View File

@@ -279,7 +279,7 @@ async function sendResponse(response: unknown): Promise<void> {
}
await fetch(`runtime/invocation/${requestId}/response`, {
method: "POST",
body: response === null ? null : JSON.stringify(response),
body: response === null ? null : (typeof response === 'string' ? response : JSON.stringify(response)),
});
}
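The change above avoids JSON-encoding a value that is already a string, which would wrap it in an extra layer of quotes. A sketch of the same rule in isolation, using a hypothetical `serializeBody` helper:

```ts
// Pass strings through untouched, keep null as null, JSON-encode the rest.
function serializeBody(response: unknown): string | null {
  if (response === null) return null;
  return typeof response === "string" ? response : JSON.stringify(response);
}

// serializeBody('{"ok":true}') === '{"ok":true}'  (not re-quoted)
// serializeBody({ ok: true })  === '{"ok":true}'
// serializeBody(null)          === null
```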

View File

@@ -2084,31 +2084,31 @@ declare module "buffer" {
values(): IterableIterator<number>;
}
var Buffer: BufferConstructor;
/**
* This function returns `true` if `input` contains only valid UTF-8-encoded data,
* including the case in which `input` is empty.
*
* Throws if the `input` is a detached array buffer.
* @since Bun v0.6.13
* @param input The input to validate.
*/
export function isUtf8(
input: TypedArray | ArrayBufferLike | DataView,
): boolean;
/**
* This function returns `true` if `input` contains only valid ASCII-encoded data,
* including the case in which `input` is empty.
*
* Throws if the `input` is a detached array buffer.
* @since Bun v0.6.13
* @param input The input to validate.
*/
export function isAscii(
input: TypedArray | ArrayBufferLike | DataView,
): boolean;
}
/**
* This function returns `true` if `input` contains only valid UTF-8-encoded data,
* including the case in which `input` is empty.
*
* Throws if the `input` is a detached array buffer.
* @since Bun v0.6.13
* @param input The input to validate.
*/
export function isUtf8(
input: TypedArray | ArrayBufferLike | DataView,
): boolean;
/**
* This function returns `true` if `input` contains only valid ASCII-encoded data,
* including the case in which `input` is empty.
*
* Throws if the `input` is a detached array buffer.
* @since Bun v0.6.13
* @param input The input to validate.
*/
export function isAscii(
input: TypedArray | ArrayBufferLike | DataView,
): boolean;
}
declare module "node:buffer" {
export * from "buffer";

View File

@@ -479,13 +479,13 @@ declare module "bun:test" {
* @param actual the actual value
*/
export const expect: {
(actual?: unknown): Expect;
<T = unknown>(actual?: T): Expect<T>;
any: (
constructor: ((..._: any[]) => any) | { new (..._: any[]): any },
) => Expect;
anything: () => Expect;
stringContaining: (str: string) => Expect;
stringMatching: (regex: RegExp | string) => Expect;
stringContaining: (str: string) => Expect<string>;
stringMatching: <T extends RegExp | string>(regex: T) => Expect<T>;
};
/**
* Asserts that a value matches some criteria.
@@ -982,6 +982,16 @@ declare module "bun:test" {
* @param end the end number (exclusive)
*/
toBeWithin(start: number, end: number): void;
/**
* Asserts that a value is equal to the expected string, ignoring any whitespace.
*
* @example
* expect(" foo ").toEqualIgnoringWhitespace("foo");
* expect("bar").toEqualIgnoringWhitespace(" bar ");
*
* @param expected the expected string
*/
toEqualIgnoringWhitespace(expected: string): void;
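A usage sketch for the new matcher, following the examples in its doc comment:

```ts
import { expect, test } from "bun:test";

test("whitespace-insensitive equality", () => {
  // Leading, trailing, and repeated whitespace is ignored on both sides.
  expect("  foo   bar ").toEqualIgnoringWhitespace("foo bar");
  expect("baz").toEqualIgnoringWhitespace("  baz  ");
});
```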
/**
* Asserts that a value is a `symbol`.
*

View File

@@ -1870,6 +1870,15 @@ declare module "bun" {
*/
port?: string | number;
/**
* If the `SO_REUSEPORT` flag should be set.
*
* This allows multiple processes to bind to the same port, which is useful for load balancing.
*
* @default false
*/
reusePort?: boolean;
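A minimal sketch of the new option: with `reusePort`, several processes can bind the same port and let the kernel distribute incoming connections among them:

```ts
const server = Bun.serve({
  port: 3000,
  reusePort: true, // sets SO_REUSEPORT before binding
  fetch() {
    return new Response(`handled by pid ${process.pid}`);
  },
});

console.log(`listening on port ${server.port}`);
```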
/**
* What hostname should the server listen on?
*

View File

@@ -546,6 +546,12 @@ interface Worker extends EventTarget, AbstractWorker {
*/
unref(): void;
/**
* An integer identifier for the referenced thread. Inside the worker thread,
* it is available as `require('node:worker_threads').threadId`.
* This value is unique for each `Worker` instance inside a single process.
* @since v10.5.0
*/
threadId: number;
}
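A sketch of reading the identifier from the spawning side; the worker file path is hypothetical:

```ts
// Inside the worker file the same value is available as
// require("node:worker_threads").threadId.
const worker = new Worker(new URL("./worker.ts", import.meta.url));
console.log("spawned worker with threadId", worker.threadId);
```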
@@ -733,6 +739,8 @@ interface Process {
* On other operating systems, this returns `undefined`.
*/
constrainedMemory(): number | undefined;
send(data: any): void;
}
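A hedged sketch for the added `process.send` declaration; the call only reaches a parent when the process was started with an IPC channel (for example via `node:child_process` `fork`):

```ts
// Guard at runtime: without an IPC channel there is no parent to notify.
if (typeof process.send === "function") {
  process.send({ ready: true, pid: process.pid });
}
```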
interface MemoryUsageObject {

View File

@@ -23,7 +23,7 @@ try {
const header = await file(join(import.meta.dir, "..", "header.txt")).text();
const filesToCat = (await getDotTsFiles("./")).filter(
f => !["./index.d.ts"].some(tf => f === tf),
f => f !== "./index.d.ts",
);
const fileContents: string[] = [];

View File

@@ -45,6 +45,7 @@ describe("bun:test", () => {
test("expect()", () => {
expect(1).toBe(1);
expect(1).not.toBe(2);
// @ts-expect-error
expect({ a: 1 }).toEqual({ a: 1, b: undefined });
expect({ a: 1 }).toStrictEqual({ a: 1 });
expect(new Set()).toHaveProperty("size");

View File

@@ -74,6 +74,8 @@ declare module "ws" {
WebSocket?: U | undefined;
}
interface ServerOption extends WebSocketServerOptions {}
interface AddressInfo {
address: string;
family: string;
@@ -219,4 +221,6 @@ declare module "ws" {
listener: (...args: any[]) => void,
): this;
}
var Server: typeof WebSocketServer;
}
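The added `Server` declaration mirrors the package's long-standing runtime alias for `WebSocketServer`; a minimal sketch, assuming the runtime export matches the new declaration:

```ts
import { Server } from "ws";

const wss = new Server({ port: 8080 });
wss.on("connection", socket => {
  socket.send("hello");
});
```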

View File

@@ -427,43 +427,12 @@ int bsd_would_block() {
#endif
}
// return LIBUS_SOCKET_ERROR or the fd that represents listen socket
// listen both on ipv6 and ipv4
LIBUS_SOCKET_DESCRIPTOR bsd_create_listen_socket(const char *host, int port, int options) {
struct addrinfo hints, *result;
memset(&hints, 0, sizeof(struct addrinfo));
hints.ai_flags = AI_PASSIVE;
hints.ai_family = AF_UNSPEC;
hints.ai_socktype = SOCK_STREAM;
char port_string[16];
snprintf(port_string, 16, "%d", port);
if (getaddrinfo(host, port_string, &hints, &result)) {
return LIBUS_SOCKET_ERROR;
}
LIBUS_SOCKET_DESCRIPTOR listenFd = LIBUS_SOCKET_ERROR;
struct addrinfo *listenAddr;
for (struct addrinfo *a = result; a && listenFd == LIBUS_SOCKET_ERROR; a = a->ai_next) {
if (a->ai_family == AF_INET6) {
listenFd = bsd_create_socket(a->ai_family, a->ai_socktype, a->ai_protocol);
listenAddr = a;
}
}
for (struct addrinfo *a = result; a && listenFd == LIBUS_SOCKET_ERROR; a = a->ai_next) {
if (a->ai_family == AF_INET) {
listenFd = bsd_create_socket(a->ai_family, a->ai_socktype, a->ai_protocol);
listenAddr = a;
}
}
if (listenFd == LIBUS_SOCKET_ERROR) {
freeaddrinfo(result);
return LIBUS_SOCKET_ERROR;
}
inline LIBUS_SOCKET_DESCRIPTOR bsd_bind_listen_fd(
LIBUS_SOCKET_DESCRIPTOR listenFd,
struct addrinfo *listenAddr,
int port,
int options
) {
if (port != 0) {
/* Otherwise, always enable SO_REUSEPORT and SO_REUSEADDR _unless_ options specify otherwise */
@@ -487,22 +456,76 @@ LIBUS_SOCKET_DESCRIPTOR bsd_create_listen_socket(const char *host, int port, int
#endif
}
#ifdef IPV6_V6ONLY
int disabled = 0;
setsockopt(listenFd, IPPROTO_IPV6, IPV6_V6ONLY, (void *) &disabled, sizeof(disabled));
#endif
if (bind(listenFd, listenAddr->ai_addr, (socklen_t) listenAddr->ai_addrlen) || listen(listenFd, 512)) {
bsd_close_socket(listenFd);
freeaddrinfo(result);
return LIBUS_SOCKET_ERROR;
}
freeaddrinfo(result);
return listenFd;
}
// return LIBUS_SOCKET_ERROR or the fd that represents listen socket
// listen both on ipv6 and ipv4
LIBUS_SOCKET_DESCRIPTOR bsd_create_listen_socket(const char *host, int port, int options) {
struct addrinfo hints, *result;
memset(&hints, 0, sizeof(struct addrinfo));
hints.ai_flags = AI_PASSIVE;
hints.ai_family = AF_UNSPEC;
hints.ai_socktype = SOCK_STREAM;
char port_string[16];
snprintf(port_string, 16, "%d", port);
if (getaddrinfo(host, port_string, &hints, &result)) {
return LIBUS_SOCKET_ERROR;
}
LIBUS_SOCKET_DESCRIPTOR listenFd = LIBUS_SOCKET_ERROR;
struct addrinfo *listenAddr;
for (struct addrinfo *a = result; a != NULL; a = a->ai_next) {
if (a->ai_family == AF_INET6) {
listenFd = bsd_create_socket(a->ai_family, a->ai_socktype, a->ai_protocol);
if (listenFd == LIBUS_SOCKET_ERROR) {
continue;
}
listenAddr = a;
if (bsd_bind_listen_fd(listenFd, listenAddr, port, options) != LIBUS_SOCKET_ERROR) {
freeaddrinfo(result);
return listenFd;
}
bsd_close_socket(listenFd);
}
}
for (struct addrinfo *a = result; a != NULL; a = a->ai_next) {
if (a->ai_family == AF_INET) {
listenFd = bsd_create_socket(a->ai_family, a->ai_socktype, a->ai_protocol);
if (listenFd == LIBUS_SOCKET_ERROR) {
continue;
}
listenAddr = a;
if (bsd_bind_listen_fd(listenFd, listenAddr, port, options) != LIBUS_SOCKET_ERROR) {
freeaddrinfo(result);
return listenFd;
}
bsd_close_socket(listenFd);
}
}
freeaddrinfo(result);
return LIBUS_SOCKET_ERROR;
}
#ifndef _WIN32
#include <sys/un.h>
#else
@@ -768,4 +791,4 @@ LIBUS_SOCKET_DESCRIPTOR bsd_create_connect_socket_unix(const char *server_path,
}
return fd;
}
}

View File

@@ -566,10 +566,10 @@ void *us_socket_context_ext(int ssl, struct us_socket_context_t *context) {
}
void us_socket_context_on_handshake(int ssl, struct us_socket_context_t *context, void (*on_handshake)(struct us_socket_context_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data), void* custom_data) {
void us_socket_context_on_handshake(int ssl, struct us_socket_context_t *context, void (*on_handshake)(struct us_socket_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data), void* custom_data) {
#ifndef LIBUS_NO_SSL
if (ssl) {
us_internal_on_ssl_handshake((struct us_internal_ssl_socket_context_t *) context, (void (*)(struct us_internal_ssl_socket_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data))on_handshake, custom_data);
us_internal_on_ssl_handshake((struct us_internal_ssl_socket_context_t *) context, (us_internal_on_handshake_t)on_handshake, custom_data);
return;
}
#endif

File diff suppressed because it is too large

View File

@@ -0,0 +1,90 @@
// MSVC doesn't support C11 stdatomic.h properly yet,
// so we use C++ std::atomic instead.
#include "./internal/internal.h"
#include "./root_certs.h"
#include <openssl/x509.h>
#include <openssl/pem.h>
#include <atomic>
static const int root_certs_size = sizeof(root_certs) / sizeof(root_certs[0]);
static X509* root_cert_instances[sizeof(root_certs) / sizeof(root_certs[0])] = {NULL};
static std::atomic_flag root_cert_instances_lock = ATOMIC_FLAG_INIT;
static std::atomic_bool root_cert_instances_initialized = 0;
// This callback is used to avoid the default passphrase callback in OpenSSL
// which will typically prompt for the passphrase. The prompting is designed
// for the OpenSSL CLI, but works poorly for this case because it involves
// synchronous interaction with the controlling terminal, something we never
// want, and use this function to avoid it.
int us_no_password_callback(char* buf, int size, int rwflag, void* u) {
return 0;
}
static X509 * us_ssl_ctx_get_X509_without_callback_from(struct us_cert_string_t content) {
X509 *x = NULL;
BIO *in;
ERR_clear_error(); // clear error stack for SSL_CTX_use_certificate()
in = BIO_new_mem_buf(content.str, content.len);
if (in == NULL) {
OPENSSL_PUT_ERROR(SSL, ERR_R_BUF_LIB);
goto end;
}
x = PEM_read_bio_X509(in, NULL, us_no_password_callback, NULL);
if (x == NULL) {
OPENSSL_PUT_ERROR(SSL, ERR_R_PEM_LIB);
goto end;
}
return x;
end:
X509_free(x);
BIO_free(in);
return NULL;
}
static void us_internal_init_root_certs() {
if(std::atomic_load(&root_cert_instances_initialized) == 1) return;
while(atomic_flag_test_and_set_explicit(&root_cert_instances_lock, std::memory_order_acquire));
if(!atomic_exchange(&root_cert_instances_initialized, 1)) {
for (size_t i = 0; i < root_certs_size; i++) {
root_cert_instances[i] = us_ssl_ctx_get_X509_without_callback_from(root_certs[i]);
}
}
atomic_flag_clear_explicit(&root_cert_instances_lock, std::memory_order_release);
}
extern "C" int us_internal_raw_root_certs(struct us_cert_string_t** out) {
*out = root_certs;
return root_certs_size;
}
extern "C" X509_STORE* us_get_default_ca_store() {
X509_STORE *store = X509_STORE_new();
if (store == NULL) {
return NULL;
}
if (!X509_STORE_set_default_paths(store)) {
X509_STORE_free(store);
return NULL;
}
us_internal_init_root_certs();
// load all root_cert_instances on the default ca store
for (size_t i = 0; i < root_certs_size; i++) {
X509* cert = root_cert_instances[i];
if(cert == NULL) continue;
X509_up_ref(cert);
X509_STORE_add_cert(store, cert);
}
return store;
}

View File

@@ -35,8 +35,8 @@ void us_loop_run_bun_tick(struct us_loop_t *loop, int64_t timeoutMs, void*);
/* Pointer tags are used to indicate a Bun pointer versus a uSockets pointer */
#define UNSET_BITS_49_UNTIL_64 0x0000FFFFFFFFFFFF
#define CLEAR_POINTER_TAG(p) ((void *) ((uintptr_t) (p) & UNSET_BITS_49_UNTIL_64))
#define LIKELY(cond) __builtin_expect((uint64_t)(void*)(cond), 1)
#define UNLIKELY(cond) __builtin_expect((uint64_t)(void*)(cond), 0)
#define LIKELY(cond) __builtin_expect((_Bool)(cond), 1)
#define UNLIKELY(cond) __builtin_expect((_Bool)(cond), 0)
#ifdef LIBUS_USE_EPOLL
#define GET_READY_POLL(loop, index) (struct us_poll_t *) loop->ready_polls[index].data.ptr

View File

@@ -0,0 +1,330 @@
/*
* Authored by Alex Hultman, 2018-2021.
* Intellectual property of third-party.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.apache.org/licenses/LICENSE-2.0
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#include "internal/internal.h"
#include "libusockets.h"
#include <stdlib.h>
#ifdef LIBUS_USE_LIBUV
/* uv_poll_t->data always (except for most times after calling us_poll_stop)
* points to the us_poll_t */
static void poll_cb(uv_poll_t *p, int status, int events) {
us_internal_dispatch_ready_poll((struct us_poll_t *)p->data, status < 0,
events);
}
static void prepare_cb(uv_prepare_t *p) {
struct us_loop_t *loop = p->data;
us_internal_loop_pre(loop);
}
/* Note: libuv timers execute AFTER the post callback */
static void check_cb(uv_check_t *p) {
struct us_loop_t *loop = p->data;
us_internal_loop_post(loop);
}
/* Not used for polls, since polls need two frees */
static void close_cb_free(uv_handle_t *h) { free(h->data); }
/* This one is different for polls, since we need two frees here */
static void close_cb_free_poll(uv_handle_t *h) {
/* It is only in case we called us_poll_stop then quickly us_poll_free that we
* enter this. Most of the time, actual freeing is done by us_poll_free. */
if (h->data) {
free(h->data);
free(h);
}
}
static void timer_cb(uv_timer_t *t) {
struct us_internal_callback_t *cb = t->data;
cb->cb(cb);
}
static void async_cb(uv_async_t *a) {
struct us_internal_callback_t *cb = a->data;
// internal asyncs give their loop, not themselves
cb->cb((struct us_internal_callback_t *)cb->loop);
}
// poll
void us_poll_init(struct us_poll_t *p, LIBUS_SOCKET_DESCRIPTOR fd,
int poll_type) {
p->poll_type = poll_type;
p->fd = fd;
}
void us_poll_free(struct us_poll_t *p, struct us_loop_t *loop) {
/* The idea here is like so; in us_poll_stop we call uv_close after setting
* data of uv-poll to 0. This means that in close_cb_free we call free on 0
* which does nothing, since us_poll_stop should not really free the poll.
* HOWEVER, if we then call us_poll_free while still closing the uv-poll, we
* simply change back the data to point to our structure so that we actually
* do free it like we should. */
if (uv_is_closing((uv_handle_t *)p->uv_p)) {
p->uv_p->data = p;
} else {
free(p->uv_p);
free(p);
}
}
void us_poll_start(struct us_poll_t *p, struct us_loop_t *loop, int events) {
p->poll_type = us_internal_poll_type(p) |
((events & LIBUS_SOCKET_READABLE) ? POLL_TYPE_POLLING_IN : 0) |
((events & LIBUS_SOCKET_WRITABLE) ? POLL_TYPE_POLLING_OUT : 0);
uv_poll_init_socket(loop->uv_loop, p->uv_p, p->fd);
uv_poll_start(p->uv_p, events, poll_cb);
}
void us_poll_change(struct us_poll_t *p, struct us_loop_t *loop, int events) {
if (us_poll_events(p) != events) {
p->poll_type =
us_internal_poll_type(p) |
((events & LIBUS_SOCKET_READABLE) ? POLL_TYPE_POLLING_IN : 0) |
((events & LIBUS_SOCKET_WRITABLE) ? POLL_TYPE_POLLING_OUT : 0);
uv_poll_start(p->uv_p, events, poll_cb);
}
}
void us_poll_stop(struct us_poll_t *p, struct us_loop_t *loop) {
uv_poll_stop(p->uv_p);
/* We normally only want to close the poll here, not free it. But if we stop
* it, then quickly "free" it with us_poll_free, we postpone the actual
* freeing to close_cb_free_poll whenever it triggers. That's why we set data
* to null here, so that us_poll_free can reset it if needed */
p->uv_p->data = 0;
uv_close((uv_handle_t *)p->uv_p, close_cb_free_poll);
}
int us_poll_events(struct us_poll_t *p) {
return ((p->poll_type & POLL_TYPE_POLLING_IN) ? LIBUS_SOCKET_READABLE : 0) |
((p->poll_type & POLL_TYPE_POLLING_OUT) ? LIBUS_SOCKET_WRITABLE : 0);
}
unsigned int us_internal_accept_poll_event(struct us_poll_t *p) { return 0; }
int us_internal_poll_type(struct us_poll_t *p) { return p->poll_type & 3; }
void us_internal_poll_set_type(struct us_poll_t *p, int poll_type) {
p->poll_type = poll_type | (p->poll_type & 12);
}
LIBUS_SOCKET_DESCRIPTOR us_poll_fd(struct us_poll_t *p) { return p->fd; }
void us_loop_pump(struct us_loop_t *loop) {
uv_run(loop->uv_loop, UV_RUN_NOWAIT);
}
struct us_loop_t *us_create_loop(void *hint,
void (*wakeup_cb)(struct us_loop_t *loop),
void (*pre_cb)(struct us_loop_t *loop),
void (*post_cb)(struct us_loop_t *loop),
unsigned int ext_size) {
struct us_loop_t *loop =
(struct us_loop_t *)malloc(sizeof(struct us_loop_t) + ext_size);
loop->uv_loop = hint ? hint : uv_loop_new();
loop->is_default = hint != 0;
loop->uv_pre = malloc(sizeof(uv_prepare_t));
uv_prepare_init(loop->uv_loop, loop->uv_pre);
uv_prepare_start(loop->uv_pre, prepare_cb);
uv_unref((uv_handle_t *)loop->uv_pre);
loop->uv_pre->data = loop;
loop->uv_check = malloc(sizeof(uv_check_t));
uv_check_init(loop->uv_loop, loop->uv_check);
uv_unref((uv_handle_t *)loop->uv_check);
uv_check_start(loop->uv_check, check_cb);
loop->uv_check->data = loop;
// here we create two unreffed handles - timer and async
us_internal_loop_data_init(loop, wakeup_cb, pre_cb, post_cb);
// if we do not own this loop, we need to integrate and set up timer
if (hint) {
us_loop_integrate(loop);
}
return loop;
}
// based on if this was default loop or not
void us_loop_free(struct us_loop_t *loop) {
// ref and close down prepare and check
uv_ref((uv_handle_t *)loop->uv_pre);
uv_prepare_stop(loop->uv_pre);
loop->uv_pre->data = loop->uv_pre;
uv_close((uv_handle_t *)loop->uv_pre, close_cb_free);
uv_ref((uv_handle_t *)loop->uv_check);
uv_check_stop(loop->uv_check);
loop->uv_check->data = loop->uv_check;
uv_close((uv_handle_t *)loop->uv_check, close_cb_free);
us_internal_loop_data_free(loop);
// we need to run the loop one last round to call all close callbacks
// we cannot do this if we do not own the loop, default
if (!loop->is_default) {
uv_run(loop->uv_loop, UV_RUN_NOWAIT);
uv_loop_delete(loop->uv_loop);
}
// now we can free our part
free(loop);
}
void us_loop_run(struct us_loop_t *loop) {
us_loop_integrate(loop);
uv_run(loop->uv_loop, UV_RUN_NOWAIT);
}
struct us_poll_t *us_create_poll(struct us_loop_t *loop, int fallthrough,
unsigned int ext_size) {
struct us_poll_t *p =
(struct us_poll_t *)malloc(sizeof(struct us_poll_t) + ext_size);
p->uv_p = malloc(sizeof(uv_poll_t));
p->uv_p->data = p;
return p;
}
/* If we update our block position we have to update the uv_poll data to point
* to us */
struct us_poll_t *us_poll_resize(struct us_poll_t *p, struct us_loop_t *loop,
unsigned int ext_size) {
struct us_poll_t *new_p = realloc(p, sizeof(struct us_poll_t) + ext_size);
new_p->uv_p->data = new_p;
return new_p;
}
// timer
struct us_timer_t *us_create_timer(struct us_loop_t *loop, int fallthrough,
unsigned int ext_size) {
struct us_internal_callback_t *cb = malloc(
sizeof(struct us_internal_callback_t) + sizeof(uv_timer_t) + ext_size);
cb->loop = loop;
cb->cb_expects_the_loop = 0; // never read?
cb->leave_poll_ready = 0; // never read?
uv_timer_t *uv_timer = (uv_timer_t *)(cb + 1);
uv_timer_init(loop->uv_loop, uv_timer);
uv_timer->data = cb;
if (fallthrough) {
uv_unref((uv_handle_t *)uv_timer);
}
return (struct us_timer_t *)cb;
}
void *us_timer_ext(struct us_timer_t *timer) {
return ((char *)timer) + sizeof(struct us_internal_callback_t) +
sizeof(uv_timer_t);
}
void us_timer_close(struct us_timer_t *t) {
struct us_internal_callback_t *cb = (struct us_internal_callback_t *)t;
uv_timer_t *uv_timer = (uv_timer_t *)(cb + 1);
// always ref the timer before closing it
uv_ref((uv_handle_t *)uv_timer);
uv_timer_stop(uv_timer);
uv_timer->data = cb;
uv_close((uv_handle_t *)uv_timer, close_cb_free);
}
void us_timer_set(struct us_timer_t *t, void (*cb)(struct us_timer_t *t),
int ms, int repeat_ms) {
struct us_internal_callback_t *internal_cb =
(struct us_internal_callback_t *)t;
internal_cb->cb = (void (*)(struct us_internal_callback_t *))cb;
uv_timer_t *uv_timer = (uv_timer_t *)(internal_cb + 1);
if (!ms) {
uv_timer_stop(uv_timer);
} else {
uv_timer_start(uv_timer, timer_cb, ms, repeat_ms);
}
}
struct us_loop_t *us_timer_loop(struct us_timer_t *t) {
struct us_internal_callback_t *internal_cb =
(struct us_internal_callback_t *)t;
return internal_cb->loop;
}
// async (internal only)
struct us_internal_async *us_internal_create_async(struct us_loop_t *loop,
int fallthrough,
unsigned int ext_size) {
struct us_internal_callback_t *cb = malloc(
sizeof(struct us_internal_callback_t) + sizeof(uv_async_t) + ext_size);
cb->loop = loop;
return (struct us_internal_async *)cb;
}
void us_internal_async_close(struct us_internal_async *a) {
struct us_internal_callback_t *cb = (struct us_internal_callback_t *)a;
uv_async_t *uv_async = (uv_async_t *)(cb + 1);
// always ref the async before closing it
uv_ref((uv_handle_t *)uv_async);
uv_async->data = cb;
uv_close((uv_handle_t *)uv_async, close_cb_free);
}
void us_internal_async_set(struct us_internal_async *a,
void (*cb)(struct us_internal_async *)) {
struct us_internal_callback_t *internal_cb =
(struct us_internal_callback_t *)a;
internal_cb->cb = (void (*)(struct us_internal_callback_t *))cb;
uv_async_t *uv_async = (uv_async_t *)(internal_cb + 1);
uv_async_init(internal_cb->loop->uv_loop, uv_async, async_cb);
uv_unref((uv_handle_t *)uv_async);
uv_async->data = internal_cb;
}
void us_internal_async_wakeup(struct us_internal_async *a) {
struct us_internal_callback_t *internal_cb =
(struct us_internal_callback_t *)a;
uv_async_t *uv_async = (uv_async_t *)(internal_cb + 1);
uv_async_send(uv_async);
}
#endif

View File

@@ -0,0 +1,47 @@
/*
* Authored by Alex Hultman, 2018-2019.
* Intellectual property of third-party.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.apache.org/licenses/LICENSE-2.0
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#ifndef LIBUV_H
#define LIBUV_H
#include "internal/loop_data.h"
#include <uv.h>
#define LIBUS_SOCKET_READABLE UV_READABLE
#define LIBUS_SOCKET_WRITABLE UV_WRITABLE
struct us_loop_t {
alignas(LIBUS_EXT_ALIGNMENT) struct us_internal_loop_data_t data;
uv_loop_t *uv_loop;
int is_default;
uv_prepare_t *uv_pre;
uv_check_t *uv_check;
};
// it is no longer valid to cast a pointer to us_poll_t to a pointer of
// uv_poll_t
struct us_poll_t {
/* We need to hold a pointer to this uv_poll_t since we need to be able to
* resize our block */
uv_poll_t *uv_p;
LIBUS_SOCKET_DESCRIPTOR fd;
unsigned char poll_type;
};
#endif // LIBUV_H

View File

@@ -20,7 +20,9 @@
#if defined(_MSC_VER)
#ifndef __cplusplus
#define alignas(x) __declspec(align(x))
#endif
#else
#include <stdalign.h>
#endif
@@ -43,6 +45,10 @@ void us_internal_loop_update_pending_ready_polls(struct us_loop_t *loop, struct
#include "internal/eventing/epoll_kqueue.h"
#endif
#ifdef LIBUS_USE_LIBUV
#include "internal/eventing/libuv.h"
#endif
/* Poll type and what it polls for */
enum {
/* Two first bits */
@@ -126,6 +132,15 @@ struct us_internal_callback_t {
#endif
#if __cplusplus
extern "C" {
#endif
int us_internal_raw_root_certs(struct us_cert_string_t** out);
#if __cplusplus
}
#endif
/* Listen sockets are sockets */
struct us_listen_socket_t {
alignas(LIBUS_EXT_ALIGNMENT) struct us_socket_t s;
@@ -163,6 +178,7 @@ struct us_socket_context_t {
struct us_internal_ssl_socket_context_t;
struct us_internal_ssl_socket_t;
typedef void (*us_internal_on_handshake_t)(struct us_internal_ssl_socket_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data);
/* SNI functions */
void us_internal_ssl_socket_context_add_server_name(struct us_internal_ssl_socket_context_t *context, const char *hostname_pattern, struct us_socket_context_options_t options, void *user);
@@ -190,8 +206,8 @@ void us_internal_ssl_socket_context_on_close(struct us_internal_ssl_socket_conte
void us_internal_ssl_socket_context_on_data(struct us_internal_ssl_socket_context_t *context,
struct us_internal_ssl_socket_t *(*on_data)(struct us_internal_ssl_socket_t *s, char *data, int length));
void us_internal_ssl_handshake(struct us_internal_ssl_socket_t *s, void (*on_handshake)(struct us_internal_ssl_socket_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data), void* custom_data);
void us_internal_on_ssl_handshake(struct us_internal_ssl_socket_context_t * context, void (*on_handshake)(struct us_internal_ssl_socket_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data), void* custom_data);
void us_internal_ssl_handshake(struct us_internal_ssl_socket_t *s, us_internal_on_handshake_t on_handshake, void* custom_data);
void us_internal_on_ssl_handshake(struct us_internal_ssl_socket_context_t * context, us_internal_on_handshake_t on_handshake, void* custom_data);
void us_internal_ssl_socket_context_on_writable(struct us_internal_ssl_socket_context_t *context,
struct us_internal_ssl_socket_t *(*on_writable)(struct us_internal_ssl_socket_t *s));

View File

@@ -153,6 +153,12 @@ struct us_socket_context_options_t {
int ssl_prefer_low_memory_usage; /* Todo: rename to prefer_low_memory_usage and apply for TCP as well */
};
struct us_bun_verify_error_t {
long error;
const char* code;
const char* reason;
};
struct us_socket_events_t {
struct us_socket_t *(*on_open)(struct us_socket_t *, int is_client, char *ip, int ip_length);
struct us_socket_t *(*on_data)(struct us_socket_t *, char *data, int length);
@@ -166,11 +172,6 @@ struct us_socket_events_t {
void (*on_handshake)(struct us_socket_t*, int success, struct us_bun_verify_error_t verify_error, void* custom_data);
};
struct us_bun_verify_error_t {
long error;
const char* code;
const char* reason;
};
struct us_bun_socket_context_options_t {
const char *key_file_name;
@@ -231,7 +232,7 @@ void us_socket_context_on_long_timeout(int ssl, struct us_socket_context_t *cont
void us_socket_context_on_connect_error(int ssl, struct us_socket_context_t *context,
struct us_socket_t *(*on_connect_error)(struct us_socket_t *s, int code));
void us_socket_context_on_handshake(int ssl, struct us_socket_context_t *context, void (*on_handshake)(struct us_socket_context_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data), void* custom_data);
void us_socket_context_on_handshake(int ssl, struct us_socket_context_t *context, void (*on_handshake)(struct us_socket_t *, int success, struct us_bun_verify_error_t verify_error, void* custom_data), void* custom_data);
/* Emitted when a socket has been half-closed */
void us_socket_context_on_end(int ssl, struct us_socket_context_t *context, struct us_socket_t *(*on_end)(struct us_socket_t *s));
@@ -382,6 +383,7 @@ int us_socket_local_port(int ssl, struct us_socket_t *s);
/* Copy remote (IP) address of socket, or fail with zero length. */
void us_socket_remote_address(int ssl, struct us_socket_t *s, char *buf, int *length);
void us_socket_local_address(int ssl, struct us_socket_t *s, char *buf, int *length);
/* Bun extras */
struct us_socket_t *us_socket_pair(struct us_socket_context_t *ctx, int socket_ext_size, LIBUS_SOCKET_DESCRIPTOR* fds);

View File

@@ -18,7 +18,9 @@
#include "libusockets.h"
#include "internal/internal.h"
#include <stdlib.h>
#ifndef WIN32
#include <sys/ioctl.h>
#endif
/* The loop has 2 fallthrough polls */
void us_internal_loop_data_init(struct us_loop_t *loop, void (*wakeup_cb)(struct us_loop_t *loop),

View File

@@ -48,6 +48,16 @@ void us_socket_remote_address(int ssl, struct us_socket_t *s, char *buf, int *le
}
}
void us_socket_local_address(int ssl, struct us_socket_t *s, char *buf, int *length) {
struct bsd_addr_t addr;
if (bsd_local_addr(us_poll_fd(&s->p), &addr) || *length < bsd_addr_get_ip_length(&addr)) {
*length = 0;
} else {
*length = bsd_addr_get_ip_length(&addr);
memcpy(buf, bsd_addr_get_ip(&addr), *length);
}
}
struct us_socket_context_t *us_socket_context(int ssl, struct us_socket_t *s) {
return s->context;
}
@@ -140,22 +150,76 @@ struct us_socket_t *us_socket_close(int ssl, struct us_socket_t *s, int code, vo
return s;
}
// This function is the same as us_socket_close but:
// - does not emit on_close event
// - does not close
struct us_socket_t *us_socket_detach(int ssl, struct us_socket_t *s) {
if (!us_socket_is_closed(0, s)) {
if (s->low_prio_state == 1) {
/* Unlink this socket from the low-priority queue */
if (!s->prev) s->context->loop->data.low_prio_head = s->next;
else s->prev->next = s->next;
if (s->next) s->next->prev = s->prev;
s->prev = 0;
s->next = 0;
s->low_prio_state = 0;
} else {
us_internal_socket_context_unlink_socket(s->context, s);
}
us_poll_stop((struct us_poll_t *) s, s->context->loop);
/* Link this socket to the close-list and let it be deleted after this iteration */
s->next = s->context->loop->data.closed_head;
s->context->loop->data.closed_head = s;
/* Any socket with prev = context is marked as closed */
s->prev = (struct us_socket_t *) s->context;
return s;
}
return s;
}
// This function is used for moving a socket between two different event loops
struct us_socket_t *us_socket_attach(int ssl, LIBUS_SOCKET_DESCRIPTOR client_fd, struct us_socket_context_t *ctx, int flags, int socket_ext_size) {
struct us_poll_t *accepted_p = us_create_poll(ctx->loop, 0, sizeof(struct us_socket_t) - sizeof(struct us_poll_t) + socket_ext_size);
us_poll_init(accepted_p, client_fd, POLL_TYPE_SOCKET);
us_poll_start(accepted_p, ctx->loop, flags);
struct us_socket_t *s = (struct us_socket_t *) accepted_p;
s->context = ctx;
s->timeout = 0;
s->low_prio_state = 0;
/* We always use nodelay */
bsd_socket_nodelay(client_fd, 1);
us_internal_socket_context_link_socket(ctx, s);
if (ctx->on_open) ctx->on_open(s, 0, 0, 0);
return s;
}
struct us_socket_t *us_socket_pair(struct us_socket_context_t *ctx, int socket_ext_size, LIBUS_SOCKET_DESCRIPTOR* fds) {
#ifdef LIBUS_USE_LIBUV
#if defined(LIBUS_USE_LIBUV) || defined(WIN32)
return 0;
#endif
#else
if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) != 0) {
return 0;
}
return us_socket_from_fd(ctx, socket_ext_size, fds[0]);
#endif
}
struct us_socket_t *us_socket_from_fd(struct us_socket_context_t *ctx, int socket_ext_size, LIBUS_SOCKET_DESCRIPTOR fd) {
#ifdef LIBUS_USE_LIBUV
#if defined(LIBUS_USE_LIBUV) || defined(WIN32)
return 0;
#endif
#else
struct us_poll_t *p1 = us_create_poll(ctx->loop, 0, sizeof(struct us_socket_t) + socket_ext_size);
us_poll_init(p1, fd, POLL_TYPE_SOCKET);
us_poll_start(p1, ctx->loop, LIBUS_SOCKET_READABLE | LIBUS_SOCKET_WRITABLE);
@@ -176,6 +240,7 @@ struct us_socket_t *us_socket_from_fd(struct us_socket_context_t *ctx, int socke
}
return s;
#endif
}
@@ -301,4 +366,4 @@ unsigned int us_get_remote_address_info(char *buf, struct us_socket_t *s, const
*port = bsd_addr_get_port(&addr);
return length;
}
}

View File

@@ -213,9 +213,9 @@ public:
unsigned char *b = (unsigned char *) binary.data();
if (binary.length() == 4) {
ipLength = sprintf(buf, "%u.%u.%u.%u", b[0], b[1], b[2], b[3]);
ipLength = snprintf(buf, sizeof(buf), "%u.%u.%u.%u", b[0], b[1], b[2], b[3]);
} else {
ipLength = sprintf(buf, "%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x",
ipLength = snprintf(buf, sizeof(buf), "%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x:%02x%02x",
b[0], b[1], b[2], b[3], b[4], b[5], b[6], b[7], b[8], b[9], b[10], b[11],
b[12], b[13], b[14], b[15]);
}

Some files were not shown because too many files have changed in this diff