Compare commits

...

100 Commits

Author SHA1 Message Date
Jarred Sumner
237810c142 Update jsc.zig 2025-03-31 17:13:27 -07:00
Jarred Sumner
a045907e58 ai slop not working SourceMap class 2025-03-31 17:13:24 -07:00
Jarred Sumner
323d78df5e Bump 2025-03-31 11:00:56 -07:00
Alistair Smith
adab0f64f9 quick fix for category rendering (#18669) 2025-03-31 05:50:34 -07:00
Alistair Smith
41d3f1bc9d fix: declaring Bun Env should declare on process.env also (#18665) 2025-03-31 04:25:08 -07:00
Jarred Sumner
b34703914c Fix build 2025-03-31 04:20:52 -07:00
Jarred Sumner
f3da1b80bc Use macOS signpost api for tracing (#14871)
Co-authored-by: Jarred-Sumner <Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Don Isaac <donald.isaac@gmail.com>
Co-authored-by: DonIsaac <DonIsaac@users.noreply.github.com>
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-03-31 04:13:11 -07:00
Alistair Smith
0814abe21e FIx some smaller issues in bun-types (#18642) 2025-03-31 03:41:13 -07:00
Jarred Sumner
c3be6732d1 Fixes #18572 (#18578)
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2025-03-31 02:15:27 -07:00
Jarred Sumner
c3e2bf0fc4 Update ban-words.test.ts 2025-03-31 02:14:36 -07:00
Jarred Sumner
78a9396038 Update ban-words.test.ts 2025-03-31 02:14:36 -07:00
Ciro Spaciari
e2ce3bd4ce fix(node:http) http.request empty requests should not always be transfer-encoding: chunked (#18649) 2025-03-31 02:10:31 -07:00
Jarred Sumner
fee911194a Remove dead code from js_printer (#18619) 2025-03-29 04:46:23 -07:00
Jarred Sumner
358a1db422 Clean up some mimalloc-related code (#18618) 2025-03-29 04:01:55 -07:00
Jarred Sumner
8929d65f0e Remove some dead code (#18617) 2025-03-29 03:18:08 -07:00
Don Isaac
f14e26bc85 fix: crash when Bun.write is called on a typedarray-backed Blob (#18600) 2025-03-29 03:04:10 -07:00
Jarred Sumner
43f7a241b9 Fix typo 2025-03-29 02:31:33 -07:00
Ciro Spaciari
7021c42cf2 fix(node:http) fix post regression (#18599)
Co-authored-by: cirospaciari <6379399+cirospaciari@users.noreply.github.com>
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: Jarred-Sumner <709451+Jarred-Sumner@users.noreply.github.com>
2025-03-29 01:50:26 -07:00
Don Isaac
1b10b61423 test: disable failing shadcn tests in CI (#18603) 2025-03-29 01:48:59 -07:00
Kai Tamkun
bb9128c0e8 Fix a node:http UAF (#18564) 2025-03-28 22:02:49 -07:00
Jarred Sumner
f38d35f7c9 Revert #18562 #18478 (#18610) 2025-03-28 20:23:49 -07:00
Don Isaac
f0dfa109bb fix(cli/pack): excluded entries nested within included dirs (#18509)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-28 19:05:37 -07:00
Don Isaac
27cf65a1e2 refactor(uws): remove unused uws_res_write_headers (#18607) 2025-03-28 17:52:05 -07:00
Alistair Smith
e83b5fb720 @types/bun: fix URLSearchParams, bring back old types test suite (#18598) 2025-03-28 17:51:52 -07:00
Dylan Conway
ee89130991 remove zig and zls path 2025-03-28 15:40:22 -07:00
Dylan Conway
0a4f36644f avoid encoding as double in napi_create_double if possible (#18585) 2025-03-28 15:16:32 -07:00
Don Isaac
a1ab2a4780 fix: Bun.write() with empty string creates a file (#18561) 2025-03-28 11:54:54 -07:00
Don Isaac
451c1905a8 ci: do not lint cli fixture files (#18596) 2025-03-28 11:26:12 -07:00
Jarred Sumner
accccbfdaf 2x faster headers.get, headers.delete, headers.has (#18571) 2025-03-28 01:15:00 -07:00
pfg
8e0c8a143e Fix for test-stdin-from-file closing fd '0' (#18559) 2025-03-27 21:38:50 -07:00
Jarred Sumner
9ea577efc0 Update CONTRIBUTING.md 2025-03-27 21:25:14 -07:00
Jarred Sumner
54416dad05 Add bd package.json script 2025-03-27 21:24:38 -07:00
chloe caruso
8f4575c0e4 fix: detection module type from extension (#18562) 2025-03-27 20:47:31 -07:00
Dylan Conway
c7edb24520 fix(install): resolution order and unused resolutions (#18560)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-27 20:46:58 -07:00
Ciro Spaciari
325acfc230 fix(node:http) fix regression where we throw ECONNRESET and/or ERR_STREAM_WRITE_AFTER_END after socket.end() (#18539)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-27 18:27:19 -07:00
Pham Minh Triet
7f60375cca docs: update require()'s compatibility status (#18556) 2025-03-27 18:26:13 -07:00
Ciro Spaciari
dac7f22997 fix(CSRF) remove undefined's and add more checks for bad tokens (#18535) 2025-03-27 17:47:00 -07:00
Alistair Smith
f5836c2013 Docs/tag relevant docs (#18544) 2025-03-27 16:49:15 -07:00
chloe caruso
70ddfb55e6 implement require.extensions (#18478) 2025-03-27 14:58:24 -07:00
HAHALOSAH
934e41ae59 docs: clarify sentence in sql (#18532) [no ci] 2025-03-27 09:55:10 -07:00
Alistair Smith
f4ae8c7254 Fix AbortSignal static methods when DOM is missing (#18530) 2025-03-27 15:10:18 +00:00
Alistair Smith
2a9569cec4 @types/bun: Some small fixes for the test and Bun.env (#18528) 2025-03-27 13:32:23 +00:00
Jarred Sumner
31060a5e2a Update package.json 2025-03-27 03:48:38 -07:00
Zack Radisic
5c0fa6dc21 Better error message for NAPI modules which access unsupported libuv functions (#18503)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-26 23:57:10 -07:00
Grigory
53f311fdd9 fix(nodevm): allow options props to be undefined (#18465) 2025-03-26 22:35:02 -07:00
pfg
b40f5c9669 more cookie mistakes (#18517)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-26 22:33:44 -07:00
Vladlen Grachev
317e9d23ab Fix CSS Modules when code splitting enabled (#18435)
Co-authored-by: Don Isaac <donald.isaac@gmail.com>
2025-03-26 21:45:22 -07:00
Don Isaac
11bb3573ea build: shave 30s off debug builds (#18516)
Co-authored-by: DonIsaac <22823424+DonIsaac@users.noreply.github.com>
2025-03-26 21:42:13 -07:00
Ciro Spaciari
39cf0906d1 fix(fetch) handle aborted connection inside start (#18512) 2025-03-26 20:52:49 -07:00
pfg
1d655a0232 cookie mistakes (#18513) 2025-03-26 20:51:20 -07:00
Ciro Spaciari
a548c2ec54 fix(node:http) properly signal server drain after draining the socket buffer (#18479) 2025-03-26 16:41:17 -07:00
Jarred Sumner
7740271359 Update cookie.md 2025-03-26 16:22:57 -07:00
Don Isaac
75144ab881 fix: reflect-metadata import order (#18086)
Co-authored-by: DonIsaac <22823424+DonIsaac@users.noreply.github.com>
2025-03-26 16:19:45 -07:00
Jarred Sumner
1dbeed20a9 Port https://github.com/WebKit/WebKit/pull/43041 (#18507) 2025-03-26 16:16:48 -07:00
Don Isaac
3af6f7a5fe fix(docs): failing typo (#18510) [no ci] 2025-03-26 16:06:40 -07:00
Jarred Sumner
1bfccf707b Update cookie.md 2025-03-26 15:51:40 -07:00
Jarred Sumner
21853d08de more cookie docs 2025-03-26 15:32:39 -07:00
Jarred Sumner
b6502189e8 Update nav.ts 2025-03-26 15:13:02 -07:00
pfg
f4ab2e4986 Remove two usingnamespace & ptrCast on function pointer (#18486) 2025-03-26 14:41:14 -07:00
Jarred Sumner
57cda4a445 Clean-up after #18485 (#18489) 2025-03-26 04:46:35 -07:00
Jarred Sumner
49ca2c86e7 More robust Bun.Cookie & Bun.CookieMap (#18359)
Co-authored-by: pfg <pfg@pfg.pw>
2025-03-26 02:51:41 -07:00
Ciro Spaciari
a08a9c5bfb compat(http) more compatibility in cookies (#18446)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-26 01:30:29 -07:00
Kai Tamkun
ee8a839500 Fix node:http UAF (#18485)
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2025-03-25 23:31:04 -07:00
Dylan Conway
8ee962d79f Fix #14881 (#18483) 2025-03-25 20:51:36 -07:00
Ciro Spaciari
4c3d652f00 chore(fetch) split fetch from response.zig (#18475) 2025-03-25 20:44:27 -07:00
Dylan Conway
c21fca08e2 fix node:crypto hash name regression (#18481) 2025-03-25 20:43:41 -07:00
Meghan Denny
77fde278e8 LATEST remove trailing newline that got added 2025-03-25 19:13:42 -07:00
Meghan Denny
517af630e7 Bump 2025-03-25 18:53:52 -07:00
Meghan Denny
d8e5335268 Bump 2025-03-25 18:52:49 -07:00
190n
db492575c8 Skip flaky macOS x64 node-napi tests in CI (v2) (#18468) 2025-03-25 18:20:08 -07:00
Don Isaac
9e580f8413 chore(bun-plugin-svelte): bump version to v0.0.6 (#18464) 2025-03-25 15:53:07 -07:00
190n
6ba2ba41c6 Skip "text lockfile is hoisted" test on Windows CI (#18473) 2025-03-25 15:38:14 -07:00
Alistair Smith
57381d43ed types: Rewrite to avoid conflicts and allow for doc generation (#18024)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-25 14:33:30 -07:00
Pham Minh Triet
90c67c4b79 docs: update node:crypto compatibility (#18459) 2025-03-25 12:38:21 -07:00
Twilight
cf9f2bf98e types(ffi): fix cc example jsdoc (#18457) 2025-03-25 12:37:59 -07:00
190n
8ebd5d53da Skip flaky macOS x64 node-napi tests in CI (#18448) 2025-03-24 23:49:14 -07:00
Kai Tamkun
60acfb17f0 node:http compatibility (options.lookup) (#18395)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-24 23:49:02 -07:00
Jarred Sumner
8735a3f4d6 -200 KB binary size (#18415) 2025-03-24 23:28:21 -07:00
Don Isaac
a07844ea13 feat(bun-plugin-svelte): custom elements (#18336) 2025-03-24 23:24:56 -07:00
Ciro Spaciari
1656bca9ab improve(fetch) (#18187) 2025-03-24 23:24:16 -07:00
Dylan Conway
43af1a2283 maybe fix crash in test-crypto-prime.js (#18450) 2025-03-24 22:46:28 -07:00
Jarred Sumner
84a21234d4 Split bun crypto APIs into more files (#18431) 2025-03-24 17:22:05 -07:00
Grigory
fefdaefb97 docs(contributing): update llvm version to 19 (#18421)
Co-authored-by: 190n <benjamin.j.grant@gmail.com>
2025-03-24 17:11:14 -07:00
Jarred Sumner
50eaea19cb Move TextDecoder, TextEncoderStreamEncoder, TextEncoder, EncodingLabel into separate files (#18430) 2025-03-24 17:10:48 -07:00
Jarred Sumner
438d8555c6 Split ServerWebSocket & NodeHTTPResponse into more files (#18432) 2025-03-24 17:09:30 -07:00
chloe caruso
163a51c0f6 fix BUN-604 (#18447) 2025-03-24 17:05:01 -07:00
pfg
8df7064f73 Don't crash when server.fetch() is called on a server without a fetch() handler (#18151)
Co-authored-by: Ciro Spaciari <ciro.spaciari@gmail.com>
2025-03-24 16:55:34 -07:00
xuxu's code
99ee90a58f types: refine idleTimeout in the example of SQL options (#18234) 2025-03-24 09:53:45 -07:00
Jarred Sumner
46c43d954c Add cursor rule documenting zig javascriptcore bindings 2025-03-24 03:10:15 -07:00
Jarred Sumner
b37054697b Fix BUN-DAR (#18400) 2025-03-22 14:37:19 -07:00
Jarred Sumner
5d50281f1a Bump WebKit (#18399)
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2025-03-22 02:03:50 -07:00
Don Isaac
6bef525704 fix: make JSPropertyIterator accept a JSObject instead of a JSValue (#18308)
Co-authored-by: DonIsaac <22823424+DonIsaac@users.noreply.github.com>
Co-authored-by: Dylan Conway <dylan.conway567@gmail.com>
Co-authored-by: Dylan Conway <35280289+dylan-conway@users.noreply.github.com>
2025-03-22 01:19:27 -07:00
Dylan Conway
687a0ab5a4 node:crypto: fix test-crypto-scrypt.js (#18396)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
2025-03-22 01:18:27 -07:00
Jarred Sumner
60ae19bded Revert "Introduce Bun.Cookie & Bun.CookieMap & request.cookies (in BunRequest) (#18073)"
This reverts commit 9888570456.

We will add it in Bun v1.2.7
2025-03-21 22:17:28 -07:00
Kai Tamkun
be41c884b4 Fix dns.resolve (#18393) 2025-03-21 21:46:06 -07:00
Dylan Conway
73d1b2ff67 Add benchmark for Cipheriv and Decipheriv (#18394) 2025-03-21 19:49:44 -07:00
Vincent (Wen Yu) Ge
2312b2c0f2 Fix typo in PR template (#18392) 2025-03-21 19:46:37 -07:00
chloe caruso
eae2c889ed refactor and rename JSCommonJSModule (#18390) 2025-03-21 18:46:39 -07:00
chloe caruso
ddd87fef12 module.children and Module.runMain (#18343)
Co-authored-by: Jarred Sumner <jarred@jarredsumner.com>
Co-authored-by: 190n <ben@bun.sh>
2025-03-21 16:57:10 -07:00
Ashcon Partovi
f36d480919 fix: test-http-header-obstext.js (#18371) 2025-03-21 15:28:12 -07:00
413 changed files with 41111 additions and 18684 deletions

View File

@@ -1,13 +1,14 @@
---
description: JavaScript class implemented in C++
globs: *.cpp
alwaysApply: false
---
# Implementing JavaScript classes in C++
If there is a publicly accessible Constructor and Prototype, then there are 3 classes:
-- IF there are C++ class members we need a destructor, so `class Foo : public JSC::DestructibleObject`, if no C++ class fields (only JS properties) then we don't need a class at all usually. We can instead use JSC::constructEmptyObject(vm, structure) and `putDirectOffset` like in [NodeFSBinding.cpp](mdc:src/bun.js/bindings/NodeFSBinding.cpp).
+- IF there are C++ class members we need a destructor, so `class Foo : public JSC::DestructibleObject`, if no C++ class fields (only JS properties) then we don't need a class at all usually. We can instead use JSC::constructEmptyObject(vm, structure) and `putDirectOffset` like in [NodeFSStatBinding.cpp](mdc:src/bun.js/bindings/NodeFSStatBinding.cpp).
- class FooPrototype : public JSC::JSNonFinalObject
- class FooConstructor : public JSC::InternalFunction
@@ -18,6 +19,7 @@ If there are C++ fields on the Foo class, the Foo class will need an iso subspac
Usually you'll need to #include "root.h" at the top of C++ files or you'll get lint errors.
Generally, defining the subspace looks like this:
```c++
class Foo : public JSC::DestructibleObject {
@@ -45,6 +47,7 @@ It's better to put it in the .cpp file instead of the .h file, when possible.
## Defining properties
Define properties on the prototype. Use a const HashTableValues like this:
```C++
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckEmail);
static JSC_DECLARE_HOST_FUNCTION(jsX509CertificateProtoFuncCheckHost);
@@ -158,6 +161,7 @@ void JSX509CertificatePrototype::finishCreation(VM& vm)
```
### Getter definition:
```C++
JSC_DEFINE_CUSTOM_GETTER(jsX509CertificateGetter_ca, (JSGlobalObject * globalObject, EncodedJSValue thisValue, PropertyName))
@@ -212,7 +216,6 @@ JSC_DEFINE_HOST_FUNCTION(jsX509CertificateProtoFuncToJSON, (JSGlobalObject * glo
}
```
### Constructor definition
```C++
@@ -259,7 +262,6 @@ private:
};
```
### Structure caching
If there's a class, prototype, and constructor:
@@ -279,6 +281,7 @@ void GlobalObject::finishCreation(VM& vm) {
```
Then, implement the function that creates the structure:
```c++
void setupX509CertificateClassStructure(LazyClassStructure::Initializer& init)
{
@@ -301,11 +304,12 @@ If there's only a class, use `JSC::LazyProperty<JSGlobalObject, Structure>` inst
1. Add the `JSC::LazyProperty<JSGlobalObject, Structure>` to @ZigGlobalObject.h
2. Initialize the class structure in @ZigGlobalObject.cpp in `void GlobalObject::finishCreation(VM& vm)`
3. Visit the lazy property in visitChildren in @ZigGlobalObject.cpp in `void GlobalObject::visitChildrenImpl`
void GlobalObject::finishCreation(VM& vm) {
    // ...
    this.m_myLazyProperty.initLater([](const JSC::LazyProperty<JSC::JSGlobalObject, JSC::Structure>::Initializer& init) {
-        init.set(Bun::initMyStructure(init.vm, reinterpret_cast<Zig::GlobalObject*>(init.owner)));
+        init.set(Bun::initMyStructure(init.vm, reinterpret_cast<Zig::GlobalObject\*>(init.owner)));
    });
```
Then, implement the function that creates the structure:
@@ -316,7 +320,7 @@ Structure* setupX509CertificateStructure(JSC::VM &vm, Zig::GlobalObject* globalO
auto* prototypeStructure = JSX509CertificatePrototype::createStructure(init.vm, init.global, init.global->objectPrototype());
auto* prototype = JSX509CertificatePrototype::create(init.vm, init.global, prototypeStructure);
// If there is no prototype or it only has
auto* structure = JSX509Certificate::createStructure(init.vm, init.global, prototype);
init.setPrototype(prototype);
@@ -325,7 +329,6 @@ Structure* setupX509CertificateStructure(JSC::VM &vm, Zig::GlobalObject* globalO
}
```
Then, use the structure by calling `globalObject.m_myStructureName.get(globalObject)`
```C++
@@ -378,12 +381,14 @@ extern "C" JSC::EncodedJSValue Bun__JSBigIntStatsObjectConstructor(Zig::GlobalOb
```
Zig:
```zig
extern "c" fn Bun__JSBigIntStatsObjectConstructor(*JSC.JSGlobalObject) JSC.JSValue;
pub const getBigIntStatsConstructor = Bun__JSBigIntStatsObjectConstructor;
```
-To create an object (instance) of a JS class defined in C++ from Zig, follow the __toJS convention like this:
+To create an object (instance) of a JS class defined in C++ from Zig, follow the \_\_toJS convention like this:
```c++
// X509* is whatever we need to create the object
extern "C" EncodedJSValue Bun__X509__toJS(Zig::GlobalObject* globalObject, X509* cert)
@@ -395,12 +400,13 @@ extern "C" EncodedJSValue Bun__X509__toJS(Zig::GlobalObject* globalObject, X509*
```
And from Zig:
```zig
const X509 = opaque {
    // ... class
    extern fn Bun__X509__toJS(*JSC.JSGlobalObject, *X509) JSC.JSValue;
    pub fn toJS(this: *X509, globalObject: *JSC.JSGlobalObject) JSC.JSValue {
        return Bun__X509__toJS(globalObject, this);
    }

View File

@@ -0,0 +1,488 @@
---
description: How Zig works with JavaScriptCore bindings generator
globs:
alwaysApply: false
---
# Bun's JavaScriptCore Class Bindings Generator
This document explains how Bun's class bindings generator works to bridge Zig and JavaScript code through JavaScriptCore (JSC).
## Architecture Overview
Bun's binding system creates a seamless bridge between JavaScript and Zig, allowing Zig implementations to be exposed as JavaScript classes. The system has several key components:
1. **Zig Implementation** (.zig files)
2. **JavaScript Interface Definition** (.classes.ts files)
3. **Generated Code** (C++/Zig files that connect everything)
## Class Definition Files
### JavaScript Interface (.classes.ts)
The `.classes.ts` files define the JavaScript API using a declarative approach:
```typescript
// Example: encoding.classes.ts
define({
  name: "TextDecoder",
  constructor: true,
  JSType: "object",
  finalize: true,
  proto: {
    decode: {
      // Function definition
      args: 1,
    },
    encoding: {
      // Getter with caching
      getter: true,
      cache: true,
    },
    fatal: {
      // Read-only property
      getter: true,
    },
    ignoreBOM: {
      // Read-only property
      getter: true,
    },
  },
});
```
Each class definition specifies:
- The class name
- Whether it has a constructor
- JavaScript type (object, function, etc.)
- Properties and methods in the `proto` field
- Caching strategy for properties
- Finalization requirements
### Zig Implementation (.zig)
The Zig files implement the native functionality:
```zig
// Example: TextDecoder.zig
pub const TextDecoder = struct {
    // Internal state
    encoding: []const u8,
    fatal: bool,
    ignoreBOM: bool,

    // Use generated bindings
    pub usingnamespace JSC.Codegen.JSTextDecoder;
    pub usingnamespace bun.New(@This());

    // Constructor implementation - note use of globalObject
    pub fn constructor(
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!*TextDecoder {
        // Implementation
    }

    // Prototype methods - note return type includes JSError
    pub fn decode(
        this: *TextDecoder,
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!JSC.JSValue {
        // Implementation
    }

    // Getters
    pub fn getEncoding(this: *TextDecoder, globalObject: *JSGlobalObject) JSC.JSValue {
        return JSC.JSValue.createStringFromUTF8(globalObject, this.encoding);
    }

    pub fn getFatal(this: *TextDecoder, globalObject: *JSGlobalObject) JSC.JSValue {
        return JSC.JSValue.jsBoolean(this.fatal);
    }

    // Cleanup - note standard pattern of using deinit/deref
    pub fn deinit(this: *TextDecoder) void {
        // Release any retained resources
    }

    pub fn finalize(this: *TextDecoder) void {
        this.deinit();
        // Or sometimes this is used to free memory instead
        bun.default_allocator.destroy(this);
    }
};
```
Key components in the Zig file:
- The struct containing native state
- `usingnamespace JSC.Codegen.JS<ClassName>` to include generated code
- `usingnamespace bun.New(@This())` for object creation helpers
- Constructor and methods using `bun.JSError!JSValue` return type for proper error handling
- Consistent use of `globalObject` parameter name instead of `ctx`
- Methods matching the JavaScript interface
- Getters/setters for properties
- Proper resource cleanup pattern with `deinit()` and `finalize()`
## Code Generation System
The binding generator produces C++ code that connects JavaScript and Zig:
1. **JSC Class Structure**: Creates C++ classes for the JS object, prototype, and constructor
2. **Memory Management**: Handles GC integration through JSC's WriteBarrier
3. **Method Binding**: Connects JS function calls to Zig implementations
4. **Type Conversion**: Converts between JS values and Zig types
5. **Property Caching**: Implements the caching system for properties
The generated C++ code includes:
- A JSC wrapper class (`JSTextDecoder`)
- A prototype class (`JSTextDecoderPrototype`)
- A constructor function (`JSTextDecoderConstructor`)
- Function bindings (`TextDecoderPrototype__decodeCallback`)
- Property getters/setters (`TextDecoderPrototype__encodingGetterWrap`)
## CallFrame Access
The `CallFrame` object provides access to JavaScript execution context:
```zig
pub fn decode(
    this: *TextDecoder,
    globalObject: *JSGlobalObject,
    callFrame: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
    // Get arguments
    const input = callFrame.argument(0);
    const options = callFrame.argument(1);

    // Get this value
    const thisValue = callFrame.thisValue();

    // Implementation with error handling
    if (input.isUndefinedOrNull()) {
        return globalObject.throw("Input cannot be null or undefined", .{});
    }

    // Return value or throw error
    return JSC.JSValue.jsString(globalObject, "result");
}
```
CallFrame methods include:
- `argument(i)`: Get the i-th argument
- `argumentCount()`: Get the number of arguments
- `thisValue()`: Get the `this` value
- `callee()`: Get the function being called
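These `CallFrame` semantics are observable from JavaScript: an argument that isn't passed reads as `undefined` via `argument(i)`. A small check with the built-in `TextDecoder`:

```typescript
const d = new TextDecoder();

// One argument: callFrame.argument(0) sees the Uint8Array.
console.log(d.decode(new Uint8Array([65]))); // "A"

// No arguments: callFrame.argument(0) reads as undefined, and decode
// treats a missing input as an empty result.
console.log(d.decode()); // ""
```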
## Property Caching and GC-Owned Values
The `cache: true` option in property definitions enables JSC's WriteBarrier to efficiently store values:
```typescript
encoding: {
  getter: true,
  cache: true, // Enable caching
}
```
### C++ Implementation
In the generated C++ code, caching uses JSC's WriteBarrier:
```cpp
JSC_DEFINE_CUSTOM_GETTER(TextDecoderPrototype__encodingGetterWrap, (...)) {
    auto& vm = JSC::getVM(lexicalGlobalObject);
    Zig::GlobalObject* globalObject = reinterpret_cast<Zig::GlobalObject*>(lexicalGlobalObject);
    auto throwScope = DECLARE_THROW_SCOPE(vm);
    JSTextDecoder* thisObject = jsCast<JSTextDecoder*>(JSValue::decode(encodedThisValue));
    JSC::EnsureStillAliveScope thisArg = JSC::EnsureStillAliveScope(thisObject);

    // Check for cached value and return if present
    if (JSValue cachedValue = thisObject->m_encoding.get())
        return JSValue::encode(cachedValue);

    // Get value from Zig implementation
    JSC::JSValue result = JSC::JSValue::decode(
        TextDecoderPrototype__getEncoding(thisObject->wrapped(), globalObject));
    RETURN_IF_EXCEPTION(throwScope, {});

    // Store in cache for future access
    thisObject->m_encoding.set(vm, thisObject, result);
    RELEASE_AND_RETURN(throwScope, JSValue::encode(result));
}
```
### Zig Accessor Functions
For each cached property, the generator creates Zig accessor functions that allow Zig code to work with these GC-owned values:
```zig
// External function declarations
extern fn TextDecoderPrototype__encodingSetCachedValue(JSC.JSValue, *JSC.JSGlobalObject, JSC.JSValue) callconv(JSC.conv) void;
extern fn TextDecoderPrototype__encodingGetCachedValue(JSC.JSValue) callconv(JSC.conv) JSC.JSValue;

/// `TextDecoder.encoding` setter
/// This value will be visited by the garbage collector.
pub fn encodingSetCached(thisValue: JSC.JSValue, globalObject: *JSC.JSGlobalObject, value: JSC.JSValue) void {
    JSC.markBinding(@src());
    TextDecoderPrototype__encodingSetCachedValue(thisValue, globalObject, value);
}

/// `TextDecoder.encoding` getter
/// This value will be visited by the garbage collector.
pub fn encodingGetCached(thisValue: JSC.JSValue) ?JSC.JSValue {
    JSC.markBinding(@src());
    const result = TextDecoderPrototype__encodingGetCachedValue(thisValue);
    if (result == .zero)
        return null;
    return result;
}
```
### Benefits of GC-Owned Values
This system provides several key benefits:
1. **Automatic Memory Management**: The JavaScriptCore GC tracks and manages these values
2. **Proper Garbage Collection**: The WriteBarrier ensures values are properly visited during GC
3. **Consistent Access**: Zig code can easily get/set these cached JS values
4. **Performance**: Cached values avoid repeated computation or serialization
### Use Cases
GC-owned cached values are particularly useful for:
1. **Computed Properties**: Store expensive computation results
2. **Lazily Created Objects**: Create objects only when needed, then cache them
3. **References to Other Objects**: Store references to other JS objects that need GC tracking
4. **Memoization**: Cache results based on input parameters
The WriteBarrier mechanism ensures that any JS values stored in this way are properly tracked by the garbage collector.
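The observable behavior of the cache can be sketched in plain TypeScript. `SocketAddressLike` below is a hypothetical stand-in for a bound class; the native version stores the cached value in a `WriteBarrier` rather than a field:

```typescript
// Hypothetical illustration of "lazily create, then cache" - not a Bun API.
class SocketAddressLike {
  #presentation: string | null = null; // plays the role of the cached JSValue

  constructor(
    private ip: string,
    private port: number,
  ) {}

  get presentation(): string {
    // First access computes the value; later accesses return the cached one,
    // just like checking m_encoding.get() before calling into Zig.
    if (this.#presentation === null) {
      this.#presentation = `${this.ip}:${this.port}`;
    }
    return this.#presentation;
  }
}

const addr = new SocketAddressLike("127.0.0.1", 3000);
console.log(addr.presentation); // "127.0.0.1:3000"
```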
## Memory Management and Finalization
The binding system handles memory management across the JavaScript/Zig boundary:
1. **Object Creation**: JavaScript `new TextDecoder()` creates both a JS wrapper and a Zig struct
2. **Reference Tracking**: JSC's GC tracks all JS references to the object
3. **Finalization**: When the JS object is collected, the finalizer releases Zig resources
Bun uses a consistent pattern for resource cleanup:
```zig
// Resource cleanup method - separate from finalization
pub fn deinit(this: *TextDecoder) void {
    // Release resources like strings
    this._encoding.deref(); // String deref pattern

    // Free any buffers
    if (this.buffer) |buffer| {
        bun.default_allocator.free(buffer);
    }
}

// Called by the GC when object is collected
pub fn finalize(this: *TextDecoder) void {
    JSC.markBinding(@src()); // For debugging
    this.deinit(); // Clean up resources
    bun.default_allocator.destroy(this); // Free the object itself
}
```
Some objects that hold references to other JS objects use `.deref()` instead:
```zig
pub fn finalize(this: *SocketAddress) void {
    JSC.markBinding(@src());
    this._presentation.deref(); // Release references
    this.destroy();
}
```
## Error Handling with JSError
Bun uses the `bun.JSError!JSValue` return type for proper error handling:
```zig
pub fn decode(
    this: *TextDecoder,
    globalObject: *JSGlobalObject,
    callFrame: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
    // Throwing an error
    if (callFrame.argumentCount() < 1) {
        return globalObject.throw("Missing required argument", .{});
    }

    // Or returning a success value
    return JSC.JSValue.jsString(globalObject, "Success!");
}
```
This pattern allows Zig functions to:
1. Return JavaScript values on success
2. Throw JavaScript exceptions on error
3. Propagate errors automatically through the call stack
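The net effect visible from JavaScript is that failures in native code arrive as ordinary exceptions. For example, with the built-in `TextDecoder` (error types per the WHATWG Encoding spec):

```typescript
// An unsupported label rejects in the constructor:
try {
  new TextDecoder("not-a-real-encoding");
} catch (e) {
  console.log(e instanceof RangeError); // true
}

// With fatal: true, invalid input makes decode() throw:
try {
  new TextDecoder("utf-8", { fatal: true }).decode(new Uint8Array([0xff]));
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```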
## Type Safety and Error Handling
The binding system includes robust error handling:
```cpp
// Example of type checking in generated code
JSTextDecoder* thisObject = jsDynamicCast<JSTextDecoder*>(callFrame->thisValue());
if (UNLIKELY(!thisObject)) {
    scope.throwException(lexicalGlobalObject,
        Bun::createInvalidThisError(lexicalGlobalObject, callFrame->thisValue(), "TextDecoder"_s));
    return {};
}
```
## Prototypal Inheritance
The binding system creates proper JavaScript prototype chains:
1. **Constructor**: JSTextDecoderConstructor with standard .prototype property
2. **Prototype**: JSTextDecoderPrototype with methods and properties
3. **Instances**: Each `JSTextDecoder` instance with `__proto__` pointing to the prototype
This ensures JavaScript inheritance works as expected:
```cpp
// From generated code
void JSTextDecoderConstructor::finishCreation(VM& vm, JSC::JSGlobalObject* globalObject, JSTextDecoderPrototype* prototype)
{
    Base::finishCreation(vm, 0, "TextDecoder"_s, PropertyAdditionMode::WithoutStructureTransition);

    // Set up the prototype chain
    putDirectWithoutTransition(vm, vm.propertyNames->prototype, prototype, PropertyAttribute::DontEnum | PropertyAttribute::DontDelete | PropertyAttribute::ReadOnly);
    ASSERT(inherits(info()));
}
```
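The resulting chain can be verified from JavaScript (shown with the built-in `TextDecoder`):

```typescript
const td = new TextDecoder();

// Instances point at the shared prototype object:
console.log(Object.getPrototypeOf(td) === TextDecoder.prototype); // true
console.log(td instanceof TextDecoder); // true

// The constructor's `prototype` property is read-only and non-enumerable:
const desc = Object.getOwnPropertyDescriptor(TextDecoder, "prototype");
console.log(desc?.writable, desc?.enumerable); // false false
```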
## Performance Considerations
The binding system is optimized for performance:
1. **Direct Pointer Access**: JavaScript objects maintain a direct pointer to Zig objects
2. **Property Caching**: WriteBarrier caching avoids repeated native calls for stable properties
3. **Memory Management**: JSC garbage collection integrated with Zig memory management
4. **Type Conversion**: Fast paths for common JavaScript/Zig type conversions
## Creating a New Class Binding
To create a new class binding in Bun:
1. **Define the class interface** in a `.classes.ts` file:
```typescript
define({
  name: "MyClass",
  constructor: true,
  finalize: true,
  proto: {
    myMethod: {
      args: 1,
    },
    myProperty: {
      getter: true,
      cache: true,
    },
  },
});
```
2. **Implement the native functionality** in a `.zig` file:
```zig
pub const MyClass = struct {
    // State
    value: []const u8,

    // Generated bindings
    pub usingnamespace JSC.Codegen.JSMyClass;
    pub usingnamespace bun.New(@This());

    // Constructor
    pub fn constructor(
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!*MyClass {
        const arg = callFrame.argument(0);
        // Implementation
    }

    // Method
    pub fn myMethod(
        this: *MyClass,
        globalObject: *JSGlobalObject,
        callFrame: *JSC.CallFrame,
    ) bun.JSError!JSC.JSValue {
        // Implementation
    }

    // Getter
    pub fn getMyProperty(this: *MyClass, globalObject: *JSGlobalObject) JSC.JSValue {
        return JSC.JSValue.jsString(globalObject, this.value);
    }

    // Resource cleanup
    pub fn deinit(this: *MyClass) void {
        // Clean up resources
    }

    pub fn finalize(this: *MyClass) void {
        this.deinit();
        bun.default_allocator.destroy(this);
    }
};
```
3. **The binding generator** creates all necessary C++ and Zig glue code to connect JavaScript and Zig, including:
- C++ class definitions
- Method and property bindings
- Memory management utilities
- GC integration code
## Generated Code Structure
The binding generator produces several components:
### 1. C++ Classes
For each Zig class, the system generates:
- **JS<Class>**: Main wrapper that holds a pointer to the Zig object (`JSTextDecoder`)
- **JS<Class>Prototype**: Contains methods and properties (`JSTextDecoderPrototype`)
- **JS<Class>Constructor**: Implementation of the JavaScript constructor (`JSTextDecoderConstructor`)
### 2. C++ Methods and Properties
- **Method Callbacks**: `TextDecoderPrototype__decodeCallback`
- **Property Getters/Setters**: `TextDecoderPrototype__encodingGetterWrap`
- **Initialization Functions**: `finishCreation` methods for setting up the class
### 3. Zig Bindings
- **External Function Declarations**:
```zig
extern fn TextDecoderPrototype__decode(*TextDecoder, *JSC.JSGlobalObject, *JSC.CallFrame) callconv(JSC.conv) JSC.EncodedJSValue;
```
- **Cached Value Accessors**:
```zig
pub fn encodingGetCached(thisValue: JSC.JSValue) ?JSC.JSValue { ... }
pub fn encodingSetCached(thisValue: JSC.JSValue, globalObject: *JSC.JSGlobalObject, value: JSC.JSValue) void { ... }
```
- **Constructor Helpers**:
```zig
pub fn create(globalObject: *JSC.JSGlobalObject) bun.JSError!JSC.JSValue { ... }
```
### 4. GC Integration
- **Memory Cost Calculation**: `estimatedSize` method
- **Child Visitor Methods**: `visitChildrenImpl` and `visitAdditionalChildren`
- **Heap Analysis**: `analyzeHeap` for debugging memory issues
This architecture makes it possible to implement high-performance native functionality in Zig while exposing a clean, idiomatic JavaScript API to users.

View File

@@ -28,7 +28,7 @@ This adds a new flag --bail to bun test. When set, it will stop running tests af
- [ ] I checked the lifetime of memory allocated to verify it's (1) freed and (2) only freed when it should be
- [ ] I included a test for the new code, or an existing test covers it
-- [ ] JSValue used outside outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
+- [ ] JSValue used outside of the stack is either wrapped in a JSC.Strong or is JSValueProtect'ed
- [ ] I wrote TypeScript/JavaScript tests and they pass locally (`bun-debug test test-file-name.test`)
-->

View File

@@ -35,8 +35,6 @@
// "zig.zls.enableBuildOnSave": true,
// "zig.buildOnSave": true,
"zig.buildFilePath": "${workspaceFolder}/build.zig",
-"zig.path": "${workspaceFolder}/vendor/zig/zig.exe",
-"zig.zls.path": "${workspaceFolder}/vendor/zig/zls.exe",
"zig.formattingProvider": "zls",
"zig.zls.enableInlayHints": false,
"[zig]": {

View File

@@ -53,39 +53,39 @@ $ brew install bun
## Install LLVM
Bun requires LLVM 18 (`clang` is part of LLVM). This version requirement is to match WebKit (precompiled), as mismatching versions will cause memory allocation failures at runtime. In most cases, you can install LLVM through your system package manager:
Bun requires LLVM 19 (`clang` is part of LLVM). This version requirement is to match WebKit (precompiled), as mismatching versions will cause memory allocation failures at runtime. In most cases, you can install LLVM through your system package manager:
{% codetabs group="os" %}
```bash#macOS (Homebrew)
$ brew install llvm@18
$ brew install llvm@19
```
```bash#Ubuntu/Debian
$ # LLVM has an automatic installation script that is compatible with all versions of Ubuntu
$ wget https://apt.llvm.org/llvm.sh -O - | sudo bash -s -- 18 all
$ wget https://apt.llvm.org/llvm.sh -O - | sudo bash -s -- 19 all
```
```bash#Arch
$ sudo pacman -S llvm clang18 lld
$ sudo pacman -S llvm clang lld
```
```bash#Fedora
$ sudo dnf install llvm18 clang18 lld18-devel
$ sudo dnf install llvm clang lld-devel
```
```bash#openSUSE Tumbleweed
$ sudo zypper install clang18 lld18 llvm18
$ sudo zypper install clang19 lld19 llvm19
```
{% /codetabs %}
If none of the above solutions apply, you will have to install it [manually](https://github.com/llvm/llvm-project/releases/tag/llvmorg-19.1.7).
Make sure Clang/LLVM 18 is in your path:
Make sure Clang/LLVM 19 is in your path:
```bash
$ which clang-18
$ which clang-19
```
If not, run this to manually add it:
@@ -94,13 +94,13 @@ If not, run this to manually add it:
```bash#macOS (Homebrew)
# use fish_add_path if you're using fish
# use path+="$(brew --prefix llvm@18)/bin" if you are using zsh
$ export PATH="$(brew --prefix llvm@18)/bin:$PATH"
# use path+="$(brew --prefix llvm@19)/bin" if you are using zsh
$ export PATH="$(brew --prefix llvm@19)/bin:$PATH"
```
```bash#Arch
# use fish_add_path if you're using fish
$ export PATH="$PATH:/usr/lib/llvm18/bin"
$ export PATH="$PATH:/usr/lib/llvm19/bin"
```
{% /codetabs %}
@@ -134,6 +134,16 @@ We recommend adding `./build/debug` to your `$PATH` so that you can run `bun-deb
$ bun-debug
```
## Running debug builds
The `bd` package.json script compiles and runs a debug build of Bun, printing the build output only if the build fails.
```sh
$ bun bd <args>
$ bun bd test foo.test.ts
$ bun bd ./foo.ts
```
## Code generation scripts
Several code generation scripts are used during Bun's build process. These are run automatically when changes are made to certain files.
@@ -250,7 +260,7 @@ The issue may manifest when initially running `bun setup` as Clang being unable
```
The C++ compiler
"/usr/bin/clang++-18"
"/usr/bin/clang++-19"
is not able to compile a simple test program.
```

View File

@@ -1 +1 @@
1.2.5
1.2.8

View File

@@ -0,0 +1,44 @@
import { bench, run } from "../runner.mjs";
import crypto from "node:crypto";
import { Buffer } from "node:buffer";
const keylen = { "aes-128-gcm": 16, "aes-192-gcm": 24, "aes-256-gcm": 32 };
const sizes = [4 * 1024, 1024 * 1024];
const ciphers = ["aes-128-gcm", "aes-192-gcm", "aes-256-gcm"];
const messages = {};
sizes.forEach(size => {
messages[size] = Buffer.alloc(size, "b");
});
const keys = {};
ciphers.forEach(cipher => {
keys[cipher] = crypto.randomBytes(keylen[cipher]);
});
// Fixed IV and AAD
const iv = crypto.randomBytes(12);
const associate_data = Buffer.alloc(16, "z");
for (const cipher of ciphers) {
for (const size of sizes) {
const message = messages[size];
const key = keys[cipher];
bench(`${cipher} ${size / 1024}KB`, () => {
const alice = crypto.createCipheriv(cipher, key, iv);
alice.setAAD(associate_data);
const enc = alice.update(message);
alice.final();
const tag = alice.getAuthTag();
const bob = crypto.createDecipheriv(cipher, key, iv);
bob.setAuthTag(tag);
bob.setAAD(associate_data);
bob.update(enc);
bob.final();
});
}
}
await run();

View File

@@ -1,7 +1,7 @@
{
"compilerOptions": {
// Enable latest features
"lib": ["ESNext", "DOM"],
"lib": ["ESNext"],
"target": "ESNext",
"module": "ESNext",
"moduleDetection": "force",

View File

@@ -1,7 +1,7 @@
{
"compilerOptions": {
// Enable latest features
"lib": ["ESNext", "DOM"],
"lib": ["ESNext"],
"target": "ESNext",
"module": "ESNext",
"moduleDetection": "force",

View File

@@ -27,9 +27,10 @@
},
"packages/bun-types": {
"name": "bun-types",
"version": "1.2.5",
"dependencies": {
"@types/node": "*",
"@types/ws": "~8.5.10",
"@types/ws": "*",
},
"devDependencies": {
"@biomejs/biome": "^1.5.3",

View File

@@ -4,7 +4,7 @@ register_repository(
REPOSITORY
oven-sh/boringssl
COMMIT
914b005ef3ece44159dca0ffad74eb42a9f6679f
7a5d984c69b0c34c4cbb56c6812eaa5b9bef485c
)
register_cmake_command(

View File

@@ -785,6 +785,10 @@ target_include_directories(${bun} PRIVATE
${NODEJS_HEADERS_PATH}/include
)
if(NOT WIN32)
target_include_directories(${bun} PRIVATE ${CWD}/src/bun.js/bindings/libuv)
endif()
if(LINUX)
include(CheckIncludeFiles)
check_include_files("sys/queue.h" HAVE_SYS_QUEUE_H)

View File

@@ -2,7 +2,7 @@ option(WEBKIT_VERSION "The version of WebKit to use")
option(WEBKIT_LOCAL "If a local version of WebKit should be used instead of downloading")
if(NOT WEBKIT_VERSION)
set(WEBKIT_VERSION 91bf2baced1b1309c7e05f19177c97fefec20976)
set(WEBKIT_VERSION ef31d98a1370e01b7483cabcbe3593d055bea982)
endif()
string(SUBSTRING ${WEBKIT_VERSION} 0 16 WEBKIT_VERSION_PREFIX)

docs/api/cookie.md Normal file
View File

@@ -0,0 +1,449 @@
Bun provides native APIs for working with HTTP cookies through `Bun.Cookie` and `Bun.CookieMap`. These APIs offer fast, easy-to-use methods for parsing, generating, and manipulating cookies in HTTP requests and responses.
## CookieMap class
`Bun.CookieMap` provides a Map-like interface for working with collections of cookies. It implements the `Iterable` interface, allowing you to use it with `for...of` loops and other iteration methods.
```ts
// Empty cookie map
const cookies = new Bun.CookieMap();
// From a cookie string
const cookies1 = new Bun.CookieMap("name=value; foo=bar");
// From an object
const cookies2 = new Bun.CookieMap({
session: "abc123",
theme: "dark",
});
// From an array of name/value pairs
const cookies3 = new Bun.CookieMap([
["session", "abc123"],
["theme", "dark"],
]);
```
### In HTTP servers
In Bun's HTTP server, the `cookies` property on the request object (in `routes`) is an instance of `CookieMap`:
```ts
const server = Bun.serve({
routes: {
"/": req => {
// Access request cookies
const cookies = req.cookies;
// Get a specific cookie
const sessionCookie = cookies.get("session");
if (sessionCookie != null) {
console.log(sessionCookie);
}
// Check if a cookie exists
if (cookies.has("theme")) {
// ...
}
// Set a cookie, it will be automatically applied to the response
cookies.set("visited", "true");
return new Response("Hello");
},
},
});
console.log("Server listening at: " + server.url);
```
### Methods
#### `get(name: string): string | null`
Retrieves a cookie by name. Returns `null` if the cookie doesn't exist.
```ts
// Get by name
const cookie = cookies.get("session");
if (cookie != null) {
console.log(cookie);
}
```
#### `has(name: string): boolean`
Checks if a cookie with the given name exists.
```ts
// Check if cookie exists
if (cookies.has("session")) {
// Cookie exists
}
```
#### `set(name: string, value: string): void`
#### `set(options: CookieInit): void`
#### `set(cookie: Cookie): void`
Adds or updates a cookie in the map. Cookies default to `{ path: "/", sameSite: "lax" }`.
```ts
// Set by name and value
cookies.set("session", "abc123");
// Set using options object
cookies.set({
name: "theme",
value: "dark",
maxAge: 3600,
secure: true,
});
// Set using Cookie instance
const cookie = new Bun.Cookie("visited", "true");
cookies.set(cookie);
```
#### `delete(name: string): void`
#### `delete(options: CookieStoreDeleteOptions): void`
Removes a cookie from the map. When applied to a Response, this adds a cookie with an empty string value and an expiry date in the past. A cookie will only delete successfully in the browser if the domain and path are the same as when the cookie was created.
```ts
// Delete by name using default domain and path.
cookies.delete("session");
// Delete with domain/path options.
cookies.delete({
name: "session",
domain: "example.com",
path: "/admin",
});
```
#### `toJSON(): Record<string, string>`
Converts the cookie map to a serializable format.
```ts
const json = cookies.toJSON();
```
#### `toSetCookieHeaders(): string[]`
Returns an array of values for Set-Cookie headers that can be used to apply all cookie changes.
When using `Bun.serve()`, you don't need to call this method explicitly. Any changes made to the `req.cookies` map are automatically applied to the response headers. This method is primarily useful when working with other HTTP server implementations.
```js
import { createServer } from "node:http";
import { CookieMap } from "bun";
const server = createServer((req, res) => {
const cookieHeader = req.headers.cookie || "";
const cookies = new CookieMap(cookieHeader);
cookies.set("view-count", Number(cookies.get("view-count") || "0") + 1);
cookies.delete("session");
res.writeHead(200, {
"Content-Type": "text/plain",
"Set-Cookie": cookies.toSetCookieHeaders(),
});
res.end(`Found ${cookies.size} cookies`);
});
server.listen(3000, () => {
console.log("Server running at http://localhost:3000/");
});
```
### Iteration
`CookieMap` provides several methods for iteration:
```ts
// Iterate over [name, value] entries
for (const [name, value] of cookies) {
console.log(`${name}: ${value}`);
}
// Using entries()
for (const [name, value] of cookies.entries()) {
console.log(`${name}: ${value}`);
}
// Using keys()
for (const name of cookies.keys()) {
console.log(name);
}
// Using values()
for (const value of cookies.values()) {
console.log(value);
}
// Using forEach
cookies.forEach((value, name) => {
console.log(`${name}: ${value}`);
});
```
### Properties
#### `size: number`
Returns the number of cookies in the map.
```ts
console.log(cookies.size); // Number of cookies
```
## Cookie class
`Bun.Cookie` represents an HTTP cookie with its name, value, and attributes.
```ts
import { Cookie } from "bun";
// Create a basic cookie
const cookie = new Bun.Cookie("name", "value");
// Create a cookie with options
const secureSessionCookie = new Bun.Cookie("session", "abc123", {
domain: "example.com",
path: "/admin",
expires: new Date(Date.now() + 86400000), // 1 day
httpOnly: true,
secure: true,
sameSite: "strict",
});
// Parse from a cookie string
const parsedCookie = new Bun.Cookie("name=value; Path=/; HttpOnly");
// Create from an options object
const objCookie = new Bun.Cookie({
name: "theme",
value: "dark",
maxAge: 3600,
secure: true,
});
```
### Constructors
```ts
// Basic constructor with name/value
new Bun.Cookie(name: string, value: string);
// Constructor with name, value, and options
new Bun.Cookie(name: string, value: string, options: CookieInit);
// Constructor from cookie string
new Bun.Cookie(cookieString: string);
// Constructor from cookie object
new Bun.Cookie(options: CookieInit);
```
### Properties
```ts
cookie.name; // string - Cookie name
cookie.value; // string - Cookie value
cookie.domain; // string | null - Domain scope (null if not specified)
cookie.path; // string - URL path scope (defaults to "/")
cookie.expires; // number | undefined - Expiration timestamp (ms since epoch)
cookie.secure; // boolean - Require HTTPS
cookie.sameSite; // "strict" | "lax" | "none" - SameSite setting
cookie.partitioned; // boolean - Whether the cookie is partitioned (CHIPS)
cookie.maxAge; // number | undefined - Max age in seconds
cookie.httpOnly; // boolean - Accessible only via HTTP (not JavaScript)
```
### Methods
#### `isExpired(): boolean`
Checks if the cookie has expired.
```ts
// Expired cookie (Date in the past)
const expiredCookie = new Bun.Cookie("name", "value", {
expires: new Date(Date.now() - 1000),
});
console.log(expiredCookie.isExpired()); // true
// Valid cookie (Using maxAge instead of expires)
const validCookie = new Bun.Cookie("name", "value", {
maxAge: 3600, // 1 hour in seconds
});
console.log(validCookie.isExpired()); // false
// Session cookie (no expiration)
const sessionCookie = new Bun.Cookie("name", "value");
console.log(sessionCookie.isExpired()); // false
```
#### `serialize(): string`
#### `toString(): string`
Returns a string representation of the cookie suitable for a `Set-Cookie` header.
```ts
const cookie = new Bun.Cookie("session", "abc123", {
domain: "example.com",
path: "/admin",
expires: new Date(Date.now() + 86400000),
secure: true,
httpOnly: true,
sameSite: "strict",
});
console.log(cookie.serialize());
// => "session=abc123; Domain=example.com; Path=/admin; Expires=Sun, 19 Mar 2025 15:03:26 GMT; Secure; HttpOnly; SameSite=strict"
console.log(cookie.toString());
// => "session=abc123; Domain=example.com; Path=/admin; Expires=Sun, 19 Mar 2025 15:03:26 GMT; Secure; HttpOnly; SameSite=strict"
```
#### `toJSON(): CookieInit`
Converts the cookie to a plain object suitable for JSON serialization.
```ts
const cookie = new Bun.Cookie("session", "abc123", {
secure: true,
httpOnly: true,
});
const json = cookie.toJSON();
// => {
// name: "session",
// value: "abc123",
// path: "/",
// secure: true,
// httpOnly: true,
// sameSite: "lax",
// partitioned: false
// }
// Works with JSON.stringify
const jsonString = JSON.stringify(cookie);
```
### Static methods
#### `Cookie.parse(cookieString: string): Cookie`
Parses a cookie string into a `Cookie` instance.
```ts
const cookie = Bun.Cookie.parse("name=value; Path=/; Secure; SameSite=Lax");
console.log(cookie.name); // "name"
console.log(cookie.value); // "value"
console.log(cookie.path); // "/"
console.log(cookie.secure); // true
console.log(cookie.sameSite); // "lax"
```
#### `Cookie.from(name: string, value: string, options?: CookieInit): Cookie`
Factory method to create a cookie.
```ts
const cookie = Bun.Cookie.from("session", "abc123", {
httpOnly: true,
secure: true,
maxAge: 3600,
});
```
## Types
```ts
interface CookieInit {
name?: string;
value?: string;
domain?: string;
/** Defaults to '/'. To allow the browser to set the path, use an empty string. */
path?: string;
expires?: number | Date | string;
secure?: boolean;
/** Defaults to `lax`. */
sameSite?: CookieSameSite;
httpOnly?: boolean;
partitioned?: boolean;
maxAge?: number;
}
interface CookieStoreDeleteOptions {
name: string;
domain?: string | null;
path?: string;
}
interface CookieStoreGetOptions {
name?: string;
url?: string;
}
type CookieSameSite = "strict" | "lax" | "none";
class Cookie {
constructor(name: string, value: string, options?: CookieInit);
constructor(cookieString: string);
constructor(cookieObject?: CookieInit);
readonly name: string;
value: string;
domain?: string;
path: string;
expires?: Date;
secure: boolean;
sameSite: CookieSameSite;
partitioned: boolean;
maxAge?: number;
httpOnly: boolean;
isExpired(): boolean;
serialize(): string;
toString(): string;
toJSON(): CookieInit;
static parse(cookieString: string): Cookie;
static from(name: string, value: string, options?: CookieInit): Cookie;
}
class CookieMap implements Iterable<[string, string]> {
constructor(init?: string[][] | Record<string, string> | string);
get(name: string): string | null;
toSetCookieHeaders(): string[];
has(name: string): boolean;
set(name: string, value: string, options?: CookieInit): void;
set(options: CookieInit): void;
delete(name: string): void;
delete(options: CookieStoreDeleteOptions): void;
delete(name: string, options: Omit<CookieStoreDeleteOptions, "name">): void;
toJSON(): Record<string, string>;
readonly size: number;
entries(): IterableIterator<[string, string]>;
keys(): IterableIterator<string>;
values(): IterableIterator<string>;
forEach(callback: (value: string, key: string, map: CookieMap) => void): void;
[Symbol.iterator](): IterableIterator<[string, string]>;
}
```

View File

@@ -61,6 +61,7 @@ Routes in `Bun.serve()` receive a `BunRequest` (which extends [`Request`](https:
// Simplified for brevity
interface BunRequest<T extends string> extends Request {
params: Record<T, string>;
readonly cookies: CookieMap;
}
```
@@ -934,6 +935,83 @@ const server = Bun.serve({
Returns `null` for closed requests or Unix domain sockets.
## Working with Cookies
Bun provides a built-in API for working with cookies in HTTP requests and responses. The `BunRequest` object includes a `cookies` property that provides a `CookieMap` for easily accessing and manipulating cookies. When using `routes`, `Bun.serve()` automatically tracks calls to `request.cookies.set()` and applies the changes to the response.
### Reading cookies
Read cookies from incoming requests using the `cookies` property on the `BunRequest` object:
```ts
Bun.serve({
routes: {
"/profile": req => {
// Access cookies from the request
const userId = req.cookies.get("user_id");
const theme = req.cookies.get("theme") || "light";
return Response.json({
userId,
theme,
message: "Profile page",
});
},
},
});
```
### Setting cookies
To set cookies, use the `set` method on the `CookieMap` from the `BunRequest` object.
```ts
Bun.serve({
routes: {
"/login": req => {
const cookies = req.cookies;
// Set a cookie with various options
cookies.set("user_id", "12345", {
maxAge: 60 * 60 * 24 * 7, // 1 week
httpOnly: true,
secure: true,
path: "/",
});
// Add a theme preference cookie
cookies.set("theme", "dark");
// Modified cookies from the request are automatically applied to the response
return new Response("Login successful");
},
},
});
```
`Bun.serve()` automatically tracks modified cookies from the request and applies them to the response.
### Deleting cookies
To delete a cookie, use the `delete` method on the `request.cookies` (`CookieMap`) object:
```ts
Bun.serve({
routes: {
"/logout": req => {
// Delete the user_id cookie
req.cookies.delete("user_id", {
path: "/",
});
return new Response("Logged out successfully");
},
},
});
```
Deleted cookies become a `Set-Cookie` header on the response with the `maxAge` set to `0` and an empty `value`.
## Server Metrics
### server.pendingRequests and server.pendingWebSockets

View File

@@ -240,7 +240,7 @@ const result = await sql.unsafe(
### Execute and Cancelling Queries
Bun's SQL is lazy that means its will only start executing when awaited or executed with `.execute()`.
Bun's SQL is lazy, which means it will only start executing when awaited or executed with `.execute()`.
You can cancel a query that is currently executing by calling the `cancel()` method on the query object.
```ts

View File

@@ -15,8 +15,8 @@ Below is the full set of recommended `compilerOptions` for a Bun project. With t
```jsonc
{
"compilerOptions": {
// Enable latest features
"lib": ["ESNext", "DOM"],
// Environment setup & latest features
"lib": ["ESNext"],
"target": "ESNext",
"module": "ESNext",
"moduleDetection": "force",
@@ -33,11 +33,12 @@ Below is the full set of recommended `compilerOptions` for a Bun project. With t
"strict": true,
"skipLibCheck": true,
"noFallthroughCasesInSwitch": true,
"noUncheckedIndexedAccess": true,
// Some stricter flags
"noUnusedLocals": true,
"noUnusedParameters": true,
"noPropertyAccessFromIndexSignature": true,
// Some stricter flags (disabled by default)
"noUnusedLocals": false,
"noUnusedParameters": false,
"noPropertyAccessFromIndexSignature": false,
},
}
```

View File

@@ -355,24 +355,24 @@ export default {
page("api/spawn", "Child processes", {
description: `Spawn sync and async child processes with easily configurable input and output streams.`,
}), // "`Bun.spawn`"),
page("api/transpiler", "Transpiler", {
description: `Bun exposes its internal transpiler as a pluggable API.`,
}), // "`Bun.Transpiler`"),
page("api/html-rewriter", "HTMLRewriter", {
description: `Parse and transform HTML with Bun's native HTMLRewriter API, inspired by Cloudflare Workers.`,
}), // "`HTMLRewriter`"),
page("api/hashing", "Hashing", {
description: `Native support for a range of fast hashing algorithms.`,
}), // "`Bun.serve`"),
page("api/console", "Console", {
description: `Bun implements a Node.js-compatible \`console\` object with colorized output and deep pretty-printing.`,
}), // "`Node-API`"),
page("api/cookie", "Cookie", {
description: "Bun's native Cookie API simplifies working with HTTP cookies.",
}), // "`Node-API`"),
page("api/ffi", "FFI", {
description: `Call native code from JavaScript with Bun's foreign function interface (FFI) API.`,
}), // "`bun:ffi`"),
page("api/cc", "C Compiler", {
description: `Build & run native C from JavaScript with Bun's native C compiler API`,
}), // "`bun:ffi`"),
page("api/html-rewriter", "HTMLRewriter", {
description: `Parse and transform HTML with Bun's native HTMLRewriter API, inspired by Cloudflare Workers.`,
}), // "`HTMLRewriter`"),
page("api/test", "Testing", {
description: `Bun's built-in test runner is fast and uses Jest-compatible syntax.`,
}), // "`bun:test`"),
@@ -398,6 +398,9 @@ export default {
page("api/color", "Color", {
description: `Bun's color function leverages Bun's CSS parser for parsing, normalizing, and converting colors from user input to a variety of output formats.`,
}), // "`Color`"),
page("api/transpiler", "Transpiler", {
description: `Bun exposes its internal transpiler as a pluggable API.`,
}), // "`Bun.Transpiler`"),
// divider("Dev Server"),
// page("bun-dev", "Vanilla"),

View File

@@ -104,7 +104,7 @@ This page is updated regularly to reflect compatibility status of the latest ver
### [`node:crypto`](https://nodejs.org/api/crypto.html)
🟡 Missing `ECDH` `checkPrime` `checkPrimeSync` `generatePrime` `generatePrimeSync` `hkdf` `hkdfSync` `secureHeapUsed` `setEngine` `setFips`
🟡 Missing `secureHeapUsed` `setEngine` `setFips`
Some methods are not optimized yet.
@@ -118,7 +118,7 @@ Some methods are not optimized yet.
### [`node:module`](https://nodejs.org/api/module.html)
🟡 Missing `runMain` `syncBuiltinESMExports`, `Module#load()`. Overriding `require.cache` is supported for ESM & CJS modules. `module._extensions`, `module._pathCache`, `module._cache` are no-ops. `module.register` is not implemented and we recommend using a [`Bun.plugin`](https://bun.sh/docs/runtime/plugins) in the meantime.
🟡 Missing `syncBuiltinESMExports`, `Module#load()`. Overriding `require.cache` is supported for ESM & CJS modules. `module._extensions`, `module._pathCache`, `module._cache` are no-ops. `module.register` is not implemented and we recommend using a [`Bun.plugin`](https://bun.sh/docs/runtime/plugins) in the meantime.
### [`node:net`](https://nodejs.org/api/net.html)
@@ -378,8 +378,7 @@ The table below lists all globals implemented by Node.js and Bun's current compa
### [`require()`](https://nodejs.org/api/globals.html#require)
🟢 Fully implemented, including [`require.main`](https://nodejs.org/api/modules.html#requiremain), [`require.cache`](https://nodejs.org/api/modules.html#requirecache), [`require.resolve`](https://nodejs.org/api/modules.html#requireresolverequest-options). `require.extensions` is a stub.
🟢 Fully implemented, including [`require.main`](https://nodejs.org/api/modules.html#requiremain), [`require.cache`](https://nodejs.org/api/modules.html#requirecache), [`require.resolve`](https://nodejs.org/api/modules.html#requireresolverequest-options).
### [`Response`](https://developer.mozilla.org/en-US/docs/Web/API/Response)
🟢 Fully implemented.

View File

@@ -17,7 +17,7 @@ Bun supports things like top-level await, JSX, and extensioned `.ts` imports, wh
```jsonc
{
"compilerOptions": {
// Enable latest features
// Environment setup & latest features
"lib": ["ESNext"],
"target": "ESNext",
"module": "ESNext",
@@ -35,12 +35,13 @@ Bun supports things like top-level await, JSX, and extensioned `.ts` imports, wh
"strict": true,
"skipLibCheck": true,
"noFallthroughCasesInSwitch": true,
"noUncheckedIndexedAccess": true,
// Some stricter flags
"noUnusedLocals": true,
"noUnusedParameters": true,
"noPropertyAccessFromIndexSignature": true
}
// Some stricter flags (disabled by default)
"noUnusedLocals": false,
"noUnusedParameters": false,
"noPropertyAccessFromIndexSignature": false,
},
}
```

View File

@@ -29,6 +29,9 @@
"test/js/**/*bad.js",
"test/bundler/transpiler/decorators.test.ts", // uses `arguments` as decorator
"test/bundler/native-plugin.test.ts", // parser doesn't handle import metadata
"test/bundler/transpiler/with-statement-works.js", // parser doesn't allow `with` statement
"test/js/node/module/extensions-fixture", // these files are not meant to be linted
"test/cli/run/module-type-fixture",
"test/bundler/transpiler/with-statement-works.js" // parser doesn't allow `with` statement
],
"overrides": [

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "bun",
"version": "1.2.6",
"version": "1.2.9",
"workspaces": [
"./packages/bun-types"
],
@@ -31,6 +31,7 @@
},
"scripts": {
"build": "bun run build:debug",
"bd": "(bun run --silent build:debug &> /tmp/bun.debug.build.log || (cat /tmp/bun.debug.build.log && rm -rf /tmp/bun.debug.build.log && exit 1)) && rm -f /tmp/bun.debug.build.log && ./build/debug/bun-debug",
"build:debug": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Debug -B build/debug",
"build:valgrind": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Debug -DENABLE_BASELINE=ON -ENABLE_VALGRIND=ON -B build/debug-valgrind",
"build:release": "bun ./scripts/build.mjs -GNinja -DCMAKE_BUILD_TYPE=Release -B build/release",
@@ -44,6 +45,7 @@
"build:release:with_logs": "cmake . -DCMAKE_BUILD_TYPE=Release -DENABLE_LOGS=true -GNinja -Bbuild-release && ninja -Cbuild-release",
"build:debug-zig-release": "cmake . -DCMAKE_BUILD_TYPE=Release -DZIG_OPTIMIZE=Debug -GNinja -Bbuild-debug-zig-release && ninja -Cbuild-debug-zig-release",
"css-properties": "bun run src/css/properties/generate_properties.ts",
"uv-posix-stubs": "bun run src/bun.js/bindings/libuv/generate_uv_posix_stubs.ts",
"bump": "bun ./scripts/bump.ts",
"typecheck": "tsc --noEmit && cd test && bun run typecheck",
"fmt": "bun run prettier",

View File

@@ -4,12 +4,12 @@
"": {
"name": "bun-plugin-svelte",
"devDependencies": {
"@threlte/core": "8.0.1",
"bun-types": "canary",
"svelte": "^5.20.4",
},
"peerDependencies": {
"svelte": "^5",
"typescript": "^5",
},
},
},
@@ -26,6 +26,8 @@
"@jridgewell/trace-mapping": ["@jridgewell/trace-mapping@0.3.25", "", { "dependencies": { "@jridgewell/resolve-uri": "^3.1.0", "@jridgewell/sourcemap-codec": "^1.4.14" } }, "sha512-vNk6aEwybGtawWmy/PzwnGDOjCkLWSD2wqvjGGAgOAwCGWySYXfYoxt00IJkTF+8Lb57DwOb3Aa0o9CApepiYQ=="],
"@threlte/core": ["@threlte/core@8.0.1", "", { "dependencies": { "mitt": "^3.0.1" }, "peerDependencies": { "svelte": ">=5", "three": ">=0.155" } }, "sha512-vy1xRQppJFNmfPTeiRQue+KmYFsbPgVhwuYXRTvVrwPeD2oYz43gxUeOpe1FACeGKxrxZykeKJF5ebVvl7gBxw=="],
"@types/estree": ["@types/estree@1.0.6", "", {}, "sha512-AYnb1nQyY49te+VRAVgmzfcgjYS91mY5P0TKUDCLEM+gNnA+3T6rWITXRLYCpahpqSQbN5cE+gHpnPyXjHWxcw=="],
"@types/node": ["@types/node@22.13.5", "", { "dependencies": { "undici-types": "~6.20.0" } }, "sha512-+lTU0PxZXn0Dr1NBtC7Y8cR21AJr87dLLU953CWA6pMxxv/UDc7jYAY90upcrie1nRcD6XNG5HOYEDtgW5TxAg=="],
@@ -54,9 +56,11 @@
"magic-string": ["magic-string@0.30.17", "", { "dependencies": { "@jridgewell/sourcemap-codec": "^1.5.0" } }, "sha512-sNPKHvyjVf7gyjwS4xGTaW/mCnF8wnjtifKBEhxfZ7E/S8tQ0rssrwGNn6q8JH/ohItJfSQp9mBtQYuTlH5QnA=="],
"mitt": ["mitt@3.0.1", "", {}, "sha512-vKivATfr97l2/QBCYAkXYDbrIWPM2IIKEl7YPhjCvKlG3kE2gm+uBo6nEXK3M5/Ffh/FLpKExzOQ3JJoJGFKBw=="],
"svelte": ["svelte@5.20.4", "", { "dependencies": { "@ampproject/remapping": "^2.3.0", "@jridgewell/sourcemap-codec": "^1.5.0", "@types/estree": "^1.0.5", "acorn": "^8.12.1", "acorn-typescript": "^1.4.13", "aria-query": "^5.3.1", "axobject-query": "^4.1.0", "clsx": "^2.1.1", "esm-env": "^1.2.1", "esrap": "^1.4.3", "is-reference": "^3.0.3", "locate-character": "^3.0.0", "magic-string": "^0.30.11", "zimmerframe": "^1.1.2" } }, "sha512-2Mo/AfObaw9zuD0u1JJ7sOVzRCGcpETEyDkLbtkcctWpCMCIyT0iz83xD8JT29SR7O4SgswuPRIDYReYF/607A=="],
"typescript": ["typescript@5.7.3", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-84MVSjMEHP+FQRPy3pX9sTVV/INIex71s9TL2Gm5FG/WG1SqXeKyZ0k7/blY/4FdOzI12CBy1vGc4og/eus0fw=="],
"three": ["three@0.174.0", "", {}, "sha512-p+WG3W6Ov74alh3geCMkGK9NWuT62ee21cV3jEnun201zodVF4tCE5aZa2U122/mkLRmhJJUQmLLW1BH00uQJQ=="],
"undici-types": ["undici-types@6.20.0", "", {}, "sha512-Ny6QZ2Nju20vw1SRHe3d9jVu6gJ+4e3+MMpqu7pqE5HT6WsTSlce++GQmK5UXS8mzV8DSYHrQH+Xrf2jVcuKNg=="],

View File

@@ -1,6 +1,6 @@
{
"name": "bun-plugin-svelte",
"version": "0.0.5",
"version": "0.0.6",
"description": "Official Svelte plugin for Bun",
"repository": {
"type": "git",

View File

@@ -11,7 +11,11 @@ describe("SveltePlugin", () => {
expect(() => SveltePlugin(undefined)).not.toThrow();
});
it.each([null, 1, "hi", {}, "Client"])("throws if forceSide is not 'client' or 'server' (%p)", (forceSide: any) => {
it.each([1, "hi", {}, "Client"])("throws if forceSide is not 'client' or 'server' (%p)", (forceSide: any) => {
expect(() => SveltePlugin({ forceSide })).toThrow(TypeError);
});
it.each([null, undefined])("forceSide may be nullish", (forceSide: any) => {
expect(() => SveltePlugin({ forceSide })).not.toThrow();
});
});

View File

@@ -2,7 +2,7 @@ import { describe, beforeAll, it, expect } from "bun:test";
import type { BuildConfig } from "bun";
import type { CompileOptions } from "svelte/compiler";
import { getBaseCompileOptions, type SvelteOptions } from "./options";
import { getBaseCompileOptions, validateOptions, type SvelteOptions } from "./options";
describe("getBaseCompileOptions", () => {
describe("when no options are provided", () => {
@@ -42,4 +42,13 @@ describe("getBaseCompileOptions", () => {
);
},
);
});
}); // getBaseCompileOptions
describe("validateOptions(options)", () => {
it.each(["", 1, null, undefined, true, false, Symbol("hi")])(
"throws if options is not an object (%p)",
(badOptions: any) => {
expect(() => validateOptions(badOptions)).toThrow();
},
);
}); // validateOptions

View File

@@ -2,7 +2,8 @@ import { strict as assert } from "node:assert";
import { type BuildConfig } from "bun";
import type { CompileOptions, ModuleCompileOptions } from "svelte/compiler";
export interface SvelteOptions {
type OverrideCompileOptions = Pick<CompileOptions, "customElement" | "runes" | "modernAst" | "namespace">;
export interface SvelteOptions extends Pick<CompileOptions, "runes"> {
/**
* Force client-side or server-side generation.
*
@@ -20,6 +21,11 @@ export interface SvelteOptions {
* Defaults to `true` when run via Bun's dev server, `false` otherwise.
*/
development?: boolean;
/**
* Options to forward to the Svelte compiler.
*/
compilerOptions?: OverrideCompileOptions;
}
/**
@@ -27,15 +33,24 @@ export interface SvelteOptions {
*/
export function validateOptions(options: unknown): asserts options is SvelteOptions {
assert(options && typeof options === "object", new TypeError("bun-svelte-plugin: options must be an object"));
if ("forceSide" in options) {
switch (options.forceSide) {
const opts = options as Record<keyof SvelteOptions, unknown>;
if (opts.forceSide != null) {
if (typeof opts.forceSide !== "string") {
throw new TypeError("bun-svelte-plugin: forceSide must be a string, got " + typeof opts.forceSide);
}
switch (opts.forceSide) {
case "client":
case "server":
break;
default:
throw new TypeError(
`bun-svelte-plugin: forceSide must be either 'client' or 'server', got ${options.forceSide}`,
);
throw new TypeError(`bun-svelte-plugin: forceSide must be either 'client' or 'server', got ${opts.forceSide}`);
}
}
if (opts.compilerOptions) {
if (typeof opts.compilerOptions !== "object") {
throw new TypeError("bun-svelte-plugin: compilerOptions must be an object");
}
}
}
@@ -44,7 +59,10 @@ export function validateOptions(options: unknown): asserts options is SvelteOpti
* @internal
*/
export function getBaseCompileOptions(pluginOptions: SvelteOptions, config: Partial<BuildConfig>): CompileOptions {
let { development = false } = pluginOptions;
let {
development = false,
compilerOptions: { customElement, runes, modernAst, namespace } = kEmptyObject as OverrideCompileOptions,
} = pluginOptions;
const { minify = false } = config;
const shouldMinify = Boolean(minify);
@@ -68,6 +86,10 @@ export function getBaseCompileOptions(pluginOptions: SvelteOptions, config: Part
preserveWhitespace: !minifyWhitespace,
preserveComments: !shouldMinify,
dev: development,
customElement,
runes,
modernAst,
namespace,
cssHash({ css }) {
// same prime number seed used by svelte/compiler.
// TODO: ensure this provides enough entropy
@@ -109,3 +131,4 @@ function generateSide(pluginOptions: SvelteOptions, config: Partial<BuildConfig>
}
export const hash = (content: string): string => Bun.hash(content, 5381).toString(36);
const kEmptyObject = Object.create(null);

@@ -24,13 +24,32 @@ afterAll(() => {
}
});
it("hello world component", async () => {
const res = await Bun.build({
entrypoints: [fixturePath("foo.svelte")],
outdir,
plugins: [SveltePlugin()],
describe("given a hello world component", () => {
const entrypoints = [fixturePath("foo.svelte")];
it("when no options are provided, builds successfully", async () => {
const res = await Bun.build({
entrypoints,
outdir,
plugins: [SveltePlugin()],
});
expect(res.success).toBeTrue();
});
describe("when a custom element is provided", () => {
let res: BuildOutput;
beforeAll(async () => {
res = await Bun.build({
entrypoints,
outdir,
plugins: [SveltePlugin({ compilerOptions: { customElement: true } })],
});
});
it("builds successfully", () => {
expect(res.success).toBeTrue();
});
});
expect(res.success).toBeTrue();
});
describe("when importing `.svelte.ts` files with ESM", () => {

@@ -20,7 +20,7 @@ That's it! VS Code and TypeScript automatically load `@types/*` packages into yo
# Contributing
The `@types/bun` package is a shim that loads `bun-types`. The `bun-types` package lives in the Bun repo under `packages/bun-types`. It is generated via [./scripts/bundle.ts](./scripts/bundle.ts).
The `@types/bun` package is a shim that loads `bun-types`. The `bun-types` package lives in the Bun repo under `packages/bun-types`.
To add a new file, add it under `packages/bun-types`. Then add a [triple-slash directive](https://www.typescriptlang.org/docs/handbook/triple-slash-directives.html) pointing to it inside [./index.d.ts](./index.d.ts).
@@ -28,8 +28,6 @@ To add a new file, add it under `packages/bun-types`. Then add a [triple-slash d
+ /// <reference path="./newfile.d.ts" />
```
[`./bundle.ts`](./bundle.ts) merges the types in this folder into a single file. To run it:
```bash
bun build
```

@@ -0,0 +1,114 @@
# Authoring @types/bun
These declarations define the `'bun'` module, the `Bun` global variable, and many other global declarations, such as extensions to the `fetch` interface.
## The `'bun'` Module
The `Bun` global variable and the `'bun'` module share a single set of declarations, which can include both types/interfaces and runtime values:
```typescript
declare module "bun" {
// Your types go here
interface MyInterface {
// ...
}
type MyType = string | number;
function myFunction(): void;
}
```
You can write as many `declare module "bun"` declarations as you need. Symbols declared in one file are visible to `declare module "bun"` blocks in other files, and all of the declarations are merged when the types are evaluated.
You can consume these declarations in two ways:
1. Importing them from `'bun'`:
```typescript
import { type MyInterface, type MyType, myFunction } from "bun";
const myInterface: MyInterface = {};
const myType: MyType = "cool";
myFunction();
```
2. Using the global `Bun` object:
```typescript
const myInterface: Bun.MyInterface = {};
const myType: Bun.MyType = "cool";
Bun.myFunction();
```
Consuming them inside the ambient declarations is also easy:
```ts
// These are equivalent
type A = import("bun").MyType;
type B = Bun.MyType;
```
## File Structure
Types are organized across multiple `.d.ts` files in the `packages/bun-types` directory:
- `index.d.ts` - The main entry point that references all other type files
- `bun.d.ts` - Core Bun APIs and types
- `globals.d.ts` - Global type declarations
- `test.d.ts` - Testing-related types
- `sqlite.d.ts` - SQLite-related types
- ...and so on; new files can be added as needed
Note: The order of references in `index.d.ts` is important: `bun.ns.d.ts` must be referenced last to ensure the `Bun` global gets defined properly.
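Concretely, the entry point keeps that file at the end (abridged from the `index.d.ts` in this change):

```ts
/// <reference path="./globals.d.ts" />
/// <reference path="./fetch.d.ts" />
/// <reference path="./bun.d.ts" />
/// <reference path="./test.d.ts" />
// ...remaining reference directives...
// bun.ns.d.ts stays last so the `Bun` global gets defined properly:
/// <reference path="./bun.ns.d.ts" />
```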
### Best Practices
1. **Type Safety**
- Please use strict types instead of `any` where possible
- Leverage TypeScript's type system features (generics, unions, etc.)
- Document complex types with JSDoc comments
2. **Compatibility**
- Use `Bun.__internal.UseLibDomIfAvailable<LibDomName extends string, OurType>` for types that might conflict with lib.dom.d.ts (see [`./fetch.d.ts`](./fetch.d.ts) for a real example)
- `@types/node` often expects variables to always be defined (historically the biggest cause of conflicts), so we use `UseLibDomIfAvailable` to avoid overwriting `lib.dom.d.ts` while still providing Bun's types and, where possible, declaring that the variable exists so Node's types keep working.
3. **Documentation**
- Add JSDoc comments for public APIs
- Include examples in documentation when helpful
- Document default values and important behaviors
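The `UseLibDomIfAvailable` trick from the compatibility item can be sketched standalone. Everything below is illustrative, not the real `Bun.__internal` implementation: the real helper takes a global-variable *name* string and infers whether `lib.dom.d.ts` is loaded, while this sketch takes a fallback type directly and hard-codes "no DOM" so it behaves the same everywhere:

```typescript
// Standalone sketch, not the real Bun.__internal implementation.
// The real types infer whether lib.dom.d.ts is loaded; here we hard-code
// "no DOM" so the example is deterministic.
type LibDomIsLoaded = false;

// Pick the lib.dom type when it is available, otherwise fall back to ours.
type UseLibDomIfAvailable<DomType, OurType> = LibDomIsLoaded extends true ? DomType : OurType;

// With no DOM, the fallback shape wins:
type HeadersLike = UseLibDomIfAvailable<never, { get(name: string): string | null }>;

const headers: HeadersLike = {
  get: name => (name === "content-type" ? "text/plain" : null),
};

console.log(headers.get("content-type")); // "text/plain"
```

The key design point is that each side of the conditional only has to type-check in the environment where it is selected, which is what lets the same declarations coexist with and without `lib.dom.d.ts`.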
### Internal Types
For internal types that shouldn't be exposed to users, use the `__internal` namespace:
```typescript
declare module "bun" {
namespace __internal {
interface MyInternalType {
// ...
}
}
}
```
The internal namespace is mostly used for declaring things that shouldn't be globally accessible on the `bun` namespace but are still used in public APIs. A good example of this is in the [`./fetch.d.ts`](./fetch.d.ts) file.
## Testing Types
We test our type definitions using a special test file at `fixture/index.ts`. This file contains TypeScript code that exercises our type definitions, but is never actually executed - it's only used to verify that the types work correctly.
The test file is type-checked in two different environments:
1. With `lib.dom.d.ts` included - This simulates usage in a browser environment where DOM types are available
2. Without `lib.dom.d.ts` - This simulates usage in a server-like environment without DOM types
Your type definitions must work properly in both environments. This ensures that Bun's types are compatible whether or not DOM types are present.
For example, if you're adding types for a new API, add code to `fixture/index.ts` that uses it. The code doesn't need to work at runtime (you can use fake API keys, for instance); it only verifies that the types are correct.
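For instance, a fixture entry for `Bun.file` (an existing API, used here only to show the shape of a fixture addition) could look like:

```ts
// Type-checked only; never executed, so the path doesn't need to exist.
const file = Bun.file("fixture-input.txt");
const contents: Promise<string> = file.text();
```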
## Questions
Feel free to open an issue or speak to any of the more TypeScript-focused team members if you need help authoring types or fixing type tests.

File diff suppressed because it is too large

packages/bun-types/bun.ns.d.ts (vendored, new file)

@@ -0,0 +1,7 @@
import * as BunModule from "bun";
declare global {
export import Bun = BunModule;
}
export {};

@@ -1,4 +1,19 @@
declare module "bun" {
interface BunMessageEvent<T> {
/**
* @deprecated
*/
initMessageEvent(
type: string,
bubbles?: boolean,
cancelable?: boolean,
data?: any,
origin?: string,
lastEventId?: string,
source?: null,
): void;
}
/**
* @deprecated Renamed to `ErrorLike`
*/
@@ -38,21 +53,6 @@ declare namespace NodeJS {
}
}
declare namespace Bun {
interface MessageEvent {
/** @deprecated */
initMessageEvent(
type: string,
bubbles?: boolean,
cancelable?: boolean,
data?: any,
origin?: string,
lastEventId?: string,
source?: null,
): void;
}
}
interface CustomEvent<T = any> {
/** @deprecated */
initCustomEvent(type: string, bubbles?: boolean, cancelable?: boolean, detail?: T): void;

@@ -1,192 +1,189 @@
export {};
declare module "bun" {
type HMREventNames =
| "beforeUpdate"
| "afterUpdate"
| "beforeFullReload"
| "beforePrune"
| "invalidate"
| "error"
| "ws:disconnect"
| "ws:connect";
declare global {
namespace Bun {
type HMREventNames =
| "bun:ready"
| "bun:beforeUpdate"
| "bun:afterUpdate"
| "bun:beforeFullReload"
| "bun:beforePrune"
| "bun:invalidate"
| "bun:error"
| "bun:ws:disconnect"
| "bun:ws:connect";
/**
* The event names for the dev server
*/
type HMREvent = `bun:${HMREventNames}` | (string & {});
}
interface ImportMeta {
/**
* Hot module replacement APIs. This value is `undefined` in production and
* can be used in an `if` statement to check if HMR APIs are available
*
* ```ts
* if (import.meta.hot) {
* // HMR APIs are available
* }
* ```
*
* However, this check is usually not needed as Bun will dead-code-eliminate
* calls to all of the HMR APIs in production builds.
*
* https://bun.sh/docs/bundler/hmr
*/
hot: {
/**
* The event names for the dev server
*/
type HMREvent = `bun:${HMREventNames}` | (string & {});
}
interface ImportMeta {
/**
* Hot module replacement APIs. This value is `undefined` in production and
* can be used in an `if` statement to check if HMR APIs are available
* `import.meta.hot.data` maintains state between module instances during
* hot replacement, enabling data transfer from previous to new versions.
* When `import.meta.hot.data` is written to, Bun will mark this module as
* capable of self-accepting (equivalent of calling `accept()`).
*
* @example
* ```ts
* if (import.meta.hot) {
* // HMR APIs are available
* }
* const root = import.meta.hot.data.root ??= createRoot(elem);
* root.render(<App />); // re-use an existing root
* ```
*
* However, this check is usually not needed as Bun will dead-code-eliminate
* calls to all of the HMR APIs in production builds.
* In production, `data` is inlined to be `{}`. This is handy because Bun
* knows it can minify `{}.prop ??= value` into `value` in production.
*
*
* https://bun.sh/docs/bundler/hmr
*/
hot: {
/**
* `import.meta.hot.data` maintains state between module instances during
* hot replacement, enabling data transfer from previous to new versions.
* When `import.meta.hot.data` is written to, Bun will mark this module as
* capable of self-accepting (equivalent of calling `accept()`).
*
* @example
* ```ts
* const root = import.meta.hot.data.root ??= createRoot(elem);
* root.render(<App />); // re-use an existing root
* ```
*
* In production, `data` is inlined to be `{}`. This is handy because Bun
* knows it can minify `{}.prop ??= value` into `value` in production.
*/
data: any;
data: any;
/**
* Indicate that this module can be replaced simply by re-evaluating the
* file. After a hot update, importers of this module will be
* automatically patched.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*
* @example
* ```ts
* import { getCount } from "./foo";
*
* console.log("count is ", getCount());
*
* import.meta.hot.accept();
* ```
*/
accept(): void;
/**
* Indicate that this module can be replaced simply by re-evaluating the
* file. After a hot update, importers of this module will be
* automatically patched.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*
* @example
* ```ts
* import { getCount } from "./foo";
*
* console.log("count is ", getCount());
*
* import.meta.hot.accept();
* ```
*/
accept(): void;
/**
* Indicate that this module can be replaced by evaluating the new module,
* and then calling the callback with the new module. In this mode, the
* importers do not get patched. This is to match Vite, which is unable
* to patch their import statements. Prefer using `import.meta.hot.accept()`
* without an argument as it usually makes your code easier to understand.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*
* @example
* ```ts
* export const count = 0;
*
* import.meta.hot.accept((newModule) => {
* if (newModule) {
* // newModule is undefined when SyntaxError happened
* console.log('updated: count is now ', newModule.count)
* }
* });
* ```
*
* In production, calls to this are dead-code-eliminated.
*/
accept(cb: (newModule: any | undefined) => void): void;
/**
* Indicate that this module can be replaced by evaluating the new module,
* and then calling the callback with the new module. In this mode, the
* importers do not get patched. This is to match Vite, which is unable
* to patch their import statements. Prefer using `import.meta.hot.accept()`
* without an argument as it usually makes your code easier to understand.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*
* @example
* ```ts
* export const count = 0;
*
* import.meta.hot.accept((newModule) => {
* if (newModule) {
* // newModule is undefined when SyntaxError happened
* console.log('updated: count is now ', newModule.count)
* }
* });
* ```
*
* In production, calls to this are dead-code-eliminated.
*/
accept(cb: (newModule: any | undefined) => void): void;
/**
* Indicate that a dependency's module can be accepted. When the dependency
* is updated, the callback will be called with the new module.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*
* @example
* ```ts
* import.meta.hot.accept('./foo', (newModule) => {
* if (newModule) {
* // newModule is undefined when SyntaxError happened
* console.log('updated: count is now ', newModule.count)
* }
* });
* ```
*/
accept(specifier: string, callback: (newModule: any) => void): void;
/**
* Indicate that a dependency's module can be accepted. When the dependency
* is updated, the callback will be called with the new module.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*
* @example
* ```ts
* import.meta.hot.accept('./foo', (newModule) => {
* if (newModule) {
* // newModule is undefined when SyntaxError happened
* console.log('updated: count is now ', newModule.count)
* }
* });
* ```
*/
accept(specifier: string, callback: (newModule: any) => void): void;
/**
* Indicate that a dependency's module can be accepted. This variant
* accepts an array of dependencies, where the callback will receive
* the one updated module, and `undefined` for the rest.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*/
accept(specifiers: string[], callback: (newModules: (any | undefined)[]) => void): void;
/**
* Indicate that a dependency's module can be accepted. This variant
* accepts an array of dependencies, where the callback will receive
* the one updated module, and `undefined` for the rest.
*
* When `import.meta.hot.accept` is not used, the page will reload when
* the file updates, and a console message shows which files were checked.
*/
accept(specifiers: string[], callback: (newModules: (any | undefined)[]) => void): void;
/**
* Attach an on-dispose callback. This is called:
* - Just before the module is replaced with another copy (before the next is loaded)
* - After the module is detached (removing all imports to this module)
*
* This callback is not called on route navigation or when the browser tab closes.
*
* Returning a promise will delay module replacement until the module is
* disposed. All dispose callbacks are called in parallel.
*/
dispose(cb: (data: any) => void | Promise<void>): void;
/**
* Attach an on-dispose callback. This is called:
* - Just before the module is replaced with another copy (before the next is loaded)
* - After the module is detached (removing all imports to this module)
*
* This callback is not called on route navigation or when the browser tab closes.
*
* Returning a promise will delay module replacement until the module is
* disposed. All dispose callbacks are called in parallel.
*/
dispose(cb: (data: any) => void | Promise<void>): void;
/**
* No-op
* @deprecated
*/
decline(): void;
/**
* No-op
* @deprecated
*/
decline(): void;
// NOTE TO CONTRIBUTORS ////////////////////////////////////////
// Callback is currently never called for `.prune()` //
// so the types are commented out until we support it. //
////////////////////////////////////////////////////////////////
// /**
// * Attach a callback that is called when the module is removed from the module graph.
// *
// * This can be used to clean up resources that were created when the module was loaded.
// * Unlike `import.meta.hot.dispose()`, this pairs much better with `accept` and `data` to manage stateful resources.
// *
// * @example
// * ```ts
// * export const ws = (import.meta.hot.data.ws ??= new WebSocket(location.origin));
// *
// * import.meta.hot.prune(() => {
// * ws.close();
// * });
// * ```
// */
// prune(callback: () => void): void;
// NOTE TO CONTRIBUTORS ////////////////////////////////////////
// Callback is currently never called for `.prune()` //
// so the types are commented out until we support it. //
////////////////////////////////////////////////////////////////
// /**
// * Attach a callback that is called when the module is removed from the module graph.
// *
// * This can be used to clean up resources that were created when the module was loaded.
// * Unlike `import.meta.hot.dispose()`, this pairs much better with `accept` and `data` to manage stateful resources.
// *
// * @example
// * ```ts
// * export const ws = (import.meta.hot.data.ws ??= new WebSocket(location.origin));
// *
// * import.meta.hot.prune(() => {
// * ws.close();
// * });
// * ```
// */
// prune(callback: () => void): void;
/**
* Listen for an event from the dev server
*
* For compatibility with Vite, event names are also available via vite:* prefix instead of bun:*.
*
* https://bun.sh/docs/bundler/hmr#import-meta-hot-on-and-off
* @param event The event to listen to
* @param callback The callback to call when the event is emitted
*/
on(event: Bun.HMREvent, callback: () => void): void;
/**
* Listen for an event from the dev server
*
* For compatibility with Vite, event names are also available via vite:* prefix instead of bun:*.
*
* https://bun.sh/docs/bundler/hmr#import-meta-hot-on-and-off
* @param event The event to listen to
* @param callback The callback to call when the event is emitted
*/
on(event: Bun.HMREvent, callback: () => void): void;
/**
* Stop listening for an event from the dev server
*
* For compatibility with Vite, event names are also available via vite:* prefix instead of bun:*.
*
* https://bun.sh/docs/bundler/hmr#import-meta-hot-on-and-off
* @param event The event to stop listening to
* @param callback The callback to stop listening to
*/
off(event: Bun.HMREvent, callback: () => void): void;
};
}
/**
* Stop listening for an event from the dev server
*
* For compatibility with Vite, event names are also available via vite:* prefix instead of bun:*.
*
* https://bun.sh/docs/bundler/hmr#import-meta-hot-on-and-off
* @param event The event to stop listening to
* @param callback The callback to stop listening to
*/
off(event: Bun.HMREvent, callback: () => void): void;
};
}

@@ -19,13 +19,13 @@ declare module "*/bun.lock" {
}
declare module "*.html" {
// In Bun v1.2, we might change this to Bun.HTMLBundle
// In Bun v1.2, this might change to Bun.HTMLBundle
var contents: any;
export = contents;
}
declare module "*.svg" {
// Bun 1.2.3 added support for frontend dev server
var contents: `${string}.svg`;
export = contents;
var content: `${string}.svg`;
export = content;
}

@@ -1,161 +1,72 @@
interface Headers {
/**
* Convert {@link Headers} to a plain JavaScript object.
*
* About 10x faster than `Object.fromEntries(headers.entries())`
*
* Called when you run `JSON.stringify(headers)`
*
* Does not preserve insertion order. Well-known header names are lowercased. Other header names are left as-is.
*/
toJSON(): Record<string, string>;
/**
* Get the total number of headers
*/
readonly count: number;
/**
* Get all headers matching the name
*
* Only supports `"Set-Cookie"`. All other headers are empty arrays.
*
* @param name - The header name to get
*
* @returns An array of header values
*
* @example
* ```ts
* const headers = new Headers();
* headers.append("Set-Cookie", "foo=bar");
* headers.append("Set-Cookie", "baz=qux");
* headers.getAll("Set-Cookie"); // ["foo=bar", "baz=qux"]
* ```
*/
getAll(name: "set-cookie" | "Set-Cookie"): string[];
}
/*
var Headers: {
prototype: Headers;
new (init?: Bun.HeadersInit): Headers;
};
This file does not declare any global types.
interface Request {
headers: Headers;
}
That should only happen in [./globals.d.ts](./globals.d.ts)
so that our documentation generator can pick it up, as it
expects all globals to be declared in one file.
var Request: {
prototype: Request;
new (requestInfo: string, requestInit?: RequestInit): Request;
new (requestInfo: RequestInit & { url: string }): Request;
new (requestInfo: Request, requestInit?: RequestInit): Request;
};
var Response: {
new (body?: Bun.BodyInit | null | undefined, init?: Bun.ResponseInit | undefined): Response;
/**
* Create a new {@link Response} with a JSON body
*
* @param body - The body of the response
* @param options - options to pass to the response
*
* @example
*
* ```ts
* const response = Response.json({hi: "there"});
* console.assert(
* await response.text(),
* `{"hi":"there"}`
* );
* ```
* -------
*
* This is syntactic sugar for:
* ```js
* new Response(JSON.stringify(body), {headers: { "Content-Type": "application/json" }})
* ```
* @link https://github.com/whatwg/fetch/issues/1389
*/
json(body?: any, options?: Bun.ResponseInit | number): Response;
/**
* Create a new {@link Response} that redirects to url
*
* @param url - the URL to redirect to
* @param status - the HTTP status code to use for the redirect
*/
// tslint:disable-next-line:unified-signatures
redirect(url: string, status?: number): Response;
/**
* Create a new {@link Response} that redirects to url
*
* @param url - the URL to redirect to
* @param options - options to pass to the response
*/
// tslint:disable-next-line:unified-signatures
redirect(url: string, options?: Bun.ResponseInit): Response;
/**
* Create a new {@link Response} that has a network error
*/
error(): Response;
};
type _BunTLSOptions = import("bun").TLSOptions;
interface BunFetchRequestInitTLS extends _BunTLSOptions {
/**
* Custom function to check the server identity
* @param hostname - The hostname of the server
* @param cert - The certificate of the server
* @returns An error if the server is unauthorized, otherwise undefined
*/
checkServerIdentity?: NonNullable<import("node:tls").ConnectionOptions["checkServerIdentity"]>;
}
/**
* BunFetchRequestInit represents additional options that Bun supports in `fetch()` only.
*
* Bun extends the `fetch` API with some additional options, except
* this interface is not quite a `RequestInit`, because they won't work
* if passed to `new Request()`. This is why it's a separate type.
*/
interface BunFetchRequestInit extends RequestInit {
/**
* Override the default TLS options
*/
tls?: BunFetchRequestInitTLS;
declare module "bun" {
type HeadersInit = string[][] | Record<string, string | ReadonlyArray<string>> | Headers;
type BodyInit = ReadableStream | Bun.XMLHttpRequestBodyInit | URLSearchParams | AsyncGenerator<Uint8Array>;
namespace __internal {
type LibOrFallbackHeaders = LibDomIsLoaded extends true ? {} : import("undici-types").Headers;
type LibOrFallbackRequest = LibDomIsLoaded extends true ? {} : import("undici-types").Request;
type LibOrFallbackResponse = LibDomIsLoaded extends true ? {} : import("undici-types").Response;
type LibOrFallbackResponseInit = LibDomIsLoaded extends true ? {} : import("undici-types").ResponseInit;
type LibOrFallbackRequestInit = LibDomIsLoaded extends true
? {}
: Omit<import("undici-types").RequestInit, "body" | "headers"> & {
body?: Bun.BodyInit | null | undefined;
headers?: Bun.HeadersInit;
};
interface BunHeadersOverride extends LibOrFallbackHeaders {
/**
* Convert {@link Headers} to a plain JavaScript object.
*
* About 10x faster than `Object.fromEntries(headers.entries())`
*
* Called when you run `JSON.stringify(headers)`
*
* Does not preserve insertion order. Well-known header names are lowercased. Other header names are left as-is.
*/
toJSON(): Record<string, string>;
/**
* Get the total number of headers
*/
readonly count: number;
/**
* Get all headers matching the name
*
* Only supports `"Set-Cookie"`. All other headers are empty arrays.
*
* @param name - The header name to get
*
* @returns An array of header values
*
* @example
* ```ts
* const headers = new Headers();
* headers.append("Set-Cookie", "foo=bar");
* headers.append("Set-Cookie", "baz=qux");
* headers.getAll("Set-Cookie"); // ["foo=bar", "baz=qux"]
* ```
*/
getAll(name: "set-cookie" | "Set-Cookie"): string[];
}
interface BunRequestOverride extends LibOrFallbackRequest {
headers: BunHeadersOverride;
}
interface BunResponseOverride extends LibOrFallbackResponse {
headers: BunHeadersOverride;
}
}
}
var fetch: {
/**
* Send a HTTP(s) request
*
* @param request Request object
* @param init A structured value that contains settings for the fetch() request.
*
* @returns A promise that resolves to {@link Response} object.
*/
(request: Request, init?: BunFetchRequestInit): Promise<Response>;
/**
* Send a HTTP(s) request
*
* @param url URL string
* @param init A structured value that contains settings for the fetch() request.
*
* @returns A promise that resolves to {@link Response} object.
*/
(url: string | URL | Request, init?: BunFetchRequestInit): Promise<Response>;
(input: string | URL | globalThis.Request, init?: BunFetchRequestInit): Promise<Response>;
/**
* Start the DNS resolution, TCP connection, and TLS handshake for a request
* before the request is actually sent.
*
* This can reduce the latency of a request when you know there's some
* long-running task that will delay the request starting.
*
* This is a bun-specific API and is not part of the Fetch API specification.
*/
preconnect(url: string | URL): void;
};

@@ -13,6 +13,8 @@
* that convert JavaScript types to C types and back. Internally,
* bun uses [tinycc](https://github.com/TinyCC/tinycc), so a big thanks
* goes to Fabrice Bellard and TinyCC maintainers for making this possible.
*
* @category FFI
*/
declare module "bun:ffi" {
enum FFIType {
@@ -543,14 +545,6 @@ declare module "bun:ffi" {
type Symbols = Readonly<Record<string, FFIFunction>>;
// /**
// * Compile a callback function
// *
// * Returns a function pointer
// *
// */
// export function callback(ffi: FFIFunction, cb: Function): number;
interface Library<Fns extends Symbols> {
symbols: ConvertFns<Fns>;
@@ -608,6 +602,8 @@ declare module "bun:ffi" {
* that convert JavaScript types to C types and back. Internally,
* bun uses [tinycc](https://github.com/TinyCC/tinycc), so a big thanks
* goes to Fabrice Bellard and TinyCC maintainers for making this possible.
*
* @category FFI
*/
function dlopen<Fns extends Record<string, FFIFunction>>(
name: string | import("bun").BunFile | URL,
@@ -626,9 +622,9 @@ declare module "bun:ffi" {
* JavaScript:
* ```js
* import { cc } from "bun:ffi";
* import hello from "./hello.c" with {type: "file"};
* import source from "./hello.c" with {type: "file"};
* const {symbols: {hello}} = cc({
* source: hello,
* source,
* symbols: {
* hello: {
* returns: "cstring",
@@ -681,8 +677,9 @@ declare module "bun:ffi" {
* @example
* ```js
* import { cc } from "bun:ffi";
* import source from "./hello.c" with {type: "file"};
* const {symbols: {hello}} = cc({
* source: hello,
* source,
* define: {
* "NDEBUG": "1",
* },
@@ -707,8 +704,9 @@ declare module "bun:ffi" {
* @example
* ```js
* import { cc } from "bun:ffi";
* import source from "./hello.c" with {type: "file"};
* const {symbols: {hello}} = cc({
* source: hello,
* source,
* flags: ["-framework CoreFoundation", "-framework Security"],
* symbols: {
* hello: {
@@ -1024,6 +1022,8 @@ declare module "bun:ffi" {
* // Do something with rawPtr
* }
* ```
*
* @category FFI
*/
function ptr(view: NodeJS.TypedArray | ArrayBufferLike | DataView, byteOffset?: number): Pointer;
@@ -1048,8 +1048,9 @@ declare module "bun:ffi" {
* thing to do safely. Passing an invalid pointer can crash the program and
* reading beyond the bounds of the pointer will crash the program or cause
* undefined behavior. Use with care!
*
* @category FFI
*/
class CString extends String {
/**
* Get a string from a UTF-8 encoded C string

File diff suppressed because it is too large

@@ -147,6 +147,8 @@ declare namespace HTMLRewriterTypes {
* });
* rewriter.transform(await fetch("https://remix.run"));
* ```
*
* @category HTML Manipulation
*/
declare class HTMLRewriter {
constructor();

@@ -1,24 +1,26 @@
// Project: https://github.com/oven-sh/bun
// Definitions by: Jarred Sumner <https://github.com/Jarred-Sumner>
// Definitions by: Bun Contributors <https://github.com/oven-sh/bun/graphs/contributors>
// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped
/// <reference lib="esnext" />
/// <reference types="ws" />
/// <reference types="node" />
// contributors: uncomment this to detect conflicts with lib.dom.d.ts
// /// <reference lib="dom" />
/// <reference path="./bun.d.ts" />
/// <reference path="./globals.d.ts" />
/// <reference path="./s3.d.ts" />
/// <reference path="./fetch.d.ts" />
/// <reference path="./overrides.d.ts" />
/// <reference path="./bun.d.ts" />
/// <reference path="./extensions.d.ts" />
/// <reference path="./devserver.d.ts" />
/// <reference path="./ffi.d.ts" />
/// <reference path="./test.d.ts" />
/// <reference path="./html-rewriter.d.ts" />
/// <reference path="./jsc.d.ts" />
/// <reference path="./sqlite.d.ts" />
/// <reference path="./test.d.ts" />
/// <reference path="./wasm.d.ts" />
/// <reference path="./overrides.d.ts" />
/// <reference path="./deprecated.d.ts" />
/// <reference path="./ambient.d.ts" />
/// <reference path="./devserver.d.ts" />
/// <reference path="./bun.ns.d.ts" />
// @ts-ignore Must disable this so it doesn't conflict with the DOM onmessage type, but still
// allows us to declare our own globals that Node's types can "see" and not conflict with
declare var onmessage: never;

@@ -1,18 +1,73 @@
export {};
import type { BunFile, Env, PathLike } from "bun";
declare global {
namespace NodeJS {
interface ProcessEnv extends Bun.Env {}
interface Process {
readonly version: string;
browser: boolean;
/**
* Whether you are using Bun
*/
isBun: true;
/**
* The current git sha of Bun
*/
revision: string;
reallyExit(code?: number): never;
dlopen(module: { exports: any }, filename: string, flags?: number): void;
_exiting: boolean;
noDeprecation: boolean;
binding(m: "constants"): {
os: typeof import("node:os").constants;
fs: typeof import("node:fs").constants;
crypto: typeof import("node:crypto").constants;
zlib: typeof import("node:zlib").constants;
trace: {
TRACE_EVENT_PHASE_BEGIN: number;
TRACE_EVENT_PHASE_END: number;
TRACE_EVENT_PHASE_COMPLETE: number;
TRACE_EVENT_PHASE_INSTANT: number;
TRACE_EVENT_PHASE_ASYNC_BEGIN: number;
TRACE_EVENT_PHASE_ASYNC_STEP_INTO: number;
TRACE_EVENT_PHASE_ASYNC_STEP_PAST: number;
TRACE_EVENT_PHASE_ASYNC_END: number;
TRACE_EVENT_PHASE_NESTABLE_ASYNC_BEGIN: number;
TRACE_EVENT_PHASE_NESTABLE_ASYNC_END: number;
TRACE_EVENT_PHASE_NESTABLE_ASYNC_INSTANT: number;
TRACE_EVENT_PHASE_FLOW_BEGIN: number;
TRACE_EVENT_PHASE_FLOW_STEP: number;
TRACE_EVENT_PHASE_FLOW_END: number;
TRACE_EVENT_PHASE_METADATA: number;
TRACE_EVENT_PHASE_COUNTER: number;
TRACE_EVENT_PHASE_SAMPLE: number;
TRACE_EVENT_PHASE_CREATE_OBJECT: number;
TRACE_EVENT_PHASE_SNAPSHOT_OBJECT: number;
TRACE_EVENT_PHASE_DELETE_OBJECT: number;
TRACE_EVENT_PHASE_MEMORY_DUMP: number;
TRACE_EVENT_PHASE_MARK: number;
TRACE_EVENT_PHASE_CLOCK_SYNC: number;
TRACE_EVENT_PHASE_ENTER_CONTEXT: number;
TRACE_EVENT_PHASE_LEAVE_CONTEXT: number;
TRACE_EVENT_PHASE_LINK_IDS: number;
};
};
binding(m: string): object;
}
interface ProcessVersions extends Dict<string> {
bun: string;
}
interface ProcessEnv extends Env {}
}
}
declare module "fs/promises" {
function exists(path: PathLike): Promise<boolean>;
function exists(path: Bun.PathLike): Promise<boolean>;
}
declare module "tls" {
@@ -22,7 +77,7 @@ declare module "tls" {
* the well-known CAs curated by Mozilla. Mozilla's CAs are completely
* replaced when CAs are explicitly specified using this option.
*/
ca?: string | Buffer | NodeJS.TypedArray | BunFile | Array<string | Buffer | BunFile> | undefined;
ca?: string | Buffer | NodeJS.TypedArray | Bun.BunFile | Array<string | Buffer | Bun.BunFile> | undefined;
/**
* Cert chains in PEM format. One cert chain should be provided per
* private key. Each cert chain should consist of the PEM formatted
@@ -38,8 +93,8 @@ declare module "tls" {
| string
| Buffer
| NodeJS.TypedArray
| BunFile
| Array<string | Buffer | NodeJS.TypedArray | BunFile>
| Bun.BunFile
| Array<string | Buffer | NodeJS.TypedArray | Bun.BunFile>
| undefined;
/**
* Private keys in PEM format. PEM allows the option of private keys
@@ -54,9 +109,9 @@ declare module "tls" {
key?:
| string
| Buffer
| BunFile
| Bun.BunFile
| NodeJS.TypedArray
| Array<string | Buffer | BunFile | NodeJS.TypedArray | KeyObject>
| Array<string | Buffer | Bun.BunFile | NodeJS.TypedArray | KeyObject>
| undefined;
}

@@ -9,14 +9,14 @@
"directory": "packages/bun-types"
},
"files": [
"*.d.ts",
"./*.d.ts",
"docs/**/*.md",
"docs/*.md"
],
"homepage": "https://bun.sh",
"dependencies": {
"@types/node": "*",
"@types/ws": "~8.5.10"
"@types/ws": "*"
},
"devDependencies": {
"@biomejs/biome": "^1.5.3",
@@ -27,7 +27,7 @@
"scripts": {
"prebuild": "echo $(pwd)",
"copy-docs": "rm -rf docs && cp -rL ../../docs/ ./docs && find ./docs -type f -name '*.md' -exec sed -i 's/\\$BUN_LATEST_VERSION/'\"${BUN_VERSION#bun-v}\"'/g' {} +",
"build": "bun run copy-docs && bun scripts/build.ts && bun run fmt",
"build": "bun run copy-docs && bun scripts/build.ts",
"test": "tsc",
"fmt": "echo $(which biome) && biome format --write ."
},

packages/bun-types/s3.d.ts
@@ -0,0 +1,831 @@
declare module "bun" {
/**
* Fast incremental writer for files and pipes.
*
* This uses the same interface as {@link ArrayBufferSink}, but writes to a file or pipe.
*/
interface FileSink {
/**
* Write a chunk of data to the file.
*
* If the file descriptor is not writable yet, the data is buffered.
*/
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
/**
* Flush the internal buffer, committing the data to disk or the pipe.
*/
flush(): number | Promise<number>;
/**
* Close the file descriptor. This also flushes the internal buffer.
*/
end(error?: Error): number | Promise<number>;
start(options?: {
/**
* Preallocate an internal buffer of this size.
* This can significantly improve performance when the chunk size is small.
*/
highWaterMark?: number;
}): void;
/**
* For FIFOs & pipes, this lets you decide whether Bun's process should
* remain alive until the pipe is closed.
*
* By default, it is automatically managed. While the stream is open, the
* process remains alive and once the other end hangs up or the stream
* closes, the process exits.
*
* If you previously called {@link unref}, you can call this again to re-enable automatic management.
*
* Internally, it will reference count the number of times this is called. By default, that number is 1.
*
* If the file is not a FIFO or pipe, {@link ref} and {@link unref} do
* nothing. If the pipe is already closed, this does nothing.
*/
ref(): void;
/**
* For FIFOs & pipes, this lets you decide whether Bun's process should
* remain alive until the pipe is closed.
*
* If you want to allow Bun's process to terminate while the stream is open,
* call this.
*
* If the file is not a FIFO or pipe, {@link ref} and {@link unref} do
* nothing. If the pipe is already closed, this does nothing.
*/
unref(): void;
}
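The buffering contract described above (write buffers, flush commits, end flushes and closes) can be illustrated with a minimal in-memory sink. The `MemorySink` class below is illustrative only, not Bun's implementation; real sinks come from `Bun.file(path).writer()`:

```typescript
// Illustrative in-memory sink following the FileSink write/flush/end shape.
// It only models the buffering semantics documented above.
class MemorySink {
  private chunks: string[] = [];
  private flushed = "";

  // Buffers the chunk and reports how much was accepted.
  write(chunk: string): number {
    this.chunks.push(chunk);
    return chunk.length;
  }

  // Commits buffered chunks, returning how much was flushed this call.
  flush(): number {
    const pending = this.chunks.join("");
    this.flushed += pending;
    this.chunks = [];
    return pending.length;
  }

  // Flushes any remaining data and returns the total committed size.
  end(): number {
    this.flush();
    return this.flushed.length;
  }

  get contents(): string {
    return this.flushed;
  }
}

const sink = new MemorySink();
sink.write("hello ");
sink.write("world");
sink.end();
```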
interface NetworkSink extends FileSink {
/**
* Write a chunk of data to the network.
*
* If the network is not writable yet, the data is buffered.
*/
write(chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer): number;
/**
* Flush the internal buffer, committing the data to the network.
*/
flush(): number | Promise<number>;
/**
* Finish the upload. This also flushes the internal buffer.
*/
end(error?: Error): number | Promise<number>;
/**
* Get the stat of the file.
*/
stat(): Promise<import("node:fs").Stats>;
}
/**
* Configuration options for S3 operations
*/
interface S3Options extends BlobPropertyBag {
/**
* The Access Control List (ACL) policy for the file.
* Controls who can access the file and what permissions they have.
*
* @example
* // Setting public read access
* const file = s3.file("public-file.txt", {
* acl: "public-read",
* bucket: "my-bucket"
* });
*
* @example
* // Using with presigned URLs
* const url = file.presign({
* acl: "public-read",
* expiresIn: 3600
* });
*/
acl?:
| "private"
| "public-read"
| "public-read-write"
| "aws-exec-read"
| "authenticated-read"
| "bucket-owner-read"
| "bucket-owner-full-control"
| "log-delivery-write";
/**
* The S3 bucket name. Defaults to `S3_BUCKET` or `AWS_BUCKET` environment variables.
*
* @example
* // Using explicit bucket
* const file = s3.file("my-file.txt", { bucket: "my-bucket" });
*
* @example
* // Using environment variables
* // With S3_BUCKET=my-bucket in .env
* const file = s3.file("my-file.txt");
*/
bucket?: string;
/**
* The AWS region. Defaults to `S3_REGION` or `AWS_REGION` environment variables.
*
* @example
* const file = s3.file("my-file.txt", {
* bucket: "my-bucket",
* region: "us-west-2"
* });
*/
region?: string;
/**
* The access key ID for authentication.
* Defaults to `S3_ACCESS_KEY_ID` or `AWS_ACCESS_KEY_ID` environment variables.
*/
accessKeyId?: string;
/**
* The secret access key for authentication.
* Defaults to `S3_SECRET_ACCESS_KEY` or `AWS_SECRET_ACCESS_KEY` environment variables.
*/
secretAccessKey?: string;
/**
* Optional session token for temporary credentials.
* Defaults to `S3_SESSION_TOKEN` or `AWS_SESSION_TOKEN` environment variables.
*
* @example
* // Using temporary credentials
* const file = s3.file("my-file.txt", {
* accessKeyId: tempAccessKey,
* secretAccessKey: tempSecretKey,
* sessionToken: tempSessionToken
* });
*/
sessionToken?: string;
/**
* The S3-compatible service endpoint URL.
* Defaults to `S3_ENDPOINT` or `AWS_ENDPOINT` environment variables.
*
* @example
* // AWS S3
* const file = s3.file("my-file.txt", {
* endpoint: "https://s3.us-east-1.amazonaws.com"
* });
*
* @example
* // Cloudflare R2
* const file = s3.file("my-file.txt", {
* endpoint: "https://<account-id>.r2.cloudflarestorage.com"
* });
*
* @example
* // DigitalOcean Spaces
* const file = s3.file("my-file.txt", {
* endpoint: "https://<region>.digitaloceanspaces.com"
* });
*
* @example
* // MinIO (local development)
* const file = s3.file("my-file.txt", {
* endpoint: "http://localhost:9000"
* });
*/
endpoint?: string;
/**
* Use the virtual hosted-style endpoint. Defaults to `false`. When `true` and an `endpoint` is provided, the `bucket` option is ignored.
*
* @example
* // Using virtual hosted style
* const file = s3.file("my-file.txt", {
* virtualHostedStyle: true,
* endpoint: "https://my-bucket.s3.us-east-1.amazonaws.com"
* });
*/
virtualHostedStyle?: boolean;
/**
* The size of each part in multipart uploads (in bytes).
* - Minimum: 5 MiB
* - Maximum: 5120 MiB
* - Default: 5 MiB
*
* @example
* // Configuring multipart uploads
* const file = s3.file("large-file.dat", {
* partSize: 10 * 1024 * 1024, // 10 MiB parts
* queueSize: 4 // Upload 4 parts in parallel
* });
*
* const writer = file.writer();
* // ... write large file in chunks
*/
partSize?: number;
/**
* Number of parts to upload in parallel for multipart uploads.
* - Default: 5
* - Maximum: 255
*
* Increasing this value can improve upload speeds for large files
* but will use more memory.
*/
queueSize?: number;
/**
* Number of retry attempts for failed uploads.
* - Default: 3
* - Maximum: 255
*
* @example
* // Setting retry attempts
* const file = s3.file("my-file.txt", {
* retry: 5 // Retry failed uploads up to 5 times
* });
*/
retry?: number;
/**
* The Content-Type of the file.
* Automatically set based on file extension when possible.
*
* @example
* // Setting explicit content type
* const file = s3.file("data.bin", {
* type: "application/octet-stream"
* });
*/
type?: string;
/**
* By default, Amazon S3 uses the STANDARD Storage Class to store newly created objects.
*
* @example
* // Setting explicit Storage class
* const file = s3.file("my-file.json", {
* storageClass: "STANDARD_IA"
* });
*/
storageClass?:
| "STANDARD"
| "DEEP_ARCHIVE"
| "EXPRESS_ONEZONE"
| "GLACIER"
| "GLACIER_IR"
| "INTELLIGENT_TIERING"
| "ONEZONE_IA"
| "OUTPOSTS"
| "REDUCED_REDUNDANCY"
| "SNOW"
| "STANDARD_IA";
/**
* @deprecated The size of the internal buffer in bytes. Defaults to 5 MiB. Use `partSize` and `queueSize` instead.
*/
highWaterMark?: number;
}
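The multipart limits documented above (parts between 5 MiB and 5120 MiB, uploaded `queueSize` at a time) make the part math easy to check. A small sketch, with illustrative helper names:

```typescript
const MIN_PART = 5 * 1024 * 1024; // 5 MiB minimum part size
const MAX_PART = 5120 * 1024 * 1024; // 5120 MiB maximum part size

// Clamp a requested partSize into the documented valid range.
function clampPartSize(partSize: number): number {
  return Math.min(MAX_PART, Math.max(MIN_PART, partSize));
}

// How many parts a multipart upload of `fileSize` bytes needs.
function partCount(fileSize: number, partSize: number): number {
  return Math.max(1, Math.ceil(fileSize / clampPartSize(partSize)));
}

// How many parallel "waves" of part uploads occur with a given queueSize.
function uploadWaves(fileSize: number, partSize: number, queueSize: number): number {
  return Math.ceil(partCount(fileSize, partSize) / queueSize);
}
```

For example, a 100 MiB upload with 10 MiB parts needs 10 parts; with `queueSize: 4`, those upload in 3 waves.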
/**
* Options for generating presigned URLs
*/
interface S3FilePresignOptions extends S3Options {
/**
* Number of seconds until the presigned URL expires.
* - Default: 86400 (1 day)
*
* @example
* // Short-lived URL
* const url = file.presign({
* expiresIn: 3600 // 1 hour
* });
*
* @example
* // Long-lived public URL
* const url = file.presign({
* expiresIn: 7 * 24 * 60 * 60, // 7 days
* acl: "public-read"
* });
*/
expiresIn?: number;
/**
* The HTTP method allowed for the presigned URL.
*
* @example
* // GET URL for downloads
* const downloadUrl = file.presign({
* method: "GET",
* expiresIn: 3600
* });
*
* @example
* // PUT URL for uploads
* const uploadUrl = file.presign({
* method: "PUT",
* expiresIn: 3600,
* type: "application/json"
* });
*/
method?: "GET" | "POST" | "PUT" | "DELETE" | "HEAD";
}
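Since `expiresIn` is expressed in seconds with a default of 86400 (1 day), the absolute expiry of a presigned URL is just an offset from the signing time. A trivial sketch (the helper name is illustrative):

```typescript
// A presigned URL minted at `now` stops working at now + expiresIn seconds.
// Default matches the documented 86400 seconds (1 day).
function expiryDate(now: Date, expiresIn = 86400): Date {
  return new Date(now.getTime() + expiresIn * 1000);
}
```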
interface S3Stats {
size: number;
lastModified: Date;
etag: string;
type: string;
}
/**
* Represents a file in an S3-compatible storage service.
* Extends the Blob interface for compatibility with web APIs.
*
* @category Cloud Storage
*/
interface S3File extends Blob {
/**
* The size of the file in bytes.
* This is a Promise because it requires a network request to determine the size.
*
* @example
* // Getting file size
* const size = await file.size;
* console.log(`File size: ${size} bytes`);
*
* @example
* // Check if file is larger than 1MB
* if (await file.size > 1024 * 1024) {
* console.log("Large file detected");
* }
*/
/**
* TODO: figure out how to get the typescript types to not error for this property.
*/
// size: Promise<number>;
/**
* Creates a new S3File representing a slice of the original file.
* Uses HTTP Range headers for efficient partial downloads.
*
* @param begin - Starting byte offset
* @param end - Ending byte offset (exclusive)
* @param contentType - Optional MIME type for the slice
* @returns A new S3File representing the specified range
*
* @example
* // Reading file header
* const header = file.slice(0, 1024);
* const headerText = await header.text();
*
* @example
* // Reading with content type
* const jsonSlice = file.slice(1024, 2048, "application/json");
* const data = await jsonSlice.json();
*
* @example
* // Reading from offset to end
* const remainder = file.slice(1024);
* const content = await remainder.text();
*/
slice(begin?: number, end?: number, contentType?: string): S3File;
slice(begin?: number, contentType?: string): S3File;
slice(contentType?: string): S3File;
/**
* Creates a writable stream for uploading data.
* Suitable for large files as it uses multipart upload.
*
* @param options - Configuration for the upload
* @returns A NetworkSink for writing data
*
* @example
* // Basic streaming write
* const writer = file.writer({
* type: "application/json"
* });
* writer.write('{"hello": ');
* writer.write('"world"}');
* await writer.end();
*
* @example
* // Optimized large file upload
* const writer = file.writer({
* partSize: 10 * 1024 * 1024, // 10MB parts
* queueSize: 4, // Upload 4 parts in parallel
* retry: 3 // Retry failed parts
* });
*
* // Write large chunks of data efficiently
* for (const chunk of largeDataChunks) {
* writer.write(chunk);
* }
* await writer.end();
*
* @example
* // Error handling
* const writer = file.writer();
* try {
* writer.write(data);
* await writer.end();
* } catch (err) {
* console.error('Upload failed:', err);
* // Writer will automatically abort multipart upload on error
* }
*/
writer(options?: S3Options): NetworkSink;
/**
* Gets a readable stream of the file's content.
* Useful for processing large files without loading them entirely into memory.
*
* @returns A ReadableStream for the file content
*
* @example
* // Basic streaming read
* const stream = file.stream();
* for await (const chunk of stream) {
* console.log('Received chunk:', chunk);
* }
*
* @example
* // Piping to response
* const stream = file.stream();
* return new Response(stream, {
* headers: { 'Content-Type': file.type }
* });
*
* @example
* // Processing large files
* const stream = file.stream();
* const textDecoder = new TextDecoder();
* for await (const chunk of stream) {
* const text = textDecoder.decode(chunk);
* // Process text chunk by chunk
* }
*/
readonly readable: ReadableStream;
stream(): ReadableStream;
/**
* The name or path of the file in the bucket.
*
* @example
* const file = s3.file("folder/image.jpg");
* console.log(file.name); // "folder/image.jpg"
*/
readonly name?: string;
/**
* The bucket name containing the file.
*
* @example
* const file = s3.file("s3://my-bucket/file.txt");
* console.log(file.bucket); // "my-bucket"
*/
readonly bucket?: string;
/**
* Checks if the file exists in S3.
* Uses HTTP HEAD request to efficiently check existence without downloading.
*
* @returns Promise resolving to true if file exists, false otherwise
*
* @example
* // Basic existence check
* if (await file.exists()) {
* console.log("File exists in S3");
* }
*
* @example
* // With error handling
* try {
* const exists = await file.exists();
* if (!exists) {
* console.log("File not found");
* }
* } catch (err) {
* console.error("Error checking file:", err);
* }
*/
exists(): Promise<boolean>;
/**
* Uploads data to S3.
* Supports various input types and automatically handles large files.
*
* @param data - The data to upload
* @param options - Upload configuration options
* @returns Promise resolving to number of bytes written
*
* @example
* // Writing string data
* await file.write("Hello World", {
* type: "text/plain"
* });
*
* @example
* // Writing JSON
* const data = { hello: "world" };
* await file.write(JSON.stringify(data), {
* type: "application/json"
* });
*
* @example
* // Writing from Response
* const response = await fetch("https://example.com/data");
* await file.write(response);
*
* @example
* // Writing with ACL
* await file.write(data, {
* acl: "public-read",
* type: "application/octet-stream"
* });
*/
write(
data: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer | Request | Response | BunFile | S3File | Blob,
options?: S3Options,
): Promise<number>;
/**
* Generates a presigned URL for the file.
* Allows temporary access to the file without exposing credentials.
*
* @param options - Configuration for the presigned URL
* @returns Presigned URL string
*
* @example
* // Basic download URL
* const url = file.presign({
* expiresIn: 3600 // 1 hour
* });
*
* @example
* // Upload URL with specific content type
* const uploadUrl = file.presign({
* method: "PUT",
* expiresIn: 3600,
* type: "image/jpeg",
* acl: "public-read"
* });
*
* @example
* // URL with custom permissions
* const url = file.presign({
* method: "GET",
* expiresIn: 7 * 24 * 60 * 60, // 7 days
* acl: "public-read"
* });
*/
presign(options?: S3FilePresignOptions): string;
/**
* Deletes the file from S3.
*
* @returns Promise that resolves when deletion is complete
*
* @example
* // Basic deletion
* await file.delete();
*
* @example
* // With error handling
* try {
* await file.delete();
* console.log("File deleted successfully");
* } catch (err) {
* console.error("Failed to delete file:", err);
* }
*/
delete(): Promise<void>;
/**
* Alias for delete() method.
* Provided for compatibility with Node.js fs API naming.
*
* @example
* await file.unlink();
*/
unlink: S3File["delete"];
/**
* Get the stat of a file in an S3-compatible storage service.
*
* @returns Promise resolving to S3Stat
*/
stat(): Promise<S3Stats>;
}
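The `slice()` documentation above notes that partial reads map to HTTP Range requests with an exclusive end offset. The translation to the inclusive `Range` header format can be sketched as follows (this models the documented semantics, not Bun's actual request code):

```typescript
// Convert slice(begin, end) offsets (end-exclusive, per the docs above)
// into an HTTP Range header value, whose end byte is inclusive.
function rangeHeader(begin = 0, end?: number): string {
  if (end === undefined) return `bytes=${begin}-`; // from offset to end of object
  return `bytes=${begin}-${end - 1}`; // exclusive end -> inclusive header
}
```

So `file.slice(0, 1024)` corresponds to `Range: bytes=0-1023`, and `file.slice(1024)` to `Range: bytes=1024-`.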
/**
* A configured S3 bucket instance for managing files.
* The instance is callable to create S3File instances and provides methods
* for common operations.
*
* @example
* // Basic bucket setup
* const bucket = new S3Client({
* bucket: "my-bucket",
* accessKeyId: "key",
* secretAccessKey: "secret"
* });
*
* // Get file instance
* const file = bucket.file("image.jpg");
*
* // Common operations
* await bucket.write("data.json", JSON.stringify({hello: "world"}));
* const url = bucket.presign("file.pdf");
* await bucket.unlink("old.txt");
*
* @category Cloud Storage
*/
class S3Client {
prototype: S3Client;
/**
* Create a new instance of an S3 bucket so that credentials can be managed
* from a single instance instead of being passed to every method.
*
* @param options The default options to use for the S3 client. Can be
* overridden by passing options to the methods.
*
* ## Keep S3 credentials in a single instance
*
* @example
* const bucket = new Bun.S3Client({
* accessKeyId: "your-access-key",
* secretAccessKey: "your-secret-key",
* bucket: "my-bucket",
* endpoint: "https://s3.us-east-1.amazonaws.com",
* sessionToken: "your-session-token",
* });
*
* // S3Client is callable, so you can do this:
* const file = bucket.file("my-file.txt");
*
* // or this:
* await file.write("Hello Bun!");
* await file.text();
*
* // To delete the file:
* await bucket.delete("my-file.txt");
*
* // To write a file without returning the instance:
* await bucket.write("my-file.txt", "Hello Bun!");
*
*/
constructor(options?: S3Options);
/**
* Creates an S3File instance for the given path.
*
* @example
* const file = bucket.file("image.jpg");
* await file.write(imageData);
* const configFile = bucket.file("config.json", {
* type: "application/json",
* acl: "private"
* });
*/
file(path: string, options?: S3Options): S3File;
/**
* Writes data directly to a path in the bucket.
* Supports strings, buffers, streams, and web API types.
*
* @example
* // Write string
* await bucket.write("hello.txt", "Hello World");
*
* // Write JSON with type
* await bucket.write(
* "data.json",
* JSON.stringify({hello: "world"}),
* {type: "application/json"}
* );
*
* // Write from fetch
* const res = await fetch("https://example.com/data");
* await bucket.write("data.bin", res);
*
* // Write with ACL
* await bucket.write("public.html", html, {
* acl: "public-read",
* type: "text/html"
* });
*/
write(
path: string,
data:
| string
| ArrayBufferView
| ArrayBuffer
| SharedArrayBuffer
| Request
| Response
| BunFile
| S3File
| Blob
| File,
options?: S3Options,
): Promise<number>;
/**
* Generate a presigned URL for temporary access to a file.
* Useful for generating upload/download URLs without exposing credentials.
*
* @example
* // Download URL
* const downloadUrl = bucket.presign("file.pdf", {
* expiresIn: 3600 // 1 hour
* });
*
* // Upload URL
* const uploadUrl = bucket.presign("uploads/image.jpg", {
* method: "PUT",
* expiresIn: 3600,
* type: "image/jpeg",
* acl: "public-read"
* });
*
* // Long-lived public URL
* const publicUrl = bucket.presign("public/doc.pdf", {
* expiresIn: 7 * 24 * 60 * 60, // 7 days
* acl: "public-read"
* });
*/
presign(path: string, options?: S3FilePresignOptions): string;
/**
* Delete a file from the bucket.
*
* @example
* // Simple delete
* await bucket.unlink("old-file.txt");
*
* // With error handling
* try {
* await bucket.unlink("file.dat");
* console.log("File deleted");
* } catch (err) {
* console.error("Delete failed:", err);
* }
*/
unlink(path: string, options?: S3Options): Promise<void>;
delete: S3Client["unlink"];
/**
* Get the size of a file in bytes.
* Uses HEAD request to efficiently get size.
*
* @example
* // Get size
* const bytes = await bucket.size("video.mp4");
* console.log(`Size: ${bytes} bytes`);
*
* // Check if file is large
* if (await bucket.size("data.zip") > 100 * 1024 * 1024) {
* console.log("File is larger than 100MB");
* }
*/
size(path: string, options?: S3Options): Promise<number>;
/**
* Check if a file exists in the bucket.
* Uses HEAD request to check existence.
*
* @example
* // Check existence
* if (await bucket.exists("config.json")) {
* const file = bucket.file("config.json");
* const config = await file.json();
* }
*
* // With error handling
* try {
* if (!await bucket.exists("required.txt")) {
* throw new Error("Required file missing");
* }
* } catch (err) {
* console.error("Check failed:", err);
* }
*/
exists(path: string, options?: S3Options): Promise<boolean>;
/**
* Get the stat of a file in an S3-compatible storage service.
*
* @param path The path to the file.
* @param options The options to use for the S3 client.
*/
stat(path: string, options?: S3Options): Promise<S3Stats>;
}
/**
* A default instance of S3Client
*
* Pulls credentials from environment variables. Use `new Bun.S3Client()` if you need to explicitly set credentials.
*
* @category Cloud Storage
*/
var s3: S3Client;
}

@@ -58,6 +58,8 @@ declare module "bun:sqlite" {
* ```ts
* const db = new Database("mydb.sqlite", {readonly: true});
* ```
*
* @category Database
*/
constructor(
filename?: string,
@@ -567,6 +569,8 @@ declare module "bun:sqlite" {
*
* This is returned by {@link Database.prepare} and {@link Database.query}.
*
* @category Database
*
* @example
* ```ts
* const stmt = db.prepare("SELECT * FROM foo WHERE bar = ?");

@@ -16,6 +16,8 @@
declare module "bun:test" {
/**
* -- Mocks --
*
* @category Testing
*/
export type Mock<T extends (...args: any[]) => any> = JestMock.Mock<T>;
@@ -149,6 +151,10 @@ declare module "bun:test" {
methodOrPropertyValue: K,
): Mock<T[K] extends (...args: any[]) => any ? T[K] : never>;
interface FunctionLike {
readonly name: string;
}
/**
* Describes a group of related tests.
*
@@ -164,11 +170,9 @@ declare module "bun:test" {
*
* @param label the label for the tests
* @param fn the function that defines the tests
*
* @category Testing
*/
interface FunctionLike {
readonly name: string;
}
export interface Describe {
(fn: () => void): void;
@@ -352,6 +356,8 @@ declare module "bun:test" {
* @param label the label for the test
* @param fn the test function
* @param options the test timeout or options
*
* @category Testing
*/
export interface Test {
(
@@ -420,7 +426,6 @@ declare module "bun:test" {
*
* @param label the label for the test
* @param fn the test function
* @param options the test timeout or options
*/
failing(label: string, fn?: (() => void | Promise<unknown>) | ((done: (err?: unknown) => void) => void)): void;
/**
@@ -1778,10 +1783,6 @@ declare module "bun:test" {
type MatcherContext = MatcherUtils & MatcherState;
}
declare module "test" {
export type * from "bun:test";
}
declare namespace JestMock {
/**
* Copyright (c) Meta Platforms, Inc. and affiliates.

@@ -1,2 +0,0 @@
Bun.spawn(["echo", '"hi"']);
performance;

@@ -1,178 +0,0 @@
Bun.serve({
fetch(req) {
console.log(req.url); // => http://localhost:3000/
return new Response("Hello World");
},
});
Bun.serve({
fetch(req) {
console.log(req.url); // => http://localhost:3000/
return new Response("Hello World");
},
keyFile: "ca.pem",
certFile: "cert.pem",
});
Bun.serve({
websocket: {
message(ws, message) {
ws.send(message);
},
},
fetch(req, server) {
// Upgrade to a ServerWebSocket if we can
// This automatically checks for the `Sec-WebSocket-Key` header
// meaning you don't have to check headers, you can just call `upgrade()`
if (server.upgrade(req)) {
// When upgrading, we return undefined since we don't want to send a Response
return;
}
return new Response("Regular HTTP response");
},
});
Bun.serve<{
name: string;
}>({
fetch(req, server) {
const url = new URL(req.url);
if (url.pathname === "/chat") {
if (
server.upgrade(req, {
data: {
name: new URL(req.url).searchParams.get("name") || "Friend",
},
headers: {
"Set-Cookie": "name=" + new URL(req.url).searchParams.get("name"),
},
})
) {
return;
}
}
return new Response("Expected a websocket connection", { status: 400 });
},
websocket: {
open(ws) {
console.log("WebSocket opened");
ws.subscribe("the-group-chat");
},
message(ws, message) {
ws.publish("the-group-chat", `${ws.data.name}: ${message.toString()}`);
},
close(ws, code, reason) {
ws.publish("the-group-chat", `${ws.data.name} left the chat`);
},
drain(ws) {
console.log("Please send me data. I am ready to receive it.");
},
perMessageDeflate: true,
},
});
Bun.serve({
fetch(req) {
throw new Error("woops!");
},
error(error) {
return new Response(`<pre>${error.message}\n${error.stack}</pre>`, {
headers: {
"Content-Type": "text/html",
},
});
},
});
export {};
Bun.serve({
port: 1234,
fetch(req, server) {
server.upgrade(req);
if (Math.random() > 0.5) return undefined;
return new Response();
},
websocket: { message() {} },
});
Bun.serve({
unix: "/tmp/bun.sock",
fetch() {
return new Response();
},
});
Bun.serve({
unix: "/tmp/bun.sock",
fetch(req, server) {
server.upgrade(req);
if (Math.random() > 0.5) return undefined;
return new Response();
},
websocket: { message() {} },
});
Bun.serve({
unix: "/tmp/bun.sock",
fetch() {
return new Response();
},
tls: {},
});
Bun.serve({
unix: "/tmp/bun.sock",
fetch(req, server) {
server.upgrade(req);
if (Math.random() > 0.5) return undefined;
return new Response();
},
websocket: { message() {} },
tls: {},
});
Bun.serve({
fetch(req, server) {
server.upgrade(req);
},
websocket: {
open(ws) {
console.log("WebSocket opened");
ws.subscribe("test-channel");
},
message(ws, message) {
ws.publish("test-channel", `${message.toString()}`);
},
perMessageDeflate: true,
},
});
// Bun.serve({
// unix: "/tmp/bun.sock",
// // @ts-expect-error
// port: 1234,
// fetch() {
// return new Response();
// },
// });
// Bun.serve({
// unix: "/tmp/bun.sock",
// // @ts-expect-error
// port: 1234,
// fetch(req, server) {
// server.upgrade(req);
// if (Math.random() > 0.5) return undefined;
// return new Response();
// },
// websocket: { message() {} },
// });

@@ -1,23 +0,0 @@
new ReadableStream({
start(controller) {
controller.enqueue("hello");
controller.enqueue("world");
controller.close();
},
});
// this will have type errors when lib.dom.d.ts is present
// afaik this isn't fixable
new ReadableStream({
type: "direct",
pull(controller) {
// eslint-disable-next-line
controller.write("hello");
// eslint-disable-next-line
controller.write("world");
controller.close();
},
cancel() {
// called if stream.cancel() is called
},
});

@@ -1,8 +0,0 @@
// eslint-disable-next-line @definitelytyped/no-unnecessary-generics
export declare const expectType: <T>(expression: T) => void;
// eslint-disable-next-line @definitelytyped/no-unnecessary-generics
export declare const expectAssignable: <T>(expression: T) => void;
// eslint-disable-next-line @definitelytyped/no-unnecessary-generics
export declare const expectNotAssignable: <T>(expression: any) => void;
// eslint-disable-next-line @definitelytyped/no-unnecessary-generics
export declare const expectTypeEquals: <T, S>(expression: T extends S ? (S extends T ? true : false) : false) => void;

@@ -1,12 +1,12 @@
{
"extends": "../../tsconfig.base.json",
"compilerOptions": {
"skipLibCheck": true,
"declaration": true,
"emitDeclarationOnly": true,
"noEmit": false,
"declarationDir": "out"
},
"include": ["**/*.ts"],
"exclude": ["dist", "node_modules"]
"extends": "../../tsconfig.base.json",
"compilerOptions": {
"skipLibCheck": true,
"declaration": true,
"emitDeclarationOnly": true,
"noEmit": false,
"declarationDir": "out"
},
"include": ["**/*.ts"],
"exclude": ["dist", "node_modules"]
}

@@ -1,270 +1,193 @@
export {};
type _Global<T extends Bun.WebAssembly.ValueType = Bun.WebAssembly.ValueType> = typeof globalThis extends {
onerror: any;
WebAssembly: { Global: infer T };
}
? T
: Bun.WebAssembly.Global<T>;
type _CompileError = typeof globalThis extends {
onerror: any;
WebAssembly: { CompileError: infer T };
}
? T
: Bun.WebAssembly.CompileError;
type _LinkError = typeof globalThis extends {
onerror: any;
WebAssembly: { LinkError: infer T };
}
? T
: Bun.WebAssembly.LinkError;
type _RuntimeError = typeof globalThis extends {
onerror: any;
WebAssembly: { RuntimeError: infer T };
}
? T
: Bun.WebAssembly.RuntimeError;
type _Memory = typeof globalThis extends {
onerror: any;
WebAssembly: { Memory: infer T };
}
? T
: Bun.WebAssembly.Memory;
type _Instance = typeof globalThis extends {
onerror: any;
WebAssembly: { Instance: infer T };
}
? T
: Bun.WebAssembly.Instance;
type _Module = typeof globalThis extends {
onerror: any;
WebAssembly: { Module: infer T };
}
? T
: Bun.WebAssembly.Module;
type _Table = typeof globalThis extends {
onerror: any;
WebAssembly: { Table: infer T };
}
? T
: Bun.WebAssembly.Table;
declare global {
namespace Bun {
namespace WebAssembly {
type ImportExportKind = "function" | "global" | "memory" | "table";
type TableKind = "anyfunc" | "externref";
// eslint-disable-next-line @typescript-eslint/ban-types
type ExportValue = Function | Global | WebAssembly.Memory | WebAssembly.Table;
type Exports = Record<string, ExportValue>;
type ImportValue = ExportValue | number;
type Imports = Record<string, ModuleImports>;
type ModuleImports = Record<string, ImportValue>;
interface ValueTypeMap {
// eslint-disable-next-line @typescript-eslint/ban-types
anyfunc: Function;
externref: any;
f32: number;
f64: number;
i32: number;
i64: bigint;
v128: never;
}
type ValueType = keyof ValueTypeMap;
interface GlobalDescriptor<T extends ValueType = ValueType> {
mutable?: boolean;
value: T;
}
interface Global<T extends ValueType = ValueType> {
// <T extends ValueType = ValueType> {
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Global/value) */
value: ValueTypeMap[T];
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Global/valueOf) */
valueOf(): ValueTypeMap[T];
}
interface CompileError extends Error {}
interface LinkError extends Error {}
interface RuntimeError extends Error {}
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Instance) */
interface Instance {
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Instance/exports) */
readonly exports: Exports;
}
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Memory) */
interface Memory {
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Memory/buffer) */
readonly buffer: ArrayBuffer;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Memory/grow) */
grow(delta: number): number;
}
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module) */
interface Module {}
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Table) */
interface Table {
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Table/length) */
readonly length: number;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Table/get) */
get(index: number): any;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Table/grow) */
grow(delta: number, value?: any): number;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Table/set) */
set(index: number, value?: any): void;
}
interface MemoryDescriptor {
initial: number;
maximum?: number;
shared?: boolean;
}
interface ModuleExportDescriptor {
kind: ImportExportKind;
name: string;
}
interface ModuleImportDescriptor {
kind: ImportExportKind;
module: string;
name: string;
}
interface TableDescriptor {
element: TableKind;
initial: number;
maximum?: number;
}
interface WebAssemblyInstantiatedSource {
instance: Instance;
module: Module;
}
}
}
declare module "bun" {
namespace WebAssembly {
interface ValueTypeMap extends Bun.WebAssembly.ValueTypeMap {}
interface GlobalDescriptor<T extends keyof ValueTypeMap = keyof ValueTypeMap>
extends Bun.WebAssembly.GlobalDescriptor<T> {}
interface MemoryDescriptor extends Bun.WebAssembly.MemoryDescriptor {}
interface ModuleExportDescriptor extends Bun.WebAssembly.ModuleExportDescriptor {}
interface ModuleImportDescriptor extends Bun.WebAssembly.ModuleImportDescriptor {}
interface TableDescriptor extends Bun.WebAssembly.TableDescriptor {}
interface WebAssemblyInstantiatedSource extends Bun.WebAssembly.WebAssemblyInstantiatedSource {}
type ImportExportKind = "function" | "global" | "memory" | "table";
type TableKind = "anyfunc" | "externref";
type ExportValue = Function | Global | WebAssembly.Memory | WebAssembly.Table;
type Exports = Record<string, ExportValue>;
type ImportValue = ExportValue | number;
type Imports = Record<string, ModuleImports>;
type ModuleImports = Record<string, ImportValue>;
interface LinkError extends _LinkError {}
var LinkError: {
prototype: LinkError;
new (message?: string): LinkError;
(message?: string): LinkError;
};
interface ValueTypeMap {
anyfunc: Function;
externref: any;
f32: number;
f64: number;
i32: number;
i64: bigint;
v128: never;
}
interface CompileError extends _CompileError {}
var CompileError: typeof globalThis extends {
onerror: any;
WebAssembly: { CompileError: infer T };
}
? T
: {
prototype: CompileError;
new (message?: string): CompileError;
(message?: string): CompileError;
};
interface RuntimeError extends _RuntimeError {}
var RuntimeError: {
prototype: RuntimeError;
new (message?: string): RuntimeError;
(message?: string): RuntimeError;
};
type ValueType = keyof ValueTypeMap;
interface Global<T extends keyof ValueTypeMap = keyof ValueTypeMap> extends _Global<T> {}
interface GlobalDescriptor<T extends ValueType = ValueType> {
mutable?: boolean;
value: T;
}
var Global: typeof globalThis extends {
onerror: any;
WebAssembly: { Global: infer T };
}
? T
: {
prototype: Global;
new <T extends Bun.WebAssembly.ValueType = Bun.WebAssembly.ValueType>(
descriptor: GlobalDescriptor<T>,
v?: ValueTypeMap[T],
): Global<T>;
};
interface Instance extends _Instance {}
var Instance: typeof globalThis extends {
onerror: any;
WebAssembly: { Instance: infer T };
}
? T
: {
prototype: Instance;
new (module: Module, importObject?: Bun.WebAssembly.Imports): Instance;
};
interface Memory extends _Memory {}
var Memory: {
prototype: Memory;
new (descriptor: MemoryDescriptor): Memory;
};
interface Module extends _Module {}
var Module: typeof globalThis extends {
onerror: any;
WebAssembly: { Module: infer T };
}
? T
: {
prototype: Module;
new (bytes: Bun.BufferSource): Module;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module/customSections) */
customSections(moduleObject: Module, sectionName: string): ArrayBuffer[];
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module/exports) */
exports(moduleObject: Module): ModuleExportDescriptor[];
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module/imports) */
imports(moduleObject: Module): ModuleImportDescriptor[];
};
interface Table extends _Table {}
var Table: {
prototype: Table;
new (descriptor: TableDescriptor, value?: any): Table;
};
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/compile) */
function compile(bytes: Bun.BufferSource): Promise<Module>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/compileStreaming) */
function compileStreaming(source: Response | PromiseLike<Response>): Promise<Module>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/instantiate) */
function instantiate(
bytes: Bun.BufferSource,
importObject?: Bun.WebAssembly.Imports,
): Promise<WebAssemblyInstantiatedSource>;
function instantiate(moduleObject: Module, importObject?: Bun.WebAssembly.Imports): Promise<Instance>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/instantiateStreaming) */
function instantiateStreaming(
source: Response | PromiseLike<Response>,
importObject?: Bun.WebAssembly.Imports,
): Promise<WebAssemblyInstantiatedSource>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/validate) */
function validate(bytes: Bun.BufferSource): boolean;
}
}
declare namespace WebAssembly {
interface ValueTypeMap extends Bun.WebAssembly.ValueTypeMap {}
interface GlobalDescriptor<T extends keyof ValueTypeMap = keyof ValueTypeMap>
extends Bun.WebAssembly.GlobalDescriptor<T> {}
interface MemoryDescriptor extends Bun.WebAssembly.MemoryDescriptor {}
interface ModuleExportDescriptor extends Bun.WebAssembly.ModuleExportDescriptor {}
interface ModuleImportDescriptor extends Bun.WebAssembly.ModuleImportDescriptor {}
interface TableDescriptor extends Bun.WebAssembly.TableDescriptor {}
interface WebAssemblyInstantiatedSource extends Bun.WebAssembly.WebAssemblyInstantiatedSource {}
interface LinkError extends Bun.WebAssembly.LinkError {}
var LinkError: {
prototype: LinkError;
new (message?: string): LinkError;
(message?: string): LinkError;
};
interface CompileError extends Bun.WebAssembly.CompileError {}
var CompileError: {
prototype: CompileError;
new (message?: string): CompileError;
(message?: string): CompileError;
};
interface RuntimeError extends Bun.WebAssembly.RuntimeError {}
var RuntimeError: {
prototype: RuntimeError;
new (message?: string): RuntimeError;
(message?: string): RuntimeError;
};
interface Global<T extends keyof ValueTypeMap = keyof ValueTypeMap> extends Bun.WebAssembly.Global<T> {}
var Global: {
prototype: Global;
new <T extends Bun.WebAssembly.ValueType = Bun.WebAssembly.ValueType>(
descriptor: GlobalDescriptor<T>,
v?: ValueTypeMap[T],
): Global<T>;
};
interface Instance extends Bun.WebAssembly.Instance {}
var Instance: {
prototype: Instance;
new (module: Module, importObject?: Bun.WebAssembly.Imports): Instance;
};
interface Memory extends Bun.WebAssembly.Memory {}
var Memory: {
prototype: Memory;
new (descriptor: MemoryDescriptor): Memory;
};
interface Module extends Bun.WebAssembly.Module {}
var Module: Bun.__internal.UseLibDomIfAvailable<
"WebAssembly",
{
Module: {
prototype: Module;
new (bytes: Bun.BufferSource): Module;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module/customSections) */
customSections(moduleObject: Module, sectionName: string): ArrayBuffer[];
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module/exports) */
exports(moduleObject: Module): ModuleExportDescriptor[];
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/Module/imports) */
imports(moduleObject: Module): ModuleImportDescriptor[];
};
}
>["Module"];
interface Table extends Bun.WebAssembly.Table {}
var Table: {
prototype: Table;
new (descriptor: TableDescriptor, value?: any): Table;
};
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/compile) */
function compile(bytes: Bun.BufferSource): Promise<Module>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/compileStreaming) */
function compileStreaming(source: Response | PromiseLike<Response>): Promise<Module>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/instantiate) */
function instantiate(
bytes: Bun.BufferSource,
importObject?: Bun.WebAssembly.Imports,
): Promise<WebAssemblyInstantiatedSource>;
function instantiate(moduleObject: Module, importObject?: Bun.WebAssembly.Imports): Promise<Instance>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/instantiateStreaming) */
function instantiateStreaming(
source: Response | PromiseLike<Response>,
importObject?: Bun.WebAssembly.Imports,
): Promise<WebAssemblyInstantiatedSource>;
/** [MDN Reference](https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/WebAssembly/validate) */
function validate(bytes: Bun.BufferSource): boolean;
}


@@ -353,15 +353,21 @@ int kqueue_change(int kqfd, int fd, int old_events, int new_events, void *user_d
int change_length = 0;
/* Do they differ in readable? */
int is_readable = (new_events & LIBUS_SOCKET_READABLE);
int is_writable = (new_events & LIBUS_SOCKET_WRITABLE);
if ((new_events & LIBUS_SOCKET_READABLE) != (old_events & LIBUS_SOCKET_READABLE)) {
EV_SET64(&change_list[change_length++], fd, EVFILT_READ, (new_events & LIBUS_SOCKET_READABLE) ? EV_ADD : EV_DELETE, 0, 0, (uint64_t)(void*)user_data, 0, 0);
EV_SET64(&change_list[change_length++], fd, EVFILT_READ, is_readable ? EV_ADD : EV_DELETE, 0, 0, (uint64_t)(void*)user_data, 0, 0);
}
/* Do they differ in writable? */
if ((new_events & LIBUS_SOCKET_WRITABLE) != (old_events & LIBUS_SOCKET_WRITABLE)) {
if(!is_readable && !is_writable) {
if(!(old_events & LIBUS_SOCKET_WRITABLE)) {
// if we are not reading or writing, we need to add writable to receive FIN
EV_SET64(&change_list[change_length++], fd, EVFILT_WRITE, EV_ADD, 0, 0, (uint64_t)(void*)user_data, 0, 0);
}
} else if ((new_events & LIBUS_SOCKET_WRITABLE) != (old_events & LIBUS_SOCKET_WRITABLE)) {
/* Do they differ in writable? */
EV_SET64(&change_list[change_length++], fd, EVFILT_WRITE, (new_events & LIBUS_SOCKET_WRITABLE) ? EV_ADD : EV_DELETE, 0, 0, (uint64_t)(void*)user_data, 0, 0);
}
}
int ret;
do {
ret = kevent64(kqfd, change_list, change_length, change_list, change_length, KEVENT_FLAG_ERROR_EVENTS, NULL);
@@ -399,6 +405,10 @@ int us_poll_start_rc(struct us_poll_t *p, struct us_loop_t *loop, int events) {
#ifdef LIBUS_USE_EPOLL
struct epoll_event event;
if(!(events & LIBUS_SOCKET_READABLE) && !(events & LIBUS_SOCKET_WRITABLE)) {
// if we are disabling readable, we need to add the other events to detect EOF/HUP/ERR
events |= EPOLLRDHUP | EPOLLHUP | EPOLLERR;
}
event.events = events;
event.data.ptr = p;
int ret;
@@ -423,6 +433,10 @@ void us_poll_change(struct us_poll_t *p, struct us_loop_t *loop, int events) {
#ifdef LIBUS_USE_EPOLL
struct epoll_event event;
if(!(events & LIBUS_SOCKET_READABLE) && !(events & LIBUS_SOCKET_WRITABLE)) {
// if we are disabling readable, we need to add the other events to detect EOF/HUP/ERR
events |= EPOLLRDHUP | EPOLLHUP | EPOLLERR;
}
event.events = events;
event.data.ptr = p;
int rc;


@@ -104,7 +104,6 @@ void us_poll_change(struct us_poll_t *p, struct us_loop_t *loop, int events) {
us_internal_poll_type(p) |
((events & LIBUS_SOCKET_READABLE) ? POLL_TYPE_POLLING_IN : 0) |
((events & LIBUS_SOCKET_WRITABLE) ? POLL_TYPE_POLLING_OUT : 0);
uv_poll_start(p->uv_p, events, poll_cb);
}
}


@@ -472,6 +472,7 @@ int us_socket_raw_write(int ssl, us_socket_r s, const char *data, int length, in
struct us_socket_t* us_socket_open(int ssl, struct us_socket_t * s, int is_client, char* ip, int ip_length);
int us_raw_root_certs(struct us_cert_string_t**out);
unsigned int us_get_remote_address_info(char *buf, us_socket_r s, const char **dest, int *port, int *is_ipv6);
unsigned int us_get_local_address_info(char *buf, us_socket_r s, const char **dest, int *port, int *is_ipv6);
int us_socket_get_error(int ssl, us_socket_r s);
void us_socket_ref(us_socket_r s);


@@ -213,21 +213,21 @@ void us_internal_free_closed_sockets(struct us_loop_t *loop) {
us_poll_free((struct us_poll_t *) s, loop);
s = next;
}
loop->data.closed_head = 0;
loop->data.closed_head = NULL;
for (struct us_udp_socket_t *s = loop->data.closed_udp_head; s; ) {
struct us_udp_socket_t *next = s->next;
us_poll_free((struct us_poll_t *) s, loop);
s = next;
}
loop->data.closed_udp_head = 0;
loop->data.closed_udp_head = NULL;
for (struct us_connecting_socket_t *s = loop->data.closed_connecting_head; s; ) {
struct us_connecting_socket_t *next = s->next;
us_free(s);
s = next;
}
loop->data.closed_connecting_head = 0;
loop->data.closed_connecting_head = NULL;
}
void us_internal_free_closed_contexts(struct us_loop_t *loop) {
@@ -236,7 +236,7 @@ void us_internal_free_closed_contexts(struct us_loop_t *loop) {
us_free(ctx);
ctx = next;
}
loop->data.closed_context_head = 0;
loop->data.closed_context_head = NULL;
}
void sweep_timer_cb(struct us_internal_callback_t *cb) {


@@ -501,6 +501,24 @@ unsigned int us_get_remote_address_info(char *buf, struct us_socket_t *s, const
return length;
}
unsigned int us_get_local_address_info(char *buf, struct us_socket_t *s, const char **dest, int *port, int *is_ipv6)
{
struct bsd_addr_t addr;
if (bsd_local_addr(us_poll_fd(&s->p), &addr)) {
return 0;
}
int length = bsd_addr_get_ip_length(&addr);
if (!length) {
return 0;
}
memcpy(buf, bsd_addr_get_ip(&addr), length);
*port = bsd_addr_get_port(&addr);
return length;
}
void us_socket_ref(struct us_socket_t *s) {
#ifdef LIBUS_USE_LIBUV
uv_ref((uv_handle_t*)s->p.uv_p);
@@ -540,11 +558,6 @@ struct us_loop_t *us_connecting_socket_get_loop(struct us_connecting_socket_t *c
void us_socket_pause(int ssl, struct us_socket_t *s) {
// closed cannot be paused because it is already closed
if(us_socket_is_closed(ssl, s)) return;
if(us_socket_is_shut_down(ssl, s)) {
// we already sent FIN so we pause all events because we are read-only
us_poll_change(&s->p, s->context->loop, 0);
return;
}
// we are readable and writable so we can just pause readable side
us_poll_change(&s->p, s->context->loop, LIBUS_SOCKET_WRITABLE);
}


@@ -131,7 +131,7 @@ public:
getLoopData()->setCorkedSocket(this, SSL);
}
/* Returns wheter we are corked or not */
/* Returns whether we are corked */
bool isCorked() {
return getLoopData()->isCorkedWith(this);
}
@@ -182,9 +182,9 @@ public:
}
/* Returns the user space backpressure. */
unsigned int getBufferedAmount() {
size_t getBufferedAmount() {
/* We return the actual amount of bytes in backbuffer, including pendingRemoval */
return (unsigned int) getAsyncSocketData()->buffer.totalLength();
return getAsyncSocketData()->buffer.totalLength();
}
/* Returns the text representation of an IPv4 or IPv6 address */
@@ -222,6 +222,63 @@ public:
return addressAsText(getRemoteAddress());
}
/**
* Flushes the socket buffer by writing as much data as possible to the underlying socket.
*
* @return The total number of bytes successfully written to the socket
*/
size_t flush() {
/* Check if socket is valid for operations */
if (us_socket_is_closed(SSL, (us_socket_t *) this)) {
/* Socket is closed, no flushing is possible */
return 0;
}
/* Get the associated asynchronous socket data structure */
AsyncSocketData<SSL> *asyncSocketData = getAsyncSocketData();
size_t total_written = 0;
/* Continue flushing as long as we have data in the buffer */
while (asyncSocketData->buffer.length()) {
/* Get current buffer size */
size_t buffer_len = asyncSocketData->buffer.length();
/* Limit write size to INT_MAX as the underlying socket API uses int for length */
int max_flush_len = std::min(buffer_len, (size_t)INT_MAX);
/* Attempt to write data to the socket */
int written = us_socket_write(SSL, (us_socket_t *) this, asyncSocketData->buffer.data(), max_flush_len, 0);
total_written += written;
/* Check if we couldn't write the entire buffer */
if ((unsigned int) written < buffer_len) {
/* Remove the successfully written data from the buffer */
asyncSocketData->buffer.erase((unsigned int) written);
/* If we wrote less than we attempted, the socket buffer is likely full
* likely is used as an optimization hint to the compiler
* since written < buffer_len is very likely to be true
*/
if(written < max_flush_len) {
[[likely]]
/* Cannot write more at this time, return what we've written so far */
return total_written;
}
/* If we wrote exactly max_flush_len, we might be able to write more, so continue
* This is unlikely to happen, because this would be INT_MAX bytes, which is unlikely to be written in one go
* but we keep this check for completeness
*/
continue;
}
/* Successfully wrote the entire buffer, clear the buffer */
asyncSocketData->buffer.clear();
}
/* Return the total number of bytes written during this flush operation */
return total_written;
}
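The `flush()` method above loops until the buffer is empty or a short write signals a full kernel send buffer. A simplified model of the same loop (assumed semantics; the byte-array socket is a stand-in for `us_socket_write`):

```typescript
// Repeatedly hand the front of the buffer to a write() that may accept fewer
// bytes, erase the accepted prefix, and stop as soon as a write comes up short.
function flushBuffer(
  buffer: number[],
  socketWrite: (chunk: number[]) => number, // returns bytes accepted
  maxWrite = 2 ** 31 - 1, // mirrors the INT_MAX cap in the C++ code
): number {
  let totalWritten = 0;
  while (buffer.length > 0) {
    const attempt = Math.min(buffer.length, maxWrite);
    const written = socketWrite(buffer.slice(0, attempt));
    totalWritten += written;
    buffer.splice(0, written); // drop the accepted prefix
    if (written < attempt) break; // short write: kernel buffer is full
  }
  return totalWritten;
}
```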
/* Write in three levels of prioritization: cork-buffer, syscall, socket-buffer. Always drain if possible.
* Returns pair of bytes written (anywhere) and whether or not this call resulted in the polling for
* writable (or we are in a state that implies polling for writable). */
@@ -233,7 +290,6 @@ public:
LoopData *loopData = getLoopData();
AsyncSocketData<SSL> *asyncSocketData = getAsyncSocketData();
/* We are limited if we have a per-socket buffer */
if (asyncSocketData->buffer.length()) {
size_t buffer_len = asyncSocketData->buffer.length();
@@ -261,7 +317,7 @@ public:
asyncSocketData->buffer.clear();
}
if (length) {
if (length) {
if (loopData->isCorkedWith(this)) {
/* We are corked */
if (LoopData::CORK_BUFFER_SIZE - loopData->getCorkOffset() >= (unsigned int) length) {


@@ -125,7 +125,7 @@ private:
}
/* Signal broken HTTP request only if we have a pending request */
if (httpResponseData->onAborted) {
if (httpResponseData->onAborted != nullptr && httpResponseData->userData != nullptr) {
httpResponseData->onAborted((HttpResponse<SSL> *)s, httpResponseData->userData);
}
@@ -235,7 +235,7 @@ private:
}
/* Returning from a request handler without responding or attaching an onAborted handler is ill-use */
if (!((HttpResponse<SSL> *) s)->hasResponded() && !httpResponseData->onAborted) {
if (!((HttpResponse<SSL> *) s)->hasResponded() && !httpResponseData->onAborted && !httpResponseData->socketData) {
/* Throw exception here? */
std::cerr << "Error: Returning from a request handler without responding or attaching an abort handler is forbidden!" << std::endl;
std::terminate();
@@ -365,11 +365,32 @@ private:
auto *asyncSocket = reinterpret_cast<AsyncSocket<SSL> *>(s);
auto *httpResponseData = reinterpret_cast<HttpResponseData<SSL> *>(asyncSocket->getAsyncSocketData());
/* Attempt to drain the socket buffer before triggering onWritable callback */
size_t bufferedAmount = asyncSocket->getBufferedAmount();
if (bufferedAmount > 0) {
/* Try to flush pending data from the socket's buffer to the network */
bufferedAmount -= asyncSocket->flush();
/* Check if there's still data waiting to be sent after flush attempt */
if (bufferedAmount > 0) {
/* Socket buffer is not completely empty yet
* - Reset the timeout to prevent premature connection closure
* - This allows time for another writable event or new request
* - Return the socket to indicate we're still processing
*/
reinterpret_cast<HttpResponse<SSL> *>(s)->resetTimeout();
return s;
}
/* If bufferedAmount is now 0, we've successfully flushed everything
* and will fall through to the next section of code
*/
}
/* Ask the developer to write data and return success (true) or failure (false), OR skip sending anything and return success (true). */
if (httpResponseData->onWritable) {
/* We are now writable, so hang timeout again, the user does not have to do anything so we should hang until end or tryEnd rearms timeout */
us_socket_timeout(SSL, s, 0);
/* We expect the developer to return whether or not write was successful (true).
* If write was never called, the developer should still return true so that we may drain. */
bool success = httpResponseData->callOnWritable(reinterpret_cast<HttpResponse<SSL> *>(asyncSocket), httpResponseData->offset);
@@ -384,7 +405,7 @@ private:
}
/* Drain any socket buffer, this might empty our backpressure and thus finish the request */
/*auto [written, failed] = */asyncSocket->write(nullptr, 0, true, 0);
asyncSocket->flush();
/* Should we close this connection after a response - and is this response really done? */
if (httpResponseData->state & HttpResponseData<SSL>::HTTP_CONNECTION_CLOSE) {

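Summarizing the new writable-event path above: drain first, and only hand control to `onWritable` once the buffer is empty. A simplified sketch of that control flow (names are illustrative, not the uWS API):

```typescript
// Model of the writable-event handler: try to flush buffered bytes first;
// if backpressure remains, reset the timeout and wait for the next event.
function onWritableEvent(
  bufferedAmount: number,
  flush: () => number, // returns bytes flushed
  resetTimeout: () => void,
  callOnWritable: () => void,
): "waiting" | "handled" {
  if (bufferedAmount > 0) {
    bufferedAmount -= flush();
    if (bufferedAmount > 0) {
      resetTimeout(); // still backpressured: allow time for another writable event
      return "waiting";
    }
  }
  callOnWritable(); // buffer drained: let the user write more
  return "handled";
}
```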

@@ -122,15 +122,10 @@ public:
/* We do not have tryWrite-like functionalities, so ignore optional in this path */
/* Do not allow sending 0 chunk here */
if (data.length()) {
Super::write("\r\n", 2);
writeUnsignedHex((unsigned int) data.length());
Super::write("\r\n", 2);
/* Ignoring optional for now */
Super::write(data.data(), (int) data.length());
}
/* Write the chunked data if there is any (this will not send zero chunks) */
this->write(data, nullptr);
/* Terminating 0 chunk */
Super::write("\r\n0\r\n\r\n", 7);
@@ -480,6 +475,40 @@ public:
return true;
}
size_t length = data.length();
// Special handling for extremely large data (greater than UINT_MAX bytes)
// most clients expect a max of UINT_MAX, so we need to split the write into multiple writes
if (length > UINT_MAX) {
bool has_failed = false;
size_t total_written = 0;
// Process full-sized chunks until remaining data is less than UINT_MAX
while (length > UINT_MAX) {
size_t written = 0;
// Write a UINT_MAX-sized chunk and check for failure
// even after failure we continue writing because the data will be buffered
if(!this->write(data.substr(0, UINT_MAX), &written)) {
has_failed = true;
}
total_written += written;
length -= UINT_MAX;
data = data.substr(UINT_MAX);
}
// Handle the final chunk (less than UINT_MAX bytes)
if (length > 0) {
size_t written = 0;
if(!this->write(data, &written)) {
has_failed = true;
}
total_written += written;
}
if (writtenPtr) {
*writtenPtr = total_written;
}
return !has_failed;
}
HttpResponseData<SSL> *httpResponseData = getHttpResponseData();
if (!(httpResponseData->state & HttpResponseData<SSL>::HTTP_WROTE_CONTENT_LENGTH_HEADER) && !httpResponseData->fromAncientRequest) {
@@ -499,17 +528,36 @@ public:
Super::write("\r\n", 2);
httpResponseData->state |= HttpResponseData<SSL>::HTTP_WRITE_CALLED;
}
size_t total_written = 0;
bool has_failed = false;
auto [written, failed] = Super::write(data.data(), (int) data.length());
// Handle data larger than INT_MAX by writing it in chunks of INT_MAX bytes
while (length > INT_MAX) {
// Write the maximum allowed chunk size (INT_MAX)
auto [written, failed] = Super::write(data.data(), INT_MAX);
// If the write failed, set the has_failed flag; we continue writing because the data will be buffered
has_failed = has_failed || failed;
total_written += written;
length -= INT_MAX;
data = data.substr(INT_MAX);
}
// Handle the remaining data (less than INT_MAX bytes)
if (length > 0) {
// Write the final chunk with exact remaining length
auto [written, failed] = Super::write(data.data(), (int) length);
has_failed = has_failed || failed;
total_written += written;
}
/* Reset timeout on each chunk sent */
this->resetTimeout();
if (writtenPtr) {
*writtenPtr = written;
*writtenPtr = total_written;
}
/* If we did not fail the write, accept more */
return !failed;
return !has_failed;
}
/* Get the current byte write offset for this Http response */
@@ -660,12 +708,6 @@ public:
return httpResponseData->socketData;
}
void setSocketData(void* socketData) {
HttpResponseData<SSL> *httpResponseData = getHttpResponseData();
httpResponseData->socketData = socketData;
}
void setWriteOffset(uint64_t offset) {
HttpResponseData<SSL> *httpResponseData = getHttpResponseData();

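Both new loops above follow the same pattern: write cap-sized chunks (UINT_MAX or INT_MAX), accumulate the total, and record failure without stopping, since unsent bytes land in the backpressure buffer. A generic sketch of that pattern (assumed semantics; `writeChunk` is a stand-in for `Super::write`):

```typescript
// Split a payload larger than `cap` into cap-sized chunks; a failed chunk
// marks the whole call failed, but writing continues because the remainder
// is buffered by the lower layer.
function writeInChunks(
  data: Uint8Array,
  cap: number,
  writeChunk: (chunk: Uint8Array) => { written: number; failed: boolean },
): { totalWritten: number; ok: boolean } {
  let totalWritten = 0;
  let hasFailed = false;
  let offset = 0;
  while (data.length - offset > cap) {
    const { written, failed } = writeChunk(data.subarray(offset, offset + cap));
    hasFailed ||= failed;
    totalWritten += written;
    offset += cap;
  }
  if (data.length - offset > 0) {
    // Final chunk with the exact remaining length
    const { written, failed } = writeChunk(data.subarray(offset));
    hasFailed ||= failed;
    totalWritten += written;
  }
  return { totalWritten, ok: !hasFailed };
}
```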

@@ -339,7 +339,7 @@ private:
/* We store old backpressure since it is unclear whether write drained anything,
* however, in case of coming here with 0 backpressure we still need to emit drain event */
unsigned int backpressure = asyncSocket->getBufferedAmount();
size_t backpressure = asyncSocket->getBufferedAmount();
/* Drain as much as possible */
asyncSocket->write(nullptr, 0);


@@ -0,0 +1,31 @@
#!/usr/bin/env bash
# This file is not run often, so we don't need to make it part of the build system.
# We do this because the event names have to be compile-time constants.
export TRACE_EVENTS=$(rg 'bun\.perf\.trace\("([^"]*)"\)' -t zig --json \
| jq -r 'select(.type == "match")' \
| jq -r '.data.submatches[].match.text' \
| cut -d'"' -f2 \
| sort \
| uniq)
echo "// Generated with scripts/generate-perf-trace-events.sh" > src/bun.js/bindings/generated_perf_trace_events.h
echo "// clang-format off" >> src/bun.js/bindings/generated_perf_trace_events.h
echo "#define FOR_EACH_TRACE_EVENT(macro) \\" >> src/bun.js/bindings/generated_perf_trace_events.h
i=0
for event in $TRACE_EVENTS; do
echo " macro($event, $((i++))) \\" >> src/bun.js/bindings/generated_perf_trace_events.h
done
echo " // end" >> src/bun.js/bindings/generated_perf_trace_events.h
echo "Generated src/bun.js/bindings/generated_perf_trace_events.h"
echo "// Generated with scripts/generate-perf-trace-events.sh" > src/generated_perf_trace_events.zig
echo "pub const PerfEvent = enum(i32) {" >> src/generated_perf_trace_events.zig
for event in $TRACE_EVENTS; do
echo " @\"$event\"," >> src/generated_perf_trace_events.zig
done
echo "};" >> src/generated_perf_trace_events.zig
echo "Generated src/generated_perf_trace_events.zig"
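For reference, the X-macro header the script emits can be modeled as a pure function; this hypothetical TypeScript re-implementation shows the dedupe-sort-number shape of the output:

```typescript
// Given the trace-event names scraped from the Zig sources, produce the
// FOR_EACH_TRACE_EVENT X-macro header body (illustrative re-implementation,
// not part of the build).
function generateHeader(events: string[]): string {
  const sorted = [...new Set(events)].sort(); // dedupe, then sort for stable IDs
  const lines = [
    "// Generated with scripts/generate-perf-trace-events.sh",
    "// clang-format off",
    "#define FOR_EACH_TRACE_EVENT(macro) \\",
    ...sorted.map((e, i) => `  macro(${e}, ${i}) \\`),
    "  // end",
  ];
  return lines.join("\n");
}
```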


@@ -1080,7 +1080,7 @@ function getRelevantTests(cwd) {
const filteredTests = [];
if (options["node-tests"]) {
tests = tests.filter(isNodeParallelTest);
tests = tests.filter(isNodeTest);
}
const isMatch = (testPath, filter) => {


@@ -326,7 +326,7 @@ pub const StandaloneModuleGraph = struct {
}
pub fn toBytes(allocator: std.mem.Allocator, prefix: []const u8, output_files: []const bun.options.OutputFile, output_format: bun.options.Format) ![]u8 {
var serialize_trace = bun.tracy.traceNamed(@src(), "StandaloneModuleGraph.serialize");
var serialize_trace = bun.perf.trace("StandaloneModuleGraph.serialize");
defer serialize_trace.end();
var entry_point_id: ?usize = null;


@@ -3,8 +3,9 @@ const Watcher = @This();
const DebugLogScope = bun.Output.Scoped(.watcher, false);
const log = DebugLogScope.log;
// Consumer-facing
watch_events: [max_count]WatchEvent,
// This will always be [max_count]WatchEvent,
// We avoid statically allocating because it increases the binary size.
watch_events: []WatchEvent = &.{},
changed_filepaths: [max_count]?[:0]u8,
/// The platform-specific implementation of the watcher
@@ -86,7 +87,7 @@ pub fn init(comptime T: type, ctx: *T, fs: *bun.fs.FileSystem, allocator: std.me
.onFileUpdate = &wrapped.onFileUpdateWrapped,
.onError = &wrapped.onErrorWrapped,
.platform = .{},
.watch_events = undefined,
.watch_events = try allocator.alloc(WatchEvent, max_count),
.changed_filepaths = [_]?[:0]u8{null} ** max_count,
};
@@ -251,7 +252,7 @@ pub fn flushEvictions(this: *Watcher) void {
// swapRemove messes up the order
// But, it only messes up the order if any elements in the list appear after the item being removed
// So if we just sort the list by the biggest index first, that should be fine
std.sort.pdq(
std.sort.insertion(
WatchItemIndex,
this.evict_list[0..this.evict_list_i],
{},
@@ -268,7 +269,7 @@ pub fn flushEvictions(this: *Watcher) void {
if (!Environment.isWindows) {
// on mac and linux we can just close the file descriptor
// TODO do we need to call inotify_rm_watch on linux?
// we don't need to call inotify_rm_watch on linux because it gets removed when the file descriptor is closed
if (fds[item].isValid()) {
_ = bun.sys.close(fds[item]);
}
@@ -279,7 +280,7 @@ pub fn flushEvictions(this: *Watcher) void {
last_item = no_watch_item;
// This is split into two passes because reading the slice while modified is potentially unsafe.
for (this.evict_list[0..this.evict_list_i]) |item| {
if (item == last_item) continue;
if (item == last_item or this.watchlist.len <= item) continue;
this.watchlist.swapRemove(item);
last_item = item;
}
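The eviction fix above relies on a property of swap-remove: deleting from the highest index down never invalidates lower indices, and the new bounds check skips stale entries. A small model of that strategy:

```typescript
// swapRemove replaces the item at `index` with the last item, then shrinks
// the list; order is not preserved, but it is O(1).
function swapRemove<T>(list: T[], index: number): void {
  list[index] = list[list.length - 1];
  list.pop();
}

// Sort eviction indices descending so each removal only disturbs items at
// or after its own index; skip duplicates and out-of-range (stale) indices.
function evictAll<T>(list: T[], evict: number[]): void {
  const sorted = [...evict].sort((a, b) => b - a); // biggest index first
  let last = -1;
  for (const item of sorted) {
    if (item === last || item >= list.length) continue;
    swapRemove(list, item);
    last = item;
  }
}
```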


@@ -32,7 +32,7 @@ fn mimalloc_free(
}
}
const CAllocator = struct {
const MimallocAllocator = struct {
pub const supports_posix_memalign = true;
fn alignedAlloc(len: usize, alignment: mem.Alignment) ?[*]u8 {
@@ -60,36 +60,31 @@ const CAllocator = struct {
return mimalloc.mi_malloc_size(ptr);
}
fn alloc(_: *anyopaque, len: usize, alignment: mem.Alignment, _: usize) ?[*]u8 {
fn alloc_with_default_allocator(_: *anyopaque, len: usize, alignment: mem.Alignment, _: usize) ?[*]u8 {
return alignedAlloc(len, alignment);
}
fn resize(_: *anyopaque, buf: []u8, _: mem.Alignment, new_len: usize, _: usize) bool {
if (new_len <= buf.len) {
return true;
}
const full_len = alignedAllocSize(buf.ptr);
if (new_len <= full_len) {
return true;
}
return false;
fn resize_with_default_allocator(_: *anyopaque, buf: []u8, _: mem.Alignment, new_len: usize, _: usize) bool {
return mimalloc.mi_expand(buf.ptr, new_len) != null;
}
const free = mimalloc_free;
fn remap_with_default_allocator(_: *anyopaque, buf: []u8, alignment: mem.Alignment, new_len: usize, _: usize) ?[*]u8 {
return @ptrCast(mimalloc.mi_realloc_aligned(buf.ptr, new_len, alignment.toByteUnits()));
}
const free_with_default_allocator = mimalloc_free;
};
pub const c_allocator = Allocator{
// This ptr can be anything. But since it's not nullable, we should set it to something.
.ptr = @constCast(c_allocator_vtable),
.ptr = memory_allocator_tags.default_allocator_tag_ptr,
.vtable = c_allocator_vtable,
};
const c_allocator_vtable = &Allocator.VTable{
.alloc = &CAllocator.alloc,
.resize = &CAllocator.resize,
.remap = &std.mem.Allocator.noRemap,
.free = &CAllocator.free,
.alloc = &MimallocAllocator.alloc_with_default_allocator,
.resize = &MimallocAllocator.resize_with_default_allocator,
.remap = &MimallocAllocator.remap_with_default_allocator,
.free = &MimallocAllocator.free_with_default_allocator,
};
const ZAllocator = struct {
@@ -119,11 +114,11 @@ const ZAllocator = struct {
return mimalloc.mi_malloc_size(ptr);
}
fn alloc(_: *anyopaque, len: usize, alignment: mem.Alignment, _: usize) ?[*]u8 {
fn alloc_with_z_allocator(_: *anyopaque, len: usize, alignment: mem.Alignment, _: usize) ?[*]u8 {
return alignedAlloc(len, alignment);
}
fn resize(_: *anyopaque, buf: []u8, _: mem.Alignment, new_len: usize, _: usize) bool {
fn resize_with_z_allocator(_: *anyopaque, buf: []u8, _: mem.Alignment, new_len: usize, _: usize) bool {
if (new_len <= buf.len) {
return true;
}
@@ -136,141 +131,24 @@ const ZAllocator = struct {
return false;
}
const free = mimalloc_free;
const free_with_z_allocator = mimalloc_free;
};
const memory_allocator_tags = struct {
const default_allocator_tag: usize = 0xBEEFA110C; // "BEEFA110C" reads as "beef alloc"
pub const default_allocator_tag_ptr: *anyopaque = @ptrFromInt(default_allocator_tag);
const z_allocator_tag: usize = 0x2a11043470123; // "z4110c4701" (Z ALLOCATOR in 1337 speak)
pub const z_allocator_tag_ptr: *anyopaque = @ptrFromInt(z_allocator_tag);
};
pub const z_allocator = Allocator{
.ptr = undefined,
.ptr = memory_allocator_tags.z_allocator_tag_ptr,
.vtable = &z_allocator_vtable,
};
const z_allocator_vtable = Allocator.VTable{
.alloc = &ZAllocator.alloc,
.resize = &ZAllocator.resize,
.alloc = &ZAllocator.alloc_with_z_allocator,
.resize = &ZAllocator.resize_with_z_allocator,
.remap = &std.mem.Allocator.noRemap,
.free = &ZAllocator.free,
};
const HugeAllocator = struct {
fn alloc(
_: *anyopaque,
len: usize,
alignment: u29,
len_align: u29,
return_address: usize,
) error{OutOfMemory}![]u8 {
_ = return_address;
assert(len > 0);
assert(std.math.isPowerOfTwo(alignment));
const slice = std.posix.mmap(
null,
len,
std.posix.PROT.READ | std.posix.PROT.WRITE,
std.posix.MAP.ANONYMOUS | std.posix.MAP.PRIVATE,
-1,
0,
) catch
return error.OutOfMemory;
_ = len_align;
return slice;
}
fn resize(
_: *anyopaque,
_: []u8,
_: u29,
_: usize,
_: u29,
_: usize,
) ?usize {
return null;
}
fn free(
_: *anyopaque,
buf: []u8,
_: u29,
_: usize,
) void {
std.posix.munmap(@alignCast(buf));
}
};
pub const huge_allocator = Allocator{
.ptr = undefined,
.vtable = &huge_allocator_vtable,
};
const huge_allocator_vtable = Allocator.VTable{
.alloc = HugeAllocator.alloc,
.resize = HugeAllocator.resize,
.free = HugeAllocator.free,
};
pub const huge_threshold = 1024 * 256;
const AutoSizeAllocator = struct {
fn alloc(
_: *anyopaque,
len: usize,
alignment: u29,
len_align: u29,
return_address: usize,
) error{OutOfMemory}![]u8 {
_ = len_align;
if (len >= huge_threshold) {
return huge_allocator.rawAlloc(
len,
alignment,
return_address,
) orelse return error.OutOfMemory;
}
return c_allocator.rawAlloc(
len,
alignment,
return_address,
) orelse return error.OutOfMemory;
}
fn resize(
_: *anyopaque,
_: []u8,
_: u29,
_: usize,
_: u29,
_: usize,
) ?usize {
return null;
}
fn free(
_: *anyopaque,
buf: []u8,
a: u29,
b: usize,
) void {
if (buf.len >= huge_threshold) {
return huge_allocator.rawFree(
buf,
a,
b,
);
}
return c_allocator.rawFree(
buf,
a,
b,
);
}
};
pub const auto_allocator = Allocator{
.ptr = undefined,
.vtable = &auto_allocator_vtable,
};
const auto_allocator_vtable = Allocator.VTable{
.alloc = AutoSizeAllocator.alloc,
.resize = AutoSizeAllocator.resize,
.free = AutoSizeAllocator.free,
.free = &ZAllocator.free_with_z_allocator,
};


@@ -10,124 +10,6 @@ const assert = bun.assert;
const bun = @import("root").bun;
const log = bun.Output.scoped(.mimalloc, true);
pub const GlobalArena = struct {
arena: Arena,
fallback_allocator: std.mem.Allocator,
pub fn initWithCapacity(capacity: usize, fallback: std.mem.Allocator) error{OutOfMemory}!GlobalArena {
const arena = try Arena.initWithCapacity(capacity);
return GlobalArena{
.arena = arena,
.fallback_allocator = fallback,
};
}
pub fn allocator(this: *GlobalArena) Allocator {
return .{
.ptr = this,
.vtable = &.{
.alloc = alloc,
.resize = resize,
.free = free,
},
};
}
fn alloc(
self: *GlobalArena,
len: usize,
ptr_align: u29,
len_align: u29,
return_address: usize,
) error{OutOfMemory}![]u8 {
return self.arena.alloc(len, ptr_align, len_align, return_address) catch
return self.fallback_allocator.rawAlloc(len, ptr_align, return_address) orelse return error.OutOfMemory;
}
fn resize(
self: *GlobalArena,
buf: []u8,
buf_align: u29,
new_len: usize,
len_align: u29,
return_address: usize,
) ?usize {
if (self.arena.ownsPtr(buf.ptr)) {
return self.arena.resize(buf, buf_align, new_len, len_align, return_address);
} else {
return self.fallback_allocator.rawResize(buf, buf_align, new_len, len_align, return_address);
}
}
fn free(
self: *GlobalArena,
buf: []u8,
buf_align: u29,
return_address: usize,
) void {
if (self.arena.ownsPtr(buf.ptr)) {
return self.arena.free(buf, buf_align, return_address);
} else {
return self.fallback_allocator.rawFree(buf, buf_align, return_address);
}
}
};
const ArenaRegistry = struct {
arenas: std.AutoArrayHashMap(?*mimalloc.Heap, std.Thread.Id) = std.AutoArrayHashMap(?*mimalloc.Heap, std.Thread.Id).init(bun.default_allocator),
mutex: bun.Mutex = .{},
var registry = ArenaRegistry{};
pub fn register(arena: Arena) void {
if (comptime Environment.isDebug and Environment.isNative) {
registry.mutex.lock();
defer registry.mutex.unlock();
const entry = registry.arenas.getOrPut(arena.heap.?) catch unreachable;
const received = std.Thread.getCurrentId();
if (entry.found_existing) {
const expected = entry.value_ptr.*;
if (expected != received) {
bun.unreachablePanic("Arena created on wrong thread! Expected: {d} received: {d}", .{
expected,
received,
});
}
}
entry.value_ptr.* = received;
}
}
pub fn assert(arena: Arena) void {
if (comptime Environment.isDebug and Environment.isNative) {
registry.mutex.lock();
defer registry.mutex.unlock();
const expected = registry.arenas.get(arena.heap.?) orelse {
bun.unreachablePanic("Arena not registered!", .{});
};
const received = std.Thread.getCurrentId();
if (expected != received) {
bun.unreachablePanic("Arena accessed on wrong thread! Expected: {d} received: {d}", .{
expected,
received,
});
}
}
}
pub fn unregister(arena: Arena) void {
if (comptime Environment.isDebug and Environment.isNative) {
registry.mutex.lock();
defer registry.mutex.unlock();
if (!registry.arenas.swapRemove(arena.heap.?)) {
bun.unreachablePanic("Arena not registered!", .{});
}
}
}
};
pub const Arena = struct {
heap: ?*mimalloc.Heap = null,
@@ -149,15 +31,6 @@ pub const Arena = struct {
return Allocator{ .ptr = this.heap.?, .vtable = &c_allocator_vtable };
}
pub fn deinit(this: *Arena) void {
// if (comptime Environment.isDebug) {
// ArenaRegistry.unregister(this.*);
// }
mimalloc.mi_heap_destroy(this.heap.?);
this.heap = null;
}
pub fn dumpThreadStats(_: *Arena) void {
const dump_fn = struct {
pub fn dump(textZ: [*:0]const u8, _: ?*anyopaque) callconv(.C) void {
@@ -180,27 +53,22 @@ pub const Arena = struct {
bun.Output.flush();
}
pub fn reset(this: *Arena) void {
this.deinit();
this.* = init() catch unreachable;
pub fn deinit(this: *Arena) void {
mimalloc.mi_heap_destroy(bun.take(&this.heap).?);
}
pub fn init() !Arena {
const arena = Arena{ .heap = mimalloc.mi_heap_new() orelse return error.OutOfMemory };
// if (comptime Environment.isDebug) {
// ArenaRegistry.register(arena);
// }
return arena;
}
pub fn gc(this: Arena, force: bool) void {
mimalloc.mi_heap_collect(this.heap orelse return, force);
pub fn gc(this: Arena) void {
mimalloc.mi_heap_collect(this.heap orelse return, false);
}
pub inline fn helpCatchMemoryIssues(this: Arena) void {
if (comptime FeatureFlags.help_catch_memory_issues) {
this.gc(true);
bun.Mimalloc.mi_collect(true);
this.gc();
bun.Mimalloc.mi_collect(false);
}
}
@@ -236,8 +104,6 @@ pub const Arena = struct {
fn alloc(arena: *anyopaque, len: usize, alignment: mem.Alignment, _: usize) ?[*]u8 {
const this = bun.cast(*mimalloc.Heap, arena);
// if (comptime Environment.isDebug)
// ArenaRegistry.assert(.{ .heap = this });
return alignedAlloc(
this,
@@ -247,16 +113,7 @@ pub const Arena = struct {
}
fn resize(_: *anyopaque, buf: []u8, _: mem.Alignment, new_len: usize, _: usize) bool {
if (new_len <= buf.len) {
return true;
}
const full_len = alignedAllocSize(buf.ptr);
if (new_len <= full_len) {
return true;
}
return false;
return mimalloc.mi_expand(buf.ptr, new_len) != null;
}
fn free(
@@ -278,11 +135,36 @@ pub const Arena = struct {
mimalloc.mi_free(buf.ptr);
}
}
/// Attempt to expand or shrink memory, allowing relocation.
///
/// `memory.len` must equal the length requested from the most recent
/// successful call to `alloc`, `resize`, or `remap`. `alignment` must
/// equal the same value that was passed as the `alignment` parameter to
/// the original `alloc` call.
///
/// A non-`null` return value indicates the resize was successful. The
/// allocation may have same address, or may have been relocated. In either
/// case, the allocation now has size of `new_len`. A `null` return value
/// indicates that the resize would be equivalent to allocating new memory,
/// copying the bytes from the old memory, and then freeing the old memory.
/// In such case, it is more efficient for the caller to perform the copy.
///
/// `new_len` must be greater than zero.
///
/// `ret_addr` is optionally provided as the first return address of the
/// allocation call stack. If the value is `0` it means no return address
/// has been provided.
fn remap(this: *anyopaque, buf: []u8, alignment: mem.Alignment, new_len: usize, _: usize) ?[*]u8 {
const aligned_size = alignment.toByteUnits();
const value = mimalloc.mi_heap_realloc_aligned(@ptrCast(this), buf.ptr, new_len, aligned_size);
return @ptrCast(value);
}
};
const c_allocator_vtable = Allocator.VTable{
.alloc = &Arena.alloc,
.resize = &Arena.resize,
.remap = &std.mem.Allocator.noRemap,
.remap = &Arena.remap,
.free = &Arena.free,
};


@@ -130,6 +130,7 @@ pub const Features = struct {
pub var s3: usize = 0;
pub var csrf_verify: usize = 0;
pub var csrf_generate: usize = 0;
pub var unsupported_uv_function: usize = 0;
comptime {
@export(&napi_module_register, .{ .name = "Bun__napi_module_register_count" });

View File

@@ -5,8 +5,14 @@ extern const jsc_llint_begin: u8;
extern const jsc_llint_end: u8;
/// allocated using bun.default_allocator. when called from lldb, it is never freed.
pub export fn dumpBtjsTrace() [*:0]const u8 {
if (@import("builtin").mode != .Debug) return "dumpBtjsTrace is disabled in release builds";
if (comptime bun.Environment.isDebug) {
return dumpBtjsTraceDebugImpl();
}
return "btjs is disabled in release builds";
}
fn dumpBtjsTraceDebugImpl() [*:0]const u8 {
var result_writer = std.ArrayList(u8).init(bun.default_allocator);
const w = result_writer.writer();
@@ -63,7 +69,8 @@ pub export fn dumpBtjsTrace() [*:0]const u8 {
}).ptr);
}
pub fn printSourceAtAddress(debug_info: *std.debug.SelfInfo, out_stream: anytype, address: usize, tty_config: std.io.tty.Config, fp: usize) !void {
fn printSourceAtAddress(debug_info: *std.debug.SelfInfo, out_stream: anytype, address: usize, tty_config: std.io.tty.Config, fp: usize) !void {
if (!bun.Environment.isDebug) unreachable;
const module = debug_info.getModuleForAddress(address) catch |err| switch (err) {
error.MissingDebugInfo, error.InvalidDebugInfo => return printUnknownSource(debug_info, out_stream, address, tty_config),
else => return err,
@@ -114,6 +121,7 @@ pub fn printSourceAtAddress(debug_info: *std.debug.SelfInfo, out_stream: anytype
}
fn printUnknownSource(debug_info: *std.debug.SelfInfo, out_stream: anytype, address: usize, tty_config: std.io.tty.Config) !void {
if (!bun.Environment.isDebug) unreachable;
const module_name = debug_info.getModuleNameForAddress(address);
return printLineInfo(
out_stream,
@@ -136,6 +144,8 @@ fn printLineInfo(
comptime printLineFromFile: anytype,
do_llint: bool,
) !void {
if (!bun.Environment.isDebug) unreachable;
nosuspend {
try tty_config.setColor(out_stream, .bold);
@@ -176,6 +186,8 @@ fn printLineInfo(
}
fn printLineFromFileAnyOs(out_stream: anytype, source_location: std.debug.SourceLocation) !void {
if (!bun.Environment.isDebug) unreachable;
// Need this to always block even in async I/O mode, because this could potentially
// be called from e.g. the event loop code crashing.
var f = try std.fs.cwd().openFile(source_location.file_name, .{});
@@ -230,6 +242,7 @@ fn printLineFromFileAnyOs(out_stream: anytype, source_location: std.debug.Source
}
fn printLastUnwindError(it: *std.debug.StackIterator, debug_info: *std.debug.SelfInfo, out_stream: anytype, tty_config: std.io.tty.Config) void {
if (!bun.Environment.isDebug) unreachable;
if (!std.debug.have_ucontext) return;
if (it.getLastError()) |unwind_error| {
printUnwindError(debug_info, out_stream, unwind_error.address, unwind_error.err, tty_config) catch {};
@@ -237,6 +250,8 @@ fn printLastUnwindError(it: *std.debug.StackIterator, debug_info: *std.debug.Sel
}
fn printUnwindError(debug_info: *std.debug.SelfInfo, out_stream: anytype, address: usize, err: std.debug.UnwindError, tty_config: std.io.tty.Config) !void {
if (!bun.Environment.isDebug) unreachable;
const module_name = debug_info.getModuleNameForAddress(address) orelse "???";
try tty_config.setColor(out_stream, .dim);
if (err == error.MissingDebugInfo) {


@@ -360,7 +360,7 @@ pub const TablePrinter = struct {
return;
}
if (row_value.isObject()) {
if (row_value.getObject()) |obj| {
// object ->
// - if "properties" arg was provided: iterate the already-created columns (except for the 0-th which is the index)
// - otherwise: iterate the object properties, and create the columns on-demand
@@ -374,7 +374,7 @@ pub const TablePrinter = struct {
var cols_iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = false,
.include_value = true,
}).init(this.globalObject, row_value);
}).init(this.globalObject, obj);
defer cols_iter.deinit();
while (try cols_iter.next()) |col_key| {
@@ -554,10 +554,11 @@ pub const TablePrinter = struct {
}.callback);
if (ctx_.err) return error.JSError;
} else {
const tabular_obj = try this.tabular_data.toObject(globalObject);
var rows_iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = false,
.include_value = true,
}).init(globalObject, this.tabular_data);
}).init(globalObject, tabular_obj);
defer rows_iter.deinit();
while (try rows_iter.next()) |row_key| {
@@ -627,10 +628,13 @@ pub const TablePrinter = struct {
}.callback);
if (ctx_.err) return error.JSError;
} else {
const cell = this.tabular_data.toCell() orelse {
return globalObject.throwTypeError("tabular_data must be an object or array", .{});
};
var rows_iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = false,
.include_value = true,
}).init(globalObject, this.tabular_data);
}).init(globalObject, cell.toObject(globalObject));
defer rows_iter.deinit();
while (try rows_iter.next()) |row_key| {
@@ -3077,14 +3081,15 @@ pub const Formatter = struct {
if (value.get_unsafe(this.globalThis, "props")) |props| {
const prev_quote_strings = this.quote_strings;
this.quote_strings = true;
defer this.quote_strings = prev_quote_strings;
this.quote_strings = true;
// SAFETY: JSX props are always objects
const props_obj = props.getObject().?;
var props_iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = true,
.include_value = true,
}).init(this.globalThis, props);
}).init(this.globalThis, props_obj);
defer props_iter.deinit();
const children_prop = props.get_unsafe(this.globalThis, "children");


@@ -160,7 +160,7 @@ pub const RuntimeTranspilerCache = struct {
output_code: OutputCode,
exports_kind: bun.JSAst.ExportsKind,
) !void {
var tracer = bun.tracy.traceNamed(@src(), "RuntimeTranspilerCache.save");
var tracer = bun.perf.trace("RuntimeTranspilerCache.save");
defer tracer.end();
// atomically write to a tmpfile and then move it to the final destination
@@ -457,7 +457,7 @@ pub const RuntimeTranspilerCache = struct {
sourcemap_allocator: std.mem.Allocator,
output_code_allocator: std.mem.Allocator,
) !Entry {
var tracer = bun.tracy.traceNamed(@src(), "RuntimeTranspilerCache.fromFile");
var tracer = bun.perf.trace("RuntimeTranspilerCache.fromFile");
defer tracer.end();
var cache_file_path_buf: bun.PathBuffer = undefined;
@@ -531,7 +531,7 @@ pub const RuntimeTranspilerCache = struct {
source_code: bun.String,
exports_kind: bun.JSAst.ExportsKind,
) !void {
var tracer = bun.tracy.traceNamed(@src(), "RuntimeTranspilerCache.toFile");
var tracer = bun.perf.trace("RuntimeTranspilerCache.toFile");
defer tracer.end();
var cache_file_path_buf: bun.PathBuffer = undefined;

File diff suppressed because it is too large


@@ -0,0 +1,638 @@
pub fn newCString(globalThis: *JSGlobalObject, value: JSValue, byteOffset: ?JSValue, lengthValue: ?JSValue) JSC.JSValue {
switch (FFIObject.getPtrSlice(globalThis, value, byteOffset, lengthValue)) {
.err => |err| {
return err;
},
.slice => |slice| {
return bun.String.createUTF8ForJS(globalThis, slice);
},
}
}
pub const dom_call = JSC.DOMCall("FFI", @This(), "ptr", JSC.DOMEffect.forRead(.TypedArrayProperties));
pub fn toJS(globalObject: *JSC.JSGlobalObject) JSC.JSValue {
const object = JSC.JSValue.createEmptyObject(globalObject, comptime std.meta.fieldNames(@TypeOf(fields)).len + 2);
inline for (comptime std.meta.fieldNames(@TypeOf(fields))) |field| {
object.put(
globalObject,
comptime ZigString.static(field),
JSC.createCallback(globalObject, comptime ZigString.static(field), 1, comptime @field(fields, field)),
);
}
dom_call.put(globalObject, object);
object.put(globalObject, ZigString.static("read"), Reader.toJS(globalObject));
return object;
}
pub const Reader = struct {
pub const DOMCalls = .{
.u8 = JSC.DOMCall("Reader", @This(), "u8", JSC.DOMEffect.forRead(.World)),
.u16 = JSC.DOMCall("Reader", @This(), "u16", JSC.DOMEffect.forRead(.World)),
.u32 = JSC.DOMCall("Reader", @This(), "u32", JSC.DOMEffect.forRead(.World)),
.ptr = JSC.DOMCall("Reader", @This(), "ptr", JSC.DOMEffect.forRead(.World)),
.i8 = JSC.DOMCall("Reader", @This(), "i8", JSC.DOMEffect.forRead(.World)),
.i16 = JSC.DOMCall("Reader", @This(), "i16", JSC.DOMEffect.forRead(.World)),
.i32 = JSC.DOMCall("Reader", @This(), "i32", JSC.DOMEffect.forRead(.World)),
.i64 = JSC.DOMCall("Reader", @This(), "i64", JSC.DOMEffect.forRead(.World)),
.u64 = JSC.DOMCall("Reader", @This(), "u64", JSC.DOMEffect.forRead(.World)),
.intptr = JSC.DOMCall("Reader", @This(), "intptr", JSC.DOMEffect.forRead(.World)),
.f32 = JSC.DOMCall("Reader", @This(), "f32", JSC.DOMEffect.forRead(.World)),
.f64 = JSC.DOMCall("Reader", @This(), "f64", JSC.DOMEffect.forRead(.World)),
};
pub fn toJS(globalThis: *JSC.JSGlobalObject) JSC.JSValue {
const obj = JSC.JSValue.createEmptyObject(globalThis, std.meta.fieldNames(@TypeOf(Reader.DOMCalls)).len);
inline for (comptime std.meta.fieldNames(@TypeOf(Reader.DOMCalls))) |field| {
@field(Reader.DOMCalls, field).put(globalThis, obj);
}
return obj;
}
pub fn @"u8"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) u8, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"u16"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) u16, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"u32"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) u32, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn ptr(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) u64, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"i8"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) i8, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"i16"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) i16, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"i32"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) i32, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn intptr(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) i64, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"f32"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) f32, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"f64"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) f64, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn @"i64"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) i64, @ptrFromInt(addr)).*;
return JSValue.fromInt64NoTruncate(globalObject, value);
}
pub fn @"u64"(
globalObject: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) bun.JSError!JSValue {
if (arguments.len == 0 or !arguments[0].isNumber()) {
return globalObject.throwInvalidArguments("Expected a pointer", .{});
}
const addr = arguments[0].asPtrAddress() + if (arguments.len > 1) @as(usize, @intCast(arguments[1].to(i32))) else @as(usize, 0);
const value = @as(*align(1) u64, @ptrFromInt(addr)).*;
return JSValue.fromUInt64NoTruncate(globalObject, value);
}
pub fn u8WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) u8, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn u16WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) u16, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn u32WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) u32, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn ptrWithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) u64, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn i8WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) i8, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn i16WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) i16, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn i32WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) i32, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn intptrWithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) i64, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn f32WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) f32, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn f64WithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) f64, @ptrFromInt(addr)).*;
return JSValue.jsNumber(value);
}
pub fn u64WithoutTypeChecks(
global: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) u64, @ptrFromInt(addr)).*;
return JSValue.fromUInt64NoTruncate(global, value);
}
pub fn i64WithoutTypeChecks(
global: *JSGlobalObject,
_: *anyopaque,
raw_addr: i64,
offset: i32,
) callconv(JSC.conv) JSValue {
const addr = @as(usize, @intCast(raw_addr)) + @as(usize, @intCast(offset));
const value = @as(*align(1) i64, @ptrFromInt(addr)).*;
return JSValue.fromInt64NoTruncate(global, value);
}
};
pub fn ptr(
globalThis: *JSGlobalObject,
_: JSValue,
arguments: []const JSValue,
) JSValue {
return switch (arguments.len) {
0 => ptr_(globalThis, JSValue.zero, null),
1 => ptr_(globalThis, arguments[0], null),
else => ptr_(globalThis, arguments[0], arguments[1]),
};
}
pub fn ptrWithoutTypeChecks(
_: *JSGlobalObject,
_: *anyopaque,
array: *JSC.JSUint8Array,
) callconv(JSC.conv) JSValue {
return JSValue.fromPtrAddress(@intFromPtr(array.ptr()));
}
fn ptr_(
globalThis: *JSGlobalObject,
value: JSValue,
byteOffset: ?JSValue,
) JSValue {
if (value == .zero) {
return JSC.JSValue.jsNull();
}
const array_buffer = value.asArrayBuffer(globalThis) orelse {
return JSC.toInvalidArguments("Expected ArrayBufferView but received {s}", .{@tagName(value.jsType())}, globalThis);
};
if (array_buffer.len == 0) {
return JSC.toInvalidArguments("ArrayBufferView must have a length > 0. A pointer to empty memory doesn't work", .{}, globalThis);
}
var addr: usize = @intFromPtr(array_buffer.ptr);
// const Sizes = @import("../bindings/sizes.zig");
// assert(addr == @intFromPtr(value.asEncoded().ptr) + Sizes.Bun_FFI_PointerOffsetToTypedArrayVector);
if (byteOffset) |off| {
if (!off.isEmptyOrUndefinedOrNull()) {
if (!off.isNumber()) {
return JSC.toInvalidArguments("Expected number for byteOffset", .{}, globalThis);
}
}
const bytei64 = off.toInt64();
if (bytei64 < 0) {
addr -|= @as(usize, @intCast(bytei64 * -1));
} else {
addr += @as(usize, @intCast(bytei64));
}
if (addr > @intFromPtr(array_buffer.ptr) + @as(usize, array_buffer.byte_len)) {
return JSC.toInvalidArguments("byteOffset out of bounds", .{}, globalThis);
}
}
if (addr > max_addressable_memory) {
return JSC.toInvalidArguments("Pointer is outside max addressable memory, which usually means a bug in your program.", .{}, globalThis);
}
if (addr == 0) {
return JSC.toInvalidArguments("Pointer must not be 0", .{}, globalThis);
}
if (addr == 0xDEADBEEF or addr == 0xaaaaaaaa or addr == 0xAAAAAAAA) {
return JSC.toInvalidArguments("ptr to invalid memory, that would segfault Bun :(", .{}, globalThis);
}
if (comptime Environment.allow_assert) {
assert(JSC.JSValue.fromPtrAddress(addr).asPtrAddress() == addr);
}
return JSC.JSValue.fromPtrAddress(addr);
}
const ValueOrError = union(enum) {
err: JSValue,
slice: []u8,
};
pub fn getPtrSlice(globalThis: *JSGlobalObject, value: JSValue, byteOffset: ?JSValue, byteLength: ?JSValue) ValueOrError {
if (!value.isNumber()) {
return .{ .err = JSC.toInvalidArguments("ptr must be a number.", .{}, globalThis) };
}
const num = value.asPtrAddress();
if (num == 0) {
return .{ .err = JSC.toInvalidArguments("ptr cannot be zero, that would segfault Bun :(", .{}, globalThis) };
}
// if (!std.math.isFinite(num)) {
// return .{ .err = JSC.toInvalidArguments("ptr must be a finite number.", .{}, globalThis) };
// }
var addr = @as(usize, @bitCast(num));
if (byteOffset) |byte_off| {
if (byte_off.isNumber()) {
const off = byte_off.toInt64();
if (off < 0) {
addr -|= @as(usize, @intCast(off * -1));
} else {
addr +|= @as(usize, @intCast(off));
}
if (addr == 0) {
return .{ .err = JSC.toInvalidArguments("ptr cannot be zero, that would segfault Bun :(", .{}, globalThis) };
}
if (!std.math.isFinite(byte_off.asNumber())) {
return .{ .err = JSC.toInvalidArguments("ptr must be a finite number.", .{}, globalThis) };
}
} else if (!byte_off.isEmptyOrUndefinedOrNull()) {
// do nothing
} else {
return .{ .err = JSC.toInvalidArguments("Expected number for byteOffset", .{}, globalThis) };
}
}
if (addr == 0xDEADBEEF or addr == 0xaaaaaaaa or addr == 0xAAAAAAAA) {
return .{ .err = JSC.toInvalidArguments("ptr to invalid memory, that would segfault Bun :(", .{}, globalThis) };
}
if (byteLength) |valueLength| {
if (!valueLength.isEmptyOrUndefinedOrNull()) {
if (!valueLength.isNumber()) {
return .{ .err = JSC.toInvalidArguments("length must be a number.", .{}, globalThis) };
}
if (valueLength.asNumber() == 0.0) {
return .{ .err = JSC.toInvalidArguments("length must be > 0. This usually means a bug in your code.", .{}, globalThis) };
}
const length_i = valueLength.toInt64();
if (length_i < 0) {
return .{ .err = JSC.toInvalidArguments("length must be > 0. This usually means a bug in your code.", .{}, globalThis) };
}
if (length_i > max_addressable_memory) {
return .{ .err = JSC.toInvalidArguments("length exceeds max addressable memory. This usually means a bug in your code.", .{}, globalThis) };
}
const length = @as(usize, @intCast(length_i));
return .{ .slice = @as([*]u8, @ptrFromInt(addr))[0..length] };
}
}
return .{ .slice = bun.span(@as([*:0]u8, @ptrFromInt(addr))) };
}
fn getCPtr(value: JSValue) ?usize {
// pointer to C function
if (value.isNumber()) {
const addr = value.asPtrAddress();
if (addr > 0) return addr;
} else if (value.isBigInt()) {
const addr = @as(u64, @bitCast(value.toUInt64NoTruncate()));
if (addr > 0) {
return addr;
}
}
return null;
}
pub fn toArrayBuffer(
globalThis: *JSGlobalObject,
value: JSValue,
byteOffset: ?JSValue,
valueLength: ?JSValue,
finalizationCtxOrPtr: ?JSValue,
finalizationCallback: ?JSValue,
) JSC.JSValue {
switch (getPtrSlice(globalThis, value, byteOffset, valueLength)) {
.err => |err| {
return err;
},
.slice => |slice| {
var callback: JSC.C.JSTypedArrayBytesDeallocator = null;
var ctx: ?*anyopaque = null;
if (finalizationCallback) |callback_value| {
if (getCPtr(callback_value)) |callback_ptr| {
callback = @as(JSC.C.JSTypedArrayBytesDeallocator, @ptrFromInt(callback_ptr));
if (finalizationCtxOrPtr) |ctx_value| {
if (getCPtr(ctx_value)) |ctx_ptr| {
ctx = @as(*anyopaque, @ptrFromInt(ctx_ptr));
} else if (!ctx_value.isUndefinedOrNull()) {
return JSC.toInvalidArguments("Expected user data to be a C pointer (number or BigInt)", .{}, globalThis);
}
}
} else if (!callback_value.isEmptyOrUndefinedOrNull()) {
return JSC.toInvalidArguments("Expected callback to be a C pointer (number or BigInt)", .{}, globalThis);
}
} else if (finalizationCtxOrPtr) |callback_value| {
if (getCPtr(callback_value)) |callback_ptr| {
callback = @as(JSC.C.JSTypedArrayBytesDeallocator, @ptrFromInt(callback_ptr));
} else if (!callback_value.isEmptyOrUndefinedOrNull()) {
return JSC.toInvalidArguments("Expected callback to be a C pointer (number or BigInt)", .{}, globalThis);
}
}
return JSC.ArrayBuffer.fromBytes(slice, JSC.JSValue.JSType.ArrayBuffer).toJSWithContext(globalThis, ctx, callback, null);
},
}
}
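toArrayBuffer wraps the raw memory behind a pointer in an ArrayBuffer without copying, optionally attaching a C deallocator. Python's ctypes can sketch the same zero-copy aliasing of an address (an illustration of the idea, not of Bun's implementation):

```python
import ctypes

backing = ctypes.create_string_buffer(b"hello")  # owns 6 bytes (incl. NUL)
addr = ctypes.addressof(backing)

# Build a view over the same memory from the raw address, with no copy --
# roughly what toArrayBuffer does with the slice from getPtrSlice.
view = (ctypes.c_char * 5).from_address(addr)
print(view.raw)  # b'hello'

# Writes through the view are visible in the original buffer,
# because both alias the same bytes.
view[0] = b"H"
print(backing.value)  # b'Hello'
```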
pub fn toBuffer(
globalThis: *JSGlobalObject,
value: JSValue,
byteOffset: ?JSValue,
valueLength: ?JSValue,
finalizationCtxOrPtr: ?JSValue,
finalizationCallback: ?JSValue,
) JSC.JSValue {
switch (getPtrSlice(globalThis, value, byteOffset, valueLength)) {
.err => |err| {
return err;
},
.slice => |slice| {
var callback: JSC.C.JSTypedArrayBytesDeallocator = null;
var ctx: ?*anyopaque = null;
if (finalizationCallback) |callback_value| {
if (getCPtr(callback_value)) |callback_ptr| {
callback = @as(JSC.C.JSTypedArrayBytesDeallocator, @ptrFromInt(callback_ptr));
if (finalizationCtxOrPtr) |ctx_value| {
if (getCPtr(ctx_value)) |ctx_ptr| {
ctx = @as(*anyopaque, @ptrFromInt(ctx_ptr));
} else if (!ctx_value.isEmptyOrUndefinedOrNull()) {
return JSC.toInvalidArguments("Expected user data to be a C pointer (number or BigInt)", .{}, globalThis);
}
}
} else if (!callback_value.isEmptyOrUndefinedOrNull()) {
return JSC.toInvalidArguments("Expected callback to be a C pointer (number or BigInt)", .{}, globalThis);
}
} else if (finalizationCtxOrPtr) |callback_value| {
if (getCPtr(callback_value)) |callback_ptr| {
callback = @as(JSC.C.JSTypedArrayBytesDeallocator, @ptrFromInt(callback_ptr));
} else if (!callback_value.isEmptyOrUndefinedOrNull()) {
return JSC.toInvalidArguments("Expected callback to be a C pointer (number or BigInt)", .{}, globalThis);
}
}
if (callback != null or ctx != null) {
return JSC.JSValue.createBufferWithCtx(globalThis, slice, ctx, callback);
}
return JSC.JSValue.createBuffer(globalThis, slice, null);
},
}
}
pub fn toCStringBuffer(
globalThis: *JSGlobalObject,
value: JSValue,
byteOffset: ?JSValue,
valueLength: ?JSValue,
) JSC.JSValue {
switch (getPtrSlice(globalThis, value, byteOffset, valueLength)) {
.err => |err| {
return err;
},
.slice => |slice| {
return JSC.JSValue.createBuffer(globalThis, slice, null);
},
}
}
pub fn getter(
globalObject: *JSC.JSGlobalObject,
_: *JSC.JSObject,
) JSC.JSValue {
return FFIObject.toJS(globalObject);
}
const fields = .{
.viewSource = JSC.wrapStaticMethod(
JSC.FFI,
"print",
false,
),
.dlopen = JSC.wrapStaticMethod(JSC.FFI, "open", false),
.callback = JSC.wrapStaticMethod(JSC.FFI, "callback", false),
.linkSymbols = JSC.wrapStaticMethod(JSC.FFI, "linkSymbols", false),
.toBuffer = JSC.wrapStaticMethod(@This(), "toBuffer", false),
.toArrayBuffer = JSC.wrapStaticMethod(@This(), "toArrayBuffer", false),
.closeCallback = JSC.wrapStaticMethod(JSC.FFI, "closeCallback", false),
.CString = JSC.wrapStaticMethod(Bun.FFIObject, "newCString", false),
};
const max_addressable_memory = std.math.maxInt(u56);
const JSGlobalObject = JSC.JSGlobalObject;
const JSObject = JSC.JSObject;
const JSValue = JSC.JSValue;
const JSC = bun.JSC;
const bun = @import("root").bun;
const FFIObject = @This();
const Bun = JSC.API.Bun;
const Environment = bun.Environment;
const std = @import("std");
const assert = bun.assert;
const ZigString = JSC.ZigString;

View File

@@ -0,0 +1,144 @@
pub const wyhash = hashWrap(std.hash.Wyhash);
pub const adler32 = hashWrap(std.hash.Adler32);
pub const crc32 = hashWrap(std.hash.Crc32);
pub const cityHash32 = hashWrap(std.hash.CityHash32);
pub const cityHash64 = hashWrap(std.hash.CityHash64);
pub const xxHash32 = hashWrap(struct {
pub fn hash(seed: u32, bytes: []const u8) u32 {
// sidestep .hash taking in anytype breaking ArgTuple
// downstream by forcing a type signature on the input
return std.hash.XxHash32.hash(seed, bytes);
}
});
pub const xxHash64 = hashWrap(struct {
pub fn hash(seed: u32, bytes: []const u8) u64 {
// sidestep .hash taking in anytype breaking ArgTuple
// downstream by forcing a type signature on the input
return std.hash.XxHash64.hash(seed, bytes);
}
});
pub const xxHash3 = hashWrap(struct {
pub fn hash(seed: u32, bytes: []const u8) u64 {
// sidestep .hash taking in anytype breaking ArgTuple
// downstream by forcing a type signature on the input
return std.hash.XxHash3.hash(seed, bytes);
}
});
pub const murmur32v2 = hashWrap(std.hash.murmur.Murmur2_32);
pub const murmur32v3 = hashWrap(std.hash.murmur.Murmur3_32);
pub const murmur64v2 = hashWrap(std.hash.murmur.Murmur2_64);
pub fn create(globalThis: *JSC.JSGlobalObject) JSC.JSValue {
const function = JSC.createCallback(globalThis, ZigString.static("hash"), 1, wyhash);
const fns = comptime .{
"wyhash",
"adler32",
"crc32",
"cityHash32",
"cityHash64",
"xxHash32",
"xxHash64",
"xxHash3",
"murmur32v2",
"murmur32v3",
"murmur64v2",
};
inline for (fns) |name| {
const value = JSC.createCallback(
globalThis,
ZigString.static(name),
1,
@field(HashObject, name),
);
function.put(globalThis, comptime ZigString.static(name), value);
}
return function;
}
fn hashWrap(comptime Hasher_: anytype) JSC.JSHostZigFunction {
return struct {
const Hasher = Hasher_;
pub fn hash(globalThis: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const arguments = callframe.arguments_old(2).slice();
var args = JSC.Node.ArgumentsSlice.init(globalThis.bunVM(), arguments);
defer args.deinit();
var input: []const u8 = "";
var input_slice = ZigString.Slice.empty;
defer input_slice.deinit();
if (args.nextEat()) |arg| {
if (arg.as(JSC.WebCore.Blob)) |blob| {
// TODO: files
input = blob.sharedView();
} else {
switch (arg.jsTypeLoose()) {
.ArrayBuffer,
.Int8Array,
.Uint8Array,
.Uint8ClampedArray,
.Int16Array,
.Uint16Array,
.Int32Array,
.Uint32Array,
.Float16Array,
.Float32Array,
.Float64Array,
.BigInt64Array,
.BigUint64Array,
.DataView,
=> {
var array_buffer = arg.asArrayBuffer(globalThis) orelse {
return globalThis.throwInvalidArguments("ArrayBuffer conversion error", .{});
};
input = array_buffer.byteSlice();
},
else => {
input_slice = try arg.toSlice(globalThis, bun.default_allocator);
input = input_slice.slice();
},
}
}
}
// std.hash has inconsistent interfaces
//
const Function = if (@hasDecl(Hasher, "hashWithSeed")) Hasher.hashWithSeed else Hasher.hash;
var function_args: std.meta.ArgsTuple(@TypeOf(Function)) = undefined;
if (comptime std.meta.fields(std.meta.ArgsTuple(@TypeOf(Function))).len == 1) {
return JSC.JSValue.jsNumber(Function(input));
} else {
var seed: u64 = 0;
if (args.nextEat()) |arg| {
if (arg.isNumber() or arg.isBigInt()) {
seed = arg.toUInt64NoTruncate();
}
}
if (comptime bun.trait.isNumber(@TypeOf(function_args[0]))) {
function_args[0] = @as(@TypeOf(function_args[0]), @truncate(seed));
function_args[1] = input;
} else {
function_args[0] = input;
function_args[1] = @as(@TypeOf(function_args[1]), @truncate(seed));
}
const value = @call(.auto, Function, function_args);
if (@TypeOf(value) == u32) {
return JSC.JSValue.jsNumber(@as(u32, @bitCast(value)));
}
return JSC.JSValue.fromUInt64NoTruncate(globalThis, value);
}
}
}.hash;
}
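hashWrap papers over std.hash's two interfaces: seedless `hash(input)` versus `hash(seed, input)` (with the seed sometimes first, sometimes last). Several of the wrapped algorithms also exist in Python's zlib, which is handy for cross-checking outputs; note zlib's second argument is a running value rather than a true seed, so this is only an approximate analogue:

```python
import zlib

def crc32(data: bytes, seed: int = 0) -> int:
    # zlib.crc32 takes an optional running value, analogous to a seed
    return zlib.crc32(data, seed) & 0xFFFFFFFF

def adler32(data: bytes, seed: int = 1) -> int:
    # Adler-32 starts at 1, not 0
    return zlib.adler32(data, seed) & 0xFFFFFFFF

print(hex(crc32(b"hello")))  # 0x3610a686
print(adler32(b"hello"))     # 103547413
```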
const HashObject = @This();
const JSC = bun.JSC;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const JSObject = JSC.JSObject;
const std = @import("std");
const bun = @import("root").bun;
const ZigString = JSC.ZigString;

View File

@@ -426,10 +426,6 @@ pub const JSBundler = struct {
}
if (try config.getOwnObject(globalThis, "define")) |define| {
if (!define.isObject()) {
return globalThis.throwInvalidArguments("define must be an object", .{});
}
var define_iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = true,
.include_value = true,
@@ -956,7 +952,7 @@ pub const JSBundler = struct {
is_onLoad: bool,
) bool {
JSC.markBinding(@src());
const tracer = bun.tracy.traceNamed(@src(), "JSBundler.hasAnyMatches");
const tracer = bun.perf.trace("JSBundler.hasAnyMatches");
defer tracer.end();
const namespace_string = if (path.isFile())
@@ -978,7 +974,7 @@ pub const JSBundler = struct {
is_server_side: bool,
) void {
JSC.markBinding(@src());
const tracer = bun.tracy.traceNamed(@src(), "JSBundler.matchOnLoad");
const tracer = bun.perf.trace("JSBundler.matchOnLoad");
defer tracer.end();
debug("JSBundler.matchOnLoad(0x{x}, {s}, {s})", .{ @intFromPtr(this), namespace, path });
const namespace_string = if (namespace.len == 0)
@@ -1000,7 +996,7 @@ pub const JSBundler = struct {
import_record_kind: bun.ImportKind,
) void {
JSC.markBinding(@src());
const tracer = bun.tracy.traceNamed(@src(), "JSBundler.matchOnResolve");
const tracer = bun.perf.trace("JSBundler.matchOnResolve");
defer tracer.end();
const namespace_string = if (strings.eqlComptime(namespace, "file"))
bun.String.empty
@@ -1023,7 +1019,7 @@ pub const JSBundler = struct {
is_bake: bool,
) !JSValue {
JSC.markBinding(@src());
const tracer = bun.tracy.traceNamed(@src(), "JSBundler.addPlugin");
const tracer = bun.perf.trace("JSBundler.addPlugin");
defer tracer.end();
return JSBundlerPlugin__runSetupFunction(
this,

View File

@@ -0,0 +1,279 @@
const std = @import("std");
const bun = @import("root").bun;
const string = bun.string;
const SourceMap = @import("../../sourcemap/sourcemap.zig");
const JSC = bun.JSC;
const ParsedSourceMap = SourceMap.ParsedSourceMap;
const Mapping = SourceMap.Mapping;
const ZigString = JSC.ZigString;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
parsedSourceMap: *ParsedSourceMap,
lineLengths: ?[]u32 = null,
pub usingnamespace bun.New(JSParsedSourceMap);
pub usingnamespace JSC.Codegen.JSSourceMap;
pub fn getSourceMapConstructor(globalObject: *JSGlobalObject) JSValue {
// The constructor function is generated automatically by WebKit/JSC through the codegen system
return @This().getConstructor(globalObject);
}
pub fn estimatedSize(this: *JSParsedSourceMap) usize {
var size = @sizeOf(JSParsedSourceMap);
// Add size of ParsedSourceMap
size += @sizeOf(ParsedSourceMap);
// Add size of line lengths if present
if (this.lineLengths) |lengths| {
size += lengths.len * @sizeOf(u32);
}
// Add size of mappings
size += this.parsedSourceMap.mappings.len * @sizeOf(SourceMap.MappingItem);
// Add size of sources
for (this.parsedSourceMap.external_source_names) |source| {
size += source.len;
}
return size;
}
pub fn finalize(this: *JSParsedSourceMap) void {
if (this.lineLengths) |lengths| {
bun.default_allocator.free(lengths);
}
this.parsedSourceMap.deref();
this.destroy();
}
pub fn constructor(globalObject: *JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!*JSParsedSourceMap {
const arguments = callframe.arguments();
if (arguments.len < 1 or !arguments.ptr[0].isObject()) {
return globalObject.throw("SourceMap constructor requires a SourceMapV3 payload object", .{});
}
const payload = arguments.ptr[0];
var lineLengths: ?[]u32 = null;
// Check for options argument
if (arguments.len > 1 and arguments.ptr[1].isObject()) {
const options = arguments.ptr[1];
if (try options.get(globalObject, "lineLengths")) |lineLengthsValue| {
if (lineLengthsValue.isUndefinedOrNull()) {
// Leave lineLengths as null
} else if (lineLengthsValue.isArray()) {
const array_len = lineLengthsValue.getLength(globalObject);
if (globalObject.hasException()) return error.JSError;
const allocator = bun.default_allocator;
lineLengths = allocator.alloc(u32, array_len) catch bun.outOfMemory();
for (0..array_len) |i| {
const val = lineLengthsValue.getIndex(globalObject, i);
lineLengths.?[i] = val.coerce(u32, globalObject);
if (globalObject.hasException()) {
allocator.free(lineLengths.?);
return error.JSError;
}
}
} else {
return globalObject.throw("lineLengths must be an array of numbers", .{});
}
}
}
// Get the SourceMapV3 properties
const version = try payload.get(globalObject, "version");
if (version == null or !version.?.isNumber() or version.?.toInt32() != 3) {
return globalObject.throw("SourceMap version must be 3", .{});
}
const sources = try payload.get(globalObject, "sources");
if (sources == null or !sources.?.isArray()) {
return globalObject.throw("SourceMap sources must be an array", .{});
}
const mappings = try payload.get(globalObject, "mappings");
if (mappings == null or !mappings.?.isString()) {
return globalObject.throw("SourceMap mappings must be a string", .{});
}
const mappings_str = try mappings.?.toBunString(globalObject);
defer mappings_str.deref();
const mappings_bytes = mappings_str.byteSlice();
var arena = std.heap.ArenaAllocator.init(bun.default_allocator);
defer arena.deinit();
const sources_array_len = sources.?.getLength(globalObject);
const source_paths_slice = bun.default_allocator.alloc([]const u8, sources_array_len) catch bun.outOfMemory();
var i: usize = 0;
errdefer {
for (0..i) |j| {
bun.default_allocator.free(source_paths_slice[j]);
}
bun.default_allocator.free(source_paths_slice);
}
while (i < sources_array_len) : (i += 1) {
const source = sources.?.getIndex(globalObject, i);
if (globalObject.hasException()) return error.JSError;
const source_str = try source.toBunString(globalObject);
defer source_str.deref();
source_paths_slice[i] = bun.default_allocator.dupe(u8, source_str.byteSlice()) catch bun.outOfMemory();
}
// Parse the mappings
const map_data = switch (Mapping.parse(
bun.default_allocator,
mappings_bytes,
null,
std.math.maxInt(i32),
@intCast(sources_array_len),
)) {
.success => |x| x,
.fail => |fail| {
return globalObject.throw("Failed to parse SourceMap mappings: {s} ({s})", .{ fail.msg, @errorName(fail.err) });
},
};
const ptr = ParsedSourceMap.new(map_data);
ptr.external_source_names = source_paths_slice;
return JSParsedSourceMap.new(.{
.parsedSourceMap = ptr,
.lineLengths = lineLengths,
});
}
pub fn getPayload(this: *JSParsedSourceMap, globalObject: *JSGlobalObject) JSValue {
const map = this.parsedSourceMap;
const object = JSValue.createEmptyObject(globalObject, 5);
// version field
object.put(globalObject, "version", JSValue.jsNumber(3));
// sources field
const sources = JSValue.createEmptyArray(globalObject, map.external_source_names.len);
for (map.external_source_names, 0..) |source, i| {
sources.putIndex(globalObject, @intCast(i), ZigString.init(source).toJS(globalObject));
}
object.put(globalObject, "sources", sources);
// sourcesContent field
const sourcesContent = JSValue.createEmptyArray(globalObject, map.external_source_names.len);
for (map.external_source_names, 0..) |source, i| {
sourcesContent.putIndex(globalObject, @intCast(i), bun.String.createUTF8ForJS(globalObject, source));
}
object.put(globalObject, "sourcesContent", sourcesContent);
// names field
object.put(globalObject, "names", JSValue.createEmptyArray(globalObject, 0));
// mappings field
var buf = std.ArrayList(u8).init(bun.default_allocator);
defer buf.deinit();
map.writeVLQs(buf.writer()) catch bun.outOfMemory();
object.put(globalObject, "mappings", bun.String.createUTF8ForJS(globalObject, buf.items));
return object;
}
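getPayload re-serializes the parsed mappings back into the `mappings` string via `writeVLQs`; that string is a sequence of base64 VLQ segments (the inverse of what `Mapping.parse` consumes). A minimal encoder sketch of the VLQ scheme, for illustration:

```python
B64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

def vlq_encode(value: int) -> str:
    # The sign goes in the low bit; each base64 digit then carries 5
    # payload bits plus a continuation bit (0x20) when more follow.
    v = ((-value) << 1) | 1 if value < 0 else value << 1
    out = []
    while True:
        digit = v & 0x1F
        v >>= 5
        if v:
            digit |= 0x20  # continuation: more digits follow
        out.append(B64[digit])
        if not v:
            break
    return "".join(out)

print(vlq_encode(0))   # A
print(vlq_encode(1))   # C
print(vlq_encode(-1))  # D
print(vlq_encode(16))  # gB
```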
pub fn getLineLengths(this: *JSParsedSourceMap, globalObject: *JSGlobalObject) JSValue {
if (this.lineLengths) |lengths| {
const array = JSValue.createEmptyArray(globalObject, lengths.len);
for (lengths, 0..) |length, i| {
array.putIndex(globalObject, @intCast(i), JSValue.jsNumber(length));
}
return array;
}
return JSValue.jsUndefined();
}
pub fn findEntry(
this: *JSParsedSourceMap,
globalObject: *JSGlobalObject,
callframe: *JSC.CallFrame,
) bun.JSError!JSValue {
const args = callframe.arguments();
if (args.len < 2) {
return globalObject.throwNotEnoughArguments("findEntry", 2, args.len);
}
const lineOffset = try globalObject.validateIntegerRange(args[0], i32, 0, .{ .min = 1, .max = std.math.maxInt(i32) });
const columnOffset = try globalObject.validateIntegerRange(args[1], i32, 0, .{ .min = 0, .max = std.math.maxInt(i32) });
// Find the mapping
const mapping = Mapping.find(this.parsedSourceMap.mappings, lineOffset, columnOffset) orelse {
return JSValue.createEmptyObject(globalObject, 0);
};
// Create a SourceMapEntry object with the mapping data
const entryObj = JSValue.createEmptyObject(globalObject, 6);
entryObj.put(globalObject, "generatedLine", JSValue.jsNumber(mapping.generatedLine()));
entryObj.put(globalObject, "generatedColumn", JSValue.jsNumber(mapping.generatedColumn()));
if (mapping.sourceIndex() >= 0 and mapping.sourceIndex() < @as(i32, @intCast(this.parsedSourceMap.external_source_names.len))) {
const sourceName = this.parsedSourceMap.external_source_names[@intCast(mapping.sourceIndex())];
entryObj.put(globalObject, "originalSource", bun.String.createUTF8ForJS(globalObject, sourceName));
entryObj.put(globalObject, "originalLine", JSValue.jsNumber(mapping.originalLine()));
entryObj.put(globalObject, "originalColumn", JSValue.jsNumber(mapping.originalColumn()));
// TODO: Add name if available
// if (mapping.nameIndex() >= 0) {
// entryObj.put(globalObject, "name", JSValue.jsString(globalObject, ""));
// }
}
return entryObj;
}
pub fn findOrigin(
this: *JSParsedSourceMap,
globalObject: *JSGlobalObject,
callframe: *JSC.CallFrame,
) JSValue {
const args = callframe.arguments();
if (args.len < 2) {
return globalObject.throw("findOrigin requires lineNumber and columnNumber arguments", .{});
}
// Convert 1-indexed to 0-indexed
const lineNumber = args.ptr[0].toInt32() - 1;
const columnNumber = args.ptr[1].toInt32() - 1;
// Find the mapping
const mapping = Mapping.find(this.parsedSourceMap.mappings, lineNumber, columnNumber);
if (mapping == null) {
return JSValue.createEmptyObject(globalObject, 0);
}
// Create a SourceMapOrigin object with the mapping data
const originObj = JSValue.createEmptyObject(globalObject, 3);
if (mapping.?.sourceIndex() >= 0 and mapping.?.sourceIndex() < @as(i32, @intCast(this.parsedSourceMap.external_source_names.len))) {
const sourceName = this.parsedSourceMap.external_source_names[@intCast(mapping.?.sourceIndex())];
originObj.put(globalObject, "fileName", ZigString.init(sourceName).toJS(globalObject));
// Convert back to 1-indexed for the return value
originObj.put(globalObject, "lineNumber", JSValue.jsNumber(mapping.?.originalLine() + 1));
originObj.put(globalObject, "columnNumber", JSValue.jsNumber(mapping.?.originalColumn() + 1));
// TODO: Add name if available
}
return originObj;
}
const JSParsedSourceMap = @This();

View File

@@ -0,0 +1,31 @@
import { define } from "../../codegen/class-definitions";
export default {
JSSourceMap: define({
name: "SourceMap",
construct: true,
finalize: true,
estimatedSize: true,
klass: {},
proto: {
payload: {
getter: "getPayload",
enumerable: true,
},
lineLengths: {
getter: "getLineLengths",
enumerable: true,
},
findEntry: {
fn: "findEntry",
length: 2,
enumerable: true,
},
findOrigin: {
fn: "findOrigin",
length: 2,
enumerable: true,
},
},
}),
};

View File

@@ -332,15 +332,15 @@ fn transformOptionsFromJSC(globalObject: JSC.C.JSContextRef, temp_allocator: std
break :define;
}
if (!define.isObject()) {
const define_obj = define.getObject() orelse {
return globalObject.throwInvalidArguments("define must be an object", .{});
}
};
var define_iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = true,
.include_value = true,
}).init(globalThis, define);
}).init(globalThis, define_obj);
defer define_iter.deinit();
// cannot be a temporary because it may be loaded on different threads.
@@ -616,14 +616,14 @@ fn transformOptionsFromJSC(globalObject: JSC.C.JSContextRef, temp_allocator: std
}
if (try exports.getTruthy(globalThis, "replace")) |replace| {
if (!replace.isObject()) {
const replace_obj = replace.getObject() orelse {
return globalObject.throwInvalidArguments("replace must be an object", .{});
}
};
var iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = true,
.include_value = true,
}).init(globalThis, replace);
}).init(globalThis, replace_obj);
defer iter.deinit();
if (iter.len > 0) {

View File

@@ -0,0 +1,72 @@
pub fn create(globalThis: *JSC.JSGlobalObject) JSC.JSValue {
const object = JSValue.createEmptyObject(globalThis, 1);
object.put(
globalThis,
ZigString.static("parse"),
JSC.createCallback(
globalThis,
ZigString.static("parse"),
1,
parse,
),
);
return object;
}
pub fn parse(
globalThis: *JSC.JSGlobalObject,
callframe: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
var arena = bun.ArenaAllocator.init(globalThis.allocator());
const allocator = arena.allocator();
defer arena.deinit();
var log = logger.Log.init(default_allocator);
const arguments = callframe.arguments_old(1).slice();
if (arguments.len == 0 or arguments[0].isEmptyOrUndefinedOrNull()) {
return globalThis.throwInvalidArguments("Expected a string to parse", .{});
}
var input_slice = try arguments[0].toSlice(globalThis, bun.default_allocator);
defer input_slice.deinit();
var source = logger.Source.initPathString("input.toml", input_slice.slice());
const parse_result = TOMLParser.parse(&source, &log, allocator, false) catch {
return globalThis.throwValue(log.toJS(globalThis, default_allocator, "Failed to parse toml"));
};
// for now...
const buffer_writer = js_printer.BufferWriter.init(allocator) catch {
return globalThis.throwValue(log.toJS(globalThis, default_allocator, "Failed to print toml"));
};
var writer = js_printer.BufferPrinter.init(buffer_writer);
_ = js_printer.printJSON(
*js_printer.BufferPrinter,
&writer,
parse_result,
&source,
.{
.mangled_props = null,
},
) catch {
return globalThis.throwValue(log.toJS(globalThis, default_allocator, "Failed to print toml"));
};
const slice = writer.ctx.buffer.slice();
var out = bun.String.fromUTF8(slice);
defer out.deref();
return out.toJSByParseJSON(globalThis);
}
const TOMLObject = @This();
const JSC = bun.JSC;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const JSObject = JSC.JSObject;
const std = @import("std");
const ZigString = JSC.ZigString;
const logger = bun.logger;
const bun = @import("root").bun;
const js_printer = bun.js_printer;
const default_allocator = bun.default_allocator;
const TOMLParser = @import("../../toml/toml_parser.zig").TOML;

View File

@@ -0,0 +1,76 @@
pub fn create(globalThis: *JSC.JSGlobalObject) JSC.JSValue {
const object = JSValue.createEmptyObject(globalThis, 3);
const fields = comptime .{
.gcAggressionLevel = gcAggressionLevel,
.arrayBufferToString = arrayBufferToString,
.mimallocDump = dump_mimalloc,
};
inline for (comptime std.meta.fieldNames(@TypeOf(fields))) |name| {
object.put(
globalThis,
comptime ZigString.static(name),
JSC.createCallback(globalThis, comptime ZigString.static(name), 1, comptime @field(fields, name)),
);
}
return object;
}
pub fn gcAggressionLevel(
globalThis: *JSC.JSGlobalObject,
callframe: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
const ret = JSValue.jsNumber(@as(i32, @intFromEnum(globalThis.bunVM().aggressive_garbage_collection)));
const value = callframe.arguments_old(1).ptr[0];
if (!value.isEmptyOrUndefinedOrNull()) {
switch (value.coerce(i32, globalThis)) {
1 => globalThis.bunVM().aggressive_garbage_collection = .mild,
2 => globalThis.bunVM().aggressive_garbage_collection = .aggressive,
0 => globalThis.bunVM().aggressive_garbage_collection = .none,
else => {},
}
}
return ret;
}
pub fn arrayBufferToString(
globalThis: *JSC.JSGlobalObject,
callframe: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
const args = callframe.arguments_old(2).slice();
if (args.len < 1 or !args[0].isCell() or !args[0].jsType().isTypedArrayOrArrayBuffer()) {
return globalThis.throwInvalidArguments("Expected an ArrayBuffer", .{});
}
const array_buffer = JSC.ArrayBuffer.fromTypedArray(globalThis, args[0]);
switch (array_buffer.typed_array_type) {
.Uint16Array, .Int16Array => {
var zig_str = ZigString.init("");
zig_str._unsafe_ptr_do_not_use = @as([*]const u8, @ptrCast(@alignCast(array_buffer.ptr)));
zig_str.len = array_buffer.len;
zig_str.markUTF16();
return zig_str.toJS(globalThis);
},
else => {
return ZigString.init(array_buffer.slice()).toJS(globalThis);
},
}
}
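arrayBufferToString reinterprets the bytes of Uint16Array/Int16Array views as UTF-16 code units and treats every other view as raw 8-bit data. In Python terms (an illustrative sketch, assuming little-endian byte order as on the platforms Bun targets):

```python
def array_buffer_to_string(buf: bytes, typed_array_type: str) -> str:
    # 16-bit views are decoded as UTF-16 code units; all other views
    # are taken byte-for-byte (latin-1 maps each byte to one char).
    if typed_array_type in ("Uint16Array", "Int16Array"):
        return buf.decode("utf-16-le")
    return buf.decode("latin-1")

print(array_buffer_to_string(b"h\x00i\x00", "Uint16Array"))  # hi
print(array_buffer_to_string(b"hi", "Uint8Array"))           # hi
```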
extern fn dump_zone_malloc_stats() void;
fn dump_mimalloc(globalObject: *JSC.JSGlobalObject, _: *JSC.CallFrame) bun.JSError!JSC.JSValue {
globalObject.bunVM().arena.dumpStats();
if (bun.heap_breakdown.enabled) {
dump_zone_malloc_stats();
}
return .undefined;
}
const JSC = bun.JSC;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const JSObject = JSC.JSObject;
const std = @import("std");
const bun = @import("root").bun;
const ZigString = JSC.ZigString;

View File

@@ -2501,7 +2501,7 @@ pub const DNSResolver = struct {
return globalThis.throwNotEnoughArguments("resolve", 3, arguments.len);
}
const record_type: RecordType = if (arguments.len == 1)
const record_type: RecordType = if (arguments.len <= 1)
RecordType.default
else brk: {
const record_type_value = arguments.ptr[1];
@@ -2518,7 +2518,7 @@ pub const DNSResolver = struct {
}
break :brk RecordType.map.getWithEql(record_type_str.getZigString(globalThis), JSC.ZigString.eqlComptime) orelse {
return globalThis.throwInvalidArgumentType("resolve", "record", "one of: A, AAAA, CAA, CNAME, MX, NS, PTR, SOA, SRV, TXT");
return globalThis.throwInvalidArgumentType("resolve", "record", "one of: A, AAAA, ANY, CAA, CNAME, MX, NS, PTR, SOA, SRV, TXT");
};
};

View File

@@ -3248,9 +3248,9 @@ pub const H2FrameParser = struct {
return globalObject.throw("Invalid stream id", .{});
};
if (!headers_arg.isObject()) {
const headers_obj = headers_arg.getObject() orelse {
return globalObject.throw("Expected headers to be an object", .{});
}
};
if (!sensitive_arg.isObject()) {
return globalObject.throw("Expected sensitiveHeaders to be an object", .{});
@@ -3266,7 +3266,7 @@ pub const H2FrameParser = struct {
var iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = false,
.include_value = true,
}).init(globalObject, headers_arg);
}).init(globalObject, headers_obj);
defer iter.deinit();
var single_value_headers: [SingleValueHeaders.keys().len]bool = undefined;
@@ -3595,9 +3595,9 @@ pub const H2FrameParser = struct {
const headers_arg = args_list.ptr[2];
const sensitive_arg = args_list.ptr[3];
if (!headers_arg.isObject()) {
const headers_obj = headers_arg.getObject() orelse {
return globalObject.throw("Expected headers to be an object", .{});
}
};
if (!sensitive_arg.isObject()) {
return globalObject.throw("Expected sensitiveHeaders to be an object", .{});
@@ -3617,7 +3617,7 @@ pub const H2FrameParser = struct {
var iter = try JSC.JSPropertyIterator(.{
.skip_empty_name = false,
.include_value = true,
}).init(globalObject, headers_arg);
}).init(globalObject, headers_obj);
defer iter.deinit();
var header_count: u32 = 0;

View File

@@ -159,7 +159,7 @@ pub const ResourceUsage = struct {
}
};
pub fn appendEnvpFromJS(globalThis: *JSC.JSGlobalObject, object: JSC.JSValue, envp: *std.ArrayList(?[*:0]const u8), PATH: *[]const u8) bun.JSError!void {
pub fn appendEnvpFromJS(globalThis: *JSC.JSGlobalObject, object: *JSC.JSObject, envp: *std.ArrayList(?[*:0]const u8), PATH: *[]const u8) bun.JSError!void {
var object_iter = try JSC.JSPropertyIterator(.{ .skip_empty_name = false, .include_value = true }).init(globalThis, object);
defer object_iter.deinit();
@@ -2024,10 +2024,11 @@ pub fn spawnMaybeSync(
onExit_.withAsyncContextIfNeeded(globalThis);
}
if (try args.getTruthy(globalThis, "env")) |object| {
if (!object.isObject()) {
if (try args.getTruthy(globalThis, "env")) |env_arg| {
env_arg.ensureStillAlive();
const object = env_arg.getObject() orelse {
return globalThis.throwInvalidArguments("env must be an object", .{});
}
};
override_env = true;
// If the env object does not include a $PATH, it must disable path lookup for argv[0]

View File

@@ -68,10 +68,6 @@ export default [
},
length: 2,
},
scryptSync: {
fn: "doScryptSync",
length: 2,
},
},
klass: {},
}),

src/bun.js/api/crypto.zig Normal file
View File

@@ -0,0 +1,32 @@
pub fn createCryptoError(globalThis: *JSC.JSGlobalObject, err_code: u32) JSValue {
return bun.BoringSSL.ERR_toJS(globalThis, err_code);
}
pub const PasswordObject = @import("./crypto/PasswordObject.zig").PasswordObject;
pub const JSPasswordObject = @import("./crypto/PasswordObject.zig").JSPasswordObject;
pub const CryptoHasher = @import("./crypto/CryptoHasher.zig").CryptoHasher;
pub const MD4 = @import("./crypto/CryptoHasher.zig").MD4;
pub const MD5 = @import("./crypto/CryptoHasher.zig").MD5;
pub const SHA1 = @import("./crypto/CryptoHasher.zig").SHA1;
pub const SHA224 = @import("./crypto/CryptoHasher.zig").SHA224;
pub const SHA256 = @import("./crypto/CryptoHasher.zig").SHA256;
pub const SHA384 = @import("./crypto/CryptoHasher.zig").SHA384;
pub const SHA512 = @import("./crypto/CryptoHasher.zig").SHA512;
pub const SHA512_256 = @import("./crypto/CryptoHasher.zig").SHA512_256;
pub const HMAC = @import("./crypto/HMAC.zig");
pub const EVP = @import("./crypto/EVP.zig");
comptime {
CryptoHasher.Extern.@"export"();
}
const std = @import("std");
const bun = @import("root").bun;
const string = bun.string;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const JSC = bun.JSC;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;

View File

@@ -0,0 +1,898 @@
pub const CryptoHasher = union(enum) {
// HMAC_CTX contains 3 EVP_CTX, so let's store it as a pointer.
hmac: ?*HMAC,
evp: EVP,
zig: CryptoHasherZig,
const Digest = EVP.Digest;
pub usingnamespace JSC.Codegen.JSCryptoHasher;
usingnamespace bun.New(@This());
// For using only CryptoHasherZig in c++
pub const Extern = struct {
fn getByName(global: *JSGlobalObject, name_bytes: [*:0]const u8, name_len: usize) callconv(.C) ?*CryptoHasher {
const name = name_bytes[0..name_len];
if (CryptoHasherZig.init(name)) |inner| {
return CryptoHasher.new(.{
.zig = inner,
});
}
const algorithm = EVP.Algorithm.map.get(name) orelse {
return null;
};
switch (algorithm) {
.ripemd160,
.blake2b256,
.blake2b512,
.@"sha512-224",
=> {
if (algorithm.md()) |md| {
return CryptoHasher.new(.{
.evp = EVP.init(algorithm, md, global.bunVM().rareData().boringEngine()),
});
}
},
else => {
return null;
},
}
return null;
}
fn getFromOther(global: *JSGlobalObject, other_handle: *CryptoHasher) callconv(.C) ?*CryptoHasher {
switch (other_handle.*) {
.zig => |other| {
const hasher = CryptoHasher.new(.{
.zig = other.copy(),
});
return hasher;
},
.evp => |other| {
return CryptoHasher.new(.{
.evp = other.copy(global.bunVM().rareData().boringEngine()) catch {
return null;
},
});
},
else => {
return null;
},
}
}
fn destroy(handle: *CryptoHasher) callconv(.C) void {
handle.finalize();
}
fn update(handle: *CryptoHasher, input_bytes: [*]const u8, input_len: usize) callconv(.C) bool {
const input = input_bytes[0..input_len];
switch (handle.*) {
.zig => {
handle.zig.update(input);
return true;
},
.evp => {
handle.evp.update(input);
return true;
},
else => {
return false;
},
}
}
fn digest(handle: *CryptoHasher, global: *JSGlobalObject, buf: [*]u8, buf_len: usize) callconv(.C) u32 {
const digest_buf = buf[0..buf_len];
switch (handle.*) {
.zig => {
const res = handle.zig.finalWithLen(digest_buf, buf_len);
return @intCast(res.len);
},
.evp => {
const res = handle.evp.final(global.bunVM().rareData().boringEngine(), digest_buf);
return @intCast(res.len);
},
else => {
return 0;
},
}
}
fn getDigestSize(handle: *CryptoHasher) callconv(.C) u32 {
return switch (handle.*) {
.zig => |inner| inner.digest_length,
.evp => |inner| inner.size(),
else => 0,
};
}
pub fn @"export"() void {
@export(&CryptoHasher.Extern.getByName, .{ .name = "Bun__CryptoHasherExtern__getByName" });
@export(&CryptoHasher.Extern.getFromOther, .{ .name = "Bun__CryptoHasherExtern__getFromOther" });
@export(&CryptoHasher.Extern.destroy, .{ .name = "Bun__CryptoHasherExtern__destroy" });
@export(&CryptoHasher.Extern.update, .{ .name = "Bun__CryptoHasherExtern__update" });
@export(&CryptoHasher.Extern.digest, .{ .name = "Bun__CryptoHasherExtern__digest" });
@export(&CryptoHasher.Extern.getDigestSize, .{ .name = "Bun__CryptoHasherExtern__getDigestSize" });
}
};
pub const digest = JSC.wrapInstanceMethod(CryptoHasher, "digest_", false);
pub const hash = JSC.wrapStaticMethod(CryptoHasher, "hash_", false);
fn throwHmacConsumed(globalThis: *JSC.JSGlobalObject) bun.JSError {
return globalThis.throw("HMAC has been consumed and is no longer usable", .{});
}
pub fn getByteLength(this: *CryptoHasher, globalThis: *JSC.JSGlobalObject) JSC.JSValue {
return JSC.JSValue.jsNumber(switch (this.*) {
.evp => |*inner| inner.size(),
.hmac => |inner| if (inner) |hmac| hmac.size() else {
throwHmacConsumed(globalThis) catch return .zero;
},
.zig => |*inner| inner.digest_length,
});
}
pub fn getAlgorithm(this: *CryptoHasher, globalObject: *JSC.JSGlobalObject) JSC.JSValue {
return switch (this.*) {
inline .evp, .zig => |*inner| ZigString.fromUTF8(bun.asByteSlice(@tagName(inner.algorithm))).toJS(globalObject),
.hmac => |inner| if (inner) |hmac| ZigString.fromUTF8(bun.asByteSlice(@tagName(hmac.algorithm))).toJS(globalObject) else {
throwHmacConsumed(globalObject) catch return .zero;
},
};
}
pub fn getAlgorithms(
globalThis_: *JSC.JSGlobalObject,
_: JSValue,
_: JSValue,
) JSC.JSValue {
return bun.String.toJSArray(globalThis_, &EVP.Algorithm.names.values);
}
fn hashToEncoding(globalThis: *JSGlobalObject, evp: *EVP, input: JSC.Node.BlobOrStringOrBuffer, encoding: JSC.Node.Encoding) bun.JSError!JSC.JSValue {
var output_digest_buf: Digest = undefined;
defer input.deinit();
if (input == .blob and input.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
const len = evp.hash(globalThis.bunVM().rareData().boringEngine(), input.slice(), &output_digest_buf) orelse {
const err = BoringSSL.ERR_get_error();
const instance = createCryptoError(globalThis, err);
BoringSSL.ERR_clear_error();
return globalThis.throwValue(instance);
};
return encoding.encodeWithMaxSize(globalThis, BoringSSL.EVP_MAX_MD_SIZE, output_digest_buf[0..len]);
}
fn hashToBytes(globalThis: *JSGlobalObject, evp: *EVP, input: JSC.Node.BlobOrStringOrBuffer, output: ?JSC.ArrayBuffer) bun.JSError!JSC.JSValue {
var output_digest_buf: Digest = undefined;
var output_digest_slice: []u8 = &output_digest_buf;
defer input.deinit();
if (input == .blob and input.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
if (output) |output_buf| {
const size = evp.size();
var bytes = output_buf.byteSlice();
if (bytes.len < size) {
return globalThis.throwInvalidArguments("TypedArray must be at least {d} bytes", .{size});
}
output_digest_slice = bytes[0..size];
}
const len = evp.hash(globalThis.bunVM().rareData().boringEngine(), input.slice(), output_digest_slice) orelse {
const err = BoringSSL.ERR_get_error();
const instance = createCryptoError(globalThis, err);
BoringSSL.ERR_clear_error();
return globalThis.throwValue(instance);
};
if (output) |output_buf| {
return output_buf.value;
} else {
// Clone to GC-managed memory
return JSC.ArrayBuffer.createBuffer(globalThis, output_digest_slice[0..len]);
}
}
pub fn hash_(
globalThis: *JSGlobalObject,
algorithm: ZigString,
input: JSC.Node.BlobOrStringOrBuffer,
output: ?JSC.Node.StringOrBuffer,
) bun.JSError!JSC.JSValue {
var evp = EVP.byName(algorithm, globalThis) orelse return try CryptoHasherZig.hashByName(globalThis, algorithm, input, output) orelse {
return globalThis.throwInvalidArguments("Unsupported algorithm \"{any}\"", .{algorithm});
};
defer evp.deinit();
if (output) |string_or_buffer| {
switch (string_or_buffer) {
inline else => |*str| {
defer str.deinit();
const encoding = JSC.Node.Encoding.from(str.slice()) orelse {
return globalThis.ERR_INVALID_ARG_VALUE("Unknown encoding: {s}", .{str.slice()}).throw();
};
return hashToEncoding(globalThis, &evp, input, encoding);
},
.buffer => |buffer| {
return hashToBytes(globalThis, &evp, input, buffer.buffer);
},
}
} else {
return hashToBytes(globalThis, &evp, input, null);
}
}
// Bun.CryptoHasher(algorithm, hmacKey?: string | Buffer)
pub fn constructor(globalThis: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!*CryptoHasher {
const arguments = callframe.arguments_old(2);
if (arguments.len == 0) {
return globalThis.throwInvalidArguments("Expected an algorithm name as an argument", .{});
}
const algorithm_name = arguments.ptr[0];
if (algorithm_name.isEmptyOrUndefinedOrNull() or !algorithm_name.isString()) {
return globalThis.throwInvalidArguments("algorithm must be a string", .{});
}
const algorithm = try algorithm_name.getZigString(globalThis);
if (algorithm.len == 0) {
return globalThis.throwInvalidArguments("Invalid algorithm name", .{});
}
const hmac_value = arguments.ptr[1];
var hmac_key: ?JSC.Node.StringOrBuffer = null;
defer {
if (hmac_key) |*key| {
key.deinit();
}
}
if (!hmac_value.isEmptyOrUndefinedOrNull()) {
hmac_key = try JSC.Node.StringOrBuffer.fromJS(globalThis, bun.default_allocator, hmac_value) orelse {
return globalThis.throwInvalidArguments("key must be a string or buffer", .{});
};
}
return CryptoHasher.new(brk: {
if (hmac_key) |*key| {
const chosen_algorithm = try algorithm_name.toEnumFromMap(globalThis, "algorithm", EVP.Algorithm, EVP.Algorithm.map);
if (chosen_algorithm == .ripemd160) {
// HMAC over ripemd160 crashes at runtime, so reject it up front.
return globalThis.throw("ripemd160 is not supported", .{});
}
break :brk .{
.hmac = HMAC.init(chosen_algorithm, key.slice()) orelse {
if (!globalThis.hasException()) {
const err = BoringSSL.ERR_get_error();
if (err != 0) {
const instance = createCryptoError(globalThis, err);
BoringSSL.ERR_clear_error();
return globalThis.throwValue(instance);
} else {
return globalThis.throwTODO("HMAC is not supported for this algorithm yet");
}
}
return error.JSError;
},
};
}
break :brk .{
.evp = EVP.byName(algorithm, globalThis) orelse return CryptoHasherZig.constructor(algorithm) orelse {
return globalThis.throwInvalidArguments("Unsupported algorithm {any}", .{algorithm});
},
};
});
}
pub fn getter(
globalObject: *JSC.JSGlobalObject,
_: *JSC.JSObject,
) JSC.JSValue {
return CryptoHasher.getConstructor(globalObject);
}
pub fn update(this: *CryptoHasher, globalThis: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const thisValue = callframe.this();
const arguments = callframe.arguments_old(2);
const input = arguments.ptr[0];
if (input.isEmptyOrUndefinedOrNull()) {
return globalThis.throwInvalidArguments("expected blob, string or buffer", .{});
}
const encoding = arguments.ptr[1];
const buffer = try JSC.Node.BlobOrStringOrBuffer.fromJSWithEncodingValue(globalThis, globalThis.bunVM().allocator, input, encoding) orelse {
if (!globalThis.hasException()) return globalThis.throwInvalidArguments("expected blob, string or buffer", .{});
return error.JSError;
};
defer buffer.deinit();
if (buffer == .blob and buffer.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
switch (this.*) {
.evp => |*inner| {
inner.update(buffer.slice());
const err = BoringSSL.ERR_get_error();
if (err != 0) {
const instance = createCryptoError(globalThis, err);
BoringSSL.ERR_clear_error();
return globalThis.throwValue(instance);
}
},
.hmac => |inner| {
const hmac = inner orelse {
return throwHmacConsumed(globalThis);
};
hmac.update(buffer.slice());
const err = BoringSSL.ERR_get_error();
if (err != 0) {
const instance = createCryptoError(globalThis, err);
BoringSSL.ERR_clear_error();
return globalThis.throwValue(instance);
}
},
.zig => |*inner| {
inner.update(buffer.slice());
return thisValue;
},
}
return thisValue;
}
pub fn copy(
this: *CryptoHasher,
globalObject: *JSC.JSGlobalObject,
_: *JSC.CallFrame,
) bun.JSError!JSC.JSValue {
var new: CryptoHasher = undefined;
switch (this.*) {
.evp => |*inner| {
new = .{ .evp = inner.copy(globalObject.bunVM().rareData().boringEngine()) catch bun.outOfMemory() };
},
.hmac => |inner| {
const hmac = inner orelse {
return throwHmacConsumed(globalObject);
};
new = .{
.hmac = hmac.copy() catch {
const err = createCryptoError(globalObject, BoringSSL.ERR_get_error());
BoringSSL.ERR_clear_error();
return globalObject.throwValue(err);
},
};
},
.zig => |*inner| {
new = .{ .zig = inner.copy() };
},
}
return CryptoHasher.new(new).toJS(globalObject);
}
pub fn digest_(this: *CryptoHasher, globalThis: *JSGlobalObject, output: ?JSC.Node.StringOrBuffer) bun.JSError!JSC.JSValue {
if (output) |string_or_buffer| {
switch (string_or_buffer) {
inline else => |*str| {
defer str.deinit();
const encoding = JSC.Node.Encoding.from(str.slice()) orelse {
return globalThis.ERR_INVALID_ARG_VALUE("Unknown encoding: {s}", .{str.slice()}).throw();
};
return this.digestToEncoding(globalThis, encoding);
},
.buffer => |buffer| {
return this.digestToBytes(
globalThis,
buffer.buffer,
);
},
}
} else {
return this.digestToBytes(globalThis, null);
}
}
fn digestToBytes(this: *CryptoHasher, globalThis: *JSGlobalObject, output: ?JSC.ArrayBuffer) bun.JSError!JSC.JSValue {
var output_digest_buf: EVP.Digest = undefined;
var output_digest_slice: []u8 = &output_digest_buf;
if (output) |output_buf| {
var bytes = output_buf.byteSlice();
if (bytes.len < output_digest_buf.len) {
return globalThis.throwInvalidArguments(comptime std.fmt.comptimePrint("TypedArray must be at least {d} bytes", .{output_digest_buf.len}), .{});
}
output_digest_slice = bytes[0..bytes.len];
} else {
output_digest_buf = std.mem.zeroes(EVP.Digest);
}
const result = this.final(globalThis, output_digest_slice) catch return .zero;
if (globalThis.hasException()) {
return error.JSError;
}
if (output) |output_buf| {
return output_buf.value;
} else {
// Clone to GC-managed memory
return JSC.ArrayBuffer.createBuffer(globalThis, result);
}
}
fn digestToEncoding(this: *CryptoHasher, globalThis: *JSGlobalObject, encoding: JSC.Node.Encoding) bun.JSError!JSC.JSValue {
var output_digest_buf: EVP.Digest = std.mem.zeroes(EVP.Digest);
const output_digest_slice: []u8 = &output_digest_buf;
const out = this.final(globalThis, output_digest_slice) catch return .zero;
if (globalThis.hasException()) {
return error.JSError;
}
return encoding.encodeWithMaxSize(globalThis, BoringSSL.EVP_MAX_MD_SIZE, out);
}
fn final(this: *CryptoHasher, globalThis: *JSGlobalObject, output_digest_slice: []u8) bun.JSError![]u8 {
return switch (this.*) {
.hmac => |inner| brk: {
const hmac: *HMAC = inner orelse {
return throwHmacConsumed(globalThis);
};
this.hmac = null;
defer hmac.deinit();
break :brk hmac.final(output_digest_slice);
},
.evp => |*inner| inner.final(globalThis.bunVM().rareData().boringEngine(), output_digest_slice),
.zig => |*inner| inner.final(output_digest_slice),
};
}
pub fn finalize(this: *CryptoHasher) void {
switch (this.*) {
.evp => |*inner| {
// https://github.com/oven-sh/bun/issues/3250
inner.deinit();
},
.zig => |*inner| {
inner.deinit();
},
.hmac => |inner| {
if (inner) |hmac| {
hmac.deinit();
}
},
}
this.destroy();
}
};
const CryptoHasherZig = struct {
algorithm: EVP.Algorithm,
state: *anyopaque,
digest_length: u8,
const algo_map = [_]struct { string, type }{
.{ "sha3-224", std.crypto.hash.sha3.Sha3_224 },
.{ "sha3-256", std.crypto.hash.sha3.Sha3_256 },
.{ "sha3-384", std.crypto.hash.sha3.Sha3_384 },
.{ "sha3-512", std.crypto.hash.sha3.Sha3_512 },
.{ "shake128", std.crypto.hash.sha3.Shake128 },
.{ "shake256", std.crypto.hash.sha3.Shake256 },
};
inline fn digestLength(Algorithm: type) comptime_int {
return switch (Algorithm) {
std.crypto.hash.sha3.Shake128 => 16,
std.crypto.hash.sha3.Shake256 => 32,
else => Algorithm.digest_length,
};
}
pub fn hashByName(globalThis: *JSGlobalObject, algorithm: ZigString, input: JSC.Node.BlobOrStringOrBuffer, output: ?JSC.Node.StringOrBuffer) bun.JSError!?JSC.JSValue {
inline for (algo_map) |item| {
if (bun.strings.eqlComptime(algorithm.slice(), item[0])) {
return try hashByNameInner(globalThis, item[1], input, output);
}
}
return null;
}
fn hashByNameInner(globalThis: *JSGlobalObject, comptime Algorithm: type, input: JSC.Node.BlobOrStringOrBuffer, output: ?JSC.Node.StringOrBuffer) bun.JSError!JSC.JSValue {
if (output) |string_or_buffer| {
switch (string_or_buffer) {
inline else => |*str| {
defer str.deinit();
const encoding = JSC.Node.Encoding.from(str.slice()) orelse {
return globalThis.ERR_INVALID_ARG_VALUE("Unknown encoding: {s}", .{str.slice()}).throw();
};
if (encoding == .buffer) {
return hashByNameInnerToBytes(globalThis, Algorithm, input, null);
}
return hashByNameInnerToString(globalThis, Algorithm, input, encoding);
},
.buffer => |buffer| {
return hashByNameInnerToBytes(globalThis, Algorithm, input, buffer.buffer);
},
}
}
return hashByNameInnerToBytes(globalThis, Algorithm, input, null);
}
fn hashByNameInnerToString(globalThis: *JSGlobalObject, comptime Algorithm: type, input: JSC.Node.BlobOrStringOrBuffer, encoding: JSC.Node.Encoding) bun.JSError!JSC.JSValue {
defer input.deinit();
if (input == .blob and input.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
var h = Algorithm.init(.{});
h.update(input.slice());
var out: [digestLength(Algorithm)]u8 = undefined;
h.final(&out);
return encoding.encodeWithSize(globalThis, digestLength(Algorithm), &out);
}
fn hashByNameInnerToBytes(globalThis: *JSGlobalObject, comptime Algorithm: type, input: JSC.Node.BlobOrStringOrBuffer, output: ?JSC.ArrayBuffer) bun.JSError!JSC.JSValue {
defer input.deinit();
if (input == .blob and input.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
var h = Algorithm.init(.{});
const digest_length_comptime = digestLength(Algorithm);
if (output) |output_buf| {
if (output_buf.byteSlice().len < digest_length_comptime) {
return globalThis.throwInvalidArguments("TypedArray must be at least {d} bytes", .{digest_length_comptime});
}
}
h.update(input.slice());
if (output) |output_buf| {
h.final(output_buf.slice()[0..digest_length_comptime]);
return output_buf.value;
} else {
var out: [digestLength(Algorithm)]u8 = undefined;
h.final(&out);
// Clone to GC-managed memory
return JSC.ArrayBuffer.createBuffer(globalThis, &out);
}
}
fn constructor(algorithm: ZigString) ?*CryptoHasher {
inline for (algo_map) |item| {
if (bun.strings.eqlComptime(algorithm.slice(), item[0])) {
return CryptoHasher.new(.{ .zig = .{
.algorithm = @field(EVP.Algorithm, item[0]),
.state = bun.new(item[1], item[1].init(.{})),
.digest_length = digestLength(item[1]),
} });
}
}
return null;
}
pub fn init(algorithm: []const u8) ?CryptoHasherZig {
inline for (algo_map) |item| {
const name, const T = item;
if (bun.strings.eqlComptime(algorithm, name)) {
const handle: CryptoHasherZig = .{
.algorithm = @field(EVP.Algorithm, name),
.state = bun.new(T, T.init(.{})),
.digest_length = digestLength(T),
};
return handle;
}
}
return null;
}
fn update(self: *CryptoHasherZig, bytes: []const u8) void {
inline for (algo_map) |item| {
if (self.algorithm == @field(EVP.Algorithm, item[0])) {
return item[1].update(@ptrCast(@alignCast(self.state)), bytes);
}
}
@panic("unreachable");
}
fn copy(self: *const CryptoHasherZig) CryptoHasherZig {
inline for (algo_map) |item| {
if (self.algorithm == @field(EVP.Algorithm, item[0])) {
return .{
.algorithm = self.algorithm,
.state = bun.dupe(item[1], @ptrCast(@alignCast(self.state))),
.digest_length = self.digest_length,
};
}
}
@panic("unreachable");
}
fn finalWithLen(self: *CryptoHasherZig, output_digest_slice: []u8, res_len: usize) []u8 {
inline for (algo_map) |pair| {
const name, const T = pair;
if (self.algorithm == @field(EVP.Algorithm, name)) {
T.final(@ptrCast(@alignCast(self.state)), @ptrCast(output_digest_slice));
const reset: *T = @ptrCast(@alignCast(self.state));
reset.* = T.init(.{});
return output_digest_slice[0..res_len];
}
}
@panic("unreachable");
}
fn final(self: *CryptoHasherZig, output_digest_slice: []u8) []u8 {
return self.finalWithLen(output_digest_slice, self.digest_length);
}
fn deinit(self: *CryptoHasherZig) void {
inline for (algo_map) |item| {
if (self.algorithm == @field(EVP.Algorithm, item[0])) {
return bun.destroy(@as(*item[1], @ptrCast(@alignCast(self.state))));
}
}
@panic("unreachable");
}
};
fn StaticCryptoHasher(comptime Hasher: type, comptime name: [:0]const u8) type {
return struct {
hashing: Hasher = Hasher{},
digested: bool = false,
const ThisHasher = @This();
pub usingnamespace @field(JSC.Codegen, "JS" ++ name);
pub const digest = JSC.wrapInstanceMethod(ThisHasher, "digest_", false);
pub const hash = JSC.wrapStaticMethod(ThisHasher, "hash_", false);
pub fn getByteLength(
_: *@This(),
_: *JSC.JSGlobalObject,
) JSC.JSValue {
return JSC.JSValue.jsNumber(@as(u16, Hasher.digest));
}
pub fn getByteLengthStatic(
_: *JSC.JSGlobalObject,
_: JSValue,
_: JSValue,
) JSC.JSValue {
return JSC.JSValue.jsNumber(@as(u16, Hasher.digest));
}
fn hashToEncoding(globalThis: *JSGlobalObject, input: JSC.Node.BlobOrStringOrBuffer, encoding: JSC.Node.Encoding) bun.JSError!JSC.JSValue {
var output_digest_buf: Hasher.Digest = undefined;
if (input == .blob and input.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
if (comptime @typeInfo(@TypeOf(Hasher.hash)).@"fn".params.len == 3) {
Hasher.hash(input.slice(), &output_digest_buf, JSC.VirtualMachine.get().rareData().boringEngine());
} else {
Hasher.hash(input.slice(), &output_digest_buf);
}
return encoding.encodeWithSize(globalThis, Hasher.digest, &output_digest_buf);
}
fn hashToBytes(globalThis: *JSGlobalObject, input: JSC.Node.BlobOrStringOrBuffer, output: ?JSC.ArrayBuffer) bun.JSError!JSC.JSValue {
var output_digest_buf: Hasher.Digest = undefined;
var output_digest_slice: *Hasher.Digest = &output_digest_buf;
if (output) |output_buf| {
var bytes = output_buf.byteSlice();
if (bytes.len < Hasher.digest) {
return globalThis.throwInvalidArguments(comptime std.fmt.comptimePrint("TypedArray must be at least {d} bytes", .{Hasher.digest}), .{});
}
output_digest_slice = bytes[0..Hasher.digest];
}
if (comptime @typeInfo(@TypeOf(Hasher.hash)).@"fn".params.len == 3) {
Hasher.hash(input.slice(), output_digest_slice, JSC.VirtualMachine.get().rareData().boringEngine());
} else {
Hasher.hash(input.slice(), output_digest_slice);
}
if (output) |output_buf| {
return output_buf.value;
} else {
var array_buffer_out = JSC.ArrayBuffer.fromBytes(bun.default_allocator.dupe(u8, output_digest_slice) catch unreachable, .Uint8Array);
return array_buffer_out.toJSUnchecked(globalThis, null);
}
}
pub fn hash_(
globalThis: *JSGlobalObject,
input: JSC.Node.BlobOrStringOrBuffer,
output: ?JSC.Node.StringOrBuffer,
) bun.JSError!JSC.JSValue {
defer input.deinit();
if (input == .blob and input.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
if (output) |string_or_buffer| {
switch (string_or_buffer) {
inline else => |*str| {
defer str.deinit();
const encoding = JSC.Node.Encoding.from(str.slice()) orelse {
return globalThis.ERR_INVALID_ARG_VALUE("Unknown encoding: {s}", .{str.slice()}).throw();
};
return hashToEncoding(globalThis, input, encoding);
},
.buffer => |buffer| {
return hashToBytes(globalThis, input, buffer.buffer);
},
}
} else {
return hashToBytes(globalThis, input, null);
}
}
pub fn constructor(_: *JSC.JSGlobalObject, _: *JSC.CallFrame) bun.JSError!*@This() {
const this = try bun.default_allocator.create(@This());
this.* = .{ .hashing = Hasher.init() };
return this;
}
pub fn getter(
globalObject: *JSC.JSGlobalObject,
_: *JSC.JSObject,
) JSC.JSValue {
return ThisHasher.getConstructor(globalObject);
}
pub fn update(this: *@This(), globalThis: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
if (this.digested) {
return globalThis.ERR_INVALID_STATE(name ++ " hasher already digested, create a new instance to update", .{}).throw();
}
const thisValue = callframe.this();
const input = callframe.argument(0);
const buffer = try JSC.Node.BlobOrStringOrBuffer.fromJS(globalThis, globalThis.bunVM().allocator, input) orelse {
return globalThis.throwInvalidArguments("expected blob or string or buffer", .{});
};
defer buffer.deinit();
if (buffer == .blob and buffer.blob.isBunFile()) {
return globalThis.throw("Bun.file() is not supported here yet (it needs an async version)", .{});
}
this.hashing.update(buffer.slice());
return thisValue;
}
pub fn digest_(
this: *@This(),
globalThis: *JSGlobalObject,
output: ?JSC.Node.StringOrBuffer,
) bun.JSError!JSC.JSValue {
if (this.digested) {
return globalThis.ERR_INVALID_STATE(name ++ " hasher already digested, create a new instance to digest again", .{}).throw();
}
if (output) |*string_or_buffer| {
switch (string_or_buffer.*) {
inline else => |*str| {
defer str.deinit();
const encoding = JSC.Node.Encoding.from(str.slice()) orelse {
return globalThis.ERR_INVALID_ARG_VALUE("Unknown encoding: {s}", .{str.slice()}).throw();
};
return this.digestToEncoding(globalThis, encoding);
},
.buffer => |*buffer| {
return this.digestToBytes(
globalThis,
buffer.buffer,
);
},
}
} else {
return this.digestToBytes(globalThis, null);
}
}
fn digestToBytes(this: *@This(), globalThis: *JSGlobalObject, output: ?JSC.ArrayBuffer) bun.JSError!JSC.JSValue {
var output_digest_buf: Hasher.Digest = undefined;
var output_digest_slice: *Hasher.Digest = &output_digest_buf;
if (output) |output_buf| {
var bytes = output_buf.byteSlice();
if (bytes.len < Hasher.digest) {
return globalThis.throwInvalidArguments(comptime std.fmt.comptimePrint("TypedArray must be at least {d} bytes", .{Hasher.digest}), .{});
}
output_digest_slice = bytes[0..Hasher.digest];
} else {
output_digest_buf = std.mem.zeroes(Hasher.Digest);
}
this.hashing.final(output_digest_slice);
this.digested = true;
if (output) |output_buf| {
return output_buf.value;
} else {
var array_buffer_out = JSC.ArrayBuffer.fromBytes(bun.default_allocator.dupe(u8, &output_digest_buf) catch unreachable, .Uint8Array);
return array_buffer_out.toJSUnchecked(globalThis, null);
}
}
fn digestToEncoding(this: *@This(), globalThis: *JSGlobalObject, encoding: JSC.Node.Encoding) JSC.JSValue {
var output_digest_buf: Hasher.Digest = std.mem.zeroes(Hasher.Digest);
const output_digest_slice: *Hasher.Digest = &output_digest_buf;
this.hashing.final(output_digest_slice);
this.digested = true;
return encoding.encodeWithSize(globalThis, Hasher.digest, output_digest_slice);
}
pub fn finalize(this: *@This()) void {
VirtualMachine.get().allocator.destroy(this);
}
};
}
pub const MD4 = StaticCryptoHasher(Hashers.MD4, "MD4");
pub const MD5 = StaticCryptoHasher(Hashers.MD5, "MD5");
pub const SHA1 = StaticCryptoHasher(Hashers.SHA1, "SHA1");
pub const SHA224 = StaticCryptoHasher(Hashers.SHA224, "SHA224");
pub const SHA256 = StaticCryptoHasher(Hashers.SHA256, "SHA256");
pub const SHA384 = StaticCryptoHasher(Hashers.SHA384, "SHA384");
pub const SHA512 = StaticCryptoHasher(Hashers.SHA512, "SHA512");
pub const SHA512_256 = StaticCryptoHasher(Hashers.SHA512_256, "SHA512_256");
const Crypto = JSC.API.Bun.Crypto;
const Hashers = @import("../../../sha.zig");
const std = @import("std");
const bun = @import("root").bun;
const string = bun.string;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const default_allocator = bun.default_allocator;
const JSC = bun.JSC;
const Async = bun.Async;
const ZigString = JSC.ZigString;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const CallFrame = JSC.CallFrame;
const assert = bun.assert;
const HMAC = Crypto.HMAC;
const EVP = Crypto.EVP;
const BoringSSL = bun.BoringSSL.c;
const createCryptoError = Crypto.createCryptoError;
const VirtualMachine = JSC.VirtualMachine;

View File

@@ -0,0 +1,221 @@
ctx: BoringSSL.EVP_MD_CTX = undefined,
md: *const BoringSSL.EVP_MD = undefined,
algorithm: Algorithm,
// We keep our own algorithm tag to avoid asking BoringSSL for the
// digest name, because that API is confusing.
pub const Algorithm = enum {
// @"DSA-SHA",
// @"DSA-SHA1",
// @"MD5-SHA1",
// @"RSA-MD5",
// @"RSA-RIPEMD160",
// @"RSA-SHA1",
// @"RSA-SHA1-2",
// @"RSA-SHA224",
// @"RSA-SHA256",
// @"RSA-SHA384",
// @"RSA-SHA512",
// @"ecdsa-with-SHA1",
blake2b256,
blake2b512,
md4,
md5,
ripemd160,
sha1,
sha224,
sha256,
sha384,
sha512,
@"sha512-224",
@"sha512-256",
@"sha3-224",
@"sha3-256",
@"sha3-384",
@"sha3-512",
shake128,
shake256,
pub fn md(this: Algorithm) ?*const BoringSSL.EVP_MD {
return switch (this) {
.blake2b256 => BoringSSL.EVP_blake2b256(),
.blake2b512 => BoringSSL.EVP_blake2b512(),
.md4 => BoringSSL.EVP_md4(),
.md5 => BoringSSL.EVP_md5(),
.ripemd160 => BoringSSL.EVP_ripemd160(),
.sha1 => BoringSSL.EVP_sha1(),
.sha224 => BoringSSL.EVP_sha224(),
.sha256 => BoringSSL.EVP_sha256(),
.sha384 => BoringSSL.EVP_sha384(),
.sha512 => BoringSSL.EVP_sha512(),
.@"sha512-224" => BoringSSL.EVP_sha512_224(),
.@"sha512-256" => BoringSSL.EVP_sha512_256(),
else => null,
};
}
pub const names: std.EnumArray(Algorithm, bun.String) = brk: {
var all = std.EnumArray(Algorithm, bun.String).initUndefined();
var iter = all.iterator();
while (iter.next()) |entry| {
entry.value.* = bun.String.init(@tagName(entry.key));
}
break :brk all;
};
pub const map = bun.ComptimeStringMap(Algorithm, .{
.{ "blake2b256", .blake2b256 },
.{ "blake2b512", .blake2b512 },
.{ "ripemd160", .ripemd160 },
.{ "rmd160", .ripemd160 },
.{ "md4", .md4 },
.{ "md5", .md5 },
.{ "sha1", .sha1 },
.{ "sha128", .sha1 },
.{ "sha224", .sha224 },
.{ "sha256", .sha256 },
.{ "sha384", .sha384 },
.{ "sha512", .sha512 },
.{ "sha-1", .sha1 },
.{ "sha-224", .sha224 },
.{ "sha-256", .sha256 },
.{ "sha-384", .sha384 },
.{ "sha-512", .sha512 },
.{ "sha-512/224", .@"sha512-224" },
.{ "sha-512_224", .@"sha512-224" },
.{ "sha-512224", .@"sha512-224" },
.{ "sha512-224", .@"sha512-224" },
.{ "sha-512/256", .@"sha512-256" },
.{ "sha-512_256", .@"sha512-256" },
.{ "sha-512256", .@"sha512-256" },
.{ "sha512-256", .@"sha512-256" },
.{ "sha3-224", .@"sha3-224" },
.{ "sha3-256", .@"sha3-256" },
.{ "sha3-384", .@"sha3-384" },
.{ "sha3-512", .@"sha3-512" },
.{ "shake128", .shake128 },
.{ "shake256", .shake256 },
// .{ "md5-sha1", .@"MD5-SHA1" },
// .{ "dsa-sha", .@"DSA-SHA" },
// .{ "dsa-sha1", .@"DSA-SHA1" },
// .{ "ecdsa-with-sha1", .@"ecdsa-with-SHA1" },
// .{ "rsa-md5", .@"RSA-MD5" },
// .{ "rsa-sha1", .@"RSA-SHA1" },
// .{ "rsa-sha1-2", .@"RSA-SHA1-2" },
// .{ "rsa-sha224", .@"RSA-SHA224" },
// .{ "rsa-sha256", .@"RSA-SHA256" },
// .{ "rsa-sha384", .@"RSA-SHA384" },
// .{ "rsa-sha512", .@"RSA-SHA512" },
// .{ "rsa-ripemd160", .@"RSA-RIPEMD160" },
});
};
pub fn init(algorithm: Algorithm, md: *const BoringSSL.EVP_MD, engine: *BoringSSL.ENGINE) EVP {
bun.BoringSSL.load();
var ctx: BoringSSL.EVP_MD_CTX = undefined;
BoringSSL.EVP_MD_CTX_init(&ctx);
_ = BoringSSL.EVP_DigestInit_ex(&ctx, md, engine);
return .{
.ctx = ctx,
.md = md,
.algorithm = algorithm,
};
}
pub fn reset(this: *EVP, engine: *BoringSSL.ENGINE) void {
BoringSSL.ERR_clear_error();
_ = BoringSSL.EVP_DigestInit_ex(&this.ctx, this.md, engine);
}
pub fn hash(this: *EVP, engine: *BoringSSL.ENGINE, input: []const u8, output: []u8) ?u32 {
BoringSSL.ERR_clear_error();
var outsize: c_uint = @min(@as(u16, @truncate(output.len)), this.size());
if (BoringSSL.EVP_Digest(input.ptr, input.len, output.ptr, &outsize, this.md, engine) != 1) {
return null;
}
return outsize;
}
pub fn final(this: *EVP, engine: *BoringSSL.ENGINE, output: []u8) []u8 {
BoringSSL.ERR_clear_error();
var outsize: u32 = @min(@as(u16, @truncate(output.len)), this.size());
if (BoringSSL.EVP_DigestFinal_ex(
&this.ctx,
output.ptr,
&outsize,
) != 1) {
return "";
}
this.reset(engine);
return output[0..outsize];
}
pub fn update(this: *EVP, input: []const u8) void {
BoringSSL.ERR_clear_error();
_ = BoringSSL.EVP_DigestUpdate(&this.ctx, input.ptr, input.len);
}
pub fn size(this: *const EVP) u16 {
return @as(u16, @truncate(BoringSSL.EVP_MD_CTX_size(&this.ctx)));
}
pub fn copy(this: *const EVP, engine: *BoringSSL.ENGINE) error{OutOfMemory}!EVP {
BoringSSL.ERR_clear_error();
var new = init(this.algorithm, this.md, engine);
if (BoringSSL.EVP_MD_CTX_copy_ex(&new.ctx, &this.ctx) == 0) {
return error.OutOfMemory;
}
return new;
}
pub fn byNameAndEngine(engine: *BoringSSL.ENGINE, name: []const u8) ?EVP {
if (Algorithm.map.getWithEql(name, strings.eqlCaseInsensitiveASCIIIgnoreLength)) |algorithm| {
if (algorithm.md()) |md| {
return EVP.init(algorithm, md, engine);
}
if (BoringSSL.EVP_get_digestbyname(@tagName(algorithm))) |md| {
return EVP.init(algorithm, md, engine);
}
}
return null;
}
pub fn byName(name: ZigString, global: *JSC.JSGlobalObject) ?EVP {
var name_str = name.toSlice(global.allocator());
defer name_str.deinit();
return byNameAndEngine(global.bunVM().rareData().boringEngine(), name_str.slice());
}
pub fn deinit(this: *EVP) void {
// https://github.com/oven-sh/bun/issues/3250
_ = BoringSSL.EVP_MD_CTX_cleanup(&this.ctx);
}
pub const Digest = [BoringSSL.EVP_MAX_MD_SIZE]u8;
pub const PBKDF2 = @import("./PBKDF2.zig");
pub const pbkdf2 = PBKDF2.pbkdf2;
const std = @import("std");
const bun = @import("root").bun;
const string = bun.string;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const default_allocator = bun.default_allocator;
const JSC = bun.JSC;
const Async = bun.Async;
const ZigString = JSC.ZigString;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const CallFrame = JSC.CallFrame;
const assert = bun.assert;
const EVP = @This();
const BoringSSL = bun.BoringSSL.c;

View File

@@ -0,0 +1,56 @@
ctx: BoringSSL.HMAC_CTX,
algorithm: EVP.Algorithm,
pub usingnamespace bun.New(@This());
pub fn init(algorithm: EVP.Algorithm, key: []const u8) ?*HMAC {
const md = algorithm.md() orelse return null;
var ctx: BoringSSL.HMAC_CTX = undefined;
BoringSSL.HMAC_CTX_init(&ctx);
if (BoringSSL.HMAC_Init_ex(&ctx, key.ptr, @intCast(key.len), md, null) != 1) {
BoringSSL.HMAC_CTX_cleanup(&ctx);
return null;
}
return HMAC.new(.{
.ctx = ctx,
.algorithm = algorithm,
});
}
pub fn update(this: *HMAC, data: []const u8) void {
_ = BoringSSL.HMAC_Update(&this.ctx, data.ptr, data.len);
}
pub fn size(this: *const HMAC) usize {
return BoringSSL.HMAC_size(&this.ctx);
}
pub fn copy(this: *HMAC) !*HMAC {
var ctx: BoringSSL.HMAC_CTX = undefined;
BoringSSL.HMAC_CTX_init(&ctx);
if (BoringSSL.HMAC_CTX_copy(&ctx, &this.ctx) != 1) {
BoringSSL.HMAC_CTX_cleanup(&ctx);
return error.BoringSSLError;
}
return HMAC.new(.{
.ctx = ctx,
.algorithm = this.algorithm,
});
}
pub fn final(this: *HMAC, out: []u8) []u8 {
var outlen: c_uint = undefined;
_ = BoringSSL.HMAC_Final(&this.ctx, out.ptr, &outlen);
return out[0..outlen];
}
pub fn deinit(this: *HMAC) void {
BoringSSL.HMAC_CTX_cleanup(&this.ctx);
this.destroy();
}
const bun = @import("root").bun;
const BoringSSL = bun.BoringSSL.c;
const JSC = bun.JSC;
const EVP = JSC.API.Bun.Crypto.EVP;
const HMAC = @This();


@@ -0,0 +1,265 @@
password: JSC.Node.StringOrBuffer = JSC.Node.StringOrBuffer.empty,
salt: JSC.Node.StringOrBuffer = JSC.Node.StringOrBuffer.empty,
iteration_count: u32 = 1,
length: i32 = 0,
algorithm: EVP.Algorithm,
pub fn run(this: *PBKDF2, output: []u8) bool {
const password = this.password.slice();
const salt = this.salt.slice();
const algorithm = this.algorithm;
const iteration_count = this.iteration_count;
const length = this.length;
@memset(output, 0);
assert(this.length <= @as(i32, @intCast(output.len)));
BoringSSL.ERR_clear_error();
const rc = BoringSSL.PKCS5_PBKDF2_HMAC(
if (password.len > 0) password.ptr else null,
@intCast(password.len),
salt.ptr,
@intCast(salt.len),
@intCast(iteration_count),
algorithm.md().?,
@intCast(length),
output.ptr,
);
if (rc <= 0) {
return false;
}
return true;
}
pub const Job = struct {
pbkdf2: PBKDF2,
output: []u8 = &[_]u8{},
task: JSC.WorkPoolTask = .{ .callback = &runTask },
promise: JSC.JSPromise.Strong = .{},
vm: *JSC.VirtualMachine,
err: ?u32 = null,
any_task: JSC.AnyTask = undefined,
poll: Async.KeepAlive = .{},
pub usingnamespace bun.New(@This());
pub fn runTask(task: *JSC.WorkPoolTask) void {
const job: *PBKDF2.Job = @fieldParentPtr("task", task);
defer job.vm.enqueueTaskConcurrent(JSC.ConcurrentTask.create(job.any_task.task()));
job.output = bun.default_allocator.alloc(u8, @as(usize, @intCast(job.pbkdf2.length))) catch {
job.err = BoringSSL.EVP_R_MEMORY_LIMIT_EXCEEDED;
return;
};
if (!job.pbkdf2.run(job.output)) {
job.err = BoringSSL.ERR_get_error();
BoringSSL.ERR_clear_error();
bun.default_allocator.free(job.output);
job.output = &[_]u8{};
}
}
pub fn runFromJS(this: *Job) void {
defer this.deinit();
if (this.vm.isShuttingDown()) {
return;
}
const globalThis = this.vm.global;
const promise = this.promise.swap();
if (this.err) |err| {
promise.reject(globalThis, createCryptoError(globalThis, err));
return;
}
const output_slice = this.output;
assert(output_slice.len == @as(usize, @intCast(this.pbkdf2.length)));
const buffer_value = JSC.JSValue.createBuffer(globalThis, output_slice, bun.default_allocator);
if (buffer_value == .zero) {
promise.reject(globalThis, ZigString.init("Failed to create buffer").toErrorInstance(globalThis));
return;
}
this.output = &[_]u8{};
promise.resolve(globalThis, buffer_value);
}
pub fn deinit(this: *Job) void {
this.poll.unref(this.vm);
this.pbkdf2.deinitAndUnprotect();
this.promise.deinit();
bun.default_allocator.free(this.output);
this.destroy();
}
pub fn create(vm: *JSC.VirtualMachine, globalThis: *JSC.JSGlobalObject, data: *const PBKDF2) *Job {
var job = Job.new(.{
.pbkdf2 = data.*,
.vm = vm,
.any_task = undefined,
});
job.promise = JSC.JSPromise.Strong.init(globalThis);
job.any_task = JSC.AnyTask.New(@This(), &runFromJS).init(job);
job.poll.ref(vm);
JSC.WorkPool.schedule(&job.task);
return job;
}
};
pub fn deinitAndUnprotect(this: *PBKDF2) void {
this.password.deinitAndUnprotect();
this.salt.deinitAndUnprotect();
}
pub fn deinit(this: *PBKDF2) void {
this.password.deinit();
this.salt.deinit();
}
pub fn fromJS(globalThis: *JSC.JSGlobalObject, callFrame: *JSC.CallFrame, is_async: bool) bun.JSError!PBKDF2 {
const arg0, const arg1, const arg2, const arg3, const arg4, const arg5 = callFrame.argumentsAsArray(6);
if (!arg3.isNumber()) {
return globalThis.throwInvalidArgumentTypeValue("keylen", "number", arg3);
}
const keylen_num = arg3.asNumber();
if (std.math.isInf(keylen_num) or std.math.isNan(keylen_num)) {
return globalThis.throwRangeError(keylen_num, .{
.field_name = "keylen",
.msg = "an integer",
});
}
if (keylen_num < 0 or keylen_num > std.math.maxInt(i32)) {
return globalThis.throwRangeError(keylen_num, .{ .field_name = "keylen", .min = 0, .max = std.math.maxInt(i32) });
}
const keylen: i32 = @intFromFloat(keylen_num);
if (globalThis.hasException()) {
return error.JSError;
}
if (!arg2.isAnyInt()) {
return globalThis.throwInvalidArgumentTypeValue("iterations", "number", arg2);
}
const iteration_count = arg2.coerce(i64, globalThis);
if (!globalThis.hasException() and (iteration_count < 1 or iteration_count > std.math.maxInt(i32))) {
return globalThis.throwRangeError(iteration_count, .{ .field_name = "iterations", .min = 1, .max = std.math.maxInt(i32) + 1 });
}
if (globalThis.hasException()) {
return error.JSError;
}
const algorithm = brk: {
if (!arg4.isString()) {
return globalThis.throwInvalidArgumentTypeValue("digest", "string", arg4);
}
invalid: {
switch (try EVP.Algorithm.map.fromJSCaseInsensitive(globalThis, arg4) orelse break :invalid) {
.shake128, .shake256, .@"sha3-224", .@"sha3-256", .@"sha3-384", .@"sha3-512" => break :invalid,
else => |alg| break :brk alg,
}
}
if (!globalThis.hasException()) {
const slice = try arg4.toSlice(globalThis, bun.default_allocator);
defer slice.deinit();
const name = slice.slice();
return globalThis.ERR_CRYPTO_INVALID_DIGEST("Invalid digest: {s}", .{name}).throw();
}
return error.JSError;
};
var out = PBKDF2{
.iteration_count = @intCast(iteration_count),
.length = keylen,
.algorithm = algorithm,
};
defer {
if (globalThis.hasException()) {
if (is_async)
out.deinitAndUnprotect()
else
out.deinit();
}
}
const allow_string_object = true;
out.salt = try JSC.Node.StringOrBuffer.fromJSMaybeAsync(globalThis, bun.default_allocator, arg1, is_async, allow_string_object) orelse {
return globalThis.throwInvalidArgumentTypeValue("salt", "string or buffer", arg1);
};
if (out.salt.slice().len > std.math.maxInt(i32)) {
return globalThis.throwInvalidArguments("salt is too long", .{});
}
out.password = try JSC.Node.StringOrBuffer.fromJSMaybeAsync(globalThis, bun.default_allocator, arg0, is_async, allow_string_object) orelse {
return globalThis.throwInvalidArgumentTypeValue("password", "string or buffer", arg0);
};
if (out.password.slice().len > std.math.maxInt(i32)) {
return globalThis.throwInvalidArguments("password is too long", .{});
}
if (is_async) {
if (!arg5.isFunction()) {
return globalThis.throwInvalidArgumentTypeValue("callback", "function", arg5);
}
}
return out;
}
/// For usage in Zig
pub fn pbkdf2(
output: []u8,
password: []const u8,
salt: []const u8,
iteration_count: u32,
algorithm: Algorithm,
) ?[]const u8 {
var pbk = PBKDF2{
.algorithm = algorithm,
.password = JSC.Node.StringOrBuffer{ .encoded_slice = JSC.ZigString.Slice.fromUTF8NeverFree(password) },
.salt = JSC.Node.StringOrBuffer{ .encoded_slice = JSC.ZigString.Slice.fromUTF8NeverFree(salt) },
.iteration_count = iteration_count,
.length = @intCast(output.len),
};
if (!pbk.run(output)) {
return null;
}
return output;
}
const std = @import("std");
const bun = @import("root").bun;
const string = bun.string;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const default_allocator = bun.default_allocator;
const JSC = bun.JSC;
const Async = bun.Async;
const ZigString = JSC.ZigString;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const CallFrame = JSC.CallFrame;
const assert = bun.assert;
const EVP = JSC.API.Bun.Crypto.EVP;
const Algorithm = EVP.Algorithm;
const BoringSSL = bun.BoringSSL.c;
const createCryptoError = JSC.API.Bun.Crypto.createCryptoError;
const VirtualMachine = JSC.VirtualMachine;
const PBKDF2 = @This();


@@ -0,0 +1,772 @@
pub const PasswordObject = struct {
pub const pwhash = std.crypto.pwhash;
pub const Algorithm = enum {
argon2i,
argon2d,
argon2id,
bcrypt,
pub const Value = union(Algorithm) {
argon2i: Argon2Params,
argon2d: Argon2Params,
argon2id: Argon2Params,
// bcrypt only accepts "cost"
bcrypt: u6,
pub const bcrpyt_default = 10;
pub const default = Algorithm.Value{
.argon2id = .{},
};
pub fn fromJS(globalObject: *JSC.JSGlobalObject, value: JSC.JSValue) bun.JSError!Value {
if (value.isObject()) {
if (try value.getTruthy(globalObject, "algorithm")) |algorithm_value| {
if (!algorithm_value.isString()) {
return globalObject.throwInvalidArgumentType("hash", "algorithm", "string");
}
const algorithm_string = try algorithm_value.getZigString(globalObject);
switch (PasswordObject.Algorithm.label.getWithEql(algorithm_string, JSC.ZigString.eqlComptime) orelse {
return globalObject.throwInvalidArgumentType("hash", "algorithm", unknown_password_algorithm_message);
}) {
.bcrypt => {
var algorithm = PasswordObject.Algorithm.Value{
.bcrypt = PasswordObject.Algorithm.Value.bcrpyt_default,
};
if (try value.getTruthy(globalObject, "cost")) |rounds_value| {
if (!rounds_value.isNumber()) {
return globalObject.throwInvalidArgumentType("hash", "cost", "number");
}
const rounds = rounds_value.coerce(i32, globalObject);
if (rounds < 4 or rounds > 31) {
return globalObject.throwInvalidArguments("Rounds must be between 4 and 31", .{});
}
algorithm.bcrypt = @as(u6, @intCast(rounds));
}
return algorithm;
},
inline .argon2id, .argon2d, .argon2i => |tag| {
var argon = Algorithm.Argon2Params{};
if (try value.getTruthy(globalObject, "timeCost")) |time_value| {
if (!time_value.isNumber()) {
return globalObject.throwInvalidArgumentType("hash", "timeCost", "number");
}
const time_cost = time_value.coerce(i32, globalObject);
if (time_cost < 1) {
return globalObject.throwInvalidArguments("Time cost must be greater than 0", .{});
}
argon.time_cost = @as(u32, @intCast(time_cost));
}
if (try value.getTruthy(globalObject, "memoryCost")) |memory_value| {
if (!memory_value.isNumber()) {
return globalObject.throwInvalidArgumentType("hash", "memoryCost", "number");
}
const memory_cost = memory_value.coerce(i32, globalObject);
if (memory_cost < 1) {
return globalObject.throwInvalidArguments("Memory cost must be greater than 0", .{});
}
argon.memory_cost = @as(u32, @intCast(memory_cost));
}
return @unionInit(Algorithm.Value, @tagName(tag), argon);
},
}
unreachable;
} else {
return globalObject.throwInvalidArgumentType("hash", "options.algorithm", "string");
}
} else if (value.isString()) {
const algorithm_string = try value.getZigString(globalObject);
switch (PasswordObject.Algorithm.label.getWithEql(algorithm_string, JSC.ZigString.eqlComptime) orelse {
return globalObject.throwInvalidArgumentType("hash", "algorithm", unknown_password_algorithm_message);
}) {
.bcrypt => {
return PasswordObject.Algorithm.Value{
.bcrypt = PasswordObject.Algorithm.Value.bcrpyt_default,
};
},
.argon2id => {
return PasswordObject.Algorithm.Value{
.argon2id = .{},
};
},
.argon2d => {
return PasswordObject.Algorithm.Value{
.argon2d = .{},
};
},
.argon2i => {
return PasswordObject.Algorithm.Value{
.argon2i = .{},
};
},
}
} else {
return globalObject.throwInvalidArgumentType("hash", "algorithm", "string");
}
unreachable;
}
};
pub const Argon2Params = struct {
// we don't support the other options right now, but can add them later if someone asks
memory_cost: u32 = pwhash.argon2.Params.interactive_2id.m,
time_cost: u32 = pwhash.argon2.Params.interactive_2id.t,
pub fn toParams(this: Argon2Params) pwhash.argon2.Params {
return pwhash.argon2.Params{
.t = this.time_cost,
.m = this.memory_cost,
.p = 1,
};
}
};
pub const argon2 = Algorithm.argon2id;
pub const label = bun.ComptimeStringMap(
Algorithm,
.{
.{ "argon2i", .argon2i },
.{ "argon2d", .argon2d },
.{ "argon2id", .argon2id },
.{ "bcrypt", .bcrypt },
},
);
pub const default = Algorithm.argon2;
pub fn get(pw: []const u8) ?Algorithm {
if (pw[0] != '$') {
return null;
}
// PHC format looks like $<algorithm>$<params>$<salt>$<hash><optional stuff>
if (strings.hasPrefixComptime(pw[1..], "argon2d$")) {
return .argon2d;
}
if (strings.hasPrefixComptime(pw[1..], "argon2i$")) {
return .argon2i;
}
if (strings.hasPrefixComptime(pw[1..], "argon2id$")) {
return .argon2id;
}
if (strings.hasPrefixComptime(pw[1..], "bcrypt")) {
return .bcrypt;
}
// https://en.wikipedia.org/wiki/Crypt_(C)
if (strings.hasPrefixComptime(pw[1..], "2")) {
return .bcrypt;
}
return null;
}
};
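`Algorithm.get` above sniffs the algorithm from the `$<algorithm>$...` PHC prefix of a stored hash. A hypothetical JS restatement of that logic (the helper name is illustrative, not part of the source):

```javascript
// Mirror of Algorithm.get: detect the hashing algorithm from the PHC
// prefix. bcrypt hashes use the crypt(3) convention "$2a$", "$2b$", ...
function detectAlgorithm(pw) {
  if (pw[0] !== "$") return null;
  const rest = pw.slice(1);
  if (rest.startsWith("argon2d$")) return "argon2d";
  if (rest.startsWith("argon2i$")) return "argon2i";
  if (rest.startsWith("argon2id$")) return "argon2id";
  if (rest.startsWith("bcrypt") || rest.startsWith("2")) return "bcrypt";
  return null;
}

console.log(detectAlgorithm("$argon2id$v=19$m=65536,t=2,p=1$...")); // argon2id
console.log(detectAlgorithm("$2b$10$abcdefghijklmnopqrstuv")); // bcrypt
```

Note the Zig version can check `argon2d$` and `argon2i$` before `argon2id$` safely because the trailing `$` in each prefix prevents a partial match.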
pub const HashError = pwhash.Error || error{UnsupportedAlgorithm};
// This is purposely simple because nobody asked to make it more complicated
pub fn hash(
allocator: std.mem.Allocator,
password: []const u8,
algorithm: Algorithm.Value,
) HashError![]const u8 {
switch (algorithm) {
inline .argon2i, .argon2d, .argon2id => |argon| {
var outbuf: [4096]u8 = undefined;
const hash_options = pwhash.argon2.HashOptions{
.params = argon.toParams(),
.allocator = allocator,
.mode = switch (algorithm) {
.argon2i => .argon2i,
.argon2d => .argon2d,
.argon2id => .argon2id,
else => unreachable,
},
.encoding = .phc,
};
// warning: argon2's code may spin up threads if parallelism is set to > 0
// we don't expose this option
// but since it parses from phc format, it's possible that it will be set
// eventually we should do something about that.
const out_bytes = try pwhash.argon2.strHash(password, hash_options, &outbuf);
return try allocator.dupe(u8, out_bytes);
},
.bcrypt => |cost| {
var outbuf: [4096]u8 = undefined;
var outbuf_slice: []u8 = outbuf[0..];
var password_to_use = password;
// bcrypt silently truncates passwords longer than 72 bytes
// we use SHA512 to hash the password if it's longer than 72 bytes
if (password.len > 72) {
var sha_512 = bun.sha.SHA512.init();
defer sha_512.deinit();
sha_512.update(password);
sha_512.final(outbuf[0..bun.sha.SHA512.digest]);
password_to_use = outbuf[0..bun.sha.SHA512.digest];
outbuf_slice = outbuf[bun.sha.SHA512.digest..];
}
const hash_options = pwhash.bcrypt.HashOptions{
.params = pwhash.bcrypt.Params{
.rounds_log = cost,
.silently_truncate_password = true,
},
.allocator = allocator,
.encoding = .crypt,
};
const out_bytes = try pwhash.bcrypt.strHash(password_to_use, hash_options, outbuf_slice);
return try allocator.dupe(u8, out_bytes);
},
}
}
pub fn verify(
allocator: std.mem.Allocator,
password: []const u8,
previous_hash: []const u8,
algorithm: ?Algorithm,
) HashError!bool {
if (previous_hash.len == 0) {
return false;
}
return verifyWithAlgorithm(
allocator,
password,
previous_hash,
algorithm orelse Algorithm.get(previous_hash) orelse return error.UnsupportedAlgorithm,
);
}
pub fn verifyWithAlgorithm(
allocator: std.mem.Allocator,
password: []const u8,
previous_hash: []const u8,
algorithm: Algorithm,
) HashError!bool {
switch (algorithm) {
.argon2id, .argon2d, .argon2i => {
pwhash.argon2.strVerify(previous_hash, password, .{ .allocator = allocator }) catch |err| {
if (err == error.PasswordVerificationFailed) {
return false;
}
return err;
};
return true;
},
.bcrypt => {
var password_to_use = password;
var outbuf: [bun.sha.SHA512.digest]u8 = undefined;
// bcrypt silently truncates passwords longer than 72 bytes
// we use SHA512 to hash the password if it's longer than 72 bytes
if (password.len > 72) {
var sha_512 = bun.sha.SHA512.init();
defer sha_512.deinit();
sha_512.update(password);
sha_512.final(&outbuf);
password_to_use = &outbuf;
}
pwhash.bcrypt.strVerify(previous_hash, password_to_use, .{
.allocator = allocator,
.silently_truncate_password = true,
}) catch |err| {
if (err == error.PasswordVerificationFailed) {
return false;
}
return err;
};
return true;
},
}
}
};
pub const JSPasswordObject = struct {
const PascalToUpperUnderscoreCaseFormatter = struct {
input: []const u8,
pub fn format(self: @This(), comptime _: []const u8, _: std.fmt.FormatOptions, writer: anytype) !void {
for (self.input) |c| {
if (std.ascii.isUpper(c)) {
try writer.writeByte('_');
try writer.writeByte(c);
} else if (std.ascii.isLower(c)) {
try writer.writeByte(std.ascii.toUpper(c));
} else {
try writer.writeByte(c);
}
}
}
};
pub export fn JSPasswordObject__create(globalObject: *JSC.JSGlobalObject) JSC.JSValue {
var object = JSValue.createEmptyObject(globalObject, 4);
object.put(
globalObject,
ZigString.static("hash"),
JSC.createCallback(globalObject, ZigString.static("hash"), 2, JSPasswordObject__hash),
);
object.put(
globalObject,
ZigString.static("hashSync"),
JSC.createCallback(globalObject, ZigString.static("hashSync"), 2, JSPasswordObject__hashSync),
);
object.put(
globalObject,
ZigString.static("verify"),
JSC.createCallback(globalObject, ZigString.static("verify"), 2, JSPasswordObject__verify),
);
object.put(
globalObject,
ZigString.static("verifySync"),
JSC.createCallback(globalObject, ZigString.static("verifySync"), 2, JSPasswordObject__verifySync),
);
return object;
}
const HashJob = struct {
algorithm: PasswordObject.Algorithm.Value,
password: []const u8,
promise: JSC.JSPromise.Strong,
event_loop: *JSC.EventLoop,
global: *JSC.JSGlobalObject,
ref: Async.KeepAlive = .{},
task: JSC.WorkPoolTask = .{ .callback = &run },
pub usingnamespace bun.New(@This());
pub const Result = struct {
value: Value,
ref: Async.KeepAlive = .{},
task: JSC.AnyTask = undefined,
promise: JSC.JSPromise.Strong,
global: *JSC.JSGlobalObject,
pub usingnamespace bun.New(@This());
pub const Value = union(enum) {
err: PasswordObject.HashError,
hash: []const u8,
pub fn toErrorInstance(this: Value, globalObject: *JSC.JSGlobalObject) JSC.JSValue {
const error_code = std.fmt.allocPrint(bun.default_allocator, "PASSWORD{}", .{PascalToUpperUnderscoreCaseFormatter{ .input = @errorName(this.err) }}) catch bun.outOfMemory();
defer bun.default_allocator.free(error_code);
const instance = globalObject.createErrorInstance("Password hashing failed with error \"{s}\"", .{@errorName(this.err)});
instance.put(globalObject, ZigString.static("code"), JSC.ZigString.init(error_code).toJS(globalObject));
return instance;
}
};
pub fn runFromJS(this: *Result) void {
var promise = this.promise;
defer promise.deinit();
this.promise = .{};
this.ref.unref(this.global.bunVM());
const global = this.global;
switch (this.value) {
.err => {
const error_instance = this.value.toErrorInstance(global);
this.destroy();
promise.reject(global, error_instance);
},
.hash => |value| {
const js_string = JSC.ZigString.init(value).toJS(global);
this.destroy();
promise.resolve(global, js_string);
},
}
}
};
pub fn deinit(this: *HashJob) void {
this.promise.deinit();
bun.freeSensitive(bun.default_allocator, this.password);
this.destroy();
}
pub fn getValue(password: []const u8, algorithm: PasswordObject.Algorithm.Value) Result.Value {
const value = PasswordObject.hash(bun.default_allocator, password, algorithm) catch |err| {
return Result.Value{ .err = err };
};
return Result.Value{ .hash = value };
}
pub fn run(task: *bun.ThreadPool.Task) void {
var this: *HashJob = @fieldParentPtr("task", task);
var result = Result.new(.{
.value = getValue(this.password, this.algorithm),
.task = undefined,
.promise = this.promise,
.global = this.global,
.ref = this.ref,
});
this.promise = .empty;
result.task = JSC.AnyTask.New(Result, Result.runFromJS).init(result);
this.ref = .{};
this.event_loop.enqueueTaskConcurrent(JSC.ConcurrentTask.createFrom(&result.task));
this.deinit();
}
};
pub fn hash(globalObject: *JSC.JSGlobalObject, password: []const u8, algorithm: PasswordObject.Algorithm.Value, comptime sync: bool) bun.JSError!JSC.JSValue {
assert(password.len > 0); // caller must check
if (comptime sync) {
const value = HashJob.getValue(password, algorithm);
switch (value) {
.err => {
const error_instance = value.toErrorInstance(globalObject);
return globalObject.throwValue(error_instance);
},
.hash => |h| {
return JSC.ZigString.init(h).toJS(globalObject);
},
}
unreachable;
}
const promise = JSC.JSPromise.Strong.init(globalObject);
var job = HashJob.new(.{
.algorithm = algorithm,
.password = password,
.promise = promise,
.event_loop = globalObject.bunVM().eventLoop(),
.global = globalObject,
});
job.ref.ref(globalObject.bunVM());
JSC.WorkPool.schedule(&job.task);
return promise.value();
}
pub fn verify(globalObject: *JSC.JSGlobalObject, password: []const u8, prev_hash: []const u8, algorithm: ?PasswordObject.Algorithm, comptime sync: bool) bun.JSError!JSC.JSValue {
assert(password.len > 0); // caller must check
if (comptime sync) {
const value = VerifyJob.getValue(password, prev_hash, algorithm);
switch (value) {
.err => {
const error_instance = value.toErrorInstance(globalObject);
return globalObject.throwValue(error_instance);
},
.pass => |pass| {
return JSC.JSValue.jsBoolean(pass);
},
}
unreachable;
}
var promise = JSC.JSPromise.Strong.init(globalObject);
const job = VerifyJob.new(.{
.algorithm = algorithm,
.password = password,
.prev_hash = prev_hash,
.promise = promise,
.event_loop = globalObject.bunVM().eventLoop(),
.global = globalObject,
});
job.ref.ref(globalObject.bunVM());
JSC.WorkPool.schedule(&job.task);
return promise.value();
}
// Once we have bindings generator, this should be replaced with a generated function
pub fn JSPasswordObject__hash(globalObject: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const arguments_ = callframe.arguments_old(2);
const arguments = arguments_.ptr[0..arguments_.len];
if (arguments.len < 1) {
return globalObject.throwNotEnoughArguments("hash", 1, 0);
}
var algorithm = PasswordObject.Algorithm.Value.default;
if (arguments.len > 1 and !arguments[1].isEmptyOrUndefinedOrNull()) {
algorithm = try PasswordObject.Algorithm.Value.fromJS(globalObject, arguments[1]);
}
// TODO: this most likely should error like `hashSync` instead of stringifying.
//
// fromJS(...) orelse {
// return globalObject.throwInvalidArgumentType("hash", "password", "string or TypedArray");
// }
const password_to_hash = try JSC.Node.StringOrBuffer.fromJSToOwnedSlice(globalObject, arguments[0], bun.default_allocator);
errdefer bun.default_allocator.free(password_to_hash);
if (password_to_hash.len == 0) {
return globalObject.throwInvalidArguments("password must not be empty", .{});
}
return hash(globalObject, password_to_hash, algorithm, false);
}
// Once we have bindings generator, this should be replaced with a generated function
pub fn JSPasswordObject__hashSync(globalObject: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const arguments_ = callframe.arguments_old(2);
const arguments = arguments_.ptr[0..arguments_.len];
if (arguments.len < 1) {
return globalObject.throwNotEnoughArguments("hash", 1, 0);
}
var algorithm = PasswordObject.Algorithm.Value.default;
if (arguments.len > 1 and !arguments[1].isEmptyOrUndefinedOrNull()) {
algorithm = try PasswordObject.Algorithm.Value.fromJS(globalObject, arguments[1]);
}
var string_or_buffer = try JSC.Node.StringOrBuffer.fromJS(globalObject, bun.default_allocator, arguments[0]) orelse {
return globalObject.throwInvalidArgumentType("hash", "password", "string or TypedArray");
};
defer string_or_buffer.deinit();
if (string_or_buffer.slice().len == 0) {
return globalObject.throwInvalidArguments("password must not be empty", .{});
}
return hash(globalObject, string_or_buffer.slice(), algorithm, true);
}
const VerifyJob = struct {
algorithm: ?PasswordObject.Algorithm = null,
password: []const u8,
prev_hash: []const u8,
promise: JSC.JSPromise.Strong,
event_loop: *JSC.EventLoop,
global: *JSC.JSGlobalObject,
ref: Async.KeepAlive = .{},
task: JSC.WorkPoolTask = .{ .callback = &run },
pub usingnamespace bun.New(@This());
pub const Result = struct {
value: Value,
ref: Async.KeepAlive = .{},
task: JSC.AnyTask = undefined,
promise: JSC.JSPromise.Strong,
global: *JSC.JSGlobalObject,
pub usingnamespace bun.New(@This());
pub const Value = union(enum) {
err: PasswordObject.HashError,
pass: bool,
pub fn toErrorInstance(this: Value, globalObject: *JSC.JSGlobalObject) JSC.JSValue {
const error_code = std.fmt.allocPrint(bun.default_allocator, "PASSWORD{}", .{PascalToUpperUnderscoreCaseFormatter{ .input = @errorName(this.err) }}) catch bun.outOfMemory();
defer bun.default_allocator.free(error_code);
const instance = globalObject.createErrorInstance("Password verification failed with error \"{s}\"", .{@errorName(this.err)});
instance.put(globalObject, ZigString.static("code"), JSC.ZigString.init(error_code).toJS(globalObject));
return instance;
}
};
pub fn runFromJS(this: *Result) void {
var promise = this.promise;
defer promise.deinit();
this.promise = .{};
this.ref.unref(this.global.bunVM());
const global = this.global;
switch (this.value) {
.err => {
const error_instance = this.value.toErrorInstance(global);
this.destroy();
promise.reject(global, error_instance);
},
.pass => |pass| {
this.destroy();
promise.resolve(global, JSC.JSValue.jsBoolean(pass));
},
}
}
};
pub fn deinit(this: *VerifyJob) void {
this.promise.deinit();
bun.freeSensitive(bun.default_allocator, this.password);
bun.freeSensitive(bun.default_allocator, this.prev_hash);
this.destroy();
}
pub fn getValue(password: []const u8, prev_hash: []const u8, algorithm: ?PasswordObject.Algorithm) Result.Value {
const pass = PasswordObject.verify(bun.default_allocator, password, prev_hash, algorithm) catch |err| {
return Result.Value{ .err = err };
};
return Result.Value{ .pass = pass };
}
pub fn run(task: *bun.ThreadPool.Task) void {
var this: *VerifyJob = @fieldParentPtr("task", task);
var result = Result.new(.{
.value = getValue(this.password, this.prev_hash, this.algorithm),
.task = undefined,
.promise = this.promise,
.global = this.global,
.ref = this.ref,
});
this.promise = .empty;
result.task = JSC.AnyTask.New(Result, Result.runFromJS).init(result);
this.ref = .{};
this.event_loop.enqueueTaskConcurrent(JSC.ConcurrentTask.createFrom(&result.task));
this.deinit();
}
};
// Once we have bindings generator, this should be replaced with a generated function
pub fn JSPasswordObject__verify(globalObject: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const arguments_ = callframe.arguments_old(3);
const arguments = arguments_.ptr[0..arguments_.len];
if (arguments.len < 2) {
return globalObject.throwNotEnoughArguments("verify", 2, 0);
}
var algorithm: ?PasswordObject.Algorithm = null;
if (arguments.len > 2 and !arguments[2].isEmptyOrUndefinedOrNull()) {
if (!arguments[2].isString()) {
return globalObject.throwInvalidArgumentType("verify", "algorithm", "string");
}
const algorithm_string = try arguments[2].getZigString(globalObject);
algorithm = PasswordObject.Algorithm.label.getWithEql(algorithm_string, JSC.ZigString.eqlComptime) orelse {
if (!globalObject.hasException()) {
return globalObject.throwInvalidArgumentType("verify", "algorithm", unknown_password_algorithm_message);
}
return error.JSError;
};
}
// TODO: this most likely should error like `verifySync` instead of stringifying.
//
// fromJS(...) orelse {
// return globalObject.throwInvalidArgumentType("hash", "password", "string or TypedArray");
// }
const owned_password = try JSC.Node.StringOrBuffer.fromJSToOwnedSlice(globalObject, arguments[0], bun.default_allocator);
// TODO: this most likely should error like `verifySync` instead of stringifying.
//
// fromJS(...) orelse {
// return globalObject.throwInvalidArgumentType("hash", "password", "string or TypedArray");
// }
const owned_hash = JSC.Node.StringOrBuffer.fromJSToOwnedSlice(globalObject, arguments[1], bun.default_allocator) catch |err| {
bun.default_allocator.free(owned_password);
return err;
};
if (owned_hash.len == 0) {
bun.default_allocator.free(owned_password);
return JSC.JSPromise.resolvedPromiseValue(globalObject, JSC.JSValue.jsBoolean(false));
}
if (owned_password.len == 0) {
bun.default_allocator.free(owned_hash);
return JSC.JSPromise.resolvedPromiseValue(globalObject, JSC.JSValue.jsBoolean(false));
}
return verify(globalObject, owned_password, owned_hash, algorithm, false);
}
// Once we have bindings generator, this should be replaced with a generated function
pub fn JSPasswordObject__verifySync(globalObject: *JSC.JSGlobalObject, callframe: *JSC.CallFrame) bun.JSError!JSC.JSValue {
const arguments_ = callframe.arguments_old(3);
const arguments = arguments_.ptr[0..arguments_.len];
if (arguments.len < 2) {
return globalObject.throwNotEnoughArguments("verify", 2, 0);
}
var algorithm: ?PasswordObject.Algorithm = null;
if (arguments.len > 2 and !arguments[2].isEmptyOrUndefinedOrNull()) {
if (!arguments[2].isString()) {
return globalObject.throwInvalidArgumentType("verify", "algorithm", "string");
}
const algorithm_string = try arguments[2].getZigString(globalObject);
algorithm = PasswordObject.Algorithm.label.getWithEql(algorithm_string, JSC.ZigString.eqlComptime) orelse {
if (!globalObject.hasException()) {
return globalObject.throwInvalidArgumentType("verify", "algorithm", unknown_password_algorithm_message);
}
return .zero;
};
}
var password = try JSC.Node.StringOrBuffer.fromJS(globalObject, bun.default_allocator, arguments[0]) orelse {
return globalObject.throwInvalidArgumentType("verify", "password", "string or TypedArray");
};
var hash_ = try JSC.Node.StringOrBuffer.fromJS(globalObject, bun.default_allocator, arguments[1]) orelse {
password.deinit();
return globalObject.throwInvalidArgumentType("verify", "hash", "string or TypedArray");
};
defer password.deinit();
defer hash_.deinit();
if (hash_.slice().len == 0) {
return JSC.JSValue.jsBoolean(false);
}
if (password.slice().len == 0) {
return JSC.JSValue.jsBoolean(false);
}
return verify(globalObject, password.slice(), hash_.slice(), algorithm, true);
}
};
const std = @import("std");
const bun = @import("root").bun;
const string = bun.string;
const strings = bun.strings;
const MutableString = bun.MutableString;
const stringZ = bun.stringZ;
const default_allocator = bun.default_allocator;
const JSC = bun.JSC;
const Async = bun.Async;
const ZigString = JSC.ZigString;
const JSValue = JSC.JSValue;
const JSGlobalObject = JSC.JSGlobalObject;
const CallFrame = JSC.CallFrame;
const assert = bun.assert;
const unknown_password_algorithm_message = "unknown algorithm, expected one of: \"bcrypt\", \"argon2id\", \"argon2d\", \"argon2i\" (default is \"argon2id\")";


@@ -607,7 +607,8 @@ pub const FFI = struct {
return error.JSError;
}
- if (try generateSymbols(globalThis, allocator, &compile_c.symbols.map, symbols_object)) |val| {
+ // SAFETY: already checked that symbols_object is an object
+ if (try generateSymbols(globalThis, allocator, &compile_c.symbols.map, symbols_object.getObject().?)) |val| {
if (val != .zero and !globalThis.hasException())
return globalThis.throwValue(val);
return error.JSError;
@@ -663,9 +664,9 @@ pub const FFI = struct {
}
if (try object.getTruthy(globalThis, "define")) |define_value| {
- if (define_value.isObject()) {
+ if (define_value.getObject()) |define_obj| {
const Iter = JSC.JSPropertyIterator(.{ .include_value = true, .skip_empty_name = true });
- var iter = try Iter.init(globalThis, define_value);
+ var iter = try Iter.init(globalThis, define_obj);
defer iter.deinit();
while (try iter.next()) |entry| {
const key = entry.toOwnedSliceZ(allocator) catch bun.outOfMemory();
@@ -932,12 +933,11 @@ pub const FFI = struct {
}
}
- if (object.isEmptyOrUndefinedOrNull() or !object.isObject()) {
- return JSC.toInvalidArguments("Expected an options object with symbol names", .{}, global);
- }
+ if (object.isEmptyOrUndefinedOrNull()) return invalidOptionsArg(global);
+ const obj = object.getObject() orelse return invalidOptionsArg(global);
var symbols = bun.StringArrayHashMapUnmanaged(Function){};
- if (generateSymbols(global, bun.default_allocator, &symbols, object) catch JSC.JSValue.zero) |val| {
+ if (generateSymbols(global, bun.default_allocator, &symbols, obj) catch JSC.JSValue.zero) |val| {
// an error while validating symbols
for (symbols.keys()) |key| {
allocator.free(@constCast(key));
@@ -987,34 +987,20 @@ pub const FFI = struct {
return ret;
}
- // pub fn dlcompile(global: *JSGlobalObject, object: JSC.JSValue) JSValue {
- // const allocator = VirtualMachine.get().allocator;
+ /// Creates an Exception object indicating that options object is invalid.
+ /// The exception is not thrown on the VM.
+ fn invalidOptionsArg(global: *JSGlobalObject) JSValue {
+ return JSC.toInvalidArguments("Expected an options object with symbol names", .{}, global);
+ }
- // if (object.isEmptyOrUndefinedOrNull() or !object.isObject()) {
- // return JSC.toInvalidArguments("Expected an options object with symbol names", .{}, global);
- // }
- // var symbols = bun.StringArrayHashMapUnmanaged(Function){};
- // if (generateSymbols(global, &symbols, object) catch JSC.JSValue.zero) |val| {
- // // an error while validating symbols
- // for (symbols.keys()) |key| {
- // allocator.free(@constCast(key));
- // }
- // symbols.clearAndFree(allocator);
- // return val;
- // }
- // }
- pub fn open(global: *JSGlobalObject, name_str: ZigString, object: JSC.JSValue) JSC.JSValue {
+ pub fn open(global: *JSGlobalObject, name_str: ZigString, object_value: JSC.JSValue) JSC.JSValue {
JSC.markBinding(@src());
const vm = VirtualMachine.get();
var name_slice = name_str.toSlice(bun.default_allocator);
defer name_slice.deinit();
- if (object.isEmptyOrUndefinedOrNull() or !object.isObject()) {
- return JSC.toInvalidArguments("Expected an options object with symbol names", .{}, global);
- }
+ if (object_value.isEmptyOrUndefinedOrNull()) return invalidOptionsArg(global);
+ const object = object_value.getObject() orelse return invalidOptionsArg(global);
var filepath_buf: bun.PathBuffer = undefined;
const name = brk: {
@@ -1163,13 +1149,12 @@ pub const FFI = struct {
         return .undefined;
     }
-    pub fn linkSymbols(global: *JSGlobalObject, object: JSC.JSValue) JSC.JSValue {
+    pub fn linkSymbols(global: *JSGlobalObject, object_value: JSC.JSValue) JSC.JSValue {
         JSC.markBinding(@src());
         const allocator = VirtualMachine.get().allocator;
-        if (object.isEmptyOrUndefinedOrNull() or !object.isObject()) {
-            return JSC.toInvalidArguments("Expected an options object with symbol names", .{}, global);
-        }
+        if (object_value.isEmptyOrUndefinedOrNull()) return invalidOptionsArg(global);
+        const object = object_value.getObject() orelse return invalidOptionsArg(global);
         var symbols = bun.StringArrayHashMapUnmanaged(Function){};
         if (generateSymbols(global, allocator, &symbols, object) catch JSC.JSValue.zero) |val| {
@@ -1379,7 +1364,7 @@ pub const FFI = struct {
             return null;
         }
-    pub fn generateSymbols(global: *JSGlobalObject, allocator: Allocator, symbols: *bun.StringArrayHashMapUnmanaged(Function), object: JSC.JSValue) bun.JSError!?JSValue {
+    pub fn generateSymbols(global: *JSGlobalObject, allocator: Allocator, symbols: *bun.StringArrayHashMapUnmanaged(Function), object: *JSC.JSObject) bun.JSError!?JSValue {
         JSC.markBinding(@src());
         var symbols_iter = try JSC.JSPropertyIterator(.{
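The recurring change in the hunks above swaps `isObject()` boolean checks (followed by passing the untyped `JSValue` along) for a single `getObject() orelse return invalidOptionsArg(...)` unwrap that yields a typed `*JSC.JSObject`. A minimal Rust sketch of the same unwrap-once-then-pass-the-typed-handle pattern (all types here are hypothetical stand-ins, not Bun APIs):

```rust
// Hypothetical miniature of a JS value and the typed object handle.
#[allow(dead_code)]
#[derive(Debug)]
enum JsValue {
    Null,
    Undefined,
    Number(f64),
    Object(JsObject),
}

#[derive(Debug)]
struct JsObject {
    name: &'static str,
}

#[derive(Debug, PartialEq)]
enum FfiError {
    InvalidOptions, // "Expected an options object with symbol names"
}

impl JsValue {
    // Analogue of JSValue.getObject(): Some only when the value is an object.
    fn get_object(&self) -> Option<&JsObject> {
        match self {
            JsValue::Object(o) => Some(o),
            _ => None,
        }
    }
}

// Analogue of the refactored `open`: unwrap once, or bail with the shared error.
fn open(options: &JsValue) -> Result<&'static str, FfiError> {
    let object = options.get_object().ok_or(FfiError::InvalidOptions)?;
    // From here on `object` is statically an object, so helpers like
    // generateSymbols can take the typed handle instead of re-checking.
    Ok(object.name)
}

fn main() {
    assert_eq!(open(&JsValue::Object(JsObject { name: "symbols" })), Ok("symbols"));
    assert_eq!(open(&JsValue::Undefined), Err(FfiError::InvalidOptions));
}
```

The benefit mirrored here is that the "is this an object?" question is answered exactly once, at the boundary, instead of being rechecked (or silently assumed) in every callee.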


@@ -128,6 +128,7 @@ export default [
      pause: {
        fn: "doPause",
        length: 0,
+       passThis: true,
      },
      drainRequestBody: {
        fn: "drainRequestBody",
@@ -136,6 +137,7 @@ export default [
      dumpRequestBody: {
        fn: "dumpRequestBody",
        length: 0,
+       passThis: true,
      },
      resume: {
        fn: "doResume",
@@ -147,6 +149,9 @@ export default [
      aborted: {
        getter: "getAborted",
      },
+     flags: {
+       getter: "getFlags",
+     },
      finished: {
        getter: "getFinished",
      },
@@ -159,6 +164,7 @@ export default [
      ondata: {
        getter: "getOnData",
        setter: "setOnData",
+       this: true,
      },
      onabort: {
        getter: "getOnAbort",
@@ -178,6 +184,7 @@ export default [
      onwritable: {
        getter: "getOnWritable",
        setter: "setOnWritable",
+       this: true,
      },
    },
    klass: {},
File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -396,6 +396,7 @@ pub const ArrayBuffer = extern struct {
         return Stream{ .pos = 0, .buf = this.slice() };
     }
+    // TODO: this can throw an error! should use JSError!JSValue
     pub fn create(globalThis: *JSC.JSGlobalObject, bytes: []const u8, comptime kind: JSC.JSValue.JSType) JSC.JSValue {
         JSC.markBinding(@src());
         return switch (comptime kind) {


@@ -100,4 +100,11 @@ void JSVMClientData::create(VM* vm, void* bunVM)
     clientData->builtinFunctions().exportNames();
 }
+
+WebCore::HTTPHeaderIdentifiers& JSVMClientData::httpHeaderIdentifiers()
+{
+    if (!m_httpHeaderIdentifiers)
+        m_httpHeaderIdentifiers.emplace();
+    return *m_httpHeaderIdentifiers;
+}
+
 } // namespace WebCore
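The accessor added above lazily constructs the identifiers table on first call via `std::optional::emplace`. A rough Rust analogue of that lazy-init member, using `OnceCell` (names here are hypothetical stand-ins, not Bun code):

```rust
use std::cell::OnceCell;

// Hypothetical stand-in for WebCore::HTTPHeaderIdentifiers.
struct HttpHeaderIdentifiers {
    initialized: bool,
}

struct ClientData {
    // Mirrors `std::optional<HTTPHeaderIdentifiers> m_httpHeaderIdentifiers`:
    // empty until the accessor is first called.
    http_header_identifiers: OnceCell<HttpHeaderIdentifiers>,
}

impl ClientData {
    fn new() -> Self {
        ClientData { http_header_identifiers: OnceCell::new() }
    }

    // Mirrors the C++ accessor: construct on first call,
    // hand back the cached instance on every later call.
    fn http_header_identifiers(&self) -> &HttpHeaderIdentifiers {
        self.http_header_identifiers
            .get_or_init(|| HttpHeaderIdentifiers { initialized: true })
    }
}

fn main() {
    let data = ClientData::new();
    assert!(data.http_header_identifiers.get().is_none()); // not built yet
    assert!(data.http_header_identifiers().initialized); // built on first access
    let first = data.http_header_identifiers() as *const HttpHeaderIdentifiers;
    let second = data.http_header_identifiers() as *const HttpHeaderIdentifiers;
    assert_eq!(first, second); // same cached instance both times
}
```

Deferring construction this way keeps per-VM startup cheap when the headers table is never touched, which appears to be the point of making the member optional rather than eagerly initialized.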


@@ -23,7 +23,7 @@ class DOMWrapperWorld;
 #include <wtf/StdLibExtras.h>
 #include "WebCoreJSBuiltins.h"
 #include "JSCTaskScheduler.h"
+#include "HTTPHeaderIdentifiers.h"
 namespace Zig {
 }
@@ -104,6 +104,8 @@ public:
     JSC::GCClient::IsoSubspace& domBuiltinConstructorSpace() { return m_domBuiltinConstructorSpace; }
+
+    WebCore::HTTPHeaderIdentifiers& httpHeaderIdentifiers();
     template<typename Func> void forEachOutputConstraintSpace(const Func& func)
     {
         for (auto* space : m_outputConstraintSpaces)
@@ -128,6 +130,8 @@ private:
     std::unique_ptr<ExtendedDOMClientIsoSubspaces> m_clientSubspaces;
     Vector<JSC::IsoSubspace*> m_outputConstraintSpaces;
+
+    std::optional<WebCore::HTTPHeaderIdentifiers> m_httpHeaderIdentifiers;
 };
 } // namespace WebCore


@@ -33,6 +33,7 @@
     macro(embeddedFiles) \
     macro(S3Client) \
     macro(s3) \
+    macro(SourceMap) \
     macro(CSRF) \
 // --- Callbacks ---


@@ -718,6 +718,7 @@ JSC_DEFINE_HOST_FUNCTION(functionFileURLToPath, (JSC::JSGlobalObject * globalObj
 embeddedFiles BunObject_getter_wrap_embeddedFiles DontDelete|PropertyCallback
 S3Client BunObject_getter_wrap_S3Client DontDelete|PropertyCallback
 s3 BunObject_getter_wrap_s3 DontDelete|PropertyCallback
+SourceMap BunObject_getter_wrap_SourceMap DontDelete|PropertyCallback
 CSRF BunObject_getter_wrap_CSRF DontDelete|PropertyCallback
 allocUnsafe BunObject_callback_allocUnsafe DontDelete|Function 1
 argv BunObject_getter_wrap_argv DontDelete|PropertyCallback


@@ -28,7 +28,7 @@
 #include <wtf/text/WTFString.h>
 #include "BunClientData.h"
-#include "CommonJSModuleRecord.h"
+#include "JSCommonJSModule.h"
 #include "isBuiltinModule.h"
 #include "ImportMetaObject.h"

Some files were not shown because too many files have changed in this diff.