mirror of https://github.com/oven-sh/bun
synced 2026-02-25 19:17:20 +01:00

Compare commits: nektro-pat...jarred/new

22 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 817491faad | |
| | 900847fc72 | |
| | ab5432a5c6 | |
| | 636d2e486f | |
| | 83c83949ab | |
| | e3ab3c51c3 | |
| | 8c3acbaad8 | |
| | 2ba6b460e4 | |
| | 748cd82187 | |
| | 64ae0c7714 | |
| | ec1606dc55 | |
| | 35f23deadc | |
| | eddeb5dafb | |
| | 71b279ba25 | |
| | bf6904af2a | |
| | 9d33a45c33 | |
| | 26b321c659 | |
| | 0c255d9e97 | |
| | fbba4c847a | |
| | 7d9b355a4e | |
| | 488c0ccd9a | |
| | c48ab7afb9 | |
4
Makefile
@@ -533,6 +533,10 @@ api:
	$(ZIG) fmt src/api/schema.zig
	$(PRETTIER) --write src/api/schema.js
	$(PRETTIER) --write src/api/schema.d.ts
	./node_modules/.bin/peechy --schema src/api/bundle_v2.peechy --esm src/api/bundle_v2.js --ts src/api/bundle_v2.d.ts --zig src/api/bundle_v2.zig
	$(ZIG) fmt src/api/bundle_v2.zig
	$(PRETTIER) --write src/api/bundle_v2.js
	$(PRETTIER) --write src/api/bundle_v2.d.ts

node-fallbacks:
	@cd src/node-fallbacks; $(NPM_CLIENT) install; $(NPM_CLIENT) run --silent build
84
architecture.md
Normal file
@@ -0,0 +1,84 @@
# How bun works

Bun is roughly four large projects in one.

- Bun.js
- JavaScript & TypeScript parser
- Bundler
- npm package manager

### Why is bun fast?

There are a lot of reasons. Mostly, bun tries really hard to minimize memory usage and allocations, and it leverages lots of platform-specific syscalls to make things happen faster.

## Bun.js

Bun uses JavaScriptCore instead of the more commonly embedded V8.

#### Why JavaScriptCore instead of V8?

JavaScriptCore tends to have significantly better startup performance and faster native bindings, alongside lower memory usage and marginally better runtime performance at things like math.

The downsides are worse async/promise performance and lower-quality Windows support.

It's also much harder to embed, because there are no docs and it is not designed for embedding outside of WebKit/Safari. Figuring out how to embed JavaScriptCore in bun took about a month of work.

#### Zig <> JavaScriptCore interop

Zig has fantastic C support, and JavaScriptCore _does_ have a C API. It's not as fast as JavaScriptCore's internal C++ API, though.

Bun tries to use the C++ API as much as possible, but there are no automated bindings yet.

Currently, Bun has a giant [./headers.h](./src/javascript/jsc/bindings/headers.h) file which is generated by reading the types exported in [./exports.zig](./src/javascript/jsc/bindings/exports.zig). `headers.h` is run through `zig translate-c` to generate [./headers.zig](./src/javascript/jsc/bindings/headers.zig). This turns any C ABI differences between the Zig code and the C++ code into a compile error.

From there, [`bindings.cpp`](./src/javascript/jsc/bindings/bindings.cpp) contains most of the C++ bindings. Not all of these functions are currently in use – I was learning C++ as I wrote it.

The sizes and alignments of C++ types are [synced](./src/javascript/jsc/bindings/sizes.zig) from C++ to Zig, but Bun generally treats C++ types as opaque pointers and accesses them via function calls, because JavaScriptCore uses lots of runtime typing and using C++ this way means RAII doesn't happen.

For Zig-only types exposed to JavaScript, Bun uses JavaScriptCore's C API. [`JSC.NewClass`](https://github.com/Jarred-Sumner/bun/blob/89ca887ea0c0c673f1c1c22cb5913f09435feeb6/src/javascript/jsc/base.zig#L878) defines a JavaScript class. After defining the class, if the type has instance-specific data (like `Response`), the type must be added to this [`TaggedPointerUnion`](https://github.com/Jarred-Sumner/bun/blob/89ca887ea0c0c673f1c1c22cb5913f09435feeb6/src/javascript/jsc/base.zig#L2572). Every pointer from Zig to JavaScript is tagged with a type, and that tag is checked for validity. This is a security measure that also helps with debugging.

Eventually, Bun should move to using JavaScriptCore's C++ API, with bindings generated similarly to Web IDL (except potentially from Zig).

##### Strings between Zig <> JSC

JavaScriptCore has many ways to represent a string, mostly contained in `WTF::String`. Strings are either 16-bit or 8-bit: 16-bit strings are UTF-16, 8-bit strings are Latin-1, and UTF-8 strings are not supported. A `WTF::String` wraps a `WTF::StringImpl`, which has subclasses supporting externally managed memory, statically allocated strings (`AtomicString`), and garbage-collected strings. It is a reference-counted type (like many in JSC).

Most of the time, strings going from Zig to JSC are cloned – both the `WTF::StringImpl` and the underlying bytes. This has a performance cost, but it makes string lifetimes much simpler.

However, strings going from JSC to Zig are usually not cloned, unless they are UTF-16 and conversion from UTF-16 to UTF-8 is necessary. In many cases, reading strings in bun.js is faster than reading typed arrays.

### Bun's JavaScript parser is deeply integrated into the runtime

When you import a file in Bun.js, Bun's node_modules resolver runs, and the JavaScript/TypeScript parser always runs when fetching the code, even if the file technically might not need to be transpiled.

This is for ecosystem compatibility: it lets many npm packages "just work" despite Bun not being the same runtime as Node.js, and it enables behavior like tsconfig.json `"paths"` to work with zero additional configuration.
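As an illustration of that zero-config behavior, here is a hypothetical project's tsconfig.json using `"paths"` (the alias name and directory layout are made up for the example):

```json
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@components/*": ["src/components/*"]
    }
  }
}
```

With a config like this, an import such as `import Button from "@components/Button"` can resolve at runtime without any separate bundler or loader configuration.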
### Embedding WebCore

[WebCore](https://github.com/WebKit/webkit/tree/main/Source/WebCore) is the web browser part of WebKit. Bun embeds parts of WebCore and exposes them to JavaScript – most of these classes are in the [`webcore`](./src/javascript/jsc/bindings/webcore/) folder.

Currently, these are manually copy-pasted and slightly edited (mostly to comment out code not relevant to bun). It is not simple. Longer-term, it would be better to statically link WebCore, or to consider turning this usage into a WebKit port (similar to how PlayStation embeds WebKit).

## JavaScript & TypeScript Parser

The design was initially modeled after [esbuild](https://github.com/evanw/esbuild/blob/main/docs/architecture.md), and many of esbuild's comments are in bun's codebase (@evanw writes fantastic documentation). [The AST](./src/js_ast.zig) data layout is still similar to esbuild's; however, finer-grained control over memory management with Zig led to better performance.

## Two passes

1. Parsing
2. Visiting

#### Hot Module Reloading

#### Memory management

Statements and expressions are allocated in blocks of contiguous memory.

```zig
const Block = struct {
    used: SizeType = 0,
    items: [count]UnionValueType align(MaxAlign) = undefined,
};
```

At the time of writing, several AST nodes are stack allocated.
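The block-allocation idea above can be sketched outside Zig as well. The following is a minimal, hypothetical JavaScript pool – the block size, class names, and bump-pointer strategy are illustrative assumptions, not Bun's actual allocator:

```javascript
// Sketch of block allocation: nodes are handed out from preallocated
// fixed-size blocks, so most "allocations" are just a counter bump
// rather than an individual heap allocation.
const COUNT = 4; // nodes per block (small for illustration)

class Block {
  constructor() {
    this.used = 0;
    this.items = new Array(COUNT).fill(null).map(() => ({}));
  }
  alloc() {
    return this.used < COUNT ? this.items[this.used++] : null;
  }
}

class Pool {
  constructor() {
    this.blocks = [new Block()];
  }
  alloc() {
    let node = this.blocks[this.blocks.length - 1].alloc();
    if (node === null) {
      // The current block is full; start a new contiguous block.
      this.blocks.push(new Block());
      node = this.blocks[this.blocks.length - 1].alloc();
    }
    return node;
  }
}

const pool = new Pool();
for (let i = 0; i < 6; i++) pool.alloc();
// 6 allocations with COUNT = 4 span two blocks.
```

Freeing everything at once is then just dropping the blocks, which matches how a parser typically discards an entire AST at the end of a compilation.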
62
bench/internal/native-overhead.js
Normal file
@@ -0,0 +1,62 @@
const count = 999999;

function bench(label, cb) {
  console.time(label);
  cb();
  console.timeEnd(label);
}

// Warm up the loop before measuring anything.
for (let i = 0; i < count; i++) {}

bench("globalThis.Bun (C API)", () => {
  for (let i = 0; i < count; i++) {
    globalThis.Bun;
  }
});

bench("Bun (C API)", () => {
  for (let i = 0; i < count; i++) {
    Bun;
  }
});

var Bun = globalThis.Bun;

bench("Bun.gc (C API -> C API)", () => {
  for (let i = 0; i < count; i++) {
    Bun.gc;
  }
});

bench("Bun.gc copied to local variable", () => {
  var gc = Bun.gc;
  for (let i = 0; i < count; i++) {
    gc = gc;
  }
});

bench("process.version (C++ Custom Accessor)", () => {
  for (let i = 0; i < count; i++) {
    process.version;
  }
});

bench("process.env (C++ Custom Accessor -> C API)", () => {
  for (let i = 0; i < count; i++) {
    process.env;
  }
});

bench("C++ putDirect", () => {
  for (let i = 0; i < count; i++) {
    Event.AT_TARGET;
  }
});

bench("Inline", () => {
  var inline = { foo: 123 };

  for (let i = 0; i < count; i++) {
    inline.foo;
  }
});
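For reference, the `bench()` helper above only prints via `console.time`/`console.timeEnd`. A variant that also returns the elapsed time (a sketch, not part of the benchmark file; the helper name and workload are made up) looks like this:

```javascript
// Variant of the bench() helper that returns elapsed milliseconds so
// results can be compared programmatically as well as printed.
function benchMs(label, cb) {
  const start = Date.now();
  cb();
  const elapsed = Date.now() - start;
  console.log(`${label}: ${elapsed}ms`);
  return elapsed;
}

let sum = 0;
const elapsed = benchMs("sum to 1e6", () => {
  for (let i = 0; i < 1e6; i++) sum += i;
});
```

One caveat with micro-benchmarks like these: a sufficiently smart JIT may eliminate a property read whose result is unused, so timings should be treated as relative indicators rather than absolute costs.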
104
src/api/bundle_v2.d.ts
vendored
Normal file
@@ -0,0 +1,104 @@
import type { ByteBuffer } from "peechy";

type byte = number;
type float = number;
type int = number;
type alphanumeric = string;
type uint = number;
type int8 = number;
type lowp = number;
type int16 = number;
type int32 = number;
type float32 = number;
type uint16 = number;
type uint32 = number;
export interface StringPointer {
  offset: uint32;
  length: uint32;
}

export interface JavascriptBundledPart {
  code: StringPointer;
  dependencies_offset: uint32;
  dependencies_length: uint32;
  exports_offset: uint32;
  exports_length: uint32;
  from_module: uint32;
}

export interface JavascriptBundledModule {
  path: StringPointer;
  parts_offset: uint32;
  parts_length: uint32;
  exports_offset: uint32;
  exports_length: uint32;
  package_id: uint32;
  path_extname_length: byte;
}

export interface JavascriptBundledPackage {
  name: StringPointer;
  version: StringPointer;
  hash: uint32;
  modules_offset: uint32;
  modules_length: uint32;
}

export interface JavascriptBundle {
  modules: JavascriptBundledModule[];
  packages: JavascriptBundledPackage[];
  parts: JavascriptBundledPart[];
  export_names: StringPointer[];
  export_parts: Uint32Array;
  etag: Uint8Array;
  generated_at: uint32;
  import_from_name: Uint8Array;
  manifest_string: Uint8Array;
}

export interface JavascriptBundleContainer {
  bundle_format_version?: uint32;
  bundle?: JavascriptBundle;
  code_length?: uint32;
}

export declare function encodeStringPointer(
  message: StringPointer,
  bb: ByteBuffer
): void;
export declare function decodeStringPointer(buffer: ByteBuffer): StringPointer;
export declare function encodeJavascriptBundledPart(
  message: JavascriptBundledPart,
  bb: ByteBuffer
): void;
export declare function decodeJavascriptBundledPart(
  buffer: ByteBuffer
): JavascriptBundledPart;
export declare function encodeJavascriptBundledModule(
  message: JavascriptBundledModule,
  bb: ByteBuffer
): void;
export declare function decodeJavascriptBundledModule(
  buffer: ByteBuffer
): JavascriptBundledModule;
export declare function encodeJavascriptBundledPackage(
  message: JavascriptBundledPackage,
  bb: ByteBuffer
): void;
export declare function decodeJavascriptBundledPackage(
  buffer: ByteBuffer
): JavascriptBundledPackage;
export declare function encodeJavascriptBundle(
  message: JavascriptBundle,
  bb: ByteBuffer
): void;
export declare function decodeJavascriptBundle(
  buffer: ByteBuffer
): JavascriptBundle;
export declare function encodeJavascriptBundleContainer(
  message: JavascriptBundleContainer,
  bb: ByteBuffer
): void;
export declare function decodeJavascriptBundleContainer(
  buffer: ByteBuffer
): JavascriptBundleContainer;
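The `StringPointer` type above is just an `{ offset, length }` pair; presumably it indexes into a shared string blob rather than storing each string inline. A hypothetical illustration (the blob contents here are made up):

```javascript
// StringPointer-style lookup: strings live in one shared blob, and each
// reference to a string is just an { offset, length } pair into it.
const blob = "react17.0.2lodash4.17.21";

function readString(pointer) {
  return blob.slice(pointer.offset, pointer.offset + pointer.length);
}

const name = readString({ offset: 0, length: 5 });    // "react"
const version = readString({ offset: 5, length: 6 }); // "17.0.2"
```

Storing pointers instead of strings keeps the fixed-size structs above trivially serializable and lets many entries share one allocation.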
365
src/api/bundle_v2.js
Normal file
@@ -0,0 +1,365 @@
function decodeStringPointer(bb) {
  var result = {};

  result["offset"] = bb.readUint32();
  result["length"] = bb.readUint32();
  return result;
}

function encodeStringPointer(message, bb) {
  var value = message["offset"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "offset"');
  }

  var value = message["length"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "length"');
  }
}

function decodeJavascriptBundledPart(bb) {
  var result = {};

  result["code"] = decodeStringPointer(bb);
  result["dependencies_offset"] = bb.readUint32();
  result["dependencies_length"] = bb.readUint32();
  result["exports_offset"] = bb.readUint32();
  result["exports_length"] = bb.readUint32();
  result["from_module"] = bb.readUint32();
  return result;
}

function encodeJavascriptBundledPart(message, bb) {
  var value = message["code"];
  if (value != null) {
    encodeStringPointer(value, bb);
  } else {
    throw new Error('Missing required field "code"');
  }

  var value = message["dependencies_offset"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "dependencies_offset"');
  }

  var value = message["dependencies_length"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "dependencies_length"');
  }

  var value = message["exports_offset"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "exports_offset"');
  }

  var value = message["exports_length"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "exports_length"');
  }

  var value = message["from_module"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "from_module"');
  }
}

function decodeJavascriptBundledModule(bb) {
  var result = {};

  result["path"] = decodeStringPointer(bb);
  result["parts_offset"] = bb.readUint32();
  result["parts_length"] = bb.readUint32();
  result["exports_offset"] = bb.readUint32();
  result["exports_length"] = bb.readUint32();
  result["package_id"] = bb.readUint32();
  result["path_extname_length"] = bb.readByte();
  return result;
}

function encodeJavascriptBundledModule(message, bb) {
  var value = message["path"];
  if (value != null) {
    encodeStringPointer(value, bb);
  } else {
    throw new Error('Missing required field "path"');
  }

  var value = message["parts_offset"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "parts_offset"');
  }

  var value = message["parts_length"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "parts_length"');
  }

  var value = message["exports_offset"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "exports_offset"');
  }

  var value = message["exports_length"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "exports_length"');
  }

  var value = message["package_id"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "package_id"');
  }

  var value = message["path_extname_length"];
  if (value != null) {
    bb.writeByte(value);
  } else {
    throw new Error('Missing required field "path_extname_length"');
  }
}

function decodeJavascriptBundledPackage(bb) {
  var result = {};

  result["name"] = decodeStringPointer(bb);
  result["version"] = decodeStringPointer(bb);
  result["hash"] = bb.readUint32();
  result["modules_offset"] = bb.readUint32();
  result["modules_length"] = bb.readUint32();
  return result;
}

function encodeJavascriptBundledPackage(message, bb) {
  var value = message["name"];
  if (value != null) {
    encodeStringPointer(value, bb);
  } else {
    throw new Error('Missing required field "name"');
  }

  var value = message["version"];
  if (value != null) {
    encodeStringPointer(value, bb);
  } else {
    throw new Error('Missing required field "version"');
  }

  var value = message["hash"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "hash"');
  }

  var value = message["modules_offset"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "modules_offset"');
  }

  var value = message["modules_length"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "modules_length"');
  }
}

function decodeJavascriptBundle(bb) {
  var result = {};

  var length = bb.readVarUint();
  var values = (result["modules"] = Array(length));
  for (var i = 0; i < length; i++)
    values[i] = decodeJavascriptBundledModule(bb);
  var length = bb.readVarUint();
  var values = (result["packages"] = Array(length));
  for (var i = 0; i < length; i++)
    values[i] = decodeJavascriptBundledPackage(bb);
  var length = bb.readVarUint();
  var values = (result["parts"] = Array(length));
  for (var i = 0; i < length; i++) values[i] = decodeJavascriptBundledPart(bb);
  var length = bb.readVarUint();
  var values = (result["export_names"] = Array(length));
  for (var i = 0; i < length; i++) values[i] = decodeStringPointer(bb);
  result["export_parts"] = bb.readUint32ByteArray();
  result["etag"] = bb.readByteArray();
  result["generated_at"] = bb.readUint32();
  result["import_from_name"] = bb.readByteArray();
  result["manifest_string"] = bb.readByteArray();
  return result;
}

function encodeJavascriptBundle(message, bb) {
  var value = message["modules"];
  if (value != null) {
    var values = value,
      n = values.length;
    bb.writeVarUint(n);
    for (var i = 0; i < n; i++) {
      value = values[i];
      encodeJavascriptBundledModule(value, bb);
    }
  } else {
    throw new Error('Missing required field "modules"');
  }

  var value = message["packages"];
  if (value != null) {
    var values = value,
      n = values.length;
    bb.writeVarUint(n);
    for (var i = 0; i < n; i++) {
      value = values[i];
      encodeJavascriptBundledPackage(value, bb);
    }
  } else {
    throw new Error('Missing required field "packages"');
  }

  var value = message["parts"];
  if (value != null) {
    var values = value,
      n = values.length;
    bb.writeVarUint(n);
    for (var i = 0; i < n; i++) {
      value = values[i];
      encodeJavascriptBundledPart(value, bb);
    }
  } else {
    throw new Error('Missing required field "parts"');
  }

  var value = message["export_names"];
  if (value != null) {
    var values = value,
      n = values.length;
    bb.writeVarUint(n);
    for (var i = 0; i < n; i++) {
      value = values[i];
      encodeStringPointer(value, bb);
    }
  } else {
    throw new Error('Missing required field "export_names"');
  }

  var value = message["export_parts"];
  if (value != null) {
    bb.writeUint32ByteArray(value);
  } else {
    throw new Error('Missing required field "export_parts"');
  }

  var value = message["etag"];
  if (value != null) {
    bb.writeByteArray(value);
  } else {
    throw new Error('Missing required field "etag"');
  }

  var value = message["generated_at"];
  if (value != null) {
    bb.writeUint32(value);
  } else {
    throw new Error('Missing required field "generated_at"');
  }

  var value = message["import_from_name"];
  if (value != null) {
    bb.writeByteArray(value);
  } else {
    throw new Error('Missing required field "import_from_name"');
  }

  var value = message["manifest_string"];
  if (value != null) {
    bb.writeByteArray(value);
  } else {
    throw new Error('Missing required field "manifest_string"');
  }
}

function decodeJavascriptBundleContainer(bb) {
  var result = {};

  while (true) {
    switch (bb.readByte()) {
      case 0:
        return result;

      case 1:
        result["bundle_format_version"] = bb.readUint32();
        break;

      case 2:
        result["bundle"] = decodeJavascriptBundle(bb);
        break;

      case 3:
        result["code_length"] = bb.readUint32();
        break;

      default:
        throw new Error("Attempted to parse invalid message");
    }
  }
}

function encodeJavascriptBundleContainer(message, bb) {
  var value = message["bundle_format_version"];
  if (value != null) {
    bb.writeByte(1);
    bb.writeUint32(value);
  }

  var value = message["bundle"];
  if (value != null) {
    bb.writeByte(2);
    encodeJavascriptBundle(value, bb);
  }

  var value = message["code_length"];
  if (value != null) {
    bb.writeByte(3);
    bb.writeUint32(value);
  }
  bb.writeByte(0);
}

export { decodeStringPointer };
export { encodeStringPointer };
export { decodeJavascriptBundledPart };
export { encodeJavascriptBundledPart };
export { decodeJavascriptBundledModule };
export { encodeJavascriptBundledModule };
export { decodeJavascriptBundledPackage };
export { encodeJavascriptBundledPackage };
export { decodeJavascriptBundle };
export { encodeJavascriptBundle };
export { decodeJavascriptBundleContainer };
export { encodeJavascriptBundleContainer };
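All of the encode/decode pairs above funnel through peechy's `ByteBuffer`. The following minimal stand-in is a hypothetical sketch, not peechy's API (the real `ByteBuffer` also handles var-ints, byte arrays, and buffer growth); it is just enough to show the `StringPointer` wire format of two little-endian uint32s:

```javascript
// Hypothetical minimal stand-in for a ByteBuffer, enough to round-trip
// a StringPointer ({ offset, length } as two little-endian uint32s).
class MiniByteBuffer {
  constructor() {
    this.bytes = [];
    this.index = 0;
  }
  writeUint32(value) {
    for (let shift = 0; shift < 32; shift += 8) {
      this.bytes.push((value >>> shift) & 0xff); // little-endian
    }
  }
  readUint32() {
    let value = 0;
    for (let shift = 0; shift < 32; shift += 8) {
      value |= this.bytes[this.index++] << shift;
    }
    return value >>> 0;
  }
}

const bb = new MiniByteBuffer();
// encodeStringPointer writes offset then length as uint32s:
bb.writeUint32(128);
bb.writeUint32(16);
// decodeStringPointer reads them back in the same order:
const ptr = { offset: bb.readUint32(), length: bb.readUint32() };
```

Because every field is written and read in a fixed order, the decoder needs no field names on the wire for structs; only `message` types (like `JavascriptBundleContainer`) carry field IDs.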
@@ -1,8 +1,9 @@
struct Export {
  uint32 part_id;
  StringPointer name;
}
package BundleV2;

struct StringPointer {
  uint32 offset;
  uint32 length;
}

struct JavascriptBundledPart {
  StringPointer code;
@@ -14,9 +15,6 @@ struct JavascriptBundledPart {
  uint32 exports_length;

  uint32 from_module;
-
-  // The ESM export is this id ("$" + number.toString(16))
-  uint32 id;
}

struct JavascriptBundledModule {
@@ -46,9 +44,11 @@ struct JavascriptBundledPackage {
}

struct JavascriptBundle {
  // These are sorted alphabetically so you can do binary search
  JavascriptBundledModule[] modules;
  JavascriptBundledPackage[] packages;
  JavascriptBundledPart[] parts;
  StringPointer[] export_names;
  uint32[] export_parts;

  // This is ASCII-encoded so you can send it directly over HTTP
  byte[] etag;
@@ -65,11 +65,11 @@ message JavascriptBundleContainer {
  uint32 bundle_format_version = 1;

  // These go first so if we change JavaScriptBundle we can still read these
-  LoadedRouteConfig routes = 3;
-  LoadedFramework framework = 2;
+  //LoadedRouteConfig routes = 3;
+  //LoadedFramework framework = 2;

-  JavascriptBundle bundle = 4;
+  JavascriptBundle bundle = 2;

  // Don't technically need to store this, but it may be helpful as a sanity check
-  uint32 code_length = 5;
+  uint32 code_length = 3;
}
585
src/api/bundle_v2.zig
Normal file
585
src/api/bundle_v2.zig
Normal file
@@ -0,0 +1,585 @@
|
||||
const std = @import("std");
|
||||
|
||||
pub const Reader = struct {
|
||||
const Self = @This();
|
||||
pub const ReadError = error{EOF};
|
||||
|
||||
buf: []u8,
|
||||
remain: []u8,
|
||||
allocator: std.mem.Allocator,
|
||||
|
||||
pub fn init(buf: []u8, allocator: std.mem.Allocator) Reader {
|
||||
return Reader{
|
||||
.buf = buf,
|
||||
.remain = buf,
|
||||
.allocator = allocator,
|
||||
};
|
||||
}
|
||||
|
||||
pub fn read(this: *Self, count: usize) ![]u8 {
|
||||
const read_count = @minimum(count, this.remain.len);
|
||||
if (read_count < count) {
|
||||
return error.EOF;
|
||||
}
|
||||
|
||||
var slice = this.remain[0..read_count];
|
||||
|
||||
this.remain = this.remain[read_count..];
|
||||
|
||||
return slice;
|
||||
}
|
||||
|
||||
pub inline fn readAs(this: *Self, comptime T: type) !T {
|
||||
if (!std.meta.trait.hasUniqueRepresentation(T)) {
|
||||
@compileError(@typeName(T) ++ " must have unique representation.");
|
||||
}
|
||||
|
||||
return std.mem.bytesAsValue(T, try this.read(@sizeOf(T)));
|
||||
}
|
||||
|
||||
pub inline fn readByte(this: *Self) !u8 {
|
||||
return (try this.read(1))[0];
|
||||
}
|
||||
|
||||
pub fn readEnum(this: *Self, comptime Enum: type) !Enum {
|
||||
const E = error{
|
||||
/// An integer was read, but it did not match any of the tags in the supplied enum.
|
||||
InvalidValue,
|
||||
};
|
||||
const type_info = @typeInfo(Enum).Enum;
|
||||
const tag = try this.readInt(type_info.tag_type);
|
||||
|
||||
inline for (std.meta.fields(Enum)) |field| {
|
||||
if (tag == field.value) {
|
||||
return @field(Enum, field.name);
|
||||
}
|
||||
}
|
||||
|
||||
return E.InvalidValue;
|
||||
}
|
||||
|
||||
pub inline fn readArray(this: *Self, comptime T: type) ![]const T {
|
||||
const length = try this.readInt(u32);
|
||||
if (length == 0) {
|
||||
return &([_]T{});
|
||||
}
|
||||
|
||||
switch (comptime T) {
|
||||
u8 => {
|
||||
return try this.read(length);
|
||||
},
|
||||
u16, u32, i8, i16, i32 => {
|
||||
return std.mem.readIntSliceNative(T, this.read(length * @sizeOf(T)));
|
||||
},
|
||||
[:0]const u8, []const u8 => {
|
||||
var i: u32 = 0;
|
||||
var array = try this.allocator.alloc(T, length);
|
||||
while (i < length) : (i += 1) {
|
||||
array[i] = try this.readArray(u8);
|
||||
}
|
||||
return array;
|
||||
},
|
||||
else => {
|
||||
switch (comptime @typeInfo(T)) {
|
||||
.Struct => |Struct| {
|
||||
switch (Struct.layout) {
|
||||
.Packed => {
|
||||
const sizeof = @sizeOf(T);
|
||||
var slice = try this.read(sizeof * length);
|
||||
return std.mem.bytesAsSlice(T, slice);
|
||||
},
|
||||
else => {},
|
||||
}
|
||||
},
|
||||
.Enum => |type_info| {
|
||||
const enum_values = try this.read(length * @sizeOf(type_info.tag_type));
|
||||
return @ptrCast([*]T, enum_values.ptr)[0..length];
|
||||
},
|
||||
else => {},
|
||||
}
|
||||
|
||||
var i: u32 = 0;
|
||||
var array = try this.allocator.alloc(T, length);
|
||||
while (i < length) : (i += 1) {
|
||||
array[i] = try this.readValue(T);
|
||||
}
|
||||
|
||||
return array;
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
pub inline fn readByteArray(this: *Self) ![]u8 {
|
||||
const length = try this.readInt(u32);
|
||||
if (length == 0) {
|
||||
return &([_]u8{});
|
||||
}
|
||||
|
||||
return try this.read(@as(usize, length));
|
||||
}
|
||||
|
||||
pub inline fn readInt(this: *Self, comptime T: type) !T {
|
||||
var slice = try this.read(@sizeOf(T));
|
||||
|
||||
return std.mem.readIntSliceNative(T, slice);
|
||||
}
|
||||
|
||||
pub inline fn readBool(this: *Self) !bool {
|
||||
return (try this.readByte()) > 0;
|
||||
}
|
||||
|
||||
pub inline fn readValue(this: *Self, comptime T: type) !T {
|
||||
switch (comptime T) {
|
||||
bool => {
|
||||
return try this.readBool();
|
||||
},
|
||||
u8 => {
|
||||
return try this.readByte();
|
||||
},
|
||||
[*:0]const u8, [:0]const u8, []const u8 => {
|
||||
return try this.readArray(u8);
|
||||
},
|
||||
|
||||
[]const [:0]const u8, []const [*:0]const u8, []const []const u8 => {
|
||||
return try this.readArray([]const u8);
|
||||
},
|
||||
[]u8, [:0]u8, [*:0]u8 => {
|
||||
return try this.readArray([]u8);
|
||||
},
|
||||
u16, u32, i8, i16, i32 => {
|
||||
return std.mem.readIntSliceNative(T, try this.read(@sizeOf(T)));
|
||||
},
|
||||
else => {
|
||||
switch (comptime @typeInfo(T)) {
|
||||
.Struct => |Struct| {
|
||||
switch (Struct.layout) {
|
||||
.Packed => {
|
||||
const sizeof = @sizeOf(T);
|
||||
var slice = try this.read(sizeof);
|
||||
return @ptrCast(*T, slice[0..sizeof]).*;
|
||||
},
|
||||
else => {},
|
||||
}
|
||||
},
|
||||
.Enum => {
|
||||
return try this.readEnum(T);
|
||||
},
|
||||
else => {},
|
||||
}
|
||||
|
||||
return try T.decode(this);
|
||||
},
|
||||
}
|
||||
|
||||
@compileError("Invalid type passed to readValue");
|
||||
}
|
||||
};
|
||||
|
||||
pub fn Writer(comptime WritableStream: type) type {
    return struct {
        const Self = @This();
        writable: WritableStream,

        pub fn init(writable: WritableStream) Self {
            return Self{ .writable = writable };
        }

        pub inline fn write(this: *Self, bytes: anytype) !void {
            _ = try this.writable.write(bytes);
        }

        pub inline fn writeByte(this: *Self, byte: u8) !void {
            _ = try this.writable.write(&[1]u8{byte});
        }

        pub inline fn writeInt(this: *Self, int: anytype) !void {
            try this.write(std.mem.asBytes(&int));
        }

        pub inline fn writeFieldID(this: *Self, comptime id: comptime_int) !void {
            try this.writeByte(id);
        }

        pub inline fn writeEnum(this: *Self, val: anytype) !void {
            try this.writeInt(@enumToInt(val));
        }

        pub fn writeValue(this: *Self, comptime SliceType: type, slice: SliceType) !void {
            switch (SliceType) {
                []u16,
                []u32,
                []i16,
                []i32,
                []i8,
                []const u16,
                []const u32,
                []const i16,
                []const i32,
                []const i8,
                [:0]u16,
                [:0]u32,
                [:0]i16,
                [:0]i32,
                [:0]i8,
                [:0]const u16,
                [:0]const u32,
                [:0]const i16,
                [:0]const i32,
                [:0]const i8,
                [*:0]u16,
                [*:0]u32,
                [*:0]i16,
                [*:0]i32,
                [*:0]i8,
                [*:0]const u16,
                [*:0]const u32,
                [*:0]const i16,
                [*:0]const i32,
                [*:0]const i8,
                => {
                    try this.writeArray(SliceType, slice);
                },

                []u8,
                []const u8,
                [:0]u8,
                [:0]const u8,
                [*:0]u8,
                [*:0]const u8,
                => {
                    try this.writeArray(u8, slice);
                },

                u8 => {
                    try this.write(slice);
                },
                u16, u32, i16, i32, i8 => {
                    try this.write(std.mem.asBytes(slice));
                },

                else => {
                    try slice.encode(this);
                },
            }
        }

        pub inline fn writeArray(this: *Self, comptime T: type, slice: anytype) !void {
            try this.writeInt(@truncate(u32, slice.len));

            switch (T) {
                u8 => {
                    try this.write(slice);
                },
                u16, u32, i16, i32, i8 => {
                    try this.write(std.mem.asBytes(slice));
                },
                [:0]u8,
                []u8,
                []u16,
                []u32,
                []i16,
                []i32,
                []i8,
                []const u8,
                [:0]const u8,
                []const u16,
                []const u32,
                []const i16,
                []const i32,
                []const i8,
                [:0]u16,
                [:0]u32,
                [:0]i16,
                [:0]i32,
                [:0]i8,
                [:0]const u16,
                [:0]const u32,
                [:0]const i16,
                [:0]const i32,
                [:0]const i8,
                [*:0]u16,
                [*:0]u32,
                [*:0]i16,
                [*:0]i32,
                [*:0]i8,
                [*:0]const u16,
                [*:0]const u32,
                [*:0]const i16,
                [*:0]const i32,
                [*:0]const i8,
                => {
                    for (slice) |num_slice| {
                        try this.writeArray(std.meta.Child(@TypeOf(num_slice)), num_slice);
                    }
                },
                else => {
                    for (slice) |val| {
                        try val.encode(this);
                    }
                },
            }
        }

        pub inline fn endMessage(this: *Self) !void {
            try this.writeByte(0);
        }
    };
}

pub const ByteWriter = Writer(*std.io.FixedBufferStream([]u8));
pub const FileWriter = Writer(std.fs.File);
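The generated writer above drives a simple field-ID/terminator binary format. A minimal usage sketch, not part of the diff (the buffer size and values are illustrative, and the expected byte layout assumes a little-endian target):

```zig
const std = @import("std");

test "encode a field with ByteWriter" {
    var buf: [64]u8 = undefined;
    var stream = std.io.fixedBufferStream(&buf);
    var writer = ByteWriter.init(&stream);

    try writer.writeFieldID(1); // field ID byte
    try writer.writeInt(@as(u32, 2)); // raw little-endian u32
    try writer.endMessage(); // 0 byte terminates the message

    // Layout: [field id][u32 value][terminator]
    try std.testing.expectEqualSlices(u8, &[_]u8{ 1, 2, 0, 0, 0, 0 }, stream.getWritten());
}
```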
pub const BundleV2 = struct {
    pub const StringPointer = packed struct {
        /// offset
        offset: u32 = 0,

        /// length
        length: u32 = 0,

        pub fn decode(reader: anytype) anyerror!StringPointer {
            var this = std.mem.zeroes(StringPointer);

            this.offset = try reader.readValue(u32);
            this.length = try reader.readValue(u32);
            return this;
        }

        pub fn encode(this: *const @This(), writer: anytype) anyerror!void {
            try writer.writeInt(this.offset);
            try writer.writeInt(this.length);
        }
    };

    pub const JavascriptBundledPart = struct {
        /// code
        code: StringPointer,

        /// dependencies_offset
        dependencies_offset: u32 = 0,

        /// dependencies_length
        dependencies_length: u32 = 0,

        /// exports_offset
        exports_offset: u32 = 0,

        /// exports_length
        exports_length: u32 = 0,

        /// from_module
        from_module: u32 = 0,

        pub fn decode(reader: anytype) anyerror!JavascriptBundledPart {
            var this = std.mem.zeroes(JavascriptBundledPart);

            this.code = try reader.readValue(StringPointer);
            this.dependencies_offset = try reader.readValue(u32);
            this.dependencies_length = try reader.readValue(u32);
            this.exports_offset = try reader.readValue(u32);
            this.exports_length = try reader.readValue(u32);
            this.from_module = try reader.readValue(u32);
            return this;
        }

        pub fn encode(this: *const @This(), writer: anytype) anyerror!void {
            try writer.writeValue(@TypeOf(this.code), this.code);
            try writer.writeInt(this.dependencies_offset);
            try writer.writeInt(this.dependencies_length);
            try writer.writeInt(this.exports_offset);
            try writer.writeInt(this.exports_length);
            try writer.writeInt(this.from_module);
        }
    };

    pub const JavascriptBundledModule = struct {
        /// path
        path: StringPointer,

        /// parts_offset
        parts_offset: u32 = 0,

        /// parts_length
        parts_length: u32 = 0,

        /// exports_offset
        exports_offset: u32 = 0,

        /// exports_length
        exports_length: u32 = 0,

        /// package_id
        package_id: u32 = 0,

        /// path_extname_length
        path_extname_length: u8 = 0,

        pub fn decode(reader: anytype) anyerror!JavascriptBundledModule {
            var this = std.mem.zeroes(JavascriptBundledModule);

            this.path = try reader.readValue(StringPointer);
            this.parts_offset = try reader.readValue(u32);
            this.parts_length = try reader.readValue(u32);
            this.exports_offset = try reader.readValue(u32);
            this.exports_length = try reader.readValue(u32);
            this.package_id = try reader.readValue(u32);
            this.path_extname_length = try reader.readValue(u8);
            return this;
        }

        pub fn encode(this: *const @This(), writer: anytype) anyerror!void {
            try writer.writeValue(@TypeOf(this.path), this.path);
            try writer.writeInt(this.parts_offset);
            try writer.writeInt(this.parts_length);
            try writer.writeInt(this.exports_offset);
            try writer.writeInt(this.exports_length);
            try writer.writeInt(this.package_id);
            try writer.writeInt(this.path_extname_length);
        }
    };

    pub const JavascriptBundledPackage = struct {
        /// name
        name: StringPointer,

        /// version
        version: StringPointer,

        /// hash
        hash: u32 = 0,

        /// modules_offset
        modules_offset: u32 = 0,

        /// modules_length
        modules_length: u32 = 0,

        pub fn decode(reader: anytype) anyerror!JavascriptBundledPackage {
            var this = std.mem.zeroes(JavascriptBundledPackage);

            this.name = try reader.readValue(StringPointer);
            this.version = try reader.readValue(StringPointer);
            this.hash = try reader.readValue(u32);
            this.modules_offset = try reader.readValue(u32);
            this.modules_length = try reader.readValue(u32);
            return this;
        }

        pub fn encode(this: *const @This(), writer: anytype) anyerror!void {
            try writer.writeValue(@TypeOf(this.name), this.name);
            try writer.writeValue(@TypeOf(this.version), this.version);
            try writer.writeInt(this.hash);
            try writer.writeInt(this.modules_offset);
            try writer.writeInt(this.modules_length);
        }
    };

    pub const JavascriptBundle = struct {
        /// modules
        modules: []const JavascriptBundledModule,

        /// packages
        packages: []const JavascriptBundledPackage,

        /// parts
        parts: []const JavascriptBundledPart,

        /// export_names
        export_names: []const StringPointer,

        /// export_parts
        export_parts: []const u32,

        /// etag
        etag: []const u8,

        /// generated_at
        generated_at: u32 = 0,

        /// import_from_name
        import_from_name: []const u8,

        /// manifest_string
        manifest_string: []const u8,

        pub fn decode(reader: anytype) anyerror!JavascriptBundle {
            var this = std.mem.zeroes(JavascriptBundle);

            this.modules = try reader.readArray(JavascriptBundledModule);
            this.packages = try reader.readArray(JavascriptBundledPackage);
            this.parts = try reader.readArray(JavascriptBundledPart);
            this.export_names = try reader.readArray(StringPointer);
            this.export_parts = try reader.readArray(u32);
            this.etag = try reader.readArray(u8);
            this.generated_at = try reader.readValue(u32);
            this.import_from_name = try reader.readArray(u8);
            this.manifest_string = try reader.readArray(u8);
            return this;
        }

        pub fn encode(this: *const @This(), writer: anytype) anyerror!void {
            try writer.writeArray(JavascriptBundledModule, this.modules);
            try writer.writeArray(JavascriptBundledPackage, this.packages);
            try writer.writeArray(JavascriptBundledPart, this.parts);
            try writer.writeArray(StringPointer, this.export_names);
            try writer.writeArray(u32, this.export_parts);
            try writer.writeArray(u8, this.etag);
            try writer.writeInt(this.generated_at);
            try writer.writeArray(u8, this.import_from_name);
            try writer.writeArray(u8, this.manifest_string);
        }
    };

    pub const JavascriptBundleContainer = struct {
        /// bundle_format_version
        bundle_format_version: ?u32 = null,

        /// bundle
        bundle: ?JavascriptBundle = null,

        /// code_length
        code_length: ?u32 = null,

        pub fn decode(reader: anytype) anyerror!JavascriptBundleContainer {
            var this = std.mem.zeroes(JavascriptBundleContainer);

            while (true) {
                switch (try reader.readByte()) {
                    0 => {
                        return this;
                    },

                    1 => {
                        this.bundle_format_version = try reader.readValue(u32);
                    },
                    2 => {
                        this.bundle = try reader.readValue(JavascriptBundle);
                    },
                    3 => {
                        this.code_length = try reader.readValue(u32);
                    },
                    else => {
                        return error.InvalidMessage;
                    },
                }
            }
            unreachable;
        }

        pub fn encode(this: *const @This(), writer: anytype) anyerror!void {
            if (this.bundle_format_version) |bundle_format_version| {
                try writer.writeFieldID(1);
                try writer.writeInt(bundle_format_version);
            }
            if (this.bundle) |bundle| {
                try writer.writeFieldID(2);
                try writer.writeValue(@TypeOf(bundle), bundle);
            }
            if (this.code_length) |code_length| {
                try writer.writeFieldID(3);
                try writer.writeInt(code_length);
            }
            try writer.endMessage();
        }
    };
};
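`JavascriptBundleContainer.encode` emits each *set* optional field as its field ID followed by the value, then a 0 terminator; `decode` loops until it reads that 0. A hedged sketch of the encode side (values illustrative, little-endian layout assumed):

```zig
const std = @import("std");

test "JavascriptBundleContainer skips unset optionals and terminates with 0" {
    var buf: [32]u8 = undefined;
    var stream = std.io.fixedBufferStream(&buf);
    var writer = ByteWriter.init(&stream);

    const container = BundleV2.JavascriptBundleContainer{
        .bundle_format_version = 1,
        .code_length = 0,
        // .bundle stays null: unset optional fields are skipped entirely
    };
    try container.encode(&writer);

    // [field 1][u32 1][field 3][u32 0][terminator]
    try std.testing.expectEqualSlices(
        u8,
        &[_]u8{ 1, 1, 0, 0, 0, 3, 0, 0, 0, 0, 0 },
        stream.getWritten(),
    );
}
```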
@@ -126,12 +126,47 @@ pub fn getBits(comptime TargetType: type, target: anytype, comptime start_bit: c
    return @truncate(TargetType, target >> start_bit);
}

pub const Index = enum(Index.Int) {
    runtime = 0,
    invalid = std.math.maxInt(Index.Int),
    _,

    pub const Int = u32;

    pub fn init(num: anytype) Index {
        const NumType = @TypeOf(num);
        if (comptime @typeInfo(NumType) == .Pointer) {
            return init(num.*);
        }

        if (comptime @typeInfo(NumType) == .Enum) {
            return init(@enumToInt(num));
        }

        return @intToEnum(Index, @intCast(Index.Int, num));
    }

    pub inline fn isValid(this: Index) bool {
        return @enumToInt(this) != @enumToInt(Index.invalid);
    }

    pub inline fn isInvalid(this: Index) bool {
        return @enumToInt(this) == @enumToInt(Index.invalid);
    }

    pub inline fn get(this: Index) u32 {
        return @enumToInt(this);
    }
};

pub const Ref = enum(TotalSize) {
    default = std.math.maxInt(TotalSize),
    _,

    pub const TotalSize = u62;

    pub const ArrayHashCtx = RefHashCtx;

    pub fn format(ref: Ref, comptime _: []const u8, _: std.fmt.FormatOptions, writer: anytype) !void {
        try std.fmt.format(
            writer,
@@ -151,10 +186,14 @@ pub const Ref = enum(TotalSize) {
        return @bitCast(BitInt, this);
    }

    pub fn isValid(this: Ref) bool {
        return this.asBitInt() < None.asBitInt();
    }

    // 2 bits of padding for whatever is the parent
    pub const Int = u30;
    pub const None = Ref.init(std.math.maxInt(u30), std.math.maxInt(u30), false);
-   pub const RuntimeRef = Ref.init(std.math.maxInt(u30), std.math.maxInt(u30) - 1, false);
+   pub const RuntimeRef = Ref.init(std.math.maxInt(u30), 0, false);

    const source_index_offset = 1;
    const inner_index_offset = 1 + 30;
@@ -241,7 +280,7 @@ pub const Ref = enum(TotalSize) {
        return self.eql(Ref.None);
    }

-   pub fn isSourceIndexNull(int: anytype) bool {
+   pub fn isIndexNull(int: anytype) bool {
        return int == max_ref_int;
    }
src/baby_list.zig (new file, 149 lines)
@@ -0,0 +1,149 @@
|
||||
const std = @import("std");
|
||||
const Environment = @import("env.zig");
|
||||
const util = @import("./util.zig");
|
||||
|
||||
/// This is like ArrayList except it stores the length and capacity as u32
|
||||
/// In practice, it is very unusual to have lengths above 4 GB
|
||||
///
|
||||
/// This lets us have array lists which occupy the same amount of space as a
|
||||
/// slice
|
||||
pub fn BabyList(comptime Type: type) type {
|
||||
return struct {
|
||||
const ListType = @This();
|
||||
ptr: [*]Type = undefined,
|
||||
len: u32 = 0,
|
||||
cap: u32 = 0,
|
||||
|
||||
pub fn ensureUnusedCapacity(this: *@This(), allocator: std.mem.Allocator, count: usize) !void {
|
||||
var list_ = this.listManaged(allocator);
|
||||
try list_.ensureUnusedCapacity(count);
|
||||
this.update(list_);
|
||||
}
|
||||
|
||||
pub fn append(this: *@This(), allocator: std.mem.Allocator, value: Type) !void {
|
||||
if (this.len + 1 < this.cap) {
|
||||
var list_ = this.listManaged(allocator);
|
||||
try list_.ensureUnusedCapacity(1);
|
||||
this.update(list_);
|
||||
}
|
||||
this.appendAssumeCapacity(value);
|
||||
}
|
||||
|
||||
pub inline fn appendAssumeCapacity(this: *@This(), value: Type) void {
|
||||
this.ptr[this.len] = value;
|
||||
this.len += 1;
|
||||
}
|
||||
|
||||
pub inline fn appendSliceAssumeCapacity(this: *@This(), values: []const Type) void {
|
||||
var tail = this.ptr[this.len];
|
||||
for (values) |value| {
|
||||
tail.* = value;
|
||||
tail += 1;
|
||||
}
|
||||
this.len += values.len;
|
||||
}
|
||||
|
||||
pub inline fn init(items: []const Type) ListType {
|
||||
@setRuntimeSafety(false);
|
||||
return ListType{
|
||||
// Remove the const qualifier from the items
|
||||
.ptr = @intToPtr([*]Type, @ptrToInt(items.ptr)),
|
||||
|
||||
.len = @truncate(u32, items.len),
|
||||
.cap = @truncate(u32, items.len),
|
||||
};
|
||||
}
|
||||
|
||||
pub inline fn fromList(list_: anytype) ListType {
|
||||
@setRuntimeSafety(false);
|
||||
|
||||
if (comptime Environment.allow_assert) {
|
||||
std.debug.assert(list_.items.len <= list_.capacity);
|
||||
}
|
||||
|
||||
return ListType{
|
||||
.ptr = list_.items.ptr,
|
||||
.len = @truncate(u32, list_.items.len),
|
||||
.cap = @truncate(u32, list_.capacity),
|
||||
};
|
||||
}
|
||||
|
||||
pub fn update(this: *ListType, list_: anytype) void {
|
||||
@setRuntimeSafety(false);
|
||||
this.ptr = list_.items.ptr;
|
||||
this.len = @truncate(u32, list_.items.len);
|
||||
this.cap = @truncate(u32, list_.capacity);
|
||||
|
||||
if (comptime Environment.allow_assert) {
|
||||
std.debug.assert(this.len <= this.cap);
|
||||
}
|
||||
}
|
||||
|
||||
pub fn list(this: ListType) std.ArrayListUnmanaged(Type) {
|
||||
return std.ArrayListUnmanaged(Type){
|
||||
.items = this.ptr[0..this.len],
|
||||
.capacity = this.cap,
|
||||
};
|
||||
}
|
||||
|
||||
pub fn listManaged(this: ListType, allocator: std.mem.Allocator) std.ArrayList(Type) {
|
||||
return std.ArrayList(Type){
|
||||
.items = this.ptr[0..this.len],
|
||||
.capacity = this.cap,
|
||||
.allocator = allocator,
|
||||
};
|
||||
}
|
||||
|
||||
pub fn from(allocator: std.mem.Allocator, default: anytype) !ListType {
|
||||
return util.from(ListType, allocator, default);
|
||||
}
|
||||
|
||||
pub inline fn first(this: ListType) ?*Type {
|
||||
return if (this.len > 0) this.ptr[0] else @as(?*Type, null);
|
||||
}
|
||||
|
||||
pub inline fn last(this: ListType) ?*Type {
|
||||
return if (this.len > 0) &this.ptr[this.len - 1] else @as(?*Type, null);
|
||||
}
|
||||
|
||||
pub inline fn first_(this: ListType) Type {
|
||||
return this.ptr[0];
|
||||
}
|
||||
|
||||
pub inline fn at(this: ListType, index: usize) *const Type {
|
||||
std.debug.assert(index < this.len);
|
||||
return &this.ptr[index];
|
||||
}
|
||||
|
||||
pub inline fn mut(this: ListType, index: usize) *Type {
|
||||
std.debug.assert(index < this.len);
|
||||
return &this.ptr[index];
|
||||
}
|
||||
|
||||
pub fn one(allocator: std.mem.Allocator, value: Type) !ListType {
|
||||
var items = try allocator.alloc(Type, 1);
|
||||
items[0] = value;
|
||||
return ListType{
|
||||
.ptr = @ptrCast([*]Type, items.ptr),
|
||||
.len = 1,
|
||||
.cap = 1,
|
||||
};
|
||||
}
|
||||
|
||||
pub inline fn @"[0]"(this: ListType) Type {
|
||||
return this.ptr[0];
|
||||
}
|
||||
const OOM = error{OutOfMemory};
|
||||
|
||||
pub fn push(this: *ListType, allocator: std.mem.Allocator, value: Type) OOM!void {
|
||||
var list_ = this.list();
|
||||
try list_.append(allocator, value);
|
||||
this.update(list_);
|
||||
}
|
||||
|
||||
pub inline fn slice(this: ListType) []Type {
|
||||
@setRuntimeSafety(false);
|
||||
return this.ptr[0..this.len];
|
||||
}
|
||||
};
|
||||
}
|
||||
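A short usage sketch for `BabyList` (the allocator choice and the size check are illustrative assumptions, not part of the diff; the size figure assumes a 64-bit target):

```zig
const std = @import("std");

test "BabyList stores len/cap as u32 but behaves like ArrayList" {
    var list = BabyList(u32){};
    defer {
        // Free through the managed view, which carries the allocator
        var managed = list.listManaged(std.testing.allocator);
        managed.deinit();
    }

    try list.append(std.testing.allocator, 1);
    try list.append(std.testing.allocator, 2);

    try std.testing.expectEqual(@as(u32, 2), list.len);
    try std.testing.expectEqual(@as(u32, 2), list.slice()[1]);
    // pointer (8) + len (4) + cap (4): same footprint class as a slice
    try std.testing.expectEqual(@as(usize, 16), @sizeOf(BabyList(u32)));
}
```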
@@ -868,7 +868,7 @@ pub const Bundler = struct {
        source_map_context: ?js_printer.SourceMapHandler,
    ) !usize {
        const ast = result.ast;
-       var symbols: [][]js_ast.Symbol = &([_][]js_ast.Symbol{ast.symbols});
+       var symbols = js_ast.Symbol.NestedList.init(&[_]js_ast.Symbol.List{ast.symbols});

        return switch (format) {
            .cjs => try js_printer.printCommonJS(
@@ -879,7 +879,6 @@ pub const Bundler = struct {
                &result.source,
                false,
                js_printer.Options{
-                   .to_module_ref = Ref.RuntimeRef,
                    .externals = ast.externals,
                    .runtime_imports = ast.runtime_imports,
                    .require_ref = ast.require_ref,
@@ -899,7 +898,6 @@ pub const Bundler = struct {
                &result.source,
                false,
                js_printer.Options{
-                   .to_module_ref = Ref.RuntimeRef,
                    .externals = ast.externals,
                    .runtime_imports = ast.runtime_imports,
                    .require_ref = ast.require_ref,
@@ -918,7 +916,6 @@ pub const Bundler = struct {
                &result.source,
                true,
                js_printer.Options{
-                   .to_module_ref = Ref.RuntimeRef,
                    .externals = ast.externals,
                    .runtime_imports = ast.runtime_imports,
                    .require_ref = ast.require_ref,
@@ -937,7 +934,6 @@ pub const Bundler = struct {
                &result.source,
                true,
                js_printer.Options{
-                   .to_module_ref = Ref.RuntimeRef,
                    .externals = ast.externals,
                    .runtime_imports = ast.runtime_imports,
                    .require_ref = ast.require_ref,
src/bundler/bundle_v2.zig (new file, 3465 lines)
File diff suppressed because it is too large
@@ -1313,7 +1313,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
    for (ast.import_records) |*import_record| {

        // Don't resolve the runtime
-       if (import_record.is_internal or import_record.is_unused) {
+       if (import_record.isInternal() or import_record.is_unused) {
            continue;
        }

@@ -1323,7 +1323,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
        }
        var path = _resolved_import.path() orelse {
            import_record.path.is_disabled = true;
            import_record.is_bundled = true;

            continue;
        };

@@ -1331,7 +1331,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,

        if (!loader_.isJavaScriptLikeOrJSON()) {
            import_record.path.is_disabled = true;
            import_record.is_bundled = true;

            continue;
        }

@@ -1349,13 +1349,12 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
            // we just silently disable it
            // because...we need some kind of hook to say "don't bundle this"
            import_record.path.is_disabled = true;
            import_record.is_bundled = false;
            import_record.tag = .macro;

            continue;
        };
        import_record.module_id = _module_data.module_id;
        std.debug.assert(import_record.module_id != 0);
        import_record.is_bundled = true;

        path.* = try path.dupeAlloc(this.allocator);

@@ -1368,7 +1367,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
        ) catch unreachable;
    } else |err| {
        if (comptime Environment.isDebug) {
-           if (!import_record.handles_import_errors) {
+           if (!import_record.handles_import_errors()) {
                Output.prettyErrorln("\n<r><red>{s}<r> resolving \"{s}\" from \"{s}\"", .{
                    @errorName(err),
                    import_record.path.text,

@@ -1387,7 +1386,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
        error.ModuleNotFound => {
            const addError = logger.Log.addResolveErrorWithTextDupeMaybeWarn;

-           if (!import_record.handles_import_errors) {
+           if (!import_record.handles_import_errors()) {
                if (isPackagePath(import_record.path.text)) {
                    if (platform.isWebLike() and options.ExternalModules.isNodeBuiltin(import_record.path.text)) {
                        try addError(

@@ -1475,7 +1474,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
    ast.runtime_imports = runtime.Runtime.Imports{};
    ast.runtime_imports.@"$$m" = .{ .ref = Ref.atIndex(0), .primary = Ref.None, .backup = Ref.None };
    ast.runtime_imports.__export = .{ .ref = Ref.atIndex(1), .primary = Ref.None, .backup = Ref.None };
-   ast.symbols = json_ast_symbols_list;
+   ast.symbols = js_ast.Symbol.List.init(json_ast_symbols_list);
    ast.module_ref = Ref.atIndex(2);
    ast.exports_ref = ast.runtime_imports.__export.?.ref;
    ast.bundle_export_ref = Ref.atIndex(3);

@@ -1621,7 +1620,6 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
        &source,
        false,
        js_printer.Options{
-           .to_module_ref = Ref.RuntimeRef,
            .bundle_export_ref = ast.runtime_imports.@"$$m".?.ref,
            .source_path = file_path,
            .externals = ast.externals,

@@ -1649,7 +1647,6 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
        &source,
        true,
        js_printer.Options{
-           .to_module_ref = Ref.RuntimeRef,
            .bundle_export_ref = ast.runtime_imports.@"$$m".?.ref,
            .source_path = file_path,
            .externals = ast.externals,

@@ -1724,7 +1721,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,

    {
        for (scan_pass_result.import_records.items) |*import_record| {
-           if (import_record.is_internal or import_record.is_unused) {
+           if (import_record.isInternal() or import_record.is_unused) {
                continue;
            }

@@ -1825,7 +1822,7 @@ pub fn processFile(this: *GenerateNodeModuleBundle, worker: *ThreadPool.Worker,
    } else |err| {
        switch (err) {
            error.ModuleNotFound => {
-               if (!import_record.handles_import_errors) {
+               if (!import_record.handles_import_errors()) {
                    const addError = logger.Log.addResolveErrorWithTextDupeMaybeWarn;
                    if (isPackagePath(import_record.path.text)) {
                        if (platform.isWebLike() and options.ExternalModules.isNodeBuiltin(import_record.path.text)) {
@@ -32,65 +32,65 @@ const DotEnv = @import("../env_loader.zig");

const fs = @import("../fs.zig");
const Router = @import("../router.zig");

+const BundleV2 = @import("../bundler/bundle_v2.zig");
var estimated_input_lines_of_code_: usize = undefined;
-const ServerBundleGeneratorThread = struct {
-    inline fn _generate(
-        logs: *logger.Log,
-        env_loader_: *DotEnv.Loader,
-        allocator_: std.mem.Allocator,
-        ctx: Command.Context,
-        _filepath: [*:0]const u8,
-        server_conf: Api.LoadedFramework,
-        route_conf_: ?Api.LoadedRouteConfig,
-        router: ?Router,
-    ) !void {
-        var server_bundler = try bundler.Bundler.init(
-            allocator_,
-            logs,
-            try configureTransformOptionsForBun(allocator_, ctx.args),
-            null,
-            env_loader_,
-        );
-        server_bundler.configureLinker();
-        server_bundler.options.jsx.supports_fast_refresh = false;
+// const ServerBundleGeneratorThread = struct {
+// inline fn _generate(
+// logs: *logger.Log,
+// env_loader_: *DotEnv.Loader,
+// allocator_: std.mem.Allocator,
+// ctx: Command.Context,
+// _filepath: [*:0]const u8,
+// server_conf: Api.LoadedFramework,
+// route_conf_: ?Api.LoadedRouteConfig,
+// router: ?Router,
+// ) !void {
+// var server_bundler = try bundler.Bundler.init(
+// allocator_,
+// logs,
+// try configureTransformOptionsForBun(allocator_, ctx.args),
+// null,
+// env_loader_,
+// );
+// server_bundler.configureLinker();
+// server_bundler.options.jsx.supports_fast_refresh = false;

-        server_bundler.router = router;
-        server_bundler.configureDefines() catch |err| {
-            Output.prettyErrorln("<r><red>{s}<r> loading --define or .env values for node_modules.server.bun\n", .{@errorName(err)});
-            return err;
-        };
+// server_bundler.router = router;
+// server_bundler.configureDefines() catch |err| {
+// Output.prettyErrorln("<r><red>{s}<r> loading --define or .env values for node_modules.server.bun\n", .{@errorName(err)});
+// return err;
+// };

-        if (ctx.debug.macros) |macros| {
-            server_bundler.options.macro_remap = macros;
-        }
+// if (ctx.debug.macros) |macros| {
+// server_bundler.options.macro_remap = macros;
+// }

-        var estimated_input_lines_of_code: usize = 0;
-        _ = try GenerateNodeModuleBundle.generate(
-            &server_bundler,
-            allocator_,
-            server_conf,
-            route_conf_,
-            _filepath,
-            &estimated_input_lines_of_code,
-            ctx.debug.package_bundle_map,
-        );
-        std.mem.doNotOptimizeAway(&server_bundler);
-    }
-    pub fn generate(
-        logs: *logger.Log,
-        env_loader_: *DotEnv.Loader,
-        ctx: Command.Context,
-        _filepath: [*:0]const u8,
-        server_conf: Api.LoadedFramework,
-        route_conf_: ?Api.LoadedRouteConfig,
-        router: ?Router,
-    ) void {
-        defer Output.flush();
+// var estimated_input_lines_of_code: usize = 0;
+// _ = try GenerateNodeModuleBundle.generate(
+// &server_bundler,
+// allocator_,
+// server_conf,
+// route_conf_,
+// _filepath,
+// &estimated_input_lines_of_code,
+// ctx.debug.package_bundle_map,
+// );
+// std.mem.doNotOptimizeAway(&server_bundler);
+// }
+// pub fn generate(
+// logs: *logger.Log,
+// env_loader_: *DotEnv.Loader,
+// ctx: Command.Context,
+// _filepath: [*:0]const u8,
+// server_conf: Api.LoadedFramework,
+// route_conf_: ?Api.LoadedRouteConfig,
+// router: ?Router,
+// ) void {
+// defer Output.flush();

-        _generate(logs, env_loader_, default_allocator, ctx, _filepath, server_conf, route_conf_, router) catch return;
-    }
-};
+// _generate(logs, env_loader_, default_allocator, ctx, _filepath, server_conf, route_conf_, router) catch return;
+// }
+// };

pub const BunCommand = struct {
    pub fn exec(
@@ -104,7 +104,7 @@ pub const BunCommand = struct {
        var this_bundler = try bundler.Bundler.init(allocator, log, ctx.args, null, null);
        this_bundler.configureLinker();
        var filepath: [*:0]const u8 = "node_modules.bun";
-       var server_bundle_filepath: [*:0]const u8 = "node_modules.server.bun";
+       // var server_bundle_filepath: [*:0]const u8 = "node_modules.server.bun";

        // This step is optional
        // If it fails for any reason, ignore it and continue bundling
@@ -139,7 +139,7 @@ pub const BunCommand = struct {
        }
        break :brk null;
    };
-   var env_loader = this_bundler.env;
+   // var env_loader = this_bundler.env;

    if (ctx.debug.dump_environment_variables) {
        this_bundler.dumpEnvironmentVariables();
@@ -152,33 +152,33 @@ pub const BunCommand = struct {
        return;
    }

-   var generated_server = false;
-   if (this_bundler.options.framework) |*framework| {
-       if (framework.toAPI(allocator, this_bundler.fs.top_level_dir) catch null) |_server_conf| {
-           ServerBundleGeneratorThread.generate(
-               log,
-               env_loader,
-               ctx,
-               server_bundle_filepath,
-               _server_conf,
-               loaded_route_config,
-               this_bundler.router,
-           );
-           generated_server = true;
+   // var generated_server = false;
+   // if (this_bundler.options.framework) |*framework| {
+   // if (framework.toAPI(allocator, this_bundler.fs.top_level_dir) catch null) |_server_conf| {
+   // ServerBundleGeneratorThread.generate(
+   // log,
+   // env_loader,
+   // ctx,
+   // server_bundle_filepath,
+   // _server_conf,
+   // loaded_route_config,
+   // this_bundler.router,
+   // );
+   // generated_server = true;

-           if (log.msgs.items.len > 0) {
-               try log.printForLogLevel(Output.errorWriter())
-               log.* = logger.Log.init(allocator);
-               Output.flush();
-           }
-       }
-   }
+   // if (log.msgs.items.len > 0) {
+   // try log.printForLogLevel(Output.errorWriter());
+   // log.* = logger.Log.init(allocator);
+   // Output.flush();
+   // }
+   // }
+   // }

    {

        // Always generate the client-only bundle
        // we can revisit this decision if people ask
-       var node_modules_ = try GenerateNodeModuleBundle.generate(
+       _ = try BundleV2.generate(
            &this_bundler,
            allocator,
            loaded_framework,
@@ -188,55 +188,55 @@ pub const BunCommand = struct {
            ctx.debug.package_bundle_map,
        );

-       const estimated_input_lines_of_code = estimated_input_lines_of_code_;
+       // const estimated_input_lines_of_code = estimated_input_lines_of_code_;

-       if (node_modules_) |node_modules| {
-           if (log.errors > 0) {
-               try log.printForLogLevel(Output.errorWriter());
-           } else {
-               var elapsed = @divTrunc(std.time.nanoTimestamp() - ctx.start_time, @as(i128, std.time.ns_per_ms));
-               const print_summary = !(ctx.args.no_summary orelse false);
-               if (print_summary) {
-                   var bundle = NodeModuleBundle.init(node_modules, allocator);
-                   bundle.printSummary();
-               }
-               const indent = comptime " ";
+       // if (node_modules_) |node_modules| {
+       // if (log.errors > 0) {
+       // try log.printForLogLevel(Output.errorWriter());
+       // } else {
+       // var elapsed = @divTrunc(std.time.nanoTimestamp() - ctx.start_time, @as(i128, std.time.ns_per_ms));
+       // const print_summary = !(ctx.args.no_summary orelse false);
+       // if (print_summary) {
+       // var bundle = NodeModuleBundle.init(node_modules, allocator);
+       // bundle.printSummary();
+       // }
+       // const indent = comptime " ";

-               switch (estimated_input_lines_of_code) {
-                   0...99999 => {
-                       if (generated_server) {
-                           Output.prettyln(indent ++ "<d>{d:<5} LOC parsed x2", .{estimated_input_lines_of_code});
-                       } else {
-                           Output.prettyln(indent ++ "<d>{d:<5} LOC parsed", .{estimated_input_lines_of_code});
-                       }
-                   },
-                   else => {
-                       const formatted_loc: f32 = @floatCast(f32, @intToFloat(f128, estimated_input_lines_of_code) / 1000);
-                       if (generated_server) {
-                           Output.prettyln(indent ++ "<d>{d:<5.2}k LOC parsed x2", .{formatted_loc});
-                       } else {
-                           Output.prettyln(indent ++ "<d>{d:<5.2}k LOC parsed", .{
-                               formatted_loc,
-                           });
-                       }
-                   },
-               }
+       // switch (estimated_input_lines_of_code) {
+       // 0...99999 => {
+       // if (generated_server) {
+       // Output.prettyln(indent ++ "<d>{d:<5} LOC parsed x2", .{estimated_input_lines_of_code});
+       // } else {
+       // Output.prettyln(indent ++ "<d>{d:<5} LOC parsed", .{estimated_input_lines_of_code});
+       // }
+       // },
+       // else => {
+       // const formatted_loc: f32 = @floatCast(f32, @intToFloat(f128, estimated_input_lines_of_code) / 1000);
+       // if (generated_server) {
+       // Output.prettyln(indent ++ "<d>{d:<5.2}k LOC parsed x2", .{formatted_loc});
+       // } else {
+       // Output.prettyln(indent ++ "<d>{d:<5.2}k LOC parsed", .{
+       // formatted_loc,
+       // });
+       // }
+       // },
+       // }

-               Output.prettyln(indent ++ "<d>{d:6}ms elapsed", .{@intCast(u32, elapsed)});
+       // Output.prettyln(indent ++ "<d>{d:6}ms elapsed", .{@intCast(u32, elapsed)});

-               if (generated_server) {
-                   Output.prettyln(indent ++ "<r>Saved to ./{s}, ./{s}", .{ filepath, server_bundle_filepath });
-               } else {
-                   Output.prettyln(indent ++ "<r>Saved to ./{s}", .{filepath});
-               }
+       // if (generated_server) {
+       // Output.prettyln(indent ++ "<r>Saved to ./{s}, ./{s}", .{ filepath, server_bundle_filepath });
+       // } else {
+       // Output.prettyln(indent ++ "<r>Saved to ./{s}", .{filepath});
+       // }

-               Output.flush();
+       // Output.flush();

-               try log.printForLogLevel(Output.errorWriter());
-           }
-       } else {
-           try log.printForLogLevel(Output.errorWriter());
-       }
+       // try log.printForLogLevel(Output.errorWriter());
+       // }
+       // } else {
+       try log.printForLogLevel(Output.errorWriter());
+       // }
    }
}
};
@@ -187,6 +187,22 @@ pub const GlobalDefinesKey = [_][]const string{
&[_]string{"setTimeout"},
&[_]string{"unescape"},

// Reflect: Static methods
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Reflect#static_methods
&[_]string{ "Reflect", "apply" },
&[_]string{ "Reflect", "construct" },
&[_]string{ "Reflect", "defineProperty" },
&[_]string{ "Reflect", "deleteProperty" },
&[_]string{ "Reflect", "get" },
&[_]string{ "Reflect", "getOwnPropertyDescriptor" },
&[_]string{ "Reflect", "getPrototypeOf" },
&[_]string{ "Reflect", "has" },
&[_]string{ "Reflect", "isExtensible" },
&[_]string{ "Reflect", "ownKeys" },
&[_]string{ "Reflect", "preventExtensions" },
&[_]string{ "Reflect", "set" },
&[_]string{ "Reflect", "setPrototypeOf" },

// Console method references are assumed to have no side effects
// https://developer.mozilla.org/en-US/docs/Web/API/console
&[_]string{ "console", "assert" },

27
src/fs.zig
@@ -1075,6 +1075,25 @@ pub const PathName = struct {
ext: string,
filename: string,

pub fn nonUniqueNameStringBase(self: *const PathName) string {
// /bar/foo/index.js -> foo
if (self.dir.len > 0 and strings.eqlComptime(self.base, "index")) {
// "/index" -> "index"
return Fs.PathName.init(self.dir).base;
}

if (comptime Environment.allow_assert) {
std.debug.assert(!strings.includes(self.base, "/"));
}

// /bar/foo.js -> foo
return self.base;
}

pub fn fmtIdentifier(self: *const PathName) strings.FormatValidIdentifier {
return strings.fmtIdentifier(self.nonUniqueNameStringBase());
}

// For readability, the names of certain automatically-generated symbols are
// derived from the file name. For example, instead of the CommonJS wrapper for
// a file being called something like "require273" it can be called something
@@ -1087,13 +1106,7 @@ pub const PathName = struct {
// through the renaming logic that all other symbols go through to avoid name
// collisions.
pub fn nonUniqueNameString(self: *const PathName, allocator: std.mem.Allocator) !string {
if (strings.eqlComptime(self.base, "index")) {
if (self.dir.len > 0) {
return MutableString.ensureValidIdentifier(PathName.init(self.dir).base, allocator);
}
}

return MutableString.ensureValidIdentifier(self.base, allocator);
return MutableString.ensureValidIdentifier(self.nonUniqueNameStringBase(), allocator);
}

pub inline fn dirWithTrailingSlash(this: *const PathName) string {

@@ -14,6 +14,7 @@ pub const FeatureFlags = @import("feature_flags.zig");
const root = @import("root");
pub const meta = @import("./meta.zig");
pub const ComptimeStringMap = @import("./comptime_string_map.zig").ComptimeStringMap;
pub usingnamespace @import("./util.zig");

pub const fmt = struct {
pub usingnamespace std.fmt;

@@ -2,6 +2,7 @@ const fs = @import("fs.zig");
const logger = @import("logger.zig");
const std = @import("std");
const Ref = @import("ast/base.zig").Ref;
const Index = @import("ast/base.zig").Index;
const Api = @import("./api/schema.zig").Api;

pub const ImportKind = enum(u8) {
@@ -30,11 +31,9 @@ pub const ImportKind = enum(u8) {
/// A CSS "url(...)" token
url,

internal,

pub const Label = std.EnumArray(ImportKind, []const u8);
pub const all_labels: Label = brk: {
var labels = Label.initFill("internal");
var labels = Label.initFill("");
labels.set(ImportKind.entry_point, "entry-point");
labels.set(ImportKind.stmt, "import-statement");
labels.set(ImportKind.require, "require-call");
@@ -85,72 +84,191 @@ pub const ImportRecord = struct {
/// 0 is invalid
module_id: u32 = 0,

source_index: Ref.Int = std.math.maxInt(Ref.Int),
source_index: Index = Index.invalid,

print_mode: PrintMode = .normal,

/// True for the following cases:
///
/// try { require('x') } catch { handle }
/// try { await import('x') } catch { handle }
/// try { require.resolve('x') } catch { handle }
/// import('x').catch(handle)
/// import('x').then(_, handle)
///
/// In these cases we shouldn't generate an error if the path could not be
/// resolved.
handles_import_errors: bool = false,

is_internal: bool = false,

/// This tells the printer that we should print as export var $moduleID = ...
/// Instead of using the path.
is_bundled: bool = false,

/// Sometimes the parser creates an import record and decides it isn't needed.
/// For example, TypeScript code may have import statements that later turn
/// out to be type-only imports after analyzing the whole file.
is_unused: bool = false,

/// If this is true, the import contains syntax like "* as ns". This is used
/// to determine whether modules that have no exports need to be wrapped in a
/// CommonJS wrapper or not.
contains_import_star: bool = false,

/// If this is true, the import contains an import for the alias "default",
/// either via the "import x from" or "import {default as x} from" syntax.
contains_default_alias: bool = false,

/// If true, this "export * from 'path'" statement is evaluated at run-time by
/// calling the "__reExport()" helper function
calls_run_time_re_export_fn: bool = false,

/// Tell the printer to wrap this call to "require()" in "__toModule(...)"
wrap_with_to_module: bool = false,

/// True for require calls like this: "try { require() } catch {}". In this
/// case we shouldn't generate an error if the path could not be resolved.
is_inside_try_body: bool = false,

/// If true, this was originally written as a bare "import 'file'" statement
was_originally_bare_import: bool = false,

was_originally_require: bool = false,

/// If a macro used <import>, it will be tracked here.
was_injected_by_macro: bool = false,

kind: ImportKind,

tag: Tag = Tag.none,

pub const Tag = enum {
flags: Flags.Set = Flags.None,

pub inline fn set(this: *ImportRecord, flag: Flags, value: bool) void {
this.flags.setPresent(flag, value);
}

pub inline fn enable(this: *ImportRecord, flag: Flags) void {
this.set(flag, true);
}

/// True for the following cases:
///
/// `try { require('x') } catch { handle }`
/// `try { await import('x') } catch { handle }`
/// `try { require.resolve('x') } catch { handle }`
/// `import('x').catch(handle)`
/// `import('x').then(_, handle)`
///
/// In these cases we shouldn't generate an error if the path could not be
/// resolved.
pub inline fn handles_import_errors(this: *const ImportRecord) bool {
return this.flags.contains(.handles_import_errors);
}

/// Sometimes the parser creates an import record and decides it isn't needed.
/// For example, TypeScript code may have import statements that later turn
/// out to be type-only imports after analyzing the whole file.
pub inline fn is_unused(this: *const ImportRecord) bool {
return this.flags.contains(.is_unused);
}

/// If this is true, the import contains syntax like "* as ns". This is used
/// to determine whether modules that have no exports need to be wrapped in a
/// CommonJS wrapper or not.
pub inline fn contains_import_star(this: *const ImportRecord) bool {
return this.flags.contains(.contains_import_star);
}

/// If this is true, the import contains an import for the alias "default",
/// either via the "import x from" or "import {default as x} from" syntax.
pub inline fn contains_default_alias(this: *const ImportRecord) bool {
return this.flags.contains(.contains_default_alias);
}

/// If true, this "export * from 'path'" statement is evaluated at run-time by
/// calling the "__reExport()" helper function
pub inline fn calls_runtime_re_export_fn(this: *const ImportRecord) bool {
return this.flags.contains(.calls_runtime_re_export_fn);
}
/// If true, this calls require() at runtime
pub inline fn calls_runtime_require(this: *const ImportRecord) bool {
return this.flags.contains(.calls_runtime_require);
}

/// Tell the printer to wrap this call to "require()" in "__toModule(...)"
pub inline fn wrap_with_to_module(this: *const ImportRecord) bool {
return this.flags.contains(.wrap_with_to_module);
}

/// Tell the printer to wrap this call to "toESM()" in "__toESM(...)"
pub inline fn wrap_with_to_esm(this: *const ImportRecord) bool {
return this.flags.contains(.wrap_with_to_esm);
}

// If this is true, the import contains an import for the alias "__esModule",
// via the "import {__esModule} from" syntax.
pub inline fn contains_es_module_alias(this: *const ImportRecord) bool {
return this.flags.contains(.contains_es_module_alias);
}

/// If true, this was originally written as a bare "import 'file'" statement
pub inline fn was_originally_bare_import(this: *const ImportRecord) bool {
return this.flags.contains(.was_originally_bare_import);
}
pub inline fn was_originally_require(this: *const ImportRecord) bool {
return this.flags.contains(.was_originally_require);
}

pub inline fn is_external_without_side_effects(this: *const ImportRecord) bool {
return @enumToInt(this.tag) >= @enumToInt(Tag.bun) or this.flags.contains(.is_external_without_side_effects);
}

pub const Flags = enum {
/// True for the following cases:
///
/// try { require('x') } catch { handle }
/// try { await import('x') } catch { handle }
/// try { require.resolve('x') } catch { handle }
/// import('x').catch(handle)
/// import('x').then(_, handle)
///
/// In these cases we shouldn't generate an error if the path could not be
/// resolved.
handles_import_errors,

/// Sometimes the parser creates an import record and decides it isn't needed.
/// For example, TypeScript code may have import statements that later turn
/// out to be type-only imports after analyzing the whole file.
is_unused,

/// If this is true, the import contains syntax like "* as ns". This is used
/// to determine whether modules that have no exports need to be wrapped in a
/// CommonJS wrapper or not.
contains_import_star,

/// If this is true, the import contains an import for the alias "default",
/// either via the "import x from" or "import {default as x} from" syntax.
contains_default_alias,

// If this is true, the import contains an import for the alias "__esModule",
// via the "import {__esModule} from" syntax.
contains_es_module_alias,

/// If true, this "export * from 'path'" statement is evaluated at run-time by
/// calling the "__reExport()" helper function
calls_runtime_re_export_fn,

/// If true, this calls require() at runtime
calls_runtime_require,

/// Tell the printer to wrap this call to "require()" in "__toModule(...)"
wrap_with_to_module,

/// Tell the printer to wrap this call to "toESM()" in "__toESM(...)"
wrap_with_to_esm,

/// If true, this was originally written as a bare "import 'file'" statement
was_originally_bare_import,

was_originally_require,

is_external_without_side_effects,

pub const None = Set{};
pub const Fields = std.enums.EnumFieldStruct(Flags, bool, false);
pub const Set = std.enums.EnumSet(Flags);
};

pub inline fn isRuntime(this: *const ImportRecord) bool {
return this.tag.isRuntime();
}

pub inline fn isInternal(this: *const ImportRecord) bool {
return this.tag.isInternal();
}

pub inline fn isBundled(this: *const ImportRecord) bool {
return this.module_id > 0;
}

pub const List = @import("./baby_list.zig").BabyList(ImportRecord);

pub const Tag = enum(u3) {
none,
/// JSX auto-import for React Fast Refresh
react_refresh,
/// JSX auto-import for jsxDEV or jsx
jsx_import,
/// JSX auto-import for Fragment or createElement
jsx_classic,
/// Uses the `bun` import specifier
/// import {foo} from "bun";
bun,
/// Uses the `bun:test` import specifier
/// import {expect} from "bun:test";
bun_test,
runtime,
/// A macro: import specifier OR a macro import
macro,

pub inline fn isRuntime(this: Tag) bool {
return this == .runtime;
}

pub inline fn isInternal(this: Tag) bool {
return @enumToInt(this) >= @enumToInt(Tag.runtime);
}
};

pub const PrintMode = enum {

@@ -854,9 +854,9 @@ fn getParseResult(this: *Transpiler, allocator: std.mem.Allocator, code: []const

// necessary because we don't run the linker
if (parse_result) |*res| {
for (res.ast.import_records) |*import| {
for (res.ast.import_records.slice()) |*import| {
if (import.kind.isCommonJS()) {
import.wrap_with_to_module = true;
import.enable(.wrap_with_to_module);
import.module_id = @truncate(u32, std.hash.Wyhash.hash(0, import.path.pretty));
}
}
@@ -936,7 +936,7 @@ pub fn scan(
const imports_label = JSC.ZigString.init("imports");
const named_imports_value = namedImportsToJS(
ctx.ptr(),
parse_result.ast.import_records,
parse_result.ast.import_records.slice(),
exception,
);
if (exception.* != null) return null;
@@ -1141,7 +1141,7 @@ fn namedImportsToJS(
defer allocator.free(array_items);

for (import_records) |record| {
if (record.is_internal) continue;
if (record.isInternal()) continue;

const path = JSC.ZigString.init(record.path.text).toValueGC(global);
const kind = JSC.ZigString.init(record.kind.label()).toValue(global);

196
src/js_ast.zig
@@ -13,6 +13,7 @@ const stringZ = bun.stringZ;
const default_allocator = bun.default_allocator;
const C = bun.C;
const Ref = @import("ast/base.zig").Ref;
const Index = @import("ast/base.zig").Index;
const RefHashCtx = @import("ast/base.zig").RefHashCtx;
const ObjectPool = @import("./pool.zig").ObjectPool;
const ImportRecord = @import("import_record.zig").ImportRecord;
@@ -28,6 +29,15 @@ const StringHashMapUnmanaged = _hash_map.StringHashMapUnmanaged;
const is_bindgen = std.meta.globalOption("bindgen", bool) orelse false;
const ComptimeStringMap = bun.ComptimeStringMap;
const JSPrinter = @import("./js_printer.zig");
const ThreadlocalArena = @import("./mimalloc_arena.zig").Arena;

/// This is the index to the automatically-generated part containing code that
/// calls "__export(exports, { ... getters ... })". This is used to generate
/// getters on an exports object for ES6 export statements, and is both for
/// ES6 star imports and CommonJS-style modules. All files have one of these,
/// although it may contain no statements if there is nothing to export.
pub const namespace_export_part_index = 0;

pub fn NewBaseStore(comptime Union: anytype, comptime count: usize) type {
var max_size = 0;
var max_align = 1;
@@ -142,20 +152,21 @@ pub fn NewBaseStore(comptime Union: anytype, comptime count: usize) type {

fn deinit() void {
var sliced = _self.overflow.slice();
var allocator = _self.overflow.allocator;

if (sliced.len > 1) {
var i: usize = 1;
const end = sliced.len;
while (i < end) {
var ptrs = @ptrCast(*[2]Block, sliced[i]);
default_allocator.free(ptrs);
allocator.free(ptrs);
i += 2;
}
_self.overflow.allocated = 1;
}
var base_store = @fieldParentPtr(WithBase, "store", _self);
if (_self.overflow.ptrs[0] == &base_store.head) {
default_allocator.destroy(base_store);
allocator.destroy(base_store);
}
_self = undefined;
}
@@ -212,107 +223,7 @@ pub const BindingNodeIndex = Binding;
pub const StmtNodeIndex = Stmt;
pub const ExprNodeIndex = Expr;

/// This is like ArrayList except it stores the length and capacity as u32
/// In practice, it is very unusual to have lengths above 4 GB
///
/// This lets us have array lists which occupy the same amount of space as a slice
pub fn BabyList(comptime Type: type) type {
return struct {
const ListType = @This();
ptr: [*]Type = undefined,
len: u32 = 0,
cap: u32 = 0,

pub inline fn init(items: []const Type) ListType {
@setRuntimeSafety(false);
return ListType{
// Remove the const qualifier from the items
.ptr = @intToPtr([*]Type, @ptrToInt(items.ptr)),

.len = @truncate(u32, items.len),
.cap = @truncate(u32, items.len),
};
}

pub inline fn fromList(list_: anytype) ListType {
@setRuntimeSafety(false);

if (comptime Environment.allow_assert) {
std.debug.assert(list_.items.len <= list_.capacity);
}

return ListType{
.ptr = list_.items.ptr,
.len = @truncate(u32, list_.items.len),
.cap = @truncate(u32, list_.capacity),
};
}

pub fn update(this: *ListType, list_: anytype) void {
@setRuntimeSafety(false);
this.ptr = list_.items.ptr;
this.len = @truncate(u32, list_.items.len);
this.cap = @truncate(u32, list_.capacity);

if (comptime Environment.allow_assert) {
std.debug.assert(this.len <= this.cap);
}
}

pub fn list(this: ListType) std.ArrayListUnmanaged(Type) {
return std.ArrayListUnmanaged(Type){
.items = this.ptr[0..this.len],
.capacity = this.cap,
};
}

pub fn listManaged(this: ListType, allocator: std.mem.Allocator) std.ArrayList(Type) {
return std.ArrayList(Type){
.items = this.ptr[0..this.len],
.capacity = this.cap,
.allocator = allocator,
};
}

pub inline fn first(this: ListType) ?*Type {
return if (this.len > 0) this.ptr[0] else @as(?*Type, null);
}

pub inline fn last(this: ListType) ?*Type {
return if (this.len > 0) &this.ptr[this.len - 1] else @as(?*Type, null);
}

pub inline fn first_(this: ListType) Type {
return this.ptr[0];
}

pub fn one(allocator: std.mem.Allocator, value: Type) !ListType {
var items = try allocator.alloc(Type, 1);
items[0] = value;
return ListType{
.ptr = @ptrCast([*]Type, items.ptr),
.len = 1,
.cap = 1,
};
}

pub inline fn @"[0]"(this: ListType) Type {
return this.ptr[0];
}
const OOM = error{OutOfMemory};

pub fn push(this: *ListType, allocator: std.mem.Allocator, value: Type) OOM!void {
var list_ = this.list();
try list_.append(allocator, value);
this.update(list_);
}

pub inline fn slice(this: ListType) []Type {
@setRuntimeSafety(false);
return this.ptr[0..this.len];
}
};
}
const BabyList = @import("./baby_list.zig").BabyList;

/// Slice that stores capacity and length in the same space as a regular slice.
pub const ExprNodeList = BabyList(Expr);
@@ -926,6 +837,9 @@ pub const Symbol = struct {
count_estimate: u32 = 0,
};

pub const List = BabyList(Symbol);
pub const NestedList = BabyList(List);

pub const Map = struct {
// This could be represented as a "map[Ref]Symbol" but a two-level array was
// more efficient in profiles. This appears to be because it doesn't involve
@@ -934,30 +848,30 @@ pub const Symbol = struct {
// single inner array, so you can join the maps together by just make a
// single outer array containing all of the inner arrays. See the comment on
// "Ref" for more detail.
symbols_for_source: [][]Symbol,
symbols_for_source: NestedList = NestedList{},

pub fn get(self: *Map, ref: Ref) ?*Symbol {
if (Ref.isSourceIndexNull(ref.sourceIndex()) or ref.isSourceContentsSlice()) {
return null;
}

return &self.symbols_for_source[ref.sourceIndex()][ref.innerIndex()];
return self.symbols_for_source.at(ref.sourceIndex()).mut(ref.innerIndex());
}

pub fn getConst(self: *Map, ref: Ref) ?*const Symbol {
pub fn getConst(self: *const Map, ref: Ref) ?*const Symbol {
if (Ref.isSourceIndexNull(ref.sourceIndex()) or ref.isSourceContentsSlice()) {
return null;
}

return &self.symbols_for_source[ref.sourceIndex()][ref.innerIndex()];
return self.symbols_for_source.at(ref.sourceIndex()).at(ref.innerIndex());
}

pub fn init(sourceCount: usize, allocator: std.mem.Allocator) !Map {
var symbols_for_source: [][]Symbol = try allocator.alloc([]Symbol, sourceCount);
var symbols_for_source: NestedList = NestedList.init(try allocator.alloc([]Symbol, sourceCount));
return Map{ .symbols_for_source = symbols_for_source };
}

pub fn initList(list: [][]Symbol) Map {
pub fn initList(list: NestedList) Map {
return Map{ .symbols_for_source = list };
}

@@ -1180,7 +1094,7 @@ pub const E = struct {
target: ExprNodeIndex,
optional_chain: ?OptionalChain = null,

pub fn hasSameFlagsAs(a: *Index, b: *Index) bool {
pub fn hasSameFlagsAs(a: *E.Index, b: *E.Index) bool {
return (a.optional_chain == b.optional_chain);
}
};
@@ -1912,6 +1826,8 @@ pub const Stmt = struct {
loc: logger.Loc,
data: Data,

pub const Batcher = bun.Batcher(Stmt);

const Serializable = struct {
@"type": Tag,
object: string,
@@ -2194,6 +2110,8 @@ pub const Expr = struct {
loc: logger.Loc,
data: Data,

pub const Batcher = bun.Batcher(Stmt);

pub fn fromBlob(
blob: *const JSC.WebCore.Blob,
allocator: std.mem.Allocator,
@@ -4307,6 +4225,8 @@ pub const ArrayBinding = struct {
};

pub const Ast = struct {
pub const TopLevelSymbolToParts = std.ArrayHashMapUnmanaged(Ref, BabyList(u32), Ref.ArrayHashCtx, false);

approximate_newline_count: usize = 0,
has_lazy_export: bool = false,
runtime_imports: Runtime.Imports,
@@ -4327,22 +4247,23 @@ pub const Ast = struct {

// This is a list of ES6 features. They are ranges instead of booleans so
// that they can be used in log messages. Check to see if "Len > 0".
import_keyword: ?logger.Range = null, // Does not include TypeScript-specific syntax or "import()"
export_keyword: ?logger.Range = null, // Does not include TypeScript-specific syntax
top_level_await_keyword: ?logger.Range = null,
import_keyword: logger.Range = logger.Range.None, // Does not include TypeScript-specific syntax or "import()"
export_keyword: logger.Range = logger.Range.None, // Does not include TypeScript-specific syntax
top_level_await_keyword: logger.Range = logger.Range.None,

// These are stored at the AST level instead of on individual AST nodes so
// they can be manipulated efficiently without a full AST traversal
import_records: []ImportRecord = &([_]ImportRecord{}),
import_records: ImportRecord.List = .{},

hashbang: ?string = null,
directive: ?string = null,
url_for_css: ?string = null,
parts: []Part,
symbols: []Symbol = &([_]Symbol{}),
parts: Part.List = Part.List{},
// This list may be mutated later, so we should store the capacity
symbols: Symbol.List = Symbol.List{},
module_scope: ?Scope = null,
// char_freq: *CharFreq,
exports_ref: ?Ref = null,
exports_ref: Ref = Ref.None,
module_ref: ?Ref = null,
wrapper_ref: ?Ref = null,
require_ref: Ref = Ref.None,
@@ -4357,17 +4278,21 @@ pub const Ast = struct {
named_exports: NamedExports = undefined,
export_star_import_records: []u32 = &([_]u32{}),

allocator: std.mem.Allocator,
top_level_symbols_to_parts: TopLevelSymbolToParts = .{},

pub const NamedImports = std.ArrayHashMap(Ref, NamedImport, RefHashCtx, true);
pub const NamedExports = std.StringArrayHashMap(NamedExport);

pub fn initTest(parts: []Part) Ast {
return Ast{
.parts = parts,
.parts = Part.List.init(parts),
.allocator = bun.default_allocator,
.runtime_imports = .{},
};
}

pub const empty = Ast{ .parts = &[_]Part{}, .runtime_imports = undefined };
pub const empty = Ast{ .parts = Part.List{}, .runtime_imports = undefined, .allocator = undefined };

pub fn toJSON(self: *const Ast, _: std.mem.Allocator, stream: anytype) !void {
const opts = std.json.StringifyOptions{ .whitespace = std.json.StringifyOptions.Whitespace{
@@ -4407,7 +4332,7 @@ pub const ExportsKind = enum {
// "exports") with getters for the export names. All named imports to this
// module are allowed. Direct named imports reference the corresponding export
// directly. Other imports go through property accesses on "exports".
esm_with_dyn,
esm_with_dynamic_fallback,

pub fn jsonStringify(self: @This(), opts: anytype, o: anytype) !void {
return try std.json.stringify(@tagName(self), opts, o);
@@ -4417,11 +4342,15 @@ pub const ExportsKind = enum {
pub const DeclaredSymbol = struct {
ref: Ref,
is_top_level: bool = false,

pub const List = std.MultiArrayList(DeclaredSymbol);
};

pub const Dependency = packed struct {
source_index: u32 = 0,
part_index: u32 = 0,
pub const Dependency = struct {
source_index: Index = Index.invalid,
part_index: Index.Int = 0,

pub const List = BabyList(Dependency);
};

pub const ExprList = std.ArrayList(Expr);
@@ -4434,24 +4363,27 @@ pub const BindingList = std.ArrayList(Binding);
// shaking and can be assigned to separate chunks (i.e. output files) by code
// splitting.
pub const Part = struct {
pub const ImportRecordIndices = BabyList(u32);
pub const List = BabyList(Part);

stmts: []Stmt,
scopes: []*Scope = &([_]*Scope{}),

// Each is an index into the file-level import record list
import_record_indices: []u32 = &([_]u32{}),
import_record_indices: ImportRecordIndices = .{},

// All symbols that are declared in this part. Note that a given symbol may
// have multiple declarations, and so may end up being declared in multiple
// parts (e.g. multiple "var" declarations with the same name). Also note
// that this list isn't deduplicated and may contain duplicates.
declared_symbols: []DeclaredSymbol = &([_]DeclaredSymbol{}),
declared_symbols: DeclaredSymbol.List = .{},

// An estimate of the number of uses of all symbols used within this part.
symbol_uses: SymbolUseMap = SymbolUseMap{},

// The indices of the other parts in this file that are needed if this part
// is needed.
dependencies: []Dependency = &([_]Dependency{}),
dependencies: Dependency.List = .{},

// If true, this part can be removed if none of the declared symbols are
// used. If the file containing this part is imported, then all parts that
@@ -4495,7 +4427,7 @@ pub const StmtOrExpr = union(enum) {

pub const NamedImport = struct {
// Parts within this file that use this import
local_parts_with_uses: []u32 = &([_]u32{}),
local_parts_with_uses: BabyList(u32) = BabyList(u32){},

alias: ?string,
alias_loc: ?logger.Loc,
@@ -4518,7 +4450,7 @@ pub const NamedExport = struct {
alias_loc: logger.Loc,
};

pub const StrictModeKind = enum(u7) {
pub const StrictModeKind = enum(u4) {
sloppy_mode,
explicit_strict_mode,
implicit_strict_mode_import,
@@ -4531,13 +4463,12 @@ pub const StrictModeKind = enum(u7) {
};

pub const Scope = struct {
id: usize = 0,
kind: Kind = Kind.block,
parent: ?*Scope,
children: std.ArrayListUnmanaged(*Scope) = .{},
children: BabyList(*Scope) = .{},
// This is a special hash table that allows us to pass in the hash directly
members: StringHashMapUnmanaged(Member) = .{},
generated: std.ArrayListUnmanaged(Ref) = .{},
generated: BabyList(Ref) = .{},

// This is used to store the ref of the label symbol for ScopeLabel scopes.
label_ref: ?Ref = null,
@@ -4688,7 +4619,7 @@ pub const Scope = struct {
pub fn recursiveSetStrictMode(s: *Scope, kind: StrictModeKind) void {
if (s.strict_mode == .sloppy_mode) {
s.strict_mode = kind;
for (s.children.items) |child| {
for (s.children.slice()) |child| {
child.recursiveSetStrictMode(kind);
}
}
@@ -8247,3 +8178,4 @@ pub const Macro = struct {
// Stmt | 192
// STry | 384
// -- ESBuild bit sizes

File diff suppressed because it is too large
Load Diff
@@ -487,12 +487,12 @@ const ImportVariant = enum {
pub fn determine(record: *const importRecord.ImportRecord, s_import: *const S.Import) ImportVariant {
var variant = ImportVariant.path_only;

-if (record.contains_import_star) {
+if (record.contains_import_star()) {
variant = variant.hasStar();
}

-if (!record.was_originally_bare_import) {
-if (!record.contains_default_alias) {
+if (!record.was_originally_bare_import()) {
+if (!record.contains_default_alias()) {
if (s_import.default_name) |default_name| {
if (default_name.ref != null) {
variant = variant.hasDefault();
@@ -1361,7 +1361,7 @@ pub fn NewPrinter(
if (record.kind != .dynamic) {
p.printSpaceBeforeIdentifier();

-if (record.path.is_disabled and record.handles_import_errors and !is_external) {
+if (record.path.is_disabled and record.handles_import_errors() and !is_external) {
p.printRequireError(record.path.text);
return;
}
@@ -1560,14 +1560,14 @@ pub fn NewPrinter(
}
},
.e_require => |e| {
-if (rewrite_esm_to_cjs and p.import_records[e.import_record_index].is_bundled) {
+if (rewrite_esm_to_cjs and p.import_records[e.import_record_index].isBundled()) {
p.printIndent();
p.printBundledRequire(e);
p.printSemicolonIfNeeded();
return;
}

-if (!rewrite_esm_to_cjs or !p.import_records[e.import_record_index].is_bundled) {
+if (!rewrite_esm_to_cjs or !p.import_records[e.import_record_index].isBundled()) {
p.printRequireOrImportExpr(e.import_record_index, &([_]G.Comment{}), level, flags);
}
},
@@ -2053,7 +2053,7 @@
didPrint = true;
} else if (symbol.namespace_alias) |namespace| {
const import_record = p.import_records[namespace.import_record_index];
-if ((comptime is_inside_bundle) or import_record.is_bundled or namespace.was_originally_property_access) {
+if ((comptime is_inside_bundle) or import_record.isBundled() or namespace.was_originally_property_access) {
var wrap = false;
didPrint = true;

@@ -2071,7 +2071,7 @@
if (wrap) {
p.print(")");
}
-} else if (import_record.was_originally_require and import_record.path.is_disabled) {
+} else if (import_record.was_originally_require() and import_record.path.is_disabled) {
p.printRequireError(import_record.path.text);
didPrint = true;
}
@@ -2349,7 +2349,7 @@
}

pub fn printNamespaceAlias(p: *Printer, import_record: ImportRecord, namespace: G.NamespaceAlias) void {
-if (import_record.module_id > 0 and !import_record.contains_import_star) {
+if (import_record.isBundled() and !import_record.contains_import_star()) {
p.print("$");
p.printModuleId(import_record.module_id);
} else {
@@ -3070,7 +3070,7 @@

if (symbol.namespace_alias) |namespace| {
const import_record = p.import_records[namespace.import_record_index];
-if (import_record.is_bundled or (comptime is_inside_bundle) or namespace.was_originally_property_access) {
+if (import_record.isBundled() or (comptime is_inside_bundle) or namespace.was_originally_property_access) {
p.printIdentifier(name);
p.print(": () => ");
p.printNamespaceAlias(import_record, namespace);
@@ -3141,7 +3141,7 @@
if (p.symbols.get(item.name.ref.?)) |symbol| {
if (symbol.namespace_alias) |namespace| {
const import_record = p.import_records[namespace.import_record_index];
-if (import_record.is_bundled or (comptime is_inside_bundle) or namespace.was_originally_property_access) {
+if (import_record.isBundled() or (comptime is_inside_bundle) or namespace.was_originally_property_access) {
p.print("var ");
p.printSymbol(item.name.ref.?);
p.print(" = ");
@@ -3606,7 +3606,7 @@
return p.printBundledImport(record, s);
}

-if (record.wrap_with_to_module) {
+if (record.wrap_with_to_module()) {
const require_ref = p.options.require_ref.?;

const module_id = record.module_id;
@@ -3620,7 +3620,7 @@
try p.imported_module_ids.append(module_id);
}

-if (record.contains_import_star) {
+if (record.contains_import_star()) {
p.print("var ");
p.printSymbol(s.namespace_ref);
p.print(" = ");
@@ -3673,7 +3673,7 @@

p.print("} = ");

-if (record.contains_import_star) {
+if (record.contains_import_star()) {
p.printSymbol(s.namespace_ref);
p.print(";\n");
} else {
@@ -3703,7 +3703,7 @@
// We could use a map but we want to avoid allocating
// and this should be pretty quick since it's just comparing a uint32
for (p.import_records[i + 1 ..]) |_record2| {
-if (_record2.is_bundled and _record2.module_id > 0 and _record2.module_id == _record.module_id) {
+if (_record2.isBundled() and _record2.module_id == _record.module_id) {
continue :skip;
}
}
@@ -3731,7 +3731,7 @@
return;
}

-if (record.handles_import_errors and record.path.is_disabled and record.kind.isCommonJS()) {
+if (record.handles_import_errors() and record.path.is_disabled and record.kind.isCommonJS()) {
return;
}

@@ -3894,7 +3894,7 @@
}

pub fn printBundledImport(p: *Printer, record: importRecord.ImportRecord, s: *S.Import) void {
-if (record.is_internal) {
+if (record.isInternal()) {
return;
}

@@ -4028,7 +4028,7 @@
}

pub fn printBundledRequire(p: *Printer, require: E.Require) void {
-if (p.import_records[require.import_record_index].is_internal) {
+if (p.import_records[require.import_record_index].isInternal()) {
return;
}

@@ -4410,7 +4410,7 @@
}

return Printer{
-.import_records = tree.import_records,
+.import_records = tree.import_records.slice(),
.options = opts,
.symbols = symbols,
.writer = writer,
@@ -4827,7 +4827,7 @@ pub fn printAst(
}
}

-for (tree.parts) |part| {
+for (tree.parts.slice()) |part| {
for (part.stmts) |stmt| {
try printer.printStmt(stmt);
if (printer.writer.getError()) {} else |err| {
@@ -4866,7 +4866,7 @@ pub fn printJSON(
writer,
&ast,
source,
-std.mem.zeroes(Symbol.Map),
+Symbol.Map{},
.{},
null,
undefined,
@@ -4916,7 +4916,7 @@ pub fn printCommonJS(
}
}
}
-for (tree.parts) |part| {
+for (tree.parts.slice()) |part| {
for (part.stmts) |stmt| {
try printer.printStmt(stmt);
if (printer.writer.getError()) {} else |err| {
@@ -4985,7 +4985,7 @@ pub fn printCommonJSThreaded(
}
}

-for (tree.parts) |part| {
+for (tree.parts.slice()) |part| {
for (part.stmts) |stmt| {
try printer.printStmt(stmt);
if (printer.writer.getError()) {} else |err| {
@@ -4,7 +4,7 @@ const js_lexer = @import("js_lexer.zig");
const importRecord = @import("import_record.zig");
const js_ast = @import("js_ast.zig");
const options = @import("options.zig");

+const BabyList = @import("./baby_list.zig").BabyList;
const fs = @import("fs.zig");
const bun = @import("global.zig");
const string = bun.string;
@@ -609,7 +609,7 @@ pub fn toAST(
return Expr.init(
js_ast.E.Object,
js_ast.E.Object{
-.properties = js_ast.BabyList(G.Property).init(properties[0..property_i]),
+.properties = BabyList(G.Property).init(properties[0..property_i]),
.is_single_line = property_i <= 1,
},
logger.Loc.Empty,
@@ -216,18 +216,18 @@ pub const Linker = struct {
var needs_require = false;
var node_module_bundle_import_path: ?string = null;

-var import_records = result.ast.import_records;
+var import_records = result.ast.import_records.listManaged(result.ast.allocator);
defer {
-result.ast.import_records = import_records;
+result.ast.import_records = ImportRecord.List.fromList(import_records);
}
// Step 1. Resolve imports & requires
switch (result.loader) {
.jsx, .js, .ts, .tsx => {
var record_i: u32 = 0;
-const record_count = @truncate(u32, import_records.len);
+const record_count = @truncate(u32, import_records.items.len);

outer: while (record_i < record_count) : (record_i += 1) {
-var import_record = &import_records[record_i];
+var import_record = &import_records.items[record_i];
if (import_record.is_unused) continue;

const record_index = record_i;
@@ -355,7 +355,6 @@ pub const Linker = struct {
if (node_modules_bundle.getPackage(package_name)) |pkg| {
const import_path = text[@minimum(text.len, package_name.len + 1)..];
if (node_modules_bundle.findModuleIDInPackageIgnoringExtension(pkg, import_path)) |found_module| {
-import_record.is_bundled = true;
node_module_bundle_import_path = node_module_bundle_import_path orelse
linker.nodeModuleBundleImportPath(origin);

@@ -383,7 +382,6 @@ pub const Linker = struct {
if (node_modules_bundle.getPackage(package_name)) |pkg| {
const import_path = runtime[@minimum(runtime.len, package_name.len + 1)..];
if (node_modules_bundle.findModuleInPackage(pkg, import_path)) |found_module| {
-import_record.is_bundled = true;
node_module_bundle_import_path = node_module_bundle_import_path orelse
linker.nodeModuleBundleImportPath(origin);

@@ -477,7 +475,6 @@ pub const Linker = struct {
);
}

-import_record.is_bundled = true;
node_module_bundle_import_path = node_module_bundle_import_path orelse
linker.nodeModuleBundleImportPath(origin);
import_record.path.text = node_module_bundle_import_path.?;
@@ -513,7 +510,7 @@ pub const Linker = struct {
// We can do this in the printer instead of creating a bunch of AST nodes here.
// But we need to at least tell the printer that this needs to happen.
if (resolved_import.shouldAssumeCommonJS(import_record.kind)) {
-import_record.wrap_with_to_module = true;
+import_record.enable(.wrap_with_to_module);
import_record.module_id = @truncate(u32, std.hash.Wyhash.hash(0, path.pretty));

result.ast.needs_runtime = true;
@@ -522,7 +519,7 @@ pub const Linker = struct {
} else |err| {
switch (err) {
error.ModuleNotFound => {
-if (import_record.handles_import_errors) {
+if (import_record.handles_import_errors()) {
import_record.path.is_disabled = true;
continue;
}
@@ -591,10 +588,7 @@ pub const Linker = struct {
result.ast.externals = externals.toOwnedSlice();

if (result.ast.needs_runtime and result.ast.runtime_import_record_id == null) {
-var new_import_records = try linker.allocator.alloc(ImportRecord, import_records.len + 1);
-std.mem.copy(ImportRecord, new_import_records, import_records);
-
-new_import_records[new_import_records.len - 1] = ImportRecord{
+try import_records.append(ImportRecord{
.kind = .stmt,
.path = if (linker.options.node_modules_bundle != null)
Fs.Path.init(node_module_bundle_import_path orelse linker.nodeModuleBundleImportPath(origin))
@@ -604,9 +598,8 @@ pub const Linker = struct {
try linker.generateImportPath(source_dir, Linker.runtime_source_path, false, "bun", origin, import_path_format),

.range = logger.Range{ .loc = logger.Loc{ .start = 0 }, .len = 0 },
-};
-result.ast.runtime_import_record_id = @truncate(u32, new_import_records.len - 1);
-import_records = new_import_records;
+});
+result.ast.runtime_import_record_id = @truncate(u32, import_records.items.len - 1);
}

// We _assume_ you're importing ESM.
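Throughout this diff, boolean fields such as `is_bundled` and `contains_import_star` become method calls (`isBundled()`, `contains_import_star()`), and the linker now sets state via `import_record.enable(.wrap_with_to_module)`. That is consistent with packing the booleans into a single flag set to shrink `ImportRecord`. A hypothetical JavaScript sketch of the pattern (none of these names are the real `ImportRecord` API):

```javascript
// Hypothetical flag-packing sketch: each former boolean field becomes one
// bit in an integer, and each former field read becomes a masked test.
const Flag = {
  is_bundled: 1 << 0,
  contains_import_star: 1 << 1,
  wrap_with_to_module: 1 << 2,
  handles_import_errors: 1 << 3,
};

class ImportRecord {
  constructor() {
    this.flags = 0; // all flags start cleared
  }
  enable(name) {
    this.flags |= Flag[name];
  }
  has(name) {
    return (this.flags & Flag[name]) !== 0;
  }
  isBundled() {
    return this.has("is_bundled");
  }
}

const record = new ImportRecord();
record.enable("wrap_with_to_module");
console.log(record.has("wrap_with_to_module")); // true
console.log(record.isBundled()); // false
```

This is why every call site in the printer changes from a field access to a method call: the bit test has to run somewhere.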
@@ -15,12 +15,12 @@ const C = bun.C;
const JSC = @import("javascript_core");
const fs = @import("fs.zig");
const unicode = std.unicode;

const Ref = @import("./ast/base.zig").Ref;
const expect = std.testing.expect;
const assert = std.debug.assert;
const ArrayList = std.ArrayList;
const StringBuilder = @import("./string_builder.zig");

+const Index = @import("./ast/base.zig").Index;
pub const Kind = enum(i8) {
err,
warn,
@@ -491,6 +491,10 @@ pub const Log = struct {
msgs: ArrayList(Msg),
level: Level = if (Environment.isDebug) Level.info else Level.warn,

+pub inline fn hasErrors(this: *const Log) bool {
+return this.errors > 0;
+}

pub fn reset(this: *Log) void {
this.msgs.clearRetainingCapacity();
this.warnings = 0;
@@ -1013,10 +1017,36 @@ pub inline fn usize2Loc(loc: usize) Loc {
pub const Source = struct {
path: fs.Path,
key_path: fs.Path,
-index: u32 = 0,

contents: string,
contents_is_recycled: bool = false,

+/// Lazily-generated human-readable identifier name that is non-unique
+/// Avoid accessing this directly most of the time
+identifier_name: string = "",

+index: Index = Index.invalid,

+pub fn fmtIdentifier(this: *const Source) strings.FormatValidIdentifier {
+return this.path.name.fmtIdentifier();
+}

+pub fn identifierName(this: *Source, allocator: std.mem.Allocator) !string {
+if (this.identifier_name.len > 0) {
+return this.identifier_name;
+}

+std.debug.assert(this.path.text.len > 0);
+const name = try this.path.name.nonUniqueNameString(allocator);
+this.identifier_name = name;
+return name;
+}

+pub fn rangeOfIdentifier(this: *const Source, loc: Loc) Range {
+const js_lexer = @import("./js_lexer.zig");
+return js_lexer.rangeOfIdentifier(this.contents, loc);
+}

pub const ErrorPosition = struct {
line_start: usize,
line_end: usize,
@@ -236,7 +236,7 @@ export fn transform(opts_array: u64) u64 {
parser.options.features.top_level_await = true;
const result = parser.parse() catch unreachable;
if (result.ok) {
-var symbols: [][]JSAst.Symbol = &([_][]JSAst.Symbol{result.ast.symbols});
+var symbols: JSAst.Symbol.NestedList = JSAst.Symbol.NestedList.init(&.{result.ast.symbols});

_ = JSPrinter.printAst(
@TypeOf(&writer),
@@ -314,8 +314,8 @@ export fn scan(opts_array: u64) u64 {
if (result.ok) {
var scanned_imports = allocator.alloc(Api.ScannedImport, result.ast.import_records.len) catch unreachable;
var scanned_i: usize = 0;
-for (result.ast.import_records) |import_record| {
-if (import_record.kind == .internal) continue;
+for (result.ast.import_records.slice()) |import_record| {
+if (import_record.isInternal()) continue;
scanned_imports[scanned_i] = Api.ScannedImport{ .path = import_record.path.text, .kind = import_record.kind.toAPI() };
scanned_i += 1;
}
@@ -1108,6 +1108,25 @@ pub const SourceMapOption = enum {
});
};

+pub const OutputFormat = enum {
+preserve,
+
+/// ES module format
+/// This is the default format
+esm,
+/// Immediately-invoked function expression
+/// (
+/// function(){}
+/// )();
+iife,
+/// CommonJS
+cjs,
+
+pub fn keepES6ImportExportSyntax(this: OutputFormat) bool {
+return this == .esm;
+}
+};

/// BundleOptions is used when ResolveMode is not set to "disable".
/// BundleOptions is effectively webpack + babel
pub const BundleOptions = struct {
@@ -1139,6 +1158,9 @@ pub const BundleOptions = struct {
production: bool = false,
serve: bool = false,

+// only used by bundle_v2
+output_format: OutputFormat = .esm,

append_package_version_in_query_string: bool = false,

resolve_mode: api.Api.ResolveMode,
@@ -1165,6 +1187,7 @@ pub const BundleOptions = struct {

conditions: ESMConditions = undefined,
tree_shaking: bool = false,
+code_splitting: bool = false,
sourcemap: SourceMapOption = SourceMapOption.none,

pub inline fn cssImportBehavior(this: *const BundleOptions) Api.CssInJsBehavior {
@@ -328,7 +328,8 @@ pub fn print(comptime fmt: string, args: anytype) void {
std.fmt.format(source.stream.writer(), fmt, args) catch unreachable;
root.console_log(root.Uint8Array.fromSlice(source.stream.buffer[0..source.stream.pos]));
} else {
-std.debug.assert(source_set);
+if (comptime Environment.allow_assert)
+std.debug.assert(source_set);

if (enable_buffering) {
std.fmt.format(source.buffered_stream.writer(), fmt, args) catch unreachable;
@@ -39,6 +39,29 @@ pub const Renamer = struct {
}
};

+pub const BundledRenamer = struct {
+symbols: js_ast.Symbol.Map,
+sources: []const []const u8,
+
+pub fn init(symbols: js_ast.Symbol.Map, sources: []const []const u8) BundledRenamer {
+return BundledRenamer{ .symbols = symbols, .sources = sources };
+}
+
+pub fn nameForSymbol(renamer: *BundledRenamer, ref: Ref) string {
+if (ref.isSourceContentsSlice()) {
+unreachable;
+}
+
+const resolved = renamer.symbols.follow(ref);
+
+if (renamer.symbols.getConst(resolved)) |symbol| {
+return symbol.original_name;
+} else {
+Global.panic("Invalid symbol {s}", .{ref});
+}
+}
+};

pub const DisabledRenamer = struct {
pub fn init(_: js_ast.Symbol.Map) DisabledRenamer {}
pub inline fn nameForSymbol(_: *Renamer, _: js_ast.Ref) string {
@@ -84,6 +84,29 @@ pub const PathPair = struct {
}
};

+// this is ripped from esbuild, comments included
+pub const SideEffects = enum {
+/// The default value conservatively considers all files to have side effects.
+has_side_effects,
+
+/// This file was listed as not having side effects by a "package.json"
+/// file in one of our containing directories with a "sideEffects" field.
+no_side_effects__package_json,
+
+/// This file is considered to have no side effects because the AST was empty
+/// after parsing finished. This should be the case for ".d.ts" files.
+no_side_effects__empty_ast,
+
+/// This file was loaded using a data-oriented loader (e.g. "text") that is
+/// known to not have side effects.
+no_side_effects__pure_data,
+
+// Same as above but it came from a plugin. We don't want to warn about
+// unused imports to these files since running the plugin is a side effect.
+// Removing the import would not call the plugin which is observable.
+// no_side_effects__pure_data_from_plugin,
+};

pub const Result = struct {
path_pair: PathPair,

@@ -100,7 +123,7 @@ pub const Result = struct {

// If present, any ES6 imports to this file can be considered to have no side
// effects. This means they should be removed if unused.
-primary_side_effects_data: ?SideEffectsData = null,
+primary_side_effects_data: SideEffects = SideEffects.has_side_effects,

// If true, unused imports are retained in TypeScript code. This matches the
// behavior of the "importsNotUsedAsValues" field in "tsconfig.json" when the
@@ -6,20 +6,6 @@ var __hasOwnProp = Object.prototype.hasOwnProperty;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;

-// We're disabling Object.freeze because it breaks CJS => ESM and can cause
-// issues with Suspense and other things that expect the CJS module namespace
-// to be mutable when the ESM module namespace is NOT mutable
-var __objectFreezePolyfill = new WeakSet();
-
-globalThis.Object.freeze = function freeze(obj) {
-  __objectFreezePolyfill.add(obj);
-  return obj;
-};
-
-globalThis.Object.isFrozen = function isFrozen(obj) {
-  return __objectFreezePolyfill.has(obj);
-};

export var __markAsModule = (target) =>
  __defProp(target, "__esModule", { value: true, configurable: true });

@@ -170,10 +156,10 @@ export var __exportDefault = (target, value) => {
  });
};

-export var __reExport = (target, module, desc) => {
+export var __reExport = (target, module, copyDefault, desc) => {
  if ((module && typeof module === "object") || typeof module === "function")
    for (let key of __getOwnPropNames(module))
-      if (!__hasOwnProp.call(target, key) && key !== "default")
+      if (!__hasOwnProp.call(target, key) && (copyDefault || key !== "default"))
        __defProp(target, key, {
          get: () => module[key],
          configurable: true,
@@ -182,3 +168,37 @@ export var __reExport = (target, module, desc) => {
  });
  return target;
};

+// Converts the module from CommonJS to ESM
+export var __toESM = (module, isNodeMode) => {
+  return __reExport(
+    __markAsModule(
+      __defProp(
+        module != null ? __create(__getProtoOf(module)) : {},
+        "default",
+
+        // If the importer is not in node compatibility mode and this is an ESM
+        // file that has been converted to a CommonJS file using a Babel-
+        // compatible transform (i.e. "__esModule" has been set), then forward
+        // "default" to the export named "default". Otherwise set "default" to
+        // "module.exports" for node compatibility.
+        !isNodeMode && module && module.__esModule
+          ? { get: () => module.default, enumerable: true }
+          : { value: module, enumerable: true }
+      )
+    ),
+    module
+  );
+};

+// Converts the module from ESM to CommonJS
+export var __toCommonJS = /* @__PURE__ */ ((cache) => {
+  return (module, temp) => {
+    return (
+      (cache && cache.get(module)) ||
+      ((temp = __reExport(__markAsModule({}), module, /* copyDefault */ 1)),
+      cache && cache.set(module, temp),
+      temp)
+    );
+  };
+})(typeof WeakMap !== "undefined" ? new WeakMap() : 0);
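The new `__toESM` helper above is small enough to exercise directly. A self-contained sketch of its behavior (the `__create`/`__defProp`/`__getProtoOf` setup lines are assumed from the top of runtime.js, and `export` is dropped so it runs standalone):

```javascript
// Self-contained sketch of the CJS -> ESM interop helpers in this diff.
const __create = Object.create;
const __defProp = Object.defineProperty;
const __getProtoOf = Object.getPrototypeOf;
const __getOwnPropNames = Object.getOwnPropertyNames;
const __hasOwnProp = Object.prototype.hasOwnProperty;

const __markAsModule = (target) =>
  __defProp(target, "__esModule", { value: true, configurable: true });

// New signature: `copyDefault` controls whether "default" is re-exported too.
const __reExport = (target, module, copyDefault) => {
  if ((module && typeof module === "object") || typeof module === "function")
    for (let key of __getOwnPropNames(module))
      if (!__hasOwnProp.call(target, key) && (copyDefault || key !== "default"))
        __defProp(target, key, {
          get: () => module[key],
          configurable: true,
          enumerable: true,
        });
  return target;
};

const __toESM = (module, isNodeMode) =>
  __reExport(
    __markAsModule(
      __defProp(
        module != null ? __create(__getProtoOf(module)) : {},
        "default",
        // Babel-style CJS (__esModule set): forward "default" through.
        // Plain CJS: "default" becomes the whole module.exports.
        !isNodeMode && module && module.__esModule
          ? { get: () => module.default, enumerable: true }
          : { value: module, enumerable: true }
      )
    ),
    module
  );

// A Babel-compiled CJS module: "default" is forwarded.
const compiled = { __esModule: true, default: 42, named: "x" };
const esm = __toESM(compiled);
console.log(esm.default); // 42
console.log(esm.named); // "x"

// A plain CJS module: "default" is the whole module.exports.
const plain = { hello: "world" };
const esm2 = __toESM(plain);
console.log(esm2.default === plain); // true
```

Note how `__toCommonJS` passes `copyDefault = 1` while `__toESM` leaves it falsy: a CJS consumer wants the synthesized namespace to include `default`, an ESM namespace builds its own.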
src/sha.zig
@@ -205,6 +205,22 @@ pub fn main() anyerror!void {
evp_in.final(&digest4);
const evp_in_time = clock4.read();

+var clock5 = try std.time.Timer.start();
+_ = std.hash.Wyhash.hash(0, bytes);
+const wyhash_time = clock5.read();

+var clock6 = try std.time.Timer.start();
+_ = std.hash.Crc32.hash(bytes);
+const crc32 = clock6.read();

+std.debug.print(
+" wyhash: {}\n",
+.{std.fmt.fmtDuration(wyhash_time)},
+);
+std.debug.print(
+" crc32: {}\n",
+.{std.fmt.fmtDuration(crc32)},
+);
std.debug.print(
" zig: {}\n",
.{std.fmt.fmtDuration(zig_time)},
@@ -5,7 +5,6 @@ pub const VLQ_CONTINUATION_BIT: u32 = VLQ_BASE;
pub const VLQ_CONTINUATION_MASK: u32 = 1 << VLQ_CONTINUATION_BIT;
const std = @import("std");
const JSAst = @import("../js_ast.zig");
-const BabyList = JSAst.BabyList;
const Logger = @import("../logger.zig");
const strings = @import("../string_immutable.zig");
const MutableString = @import("../string_mutable.zig").MutableString;
@@ -13,7 +12,7 @@ const Joiner = @import("../string_joiner.zig");
const JSPrinter = @import("../js_printer.zig");
const URL = @import("../url.zig").URL;
const FileSystem = @import("../fs.zig").FileSystem;

+const BabyList = @import("../baby_list.zig").BabyList;
const SourceMap = @This();

/// Coordinates in source maps are stored using relative offsets for size
@@ -64,6 +64,79 @@ pub fn indexOfCharNeg(self: string, char: u8) i32 {
return -1;
}

+/// Format a string to an ECMAScript identifier.
+/// Unlike the string_mutable.zig version, this always allocates/copies.
+pub fn fmtIdentifier(name: string) FormatValidIdentifier {
+return FormatValidIdentifier{ .name = name };
+}

+/// Format a string to an ECMAScript identifier.
+/// Different implementation than string_mutable because string_mutable may avoid allocating.
+/// This will always allocate.
+pub const FormatValidIdentifier = struct {
+name: string,
+const js_lexer = @import("./js_lexer.zig");
+pub fn format(self: FormatValidIdentifier, comptime _: []const u8, _: std.fmt.FormatOptions, writer: anytype) !void {
+var iterator = strings.CodepointIterator.init(self.name);
+var cursor = strings.CodepointIterator.Cursor{};
+
+var has_needed_gap = false;
+var needs_gap = false;
+var start_i: usize = 0;
+
+if (!iterator.next(&cursor)) {
+try writer.writeAll("_");
+return;
+}
+
+// Common case: no gap necessary. No allocation necessary.
+needs_gap = !js_lexer.isIdentifierStart(cursor.c);
+if (!needs_gap) {
+// Are there any non-alphanumeric chars at all?
+while (iterator.next(&cursor)) {
+if (!js_lexer.isIdentifierContinue(cursor.c) or cursor.width > 1) {
+needs_gap = true;
+start_i = cursor.i;
+break;
+}
+}
+}
+
+if (needs_gap) {
+needs_gap = false;
+// write the valid prefix before the first gap
+try writer.writeAll(self.name[0..start_i]);
+
+var slice = self.name[start_i..];
+iterator = strings.CodepointIterator.init(slice);
+cursor = strings.CodepointIterator.Cursor{};
+
+while (iterator.next(&cursor)) {
+if (js_lexer.isIdentifierContinue(cursor.c) and cursor.width == 1) {
+if (needs_gap) {
+try writer.writeAll("_");
+needs_gap = false;
+has_needed_gap = true;
+}
+try writer.writeAll(slice[cursor.i .. cursor.i + @as(u32, cursor.width)]);
+} else if (!needs_gap) {
+needs_gap = true;
+// skip the code point, replace it with a single _
+}
+}
+
+// If it ends with an emoji
+if (needs_gap) {
+try writer.writeAll("_");
+needs_gap = false;
+has_needed_gap = true;
+}
+
+return;
+}
+
+try writer.writeAll(self.name);
+}
+};
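The formatter added above collapses each run of non-identifier code points into a single `_`. A hypothetical JavaScript re-implementation of the same rule, restricted to ASCII for brevity (all names here are illustrative, not from the source):

```javascript
// Hypothetical sketch of the identifier formatter: every run of characters
// that can't appear in an ASCII identifier collapses into one "_", and a
// name that starts with a digit (or is empty) gets a leading "_".
function fmtIdentifier(name) {
  if (name.length === 0) return "_";
  const isContinue = (c) => c.codePointAt(0) < 128 && /[A-Za-z0-9_$]/.test(c);
  const isStart = (c) => c.codePointAt(0) < 128 && /[A-Za-z_$]/.test(c);

  // A digit may continue an identifier but not start one.
  let out = isContinue(name[0]) && !isStart(name[0]) ? "_" : "";
  let needsGap = false;
  for (const c of name) {
    if (isContinue(c)) {
      if (needsGap) {
        out += "_"; // the whole invalid run becomes a single "_"
        needsGap = false;
      }
      out += c;
    } else {
      needsGap = true;
    }
  }
  if (needsGap) out += "_"; // name ended with an invalid run (e.g. an emoji)
  return out;
}

console.log(fmtIdentifier("foo/bar.js")); // "foo_bar_js"
console.log(fmtIdentifier("9lives")); // "_9lives"
console.log(fmtIdentifier("")); // "_"
```

This is the shape bun needs when turning a file path into a symbol name for the bundle: deterministic, readable, and never an invalid identifier.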
pub fn indexOfSigned(self: string, str: string) i32 {
const i = std.mem.indexOf(u8, self, str) orelse return -1;
return @intCast(i32, i);
@@ -2077,10 +2150,12 @@ const sort_asc = std.sort.asc(u8);
const sort_desc = std.sort.desc(u8);

pub fn sortAsc(in: []string) void {
+// TODO: experiment with simd to see if it's faster
std.sort.sort([]const u8, in, {}, cmpStringsAsc);
}

pub fn sortDesc(in: []string) void {
+// TODO: experiment with simd to see if it's faster
std.sort.sort([]const u8, in, {}, cmpStringsDesc);
}
@@ -62,10 +62,9 @@ pub const MutableString = struct {
return mutable;
}

-// Convert it to an ASCII identifier. Note: If you change this to a non-ASCII
-// identifier, you're going to potentially cause trouble with non-BMP code
-// points in target environments that don't support bracketed Unicode escapes.
-
+/// Convert it to an ASCII identifier. Note: If you change this to a non-ASCII
+/// identifier, you're going to potentially cause trouble with non-BMP code
+/// points in target environments that don't support bracketed Unicode escapes.
pub fn ensureValidIdentifier(str: string, allocator: std.mem.Allocator) !string {
if (str.len == 0) {
return "_";
@@ -104,6 +104,239 @@ pub const Batch = struct {
}
};

+pub const WaitGroup = struct {
+mutex: std.Thread.Mutex = .{},
+counter: u32 = 0,
+event: std.Thread.ResetEvent,
+
+pub fn init(self: *WaitGroup) !void {
+self.* = .{
+.mutex = .{},
+.counter = 0,
+.event = undefined,
+};
+try self.event.init();
+}
+
+pub fn deinit(self: *WaitGroup) void {
+self.event.deinit();
+self.* = undefined;
+}
+
+pub fn start(self: *WaitGroup) void {
+self.mutex.lock();
+defer self.mutex.unlock();
+
+self.counter += 1;
+}
+
+pub fn finish(self: *WaitGroup) void {
+self.mutex.lock();
+defer self.mutex.unlock();
+
+self.counter -= 1;
+
+if (self.counter == 0) {
+self.event.set();
+}
+}
+
+pub fn wait(self: *WaitGroup) void {
+while (true) {
+self.mutex.lock();
+
+if (self.counter == 0) {
+self.mutex.unlock();
+return;
+}
+
+self.mutex.unlock();
+self.event.wait();
+}
+}
+
+pub fn reset(self: *WaitGroup) void {
+self.event.reset();
+}
+};
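The Zig `WaitGroup` above blocks a thread on a reset event until the counter drains to zero. JavaScript has no blocking wait, but the same counter-plus-signal shape can be sketched as a hypothetical promise-based analogue for illustration:

```javascript
// Hypothetical promise-based analogue of the WaitGroup above: start()
// increments a counter, finish() decrements it, and wait() resolves
// once the counter returns to zero.
class WaitGroup {
  constructor() {
    this.counter = 0;
    this.resolveDone = null;
    this.done = Promise.resolve();
  }
  start() {
    if (this.counter === 0) {
      // First pending task: arm a fresh "all done" promise (the reset event).
      this.done = new Promise((resolve) => (this.resolveDone = resolve));
    }
    this.counter += 1;
  }
  finish() {
    this.counter -= 1;
    if (this.counter === 0 && this.resolveDone) this.resolveDone();
  }
  wait() {
    return this.counter === 0 ? Promise.resolve() : this.done;
  }
}

const wg = new WaitGroup();
wg.start();
wg.start();
const pending = wg.wait().then(() => "all finished");
wg.finish();
wg.finish();
pending.then((msg) => console.log(msg)); // "all finished"
```

The mutex in the Zig version has no counterpart here because JavaScript mutates the counter on a single thread; in `thread_pool.zig` the lock is what makes `start`/`finish` safe to call from worker threads.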

pub fn ConcurrentFunction(
    comptime Function: anytype,
) type {
    return struct {
        const Fn = Function;
        const Args = std.meta.ArgsTuple(@TypeOf(Fn));
        const Runner = @This();
        thread_pool: *ThreadPool,
        states: []Routine = undefined,
        batch: Batch = .{},
        allocator: std.mem.Allocator,

        pub fn init(allocator: std.mem.Allocator, thread_pool: *ThreadPool, count: usize) !Runner {
            return Runner{
                .allocator = allocator,
                .thread_pool = thread_pool,
                .states = try allocator.alloc(Routine, count),
                .batch = .{},
            };
        }

        pub fn call(this: *@This(), args: Args) void {
            this.states[this.batch.len] = .{
                .args = args,
            };
            this.batch.push(Batch.from(&this.states[this.batch.len].task));
        }

        pub fn run(this: *@This()) void {
            this.thread_pool.schedule(this.batch);
        }

        pub const Routine = struct {
            args: Args,
            task: Task = .{ .callback = callback },

            pub fn callback(task: *Task) void {
                var routine = @fieldParentPtr(@This(), "task", task);
                @call(.{ .modifier = .always_inline }, Fn, routine.args);
            }
        };

        pub fn deinit(this: *@This()) void {
            this.allocator.free(this.states);
        }
    };
}
pub fn runner(
    this: *ThreadPool,
    allocator: std.mem.Allocator,
    comptime Function: anytype,
    count: usize,
) !ConcurrentFunction(Function) {
    return try ConcurrentFunction(Function).init(allocator, this, count);
}

/// Loop over an array of tasks and invoke `Run` on each one in a different thread.
/// **Blocks the calling thread** until all tasks are completed.
pub fn do(
    this: *ThreadPool,
    allocator: std.mem.Allocator,
    wg: ?*WaitGroup,
    ctx: anytype,
    comptime Run: anytype,
    values: anytype,
) !void {
    return try Do(this, allocator, wg, @TypeOf(ctx), ctx, Run, @TypeOf(values), values);
}

pub fn Do(
    this: *ThreadPool,
    allocator: std.mem.Allocator,
    wg: ?*WaitGroup,
    comptime Context: type,
    ctx: Context,
    comptime Function: anytype,
    comptime ValuesType: type,
    values: ValuesType,
) !void {
    if (values.len == 0)
        return;
    var allocated_wait_group: ?*WaitGroup = null;
    defer {
        if (allocated_wait_group) |group| {
            group.deinit();
            allocator.destroy(group);
        }
    }

    var wait_group = wg orelse brk: {
        allocated_wait_group = try allocator.create(WaitGroup);
        try allocated_wait_group.?.init();
        break :brk allocated_wait_group.?;
    };
    const WaitContext = struct {
        wait_group: *WaitGroup = undefined,
        ctx: Context,
    };

    const Runner = struct {
        pub fn call(ctx_: WaitContext, values_: ValuesType, i: usize) void {
            for (values_) |v, j| {
                Function(ctx_.ctx, v, i + j);
            }
            ctx_.wait_group.finish();
        }
    };

    const tasks_per_worker = @maximum(try std.math.divCeil(u32, @intCast(u32, values.len), this.max_threads), 1);
    // Round up: the loop below emits ceil(values.len / tasks_per_worker) chunks,
    // so `states` must have at least that many slots.
    const count = @truncate(u32, try std.math.divCeil(usize, values.len, tasks_per_worker));
    var runny = try runner(this, allocator, Runner.call, count);
    defer runny.deinit();

    var i: usize = 0;
    const context_ = WaitContext{
        .ctx = ctx,
        .wait_group = wait_group,
    };
    var remain = values;
    while (remain.len > 0) {
        var slice = remain[0..@minimum(remain.len, tasks_per_worker)];

        runny.call(.{
            context_,
            slice,
            i,
        });
        i += slice.len;
        remain = remain[slice.len..];
        wait_group.counter += 1;
    }
    runny.run();
    wait_group.wait();
}

test "parallel for loop" {
    Output.initTest();
    var thread_pool = ThreadPool.init(.{ .max_threads = 12 });
    var sleepy_time: u32 = 100;
    var huge_array = &[_]u32{
        sleepy_time + std.rand.DefaultPrng.init(1).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(2).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(3).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(4).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(5).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(6).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(7).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(8).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(9).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(10).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(11).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(12).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(13).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(14).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(15).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(16).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(17).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(18).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(19).random().uintAtMost(u32, 20),
        sleepy_time + std.rand.DefaultPrng.init(20).random().uintAtMost(u32, 20),
    };
    const Runner = struct {
        completed: usize = 0,
        total: usize = 0,
        pub fn run(ctx: *@This(), value: u32, _: usize) void {
            std.time.sleep(value);
            ctx.completed += 1;
            std.debug.assert(ctx.completed <= ctx.total);
        }
    };
    var runny = try std.heap.page_allocator.create(Runner);
    runny.* = .{ .total = huge_array.len };
    try thread_pool.doAndWait(std.heap.page_allocator, null, runny, Runner.run, std.mem.span(huge_array));
    try std.testing.expectEqual(huge_array.len, runny.completed);
}
/// Schedule a batch of tasks to be executed by some thread on the thread pool.
pub fn schedule(self: *ThreadPool, batch: Batch) void {
    // Sanity check
301
src/util.zig
Normal file
@@ -0,0 +1,301 @@

// Things that should go in Zig standard library at some point
const std = @import("std");

pub fn Key(comptime Map: type) type {
    return FieldType(Map.KV, "key").?;
}

pub fn Value(comptime Map: type) type {
    return FieldType(Map.KV, "value").?;
}

pub fn fromEntries(
    comptime Map: type,
    allocator: std.mem.Allocator,
    comptime EntryType: type,
    entries: EntryType,
) !Map {
    var map: Map = undefined;
    if (@hasField(Map, "allocator")) {
        map = Map.init(allocator);
    } else {
        map = Map{};
    }

    if (comptime std.meta.trait.isIndexable(EntryType)) {
        try map.ensureUnusedCapacity(entries.len);

        comptime var i: usize = 0;

        inline while (i < std.meta.fields(EntryType).len) : (i += 1) {
            map.putAssumeCapacity(entries[i].@"0", entries[i].@"1");
        }

        return map;
    } else if (comptime std.meta.trait.isContainer(EntryType) and @hasDecl(EntryType, "count")) {
        try map.ensureUnusedCapacity(entries.count());

        if (comptime @hasDecl(EntryType, "iterator")) {
            var iter = entries.iterator();
            while (iter.next()) |entry| {
                map.putAssumeCapacity(entry.@"0", entry.@"1");
            }

            return map;
        }
    } else if (comptime std.meta.trait.isContainer(EntryType) and std.meta.fields(EntryType).len > 0) {
        try map.ensureUnusedCapacity(std.meta.fields(EntryType).len);

        inline for (comptime std.meta.fields(EntryType)) |field| {
            const entry = @field(entries, field.name);
            map.putAssumeCapacity(entry.@"0", entry.@"1");
        }

        return map;
    } else if (comptime std.meta.trait.isConstPtr(EntryType) and std.meta.fields(std.meta.Child(EntryType)).len > 0) {
        try map.ensureUnusedCapacity(std.meta.fields(std.meta.Child(EntryType)).len);

        comptime var i: usize = 0;

        inline while (i < std.meta.fields(std.meta.Child(EntryType)).len) : (i += 1) {
            map.putAssumeCapacity(entries.*[i].@"0", entries.*[i].@"1");
        }

        return map;
    }

    @compileError("Cannot construct Map from entries of type " ++ @typeName(EntryType));
}

pub fn fromMapLike(
    comptime Map: type,
    allocator: std.mem.Allocator,
    entries: anytype,
) !Map {
    var map: Map = undefined;
    if (comptime @hasField(Map, "allocator")) {
        map = Map.init(allocator);
    } else {
        map = Map{};
    }

    try map.ensureUnusedCapacity(entries.count());

    var iter = entries.iterator();
    while (iter.next()) |entry| {
        map.putAssumeCapacityNoClobber(entry.key_ptr.*, entry.value_ptr.*);
    }

    return map;
}

pub fn FieldType(comptime Map: type, comptime name: []const u8) ?type {
    const i = std.meta.fieldIndex(Map, name) orelse return null;
    const field = std.meta.fields(Map)[i];
    return field.field_type;
}

pub fn Of(comptime ArrayLike: type) type {
    if (std.meta.trait.isSlice(ArrayLike)) {
        return std.meta.Child(ArrayLike);
    }

    if (comptime @hasField(ArrayLike, "Elem")) {
        return FieldType(ArrayLike, "Elem").?;
    }

    if (comptime @hasField(ArrayLike, "items")) {
        return std.meta.Child(FieldType(ArrayLike, "items").?);
    }

    if (comptime @hasField(ArrayLike, "ptr")) {
        return std.meta.Child(FieldType(ArrayLike, "ptr").?);
    }

    @compileError("Cannot infer type within " ++ @typeName(ArrayLike));
}

pub inline fn from(
    comptime Array: type,
    allocator: std.mem.Allocator,
    default: anytype,
) !Array {
    const DefaultType = @TypeOf(default);
    if (comptime std.meta.trait.isSlice(DefaultType)) {
        return fromSlice(Array, allocator, DefaultType, default);
    }

    if (comptime std.meta.trait.isContainer(DefaultType)) {
        if (comptime std.meta.trait.isContainer(Array) and @hasDecl(DefaultType, "put")) {
            return fromMapLike(Array, allocator, default);
        }

        if (comptime @hasField(DefaultType, "items")) {
            if (Of(FieldType(DefaultType, "items").?) == Of(Array)) {
                return fromSlice(Array, allocator, @TypeOf(default.items), default.items);
            }
        }
    }

    if (comptime std.meta.trait.isContainer(Array) and @hasDecl(Array, "put")) {
        if (comptime std.meta.trait.isConstPtr(DefaultType) and std.meta.fields(std.meta.Child(DefaultType)).len > 0) {
            return fromEntries(Array, allocator, @TypeOf(default.*), default.*);
        }
        return fromEntries(Array, allocator, DefaultType, default);
    }

    if (comptime @typeInfo(DefaultType) == .Struct) {
        return fromSlice(Array, allocator, DefaultType, default);
    }

    return fromSlice(Array, allocator, []const Of(Array), @as([]const Of(Array), default));
}

pub fn fromSlice(
    comptime Array: type,
    allocator: std.mem.Allocator,
    comptime DefaultType: type,
    default: DefaultType,
) !Array {
    var map: Array = undefined;
    if (comptime std.meta.trait.isSlice(Array)) {} else if (comptime @hasField(Array, "allocator")) {
        map = Array.init(allocator);
    } else {
        map = Array{};
    }

    // is it a MultiArrayList?
    if (comptime !std.meta.trait.isSlice(Array) and @hasField(Array, "bytes")) {
        try map.ensureUnusedCapacity(allocator, default.len);
        for (default) |elem| {
            map.appendAssumeCapacity(elem);
        }

        return map;
    } else {
        var slice: []Of(Array) = undefined;
        if (comptime !std.meta.trait.isSlice(Array)) {
            // is it an ArrayList with an allocator?
            if (@hasField(Array, "allocator")) {
                try map.ensureUnusedCapacity(default.len);
                // is it an ArrayList without an allocator?
            } else {
                try map.ensureUnusedCapacity(allocator, default.len);
            }

            map.items.len = default.len;
            slice = map.items;
        } else if (comptime std.meta.trait.isSlice(Array)) {
            slice = try allocator.alloc(Of(Array), default.len);
        } else if (comptime @hasField(Array, "ptr")) {
            slice = try allocator.alloc(Of(Array), default.len);
            map = .{
                .ptr = slice.ptr,
                .len = @truncate(u32, default.len),
                .cap = @truncate(u32, default.len),
            };
        }

        if (comptime std.meta.trait.isIndexable(DefaultType) and (std.meta.trait.isSlice(DefaultType) or std.meta.trait.is(.Array)(DefaultType))) {
            var in = std.mem.sliceAsBytes(default);
            var out = std.mem.sliceAsBytes(slice);
            @memcpy(out.ptr, in.ptr, in.len);
        } else {
            @compileError("Needs a more specific type to copy from");
        }

        if (comptime std.meta.trait.isSlice(Array)) {
            return @as(Array, slice);
        }

        return map;
    }
}
/// Say you need to allocate a bunch of tiny arrays.
/// You could do a separate allocation for each, but that is slow.
/// With std.ArrayList, pointers are invalidated on resize, which means it will crash.
/// A better idea is to batch your allocations into one larger allocation
/// and make all the arrays point to different parts of the larger allocation.
pub fn Batcher(comptime Type: type) type {
    return struct {
        head: []Type,

        pub fn init(allocator: std.mem.Allocator, count: usize) !@This() {
            var all = try allocator.alloc(Type, count);
            return @This(){ .head = all };
        }

        pub inline fn done(this: *@This()) void {
            std.debug.assert(this.head.len == 0);
        }

        pub inline fn eat(this: *@This(), value: Type) *Type {
            this.head[0] = value;
            var prev = &this.head[0];
            this.head = this.head[1..];
            return prev;
        }

        pub inline fn eat1(this: *@This(), value: Type) []Type {
            this.head[0] = value;
            var prev = this.head[0..1];
            this.head = this.head[1..];
            return prev;
        }

        pub inline fn next(this: *@This(), values: anytype) []Type {
            this.head[0..values.len].* = values;
            var prev = this.head[0..values.len];
            this.head = this.head[values.len..];
            return prev;
        }
    };
}
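A hedged sketch of how the batcher might be driven — the values and count here are illustrative, and `allocator` is assumed to be in scope:

```zig
// Hypothetical usage: one bulk allocation of 3 slots, handed out piecewise.
var batcher = try Batcher(u32).init(allocator, 3);
const one = batcher.eat(1); // *u32 into the shared buffer; stable, never reallocated
const pair = batcher.next([2]u32{ 2, 3 }); // []u32 of length 2 from the same buffer
batcher.done(); // asserts all 3 reserved slots were consumed
_ = one;
_ = pair;
```

Because `head` only ever shrinks from the front and the backing buffer never reallocates, every pointer and slice handed out stays valid for the life of the batch.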
test "fromEntries" {
    const values = try from(std.AutoHashMap(u32, u32), std.heap.page_allocator, .{
        .{ 123, 456 },
        .{ 789, 101112 },
    });
    const mapToMap = try from(std.AutoHashMap(u32, u32), std.heap.page_allocator, values);
    try std.testing.expectEqual(values.get(123).?, 456);
    try std.testing.expectEqual(values.get(789).?, 101112);
    try std.testing.expectEqual(mapToMap.get(123).?, 456);
    try std.testing.expectEqual(mapToMap.get(789).?, 101112);
}

test "from" {
    const values = try from(
        []const u32,
        std.heap.page_allocator,
        &.{ 1, 2, 3, 4, 5, 6 },
    );
    try std.testing.expectEqualSlices(u32, &.{ 1, 2, 3, 4, 5, 6 }, values);
}

test "from arraylist" {
    const values = try from(
        std.ArrayList(u32),
        std.heap.page_allocator,
        &.{ 1, 2, 3, 4, 5, 6 },
    );
    try std.testing.expectEqualSlices(u32, &.{ 1, 2, 3, 4, 5, 6 }, values.items);

    const cloned = try from(
        std.ArrayListUnmanaged(u32),
        std.heap.page_allocator,
        values,
    );

    try std.testing.expectEqualSlices(u32, &.{ 1, 2, 3, 4, 5, 6 }, cloned.items);
}

test "from arraylist with struct" {
    const Entry = std.meta.Tuple(&.{ u32, u32 });
    const values = try from(
        std.ArrayList(Entry),
        std.heap.page_allocator,
        &.{ Entry{ 123, 456 }, Entry{ 123, 456 }, Entry{ 123, 456 }, Entry{ 123, 456 } },
    );
    try std.testing.expectEqualSlices(Entry, &[_]Entry{ .{ 123, 456 }, .{ 123, 456 }, .{ 123, 456 }, .{ 123, 456 } }, values.items);
}