Compare commits


214 Commits

Author SHA1 Message Date
Jarred Sumner
5164b69b57 Fix build issues 2021-07-26 18:57:09 -07:00
Jarred Sumner
eec8f5afcf thats a mistake 2021-07-26 18:56:04 -07:00
Jarred Sumner
a7214ab61c cool 2021-07-26 16:39:40 -07:00
Jarred Sumner
796a9854b4 wip 2021-07-24 15:00:08 -07:00
Jarred Sumner
0d79861a94 ok 2021-07-22 15:41:03 -07:00
Jarred Sumner
a8a4f28046 most of the bindings! 2021-07-21 02:07:07 -07:00
Jarred Sumner
f2ea202730 wip 2021-07-19 18:16:32 -07:00
Jarred Sumner
eddbafadfb wrapped! 2021-07-19 17:15:57 -07:00
Jarred Sumner
74e75d6925 WIP bindings 2021-07-18 02:12:58 -07:00
Jarred Sumner
6d25d816a0 1 2021-07-16 18:17:00 -07:00
Jarred Sumner
4a4039fcb3 .gitmodules 2021-07-16 18:16:48 -07:00
Jarred Sumner
3f2a2e25ce wip 2021-07-14 13:11:03 -07:00
Jarred Sumner
f4381bb297 ts 2021-07-14 00:14:11 -07:00
Jarred Sumner
ab73c7b323 alright 2021-07-13 10:32:57 -07:00
Jarred Sumner
6ba2feb8d2 WIP 2021-07-11 15:40:14 -07:00
Jarred Sumner
88aad6aeb1 this kind of works, but there is a crash when bundling. I think its missing a Stmt.Data.Store.reset() 2021-07-01 05:12:15 -07:00
Jarred Sumner
a8536a7337 asdasdasd 2021-06-30 02:38:28 -07:00
Jarred Sumner
cb2ee39bfa latest 2021-06-30 02:38:23 -07:00
Jarred Sumner
3f197d1ce0 Fix crash, fix detecting node_modules, fix undefined not being simplified 2021-06-29 17:47:58 -07:00
Jarred Sumner
26745bb5f3 alright now it crashes 2021-06-29 01:34:38 -07:00
Jarred Sumner
23e64279d8 require_ref 2021-06-28 23:14:46 -07:00
Jarred Sumner
ae113559c6 starting to work 2021-06-27 23:36:35 -07:00
Jarred Sumner
506d9b81a7 wip 2021-06-26 23:12:57 -07:00
Jarred Sumner
3a95a74b7f I like this direction 2021-06-24 22:55:42 -07:00
Jarred Sumner
c3f9d77391 Support live-reload and fallback 2021-06-20 18:15:13 -07:00
Jarred Sumner
cbb6f8b1a9 Rename CSS stress test 2021-06-20 17:38:55 -07:00
Jarred Sumner
88ac58962e g 2021-06-20 17:38:44 -07:00
Jarred Sumner
73247e5c91 Feature detect fast refresh 2021-06-19 20:19:50 -07:00
Jarred Sumner
5015c6a969 color looper i'll probably get rid of 2021-06-19 18:30:32 -07:00
Jarred Sumner
72f1c676b9 Use CSSOM for HMR when available. 2021-06-19 18:26:43 -07:00
Jarred Sumner
3d9028f0ee Show line counts for CSS 2021-06-19 18:23:15 -07:00
Jarred Sumner
3f10c87906 CSS HMR! 2021-06-18 20:48:07 -07:00
Jarred Sumner
e0fa2e78da 100x!! 2021-06-18 00:51:11 -07:00
Jarred Sumner
4ca1e17778 CSS scanner works 2021-06-17 11:14:20 -07:00
Jarred Sumner
6e2c6cd6ea Skeleton! 2021-06-16 16:23:02 -07:00
Jarred Sumner
d0f91082fc HMR crashily works, started working on CSS Scanner 2021-06-14 19:45:51 -07:00
Jarred Sumner
44fce3c5e8 extremely close!!!!! 2021-06-14 01:49:53 -07:00
Jarred Sumner
43380a4d68 I think thats the JS part of HMR 2021-06-12 19:10:08 -07:00
Jarred Sumner
f93472101a little kqueue fs watcher 2021-06-12 00:49:46 -07:00
Jarred Sumner
19f4c59b42 lil websocket server 2021-06-11 17:05:15 -07:00
Jarred Sumner
a1dd2a2a32 alright basic stuff works now. still bugs with JS parser 2021-06-11 10:53:55 -07:00
Jarred Sumner
8070da6ec9 The code looks like it might work 2021-06-10 14:57:08 -07:00
Jarred Sumner
b1dd6cf400 We don't do the exports resolve step, so we must copy the namespace alias 2021-06-10 14:56:39 -07:00
Jarred Sumner
5ffd8e40b3 cool! 2021-06-10 01:07:42 -07:00
Jarred Sumner
f8b7e45f0c Fix expression simplification bug 2021-06-09 22:45:31 -07:00
Jarred Sumner
8444492e3c ok 2021-06-09 18:11:17 -07:00
Jarred Sumner
7346cdaa5a lots 2021-06-09 13:26:30 -07:00
Jarred Sumner
94304788dd add prop 2021-06-08 22:34:35 -07:00
Jarred Sumner
cdb9af36c1 Generate summary 2021-06-08 02:54:50 -07:00
Jarred Sumner
aa554728f1 json 2021-06-06 21:17:48 -07:00
Jarred Sumner
48e56be05e Fix crash that happens when hundreds of files have been parsed and process.env.NODE_ENV is accessed 2021-06-06 21:17:43 -07:00
Jarred Sumner
b97aca7fa1 hash 2021-06-06 21:16:50 -07:00
Jarred Sumner
5d208f9ea0 Upgrade hash table 2021-06-06 21:16:43 -07:00
Jarred Sumner
f9b02d2891 Fix bug printing large scientific notation floats 2021-06-06 21:15:07 -07:00
Jarred Sumner
024b2ea94e hm 2021-06-06 18:34:16 -07:00
Jarred Sumner
797b2ff557 WIP node module bundles 2021-06-06 18:34:01 -07:00
Jarred Sumner
7d6950da46 wip 2021-06-06 18:33:53 -07:00
Jarred Sumner
af7054c69b debug-only print 2021-06-06 14:12:52 -07:00
Jarred Sumner
a93c53651c Bump schema 2021-06-06 14:12:08 -07:00
Jarred Sumner
38fe54261d Generate parser versions at compile time 2021-06-04 19:30:26 -07:00
Jarred Sumner
7c400c9b24 oops typescript 2021-06-04 19:30:08 -07:00
Jarred Sumner
c17c200cfd Fix extra space in printer 2021-06-04 18:26:55 -07:00
Jarred Sumner
1cc15b6c20 Fix extra underscore in ensureValidIdentifier 2021-06-04 17:33:11 -07:00
Jarred Sumner
e1a8852706 Generate differnet versions of Bundler, Resolver, and Caches at comptime based on whether we're serving over HTTP 2021-06-04 16:06:38 -07:00
Jarred Sumner
981759fafa mostly fix --resolve=disable 2021-06-04 14:46:46 -07:00
Jarred Sumner
f584a38a72 rename 2021-06-04 13:50:22 -07:00
Jarred Sumner
a0d6d02fe8 readme was premature 2021-06-04 03:46:52 -07:00
Jarred Sumner
58d77ab827 fix the leaks 2021-06-04 02:47:07 -07:00
Jarred Sumner
0fb2584f15 okl 2021-06-02 20:37:04 -07:00
Jarred Sumner
72813b5b48 keep that 2021-06-02 20:36:23 -07:00
Jarred Sumner
d49df1df57 HTTP fixes + buffer stdout/in + a little HTTP caching 2021-06-02 16:39:40 -07:00
Jarred Sumner
44bab947c6 JSX & CJS work end-to-end! 2021-06-02 12:48:38 -07:00
Jarred Sumner
a6bc130918 wip 2021-06-01 20:49:49 -07:00
Jarred Sumner
73452660fd linker things 2021-05-31 20:30:40 -07:00
Jarred Sumner
52f37e4fe4 Fix printing bugs 2021-05-30 23:35:43 -07:00
Jarred Sumner
7dc3ee4c89 cool 2021-05-30 18:26:18 -07:00
Jarred Sumner
1d3588ef5a stderr 2021-05-30 15:42:47 -07:00
Jarred Sumner
4a3b4953ee Fix indent 2021-05-30 15:32:39 -07:00
Jarred Sumner
09ceececba Fix Define, JSX, use more pointers for property access to minimize calls to memmove 2021-05-30 12:50:08 -07:00
Jarred Sumner
cfda423c01 This'll do for now, I guess. 2021-05-30 01:17:55 -07:00
Jarred Sumner
95d5bc78f2 This blocks approach seems to work 2021-05-30 00:17:17 -07:00
Jarred Sumner
6c2d19c1b0 Revert "WIP"
This reverts commit 55dcde581d.
2021-05-29 19:55:35 -07:00
Jarred Sumner
55dcde581d WIP 2021-05-29 17:40:00 -07:00
Jarred Sumner
b876a8d480 microp 2021-05-29 13:33:48 -07:00
Jarred Sumner
0657f16e27 fix crash 2021-05-29 13:33:37 -07:00
Jarred Sumner
42aaa8eb81 microp-optimize hash table stuff 2021-05-29 13:33:31 -07:00
Jarred Sumner
cd91772f2f Move wyhasxh 2021-05-29 13:32:13 -07:00
Jarred Sumner
f1403901d9 2 2021-05-28 23:48:10 -07:00
Jarred Sumner
664dbf569c all 2021-05-28 23:26:13 -07:00
Jarred Sumner
95a3b72e94 w 2021-05-28 23:04:40 -07:00
Jarred Sumner
efe1479299 wap 2021-05-28 23:04:12 -07:00
Jarred Sumner
6dde9a7540 okay 2021-05-28 22:59:46 -07:00
Jarred Sumner
474df61a86 maekfile 2021-05-28 22:48:37 -07:00
Jarred Sumner
7c576d7b62 commit 2021-05-28 22:47:30 -07:00
Jarred Sumner
4e1619c17a typo 2021-05-28 13:34:02 -07:00
Jarred Sumner
54d9969b4c Fix integer overflow 2021-05-28 13:33:02 -07:00
Jarred Sumner
91d6bf26b9 Remove legacy_octal_loc 2021-05-28 13:32:55 -07:00
Jarred Sumner
2172f3c5e3 keep lexer/loc 2021-05-28 13:27:05 -07:00
Jarred Sumner
d44fa1ca92 launch.json 2021-05-28 13:23:29 -07:00
Jarred Sumner
e72ad4777c fixtures 2021-05-28 13:22:31 -07:00
Jarred Sumner
f1bcf07e2b gitignore 2021-05-27 21:35:41 -07:00
Jarred Sumner
cbf0b77e52 lists 2021-05-27 21:35:28 -07:00
Jarred Sumner
b6e7f01e6a stmt experiment 2021-05-27 18:50:20 -07:00
Jarred Sumner
d1b3bce067 lots 2021-05-27 16:38:53 -07:00
Jarred Sumner
05b9e89417 Fix blah = value inside function args 2021-05-27 15:27:58 -07:00
Jarred Sumner
ebefe97073 Fix yield* 2021-05-27 14:56:53 -07:00
Jarred Sumner
84ea80b813 Fix parsing await inside scopes that contain functions, return the backtracking error in TypeScript 2021-05-27 14:42:33 -07:00
Jarred Sumner
3bcce51fa4 Error message for using node builtins outside of platform == .node 2021-05-27 14:38:28 -07:00
Jarred Sumner
2c212929f8 faster writes performance 2021-05-27 00:42:02 -07:00
Jarred Sumner
9b47a8791e trying to fix outbase 2021-05-27 00:41:33 -07:00
Jarred Sumner
ca2897b466 node builtins 2021-05-27 00:40:56 -07:00
Jarred Sumner
453cfa5689 fuckin with absolute paths 2021-05-27 00:40:47 -07:00
Jarred Sumner
7337f27a7e Use a normal string to represent template literal content for easier UTF8/UTF16 mixing 2021-05-26 18:15:49 -07:00
Jarred Sumner
96a33924fb Skip slow path 2021-05-26 18:15:21 -07:00
Jarred Sumner
bc794e89ed FIx parsing 2 digit hex 2021-05-26 18:14:49 -07:00
Jarred Sumner
bb7404d6bc Fix returning parse errors and template tags 2021-05-26 18:14:36 -07:00
Jarred Sumner
f47d7b3d2d More reliable path storage 2021-05-26 18:13:54 -07:00
Jarred Sumner
831a461916 Fix base_url always null 2021-05-26 18:13:39 -07:00
Jarred Sumner
0e6c46819a Fix tempalte tags 2021-05-26 18:11:55 -07:00
Jarred Sumner
9b5f317c5b detect JSON errors 2021-05-26 18:11:28 -07:00
Jarred Sumner
63b8182b7c I love enums 2021-05-26 13:15:52 -07:00
Jarred Sumner
1e0eb4012a namespace/enum? is that it? 2021-05-26 12:23:09 -07:00
Jarred Sumner
bc4e76c2a5 print_ast feature flag 2021-05-26 12:21:02 -07:00
Jarred Sumner
9c2c005b58 import== 2021-05-26 10:08:14 -07:00
Jarred Sumner
84472ed57f lexer bug! 2021-05-26 10:07:56 -07:00
Jarred Sumner
d04ef8c53f cloner 2021-05-25 23:34:14 -07:00
Jarred Sumner
f2a9dc9eea like all of typescript lol 2021-05-25 20:01:33 -07:00
Jarred Sumner
1cd6b587a2 newline 2021-05-25 20:01:21 -07:00
Jarred Sumner
28534b2e34 add cat, microoptimize the microptimize 2021-05-25 20:01:16 -07:00
Jarred Sumner
6c7eeb2030 mostly just zig fmt 2021-05-25 20:01:06 -07:00
Jarred Sumner
9b206cca2b Malformed headers breaks request parsing 2021-05-25 11:12:22 -07:00
Jarred Sumner
06fbc24b11 relative path 2021-05-25 01:34:44 -07:00
Jarred Sumner
a2637e9016 w 2021-05-24 12:44:49 -07:00
Jarred Sumner
244ae8c593 try 2021-05-24 12:44:39 -07:00
Jarred Sumner
f7ed006a08 ok 2021-05-24 12:44:23 -07:00
Jarred Sumner
5f72442386 ok 2021-05-24 12:44:13 -07:00
Jarred Sumner
1957f0fc23 little separation 2021-05-23 11:10:23 -07:00
Jarred Sumner
37e17be7aa The little things 2021-05-23 10:20:48 -07:00
Jarred Sumner
ac4ac8f5a8 muck 2021-05-23 10:05:21 -07:00
Jarred Sumner
45b55a8970 http server can load static files...slowly. 2021-05-22 23:25:25 -07:00
Jarred Sumner
63e622f2f3 wip 2021-05-21 17:55:42 -07:00
Jarred Sumner
cee857ac4e pico 2021-05-20 02:34:42 -07:00
Jarred Sumner
6475442469 cool 2021-05-19 23:12:23 -07:00
Jarred Sumner
23220fd348 Starting to work on rutnime 2021-05-19 19:30:24 -07:00
Jarred Sumner
4f1d32be16 tread 2021-05-18 20:33:45 -07:00
Jarred Sumner
f502d0f1a4 decodeEscapeSequences...kiond of? 2021-05-18 20:32:55 -07:00
Jarred Sumner
4d6a8f598a hm 2021-05-18 20:06:08 -07:00
Jarred Sumner
78fa4c4f87 Fix DotDefine 2021-05-18 14:40:37 -07:00
Jarred Sumner
1c80859431 Fix label parsing 2021-05-18 14:07:51 -07:00
Jarred Sumner
957e871f4a Fix duplicate exports error 2021-05-18 13:49:23 -07:00
Jarred Sumner
0840845d68 Fix "in" keyword 2021-05-18 13:13:04 -07:00
Jarred Sumner
2ef6397ab9 Resolver is fast now! 2021-05-18 02:24:40 -07:00
Jarred Sumner
9ccb4dd082 lots 2021-05-16 23:25:12 -07:00
Jarred Sumner
d8b1d29656 lots 2021-05-15 17:23:55 -07:00
Jarred Sumner
778c24f176 keep 2021-05-13 23:22:08 -07:00
Jarred Sumner
248d1a7a93 w 2021-05-13 21:12:41 -07:00
Jarred Sumner
e1f996e1b8 more utf8 2021-05-13 21:10:02 -07:00
Jarred Sumner
dbcddc79fc bugfix 2021-05-13 21:09:41 -07:00
Jarred Sumner
7243945291 bug fixes galore 2021-05-13 17:44:50 -07:00
Jarred Sumner
b42b239344 okay 2021-05-13 13:51:40 -07:00
Jarred Sumner
87771ba895 various bug fixes 2021-05-13 01:24:10 -07:00
Jarred Sumner
28fce4aac1 hm 2021-05-13 00:46:22 -07:00
Jarred Sumner
d8828b69d8 hm 2021-05-12 20:40:38 -07:00
Jarred Sumner
80037859ec okay I think that's most of resolving packages/imports algorithm!!! 2021-05-12 20:33:58 -07:00
Jarred Sumner
51df94e599 cool 2021-05-12 13:17:26 -07:00
Jarred Sumner
f9a74df73d That's all the errors?? 2021-05-12 13:00:25 -07:00
Jarred Sumner
2c20d88e8d okay 2021-05-12 01:46:58 -07:00
Jarred Sumner
8df97221a4 now we do resolver?? 2021-05-11 20:49:11 -07:00
Jarred Sumner
cf4d0fe3b6 cool 2021-05-11 20:26:13 -07:00
Jarred Sumner
a5f1670e92 update 2021-05-11 18:39:00 -07:00
Jarred Sumner
d75a1deb4a opts 2021-05-11 17:19:08 -07:00
Jarred Sumner
033b74cc2a submodule 2021-05-11 11:55:38 -07:00
Jarred Sumner
2b3c0584c6 asdasdasdasd 2021-05-10 20:05:53 -07:00
Jarred Sumner
b7d8fe2f35 1day 2021-05-09 18:57:48 -07:00
Jarred Sumner
7d3b0e7daa Use try for errors during parsing so that backtracking can happen 2021-05-08 20:48:20 -07:00
Jarred Sumner
32cdc13f63 Okay this hunks solution seems to work for now. It's not _great_ though. 2021-05-08 19:41:52 -07:00
Jarred Sumner
2f4cd402e4 Fix exporting default twice 2021-05-08 18:12:54 -07:00
Jarred Sumner
6b863d5d51 Fix for loop initializer 2021-05-08 14:23:52 -07:00
Jarred Sumner
79223472f7 wip 2021-05-07 23:34:16 -07:00
Jarred Sumner
8c4917fe60 This _sort of_ works 2021-05-07 20:19:32 -07:00
Jarred Sumner
f4267e2d1f wip 2021-05-07 14:12:56 -07:00
Jarred Sumner
96ff169e46 cool 2021-05-07 01:26:26 -07:00
Jarred Sumner
741e1513b7 123 2021-05-05 19:02:36 -07:00
Jarred Sumner
3708dd4484 cool 2021-05-05 19:02:30 -07:00
Jarred Sumner
2cbd4c9d80 I think that fixes the scopes bug 2021-05-05 19:02:14 -07:00
Jarred Sumner
e0d01a9a91 alright 2021-05-05 13:12:19 -07:00
Jarred Sumner
e1df98878d damn tho 2021-05-05 03:09:59 -07:00
Jarred Sumner
596f3c064a Revert "the fast way"
This reverts commit 808e5cfac3.
2021-05-04 16:05:15 -07:00
Jarred Sumner
29fe5b730f hbm 2021-05-04 16:03:00 -07:00
Jarred Sumner
082d184848 w 2021-05-04 16:02:22 -07:00
Jarred Sumner
2e8d6d549d re 2021-05-04 16:02:09 -07:00
Jarred Sumner
6431b90b9e *src 2021-05-04 16:01:43 -07:00
Jarred Sumner
4c60accdc1 * 2021-05-04 16:01:21 -07:00
Jarred Sumner
808e5cfac3 the fast way 2021-05-04 15:58:18 -07:00
Jarred Sumner
0bfd74af55 slice 2021-05-04 15:56:55 -07:00
Jarred Sumner
83ff3453dc keeper 2021-05-04 15:54:17 -07:00
Jarred Sumner
e034383833 it works??? 2021-05-03 22:37:28 -07:00
Jarred Sumner
1d44b63675 hm 2021-05-03 20:29:38 -07:00
Jarred Sumner
468927c14b maybePrintSpace 2021-05-02 23:45:41 -07:00
Jarred Sumner
c8a8da370c wip 2021-05-02 18:24:46 -07:00
Jarred Sumner
195c69606b shorthand 2021-05-02 16:42:15 -07:00
Jarred Sumner
8db9c7650c various 2021-05-02 16:25:14 -07:00
Jarred Sumner
818d014931 classes work, excluding name and constructor/super 2021-05-02 13:04:55 -07:00
Jarred Sumner
f59ec8d6c0 Assorted bugfixes but the next step really is porting tests and fixing 2021-05-01 01:28:40 -07:00
Jarred Sumner
006ca4f13c it prints end to end though doesn't work yet 2021-04-30 17:26:17 -07:00
Jarred Sumner
107310d785 inching closure 2021-04-30 15:34:31 -07:00
Jarred Sumner
fd56d41c8e all in a days work 2021-04-30 00:55:15 -07:00
Jarred Sumner
daf9ea419b ao[slk 2021-04-29 22:12:22 -07:00
Jarred Sumner
ac83057d08 aoskdp 2021-04-29 21:46:07 -07:00
Jarred Sumner
2567243c8d hm 2021-04-29 20:22:25 -07:00
Jarred Sumner
38c7eb73c1 okay 2021-04-29 14:43:30 -07:00
Jarred Sumner
a32116476a wap 2021-04-29 14:03:01 -07:00
Jarred Sumner
4e3f680ac4 asdasd 2021-04-29 10:29:25 -07:00
Jarred Sumner
b37acf309c wip 2021-04-28 21:58:02 -07:00
254 changed files with 111125 additions and 9789 deletions

44
.gitignore vendored

@@ -3,4 +3,46 @@ zig-cache
*.wasm
*.o
*.a
*.a
profile.json
/package.json
node_modules
.swcrc
yarn.lock
dist
*.log
*.out.js
/package-lock.json
build
*.wat
zig-out
pnpm-lock.yaml
README.md.template
src/deps/zig-clap/example
src/deps/zig-clap/README.md
src/deps/zig-clap/.github
src/deps/zig-clap/.gitattributes
out
outdir
.trace
cover
coverage
coverv
*.trace
bench
github
out.*
out
.parcel-cache
esbuilddir
*.jsb
parceldist
esbuilddir
outdir/
outcss
.next
txt.js
.idea
.vscode/cpp*

9
.gitmodules vendored Normal file

@@ -0,0 +1,9 @@
# [submodule "src/deps/zig-clap"]
# path = src/deps/zig-clap
# url = https://github.com/Hejsil/zig-clap
[submodule "src/deps/picohttpparser"]
path = src/deps/picohttpparser
url = https://github.com/h2o/picohttpparser/
[submodule "src/javascript/jsc/WebKit"]
path = src/javascript/jsc/WebKit
url = git@github.com:/Jarred-Sumner/WebKit

48
.vscode/c_cpp_properties.json vendored Normal file

@@ -0,0 +1,48 @@
{
"configurations": [
{
"name": "Mac",
"forcedInclude": [
"${workspaceFolder}/src/javascript/jsc/bindings/root.h"
],
"includePath": [
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/WTF/Headers",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/*",
"${workspaceFolder}/src/JavaScript/jsc/bindings/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/Source/bmalloc/",
"${workspaceFolder}/src/javascript/jsc/WebKit/WebKitBuild/Release/ICU/Headers/"
],
"browse": {
"path": [
"${workspaceFolder}/src/javascript/jsc/bindings/*",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders/",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/WTF/Headers/**",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/WebKitBuild/Release/*",
"${workspaceFolder}/src/JavaScript/jsc/bindings/**",
"${workspaceFolder}/src/JavaScript/jsc/WebKit/Source/bmalloc/**",
"${workspaceFolder}/src/javascript/jsc/WebKit/WebKitBuild/Release/ICU/Headers/"
],
"limitSymbolsToIncludedHeaders": true,
"databaseFilename": ".vscode/cppdb"
},
"defines": [
"STATICALLY_LINKED_WITH_JavaScriptCore=1",
"STATICALLY_LINKED_WITH_WTF=1",
"BUILDING_WITH_CMAKE=1",
"NOMINMAX",
"ENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
"BUILDING_JSCONLY__",
"USE_FOUNDATION=1",
"ASSERT_ENABLED=0",
"DU_DISABLE_RENAMING=1"
],
"macFrameworkPath": [],
"compilerPath": "/usr/local/opt/llvm/bin/clang",
"cStandard": "c17",
"cppStandard": "c++11",
"intelliSenseMode": "macos-clang-x64"
}
],
"version": 4
}

340
.vscode/launch.json vendored

@@ -2,39 +2,327 @@
"version": "0.2.0",
"configurations": [
{
"name": "Test",
"type": "lldb",
"request": "launch",
"stdio": null,
"stopOnEntry": false,
"program": "/usr/local/bin/zig",
"cwd": "${workspaceFolder}",
"args": ["test", "${file}"],
"presentation": {
"hidden": false,
"group": "",
"order": 1
},
"env": {
"TERM": "xterm"
}
"name": "Roujtes",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./routes",
"--resolve=dev",
"--outdir=out"
// "--public-url=https://localhost:9000/"
],
"cwd": "${workspaceFolder}/demos/css-stress-test",
"console": "internalConsole"
},
{
"name": "Launch",
"type": "cppdbg",
"type": "lldb",
"request": "launch",
"program": "${workspaceFolder}/zig-cache/bin/esdev",
"args": [],
"stopAtEntry": false,
"name": "SPJS Lazy Build",
"program": "${workspaceFolder}/build/debug/macos-x86_64/spjs",
"args": ["src/test/fixtures/console.log.js"],
"cwd": "${workspaceFolder}",
"environment": [],
"externalConsole": false,
"MIMode": "lldb",
"internalConsoleOptions": "openOnSessionStart",
"logging": {
"moduleLoad": false
}
"console": "internalConsole"
},
// {
// "type": "lldb",
// "request": "launch",
// "name": "Test",
// "program": "${workspaceFolder}/zig-out/bin/test",
// "preLaunchTask": "test",
// "args": ["/usr/local/bin/zig"],
// "cwd": "${workspaceFolder}",
// "console": "internalConsole"
// },
{
"type": "lldb",
"request": "launch",
"name": "Eval Small TEst",
"program": "${workspaceFolder}/build/debug/macos-x86_64/spjs",
"args": [
"./quoted-escape.js",
"--resolve=dev",
"--outdir=outcss"
// "--public-url=https://localhost:9000/"
],
"cwd": "${workspaceFolder}/src/test/fixtures",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Eval",
"program": "${workspaceFolder}/build/debug/macos-x86_64/spjs",
"args": [
"src/index.tsx",
"--resolve=dev",
"--outdir=outcss"
// "--public-url=https://localhost:9000/"
],
"cwd": "${workspaceFolder}/demos/css-stress-test",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Dev Launch",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./simple.css",
"--resolve=dev",
"--outdir=outcss",
"--public-url=https://localhost:9000/"
],
"cwd": "${workspaceFolder}/src/test/fixtures",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Demo Serve",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./src/index.tsx",
"--resolve=lazy",
"--outdir=public",
"--serve",
"--public-url=http://localhost:9000/"
],
"cwd": "${workspaceFolder}/demos/css-stress-test",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Demo Lazy Build",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./src/index.tsx",
"--resolve=lazy",
"--public-url=http://localhost:9000/"
],
"cwd": "${workspaceFolder}/demos/simple-react",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Demo Build",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./src/index.tsx",
"--resolve=dev",
"--outdir=outcss",
"--platform=browser",
"--public-url=http://localhost:9000/"
],
"cwd": "${workspaceFolder}/demos/css-stress-test",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Demo .jsb",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./src/index.tsx",
"--resolve=lazy",
"--public-url=http://localhost:9000/"
],
"cwd": "${workspaceFolder}/demos/simple-react",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Demo Build .jsb",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": [
"./src/index.tsx",
"--public-url=http://localhost:9000/",
"--new-jsb"
],
"cwd": "${workspaceFolder}/demos/simple-react",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Demo Print .jsb",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"args": ["./node_modules.jsb"],
"cwd": "${workspaceFolder}/demos/simple-react",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "DAev Launch",
"program": "${workspaceFolder}/build/macos-x86_64/esdev",
"args": ["./simple.jsx", "--resolve=disable"],
"cwd": "${workspaceFolder}/src/test/fixtures",
"console": "internalConsole"
},
{
"name": "esbuild",
"type": "go",
"request": "launch",
"mode": "debug",
"program": "/Users/jarred/Code/esbuild/cmd/esbuild",
"cwd": "/Users/jarred/Code/esdev/src/test/fixtures",
"args": ["--bundle", "--outfile=out.esbuild.js", "await.ts"]
},
// {
// "type": "lldb",
// "request": "launch",
// "name": "Dev Launch (other)",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
// "args": ["./simple.jsx", "--resolve=disable"],
// "cwd": "${workspaceFolder}/src/test/fixtures",
// "console": "internalConsole"
// },
// {
// "type": "lldb",
// "request": "launch",
// "name": "Dev Launch",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
// "preLaunchTask": "build",
// "args": [
// "--resolve=disable",
// "--cwd",
// "/Users/jarredsumner/Code/esdev/src/test/fixtures",
// "escape-chars.js"
// ],
// "cwd": "${workspaceFolder}",
// "console": "internalConsole"
// }
// {
// "type": "lldb",
// "request": "launch",
// "name": "Dev Launch",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
// "preLaunchTask": "build",
// "args": [
// "--resolve=dev",
// "--cwd",
// "/Users/jarredsumner/Builds/esbuild/bench/three/src/",
// "./entry.js",
// "-o",
// "out"
// ],
// "cwd": "/Users/jarredsumner/Builds/esbuild/bench/three/src",
// "console": "internalConsole"
// }
// {
// "type": "lldb",
// "request": "launch",
// "name": "Dev Launch",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
// "preLaunchTask": "build",
// "args": [
// "--resolve=dev",
// "--cwd",
// "/Users/jarredsumner/Builds/esbuild/bench/three/src/",
// "./entry.js",
// "-o",
// "out"
// ],
// "cwd": "${workspaceFolder}",
// "console": "internalConsole"
// }
// {
// "type": "lldb",
// "request": "launch",
// "name": "Dev Launch",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
// // "preLaunchTask": "build",
// "args": [
// "--resolve=dev",
// "--cwd",
// "./src/api/demo",
// "pages/index.jsx",
// "-o",
// "out",
// "--public-url=https://hello.com/",
// "--serve"
// ],
// "cwd": "${workspaceFolder}",
// "console": "internalConsole"
// }
{
"type": "lldb",
"request": "launch",
"name": "Rome",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"program": "${workspaceFolder}/build/macos-x86_64/esdev",
// "preLaunchTask": "build",
"args": [
"--resolve=dev",
// "--resolve=lazy",
"--cwd",
"${workspaceFolder}/bench/rome/src",
"entry",
"--platform=node",
// "@romejs/js-analysis/evaluators/modules/ImportCall.ts",
"--outdir=${workspaceFolder}/bench/rome/src/out",
// "@romejs/cli-diagnostics/banners/success.json",
"--public-url=https://hello.com/"
],
"cwd": "${workspaceFolder}/bench/rome/src",
"console": "internalConsole"
},
{
"type": "lldb",
"request": "launch",
"name": "Rome Dev",
// "program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
"program": "${workspaceFolder}/build/debug/macos-x86_64/esdev",
// "preLaunchTask": "build",
"args": [
"--resolve=dev",
// "--resolve=lazy",
"--cwd",
"${workspaceFolder}/bench/rome/src",
"entry",
"--platform=node",
// "@romejs/js-analysis/evaluators/modules/ImportCall.ts",
"--outdir=${workspaceFolder}/bench/rome/src/out",
// "@romejs/cli-diagnostics/banners/success.json",
"--public-url=https://hello.com/"
],
"cwd": "${workspaceFolder}/bench/rome/src",
"console": "internalConsole"
}
// {
// "type": "lldb",
// "request": "launch",
// "name": "Dev Launch",
// "program": "${workspaceFolder}/build/bin/debug/esdev",
// "preLaunchTask": "build",
// "args": [
// "--resolve=dev",
// "--cwd",
// "/",
// "/Users/jarredsumner/Code/esdev/src/test/fixtures/img-bug.js",
// "-o",
// "out"
// ],
// "cwd": "${workspaceFolder}",
// "console": "internalConsole",
// "presentation": {
// "hidden": false,
// "group": "",
// "order": 1
// }
// }
]
}

53
.vscode/settings.json vendored Normal file

@@ -0,0 +1,53 @@
{
"git.autoRepositoryDetection": "openEditors",
"search.quickOpen.includeSymbols": true,
"search.seedWithNearestWord": true,
"search.smartCase": true,
"search.exclude": {
"src/javascript/jsc/WebKit/**/*": true
},
"search.followSymlinks": false,
"search.useIgnoreFiles": true,
"C_Cpp.files.exclude": {
"**/.vscode": true,
"src/javascript/jsc/WebKit/JSTests": true,
"src/javascript/jsc/WebKit/Tools": true,
"src/javascript/jsc/WebKit/WebDriverTests": true,
"src/javascript/jsc/WebKit/WebKit.xcworkspace": true,
"src/javascript/jsc/WebKit/WebKitLibraries": true,
"src/javascript/jsc/WebKit/Websites": true,
"src/javascript/jsc/WebKit/resources": true,
"src/javascript/jsc/WebKit/LayoutTests": true,
"src/javascript/jsc/WebKit/ManualTests": true,
"src/javascript/jsc/WebKit/PerformanceTests": true,
"src/javascript/jsc/WebKit/WebKitLegacy": true,
"src/javascript/jsc/WebKit/WebCore": true,
"src/javascript/jsc/WebKit/WebDriver": true,
"src/javascript/jsc/WebKit/WebKitBuild": true,
"src/javascript/jsc/WebKit/WebInspectorUI": true
},
"files.associations": {
"*.idl": "cpp",
"memory": "cpp",
"iostream": "cpp",
"algorithm": "cpp",
"random": "cpp",
"ios": "cpp",
"filesystem": "cpp",
"__locale": "cpp",
"type_traits": "cpp",
"__mutex_base": "cpp",
"__string": "cpp",
"string": "cpp",
"string_view": "cpp",
"typeinfo": "cpp",
"__config": "cpp",
"__nullptr": "cpp",
"exception": "cpp",
"__bit_reference": "cpp",
"atomic": "cpp",
"utility": "cpp",
"sstream": "cpp"
}
}

28
.vscode/tasks.json vendored

@@ -3,9 +3,17 @@
"tasks": [
{
"label": "build",
"type": "shell",
"command": "zig build",
"type": "process",
"command": "zig",
"args": ["build"],
"presentation": {
"echo": true,
"reveal": "silent",
"focus": false,
"panel": "shared",
"showReuseMessage": false,
"clear": false
},
"group": {
"kind": "build",
"isDefault": true
@@ -26,7 +34,15 @@
"label": "test",
"type": "shell",
"command": "zig",
"args": ["test", "${file}", "-femit-bin=zig-cache/bin/test"],
"args": [
"test",
"${file}",
"--main-pkg-path",
"${workspaceFolder}",
"-femit-bin=${workspaceFolder}/zig-out/bin/test",
";",
"true"
],
"group": {
"kind": "test",
@@ -34,7 +50,9 @@
},
"presentation": {
"showReuseMessage": false,
"clear": true
"clear": true,
"panel": "new",
"reveal": "always"
}
}
]

147
Makefile Normal file

@@ -0,0 +1,147 @@
speedy: speedy-prod-native speedy-prod-wasi speedy-prod-wasm
api:
peechy --schema src/api/schema.peechy --esm src/api/schema.js --ts src/api/schema.d.ts --zig src/api/schema.zig
jsc: jsc-build jsc-bindings
jsc-build: jsc-build-mac
jsc-bindings:
jsc-bindings-headers
jsc-bindings-mac
jsc-bindings-headers:
zig build headers
jsc-build-mac:
cd src/javascript/jsc/WebKit && ICU_INCLUDE_DIRS="/usr/local/opt/icu4c/include" ./Tools/Scripts/build-jsc --jsc-only --cmakeargs="-DENABLE_STATIC_JSC=ON -DCMAKE_BUILD_TYPE=relwithdebinfo" && echo "Ignore the \"has no symbols\" errors"
SRC_DIR := src/javascript/jsc/bindings
OBJ_DIR := src/javascript/jsc/bindings-obj
SRC_FILES := $(wildcard $(SRC_DIR)/*.cpp)
OBJ_FILES := $(patsubst $(SRC_DIR)/%.cpp,$(OBJ_DIR)/%.o,$(SRC_FILES))
CLANG_FLAGS = -Isrc/JavaScript/jsc/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders \
-Isrc/javascript/jsc/WebKit/WebKitBuild/Release/WTF/Headers \
-Isrc/javascript/jsc/WebKit/WebKitBuild/Release/ICU/Headers \
-DSTATICALLY_LINKED_WITH_JavaScriptCore=1 \
-DSTATICALLY_LINKED_WITH_WTF=1 \
-DBUILDING_WITH_CMAKE=1 \
-DNDEBUG=1 \
-DNOMINMAX \
-DIS_BUILD \
-O3 \
-g \
-DENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0 \
-DBUILDING_JSCONLY__ \
-DASSERT_ENABLED=0\
-Isrc/JavaScript/jsc/WebKit/WebKitBuild/Release/ \
-Isrc/JavaScript/jsc/bindings/ \
-Isrc/javascript/jsc/WebKit/Source/bmalloc \
-std=gnu++1z \
-stdlib=libc++ \
-DDU_DISABLE_RENAMING=1 \
-Wall
jsc-bindings-mac: $(OBJ_FILES)
# We do this outside of build.zig for performance reasons
# The C compilation stuff with build.zig is really slow and we don't need to run this as often as the rest
$(OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
clang++ -c -o $@ $< \
$(CLANG_FLAGS)
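The `SRC_FILES`/`OBJ_FILES` variables and the pattern rule above map each bindings `.cpp` to a `.o` in `bindings-obj` via `wildcard` and `patsubst`. A minimal Python sketch of that path mapping, for illustration only (the real build of course happens in Make):

```python
# Mirrors the Makefile's patsubst: each .cpp under SRC_DIR maps to a .o under OBJ_DIR.
SRC_DIR = "src/javascript/jsc/bindings"
OBJ_DIR = "src/javascript/jsc/bindings-obj"

def obj_for(src_path: str) -> str:
    # Swap the directory prefix, then swap the extension.
    relocated = src_path.replace(SRC_DIR + "/", OBJ_DIR + "/", 1)
    return relocated.removesuffix(".cpp") + ".o"

print(obj_for("src/javascript/jsc/bindings/root.cpp"))
# → src/javascript/jsc/bindings-obj/root.o
```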
sizegen:
clang++ src/javascript/jsc/headergen/sizegen.cpp -o /tmp/sizegen $(CLANG_FLAGS)
/tmp/sizegen > src/javascript/jsc/bindings/sizes.zig
picohttp:
clang -O3 -g -c src/deps/picohttpparser.c -Isrc/deps -o src/deps/picohttpparser.o; cd ../../
speedy-prod-native-macos: picohttp
zig build -Drelease-fast -Dtarget=x86_64-macos-gnu
speedy-prod-native-macos-lib:
zig build lib -Drelease-fast -Dtarget=x86_64-macos-gnu
speedy-m1:
zig build -Drelease-fast -Dtarget=aarch64-macos-gnu
speedy-prod-wasm:
zig build -Drelease-fast -Dtarget=wasm32-freestanding
speedy-prod-wasi:
zig build -Drelease-fast -Dtarget=wasm32-wasi
speedy-dev: speedy-dev-native speedy-dev-wasi speedy-dev-wasm
speedy-dev-native:
zig build
speedy-dev-wasm:
zig build -Dtarget=wasm32-freestanding
speedy-dev-wasi:
zig build -Dtarget=wasm32-wasi
ROME_TSCONFIG += {
ROME_TSCONFIG += \"compilerOptions\": {
ROME_TSCONFIG += \"sourceMap\": true,
ROME_TSCONFIG += \"esModuleInterop\": true,
ROME_TSCONFIG += \"resolveJsonModule\": true,
ROME_TSCONFIG += \"moduleResolution\": \"node\",
ROME_TSCONFIG += \"target\": \"es2019\",
ROME_TSCONFIG += \"module\": \"commonjs\",
ROME_TSCONFIG += \"baseUrl\": \".\"
ROME_TSCONFIG += }
ROME_TSCONFIG += }
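The `ROME_TSCONFIG` variable above assembles an escaped `tsconfig.json` line by line, which `bench-rome` later echoes into `bench/rome/src/tsconfig.json`. A sketch of the same document in Python (without the Make-level backslash escaping), just to show that the concatenated fragments form valid JSON:

```python
import json

# The tsconfig.json that the ROME_TSCONFIG Make variable assembles.
rome_tsconfig = {
    "compilerOptions": {
        "sourceMap": True,
        "esModuleInterop": True,
        "resolveJsonModule": True,
        "moduleResolution": "node",
        "target": "es2019",
        "module": "commonjs",
        "baseUrl": ".",
    }
}

print(json.dumps(rome_tsconfig, indent=2))
```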
github/rome:
mkdir -p github/rome
cd github/rome && git init && git remote add origin https://github.com/romejs/rome.git
cd github/rome && git fetch --depth 1 origin d95a3a7aab90773c9b36d9c82a08c8c4c6b68aa5 && git checkout FETCH_HEAD
# This target provides an easy way to verify that the build is correct. Since
# Rome is self-hosted, we can just run the bundle to build Rome. This makes sure
# the bundle doesn't crash when run and is a good test of a non-trivial workload.
bench-rome-verify: | github/rome
mkdir -p bench/rome-verify
cp -r github/rome/packages bench/rome-verify/packages
cp github/rome/package.json bench/rome-verify/package.json
bench-rome: | github/rome
rm -rf bench/rome
mkdir -p bench/rome
cp -r github/rome/packages bench/rome/src/
echo "$(ROME_TSCONFIG)" > bench/rome/src/tsconfig.json
echo 'import "rome/bin/rome"' > bench/rome/src/entry.ts
# Patch a cyclic import ordering issue that affects commonjs-style bundlers (webpack and parcel)
echo "export { default as createHook } from './api/createHook';" > .temp
sed "/createHook/d" bench/rome/src/@romejs/js-compiler/index.ts >> .temp
mv .temp bench/rome/src/@romejs/js-compiler/index.ts
# Replace "import fs = require('fs')" with "const fs = require('fs')" because
# the TypeScript compiler strips these statements when targeting "esnext",
# which breaks Parcel 2 when scope hoisting is enabled.
find bench/rome/src -name '*.ts' -type f -print0 | xargs -L1 -0 sed -i '' 's/import \([A-Za-z0-9_]*\) =/const \1 =/g'
find bench/rome/src -name '*.tsx' -type f -print0 | xargs -L1 -0 sed -i '' 's/import \([A-Za-z0-9_]*\) =/const \1 =/g'
# Get an approximate line count
rm -r bench/rome/src/@romejs/js-parser/test-fixtures
echo 'Line count:' && (find bench/rome/src -name '*.ts' && find bench/rome/src -name '*.js') | xargs wc -l | tail -n 1
bench-rome-speedy: | bench/rome-verify
cd bench/rome/src && /Users/jarred/Code/esdev/build/macos-x86_64/esdev --outdir=dist ./entry.ts
github-rome: | github/rome

README.md

@@ -1,181 +1,236 @@
# esdev

Incredibly fast ECMAScript & TypeScript bundler designed for development.

## Motivation

JavaScript bundlers run very slow in web browsers.

## Purpose

The purpose of esdev is to very quickly convert ECMAScript/TypeScript into something a web browser can execute.

Goals:

- Transpile fast inside a web browser. "Fast" is defined as "<= 3ms per un-minified file up to 1000 LOC" without build caching (filesystem cache is allowed).
- Transpile JSX to ECMAScript
- Remove TypeScript annotations
- Conditionally support React Fast Refresh
- Rewrite CommonJS/SystemJS/UMD imports and exports to ESM
- Support most of tsconfig.json/jsconfig.json
- Support `defines` like in esbuild
- Support esbuild plugins
- Support importing CSS files from JavaScript
- Tree-shaking

Non-goals:

- Bundling for production
- Minification
- AST plugins
- Supporting Node.js
- CommonJS, UMD, IIFE
- ES6 to ES5
- Supporting non-recent versions of Chromium, Firefox, or Safari (no IE)

## How it works

Much of the code is a line-for-line port of esbuild to Zig, with a few important differences.

### Implementation differences

### Why not just use esbuild?

#### Missing features

- Hot Module Reloading
- Rewrite CommonJS/SystemJS/UMD imports and exports to ESM
- React Fast Refresh

#### Go WASM performance isn't great.

There are a number of reasons for this:

- Unlike native targets, Go's WASM target runs the garbage collector on the same thread as the application. Since this use case is very constrained (no need for shared memory or long-term objects), rewriting in Zig lets us get away with a bump allocator, skipping garbage collection entirely. This is faster than what Go does, and possibly Rust, since this zeroes out the heap in one call at the end rather than progressively zeroing memory.
- Goroutines cross the JS<>WASM binding, which is very slow. The more goroutines you use, the slower your code runs. When building a Zig project in single-threaded mode, Zig's `comptime` feature compiles away most of the difference.
- Slow startup time: unless you use TinyGo, Go WASM binaries are > 2 MB. In esbuild's case, at the time of writing, it's 6 MB. That's a lot of code for the web browser to download & compile.

#### Different constraints enable performance improvements

If bundler means "merge N source files into 1 or few source file(s)", esdev is most definitely not a bundler. Unlike most bundlers today, esdev deliberately outputs
If bundler means "turn my development code into something a browser can run",

### Compatibility Table

| Feature | esbuild | esdev |
| ------------------------------------ | ------- | ----- |
| JSX (transform) | ✅ | ⌛ |
| TypeScript (transform) | ✅ | ⌛ |
| React Fast Refresh | ❌ | ⌛ |
| Hot Module Reloading | ❌ | ⌛ |
| Minification | ✅ | ❌ |
| Tree Shaking | ✅ | ⌛ |
| Incremental builds | ✅ | ⌛ |
| CSS | ✅ | 🗓️ |
| Expose CSS dependencies per file | ✅ | 🗓️ |
| CommonJS, IIFE, UMD outputs | ✅ | ❌ |
| Node.js build target | ✅ | ❌ |
| Code Splitting | ✅ | ⌛ |
| Browser build target | ✅ | ⌛ |
| Bundling for production | ✅ | ❌ |
| Support older browsers | ✅ | ❌ |
| Plugins | ✅ | 🗓️ |
| AST Plugins | ❌ | ❌ |
| Filesystem Cache API (for plugins) | ❓ | 🗓️ |
| Transform to ESM with `bundle` false | ❓ | ⌛ |

Key:

| Tag | Meaning |
| --- | ------------------------------------------ |
| ✅ | Compatible |
| ❌ | Not supported, and no plans to change that |
| ⌛ | In-progress |
| 🗓️ | Planned but work has not started |
| ❓ | Unknown |

#### Notes

##### Hot Module Reloading & React Fast Refresh

esdev exposes a runtime API to support Hot Module Reloading and React Fast Refresh. React Fast Refresh depends on Hot Module Reloading to work, but you can turn either of them off. esdev itself doesn't serve bundled files; it's up to the development server to provide that.

##### Code Splitting

esdev supports code splitting the way browsers do natively: through ES Modules. This works great for local development files. It doesn't work great for node_modules or for production due to the sheer number of network requests. There are plans to make this better; stay tuned.

##### Support older browsers

To simplify the parser, esdev doesn't support lowering features to non-current browsers. This means if you run a development build with esdev using, for example, optional chaining, it won't work in Internet Explorer 11. If you want to support older browsers, use a different tool.

#### Implementation Notes

##### HMR & Fast Refresh implementation

This section only applies when Hot Module Reloading is enabled. When it's off, none of this runs. React Fast Refresh depends on Hot Module Reloading.

###### What is hot module reloading?

HMR: "hot module reloading"

A lot of developers know what it does -- but what actually is it, and how does it work? Essentially, it means that when a source file changes, the code is automatically reloaded without reloading the web page.

A big caveat here is that JavaScript VMs don't expose an API to "unload" parts of the JavaScript context. In all HMR implementations, what really happens is this:

1. Load a new copy of the code that changed
2. Update references to the old code to point to the new code
3. Handle errors

The old code still lives there, in your browser's JavaScript VM, until the page is refreshed. If any past references are kept (side effects!), undefined behavior happens. That's why, historically (by web standards), HMR has a reputation for being buggy.

Loading code is easy. The hard parts are updating references and handling errors.

There are two ways to update references:

- Update all module imports
- Update the exports

Either approach works.

###### How it's implemented in esdev

At build time, esdev replaces all import URLs with import manifests that wrap the real module.

In the simple case, that looks like this:

```ts
import { Button as _Button } from "http://localhost:3000/src/components/button.KXk23UX3.js";
export let Button = _Button;
import.meta.onUpdate(import.meta.url, (exports) => {
  if ("Button" in exports) {
    Button = exports["Button"];
  }
});
```

Then, let's say you updated `button.tsx` from this:

```tsx
export const Button = ({ children }) => (
  <div className="Button">{children}</div>
);
```

# Speedy - a fast web bundler & JavaScript runtime environment

Speedy bundles & transpiles JavaScript, TypeScript, and CSS. Speedy is probably the fastest bundler out today.

### Speed hacking

Here are some techniques Speedy uses to make your builds shockingly fast. Most are small wins. Some are big.

#### Compare comptime-known strings by nearest `(u64 || u32 || u16 || u8)`-sized integer

Parsers & lexers search source code for many tokens. For JavaScript, some of these include:

- `yield`
- `await`
- `for`
- `of`
- `in`
- `while`

You get the idea.

When you know the string you're looking for ahead of time, it's faster to compare multiple characters at a time than a single character.

<details>
<summary>Here's a function that does this. This is used in many places throughout the code.</summary>

```zig
pub fn eqlComptime(self: string, comptime alt: anytype) bool {
    switch (comptime alt.len) {
        0 => {
            @compileError("Invalid size passed to eqlComptime");
        },
        2 => {
            const check = std.mem.readIntNative(u16, alt[0..alt.len]);
            return self.len == alt.len and std.mem.readIntNative(u16, self[0..2]) == check;
        },
        1, 3 => {
            if (alt.len != self.len) {
                return false;
            }
            inline for (alt) |c, i| {
                if (self[i] != c) return false;
            }
            return true;
        },
        4 => {
            const check = std.mem.readIntNative(u32, alt[0..alt.len]);
            return self.len == alt.len and std.mem.readIntNative(u32, self[0..4]) == check;
        },
        6 => {
            const first = std.mem.readIntNative(u32, alt[0..4]);
            const second = std.mem.readIntNative(u16, alt[4..6]);
            return self.len == alt.len and first == std.mem.readIntNative(u32, self[0..4]) and
                second == std.mem.readIntNative(u16, self[4..6]);
        },
        5, 7 => {
            const check = std.mem.readIntNative(u32, alt[0..4]);
            if (self.len != alt.len or std.mem.readIntNative(u32, self[0..4]) != check) {
                return false;
            }
            const remainder = self[4..];
            inline for (alt[4..]) |c, i| {
                if (remainder[i] != c) return false;
            }
            return true;
        },
        8 => {
            const check = std.mem.readIntNative(u64, alt[0..alt.len]);
            return self.len == alt.len and std.mem.readIntNative(u64, self[0..8]) == check;
        },
        9...11 => {
            const first = std.mem.readIntNative(u64, alt[0..8]);
            if (self.len != alt.len or first != std.mem.readIntNative(u64, self[0..8])) {
                return false;
            }
            inline for (alt[8..]) |c, i| {
                if (self[i + 8] != c) return false;
            }
            return true;
        },
        12 => {
            const first = std.mem.readIntNative(u64, alt[0..8]);
            const second = std.mem.readIntNative(u32, alt[8..12]);
            return (self.len == alt.len) and first == std.mem.readIntNative(u64, self[0..8]) and second == std.mem.readIntNative(u32, self[8..12]);
        },
        13...15 => {
            const first = std.mem.readIntNative(u64, alt[0..8]);
            const second = std.mem.readIntNative(u32, alt[8..12]);
            if (self.len != alt.len or first != std.mem.readIntNative(u64, self[0..8]) or second != std.mem.readIntNative(u32, self[8..12])) {
                return false;
            }
            inline for (alt[12..]) |c, i| {
                if (self[i + 12] != c) return false;
            }
            return true;
        },
        16 => {
            const first = std.mem.readIntNative(u64, alt[0..8]);
            const second = std.mem.readIntNative(u64, alt[8..16]);
            return (self.len == alt.len) and first == std.mem.readIntNative(u64, self[0..8]) and second == std.mem.readIntNative(u64, self[8..16]);
        },
        else => {
            @compileError(alt ++ " is too long.");
        },
    }
}
```

</details>

#### Moar lookup tables
#### Skip decoding UTF-16 when safe
JavaScript engines represent strings as UTF-16 byte arrays. That means every character in a string occupies at least 2 bytes of memory. Most applications (and documents) use UTF-8, which uses at least 1 byte of memory.
Most JavaScript bundlers store JavaScript strings as UTF-16, either because the bundler is written in JavaScript, or to simplify the code.
It's much faster to reuse the memory from reading the contents of the JavaScript source and store the byte offset + length into the source file than it is to allocate a new string for each JavaScript string. This is safe when the string doesn't have a codepoint > 127, which mostly means `A-Za-z0-9` and punctuation. Most JavaScript strings don't use lots of emoji, so this avoids many tiny allocations.
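The idea can be sketched in TypeScript (hypothetical `StringRef` type and function names; Speedy's actual implementation is in Zig): keep an `(offset, length)` reference into the UTF-8 source bytes, and only decode when a slice genuinely needs it.

```typescript
// Hypothetical sketch: a token stores (offset, length) into the original
// UTF-8 source bytes instead of allocating a new string per token.
interface StringRef {
  offset: number;
  length: number;
}

// The slice can be used as-is when every byte is <= 127 (ASCII).
function isAsciiOnly(src: Uint8Array, ref: StringRef): boolean {
  for (let i = ref.offset; i < ref.offset + ref.length; i++) {
    if (src[i] > 127) return false;
  }
  return true;
}

// Decode lazily, only when a real JavaScript (UTF-16) string is required.
function materialize(src: Uint8Array, ref: StringRef): string {
  return new TextDecoder().decode(src.subarray(ref.offset, ref.offset + ref.length));
}
```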
#### CSS ~Parser~ Scanner
Speedy (currently) does not have a CSS Parser. But, it still processes CSS.
Most CSS processors work something like this:
1. Copy the CSS source code to a byte array
2. Iterate through every unicode codepoint, generating tokens (lexing)
3. Convert each token into an AST node (parsing)
4. Perform 1 or more passes over the AST. For tools like PostCSS, every plugin typically adds 1+ passes over the AST. (visiting)
5. Print the source code (printing)
Speedy's CSS Scanner scans, rewrites, and prints CSS in a single pass without generating an AST. It works like this:
1. Copy the CSS source code to a byte array
2. Iterate through every unicode codepoint, searching for lines starting with `@import` or property values with `url(`
3. For each url or import:
1. Flush everything before the url or import
2. Resolve the import URL
3. Write the import URL
4. When end of file is reached, flush to disk.
Speedy's CSS Scanner is about 56x faster than PostCSS with the `postcss-import` and `postcss-url` plugins enabled (and sourcemaps disabled). On the other hand, auto-prefixing and minification won't work. Minifying whitespace is possible with some modifications, but minifying CSS syntax correctly needs an AST.
This approach is fast, but not without tradeoffs!
Speedy's CSS Scanner is based on esbuild's CSS Lexer. Thank you @evanwallace.
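The scan-flush-rewrite loop above can be sketched in TypeScript (a simplified illustration using a regex for `url(...)` only; the real scanner iterates codepoints and also handles `@import`):

```typescript
// Sketch: rewrite url(...) values in a single pass, flushing unchanged bytes
// and resolving each URL as it is found — no AST, no token stream.
function rewriteCssUrls(css: string, resolve: (url: string) => string): string {
  let out = "";
  let flushed = 0; // everything before this index has been written already
  const re = /url\(\s*["']?([^"')]+)["']?\s*\)/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(css)) !== null) {
    out += css.slice(flushed, m.index); // 1. flush everything before the url
    out += `url(${resolve(m[1])})`;     // 2-3. resolve and write the import URL
    flushed = m.index + m[0].length;
  }
  return out + css.slice(flushed);      // 4. flush the rest at end of input
}
```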
#### Compile-time generated JavaScript Parsers
At the time of writing, there are 8 different comptime-generated variations of Speedy's JavaScript parser.
```zig
pub fn NewParser(
comptime is_typescript_enabled: bool,
comptime is_jsx_enabled: bool,
comptime only_scan_imports_and_do_not_visit: bool,
) type {
```
To this:

```tsx
export const Button = ({ children }) => (
  <div className="Button">
    <div className="Button-label">{children}</div>
  </div>
);
```

This triggers the HMR client in esdev to:

1. import `/src/components/button.js` once again

When this is `false`, branches that only apply to parsing TypeScript are removed.

```zig
comptime is_typescript_enabled: bool,
```

**Performance impact: +2%?**
```bash
hyperfine "../../build/macos-x86_64/esdev node_modules/react-dom/cjs/react-dom.development.js --resolve=disable" "../../esdev.before-comptime-js-parser node_modules/react-dom/cjs/react-dom.development.js --resolve=disable" --min-runs=500
Benchmark #1: ../../build/macos-x86_64/esdev node_modules/react-dom/cjs/react-dom.development.js --resolve=disable
Time (mean ± σ): 25.1 ms ± 1.1 ms [User: 20.4 ms, System: 3.1 ms]
Range (min … max): 23.5 ms … 31.7 ms 500 runs
Benchmark #2: ../../esdev.before-comptime-js-parser node_modules/react-dom/cjs/react-dom.development.js --resolve=disable
Time (mean ± σ): 25.6 ms ± 1.3 ms [User: 20.9 ms, System: 3.1 ms]
Range (min … max): 24.1 ms … 39.7 ms 500 runs
'../../build/macos-x86_64/esdev node_modules/react-dom/cjs/react-dom.development.js --resolve=disable' ran
1.02 ± 0.07 times faster than '../../esdev.before-comptime-js-parser node_modules/react-dom/cjs/react-dom.development.js --resolve=disable'
```
When this is `false`, branches that only apply to parsing JSX are removed.
```zig
comptime is_jsx_enabled: bool,
```
This is only used for application code when generating `node_modules.jsb`. This skips the visiting pass. It reduces parsing time by about 30%, but the source code cannot be printed without visiting. It's only useful for scanning `import` and `require`.
```zig
comptime only_scan_imports_and_do_not_visit: bool,
```
At runtime, Speedy chooses the appropriate JavaScript parser to use based on the `loader`. In practical terms, this moves all the branches checking whether a parsing step should be run from inside several tight loops to just once, before parsing starts.
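A rough TypeScript analog of that comptime specialization (hypothetical names; Zig does this at compile time, so each variant becomes a separate machine-code body): resolve the feature flags once when constructing the parser, then pick a pre-built variant per loader.

```typescript
// Toy sketch: one specialized tokenizer per flag combination, so the per-token
// loop contains no "is TypeScript enabled?"-style branches.
type Tokenize = (source: string) => string[];

function makeTokenizer(typescriptEnabled: boolean, jsxEnabled: boolean): Tokenize {
  // Flags are consulted here, once — not inside the token loop.
  const dropTypes = typescriptEnabled;
  const keepAngleBrackets = jsxEnabled;
  return (source) =>
    source
      .split(/\s+/)
      .filter((t) => t.length > 0)
      .filter((t) => !(dropTypes && t.startsWith(":")))        // toy "type annotation" skip
      .filter((t) => keepAngleBrackets || !t.startsWith("<")); // toy "JSX" skip
}

// Chosen once per file, based on the loader:
const tokenizerForLoader: Record<string, Tokenize> = {
  ".js": makeTokenizer(false, false),
  ".jsx": makeTokenizer(false, true),
  ".ts": makeTokenizer(true, false),
  ".tsx": makeTokenizer(true, true),
};
```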
#### Max out per-process file handle limit automatically, leave file handles open.
**Performance impact: +5%**
It turns out, lots of time is spent opening and closing file handles. This is feature flagged off on Windows.
This also enabled a kqueue-based filesystem watcher on macOS. FSEvents, the more common macOS filesystem watcher, uses kqueue internally to watch directories. Watching file handles is faster than watching directories. It was surprising to learn that none of the popular filesystem watchers for Node.js adjust the process ulimit, leaving many developers to deal with "too many open file handles" errors.
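Keeping handles open can be sketched in Node-flavored TypeScript (an illustrative cache, not Speedy's code — Speedy is written in Zig and also raises the file-handle limit at startup):

```typescript
import { openSync, fstatSync, readSync } from "node:fs";

// Illustrative sketch: open each file once, keep the descriptor, and reread
// from position 0 on every rebuild instead of paying open()/close() again.
const fdCache = new Map<string, number>();

function readCached(path: string): Buffer {
  let fd = fdCache.get(path);
  if (fd === undefined) {
    fd = openSync(path, "r");
    fdCache.set(path, fd); // never closed; relies on a raised file-handle limit
  }
  const size = fstatSync(fd).size;
  const buf = Buffer.alloc(size);
  readSync(fd, buf, 0, size, 0); // explicit position 0: the fd stays open between reads
  return buf;
}
```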
### Architecture
#### The Speedy Bundle Format
TODO: document
### Hot Module Reloading
Speedy's Hot Module Reloader uses a custom binary protocol that's around 8x more space efficient than other bundlers.
- File change notifications cost 9 bytes.
- Build metadata costs 13 bytes + length of the module path that was rebuilt + size of the built file.
For comparison, Vite's HMR implementation uses 104 bytes + the length of the module path that was rebuilt (at the time of writing).
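For illustration, a 9-byte file-change message could be laid out like this (a hypothetical layout — the actual field order of Speedy's wire format isn't documented here):

```typescript
// Hypothetical 9-byte "file changed" message: 1-byte message kind + 4-byte
// build id + 4-byte file id. In JSON, the key names alone would cost more.
const KIND_FILE_CHANGE = 1;

function encodeFileChange(buildId: number, fileId: number): Uint8Array {
  const buf = new Uint8Array(9);
  const view = new DataView(buf.buffer);
  view.setUint8(0, KIND_FILE_CHANGE);
  view.setUint32(1, buildId, true); // little-endian
  view.setUint32(5, fileId, true);
  return buf;
}

function decodeFileChange(buf: Uint8Array): { buildId: number; fileId: number } {
  const view = new DataView(buf.buffer, buf.byteOffset, buf.byteLength);
  return { buildId: view.getUint32(1, true), fileId: view.getUint32(5, true) };
}
```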
#### Instant CSS
When using `<link rel="stylesheet">`, Speedy HMR "just works", with zero configuration and without modifying HTML.
Here's how:
- On page load, CSS files are built per request
- When you make a change to a local CSS file, a file change notification is pushed over the websocket connection to the browser (HMR client)
- For the first update, instead of asking for a new file to build, it asks for the list of files that the file within the `<link rel="stylesheet">` imports, and any files those `@import`, recursively. If `index.css` imports `link.css` and `link.css` imports `colors.css`, that list will include `index.css`, `link.css`, and `colors.css`.
- Preserving import order, the link tags are replaced in a single DOM update. This time, an additional query string flag, `?noimport`, is added, which tells the Speedy CSS Scanner to remove any `@import` statements from the built CSS file.
While this approach is fast today, there are more scalable alternatives worth considering for large codebases (such as a bundling format that supports loading individual files unbundled). This may change in the near future.

build.zig

@@ -1,4 +1,34 @@
const std = @import("std");
const resolve_path = @import("./src/resolver/resolve_path.zig");
pub fn addPicoHTTP(step: *std.build.LibExeObjStep) void {
const picohttp = step.addPackage(.{
.name = "picohttp",
.path = .{ .path = "src/deps/picohttp.zig" },
});
step.addObjectFile("src/deps/picohttpparser.o");
step.addIncludeDir("src/deps");
// step.add("/Users/jarred/Code/WebKit/WebKitBuild/Release/lib/libWTF.a");
// ./Tools/Scripts/build-jsc --jsc-only --cmakeargs="-DENABLE_STATIC_JSC=ON"
// set -gx ICU_INCLUDE_DIRS "/usr/local/opt/icu4c/include"
// homebrew-provided icu4c
}
pub var original_make_fn: ?fn (step: *std.build.Step) anyerror!void = null;
pub var headers_zig_file: ?[]const u8 = null;
const HeadersMaker = struct {
pub fn make(self: *std.build.Step) anyerror!void {
try original_make_fn.?(self);
var headers_zig: std.fs.File = try std.fs.openFileAbsolute(headers_zig_file.?, .{ .write = true });
const contents = try headers_zig.readToEndAlloc(std.heap.page_allocator, 99999);
const last_extern_i = std.mem.lastIndexOf(u8, contents, "pub extern fn") orelse @panic("Expected contents");
const last_newline = std.mem.indexOf(u8, contents[last_extern_i..], "\n") orelse @panic("Expected newline");
try headers_zig.seekTo(0);
try headers_zig.setEndPos(last_newline + last_extern_i);
}
};
pub fn build(b: *std.build.Builder) void {
// Standard target options allows the person running `zig build` to choose
@@ -6,23 +36,206 @@ pub fn build(b: *std.build.Builder) void {
// means any target is allowed, and the default is native. Other options
// for restricting supported target set are available.
const target = b.standardTargetOptions(.{});
// Standard release options allow the person running `zig build` to select
// between Debug, ReleaseSafe, ReleaseFast, and ReleaseSmall.
const mode = b.standardReleaseOptions();
var cwd_buf: [std.fs.MAX_PATH_BYTES]u8 = undefined;
const cwd: []const u8 = b.pathFromRoot(".");
var exe: *std.build.LibExeObjStep = undefined;
if (target.getCpuArch().isWasm()) {
exe = b.addExecutable("esdev", "src/main_wasm.zig");
var output_dir_buf = std.mem.zeroes([4096]u8);
var bin_label = if (mode == std.builtin.Mode.Debug) "/debug/" else "/";
const output_dir = b.pathFromRoot(std.fmt.bufPrint(&output_dir_buf, "build{s}{s}-{s}", .{ bin_label, @tagName(target.getOs().tag), @tagName(target.getCpuArch()) }) catch unreachable);
if (target.getOsTag() == .wasi) {
exe.enable_wasmtime = true;
exe = b.addExecutable("esdev", "src/main_wasi.zig");
exe.linkage = .dynamic;
exe.setOutputDir(output_dir);
} else if (target.getCpuArch().isWasm()) {
// exe = b.addExecutable(
// "esdev",
// "src/main_wasm.zig",
// );
// exe.is_linking_libc = false;
// exe.is_dynamic = true;
var lib = b.addExecutable("esdev", "src/main_wasm.zig");
lib.single_threaded = true;
// exe.want_lto = true;
// exe.linkLibrary(lib);
if (mode == std.builtin.Mode.Debug) {
// exception_handling
var features = target.getCpuFeatures();
features.addFeature(2);
target.updateCpuFeatures(&features);
} else {
// lib.strip = true;
}
lib.setOutputDir(output_dir);
lib.want_lto = true;
b.install_path = lib.getOutputSource().getPath(b);
std.debug.print("Build: ./{s}\n", .{b.install_path});
b.default_step.dependOn(&lib.step);
b.verbose_link = true;
lib.setTarget(target);
lib.setBuildMode(mode);
std.fs.deleteTreeAbsolute(std.fs.path.join(b.allocator, &.{ cwd, lib.getOutputSource().getPath(b) }) catch unreachable) catch {};
var install = b.getInstallStep();
lib.strip = false;
lib.install();
const run_cmd = lib.run();
run_cmd.step.dependOn(b.getInstallStep());
if (b.args) |args| {
run_cmd.addArgs(args);
}
const run_step = b.step("run", "Run the app");
run_step.dependOn(&run_cmd.step);
return;
} else {
exe = b.addExecutable("esdev", "src/main.zig");
}
// exe.setLibCFile("libc.txt");
exe.linkLibC();
// exe.linkLibCpp();
exe.addPackage(.{
.name = "clap",
.path = .{ .path = "src/deps/zig-clap/clap.zig" },
});
exe.setOutputDir(output_dir);
var walker = std.fs.walkPath(b.allocator, cwd) catch unreachable;
if (std.builtin.is_test) {
while (walker.next() catch unreachable) |entry| {
if (std.mem.endsWith(u8, entry.basename, "_test.zig")) {
std.debug.print("[test] Added {s}", .{entry.basename});
_ = b.addTest(entry.path);
}
}
}
const runtime_hash = std.hash.Wyhash.hash(0, @embedFile("./src/runtime.out.js"));
const runtime_version_file = std.fs.cwd().openFile("src/runtime.version", .{ .write = true }) catch unreachable;
runtime_version_file.writer().print("{x}", .{runtime_hash}) catch unreachable;
defer runtime_version_file.close();
exe.setTarget(target);
exe.setBuildMode(mode);
b.install_path = output_dir;
exe.addLibPath("/usr/local/lib");
exe.install();
var javascript = b.addExecutable("spjs", "src/main_javascript.zig");
var typings_exe = b.addExecutable("typescript-decls", "src/javascript/jsc/typescript.zig");
javascript.setMainPkgPath(b.pathFromRoot("."));
typings_exe.setMainPkgPath(b.pathFromRoot("."));
exe.setMainPkgPath(b.pathFromRoot("."));
// exe.want_lto = true;
if (!target.getCpuArch().isWasm()) {
b.default_step.dependOn(&exe.step);
const bindings_dir = std.fs.path.join(
b.allocator,
&.{
cwd,
"src",
"javascript",
"jsc",
"bindings-obj",
},
) catch unreachable;
var bindings_walker = std.fs.walkPath(b.allocator, bindings_dir) catch unreachable;
var bindings_files = std.ArrayList([]const u8).init(b.allocator);
while (bindings_walker.next() catch unreachable) |entry| {
if (std.mem.eql(u8, std.fs.path.extension(entry.basename), ".o")) {
bindings_files.append(b.allocator.dupe(u8, entry.path) catch unreachable) catch unreachable;
}
}
// // References:
// // - https://github.com/mceSystems/node-jsc/blob/master/deps/jscshim/webkit.gyp
// // - https://github.com/mceSystems/node-jsc/blob/master/deps/jscshim/docs/webkit_fork_and_compilation.md#webkit-port-and-compilation
// const flags = [_][]const u8{
// "-Isrc/JavaScript/jsc/WebKit/WebKitBuild/Release/JavaScriptCore/PrivateHeaders",
// "-Isrc/JavaScript/jsc/WebKit/WebKitBuild/Release/WTF/Headers",
// "-Isrc/javascript/jsc/WebKit/WebKitBuild/Release/ICU/Headers",
// "-DSTATICALLY_LINKED_WITH_JavaScriptCore=1",
// "-DSTATICALLY_LINKED_WITH_WTF=1",
// "-DBUILDING_WITH_CMAKE=1",
// "-DNOMINMAX",
// "-DENABLE_INSPECTOR_ALTERNATE_DISPATCHERS=0",
// "-DBUILDING_JSCONLY__",
// "-DASSERT_ENABLED=0", // missing symbol errors like this will happen "JSC::DFG::DoesGCCheck::verifyCanGC(JSC::VM&)"
// "-Isrc/JavaScript/jsc/WebKit/WebKitBuild/Release/", // config.h,
// "-Isrc/JavaScript/jsc/bindings/",
// "-Isrc/javascript/jsc/WebKit/Source/bmalloc",
// "-std=gnu++17",
// if (target.getOsTag() == .macos) "-DUSE_FOUNDATION=1" else "",
// if (target.getOsTag() == .macos) "-DUSE_CF_RETAIN_PTR=1" else "",
// };
const headers_step = b.step("headers", "JSC headers");
var headers_exec: *std.build.LibExeObjStep = b.addExecutable("headers", "src/javascript/jsc/bindings/bindings-generator.zig");
var headers_runner = headers_exec.run();
headers_exec.setMainPkgPath(javascript.main_pkg_path.?);
headers_step.dependOn(&headers_runner.step);
var translate_c: *std.build.TranslateCStep = b.addTranslateC(.{ .path = b.pathFromRoot("src/javascript/jsc/bindings/headers.h") });
translate_c.out_basename = "headers";
translate_c.output_dir = b.pathFromRoot("src/javascript/jsc/bindings/");
headers_step.dependOn(&translate_c.step);
headers_zig_file = b.pathFromRoot("src/javascript/jsc/bindings/headers.zig");
original_make_fn = headers_step.makeFn;
headers_step.makeFn = HeadersMaker.make;
b.default_step.dependOn(&exe.step);
var steps = [_]*std.build.LibExeObjStep{ exe, javascript, typings_exe };
for (steps) |step| {
step.linkLibC();
step.linkLibCpp();
addPicoHTTP(
step,
);
step.addObjectFile("src/JavaScript/jsc/WebKit/WebKitBuild/Release/lib/libJavaScriptCore.a");
step.addObjectFile("src/JavaScript/jsc/WebKit/WebKitBuild/Release/lib/libWTF.a");
step.addObjectFile("src/JavaScript/jsc/WebKit/WebKitBuild/Release/lib/libbmalloc.a");
// We must link ICU statically
step.addObjectFile("/usr/local/opt/icu4c/lib/libicudata.a");
step.addObjectFile("/usr/local/opt/icu4c/lib/libicui18n.a");
step.addObjectFile("/usr/local/opt/icu4c/lib/libicuuc.a");
if (target.getOsTag() == .macos) {
// icucore is a weird macOS only library
step.linkSystemLibrary("icucore");
step.addLibPath("/usr/local/opt/icu4c/lib");
step.addIncludeDir("/usr/local/opt/icu4c/include");
}
// for (bindings_files.items) |binding| {
// step.addObjectFile(
// binding,
// );
// }
}
} else {
b.default_step.dependOn(&exe.step);
}
javascript.strip = false;
javascript.packages = std.ArrayList(std.build.Pkg).fromOwnedSlice(b.allocator, b.allocator.dupe(std.build.Pkg, exe.packages.items) catch unreachable);
javascript.setOutputDir(output_dir);
javascript.setBuildMode(mode);
const run_cmd = exe.run();
run_cmd.step.dependOn(b.getInstallStep());
@@ -32,4 +245,23 @@ pub fn build(b: *std.build.Builder) void {
const run_step = b.step("run", "Run the app");
run_step.dependOn(&run_cmd.step);
var log_step = b.addLog("Destination: {s}/{s}\n", .{ output_dir, "esdev" });
log_step.step.dependOn(&exe.step);
var typings_cmd: *std.build.RunStep = typings_exe.run();
typings_cmd.cwd = cwd;
typings_cmd.addArg(cwd);
typings_cmd.addArg("types");
typings_cmd.step.dependOn(&typings_exe.step);
typings_exe.linkLibC();
typings_exe.linkLibCpp();
typings_exe.setMainPkgPath(cwd);
var typings_step = b.step("types", "Build TypeScript types");
typings_step.dependOn(&typings_cmd.step);
var javascript_cmd = b.step("spjs", "Build standalone JavaScript runtime. Must run \"make jsc\" first.");
javascript_cmd.dependOn(&javascript.step);
}


@@ -0,0 +1,28 @@
import ReactDOMServer from "react-dom/server.browser";
import { Base } from "./src/index";
addEventListener("fetch", (event: FetchEvent) => {
const response = new Response(`
<!DOCTYPE html>
<html>
<head>
<link
rel="stylesheet"
crossorigin="anonymous"
href="https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;700&family=Space+Mono:wght@400;700"
/>
</head>
<body>
<link rel="stylesheet" href="./src/index.css" />
<div id="reactroot">${ReactDOMServer.renderToString(<Base />)}</div>
<script src="./src/index.tsx" async type="module"></script>
</body>
</html>
`);
event.respondWith(response);
});
// typescript isolated modules
export {};


@@ -0,0 +1,15 @@
<!DOCTYPE html>
<html>
<head>
<link
rel="stylesheet"
crossorigin="anonymous"
href="https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;700&family=Space+Mono:wght@400;700&display=swap"
/>
<link rel="stylesheet" href="src/index.css" />
<script async src="src/index.tsx" type="module"></script>
</head>
<body>
<div id="reactroot"></div>
</body>
</html>


@@ -0,0 +1,2 @@
import React from "react";
export { React };

demos/css-stress-test/next-env.d.ts

@@ -0,0 +1,2 @@
/// <reference types="next" />
/// <reference types="next/types/global" />


@@ -0,0 +1,15 @@
import { renderNextJSPage } from "speedy-nextjs/server";
addEventListener("fetch", (event: FetchEvent) => {
const AppComponent = module.requireFirst(
"pages/_app",
"speedy-nextjs/pages/_app"
);
const Document = module.requireFirst(
"pages/_document",
"speedy-nextjs/pages/_document"
);
});
// typescript isolated modules
export {};

Binary file not shown.


@@ -0,0 +1,34 @@
{
"name": "simple-react",
"version": "1.0.0",
"license": "MIT",
"dependencies": {
"@emotion/css": "^11.1.3",
"@vitejs/plugin-react-refresh": "^1.3.3",
"antd": "^4.16.1",
"left-pad": "^1.3.0",
"next": "^11.0.0",
"parcel": "2.0.0-beta.3",
"react": "^17.0.2",
"react-bootstrap": "^1.6.1",
"react-dom": "^17.0.2",
"react-form": "^4.0.1",
"react-hook-form": "^7.8.3"
},
"parcel": "parceldist/index.js",
"targets": {
"parcel": {
"outputFormat": "esmodule",
"sourceMap": false,
"optimize": false,
"engines": {
"chrome": "last 1 version"
}
}
},
"devDependencies": {
"@microsoft/fetch-event-source": "^2.0.1",
"@snowpack/plugin-react-refresh": "^2.5.0",
"typescript": "^4.3.4"
}
}


@@ -0,0 +1,12 @@
import "../src/font.css";
import "../src/index.css";
import App from "next/app";
class MyApp extends App {
render() {
const { Component, pageProps } = this.props;
return <Component {...pageProps} />;
}
}
export default MyApp;


@@ -0,0 +1,14 @@
import { Main } from "../src/main";
import { Button } from "../src/components/button";
export function getInitialProps() {
return {};
}
export default function IndexRoute() {
return (
<div>
<Main productName={"Next.js (Webpack 5)"} />;<Button>hello</Button>
</div>
);
}


@@ -0,0 +1,15 @@
<!DOCTYPE html>
<html>
<head>
<link
rel="stylesheet"
crossorigin="anonymous"
href="https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;700&family=Space+Mono:wght@400;700&display=swap"
/>
</head>
<body>
<div id="reactroot"></div>
<link rel="stylesheet" href="./src/index.css" />
<script src="./src/index.tsx" async type="module"></script>
</body>
</html>


@@ -0,0 +1,10 @@
<!DOCTYPE html>
<html>
<head>
<link rel="stylesheet" href="/src/index.css" />
<script async src="/src/index.tsx" type="module"></script>
</head>
<body>
<div id="reactroot"></div>
</body>
</html>


@@ -0,0 +1,26 @@
import * as jsx_dev_runtime_runtime from "http://localhost:8000node_modules/react/jsx-dev-runtime.js";
var jsxDEV = require( jsx_dev_runtime_runtime).jsxDEV, __jsxFilename = "src/components/button.tsx";
import {
__require as require
} from "http://localhost:8000__runtime.js";
import * as ttp_localhost_8000node_modules_module from "http://localhost:8000node_modules/react/index.js";
var React = require(ttp_localhost_8000node_modules_module);
export const Button = ({ label, label2, onClick }) => jsxDEV("div", {
className: "Button",
onClick,
children: [jsxDEV("div", {
className: "Button-label",
children: [
label,
"111"
]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 133
}, this)]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 86
}, this);


@@ -0,0 +1,59 @@
import * as jsx_dev_runtime_runtime from "http://localhost:8000node_modules/react/jsx-dev-runtime.js";
import * as React_dot_jsx from "http://localhost:8000node_modules/react/jsx-dev-runtime.js";
var jsxDEV = require( jsx_dev_runtime_runtime).jsxDEV, __jsxFilename = "src/index.tsx", Fragment = require( React_dot_jsx).Fragment;
import {
__require as require
} from "http://localhost:8000__runtime.js";
import * as ttp_localhost_8000node_modules_module from "http://localhost:8000node_modules/react-dom/index.js";
var ReactDOM = require(ttp_localhost_8000node_modules_module);
import { Button} from "http://localhost:8000src/components/button.js";
const Base = ({}) => {
return jsxDEV("main", {
children: [
jsxDEV("h1", {
children: ["I am the page"]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 132
}, this),
jsxDEV("h3", {
className: "bacon",
children: ["Here is some text"]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 161
}, this),
jsxDEV( Fragment, {
children: ["Fragment!"]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 212
}, this),
jsxDEV(Button, {
label: "Do not click.",
onClick: () => alert("I told u not to click!"),
children: []
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 234
}, this)
]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 119
}, this);
};
function startReact() {
ReactDOM.render( jsxDEV( Base, {
children: []
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 408
}, this), document.querySelector("#reactroot"));
}
globalThis.addEventListener("DOMContentLoaded", () => {
startReact();
});

1
demos/css-stress-test/react-inject.js vendored Normal file

@@ -0,0 +1 @@
export { default as React } from "react";


@@ -0,0 +1,26 @@
import * as _jsx_dev_runtime_runtime from "../../node_modules/react/jsx-dev-runtime.js";
var jsxDEV = require( _jsx_dev_runtime_runtime).jsxDEV, __jsxFilename = "src/components/button.tsx";
import {
__require as require
} from "../../__runtime.js";
import * as node_modules_module from "../../node_modules/react/index.js";
var React = require(node_modules_module);
export const Button = ({ label, label2, onClick }) => jsxDEV("div", {
className: "Button",
onClick,
children: [jsxDEV("div", {
className: "Button-label",
children: [
label,
"111"
]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 133
}, this)]
}, undefined, true, {
fileName: __jsxFilename,
lineNumber: 86
}, this);


@@ -0,0 +1,2 @@
import { Button } from "../../src/components/button.tsx";
console.log("hi1.js", Button.name);


@@ -0,0 +1,2 @@
import { Button } from "../../src/components/button.tsx";
console.log("hi1.js", Button.name);


@@ -0,0 +1,2 @@
import { Button } from "../../src/components/button.tsx";
console.log("hi1.js", Button.name);


@@ -0,0 +1,2 @@
import { Button } from "../src/components/button.tsx";
console.log("hi1.js", Button.name);


@@ -0,0 +1,21 @@
// Snowpack Configuration File
// See all supported options: https://www.snowpack.dev/reference/configuration
/** @type {import("snowpack").SnowpackUserConfig } */
module.exports = {
  root: "src",
  mount: {
    public: "/",
    src: "/",
  },
  plugins: ["@snowpack/plugin-react-refresh"],
  packageOptions: {
    /* ... */
  },
  devOptions: {
    /* ... */
  },
  buildOptions: {
    /* ... */
  },
};

File diff suppressed because it is too large


@@ -0,0 +1,14 @@
:root {
  --timestamp: "12812";
  --interval: "8";
  --progress-bar: 11.83299999999997%;
  --spinner-1-muted: rgb(142, 6, 182);
  --spinner-1-primary: rgb(177, 8, 227);
  --spinner-2-muted: rgb(110, 148, 190);
  --spinner-2-primary: rgb(138, 185, 238);
  --spinner-3-muted: rgb(75, 45, 64);
  --spinner-3-primary: rgb(94, 56, 80);
  --spinner-4-muted: rgb(155, 129, 108);
  --spinner-4-primary: rgb(194, 161, 135);
  --spinner-rotate: 213deg;
}


@@ -0,0 +1,27 @@
import React from "react";
import { NewComponent } from "./new-comp";

const Toast = () => {
  const [baconyes, baconno] = useBacon();
  return <div>false</div>;
};

const Button = ({ label, label2, onClick }) => {
  const useCustomHookInsideFunction = (what, arr) => {
    return [true, false];
  };
  const [on, setOn] = React.useState(false);
  React.useEffect(() => {
    console.log({ on });
  }, [on]);
  // const [foo1, foo2] = useCustomHookInsideFunction(() => {}, [on]);
  return (
    <div className="Button" onClick={onClick}>
      <Toast>f</Toast>
      <div className="Button-label">{label}12</div>
      <NewComponent />
    </div>
  );
};


@@ -0,0 +1,3 @@
export const NewComponent = () => {
  return <div>NEW!</div>;
};


@@ -0,0 +1 @@
@import "https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;700&family=Space+Mono:wght@400;700&display=swap";


@@ -0,0 +1,236 @@
@import "./colors.css";
:root {
--heading-font: "Space Mono", system-ui;
--body-font: "IBM Plex Sans", system-ui;
--color-brand: #02ff00;
--color-brand-muted: rgb(2, 150, 0);
--padding-horizontal: 90px;
--page-background: black;
--page-background-alpha: rgba(0, 0, 0, 0.8);
--result__background-color: black;
--result__primary-color: var(--color-brand);
--result__foreground-color: white;
--result__muted-color: rgb(165, 165, 165);
--card-width: 352px;
--page-width: 1152px;
--snippets_container-background-unfocused: #171717;
--snippets_container-background-focused: #0017e9;
--snippets_container-background: var(
--snippets_container-background-unfocused
);
--snippets_container-muted-color: rgb(153, 153, 153);
}
body {
color: white;
margin: 0;
padding: 0;
font-family: var(--body-font);
background-color: var(--page-background);
color: var(--result__muted-color);
display: flex;
flex-direction: column;
height: 100%;
}
.Subtitle {
text-align: center;
font-size: 4em;
margin: 0;
padding: 0;
margin-bottom: 0.25em;
align-items: center;
display: flex;
flex-direction: row;
}
#reactroot,
#__next,
body,
html {
height: 100%;
}
.Title {
color: var(--color-brand);
font-family: var(--heading-font);
font-weight: 700;
margin-top: 48px;
font-size: 48px;
text-transform: capitalize;
text-align: center;
}
.Description {
text-align: center;
}
.main {
display: flex;
flex-direction: column;
height: 100%;
}
header,
.main {
width: 650px;
margin: 0 auto;
}
section {
width: 650px;
}
header {
margin-bottom: 48px;
}
footer {
flex-shrink: 0;
}
#reactroot,
#__next {
display: flex;
flex-direction: column;
justify-content: center;
}
section {
height: 300px;
display: flex;
flex-direction: column;
}
.timer {
font-weight: normal;
}
.ProgressBar-container {
width: 100%;
display: block;
position: relative;
border: 1px solid var(--color-brand-muted);
border-radius: 4px;
height: 92px;
}
.ProgressBar {
position: absolute;
top: 0;
bottom: 0;
right: 0;
left: 0;
width: 100%;
height: 100%;
display: block;
background-color: var(--color-brand);
transform-origin: top left;
border-radius: 4px;
transform: scaleX(var(--progress-bar, 0%));
}
.Bundler-container {
background-color: var(--snippets_container-background-focused);
font-size: 64px;
font-weight: bold;
color: white;
left: 0;
right: 0;
padding: 0.8em 0.8em;
}
.Bundler-updateRate {
font-size: 0.8em;
font-weight: normal;
display: flex;
color: var(--result__muted-color);
}
.interval:before {
content: var(--interval, "16");
}
.highlight {
color: white;
}
.timer:after {
content: var(--timestamp);
font-variant-numeric: tabular-nums;
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen,
Ubuntu, Cantarell, "Open Sans", "Helvetica Neue", sans-serif;
display: inline;
font-weight: 500;
color: white;
width: 100%;
}
.SectionLabel {
font-weight: 300;
font-family: var(--heading-font);
text-align: center;
width: 100%;
font-weight: 700;
margin-top: 24px;
}
.FooterLabel {
margin-top: 0;
margin-bottom: 12px;
}
.Spinner-container {
--spinner-muted: rgb(0, 255, 0);
--spinner-primary: rgb(0, 60, 255);
width: 96px;
height: 96px;
border-radius: 50%;
background-color: var(--page-background);
border-top: 1.1em solid var(--spinner-muted);
border-right: 1.1em solid var(--spinner-muted);
border-bottom: 1.1em solid var(--spinner-muted);
border-left: 1.1em solid var(--spinner-primary);
transform: rotate(var(--spinner-rotate, 12deg));
}
.Spinners {
display: grid;
grid-auto-flow: column;
justify-content: space-between;
width: 100%;
}
.Spinner-1.Spinner-container {
--spinner-muted: var(--spinner-1-muted);
--spinner-primary: var(--spinner-1-primary);
}
.Spinner-2.Spinner-container {
--spinner-muted: var(--spinner-2-muted);
--spinner-primary: var(--spinner-2-primary);
}
.Spinner-3.Spinner-container {
--spinner-muted: var(--spinner-3-muted);
--spinner-primary: var(--spinner-3-primary);
}
.Spinner-4.Spinner-container {
--spinner-muted: var(--spinner-4-muted);
--spinner-primary: var(--spinner-4-primary);
}


@@ -0,0 +1,30 @@
import { Main } from "./main";
import classNames from "classnames";
import * as ReactDOM from "react-dom";
// Required by the server-side branch below; this import was missing.
import ReactDOMServer from "react-dom/server";

const Base = ({}) => {
  const name =
    typeof location !== "undefined"
      ? decodeURIComponent(location.search.substring(1))
      : null;
  return <Main productName={name || "Bundler"} />;
};

function startReact() {
  ReactDOM.render(<Base />, document.querySelector("#reactroot"));
}

if (typeof window !== "undefined") {
  console.log("HERE!!");
  globalThis.addEventListener("DOMContentLoaded", () => {
    startReact();
  });
  startReact();
} else {
  console.log(ReactDOMServer.renderToString(<Base />));
}

export { Base };


@@ -0,0 +1,67 @@
export const Main = ({ productName }) => {
  return (
    <>
      <header>
        <div className="Title">CSS HMR Stress Test</div>
        <p className="Description">
          This page visually tests how quickly a bundler can update CSS over Hot
          Module Reloading.
        </p>
      </header>
      <main className="main">
        <section className="ProgressSection">
          <p className="Subtitle">
            <span className="Subtitle-part">
              Ran:&nbsp;<span className="timer"></span>
            </span>
          </p>
          <div className="ProgressBar-container">
            <div className="ProgressBar"></div>
          </div>
          <div className="SectionLabel">
            The progress bar should move from left to right smoothly.
          </div>
        </section>
        <section>
          <div className="Spinners">
            <div className="Spinner-container Spinner-1">
              <div className="Spinner"></div>
            </div>
            <div className="Spinner-container Spinner-2">
              <div className="Spinner"></div>
            </div>
            <div className="Spinner-container Spinner-3">
              <div className="Spinner"></div>
            </div>
            <div className="Spinner-container Spinner-4">
              <div className="Spinner"></div>
            </div>
          </div>
          <div className="SectionLabel">
            The spinners should rotate &amp; change color smoothly.
          </div>
        </section>
      </main>
      <footer>
        <div className="SectionLabel FooterLabel">
          There are no CSS animations on this page.
        </div>
        <div className="Bundler-container">
          <div className="Bundler">{productName}</div>
          <div className="Bundler-updateRate">
            Saving a css file every&nbsp;
            <span className="highlight">
              <span className="interval"></span>ms
            </span>
          </div>
        </div>
      </footer>
    </>
  );
};


@@ -0,0 +1,19 @@
{
  "compilerOptions": {
    "target": "esnext",
    "lib": ["dom", "dom.iterable", "esnext", "WebWorker"],
    "allowJs": true,
    "skipLibCheck": true,
    "strict": false,
    "forceConsistentCasingInFileNames": true,
    "noEmit": true,
    "esModuleInterop": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "jsx": "preserve"
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx"],
  "exclude": ["node_modules"]
}


@@ -0,0 +1,5 @@
import reactRefresh from "@vitejs/plugin-react-refresh";

export default {
  plugins: [reactRefresh()],
};


@@ -0,0 +1,33 @@
{
  "name": "simple-react",
  "version": "1.0.0",
  "license": "MIT",
  "dependencies": {
    "@emotion/css": "^11.1.3",
    "@vitejs/plugin-react-refresh": "^1.3.3",
    "antd": "^4.16.1",
    "left-pad": "^1.3.0",
    "next": "^11.0.0",
    "parcel": "2.0.0-beta.3",
    "react": "^17.0.2",
    "react-bootstrap": "^1.6.1",
    "react-dom": "^17.0.2",
    "react-form": "^4.0.1",
    "react-hook-form": "^7.8.3"
  },
  "parcel": "parceldist/index.js",
  "targets": {
    "parcel": {
      "outputFormat": "esmodule",
      "sourceMap": false,
      "optimize": false,
      "engines": {
        "chrome": "last 1 version"
      }
    }
  },
  "devDependencies": {
    "@snowpack/plugin-react-refresh": "^2.5.0",
    "typescript": "^4.3.4"
  }
}


@@ -0,0 +1,15 @@
<!DOCTYPE html>
<html>
  <head>
    <link
      rel="stylesheet"
      crossorigin="anonymous"
      href="https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;700&family=Space+Mono:wght@400;700&display=swap"
    />
  </head>
  <body>
    <div id="reactroot"></div>
    <link rel="stylesheet" href="./src/index.css" />
    <script src="./src/index.tsx" async type="module"></script>
  </body>
</html>

File diff suppressed because it is too large


@@ -0,0 +1,14 @@
:root {
  --timestamp: "12812";
  --interval: "8";
  --progress-bar: 11.83299999999997%;
  --spinner-1-muted: rgb(142, 6, 182);
  --spinner-1-primary: rgb(177, 8, 227);
  --spinner-2-muted: rgb(110, 148, 190);
  --spinner-2-primary: rgb(138, 185, 238);
  --spinner-3-muted: rgb(75, 45, 64);
  --spinner-3-primary: rgb(94, 56, 80);
  --spinner-4-muted: rgb(155, 129, 108);
  --spinner-4-primary: rgb(194, 161, 135);
  --spinner-rotate: 213deg;
}


@@ -0,0 +1,21 @@
import React from "react";

export function RenderCounter({ name, children }) {
  const counter = React.useRef(1);
  return (
    <div className="RenderCounter">
      <div className="RenderCounter-meta">
        <div className="RenderCounter-title">
          {name} rendered <strong>{counter.current++} times</strong>
        </div>
        <div className="RenderCounter-lastRender">
          LAST RENDER:{" "}
          {new Intl.DateTimeFormat([], {
            timeStyle: "long",
          }).format(new Date())}
        </div>
      </div>
      <div className="RenderCounter-children">{children}</div>
    </div>
  );
}


@@ -0,0 +1,14 @@
import * as React from "react";
import { Button } from "./Button";
import { RenderCounter } from "./RenderCounter";

export function App() {
  return (
    <RenderCounter name="App">
      <div className="AppRoot">
        <h1>This is the root element</h1>
        <Button>Click</Button>
      </div>
    </RenderCounter>
  );
}


@@ -0,0 +1,9 @@
import { RenderCounter } from "./RenderCounter";

export const Button = ({ children }) => {
  return (
    <RenderCounter name="Button">
      <div className="Button">{children}</div>
    </RenderCounter>
  );
};


@@ -0,0 +1 @@
@import "https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;700&family=Space+Mono:wght@400;700&display=swap";


@@ -0,0 +1,98 @@
@import "./colors.css";
:root {
--heading-font: "Space Mono", system-ui;
--body-font: "IBM Plex Sans", system-ui;
--color-brand: #02ff00;
--color-brand-muted: rgb(2, 150, 0);
--padding-horizontal: 90px;
--page-background: black;
--page-background-alpha: rgba(0, 0, 0, 0.8);
--result__background-color: black;
--result__primary-color: var(--color-brand);
--result__foreground-color: white;
--result__muted-color: rgb(165, 165, 165);
--card-width: 352px;
--page-width: 1152px;
--snippets_container-background-unfocused: #171717;
--snippets_container-background-focused: #0017e9;
--snippets_container-background: var(
--snippets_container-background-unfocused
);
--snippets_container-muted-color: rgb(153, 153, 153);
}
body {
color: white;
margin: 0;
padding: 0;
font-family: var(--body-font);
background-color: var(--page-background);
color: var(--result__muted-color);
display: flex;
flex-direction: column;
height: 100%;
}
#reactroot,
#__next,
body,
html {
height: 100%;
}
.RenderCounter {
border: 10px solid var(--snippets_container-background-focused);
margin: 10px;
padding: 10px;
animation: flash 0.2s linear;
animation-fill-mode: forwards;
}
.RenderCounter-meta {
display: flex;
flex-direction: row;
justify-content: space-between;
margin: -10px;
padding: 10px;
background-color: #111;
}
.RenderCounter-lastRender,
.RenderCounter-title {
white-space: nowrap;
color: rgb(153, 153, 153);
}
@keyframes flash {
from {
border-color: var(--snippets_container-background-focused);
}
to {
border-color: var(--snippets_container-background-unfocused);
}
}
.Button {
display: block;
border: 1px solid rgb(20, 180, 0);
background-color: rgb(2, 150, 0);
color: white;
font-weight: 500;
padding: 10px 12px;
border-radius: 4px;
text-transform: uppercase;
text-align: center;
width: fit-content;
cursor: pointer;
}


@@ -0,0 +1,15 @@
import ReactDOM from "react-dom";
import React from "react";
import { App } from "./components/app";
import classNames from "classnames";

function startReact() {
  ReactDOM.render(<App />, document.querySelector("#reactroot"));
}

globalThis.addEventListener("DOMContentLoaded", () => {
  startReact();
});
startReact();

export { App };


@@ -0,0 +1,69 @@
import React from "react";

export const Main = ({ productName }) => {
  return (
    <>
      <header>
        <div className="Title">CSS HMR Stress Test</div>
        <p className="Description">
          This page visually tests how quickly a bundler can update CSS over Hot
          Module Reloading.
        </p>
      </header>
      <main className="main">
        <section className="ProgressSection">
          <p className="Subtitle">
            <span className="Subtitle-part">
              Ran:&nbsp;<span className="timer"></span>
            </span>
          </p>
          <div className="ProgressBar-container">
            <div className="ProgressBar"></div>
          </div>
          <div className="SectionLabel">
            The progress bar should move from left to right smoothly.
          </div>
        </section>
        <section>
          <div className="Spinners">
            <div className="Spinner-container Spinner-1">
              <div className="Spinner"></div>
            </div>
            <div className="Spinner-container Spinner-2">
              <div className="Spinner"></div>
            </div>
            <div className="Spinner-container Spinner-3">
              <div className="Spinner"></div>
            </div>
            <div className="Spinner-container Spinner-4">
              <div className="Spinner"></div>
            </div>
          </div>
          <div className="SectionLabel">
            The spinners should rotate &amp; change color smoothly.
          </div>
        </section>
      </main>
      <footer>
        <div className="SectionLabel FooterLabel">
          There are no CSS animations on this page.
        </div>
        <div className="Bundler-container">
          <div className="Bundler">{productName}</div>
          <div className="Bundler-updateRate">
            Saving a css file every&nbsp;
            <span className="highlight">
              <span className="interval"></span>ms
            </span>
          </div>
        </div>
      </footer>
    </>
  );
};


@@ -0,0 +1,19 @@
{
  "compilerOptions": {
    "target": "esnext",
    "lib": ["dom", "dom.iterable", "esnext"],
    "allowJs": true,
    "skipLibCheck": true,
    "strict": false,
    "forceConsistentCasingInFileNames": true,
    "noEmit": true,
    "esModuleInterop": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "jsx": "preserve"
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx"],
  "exclude": ["node_modules"]
}

BIN
esdev-fd Executable file

Binary file not shown.

BIN
esdev-fd-relative Executable file

Binary file not shown.

BIN
esdev-lists Executable file

Binary file not shown.

BIN
esdev-nolists Executable file

Binary file not shown.

BIN
esdev.before-comptime-js-parser Executable file

Binary file not shown.

BIN
esdev.beforehashmapupgrade Executable file

Binary file not shown.

134
misctools/color-looper.zig Normal file

@@ -0,0 +1,134 @@
const std = @import("std");
// usage:
// ./file-path:0 10
// 1 2 3
// 1. file path
// 2. Byte offset in file
// 3. ms update interval
pub fn main() anyerror!void {
var allocator = std.heap.c_allocator;
var timer = try std.time.Timer.start();
var color_buf: [2048]u8 = undefined;
var args = std.mem.span(try std.process.argsAlloc(allocator));
var basepath_with_colon: []u8 = args[args.len - 2];
var basepath: []u8 = "";
var position_str: []u8 = "";
if (std.mem.lastIndexOfScalar(u8, basepath_with_colon, ':')) |colon| {
basepath = basepath_with_colon[0..colon];
position_str = basepath_with_colon[colon + 1 ..];
}
var position = try std.fmt.parseInt(u32, position_str, 10);
const filepath = try std.fs.path.resolve(allocator, &.{basepath});
var file = try std.fs.openFileAbsolute(filepath, .{ .write = true });
var ms = @truncate(u64, (try std.fmt.parseInt(u128, args[args.len - 1], 10)) * std.time.ns_per_ms);
std.debug.assert(ms > 0);
// std.debug.assert(std.math.isFinite(position));
var prng = std.rand.DefaultPrng.init(0);
var stdout = std.io.getStdOut();
var log = stdout.writer();
var colors = std.mem.zeroes([4][3]u32);
var progress_bar: f64 = 0.0;
var destination_count: f64 = 18.0;
// Randomize initial colors
colors[0][0] = prng.random.int(u32);
colors[0][1] = prng.random.int(u32);
colors[0][2] = prng.random.int(u32);
colors[1][0] = prng.random.int(u32);
colors[1][1] = prng.random.int(u32);
colors[1][2] = prng.random.int(u32);
colors[2][0] = prng.random.int(u32);
colors[2][1] = prng.random.int(u32);
colors[2][2] = prng.random.int(u32);
colors[3][0] = prng.random.int(u32);
colors[3][1] = prng.random.int(u32);
colors[3][2] = prng.random.int(u32);
var rotate: u32 = 0;
var counter: usize = 0;
while (true) {
colors[0][0] += 1;
colors[0][1] += 1;
colors[0][2] += 1;
colors[1][0] += 1;
colors[1][1] += 1;
colors[1][2] += 1;
colors[2][0] += 1;
colors[2][1] += 1;
colors[2][2] += 1;
colors[3][0] += 1;
colors[3][1] += 1;
colors[3][2] += 1;
rotate += 1;
const fmtd =
\\:root {{
\\ --timestamp: "{d}";
\\ --interval: "{s}";
\\ --progress-bar: {d}%;
\\ --spinner-1-muted: rgb({d}, {d}, {d});
\\ --spinner-1-primary: rgb({d}, {d}, {d});
\\ --spinner-2-muted: rgb({d}, {d}, {d});
\\ --spinner-2-primary: rgb({d}, {d}, {d});
\\ --spinner-3-muted: rgb({d}, {d}, {d});
\\ --spinner-3-primary: rgb({d}, {d}, {d});
\\ --spinner-4-muted: rgb({d}, {d}, {d});
\\ --spinner-4-primary: rgb({d}, {d}, {d});
\\ --spinner-rotate: {d}deg;
\\}}
;
file = try std.fs.createFileAbsolute(filepath, .{ .truncate = true });
var wrote = try std.fmt.bufPrint(&color_buf, fmtd, .{
counter,
args[args.len - 1],
std.math.mod(f64, std.math.round(((progress_bar + 1.0) / destination_count) * 1000) / 1000, 100),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[0][0] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[0][1] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[0][2] + 1) % 256)) * 0.8)),
(colors[0][0] + 1) % 256,
(colors[0][1] + 1) % 256,
(colors[0][2] + 1) % 256,
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[1][0] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[1][1] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[1][2] + 1) % 256)) * 0.8)),
(colors[1][0] + 1) % 256,
(colors[1][1] + 1) % 256,
(colors[1][2] + 1) % 256,
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[2][0] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[2][1] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[2][2] + 1) % 256)) * 0.8)),
(colors[2][0] + 1) % 256,
(colors[2][1] + 1) % 256,
(colors[2][2] + 1) % 256,
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[3][0] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[3][1] + 1) % 256)) * 0.8)),
@floatToInt(u32, std.math.round(@intToFloat(f64, ((colors[3][2] + 1) % 256)) * 0.8)),
(colors[3][0] + 1) % 256,
(colors[3][1] + 1) % 256,
(colors[3][2] + 1) % 256,
rotate % 360,
});
progress_bar += 1.0;
_ = try file.writeAll(wrote);
try log.print("[{d}] \"{s}\":{d}\n", .{
std.time.nanoTimestamp(),
filepath,
position,
});
counter += 1;
// If we don't close the file, Parcel seems to never recognize it
file.close();
std.time.sleep(ms);
}
}

BIN
node_modules.jsbundle Normal file

Binary file not shown.

0
out.txt Normal file


@@ -0,0 +1,6 @@
{
  "name": "speedy-nextjs",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT"
}

336
pnpm-lock.yaml generated Normal file

@@ -0,0 +1,336 @@
lockfileVersion: 5.3
specifiers:
'@babel/preset-react': ^7.13.13
'@swc/cli': ^0.1.39
'@swc/core': ^1.2.55
'@swc/wasm': ^1.2.54
esbuild-wasm: ^0.11.19
dependencies:
'@babel/preset-react': 7.13.13
'@swc/cli': 0.1.39_@swc+core@1.2.55
'@swc/core': 1.2.55
'@swc/wasm': 1.2.55
esbuild-wasm: 0.11.19
packages:
/@babel/helper-annotate-as-pure/7.12.13:
resolution: {integrity: sha512-7YXfX5wQ5aYM/BOlbSccHDbuXXFPxeoUmfWtz8le2yTkTZc+BxsiEnENFoi2SlmA8ewDkG2LgIMIVzzn2h8kfw==}
dependencies:
'@babel/types': 7.14.1
dev: false
/@babel/helper-module-imports/7.13.12:
resolution: {integrity: sha512-4cVvR2/1B693IuOvSI20xqqa/+bl7lqAMR59R4iu39R9aOX8/JoYY1sFaNvUMyMBGnHdwvJgUrzNLoUZxXypxA==}
dependencies:
'@babel/types': 7.14.1
dev: false
/@babel/helper-plugin-utils/7.13.0:
resolution: {integrity: sha512-ZPafIPSwzUlAoWT8DKs1W2VyF2gOWthGd5NGFMsBcMMol+ZhK+EQY/e6V96poa6PA/Bh+C9plWN0hXO1uB8AfQ==}
dev: false
/@babel/helper-validator-identifier/7.14.0:
resolution: {integrity: sha512-V3ts7zMSu5lfiwWDVWzRDGIN+lnCEUdaXgtVHJgLb1rGaA6jMrtB9EmE7L18foXJIE8Un/A/h6NJfGQp/e1J4A==}
dev: false
/@babel/helper-validator-option/7.12.17:
resolution: {integrity: sha512-TopkMDmLzq8ngChwRlyjR6raKD6gMSae4JdYDB8bByKreQgG0RBTuKe9LRxW3wFtUnjxOPRKBDwEH6Mg5KeDfw==}
dev: false
/@babel/plugin-syntax-jsx/7.12.13:
resolution: {integrity: sha512-d4HM23Q1K7oq/SLNmG6mRt85l2csmQ0cHRaxRXjKW0YFdEXqlZ5kzFQKH5Uc3rDJECgu+yCRgPkG04Mm98R/1g==}
peerDependencies:
'@babel/core': ^7.0.0-0
dependencies:
'@babel/helper-plugin-utils': 7.13.0
dev: false
/@babel/plugin-transform-react-display-name/7.12.13:
resolution: {integrity: sha512-MprESJzI9O5VnJZrL7gg1MpdqmiFcUv41Jc7SahxYsNP2kDkFqClxxTZq+1Qv4AFCamm+GXMRDQINNn+qrxmiA==}
peerDependencies:
'@babel/core': ^7.0.0-0
dependencies:
'@babel/helper-plugin-utils': 7.13.0
dev: false
/@babel/plugin-transform-react-jsx-development/7.12.17:
resolution: {integrity: sha512-BPjYV86SVuOaudFhsJR1zjgxxOhJDt6JHNoD48DxWEIxUCAMjV1ys6DYw4SDYZh0b1QsS2vfIA9t/ZsQGsDOUQ==}
peerDependencies:
'@babel/core': ^7.0.0-0
dependencies:
'@babel/plugin-transform-react-jsx': 7.13.12
dev: false
/@babel/plugin-transform-react-jsx/7.13.12:
resolution: {integrity: sha512-jcEI2UqIcpCqB5U5DRxIl0tQEProI2gcu+g8VTIqxLO5Iidojb4d77q+fwGseCvd8af/lJ9masp4QWzBXFE2xA==}
peerDependencies:
'@babel/core': ^7.0.0-0
dependencies:
'@babel/helper-annotate-as-pure': 7.12.13
'@babel/helper-module-imports': 7.13.12
'@babel/helper-plugin-utils': 7.13.0
'@babel/plugin-syntax-jsx': 7.12.13
'@babel/types': 7.14.1
dev: false
/@babel/plugin-transform-react-pure-annotations/7.12.1:
resolution: {integrity: sha512-RqeaHiwZtphSIUZ5I85PEH19LOSzxfuEazoY7/pWASCAIBuATQzpSVD+eT6MebeeZT2F4eSL0u4vw6n4Nm0Mjg==}
peerDependencies:
'@babel/core': ^7.0.0-0
dependencies:
'@babel/helper-annotate-as-pure': 7.12.13
'@babel/helper-plugin-utils': 7.13.0
dev: false
/@babel/preset-react/7.13.13:
resolution: {integrity: sha512-gx+tDLIE06sRjKJkVtpZ/t3mzCDOnPG+ggHZG9lffUbX8+wC739x20YQc9V35Do6ZAxaUc/HhVHIiOzz5MvDmA==}
peerDependencies:
'@babel/core': ^7.0.0-0
dependencies:
'@babel/helper-plugin-utils': 7.13.0
'@babel/helper-validator-option': 7.12.17
'@babel/plugin-transform-react-display-name': 7.12.13
'@babel/plugin-transform-react-jsx': 7.13.12
'@babel/plugin-transform-react-jsx-development': 7.12.17
'@babel/plugin-transform-react-pure-annotations': 7.12.1
dev: false
/@babel/types/7.14.1:
resolution: {integrity: sha512-S13Qe85fzLs3gYRUnrpyeIrBJIMYv33qSTg1qoBwiG6nPKwUWAD9odSzWhEedpwOIzSEI6gbdQIWEMiCI42iBA==}
dependencies:
'@babel/helper-validator-identifier': 7.14.0
to-fast-properties: 2.0.0
dev: false
/@napi-rs/triples/1.0.2:
resolution: {integrity: sha512-EL3SiX43m9poFSnhDx4d4fn9SSaqyO2rHsCNhETi9bWPmjXK3uPJ0QpPFtx39FEdHcz1vJmsiW41kqc0AgvtzQ==}
dev: false
/@node-rs/helper/1.1.0:
resolution: {integrity: sha512-r43YnnrY5JNzDuXJdW3sBJrKzvejvFmFWbiItUEoBJsaPzOIWFMhXB7i5j4c9EMXcFfxveF4l7hT+rLmwtjrVQ==}
dependencies:
'@napi-rs/triples': 1.0.2
tslib: 2.2.0
dev: false
/@swc/cli/0.1.39_@swc+core@1.2.55:
resolution: {integrity: sha512-qTI+HIjSgKUJUKZ3xGA6zAEkHryirmKrzj4zWrCg4FQnAEFGPOIx58/qRs3aURSOS3BnbVE33sqAxEN+v8qZpw==}
engines: {node: '>= 12.13'}
hasBin: true
peerDependencies:
'@swc/core': ^1.2.4
chokidar: ^3.0.0
peerDependenciesMeta:
chokidar:
optional: true
dependencies:
'@swc/core': 1.2.55
commander: 7.2.0
convert-source-map: 1.7.0
glob: 7.1.7
lodash: 4.17.21
slash: 3.0.0
source-map: 0.7.3
dev: false
/@swc/core-android-arm64/1.2.56:
resolution: {integrity: sha512-yXiqbuEnpotpYdGL8rFvRQzkK7JQ1rhZAdGTcCvwUF7L8Ujm1NxJlrNaiMiK7uKvCYOynwe32Ddykaew8ggEFQ==}
engines: {node: '>=10'}
cpu: [arm64]
os: [android]
dev: false
optional: true
/@swc/core-darwin-arm64/1.2.56:
resolution: {integrity: sha512-Ub74q6rKxJy909mXoBJQ7dF5dUJnqrq3XpGHWexv3WUr7C/sTbcwZDwgFMqgDHOf0TSPTge+qwPNOIxcSYv/Kg==}
engines: {node: '>=10'}
cpu: [arm64]
os: [darwin]
dev: false
optional: true
/@swc/core-darwin-x64/1.2.56:
resolution: {integrity: sha512-vxHo9eAyEVykTXM9tJGOYdlsxWq43po5mDeB1dEEjdwefpRCeV+xv3xL6GfVxoVn26w+LZgT4R+BpP0Hx7kATQ==}
engines: {node: '>=10'}
cpu: [x64]
os: [darwin]
dev: false
optional: true
/@swc/core-linux-arm-gnueabihf/1.2.56:
resolution: {integrity: sha512-Chmj/OQB1ie/UY5Cdt9e8VkUTE5lDAPGg4eN2O71j0UlZux3TwR+L/tiGuS9S87lqF9qtZAmZ+WTldeiVFdVqQ==}
engines: {node: '>=10'}
cpu: [arm]
os: [linux]
dev: false
optional: true
/@swc/core-linux-arm64-gnu/1.2.56:
resolution: {integrity: sha512-WCze10brrFmWrJUKmmZVQPfgVnfkvfXbKbs24cgjFSzsV2iBZ4/NVqe+5covYTOkaFvnrqERHqq+ntm1wjDT1A==}
engines: {node: '>=10'}
cpu: [arm64]
os: [linux]
dev: false
optional: true
/@swc/core-linux-x64-gnu/1.2.56:
resolution: {integrity: sha512-B+Rr6NXUNe8RmgBNEh3ATZt77muFssaXbzIYTn+Yovw/s+xh27TFHaoZkfKJFNY/uWxL3S22ZVAxv5ugwS4++g==}
engines: {node: '>=10'}
cpu: [x64]
os: [linux]
dev: false
optional: true
/@swc/core-linux-x64-musl/1.2.56:
resolution: {integrity: sha512-W1BA8Zjz4pkFmAg3PqKsdTyySkJcUiPWi18Ok0qBx2xemgkEKpERpwI51NwWm3YQUSJKTH2MFiwfDLtCE+Ieng==}
engines: {node: '>=10'}
cpu: [x64]
os: [linux]
dev: false
optional: true
/@swc/core-win32-ia32-msvc/1.2.56:
resolution: {integrity: sha512-sSpruAaA3y0CXO1yMPfDxo4p9wtrS7cVOM7P9IryKIUGZBtoM3U0W2NAUE3h5GNrx7xv2GBxqtzfoYW6I8T9bw==}
engines: {node: '>=10'}
cpu: [ia32]
os: [win32]
dev: false
optional: true
/@swc/core-win32-x64-msvc/1.2.56:
resolution: {integrity: sha512-eSqajMZ6fAfHAy1h9Bh8oN90faCy3zsj3VcgjhEbJQnjUIN32eOLlWb70pAb58ckP+c2pBejaRuRElVjaViVjw==}
engines: {node: '>=10'}
cpu: [x64]
os: [win32]
dev: false
optional: true
/@swc/core/1.2.55:
resolution: {integrity: sha512-ZtyxJ0IT0dv4jq0oPrlQytRN9HoSocT5Xig6y/Yx28uFRGJOlqaP1NrkNyZhB65c29gwXoedxN54uVqmXe+aFQ==}
engines: {node: '>=10'}
dependencies:
'@node-rs/helper': 1.1.0
optionalDependencies:
'@swc/core-android-arm64': 1.2.56
'@swc/core-darwin-arm64': 1.2.56
'@swc/core-darwin-x64': 1.2.56
'@swc/core-linux-arm-gnueabihf': 1.2.56
'@swc/core-linux-arm64-gnu': 1.2.56
'@swc/core-linux-x64-gnu': 1.2.56
'@swc/core-linux-x64-musl': 1.2.56
'@swc/core-win32-ia32-msvc': 1.2.56
'@swc/core-win32-x64-msvc': 1.2.56
dev: false
/@swc/wasm/1.2.55:
resolution: {integrity: sha512-otrxYNDmKSKVK8QVsGynACyvSL8XOYYXsh7cyaXPSKGnTTPjeWhYvI1d5uFnZyASfFXUpk1eFEE6AMJWIwKJhA==}
dev: false
/balanced-match/1.0.2:
resolution: {integrity: sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==}
dev: false
/brace-expansion/1.1.11:
resolution: {integrity: sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==}
dependencies:
balanced-match: 1.0.2
concat-map: 0.0.1
dev: false
/commander/7.2.0:
resolution: {integrity: sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw==}
engines: {node: '>= 10'}
dev: false
/concat-map/0.0.1:
resolution: {integrity: sha1-2Klr13/Wjfd5OnMDajug1UBdR3s=}
dev: false
/convert-source-map/1.7.0:
resolution: {integrity: sha512-4FJkXzKXEDB1snCFZlLP4gpC3JILicCpGbzG9f9G7tGqGCzETQ2hWPrcinA9oU4wtf2biUaEH5065UnMeR33oA==}
dependencies:
safe-buffer: 5.1.2
dev: false
/esbuild-wasm/0.11.19:
resolution: {integrity: sha512-d4s3fcIBG9CL/h5kKfXHpkztyMhs71anqdszND1Zfr4na1bhMGAb+VyEMBbt2/0ft5HtcsOYBqXsjNPNWTC29w==}
engines: {node: '>=8'}
hasBin: true
dev: false
/fs.realpath/1.0.0:
resolution: {integrity: sha1-FQStJSMVjKpA20onh8sBQRmU6k8=}
dev: false
/glob/7.1.7:
resolution: {integrity: sha512-OvD9ENzPLbegENnYP5UUfJIirTg4+XwMWGaQfQTY0JenxNvvIKP3U3/tAQSPIu/lHxXYSZmpXlUHeqAIdKzBLQ==}
dependencies:
fs.realpath: 1.0.0
inflight: 1.0.6
inherits: 2.0.4
minimatch: 3.0.4
once: 1.4.0
path-is-absolute: 1.0.1
dev: false
/inflight/1.0.6:
resolution: {integrity: sha1-Sb1jMdfQLQwJvJEKEHW6gWW1bfk=}
dependencies:
once: 1.4.0
wrappy: 1.0.2
dev: false
/inherits/2.0.4:
resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==}
dev: false
/lodash/4.17.21:
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
dev: false
/minimatch/3.0.4:
resolution: {integrity: sha512-yJHVQEhyqPLUTgt9B83PXu6W3rx4MvvHvSUvToogpwoGDOUQ+yDrR0HRot+yOCdCO7u4hX3pWft6kWBBcqh0UA==}
dependencies:
brace-expansion: 1.1.11
dev: false
/once/1.4.0:
resolution: {integrity: sha1-WDsap3WWHUsROsF9nFC6753Xa9E=}
dependencies:
wrappy: 1.0.2
dev: false
/path-is-absolute/1.0.1:
resolution: {integrity: sha1-F0uSaHNVNP+8es5r9TpanhtcX18=}
engines: {node: '>=0.10.0'}
dev: false
/safe-buffer/5.1.2:
resolution: {integrity: sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==}
dev: false
/slash/3.0.0:
resolution: {integrity: sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==}
engines: {node: '>=8'}
dev: false
/source-map/0.7.3:
resolution: {integrity: sha512-CkCj6giN3S+n9qrYiBTX5gystlENnRW5jZeNLHpe6aue+SrHcG5VYwujhW9s4dY31mEGsxBDrHR6oI69fTXsaQ==}
engines: {node: '>= 8'}
dev: false
/to-fast-properties/2.0.0:
resolution: {integrity: sha1-3F5pjL0HkmW8c+A3doGk5Og/YW4=}
engines: {node: '>=4'}
dev: false
/tslib/2.2.0:
resolution: {integrity: sha512-gS9GVHRU+RGn5KQM2rllAlR3dU6m7AcpJKdtH8gFvQiC4Otgk98XnmMU+nZenHt/+VhnBPWwgrJsyrdcw6i23w==}
dev: false
/wrappy/1.0.2:
resolution: {integrity: sha1-tSQ9jz7BqjXxNkYFvA0QNuMKtp8=}
dev: false

1 profile.json Normal file

File diff suppressed because one or more lines are too long

332 src/Wyhash.zig Normal file

@@ -0,0 +1,332 @@
// SPDX-License-Identifier: MIT
// Copyright (c) 2015-2021 Zig Contributors
// This file is part of [zig](https://ziglang.org/), which is MIT licensed.
// The MIT license requires this copyright notice to be included in all copies
// and substantial portions of the software.
const std = @import("std");
const mem = std.mem;
const primes = [_]u64{
0xa0761d6478bd642f,
0xe7037ed1a0b428db,
0x8ebc6af09c88c6e3,
0x589965cc75374cc3,
0x1d8e4e27c47d124f,
};
fn read_bytes(comptime bytes: u8, data: []const u8) u64 {
const T = std.meta.Int(.unsigned, 8 * bytes);
return mem.readIntLittle(T, data[0..bytes]);
}
fn read_8bytes_swapped(data: []const u8) u64 {
return (read_bytes(4, data) << 32 | read_bytes(4, data[4..]));
}
fn mum(a: u64, b: u64) u64 {
var r = std.math.mulWide(u64, a, b);
r = (r >> 64) ^ r;
return @truncate(u64, r);
}
fn mix0(a: u64, b: u64, seed: u64) u64 {
return mum(a ^ seed ^ primes[0], b ^ seed ^ primes[1]);
}
fn mix1(a: u64, b: u64, seed: u64) u64 {
return mum(a ^ seed ^ primes[2], b ^ seed ^ primes[3]);
}
// Wyhash version which does not store internal state for handling partial buffers.
// This is needed so that we can maximize the speed for the short key case, which will
// use the non-iterative api which the public Wyhash exposes.
pub fn WyhashGenerator(comptime ValueType: type) type {
return struct {
seed: u64,
msg_len: usize,
pub fn init(seed: u64) WyhashStateless {
return WyhashStateless{
.seed = seed,
.msg_len = 0,
};
}
fn round(self: *WyhashStateless, b: []const u8) void {
std.debug.assert(b.len == 32);
self.seed = mix0(
read_bytes(8, b[0..]),
read_bytes(8, b[8..]),
self.seed,
) ^ mix1(
read_bytes(8, b[16..]),
read_bytes(8, b[24..]),
self.seed,
);
}
pub fn update(self: *WyhashStateless, b: []const u8) void {
std.debug.assert(b.len % 32 == 0);
var off: usize = 0;
while (off < b.len) : (off += 32) {
@call(.{ .modifier = .always_inline }, self.round, .{b[off .. off + 32]});
}
self.msg_len += b.len;
}
pub fn final(self: *WyhashStateless, b: []const u8) u64 {
std.debug.assert(b.len < 32);
const seed = self.seed;
const rem_len = @intCast(u5, b.len);
const rem_key = b[0..rem_len];
self.seed = switch (rem_len) {
0 => seed,
1 => mix0(read_bytes(1, rem_key), primes[4], seed),
2 => mix0(read_bytes(2, rem_key), primes[4], seed),
3 => mix0((read_bytes(2, rem_key) << 8) | read_bytes(1, rem_key[2..]), primes[4], seed),
4 => mix0(read_bytes(4, rem_key), primes[4], seed),
5 => mix0((read_bytes(4, rem_key) << 8) | read_bytes(1, rem_key[4..]), primes[4], seed),
6 => mix0((read_bytes(4, rem_key) << 16) | read_bytes(2, rem_key[4..]), primes[4], seed),
7 => mix0((read_bytes(4, rem_key) << 24) | (read_bytes(2, rem_key[4..]) << 8) | read_bytes(1, rem_key[6..]), primes[4], seed),
8 => mix0(read_8bytes_swapped(rem_key), primes[4], seed),
9 => mix0(read_8bytes_swapped(rem_key), read_bytes(1, rem_key[8..]), seed),
10 => mix0(read_8bytes_swapped(rem_key), read_bytes(2, rem_key[8..]), seed),
11 => mix0(read_8bytes_swapped(rem_key), (read_bytes(2, rem_key[8..]) << 8) | read_bytes(1, rem_key[10..]), seed),
12 => mix0(read_8bytes_swapped(rem_key), read_bytes(4, rem_key[8..]), seed),
13 => mix0(read_8bytes_swapped(rem_key), (read_bytes(4, rem_key[8..]) << 8) | read_bytes(1, rem_key[12..]), seed),
14 => mix0(read_8bytes_swapped(rem_key), (read_bytes(4, rem_key[8..]) << 16) | read_bytes(2, rem_key[12..]), seed),
15 => mix0(read_8bytes_swapped(rem_key), (read_bytes(4, rem_key[8..]) << 24) | (read_bytes(2, rem_key[12..]) << 8) | read_bytes(1, rem_key[14..]), seed),
16 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed),
17 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_bytes(1, rem_key[16..]), primes[4], seed),
18 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_bytes(2, rem_key[16..]), primes[4], seed),
19 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(2, rem_key[16..]) << 8) | read_bytes(1, rem_key[18..]), primes[4], seed),
20 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_bytes(4, rem_key[16..]), primes[4], seed),
21 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(4, rem_key[16..]) << 8) | read_bytes(1, rem_key[20..]), primes[4], seed),
22 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(4, rem_key[16..]) << 16) | read_bytes(2, rem_key[20..]), primes[4], seed),
23 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(4, rem_key[16..]) << 24) | (read_bytes(2, rem_key[20..]) << 8) | read_bytes(1, rem_key[22..]), primes[4], seed),
24 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), primes[4], seed),
25 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), read_bytes(1, rem_key[24..]), seed),
26 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), read_bytes(2, rem_key[24..]), seed),
27 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(2, rem_key[24..]) << 8) | read_bytes(1, rem_key[26..]), seed),
28 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), read_bytes(4, rem_key[24..]), seed),
29 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(4, rem_key[24..]) << 8) | read_bytes(1, rem_key[28..]), seed),
30 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(4, rem_key[24..]) << 16) | read_bytes(2, rem_key[28..]), seed),
31 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(4, rem_key[24..]) << 24) | (read_bytes(2, rem_key[28..]) << 8) | read_bytes(1, rem_key[30..]), seed),
};
self.msg_len += b.len;
return mum(self.seed ^ self.msg_len, primes[4]);
}
pub fn hash(seed: u64, value: ValueType) u64 {
const input = std.mem.asBytes(&value);
const aligned_len = @sizeOf(ValueType) - (@sizeOf(ValueType) % 32);
var c = WyhashStateless.init(seed);
@call(.{ .modifier = .always_inline }, c.update, .{input[0..aligned_len]});
return @call(.{ .modifier = .always_inline }, c.final, .{input[aligned_len..]});
}
};
}
// Wyhash version which does not store internal state for handling partial buffers.
// This is needed so that we can maximize the speed for the short key case, which will
// use the non-iterative api which the public Wyhash exposes.
const WyhashStateless = struct {
seed: u64,
msg_len: usize,
pub fn init(seed: u64) WyhashStateless {
return WyhashStateless{
.seed = seed,
.msg_len = 0,
};
}
fn round(self: *WyhashStateless, b: []const u8) void {
std.debug.assert(b.len == 32);
self.seed = mix0(
read_bytes(8, b[0..]),
read_bytes(8, b[8..]),
self.seed,
) ^ mix1(
read_bytes(8, b[16..]),
read_bytes(8, b[24..]),
self.seed,
);
}
pub fn update(self: *WyhashStateless, b: []const u8) void {
std.debug.assert(b.len % 32 == 0);
var off: usize = 0;
while (off < b.len) : (off += 32) {
@call(.{ .modifier = .always_inline }, self.round, .{b[off .. off + 32]});
}
self.msg_len += b.len;
}
pub fn final(self: *WyhashStateless, b: []const u8) u64 {
std.debug.assert(b.len < 32);
const seed = self.seed;
const rem_len = @intCast(u5, b.len);
const rem_key = b[0..rem_len];
self.seed = switch (rem_len) {
0 => seed,
1 => mix0(read_bytes(1, rem_key), primes[4], seed),
2 => mix0(read_bytes(2, rem_key), primes[4], seed),
3 => mix0((read_bytes(2, rem_key) << 8) | read_bytes(1, rem_key[2..]), primes[4], seed),
4 => mix0(read_bytes(4, rem_key), primes[4], seed),
5 => mix0((read_bytes(4, rem_key) << 8) | read_bytes(1, rem_key[4..]), primes[4], seed),
6 => mix0((read_bytes(4, rem_key) << 16) | read_bytes(2, rem_key[4..]), primes[4], seed),
7 => mix0((read_bytes(4, rem_key) << 24) | (read_bytes(2, rem_key[4..]) << 8) | read_bytes(1, rem_key[6..]), primes[4], seed),
8 => mix0(read_8bytes_swapped(rem_key), primes[4], seed),
9 => mix0(read_8bytes_swapped(rem_key), read_bytes(1, rem_key[8..]), seed),
10 => mix0(read_8bytes_swapped(rem_key), read_bytes(2, rem_key[8..]), seed),
11 => mix0(read_8bytes_swapped(rem_key), (read_bytes(2, rem_key[8..]) << 8) | read_bytes(1, rem_key[10..]), seed),
12 => mix0(read_8bytes_swapped(rem_key), read_bytes(4, rem_key[8..]), seed),
13 => mix0(read_8bytes_swapped(rem_key), (read_bytes(4, rem_key[8..]) << 8) | read_bytes(1, rem_key[12..]), seed),
14 => mix0(read_8bytes_swapped(rem_key), (read_bytes(4, rem_key[8..]) << 16) | read_bytes(2, rem_key[12..]), seed),
15 => mix0(read_8bytes_swapped(rem_key), (read_bytes(4, rem_key[8..]) << 24) | (read_bytes(2, rem_key[12..]) << 8) | read_bytes(1, rem_key[14..]), seed),
16 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed),
17 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_bytes(1, rem_key[16..]), primes[4], seed),
18 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_bytes(2, rem_key[16..]), primes[4], seed),
19 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(2, rem_key[16..]) << 8) | read_bytes(1, rem_key[18..]), primes[4], seed),
20 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_bytes(4, rem_key[16..]), primes[4], seed),
21 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(4, rem_key[16..]) << 8) | read_bytes(1, rem_key[20..]), primes[4], seed),
22 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(4, rem_key[16..]) << 16) | read_bytes(2, rem_key[20..]), primes[4], seed),
23 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1((read_bytes(4, rem_key[16..]) << 24) | (read_bytes(2, rem_key[20..]) << 8) | read_bytes(1, rem_key[22..]), primes[4], seed),
24 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), primes[4], seed),
25 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), read_bytes(1, rem_key[24..]), seed),
26 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), read_bytes(2, rem_key[24..]), seed),
27 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(2, rem_key[24..]) << 8) | read_bytes(1, rem_key[26..]), seed),
28 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), read_bytes(4, rem_key[24..]), seed),
29 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(4, rem_key[24..]) << 8) | read_bytes(1, rem_key[28..]), seed),
30 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(4, rem_key[24..]) << 16) | read_bytes(2, rem_key[28..]), seed),
31 => mix0(read_8bytes_swapped(rem_key), read_8bytes_swapped(rem_key[8..]), seed) ^ mix1(read_8bytes_swapped(rem_key[16..]), (read_bytes(4, rem_key[24..]) << 24) | (read_bytes(2, rem_key[28..]) << 8) | read_bytes(1, rem_key[30..]), seed),
};
self.msg_len += b.len;
return mum(self.seed ^ self.msg_len, primes[4]);
}
pub fn hash(seed: u64, input: []const u8) u64 {
const aligned_len = input.len - (input.len % 32);
var c = WyhashStateless.init(seed);
@call(.{ .modifier = .always_inline }, c.update, .{input[0..aligned_len]});
return @call(.{ .modifier = .always_inline }, c.final, .{input[aligned_len..]});
}
};
/// Fast non-cryptographic 64bit hash function.
/// See https://github.com/wangyi-fudan/wyhash
pub const Wyhash = struct {
state: WyhashStateless,
buf: [32]u8,
buf_len: usize,
pub fn init(seed: u64) Wyhash {
return Wyhash{
.state = WyhashStateless.init(seed),
.buf = undefined,
.buf_len = 0,
};
}
pub fn update(self: *Wyhash, b: []const u8) void {
var off: usize = 0;
if (self.buf_len != 0 and self.buf_len + b.len >= 32) {
off += 32 - self.buf_len;
mem.copy(u8, self.buf[self.buf_len..], b[0..off]);
self.state.update(self.buf[0..]);
self.buf_len = 0;
}
const remain_len = b.len - off;
const aligned_len = remain_len - (remain_len % 32);
self.state.update(b[off .. off + aligned_len]);
mem.copy(u8, self.buf[self.buf_len..], b[off + aligned_len ..]);
self.buf_len += @intCast(u8, b[off + aligned_len ..].len);
}
pub fn final(self: *Wyhash) u64 {
const rem_key = self.buf[0..self.buf_len];
return self.state.final(rem_key);
}
pub fn hash(seed: u64, input: []const u8) u64 {
return WyhashStateless.hash(seed, input);
}
};
const expectEqual = std.testing.expectEqual;
test "test vectors" {
const hash = Wyhash.hash;
try expectEqual(hash(0, ""), 0x0);
try expectEqual(hash(1, "a"), 0xbed235177f41d328);
try expectEqual(hash(2, "abc"), 0xbe348debe59b27c3);
try expectEqual(hash(3, "message digest"), 0x37320f657213a290);
try expectEqual(hash(4, "abcdefghijklmnopqrstuvwxyz"), 0xd0b270e1d8a7019c);
try expectEqual(hash(5, "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789"), 0x602a1894d3bbfe7f);
try expectEqual(hash(6, "12345678901234567890123456789012345678901234567890123456789012345678901234567890"), 0x829e9c148b75970e);
}
test "test vectors streaming" {
var wh = Wyhash.init(5);
for ("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789") |e| {
wh.update(mem.asBytes(&e));
}
try expectEqual(wh.final(), 0x602a1894d3bbfe7f);
const pattern = "1234567890";
const count = 8;
const result = 0x829e9c148b75970e;
try expectEqual(Wyhash.hash(6, pattern ** 8), result);
wh = Wyhash.init(6);
var i: u32 = 0;
while (i < count) : (i += 1) {
wh.update(pattern);
}
try expectEqual(wh.final(), result);
}
test "iterative non-divisible update" {
var buf: [8192]u8 = undefined;
for (buf) |*e, i| {
e.* = @truncate(u8, i);
}
const seed = 0x128dad08f;
var end: usize = 32;
while (end < buf.len) : (end += 32) {
const non_iterative_hash = Wyhash.hash(seed, buf[0..end]);
var wy = Wyhash.init(seed);
var i: usize = 0;
while (i < end) : (i += 33) {
wy.update(buf[i..std.math.min(i + 33, end)]);
}
const iterative_hash = wy.final();
try std.testing.expectEqual(iterative_hash, non_iterative_hash);
}
}


@@ -1,22 +1,17 @@
const std = @import("std");
const builtin = @import("builtin");
const STATIC_MEMORY_SIZE = 256000;
pub var static_manager: ?std.heap.FixedBufferAllocator = null;
pub var dynamic_manager: ?std.heap.ArenaAllocator = null;
pub var root_manager: ?std.heap.ArenaAllocator = null;
pub var static_manager: ?std.heap.ArenaAllocator = null;
pub var root_manager: ?RootAlloc = null;
pub var needs_setup: bool = true;
pub var static: *std.mem.Allocator = undefined;
pub var dynamic: *std.mem.Allocator = undefined;
pub fn setup(root: *std.mem.Allocator) !void {
root_manager = std.heap.ArenaAllocator.init(root);
var buf = try root_manager.?.child_allocator.alloc(u8, STATIC_MEMORY_SIZE);
dynamic_manager = std.heap.ArenaAllocator.init(root_manager.?.child_allocator);
static_manager = std.heap.FixedBufferAllocator.init(buf);
static = root_manager.?.child_allocator;
dynamic_manager = std.heap.ArenaAllocator.init(root);
dynamic = dynamic_manager.?.child_allocator;
needs_setup = false;
static = root;
dynamic = root;
// static = @ptrCast(*std.mem.Allocator, &stat.allocator);
}
@@ -25,3 +20,177 @@ test "GlobalAllocator" {
var testType = try static.alloc(u8, 10);
testType[1] = 1;
}
pub const HunkSide = struct {
pub const VTable = struct {
alloc: fn (self: *Hunk, n: usize, alignment: u29) std.mem.Allocator.Error![]u8,
getMark: fn (self: *Hunk) usize,
freeToMark: fn (self: *Hunk, pos: usize) void,
};
hunk: *Hunk,
vtable: *const VTable,
allocator: std.mem.Allocator,
pub fn init(hunk: *Hunk, vtable: *const VTable) HunkSide {
return .{
.hunk = hunk,
.vtable = vtable,
.allocator = .{
.allocFn = allocFn,
.resizeFn = resizeFn,
},
};
}
pub fn getMark(self: HunkSide) usize {
return self.vtable.getMark(self.hunk);
}
pub fn freeToMark(self: HunkSide, pos: usize) void {
self.vtable.freeToMark(self.hunk, pos);
}
fn allocFn(allocator: *std.mem.Allocator, len: usize, ptr_align: u29, len_align: u29, ret_addr: usize) std.mem.Allocator.Error![]u8 {
const self = @fieldParentPtr(HunkSide, "allocator", allocator);
return try self.vtable.alloc(self.hunk, len, ptr_align);
}
fn resizeFn(allocator: *std.mem.Allocator, old_mem: []u8, old_align: u29, new_size: usize, len_align: u29, ret_addr: usize) std.mem.Allocator.Error!usize {
if (new_size > old_mem.len) {
return error.OutOfMemory;
}
if (new_size == 0) {
return 0;
}
return std.mem.alignAllocLen(old_mem.len, new_size, len_align);
}
};
pub const Hunk = struct {
low_used: usize,
high_used: usize,
buffer: []u8,
pub fn init(buffer: []u8) Hunk {
return .{
.low_used = 0,
.high_used = 0,
.buffer = buffer,
};
}
pub fn low(self: *Hunk) HunkSide {
const GlobalStorage = struct {
const vtable: HunkSide.VTable = .{
.alloc = allocLow,
.getMark = getLowMark,
.freeToMark = freeToLowMark,
};
};
return HunkSide.init(self, &GlobalStorage.vtable);
}
pub fn high(self: *Hunk) HunkSide {
const GlobalStorage = struct {
const vtable: HunkSide.VTable = .{
.alloc = allocHigh,
.getMark = getHighMark,
.freeToMark = freeToHighMark,
};
};
return HunkSide.init(self, &GlobalStorage.vtable);
}
pub fn allocLow(self: *Hunk, n: usize, alignment: u29) ![]u8 {
const start = @ptrToInt(self.buffer.ptr);
const adjusted_index = std.mem.alignForward(start + self.low_used, alignment) - start;
const new_low_used = adjusted_index + n;
if (new_low_used > self.buffer.len - self.high_used) {
return error.OutOfMemory;
}
const result = self.buffer[adjusted_index..new_low_used];
self.low_used = new_low_used;
return result;
}
pub fn allocHigh(self: *Hunk, n: usize, alignment: u29) ![]u8 {
const addr = @ptrToInt(self.buffer.ptr) + self.buffer.len - self.high_used;
const rem = @rem(addr, alignment);
const march_backward_bytes = rem;
const adjusted_index = self.high_used + march_backward_bytes;
const new_high_used = adjusted_index + n;
if (new_high_used > self.buffer.len - self.low_used) {
return error.OutOfMemory;
}
const start = self.buffer.len - adjusted_index - n;
const result = self.buffer[start .. start + n];
self.high_used = new_high_used;
return result;
}
pub fn getLowMark(self: *Hunk) usize {
return self.low_used;
}
pub fn getHighMark(self: *Hunk) usize {
return self.high_used;
}
pub fn freeToLowMark(self: *Hunk, pos: usize) void {
std.debug.assert(pos <= self.low_used);
if (pos < self.low_used) {
if (std.builtin.mode == std.builtin.Mode.Debug) {
std.mem.set(u8, self.buffer[pos..self.low_used], 0xcc);
}
self.low_used = pos;
}
}
pub fn freeToHighMark(self: *Hunk, pos: usize) void {
std.debug.assert(pos <= self.high_used);
if (pos < self.high_used) {
if (std.builtin.mode == std.builtin.Mode.Debug) {
const i = self.buffer.len - self.high_used;
const n = self.high_used - pos;
std.mem.set(u8, self.buffer[i .. i + n], 0xcc);
}
self.high_used = pos;
}
}
};
test "Hunk" {
// test a few random operations. very low coverage. write more later
var buf: [100]u8 = undefined;
var hunk = Hunk.init(buf[0..]);
const high_mark = hunk.getHighMark();
_ = try hunk.low().allocator.alloc(u8, 7);
_ = try hunk.high().allocator.alloc(u8, 8);
try std.testing.expectEqual(@as(usize, 7), hunk.low_used);
try std.testing.expectEqual(@as(usize, 8), hunk.high_used);
_ = try hunk.high().allocator.alloc(u8, 8);
try std.testing.expectEqual(@as(usize, 16), hunk.high_used);
const low_mark = hunk.getLowMark();
_ = try hunk.low().allocator.alloc(u8, 100 - 7 - 16);
try std.testing.expectEqual(@as(usize, 100 - 16), hunk.low_used);
try std.testing.expectError(error.OutOfMemory, hunk.high().allocator.alloc(u8, 1));
hunk.freeToLowMark(low_mark);
_ = try hunk.high().allocator.alloc(u8, 1);
hunk.freeToHighMark(high_mark);
try std.testing.expectEqual(@as(usize, 0), hunk.high_used);
}

639 src/allocators.zig Normal file

@@ -0,0 +1,639 @@
const std = @import("std");
const Wyhash = std.hash.Wyhash;
const FixedBufferAllocator = std.heap.FixedBufferAllocator;
// https://en.wikipedia.org/wiki/.bss#BSS_in_C
pub fn BSSSectionAllocator(comptime size: usize) type {
return struct {
var backing_buf: [size]u8 = undefined;
var fixed_buffer_allocator = FixedBufferAllocator.init(&backing_buf);
var buf_allocator = &fixed_buffer_allocator.allocator;
const Allocator = std.mem.Allocator;
const Self = @This();
allocator: Allocator,
fallback_allocator: *Allocator,
is_overflowed: bool = false,
pub fn get(self: *Self) *Allocator {
return &self.allocator;
}
pub fn init(fallback_allocator: *Allocator) Self {
return Self{ .fallback_allocator = fallback_allocator, .allocator = Allocator{
.allocFn = BSSSectionAllocator(size).alloc,
.resizeFn = BSSSectionAllocator(size).resize,
} };
}
pub fn alloc(
allocator: *Allocator,
len: usize,
ptr_align: u29,
len_align: u29,
return_address: usize,
) error{OutOfMemory}![]u8 {
const self = @fieldParentPtr(Self, "allocator", allocator);
return buf_allocator.allocFn(buf_allocator, len, ptr_align, len_align, return_address) catch |err| {
self.is_overflowed = true;
return self.fallback_allocator.allocFn(self.fallback_allocator, len, ptr_align, len_align, return_address);
};
}
pub fn resize(
allocator: *Allocator,
buf: []u8,
buf_align: u29,
new_len: usize,
len_align: u29,
return_address: usize,
) error{OutOfMemory}!usize {
const self = @fieldParentPtr(Self, "allocator", allocator);
if (fixed_buffer_allocator.ownsPtr(buf.ptr)) {
return fixed_buffer_allocator.allocator.resizeFn(&fixed_buffer_allocator.allocator, buf, buf_align, new_len, len_align, return_address);
} else {
return self.fallback_allocator.resizeFn(self.fallback_allocator, buf, buf_align, new_len, len_align, return_address);
}
}
};
}
pub fn isSliceInBuffer(slice: anytype, buffer: anytype) bool {
return (@ptrToInt(buffer) <= @ptrToInt(slice.ptr) and (@ptrToInt(slice.ptr) + slice.len) <= (@ptrToInt(buffer) + buffer.len));
}
pub const IndexType = packed struct {
index: u31,
is_overflow: bool = false,
};
const HashKeyType = u64;
const IndexMap = std.HashMapUnmanaged(HashKeyType, IndexType, struct {
pub fn hash(ctx: @This(), key: HashKeyType) HashKeyType {
return key;
}
pub fn eql(ctx: @This(), a: HashKeyType, b: HashKeyType) bool {
return a == b;
}
}, 80);
pub const Result = struct {
hash: HashKeyType,
index: IndexType,
status: ItemStatus,
pub fn hasCheckedIfExists(r: *const Result) bool {
return r.index.index != Unassigned.index;
}
pub fn isOverflowing(r: *const Result, comptime count: usize) bool {
return r.index.index >= count;
}
pub fn realIndex(r: *const Result, comptime count: anytype) IndexType {
return if (r.isOverflowing(count)) IndexType{ .index = @intCast(u31, r.index.index - count) } else r.index;
}
};
const Seed = 999;
pub const NotFound = IndexType{
.index = std.math.maxInt(u31),
};
pub const Unassigned = IndexType{
.index = std.math.maxInt(u31) - 1,
};
pub const ItemStatus = enum(u3) {
unknown,
exists,
not_found,
};
fn hasDeinit(comptime ValueType: type) bool {
return comptime std.meta.trait.hasFn("deinit")(ValueType);
}
pub fn BSSList(comptime ValueType: type, comptime _count: anytype) type {
const count = _count * 2;
const max_index = count - 1;
return struct {
pub var backing_buf: [count]ValueType = undefined;
pub var backing_buf_used: u16 = 0;
const Allocator = std.mem.Allocator;
const Self = @This();
overflow_list: std.ArrayListUnmanaged(ValueType),
allocator: *Allocator,
pub var instance: Self = undefined;
pub fn init(allocator: *std.mem.Allocator) *Self {
instance = Self{
.allocator = allocator,
.overflow_list = std.ArrayListUnmanaged(ValueType){},
};
return &instance;
}
pub fn isOverflowing() bool {
return backing_buf_used >= @as(u16, count);
}
pub fn at(self: *const Self, index: IndexType) ?*ValueType {
if (index.index == NotFound.index or index.index == Unassigned.index) return null;
if (index.is_overflow) {
return &self.overflow_list.items[index.index];
} else {
return &backing_buf[index.index];
}
}
pub fn exists(self: *Self, value: ValueType) bool {
return isSliceInBuffer(value, &backing_buf);
}
pub fn append(self: *Self, value: ValueType) !IndexType {
var result = IndexType{ .index = std.math.maxInt(u31), .is_overflow = backing_buf_used > max_index };
if (result.is_overflow) {
result.index = @intCast(u31, self.overflow_list.items.len);
try self.overflow_list.append(self.allocator, value);
} else {
result.index = backing_buf_used;
backing_buf[result.index] = value;
backing_buf_used += 1;
if (backing_buf_used >= max_index) {
self.overflow_list = try @TypeOf(self.overflow_list).initCapacity(self.allocator, count);
}
}
return result;
}
pub fn update(self: *Self, result: *IndexType, value: ValueType) !*ValueType {
if (result.index.index == NotFound.index or result.index.index == Unassigned.index) {
result.index.is_overflow = backing_buf_used > max_index;
if (result.index.is_overflow) {
result.index.index = @intCast(u31, self.overflow_list.items.len);
} else {
result.index.index = backing_buf_used;
backing_buf_used += 1;
if (backing_buf_used >= max_index) {
self.overflow_list = try @TypeOf(self.overflow_list).initCapacity(self.allocator, count);
}
}
}
if (result.index.is_overflow) {
if (self.overflow_list.items.len == result.index.index) {
try self.overflow_list.append(self.allocator, value);
} else {
self.overflow_list.items[result.index.index] = value;
}
return &self.overflow_list.items[result.index.index];
} else {
backing_buf[result.index.index] = value;
return &backing_buf[result.index.index];
}
}
pub fn remove(self: *Self, index: IndexType) void {
@compileError("Not implemented yet.");
// switch (index) {
// Unassigned.index => {
// self.index.remove(_key);
// },
// NotFound.index => {
// self.index.remove(_key);
// },
// 0...max_index => {
// if (hasDeinit(ValueType)) {
// backing_buf[index].deinit();
// }
// backing_buf[index] = undefined;
// },
// else => {
// const i = index - count;
// if (hasDeinit(ValueType)) {
// self.overflow_list.items[i].deinit();
// }
// self.overflow_list.items[index - count] = undefined;
// },
// }
// return index;
}
};
}
pub fn BSSStringList(comptime _count: usize, comptime _item_length: usize) type {
// + 1 for sentinel
const item_length = _item_length + 1;
const count = _count * 2;
const max_index = count - 1;
const ValueType = []const u8;
return struct {
pub var slice_buf: [count][]const u8 = undefined;
pub var slice_buf_used: u16 = 0;
pub var backing_buf: [count * item_length]u8 = undefined;
pub var backing_buf_used: u64 = 0;
const Allocator = std.mem.Allocator;
const Self = @This();
overflow_list: std.ArrayListUnmanaged(ValueType),
allocator: *Allocator,
pub var instance: Self = undefined;
pub fn init(allocator: *std.mem.Allocator) *Self {
instance = Self{
.allocator = allocator,
.overflow_list = std.ArrayListUnmanaged(ValueType){},
};
return &instance;
}
pub fn isOverflowing() bool {
return slice_buf_used >= @as(u16, count);
}
pub fn at(self: *const Self, index: IndexType) ?ValueType {
if (index.index == NotFound.index or index.index == Unassigned.index) return null;
if (index.is_overflow) {
return self.overflow_list.items[index.index];
} else {
return slice_buf[index.index];
}
}
pub fn exists(self: *Self, value: ValueType) bool {
return isSliceInBuffer(value, &backing_buf);
}
pub fn editableSlice(slice: []const u8) []u8 {
return constStrToU8(slice);
}
pub fn append(self: *Self, comptime AppendType: type, _value: AppendType) ![]const u8 {
const value_len: usize = brk: {
switch (comptime AppendType) {
[]const u8, []u8 => {
break :brk _value.len;
},
else => {
var len: usize = 0;
for (_value) |val| {
len += val.len;
}
break :brk len;
},
}
unreachable;
} + 1;
var value: [:0]u8 = undefined;
if (value_len + backing_buf_used < backing_buf.len - 1) {
const start = backing_buf_used;
backing_buf_used += value_len;
switch (AppendType) {
[]const u8, []u8 => {
std.mem.copy(u8, backing_buf[start .. backing_buf_used - 1], _value);
backing_buf[backing_buf_used - 1] = 0;
},
else => {
var remainder = backing_buf[start..];
for (_value) |val| {
std.mem.copy(u8, remainder, val);
remainder = remainder[val.len..];
}
remainder[0] = 0;
},
}
value = backing_buf[start .. backing_buf_used - 1 :0];
} else {
var value_buf = try self.allocator.alloc(u8, value_len);
switch (comptime AppendType) {
[]const u8, []u8 => {
std.mem.copy(u8, value_buf, _value);
},
else => {
var remainder = value_buf;
for (_value) |val| {
std.mem.copy(u8, remainder, val);
remainder = remainder[val.len..];
}
},
}
value_buf[value_len - 1] = 0;
value = value_buf[0 .. value_len - 1 :0];
}
var result = IndexType{ .index = std.math.maxInt(u31), .is_overflow = slice_buf_used > max_index };
if (result.is_overflow) {
result.index = @intCast(u31, self.overflow_list.items.len);
} else {
result.index = slice_buf_used;
slice_buf_used += 1;
if (slice_buf_used >= max_index) {
self.overflow_list = try @TypeOf(self.overflow_list).initCapacity(self.allocator, count);
}
}
if (result.is_overflow) {
if (self.overflow_list.items.len == result.index) {
try self.overflow_list.append(self.allocator, value);
} else {
self.overflow_list.items[result.index] = value;
}
return self.overflow_list.items[result.index];
} else {
slice_buf[result.index] = value;
return slice_buf[result.index];
}
}
pub fn remove(self: *Self, index: IndexType) void {
@compileError("Not implemented yet.");
// switch (index) {
// Unassigned.index => {
// self.index.remove(_key);
// },
// NotFound.index => {
// self.index.remove(_key);
// },
// 0...max_index => {
// if (hasDeinit(ValueType)) {
// slice_buf[index].deinit();
// }
// slice_buf[index] = undefined;
// },
// else => {
// const i = index - count;
// if (hasDeinit(ValueType)) {
// self.overflow_list.items[i].deinit();
// }
// self.overflow_list.items[index - count] = undefined;
// },
// }
// return index;
}
};
}
pub fn BSSMap(comptime ValueType: type, comptime count: anytype, store_keys: bool, estimated_key_length: usize) type {
const max_index = count - 1;
const BSSMapType = struct {
pub var backing_buf: [count]ValueType = undefined;
pub var backing_buf_used: u16 = 0;
const Allocator = std.mem.Allocator;
const Self = @This();
index: IndexMap,
overflow_list: std.ArrayListUnmanaged(ValueType),
allocator: *Allocator,
pub var instance: Self = undefined;
pub fn init(allocator: *std.mem.Allocator) *Self {
instance = Self{
.index = IndexMap{},
.allocator = allocator,
.overflow_list = std.ArrayListUnmanaged(ValueType){},
};
return &instance;
}
pub fn isOverflowing() bool {
return backing_buf_used >= @as(u16, count);
}
pub fn getOrPut(self: *Self, key: []const u8) !Result {
const _key = Wyhash.hash(Seed, key);
var index = try self.index.getOrPut(self.allocator, _key);
if (index.found_existing) {
return Result{
.hash = _key,
.index = index.value_ptr.*,
.status = switch (index.value_ptr.index) {
NotFound.index => .not_found,
Unassigned.index => .unknown,
else => .exists,
},
};
}
index.value_ptr.* = Unassigned;
return Result{
.hash = _key,
.index = Unassigned,
.status = .unknown,
};
}
pub fn get(self: *const Self, key: []const u8) ?*ValueType {
const _key = Wyhash.hash(Seed, key);
const index = self.index.get(_key) orelse return null;
return self.atIndex(index);
}
pub fn markNotFound(self: *Self, result: Result) void {
self.index.put(self.allocator, result.hash, NotFound) catch unreachable;
}
pub fn atIndex(self: *const Self, index: IndexType) ?*ValueType {
if (index.index == NotFound.index or index.index == Unassigned.index) return null;
if (index.is_overflow) {
return &self.overflow_list.items[index.index];
} else {
return &backing_buf[index.index];
}
}
pub fn put(self: *Self, result: *Result, value: ValueType) !*ValueType {
if (result.index.index == NotFound.index or result.index.index == Unassigned.index) {
result.index.is_overflow = backing_buf_used > max_index;
if (result.index.is_overflow) {
result.index.index = @intCast(u31, self.overflow_list.items.len);
} else {
result.index.index = backing_buf_used;
backing_buf_used += 1;
if (backing_buf_used >= max_index) {
self.overflow_list = try @TypeOf(self.overflow_list).initCapacity(self.allocator, count);
}
}
}
try self.index.put(self.allocator, result.hash, result.index);
if (result.index.is_overflow) {
if (self.overflow_list.items.len == result.index.index) {
try self.overflow_list.append(self.allocator, value);
} else {
} else {
self.overflow_list.items[result.index.index] = value;
}
return &self.overflow_list.items[result.index.index];
} else {
backing_buf[result.index.index] = value;
return &backing_buf[result.index.index];
}
}
pub fn remove(self: *Self, key: string) IndexType {
const _key = Wyhash.hash(Seed, key);
// A missing key used to fall through a bare `return` in a non-void
// function; report it as NotFound instead.
const index = self.index.get(_key) orelse return NotFound;
// IndexType is a struct, so switch on its integer field and use
// is_overflow to pick the region, matching atIndex above.
switch (index.index) {
Unassigned.index, NotFound.index => {
self.index.remove(_key);
},
else => {
if (index.is_overflow) {
if (hasDeinit(ValueType)) {
self.overflow_list.items[index.index].deinit();
}
self.overflow_list.items[index.index] = undefined;
} else {
if (hasDeinit(ValueType)) {
backing_buf[index.index].deinit();
}
backing_buf[index.index] = undefined;
}
},
}
return index;
}
};
if (!store_keys) {
return BSSMapType;
}
return struct {
map: *BSSMapType,
const Self = @This();
pub var instance: Self = undefined;
var key_list_buffer: [count * estimated_key_length]u8 = undefined;
var key_list_buffer_used: usize = 0;
var key_list_slices: [count][]u8 = undefined;
var key_list_overflow: std.ArrayListUnmanaged([]u8) = undefined;
pub fn init(allocator: *std.mem.Allocator) *Self {
instance = Self{
.map = BSSMapType.init(allocator),
};
return &instance;
}
pub fn isOverflowing() bool {
// backing_buf_used is a container-level declaration, not an instance field.
return BSSMapType.backing_buf_used >= count;
}
pub fn getOrPut(self: *Self, key: []const u8) !Result {
return try self.map.getOrPut(key);
}
pub fn get(self: *Self, key: []const u8) ?*ValueType {
return @call(.{ .modifier = .always_inline }, BSSMapType.get, .{ self.map, key });
}
pub fn atIndex(self: *Self, index: IndexType) ?*ValueType {
return @call(.{ .modifier = .always_inline }, BSSMapType.atIndex, .{ self.map, index });
}
pub fn keyAtIndex(self: *Self, index: IndexType) ?[]const u8 {
return switch (index.index) {
Unassigned.index, NotFound.index => null,
else => {
if (!index.is_overflow) {
return key_list_slices[index.index];
} else {
return key_list_overflow.items[index.index];
}
},
};
}
pub fn put(self: *Self, key: anytype, comptime store_key: bool, result: *Result, value: ValueType) !*ValueType {
var ptr = try self.map.put(result, value);
if (store_key) {
try self.putKey(key, result);
}
return ptr;
}
pub fn isKeyStaticallyAllocated(key: anytype) bool {
return isSliceInBuffer(key, &key_list_buffer);
}
// There are two parts to this.
// 1. Storing the underlying string.
// 2. Making the key accessible at the index.
pub fn putKey(self: *Self, key: anytype, result: *Result) !void {
var slice: []u8 = undefined;
// Is this actually a slice into the map? Don't free it.
if (isKeyStaticallyAllocated(key)) {
slice = constStrToU8(key);
} else if (key_list_buffer_used + key.len < key_list_buffer.len) {
const start = key_list_buffer_used;
key_list_buffer_used += key.len;
slice = key_list_buffer[start..key_list_buffer_used];
std.mem.copy(u8, slice, key);
} else {
slice = try self.map.allocator.dupe(u8, key);
}
if (!result.index.is_overflow) {
key_list_slices[result.index.index] = slice;
} else {
if (@intCast(u31, key_list_overflow.items.len) > result.index.index) {
const existing_slice = key_list_overflow.items[result.index.index];
if (!isKeyStaticallyAllocated(existing_slice)) {
self.map.allocator.free(existing_slice);
}
key_list_overflow.items[result.index.index] = slice;
} else {
try key_list_overflow.append(self.map.allocator, slice);
}
}
}
pub fn markNotFound(self: *Self, result: Result) void {
self.map.markNotFound(result);
}
// For now, don't free the keys.
pub fn remove(self: *Self, key: string) IndexType {
return self.map.remove(key);
}
};
}
pub fn constStrToU8(s: []const u8) []u8 {
return @intToPtr([*]u8, @ptrToInt(s.ptr))[0..s.len];
}

src/api/demo/.gitignore (vendored, new file, 34 lines)

@@ -0,0 +1,34 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
# dependencies
/node_modules
/.pnp
.pnp.js
# testing
/coverage
# next.js
/.next/
/out/
# production
/build
# misc
.DS_Store
*.pem
# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# local env files
.env.local
.env.development.local
.env.test.local
.env.production.local
# vercel
.vercel

src/api/demo/README.md (new file, 34 lines)

@@ -0,0 +1,34 @@
This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app).
## Getting Started
First, run the development server:
```bash
npm run dev
# or
yarn dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
You can start editing the page by modifying `pages/index.js`. The page auto-updates as you edit the file.
[API routes](https://nextjs.org/docs/api-routes/introduction) can be accessed on [http://localhost:3000/api/hello](http://localhost:3000/api/hello). This endpoint can be edited in `pages/api/hello.js`.
The `pages/api` directory is mapped to `/api/*`. Files in this directory are treated as [API routes](https://nextjs.org/docs/api-routes/introduction) instead of React pages.
## Learn More
To learn more about Next.js, take a look at the following resources:
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.
You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js/) - your feedback and contributions are welcome!
## Deploy on Vercel
The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.
Check out our [Next.js deployment documentation](https://nextjs.org/docs/deployment) for more details.

src/api/demo/lib/api.ts (new file, 184 lines)

@@ -0,0 +1,184 @@
import * as Schema from "../../schema";
import { ByteBuffer } from "peechy/bb";
// import { transform as sucraseTransform } from "sucrase";
export interface WebAssemblyModule {
init(): number;
transform(a: number): number;
malloc(a: number): number;
calloc(a: number): number;
realloc(a: number): number;
free(a: number): number;
cycleStart(): void;
cycleEnd(): void;
}
const wasm_imports_sym: symbol | string =
process.env.NODE_ENV === "development"
? "wasm_imports"
: Symbol("wasm_imports");
const ptr_converter = new ArrayBuffer(8);
const ptr_float = new Float64Array(ptr_converter);
const slice = new Uint32Array(ptr_converter);
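The three declarations above implement a pointer-packing trick: a `Float64Array` and a `Uint32Array` view the same 8-byte buffer, so a single f64 crossing the WASM boundary can carry a (pointer, length) pair in its two 32-bit halves. A standalone sketch (assumes little-endian word layout, which WebAssembly guarantees):

```javascript
// One 8-byte buffer, viewed both as a double and as two 32-bit words.
const converter = new ArrayBuffer(8);
const asFloat = new Float64Array(converter);
const asU32 = new Uint32Array(converter);

// Pack (ptr, len) into a single f64: ptr in the low word, len in the
// high word. Values this size never form a NaN bit pattern, so the
// double round-trips exactly.
function packPtrLen(ptr, len) {
  asU32[0] = ptr;
  asU32[1] = len;
  return asFloat[0];
}

// Reverse of the above; mirrors what _wasmPtrToSlice does with `offset`.
function unpackPtrLen(f64) {
  asFloat[0] = f64;
  return { ptr: asU32[0], len: asU32[1] };
}

console.log(unpackPtrLen(packPtrLen(1024, 32))); // → { ptr: 1024, len: 32 }
```

This avoids returning a struct or making two calls per slice: the WASM side hands back one number, and `_wasmPtrToSlice` splits it back into an offset and a length.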
var scratch: Uint8Array;
export class ESDev {
static has_initialized = false;
static wasm_source: WebAssembly.WebAssemblyInstantiatedSource = null;
static get wasm_exports(): WebAssemblyModule {
return ESDev.wasm_source.instance.exports as any;
}
static get memory() {
return ESDev[wasm_imports_sym].memory as WebAssembly.Memory;
}
static memory_array: Uint8Array;
static _decoder: TextDecoder;
static _wasmPtrToSlice(offset: number) {
if (ESDev.memory_array.buffer !== ESDev.memory.buffer) {
ESDev.memory_array = new Uint8Array(ESDev.memory.buffer);
}
ptr_float[0] = offset;
return ESDev.memory_array.subarray(slice[0], slice[0] + slice[1]);
}
static _wasmPtrLenToString(slice: number) {
if (!ESDev._decoder) {
ESDev._decoder = new TextDecoder("utf8");
}
const region = this._wasmPtrToSlice(slice);
return ESDev._decoder.decode(region);
}
// We don't want people to be calling these manually
static [wasm_imports_sym] = {
console_log(slice: number) {
console.log(ESDev._wasmPtrLenToString(slice));
},
console_error(slice: number) {
console.error(ESDev._wasmPtrLenToString(slice));
},
console_warn(slice: number) {
console.warn(ESDev._wasmPtrLenToString(slice));
},
console_info(slice: number) {
console.info(ESDev._wasmPtrLenToString(slice));
},
memory: null,
// __indirect_function_table: new WebAssembly.Table({
// initial: 0,
// element: "anyfunc",
// }),
// __stack_pointer: new WebAssembly.Global({
// mutable: true,
// value: "i32",
// }),
// __multi3(one: number, two: number) {
// return Math.imul(one | 0, two | 0);
// },
// fmod(one: number, two: number) {
// return one % two;
// },
// memset(ptr: number, value: number, len: number) {
// ESDev.memory_array.fill(value, ptr, ptr + len);
// },
// memcpy(ptr: number, value: number, len: number) {
// ESDev.memory_array.copyWithin(ptr, value, value + len);
// },
// // These functions convert a to an unsigned long long, rounding toward zero. Negative values all become zero.
// __fixunsdfti(a: number) {
// return Math.floor(a);
// },
// // These functions return the remainder of the unsigned division of a and b.
// __umodti3(a: number, b: number) {
// return (a | 0) % (b | 0);
// },
// // These functions return the quotient of the unsigned division of a and b.
// __udivti3(a: number, b: number) {
// return (a | 0) / (b | 0);
// },
// // These functions return the result of shifting a left by b bits.
// __ashlti3(a: number, b: number) {
// return (a | 0) >> (b | 0);
// },
// /* Returns: convert a to a double, rounding toward even. */
// __floatuntidf(a: number) {
// const mod = a % 2;
// if (mod === 0) {
// return Math.ceil(a);
// } else if (mod === 1) {
// return Math.floor(a);
// }
// },
};
static async init(url) {
// globalThis.sucraseTransform = sucraseTransform;
scratch = new Uint8Array(8096);
if (ESDev.has_initialized) {
return;
}
ESDev[wasm_imports_sym].memory = new WebAssembly.Memory({
initial: 20,
// shared: typeof SharedArrayBuffer !== "undefined",
maximum: typeof SharedArrayBuffer !== "undefined" ? 5000 : undefined,
});
ESDev.wasm_source = await globalThis.WebAssembly.instantiateStreaming(
fetch(url),
{ env: ESDev[wasm_imports_sym] }
);
ESDev.memory_array = new Uint8Array(ESDev.memory.buffer);
const res = ESDev.wasm_exports.init();
if (res < 0) {
throw new Error(`[ESDev] Failed to initialize WASM module: code ${res}`);
} else {
console.log("WASM loaded.");
}
ESDev.has_initialized = true;
}
static transform(content: Uint8Array, file_name: string) {
if (!ESDev.has_initialized) {
throw new Error("Please run await ESDev.init(wasm_url) before using this.");
}
// if (process.env.NODE_ENV === "development") {
// console.time("[ESDev] Transform " + file_name);
// }
const bb = new ByteBuffer(scratch);
bb.length = 0;
Schema.encodeTransform(
{
contents: content,
path: file_name,
},
bb
);
const data = bb.toUint8Array();
if (bb._data.buffer !== scratch.buffer) {
scratch = bb._data;
}
ESDev.wasm_exports.cycleStart();
const ptr = ESDev.wasm_exports.malloc(data.byteLength);
this._wasmPtrToSlice(ptr).set(data);
const resp_ptr = ESDev.wasm_exports.transform(ptr);
var _bb = new ByteBuffer(this._wasmPtrToSlice(resp_ptr));
const response = Schema.decodeTransformResponse(_bb);
ESDev.wasm_exports.cycleEnd();
return response;
}
}
globalThis.ESDev = ESDev;

src/api/demo/next-env.d.ts (vendored, new file, 2 lines)

@@ -0,0 +1,2 @@
/// <reference types="next" />
/// <reference types="next/types/global" />

src/api/demo/package-lock.json (generated, new file, 7580 lines; diff suppressed because it is too large)

src/api/demo/package.json (new file, 23 lines)

@@ -0,0 +1,23 @@
{
"name": "demo",
"version": "0.1.0",
"private": true,
"scripts": {
"dev": "next dev",
"build": "next build",
"start": "next start"
},
"dependencies": {
"next": "10.2.0",
"peechy": "^0.4.5",
"react": "17.0.2",
"react-dom": "17.0.2",
"sucrase": "^3.18.1"
},
"devDependencies": {
"@types/react": "^17.0.8",
"typescript": "^4.3.2",
"webpack": "^5.38.1",
"webpack-cli": "^4.7.0"
}
}


@@ -0,0 +1,7 @@
import '../styles/globals.css'
function MyApp({ Component, pageProps }) {
return <Component {...pageProps} />
}
export default MyApp


@@ -0,0 +1,5 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
export default (req, res) => {
res.status(200).json({ name: 'John Doe' })
}


@@ -0,0 +1,70 @@
import Head from "next/head";
import Image from "next/image";
import styles from "../styles/Home.module.css";
import "../lib/api.ts";
export default function Home() {
return (
<div className={styles.container}>
<Head>
<title>Create Next App</title>
<meta name="description" content="Generated by create next app" />
<link rel="icon" href="/favicon.ico" />
</Head>
<main className={styles.main}>
<h1 className={styles.title}>
Welcome to <a href="https://nextjs.org">Next.js!</a>
</h1>
<p className={styles.description}>
Get started by editing{" "}
<code className={styles.code}>pages/index.js</code>
</p>
<div className={styles.grid}>
<a href="https://nextjs.org/docs" className={styles.card}>
<h2>Documentation &rarr;</h2>
<p>Find in-depth information about Next.js features and API.</p>
</a>
<a href="https://nextjs.org/learn" className={styles.card}>
<h2>Learn &rarr;</h2>
<p>Learn about Next.js in an interactive course with quizzes!</p>
</a>
<a
href="https://github.com/vercel/next.js/tree/master/examples"
className={styles.card}
>
<h2>Examples &rarr;</h2>
<p>Discover and deploy boilerplate example Next.js projects.</p>
</a>
<a
href="https://vercel.com/new?utm_source=create-next-app&utm_medium=default-template&utm_campaign=create-next-app"
className={styles.card}
>
<h2>Deploy &rarr;</h2>
<p>
Instantly deploy your Next.js site to a public URL with Vercel.
</p>
</a>
</div>
</main>
<footer className={styles.footer}>
<a
href="https://vercel.com?utm_source=create-next-app&utm_medium=default-template&utm_campaign=create-next-app"
target="_blank"
rel="noopener noreferrer"
>
Powered by{" "}
<span className={styles.logo}>
<Image src="/vercel.svg" alt="Vercel Logo" width={72} height={16} />
</span>
</a>
</footer>
</div>
);
}

src/api/demo/pnpm-lock.yaml (generated, new file, 2038 lines; diff suppressed because it is too large)

Binary file not shown (new file, 15 KiB).


@@ -0,0 +1,4 @@
<svg width="283" height="64" viewBox="0 0 283 64" fill="none"
xmlns="http://www.w3.org/2000/svg">
<path d="M141.04 16c-11.04 0-19 7.2-19 18s8.96 18 20 18c6.67 0 12.55-2.64 16.19-7.09l-7.65-4.42c-2.02 2.21-5.09 3.5-8.54 3.5-4.79 0-8.86-2.5-10.37-6.5h28.02c.22-1.12.35-2.28.35-3.5 0-10.79-7.96-17.99-19-17.99zm-9.46 14.5c1.25-3.99 4.67-6.5 9.45-6.5 4.79 0 8.21 2.51 9.45 6.5h-18.9zM248.72 16c-11.04 0-19 7.2-19 18s8.96 18 20 18c6.67 0 12.55-2.64 16.19-7.09l-7.65-4.42c-2.02 2.21-5.09 3.5-8.54 3.5-4.79 0-8.86-2.5-10.37-6.5h28.02c.22-1.12.35-2.28.35-3.5 0-10.79-7.96-17.99-19-17.99zm-9.45 14.5c1.25-3.99 4.67-6.5 9.45-6.5 4.79 0 8.21 2.51 9.45 6.5h-18.9zM200.24 34c0 6 3.92 10 10 10 4.12 0 7.21-1.87 8.8-4.92l7.68 4.43c-3.18 5.3-9.14 8.49-16.48 8.49-11.05 0-19-7.2-19-18s7.96-18 19-18c7.34 0 13.29 3.19 16.48 8.49l-7.68 4.43c-1.59-3.05-4.68-4.92-8.8-4.92-6.07 0-10 4-10 10zm82.48-29v46h-9V5h9zM36.95 0L73.9 64H0L36.95 0zm92.38 5l-27.71 48L73.91 5H84.3l17.32 30 17.32-30h10.39zm58.91 12v9.69c-1-.29-2.06-.49-3.2-.49-5.81 0-10 4-10 10V51h-9V17h9v9.2c0-5.08 5.91-9.2 13.2-9.2z" fill="#000"/>
</svg>


src/api/demo/schema.js (new file, 725 lines)

@@ -0,0 +1,725 @@
const Loader = {
"1": 1,
"2": 2,
"3": 3,
"4": 4,
"5": 5,
"6": 6,
"7": 7,
jsx: 1,
js: 2,
ts: 3,
tsx: 4,
css: 5,
file: 6,
json: 7
};
const LoaderKeys = {
"1": "jsx",
"2": "js",
"3": "ts",
"4": "tsx",
"5": "css",
"6": "file",
"7": "json",
jsx: "jsx",
js: "js",
ts: "ts",
tsx: "tsx",
css: "css",
file: "file",
json: "json"
};
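The generated tables above are bidirectional: `Loader` maps both names and stringified tag numbers to the wire value, while `LoaderKeys` maps both back to the canonical name. A trimmed copy for illustration (only three of the seven loaders shown, values taken from the tables above):

```javascript
// Trimmed copies of the generated lookup tables.
const Loader = { "1": 1, "2": 2, "3": 3, jsx: 1, js: 2, ts: 3 };
const LoaderKeys = { "1": "jsx", "2": "js", "3": "ts", jsx: "jsx", js: "js", ts: "ts" };

console.log(Loader["ts"]); // → 3 (name to wire value)
console.log(LoaderKeys[Loader["ts"]]); // → "ts" (numeric key coerces to the string "3")
console.log(LoaderKeys["js"]); // → "js" (names map to themselves)
```

This is why the decoders below can write `Loader[bb.readByte()]` and the encoders `Loader[value]` with the same table: a byte off the wire and a name from user input both resolve to the same tag.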
const ResolveMode = {
"1": 1,
"2": 2,
"3": 3,
"4": 4,
disable: 1,
lazy: 2,
dev: 3,
bundle: 4
};
const ResolveModeKeys = {
"1": "disable",
"2": "lazy",
"3": "dev",
"4": "bundle",
disable: "disable",
lazy: "lazy",
dev: "dev",
bundle: "bundle"
};
const Platform = {
"1": 1,
"2": 2,
browser: 1,
node: 2
};
const PlatformKeys = {
"1": "browser",
"2": "node",
browser: "browser",
node: "node"
};
const JSXRuntime = {
"1": 1,
"2": 2,
automatic: 1,
classic: 2
};
const JSXRuntimeKeys = {
"1": "automatic",
"2": "classic",
automatic: "automatic",
classic: "classic"
};
function decodeJSX(bb) {
var result = {};
result["factory"] = bb.readString();
result["runtime"] = JSXRuntime[bb.readByte()];
result["fragment"] = bb.readString();
result["development"] = !!bb.readByte();
result["import_source"] = bb.readString();
result["react_fast_refresh"] = !!bb.readByte();
return result;
}
function encodeJSX(message, bb) {
var value = message["factory"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"factory\"");
var value = message["runtime"];
if (value != null) {
var encoded = JSXRuntime[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"JSXRuntime\"");
bb.writeByte(encoded);
} else
throw new Error("Missing required field \"runtime\"");
var value = message["fragment"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"fragment\"");
var value = message["development"];
if (value != null)
bb.writeByte(value);
else
throw new Error("Missing required field \"development\"");
var value = message["import_source"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"import_source\"");
var value = message["react_fast_refresh"];
if (value != null)
bb.writeByte(value);
else
throw new Error("Missing required field \"react_fast_refresh\"");
}
function decodeTransformOptions(bb) {
var result = {};
while (true)
switch (bb.readByte()) {
case 0:
return result;
case 1:
result["jsx"] = decodeJSX(bb);
break;
case 2:
result["tsconfig_override"] = bb.readString();
break;
case 3:
result["resolve"] = ResolveMode[bb.readByte()];
break;
case 4:
result["public_url"] = bb.readString();
break;
case 5:
result["absolute_working_dir"] = bb.readString();
break;
case 6:
var length = bb.readVarUint();
var values = result["define_keys"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 7:
var length = bb.readVarUint();
var values = result["define_values"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 8:
result["preserve_symlinks"] = !!bb.readByte();
break;
case 9:
var length = bb.readVarUint();
var values = result["entry_points"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 10:
result["write"] = !!bb.readByte();
break;
case 11:
var length = bb.readVarUint();
var values = result["inject"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 12:
result["output_dir"] = bb.readString();
break;
case 13:
var length = bb.readVarUint();
var values = result["external"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 14:
var length = bb.readVarUint();
var values = result["loader_keys"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 15:
var length = bb.readVarUint();
var values = result["loader_values"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = Loader[bb.readByte()];
break;
case 16:
var length = bb.readVarUint();
var values = result["main_fields"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 17:
result["platform"] = Platform[bb.readByte()];
break;
case 18:
result["serve"] = !!bb.readByte();
break;
case 19:
var length = bb.readVarUint();
var values = result["extension_order"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = bb.readString();
break;
case 20:
result["public_dir"] = bb.readString();
break;
default:
throw new Error("Attempted to parse invalid message");
}
}
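`decodeTransformOptions` reads a stream of (tag byte, payload) pairs and stops at a zero tag; that is how optional fields work on the wire: absent fields are simply never written. A self-contained sketch of the scheme, reduced to two fields, with a toy buffer (the real `ByteBuffer` comes from peechy; `MiniBuffer` here is a stand-in that only handles short ASCII strings):

```javascript
// Toy byte buffer: length-prefixed ASCII strings only.
class MiniBuffer {
  constructor() { this.bytes = []; this.pos = 0; }
  writeByte(b) { this.bytes.push(b & 0xff); }
  readByte() { return this.bytes[this.pos++]; }
  writeString(s) {
    this.writeByte(s.length);
    for (const ch of s) this.writeByte(ch.charCodeAt(0));
  }
  readString() {
    const n = this.readByte();
    let s = "";
    for (let i = 0; i < n; i++) s += String.fromCharCode(this.readByte());
    return s;
  }
}

// Same shape as encodeTransformOptions/decodeTransformOptions: present
// fields are written as (tag, payload); a 0 byte terminates the message.
function encodeOptions(message, bb) {
  if (message.tsconfig_override != null) { bb.writeByte(2); bb.writeString(message.tsconfig_override); }
  if (message.public_url != null) { bb.writeByte(4); bb.writeString(message.public_url); }
  bb.writeByte(0);
}

function decodeOptions(bb) {
  const result = {};
  while (true) switch (bb.readByte()) {
    case 0: return result;
    case 2: result.tsconfig_override = bb.readString(); break;
    case 4: result.public_url = bb.readString(); break;
    default: throw new Error("Attempted to parse invalid message");
  }
}

const bb = new MiniBuffer();
encodeOptions({ public_url: "/static/" }, bb);
console.log(decodeOptions(bb)); // → { public_url: '/static/' }
```

Tag numbers are the schema's field IDs, so old decoders can skip messages from newer encoders only if they know every tag; an unknown tag is a hard error here, which is why the generated decoder throws on `default`.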
function encodeTransformOptions(message, bb) {
var value = message["jsx"];
if (value != null) {
bb.writeByte(1);
encodeJSX(value, bb);
}
var value = message["tsconfig_override"];
if (value != null) {
bb.writeByte(2);
bb.writeString(value);
}
var value = message["resolve"];
if (value != null) {
bb.writeByte(3);
var encoded = ResolveMode[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"ResolveMode\"");
bb.writeByte(encoded);
}
var value = message["public_url"];
if (value != null) {
bb.writeByte(4);
bb.writeString(value);
}
var value = message["absolute_working_dir"];
if (value != null) {
bb.writeByte(5);
bb.writeString(value);
}
var value = message["define_keys"];
if (value != null) {
bb.writeByte(6);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["define_values"];
if (value != null) {
bb.writeByte(7);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["preserve_symlinks"];
if (value != null) {
bb.writeByte(8);
bb.writeByte(value);
}
var value = message["entry_points"];
if (value != null) {
bb.writeByte(9);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["write"];
if (value != null) {
bb.writeByte(10);
bb.writeByte(value);
}
var value = message["inject"];
if (value != null) {
bb.writeByte(11);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["output_dir"];
if (value != null) {
bb.writeByte(12);
bb.writeString(value);
}
var value = message["external"];
if (value != null) {
bb.writeByte(13);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["loader_keys"];
if (value != null) {
bb.writeByte(14);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["loader_values"];
if (value != null) {
bb.writeByte(15);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
var encoded = Loader[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"Loader\"");
bb.writeByte(encoded);
}
}
var value = message["main_fields"];
if (value != null) {
bb.writeByte(16);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["platform"];
if (value != null) {
bb.writeByte(17);
var encoded = Platform[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"Platform\"");
bb.writeByte(encoded);
}
var value = message["serve"];
if (value != null) {
bb.writeByte(18);
bb.writeByte(value);
}
var value = message["extension_order"];
if (value != null) {
bb.writeByte(19);
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
bb.writeString(value);
}
}
var value = message["public_dir"];
if (value != null) {
bb.writeByte(20);
bb.writeString(value);
}
bb.writeByte(0);
}
function decodeFileHandle(bb) {
var result = {};
result["path"] = bb.readString();
result["size"] = bb.readVarUint();
result["fd"] = bb.readVarUint();
return result;
}
function encodeFileHandle(message, bb) {
var value = message["path"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"path\"");
var value = message["size"];
if (value != null)
bb.writeVarUint(value);
else
throw new Error("Missing required field \"size\"");
var value = message["fd"];
if (value != null)
bb.writeVarUint(value);
else
throw new Error("Missing required field \"fd\"");
}
function decodeTransform(bb) {
var result = {};
while (true)
switch (bb.readByte()) {
case 0:
return result;
case 1:
result["handle"] = decodeFileHandle(bb);
break;
case 2:
result["path"] = bb.readString();
break;
case 3:
result["contents"] = bb.readByteArray();
break;
case 4:
result["loader"] = Loader[bb.readByte()];
break;
case 5:
result["options"] = decodeTransformOptions(bb);
break;
default:
throw new Error("Attempted to parse invalid message");
}
}
function encodeTransform(message, bb) {
var value = message["handle"];
if (value != null) {
bb.writeByte(1);
encodeFileHandle(value, bb);
}
var value = message["path"];
if (value != null) {
bb.writeByte(2);
bb.writeString(value);
}
var value = message["contents"];
if (value != null) {
bb.writeByte(3);
bb.writeByteArray(value);
}
var value = message["loader"];
if (value != null) {
bb.writeByte(4);
var encoded = Loader[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"Loader\"");
bb.writeByte(encoded);
}
var value = message["options"];
if (value != null) {
bb.writeByte(5);
encodeTransformOptions(value, bb);
}
bb.writeByte(0);
}
const TransformResponseStatus = {
"1": 1,
"2": 2,
success: 1,
fail: 2
};
const TransformResponseStatusKeys = {
"1": "success",
"2": "fail",
success: "success",
fail: "fail"
};
function decodeOutputFile(bb) {
var result = {};
result["data"] = bb.readByteArray();
result["path"] = bb.readString();
return result;
}
function encodeOutputFile(message, bb) {
var value = message["data"];
if (value != null)
bb.writeByteArray(value);
else
throw new Error("Missing required field \"data\"");
var value = message["path"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"path\"");
}
function decodeTransformResponse(bb) {
var result = {};
result["status"] = TransformResponseStatus[bb.readVarUint()];
var length = bb.readVarUint();
var values = result["files"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = decodeOutputFile(bb);
var length = bb.readVarUint();
var values = result["errors"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = decodeMessage(bb);
return result;
}
function encodeTransformResponse(message, bb) {
var value = message["status"];
if (value != null) {
var encoded = TransformResponseStatus[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"TransformResponseStatus\"");
bb.writeVarUint(encoded);
} else
throw new Error("Missing required field \"status\"");
var value = message["files"];
if (value != null) {
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
encodeOutputFile(value, bb);
}
} else
throw new Error("Missing required field \"files\"");
var value = message["errors"];
if (value != null) {
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
encodeMessage(value, bb);
}
} else
throw new Error("Missing required field \"errors\"");
}
const MessageKind = {
"1": 1,
"2": 2,
"3": 3,
"4": 4,
err: 1,
warn: 2,
note: 3,
debug: 4
};
const MessageKindKeys = {
"1": "err",
"2": "warn",
"3": "note",
"4": "debug",
err: "err",
warn: "warn",
note: "note",
debug: "debug"
};
function decodeLocation(bb) {
var result = {};
result["file"] = bb.readString();
result["namespace"] = bb.readString();
result["line"] = bb.readInt32();
result["column"] = bb.readInt32();
result["line_text"] = bb.readString();
result["suggestion"] = bb.readString();
result["offset"] = bb.readVarUint();
return result;
}
function encodeLocation(message, bb) {
var value = message["file"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"file\"");
var value = message["namespace"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"namespace\"");
var value = message["line"];
if (value != null)
bb.writeInt32(value);
else
throw new Error("Missing required field \"line\"");
var value = message["column"];
if (value != null)
bb.writeInt32(value);
else
throw new Error("Missing required field \"column\"");
var value = message["line_text"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"line_text\"");
var value = message["suggestion"];
if (value != null)
bb.writeString(value);
else
throw new Error("Missing required field \"suggestion\"");
var value = message["offset"];
if (value != null)
bb.writeVarUint(value);
else
throw new Error("Missing required field \"offset\"");
}
function decodeMessageData(bb) {
var result = {};
while (true)
switch (bb.readByte()) {
case 0:
return result;
case 1:
result["text"] = bb.readString();
break;
case 2:
result["location"] = decodeLocation(bb);
break;
default:
throw new Error("Attempted to parse invalid message");
}
}
function encodeMessageData(message, bb) {
var value = message["text"];
if (value != null) {
bb.writeByte(1);
bb.writeString(value);
}
var value = message["location"];
if (value != null) {
bb.writeByte(2);
encodeLocation(value, bb);
}
bb.writeByte(0);
}
function decodeMessage(bb) {
var result = {};
result["kind"] = MessageKind[bb.readVarUint()];
result["data"] = decodeMessageData(bb);
var length = bb.readVarUint();
var values = result["notes"] = Array(length);
for (var i = 0;i < length; i++)
values[i] = decodeMessageData(bb);
return result;
}
function encodeMessage(message, bb) {
var value = message["kind"];
if (value != null) {
var encoded = MessageKind[value];
if (encoded === undefined)
throw new Error("Invalid value " + JSON.stringify(value) + " for enum \"MessageKind\"");
bb.writeVarUint(encoded);
} else
throw new Error("Missing required field \"kind\"");
var value = message["data"];
if (value != null)
encodeMessageData(value, bb);
else
throw new Error("Missing required field \"data\"");
var value = message["notes"];
if (value != null) {
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0;i < n; i++) {
value = values[i];
encodeMessageData(value, bb);
}
} else
throw new Error("Missing required field \"notes\"");
}
function decodeLog(bb) {
var result = {};
result["warnings"] = bb.readUint32();
result["errors"] = bb.readUint32();
var length = bb.readVarUint();
var values = result["msgs"] = Array(length);
for (var i = 0; i < length; i++)
values[i] = decodeMessage(bb);
return result;
}
function encodeLog(message, bb) {
var value = message["warnings"];
if (value != null)
bb.writeUint32(value);
else
throw new Error("Missing required field \"warnings\"");
var value = message["errors"];
if (value != null)
bb.writeUint32(value);
else
throw new Error("Missing required field \"errors\"");
var value = message["msgs"];
if (value != null) {
var values = value, n = values.length;
bb.writeVarUint(n);
for (var i = 0; i < n; i++) {
value = values[i];
encodeMessage(value, bb);
}
} else
throw new Error("Missing required field \"msgs\"");
}
export {Loader};
export {LoaderKeys};
export {ResolveMode};
export {ResolveModeKeys};
export {Platform};
export {PlatformKeys};
export {JSXRuntime};
export {JSXRuntimeKeys};
export {decodeJSX};
export {encodeJSX};
export {decodeTransformOptions};
export {encodeTransformOptions};
export {decodeFileHandle};
export {encodeFileHandle};
export {decodeTransform};
export {encodeTransform};
export {TransformResponseStatus};
export {TransformResponseStatusKeys};
export {decodeOutputFile};
export {encodeOutputFile};
export {decodeTransformResponse};
export {encodeTransformResponse};
export {MessageKind};
export {MessageKindKeys};
export {decodeLocation};
export {encodeLocation};
export {decodeMessageData};
export {encodeMessageData};
export {decodeMessage};
export {encodeMessage};
export {decodeLog};
export {encodeLog};

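The generated `encodeMessageData`/`decodeMessageData` pair above shows the tag-terminated "message" wire format: each optional field is written as a one-byte field tag followed by its value, and a `0` byte ends the record. A minimal self-contained sketch of that scheme, using a hypothetical `ToyBuffer` in place of peechy's `ByteBuffer` (strings are length-prefixed here purely for simplicity):

```javascript
// Toy stand-in for peechy's ByteBuffer (hypothetical, for illustration only).
class ToyBuffer {
  constructor() { this.bytes = []; this.pos = 0; }
  writeByte(b) { this.bytes.push(b & 0xff); }
  readByte() { return this.bytes[this.pos++]; }
  // Length-prefixed ASCII strings, just for this sketch.
  writeString(s) { this.writeByte(s.length); for (const c of s) this.writeByte(c.charCodeAt(0)); }
  readString() {
    const n = this.readByte();
    let s = "";
    for (let i = 0; i < n; i++) s += String.fromCharCode(this.readByte());
    return s;
  }
}

function encodeMessageData(message, bb) {
  if (message.text != null) { bb.writeByte(1); bb.writeString(message.text); } // field tag 1
  bb.writeByte(0); // field tag 0 terminates the message
}

function decodeMessageData(bb) {
  const result = {};
  while (true) {
    switch (bb.readByte()) {
      case 0: return result;                       // terminator
      case 1: result.text = bb.readString(); break; // optional "text" field
      default: throw new Error("Attempted to parse invalid message");
    }
  }
}

const bb = new ToyBuffer();
encodeMessageData({ text: "hello" }, bb);
console.log(decodeMessageData(bb).text); // "hello"
```

Because unknown-to-absent fields simply never get a tag written, old readers can stop at the terminator and new optional fields stay backward compatible.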

@@ -0,0 +1,121 @@
.container {
min-height: 100vh;
padding: 0 0.5rem;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
height: 100vh;
}
.main {
padding: 5rem 0;
flex: 1;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
}
.footer {
width: 100%;
height: 100px;
border-top: 1px solid #eaeaea;
display: flex;
justify-content: center;
align-items: center;
}
.footer a {
display: flex;
justify-content: center;
align-items: center;
flex-grow: 1;
}
.title a {
color: #0070f3;
text-decoration: none;
}
.title a:hover,
.title a:focus,
.title a:active {
text-decoration: underline;
}
.title {
margin: 0;
line-height: 1.15;
font-size: 4rem;
}
.title,
.description {
text-align: center;
}
.description {
line-height: 1.5;
font-size: 1.5rem;
}
.code {
background: #fafafa;
border-radius: 5px;
padding: 0.75rem;
font-size: 1.1rem;
font-family: Menlo, Monaco, Lucida Console, Liberation Mono, DejaVu Sans Mono,
Bitstream Vera Sans Mono, Courier New, monospace;
}
.grid {
display: flex;
align-items: center;
justify-content: center;
flex-wrap: wrap;
max-width: 800px;
margin-top: 3rem;
}
.card {
margin: 1rem;
padding: 1.5rem;
text-align: left;
color: inherit;
text-decoration: none;
border: 1px solid #eaeaea;
border-radius: 10px;
transition: color 0.15s ease, border-color 0.15s ease;
width: 45%;
}
.card:hover,
.card:focus,
.card:active {
color: #0070f3;
border-color: #0070f3;
}
.card h2 {
margin: 0 0 1rem 0;
font-size: 1.5rem;
}
.card p {
margin: 0;
font-size: 1.25rem;
line-height: 1.5;
}
.logo {
height: 1em;
margin-left: 0.5rem;
}
@media (max-width: 600px) {
.grid {
width: 100%;
flex-direction: column;
}
}


@@ -0,0 +1,16 @@
html,
body {
padding: 0;
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, Segoe UI, Roboto, Oxygen,
Ubuntu, Cantarell, Fira Sans, Droid Sans, Helvetica Neue, sans-serif;
}
a {
color: inherit;
text-decoration: none;
}
* {
box-sizing: border-box;
}


@@ -0,0 +1,29 @@
{
"compilerOptions": {
"target": "es5",
"lib": [
"dom",
"dom.iterable",
"esnext"
],
"allowJs": true,
"skipLibCheck": true,
"strict": false,
"forceConsistentCasingInFileNames": true,
"noEmit": true,
"esModuleInterop": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"isolatedModules": true,
"jsx": "preserve"
},
"include": [
"next-env.d.ts",
"**/*.ts",
"**/*.tsx"
],
"exclude": [
"node_modules"
]
}

470  src/api/schema.d.ts vendored Normal file

@@ -0,0 +1,470 @@
import type {ByteBuffer} from "peechy";
type byte = number;
type float = number;
type int = number;
type alphanumeric = string;
type uint = number;
type int8 = number;
type lowp = number;
type int16 = number;
type int32 = number;
type float32 = number;
type uint16 = number;
type uint32 = number;
export enum Loader {
jsx = 1,
js = 2,
ts = 3,
tsx = 4,
css = 5,
file = 6,
json = 7
}
export const LoaderKeys = {
1: "jsx",
jsx: "jsx",
2: "js",
js: "js",
3: "ts",
ts: "ts",
4: "tsx",
tsx: "tsx",
5: "css",
css: "css",
6: "file",
file: "file",
7: "json",
json: "json"
}
export enum ResolveMode {
disable = 1,
lazy = 2,
dev = 3,
bundle = 4
}
export const ResolveModeKeys = {
1: "disable",
disable: "disable",
2: "lazy",
lazy: "lazy",
3: "dev",
dev: "dev",
4: "bundle",
bundle: "bundle"
}
export enum Platform {
browser = 1,
node = 2,
speedy = 3
}
export const PlatformKeys = {
1: "browser",
browser: "browser",
2: "node",
node: "node",
3: "speedy",
speedy: "speedy"
}
export enum JSXRuntime {
automatic = 1,
classic = 2
}
export const JSXRuntimeKeys = {
1: "automatic",
automatic: "automatic",
2: "classic",
classic: "classic"
}
export enum ScanDependencyMode {
app = 1,
all = 2
}
export const ScanDependencyModeKeys = {
1: "app",
app: "app",
2: "all",
all: "all"
}
export enum ModuleImportType {
import = 1,
require = 2
}
export const ModuleImportTypeKeys = {
1: "import",
import: "import",
2: "require",
require: "require"
}
export enum TransformResponseStatus {
success = 1,
fail = 2
}
export const TransformResponseStatusKeys = {
1: "success",
success: "success",
2: "fail",
fail: "fail"
}
export enum MessageKind {
err = 1,
warn = 2,
note = 3,
debug = 4
}
export const MessageKindKeys = {
1: "err",
err: "err",
2: "warn",
warn: "warn",
3: "note",
note: "note",
4: "debug",
debug: "debug"
}
export enum Reloader {
disable = 1,
live = 2,
fast_refresh = 3
}
export const ReloaderKeys = {
1: "disable",
disable: "disable",
2: "live",
live: "live",
3: "fast_refresh",
fast_refresh: "fast_refresh"
}
export enum WebsocketMessageKind {
welcome = 1,
file_change_notification = 2,
build_success = 3,
build_fail = 4,
manifest_success = 5,
manifest_fail = 6
}
export const WebsocketMessageKindKeys = {
1: "welcome",
welcome: "welcome",
2: "file_change_notification",
file_change_notification: "file_change_notification",
3: "build_success",
build_success: "build_success",
4: "build_fail",
build_fail: "build_fail",
5: "manifest_success",
manifest_success: "manifest_success",
6: "manifest_fail",
manifest_fail: "manifest_fail"
}
export enum WebsocketCommandKind {
build = 1,
manifest = 2
}
export const WebsocketCommandKindKeys = {
1: "build",
build: "build",
2: "manifest",
manifest: "manifest"
}
export interface JSX {
factory: string;
runtime: JSXRuntime;
fragment: string;
development: boolean;
import_source: string;
react_fast_refresh: boolean;
}
export interface StringPointer {
offset: uint32;
length: uint32;
}
export interface JavascriptBundledModule {
path: StringPointer;
code: StringPointer;
package_id: uint32;
id: uint32;
path_extname_length: byte;
}
export interface JavascriptBundledPackage {
name: StringPointer;
version: StringPointer;
hash: uint32;
modules_offset: uint32;
modules_length: uint32;
}
export interface JavascriptBundle {
modules: JavascriptBundledModule[];
packages: JavascriptBundledPackage[];
etag: Uint8Array;
generated_at: uint32;
app_package_json_dependencies_hash: Uint8Array;
import_from_name: Uint8Array;
manifest_string: Uint8Array;
}
export interface JavascriptBundleContainer {
bundle_format_version?: uint32;
bundle?: JavascriptBundle;
code_length?: uint32;
}
export interface ModuleImportRecord {
kind: ModuleImportType;
path: string;
dynamic: boolean;
}
export interface Module {
path: string;
imports: ModuleImportRecord[];
}
export interface StringMap {
keys: string[];
values: string[];
}
export interface LoaderMap {
extensions: string[];
loaders: Loader[];
}
export interface FrameworkConfig {
entry_point?: string;
}
export interface RouteConfig {
dir?: string;
extensions?: string[];
}
export interface TransformOptions {
jsx?: JSX;
tsconfig_override?: string;
resolve?: ResolveMode;
public_url?: string;
absolute_working_dir?: string;
define?: StringMap;
preserve_symlinks?: boolean;
entry_points?: string[];
write?: boolean;
inject?: string[];
output_dir?: string;
external?: string[];
loaders?: LoaderMap;
main_fields?: string[];
platform?: Platform;
serve?: boolean;
extension_order?: string[];
public_dir?: string;
only_scan_dependencies?: ScanDependencyMode;
generate_node_module_bundle?: boolean;
node_modules_bundle_path?: string;
framework?: FrameworkConfig;
router?: RouteConfig;
}
export interface FileHandle {
path: string;
size: uint;
fd: uint;
}
export interface Transform {
handle?: FileHandle;
path?: string;
contents?: Uint8Array;
loader?: Loader;
options?: TransformOptions;
}
export interface OutputFile {
data: Uint8Array;
path: string;
}
export interface TransformResponse {
status: TransformResponseStatus;
files: OutputFile[];
errors: Message[];
}
export interface Location {
file: string;
namespace: string;
line: int32;
column: int32;
line_text: string;
suggestion: string;
offset: uint;
}
export interface MessageData {
text?: string;
location?: Location;
}
export interface Message {
kind: MessageKind;
data: MessageData;
notes: MessageData[];
}
export interface Log {
warnings: uint32;
errors: uint32;
msgs: Message[];
}
export interface WebsocketMessage {
timestamp: uint32;
kind: WebsocketMessageKind;
}
export interface WebsocketMessageWelcome {
epoch: uint32;
javascriptReloader: Reloader;
}
export interface WebsocketMessageFileChangeNotification {
id: uint32;
loader: Loader;
}
export interface WebsocketCommand {
kind: WebsocketCommandKind;
timestamp: uint32;
}
export interface WebsocketCommandBuild {
id: uint32;
}
export interface WebsocketCommandManifest {
id: uint32;
}
export interface WebsocketMessageBuildSuccess {
id: uint32;
from_timestamp: uint32;
loader: Loader;
module_path: string;
blob_length: uint32;
}
export interface WebsocketMessageBuildFailure {
id: uint32;
from_timestamp: uint32;
loader: Loader;
module_path: string;
log: Log;
}
export interface DependencyManifest {
ids: Uint32Array;
}
export interface FileList {
ptrs: StringPointer[];
files: string;
}
export interface WebsocketMessageResolveIDs {
id: Uint32Array;
list: FileList;
}
export interface WebsocketCommandResolveIDs {
ptrs: StringPointer[];
files: string;
}
export interface WebsocketMessageManifestSuccess {
id: uint32;
module_path: string;
loader: Loader;
manifest: DependencyManifest;
}
export interface WebsocketMessageManifestFailure {
id: uint32;
from_timestamp: uint32;
loader: Loader;
log: Log;
}
export declare function encodeJSX(message: JSX, bb: ByteBuffer): void;
export declare function decodeJSX(buffer: ByteBuffer): JSX;
export declare function encodeStringPointer(message: StringPointer, bb: ByteBuffer): void;
export declare function decodeStringPointer(buffer: ByteBuffer): StringPointer;
export declare function encodeJavascriptBundledModule(message: JavascriptBundledModule, bb: ByteBuffer): void;
export declare function decodeJavascriptBundledModule(buffer: ByteBuffer): JavascriptBundledModule;
export declare function encodeJavascriptBundledPackage(message: JavascriptBundledPackage, bb: ByteBuffer): void;
export declare function decodeJavascriptBundledPackage(buffer: ByteBuffer): JavascriptBundledPackage;
export declare function encodeJavascriptBundle(message: JavascriptBundle, bb: ByteBuffer): void;
export declare function decodeJavascriptBundle(buffer: ByteBuffer): JavascriptBundle;
export declare function encodeJavascriptBundleContainer(message: JavascriptBundleContainer, bb: ByteBuffer): void;
export declare function decodeJavascriptBundleContainer(buffer: ByteBuffer): JavascriptBundleContainer;
export declare function encodeModuleImportRecord(message: ModuleImportRecord, bb: ByteBuffer): void;
export declare function decodeModuleImportRecord(buffer: ByteBuffer): ModuleImportRecord;
export declare function encodeModule(message: Module, bb: ByteBuffer): void;
export declare function decodeModule(buffer: ByteBuffer): Module;
export declare function encodeStringMap(message: StringMap, bb: ByteBuffer): void;
export declare function decodeStringMap(buffer: ByteBuffer): StringMap;
export declare function encodeLoaderMap(message: LoaderMap, bb: ByteBuffer): void;
export declare function decodeLoaderMap(buffer: ByteBuffer): LoaderMap;
export declare function encodeFrameworkConfig(message: FrameworkConfig, bb: ByteBuffer): void;
export declare function decodeFrameworkConfig(buffer: ByteBuffer): FrameworkConfig;
export declare function encodeRouteConfig(message: RouteConfig, bb: ByteBuffer): void;
export declare function decodeRouteConfig(buffer: ByteBuffer): RouteConfig;
export declare function encodeTransformOptions(message: TransformOptions, bb: ByteBuffer): void;
export declare function decodeTransformOptions(buffer: ByteBuffer): TransformOptions;
export declare function encodeFileHandle(message: FileHandle, bb: ByteBuffer): void;
export declare function decodeFileHandle(buffer: ByteBuffer): FileHandle;
export declare function encodeTransform(message: Transform, bb: ByteBuffer): void;
export declare function decodeTransform(buffer: ByteBuffer): Transform;
export declare function encodeOutputFile(message: OutputFile, bb: ByteBuffer): void;
export declare function decodeOutputFile(buffer: ByteBuffer): OutputFile;
export declare function encodeTransformResponse(message: TransformResponse, bb: ByteBuffer): void;
export declare function decodeTransformResponse(buffer: ByteBuffer): TransformResponse;
export declare function encodeLocation(message: Location, bb: ByteBuffer): void;
export declare function decodeLocation(buffer: ByteBuffer): Location;
export declare function encodeMessageData(message: MessageData, bb: ByteBuffer): void;
export declare function decodeMessageData(buffer: ByteBuffer): MessageData;
export declare function encodeMessage(message: Message, bb: ByteBuffer): void;
export declare function decodeMessage(buffer: ByteBuffer): Message;
export declare function encodeLog(message: Log, bb: ByteBuffer): void;
export declare function decodeLog(buffer: ByteBuffer): Log;
export declare function encodeWebsocketMessage(message: WebsocketMessage, bb: ByteBuffer): void;
export declare function decodeWebsocketMessage(buffer: ByteBuffer): WebsocketMessage;
export declare function encodeWebsocketMessageWelcome(message: WebsocketMessageWelcome, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageWelcome(buffer: ByteBuffer): WebsocketMessageWelcome;
export declare function encodeWebsocketMessageFileChangeNotification(message: WebsocketMessageFileChangeNotification, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageFileChangeNotification(buffer: ByteBuffer): WebsocketMessageFileChangeNotification;
export declare function encodeWebsocketCommand(message: WebsocketCommand, bb: ByteBuffer): void;
export declare function decodeWebsocketCommand(buffer: ByteBuffer): WebsocketCommand;
export declare function encodeWebsocketCommandBuild(message: WebsocketCommandBuild, bb: ByteBuffer): void;
export declare function decodeWebsocketCommandBuild(buffer: ByteBuffer): WebsocketCommandBuild;
export declare function encodeWebsocketCommandManifest(message: WebsocketCommandManifest, bb: ByteBuffer): void;
export declare function decodeWebsocketCommandManifest(buffer: ByteBuffer): WebsocketCommandManifest;
export declare function encodeWebsocketMessageBuildSuccess(message: WebsocketMessageBuildSuccess, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageBuildSuccess(buffer: ByteBuffer): WebsocketMessageBuildSuccess;
export declare function encodeWebsocketMessageBuildFailure(message: WebsocketMessageBuildFailure, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageBuildFailure(buffer: ByteBuffer): WebsocketMessageBuildFailure;
export declare function encodeDependencyManifest(message: DependencyManifest, bb: ByteBuffer): void;
export declare function decodeDependencyManifest(buffer: ByteBuffer): DependencyManifest;
export declare function encodeFileList(message: FileList, bb: ByteBuffer): void;
export declare function decodeFileList(buffer: ByteBuffer): FileList;
export declare function encodeWebsocketMessageResolveIDs(message: WebsocketMessageResolveIDs, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageResolveIDs(buffer: ByteBuffer): WebsocketMessageResolveIDs;
export declare function encodeWebsocketCommandResolveIDs(message: WebsocketCommandResolveIDs, bb: ByteBuffer): void;
export declare function decodeWebsocketCommandResolveIDs(buffer: ByteBuffer): WebsocketCommandResolveIDs;
export declare function encodeWebsocketMessageManifestSuccess(message: WebsocketMessageManifestSuccess, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageManifestSuccess(buffer: ByteBuffer): WebsocketMessageManifestSuccess;
export declare function encodeWebsocketMessageManifestFailure(message: WebsocketMessageManifestFailure, bb: ByteBuffer): void;
export declare function decodeWebsocketMessageManifestFailure(buffer: ByteBuffer): WebsocketMessageManifestFailure;

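Alongside each enum, the declarations above export a `*Keys` table (e.g. `LoaderKeys`) that maps both the numeric wire value and the string name to the canonical string name, so one object serves decoding and validation. A small sketch of that pattern, re-declaring an excerpt of `Loader`/`LoaderKeys` for illustration:

```javascript
// Excerpt of the generated tables, re-declared here for illustration.
const Loader = { jsx: 1, js: 2, ts: 3 };
const LoaderKeys = { 1: "jsx", jsx: "jsx", 2: "js", js: "js", 3: "ts", ts: "ts" };

// Decoding: a varuint read off the wire becomes a readable name.
console.log(LoaderKeys[2]); // "js"

// Encoding: a user-supplied name normalizes to itself before the numeric
// value from Loader[...] is written; undefined means an invalid enum value.
console.log(LoaderKeys["ts"]);          // "ts"
console.log(Loader[LoaderKeys["ts"]]);  // 3
```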
1936  src/api/schema.js Normal file

File diff suppressed because it is too large

368  src/api/schema.peechy Normal file

@@ -0,0 +1,368 @@
package Api;
smol Loader {
jsx = 1;
js = 2;
ts = 3;
tsx = 4;
css = 5;
file = 6;
json = 7;
}
smol ResolveMode {
disable = 1;
lazy = 2;
dev = 3;
bundle = 4;
}
smol Platform {
browser = 1;
node = 2;
speedy = 3;
}
smol JSXRuntime {
automatic = 1;
classic = 2;
}
struct JSX {
string factory;
JSXRuntime runtime;
string fragment;
bool development;
// Probably react
string import_source;
bool react_fast_refresh;
}
struct StringPointer {
uint32 offset;
uint32 length;
}
struct JavascriptBundledModule {
// package-relative path including file extension
StringPointer path;
// Source code
StringPointer code;
// index into JavascriptBundle.packages
uint32 package_id;
// The ESM export is this id ("$" + number.toString(16))
uint32 id;
// This lets us efficiently compare strings ignoring the extension
byte path_extname_length;
}
struct JavascriptBundledPackage {
StringPointer name;
StringPointer version;
uint32 hash;
uint32 modules_offset;
uint32 modules_length;
}
struct JavascriptBundle {
// These are sorted alphabetically so you can do binary search
JavascriptBundledModule[] modules;
JavascriptBundledPackage[] packages;
// This is ASCII-encoded so you can send it directly over HTTP
byte[] etag;
uint32 generated_at;
// generated by hashing all ${name}@${version} in sorted order
byte[] app_package_json_dependencies_hash;
byte[] import_from_name;
// This is what StringPointer refers to
byte[] manifest_string;
}
message JavascriptBundleContainer {
uint32 bundle_format_version = 1;
JavascriptBundle bundle = 2;
// Don't technically need to store this, but it may be helpful as a sanity check
uint32 code_length = 3;
}
smol ScanDependencyMode {
app = 1;
all = 2;
}
smol ModuleImportType {
import = 1;
require = 2;
}
struct ModuleImportRecord {
ModuleImportType kind;
string path;
bool dynamic;
}
struct Module {
string path;
ModuleImportRecord[] imports;
}
struct StringMap {
string[] keys;
string[] values;
}
struct LoaderMap {
string[] extensions;
Loader[] loaders;
}
message FrameworkConfig {
string entry_point = 1;
}
message RouteConfig {
string dir = 1;
string[] extensions = 2;
}
message TransformOptions {
JSX jsx = 1;
string tsconfig_override = 2;
ResolveMode resolve = 3;
string public_url = 4;
string absolute_working_dir = 5;
StringMap define = 6;
bool preserve_symlinks = 7;
string[] entry_points = 8;
bool write = 9;
string[] inject = 10;
string output_dir = 11;
string[] external = 12;
LoaderMap loaders = 13;
string[] main_fields = 14;
Platform platform = 15;
bool serve = 16;
string[] extension_order = 17;
string public_dir = 18;
ScanDependencyMode only_scan_dependencies = 19;
bool generate_node_module_bundle = 20;
string node_modules_bundle_path = 21;
FrameworkConfig framework = 22;
RouteConfig router = 23;
}
struct FileHandle {
string path;
uint size;
uint fd;
}
message Transform {
FileHandle handle = 1;
string path = 2;
byte[] contents = 3;
Loader loader = 4;
TransformOptions options = 5;
}
enum TransformResponseStatus {
success = 1;
fail = 2;
}
struct OutputFile {
byte[] data;
string path;
}
struct TransformResponse {
TransformResponseStatus status;
OutputFile[] files;
Message[] errors;
}
enum MessageKind {
err = 1;
warn = 2;
note = 3;
debug = 4;
}
struct Location {
string file;
string namespace;
int32 line;
int32 column;
string line_text;
string suggestion;
uint offset;
}
message MessageData {
string text = 1;
Location location = 2;
}
struct Message {
MessageKind kind;
MessageData data;
MessageData[] notes;
}
struct Log {
uint32 warnings;
uint32 errors;
Message[] msgs;
}
smol Reloader {
disable = 1;
// equivalent of CMD + R
live = 2;
// React Fast Refresh
fast_refresh = 3;
}
// The WebSocket protocol
// Server: "hey, this file changed. Does anyone want it?"
// Browser: *checks array* "uhh yeah, ok. rebuild that for me"
// Server: "here u go"
// This makes the client responsible for tracking which files it needs to listen for.
// From a server perspective, this means the filesystem watching thread can send the same WebSocket message
// to every client, which is good for performance. It means having 5 tabs open won't be much different from having one tab.
// The clients can just ignore files they don't care about
smol WebsocketMessageKind {
welcome = 1;
file_change_notification = 2;
build_success = 3;
build_fail = 4;
manifest_success = 5;
manifest_fail = 6;
}
smol WebsocketCommandKind {
build = 1;
manifest = 2;
}
// Each websocket message has two messages in it!
// This is the first.
struct WebsocketMessage {
uint32 timestamp;
WebsocketMessageKind kind;
}
// This is the second.
struct WebsocketMessageWelcome {
uint32 epoch;
Reloader javascriptReloader;
}
struct WebsocketMessageFileChangeNotification {
uint32 id;
Loader loader;
}
struct WebsocketCommand {
WebsocketCommandKind kind;
uint32 timestamp;
}
// The timestamp is used for client-side deduping
struct WebsocketCommandBuild {
uint32 id;
}
struct WebsocketCommandManifest {
uint32 id;
}
// We copy the module_path here in case they don't already have it
struct WebsocketMessageBuildSuccess {
uint32 id;
uint32 from_timestamp;
Loader loader;
string module_path;
// This is the length of the blob that immediately follows this message.
uint32 blob_length;
}
struct WebsocketMessageBuildFailure {
uint32 id;
uint32 from_timestamp;
Loader loader;
string module_path;
Log log;
}
// CSS @import only for now!
struct DependencyManifest {
uint32[] ids;
}
struct FileList {
StringPointer[] ptrs;
string files;
}
struct WebsocketMessageResolveIDs {
uint32[] id;
FileList list;
}
struct WebsocketCommandResolveIDs {
StringPointer[] ptrs;
string files;
}
struct WebsocketMessageManifestSuccess {
uint32 id;
string module_path;
Loader loader;
DependencyManifest manifest;
}
struct WebsocketMessageManifestFailure {
uint32 id;
uint32 from_timestamp;
Loader loader;
Log log;
}

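The `uint` and `uint32` fields in this schema travel over the wire through `writeVarUint`/`readVarUint` in the generated bindings. A sketch of the common LEB128-style variable-length encoding (assumed here for illustration; peechy's exact byte layout may differ): seven payload bits per byte, with the high bit set on every byte except the last.

```javascript
// Encode an unsigned integer as LEB128-style varuint bytes (sketch).
function writeVarUint(bytes, value) {
  do {
    let b = value & 0x7f;
    value >>>= 7;
    if (value) b |= 0x80; // continuation bit: more bytes follow
    bytes.push(b);
  } while (value);
}

// Decode a varuint starting at pos; returns the value and the next position.
function readVarUint(bytes, pos = 0) {
  let value = 0, shift = 0, b;
  do {
    b = bytes[pos++];
    value |= (b & 0x7f) << shift;
    shift += 7;
  } while (b & 0x80);
  return { value: value >>> 0, pos };
}

const buf = [];
writeVarUint(buf, 300); // 300 = 0b10_0101100 -> [0xac, 0x02]
console.log(buf);
console.log(readVarUint(buf).value); // 300
```

Small values (enum tags, short lengths) cost a single byte, which is why the schema leans on varuints for field counts and `smol` enums.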
2039  src/api/schema.zig Normal file

File diff suppressed because it is too large


@@ -22,13 +22,43 @@ pub const NodeIndexNone = 4294967293;
// be an array of arrays indexed first by source index, then by inner index.
// The maps can be merged quickly by creating a single outer array containing
// all inner arrays from all parsed files.
pub const RefHashCtx = struct {
pub fn hash(ctx: @This(), key: Ref) u32 {
return @truncate(u32, std.hash.Wyhash.hash(0, std.mem.asBytes(&key)));
}
pub fn eql(ctx: @This(), ref: Ref, b: Ref) bool {
return std.mem.readIntNative(u64, std.mem.asBytes(&ref)) == std.mem.readIntNative(u64, std.mem.asBytes(&b));
}
};
pub const Ref = packed struct {
source_index: Int = std.math.maxInt(Ref.Int),
inner_index: Int = 0,
is_source_contents_slice: bool = false,
// 2 bits of padding for whatever is the parent
pub const Int = u31;
pub const None = Ref{ .inner_index = std.math.maxInt(Ref.Int) };
pub const Int = u30;
pub const None = Ref{
.inner_index = std.math.maxInt(Ref.Int),
.source_index = std.math.maxInt(Ref.Int),
};
pub const RuntimeRef = Ref{
.inner_index = std.math.maxInt(Ref.Int),
.source_index = std.math.maxInt(Ref.Int) - 1,
};
pub fn toInt(int: anytype) Int {
return @intCast(Int, int);
}
pub fn hash(key: Ref) u32 {
return @truncate(u32, std.hash.Wyhash.hash(0, std.mem.asBytes(&key)));
}
pub fn eql(ref: Ref, b: Ref) bool {
return std.mem.readIntNative(u64, std.mem.asBytes(&ref)) == std.mem.readIntNative(u64, std.mem.asBytes(&b));
}
pub fn isNull(self: *const Ref) bool {
return self.source_index == std.math.maxInt(Ref.Int) and self.inner_index == std.math.maxInt(Ref.Int);
}
@@ -37,12 +67,12 @@ pub const Ref = packed struct {
return self.source_index == std.math.maxInt(Ref.Int);
}
pub fn isSourceIndexNull(int: Ref.Int) bool {
pub fn isSourceIndexNull(int: anytype) bool {
return int == std.math.maxInt(Ref.Int);
}
pub fn eql(ref: Ref, b: Ref) bool {
return ref.inner_index == b.inner_index and ref.source_index == b.source_index;
pub fn jsonStringify(self: *const Ref, options: anytype, writer: anytype) !void {
return try std.json.stringify([2]u32{ self.source_index, self.inner_index }, options, writer);
}
};
@@ -55,3 +85,11 @@ pub const RequireOrImportMeta = struct {
exports_ref: Ref = Ref.None,
is_wrapper_async: bool = false,
};
pub inline fn debug(comptime fmt: []const u8, args: anytype) void {
// Output.print(fmt, args);
}
pub inline fn debugl(
comptime fmt: []const u8,
) void {
// Output.print("{s}\n", .{fmt});
}

File diff suppressed because it is too large

6  src/c.zig Normal file

@@ -0,0 +1,6 @@
const std = @import("std");
pub usingnamespace switch (std.Target.current.os.tag) {
.macos => @import("./darwin_c.zig"),
else => struct {},
};

255  src/cache.zig Normal file

@@ -0,0 +1,255 @@
usingnamespace @import("global.zig");
const js_ast = @import("./js_ast.zig");
const logger = @import("./logger.zig");
const js_parser = @import("./js_parser/js_parser.zig");
const json_parser = @import("./json_parser.zig");
const options = @import("./options.zig");
const Define = @import("./defines.zig").Define;
const std = @import("std");
const fs = @import("./fs.zig");
const sync = @import("sync.zig");
const Mutex = sync.Mutex;
const import_record = @import("./import_record.zig");
const ImportRecord = import_record.ImportRecord;
pub fn NewCache(comptime cache_files: bool) type {
return struct {
pub const Set = struct {
js: JavaScript,
fs: Fs,
json: Json,
pub fn init(allocator: *std.mem.Allocator) Set {
return Set{
.js = JavaScript.init(allocator),
.fs = Fs{
.mutex = Mutex.init(),
.entries = std.StringHashMap(Fs.Entry).init(allocator),
.shared_buffer = MutableString.init(allocator, 0) catch unreachable,
},
.json = Json{
.mutex = Mutex.init(),
.entries = std.StringHashMap(*Json.Entry).init(allocator),
},
};
}
};
pub const Fs = struct {
mutex: Mutex,
entries: std.StringHashMap(Entry),
shared_buffer: MutableString,
pub const Entry = struct {
contents: string,
fd: StoredFileDescriptorType = 0,
// Null means it's not usable
mod_key: ?fs.FileSystem.Implementation.ModKey = null,
pub fn deinit(entry: *Entry, allocator: *std.mem.Allocator) void {
if (entry.contents.len > 0) {
allocator.free(entry.contents);
entry.contents = "";
}
}
};
pub fn deinit(c: *Fs) void {
var iter = c.entries.iterator();
while (iter.next()) |entry| {
entry.value.deinit(c.entries.allocator);
}
c.entries.deinit();
}
pub fn readFile(
c: *Fs,
_fs: *fs.FileSystem,
path: string,
dirname_fd: StoredFileDescriptorType,
comptime use_shared_buffer: bool,
_file_handle: ?StoredFileDescriptorType,
) !Entry {
var rfs = _fs.fs;
if (cache_files) {
{
c.mutex.lock();
defer c.mutex.unlock();
if (c.entries.get(path)) |entry| {
return entry;
}
}
}
var file_handle: std.fs.File = if (_file_handle) |__file| std.fs.File{ .handle = __file } else undefined;
if (_file_handle == null) {
if (FeatureFlags.store_file_descriptors and dirname_fd > 0) {
file_handle = try std.fs.Dir.openFile(std.fs.Dir{ .fd = dirname_fd }, std.fs.path.basename(path), .{ .read = true });
} else {
file_handle = try std.fs.openFileAbsolute(path, .{ .read = true });
}
}
defer {
if (rfs.needToCloseFiles() and _file_handle == null) {
file_handle.close();
}
}
// If the file's modification key hasn't changed since it was cached, assume
// the contents of the file are also the same and skip reading the file.
var mod_key: ?fs.FileSystem.Implementation.ModKey = rfs.modKeyWithFile(path, file_handle) catch |err| handler: {
switch (err) {
error.FileNotFound, error.AccessDenied => {
return err;
},
else => {
if (isDebug) {
Output.printError("modkey error: {s}", .{@errorName(err)});
}
break :handler null;
},
}
};
var file: fs.File = undefined;
if (mod_key) |modk| {
file = rfs.readFileWithHandle(path, modk.size, file_handle, use_shared_buffer, &c.shared_buffer) catch |err| {
if (isDebug) {
Output.printError("{s}: readFile error -- {s}", .{ path, @errorName(err) });
}
return err;
};
} else {
file = rfs.readFileWithHandle(path, null, file_handle, use_shared_buffer, &c.shared_buffer) catch |err| {
if (isDebug) {
Output.printError("{s}: readFile error -- {s}", .{ path, @errorName(err) });
}
return err;
};
}
const entry = Entry{
.contents = file.contents,
.mod_key = mod_key,
.fd = if (FeatureFlags.store_file_descriptors) file_handle.handle else 0,
};
if (cache_files) {
c.mutex.lock();
defer c.mutex.unlock();
var res = c.entries.getOrPut(path) catch unreachable;
if (res.found_existing) {
res.value_ptr.*.deinit(c.entries.allocator);
}
res.value_ptr.* = entry;
return res.value_ptr.*;
} else {
return entry;
}
}
};
pub const Css = struct {
pub const Entry = struct {};
pub const Result = struct {
ok: bool,
value: void,
};
pub fn parse(cache: *@This(), log: *logger.Log, source: logger.Source) !Result {
Global.notimpl();
}
};
pub const JavaScript = struct {
mutex: Mutex,
entries: std.StringHashMap(Result),
pub const Result = js_ast.Result;
pub fn init(allocator: *std.mem.Allocator) JavaScript {
return JavaScript{ .mutex = Mutex.init(), .entries = std.StringHashMap(Result).init(allocator) };
}
// For now, we're not going to cache JavaScript ASTs.
// It's probably only relevant when bundling for production.
pub fn parse(
cache: *@This(),
allocator: *std.mem.Allocator,
opts: js_parser.Parser.Options,
defines: *Define,
log: *logger.Log,
source: *const logger.Source,
) anyerror!?js_ast.Ast {
var temp_log = logger.Log.init(allocator);
defer temp_log.appendTo(log) catch {};
var parser = js_parser.Parser.init(opts, &temp_log, source, defines, allocator) catch |err| {
return null;
};
const result = try parser.parse();
return if (result.ok) result.ast else null;
}
pub fn scan(
cache: *@This(),
allocator: *std.mem.Allocator,
scan_pass_result: *js_parser.ScanPassResult,
opts: js_parser.Parser.Options,
defines: *Define,
log: *logger.Log,
source: *const logger.Source,
) anyerror!void {
var temp_log = logger.Log.init(allocator);
defer temp_log.appendTo(log) catch {};
var parser = js_parser.Parser.init(opts, &temp_log, source, defines, allocator) catch |err| {
return;
};
return try parser.scanImports(scan_pass_result);
}
};
pub const Json = struct {
pub const Entry = struct {
is_tsconfig: bool = false,
source: logger.Source,
expr: ?js_ast.Expr = null,
ok: bool = false,
// msgs: []logger.Msg,
};
mutex: Mutex,
entries: std.StringHashMap(*Entry),
pub fn init(allocator: *std.mem.Allocator) Json {
return Json{
.mutex = Mutex.init(),
.entries = std.StringHashMap(*Entry).init(allocator),
};
}
fn parse(cache: *@This(), log: *logger.Log, source: logger.Source, allocator: *std.mem.Allocator, is_tsconfig: bool, func: anytype) anyerror!?js_ast.Expr {
var temp_log = logger.Log.init(allocator);
defer {
temp_log.appendTo(log) catch {};
}
return func(&source, &temp_log, allocator) catch handler: {
break :handler null;
};
}
pub fn parseJSON(cache: *@This(), log: *logger.Log, source: logger.Source, allocator: *std.mem.Allocator) anyerror!?js_ast.Expr {
return try parse(cache, log, source, allocator, false, json_parser.ParseJSON);
}
pub fn parseTSConfig(cache: *@This(), log: *logger.Log, source: logger.Source, allocator: *std.mem.Allocator) anyerror!?js_ast.Expr {
return try parse(cache, log, source, allocator, true, json_parser.ParseTSConfig);
}
};
};
}
pub const Cache = NewCache(true);
pub const ServeCache = NewCache(false);
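The parse and scan helpers above share one pattern: diagnostics accumulate in a temporary log that is merged into the caller's log on exit via `defer temp_log.appendTo(log) catch {}`, so messages survive even when parsing fails and `null` is returned. A rough Python sketch of that pattern (the `Log` and `parse` names here are illustrative, not part of the codebase):

```python
class Log:
    def __init__(self):
        self.msgs = []

    def append_to(self, other):
        # Merge this log's messages into another log.
        other.msgs.extend(self.msgs)

def parse(source, log):
    temp_log = Log()
    try:
        # Stand-in for js_parser.Parser.init(...) + parser.parse().
        if "syntax error" in source:
            temp_log.msgs.append("error: unexpected token")
            return None
        return {"ast": source}
    finally:
        # Mirrors `defer temp_log.appendTo(log) catch {}`:
        # diagnostics reach the caller even on the failure path.
        temp_log.append_to(log)

log = Log()
assert parse("syntax error here", log) is None
assert log.msgs == ["error: unexpected token"]
```

The `try`/`finally` plays the role of Zig's `defer`: the merge runs on both the success and failure paths.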

src/cli.zig Normal file (529 lines)
@@ -0,0 +1,529 @@
usingnamespace @import("global.zig");
usingnamespace @import("./http.zig");
const std = @import("std");
const lex = @import("js_lexer.zig");
const logger = @import("logger.zig");
const alloc = @import("alloc.zig");
const options = @import("options.zig");
const js_parser = @import("js_parser.zig");
const json_parser = @import("json_parser.zig");
const js_printer = @import("js_printer.zig");
const js_ast = @import("js_ast.zig");
const linker = @import("linker.zig");
usingnamespace @import("ast/base.zig");
usingnamespace @import("defines.zig");
const panicky = @import("panic_handler.zig");
const Api = @import("api/schema.zig").Api;
const resolve_path = @import("./resolver/resolve_path.zig");
const clap = @import("clap");
const bundler = @import("bundler.zig");
const fs = @import("fs.zig");
const NodeModuleBundle = @import("./node_module_bundle.zig").NodeModuleBundle;
pub const Cli = struct {
const LoaderMatcher = strings.ExactSizeMatcher(4);
pub fn ColonListType(comptime t: type, value_resolver: anytype) type {
return struct {
pub fn init(allocator: *std.mem.Allocator, count: usize) !@This() {
var keys = try allocator.alloc(string, count);
var values = try allocator.alloc(t, count);
return @This(){ .keys = keys, .values = values };
}
keys: []string,
values: []t,
pub fn load(self: *@This(), input: []const string) !void {
for (input) |str, i| {
// Accept either ":" or "=" as the key/value separator, using whichever appears first.
// ":" reads more naturally alongside flags, but "=" is what esbuild uses,
// so both work to keep the syntax familiar to esbuild users.
const midpoint = std.math.min(strings.indexOfChar(str, ':') orelse std.math.maxInt(usize), strings.indexOfChar(str, '=') orelse std.math.maxInt(usize));
if (midpoint == std.math.maxInt(usize)) {
return error.InvalidSeparator;
}
self.keys[i] = str[0..midpoint];
self.values[i] = try value_resolver(str[midpoint + 1 .. str.len]);
}
}
pub fn resolve(allocator: *std.mem.Allocator, input: []const string) !@This() {
var list = try init(allocator, input.len);
try list.load(input);
return list;
}
};
}
pub const LoaderColonList = ColonListType(Api.Loader, Arguments.loader_resolver);
pub const DefineColonList = ColonListType(string, Arguments.noop_resolver);
pub const Arguments = struct {
pub fn loader_resolver(in: string) !Api.Loader {
const Matcher = strings.ExactSizeMatcher(4);
switch (Matcher.match(in)) {
Matcher.case("jsx") => return Api.Loader.jsx,
Matcher.case("js") => return Api.Loader.js,
Matcher.case("ts") => return Api.Loader.ts,
Matcher.case("tsx") => return Api.Loader.tsx,
Matcher.case("css") => return Api.Loader.css,
Matcher.case("file") => return Api.Loader.file,
Matcher.case("json") => return Api.Loader.json,
else => {
return error.InvalidLoader;
},
}
}
pub fn noop_resolver(in: string) !string {
return in;
}
pub fn fileReadError(err: anyerror, stderr: anytype, filename: string, kind: string) noreturn {
stderr.writer().print("Error reading file \"{s}\" for {s}: {s}", .{ filename, kind, @errorName(err) }) catch {};
std.process.exit(1);
}
pub fn readFile(
allocator: *std.mem.Allocator,
cwd: string,
filename: string,
) ![]u8 {
var paths = [_]string{ cwd, filename };
const outpath = try std.fs.path.resolve(allocator, &paths);
defer allocator.free(outpath);
var file = try std.fs.openFileAbsolute(outpath, std.fs.File.OpenFlags{ .read = true, .write = false });
defer file.close();
const stats = try file.stat();
return try file.readToEndAlloc(allocator, stats.size);
}
pub fn parse(allocator: *std.mem.Allocator, stdout: anytype, stderr: anytype) !Api.TransformOptions {
@setEvalBranchQuota(9999);
const params = comptime [_]clap.Param(clap.Help){
clap.parseParam("-h, --help Display this help and exit. ") catch unreachable,
clap.parseParam("-r, --resolve <STR> Determine import/require behavior. \"disable\" ignores. \"dev\" bundles node_modules and builds everything else as independent entry points") catch unreachable,
clap.parseParam("-d, --define <STR>... Substitute K:V while parsing, e.g. --define process.env.NODE_ENV:development") catch unreachable,
clap.parseParam("-l, --loader <STR>... Parse files with .ext:loader, e.g. --loader .js:jsx. Valid loaders: jsx, js, json, tsx (not implemented yet), ts (not implemented yet), css (not implemented yet)") catch unreachable,
clap.parseParam("-o, --outdir <STR> Save output to directory (default: \"out\" if none provided and multiple entry points passed)") catch unreachable,
clap.parseParam("-e, --external <STR>... Exclude module from transpilation (can use * wildcards). ex: -e react") catch unreachable,
clap.parseParam("-i, --inject <STR>... Inject module at the top of every file") catch unreachable,
clap.parseParam("--cwd <STR> Absolute path to resolve entry points from. Defaults to cwd") catch unreachable,
clap.parseParam("--public-url <STR> Rewrite import paths to start with --public-url. Useful for web browsers.") catch unreachable,
clap.parseParam("--serve Start a local dev server. This also sets resolve to \"lazy\".") catch unreachable,
clap.parseParam("--public-dir <STR> Top-level directory for .html files, fonts, images, or anything external. Only relevant with --serve. Defaults to \"<cwd>/public\", to match create-react-app and Next.js") catch unreachable,
clap.parseParam("--jsx-factory <STR> Changes the function called when compiling JSX elements using the classic JSX runtime") catch unreachable,
clap.parseParam("--jsx-fragment <STR> Changes the function called when compiling JSX fragments using the classic JSX runtime") catch unreachable,
clap.parseParam("--jsx-import-source <STR> Declares the module specifier to be used for importing the jsx and jsxs factory functions. Default: \"react\"") catch unreachable,
clap.parseParam("--jsx-runtime <STR> \"automatic\" (default) or \"classic\"") catch unreachable,
clap.parseParam("--jsx-production Use jsx instead of jsxDEV (default) for the automatic runtime") catch unreachable,
clap.parseParam("--extension-order <STR>... defaults to: .tsx,.ts,.jsx,.js,.json ") catch unreachable,
clap.parseParam("--disable-react-fast-refresh Disable React Fast Refresh. Enabled if --serve is set and --jsx-production is not set. Otherwise, it's a noop.") catch unreachable,
clap.parseParam("--tsconfig-override <STR> Load tsconfig from path instead of cwd/tsconfig.json") catch unreachable,
clap.parseParam("--platform <STR> \"browser\" or \"node\". Defaults to \"browser\"") catch unreachable,
clap.parseParam("--main-fields <STR>... Main fields to lookup in package.json. Defaults to --platform dependent") catch unreachable,
clap.parseParam("--scan Instead of bundling or transpiling, print a list of every file imported by an entry point, recursively") catch unreachable,
clap.parseParam("--new-jsb Generate a new node_modules.jsb file from node_modules and entry point(s)") catch unreachable,
clap.parseParam("--jsb <STR> Use a Speedy JavaScript Bundle (default: \"./node_modules.jsb\" if exists)") catch unreachable,
clap.parseParam("--framework <STR> Use a JavaScript framework (module path)") catch unreachable,
clap.parseParam("<POS>... Entry point(s) to use. Can be individual files, npm packages, or one directory. If one directory, it will auto-detect entry points using a filesystem router. If you're using a framework, passing entry points are optional.") catch unreachable,
};
var diag = clap.Diagnostic{};
var args = clap.parse(clap.Help, &params, .{ .diagnostic = &diag }) catch |err| {
// Report useful error and exit
diag.report(stderr.writer(), err) catch {};
return err;
};
if (args.flag("--help")) {
try clap.help(stderr.writer(), &params);
std.process.exit(1);
}
var cwd_paths = [_]string{args.option("--cwd") orelse try std.process.getCwdAlloc(allocator)};
var cwd = try std.fs.path.resolve(allocator, &cwd_paths);
var tsconfig_override = if (args.option("--tsconfig-override")) |ts| (Arguments.readFile(allocator, cwd, ts) catch |err| fileReadError(err, stderr, ts, "tsconfig.json")) else null;
var public_url = args.option("--public-url");
var defines_tuple = try DefineColonList.resolve(allocator, args.options("--define"));
var loader_tuple = try LoaderColonList.resolve(allocator, args.options("--loader"));
var define_keys = defines_tuple.keys;
var define_values = defines_tuple.values;
var loader_keys = loader_tuple.keys;
var loader_values = loader_tuple.values;
var entry_points = args.positionals();
var inject = args.options("--inject");
var output_dir = args.option("--outdir");
const serve = args.flag("--serve");
var write = entry_points.len > 1 or output_dir != null;
if (write and output_dir == null) {
var _paths = [_]string{ cwd, "out" };
output_dir = try std.fs.path.resolve(allocator, &_paths);
}
var externals = std.mem.zeroes([][]u8);
if (args.options("--external").len > 0) {
externals = try allocator.alloc([]u8, args.options("--external").len);
for (args.options("--external")) |external, i| {
externals[i] = constStrToU8(external);
}
}
var jsx_factory = args.option("--jsx-factory");
var jsx_fragment = args.option("--jsx-fragment");
var jsx_import_source = args.option("--jsx-import-source");
var jsx_runtime = args.option("--jsx-runtime");
var jsx_production = args.flag("--jsx-production");
var react_fast_refresh = false;
var framework_entry_point = args.option("--framework");
if (serve or args.flag("--new-jsb")) {
react_fast_refresh = true;
if (args.flag("--disable-react-fast-refresh") or jsx_production) {
react_fast_refresh = false;
}
}
var main_fields = args.options("--main-fields");
var node_modules_bundle_path = args.option("--jsb") orelse brk: {
if (args.flag("--new-jsb")) {
break :brk null;
}
const node_modules_bundle_path_absolute = resolve_path.joinAbs(cwd, .auto, "node_modules.jsb");
std.fs.accessAbsolute(node_modules_bundle_path_absolute, .{}) catch |err| {
break :brk null;
};
break :brk try std.fs.realpathAlloc(allocator, node_modules_bundle_path_absolute);
};
if (args.flag("--new-jsb")) {
node_modules_bundle_path = null;
}
const PlatformMatcher = strings.ExactSizeMatcher(8);
const ResolveMatcher = strings.ExactSizeMatcher(8);
var resolve = Api.ResolveMode.lazy;
if (args.option("--resolve")) |_resolve| {
switch (ResolveMatcher.match(_resolve)) {
ResolveMatcher.case("disable") => {
resolve = Api.ResolveMode.disable;
},
ResolveMatcher.case("bundle") => {
resolve = Api.ResolveMode.bundle;
},
ResolveMatcher.case("dev") => {
resolve = Api.ResolveMode.dev;
},
ResolveMatcher.case("lazy") => {
resolve = Api.ResolveMode.lazy;
},
else => {
diag.name.long = "--resolve";
diag.arg = _resolve;
try diag.report(stderr.writer(), error.InvalidResolveOption);
std.process.exit(1);
},
}
}
var platform: ?Api.Platform = null;
if (args.option("--platform")) |_platform| {
switch (PlatformMatcher.match(_platform)) {
PlatformMatcher.case("browser") => {
platform = Api.Platform.browser;
},
PlatformMatcher.case("node") => {
platform = Api.Platform.node;
},
else => {
diag.name.long = "--platform";
diag.arg = _platform;
try diag.report(stderr.writer(), error.InvalidPlatform);
std.process.exit(1);
},
}
}
var jsx: ?Api.Jsx = null;
if (jsx_factory != null or
jsx_fragment != null or
jsx_import_source != null or
jsx_runtime != null or
jsx_production or react_fast_refresh)
{
var default_factory = "".*;
var default_fragment = "".*;
var default_import_source = "".*;
jsx = Api.Jsx{
.factory = constStrToU8(jsx_factory orelse &default_factory),
.fragment = constStrToU8(jsx_fragment orelse &default_fragment),
.import_source = constStrToU8(jsx_import_source orelse &default_import_source),
.runtime = if (jsx_runtime != null) try resolve_jsx_runtime(jsx_runtime.?) else Api.JsxRuntime.automatic,
.development = !jsx_production,
.react_fast_refresh = react_fast_refresh,
};
}
var javascript_framework: ?Api.FrameworkConfig = null;
if (framework_entry_point) |entry| {
javascript_framework = Api.FrameworkConfig{
.entry_point = entry,
};
}
if (entry_points.len == 0 and javascript_framework == null) {
try clap.help(stderr.writer(), &params);
try diag.report(stderr.writer(), error.MissingEntryPoint);
std.process.exit(1);
}
return Api.TransformOptions{
.jsx = jsx,
.output_dir = output_dir,
.resolve = resolve,
.external = externals,
.absolute_working_dir = cwd,
.tsconfig_override = tsconfig_override,
.public_url = public_url,
.define = .{
.keys = define_keys,
.values = define_values,
},
.loaders = .{
.extensions = loader_keys,
.loaders = loader_values,
},
.node_modules_bundle_path = node_modules_bundle_path,
.public_dir = if (args.option("--public-dir")) |public_dir| allocator.dupe(u8, public_dir) catch unreachable else null,
.write = write,
.serve = serve,
.inject = inject,
.entry_points = entry_points,
.extension_order = args.options("--extension-order"),
.main_fields = args.options("--main-fields"),
.platform = platform,
.only_scan_dependencies = if (args.flag("--scan")) Api.ScanDependencyMode.all else Api.ScanDependencyMode._none,
.generate_node_module_bundle = args.flag("--new-jsb"),
.framework = javascript_framework,
};
}
};
pub fn resolve_jsx_runtime(str: string) !Api.JsxRuntime {
if (strings.eql(str, "automatic")) {
return Api.JsxRuntime.automatic;
} else if (strings.eql(str, "fallback")) {
return Api.JsxRuntime.classic;
} else {
return error.InvalidJSXRuntime;
}
}
pub fn printScanResults(scan_results: bundler.ScanResult.Summary, allocator: *std.mem.Allocator) !void {
var stdout = std.io.getStdOut();
const print_start = std.time.nanoTimestamp();
try std.json.stringify(scan_results.list(), .{}, stdout.writer());
Output.printError("\nJSON printing took: {d}\n", .{std.time.nanoTimestamp() - print_start});
}
pub fn startTransform(allocator: *std.mem.Allocator, args: Api.TransformOptions, log: *logger.Log) anyerror!void {}
pub fn start(allocator: *std.mem.Allocator, stdout: anytype, stderr: anytype, comptime MainPanicHandler: type) anyerror!void {
const start_time = std.time.nanoTimestamp();
var log = logger.Log.init(allocator);
var panicker = MainPanicHandler.init(&log);
MainPanicHandler.Singleton = &panicker;
var args = try Arguments.parse(alloc.static, stdout, stderr);
if (args.entry_points.len == 1 and
args.entry_points[0].len > ".jsb".len and
strings.eqlComptime(args.entry_points[0][args.entry_points[0].len - ".jsb".len ..], ".jsb"))
{
var out_buffer: [std.fs.MAX_PATH_BYTES]u8 = undefined;
var input = try std.fs.openFileAbsolute(try std.os.realpath(args.entry_points[0], &out_buffer), .{ .read = true });
const params = comptime [_]clap.Param(clap.Help){
clap.parseParam("--summary Print a summary") catch unreachable,
clap.parseParam("<POS>... ") catch unreachable,
};
var jsBundleArgs = clap.parse(clap.Help, &params, .{}) catch |err| {
try NodeModuleBundle.printBundle(std.fs.File, input, @TypeOf(stdout), stdout);
return;
};
if (jsBundleArgs.flag("--summary")) {
try NodeModuleBundle.printSummaryFromDisk(std.fs.File, input, @TypeOf(stdout), stdout, allocator);
} else {
try NodeModuleBundle.printBundle(std.fs.File, input, @TypeOf(stdout), stdout);
}
return;
}
if (args.serve orelse false) {
try Server.start(allocator, args);
return;
}
if ((args.only_scan_dependencies orelse ._none) == .all) {
return try printScanResults(try bundler.Bundler.scanDependencies(allocator, &log, args), allocator);
}
if ((args.generate_node_module_bundle orelse false)) {
var this_bundler = try bundler.ServeBundler.init(allocator, &log, args, null);
this_bundler.configureLinker();
var filepath = "node_modules.jsb";
var node_modules = try bundler.ServeBundler.GenerateNodeModuleBundle.generate(&this_bundler, allocator, filepath);
var elapsed = @divTrunc(std.time.nanoTimestamp() - start_time, @as(i128, std.time.ns_per_ms));
var bundle = NodeModuleBundle.init(node_modules, allocator);
bundle.printSummary();
const indent = comptime " ";
Output.prettyln(indent ++ "<d>{d:6}ms elapsed", .{@intCast(u32, elapsed)});
Output.prettyln(indent ++ "<r>Saved to ./{s}", .{filepath});
return;
}
var result: options.TransformResult = undefined;
switch (args.resolve orelse Api.ResolveMode.dev) {
Api.ResolveMode.disable => {
result = try bundler.Transformer.transform(
allocator,
&log,
args,
);
},
.lazy => {
result = try bundler.ServeBundler.bundle(
allocator,
&log,
args,
);
},
else => {
result = try bundler.Bundler.bundle(
allocator,
&log,
args,
);
},
}
var did_write = false;
var stderr_writer = stderr.writer();
var buffered_writer = std.io.bufferedWriter(stderr_writer);
defer buffered_writer.flush() catch {};
var writer = buffered_writer.writer();
var err_writer = writer;
var open_file_limit: usize = 32;
if (args.write) |write| {
if (write) {
const root_dir = result.root_dir orelse unreachable;
if (std.os.getrlimit(.NOFILE)) |limit| {
open_file_limit = limit.cur;
} else |err| {}
var all_paths = try allocator.alloc([]const u8, result.output_files.len);
var max_path_len: usize = 0;
var max_padded_size: usize = 0;
for (result.output_files) |f, i| {
all_paths[i] = f.input.text;
}
var from_path = resolve_path.longestCommonPath(all_paths);
for (result.output_files) |f, i| {
max_path_len = std.math.max(
std.math.max(from_path.len, f.input.text.len) + 2 - from_path.len,
max_path_len,
);
}
did_write = true;
// On POSIX, the OS closes file handles automatically on process exit.
// Explicitly closing them shows up in profiling,
// so skip it unless we would otherwise run out of descriptors.
const do_we_need_to_close = !FeatureFlags.store_file_descriptors or (@intCast(usize, root_dir.fd) + open_file_limit) < result.output_files.len;
var filepath_buf: [std.fs.MAX_PATH_BYTES]u8 = undefined;
filepath_buf[0] = '.';
filepath_buf[1] = '/';
for (result.output_files) |f, i| {
var rel_path: []const u8 = undefined;
switch (f.value) {
// easy mode: write the buffer
.buffer => |value| {
rel_path = resolve_path.relative(from_path, f.input.text);
try root_dir.writeFile(rel_path, value);
},
.move => |value| {
// const primary = f.input.text[from_path.len..];
// std.mem.copy(u8, filepath_buf[2..], primary);
// rel_path = filepath_buf[0 .. primary.len + 2];
rel_path = value.pathname;
// try f.moveTo(result.outbase, constStrToU8(rel_path), root_dir.fd);
},
.copy => |value| {
rel_path = value.pathname;
try f.copyTo(result.outbase, constStrToU8(rel_path), root_dir.fd);
},
.noop => {},
.pending => |value| {
unreachable;
},
}
// Print summary
_ = try writer.write("\n");
const padding_count = 2 + (std.math.max(rel_path.len, max_path_len) - rel_path.len);
try writer.writeByteNTimes(' ', 2);
try writer.writeAll(rel_path);
try writer.writeByteNTimes(' ', padding_count);
const size = @intToFloat(f64, f.size) / 1000.0;
try std.fmt.formatFloatDecimal(size, .{ .precision = 2 }, writer);
try writer.writeAll(" KB\n");
}
}
}
if (isDebug) {
err_writer.print("\nExpr count: {d}\n", .{js_ast.Expr.icount}) catch {};
err_writer.print("Stmt count: {d}\n", .{js_ast.Stmt.icount}) catch {};
err_writer.print("Binding count: {d}\n", .{js_ast.Binding.icount}) catch {};
err_writer.print("File Descriptors: {d} / {d}\n", .{
fs.FileSystem.max_fd,
open_file_limit,
}) catch {};
}
for (result.errors) |err| {
try err.writeFormat(err_writer);
_ = try err_writer.write("\n");
}
for (result.warnings) |err| {
try err.writeFormat(err_writer);
_ = try err_writer.write("\n");
}
const duration = std.time.nanoTimestamp() - start_time;
if (did_write and duration < @as(i128, std.time.ns_per_s) * 2) {
var elapsed = @divTrunc(duration, @as(i128, std.time.ns_per_ms));
try err_writer.print("\nCompleted in {d}ms", .{elapsed});
}
}
};
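The `ColonListType` loader above splits each `key:value` (or `key=value`) argument at whichever separator appears first, so a value like a URL containing ":" still parses correctly when "=" comes earlier. A minimal Python sketch of that splitting rule (the `split_define` helper is hypothetical, not part of the CLI):

```python
def split_define(arg: str):
    # Accept ":" or "=" as the separator, using whichever appears first,
    # so values containing the other character still parse correctly.
    candidates = [i for i in (arg.find(":"), arg.find("=")) if i != -1]
    if not candidates:
        raise ValueError("invalid separator")
    midpoint = min(candidates)
    return arg[:midpoint], arg[midpoint + 1:]

assert split_define("process.env.NODE_ENV:development") == ("process.env.NODE_ENV", "development")
assert split_define("API_URL=http://localhost:3000") == ("API_URL", "http://localhost:3000")
```

Taking the minimum of the two indices is what makes the second case work: the "=" at position 7 wins over the ":" inside the URL.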

src/css_scanner.zig Normal file (1144 lines)

File diff suppressed because it is too large

src/darwin_c.zig Normal file (13 lines)
@@ -0,0 +1,13 @@
usingnamespace @import("std").c;
// int clonefileat(int src_dirfd, const char * src, int dst_dirfd, const char * dst, int flags);
pub extern "c" fn clonefileat(c_int, [*c]const u8, c_int, [*c]const u8, flags: c_int) c_int;
// int fclonefileat(int srcfd, int dst_dirfd, const char * dst, int flags);
pub extern "c" fn fclonefileat(c_int, c_int, [*c]const u8, flags: c_int) c_int;
// int clonefile(const char * src, const char * dst, int flags);
pub extern "c" fn clonefile([*c]const u8, [*c]const u8, flags: c_int) c_int;
pub extern "c" fn chmod([*c]const u8, mode_t) c_int;
pub extern "c" fn fchmod(c_int, mode_t) c_int;
pub extern "c" fn umask(mode_t) mode_t;
pub extern "c" fn fchmodat(c_int, [*c]const u8, mode_t, c_int) c_int;

File diff suppressed because it is too large

@@ -1,12 +1,343 @@
const std = @import("std");
const js_ast = @import("./js_ast.zig");
const alloc = @import("alloc.zig");
const logger = @import("logger.zig");
const js_lexer = @import("js_lexer.zig");
const json_parser = @import("json_parser.zig");
const fs = @import("fs.zig");
usingnamespace @import("global.zig");
usingnamespace @import("ast/base.zig");
const GlobalDefinesKey = @import("./defines-table.zig").GlobalDefinesKey;
pub const defaultIdentifierDefines = comptime {};
const Globals = struct {
pub const Undefined = js_ast.E.Undefined{};
pub const UndefinedPtr = &Globals.Undefined;
pub const IdentifierDefine = struct {};
pub const NaN = js_ast.E.Number{ .value = std.math.nan(f64) };
pub const NanPtr = &Globals.NaN;
pub const DotDefine = struct {};
pub const Infinity = js_ast.E.Number{ .value = std.math.inf(f64) };
pub const InfinityPtr = &Globals.Infinity;
pub const UndefinedData = js_ast.Expr.Data{ .e_undefined = Globals.UndefinedPtr };
pub const NaNData = js_ast.Expr.Data{ .e_number = Globals.NanPtr };
pub const InfinityData = js_ast.Expr.Data{ .e_number = Globals.InfinityPtr };
};
pub const Defines = struct {};
const defines_path = fs.Path.initWithNamespace("defines.json", "internal");
pub const RawDefines = std.StringHashMap(string);
pub const UserDefines = std.StringHashMap(DefineData);
pub const DefineData = struct {
value: js_ast.Expr.Data,
valueless: bool = false,
original_name: ?string = null,
// True if accessing this value is known to not have any side effects. For
// example, a bare reference to "Object.create" can be removed because it
// does not have any observable side effects.
can_be_removed_if_unused: bool = false,
// True if a call to this value is known to not have any side effects. For
// example, a bare call to "Object()" can be removed because it does not
// have any observable side effects.
call_can_be_unwrapped_if_unused: bool = false,
// All the globals have the same behavior.
// So we can create just one struct for it.
pub const GlobalDefineData = DefineData{};
pub fn isUndefined(self: *const DefineData) bool {
return self.valueless;
}
pub fn merge(a: DefineData, b: DefineData) DefineData {
return DefineData{
.value = b.value,
.can_be_removed_if_unused = a.can_be_removed_if_unused,
.call_can_be_unwrapped_if_unused = a.call_can_be_unwrapped_if_unused,
};
}
pub fn from_input(defines: RawDefines, log: *logger.Log, allocator: *std.mem.Allocator) !UserDefines {
var user_defines = UserDefines.init(allocator);
try user_defines.ensureCapacity(defines.count());
var iter = defines.iterator();
while (iter.next()) |entry| {
var splitter = std.mem.split(entry.key_ptr.*, ".");
while (splitter.next()) |part| {
if (!js_lexer.isIdentifier(part)) {
if (strings.eql(part, entry.key_ptr.*)) {
try log.addErrorFmt(null, logger.Loc{}, allocator, "The define key \"{s}\" must be a valid identifier", .{entry.key_ptr.*});
} else {
try log.addErrorFmt(null, logger.Loc{}, allocator, "The define key \"{s}\" contains invalid identifier \"{s}\"", .{ entry.key_ptr.*, part });
}
break;
}
}
if (js_lexer.isIdentifier(entry.value_ptr.*) and !js_lexer.Keywords.has(entry.value_ptr.*)) {
// Special-case "undefined": it's not treated as an identifier here.
// https://github.com/evanw/esbuild/issues/1407
if (strings.eqlComptime(entry.value_ptr.*, "undefined")) {
user_defines.putAssumeCapacity(
entry.key_ptr.*,
DefineData{
.value = js_ast.Expr.Data{ .e_undefined = js_ast.E.Undefined{} },
.original_name = entry.value_ptr.*,
.can_be_removed_if_unused = true,
},
);
} else {
var ident: *js_ast.E.Identifier = try allocator.create(js_ast.E.Identifier);
ident.ref = Ref.None;
ident.can_be_removed_if_unused = true;
user_defines.putAssumeCapacity(
entry.key_ptr.*,
DefineData{
.value = js_ast.Expr.Data{ .e_identifier = ident },
.original_name = entry.value_ptr.*,
.can_be_removed_if_unused = true,
},
);
}
// user_defines.putAssumeCapacity(
// entry.key_ptr,
// DefineData{ .value = js_ast.Expr.Data{.e_identifier = } },
// );
continue;
}
var _log = log;
var source = logger.Source{
.contents = entry.value_ptr.*,
.path = defines_path,
.identifier_name = "defines",
.key_path = fs.Path.initWithNamespace("defines", "internal"),
};
var expr = try json_parser.ParseJSON(&source, _log, allocator);
var data: js_ast.Expr.Data = undefined;
switch (expr.data) {
.e_missing => {
continue;
},
// We must copy so we don't recycle
.e_string => {
data = .{ .e_string = try allocator.create(js_ast.E.String) };
data.e_string.* = try expr.data.e_string.clone(allocator);
},
.e_null, .e_boolean, .e_number => {
data = expr.data;
},
// We must copy so we don't recycle
.e_object => |obj| {
expr.data.e_object = try allocator.create(js_ast.E.Object);
expr.data.e_object.* = obj.*;
data = expr.data;
},
// We must copy so we don't recycle
.e_array => |obj| {
expr.data.e_array = try allocator.create(js_ast.E.Array);
expr.data.e_array.* = obj.*;
data = expr.data;
},
else => {
continue;
},
}
user_defines.putAssumeCapacity(entry.key_ptr.*, DefineData{
.value = data,
});
}
return user_defines;
}
};
fn arePartsEqual(a: []const string, b: []const string) bool {
if (a.len != b.len) {
return false;
}
var i: usize = 0;
while (i < a.len) : (i += 1) {
if (!strings.eql(a[i], b[i])) {
return false;
}
}
return true;
}
pub const IdentifierDefine = DefineData;
pub const DotDefine = struct {
parts: []const string,
data: DefineData,
};
// var nan_val = try allocator.create(js_ast.E.Number);
var nan_val = js_ast.E.Number{ .value = std.math.nan_f64 };
var inf_val = js_ast.E.Number{ .value = std.math.inf_f64 };
pub const Define = struct {
identifiers: std.StringHashMap(IdentifierDefine),
dots: std.StringHashMap([]DotDefine),
allocator: *std.mem.Allocator,
pub fn init(allocator: *std.mem.Allocator, _user_defines: ?UserDefines) !*@This() {
var define = try allocator.create(Define);
define.allocator = allocator;
define.identifiers = std.StringHashMap(IdentifierDefine).init(allocator);
define.dots = std.StringHashMap([]DotDefine).init(allocator);
try define.identifiers.ensureCapacity(641);
try define.dots.ensureCapacity(64);
var val = js_ast.Expr.Data{ .e_undefined = .{} };
var ident_define = IdentifierDefine{
.value = val,
};
var value_define = DefineData{ .value = val, .valueless = true };
// Step 1. Load the globals into the hash tables
for (GlobalDefinesKey) |global| {
if (global.len == 1) {
// TODO: when https://github.com/ziglang/zig/pull/8596 is merged, switch to putAssumeCapacityNoClobber
define.identifiers.putAssumeCapacity(global[0], value_define);
} else {
const key = global[global.len - 1];
// TODO: move this to comptime
// TODO: when https://github.com/ziglang/zig/pull/8596 is merged, switch to putAssumeCapacityNoClobber
if (define.dots.getEntry(key)) |entry| {
var list = try std.ArrayList(DotDefine).initCapacity(allocator, entry.value_ptr.*.len + 1);
list.appendSliceAssumeCapacity(entry.value_ptr.*);
list.appendAssumeCapacity(DotDefine{
.parts = global[0..global.len],
.data = value_define,
});
define.dots.putAssumeCapacity(key, list.toOwnedSlice());
} else {
var list = try std.ArrayList(DotDefine).initCapacity(allocator, 1);
list.appendAssumeCapacity(DotDefine{
.parts = global[0..global.len],
.data = value_define,
});
define.dots.putAssumeCapacity(key, list.toOwnedSlice());
}
}
}
// Step 2. Swap in certain literal values because those can be constant folded
define.identifiers.putAssumeCapacity("undefined", value_define);
define.identifiers.putAssumeCapacity("NaN", DefineData{
.value = js_ast.Expr.Data{ .e_number = &nan_val },
});
define.identifiers.putAssumeCapacity("Infinity", DefineData{
.value = js_ast.Expr.Data{ .e_number = &inf_val },
});
// Step 3. Load user data into hash tables
// At this stage, user data has already been validated.
if (_user_defines) |user_defines| {
var iter = user_defines.iterator();
while (iter.next()) |user_define| {
const user_define_key = user_define.key_ptr.*;
// If it has a dot, then it's a DotDefine.
// e.g. process.env.NODE_ENV
if (strings.lastIndexOfChar(user_define_key, '.')) |last_dot| {
const tail = user_define_key[last_dot + 1 .. user_define_key.len];
const remainder = user_define_key[0..last_dot];
const count = std.mem.count(u8, remainder, ".") + 1;
var parts = try allocator.alloc(string, count + 1);
var splitter = std.mem.split(remainder, ".");
var i: usize = 0;
while (splitter.next()) |split| : (i += 1) {
parts[i] = split;
}
parts[i] = tail;
var didFind = false;
var initial_values: []DotDefine = &([_]DotDefine{});
// "NODE_ENV"
if (define.dots.getEntry(tail)) |entry| {
for (entry.value_ptr.*) |*part| {
// ["process", "env"] === ["process", "env"] (if that actually worked)
if (arePartsEqual(part.parts, parts)) {
part.data = part.data.merge(user_define.value_ptr.*);
didFind = true;
break;
}
}
initial_values = entry.value_ptr.*;
}
if (!didFind) {
var list = try std.ArrayList(DotDefine).initCapacity(allocator, initial_values.len + 1);
if (initial_values.len > 0) {
list.appendSliceAssumeCapacity(initial_values);
}
list.appendAssumeCapacity(DotDefine{
.data = user_define.value_ptr.*,
// TODO: do we need to allocate this?
.parts = parts,
});
try define.dots.put(tail, list.toOwnedSlice());
}
} else {
// e.g. IS_BROWSER
try define.identifiers.put(user_define_key, user_define.value_ptr.*);
}
}
}
return define;
}
};
const expect = std.testing.expect;
test "UserDefines" {
try alloc.setup(std.heap.page_allocator);
var orig = RawDefines.init(alloc.dynamic);
try orig.put("process.env.NODE_ENV", "\"development\"");
try orig.put("globalThis", "window");
var log = logger.Log.init(alloc.dynamic);
var data = try DefineData.from_input(orig, &log, alloc.dynamic);
expect(data.contains("process.env.NODE_ENV"));
expect(data.contains("globalThis"));
const globalThis = data.get("globalThis");
const val = data.get("process.env.NODE_ENV");
expect(val != null);
expect(strings.utf16EqlString(val.?.value.e_string.value, "development"));
std.testing.expectEqualStrings(globalThis.?.original_name.?, "window");
}
// Last measured upper bound for this test: ~396,000ns (0.396ms).
test "Defines" {
try alloc.setup(std.heap.page_allocator);
const start = std.time.nanoTimestamp();
var orig = RawDefines.init(alloc.dynamic);
try orig.put("process.env.NODE_ENV", "\"development\"");
var log = logger.Log.init(alloc.dynamic);
var data = try DefineData.from_input(orig, &log, alloc.dynamic);
var defines = try Define.init(alloc.dynamic, data);
Output.print("Time: {d}", .{std.time.nanoTimestamp() - start});
const node_env_dots = defines.dots.get("NODE_ENV");
expect(node_env_dots != null);
expect(node_env_dots.?.len > 0);
const node_env = node_env_dots.?[0];
std.testing.expectEqual(node_env.parts.len, 2);
std.testing.expectEqualStrings("process", node_env.parts[0]);
std.testing.expectEqualStrings("env", node_env.parts[1]);
expect(node_env.data.original_name == null);
expect(strings.utf16EqlString(node_env.data.value.e_string.value, "development"));
}
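`Define.init` above indexes dotted defines by their final segment: "process.env.NODE_ENV" is stored under the tail "NODE_ENV", with the full dotted path kept as `parts` for exact matching, while dot-free keys become plain identifier defines. A small Python sketch of that key-splitting step (the `dot_define_parts` helper is illustrative, not part of the codebase):

```python
def dot_define_parts(key: str):
    # Dotted defines are indexed by their last segment ("tail"),
    # with the full dotted path stored as parts for exact matching.
    last_dot = key.rfind(".")
    if last_dot == -1:
        return None  # plain identifier define, e.g. "IS_BROWSER"
    tail = key[last_dot + 1:]
    parts = key[:last_dot].split(".") + [tail]
    return tail, parts

assert dot_define_parts("process.env.NODE_ENV") == ("NODE_ENV", ["process", "env", "NODE_ENV"])
assert dot_define_parts("IS_BROWSER") is None
```

Keying on the tail means the parser can cheaply look up candidate dot-defines by the last identifier it sees, then confirm the full path against `parts`.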

src/deps/picohttp.zig Normal file (205 lines)
@@ -0,0 +1,205 @@
const std = @import("std");
const c = @cImport(@cInclude("picohttpparser.h"));
const ExactSizeMatcher = @import("../exact_size_matcher.zig").ExactSizeMatcher;
const Match = ExactSizeMatcher(2);
const fmt = std.fmt;
const assert = std.debug.assert;
pub const Header = struct {
name: []const u8,
value: []const u8,
pub fn isMultiline(self: Header) bool {
return @ptrToInt(self.name.ptr) == 0;
}
pub fn format(self: Header, comptime layout: []const u8, opts: fmt.FormatOptions, writer: anytype) !void {
if (self.isMultiline()) {
try fmt.format(writer, "{s}", .{self.value});
} else {
try fmt.format(writer, "{s}: {s}", .{ self.name, self.value });
}
}
comptime {
assert(@sizeOf(Header) == @sizeOf(c.phr_header));
assert(@alignOf(Header) == @alignOf(c.phr_header));
}
};
pub const Request = struct {
method: []const u8,
path: []const u8,
minor_version: usize,
headers: []const Header,
pub fn parse(buf: []const u8, src: []Header) !Request {
var method: []const u8 = undefined;
var path: []const u8 = undefined;
var minor_version: c_int = undefined;
var num_headers: usize = src.len;
const rc = c.phr_parse_request(
buf.ptr,
buf.len,
@ptrCast([*c][*c]const u8, &method.ptr),
&method.len,
@ptrCast([*c][*c]const u8, &path.ptr),
&path.len,
&minor_version,
@ptrCast([*c]c.phr_header, src.ptr),
&num_headers,
0,
);
// Leave a sentinel value, for JavaScriptCore support.
@intToPtr([*]u8, @ptrToInt(path.ptr))[path.len] = 0;
return switch (rc) {
-1 => error.BadRequest,
-2 => error.ShortRead,
else => |bytes_read| Request{
.method = method,
.path = path,
.minor_version = @intCast(usize, minor_version),
.headers = src[0..num_headers],
},
};
}
};
test "pico_http: parse request" {
const REQ = "GET /wp-content/uploads/2010/03/hello-kitty-darth-vader-pink.jpg HTTP/1.1\r\n" ++
"Host: www.kittyhell.com\r\n" ++
"User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; ja-JP-mac; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 " ++
"Pathtraq/0.9\r\n" ++
"Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n" ++
"Accept-Language: ja,en-us;q=0.7,en;q=0.3\r\n" ++
"Accept-Encoding: gzip,deflate\r\n" ++
"Accept-Charset: Shift_JIS,utf-8;q=0.7,*;q=0.7\r\n" ++
"Keep-Alive: 115\r\n" ++
"Connection: keep-alive\r\n" ++
"TestMultiline: Hello world\r\n" ++
" This is a second line in the header!\r\n" ++
"Cookie: wp_ozh_wsa_visits=2; wp_ozh_wsa_visit_lasttime=xxxxxxxxxx; " ++
"__utma=xxxxxxxxx.xxxxxxxxxx.xxxxxxxxxx.xxxxxxxxxx.xxxxxxxxxx.x; " ++
"__utmz=xxxxxxxxx.xxxxxxxxxx.x.x.utmccn=(referral)|utmcsr=reader.livedoor.com|utmcct=/reader/|utmcmd=referral\r\n" ++
"\r\n";
var headers: [32]Header = undefined;
const req = try Request.parse(REQ, &headers);
std.debug.print("Method: {s}\n", .{req.method});
std.debug.print("Path: {s}\n", .{req.path});
std.debug.print("Minor Version: {}\n", .{req.minor_version});
for (req.headers) |header| {
std.debug.print("{}\n", .{header});
}
}
pub const Response = struct {
minor_version: usize,
status_code: usize,
status: []const u8,
headers: []const Header,
pub fn parse(buf: []const u8, src: []Header) !Response {
var minor_version: c_int = undefined;
var status_code: c_int = undefined;
var status: []const u8 = undefined;
var num_headers: usize = src.len;
const rc = c.phr_parse_response(
buf.ptr,
buf.len,
&minor_version,
&status_code,
@ptrCast([*c][*c]const u8, &status.ptr),
&status.len,
@ptrCast([*c]c.phr_header, src.ptr),
&num_headers,
0,
);
return switch (rc) {
-1 => error.BadResponse,
-2 => error.ShortRead,
else => |bytes_read| Response{
.minor_version = @intCast(usize, minor_version),
.status_code = @intCast(usize, status_code),
.status = status,
.headers = src[0..num_headers],
},
};
}
};
test "pico_http: parse response" {
const RES = "HTTP/1.1 200 OK\r\n" ++
"Date: Mon, 22 Mar 2021 08:15:54 GMT\r\n" ++
"Content-Type: text/html; charset=utf-8\r\n" ++
"Content-Length: 9593\r\n" ++
"Connection: keep-alive\r\n" ++
"Server: gunicorn/19.9.0\r\n" ++
"Access-Control-Allow-Origin: *\r\n" ++
"Access-Control-Allow-Credentials: true\r\n" ++
"\r\n";
var headers: [32]Header = undefined;
const res = try Response.parse(RES, &headers);
std.debug.print("Minor Version: {}\n", .{res.minor_version});
std.debug.print("Status Code: {}\n", .{res.status_code});
std.debug.print("Status: {s}\n", .{res.status});
for (res.headers) |header| {
std.debug.print("{}\n", .{header});
}
}
pub const Headers = struct {
headers: []const Header,
pub fn parse(buf: []const u8, src: []Header) !Headers {
var num_headers: usize = src.len;
const rc = c.phr_parse_headers(
buf.ptr,
buf.len,
@ptrCast([*c]c.phr_header, src.ptr),
@ptrCast([*c]usize, &num_headers),
0,
);
return switch (rc) {
-1 => error.BadHeaders,
-2 => error.ShortRead,
else => |bytes_read| Headers{
.headers = src[0..num_headers],
},
};
}
};
test "pico_http: parse headers" {
const HEADERS = "Date: Mon, 22 Mar 2021 08:15:54 GMT\r\n" ++
"Content-Type: text/html; charset=utf-8\r\n" ++
"Content-Length: 9593\r\n" ++
"Connection: keep-alive\r\n" ++
"Server: gunicorn/19.9.0\r\n" ++
"Access-Control-Allow-Origin: *\r\n" ++
"Access-Control-Allow-Credentials: true\r\n" ++
"\r\n";
var headers: [32]Header = undefined;
const result = try Headers.parse(HEADERS, &headers);
for (result.headers) |header| {
std.debug.print("{}\n", .{header});
}
}

665
src/deps/picohttpparser.c Normal file

@@ -0,0 +1,665 @@
/*
* Copyright (c) 2009-2014 Kazuho Oku, Tokuhiro Matsuno, Daisuke Murase,
* Shigeo Mitsunari
*
* The software is licensed under either the MIT License (below) or the Perl
* license.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to
* deal in the Software without restriction, including without limitation the
* rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
* sell copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
* FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
* IN THE SOFTWARE.
*/
#include <assert.h>
#include <stddef.h>
#include <string.h>
#ifdef __SSE4_2__
#ifdef _MSC_VER
#include <nmmintrin.h>
#else
#include <x86intrin.h>
#endif
#endif
#include "picohttpparser.h"
#if __GNUC__ >= 3
#define likely(x) __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)
#else
#define likely(x) (x)
#define unlikely(x) (x)
#endif
#ifdef _MSC_VER
#define ALIGNED(n) _declspec(align(n))
#else
#define ALIGNED(n) __attribute__((aligned(n)))
#endif
#define IS_PRINTABLE_ASCII(c) ((unsigned char)(c)-040u < 0137u)
#define CHECK_EOF() \
if (buf == buf_end) { \
*ret = -2; \
return NULL; \
}
#define EXPECT_CHAR_NO_CHECK(ch) \
if (*buf++ != ch) { \
*ret = -1; \
return NULL; \
}
#define EXPECT_CHAR(ch) \
CHECK_EOF(); \
EXPECT_CHAR_NO_CHECK(ch);
#define ADVANCE_TOKEN(tok, toklen) \
do { \
const char *tok_start = buf; \
static const char ALIGNED(16) ranges2[16] = "\000\040\177\177"; \
int found2; \
buf = findchar_fast(buf, buf_end, ranges2, 4, &found2); \
if (!found2) { \
CHECK_EOF(); \
} \
while (1) { \
if (*buf == ' ') { \
break; \
} else if (unlikely(!IS_PRINTABLE_ASCII(*buf))) { \
if ((unsigned char)*buf < '\040' || *buf == '\177') { \
*ret = -1; \
return NULL; \
} \
} \
++buf; \
CHECK_EOF(); \
} \
tok = tok_start; \
toklen = buf - tok_start; \
} while (0)
static const char *token_char_map = "\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"
"\0\1\0\1\1\1\1\1\0\0\1\1\0\1\1\0\1\1\1\1\1\1\1\1\1\1\0\0\0\0\0\0"
"\0\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\0\0\0\1\1"
"\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\1\0\1\0\1\0"
"\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"
"\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"
"\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"
"\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0";
static const char *findchar_fast(const char *buf, const char *buf_end, const char *ranges, size_t ranges_size, int *found)
{
*found = 0;
#if __SSE4_2__
if (likely(buf_end - buf >= 16)) {
__m128i ranges16 = _mm_loadu_si128((const __m128i *)ranges);
size_t left = (buf_end - buf) & ~15;
do {
__m128i b16 = _mm_loadu_si128((const __m128i *)buf);
int r = _mm_cmpestri(ranges16, ranges_size, b16, 16, _SIDD_LEAST_SIGNIFICANT | _SIDD_CMP_RANGES | _SIDD_UBYTE_OPS);
if (unlikely(r != 16)) {
buf += r;
*found = 1;
break;
}
buf += 16;
left -= 16;
} while (likely(left != 0));
}
#else
/* suppress unused parameter warning */
(void)buf_end;
(void)ranges;
(void)ranges_size;
#endif
return buf;
}
static const char *get_token_to_eol(const char *buf, const char *buf_end, const char **token, size_t *token_len, int *ret)
{
const char *token_start = buf;
#ifdef __SSE4_2__
static const char ALIGNED(16) ranges1[16] = "\0\010" /* allow HT */
"\012\037" /* allow SP and up to but not including DEL */
"\177\177"; /* allow chars w. MSB set */
int found;
buf = findchar_fast(buf, buf_end, ranges1, 6, &found);
if (found)
goto FOUND_CTL;
#else
/* find non-printable char within the next 8 bytes; this is the hottest code, manually inlined */
while (likely(buf_end - buf >= 8)) {
#define DOIT() \
do { \
if (unlikely(!IS_PRINTABLE_ASCII(*buf))) \
goto NonPrintable; \
++buf; \
} while (0)
DOIT();
DOIT();
DOIT();
DOIT();
DOIT();
DOIT();
DOIT();
DOIT();
#undef DOIT
continue;
NonPrintable:
if ((likely((unsigned char)*buf < '\040') && likely(*buf != '\011')) || unlikely(*buf == '\177')) {
goto FOUND_CTL;
}
++buf;
}
#endif
for (;; ++buf) {
CHECK_EOF();
if (unlikely(!IS_PRINTABLE_ASCII(*buf))) {
if ((likely((unsigned char)*buf < '\040') && likely(*buf != '\011')) || unlikely(*buf == '\177')) {
goto FOUND_CTL;
}
}
}
FOUND_CTL:
if (likely(*buf == '\015')) {
++buf;
EXPECT_CHAR('\012');
*token_len = buf - 2 - token_start;
} else if (*buf == '\012') {
*token_len = buf - token_start;
++buf;
} else {
*ret = -1;
return NULL;
}
*token = token_start;
return buf;
}
static const char *is_complete(const char *buf, const char *buf_end, size_t last_len, int *ret)
{
int ret_cnt = 0;
buf = last_len < 3 ? buf : buf + last_len - 3;
while (1) {
CHECK_EOF();
if (*buf == '\015') {
++buf;
CHECK_EOF();
EXPECT_CHAR('\012');
++ret_cnt;
} else if (*buf == '\012') {
++buf;
++ret_cnt;
} else {
++buf;
ret_cnt = 0;
}
if (ret_cnt == 2) {
return buf;
}
}
*ret = -2;
return NULL;
}
#define PARSE_INT(valp_, mul_) \
if (*buf < '0' || '9' < *buf) { \
buf++; \
*ret = -1; \
return NULL; \
} \
*(valp_) = (mul_) * (*buf++ - '0');
#define PARSE_INT_3(valp_) \
do { \
int res_ = 0; \
PARSE_INT(&res_, 100) \
*valp_ = res_; \
PARSE_INT(&res_, 10) \
*valp_ += res_; \
PARSE_INT(&res_, 1) \
*valp_ += res_; \
} while (0)
/* returned pointer is always within [buf, buf_end), or null */
static const char *parse_token(const char *buf, const char *buf_end, const char **token, size_t *token_len, char next_char,
int *ret)
{
    /* We use pcmpestri to detect non-token characters. This instruction can take no more than eight character ranges (8*2*8 = 128
     * bits, the size of an SSE register). Due to this restriction, characters `|` and `~` are handled in the slow loop. */
static const char ALIGNED(16) ranges[] = "\x00 " /* control chars and up to SP */
"\"\"" /* 0x22 */
"()" /* 0x28,0x29 */
",," /* 0x2c */
"//" /* 0x2f */
":@" /* 0x3a-0x40 */
"[]" /* 0x5b-0x5d */
"{\xff"; /* 0x7b-0xff */
const char *buf_start = buf;
int found;
buf = findchar_fast(buf, buf_end, ranges, sizeof(ranges) - 1, &found);
if (!found) {
CHECK_EOF();
}
while (1) {
if (*buf == next_char) {
break;
} else if (!token_char_map[(unsigned char)*buf]) {
*ret = -1;
return NULL;
}
++buf;
CHECK_EOF();
}
*token = buf_start;
*token_len = buf - buf_start;
return buf;
}
/* returned pointer is always within [buf, buf_end), or null */
static const char *parse_http_version(const char *buf, const char *buf_end, int *minor_version, int *ret)
{
/* we want at least [HTTP/1.<two chars>] to try to parse */
if (buf_end - buf < 9) {
*ret = -2;
return NULL;
}
EXPECT_CHAR_NO_CHECK('H');
EXPECT_CHAR_NO_CHECK('T');
EXPECT_CHAR_NO_CHECK('T');
EXPECT_CHAR_NO_CHECK('P');
EXPECT_CHAR_NO_CHECK('/');
EXPECT_CHAR_NO_CHECK('1');
EXPECT_CHAR_NO_CHECK('.');
PARSE_INT(minor_version, 1);
return buf;
}
static const char *parse_headers(const char *buf, const char *buf_end, struct phr_header *headers, size_t *num_headers,
size_t max_headers, int *ret)
{
for (;; ++*num_headers) {
CHECK_EOF();
if (*buf == '\015') {
++buf;
EXPECT_CHAR('\012');
break;
} else if (*buf == '\012') {
++buf;
break;
}
if (*num_headers == max_headers) {
*ret = -1;
return NULL;
}
if (!(*num_headers != 0 && (*buf == ' ' || *buf == '\t'))) {
/* parsing name, but do not discard SP before colon, see
* http://www.mozilla.org/security/announce/2006/mfsa2006-33.html */
if ((buf = parse_token(buf, buf_end, &headers[*num_headers].name, &headers[*num_headers].name_len, ':', ret)) == NULL) {
return NULL;
}
if (headers[*num_headers].name_len == 0) {
*ret = -1;
return NULL;
}
++buf;
for (;; ++buf) {
CHECK_EOF();
if (!(*buf == ' ' || *buf == '\t')) {
break;
}
}
} else {
headers[*num_headers].name = NULL;
headers[*num_headers].name_len = 0;
}
const char *value;
size_t value_len;
if ((buf = get_token_to_eol(buf, buf_end, &value, &value_len, ret)) == NULL) {
return NULL;
}
/* remove trailing SPs and HTABs */
const char *value_end = value + value_len;
for (; value_end != value; --value_end) {
const char c = *(value_end - 1);
if (!(c == ' ' || c == '\t')) {
break;
}
}
headers[*num_headers].value = value;
headers[*num_headers].value_len = value_end - value;
}
return buf;
}
static const char *parse_request(const char *buf, const char *buf_end, const char **method, size_t *method_len, const char **path,
size_t *path_len, int *minor_version, struct phr_header *headers, size_t *num_headers,
size_t max_headers, int *ret)
{
/* skip first empty line (some clients add CRLF after POST content) */
CHECK_EOF();
if (*buf == '\015') {
++buf;
EXPECT_CHAR('\012');
} else if (*buf == '\012') {
++buf;
}
/* parse request line */
if ((buf = parse_token(buf, buf_end, method, method_len, ' ', ret)) == NULL) {
return NULL;
}
do {
++buf;
CHECK_EOF();
} while (*buf == ' ');
ADVANCE_TOKEN(*path, *path_len);
do {
++buf;
CHECK_EOF();
} while (*buf == ' ');
if (*method_len == 0 || *path_len == 0) {
*ret = -1;
return NULL;
}
if ((buf = parse_http_version(buf, buf_end, minor_version, ret)) == NULL) {
return NULL;
}
if (*buf == '\015') {
++buf;
EXPECT_CHAR('\012');
} else if (*buf == '\012') {
++buf;
} else {
*ret = -1;
return NULL;
}
return parse_headers(buf, buf_end, headers, num_headers, max_headers, ret);
}
int phr_parse_request(const char *buf_start, size_t len, const char **method, size_t *method_len, const char **path,
size_t *path_len, int *minor_version, struct phr_header *headers, size_t *num_headers, size_t last_len)
{
const char *buf = buf_start, *buf_end = buf_start + len;
size_t max_headers = *num_headers;
int r;
*method = NULL;
*method_len = 0;
*path = NULL;
*path_len = 0;
*minor_version = -1;
*num_headers = 0;
    /* if last_len != 0, check if the request is complete (a fast countermeasure
       against slowloris) */
if (last_len != 0 && is_complete(buf, buf_end, last_len, &r) == NULL) {
return r;
}
if ((buf = parse_request(buf, buf_end, method, method_len, path, path_len, minor_version, headers, num_headers, max_headers,
&r)) == NULL) {
return r;
}
return (int)(buf - buf_start);
}
static const char *parse_response(const char *buf, const char *buf_end, int *minor_version, int *status, const char **msg,
size_t *msg_len, struct phr_header *headers, size_t *num_headers, size_t max_headers, int *ret)
{
/* parse "HTTP/1.x" */
if ((buf = parse_http_version(buf, buf_end, minor_version, ret)) == NULL) {
return NULL;
}
/* skip space */
if (*buf != ' ') {
*ret = -1;
return NULL;
}
do {
++buf;
CHECK_EOF();
} while (*buf == ' ');
/* parse status code, we want at least [:digit:][:digit:][:digit:]<other char> to try to parse */
if (buf_end - buf < 4) {
*ret = -2;
return NULL;
}
PARSE_INT_3(status);
/* get message including preceding space */
if ((buf = get_token_to_eol(buf, buf_end, msg, msg_len, ret)) == NULL) {
return NULL;
}
if (*msg_len == 0) {
/* ok */
} else if (**msg == ' ') {
/* Remove preceding space. Successful return from `get_token_to_eol` guarantees that we would hit something other than SP
* before running past the end of the given buffer. */
do {
++*msg;
--*msg_len;
} while (**msg == ' ');
} else {
/* garbage found after status code */
*ret = -1;
return NULL;
}
return parse_headers(buf, buf_end, headers, num_headers, max_headers, ret);
}
int phr_parse_response(const char *buf_start, size_t len, int *minor_version, int *status, const char **msg, size_t *msg_len,
struct phr_header *headers, size_t *num_headers, size_t last_len)
{
const char *buf = buf_start, *buf_end = buf + len;
size_t max_headers = *num_headers;
int r;
*minor_version = -1;
*status = 0;
*msg = NULL;
*msg_len = 0;
*num_headers = 0;
/* if last_len != 0, check if the response is complete (a fast countermeasure
       against slowloris) */
if (last_len != 0 && is_complete(buf, buf_end, last_len, &r) == NULL) {
return r;
}
if ((buf = parse_response(buf, buf_end, minor_version, status, msg, msg_len, headers, num_headers, max_headers, &r)) == NULL) {
return r;
}
return (int)(buf - buf_start);
}
int phr_parse_headers(const char *buf_start, size_t len, struct phr_header *headers, size_t *num_headers, size_t last_len)
{
const char *buf = buf_start, *buf_end = buf + len;
size_t max_headers = *num_headers;
int r;
*num_headers = 0;
/* if last_len != 0, check if the response is complete (a fast countermeasure
       against slowloris) */
if (last_len != 0 && is_complete(buf, buf_end, last_len, &r) == NULL) {
return r;
}
if ((buf = parse_headers(buf, buf_end, headers, num_headers, max_headers, &r)) == NULL) {
return r;
}
return (int)(buf - buf_start);
}
enum {
CHUNKED_IN_CHUNK_SIZE,
CHUNKED_IN_CHUNK_EXT,
CHUNKED_IN_CHUNK_DATA,
CHUNKED_IN_CHUNK_CRLF,
CHUNKED_IN_TRAILERS_LINE_HEAD,
CHUNKED_IN_TRAILERS_LINE_MIDDLE
};
static int decode_hex(int ch)
{
if ('0' <= ch && ch <= '9') {
return ch - '0';
} else if ('A' <= ch && ch <= 'F') {
return ch - 'A' + 0xa;
} else if ('a' <= ch && ch <= 'f') {
return ch - 'a' + 0xa;
} else {
return -1;
}
}
ssize_t phr_decode_chunked(struct phr_chunked_decoder *decoder, char *buf, size_t *_bufsz)
{
size_t dst = 0, src = 0, bufsz = *_bufsz;
ssize_t ret = -2; /* incomplete */
while (1) {
switch (decoder->_state) {
case CHUNKED_IN_CHUNK_SIZE:
for (;; ++src) {
int v;
if (src == bufsz)
goto Exit;
if ((v = decode_hex(buf[src])) == -1) {
if (decoder->_hex_count == 0) {
ret = -1;
goto Exit;
}
break;
}
if (decoder->_hex_count == sizeof(size_t) * 2) {
ret = -1;
goto Exit;
}
decoder->bytes_left_in_chunk = decoder->bytes_left_in_chunk * 16 + v;
++decoder->_hex_count;
}
decoder->_hex_count = 0;
decoder->_state = CHUNKED_IN_CHUNK_EXT;
/* fallthru */
case CHUNKED_IN_CHUNK_EXT:
/* RFC 7230 A.2 "Line folding in chunk extensions is disallowed" */
for (;; ++src) {
if (src == bufsz)
goto Exit;
if (buf[src] == '\012')
break;
}
++src;
if (decoder->bytes_left_in_chunk == 0) {
if (decoder->consume_trailer) {
decoder->_state = CHUNKED_IN_TRAILERS_LINE_HEAD;
break;
} else {
goto Complete;
}
}
decoder->_state = CHUNKED_IN_CHUNK_DATA;
/* fallthru */
case CHUNKED_IN_CHUNK_DATA: {
size_t avail = bufsz - src;
if (avail < decoder->bytes_left_in_chunk) {
if (dst != src)
memmove(buf + dst, buf + src, avail);
src += avail;
dst += avail;
decoder->bytes_left_in_chunk -= avail;
goto Exit;
}
if (dst != src)
memmove(buf + dst, buf + src, decoder->bytes_left_in_chunk);
src += decoder->bytes_left_in_chunk;
dst += decoder->bytes_left_in_chunk;
decoder->bytes_left_in_chunk = 0;
decoder->_state = CHUNKED_IN_CHUNK_CRLF;
}
/* fallthru */
case CHUNKED_IN_CHUNK_CRLF:
for (;; ++src) {
if (src == bufsz)
goto Exit;
if (buf[src] != '\015')
break;
}
if (buf[src] != '\012') {
ret = -1;
goto Exit;
}
++src;
decoder->_state = CHUNKED_IN_CHUNK_SIZE;
break;
case CHUNKED_IN_TRAILERS_LINE_HEAD:
for (;; ++src) {
if (src == bufsz)
goto Exit;
if (buf[src] != '\015')
break;
}
if (buf[src++] == '\012')
goto Complete;
decoder->_state = CHUNKED_IN_TRAILERS_LINE_MIDDLE;
/* fallthru */
case CHUNKED_IN_TRAILERS_LINE_MIDDLE:
for (;; ++src) {
if (src == bufsz)
goto Exit;
if (buf[src] == '\012')
break;
}
++src;
decoder->_state = CHUNKED_IN_TRAILERS_LINE_HEAD;
break;
default:
assert(!"decoder is corrupt");
}
}
Complete:
ret = bufsz - src;
Exit:
if (dst != src)
memmove(buf + dst, buf + src, bufsz - src);
*_bufsz = dst;
return ret;
}
int phr_decode_chunked_is_in_data(struct phr_chunked_decoder *decoder)
{
return decoder->_state == CHUNKED_IN_CHUNK_DATA;
}
#undef CHECK_EOF
#undef EXPECT_CHAR
#undef ADVANCE_TOKEN

87
src/deps/picohttpparser.h Normal file

@@ -0,0 +1,87 @@
/*
* Copyright (c) 2009-2014 Kazuho Oku, Tokuhiro Matsuno, Daisuke Murase,
* Shigeo Mitsunari
*
* The software is licensed under either the MIT License (below) or the Perl
* license.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to
* deal in the Software without restriction, including without limitation the
* rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
* sell copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
* FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
* IN THE SOFTWARE.
*/
#ifndef picohttpparser_h
#define picohttpparser_h
#include <sys/types.h>
#ifdef _MSC_VER
#define ssize_t intptr_t
#endif
#ifdef __cplusplus
extern "C" {
#endif
/* contains the name and value of a header (name == NULL if it is a continuing
 * line of a multiline header) */
struct phr_header {
const char *name;
size_t name_len;
const char *value;
size_t value_len;
};
/* returns number of bytes consumed if successful, -2 if request is partial,
* -1 if failed */
int phr_parse_request(const char *buf, size_t len, const char **method, size_t *method_len, const char **path, size_t *path_len,
int *minor_version, struct phr_header *headers, size_t *num_headers, size_t last_len);
/* ditto */
int phr_parse_response(const char *_buf, size_t len, int *minor_version, int *status, const char **msg, size_t *msg_len,
struct phr_header *headers, size_t *num_headers, size_t last_len);
/* ditto */
int phr_parse_headers(const char *buf, size_t len, struct phr_header *headers, size_t *num_headers, size_t last_len);
/* should be zero-filled before start */
struct phr_chunked_decoder {
size_t bytes_left_in_chunk; /* number of bytes left in current chunk */
char consume_trailer; /* if trailing headers should be consumed */
char _hex_count;
char _state;
};
/* the function rewrites the buffer given as (buf, bufsz) removing the chunked-
* encoding headers. When the function returns without an error, bufsz is
* updated to the length of the decoded data available. Applications should
* repeatedly call the function while it returns -2 (incomplete) every time
* supplying newly arrived data. If the end of the chunked-encoded data is
* found, the function returns a non-negative number indicating the number of
* octets left undecoded, that starts from the offset returned by `*bufsz`.
* Returns -1 on error.
*/
ssize_t phr_decode_chunked(struct phr_chunked_decoder *decoder, char *buf, size_t *bufsz);
/* returns if the chunked decoder is in middle of chunked data */
int phr_decode_chunked_is_in_data(struct phr_chunked_decoder *decoder);
#ifdef __cplusplus
}
#endif
#endif

Binary file not shown.

Some files were not shown because too many files have changed in this diff.