Compare commits


6 Commits

Author SHA1 Message Date
Claude Bot
b798e9f921 Merge main into claude/publishconfig-directory-support
Resolved merge conflicts in pack_command.zig by:
- Adopting main's bin handling refactor (processing bins before files)
- Preserving feature branch's use of actual_json.root for publishConfig.directory support
- Removing duplicate bin processing code

The key change ensures bins are read from actual_json.root (the package.json
in publishConfig.directory if set) rather than json.root (workspace package.json).
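
A minimal TypeScript sketch of that distinction (the real code is Zig in pack_command.zig; the Manifest shape and resolveBins helper here are illustrative, not Bun internals):

```ts
import { readFileSync } from "node:fs";
import { join } from "node:path";

// Illustrative shape of the fields involved; not Bun's internal types.
interface Manifest {
  bin?: string | Record<string, string>;
  publishConfig?: { directory?: string };
}

function loadManifest(path: string): Manifest {
  return JSON.parse(readFileSync(path, "utf8")) as Manifest;
}

// json.root is the workspace package.json; actual_json.root is the package.json
// inside publishConfig.directory (when set). Bins must come from the latter.
function resolveBins(workspaceRoot: string) {
  const workspaceManifest = loadManifest(join(workspaceRoot, "package.json"));
  const dir = workspaceManifest.publishConfig?.directory;
  const actualManifest = dir
    ? loadManifest(join(workspaceRoot, dir, "package.json"))
    : workspaceManifest;
  return actualManifest.bin;
}
```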

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-03 10:21:05 +00:00
Claude Bot
a87c40d30d Address additional CodeRabbit review comments
Fixes two issues identified in review:

1. PathBuffer aliasing (comment #2428232461):
   - Use separate buffers (norm_buf, join_buf1, join_buf2) to avoid
     input/output overlap in path operations
   - Validate ".." before normalization to catch parent traversal attempts
     (normalizeBuf resolves ".." away, so we must check the raw input;
     see the sketch below)
   - Keep substring check for ".." as it's appropriately conservative for
     security validation of publishConfig.directory

2. Memory leak (comment #2428232477):
   - Added defer to free edited_package_json buffer after allocation
   - Ensures cleanup happens on all code paths (dry-run vs normal execution)

All 9 publishConfig.directory tests still passing.

Note: Skipped adding publish-specific test (comment #2428232485) because:
- pack and publish share the same code path for publishConfig.directory
- The subdirectory logic executes before the for_publish branch
- Testing publish requires complex npm auth setup
- Current pack tests provide comprehensive coverage of the feature
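
A minimal TypeScript sketch of the validation order from item 1 (the actual fix is Zig and additionally uses the separate PathBuffers, which have no direct TypeScript analogue; validatePublishDirectory is an illustrative name):

```ts
import { isAbsolute, posix } from "node:path";

function validatePublishDirectory(directory: string): string {
  // Reject absolute paths first.
  if (isAbsolute(directory)) {
    throw new Error(`publishConfig.directory cannot be an absolute path: ${directory}`);
  }
  // Check the *raw* input for ".." before normalizing, because normalization
  // can resolve ".." away and hide the traversal attempt.
  if (directory.includes("..")) {
    throw new Error(`publishConfig.directory cannot contain '..': ${directory}`);
  }
  // Only now normalize, then strip any "./" prefix and trailing slashes
  // (mirroring normalizeBuf + withoutPrefixComptime/withoutTrailingSlash).
  return posix.normalize(directory).replace(/^\.\//, "").replace(/\/+$/, "");
}
```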

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-14 08:13:01 +00:00
Claude Bot
28bfafbf13 Address CodeRabbit review comments for publishConfig.directory
- Fixed --dry-run crash by passing edited_package_json.len instead of 0
  when publishConfig.directory is set (addresses comment #2427922588)

- Refactored pack_dir_path to be computed once and reused throughout
  the function, eliminating duplicate computation (addresses comment
  #2428128412; see the sketch below)

- Added 3 additional test cases (addresses comment #2428128423):
  1. Verifies lifecycle scripts run from workspace root, not subdirectory
  2. Rejects parent directory traversal (../)
  3. Rejects absolute paths

- Added validation to reject invalid publishConfig.directory values:
  * Absolute paths are rejected with clear error message
  * Parent directory traversal (..) is rejected with clear error message
  * Added Output.flush() calls to ensure error messages are displayed

All 9 publishConfig.directory tests now passing.
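
A hedged TypeScript sketch of the pack_dir_path refactor (compute once, reuse); the real code is Zig, and computePackPaths is an illustrative name:

```ts
import { dirname, resolve } from "node:path";

// Resolve the package.json that will actually be packed, then derive the
// pack directory from it exactly once so later steps can all reuse it.
function computePackPaths(workspaceRoot: string, publishDirectory?: string) {
  const actualPackageJsonPath = publishDirectory
    ? resolve(workspaceRoot, publishDirectory, "package.json")
    : resolve(workspaceRoot, "package.json");
  const packDirPath = dirname(actualPackageJsonPath); // analogous to pack_dir_path
  return { actualPackageJsonPath, packDirPath };
}

// File iteration, tarball creation, and dry-run output would all reuse packDirPath:
const { packDirPath } = computePackPaths("/home/user/project", "dist");
console.log(packDirPath); // /home/user/project/dist (on POSIX systems)
```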

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-14 07:37:59 +00:00
Claude Bot
976046b50d Fix --dry-run crash with publishConfig.directory
When publishConfig.directory is set, root_dir points to the subdirectory,
which doesn't have package.json stats available. The --dry-run code path
was calling printArchivedFilesAndPackages with package_json_len=0, causing
it to try to stat package.json from root_dir and crash.

Fixed by passing edited_package_json.len to printArchivedFilesAndPackages,
and updating the function to use the provided length when non-zero instead
of trying to stat the file.
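
A hedged TypeScript sketch of that fallback (the real function is Zig's printArchivedFilesAndPackages; packageJsonSize is an illustrative stand-in):

```ts
import { statSync } from "node:fs";
import { join } from "node:path";

// Use the in-memory length of the edited package.json when the caller
// provides it; only stat the on-disk file when no length was passed.
function packageJsonSize(packDir: string, editedPackageJsonLen: number): number {
  if (editedPackageJsonLen !== 0) return editedPackageJsonLen;
  return statSync(join(packDir, "package.json")).size;
}
```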

Added test coverage for --dry-run with publishConfig.directory.

Fixes CodeRabbit review comment #2427922588

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-14 06:58:23 +00:00
Claude Bot
5e6a40c46f Update publishConfig.directory to match pnpm's behavior
After reviewing pnpm's implementation, updated to match their behavior (sketched below):
- Reads initial package.json from project root to get publishConfig.directory
- Re-reads package.json from the subdirectory specified in publishConfig.directory
- Uses the subdirectory's package.json for packing (name, version, files field, etc.)
- The subdirectory MUST contain its own package.json

This matches pnpm's implementation where build scripts typically:
1. Compile/transpile code into a dist/ directory
2. Copy package.json (possibly modified) to dist/
3. Run bun publish to publish from dist/

Updated all tests to include package.json in the subdirectories.
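
A minimal TypeScript sketch of that resolution flow (the implementation is Zig; readManifest and resolvePackManifest are illustrative helpers):

```ts
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";

function readManifest(dir: string) {
  return JSON.parse(readFileSync(join(dir, "package.json"), "utf8")) as {
    publishConfig?: { directory?: string };
    [key: string]: unknown;
  };
}

function resolvePackManifest(projectRoot: string) {
  // 1. Read the project root package.json to find publishConfig.directory.
  const rootManifest = readManifest(projectRoot);
  const dir = rootManifest.publishConfig?.directory;
  if (!dir) return { packDir: projectRoot, manifest: rootManifest };

  // 2. Re-read package.json from the subdirectory; that manifest
  //    (name, version, files, ...) is the one used for packing.
  const packDir = join(projectRoot, dir);
  if (!existsSync(join(packDir, "package.json"))) {
    throw new Error(`publishConfig.directory has no package.json: ${packDir}`);
  }
  return { packDir, manifest: readManifest(packDir) };
}
```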

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-14 05:44:32 +00:00
Claude Bot
8d3c406157 Add support for publishConfig.directory in bun pack and bun publish
This implements support for the `publishConfig.directory` field in package.json,
which allows packages to specify a subdirectory to pack/publish from instead of
the project root. This is useful for build workflows where compiled assets are
placed in a dist/ or build/ directory.

When publishConfig.directory is set:
- Files are packed from the specified subdirectory
- package.json remains from the project root
- The published package structure matches what's in the subdirectory

This matches pnpm's behavior and resolves #23641.
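
For illustration, a hypothetical root package.json that opts into this behavior, written with Node APIs to stay in TypeScript like the tests below (every name, path, and script here is an example, not taken from the PR):

```ts
import { mkdir, writeFile } from "node:fs/promises";
import { join } from "node:path";

const projectDir = "/tmp/publishconfig-directory-demo"; // example path

await mkdir(join(projectDir, "dist"), { recursive: true });
await writeFile(
  join(projectDir, "package.json"),
  JSON.stringify(
    {
      name: "my-built-package",
      version: "1.0.0",
      scripts: { build: "tsc --outDir dist" }, // build step emits into dist/
      publishConfig: { directory: "dist" }, // pack/publish from dist/, not the root
    },
    null,
    2,
  ),
);
// Running `bun pm pack` (or `bun publish`) in projectDir would then pack dist/.
```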

Test plan:
- Added comprehensive tests for publishConfig.directory
- Tests cover: basic directory changes, nested paths, with files field
- All tests passing

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-14 04:59:53 +00:00
2 changed files with 348 additions and 21 deletions


@@ -1187,8 +1187,14 @@ pub const PackCommand = struct {
.entry => |entry| entry,
};
if (comptime for_publish) {
if (json.root.get("publishConfig")) |config| {
var publish_config_directory: ?string = null;
if (json.root.get("publishConfig")) |config| {
// Read directory for both pack and publish
if (try config.getString(ctx.allocator, "directory")) |directory| {
publish_config_directory = directory[0];
}
if (comptime for_publish) {
if (manager.options.publish_config.tag.len == 0) {
if (try config.getStringCloned(ctx.allocator, "tag")) |tag| {
manager.options.publish_config.tag = tag;
@@ -1207,7 +1213,70 @@ pub const PackCommand = struct {
// maybe otp
}
const package_name_expr: Expr = json.root.get("name") orelse return error.MissingPackageName;
// If publishConfig.directory is set, re-read package.json from that directory
const actual_package_json_path: stringZ = if (publish_config_directory) |directory| path_with_dir: {
// Reject absolute paths before normalization
if (std.fs.path.isAbsolute(directory)) {
Output.errGeneric("publishConfig.directory cannot be an absolute path: {s}", .{directory});
Output.flush();
Global.crash();
}
// Reject parent directory traversal before normalization (check raw input)
// We check before normalization because normalizeBuf may resolve ".." away
if (strings.contains(directory, "..")) {
Output.errGeneric("publishConfig.directory cannot contain '..': {s}", .{directory});
Output.flush();
Global.crash();
}
// Normalize after validation
var norm_buf: PathBuffer = undefined;
const normalized0 = bun.path.normalizeBuf(directory, &norm_buf, .posix);
const normalized_dir = strings.withoutTrailingSlash(strings.withoutPrefixComptime(normalized0, "./"));
const abs_workspace_path: string = strings.withoutTrailingSlash(strings.withoutSuffixComptime(abs_package_json_path, "package.json"));
// Use a separate buffer for joins to avoid aliasing with normalized buffer
var join_buf1: PathBuffer = undefined;
const abs_dir = bun.path.joinAbsStringBuf(
abs_workspace_path,
&join_buf1,
&[_]string{normalized_dir},
.auto,
);
var join_buf2: PathBuffer = undefined;
break :path_with_dir bun.path.joinAbsStringBufZ(
abs_dir,
&join_buf2,
&[_]string{"package.json"},
.auto,
);
} else abs_package_json_path;
// Directory we actually pack from (directory containing actual package.json)
const pack_dir_path: string = strings.withoutTrailingSlash(
strings.withoutSuffixComptime(actual_package_json_path, "package.json"),
);
// Re-read package.json from the actual directory
const actual_json = if (publish_config_directory != null) switch (manager.workspace_package_json_cache.getWithPath(manager.allocator, manager.log, actual_package_json_path, .{
.guess_indentation = true,
})) {
.read_err => |err| {
Output.err(err, "failed to read package.json from publishConfig.directory: {s}", .{actual_package_json_path});
Global.crash();
},
.parse_err => |err| {
Output.err(err, "failed to parse package.json from publishConfig.directory: {s}", .{actual_package_json_path});
manager.log.print(Output.errorWriter()) catch {};
Global.crash();
},
.entry => |entry| entry,
} else json;
const package_name_expr: Expr = actual_json.root.get("name") orelse return error.MissingPackageName;
const package_name = try package_name_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageName;
if (comptime for_publish) {
const is_scoped = try Dependency.isScopedPackageName(package_name);
@@ -1220,13 +1289,13 @@ pub const PackCommand = struct {
defer if (comptime !for_publish) ctx.allocator.free(package_name);
if (package_name.len == 0) return error.InvalidPackageName;
const package_version_expr: Expr = json.root.get("version") orelse return error.MissingPackageVersion;
const package_version_expr: Expr = actual_json.root.get("version") orelse return error.MissingPackageVersion;
const package_version = try package_version_expr.asStringCloned(ctx.allocator) orelse return error.InvalidPackageVersion;
defer if (comptime !for_publish) ctx.allocator.free(package_version);
if (package_version.len == 0) return error.InvalidPackageVersion;
if (comptime for_publish) {
if (json.root.get("private")) |private| {
if (actual_json.root.get("private")) |private| {
if (private.asBool()) |is_private| {
if (is_private) {
return error.PrivatePackage;
@@ -1235,7 +1304,8 @@ pub const PackCommand = struct {
}
}
const edited_package_json = try editRootPackageJSON(ctx.allocator, ctx.lockfile, json);
const edited_package_json = try editRootPackageJSON(ctx.allocator, ctx.lockfile, actual_json);
defer ctx.allocator.free(edited_package_json);
var this_transpiler: bun.transpiler.Transpiler = undefined;
@@ -1360,27 +1430,27 @@ pub const PackCommand = struct {
break :post_scripts .{ postpack_script, null, null };
};
// Open the directory where files will be packed from (and where package.json is)
var root_dir = root_dir: {
var path_buf: PathBuffer = undefined;
@memcpy(path_buf[0..abs_workspace_path.len], abs_workspace_path);
path_buf[abs_workspace_path.len] = 0;
break :root_dir std.fs.openDirAbsoluteZ(path_buf[0..abs_workspace_path.len :0], .{
@memcpy(path_buf[0..pack_dir_path.len], pack_dir_path);
path_buf[pack_dir_path.len] = 0;
break :root_dir std.fs.openDirAbsoluteZ(path_buf[0..pack_dir_path.len :0], .{
.iterate = true,
}) catch |err| {
Output.err(err, "failed to open root directory: {s}\n", .{abs_workspace_path});
Output.err(err, "failed to open pack directory: {s}\n", .{pack_dir_path});
Global.crash();
};
};
defer root_dir.close();
ctx.bundled_deps = try getBundledDeps(ctx.allocator, json.root, "bundledDependencies") orelse
try getBundledDeps(ctx.allocator, json.root, "bundleDependencies") orelse
ctx.bundled_deps = try getBundledDeps(ctx.allocator, actual_json.root, "bundledDependencies") orelse
try getBundledDeps(ctx.allocator, actual_json.root, "bundleDependencies") orelse
.{};
var pack_queue: PackQueue = .init(ctx.allocator, {});
defer pack_queue.deinit();
const bins = try getPackageBins(ctx.allocator, json.root);
const bins = try getPackageBins(ctx.allocator, actual_json.root);
defer for (bins) |bin| ctx.allocator.free(bin.path);
for (bins) |bin| {
@@ -1400,7 +1470,7 @@ pub const PackCommand = struct {
}
iterate_project_tree: {
if (json.root.get("files")) |files| {
if (actual_json.root.get("files")) |files| {
files_error: {
if (files.asArray()) |_files_array| {
var includes: std.ArrayListUnmanaged(Pattern) = .{};
@@ -1465,7 +1535,7 @@ pub const PackCommand = struct {
if (manager.options.dry_run) {
// don't create the tarball, but run scripts if they exist
printArchivedFilesAndPackages(ctx, root_dir, true, &pack_queue, 0);
printArchivedFilesAndPackages(ctx, root_dir, true, &pack_queue, edited_package_json.len);
if (comptime !for_publish) {
if (manager.options.pack_destination.len == 0 and manager.options.pack_filename.len == 0) {
@@ -2468,15 +2538,18 @@ pub const PackCommand = struct {
const packed_fmt = "<r><b><cyan>packed<r> {} {s}";
if (comptime is_dry_run) {
const package_json_stat = root_dir.statat("package.json").unwrap() catch |err| {
Output.err(err, "failed to stat package.json", .{});
Global.crash();
const package_json_size: usize = if (package_json_len != 0) package_json_len else blk: {
const stat = root_dir.statat("package.json").unwrap() catch |err| {
Output.err(err, "failed to stat package.json", .{});
Global.crash();
};
break :blk @intCast(stat.size);
};
ctx.stats.unpacked_size += @intCast(package_json_stat.size);
ctx.stats.unpacked_size += package_json_size;
Output.prettyln("\n" ++ packed_fmt, .{
bun.fmt.size(package_json_stat.size, .{ .space_between_number_and_unit = false }),
bun.fmt.size(package_json_size, .{ .space_between_number_and_unit = false }),
"package.json",
});


@@ -1466,3 +1466,257 @@ test("$npm_lifecycle_event is accurate", async () => {
]);
expect(p.err).toEqual(`$ echo $npm_lifecycle_event\n`);
});
describe("publishConfig.directory", () => {
test("basic directory change", async () => {
await mkdir(join(packageDir, "src"), { recursive: true });
await mkdir(join(packageDir, "dist"), { recursive: true });
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-directory",
version: "1.0.0",
publishConfig: {
directory: "dist",
},
}),
),
// package.json must also be in the dist directory (matches pnpm behavior)
write(
join(packageDir, "dist", "package.json"),
JSON.stringify({
name: "pack-publishconfig-directory",
version: "1.0.0",
}),
),
write(join(packageDir, "src", "index.js"), "console.log('src');"),
write(join(packageDir, "dist", "index.js"), "console.log('dist');"),
write(join(packageDir, "dist", "other.js"), "console.log('other');"),
]);
await pack(packageDir, bunEnv);
const tarball = readTarball(join(packageDir, "pack-publishconfig-directory-1.0.0.tgz"));
expect(tarball.entries).toMatchObject([
{ pathname: "package/package.json" },
{ pathname: "package/index.js" },
{ pathname: "package/other.js" },
]);
// src/index.js should not be included, only dist files
expect(tarball.entries.length).toBe(3);
});
test("directory with ./ prefix", async () => {
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-dir-prefix",
version: "2.0.0",
publishConfig: {
directory: "./build",
},
}),
),
mkdir(join(packageDir, "build"), { recursive: true }),
write(
join(packageDir, "build", "package.json"),
JSON.stringify({
name: "pack-publishconfig-dir-prefix",
version: "2.0.0",
}),
),
write(join(packageDir, "build", "main.js"), "console.log('main');"),
]);
await pack(packageDir, bunEnv);
const tarball = readTarball(join(packageDir, "pack-publishconfig-dir-prefix-2.0.0.tgz"));
expect(tarball.entries).toMatchObject([{ pathname: "package/package.json" }, { pathname: "package/main.js" }]);
});
test("nested directory", async () => {
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-nested",
version: "3.0.0",
publishConfig: {
directory: "output/final",
},
}),
),
mkdir(join(packageDir, "output", "final"), { recursive: true }),
write(
join(packageDir, "output", "final", "package.json"),
JSON.stringify({
name: "pack-publishconfig-nested",
version: "3.0.0",
}),
),
write(join(packageDir, "output", "final", "result.js"), "console.log('result');"),
write(join(packageDir, "output", "temp.js"), "console.log('temp');"),
]);
await pack(packageDir, bunEnv);
const tarball = readTarball(join(packageDir, "pack-publishconfig-nested-3.0.0.tgz"));
expect(tarball.entries).toMatchObject([{ pathname: "package/package.json" }, { pathname: "package/result.js" }]);
// temp.js should not be included
expect(tarball.entries.find(e => e.pathname === "package/temp.js")).toBeUndefined();
});
test("directory does not exist", async () => {
await write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-missing",
version: "1.0.0",
publishConfig: {
directory: "nonexistent",
},
}),
);
// packExpectError verifies exit code is > 0
await packExpectError(packageDir, bunEnv);
});
test("with files field", async () => {
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-with-files",
version: "1.0.0",
publishConfig: {
directory: "dist",
},
}),
),
mkdir(join(packageDir, "dist", "lib"), { recursive: true }),
mkdir(join(packageDir, "dist", "other"), { recursive: true }),
write(
join(packageDir, "dist", "package.json"),
JSON.stringify({
name: "pack-publishconfig-with-files",
version: "1.0.0",
files: ["lib/**/*"],
}),
),
write(join(packageDir, "dist", "lib", "main.js"), "console.log('lib');"),
write(join(packageDir, "dist", "other", "other.js"), "console.log('other');"),
]);
await pack(packageDir, bunEnv);
const tarball = readTarball(join(packageDir, "pack-publishconfig-with-files-1.0.0.tgz"));
expect(tarball.entries).toMatchObject([{ pathname: "package/package.json" }, { pathname: "package/lib/main.js" }]);
// other.js should not be included
expect(tarball.entries.find(e => e.pathname === "package/other/other.js")).toBeUndefined();
});
test("with --dry-run", async () => {
await mkdir(join(packageDir, "dist"), { recursive: true });
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-dryrun",
version: "1.0.0",
publishConfig: {
directory: "dist",
},
}),
),
write(
join(packageDir, "dist", "package.json"),
JSON.stringify({
name: "pack-publishconfig-dryrun",
version: "1.0.0",
}),
),
write(join(packageDir, "dist", "index.js"), "console.log('dist');"),
]);
const { out } = await pack(packageDir, bunEnv, "--dry-run");
// Should show what would be packed
expect(out).toContain("packed");
expect(out).toContain("package.json");
expect(out).toContain("index.js");
// Should not create the tarball
expect(await exists(join(packageDir, "pack-publishconfig-dryrun-1.0.0.tgz"))).toBeFalse();
});
test("scripts run from workspace root", async () => {
await mkdir(join(packageDir, "dist"), { recursive: true });
await Promise.all([
write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-scripts",
version: "1.0.0",
scripts: {
prepack: "touch .prepack-touched",
},
publishConfig: {
directory: "dist",
},
}),
),
write(
join(packageDir, "dist", "package.json"),
JSON.stringify({
name: "pack-publishconfig-scripts",
version: "1.0.0",
}),
),
write(join(packageDir, "dist", "index.js"), "console.log('dist');"),
]);
await pack(packageDir, bunEnv);
// Script should run from workspace root, not from dist/
expect(await exists(join(packageDir, ".prepack-touched"))).toBeTrue();
expect(await exists(join(packageDir, "dist", ".prepack-touched"))).toBeFalse();
});
test("rejects parent directory traversal", async () => {
await write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-parent",
version: "1.0.0",
publishConfig: {
directory: "../outside",
},
}),
);
const { err } = await packExpectError(packageDir, bunEnv);
// Should reject parent directory traversal
expect(err).toContain("publishConfig.directory cannot contain '..'");
});
test("rejects absolute paths", async () => {
await write(
join(packageDir, "package.json"),
JSON.stringify({
name: "pack-publishconfig-absolute",
version: "1.0.0",
publishConfig: {
directory: "/tmp/absolute",
},
}),
);
const { err } = await packExpectError(packageDir, bunEnv);
// Should reject absolute paths
expect(err).toContain("publishConfig.directory cannot be an absolute path");
});
});