
Streams are an important abstraction for working with binary data without loading it all into memory at once. They are commonly used for reading and writing files, sending and receiving network requests, and processing large amounts of data.

Bun implements the Web APIs `ReadableStream` and `WritableStream`.

{% callout %} Bun also implements the `node:stream` module, including `Readable`, `Writable`, and `Duplex`. For complete documentation, refer to the Node.js docs. {% /callout %}
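The two stream families interoperate: Node's `Readable.fromWeb()` and `Readable.toWeb()` (available in Bun as well) convert between Web streams and `node:stream` objects. A minimal sketch of the Web-to-Node direction:

```javascript
import { Readable } from "node:stream";

// A Web ReadableStream that produces a single chunk
const webStream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.close();
  },
});

// Wrap it as a node:stream Readable and consume it
const nodeStream = Readable.fromWeb(webStream);

let out = "";
for await (const chunk of nodeStream) out += chunk;
console.log(out); // => "hello"
```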

To create a simple ReadableStream:

```ts
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});
```

The contents of a `ReadableStream` can be read chunk-by-chunk with `for await` syntax.

```ts
for await (const chunk of stream) {
  console.log(chunk);
  // => "hello"
  // => "world"
}
```
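Under the hood, `for await` pulls chunks through a reader. You can do the same thing manually with the standard `getReader()` API, which is useful when you need to stop early or inspect each read:

```javascript
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

// getReader() locks the stream; each read() resolves with one chunk
// until { done: true } signals the stream is exhausted.
const reader = stream.getReader();
const chunks = [];
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  chunks.push(value);
}
console.log(chunks); // => [ "hello", "world" ]
```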

## Convenience methods

Bun extends ReadableStream with convenience methods for consuming the entire stream as different data types. These methods allow you to consume streams directly without wrapping them in a Response:

```ts
// A stream can only be consumed once, so create a fresh one for each example
const makeStream = () =>
  new ReadableStream({
    start(controller) {
      controller.enqueue("hello ");
      controller.enqueue("world");
      controller.close();
    },
  });

// Consume as text
const text = await makeStream().text();
console.log(text); // => "hello world"

// Consume as JSON
const jsonStream = new ReadableStream({
  start(controller) {
    controller.enqueue('{"message": "hello"}');
    controller.close();
  },
});
const data = await jsonStream.json();
console.log(data); // => { message: "hello" }

// Consume as bytes
const bytes = await makeStream().bytes();
console.log(bytes); // => Uint8Array

// Consume as blob
const blob = await makeStream().blob();
console.log(blob); // => Blob
```
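Note that a stream can only be consumed once. If you need the same data in more than one form, split it first with the standard `tee()` method, which produces two independent branches that each receive every chunk:

```javascript
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello ");
    controller.enqueue("world");
    controller.close();
  },
});

// tee() locks the original stream and returns two copies
const [a, b] = stream.tee();

let first = "";
for await (const chunk of a) first += chunk;

let second = "";
for await (const chunk of b) second += chunk;

console.log(first, second); // => "hello world" "hello world"
```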

These methods are particularly useful when working with streams from fetch() or other APIs:

```ts
const response = await fetch("https://api.example.com/data.json");
const data = await response.body.json(); // Direct JSON parsing from stream
```
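These methods are Bun extensions to `ReadableStream`. In runtimes without them, a portable equivalent is to wrap the stream in a `Response` and use its standard body methods — a sketch, using byte chunks since the `Response` body must be binary:

```javascript
const stream = new ReadableStream({
  start(controller) {
    // Encode the JSON payload as bytes so any runtime can decode it
    controller.enqueue(new TextEncoder().encode('{"message": "hello"}'));
    controller.close();
  },
});

// Response can consume a ReadableStream anywhere fetch is implemented
const data = await new Response(stream).json();
console.log(data); // => { message: "hello" }
```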

## Direct ReadableStream

Bun implements an optimized version of `ReadableStream` that avoids unnecessary data copying and queue-management logic. With a traditional `ReadableStream`, chunks of data are enqueued. Each chunk is copied into a queue, where it sits until the stream is ready to send more data.

```ts
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});
```

With a direct `ReadableStream`, chunks of data are written directly to the stream. No queueing happens, and there's no need to clone the chunk data into memory. The controller API is updated to reflect this: instead of `.enqueue()`, you call `.write()`.

```ts
const stream = new ReadableStream({
  type: "direct",
  pull(controller) {
    controller.write("hello");
    controller.write("world");
  },
});
```

When using a direct ReadableStream, all chunk queueing is handled by the destination. The consumer of the stream receives exactly what is passed to controller.write(), without any encoding or modification.

## Async generator streams

Bun also supports async generator functions as a source for Response and Request. This is an easy way to create a ReadableStream that fetches data from an asynchronous source.

```ts
const response = new Response(
  (async function* () {
    yield "hello";
    yield "world";
  })(),
);

await response.text(); // "helloworld"
```

You can also use `[Symbol.asyncIterator]` directly.

```ts
const response = new Response({
  [Symbol.asyncIterator]: async function* () {
    yield "hello";
    yield "world";
  },
});

await response.text(); // "helloworld"
```

If you need more granular control over the stream, `yield` will return the direct `ReadableStream` controller.

```ts
const response = new Response({
  [Symbol.asyncIterator]: async function* () {
    const controller = yield "hello";
    await controller.end();
  },
});

await response.text(); // "hello"
```

## `Bun.ArrayBufferSink`

The `Bun.ArrayBufferSink` class is a fast incremental writer for constructing an `ArrayBuffer` of unknown size.

```ts
const sink = new Bun.ArrayBufferSink();

sink.write("h");
sink.write("e");
sink.write("l");
sink.write("l");
sink.write("o");

sink.end();
// ArrayBuffer(5) [ 104, 101, 108, 108, 111 ]
```

To instead retrieve the data as a `Uint8Array`, pass the `asUint8Array` option to the `start()` method.

```ts
const sink = new Bun.ArrayBufferSink();
sink.start({
  asUint8Array: true,
});

sink.write("h");
sink.write("e");
sink.write("l");
sink.write("l");
sink.write("o");

sink.end();
// Uint8Array(5) [ 104, 101, 108, 108, 111 ]
```

The `.write()` method supports strings, typed arrays, `ArrayBuffer`, and `SharedArrayBuffer`.

```ts
sink.write("h");
sink.write(new Uint8Array([101, 108]));
sink.write(Buffer.from("lo").buffer);

sink.end();
```

Once `.end()` is called, no more data can be written to the `ArrayBufferSink`. However, in the context of buffering a stream, it's useful to continuously write data and periodically `.flush()` the contents (say, into a `WritableStream`). To support this, pass `stream: true` to the `start()` method.

```ts
const sink = new Bun.ArrayBufferSink();
sink.start({
  stream: true,
});

sink.write("h");
sink.write("e");
sink.write("l");
sink.flush();
// ArrayBuffer(3) [ 104, 101, 108 ]

sink.write("l");
sink.write("o");
sink.flush();
// ArrayBuffer(2) [ 108, 111 ]
```

The `.flush()` method returns the buffered data as an `ArrayBuffer` (or `Uint8Array` if `asUint8Array: true` was set) and clears the internal buffer.

To manually set the size of the internal buffer in bytes, pass a value for `highWaterMark`:

```ts
const sink = new Bun.ArrayBufferSink();
sink.start({
  highWaterMark: 1024 * 1024, // 1 MB
});
```

{% details summary="Reference" %}

```ts
/**
 * Fast incremental writer that becomes an `ArrayBuffer` on end().
 */
export class ArrayBufferSink {
  constructor();

  start(options?: {
    asUint8Array?: boolean;
    /**
     * Preallocate an internal buffer of this size
     * This can significantly improve performance when the chunk size is small
     */
    highWaterMark?: number;
    /**
     * On {@link ArrayBufferSink.flush}, return the written data as a `Uint8Array`.
     * Writes will restart from the beginning of the buffer.
     */
    stream?: boolean;
  }): void;

  write(
    chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer,
  ): number;
  /**
   * Flush the internal buffer
   *
   * If {@link ArrayBufferSink.start} was passed a `stream` option, this will return an `ArrayBuffer`
   * If {@link ArrayBufferSink.start} was passed a `stream` option and `asUint8Array`, this will return a `Uint8Array`
   * Otherwise, this will return the number of bytes written since the last flush
   *
   * This API might change later to separate Uint8ArraySink and ArrayBufferSink
   */
  flush(): number | Uint8Array<ArrayBuffer> | ArrayBuffer;
  end(): ArrayBuffer | Uint8Array<ArrayBuffer>;
}
```

{% /details %}