{ "type": "module", "source": "doc/api/webstreams.md", "modules": [ { "textRaw": "Web Streams API", "name": "web_streams_api", "introduced_in": "v16.5.0", "meta": { "added": [ "v16.5.0" ], "changes": [ { "version": "v18.0.0", "pr-url": "https://github.com/nodejs/node/pull/42225", "description": "Use of this API no longer emit a runtime warning." } ] }, "stability": 1, "stabilityText": "Experimental.", "desc": "
An implementation of the WHATWG Streams Standard.
", "modules": [ { "textRaw": "Overview", "name": "overview", "desc": "The WHATWG Streams Standard (or \"web streams\") defines an API for handling\nstreaming data. It is similar to the Node.js Streams API but emerged later\nand has become the \"standard\" API for streaming data across many JavaScript\nenvironments.
\nThere are three primary types of objects:
\n- ReadableStream - Represents a source of streaming data.
- WritableStream - Represents a destination for streaming data.
- TransformStream - Represents an algorithm for transforming streaming data.
ReadableStream
This example creates a simple ReadableStream
that pushes the current\nperformance.now()
timestamp once every second forever. An async iterable\nis used to read the data from the stream.
import {\n ReadableStream,\n} from 'node:stream/web';\n\nimport {\n setInterval as every,\n} from 'node:timers/promises';\n\nimport {\n performance,\n} from 'node:perf_hooks';\n\nconst SECOND = 1000;\n\nconst stream = new ReadableStream({\n async start(controller) {\n for await (const _ of every(SECOND))\n controller.enqueue(performance.now());\n },\n});\n\nfor await (const value of stream)\n console.log(value);\n
\nconst {\n ReadableStream,\n} = require('node:stream/web');\n\nconst {\n setInterval: every,\n} = require('node:timers/promises');\n\nconst {\n performance,\n} = require('node:perf_hooks');\n\nconst SECOND = 1000;\n\nconst stream = new ReadableStream({\n async start(controller) {\n for await (const _ of every(SECOND))\n controller.enqueue(performance.now());\n },\n});\n\n(async () => {\n for await (const value of stream)\n console.log(value);\n})();\n
",
"type": "module",
"displayName": "Overview"
},
{
"textRaw": "API",
"name": "api",
"classes": [
{
"textRaw": "Class: `ReadableStream`",
"type": "class",
"name": "ReadableStream",
"meta": {
"added": [
"v16.5.0"
],
"changes": [
{
"version": "v18.0.0",
"pr-url": "https://github.com/nodejs/node/pull/42225",
"description": "This class is now exposed on the global object."
}
]
},
"properties": [
{
"textRaw": "`locked` Type: {boolean} Set to `true` if there is an active reader for this {ReadableStream}.",
"type": "boolean",
"name": "Type",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
},
"desc": "The readableStream.locked
property is false
by default, and is\nswitched to true
while there is an active reader consuming the\nstream's data.
import { ReadableStream } from 'node:stream/web';\n\nconst stream = new ReadableStream();\n\nconst reader = stream.getReader();\n\nconsole.log(await reader.read());\n
\nconst { ReadableStream } = require('node:stream/web');\n\nconst stream = new ReadableStream();\n\nconst reader = stream.getReader();\n\nreader.read().then(console.log);\n
\nCauses the readableStream.locked
to be true
.
Connects this <ReadableStream> to the pair of <ReadableStream> and\n<WritableStream> provided in the transform
argument such that the\ndata from this <ReadableStream> is written into transform.writable
,\npossibly transformed, then pushed to transform.readable
. Once the\npipeline is configured, transform.readable
is returned.
Causes the readableStream.locked
to be true
while the pipe operation\nis active.
import {\n ReadableStream,\n TransformStream,\n} from 'node:stream/web';\n\nconst stream = new ReadableStream({\n start(controller) {\n controller.enqueue('a');\n },\n});\n\nconst transform = new TransformStream({\n transform(chunk, controller) {\n controller.enqueue(chunk.toUpperCase());\n },\n});\n\nconst transformedStream = stream.pipeThrough(transform);\n\nfor await (const chunk of transformedStream)\n console.log(chunk);\n
\nconst {\n ReadableStream,\n TransformStream,\n} = require('node:stream/web');\n\nconst stream = new ReadableStream({\n start(controller) {\n controller.enqueue('a');\n },\n});\n\nconst transform = new TransformStream({\n transform(chunk, controller) {\n controller.enqueue(chunk.toUpperCase());\n },\n});\n\nconst transformedStream = stream.pipeThrough(transform);\n\n(async () => {\n for await (const chunk of transformedStream)\n console.log(chunk);\n})();\n
"
},
{
"textRaw": "`readableStream.pipeTo(destination[, options])`",
"type": "method",
"name": "pipeTo",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: A promise fulfilled with `undefined`",
"name": "return",
"desc": "A promise fulfilled with `undefined`"
},
"params": [
{
"textRaw": "`destination` {WritableStream} A {WritableStream} to which this `ReadableStream`'s data will be written.",
"name": "destination",
"type": "WritableStream",
"desc": "A {WritableStream} to which this `ReadableStream`'s data will be written."
},
{
"textRaw": "`options` {Object}",
"name": "options",
"type": "Object",
"options": [
{
"textRaw": "`preventAbort` {boolean} When `true`, errors in this `ReadableStream` will not cause `destination` to be aborted.",
"name": "preventAbort",
"type": "boolean",
"desc": "When `true`, errors in this `ReadableStream` will not cause `destination` to be aborted."
},
{
"textRaw": "`preventCancel` {boolean} When `true`, errors in the `destination` will not cause this `ReadableStream` to be canceled.",
"name": "preventCancel",
"type": "boolean",
"desc": "When `true`, errors in the `destination` will not cause this `ReadableStream` to be canceled."
},
{
"textRaw": "`preventClose` {boolean} When `true`, closing this `ReadableStream` does not cause `destination` to be closed.",
"name": "preventClose",
"type": "boolean",
"desc": "When `true`, closing this `ReadableStream` does not cause `destination` to be closed."
},
{
"textRaw": "`signal` {AbortSignal} Allows the transfer of data to be canceled using an {AbortController}.",
"name": "signal",
"type": "AbortSignal",
"desc": "Allows the transfer of data to be canceled using an {AbortController}."
}
]
}
]
}
],
"desc": "Causes the readableStream.locked
to be true
while the pipe operation\nis active.
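As a sketch (the source and sink here are illustrative, not part of the original text), pipeTo() connects a ReadableStream directly to a WritableStream:
import {\n  ReadableStream,\n  WritableStream,\n} from 'node:stream/web';\n\nconst readable = new ReadableStream({\n  start(controller) {\n    controller.enqueue('hello');\n    controller.close();\n  },\n});\n\nconst writable = new WritableStream({\n  write(chunk) {\n    console.log(chunk);\n  },\n});\n\n// Resolves with undefined once all queued data has been written.\nawait readable.pipeTo(writable);\n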
Returns a pair of new <ReadableStream> instances to which this\nReadableStream
's data will be forwarded. Each will receive the\nsame data.
Causes the readableStream.locked
to be true
.
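A minimal sketch of tee() (the data is illustrative); both branches receive every chunk independently:
import { ReadableStream } from 'node:stream/web';\n\nconst stream = new ReadableStream({\n  start(controller) {\n    controller.enqueue('a');\n    controller.close();\n  },\n});\n\nconst [branchA, branchB] = stream.tee();\n\n// Each branch receives the same chunk.\nconsole.log(await branchA.getReader().read());  // { value: 'a', done: false }\nconsole.log(await branchB.getReader().read());  // { value: 'a', done: false }\n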
Creates and returns an async iterator usable for consuming this\nReadableStream
's data.
Causes the readableStream.locked
to be true
while the async iterator\nis active.
import { Buffer } from 'node:buffer';\n\nconst stream = new ReadableStream(getSomeSource());\n\nfor await (const chunk of stream.values({ preventCancel: true }))\n console.log(Buffer.from(chunk).toString());\n
"
}
],
"modules": [
{
"textRaw": "Async Iteration",
"name": "async_iteration",
"desc": "The <ReadableStream> object supports the async iterator protocol using\nfor await
syntax.
import { Buffer } from 'node:buffer';\n\nconst stream = new ReadableStream(getSomeSource());\n\nfor await (const chunk of stream)\n console.log(Buffer.from(chunk).toString());\n
\nThe async iterator will consume the <ReadableStream> until it terminates.
\nBy default, if the async iterator exits early (via either a break
,\nreturn
, or a throw
), the <ReadableStream> will be closed. To prevent\nautomatic closing of the <ReadableStream>, use the readableStream.values()
\nmethod to acquire the async iterator and set the preventCancel
option to\ntrue
.
The <ReadableStream> must not be locked (that is, it must not have an existing\nactive reader). During the async iteration, the <ReadableStream> will be locked.
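A sketch (with illustrative chunk values) of exiting the loop early without canceling the stream:
import { ReadableStream } from 'node:stream/web';\n\nconst stream = new ReadableStream({\n  start(controller) {\n    for (const chunk of ['a', 'b', 'c'])\n      controller.enqueue(chunk);\n    controller.close();\n  },\n});\n\nfor await (const chunk of stream.values({ preventCancel: true })) {\n  console.log(chunk);  // 'a'\n  break;  // Releases the lock but does not cancel the stream.\n}\n\n// The remaining data is still readable.\nconsole.log(await stream.getReader().read());  // { value: 'b', done: false }\n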
", "type": "module", "displayName": "Async Iteration" }, { "textRaw": "Transferring with `postMessage()`", "name": "transferring_with_`postmessage()`", "desc": "A <ReadableStream> instance can be transferred using a <MessagePort>.
\nconst stream = new ReadableStream(getReadableSourceSomehow());\n\nconst { port1, port2 } = new MessageChannel();\n\nport1.onmessage = ({ data }) => {\n data.getReader().read().then((chunk) => {\n console.log(chunk);\n });\n};\n\nport2.postMessage(stream, [stream]);\n
",
"type": "module",
"displayName": "Transferring with `postMessage()`"
}
],
"signatures": [
{
"params": [],
"desc": "\nunderlyingSource
<Object>\nstart
<Function> A user-defined function that is invoked immediately when\nthe ReadableStream
is created.\ncontroller
<ReadableStreamDefaultController> | <ReadableByteStreamController>undefined
or a promise fulfilled with undefined
.pull
<Function> A user-defined function that is called repeatedly when the\nReadableStream
internal queue is not full. The operation may be sync or\nasync. If async, the function will not be called again until the previously\nreturned promise is fulfilled.\ncontroller
<ReadableStreamDefaultController> | <ReadableByteStreamController>undefined
.cancel
<Function> A user-defined function that is called when the\nReadableStream
is canceled.\nreason
<any>undefined
.type
<string> Must be 'bytes'
or undefined
.autoAllocateChunkSize
<number> Used only when type
is equal to\n'bytes'
. When set to a non-zero value a view buffer is automatically\nallocated to ReadableByteStreamController.byobRequest
. When not set\none must use stream's internal queues to transfer data via the default\nreader ReadableStreamDefaultReader
.strategy
<Object>\nhighWaterMark
<number> The maximum internal queue size before backpressure\nis applied.size
<Function> A user-defined function used to identify the size of each\nchunk of data.\n\nBy default, calling readableStream.getReader()
with no arguments\nwill return an instance of ReadableStreamDefaultReader
. The default\nreader treats the chunks of data passed through the stream as opaque\nvalues, which allows the <ReadableStream> to work with generally any\nJavaScript value.
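For example (a sketch with illustrative values), the default reader can consume plain JavaScript objects enqueued by the source:
import { ReadableStream } from 'node:stream/web';\n\nconst stream = new ReadableStream({\n  start(controller) {\n    controller.enqueue({ id: 1 });\n    controller.close();\n  },\n});\n\nconst reader = stream.getReader();\nconsole.log(await reader.read());  // { value: { id: 1 }, done: false }\nconsole.log(await reader.read());  // { value: undefined, done: true }\n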
Cancels the <ReadableStream> and returns a promise that is fulfilled\nwhen the underlying stream has been canceled.
" }, { "textRaw": "`readableStreamDefaultReader.read()`", "type": "method", "name": "read", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "return": { "textRaw": "Returns: A promise fulfilled with an object:", "name": "return", "desc": "A promise fulfilled with an object:", "options": [ { "textRaw": "`value` {ArrayBuffer}", "name": "value", "type": "ArrayBuffer" }, { "textRaw": "`done` {boolean}", "name": "done", "type": "boolean" } ] }, "params": [] } ], "desc": "Requests the next chunk of data from the underlying <ReadableStream>\nand returns a promise that is fulfilled with the data once it is\navailable.
" }, { "textRaw": "`readableStreamDefaultReader.releaseLock()`", "type": "method", "name": "releaseLock", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [] } ], "desc": "Releases this reader's lock on the underlying <ReadableStream>.
" } ], "properties": [ { "textRaw": "`closed` Type: {Promise} Fulfilled with `undefined` when the associated {ReadableStream} is closed or rejected if the stream errors or the reader's lock is released before the stream finishes closing.", "type": "Promise", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "Fulfilled with `undefined` when the associated {ReadableStream} is closed or rejected if the stream errors or the reader's lock is released before the stream finishes closing." } ], "signatures": [ { "params": [ { "textRaw": "`stream` {ReadableStream}", "name": "stream", "type": "ReadableStream" } ], "desc": "Creates a new <ReadableStreamDefaultReader> that is locked to the\ngiven <ReadableStream>.
" } ] }, { "textRaw": "Class: `ReadableStreamBYOBReader`", "type": "class", "name": "ReadableStreamBYOBReader", "meta": { "added": [ "v16.5.0" ], "changes": [ { "version": "v18.0.0", "pr-url": "https://github.com/nodejs/node/pull/42225", "description": "This class is now exposed on the global object." } ] }, "desc": "The ReadableStreamBYOBReader
is an alternative consumer for\nbyte-oriented <ReadableStream>s (those that are created with\nunderlyingSource.type
set equal to 'bytes'
when the\nReadableStream
was created).
BYOB is short for "bring your own buffer". This is a\npattern that allows for more efficient reading of byte-oriented\ndata while avoiding extraneous copying.
import {\n open,\n} from 'node:fs/promises';\n\nimport {\n ReadableStream,\n} from 'node:stream/web';\n\nimport { Buffer } from 'node:buffer';\n\nclass Source {\n type = 'bytes';\n autoAllocateChunkSize = 1024;\n\n async start(controller) {\n this.file = await open(new URL(import.meta.url));\n this.controller = controller;\n }\n\n async pull(controller) {\n const view = controller.byobRequest?.view;\n const {\n bytesRead,\n } = await this.file.read({\n buffer: view,\n offset: view.byteOffset,\n length: view.byteLength,\n });\n\n if (bytesRead === 0) {\n await this.file.close();\n this.controller.close();\n }\n controller.byobRequest.respond(bytesRead);\n }\n}\n\nconst stream = new ReadableStream(new Source());\n\nasync function read(stream) {\n const reader = stream.getReader({ mode: 'byob' });\n\n const chunks = [];\n let result;\n do {\n result = await reader.read(Buffer.alloc(100));\n if (result.value !== undefined)\n chunks.push(Buffer.from(result.value));\n } while (!result.done);\n\n return Buffer.concat(chunks);\n}\n\nconst data = await read(stream);\nconsole.log(Buffer.from(data).toString());\n
",
"methods": [
{
"textRaw": "`readableStreamBYOBReader.cancel([reason])`",
"type": "method",
"name": "cancel",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
},
"params": [
{
"textRaw": "`reason` {any}",
"name": "reason",
"type": "any"
}
]
}
],
"desc": "Cancels the <ReadableStream> and returns a promise that is fulfilled\nwhen the underlying stream has been canceled.
" }, { "textRaw": "`readableStreamBYOBReader.read(view)`", "type": "method", "name": "read", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "return": { "textRaw": "Returns: A promise fulfilled with an object:", "name": "return", "desc": "A promise fulfilled with an object:", "options": [ { "textRaw": "`value` {ArrayBuffer}", "name": "value", "type": "ArrayBuffer" }, { "textRaw": "`done` {boolean}", "name": "done", "type": "boolean" } ] }, "params": [ { "textRaw": "`view` {Buffer|TypedArray|DataView}", "name": "view", "type": "Buffer|TypedArray|DataView" } ] } ], "desc": "Requests the next chunk of data from the underlying <ReadableStream>\nand returns a promise that is fulfilled with the data once it is\navailable.
\nDo not pass a pooled <Buffer> object instance into this method.\nPooled Buffer
objects are created using Buffer.allocUnsafe()
,\nor Buffer.from()
, or are often returned by various node:fs
module\ncallbacks. These types of Buffer
s use a shared underlying\n<ArrayBuffer> object that contains all of the data from all of\nthe pooled Buffer
instances. When a Buffer
, <TypedArray>,\nor <DataView> is passed into readableStreamBYOBReader.read()
,\nthe view's underlying ArrayBuffer
is detached, invalidating\nall existing views that may exist on that ArrayBuffer
. This\ncan have disastrous consequences for your application.
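As a sketch of the safe pattern (the byte source here is illustrative), pass a view that owns its own ArrayBuffer, such as a freshly allocated Uint8Array or a Buffer created with Buffer.alloc():
import { ReadableStream } from 'node:stream/web';\n\nconst stream = new ReadableStream({\n  type: 'bytes',\n  start(controller) {\n    controller.enqueue(new Uint8Array([1, 2, 3]));\n    controller.close();\n  },\n});\n\nconst reader = stream.getReader({ mode: 'byob' });\n\n// A fresh Uint8Array does not share its ArrayBuffer with any other view,\n// so detaching it cannot invalidate unrelated data.\nconst result = await reader.read(new Uint8Array(16));\nconsole.log(result.value);  // Uint8Array(3) [ 1, 2, 3 ]\n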
Releases this reader's lock on the underlying <ReadableStream>.
" } ], "properties": [ { "textRaw": "`closed` Type: {Promise} Fulfilled with `undefined` when the associated {ReadableStream} is closed or rejected if the stream errors or the reader's lock is released before the stream finishes closing.", "type": "Promise", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "Fulfilled with `undefined` when the associated {ReadableStream} is closed or rejected if the stream errors or the reader's lock is released before the stream finishes closing." } ], "signatures": [ { "params": [ { "textRaw": "`stream` {ReadableStream}", "name": "stream", "type": "ReadableStream" } ], "desc": "Creates a new ReadableStreamBYOBReader
that is locked to the\ngiven <ReadableStream>.
Every <ReadableStream> has a controller that is responsible for\nthe internal state and management of the stream's queue. The\nReadableStreamDefaultController
is the default controller\nimplementation for ReadableStream
s that are not byte-oriented.
Closes the <ReadableStream> to which this controller is associated.
" }, { "textRaw": "`readableStreamDefaultController.enqueue([chunk])`", "type": "method", "name": "enqueue", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [ { "textRaw": "`chunk` {any}", "name": "chunk", "type": "any" } ] } ], "desc": "Appends a new chunk of data to the <ReadableStream>'s queue.
" }, { "textRaw": "`readableStreamDefaultController.error([error])`", "type": "method", "name": "error", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [ { "textRaw": "`error` {any}", "name": "error", "type": "any" } ] } ], "desc": "Signals an error that causes the <ReadableStream> to error and close.
" } ], "properties": [ { "textRaw": "`desiredSize` Type: {number}", "type": "number", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "Returns the amount of data remaining to fill the <ReadableStream>'s\nqueue.
" } ] }, { "textRaw": "Class: `ReadableByteStreamController`", "type": "class", "name": "ReadableByteStreamController", "meta": { "added": [ "v16.5.0" ], "changes": [ { "version": "v18.10.0", "pr-url": "https://github.com/nodejs/node/pull/44702", "description": "Support handling a BYOB pull request from a released reader." } ] }, "desc": "Every <ReadableStream> has a controller that is responsible for\nthe internal state and management of the stream's queue. The\nReadableByteStreamController
is for byte-oriented ReadableStream
s.
Returns the amount of data remaining to fill the <ReadableStream>'s\nqueue.
" }, { "textRaw": "`desiredSize` Type: {number}", "type": "number", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "Returns the amount of data remaining to fill the <ReadableStream>'s\nqueue.
" } ], "methods": [ { "textRaw": "`readableByteStreamController.close()`", "type": "method", "name": "close", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [] } ], "desc": "Closes the <ReadableStream> to which this controller is associated.
" }, { "textRaw": "`readableByteStreamController.enqueue(chunk)`", "type": "method", "name": "enqueue", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [ { "textRaw": "`chunk`: {Buffer|TypedArray|DataView}", "name": "chunk", "type": "Buffer|TypedArray|DataView" } ] } ], "desc": "Appends a new chunk of data to the <ReadableStream>'s queue.
" }, { "textRaw": "`readableByteStreamController.error([error])`", "type": "method", "name": "error", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [ { "textRaw": "`error` {any}", "name": "error", "type": "any" } ] } ], "desc": "Signals an error that causes the <ReadableStream> to error and close.
" } ] }, { "textRaw": "Class: `ReadableStreamBYOBRequest`", "type": "class", "name": "ReadableStreamBYOBRequest", "meta": { "added": [ "v16.5.0" ], "changes": [ { "version": "v18.0.0", "pr-url": "https://github.com/nodejs/node/pull/42225", "description": "This class is now exposed on the global object." } ] }, "desc": "When using ReadableByteStreamController
in byte-oriented\nstreams, and when using the ReadableStreamBYOBReader
,\nthe readableByteStreamController.byobRequest
property\nprovides access to a ReadableStreamBYOBRequest
instance\nthat represents the current read request. The object\nis used to gain access to the ArrayBuffer
/TypedArray
\nthat has been provided for the read request to fill,\nand provides methods for signaling that the data has\nbeen provided.
Signals that a bytesWritten
number of bytes have been written\nto readableStreamBYOBRequest.view
.
Signals that the request has been fulfilled with bytes written\nto a new Buffer
, TypedArray
, or DataView
.
The WritableStream
is a destination to which stream data is sent.
import {\n WritableStream,\n} from 'node:stream/web';\n\nconst stream = new WritableStream({\n write(chunk) {\n console.log(chunk);\n },\n});\n\nawait stream.getWriter().write('Hello World');\n
",
"methods": [
{
"textRaw": "`writableStream.abort([reason])`",
"type": "method",
"name": "abort",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
},
"params": [
{
"textRaw": "`reason` {any}",
"name": "reason",
"type": "any"
}
]
}
],
"desc": "Abruptly terminates the WritableStream
. All queued writes will be\ncanceled with their associated promises rejected.
Closes the WritableStream
when no additional writes are expected.
Creates and returns a new writer instance that can be used to write\ndata into the WritableStream
.
The writableStream.locked
property is false
by default, and is\nswitched to true
while there is an active writer attached to this\nWritableStream
.
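A minimal sketch (the sink is illustrative) showing how acquiring a writer flips writableStream.locked:
import { WritableStream } from 'node:stream/web';\n\nconst stream = new WritableStream({\n  write(chunk) {\n    console.log(chunk);\n  },\n});\n\nconsole.log(stream.locked);  // false\n\nconst writer = stream.getWriter();\nconsole.log(stream.locked);  // true\n\nawait writer.write('hello');\nawait writer.close();\n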
A <WritableStream> instance can be transferred using a <MessagePort>.
\nconst stream = new WritableStream(getWritableSinkSomehow());\n\nconst { port1, port2 } = new MessageChannel();\n\nport1.onmessage = ({ data }) => {\n data.getWriter().write('hello');\n};\n\nport2.postMessage(stream, [stream]);\n
",
"type": "module",
"displayName": "Transferring with postMessage()"
}
],
"signatures": [
{
"params": [
{
"textRaw": "`underlyingSink` {Object}",
"name": "underlyingSink",
"type": "Object",
"options": [
{
"textRaw": "`start` {Function} A user-defined function that is invoked immediately when the `WritableStream` is created.",
"name": "start",
"type": "Function",
"desc": "A user-defined function that is invoked immediately when the `WritableStream` is created.",
"options": [
{
"textRaw": "`controller` {WritableStreamDefaultController}",
"name": "controller",
"type": "WritableStreamDefaultController"
},
{
"textRaw": "Returns: `undefined` or a promise fulfilled with `undefined`.",
"name": "return",
"desc": "`undefined` or a promise fulfilled with `undefined`."
}
]
},
{
"textRaw": "`write` {Function} A user-defined function that is invoked when a chunk of data has been written to the `WritableStream`.",
"name": "write",
"type": "Function",
"desc": "A user-defined function that is invoked when a chunk of data has been written to the `WritableStream`.",
"options": [
{
"textRaw": "`chunk` {any}",
"name": "chunk",
"type": "any"
},
{
"textRaw": "`controller` {WritableStreamDefaultController}",
"name": "controller",
"type": "WritableStreamDefaultController"
},
{
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
}
]
},
{
"textRaw": "`close` {Function} A user-defined function that is called when the `WritableStream` is closed.",
"name": "close",
"type": "Function",
"desc": "A user-defined function that is called when the `WritableStream` is closed.",
"options": [
{
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
}
]
},
{
"textRaw": "`abort` {Function} A user-defined function that is called to abruptly close the `WritableStream`.",
"name": "abort",
"type": "Function",
"desc": "A user-defined function that is called to abruptly close the `WritableStream`.",
"options": [
{
"textRaw": "`reason` {any}",
"name": "reason",
"type": "any"
},
{
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
}
]
},
{
"textRaw": "`type` {any} The `type` option is reserved for future use and _must_ be undefined.",
"name": "type",
"type": "any",
"desc": "The `type` option is reserved for future use and _must_ be undefined."
}
]
},
{
"textRaw": "`strategy` {Object}",
"name": "strategy",
"type": "Object",
"options": [
{
"textRaw": "`highWaterMark` {number} The maximum internal queue size before backpressure is applied.",
"name": "highWaterMark",
"type": "number",
"desc": "The maximum internal queue size before backpressure is applied."
},
{
"textRaw": "`size` {Function} A user-defined function used to identify the size of each chunk of data.",
"name": "size",
"type": "Function",
"desc": "A user-defined function used to identify the size of each chunk of data.",
"options": [
{
"textRaw": "`chunk` {any}",
"name": "chunk",
"type": "any"
},
{
"textRaw": "Returns: {number}",
"name": "return",
"type": "number"
}
]
}
]
}
]
}
]
},
{
"textRaw": "Class: `WritableStreamDefaultWriter`",
"type": "class",
"name": "WritableStreamDefaultWriter",
"meta": {
"added": [
"v16.5.0"
],
"changes": [
{
"version": "v18.0.0",
"pr-url": "https://github.com/nodejs/node/pull/42225",
"description": "This class is now exposed on the global object."
}
]
},
"methods": [
{
"textRaw": "`writableStreamDefaultWriter.abort([reason])`",
"type": "method",
"name": "abort",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
},
"params": [
{
"textRaw": "`reason` {any}",
"name": "reason",
"type": "any"
}
]
}
],
"desc": "Abruptly terminates the WritableStream
. All queued writes will be\ncanceled with their associated promises rejected.
Closes the WritableStream
when no additional writes are expected.
Releases this writer's lock on the underlying <WritableStream>.
" }, { "textRaw": "`writableStreamDefaultWriter.write([chunk])`", "type": "method", "name": "write", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "return": { "textRaw": "Returns: A promise fulfilled with `undefined`.", "name": "return", "desc": "A promise fulfilled with `undefined`." }, "params": [ { "textRaw": "`chunk`: {any}", "name": "chunk", "type": "any" } ] } ], "desc": "Appends a new chunk of data to the <WritableStream>'s queue.
" } ], "properties": [ { "textRaw": "`closed` Type: {Promise} Fulfilled with `undefined` when the associated {WritableStream} is closed or rejected if the stream errors or the writer's lock is released before the stream finishes closing.", "type": "Promise", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "Fulfilled with `undefined` when the associated {WritableStream} is closed or rejected if the stream errors or the writer's lock is released before the stream finishes closing." }, { "textRaw": "`desiredSize` Type: {number}", "type": "number", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "The amount of data required to fill the <WritableStream>'s queue.
" }, { "textRaw": "`ready` Type: {Promise} Fulfilled with `undefined` when the writer is ready to be used.", "type": "Promise", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "desc": "Fulfilled with `undefined` when the writer is ready to be used." } ], "signatures": [ { "params": [ { "textRaw": "`stream` {WritableStream}", "name": "stream", "type": "WritableStream" } ], "desc": "Creates a new WritableStreamDefaultWriter
that is locked to the given\nWritableStream
.
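As a sketch (the sink and highWaterMark are illustrative), the writer's ready promise and desiredSize can be used to respect backpressure before writing:
import { WritableStream } from 'node:stream/web';\n\nconst stream = new WritableStream({\n  write(chunk) {\n    console.log(chunk);\n  },\n}, { highWaterMark: 2 });\n\nconst writer = stream.getWriter();\n\n// Wait until the queue has room before writing more data.\nawait writer.ready;\nconsole.log(writer.desiredSize);  // 2\n\nawait writer.write('hello');\nawait writer.close();\n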
The WritableStreamDefaultController
manages the <WritableStream>'s\ninternal state.
Called by user code to signal that an error has occurred while processing\nthe WritableStream
data. When called, the <WritableStream> will be aborted,\nwith currently pending writes canceled.
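A sketch of a sink that reports a failure through its controller (the validation rule is purely illustrative):
import { WritableStream } from 'node:stream/web';\n\nconst stream = new WritableStream({\n  write(chunk, controller) {\n    if (typeof chunk !== 'string') {\n      // Errors the stream; pending writes are canceled.\n      controller.error(new TypeError('only strings are supported'));\n      return;\n    }\n    console.log(chunk);\n  },\n});\n\nconst writer = stream.getWriter();\nawait writer.write('hello');\nawait writer.close();\n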
A TransformStream
consists of a <ReadableStream> and a <WritableStream> that\nare connected such that the data written to the WritableStream
is received,\nand potentially transformed, before being pushed into the ReadableStream
's\nqueue.
import {\n TransformStream,\n} from 'node:stream/web';\n\nconst transform = new TransformStream({\n transform(chunk, controller) {\n controller.enqueue(chunk.toUpperCase());\n },\n});\n\nawait Promise.all([\n transform.writable.getWriter().write('A'),\n transform.readable.getReader().read(),\n]);\n
",
"properties": [
{
"textRaw": "`readable` Type: {ReadableStream}",
"type": "ReadableStream",
"name": "Type",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
}
},
{
"textRaw": "`writable` Type: {WritableStream}",
"type": "WritableStream",
"name": "Type",
"meta": {
"added": [
"v16.5.0"
],
"changes": []
}
}
],
"modules": [
{
"textRaw": "Transferring with postMessage()",
"name": "transferring_with_postmessage()",
"desc": "A <TransformStream> instance can be transferred using a <MessagePort>.
\nconst stream = new TransformStream();\n\nconst { port1, port2 } = new MessageChannel();\n\nport1.onmessage = ({ data }) => {\n const { writable, readable } = data;\n // ...\n};\n\nport2.postMessage(stream, [stream]);\n
",
"type": "module",
"displayName": "Transferring with postMessage()"
}
],
"signatures": [
{
"params": [
{
"textRaw": "`transformer` {Object}",
"name": "transformer",
"type": "Object",
"options": [
{
"textRaw": "`start` {Function} A user-defined function that is invoked immediately when the `TransformStream` is created.",
"name": "start",
"type": "Function",
"desc": "A user-defined function that is invoked immediately when the `TransformStream` is created.",
"options": [
{
"textRaw": "`controller` {TransformStreamDefaultController}",
"name": "controller",
"type": "TransformStreamDefaultController"
},
{
"textRaw": "Returns: `undefined` or a promise fulfilled with `undefined`",
"name": "return",
"desc": "`undefined` or a promise fulfilled with `undefined`"
}
]
},
{
"textRaw": "`transform` {Function} A user-defined function that receives, and potentially modifies, a chunk of data written to `transformStream.writable`, before forwarding that on to `transformStream.readable`.",
"name": "transform",
"type": "Function",
"desc": "A user-defined function that receives, and potentially modifies, a chunk of data written to `transformStream.writable`, before forwarding that on to `transformStream.readable`.",
"options": [
{
"textRaw": "`chunk` {any}",
"name": "chunk",
"type": "any"
},
{
"textRaw": "`controller` {TransformStreamDefaultController}",
"name": "controller",
"type": "TransformStreamDefaultController"
},
{
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
}
]
},
{
"textRaw": "`flush` {Function} A user-defined function that is called immediately before the writable side of the `TransformStream` is closed, signaling the end of the transformation process.",
"name": "flush",
"type": "Function",
"desc": "A user-defined function that is called immediately before the writable side of the `TransformStream` is closed, signaling the end of the transformation process.",
"options": [
{
"textRaw": "`controller` {TransformStreamDefaultController}",
"name": "controller",
"type": "TransformStreamDefaultController"
},
{
"textRaw": "Returns: A promise fulfilled with `undefined`.",
"name": "return",
"desc": "A promise fulfilled with `undefined`."
}
]
},
{
"textRaw": "`readableType` {any} the `readableType` option is reserved for future use and _must_ be `undefined`.",
"name": "readableType",
"type": "any",
"desc": "the `readableType` option is reserved for future use and _must_ be `undefined`."
},
{
"textRaw": "`writableType` {any} the `writableType` option is reserved for future use and _must_ be `undefined`.",
"name": "writableType",
"type": "any",
"desc": "the `writableType` option is reserved for future use and _must_ be `undefined`."
}
]
},
{
"textRaw": "`writableStrategy` {Object}",
"name": "writableStrategy",
"type": "Object",
"options": [
{
"textRaw": "`highWaterMark` {number} The maximum internal queue size before backpressure is applied.",
"name": "highWaterMark",
"type": "number",
"desc": "The maximum internal queue size before backpressure is applied."
},
{
"textRaw": "`size` {Function} A user-defined function used to identify the size of each chunk of data.",
"name": "size",
"type": "Function",
"desc": "A user-defined function used to identify the size of each chunk of data.",
"options": [
{
"textRaw": "`chunk` {any}",
"name": "chunk",
"type": "any"
},
{
"textRaw": "Returns: {number}",
"name": "return",
"type": "number"
}
]
}
]
},
{
"textRaw": "`readableStrategy` {Object}",
"name": "readableStrategy",
"type": "Object",
"options": [
{
"textRaw": "`highWaterMark` {number} The maximum internal queue size before backpressure is applied.",
"name": "highWaterMark",
"type": "number",
"desc": "The maximum internal queue size before backpressure is applied."
},
{
"textRaw": "`size` {Function} A user-defined function used to identify the size of each chunk of data.",
"name": "size",
"type": "Function",
"desc": "A user-defined function used to identify the size of each chunk of data.",
"options": [
{
"textRaw": "`chunk` {any}",
"name": "chunk",
"type": "any"
},
{
"textRaw": "Returns: {number}",
"name": "return",
"type": "number"
}
]
}
]
}
]
}
]
},
{
"textRaw": "Class: `TransformStreamDefaultController`",
"type": "class",
"name": "TransformStreamDefaultController",
"meta": {
"added": [
"v16.5.0"
],
"changes": [
{
"version": "v18.0.0",
"pr-url": "https://github.com/nodejs/node/pull/42225",
"description": "This class is now exposed on the global object."
}
]
},
"desc": "The TransformStreamDefaultController
manages the internal state\nof the TransformStream
.
The amount of data required to fill the readable side's queue.
" } ], "methods": [ { "textRaw": "`transformStreamDefaultController.enqueue([chunk])`", "type": "method", "name": "enqueue", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [ { "textRaw": "`chunk` {any}", "name": "chunk", "type": "any" } ] } ], "desc": "Appends a chunk of data to the readable side's queue.
" }, { "textRaw": "`transformStreamDefaultController.error([reason])`", "type": "method", "name": "error", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [ { "textRaw": "`reason` {any}", "name": "reason", "type": "any" } ] } ], "desc": "Signals to both the readable and writable side that an error has occurred\nwhile processing the transform data, causing both sides to be abruptly\nclosed.
" }, { "textRaw": "`transformStreamDefaultController.terminate()`", "type": "method", "name": "terminate", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "signatures": [ { "params": [] } ], "desc": "Closes the readable side of the transport and causes the writable side\nto be abruptly closed with an error.
" } ] }, { "textRaw": "Class: `ByteLengthQueuingStrategy`", "type": "class", "name": "ByteLengthQueuingStrategy", "meta": { "added": [ "v16.5.0" ], "changes": [ { "version": "v18.0.0", "pr-url": "https://github.com/nodejs/node/pull/42225", "description": "This class is now exposed on the global object." } ] }, "properties": [ { "textRaw": "`highWaterMark` Type: {number}", "type": "number", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] } }, { "textRaw": "`size` Type: {Function}", "type": "Function", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "options": [ { "textRaw": "`chunk` {any}", "name": "chunk", "type": "any" }, { "textRaw": "Returns: {number}", "name": "return", "type": "number" } ] } ], "signatures": [ { "params": [ { "textRaw": "`init` {Object}", "name": "init", "type": "Object", "options": [ { "textRaw": "`highWaterMark` {number}", "name": "highWaterMark", "type": "number" } ] } ] } ] }, { "textRaw": "Class: `CountQueuingStrategy`", "type": "class", "name": "CountQueuingStrategy", "meta": { "added": [ "v16.5.0" ], "changes": [ { "version": "v18.0.0", "pr-url": "https://github.com/nodejs/node/pull/42225", "description": "This class is now exposed on the global object." } ] }, "properties": [ { "textRaw": "`highWaterMark` Type: {number}", "type": "number", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] } }, { "textRaw": "`size` Type: {Function}", "type": "Function", "name": "Type", "meta": { "added": [ "v16.5.0" ], "changes": [] }, "options": [ { "textRaw": "`chunk` {any}", "name": "chunk", "type": "any" }, { "textRaw": "Returns: {number}", "name": "return", "type": "number" } ] } ], "signatures": [ { "params": [ { "textRaw": "`init` {Object}", "name": "init", "type": "Object", "options": [ { "textRaw": "`highWaterMark` {number}", "name": "highWaterMark", "type": "number" } ] } ] } ] }, { "textRaw": "Class: `TextEncoderStream`", "type": "class", "name": "TextEncoderStream", "meta": { "added": [ "v16.6.0" ], "changes": [ { "version": "v18.0.0", "pr-url": "https://github.com/nodejs/node/pull/42225", "description": "This class is now exposed on the global object." } ] }, "properties": [ { "textRaw": "`encoding` Type: {string}", "type": "string", "name": "Type", "meta": { "added": [ "v16.6.0" ], "changes": [] }, "desc": "The encoding supported by the TextEncoderStream
instance.
Creates a new TextEncoderStream
instance.
The encoding supported by the TextDecoderStream
instance.
The value will be true
if decoding errors result in a TypeError
being\nthrown.
The value will be true
if the decoding result will include the byte order\nmark.
Creates a new TextDecoderStream
instance.
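A sketch (with illustrative data) of encoding strings to bytes and decoding them back by piping through both classes:
import {\n  ReadableStream,\n  TextDecoderStream,\n  TextEncoderStream,\n} from 'node:stream/web';\n\nconst source = new ReadableStream({\n  start(controller) {\n    controller.enqueue('hello');\n    controller.close();\n  },\n});\n\n// Strings are encoded to Uint8Array chunks, then decoded back to strings.\nconst roundTrip = source\n  .pipeThrough(new TextEncoderStream())\n  .pipeThrough(new TextDecoderStream());\n\nfor await (const chunk of roundTrip)\n  console.log(chunk);  // 'hello'\n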
The utility consumer functions provide common options for consuming\nstreams.
\nThey are accessed using:
\nimport {\n arrayBuffer,\n blob,\n buffer,\n json,\n text,\n} from 'node:stream/consumers';\n
\nconst {\n arrayBuffer,\n blob,\n buffer,\n json,\n text,\n} = require('node:stream/consumers');\n
",
"methods": [
{
"textRaw": "`streamConsumers.arrayBuffer(stream)`",
"type": "method",
"name": "arrayBuffer",
"meta": {
"added": [
"v16.7.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: {Promise} Fulfills with an `ArrayBuffer` containing the full contents of the stream.",
"name": "return",
"type": "Promise",
"desc": "Fulfills with an `ArrayBuffer` containing the full contents of the stream."
},
"params": [
{
"textRaw": "`stream` {ReadableStream|stream.Readable|AsyncIterator}",
"name": "stream",
"type": "ReadableStream|stream.Readable|AsyncIterator"
}
]
}
],
"desc": "import { arrayBuffer } from 'node:stream/consumers';\nimport { Readable } from 'node:stream';\nimport { TextEncoder } from 'node:util';\n\nconst encoder = new TextEncoder();\nconst dataArray = encoder.encode('hello world from consumers!');\n\nconst readable = Readable.from(dataArray);\nconst data = await arrayBuffer(readable);\nconsole.log(`from readable: ${data.byteLength}`);\n
\nconst { arrayBuffer } = require('node:stream/consumers');\nconst { Readable } = require('node:stream');\nconst { TextEncoder } = require('node:util');\n\nconst encoder = new TextEncoder();\nconst dataArray = encoder.encode('hello world from consumers!');\nconst readable = Readable.from(dataArray);\narrayBuffer(readable).then((data) => {\n console.log(`from readable: ${data.byteLength}`);\n});\n
"
},
{
"textRaw": "`streamConsumers.blob(stream)`",
"type": "method",
"name": "blob",
"meta": {
"added": [
"v16.7.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: {Promise} Fulfills with a {Blob} containing the full contents of the stream.",
"name": "return",
"type": "Promise",
"desc": "Fulfills with a {Blob} containing the full contents of the stream."
},
"params": [
{
"textRaw": "`stream` {ReadableStream|stream.Readable|AsyncIterator}",
"name": "stream",
"type": "ReadableStream|stream.Readable|AsyncIterator"
}
]
}
],
"desc": "import { blob } from 'node:stream/consumers';\n\nconst dataBlob = new Blob(['hello world from consumers!']);\n\nconst readable = dataBlob.stream();\nconst data = await blob(readable);\nconsole.log(`from readable: ${data.size}`);\n
\nconst { blob } = require('node:stream/consumers');\n\nconst dataBlob = new Blob(['hello world from consumers!']);\n\nconst readable = dataBlob.stream();\nblob(readable).then((data) => {\n console.log(`from readable: ${data.size}`);\n});\n
"
},
{
"textRaw": "`streamConsumers.buffer(stream)`",
"type": "method",
"name": "buffer",
"meta": {
"added": [
"v16.7.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: {Promise} Fulfills with a {Buffer} containing the full contents of the stream.",
"name": "return",
"type": "Promise",
"desc": "Fulfills with a {Buffer} containing the full contents of the stream."
},
"params": [
{
"textRaw": "`stream` {ReadableStream|stream.Readable|AsyncIterator}",
"name": "stream",
"type": "ReadableStream|stream.Readable|AsyncIterator"
}
]
}
],
"desc": "import { buffer } from 'node:stream/consumers';\nimport { Readable } from 'node:stream';\nimport { Buffer } from 'node:buffer';\n\nconst dataBuffer = Buffer.from('hello world from consumers!');\n\nconst readable = Readable.from(dataBuffer);\nconst data = await buffer(readable);\nconsole.log(`from readable: ${data.length}`);\n
\nconst { buffer } = require('node:stream/consumers');\nconst { Readable } = require('node:stream');\nconst { Buffer } = require('node:buffer');\n\nconst dataBuffer = Buffer.from('hello world from consumers!');\n\nconst readable = Readable.from(dataBuffer);\nbuffer(readable).then((data) => {\n console.log(`from readable: ${data.length}`);\n});\n
"
},
{
"textRaw": "`streamConsumers.json(stream)`",
"type": "method",
"name": "json",
"meta": {
"added": [
"v16.7.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: {Promise} Fulfills with the contents of the stream parsed as a UTF-8 encoded string that is then passed through `JSON.parse()`.",
"name": "return",
"type": "Promise",
"desc": "Fulfills with the contents of the stream parsed as a UTF-8 encoded string that is then passed through `JSON.parse()`."
},
"params": [
{
"textRaw": "`stream` {ReadableStream|stream.Readable|AsyncIterator}",
"name": "stream",
"type": "ReadableStream|stream.Readable|AsyncIterator"
}
]
}
],
"desc": "import { json } from 'node:stream/consumers';\nimport { Readable } from 'node:stream';\n\nconst items = Array.from(\n {\n length: 100,\n },\n () => ({\n message: 'hello world from consumers!',\n }),\n);\n\nconst readable = Readable.from(JSON.stringify(items));\nconst data = await json(readable);\nconsole.log(`from readable: ${data.length}`);\n
\nconst { json } = require('node:stream/consumers');\nconst { Readable } = require('node:stream');\n\nconst items = Array.from(\n {\n length: 100,\n },\n () => ({\n message: 'hello world from consumers!',\n }),\n);\n\nconst readable = Readable.from(JSON.stringify(items));\njson(readable).then((data) => {\n console.log(`from readable: ${data.length}`);\n});\n
"
},
{
"textRaw": "`streamConsumers.text(stream)`",
"type": "method",
"name": "text",
"meta": {
"added": [
"v16.7.0"
],
"changes": []
},
"signatures": [
{
"return": {
"textRaw": "Returns: {Promise} Fulfills with the contents of the stream parsed as a UTF-8 encoded string.",
"name": "return",
"type": "Promise",
"desc": "Fulfills with the contents of the stream parsed as a UTF-8 encoded string."
},
"params": [
{
"textRaw": "`stream` {ReadableStream|stream.Readable|AsyncIterator}",
"name": "stream",
"type": "ReadableStream|stream.Readable|AsyncIterator"
}
]
}
],
"desc": "import { text } from 'node:stream/consumers';\nimport { Readable } from 'node:stream';\n\nconst readable = Readable.from('Hello world from consumers!');\nconst data = await text(readable);\nconsole.log(`from readable: ${data.length}`);\n
\nconst { text } = require('node:stream/consumers');\nconst { Readable } = require('node:stream');\n\nconst readable = Readable.from('Hello world from consumers!');\ntext(readable).then((data) => {\n console.log(`from readable: ${data.length}`);\n});\n
"
}
],
"type": "module",
"displayName": "Utility Consumers"
}
],
"type": "module",
"displayName": "API"
}
],
"type": "module",
"displayName": "Web Streams API"
}
]
}