diff --git a/doc/api/stream.md b/doc/api/stream.md
index 62e5190330610d..34f20730084f8e 100644
--- a/doc/api/stream.md
+++ b/doc/api/stream.md
@@ -727,11 +727,49 @@
 The return value indicates if you should continue writing right now.
 If the data had to be buffered internally, then it will return
 `false`. Otherwise, it will return `true`.
 
-This return value is strictly advisory. You MAY continue to write,
-even if it returns `false`. However, writes will be buffered in
-memory, so it is best not to do this excessively. Instead, wait for
-the [`'drain'`][] event before writing more data.
+The return value is `true` if the internal buffer, after admitting `chunk`,
+is less than the `highWaterMark` configured when the stream was created.
+If `false` is returned, further attempts to write data to the stream should
+stop until the [`'drain'`][] event is emitted.
+
+While a stream is not draining, calls to `write()` will buffer `chunk` and
+return `false`. Once all currently buffered chunks are drained (accepted for
+delivery by the operating system), the `'drain'` event will be emitted.
+It is recommended that once `write()` returns `false`, no more chunks be
+written until the `'drain'` event is emitted. While calling `write()` on a
+stream that is not draining is allowed, Node.js will buffer all written chunks
+until maximum memory usage occurs, at which point it will abort
+unconditionally. Even before it aborts, high memory usage will cause poor
+garbage collector performance and high RSS (which is not typically released
+back to the system, even after the memory is no longer required). Since TCP
+sockets may never drain if the remote peer does not read the data, writing to
+a socket that is not draining may lead to a remotely exploitable vulnerability.
+
+Writing data while the stream is not draining is particularly
+problematic for a [Transform][], because `Transform` streams are paused
+by default until they are piped or a `'data'` or `'readable'` event handler
+is added.
+
+If the data to be written can be generated or fetched on demand, it is
+recommended to encapsulate the logic into a [Readable][] and use
+[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
+possible to respect backpressure and avoid memory issues using the
+[`'drain'`][] event:
+
+```js
+function write(data, cb) {
+  if (!stream.write(data)) {
+    stream.once('drain', cb);
+  } else {
+    process.nextTick(cb);
+  }
+}
+
+// Wait for cb to be called before doing any other write.
+write('hello', () => {
+  console.log('write completed, do more writes now');
+});
+```
 
 ## API for Stream Implementors
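
As a sketch of the `Readable`-plus-`pipe()` approach recommended above, the following could accompany the patch; the chunk counter and the `example.txt` destination are illustrative placeholders, not part of the documented API:

```js
const fs = require('fs');
const { Readable } = require('stream');

// Generate data on demand instead of calling write() in a loop.
// read() is called whenever the Readable needs more data to fill
// its internal buffer.
let remaining = 5;
const source = new Readable({
  read() {
    if (remaining === 0) {
      this.push(null); // No more data: end the stream.
    } else {
      this.push(`chunk ${remaining--}\n`);
    }
  }
});

// pipe() pauses the source when the destination's write() returns
// false and resumes it after 'drain', so backpressure is handled
// without any manual bookkeeping.
source.pipe(fs.createWriteStream('example.txt'));
```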