doc: add added: information for stream
Ref: nodejs#6578
PR-URL: nodejs#7287
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Claudio Rodriguez <cjrodr@yahoo.com>
italoacasas authored and addaleax committed Jul 17, 2016
1 parent 7cb70e9 commit c897d0b
Showing 1 changed file with 90 additions and 3 deletions.
93 changes: 90 additions & 3 deletions doc/api/stream.md
@@ -116,7 +116,7 @@ const server = http.createServer( (req, res) => {
// req is an http.IncomingMessage, which is a Readable Stream
// res is an http.ServerResponse, which is a Writable Stream

-var body = '';
+let body = '';
// Get the data as utf8 strings.
// If an encoding is not set, Buffer objects will be received.
req.setEncoding('utf8');
@@ -205,10 +205,16 @@ myStream.end('done writing data');
```

#### Class: stream.Writable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
@@ -217,6 +223,9 @@ that no more events will be emitted, and no further computation will occur.
Not all Writable streams will emit the `'close'` event.
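
For example, a minimal sketch (assuming an `fs.WriteStream`, which does emit `'close'` once its file descriptor has been released):

```js
const fs = require('fs');

const file = fs.createWriteStream('example.txt');

file.on('close', () => {
  // The underlying file descriptor has been closed.
  console.log('stream closed');
});

file.end('some data');
```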

##### Event: 'drain'
<!-- YAML
added: v0.9.4
-->

If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
`'drain'` event will be emitted when it is appropriate to resume writing data
@@ -226,7 +235,7 @@ to the stream.
// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
-var i = 1000000;
+let i = 1000000;
write();
function write() {
var ok = true;
@@ -251,6 +260,9 @@ function writeOneMillionTimes(writer, data, encoding, callback) {
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

@@ -260,6 +272,9 @@ data. The listener callback is passed a single `Error` argument when called.
*Note*: The stream is not closed when the `'error'` event is emitted.
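
A minimal sketch of attaching an `'error'` handler (the failing path here is purely illustrative):

```js
const fs = require('fs');

// Opening a file in a directory that does not exist fails asynchronously.
const file = fs.createWriteStream('/no/such/directory/example.txt');

file.on('error', (err) => {
  // Called with a single Error argument (here, ENOENT).
  console.error('stream error:', err.message);
});
```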

##### Event: 'finish'
<!-- YAML
added: v0.9.4
-->

The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
has been called, and all data has been flushed to the underlying system.
@@ -276,6 +291,9 @@ writer.on('finish', () => {
```

##### Event: 'pipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} source stream that is piping to this writable

@@ -293,6 +311,9 @@ reader.pipe(writer);
```

##### Event: 'unpipe'
<!-- YAML
added: v0.9.4
-->

* `src` {[Readable][] Stream} The source stream that
[unpiped][`stream.unpipe()`] this writable
@@ -313,6 +334,9 @@ reader.unpipe(writer);
```

##### writable.cork()
<!-- YAML
added: v0.11.2
-->

The `writable.cork()` method forces all written data to be buffered in memory.
The buffered data will be flushed when either the [`stream.uncork()`][] or
@@ -325,6 +349,9 @@ implementations that implement the `writable._writev()` method can perform
buffered writes in a more optimized manner.
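
A minimal sketch of the intended pattern, uncorking on the next tick as the `writable.uncork()` section below recommends (`getWritableStreamSomehow()` is the placeholder used elsewhere in this document):

```js
const stream = getWritableStreamSomehow();

stream.cork();
stream.write('some ');
stream.write('data ');
// Both chunks stay buffered until uncork(); a _writev() implementation
// can then flush them as a single batched write.
process.nextTick(() => stream.uncork());
```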

##### writable.end([chunk][, encoding][, callback])
<!-- YAML
added: v0.9.4
-->

* `chunk` {String|Buffer|any} Optional data to write. For streams not operating
in object mode, `chunk` must be a string or a `Buffer`. For object mode
@@ -350,6 +377,9 @@ file.end('world!');
```

##### writable.setDefaultEncoding(encoding)
<!-- YAML
added: v0.11.15
-->

* `encoding` {String} The new default encoding
* Return: `this`
@@ -358,6 +388,9 @@ The `writable.setDefaultEncoding()` method sets the default `encoding` for a
[Writable][] stream.
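
For example, a small sketch (the default encoding applies to subsequent string writes that do not specify one; `getWritableStreamSomehow()` is the placeholder used elsewhere in this document):

```js
const writer = getWritableStreamSomehow();

writer.setDefaultEncoding('base64');
// The string is now decoded as base64 rather than the usual utf8 default.
writer.write('aGVsbG8=');
```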

##### writable.uncork()
<!-- YAML
added: v0.11.2
-->

The `writable.uncork()` method flushes all data buffered since
[`stream.cork()`][] was called.
@@ -391,6 +424,9 @@ process.nextTick(() => {
```

##### writable.write(chunk[, encoding][, callback])
<!-- YAML
added: v0.9.4
-->

* `chunk` {String|Buffer} The data to write
* `encoding` {String} The encoding, if `chunk` is a String
@@ -516,10 +552,16 @@ require more fine-grained control over the transfer and generation of data can
use the [`EventEmitter`][] and `readable.pause()`/`readable.resume()` APIs.

#### Class: stream.Readable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
@@ -528,6 +570,9 @@ that no more events will be emitted, and no further computation will occur.
Not all [Readable][] streams will emit the `'close'` event.

##### Event: 'data'
<!-- YAML
added: v0.9.4
-->

* `chunk` {Buffer|String|any} The chunk of data. For streams that are not
operating in object mode, the chunk will be either a string or `Buffer`.
@@ -558,6 +603,9 @@ readable.on('data', (chunk) => {
```

##### Event: 'end'
<!-- YAML
added: v0.9.4
-->

The `'end'` event is emitted when there is no more data to be consumed from
the stream.
@@ -578,6 +626,9 @@ readable.on('end', () => {
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

@@ -589,6 +640,9 @@ to push an invalid chunk of data.
The listener callback will be passed a single `Error` object.

##### Event: 'readable'
<!-- YAML
added: v0.9.4
-->

The `'readable'` event is emitted when there is data available to be read from
the stream. In some cases, attaching a listener for the `'readable'` event will
@@ -632,6 +686,9 @@ end
preferred over the use of the `'readable'` event.
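
A minimal sketch of the typical pattern, draining the internal buffer with repeated `stream.read()` calls inside the `'readable'` handler (`getReadableStreamSomehow()` is the placeholder used elsewhere in this document):

```js
const readable = getReadableStreamSomehow();

readable.on('readable', () => {
  let chunk;
  // read() returns null once the internal buffer is empty.
  while ((chunk = readable.read()) !== null) {
    console.log(`Received ${chunk.length} bytes of data.`);
  }
});

readable.on('end', () => {
  console.log('There will be no more data.');
});
```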

##### readable.isPaused()
<!-- YAML
added: v0.11.14
-->

* Return: {Boolean}

@@ -651,6 +708,9 @@ readable.isPaused() // === false
```

##### readable.pause()
<!-- YAML
added: v0.9.4
-->

* Return: `this`

@@ -672,6 +732,9 @@ readable.on('data', (chunk) => {
```

##### readable.pipe(destination[, options])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} The destination for writing data
* `options` {Object} Pipe options
@@ -727,6 +790,9 @@ never closed until the Node.js process exits, regardless of the specified
options.
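
A small sketch of the `end: false` option, which leaves the destination open so that more data can be written after the source ends (the file names here are illustrative):

```js
const fs = require('fs');

const reader = fs.createReadStream('input.txt');
const writer = fs.createWriteStream('output.txt');

// Without { end: false }, writer.end() would be called automatically
// when reader emits 'end'.
reader.pipe(writer, { end: false });

reader.on('end', () => {
  writer.end('--- appended after the piped data ---');
});
```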

##### readable.read([size])
<!-- YAML
added: v0.9.4
-->

* `size` {Number} Optional argument to specify how much data to read.
* Return {String|Buffer|Null}
@@ -774,6 +840,9 @@ event will also be emitted.
event has been emitted will return `null`. No runtime error will be raised.

##### readable.resume()
<!-- YAML
added: v0.9.4
-->

* Return: `this`

@@ -793,6 +862,9 @@ getReadableStreamSomehow()
```

##### readable.setEncoding(encoding)
<!-- YAML
added: v0.9.4
-->

* `encoding` {String} The encoding to use.
* Return: `this`
@@ -825,6 +897,9 @@ readable.on('data', (chunk) => {
```

##### readable.unpipe([destination])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} Optional specific stream to unpipe

@@ -851,6 +926,9 @@ setTimeout(() => {
```

##### readable.unshift(chunk)
<!-- YAML
added: v0.9.11
-->

* `chunk` {Buffer|String} Chunk of data to unshift onto the read queue

@@ -911,6 +989,9 @@ appropriately, however it is best to simply avoid calling `readable.unshift()`
while in the process of performing a read.
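
A minimal, illustrative sketch: peek at the first bytes of a stream from within a `'readable'` handler, then push them back so a later consumer still sees the full data (`getReadableStreamSomehow()` is the placeholder used elsewhere in this document):

```js
const readable = getReadableStreamSomehow();

readable.once('readable', () => {
  const chunk = readable.read(4);
  if (chunk !== null) {
    console.log('peeked at:', chunk);
    // Put the bytes back at the front of the internal buffer;
    // the next read() will return them again.
    readable.unshift(chunk);
  }
});
```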

##### readable.wrap(stream)
<!-- YAML
added: v0.9.4
-->

* `stream` {Stream} An "old style" readable stream

@@ -943,6 +1024,9 @@ myReader.on('readable', () => {
### Duplex and Transform Streams

#### Class: stream.Duplex
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

@@ -956,6 +1040,9 @@ Examples of Duplex streams include:
* [crypto streams][crypto]

#### Class: stream.Transform
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

@@ -1579,7 +1666,7 @@ For Duplex streams, `objectMode` can be set exclusively for either the Readable
or Writable side using the `readableObjectMode` and `writableObjectMode` options
respectively.

-In the following example, for instance, a new Transform stream (which is a 
+In the following example, for instance, a new Transform stream (which is a
type of [Duplex][] stream) is created that has an object mode Writable side
that accepts JavaScript numbers that are converted to hexadecimal strings on
the Readable side.
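
A sketch of the same idea, assuming the simplified constructor form (the names here are illustrative):

```js
const Transform = require('stream').Transform;

// The Writable side accepts JavaScript numbers (object mode);
// the Readable side emits hexadecimal strings.
const toHex = new Transform({
  writableObjectMode: true,
  transform(chunk, encoding, callback) {
    callback(null, Number(chunk).toString(16));
  }
});

toHex.setEncoding('utf8');
toHex.on('data', (hex) => console.log(hex)); // 'ff', then '1000'

toHex.write(255);
toHex.write(4096);
toHex.end();
```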