A stream in Node.js is an abstract interface implemented by many different objects. Because Node.js is asynchronous and event-driven, it excels at I/O-bound work, and streams, which behave much like Unix pipes, are a natural fit for that model.
For example, an HTTP server request is a stream, just like the standard output (stdout). The request is a readable stream, while the response is a writable stream. Streams can be classified as Readable, Writable, or Duplex (both readable and writable). Readable streams let you read data from a source, writable streams let you write data to a destination, and a duplex stream, such as a TCP socket connection, can do both.
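The duplex case can be illustrated with a short, self-contained sketch using the built-in net module. The echo behavior and the message text here are just for illustration; passing port 0 to listen() asks the operating system for any free port:

var net = require('net');

// A TCP socket is a duplex stream: the same object is both read from and written to.
var server = net.createServer(function(socket) {
  socket.pipe(socket); // echo every byte the client sends straight back
});

server.listen(0, function() {
  var client = net.connect(server.address().port, function() {
    client.write('hello');
  });
  client.on('data', function(chunk) {
    console.log('echoed: ' + chunk); // logs the text we just wrote
    client.end();
    server.close();
  });
});

Note that the server reads from and writes to the very same socket object, which is exactly what "duplex" means.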
All streams in Node.js are instances of EventEmitter. Here is an example:
var fs = require('fs');

var readStream = fs.createReadStream('file.txt');
var text = '';

// 'data' fires each time a chunk is read from the file.
// Without an encoding set, chunk is a Buffer; += converts it to a string.
readStream.on('data', function(chunk) {
  text += chunk;
});

// 'end' fires once the whole file has been consumed.
readStream.on('end', function() {
  console.log(text);
});
In the code above, fs.createReadStream()
provides a readable stream object. By using this object, we can listen for the data
event and attach a callback. Whenever chunks of data are read, they are passed to the callback and appended to the text
string. When all data has been read, the stream emits an end
event, and we log the text.
You can set the encoding on a stream by calling readStream.setEncoding('utf8'). The data will then be decoded as UTF-8 and passed to your callback as a string rather than a Buffer.
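As a minimal, self-contained sketch (the file name sample.txt and its contents are arbitrary; we create the file first so the example runs on its own):

var fs = require('fs');

// Create a small sample file so the example is self-contained.
fs.writeFileSync('sample.txt', 'héllo wörld');

var readStream = fs.createReadStream('sample.txt');
readStream.setEncoding('utf8');

readStream.on('data', function(chunk) {
  // With an encoding set, each chunk is a decoded string, not a Buffer.
  console.log(typeof chunk); // 'string'
});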
Additionally, Node.js provides a useful function called pipe()
for transferring the flow of data from one stream to another. For instance, we can pipe a readable stream to a writable stream like this:
var fs = require('fs');
var readStream = fs.createReadStream('file1.txt');
var writeStream = fs.createWriteStream('file2.txt');
readStream.pipe(writeStream);
In this case, we pipe the readable stream of file1.txt
to the writable stream for file2.txt
. The pipe() method manages the data flow automatically, so you don't have to wire up the data and end callbacks yourself.
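One caveat: pipe() does not forward errors from one stream to another, so each stream needs its own 'error' handler. On Node 10 and later, the built-in stream.pipeline() wires up error handling and cleanup for you. Here is a sketch using a deliberately missing input file (the file names are arbitrary):

var fs = require('fs');
var stream = require('stream');

// stream.pipeline() connects the streams and reports an error from
// any of them to a single callback. 'no-such-file.txt' does not exist,
// so the read stream fails with ENOENT.
stream.pipeline(
  fs.createReadStream('no-such-file.txt'),
  fs.createWriteStream('copy.txt'),
  function(err) {
    if (err) {
      console.log('pipeline failed: ' + err.code);
    } else {
      console.log('pipeline succeeded');
    }
  }
);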
The pipe() method also returns the destination stream, which makes chaining possible: you can pass data through intermediate transform streams on its way to a final destination:
var fs = require('fs');
var zlib = require('zlib');

var readStream = fs.createReadStream('file.txt');
var gzip = zlib.createGzip();
var writeStream = fs.createWriteStream('file.txt.gz');

readStream.pipe(gzip).pipe(writeStream);

In this example, we pipe readStream through gzip, a transform stream that compresses the data, and then pipe the compressed output to writeStream
. The purpose of the pipe() method is to manage buffering and flow control (backpressure): it moves data at a rate the destination can absorb, so a fast source cannot flood memory when writing to a slow destination.
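This buffering behavior can be observed directly: writable.write() returns false once the stream's internal buffer exceeds its highWaterMark, and the 'drain' event signals when it is safe to write again. This is the mechanism pipe() uses under the hood. A small sketch (the out.txt file name and the 1 MiB chunk size are arbitrary):

var fs = require('fs');

var writeStream = fs.createWriteStream('out.txt');
var bigChunk = Buffer.alloc(1024 * 1024); // 1 MiB of zero bytes

// write() returns false because one megabyte exceeds the default
// 16 KiB highWaterMark, signalling the producer to pause.
var ok = writeStream.write(bigChunk);
console.log('keep writing? ' + ok); // false

writeStream.on('drain', function() {
  // The internal buffer has been flushed; writing may resume.
  console.log('buffer drained, safe to write again');
  writeStream.end();
});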