We will cover Streams in Node.js under the following sub-topics:
- Introduction to Streams in Node.js
- Reading from Streams
- Writing to Streams
- Piping a Stream
- Chaining Streams
1. Introduction to Streams in Node.js
In Node.js, streams are used to move data from a source to a destination in a continuous manner. We will look at four types of streams in Node.js. They are:
- Readable − a stream used for read operations.
- Writable − a stream used for write operations.
- Duplex − a stream that can be used for both read and write operations.
- Transform − a type of duplex stream where the output is computed from the input (a minimal sketch follows this list).
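Only readable and writable streams appear in the examples later in this post, so here is a minimal sketch of a transform stream. The uppercase logic is just an illustration I chose, not part of the original tutorial:

const { Transform } = require('stream');

// A transform stream computes its output from its input.
// This one upper-cases whatever text passes through it.
const upperCase = new Transform({
    transform(chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase());
    }
});

// Pipe stdin through the transform and out to stdout.
process.stdin.pipe(upperCase).pipe(process.stdout);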
Each of the types of streams listed above is an instance of EventEmitter (EventEmitters were discussed in Part 8). They can fire several events at different times. Some of the most common events are:
- data − fired whenever data is available to read.
- end − fired when there is no more data to read.
- error − fired whenever there is an error receiving or writing data.
- finish − fired when all the data has been flushed to the underlying system.
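To see these events fire without touching the file system, you can wrap any iterable in a readable stream with Readable.from (available since Node.js 12); a small sketch:

const { Readable } = require('stream');

// Readable.from turns an iterable into a readable stream.
const demo = Readable.from(['hello ', 'streams']);

demo.on('data', function(chunk) {
    console.log('data event: ' + chunk); // fired once per chunk
});

demo.on('end', function() {
    console.log('end event: no more data');
});

demo.on('error', function(err) {
    console.log('error event: ' + err.message);
});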
Let’s now examine the operations you can perform with streams. We begin with reading.
2. Reading from a Stream
I have created a file named testinput.txt. The content is as follows:
Find all tutorial in kindsonthegenius.com Also watch the video lessons.
Then, in VS Code, write and run the following program:
var fs = require("fs");
var data = '';

// Create a readable stream object
var readStream = fs.createReadStream('D:/nodefiles/testinput.txt');

// Set the character encoding to utf8
readStream.setEncoding('UTF8');

// Handle the stream data event
readStream.on('data', function(chunk) {
    data += chunk;
});

// Handle the end event
readStream.on('end', function() {
    console.log(data);
});

// Handle the error event
readStream.on('error', function(err) {
    console.log(err.stack);
});

console.log("End of Program");
In the program above, we created a readable stream and then read from it as the data arrived. Notice that "End of Program" prints first, because the stream events fire asynchronously.
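As a side note, in Node.js 12 and later a readable stream is also an async iterable, so the same file can be read with for await...of instead of event handlers; a minimal sketch:

const fs = require("fs");

async function readFile() {
    let data = '';
    // Each iteration yields one chunk, already decoded as utf8.
    for await (const chunk of fs.createReadStream('D:/nodefiles/testinput.txt', 'utf8')) {
        data += chunk;
    }
    console.log(data);
}

readFile();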
3. Writing to a Stream
Now, we are going to write to a stream. The code below creates a writable stream and then writes some text into it.
var fs = require("fs");
var data = 'kindsonthegenius.com is the best place to learn';

// Create a writable stream
var writeStream = fs.createWriteStream('D:/nodefiles/testoutput.txt');

// Write the data to the stream with utf8 character encoding
writeStream.write(data, 'UTF8');

// Indicate the end of file
writeStream.end();

// Handle the stream finish event
writeStream.on('finish', function() {
    console.log("Data was written successfully.");
});

// Handle the error event
writeStream.on('error', function(err) {
    console.log('Error Occurred: ' + err.stack);
});

console.log("End of Program");
This program creates the specified file, opens it, and writes the data into it. Run the code, then check the directory to confirm that a new file was created.
Also try introducing an error, just to make the stream fire the error event.
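One easy way to do that is to point the stream at a folder that does not exist (here I assume D:/no-such-folder is absent on your machine); the open fails, so the 'error' event fires instead of 'finish':

const fs = require("fs");

// The target folder does not exist, so opening the file fails.
const badStream = fs.createWriteStream('D:/no-such-folder/testoutput.txt');

badStream.on('error', function(err) {
    console.log('Error Occurred: ' + err.message); // ENOENT
});

badStream.write('this write will not succeed');
badStream.end();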
4. Piping a Stream to another Stream
Sometimes two streams work together: you use a pipe to connect the output of one stream as the input to another stream. This is called piping, and you can pipe as many streams together as you want. In the code below, we create two streams, readerStream and writerStream, then pipe the readerStream into the writerStream so that data read from the input file is written to the output file.
// Program to demonstrate the piping operation
var fs = require("fs");

// Create a readable stream object
var readerStream = fs.createReadStream('D:/nodefiles/testinput.txt');

// Create a writable stream object
var writerStream = fs.createWriteStream('D:/nodefiles/testoutput.txt');

// Pipe the readerStream into the writerStream
readerStream.pipe(writerStream);

console.log("End of Program");
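One caveat: pipe() does not forward errors from the source stream to the destination, so each stream needs its own error handler. Since Node.js 10 the stream module provides pipeline(), which wires the streams together and reports any error through a single callback; a sketch of the same copy operation:

const fs = require("fs");
const { pipeline } = require("stream");

pipeline(
    fs.createReadStream('D:/nodefiles/testinput.txt'),
    fs.createWriteStream('D:/nodefiles/testoutput.txt'),
    function(err) {
        // Called once, with an error from any stream in the chain.
        if (err) {
            console.log('Pipeline failed: ' + err.message);
        } else {
            console.log('Pipeline succeeded.');
        }
    }
);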
5. Chaining Streams
Similar to piping is another mechanism called chaining.
Chaining connects the output of one stream to another stream, which lets you compose multiple stream operations into a single expression. Oftentimes, chaining is used together with piping.
For example, we are going to use piping and chaining to first compress a file, and then decompress the same file.
The code is given below:
var fs = require("fs");
var zlib = require('zlib'); // compression module

// Compress the file testinput.txt to testinput.txt.gz
fs.createReadStream('D:/nodefiles/testinput.txt')
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream('D:/nodefiles/testinput.txt.gz'));

console.log("File was Compressed.");
If you run the code above successfully, it compresses the specified file. Check the directory and you'll notice that a compressed file, testinput.txt.gz, has been created.
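Note that the console.log in the compression code runs immediately, before the compressed file is fully written, because the stream operations are asynchronous. If you need to know when the file is actually ready, you can listen for the 'finish' event on the last stream (pipe() returns its destination stream, which is what makes this chaining possible); a sketch:

const fs = require("fs");
const zlib = require("zlib");

fs.createReadStream('D:/nodefiles/testinput.txt')
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream('D:/nodefiles/testinput.txt.gz'))
    .on('finish', function() {
        // Fires only after all compressed data is flushed to disk.
        console.log("File was Compressed.");
    });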
Now we will run code to decompress the same file. The code is given below:
var fs = require("fs");
var zlib = require('zlib');

// Decompress the file testinput.txt.gz to testinput.txt
fs.createReadStream('D:/nodefiles/testinput.txt.gz')
    .pipe(zlib.createGunzip())
    .pipe(fs.createWriteStream('D:/nodefiles/testinput.txt'));

console.log("File was Decompressed.");