Learn the Concept of Streams in Node.js

In the last article, we discussed the concept of buffers in Node.js. In this article, we are going to discuss the concept of streams in Node.js and the various operations on streams, along with suitable examples.

Streams in Node.js
In Node.js, streams are objects that help the developer read data from a file, or write data to a file at a particular location on the machine, in a continuous fashion. There are four types of streams in Node.js, as follows:

  • Readable – The readable stream is used for the read operation.
  • Writable – The writable stream is used for the write operation.
  • Duplex – The duplex stream can be used to execute both read and write operations.
  • Transform – The transform stream is a type of duplex stream where the output is computed based on the input.

Here is the interesting part about the above streams: all of them are EventEmitter instances, which can emit several events at different points during their execution. The following are some of the commonly used events.

  • ‘data’ – The ‘data’ event is fired when there is data available to read from the file.
  • ‘end’ – The ‘end’ event is fired when the system has finished reading the data from the file, or when there is no more data to read from the file.
  • ‘error’ – The ‘error’ event is fired whenever there is an error while reading data from, or writing data to, the file.
  • ‘finish’ – The ‘finish’ event is fired when all the data from the file has been flushed successfully to the underlying system.

Let’s see these streams and events in action.

Reading from a Stream
In this case, we are going to read data from the file ‘inputFile.txt’, which is present at a particular location on the machine and contains some sample text to read.

Now, create a Node.js project in Eclipse with the name ‘StreamsNodeJSApplication’ and add a JS file with the name ‘file-stream-reader.js’.

Explanation of the code

  • Firstly, we import the “fs” module using the require directive and store the returned instance in a fileSystem variable.
  • Next, we create a fileReaderStream instance using ‘fileSystem.createReadStream(‘file path’)’, which returns a readable stream.
  • Next, we set the encoding of the data to ‘UTF8’.
  • Next, we use the ‘fileReaderStream’ instance to handle the reading stream’s data, end, and error events, taking an appropriate action in each handler.
  • Lastly, we log a predefined message, ‘Program execution has ended successfully!’, to the console.

Output
When we execute the code present in the ‘file-stream-reader.js’ file by using the node command, we can observe the following output.
Writing to a Stream
To write data to a file using a stream, we are going to use the same Node.js project ‘StreamsNodeJSApplication’ and add a JS file with the name ‘file-stream-writer.js’. We will then walk through the contents of this file.

Explanation of the code

  • Firstly, we import the “fs” module using the require directive and store the returned instance in a fileSystem variable.
  • Next, we create a fileWriterStream instance using ‘fileSystem.createWriteStream(‘file path’)’, which returns a writable stream.
  • Next, we write the data with ‘UTF8’ encoding and then mark the end of the file.
  • Next, we use the ‘fileWriterStream’ instance to handle the writing stream’s finish and error events, taking an appropriate action in each handler.
  • Lastly, we log a predefined message, ‘Program execution has ended successfully!’, to the console.

Output
When we execute the code present in the ‘file-stream-writer.js’ file by using the node command, we can observe the following output.
Also, we can check the contents of the ‘fileOutput.txt’ file, which we have written using the above program.

Piping the Streams
Piping can be defined as a mechanism through which the output of one stream is fed as the input to another stream, so that data passes from one stream to the next. There is no limit on the number of such piping operations. The following is a demonstration of the piping mechanism.

Let’s create a JS file with the name ‘piping.js’.

Explanation of the code

  • Firstly, we import the “fs” module using the require directive and store the returned instance in a fileSystem variable.
  • Next, we create a fileReaderStream instance using ‘fileSystem.createReadStream(‘file path’)’.
  • Next, we create a fileWriterStream instance using ‘fileSystem.createWriteStream(‘file path’)’.
  • Next, we pipe the two streams using ‘fileReaderStream.pipe(fileWriterStream)’. As a result, whatever data is in ‘inputData.txt’ is copied as-is into the ‘outputData.txt’ file. This mechanism is known as piping.
  • Lastly, we log a predefined message, ‘Program execution has ended successfully!’, to the console.

Output
When we execute the code present in the ‘piping.js’ file by using the node command, we can observe the following output.

Chaining the Streams
Chaining can be defined as a mechanism through which we connect the output of one stream to another stream and build a chain of multiple such stream operations. It is generally used with piping operations. The following is a demonstration of the chaining mechanism.

Let’s create a JS file with the name ‘chaining.js’.

Explanation of the code

  • Firstly, we import the “fs” module using the require directive and store the returned instance in a fileSystem variable; we also import the “zlib” module for compression.
  • Next, we create a fileReaderStream instance using ‘fileSystem.createReadStream(‘file path’)’ and chain it through ‘pipe(zlib.createGzip())’ followed by ‘pipe(fileSystem.createWriteStream(‘file path’))’ to generate a compressed file.
  • Lastly, we log a predefined message, ‘File has compressed successfully!’, to the console.

Output
When we execute the code present in the ‘chaining.js’ file by using the node command, we can observe the following output.


Conclusion
In this article, we have discussed the reader and writer streams of Node.js, along with the events frequently emitted by these streams. Later, we used these streams to demonstrate the piping and chaining mechanisms in Node.js.

1 COMMENT

  1. How to merge 2 createReadStreams to 1 createWriteStream?

    var fs = require('fs');
    var readStream1 = fs.createReadStream('file1.mp4');
    var readStream2 = fs.createReadStream('file2.mp4');
    var writeStream = fs.createWriteStream('file3.mp4');

    readStream1.pipe(writeStream, {flags:a});
    readStream2.pipe(writeStream);

    This script can merge both files, but the MP4 player only detects the first stream, although file3.mp4 contains both files.

    How to merge 2 read streams into 1 write Stream?
