Streams

Streams are objects that let you read data from a source and write data to a destination.

Much like Unix pipes, streams let you easily read data from a source and pipe it to a destination. Simply put, a stream is nothing but an EventEmitter that implements some special methods.

Depending on the methods implemented, a stream becomes Readable, Writable, or Duplex (both readable and writable).

There are four types of streams in Node.js:

  • Readable: This stream is used for read operations.
  • Writable: This stream is used for write operations.
  • Duplex: This stream can be used for both read and write operations.
  • Transform: A type of duplex stream where the output is computed from the input.

Streams can be readable, writable, or both. We can connect a readable stream to a writable stream using the pipe() method.

All of the above streams are EventEmitter instances that emit several events at different points during their execution.
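All four stream types map directly to classes exposed by Node's built-in 'stream' module, and every stream instance is also an EventEmitter. The following is a minimal sketch illustrating this; it uses only the core 'stream' and 'events' modules.


var stream = require("stream");
var EventEmitter = require("events").EventEmitter;

// The four stream types are exposed as classes by the core 'stream' module.
console.log(typeof stream.Readable);   // 'function'
console.log(typeof stream.Writable);   // 'function'
console.log(typeof stream.Duplex);     // 'function'
console.log(typeof stream.Transform);  // 'function'

// Every stream instance is also an EventEmitter.
var readable = new stream.Readable({ read: function() {} });
console.log(readable instanceof EventEmitter); // true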

Readable Stream

A readable stream lets you read data from a source. The source can be anything. It can be a simple file on your file system, a buffer in memory or even another stream.

As streams are EventEmitters, they emit several events at various points. We will use these events to work with the streams.

  • Import the required module by calling require("fs") to load the "fs" module, storing the returned "fs" instance in a fileSystem variable as shown below.
  • 
    var fileSystem = require("fs");
    

  • Create a fileReaderStream instance by using fileSystem.createReadStream('file path'), where the instance is returned into the fileReaderStream variable as shown below.
  • 
    // Creation of a readable stream.
    var fileReaderStream = fileSystem.createReadStream('C:\\PATH\\testFile.txt');
    

  • We have set the encoding of data as 'UTF8'.
  • 
    // Set the encoding to the utf8 format. 
    fileReaderStream.setEncoding('UTF8');
    

  • We are using the 'fileReaderStream' instance to handle the readable stream events for data, end and error, taking the actions shown below. The 'data' handler appends each chunk of data to the fileContent variable.
  • 
    // Handling the stream events for data, end and error.
    fileReaderStream.on('data', function(dataBits) {
    	fileContent += dataBits;
    });
    

  • Once the 'end' event fires, we log the accumulated file content on to the console.
  • 
    fileReaderStream.on('end',function(){
       console.log(fileContent);
    });
    

Complete code for the read stream in Node.js


var fileSystem = require("fs");
var fileContent = '';

// Creation of a readable stream.
var fileReaderStream = fileSystem.createReadStream('C:\\PATH\\testFile.txt');

// Set the encoding to the utf8 format. 
fileReaderStream.setEncoding('UTF8');

// Handling the stream events for data, end and error.
fileReaderStream.on('data', function(dataBits) {
	fileContent += dataBits;
});

fileReaderStream.on('end',function(){
   console.log(fileContent);
});

fileReaderStream.on('error', function(error){
   console.log(error.stack);
});

console.log ("Execution Completed!");

The following are some of the commonly used stream events.

  • data – The 'data' event is fired when there is data available to read from the file.
  • end – The 'end' event is fired when system has finished reading the data from the file or when there is no data to read from the file.
  • error – The 'error' event is fired whenever there is any error while receiving or writing data to the file.
  • finish – The 'finish' event is fired on a writable stream when all the data has been flushed successfully to the underlying system.

Methods of a Readable Stream in Node.js

  • readable.pause() : The pause() function switches the stream from flowing mode to paused mode; any data that is available remains in the internal buffer (see the sketch after this list).
  • readable.resume() : The resume() function switches the stream from paused mode back to flowing mode, and the stream resumes emitting events.
  • readable.isPaused() : The isPaused() function reports the current operating state of the readable stream; it returns true if the stream is in paused mode.
  • readable.pipe() : The pipe() function attaches a writable stream to the readable stream, switching the readable into flowing mode and pushing its data to the attached writable.
  • readable.unpipe() : The unpipe() function detaches a writable stream that was previously attached to the readable stream.
  • readable.read() : The read() function pulls data out of the internal buffer; the data is returned as Buffer objects unless another format has been specified with readable.setEncoding(). If there is no data to pull, null is returned.
  • readable.setEncoding() : The setEncoding() function sets the encoding for the readable stream. By default the data is pulled in the form of buffers.
  • readable.unshift() : The unshift() function pushes a chunk of data back onto the internal buffer.
  • readable.wrap() : The wrap() function wraps an old-style stream so that it can be used as a data source for a modern readable stream.
  • readable.destroy() : The destroy() function destroys the readable stream and releases any resources it holds.
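As an illustration of pause(), resume() and isPaused(), the following is a minimal sketch that pauses the stream for a second after the first chunk arrives and then resumes it. The file name 'testData.txt' is just a placeholder for this example.


var fileSystem = require("fs");

// 'testData.txt' is a placeholder input file for this sketch.
var fileReaderStream = fileSystem.createReadStream('testData.txt');
fileReaderStream.setEncoding('UTF8');

fileReaderStream.on('data', function(dataBits) {
    console.log('Received ' + dataBits.length + ' characters.');

    // Switch the stream from flowing mode to paused mode.
    fileReaderStream.pause();
    console.log('Paused? ' + fileReaderStream.isPaused()); // true

    // Resume the flow after one second.
    setTimeout(function() {
        fileReaderStream.resume();
    }, 1000);
});

fileReaderStream.on('end', function() {
    console.log('No more data to read.');
});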

Writing to a Stream in Node.js

A writable stream is used to write data to a specific target. It has two main methods:

  • write
  • end

writable.write() :

The write() function writes a chunk of data to the stream and calls the supplied callback, if one is provided, after the data has been handled. An encoding can be supplied if the chunk is a string. write() returns true if the data was handled and the stream is ready to accept more data, and false if writing should pause until the 'drain' event is emitted.

writable.end() :

The end() function writes an optional last chunk of data to the stream and calls the supplied callback when the stream has finished writing. The 'finish' event is emitted after end() is called and all data has been flushed. An encoding can be supplied if the chunk is a string.
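The following is a minimal sketch of both methods; the output file name 'log.txt' is just a placeholder for this example.


var fileSystem = require("fs");

// 'log.txt' is a placeholder output file for this sketch.
var fileWriterStream = fileSystem.createWriteStream('log.txt');

// write() takes the chunk, an optional encoding and an optional callback.
var canWriteMore = fileWriterStream.write('first line\n', 'UTF8', function() {
    console.log('First chunk has been handled.');
});
console.log('Ready for more data? ' + canWriteMore);

// end() writes an optional last chunk, after which no more writes are allowed;
// its callback runs once the stream has finished.
fileWriterStream.end('last line\n', 'UTF8', function() {
    console.log('All data has been flushed.');
});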

Steps to use a write stream in Node.js:

  • Import the required module by calling require('fs') to load the 'fs' module, storing the returned 'fs' instance in a fileSystem variable as shown below.
  • 
    var fileSystem = require("fs");
    

  • Create a fileWriterStream instance by using fileSystem.createWriteStream('file path'), where the instance is returned into the fileWriterStream variable as shown below.
  • 
    // Creation of a writable stream
    var fileWriterStream = fileSystem.createWriteStream('C:\\PATH\\outputFile.txt');
    

  • Write the data to the stream with the encoding set to 'UTF8', followed by marking the end of the file.
  • 
    // Writing data to the stream with encoding to be utf8
    fileWriterStream.write(fileData,'UTF8');
    
    // Marking the end of file
    fileWriterStream.end();
    

  • Use the 'fileWriterStream' instance to handle the writable stream events for finish and error, taking the actions shown below.
  • 
    // Handling the stream events i.e. finish and error events
    fileWriterStream.on('finish', function() {
        console.log ("The write operation on the file has been completed.");
    });
    
    fileWriterStream.on('error', function(error){
       console.log(error.stack);
    });
    

  • Log a predefined message 'Execution completed!' on to the console.
  • 
    console.log ("Execution completed!");
    

Complete code for the write stream in Node.js


var fileSystem = require("fs");
var fileData = 'Chercher tech | Learning is fun';

// Creation of a writable stream
var fileWriterStream = fileSystem.createWriteStream('C:\\PATH\\outputFile.txt');

// Writing data to the stream with encoding to be utf8
fileWriterStream.write(fileData,'UTF8');

// Marking the end of file
fileWriterStream.end();

// Handling the stream events i.e. finish and error events
fileWriterStream.on('finish', function() {
    console.log ("The write operation on the file has been completed.");
});

fileWriterStream.on('error', function(error){
   console.log(error.stack);
});

console.log ("Execution completed!");

The following are some of the commonly used events in writable streams.

  • drain : This event is fired when a call to the stream.write(chunk) method has returned false; it indicates when it is appropriate to resume writing data (see the sketch after this list).
  • pipe : This event is fired when stream.pipe() method is called on a readable stream indicating the addition of the writable in the set of destinations of the readable.
  • unpipe : This event is fired when stream.unpipe() method is called on a readable stream indicating the removal of the writable from the set of destinations of the readable.
  • error : This event is fired when an error occurred while writing or piping the data.
  • close : This event is fired when the stream is closed. It indicates that no more events will be emitted and no further computation will occur.
  • finish : This event is fired when all the data is successfully flushed.
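As an illustration of the 'drain' event, the sketch below writes a large number of lines and stops whenever write() returns false, resuming only once 'drain' fires. The file name 'bigOutput.txt' is just a placeholder for this example.


var fileSystem = require("fs");

// 'bigOutput.txt' is a placeholder output file for this sketch.
var fileWriterStream = fileSystem.createWriteStream('bigOutput.txt');

var remainingLines = 100000;

function writeLines() {
    var ok = true;
    while (remainingLines > 0 && ok) {
        remainingLines--;
        // write() returns false once the internal buffer is full.
        ok = fileWriterStream.write('line ' + remainingLines + '\n');
    }
    if (remainingLines > 0) {
        // Wait for the buffer to drain before writing the rest.
        fileWriterStream.once('drain', writeLines);
    } else {
        // Marking the end of the file.
        fileWriterStream.end();
    }
}

writeLines();

fileWriterStream.on('finish', function() {
    console.log('All the lines have been flushed.');
});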

Piping of Streams

Piping can be defined as a mechanism in which the output of one stream is provided as the input to another stream, so we can pass data from one stream to the other. There is no limit on the number of such piping operations.

The built-in function pipe() attaches a readable stream to a writable stream, passing the data from one to the other.

  • Import the required module by calling require('fs') to load the 'fs' module, storing the returned 'fs' instance in a fileSystem variable as shown below.
  • 
    var fileSystem = require("fs");
    

  • Create a fileReaderStream instance by using 'fileSystem.createReadStream('file path')', where the instance is returned into the fileReaderStream variable as shown below.
  • 
    // Creation of a readable stream
    var fileReaderStream = fileSystem.createReadStream('testData.txt');
    

  • Create a fileWriterStream instance by using 'fileSystem.createWriteStream('file path')', where the instance is returned into the fileWriterStream variable as shown below.
  • 
    // Creation of a writable stream
    var fileWriterStream = fileSystem.createWriteStream('outputData.txt');
    

  • Pipe both streams using the function 'fileReaderStream.pipe(fileWriterStream)'. As a result, whatever data is in the 'testData.txt' file will be copied as it is into the 'outputData.txt' file. This mechanism is known as piping.
  • 
    // Here we are Piping the read and write operations
    fileReaderStream.pipe(fileWriterStream);
    

  • Print a predefined message 'Execution completed!' on to the console.
  • 
    console.log ("Execution completed!");
    


Complete example for piping in Node.js


var fileSystem = require("fs");

// Creation of a readable stream
var fileReaderStream = fileSystem.createReadStream('testData.txt');

// Creation of a writable stream
var fileWriterStream = fileSystem.createWriteStream('outputData.txt');

// Here we are Piping the read and write operations
fileReaderStream.pipe(fileWriterStream);

console.log ("Execution completed!");
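Because pipe() returns the destination stream, pipes can also be chained. The sketch below assumes the same 'testData.txt' input file and uses the built-in zlib module to compress the data before writing it out.


var fileSystem = require("fs");
var zlib = require("zlib");

// Read testData.txt, gzip it, and write the compressed output to testData.txt.gz.
fileSystem.createReadStream('testData.txt')
    .pipe(zlib.createGzip())
    .pipe(fileSystem.createWriteStream('testData.txt.gz'));

console.log('Piping has been set up.');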

Write Stream Methods

  • writable.cork() : cork() function is used to force all the written data to be buffered in memory.
    This buffered data is flushed in either of the following scenarios :
    • stream.uncork() method is called.
    • stream.end() method is called.
  • writable.uncork() : The uncork() function flushes all the data buffered since stream.cork() was called (see the sketch after this list).
  • writable.write() : The write() function writes some data to the stream and calls the given callback once the data has been handled successfully.
  • writable.setDefaultEncoding() : The setDefaultEncoding() function sets the default encoding for the writable stream.
  • writable.end() : The end() function signals that no more data will be written to the writable stream.
  • writable.destroy() : The destroy() function destroys the writable stream and releases any resources it holds.
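A minimal sketch of cork() and uncork() is shown below; the output file name 'corked.txt' is just a placeholder. The two writes are held in memory and flushed together once uncork() is called.


var fileSystem = require("fs");

// 'corked.txt' is a placeholder output file for this sketch.
var fileWriterStream = fileSystem.createWriteStream('corked.txt');

// Buffer the following writes in memory instead of flushing them immediately.
fileWriterStream.cork();
fileWriterStream.write('header\n');
fileWriterStream.write('body\n');

// Flush the buffered data on the next tick of the event loop.
process.nextTick(function() {
    fileWriterStream.uncork();
    // Marking the end of the file.
    fileWriterStream.end();
});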

Duplex Stream

A duplex stream is one that is both readable and writable. Most often, however, a duplex stream refers to a stream that has two full, independent streams embedded in it: one flowing out and one flowing in.
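A minimal sketch of a custom duplex stream built with the core 'stream' module is shown below: the readable side pushes a fixed message while the writable side simply logs whatever flows in.


var stream = require("stream");

var duplexStream = new stream.Duplex({
    // Readable side: push a single message, then signal the end with null.
    read: function() {
        this.push('data flowing out\n');
        this.push(null);
    },
    // Writable side: log each chunk that flows in.
    write: function(chunk, encoding, callback) {
        console.log('Received: ' + chunk.toString());
        callback();
    }
});

duplexStream.on('data', function(chunk) {
    console.log('Read: ' + chunk.toString());
});

duplexStream.write('data flowing in\n');
duplexStream.end();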

Transform Stream

Transform streams are duplex streams that can transform or modify data as it is read and written, where the output is in some way related to the input. These streams read the input data, transform it with a manipulating function, and output the new data.
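A minimal sketch of a transform stream is shown below: it upper-cases every chunk passing through it and is piped between process.stdin and process.stdout, so the output is clearly related to the input.


var stream = require("stream");

// A transform stream that upper-cases every chunk passing through it.
var upperCaseTransform = new stream.Transform({
    transform: function(chunk, encoding, callback) {
        this.push(chunk.toString().toUpperCase());
        callback();
    }
});

// Pipe standard input through the transform to standard output.
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);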
