Understanding Streams in Node.js

Last updated on August 16th, 2023

What are Streams in Node.js?

Node.js streams are a powerful feature of the Node.js runtime for handling data flow between input and output sources in an efficient and flexible manner. Streams in Node.js read or write data in chunks rather than loading the entire dataset into memory at once, which is especially beneficial when dealing with large volumes of data or limited system resources.

Types of Streams

Every Node.js stream is an instance of EventEmitter, emitting events as chunks of data flow through it. There are four fundamental types of streams in Node.js:

  1. Readable Streams: These streams allow you to read data from a source, such as a file or network request. Readable streams emit events like ‘data’, ‘end’, and ‘error’, and you can consume the data using the read() or pipe() methods.
  2. Writable Streams: Writable streams are used to write data to a destination, such as a file or a network response. They provide methods like write() and end() for sending data and emit events like ‘drain’ and ‘error’.
  3. Duplex Streams: Duplex streams represent both a readable and writable stream. They allow data to flow in both directions simultaneously. Duplex streams are commonly used for network sockets.
  4. Transform Streams: Transform streams are a type of duplex stream that can modify or transform data as it passes through. They are commonly used for tasks like compression, encryption, or data manipulation. Custom transform streams inherit from the Duplex class and implement the _transform() method (and, optionally, _flush()); a minimal sketch follows this list.
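
For instance, here is a minimal Transform stream built with the stream module's simplified constructor; it upper-cases whatever passes through it (the upperCase name is just an illustrative choice):

const { Transform } = require('stream');

// A Transform stream that upper-cases each chunk as it passes through.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // The first argument is an error (if any); the second is the transformed chunk.
    callback(null, chunk.toString().toUpperCase());
  }
});

// Try it: echo "hello" | node uppercase.js  ->  HELLO
process.stdin.pipe(upperCase).pipe(process.stdout);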

By leveraging streams, you can process data efficiently and handle it in smaller, manageable chunks. Streams offer benefits such as reduced memory usage, improved performance, and the ability to handle data in real-time. They are widely used in Node.js for tasks like file I/O, network communication, data processing, and building scalable applications.

Overall, Node.js streams provide a flexible and efficient way to handle data flow, making it easier to work with large datasets or handle real-time data processing requirements.

How can you use Streams in Node.js?

1. Reading a File Using a Readable Stream:
const fs = require('fs');

const readableStream = fs.createReadStream('input.txt');

readableStream.on('data', (chunk) => {
  console.log(`Received chunk: ${chunk}`);
});

readableStream.on('end', () => {
  console.log('Finished reading the file.');
});

readableStream.on('error', (error) => {
  console.error(`Error reading the file: ${error}`);
});

In this example, we create a readable stream using fs.createReadStream() and read data from the file ‘input.txt’. The stream emits the ‘data’ event for each chunk of data read from the file, and the ‘end’ event when the file reading is complete. Any errors encountered during reading will trigger the ‘error’ event.
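
Readable streams are also async iterables, so the same file can be consumed with a for await...of loop instead of event handlers. A sketch assuming the same input.txt (the readFile wrapper is just an illustrative name):

const fs = require('fs');

async function readFile() {
  const readableStream = fs.createReadStream('input.txt');
  try {
    // Each iteration yields one chunk, just like a 'data' event.
    for await (const chunk of readableStream) {
      console.log(`Received chunk: ${chunk}`);
    }
    console.log('Finished reading the file.');
  } catch (error) {
    console.error(`Error reading the file: ${error}`);
  }
}

readFile();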

2. Writing to a File Using a Writable Stream:

const fs = require('fs');

const writableStream = fs.createWriteStream('output.txt');

writableStream.write('Hello, ');
writableStream.write('Node.js Streams!');
writableStream.end();

writableStream.on('finish', () => {
  console.log('Data has been written to the file.');
});

writableStream.on('error', (error) => {
  console.error(`Error writing to the file: ${error}`);
});

In this example, we create a writable stream using fs.createWriteStream() and write data to the file ‘output.txt’ using the write() method. We then call end() to signal the end of writing. The ‘finish’ event is emitted when all data has been written, and the ‘error’ event is triggered if any error occurs during writing.
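
Note that write() returns false once the stream's internal buffer is full; the ‘drain’ event mentioned earlier signals when it is safe to resume writing. A minimal backpressure sketch (the writeMany helper is just an illustrative name):

const fs = require('fs');

const writableStream = fs.createWriteStream('output.txt');

// Write many lines, pausing whenever the internal buffer fills up.
function writeMany(count) {
  let i = 0;
  function writeNext() {
    while (i < count) {
      const ok = writableStream.write(`line ${i}\n`);
      i++;
      if (!ok) {
        // Buffer is full: wait for 'drain' before writing more.
        writableStream.once('drain', writeNext);
        return;
      }
    }
    writableStream.end();
  }
  writeNext();
}

writeMany(100000);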

3. Piping Streams:

const fs = require('fs');

const readableStream = fs.createReadStream('input.txt');
const writableStream = fs.createWriteStream('output.txt');

readableStream.pipe(writableStream);

readableStream.on('end', () => {
  console.log('Data has been read from the file and written to another file.');
});

readableStream.on('error', (error) => {
  console.error(`Error reading the file: ${error}`);
});

writableStream.on('finish', () => {
  console.log('Data has been written to the file.');
});

writableStream.on('error', (error) => {
  console.error(`Error writing to the file: ${error}`);
});

In this example, we create a readable stream from ‘input.txt’ and a writable stream to ‘output.txt’. By using the pipe() method, we connect the output of the readable stream to the input of the writable stream. This allows data to be read from the source and automatically written to the destination. The ‘end’ and ‘error’ events are handled similarly to the previous examples.
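
One caveat: pipe() does not forward errors between the two streams, which is why each stream gets its own ‘error’ handler above. The stream module also provides pipeline(), which wires up the same copy with a single error callback. A sketch of the equivalent code:

const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() destroys all streams on failure and reports the first error in one place.
pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('output.txt'),
  (error) => {
    if (error) {
      console.error(`Pipeline failed: ${error}`);
    } else {
      console.log('Data has been read from the file and written to another file.');
    }
  }
);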

These examples illustrate some basic usage scenarios of Node.js streams. Depending on your needs, you can combine and customize streams to achieve more complex data processing and manipulation tasks.
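
As one example of combining streams, the built-in zlib module exposes transform streams, so gzip-compressing a file only takes one more stage in the pipe:

const fs = require('fs');
const zlib = require('zlib');

// Read input.txt, compress it on the fly, and write the gzipped result.
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));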

For a deeper dive into streams, visit https://nodejs.org/api/stream.html.

Thank you for reading.
