Master map-stream: Efficient Stream Processing in Node.js

Introduction to map-stream

The map-stream module is a small utility in the Node.js ecosystem for transforming data as it flows through a stream. Streams are an essential concept in Node.js, enabling data to be processed chunk by chunk rather than loaded into memory all at once.
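
As a quick illustration, here is a minimal sketch (the file name is hypothetical) that logs the size of each chunk as it arrives, instead of buffering the whole file:

  const fs = require('fs');

  fs.createReadStream('large-file.log')
    .on('data', (chunk) => {
      // each chunk is a Buffer of at most the stream's highWaterMark
      // (64 KiB by default for fs read streams)
      console.log(`received ${chunk.length} bytes`);
    })
    .on('end', () => console.log('done'));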

Getting Started with map-stream

First, install the map-stream package using npm:

npm install map-stream

Import it into your project:

const map = require('map-stream');

Basic Usage

Using map-stream is straightforward. Pass map a function that processes an individual data chunk and hands the result to a callback; map returns a stream you can pipe through:


  const map = require('map-stream');

  const transform = map(function (data, callback) {
// uppercase the chunk and emit it downstream
    callback(null, data.toString().toUpperCase());
  });

  process.stdin.pipe(transform).pipe(process.stdout);

The example above will convert all input data to uppercase.
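
For example, if the snippet above is saved as uppercase.js (a hypothetical file name), running echo "hello world" | node uppercase.js should print HELLO WORLD.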

Advanced Usage

Here are some more complex examples demonstrating different operations:

Filtering Data


  const filterStream = map(function (data, callback) {
    if (data.toString().indexOf('keep') !== -1) {
      callback(null, data);
    } else {
      callback();
    }
  });
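
Invoking the callback with no data argument tells map-stream to omit that chunk from the output, which is what lets map double as a filter: only chunks containing 'keep' pass through.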

Parsing JSON


  const parseJSON = map(function (data, callback) {
    try {
      const jsonObject = JSON.parse(data);
      callback(null, jsonObject);
    } catch (err) {
      callback(err);
    }
  });
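
Note that this stream emits parsed objects rather than strings or Buffers, so whatever consumes it must be able to handle objects; piping it straight into a destination like process.stdout would fail.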

Combining Streams


As noted above, parseJSON emits objects, and the earlier uppercase transform would break both the 'keep' filter and JSON parsing, so order matters when chaining. One workable pipeline (assuming each input chunk is a complete JSON document, such as a single line typed at a terminal) filters first, parses, then re-serializes before writing to stdout:

  const stringify = map(function (obj, callback) {
    callback(null, JSON.stringify(obj) + '\n');
  });

  process.stdin
    .pipe(filterStream)
    .pipe(parseJSON)
    .pipe(stringify)
    .pipe(process.stdout);

Full Application Example

To demonstrate a complete application, consider this example that reads JSON data from a file, applies a simple transformation, and writes the result to another file:


  const fs = require('fs');
  const map = require('map-stream');
  
  const inputFilePath = 'input.json';
  const outputFilePath = 'output.json';

  const readStream = fs.createReadStream(inputFilePath);
  const writeStream = fs.createWriteStream(outputFilePath);

  const transform = map(function (data, callback) {
    try {
      // assumes the file is small enough to arrive as a single chunk
      // containing one complete JSON document
      const jsonData = JSON.parse(data);
      jsonData.transformed = true;  // example transformation
      callback(null, JSON.stringify(jsonData) + '\n');
    } catch (err) {
      callback(err);
    }
  });

  readStream
    .pipe(transform)
    .pipe(writeStream);

  // pipe() does not forward errors, so listen on each stream separately
  readStream.on('error', (err) => console.error('Read error:', err.message));
  transform.on('error', (err) => console.error('Transform error:', err.message));
  writeStream.on('error', (err) => console.error('Write error:', err.message));

  writeStream.on('finish', () => {
    console.log('Processing completed!');
  });

This example demonstrates how to integrate map-stream with filesystem streams to process data efficiently.
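
For instance, assuming input.json contains a single small JSON document such as {"name":"demo"}, output.json should end up containing:

  {"name":"demo","transformed":true}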

By understanding and leveraging the power of map-stream, you can create efficient, scalable stream processing solutions in Node.js.
