Introduction to Readable-Stream
The readable-stream library mirrors Node.js's core stream module as a userland npm package, giving applications a stable, versioned interface for handling streams of data. In this blog post, we will introduce its key APIs with practical code snippets, and finish with a small app example that illustrates how these APIs can be used together.
Core APIs
1. Stream.Readable
The Stream.Readable class is used to create readable stream instances. Here’s how you can create and use one:
```javascript
const { Readable } = require('readable-stream');

const readable = new Readable({
  read(size) {
    this.push('data chunk');
    this.push(null); // Indicates the end of the stream
  }
});

readable.on('data', (chunk) => {
  console.log(`Received ${chunk}`);
});
```
2. Stream.Writable
The Stream.Writable class is used to create writable stream instances. Here’s an example:
```javascript
const { Writable } = require('readable-stream');

const writable = new Writable({
  write(chunk, encoding, callback) {
    console.log(`Writing: ${chunk}`);
    callback();
  }
});

writable.write('data chunk');
writable.end('done');
```
3. Stream.Duplex
The Stream.Duplex class allows you to create streams that are both readable and writable. Here’s an example:
```javascript
const { Duplex } = require('readable-stream');

const duplex = new Duplex({
  read(size) {
    this.push('data chunk');
    this.push(null);
  },
  write(chunk, encoding, callback) {
    console.log(`Writing: ${chunk}`);
    callback();
  }
});

duplex.on('data', (chunk) => {
  console.log(`Received ${chunk}`);
});

duplex.write('data chunk');
duplex.end('done');
```
4. Stream.Transform
The Stream.Transform class is used to create transform streams, which can both read and write data, allowing modification of the data as it passes through. Example:
```javascript
const { Transform } = require('readable-stream');

const transform = new Transform({
  transform(chunk, encoding, callback) {
    const modifiedChunk = chunk.toString().toUpperCase();
    this.push(modifiedChunk);
    callback();
  }
});

transform.on('data', (chunk) => {
  console.log(`Transformed: ${chunk}`);
});

transform.write('data chunk');
transform.end();
```
5. Pipe method
The pipe method sends the output of one readable stream directly into a writable stream, handling backpressure between the two automatically:
```javascript
const { Readable, Writable } = require('readable-stream');

const readable = new Readable({
  read(size) {
    this.push('data chunk');
    this.push(null);
  }
});

const writable = new Writable({
  write(chunk, encoding, callback) {
    console.log(`Writing: ${chunk}`);
    callback();
  }
});

readable.pipe(writable);
```
App Example
Here’s an example of an application that uses the Readable, Writable, and Transform streams to process data:
```javascript
const { Readable, Writable, Transform } = require('readable-stream');

// Readable stream produces data
const readable = new Readable({
  read(size) {
    this.push('example data chunk ');
    this.push(null);
  }
});

// Transform stream modifies incoming data
const transform = new Transform({
  transform(chunk, encoding, callback) {
    const modifiedChunk = chunk.toString().replace('chunk', 'snippet');
    this.push(modifiedChunk);
    callback();
  }
});

// Writable stream consumes data
const writable = new Writable({
  write(chunk, encoding, callback) {
    console.log(`Final Output: ${chunk}`);
    callback();
  }
});

// Pipe streams together
readable.pipe(transform).pipe(writable);
```
This setup reads data from the readable stream, processes it with the transform stream to replace ‘chunk’ with ‘snippet’, and finally writes the transformed data to the writable stream.
Using these APIs creatively allows developers to build complex data-processing pipelines with ease. The readable-stream library ensures efficient data handling and robust integration with the Node.js stream infrastructure.