Node.js Flashcards

Flashcards to refresh Node.js knowledge

1
Q

How many different stream classes are there in Node.js?

A

There are four: Readable, Writable, Duplex and Transform.

2
Q

Which of the four stream types in Node have a “pipe” method?

A

Readable, Transform and Duplex (so all readable streams)

3
Q

What does the “pipe” method in a stream return?

A

It returns the destination stream, to allow multiple calls to be chained together

4
Q

What kind of mechanism does the “pipe” method support in a stream? What does it do?

A

It supports backpressure: if the consumer cannot consume data as fast as the producer produces it, the producer is paused until the consumer catches up.

5
Q

Are the errors forwarded upstream with the pipe mechanism in streams?

A

Not by default, because it would be difficult to know exactly where the error happened.

6
Q

What kind of stream is the “request” object?

A

It is a readable stream.

7
Q

Can you interrupt a “req” request?

A

Yes, by calling req.abort(); the interruption can be detected with req.on('abort', …);

8
Q

What is the difference between flowing and non-flowing mode of a readable stream and which one of them is the default mode?

A

Non-flowing (paused) mode: data is explicitly pulled from the stream, on demand; when the buffered data is exhausted, you wait for another “readable” event.
ex:
process.stdin
  .on('readable', function () {
    console.log('new data available');
    let chunk;
    while ((chunk = process.stdin.read()) !== null) {
      console.log('chunk read:', chunk.toString());
    }
  })
  .on('end', function () {
    process.stdout.write('End of stream');
  });

9
Q

What kind of data is contained in a Buffer?

A

Raw binary data (bytes); a Buffer is a subclass of Uint8Array.

10
Q

What kind of data do streams in Node.js operate with?

A

strings, Buffers and Uint8Arrays for streams operating in binary mode, and JavaScript objects for streams operating in object mode;

11
Q

When can a stream switch between non-object and object mode?

A

Only when the stream is created. It is not safe to switch an existing stream into object mode.

12
Q

How can you retrieve the internal Buffer from a readable or a writable stream?

A

By using the writable.writableBuffer or readable.readableBuffer properties

13
Q

What kind of role is the highWaterMark option playing in streams?

A

It regulates the amount of data buffered in a stream. It is passed into the stream constructor, and is measured in total number of bytes for regular (non-object mode) streams and in maximum number of stored objects for object-mode streams.

14
Q

Explain the mechanism of buffering data in Readable streams.

A

Data is buffered in a Readable stream when the implementation calls stream.push(chunk); if the consumer of the Stream does not call stream.read( ), the data will sit in the internal queue until consumed.

15
Q

Explain how a Readable stream is actually working in non-flowing (default) mode.

A
  1. The data is pushed into the stream’s internal buffer by calling stream.push();
    ex: const { Readable } = require('readable-stream');
    const inStream = new Readable({ read() {} });
    inStream.push('bla, bla, bla');
    inStream.push(null); // no more data
  2. Consume the data from the Readable stream using only the 'readable' event and the read() function:
    let chunk;
    inStream.on('readable', () => {
      while ((chunk = inStream.read()) !== null) {
        console.log('chunk read:', chunk.toString()); // consume data
      }
    });
  3. Detect the closing of the stream:
    inStream.on('end', () => {
      console.log('stream ended');
    });
16
Q

What is the interaction between the “readable” and the “data” events in a Readable stream?

A

The “readable” event takes precedence (that is why non-flowing mode is the default) in controlling the flow, so the “data” event will only be emitted when stream.read() is called; in that case, the readableFlowing property becomes false.
On the other hand, if there are any “data” listeners when the “readable” event listener is removed, the stream will start flowing.

17
Q

Explain how a Readable stream is actually working in flowing mode.

A

If the readable stream is in flowing mode (non-default) then:
1. The data is pushed into the stream’s internal buffer by calling stream.push();
ex: const { Readable } = require('readable-stream');
const inStream = new Readable({ read() {} });
inStream.push('bla, bla, bla');
inStream.push(null); // no more data
2. Consume the data from the Readable stream by attaching a listener to the “data” event:
inStream.on('data', (chunk) => {
  console.log('new data available:', chunk.toString());
});
3. Detect the closing of the stream:
inStream.on('end', () => {
  console.log('stream ended');
});

18
Q

What is the recommended way to treat an error that occurs during the process of reading data from a readable stream in flowing mode? Is there any difference compared to the non-flowing mode?

A
It is recommended that errors occurring during the processing of the read() method are emitted using the 'error' event rather than being thrown (throwing will likely result in unexpected behaviour, depending on whether the stream is in flowing or non-flowing mode);
for ex:
const { Readable } = require('readable-stream');
const myStream = new Readable({
  read(size) {
    if (checkSomeErrorCondition()) { // hypothetical error check
      const err = new Error('read failed');
      process.nextTick(() => this.emit('error', err));
      return;
    }
    // do the regular, normal work in here;
  }
});
19
Q

Is it possible for the “readable” event to be triggered several times as data becomes available?

A

Yes

20
Q

Is it possible to have multiple “data” events per chunk when using a Readable stream?

A

No, only 1 “data” event per chunk

21
Q

How do you put a readable stream in flowing mode?

A

You enable flowing mode:

  1. by attaching a listener to the ‘data’ event;
  2. or by explicitly invoking the resume( ) method;
  3. or by piping into a writable stream by calling the stream.pipe( ) method;
22
Q

How do you temporarily stop a readable stream in flowing mode?

A

By using the pause ( ) method;

23
Q

What happens if you remove the “data” listener from a Readable stream?

A

The stream will go back to the default non-flowing mode, but that will not automatically pause the stream;

24
Q

When is the “data” event emitted from a Readable stream?

A

Whenever the stream is relinquishing ownership of a chunk of data to a consumer. That means whenever:

  1. the stream is switched into flowing mode by calling readable.pipe(), readable.resume(), or by attaching a listener callback to the ‘data’ event;
  2. readable.read() is called and a chunk of data is available to be returned
25
Q

Is the stream.read( ) method needed to be called in flowing mode?

A

No, because the data is read from the underlying system automatically and provided to an application as quickly as possible using events via the EventEmitter interface;

26
Q

How can you switch back a readable stream from flowing into paused mode?

A

You can switch it back to non-flowing mode by:

  1. calling the stream.pause( ), if there are no pipe destinations;
  2. If there are pipe destinations, then by removing all pipe destinations you will put it back into non-flowing mode;
27
Q

How can you switch back a readable stream from flowing into paused mode?

A

You can switch it back to non-flowing mode by:

  1. adding a “readable” event handler;
  2. calling the stream.pause( ), if there are no pipe destinations;
  3. If there are pipe destinations, then by removing all pipe destinations you will put it back into non-flowing mode;
28
Q

Can you be sure that calling the stream.pause() method actually pauses the stream?

A

No; for example, if you have piped destinations, calling stream.pause() will NOT guarantee that the stream remains paused once those destinations drain and ask for more data;

29
Q

What are the three internal states of a readable stream?

A

At any point in time, a readable stream will be in one of the following internal states:

  1. readable.readableFlowing === null;
  2. readable.readableFlowing === false;
  3. readable.readableFlowing === true;
30
Q

What is the meaning of the readable.readableFlowing === null internal state?

A

No mechanism for consuming the stream’s data is provided, so the stream will not generate data. In this state, attaching a “data” event listener, calling the readable.pipe() method or calling the readable.resume() method will switch readable.readableFlowing to true, causing the stream to begin actively emitting events as data is generated;

31
Q

What is the meaning of the readable.readableFlowing === false internal state?

A

Calling readable.pause() or readable.unpipe(), or receiving backpressure, will set readable.readableFlowing to false, temporarily halting the flowing of events but NOT halting the generation of data. In this state, attaching a “data” event listener will NOT switch readable.readableFlowing back to true;

32
Q

What are the multiple ways of consuming data from a readable stream?

A
  1. By using .on(‘data’);
  2. By using .on(‘readable’);
  3. By using pipe();
  4. By using async iterators;

pipe() is the recommended method; it is also NOT recommended to mix the styles above for a single stream;

33
Q

Does the “data” event listener have flow control and if yes, how do you use it - if no, what can be done?

A

No, there is no flow control for the “data” event of a readable stream; a solution for implementing flow control is to use the “from2” npm package

34
Q

How are data chunks passed to the listener callback in a readable stream?

A

They are passed as strings if a default encoding has been specified for the stream using the readable.setEncoding() method; otherwise the data is passed as a Buffer

35
Q

How can you force a readable stream to emit an “end” event?

A

The “end” event will NOT be emitted until all the data has been completely consumed. That is done by switching the stream into flowing mode, or by calling stream.read() repeatedly until all the data has been consumed;

36
Q

What happens once the end of the stream is reached, in what concerns the “readable” and “end” events?

A

The ‘readable’ event is also emitted once at the end of the stream, but before the ‘end’ event is emitted.

37
Q

When is the stream.read() going to return null?

A

At the end of the stream

38
Q

What is better in terms of readable stream performance: reading the data with “readable” or using the “data” event?

A

Using “readable” is better as you get higher throughput.

39
Q

How can you make a readable stream to emit data intermittently with 1 second breaks?

A

By using the pause() and resume() methods:
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data`);
  readable.pause();
  console.log('There will be no additional data for 1 second');
  setTimeout(() => {
    console.log('Now data will start flowing again');
    readable.resume();
  }, 1000);
});

40
Q

When does the readable.pause() method have no effect?

A

If there is a “readable” event listener attached

41
Q

What does the stream.pipe( ) method return in a readable stream?

A

It returns the destination stream; the calling format is:

readable.pipe(destination[, options]);

42
Q

Is it possible to attach multiple writable streams to a single Readable stream by using pipe ( ) ?

A

Yes, for ex:
const fs = require('fs');
const zlib = require('zlib');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);

43
Q

What is the meaning of the following:

reader.pipe(writer, {end:false})

A

By default, stream.end() is called on the destination Writable stream when the source Readable stream emits ‘end’, so the destination is no longer writable; to prevent this behavior, the “end” option can be passed as false, causing the destination stream to remain open

44
Q

What happens to a Writable stream piped to a Readable stream, if the Readable stream emits an error during processing?

A

The Writable destination is not closed automatically, so it will be necessary to manually close each stream to prevent memory leaks;

45
Q

How can you close process.stderr and process.stdout?

A

You cannot; they are never closed until the Node process exits.

46
Q

The readable.read([size]) signature has an optional “size” param. How does it work?

A

It specifies the number of bytes to read from the internal buffer. If fewer than “size” bytes are available, null is returned, unless the stream has ended, in which case all the remaining data is returned.
If “size” is NOT specified when read() is called, the whole amount of data contained in the internal buffer is returned.

47
Q

What is a Readable stream going to return in object mode if readable.read(1000) is called?

A

In object mode, the read( ) will return a single object, irrespective of the “size” (1000 in this case) argument.

48
Q

What is the connection between the readable.read() method and the “data” event?

A

If the readable.read( ) method returns a chunk of data, a “data” event will also be emitted.

49
Q

What kind of error will be thrown if you use a stream.read([size]) after the ‘end’ event has been emitted?

A

It will return null, so no error will be thrown.

50
Q

How can you check that the highWaterMark is not exceeded on a regular basis?

A

You have to use the readable.readableLength property

51
Q

How can you flush a readable stream?

A

By using the resume() method:
getReadableStreamSomehow()
  .resume()
  .on('end', () => { console.log('stream flushed'); });

52
Q

What is the effect of setEncoding( ) method applied to a readable stream?

A
It makes the stream return strings instead of Buffer objects:
const assert = require('assert');
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
  assert.equal(typeof chunk, 'string'); // passes
});
53
Q

How do you iterate over a readable stream?

A

By using for await...of:

async function print(readable) {
  readable.setEncoding('utf8');
  let data = '';
  for await (const k of readable) {
    data += k;
  }
  console.log(data);
}
// now use the print() defined above:
const fs = require('fs');
print(fs.createReadStream('file')).catch(console.log);
54
Q

What happens if the loop is terminated with a “break” or a “throw” when you use an AsyncIterator to parse a readable stream?

A

The stream will be destroyed.

55
Q

Can you partially iterate over a readable stream?

A

No; once you start iterating, you will iterate the whole readable stream. The stream is read in chunks whose size is given by the highWaterMark option.

56
Q

Name some writable streams

A
  1. HTTP requests, on the client;
  2. HTTP response, on the server;
  3. fs write streams;
  4. zlib streams;
  5. crypto streams;
  6. TCP sockets;
  7. child process stdin;
  8. process.stdout, process.stderr;
57
Q

How do you write info into a writable stream?

A

myStream.write('Some data');

58
Q

What is the typical usage pattern for a writable stream?

A

const myStream = getWritableStreamFromSomewhere();
myStream.write('some data');
myStream.write('more data');
myStream.end('done writing data');

59
Q

In the writable.write(chunk[, encoding][,callback]) signature, in object mode, what is the restriction on the “chunk” data?

A

It can be any JavaScript value other than null

60
Q

In the writable.write(chunk[, encoding][,callback]) signature, when is the callback actually called? What happens in the case of an error when writing data?

A

When the data has been flushed (fully handled);
in case of an error, the callback may or may not be called with the error as its first argument.

61
Q

What can you do to reliably detect errors in a writable stream?

A

Add a listener to the ‘error’ event.

62
Q

What is the return value from writable.write( ) method?

A

It returns true if the internal buffer is below the highWaterMark; otherwise it returns false, in which case attempts to write data should stop until the ‘drain’ event is emitted

63
Q

What happens to the calls to write( ) method if a writable stream is not draining?

A

The stream will buffer the chunk and the writable.write( ) will return false

64
Q

What is the definition of “drained package” in a writable stream case?

A

It means: “accepted for delivery by the operating system”

65
Q

When is the “drain” event emitted in the case of a writable stream?

A

Once ALL the currently buffered packets are drained, the stream will emit a “drain” event

66
Q

What happens if you continue writing into a writable stream that is not draining?

A

Node.js will buffer all the written chunks in memory until maximum memory usage is reached, at which point the process will abort unconditionally

67
Q

What is the danger of not implementing flow control when writing into a writable stream?

A

By storing the chunks in memory, Node.js will eventually hit maximum memory usage and abort. There is also the danger that TCP sockets may never drain (if the remote peer does not read the data), so the missing flow control can be exploited remotely as a vulnerability.

68
Q

Give an example on how to implement flow control in a writable stream.

A
function writeWithFlowControl(stream, data, cb) {
  if (!stream.write(data)) {
    stream.once('drain', cb);
  } else {
    process.nextTick(cb);
  }
}
const wStream = getWritableStreamFromSomewhere();
writeWithFlowControl(wStream, 'bla bla bla', () => {
  console.log('Write completed, do some more work now');
});
69
Q

What happens if I set the “encoding” option in a writable stream that is in object mode?

A

A writable stream in object mode will always ignore the “encoding” argument

70
Q

What is the effect of specifying a callback in the writable.end() method?

A

It is equivalent to registering a listener for the “finish” event

71
Q

When is the ‘finish’ event triggered on a writable stream?

A

The “finish” event is triggered after the stream.end( ) method has been called, and all data has been flushed to the underlying system

72
Q

When is the ‘close’ event emitted?

A

When the stream and any of its underlying resources (e.g. file descriptors) have been closed; the event indicates that no more events will be emitted and no further computation is done

73
Q

What do you have to do to ensure a writable stream will ALWAYS call the “close” event?

A

You have to create the writable stream with the “emitClose” option.

74
Q

In which cases are writable streams throwing errors?

A

When an error occurs while writing OR piping data

75
Q

When is the ‘pipe’ event raised on a writable stream?

A

When stream.pipe() is called on a readable stream, adding the writable stream to its set of destinations:

const writer = getWritableStreamFromSomewhere();
const reader = getReadableStreamFromSomewhere();
writer.on('pipe', (src) => {
  console.log('something is piping into the writer');
});
reader.pipe(writer);

76
Q

When is the ‘unpipe’ event raised on a writable stream?

A

In two cases:

a) when the stream.unpipe( ) method has been called on the connected readable stream;
b) when the writable stream emits an error when the readable stream is piping into it;

77
Q

What is the effect of writable.cork( ) method on a writable stream?

A

It forces all written data to be buffered in memory. The buffered data will be flushed when either the stream.uncork() method or the stream.end() method is called

78
Q

What happens if you write into a writable stream after it’s been destroyed?

A

After myWritableStream.destroy() has been called, calling write() or end() will result in an ERR_STREAM_DESTROYED error

79
Q

What is the difference between end( ) and destroy( ) methods in what concerns the writable streams?

A

destroy() will immediately destroy the stream; if you want to flush the data first, call end()

80
Q

What does the end( ) method for a writable stream return?

A

It returns a reference to the stream itself (this), so calls can be chained.

81
Q

What is the recommendation when you call uncork( ) method on a writable stream?

A

It is recommended that you defer the calls to writable.uncork() by using the process.nextTick( ); the reason is that by doing so, you will allow batching all the writable.write( ) calls that occur within a given Node.js event loop phase.

82
Q

What is the dependency between writable.cork( ) and writable.uncork( ) for a writable stream?

A
If writable.cork() has been called multiple times, then you have to call uncork() the same number of times to flush the data:
stream.cork();
stream.write('some');
stream.cork();
stream.write('data');
process.nextTick(() => {
  stream.uncork();
  // the data is still not flushed here
  stream.uncork();
  // only now is the data flushed;
});
83
Q

How do you get the number of bytes/objects in the queue of a writable stream, ready to be written?

A

by reading the writable.writableLength property

84
Q

How do you create a writable stream?

A
const { Writable } = require('stream');
const outStream = new Writable({
  write(chunk, encoding, cb) {
    console.log(chunk.toString());
    cb();
  }
});
process.stdin.pipe(outStream);
85
Q

What are Duplex streams?

A

They are streams that are both readable and writable

86
Q

Give examples of duplex streams:

A

TCP sockets, zlib streams, crypto streams

87
Q

Are the Transform streams also Duplex streams?

A

Yes; they are both readable and writable as well

88
Q

Is TCP socket a transform stream?

A

No; a TCP socket is a Duplex stream, but not a Transform stream.

89
Q

Give examples of Transform streams.

A

zlib, crypto

90
Q

How do you create a Duplex stream?

A
const { Duplex } = require('stream');
const inOutStream = new Duplex({
  write(chunk, encoding, cb) {
    console.log(chunk.toString());
    cb();
  },
  read(size) {
    this.push('bla, bla bla');
    this.push(null);
  }
});
process.stdin.pipe(inOutStream).pipe(process.stdout);
91
Q

What npm lib is recommended to use for transform streams in object mode?

A

through2

const through2 = require('through2');
const xyz = through2.obj(({ x, y }, enc, cb) => {
  cb(null, { z: x + y });
});
xyz.pipe(process.stdout);
92
Q

How do you create a Transform stream?

A
const {Transform} = require('stream');
const upperCaseTrStream = new Transform( {
   transform(chunk, enc, cb) {
      this.push(chunk.toString( ).toUpperCase());
      cb();
   }
})
process.stdin.pipe(upperCaseTrStream).pipe(process.stdout);
93
Q

What is the role of the optional param “end” in the pipe() method signature?
readableStream.pipe(writableStream, [end]);

A

It is an object with a boolean “end” property. If true (the default), the destination stream is closed when the source stream emits the “end” event.

94
Q

How do the standard streams (process.stdin, process.stdout and process.stderr) behave when they are associated with a file or a terminal?

A

They behave synchronously (so they block the program)

95
Q

What is the behaviour of the standard streams when piped?

A

They behave asynchronously

96
Q

How do you call read( ) or write( ) when you work with piped streams?

A

You don’t, as there is no need

97
Q

What is the mechanism of the streams backpressure?

A

It is automatic; you cannot modify it

98
Q

How do you create a read stream from a file?

A
const fs = require('fs');
const stream = fs.createReadStream(__dirname + '/foo.txt');
99
Q

When is the “open” event triggered for a file stream?

A

When the file is successfully opened.

100
Q

How do you create a writable file stream?

A
const fs = require('fs');
const writableStream = fs.createWriteStream(__dirname + '/bar.txt');
101
Q

What happens to the error events in a pipeline?

A

They are NOT propagated from one stream to the next

102
Q

Having:
stream1.pipe(stream2).on('error', () => …);
what happens if there is an error in stream1?
Also, what happens if there is an error in stream2?

A

If there is an error in stream1, the construct above will NOT catch the error.
If there is an error in stream2, then stream2 will get automatically unpiped from stream1

103
Q

What is safer to use than pipe( ) in practice?

A
Use pump(stream1, stream2, ..., done) from the “pump” npm package;
with plain pipe(), an error in any of the intermediate streams can leave streams open, potentially causing memory leaks, so it is very dangerous.
104
Q

What is the recommended package to use when create streams?

A

Instead of the standard “stream” package, use “readable-stream”. This package implements the Streams2 interface. It can be used as:
const { Readable, Writable, Transform, Duplex, pipeline, finished } = require('readable-stream');

105
Q

What are the advantages of using Stream2?

A
  1. The state management is entirely automated now;
106
Q

What is the “through2” npm package?

A

It is a tiny abstraction around Node’s core stream.Transform