Node.js Flashcards
Refresh Node.js knowledge
How many different stream classes are there in Node.js?
Four: Readable, Writable, Transform and Duplex.
Which of the four stream types in Node have a “pipe” method?
Readable, Transform and Duplex (so all readable streams)
What does the “pipe” method in a stream return?
It returns the destination stream, to allow multiple calls to be chained together
What kind of mechanism does the “pipe” method support in a stream? What does it do?
It supports the backpressure. If the consumer is unable to consume data as fast as the producer, then the producer will be paused until the consumer catches up.
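For ex., a minimal sketch of that behaviour (the fast Readable and the artificially slow Writable below are made up purely for illustration):
const { Readable, Writable } = require('stream');
const fast = new Readable({
  read() { this.push('some data'); } // produces data as fast as it is asked for
});
const slow = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(callback, 1000); // slow consumer: pipe() pauses "fast" until callback fires
  }
});
fast.pipe(slow); // backpressure is handled automatically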
Are the errors forwarded upstream with the pipe mechanism in streams?
Not by default, because it would be difficult to know exactly where the error happened.
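If you do want a single place to handle errors from a whole chain, stream.pipeline() (available since Node 10) forwards errors from any stream in the chain to one callback; a minimal sketch (file names are just placeholders):
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');
pipeline(
  fs.createReadStream('file.txt'),
  zlib.createGzip(),
  fs.createWriteStream('file.txt.gz'),
  (err) => {
    if (err) console.error('Pipeline failed', err);
    else console.log('Pipeline succeeded');
  }
);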
What kind of stream is the “request” object?
It is a readable stream.
Can you interrupt a “req” request?
Yes. On the client side you can abort it by calling req.abort() (req.destroy() in newer Node versions); the request then emits an 'abort' event, which you can listen for with req.on('abort', …);
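For ex., a minimal client-side sketch (the URL and the timeout are arbitrary):
const http = require('http');
const req = http.get('http://example.com', (res) => {
  res.resume(); // drain the response
});
req.on('abort', () => console.log('request was aborted'));
setTimeout(() => req.abort(), 100); // interrupt the request (req.destroy() in newer Node versions)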
What is the difference between flowing and non-flowing mode of a readable stream and which one of them is the default mode?
Non-flowing (paused) mode is the default: data is explicitly pulled from the stream, on demand; once the currently buffered data has been read, you wait for another 'readable' event. In flowing mode, by contrast, data is read from the underlying source automatically and delivered to the consumer via 'data' events;
ex:
process.stdin
  .on('readable', function () {
    console.log('new data available');
    let chunk;
    while ((chunk = process.stdin.read()) !== null) {
      console.log('chunk read', chunk.toString());
    }
  })
  .on('end', function () {
    process.stdout.write('End of stream');
  });
What kind of data is contained in a Buffer?
Raw binary data (bytes); the Buffer class is a subclass of Uint8Array.
What kind of data do streams in Node.js operate with?
strings, Buffer and Uint8Array for streams operating in binary mode, and arbitrary JavaScript objects for streams operating in object mode;
When can a stream switch between non-object and object mode?
Only when the stream is created. It is not safe to switch an existing stream into object mode.
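For ex., a minimal sketch of a stream created in object mode (the pushed objects are arbitrary):
const { Readable } = require('stream');
const objStream = new Readable({
  objectMode: true, // must be set at creation time
  read() {}
});
objStream.push({ id: 1, name: 'first' }); // whole objects instead of bytes
objStream.push(null); // no more data
objStream.on('data', (obj) => console.log(obj.name));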
How can you retrieve the internal Buffer from a readable or a writable stream?
By using the writable.writableBuffer or readable.readableBuffer properties.
What kind of role is the highWaterMark option playing in streams?
It regulates the amount of data buffered in a stream. It is passed into the stream constructor and is measured in total number of bytes for regular (non-object mode) streams and in maximum number of stored objects for object-mode streams.
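For ex. (the values below are arbitrary):
const fs = require('fs');
const { Readable } = require('stream');
// binary-mode stream: highWaterMark is counted in bytes (here 16 KB)
const fileStream = fs.createReadStream('file.txt', { highWaterMark: 16 * 1024 });
// object-mode stream: highWaterMark is counted in objects (here max 4 buffered objects)
const objStream = new Readable({ objectMode: true, highWaterMark: 4, read() {} });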
Explain the mechanism of buffering data in Readable streams.
Data is buffered in a Readable stream when the implementation calls stream.push(chunk); if the consumer of the Stream does not call stream.read( ), the data will sit in the internal queue until consumed.
Explain how a Readable stream is actually working in non-flowing (default) mode.
- The data is pushed into the stream's internal buffer by calling stream.push();
ex: const { Readable } = require('readable-stream');
const inStream = new Readable({ read() {} });
inStream.push('bla, bla, bla');
inStream.push(null); // no more data
- Consume the data from the Readable stream by using only the 'readable' event and the read() function:
inStream.on('readable', () => {
  let chunk;
  while ((chunk = inStream.read()) !== null) {
    console.log(chunk.toString()); // consume data
  }
});
- Detect the closing of the stream:
inStream.on('end', () => {
  console.log('stream ended');
});
What is the interaction between the “readable” and the “data” events in a Readable stream?
The 'readable' event takes precedence in controlling the flow (that is why non-flowing mode is the default), so the 'data' event will only be emitted when stream.read() is called; in that case, the 'readableFlowing' property becomes false.
On the other hand, if there are any 'data' listeners when the 'readable' event listener is removed, the stream will start flowing.
Explain how a Readable stream is actually working in flowing mode.
If the readable stream is in flowing mode (non-default) then:
1. The data is pushed into the stream’s internal buffer by calling stream.push( );
ex: const { Readable } = require('readable-stream');
const inStream = new Readable({ read() {} });
inStream.push('bla, bla, bla');
inStream.push(null); // no more data
2. Consume the data from the Readable stream by attaching a listener to the 'data' event:
inStream.on('data', (chunk) => {
  console.log('new data available', chunk.toString());
});
3. Detect the closing of the stream:
inStream.on('end', () => {
  console.log('stream ended');
});
What is the recommended way to treat an error that occurs during the process of reading data from a readable stream in flowing mode? Is there any difference compared to the non-flowing mode?
It is recommended that errors occurring during the processing of the read() method are emitted using the 'error' event rather than being thrown (throwing an error can result in unexpected behaviour, depending on whether the stream is in flowing or non-flowing mode); the recommendation is the same in both modes; for ex:
const { Readable } = require('readable-stream');
const myStream = new Readable({
  read(size) {
    if (checkSomeErrorCondition()) {
      process.nextTick(() => this.emit('error', err)); // err being the detected error
      return;
    }
    // do the regular, normal work in here
  }
});
Is it possible for the 'readable' event to be triggered several times as data becomes available?
Yes
Is it possible to have multiple “data” events per chunk when using a Readable stream?
No, only 1 “data” event per chunk
How do you put a readable stream in flowing mode?
You enable flowing mode:
- by attaching a listener to the ‘data’ event;
- or by explicitly invoking the resume( ) method;
- or by piping into a writable stream by calling the stream.pipe( ) method;
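For ex., calling resume() alone switches the stream into flowing mode and simply discards the data, since no 'data' listener is attached (getReadableStreamSomehow() is a placeholder, as in the Node docs):
const readable = getReadableStreamSomehow();
readable.resume();
readable.on('end', () => {
  console.log('Reached the end, but did not read anything.');
});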
How do you temporarily stop a readable stream in flowing mode?
By using the pause ( ) method;
What happens if you remove the “data” listener from a Readable stream?
The stream will go back to the default non-flowing mode, but that will not automatically pause the stream;
When is the “data” event emitted from a Readable stream?
Whenever the stream is relinquishing ownership of a chunk of data to a consumer. That means whenever:
- the stream is switched in flowing mode by calling readable.pipe(), readable.resume(), or by attaching a listener callback to the ‘data’ event;
- the readable.read() is called and a chunk of data is available to be returned
Is the stream.read( ) method needed to be called in flowing mode?
No, because the data is read from the underlying system automatically and provided to an application as quickly as possible using events via the EventEmitter interface;
How can you switch back a readable stream from flowing into paused mode?
You can switch it back to non-flowing (paused) mode by:
- adding a 'readable' event handler;
- calling stream.pause(), if there are no pipe destinations;
- removing all pipe destinations, if there are any;
Can you be sure that, by calling the stream.pause() method, the stream is actually paused?
No; for ex., if you have piped destinations, calling stream.pause() will NOT guarantee that the stream remains paused once those destinations drain and ask for more data;
What are the three internal states of a readable stream?
At any point in time, a readable stream will be in one of the following internal states:
- readable.readableFlowing === null;
- readable.readableFlowing === false;
- readable.readableFlowing === true;
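For ex., a minimal sketch of the transitions (using a PassThrough stream purely for illustration):
const { PassThrough } = require('stream');
const s = new PassThrough();
console.log(s.readableFlowing); // null  - no consuming mechanism attached yet
s.on('data', (chunk) => {});
console.log(s.readableFlowing); // true  - attaching a 'data' listener starts the flow
s.pause();
console.log(s.readableFlowing); // false - pause() temporarily halts the flow of events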
What is the meaning of the readable.readableFlowing === null internal state?
No mechanism for consuming the stream's data is provided, so the stream will not generate data. In this state, attaching a 'data' event listener, calling the readable.pipe() method or calling the readable.resume() method will switch readable.readableFlowing to true, causing the stream to begin actively emitting events as data is generated;
What is the meaning of the readable.readableFlowing === false internal state?
Calling readable.pause(), readable.unpipe() or receiving backpressure will switch readable.readableFlowing to false, temporarily halting the flowing of events but NOT halting the generation of data. In this state, attaching a 'data' event listener will NOT switch readable.readableFlowing back to true;
What are the multiple ways of consuming data from a readable stream?
- By using the .on(‘data’);
- By using the .on(‘readable’);
- By using the pipe( );
- By using async iterators;
The pipe() way is the recommended method; it is also NOT recommended to mix the styles above when consuming a single stream;
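For ex., consuming with an async iterator (available since Node 10; the file name is a placeholder):
const fs = require('fs');
async function printFile() {
  const readable = fs.createReadStream('file.txt', { encoding: 'utf8' });
  for await (const chunk of readable) { // chunks are pulled on demand, backpressure is handled for you
    console.log(chunk);
  }
}
printFile();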
Does the 'data' event listener have flow control and, if yes, how do you use it; if not, what can be done?
No, there is no flow control when consuming a readable stream through the 'data' event; one solution for implementing flow control is to use the 'from2' npm module;
How are the data chunks passed to the listener callback in a readable stream?
They are passed as strings if a default encoding has been specified for the stream using the readable.setEncoding() method; otherwise the data is passed as a Buffer;
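For ex. (getReadableStreamSomehow() is a placeholder, as in the Node docs):
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8'); // chunks will now be strings instead of Buffers
readable.on('data', (chunk) => {
  console.log(typeof chunk); // 'string'
});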
How can you force a readable stream to emit an “end” event?
The 'end' event will NOT be emitted until all the data has been completely consumed. You can force that by switching the stream into flowing mode, or by calling stream.read() repeatedly until all the data has been consumed;
What happens once the end of the stream is reached, in what concerns the “readable” and “end” events?
The 'readable' event will also be emitted once at the end of the stream, but before the 'end' event is emitted.
When is the stream.read() going to return null?
At the end of the stream
What is better in terms of readable stream performance: reading the data with the 'readable' event or using the 'data' event?
Using 'readable' is better, as you get higher throughput.
How can you make a readable stream to emit data intermittently with 1 second breaks?
By using the pause() and resume() methods:
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  readable.pause();
  console.log('There will be no additional data for 1 second.');
  setTimeout(() => {
    console.log('Now data will start flowing again.');
    readable.resume();
  }, 1000);
});
When does the readable.pause() method have no effect?
If there is a 'readable' event listener attached;
What does the stream.pipe( ) method return in a readable stream?
It returns the destination stream, where the calling format is:
readable.pipe(destination, [options]);
Is it possible to attach multiple writable streams to a single Readable stream by using pipe ( ) ?
Yes, for ex:
const fs = require('fs');
const zlib = require('zlib');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);