dstructures Flashcards
js: a blob is
the binary contents of an entire file held in RAM. Like a file stored fully in memory.
cs: a binary is
a sequence of 0s and 1s that makes up any file
cs: a bit is
a single binary digit, either 0 or 1
cs: a byte is
a group of 8 bits (each either 0 or 1)
cs: All digital media (text, pictures, videos, etc) is
stored as bits at the lowest level
cs: bit stands for
binary digit
https://medium.freecodecamp.org/do-you-want-a-better-understanding-of-buffer-in-node-js-check-this-out-2e29de2968e8
cs: To save a number, a computer must first
convert it to its binary representation, which is the same number written in base-2. So, 3 becomes 11.
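A quick way to see the base-2 conversion in JavaScript (just an illustration):
(3).toString(2)   // '11'
parseInt('11', 2) // 3, back from base-2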
cs: To save a letter, a computer must first
convert it to a number (using the character-to-number dictionary called a character set, usually Unicode), and then convert that number into binary, aka base-2.
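A minimal sketch of that two-step conversion in JavaScript:
'A'.codePointAt(0)              // 65, the Unicode number for 'A'
'A'.codePointAt(0).toString(2)  // '1000001', that number in base-2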
cs: To know which letter corresponds with which number, a computer must check the
character-to-number dictionary, aka the character set, usually Unicode
cs: The most common character set is
Unicode
cs: A character set is
a character-to-number dictionary
cs: A character encoding is a
set of rules about how to format the binary you created from a letter
cs: The most popular character encoding is
UTF-8
cs: The UTF-8 rules about converting a character to a byte force you to
save into whole bytes (8 digits each). When the letter's number's base-2 version has fewer than 8 digits, you must add 0s to the beginning of the byte.
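A sketch of that padding for an ASCII character (which UTF-8 stores in a single byte):
const codePoint = 'a'.codePointAt(0)    // 97
codePoint.toString(2)                   // '1100001' (only 7 digits)
codePoint.toString(2).padStart(8, '0')  // '01100001' (padded to a full byte)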
cs: Chunks are kinda big
a large text file can have just 2 chunks.
js: During a stream, node automatically creates
an internal buffer
cs: The three main types of stream are
readable, writable, and duplex (transform streams are a special kind of duplex)
cs: Some common uses of streams are
reading from and writing to disk, sending a response from a server to a client, console.log()
cs: In http streams, usually
an internal buffer in RAM gets filled with bytes; when the internal buffer is full it sends a chunk to the client and asks for new data (sketched after the links below).
https://www.youtube.com/watch?v=GpGTYp_G9VE
https://www.youtube.com/watch?v=YpVDaVufDVU
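A minimal sketch of that pattern using Node's built-in http and fs modules (the file path is just a placeholder):
const http = require('http')
const fs = require('fs')

http.createServer((req, res) => {
  // the read stream fills its internal buffer and hands res one chunk at a time
  fs.createReadStream('./big-file.txt').pipe(res)
}).listen(3000)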
cs: streams emit
events, like data and end
node: To write a script that echoes back whatever you type into the console, type
process.stdin.pipe(process.stdout)
process.stdout is a writable stream; .pipe() sends every chunk that process.stdin reads straight into it
cs: To start a stream’s transfer of bits, type
myStream.read()
cs: A buffer is
an object that holds raw binary data
held in RAM
fills with data up to its allocated limit
https://nodejs.org/api/stream.html#stream_buffering
https://hackernoon.com/https-medium-com-amanhimself-converting-a-buffer-to-json-and-utf8-strings-in-nodejs-2150b1e3de57
https://allenkim67.github.io/programming/2016/05/17/nodejs-buffer-tutorial.html
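A couple of ways to get a Buffer in Node (a small sketch):
const buf = Buffer.alloc(4)      // reserve 4 bytes of RAM, filled with zeros
buf.write('hi')                  // fill it with data, up to the allocated limit
const buf2 = Buffer.from('hey')  // allocate and fill from a string in one step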
js: A stream is basically an
object that spits out bytes of data continuously into an internal buffer, which fills up to its byte limit, then sends all that data as a chunk to a stream receiver and asks for the next data.
The stream runs .emit('data', payload) whenever it wants to send data, and then your event handler runs with that data payload as a param.
https://medium.freecodecamp.org/node-js-streams-everything-you-need-to-know-c9141306be93
To make a stream stop sending data, call myStream.pause() (and myStream.resume() to start the flow again)
https://youtu.be/lQAV3bPOYHo?t=168
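A small sketch of pausing inside a data handler (myStream is any readable stream, doSomethingSlow is a made-up placeholder):
myStream.on('data', (chunk) => {
  myStream.pause()           // stop the flow while we work on this chunk
  doSomethingSlow(chunk, () => {
    myStream.resume()        // ask for more data when we're ready
  })
})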
cs: in <Buffer 02 04 06 08 0a 0c 0e 10> each number represents
a byte translated into hexadecimal
cs: the size of a buffer is a count of its
bytes
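For example (the output values come from running this sketch):
const buf = Buffer.from([2, 4, 6, 8, 10, 12, 14, 16])
console.log(buf)        // <Buffer 02 04 06 08 0a 0c 0e 10>
console.log(buf.length) // 8, one per byte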
cs: To concatenate many blobs together, type
add them all to an array
let myBlobs = [Blob1, Blob2, Blob3]
and then let fullBlob = new Blob(myBlobs, {type: 'video/webm'})
Set whatever type you want
cs: setting a blob's type to application/octet-stream means
it is arbitrary binary data
https://stackoverflow.com/questions/20508788/do-i-need-content-type-application-octet-stream-for-file-download
cs: To end a running stream, you
push null into the stream
this.push(null)
https://gist.github.com/joyrexus/10026630
cs: The end event from readable streams is important because it
tells the consumer no more data is coming, so an internal buffer that never filled up still sends its remaining data as a final chunk.
exp: In express, the response object is a stream, so
you can pipe data to it, and it will send all the chunks.
file.pipe(res)
or
res.write('string')
https://medium.com/@daspinola/video-stream-with-node-js-and-html5-320b3191a6b6
https://stackoverflow.com/questions/38788721/how-do-i-stream-response-in-express
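A minimal express route showing both options (the route and file path are placeholders):
const express = require('express')
const fs = require('fs')
const app = express()

app.get('/video', (req, res) => {
  // res is a writable stream, so the file is sent chunk by chunk
  fs.createReadStream('./movie.webm').pipe(res)
})

app.listen(3000)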
js: when you .push() to a readable stream, but the consumer is not ready
the data gets buffered
https://github.com/substack/stream-handbook
js: To write a readable stream, type
var Readable = require('stream').Readable;
var myStream = new Readable({ read() {} }); // a no-op read() keeps newer Node happy when you push manually
myStream.push('beep ');
myStream.push('boop\n');
myStream.push(null);
myStream.pipe(process.stdout);
js: To only start pushing data onto a stream after the consumer calls read, type
var Readable = require('stream').Readable;
var myStream = Readable();
myStream._read = function () {
  myStream.push('beep ');
  myStream.push('boop\n');
  myStream.push(null);
};
myStream.pipe(process.stdout);
cs: Converting a file to base-64 means
encoding its bytes as text that uses only 64 safe characters (A-Z, a-z, 0-9, +, /), so binary data can travel through text-only channels (sketched after the links below).
https://www.youtube.com/watch?v=8qkxeZmKmOY
https://www.youtube.com/watch?v=eUjVcUiNbD4
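In Node the Buffer API can do the encoding (a sketch):
const base64 = Buffer.from('hello').toString('base64')  // 'aGVsbG8='
Buffer.from(base64, 'base64').toString('utf8')          // 'hello' again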
cs: The stream's data event gets triggered when
its buffer gets full and sends a chunk
myStream.on('data', (chunk) => {
  console.log('chunk sent')
})
js: To convert a Buffer to json, type
JSON.stringify(bufferName);
=> '{"type":"Buffer","data":[12,34,45…]}'
https://hackernoon.com/https-medium-com-amanhimself-converting-a-buffer-to-json-and-utf8-strings-in-nodejs-2150b1e3de57
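Going back the other way (a sketch; 104 and 105 are just the bytes for 'hi'):
const json = JSON.stringify(Buffer.from('hi'))  // '{"type":"Buffer","data":[104,105]}'
const parsed = JSON.parse(json)
const buf = Buffer.from(parsed.data)            // back to a Buffer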
js: To convert a Node Buffer object to a JavaScript ArrayBuffer object, see
https://stackoverflow.com/questions/8609289/convert-a-binary-nodejs-buffer-to-javascript-arraybuffer
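One common approach (a sketch based on that thread): slice the Buffer's underlying ArrayBuffer down to just the bytes the Buffer actually views.
const buf = Buffer.from('hello')
const arrayBuffer = buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength)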
Node: Once a buffer's size is set
it cannot be changed
js: A buffer is a lot like a
blob
js: myReadableStream.pipe(myWritableStream) is basically doing
myReadableStream.on('data', (chunk) => { myWritableStream.write(chunk); });
myReadableStream.on('end', () => { myWritableStream.end(); });
js: To create an event emitter, type
Note:
Event emitters need chainable .on() functions
https://www.youtube.com/watch?v=X-AhceP6jpA (apparently this guy made mistakes. Maybe use
https://gist.github.com/mudge/5830382, spinmutoli's design, instead)
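A minimal hand-rolled sketch with a chainable .on() (not spinmutoli's exact design; in real code you would usually just require('events')):
class MyEmitter {
  constructor () { this.handlers = {} }
  on (event, fn) {
    (this.handlers[event] = this.handlers[event] || []).push(fn)
    return this // returning this is what makes .on() chainable
  }
  emit (event, payload) {
    (this.handlers[event] || []).forEach((fn) => fn(payload))
    return this
  }
}

const emitter = new MyEmitter()
emitter.on('data', (chunk) => console.log('got', chunk))
       .on('end', () => console.log('done'))
emitter.emit('data', 'hello').emit('end')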
js: To convert a readable stream into a blob, type
let chunks = []
myStream.on('data', function (chunk) { chunks.push(chunk) })
  .on('end', function () {
    let blob = new Blob(chunks, { type: 'myType' })
  })
js: To convert a function to a promise and rely on its return for another function, type
// bad
const slowFunc = () => { setTimeout(() => { return 'string' }, 5000) }
const reliesOnSlowFunc = () => { console.log(slowFunc() + 'string') }
reliesOnSlowFunc()
> undefinedstring

// good
const slowFunc = () => {
  return new Promise((resolve, reject) => {
    setTimeout(() => resolve('string'), 5000)
  })
}
const reliesOnSlowFunc = () => {
  slowFunc().then((res) => console.log(res + 'string'))
}
reliesOnSlowFunc()
> stringstring // after 5 seconds
Note: The last function in the chain is always a side-effect function. If you rely on the return value, it must come to you through a promise.
I need to learn this: Promise.resolve(path.join(path1, path2)).then(function (path) { /* use the result here */ });
https://stackoverflow.com/questions/36826592/what-is-the-best-way-to-wrap-synchronous-functions-in-to-a-promise