SWE Interview - Set 3 Flashcards
Multitasking in early computers
Multitasking in early computers meant running multiple tasks or processes through time-sharing: the system switched rapidly between tasks, simulating simultaneous processing on single-core systems.
Time Sharing
“Allows what?”
Time sharing allows multitasking and multithreading by allocating short time slices to multiple tasks, creating the appearance of concurrent processing even on a single processor.
short time slices
Time quanta (singular: time quantum)
Time Quantum in computing
It is the fixed or dynamic time interval allocated to a process for execution in a round-robin scheduling algorithm, enabling multitasking and fair CPU usage.
- Traditional round-robin time quantum is fixed
- Typically measured in milliseconds or microseconds.
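A minimal round-robin sketch in Python (a toy simulation with made-up task names and CPU-time totals, not a real scheduler) showing how a fixed quantum rotates the CPU among tasks:

```python
from collections import deque

def round_robin(tasks, quantum):
    """Simulate round-robin scheduling: each task gets at most
    `quantum` units of CPU time per turn, then goes to the back
    of the ready queue if it still has work left."""
    queue = deque(tasks.items())  # (name, remaining_time) pairs
    clock = 0
    while queue:
        name, remaining = queue.popleft()
        slice_used = min(quantum, remaining)
        clock += slice_used
        remaining -= slice_used
        if remaining > 0:
            queue.append((name, remaining))  # preempted, requeue
        else:
            print(f"t={clock}: {name} finished")

# Hypothetical workloads: total CPU time each task needs
round_robin({"A": 7, "B": 3, "C": 5}, quantum=2)
```

A smaller quantum improves responsiveness but increases context-switch overhead.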
Multithreading
Multithreading allows multiple threads to execute concurrently within a single program (process). It improves CPU and I/O utilization and enhances application responsiveness, especially on multi-core systems, because threads share resources such as memory.
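A short Python sketch of the idea (the I/O is simulated with time.sleep; the request names are made up): three threads in one process overlap their waits, so the total time is roughly one second instead of three.

```python
import threading
import time

def fetch(name, seconds):
    # Simulate a blocking I/O call (e.g., a network request)
    time.sleep(seconds)
    print(f"{name} done after {seconds}s")

start = time.perf_counter()
threads = [threading.Thread(target=fetch, args=(f"req-{i}", 1)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # wait for all threads in this process to finish
print(f"elapsed: {time.perf_counter() - start:.1f}s")  # ~1s, not 3s
```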
Multithreading (Single processor core)
On a single core, threads compete for the core's processing time through time-sharing, simulating concurrency.
Multiple threads can be created within a single process, though contention and slower execution are common.
Multithreading (Multiple processor cores)
With multiple cores, threads execute concurrently and independently on separate cores, achieving true parallelism, improving performance, and minimizing resource conflicts.
Thread
A thread is the smallest unit of execution in a computer program. It represents a sequence of instructions that can be scheduled independently by the operating system. Multiple threads can belong to the same process.
Process
A process is an instance of a computer program that is being executed. It consists of the program code and its current activity. Each process has its own memory space and resources.
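A Python sketch (using the standard multiprocessing module) illustrating that each process has its own memory space: the child's change to a global variable never reaches the parent.

```python
import multiprocessing

counter = 0  # lives in this process's own memory space

def increment():
    global counter
    counter += 1
    print(f"child sees counter = {counter}")

if __name__ == "__main__":
    p = multiprocessing.Process(target=increment)
    p.start()
    p.join()
    # The child incremented its own copy; the parent's memory is untouched.
    print(f"parent sees counter = {counter}")  # still 0
```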
What is concurrency in computing?
Concurrency in computing refers to the ability of a system to make progress on multiple tasks in overlapping time intervals, giving the appearance of simultaneous execution.
What is parallelism in computing, and how does it improve performance?
Parallelism is a computing concept where multiple tasks or processes are executed simultaneously, leveraging multiple processors or cores. It enhances performance by dividing work into smaller units and processing them independently and in parallel.
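A minimal Python sketch of that division of work (the CPU-bound function here is an arbitrary stand-in): multiprocessing.Pool spreads the chunks across separate processes so they can run truly in parallel on multiple cores.

```python
import multiprocessing

def cpu_heavy(n):
    # Arbitrary CPU-bound work: sum of squares up to n
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [10_000_000] * 4
    # Each input runs in its own process, so on a 4-core machine
    # the four calls can execute simultaneously.
    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(cpu_heavy, inputs)
    print(results)
```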
What defines heap memory in computing, and what are its primary characteristics?
Heap memory is a dynamically allocated region of a computer’s memory used by programs to store data during runtime. Its primary characteristics include dynamic allocation, storage of data structures like objects and arrays, the need for manual or automatic memory management, random access, and unpredictable allocation time compared to stack memory.
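A small Python illustration (in CPython, all objects live on a runtime-managed private heap, so this shows dynamic, runtime-sized allocation rather than an explicit heap/stack split):

```python
import sys

def build_buffer(n):
    # n is only known at runtime, so the list's storage must be
    # allocated dynamically (on CPython's runtime-managed heap).
    return [0] * n

buf = build_buffer(1_000_000)
print(sys.getsizeof(buf))  # size of the heap-allocated list object
del buf  # drop the last reference; the memory becomes reclaimable
```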
Garbage Collection
An automatic memory management technique used in programming languages. It detects and recovers memory that's no longer in use by the program, preventing memory leaks and optimizing system resources.
Unlike manual management (explicit allocation and deallocation) in C/C++, this technique minimizes errors by tracking accessible objects and freeing unreachable memory, typically using a "mark and sweep" process.
Describe the “mark and sweep” process
In the Mark Phase, the garbage collector identifies and labels reachable objects, beginning from root elements like global and local variables. These labeled objects are marked as “live.” In the Sweep Phase, the collector scans the entire memory, freeing up memory occupied by unmarked (garbage) objects, which are inaccessible and unnecessary.
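A toy mark-and-sweep pass in Python (a teaching sketch over a hypothetical object graph, not how production collectors are implemented):

```python
class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []      # outgoing references to other objects
        self.marked = False

def mark(obj):
    """Mark phase: depth-first walk from a root, labeling every
    reachable object as live."""
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def sweep(all_objects):
    """Sweep phase: scan every object; unmarked ones are garbage.
    Returns the survivors with marks cleared for the next cycle."""
    live = []
    for obj in all_objects:
        if obj.marked:
            obj.marked = False
            live.append(obj)
        else:
            print(f"collecting {obj.name}")
    return live

# Hypothetical graph: a -> b, while c is unreachable
a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)
heap = [a, b, c]
roots = [a]         # e.g., globals and stack variables

for root in roots:
    mark(root)
heap = sweep(heap)  # prints "collecting c"
```

Real collectors discover roots (registers, stacks, globals) automatically and often combine marking with generational or compacting strategies.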
What is asynchronous?
Asynchronous programming allows tasks to operate independently of the main program flow, enabling concurrent execution of multiple operations. A task can be initiated without waiting for it to complete before starting the next one. It optimizes performance by utilizing idle time and employs mechanisms like callbacks, promises, or async/await.
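A minimal asyncio sketch (I/O simulated with asyncio.sleep): both tasks start without waiting for each other, so the elapsed time is roughly one second rather than two.

```python
import asyncio

async def fetch(name, seconds):
    await asyncio.sleep(seconds)  # yields control instead of blocking
    return f"{name} done"

async def main():
    # Start both tasks without waiting for the first to finish
    results = await asyncio.gather(fetch("a", 1), fetch("b", 1))
    print(results)                # finishes in ~1s, not ~2s

asyncio.run(main())
```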
What is synchronous?
Synchronous programming executes tasks sequentially, where one operation completes before the next starts. It operates in a blocking manner, where each task waits for the previous one to finish before proceeding.
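The synchronous counterpart of the sketch above: each simulated call blocks until it finishes, so the elapsed time is the sum of the two waits.

```python
import time

def fetch(name, seconds):
    time.sleep(seconds)  # blocks the whole program
    return f"{name} done"

start = time.perf_counter()
results = [fetch("a", 1), fetch("b", 1)]  # "b" waits for "a" to finish
print(results, f"elapsed ~{time.perf_counter() - start:.0f}s")  # ~2s
```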