Process | Multi-Processing | Thread | Multi-Threading Flashcards
What is a Process?
A Process is an instance of a program in execution. It has its own memory space, resources, and state. Each process is independent of other processes.
What are the components of a Process?
A Process consists of: 1) Code (text section), 2) Data (variables), 3) Heap (dynamic memory), 4) Stack (function calls and local variables), and 5) Process Control Block (PCB).
What is a Process Control Block (PCB)?
The PCB is a data structure used by the OS to store information about a process, such as Process ID (PID), program counter, CPU registers, memory limits, and process state.
What are the states of a Process?
The states of a Process are: 1) New, 2) Ready, 3) Running, 4) Waiting (or Blocked), and 5) Terminated.
What is Context Switching?
Context Switching is the process of saving the state of a running process and loading the state of another process so that the CPU can execute it. It is essential for multitasking.
What is the difference between a Process and a Program?
A Program is a static set of instructions stored on disk, while a Process is a dynamic instance of a program being executed in memory.
What is a Thread?
A Thread is the smallest unit of execution within a process. It shares the same memory space and resources as other threads in the same process but has its own stack and program counter.
What is the difference between a Process and a Thread?
A Process is independent and has its own memory space, while a Thread is a lightweight unit of execution within a process and shares memory with other threads in the same process.
What are the advantages of using Threads?
Advantages of Threads include: 1) Faster creation and context switching, 2) Efficient communication (shared memory), 3) Better resource utilization, and 4) Improved responsiveness in applications.
What are the disadvantages of using Threads?
Disadvantages of Threads include: 1) Complexity in synchronization, 2) Risk of race conditions, 3) Debugging difficulties, and 4) Potential for deadlocks.
What is Multithreading?
Multithreading is the ability to run multiple threads concurrently within a single process. The threads share the process's memory, which improves responsiveness and, on multi-core systems, allows tasks to execute in parallel.
What are the types of Multithreading?
The types of Multithreading are: 1) User-Level Threads (managed by user libraries) and 2) Kernel-Level Threads (managed by the OS).
What is the difference between User-Level Threads and Kernel-Level Threads?
User-Level Threads are managed by user libraries and are faster to create, but they cannot leverage multiple CPUs. Kernel-Level Threads are managed by the OS and can run on multiple CPUs but have higher overhead.
What is a Race Condition?
A Race Condition occurs when two or more threads access shared data simultaneously, leading to unpredictable results due to improper synchronization.
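A minimal Python sketch of a race condition (illustrative only; whether updates are actually lost depends on the interpreter and iteration count): two threads increment a shared counter without synchronization, so their read-modify-write steps interleave.

    import threading

    counter = 0                       # shared data

    def increment(n):
        global counter
        for _ in range(n):
            counter += 1              # read-modify-write is not atomic, so updates can be lost

    threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)                    # often less than 400000 because of lost updates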
What is Synchronization in Multithreading?
Synchronization ensures that only one thread can access a shared resource at a time. It is achieved using tools like mutexes, semaphores, and monitors.
What is a Mutex?
A Mutex (Mutual Exclusion) is a synchronization primitive that ensures only one thread can access a shared resource at a time. It prevents race conditions.
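The same counter guarded by a mutex, sketched with Python's threading.Lock: only one thread at a time executes the increment, so no updates are lost.

    import threading

    counter = 0
    lock = threading.Lock()           # the mutex

    def increment(n):
        global counter
        for _ in range(n):
            with lock:                # acquire/release around the shared update
                counter += 1

    threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)                    # always 400000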
What is a Semaphore?
A Semaphore is a synchronization tool that controls access to shared resources using a counter. It can allow multiple threads to access a resource up to a specified limit.
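A short Python sketch using threading.Semaphore (the worker function and sleep are placeholders): the counter is initialized to 3, so at most three threads hold the resource at once.

    import threading, time

    slots = threading.Semaphore(3)    # at most 3 threads inside the guarded block

    def use_resource(i):
        with slots:                   # blocks if 3 threads already hold a slot
            print(f"worker {i} using the resource")
            time.sleep(1)

    workers = [threading.Thread(target=use_resource, args=(i,)) for i in range(10)]
    for w in workers: w.start()
    for w in workers: w.join()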
What is a Deadlock?
A Deadlock occurs when two or more threads are blocked forever, waiting for each other to release resources. The four conditions for deadlock are: 1) Mutual Exclusion, 2) Hold and Wait, 3) No Preemption, and 4) Circular Wait.
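A deliberately broken Python sketch that deadlocks (it will hang if run): each thread holds one lock and waits for the other, satisfying all four conditions. Acquiring the locks in a single fixed order breaks the circular wait, as the next card discusses.

    import threading, time

    lock_a, lock_b = threading.Lock(), threading.Lock()

    def task1():
        with lock_a:
            time.sleep(0.1)
            with lock_b:              # waits for lock_b, held by task2
                pass

    def task2():
        with lock_b:
            time.sleep(0.1)
            with lock_a:              # waits for lock_a, held by task1 -> circular wait
                pass

    t1 = threading.Thread(target=task1); t2 = threading.Thread(target=task2)
    t1.start(); t2.start()
    t1.join(); t2.join()              # never returns: both threads are blocked forever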
How can Deadlocks be prevented?
Deadlocks can be prevented by breaking one of the four conditions: 1) Avoid Mutual Exclusion, 2) Eliminate Hold and Wait, 3) Allow Preemption, or 4) Break Circular Wait.
What is the Banker’s Algorithm?
The Banker’s Algorithm is a deadlock avoidance algorithm that checks if allocating resources to a process will leave the system in a safe state (where all processes can complete without causing a deadlock).
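A simplified sketch of the safety check at the heart of the Banker's Algorithm (request handling is omitted; the numbers are a textbook-style example, not from this document): the state is safe if some order lets every process obtain its remaining need and finish.

    def is_safe(available, max_need, allocation):
        n = len(max_need)                                 # number of processes
        work = list(available)                            # resources currently free
        need = [[m - a for m, a in zip(max_need[i], allocation[i])] for i in range(n)]
        finished = [False] * n
        progress = True
        while progress:
            progress = False
            for i in range(n):
                if not finished[i] and all(need[i][j] <= work[j] for j in range(len(work))):
                    # process i can finish, then releases everything it holds
                    work = [w + a for w, a in zip(work, allocation[i])]
                    finished[i] = True
                    progress = True
        return all(finished)

    # 5 processes, 3 resource types: returns True, so this state is safe
    print(is_safe([3, 3, 2],
                  [[7, 5, 3], [3, 2, 2], [9, 0, 2], [2, 2, 2], [4, 3, 3]],
                  [[0, 1, 0], [2, 0, 0], [3, 0, 2], [2, 1, 1], [0, 0, 2]]))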
What is Multiprocessing?
Multiprocessing refers to the use of multiple CPUs or cores within a single computer system to execute multiple processes simultaneously. It enables true parallel execution.
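A minimal Python sketch using multiprocessing.Pool: each worker is a separate process with its own memory space, so on a multi-core machine the squares are computed in true parallel.

    from multiprocessing import Pool

    def square(x):
        return x * x                              # runs in a separate process

    if __name__ == "__main__":                    # required where processes are spawned
        with Pool(processes=4) as pool:
            print(pool.map(square, range(10)))    # [0, 1, 4, ..., 81]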
What is the difference between Multiprocessing and Multithreading?
Multiprocessing runs multiple processes independently, each in its own memory space and typically on separate CPUs/cores, while Multithreading runs multiple threads that share memory within a single process (concurrently on one core, or in parallel across several).
What are the advantages of Multiprocessing?
Advantages of Multiprocessing include: 1) High performance, 2) Fault tolerance (if one CPU fails, others continue), and 3) Scalability (more CPUs can be added).
What are the disadvantages of Multiprocessing?
Disadvantages of Multiprocessing include: 1) High cost, 2) Complexity in managing shared resources, and 3) Increased power consumption.
What is the difference between Symmetric Multiprocessing (SMP) and Asymmetric Multiprocessing (AMP)?
In SMP, all CPUs are equal and share the same memory. In AMP, CPUs have specific roles (e.g., one master CPU controls others).
What is a Thread Pool?
A Thread Pool is a collection of pre-initialized threads that are ready to execute tasks. It improves performance by reducing the overhead of creating and destroying threads.
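A sketch using Python's concurrent.futures.ThreadPoolExecutor (the URLs are placeholders): the pool's pre-created threads are reused for each download instead of creating a new thread per task.

    from concurrent.futures import ThreadPoolExecutor
    import urllib.request

    def fetch(url):
        with urllib.request.urlopen(url) as resp:     # I/O-bound work suits threads
            return url, len(resp.read())

    urls = ["https://example.com", "https://example.org"]
    with ThreadPoolExecutor(max_workers=4) as pool:   # pre-initialized, reusable threads
        for url, size in pool.map(fetch, urls):
            print(url, size)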
What is the difference between Preemptive and Non-Preemptive Scheduling?
In Preemptive Scheduling, the OS can interrupt a running task to allocate the CPU to another task. In Non-Preemptive Scheduling, a task retains the CPU until it completes or voluntarily releases it.
What is a Critical Section?
A Critical Section is a segment of code where shared resources are accessed. Only one thread can execute the critical section at a time to prevent race conditions.
What is a Monitor?
A Monitor is a synchronization construct that allows threads to wait for certain conditions to be met. It combines mutual exclusion with the ability to wait and signal.
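A monitor-style sketch with Python's threading.Condition, which pairs a lock with wait/notify: the consumer waits until the shared list is non-empty, and the producer signals it.

    import threading

    condition = threading.Condition()     # mutual exclusion plus wait/notify
    items = []

    def consumer():
        with condition:
            while not items:              # re-check the condition after each wakeup
                condition.wait()          # releases the lock while waiting
            print("consumed", items.pop())

    def producer():
        with condition:
            items.append("data")
            condition.notify()            # wake one waiting thread

    threading.Thread(target=consumer).start()
    threading.Thread(target=producer).start()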
What is the Producer-Consumer Problem?
The Producer-Consumer Problem is a classic synchronization problem where a producer generates data and a consumer processes it. It requires proper synchronization to avoid race conditions.
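One common Python solution sketch: queue.Queue is a thread-safe bounded buffer, so its blocking put/get calls handle the synchronization.

    import threading, queue

    buffer = queue.Queue(maxsize=5)           # bounded, thread-safe buffer

    def producer():
        for i in range(10):
            buffer.put(i)                     # blocks while the buffer is full

    def consumer():
        for _ in range(10):
            print("consumed", buffer.get())   # blocks while the buffer is empty

    p = threading.Thread(target=producer); c = threading.Thread(target=consumer)
    p.start(); c.start()
    p.join(); c.join()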
How does the OS handle Thread Scheduling?
The OS uses scheduling algorithms (e.g., Round Robin, Priority Scheduling) to allocate CPU time to threads. It ensures fair and efficient execution of threads.
What is the difference between Parallelism and Concurrency?
Parallelism involves executing multiple tasks simultaneously using multiple CPUs/cores. Concurrency involves managing multiple tasks at the same time, but not necessarily executing them simultaneously.
What is Amdahl’s Law?
Amdahl’s Law states that the speedup of a program using multiple processors is limited by the portion of the program that cannot be parallelized.
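In formula form, with parallel fraction p and n processors, speedup = 1 / ((1 - p) + p / n); a tiny sketch of the arithmetic:

    def amdahl_speedup(p, n):
        """Speedup with parallel fraction p on n processors."""
        return 1 / ((1 - p) + p / n)

    print(amdahl_speedup(0.9, 8))      # ~4.7x
    print(amdahl_speedup(0.9, 1000))   # ~9.9x: capped near 1 / (1 - p) = 10x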
What is a Zombie Process?
A Zombie Process is a process that has completed execution but still has an entry in the process table to report its status to the parent process.
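A Unix-only Python sketch that briefly creates a zombie (for illustration; the parent just sleeps instead of doing real work): the child exits, but the parent never calls wait(), so the child's process-table entry lingers as <defunct>.

    import os, time

    pid = os.fork()            # Unix-only
    if pid == 0:
        os._exit(0)            # child terminates immediately
    else:
        time.sleep(30)         # parent never calls os.wait(), so the finished child
                               # stays in the process table as a zombie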
What is an Orphan Process?
An Orphan Process is a process whose parent process has terminated, leaving it to be adopted by the init process (PID 1).
What is the difference between a Daemon Process and a Normal Process?
A Daemon Process runs in the background, detached from any terminal, and provides services (e.g., web servers), while a Normal Process is typically attached to a terminal or session and interacts with the user.
What is Thread Safety?
Thread Safety means that a piece of code or data structure can be safely accessed by multiple threads without causing race conditions or inconsistencies.
What is the difference between a Process and a Thread in terms of Memory?
A Process has its own memory space, while a Thread shares memory with other threads in the same process.
What is the GIL (Global Interpreter Lock) in Python?
The GIL is a mutex that allows only one thread to execute Python bytecode at a time, limiting the performance of multithreaded Python programs for CPU-bound work; I/O-bound threads are less affected because the GIL is released during blocking I/O.
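A rough sketch of the GIL's effect on CPU-bound work (timings vary by machine and Python version): the same tasks run with a thread pool and a process pool, and only the process pool scales across cores.

    import time
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

    def cpu_bound(n):
        return sum(i * i for i in range(n))

    def timed(executor_cls):
        start = time.perf_counter()
        with executor_cls(max_workers=4) as ex:
            list(ex.map(cpu_bound, [2_000_000] * 4))
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("threads:  ", timed(ThreadPoolExecutor))    # serialized by the GIL
        print("processes:", timed(ProcessPoolExecutor))   # sidesteps the GIL, runs in parallel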
What is the difference between Multiprogramming and Multiprocessing?
Multiprogramming maximizes CPU utilization by switching between programs on a single CPU, while Multiprocessing uses multiple CPUs/cores to execute processes simultaneously.
What is the difference between Multitasking and Multithreading?
Multitasking involves running multiple processes concurrently, while Multithreading involves running multiple threads within a single process concurrently.
What is the difference between a Heavyweight Process and a Lightweight Process?
A Heavyweight Process has its own memory space and resources, while a Lightweight Process (Thread) shares memory and resources with other threads in the same process.
What is the difference between a Kernel Thread and a User Thread?
A Kernel Thread is managed by the OS and can run on multiple CPUs, while a User Thread is managed by a user library and is limited to a single CPU.
What is the difference between a Process and a Thread in terms of Overhead?
Creating and managing a Process has higher overhead than a Thread because a Process has its own memory space, while a Thread shares memory with other threads.
What is the difference between a Process and a Thread in terms of Communication?
Processes communicate using Inter-Process Communication (IPC) mechanisms like pipes or shared memory, while Threads communicate directly through shared memory.
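A small Python sketch of IPC with multiprocessing.Pipe: the message is serialized and copied between the two address spaces, unlike threads, which simply read the same memory.

    from multiprocessing import Process, Pipe

    def child(conn):
        conn.send("hello from the child process")   # pickled and copied across processes
        conn.close()

    if __name__ == "__main__":
        parent_conn, child_conn = Pipe()
        p = Process(target=child, args=(child_conn,))
        p.start()
        print(parent_conn.recv())                    # message arrives via the pipe
        p.join()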
What is the difference between a Process and a Thread in terms of Fault Tolerance?
If a Process crashes, other Processes are unaffected because their memory is isolated. If a Thread crashes (e.g., through an unhandled fault or corrupted shared memory), it can bring down the entire Process and all of its Threads.
What is the difference between a Process and a Thread in terms of Scalability?
Threads are more scalable than Processes because they require fewer resources and have lower overhead.