Wk1L1 - Intro Flashcards
What is the difference between concurrency and parallelism?
Concurrency is the ability to manage multiple tasks whose executions overlap in time, possibly by interleaving them on a single CPU core.
Parallelism is the ability to execute multiple tasks at literally the same time, which requires multiple cores or processors.
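A minimal sketch of the distinction (Java is chosen here only for illustration; the lecture names no language, and the class name is made up): the same two-thread program is concurrent on any machine, and parallel only when the hardware provides more than one core.

```java
public class ConcurrencyVsParallelism {
    public static void main(String[] args) throws InterruptedException {
        Runnable taskA = () -> {
            for (int i = 0; i < 3; i++) System.out.println("Task A, step " + i);
        };
        Runnable taskB = () -> {
            for (int i = 0; i < 3; i++) System.out.println("Task B, step " + i);
        };

        // On a single core the OS interleaves the two threads (concurrency);
        // on several cores their steps may execute at the same instant (parallelism).
        Thread a = new Thread(taskA);
        Thread b = new Thread(taskB);
        a.start();
        b.start();
        a.join();
        b.join();
    }
}
```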
What is a thread?
A thread is an independent path of execution within a process. In parallel computing, a process can have multiple threads running simultaneously.
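A small illustrative Java sketch (class and thread names are invented for the example): every process starts with one thread, the main thread, and can spawn additional threads, each an independent path of execution.

```java
public class ThreadBasics {
    public static void main(String[] args) throws InterruptedException {
        // The process already has one thread: the "main" thread running this code.
        System.out.println("Running in: " + Thread.currentThread().getName());

        // The same process can spawn more threads, each with its own path of execution.
        Thread worker = new Thread(
                () -> System.out.println("Running in: " + Thread.currentThread().getName()),
                "worker-1");
        worker.start();
        worker.join();   // wait for the worker to finish before the process exits
    }
}
```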
What analogy is used to explain sequential vs. parallel execution?
Sequential: A single chef cooking a meal one task at a time.
Parallel: Multiple chefs working together on different tasks to finish the meal faster.
Why is parallelism important in modern processors?
Modern systems are built around multi-core CPUs and GPUs, which are inherently parallel hardware. Writing parallel programs is how software exploits these systems for better performance and efficiency.
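A hedged Java sketch of this point (class name and workload size are illustrative): the program asks the runtime how many logical cores it sees and lets a parallel stream spread a computation across them.

```java
import java.util.stream.LongStream;

public class CoreCount {
    public static void main(String[] args) {
        // Number of logical cores the JVM sees on this machine.
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("Logical cores: " + cores);

        // A parallel stream splits this sum across the available cores automatically.
        long sum = LongStream.rangeClosed(1, 10_000_000L).parallel().sum();
        System.out.println("Sum = " + sum);
    }
}
```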
What is the benefit of concurrency in distributed applications?
Concurrency allows geographically separated individuals to collaborate in real time via decentralized peer-to-peer software, such as Bitcoin or BitTorrent-style file sharing.
What is the significance of resource utilization in concurrency?
Concurrency allows one thread to run while another waits for an external resource (e.g., user input or data from a disk), ensuring that the CPU continues to be utilized effectively.
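A small Java sketch of this overlap, with a sleep standing in for a slow external resource (all names and timings are illustrative): one thread blocks while another keeps the CPU busy.

```java
public class OverlapIoAndCompute {
    public static void main(String[] args) throws InterruptedException {
        // Simulates a thread blocked on an external resource (disk, network, user input).
        Thread ioThread = new Thread(() -> {
            try {
                Thread.sleep(1000);           // stand-in for a slow I/O call
                System.out.println("I/O finished");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Meanwhile the CPU keeps doing useful work on another thread.
        Thread computeThread = new Thread(() -> {
            long sum = 0;
            for (long i = 0; i < 50_000_000L; i++) sum += i;
            System.out.println("Compute finished, sum = " + sum);
        });

        ioThread.start();
        computeThread.start();
        ioThread.join();
        computeThread.join();
    }
}
```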
What challenges arise when writing parallel programs?
- Parallel programs are harder to debug (see the sketch after this list).
- Context switching and increased resource consumption add overhead.
- Not all problems decompose easily into parallel tasks.
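To make the debugging point concrete, here is a minimal Java sketch of a classic pitfall, unsynchronized updates to shared state (class name and counts are illustrative). The output differs from run to run, which is exactly what makes such bugs hard to reproduce and debug.

```java
public class LostUpdates {
    static int counter = 0;   // shared, unsynchronized state

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 100_000; i++) counter++;   // read-modify-write, not atomic
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Expected 200000, but increments can be lost, so the result varies between runs.
        System.out.println("Counter = " + counter);
    }
}
```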
What are some examples of industrial applications of parallel computing?
- Artificial intelligence
- Stress testing
- Oil exploration
- Pharmaceutical design
- Advanced CGI for movies
What is Moore’s Law, and how does it relate to parallel processing?
Moore’s Law predicts that the number of transistors on an integrated circuit doubles roughly every two years. Since single-core clock speeds have largely stopped rising, those extra transistors now go into additional cores, making parallelism essential for turning this growth into performance.
What are the drawbacks of parallel computing?
- Complex design
- Harder debugging
- Increased resource consumption (memory, call stacks)
- Some problems don’t easily parallelize
- For small problem sizes, a well-designed sequential algorithm can outperform a parallel solution because of the parallel overhead.