Process management: Threads Flashcards
• To introduce the notion of a thread, a fundamental unit of CPU utilisation that forms the basis of multithreaded computer systems
• To discuss the APIs for the Java thread library
Concurrent and Parallel execution
Sequential execution
Concurrent execution
Parallel execution
Remember
* Concurrency and parallelism are not the same thing
* Both need synchronisation
Sequential execution
- One task or subtask (job, process) runs at a time
- A task must complete its execution before another task can start
Concurrent execution
- Multiple tasks or subtasks appear to run in parallel
- Takes advantage of the CPU time-slicing feature of the operating system
- Recall process states: a task runs for part of its work, then enters the waiting state; while it waits, another task runs, and so on
Parallel execution
- Multiple tasks or subtasks actually run at the same time
- Requires two or more CPUs/cores
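The following minimal Java sketch (not part of the original notes; class and thread names are my own) makes the time-slicing idea concrete: two threads run the same task, their output interleaves on a single core, and a multi-core machine may execute them at the same instant.

```java
// Two tasks run as separate threads. On a single core the OS time-slices
// between them, so their output interleaves; on a multi-core machine they
// may genuinely run in parallel.
public class InterleavingDemo {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            String name = Thread.currentThread().getName();
            for (int i = 1; i <= 5; i++) {
                System.out.println(name + " step " + i);
                try {
                    Thread.sleep(10); // give up the CPU so the scheduler can switch tasks
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        };

        Thread t1 = new Thread(task, "task-A");
        Thread t2 = new Thread(task, "task-B");
        t1.start();   // both threads are now runnable at the same time
        t2.start();
        t1.join();    // wait for both to finish before main exits
        t2.join();
    }
}
```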
Concurrency vs. Parallelism
* An application can be neither parallel nor concurrent, which means that it processes all tasks one at a time, sequentially.
* An application can be concurrent but not parallel, which means that it processes more than one task in overlapping time periods, but no two tasks are executing at the same time instant.
* An application can be parallel but not concurrent, which means that it processes multiple sub-tasks of a single task at the same time instant (on a multi-core CPU or multiple CPUs).
* An application can be both parallel and concurrent, which means that it processes multiple tasks concurrently on a multi-core CPU at the same time.
* Parallelism means that an application processes multiple sub-tasks of a task on a multi-core CPU at the same time.
Concurrency
*Two or more tasks can start, run, and complete in
overlapping time periods
*Composition of independently executing processes.
*Dealing with lots of things at once
Parallelism
*Two or more tasks run at the same time (multiple
CPUs/cores).
*Simultaneous execution of (possibly related)
computations.
*Doing lots of things at once
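As an illustrative sketch (class, method, and task names are my own), the java.util.concurrent executors make the distinction visible: a single-thread executor processes the tasks one at a time, while a pool of several threads lets the same tasks overlap in time and, on a multi-core CPU, run at the same instant.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ConcurrencyVsParallelism {
    public static void main(String[] args) throws InterruptedException {
        List<Runnable> tasks = List.of(
                () -> work("task-1"),
                () -> work("task-2"),
                () -> work("task-3"));

        // One worker thread: the tasks run strictly one after another (sequential).
        ExecutorService sequential = Executors.newSingleThreadExecutor();
        tasks.forEach(sequential::submit);
        sequential.shutdown();
        sequential.awaitTermination(1, TimeUnit.MINUTES);

        // Three worker threads: the tasks overlap in time (concurrent) and,
        // on a multi-core machine, may execute at the same instant (parallel).
        ExecutorService concurrent = Executors.newFixedThreadPool(3);
        tasks.forEach(concurrent::submit);
        concurrent.shutdown();
        concurrent.awaitTermination(1, TimeUnit.MINUTES);
    }

    private static void work(String name) {
        System.out.println(name + " running on " + Thread.currentThread().getName());
    }
}
```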
What is a Thread?
- A thread is a basic unit of CPU utilisation.
- Consists of a thread ID, a program counter, a register set, and a stack.
- It shares with other threads belonging to the same process its code section, data section, and other operating system resources (e.g. open files).
- A traditional (or heavyweight) process has a single thread of control.
- If a process has multiple threads of control, it can perform more than
one task at a time.
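A minimal sketch of the Java thread API mentioned in the objectives (class and thread names are illustrative): a task is written as a Runnable and handed to a Thread object, which gets its own thread ID, program counter, and stack inside the current process.

```java
public class HelloThread {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () ->
                System.out.println("Running in thread: "
                        + Thread.currentThread().getName()
                        + " (id " + Thread.currentThread().getId() + ")");

        Thread worker = new Thread(task, "worker-1");
        worker.start();   // start() creates a new thread of control; calling run() would not
        worker.join();    // the main thread waits for the worker to complete
    }
}
```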
Benefits of Threads
Responsiveness
Resource sharing
Economy
Scalability
Responsiveness
Multithreading an interactive application may allow a program to continue running even if
part of it is blocked or is performing a lengthy operation
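A rough sketch of this benefit (my own example, not from the notes): the lengthy operation runs in a background thread, so the main thread stays free to keep responding.

```java
public class ResponsiveMain {
    public static void main(String[] args) throws InterruptedException {
        Thread lengthy = new Thread(() -> {
            long total = 0;
            for (long i = 0; i < 2_000_000_000L; i++) total += i;  // stand-in for a lengthy operation
            System.out.println("Background result: " + total);
        });
        lengthy.start();

        // The main thread is not blocked by the computation and keeps reporting.
        while (lengthy.isAlive()) {
            System.out.println("Main thread still responsive...");
            Thread.sleep(200);
        }
    }
}
```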
Resource sharing
Processes can only share resources through techniques such as shared memory and message
passing. Threads share the memory and the resources of the process to which they belong by
default. The benefit of sharing code and data is that it allows an application to have several
different threads of activity within the same address space
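A small sketch of resource sharing (names are my own): both threads see the same counter object because threads of one process share its data by default. An AtomicInteger is used because, as noted earlier, shared data still needs synchronisation.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class SharedCounter {
    // One counter in the process's memory, visible to every thread of the process.
    private static final AtomicInteger counter = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 10_000; i++) {
                counter.incrementAndGet();   // atomic update of the shared variable
            }
        };

        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("Final value: " + counter.get()); // always 20000
    }
}
```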
Economy
Allocating memory and resources for process creation is costly. Because threads share the
resources of the process to which they belong, it is more economical to create and context-
switch threads (e.g. in Solaris, creating a process is about thirty times slower than creating
a thread, and context switching is about five times slower)
Scalability
The benefits of multithreading can be even greater in a multiprocessor architecture, where
threads may be running in parallel on different processing cores. A single-threaded process
can run on only one processor, regardless of how many are available
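A hedged sketch of the scalability point (class and variable names are illustrative): the array sum is split between two worker threads, so a multi-core machine can compute the two halves in parallel, whereas a single thread would use one core only.

```java
import java.util.Arrays;

public class ParallelSum {
    public static void main(String[] args) throws InterruptedException {
        int[] data = new int[1_000_000];
        Arrays.fill(data, 1);

        long[] partial = new long[2];
        int mid = data.length / 2;

        // Each worker sums one half of the array; the halves can run on different cores.
        Thread left = new Thread(() -> partial[0] = sum(data, 0, mid));
        Thread right = new Thread(() -> partial[1] = sum(data, mid, data.length));
        left.start();
        right.start();
        left.join();    // join() also makes the partial results visible to the main thread
        right.join();

        System.out.println("Total: " + (partial[0] + partial[1]));
    }

    private static long sum(int[] a, int from, int to) {
        long s = 0;
        for (int i = from; i < to; i++) s += a[i];
        return s;
    }
}
```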
Multithreaded server architecture
A web server accepts client requests for web pages, images, sound … A
busy web server may have several clients concurrently accessing it
* Single-threaded process for accepting and servicing requests
  * Only one client is serviced at a time (a client might have to wait a very long time for its request to be serviced)
* Process-creation method
  * When the server receives a request, it creates a separate process to service that request. However, process creation is time consuming and resource intensive.
* Multithreaded server
  * When a request is made, rather than creating another process, the server creates a new thread to service the request and resumes listening for additional requests
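A hedged sketch of the thread-per-request pattern (the port number and the trivial echo "servicing" are my own choices, not the notes'): the main thread only accepts connections, and each request is handed to a new thread so the server can resume listening immediately.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerRequestServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(8080)) {
            while (true) {
                Socket client = server.accept();          // wait for the next request
                new Thread(() -> handle(client)).start(); // service it in its own thread
            }
        }
    }

    private static void handle(Socket client) {
        try (client;
             BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            out.println("echo: " + in.readLine());        // trivial stand-in for real request servicing
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```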
Multicore Programming
The continuous need for more computing performance led single-CPU
systems to evolve into multi-CPU systems. Later, rather than adding multiple
CPUs, multiple computing cores were placed on a single chip.
* Each core appears as a separate processor to the operating system.
* We call these systems multicore or multiprocessor systems.
* Multithreaded programming provides a mechanism for more efficient use of
these multiple computing cores and improved concurrency.
* On a single-core system, concurrency merely means that the execution of the
threads will be interleaved over time, because the processing core is capable of
executing only one thread at a time.
* On a multicore system, concurrency means that the threads can run in parallel,
because the system can assign a separate thread to each core.
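As a closing sketch (my own example), a program can ask the runtime how many cores the operating system exposes and size a thread pool to match, so its runnable threads can be spread across the available cores.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class MulticoreDemo {
    public static void main(String[] args) throws InterruptedException {
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("Available cores: " + cores);

        // One pool thread per core: on a multicore system these tasks can run in parallel.
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        for (int i = 0; i < cores; i++) {
            final int id = i;
            pool.submit(() ->
                    System.out.println("Task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```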