Finals Flashcards
splits a task into smaller subtasks that are executed concurrently.
Fork
After the execution of the subtasks, the task may join all the results into one result.
Join
is a programming model that allows tasks to be split into subtasks (forking) and later combined (joining) after execution.
Fork-Join Parallelism
useful in scenarios where a problem can be broken down into subproblems that can be solved independently
Fork-Join Parallelism
style of parallel programming that solves problems by “Divide and Conquer”
Fork-Join Parallelism
Three (3) key components to achieve parallel processing
*Fork-Join Pool
*ForkJoinTasks
*Worker Threads
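A minimal sketch of how the three components fit together, assuming Java's java.util.concurrent fork/join API; the class name SumTask, the threshold, and the data values are illustrative, not from the source.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Illustrative sketch: sums an array by forking subtasks that the pool's
// worker threads execute, then joining their results.
class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 4;   // illustrative cutoff
    private final long[] numbers;
    private final int start, end;

    SumTask(long[] numbers, int start, int end) {
        this.numbers = numbers;
        this.start = start;
        this.end = end;
    }

    @Override
    protected Long compute() {
        if (end - start <= THRESHOLD) {        // small enough: solve directly
            long sum = 0;
            for (int i = start; i < end; i++) sum += numbers[i];
            return sum;
        }
        int mid = (start + end) / 2;
        SumTask left = new SumTask(numbers, start, mid);
        SumTask right = new SumTask(numbers, mid, end);
        left.fork();                           // FORK: run the left half asynchronously
        long rightResult = right.compute();    // compute the right half in this thread
        return left.join() + rightResult;      // JOIN: combine both results
    }
}

class ForkJoinDemo {
    public static void main(String[] args) {
        long[] data = new long[100];
        for (int i = 0; i < data.length; i++) data[i] = i + 1;

        ForkJoinPool pool = new ForkJoinPool();              // the Fork-Join Pool
        long total = pool.invoke(new SumTask(data, 0, data.length));
        System.out.println("Sum = " + total);                // expected 5050
    }
}
```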
Two main types of task
RecursiveTask
RecursiveAction
An abstract class that defines a task that runs within a ForkJoinPool
ForkJoinTask
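To contrast the two task types: a RecursiveTask returns a result (as in the sum sketch above), while a RecursiveAction performs work without returning one. A minimal sketch assuming the same Java API; DoubleAction and its threshold are illustrative names.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveAction;

// Illustrative sketch: a RecursiveAction doubles array elements in place
// and, unlike RecursiveTask, returns no result.
class DoubleAction extends RecursiveAction {
    private static final int THRESHOLD = 8;   // illustrative cutoff
    private final int[] data;
    private final int start, end;

    DoubleAction(int[] data, int start, int end) {
        this.data = data;
        this.start = start;
        this.end = end;
    }

    @Override
    protected void compute() {
        if (end - start <= THRESHOLD) {
            for (int i = start; i < end; i++) data[i] *= 2;
            return;
        }
        int mid = (start + end) / 2;
        // invokeAll forks both halves and joins them before returning
        invokeAll(new DoubleAction(data, start, mid),
                  new DoubleAction(data, mid, end));
    }
}

class RecursiveActionDemo {
    public static void main(String[] args) {
        int[] data = new int[32];
        for (int i = 0; i < data.length; i++) data[i] = i;
        ForkJoinPool.commonPool().invoke(new DoubleAction(data, 0, data.length));
        System.out.println("data[10] = " + data[10]);  // prints 20
    }
}
```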
These are the workhorses that execute the tasks. They continuously check the queue for available tasks; once they receive a task, they follow its instructions, effectively completing their assigned part of the overall job.
Worker Threads
The fork/join framework uses a cool trick called work-stealing.
Work-stealing
Real-life examples of Fork-Join Parallelism
Weather Forecasting
Big Data Processing
is a path that is followed during a program’s execution.
Thread
The majority of programs written nowadays run as a
single thread
example of a multithreaded program
word processor
is the smallest unit of processing that can be performed in an OS (operating system)
thread
simply a subset of a process
thread
A thread contains all the information in a
Thread Control Block (TCB)
is a unique identifier assigned by the operating system to the thread when it is created
Thread ID
These are the states of the thread, which change as the thread progresses through the system
Thread states
It includes everything that the OS needs to know about, such as how far the thread has progressed and what data is being used
CPU Information
It indicates the weight (or priority) of the thread over other threads which helps the thread scheduler to determine which thread should be selected next from the ready queue
Thread Priority
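A minimal sketch of thread priority in Java, which exposes it through Thread.setPriority; priorities are hints to the scheduler, not guarantees of execution order. The class and task names are illustrative.

```java
// Illustrative sketch: setting thread priorities in Java.
class PriorityDemo {
    public static void main(String[] args) {
        Thread low  = new Thread(() -> System.out.println("low-priority work"));
        Thread high = new Thread(() -> System.out.println("high-priority work"));

        low.setPriority(Thread.MIN_PRIORITY);   // 1
        high.setPriority(Thread.MAX_PRIORITY);  // 10

        low.start();
        high.start();
    }
}
```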
points to the process which triggered the creation of a thread
Parent process pointer
This block stores information about a process, including its process ID (PID), state, counter, registers, memory limits, and list of open files.
Process Control Block
This block stores information about a thread, including its parent process pointer, thread ID (TID), state, program counter, register set, and stack pointer.
Thread Control Block (TCB)
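For illustration only: the TCB is a data structure maintained by the operating system, not by application code. This Java class simply mirrors the fields listed above; the field names and types are assumptions made for the sketch.

```java
// Purely illustrative mirror of the TCB fields named above.
class ThreadControlBlock {
    long parentProcessPointer;  // points to the creating process's PCB
    long threadId;              // unique ID assigned at creation
    String state;               // e.g. NEW, READY, RUNNING, WAITING, TERMINATED
    long programCounter;        // next instruction to execute
    long[] registerSet;         // saved CPU registers
    long stackPointer;          // top of the thread's stack
}
```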
This section represents the memory space allocated to a process, which is divided into different segments.
Process Memory
Contains the executable code of the process
Text (Process Memory)
Stores the data used by the process
Data
Used for storing local variables, function arguments, and return addresses during function calls.
Stack
is the ability of a program or an operating system to enable more than one user at a time without requiring multiple copies of the program running on the computer.
Multithreading
needed for multithreading
Fast central processing unit
The processor can execute only one instruction at a time, but it switches between different threads so fast that it gives the illusion of simultaneous execution.
Processor Handling
Each thread is like a separate task within a program. They share resources and work together smoothly, ensuring programs run efficiently.
Thread Synchronization
Threads in a program can run independently or wait for their turn to process, making programs faster and more responsive.
Efficient Execution
Programmers need to be careful about managing threads to avoid problems like conflicts or situations where threads get stuck waiting for each other.
Programming Considerations
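A minimal sketch of thread synchronization, assuming Java's synchronized keyword; the Counter class and the loop counts are illustrative. Two threads update a shared counter, and the synchronized method prevents lost updates (a race condition), which is the kind of conflict the card above warns about.

```java
// Minimal sketch: two threads increment a shared counter safely.
class Counter {
    private int value = 0;

    synchronized void increment() {  // only one thread may enter at a time
        value++;
    }

    int get() { return value; }
}

class SyncDemo {
    public static void main(String[] args) throws InterruptedException {
        Counter counter = new Counter();
        Runnable work = () -> {
            for (int i = 0; i < 10_000; i++) counter.increment();
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();   // wait for both threads before reading the result
        t2.join();

        System.out.println(counter.get());  // always 20000 with synchronization
    }
}
```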
Multithreading vs. Multiprocessing
Multithreading – refers to the ability of a processor to execute multiple threads concurrently, where each thread runs within a process.
Multiprocessing – refers to the ability of a system to use multiple processors concurrently, where each processor can run one or more threads.
is a CPU feature that allows programmers to split processes into smaller subtasks called threads that can be executed concurrently. These threads may be run asynchronously, concurrently, or in parallel across one or more processors to improve the performance of the application. The ability to run tasks concurrently also makes multithreaded applications and APIs more responsive to the end user.
Multithreading
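A minimal sketch of splitting a program's work into threads, assuming plain java.lang.Thread; the subtask names ("download", "render") are illustrative. The two subtasks run concurrently instead of one after the other, which is what makes the application more responsive.

```java
// Minimal sketch: two subtasks of one program run concurrently.
class MultithreadingDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread download = new Thread(() -> System.out.println("downloading file..."));
        Thread render   = new Thread(() -> System.out.println("rendering UI..."));

        download.start();   // both subtasks begin immediately
        render.start();

        download.join();    // wait for both to finish before continuing
        render.join();
        System.out.println("both subtasks done");
    }
}
```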
delineates a set of tasks that can be executed simultaneously, beginning at the same starting point, the fork, and continuing until all concurrent tasks have finished at the joining point. Only when all the concurrent tasks defined by the fork-join have been completed will the succeeding computation proceed.
Fork–join parallelism
provides a high-performance, parallel, fine-grained task execution framework for Java programs
ForkJoinPool
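A short sketch of obtaining a pool, assuming the standard java.util.concurrent API: a ForkJoinPool can be created with an explicit parallelism level, or the JVM-wide common pool can be reused (as in the RecursiveAction sketch earlier). The parallelism value 4 is illustrative.

```java
import java.util.concurrent.ForkJoinPool;

// Illustrative sketch: a dedicated pool versus the shared common pool.
class PoolDemo {
    public static void main(String[] args) {
        ForkJoinPool custom = new ForkJoinPool(4);         // 4 worker threads
        ForkJoinPool shared = ForkJoinPool.commonPool();   // JVM-wide pool

        System.out.println("custom parallelism: " + custom.getParallelism());
        System.out.println("common parallelism: " + shared.getParallelism());
        custom.shutdown();
    }
}
```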