2.1 Computational Methods Flashcards
Features that make a problem solvable
A problem that can be solved using an algorithm is computable. A problem is only considered computable if it can be solved within a finite, realistic amount of time.
A computable problem typically consists of inputs, outputs and calculations
Problem Recognition
Stakeholders state what they require from the solution, and this information is used to clearly define the problem and the system requirements
Decomposition
Breaking a larger, complex problem into smaller problems, which can be represented as self-contained subroutines. It aims to reduce complexity.
- makes the problem easier to manage, as each sub-problem is self-contained and can be assigned to a different group
- possible to develop modules in parallel = faster development time
- debugging is simpler, as it is easier to identify, locate and mitigate errors in smaller problems
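A minimal sketch of decomposition, using a hypothetical "exam results" problem broken into self-contained subroutines (the data and function names are invented for illustration):

```python
def get_scores():
    # Sub-problem 1: obtain the input data (hard-coded here for illustration).
    return [62, 75, 48, 91]

def calculate_average(scores):
    # Sub-problem 2: perform the calculation.
    return sum(scores) / len(scores)

def format_report(average):
    # Sub-problem 3: present the output.
    return f"Average score: {average:.1f}"

def main():
    # Combining the self-contained subroutines solves the original problem.
    return format_report(calculate_average(get_scores()))

print(main())  # Average score: 69.0
```

Because each subroutine only depends on its parameters, the three could be written and debugged by different people in parallel.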
Divide and Conquer
Involves halving the size of the problem with every iteration. Each sub-problem is solved in the 'conquer' stage, often recursively. The solutions to the sub-problems are then combined during the 'merge' stage to form the final solution.
Divide and Conquer Advantage and disadvantage
Adv = halving the problem at each stage greatly simplifies very complex problems. Dis = deep recursion can cause stack overflow, as each recursive call adds to the call stack.
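Merge sort is a standard example of divide and conquer; a sketch showing the divide, conquer and merge stages:

```python
def merge_sort(items):
    # Divide: halve the problem until a sub-problem is trivially solved.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # conquer stage (recursive)
    right = merge_sort(items[mid:])
    # Merge stage: combine the two sorted halves into the final solution.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```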
Abstraction
- removing excessive and unnecessary details in order to simplify a problem
- abstraction allows programmers to simplify projects by hiding information that is too complex or irrelevant. It enables more efficient software design, as programmers can focus on the core elements rather than unnecessary details.
- this reduces development time and prevents the program from getting unnecessarily large.
- abstraction is a simplified representation of reality
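A small illustration of abstraction: a stack class whose user only sees push/pop/peek, while the detail of how the data is actually stored is hidden away:

```python
class Stack:
    # The caller works with the abstract operations push/pop/peek;
    # the storage detail (a Python list) is hidden inside the class.
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def peek(self):
        return self._items[-1]

s = Stack()
s.push(1)
s.push(2)
print(s.pop())  # 2
```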
Backtracking
It works by methodically visiting each path and building a solution based on the paths found to be correct. If a path is found to be invalid, the algorithm backtracks to the previous stage and visits an alternative path.
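The classic N-queens puzzle shows backtracking in action; a sketch where each invalid path is abandoned and the previous stage is revisited:

```python
def solve_n_queens(n, placed=()):
    # Each element of `placed` is the column of a queen on successive rows.
    row = len(placed)
    if row == n:
        return list(placed)  # every row filled: a complete valid solution
    for col in range(n):
        # Extend this path only if it does not conflict with earlier queens
        # (same column or same diagonal).
        if all(col != c and abs(col - c) != row - r
               for r, c in enumerate(placed)):
            solution = solve_n_queens(n, placed + (col,))
            if solution:
                return solution
        # Otherwise backtrack: abandon this path and try the next column.
    return None

print(solve_n_queens(4))  # [1, 3, 0, 2]
```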
Enumeration
Trying every possible option in turn (brute force) until a solution is found
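A minimal sketch of enumeration: brute-forcing a hypothetical 3-digit PIN by trying every possibility in turn:

```python
def crack_pin(check):
    # Enumeration (brute force): try every possible 3-digit PIN in order.
    for guess in range(1000):
        if check(guess):
            return guess
    return None  # no option satisfied the check

secret = 407  # hypothetical secret, for illustration only
print(crack_pin(lambda g: g == secret))  # 407
```

Enumeration always finds a solution if one exists, but the number of options to try can grow impractically large.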
Data Mining
the technique used to identify patterns and outliers in large data sets (big data) collected from a variety of sources
- used to make predictions about the future based on previous trends
- often involves the handling of personal data
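A heavily simplified stand-in for real data-mining techniques: flagging outliers as values far from the mean (the data set and threshold are invented for illustration):

```python
def find_outliers(values, threshold=2.0):
    # Flag values more than `threshold` standard deviations from the mean,
    # a very simplified example of spotting outliers in a data set.
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    sd = variance ** 0.5
    return [v for v in values if abs(v - mean) > threshold * sd]

print(find_outliers([10, 12, 11, 13, 12, 95]))  # [95]
```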
Heuristics
Used to find an approximate solution to a problem when the standard solution is unreasonably time-consuming or resource-intensive to find
- the solution is not perfectly accurate or complete
- used in machine learning and language recognition
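A sketch of a heuristic: the greedy knapsack strategy, which picks items by best value-per-weight ratio. It runs quickly but, as the example shows, the answer is approximate rather than optimal:

```python
def greedy_knapsack(items, capacity):
    # Heuristic: take items with the best value-per-weight ratio first.
    # Fast, but not guaranteed to find the optimal selection.
    remaining = capacity
    total = 0
    for value, weight in sorted(items, key=lambda i: i[0] / i[1], reverse=True):
        if weight <= remaining:
            remaining -= weight
            total += value
    return total

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight) pairs
print(greedy_knapsack(items, 50))  # 160, but the optimal choice is 100 + 120 = 220
```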
Performance Modelling
Eliminates the need for true performance testing by providing mathematical methods to test a variety of loads on different operating systems.
This provides a cheaper, less time-consuming or safer method of testing applications.
It is useful for safety-critical computer systems.
Can help companies judge the capabilities of a system and whether it is safe to implement.
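As one example of a mathematical method used in performance modelling (not from the flashcard itself), the M/M/1 queueing model estimates a server's average response time under a given load without running real performance tests:

```python
def mm1_response_time(arrival_rate, service_rate):
    # M/M/1 queue: mean response time W = 1 / (service_rate - arrival_rate).
    # Assumes arrival_rate < service_rate, i.e. a stable system.
    return 1 / (service_rate - arrival_rate)

# Model a server that can handle 100 requests/s under a load of 80 requests/s.
print(f"{mm1_response_time(80, 100) * 1000:.0f} ms")  # 50 ms
```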
Pipelining
The process of completing the fetch-decode-execute (FDE) cycle of several instructions simultaneously: while one instruction is being executed, another is being decoded and a third is being fetched.
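A sketch simulating a three-stage pipeline schedule, showing that in a given clock cycle a different instruction can occupy each stage:

```python
def pipeline_schedule(instructions):
    # For each clock cycle, record which instruction is in each FDE stage.
    stages = ["Fetch", "Decode", "Execute"]
    schedule = []
    for cycle in range(len(instructions) + len(stages) - 1):
        active = {}
        for s, stage in enumerate(stages):
            i = cycle - s  # instruction i entered the pipeline s cycles ago
            if 0 <= i < len(instructions):
                active[stage] = instructions[i]
        schedule.append(active)
    return schedule

for cycle, active in enumerate(pipeline_schedule(["I1", "I2", "I3"]), 1):
    print(cycle, active)
# By cycle 3, I1 is executing while I2 is decoded and I3 is fetched.
```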
Visualisation
Data can be presented in a way that is easier for us to understand using visualisation to produce graphs, trees, charts etc.
This makes it possible to identify trends that weren't immediately obvious from the raw data.
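A minimal text-based sketch of visualisation (real tools would produce graphical charts), rendering values as bars so relative sizes become easier to see:

```python
def bar_chart(data):
    # Render each label/value pair as a bar of '#' characters so
    # relative sizes, and therefore trends, stand out at a glance.
    lines = []
    for label, value in data.items():
        lines.append(f"{label:10}| {'#' * value}")
    return "\n".join(lines)

print(bar_chart({"Mon": 3, "Tue": 7, "Wed": 5}))
```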