2.2.2 Computational Methods Flashcards
Features that make a problem solvable by computational methods
- Identifying if a problem can be solved using computational methods is the first stage of problem solving.
- Problems can only be called computable if they can be solved within a finite, realistic amount of time.
- Some computable problems are impractical to solve due to the resources (processing power, speed and memory) or time they need.
Problem Recognition
- Stakeholders state what they require from the solution
- Information used to clearly define the problem and system requirements
Problem may be defined by considering:
- Strengths and weaknesses of current solution
- Inputs, outputs, stored data and volume of data
Problem Decomposition
- Problems are broken down into smaller problems until each subproblem can be represented as a self-contained subroutine
- Decomposition reduces problem complexity by splitting it up into smaller sections
- Enables programmers to find sections that can be implemented using pre-coded modules or libraries
- Makes the project easier to manage
- Teams can be assigned different modules depending on specialisms
- Modules can be designed, developed and tested individually before being combined
- Makes it possible to develop modules in parallel and therefore deliver projects faster
- Simplifies debugging process, as it is quicker to identify, locate and mitigate errors
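For illustration, a minimal Python sketch of decomposition: each subproblem becomes a self-contained subroutine that can be written and tested on its own (the task and the file name are made up for the example).

def read_scores(filename):
    # Input: load one exam score per line from a text file
    with open(filename) as f:
        return [int(line) for line in f if line.strip()]

def average(scores):
    # Process: calculate the mean score
    return sum(scores) / len(scores)

def report(mean):
    # Output: display the result
    print(f"Average score: {mean:.1f}")

def main():
    # Each subroutine can be designed, developed and tested individually
    # before being combined here
    report(average(read_scores("scores.txt")))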
Divide and Conquer
Problem-solving technique that can be broken down into three parts:
Divide
- Halves the size of the problem with every iteration
Conquer
- Each subproblem is solved, often recursively
Merge
- Solutions to the subproblems are then recombined
Applied to problem-solving in quick sort, merge sort and binary search
Simplifies complex problems very quickly, as repeatedly halving the problem shrinks it exponentially
Algorithms based on divide and conquer have a logarithmic factor in their time complexity: binary search is O(log n), while merge sort and quick sort are O(n log n) on average.
As divide and conquer mostly makes use of recursion, it faces the same problems as all recursive solutions: very deep recursion can cause a stack overflow and crash the program, and large recursive programs are difficult to trace.
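For example, merge sort shows all three parts; a minimal Python sketch (recursive, as is typical for divide and conquer):

def merge_sort(items):
    # Divide: halve the list until each piece holds a single item
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    # Conquer: sort each half recursively
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge: recombine the two sorted halves
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([8, 3, 5, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]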
Use of Abstraction
- Excessive details are removed to simplify a problem
- Problems may be reduced so that they resemble problems that have already been solved
- This allows pre-programmed modules and libraries to be used
- Levels of abstraction allow a complex project to be divided into simpler parts
- Levels can be assigned to different teams and details about other layers hidden
- Makes projects more manageable.
- Abstraction by generalisation used to group together sections with similar functionality
- Means segments can be coded together, saving time
- Abstraction is used to represent real-world entities with computational elements
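For example, a short Python sketch of representing a real-world entity with only the details the program needs (the LibraryBook class and its attributes are illustrative):

class LibraryBook:
    # Abstraction: only the details the lending system needs are kept;
    # excessive detail (cover colour, paper weight, ...) is removed
    def __init__(self, title, isbn):
        self.title = title
        self.isbn = isbn
        self.on_loan = False

    def borrow(self):
        # Other parts of the program use this operation without needing
        # to know how the loan state is stored internally
        self.on_loan = True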
Backtracking
- Problem-solving technique implemented using algorithms, often recursively
- Methodically builds a solution based on visited paths found to be correct
- If a path is found to be invalid, algorithm backtracks to the previous stage
- Depth-first graph traversals are an example of backtracking
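A minimal Python sketch of backtracking with a depth-first traversal: the path is extended while it remains valid, and the algorithm backtracks when it reaches a dead end (the graph is a made-up example):

def find_path(graph, node, target, path=None):
    path = (path or []) + [node]            # extend the current path
    if node == target:
        return path
    for neighbour in graph[node]:
        if neighbour not in path:           # only follow unvisited nodes
            result = find_path(graph, neighbour, target, path)
            if result is not None:          # a valid path was found deeper down
                return result
    return None                             # dead end: backtrack to the previous stage

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(find_path(graph, "A", "D"))           # ['A', 'B', 'D']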
Data Mining
- Technique used to identify patterns or outliers in large sets of data collected from a variety of sources, termed big data
- Spots trends or correlations between data which are not immediately obvious
- Insights from data mining can aid predictions about the future
- This makes data mining a useful tool in assisting business and marketing decisions
- However, as data mining often involves handling personal data, it must be processed in accordance with current data protection legislation
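As a toy illustration of the idea (real data mining works on far larger data sets with specialised tools), the correlation between two fields can be calculated to spot a trend; the figures below are made up, and statistics.correlation needs Python 3.10 or later:

from statistics import correlation

advert_spend = [120, 150, 90, 200, 170]    # weekly advertising spend
sales        = [310, 360, 250, 480, 400]   # weekly sales

# A coefficient close to 1 suggests a correlation that might not be
# obvious from the raw figures alone
print(round(correlation(advert_spend, sales), 2))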
Heuristics
- Non-optimal, ‘rule-of-thumb’ approach to problem-solving
- Used to find an approximate solution when the standard solution is unreasonably time-consuming or resource-intensive to find
- Solution found through using heuristics is not perfectly accurate or complete
- The focus is on finding a quick solution that is ‘good enough’; heuristics provide a shortcut to this.
- Used to provide estimations for intractable problems, shortest path-finding problems, machine learning and language recognition
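For example, a nearest-neighbour heuristic for a travelling-salesman style route: always visit the closest unvisited city next. A short Python sketch with made-up coordinates; the route it finds is quick to produce but not guaranteed to be the shortest:

import math

cities = {"A": (0, 0), "B": (3, 4), "C": (6, 0), "D": (3, 1)}

def distance(p, q):
    return math.dist(cities[p], cities[q])

def nearest_neighbour_route(start):
    route, unvisited = [start], set(cities) - {start}
    while unvisited:
        nearest = min(unvisited, key=lambda c: distance(route[-1], c))
        route.append(nearest)
        unvisited.remove(nearest)
    return route

print(nearest_neighbour_route("A"))   # ['A', 'D', 'B', 'C']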
Performance Modelling
- Performance modelling eliminates the need for true performance testing by providing mathematical methods to test a variety of loads on different operating systems.
- Provides cheaper, less time-consuming or safer method of testing applications
- Useful for safety-critical computer systems, where a trial run is not feasible
- The results of performance modelling can help companies judge the capabilities of a system, how it will cope in different environments and assess whether it is safe to implement.
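As a simple illustration (only one of many ways performance can be modelled), the standard M/M/1 queueing formula W = 1 / (service rate - arrival rate) can predict average response time under different loads without running a real test; the figures are made up:

service_rate = 100                     # requests the system can process per second

for arrival_rate in (20, 50, 80, 95):  # loads to model, in requests per second
    wait = 1 / (service_rate - arrival_rate)
    print(f"{arrival_rate} req/s -> {wait * 1000:.0f} ms average response time")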
Pipelining
- Process in which modules are divided into individual tasks, with different tasks being developed in parallel
- Enables faster project delivery
- Output of one process typically becomes the input of another
- Commonly used in RISC processors: different sections of the FDE cycle are performed simultaneously
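A small Python sketch of a software pipeline built from generator stages, where the output of one stage becomes the input of the next (the log file name is illustrative):

def read_lines(filename):              # stage 1: produce raw lines
    with open(filename) as f:
        for line in f:
            yield line

def clean(lines):                      # stage 2: tidy each line
    for line in lines:
        yield line.strip().lower()

def keep_errors(lines):                # stage 3: filter the stream
    for line in lines:
        if "error" in line:
            yield line

for entry in keep_errors(clean(read_lines("log.txt"))):
    print(entry)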
Visualisation
- Visualisation presents data in a way that is easier for people to understand.
- This makes it possible to identify trends that were not otherwise obvious, particularly amongst statistical data.
- Depending on the type of data, data may be represented as graphs, trees, charts and tables.
- Visualisation is another technique that is used by businesses to pick up on patterns which can be used to inform business decisions.
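For example, a small sketch using the matplotlib plotting library to turn a table of figures into a chart where the trend is easier to see (the sales figures are made up):

import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr"]
sales = [120, 135, 160, 190]

plt.bar(months, sales)
plt.xlabel("Month")
plt.ylabel("Units sold")
plt.title("Monthly sales")
plt.show()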