Time complexity of algorithms Flashcards
Time Complexity
The number of operations an algorithm performs to complete its task (assuming each operation takes the same amount of time). The algorithm that completes the task in the smallest number of operations is considered the most efficient in terms of time complexity. Note that actual running time and memory usage are also affected by factors such as your operating system and hardware, which is why we count operations rather than measure seconds.
const array = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const search_digit = 10;
The linear search algorithm compares each element of the array to search_digit; when it finds search_digit in the array, it returns true.
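A minimal sketch of such a linear search, in the style of the snippet above (the function name linearSearch is just an illustrative choice):
function linearSearch(arr, target) {
  // Compare each element to the target, one by one.
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) {
      return true; // found the search digit
    }
  }
  return false; // reached the end without finding it
}
console.log(linearSearch(array, search_digit)); // true, after 10 comparisons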
Now let’s count the number of operations it performs. Here, the answer is 10, since it compares every element of the array. So, Linear search uses ten operations to find the given element. This is the maximum number of operations for this array; in Linear search, it is also known as the worst case of the algorithm.
In general, Linear search will take n operations in its worst case (where n is the size of the array).
So, for an array of size n, Linear search performs n operations to complete the search. Binary search, on the other hand, performs log(n) operations (both in their worst cases). We can represent this as a graph (x-axis: number of elements, y-axis: number of operations).
The graph shows that the rate at which the number of operations grows for Linear search is much faster than for Binary search.
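For comparison, here is a rough sketch of a binary search on a sorted array (the function name binarySearch is an illustrative choice); each step halves the remaining range, which is where the log(n) operation count comes from:
function binarySearch(sortedArr, target) {
  let low = 0;
  let high = sortedArr.length - 1;
  while (low <= high) {
    // Look at the middle element of the remaining range.
    const mid = Math.floor((low + high) / 2);
    if (sortedArr[mid] === target) {
      return true; // found it
    } else if (sortedArr[mid] < target) {
      low = mid + 1; // discard the left half
    } else {
      high = mid - 1; // discard the right half
    }
  }
  return false; // range is empty, so the target is not present
}
console.log(binarySearch(array, search_digit)); // true, after 4 halvings (roughly log2(10))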
When we analyse an algorithm, we use a notation to represent its time complexity, and that notation is Big O notation.
For example, the time complexity of Linear search can be represented as O(n) and that of Binary search as O(log n), where n is the size of the input and the notation describes how the number of operations grows with n.
The time complexities (Big O notations) of some popular algorithms are:
Binary Search: O(log n)
Linear Search: O(n)
Quick Sort: O(n * log n)
Selection Sort: O(n * n)
Travelling Salesperson: O(n!)
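As a quick illustration of where the O(n * n) for Selection Sort comes from, here is a rough sketch (illustrative names); the nested loops mean roughly n comparisons for each of the n positions:
function selectionSort(arr) {
  const a = [...arr]; // work on a copy
  for (let i = 0; i < a.length; i++) {
    // Find the smallest element in the unsorted part a[i..end].
    let minIndex = i;
    for (let j = i + 1; j < a.length; j++) {
      if (a[j] < a[minIndex]) {
        minIndex = j;
      }
    }
    // Swap it into position i.
    const tmp = a[i];
    a[i] = a[minIndex];
    a[minIndex] = tmp;
  }
  return a;
}
console.log(selectionSort([5, 2, 9, 1])); // [1, 2, 5, 9]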
Problem-solving using a computer requires memory to hold temporary data or the final result while the program is in execution. The amount of memory required by an algorithm to solve a given problem is called the space complexity of the algorithm.
The space complexity of an algorithm quantifies the amount of space the algorithm takes to run as a function of the length of the input. Consider an example: suppose the problem is to find the frequency of each element in an array (a sketch follows the list below).
It is the amount of memory needed for the completion of an algorithm.
To estimate the memory requirement, we need to focus on two parts:
(1) A fixed part: It is independent of the input size. It includes memory for instructions (code), constants, variables, etc.
(2) A variable part: It is dependent on the input size. It includes memory for recursion stack, referenced variables, etc.
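A minimal sketch of the frequency-counting example mentioned above (illustrative names): the counts object belongs to the variable part, since it can grow with the number of distinct elements in the input, while the loop variable belongs to the fixed part:
function elementFrequencies(arr) {
  const counts = {}; // variable part: grows with the number of distinct elements
  for (const value of arr) { // loop variable: constant, fixed-part space
    counts[value] = (counts[value] || 0) + 1;
  }
  return counts;
}
console.log(elementFrequencies([1, 2, 2, 3, 3, 3])); // { '1': 1, '2': 2, '3': 3 }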
Space Complexity
total memory space required by the program for its execution.
The efficiency of an algorithm also depends upon the nature and ____ of the input.
size
Best Time Complexity
The input for which the algorithm takes the least (minimum) time. The best case gives the lower bound of an algorithm's running time. Example: in Linear search, the best case occurs when the search data is present at the first location of a large array.
Average Time Complexity
In the average case, we take all random inputs, calculate the computation time for each input, and then divide the total by the number of inputs.
Worst Time Complexity
The input for which the algorithm takes the longest (maximum) time. The worst case gives the upper bound of an algorithm's running time. Example: in Linear search, the worst case occurs when the search data is present at the last location of a large array.
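A quick sketch showing how the same linear search idea hits its best and worst cases depending on where the target sits (comparison counting added purely for illustration):
function countComparisons(arr, target) {
  let comparisons = 0;
  for (let i = 0; i < arr.length; i++) {
    comparisons++; // one comparison per element examined
    if (arr[i] === target) break; // stop as soon as the target is found
  }
  return comparisons;
}
const data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
console.log(countComparisons(data, 1)); // best case: 1 comparison (target at the first location)
console.log(countComparisons(data, 10)); // worst case: 10 comparisons (target at the last location)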