LN1 Algorithms in General and Time Complexity Flashcards
What is an algorithm in the context of computer science?
An algorithm is a precise and unambiguous set of instructions for performing calculations to solve a specific problem for every possible input instance. It is an abstract mathematical concept, independent of any programming language or machine implementation.
Define the terms “problem” and “instance” in algorithm design.
A “problem” is a general question to be solved, defined by a set of possible inputs and the desired output for each input. An “instance” is a specific input from the set of possible inputs for the problem.
Why is an algorithm not the same as a computer program?
An algorithm is an abstract concept outlining the steps to solve a problem, while a program is a concrete implementation of an algorithm in a specific programming language, complete with syntax and language-specific structures.
What is the importance of distinguishing between algorithms and programs?
Distinguishing between algorithms and programs allows for focusing on the problem-solving process without being distracted by programming details. It enables the design of efficient algorithms that can be implemented in any programming language.
Why should algorithm development come before programming?
Developing the algorithm first ensures a correct and efficient solution to the problem. Programming without a well-designed algorithm may lead to inefficient or incorrect solutions, because implementation details can distract from the underlying problem-solving.
What are elementary operations in time complexity analysis?
Elementary operations are the basic computations that an algorithm performs, considered to take constant time. Examples include arithmetic operations on single digits, comparisons, and data access operations.
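A minimal illustration (hypothetical code, not from the flashcards): summing a list of n numbers performs one elementary addition per element, so the operation count grows linearly with the input size.

def sum_list(values):
    """Sum a list while counting the elementary additions performed."""
    total, ops = 0, 0
    for v in values:       # the loop body runs once per element
        total += v         # one elementary addition
        ops += 1
    return total, ops      # ops == len(values): O(n) elementary operations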
How is the size of an instance defined in algorithm analysis?
The size of an instance, denoted as n, is typically defined as the amount of input data, such as the number of elements in a list, the number of vertices in a graph, or the number of digits in a number.
What is time complexity and why is it crucial in algorithm analysis?
Time complexity measures the amount of computational time an algorithm takes relative to the size of the input. It is crucial because it helps predict performance and scalability, guiding the selection of the most efficient algorithm for a problem.
What is Big-O notation?
Big-O notation describes an upper bound on the time complexity of an algorithm, characterizing its growth rate by focusing on the dominant term and ignoring constant factors and lower-order terms.
How do you formally define that a function t(n) is O(f(n))?
A function t(n) is O(f(n)) if there exists a positive constant c and a value n₀ such that for all n ≥ n₀, t(n) ≤ c * f(n). This means t(n) does not grow faster than f(n) up to a constant multiple.
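For instance, t(n) = 3n² + 5n + 2 is O(n²): for every n ≥ 1, 3n² + 5n + 2 ≤ 3n² + 5n² + 2n² = 10n², so the definition is satisfied with c = 10 and n₀ = 1.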
Why are constants and lower-order terms ignored in Big-O notation?
Constants and lower-order terms become negligible for large input sizes compared to the dominant term, so Big-O notation focuses on the most significant factor affecting growth rate.
Explain the difference in time complexity between adding and multiplying two n-digit numbers.
Adding two n-digit numbers has a time complexity of O(n) because each digit is processed once. Multiplying two n-digit numbers using the standard algorithm has a time complexity of O(n²) because each digit of one number is multiplied by each digit of the other.
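A rough sketch of why (illustrative code, not from the flashcards): schoolbook addition and multiplication on lists of digits, with counters showing that the number of single-digit operations grows linearly for addition and quadratically for multiplication.

def add_digits(a, b):
    """Add two numbers given as digit lists (least significant digit first)."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    result, carry, ops = [], 0, 0
    for i in range(n):                  # one pass over the digits: O(n)
        s = a[i] + b[i] + carry
        ops += 1
        result.append(s % 10)
        carry = s // 10
    if carry:
        result.append(carry)
    return result, ops

def multiply_digits(a, b):
    """Schoolbook multiplication of two digit lists (least significant first)."""
    result = [0] * (len(a) + len(b))
    ops = 0
    for i, da in enumerate(a):          # every digit of a ...
        carry = 0
        for j, db in enumerate(b):      # ... meets every digit of b: O(n^2)
            cur = result[i + j] + da * db + carry
            ops += 1
            result[i + j] = cur % 10
            carry = cur // 10
        result[i + len(b)] += carry
    return result, ops

For n = 1000 digits, add_digits reports about a thousand operations while multiply_digits reports about a million, matching the O(n) versus O(n²) bounds.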
Is it possible to multiply two n-digit numbers faster than O(n²)?
Yes. Karatsuba's algorithm multiplies two n-digit numbers in O(n^1.585) time (more precisely, O(n^log₂3)), and FFT-based algorithms such as Schönhage–Strassen achieve O(n log n log log n) time.
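A minimal sketch of Karatsuba's algorithm on nonnegative Python integers (the decimal split via a power of ten is an implementation choice made here for readability): it replaces four half-size multiplications with three, which is where the O(n^1.585) bound comes from.

def karatsuba(x, y):
    # Base case: single-digit factors are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around the middle (in decimal digits).
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    # Three recursive multiplications instead of four.
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

assert karatsuba(1234, 5678) == 1234 * 5678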
What is the Interval Scheduling problem?
The Interval Scheduling problem involves selecting the maximum number of pairwise non-overlapping intervals from a given set of intervals on the real line.
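The classic solution, for reference, is the earliest-finish-time greedy algorithm: sort the intervals by finishing time and repeatedly pick the next interval that starts no earlier than the last chosen one ends. It is optimal and runs in O(n log n) time, dominated by the sort. A small sketch:

def max_non_overlapping(intervals):
    """intervals: list of (start, finish) pairs.
    Returns a maximum-size subset of pairwise non-overlapping intervals."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:        # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

print(max_non_overlapping([(1, 4), (3, 5), (0, 6), (5, 7), (8, 9)]))
# -> [(1, 4), (5, 7), (8, 9)]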
How does the Big-O notation handle the sum of two functions?
The Big-O of a sum of functions is determined by the function with the largest growth rate. Formally, O(f(n) + g(n)) = O(max(f(n), g(n))).
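For instance, if an algorithm runs two phases taking O(n²) and O(n log n) time, its total running time is O(n² + n log n) = O(n²), since the n² term dominates for large n.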