5: Complexity Flashcards
What is an oracle?
A black box attached to a TM that solves a (possibly difficult) problem in a single call; the TM can query it during its computation.
What is a computation resource? List 5.
An asset consumed during computation, whose availability bounds the complexity and difficulty of what can be computed.
time; space; energy; the number of comparisons; the number of calls to an oracle.
How do you measure time and space complexity in TMs?
Time ≡ the number of transitions made by the TM.
Space ≡ the number of tape cells touched by the TM.
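The two measures can be made concrete with a toy simulator. This is a minimal sketch (not from the source): a single-tape deterministic TM runner that counts transitions (time) and distinct tape cells touched (space). The machine `delta`, the state names, and `run_tm` itself are all illustrative inventions.

```python
# Minimal sketch: a toy single-tape DTM simulator that counts
# transitions (time) and distinct tape cells touched (space).
def run_tm(delta, start, accept, reject, word, blank="_", max_steps=10_000):
    tape = dict(enumerate(word))          # sparse tape: position -> symbol
    state, head = start, 0
    steps, touched = 0, {0}
    while state not in (accept, reject) and steps < max_steps:
        sym = tape.get(head, blank)
        state, write, move = delta[(state, sym)]   # one transition
        tape[head] = write
        head += 1 if move == "R" else -1
        touched.add(head)
        steps += 1
    return state == accept, steps, len(touched)

# Illustrative machine: scan right over a's, accept at the first blank.
delta = {
    ("q0", "a"): ("q0", "a", "R"),
    ("q0", "_"): ("qacc", "_", "R"),
}
accepted, time_used, space_used = run_tm(delta, "q0", "qacc", "qrej", "aaa")
```

On input "aaa" this machine makes 4 transitions (three a's plus the blank) and touches 5 cells, matching the two definitions above.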
What is an instance of a problem?
An input word that falls within (the language of) a problem.
Consider a positive non-decreasing function f : N → N, a language L (over some alphabet A), and n the length of the input word.
True/false: DTIME(f(n)) ⊃ {L | L is decided by some (deterministic) TM in at most f(n) transitions}
False. Here, DTIME(f(n)) = {L | L is decided by some (deterministic) TM in at most f(n) transitions}
What is a complexity class?
A set of computational problems (languages), grouped by the amounts of resources (such as time and space) required to solve them.
How do you determine the complexity class(es) of languages?
Consider how the number of transitions and of tape cells (or stack size) needed grows asymptotically with the input length, for machines recognising the language. You can do this for deterministic and nondeterministic machines, and determine for either or both whether the language can be decided in constant, logarithmic, linear, polynomial, exponential, etc., time or space.
What does β^? mean?
A ∈^? B poses the question of whether A is an element of the set B: the superscript ? marks the membership as something to be decided rather than asserted.
What does the big O notation mean?
- If f(n) = O(g(n)), then g is at least as big as f
- f is at most a constant times g
- ∃c > 0, n0. ∀n > n0. f(n) < c⋅g(n)
What does the little o notation mean?
- If f(n) = o(g(n)), then f is infinitely smaller than g
- f(n)/g(n) → 0 as n → ∞
- ∀c > 0. ∃n0. ∀n > n0. f(n) < c⋅g(n)
What does the big omega notation mean?
- If f(n) = Ω(g(n)), then f is at least as big as g
- f is at least a constant times g
- g(n) = O(f(n))
What does the little omega notation mean?
- If f(n) = ω(g(n)), then g is infinitely smaller than f
- g(n)/f(n) → 0 as n → ∞
- g(n) = o(f(n))
What does the big theta notation mean?
- If f(n) = Θ(g(n)), then f and g are the same size
- f and g are within a constant factor of each other
- f(n) = O(g(n)) and f(n) = Ω(g(n))
Consider a positive non-decreasing function f : N → N, a language L (over some alphabet A), and n the length of the input word.
True/false: DSPACE(f(n)) ⊃ {L | L is decided by some (deterministic) TM in at most f(n) tape cells}
False. Here, DSPACE(f(n)) = {L | L is decided by some (deterministic) TM in at most f(n) tape cells}
Are input tape cells (or whole tapes dedicated to holding input in multi-tape TMs) counted as part of space complexity? Why?
No. Space complexity measures the working space needed to solve the problem: cells that merely hold the input are not counted. Otherwise every machine would trivially use at least linear space, and sublinear (e.g. logarithmic) space classes would be meaningless.
When is g(n) = O(f(n))?
There exist constants c > 0 and n0 > 0 such that for all n > n0, g(n) ≤ c⋅f(n). Intuitively, g(n) grows no faster than f(n) for large n. This is an asymptotic upper bound.
When is g(n) = o(f(n))?
For every c > 0 there exists an n0 > 0 such that for all n > n0, g(n) ≤ c⋅f(n). Intuitively, g(n) grows strictly slower than f(n) for large n. This is a strict asymptotic upper bound.
When is g(n) = Ω(f(n))?
There exist constants c > 0 and n0 > 0 such that for all n > n0, g(n) ≥ c⋅f(n). Intuitively, g(n) grows no slower than f(n) for large n. This is an asymptotic lower bound.
When is g(n) = ω(f(n))?
For every c > 0 there exists an n0 > 0 such that for all n > n0, g(n) ≥ c⋅f(n). Intuitively, g(n) grows strictly faster than f(n) for large n. This is a strict asymptotic lower bound.
When is π(π) Ξ(π (π))?
There exist constants π1, π2, and a π0 > 0 such that for all π > π0, π1β π (π) β€ π(π) β€ π2β π (π). Intuitively,π(π) grows like π (π) for large π. This is an asymptotic tight bound.
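The quantifier structure of these definitions can be probed numerically. A sketch (illustrative, not from the source; the functions f and g below are hypothetical examples): exhibiting witnesses c and n0 and checking the inequality over a sampled range is evidence (not a proof) that the bound holds.

```python
# Sketch: checking asymptotic-bound witnesses numerically.
def holds_upto(f, g, c, n0, limit=10_000):
    """Check f(n) <= c*g(n) for every n with n0 < n <= limit."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, limit + 1))

f = lambda n: 3 * n + 5           # hypothetical f
g = lambda n: n                   # hypothetical g

# f(n) = O(g(n)): witnesses c = 4, n0 = 5, since 3n + 5 <= 4n once n >= 5.
upper = holds_upto(f, g, c=4, n0=5)

# f(n) = Omega(g(n)): witnesses c = 1, n0 = 0, since n <= 3n + 5 for all n.
# (g(n) <= 1*f(n) is the same inequality written for holds_upto.)
lower = holds_upto(g, f, c=1, n0=0)

# Both together witness f(n) = Theta(g(n)).
```

Note the quantifier difference: for O and Ω a single pair (c, n0) suffices, while for o and ω the check would have to succeed for every c > 0.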
What is the Time Speedup Lemma?
Let L ∈ DTIME(f(n)). Then, for any ε > 0,
L ∈ DTIME(f′(n)) where f′(n) = ε⋅f(n) + O(n).
If you increase the number of states and the alphabet size (by a constant factor depending on ε), the new TM can simulate several (roughly 1/ε) transitions of the original TM in each single transition of its own; the O(n) term covers re-encoding the input, which takes O(n) time.
What is the Tape Compression Lemma?
Let L ∈ DSPACE(f(n)). Then, for any ε > 0,
L ∈ DSPACE(f′(n)) where f′(n) = ε⋅f(n) + O(1).
If you increase the number of states and the alphabet size (by a constant factor depending on ε), the new TM can store several (roughly 1/ε) cells of the original tape in each single cell of its own.
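The alphabet-enlargement trick behind the lemma can be sketched directly. This is an illustrative toy (not from the source): the "enlarged alphabet" is modelled as k-tuples of original symbols, so a tape of m cells becomes one of about m/k cells.

```python
# Sketch: packing k adjacent tape cells into one cell over an
# enlarged alphabet (here, the alphabet of k-tuples of old symbols).
def compress(tape, k, blank="_"):
    """Pack the tape into cells holding k original symbols each."""
    padded = tape + [blank] * (-len(tape) % k)   # pad to a multiple of k
    return [tuple(padded[i:i + k]) for i in range(0, len(padded), k)]

tape = list("abbaabba")          # 8 original cells
packed = compress(tape, k=4)     # 2 cells over the alphabet of 4-tuples
# len(packed) == ceil(len(tape) / k): the epsilon = 1/k space factor.
```

The simulating TM reads and writes whole tuples, which is why its state set and alphabet grow by a constant factor depending on ε.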
Let f be a positive non-decreasing function f : N → N,
let L be a language (over some alphabet A),
and n be the length of the input.
True/false: NTIME(f(n)) = {L | there is some nondeterministic TM M such that for any word w ∈ L, any computation path takes at most f(n) transitions}
True.
Let f be a positive non-decreasing function f : N → N,
let L be a language (over some alphabet A),
and n be the length of the input.
True/false: NSPACE(f(n)) = {L | there is some nondeterministic TM M such that for any word w ∈ L, any computation path takes at most f(n) additional tape cells}
True.
Let f be a positive non-decreasing function f : N → N,
let L be a language (over some alphabet A),
and n be the length of the input.
True/false: NSPACE(f(n)) = {L | there is some certificate c such that L is decided by a deterministic TM which takes c as additional input using at most f(n) additional tape cells}
True.
Let f be a positive non-decreasing function f : N → N,
let L be a language (over some alphabet A),
and n be the length of the input.
True/false: NTIME(f(n)) = {L | there is some certificate c such that L is decided by a deterministic TM which takes c as additional input using at most f(n) transitions}
True.
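The certificate view above is easiest to see with a concrete verifier. A sketch (illustrative, not from the source): a deterministic checker that, given a CNF formula and a certificate (a truth assignment), decides in a single pass whether the certificate witnesses satisfiability. The clause encoding and the function `verify_sat` are assumptions for this example.

```python
# Sketch: the certificate view of nondeterminism. A deterministic
# verifier checks whether a certificate (here: a truth assignment)
# witnesses membership (here: satisfiability of a CNF formula).
def verify_sat(cnf, assignment):
    """cnf: list of clauses, each clause a list of ints (+v or -v).
    assignment: dict variable -> bool. One pass over the formula."""
    def lit_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    return all(any(lit_true(lit) for lit in clause) for clause in cnf)

# (x1 or not x2) and (x2 or x3)
cnf = [[1, -2], [2, 3]]
ok = verify_sat(cnf, {1: True, 2: False, 3: True})    # a valid certificate
bad = verify_sat(cnf, {1: False, 2: False, 3: False}) # not a witness
```

A nondeterministic machine "guesses" the certificate along a computation path; the deterministic verifier then checks it within the stated resource bound.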