Dynamic Arrays and Amortized Analysis Flashcards
What is the tightest correct upper bound on the worst-case running time of a PushBack operation on a dynamic array currently containing n elements?
a) O(n)
b) O(1)
c) O(n log n)
d) O(n^2)
O(n). If the dynamic array needs to be resized to accommodate the new element, it takes O(n) time to allocate a new array of double the capacity and copy all of the current elements into it.
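A minimal sketch of a doubling PushBack, assuming an illustrative DynamicArray class (names are not from the lecture); the occasional full copy is what makes the worst-case cost O(n):

```python
class DynamicArray:
    """Doubling dynamic array: capacity starts at 0 and doubles when full."""
    def __init__(self):
        self.capacity = 0
        self.size = 0
        self.data = []                     # stand-in for the allocated block

    def push_back(self, x):
        if self.size == self.capacity:     # full: must reallocate
            new_capacity = max(1, 2 * self.capacity)
            new_data = [None] * new_capacity
            new_data[:self.size] = self.data[:self.size]   # O(n) copy
            self.data = new_data
            self.capacity = new_capacity
        self.data[self.size] = x           # O(1) write
        self.size += 1

a = DynamicArray()
for i in range(9):
    a.push_back(i)
print(a.size, a.capacity)                  # 9 16
```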
Which of the following is the tightest correct upper bound on the value of the sum \sum_{j=1}^{floor(log2(n-1))} 2^j? Recall that floor(x) is the floor function of x, i.e., the largest integer that is not greater than x. Also recall that 2^(log2(x)) = x. You may want to read about the geometric series.
a) O(n)
b) O(1)
c) O(n log n)
d) O(n^2)
O(n)
https://drive.google.com/file/d/1M4M_kU63ccOnh72bPXg8183L5X-IXUun/view?usp=sharing
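A worked derivation of the bound, using the geometric-series formula \sum_{j=1}^{k} 2^j = 2^(k+1) - 2 and the hint that 2^(log2(x)) = x:

```latex
\sum_{j=1}^{\lfloor \log_2(n-1) \rfloor} 2^j
  = 2^{\lfloor \log_2(n-1) \rfloor + 1} - 2
  \le 2 \cdot 2^{\log_2(n-1)} - 2
  = 2(n-1) - 2
  < 2n
  = O(n).
```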
Will the same argument work if we replace the potential function with \phi(h) = 3*size - capacity?
a) Yes
b) No
Yes. With \phi(h) = 3*size - capacity, the potential is still 0 for the empty array and never negative (after any PushBack the array is at least half full), and each PushBack still has amortized cost at most 4 = O(1).
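A worked check of the expensive case with this potential (assuming, as in lecture, that a PushBack on a full array with size = capacity = m doubles the capacity and copies all m elements):

```latex
\begin{aligned}
\text{amortized cost}
  &= \text{actual cost} + \phi_{\text{after}} - \phi_{\text{before}} \\
  &= (m + 1) + \bigl(3(m + 1) - 2m\bigr) - (3m - m) \\
  &= (m + 1) + (m + 3) - 2m \\
  &= 4 = O(1).
\end{aligned}
```

A cheap PushBack costs 1 and raises the potential by 3, so its amortized cost is also 4.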
Let’s imagine we add support to our dynamic array for a new operation PopBack (which removes the last element), and that PopBack never reallocates the associated dynamically-allocated array. Calling PopBack on an empty dynamic array is an error.
If we have a sequence of 48 operations on an empty dynamic array: 24 PushBack and 24 PopBack (not necessarily in that order), we clearly end with a size of 0.
What are the minimum and maximum possible final capacities given such a sequence of 48 operations on an empty dynamic array? Assume that PushBack doubles the capacity, if necessary, as in lecture.
a) minimum: 1, maximum: 32
b) minimum: 32, maximum: 32
c) minimum: 1, maximum: 1
d) minimum: 1, maximum: 24
e) minimum: 24, maximum: 32
The minimum is achieved by alternating one PushBack with one PopBack: the size never exceeds 1, so the capacity also never exceeds 1. The maximum is achieved by 24 PushBacks followed by 24 PopBacks: the maximum size is 24, so the capacity grows to 32, the smallest power of two that is at least 24, and it stays there because PopBack never reallocates. A simulation of both extremes is sketched below.
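A minimal sketch of that simulation, assuming an illustrative DynamicArray (not code from the course) whose PushBack doubles the capacity when full and whose PopBack never reallocates:

```python
class DynamicArray:
    """Doubling PushBack; PopBack never reallocates."""
    def __init__(self):
        self.size = 0
        self.capacity = 0

    def push_back(self):
        if self.size == self.capacity:
            self.capacity = max(1, 2 * self.capacity)    # grow only
        self.size += 1

    def pop_back(self):
        assert self.size > 0, "PopBack on an empty dynamic array is an error"
        self.size -= 1                                    # capacity unchanged

# Minimum: alternate PushBack and PopBack; the size never exceeds 1.
a = DynamicArray()
for _ in range(24):
    a.push_back()
    a.pop_back()
print(a.capacity)   # 1

# Maximum: 24 PushBacks then 24 PopBacks; the peak size is 24.
b = DynamicArray()
for _ in range(24):
    b.push_back()
for _ in range(24):
    b.pop_back()
print(b.capacity)   # 32
```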
Let’s imagine we add support to our dynamic array for a new operation PopBack (which removes the last element). PopBack reallocates the dynamically-allocated array to a new array of half the capacity if, after the removal, the size is <= capacity / 2. So, for example, if, before a PopBack, the size were 5 and the capacity were 8, then after the PopBack, the size would be 4 and the capacity would be 4.
Give an example of n operations starting from an empty array that require O(n^2) copies.
a) PushBack n/2 elements, and then PopBack n/2 elements.
b) Let n be a power of 2. Add n/2 elements, then alternate n/4 times between doing a PushBack of an element and a PopBack.
c) PushBack 2 elements, and then alternate n/2 - 1 PushBack and PopBack operations.
Once we have added n/2 elements, the dynamically-allocated array is full (size = n/2, capacity = n/2). When we add one more element, we resize and copy n/2 elements (now size = n/2 + 1, capacity = n). When we then remove an element with PopBack, the size drops to n/2 <= capacity / 2, so we reallocate the dynamically-allocated array again and copy n/2 elements, leaving it full once more. So each of the final n/2 operations costs n/2 copies, for a total of (n^2)/4 copies, which is O(n^2). A sketch that counts these copies follows.
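A counting sketch of answer (b) under this shrink-at-half rule (illustrative code, not from the course; every reallocation copies all surviving elements):

```python
def count_copies(n):
    """Grow (double) when full; shrink (halve) when size <= capacity / 2."""
    size, capacity, copies = 0, 0, 0

    def push_back():
        nonlocal size, capacity, copies
        if size == capacity:
            capacity = max(1, 2 * capacity)
            copies += size                # copy everything into the larger array
        size += 1

    def pop_back():
        nonlocal size, capacity, copies
        size -= 1
        if capacity > 1 and size <= capacity // 2:
            capacity //= 2
            copies += size                # copy everything into the smaller array

    for _ in range(n // 2):               # fill: size = capacity = n/2
        push_back()
    for _ in range(n // 4):               # the final n/2 operations
        push_back()                       # grows to capacity n, copies n/2
        pop_back()                        # shrinks back to n/2, copies n/2
    return copies

for n in (2**8, 2**10, 2**12):
    print(n, count_copies(n))             # grows like n^2 / 4
```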
Let’s imagine we add support to our dynamic array for a new operation PopBack (which removes the last element). Calling PopBack on an empty dynamic array is an error.
PopBack reallocates the dynamically-allocated array to a new array of half the capacity if, after the removal, the size is <= capacity / 4. So, for example, if, before a PopBack, the size were 5 and the capacity were 8, then after the PopBack, the size would be 4 and the capacity would still be 8. Only after two more PopBacks, when the size dropped to 2, would the capacity go down to 4.
We want to consider the worst-case sequence of any n PushBack and PopBack operations, starting with an empty dynamic array.
What potential function would work best to show an amortized O(1) cost per operation?
a) phi(h) = 2*size - capacity
b) phi(h) = 2
c) phi(h) = max(2*size - capacity, capacity/2 - size)
d) phi(h) = max(0, 2*size - capacity)
https://drive.google.com/file/d/1g47hYkcvrxzLPi_UKCGNNfEFpA-OtQ9w/view?usp=sharing
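For contrast with the previous question, a sketch (same illustrative style, not course code) that applies this shrink-at-quarter rule to the same adversarial pattern; the total number of copies now stays linear in n, which is what an amortized O(1) cost per operation requires:

```python
def count_copies(n):
    """Grow (double) when full; shrink (halve) only when size <= capacity / 4."""
    size, capacity, copies = 0, 0, 0

    def push_back():
        nonlocal size, capacity, copies
        if size == capacity:
            capacity = max(1, 2 * capacity)
            copies += size
        size += 1

    def pop_back():
        nonlocal size, capacity, copies
        size -= 1
        if capacity > 1 and size <= capacity // 4:
            capacity //= 2
            copies += size

    for _ in range(n // 2):               # same pattern as in the previous question
        push_back()
    for _ in range(n // 4):
        push_back()
        pop_back()
    return copies

for n in (2**8, 2**10, 2**12):
    print(n, count_copies(n))             # roughly n, not n^2 / 4
```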
Imagine a stack with a new operation: PopMany which takes a parameter, i, that specifies how many elements to pop from the stack. The cost of this operation is i, the number of elements that need to be popped.
Without this new operation, the amortized cost of any operation in a sequence of stack operations (Push, Pop, Top) is O(1) since the true cost of each operation is O(1).
What is the amortized cost of any operation in a sequence of n stack operations (starting with an empty stack) that includes PopMany (choose the best answers)?
a) O(1) because the sum of the costs of all PopMany operations in a total of n operations is O(n)
b) O(n) because we could push n-1 items and then do one PopMany(n-1) that would take O(n) time.
c) O(1) because we can define phi(h) = size
d) O(1) because we can place one token on each item in the stack when it is pushed. That token will pay for popping it off with a PopMany
https://www.youtube.com/watch?v=U5XKyIVy2Vc
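A sketch of why the total work stays linear (an illustrative CountingStack, not course code): each element is pushed at most once and popped at most once, whether by Pop or by PopMany, so the total cost of any n operations is at most 2n.

```python
import random

class CountingStack:
    """Push, Pop, Top cost 1; PopMany(i) costs i, the number of elements removed."""
    def __init__(self):
        self.items = []
        self.total_cost = 0

    def push(self, x):
        self.items.append(x)
        self.total_cost += 1

    def pop(self):
        self.items.pop()
        self.total_cost += 1

    def pop_many(self, i):
        del self.items[-i:]
        self.total_cost += i

n = 10_000
s = CountingStack()
for op in range(n):                       # a random sequence of n operations
    if s.items and random.random() < 0.4:
        s.pop_many(random.randint(1, len(s.items)))
    else:
        s.push(op)
print(s.total_cost)                       # never exceeds 2 * n
```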