A random midterm exam question (94)
Sunday, 28 Azar 1395, 08:44 PM
Let N := {1, ..., n} and let f : N × N → R be a function. Assume for simplicity that f(i, j) ≠ f(i', j') if (i, j) ≠ (i', j'). We want to compute the value M := min_{1≤i,j≤n} f(i, j). This can of course easily be done in O(n^2) time, by simply evaluating all f(i, j)'s (we assume that evaluating f(i, j) takes O(1) time) and then finding the minimum of all these values. Now suppose we have a magic subroutine Test(i, x) available that returns true if min_{1≤j≤n} f(i, j) ≤ x and false otherwise. The function Test runs in T*(n) time.
(i) Describe a randomized incremental algorithm to compute M that runs in O(n log n + n·T*(n)) expected time. Argue that your algorithm is correct and show that it runs in the required time. Hint: Define F(i) = min_{1≤j≤n} f(i, j) and observe that M = min_{1≤i≤n} F(i).
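One possible incremental approach (a sketch, not an official solution): insert the rows in random order, and use Test to decide whether a row can improve the current minimum before scanning it in full. Since all f-values are distinct, Test(i, M) returns true for an unprocessed row i exactly when F(i) < M, i.e. when F(i) is a prefix minimum of the random order; the expected number of such rows is O(log n), so full row scans cost O(n log n) in expectation, and the n Test calls cost n·T*(n). The function names below are illustrative, and `test` stands in for the magic subroutine.

```python
import random

def min_incremental(n, f, test):
    """Randomized incremental computation of M = min_{i,j} f(i, j).

    f(i, j) evaluates one entry in O(1); test(i, x) is the assumed
    magic subroutine returning True iff min_j f(i, j) <= x.
    """
    order = list(range(1, n + 1))
    random.shuffle(order)                 # random insertion order of rows
    M = float('inf')
    for i in order:
        # Scan row i in full only if it can improve the current minimum.
        if M == float('inf') or test(i, M):
            M = min(f(i, j) for j in range(1, n + 1))
    return M
```

Correctness is immediate: a row whose minimum exceeds the current M cannot contain the global minimum, so skipping it is safe.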
(ii) Describe a randomized divide-and-conquer algorithm to compute M that runs in O(n log n + n·T*(n)) expected time. Argue that your algorithm is correct and show that your algorithm runs in the required time.
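One possible divide-and-conquer approach (again a sketch under the same assumptions as above, with `test` standing in for the magic subroutine): shuffle the rows once, recursively compute the minimum over the first half, then use Test to filter the rows of the second half, fully scanning only rows that beat the current minimum. Each row is Test-ed at exactly one level of the recursion (n·T*(n) total), and a row is fully scanned only when its row minimum is a prefix minimum of the shuffled order, so the expected number of full scans is O(log n), costing O(n log n).

```python
import random

def min_divide_conquer(n, f, test):
    """Randomized divide-and-conquer computation of M = min_{i,j} f(i, j)."""
    rows = list(range(1, n + 1))
    random.shuffle(rows)                  # randomize the row order once
    return _solve(rows, n, f, test)

def _solve(rows, n, f, test):
    if len(rows) == 1:
        # Base case: scan the single remaining row.
        return min(f(rows[0], j) for j in range(1, n + 1))
    mid = len(rows) // 2
    M = _solve(rows[:mid], n, f, test)    # minimum over the first half
    for i in rows[mid:]:
        if test(i, M):                    # row i can beat the current minimum
            M = min(f(i, j) for j in range(1, n + 1))
    return M
```

Correctness again follows because a row failing its Test cannot contain the global minimum given the values already accounted for in M.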