1.0 - Recurrences
1.1 - Divide and Conquer Algorithms
1.1.1 - Merge Sort
merge_sort(A, p, r)
sorts the sub-array $A[p..r]$, using the merge(A, p, q, r) subroutine.
merge(A, p, q, r)
takes the sorted sub-arrays $A[p..q]$ and $A[q+1..r]$ and merges them to produce the sorted array $A[p..r]$.
This assumes merge(A, p, q, r) is $\Theta(n)$, where $n = r - p + 1$, i.e. we're measuring the input in terms of the number of items that need to be merged.
if p < r
    q = ⌊(p + r) / 2⌋
    merge_sort(A, p, q)
    merge_sort(A, q+1, r)
    merge(A, p, q, r)
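The pseudocode above can be sketched in Python. The notes only state merge's contract (merge two sorted sub-arrays in $\Theta(n)$ time), so the two-pointer implementation below is an assumption:

```python
def merge(A, p, q, r):
    # Merge the sorted sub-arrays A[p..q] and A[q+1..r] in place.
    # (The notes give only merge's contract; this standard two-pointer
    # merge with temporary copies is an assumption.)
    left = A[p:q + 1]
    right = A[q + 1:r + 1]
    i = j = 0
    k = p
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1
        k += 1
    # Copy leftovers from whichever side is non-empty.
    A[k:r + 1] = left[i:] + right[j:]

def merge_sort(A, p, r):
    # Sort the sub-array A[p..r] (inclusive bounds, as in the notes).
    if p < r:
        q = (p + r) // 2  # ⌊(p + r) / 2⌋
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)
```

To sort a whole list, call merge_sort(A, 0, len(A) - 1).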
Let $n = r - p + 1$.
$$
\begin{aligned}
T(n) &= \Theta(1) &&\text{if } n \le 1\\
T(n) &= \Theta(1) + 2T\!\left(\tfrac n2\right) + \Theta(n) &&\text{if } n > 1\\
&= 2T\!\left(\tfrac n2\right) + \Theta(n)
\end{aligned}
$$
If we have hit the base case, we assume that the method takes a constant amount of time to execute and return.
Otherwise, we have to perform $2T(\frac n2)$ work to sort the two sub-problems, as well as the $\Theta(n)$ work required to merge the two sub-problems together.
$T(n) = 2T(\frac n2) + \Theta(n)$ is shorthand for $T(n) = 2T(\frac n2) + f(n)$ for some function $f(n) \in \Theta(n)$; that is, $f(n)$ is a yet-unknown function that is $\Theta(n)$.
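Unrolling this recurrence shows where merge sort's familiar $\Theta(n \log n)$ bound comes from (a sketch, assuming $n$ is a power of two and writing $cn$ for the $\Theta(n)$ term):

```latex
\begin{aligned}
T(n) &= 2T\!\left(\tfrac n2\right) + cn\\
     &= 4T\!\left(\tfrac n4\right) + 2c\tfrac n2 + cn
      = 4T\!\left(\tfrac n4\right) + 2cn\\
     &\;\;\vdots\\
     &= 2^k\, T\!\left(\tfrac n{2^k}\right) + kcn
\end{aligned}
```

After $k = \log_2 n$ levels the base case is reached, giving $T(n) = n\,T(1) + cn\log_2 n = \Theta(n \log n)$.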
1.2 - Recurrences
To be well-defined, a recurrence needs:
base case, and
recursive case(s) that converge on the base case
Running time is usually bounded by a constant for constant-sized inputs, and so we often omit the base case, and assume that
$$T(n) = \Theta(1) \quad \text{for } n \le c$$
where $c$ is a constant.
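As a concrete (hypothetical) instantiation with all constants set to $1$, the merge-sort recurrence is well-defined in exactly this sense — it has a base case, and its recursive case converges on it:

```python
def T(n):
    # Base case: constant "cost" for constant-sized inputs.
    if n <= 1:
        return 1
    # Recursive case: n // 2 < n for every n > 1, so repeated
    # recursion converges on the base case.
    return 2 * T(n // 2) + n
```

For example, $T(8) = 2T(4) + 8 = 2(2T(2) + 4) + 8 = 32$.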
1.1 - Divide and Conquer Algorithms
Given a divide and conquer algorithm that:
Takes a problem of size $n$,
Breaks it into $a$ parts, each of size $\frac nb$,
Takes $D(n)$ time to divide the problem, and
Takes $C(n)$ time to combine solutions to sub-problems,
we get the following recurrence:
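Filling in this template (with the usual convention of a constant-time base case for inputs below some constant $c$), the general divide-and-conquer recurrence takes the form:

```latex
T(n) =
\begin{cases}
\Theta(1) & \text{if } n \le c\\
aT\!\left(\tfrac nb\right) + D(n) + C(n) & \text{otherwise}
\end{cases}
```

Merge sort is the case $a = b = 2$, with $D(n) = \Theta(1)$ (computing $q$) and $C(n) = \Theta(n)$ (the merge), recovering $T(n) = 2T(\frac n2) + \Theta(n)$.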