Aug 7, 2015

3 ways to solve Recurrence relations

Recurrence relation:

A recurrence relation is an equation that defines a sequence by a rule that gives each term as a function of the previous term(s). Equivalently, it defines a function by means of an expression that includes one or more smaller instances of the same function.

Examples of such recursive definitions are the factorial function and the Fibonacci series.

How do we solve these recurrence relations?

There are 3 methods by which we can solve these.

They are

1)   Repeated Substitution, i.e. the Substitution Method.
2)   Recursion-Tree Method
3)   Master Method

1.   Repeated Substitution Method.

            In this method, the recursive term is substituted repeatedly until the recurrence reduces to its base case.
For example :

T(n) = T(n-1) + 1                          →  (1)

Now T(n-1) = T(n-2) + 1                    →  (2)

          Substituting eq (2) in eq (1), we get

T(n) = ( T(n-2) + 1 ) + 1    ⇒    T(n) = T(n-2) + 2

Likewise      T(n) = T(n-3) + 3

So if we continue substituting for k = 1, 2, 3, ..., n-1,
we get         T(n) = T(n-(n-1)) + (n-1)

                    T(n) = T(1) + (n-1)
T(1) takes constant time. Therefore T(n) is of order n, i.e. it has time complexity O(n).
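This closed form is easy to confirm numerically. Below is a minimal Python sketch (the base-case cost T(1) = 1 is an assumed constant) that evaluates the recurrence directly and checks it against T(1) + (n-1):

```python
def T(n):
    """Evaluate the recurrence T(n) = T(n-1) + 1 directly."""
    if n == 1:
        return 1          # base case: constant cost c = 1 (assumption)
    return T(n - 1) + 1   # one unit of work plus the smaller instance

# Closed form derived above: T(n) = T(1) + (n - 1) = n  (with T(1) = 1)
for n in range(1, 50):
    assert T(n) == 1 + (n - 1)
```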

Example 2:
Consider

T(n) = θ(1)              if n = 1
       2T(n/2) + θ(n)    if n > 1

Rewriting this with an explicit constant c,

T(n) = c             if n = 1
       2T(n/2) + n   if n > 1


T(n/2) = 2T(n/4) + n/2                    →  (1)

Substituting (1) into T(n) = 2T(n/2) + n, we get

T(n) = 4T(n/4) + 2n

     = 2^2 T(n/2^2) + 2n                  →  (2)

Continuing in this way for i steps,
                   T(n) = 2^i T(n/2^i) + i*n
The argument n/2^i becomes 1 only for i = log2(n).

Therefore, substituting i = log2(n) in the above equation,
we get         T(n) = 2^(log2 n) T(1) + n*log2(n)
                    T(n) = n*c + n*log2(n)
                    T(n) = θ(n log n)
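We can sanity-check this result by evaluating the recurrence directly. The sketch below assumes T(1) = 1 and n a power of two, and compares against the closed form n + n*log2(n):

```python
def T(n):
    """Evaluate T(n) = 2*T(n/2) + n directly (n must be a power of two)."""
    if n == 1:
        return 1  # base-case cost c = 1 (assumption)
    return 2 * T(n // 2) + n

# Closed form derived by substitution: T(n) = n + n*log2(n)
for i in range(11):
    n = 2 ** i
    assert T(n) == n + n * i  # here i = log2(n)
```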

2) Tree Method :
          It shows the successive expansions of the recurrence using a tree.

             T(n) = c              if n = 1
                    2T(n/2) + cn   if n > 1

Here c covers both the running time of the base case and the per-element time to divide and combine.

                    T(n) = 2[ 2T(n/4) + cn/2 ] + cn
                         = 4T(n/4) + 2cn


At every level of the tree, each element is compared with its neighbour and merged in ascending order (during the merging process).

Hence the time taken at each level is c*n. Likewise, there are log2(n) + 1 levels, so the total time = ( log2(n) + 1 ) * cn = cn*log(n) + cn.

          T(n) = cn*log(n) + cn
               = θ( n log n + n )
               = θ( n log n )
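The level-by-level argument above can be checked in a few lines of Python. This sketch (assuming c = 1 and n a power of two) compares the recursion-tree total, log2(n) + 1 levels of cost c*n each, with the recurrence evaluated directly:

```python
import math

def T(n, c=1):
    """Evaluate T(n) = 2*T(n/2) + c*n with T(1) = c (n a power of two)."""
    if n == 1:
        return c
    return 2 * T(n // 2, c) + c * n

def tree_total(n, c=1):
    """Recursion-tree view: log2(n) + 1 levels, each costing c*n."""
    return (int(math.log2(n)) + 1) * c * n

# Both views agree: c*n*log2(n) + c*n
for i in range(11):
    n = 2 ** i
    assert T(n) == tree_total(n)
```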

3)   Master Theorem (for divide and conquer):

If the recurrence relation is of the form T(n) = aT(n/b) + θ( n^k * log^p n )
               where a >= 1, b > 1, k >= 0 and p is a real number:
1)   If a > b^k, then T(n) = θ( n^(log_b a) )
2)   If a = b^k, then
     a)   for p > -1,   T(n) = θ( n^(log_b a) * log^(p+1) n )
     b)   for p = -1,   T(n) = θ( n^(log_b a) * log(log n) )
     c)   for p < -1,   T(n) = θ( n^(log_b a) )
3)   If a < b^k,
     a)   for p >= 0,   T(n) = θ( n^k * log^p n )
     b)   for p < 0,    T(n) = θ( n^k )
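These cases can be captured in a small helper. The function below, `master_bound`, is a hypothetical illustration (not a standard library routine) that returns the θ-bound as a string for given a, b, k, p:

```python
import math

def master_bound(a, b, k, p):
    """Return the theta-bound for T(n) = a*T(n/b) + theta(n^k * log^p n)
    as a string, following the three cases above.  (Hypothetical helper.)"""
    if a > b ** k:                      # case 1
        return f"theta(n^{math.log(a, b):.6g})"
    if a == b ** k:                     # case 2
        if p > -1:
            return f"theta(n^{k} * log^{p + 1} n)"
        if p == -1:
            return f"theta(n^{k} * log(log n))"
        return f"theta(n^{k})"
    if p >= 0:                          # case 3
        return f"theta(n^{k} * log^{p} n)"
    return f"theta(n^{k})"

# Merge sort: a=2, b=2, k=1, p=0
print(master_bound(2, 2, 1, 0))   # theta(n^1 * log^1 n)
```

Note that in case 2 the exponent log_b(a) equals k, which is why the helper can print n^k directly there.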
 For example: merge sort

          T(n) = c              if n = 1
                 2T(n/2) + cn   if n > 1

Here k = 1, p = 0, a = 2, b = 2.

These values fall under case 2: a = b^k with p > -1.

                             T(n) = θ( n^(log_b a) * log^(p+1) n )

Substituting the values, we get
                             T(n) = θ( n^1 * log^(0+1) n )
                             T(n) = θ( n log n )
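For concreteness, here is a plain Python merge sort; its two half-size recursive calls (a = 2, b = 2) plus a linear-time merge (k = 1, p = 0) are exactly what produce the recurrence above:

```python
def merge_sort(a):
    """Sort a list: two half-size subproblems plus a linear merge,
    matching T(n) = 2T(n/2) + cn."""
    if len(a) <= 1:
        return a                       # base case: constant time
    mid = len(a) // 2
    left = merge_sort(a[:mid])         # T(n/2)
    right = merge_sort(a[mid:])        # T(n/2)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # linear merge: cn work
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```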
2) T(n) = 3T(n/2) + n^2

     k = 2, p = 0, a = 3, b = 2
These values fall under case 3: a < b^k (3 < 4) with p >= 0.
                   T(n) = θ( n^2 * log^0 n )
                   T(n) = θ( n^2 )
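A quick numerical check of this bound: if T(n) = θ(n^2), the ratio T(n)/n^2 must stay bounded as n grows. The sketch below (assuming T(1) = 1 and n a power of two) verifies this; for this recurrence the ratio approaches 4 from below, since the per-level costs form a geometric series with ratio 3/4:

```python
def T(n):
    """Evaluate T(n) = 3*T(n/2) + n^2 with T(1) = 1 (n a power of two)."""
    if n == 1:
        return 1
    return 3 * T(n // 2) + n * n

# T(n)/n^2 = 4 - 3*(3/4)^log2(n), which is bounded above by 4,
# consistent with T(n) = theta(n^2).
for i in (5, 10, 20):
    n = 2 ** i
    assert 3 < T(n) / (n * n) < 4
```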