Aug 9, 2015

What’s the difference between malloc(),calloc() and realloc() ?


        In the C language there are 3 functions that are used for dynamic memory allocation: malloc(), calloc() and realloc(). If we allocate data dynamically, it stays available until the program terminates (like global variables) or until we free it.

        We get to choose when to allocate and when to de-allocate. De-allocation is done using the free() function.
1) malloc() :
        
The name malloc stands for "memory allocation". The function malloc() reserves a block of memory of the specified size and returns a pointer of type void*, which can be cast into a pointer of any type.

                void *p = malloc(sizeof(int));
  • As it is a void pointer, we can’t directly dereference it.

                e.g. *p = 2;      // error: dereferencing a void pointer
        
  • Before using it, we have to typecast it to the required data type (int, float, char, etc.).


        int *q = (int*) p;
        *q = 10;           // no error

  • After allocating the memory, if we don’t initialize it, the block contains garbage values.
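For instance, here is a minimal, self-contained sketch of the full malloc() pattern (the variable names and the printed value are only illustrative, not part of the original post):

#include <stdio.h>
#include <stdlib.h>

int main()
{
    /* reserve space for one int; malloc() returns void*, so we cast it */
    int *p = (int*) malloc(sizeof(int));
    if (p == NULL)               /* malloc() returns NULL on failure    */
        return 1;

    *p = 10;                     /* the block held garbage until now    */
    printf("%d\n", *p);

    free(p);                     /* give the block back to the heap     */
    return 0;
}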



         
2) calloc() :
        calloc() also allocates a block of memory, but it additionally initializes the allocated memory to 0 (unlike malloc(), where garbage values will be present). It takes 2 arguments as inputs:

        1) Number of blocks.
        2) Size of each block.

                int *p = (int*) calloc(n, sizeof(int));

Example:
int *p = (int*) calloc(25, sizeof(int));

This statement allocates contiguous memory for an array of 25 elements, each of size sizeof(int) (typically 4 bytes).
  • calloc() can be emulated using malloc() and memset() (from <string.h>) as follows:

                int *p = (int*) malloc(10 * sizeof(int));
                memset(p, 0, 10 * sizeof(int));

All 10 elements of the block will have the value 0.
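As a small illustration of the zero-initialization (the array length 5 and the variable names are my own choices, not from the original post), something like this could be used:

#include <stdio.h>
#include <stdlib.h>

int main()
{
    /* calloc() zero-fills the block, so every element starts at 0 */
    int *q = (int*) calloc(5, sizeof(int));
    if (q == NULL)
        return 1;

    for (int i = 0; i < 5; i++)
        printf("%d ", q[i]);     /* prints: 0 0 0 0 0 */

    free(q);
    return 0;
}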
3) realloc() :
        realloc() is used to reallocate the memory for a given pointer, either to increase or to decrease the size of the memory block.

Syntax :

                int *b = (int*) realloc(old_block, new_size);
  • If we ask for more memory, it first checks whether the old block can be extended in place. If that is not possible, a new block of memory is allocated and the old values are copied into it.
  • After copying the old block of memory into the new memory location, the old one is automatically freed.

  • If we reduce the size of the memory block, the remaining memory is de-allocated automatically.
  • The effect of free() can also be achieved using realloc() with size 0:

int *b = (int*) realloc(b, 0);
  • malloc() can also be achieved using realloc() by passing a NULL pointer:


int *b = (int*) realloc(NULL, n * sizeof(int));
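Putting the above points together, here is a minimal sketch that grows an array with realloc() (the sizes 3 and 6 and the temporary pointer are illustrative assumptions, not from the original post):

#include <stdio.h>
#include <stdlib.h>

int main()
{
    /* start with room for 3 ints */
    int *a = (int*) malloc(3 * sizeof(int));
    if (a == NULL)
        return 1;
    for (int i = 0; i < 3; i++)
        a[i] = i;

    /* grow the block to 6 ints; the old values are preserved */
    int *tmp = (int*) realloc(a, 6 * sizeof(int));
    if (tmp == NULL) {           /* on failure the old block is untouched */
        free(a);
        return 1;
    }
    a = tmp;
    for (int i = 3; i < 6; i++)
        a[i] = i;                /* the new part is uninitialized until set */

    for (int i = 0; i < 6; i++)
        printf("%d ", a[i]);     /* prints: 0 1 2 3 4 5 */

    free(a);
    return 0;
}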


Aug 8, 2015

Dynamic memory allocation (Stack vs Heap)


          Application’s memory – The memory that is assigned to a program or application while it is running is divided into 4 segments:

        1) Heap (dynamically allocated)
        2) Stack (functions and local variables)
        3) Global / Static
        4) Code (text) – instructions

·       The amount of memory that is reserved for a function call (for example, main) is called its Stack Frame.
·       Heap size is not fixed.
For example :
#include <stdio.h>

int z;                          /* global variable */

int mul(int a, int b)
{
    int c;                      /* local to mul() */
    c = a * b;
    return c;
}

int init()
{
    int a, b, c;                /* local to init() */
    scanf("%d%d", &a, &b);
    c = mul(a, b);
    return c;
}

int main()
{
    z = init();
    printf("%d", z);
    return 0;
}

While mul() is executing, the stack will have memory assigned as shown below (top of the stack first):

        mul()  : a, b, c
        init() : a, b, c   (the frame for scanf() was pushed and popped earlier)
        main()

The program starts with the main() function, whose frame is pushed onto the stack. Inside main, init() is called next, so main stops executing, a frame for init() is pushed and its variables are allocated in that frame. The same happens for the other calls.
·       The size of the stack frame for each function is calculated at compile time. At any time, only the function at the top of the stack is executing; the others are paused.

·       Once a function returns, its memory is popped off the stack and the function below it resumes execution.

·       Global variables, declared outside all functions, are allocated in the separate global memory section. Because of this we can access global variables anywhere in the program, until it terminates.

·       Local variables, declared inside each function, are allocated in that function’s stack memory and are limited to that function only.

·       When the main function returns, the program terminates and the global variables are also cleared.

·       If the function calls grow beyond the stack memory, an error occurs, i.e. stack overflow.

·       The size of the heap is not fixed. It can grow as long as the system itself does not run out of memory.

·       We can keep data in the heap for as long as we need, i.e. we have control over allocation and de-allocation.

·       The heap is also called dynamic memory, and its allocation is called dynamic memory allocation. It is different from the heap data structure.

In C, there are 4 functions that are used in dynamic memory allocation.

They are
       1) malloc()
       2) calloc()
       3) realloc()
       4) free()

In C++, there are 2 more operators in addition to the above:

new and delete

For example:
#include <stdlib.h>

int main()
{
    int *p, *q;

    p = (int*) malloc(sizeof(int));     /* one int on the heap       */
    *p = 10;

    q = (int*) calloc(3, sizeof(int));  /* three ints, zero-filled   */
    *q = 10;
    *(q + 1) = 20;
    *(q + 2) = 30;

    free(p);                            /* release both heap blocks  */
    free(q);
    return 0;
}



·       free(p) – it will de-allocate the memory block that was assigned to p.
·       p holds only the starting address of that memory block.

·       If we don’t call free(p) in C (or delete in C++), the data stays available until the program terminates (acting like a global variable).
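A short sketch of this free() pattern (setting the pointer to NULL afterwards is a common defensive habit, my own addition rather than something the post requires):

#include <stdlib.h>

int main()
{
    int *p = (int*) malloc(sizeof(int));
    if (p == NULL)
        return 1;

    *p = 10;

    free(p);         /* the block is returned to the heap           */
    p = NULL;        /* avoid accidentally using the freed address  */
    return 0;
}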

Aug 7, 2015

3 ways to solve Recurrence relations

Recurrence relation :

A recurrence relation is an equation that defines a sequence based on a rule that gives the next term as a function of the previous term(s). It can also be seen as defining a function by means of an expression that includes one or more smaller instances of itself.

Examples of recursive definitions are the factorial function and the Fibonacci series.
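As a small illustration (my own sketch, not from the original post), here is a recursive factorial in C whose running time follows the recurrence T(n) = T(n-1) + c:

#include <stdio.h>

/* factorial(n) calls factorial(n-1) once and does constant extra work,
   so its running time satisfies T(n) = T(n-1) + c                      */
unsigned long long factorial(unsigned int n)
{
    if (n <= 1)                  /* base case: T(1) = c */
        return 1;
    return n * factorial(n - 1);
}

int main()
{
    printf("%llu\n", factorial(5));   /* prints 120 */
    return 0;
}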

How do we solve these recurrence relations?

There are 3 methods by which we can solve these.

They are

1)   Repeated Substitution, i.e. the Substitution Method
2)   Recursion-Tree Method
3)   Master Method

1.   Repeated Substitution Method.

            In this method, the recursive term is substituted again and again until the recurrence reduces to T(1) (the base case).
For example :

T(n) = T(n-1) + 1                                    → (1)

Now T(n-1) = T(n-2) + 1                              → (2)

          Substituting eq (2) in eq (1), we get

T(n) = ( T(n-2) + 1 ) + 1    ⇒    T(n) = T(n-2) + 2

Likewise      T(n) = T(n-3) + 3 ;

So if we go on substituting for 1, 2, 3, ..., n-1 steps,
we get         T(n) = T(n-(n-1)) + n-1;

                    T(n) = T(1) + n-1;
T(1) takes constant time. Therefore T(n) is of order n, i.e. it has a time complexity of O(n).

Example 2:
If we consider

          T(n) = θ(1)                     if n = 1
          T(n) = 2T(n/2) + θ(n)           if n > 1

Rewriting this

          T(n) = c                        if n = 1
          T(n) = 2T(n/2) + n              if n > 1


T(n/2) = 2T(n/4) + n/2                               → (1)

T(n) = 4T(n/4) + 2n
     = 2^2 T(n/2^2) + 2n                             → (2)

And so on for 1, 2, 3, ..., i substitutions:
                   T(n) = 2^i T(n/2^i) + i·n
T(n/2^i) can be reduced to T(1) only for i = log2(n).

Therefore, substituting i = logn in the above equation,
we get         T(n) = 2^(logn) T(1) + n·logn
                   T(n) = c·n + n·logn              (since 2^(logn) = n and T(1) = c)
                   T(n) = θ(n logn)

2) Tree Method :
          It shows the successive expansions of recurrences using trees.

             T(n) = c                        if n = 1
             T(n) = 2T(n/2) + cn             if n > 1

Here c is the running time for the base case as well as the time per array element for the divide and combine steps.

                    T(n) = 2[ 2T(n/4) + cn/2 ] + cn
                         = 4T(n/4) + 2cn


Here at every level every element is compared with the next element and merged in ascending order (in the process of merging).

Hence the time taken at each level is c·n. Likewise we have logn + 1 levels. So the total time = (logn + 1) · cn = cnlogn + cn.

          T(n) = cnlogn + cn
               = θ( nlogn + n )
               = θ( nlogn )

3)   Master Theorem (for divide and conquer):

If the recurrence relation is of the form T(n) = aT(n/b) + θ( n^k (logn)^p )
               where a >= 1, b > 1, k >= 0 and p is a real number, then:
1)   If a > b^k, then T(n) = θ( n^(log_b a) )
2)   If a = b^k, then
      a)    for p > -1,     T(n) = θ( n^(log_b a) · (logn)^(p+1) )
      b)    for p = -1,     T(n) = θ( n^(log_b a) · log(logn) )
      c)    for p < -1,     T(n) = θ( n^(log_b a) )
3)   If a < b^k, then
      a)    for p >= 0,     T(n) = θ( n^k · (logn)^p )
      b)    for p < 0,      T(n) = θ( n^k )
For example: Merge sort

          T(n) = c                        if n = 1
          T(n) = 2T(n/2) + cn             if n > 1

Where k = 1, p = 0, a = 2, b = 2.

All these values satisfy the conditions. It is of the form a = b^k with p > -1, so case 2(a) applies:

                             T(n) = θ( n^(log_b a) · (logn)^(p+1) )

Substituting the values, we get
                             T(n) = θ( n^1 · (logn)^(0+1) )
                             T(n) = θ( n logn )
2) T(n) = 3T(n/2) + n^2

     k = 2, p = 0, a = 3, b = 2
All these values satisfy the conditions. It is of the form a < b^k with p >= 0, so case 3(a) applies:
                   T(n) = θ( n^2 · (logn)^0 )
                   T(n) = θ( n^2 )
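The post works out cases 2 and 3 only, so here is a worked example of case 1 that I have added myself (the recurrence T(n) = 4T(n/2) + n is an illustrative choice, not from the original):

\[
T(n) = 4\,T(n/2) + n, \qquad a = 4,\; b = 2,\; k = 1,\; p = 0.
\]
Since \(a > b^k\) (because \(4 > 2^1\)), case 1 applies:
\[
T(n) = \Theta\!\left(n^{\log_b a}\right) = \Theta\!\left(n^{\log_2 4}\right) = \Theta\!\left(n^2\right).
\]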