Sunday, February 28, 2021

Design and Analysis of Algorithms

CONTENT

 

·         Algorithm

o   Criteria of an algorithm

o   Difference between algorithm and program

·         Analyzing an algorithm:

o   How good is an algorithm

o   Importance of analyzing algorithm

 

·         Complexity of an algorithm

o   Worst Case

o   Best Case

o   Average Case

·          Asymptotic Notation

o   Ө notation (Asymptotically tight bound)

o   O notation (Asymptotic upper bound)

o   Ω notation (Asymptotic lower bound)

·         Growth rate of a function

·         Insertion Sort

o    Insertion Sort Algorithm

o    Analysis of time complexity of Insertion Sort

§  Best case of Insertion Sort

§  Worst case of Insertion Sort


1.1  Algorithm: An algorithm is a self-contained, step-by-step set of operations to be performed to solve a specific problem or a class of problems. An algorithm has a well-defined sequence of steps, it gives us an output, and it will eventually terminate.

 

[Figure: an algorithm is devised for a problem; a computer executes the algorithm, transforming the input into the output.]

 

1.1.1 Criteria of an algorithm

Each and every algorithm must satisfy the following criteria:

Input: Zero or more quantities supplied externally; the raw data the algorithm works on.
Output: At least one quantity is produced; the final outcome after processing.
Definiteness: Every instruction in the algorithm has to be clear and unambiguous.
Finiteness: If we trace out the instructions of an algorithm, then for each and every case the algorithm will finish after a finite number of steps.
Effectiveness: Each and every instruction must be basic enough that, in principle, it can be carried out by a person using only pencil and paper. Each operation must be feasible.

 

1.1.2 Difference between Algorithm and Program

S.No | Algorithm | Program
-----|-----------|--------
1 | An algorithm must be finite. | A program need not be finite.
2 | An algorithm is written in natural language or an algorithmic (pseudocode) notation. | A program is written in a specific programming language.

 

1.2 Analyzing an Algorithm: Analyzing an algorithm has come to mean predicting the resources that the algorithm requires. Resources such as memory, communication bandwidth, or computer hardware are sometimes of primary concern, but most often it is computational time that we want to measure.

1.2.1 How good is the algorithm?

The three basic design goals that one should strive for in a program are:

1.      Correctness

2.      Time efficiency

3.      Space efficiency

 

1.2.2 Importance of Analyzing an Algorithm

 1- To identify the limitations of various algorithms for solving a problem.

 2- To understand the relationship between problem size and running time.

 3- To learn how to analyze an algorithm's running time without coding it.

 4- To learn techniques for writing more efficient code.

 5- To recognize bottlenecks in code, as well as which parts of the code are easiest to optimize.

 

1.3 Complexity of an Algorithm:- The time complexity of an algorithm is the amount of time it needs to run to completion. This is the combination of compile time and run time. The compile time does not depend on the instance characteristics, and a program compiled once can be executed many times without recompiling it. To calculate the run time of a program, one has to account for the time taken by operations such as addition, multiplication, subtraction, division, and loads/stores.

A-    Worst Case:- It is defined by the maximum number of steps taken on any instance of size n.

B-    Best Case:- It is defined by the minimum number of steps taken on any instance of size n.

C-    Average Case:- It is defined by the average number of steps taken on any instance of size n.

 

1.4 Asymptotic Notation:- Asymptotic notation is based on the idea that, as the problem size grows, the complexity can be described as roughly proportional to a known function.

If large problems are to be treated, the asymptotic behavior of the growth of the running-time function is studied.

Thus, when comparing the running times of algorithms, constant factors can be ignored, and lower-order terms are unimportant when the higher-order terms differ.

For example:- An algorithm that takes time 100n² will be faster than an algorithm that takes time n³ for any value of n larger than 100.
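The crossover point can be checked directly. A minimal sketch in Python (the function names t1 and t2 are illustrative, not part of the notes):

```python
# Two hypothetical running-time functions from the example above.
def t1(n):
    return 100 * n * n   # the 100*n^2 algorithm

def t2(n):
    return n ** 3        # the n^3 algorithm

# They meet exactly at n = 100; beyond that, the n^3 algorithm is slower.
print(t1(100) == t2(100))  # True
print(t1(101) < t2(101))   # True
```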

There are mainly three notations:-

1.4.1        Ө notation (Asymptotically tight bound)- Let f(n) and g(n) be real-valued functions of a single non-negative integer argument. Then f(n) = Ө(g(n)) if there exist positive real constants C1 and C2 and a positive integer n0 such that

                                     0 ≤ C1·g(n) ≤ f(n) ≤ C2·g(n)       for all n ≥ n0

 

        https://cdn.kastatic.org/ka-perseus-images/2bdc25c7eda8486d05b8031c5a63535684ecb5a1.png

 

1.4.2        O notation (Asymptotic upper bound)- Big-O notation is used for an asymptotic upper bound. For a given function g(n), we denote by O(g(n)) the set of functions

            O(g(n)) = { f(n) : there exist positive constants C and n0 such that

                     0 ≤ f(n) ≤ C·g(n)       for all n ≥ n0 }

and f(n) = O(g(n)) means that f(n) is a member of the set O(g(n)).

Big-O notation describes an upper bound; we use it to bound the worst-case running time of an algorithm.

https://helloworldpoc.files.wordpress.com/2013/06/bigograph.jpg

 

1.4.3 Ω notation (Asymptotic lower bound)- Omega (Ω) notation provides an asymptotic lower bound. For a given function g(n), we denote by Ω(g(n)) the set of functions

            Ω(g(n)) = { f(n) : there exist positive constants C and n0 such that

                     0 ≤ C·g(n) ≤ f(n)       for all n ≥ n0 }

Ω notation describes a lower bound, and it is used to bound the best-case running time of an algorithm.

Also, f(n) = Ω(g(n)) means that f(n) grows no slower than g(n).

 

https://helloworldpoc.files.wordpress.com/2013/06/ch3-2.gif

1.5 Growth rate of a function: An algorithm's running time is usually expressed as a function of its input size. To study the growth of a function easily, one can reduce the function to its dominant part.

Let f(n) = an² + bn + c

Here the term n² dominates the function: when n gets sufficiently large, the other terms barely factor into the result.

Q- Arrange the following in increasing order of growth rate:

n, 2ⁿ, n log n, n!, n³, n⁵, n², 1

Ans:- 1 < n < n log n < n² < n³ < n⁵ < 2ⁿ < n!
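The ordering can be illustrated numerically. A small sketch in Python, evaluated at n = 30 (an assumed value, chosen because it is large enough for 2ⁿ to have overtaken n⁵):

```python
import math

# Evaluate each function at n = 30; the values should come out in the
# same increasing order as the answer above.
n = 30
funcs = [
    ("1",       1),
    ("n",       n),
    ("n log n", n * math.log2(n)),
    ("n^2",     n ** 2),
    ("n^3",     n ** 3),
    ("n^5",     n ** 5),
    ("2^n",     2 ** n),
    ("n!",      math.factorial(n)),
]

# Each value is strictly smaller than the next, matching the answer.
assert all(a < b for (_, a), (_, b) in zip(funcs, funcs[1:]))
for name, value in funcs:
    print(f"{name:<8}{value:,.0f}")
```

Note that the ordering is asymptotic: at small n (say n = 20), 2ⁿ is still smaller than n⁵, which is why a sufficiently large n is needed to see it.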

 

Q- For the function f(x) = 27x² + 16x, find the Ɵ notation.

Ans:- f(x) = 27x² + 16x

First find a lower bound for f(x):

27x² ≤ 27x² + 16x      for all x ≥ 0

So C1 = 27 and g(x) = x².

Now find an upper bound for f(x). Since 16x ≤ x² for all x ≥ 16,

27x² + 16x ≤ 28x²      for all x ≥ 16

So C2 = 28.

Therefore f(x) = Ɵ(x²) for C1 = 27, C2 = 28, and n0 = 16.
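These bounds can be spot-checked numerically. A minimal sketch in Python, assuming the constants C1 = 27, C2 = 28 and the threshold x ≥ 16 (the point where 16x ≤ x² starts to hold):

```python
# Numeric spot-check of the Theta bounds for f(x) = 27*x^2 + 16*x:
#   27*x^2 <= f(x) <= 28*x^2  once 16*x <= x^2, i.e., for all x >= 16.
def f(x):
    return 27 * x * x + 16 * x

for x in range(16, 10_000):
    assert 27 * x * x <= f(x) <= 28 * x * x

# At x = 16 the upper bound is met with equality: 16*16 = 16^2.
print(f(16) == 28 * 16 * 16)  # True
```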

 

1.6 Insertion Sort: It is an efficient algorithm for sorting a small number of elements. It works the same way many people sort a hand of playing cards: to find the correct position for a card, we compare it with each of the cards already in the hand, from right to left. Finally, all the cards in the hand are sorted.

 

1.6.1 Insertion Sort Algorithm:

1-      for j <- 2 to length[A]

2-          do key <- A[j]

3-              // insert A[j] into the sorted sequence A[1..j-1]

4-              i <- j-1

5-              while i > 0 and A[i] > key

6-                  do A[i+1] <- A[i]

7-                      i <- i-1

8-              A[i+1] <- key

Here (a) A[j] indicates the current card being inserted into the hand, indexed by j.

          (b) The subarray consisting of elements A[1..j-1] constitutes the currently sorted hand.

          (c) The elements A[j+1..n] correspond to the pile of cards still on the table.
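The pseudocode above can be sketched as runnable Python (using 0-based indexing, so the outer loop starts at index 1 rather than 2):

```python
# A runnable version of the insertion sort pseudocode above.
def insertion_sort(A):
    for j in range(1, len(A)):
        key = A[j]                    # current card being inserted
        i = j - 1
        while i >= 0 and A[i] > key:  # shift larger elements one slot right
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key                # drop the key into its place
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

The sort is in-place: only the single variable key is used as extra storage, regardless of input size.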

 

1.6.2 Analysis of time complexity of Insertion Sort

Insertion sort can take different amounts of time to sort two input sequences of the same size, depending on how nearly sorted they already are.

·         The running time of a program is defined as a function of the size of its input.

·         The running time of an algorithm is the sum of the running times of each executed statement.

 

Best case of Insertion Sort:-

In insertion sort, the best case occurs when the array is already sorted. In this case each next element is compared with only one previous element, resulting in constant time per element. Therefore, for n elements the total time is proportional to n.

The best-case running time is

T(n) = C·n

or Ω(n).

Worst case of Insertion Sort:

If the array is in reverse sorted order, the worst case results. In this case each currently inserted element A[j] is compared with every element in the sorted subarray.

The total time is calculated as

      n
      Σ (j-1) = n(n-1)/2
     j=2

[because the sorted subarray is A[1..j-1] and j = 2, 3, ..., n]

Thus the worst-case running time of insertion sort can be described as a function of n(n-1)/2:

T(n) = C·n(n-1)/2 ≈ C·n²

Thus T(n) = O(n²).
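Both counts can be confirmed empirically. A sketch of insertion sort instrumented to count inner-loop comparisons (the function name is illustrative): on an already-sorted array of n elements it performs n-1 comparisons, and on a reverse-sorted array it performs n(n-1)/2.

```python
# Insertion sort instrumented to count "A[i] > key" comparisons.
def insertion_sort_comparisons(A):
    comparisons = 0
    for j in range(1, len(A)):
        key, i = A[j], j - 1
        while i >= 0:
            comparisons += 1          # one comparison of A[i] against key
            if A[i] <= key:
                break                 # key is already in place
            A[i + 1] = A[i]           # shift the larger element right
            i -= 1
        A[i + 1] = key
    return comparisons

n = 10
print(insertion_sort_comparisons(list(range(n))))         # best case: 9 = n-1
print(insertion_sort_comparisons(list(range(n, 0, -1))))  # worst case: 45 = n(n-1)/2
```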
