Question 1 
What is the recurrence for the worst case of QuickSort, and what is its worst-case time complexity?
Recurrence is T(n) = T(n-2) + O(n) and time complexity is O(n^2)  
Recurrence is T(n) = T(n-1) + O(n) and time complexity is O(n^2)  
Recurrence is T(n) = 2T(n/2) + O(n) and time complexity is O(nLogn)  
Recurrence is T(n) = T(n/10) + T(9n/10) + O(n) and time complexity is O(nLogn) 
Question 1 Explanation:
The worst case of QuickSort occurs when the picked pivot is always one of the extreme (smallest or largest) elements of the array. In the worst case, QuickSort recursively calls one subproblem of size 0 and another subproblem of size (n-1). So the recurrence is
T(n) = T(n-1) + T(0) + O(n)
The above expression can be rewritten as
T(n) = T(n-1) + O(n)
void exchange(int *a, int *b)
{
    int temp;
    temp = *a;
    *a = *b;
    *b = temp;
}

int partition(int arr[], int si, int ei)
{
    int x = arr[ei];
    int i = (si - 1);
    int j;
    for (j = si; j <= ei - 1; j++)
    {
        if (arr[j] <= x)
        {
            i++;
            exchange(&arr[i], &arr[j]);
        }
    }
    exchange(&arr[i + 1], &arr[ei]);
    return (i + 1);
}

/* Implementation of Quick Sort
   arr[] --> Array to be sorted
   si    --> Starting index
   ei    --> Ending index
*/
void quickSort(int arr[], int si, int ei)
{
    int pi; /* Partitioning index */
    if (si < ei)
    {
        pi = partition(arr, si, ei);
        quickSort(arr, si, pi - 1);
        quickSort(arr, pi + 1, ei);
    }
}
Question 2 
Suppose we have a O(n) time algorithm that finds median of an unsorted array.
Now consider a QuickSort implementation where we first find the median using the above algorithm, then use the median as pivot. What will be the worst case time complexity of this modified QuickSort?
O(n^2 Logn)  
O(n^2)  
O(n Logn Logn)  
O(nLogn) 
Question 2 Explanation:
If we use median as a pivot element, then the recurrence for all cases becomes
T(n) = 2T(n/2) + O(n)
The above recurrence can be solved using Master Method. It falls in case 2 of master method.
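A sketch of how the Master Method applies here (assuming its standard statement):

```latex
T(n) = 2T(n/2) + O(n)
\quad a = 2,\; b = 2,\; f(n) = \Theta(n)
n^{\log_b a} = n^{\log_2 2} = n
% f(n) = \Theta(n^{\log_b a}), so Case 2 applies:
T(n) = \Theta(n^{\log_b a} \log n) = \Theta(n \log n)
```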
Question 3 
Which of the following is not a stable sorting algorithm in its typical implementation?
Insertion Sort  
Merge Sort  
Quick Sort  
Bubble Sort 
Question 3 Explanation:
See following for details.
http://www.geeksforgeeks.org/stability-in-sorting-algorithms/
Question 4 
Which of the following sorting algorithms in its typical implementation gives best performance when applied on an array which is sorted or almost sorted (at most one or two elements are misplaced)?
Quick Sort  
Heap Sort  
Merge Sort  
Insertion Sort 
Question 4 Explanation:
Insertion sort takes linear time when input array is sorted or almost sorted (maximum 1 or 2 elements are misplaced).
All other sorting algorithms mentioned above will take more than linear time in their typical implementation.
Question 5 
Given an unsorted array. The array has this property that every element in array is at most k distance from its position in sorted array where k is a positive integer smaller than size of array. Which sorting algorithm can be easily modified for sorting this array and what is the obtainable time complexity?
Insertion Sort with time complexity O(kn)  
Heap Sort with time complexity O(nLogk)  
Quick Sort with time complexity O(kLogk)  
Merge Sort with time complexity O(kLogk) 
Question 5 Explanation:
See http://www.geeksforgeeks.org/nearly-sorted-algorithm/ for explanation and implementation.
Question 6 
Consider a situation where swap operation is very costly. Which of the following sorting algorithms should be preferred so that the number of swap operations are minimized in general?
Heap Sort  
Selection Sort  
Insertion Sort  
Merge Sort 
Question 6 Explanation:
Selection sort makes O(n) swaps which is minimum among all sorting algorithms mentioned above.
Question 7 
Which of the following is not true about comparison based sorting algorithms?
The minimum possible time complexity of a comparison based sorting algorithm is O(nLogn) for a random input array  
Any comparison based sorting algorithm can be made stable by using position as a criteria when two elements are compared  
Counting Sort is not a comparison based sorting algorithm  
Heap Sort is not a comparison based sorting algorithm. 
Question 7 Explanation:
See http://www.geeksforgeeks.org/lower-bound-on-comparison-based-sorting-algorithms/ for point A. See http://www.geeksforgeeks.org/stability-in-sorting-algorithms/ for B. C is true: Counting Sort is an integer sorting algorithm.
Question 8 
Suppose we are sorting an array of eight integers using quicksort, and we have just finished the first partitioning with the array looking like this:
2 5 1 7 9 12 11 10
Which statement is correct?
The pivot could be either the 7 or the 9.  
The pivot could be the 7, but it is not the 9  
The pivot is not the 7, but it could be the 9  
Neither the 7 nor the 9 is the pivot. 
Question 8 Explanation:
7 and 9 both are at their correct positions (as in a sorted array). Also, all elements on left of 7 and 9 are smaller than 7 and 9 respectively and on right are greater than 7 and 9 respectively.
Question 9 
Suppose we are sorting an array of eight integers using heapsort, and we have just finished some heapify (either max-heapify or min-heapify) operations. The array now looks like this:
16 14 15 10 12 27 28
How many heapify operations have been performed on root of heap?
1  
2  
3 or 4  
5 or 6 
Question 9 Explanation:
In Heapsort, we first build a heap, then we repeat the following operations until the heap size becomes 1.
a) Swap the root with the last element
b) Call heapify for the root
c) Reduce the heap size by 1
In this question, it is given that heapify has been called a few times, and we see that the last two elements in the given array are the 2 maximum elements of the array. So the situation is clear: it is max-heapify which has been called 2 times.
Question 10 
What is the best time complexity of bubble sort?
N^2  
NlogN  
N  
N(logN)^2 
Question 10 Explanation:
Bubble sort is at its best when the input data is already sorted, i.e., sorted in the same order as the expected output. This case can be detected with a boolean variable that records whether any values were swapped in the inner loop.
Consider the following code snippet:
#include <stdio.h>

void swap(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

int main()
{
    int arr[] = {10, 20, 30, 40, 50}, i, j, isSwapped;
    int n = sizeof(arr) / sizeof(*arr);
    isSwapped = 1;
    for (i = 0; i < n - 1 && isSwapped; ++i)
    {
        isSwapped = 0;
        for (j = 0; j < n - i - 1; ++j)
            if (arr[j] > arr[j + 1])
            {
                swap(&arr[j], &arr[j + 1]);
                isSwapped = 1;
            }
    }
    for (i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    return 0;
}
Please observe that in the above code, the outer loop runs only once.
Question 11 
You have to sort 1 GB of data with only 100 MB of available main memory. Which sorting technique will be most appropriate?
Heap sort  
Merge sort  
Quick sort  
Insertion sort 
Question 11 Explanation:
The data can be sorted using external sorting which uses merging technique. This can be done as follows:
1. Divide the data into 10 groups, each of size 100 MB.
2. Sort each group and write them to disk.
3. Load 10 items from each group into main memory.
4. Output the smallest item from the main memory to disk. Load the next item from the group whose item was chosen.
5. Repeat step #4 until all items have been output.
Steps 3-5 constitute the merging phase.
Question 12 
What is the worst case time complexity of insertion sort where position of the data to be inserted is calculated using binary search?
N  
NlogN  
N^2  
N(logN)^2 
Question 12 Explanation:
Applying binary search to calculate the position of the data to be inserted doesn't reduce the time complexity of insertion sort. This is because insertion of a data at an appropriate position involves two steps:
1. Calculate the position.
2. Shift the data from the position calculated in step #1 one step right to create a gap where the data will be inserted.
Using binary search reduces the time complexity in step #1 from O(N) to O(logN). But, the time complexity in step #2 still remains O(N). So, overall complexity remains O(N^2).
Question 13 
The tightest lower bound on the number of comparisons, in the worst case, for comparison-based sorting is of the order of
N  
N^2  
NlogN  
N(logN)^2 
Question 13 Explanation:
The number of comparisons that a comparison sort algorithm requires increases in proportion to Nlog(N), where N is the number of elements to sort. This bound is asymptotically tight:
Given a list of distinct numbers (we can assume this because this is a worst-case analysis), there are N factorial permutations, exactly one of which is the list in sorted order. The sort algorithm must gain enough information from the comparisons to identify the correct permutation. If the algorithm always completes after at most f(N) steps, it cannot distinguish more than 2^f(N) cases because the keys are distinct and each comparison has only two possible outcomes. Therefore,
2^f(N) >= N! or equivalently f(N) >= log(N!).
Since log(N!) is Omega(NlogN), the answer is NlogN.
Question 14 
In a modified merge sort, the input array is split at a position one-third of the length (N) of the array. What is the worst case time complexity of this merge sort?
N(logN base 3)  
N(logN base 2/3)  
N(logN base 1/3)  
N(logN base 3/2) 
Question 14 Explanation:
The time complexity is given by:
T(N) = T(N/3) + T(2N/3) + N
Solving the above recurrence relation gives, T(N) = N(logN base 3/2)
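A sketch of why that recurrence solves to N(logN base 3/2): every level of the recursion tree does Θ(N) total work, and the deepest path repeatedly keeps 2/3 of the elements.

```latex
T(N) = T(N/3) + T(2N/3) + N
% Depth of the longest path (the 2N/3 branch):
N \cdot (2/3)^{h} = 1 \;\Rightarrow\; (3/2)^{h} = N \;\Rightarrow\; h = \log_{3/2} N
% \Theta(N) work per level, over \log_{3/2} N levels:
T(N) = \Theta(N \log_{3/2} N)
```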
Question 15 
Which sorting algorithm will take least time when all elements of input array are identical? Consider typical implementations of sorting algorithms.
Insertion Sort  
Heap Sort  
Merge Sort  
Selection Sort 
Question 15 Explanation:
The insertion sort will take [Tex]\theta[/Tex](n) time when the input array is already sorted, and an array of identical elements is trivially sorted.
Question 16 
A list of n strings, each of length n, is sorted into lexicographic order using the mergesort algorithm. The worst case running time of this computation is
(A)
(B)
(C)
(D)
A  
B  
C  
D 
Question 16 Explanation:
The recurrence tree for merge sort will have height Log(n). And O(n^2) work will be done at each level of the recurrence tree (Each level involves n comparisons and a comparison takes O(n) time in worst case). So time complexity of this Merge Sort will be [Tex]O (n^2 log n) [/Tex].
Question 17 
In quick sort, for sorting n elements, the (n/4)th smallest element is selected as pivot using an O(n) time algorithm. What is the worst case time complexity of the quick sort?
(A) Θ(n)
(B) Θ(nLogn)
(C) Θ(n^2)
(D) Θ(n^2 log n)
A  
B  
C  
D 
Question 17 Explanation:
The recursion expression becomes:
T(n) = T(n/4) + T(3n/4) + cn
After solving the above recursion, we get [Tex]\theta[/Tex](nLogn).
Question 18 
Consider the Quicksort algorithm. Suppose there is a procedure for finding a pivot element which splits the list into two sublists, each of which contains at least one-fifth of the elements. Let T(n) be the number of comparisons required to sort n elements. Then
T(n) <= 2T(n/5) + n  
T(n) <= T(n/5) + T(4n/5) + n  
T(n) <= 2T(4n/5) + n  
T(n) <= 2T(n/2) + n 
Question 18 Explanation:
For the case where n/5 elements are in one subset, T(n/5) comparisons are needed for the first subset with n/5 elements, T(4n/5) is for the rest 4n/5 elements, and n is for finding the pivot.
If one subset has more than n/5 elements, then the other subset will have fewer than 4n/5 elements, and the time complexity will be less than T(n/5) + T(4n/5) + n because the recursion tree will be more balanced.
Question 19 
Which of the following sorting algorithms has the lowest worst-case complexity?
Merge Sort  
Bubble Sort  
Quick Sort  
Selection Sort 
Question 19 Explanation:
Worst case complexities for the above sorting algorithms are as follows:
Merge Sort: nLogn
Bubble Sort: n^2
Quick Sort: n^2
Selection Sort: n^2
Question 20 
Which sorting algorithm is most efficient for sorting a string consisting of ASCII characters?
Quick sort  
Heap sort  
Merge sort  
Counting sort 
Question 20 Explanation:
The counting sort algorithm is efficient when the range of the data to be sorted is fixed. In the above question, the range is 0 to 255 (the ASCII range). Counting sort uses extra constant space proportional to the range of data.
Question 21 
The number of elements that can be sorted in [Tex]\Theta(Log n)[/Tex] time using heap sort is
(A)  (B)  (C)  (D)
A  
B  
C  
D 
Question 21 Explanation:
Time complexity of Heap Sort is [Tex]\Theta(mLogm)[/Tex] for m input elements. For m = [Tex]\Theta(Log n/(Log Log n))[/Tex], the value of [Tex]\Theta(m * Logm)[/Tex] will be [Tex]\Theta( [Log n/(Log Log n)] * [Log (Log n/(Log Log n))] )[/Tex] which will be [Tex]\Theta( [Log n/(Log Log n)] * [ Log Log n  Log Log Log n] )[/Tex] which is [Tex]\Theta(Log n)[/Tex]
Question 22 
Which of the following is true about merge sort?
Merge Sort works better than quick sort if data is accessed from slow sequential memory.  
Merge Sort is stable sort by nature  
Merge sort outperforms heap sort in most of the practical situations.  
All of the above. 
Question 22 Explanation:
All the given statements are true; see the Merge Sort article for details.
Question 23 
Given an array where numbers are in the range from 1 to n^{6}, which sorting algorithm can be used to sort these numbers in linear time?
Not possible to sort in linear time  
Radix Sort  
Counting Sort  
Quick Sort 
Question 23 Explanation:
See Radix Sort for explanation.
Question 24 
In quick sort, for sorting n elements, the (n/4)th smallest element is selected as pivot using an O(n) time algorithm. What is the worst case time complexity of the quick sort?
<pre>
(A) Θ(n)
(B) Θ(nLogn)
(C) Θ(n^2)
(D) Θ(n^2 log n) </pre>
A  
B  
C  
D 
Question 24 Explanation:
Answer(B)
The recursion expression becomes:
T(n) = T(n/4) + T(3n/4) + cn
After solving the above recursion, we get [Tex]\theta[/Tex](nLogn).
Question 25 
Consider the Quicksort algorithm. Suppose there is a procedure for finding a pivot element which splits the list into two sublists, each of which contains at least one-fifth of the elements. Let T(n) be the number of comparisons required to sort n elements. Then
T(n) <= 2T(n/5) + n  
T(n) <= T(n/5) + T(4n/5) + n  
T(n) <= 2T(4n/5) + n  
T(n) <= 2T(n/2) + n 
Question 25 Explanation:
For the case where n/5 elements are in one subset, T(n/5) comparisons are needed for the first subset with n/5 elements, T(4n/5) is for the rest 4n/5 elements, and n is for finding the pivot.
If one subset has more than n/5 elements, then the other subset will have fewer than 4n/5 elements, and the time complexity will be less than T(n/5) + T(4n/5) + n because the recursion tree will be more balanced.
Question 26 
Let P be a QuickSort Program to sort numbers in ascending order using the first element as pivot. Let t1 and t2 be the number of comparisons made by P for the inputs {1, 2, 3, 4, 5} and {4, 1, 5, 3, 2} respectively. Which one of the following holds?
t1 = 5  
t1 < t2  
t1 > t2  
t1 = t2 
Question 26 Explanation:
When the first or last element is chosen as pivot, Quick Sort's worst case occurs for already sorted arrays.
In every step of quick sort, numbers are divided as per the following recurrence.
T(n) = T(n-1) + O(n)
Question 27 
You have an array of n elements. Suppose you implement quicksort by always choosing the central element of the array as the pivot. Then the tightest upper bound for the worst case performance is
O(n^{2})  
O(nLogn)  
Theta(nLogn)  
O(n^{3}) 
Question 27 Explanation:
The central element may always hold an extreme value (smallest or largest of the remaining subarray), therefore the time complexity in the worst case becomes O(n^{2}).
Question 28 
In a permutation a1.....an of n distinct integers, an inversion is a pair (ai, aj) such that i < j and ai > aj. What would be the worst case time complexity of the Insertion Sort algorithm, if the inputs are restricted to permutations of 1.....n with at most n inversions?
Θ(n^{2})  
Θ(n log n)  
Θ(n^{1.5})  
Θ(n) 
Question 28 Explanation:
Insertion sort runs in Θ(n + f(n)) time, where f(n) denotes the number of inversions initially present in the array being sorted.
Source: http://cs.xidian.edu.cn/jpkc/Algorithm/down/Solution%20to%2024%20Inversions.pdf
Question 29 
Randomized quicksort is an extension of quicksort where the pivot is chosen randomly. What is the worst case complexity of sorting n numbers using randomized quicksort?
O(n)  
O(n Log n)  
O(n^{2})  
O(n!) 
Question 29 Explanation:
Randomized quicksort has expected time complexity O(nLogn), but its worst case time complexity remains the same. In the worst case the randomized function can pick the index of a corner element every time.
Question 30 
Which of the following changes to typical QuickSort improve its performance on average and are generally done in practice?
1) Randomly picking the pivot to make the worst case less likely to occur.
2) Calling insertion sort for small sized arrays to reduce recursive calls.
3) QuickSort is tail recursive, so tail call optimizations can be done.
4) A linear time median searching algorithm is used to pick the median, so that the worst case time reduces to O(nLogn).
1 and 2  
2, 3, and 4  
1, 2 and 3  
2, 3 and 4 
Question 30 Explanation:
The 4th optimization is generally not used, it reduces the worst case time complexity to O(nLogn), but the hidden constants are very high.
Question 31 
Which one of the following is the recurrence equation for the worst case time complexity of the Quicksort algorithm for sorting n (≥ 2) numbers? In the recurrence equations given in the options below, c is a constant.
T(n) = 2T(n/2) + cn  
T(n) = T(n - 1) + T(0) + cn  
T(n) = 2T(n - 2) + cn  
T(n) = T(n/2) + cn 
Question 31 Explanation:
In the worst case, the chosen pivot is always placed at a corner position, and recursive calls are made for the following:
a) the subarray on the left of the pivot, which has size n-1 in the worst case.
b) the subarray on the right of the pivot, which has size 0 in the worst case.
Question 32 
Assume that a mergesort algorithm in the worst case takes 30 seconds for an input of size 64. Which of the following most closely approximates the maximum input size of a problem that can be solved in 6 minutes?
256  
512  
1024  
2048 
Question 32 Explanation:
Time complexity of merge sort is Θ(nLogn).
c*64*Log64 = 30, so c*64*6 = 30, giving c = 5/64.
For time 6 minutes: (5/64)*nLogn = 6*60
nLogn = 72*64 = 512 * 9
n = 512.
Question 33 
The worst case running times of Insertion sort, Merge sort and Quick sort, respectively, are:
Θ(n log n), Θ(n log n) and Θ(n^{2})  
Θ(n^{2}), Θ(n^{2}) and Θ(n Log n)  
Θ(n^{2}), Θ(n log n) and Θ(n log n)  
Θ(n^{2}), Θ(n log n) and Θ(n^{2}) 
Question 33 Explanation:
Insertion Sort takes Θ(n^{2}) in the worst case as we need to run two loops. The outer loop picks elements one by one to insert at the right position. The inner loop finds the position of the element to be inserted and moves all sorted greater elements one position ahead. Therefore the worst case recursive formula is T(n) = T(n-1) + Θ(n).
Merge Sort takes Θ(n Log n) time in all cases. We always divide the array in two halves, sort the two halves and merge them. The recursive formula is T(n) = 2T(n/2) + Θ(n).
QuickSort takes Θ(n^{2}) in the worst case. In QuickSort, we take an element as pivot and partition the array around it. In the worst case, the picked element is always a corner element, and the recursive formula becomes T(n) = T(n-1) + Θ(n). An example scenario of the worst case: the array is sorted and our code always picks a corner element as pivot.
Question 34 
Assume that the algorithms considered here sort the input sequences in ascending order. If the input is already in ascending order, which of the following are TRUE ?
I. Quicksort runs in Θ(n^{2}) time
II. Bubblesort runs in Θ(n^{2}) time
III. Mergesort runs in Θ(n) time
IV. Insertion sort runs in Θ(n) time
I and II only  
I and III only  
II and IV only  
I and IV only 
Question 34 Explanation:
I. Given an array in ascending order, the recurrence for the total number of comparisons for quicksort is
T(n) = T(n-1) + O(n) // the partition step takes O(n) comparisons in any case
= O(n^2)
So statement I is true.
II. Bubble Sort runs in Θ(n^2) time
If the array is in ascending order, a small modification to Bubble Sort's inner loop (which bubbles the kth largest element to the end in the kth iteration) helps: whenever no swap happens during an inner loop pass, we can declare the array sorted. With this flag, Bubble Sort takes O(n) in the best case, so statement II is false.
III. Merge Sort runs in Θ(n) time
Merge Sort relies on the divide and conquer paradigm, and there is no especially good or bad input for it. For any sequence, the time complexity is given by the recurrence
T(n) = 2T(n/2) + Θ(n) // the merge step takes Θ(n) since it copies the whole array
= Θ(nLogn)
So statement III is false.
IV. Insertion sort runs in Θ(n) time
Whenever a new element is picked, it is greater than all elements of the sorted subarray so far (because the given array is sorted), so there is no shift, just a single comparison. In n-1 passes we have 0 shifts and n-1 comparisons.
Total time complexity = O(n) // n-1 comparisons
So statement IV is true.
This solution is contributed by Pranjul Ahuja
In short, for an array already sorted in ascending order:
Quicksort has a complexity Θ(n^{2}) [Worst Case]
Bubblesort has a complexity Θ(n) [Best Case]
Mergesort has a complexity Θ(n log n) [Any Case]
Insertion sort has a complexity Θ(n) [Best Case]
Question 35 
Assume that we use Bubble Sort to sort n distinct elements in ascending order. When does the best case of Bubble Sort occur?
When elements are sorted in ascending order  
When elements are sorted in descending order  
When elements are not sorted by any order  
There is no best case for Bubble Sort. It always takes O(n*n) time 
Question 36 
If we use Radix Sort to sort n integers in the range (n^{k/2},n^{k}], for some k>0 which is independent of n, the time taken would be?
Θ(n)  
Θ(kn)  
Θ(nlogn)  
Θ(n^{2}) 
Question 36 Explanation:
Radix sort time complexity is O(wn) for n keys of word size w.
Here w = log(n^{k}) = k log n, so
O(wn) = O(k * nlogn) = kO(nlogn).
Since k is a constant independent of n, the answer is Θ(nlogn).
Question 37 
Consider an array of elements arr[5] = {5,4,3,2,1}. What are the steps of insertion done while sorting the array using insertion sort?
4 5 3 2 1
3 4 5 2 1
2 3 4 5 1
1 2 3 4 5
 
5 4 3 1 2
5 4 1 2 3
5 1 2 3 4
1 2 3 4 5
 
4 3 2 1 5
3 2 1 5 4
2 1 5 4 3
1 5 4 3 2
 
4 5 3 2 1
2 3 4 5 1
3 4 5 2 1
1 2 3 4 5

Question 37 Explanation:
In insertion sort, imagine that the first element is already sorted and all elements to its right are unsorted; we insert the elements one by one, from left to right, into the sorted part.
Sorted: 5          Unsorted: 4 3 2 1
Insert the key (4) into its position in the sorted part on the left.
Now the key value is 4 and the array looks like this:
Sorted: 4 5        Unsorted: 3 2 1
Similarly, in every step the key is the newly picked value; it is compared with the sorted values and inserted into its proper position.
Question 38 
Which is the correct order of the following algorithms with respect to their best-case time complexity?
Merge sort > Quick sort >Insertion sort > selection sort  
insertion sort < Quick sort < Merge sort < selection sort  
Merge sort > selection sort > quick sort > insertion sort  
Merge sort > Quick sort > selection sort > insertion sort 
Question 38 Explanation:
In the best case:
Quick sort: O(nlogn)
Merge sort: O(nlogn)
Insertion sort: O(n)
Selection sort: O(n^2)
Question 39 
Which of the following statements is correct with respect to insertion sort?
*Online - can sort a list at runtime
*Stable - doesn't change the relative order of elements with equal keys.
Insertion sort is stable, online but not suited well for large number of elements.  
Insertion sort is unstable and online  
Insertion sort is online and can be applied to more than 100 elements  
Insertion sort is stable & online and can be applied to more than 100 elements 
Question 39 Explanation:
The time taken by the algorithm is good for a small number of elements, but increases quadratically for large numbers of elements.
Question 40 
Consider the array A[] = {6,4,8,1,3} and apply insertion sort to sort it. If the cost associated with each insertion step is 25 rupees, what is the total cost of the insertion sort when element 1 reaches the first position of the array?
50  
25  
75  
100 
Question 40 Explanation:
When element 1 reaches the first position of the array, only two insertion steps are required, hence 25 * 2 = 50 rupees.
Step 1: 4 6 8 1 3
Step 2: 1 4 6 8 3
Question 41 
The auxiliary space of insertion sort is O(1). What does O(1) mean?
The memory (space) required to process the data is not constant.  
It means the amount of extra memory Insertion Sort consumes doesn't depend on the input. The algorithm should use the same amount of memory for all inputs.  
It takes only 1 kb of memory .  
It is the speed at which the elements are traversed. 
Question 41 Explanation:
The term O(1) states that the space required by the insertion sort is constant i.e., space required doesn't depend on input.
Question 42 
What is the best sorting algorithm to use when the array has more than 1 million elements, in general?
Merge sort.  
Bubble sort.  
Quick sort.  
Insertion sort. 
Question 42 Explanation:
Most practical implementations of Quick Sort use the randomized version. The randomized version has expected time complexity of O(nLogn). The worst case is possible in the randomized version also, but the worst case doesn't occur for a particular pattern (like a sorted array), and randomized Quick Sort works well in practice.
Quick Sort is also a cache friendly sorting algorithm as it has good locality of reference when used for arrays.
Quick Sort is also tail recursive, therefore tail call optimizations can be done.
Question 43 
Which of the below given sorting techniques has the highest best-case runtime complexity?
Quick sort  
Selection sort  
Insertion sort  
Bubble sort 
Question 43 Explanation:
Quick sort best case time complexity is O(nlogn)
Selection sort best case time complexity is O(n^2)
Insertion sort best case time complexity is O(n)
Bubble sort best case time complexity is O(n)
There are 43 questions to complete.