Question 1
What is the recurrence for the worst case of QuickSort, and what is the worst case time complexity?
 A Recurrence is T(n) = T(n-2) + O(n) and time complexity is O(n^2) B Recurrence is T(n) = T(n-1) + O(n) and time complexity is O(n^2) C Recurrence is T(n) = 2T(n/2) + O(n) and time complexity is O(nLogn) D Recurrence is T(n) = T(n/10) + T(9n/10) + O(n) and time complexity is O(nLogn)
Analysis of Algorithms    Sorting    QuickSort
Question 1 Explanation:
The worst case of QuickSort occurs when the picked pivot is always one of the corner elements of a sorted array. In the worst case, QuickSort recursively calls one subproblem with size 0 and the other with size (n-1), so the recurrence is T(n) = T(n-1) + T(0) + O(n), which can be rewritten as T(n) = T(n-1) + O(n).

```c
void exchange(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

int partition(int arr[], int si, int ei)
{
    int x = arr[ei];
    int i = (si - 1);
    int j;
    for (j = si; j <= ei - 1; j++) {
        if (arr[j] <= x) {
            i++;
            exchange(&arr[i], &arr[j]);
        }
    }
    exchange(&arr[i + 1], &arr[ei]);
    return (i + 1);
}

/* Implementation of Quick Sort
   arr[] --> Array to be sorted
   si    --> Starting index
   ei    --> Ending index */
void quickSort(int arr[], int si, int ei)
{
    int pi; /* Partitioning index */
    if (si < ei) {
        pi = partition(arr, si, ei);
        quickSort(arr, si, pi - 1);
        quickSort(arr, pi + 1, ei);
    }
}
```
 Question 2
Suppose we have an O(n) time algorithm that finds the median of an unsorted array. Now consider a QuickSort implementation where we first find the median using the above algorithm, then use the median as pivot. What will be the worst case time complexity of this modified QuickSort?
 A O(n^2 Logn) B O(n^2) C O(n Logn Logn) D O(nLogn)
Analysis of Algorithms    Sorting    QuickSort
Question 2 Explanation:
If we use the median as the pivot element, the recurrence for all cases becomes T(n) = 2T(n/2) + O(n). This recurrence can be solved using the Master Method; it falls in case 2.
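For completeness, the master theorem check can be written out (a standard derivation, not part of the original explanation):

```latex
T(n) = 2T(n/2) + O(n):\quad a = 2,\; b = 2,\; f(n) = n,
\qquad n^{\log_b a} = n^{\log_2 2} = n = \Theta(f(n))
\;\Rightarrow\; T(n) = \Theta(n \log n).
```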
 Question 3
Which of the following is not a stable sorting algorithm in its typical implementation?
 A Insertion Sort B Merge Sort C Quick Sort D Bubble Sort
Sorting    QuickSort    InsertionSort    MergeSort
Question 3 Explanation:
 Question 4
Which of the following sorting algorithms in its typical implementation gives the best performance when applied to an array which is sorted or almost sorted (at most one or two elements are misplaced)?
 A Quick Sort B Heap Sort C Merge Sort D Insertion Sort
Sorting    QuickSort    InsertionSort    HeapSort
Question 4 Explanation:
Insertion sort takes linear time when the input array is sorted or almost sorted (at most one or two elements misplaced). All the other sorting algorithms mentioned above take more than linear time in their typical implementations.
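As a sketch (function name is ours, not from the original), here is a standard insertion sort; on a sorted or almost sorted array the inner while loop exits immediately for nearly every element, which is what makes the running time linear:

```c
#include <stddef.h>

/* Plain insertion sort. On a sorted or almost sorted input the while loop
   below does (almost) no iterations, so total work is O(n). */
void insertion_sort(int a[], size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        /* shift only the elements strictly larger than key */
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];
            j--;
        }
        a[j] = key;
    }
}
```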
 Question 5
Given an unsorted array. The array has this property that every element in array is at most k distance from its position in sorted array where k is a positive integer smaller than size of array. Which sorting algorithm can be easily modified for sorting this array and what is the obtainable time complexity?
 A Insertion Sort with time complexity O(kn) B Heap Sort with time complexity O(nLogk) C Quick Sort with time complexity O(kLogk) D Merge Sort with time complexity O(kLogk)
Analysis of Algorithms    Sorting    QuickSort    HeapSort
Question 5 Explanation:
See http://www.geeksforgeeks.org/nearly-sorted-algorithm/ for explanation and implementation.
 Question 6
Consider a situation where the swap operation is very costly. Which of the following sorting algorithms should be preferred so that the number of swap operations is minimized in general?
 A Heap Sort B Selection Sort C Insertion Sort D Merge Sort
Sorting    SelectionSort    InsertionSort    MergeSort
Question 6 Explanation:

Selection sort makes O(n) swaps, which is the minimum among all the sorting algorithms mentioned above.
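A minimal sketch illustrating the swap bound (function name and swap counter are ours): selection sort performs at most one swap per outer iteration, so never more than n-1 swaps in total.

```c
#include <stddef.h>

/* Selection sort that returns the number of swaps performed.
   At most one swap happens per outer iteration: swaps <= n - 1. */
size_t selection_sort(int a[], size_t n)
{
    size_t swaps = 0;
    for (size_t i = 0; i + 1 < n; i++) {
        size_t min = i;
        for (size_t j = i + 1; j < n; j++)
            if (a[j] < a[min])
                min = j;
        if (min != i) {  /* swap only when a smaller element was found */
            int t = a[i];
            a[i] = a[min];
            a[min] = t;
            swaps++;
        }
    }
    return swaps;
}
```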

 Question 7
Which of the following is not true about comparison based sorting algorithms?
 A The minimum possible time complexity of a comparison based sorting algorithm is O(nLogn) for a random input array B Any comparison based sorting algorithm can be made stable by using position as a criterion when two elements are compared C Counting Sort is not a comparison based sorting algorithm D Heap Sort is not a comparison based sorting algorithm
Analysis of Algorithms    Sorting    HeapSort    CountingSort
Question 7 Explanation:
 Question 8
Suppose we are sorting an array of eight integers using quicksort, and we have just finished the first partitioning with the array looking like this: 2 5 1 7 9 12 11 10 Which statement is correct?
 A The pivot could be either the 7 or the 9. B The pivot could be the 7, but it is not the 9 C The pivot is not the 7, but it could be the 9 D Neither the 7 nor the 9 is the pivot.
Sorting    QuickSort
Question 8 Explanation:
7 and 9 are both at their correct positions (as in the sorted array). Also, all elements to the left of 7 are smaller than 7 and all elements to its right are greater, and the same holds for 9. Hence either could have been the pivot.
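The "could this be the pivot's final position?" check can be sketched as a small helper (name is ours): an index qualifies after a partition step exactly when everything to its left is no larger and everything to its right is no smaller.

```c
#include <stddef.h>

/* Returns 1 if index p could be a pivot's final position after one
   partition step: a[0..p-1] <= a[p] and a[p+1..n-1] >= a[p]. */
int could_be_pivot(const int a[], size_t n, size_t p)
{
    for (size_t i = 0; i < p; i++)
        if (a[i] > a[p])
            return 0;
    for (size_t i = p + 1; i < n; i++)
        if (a[i] < a[p])
            return 0;
    return 1;
}
```

On the array from the question, both index 3 (value 7) and index 4 (value 9) pass this check.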
 Question 9
Suppose we are sorting an array of eight integers using heapsort, and we have just finished some heapify (either maxheapify or minheapify) operations. The array now looks like this: 16 14 15 10 12 27 28 How many heapify operations have been performed on root of heap?
 A 1 B 2 C 3 or 4 D 5 or 6
Sorting    Heap    HeapSort
Question 9 Explanation:
In Heapsort, we first build a heap, then repeat the following operations until the heap size becomes 1: a) swap the root with the last element, b) call heapify on the root, c) reduce the heap size by 1. In this question, we are told that heapify has been called a few times, and we see that the last two elements of the given array are the two maximum elements of the array. So the situation is clear: it is maxHeapify, which has been called 2 times.
 Question 10
What is the best time complexity of bubble sort?
 A N^2 B NlogN C N D N(logN)^2
Analysis of Algorithms    Sorting    BubbleSort
Question 10 Explanation:
The bubble sort is at its best if the input data is already sorted, i.e. sorted in the same order as the expected output. This can be detected with one boolean variable that records whether any values were swapped in the inner loop. Consider the following code snippet (a swap helper and include are added so it compiles):

```c
#include <stdio.h>

void swap(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

int main()
{
    int arr[] = {10, 20, 30, 40, 50}, i, j, isSwapped;
    int n = sizeof(arr) / sizeof(*arr);
    isSwapped = 1;
    for (i = 0; i < n - 1 && isSwapped; ++i) {
        isSwapped = 0;
        for (j = 0; j < n - i - 1; ++j)
            if (arr[j] > arr[j + 1]) {
                swap(&arr[j], &arr[j + 1]);
                isSwapped = 1;
            }
    }
    for (i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    return 0;
}
```

Observe that in the above code, for this already-sorted input, the outer loop runs only once, so the best case is O(n).
 Question 11
You have to sort 1 GB of data with only 100 MB of available main memory. Which sorting technique will be most appropriate?
 A Heap sort B Merge sort C Quick sort D Insertion sort
Sorting    QuickSort    MergeSort    HeapSort
Question 11 Explanation:
The data can be sorted using external sorting, which uses a merging technique. This can be done as follows: 1. Divide the data into 10 groups, each of size 100 MB. 2. Sort each group and write it to disk. 3. Load the first items of each group into main memory. 4. Output the smallest item from main memory to disk, then load the next item from the group whose item was chosen. 5. Repeat step 4 until all items have been output. Steps 3-5 constitute the merging phase.
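The merging phase (steps 3-5) can be sketched in memory as a k-way merge of sorted runs; this is an illustrative simplification with our own names, using a linear scan for the minimum (a heap would make it O(n log k)):

```c
#include <stddef.h>

/* Merge k sorted runs into out[] by repeatedly picking the smallest
   element among the k run heads. runs[i] points to the i-th sorted run,
   len[i] is its length; assumes k <= 16 for the cursor array. */
void kway_merge(const int *runs[], const size_t len[], size_t k, int out[])
{
    size_t pos[16] = {0};  /* read cursor for each run */
    size_t total = 0;
    for (size_t i = 0; i < k; i++)
        total += len[i];
    for (size_t done = 0; done < total; done++) {
        size_t best = k;   /* index of the run holding the current minimum */
        for (size_t i = 0; i < k; i++)
            if (pos[i] < len[i] &&
                (best == k || runs[i][pos[i]] < runs[best][pos[best]]))
                best = i;
        out[done] = runs[best][pos[best]++];
    }
}
```

In real external sorting the runs live on disk and are streamed in blocks, but the merge logic is the same.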
 Question 12
What is the worst case time complexity of insertion sort where position of the data to be inserted is calculated using binary search?
 A N B NlogN C N^2 D N(logN)^2
Analysis of Algorithms    Sorting    InsertionSort    BinarySearch
Question 12 Explanation:
Applying binary search to calculate the position of the data to be inserted doesn't reduce the time complexity of insertion sort. This is because insertion of a data at an appropriate position involves two steps: 1. Calculate the position. 2. Shift the data from the position calculated in step #1 one step right to create a gap where the data will be inserted. Using binary search reduces the time complexity in step #1 from O(N) to O(logN). But, the time complexity in step #2 still remains O(N). So, overall complexity remains O(N^2).
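A sketch of the two steps (function name is ours): the binary search finds the insertion point in O(log n), but the shift that opens the gap is still O(n), so the overall worst case stays O(n^2).

```c
#include <stddef.h>

/* Insertion sort with binary search for the insertion position. */
void binary_insertion_sort(int a[], size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        /* Step 1: O(log n) search for the leftmost slot where key fits.
           Using a[mid] <= key keeps equal elements stable. */
        size_t lo = 0, hi = i;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (a[mid] <= key)
                lo = mid + 1;
            else
                hi = mid;
        }
        /* Step 2: the O(n) shift that dominates the cost. */
        for (size_t j = i; j > lo; j--)
            a[j] = a[j - 1];
        a[lo] = key;
    }
}
```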
 Question 13
The tightest lower bound on the number of comparisons, in the worst case, for comparison-based sorting is of the order of
 A N B N^2 C NlogN D N(logN)^2
Analysis of Algorithms    Sorting
Question 13 Explanation:
The number of comparisons that a comparison sort algorithm requires increases in proportion to NLogN, where N is the number of elements to sort. This bound is asymptotically tight: given a list of distinct numbers (we can assume this because this is a worst-case analysis), there are N! permutations, exactly one of which is the list in sorted order. The sort algorithm must gain enough information from the comparisons to identify the correct permutation. If the algorithm always completes after at most f(N) steps, it cannot distinguish more than 2^f(N) cases, because the keys are distinct and each comparison has only two possible outcomes. Therefore 2^f(N) >= N!, or equivalently f(N) >= log(N!). Since log(N!) is Omega(NLogN), the answer is NLogN.
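The step from log(N!) to Omega(NLogN) can be made explicit; this is a standard bound, added here for completeness (at least half of the factors in N! are at least N/2):

```latex
f(N) \ge \log_2(N!) \ge \log_2\!\big((N/2)^{N/2}\big)
      = \tfrac{N}{2}\log_2\tfrac{N}{2}
      = \Omega(N \log N).
```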
 Question 14
In a modified merge sort, the input array is split at a position one-third of the length (N) of the array. What is the worst case time complexity of this merge sort?
 A N(logN base 3) B N(logN base 2/3) C N(logN base 1/3) D N(logN base 3/2)
Analysis of Algorithms    Sorting    MergeSort
Question 14 Explanation:
The time complexity is given by: T(N) = T(N/3) + T(2N/3) + N Solving the above recurrence relation gives, T(N) = N(logN base 3/2)
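A sketch of why the recurrence solves to N(logN base 3/2): each level of the recursion tree does Θ(N) total work, and the deepest path repeatedly takes the 2N/3 branch, so the height h satisfies (2/3)^h · N = 1:

```latex
T(N) = T(N/3) + T(2N/3) + N,\qquad
\left(\tfrac{2}{3}\right)^{h} N = 1 \;\Rightarrow\; h = \log_{3/2} N,
\qquad T(N) = \Theta\!\left(N \log_{3/2} N\right).
```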
 Question 15
Which sorting algorithm will take least time when all elements of input array are identical? Consider typical implementations of sorting algorithms.
 A Insertion Sort B Heap Sort C Merge Sort D Selection Sort
Sorting    SelectionSort    InsertionSort    MergeSort
Question 15 Explanation:
Insertion sort will take Θ(n) time when the input array is already sorted, and an array of identical elements is already sorted.
 Question 16
A list of n string, each of length n, is sorted into lexicographic order using the merge-sort algorithm. The worst case running time of this computation is (A) (B) (C) (D)
 A A B B C C D D
Analysis of Algorithms    Sorting    MergeSort
Question 16 Explanation:
The recurrence tree for merge sort will have height Log(n), and O(n^2) work will be done at each level of the recurrence tree (each level involves n comparisons, and a comparison takes O(n) time in the worst case). So the time complexity of this Merge Sort will be O(n^2 Log n).
 Question 17
In quick sort, for sorting n elements, the (n/4)th smallest element is selected as pivot using an O(n) time algorithm. What is the worst case time complexity of the quick sort? (A) Θ(n) (B) Θ(nLogn) (C) Θ(n^2) (D) Θ(n^2 Logn)
 A A B B C C D D
Analysis of Algorithms    Sorting    QuickSort
Question 17 Explanation:
The recursion expression becomes: T(n) = T(n/4) + T(3n/4) + cn. Solving the above recursion gives Θ(nLogn).
 Question 18
Consider the Quicksort algorithm. Suppose there is a procedure for finding a pivot element which splits the list into two sub-lists each of which contains at least one-fifth of the elements. Let T(n) be the number of comparisons required to sort n elements. Then
 A T(n) <= 2T(n/5) + n B T(n) <= T(n/5) + T(4n/5) + n C T(n) <= 2T(4n/5) + n D T(n) <= 2T(n/2) + n
Analysis of Algorithms    Sorting    QuickSort
Question 18 Explanation:
For the case where n/5 elements are in one subset, T(n/5) comparisons are needed for the first subset with n/5 elements, T(4n/5) is for the rest 4n/5 elements, and n is for finding the pivot. If there are more than n/5 elements in one set then other set will have less than 4n/5 elements and time complexity will be less than T(n/5) + T(4n/5) + n because recursion tree will be more balanced.
 Question 19
Which of the following sorting algorithms has the lowest worst-case complexity?
 A Merge Sort B Bubble Sort C Quick Sort D Selection Sort
Analysis of Algorithms    Sorting    SelectionSort    MergeSort
Question 19 Explanation:
Worst case complexities for the above sorting algorithms are as follows: Merge Sort: O(nLogn), Bubble Sort: O(n^2), Quick Sort: O(n^2), Selection Sort: O(n^2).
 Question 20
Which sorting algorithm is most efficient for sorting a string consisting of ASCII characters?
 A Quick sort B Heap sort C Merge sort D Counting sort
Sorting    QuickSort    HeapSort    CountingSort
Question 20 Explanation:
Counting sort is efficient when the range of the data to be sorted is fixed. In the above question, the range is 0 to 255 (the ASCII range). Counting sort uses extra constant space proportional to the range of the data.
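A minimal sketch (helper name is ours): count each byte value, then rewrite the string in order, giving O(n + 256) time and O(256) extra space regardless of the input's ordering.

```c
#include <stddef.h>
#include <string.h>

/* Counting sort over the fixed ASCII range 0..255. */
void counting_sort_ascii(char *s)
{
    size_t count[256] = {0};
    size_t n = strlen(s);
    for (size_t i = 0; i < n; i++)
        count[(unsigned char)s[i]]++;
    size_t k = 0;
    for (int c = 1; c < 256; c++)        /* start at 1 so NUL stays terminal */
        for (size_t r = 0; r < count[c]; r++)
            s[k++] = (char)c;
}
```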
 Question 21
The number of elements that can be sorted in Θ(Log n) time using heap sort is
(A) Θ(1)
(B) Θ(sqrt(Log n))
(C) Θ(Log n / Log Log n)
(D) Θ(Log n)
 A A B B C C D D
Analysis of Algorithms    Sorting    HeapSort
Question 21 Explanation:
Time complexity of Heap Sort is Θ(m Log m) for m input elements. For m = Θ(Log n / Log Log n), the value of Θ(m Log m) will be Θ( [Log n/(Log Log n)] * [Log (Log n/(Log Log n))] ), which is Θ( [Log n/(Log Log n)] * [Log Log n - Log Log Log n] ), which is Θ(Log n).
 Question 22
Which of the following is true about merge sort?
 A Merge Sort works better than quick sort if data is accessed from slow sequential memory. B Merge Sort is stable sort by nature C Merge sort outperforms heap sort in most of the practical situations. D All of the above.
Sorting    QuickSort    MergeSort    HeapSort
Question 22 Explanation:
See the Merge Sort article for details.
 Question 23
Given an array where numbers are in the range from 1 to n^6, which sorting algorithm can be used to sort these numbers in linear time?
 A Not possible to sort in linear time B Radix Sort C Counting Sort D Quick Sort
Question 23 Explanation:
 Question 24
In quick sort, for sorting n elements, the (n/4)th smallest element is selected as pivot using an O(n) time algorithm. What is the worst case time complexity of the quick sort? (A) Θ(n) (B) Θ(nLogn) (C) Θ(n^2) (D) Θ(n^2 Logn)
 A A B B C C D D
Analysis of Algorithms    Sorting    GATE-CS-2009    QuickSort
Question 24 Explanation:
Answer (B). The recursion expression becomes: T(n) = T(n/4) + T(3n/4) + cn. Solving the above recursion gives Θ(nLogn).
 Question 25
Consider the Quicksort algorithm. Suppose there is a procedure for finding a pivot element which splits the list into two sub-lists each of which contains at least one-fifth of the elements. Let T(n) be the number of comparisons required to sort n elements. Then
 A T(n) <= 2T(n/5) + n B T(n) <= T(n/5) + T(4n/5) + n C T(n) <= 2T(4n/5) + n D T(n) <= 2T(n/2) + n
Analysis of Algorithms    Sorting    GATE CS 2008    QuickSort
Question 25 Explanation:
For the case where n/5 elements are in one subset, T(n/5) comparisons are needed for the first subset with n/5 elements, T(4n/5) is for the rest 4n/5 elements, and n is for finding the pivot. If there are more than n/5 elements in one set then other set will have less than 4n/5 elements and time complexity will be less than T(n/5) + T(4n/5) + n because recursion tree will be more balanced.
 Question 26
Let P be a QuickSort Program to sort numbers in ascending order using the first element as pivot. Let t1 and t2 be the number of comparisons made by P for the inputs {1, 2, 3, 4, 5} and {4, 1, 5, 3, 2} respectively. Which one of the following holds?
 A t1 = 5 B t1 < t2 C t1 > t2 D t1 = t2
Sorting    GATE-CS-2014-(Set-1)    QuickSort
Question 26 Explanation:
When first element or last element is chosen as pivot, Quick Sort's worst case occurs for the sorted arrays. In every step of quick sort, numbers are divided as per the following recurrence. T(n) = T(n-1) + O(n)
 Question 27
You have an array of n elements. Suppose you implement quicksort by always choosing the central element of the array as the pivot. Then the tightest upper bound for the worst case performance is
 A O(n^2) B O(nLogn) C Θ(nLogn) D O(n^3)
Analysis of Algorithms    Sorting    GATE-CS-2014-(Set-3)    QuickSort
Question 27 Explanation:
The central element may always be an extreme element (the smallest or largest of the current subarray), therefore the time complexity in the worst case becomes O(n^2).
 Question 28
In a permutation a1.....an of n distinct integers, an inversion is a pair (ai, aj) such that i < j and ai > aj. What would be the worst case time complexity of the Insertion Sort algorithm, if the inputs are restricted to permutations of 1.....n with at most n inversions?
 A Θ(n^2) B Θ(n Log n) C Θ(n^1.5) D Θ(n)
Analysis of Algorithms    Sorting    GATE-CS-2003    InsertionSort
Question 28 Explanation:
Insertion sort runs in Θ(n + f(n)) time, where f(n) denotes the number of inversions initially present in the array being sorted. Source: http://cs.xidian.edu.cn/jpkc/Algorithm/down/Solution%20to%202-4%20Inversions.pdf
 Question 29
Randomized quicksort is an extension of quicksort where the pivot is chosen randomly. What is the worst case complexity of sorting n numbers using randomized quicksort?
 A O(n) B O(n Log n) C O(n^2) D O(n!)
Analysis of Algorithms    Sorting    GATE-CS-2001    QuickSort
Question 29 Explanation:
Randomized quicksort has expected time complexity as O(nLogn), but worst case time complexity remains same. In worst case the randomized function can pick the index of corner element every time.
 Question 30
Which of the following changes to typical QuickSort improve its performance on average and are generally done in practice?
1) Randomly picking the pivot to make the worst
case less likely to occur.
2) Calling insertion sort for small sized arrays
to reduce recursive calls.
3) QuickSort is tail recursive, so tail call
optimizations can be done.
4) A linear time median searching algorithm is used
to pick the median as pivot, so that the worst case
time reduces to O(nLogn)

 A 1 and 2 B 2, 3, and 4 C 1, 2 and 3 D 2, 3 and 4
Sorting    GATE-CS-2015 (Mock Test)    QuickSort
Question 30 Explanation:
The 4th optimization is generally not used; although it reduces the worst case time complexity to O(nLogn), the hidden constants are very high.
 Question 31
Which one of the following is the recurrence equation for the worst case time complexity of the Quicksort algorithm for sorting n (≥ 2) numbers? In the recurrence equations given in the options below, c is a constant.
 A T(n) = 2T(n/2) + cn B T(n) = T(n - 1) + T(0) + cn C T(n) = 2T(n - 2) + cn D T(n) = T(n/2) + cn
Analysis of Algorithms    Sorting    GATE-CS-2015 (Set 1)    QuickSort
Question 31 Explanation:
In worst case, the chosen pivot is always placed at a corner position and recursive call is made for following. a) for subarray on left of pivot which is of size n-1 in worst case. b) for subarray on right of pivot which is of size 0 in worst case.
 Question 32
Assume that a mergesort algorithm in the worst case takes 30 seconds for an input of size 64. Which of the following most closely approximates the maximum input size of a problem that can be solved in 6 minutes?
 A 256 B 512 C 1024 D 2048
Sorting    GATE-CS-2015 (Set 3)    MergeSort
Question 32 Explanation:
Time complexity of merge sort is Θ(nLogn)

c*64Log64 is 30
c*64*6 is 30
c is 5/64

For time 6 minutes

5/64*nLogn = 6*60

nLogn = 72*64 = 512 * 9

n = 512. 
 Question 33
The worst case running times of Insertion sort, Merge sort and Quick sort, respectively, are:
 A Θ(n Log n), Θ(n Log n) and Θ(n^2) B Θ(n^2), Θ(n^2) and Θ(n Log n) C Θ(n^2), Θ(n Log n) and Θ(n Log n) D Θ(n^2), Θ(n Log n) and Θ(n^2)
Analysis of Algorithms    Sorting    GATE-CS-2016 (Set 1)
Question 33 Explanation:
• Insertion Sort takes Θ(n^2) in the worst case as we need to run two loops. The outer loop picks elements one by one to be inserted at the right position. The inner loop does two things: find the position of the element to be inserted and move all greater sorted elements one position ahead. Therefore the worst case recursive formula is T(n) = T(n-1) + Θ(n).
• Merge Sort takes Θ(n Log n) time in all cases. We always divide the array in two halves, sort the two halves and merge them. The recursive formula is T(n) = 2T(n/2) + Θ(n).
• QuickSort takes Θ(n^2) in the worst case. In QuickSort, we take an element as pivot and partition the array around it. In the worst case, the picked element is always a corner element, and the recursive formula becomes T(n) = T(n-1) + Θ(n). An example scenario where the worst case happens: the array is sorted and the code always picks a corner element as pivot.

 Question 34
Assume that the algorithms considered here sort the input sequences in ascending order. If the input is already in ascending order, which of the following are TRUE ?
I.   Quicksort runs in Θ(n^2) time
II.  Bubblesort runs in Θ(n^2) time
III. Mergesort runs in  Θ(n) time
IV.  Insertion sort runs in  Θ(n) time 
 A I and II only B I and III only C II and IV only D I and IV only
Analysis of Algorithms    Sorting    GATE-CS-2016 (Set 2)
Question 34 Explanation:
I. Given an array in ascending order, the recurrence for the total number of comparisons in Quicksort is T(n) = T(n-1) + O(n) (the partition step takes O(n) comparisons in any case), which solves to O(n^2). TRUE.
II. Bubble Sort runs in Θ(n^2) time: with a small modification to the inner for loop (a flag recording whether any swap happened in a pass), Bubble Sort can declare the array sorted as soon as a pass completes with no swaps, taking O(n) time in the best case. So the Θ(n^2) claim does not hold. FALSE.
III. Merge Sort runs in Θ(n) time: Merge Sort relies on the divide and conquer paradigm and has no special best or worst case input. For any sequence the recurrence is T(n) = 2T(n/2) + Θ(n) (merging copies the entire array), which gives Θ(nLogn). FALSE.
IV. Insertion sort runs in Θ(n) time: since the array is sorted, each newly considered element is greater than all elements of the sorted prefix, so there are no swaps and only a single comparison per element. In n-1 passes we have 0 swaps and n-1 comparisons, for a total of O(n). TRUE.
This solution is contributed by Pranjul Ahuja.

In summary, for an array already sorted in ascending order: Quicksort has complexity Θ(n^2) [worst case]; Bubblesort has complexity Θ(n) [best case, with the no-swap flag]; Mergesort has complexity Θ(n log n) [any case]; Insertion sort has complexity Θ(n) [best case].
 Question 35
Assume that we use Bubble Sort to sort n distinct elements in ascending order. When does the best case of Bubble Sort occur?
 A When elements are sorted in ascending order B When elements are sorted in descending order C When elements are not sorted by any order D There is no best case for Bubble Sort. It always takes O(n*n) time
Sorting    BubbleSort
 Question 36
If we use Radix Sort to sort n integers in the range (n^(k/2), n^k], for some k > 0 which is independent of n, the time taken would be?
 A Θ(n) B Θ(kn) C Θ(nLogn) D Θ(n^2)
Analysis of Algorithms    Sorting    RadixSort    Gate IT 2008
Discuss it

Question 36 Explanation:
Radix sort time complexity = O(wn)
for n keys of word size= w
=>w = log(nk)
O(wn)=O(klogn.n)
=> kO(nlogn)
 Question 37
Consider an array of elements arr[5] = {5,4,3,2,1}. What are the steps of insertion performed while doing insertion sort on the array?
 A 4 5 3 2 1 3 4 5 2 1 2 3 4 5 1 1 2 3 4 5 B 5 4 3 1 2 5 4 1 2 3 5 1 2 3 4 1 2 3 4 5 C 4 3 2 1 5 3 2 1 5 4 2 1 5 4 3 1 5 4 3 2 D 4 5 3 2 1 2 3 4 5 1 3 4 5 2 1 1 2 3 4 5
Sorting    InsertionSort
Question 37 Explanation:
In insertion sort, imagine that the first element is already sorted and all the elements to its right are unsorted; we insert the elements one by one, left to right, into the sorted prefix.
Sorted: 5        Unsorted: 4 3 2 1
With 4 as the key, it is inserted before 5, and the array becomes
Sorted: 4 5      Unsorted: 3 2 1
Similarly, in each step the key is the next unsorted value; it is compared against the sorted prefix and inserted into its proper position.
 Question 38
Which is the correct order of the following algorithms with respect to their time complexity in the best case?
 A Merge sort > Quick sort > Insertion sort > selection sort B insertion sort < Quick sort < Merge sort < selection sort C Merge sort > selection sort > quick sort > insertion sort D Merge sort > Quick sort > selection sort > insertion sort
Sorting    QuickSort    SelectionSort    InsertionSort
Question 38 Explanation:
In the best case,

Quick sort: O(nLogn)
Merge sort: O(nLogn)
Insertion sort: O(n)
Selection sort: O(n^2)
 Question 39
Which of the following statements is correct with respect to insertion sort?
*Online - can sort a list at runtime
*Stable - doesn't change the relative
order of elements with equal keys. 
 A Insertion sort is stable, online but not suited well for large number of elements. B Insertion sort is unstable and online C Insertion sort is online and can be applied to more than 100 elements D Insertion sort is stable & online and can be applied to more than 100 elements
Sorting    InsertionSort
Question 39 Explanation:
The time taken by the algorithm is good for a small number of elements but increases quadratically for a large number of elements, so insertion sort is stable and online but not well suited to large inputs.
 Question 40
Consider the array A[] = {6,4,8,1,3} and apply insertion sort to sort it. If the cost associated with each insertion step is 25 rupees, what is the total cost of the insertion sort by the time element 1 reaches the first position of the array?
 A 50 B 25 C 75 D 100
Sorting    Arrays    InsertionSort
Question 40 Explanation:
When element 1 reaches the first position of the array, only two insertion steps have been performed, hence 25 * 2 = 50 rupees. Step 1: 4 6 8 1 3. Step 2: 1 4 6 8 3.
 Question 41
The auxiliary space of insertion sort is O(1). What does O(1) mean?
 A The memory (space) required to process the data is not constant. B It means the amount of extra memory Insertion Sort consumes doesn't depend on the input. The algorithm should use the same amount of memory for all inputs. C It takes only 1 kb of memory . D It is the speed at which the elements are traversed.
Analysis of Algorithms    Sorting    InsertionSort
Question 41 Explanation:
The term O(1) states that the space required by the insertion sort is constant i.e., space required doesn't depend on input.
 Question 42
What is the best sorting algorithm to use, in general, when the array has more than 1 million elements?
 A Merge sort. B Bubble sort. C Quick sort. D Insertion sort.
Sorting    QuickSort    InsertionSort    MergeSort
Question 42 Explanation:
Most practical implementations of Quick Sort use a randomized version. The randomized version has an expected time complexity of O(nLogn). The worst case is possible in the randomized version as well, but it doesn't occur for a particular pattern (like a sorted array), and randomized Quick Sort works well in practice. Quick Sort is also a cache friendly sorting algorithm, as it has good locality of reference when used on arrays. Quick Sort is also tail recursive, therefore tail call optimizations can be done.
 Question 43
Which of the below sorting techniques has the highest best-case runtime complexity?
 A Quick sort B Selection sort C Insertion sort D Bubble sort
Sorting    GATE 2017 Mock
Question 43 Explanation:
Quick sort best case time complexity is O(nLogn)
Selection sort best case time complexity is O(n^2)
Insertion sort best case time complexity is O(n)
Bubble sort best case time complexity is O(n)
There are 43 questions to complete.
