
Thursday 1 October 2015

Which is the best sorting algorithm?

In this blog article we are going to discuss which sorting algorithm is the best. The short answer is Quicksort, so let's talk about why Quicksort is usually the best choice.

Quicksort is usually the fastest in practice
Although Quicksort has worst-case O(n²) behaviour, it is usually fast: assuming random pivot selection, there is a very good chance that the pivot separates the input into two similarly sized subsets, which is exactly what we want. In particular, even if we only get a 10%/90% split once every 10 partitions (a mediocre split) and the worst possible split of 1 element versus n−1 elements the rest of the time, the running time is still O(n log n) (though the constants blow up to the point where merge sort would probably be faster).
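To make the random-pivot idea concrete, here is a minimal C# sketch of Quicksort with a randomly chosen pivot (the class and method names are just illustrative, not from any library):

```csharp
using System;

class RandomizedQuickSortDemo
{
    static readonly Random Rng = new Random();

    // Sorts arr[lo..hi] in place using a randomly chosen pivot.
    static void QuickSort(int[] arr, int lo, int hi)
    {
        if (lo >= hi) return;
        int p = Partition(arr, lo, hi);
        QuickSort(arr, lo, p - 1);
        QuickSort(arr, p + 1, hi);
    }

    // Lomuto partition with a random pivot: swap a random element
    // into the last position, then partition around it.
    static int Partition(int[] arr, int lo, int hi)
    {
        int pivotIndex = Rng.Next(lo, hi + 1);
        Swap(arr, pivotIndex, hi);
        int pivot = arr[hi];
        int i = lo;
        for (int j = lo; j < hi; j++)
        {
            if (arr[j] <= pivot)
            {
                Swap(arr, i, j);
                i++;
            }
        }
        Swap(arr, i, hi);
        return i;
    }

    static void Swap(int[] arr, int a, int b)
    {
        int tmp = arr[a];
        arr[a] = arr[b];
        arr[b] = tmp;
    }

    static void Main()
    {
        int[] data = { 9, 3, 7, 1, 8, 2, 5 };
        QuickSort(data, 0, data.Length - 1);
        Console.WriteLine(string.Join(", ", data)); // 1, 2, 3, 5, 7, 8, 9
    }
}
```

Picking the pivot at random is what makes the bad splits described above unlikely on any particular input.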
Quicksort is also usually faster than sorts that are slower than O(n log n) (say, insertion sort with its O(n²) running time), simply because for large n their running times explode. A good reason why Quicksort is so fast in practice compared to most other O(n log n) algorithms, such as heap sort, is that it is relatively cache-efficient: partitioning scans the array sequentially, so in the external-memory model its running time is O((n/B) log(n/B)), where B is the block size. Heap sort, on the other hand, gets no such speedup: it does not access memory in a cache-friendly way at all.
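For contrast, here is a minimal in-place heap sort sketch in C# (again, the names are just illustrative). The comments point out the parent/child index jumps in the sift-down step, which are what make heap sort's memory access pattern far less sequential than Quicksort's partition scan:

```csharp
using System;

class HeapSortDemo
{
    // In-place heap sort. Note the index arithmetic in SiftDown:
    // the node at index i touches its children at 2*i + 1 and 2*i + 2,
    // so for large arrays successive reads jump far apart in memory,
    // which is the cache-unfriendly behaviour mentioned above.
    static void HeapSort(int[] arr)
    {
        int n = arr.Length;
        // Build a max-heap.
        for (int i = n / 2 - 1; i >= 0; i--)
            SiftDown(arr, i, n);
        // Repeatedly move the maximum to the end and restore the heap.
        for (int end = n - 1; end > 0; end--)
        {
            (arr[0], arr[end]) = (arr[end], arr[0]);
            SiftDown(arr, 0, end);
        }
    }

    static void SiftDown(int[] arr, int root, int length)
    {
        while (true)
        {
            int child = 2 * root + 1;   // left child: a long jump for large heaps
            if (child >= length) return;
            if (child + 1 < length && arr[child + 1] > arr[child])
                child++;                // pick the larger child
            if (arr[root] >= arr[child]) return;
            (arr[root], arr[child]) = (arr[child], arr[root]);
            root = child;
        }
    }

    static void Main()
    {
        int[] data = { 9, 3, 7, 1, 8, 2, 5 };
        HeapSort(data);
        Console.WriteLine(string.Join(", ", data)); // 1, 2, 3, 5, 7, 8, 9
    }
}
```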

The following table compares common sorting algorithms on time, space, and stability; the algorithms with higher constant factors appear first, though this is clearly implementation-dependent and should only be taken as a rough guide when picking between sorts of the same big-O efficiency.

Sort | Time (average) | Time (best) | Time (worst) | Space | Stability | Remarks
---- | -------------- | ------------ | ------------- | ----- | --------- | -------
Bubble sort | O(n^2) | O(n^2) | O(n^2) | Constant | Stable | Always use a modified bubble sort instead
Modified bubble sort | O(n^2) | O(n) | O(n^2) | Constant | Stable | Stops early once a pass makes no swaps (see the sketch below)
Selection sort | O(n^2) | O(n^2) | O(n^2) | Constant | Unstable | Even a perfectly sorted input requires scanning the entire array
Insertion sort | O(n^2) | O(n) | O(n^2) | Constant | Stable | In the best case (already sorted), every insert takes constant time
Heap sort | O(n log n) | O(n log n) | O(n log n) | Constant | Unstable | By using the input array as storage for the heap, constant extra space is achieved
Merge sort | O(n log n) | O(n log n) | O(n log n) | Depends | Stable | On arrays it requires O(n) extra space; on linked lists it requires constant space
Quicksort | O(n log n) | O(n log n) | O(n^2) | O(log n) | Unstable | Randomly picking a pivot (or shuffling the array before sorting) helps avoid worst cases such as an already-sorted array
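To illustrate the "Modified bubble sort" row above, here is a small C# sketch (illustrative names, not a library API) of the early-exit check that gives the O(n) best case on already-sorted input:

```csharp
using System;

class ModifiedBubbleSortDemo
{
    // Bubble sort with an early-exit flag: if a full pass makes no
    // swaps the array is already sorted, giving the O(n) best case
    // listed in the table above.
    static void BubbleSort(int[] arr)
    {
        for (int pass = 0; pass < arr.Length - 1; pass++)
        {
            bool swapped = false;
            for (int j = 0; j < arr.Length - 1 - pass; j++)
            {
                if (arr[j] > arr[j + 1])
                {
                    (arr[j], arr[j + 1]) = (arr[j + 1], arr[j]);
                    swapped = true;
                }
            }
            if (!swapped) break; // no swaps: already sorted, stop early
        }
    }

    static void Main()
    {
        int[] data = { 2, 1, 3, 4, 5 };
        BubbleSort(data);
        Console.WriteLine(string.Join(", ", data)); // 1, 2, 3, 4, 5
    }
}
```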
