


Bubble Sort

The Bubble Sort compares each successive pair of elements in an unordered list and swaps (inverts) the elements if they are not in order. The following example illustrates the bubble sort.

The list "5,4,9,7,2,8" needs to be sorted from lowest to highest. The following steps are carried out:


  • Pass 1

    • First (5) and second (4) elements compared. 4 is less than 5, so the elements are inverted.

      • Current list: 4,5,9,7,2,8

    • Second (now 5) and third (9) elements compared. They are in the correct order, so no change is made.

      • Current list: 4,5,9,7,2,8

    • 9 and 7 compared and inverted

      • Current list: 4,5,7,9,2,8

    • 9 and 2 compared and inverted

      • Current list: 4,5,7,2,9,8

    • 9 and 8 compared and inverted

      • Current list: 4,5,7,2,8,9

  • Pass 2

    • Same method used in Pass 1 applied to current list (4,5,7,2,8,9)

    • Resultant list: 4,5,2,7,8,9

  • Pass 3

    • Resultant list: 4,2,5,7,8,9

  • Pass 4

    • Resultant list: 2,4,5,7,8,9

  • Pass 5

    • No changes would be made in this pass, so the algorithm has reached its termination point and the list is sorted

It is worth noting that at the end of Pass 1 the largest element (9) has been moved to the end of the list. Similarly, in Pass 2, the second largest element (8) has been moved to the second last position in the list and so on. This pattern always occurs in the Bubble Sort.

Related to this point, it is also noteworthy that Pass 2 only needs to deal with n-1 elements (where n is the number of elements in the original list). Pass 3 only needs to deal with n-2 and so on. This is because at the end of each pass, another element has been put in its correct position at the end of the list. This leaves one less element that needs to be sorted for the next pass.



Best and Worst Case Analysis

The worst case for the number of passes is n-1. The best case is 1 pass, which occurs if the list is already sorted: the list still needs to be scanned through once to confirm that no swaps need to be made.

There are many different types of sorts that can be carried out both in the real world and computationally. One common type is the Selection Sort, which is covered in more detail later. The Selection Sort algorithm iterates over the values in an unordered list, successively moving the lowest remaining value to the front of the list. Each time an element is moved to the head of the list (or array) it is ignored by the following iterations; on the next iteration, the lowest remaining value is moved to the position immediately after the previously placed value.

Returning to the bubble sort, it can be expressed in pseudocode as follows:

procedure bubbleSort( A : list of sortable items ) defined as:
  do
    swapped := false
    for each i in 0 to length( A ) - 2 do:
      if A[ i ] > A[ i + 1 ] then
        swap( A[ i ], A[ i + 1 ] )
        swapped := true
      end if
    end for
  while swapped
end procedure
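
As noted above, after each pass the largest remaining element has already reached its final position, so the range scanned by each pass can be shortened by one element. The following is a minimal C sketch combining the swapped flag from the pseudocode above with this shrinking range; the function name and the driver are illustrative additions, not taken from the text.

#include <stdio.h>
#include <stdbool.h>

void bubbleSort(int a[], int n)
{
    bool swapped;
    do {
        swapped = false;
        for (int i = 0; i < n - 1; i++) {
            if (a[i] > a[i + 1]) {
                int temp = a[i];        /* swap a[i] and a[i + 1] */
                a[i] = a[i + 1];
                a[i + 1] = temp;
                swapped = true;
            }
        }
        n--;                            /* the largest remaining element is now in place */
    } while (swapped);
}

int main(void)
{
    int list[] = { 5, 4, 9, 7, 2, 8 };  /* the example list from the text */
    bubbleSort(list, 6);
    for (int i = 0; i < 6; i++)
        printf("%d ", list[i]);         /* prints: 2 4 5 7 8 9 */
    printf("\n");
    return 0;
}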

Insertion sort is a simple sorting algorithm, a comparison sort in which the sorted array (or list) is built one entry at a time. It is much less efficient on large lists than more advanced algorithms such as quicksort, heapsort, or merge sort, but it has various advantages:


  • Simple to implement

  • Efficient on (quite) small data sets

  • Efficient on data sets which are already substantially sorted: it runs in O(n + d) time, where d is the number of inversions

  • More efficient in practice than most other simple O(n²) algorithms such as selection sort or bubble sort: the average time is n²/4 and it is linear in the best case

  • Stable (does not change the relative order of elements with equal keys)

  • In-place (only requires a constant amount O(1) of extra memory space)

  • It is an online algorithm, in that it can sort a list as it receives it.

An insertion sort processes a single value in each step. An ordered list is maintained throughout the sort process and a new element is added to that list at each step. After each step, the inserted element is in its correct place in the ordered list. The following example shows how the insertion sort works:

The list "9,7,3,5,6" needs to be sorted from lowest to highest. The following steps are carried out:



  • Step 1 - get first value from list (9)

  • Step 2 - get second value from list (7) and compare to first value (9)

    • If second value smaller than first value, place second value in front of first value (7,9)

  • Step 3 - get third value from list (3) and compare to tail value (9) of current ordered list (7,9)

    • If new element (3) smaller than tail value (9), place new element in front of tail value (7,3,9)

    • If new element (3) smaller than first value (7), place new element in front of first value (3,7,9)

  • Step 4 - get fourth value from list (5) and compare to each element in current ordered list (3,7,9) in a similar fashion to Step 3

    • New ordered list is (3,5,7,9)

  • Step 5 - get fifth value from list (6) and compare to each element in current ordered list (3,5,7,9) in a similar fashion to Step 3

    • When 6 is compared to 5, 6 is placed in the list to create the final sorted list: 3,5,6,7,9

Note that in the last step, 6 does not need to be compared to 3. This is because it is known that any value before 5 will be less than 5 due to the previous steps being carried out.

In abstract terms, every iteration of an insertion sort removes an element from the input data, inserting it at the correct position in the already sorted list, until no elements are left in the input. The choice of which element to remove from the input is arbitrary and can be made using almost any choice algorithm.

Sorting is typically done in-place. The resulting array after k iterations contains the first k entries of the input array and is sorted. In each step, the first remaining entry of the input, x, is removed and inserted into the result at the right position, thus extending the result; each element greater than x is copied one place to the right as it is compared against x.

The most common variant, which operates on arrays, can be described as:


  1. Suppose we have a method called insert designed to insert a value into a sorted sequence at the beginning of an array. It operates by starting at the end of the sequence and shifting each element one place to the right until a suitable position is found for the new element. It has the side effect of overwriting the value stored immediately after the sorted sequence in the array.

  2. To perform insertion sort, start at the left end of the array and invoke insert to insert each element encountered into its correct position. The ordered sequence into which we insert it is stored at the beginning of the array in the set of indexes already examined. Each insertion overwrites a single value, but this is okay because it's the value we're inserting.

A simple pseudocode version of the complete algorithm follows, where the arrays are zero-based:

insert(array a, int length, value) {
  int i = length - 1;
  while (i >= 0 && a[i] > value) {
    a[i + 1] = a[i];      // shift larger elements one place to the right
    i = i - 1;
  }
  a[i + 1] = value;       // drop the value into the gap
}

insertionSort(array a, int length) {
  int i = 0;
  while (i < length) {
    insert(a, i, a[i]);   // a[0..i-1] is already sorted; insert a[i] into it
    i = i + 1;
  }
}
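
The pseudocode above translates almost directly into C. The following is a runnable sketch, run on the example list from the text; the driver and the static qualifier are illustrative additions.

#include <stdio.h>

static void insert(int a[], int length, int value)
{
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];            /* shift larger elements one place to the right */
        i = i - 1;
    }
    a[i + 1] = value;               /* drop the value into the gap */
}

void insertionSort(int a[], int length)
{
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);         /* a[0..i-1] is already sorted; insert a[i] into it */
        i = i + 1;
    }
}

int main(void)
{
    int list[] = { 9, 7, 3, 5, 6 }; /* the example list from the text */
    insertionSort(list, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", list[i]);     /* prints: 3 5 6 7 9 */
    printf("\n");
    return 0;
}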



Selection Sort

The following example shows the steps taken in a selection sort. On each line, the leading numbers are those which have already been moved to the front of the list after being found to be the lowest remaining value on each iteration.



  • 5 7 3 9 2 1
    --> 1 5 7 3 9 2
    --> 1 2 5 7 3 9
    --> 1 2 3 5 7 9
    --> 1 2 3 5 7 9
    --> 1 2 3 5 7 9
    --> 1 2 3 5 7 9

Another example




  • 31 25 12 22 11

  • 11 25 12 22 31

  • 11 12 25 22 31

  • 11 12 22 25 31

Procedure

void selectionSort(int a[], int size)
{
  int i, j, min;

  for (i = 0; i < size - 1; i++)
  {
    min = i;                /* assume the element at i is the lowest so far */

    for (j = i + 1; j < size; j++)
      if (a[j] < a[min])
        min = j;            /* remember the index of the lowest remaining element */

    int temp = a[i];        /* swap a[i] and a[min] */
    a[i] = a[min];
    a[min] = temp;
  }
}
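
A self-contained sketch combining the procedure above with a small driver, run on the second example list; the explicit swap and the driver are illustrative additions.

#include <stdio.h>

static void selectionSort(int a[], int size)
{
    for (int i = 0; i < size - 1; i++) {
        int min = i;                     /* index of the lowest value found so far */
        for (int j = i + 1; j < size; j++)
            if (a[j] < a[min])
                min = j;
        int temp = a[i];                 /* swap the lowest value into position i */
        a[i] = a[min];
        a[min] = temp;
    }
}

int main(void)
{
    int list[] = { 31, 25, 12, 22, 11 }; /* the second example list from the text */
    selectionSort(list, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", list[i]);          /* prints: 11 12 22 25 31 */
    printf("\n");
    return 0;
}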



Performance

Selection sort is very easy to analyze since none of the loops depend on the data in the array. Selecting the lowest element requires scanning all n elements (this takes n - 1 comparisons) and then swapping it into the first position. Finding the next lowest element requires scanning the remaining n - 1 elements and so on, for a total of (n - 1) + (n - 2) + ... + 2 + 1 = Θ(n²) comparisons. Each of these scans is followed by one swap, for a total of n - 1 swaps (the final element is already in place). Thus, the comparisons dominate the running time, which is Θ(n²).

Among simple average-case Θ(n²) algorithms, selection sort always outperforms bubble sort, but is generally outperformed by insertion sort.

Insertion sort is very similar in that after the kth iteration, the first k elements in the array are in sorted order. Insertion sort's advantage is that it only scans as many elements as it needs to in order to place the (k + 1)st element, while selection sort must scan all remaining elements to find the (k + 1)st element.



Merge Sort

The Merge Sort is a recursive "divide and conquer" algorithm which merges two sorted lists at each step. As with the insertion sort, a single element is viewed as a sorted list in its own right; the Merge Sort therefore breaks a list down into individual elements, each of which is a "sorted" list that can be merged with other sorted lists.

In the example of an unsorted list of 4 elements, each of the elements is first separated from the others. The first step of the algorithm then merges the first two elements together so that they are in order, and the 3rd and 4th elements are merged in the same way. This leaves two sorted lists of 2 elements each, which can then be merged, resulting in the 4 elements being sorted in a single list.

The time complexity of the Merge Sort is O(N log N). The Merge Sort is a relatively fast algorithm, but it has the disadvantage of requiring a significant amount of extra memory (on the order of N) for the temporary lists used during merging.

Conceptually, merge sort works as follows:


  1. Divide the unsorted list into two sublists of about half the size

  2. Sort each of the two sublists recursively until we have list sizes of length 1, in which case the list itself is returned

  3. Merge the two sorted sublists back into one sorted list.

Mergesort incorporates two main ideas to improve its runtime:

  1. A small list will take fewer steps to sort than a large list.

  2. Fewer steps are required to construct a sorted list from two sorted lists than two unsorted lists. For example, you only have to traverse each list once if they're already sorted (see the merge function below for an example implementation).


Example of Merge Sort



function mergesort(m)
  var list left, right, result
  if length(m) ≤ 1
    return m
  else
    middle = length(m) / 2
    for each x in m up to middle
      add x to left
    for each x in m after middle
      add x to right
    left = mergesort(left)
    right = mergesort(right)
    result = merge(left, right)
    return result

There are several variants of the merge() function; the simplest could look like this:



function merge(left, right)
  var list result
  while length(left) > 0 and length(right) > 0
    if first(left) ≤ first(right)
      append first(left) to result
      left = rest(left)
    else
      append first(right) to result
      right = rest(right)
  if length(left) > 0
    append left to result
  if length(right) > 0
    append right to result
  return result
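
The pseudocode above works on lists. The following is a C sketch of an array-based merge sort with the same structure, using an explicitly allocated temporary buffer (the O(N) extra memory mentioned earlier); the function names and the driver are illustrative, not taken from the text.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Merge the two sorted halves a[lo..mid-1] and a[mid..hi-1] into order using tmp. */
static void mergeHalves(int a[], int lo, int mid, int hi, int tmp[])
{
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];   /* take the smaller head element */
    while (i < mid)
        tmp[k++] = a[i++];                             /* copy any leftovers */
    while (j < hi)
        tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo) * sizeof(int));
}

static void mergeSortRange(int a[], int lo, int hi, int tmp[])
{
    if (hi - lo <= 1)
        return;                          /* a list of 0 or 1 elements is already sorted */
    int mid = lo + (hi - lo) / 2;        /* divide into two halves */
    mergeSortRange(a, lo, mid, tmp);     /* sort each half recursively */
    mergeSortRange(a, mid, hi, tmp);
    mergeHalves(a, lo, mid, hi, tmp);    /* merge the two sorted halves */
}

void mergeSort(int a[], int n)
{
    int *tmp = malloc((size_t)n * sizeof(int));   /* the O(N) auxiliary buffer */
    if (tmp == NULL)
        return;
    mergeSortRange(a, 0, n, tmp);
    free(tmp);
}

int main(void)
{
    int list[] = { 5, 4, 9, 7, 2, 8 };
    mergeSort(list, 6);
    for (int i = 0; i < 6; i++)
        printf("%d ", list[i]);          /* prints: 2 4 5 7 8 9 */
    printf("\n");
    return 0;
}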

Quick Sort

Quicksort is a well-known sorting algorithm developed by C. A. R. Hoare that, on average, makes Θ(n log n) comparisons to sort n items. However, in the worst case, it makes Θ(n²) comparisons. Typically, quicksort is significantly faster in practice than other Θ(n log n) algorithms, because its inner loop can be efficiently implemented on most architectures, and in most real-world data it is possible to make design choices which minimize the possibility of requiring quadratic time. Quicksort is a comparison sort and, in efficient implementations, is not a stable sort.

The Quick Sort has O(N log N) average case time complexity. If implemented so that the pivot point is always set to the first element of the current list, the worst case time complexity is O(N²) (for example, on a list that is already sorted). The Quick Sort has the advantage of being a very fast algorithm, but it is also relatively complex due to its heavy use of recursion.

The Quick Sort is a recursive algorithm which is very similar to the Merge Sort. It takes an element in a list and uses that element as a pivot value to divide the whole list into two parts. This division is done so that one of the parts contains elements less than the pivot. The other part contains elements that are larger than the pivot. This process is then repeated recursively on these two resultant parts. When the algorithm arrives at a single element to be sorted, nothing is done as this single element is "sorted".

Quicksort sorts by employing a divide and conquer strategy to divide a list into two sub-lists.

The steps are:


  1. Pick an element, called a pivot, from the list.

  2. Reorder the list so that all elements which are less than the pivot come before the pivot and so that all elements greater than the pivot come after it (equal values can go either way). After this partitioning, the pivot is in its final position. This is called the partition operation.

  3. Recursively sort the sub-list of lesser elements and the sub-list of greater elements.

The base case of the recursion is a list of size zero or one, which is always sorted. The algorithm always terminates because it puts at least one element (the pivot) in its final place on each iteration (the loop invariant).

In simple pseudocode, the algorithm might be expressed as:



function quicksort(q)
  var list less, pivotList, greater
  if length(q) ≤ 1
    return q
  select a pivot value pivot from q
  for each x in q
    if x < pivot then add x to less
    if x = pivot then add x to pivotList
    if x > pivot then add x to greater
  return concatenate(quicksort(less), pivotList, quicksort(greater))

Notice that we only examine elements by comparing them to other elements. This makes quicksort a comparison sort.

Version with in-place partition



[Figure: in-place partition in action on a small list. The highlighted (boxed) element is the pivot; elements less than or equal to the pivot end up on one side and larger elements on the other.]

The disadvantage of the simple version above is that it requires Ω(n) extra storage space, which is as bad as merge sort.

The additional memory allocations required can also drastically impact speed and cache performance in practical implementations. There is a more complicated version which uses an in-place partition algorithm and can achieve O(log n) space use on average for good pivot choices; a sketch is given below.
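
A C sketch of such an in-place version, assuming an int array. It follows the same scheme as the worked example below: the pivot is the first element of the range, i scans right for an element greater than the pivot, j scans left for an element less than the pivot, out-of-place pairs are swapped, and the pivot is finally swapped into position j. The function names and the driver are illustrative additions.

#include <stdio.h>

/* Partition a[lo..hi] around the pivot a[lo]; returns the pivot's final index. */
static int partition(int a[], int lo, int hi)
{
    int pivot = a[lo];
    int i = lo, j = hi + 1;

    for (;;) {
        while (a[++i] < pivot)          /* move i right until a(i) >= pivot */
            if (i == hi) break;
        while (a[--j] > pivot)          /* move j left until a(j) <= pivot */
            if (j == lo) break;
        if (i >= j)                     /* the pointers have crossed */
            break;
        int temp = a[i];                /* swap a(i) and a(j) */
        a[i] = a[j];
        a[j] = temp;
    }

    int temp = a[lo];                   /* swap the pivot into its final position j */
    a[lo] = a[j];
    a[j] = temp;
    return j;
}

void quicksortInPlace(int a[], int lo, int hi)
{
    if (lo >= hi)                       /* zero or one element: already sorted */
        return;
    int p = partition(a, lo, hi);
    quicksortInPlace(a, lo, p - 1);     /* sort the elements left of the pivot */
    quicksortInPlace(a, p + 1, hi);     /* sort the elements right of the pivot */
}

int main(void)
{
    int list[] = { 8, 2, 5, 7, 9, 4, 16, 18, 3, 6, 11, 23, 10 };   /* list from the example below */
    quicksortInPlace(list, 0, 12);
    for (int i = 0; i < 13; i++)
        printf("%d ", list[i]);         /* prints: 2 3 4 5 6 7 8 9 10 11 16 18 23 */
    printf("\n");
    return 0;
}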

Example Quicksort

Starting with the following list (indices 0 to 12), partition around the first element:

8, 2, 5, 7, 9, 4, 16, 18, 3, 6, 11, 23, 10

1. Select the pivot a(0), i.e. 8. Initially i is at the left end of the list and j is at the right end (index 12).

2. Move i to the right until a(i) > pivot (i = 4, a(i) = 9) and move j to the left until a(j) < pivot (j = 9, a(j) = 6).

3. Swap a(i) and a(j):

8, 2, 5, 7, 6, 4, 16, 18, 3, 9, 11, 23, 10

4. Move i to the right until a(i) > pivot (i = 6, a(i) = 16) and move j to the left until a(j) < pivot (j = 8, a(j) = 3).

5. Swap a(i) and a(j):

8, 2, 5, 7, 6, 4, 3, 18, 16, 9, 11, 23, 10

6. Move i to the right until a(i) > pivot (i = 7, a(i) = 18) and move j to the left until a(j) < pivot (j = 6, a(j) = 3). The pointers have now crossed.

7. j < i, so swap a(j) with the pivot. The pivot 8 is now in its final position (index 6):

3, 2, 5, 7, 6, 4, 8, 18, 16, 9, 11, 23, 10

8. Quicksort to the left of the pivot, i.e. the sublist 3, 2, 5, 7, 6, 4 (indices 0 to 5).

Partition around the new pivot 3: i moves right until a(i) > 3 (i = 2, a(i) = 5) and j moves left until a(j) < 3 (j = 1, a(j) = 2). The pointers have crossed, so swap a(j) with the pivot:

2, 3, 5, 7, 6, 4

Getting our first sorted left partition:

2, 3

Quicksort the right-hand side, 5, 7, 6, 4, with pivot 5. Partition around 5: i moves right until a(i) > 5 (i = 1, a(i) = 7) and j moves left until a(j) < 5 (j = 3, a(j) = 4), then swap them:

5, 4, 6, 7

i then moves right to a(2) = 6 and j moves left to a(1) = 4; the pointers have crossed, so swap a(j) with the pivot:

4, 5, 6, 7

This gives us the left-hand side 4, 5. Now quicksort the right-hand side, 6, 7: with pivot 6, the pointers meet without finding any elements out of order, so this sublist is already sorted.

So from the original left-hand side around pivot 8 we now have (indices 0 to 6):

2, 3, 4, 5, 6, 7, 8

A sorted list.

Go back now and sort out the right-hand side around pivot 8, i.e. 18, 16, 9, 11, 23, 10 (indices 7 to 12). Selecting pivots, partitioning and quicksorting the sides, we get the following sequence of steps.

Pivot 18: i moves right until a(i) > 18 (the 23) and j moves left until a(j) < 18 (the 10); swap them:

18, 16, 9, 11, 10, 23

i then moves right to the 23 and j moves left to the 10; the pointers have crossed, so swap a(j) with the pivot:

10, 16, 9, 11, 18, 23

The pivot 18 is now in place, and the single element 23 to its right is already sorted.

Quicksort the left part, 10, 16, 9, 11, with pivot 10: i moves right to the 16 and j moves left to the 9; swap them:

10, 9, 16, 11

i then moves right to the 16 and j moves left to the 9; the pointers have crossed, so swap a(j) with the pivot:

9, 10, 16, 11

Quicksort the remaining part, 16, 11, with pivot 16: the pointers cross immediately and the pivot is swapped with the 11:

11, 16

This gives us a sorted right-hand side (indices 7 to 12):

9, 10, 11, 16, 18, 23

Putting both sorted sides together we get:

2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 16, 18, 23

QED


