Lecture 6
Algorithms for sorting, and their analysis
Sorting rearranges a sequence into order: every item in the input sequence must appear in the output sequence, the same number of times. Sorting is a very common operation in computer systems, so it has been studied intensely, and there are many different algorithms for sorting a sequence.
Selection sort
 Find the position of the minimum in the array
 Swap it with the (current) start position
 Sort the rest of the array
for (int p = 0; p < N - 1; p++)
{
find the minimum (at position m) in A[p...N-1];
swap A[p] with A[m];
}
Finding the minimum is O(N), and it is done N times. Therefore, selection sort is O(N^{2}).
for (int p = 0; p < N - 1; p++)
{
min = A[p]; m = p;
for (int j = p + 1; j <= N - 1; j++)
if (A[j] < min) {
min = A[j];
m = j;
}
swap A[p] with A[m];
}
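As a minimal runnable sketch of the selection sort above (the function name and the explicit swap code are illustrative additions, not from the lecture):

```c
/* Selection sort: for each position p, find the minimum of A[p..N-1]
   and swap it into place. */
void selection_sort(int A[], int N) {
    for (int p = 0; p < N - 1; p++) {
        int m = p;                       /* position of the minimum so far */
        for (int j = p + 1; j <= N - 1; j++)
            if (A[j] < A[m])
                m = j;
        int tmp = A[p];                  /* swap A[p] with A[m] */
        A[p] = A[m];
        A[m] = tmp;
    }
}
```

For example, calling selection_sort(A, 5) on {5, 2, 9, 1, 7} yields {1, 2, 5, 7, 9}.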
Bubble sort
do {
sorted = true;
for (int i = 0; i < N - 1; i++)
if(A[i] > A[i + 1]) {
swapAt(i, i + 1);
sorted = false;
}
} while (!sorted);
The inner for loop is O(N). The best case is when A was already sorted, so the outer loop runs once: O(N). The worst case is when A is in reverse order, so the outer loop runs N times; on average, perhaps N / 2 passes.
Therefore, bubble sort has a complexity of O(N^{2}).
for (;;) {
for (int i = 0; i < N - 1; i++) {
if (A[i] > A[i + 1]) {
swapAt(i, i + 1);
goto outer;
}
}
break;
outer:;
}
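The flag-based bubble sort above can be written as a self-contained C function; a minimal sketch, with the lecture's swapAt() helper inlined as an explicit swap:

```c
#include <stdbool.h>

/* Bubble sort: repeatedly sweep the array, swapping adjacent
   out-of-order pairs, until a full pass makes no swaps. */
void bubble_sort(int A[], int N) {
    bool sorted;
    do {
        sorted = true;
        for (int i = 0; i < N - 1; i++)
            if (A[i] > A[i + 1]) {
                int tmp = A[i];          /* swap the out-of-order pair */
                A[i] = A[i + 1];
                A[i + 1] = tmp;
                sorted = false;          /* another pass is needed */
            }
    } while (!sorted);
}
```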
Insertion sort
Build up a new sorted list, one element at a time. Always maintain the order of the result list B.
for (int i = 0; i < N; i++) {
item = A[i];
p = 0;
while (p < i && B[p] <= item)
p++;
<insert item at position p in B>;
}
item = A[i] is the next item to be inserted into B, and i is the length of the current sorted result. Each item from A is inserted into the correct position in B, at or before position i. The while loop searches for the insertion position p in B: starting from p = 0, it finds the location of the first item in B which is > item, or the end of B.
Array list insertion
To find the position in the sorted array for a new value, and insert it:
 Find position p of the new item (the inner while loop): p comparisons
 Insert at position p. Note: the remainder of the array (i - p elements) must be moved up
This operation is O(N).
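The array-based insertion sort above can be sketched as runnable C. The shifting loop makes the <insert item at position p in B> step explicit; that shift of the remaining i - p elements is the O(N) cost just discussed (the function name is illustrative):

```c
/* Insertion sort that builds a separate sorted array B from A. */
void insertion_sort(const int A[], int B[], int N) {
    for (int i = 0; i < N; i++) {
        int item = A[i];
        int p = 0;
        while (p < i && B[p] <= item)    /* find insertion position p */
            p++;
        for (int k = i; k > p; k--)      /* shift B[p..i-1] up by one */
            B[k] = B[k - 1];
        B[p] = item;                     /* insert item at position p */
    }
}
```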
Linked list insertion
Accessing a random position p in a linked list is O(N), but using setAtStart() and getNext() to scan for the position of the new value is also O(N): finding the position and inserting the new value takes N / 2 accesses on average, which is O(N).
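Sorted insertion into a singly linked list can be sketched as follows: step node by node (the getNext()-style access described above) until the first node whose value is > item, then splice in a new node in O(1). The struct and function names are illustrative, not from the lecture:

```c
#include <stdlib.h>

/* Insert item into a sorted singly linked list; returns the new head. */
struct Node { int value; struct Node *next; };

struct Node *insert_sorted(struct Node *head, int item) {
    struct Node *node = malloc(sizeof *node);
    node->value = item;
    if (head == NULL || head->value > item) { /* new first element */
        node->next = head;
        return node;
    }
    struct Node *cur = head;
    while (cur->next != NULL && cur->next->value <= item)
        cur = cur->next;                 /* O(N) scan for the position */
    node->next = cur->next;              /* O(1) splice once found */
    cur->next = node;
    return head;
}
```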
 For an array list: C(S) = O(N^{2}) (N * N)
 For a linked list using getNext to scan: C(S) = O(N^{2}) (N * N)
 For a linked list not using getNext: C(S) = O(N^{3}) (N * (N * N))
QuickSort
Reorder the array so that everything in the left part (although unsorted) is less than or equal to everything in the right part (also unsorted).
This is called partitioning the array. Now the left part of the array can be sorted, then the right part can be sorted. This can be done recursively.
Sort(A, first, last)
Partition A into A[first...M-1] and A[M...last]
Sort(A, first, M-1)
Sort(A, M, last)
Implementing partitioning
1. A middle value is chosen, called the pivot. Ideally this would be the median, but the median is not known before the array is sorted.
2. Scan from the left; anything >= pivot must go to the right side.
3. At the same time, scan from the right to find anything <= pivot, which must go to the left side.
4. The two values found in steps 2 and 3 are swapped.
5. This is continued until the scans meet in the middle.
/* partition */
int i = first, j = last;
int pivot = A[(first + last) / 2];
while (i <= j)
{
while (A[i] < pivot) i++;
while (A[j] > pivot) j--;
if (i <= j) {
int tmp = A[i];
A[i++] = A[j];
A[j--] = tmp;
}
}
if (first < j) Sort(A, first, j);
if (i < last) Sort(A, i, last);
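The partitioning code can be packaged as a self-contained recursive function, using the middle element as the pivot. A sketch; the function name is illustrative:

```c
/* QuickSort: partition A[first..last] around a pivot, then recurse
   on the left and right parts. */
void quick_sort(int A[], int first, int last) {
    int i = first, j = last;
    int pivot = A[(first + last) / 2];
    while (i <= j) {
        while (A[i] < pivot) i++;        /* scan right for item >= pivot */
        while (A[j] > pivot) j--;        /* scan left for item <= pivot */
        if (i <= j) {                    /* swap the pair, advance both */
            int tmp = A[i];
            A[i++] = A[j];
            A[j--] = tmp;
        }
    }
    if (first < j) quick_sort(A, first, j);  /* sort the left part */
    if (i < last) quick_sort(A, i, last);    /* sort the right part */
}
```

A full sort of an N-element array is quick_sort(A, 0, N - 1).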
Analysis of QuickSort
In an ideal scenario, the array is broken down into a nice balanced binary tree.
There are N comparisons done in the partitioning at each level, and there are log_{2} N levels. The complexity of QuickSort in good conditions is therefore O(N log_{2} N). If the pivot repeatedly falls near the smallest or largest value, however, the partitions are unbalanced and the worst case is O(N^{2}).