In general, sorting means arranging a collection of elements or data in a certain order by comparing the elements with one another. There are many different types of sorting algorithms, such as bubble, selection, insertion, merge, shell, and quick sort. Each type has a different efficiency, meaning how its running time grows as the size of the input grows, measured in terms of the number of elements to sort. Most sorts have an algorithmic efficiency of either O(n^2) or O(n log(n)).
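To get a feel for what that difference means, here is a quick sketch of my own (not from the course) that just counts roughly how many "steps" an O(n^2) sort takes compared to an O(n log(n)) sort as n grows:

```python
import math

# Rough step counts for an O(n^2) sort vs. an O(n log n) sort.
# The gap widens quickly as the input size n grows.
for n in [10, 100, 1000, 10000]:
    quadratic = n ** 2
    linearithmic = n * math.log2(n)
    print(f"n={n:>6}: n^2 = {quadratic:>10}, n*log2(n) = {linearithmic:>10.0f}")
```

For n = 10 the two are close, but by n = 10000 the quadratic count is hundreds of times larger, which is why the choice of algorithm matters for big lists.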
Among the different types of sorting algorithms, the first three, bubble, insertion, and selection, are considered simple sorts. They each work differently, but all of them perform better on a small list, or a list that is almost sorted, than on a large list. A bubble sort repeatedly swaps adjacent elements that are out of order until the whole list is in order. An insertion sort repeatedly goes through the items in the unsorted part of the list and inserts each item into its correct position in the sorted part. Similar to the insertion sort, a selection sort also repeatedly goes through the list, selects the next item according to its order (for example, the smallest remaining item), and places it in its correct position. All three sorts have similar efficiency, O(n^2) in the average, best, and worst cases, except that the insertion sort (and a bubble sort with an early-exit check) is O(n) in the best case. Bubble and insertion sorts are stable, while a typical selection sort is not, and all three use constant extra space.
Unlike those three sorts, merge, shell, and quick sorts are more efficient on a list containing a large number of items in random order. They still compare elements, but they are more complex algorithms: merge and quick sort use a divide-and-conquer approach, and shell sort works with diminishing gaps. A shell sort compares elements that are far apart in the beginning, and then compares elements that are less far apart. At the end, it compares adjacent elements, similar to the insertion sort. A quick sort divides the list of items into two sub-lists based on a pivot element. In one of the sub-lists, the elements are all smaller than the pivot element, and in the other sub-list, the elements are all larger than the pivot element. This process continues on the resulting sub-lists until the whole list is sorted. A merge sort is similar to the quick sort in that it also divides the list, but it does not use a pivot element. In the beginning, it divides the list into n sub-lists with only one element each. Then it merges pairs of sub-lists to produce new sorted sub-lists, and this process repeats until all the sub-lists are merged together into one sorted list. All three of these sorts have O(n log(n)) efficiency in the best case. The merge sort is O(n log(n)) in the best, average, and worst cases, but the other two are different. The quick sort is O(n log(n)) in the best case but O(n^2) in the worst case. The shell sort is also O(n^2) in the worst case (for simple gap sequences), and in the average case its big O depends on the gap sequence. The space complexity of an in-place quick sort is constant apart from its recursion stack, while the space complexity of a merge sort depends on whether it deals with linked lists or arrays.
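Here is my own sketch of the three more complex sorts. Note that this quick sort builds new lists for the two sides of the pivot rather than partitioning in place, so it does not have the constant space of the in-place version, but it makes the pivot idea easier to see:

```python
def shell_sort(lst):
    """Insertion-sort elements that are `gap` apart, shrinking the gap."""
    items = list(lst)
    gap = len(items) // 2                  # simple halving gap sequence
    while gap > 0:
        for i in range(gap, len(items)):
            current = items[i]
            j = i
            while j >= gap and items[j - gap] > current:
                items[j] = items[j - gap]  # shift larger items over
                j -= gap
            items[j] = current
        gap //= 2                          # gap 1 is a plain insertion sort
    return items


def quick_sort(lst):
    """Partition around a pivot, then sort each side recursively."""
    if len(lst) <= 1:
        return list(lst)
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x < pivot]
    larger = [x for x in lst[1:] if x >= pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)


def merge_sort(lst):
    """Split in half, sort each half, then merge the sorted halves."""
    if len(lst) <= 1:
        return list(lst)
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]   # append the leftover tail
```

Picking the first element as the pivot, like this quick sort does, is exactly what makes an already-sorted list the O(n^2) worst case, since every partition then puts all the remaining items on one side.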
I learnt about insertion, selection, and bubble sorts in CSC 108, so I was familiar with those three, but not with the other three sorts. I was confused when I just looked at the code of the three functions that I learnt in this course, but the link on the course website helped me a lot. The link explains what they are, and also shows pictures of how they work. However, understanding big-oh notation was hard for me, and it is still hard. I also learnt about efficiency in CSC 108, but I did not fully get it from that course, so I had a hard time with big-oh notation. So, I looked up big-oh notation and how it differs across different test cases and different sorting algorithms. This week's lab was also a little helpful for understanding the concept. I should keep practising it, the same way I kept practising when I tried to understand recursion. When I prepare for the final exam, I should focus more on this topic and on binary trees.