Tuesday, April 1, 2014

Sorting and Efficiency

    Sorting is the process of organizing and rearranging objects into some order. Sorting is important because it makes later searches for an object in the collection efficient. However, it is important to sort the collection efficiently as well! But how exactly do we determine what the best sorting strategy is? Normally, the fewer the steps, the more efficient the sorting strategy is, since the number of steps is usually associated with the speed of the operation.

    We have learned many sorting techniques in CSC148: some are not very efficient, such as bubble sort, selection sort and insertion sort, whereas some are more time-saving, including quick sort and merge sort.

    Bubble Sort
  • Bubble Sort: In bubble sort, starting from i=0, element i is compared to element i+1, until the end of the list is reached. If the two elements do not satisfy a certain criterion (ex: element[i] < element[i+1]), the algorithm swaps the positions of the two elements. Every time the end of the list is reached, the last element is guaranteed to be in the right position, so each pass bubbles through a smaller list. This is very inefficient, since we need to repeat this process n-1 times for a total of O(n^2) steps. This sorting technique is represented in the animation on the right, retrieved from wikipedia.com.
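A minimal sketch of this in Python (the function name and list values are my own, not from the course):

```python
def bubble_sort(lst):
    """Sort lst in place using bubble sort, and return it."""
    n = len(lst)
    for i in range(n - 1):
        # After pass i, the last i elements are already in their
        # final positions, so each pass walks a shorter prefix.
        for j in range(n - 1 - i):
            if lst[j] > lst[j + 1]:
                # Out of order: swap the adjacent pair.
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
    return lst
```

The two nested loops are exactly why this takes O(n^2) steps: n-1 passes, each comparing up to n-1 adjacent pairs.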


    Selection Sort
  • Selection Sort: In this technique, the minimum element in the unsorted part of the list is found and moved to the beginning of the unsorted part. This process is repeated n-1 times, each time placing the minimum at the end of the sorted section, until the last element is reached. This technique is also very inefficient with O(n^2) steps. This sorting technique is represented in the animation on the left, retrieved from wikipedia.com.
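A short Python sketch of the idea (names are my own):

```python
def selection_sort(lst):
    """Sort lst in place using selection sort, and return it."""
    n = len(lst)
    for i in range(n - 1):
        # Find the index of the minimum in the unsorted tail lst[i:].
        min_index = i
        for j in range(i + 1, n):
            if lst[j] < lst[min_index]:
                min_index = j
        # Swap the minimum into position i, growing the sorted section.
        lst[i], lst[min_index] = lst[min_index], lst[i]
    return lst
```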

    Insertion Sort
  • Insertion Sort: In this sorting strategy, the list is separated into two segments: the sorted and the unsorted sections. At the start, the sorted segment is empty and the unsorted segment contains everything. Starting from the first element, each element is removed from the unsorted segment and placed in the right position in the sorted segment. This is also very inefficient and requires n-1 repetitions of this process for a total of O(n^2) steps. The animation for this technique is shown on the right (Source: wikipedia.com).
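Here is one way this could look in Python (a sketch; the names are my own):

```python
def insertion_sort(lst):
    """Sort lst in place using insertion sort, and return it."""
    for i in range(1, len(lst)):
        # lst[:i] is the sorted section; insert lst[i] into it.
        current = lst[i]
        j = i
        # Shift larger sorted elements right to open a slot.
        while j > 0 and lst[j - 1] > current:
            lst[j] = lst[j - 1]
            j -= 1
        lst[j] = current
    return lst
```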


    Quick Sort
  • Quick Sort: This technique is more efficient than the previous ones. It first picks a pivot --an element that the other elements are compared to. Then, it divides the list into two sublists: the smaller and the larger numbers, bringing all elements greater than the pivot to the larger sublist and the smaller ones to the smaller sublist. The algorithm repeats this process recursively on the smaller and larger sublists until a list with one element is reached. This strategy is much faster, with O(n log n) steps on average (though a badly chosen pivot can still lead to O(n^2) in the worst case). wikipedia.com provides a good animation of this technique, shown on the right.
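A simple Python sketch of this (it builds new sublists rather than partitioning in place, which is easier to read but uses extra memory; the names are my own):

```python
def quick_sort(lst):
    """Return a new sorted copy of lst using quick sort."""
    if len(lst) <= 1:
        return lst  # base case: nothing to sort
    pivot = lst[0]
    # Partition the rest of the list around the pivot.
    smaller = [x for x in lst[1:] if x <= pivot]
    larger = [x for x in lst[1:] if x > pivot]
    # Recursively sort each side, then glue around the pivot.
    return quick_sort(smaller) + [pivot] + quick_sort(larger)
```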


    Merge Sort
  • Merge Sort: This technique breaks down the list into single elements, and then recursively reconstructs the list in the correct order. It compares the first element in a sublist with the first element in its adjacent sublist, and constructs a larger list, placing the elements in the correct order. It repeats this process until there is only one large list left. This strategy is as efficient as quick sort, with O(n log n) steps needed to organize a list with n elements. An animation of this is shown on the right. (Source: wikipedia.com)
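A Python sketch of the split-then-merge idea (helper names are my own):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    # Repeatedly take the smaller of the two front elements.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One side is exhausted; append whatever remains of the other.
    return result + left[i:] + right[j:]

def merge_sort(lst):
    """Return a new sorted copy of lst using merge sort."""
    if len(lst) <= 1:
        return lst  # base case: a single element is already sorted
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))
```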

Sunday, March 2, 2014

Recursion -- Exciting but Complicated!

    Recursion is the process of breaking a problem into similar smaller components. In computer science, a recursive function is any function which calls itself within its body, thereby forming a loop. A real-world example of recursion is the nested images formed when two mirrors are parallel to each other; in this case, we have an infinite recursion. When defining a recursive function, we must define a base case in order to prevent an infinite recursion. This base case does not lead to a recursive call; without it, the function will run until it hits the recursion limit and a "RuntimeError" will occur.

    A really good example of recursion is the Fibonacci sequence. In this sequence, every number is defined by the sum of the two previous numbers. [F(n) = F(n-1) + F(n-2)] In this way, each number can be broken down into its two previous numbers; those two numbers can be broken down into their two previous components and so on, until the first two numbers (0 & 1) are encountered. In the case of Fibonacci numbers, the first two numbers of the sequence are the base case and need to be known.
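This translates almost directly into Python (a sketch; the function name is my own):

```python
def fib(n):
    """Return the nth Fibonacci number, where fib(0) == 0 and fib(1) == 1."""
    if n < 2:
        return n  # base case: the first two numbers are known
    # Recursive step: each number is the sum of the previous two.
    return fib(n - 1) + fib(n - 2)
```

Note how the base case (n < 2) is exactly what stops the recursion from running forever.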

    Trees also tend to be recursive. A Tree is made of a root and one or more branches. Every branch of a Tree can recursively be broken down into its components -- root and branches. Therefore, in a recursive function, a Tree is defined as a root plus its branches, which are themselves Trees.
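To make this concrete, here is a minimal Tree class and a recursive function over it (a sketch I made up; not the course's actual Tree class):

```python
class Tree:
    """A tree with a root value and a list of branches, themselves Trees."""
    def __init__(self, root, branches=None):
        self.root = root
        self.branches = branches if branches is not None else []

def count_nodes(t):
    """Return the number of nodes in Tree t."""
    # A tree is a root plus its branches, which are themselves trees:
    # count the root, then recursively count each branch.
    return 1 + sum(count_nodes(b) for b in t.branches)
```

The base case here is implicit: a leaf has no branches, so the sum over branches is 0 and the recursion stops.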

    I find recursion one of the most useful and interesting concepts in computer science and mathematics. It can be difficult at first to think through how a recursive function works, but with time, it gets easier!

Tuesday, February 11, 2014

Rotten week

Lots of work to do this week.. Computer science assignment and three tests... What is the point of reading week anyway?

Tuesday, January 21, 2014

First Post

New to this whole blogger thing... I am trying to make the blog look nicer at the moment.  Looks like a lot of work for CSC148