Sunday 30 March 2014

Efficiency and Sorting

    For the past few weeks, we have been ignoring recursion (Yay!) and focusing on something that really piques my interest: sorting lists, and the efficiency of these functions and of many more beyond that. The types of sorting algorithms studied in detail include Selection Sort, Bubble Sort, Quick Sort, Insertion Sort, and Merge Sort. In addition, we learned briefly about the extremely efficient default sort in Python, Timsort. Finally, the efficiency of these algorithms was measured with Big-Oh analysis. The rules of this type of analysis state that when declaring the efficiency of a function, you must first find the general equation for how many steps are run, then simplify it down to the highest power, and that is your Big-Oh. Selection sort, Bubble sort, and Insertion sort are all methods we had learned in CSC 108, and all have the same Big-Oh of O(n^2). These are the 'simpler' sorting algorithms and use nested loops to pass over the entire list, which is why they have that efficiency. While their exact equations are not n^2 (they could be something like 5n^2 + n + 27), we round to the highest power because at some point, the constants, and even any powers lower than the highest one, become meaningless at very large list sizes.

    Merge sort, Quick sort, and Timsort, however, have a Big-Oh of O(n log n). This is because they divide the list up into smaller lists, and sort and combine those. While this sounds scary, it turns out to be much more efficient than the previous three sorting algorithms mentioned. While O(n^2) might seem inefficient compared to these superior algorithms, it is still in the acceptable range for programs. Programs that run in O(n!), or even O(n^n), are the point where efficiency is compromised, and they are deemed unacceptable.
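
    To make that a little more concrete, here is a rough sketch (written from memory, not the exact code from lecture) of one O(n^2) sort and one O(n log n) sort:

```python
def insertion_sort(lst):
    """Sort lst in place -- roughly O(n^2), since each item may get
    compared against everything that came before it."""
    for i in range(1, len(lst)):
        current = lst[i]
        j = i - 1
        # Shift larger items one spot to the right until current fits.
        while j >= 0 and lst[j] > current:
            lst[j + 1] = lst[j]
            j -= 1
        lst[j + 1] = current


def merge_sort(lst):
    """Return a sorted copy of lst -- roughly O(n log n), since the list
    is halved about log n times and each level does n work merging."""
    if len(lst) <= 1:
        return lst[:]
    mid = len(lst) // 2
    left, right = merge_sort(lst[:mid]), merge_sort(lst[mid:])
    merged, i, j = [], 0, 0
    # Repeatedly take the smaller front item from the two sorted halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```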

Monday 17 March 2014

And on the ninth week of CSC 148 my professor gave to me...

    A week of Big Oh's, efficiency, and sorting algorithms. And a little bit of BST review, but since I talked about that last week, I'm just going to briefly go over it here. A BST is essentially a tree, but organized. The left subtree will always have values less than that of the root, while the right subtree will have values greater than it. Now on to more important things.

    I don't know if this says something about my personality, but I love learning about efficiency. Just anything about it, whether it's about why my program is inefficient, or the mathematics of efficiency, or simply how lazy computer scientists are when it comes to accuracy with efficiency. I've always thought that as you go on in CS, you will inevitably start writing less and less efficient functions, simply because problems will contain more and more unknowns and will be impossible to solve efficiently. I was both pleasantly surprised and horrified to learn that this is not the case. Factorial and N^N functions are deemed improper at every level of computer science, but are occasionally a necessity, depending on the problem.

    In addition, while it is covered in more detail in CSC 165, Big-Oh notation seems to fit perfectly into the dogma of a CS student: being lazy and efficient. You always take the tightest Big-Oh that fits your function, and you always strip away the coefficients, letting the constant and breakpoint in the definition absorb them instead. At first I thought it was silly to purposefully simplify a function down to its simplest form, but as the examples in class got more and more complicated, I could see why this is a huge time saver (you can figure out the efficiency of a function at literally one glance), and why it is used today.
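
    To show what I mean by rounding to the highest power, here is a made-up example (not one from class) where I count the steps myself:

```python
def count_smaller_pairs(lst):
    """Count pairs (i, j) where lst[i] < lst[j]."""
    count = 0                          # 1 step
    for i in range(len(lst)):          # runs n times
        for j in range(len(lst)):      # runs n times for every i
            if lst[i] < lst[j]:        # about n * n comparisons in total
                count += 1
    return count                       # 1 step

# The exact step count is something like n^2 + (smaller terms) + constants,
# so we throw everything else away and just call it O(n^2).
```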

Sunday 9 March 2014

Searching for Linked Lists in Sets of Two

    Okay, I know I didn't put up my blog post about trees this week like I promised, but I'll put it up eventually. In the meantime, we'll talk about Linked Lists!

    A linked list is essentially a tree, but where every single node has only one value, and a pointer pointing to the next node (also known as the rest of the linked list). Just like recursion, and trees, and all of computer science in general, it was really hard to get my mind around this concept at first. But as time went on, and I read and reprogrammed the code over and over again, it suddenly started making sense. Although this might not make perfect sense, the idea that a linked list is just a list, except that instead of all the values sitting inside a single list, each value is 'wrapped' inside a wrapper that points to the rest of the linked list, just didn't click with me. But when I started to get the hang of the prepend and decapitate functions of linked lists, it started to become clear.
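
    Here is a minimal sketch of how I picture it, with prepend and decapitate written from memory, so the details probably don't match the actual lab code exactly:

```python
class LinkedList:
    """A value (head) 'wrapped' together with a pointer to the rest of the list."""

    def __init__(self, head=None, rest=None):
        self.empty = (head is None and rest is None)
        self.head = head
        self.rest = rest

    def prepend(self, value):
        """Add value to the front; the old contents get pushed into self.rest."""
        if not self.empty:
            self.rest = LinkedList(self.head, self.rest)
        self.head = value
        self.empty = False

    def decapitate(self):
        """Remove the front value; the rest of the list moves up."""
        if self.rest is None:
            self.head, self.empty = None, True
        else:
            self.head, self.rest = self.rest.head, self.rest.rest


lst = LinkedList()
lst.prepend(3)
lst.prepend(2)
lst.prepend(1)    # the list is now 1 -> 2 -> 3
lst.decapitate()  # the list is now 2 -> 3
```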

    Now one thing that I'm still having a little issue with is the binary search tree. The concept of binary search is pretty simple: you start off with a term, and to the left are terms that are smaller in value, while to the right are terms that are greater. This gives a much faster search time for finding an object, because instead of potentially having to search an entire tree, you can literally ignore half of it depending on whether the value you are looking for is greater than or less than the initial value. I don't exactly know what I find confusing about these trees, since I have a grasp of what they're supposed to do, but whenever I try to think about them, I can't help but feel as if I'm missing something.
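
    Writing the search out myself helped a little. This is just my own sketch, assuming a bare-bones node class with value, left, and right attributes (not the exact class from lecture):

```python
class BSTNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right


def contains(node, value):
    """Return True if value is somewhere in the BST rooted at node."""
    if node is None:
        return False
    if value == node.value:
        return True
    if value < node.value:
        return contains(node.left, value)   # ignore the entire right half
    return contains(node.right, value)      # ignore the entire left half


root = BSTNode(5, BSTNode(3, BSTNode(1), BSTNode(4)), BSTNode(8))
print(contains(root, 4))   # True
print(contains(root, 7))   # False
```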

Sunday 2 March 2014

A Recursive Post About Recursion

    In addition, the idea of recursion, while not fully understood, has become a lot clearer to me since I first learned it. The idea that, while defining a function, you can call that very same function perplexed me. How is something like that possible? Wouldn't that be like just calling the function an infinite number of times? It turns out most of my confusion was that... Wait a minute. Does this sound familiar to you as well? That's probably because it is! Welcome to my recursive post about recursion! The first time I did this was a long time ago, in a galaxy far, far away. And since that time, I have learned so much more about recursion. From the myriad of functions we've programmed together, to the classes we've created, all based on recursion, there's one thing I've learned: recursion isn't that complicated.

    It turns out that the thing about recursion that confused me the most was the fact that it was written in list comprehension form. As I have mentioned in previous blog posts, I took CSC 108 a LONG time ago (about 1 year to be precise) and didn't learn a thing about list comprehensions. What makes it worse is that I missed the entire first week of class because I was stuck in an airport thanks to storms happening in Canada, so I am still not sure if we learned it then. However, during a lab, the topic of list comprehensions came up, and lo and behold, recursion seems to make sense to me now!

    Recursion itself isn't particularly useful. It's just iterating over a function to keep it going without having to embed a bunch of loops around it. There are two things that I think make recursion useful: the built-in Python function, whether it be append, or sum, or max, and the condition that the recursion runs against. The reason I neglected to mention a base case is because it's not what makes a recursive function useful; it's what makes a recursive function function (recursively). The built-in function is what provides the actual work done by the function: whether it sorts things from nested lists, or just creates a new list of things in a different order, this is what lets your function do what it needs to do. The recursive condition, on the other hand, is what lets the function know where to go, and prevents it from becoming an infinite loop. Whether it is a for loop iterating over the items in a list, or a while loop letting the function run over a range of numbers, this is easily the most important part of a recursive function, excluding the recursive call itself. By splitting the function into parts and understanding each portion, it became much easier for me to understand.
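
    Here is a tiny example of what I mean, written in the list comprehension style we used in class (my own version, not the exact lecture code):

```python
def nested_sum(x):
    """Add up all the numbers in a possibly nested list of numbers."""
    if isinstance(x, list):
        # The built-in (sum) does the actual work; the comprehension is the
        # condition that walks over every item and recurses into sublists.
        return sum([nested_sum(item) for item in x])
    return x  # base case: x is just a number

print(nested_sum([1, [2, 3], [[4], 5]]))  # 15
```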

    However, while I understand how a recursive function works, its uses still confuse me. For example, things such as linked lists and trees, which are essentially whole classes based on recursion, aren't clear to me. I understand how the code works; that isn't the problem. I just don't understand where and why I would use them. Why is it better than a nested list? Why do we have a specific instance of a tree and call it a different thing, even though it is just a tree? It's questions like these that keep me up at night (not really though). Well, that's all I have to say about recursion. See you next week!

P.S. I know I haven't done a blog entry for week 6 yet... mostly because I completely forgot about it. I promise it'll be up by this Tuesday!

Sunday 2 February 2014

The Second Blog Post of my Life

    This week has been extremely hectic. Midterms are coming up, so I've been trying to cut away the shrink-wrap from all my textbooks and start taking notes, so I was extremely thankful when I realized that this week in my CSC 148 class we were just doing recursion all week! This meant that I could relax and let myself breathe, since recursion isn't THAT complicated. It started out daunting when I was first introduced to it, but after a few YouTube videos, it became much more manageable. The turtles example given in class for recursion was quite thorough, and the visual manifestation of the code made me feel more confident in my understanding of the concept. Now that the "important" bit has been talked about, let's move on to more pressing matters.

    I'm a second-year student, and yet I'm still taking first-year courses. Why? Because I wanted to do Neuroscience and Chemistry as my double major, but I took CSC 108 and fell so madly in love with CS that I dropped my chemistry major in favour of it. Now, the problem is that I'm four whole courses behind, my PEY applications start next year, I have no external project experience, and my GPA is kind of terrible. And on top of all of that, I have four midterms, two assignments, and two labs all in the same atrocious week directly before my break, so this post is going to be significantly more casual and less thought out than last week's, as you may have noticed. So I guess that ends that. This week has been the beginning of some odd ceremony performed by UofT to slowly crush my spirit, and I don't see it getting any better any time soon.


Thursday 23 January 2014

The First Blog Post of my Life

    As the title suggests, this happens to be the first time I've posted anything personal about myself (aside from my extremely gorgeous Facebook photos) on the internet, so an introduction is in order! For those who already know who I am, congratulations on finding me, staff of CSC 148! For those who don't, while I won't tell you my real name, the moniker of 'Sparkles' will suffice. Before talking about the extremely interesting subject of inheritance of classes by other classes, how to properly create instances of classes, and how to rewrite those classes to work together in blissful harmony, as well as many other equally blood-pressure-increasing subjects, I would like to end this sentence with a non sequitur.

    Now on to object-oriented programming. While I believe myself to be a semi-competent pythonista, one specific part of Python never quite clicked with me: classes. Anything to do with classes. Making a new class, writing __init__ or __str__ methods, using self.function to call a method, etc. never got through to me in CSC 108. But in CSC 148, the idea of inheritance made the subject as clear as day to me. Seeing how a class actually obtains its properties, and how it simply has the same functions as its superclass, as well as any new functions you want to add, showed me how simple creating a class in Python can be. When I made classes in 108, inheritance from the default object class was never explicitly shown, so I didn't understand how things happened. But seeing it actually happen gave me a base understanding of how classes are created, allowing me to relate it to how I would make my own class.
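
    Here is a made-up example (not one from lecture) of the kind of inheritance I mean, where the subclass gets everything from the superclass for free, plus whatever it adds:

```python
class Animal:
    def __init__(self, name):
        self.name = name

    def speak(self):
        return "{} makes a noise".format(self.name)


class Dog(Animal):
    # __init__ and speak are inherited from Animal automatically.
    def fetch(self):
        return "{} fetches the ball".format(self.name)


rex = Dog("Rex")
print(rex.speak())   # inherited from the superclass
print(rex.fetch())   # new in the subclass
```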

    In addition, the idea of recursion, while not fully understood, has become a lot clearer to me since I first learned it. The idea that, while defining a function, you can call that very same function perplexed me. How is something like that possible? Wouldn't that be like just calling the function an infinite number of times? It turns out most of my confusion was that I didn't understand what a recursive function was meant to replace: a for-loop. Once this became clear, the idea that a recursion could be infinite seemed preposterous to me; only a mistake in coding could possibly result in something like that, because there would always be a definite end-point. The one example that I found on YouTube that convinced me of this was a simple program used to count down to zero, where the function would call itself with the input minus one each time until it reached zero. The example presented in class, however, seemed much more confusing, as it was just a return statement calling the function itself.
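
    For reference, here is my own reconstruction of that countdown example (not the exact code from the video):

```python
def countdown(n):
    """Print n, n-1, ..., 0 by calling countdown on the input minus one."""
    print(n)
    if n > 0:
        countdown(n - 1)  # the definite end-point: n shrinks every call

countdown(3)  # prints 3, 2, 1, 0
```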

    In conclusion, to end the introduction to the mind of this simple CSC 148 student, I pose the following idea: if something seems too complicated to comprehend, it is not because the thing itself is complicated, but rather because the complex idea has not been broken down into its simplest, fundamental parts.