Building a Brain

Although I had a wonderful Christmas break filled with family and friends, by the end I was hungering for something to challenge me, something to sink my teeth into. I found that in what has been, to date, my favourite Quest class: Object Oriented Programming.

Programming, as my tutor put it, is “Doing things with stuff” (Rob Knop, 2014). Object Oriented Programming is a style of programming that organizes code around objects: bundles of data together with the operations that act on them. Although it may sound a bit fuzzy and perhaps complicated to the uninitiated, it can actually be a lot of fun (albeit challenging). Furthermore, knowing how to write code to do things with stuff is an invaluable skill in the 21st century, as it allows you to manipulate the technological world around you, which is probably the main reason it is offered at Quest. Being a powerful tool, it can be used in pretty much any discipline.

Now, the attentive reader may be wondering, “Where does the building a brain part come in?” Well, the course is designed so that the first week is essentially a crash course in the basics of object oriented programming (specific to Java, the language we used). After that, we get the remaining two and a half weeks to work on our final projects.

People in my class chose a diverse multitude of projects: programming an interactive blackjack game, creating 3D graphics of ripples on a water surface, managing exorbitant amounts of data (including 15 million lines of leaked Adobe passwords), and designing a course registration program. If you could dream it (and put forward a solid proposal to the tutor), you were free to code it for your project. I really relished the freedom this course offered me, as it allowed me to pursue a deeper understanding of a topic that had long interested me: AI.

AI, or Artificial Intelligence, has captured my mind and imagination over the past few years, as I see it becoming more and more a part of humanity’s future. Now, although I was by no means working with strong AI in this course (think Cortana from Halo), I did in fact work with something that some AI researchers consider to be a potential goldmine of AI advancement. I decided to build a neural net.

A neural net is exactly what most of you think of when you hear the words “neural net”. At its most basic, what I did was simulate a bunch of neurons and all the connections between them on a computer, and then teach it things. This may sound difficult; that is because it is. However, with ample time to work and research, and with much support from the tutor, I was able to code and build a simple neural net on my computer that I can teach simple mathematics to. Although that may not sound terribly impressive (yeah, see, I took this computer that could already do math perfectly, made it the teeniest bit human, and now it sucks more at math; good job 😉), it isn’t so much what it learns that interests me, but how it learns (mine used a backpropagation algorithm, for those interested).
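For the curious, here is a rough sketch in Java (the language we used) of what such a net can look like: one input, a row of hidden neurons, and a single output, with a train method that performs one backpropagation step. To be clear, this is an illustration written for this post rather than my actual project code; the class name, the layer sizes, and the learning rate are all made up.

```java
import java.util.Random;

/**
 * A toy feed-forward neural net: one input, a row of hidden "neurons",
 * and a single linear output, trained one example at a time with backpropagation.
 */
public class TinyNet {
    private final int hidden;
    private final double[] inWeights;   // connections from the input to each hidden neuron
    private final double[] biases;      // one bias per hidden neuron
    private final double[] outWeights;  // connections from each hidden neuron to the output
    private double outBias;

    public TinyNet(int hidden) {
        this.hidden = hidden;
        inWeights = new double[hidden];
        biases = new double[hidden];
        outWeights = new double[hidden];
        Random rng = new Random(42);
        for (int i = 0; i < hidden; i++) {
            inWeights[i] = rng.nextGaussian() * 0.5;
            outWeights[i] = rng.nextGaussian() * 0.5;
        }
    }

    private static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    /** Ask the net a question: feed an input forward and read off its answer. */
    public double predict(double input) {
        return forward(input, new double[hidden]);
    }

    private double forward(double input, double[] hiddenOut) {
        double out = outBias;
        for (int i = 0; i < hidden; i++) {
            hiddenOut[i] = sigmoid(inWeights[i] * input + biases[i]);
            out += outWeights[i] * hiddenOut[i];
        }
        return out;
    }

    /** Teach the net one example: one backpropagation step on an (input, target) pair. */
    public void train(double input, double target, double learningRate) {
        double[] hiddenOut = new double[hidden];
        double out = forward(input, hiddenOut);
        double outError = out - target;  // how wrong the guess was
        for (int i = 0; i < hidden; i++) {
            // Chain rule: push the error back through the output weight and the sigmoid.
            double hiddenError = outError * outWeights[i] * hiddenOut[i] * (1 - hiddenOut[i]);
            outWeights[i] -= learningRate * outError * hiddenOut[i];
            inWeights[i]  -= learningRate * hiddenError * input;
            biases[i]     -= learningRate * hiddenError;
        }
        outBias -= learningRate * outError;
    }
}
```

The whole trick lives in train: the net compares its guess to the right answer and nudges every connection a tiny bit in whichever direction shrinks the error, which is exactly the “teaching” part.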

You see, artificial neural nets are in our everyday lives, whether people realize it or not. Much of the commercial facial recognition software used in cameras relies on neural nets trained to, lo and behold, recognize faces. Additionally, certain neural nets have been trained on historical data to make stock market predictions, and are used by major investment companies around the globe. To generalize even more, technology has become a force in our world, and its influence is mounting at an incredibly rapid rate. Having the chance to study programming at Quest is invaluable; having even a very basic understanding of programming is almost a prerequisite for technological literacy these days.

Having no programming experience whatsoever before this class, I found this a daunting and yet rewarding project. In order to shed some light on how much I enjoyed this course and how cool neural nets are, I’ll share with you a bit of one problem I had with my neural nets.

So one day I was trying to teach my neural net how to multiply by two. I would give it certain input-output pairs like (2, 4), (4, 8), (6, 12) and (8, 16). When I asked it to multiply 2 by 2, it gave me 4. When I asked it to multiply 8 by 2, it gave me 16. Success! But when I asked it to multiply 3 by two, it gave me…4! Uh-oh. What went wrong there? You see, my neural net, lazy as it was, was memorizing how to multiply, not learning. The neural net had too many neurons, and too much memory! It could, as my tutor said a bad student could, afford to memorize rather than learn. The neural net I was building at that point had 25 neurons. So, in order to starve it of memory, I programmed a new one with only 16. And it worked! My neural net, unable to simply memorize due to its limited memory, actually learned to multiply by two. The first one was just being lazy.
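If you are curious what that experiment looks like in code, here is a little demo built on the TinyNet sketch from earlier (again, purely illustrative; my real net, training data, and numbers were a bit different):

```java
public class MultiplyByTwoDemo {
    public static void main(String[] args) {
        // The same handful of examples from the story: (input, input times two).
        double[][] pairs = { {2, 4}, {4, 8}, {6, 12}, {8, 16} };

        // 16 hidden neurons; the original, "lazy" version would simply be new TinyNet(25).
        TinyNet net = new TinyNet(16);

        // Show it the examples over and over, scaled down so the sigmoids behave nicely.
        for (int epoch = 0; epoch < 20000; epoch++) {
            for (double[] pair : pairs) {
                net.train(pair[0] / 10.0, pair[1] / 20.0, 0.1);
            }
        }

        // Ask about a number it has never seen. A net that learned the rule should
        // answer somewhere near 6; one that only memorized might parrot a training answer.
        double guess = net.predict(3.0 / 10.0) * 20.0;
        System.out.println("3 times two is roughly " + guess);
    }
}
```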

See, even in the computer world you have to deal with crazy personalities!

~Daniel
