Algorithms. Now there’s a huge topic. Where do you start? Good question. There is the theory, the usual searching and the sorting, the math, the complexity, the scary details to be precise about. So you start studying, memorizing, understanding, and the purpose of it all maybe escapes you or gets lost along the way. It doesn’t need to be that way. **So, what’s the point? Algorithms teach us to think. Algorithms win us performance, and performance is like currency we can use to trade for other stuff** (as Prof. Charles Leiserson explains in his MIT lecture I’ve already blogged about). Even if you don’t work on some algorithm-heavy kind of programming, like for Google or something, you can benefit from learning some **basic patterns and antipatterns**: quadratic is bad, so avoid nested loops; sometimes it’s better to sort first so you can search faster later; and so on.

What on Earth am I talking about? As always, in order to reason we need to measure. How do we measure a computer program’s performance in a way that shows us the speed of the software independently of the computer it is running on? **We concentrate on the growth rate, not the speed itself.** We don’t care so much about the time (or maybe memory space) it takes to execute some piece of code for an input of size N, as much as how quickly the performance degrades as the input size N grows. **Something is slow when it doesn’t scale well, and might need to scale.** So we choose an instruction to follow, count how many times it gets executed, and create a function that calculates that count depending on the size of the input, such as C1*n^2 + C2*n + C3 for a nice traditional nested loop. Then we ignore the constants, and try to fit that function, and the algorithm, into a family of algorithms. Algorithms in a family share an upper bound: a line they will never cross, for a sufficiently large N. That’s Big O notation, simplified. **Big O notation gives us a function which is the upper bound of the worst-case scenario of our algorithm**. There are other notations for other bounds as well. So, mostly we are interested in the worst case, but sometimes it is also important to know the lower bound, that is, to know that a certain operation can’t be done with less than a certain amount of work.
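To make the counting idea concrete, here’s a minimal sketch (my own toy example, not from any particular library): we pick the innermost statement of a nested loop as the instruction to follow, and count how many times it runs for an input of size n. The count comes out as exactly n^2, the leading term of that C1*n^2 + C2*n + C3 shape.

```python
def count_executions(n):
    """Count how often the chosen instruction (the innermost
    statement of a nested loop) executes for input size n."""
    count = 0
    for i in range(n):        # outer loop: n iterations
        for j in range(n):    # inner loop: n iterations each time
            count += 1        # the instruction we follow: n * n total
    return count

# The count grows quadratically: doubling n quadruples the work.
print(count_executions(10))   # 100
print(count_executions(20))   # 400
```

Doubling the input quadrupling the work is the fingerprint of quadratic growth, and it’s exactly what Big O lets us talk about without ever running the code.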

As the Wikipedia page explains:

#### “Suppose an algorithm is being developed to operate on a set of *n* elements. Its developers are interested in finding a function *T*(*n*) that will express how long the algorithm will take to run (in some arbitrary measurement of time) in terms of the number of elements in the input set. The algorithm works by first calling a subroutine to sort the elements in the set and then perform its own operations. The sort has a known time complexity of *O*(*n*^{2}), and after the subroutine runs the algorithm must take an additional 55*n*^{3} + 2*n* + 10 time before it terminates. Thus the overall time complexity of the algorithm can be expressed as *T*(*n*) = 55*n*^{3} + *O*(*n*^{2}).

This can perhaps be most easily read by replacing *O*(*n*^{2}) with “some function that grows asymptotically slower than *n*^{2} “. Again, this usage disregards some of the formal meaning of the “=” and “+” symbols, but it does allow one to use the big O notation as a kind of convenient placeholder.”
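A quick numeric sanity check (my own, not part of the Wikipedia example) shows why we’re allowed to collapse the whole expression down to its leading term: as n grows, the 55*n^3 term accounts for essentially all of the total, and the constants and lower-order terms stop mattering.

```python
def total(n):
    """The running time expression from the example: 55n^3 + 2n + 10."""
    return 55 * n**3 + 2 * n + 10

# What fraction of the total does the cubic term contribute?
for n in (10, 100, 1000):
    share = 55 * n**3 / total(n)
    print(n, round(share, 8))
```

Already at n = 10 the cubic term makes up more than 99.9% of the total, which is the intuition behind writing the whole thing as O(n^3).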

**Let’s demystify the very basics, then.** If I get something wrong, please don’t be too harsh with me. I’m only just learning, and putting it in writing for myself, and maybe others. That way I finally get to learn it well, and can stop having to study it from scratch over and over again :)

**O(N^{2})**

**If you have to remember just one thing, make it this one – quadratic is bad. That pretty much means avoid nested loops.** I’m serious, quadratic (or worse: cubic, etc.) running times grow like crazy – the same thing that takes **seconds** with an N log N growth rate takes **weeks** with N^2. That can become important if we do some heavy calculations. But even in less extreme conditions, it’s good to be careful. As I’ve mentioned before in this post and also in a previous post, **performance is like currency**: if we save up enough of it, we can trade it for features, usability and what not. But we have to save it up first. So, if you can avoid doing something m times for each of n items, do.
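Here’s a small sketch of both ideas at once – avoiding the nested loop, and the “sort first so you can search faster later” trick from the start of the post. The function names and the duplicate-checking task are my own illustration: the quadratic version compares every pair, while the second version sorts first (N log N) and then only has to compare neighbours in one linear pass.

```python
def has_duplicate_quadratic(items):
    # Nested loops: roughly n*(n-1)/2 comparisons -- quadratic, O(n^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_sorted(items):
    # Sort first: O(n log n). Duplicates are now adjacent,
    # so one linear pass over neighbouring pairs finds them.
    ordered = sorted(items)
    for a, b in zip(ordered, ordered[1:]):
        if a == b:
            return True
    return False

print(has_duplicate_quadratic([3, 1, 4, 1, 5]))  # True
print(has_duplicate_sorted([3, 1, 4, 1, 5]))     # True
print(has_duplicate_sorted([2, 7, 8]))           # False
```

Both return the same answers, but for a million items the nested-loop version does on the order of half a trillion comparisons, while sort-then-scan does a few tens of millions of operations. That’s the currency we’re saving up.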

As with most things, algorithms are about **pattern recognition**. We learn to look at a piece of code, at an algorithm, and figure out what type of behaviour we can expect from it.
