A huge part of choosing an algorithm that best serves your needs comes down to its performance, and since you'll generally be working with a large number of objects, picking an algorithm with good performance is essential. With this in mind, it's important to remember that getting a solid measure of algorithm performance can be difficult.

Big-O notation allows us to express performance as a function of the number of items processed. Keep in mind that beyond a certain number of items, some problems cannot be solved practically by any computer (more on this later). There are several common growth rates used to describe an algorithm's efficiency (linear, quadratic, logarithmic, etc.).

- Linear growth: if processing time increases in proportion to the number of inputs, the algorithm grows linearly.

```java
int search(int[] x, int target) {
    for (int i = 0; i < x.length; i++) {
        if (x[i] == target) return i;
    }
    return -1; // target not found
}
```

Here we loop through the array, incrementing i each time, and return i as soon as the element at position i in the x array equals the target. This is linear because it passes through the array once: if the input is ten times bigger, it takes approximately ten times longer. We're also assuming the worst case, where the target is not present and the for loop runs all the way through the length of the x array. In Big-O notation this is known as O(n) growth.
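To see the hit and worst-case miss side by side, a small driver like this works (the array values here are just for illustration):

```java
public class SearchDemo {
    // Linear search from above: scans elements in order until a match is found
    static int search(int[] x, int target) {
        for (int i = 0; i < x.length; i++) {
            if (x[i] == target) return i;
        }
        return -1; // target not found
    }

    public static void main(String[] args) {
        int[] data = {4, 8, 15, 16, 23, 42};
        System.out.println(search(data, 16)); // found at index 3
        System.out.println(search(data, 99)); // worst case: all 6 elements checked, returns -1
    }
}
```

Note that the miss is the case Big-O describes here: the loop body executes exactly x.length times before giving up.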

- Quadratic growth: if processing time increases as the square of the number of inputs, the algorithm grows quadratically.

```java
boolean isUnique(int[] x) {
    for (int i = 0; i < x.length; i++) {
        for (int j = 0; j < x.length; j++) {
            if (i != j && x[i] == x[j]) return false;
        }
    }
    return true;
}
```

We have a doubly nested loop here: the loop with the i index executes x.length times (as we saw above), and for each of those iterations the loop with the j index also executes x.length times. Multiplying those together, the body runs (x.length)^2 times, so the growth rate is O(n^2).
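A quick check (array contents here are just illustrative) exercises both outcomes:

```java
public class UniqueDemo {
    // Quadratic duplicate check from above: compares every pair of positions
    static boolean isUnique(int[] x) {
        for (int i = 0; i < x.length; i++) {
            for (int j = 0; j < x.length; j++) {
                if (i != j && x[i] == x[j]) return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isUnique(new int[]{1, 2, 3})); // true: no duplicates
        System.out.println(isUnique(new int[]{1, 2, 1})); // false: 1 appears twice
    }
}
```

As a design aside, starting the inner loop at j = i + 1 would avoid comparing each pair twice, but that only halves the work; the growth rate is still O(n^2).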

There are many different growth rates, but among the best algorithms are those that are logarithmic in nature: even at a high number of values, their running time increases slowly. To see why, examine the number of times a loop is executed. Take:

```java
for (int i = 1; i < x.length; i *= 2) {
    // Do something with x[i]
}
```

Here, instead of stepping through 1, 2, 3, 4, 5, …, i doubles each iteration: 1, 2, 4, 8, 16, …, so after k iterations i is 2^k, and the loop stops once 2^k reaches the length of the array. Since the number of iterations k satisfies 2^(k-1) < x.length ≤ 2^k, we have k ≈ log base 2 of x.length, so the loop is O(log n). The general convention is to include only the term that dominates the growth, so if an algorithm were really O(n^2 + log n), we would express it as having O(n^2) efficiency. In the next post I want to highlight some sorting algorithms as well as their efficiency, so stay tuned!
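The classic real-world example of this halving pattern is binary search. A minimal sketch (assuming the input array is already sorted in ascending order):

```java
public class BinarySearchDemo {
    // Each iteration halves the remaining search range,
    // so the loop runs roughly log base 2 of x.length times: O(log n)
    static int binarySearch(int[] x, int target) {
        int lo = 0, hi = x.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi) / 2
            if (x[mid] == target) return mid;
            else if (x[mid] < target) lo = mid + 1; // discard the lower half
            else hi = mid - 1;                      // discard the upper half
        }
        return -1; // target not found
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13};
        System.out.println(binarySearch(sorted, 7)); // index 3
        System.out.println(binarySearch(sorted, 4)); // -1: not present
    }
}
```

Compare this with the linear search above: on an array of a million sorted elements, the linear version may check a million elements while this one checks about twenty.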