# Algorithm Efficiency

Time efficiency - a measure of the amount of time an algorithm takes to execute.

Space efficiency - a measure of the amount of memory an algorithm needs to execute.

Complexity theory - the study of algorithm performance.

Function dominance - a comparison of cost functions.

Asymptotic dominance - a comparison of cost functions when n is large.

Function g asymptotically dominates function f if there are positive constants c and n₀ such that

c·g(n) >= f(n) for all n >= n₀

That is, g asymptotically dominates f if g dominates f for all "large" values of n.

## Big-O Notation

Given two nonnegative functions f and g, f = O(g) if and only if g asymptotically dominates f.

"f is of the order g" or "The order of f is g"

Formally, if f(n) and g(n) are functions defined on the positive integers, then f(n) = O(g(n))
if there exists a constant c such that |f(n)| <= c·|g(n)| for all sufficiently large positive integers n.

### Example

We want to show that n² dominates the polynomial 3n² + 3n + 10, or, in other words,

3n² + 3n + 10 = O(n²)

Let c = 4 (there are many valid choices of c) and show that there exists an n₀ for which the inequality holds:

3n² + 3n + 10 <= 4n²
0 <= n² − 3n − 10
0 <= (n − 5)(n + 2)

so the inequality holds for all n >= 5; we have found c = 4 and n₀ = 5.

### Example

Here is an example where n may have to become fairly large before g(n) dominates f(n).

Find an n for which 2ⁿ > 8n⁴. Taking log₂ of both sides:

n log₂ 2 > log₂ 8 + 4 log₂ n
n > 3 + 4 log₂ n
(n − 3)/4 > log₂ n

which first holds at n = 21 (check: 2²¹ = 2,097,152 > 8·21⁴ = 1,555,848, while at n = 20, 2²⁰ = 1,048,576 < 8·20⁴ = 1,280,000).

## Big-O Arithmetic

Here are some simplifications that can be applied

1. O(k·f) = O(f). That is, constant factors can be ignored.
2. O(f·g) = O(f) · O(g). If a function is a product, its order is the product of the orders of its factors.
3. O(f/g) = O(f) / O(g). The same holds for a quotient.
4. O(f) > O(g) if and only if f dominates g.
5. O(f + g) = Max[ O(f), O(g) ]. That is, terms of lower order can be ignored.
6. Be careful with functions that involve subtraction:
   if f(n) = O(h(n)) and g(n) = O(h(n)), it does not follow that
   f(n) − g(n) = O(h(n)) − O(h(n)) = 0.
   For example, with f(n) = 2n² and g(n) = n², the difference 2n² − n² = n² is certainly not 0.

Some other rules:

1. The powers of n are ordered by exponent: nᵃ = O(nᵇ) iff a <= b.
2. The order of log n is independent of the base: log n taken to any base a is O(log n taken to any base b), for all a, b > 1.
3. Logarithms grow more slowly than any power of n: log n = O(nᵃ) for any a > 0, but nᵃ != O(log n).
4. Polynomials grow more slowly than any exponential: nᵃ = O(bⁿ) for any a and all b > 1, but bⁿ != O(nᵃ).

## Common examples of domination

Let N denote a variable and a, b, c, and d denote constants.

In order of dominance (top is most dominant):

| Function | Condition | Common name |
|----------|-----------|-------------|
| Nᴺ | | |
| N! | | factorial |
| aᴺ | aᴺ dominates bᴺ if a > b | exponential |
| Nᶜ | Nᶜ dominates Nᵈ if c > d | polynomial (cubic, quadratic, etc.) |
| N log N | | n log n |
| N | | linear |
| log N | | logarithmic (regardless of base) |
| 1 | | constant |

## Log scale graph of big-O complexities

The plot itself did not survive; what remains are the Maple `loglogplot` commands that generated it (`lim` is a previously chosen upper plot limit):

```
> log2  := loglogplot (log(x), x=1..lim, color=green):
> lin   := loglogplot (x, x=1..lim, color=gold):
> nlogn := loglogplot (x*log(x), x=1..32, color=blue):
> quad  := loglogplot (x^2, x=1..32, color=yellow):
> cube  := loglogplot (x^3, x=1..lim, color=violet):
> quart := loglogplot (x^4, x=1..lim, color=maroon):
> expon := loglogplot (2^x, x=1..lim, color=red):
> fact  := loglogplot (factorial(x), x=1..10, color=orange):
```

## Table of growth rates (12.1)

| N | log₂N | N·log₂N | N² | N³ | 2ᴺ | 3ᴺ |
|-----|---|--------|--------|------------|---------------|------------|
| 1   | 0 | 0      | 1      | 1          | 2             | 3          |
| 2   | 1 | 2      | 4      | 8          | 4             | 9          |
| 4   | 2 | 8      | 16     | 64         | 16            | 81         |
| 8   | 3 | 24     | 64     | 512        | 256           | 6561       |
| 16  | 4 | 64     | 256    | 4096       | 65,536        | 43,046,721 |
| 32  | 5 | 160    | 1024   | 32,768     | 4,294,967,296 | …          |
| 64  | 6 | 384    | 4096   | 262,144    | (Note 1)      | …          |
| 128 | 7 | 896    | 16,384 | 2,097,152  | (Note 2)      | …          |
| 256 | 8 | 2048   | 65,536 | 16,777,216 | ????????      | …          |

Note1: The value here is approximately the number of machine instructions executed by a 1 gigaflop computer in 5000 years.

Note 2: The value here is about 500 billion times the age of the universe in nanoseconds, assuming a universe age of 20 billion years.

## Effect of Increased Computer Speed

| Algorithm time-estimating function | Increase in number of data collections processed on a computer 1000 times faster |
|------|---------|
| N | 1000.00 |
| N·log₂N | 140.22 |
| N² | 31.62 |
| N³ | 10.00 |
| 2ᴺ | 9.97 |
| 3ᴺ | 6.29 |
| 4ᴺ | 4.98 |

## Control Structures and Running Time Performance

| Control structure or form of loop | Running time |
|---|---|
| assignment statement | O(1) |
| expression | O(1) |
| the sequence `Stmt1; Stmt2` | Max[ O(S1), O(S2) ] |
| `if (condition) Stmt1 else Stmt2` | Max[ O(cond), O(S1), O(S2) ] |
| `for (i=1; i<=n; i++) Stmt1` | O(n * S1) |
| algorithms without looping or recursion | O(1) |
| `for (i=a; i<=b; i++)` with a constant-time body (a, b constants) | O(1) |
| `for (i=a; i<=n; i++)` with a constant-time body | O(n) -- linear time |
| `for (i=a; i<=n; i++)` containing `for (j=b; j<=n; j++)` with a constant-time body | O(n²) -- quadratic time |
| `for (i=a; i<=n; i++)` with a body requiring O(M) time | O(n * M) |

```cpp
//----------------------------------------------------------------------
// pythag.cpp
// This program prints all Pythagorean triples up through some user-
// specified value. All output triples are in increasing order.
//----------------------------------------------------------------------
#include <iostream>
using namespace std;

int main()
{
    int maxLen;
    int small;
    int next;
    int last;

    cout << "Max. side length: ";
    cin >> maxLen;
    small = 1;
    while (small <= maxLen) {
        // INV (prior to test): All triples in the range
        // (1..small-1, 1..maxLen, 1..maxLen) have been output && small <= maxLen+1
        next = small;
        while (next <= maxLen) {
            // INV (prior to test): All triples in the range
            // (small, 1..next-1, 1..maxLen) have been output && next <= maxLen+1
            last = next;
            while (last <= maxLen) {
                // INV (prior to test): All triples in the range
                // (small, next, 1..last-1) have been output && last <= maxLen+1
                if (last*last == small*small + next*next)
                    cout << small << ", " << next << ", "
                         << last << '\n';
                last++;
            }
            next++;
        }
        small++;
    }
    return 0;
}
```

Letting n = maxLen: the body of the innermost loop is O(1), the innermost loop is O(n), the middle loop is O(n²), and the whole algorithm (the outer loop) is O(n³).