Lecture 3 - Algorithms
Contents
Complexity
Factors that affect program performance
Environment
- the Operating System
- the Compiler
- the Hardware
Implementation
- the Algorithm (our code)
Testing
Empirical testing of the efficiency of algorithms
- Choose a good set of inputs
- Measure the actual runtime (environment is important)
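A minimal sketch (mine, not from the lecture) of empirical timing in C, using `clock()` from `<time.h>`; the `scan()` workload is a placeholder:

```c
#include <stdio.h>
#include <time.h>

/* Placeholder workload: a linear scan over n values. */
static long long scan(int n) {
    long long total = 0;
    for (int i = 0; i < n; i++)
        total += i;
    return total;
}

int main(void) {
    clock_t start = clock();
    long long result = scan(100000000);   /* chosen input size */
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("result=%lld, cpu time=%.3f s\n", result, secs);
    return 0;
}
```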
Timing
Environmental factors
As different environments will produce different absolute times, we focus on relative changes (how the runtime grows as the input size grows).
Time Complexity
When calculating the time complexity of an algorithm, we consider the worst-case scenario.
(e.g. for a search algorithm: the key is not in A)
Big-O Notation
From best to worst timing
| | Type | Time Cost |
|---|---|---|
| best | Constant | O(1) |
| | Logarithmic | O(log n) |
| | Linear | O(n) |
| | n-log-n | O(n log n) |
| | Quadratic | O(n^2) |
| | Cubic | O(n^3) |
| | Exponential | O(2^n) |
| | Factorial | O(n!) |
| worst | all hell breaks loose | O(n^n) |
The actual execution time on a given input falls between the fastest and slowest possible times; worst-case analysis takes the slowest.
Suggested: Searching Algorithms
Calculating Big-O Values
// Remember: The Big-O value refers to the worst-case scenario
Step through the algorithm and count the number of operations performed.
When you encounter a loop, add n (or a variant of n) operations; nested loops multiply, giving n^2, n^3, and so on.
The resulting Big-O is the highest-order term in n, with its coefficient dropped.
Example :: 4 + 3n
Highest-order term: 3n
Big-O: O(n) - linear
Example :: 4 + 3n + 3n^2
Highest-order term: 3n^2
Big-O: O(n^2) - quadratic
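As an illustration (mine, not the lecture's), here is where a count like 4 + 3n can come from; the exact total depends on what you treat as one operation:

```c
/* Summing an array: the comments count operations per line. */
int sum(const int a[], int n) {
    int total = 0;                 /* 1 operation */
    for (int i = 0; i < n; i++) {  /* 1 init + (n + 1) tests + n increments */
        total += a[i];             /* n additions */
    }
    return total;                  /* 1 operation */
}
/* Total: 3n + 4 operations -> highest-order term 3n -> O(n). */
```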
Maths-y stuff
f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
// Given f(n) and g(n), we say f(n) is O(g(n)) if there are positive constants c and n_0 such that f(n) ≤ c·g(n) for all n ≥ n_0
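Worked example: f(n) = 4 + 3n is O(n). Choose c = 7 and n_0 = 1; then for all n ≥ 1, 4 + 3n ≤ 4n + 3n = 7n = c·n.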
Tractable - a polynomial-time algorithm exists; polynomial worst-case performance, e.g. O(n^2)
- useful and usable in practical applications
Intractable - no polynomial-time algorithm is known (typically NP-hard problems)
- worse than polynomial performance, e.g. O(2^n)
- feasible only for small n
Non-computable - no algorithm exists (or can exist)
Tools
time(1)
measures execution time
Also available at /usr/bin/time
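Typical usage (the numbers here are illustrative); note that most shells provide `time` as a builtin, which shadows `/usr/bin/time` and prints a slightly different format:

```
$ time ./search
real    0m1.342s
user    0m1.310s
sys     0m0.021s
```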
Recursion
Components
- Base Case - aka our stopping case
- Recursive Case - the function calls itself on a smaller input
Calculating Factorials
Iterative Approach
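The original code block was lost here; a minimal sketch of the iterative approach in C:

```c
/* Iterative factorial: multiply the values 2..n into an accumulator. */
unsigned long fac(int n) {
    unsigned long result = 1;
    for (int i = 2; i <= n; i++)
        result *= i;
    return result;
}
```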
Recursive Approach
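Again a sketch, assuming the usual definition with base case n ≤ 1:

```c
/* Recursive factorial: the base case stops the recursion,
 * the recursive case calls fac() on a smaller input. */
unsigned long fac(int n) {
    if (n <= 1)
        return 1;              /* base case */
    return n * fac(n - 1);     /* recursive case */
}
```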
To calculate fac(3) (using the recursive sketch above), the program will need to do:
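```
fac(3)
= 3 * fac(2)
= 3 * (2 * fac(1))
= 3 * (2 * 1)
= 6
```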
That’s a lot of stack frames that it needs to set up and destroy…
Would be better to just iterate for a factorial function
Fibonacci
So this section alone got quite large…
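The details aren't reproduced here, but as a sketch of the naive recursive version this section presumably covered:

```c
/* Naive recursive Fibonacci: each call spawns two more,
 * so the call count grows exponentially (roughly O(2^n)). */
unsigned long fib(int n) {
    if (n <= 1)
        return n;                       /* base cases: fib(0)=0, fib(1)=1 */
    return fib(n - 1) + fib(n - 2);     /* two recursive calls per frame */
}
```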
Optimisation
Tail-call optimisation
Tail-call optimisation avoids allocating a new stack frame for a call, because the calling function simply returns the value it gets from the called function.
A function ends with a tail call if the last operation before the function returns is another function call. If this call invokes the same function, it is tail-recursive.
// A function can probably be optimised if its last operation is calling itself
Source: StackOverflow
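A sketch (mine, not from the lecture) of factorial rewritten into tail-recursive form with an accumulator parameter. Note that the earlier `n * fac(n - 1)` is not a tail call, because the multiplication happens after the call returns. A compiler that performs tail-call optimisation (e.g. GCC or Clang at -O2) can compile this into a loop that reuses a single stack frame:

```c
/* Tail-recursive factorial: the result so far is carried in `acc`,
 * so the recursive call is the very last operation and its result
 * is returned unchanged -- a tail call. */
static unsigned long fac_tail(int n, unsigned long acc) {
    if (n <= 1)
        return acc;                  /* base case: accumulator holds the answer */
    return fac_tail(n - 1, n * acc); /* tail call: nothing left to do after it */
}

unsigned long fac(int n) {
    return fac_tail(n, 1);           /* start with an accumulator of 1 */
}
```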