Lecture 18
Recap
Running Time of Functions
Running Time of Cubic Function
Running Time of Quadratic Function
Running Time of Linear Function
Running Time of Logarithmic Function
Logarithm
Bits in Binary Numbers
Repeated Doubling
Repeated Halving
Checking an Algorithm Analysis
Once we have performed an algorithm analysis, we want
to determine whether it is correct and as good as we can
possibly make it
One way to do so is to code the program and see if the
empirically observed running time matches the running
time predicted by the analysis
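A minimal Java sketch of this empirical check, assuming a hypothetical placeholder method algorithm(n) that stands in for whatever routine was analyzed; the loop times it at doubling values of N so the observed growth can be compared with the predicted one

// EmpiricalCheck.java -- time a routine at increasing N and compare the
// observed growth with the growth the analysis predicts.
public class EmpiricalCheck {
    // Hypothetical placeholder: substitute the routine whose analysis
    // is being checked. This version deliberately does O(N^2) work.
    static long algorithm(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                sum += i + j;
        return sum;
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 32000; n *= 2) {
            long start = System.nanoTime();
            algorithm(n);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("N = " + n + "   time = " + elapsedMs + " ms");
        }
    }
}

If the analysis predicted O(N^2), each doubling of N should roughly quadruple the printed time once N is large enough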
When N increases by a factor of 10, the running time goes
up by a factor of 10 for linear programs, 100 for quadratic
programs, and 1000 for cubic programs
Programs that run in O(N log N) take slightly more than
10 times as long to run under the same circumstances
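A quick back-of-the-envelope calculation (not on the original slide) shows where "slightly more than 10" comes from:

\[
\frac{T(10N)}{T(N)} \approx \frac{10N \log(10N)}{N \log N} = 10\left(1 + \frac{\log 10}{\log N}\right)
\]

The ratio approaches 10 from above as N grows; for N = 1000 with base-10 logarithms it is about 13.3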
These increases can be hard to spot if the lower-order
terms have relatively large coefficients and N is not large
enough
An example is the jump from N = 10 to N = 100 in the
running time for the various implementations of the
maximum contiguous subsequence sum problem
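The sketch below illustrates the caveat with a made-up cost model T(N) = N^2 + 1000N; the formula and its constants are hypothetical, chosen only to show how a heavy lower-order term masks the factor-of-100 jump until N is large

// GrowthRatios.java -- hypothetical cost model T(N) = N^2 + 1000N,
// used to show how a large lower-order term hides quadratic growth.
public class GrowthRatios {
    static double cost(double n) {
        return n * n + 1000 * n;   // quadratic term plus a heavy linear term
    }

    public static void main(String[] args) {
        for (double n = 10; n <= 100000; n *= 10) {
            double ratio = cost(10 * n) / cost(n);
            // For a true quadratic the ratio should approach 100, but it
            // stays well below that while the 1000N term dominates.
            System.out.printf("N = %8.0f   T(10N)/T(N) = %6.1f%n", n, ratio);
        }
    }
}

With this model the ratio is only about 11 at N = 10 and does not reach roughly 99 until N = 100,000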
Another commonly used trick to verify that some program
is O(F(N)) is to compute the values T(N)/F(N) for a range
of N, where T(N) is the empirically observed running time
If F(N) is a tight answer for the running time, the
computed values converge to a positive constant
If F(N) is an overestimate, the values converge to zero
If F(N) is an underestimate, and hence wrong, the values
diverge
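A sketch of the ratio test; the (N, time) pairs are made-up numbers standing in for real measurements, and the candidate bound here is F(N) = N log2 N

// RatioTest.java -- compute T(N)/F(N) for a candidate bound F(N) = N log2 N.
// The observed times are hypothetical stand-ins for real measurements.
public class RatioTest {
    public static void main(String[] args) {
        int[]    n = { 10000, 20000, 40000, 80000, 160000 };
        double[] t = { 0.011, 0.024, 0.051, 0.110, 0.230 };   // seconds (made up)

        for (int i = 0; i < n.length; i++) {
            double f = n[i] * (Math.log(n[i]) / Math.log(2));
            // Tight bound: the ratios settle near a positive constant.
            // Overestimate: they drift toward zero; underestimate: they grow.
            System.out.printf("N = %7d   T(N)/F(N) = %.3e%n", n[i], t[i] / f);
        }
    }
}

With these particular numbers the ratios all land near 8e-8, the pattern expected when F(N) is tight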