
Conference rusure::math

Title:Mathematics at DEC
Moderator:RUSURE::EDP
Created:Mon Feb 03 1986
Last Modified:Fri Jun 06 1997
Last Successful Update:Fri Jun 06 1997
Number of topics:2083
Total number of notes:14613

1104.0. "some big O of ... questions" by RDVAX::NG () Wed Aug 02 1989 19:05

    I was reading Knuth's book. He is comparing several algorithms:
    
    Algorithm A: O(n(2^{3.5*[(log n)^0.5]}))
    Algorithm B: O(n(log n){2^[(2*log n)^0.5]})
    Algorithm C: O(n(log n)(log(log n)))
    
    All logarithms are in base 2.
    (I hope the expressions aren't too ambiguous.)
    
    I got the impression that he said C is better than B is better
    than A. Why? It seems to me A is better than B.
    
    I know that A is the same as O( T(n) = n^(1+3.5/[(log n)^0.5]) )
    which satisfies the following property:
    
    For any eps>0, there exists a constant c (as a function of eps)
    such that T(n) <= c*n^(1+eps) for all n >= n0.
    
    Would it be appropriate to call algorithm A 'asymptotically linear'?
    Or is there a standard term for this concept?
T.R  Title  User  Personal Name  Date  Lines
1104.1  I think Knuth was right ...  COOKIE::PBERGH  Peter Bergh, DTN 343-0577  Wed Aug 02 1989 19:25  55
>>    I was reading Knuth's book. He is comparing several algorithms:
    
>>    Algorithm A: O(n(2^{3.5*[(log n)^0.5]}))
>>    Algorithm B: O(n(log n){2^[(2*log n)^0.5]})
>>    Algorithm C: O(n(log n)(log(log n)))
>>    All logarithms are in base 2.
>>    (I hope the expressions aren't too ambiguous.)
    
    I'll rephrase the stuff inside the Ordo symbols to get it in a form that
    I'm more familiar with.  I'll call the functions A(n), B(n), and C(n).
    
    A(n) = n * 2 ** (3.5*SQRT(LOG2(n)))
    
    B(n) = n * LOG2(n) * 2 ** (2*SQRT(LOG2(n)))
    
    C(n) = n * LOG2(n) * LOG2(LOG2(n))
    
    We then find that
    
    A(n)/B(n) = 2 ** (1.5*SQRT(LOG2(n))) / LOG2(n)
    
    B(n)/C(n) = 2 ** (2*SQRT(LOG2(n))) / LOG2(LOG2(n))
    
    These things are nice, monotonic functions, so I'm not going to bother
    to make a formal proof that the asymptotic behavior of algorithm A is
    worse than that of B, which in turn is worse than that of C, but will only
    stuff in a few data points
    
    n = 2**81: A(n)/B(n) = 2**13.5/81, >> 1 (much larger than 1)
    	       B(n)/C(n) = 2**18/log2(81), >>1
    
    n = 2**100:A(n)/B(n) = 2**15/100, >> 1
               B(n)/C(n) = 2**20/log2(100), >> 1
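
    A small Python sketch that evaluates those two ratios directly, with
    A(n), B(n), and C(n) exactly as written above (base-2 logs):

        import math

        def A(n): return n * 2 ** (3.5 * math.sqrt(math.log2(n)))
        def B(n): return n * math.log2(n) * 2 ** (2 * math.sqrt(math.log2(n)))
        def C(n): return n * math.log2(n) * math.log2(math.log2(n))

        for n in (2.0 ** 81, 2.0 ** 100):
            # Both ratios come out much larger than 1, as claimed above.
            print(A(n) / B(n), B(n) / C(n))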
    
>>    I got the impression that he said C is better than B is better
>>    than A. Why? It seems to me A is better than B.
    
>>    I know that A is the same as O( T(n) = n^(1+3.5/[(log n)^0.5]) )
>>    which satisfies the following property:
    
    I think you need to convince me that T(n) ~== A(n) for all sufficiently
    large n.
        
>>    For any eps>0, there exists a constant c (as a function of eps)
>>    such that T(n) <= c*n^(1+eps) for all n >= n0.
    
    The statement about T(n) may well be correct, for all I know.
    
>>    Would it be appropriate to call algorithm A 'asymptotically linear'?
>>    Or is there a standard term for this concept?
    
    I have not heard the term asymptotically linear; it sounds like a good
    one, one that is unlikely to be misunderstood.

    
1104.2  Try 'em and see  AKQJ10::YARBROUGH  I prefer Pi  Wed Aug 02 1989 20:28  13
It's easy to tell the order of A, B, and C: just plug in a large value of 
n, say 1,000,000, and see what the expressions evaluate to. The smaller the 
result for large n, the better the algorithm performs.
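
For instance, a quick Python sketch evaluating the expressions of .0 at
n = 10**6 (reading the exponent in B as sqrt(2*log2(n)), i.e. as .0 wrote it):

    import math

    n = 10 ** 6
    log2n = math.log2(n)                        # about 19.93

    A = n * 2 ** (3.5 * math.sqrt(log2n))       # roughly 5e10
    B = n * log2n * 2 ** math.sqrt(2 * log2n)   # roughly 1.6e9
    C = n * log2n * math.log2(log2n)            # roughly 8.6e7
    print(A, B, C)                              # A > B > C already at n = 10**6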

Aha! An opportunity to quote a limerick written by an old acquaintance:

	Behold the algorithm "A":
	It wends its long, recursive way
	Down through the stack.
	It will be back,
	But only after some delay.

Lynn Yarbrough 
1104.3  AITG::DERAMO  Daniel V. {AITG,ZFC}:: D'Eramo  Thu Aug 03 1989 00:06  33
        To get from A to T, write 2^(3.5 sqrt(log n)) as
        2^(3.5 log n / sqrt(log n)) = n^(3.5 / sqrt(log n)).
        Then you get Algorithm A as being O(T(n)) where
        T(n) = n^(1 + 3.5 / sqrt(log n)).
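
        A quick numeric check of that identity (Python sketch, base-2 logs):

            import math

            for n in (10.0 ** 3, 10.0 ** 6, 10.0 ** 9):
                lhs = n * 2 ** (3.5 * math.sqrt(math.log2(n)))
                rhs = n ** (1 + 3.5 / math.sqrt(math.log2(n)))
                print(lhs, rhs)     # the two forms agree, up to rounding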
        
>>        For any eps>0, there exists a constant c (as a function of eps)
>>        such that T(n) <= c*n^(1+eps) for all n >= n0.
        
        For example, sqrt(log n) = 10 when log n = 100 or n = 2^100
        (about 1.27 * 10^30).  So for n larger than that, T(n) is
        growing more slowly than n^1.35.  BTW, is n0 in the above
        also a function of eps?
        
        Anyway, Algorithm B is O( n * log n * 2^sqrt(2 log n) )
        which by the same manipulation as the one above is the
        same as O of n^(1 + sqrt(2)/sqrt(log n)) * log n. 
        Comparing this to T(n) it has a lower factor of 1/sqrt(log n)
        in the exponent, but is multiplied by log n.  The author
        of .0 probably assumed that the log n would swamp the
        difference between n^(1 + 3.5/sqrt(log n)) and
        n^(1 + sqrt(2)/sqrt(log n)).  But that turns out not to
        be the case.
        
        Using 1.4 for sqrt(2), the quotient of the above two is
        n^(2.1 / sqrt(log n)).  To see how that compares to log
        n, let x be sqrt(log n), so that n is 2^(x^2).  Then the
        above quotient grows as 2^(x^2 * 2.1 / x) = 2^(2.1 x), but
        log n only grows as x^2 -- much more slowly.  In general,
        n^(1 + a/sqrt(log n)) will outgrow n^(1 + b/sqrt(log n)) log n
        if a > b, no matter how small a - b gets.  So that's why
        algorithm B is asymptotically better than algorithm A.
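
        Numerically (Python sketch; with k = log2(n), the quotient
        n^(2.1/sqrt(log n)) becomes 2^(2.1*sqrt(k)) and log n is just k):

            import math

            # Roughly the ratio of Alg. A's bound to Alg. B's bound
            # (using 1.4 for sqrt(2), as above).
            for k in (25, 100, 400, 1600):       # k = log2(n), n = 2**k
                quotient = 2 ** (2.1 * math.sqrt(k))
                print(k, quotient / k)           # grows without bound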
        
        Dan
1104.4  RDVAX::NG  Thu Aug 03 1989 14:56  50
Re .1,.3:

>    For any eps>0, there exists a constant c (as a function of eps)
>    such that T(n) <= c*n^(1+eps) for all n >= n0.
                                   ^^^^^^^^^^^^^^^
    Oops. The underlined words shouldn't be there.
    I was going to say there exist constants c and n0 (as functions
    of eps). But then I could choose a constant c so large that
    T(n) <= c*n^(1+eps) for all n > 0. So it seems unnecessary to include
    n0 and I tried to take it out. Unfortunately, I was too careless.

Re .1:
>    These things are nice, monotonic functions, so I'm not going to bother
>    to make a formal proof that the asymptotic behavior of algorithm A is
>    worse than that of B, which in turn is worse than that of C, but will only
>    stuff in a few data points
and .2:
>It's easy to tell the order of A, B, and C: just plug in a large value of 
>n, say 1,000,000, and see what the expressions evaluate to. The smaller the 
>result for large n, the better the algorithm performs.

    I don't believe that is good enough. How large a value should I evaluate
    at? What if the cross-over point is outside the machine's floating-point
    range, just for the sake of argument?

Re .3:
>        The author
>        of .0 probably assumed that the log n would swamp the
>        difference between n^(1 + 3.5/sqrt(log n)) and
>        n^(1 + sqrt(2)/sqrt(log n)).

    Exactly. It just seems non-intuitive. Using almost exactly the
    same argument, C is shown to be better than B.

    From the same argument, O(log n) is better than O(n^(a/sqrt(log n)))
    for any a>0. Since A(n) satisfies the property which I call
    'asymptotically linear', n log n also satisfies it. That seems
    rather strange to me. But then it may not be as I remember that
    log n grows slower than n^a for any a>0 or is it just for a>1?
    I guess I don't really have a feel for this function called logarithm.

    Thanks for all the help.
                               
Re .1:
>    I have not heard the term asymptotically linear; it sounds like a good
>    one, one that is unlikely to be misunderstood.

    But I think 'asymptotically linear' is slightly misleading because
    the word 'asymptotically' usually refers to behavior as n->infinity.
    Does anybody have a suggestion on this?
1104.5  Some more fuel on the fire  COOKIE::PBERGH  Peter Bergh, DTN 343-0577  Thu Aug 03 1989 17:30  61
                      <<< Re: Note 1104.4 by RDVAX::NG >>>


<> Re .1:
<> >    These things are nice, monotonic functions, so I'm not going to bother
<> >    to make a formal proof that the asymptotic behavior of algorithm A is
<> >    worse than that of B, which in turn is worse than that of C, but will only
<> >    stuff in a few data points
<> and .2:
<> >It's easy to tell the order of A, B, and C: just plug in a large value of 
<> >n, say 1,000,000, and see what the expressions evaluate to. The smaller the 
<> >result for large n, the better the algorithm performs.

<>     I don't believe that is good enough. How large a value should I evaluate
<>     at? What if the cross-over point is outside the machine's floating-point
<>     range, just for the sake of argument?

    Whether it is good enough depends on whether you are discussing
    asymptotic behavior (in the mathematics sense) or, for lack of a better
    term, computer-asymptotic behavior.  Knuth, since he is not talking
    about a specific piece of iron, has no choice: he must use the
    mathematics sense.  I was also using the mathematical sense.  The
    computer-asymptotic behavior is more difficult to ascertain;  I didn't
    take the trouble to find exactly where one algorithm becomes
    better than another.
    
<> Re .3:
<> >        The author
<> >        of .0 probably assumed that the log n would swamp the
<> >        difference between n^(1 + 3.5/sqrt(log n)) and
<> >        n^(1 + sqrt(2)/sqrt(log n)).

<>     Exactly. It just seems non-intuitive. Using almost exactly the
<>     same argument, C is shown to be better than B.

<>     From the same argument, O(log n) is better than O(n^(a/sqrt(log n)))
<>     for any a>0. Since A(n) satisfies the property which I call
<>     'asymptotically linear', n log n also satisfies it. That seems
<>     rather strange to me. But then it may not be as I remember that
<>     log n grows slower than n^a for any a>0 or is it just for a>1?
<>     I guess I don't really have a feel for this function called logarithm.

    log(n) [any base] grows slower than any fixed, positive (>0), power of n.
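
    As a small illustration (Python sketch, picking a = 0.01 as an arbitrary
    example): to find where n**a finally overtakes log2(n), it is easiest to
    work with k = log2(n) rather than n itself, since the crossover sits near
    the edge of double-precision range.

        import math

        # For n = 2**k:  log2(n) = k  and  n**a = 2**(a*k).
        # n**a overtakes log2(n) once a*k exceeds log2(k).
        a = 0.01
        k = 2
        while a * k <= math.log2(k):
            k += 1
        print(k)    # roughly k = 997, i.e. n is about 2**997 (~ 1.3e300)

    Which also touches the worry in .4: for a = 0.01 the crossover is only
    just inside the range of double-precision floating point.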
                                   
<> Re .1:
<> >    I have not heard the term asymptotically linear; it sounds like a good
<> >    one, one that is unlikely to be misunderstood.

<>     But I think 'asymptotically linear' is slightly misleading because
<>     the word 'asymptotically' usually refers to behavior as n->infinity.
<>     Does anybody have a suggestion on this?
    
    The way I understood it, the algorithms are to be compared when n
    becomes arbitrarily large (i.e., n -> infinity), so I don't find the
    term misleading in any way.
    
    Now that I think of it, I seem to recall a definition "a is
    asymptotically b" if and only if a/b is O(1) as the independent
    variable(s) tend to some value (infinity or otherwise).
    {Disclaimer: it is some 20 years since I last taught math:  a) terms
    may have changed since then and b) my memory is not perfect.}
1104.6  RDVAX::NG  Thu Aug 03 1989 20:33  11
Re -.1:    
>    The way I understood it, the algorithms are to be compared when n
>    becomes arbitrarily large (i.e., n -> infinity), so I don't find the
>    term misleading in any way.
    
>    Now that I think of it, I seem to recall a definition "a is
>    asymptotically b" if and only if a/b is O(1) as the independent
>    variable(s) tend to some value (infinity or otherwise).

    n log n satisfies the property, or am I wrong? (I brought that
    up earlier.) But (n log n)/n does not approach 1 as n -> infinity.
1104.7  AITG::DERAMO  Daniel V. {AITG,ZFC}:: D'Eramo  Thu Aug 03 1989 21:33  14
        Hmm.  If T(n) is the time for alg. A, then T(n) / n is
        n^(3.5 / sqrt(log n)) where the log is base two. The
        logarithm base two of this is (log n) * (3.5 / sqrt(log n))
        which is 3.5 sqrt(log n).  So the log of T(n) / n grows,
        slowly, but nonetheless without bound.  Likewise if T(n)
        is replaced with the formula for Alg. B or C.  A function
        that grows "between" n and any n^(1+eps) is not
        necessarily a.l.
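
        In concrete terms (Python sketch, base-2 logs, with k = log2(n)):

            import math

            # T(n)/n = n**(3.5/sqrt(log2 n)) = 2**(3.5*sqrt(log2 n)), so its
            # base-2 log is 3.5*sqrt(k), which keeps growing with k.
            for k in (100, 10000, 1000000):
                print(k, 3.5 * math.sqrt(k))    # 35.0, 350.0, 3500.0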
        
        So perhaps the answer is that the term "asymptotically
        linear" has a precise definition, but that none of
        Algorithms A, B, and C is asymptotically linear.
        
        Dan
1104.8  RDVAX::NG  Fri Aug 04 1989 12:52  2
    I was afraid of this. So now, I need a new name for the property
    I mentioned in .0. Any suggestions?
1104.9  wrong-footed  HERON::BUCHANAN  Andrew @vbo DTN 828-5805  Fri Aug 04 1989 14:18  12
>	Behold the algorithm "A":
>	It wends its long, recursive way
>	Down through the stack.
>	It will be back,
>	But only after some delay.

	Is this really a limerick?   It has the AABBA rhyming structure,
but the feet are all short-long, whereas a normal limerick has feet
short-short-long.

Unusually pedantic today,
Andrew.
1104.10  AITG::DERAMO  Daniel V. {AITG,ZFC}:: D'Eramo  Fri Aug 04 1989 16:32  12
	Well, if "asymptotically linear" is an existing term with
	a known definition, the question is whether the algorithms
	in .0 fit that term.  If it isn't an existing term, then
	I suppose it is free for you to use. :-)  Just don't define
	it to be T(n)/n being O(1).

	By the way ... thinking of these functions of n as sequences,
	do they grow slowly enough that the sum of their reciprocals
	diverges?  The sum of 1/n diverges, but the sum of 1/(n^(1+eps))
	converges for any eps > 0.

	Dan