POV-Ray : Newsgroups : povray.off-topic : Re: Quote of the day
  Re: Quote of the day  
From: Orchid Win7 v1
Date: 3 Jul 2012 16:05:30
Message: <4ff3508a$1@news.povray.org>
> Wow, just wow. Let me rephrase: 'you should never use a sorting
> algorithm because the runtime depends on the input'.

And yet, everybody prefers the unpredictable quicksort over the 
perfectly consistent mergesort, even though on paper they should be 
equally fast...
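(For concreteness, a sketch not taken from the thread: the textbook mergesort is O(n log n) on /every/ input, while the textbook quicksort degrades to O(n^2) when the pivot choice is unlucky, e.g. already-sorted input with the first element as pivot.)

```haskell
-- Mergesort: worst-case O(n log n) on any input.
msort :: Ord a => [a] -> [a]
msort []  = []
msort [x] = [x]
msort xs  = merge (msort as) (msort bs)
  where
    (as, bs) = splitAt (length xs `div` 2) xs
    merge [] ys = ys
    merge xs' [] = xs'
    merge (x:xs') (y:ys)
      | x <= y    = x : merge xs' (y:ys)
      | otherwise = y : merge (x:xs') ys

-- Textbook quicksort: O(n log n) on average, O(n^2) in the worst case
-- (the first element is always the pivot here).
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]
```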

>>> You heard it here folks - Haskell "isn't deterministic". :-D
>
> Deterministic is not the right word, he meant unpredictable. And even
> that is wrong (see also Andrew's reply).

Yeah, most things a computer program does are /deterministic/. This is 
not the same as /predictable/ at all.

(In some sense, whether a program ever halts at all is "unpredictable": 
even though the answer is completely /deterministic/, it's provably 
not /computable/.)

 From a practical perspective, if you have no idea whatsoever how 
efficient or not the code you just wrote is going to be... you have a 
problem.

> I know of only one language that
> is unpredictable and that is Dijkstra's guarded command language. OTOH I
> don't know if anybody ever made an implementation of that.

OOC, do you have a reference for that?

>> I don't understand at which point Haskell decides to stop laziness and
>> start actually computing results. It cannot be lazy forever because it
>> would just run out of memory sooner or later (probably even with
>> algorithms
>> that would normally require O(1) memory in an imperative language).
>
> No. Although you could write a program that needs an infinite amount of
> space. Such a program would make as much sense as a program in an
> imperative language that has an infinite loop.

It's true though that a lazy language can unexpectedly change the 
complexity class of your algorithm if you're not paying attention.

For example, counting the number of nodes in a list /should/ take O(n) 
time and O(1) space. However, the "obvious" thing to write is

   length (x:xs) = 1 + length xs
   length []     = 0

This gives the correct answer, but builds up a chain of n pending 
additions, so it needs O(n) space on top of the O(n) time - which is 
very silly, of course. The "correct" solution is the much less obvious 
function

   length = count 0
     where
       count n (x:xs) = let n' = n+1 in n' `seq` count n' xs
       count n []     = n

This actually runs in O(1) space, like you'd expect.
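(FWIW, the same strict accumulation is what the standard strict left fold gives you - a sketch, with a hypothetical name length' so it doesn't clash with the Prelude:)

```haskell
import Data.List (foldl')

-- foldl' forces the accumulator at every step, so no chain of
-- pending additions builds up: O(n) time, O(1) space.
length' :: [a] -> Int
length' = foldl' (\n _ -> n + 1) 0
```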

On the other hand, the "obvious" way to implement the Fibonacci numbers 
[as if any computer program ever in the history of mankind will ever 
/need/ to look up a Fibonacci number for any purpose other than 
micro-benchmarks] is:

   fib 0 = 1
   fib 1 = 1
   fib n = fib (n-1) + fib (n-2)

This unfortunately takes exponential time, since each call recomputes 
the same subproblems over and over again. But the trixy incantation

   fibs = 1 : 1 : zipWith (+) fibs (tail fibs)

is only O(n). And it works exactly /because/ of laziness; under strict 
evaluation, this definition would never terminate.
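(A tiny usage sketch: each element of the list is computed once and then shared, so taking a prefix or indexing into it is cheap.)

```haskell
fibs :: [Integer]
fibs = 1 : 1 : zipWith (+) fibs (tail fibs)

-- The first ten Fibonacci numbers, by just taking a prefix of the
-- lazily-built infinite list.
main :: IO ()
main = print (take 10 fibs)  -- [1,1,2,3,5,8,13,21,34,55]
```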

(There are similar incantations for the prime numbers and for Pascal's 
triangle.)
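(Those two incantations, for the record - standard Haskell folklore, not definitions from this thread:)

```haskell
-- Pascal's triangle: each row is the previous row shifted left and
-- right, then added pointwise.
pascal :: [[Integer]]
pascal = iterate (\row -> zipWith (+) (0 : row) (row ++ [0])) [1]

-- The classic (tiny but inefficient) lazy "sieve": keep the head,
-- filter its multiples out of the rest, recurse.
primes :: [Integer]
primes = sieve [2..]
  where sieve (p:xs) = p : sieve [x | x <- xs, x `mod` p /= 0]
```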


