  Re: Turing determination  
From: Darren New
Date: 23 Jul 2009 15:21:58
Message: <4a68b856$1@news.povray.org>
Warp wrote:
>> You realize that in O(1), the 1 is the *output* of the analysis, right?
>   I don't understand that.

The notation puts its result in the location where most functions take their 
input: the 1 in O(1) is the answer the analysis produces. O(1) has nothing to 
do with the size of the input being fixed.
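
A rough sketch of what I mean (Python; the function names are just 
illustrative): both functions below accept input of any size, yet the 
analysis of the first one *produces* the constant 1 and the analysis of 
the second produces n.

def first_element(xs):
    # O(1): the same number of steps no matter how long xs is.
    return xs[0]

def linear_search(xs, target):
    # O(n): the number of steps grows in proportion to len(xs).
    for i, x in enumerate(xs):
        if x == target:
            return i
    return -1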

>>>   The only way for an algorithm to be O(1) is that it does *not* depend on
>>> any amount of input.
> 
>> Correct. The algorithm runs the same number of steps regardless of how much 
>> input you give it. That's O(1). "Return the first element in your input" is 
>> an O(1) operation.
> 
>   How is that different from "sort the first 10 elements of the input"
> (which is basically equivalent to "sort an array of 10 elements")?

That's O(1), certainly.

>>>   But O(1) is not nonsensical. It's commonly used.
> 
>> It's commonly used, but not on algorithms with fixed-size inputs. If you 
>> have a sort that only accepts 10 integers to sort, O() isn't defined, 
>> because O() asks how fast the algorithm runs as you give it more than 10 
>> inputs.
> 
>   Hence with your "return the first element in your input" O() isn't defined
> either.

When you said "sort 10 elements is O(1)", I interpreted you to mean that any 
algorithm that takes a fixed-size input is O(1). There's a subtle difference 
between "sort the first 10 elements" and "sort 10 elements".

If you meant "any algorithm that only looks at a fixed number of elements of 
its input, regardless of how many are supplied, is O(1)", I'd probably agree 
with that after thinking about it a while and clearly defining the term 
"input". That isn't how I interpreted you, though: I had understood you to be 
talking about algorithms whose input is by nature fixed.
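
To make that difference concrete, here's a rough sketch (Python, with names 
I've made up for illustration). The first function is defined for input of 
any length and never touches anything past the tenth element, so its running 
time stays bounded as n grows: O(1). The second only ever accepts exactly 10 
elements, so the question of what happens as n grows never arises.

def sort_first_ten(xs):
    # Defined for any length of input; only the first 10 elements are
    # examined and moved, so the work does not grow with len(xs).
    xs[:10] = sorted(xs[:10])

def sort_ten(xs):
    # The input size is fixed by definition; there is no growing n
    # to analyze.
    assert len(xs) == 10
    return sorted(xs)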

For example, it's not reasonable to say "since real computers have a fixed 
amount of memory, any algorithm you run on a real computer is by definition 
O(1)". You said

 > Any problem on a computer with limited resources is O(1).

Any problem on a computer with limited resources doesn't have an order 
notation, since Big-O is defined only in the limit as the input grows without 
bound. "Sort the first 2^32 elements and crash if you have more" is not an 
algorithm to which Big-O notation applies, because once "n" gets to 2^33, 
there's no function of n, O(1) or otherwise, that says how long the algorithm 
takes. It just isn't defined.
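
A hedged sketch of that last example (Python; the cutoff and the error are 
just placeholders): past the limit the function never finishes at all, so 
there is no running time, as a function of n, to bound for large n, and that 
is exactly the regime Big-O talks about.

def sort_bounded(xs):
    # Handles at most 2**32 elements; beyond that it simply fails, so
    # "how long does it take as n grows?" has no answer for n > 2**32.
    if len(xs) > 2**32:
        raise MemoryError("more input than this machine can hold")
    return sorted(xs)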

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."

