Darren New <dne### [at] san rr com> wrote:
> Warp wrote:
> > I don't know why, but I still read that as "O(1) doesn't make sense".
> You realize that in O(1), the 1 is the *output* of the analysis, right?
I don't understand that.
> > The only way for an algorithm to be O(1) is that it does *not* depend on
> > any amount of input.
> Correct. The algorithm runs the same number of steps regardless of how much
> input you give it. That's O(1). "Return the first element in your input" is
> an O(1) operation.
How is that different from "sort the first 10 elements of the input"
(which is basically equivalent to "sort an array of 10 elements")?
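To make the comparison concrete, here's a minimal C++ sketch (my own illustration; the function names are made up): both functions accept an arbitrarily long vector, and both perform an amount of work that does not grow with the input's length.

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // "Return the first element of your input": one step, no matter how
    // long the input is.
    int first_element(const std::vector<int>& input) {
        return input.front();   // assumes the input is non-empty
    }

    // "Sort the first 10 elements of the input": a bounded amount of work
    // (at most 10 elements are sorted), no matter how long the input is.
    void sort_first_ten(std::vector<int>& input) {
        std::sort(input.begin(),
                  input.begin() + std::min<std::size_t>(10, input.size()));
    }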
> > But O(1) is not nonsensical. It's commonly used.
> It's commonly used, but not on algorithms with fixed-size inputs. If you
> have a sort that only accepts 10 integers to sort, O() isn't defined,
> because O() asks how fast does the algorithm run as you give it more than 10
> inputs.
Hence O() isn't defined for your "return the first element in your input"
either.
--
- Warp