> [And in any case, why bother saying so? Users could probably go through
> the UI and pick out a dozen features that are 'not very important', but
> they don't feel the need to post in a newsgroup about it ...]
Because I've seen other people duped into purchasing software that purports
to be 'faster' than other packages, based on CPU time. However, after the
money is spent, these people find the wall clock time (i.e. the real time
that we humans actually observe) is worse than that of the alternatives. Not
a pleasant outcome.
Also, people need to be aware that actual, real-world rendering time (i.e.
the time we humans actually live by) is wall clock time, not CPU time. It
serves no good purpose to tell users "I rendered that image in 37 CPU
minutes" but not tell them "yeah, it took 6 hours of real, human time".
I wasn't throwing stones. I was just making sure people are aware of the
difference.
Jim
"Chris Cason" <newsadmin-despam-@povray-no-spam.org> wrote in message
news:3bde990b@news.povray.org...
>
> "Jim Kress" <dea### [at] kressworkscom> wrote in message
news:3bde2b80@news.povray.org...
> > Used Kernel and User Mode CPU times are interesting but not very
> > important.
>
> If you are attempting a benchmark, it is almost essential if you want a
> representative result without having to lock your machine up during the
> render. Which is why it was added.
>
> [And in any case, why bother saying so? Users could probably go through
> the UI and pick out a dozen features that are 'not very important', but
> they don't feel the need to post in a newsgroup about it ...]
>
> -- Chris