Is there any way to do a rough estimate of time needed to render any
given scene when various settings are used with certain computer
configurations? I.e. can you estimate how long it would take to
render five glass spheres with five point lights over a plane at:

  resolution:    1600 x 1200, 800 x 600, 640 x 480, 320 x 240
  antialiasing:  on/off
  radiosity:     on/off
  CPU:           100 MHz, 200 MHz, 800 MHz, 1000 MHz, 2000 MHz, 6000 MHz
I'm curious because it'd be interesting to know how fast a scene I
have that took 51:09:07 (h:m:s) at 700 MHz for 640 x 480 would render
on other machines (without actually trying it on, say, my 486/66).
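
For a very rough back-of-envelope I've been playing with something
like the sketch below. It's pure guesswork: it assumes render time
scales linearly with pixel count and inversely with clock speed, and
it ignores antialiasing, radiosity, cache, memory, and architecture
differences entirely, so treat the numbers as ballpark at best (the
names are just made up for illustration):

# Rough estimate: time ~ pixels / MHz, scaled from one measured point.
# Ignores AA, radiosity, cache, and per-architecture differences.

def estimate_seconds(known_secs, known_mhz, known_pixels,
                     target_mhz, target_pixels):
    # Scale the known render time to a new clock speed and resolution.
    return known_secs * (known_mhz / target_mhz) * (target_pixels / known_pixels)

# Measured point: 51:09:07 at 700 MHz for 640 x 480
known = 51 * 3600 + 9 * 60 + 7   # 184147 seconds

for mhz in (100, 200, 800, 1000, 2000, 6000):
    secs = estimate_seconds(known, 700, 640 * 480, mhz, 640 * 480)
    print("%5d MHz, 640 x 480: %7.1f hours" % (mhz, secs / 3600.0))

By that logic the 51-hour render would drop to roughly 18 hours at
2000 MHz, but I doubt real machines scale anywhere near that cleanly.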
--
Tim Cook
http://empyrean.scifi-fantasy.com
-----BEGIN GEEK CODE BLOCK-----
Version: 3.12
GFA dpu- s: a?-- C++(++++) U P? L E--- W++(+++)>$
N++ o? K- w(+) O? M-(--) V? PS+(+++) PE(--) Y(--)
PGP-(--) t* 5++>+++++ X+ R* tv+ b++(+++) DI
D++(---) G(++) e*>++ h+ !r--- !y--
------END GEEK CODE BLOCK------