On 14.06.2010 08:11, SharkD wrote:
> How good is the radiosity pretrace as an estimator of how long a scene
> will take to render? Right now it takes about 10 seconds to render a
> pixel, even with a reduced texture quality.
It's usually not a good estimator, because several factors skew it in
both directions:
The Good News:
- Unless the radiosity pretrace settings are poor, the pretrace will
take significantly more radiosity samples per pixel than the main render
(with each sample requiring tens, hundreds, or even up to 1600
additional rays to be shot, albeit with a few simplifications).
- Using low_error_factor significantly increases the "cost" of sample
lookup (i.e. finding samples that have already been taken nearby) during
pretrace, while leaving that cost unchanged during the main render (see
the settings sketch after this list).
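
For reference, a minimal sketch of where these knobs live in the scene
file; the values are purely illustrative, not recommendations:

    global_settings {
      radiosity {
        count 200             // rays shot per radiosity sample (POV-Ray allows up to 1600)
        low_error_factor 0.5  // tightens the error bound during pretrace only
        pretrace_start 0.08   // first pretrace pass, as a fraction of the image size
        pretrace_end 0.01     // final pretrace pass
      }
    }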
The Bad News:
- As the pretrace progresses, the total number of samples keeps
increasing (of course), which also increases the overhead of checking
which samples are suitable for re-use; thus, sample re-use is
comparatively more "expensive" during the main render than during the
first pretrace steps.
- Some features are disabled during pretrace, most notably antialiasing
and focal blur, so their added "cost" doesn't show up during pretrace
(illustrated after this list).
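
To illustrate, a camera like the following (the values are purely
illustrative) adds per-pixel work that only the main render pays for:

    camera {
      location <0, 1, -5>
      look_at <0, 1, 0>
      aperture 0.4          // a non-zero aperture enables focal blur
      focal_point <0, 1, 0>
      blur_samples 40       // extra rays per pixel, skipped during pretrace
    }

Antialiasing (the +A command-line switch) adds to the main render in the
same way.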
So, all in all, the pretrace may turn out either slower or faster than
the main render - it depends.
A smaller-sized test render is probably a much better indicator of the
main render's performance.
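
For example, assuming a scene file named scene.pov (the name and
resolution are placeholders), rendering at a quarter of the final
dimensions touches only about 1/16 of the pixels:

    povray scene.pov +W200 +H150

Since pretrace_start and pretrace_end are fractions of the output size,
the radiosity pretrace scales down along with the render, which is part
of what makes a small test render a reasonable predictor.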