On 12/4/20 11:43 AM, Mr wrote:
> William F Pokorny <ano### [at] anonymous org> wrote:
> ... the
> radiosity algorithm is supposed to take unique samples to avoid wasting
> computation; that is precisely why I mentioned truncating the precision: so
> that the closest samples would falsely appear as duplicates, and the lines
> suppressed in the initial data would instead be considered merged by that
> (truncation) threshold criterion. Does that naive approach make sense to
> someone more experienced than me, or am I missing something?
>
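As I read it, the proposal amounts to something like this toy sketch (the
record layout and cell size below are invented, nothing to do with the
actual sample-file format):

import math

# Toy stand-in for saved radiosity samples: (x, y, z) position plus a value.
# The real sample file stores much more per entry; this only sketches the idea.
samples = [
    (1.00012, 2.00031, 3.00007, 0.50),
    (1.00019, 2.00028, 3.00011, 0.52),   # effectively a duplicate of the first
    (5.0,     1.0,     0.0,     0.20),
]

def merge_by_truncation(samples, cell=0.001):
    """Snap each position to a grid of size `cell`; samples that land in the
    same cell are treated as duplicates and only the first one is kept."""
    kept = {}
    for s in samples:
        key = tuple(math.floor(c / cell) for c in s[:3])
        kept.setdefault(key, s)          # first sample in a cell wins
    return list(kept.values())

print(len(merge_by_truncation(samples)))  # -> 2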
Expect quite a few things in play. There is some initial sampling
density relative to the scene depending on numerous things for any
given frame. If we change one or more of the dependencies enough,
additional samples will be shot. Of those, "perhaps" some will be added
to the saved file.
In the moving camera case I think it likely new, duplicate or very
nearby lines (samples) don't get added after a point because some
criterion (error, distance, ?) kicks in on seeing the local context of
the existing samples - and any in-hand, "duplicate," potential sample
is tossed in the bit bucket - but I'm guessing.
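In the abstract, I'd guess that check looks something like the following
toy sketch (function name and threshold invented; the real code presumably
weighs error bounds, normals, and so on):

def accept_sample(new_pos, existing_positions, min_spacing=0.01):
    """Reject a candidate sample if an already-stored sample sits within
    min_spacing of it - a stand-in for the distance/error criterion above."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return all(dist2(new_pos, p) > min_spacing ** 2 for p in existing_positions)

stored = [(1.0, 2.0, 3.0)]
print(accept_sample((1.001, 2.0, 3.0), stored))   # False - too close, tossed
print(accept_sample((1.5,   2.0, 3.0), stored))   # True  - kept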
---
I wonder if anyone has tried creating a radiosity file at a resolution
larger than the final frame/scene?
I'm not aware of a way to force a stop immediately after photon /
radiosity sample file creation. Is there one?
It would be handy to avoid the cost of the final-pass render, and of
saving other intermediate and output files, when just creating the
larger radiosity file for the actual final frame/scene renders.
Bill P.