David Given <dg### [at] cowlark com> wrote:
> Can anyone shed any light on this? Right now I'm completely stumped.
>
As others have mentioned, such problems usually occur when the samples value is
simply set too low. (Also noted: Only 1 interval is recommended for method 3,
which will then create more intervals automatically, as needed.) BTW, the
documentation's stated defaults for intervals and samples are incorrect. The
real defaults (at least for method 3) are: intervals 1, samples 10.
I find it strange that even 1000 samples doesn't improve things for you. How
about changing jitter to some *higher* value, like 2? I confess that I don't
usually mess with jitter in media, and leave it at its default (whatever that
is!) Perhaps a too-low jitter value is causing the banding, by not allowing the
media to 'spread out' across intervals. (My assumption is based on jitter in
photons--lowering the jitter value *there* does produce distinct photon bands.)
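To make the suggestion concrete, here's a minimal sketch of the kind of media block I mean, with intervals left at 1 for method 3, a raised samples value, and a higher jitter. All the density and scattering values here are made up for illustration, not taken from your scene:

```pov
// Hypothetical container sphere with method-3 scattering media.
// Values are illustrative; tune samples/jitter against your banding.
sphere {
  <0, 0, 0>, 10
  pigment { rgbt 1 }   // fully transparent surface; media does the work
  hollow
  interior {
    media {
      method 3
      intervals 1      // method 3 subdivides further on its own
      samples 10       // raise this first if banding persists
      jitter 2         // the *higher* jitter value suggested above
      scattering { 1, rgb 0.5 }
      density {
        bozo
        turbulence 0.8
        color_map {
          [0.0 rgb 0]
          [0.6 rgb 0]
          [1.0 rgb 1]
        }
      }
    }
  }
}
```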
As far as how media jitter affects animation: I don't really know; haven't yet
animated any clouds. (A good experiment to try.) But I suspect that it would
look OK, IF the samples are high enough, and if the media density pattern has
smooth transitions, as yours does. Just a guess.
My own approach to making clouds like this is to make lots of *individual*
clouds instead (in their own spheres.) Not as difficult as it sounds, using a
#while loop (with random turbulence for each cloud's density.) I *think* the
scene might render faster as well, for two reasons: the camera doesn't have to
look through an entire large media-filled sphere; and the samples value can be
much lower per cloud.
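A rough sketch of that #while-loop idea, in case it helps. The counts, seed, sizes and turbulence range are all invented for the example; the point is just that each cloud sphere gets its own random turbulence and a low per-cloud samples value:

```pov
// Many small individual clouds instead of one big media sphere.
#declare Rnd = seed(123);
#declare I = 0;
#while (I < 20)
  sphere {
    0, 1
    pigment { rgbt 1 }
    hollow
    interior {
      media {
        method 3
        intervals 1
        samples 5    // can be much lower per small cloud
        scattering { 1, rgb 0.7 }
        density {
          spherical                    // fades to zero at the edge
          turbulence 0.5 + rand(Rnd)   // random per-cloud turbulence
          color_map { [0 rgb 0] [1 rgb 0.8] }
        }
      }
    }
    scale <2 + 2*rand(Rnd), 1, 1 + rand(Rnd)>
    translate <20*rand(Rnd) - 10, 5 + 2*rand(Rnd), 20*rand(Rnd) - 10>
  }
  #declare I = I + 1;
#end
```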
Ken