Mr wrote:
> What's the way to do network rendering with the beta version of pov?
> Does SMPOV work with it? Does the new built-in SMP implementation support this
> out of the box? Or do you set up computer grids? (I'm searching for
> anything that could speed up my render times besides what's said in the
> FAQ and documentation.)
We used to have a cloud solution hosted at www.imp.org, but it kind of
stalled a few years ago due to lack of jobs to run.
The current push is to get SMP working right on a single machine (beta
3.7); after that, things like splitting the workload over a network
will be much easier problems to solve.
--
Chambers
Mr wrote:
> What's the way to do network rendering with the beta version of pov?
> Does SMPOV work with it? Does the new built-in SMP implementation support this
> out of the box? Or do you set up computer grids? (I'm searching for
> anything that could speed up my render times besides what's said in the
> FAQ and documentation.)
Given how SMPov works, it should be compatible with POV-Ray 3.7 as well.
However, there are a few caveats when using it with sophisticated
lighting like photons and radiosity (though I assume these were present
in POV-Ray 3.6 as well): out of the box, render nodes will do duplicate
work for both photons and radiosity (the latter applies especially to
shots with a high recursion depth, but also when using a lot of
reflection), and what's even worse, edges of the SMPov tiles are likely
to show up as artifacts in the assembled shot when using radiosity.
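The tile-splitting approach itself is straightforward. Below is a minimal sketch of how a frame might be divided into horizontal strips, one per render node, in the spirit of what SMPov does. The `+SR`/`+ER` (start/end row) switches are standard POV-Ray command-line options; the scene name, image size, and output naming here are made-up illustration values, and a real setup would also have to collect and stitch the partial images.

```python
def strip_bounds(height, nodes):
    """Return inclusive 1-based (start_row, end_row) pairs, one per node.

    Rows are distributed as evenly as possible; the first `height % nodes`
    strips get one extra row.
    """
    base, extra = divmod(height, nodes)
    bounds, row = [], 1  # POV-Ray rows are 1-based
    for i in range(nodes):
        rows = base + (1 if i < extra else 0)
        bounds.append((row, row + rows - 1))
        row += rows
    return bounds


def povray_commands(scene, width, height, nodes):
    """Build one POV-Ray invocation per node, each rendering one strip."""
    return [
        f"povray +I{scene} +W{width} +H{height} +SR{a} +ER{b} +O{scene}_{i}.png"
        for i, (a, b) in enumerate(strip_bounds(height, nodes))
    ]


for cmd in povray_commands("scene.pov", 800, 600, 4):
    print(cmd)
```

Note that this naive split is exactly where the radiosity artifacts come from: each strip computes its radiosity samples independently, so values can jump at strip boundaries. Overlapping the strips and cropping is one common workaround.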
Other approaches use multiple full renders of slightly different scenes:
for anti-aliasing and focal blur (slightly different camera parameters),
smooth shadows (slightly different light source positions), blurred
reflections (slightly differently perturbed surface normals), and the
like. This approach is actually said to be quite efficient when a heavy
combination of all of these is used, and AFAIK it wasn't even invented
with network rendering in mind, but simply to avoid an explosion of
render time. By varying photon, radiosity, and media sampling
parameters, a good deal of jitter can probably be introduced to these as
well, so that they can get away with significantly lower quality
settings, too.
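The multi-pass idea reduces to averaging: render the scene N times with independently jittered parameters and take the per-pixel mean of the results. Here is a minimal sketch of that accumulation step; `render_pass` is a hypothetical stand-in for an actual renderer invocation with jittered camera/light parameters, and the tiny 2x2 "image" it returns is fabricated purely for illustration.

```python
import random


def render_pass(jitter):
    # Placeholder for a real render: in practice this would invoke the
    # renderer with jittered camera or light parameters and load the
    # resulting image. Here we fake a 2x2 image of float pixel values
    # whose samples wobble by the given jitter amount.
    return [[0.5 + jitter, 0.5 - jitter],
            [0.25 + jitter, 0.75 - jitter]]


def average_passes(n_passes, seed=0):
    """Accumulate n_passes jittered renders and return their per-pixel mean."""
    rng = random.Random(seed)
    acc = None
    for _ in range(n_passes):
        img = render_pass(rng.uniform(-0.01, 0.01))
        if acc is None:
            acc = [[0.0] * len(row) for row in img]
        for y, row in enumerate(img):
            for x, v in enumerate(row):
                acc[y][x] += v
    return [[v / n_passes for v in row] for row in acc]


print(average_passes(8))
```

Each pass is an independent full render, so the passes distribute over a network trivially; the averaging converges the jittered samples toward the blurred/soft result the jitter was designed to approximate.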