On 3/11/2011 11:54 PM, clipka wrote:
> On 3/11/2011 12:50 AM, stbenge wrote:
>
>> IME, the best (not to mention the most accessible) way to do this is to
>> use a proximity pattern, which is basically just an object pigment
>> blurred in 3D space. The advantage of using proximity patterns is that
>> you can obtain *outside* edge data as well as inside edge data.
>>
>> Any other AO-like ray tracing/casting method would require some new
>> routines to be added to POV's rendering core... but I could be wrong.
>> (Anything can be done if you're willing to wait through a long render.)
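The "object pigment blurred in 3D space" idea above can be sketched as a Monte Carlo average of inside/outside tests over a small blur radius. This is a hypothetical illustration, not POV-Ray's actual code: the `inside_sphere` test stands in for an arbitrary object's inside test, and `blur_radius`/`samples` are made-up parameters.

```python
import math
import random

random.seed(0)  # deterministic sampling for the sketch

def inside_sphere(p, center=(0.0, 0.0, 0.0), radius=1.0):
    """Inside test for a simple implicit object (a unit sphere here).
    In a real renderer this would be the object's own inside test."""
    return math.dist(p, center) <= radius

def proximity(p, blur_radius=0.25, samples=200):
    """Estimate a proximity value in [0, 1] by averaging inside-tests
    over random offsets within blur_radius of p -- i.e. the object
    pattern 'blurred' in 3D space. Values near 1 are deep inside the
    object, near 0 are well outside, and intermediate values appear on
    BOTH sides of the surface, which is why this approach yields
    outside-edge data as well as inside-edge data."""
    hits = 0
    for _ in range(samples):
        # Uniform jitter inside a ball of radius blur_radius
        # (rejection sampling from the enclosing cube).
        while True:
            ox, oy, oz = (random.uniform(-1, 1) for _ in range(3))
            if ox * ox + oy * oy + oz * oz <= 1.0:
                break
        q = (p[0] + ox * blur_radius,
             p[1] + oy * blur_radius,
             p[2] + oz * blur_radius)
        hits += inside_sphere(q)
    return hits / samples

# Deep inside, on the surface, and well outside the unit sphere:
print(proximity((0, 0, 0)))  # ~1.0
print(proximity((1, 0, 0)))  # ~0.5 (edge: partial coverage on both sides)
print(proximity((2, 0, 0)))  # ~0.0
```

The same averaging could in principle be restricted to a chosen subset of objects simply by swapping in a different inside test, which is the flexibility the radiosity-based attempt lacked.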
>
> Trevor's idea was actually the path I followed with my earlier attempt
> at a proximity pattern. Code-wise it was not much of a change, as
> radiosity already has everything that's needed, even including some
> optimization. However, it could not do outside edges, and the approach
> also didn't allow for choosing only a subset of objects to affect the
> pattern - it would always use the complete scene.
Wasn't there also an issue with how radiosity precaches samples on a
screen-level basis, which was in turn causing the prox pattern to give
different results depending on the camera location/screen size/etc.?
> My next attempt would probably be based on the "blurred object"
> approach; syntax-wise it might even be a simple extension to the object
> pattern. But don't hold your breath.
I won't, but I am wondering how you plan to approach the problem...
Sam