POV-Ray : Newsgroups : povray.unofficial.patches
Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?

From: Nekar Xenos
Subject: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 20 Jul 2014 13:44:19
Message: <op.xja4vvrwufxv4h@xena.home>
-- 
-Nekar Xenos-



From: clipka
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 20 Jul 2014 14:45:29
Message: <53cc0e49@news.povray.org>
On 20.07.2014 19:44, Nekar Xenos wrote:
>
> Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?

Distributed raytracing patch: Planned for official POV-Ray; don't hold 
your breath though.

Stochastic / Monte Carlo raytracing patch: Maybe UberPOV 
(https://github.com/UberPOV/UberPOV) already offers what you're looking 
for; it already uses stochastic rendering for various features. If 
stochastic global illumination is what you're after, just drop me a note 
and I'll make that happen as well.



From: Nekar Xenos
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 21 Jul 2014 15:12:14
Message: <op.xjc3mdmvufxv4h@xena.home>
On Sun, 20 Jul 2014 20:45:27 +0200, clipka <ano### [at] anonymousorg> wrote:

> On 20.07.2014 19:44, Nekar Xenos wrote:
>>
>> Is anyone working on a distributed / stochastic / Monte  
>> Carlo ray-tracing patch for 3.7?
>
> Distributed raytracing patch: Planned for official POV-Ray; don't hold  
> your breath though.
>
> Stochastic / Monte Carlo raytracing patch: Maybe UberPOV  
> (https://github.com/UberPOV/UberPOV) already offers what you're looking  
> for; it already uses stochastic rendering for various features. If  
> stochastic global illumination is what you're after, just drop me a note  
> and I'll make that happen as well.
>

I'm not 100% sure what the correct term is; the Wikipedia entry was a bit 
confusing. I can describe it as following the ray from the light source to 
the screen - just like MCPov, only with all of the 3.7 features.

-- 
-Nekar Xenos-



From: clipka
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 21 Jul 2014 18:53:51
Message: <53cd99ff$1@news.povray.org>
On 21.07.2014 21:12, Nekar Xenos wrote:
> On Sun, 20 Jul 2014 20:45:27 +0200, clipka <ano### [at] anonymousorg> wrote:
>
>> On 20.07.2014 19:44, Nekar Xenos wrote:
>>>
>>> Is anyone working on a distributed / stochastic / Monte
>>> Carlo ray-tracing patch for 3.7?
>>
>> Distributed raytracing patch: Planned for official POV-Ray; don't hold
>> your breath though.
>>
>> Stochastic / Monte Carlo raytracing patch: Maybe UberPOV
>> (https://github.com/UberPOV/UberPOV) already offers what you're
>> looking for; it already uses stochastic rendering for various
>> features. If stochastic global illumination is what you're after, just
>> drop me a note and I'll make that happen as well.
>>
>
> I'm not 100% sure what the correct term is, the Wikipedia entry was a
> bit confusing. I can describe it as following the ray from the light
> source to the screen. Just like MCPov, only with all of the 3.7 features.

Hmm, there's definitely some misunderstanding going on here.

"Following the ray from the light source to the screen" is /forward 
raytracing/, as opposed to following the ray from the screen to the 
object surface and ultimately to the light source, aka /backward 
raytracing/. The latter is the classic approach, and the one used by 
POV-Ray.

MCPov also uses /backward/ raytracing like POV-Ray.

/Stochastic/ raytracing can be done with both approaches (I guess it is 
the only way to do forward raytracing, but it is also a way to do 
backward raytracing), or even with combinations of the two, such as 
bidirectional raytracing, Metropolis Light Transport or some such.
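
To make the distinction concrete, here's a minimal Python sketch - it is 
not POV-Ray, UberPOV or MCPov code, and the scene helpers are placeholders 
invented purely for illustration. Backward raytracing starts every ray at 
the camera, and the "stochastic" part only enters where a hit point spawns 
randomly chosen secondary rays:

# Minimal illustrative sketch (not from any of the programs discussed here):
# backward raytracing with a stochastic (Monte Carlo) estimate of the
# indirect light. The scene queries are placeholders; only the control flow
# matters.
import random

def intersect(ray):
    """Placeholder scene query: returns (point, normal, albedo) or None."""
    return None

def direct_light(hit):
    """Placeholder for the classic light-source loop at the hit point."""
    return (0.0, 0.0, 0.0)

def random_hemisphere_dir(normal):
    """Uniformly pick a direction on the hemisphere around 'normal'."""
    while True:
        d = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
        length = sum(c * c for c in d) ** 0.5
        if 0.0 < length <= 1.0:
            d = tuple(c / length for c in d)
            if sum(a * b for a, b in zip(d, normal)) > 0.0:
                return d

def radiance(ray, depth):
    """Backward ('eye') raytracing: rays start at the camera, not at a light."""
    hit = intersect(ray)
    if hit is None or depth == 0:
        return (0.0, 0.0, 0.0)
    point, normal, albedo = hit
    color = direct_light(hit)                          # deterministic part
    bounce = (point, random_hemisphere_dir(normal))    # stochastic part
    indirect = radiance(bounce, depth - 1)
    return tuple(c + a * i for c, a, i in zip(color, albedo, indirect))

Forward raytracing would instead start the loop at the light sources and 
follow rays towards the camera; averaging many of the stochastic estimates 
above per pixel is what makes the whole thing "Monte Carlo".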


So let's forget about the technical terms, and start at the other end:

What is it about MCPov that you actually want?

Blurred reflections? Definitely go for UberPOV, as it has them. Yay!

Blurred refractions? Not supported by UberPOV yet, but shouldn't be too 
difficult to add once I decide on a nice syntax.

A less artifact-prone replacement for the effect that official POV-Ray 
uses radiosity for (so-called Global Illumination)? Raise your hand, and 
I'll make it happen in UberPOV.

Render over and over again while you sit and watch, until you're happy 
with the result? Not supported in UberPOV yet; it's on the agenda, but 
may require quite some intrusion into the POV-Ray architecture. What 
UberPOV does already support, however, is an "anti-aliasing" mode (or, 
more to the point, an oversampling mode) that allows you to specify what 
quality you'd be happy with, and UberPOV will do the 
rendering-over-and-over-again in the background until the quality 
criteria are met.
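
To sketch the idea behind that oversampling mode in a few lines of Python - 
this is not UberPOV's actual implementation, and the threshold and 
sample-count parameters are made up - the point is simply that each pixel 
keeps receiving extra samples until a noise estimate of its running average 
drops below the quality you asked for:

# Illustrative sketch only (not UberPOV source): keep adding samples to a
# pixel until the estimated noise of its running average falls below a
# user-chosen threshold, instead of refining the whole image uniformly.
import math, random

def sample_pixel(x, y):
    """Placeholder for tracing one stochastic ray through pixel (x, y)."""
    return 0.5 + random.uniform(-0.2, 0.2)   # stand-in for a noisy radiance value

def render_pixel(x, y, threshold=0.01, min_samples=8, max_samples=4096):
    total = total_sq = 0.0
    n = 0
    while n < max_samples:
        v = sample_pixel(x, y)
        total += v
        total_sq += v * v
        n += 1
        if n >= min_samples:
            mean = total / n
            variance = max(total_sq / n - mean * mean, 0.0)
            # Standard error of the running mean shrinks as samples accumulate.
            if math.sqrt(variance / n) < threshold:
                break
    return total / n, n

value, samples_used = render_pixel(0, 0)
print("pixel value %.3f after %d samples" % (value, samples_used))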


As a rule of thumb, if MCPov has it, it is also planned (if not already 
implemented) for UberPOV.



From: scott
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 22 Jul 2014 03:11:32
Message: <53ce0ea4$1@news.povray.org>
> A less artifact-prone replacement for the effect that official POV-Ray
> uses radiosity for (so-called Global Illumination)? Raise your hand, and
> I'll make it happen in UberPOV.

I don't know what you have in mind for this, if it's not the 
render-over-and-over-again approach?

> Render over and over again while you sit and watch, until you're happy
> with the result? Not supported in UberPOV yet; it's on the agenda, but
> may require quite some intrusion into the POV-Ray architecture. What
> UberPOV does already support, however, is an "anti-aliasing" mode (or,
> more to the point, an oversampling mode) that allows you to specify what
> quality you'd be happy with, and UberPOV will do the
> rendering-over-and-over-again in the background until the quality
> criteria are met.

This seems like a better approach, as with MCPov you're often left 
re-rendering the entire image waiting for just a small dark area to 
smooth out.

So if you implement the less artefact-prone GI and combine it with the 
per-pixel oversampling it should be something better than MCPov? Sounds 
good to me! Hand firmly raised!



From: clipka
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 22 Jul 2014 08:26:33
Message: <53ce5879$1@news.povray.org>
On 22.07.2014 09:11, scott wrote:
>> A less artifact-prone replacement for the effect that official POV-Ray
>> uses radiosity for (so-called Global Illumination)? Raise your hand, and
>> I'll make it happen in UberPOV.
>
> I don't know what you have in mind for this, if it's not the render over
> and over again approach?

Essentially that, yes.

At the moment, for Global Illumination UberPOV still uses the same 
approach as POV-Ray: On an object's surface it chooses some points more 
or less at random, for which it computes indirect illumination with high 
precision(*), and caches this information; for any points in between, 
indirect illumination is interpolated from nearby cached samples. This 
interpolation can lead to systematic artifacts that are difficult to get 
rid of - obviously it's not enough to just sample the image over and 
over again with the same set of cached samples.

(*) Computation of indirect illumination is done simply by shooting a 
number of secondary rays and seeing what light might come in from other 
objects there; the more rays are shot, the more precise the result will be.

To avoid the typical POV-Ray radiosity artifacts, MCPov (re-)computes 
indirect illumination for each and every point on an object's surface, 
and doesn't cache the results at all. Usually this is done with a 
comparatively low precision, which also leads to artifacts; however, 
they manifest as random noise that can be reduced by oversampling pixels 
over and over again.
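
A toy Python comparison of the two strategies, purely for illustration - 
neither function is real POV-Ray or MCPov code, and "sample_indirect" 
merely stands in for shooting secondary rays:

# Toy comparison of the two strategies (illustration only).
import random

def sample_indirect(point, rays):
    """Stand-in for shooting 'rays' secondary rays from 'point' and
    averaging what comes back; more rays means less noise."""
    return sum(random.random() for _ in range(rays)) / rays

# POV-Ray / UberPOV radiosity style: few high-precision samples, cached and
# reused (interpolated) for nearby points - fast, but the interpolation can
# produce systematic, repeatable artifacts.
cache = {}
def indirect_cached(point):
    key = round(point, 1)                            # crude "nearby sample" lookup
    if key not in cache:
        cache[key] = sample_indirect(key, rays=400)  # high precision, stored once
    return cache[key]                                # reused for neighbouring points

# MCPov style: recompute at every point with low precision and no cache -
# slower per pass, but the error shows up as random noise that averages
# away as the pixels are sampled over and over again.
def indirect_uncached(point):
    return sample_indirect(point, rays=8)            # low precision, fresh each time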

>> Render over and over again while you sit and watch, until you're happy
>> with the result? Not supported in UberPOV yet; it's on the agenda, but
>> may require quite some intrusion into the POV-Ray architecture. What
>> UberPOV does already support, however, is an "anti-aliasing" mode (or,
>> more to the point, an oversampling mode) that allows you to specify what
>> quality you'd be happy with, and UberPOV will do the
>> rendering-over-and-over-again in the background until the quality
>> criteria are met.
>
> This seems like a better approach, as with MCPov you're often left
> re-rendering the entire image waiting for just a small dark area to
> smooth out.

It's not all that bad; MCPov does spend more time on areas of the image 
that prove to really need the additional work.

> So if you implement the less artefact-prone GI and combine it with the
> per-pixel oversampling it should be something better than MCPov? Sounds
> good to me! Hand firmly raised!

I'm not sure whether that'll suffice to actually be better than MCPov; time 
will tell. One definite advantage will be the full support for 
multithreading.

Two other things that have always bothered me about MCPov are that it 
doesn't allow the use of classic light sources (which would be far less 
noisy and hence much faster than using bright spheres), and that it has a 
factor-2 error in its diffuse computations that makes it necessary to use 
different finish settings. Both make it excessively difficult to create 
scenes that render essentially identically (except for artifacts or noise) 
in both POV-Ray and MCPov. Needless to say, UberPOV is intended to do a 
better job in that respect.
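
To make the factor-2 point concrete with a hypothetical example - this is 
not MCPov's actual code, and the real discrepancy may well go the other way 
round - a constant-factor mismatch in the diffuse term can only be worked 
around on the scene side by using different finish values per renderer:

# Hypothetical illustration of a constant-factor mismatch in the diffuse
# term (not MCPov's actual code; the real discrepancy may differ).
def diffuse_a(albedo, light, cos_theta, finish_diffuse):
    # "Renderer A": plain Lambert-style diffuse term.
    return finish_diffuse * albedo * light * max(cos_theta, 0.0)

def diffuse_b(albedo, light, cos_theta, finish_diffuse):
    # "Renderer B": same formula, but with a spurious factor of 2.
    return 2.0 * finish_diffuse * albedo * light * max(cos_theta, 0.0)

# The same scene only matches if the finish settings are adjusted per
# renderer - exactly the kind of tweaking that makes scene sharing painful.
finish_for_a = 0.6
finish_for_b = finish_for_a / 2.0        # halved to cancel the extra factor

print(diffuse_a(0.8, 1.0, 0.7, finish_for_a))   # 0.336
print(diffuse_b(0.8, 1.0, 0.7, finish_for_b))   # 0.336 - matches renderer A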



From: scott
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 22 Jul 2014 09:11:57
Message: <53ce631d@news.povray.org>
> Two other things that had always bothered me about MCPov is that it
> doesn't allow the use of classic light source (which would be far less
> noisy and hence much faster than using bright spheres), and that it has
> a factor-2 error in diffuse computations that make it necessary to use
> different finish settings. Both make it excessively difficult to create
> scenes that render essentially identical (except for artifacts or noise)
> in both POV-Ray and MCPov. Needless to say that UberPOV is intended to
> do a better job in that respect.

Yes, I have a few macros so that scenes can work in both POV and MCPov, 
but it's a messy solution (and obviously I can't use any 3.7 features).



From: Nekar Xenos
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 23 Jul 2014 14:05:38
Message: <op.xjgpvdseufxv4h@xena.home>
On Tue, 22 Jul 2014 14:26:28 +0200, clipka <ano### [at] anonymousorg> wrote:

> Two other things that had always bothered me about MCPov is that it  
> doesn't allow the use of classic light source (which would be far less  
> noisy and hence much faster than using bright spheres), and that it has  
> a factor-2 error in diffuse computations that make it necessary to use  
> different finish settings. Both make it excessively difficult to create  
> scenes that render essentially identical (except for artifacts or noise)  
> in both POV-Ray and MCPov. Needless to say that UberPOV is intended to  
> do a better job in that respect.
>

And there was also a problem with media. I hope UberPOV will be able to do 
what MCPov did and still render different types of media correctly.

-- 
-Nekar Xenos-



From: Nekar Xenos
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 23 Jul 2014 14:12:54
Message: <op.xjgp7gp3ufxv4h@xena.home>
On Tue, 22 Jul 2014 00:53:47 +0200, clipka <ano### [at] anonymousorg> wrote:

> On 21.07.2014 21:12, Nekar Xenos wrote:
>> On Sun, 20 Jul 2014 20:45:27 +0200, clipka <ano### [at] anonymousorg>  
>> wrote:
>>
>>> On 20.07.2014 19:44, Nekar Xenos wrote:
>>>>
>>>> Is anyone working on a distributed / stochastic / Monte
>>>> Carlo ray-tracing patch for 3.7?
>>>
>>> Distributed raytracing patch: Planned for official POV-Ray; don't hold
>>> your breath though.
>>>
>>> Stochastic / Monte Carlo raytracing patch: Maybe UberPOV
>>> (https://github.com/UberPOV/UberPOV) already offers what you're
>>> looking for; it already uses stochastic rendering for various
>>> features. If stochastic global illumination is what you're after, just
>>> drop me a note and I'll make that happen as well.
>>>
>>
>> I'm not 100% sure what the correct term is, the Wikipedia entry was a
>> bit confusing. I can describe it as following the ray from the light
>> source to the screen. Just like MCPov, only with all of the 3.7  
>> features.
>
> Hmm, there's definitely some misunderstanding going on here.
>
> "Following the ray from the light source to the screen" is /forward  
> raytracing/, as opposed to following the ray from the screen to the  
> object surface and ultimately to the light source, aka /backward  
> raytracing/. The latter is the classic approach, and the one used by  
> POV-Ray.
>
> MCPov also uses /backward/ raytracing like POV-Ray.
>
I don't know why I thought MCPov used forward ray-tracing. I think it was  
probably because of the slow rendering and good results after waiting a  
long time ;)

> /Stochastic/ raytracing can be done with both approaches (I guess it is  
> the only way to do forward raytracing, but it is also a way to do  
> backward raytracing), or even with combinations of the two, such as  
> bidirectional raytracing, Metropolis Light Transport or some such.
>
>
> So let's forget about the technical terms, and start at the other end:
>
> What is it about MCPov that you actually want?
>
> Blurred reflections? Definitely go for UberPOV, as it has them. Yay!
>
> Blurred refractions? Not supported by UberPOV yet, but shouldn't be too  
> difficult to add once I decide on a nice syntax.

That would be useful.
>
> A less artifact-prone replacement for the effect that official POV-Ray  
> uses radiosity for (so-called Global Illumination)? Raise your hand, and  
> I'll make it happen in UberPOV.
>

*raises hand*

> Render over and over again while you sit and watch, until you're happy  
> with the result? Not supported in UberPOV yet; it's on the agenda, but  
> may require quite some intrusion into the POV-Ray architecture. What  
> UberPOV does already support, however, is an "anti-aliasing" mode (or,  
> more to the point, an oversampling mode) that allows you to specify what  
> quality you'd be happy with, and UberPOV will do the  
> rendering-over-and-over-again in the background until the quality  
> criteria are met.
>

Yes, please :)
>
> As a rule of thumb, if MCPov has it, it is also planned (if not already  
> implemented) for UberPOV.
>
Whoo-hoo!



-- 
-Nekar Xenos-



From: Nekar Xenos
Subject: Re: Is anyone working on a distributed / stochastic / Monte Carlo ray-tracing patch for 3.7?
Date: 23 Jul 2014 14:18:44
Message: <op.xjgqg7fsufxv4h@xena.home>
On Sun, 20 Jul 2014 20:45:27 +0200, clipka <ano### [at] anonymousorg> wrote:

> On 20.07.2014 19:44, Nekar Xenos wrote:
>>
>> Is anyone working on a distributed / stochastic / Monte  
>> Carlo ray-tracing patch for 3.7?
>
> Distributed raytracing patch: Planned for official POV-Ray; don't hold  
> your breath though.
>
> Stochastic / Monte Carlo raytracing patch: Maybe UberPOV  
> (https://github.com/UberPOV/UberPOV) already offers what you're looking  
> for; it already uses stochastic rendering for various features. If  
> stochastic global illumination is what you're after, just drop me a note  
> and I'll make that happen as well.
>

What I would like to see with radiosity is a system that distinguishes 
between different materials: smooth, shiny materials should have a smaller 
cone of scattering than cement, which should have a very wide angle of 
scattering.
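
Something like the following rough Python sketch of the idea, purely 
illustrative and not taken from any existing patch: the cone of directions 
sampled for secondary rays gets wider as a made-up "roughness" parameter 
grows, from near-mirror for a polished finish to almost the full hemisphere 
for cement.

# Rough, purely illustrative sketch (not from any existing patch): a
# material-dependent scattering cone for secondary rays.
import math, random

def sample_scatter_dir(roughness):
    """Pick a direction in a cone around the local +Z axis; roughness 0..1
    controls the cone's half-angle (0 = mirror-sharp, 1 = near-hemisphere).
    A real implementation would then rotate the result so that +Z lines up
    with the mirror/normal direction of the surface."""
    half_angle = roughness * (math.pi / 2.0)
    # Uniformly sample the spherical cap spanned by that half-angle.
    cos_theta = 1.0 - random.random() * (1.0 - math.cos(half_angle))
    sin_theta = math.sqrt(max(1.0 - cos_theta * cos_theta, 0.0))
    phi = 2.0 * math.pi * random.random()
    return (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)

shiny_dir  = sample_scatter_dir(roughness=0.05)   # a few degrees off-axis
cement_dir = sample_scatter_dir(roughness=0.95)   # spread over most of the hemisphere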

-- 
-Nekar Xenos-


