POV-Ray : Newsgroups : povray.beta-test : Radiosity Status: Giving Up...
  Radiosity Status: Giving Up... (Message 21 to 30 of 194)  
From: Severi Salminen
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 09:36:44
Message: <4958e07c$1@news.povray.org>
clipka wrote:
> Distribution over a hemisphere is not enough for radiosity. There needs to be a
> particular bias towards the "zenith".

Do it like I do:

1. Choose a random point on a disc. (You can do this quickly by first
choosing a random point on a square and then checking whether it is
inside the circle.)
2. Project it onto a hemisphere.

Voila! There you have a cosine-weighted random point on a hemisphere. No
need to do the cosine weighting anymore.

That is the method I use in ssRay. With 1600+ samples they are very well
distributed, given a decent PRNG. If not, you can try a Halton sequence or
something similar to generate a more even distribution.
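The two steps above can be sketched as follows (a generic sketch, not ssRay's actual code; the function name is mine):

```python
import math
import random

def cosine_weighted_hemisphere(rng=random.random):
    """Cosine-weighted random direction about the +z axis (the normal).

    Step 1: rejection-sample a uniform point on the unit disc.
    Step 2: project it up onto the unit hemisphere.
    """
    while True:
        x = 2.0 * rng() - 1.0
        y = 2.0 * rng() - 1.0
        d2 = x * x + y * y
        if d2 < 1.0:                 # inside the unit circle?
            z = math.sqrt(1.0 - d2)  # lift the point onto the hemisphere
            return (x, y, z)
```

Since the disc point is uniform, the projected direction is automatically cosine-weighted: the expected z component works out to 2/3 rather than the 1/2 a uniform hemisphere distribution would give.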



From: nemesis
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 10:00:00
Message: <web.4958e59ecd9d1e75180057960@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> Thorsten Froehlich <tho### [at] trfde> wrote:
> > In theory the contribution is zero, in reality it is not though, due to the
> > lack of perfectly flat surfaces. Micro-facets and a high-intensity light
> > source can have surprising effects...
>
>   But we are talking about simple diffuse lighting here... :P

It looks anything but simple. ;)



From: Chambers
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 10:20:06
Message: <933469AE87B242FFAE88AF199FE01C42@HomePC>
> -----Original Message-----
> From: clipka [mailto:nomail@nomail]
> Warp <war### [at] tagpovrayorg> wrote:
> >   Any chances of removing the upper limit of 1600 samples? While 1600
> > samples is a lot, some people have encountered the limit and complained
> > about it.
> 
> Yes, definitely a chance to do that. However...
> 
> (1) I expect quality to improve even with less samples

Just a quick note: a few quick calculations will show that 1600 samples
is not enough for things like a lightbulb across the room, or the sun in
the sky, to be effectively used for radiosity.
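That claim can be checked with a rough back-of-the-envelope calculation; the only outside number used is the sun's angular radius of about 0.267 degrees, and Omega/pi approximates the cosine-weighted hit probability for a small patch near the zenith:

```python
import math

# Angular radius of the sun as seen from Earth: about 0.267 degrees.
theta = math.radians(0.267)

# Solid angle of a spherical cap: Omega = 2*pi*(1 - cos(theta)).
omega = 2.0 * math.pi * (1.0 - math.cos(theta))

# For a cosine-weighted hemisphere sampler, the probability that one
# ray lands on a small patch near the zenith is roughly Omega / pi.
p_hit = omega / math.pi
expected_hits = 1600 * p_hit

print(f"solid angle of the sun: {omega:.2e} sr")
print(f"per-ray hit probability: {p_hit:.2e}")
print(f"expected hits in 1600 rays: {expected_hits:.3f}")
```

The expected hit count comes out around 0.03, i.e. the vast majority of 1600-ray gathers miss the sun entirely, which is exactly the problem described.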

> implement some adaptive algorithm. I also think it would be a good idea
> to flag objects as "radiosity targets", like it is done with photons, to
> inform the sampling algorithm about small but bright objects so it can
> shoot a few more rays in that direction.

That would be a great way to deal with what would otherwise be extremely
large sample sets.

...Ben Chambers
www.pacificwebguy.com



From: Warp
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 12:39:58
Message: <49590b6d@news.povray.org>
Chambers <ben### [at] pacificwebguycom> wrote:
> Just a quick note: a few quick calculations will show that 1600 samples
> is not enough for things like a lightbulb across the room, or the sun in
> the sky, to be effectively used for radiosity.

  OTOH, why not make the lightbulb or the sun a light source?

  I don't really understand people's obsession in making radiosity-only
scenes. Radiosity calculates diffuse lighting only. If you use only radiosity
to calculate the lighting of the scene, the entire scene will consist of
purely diffusely lighted surfaces. There will be no specular component.

  When a surface is purely diffuse, without any specular component to it,
it's like a 100% matte surface, with no reflective component. There will
be no highlights caused by light sources, which will make all surfaces
matte and dull.

  It's the specular component of lighting which makes surfaces look vivid,
lively and brilliant, which make them have that "sparky" look. Purely matte
surfaces are flat and dull.

-- 
                                                          - Warp



From: nemesis
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 15:30:01
Message: <web.49593282cd9d1e75180057960@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> Chambers <ben### [at] pacificwebguycom> wrote:
> > Just a quick note: a few quick calculations will show that 1600 samples
> > is not enough for things like a lightbulb across the room, or the sun in
> > the sky, to be effectively used for radiosity.
>
>   OTOH, why not make the lightbulb or the sun a light source?
>
>   I don't really understand people's obsession in making radiosity-only
> scenes. Radiosity calculates diffuse lighting only. If you use only radiosity
> to calculate the lighting of the scene, the entire scene will consist of
> purely diffusely lighted surfaces. There will be no specular component.
>
>   When a surface is purely diffuse, without any specular component to it,
> it's like a 100% matte surface, with no reflective component. There will
> be no highlights caused by light sources, which will make all surfaces
> matte and dull.
>
>   It's the specular component of lighting which makes surfaces look vivid,
> lively and brilliant, which make them have that "sparky" look. Purely matte
> surfaces are flat and dull.

Indeed.  But specular is itself a fake, a rough shortcut to the real deal:
specular highlights are really just diffuse reflection on very smooth surfaces.
Those highlights also did not match the shape of the light sources -- at least
until you corrected that, for area lights at least -- which is also why they
look funky from almost tangent angles...

I've seen people advocating the use of (glossy) reflection to more realistically
achieve that effect.



From: clipka
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 17:05:00
Message: <web.4959496bcd9d1e75ab169ede0@news.povray.org>
andrel <a_l### [at] hotmailcom> wrote:
> Ok, I'll try to read it. If only I could find the link.
> Could you post that again, I clicked through most of your posts here and
> I cannot find it. :(

No surprise - I never posted a link to it ;)

I don't remember where I got my copy from. I just googled it up ("A Ray Tracing
Solution for Diffuse Interreflection" and "Gregory J. Ward" should do fine); I
found a few pages that looked like they would have charged for access to the
document, but there's also a PDF out there with the text (alas not the
diagrams), and a series of TIFFs from a scanned-in copy (including the
diagrams).

> I'd be surprised if there is not already an efficient random generator
> that does that all in one go, bypassing the need for trigonometry. Maybe
> we only need the help of a resident google expert.

I don't think so.

Generating good pseudo-random number sequences is an art in itself, even without
specific target distributions, so at the core there will almost always be an
algorithm generating uniformly distributed scalar (i.e. 1-D) numbers.

The desired distribution is usually achieved through some smart (non-linear)
transformation of the generated numbers (which in our case would involve
trigonometry, I guess, or at least squares and roots). See POV-Ray's random
numbers include file (I don't recall the exact name right now) for examples.
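As an illustration of such a transformation (a generic textbook example, not the include file's actual code): pushing uniform numbers through the inverse CDF of the target distribution yields that distribution directly.

```python
import math
import random

def exponential_sample(mean=1.0, rng=random.random):
    """Inverse-transform sampling: push a uniform [0,1) number through
    the inverse CDF of the target distribution.

    For the exponential distribution, F(x) = 1 - exp(-x/mean), so
    F^-1(u) = -mean * ln(1 - u).
    """
    return -mean * math.log(1.0 - rng())
```

The same idea, with a different inverse CDF, is how a pair of uniform numbers can be turned into a cosine-weighted direction without any rejection loop.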

Alternatively, it can be done through Monte-Carlo rejection, i.e. pulling a
random number from a uniform-distribution RNG, computing the desired probability
for that particular value, and then, according to this probability, randomly
discarding the pulled number (by comparing the probability against another
uniformly-distributed random number), repeating the whole process as often as
necessary. However, you'll usually try to avoid this, as it may become quite
slow depending on your desired distribution, and you'll inevitably end up with
unpredictable execution times. (The POV-Ray random numbers include file
implements this, for example, to generate a random point inside an object.)
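The process described in this paragraph, as a generic sketch (names are mine; the POV-Ray include file will look different):

```python
import random

def rejection_sample(pdf, pdf_max, lo, hi, rng=random.random):
    """Draw a candidate uniformly from [lo, hi), then keep it with
    probability pdf(x) / pdf_max (decided by a second uniform number).

    The number of iterations is unpredictable, which is the drawback
    mentioned above.
    """
    while True:
        x = lo + (hi - lo) * rng()
        if rng() * pdf_max <= pdf(x):
            return x
```

For example, `rejection_sample(lambda x: 2.0 * x, 2.0, 0.0, 1.0)` draws from the linear ramp p(x) = 2x on [0, 1].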

So even though there are RNGs out there that do produce particular
distributions, I reckon they'll typically just be wrappers around a
uniform-distribution RNG for your convenience, so that you don't have to come
up with all the math yourself (which may be a hassle because it involves
integrals); in the end your CPU will still need to do the trigonometry.



From: clipka
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 17:10:00
Message: <web.49594a2bcd9d1e75ab169ede0@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
>   I wonder if adaptive supersampling (similar to what is used in antialiasing
> method 2 and adaptive area lights) could be used in the stochastic sampling:
> If the brightness of two adjacent samples differ more than a given threshold,
> take an additional sample between them.

Shame on you! Still haven't read the paper yet, have you? :)

The original paper by Greg Ward et al. explicitly mentions the possibility of
using an adaptive approach. And I actually think about implementing it. Not
this year though, I guess :)
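Warp's suggestion, sketched in one dimension (the function and names are illustrative, not POV-Ray code): if two adjacent samples differ by more than a threshold, take an additional sample between them and recurse, with a depth limit to guarantee termination.

```python
def adaptive_samples(f, a, b, threshold, max_depth=8):
    """If two adjacent samples differ by more than `threshold`, take an
    additional sample between them, recursively, up to `max_depth`."""
    fa, fb = f(a), f(b)
    points = [(a, fa)]  # collects (position, value) pairs in order

    def refine(x0, y0, x1, y1, depth):
        if depth >= max_depth or abs(y1 - y0) <= threshold:
            points.append((x1, y1))
            return
        xm = 0.5 * (x0 + x1)  # extra sample in the middle
        ym = f(xm)
        refine(x0, y0, xm, ym, depth + 1)
        refine(xm, ym, x1, y1, depth + 1)

    refine(a, fa, b, fb, 0)
    return points
```

Samples cluster around discontinuities of `f` and stay sparse where it is smooth; the depth limit keeps execution time bounded.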



From: clipka
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 17:30:01
Message: <web.49594f41cd9d1e75ab169ede0@news.povray.org>
Severi Salminen <sev### [at] NOTTHISsaunalahtifiinvalid> wrote:
> 1. Choose a random point on a disc. (You can do this quickly by first
> choosing a random point on a square and then checking whether it is
> inside the circle.)
> 2. Project it onto a hemisphere.
>
> Voila! There you have a cosine-weighted random point on a hemisphere. No
> need to do the cosine weighting anymore.

... aside from the fact that what you describe *is* cosine weighting (unless
your approach is wrong, that is) :)

Thinking about it (didn't check the math behind it yet), it's not all that bad:

- take two random numbers [-1..1] (well, ]-1..1] should do as well)
- compare sum of squares against 1.0
- repeat if necessary

doesn't sound too bad (however, performance should be compared with taking two
random numbers and transforming them to a uniform disc distribution using
trigonometry instead of the Monte-Carlo approach).

Projecting onto the hemisphere (I guess you're talking about a parallel
projection along the surface normal) just needs to take the sum of squares we
already have, subtract it from 1, and take the square root as the additional
co-ordinate.

So yes, that should do: A loop typically taking less than two iterations (the
most expensive part probably being the RNG) and a square root. Quite
inexpensive after all.
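For comparison, the trigonometric route mentioned above needs no rejection loop: r = sqrt(u1) is what makes the disc distribution uniform, and the same projection then yields the cosine weighting (a sketch, not POV-Ray code):

```python
import math
import random

def cosine_hemisphere_trig(rng=random.random):
    """Polar mapping to the unit disc, then the same projection:
    r = sqrt(u1) makes the disc distribution uniform, and
    z = sqrt(1 - r^2) lifts the point onto the hemisphere."""
    u1, u2 = rng(), rng()
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    x = r * math.cos(phi)
    y = r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - r * r))  # equals sqrt(1 - u1)
    return (x, y, z)
```

Exactly two RNG calls, one extra square root, plus a sine and a cosine; whether that beats the rejection loop in practice is the benchmark question raised above.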

So it's definitely less expensive than going via a uniform distribution on a
sphere in any case. Don't know whether that can be achieved without
trigonometry, and at least it will involve a root somewhere as well - plus the
math to get either a cosine weighting or a cosine weighted distribution.


Thanks, this helped me a lot - mostly because it also provides a good basis of
how to properly do adaptive supersampling in a cosine-weighted world.



From: clipka
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 17:45:01
Message: <web.49595273cd9d1e75ab169ede0@news.povray.org>
"Chambers" <ben### [at] pacificwebguycom> wrote:
> Just a quick note: a few quick calculations will show that 1600 samples
> is not enough for things like a lightbulb across the room, or the sun in
> the sky, to be effectively used for radiosity.

I feared there would be some catch :}

> > implement some adaptive algorithm. I also think it would be a good idea
> > to flag objects as "radiosity targets", like it is done with photons, to
> > inform the sampling algorithm about small but bright objects so it can
> > shoot a few more rays in that direction.
>
> That would be a great way to deal with what would otherwise be extremely
> large sample sets.

That's why I'm thinking about it :)

Another idea I had was to do such flagging automatically. Say we take the first
samples with a very high sample count. We use some techniques borrowed from
adaptive sampling (though we're not actually going adaptive yet), and if some
close sample rays detect a high gradient (either in color or distance), the
area is somehow marked as "hot".

An area getting less than a certain threshold of "hot" markers will get less
attention during subsequent passes, while areas getting particularly high
numbers of "hot" markers will get increased attention.

However, I'm not sure yet about how to store this type of information,
especially since the perception of what area is "hot" will gradually change
from point to point. Maybe some information attached to each sample, and any
new sample will first check whether there are any samples nearby that know
something about hotspots already.

Just a bunch of wild ideas at the moment.
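One minimal reading of the "hot" marker idea (everything here is hypothetical, just to pin the idea down): compare neighbouring sample rays and count a marker wherever the colour or distance gradient exceeds its threshold.

```python
def mark_hot(samples, color_threshold, depth_threshold):
    """Count "hot" markers: one for each pair of neighbouring sample
    rays (given as (colour, distance) pairs) whose colour or distance
    gradient exceeds its threshold."""
    hot = 0
    for (c0, d0), (c1, d1) in zip(samples, samples[1:]):
        if abs(c1 - c0) > color_threshold or abs(d1 - d0) > depth_threshold:
            hot += 1
    return hot
```

An area collecting many markers would then get extra rays on subsequent passes; how to store and look up that information per location is the open question above.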



From: clipka
Subject: Re: Radiosity Status: Giving Up...
Date: 29 Dec 2008 17:50:01
Message: <web.4959538bcd9d1e75ab169ede0@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
>   I don't really understand people's obsession in making radiosity-only
> scenes. Radiosity calculates diffuse lighting only. If you use only radiosity
> to calculate the lighting of the scene, the entire scene will consist of
> purely diffusely lighted surfaces. There will be no specular component.

That's where the HDRI and reflections with micronormals come into play :)

Honestly, this combination really rocks. Your classic specular highlights are
nothing compared to what you can get out of this combo.

Takes some rendering time though.




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.