On 29-Dec-08 11:18, Thorsten Froehlich wrote:
> Thomas de Groot wrote:
>> We sit down, and wait in confidence... :-)
>>
>> In my understanding, this should not be a problem to obtain as it is
>> part of the overall upgrading of povray.
>>
>> Have I said this? I would like to extend my sincere appreciation for
>> the effort you are clearly putting into the next stage of our
>> all-acclaimed renderer: POV-Ray. Hurray!
>
> The redistribution of the beta source code is prohibited. There won't be
> a permission for anyone to distribute the beta source code or binary in
> any other form. The purpose of making the beta source code available is
> to get submissions of bug fixes that will be added to the official beta
> source code and beta binaries - assuming they work, of course ;-)
>
I think a relevant question here is: what counts as distribution of the source?
If clipka sends the source or a binary by regular mail to, e.g., Thomas, is
that distribution? Or must it be publicly available to count as one? If it is
the former, then collaboration to implement and test improvements to the beta
source is effectively impossible. I can think of reasons to do it that
way. One would be that source at this beta (double beta?) stage should
be coordinated by a POV team member. But which one should that be? In
this specific case of radiosity: who is coordinating that, and would that
person give permission to create a test version for a
selected group to use?
Another question: if clipka had started from the 3.16 source, would that have
made a difference?
On 29-Dec-08 13:34, clipka wrote:
> Thorsten Froehlich <tho### [at] trfde> wrote:
>> Boost / ISO C++ 2009 STL provide suitable random number generators that
>> support distribution over a hemisphere as needed by radiosity code.
>
> Distribution over a hemisphere is not enough for radiosity. There needs to be a
> particular bias towards the "zenith".
>
If you know the shape of the desired distribution, that should in general
be possible.
Just curious: why should it not be evenly distributed (or should I read
that paper)?
Thorsten Froehlich <tho### [at] trfde> wrote:
> > Distribution over a hemisphere is not enough for radiosity. There needs to be a
> > particular bias towards the "zenith".
>
> For small samples sizes, yes, but for large sample sizes assigning a weight
> to samples depending on their hemisphere location gives you the same effect.
>
> Thorsten
Having this weight "built into" the samples still has some benefits though: it
saves you some multiplications or - if you don't pre-compute the sequence -
even some trigonometric operations. Better yet, you also don't waste time
shooting rays that don't contribute much.
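As a sketch of the uniform-plus-weight variant Thorsten describes (the names are illustrative, not POV-Ray code), assuming the hemisphere's zenith is +z:

```cpp
#include <cmath>
#include <random>

// Illustrative sketch, not POV-Ray code: sample the hemisphere (zenith
// = +z) uniformly and carry an explicit cos(theta) weight per ray.
// The per-ray weight multiplication, and the many low-weight rays near
// the horizon, are exactly what a cosine-biased distribution avoids.
struct WeightedDir { double x, y, z, weight; };

template <typename Rng>
WeightedDir UniformHemisphereSample(Rng& rng)
{
    const double kPi = 3.14159265358979323846;
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const double z   = uni(rng);               // cos(theta), uniform in [0,1)
    const double r   = std::sqrt(1.0 - z * z); // sin(theta)
    const double phi = 2.0 * kPi * uni(rng);
    return WeightedDir{ r * std::cos(phi), r * std::sin(phi), z, z };
}
```

Taking z itself uniformly gives a uniform area distribution on the hemisphere (Archimedes' hat-box theorem), and the weight is then just cos(theta) = z.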
andrel <a_l### [at] hotmailcom> wrote:
> I think a relevant question here is: what is a distribution of source.
> If clipka sends the source or a binary by regular mail to e.g. Thomas is
> that distribution?
I bet it is.
I could send diffs though, because those portions are all my own work...
> Another one: if clipka had started from the 3.16 source would that have
> made a difference?
If it were in its current state: no; there's too much functionality missing
(compared to 3.6) in the current code I have, and reduced-functionality
versions may not be distributed either.
andrel <a_l### [at] hotmailcom> wrote:
> Just curious, why should it not be evenly distributed
Because the diffuse illumination of a surface depends on the angle from
which the light is coming: The more perpendicular the incoming light is to
the surface, the stronger its contribution. The strength of the contribution
decreases as a function of the cosine of the angle. If the incoming light
is parallel to the surface, it has zero contribution.
You could sample evenly over the hemisphere, but then you would be taking
tons of samples which contribute only a little to the illumination. (In fact,
rather ironically, if you sample evenly, the majority of the samples will
be on the parts of the hemisphere which contribute the least to the
illumination.)
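A minimal sketch of the cosine-biased sampling this implies, via Malley's method (pick a point uniformly on the unit disk, then project it up onto the hemisphere); the names are illustrative, not POV-Ray's:

```cpp
#include <cmath>
#include <random>

struct Vec3 { double x, y, z; };

// Illustrative sketch, not POV-Ray code: draw a direction on the unit
// hemisphere (zenith = +z) with density proportional to cos(theta).
// Sampling the disk uniformly and lifting the point up makes steep
// directions (large z) proportionally more likely, with no explicit
// weight term needed.
template <typename Rng>
Vec3 CosineWeightedHemisphereSample(Rng& rng)
{
    const double kPi = 3.14159265358979323846;
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const double u1  = uni(rng);
    const double phi = 2.0 * kPi * uni(rng);
    const double r   = std::sqrt(u1);          // uniform over the disk's area
    return Vec3{ r * std::cos(phi),
                 r * std::sin(phi),
                 std::sqrt(1.0 - u1) };        // lift onto the hemisphere
}
```

With this distribution the expected value of cos(theta) over the samples is 2/3 rather than the 1/2 of even sampling, reflecting the bias towards the zenith.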
--
- Warp
clipka <nomail@nomail> wrote:
> If I'd need a RNG for that, I guess I'd use whatever is commonly used in POV
> already. Speed is not really an issue for that job (nor is precision).
It's not only a question of speed, but also a question of the quality of the
randomness. A linear congruential generator, which is what std::rand()
usually is (and what the SDL rand() also is), is an extremely poor
random number generator, especially for things like stochastic sampling.
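To illustrate the alternative, a Mersenne Twister (available in Boost and in ISO C++'s <random>) can replace the LCG behind a tiny wrapper; this is a hypothetical sketch, not POV-Ray's actual RNG:

```cpp
#include <random>

// Hypothetical wrapper, not POV-Ray's actual RNG: a Mersenne Twister
// (std::mt19937, also available via Boost and C++ TR1) behind a small
// interface, so sampling code no longer depends on the poor linear
// congruential generator typically hiding behind std::rand().
class SampleRng
{
public:
    explicit SampleRng(unsigned long seed) : engine_(seed), uni_(0.0, 1.0) {}

    // Uniform double in [0, 1), with far better equidistribution
    // properties than std::rand() / (RAND_MAX + 1.0).
    double NextDouble() { return uni_(engine_); }

private:
    std::mt19937 engine_;
    std::uniform_real_distribution<double> uni_;
};
```

Seeding it explicitly also keeps renders reproducible: the same seed yields the same sample sequence on every run.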
--
- Warp
andrel <a_l### [at] hotmailcom> wrote:
> Just curious, why should it not be evenly distributed (or should I read
> that paper)?
The paper explains that there should be a bias and, mathematically, what it
should be - but I guess it wouldn't get you much further on the "why".
The thing is simply that incident light coming in at a shallow angle illuminates
the surface less than light coming in steeper.
This could be modeled by multiplying each sampling ray by a correctional
weight term - but it is much more elegant to model it via a non-uniform
distribution of rays, because you can concentrate precious render time on rays
that really matter - and it may also save you some mathematical operations
during the sampling.
It would be a waste of time to run your random number generator output
through some trigonometric formulae first to get a uniform distribution, and
then run those co-ordinates through more trigonometry to get your weight -
when you can simply run your RNG output through different trig formulae to
get just the biased distribution you need, doing without a weight term and
with a better overall "computing time per weight" ratio.
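This point can be demonstrated numerically. The sketch below (a hypothetical helper, not POV-Ray code) estimates the irradiance from a uniform unit-radiance sky, whose exact value is pi, both ways: with uniform directions each sample needs a cos(theta) weight and the estimate fluctuates, while with a cosine-biased distribution the weight cancels against the sampling density and every sample is exact.

```cpp
#include <cmath>
#include <random>

// Illustrative sketch, not POV-Ray code: estimate the irradiance from
// a uniform unit-radiance sky (exact value: pi) two ways.  Uniform
// directions have pdf 1/(2*pi), so each sample carries a cos(theta)
// weight; cosine-biased directions have pdf cos(theta)/pi, so the
// weight cancels and every sample contributes exactly pi.
double EstimateIrradiance(int n, bool cosineBiased, std::mt19937& rng)
{
    const double kPi = 3.14159265358979323846;
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
    {
        if (cosineBiased)
            sum += kPi;                   // integrand / pdf == pi, always
        else
            sum += uni(rng) * 2.0 * kPi;  // cos(theta) weight, uniform pdf
    }
    return sum / n;
}
```

For a constant sky the biased estimator has zero variance; for realistic skies it is merely much lower, which is the "computing time per weight" gain described above.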
Warp wrote:
> andrel <a_l### [at] hotmailcom> wrote:
>> Just curious, why should it not be evenly distributed
>
> Because the diffuse illumination of a surface depends on the angle from
> which the light is coming: The more perpendicular the incoming light is to
> the surface, the stronger its contribution. The strength of the contribution
> decreases as a function of the cosine of the angle. If the incoming light
> is parallel to the light, it has zero contribution.
In theory the contribution is zero; in reality it is not, due to the
lack of perfectly flat surfaces. Micro-facets and a high-intensity light
source can have surprising effects...
> You could sample evenly along the hemisphere, but then you will be taking
> tons of samples which contribute only little to the illumination. (In fact,
> rather ironically, if you sample evenly, the majority of the samples will
> be on the parts of the hemisphere which contribute the least to the
> illumination.)
Actually, that they contribute the least is not universally correct: it is
only correct if all contributions of light are in about the same intensity
range. If one light source is significantly brighter than all other
contributing light sources, even a small-angle contribution can be brighter
than all other contributions. While this may sound like a rare case, it
actually is not: sunlight and a 100W light bulb would be an example. This
case is relevant in architecture.
However, the probability that small-area but high-intensity contributions
are missed increases with the unevenness of the distribution (because it is
a function of the sample density). The computational cost can be cut
by exploiting ray coherence (presorting samples into coherent groups is
easiest). Of course this would require POV-Ray to support coherent
ray tracing, which it does not yet do (the bounding code and SIMD
abstraction are in Perforce, though).
Thorsten
On 29-Dec-08 14:40, clipka wrote:
> andrel <a_l### [at] hotmailcom> wrote:
>> Just curious, why should it not be evenly distributed (or should I read
>> that paper)?
>
> The paper explains that there should be a bias and how it should be,
> mathematically - but I guess it wouldn't get you much further about the "why".
Ok, I'll try to read it, if only I could find the link.
Could you post it again? I clicked through most of your posts here and
I cannot find it. :(
> The thing is simply that incident light coming in at a shallow angle illuminates
> the surface less than light coming in steeper.
Ok, so that would be a fixed relation that does not depend on the material
used. I should be able to derive that myself, if only my brain were not
so rusty.
> This could be modeled by multiplying each sampling ray with a correctional
> weight term - but it is much more elegant to model it via a non-uniform
> distribution of rays, because you can concentrate precious render time on rays
> that really matter - and it may also save you some mathematical operations
> during the sampling.
>
> It would be some waste of time to run your random number generator output
> through some trigonometric formulae first to get a uniform distribution, and
> then run those co-ordinates again through more trigonometry to get your weight
> - when maybe you can simply run your RNG output through some different trig
> formulae to give you just the biased distribution you need to do without a
> weight term and with a better overall "computing time per weight" ratio.
I'd be surprised if there is not already an efficient random generator
that does all of that in one go, bypassing the need for trigonometry. Maybe
we only need the help of a resident Google expert.
Thorsten Froehlich <tho### [at] trfde> wrote:
> Warp wrote:
> > andrel <a_l### [at] hotmailcom> wrote:
> >> Just curious, why should it not be evenly distributed
> >
> > Because the diffuse illumination of a surface depends on the angle from
> > which the light is coming: The more perpendicular the incoming light is to
> > the surface, the stronger its contribution. The strength of the contribution
> > decreases as a function of the cosine of the angle. If the incoming light
> > is parallel to the light, it has zero contribution.
> In theory the contribution is zero, in reality it is not though, due to the
> lack of perfectly flat surfaces. Micro-facets and a high-intensity light
> source can have surprising effects...
But we are talking about simple diffuse lighting here... :P
> However, the probability that small area but high intensity contributions
> are missed increases with the unevenness of the distribution (because it is
> a function of the sample density).
I wonder if adaptive supersampling (similar to what is used in antialiasing
method 2 and adaptive area lights) could be used in the stochastic sampling:
If the brightness of two adjacent samples differs by more than a given
threshold, take an additional sample between them.
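That refinement rule could be sketched along a single arc of sample directions like this (a hypothetical helper, not existing POV-Ray code):

```cpp
#include <cmath>
#include <functional>
#include <vector>

// Hypothetical helper, not existing POV-Ray code: along one arc of
// sample directions, recursively insert a sample between any adjacent
// pair whose brightness differs by more than a threshold - the same
// refinement rule antialiasing method 2 applies on the image plane.
void AdaptiveRefine(double a, double fa, double b, double fb,
                    const std::function<double(double)>& brightness,
                    double threshold, int maxDepth,
                    std::vector<double>& extraSamples)
{
    if (maxDepth <= 0 || std::fabs(fa - fb) <= threshold)
        return;                            // adjacent samples agree; stop
    const double m  = 0.5 * (a + b);
    const double fm = brightness(m);       // one additional sample
    extraSamples.push_back(fm);
    AdaptiveRefine(a, fa, m, fm, brightness, threshold, maxDepth - 1, extraSamples);
    AdaptiveRefine(m, fm, b, fb, brightness, threshold, maxDepth - 1, extraSamples);
}
```

The depth limit matters: a hard shadow edge or a tiny bright light would otherwise trigger unbounded subdivision.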
--
- Warp