povray.advanced-users : Intensity Mapping (Messages 4 to 13 of 13)
From: Jim Holsenback
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 18:21:34
Message: <4949897e@news.povray.org>
"clipka" <nomail@nomail> wrote in message 
news:web.494964e567084438abad780@news.povray.org...
> (2) POV-Ray's "photon mapping" feature, which does forward raytracing (i.e.
> shooting light rays from a light source) and is specially designed to simulate
> lighting effects involving light being reflected and/or refracted ("caustics").

You should also have a look at the example scenes that came with the 
distribution.
Your application install directory may vary ..... ~scenes/interior

Jim



From: Christian Froeschlin
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 19:20:49
Message: <49499761$1@news.povray.org>
Colin wrote:

> So, say for example I put a point
> light source above a flat surface. If I render it, I see that it's brightest
> directly below the source. At a point some distance away from this centerline,
> it is dimmer. What I would like is a metric for how much dimmer it is, as a
> function of position. I assume this would be trivial from within the
> ray-tracing algorithm; I'm just curious if it's possible from the user end.

I think the only way from the user end is to go via the brightness.

You can position an orthographic camera to look top down on the
plane you are interested in. Using a homogeneous texture and suitable
light intensity should yield a reasonable intensity image.

You need to make sure that the surface is lighted by photons only.
This should be the case if the light source is completely blocked
by some transparent object which is a photon target.

You will wish to adapt the gamma settings to get a linear
response curve, use 16-bit output for higher precision, and
possibly try HDR output (this is in MegaPOV, there is also
HDR support in 3.7b but I'm not sure it's also for output?)

This should already give you the relative intensities.
Converting the pixel intensity to a physical unit may be
tricky. For reference, you'd probably need some test lens
which directs all photons onto the visible area.
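A minimal sketch of the setup Christian describes, for illustration only - the
coordinates, sizes and finish values below are my own assumptions, not part of
his post. It is just an orthographic camera looking straight down at a purely
diffuse, white measurement plane:

// hypothetical top-down "intensity camera" over a 10x10 measurement area
camera {
  orthographic
  location <0, 5, 0>       // directly above the plane
  look_at  <0, 0, 0>
  right    10*x            // horizontal extent covered by the image
  up       10*z            // vertical extent (z, because we look along -y)
}

// the measurement plane: plain white, purely diffuse, no ambient term,
// so pixel brightness comes from incident light only
plane {
  y, 0
  pigment { rgb 1 }
  finish { ambient 0 diffuse 1 }
}

The light source, the transparent photon-target blocker and the photon settings
would go on top of this; see the later posts in this thread for those parts.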



From: clipka
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 19:30:00
Message: <web.4949995b67084438abad780@news.povray.org>
"Colin" <nomail@nomail> wrote:
> So what I'm looking for is if you were tracing photons, how many per unit area
> strikes a particular position on a surface. We usually measure this in things
> like W/m2 or lux (lumens/m2).

That's basically what an orthographic shot of the surface with photon mapping
will give you: A bitmap specifying how many photons have hit which point.

If you crank up the number of photons high enough, you can probably get any
precision you may need.

(At least if you make sure that gamma correction is turned off, and that your
light brightness is chosen so it can be properly represented by the output image
format you choose.)

To my knowledge, the principle behind this is extremely simple: PoV will shoot
photons, remember where they hit, and increase the brightness of the object
accordingly.


The only problematic thing with this might be if the light source itself is
directly visible from your "test surface", as you want to eliminate the (most
likely not really exact) conventional lighting. I don't know by heart whether
you can turn off conventional lighting completely and just use photon mapping.


A more brute-force attempt would be to do the same thing with radiosity, which
basically does the very same thing "backwards" and should lead to the same
results (given extremely high-quality settings), but is probably a waste of
computing power for this application.

If it cannot be helped otherwise, it would always be possible to do something
with the trace() function, but something built-in will most likely be a lot
faster.
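To make the photon-count and gamma remarks above concrete, here is a hedged
sketch of settings one might start from; the photon count is a placeholder to
be tuned, and the output switches in the last comment are quoted from memory:

// keep the pipeline linear and shoot plenty of photons at the target
global_settings {
  assumed_gamma 1          // no gamma correction, linear response
  photons {
    count 2000000          // total photons shot at target objects; more = less noise
    autostop 0             // don't stop shooting early, cover the whole target
  }
}

// render with e.g. +FN16 (PNG at 16 bits per colour channel) or, in the
// 3.7 beta, +FH (Radiance HDR) so the output file keeps enough precision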



From: clipka
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 19:50:00
Message: <web.49499da767084438abad780@news.povray.org>
Christian Froeschlin <chr### [at] chrfrde> wrote:
> You will wish to adapt the gamma settings to get a linear
> response curve, use 16-bit output for higher precision, and
> possibly try HDR output (this is in MegaPOV, there is also
> HDR support in 3.7b but I'm not sure its also for output?)

HDR is supported in 3.7 beta (29) for both input and output. I tested it.

I had the impression that it behaves differently from MegaPOV regarding gamma,
but that might be due to the general gamma handling change (or due to fixes - I
think the MegaPOV HDR implementation is buggy in this respect). I didn't
investigate any further, because the beta also lacks other things I needed from
MegaPOV.

(I think photon mapping is already multi-threaded and stable in the beta, and
should be a good deal faster on modern hardware, so in this case I'd recommend
it over MegaPOV.)

HDR will definitely be a good choice for this - even better than 16-bit output,
I guess, as it can not only capture very subtle differences in dim areas, but
is also very robust against "overexposure".

The precision of HDR is still limited though, especially when dealing with
colors: its resolution regarding the hue and saturation of a color is just the
same as in a classic 8-bit-per-channel format; the high dynamic range is only
achieved by adding a shared 8-bit exponent that scales the total brightness of
the pixel. (For instance, in the Radiance RGBE encoding a pixel like
<0.5, 0.25, 0.125> is stored as the three 8-bit mantissas 128, 64 and 32 plus a
single shared exponent byte.)



From: Colin
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 20:25:00
Message: <web.4949a5426708443823f6cd3b0@news.povray.org>
Christian Froeschlin <chr### [at] chrfrde> wrote:
> Colin wrote:
>
> > So, say for example I put a point
> > light source above a flat surface. If I render it, I see that it's brightest
> > directly below the source. At a point some distance away from this centerline,
> > it is dimmer. What I would like is a metric for how much dimmer it is, as a
> > function of position. I assume this would be trivial from within the
> > ray-tracing algorithm; I'm just curious if it's possible from the user end.
>
> I think the only way from the user end is to go via the brightness.
>
> You can position an orthographic camera to look top down on the
> plane you are interested in. Using a homogeneous texture and suitable
> light intensity should yield a reasonable intensity image.
>
> You need to make sure that the surface is lighted by photons only.
> This should be the case if the light source is completely blocked
> by some transparent object which is a photon target.
>
> You will wish to adapt the gamma settings to get a linear
> response curve, use 16-bit output for higher precision, and
> possibly try HDR output (this is in MegaPOV, there is also
> HDR support in 3.7b but I'm not sure it's also for output?)
>
> This should already give you the relative intensities.
> Converting the pixel intensity to a physical unit may be
> tricky. For reference, you'd probably need some test lens
> which directs all photons onto the visible area.

I can handle the reference easily enough by using a known light source and a
photodiode and calibrating accordingly (in the "real" world).

How do I read the brightness into a matrix of integer values corresponding to
the pixel locations?



From: clipka
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 21:55:00
Message: <web.4949bac567084438abad780@news.povray.org>
"Colin" <nomail@nomail> wrote:
> How do I read the brightness into a matrix of integer values corresponding to
> the pixel locations?

Ah, so having the values in a bitmap image doesn't help you, and you actually
need them as a 2-dimensional array instead?

Hm...

(a) Get POV-Ray to render an orthographic image anyway. Use a second POV-Ray
"render" to load that image as a pigment bitmap, use this pigment as a pigment
function, and voila - there you have the function you need. You can decide for
yourself whether you want to make an array of it, or just use that function to
get the values at specific locations.
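A hedged sketch of what option (a) could look like in SDL; the file name,
resolution and the simple channel averaging are my own assumptions, and
eval_pigment is the macro from functions.inc:

#include "functions.inc"

#declare Res = 256;                 // resolution to sample at

// load the previously rendered orthographic image as a pigment
#declare IntensityPig = pigment {
  image_map { png "intensity.png" interpolate 2 }
}

// sample the pigment into a 2-dimensional array of brightness values
#declare Values = array[Res][Res];
#declare Y = 0;
#while (Y < Res)
  #declare X = 0;
  #while (X < Res)
    // an image_map pigment covers the unit square of the x-y plane
    #declare C = eval_pigment(IntensityPig, <(X+0.5)/Res, (Y+0.5)/Res, 0>);
    #declare Values[X][Y] = (C.red + C.green + C.blue)/3;
    #declare X = X + 1;
  #end
  #declare Y = Y + 1;
#end

// optionally dump the matrix to a text file for external processing
#fopen OutFile "intensity.txt" write
#declare Y = 0;
#while (Y < Res)
  #declare X = 0;
  #while (X < Res)
    #write (OutFile, str(Values[X][Y], 0, 6), " ")
    #declare X = X + 1;
  #end
  #write (OutFile, "\n")
  #declare Y = Y + 1;
#end
#fclose OutFile

(This would go into the second render clipka mentions; the #fopen/#write part
already gives you the values as a text file, in case that is more convenient
than keeping the array inside the scene.)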

(b) This one gets a bit trickier: Build your own "photon tracer" from building
blocks provided by POV-ray. Do as follows:
- #declare a union of your reflector object and a plane where you want to
measure light intensity
- #declare a 2-dimensional array to count your photons
- think of an algorithm that gives you a (very large) series of
evenly-distributed directions to shoot photons from the light source (random
directions generated with VRand_On_Sphere() from rand.inc might do, but less
noisy approaches might yield more accurate results with fewer photons to shoot)
- loop through this series of directions, shooting a photon for each direction
as follows:

- remember the light source position and photon direction as the "current
location" and "current direction"
- repeat the following until... um, well, until you're done with this photon:

- call trace() with the current location & direction and your reflector/plane
union; this will trace a ray from the current location in the current direction
and get the nearest intersection point with the reflector or plane, plus the
surface normal at that point.
- if you do not get any intersection point, you're done.
- otherwise, if that intersection point is on your "measuring plane", compute
the array co-ordinates from the intersection point, and count up the
corresponding array entry. As you won't get exact hits, you may want to assign
"fractions of a photon" to the nearest four array entries. You're done with
this photon.
- otherwise (i.e. if you do get an intersection point but it's not on the
measuring plane), take the intersection point as your new current location, and
mirror the current direction according to the surface normal you got from
trace(). (You may also want to reduce the photon "weight" depending on the
reflector material.) Note that you're *not* done in this case.
- make sure you add some iteration limit to this loop, or you may get stuck in
an endless one.

- When you're done with a photon, proceed with the next direction from your
series, starting at the light source again.

- When you're done with all directions, evaluate your results.

You may also want to count "lost" photons that escape your setup without hitting
the measurement plane, and "stuck" photons that never seem to make it out of the
reflector.
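A rough sketch of such a hand-rolled photon tracer, purely for illustration:
every object, position and count below is a placeholder of mine (the reflector
is just a stand-in sphere), and VRand_On_Sphere() comes from rand.inc.

#include "rand.inc"

#declare Reflector = sphere { <0, 6, 0>, 2 }   // substitute your real reflector
#declare Scene     = union { object { Reflector } plane { y, 0 } }

#declare Res    = 100;         // bins per side of the counting array
#declare Size   = 10;          // world-space extent of the measured square
#declare Counts = array[Res][Res];
#declare I = 0;
#while (I < Res)               // zero the photon counters
  #declare J = 0;
  #while (J < Res)
    #declare Counts[I][J] = 0;
    #declare J = J + 1;
  #end
  #declare I = I + 1;
#end

#declare LightPos  = <0, 3, 0>;
#declare NPhotons  = 100000;
#declare MaxBounce = 20;       // iteration limit against "stuck" photons
#declare Rnd       = seed(1234);

#declare P = 0;
#while (P < NPhotons)
  #declare Loc    = LightPos;
  #declare Dir    = VRand_On_Sphere(Rnd);    // random emission direction
  #declare Done   = false;
  #declare Bounce = 0;
  #while ((!Done) & (Bounce < MaxBounce))
    #declare Norm = <0, 0, 0>;
    #declare Hit  = trace(Scene, Loc, Dir, Norm);
    #if (vlength(Norm) = 0)
      #declare Done = true;                  // photon escaped the scene
    #else
      #if (abs(Hit.y) < 1e-6)                // hit the measurement plane
        #declare I = floor((Hit.x/Size + 0.5)*Res);
        #declare J = floor((Hit.z/Size + 0.5)*Res);
        #if ((I >= 0) & (I < Res) & (J >= 0) & (J < Res))
          #declare Counts[I][J] = Counts[I][J] + 1;
        #end
        #declare Done = true;
      #else                                  // hit the reflector: mirror and continue
        #declare Dir = Dir - 2*vdot(Dir, Norm)*Norm;
        #declare Loc = Hit + Dir*1e-6;       // nudge off the surface
      #end
    #end
    #declare Bounce = Bounce + 1;
  #end
  #declare P = P + 1;
#end

The Counts array then holds photons per bin; dividing by NPhotons and the bin
area gives a relative flux map. Counting photons that escape or exceed MaxBounce,
as suggested above, only needs two more counters in the corresponding branches.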



From: Alain
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 22:55:17
Message: <4949c9a5@news.povray.org>
Colin enlightened us on 2008-12-17 20:20 -->
> Christian Froeschlin <chr### [at] chrfrde> wrote:
>> Colin wrote:
>>
>>> So, say for example I put a point
>>> light source above a flat surface. If I render it, I see that it's brightest
>>> directly below the source. At a point some distance away from this centerline,
>>> it is dimmer. What I would like is a metric for how much dimmer it is, as a
>>> function of position. I assume this would be trivial from within the
>>> ray-tracing algorithm; I'm just curious if it's possible from the user end.
>> I think the only way from the user end is to go via the brightness.
>>
>> You can position an orthographic camera to look top down on the
>> plane you are interested in. Using a homogeneous texture and suitable
>> light intensity should yield a reasonable intensity image.
>>
>> You need to make sure that the surface is lighted by photons only.
>> This should be the case if the light source is completely blocked
>> by some transparent object which is a photon target.
>>
>> You will wish to adapt the gamma settings to get a linear
>> response curve, use 16-bit output for higher precision, and
>> possibly try HDR output (this is in MegaPOV, there is also
>> HDR support in 3.7b but I'm not sure it's also for output?)
>>
>> This should already give you the relative intensities.
>> Converting the pixel intensity to a physical unit may be
>> tricky. For reference, you'd probably need some test lens
>> which directs all photons onto the visible area.
> 
> I can work on referencing it easily using a known light source and photodiode
> and calibrate accordingly (in the "real" world).
> 
> How do I read the brightness into a matrix of integer values corresponding to
> the pixel locations?
> 
> 
> 
How about using the image itself as the matrix? After all, any digital image is
in fact a matrix of values displayed as an image.
If you output as TGA, you get minimal overhead and can have up to 16 bits per
channel per pixel.


-- 
Alain
-------------------------------------------------
To compel a man to furnish funds for the propagation of ideas he disbelieves
and abhors is sinful and tyrannical.
Thomas Jefferson



From: Alain
Subject: Re: Intensity Mapping
Date: 17 Dec 2008 23:02:21
Message: <4949cb4d@news.povray.org>
clipka enlightened us on 2008-12-17 19:29 -->
> "Colin" <nomail@nomail> wrote:
>> So what I'm looking for is if you were tracing photons, how many per unit area
>> strikes a particular position on a surface. We usually measure this in things
>> like W/m2 or lux (lumens/m2).
> 
> That's basically what an orthographic shot of the surface with photon mapping
> will give you: A bitmap specifying how many photons have hit which point.
> 
> If you crank up the number of photons high enough, you can probably get any
> precision you may need.
> 
> (At least if you make sure that gamma correction is turned off, and that your
> light brightness is chosen so it can be properly represented by the output image
> format you choose.)
> 
> To my knowledge, the principle behind this is extremely simple: PoV will shoot
> photons, remember where they hit, and increase the brightness of the object
> accordingly.
> 
> 
> The only problematic thing with this might be if the light source itself is
> directly visible from your "test surface", as you want to eliminate the (most
> likely not really exact) conventional lighting. I don't know by heart whether
> you can turn off conventional lighting completely and just use photon mapping.
You can. Set ambient to zero. Make sure that the light_source never directly
illuminates the target plane in the visible area. A way to do that is to have a
transparent object in front of the light_source; a wide and thin box with
pigment{rgbt 1} will do just fine.
> 
> 
> A more brute-force attempt would be to do the same thing with radiosity, which
> basically does the very same thing "backwards" and should lead to the same
> results (given extremely high-quality settings), but is probably a waste of
> computing power for this application.
Radiosity would need an insane count, larger than the 1600 maximum.
> 
> If it cannot be helped otherwise, it would always be possible to do something
> with the trace() function, but something built-in will most likely be a lot
> faster.
> 
> 
> 


-- 
Alain
-------------------------------------------------
You know you've been raytracing too long when you think 80s movies have the 
funniest special effects.
Aaron Gage a.k.a Slartibartfast



From: clipka
Subject: Re: Intensity Mapping
Date: 18 Dec 2008 15:25:01
Message: <web.494ab07867084438bdc576310@news.povray.org>
Alain <ele### [at] netscapenet> wrote:
> > The only problematic thing with this might be if the light source itself is
> > directly visible from your "test surface", as you want to eliminate the (most
> > likely not really exact) conventional lighting. I don't know by heart whether
> > you can turn off conventional lighting completely and just use photon mapping.
> You can. Set ambient to zero. Make sure that the light_source never directly
> illuminates the target plane in the visible area. A way to do that is to have a
> transparent object in front of the light_source; a wide and thin box with
> pigment{rgbt 1} will do just fine.

I'm not sure how this will stop the classic lighting from the light_source, as
the object will not cast a shadow. And using an opaque "light block" will also
block photons.



From: Christian Froeschlin
Subject: Re: Intensity Mapping
Date: 19 Dec 2008 18:40:22
Message: <494c30e6$1@news.povray.org>
clipka wrote:
> I'm not sure how this will stop the classic lighting from the light_source, as
> the object will not cast a shadow.

It will if you set it up as a photon target.
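Putting Alain's transparent box and this photon-target remark together, a small
hedged sketch (positions and sizes are my own assumptions) of a blocker that
suppresses the classic direct lighting on the plane while still letting photons
through:

light_source {
  <0, 5, 0> color rgb 1
  photons { refraction on  reflection off }
}

// wide, thin, fully transparent box just below the light; being a photon
// target makes it block the classic (shadow-ray) lighting, while the photons
// are traced through it and still reach the measurement plane
box {
  <-100, 4.49, -100>, <100, 4.51, 100>
  pigment { rgbt 1 }
  photons { target  refraction on  collect off }
}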

