Light probes mapping
From: Alain
Subject: Light probes mapping
Date: 1 Apr 2009 12:46:06
Message: <49d39a4e@news.povray.org>
I'm new to the HDR thing. I've got a few probes.

Now, my problem is mapping them correctly.
I have found 3 forms of mapping:
- Reflective sphere, where the center point is back at you and the perimeter
is the opposite point.
- Cross-shaped, to be wrapped around a box.
- Square, meant to be mapped onto a sphere. Those I have are not good.

The first form seems to be the most common format.

How do you use those? I don't use MegaPOV. It's not installed.

-- 
Alain
-------------------------------------------------
You know you have been raytracing for too long when the animation you render 
will be finished after you are.
Urs Holzer



From: clipka
Subject: Re: Light probes mapping
Date: 1 Apr 2009 13:20:00
Message: <web.49d3a106f4dc8d89f708085d0@news.povray.org>
Alain <ele### [at] netscapenet> wrote:
> - Reflective sphere, where the center point is back at you and the perimeter
> is the opposite point.
> - Cross-shaped, to be wrapped around a box.
> - Square, meant to be mapped onto a sphere. Those I have are not good.
>
> The first form seems to be the most common format.

The round ones come in two flavors: the "raw" shot of the reflective sphere, or
(actually the more common one, it seems) a format that somewhat distorts the
sphere radially, so that all parts of the sky get about the same resolution.
The two may sometimes be hard to tell apart.

> How do you use those? I don't use MegaPOV. It's not installed.

There is a program called HDRShop out there (free for noncommercial purposes,
IIRC) that allows you to convert between the various formats (and do some other
stuff with HDR shots); for official POV-Ray, you will most likely want to
convert the shots to the rectangular latitude/longitude format, which you
can project onto a sphere or sky_sphere using POV's spherical mapping.
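
Something along these lines should then do the trick in the official betas -
an untested sketch, and the file name is just a placeholder for whatever your
converted lat/long probe is called:

// minimal test scene: lat/long HDR probe on a sky_sphere (sketch, untested)
#version 3.7;
global_settings { assumed_gamma 1.0 }

camera { location 0 look_at z angle 60 }

sky_sphere {
  pigment {
    image_map {
      hdr "my_probe_latlong.hdr"  // placeholder file name
      map_type 1                  // spherical projection for lat/long maps
      interpolate 2               // bilinear interpolation
    }
  }
}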



From: Alain
Subject: Re: Light probes mapping
Date: 2 Apr 2009 21:46:52
Message: <49d56a8c$1@news.povray.org>
clipka enlightened us on 2009-04-01 13:14 -->
> Alain <ele### [at] netscapenet> wrote:
>> - Reflective sphere, where the center point is back at you and the perimeter
>> is the opposite point.
>> - Cross-shaped, to be wrapped around a box.
>> - Square, meant to be mapped onto a sphere. Those I have are not good.
>>
>> The first form seems to be the most common format.
> 
> The round ones come in two flavors: the "raw" shot of the reflective sphere, or
> (actually the more common one, it seems) a format that somewhat distorts the
> sphere radially, so that all parts of the sky get about the same resolution.
> The two may sometimes be hard to tell apart.
> 
>> How do you use those? I don't use MegaPOV. It's not installed.
> 
> There is a program called HDRShop out there (free for noncommercial purposes,
> IIRC) that allows you to convert between the various formats (and do some other
> stuff with HDR shots); for official POV-Ray, you will most likely want to
> convert the shots to the rectangular latitude/longitude format, which you
> can project onto a sphere or sky_sphere using POV's spherical mapping.
> 
> 
> 
Thanks.
Will start playing with HDRShop.

-- 
Alain
-------------------------------------------------
Politics is such a torment that I advise everyone I love not to mix with it.
Thomas Jefferson



From: Thomas de Groot
Subject: Re: Light probes mapping
Date: 3 Apr 2009 03:01:16
Message: <49d5b43c$1@news.povray.org>
"clipka" <nomail@nomail> schreef in bericht 
news:web.49d3a106f4dc8d89f708085d0@news.povray.org...
> There is a software called HDRShop out there (free for noncommercial 
> purposes
> IIRC) that allows you to convert between the various formats (and do some 
> other
> stuff with HDR shots); for official POV-Ray, you will most likely want to
> convert the shots to that rectangular latitude / longitude format, which 
> you
> can project onto a sphere or sky_sphere using POV's spherical mapping.
>

As an aside, I don't like the fact that *they* ask for personal details 
outside a secure environment. I am rather wary of that kind of thing.

Thomas



From: Trevor G Quayle
Subject: Re: Light probes mapping
Date: 3 Apr 2009 13:00:00
Message: <web.49d63fc5f4dc8d8981c811d20@news.povray.org>
"Thomas de Groot" <tDOTdegroot@interDOTnlANOTHERDOTnet> wrote:
> "clipka" <nomail@nomail> schreef in bericht
> news:web.49d3a106f4dc8d89f708085d0@news.povray.org...
>
> As an aside, I don't like the fact that *they* ask for personal details
> outside a secure environment. I am rather wary of that kind of things.
>
> Thomas

I have no problem giving them personal information.  It just happens to be made
up, but it satisfies their curiosity.  I use HDRShop for all my HDR map
purposes: viewing, converting map types and creating my own light probes.  It
is very easy to use and is a low-impact install (in fact, it doesn't need to be
installed at all and doesn't write to the registry and such).



As an aside (or perhaps back on the original topic): I prefer to use strictly
lat/long type maps.  They are easier to look at unmapped.  Also, the similarity
between mirror-ball and angular maps can cause confusion, as pointed out already.

Mirror balls are just that: an image of a mirrored ball.  As you go outwards, it
reflects according to the angle of incidence, but this causes the data to get
squished as you get to the edges, and it can end up looking odd.  Mirror-ball maps
are very difficult to use in POV; they basically need to be converted to a
usable format.

Angular maps (sometimes called light probes) represent a linear interpolation of
the environment relative to the angle from the viewing vector, which distributes
the data more evenly (e.g. the center is 0 degrees from the viewing angle, 1/4 of
the distance from the center is 90deg, 1/2 = 180deg, 3/4 = 270deg, the edge =
360deg).  This is much easier to map than a mirror ball, but may require some
thought on how to map it properly (MegaPOV actually supports a map_type 7, which
is this type).

A cubic environment map, or vertical cross, is basically 6 parts that get mapped
to the 6 sides of a cube.  With a little work, these could be mapped to a sphere
as well.  This type shows the least distortion in its unmapped format, but it
basically requires each of the six parts to be mapped individually rather than
with a single function.
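
One face of such a box could look something like this (untested sketch; the file
name is a placeholder for a piece already cut out of the cross, and the other
five faces need their own rotations, with some flipping so the seams line up):

// the "front" (+z) face of a sky box, built from a unit polygon
polygon {
  4, <0,0>, <1,0>, <1,1>, <0,1>
  pigment {
    image_map {
      hdr "cross_front.hdr"   // placeholder: the front piece of the cross
      map_type 0              // planar mapping over the unit square
      once
      interpolate 2
    }
  }
  finish { ambient 1 diffuse 0 }  // self-lit, unaffected by scene lights
  translate <-0.5, -0.5, 0>   // center the square on the origin
  scale 100                   // make it big enough to enclose the scene
  translate 100*z             // push it out to form the +z side of the cube
}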

Lat/long maps are basically a standard square-to-sphere mapping: the x axis is
longitude (0 to 360 deg), the y axis is latitude (-90 to 90 deg).  The image gets
stretched as you progress towards the top or bottom, but this is not a problem,
as it essentially includes more resolution than necessary and typically doesn't
affect the quality, as opposed to squishing and losing detail.  I find these the
easiest to map and manipulate, and also the easiest to understand when looking at
the unmapped version.  My HDR lightdome macro uses strictly this type for this
reason (it is also much easier to run the maps through the various distribution
algorithms within the macro in this format).
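
For what it's worth, a bare-bones way to hang such a lat/long map on a finite
sphere (just a quick sketch, not my lightdome macro; the file name is a
placeholder):

// environment sphere carrying a lat/long HDR map (sketch, untested)
sphere {
  0, 1000                          // large enough to enclose the whole scene
  pigment {
    image_map {
      hdr "my_probe_latlong.hdr"   // placeholder file name
      map_type 1                   // spherical mapping
      interpolate 2
    }
  }
  finish { ambient 1 diffuse 0 }   // self-lit, unaffected by scene lights
  scale <-1, 1, 1>                 // flip if the view looks mirrored from inside
  hollow
}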

-tgq



From: Kyle
Subject: Re: Light probes mapping
Date: 3 Apr 2009 14:41:31
Message: <49d6585b$1@news.povray.org>
Trevor G Quayle wrote:
> Angular maps (sometimes called light probes) represent a linear interpolation of
> the environment relative to the angle from the viewing vector, which distributes
> the data more evenly (e.g. the center is 0 degrees from the viewing angle, 1/4 of
> the distance from the center is 90deg, 1/2 = 180deg, 3/4 = 270deg, the edge =
> 360deg).

Is that right?  If so, the edge is the same as the center of the image, 
since it is 360deg, and halfway out is the same in either direction, since 
it's 180deg?  That's rather odd.



From: Trevor G Quayle
Subject: Re: Light probes mapping
Date: 3 Apr 2009 15:40:00
Message: <web.49d665f8f4dc8d8981c811d20@news.povray.org>
Kyle <no### [at] spamok> wrote:
> Trevor G Quayle wrote:
> > Angular maps (sometimes called light probes) represent a linear interpolation of
> > the environment relative to the angle from the viewing vector, which distributes
> > the data more evenly (e.g. the center is 0 degrees from the viewing angle, 1/4 of
> > the distance from the center is 90deg, 1/2 = 180deg, 3/4 = 270deg, the edge =
> > 360deg).
>
> Is that right?  If so, the edge is the same as the center of the image,
> since it is 360deg, and halfway out is the same in either direction, since
> it's 180deg?  That's rather odd.

Sorry, it goes from 0 at the center to 180deg at the outer edge (which gives full
360deg coverage).
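
Put another way (a sketch in POV function syntax; the name and the +z probe axis
are just my assumptions): for a unit direction vector <Dx,Dy,Dz>, the distance of
the sample point from the image center, as a fraction of the probe radius, is

// 0 at the center (0 deg), 0.5 halfway out (90 deg), 1 at the rim (180 deg)
#declare ProbeRadius = function(Dz) { acos(Dz)/pi }

so the center of the image is straight ahead along the probe axis and the rim is
directly behind it.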

-tgq

