I haven't followed this HDRI discussion too closely.
Looking at the few pages and images posted to p.b.i, I see that I haven't
the slightest clue what it is all about. The images look very nice,
but I can't see what the HDRI thing in those images is.
So could someone explain exactly what HDRI is and how it affects
rendering? If it's an algorithm that makes something better, could someone
render the same scene both the regular way and with HDRI
so that the two images can be compared side by side? If it's something
completely new, not comparable to anything in POV-Ray, could someone
explain how it affects the images?
--
#macro M(A,N,D,L)plane{-z,-9pigment{mandel L*9translate N color_map{[0rgb x]
[1rgb 9]}scale<D,D*3D>*1e3}rotate y*A*8}#end M(-3<1.206434.28623>70,7)M(
-1<.7438.1795>1,20)M(1<.77595.13699>30,20)M(3<.75923.07145>80,99)// - Warp -
Hey Warp,
An HDRI image is an image map that stores brightness values beyond the
typical [0,1] range (it uses floating-point numbers to represent
[0, infinity)). With this, you can use an HDRI environment map as a light
source to realistically light a scene by simulating the lighting coming
from the environment.
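A toy sketch of that distinction, in Python. The radiance values and the plain min/max clamp are invented for illustration; real HDR formats (e.g. Radiance .hdr) use floating-point or shared-exponent pixel encodings:

```python
# Toy model of the difference between a conventional low-dynamic-range (LDR)
# image map and an HDRI map. All numbers are made up for illustration.

def store_ldr(radiance):
    """A conventional map clamps everything into [0, 1]."""
    return min(max(radiance, 0.0), 1.0)

def store_hdr(radiance):
    """An HDRI map keeps the full floating-point value."""
    return float(radiance)

# A dim wall, a white card, and the sun seen through a window:
scene = [0.2, 1.0, 5000.0]

print([store_ldr(r) for r in scene])   # [0.2, 1.0, 1.0] -- sun == white card
print([store_hdr(r) for r in scene])   # [0.2, 1.0, 5000.0] -- ratios preserved
```

Used as an environment light, the second map lets the "sun" pixel dominate the illumination of the scene; in the clamped map it is no brighter than the white card.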
What I'm not sure about is how POV-Ray or mlpov deals with the extended
range, when POV-Ray's lighting engine only handles the [0,1] range. The
pictures do look good though, so it seems to be working.
Regards,
George Pantazopoulos
"Warp" <war### [at] tagpovrayorg> wrote in message
news:3e0ce0a8@news.povray.org...
> I haven't followed this HDRI discussion too closely.
> Looking at the few pages and images posted to p.b.i, I see that I haven't
> the slightest clue what it is all about. The images look very nice,
> but I can't see what the HDRI thing in those images is.
>
> So could someone explain exactly what HDRI is and how it affects
> rendering? If it's an algorithm that makes something better, could someone
> render the same scene both the regular way and with HDRI
> so that the two images can be compared side by side? If it's something
> completely new, not comparable to anything in POV-Ray, could someone
> explain how it affects the images?
>
> --
> #macro M(A,N,D,L)plane{-z,-9pigment{mandel L*9translate N color_map{[0rgb x]
> [1rgb 9]}scale<D,D*3D>*1e3}rotate y*A*8}#end M(-3<1.206434.28623>70,7)M(
> -1<.7438.1795>1,20)M(1<.77595.13699>30,20)M(3<.75923.07145>80,99)// - Warp -
In article <3e0cef3e$1@news.povray.org>,
"George Pantazopoulos" <the### [at] attbicom*KILLSPAM*> wrote:
> What I'm not sure about is how POV-Ray or mlpov deals with the extended
> range, when POV-Ray's lighting engine only handles the [0,1] range.
This isn't true. Values only get clamped to the [0, 1] range after being
computed. An rgb 2 light is twice as bright as an rgb 1 light, and half
as bright as an rgb 4 light.
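A minimal numeric sketch of that compute-then-clamp behaviour. The inverse-square falloff and all the numbers here are invented for illustration, not POV-Ray's actual lighting code:

```python
# "Clamp only at the end": the lighting math runs on unbounded floats,
# and the pixel is clamped to [0, 1] only when written out.

def lit_value(brightness, distance):
    """Unclamped contribution of a point light (toy inverse-square falloff)."""
    return brightness / distance**2

def final_pixel(value):
    """Clamping happens only at output time."""
    return min(max(value, 0.0), 1.0)

# An rgb 2 light really is twice as bright as an rgb 1 light...
assert lit_value(2.0, 3.0) == 2 * lit_value(1.0, 3.0)

# ...so moved sqrt(2) times farther away, it lights a point the same:
print(lit_value(2.0, 2.0**0.5))           # ~1.0

# Intermediate values above 1 survive until the final clamp:
print(final_pixel(lit_value(4.0, 1.0)))   # 1.0 (computed as 4.0, then clamped)
```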
--
Christopher James Huff <cja### [at] earthlinknet>
http://home.earthlink.net/~cjameshuff/
POV-Ray TAG: chr### [at] tagpovrayorg
http://tag.povray.org/
Thanks for the correction, Christopher, that's great!
George
"Christopher James Huff" <chr### [at] maccom> wrote in message
news:chr### [at] netplexaussieorg...
> In article <3e0cef3e$1@news.povray.org>,
> "George Pantazopoulos" <the### [at] attbicom*KILLSPAM*> wrote:
>
> > What I'm not sure about is how POV-Ray or mlpov deals with the extended
> > range, when POV-Ray's lighting engine only handles the [0,1] range.
>
> This isn't true. Values only get clamped to the [0, 1] range after being
> computed. An rgb 2 light is twice as bright as an rgb 1 light, and half
> as bright as an rgb 4 light.
>
> --
> Christopher James Huff <cja### [at] earthlinknet>
> http://home.earthlink.net/~cjameshuff/
> POV-Ray TAG: chr### [at] tagpovrayorg
> http://tag.povray.org/
George Pantazopoulos <the### [at] attbicom*killspam*> wrote:
> With this, you can use a HDRI environment map as a light source
> to realistically light a scene by simulating the lighting coming from the
> environment.
How is this done? Simply by using the regular radiosity algorithm in POV-Ray?
Or is there a new algorithm for this as well?
--
#macro N(D)#if(D>99)cylinder{M()#local D=div(D,104);M().5,2pigment{rgb M()}}
N(D)#end#end#macro M()<mod(D,13)-6mod(div(D,13)8)-3,10>#end blob{
N(11117333955)N(4254934330)N(3900569407)N(7382340)N(3358)N(970)}// - Warp -
"Warp" <war### [at] tagpovrayorg> wrote in message
news:3e0d620d@news.povray.org...
> How is this done? Simply by using the regular radiosity algorithm in POV-Ray?
> Or is there a new algorithm for this as well?
From what I've understood, there's nothing really new in the way POV-Ray
handles radiosity itself, it's the map that does all the work and that's the
real beauty of it. The radiosity itself is traditional but because light
intensity has a much wider range in the HDR map than in a usual map (like in
the real world) it makes the results much more natural-looking, even in the
absence of specular highlights. This becomes clearer if you load HDR maps in
HDRShop (at Paul Debevec's page http://www.debevec.org) and play with the
exposure settings. The maps then reveal themselves in their 2 dimensions
(color and light intensity).
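A toy version of that exposure experiment. The pixel values and the simple 2^stops exposure model are invented for illustration; HDRShop's actual controls differ:

```python
# Scale an HDR pixel by 2**stops before clamping for display: the HDR file
# still contains detail in its bright regions, the LDR file does not.

def expose(radiance, stops):
    """Apply an exposure of `stops` f-stops, then clamp for display."""
    return min(radiance * 2.0**stops, 1.0)

bright_window = 64.0   # HDR file: the true radiance survives on disk
ldr_window = 1.0       # LDR file: already clamped to white on disk

# Stopping down 8 stops reveals structure in the HDR pixel, while the
# already-clamped LDR pixel just fades uniformly toward black:
print(expose(bright_window, -8))   # 0.25 -- detail recovered
print(expose(ldr_window, -8))      # 0.00390625 -- detail is gone
```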
G.
--
**********************
http://www.oyonale.com
**********************
- Graphic experiments
- POV-Ray and Poser computer images
- Posters
In article <3e0d9909@news.povray.org>,
"Gilles Tran" <git### [at] wanadoofr> wrote:
> From what I've understood, there's nothing really new in the way POV-Ray
> handles radiosity itself, it's the map that does all the work and that's the
> real beauty of it. The radiosity itself is traditional but because light
> intensity has a much wider range in the HDR map than in a usual map (like in
> the real world) it makes the results much more natural-looking, even in the
> absence of specular highlights. This becomes clearer if you load HDR maps in
> HDRShop (at Paul Debevec's page http://www.debevec.org) and play with the
> exposure settings. The maps then reveal themselves in their 2 dimensions
> (color and light intensity).
So, instead of taking samples from the actual scene, it takes them from
a separately calculated image? The precision might be better than 8 or
16 bpc images, but it will still almost definitely be worse than float
precision. And if it is applied like an environment map, it has the same
problems: flat surfaces will have no shading or shadows. UV mapping
would make more sense, you could take the object geometry into account,
though you still have to make sure the map is at a high enough
resolution.
I could understand it having an advantage over using more common image
formats for light mapping, but what use does it have in POV? Is the
patch for output of HDRI images? That way, POV could generate them for
other, more limited programs...
--
Christopher James Huff <cja### [at] earthlinknet>
http://home.earthlink.net/~cjameshuff/
POV-Ray TAG: chr### [at] tagpovrayorg
http://tag.povray.org/
"Christopher James Huff" <chr### [at] maccom> wrote in message
news:chr### [at] netplexaussieorg...
> In article <3e0d9909@news.povray.org>,
> "Gilles Tran" <git### [at] wanadoofr> wrote:
> I could understand it having an advantage over using more common image
> formats for light mapping, but what use does it have in POV?
Hmm, it looks like you haven't seen what's going on in p.b.i these days.
Come on, give it a try, you'll see the advantage by yourself :-) (and I
didn't believe it before I was given the patch). With HDR, we now have
POV-Ray images virtually impossible to tell from actual photographs. See
JRG's watch pic for instance.
Actually, HDR effects can be done in POV-Ray without a patch, by using
radiosity with a texture_map with a very high range of ambient values (this
is what Ive did for his Vermeer images) but it's not exactly easy to set up.
Setting up a simple HDR scene is a piece of cake by comparison.
G.
--
**********************
http://www.oyonale.com
**********************
- Graphic experiments
- POV-Ray and Poser computer images
- Posters
Gilles Tran wrote:
>
> "Christopher James Huff" <chr### [at] maccom> wrote in message
> news:chr### [at] netplexaussieorg...
>> In article <3e0d9909@news.povray.org>,
>> "Gilles Tran" <git### [at] wanadoofr> wrote:
>> I could understand it having an advantage over using more common image
>> formats for light mapping, but what use does it have in POV?
>
> Hmm, it looks like you haven't seen what's going on in p.b.i these days.
> Come on, give it a try, you'll see the advantage by yourself :-) (and I
> didn't believe it before I was given the patch). With HDR, we now have
> POV-Ray images virtually impossible to tell from actual photographs. See
> JRG's watch pic for instance.
In my opinion HDRI is the next step closer to reality, and POV-Ray will
become a high-end rendering system. The images in p.b.i are - more or
less - impressive because of the natural way of illumination.
Regards,
Andreas
--
http://www.render-zone.com
In article <3e0e4bc9$2@news.povray.org>,
"Gilles Tran" <git### [at] wanadoofr> wrote:
> Hmm, it looks like you haven't seen what's going on in p.b.i these days.
I'm on a modem connection, so I haven't kept up.
> Come on, give it a try, you'll see the advantage by yourself :-) (and I
> didn't believe it before I was given the patch). With HDR, we now have
> POV-Ray images virtually impossible to tell from actual photographs. See
> JRG's watch pic for instance.
But it is still less accurate than radiosity could be, as far as I can
tell. Why use a high dynamic range environment map instead of sampling
the actual environment? Speed? The ability to use real-world samples
instead of coding a background?
--
Christopher James Huff <cja### [at] earthlinknet>
http://home.earthlink.net/~cjameshuff/
POV-Ray TAG: chr### [at] tagpovrayorg
http://tag.povray.org/