I don't know if such a lighting method has been mentioned before. Jaime
described a method that simulates this with two spheres and a greyscale
lighting image separated from a low-range photo.
My method uses one sphere and one texture, enables colored lighting, and is
easy to set up and adjust.
For the image shown I used an HDR light probe. I think it looks very similar
to real HDR lighting. What do you think about it?
jo
Attachments: 'aschenbecher.jpg' (92 KB)
> For the image shown I used an HDR light probe. I think it looks very
> similar to real HDR lighting. What do you think about it?
Looks great, but what's the rendering time? How much parsing time is
required?
--
"Tim Nikias v2.0"
Homepage: <http://www.nolights.de>
Email: tim.nikias (@) nolights.de
> Looks great, but what's the rendering time? How much parsing time is
> required?
On my P450 it was about 50 minutes altogether (parsing: a few seconds), not
that different from a "normal" radiosity scene.
What's HDR?

"Tim Nikias v2.0" <tim.nikias (@) nolights.de> wrote in message
news:4064578d@news.povray.org...
> > For the image shown I used an HDR light probe. I think it looks very
> > similar to real HDR lighting. What do you think about it?
>
> Looks great, but what's the rendering time? How much parsing time is
> required?
> What's HDR?

HDRI = High Dynamic Range Image. Instead of 8 bits (256 levels) per channel,
an HDR image uses floating-point values (32 bits) per channel and can store
much brighter values than the white of your monitor. Those values can be
used (together with radiosity) to illuminate virtual models in a more
natural way.
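A minimal sketch of the idea in POV-Ray syntax (my own illustration, not
code from the thread): color values above 1.0 stand in for "brighter than
the white of your monitor", and radiosity turns them into illumination.

// Illustration only: an over-bright emitting sphere used as a radiosity
// light source. rgb 3 is three times brighter than pure white, the kind
// of value an HDR image can store per pixel.
global_settings {
  radiosity {
    count 200        // rays per radiosity sample
    error_bound 0.5
  }
}

sphere {
  0, 1000
  hollow                           // the scene sits inside this sky sphere
  pigment { rgb 3 }
  finish { ambient 1 diffuse 0 }   // pure emission, no shading
}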
Nice!
What's your method? Does it work in official POV-Ray?
mlPov supports HDRI mapping and lighting.
The drawback of HDR is the usually low resolution of the images (due to
file size), making focal blur nearly compulsory and so slowing the render.
Marc
"joeydee" <bac### [at] gmxde> wrote in message
news:4064566d@news.povray.org...
> I don't know if such a lighting method has been mentioned before. Jaime
> described a method that simulates this with two spheres and a greyscale
> lighting image separated from a low-range photo.
> My method uses one sphere and one texture, enables colored lighting, and
> is easy to set up and adjust.
> For the image shown I used an HDR light probe. I think it looks very
> similar to real HDR lighting. What do you think about it?
>
> jo
"joeydee" <bac### [at] gmxde> wrote in message
news:4064566d@news.povray.org...
> I don't know if such a lighting method has been mentioned before. Jaime
> described a method that simulates this with two spheres and a greyscale
> lighting image separated from a low-range photo.
> My method uses one sphere and one texture, enables colored lighting, and
> is easy to set up and adjust.
> For the image shown I used an HDR light probe. I think it looks very
> similar to real HDR lighting. What do you think about it?
Looks cool! The Surgeon General might have words about it... :-)
--
- Respectfully,
Dan
http://<broken link>
Marc Jacquier <jac### [at] wanadoofr> wrote in message
news:4065bcc6@news.povray.org...
> Nice!
> What's your method? Does it work in official POV-Ray?
> mlPov supports HDRI mapping and lighting.
> The drawback of HDR is the usually low resolution of the images (due to
> file size), making focal blur nearly compulsory and so slowing the render.
>
> Marc
Yes, it works in official POV. Normally, HDR images are assembled from
photos of different exposures into one image. I use HDR Shop to separate
them back into several low-range images of different exposures and
re-assemble them in POV into one texture (the layers masking each other,
with ascending ambient values). For the picture above I used only two
exposures; POV interpolates the missing values.
Yes, low resolution is a problem, and putting virtual objects into a natural
environment is always tricky, even at high resolution.
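A minimal sketch of how such a layered sky sphere might look (my guess at
the setup; the file names, the alpha mask and the ambient values are
assumptions, not joeydee's actual code):

// Hypothetical layered-exposure sky sphere. "probe_low.png" is the darkest
// exposure; "probe_high.png" is a brighter exposure whose alpha channel
// (prepared e.g. in HDR Shop) masks out everything but the bright spots,
// so this layer only shows where the light sources are.
sphere {
  0, 1000
  hollow                       // camera and scene sit inside the sphere
  texture {                    // base layer: low exposure
    pigment {
      image_map { png "probe_low.png" map_type 1 interpolate 2 }
    }
    finish { ambient 1 diffuse 0 }
  }
  texture {                    // top layer: bright spots, higher ambient
    pigment {
      // the PNG's alpha makes the dark parts of this layer transparent
      image_map { png "probe_high.png" map_type 1 interpolate 2 }
    }
    finish { ambient 6 diffuse 0 }
  }
}

With a radiosity block in global_settings, the high-ambient layer then acts
as the main light source, which is roughly what real HDR lighting does with
its over-bright pixels.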