  Re: Experiments with light probes  
From: Trevor G Quayle
Date: 2 Jun 2009 12:45:01
Message: <web.4a2555f275b7d3c981c811d20@news.povray.org>
"clipka" <nomail@nomail> wrote:
> "Bill Pragnell" <bil### [at] hotmailcom> wrote:
> > It works a treat, but often I find that the rotated image doesn't quite match
> > the other image after unwrapping. The points I select in the editor match
> > perfectly, naturally, but the images seem to diverge near the edges. I'm not
> > entirely sure why this is happening, but making sure the ball is centred in the
> > frame when taking the pictures seems to help considerably. Can any resident
> > HDR-makers shed any light on this?
>
> I'm not an expert on this, but AFAIK virtually all real-life cameras exhibit
> some distortion of objects not precisely in the center of the frame.
>
> It's basically the same effect as can be seen with POV-Ray's standard
> "perspective" camera: Place two spheres side by side - one in the center of the
> image, and one at the side. Crop both shots to the spheres' dimensions. You'll
> notice that the off-center sphere does not seem to be circular.
>
> AFAIK it is mathematically not possible to design a camera that produces planar
> 2D images without any such effects.
>
> The more zoom you use, the less prominent these distortions should be.
>
>
> If your lightprobe-generating software takes already-cropped images, it will
> most likely be unable to determine whether your chrome sphere was off-center in
> the original shot, and silently assume that it was taken head-on.

This should usually not be the problem.  I say usually because I assume most
people try to center the mirrorball in their shot, which minimizes any
distortion directly related to being off-center.  It is still worth noting
as general advice to do so, though.
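
For reference, the two-sphere test clipka describes is easy to set up.
Something along these lines should do it (camera angle, positions and radii
are arbitrary numbers of mine, not anything from an actual test):

// two identical spheres, one on the camera axis and one well off to the side
camera {
  perspective
  location <0, 1, -6>
  look_at  <0, 1,  0>
  angle 90             // a wide angle makes the effect more obvious
}
light_source { <10, 20, -20> rgb 1 }
background { rgb 0.5 }
sphere { <0, 1, 0>, 1 pigment { rgb <1, 0.2, 0.2> } }   // dead center
sphere { <4, 1, 0>, 1 pigment { rgb <0.2, 0.4, 1> } }   // near the frame edge

Crop the render tightly around each sphere: the on-axis one stays circular,
the off-axis one gets stretched into an ellipse.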

There is far more distortion due to perspective and parallax in most cases.
The parallax distortion occurs because the mirrorball is not infinitely small
and the environment is not infinitely large.  When the two shots are taken
from positions 90deg apart, they are not reflecting *exactly* the same
geometry.  The amount of parallax distortion depends on how big the
mirrorball is relative to the environment, or to particular visible objects.
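
As a very rough feel for the size of that effect, you can treat the two
shots as seeing the scene from two viewpoints separated by something on the
order of the ball's diameter, so an object at distance D shifts by roughly
atan(2R/D) between the two unwrapped maps.  The numbers below are just
assumptions for illustration:

#declare R = 0.1;   // mirrorball radius in metres (a 20cm ball, assumed)
#declare D = 3.0;   // distance to a typical object in the room (assumed)
#declare Shift = degrees(atan2(2*R, D));  // rough angular mismatch
#debug concat("approx. parallax mismatch: ", str(Shift, 0, 2), " deg\n")

That comes out to a few degrees for nearby objects, shrinking toward zero
for distant ones, which is why it is the close-in geometry that refuses to
line up.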

There is also perspective-related distortion that comes from the fact that
the camera isn't taking a true orthographic shot of the mirrorball.  A
portion of the scene directly behind the ball is actually missing (i.e., you
don't really get the full 360deg), yet the image gets used as if it were
complete.  The amount missing is directly related to the ratio of the
mirrorball's size to the camera's distance from it.
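
To put a rough number on that, assuming an ideal pinhole camera: the blind
region is a cone directly behind the ball whose half-angle works out to
asin(R/D), with R the ball radius and D the camera-to-ball distance (the
values below are assumptions):

#declare R = 0.1;                        // mirrorball radius (assumed)
#declare D = 1.5;                        // camera-to-ball distance (assumed)
#declare Alpha   = asin(R/D);            // half-angle of the blind cone
#declare Missing = (1 - cos(Alpha))/2;   // fraction of the sphere never seen
#debug concat("blind cone half-angle: ", str(degrees(Alpha), 0, 2), " deg\n")
#debug concat("missing fraction: ", str(Missing*100, 0, 3), " %\n")

Shooting from farther away and zooming in shrinks that cone, which fits the
earlier point that more zoom means less distortion.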

-tgq

