On 30 Mar 2004 14:56:43 -0500, Eamon Caddigan <eca### [at] uiucedu> wrote:
> One scene has the following camera definition:
> camera {
> location <0.0000, 0.0000, -2.0000>
> look_at <0.0000, 0.0000, -0.0000>
> up <0.0000, 6.0000, 0.0000>
> right <8.0000, 0.0000, 0.0000>
> direction <-0.0000, -0.0000, -4.0000>
> }
>
> If the direction vector begins at the location point, it's pointing
> "away" from the look_at point. Maybe the problem is my comprehension of
> POV-Ray's camera model.
With POV-Ray's camera model, there is no way to look "away" from the
look_at point.
'look_at' (if specified) is always applied after 'direction'. That makes
your camera statement equivalent to:
camera {
location <0, 0, -2>
up <0, 6, 0>
right <8, 0, 0>
direction <0, 0, -4>
look_at <0, 0, 0>
}
...which, in turn, is equivalent to:
camera {
location <0, 0, -2>
up <0, 6, 0>
right <8, 0, 0>
direction vnormalize(<0,0,0> - <0,0,-2>)*4
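// i.e. the unit vector from 'location' toward the old look_at point, scaled back to the original length 4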
}
or
camera {
location <0, 0, -2>
up <0, 6, 0>
right <8, 0, 0>
direction <2, 3, sqrt(3)> // Arbitrary vector with length 4
look_at <0, 0, 0>
}
When you specify all three of 'up', 'right' & 'look_at', the actual
direction of 'direction' no longer matters. Only its length matters, since
that determines the field of view (unless you also specify 'angle').
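For what it's worth, assuming the usual relation from the docs, angle = 2*atan((|right|/2)/|direction|), your right length of 8 and direction length of 4 work out to a 90 degree horizontal field of view, so the same framing could also be written as:
camera {
location <0, 0, -2>
up <0, 6, 0>
right <8, 0, 0>
angle 90 // 2*atan((8/2)/4) = 90 degrees
look_at <0, 0, 0>
}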
You might want to take a closer look at section 6.4.1 of the docs,
especially 6.4.1.4.
---
FE (on topic this time)