(Stupid me, got the header wrong the first time...)
There's something I don't understand about this function.
Here it is, stripped down to only reflect a non-transformed plane:
static int Intersect_Plane(RAY *Ray, PLANE *Plane, DBL *Depth)
{
  DBL NormalDotOrigin, NormalDotDirection;

  Increase_Counter(stats[Ray_Plane_Tests]);

  VDot(NormalDotDirection, Plane->Normal_Vector, Ray->Direction);

  if (fabs(NormalDotDirection) < EPSILON)
    return (FALSE);

  VDot(NormalDotOrigin, Plane->Normal_Vector, Ray->Initial);

  *Depth = -(NormalDotOrigin + Plane->Distance) / NormalDotDirection;

  if ((*Depth >= DEPTH_TOLERANCE) && (*Depth <= Max_Distance))
  {
    Increase_Counter(stats[Ray_Plane_Tests_Succeeded]);
    return (TRUE);
  }
  else
    return (FALSE);
}
First of all, I want to note that Plane->Distance has the opposite
sign of what is parsed from the scene file. This is seen in
Parse_Plane in parse.c.
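Just to illustrate the convention (this is only a sketch of the sign flip; stored_distance is my name, not anything in parse.c):

```c
#include <assert.h>

/* Scene file:  plane { <nx,ny,nz>, d }  describes the points with  n . p = d.
   The intersection code wants the implicit form  n . p + Distance = 0,
   so the parser stores the negated value.  (Hypothetical helper name.) */
static double stored_distance(double parsed_d)
{
  return -parsed_d;
}
```

So for plane { -x, -5 } the parsed value is -5 and Plane->Distance ends up as 5.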
The only test made here is whether the ray is parallel to the plane. The
value stored in *Depth is such that, when passed to Evaluate_Ray, it
always yields a point on the plane.
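For reference, the *Depth expression comes from substituting the ray O + t*D into the plane's implicit equation n . p + Distance = 0 and solving for t. A minimal standalone sketch (dot3 and plane_depth are my own helpers, not POV-Ray's):

```c
#include <math.h>

/* Hypothetical helper, not part of the POV-Ray source. */
static double dot3(const double a[3], const double b[3])
{
  return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* Solve  n . (O + t*D) + dist = 0  for t; this is exactly the
   *Depth expression in Intersect_Plane.  The caller is expected to
   reject |n . D| < EPSILON (parallel ray) first. */
static double plane_depth(const double n[3], double dist,
                          const double o[3], const double d[3])
{
  return -(dot3(n, o) + dist) / dot3(n, d);
}
```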
Take a very simple case. Let's test a ray whose origin is at <0,0,0> and
which points in the +x direction against a plane { -x, -5 }. Then:
NormalDotOrigin = 0
NormalDotDirection = -1
Plane->Distance = 5
Then *Depth evaluates to -(0+5)/(-1) = 5
When passed to Evaluate_Ray, this will result in
<0,0,0> + 5*<1,0,0> = <5,0,0>
Now, let's suppose the ray points at -x. Then:
NormalDotOrigin = 0
NormalDotDirection = 1
Plane->Distance = 5
*Depth = -(0+5)/1 = -5
which, when evaluated, gives:
<0,0,0> + (-5)*<-1,0,0> = <5,0,0>
But this ray does not intersect the plane, and nothing in the code
explicitly tests for that!
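To double-check the arithmetic above, here is a standalone sketch of both cases (the helper names are mine; Evaluate_Ray's job is just origin + depth * direction):

```c
#include <math.h>

/* Hypothetical helpers, not the POV-Ray source. */
static double dot3(const double a[3], const double b[3])
{
  return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* The *Depth formula from Intersect_Plane. */
static double plane_depth(const double n[3], double dist,
                          const double o[3], const double d[3])
{
  return -(dot3(n, o) + dist) / dot3(n, d);
}

/* What Evaluate_Ray computes: point = origin + depth * direction. */
static void evaluate_ray(double out[3], const double o[3],
                         double depth, const double d[3])
{
  for (int i = 0; i < 3; i++)
    out[i] = o[i] + depth * d[i];
}
```

With normal <-1,0,0> and Plane->Distance = 5, the +x ray gives depth 5 and the -x ray gives depth -5, and both evaluate to the same point <5,0,0>.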
Am I missing something?
Peter Popov ICQ : 15002700
Personal e-mail : pet### [at] vip bg
TAG e-mail : pet### [at] tag povray org