Christopher James Huff wrote:
>In article <3dd5ac26$1[at]news.povray.org>, "Slime" <slm### [at] slimeland com>
>wrote:
>
>> I thought, at first, that I could apply the inverse of the matrix
>> transformation to the normal, but then I realized that this wouldn't work in
>> all cases.
>
>You are correct. The answer is pretty simple: transform the normal by
>the transpose of the inverse of the transformation matrix.
>...
If the column (or row) vectors of his transformation
matrix form an orthonormal basis, then the
transpose of the inverse of that transformation
matrix is the transformation matrix itself.
I have not given the transformation of normals much
thought, but I think a consequence of the
above must be this:
As long as his transformation matrix does not scale
or shear the object in any way, he can simply use that
same matrix for the normals as well.
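A small pure-Python sketch (my own illustration, not code from the thread) can check both claims: for a pure rotation the normal may be transformed by the matrix itself, while once a non-uniform scale is mixed in, only the transpose of the inverse keeps the normal perpendicular to the transformed surface.

```python
import math

def matmul(A, B):
    """Product of two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(A, v):
    """Product of a 3x3 matrix and a 3-vector."""
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# A pure rotation (orthonormal rows and columns): 30 degrees about z.
c, s = math.cos(math.pi / 6), math.sin(math.pi / 6)
R = [[c, -s, 0], [s, c, 0], [0, 0, 1]]

# Add a non-uniform scale D = diag(2, 1, 1); M = D * R scales and rotates.
D = [[2, 0, 0], [0, 1, 0], [0, 0, 1]]
M = matmul(D, R)

# inverse(M) = inverse(R) * inverse(D) = transpose(R) * diag(1/2, 1, 1)
# (using that a rotation's inverse is its transpose), so the normal
# matrix is N = transpose(inverse(M)).
Dinv = [[0.5, 0, 0], [0, 1, 0], [0, 0, 1]]
N = transpose(matmul(transpose(R), Dinv))

t = [1.0, 0.0, 0.0]      # a surface tangent direction
n = [0.0, 1.0, 0.0]      # its normal: dot(n, t) == 0

t2 = matvec(M, t)        # tangent transforms by M itself
n_wrong = matvec(M, n)   # wrong: normal transformed by M
n_right = matvec(N, n)   # right: normal transformed by transpose(inverse(M))

print(abs(dot(n_wrong, t2)) < 1e-12)  # False: perpendicularity is lost
print(abs(dot(n_right, t2)) < 1e-12)  # True: perpendicularity preserved
```

With the scale removed (M = R), `N` reduces to `R` itself, which is exactly the shortcut described above for rotation-only transforms.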
Tor Olav