All I can say is that a single, simple square patch does nothing to help me
understand the fundamentals of how uv-mapping is done "under the hood" once more
complex patch definitions are used, nor does it address why specific control
point orderings and uv-vector arrangements result in the mappings that they do.
I mean, maybe I'm being too thick, or maybe I'm approaching this too
academically, so that my goal and intent get lost or dismissed in favour of
practical usage considerations.
Assuming that the pairs of patches are numbered left to right, top to bottom,
then the top left patch, number 1, renders "correctly".
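For reference, here is roughly what I mean by patch #1: a minimal sketch,
assuming a flat grid of control points in the x-y plane with point 1 at the
origin, and a hypothetical test image "uvgrid.png" (substitute any image map;
camera, lights, etc. come from the rest of the test scene). The uv-vector order
shown is, I believe, also the default.

bicubic_patch {
  type 1
  u_steps 4  v_steps 4
  uv_vectors <0,0>, <1,0>, <1,1>, <0,1>
  // control points 1-4
  <0,0,0>, <1,0,0>, <2,0,0>, <3,0,0>,
  // control points 5-8
  <0,1,0>, <1,1,0>, <2,1,0>, <3,1,0>,
  // control points 9-12
  <0,2,0>, <1,2,0>, <2,2,0>, <3,2,0>,
  // control points 13-16
  <0,3,0>, <1,3,0>, <2,3,0>, <3,3,0>
  texture {
    uv_mapping
    pigment { image_map { png "uvgrid.png" } }
  }
}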
This still raises the question of why there are FOUR uv-vectors, and not three,
as one can use to determine the normal of a triangle, or two plus a sign, as one
can use to define an axis and a rotational direction.
If the uv-vectors somehow define a direction around the perimeter of the patch,
as suggested by the red and blue arrows and the green circular arrow, then it
gets a little fuzzy when it comes to explaining (in _unambiguous detail_) how
the ordering of SIXTEEN control points plays into the mapping described by FOUR
uv-vectors.
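My working guess, and it is only a guess, is that the four uv-vectors are simply
the texture coordinates pinned to the four corners of the (u,v) parameter square
(corner 1 at (0,0), corner 2 at (1,0), corner 3 at (1,1), corner 4 at (0,1)),
with the interior of the patch getting a bilinear blend of them. Which corner
goes with which parameter value is exactly the ambiguity I'm trying to pin down,
but as a sketch:

// bilinear blend of the four corner uv-vectors at parameter (U,V);
// this is my guess at what happens "under the hood", not gospel
#macro InterpUV(UV1, UV2, UV3, UV4, U, V)
  ((1-U)*(1-V)*UV1 + U*(1-V)*UV2 + U*V*UV3 + (1-U)*V*UV4)
#end

// e.g. the centre of the parameter square should land in the centre
// of the texture quadrilateral
#declare Centre = InterpUV(<0,0>, <1,0>, <1,1>, <0,1>, 0.5, 0.5);
#debug concat("centre maps to <", vstr(2, Centre, ",", 0, 4), ">\n")

If something like that is going on, then four corners would be needed because
nothing forces the texture quadrilateral to be a parallelogram; with only three
corners (or two plus a sign) the fourth would have to be implied.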
If we look at patch #2 and the ordering of its control points, we can see that
the matrix of control points in the array is transposed: it is flipped around
the bottom_left-to-top_right diagonal. THE PATCH IS NOT.
If one considers the order of the first 4 control points in the ARRAY {*see
note, vide infra} then one might expect some sort of direct one-to-one mapping
of the patch, starting at the origin, to result in a patch where "<1, 0>" (in
the image) winds up in the position where "<0,1>" is in Patch #1.
IT DOES NOT.
*{Note that due to the top-to-bottom parsing of the ASCII SDL, the visual
representation of the control point positions, and the way that the array looks
when typed out in 4 sequential rows of 4 vectors, things can get a bit
confusing; e.g. the ARRAY of patch #1 looks like it's upside-down when compared
with the resulting Cartesian location of the control points that it describes.
The caveat here is to recognize the numbered linear order of the point(s), not
the graphic or visual position.}
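To make the transposition concrete (the same caveat about reading order
applies), here is a sketch of what I mean by the transposed array, using the
flat grid from my patch #1 sketch above. The sixteen points, and therefore the
surface, are identical; only their order in the array changes:

// patch #2 (sketch): the same sixteen points as patch #1, with the
// array transposed, i.e. row i, column j of #1 becomes row j, column i
bicubic_patch {
  type 1
  u_steps 4  v_steps 4
  uv_vectors <0,0>, <1,0>, <1,1>, <0,1>
  <0,0,0>, <0,1,0>, <0,2,0>, <0,3,0>,
  <1,0,0>, <1,1,0>, <1,2,0>, <1,3,0>,
  <2,0,0>, <2,1,0>, <2,2,0>, <2,3,0>,
  <3,0,0>, <3,1,0>, <3,2,0>, <3,3,0>
  texture {
    uv_mapping
    pigment { image_map { png "uvgrid.png" } }
  }
}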
So far my control points all lie in Quadrant I, neatly "radiating out" from
control point 1. Time and caffeine permitting, more experiments are on the
way.
Now let's compare patches 1 and 3.
If I were to think of uv-mapping the patch as a photographic projection of the
image file onto the patch, then looking at the order of control points of 3 vs
1, the front and back of patch #3 ought to be switched in comparison to #1, and
this is indeed the case.
If we were to reason by analogy, then a similar state of affairs ought to exist
when comparing patches #2 and #4, but there's clearly more going on than a
simple switching of interior and exterior textures. Not only are the textures
switched, they are also rotated 180 degrees about the z axis (rotate -z*180, in
SDL terms).
Now compare patches #1 and #5. These are matrix transpositions around the
OTHER diagonal. Unlike patch #2, where the textures are switched and rotated 90
degrees, the textures here do not appear to be switched, and are rotated 180
degrees. I will also note that switching and rotating 90 degrees two times
would likely give the same result.
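Again only as a sketch of what I mean, the flip around the OTHER diagonal (my
patch #5) would put point 16 in the first slot of the array and point 1 in the
last:

// patch #5 (sketch): patch #1's array flipped around the other diagonal
bicubic_patch {
  type 1
  u_steps 4  v_steps 4
  uv_vectors <0,0>, <1,0>, <1,1>, <0,1>
  <3,3,0>, <3,2,0>, <3,1,0>, <3,0,0>,
  <2,3,0>, <2,2,0>, <2,1,0>, <2,0,0>,
  <1,3,0>, <1,2,0>, <1,1,0>, <1,0,0>,
  <0,3,0>, <0,2,0>, <0,1,0>, <0,0,0>
  texture {
    uv_mapping
    pigment { image_map { png "uvgrid.png" } }
  }
}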
I think that's enough for now to illustrate how I'm looking at this, and perhaps
clarify what exactly it is that I'm trying to find out.
Perhaps the best thing I can think of to "document" how this all works would be
to code up an animation showing the infinite tiling of the image map texture,
its projection onto a unit patch, and lines connecting the texture and the
patch to show the actual mapping. Then move the control points and uv vectors
to see how it all works dynamically.
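As a first step in that direction, here is a minimal sketch of the dynamic part
only: one control point and one uv-vector driven by the clock variable, rendered
as a short frame sequence (e.g. +KFI1 +KFF30 on the command line). The tiling
display and the connecting lines would still have to be built on top of this.

// animate control point 6 of the flat grid and the second uv-vector;
// clock runs 0..1 over the animation by default
#declare P6  = <1, 1, 2*clock>;        // lift the point out of the plane
#declare UV2 = <1, 0> + clock*<1, 0>;  // stretch the mapping in u

bicubic_patch {
  type 1
  u_steps 4  v_steps 4
  uv_vectors <0,0>, UV2, <1,1>, <0,1>
  <0,0,0>, <1,0,0>, <2,0,0>, <3,0,0>,
  <0,1,0>, P6,      <2,1,0>, <3,1,0>,
  <0,2,0>, <1,2,0>, <2,2,0>, <3,2,0>,
  <0,3,0>, <1,3,0>, <2,3,0>, <3,3,0>
  texture {
    uv_mapping
    pigment { image_map { png "uvgrid.png" } }
  }
}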