Hi!
I think I have an idea for how to do "UV mapping" for
arbitrary objects. The current version only seems to
work for bezier patches and a few other objects (does
it work with CSG?).
Please note that by "transformations" I mean arbitrary
nonlinear transformations of objects (CSG, animated
blobs, etc.).
1. The user specifies the object before and after any
transformations, plus a few center points.
2. From the center points a set of rays is shot out.
Where they intersect the untransformed object they
form the control points c0, c1, c2, ..., and where
they intersect the transformed object they form the
control points c'0, c'1, ....
3. When applying a texture to a point <x',y',z'> on the
transformed object, write <x',y',z'> as a linear
combination of the few closest control points:
<x',y',z'> = a * c'i + b * c'j + d * c'k
Calculate the "untransformed" point
<x,y,z> = a * ci + b * cj + d * ck
and use the texture given at that point.
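Step 3 above could be sketched roughly as follows. This is only an
illustration, not POV-Ray source: I assume three closest control points
in 3D (so the coefficients come from a 3x3 linear solve), and the names
`untransform`, `solve3`, `det3` are made up for the example.

```python
def det3(m):
    # Determinant of a 3x3 matrix given as a list of rows.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def solve3(cols, rhs):
    # Solve M w = rhs by Cramer's rule, where the columns of M are `cols`
    # (the three transformed control points). Degenerate if the three
    # points are coplanar with the origin (det = 0) -- not handled here.
    m = [[cols[j][i] for j in range(3)] for i in range(3)]
    d = det3(m)
    weights = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = rhs[i]
        weights.append(det3(mj) / d)
    return weights

def untransform(p_prime, transformed_cps, original_cps):
    # Express p' = a*c'i + b*c'j + d*c'k using the three control points
    # closest to p', then reuse the same weights on the untransformed
    # control points to get <x,y,z>.
    order = sorted(range(len(transformed_cps)),
                   key=lambda k: sum((p_prime[t] - transformed_cps[k][t])**2
                                     for t in range(3)))
    i, j, k = order[:3]
    a, b, d = solve3([transformed_cps[i], transformed_cps[j],
                      transformed_cps[k]], p_prime)
    return tuple(a*original_cps[i][t] + b*original_cps[j][t]
                 + d*original_cps[k][t] for t in range(3))
```

With an identity transformation (transformed control points equal the
originals) the function just returns the input point back, which is a
cheap sanity check on the weights.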
Example of how it could be used:
difference {
  sphere { <0,0,0>, 10 }
  sphere { <10,0,0>, 2 }
  texture {
    transformed {
      original sphere { <0,0,0>, 10 }
      original_centerpoints { <0,0,0> }
      center_points { <0,0,0> }
      control_density 10
    }
    <my favourite texture....>
  }
}
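Step 2 (shooting the rays to build the control points) could be sketched
like this for the sphere case. Again just an illustration: the names are
invented, I use the analytic ray/sphere intersection, and instead of
intersecting a separately modelled transformed object I apply a known
`transform` function to each surface hit, which is a simplification of
the proposal.

```python
import math

def ray_sphere(origin, direction, center, radius):
    # Smallest t > 0 with |origin + t*direction - center| = radius, or None.
    oc = tuple(origin[i] - center[i] for i in range(3))
    a = sum(d*d for d in direction)
    b = 2.0 * sum(oc[i]*direction[i] for i in range(3))
    c = sum(x*x for x in oc) - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return None
    for t in ((-b - math.sqrt(disc)) / (2*a),
              (-b + math.sqrt(disc)) / (2*a)):
        if t > 1e-9:
            return t
    return None

def control_points(center, radius, transform, n_theta=8, n_phi=4):
    # Shoot rays from `center` in sampled spherical directions; each hit c
    # on the untransformed sphere is paired with the corresponding point
    # c' = transform(c) on the deformed surface.
    pairs = []
    for i in range(n_theta):
        for j in range(1, n_phi):
            theta = 2*math.pi*i/n_theta
            phi = math.pi*j/n_phi
            d = (math.sin(phi)*math.cos(theta),
                 math.sin(phi)*math.sin(theta),
                 math.cos(phi))
            t = ray_sphere(center, d, center, radius)
            if t is None:
                continue
            c = tuple(center[k] + t*d[k] for k in range(3))
            pairs.append((c, transform(c)))
    return pairs
```

For the scene above, `control_points((0,0,0), 10, ...)` with
`control_density 10` would correspond to sampling on the order of
ten rays per angular direction.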
Of course this doesn't work with too few control points,
too large transformations, poorly chosen center points,
or very irregular objects. But perhaps it works in most
normal cases with a smart user.
What do you think? Should I try to implement it?
/ Mathias Broxvall