Slime wrote:
>>Well I for one am damned impressed!
>
>
> Thank you! =)
>
>
>>You said the UV projection had to
>>be established first. How did you do that?
>
>
> The mesh and UV coordinates were both made with Wings 3D.
>
>
>>The reason I ask is that I
>>think the polys would have to be contiguous in some way if the next
>>stage in the workflow was to take the template and add more stuff, say
>>hand-drawn wrinkles, for instance. That, anyway, is how I imagine this
>>technique would play a role in a workflow. I assume by your description
>>that that could be done. How you get the original UVs is arbitrary,
>>right?
>
> Yup. You can generate the UV coordinates any way you want, just as long as
> you can get them into your POV-Ray code.
>
> - Slime
> [ http://www.slimeland.com/ ]
>
>
This is really significantly useful. People could use it to interactively
locate procedurally generated bits of texture. But beyond that, it seems
to me that it practically guarantees that the texture will lay flat and
be appropriately scaled, since you are working *back* to the pre-existing
UV projection. This would give us a piece of functionality that
characterizes the high-end integrated apps. It also seems to me that, if
written as a patch, it would be efficacious stand-alone. Once the UV map
is gained, processing could be shifted to a different patch for any
other functionality needed.
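
For anyone wanting to try this, a minimal sketch of the idea Slime
describes might look like the following: a mesh2 whose uv_vectors carry
the UV coordinates exported from Wings 3D, with the painted template
applied via uv_mapping. The geometry and the file name "template.png"
here are made up for illustration, not taken from Slime's actual scene:

```pov
// Hypothetical single-triangle mesh with UV coordinates.
// In practice the vertex_vectors and uv_vectors would come
// from your modeler's export, one UV pair per vertex.
mesh2 {
  vertex_vectors { 3, <0,0,0>, <1,0,0>, <0,1,0> }
  uv_vectors     { 3, <0,0>,   <1,0>,   <0,1>   }
  face_indices   { 1, <0,1,2> }
  texture {
    uv_mapping  // look up the pigment in UV space, not object space
    pigment { image_map { png "template.png" } }
  }
}
```

Because the image is looked up through the same UV coordinates the
template was rendered from, whatever you paint on the template lands
back on the surface flat and at the right scale.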