On 7/23/2025 9:15 AM, Bald Eagle wrote:
> "Bald Eagle" <cre### [at] netscape net> wrote:
>
>> Then you take your mesh, find the bounding box min and max, convert all the 3D
>> coordinates to uv(w) coordinates, and use the Bezier splines to adjust the
>> coordinates of the mesh.
>
> Take a look at:
>
> https://discussions.unity.com/t/deforming-a-mesh-to-bezier-curve/876635
>
> https://www.youtube.com/watch?v=S_JQUDDAsQk
>
> https://www.rose-hulman.edu/~finn/CCLI/Notes/day21.pdf
>
> my guess is that there are several ways that people go about this, and there are
> likely libraries on GitHub and elsewhere.
>
> MY conception of a general way to go about this would be to define a Bezier
> parallelepiped consisting of 64 control points - like 4 stacked Bezier patches
> having 16 control points each.
> The 8 corner points would control your 3D envelope, and all of the inner control
> points would control "stretching".
>
> We could expand existing macros to create extended Bernstein polynomials in i,
> j, and k.
>
> Cycling through all of the mesh vertices and dividing the coordinates by the
> AABB dimensions, you'd get the i, j, k parameters, and could plug those into the
> polynomial to get the adjusted mesh vertex coordinates.
>
> Then you could do all sorts of stuff with a mesh.
>
> Best would be to write something in Javascript or Processing to create a
> modeler, and have it write out an .inc file for the new distorted mesh.
>
>
>
> - BW
>
Interesting stuff. My modeling system defines a modeling grid of points
from which I generate the necessary bicubic patches. The idea is the
points on the modeling grid are on the model, and then it calculates
everything to make the patches to connect those points. The system
requires a lot of arrays, as you can imagine, so the points are all
there ready to be used and abused.
https://joshuarenglish.com/povray/bezdoc/index.html
Once I have all the modeling grids in a group, I can probably transform
the individual points directly in the SDL. I'm not sure how the
Bernstein polynomials would help with this particular transformation;
I'll have to play with applying them.
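As a starting point, the Bernstein sum itself is pretty short. Here's a rough
Python sketch of the free-form deformation you describe - a 4x4x4 control
lattice assumed, names made up - just to see the math before trying it as SDL
macros:

```python
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t) = C(n,i) * t^i * (1-t)^(n-i)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def ffd(vertex, control, box_min, box_max):
    """Deform one vertex through a 4x4x4 Bezier control lattice.

    control[i][j][k] is an (x, y, z) tuple; the vertex is mapped to
    (u, v, w) in [0,1]^3 by dividing out the AABB, then pushed through
    the trivariate Bernstein sum.
    """
    u, v, w = ((vertex[a] - box_min[a]) / (box_max[a] - box_min[a])
               for a in range(3))
    out = [0.0, 0.0, 0.0]
    for i in range(4):
        bi = bernstein(3, i, u)
        for j in range(4):
            bj = bernstein(3, j, v)
            for k in range(4):
                bk = bernstein(3, k, w)
                p = control[i][j][k]
                for a in range(3):
                    out[a] += bi * bj * bk * p[a]
    return tuple(out)
```

With the control points left at their undistorted lattice positions
(i/3, j/3, k/3) this reproduces the input exactly, which makes a handy
sanity check; moving inner control points then does the "stretching".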
Right now I'm going through an incredibly clunky conversion in 2D:

#declare quad = array[4] {<1.95,0,-1>, <3,0,0>, <2.5,0,1.2>, <1.25,0,-0.25>}
// outline the quad
#for(I, 0, 3, 1)
    sphere { quad[I], 0.02 pigment { rgb 0 } }
    cylinder { quad[I], quad[mod((I+1),4)], 0.02 pigment { rgb 0.5 } }
#end
#include "math.inc"

#declare res = <10,10>;
#declare left_vector = quad[3] - quad[0];
#declare right_vector = quad[2] - quad[1];

#declare DaTexture = texture {
    uv_mapping
    pigment { marble }
}
mesh {
    #for(V, 0, res.v-1, 1)
        #declare start  = Interpolate(V,   0, res.v, quad[0], quad[3], 1);
        #declare stop   = Interpolate(V,   0, res.v, quad[1], quad[2], 1);
        #declare nstart = Interpolate(V+1, 0, res.v, quad[0], quad[3], 1);
        #declare nstop  = Interpolate(V+1, 0, res.v, quad[1], quad[2], 1);
        #for(U, 0, res.u-1, 1)
            #declare ll = Interpolate(U,   0, res.u, start,  stop,  1);
            #declare lr = Interpolate(U+1, 0, res.u, start,  stop,  1);
            #declare tl = Interpolate(U,   0, res.u, nstart, nstop, 1);
            #declare tr = Interpolate(U+1, 0, res.u, nstart, nstop, 1);
            triangle { ll, lr, tl
                uv_vectors <U/res.u, V/res.v>, <(U+1)/res.u, V/res.v>, <U/res.u, (V+1)/res.v>
            }
            triangle { lr, tl, tr
                uv_vectors <(U+1)/res.u, V/res.v>, <U/res.u, (V+1)/res.v>, <(U+1)/res.u, (V+1)/res.v>
            }
        #end // for U
    #end // for V
    texture { DaTexture }
}
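For comparison, all that double-Interpolate dancing boils down to a bilinear
map of (U, V) onto the quad. Roughly, in Python (helper names made up):

```python
def lerp(a, b, t):
    """Linear interpolation between points a and b at parameter t."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def quad_point(quad, u, v):
    """Bilinear map of (u, v) in [0,1]^2 onto the quad.

    quad = [p0, p1, p2, p3], with p0 -> p1 running along u
    and p0 -> p3 running along v.
    """
    start = lerp(quad[0], quad[3], v)   # point on the left edge
    stop = lerp(quad[1], quad[2], v)    # point on the right edge
    return lerp(start, stop, u)         # sweep across between them
```

Each triangle vertex in the loop above is just quad_point at the
appropriate (U/res.u, V/res.v) sample.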
Are you suggesting there's an easier (and maybe faster) way to do this
sort of thing? I'll have to wrack my brain on the problem.
Josh