Subject: Re: Mesh2 Tools
From: Bald Eagle
Date: 30 Jun 2022 14:10:00
Message: <web.62bde5dee9de8f101f9dae3025979125@news.povray.org>
"Chris R" <car### [at] comcastnet> wrote:

> > > I'd also
> > > like to work on some transformations that could "wrinkle" the paper in
> > > interesting ways as well.

Certainly you could experiment with perturbing the sheet using some of POV-Ray's
native patterns, like crackle or ... wrinkles.
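
For actually displacing the geometry (rather than just faking it with a surface
normal), you can evaluate a pattern as a plain scalar field via a pattern
function and use it to nudge the vertices before they go into vertex_vectors.
Untested sketch; Fn_Wrinkle, Amount, and Verts are just names I made up:

// Evaluate the built-in wrinkles pattern as a 0..1 scalar field and use it
// to push each flat-sheet vertex up or down along +y.
#declare Fn_Wrinkle = function { pattern { wrinkles scale 0.05 } }
#declare Amount = 0.02;                  // maximum displacement
#declare NX = 50;  #declare NZ = 50;     // grid resolution
#declare Verts = array[NX*NZ];
#for (I, 0, NX-1)
  #for (J, 0, NZ-1)
    #local P = <I/(NX-1), 0, J/(NZ-1)>;  // flat unit-square vertex
    #declare Verts[I*NZ + J] =
      P + y*Amount*(Fn_Wrinkle(P.x, P.y, P.z) - 0.5);
  #end
#end
// Verts[] then feeds the mesh2 vertex_vectors; swap in crackle the same way.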

> I added a primitive "folding" capability to the code.  Right now it really only
> works for one fold at a time in a given region.  I need to play around with it
> so you can fold already folded regions.
>
> As you can see, the smaller the bending radius, the more points you need in the
> mesh2 grid to avoid jagged edges along the fold.  I need to explore a more
> efficient representation of the mesh and some code for sub-dividing the faces in
> just those regions to avoid having a million faces in a single sheet of paper.

I see where you're going with this, and it is indeed challenging.
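
Just to make sure I'm picturing it correctly: the fold mapping I have in mind is
something like the macro below, where anything past the crease gets wrapped onto
a cylinder of the bend radius, and anything past the half-turn lands on the flat
flap lying on top.  (Untested sketch of the idea, not your code; FoldPoint and
its argument names are mine.)

// Fold a flat sheet (in the x-z plane) back over itself along the line
// x = X0, with bend radius R.
#macro FoldPoint(P, X0, R)
  #if (P.x <= X0)
    P                                       // left of the crease: untouched
  #else
    #local S = P.x - X0;                    // arc length into the bend
    #if (S <= pi*R)
      // on the cylindrical part of the fold
      <X0 + R*sin(S/R), R*(1 - cos(S/R)), P.z>
    #else
      // past the bend: flat flap lying 2*R above the sheet
      <X0 - (S - pi*R), 2*R, P.z>
    #end
  #end
#end
// e.g.  #local Q = FoldPoint(<0.7, 0, 0.3>, 0.5, 0.05);

The smaller R gets, the more vertices you need crossing that [X0, X0 + pi*R]
strip, which is exactly the subdivision problem you're describing.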

Perhaps you could define your sheet in terms of Bezier patches.
Then you could determine which patches are crossed by folds and subdivide only
those, to give more localized control if you have multiple folds.
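
Something like this for the patches, just to show the layout (control points
placed by hand here):

// One Bezier patch covering a unit square of the sheet.  A whole sheet would
// be a grid of these, with neighbouring patches sharing their edge control
// points so the surface stays continuous.
bicubic_patch {
  type 1
  flatness 0.001
  u_steps 4  v_steps 4
  <0,0,0>,   <1/3,0,0>,   <2/3,0,0>,   <1,0,0>,
  <0,0,1/3>, <1/3,0,1/3>, <2/3,0,1/3>, <1,0,1/3>,
  <0,0,2/3>, <1/3,0,2/3>, <2/3,0,2/3>, <1,0,2/3>,
  <0,0,1>,   <1/3,0,1>,   <2/3,0,1>,   <1,0,1>
  pigment { rgb 1 }
}

A fold crossing a patch then just means splitting that patch's control net
(de Casteljau subdivision) instead of densifying the whole sheet.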

I think some of this may be headed straight for NURBS territory.

I also recall seeing a lot of interesting computer graphics origami work that I
just didn't have the time or energy to dive into.  That might be the way to go
since the heavy lifting has already been done.

https://langorigami.com/article/computational-origami/

https://graphics.stanford.edu/~niloy/research/folding/paper_docs/folding_sig_08.pdf

https://news.mit.edu/2017/algorithm-origami-patterns-any-3-D-structure-0622
https://origami.c.u-tokyo.ac.jp/~tachi/software/

http://masc.cs.gmu.edu/wiki/Origami

https://dam-prod.media.mit.edu/x/files/thesis/2013/ysterman-ms.pdf

... and it goes on and on

