POV-Ray : Newsgroups : povray.general : Procedural Methods for Two-Dimensional Texture Generation
  Procedural Methods for Two-Dimensional Texture Generation (Message 1 to 4 of 4)  
From: yesbird
Subject: Procedural Methods for Two-Dimensional Texture Generation
Date: 13 Apr 2026 10:46:27
Message: <69dd01c3@news.povray.org>
Greetings !

I am now working on an extension to this digital painting software:
https://www.escapemotions.com/products/rebelle/about

and while looking for interesting ideas such as pattern generation, I found
this paper (attached), which might be a source for additional POV-Ray
modules.
--
YB



Attachments:
Download 'sensors-20-01135-v2.pdf.zip' (3291 KB)

From: Bald Eagle
Subject: Re: Procedural Methods for Two-Dimensional Texture Generation
Date: 13 Apr 2026 12:45:00
Message: <web.69dd1cf5ada8a5a776d02faa25979125@news.povray.org>
yesbird wrote:

> and while looking for interesting ideas like pattern generation, I found
> this paper (attached), that might be a source for additional POV-Ray
> modules.

I've come across papers like this before, and have suggested doing similar
studies and experiments.

You'll probably enjoy:

https://news.povray.org/povray.binaries.images/thread/%3C5435dc14@news.povray.org%3E/

https://news.povray.org/povray.binaries.animations/thread/%3C5437a2da%241%40news.povray.org%3E/

One approach might be to use combinatorics or L-systems to iteratively
construct non-trivial patterns from the available keywords in the hierarchy.
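A tiny sketch of the rewriting idea, with made-up rules (the symbols below are placeholders for illustration, not actual POV-Ray keywords):

```python
# L-system sketch: each symbol stands for a pattern keyword or modifier,
# and repeated rewriting builds up nested combinations.  The rules are
# invented for illustration only.
RULES = {
    "A": "A[B]",   # e.g. nest one pattern inside a warp
    "B": "BC",     # e.g. layer two patterns
    "C": "A",
}

def expand(axiom, rules, depth):
    """Apply the rewrite rules `depth` times to the axiom string."""
    s = axiom
    for _ in range(depth):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

print(expand("A", RULES, 3))  # -> A[B][BC][BCA]
```

Each expanded string could then be mapped back onto an actual texture {} block, one per frame.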

Set everything up, run an animation of 1-10k frames, and see what pops out.
Have POV-Ray export the list of settings/features/parameters as a .txt or
.inc file describing the exact texture it produces for that frame.

Then it would be easy to scroll through a list of thumbnail images and pick out
the ones that are interesting enough for further study and improvement.

Time-dependent patterns like reaction-diffusion hold really rich promise, but
each run would have to be evaluated individually to see at what time-point a
"good" pattern emerges.  Such systems are also quite sensitive to initial values.
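Gray-Scott is the usual toy model here; a minimal pure-Python 1-D sketch (the feed/kill values are just one spot-forming choice, nothing canonical):

```python
# Minimal 1-D Gray-Scott reaction-diffusion with a periodic boundary.
# Tiny changes to feed/kill or to the seed region give very different
# patterns, which is exactly the sensitivity problem mentioned above.
N = 64
Du, Dv, feed, kill, dt = 0.16, 0.08, 0.035, 0.065, 1.0

def laplacian(a, i):
    return a[(i - 1) % N] + a[(i + 1) % N] - 2.0 * a[i]

def step(u, v):
    un, vn = u[:], v[:]
    for i in range(N):
        uvv = u[i] * v[i] * v[i]
        un[i] = u[i] + dt * (Du * laplacian(u, i) - uvv + feed * (1.0 - u[i]))
        vn[i] = v[i] + dt * (Dv * laplacian(v, i) + uvv - (feed + kill) * v[i])
    return un, vn

# Uniform steady state (u=1, v=0) plus a small seed in the middle.
u, v = [1.0] * N, [0.0] * N
for i in range(N // 2 - 2, N // 2 + 2):
    u[i], v[i] = 0.5, 0.25

for t in range(200):
    u, v = step(u, v)
```

Sampling u (or v) at a chosen time-point gives the 1-D "slice" of a pattern; the 2-D version is the same update with a 5-point Laplacian.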

An archaeological approach might yield interesting results - have a bot scan for
pigments and textures and render 2D and 3D representations of them.  There's 30
years of stuff out there, especially the IRTC.

- BE


From: yesbird
Subject: Re: Procedural Methods for Two-Dimensional Texture Generation
Date: 13 Apr 2026 15:13:21
Message: <69dd4051$1@news.povray.org>
On 13/04/2026 19:42, Bald Eagle wrote:
> I've come across papers like this before, and have suggested doing similar
> studies and experiments.
> 
> You'll probably enjoy:
> 
> https://news.povray.org/povray.binaries.images/thread/%3C5435dc14@news.povray.org%3E/
> 
> https://news.povray.org/povray.binaries.animations/thread/%3C5437a2da%241%40news.povray.org%3E/
> 
> One approach might be to use combinatorics or L-systems to iteratively
> construct non-trivial patterns from the available keywords in the hierarchy.
> ...

Thanks for these links, Bill - the trick is interesting. I am already
using L-systems in my extension, and I suspect that cellular automata
could give good results as well. In the Rebelle API I am limited to
vector operations of brush movement (press, move, release):
https://www.escapemotions.com/products/rebelle/motionio_doc/reference/json_events_reference/

and all raster-based diffusion effects can only be applied afterwards.
This is why I prefer contour-based patterns in my
investigations.
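For what it's worth, flattening a generated contour into such an event stream is straightforward; the field names below ("type", "x", "y") are placeholders, the real schema is in the JSON events reference above:

```python
import json

# Sketch: flatten a polyline (e.g. an L-system turtle path) into
# press/move/release brush events.  Event and field names are assumed,
# not taken from the Rebelle documentation.
def stroke_events(points):
    if not points:
        return []
    events = [{"type": "press", "x": points[0][0], "y": points[0][1]}]
    events += [{"type": "move", "x": x, "y": y} for x, y in points[1:]]
    events.append({"type": "release",
                   "x": points[-1][0], "y": points[-1][1]})
    return events

path = [(0, 0), (10, 0), (10, 10)]
print(json.dumps(stroke_events(path), indent=2))
```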
--
YB


From: Bald Eagle
Subject: Re: Procedural Methods for Two-Dimensional Texture Generation
Date: 13 Apr 2026 16:00:00
Message: <web.69dd4b1cada8a5a79839f6ab25979125@news.povray.org>
yesbird wrote:

> In Rebelle API I am limited to vector
> operations of brush movement (press, move, release):
>
https://www.escapemotions.com/products/rebelle/motionio_doc/reference/json_events_reference/

Hmm.
Not to say any of this is directly applicable, but maybe check out "Mathematical
marbling"

https://people.csail.mit.edu/jaffer/Marbling/

Especially:  https://people.csail.mit.edu/jaffer/Marbling/stroke.pdf
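The basic "ink drop" map from that literature is compact enough to sketch (note this is the drop deformation; the line-stroke map in stroke.pdf is a different, direction-dependent formula):

```python
import math

# Mathematical-marbling ink drop: dropping a circle of radius r at centre c
# pushes every existing point p radially outward, so a point at distance d
# from c ends up at distance sqrt(d^2 + r^2).  Areas outside the new drop
# are preserved, which is what keeps earlier contours intact.
def drop(p, c, r):
    dx, dy = p[0] - c[0], p[1] - c[1]
    d2 = dx * dx + dy * dy
    if d2 == 0.0:
        return p  # the drop centre itself; leave it in place
    s = math.sqrt(1.0 + r * r / d2)
    return (c[0] + dx * s, c[1] + dy * s)
```

For example, a point at distance 3 from a radius-4 drop moves out to distance 5; chaining drops and tine strokes gives the classic marbled contours, and the whole thing is vector-friendly.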

And skim through Daniel Shiffman's _The Coding Train_ YouTube channel.
He has a lot of interactive, mouse-position-driven events in his code, and other
people submit derivative works on his website.

Maybe you could have a collection of convolution kernels that you could drag
across an image with the mouse:
- barrel/pincushion optical effect
- color discretization
- etc.
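Something like a "kernel brush" is easy to prototype; a pure-Python sketch (the sharpen kernel is just one example, and the radius test keeps the effect local to the cursor):

```python
# Apply a 3x3 convolution only inside a small radius around the cursor,
# leaving the rest of the image untouched.
SHARPEN = [[ 0, -1,  0],
           [-1,  5, -1],
           [ 0, -1,  0]]

def brush_convolve(img, kernel, cx, cy, radius):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # copy; only brushed pixels change
    for y in range(max(1, cy - radius), min(h - 1, cy + radius + 1)):
        for x in range(max(1, cx - radius), min(w - 1, cx + radius + 1)):
            if (x - cx) ** 2 + (y - cy) ** 2 > radius * radius:
                continue  # outside the circular brush footprint
            acc = 0.0
            for ky in (-1, 0, 1):
                for kx in (-1, 0, 1):
                    acc += kernel[ky + 1][kx + 1] * img[y + ky][x + kx]
            out[y][x] = acc
    return out
```

Swapping in blur, emboss, or edge-detect kernels would give a whole family of draggable effects from the same loop.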

- BW


Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.