Subject: Re: Normal-Interpolation (MPG1, 162kb)
From: Tim Nikias
Date: 23 Jan 2005 19:13:53
Message: <41f43dc1$1@news.povray.org>
Hi Rune!

No, you've understood it quite well. If the object/hair were unaffected by
gravity, there would be no point in distributing several of them over the
sphere. So why am I not simply using a vertical alignment, instead of going
for the entire sphere?

To be honest, my thoughts went a different route. I always try to maximize
flexibility. So, if I decided to add a (fairly easy/crude) simulation of wind
to the hair, hairs at the back would have to bend differently than hairs at
the front. It's not like I can create ultra-realistic, dense hair with this
(obviously, I think: the hair is, after all, simulated/modelled on a sphere
but then placed across an entire object...), but still, I always try to think
on a bigger scale, not just for the current image but for future purposes as
well.

As for the lengths: they will be set in intervals, i.e. you specify a minimum
and maximum length plus a number of steps, and the macros will create
[full-sphere-hairs]*[steps] hairs and simply pick from the different sets.
Along with the wind, I'd say that pre-parsing a few thousand hairs and being
able to reuse them across the entire scene is well worth the effort,
considering that you'd otherwise parse just as many hairs for a single object.
I also want to write the hair-sets to disc, so that you run a simple file once
to save all the data, and the final scene only has to parse the picking and
alignment. Maybe I'll implement output for that as well.
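
Just to illustrate the idea, here's a simplified sketch of how such a
pre-parsed set could be built. This is not the actual macro code: Make_Hair()
is only a placeholder for whatever builds one hair mesh for a given growth
direction and length, and I'm using a plain longitude/latitude grid here,
although the set could just as well be laid out over the faces of a cube, as
in the test further below.

// Simplified sketch (not the actual macros): pre-parse one hair per
// direction cell and length step, so the final scene only picks and places.
// Make_Hair(Dir, Len) is a placeholder for whatever builds a single hair.
#declare USize     = 36;    // longitude cells
#declare VSize     = 18;    // latitude cells
#declare Steps     = 5;     // number of length intervals
#declare MinLength = 0.2;
#declare MaxLength = 1.0;

#declare HairSet = array[USize][VSize][Steps]

#local U = 0;
#while (U < USize)
  #local V = 0;
  #while (V < VSize)
    // centre of this direction cell on the unit sphere
    #local Lon = -pi + 2*pi*(U + 0.5)/USize;
    #local Lat = pi*(V + 0.5)/VSize;
    #local Dir = <cos(Lon)*sin(Lat), cos(Lat), sin(Lon)*sin(Lat)>;
    #local S = 0;
    #while (S < Steps)
      #local Len = MinLength + (MaxLength - MinLength)*S/max(1, Steps - 1);
      #declare HairSet[U][V][S] = object { Make_Hair(Dir, Len) }
      #local S = S + 1;
    #end
    #local V = V + 1;
  #end
  #local U = U + 1;
#end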

Additionally, the mapping technique (spherical to 2D array) might be reused
for completely different algorithms that I'm not even aware of yet. ;-)
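
In case that sounds more mysterious than it is: all the mapping does is turn
a direction into longitude/latitude and those into the two array indices.
A sketch (matching the grid layout above, not necessarily the final
implementation):

// Simplified sketch of the spherical-to-2D-array mapping: a unit direction
// becomes longitude/latitude, which in turn becomes a pair of array indices.
#macro Dir_To_Index(Norm, USize, VSize)
  #local N   = vnormalize(Norm);
  #local Lon = atan2(N.z, N.x);   // -pi..pi
  #local Lat = acos(N.y);         //  0..pi
  #local U   = floor((Lon + pi)/(2*pi)*USize);
  #local V   = floor(Lat/pi*VSize);
  // clamp in case Lon or Lat hits the upper bound exactly
  <min(U, USize - 1), min(V, VSize - 1)>
#end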

I've also made a test. Using my Surcoat macros, I sampled my "furry fellow"
(posted in p.b.i) with 30,000 points on the surface. If I create a set based
on a cubic array with a side length of 14, I get 1094 hairs (I only take the
outer surface of the cube) and reuse them. That requires 74 MB peak memory
for the final image. If I create 30,000 individual hairs instead, I get
254 MB peak memory. On top of that, the hairs used in both tests were rather
low-poly (around 60 triangles each, I think), so you can see where this is
going... Also note that when I used only 74 hairs, peak memory dropped to
60 MB, but some interpolation artifacts appear (the hairs look layered due to
the larger differences between angles, and there are obvious gaps). So with a
little more memory I got an image that looked like the 30,000-individual-hairs
version, but used only about a third of the memory.
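
The picking and alignment that remains for the final scene is then basically
just a loop like the following. Again only a sketch: PointCount, SamplePos and
SampleNorm stand for whatever the sampling pass delivers, and HairSet and
Dir_To_Index are the bits sketched above. Each set hair is already shaped and
oriented for its direction cell, so placing it is a mere translate, which is
exactly where the angular gaps come from when the set is too coarse.

// Sketch of the reuse step: every sampled surface point picks the nearest
// precomputed hair and a random length step, then simply moves it into place.
#declare R = seed(4711);
#local P = 0;
#while (P < PointCount)
  #local Pos  = SamplePos[P];                      // sampled surface point
  #local Norm = SampleNorm[P];                     // surface normal there
  #local Idx  = Dir_To_Index(Norm, USize, VSize);
  #local S    = min(floor(rand(R)*Steps), Steps - 1);  // random length step
  object {
    HairSet[Idx.u][Idx.v][S]
    translate Pos
  }
  #local P = P + 1;
#end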

Once the hair-macro gets refined (currently a hair is only a 2D strip; I want
to create conic hairs, and I want to script that simple wind), things will
look even better. I still have to figure out a proper and easy way to get
some varied texturing, though.

Regards,
Tim

-- 
"Tim Nikias v2.0"
Homepage: <http://www.nolights.de>

