  Re: No, I actually *don't* know how I made the chin to that....  
From: Hugo Asm
Date: 4 Jul 2004 13:48:09
Message: <40e842d9$1@news.povray.org>
Without personal experience with character animation, I'm hardly qualified to
answer this question. On the other hand, the following seems logical to me:

Mesh-based character animation can be done in one of two ways: 1) Different
poses (especially facial expressions) are carefully modelled by hand. The
result is a number of "key frames" that can be morphed together to obtain
whatever facial expression is needed. 2) Different poses (especially of the
body) are achieved by mathematically moving large sections of the mesh in
various directions. This is controlled with "bones".

Now: is there anything that prevents POV-Ray from doing this? I've tried the
morphing technique in POV and it works. It's quite simple. Regarding bone
animation I don't have experience, but I can't see any particular problems.
When both the mesh and the bones are imported into POV-Ray, it's a matter of
moving the correct vertices for each bone. These vertices can either be chosen
by distance (vertices close to the bone are affected the most, with the effect
falling off the further away we get) or by a "body part" index, where every
vertex gets additional array dimensions stating how much it is affected by the
movements of each bone.
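
As a very rough sketch of the distance-based idea, something like the
following could bend the region of a mesh near one bone. The bone pivot,
angle, radius and vertex list are invented values just to show the principle
(it is certainly not how 3dsMAX does it internally):

#declare Verts   = array[3] { <0,1,0>, <0,2,0>, <0,3,0> }
#declare BonePos = <0,1,0>;   // pivot point of the bone
#declare BoneRot = 30;        // degrees to bend the affected region
#declare Radius  = 2.0;       // influence radius around the bone

#declare N = dimension_size(Verts, 1);
#declare I = 0;
#while (I < N)
  // weight is 1 at the bone and fades to 0 at a distance of Radius
  #declare W = max(0, 1 - vlength(Verts[I] - BonePos)/Radius);
  // rotate the vertex around the pivot (about the z axis) by the weighted angle
  #declare Verts[I] = vaxis_rotate(Verts[I] - BonePos, z, W*BoneRot) + BonePos;
  #declare I = I + 1;
#end

The "body part" index approach would simply replace the distance-based weight
W with a weight stored per vertex in an extra array dimension.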

I know I should take a closer look at bone animation in 3dsMAX and learn how
it works. But overall, the principles don't look too complex to me. The
complex part is creating the different poses, and that happens outside of POV.

Regards,
Hugo

