On 28.11.2013 19:15, Mr wrote:
> Thank you!! This is great! And you really have a good sense of priority to spot
> the most wanted features for POV and add them.
I think it's more kind of "spot the features /I/ want most for pov..."
;-) But I'm glad they happen to be wanted among the community, too.
> would it do deformation motion
> blur, e.g. from a POV sphere animated with a non-uniform scale, or if a mesh2
> object's points moved?
What's currently in there is pretty basic in /how/ it does motion blur -
which, thankfully, makes it ultimately flexible in /what/ it can do.
In essence, whatever you can imagine doing by rendering an animation
and then averaging all the images, you can do with this feature in a
single render pass.
The idea behind it is simple: Think of the render as a photograph with a
certain non-zero exposure time. Now if you have any moving object in the
scene, you create N copies of it with slightly different positions,
orientations, shapes, textures or whatever you can imagine, and assign
each copy an individual time interval within the exposure time.
UberPOV will then fire rays into the scene that are characterized not
only by their trajectory in space but also by a point in time; any
object whose time interval does not contain that point is ignored for
that particular ray and for all the secondary rays (shadow rays,
reflection rays etc.) it spawns.
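The "render an animation and average the images" equivalence can be sketched in a few lines of Python (purely illustrative - this is not UberPOV syntax, and the object names and intervals are made up): N snapshot copies of a moving object each claim a sub-interval of the exposure time, every ray carries a random time stamp, and only the copy whose interval contains that stamp is tested for intersection. Averaging many such rays per pixel yields the motion-blurred result in one pass.

```python
import random

# One moving "object": N snapshot copies, each valid on a sub-interval
# of the exposure time [0, 1).  A trivial 1-D stand-in for a scene.
N = 4
copies = [
    {"interval": (i / N, (i + 1) / N), "position": i * 0.25}
    for i in range(N)
]

def trace(ray_x, t):
    """Shade one time-stamped 'ray': only the copy whose interval
    contains t is considered; all other copies are ignored."""
    for c in copies:
        lo, hi = c["interval"]
        if lo <= t < hi:
            # Hit test against this snapshot only.
            return 1.0 if abs(ray_x - c["position"]) < 0.1 else 0.0
    return 0.0

def render_pixel(ray_x, samples=10000):
    """Average many rays with random time stamps -> blurred pixel."""
    rng = random.Random(42)
    return sum(trace(ray_x, rng.random()) for _ in range(samples)) / samples
```

A point that is covered by exactly one of the four snapshots is lit for roughly a quarter of the exposure time, so `render_pixel(0.0)` comes out near 0.25 - exactly what averaging four separate renders would give.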
In the future, this approach will be complemented by features catering
to more specific use cases, such as an object moving along a linear
trajectory or rotating about a fixed axis. Those will then be both
easier to set up (probably just specifying a translation or rotation)
and less memory-consuming (only one instance of the object will be
needed).
> will this UberPOV branch merge some of its patches into the POV trunk
> quicker than MegaPOV did, or will it keep diverging for long?
That's difficult to tell.
On one hand, from what I see UberPOV is in a similar situation to the
one MegaPOV was in: aside from being a collection of patches, MegaPOV
was also intended to be a testbed for new features, and AFAIU it was
also maintained by people closely affiliated with the POV-Ray dev team -
yet its patches were slow to be merged, so there's no reason to assume
that UberPOV should fare any better: at the end of the day it all boils
down to whether the POV-Ray dev team as a whole decides to integrate a
given feature into POV-Ray proper.
One big obstacle for UberPOV's main features is that they are all
stochastic, i.e. random in nature, which does not go well with
animations because it can create flickering (at least that's the lore -
I'm already pondering a way around that); traditionally, POV-Ray has
always been reluctant to incorporate such features.
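The flicker concern can be demonstrated with a toy Monte-Carlo estimate in Python (again purely illustrative, not tied to any actual POV-Ray internals): if each frame of an animation draws its random samples independently, a static pixel's value jitters from frame to frame, whereas re-using the same sample pattern every frame keeps it rock-steady - one conceivable way around the flicker.

```python
import random

def pixel_estimate(seed, samples=64):
    """Monte-Carlo estimate of a pixel value (here: the area of a
    quarter disc, true value pi/4 ~ 0.785), from `samples` random
    points drawn with the given seed."""
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 < 1.0
               for _ in range(samples))
    return hits / samples

# A fresh seed per frame: the estimate varies between frames, which
# shows up as flicker on an otherwise static pixel.
jittery = [pixel_estimate(seed=frame) for frame in range(10)]

# The same seed (i.e. the same sample pattern) every frame: the noise
# is still there, but it is frozen, so the pixel does not flicker.
steady = [pixel_estimate(seed=0) for frame in range(10)]
```

Freezing the pattern trades temporally varying noise for a fixed noise pattern; it illustrates the trade-off, not whatever workaround UberPOV might eventually use.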
On the other hand there have been changes in the dev team recently,
which might affect the future course of POV-Ray; if so, I would expect a
less dogmatic and more pragmatic approach to new features. But only time
will tell.