Mark James Lewin wrote:
> I haven't tried it yet (hopefully will in a few days),
> but assuming
>
> V(new) = V(old) + Acceleration*Delta_Time
> P(new) = P(old) + V(new)*Delta_Time
>
> I cannot see too much trouble with negative Delta_Time
> values so long as the Acceleration term is constant.
Well, to have it work accurately, you have to slow down the rate of the
calculation frames themselves, not the Delta_Time. (I think.)
For a small simulation, simply changing the Delta_Time may seem to work
perfectly, but imagine having a simulation running for 10 seconds, then
going 10 seconds backwards. (Or if you need a more extreme example, make
it one minute both ways.)
The small precision errors can accumulate significantly in such a situation,
so you have no guarantee that the particles will end up at exactly the
same positions where they started. And if they can end up a little off,
then they can also end up completely off in a more complex case.
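The drift is easy to demonstrate. Here is a minimal sketch using the update rule quoted above, run forward for 10 simulated seconds and then "backward" by negating Delta_Time. The acceleration, step size, and step count are illustrative values, not taken from the original post:

```python
# Forward-then-backward run of the quoted integrator:
#   V(new) = V(old) + Acceleration*Delta_Time
#   P(new) = P(old) + V(new)*Delta_Time

def step(p, v, a, dt):
    v = v + a * dt          # V(new) = V(old) + Acceleration*Delta_Time
    p = p + v * dt          # P(new) = P(old) + V(new)*Delta_Time
    return p, v

a, dt, steps = -9.8, 0.01, 1000   # assumed values: 10 seconds each way
p, v = 0.0, 0.0

for _ in range(steps):            # forward 10 seconds
    p, v = step(p, v, a, dt)
for _ in range(steps):            # "backward" by negating Delta_Time
    p, v = step(p, v, a, -dt)

print(p, v)   # p has drifted away from 0.0; v returns (nearly) to 0.0
```

Note that part of the drift here is systematic, since updating V before P isn't exactly time-symmetric when dt is negated, and floating-point rounding adds on top of that. Either way, the particle does not return exactly to where it began.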
> As for saving particle data for each frame
> I can typically save data for 1500-2500 particles
> before the data files get larger than a 320X240
> frame saved in an uncompressed format. I don't
> see this as a major problem considering hard disk
> sizes nowadays.
2500 particles for how many seconds and with how many steps per second?
And how much data do you have per particle per step?
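Those questions matter because the comparison only pins down a total budget, not a per-step one. A back-of-envelope check, assuming each particle stores just a position as three 4-byte floats (the post doesn't say what is actually stored):

```python
# Rough budget: how many particle-steps fit in one uncompressed frame?
frame_bytes = 320 * 240 * 3        # one 24-bit 320x240 frame
bytes_per_particle = 3 * 4         # x, y, z as 4-byte floats (assumption)

particle_steps = frame_bytes // bytes_per_particle
print(frame_bytes)                 # 230400 bytes per frame
print(particle_steps)              # 19200 particle-steps per frame's worth

# With 2500 particles, that covers only a handful of steps:
print(particle_steps / 2500)       # 7.68 steps per frame-equivalent
```

So under these assumptions, 2500 particles exhaust one frame's worth of storage after fewer than eight simulation steps, which is why the steps-per-second and bytes-per-particle figures decide whether the comparison holds up.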
Rune
--
3D images and anims, include files, tutorials and more:
rune|vision: http://runevision.com (updated Oct 19)
POV-Ray Ring: http://webring.povray.co.uk