POV-Ray : Newsgroups : povray.binaries.animations : Particle matrix effect with I/O (277kb mpg) : Re: Particle matrix effect with I/O (277kb mpg)
  Re: Particle matrix effect with I/O (277kb mpg)  
From: Mark James Lewin
Date: 3 Nov 2002 22:32:06
Message: <3DC5EA2F.9010905@yahoo.com.au>
Rune wrote:
> Mark James Lewin wrote:
> 
>>I haven't tried it yet (hopefully will in a few days),
>>but assuming
>>
>>V(new) = V(old) + Acceleration*Delta_Time
>>P(new) = P(old) + V(new)*Delta_Time
>>
>>I cannot see too much trouble with negative Delta_Time
>>values so long as the Acceleration term is constant.
> 
> 
> Well, to have it work accurately, you have to slow down the rate of the
> calculation frames themselves, not the Delta_Time. (I think.)

Using Newton's method (which is what I use) requires a large number of 
iterations for accuracy anyway.
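The update rule quoted at the top of the thread is the semi-implicit (symplectic) Euler scheme: update the velocity first, then use the new velocity to update the position. A minimal sketch with illustrative names (not the poster's actual code):

```python
# Semi-implicit Euler update, matching the quoted equations:
#   V(new) = V(old) + Acceleration*Delta_Time
#   P(new) = P(old) + V(new)*Delta_Time
def euler_step(position, velocity, acceleration, delta_time):
    velocity = velocity + acceleration * delta_time
    position = position + velocity * delta_time
    return position, velocity

# A negative Delta_Time steps the state backwards in time:
p, v = euler_step(0.0, 10.0, -9.8, 0.1)    # forward step
p, v = euler_step(p, v, -9.8, -0.1)        # backward step recovers v exactly,
                                           # but p picks up an a*dt**2 error
```

With constant acceleration the velocity round-trips exactly (the update is linear in dt), which is presumably why negative Delta_Time values look unproblematic at first glance.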

For each frame in an animation, I have a variable called Time_Step, which is the 
amount of time that has passed since the last frame in the animation. The value 
of Delta_Time used in each iteration of the calculation depends on both the 
number of iterations specified and the value of Time_Step.
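Assuming the obvious relationship (my guess at the scheme, not the poster's actual code), Delta_Time would be Time_Step divided by the iteration count, with the integration sub-stepped within each frame:

```python
def advance_frame(position, velocity, acceleration, time_step, n_iterations):
    """Advance one animation frame by sub-stepping semi-implicit Euler.

    Delta_Time for each iteration is derived from the per-frame Time_Step
    and the number of iterations, as described in the post.
    """
    delta_time = time_step / n_iterations
    for _ in range(n_iterations):
        velocity = velocity + acceleration * delta_time
        position = position + velocity * delta_time
    return position, velocity
```

More iterations per frame shrink the per-frame integration error without changing how much simulated time each frame covers.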

> For a small simulation, simply changing the Delta_Time may seem to work
> perfectly, but imagine having a simulation running for 10 seconds, then
> going 10 seconds backwards. (Or if you need a more extreme example, make
> it one minute both ways.)
> 
> The small precision errors can accumulate very much in such a situation,
> so you have no guarantee that the particles will end up at exactly the
> same positions where they started. And if they can end up a little off,
> then they can also end up completely off, given a more complex case.

I am well aware of this problem. There is even the concern that you have not 
inadvertently rounded off some important significant figures when saving the 
data between frames.
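Both concerns can be seen in a small experiment: integrate forward, then backward with negated Delta_Time, optionally rounding the saved state between steps to mimic lost significant figures. A sketch (illustrative, not the poster's code):

```python
def step(p, v, a, dt):
    # Semi-implicit Euler, as in the quoted update rule.
    v = v + a * dt
    p = p + v * dt
    return p, v

def round_trip(p, v, a, dt, n_steps, digits=None):
    """Integrate forward n_steps, then backward with -dt.

    If digits is given, round the state after every step to mimic
    losing significant figures when writing the data files.
    """
    for signed_dt in (dt, -dt):
        for _ in range(n_steps):
            p, v = step(p, v, a, signed_dt)
            if digits is not None:
                p, v = round(p, digits), round(v, digits)
    return p, v

# Even at full precision the scheme is not exactly reversible: with
# constant acceleration a, the position returns off by a*dt**2*n_steps.
```

So the particles fail to return exactly to their start positions even before any file-format rounding enters the picture; rounding on save only adds to the drift.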

I do not consider my particle system to be a rigorous physics simulation by any 
means. It was designed to automate some effects that I wanted to animate, and for 
this it has generally succeeded. If I wanted a more accurate system, I would 
probably look into Runge-Kutta methods for solving particle positions.
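For reference, a classical fourth-order Runge-Kutta (RK4) step for a position/velocity pair could look like this (a generic sketch, not part of the poster's system; accel is any function of position and velocity):

```python
def rk4_step(p, v, accel, dt):
    """One classical RK4 step for dp/dt = v, dv/dt = accel(p, v)."""
    k1p, k1v = v, accel(p, v)
    k2p, k2v = v + 0.5*dt*k1v, accel(p + 0.5*dt*k1p, v + 0.5*dt*k1v)
    k3p, k3v = v + 0.5*dt*k2v, accel(p + 0.5*dt*k2p, v + 0.5*dt*k2v)
    k4p, k4v = v + dt*k3v, accel(p + dt*k3p, v + dt*k3v)
    p = p + (dt/6.0)*(k1p + 2*k2p + 2*k3p + k4p)
    v = v + (dt/6.0)*(k1v + 2*k2v + 2*k3v + k4v)
    return p, v
```

The per-step error is O(dt^5) rather than O(dt^2), so far fewer iterations are needed for the same accuracy, at four acceleration evaluations per step.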

>>I can typically save data for 1500-2500 particles
>>before the data files get larger than a 320X240
>>frame saved in an uncompressed format. I don't
>>see this as a major problem considering hard disk
>>sizes nowadays.
> 
> 
> 2500 particles for how many seconds and with how many steps per second?
> And how much data do you have per particle per step?

One 320X240 frame bitmap is ~226 KB. Typically, I save particle position, velocity, 
age, and mass. On average, each particle uses about 90 bytes to store this 
information, so about 2500 particles give a file size equivalent to the bitmap 
for that frame.
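The arithmetic is consistent if the bitmap is 24-bit colour with no alpha (my assumption; the post doesn't state the bit depth):

```python
frame_bytes = 320 * 240 * 3               # assuming 24-bit RGB, uncompressed
bytes_per_particle = 90                   # position, velocity, age, mass
print(frame_bytes / 1024)                 # 225.0 KB per frame
print(frame_bytes // bytes_per_particle)  # 2560 particles fit in that budget
```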

Saving particle data for each iteration is redundant for re-rendering purposes; 
all that is required is the data from the last iteration. If you are doing a 1 
minute animation at 25 fps, you'd generate ~330 MB of bitmaps and ~330 MB of 
particle data. I really don't see this as a problem when (a) hard drive sizes are 
measured in gigabytes now, and (b) if you wanted to keep the particle data for a 
long time, you could create a zip with good compression, because the files are 
just ASCII.
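Assuming ~225 KB per uncompressed 320X240 frame (24-bit colour, my assumption), the one-minute total checks out:

```python
fps = 25
seconds = 60
frame_kb = 320 * 240 * 3 / 1024  # ~225 KB per uncompressed 24-bit frame
total_mb = fps * seconds * frame_kb / 1024
print(round(total_mb))           # 330 MB of bitmaps; the particle data,
                                 # at a similar size per frame, adds
                                 # roughly the same again
```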

MJL


