I'm working on a large animation project.
One of the scenes has a large number of spheres that move and collide.
The positions and velocity vectors are written to a file each frame, to be
read back by the next frame.
After a few hundred frames, the same input file produces a different
constellation of the spheres under Linux than under Windows.
Is there a logical explanation? If one of the two platforms stores one more
decimal place for the intermediate states, I can well believe that the
results end up very different.
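To illustrate the suspicion (this is a hypothetical sketch, not the actual scene code): collision dynamics are typically chaotic, so even a one-digit difference in the precision of the state written to the file each frame gets amplified exponentially. Here the chaotic logistic map stands in for the sphere dynamics, and `round()` stands in for writing/reading the file with limited precision:

```python
def run(steps, decimals):
    """Iterate a chaotic map, truncating the stored state each step
    the way a per-frame file write with limited precision would."""
    x = 0.1
    for _ in range(steps):
        x = 3.99 * x * (1.0 - x)   # chaotic map, stand-in for collision dynamics
        x = round(x, decimals)     # stand-in for the file round-trip
    return x

a = run(300, 15)  # precision hypothetically used on one platform
b = run(300, 16)  # one extra decimal place on the other
print(a, b)       # after a few hundred steps the two states are unrelated
```

Running this shows the two trajectories agreeing for the first few dozen steps and then diverging completely, which matches the "different after a few hundred frames" symptom. The same effect occurs if the two platforms' C libraries merely round the printed decimal string differently in the last digit.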