  Re: Out of memory with many transparent objects  
From: Chris Cason
Date: 20 Oct 2010 01:33:53
Message: <4cbe7f41$1@news.povray.org>
On 19/10/2010 21:03, Christian Froeschlin wrote:
> I seem to be hitting a wall when stacking more than 64
> transparent objects ("Fatal error in renderer: Out of memory").
> The actual memory used by the process was only a few MB. Also,
> it doesn't help to set "max_trace_level" to a larger value,
> but AFAIU passing through transparency without reflection no
> longer adds to the trace level in 3.7 anyway (using beta 39).

As others have noted, this is a case of the fixed vector hitting its limit.
It's an issue for which we have yet to find a better solution. Fundamentally,
we use a fixed-size vector there to force allocation of the object's
storage on the stack. Originally we had a standard vector and the
performance was terrible, both because of allocation strategy (though that
can be mitigated somewhat), and because any heap allocation during rendering
requires a mutex (inside the RTL heap code), which also hurts performance.
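
Roughly, the pattern looks like this (a minimal sketch only; the class and
names are illustrative, not our actual types). Because the element storage
is an inline array, a local instance lives entirely on the stack and
push_back() never touches the heap allocator or its mutex:

  #include <cstddef>
  #include <stdexcept>

  // Illustrative sketch: fixed-capacity container with inline storage,
  // so a local instance is allocated on the stack and push_back() never
  // hits the heap (or the allocator mutex).
  template<typename T, std::size_t Capacity>
  class FixedVector
  {
  public:
      void push_back(const T& value)
      {
          if (count >= Capacity)
              throw std::runtime_error("Out of memory"); // the fixed limit
          data[count++] = value;
      }
      std::size_t size() const { return count; }
      T& operator[](std::size_t i) { return data[i]; }

  private:
      T data[Capacity];        // inline storage, no heap allocation
      std::size_t count = 0;
  };

With the capacity fixed at something like 64 entries, you get exactly the
behaviour reported: fast, allocation-free rendering until a scene stacks
more layers than the array can hold, at which point the only option is to
abort with that error.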

We need to look at possible alternatives to this approach that allow
flexibility while [1] keeping memory usage down, [2] avoiding use of
mutexes, and [3] avoiding the need to copy memory (e.g. if the vector
storage is re-sized). [1] and [3] are generally contradictory. We might
eliminate [3] if we don't use a vector, but I'd have to look at the code to
determine whether it requires contiguous layout of the data and/or whether
indirection in element lookups would hurt performance.
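
One commonly used compromise (again just a sketch, not a statement of what
we will do) is a small-buffer vector: keep inline storage for the common
case and spill to the heap only when it overflows. That preserves [1] and
[2] for typical scenes and pays the copy of [3], plus a mutexed allocation,
only in the rare deep-stacking case:

  #include <cstddef>
  #include <vector>

  // Illustrative sketch: inline buffer for the common case, heap spill
  // only when the inline capacity is exceeded.
  template<typename T, std::size_t InlineCapacity>
  class SmallVector
  {
  public:
      void push_back(const T& value)
      {
          if (spill.empty() && count < InlineCapacity)
          {
              inlineData[count++] = value;   // fast, allocation-free path
              return;
          }
          if (spill.empty())
              spill.assign(inlineData, inlineData + count); // first overflow: one copy
          spill.push_back(value);            // heap path, may take the allocator mutex
      }
      std::size_t size() const { return spill.empty() ? count : spill.size(); }
      T& operator[](std::size_t i) { return spill.empty() ? inlineData[i] : spill[i]; }

  private:
      T inlineData[InlineCapacity];
      std::size_t count = 0;
      std::vector<T> spill;                  // used only after overflow
  };

This still needs a heap allocation (and hence the mutex) once the inline
buffer overflows, so it's a mitigation rather than a complete answer to [2]
and [3].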

-- Chris

