POV-Ray : Newsgroups : povray.beta-test : Out of memory with many transparent objects
Out of memory with many transparent objects (Message 5 to 14 of 14)
From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 11:59:21
Message: <4cbdc059$1@news.povray.org>
On 19/10/2010 17:03, Christian Froeschlin wrote:
> Le_Forgeron wrote:
> 
>> max level 2: same, + each ray gets a reflection
> 
> the test scene is not using any reflection
> 
>> In fact, could you try to reduce max_trace_level (low enough, such as 1
>> or 2), and see whether that removes the memory issue?
> 
> using max_trace_level 1 does not change anything
> 
> 
Good news: I can reproduce (Linux 64 bits, beta39+)

Fatal error in renderer: Out of memory.
Render failed

I have more than enough memory, and the process isn't even allocating much...

Replacing the planes with spheres is OK, as are boxes.

Removing the t component (of rgbt) does not solve anything.

64 planes are OK too, but not 65.



From: Warp
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 12:50:33
Message: <4cbdcc58@news.povray.org>
Christian Froeschlin <chr### [at] chrfrde> wrote:
> I seem to be hitting a wall when stacking more than 64
> transparent objects ("Fatal error in renderer: Out of memory").
> The actual memory used by the process was only a few MB. Also,
> it doesn't help to set "max_trace_level" to a larger value,
> but AFAIU passing through transparency without reflection no
> longer adds to the trace level in 3.7 anyway (using beta 39).

  There's an easy way of testing if the problem happens because of that
feature: Add a tiny bit of refraction to the transparent objects. This
will make it use max_trace_level as usual. (Of course you should probably
increase max_trace_level to at least 65 to get a proper comparison.)

-- 
                                                          - Warp



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 13:10:04
Message: <4cbdd0ec$1@news.povray.org>
On 19/10/2010 18:50, Warp wrote:
> Christian Froeschlin <chr### [at] chrfrde> wrote:
>> I seem to be hitting a wall when stacking more than 64
>> transparent objects ("Fatal error in renderer: Out of memory").
>> The actual memory used by the process was only a few MB. Also,
>> it doesn't help to set "max_trace_level" to a larger value,
>> but AFAIU passing through transparency without reflection no
>> longer adds to the trace level in 3.7 anyway (using beta 39).
> 
>   There's an easy way of testing if the problem happens because of that
> feature: Add a tiny bit of refraction to the transparent objects. This
> will make it use max_trace_level as usual. (Of course you should probably
> increase max_trace_level to at least 65 to get a proper comparison.)
> 

max_trace_level is a red herring.

The issue is an exception (bad_alloc) in Run()
(source/backend/support/task.cpp); 65 planes might be the real trigger.

More on that later.



From: Alain
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 13:43:48
Message: <4cbdd8d4$1@news.povray.org>
On 2010-10-19 11:59, Le_Forgeron wrote:
> On 19/10/2010 17:03, Christian Froeschlin wrote:
>> Le_Forgeron wrote:
>>
>>> max level 2: same, + each ray gets a reflection
>>
>> the test scene is not using any reflection
>>
>>> In fact, could you try to reduce max_trace_level (low enough, such as 1
>>> or 2), and see whether that removes the memory issue?
>>
>> using max_trace_level 1 does not change anything
>>
>>
> Good news: I can reproduce (Linux 64 bits, beta39+)
>
> Fatal error in renderer: Out of memory.
> Render failed
>
> I have more than enough memory, and the process isn't even allocating much...
>
> Replacing the planes with spheres is OK, as are boxes.
>
> Removing the t component (of rgbt) does not solve anything.
>
> 64 planes are OK too, but not 65.
Are the spheres or boxes staggered, or one inside another?

It may be a case of embedding too many objects, and be related to
insideness tests going awry.
It could result in a stack overflow, or some other fixed-size container
or structure reaching its limit.


Alain



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 14:25:23
Message: <4cbde293$1@news.povray.org>
On 19/10/2010 17:59, Le_Forgeron wrote:

> 64 is ok too. but not 65.

Yet another red herring...

My issue seems to be in void TraceTask::NonAdaptiveSupersamplingM1(),
in the SmartBlock pixels...



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 15:15:32
Message: <4cbdee54$1@news.povray.org>
On 19/10/2010 19:10, Le_Forgeron wrote:
> On 19/10/2010 18:50, Warp wrote:
>> Christian Froeschlin <chr### [at] chrfrde> wrote:
>>> I seem to be hitting a wall when stacking more than 64
>>> transparent objects ("Fatal error in renderer: Out of memory").
>>> The actual memory used by the process was only a few MB. Also,
>>> it doesn't help to set "max_trace_level" to a larger value,
>>> but AFAIU passing through transparency without reflection no
>>> longer adds to the trace level in 3.7 anyway (using beta 39).
>>
>>   There's an easy way of testing if the problem happens because of that
>> feature: Add a tiny bit of refraction to the transparent objects. This
>> will make it use max_trace_level as usual. (Of course you should probably
>> increase max_trace_level to at least 65 to get a proper comparison.)
>>
> 
> max_trace_level is a red herring.
> 
> The issue is an exception (bad_alloc) in Run()
> (source/backend/support/task.cpp); 65 planes might be the real trigger.
> 
> More on that later.

Found!

With 65 planes, we are pushing too many interiors.

source/backend/frame.h:

typedef FixedSimpleVector<Interior *, 64> RayInteriorVector;

render/tracepixel.cpp: 981, inside InitRayContainerStateTree();

                        containingInteriors.push_back(object->interior);


Workaround for your scene: flip the z vector (plane { -z, 0 } instead
of plane { z, 0 }).
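To illustrate the mechanism for anyone following along: here is a minimal sketch (hypothetical code, not POV-Ray's actual FixedSimpleVector implementation) of a fixed-capacity container with inline storage. Because there is nowhere to grow, the 65th push_back can only be reported as an allocation failure, which the renderer then surfaces as a generic "Out of memory".

```cpp
#include <cstddef>
#include <new>  // std::bad_alloc

// Sketch of a fixed-capacity vector along the lines of POV-Ray's
// FixedSimpleVector (names and details are illustrative only).
// All storage lives inline, so no heap allocation happens during
// rendering; the price is a hard capacity limit of N elements.
template<typename T, std::size_t N>
class FixedVectorSketch
{
public:
    void push_back(const T& item)
    {
        if (count >= N)
            throw std::bad_alloc(); // surfaces as "Out of memory"
        data[count++] = item;
    }
    std::size_t size() const { return count; }
private:
    T data[N];
    std::size_t count = 0;
};
```

With N = 64, the first 64 insertions succeed and the 65th throws, matching the behaviour seen with 65 stacked interiors.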



From: Christian Froeschlin
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 15:35:31
Message: <4cbdf303@news.povray.org>
Le_Forgeron wrote:

> With 65 planes, we are pushing too many interiors.

Ah, that makes sense. I originally had the problem with the disc object;
AFAIK that also has a "half-infinite" interior.

> Workaround for your scene: flip the z vector (plane { -z, 0 } instead
> of plane { z, 0 }).

It was just a test; the final scene used stacked
tori anyway, and no more than 20 of them ;)

Thanks for looking into it.



From: Slime
Subject: Re: Out of memory with many transparent objects
Date: 20 Oct 2010 00:55:17
Message: <4cbe7635$1@news.povray.org>
> typedef FixedSimpleVector<Interior *, 64>  RayInteriorVector;

Seems like the error message could be better. Right now it implies that 
the user should go out and buy more RAM. Maybe something like "Exceeded 
maximum array size of 64"? Would be great if it used the word "Interior" 
as well, but that's probably not easy/possible given the templating.
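A sketch of what such a diagnostic could look like (hypothetical code, not a proposed patch): even through the template, typeid(T).name() can recover an element-type name for the message, though the name may be compiler-mangled (e.g. GCC reports a mangled form rather than "Interior").

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>
#include <typeinfo>

// Hypothetical fixed-size container that reports a descriptive error
// instead of a bare out-of-memory condition when its capacity is hit.
template<typename T, std::size_t N>
class ReportingFixedVector
{
public:
    void push_back(const T& item)
    {
        if (count >= N)
            throw std::runtime_error(
                std::string("Exceeded fixed vector capacity of ")
                + std::to_string(N) + " for element type "
                + typeid(T).name()); // mangled, but still a clue
        data[count++] = item;
    }
    std::size_t size() const { return count; }
private:
    T data[N];
    std::size_t count = 0;
};
```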

  - Slime



From: Chris Cason
Subject: Re: Out of memory with many transparent objects
Date: 20 Oct 2010 01:25:10
Message: <4cbe7d36@news.povray.org>
On 20/10/2010 15:55, Slime wrote:
>  > typedef FixedSimpleVector<Interior *, 64>  RayInteriorVector;
> 
> Seems like the error message could be better. Right now it implies that 
> the user should go out and buy more RAM. Maybe something like "Exceeded 
> maximum array size of 64"? Would be great if it used the word "Interior" 
> as well, but that's probably not easy/possible given the templating.

Currently the exception thrown in the code indicates an out-of-memory
condition as has been noted; whether or not having a special case for
hitting internal limits like this would clutter things up too much I can't
say yet. I'll have a look and see.

-- Chris



From: Chris Cason
Subject: Re: Out of memory with many transparent objects
Date: 20 Oct 2010 01:33:53
Message: <4cbe7f41$1@news.povray.org>
On 19/10/2010 21:03, Christian Froeschlin wrote:
> I seem to be hitting a wall when stacking more than 64
> transparent objects ("Fatal error in renderer: Out of memory").
> The actual memory used by the process was only a few MB. Also,
> it doesn't help to set "max_trace_level" to a larger value,
> but AFAIU passing through transparency without reflection no
> longer adds to the trace level in 3.7 anyway (using beta 39).

As others have noted, this is a case of the fixed-size vector hitting its
limit. It's an issue for which we have yet to find a better solution.
Fundamentally, we use a fixed-size vector there to force allocation of the
object's storage on the stack. Originally we had a standard vector, and the
performance was terrible, both because of the allocation strategy (though
that can be mitigated somewhat), and because allocating anything from the
heap during rendering requires a mutex (this is within the RTL heap code),
which also hurts performance.

We need to look at possible alternatives to this approach that allow
flexibility while [1] keeping memory usage down, [2] avoiding the use of
mutexes, and [3] avoiding the need to copy memory (e.g. if the vector
storage is resized). [1] and [3] are generally contradictory. We might
eliminate [3] if we don't use a vector, but I'd have to look at the code to
determine whether it requires a contiguous layout of the data and/or
whether indirection in element lookups would hurt performance.
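One commonly used compromise along these lines (my speculation only, not something the POV-Ray team has committed to) is a small-buffer vector: inline storage covers the common case of few interiors, and the container spills to the heap only on overflow. That keeps the hot path mutex-free and copy-free, paying a one-time copy and a possible allocation only for pathologically deep nesting. A sketch, with hypothetical names:

```cpp
#include <cstddef>
#include <vector>

// Small-buffer vector sketch: the first N elements live inline (stack);
// only when capacity N is exceeded do we copy to a heap-backed vector
// and continue growing there.
template<typename T, std::size_t N>
class SmallVectorSketch
{
public:
    void push_back(const T& item)
    {
        if (!spilled && count < N) { inline_data[count++] = item; return; }
        if (!spilled) {
            // First overflow: spill the inline elements to the heap.
            heap_data.assign(inline_data, inline_data + count);
            spilled = true;
        }
        heap_data.push_back(item);
    }
    std::size_t size() const { return spilled ? heap_data.size() : count; }
    const T& operator[](std::size_t i) const
    { return spilled ? heap_data[i] : inline_data[i]; }
private:
    T inline_data[N];
    std::size_t count = 0;
    bool spilled = false;
    std::vector<T> heap_data;
};
```

Note that the spill copy is exactly the cost of point [3], but it is incurred at most once per ray and only in scenes that would otherwise have failed outright.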

-- Chris




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.