POV-Ray : Newsgroups : povray.beta-test : Out of memory with many transparent objects
  Out of memory with many transparent objects (Message 1 to 10 of 14)
From: Christian Froeschlin
Subject: Out of memory with many transparent objects
Date: 19 Oct 2010 06:03:29
Message: <4cbd6cf1$1@news.povray.org>
I seem to be hitting a wall when stacking more than 64
transparent objects ("Fatal error in renderer: Out of memory").
The actual memory used by the process was only a few MB. Also,
it doesn't help to set "max_trace_level" to a larger value,
but AFAIU passing through transparency without reflection no
longer adds to the trace level in 3.7 anyway (using beta 39).

#declare SHEET = plane
{
   z, 0
   pigment {color rgbt 1}
}

#declare NUM = 65; // 64 seems ok
#declare I = 0;
#while (I < NUM)
   object {SHEET translate z}
   #declare I = I + 1;
#end

In case you're wondering, I was trying to simulate
furry objects by stacking thinly stranded textures ;)



From: Christian Froeschlin
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 06:10:50
Message: <4cbd6eaa@news.povray.org>
Christian Froeschlin wrote:

>   object {SHEET translate z}

I made an error constructing the minimalist
test scene; this was meant to be

   object {SHEET translate I*z}

It doesn't change the effect, however.



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 06:50:39
Message: <4cbd77ff$1@news.povray.org>
On 19/10/2010 12:03, Christian Froeschlin wrote:
> but AFAIU passing through transparency without reflection no
> longer adds to the trace level in 3.7 anyway (using beta 39).

And I wonder whether that is a good thing (and how it is implemented).
If we are reusing the ray and its data, it might be safe, but I'm afraid
of unbounded allocation of rays and intersections before they get purged.
Hence, perhaps, the issue.

(If there were only transmission it could be linear, but since there
might be a reflection, at a depth increased by one, which generates many
more rays on the reverse path...)

Just a hypothesis:

max level 1: 65 planes... 1 original ray, 65 transmitted rays, no reflections.

max level 2: same, plus each ray gets a reflection. The reflection of the
original ray is lost in space; the reflection of the first transmitted
ray gets one more (transmitted) ray, which is lost and would reflect
again if not cut off by the level limit (first reflection on the second
plane)... The reflection of the penultimate transmitted ray would
generate 64 transmitted rays.

max level 3: add 65*32 rays (due to the reflections occurring), plus a
significant number of transmitted rays for each of them.
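To make the hypothesis concrete, here is a toy ray counter (my own sketch, not POV-Ray code; `N`, `countRays`, and the exact counting rules are all assumptions for illustration). It models one camera ray entering a stack of N transparent, reflective planes, where transmission keeps the current trace level (as in 3.7) while reflection goes back through the stack at level + 1 and is cut off once it would exceed max_trace_level:

```cpp
const int N = 8; // stack size (65 in the scene; kept small here)

// Count the secondary rays spawned by a ray about to hit plane `p`,
// travelling in direction `dir` (+1 deeper into the stack, -1 back out).
// Transmitted rays keep `level`; reflected rays use `level + 1` and are
// suppressed once that would exceed `maxLevel`.
long countRays(int p, int dir, int level, int maxLevel) {
    if (p < 0 || p >= N)
        return 0; // the ray has left the stack
    long rays = 1 + countRays(p + dir, dir, level, maxLevel); // transmitted
    if (level + 1 <= maxLevel)                                // reflected
        rays += 1 + countRays(p - dir, -dir, level + 1, maxLevel);
    return rays;
}
```

For N = 8, countRays(0, +1, 1, maxLevel) gives 8 rays at max level 1 and 44 at max level 2, growing quickly for each further level: the kind of blow-up worried about here, although later posts in the thread show the actual bug is elsewhere.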



In fact, could you try reducing max_trace_level (to something low, such
as 1 or 2) and see whether that removes the memory issue?

(I'm away from a machine where I can test it myself right now.)

-- 
A good Manager will take you
through the forest, no matter what.
A Leader will take time to climb on a
Tree and say 'This is the wrong forest'.



From: Christian Froeschlin
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 11:03:19
Message: <4cbdb337@news.povray.org>
Le_Forgeron wrote:

> max level 2: same, plus each ray gets a reflection

the test scene is not using any reflection

> In fact, could you try reducing max_trace_level (to something low, such
> as 1 or 2) and see whether that removes the memory issue?

using max_trace_level 1 does not change anything



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 11:59:21
Message: <4cbdc059$1@news.povray.org>
On 19/10/2010 17:03, Christian Froeschlin wrote:
> Le_Forgeron wrote:
> 
>> max level 2: same, plus each ray gets a reflection
> 
> the test scene is not using any reflection
> 
>> In fact, could you try reducing max_trace_level (to something low, such
>> as 1 or 2) and see whether that removes the memory issue?
> 
> using max_trace_level 1 does not change anything
> 
> 
Good news: I can reproduce it (Linux 64-bit, beta39+):

Fatal error in renderer: Out of memory.
Render failed

I have more than enough memory, and the process isn't even allocating much...

Replacing the plane with a sphere is OK, as is a box.

Removing the t part of rgbt does not solve anything.

64 is OK too, but not 65.



From: Warp
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 12:50:33
Message: <4cbdcc58@news.povray.org>
Christian Froeschlin <chr### [at] chrfrde> wrote:
> I seem to be hitting a wall when stacking more than 64
> transparent objects ("Fatal error in renderer: Out of memory").
> The actual memory used by the process was only a few MB. Also,
> it doesn't help to set "max_trace_level" to a larger value,
> but AFAIU passing through transparency without reflection no
> longer adds to the trace level in 3.7 anyway (using beta 39).

  There's an easy way of testing if the problem happens because of that
feature: Add a tiny bit of refraction to the transparent objects. This
will make it use max_trace_level as usual. (Of course you should probably
increase max_trace_level to at least 65 to get a proper comparison.)

-- 
                                                          - Warp



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 13:10:04
Message: <4cbdd0ec$1@news.povray.org>
On 19/10/2010 18:50, Warp wrote:
> Christian Froeschlin <chr### [at] chrfrde> wrote:
>> I seem to be hitting a wall when stacking more than 64
>> transparent objects ("Fatal error in renderer: Out of memory").
>> The actual memory used by the process was only a few MB. Also,
>> it doesn't help to set "max_trace_level" to a larger value,
>> but AFAIU passing through transparency without reflection no
>> longer adds to the trace level in 3.7 anyway (using beta 39).
> 
>   There's an easy way of testing if the problem happens because of that
> feature: Add a tiny bit of refraction to the transparent objects. This
> will make it use max_trace_level as usual. (Of course you should probably
> increase max_trace_level to at least 65 to get a proper comparison.)
> 

max_trace_level is a red herring.

The issue is an exception (bad_alloc) in Run()
(source/backend/support/task.cpp); 65 planes might be the real trigger.

More on that later.



From: Alain
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 13:43:48
Message: <4cbdd8d4$1@news.povray.org>
On 2010-10-19 11:59, Le_Forgeron wrote:
> Le 19/10/2010 17:03, Christian Froeschlin nous fit lire :
>> Le_Forgeron wrote:
>>
>>> max level 2: same, plus each ray gets a reflection
>>
>> the test scene is not using any reflection
>>
>>> In fact, could you try reducing max_trace_level (to something low, such
>>> as 1 or 2) and see whether that removes the memory issue?
>>
>> using max_trace_level 1 does not change anything
>>
>>
> Good news: I can reproduce it (Linux 64-bit, beta39+):
>
> Fatal error in renderer: Out of memory.
> Render failed
>
> I have more than enough memory, and the process isn't even allocating much...
>
> Replacing the plane with a sphere is OK, as is a box.
>
> Removing the t part of rgbt does not solve anything.
>
> 64 is OK too, but not 65.
Are the spheres or boxes staggered, or one inside another?

It may be a case of embedding too many objects, related to the
insideness tests going awry.
It could result in a stack overflow, or some other fixed-size container
or structure reaching its limit.


Alain



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 14:25:23
Message: <4cbde293$1@news.povray.org>
On 19/10/2010 17:59, Le_Forgeron wrote:

> 64 is OK too, but not 65.

Yet another red herring...

My issue seems to be in void TraceTask::NonAdaptiveSupersamplingM1(),
in the SmartBlock pixels...



From: Le Forgeron
Subject: Re: Out of memory with many transparent objects
Date: 19 Oct 2010 15:15:32
Message: <4cbdee54$1@news.povray.org>
On 19/10/2010 19:10, Le_Forgeron wrote:
> Le 19/10/2010 18:50, Warp nous fit lire :
>> Christian Froeschlin <chr### [at] chrfrde> wrote:
>>> I seem to be hitting a wall when stacking more than 64
>>> transparent objects ("Fatal error in renderer: Out of memory").
>>> The actual memory used by the process was only a few MB. Also,
>>> it doesn't help to set "max_trace_level" to a larger value,
>>> but AFAIU passing through transparency without reflection no
>>> longer adds to the trace level in 3.7 anyway (using beta 39).
>>
>>   There's an easy way of testing if the problem happens because of that
>> feature: Add a tiny bit of refraction to the transparent objects. This
>> will make it use max_trace_level as usual. (Of course you should probably
>> increase max_trace_level to at least 65 to get a proper comparison.)
>>
> 
> max_trace_level is a red herring.
> 
> The issue is an exception (bad_alloc) in Run()
> (source/backend/support/task.cpp); 65 planes might be the real trigger.
> 
> More on that later.

Found!

With 65 planes, we are pushing too many interiors.

source/backend/frame.h:

typedef FixedSimpleVector<Interior *, 64> RayInteriorVector;

render/tracepixel.cpp, line 981, inside InitRayContainerStateTree():

                        containingInteriors.push_back(object->interior);


Workaround for your scene: flip the z vector (use plane { -z, 0 }
instead of plane { z, 0 }).
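The failure mode can be illustrated with a minimal sketch (my own simplification: FixedVectorSketch is a hypothetical stand-in, not POV-Ray's actual FixedSimpleVector implementation). Unlike std::vector, a fixed-capacity container never reallocates; pushing past its compile-time capacity throws std::bad_alloc, which the renderer reports as "Fatal error in renderer: Out of memory" even though almost no memory is in use:

```cpp
#include <cstddef>
#include <new> // std::bad_alloc

// Fixed-capacity container in the spirit of
// FixedSimpleVector<Interior *, 64> (sketch only).
template <typename T, std::size_t Capacity>
class FixedVectorSketch {
    T items[Capacity];
    std::size_t count = 0;
public:
    void push_back(const T &value) {
        if (count >= Capacity)
            throw std::bad_alloc(); // the 65th interior would land here
        items[count++] = value;
    }
    std::size_t size() const { return count; }
};
```

With Capacity = 64, a 64-plane stack fills the container exactly and the 65th push throws, matching the observed 64/65 boundary.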




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.