POV-Ray : Newsgroups : povray.general : Photon distribution against limited memory
From: Dave Vogt
Subject: Photon distribution against limited memory
Date: 27 Dec 2004 17:16:21
Message: <41d089b5@news.povray.org>
Hello everybody,

I just had an idea for a problem we have all probably run into:
memory consumption while tracing photons. I had a scene where,
after half a day, POV-Ray died because it ran out of memory.
After thinking about it, the basic idea that came up was to simply
stop tracing photons once a given limit of RAM usage is reached.
But as I understand the documentation, this would cause a problem with
the distribution of the photons, which would be uneven in that case.

What about the following: change the distribution of photons to
something stochastic, so that the distribution is more or less even
whenever we have to stop eating memory, and we can continue with
rendering.
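
Something like this is what I have in mind. Note that "memory_limit" is
a keyword I just made up to illustrate the idea; it does not exist in
POV-Ray:

  global_settings {
    photons {
      count 2000000     // upper bound; shooting may stop earlier
      memory_limit 256  // HYPOTHETICAL keyword: stop storing photons
                        // once the map reaches 256 MB
      // photons would be shot in a stochastic order, so the map stays
      // roughly evenly distributed even if shooting stops early
    }
  }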

On the other hand, I'm not a mathematician, nor am I familiar with the
POV source... So please tell me if this is a completely stupid idea,
or if I've actually found something interesting. Or is there already a
better, or at least usable, solution that I just didn't find?


Greets

Dave

-- 
Dave "tPassive" Vogt | Linux  user  #225040 | www.frozenbrain.com     |
_____________________|______________________|_________________________|
List of Spam filter victims: http://frozenbrain.com/rel.php/victims   |
PGP Key:                     http://frozenbrain.com/public_key.asc    /



From: Mike Thorn
Subject: Re: Photon distribution against limited memory
Date: 27 Dec 2004 17:28:00
Message: <41d08c70$1@news.povray.org>
Dave Vogt wrote:
> Hello everybody,
> 
> I just had an idea for a problem we have all probably run into:
> memory consumption while tracing photons. I had a scene where,
> after half a day, POV-Ray died because it ran out of memory.
> After thinking about it, the basic idea that came up was to simply
> stop tracing photons once a given limit of RAM usage is reached.
> But as I understand the documentation, this would cause a problem with
> the distribution of the photons, which would be uneven in that case.
> 
> What about the following: change the distribution of photons to
> something stochastic, so that the distribution is more or less even
> whenever we have to stop eating memory, and we can continue with
> rendering.
> 
> On the other hand, I'm not a mathematician, nor am I familiar with the
> POV source... So please tell me if this is a completely stupid idea,
> or if I've actually found something interesting. Or is there already a
> better, or at least usable, solution that I just didn't find?

It all sounds logical, but why not just use fewer photons from the 
start? You would accomplish basically the same result.
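
For example, something along these lines (the numbers are just
placeholders, and you would use either spacing or count, not both):

  global_settings {
    photons {
      spacing 0.05    // larger spacing -> fewer photons -> less memory
      // ...or, instead of spacing:
      // count 20000  // shoot a fixed number of photons at the targets
    }
  }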

~Mike



From: Dave Vogt
Subject: Re: Photon distribution against limited memory
Date: 28 Dec 2004 05:55:15
Message: <41d13b92@news.povray.org>
Mike Thorn wrote:
> It all sounds logical, but why not just use fewer photons from the
> start? You would accomplish basically the same result.

This is what I have done until now, and what I'm still doing. But you
have no choice other than to tweak the number of photons until it fits,
or until you're satisfied with the result. What if the photon-tracing
process takes a few hours (or even days), after which you see that you
could have used twice as many... or that you would have needed more
memory? If you have refracting objects, the actual number of photons
grows exponentially, so you can't guess the value to put into your
setting by hand.
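
For illustration, here's the kind of setup where this happens (the
object name is made up):

  // at each glass surface a photon can split into a reflected and a
  // refracted part, so the number of stored photons multiplies with
  // every bounce
  object { MyGlassSphere    // hypothetical glass object
    photons {
      target                // shoot photons at this object
      refraction on
      reflection on
    }
  }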

Dave

-- 
Dave "tPassive" Vogt | Linux  user  #225040 | www.frozenbrain.com     |
_____________________|______________________|_________________________|
List of Spam filter victims: http://frozenbrain.com/rel.php/victims   |
PGP Key:                     http://frozenbrain.com/public_key.asc    /



From: Warp
Subject: Re: Photon distribution against limited memory
Date: 28 Dec 2004 06:06:59
Message: <41d13e53@news.povray.org>
Dave Vogt <dav### [at] newsfrozenbraincom> wrote:
> What about the following: change the distribution of photons to
> something stochastic, so that the distribution is more or less even
> whenever we have to stop eating memory, and we can continue with
> rendering.

  Do you really want caustics to be grainy?

  There's a reason why people hated stochastic media sampling and why
the adaptive media sampling method was added and made the default:
people don't like graininess.

-- 
#macro M(A,N,D,L)plane{-z,-9pigment{mandel L*9translate N color_map{[0rgb x]
[1rgb 9]}scale<D,D*3D>*1e3}rotate y*A*8}#end M(-3<1.206434.28623>70,7)M(
-1<.7438.1795>1,20)M(1<.77595.13699>30,20)M(3<.75923.07145>80,99)// - Warp -



From: Dave Vogt
Subject: Re: Photon distribution against limited memory
Date: 28 Dec 2004 16:40:17
Message: <41d1d2c0@news.povray.org>
Warp wrote:

> Dave Vogt <dav### [at] newsfrozenbraincom> wrote:
>> What about the following: change the distribution of photons to
>> something stochastic, so that the distribution is more or less even
>> whenever we have to stop eating memory, and we can continue with
>> rendering.
> 
>   Do you really want caustics to be grainy?

No, of course not. But I see this as the only way to be able to stop
when no more memory is available (the other way is to just die and
tell the user to try a smaller number of photons, which is what
happens currently).

> 
>   There's a reason why people hated stochastic media sampling and why
> the adaptive media sampling method was added and made the default:
> people don't like graininess.
> 

Agreed. So what do you do if you run out of memory for some reason? How
do people guess how many photons need to, or can, be shot?


Greets,
Dave



From: stm31415
Subject: Re: Photon distribution against limited memory
Date: 28 Dec 2004 17:00:00
Message: <web.41d1d719efd6c93ed72c6180@news.povray.org>
I'm not sure, but couldn't you run repeated renders, each time loading
and saving photon maps? With each consecutive render, wouldn't you get
finer data, so you could use as many photons as you have patience for?
Or would you just resample the same places?
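
Something like this is what I mean (the file name is made up, and I'm
not sure whether a later pass would actually refine the map or just
reuse it unchanged):

  // first pass: shoot photons and save the resulting map
  global_settings {
    photons {
      count 50000
      save_file "scene.ph"  // hypothetical file name
    }
  }

  // later passes: load the saved map instead of shooting from scratch
  // global_settings { photons { load_file "scene.ph" } }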

-S



From: Warp
Subject: Re: Photon distribution against limited memory
Date: 29 Dec 2004 06:43:09
Message: <41d2984d@news.povray.org>
Dave Vogt <dav### [at] newsfrozenbraincom> wrote:
> Agreed. So what do you do if you run out of memory for some reason? How
> do people guess how many photons need to, or can, be shot?

  That's like asking "what should people do if they run out of memory
loading an enormous mesh"?

-- 
plane{-x+y,-1pigment{bozo color_map{[0rgb x][1rgb x+y]}turbulence 1}}
sphere{0,2pigment{rgbt 1}interior{media{emission 1density{spherical
density_map{[0rgb 0][.5rgb<1,.5>][1rgb 1]}turbulence.9}}}scale
<1,1,3>hollow}text{ttf"timrom""Warp".1,0translate<-1,-.1,2>}//  - Warp -



From: Mike Thorn
Subject: Re: Photon distribution against limited memory
Date: 29 Dec 2004 08:45:44
Message: <41d2b508$1@news.povray.org>
Warp wrote:
> Dave Vogt <dav### [at] newsfrozenbraincom> wrote:
> 
>>Agreed. So what do you do if you run out of memory for some reason? How
>>do people guess how many photons need to, or can, be shot?
> 
>   That's like asking "what should people do if they run out of memory
> loading an enormous mesh"?

Except that in that case there's nothing one can do but make a new
mesh, whereas in this case there *is* something one can do. Dave wants
to know what *you* do.

~Mike



From: Warp
Subject: Re: Photon distribution against limited memory
Date: 29 Dec 2004 09:03:53
Message: <41d2b948@news.povray.org>
Mike Thorn <mik### [at] realitycheckmultimediacom> wrote:
> Except that in that case there's nothing one can do but make a new
> mesh, whereas in this case there *is* something one can do. Dave wants
> to know what *you* do.

  I make a new set of photon parameters.

-- 
#macro N(D)#if(D>99)cylinder{M()#local D=div(D,104);M().5,2pigment{rgb M()}}
N(D)#end#end#macro M()<mod(D,13)-6mod(div(D,13)8)-3,10>#end blob{
N(11117333955)N(4254934330)N(3900569407)N(7382340)N(3358)N(970)}//  - Warp -



From: Dave Vogt
Subject: Re: Photon distribution against limited memory
Date: 29 Dec 2004 11:27:36
Message: <41d2daf7@news.povray.org>
Warp wrote:

> Dave Vogt <dav### [at] newsfrozenbraincom> wrote:
>> Agreed. So what do you do if you run out of memory for some reason?
>> How do people guess how many photons need to, or can, be shot?
> 
>   That's like asking "what should people do if they run out of memory
> loading an enormous mesh"?
> 

Right. But IMHO it's quite easy to guess how much memory a mesh will
use; for photons, it's not. Especially not if there are refracting
objects.

So you're saying there's nothing you can do but guess?

Another idea that just jumped into my mind: statistics. I could
do some test renders with low photon counts (say, 2000, 4000, and
6000) and see how the memory footprint changes. If that works well, I
could be happy with it.
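
Something like this, with the count pulled out so it's easy to vary
between test renders (the numbers are only examples):

  // vary this between test renders (2000, 4000, 6000, ...) and note
  // the peak memory use of each run; if it grows roughly linearly,
  // extrapolate to find the largest count that still fits in RAM
  #declare PhotonCount = 2000;

  global_settings {
    photons {
      count PhotonCount
    }
  }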



Greets,

Dave


