From: Cousin Ricky
Subject: Re: OMG - it works!!
Date: 29 Dec 2008 22:55:00
Message: <web.49599aef5b5ba55b85de7b680@news.povray.org>
"clipka" <nomail@nomail> wrote:
>  See for yourself what 3.6 and the
> 3.7.0.beta.29 make of it.

I would if the image weren't 5 times bigger than my monitor.



From: Reactor
Subject: Re: OMG - it works!!
Date: 29 Dec 2008 23:10:00
Message: <web.49599e725b5ba55bd7faa3a60@news.povray.org>
"clipka" <nomail@nomail> wrote:
> "Reactor" <rea### [at] hotmailcom> wrote:
> > Would you mind making the source available?  I have been experimenting with
> > radiosity under 3.6 recently, and I've run into an odd, non-intuitive way of
> > getting certain difficult to render scenes to trace correctly.  I also want to
> > see how the recursive sampling does under MegaPov 1.2.1.
>
> Whoops - somehow this post of yours slipped past me...
>
> Tell me about those experiments - it may help me identify some more quirks in
> the radiosity code.
>
>
> > Since the image is still not very accurate and your settings seem to be fairly
> > high (in comparison to what I use for most scenes), I think this would be a
> > good test of some alternate methods.
>
> Sure, give it a try. But promise to tell me about your results ;)
>
> As it is now, it needs a texture named "Parquetry_plain.png", but I guess that
> doesn't really matter, so you may want to replace it with a uniform pigment.
> The original texture I use is too large to post it.


Well... basically, when you start to run into the 1600 count limit or the count
vs patience limit, I've found that some scenes can actually be made to work
with a lower count than usual by lengthening the pretrace step and dropping the
error bound very low.  Since one usually sees splotches when the count is too
low for a given error_bound, if the error_bound is dropped low enough, the
splotches become very small and distributed such that the shadows and shading
start becoming reasonably accurate again.
Depending on the scene, this method can work fairly well.  I have had some
degree of success on scenes that are indoors and involve sharper shadows and
rapid changes of light.  Outdoor, architectural scenes involving slower changes
of light and softer shadows seem to do better with the usual approach.
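
A minimal sketch of the kind of block this describes; the values below are
illustrative assumptions, not Reactor's actual settings:

  global_settings {
    radiosity {
      pretrace_start 64/image_width  // begin the pretrace on a coarse grid
      pretrace_end    1/image_width  // refine the pretrace all the way down to pixel size
      count 50                       // deliberately low, well under the 1600 cap
      error_bound 0.1                // far below the 1.8 default, so the splotches shrink
      recursion_limit 2
    }
  }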



From: clipka
Subject: Re: OMG - it works!!
Date: 29 Dec 2008 23:25:01
Message: <web.4959a2895b5ba55bab169ede0@news.povray.org>
"Cousin Ricky" <ric### [at] yahoocom> wrote:
> "clipka" <nomail@nomail> wrote:
> >  See for yourself what 3.6 and the
> > 3.7.0.beta.29 make of it.
>
> I would if the image weren't 5 times bigger than my monitor.

:) no scroll bars or zoom on your browser / image viewer / whatever?



From: clipka
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 00:25:00
Message: <web.4959b06b5b5ba55bab169ede0@news.povray.org>
"Reactor" <rea### [at] hotmailcom> wrote:
> Well... basically, when you start to run into the 1600 count limit or the count
> vs patience limit, I've found that some scenes can actually be made to work
> with a lower count than usual by lengthening the pretrace step and dropping the
> error bound very low.  Since one usually sees splotches when the count is too
> low for a given error_bound, if the error_bound is dropped low enough, the
> splotches become very small and distributed such that the shadows and shading
> start becoming reasonably accurate again.
> Depending on the scene, this method can work fairly well.  I have had some
> degree of success on scenes that are indoors and involve sharper shadows and
> rapid changes of light.  Outdoor, architectural scenes involving slower changes
> of light and softer shadows seem to do better with the usual approach.

I see... so what you're basically doing is "de-bundling" the sample rays.

Theoretically this should be somewhat slower, as you probably use the same
number of sample rays (total, not per sample), and just distribute them more
evenly across the whole scene, so the lookup will be slowed down.

However, I can also see how this might smoothen the look of the scene.


Which reminds me... I guess I should check whether the re-used samples are
actually weighted according to the distance to the point in question...

(*digs through the code*)

... hum... well, there is some weighting according to distance, but judging from
the way that some splotchy images look, this might be implemented poorly. I'll
have a closer look at this.



From: stbenge
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 00:32:34
Message: <4959b272@news.povray.org>
clipka wrote:
> "Samuel Benge" <stb### [at] hotmailcom> wrote:
>> Hey clipka (and everyone else who might be interested),
>>
>> There's a quick workaround for 3.7 radiosity.
> 
> I'm actually not interested in quick workarounds, but getting the thing to
> ultimately work fine without them :)

Yeah, once you start using tricks to get what you want, you lose 
flexibility.

>> 3.7's radiosity doesn't seem to like high count values,
> 
> That's very interesting for me to hear. Can you describe what's wrong with it,
> and maybe post a sample scene?

No, I cannot. Recent tests don't seem to be showing any black splotches. 
Maybe 3.7's radiosity has been fixed somewhat after all. It used to 
happen when I rendered meshes with high-count radiosity. Strange black 
patches would appear, and higher count values only made it worse. If I 
run into it again, I'll post the relevant code.

>> #default{
>>  finish{ambient 0}
>>  normal{bumps .25 scale .001}
>> }
> 
> Hm - sounds to me like you're actually using the normals to force more samples
> to be taken. Does this do any good to rendering time?

Yes, that is what I'm doing. If you're having a hard time getting rid of 
radiosity's artifacts, the method can save the day. The render times
generally tend to be a bit higher, and you'll always see some of the 
surface normal. Overall, the result is on par with many other types of 
radiosity you see out there. You know, the ones with visual noise :)

Sam
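
A rough sketch of how Sam's #default trick might sit in a scene; the count value,
camera, and placeholder geometry are assumptions added here for illustration only:

  global_settings {
    radiosity { count 200 }  // lower count than the scene would otherwise need
  }

  // the small-scale global normal forces more radiosity samples to be taken
  // (per the exchange above), at the cost of a faintly visible bumpiness
  #default {
    finish { ambient 0 }
    normal { bumps .25 scale .001 }
  }

  camera { location <0, 1, -4> look_at 0 }
  sphere { 0, 1 pigment { rgb 1 } }  // placeholder object; the defaults apply to it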



From: stbenge
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 00:38:39
Message: <4959b3df@news.povray.org>
nemesis wrote:
> "Samuel Benge" <stb### [at] hotmailcom> wrote:
>>
>> You can now get away with lower (<300) count settings. It's a hack--I admit
>> it--but it's worth it for difficult scenes.
> 
> BTW, Samwise is one of our resident wizards -- Jaime and Gilles too but they
> seem to be a bit off in other planes of existence -- and is well versed in
> povray black crafts.  Heed his magic words. :)

It's a cheap trick, and it makes you lose flexibility in your scene. It 
might be a good thing to keep in mind, though, in case some day you 
have a particularly stubborn scene :/

Sam



From: clipka
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 01:10:00
Message: <web.4959bac55b5ba55bab169ede0@news.povray.org>
stbenge <^@hotmail.com> wrote:
> No, I cannot. Recent tests don't seem to be showing any black splotches.
> Maybe 3.7's radiosity has been fixed somewhat after all. It used to
> happen when I rendered meshes with high-count radiosity. Strange black
> patches would appear, and higher count values only made it worse. If I
> run into it again, I'll post the relevant code.

Sounds like the classic "black splotches" issue I posted about recently.

It's all about POV-Ray's way of keeping track of recursion depth - of the trace
in general (what you control with "max_trace_level") and of radiosity (often
referred to as "bounce depth", which you control with the "recursion_limit"
keyword).
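
For reference, those two controls sit in different places in the scene file; a
sketch with arbitrary example values:

  global_settings {
    max_trace_level 10    // depth limit for the trace in general (reflection/refraction)
    radiosity {
      recursion_limit 3   // radiosity "bounce depth"
    }
  }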

In POV 3.6, the handling is simply wrong, causing those well-known artifacts in
basically every scene that has any reflection or refraction. POV 3.7 (at least
beta.29 - I haven't seen a single line of code of previous betas) handles the
whole issue differently, probably for totally unrelated design reasons. This
change had a positive side effect on the black splotch issue, but is somewhat
bogus as well.

The next beta will most likely have this all sorted out properly (if the POV
team decides to include my changes, that is).



From: Reactor
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 02:20:01
Message: <web.4959cac85b5ba55b8b557b650@news.povray.org>
"clipka" <nomail@nomail> wrote:
> "Reactor" <rea### [at] hotmailcom> wrote:
> > Well... basically, when you start to run into the 1600 count limit or the count
> > vs patience limit, I've found that some scenes can actually be made to work
> > with a lower count than usual by lengthening the pretrace step and dropping the
> > error bound very low.  Since one usually sees splotches when the count is too
> > low for a given error_bound, if the error_bound is dropped low enough, the
> > splotches become very small and distributed such that the shadows and shading
> > start becoming reasonably accurate again.
> > Depending on the scene, this method can work fairly well.  I have had some
> > degree of success on scenes that are indoors and involve sharper shadows and
> > rapid changes of light.  Outdoor, architectural scenes involving slower changes
> > of light and softer shadows seem to do better with the usual approach.
>
> I see... so what you're basically doing is "de-bundling" the sample rays.
>
> Theoretically this should be somewhat slower, as you probably use the same
> number of sample rays (total, not per sample), and just distribute them more
> evenly across the whole scene, so the lookup will be slowed down.
>
> However, I can also see how this might smoothen the look of the scene.
>

It doesn't smooth it much, actually.  I use it for sharp edges and small things
that a larger error bound would miss (i.e. radiosity is largely unaffected by
the addition or removal of the object).  In those cases, a small count of very
accurate, possibly widely spaced samples is more visually correct.  So far,
it is turning the door into a diffraction grating :(

I have noticed that an error_bound greater than 0.075 leads to the light
"seeping" through the walls.  Attached is an image rendered under
3.6.1.icl8win32 with the following radiosity block:

  radiosity {
    pretrace_start  64/image_width
    pretrace_end     1/image_width
    count 25
    nearest_count 20
    error_bound 0.05
    recursion_limit 4
    low_error_factor 0.5
    gray_threshold 0
    minimum_reuse 0.01
    brightness 1.0
    adc_bailout 0.01
    always_sample on
  }



I am still testing under MegaPov, but so far, this scene looks like it will
require a very high count, which is what I was hoping to avoid.

-Reactor




Attachment: the_secret.jpg (26 KB)

From: Thomas de Groot
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 03:17:32
Message: <4959d91c$1@news.povray.org>
"Samuel Benge" <stb### [at] hotmailcom> schreef in bericht 
news:web.4959339e5b5ba55bf8e75d050@news.povray.org...
>
> There's a quick workaround for 3.7 radiosity. 3.7's radiosity doesn't seem 
> to
> like high count values, so to get more detail you can apply a small-scale
> global surface normal to all objects in your scene like so:
>

Yes, that is true indeed, now that you mention it. I almost 
automatically/unconsciously do this in many of my scenes. Don't remember 
where the trick came from...

Thomas



From: clipka
Subject: Re: OMG - it works!!
Date: 30 Dec 2008 05:20:01
Message: <web.4959f5a85b5ba55b1a6427600@news.povray.org>
"Reactor" <rea### [at] hotmailcom> wrote:
> I am still testing under MegaPov, but so far, this scene looks like it will
> require a very high count, which is what I was hoping to avoid.

Try the sample randomization in MegaPOV - it might do some good, because it will
most likely break up the "diffraction grating" effect.

The problem is that POV-Ray doesn't randomize its samples at all, so if you have
a low sample count, the bright gap in the door is always picked up from the same
directions.


I'm currently toying around with an alternative approach to deciding when to
take new samples.

At the moment, the sampling is simply "expectation-driven": the scene geometry
is analyzed, and samples are taken if it is *estimated* that the existing
samples will have too high an error for a certain location.

The approach I'm experimenting with sets up a coarse mesh of samples using the
same type of estimation, but when attempting to re-use samples it also checks
how much the existing samples actually *do* disagree.

