POV-Ray : Newsgroups : povray.unofficial.patches : Re: Blurred reflections
From: Mr
Date: 11 May 2020 04:45:01
Message: <web.5eb90e88ba6b2c306adeaecb0@news.povray.org>
Cousin Ricky <ric### [at] yahoocom> wrote:
> On 2020-05-09 4:29 PM (-4), Mr wrote:
> > "And" <49341109@ntnu.edu.tw> wrote:
> >> Will blurred reflection integrate into official version one day? (this function
> >> is too important)
> > Things did evolve in the last two years. Now that Anti Aliasing method 3 is in
> > the main POV-Ray trunk, the said "framework" is there. The syntax... almost as
> > well, with things like the finish level fresnel paving the way to a similar
> > approach for roughness maybe. My question is, since then, did any other branch
> > than Uberpov also try to add this feature?
> The knowledge base alludes to MegaPOV having some blurred reflection
> capability, though I cannot find it in the MegaPOV documentation.  But
> MegaPOV development terminated long before UberPOV was released.
> Since development of UberPOV has wound down, I've found myself reverting
> to RC3Metal from the Object Collection, which works with standard
> POV-Ray.  What seems to work best for me with RC3Metal is to do a bunch
> of EXR renders with different seeds in an animation loop, then read them
> back as image maps and average them in a pigment_map.  To save time, I
> save the photons for the first 3 to 5 frames, then reuse them round
> robin for the remaining frames.  If there are no mutually reflecting
> surfaces, I just use a single render with a high sample count.
> Overall, UberPOV does a better job than RC3Metal, though, especially
> with specular highlights.

Hi Cousin Ricky
I had noticed your work on RC3Metal and also on the sun & sky atmosphere; the
Blender exporter could benefit a lot from your experiments!
However, for the current issue: since the exporter already uses Blender frames to
export a clock animation, I assume using the clock for anything else would add to
the already unreadable code. The exporter also supports a user-supplied
bump_map, so the micronormals have to combine with it, and also with the finish
map trick used to emulate a specular map.

For now I have started testing a single "micro bump" layer averaged with the user
bump map, and it works better than I remembered from back when I initially chose
UberPOV. Maybe I'm forgetting a use case that made me abandon it at the time...

My latest attempt completely lost the user specular map's visibility, as it gets
eaten away by the micro bump... On the next try, I should make it so that the one
entry with minimal specular also doesn't carry the microbump, so the finish map
trick becomes a finish & bump_map "metamap" trick...

In the averaged normal_map I used 0.5 as the entry value for the microbump, and
the same for the user bump map.
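As a rough sketch of that 0.5/0.5 average (all identifiers and file names here are hypothetical placeholders; in practice the user map would come from the exporter):

```pov
// Sketch only -- identifiers, pattern parameters and file names are hypothetical.
#declare Micro_Bump = normal { bumps 0.5 scale 0.002 }   // micronormal layer
#declare User_Bump  = normal {
    uv_mapping
    bump_map { jpeg "UserBumpMap.jpg" map_type 0 }       // user-supplied bump map
}
#declare Blurred_Normal = normal {
    average
    normal_map {
        [0.5 Micro_Bump]   // microbump entry value
        [0.5 User_Bump]    // user bump map entry value
    }
}
```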

I don't know whether it will break when I add another microbump layer, with a
different seed, to the average in the same clock frame. In that case I would
probably use 0.25 for each microbump layer and 0.5 for the user bump texture,
with the microbump muted in the less specular areas(?). Roughly speaking:

        texture {
            pigment_pattern {
                uv_mapping
                image_map { jpeg "PATH\\TO\\UserSpecularMap.jpg" map_type 0 }
            }
            texture_map {
                [0.0 // minimal specular: user bump map only
                    pigment { rgbft <0.8, 0.8, 0.8, 0, 0> }
                    finish { Finish_Min_Spec }            // placeholder
                    normal { User_Bump_Map }              // placeholder
                ]
                [1.0 // maximal specular: microbump(s) averaged with user bump map
                    pigment { rgbft <0.8, 0.8, 0.8, 0, 0> }
                    finish { Finish_Max_Spec }            // placeholder
                    normal { Microbump_User_Bump_Average }// placeholder
                ]
            }
        }
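And if that second microbump layer does get added, the 0.25/0.25/0.5 weighting could look something like this (again a hypothetical sketch; translating each bumps layer by a different offset stands in for a per-layer seed):

```pov
// Sketch only -- hypothetical parameters; each translate acts as a "seed".
#declare Blurred_Normal_2 = normal {
    average
    normal_map {
        [0.25 bumps 0.5 scale 0.002 translate <17.3, 0, 0>]   // microbump, seed A
        [0.25 bumps 0.5 scale 0.002 translate <0, 31.7, 0>]   // microbump, seed B
        [0.5  uv_mapping bump_map { jpeg "UserBumpMap.jpg" map_type 0 }]
    }
}
```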

