POV-Ray : Newsgroups : povray.binaries.images : A quick povr branch micro normal image.
From: William F Pokorny
Subject: Re: A quick povr branch micro normal image.
Date: 28 Jan 2022 05:25:57
Message: <61f3c4b5$1@news.povray.org>
On 1/27/22 10:54, jr wrote:

jr & Bald Eagle, Thanks both for the feedback and ideas.

> gut reaction[*] - yes, something along that line.  while compatibility is
> important of course, I think that this mechanism is of value only from current
> versions on.  not quite sure I really understand the detail, so I'd write eg:
> 
> #if (99 = f_odd(0,0,0,99))
>    #if (!strcmp(patch_val("id"),"povr"))
>      ...
>    #end
> #else
>    ...
> #end
> 
> where/how does 'patch_str' get used?

What I was thinking about was more like:

#declare povr     =  0;
#declare povr_ver = -1;
#if (99 = f_odd(0,0,0,99))
     #if (!strcmp(patch_str(0),"povr"))
         #declare povr = 1;
         #declare povr_ver = patch_val(0);
     #end
#else
   ...
#end

The patch_str() and patch_val() would be paired by count. I am leaning 
this way because the parser is set up for keywords to always be of one 
type.

Having the pair makes it easy to set up a loop to pull more than one key/value 
pair, aimed at creating, say, a table.
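
Roughly, such a loop might look like this (patch_str() / patch_val() are only 
proposed at this point, and the idea that an out-of-range index returns an 
empty string is just an assumption for the sketch):

// sketch only - assumes an out-of-range index returns an empty string
#declare ForkInfo = dictionary;
#declare Idx = 0;
#while (strlen(patch_str(Idx)) > 0)
    #declare ForkInfo[patch_str(Idx)] = patch_val(Idx);
    #declare Idx = Idx + 1;
#end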

Using a count for access, I believe, will make it a little less likely 
users will get the feature tangled up on string specification or 
interpretation.

Though, nothing would stop any given branch developer from setting up to 
pull values out of returned strings in the SDL - if that's the set up 
they want.

> 
> from my admittedly limited vantage I see no downsides, other than that
> 'functions.inc' (presumably) would need to be sourced.
> 
> [*] also .. pleasing that a function with that exact name should get shouldered
> with this odd job.:-)

Indeed! :-)

Yes, including functions.inc is a bit clunky and it touches on the 
parser performance issue I was trying to address somewhat with my 
'munctions' (macro call defined functions) idea - a little work toward 
which showed up in my last release.

True to some degree for any include, but when we pull in functions.inc 
in particular we define upwards of one hundred symbols in a symbol 
table. These can and do collide by hash value(a) which slows down 
functions at RUN time as well as slowing general parsing at parse time.

We only include functions.inc to declare - create global symbol table 
entries - for each inbuilt function name. For f_odd we could, and 
probably should, use(b) just:

#declare f_odd = function { internal(43) }

ahead of the call to f_odd.

This selective declaration of inbuilt functions has always been 
recommended against because the positional values in the internal 
function table might change. True, but they haven't actually changed in 
a very, very long time - until povr really. There has always been a 
performance reason to do the declares for only the functions in 
functions.inc you use(c).

(a) - This is especially true in official POV-Ray, where the hashing 
mechanism, though itself very fast, generates hash values heavily 
weighted / bunched around the first token character. The povr branch 
uses a C++ provided string hashing with very good hash value 
distribution - even where strings are quite similar (Fn00,Fn01...). The 
C++ method was slower at low optimizations and faster at -O2 and above 
for the testing done at the time I changed over.

(b) In povr, I'm using entry 43 for f_elliptical_sphrswp() and I will 
have to move f_elliptical_sphrswp() elsewhere.

(c) Though, with hash based symbol/token tables with linked lists 
hanging off each node, whether you see any performance gain depends on 
the particular symbol table construction. Having fewer symbols/tokens 
will never hurt performance, but it can help quite a lot - depending on 
'stuff.'

> 
> ((real) minor nit, suggest 'fork', or perhaps even 'branch', rather than
> 'patch')
> 

Good idea. In the git sense I guess 'fork' is the better choice, because in 
usual practice there will often be branch(es) off a major fork for particular 
features of that fork. Any branches we should probably handle as additional 
_str and _val entries.

> 
>> Aside: I had the thought too for a patch_keyword("sky_sphere"). Which
>> might return say "unchanged". Or "emission sub keyword is now amplify"
>> or "removed" or "new" or "substantially updated see povr documentation"
>> or... I'm thinking more about code which self documents to some minimal
>> degree.
> a macro to return a 'dictionary{}' would be real nice.  could have keys for the
> changed stuff ("amplify") as well as version/patch level, everything in one
> place.:-)

Yes, good idea. Putting more into include(s) that create a dictionary for 
such information would be better / easier. The include itself could test 
that the forked version / branch of POV-Ray matches its internal 
information. Hmm, we could create csv file(s) and use table.inc, though I'm 
not sure whether that's more or less work/value than just creating the 
dictionary straight up?
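
Creating it straight up could be as simple as something like this (the keys 
and values here are only placeholders drawn from examples mentioned in this 
thread):

// illustrative only - real keys/values would be whatever the fork documents
#declare Povr_Changes = dictionary {
    ["fork"]       : "povr",
    ["sky_sphere"] : "emission sub keyword is now amplify",
    ["fog"]        : "rewritten; old ground fog behaviour kept as an option"
}
#debug concat("sky_sphere: ", Povr_Changes["sky_sphere"], "\n")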

On automatically including includes, I lean against it. How external 
files get searched for and found is today problematic. I've been trying 
to simplify the povr fork directory search mechanism. Still a long way 
to go there and might have some current changes wrong. Official 
POV-Ray's matching attempts are very aggressive in assuming various 
directories, file suffixes and such. I think this causes as much 
confusion as not in the end.

On versioning... Following some conventions there is a good idea, but I'm not 
sure it's something to force as part of any f_odd, fork_*() additions.

For someone actually implementing a one-off patch, it's meaningless. I have 
to say too, as a long-time user of such versioned tools, I've found them not 
all that reliable - except maybe in the ZZZ minor-update category (small 
updates so that essentially static releases still compile).

Some systems extend the versioning to components / modules / features of 
the overall 'user tool' - some allowing the user to pick the version of 
behavior for each module. This 'idea' is somewhat attractive given the more 
aggressive changes I'm pursuing with povr.

For example, I substantially re-wrote 'fog' a year or more ago. One of 
the changes was making ground fog work more reliably - it basically 
doesn't function for most scenes in official releases. Last fall I ran 
across an old scene that depended upon the particulars of the previous 
method to do a sort of fog fade at a large distance. It was a use I 
didn't foresee for 'ground fog', so I restored the old method as an 
additional option. There are now two versions of my povr fog code.

Maintaining this per-feature versioning is something I think works 
better where a coder is working on, say, only a few of the modules for a 
larger project/tool. I don't have the bandwidth to support anything 
complicated like maintaining multiple user select-able versions, but I 
might be able to increment some version number per keyword to at least 
indicate something changed related to the keyword/option.

I'll think more about what to do versioning wise, what I might be able 
to maintain...

Bill P.



From: jr
Subject: Re: A quick povr branch micro normal image.
Date: 28 Jan 2022 06:00:00
Message: <web.61f3cc59c1365d06ea8869266cde94f1@news.povray.org>
hi,

"Bald Eagle" <cre### [at] netscapenet> wrote:
> "jr" <cre### [at] gmailcom> wrote:
> > William F Pokorny <ano### [at] anonymousorg> wrote:
> > > ...
> > > Instead of using f_odd() to return additional patch/branch information,
> > > I think it should instead return always a value f_odd() cannot. For
> > > example, 99.0(a).
>
> > a macro to return a 'dictionary{}' would be real nice.  could have keys for the
> > changed stuff ("amplify") as well as version/patch level, everything in one
> > place.  :-)
> >
> Aside, or perhaps in addition to, what can be done under the hood,
>
> what if the versioning mechanism not only has internal functions, but checks for
> the existence of "versions.inc" in the path?  Preferably this would be a
> "wrapper" inc file that then looks for the most recent "versions_DateCode.inc"
> to use...

nice.  made me think there isn't a 'version.inc' yet, so, perhaps, one could be
added to all POV-Rays and variants[*].  standard .inc, only providing a macro or
variable that's true/false depending on whether official or not, and a way to
get info about which executable.  a variant like 'povr' or 'hgpovray' could then
simply add an '#include' in that file to load the specific stuff.  (_if only_
more/most of the code was in 'C'.  anyway :-))


[*] from 3.9 on, perhaps :-)

regards, jr.



From: jr
Subject: Re: A quick povr branch micro normal image.
Date: 28 Jan 2022 08:15:00
Message: <web.61f3ebacc1365d06ea8869266cde94f1@news.povray.org>
hi,

William F Pokorny <ano### [at] anonymousorg> wrote:
> On 1/27/22 10:54, jr wrote:
> > where/how does 'patch_str' get used?
>
> What I was thinking about was more like:
>
> #declare povr     =  0;
> #declare povr_ver = -1;
> #if (99 = f_odd(0,0,0,99))
>      #if (!strcmp(patch_str(0),"povr"))
>          #declare povr = 1;
>          #declare povr_ver = patch_val(0);
>      #end
> #else
>    ...
> #end
>
> The patch_str() and patch_val() would be paired by count. I am leaning
> this way because the parser is set up for keywords to always be of one
> type.
>
> Having the pair makes it easy to set up a loop to pull more than one key
> value pair aimed at creating, say, a table.

ok, key/value pairs.  thanks for the code example (I find "snippets" helpful).  re
my earlier post, that testing, and the "cloaking" of 'internal(43)', could
perhaps be hidden behind a "friendlier" bool macro/variable.
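
something along these lines, perhaps (the macro name is made up, and
patch_str() is still only the hook proposed earlier in the thread):

// sketch only - hides the internal(43) plumbing behind a bool-ish macro
#macro Is_Povr_Fork()
    #ifndef (f_odd)
        #local f_odd = function { internal(43) }
    #end
    #local is_povr_ = 0;
    #if (99 = f_odd(0,0,0,99))
        #if (!strcmp(patch_str(0), "povr"))
            #local is_povr_ = 1;
        #end
    #end
    is_povr_
#end

// usage
#if (Is_Povr_Fork())
    // povr-specific settings here
#end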


> ...
> >
> > from my admittedly limited vantage I see no downsides, other than that
> > 'functions.inc' (presumably) would need to be sourced.
> >
> > [*] also .. pleasing that a function with that exact name should get shouldered
> > with this odd job.:-)
>
> Indeed! :-)

what was its original purpose?  reading that the argument is a "field strength"
made me wonder whether it's anything to do with 'blob's.


> ...
> We only include functions.inc to declare - create global symbol table
> entries - for each inbuilt function name. For f_odd we could, and
> probably should, use(b) just:
>
> #declare f_odd = function { internal(43) }
>
> ahead of the call to f_odd.

that could/would be the content of a 'version.inc'.


> ...
> > a macro to return a 'dictionary{}' would be real nice.  could have keys for the
> > changed stuff ("amplify") as well as version/patch level, everything in one
> > place.:-)
>
> Yes, good idea. Putting more in include(s) that creates a dictionary for
> such information would be better / easier. The include could itself
> could test the forked version / branch of POV-Ray matches its internal
> information. Hmm, we could create csv file(s) and use table.inc though
> guess I'm not sure if more or less work/value over just creating the
> dictionary straight up?

'table.inc'?  :-)  guess you were thinking 'filed.inc'?  what do you think of a
'version.inc' which, for variants like 'povr', would simply pull in another
include if/where required?


> On automatically including includes, I lean against it.

v much agree.


> ...
> Maintaining this per feature versioning is something I think works
> better where a coder is working on say only a few of modules for a
> larger project/tool. I don't have the bandwidth to support anything
> complicated like maintaining multiple user select-able versions, but I
> might be able to increment some version number per keyword to at least
> indicate something changed related to the keyword/option.
>
> I'll think more about what to do versioning wise, what I might be able
> to maintain...

fwiw, I do not think that it has to be very .. fine-grained, necessarily.  as
long as a user like me can establish, in-scene, that the executable supports
this feature or another, via easy-to-use test(s) (in conditionals).


regards, jr.



From: jr
Subject: Re: A quick povr branch micro normal image.
Date: 28 Jan 2022 12:55:00
Message: <web.61f42cf8c1365d06ea8869266cde94f1@news.povray.org>
"jr" <cre### [at] gmailcom> wrote:
> ...
> ok, key/value pairs.  thanks for code example (I find "snippets" helpful).  re
> my earlier post, that testing and the "cloaking" of 'internal(43)' hidden,
> perhaps, behind a "friendlier" bool macro/variable.
> ...
> that could/would be the content of a 'version.inc'.

have attached a mock-up example of an inc for POV-Ray proper.  the tag would
just change to 'povr', and it would, either inline or via an include, have the
specific stuff and, I think, the 'setidtypes.inc' content too.
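
not the attachment itself, but to give an idea of the shape, something roughly
like this (using 'official_program' and 'idtag_program' as the names):

// version_test.inc - a guess at the mock-up's shape, not the real file
#ifndef (f_odd)
    #declare f_odd = function { internal(43) }  // avoids pulling in functions.inc
#end

#declare idtag_program = "POV-Ray";   // a fork would change this tag, e.g. "povr"

#macro official_program()
    // official builds clamp f_odd's result; a fork would return the sentinel 99
    #local official_ = (f_odd(0,0,0,99) != 99);
    official_
#end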


> > ...
> > > a macro to return a 'dictionary{}' would be real nice.
> >  ...
> > information. Hmm, we could create csv file(s) and use table.inc though
> > guess I'm not sure if more or less work/value over just creating the
> > dictionary straight up?
>
> 'table.inc'?  :-)  guess you were thinking 'filed.inc'?  what do you think of a
> 'version.inc' which, for variants like 'povr', would simply pull in another
> include if/where required?

forgot.  dictionary, "straight up".


regards, jr.




Attachments: 'eg.zip' (1 KB)

From: William F Pokorny
Subject: Re: A quick povr branch micro normal image.
Date: 28 Jan 2022 16:00:10
Message: <61f4595a$1@news.povray.org>
On 1/26/22 09:01, William F Pokorny wrote:
> ...So what's going on.

Well, after digging for a chunk of the day, the non-fuzzy result comes from 
a "bug fix" made back in 1998 by Nathan Kopp & CEY in the 
Trace::ComputeReflection() function of trace.cpp.

Basically after the calculation of the reflected ray direction using the 
perturbed normal, they added second dot test of that perturbed reflected 
ray with the raw surface normal.

If the perturbed reflected ray was heading in a direction opposite the 
raw surface normal, it triggers some code which does one of two kinds 
of correction...

Where the perturbed ray direction is opposite the perturbed normal 
direction it simply drops back to reflection rays based on the raw 
normal.  This is where we suddenly get non-normal-perturbed 
reflections...

Otherwise, it pulls the perturbed reflected ray more into alignment with 
the raw surface normal, by an amount based upon a negative weighting 
factor and the magnitude of the dot product of the perturbed ray direction 
with the raw normal - the pull being stronger up to a cut-off set by 
the initial test with the perturbed normal (a).

After all the fix-up, the reflected ray direction is normalized - which 
is the expected ray direction state.
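
Very roughly, in SDL vector terms (only a paraphrase of the behavior 
described, not the actual trace.cpp code, and the pull-back weighting is 
simplified):

// paraphrase only - the direction vectors are made-up examples
#declare In    = vnormalize(<0.6, -0.2, 0.8>);  // incoming ray direction
#declare Nraw  = <0, 1, 0>;                     // raw surface normal
#declare Npert = vnormalize(<0.8, 0.1, 0.3>);   // strongly perturbed normal

#declare R = In - 2*vdot(In, Npert)*Npert;      // reflect about perturbed normal
#if (vdot(R, Nraw) < 0)                         // heads "into" the raw surface
    #if (vdot(R, Npert) < 0)
        // opposes the perturbed normal too: fall back to the raw normal
        #declare R = In - 2*vdot(In, Nraw)*Nraw;
    #else
        // otherwise pull R back toward the raw normal (the real fix uses a
        // negative weighting factor; a full flip is used here for brevity)
        #declare R = R - 2*vdot(R, Nraw)*Nraw;
    #end
#end
#declare R = vnormalize(R);                     // final reflected direction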

Ah, what to do... Thoughts anyone? I'm going to have to think about this.

I don't like that we are basically ignoring perturbed results, to some 
degree or another - for reflections - beyond a certain point of normal 
perturbation.

With very rough surfaces - think pile of stones - a reasonable 
expectation would be to get some degree of all-internal reflection. A 
darkening due to reflections bouncing until they die off inside the surface 
structure. This feels like a more correct result to me.

Guess I need to check whether this bug fix is getting done with 
refraction too...

Bill P.

(a) - Yep, this second fix / pull toward the raw surface normal might not 
always end up in a non-opposing direction with respect to the raw surface normal.

(a) Yes, I think this is also going to create some extra 
fuzz / inaccuracies for fresnel effects at glancing angles, near 
tangents to a surface. There, with certain rougher surfaces, we should 
see somewhat around the complete 'larger-overall' tangent.



From: Bald Eagle
Subject: Re: A quick povr branch micro normal image.
Date: 28 Jan 2022 17:05:00
Message: <web.61f467c1c1365d061f9dae3025979125@news.povray.org>
William F Pokorny <ano### [at] anonymousorg> wrote:

> Ah, what to do... Thoughts anyone? I'm going to have to think about this.
>
> I don't like that we are basically ignoring perturbed results to some
> degree or another - for reflections - beyond certain normal
> perturbations point.


Whoa - interesting result of your day of code forensics!

I would tend to start from scratch, and just cancel all of that extra stuff and
see what happens.   Maybe it might explain why they decided to do all of that.

I tend to learn how to approach these types of things by going out and seeing
how various people address them outside of POV-Ray - in Unity, ShaderToy,
academic papers, and other linear algebra / computer graphics / optics / math
resources like websites and video hosting sites.

https://www.youtube.com/watch?v=EBrAdahFtuo

I would try to make a diagram, or write a scene where you have actual objects,
and shoot rays every so often, and show the incoming ray, and then whatever
reflected rays that get generated.   Maybe iterate a color change around the HSV
wheel or something to track how far along the process you are.
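
For what it's worth, a bare-bones sketch of that sort of diagnostic scene
(object, positions, and colors are arbitrary placeholders - extend with a loop
and HSV cycling as needed):

// shoot one ray at an object with trace() and draw the incident and
// reflected rays as thin cylinders
#version 3.8;
global_settings { assumed_gamma 1 }
#include "colors.inc"    // for CHSV2RGB()

#declare Obj  = sphere { 0, 1 }
#declare From = <3, 2, -4>;
#declare Dir  = vnormalize(-From);                 // aim at the origin
#declare Norm = <0, 0, 0>;
#declare Hit  = trace(Obj, From, Dir, Norm);
#if (vlength(Norm) > 0)                            // trace() found a hit
    #declare Refl = vnormalize(Dir - 2*vdot(Dir, Norm)*Norm);
    cylinder { From, Hit, 0.02 pigment { color CHSV2RGB(<0,1,1>) } }            // incoming
    cylinder { Hit, Hit + 2*Refl, 0.02 pigment { color CHSV2RGB(<120,1,1>) } }  // reflected
#end

object { Obj pigment { rgb 0.8 } }
camera { location <0, 3, -8> look_at 0 }
light_source { <10, 20, -20>, rgb 1 }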

Seeing the process and results usually gives far more intuitive understanding of
what the problem is and what the likely solution is, than scribbling equations
and endlessly editing lines of code.

That's my take, anyway.



From: Kenneth
Subject: Re: A quick povr branch micro normal image.
Date: 29 Jan 2022 05:10:00
Message: <web.61f5112ec1365d064cef624e6e066e29@news.povray.org>
[William P wrote:]
>
> With povr I've been using the following sort of scene set up to look at
> normals and inversion...
> //---
> #version unofficial 3.8; // povr
> global_settings { assumed_gamma 1 }
> #declare VarOrthoMult = ...
> [...snip]

[JR wrote:]
>
> have attached a mock-up example of an inc for POV-Ray proper.  the tag would
> just change to 'povr', and either inline or via an include have the specific
> stuff, and I think, the 'setidtypes.inc' content too.
>

[running v3.8.0 beta 1 in Windows]
I did not even know that there is an 'unofficial' keyword in POV-Ray syntax; it
is not mentioned in 3.8's in-built documentation (at least not in the index of
keywords there). Interesting!

So, running JR's very neat little test scene (with his .inc file) like this:

#version 3.8; // no 'unofficial' keyword
global_settings {assumed_gamma 1}
box {0,1}
#include "version_test.inc"
#if (official_program())
#debug concat("prog is an official POV-Ray.\n")
#else
#debug concat("not an official POV-Ray.\n")
#end
#debug concat("ident: \"",idtag_program,"\".\n")

...... it parses and renders fine, with the appropriate message.

But if I run it like this:

#version unofficial 3.8;
......

...... the scene fails outright, with the message
"line 1: Parse error: This scene was created for an unofficial version and
cannot work as-is with this official version."

I'm very naive about this stuff-- but is it to be expected that this
'unofficial' file should also run OK, and even in my Windows version of 3.8
(i.e., a non-"povr" version?) If not, then... never mind ;-) My apologies if I'm
simply "muddying the waters" of this discussion, but I thought I should mention
the parse error.



From: William F Pokorny
Subject: Re: A quick povr branch micro normal image.
Date: 29 Jan 2022 07:50:21
Message: <61f5380d$1@news.povray.org>
On 1/28/22 17:01, Bald Eagle wrote:
> I would tend to start from scratch, and just cancel all of that extra stuff and
> see what happens. Maybe it might explain why they decided to do all of that.

A look at the basics is good advice.

I did get to commenting out all the bugfix code and got results more in 
line with what I originally expected. As of this morning I'm leaning 
toward bringing out various treatments as user-controllable variations. 
I can see physical reasons for a few, depending upon the actual surface 
and the non-perturbed rays. Given that the glassy result is probably useful, 
I'd likely keep it, perhaps as an expanded option. I suppose keeping the 
current behavior as an option is a good idea too.

Likely the povr default would be no bugfix / adjustments at all, as I 
think that's the result most would expect. I also like the look of it over 
the current behavior for usual-use 'micro' micro normal results.


This morning I've been looking at the refracted side of things.

A simplified view:
------------------

Where the perturbed normal is running along with the incoming ray (i.e. 
inverted with respect to it), the perturbed normal is inverted and assigned 
as a local normal for color calculations.

Where the refracted ray ends in internal reflection, the perturbed normal 
is used and the ComputeReflection bugfix gets applied as with any other 
reflection.

Where the refracted ray continues, creating a new refracted ray, the 
normals that always point back toward the ray - the potentially inverted 
local normals - get used.

So! The refraction behavior isn't really aligned with reflection 
behavior at any given surface intersection where the perturbed normals 
point in opposition to the refracted/reflected rays. In the case of 
reflections, that misalignment shows up only sometimes - when there is 
enough conflict with the raw normal.

A realization for me is, while it's possible in some cases to prevent 
the inversion of perturbed normals with respect to the major surface 
direction with some patterns, it is not possible to do in general 
because there are really two kinds of perturbed normal inversion. One 
with respect to the surface itself, and another with respect to the rays 
involved.

That said, a perturbed-normal inversion option common to all normal 
patterns, for where the perturbed normal points away from the raw normal, 
might be of use... This would immediately get us to a state where the only 
remaining issues are the inversions with respect to the involved rays.

---
I guess a take-away here for POV-Ray proper is: where you are using 
reflection or refraction, keep the normal bump size smallish (<0.5). 
Even so, where rays are nearly tangent to the surface, ANY normal 
perturbation will invert somewhat at those locations with respect to the 
incoming rays.
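
For illustration only, 'smallish' would be something in this range (the 
numbers are placeholders, not a recommendation beyond the <0.5 rule of thumb):

// reflective finish with a modest normal perturbation (bump amount < 0.5)
sphere { 0, 1
    texture {
        pigment { rgb <0.7, 0.7, 0.75> }
        finish  { reflection 0.3 }
        normal  { bumps 0.4 scale 0.05 }
    }
}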

I suppose what can be said generally is the moment we start to fake 
surfaces / shapes to some degree or another, the reflection and 
refraction treatments necessarily get somewhat heuristic.

Bill P.



From: William F Pokorny
Subject: Re: A quick povr branch micro normal image.
Date: 29 Jan 2022 08:14:31
Message: <61f53db7$1@news.povray.org>
On 1/29/22 05:07, Kenneth wrote:
> But if I run it like this:
> 
> #version unofficial 3.8;
> ......
> 
> ...... the scene fails outright, with the message
> "line 1: Parse error: This scene was created for an unofficial version and
> cannot work as-is with this official version."

Thanks! Stopping with this error is what I expected for "official" 
Windows versions. This means if we want to write SDL supporting both 
normal POV-Ray releases and forks for Windows users, we'd need to strip 
"unofficial" from #version.

Thanks jr, for the initial versions.inc template. Yes, such an include 
could be provided with code. However, we'd need to be careful about 
'declaring' any version of POV-Ray to be an official one. Today ONLY the 
pre-compiled Windows releases are official.

Hmm, suppose we could extend this code to optionally end with an #error 
if the desired hook or fork is not found for SDL written specifically 
for a particular fork.

 > what was (f_odd's) original purpose?  reading that the argument is
 > a "field strength" made me wonder whether it's anything to do
 > with 'blob's.

I don't know, I think it was likely some unfinished effort where some 
initial code got copied. Code wise it was always a match for f_cushion() 
and so pointless as shipped.

Bill P.



From: jr
Subject: Re: A quick povr branch micro normal image.
Date: 29 Jan 2022 10:55:00
Message: <web.61f55e65c1365d06ea8869266cde94f1@news.povray.org>
hi,

William F Pokorny <ano### [at] anonymousorg> wrote:
> ...
> Thanks jr, for the initial versions.inc template. Yes, such an include
> could be provided with code. However, we'd need to be careful about
> 'declaring' any version of POV-Ray to be an official one. Today ONLY the
> pre-compiled windows releases are official.

made me think what is John/Jane User's perspective?  if they only ever use
official builds on MS Windows, then "the problems" do not arise.  if however
they write scenes which will get parsed by POV-Ray proper or a variant, then
they can source the include.  I think one can simply "turn the logic around"[*]
without harm, ie name the macro 'unofficial_program' + reverse its result.  then
we could write an even simpler conditional:

  #if (unofficial_program())
    ...
  #end

[*] assuming POV-Ray proper will continue to return clamped values + everybody
else switches to '99'.


> Hmm, suppose we could extend this code to optionally end with an #error
> if the desired hook or fork is not found for SDL written specifically
> for a particular fork.

re error, sorry snipped too much.  just to confirm, unlike on Windows, the
self-compiled POV-Rays all accept "unofficial" w/out a murmur.  while I can see
the logic, it feels wrong - now.  we're 2nd class users anyway, with RTR not
working :-(.

on the point, if I test in a scene for '6e4ed6c2' (:-)) but the executable is
older/newer/not compatible and won't do, then yes, I think it has to be an
error.


>  > what was (f_odd's) original purpose?  reading that the argument is
>  > a "field strength" made me wonder whether it's anything to do
>  > with 'blob's.
>
> I don't know, I think it was likely some unfinished effort where some
> initial code got copied. Code wise it was always a match for f_cushion()
> and so pointless as shipped.

I found _that_ definition (and description in the docs) equally .. illuminating.
 </grin>


regards, jr.


