So, I have puzzled out the important parts of the importance sampling, and want
to move ahead to see how it affects a reflection model.
I had already worked out the fundamentals of the Phong model:
https://news.povray.org/povray.binaries.images/thread/%3Cweb.57cb6721f854a3db5e7df57c0%40news.povray.org%3E/?ttop=438516&toff=750
https://en.wikipedia.org/wiki/Phong_reflection_model
And so I just needed to work out how to put all of that into
vector-component-separated and daisy-chained functions so I could texture a
sphere.
It seems like I've got the basics down; however, I'm needing to divide the dot
product of the reflection vector and camera vector by 4, otherwise I get a HUGE
phong highlight.
My reflection vector calculation also seems to be sign-inverted.
http://cosinekitty.com/raytrace/chapter10_reflection.html
Also, even though I'm doing
function {min (max (Specular (x, y, z), 0), 1)}
I still seem to be veering out of the 0-1 range when my shininess factor (the
phong exponent) goes below 2. Weird.
I would love to get everything ironed out so as to not have to worry about weird
fudge factors, so that when I puzzle out and refine applying the importance
sampling part, I can focus on the real issues and not be chasing ghosts in the
code.
#declare SFn_vdot = function (ax, ay, az, bx, by, bz) {ax*bx + ay*by + az*bz}

#declare RX = function {L_hatX - (2 * SFn_vdot (L_hatX, L_hatY, L_hatZ, x, y, z)) * x}
#declare RY = function {L_hatY - (2 * SFn_vdot (L_hatX, L_hatY, L_hatZ, x, y, z)) * y}
#declare RZ = function {L_hatZ - (2 * SFn_vdot (L_hatX, L_hatY, L_hatZ, x, y, z)) * z}

#declare R_hatX = function {RX (x, y, z) / SFn_vlength (RX (x, y, z), RY (x, y, z), RY (x, y, z))}
#declare R_hatY = function {RY (x, y, z) / SFn_vlength (RX (x, y, z), RY (x, y, z), RY (x, y, z))}
#declare R_hatZ = function {RZ (x, y, z) / SFn_vlength (RX (x, y, z), RY (x, y, z), RY (x, y, z))}

#declare Specular = function {k_s * pow (SFn_vdot (R_hatX (x, y, z), R_hatY (x, y, z), R_hatZ (x, y, z), V_hatX, V_hatY, V_hatZ)/4, _alpha) * i_s}
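[Editor's aside: a quick sanity check of the algebra, in Python rather than SDL
(this is not POV-Ray code, just the two textbook reflection conventions). The
formula R = L - 2(L.N)N and Phong's R = 2(N.L)N - L differ by an overall sign,
which is exactly what a "sign-inverted" reflection vector looks like.]

```python
def vdot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect_through_plane(l, n):
    # R = L - 2 (L.N) N : reflects L through the plane perpendicular to N
    d = vdot(l, n)
    return tuple(li - 2.0 * d * ni for li, ni in zip(l, n))

def reflect_phong(l, n):
    # R = 2 (N.L) N - L : the usual Phong reflection direction
    d = vdot(l, n)
    return tuple(2.0 * d * ni - li for li, ni in zip(l, n))

L_hat = (0.6, 0.8, 0.0)   # unit vector toward the light
N = (0.0, 1.0, 0.0)       # unit surface normal

print(reflect_through_plane(L_hat, N))   # (0.6, -0.8, 0.0)
print(reflect_phong(L_hat, N))           # (-0.6, 0.8, 0.0)
```

The two results are exact negatives of each other, so which one you want
depends on whether L_hat points toward or away from the light.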
The sphere on the left is an isosurface with finish {phong 1}, and the one on
the right is pigmented according to my Phong specular reflectance function.
Attachments:
Download 'importancesampledphongbrdf.png' (68 KB)

"Bald Eagle" <cre### [at] netscapenet> wrote:
>...
> I would love to get everything ironed out so as to not have to worry about weird
> fudge factors, so that when I puzzle out and refine applying the importance
> sampling part, I can focus on the real issues and not be chasing ghosts in the
> code.
>...
> #declare R_hatX = function {RX (x , y, z) / SFn_vlength (RX (x , y, z), RY (x ,
> y, z), RY (x , y, z))}
> #declare R_hatY = function {RY (x , y, z) / SFn_vlength (RX (x , y, z), RY (x ,
> y, z), RY (x , y, z))}
> #declare R_hatZ = function {RZ (x , y, z) / SFn_vlength (RX (x , y, z), RY (x ,
> y, z), RY (x , y, z))}
>...
Hi Bill
In your code above you pass RX once and RY twice as arguments to your
SFn_vlength() function, but I guess that you intended to pass RX, RY and RZ as
arguments.

Tor Olav
http://subcube.com
https://github.com/tok

"Tor Olav Kristensen" <tor### [at] TOBEREMOVEDgmailcom> wrote:
> In your code above you pass RX once and RY twice as arguments to your
> SFn_vlength() function, but I guess that you intended to pass RX, RY and RZ as
> arguments.
TOK of the eagle eyes to the rescue again.
Thanks so much - I'm sure that would have registered at some point. Works a LOT
better now!
Attachments:
Download 'importancesampledphongbrdf.png' (59 KB)

"Bald Eagle" <cre### [at] netscapenet> wrote:
> "Tor Olav Kristensen" <tor### [at] TOBEREMOVEDgmailcom> wrote:
>
>
> > In your code above you pass RX once and RY twice as arguments to your
> > SFn_vlength() function, but I guess that you intended to pass RX, RY and RZ as
> > arguments.
>
>
> TOK of the eagle eyes to the rescue again.
>
> Thanks so much - I'm sure that would have registered at some point. Works a LOT
> better now!
No problem.
I've studied your code some more - and done some experiments with it.
You are on the right track, but unfortunately I don't have enough time
to look further at it and suggest more changes.
I did a similar experiment some years ago:
Subject: With and without a light_source - 2 attachments
From: Tor Olav Kristensen
Date and time: 2003-06-23 23:19:10
Message: <Xns### [at] 204213191226>
http://news.povray.org/povray.binaries.images/32195/
Perhaps it can help if you look at the source code for it:
Subject: Source code for "With and without a light_source"
From: Tor Olav Kristensen
Date: 24 Jun 2003 18:20:41
Message: <Xns### [at] 204213191226>
http://news.povray.org/povray.text.scenefiles/message/%3CXns93A54E262AE2torolavkhotmailcom%40204.213.191.226%3E/

Tor Olav
http://subcube.com
https://github.com/tok

"Tor Olav Kristensen" <tor### [at] TOBEREMOVEDgmailcom> wrote:
> I've studied your code some more - and done some experiments with it.
> You are on the right track, but unfortunately I don't have enough time
> to look further at it and suggest more changes.
No worries. I should focus on figuring out how clipka was going to apply the
importance sampling to the BRDF function first, and then I can generalize
everything to see how it works once I've moved past the "assume a spherical
object" phase. I have some scribbles and a general plan.
> I did a similar experiment some years ago:
Yikes! Was that really 19 years ago??!
I was smack-dab in the middle of my PhD research at that time.
> Perhaps it can help if you look at the source code for it:
Yes - I was fiddling some more with things today, and was thinking of heading in
exactly that same direction. Eventually.
That also answers the question I had about how you created some of those
interesting torus renders, and explains how you got the color effects. Bonus!
:)

OK, so here's an update.
(A lot of thinking out loud, hoping maybe to get some advice / feedback so I can
get to some sort of meaningful result quickly)
"Bald Eagle" <cre### [at] netscapenet> wrote:
> Also, even though I'm doing
>
> function {min (max (Specular (x, y, z), 0), 1)}
>
> I still seem to be veering out of the 0-1 range when my shininess factor (the
> phong exponent) goes below 2. Weird.
I think this part of it just had to do with rejecting everything in the
opposing hemisphere, by discarding all negative dot product results using
select ().
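[Editor's aside: a sketch in Python (not SDL) of why the rejection has to
happen before the pow(), not after. For a fractional phong exponent a negative
cosine makes pow() undefined (a domain error in Python), and for odd integer
exponents it goes negative, so clamping the final result to 0-1 is too late.]

```python
import math

def specular(cos_rv, alpha, k_s=1.0, i_s=1.0):
    # Reject the opposing hemisphere first, as with select () in SDL;
    # only then is pow() guaranteed a non-negative base.
    c = max(cos_rv, 0.0)
    return k_s * math.pow(c, alpha) * i_s

print(specular(0.5, 1.5))    # ~0.3536
print(specular(-0.5, 1.5))   # 0.0 instead of a domain error
```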
I have the basic implementation of importance sampling worked out - I really
should have been able to do it way back when...
I think I'm at the point where it looks like implementing an importance-sampled
BRDF in a pigment function is an erroneous idea on my part, due to a
mis(non)understanding of WHERE in the process of calculating a pixel's
brightness importance sampling is applied.
After watching only the first 1.5 minutes of
https://www.youtube.com/watch?v=xFsJMUS94Fs
it appears to me that the importance-sampled BRDF gets used in the rendering
equation, which probably means it's a source-code level thing that wouldn't be
doable with a parse-time evaluated SDL function, but rather through an
algorithm, which would be macro-based.
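[Editor's aside: a minimal Python sketch (not SDL, and not POV-Ray's
internals) of where the BRDF sits in that equation. The outgoing radiance is a
hemisphere integral, estimated as (1/N) * sum BRDF * L_i * cos(theta) / pdf.
The scene values here are invented: constant environment L_i = 1, Lambertian
BRDF rho/pi, and uniform hemisphere sampling with pdf = 1/(2*pi); the exact
answer is then rho.]

```python
import math, random

random.seed(1)

def uniform_hemisphere():
    # Uniform in cos(theta) is uniform in solid angle (Archimedes);
    # the normal is taken to be +z.
    z = random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * random.random()
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_outgoing(rho=0.5, n=100_000):
    pdf = 1.0 / (2.0 * math.pi)
    acc = 0.0
    for _ in range(n):
        w = uniform_hemisphere()
        cos_theta = w[2]
        acc += (rho / math.pi) * 1.0 * cos_theta / pdf
    return acc / n

print(estimate_outgoing())  # close to 0.5 (the exact value of rho)
```

Importance sampling just swaps uniform_hemisphere() and its pdf for a
distribution shaped like the integrand.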
This tells me that to evaluate the surface illumination, I'll have to build my
surface out of single-pixel-sized boxes or spheres, or maybe cones - or make a
triangle mesh and cycle through all of the triangles or vertices to determine
the color of each (smooth) triangle.
Does this sound right?
Alternatively, I suppose I could define a big function that uses a noise
function to generate pseudorandom vectors to sample around the surface
point...
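[Editor's aside: a Python sketch (not SDL) of that idea. Two pseudorandom
numbers in [0,1) - which could just as well come from evaluating a noise
function - are mapped to a cosine-weighted sample direction in the hemisphere
around a unit surface normal, via Malley's method (sample the unit disk,
project up; pdf = cos(theta)/pi). The helper names are mine, not POV-Ray's.]

```python
import math, random

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def cosine_sample(n, u1, u2):
    # Orthonormal basis (t, b, n) around the unit normal n
    a = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = normalize(cross(a, n))
    b = cross(n, t)
    # Uniform point on the unit disk, projected up onto the hemisphere
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    dx, dy = r * math.cos(phi), r * math.sin(phi)
    dz = math.sqrt(max(0.0, 1.0 - u1))
    return tuple(dx * t[i] + dy * b[i] + dz * n[i] for i in range(3))

random.seed(2)
N = normalize((1.0, 2.0, 0.5))
w = cosine_sample(N, random.random(), random.random())
print(w)   # a unit vector in the hemisphere around N
```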
I will have to think some more on this to decide what to start coding...
Maybe if someone has coded an entire basic (meta)raytracer in SDL, then I could
go that route.

On 2022-11-13 18:47 (-4), Bald Eagle wrote:
>
> Maybe if someone has coded an entire basic (meta)raytracer in SDL, then I could
> go that route.
Did someone say "meta"?
https://wiki.povray.org/content/Documentation:Tutorial_Section_3.8#SDL_tutorial:_A_raytracer

Cousin Ricky <ric### [at] yahoocom> wrote:
> On 2022-11-13 18:47 (-4), Bald Eagle wrote:
> >
> > Maybe if someone has coded an entire basic (meta)raytracer in SDL, then I could
> > go that route.
>
> Did someone say "meta"?
>
>
https://wiki.povray.org/content/Documentation:Tutorial_Section_3.8#SDL_tutorial:_A_raytracer
Heh. Shortly after that, I went hunting, and found exactly that.
That's written so that it cycles through all the light sources and
updates/accumulates all of the illumination calculations for a surface pixel,
doing a single analytical calculation for phong each time.
I'm guessing that I'd have to scrap that and go wholly with a global sampling of
the surface normal's hemisphere to have importance sampling make any sense? Not
sure how I'd do that with point lights either.
At the moment, the Monte Carlo approach makes sense when you don't know WHAT the
illumination is, like when using a pigmented sky sphere or an HDR image. Since
point lights seem to be an example of a Dirac delta function in this context, it
almost seems like there ought to be a mechanism to force sampling from _those
vectors_, or to explicitly block those vectors and do a second pass using
standard phong highlighting.
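[Editor's aside: a Python sketch (not SDL) of exactly that two-pass split. A
point light is a Dirac delta, so random hemisphere samples will (almost
surely) never hit its direction; it gets one analytic evaluation per light,
and the Monte Carlo samples are spent only on the smooth environment term. The
scene values are invented: Lambertian BRDF, constant environment, uniform
hemisphere sampling, normal = +z.]

```python
import math, random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def uniform_hemisphere(rng):
    z = rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * rng.random()
    return (r * math.cos(phi), r * math.sin(phi), z)

def shade(normal, rho, point_lights, env_radiance, n_samples, rng):
    brdf = rho / math.pi          # Lambertian, for simplicity
    total = 0.0
    # Pass 1: delta (point) lights - one analytic evaluation each
    for w_i, intensity in point_lights:
        total += brdf * intensity * max(0.0, dot(normal, w_i))
    # Pass 2: Monte Carlo estimate of the environment contribution
    pdf = 1.0 / (2.0 * math.pi)
    acc = 0.0
    for _ in range(n_samples):
        w_i = uniform_hemisphere(rng)
        acc += brdf * env_radiance * max(0.0, dot(normal, w_i)) / pdf
    return total + acc / n_samples

rng = random.Random(7)
n = (0.0, 0.0, 1.0)
lights = [((0.0, 0.0, 1.0), 2.0)]   # one overhead point light
print(shade(n, 0.5, lights, 1.0, 100_000, rng))  # close to 1/pi + 0.5
```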
I've never played in that part of the pool, so I don't have enough context to
understand what to do with this tool, now that I've built it.
I guess I will keep web searching and reading and watching until I figure
something out...
Until then, I adjusted the raytracer code to just use two spheres, and I figured
I could use a flag to have one of them importance-sampled, while the other used
the stock Phong model for comparison. Pretty neat - only takes 20 sec.
Attachments:
Download 'raytraced_importancesampledphong.pov.txt' (7 KB)

"Bald Eagle" <cre### [at] netscapenet> wrote:
> Since
> point lights seem to be an example of a Dirac delta function in this context, it
> almost seems like there ought to be a mechanism to force sampling from _those
> vectors_, or to explicitly block those vectors and do a second pass using
> standard phong highlighting.
So, this was an interesting read:
https://www.gamedev.net/blogs/entry/2261086importancesampling/
It seems that for point lights, the importance sampling algorithm just defaults
to the phong model. They even invoke the same Dirac delta function description.
Spooky.
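[Editor's aside: a Python sketch (not SDL) of the Phong-lobe importance
sampling that articles like the one linked describe: draw directions around
the reflection vector R with pdf(w) = (alpha + 1)/(2*pi) * cos(theta)^alpha,
where theta is measured from R, using cos(theta) = u1^(1/(alpha + 1)).]

```python
import math, random

def sample_phong_lobe(alpha, u1, u2):
    # Direction in lobe-local coordinates (z axis along R), plus its pdf
    cos_t = u1 ** (1.0 / (alpha + 1.0))
    sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    phi = 2.0 * math.pi * u2
    w = (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)
    pdf = (alpha + 1.0) / (2.0 * math.pi) * cos_t ** alpha
    return w, pdf

# With this pdf, the cos^alpha lobe cancels out of the estimator
# entirely: every sample of cos^alpha / pdf equals 2*pi/(alpha + 1),
# the true integral of the lobe over the hemisphere.
random.seed(4)
alpha = 10.0
w, pdf = sample_phong_lobe(alpha, random.random(), random.random())
print(w[2] ** alpha / pdf)   # 2*pi/(alpha + 1), i.e. ~0.5712 here
```

That zero-variance cancellation is the whole point of matching the pdf to the
BRDF lobe.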
At the moment, I'm still at a loss as to how to code the variable BRDF sampling
modes, but knowing what to do (and what NOT to do) is usually over half the
battle. Maybe more.
I'll keep searching, reading, and watching - and maybe I'll come up with some
way to implement this.
As a related aside, I think that we could probably use an importance sampling
approach to increase the efficiency of hit-or-miss style algorithms like filling
containers or placing objects in scenes, on specific terrain features, etc.
Likely other less obvious uses as well.
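[Editor's aside: a toy Python sketch of that efficiency argument, with made-up
numbers. If placements are only valid in a small region, uniform hit-or-miss
wastes most trials, while drawing candidates from a proposal distribution
concentrated near the valid region wastes far fewer.]

```python
import random

random.seed(3)

def valid(x):
    return 0.90 <= x <= 0.95        # valid strip covers 5% of [0, 1]

def place_uniform(n_needed):
    # Classic hit-or-miss: propose uniformly over the whole domain
    trials = 0
    placed = 0
    while placed < n_needed:
        trials += 1
        if valid(random.random()):
            placed += 1
    return trials

def place_concentrated(n_needed):
    # Proposal concentrated near the valid strip (1/3 of it is valid)
    trials = 0
    placed = 0
    while placed < n_needed:
        trials += 1
        if valid(random.uniform(0.85, 1.0)):
            placed += 1
    return trials

print(place_uniform(100))        # roughly 100 / 0.05 = ~2000 trials
print(place_concentrated(100))   # roughly 100 / (1/3) = ~300 trials
```

Note that a non-uniform proposal also changes the density of placements within
the valid region, so a true importance-sampling treatment would reweight; for
a pure feasibility search, fewer wasted trials is the win.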