On 5/8/20 8:09 PM, Bald Eagle wrote:
Forgive me, I'm going to forgo answering all the questions in detail.
You're asking the right questions. There are many answers. Shy away
from the extreme view that any given approach is all right or all wrong -
that kind of thinking I've found is almost always wrong.
On the fast, distance based stuff. I believe a reason they can be so
fast - in addition to a lot of optimization - is they set everything up
distance based (nice gradients). This lets them toss gazillions of
equations at GPU hardware (or SIMD), run it all in parallel and 'just'
add up the answers to get an 'effective equation.' By that I mean a
continuous value field over which one can iterate and find roots. How
accurate those roots are is a LOT less clear to me... Like our
isosurface set up, this method has no way to universally find all roots.
In fact, beyond a certain number of equations tossed into the mix, I'm
not sure even the creator of the input really knows...
(In a bisection-like solver - see below - they can also test thousands
of points along the ray 'at once'. Not sure whether this is done.)
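A minimal sketch of that distance-based marching idea (often called sphere
tracing) - a hypothetical toy scene of my own, not anyone's production code.
The scene is a min() over per-primitive signed distance functions, and the
current distance value itself is a safe step size:

```python
import math

def sd_sphere(p, center, radius):
    # Signed distance from point p to the surface of a sphere.
    return math.dist(p, center) - radius

def scene_sdf(p):
    # Combine many primitives; min() acts as a CSG union.
    # (Real scenes might blend thousands of these in parallel on a GPU.)
    return min(sd_sphere(p, (0.0, 0.0, 5.0), 1.0),
               sd_sphere(p, (2.0, 0.0, 7.0), 1.0))

def sphere_trace(origin, direction, max_t=100.0, eps=1e-6, max_steps=256):
    # March along the ray, each time stepping by the scene distance.
    # Because the SDF bounds the distance to the nearest surface, a
    # step of that size can never jump through a surface.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = scene_sdf(p)
        if dist < eps:
            return t          # close enough: call it a hit
        t += dist
        if t > max_t:
            break
    return None               # miss
```

Note the 'root' found this way is only as accurate as eps allows - which is
exactly the accuracy question above.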
All said, it IS an approach I think could have a place with isosurfaces,
but then there is the matter of multiple GPU vendors, and packages upon
packages of ever-changing software, to get at it. One would also need to
stay somewhat an expert in the field to keep it all running over time...
Aside: When I looked at the distance based work I was reminded of all
the Design Rule Checking (DRC) work I did while working in the
semiconductor industry for 30 plus years. A very big pile of very old
methods exists there - on which the distance stuff is somewhat leaning,
whether they know it or not. It's also a pile of ideas / methods from
which we might do well to beg, borrow and steal (text object(1)).
(1) - Which, if I ever get back to Christoph's freetype work, will I
think be a new character-only object, leaving 'text' as it exists, with
some patches, in my povr.
>> Your idea and/or a mix of those above there is a key component for
>> usefulness - some way to fade the isosurface function(s) at the hard
>> surface boundaries. Otherwise, we have gradient issues there - unless OK
>> with the container shape cutting or clipping the resultant shape.
> OK, but I still don't understand the gradient issue. We have CSG and flat
> planes and abrupt discontinuities in shapes all the time that the raytracer
> handles just fine. Why are the raymarching algorithms not struggling with this
> gradient thing, whereas we find it to be positively crippling in certain
There IS equation/value continuity in ALL the underlying ray-surface
equations with roots (surfaces) showing up at zero - anywhere we get a
clean result. All polynomial solvers returning 'all roots' require
continuity - otherwise the root isolation methods do not work reliably.
The only solver approach handling discontinuities in a general and safe
way is a bisection (forward-marching) solver - of which our isosurface
solver is a form. (No, I don't know whether it lines up with any of the
published, named ones...)
With bisection you can have discontinuities so long as you sample enough
to not miss root ranges as the ray approaches actual roots. Once you
know you have a range with '-' on one end and '+' on the other, you can
always find at least one root (or a discontinuity too, actually) in that
range.
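A toy version of that sample-then-bracket scheme - my own illustration of
the general idea, not the actual isosurface solver code:

```python
def march_and_bisect(f, t0, t1, step=0.01, tol=1e-9):
    # Forward-march along the ray at a fixed step; a sign change between
    # consecutive samples brackets at least one root, which plain
    # bisection then pins down.  Roots that do not flip the sign of f
    # between samples (tangent or closely paired roots) are silently missed.
    t, f_lo = t0, f(t0)
    while t < t1:
        t_next = min(t + step, t1)
        f_hi = f(t_next)
        if f_lo == 0.0:
            return t                      # landed exactly on a root
        if f_lo * f_hi < 0.0:             # '-' on one end, '+' on the other
            lo, hi = t, t_next
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                f_mid = f(mid)
                if f_lo * f_mid <= 0.0:
                    hi = mid              # root is in the lower half
                else:
                    lo, f_lo = mid, f_mid # root is in the upper half
            return 0.5 * (lo + hi)
        t, f_lo = t_next, f_hi
    return None                           # no sign change seen
```

Only continuity *inside* the bracketed range matters to the bisection part;
the marching part tolerates discontinuities anywhere else.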
Aside: A range in hand with a root is where our isosurface solver - I
think - could probably close on the root faster, so long as there is
local continuity. Thinking users would declare this is so by option -
and come what may. Even with continuity these faster methods are
somewhat risky, depending. I believe I have in hand a foolproof,
coded-from-scratch Newton-Raphson/bisection univariate polynomial solver
which could be adapted for use inside the isosurface solver.
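For illustration, a generic safeguarded Newton-Raphson/bisection hybrid - a
from-scratch sketch of the general technique, not the solver mentioned
above. Newton steps are taken only while they stay inside the shrinking
bracket; otherwise it falls back to a guaranteed bisection step:

```python
def safeguarded_newton(f, df, lo, hi, tol=1e-12, max_iter=100):
    # Hybrid root finder: Newton-Raphson for speed where the function is
    # locally smooth, bisection as the safety net.  Requires f(lo) and
    # f(hi) to have opposite signs (a bracketed root).
    f_lo, f_hi = f(lo), f(hi)
    assert f_lo * f_hi <= 0.0, "root must be bracketed"
    x = 0.5 * (lo + hi)
    for _ in range(max_iter):
        fx = f(x)
        if fx == 0.0 or hi - lo < tol:
            return x
        # Keep the bracket valid by replacing the same-sign endpoint.
        if f_lo * fx < 0.0:
            hi = x
        else:
            lo, f_lo = x, fx
        d = df(x)
        x_newton = x - fx / d if d != 0.0 else lo - 1.0  # force fallback
        if lo < x_newton < hi:
            x = x_newton          # Newton step: fast near a smooth root
        else:
            x = 0.5 * (lo + hi)   # bisection step: always safe
    return x
```

The 'risky, depending' part above is real: without the bracket safeguard, a
bare Newton step near a flat spot can fly off anywhere.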
The disadvantage of bisection / marching solver approaches is that you
never 'really' know the number of roots - or whether you've found them
all...
Our parametric and isosurface objects carry this shortcoming. It can be
a big one in csg and shadow testing, as examples.
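A tiny demonstration of that shortcoming, using a made-up function with two
roots sitting closer together than the coarse sampling step:

```python
def count_sign_changes(f, t0, t1, step):
    # A marching solver only "sees" roots that flip the sign of f
    # between consecutive samples.
    count, t, prev = 0, t0, f(t0)
    while t < t1:
        t += step
        cur = f(t)
        if prev * cur < 0.0:
            count += 1
        prev = cur
    return count

# Two real roots at t = 1.0023 and t = 1.0061 (a contrived example):
f = lambda t: (t - 1.0023) * (t - 1.0061)
print(count_sign_changes(f, 0.0, 2.0, 0.01))    # coarse step: finds 0
print(count_sign_changes(f, 0.0, 2.0, 0.001))   # fine step: finds 2
```

In a shadow test, missing that root pair means light leaks through a thin
shell - which is exactly the kind of trouble described above.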
I don't have all the answers. I never will. In my solver work I've
collected a set of books that, if stacked on the floor, would reach well
over my head. Papers, if printed, would make a stack many, many times
that height. The whole of it is way, way beyond me to digest.
There is the problem too, in the last 30 years say, of enormous amounts
of published noise in the field(2)... People need to publish for
degrees, so loads of minor-ish - real, but practically meaningless -
stuff gets published. Figuring out that some touted-by-title thing isn't
'really' useful takes so blasted much time! There are also niches and
notches where researchers are off doing things which seem related, but
don't really apply to ray tracing - folks off working on polynomials of
insane (for us) orders for various reasons...
Anyway, I'm starting to vent. I get frustrated too. :-) You're asking
the right questions. If the answers were easy, wonderful, fast
solutions doing everything we want would be everywhere to be had. Be
thankful this is not true - it gives us a good hobby. ;-)
My philosophy is to push on things. See if something opens up where
'relatively' easy progress can be made. If so, I work there until it
gets hard - or I get bored. Then I push on other stuff. I'm also,
slowly, pruning and changing povr to get it to a place where it's easier
to try some things on my idea pile.
Apologies for the mistakes - a fair bit of this was thinking aloud, so
to speak.
(2) - Book publishers own the rights to older stuff - or stuff not
selling that well - which I can better afford. I believe now that such
books are often spun up by an author's students or
hired-influence-folks. A fair number of the books I've picked up aren't
bad, I guess. Often, though, they're not helpful in any practical,
get-something-real-done way. One particularly popular one which IS
helpful has a license for the code therein which forces open source
efforts to "keep it on the shelf" for real code work.