                        TRISCAN MACRO DOCUMENTATION  

The file triscan.mcr is the result of several weeks of experiments with the
MegaPOV trace() function. In some ways, this project did not meet my initial
hopes. I had hoped it would produce mesh scans of sufficient quality that they
could be exported and used in other 3D packages, thereby allowing users of
other packages to use POV-Ray for producing the unusual and versatile shapes
it is so good at, such as julia fractals, quartics, and most recently
isosurfaces (through MegaPOV). After extensive experimentation, I grew to feel
that high quality tessellation can only be done with access to the internal
parameters of the various surfaces, and even then, the routines would be
complex well beyond what is practical in the POV scripting language.

HOWEVER, what I got out of the process was a scripted scanning routine which
can in fact produce some low to medium quality meshes out of almost anything
you care to put into it and have the patience to wait for the scan to finish.
Now, while these meshes don't begin to do justice to their POV-Ray origins,
they can be used as rudimentary "scale and position" reference objects in other
modelling packages. For example, using this scan routine I have...

 - Scanned POV objects and imported medium quality wiremeshes of them into
    Rhinoceros and Poser as imported/prop objects. These can then be used
    as references for helping to shape, pose, and scale creations you intend
    to import back into POV-Ray for use with the original models.
 - Scanned POV objects and made them into Moray UDOs ... even in cases where
    the primitives are not supported by Moray -- such as julia fractals
    (which greatly expands the range of what you can use in Moray).
 - Scanned simple POV-Ray objects into Leveller reference shape files. This
    would allow you, for example, to make a house, wall, bridge, or whatever
    in POV-Ray script, import a reference shape of it into Leveller, sculpt
    a landscape around it, and then reimport that height field back into
    POV-Ray for use with the original models.

QUICK START:
Note: you will need to use this with MegaPOV, as it makes use of a number
of the new features including, of course, the trace() function. To get
started, just unpack the files in this package into a common directory and
run the sample files named demo1 through demo5. Between them, the five demo
files illustrate all the most important points about triscan's use:

 Demo1.pov illustrates the most basic usage of the macro with a simple sphere. 
           The various parameters are explained in the comment block.
 Demo2.pov explains how to go about importing a julia fractal into Moray with 
           this file. Not a julia_fractal plugin, maybe, but I'm sure it will
           prove useful just the same...
 Demo3.pov shows how some shapes don't scan very well at all with this
           technique, and how you can offset this problem to some extent with
           higher scan densities.
 Demo4.pov shows how to create a Leveller reference shape out of a simple CSG
           house.
 Demo5.pov shows how to use the macro function ScanLevel(), which resets an
           internal variable prior to a scan, forcing a secondary scan pass
           that creates a considerably higher quality scan... it takes far
           longer to scan, makes larger files, and is still not perfect, but
           it does patch a lot of the holes. You decide if you think it's
           worth the extra effort.


KNOWN PROBLEMS:
There are some nagging problems with this routine, possibly some that will
never be resolved short of rewriting POV-Ray itself to do the tessellating
internally using higher quality tessellation routines. Demo3 shows one of
the more severe conditions, which occurs with shapes exposing sides that fall
off at 90 degrees, but even in the simple sphere demo you'll see examples of
tiny triangular gaps, which seem to occur because of what I call "false
culling" errors caused by the many degeneracy and false-hit tests in the
Evaluate routine. I've poked and tweaked this routine for a couple of weeks
now and found no complete solution that doesn't also destroy the whole scan
routine.

Also, I've noticed that popular conversion programs like 3DWin and Crossroads,
as well as modelling programs like Poser, seem to throw out any normals you
attempt to supply and recalculate their own smoothing on the fly. As a result,
delicate contours can be lost, and you will never get perfectly faithful
reproductions, even at the densest mesh settings. From what I've seen so far,
this is a fairly common practice among converters and modellers, and I don't
know what can be done about it, so it looks like medium quality is the best we
can hope for in this process.

ADDITIONAL INFO, FOR THOSE WHO ARE CURIOUS
Two Additional Global Options:
There are two global variables you may want to customize, so they have been
put right at the top of triscan.mcr where you can find them easily. Here's a
detailed look at what they do...

TS_CullPath variable:
The scan routine keeps file size from getting out of hand by outputting POV
triangles to a temporary mesh in a file called "culling.tmp" and reading them
back in between passes across each face. This way, triangles which completely
overlap regions already scanned are discarded as contributing nothing new to
the scan's geometry, keeping the final output file size more reasonable.
Normally this file is just put in the current working directory along with
whatever file you're doing the scan of. If you work in a lot of different
directories, you may end up leaving copies of culling.tmp all over the place.
Maybe you don't mind manually removing it when done, but there's a better way.
By putting a full pathname in the variable TS_CullPath, you will cause the
culling.tmp file to always be written to the same place, so each time you run
the Triscan macro, it will just overwrite the same temp file. If you have a
place you prefer to have your temp files written to, just put the path name
in TS_CullPath (i.e. change "culling.tmp" to read, for example,
"C:\POV-Ray\culling.tmp"). I don't know your preferences, system, or the
layout of your directory structure, so I leave this up to you to set up as
preferred.
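The culling step is, in effect, a round trip through a temporary file. Here is
a rough sketch of the idea in Python (the actual macro is POV-Ray script, and
its overlap test is geometric; this simplified version only treats a triangle
as "already scanned" if the exact same three vertices were seen before, so the
function name and the duplicate test are my illustrative assumptions):

```python
import os

def cull_pass(new_triangles, cull_path):
    """Sketch of the culling.tmp round trip: discard triangles already
    recorded in the temp file, then rewrite the file with the survivors."""
    seen = set()
    if os.path.exists(cull_path):            # read triangles back in
        with open(cull_path) as f:
            for line in f:
                seen.add(line.strip())
    kept = []
    for tri in new_triangles:
        key = repr(sorted(tri))              # vertex order must not matter
        if key not in seen:                  # contributes new geometry
            seen.add(key)
            kept.append(tri)
    with open(cull_path, "w") as f:          # overwrite the temp file
        for key in sorted(seen):
            f.write(key + "\n")
    return kept
```

The real macro compares triangles between passes across each face; this sketch
only shows why the temp file can safely live in one fixed location and be
overwritten on every run.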

TS_VarBias variable:
To understand what this is good for, you must understand the problem for which
it was created. Don't worry about it too much. The need to adjust it should
be rare, but here goes:

The biggest problem, aside from that pesky false culling, is what I call
near-miss errors. Imagine a target object with a near and far surface on the
same scan track, but where the near surface doesn't extend as far up as the
far surface. The scan process evaluates a rectangle of scan rays which it 
resolves into two potential triangles, so as you scan your way up the near 
surface, the scan rays reach a point where the top two scan rays pass right 
over the top, while the lower two scan rays still strike the near surface. The
top two rays, meanwhile, will go on to strike the far surface, which is still
part of the overall object, so this will be treated as four successful hits.
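In other words, each rectangle of four scan hits is resolved into two
candidate triangles by splitting it along a diagonal. A minimal sketch of that
split (the vertex naming and the choice of diagonal here are my illustrative
assumptions, not necessarily what the macro does internally):

```python
def quad_to_triangles(p00, p10, p01, p11):
    """Resolve a rectangle of four scan hits into two candidate triangles.

    p00/p10 stand for the lower two scan rays, p01/p11 for the upper two;
    splitting along the p10-p01 diagonal is an illustrative choice.
    """
    return [(p00, p10, p01), (p10, p11, p01)]
```

When the two upper hits land on the far surface while the two lower hits are
still on the near surface, both triangles this produces are the elongated
"near miss" triangles described below.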

If you went ahead and exported this as a pair of "successful" triangles, you'd 
have a pair of weird, elongated triangles spanning from just below the top of 
the near surface all the way to the far surface in a way no portion of your 
object actually does. To eliminate these, I set up a variable called 
TS_Variance which represents the largest distance you should ever get between 
any two points of a triangle so potential bad triangles can be weeded out with 
a simple variance test. TS_VarBias, a global variable appearing at the top of
the triscan.mcr file, is a multiplier applied to the scan segment width to
determine how much larger than the segment width the variance should be. A
value of 3.0 seems to work fine as a default. Too low, and you'll start
eliminating way too many triangles; too high, and you might as well not have
bothered with variance testing. Still, the default was chosen as a good
starting point, but that doesn't mean you won't eventually have really bizarre
surfaces that need a higher VarBias to scan properly.
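The variance test itself amounts to one distance check per triangle, something
like this sketch in Python (the real macro is POV-Ray script; the names mirror
TS_Variance and TS_VarBias but the code itself is only an illustration):

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def passes_variance_test(triangle, segment_width, ts_var_bias=3.0):
    """Weed out 'near miss' triangles: if any two vertices are farther apart
    than TS_Variance = ts_var_bias * segment_width, the triangle almost
    certainly spans from the near surface to the far surface and is culled."""
    ts_variance = ts_var_bias * segment_width
    a, b, c = triangle
    return max(distance(a, b), distance(b, c), distance(c, a)) <= ts_variance
```

With the default bias of 3.0, a triangle whose longest edge exceeds three
segment widths is rejected; raising the bias keeps more of the stretched
triangles that genuinely bizarre surfaces can produce.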

Odd but true fact: culling those triangles caused by a near miss leaves gaps in 
the far side (something I call near-miss shadows) which is why I scan all six 
sides of the object, even though recurrent scanning pierces all the way through
all surfaces. Rescanning from the opposite direction helps to fill in holes 
left by the original pass.  

THE ORIGINAL EXPERIMENTS: HERE ARE THE RESULTS
This was based on a series of experiments I announced on the povray newsgroups
a few weeks ago. For those who remember that original announcement, the
conclusions of those experiments were as follows: The "Box Method" of scanning
proved to be the superior technique of the ones I described. The "Spherical"
methods were considerably inferior to it in accuracy, even though they were
marginally faster. The duplicate culling method I described worked as expected
(although, for all I know, it may also be contributing to the false culling
errors), so the scan is no more wasteful of file space than it needs to be.
Considering the box method was the one I expected the least of, and spherical
was where I had the greatest hopes, this was an ironic result to say the
least.

The progressive intersection methods (thankfully!) proved unnecessary, as I
was able to incorporate what I ended up calling "recurrent scanning", which
simply continues scanning from the last intersection point found until you
run out of intersections. Many thanks to those who suggested it. It did
introduce
problems of its own, but they were mostly easy to overcome and saved an 
enormous amount of time on scans without having to sacrifice internal 
surfaces. 
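Recurrent scanning can be sketched as a simple loop: fire a ray, record the
hit, then re-fire from just beyond the hit point until nothing more is struck.
Here is the idea in Python, with a stand-in for MegaPOV's trace() that returns
the nearest intersection depth at or beyond a given start depth (all names
here are illustrative assumptions, not the macro's own):

```python
def recurrent_scan(first_hit_at_or_beyond, max_depth, epsilon=1e-6):
    """Collect every intersection along a single scan ray.

    first_hit_at_or_beyond(t) stands in for a trace() call: it returns the
    depth of the nearest surface at or beyond t, or None on a miss.
    Restarting just past each hit pierces internal surfaces too, which is
    what lets the scan keep interior geometry without progressive
    intersection objects.
    """
    hits, t = [], 0.0
    while t <= max_depth:
        hit = first_hit_at_or_beyond(t)
        if hit is None or hit > max_depth:
            break
        hits.append(hit)
        t = hit + epsilon        # continue from just past the last hit
    return hits
```

The epsilon nudge past each hit is the delicate part: too small and the same
surface can be struck twice, too large and thin shells can be skipped.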

For reasons that are still not clear to me, attempts at recursive subdivision
to fill gaps proved unexpectedly disastrous on every occasion they were
implemented. They did a poor job of filling in gaps, produced an effect best
described as "triangle dust", swelled file sizes enormously, and created
streams of inexplicably degenerate triangles in the culling files (despite the
extensive degeneracy testing I added to try to eliminate them). I'm still
convinced this subdivision ought to work, but I have, for now, given up on it.


CONCLUSION
So here it is: it is far from perfect, but it may be of general use, so I
present it to my fellow POV and MegaPOV users. Feel free to use it as you
please, and improve upon it if you see a way; I would ask, if you find a
viable way to improve it, that you please share your modified code with the
POV-Ray public (I'd be especially eager to look into what you come up with). I
apologize if the structure of the code offends any of the more expert
programmers among you. I don't claim to be a great programmer, and some of
the stranger parts of the way the code fits together are a legacy of the long,
slow process of this file's evolution, but hopefully I haven't left it looking
too pathetic. <g>

