In article <pan### [at] internodeonnet>,
pod### [at] internodeonnet says...
> Hmm maybe it would be possible (ie not too hard) to implement a scale
> factor into the parser so that as objects are defined, their dimensions
> are scaled.
>
> e.g.
> #declare scale_factor = 1.0;
> box{-1,1} //create a 2x2x2 box
> #declare scale_factor = 10;
> box{-1,1} //create a 20x20x20 box
>
> Macros could save and restore the old value:
> #local old_scale = scale_factor;
> #declare scale_factor = new_scale;
> ...
> #declare scale_factor = old_scale;
>
>
> I believe that this would do what you want without any need for defining
> units such as cm, inches or cubits.
>
Yes, it would, and that is what most people do anyway, but it doesn't solve
the issue that it is sometimes nice to know how big something really is
in real-world measurements, such as when trying to duplicate a real-world
object. You can build a room and all the rest using 'normal' units, but
if you then decide to add a glass to the scene, you end up having to play
with the scale, because the original scene didn't take real-world sizes
into account and you aren't entirely sure what an inch is in that context.
Scaling is a crude and inaccurate way to handle precise measurements.
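
For what it's worth, one way to get real measurements without any parser
change is just to pick a convention and declare unit constants in the scene
(a sketch; the names and the choice of one POV-Ray unit = one metre are my
own assumptions, not anything built into the language):

// Assume 1 POV-Ray unit = 1 metre (an arbitrary but fixed convention).
#declare m    = 1.0;
#declare cm   = 0.01 * m;
#declare mm   = 0.001 * m;
#declare inch = 2.54 * cm;

// A drinking glass roughly 7 cm wide and 15 cm tall:
cylinder { 0, y * 15 * cm, 3.5 * cm }

As long as every scene and include file sticks to the same convention, a
glass dropped into the room is already the right size, with no after-the-fact
scaling.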
--
void main () {
call functional_code()
else
call crash_windows();
}