In article <fq5gev4mgtjfim61hht3j2aqmc7drejnj7@4ax.com>, abx### [at] abx art pl
says...
> On Wed, 11 Jun 2003 13:29:42 -0700, "Ray Gardener" <ray### [at] daylongraphics com>
> wrote:
> > Yeah, that'll work, but what if a person
> > receives a scene from someone else that
> > hasn't been set up that way and he doesn't
> > know how to edit scenes?
>
> Sorry, but imo editing text files is a basic skill nowadays.
>
> > I'd say it's a bug: the quality INI setting
> > is intended to render scenes faster when
> > the quality is lower, but it doesn't
> > actually do that, because it lets some
> > high-quality features remain enabled.
>
> And what if I want raw colors (quality=1) with focal blur, to test focal blur?
> And what if I want media without focal blur, to test media?
> How could I combine a higher Quality setting with a lower one?
>
> > Being able to work around the problem
> > isn't a solution, because expending the
> > effort for the workaround
>
> It is not a workaround. That is the purpose of conditional preprocessor
> directives: to make parts of a scene optionally parsed. Do you write C++
> programs? Can't you pass the compiler a preprocessor definition? Is that a
> workaround for defining constants in some kind of config.h, or a widely used
> alternative? Would every programmer choose the same name for their
> preprocessor switch? SDL is a language, not a GUI.
>
> > defeats the
> > very purpose of the Quality setting: to
> > provide an easy way to globally render
> > scenes faster without having to care
> > about the scene's content or code.
>
> And it works that way. Blur is not a global property of the scene; it is a
> feature of the camera. I consider that consistent.
>
Here's an idea: how about a simple way for the SDL to find out what quality
setting is in use? Then you could use *that* to automatically turn off the
things you don't want. I can see cases, like the focal blur one you mention,
where it makes sense to leave things as they are, but it makes no sense to me
to have to edit both the INI file and the scene files just to turn everything
off for a particular test. However, as far as I can tell from a brief search,
none of the INI settings can be checked from inside the SDL. Not allowing them
to be changed makes sense, but not even being able to read their current
values?!
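For what it's worth, the conditional-parsing approach being argued about looks
roughly like this in SDL. This is a minimal sketch; the switch name `use_blur`
is my own invention, passed in through the standard `Declare=` INI option:

```pov
// Scene file: the focal blur block is parsed only when the switch is set.
// Enable it from the INI file (or command line) with:  Declare=use_blur=1
camera {
  location <0, 1, -5>
  look_at  <0, 0, 0>
  #ifdef (use_blur)      // true only if the INI/command line declared it
    aperture     0.4
    focal_point  <0, 0, 0>
    blur_samples 20
  #end
}
```

With no `Declare=use_blur` in the INI, the blur keywords are skipped entirely
at parse time, so the scene renders as if the camera had no focal blur.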
--
void main () {
call functional_code()
else
call crash_windows();
}