POV-Ray : Newsgroups : povray.general : Isosurface hills
  Isosurface hills (Message 7 to 16 of 16)  
From: Lummox JR
Subject: Re: Isosurface hills
Date: 18 Mar 2000 13:12:14
Message: <38D3C84E.65E7@aol.com>
Chris Huff wrote:
> This is a known problem, one which I wish would get fixed soon. It also
> occurs with things like the bozo pattern(which is actually identical to
> noise3d()).

Exactly. Another problem I knew people had was that when seen from a
distance, the noise3d() function can have a sort of quilted look to it
because of how the noise function behaves at near-integer values. I
turbulated x and z a bit to get rid of that (or at least disguise it by
varying the lines).
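
By way of illustration, a domain warp along these lines hides the
lattice artifacts; this is only a rough, untested sketch in later
POV-Ray 3.x function syntax (not the Superpatch syntax used in this
thread), and the scales and offsets are arbitrary:

#include "functions.inc"   // provides f_noise3d()

// Jitter the x and z lookups with a second, lower-frequency noise call
// so the near-integer lattice of the main call no longer lines up on a
// visible grid.
#declare Warp  = function { f_noise3d(x*0.37, 0, z*0.37) - 0.5 }
#declare Hills = function {
    f_noise3d(x + 0.8*Warp(x, y, z), 0, z + 0.8*Warp(z, y, x))
}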

> > The only good solution I can think of would be to modify the f_func.c
> > file (I've made extensive modifications already, so this isn't entirely
> > out of the question) to include an unclipped noise function. (Too bad
> > the function doesn't allow a user-specified pre-clip scaling to avoid
> > that.) I'm not particularly anxious to try that tack if something better
> > can be achieved with the version I'm already using.
> 
> I hope someone does it soon. I don't know enough about that kind of
> function to do anything, but as I understand it, it would be a fairly
> simple modification in one of the hash functions or in the Noise()
> function.
> What are the modifications you have made? Potential new features?

Modifications I've made so far to the function code:

- Added functions floor(), ceil(), if() (like the C ?: operator),
  pi, clock, atan2(), radians(), degrees()
- int_func3d() was completely rewritten and evaluate_interval()
  changed to allow interval (method 1) testing which wasn't possible
  before
- norm_<function> functions added to evaluate an exact normal where
  possible (using a new "normal" keyword as part of an isosurface,
  parametric, or my own isoblob)

I figure an unclipped noise function would probably be extremely
beneficial--especially since raw noise is centered about 0 instead of
0.5. (I also considered running noise through something like a sigmoid
function to constrain it to 0-1, but I think the result would look just
terrible.)
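
For reference, the sigmoid idea would amount to something like the
following (a sketch in later POV-Ray 3.x function syntax; n stands for
a raw, roughly zero-centred noise value, and the gain of 6 is an
arbitrary choice):

// Squash a roughly zero-centred value smoothly into the 0-1 range.
#declare Squash = function(n) { 1 / (1 + exp(-6*n)) }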

> > I'm going to try some alternate versions like 4*n*(1-n), which should
> > just give me a lot of valleys instead, and see if that works any better.
> > In the meantime, suggestions are still much appreciated.
> 
> You might be able to disguise the plateaus by using multiple calls to
> noise3d() with different values added to the x, y, and z parameters for
> each(to translate the resulting patterns so they don't just reenforce).

Well, I found out that 4*n*(1-n) didn't really work out all that well.
Problem is, of course, that 0.5 (where this hits its maximum) is all too
common a value for noise. Although it doesn't much show in the test
images, this would tend to generate ring-like hills. Squaring or cubing
noise to eliminate the plateaus didn't help either.
My best results came from adding several n*(1-n) values together and
then scaling them to fit within the 0-1 range, but even then the
structures were kind of strange and not very hilly.
I think I'll just go ahead and add an unclipped noise function to the
code; I may as well at this point. I just need something to call it
other than noise3d(), and I'll need to rewrite the interval function to
compensate for the fact that this can reach values beyond the 0-1 limit.
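
For anyone trying the same thing, the approach of adding several
n*(1-n) values together and rescaling looks roughly like this (a sketch
in later POV-Ray 3.x syntax rather than the Superpatch syntax used
here; the offsets, scales, and max_gradient are arbitrary):

#include "functions.inc"   // provides f_noise3d()

// Two translated noise lookups, each pushed through 4*n*(1-n), then
// averaged back into the 0-1 range.
#declare N1 = function { f_noise3d(x*0.5, 0, z*0.5) }
#declare N2 = function { f_noise3d(x*0.5 + 17.3, 0, z*0.5 - 5.1) }
#declare HillHeight = function {
    ( 4*N1(x, y, z)*(1 - N1(x, y, z))
    + 4*N2(x, y, z)*(1 - N2(x, y, z)) ) / 2
}

isosurface {
    function { y - 0.7*HillHeight(x, 0, z) }
    threshold 0
    contained_by { box { <-10, -0.1, -10>, <10, 1, 10> } }
    max_gradient 5
    pigment { rgb <0.3, 0.55, 0.25> }
}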

Lummox JR



From: Chris Huff
Subject: Re: Isosurface hills
Date: 18 Mar 2000 14:13:00
Message: <chrishuff_99-3077B0.14145918032000@news.povray.org>
In article <38D### [at] aolcom>, Lummox JR <Lum### [at] aolcom> wrote:

> Chris Huff wrote:
> > This is a known problem, one which I wish would get fixed soon. It also
> > occurs with things like the bozo pattern(which is actually identical to
> > noise3d()).
> 
> Exactly. Another problem I knew people had was that when seen from a
> distance, the noise3d() function can have a sort of quilted look to it
> because of how the noise function behaves at near-integer values. I
> turbulated x and z a bit to get rid of that (or at least disguise it by
> varying the lines).
> 
> > > The only good solution I can think of would be to modify the f_func.c
> > > file (I've made extensive modifications already, so this isn't 
> > > entirely
> > > out of the question) to include an unclipped noise function. (Too bad
> > > the function doesn't allow a user-specified pre-clip scaling to avoid
> > > that.) I'm not particularly anxious to try that tack if something 
> > > better
> > > can be achieved with the version I'm already using.
> > 
> > I hope someone does it soon. I don't know enough about that kind of
> > function to do anything, but as I understand it, it would be a fairly
> > simple modification in one of the hash functions or in the Noise()
> > function.
> > What are the modifications you have made? Potential new features?
> 
> Modifications I've made so far to the function code:
> 
> - Added functions floor(), ceil(), if() (like the C ?: operator),
>   pi, clock, atan2(), radians(), degrees()

These sound interesting...I notice that they have already been included 
in MegaPOV.


> - int_func3d() was completely rewritten and evaluate_interval()
>   changed to allow interval (method 1) testing which wasn't possible
>   before

Could you clarify this? I don't really understand what you mean.


> - norm_<function> functions added to evaluate an exact normal where
>   possible (using a new "normal" keyword as part of an isosurface,
>   parametric, or my own isoblob)

Is this related to the second normal calculation method, also included 
in MegaPOV?


> I think I'll just go ahead and add an unclipped noise function to the
> code; I may as well at this point. I just need something to call it
> other than noise3d(), and I'll need to rewrite the interval function to
> compensate for the fact that this can reach values beyond the 0-1 limit.

Could you add 4D noise while you are at it? :-)
I think this might be useful for some things, maybe for fire.

-- 
Chris Huff
e-mail: chr### [at] yahoocom
Web page: http://chrishuff.dhs.org/



From: Lummox JR
Subject: Re: Isosurface hills
Date: 18 Mar 2000 16:38:04
Message: <38D3F88C.7F17@aol.com>
Chris Huff wrote:
> > Modifications I've made so far to the function code:
> >
> > - Added functions floor(), ceil(), if() (like the C ?: operator),
> >   pi, clock, atan2(), radians(), degrees()
> 
> These sound interesting...I notice that they have already been included
> in MegaPOV.

Hmmm... I wonder if they're mine.
All those function changes I sent out to Ron Parker for the Superpatch.
The isoblob patch I don't think is in there, though.

> > - int_func3d() was completely rewritten and evaluate_interval()
> >   changed to allow interval (method 1) testing which wasn't possible
> >   before
> 
> Could you clarify this? I don't really understand what you mean.

Interval testing works for most functions, but it doesn't work for a
function within a function. Example:

#declare F1=function {cub(x)+sqr(y)}
#declare F2=function {F1(x,y)-F1(y,z)}

An isosurface using function F2 can't use method 1 because the intervals
for F1 aren't tested properly. I rewrote int_func3d, which was basically
just an empty do-nothing function, to act the way it was supposed to in
the first place.

> > - norm_<function> functions added to evaluate an exact normal where
> >   possible (using a new "normal" keyword as part of an isosurface,
> >   parametric, or my own isoblob)
> 
> Is this related to the second normal calculation method, also included
> in MegaPOV?

That I wouldn't know; I haven't seen MegaPOV or its source. For all I
know it might be based on my patch--if you have the source, my name
should be plastered all over f_func.c and several related files if it's
based on my code.

> > I think I'll just go ahead and add an unclipped noise function to the
> > code; I may as well at this point. I just need something to call it
> > other than noise3d(), and I'll need to rewrite the interval function to
> > compensate for the fact that this can reach values beyond the 0-1 limit.
> 
> Could you add 4D noise while you are at it? :-)
> I think this might be useful for some things, maybe for fire.

Funny, but I was thinking along the exact same lines earlier when I was
modifying the code. It occurred to me that it would be nice to have an
N-dimensional noise function. Many a time I've wanted to have 4D noise
for clouds and such. If I understand the Noise() code well enough,
basically all you'd have to do would be to recurse the sucker. This
could extend noise to as many dimensions as you'd want, if it was done
right. (Of course, some preprocessing would help to prevent a lot of
useless recalculations, since speed is always a factor.)
I really don't know how this could be included in the function code
without greatly modifying it, since the code wasn't intended to go past
3 arguments per function. (The so-called 4-argument functions are
actually just a declared function number plus 3 more arguments.) I've
always wanted to see if that could be arbitrarily extended; the
challenge is definitely intriguing enough for me to take it up sometime.
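
For what it's worth, even without touching Noise(), a crude 4D-ish
noise can be faked at the scene level by blending two 3D lookups taken
from offset "slices" along the fourth coordinate. A rough sketch in
later POV-Ray 3.x syntax (the slice spacing of 100 is arbitrary, and
this is only linear interpolation between slices, not true 4D gradient
noise):

#include "functions.inc"   // provides f_noise3d()

// Blend two 3D noise "slices" pushed apart along z by Slice units,
// using the fractional part of w as the blend weight.
#declare Slice = 100;
#declare Noise4D = function(x, y, z, w) {
      (1 - (w - floor(w))) * f_noise3d(x, y, z + floor(w)*Slice)
    + (w - floor(w)) * f_noise3d(x, y, z + (floor(w) + 1)*Slice)
}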

BTW, I've gone ahead and added the Noise_raw function to texture.c. I
gave it a function name "bozo", to avoid having to add one more keyword
to the parser.
I added two new functions while I was in there:

range(v,min,max)  -- basically a shortcut for min(max(v,min),max)
bozo(x,y,z) -- raw noise

Lummox JR



From: Chris Huff
Subject: Re: Isosurface hills
Date: 18 Mar 2000 17:28:46
Message: <chrishuff_99-F7DCC7.17304618032000@news.povray.org>
In article <38D### [at] aolcom>, Lummox JR <Lum### [at] aolcom> wrote:

> Hmmm... I wonder if they're mine.

Yep.


> All those function changes I sent out to Ron Parker for the Superpatch.
> The isoblob patch I don't think is in there, though.

In where? MegaPOV, or the source changes you sent to Ron? The isoblob 
patch is included in MegaPOV, if that is what you meant.


> That I wouldn't know; I haven't seen MegaPOV or its source. For all I
> know it might be based on my patch--if you have the source, my name
> should be plastered all over f_func.c and several related files if it's
> based on my code.

Ah, ok. I somehow had the impression you were using MegaPOV source as 
your base. I found the code neatly arranged in a single block starting 
with this:
#ifdef IsoBlobPatch
/* Functions added by Lummox JR, June 1999 */


> I really don't know how this could be included in the function code
> without greatly modifying it, since the code wasn't intended to go past
> 3 arguments per function. (The so-called 4-argument functions are
> actually just a declared function number plus 3 more arguments.) I've
> always wanted to see if that could be arbitrarily extended; the
> challenge is definitely intriguing enough for me to take it up sometime.

I would certainly be interested in the results. It would also be nice to 
allow user defined functions to have a variable number of parameters, 
how possible do you think that would be?


> BTW, I've gone ahead and added the Noise_raw function to texture.c. I
> gave it a function name "bozo", to avoid having to add one more keyword
> to the parser.

Sounds interesting, although the use of "bozo" would have to be changed 
before it is distributed...
Maybe call it rnoise()?


> I added two new functions while I was in there:
> 
> range(v,min,max)  -- basically a shortcut for min(max(v,min),max)

Why not call it "clamp()"? I think it is more explanatory.
BTW, I use this code when I need this function, although having it hard 
coded is probably a good idea:
#declare clamp = function {min(max(x, y), z)}
clamp(value, minimum, maximum)
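
For instance, with the arguments in that order:

clamp(1.3, 0, 1)   -- gives 1
clamp(-0.2, 0, 1)  -- gives 0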

-- 
Chris Huff
e-mail: chr### [at] yahoocom
Web page: http://chrishuff.dhs.org/



From: Lummox JR
Subject: Function expansion and optimization
Date: 19 Mar 2000 00:15:34
Message: <38D463C7.5117@aol.com>
Chris Huff wrote:
> Ah, ok. I somehow had the impression you were using MegaPOV source as
> your base. I found the code neatly arranged in a single block starting
> with this:
> #ifdef IsoBlobPatch
> /* Functions added by Lummox JR, June 1999 */

Hmm. That's interesting; I'll have to check out the MegaPOV code at some
point.
I think the additional functions can easily be separated from the
isoblob sections at any rate, though. The isoblob code is fairly
independent of the function changes I made.

> > I really don't know how this could be included in the function code
> > without greatly modifying it, since the code wasn't intended to go past
> > 3 arguments per function. (The so-called 4-argument functions are
> > actually just a declared function number plus 3 more arguments.) I've
> > always wanted to see if that could be arbitrarily extended; the
> > challenge is definitely intriguing enough for me to take it up sometime.
> 
> I would certainly be interested in the results. It would also be nice to
> allow user defined functions to have a variable number of parameters,
> how possible do you think that would be?

I'd really love to see that too. What would probably be a lot more
useful, however, would be a set of functions to work on vectors.

Another change I've contemplated before is the addition of two new
functions: precalc() and expr(), which would basically be used to give
the function "local variables" of a sort. Each function could be
analyzed after parsing and reorganized so that repeated or redundant
calculations are done just once, up front. As an example of what I mean:

sqrt(x^2+y^2)+sqrt(x^2+z^2)+sqrt(y^2+z^2)

There are several very simple changes that can be made here:

- Any ^2 reference can be changed to sqr(), which will be faster.
  (Similarly, ^0.5 and ^3 can be changed to sqrt() and cub().)
- sqr(x), sqr(y), and sqr(z) can all be precalculated and then used
  later.

The optimized expression, although it looks more complicated, would save
considerable evaluation time:

precalc(sqr(z),
  precalc(sqr(y),
    precalc(sqr(x),
      sqrt(expr(3)+expr(2))+sqrt(expr(3)+expr(1))+sqrt(expr(2)+expr(1))
  )))

And as you can see, expr(N) references the Nth cached value; basically
all the code has to do is check the very bottom of the stack and work
upward, since the precalc() function has already calculated the first
value. (Actually, the code I've got in mind is 0-based, not 1-based, but
you get the idea.)
These functions wouldn't really be user-accessible; the possibility of
screwing things up is too great. The resulting code might or might not
be longer--sometimes length will be saved, and other times the overhead
of the extra numbers will take a toll.
I've written some test code (which I haven't even looked at in months)
that will analyze the function for a repeated expression, keeping track
of whether it actually has a complete expression or not, and whether the
constants match, etc. It looks for the longest expressions first, to
maximize the savings. And if those can be further broken down, so be it. 

> > BTW, I've gone ahead and added the Noise_raw function to texture.c. I
> > gave it a function name "bozo", to avoid having to add one more keyword
> > to the parser.
> 
> Sounds interesting, although the use of "bozo" would have to be changed
> before it is distributed...
> Maybe call it rnoise()?

I dunno; I think "bozo" makes sense enough for the purpose. People
writing a function aren't likely to confuse it with the pigment, which
matches noise3d() anyway. I'd prefer a better name for it myself,
though, so perhaps rnoise would be a very worthwhile change. I chose
this one because it's a pain in the butt to add tokens to the parser.

> > I added two new functions while I was in there:
> >
> > range(v,min,max)  -- basically a shortcut for min(max(v,min),max)
> 
> Why not call it "clamp()"? I think it is more explanatory.
> BTW, I use this code when I need this function, although having it hard
> coded is probably a good idea:
> #declare clamp = function {min(max(x, y), z)}
> clamp(value, minimum, maximum)

Clamp() is explanatory, sure, although I think range() is equally
explanatory in this case. Actually "range" was the first word that came
to mind, since the goal is to constrain values to a specific range. The
beauty of using range is that its purpose is fairly evident, and it
already has a token in the parser. No token modifications required.

I'd like some open critique of my function-optimization idea, really. I
haven't tested any of the code I have in mind, though when I developed
it I did a number of mental run-throughs. And of course more can be
done.
I think any of these optimizations are possible and even desirable,
however:

- Any expression made more than once (or a function with all the same
  arguments) can be calculated once and re-used. The longer the
  expression or the more often it is used, the more likely it is to
  save both space and time. (Testing for a complete expression is
  difficult, since you have to check how each function affects the
  stack.)
- Any operation based solely on constants can be evaluated at
  optimization time and reduced to a single constant. Thus sqrt(2)
  and even a simple -(1) needn't be calculated each time.
- n+(-m) can be changed to n-m
- n*(a/m) can be changed to (n*a)/m by association
- n*(-1) or n/(-1) can be changed to (-n)
- n*a+n*b can be replaced with n*(a+b)
- n*n or n^2 can be replaced with sqr(n)
- sqr(n)*n or n^3 can be replaced with cub(n)
- n^a*n^b can be replaced with n^(a+b)

This basic design won't catch some simple things like (x-y)=-(y-x), but
it's a good starting point. I'd be willing to bet that a lot of
functions use the same complex expression more than once (mine often
do), and as such any optimization at parse-time would be of enormous
benefit down the line.

Lummox JR



From: Chris Huff
Subject: Re: Function expansion and optimization
Date: 19 Mar 2000 12:48:30
Message: <chrishuff_99-A9EC64.12503019032000@news.povray.org>
In article <38D### [at] aolcom>, Lummox JR <Lum### [at] aolcom> wrote:

> > I would certainly be interested in the results. It would also be nice 
> > to
> > allow user defined functions to have a variable number of parameters,
> > how possible do you think that would be?
> 
> I'd really love to see that too. What would probably be a lot more
> useful, however, would be a set of functions to work on vectors.

Hmm, what would this syntax look like? Would it be possible to make a 
special sort of function which can handle vectors and colors and which 
outputs a color? Kind of a simplified shader, with functions like 
trace(), eval_pigment() and eval_pattern(), and possibly access to 
functions like reflect and diffuse, and maybe access to a list of light 
sources and the camera. It wouldn't be as complex as a real shader, but 
would be very close to what my "shader" patch is theoretically supposed 
to do(I never made a lot of progress and haven't worked on it lately, it 
might be better to work from the isosurface function code).


> These functions wouldn't really be user-accessible; the possibility of
> screwing things up is too great. The resulting code might or might not
> be longer--sometimes length will be saved, and other times the overhead
> of the extra numbers will take a toll.

This looks like a very good idea, and would be nearly(completely?) 
invisible to the user except as a speed increase.


> I dunno; I think "bozo" makes sense enough for the purpose. People
> writing a function aren't likely to confuse it with the pigment, which
> matches noise3d() anyway. I'd prefer a better name for it myself,
> though, so perhaps rnoise would be a very worthwhile change.

I just don't think it is a good idea to reuse that keyword in that way, 
just my opinion.
What if the patterns are someday added in as functions, so "bozo(x,y,z)" 
evaluates the bozo pattern at that position? Things like "method" make 
sense to reuse(for media method, isosurface solving method, proximity 
calculations, etc), but bozo doesn't(except for both being a noise 
function).


> I chose this one because it's a pain in the butt to add tokens to the 
> parser.

You must mean the isosurface parsing code, right? It only takes a line 
in tokenize.c and a line in parse.h to add a token, I wouldn't call 
those two lines a "pain in the butt". However, I have never added 
anything to the isosurface code, I don't know how difficult it is.


> Clamp() is explanatory, sure, although I think range() is equally
> explanatory in this case. Actually "range" was the first word that came
> to mind, since the goal is to constrain values to a specific range. The
> beauty of using range is that its purpose is fairly evident, and it
> already has a token in the parser. No token modifications required.

I think "range" implies more of scaling a value to fit in a certain 
range. It would be something like this:
range(value, source_min, source_max, dest_min, dest_max)
Values between source_min and source_max are scaled to between dest_min 
and dest_max.
The word "clamp" specifies what the function does, "clip()" would work 
just as well. Actually, I would name it ClipToRange(), but that is a bit 
long and doesn't fit with the naming conventions of POV-Ray. :-)
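
A sketch of that five-argument rescaling as a user function, in later
POV-Ray 3.x syntax (the parameter names here are illustrative only):

#declare Remap = function(V, SMin, SMax, DMin, DMax) {
    DMin + (V - SMin) * (DMax - DMin) / (SMax - SMin)
}
// e.g. Remap(0.25, 0, 1, -1, 1) gives -0.5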


> I'd like some open critique of my function-optimization idea, really. 
> I haven't tested any of the code I have in mind, though when I 
> developed it I did a number of mental run-throughs. And of course 
> more can be done.
> I think any of these optimizations are possible and even desirable,
> however:
> ...
> 
> This basic design won't catch some simple things like (x-y)=-(y-x), but
> it's a good starting point. I'd be willing to bet that a lot of
> functions use the same complex expression more than once (mine often
> do), and as such any optimization at parse-time would be of enormous
> benefit down the line.

These sound like good ideas, and might clear up some of the 
inconsistencies with the function solving that have appeared (I can't 
think of any right now, but there were a couple of threads in 
povray.unofficial.patches and/or .advanced-users).

-- 
Chris Huff
e-mail: chr### [at] yahoocom
Web page: http://chrishuff.dhs.org/



From: Lummox JR
Subject: Re: Function expansion and optimization
Date: 19 Mar 2000 13:57:53
Message: <38D52479.5533@aol.com>
Chris Huff wrote:
> > I'd really love to see that too. What would probably be a lot more
> > useful, however, would be a set of functions to work on vectors.
> 
> Hmm, what would this syntax look like? Would it be possible to make a
> special sort of function which can handle vectors and colors and which
> outputs a color? Kind of a simplified shader, with functions like
> trace(), eval_pigment() and eval_pattern(), and possibly access to
> functions like reflect and diffuse, and maybe access to a list of light
> sources and the camera. It wouldn't be as complex as a real shader, but
> would be very close to what my "shader" patch is theoretically supposed
> to do(I never made a lot of progress and haven't worked on it lately, it
> might be better to work from the isosurface function code).

Therein lies the rub. Syntax would be a bear. How would such a function
work? How would you make up for the lack of support for simple
operations such as / or sin()?
Personally, I'd like to see a sort of function-aware color map that can
do the work. For example, imagine if you could code a pigment as such:

pigment {
	function rgb
	function {x-floor(x)}
	function {y-floor(y)}
	function {z-floor(z)}
	}

In other words, you could tell the pigment that it expects to see three
functions, representing red, green, and blue, respectively. I would
expect a similar syntax could handle rgbt, rgbf, etc.
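
A similar effect can be had in later POV-Ray 3.x by averaging three
function pigments, one per channel (each channel's color_map runs up to
3 to undo the averaging). A rough sketch, and not the syntax proposed
above:

pigment {
    average
    pigment_map {
        [1 function { x - floor(x) } color_map { [0 rgb 0] [1 rgb <3,0,0>] }]
        [1 function { y - floor(y) } color_map { [0 rgb 0] [1 rgb <0,3,0>] }]
        [1 function { z - floor(z) } color_map { [0 rgb 0] [1 rgb <0,0,3>] }]
    }
}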

> > These functions wouldn't really be user-accessible; the possibility of
> > screwing things up is too great. The resulting code might or might not
> > be longer--sometimes length will be saved, and other times the overhead
> > of the extra numbers will take a toll.
> 
> This looks like a very good idea, and would be nearly(completely?)
> invisible to the user except as a speed increase.

Precisely. The code for function optimization is embryonic but needs a
lot of constructive criticism to get flying, so I'll be posting to
povray.programming with the specifics for that.

> > I dunno; I think "bozo" makes sense enough for the purpose. People
> > writing a function aren't likely to confuse it with the pigment, which
> > matches noise3d() anyway. I'd prefer a better name for it myself,
> > though, so perhaps rnoise would be a very worthwhile change.
> 
> I just don't think it is a good idea to reuse that keyword in that way,
> just my opinion.
> What if the patterns are someday added in as functions, so "bozo(x,y,z)"
> evaluates the bozo pattern at that position? Things like "method" make
> sense to reuse(for media method, isosurface solving method, proximity
> calculations, etc), but bozo doesn't(except for both being a noise
> function).

That's possible for most functions, but since the bozo pattern matches
noise3d() anyway, there'd be no point in making it a function.
Still I see your point. Certainly the only claim it has to being a good
choice for a keyword is that it's known to be a noise function, and its
usage is a little confusing and strange.

> > I chose this one because it's a pain in the butt to add tokens to the
> > parser.
> 
> You must mean the isosurface parsing code, right? It only takes a line
> in tokenize.c and a line in parse.h to add a token, I wouldn't call
> those two lines a "pain in the butt". However, I have never added
> anything to the isosurface code, I don't know how difficult it is.

No, I meant the actual parser. Problem is, adding those two lines means
that just about every file has to be recompiled, and it just complicates
the parsing process to have that many more tokens.

> > Clamp() is explanatory, sure, although I think range() is equally
> > explanatory in this case. Actually "range" was the first word that came
> > to mind, since the goal is to constrain values to a specific range. The
> > beauty of using range is that its purpose is fairly evident, and it
> > already has a token in the parser. No token modifications required.
> 
> I think "range" implies more of scaling a value to fit in a certain
> range. It would be something like this:
> range(value, source_min, source_max, dest_min, dest_max)
> Values between source_min and source_max are scaled to between dest_min
> and dest_max.
> The word "clamp" specifies what the function does, "clip()" would work
> just as well. Actually, I would name it ClipToRange(), but that is a bit
> long and doesn't fit with the naming conventions of POV-Ray. :-)

clip() would have been my first choice, but the token wasn't available.
I suppose if I'm changing bozo() to rnoise() then I could just add
"clip" anyway, but it seems a shame to waste a perfectly viable token
that's more or less clear enough for the purpose.

> > I'd like some open critique of my function-optimization idea, really.
> > I haven't tested any of the code I have in mind, though when I
> > developed it I did a number of mental run-throughs. And of course
> > more can be done.
> > I think any of these optimizations are possible and even desirable,
> > however:
> > ...
> >
> > This basic design won't catch some simple things like (x-y)=-(y-x), but
> > it's a good starting point. I'd be willing to bet that a lot of
> > functions use the same complex expression more than once (mine often
> > do), and as such any optimization at parse-time would be of enormous
> > benefit down the line.
> 
> These sound like good ideas, and might clear up some of the
> inconsistencies with the function solving that have appeared (I can't
> think of any right now, but there were a couple of threads in
> povray.unofficial.patches and/or .advanced-users).

Probably a lot of those inconsistencies have to do with the fact that
int_func3d() was never properly coded before I took a whack at it; it
was really just an empty shell of a routine, and that shell was in fact
massively buggy. (One thing I didn't explore yet in the optimization
code was to inline short functions if possible, which would also have
dealt with that problem.)
I know I encountered one, though, with my normal-solving calculations
for atan2(), one of the functions I added. For some reason, the code,
although mathematically it was perfectly sound, screwed up dramatically
in a test scene, and I was never able to find the problem.

Anyway, more on this in povray.programming, since this has become more
of a programming discussion.

Lummox JR



From: Chris Huff
Subject: Re: Function expansion and optimization
Date: 19 Mar 2000 14:27:29
Message: <chrishuff_99-B18116.14292319032000@news.povray.org>
In article <38D### [at] aolcom>, Lummox JR <Lum### [at] aolcom> wrote:

> Personally, I'd like to see a sort of function-aware color map that can
> do the work. For example, imagine if you could code a pigment as such:
> 
> pigment {
> 	function rgb
> 	function {x-floor(x)}
> 	function {y-floor(y)}
> 	function {z-floor(z)}
> 	}
> 
> In other words, you could tell the pigment that it expects to see three
> functions, representing red, green, and blue, respectively. I would
> expect a similar syntax could handle rgbt, rgbf, etc.

I suppose this is probably the only feasible way to go...one alternative 
would be to have vector_function, float_function, color_function, etc as 
separate kinds of functions. There is just something I don't like about 
the multiple-function method, but it would be the easiest and simplest 
way of implementing this feature.


> clip() would have been my first choice, but the token wasn't available.
> I suppose if I'm changing bozo() to rnoise() then I could just add
> "clip" anyway, but it seems a shame to waste a perfectly viable token
> that's more or less clear enough for the purpose.

Well, what if that function gets added to the list of functions usable 
outside of isosurface functions? (I think everything available in 
isofunctions(?) should be available outside them as well, and most of 
what is available outside should be available inside them.)
Since the range keyword is already in use (I know it is used for 
photons, maybe also for other things), you would have the same function 
with different names depending on where it is used (it would be clamp 
or clip outside isofunctions, and range inside them). Bad idea.


> Anyway, more on this in povray.programming, since this has become more
> of a programming discussion.

Ok. I will look at the programming issues in .programming...

-- 
Chris Huff
e-mail: chr### [at] yahoocom
Web page: http://chrishuff.dhs.org/



From: Ron Parker
Subject: Re: Isosurface hills
Date: 19 Mar 2000 16:46:55
Message: <slrn8daigj.v8.ron.parker@parkerr.fwi.com>
On Sat, 18 Mar 2000 17:30:46 -0500, Chris Huff wrote:
>In article <38D### [at] aolcom>, Lummox JR <Lum### [at] aolcom> wrote:
>
>> Hmmm... I wonder if they're mine.
>
>Yep.
>
>
>> All those function changes I sent out to Ron Parker for the Superpatch.
>> The isoblob patch I don't think is in there, though.
>
>In where? MegaPOV, or the source changes you sent to Ron? The isoblob 
>patch is included in MegaPOV, if that is what you meant.

There's really no difference, y'know.  My Superpatch 3.1g was swallowed
whole by MegaPOV, including the isoblob and other patches Lummox JR 
sent me.

-- 
These are my opinions.  I do NOT speak for the POV-Team.
The superpatch: http://www2.fwi.com/~parkerr/superpatch/
My other stuff: http://www2.fwi.com/~parkerr/traces.html



From: Daren Scot Wilson
Subject: Re: Isosurface hills
Date: 24 Mar 2000 10:56:06
Message: <38db9016@news.povray.org>
When creating an outdoor scene, "Floating Skyscrapers" (unfinished, very!)
last year, I wanted nice rolling hills in the distance. Not too symmetrical,
not too uniform, with ridges and hills and broad shallow depressions and
rises. But most of all, I needed to know altitude(x,y) so I could place
buildings, cars, etc.

After experimenting with some ideas, I settled on a mesh on an x,y grid with
z values determined by a hill(x,y) function. This hill function is a sum of
many individual hill component functions. It worked great - but due to a
compiler bug, it didn't render right and I gave up. Turns out that the
compiler bug was due to a hardware problem - almost everything ran 99.999% of
the time, but anyway that's a long story... (It may also explain the colour
dispersion bugs I had back when I was working on that.)


I gave the hill component function, a macro named bulge(), lots of
parameters to give it variation and asymmetry. I write these functions
intuitively, but it's based on a Gaussian function, exp(-r^2).
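
A rough sketch of that kind of bulge(), in POV-Ray macro syntax (the
parameters and placements here are made up; the real macro is in the
FSLAND file mentioned below):

// One Gaussian hill component: height H, centred at (CX,CY), width R.
#macro Bulge(X, Y, CX, CY, H, R)
    ( H * exp( -( pow(X - CX, 2) + pow(Y - CY, 2) ) / pow(R, 2) ) )
#end

// The z value (altitude) at grid point (X,Y) as a sum of bulges, so
// buildings and cars can be placed on the terrain at a known height.
#macro Altitude(X, Y)
    ( Bulge(X, Y,  0,  0, 2.0, 5.0)
    + Bulge(X, Y,  8, -3, 1.2, 3.5)
    + Bulge(X, Y, -6,  7, 0.8, 2.0) )
#end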

 
I'll post a copy of my FSLAND.POVI  file in the scene-files group.




> I'm hoping to make an image which will mostly be a series of grassy 
> hills in the rain. I've got the clouds, I think, and the rain's a task 
> I'm willing to take a shot at. Grass will be a nightmare of its own and 
> obviously a lot of fudging is required. I've started working with the 
> hills, but that's where I've run into the first problems. 
> I really don't want to generate any height maps for this one. My 
> preference is to use an isosurface (I'm using the Superpatch) with a 
> kind of hilly function. I thought of multipying a couple of sine waves 
> together, but the results on that have so far been pretty terrible. 
> ... 
> Lummox JR 

-- 
Daren Scot Wilson
dar### [at] pipelineocm
www.newcolor.com



