Github repository with useful macros (Message 11 to 20 of 41)
From: Tor Olav Kristensen
Subject: Re: Github repository with useful macros
Date: 19 May 2021 16:05:00
Message: <web.60a56e87985098ff8e52cc8789db30a9@news.povray.org>
Alain Martel <kua### [at] videotronca> wrote:
>
>
> > At the moment the macros are made for POV-Ray v3.7,
> > but they should also work in v3.8.
>
> If a macro is made for version 3.7, then, it will run in version 3.8.
> To my knowledge, anything that works in version 3.3 to 3.7 will run in 3.8.

Hi Alain

Yes, that is my understanding too. But I'm hesitant to claim that they work in
v3.8 before I or someone else has actually tested them there.

--
Tor Olav
http://subcube.com
https://github.com/t-o-k



From: William F Pokorny
Subject: Re: Github repository with useful macros
Date: 20 May 2021 09:35:35
Message: <60a665a7$1@news.povray.org>
On 5/19/21 3:55 PM, Tor Olav Kristensen wrote:
> I have also rewritten a few of the macros slightly and added some new ones.
> 

I like how the macros using functions of transforms & inverse transforms 
ended up. :-) Those I didn't expect you to change, and yet what you did 
looks to me like a clever approach to handling a macro's dynamically 
defined functions.

I'll have to spend more time thinking about it and the actual inner 
workings. I suspect you've gotten to something close to what happens 
with prod and sum internally in the VM.

>> A millsion calls.
>> ...
Yes, I'm forever inventing new words! :-) I've not a clue how I arrive 
at some of them.

---
I noticed in your vectors.inc the indenting occasionally varies. It 
looks like you mostly use increments of four - which is very common - 
but OrthogonalVector, for example, uses three. I'd guess that's because 
it was done that way in math.inc?

---
I've been looking more at your Mean, Variance, Minimum, Maximum, 
PrintStatistics, and Histogram suggestions. The first issue I hit was 
testing your comments about vectors.

// Does also work for vectors
#macro Mean(Values)
...

However, dimension_size() expects an array identifier. Is there some 
trick to getting vectors to work with dimension_size()?  As far as I 
know, the dimension* SDL keywords work only with array identifiers and 
'defined on call' arrays. The latter meaning:

    Mean(array {0.1,0.2,0.3,0.4,0.5})

is OK to do. Yes, this last bit is a little more than what our current 
documentation says you can do.
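
For anyone following along, here is a minimal sketch of how a Mean-style 
macro can lean on dimension_size() so that it accepts both a named array 
identifier and an array 'defined on call'. The name Mean_Sketch and its 
body are made up for illustration, not taken from the repository:

// Sketch only - not the repository code:
#macro Mean_Sketch(Values)
    #local N = dimension_size(Values, 1);
    #local Sum = Values[0];
    #local I = 1;
    #while (I < N)
        #local Sum = Sum + Values[I];
        #local I = I + 1;
    #end
    (Sum/N)
#end

// Both of these parse:
// #declare SomeFloats = array[5] {0.1, 0.2, 0.3, 0.4, 0.5};
// #declare M1 = Mean_Sketch(SomeFloats);
// #declare M2 = Mean_Sketch(array {0.1, 0.2, 0.3, 0.4, 0.5});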

---
I've recently moved the vector analysis macros in math.inc to a 
vectoranalysis.inc include - which itself won't be a 'core' include. 
I'm thinking the array statistics would be a better fit in a similar 
arraystatistics.inc include - again not a 'core' include.

With your vectors.inc, on the other hand, I'm starting to lean toward it 
being a 'core' include file. We'd need to resolve collisions with 
math.inc macros as a start.

---
Aside: Instead of moving some of the contents of arraycoupleddf3s.inc to 
arrays.inc, I went the other way! I moved the ARRAY_WriteDF3 macro into 
arraycoupleddf3s.inc alongside the ARRAY_ReadDF3 macro already therein, 
along with some other DF3-related macros. I also dropped the ARRAY_ 
prefixes. I now plan to treat arraycoupleddf3s.inc as a core include 
file. I believe reading and writing DF3s should be doable from SDL as 
'POV-Ray' is shipped.

Bill P.



From: Tor Olav Kristensen
Subject: Re: Github repository with useful macros
Date: 20 May 2021 14:25:00
Message: <web.60a6a942985098ff786fa93e89db30a9@news.povray.org>
William F Pokorny <ano### [at] anonymousorg> wrote:
>...
> I noticed in your vectors.inc the indenting occasionally varies. It
> looks like you mostly use increments of four - which is very common -
> but OrthogonalVector, for example, uses three. I'd guess because it was
> that in math.inc ?

Yes, sorry about that. I hope I've fixed it now.

Years ago, when I had a tiny monitor, I used 2 spaces for indentation. Now I
prefer 4. 3 is very odd!  And I really hate indentation with TABs. (Because
stuff jumps around - and very seldom ends up looking consistent and nice.)
Go's standard formatter, gofmt, uses TABs for indentation, so it's not very
likely that I will ever love programming in Go.


> I've been looking more at your Mean, Variance, Minimum, Maximum,
> PrintStatistics, and Histogram suggestions.

Good. I'm not sure if I like that the minimum and maximum values in the
Histogram macro always end up at the "edges" of the outer bins. Perhaps someone
well-versed in statistics can give some advice here.


> First, issue I hit was
> testing your comments about vectors.
>
> // Does also work for vectors
> #macro Mean(Values)
> ...
>
> However, dimension_size() expects an array identifier. Is there some
> trick to getting vectors to work with dimension_size()?  As far as I
> know, the dimension* SDL keywords work only with array identifiers and
> 'defined on call' arrays. The latter meaning:
>
>     Mean(array {0.1,0.2,0.3,0.4,0.5})
>
> is OK to do. Yes, this last bit is a little more than what our current
> documentation says you can do.
>...

Try this:

#declare SomeVectors2D =
    array[5] {
        < 1,  5>,
        <-2, -2>,
        < 6,  3>,
        < 0,  2>,
        < 3, -1>
    }
;
#declare vMean2D = Mean(SomeVectors2D);
#declare vVariance2D = Variance(SomeVectors2D);
#debug "\n\n"
#debug concat("<", vstr(2, vMean2D, ", ", 0, -1), ">")
#debug "\n"
#debug concat("<", vstr(2, vVariance2D, ", ", 0, -1), ">")
#debug "\n\n"

#declare SomeVectors3D =
    array[5] {
        < 1,  5,  3>,
        <-2, -2,  0>,
        < 6,  3, -4>,
        < 0,  2,  2>,
        < 3, -1, -5>
    }
;
#declare vMean3D = Mean(SomeVectors3D);
#declare vVariance3D = Variance(SomeVectors3D);
#debug "\n\n"
#debug concat("<", vstr(3, vMean3D, ", ", 0, -1), ">")
#debug "\n"
#debug concat("<", vstr(3, vVariance3D, ", ", 0, -1), ">")
#debug "\n\n"

#error "Finished"

- and so on...

--
Tor Olav
http://subcube.com
https://github.com/t-o-k



From: Cousin Ricky
Subject: Re: Github repository with useful macros
Date: 20 May 2021 17:46:37
Message: <60a6d8bd$1@news.povray.org>
On 2021-05-17 12:58 PM (-4), Alain Martel wrote:

> 
>> At the moment the macros are made for POV-Ray v3.7,
>> but they should also work in v3.8.
> 
> If a macro is made for version 3.7, then, it will run in version 3.8.
> To my knowledge, anything that works in version 3.3 to 3.7 will run in 3.8.

That's not always the case.  See, for example:
  https://news.povray.org/5cb13ce4%40news.povray.org

Also, where do you find POV-Ray 3.3?



From: William F Pokorny
Subject: Re: Github repository with useful macros
Date: 21 May 2021 04:17:48
Message: <60a76cac$1@news.povray.org>
On 5/20/21 2:24 PM, Tor Olav Kristensen wrote:
> Try this:
> 
> #declare SomeVectors2D =
>      array[5] {
>          < 1,  5>,
>          <-2, -2>,
>          < 6,  3>,
>          < 0,  2>,
>          < 3, -1>
>      }
> ;
> #declare vMean2D = Mean(SomeVectors2D);

Ah, OK! I was thinking only in terms of sets of floats. Still arrays, 
but arrays of vectors.

---
With the introduction of mixed arrays in v3.8, I guess that so long as we 
are willing to work in terms of the largest defined vector, and to take the 
unspecified components of shorter vectors as zero, things will work out.

There is an exposure if someone drops in a float in the wrong place. We 
get two different answers with a set-up like the following: the Mean of 
SomeVectorsA is OK if you take the float as being promoted to a 
4-component vector, but that of SomeVectorsB is not, because at the time 
of the addition the parser doesn't yet know that the maximum vector size 
in the mixed array is four.

#declare SomeVectorsA =
     array mixed[5] {
         < 1,  5>,
         <-2, -2, 0, -1>,
         < 6,  3>,
         < 0,  2, 0, +2>,
         0.123
     }
#declare SomeVectorsB =
     array mixed[5] {
         0.123,
         < 1,  5>,
         <-2, -2, 0, -1>,
         < 6,  3>,
         < 0,  2, 0, +2>,
     }
...
Mean <1.0246,1.6246,0.0246,0.2246,0.0000> SomeVectorsA
Mean <1.0246,1.6246,0.0000,0.2000,0.0000> SomeVectorsB

Would it be enough to recommend against mixed array usage? Or might we 
test for mixed arrays in some way and stop? Though I have no idea how 
to do such a thing at the moment.

Aside 1: If someone drops a non-numeric thing into a mixed array the 
parser appropriately generates an error.

Aside 2: If someone uses a color, the length always goes to the full 
five-component color vector. It's possible with colors to mix lengths of 
vector specification - even in the traditional non-mixed array case.

     rgb 1,           // promoted to 1,1,1,0,0
     color 1,         // warning. promoted to 1,1,1,1,1
     rgb< 0,  2>,     // promoted to five components. Trailing 0s
     rgb< 3, -1, 1>   // Same
     rgb< 1, 2, 3, 4> // Warning and 4th component used. 5th 0.
     rgbf<1, 2, 3, 4> // 4th used, no warning. 5th 0.

Maybe the behavior is all just what it is - user beware?

Bill P.



From: Tor Olav Kristensen
Subject: Re: Github repository with useful macros
Date: 21 May 2021 05:10:00
Message: <web.60a77745985098ff8e52cc8789db30a9@news.povray.org>
William F Pokorny <ano### [at] anonymousorg> wrote:
> On 5/20/21 2:24 PM, Tor Olav Kristensen wrote:
> > Try this:
> >
> > #declare SomeVectors2D =
> >      array[5] {
> >          < 1,  5>,
> >          <-2, -2>,
> >          < 6,  3>,
> >          < 0,  2>,
> >          < 3, -1>
> >      }
> > ;
> > #declare vMean2D = Mean(SomeVectors2D);
>
> Ah, OK! I was thinking only in terms of sets of floats. Still arrays,
> but arrays of vectors.
>
> ---
> With the introduction of mixed arrays in v3.8, I guess, so long as we
> are willing to work in terms of the largest defined vector and take the
> unspecified components of shorter vectors as being zero, things will
> work out.
>
> There is an exposure if someone drops in a float in the wrong place. We
> get two different answers with a set up like the following where the
> Mean of the SomeVectorsA is OK if you take the float as being promoted
> to a 4 component vector, but that of SomeVectorsB is not because at the
> time of the addition the parser doesn't know the max vector size in the
> mixed array is four.
>
> #declare SomeVectorsA =
>      array mixed[5] {
>          < 1,  5>,
>          <-2, -2, 0, -1>,
>          < 6,  3>,
>          < 0,  2, 0, +2>,
>          0.123
>      }
> #declare SomeVectorsB =
>      array mixed[5] {
>          0.123,
>          < 1,  5>,
>          <-2, -2, 0, -1>,
>          < 6,  3>,
>          < 0,  2, 0, +2>,
>      }
> ...
> Mean <1.0246,1.6246,0.0246,0.2246,0.0000> SomeVectorsA
> Mean <1.0246,1.6246,0.0000,0.2000,0.0000> SomeVectorsB
>
> Would it be enough to recommend against mixed array usage? Or might we
> test for mixed arrays in some way and stop, though, I have no idea how
> to do such a thing at the moment.
>
> Aside 1: If someone drops a non-numeric thing into a mixed array the
> parser appropriately generates an error.
>
> Aside 2: If someone uses color the length always goes to the full five
> color component vector. It's possible with colors to mix lengths of
> vector specification - even in the traditional non-mixed array case.
>
>      rgb 1,           // promoted to 1,1,1,0,0
>      color 1,         // warning. promoted to 1,1,1,1,1
>      rgb< 0,  2>,     // promoted to five components. Trailing 0s
>      rgb< 3, -1, 1>   // Same
>      rgb< 1, 2, 3, 4> // Warning and 4th component used. 5th 0.
>      rgbf<1, 2, 3, 4> // 4th used, no warning. 5th 0.
>
> Maybe behavior all just what it is - user beware?

When I see this, I'm thinking that users who try to calculate the mean or
variance of an array containing mixed vectors deserve the messy result that
they may get.

My opinion is that POV-Ray's automatic vector promotion is a design flaw. It
allows users to write code that is confusing and unclear. And it creates some
hard-to-find errors.

Now I'm tempted to rewrite those macros so that they fail immediately if the
array contains anything other than floats. Perhaps it is best not to mention
that the Mean() and Variance() macros can be used with anything other than an
array of floats.

We cannot, and should not, try to anticipate and warn/guard against all the
crazy things that users may do with macros. That would make them big, clumsy
and hard to read and understand.

If we keep the macros like they are now, then yes; we should have a user beware
warning.

--
Tor Olav
http://subcube.com
https://github.com/t-o-k



From: Bald Eagle
Subject: Re: Github repository with useful macros
Date: 21 May 2021 06:45:00
Message: <web.60a78dd0985098ff1f9dae3025979125@news.povray.org>
"Tor Olav Kristensen" <tor### [at] TOBEREMOVEDgmailcom> wrote:

> When I see this I'm thinking that users that try to calculate the mean or
> variance of an array containing mixed vectors deserves the messy result that
> they may get.
>
> My opinion is that POV-Rays automatic vector promotion is a design flaw. It
> allows users to write code that is confusing and not so clear. And it creates
> some hard to find errors.

I would agree that some features are more problematic than others, especially
for newer users. Is there a way to add, multiply, or test for [nonexistent]
vector components with dot-notation in order to purposefully invoke the vector
promotion and force everything to be the largest vector size (5?) from the
start?

[Just as a related aside, what would be useful is a way to perform typical
mathematical functions like cosine on vectors and get a vector result.]
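
To make that aside concrete, something like the following component-wise
macro is what is meant. The name vCos is hypothetical (not an existing
include macro) and it handles 3-D vectors only:

// Hypothetical component-wise cosine for a 3-D vector:
#macro vCos(V0)
    <cos(V0.x), cos(V0.y), cos(V0.z)>
#end
// vCos(<0, pi/3, pi>) gives <1, 0.5, -1>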

> We can not, and should not, try to anticipate and warn/guard against all the
> crazy things that users may do with macros. That would make them big, clumsy and
> hard to read and understand.

True; however, one can have a scene containing all the cases one might think
are tricky, or at least not obvious, and that allows plenty of room for doing
all sorts of things that have no proper place in a macro.
Macros, functions, and the documentation are the (overly brief) textbooks of
POV-Ray.  Scene files are the lecture and recitation notes.

In engineering, the rule of thumb for how tight a bolt should be is, "Torque it
down until it snaps, then back it off a half a turn."  That's valuable knowledge
to have which is easy to say and scribble on a blackboard, or show in a
demonstration.  However, it's probably not going to make it into the textbook.


> If we keep the macros like they are now, then yes; we should have a user beware
> warning.

This is always the most difficult and painful part of writing real-world code.

- Bill



From: Tor Olav Kristensen
Subject: Re: Github repository with useful macros
Date: 21 May 2021 16:15:00
Message: <web.60a81382985098ff47eee7f589db30a9@news.povray.org>
William F Pokorny <ano### [at] anonymousorg> wrote:
> On 5/19/21 3:55 PM, Tor Olav Kristensen wrote:
> > I have also rewritten a few of the macros slightly and added some new ones.
> >
>
> I like how macros using functions of transforms & inverse transforms
> ended up. :-) Those, I didn't expect you to change and yet what you did
> looks to me to be a clever approach to handling a macro's dynamically
> defined functions.

=))

The global namespace for functions has troubled me for many years. Local
variables inside macros holding functions do not really work the way they,
IMHO, should.

Now I'll go through some of my other old macros to see if the same trick can be
applied in any of them to get rid of "named functions".

I also wonder if "hiding" functions inside an array or a dictionary is a good
and robust way to avoid function-related name clashes. But I guess we will still
have to struggle with name clashes related to the function parameters.


> I'll have to spend more time thinking about it and the actual inner
> workings. I suspect you've gotten to something near like what happens
> with prod and sum internally with the vm.

Sorry, but I don't understand what you mean here. I've never studied the
internals of that vm.


>...
> I've recently moved the vector analysis macros in math.inc to a
> vectoranalysis.inc include - which itself won't be a 'core' include.
> Thinking the array statistics a better fit in a similar
> arraystatistics.inc include - again now not a 'core' include.

I agree with you: neither the vector analysis macros nor the array statistics
macros belong in a core include file. They are for users with special needs.


> With your vectors.inc, on the other hand, I'm starting to lean toward it
> being a 'core' include file. Need to resolve collisions with math.inc
> macros as a start.

I'm happy to hear that :)

I've now added some documentation for vectors.inc in the Github repository.


> Aside: Instead of moving some of the contents of arraycoupleddf3s.inc to
> arrays.inc, I went the other way! I moved the ARRAY_WriteDF3 macro into
> arraycoupleddf3s.inc alongside the ARRAY_ReadDF3 macro already therein
> along with some other DF3 related macros. I dropped too the ARRAY_
> prefixes. I now plan to treat arraycoupleddf3s.inc as a core include
> file. I believe reading and writing DF3s should be doable from SDL as
> 'POV-Ray' is shipped.

Yes, that is a sensible move.

Btw.:
Have you considered using sum() in the FnctNxNx9ArrayEntryToDBL() macro in
arraycoupleddf3s.inc ?
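
For reference, the built-in sum(i, b, n, a) adds up the expression a while i
runs from b to n, and it is available inside user-defined functions. A trivial
illustration only - not the contents of that macro:

// Illustration of sum(), unrelated to FnctNxNx9ArrayEntryToDBL():
#declare SumOfSquares = function (N) { sum(i, 1, N, i*i) };
// SumOfSquares(4) = 1 + 4 + 9 + 16 = 30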


--
Tor Olav
http://subcube.com
https://github.com/t-o-k



From: Tor Olav Kristensen
Subject: Re: Github repository with useful macros
Date: 21 May 2021 17:35:00
Message: <web.60a82635985098ffdb12715a89db30a9@news.povray.org>
"Bald Eagle" <cre### [at] netscapenet> wrote:
> "Tor Olav Kristensen" <tor### [at] TOBEREMOVEDgmailcom> wrote:
>
> > When I see this I'm thinking that users that try to calculate the mean or
> > variance of an array containing mixed vectors deserves the messy result that
> > they may get.
> >
> > My opinion is that POV-Rays automatic vector promotion is a design flaw. It
> > allows users to write code that is confusing and not so clear. And it creates
> > some hard to find errors.
>
> I would agree that some features are more problematic than others, especially
> for newer users.   Is there a way to add, multiply, or test for [nonexistant]
> vector components with dot-notation in order to purposefully invoke the vector
> promotion and force everything to me the largest vector size (5?) from the
> start?

I don't think that it is possible to investigate the number of components in a
vector without it being promoted automatically or having POV-Ray halt on an
error.

It would have been nice to have some functionality like Python's
try-except-else-finally statements.

But I assume that that would be a huge task to undertake in POV-Ray. So perhaps
we could implement something like #ifdef (v0.z) instead.


> [Just as a related aside, what would be useful is a way to perform typical
> mathematical functions like cosine on vectors and get a vector result.]

I take it that you would like to be able to write things like cos(v0) and have
the cos function applied to all of the components of the vector. If so, I
second that. But only if it is done with built-in functionality. I don't like
the macros we currently have in the include files for such things.


To see how I've provided similar functionality in my scikit-vectors Python
library:

- Look at the cells 'In [26]' and 'Out[26]' in this file:
https://github.com/t-o-k/scikit-vectors/blob/master/skvectors/doc/Using_a_Vector_Class.pdf

- and at the cells from 'In [48]' to 'Out[52]' in this file:
https://github.com/t-o-k/scikit-vectors/blob/master/skvectors/doc/Using_a_Fundamental_Vector_Class.pdf


>...
> Macros, functions, and the documentation are the (overly brief) textbooks of
> POV-Ray.  Scene files are the lecture and recitation notes.

Good analogy.


> In engineering, the rule of thumb for how tight a bolt should be is, "Torque it
> down until it snaps, then back it off a half a turn."  That's valuable knowledge
> to have which is easy to say and scribble on a blackboard, or show in a
> demonstration.  However, it's probably not going to make it into the textbook.

Funny :) I haven't heard that one before.


> > If we keep the macros like they are now, then yes; we should have a user beware
> > warning.
>
> This is always the most difficult and painful part of writing real-world code.

Yes, you are right about that.

I hate writing documentation. I suspect that one of the reasons is that I
struggle very much to be concise and precise, to use rich and varied language,
and at the same time to use the correct terms for the topic (or profession?).

--
Tor Olav
http://subcube.com
https://github.com/t-o-k



From: Cousin Ricky
Subject: Re: Github repository with useful macros
Date: 22 May 2021 00:10:00
Message: <web.60a88326985098ff60e0cc3d949c357d@news.povray.org>
"Tor Olav Kristensen" <tor### [at] TOBEREMOVEDgmailcom> wrote:
>
> I don't think that it is possible to investigate the number of components in a
> vector without it being promoted automatically or having POV-Ray halt on an
> error.

It is possible to do certain tests.  For example, this macro tests to see
whether or not v_Padding has a .t component.  This macro is ad hoc, but perhaps
a more general macro can be written.  I have not found a way to distinguish
between a scalar and a vector, though.

----------[BEGIN CODE]---------
// Convert input to a 4-D vector.  If the input is a 2-D or 3-D vector, then
// use the .y component for .t.
#macro Caption__Get_padding (v_Padding)
  #local caption_Promoted = <0, 0, 0, 0> + v_Padding;

 // See whether .t exists:
  #local caption_Test1 = v_Padding + 1;
  #local caption_Test2 = v_Padding + 2;

  < caption_Promoted.x,
    caption_Promoted.y,
    caption_Promoted.z,
   // Test whether .t exists:
    ( (<0, 0, 0, 0> + caption_Test1).t = (<0, 0, 0, 0> + caption_Test2).t?
      caption_Promoted.y: caption_Promoted.t
    )
    // N.B.  Although a scalar tests incorrectly as having a .t component, it
    // doesn't matter, because the returned 4-D vector is the same either way.
  >
#end
-----------[END CODE]----------
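
A hedged usage note - the identifiers below are made up, but the returned
values follow from the promotion test above:

#declare Padding_A = Caption__Get_padding (<2, 3, 4>);    // yields <2, 3, 4, 3>
#declare Padding_B = Caption__Get_padding (<2, 3, 4, 5>); // yields <2, 3, 4, 5>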


