On 12/9/19 5:03 PM, jr wrote:
...
Apparently I started a response months back and didn't finish it.
(Thunderbird was not showing anything sitting in my drafts folder.)
Up front. I'm changing direction. I'm working toward a personal,
substantially cut down, *nix only version of POV-Ray. With new git (or
perhaps mercurial) repositories once I get done with all the pruning.
Meaning, while the branches I've posted are today still pull-able
against the last commit to master more than a year ago, I no longer plan
to update/re-base those branches should POV-Ray development resume.
-----
Now, completing the response to your months old questions:
>
> slightly confused by the interpolation options, they're not "regular" POV-Ray?
Yes and no.
In my 'Density File Pattern Updates' branch, interpolation 0 is the only
option which remains untouched(a). Interpolations 1 & 2 work the same
except the half-voxel offset is fixed - voxel values are centered as
they always should have been. (Christoph asked me to add options 11 & 12,
IIRC, which work with the half-voxel offsets/overruns as the old 1
& 2 did, so there is a path for folks to duplicate the incorrect
behavior when replicating old scenes.)
(a) - Lying. There is a somewhat substantial performance improvement
achieved by avoiding duplicate work.
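To make the half-offset fix concrete, here is a small Python sketch (my own illustration, not povr code) of voxel-centered trilinear interpolation - the behavior the fixed interpolation 1 is described as having, where voxel values sit at (i + 0.5)/n along each axis. Edge handling (clamping) is an assumption:

```python
import math

def trilinear_df3(grid, p):
    """Sample a DF3-style voxel grid at point p in [0,1)^3 using
    voxel-centered trilinear interpolation: values sit at voxel
    centers, (i + 0.5)/n along each axis. grid is indexed [z][y][x].
    Clamping at the edges is an assumption for this sketch."""
    nz, ny, nx = len(grid), len(grid[0]), len(grid[0][0])
    def axis(coord, n):
        t = coord * n - 0.5          # voxel centers land on integers
        i0 = math.floor(t)
        f = t - i0                   # fractional weight toward i0 + 1
        clamp = lambda i: min(max(i, 0), n - 1)
        return clamp(i0), clamp(i0 + 1), f
    x0, x1, fx = axis(p[0], nx)
    y0, y1, fy = axis(p[1], ny)
    z0, z1, fz = axis(p[2], nz)
    lerp = lambda a, b, f: a + (b - a) * f
    v00 = lerp(grid[z0][y0][x0], grid[z0][y0][x1], fx)
    v01 = lerp(grid[z0][y1][x0], grid[z0][y1][x1], fx)
    v10 = lerp(grid[z1][y0][x0], grid[z1][y0][x1], fx)
    v11 = lerp(grid[z1][y1][x0], grid[z1][y1][x1], fx)
    return lerp(lerp(v00, v01, fy), lerp(v10, v11, fy), fz)
```

Sampling exactly at a voxel center returns that voxel's value, with no half-voxel drift.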
There are new interpolations 3, 4 & 5 which use a kind of exponential
blobbing, aimed mostly at creating base isosurface functions/shapes
though, of course, they can be used as patterns too.
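The kernel details of interpolations 3-5 aren't spelled out here, but the general idea of exponential blobbing can be sketched like this (the kernel shape, the k parameter, and the plain summation are my assumptions, not the branch's actual code):

```python
import math

def exp_blob(centers, values, p, k=16.0):
    """Sum smooth exp(-k * d^2) kernels centered on non-zero voxel
    centers. Unlike classic bounded blob polynomials, the exponential
    falloff is smooth everywhere, which suits isosurface functions."""
    total = 0.0
    for (cx, cy, cz), v in zip(centers, values):
        d2 = (p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2
        total += v * math.exp(-k * d2)
    return total
```

The smoothness is the point: it avoids the continuity seams a hard interpolation boundary creates when the result is fed to an isosurface.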
There are new pattern-value-generation helper interpolations in 9 &
10 which look for the nearest non-zero value, '0'-interpolation style,
which can then be used for complex patterning on shapes. This relates
to a more general effort: I've been attempting to come up with better
ways to texture which allow a larger set of pattern and shape
modifiers. Warp{} the isosurface (or collection of placed objects) and
use the same warp for the shape's texturing. I've posted examples of
this a few times. Interpolations 9 & 10 are basically exploring the
idea of painting shapes with DF3s - using DF3 files as a texture
mapping method.
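For context, a stock density_file pattern in scene code looks like the following (the file name and color_map stops are placeholders); the interpolation numbers discussed above all slot into the interpolate keyword:

```pov
// Standard syntax: interpolate 0 (none), 1 (trilinear), 2 (tricubic)
// are stock POV-Ray; the higher numbers discussed above exist only in
// the patched branches. "example.df3" is a placeholder file name.
pigment {
    density_file df3 "example.df3"
    interpolate 1
    color_map {
        [0.0 rgb 0]
        [1.0 rgb 1]
    }
}
```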
Aside: Today interpolations 1 & 2 have seam and ringing artefacts,
respectively, limiting their usefulness as functions / shapes. The same
sort of problem we have today in v3.8 with the new blob potential
pattern use - because blob shapes themselves have continuity issues when
actually blobbing.
...
>
> after some thought, the 'Density File Pattern Updates' + 'Non-portable 32 bit
> DF3 write capability' + perhaps 'Both image_map and pigment_map support in
> density block'. would that combination "play nice"?
>
Yes, you should be able to use all those at once. I've made an effort to
keep the branches independent, even changing a couple slightly in the
interim so they don't collide(1,2).
(1) - Single branches are tested alone. For multiple pulls, I test this
only in the order of pulls I use to build my 'povr' version. So, there
is a chance when pulling multiple branches I missed some detail where
order matters.
(2) - I did a lot of work in the solver branch to standardize / clean up
epsilons and other 'magic' values which were all over the place in use
and effectiveness. Thus far I've not used those fixes in other branches
- though it's gotten harder and harder not to. The wave modifier
/ pattern / function cleanup I've been working on needs the better set
of 'epsilon-like' values too.
>> ...
>> Once personal wikis are up, they are sitting out there not very
>> visible. Is there even a page which lists user wikis?
>
> know what you mean. although they all have "sitemap"s, few provide table of
> contents. (self-defeating, imo)
>
Yep, agree.
>
>> ...
>> With user wikis there is always the problem that we might not completely
>> understand what we are trying to document ;-). What to do when
>> well-intentioned, but wrong pages get posted?
>
> isn't that what the "discussion" page is there for?
>
> inadvertent errors would be corrected, I'm (fairly) sure.
>
Yes, ideally.
>
>> ...
>> We have a single physical web site. When it goes down or my local
>> network connection does, I'd like to have local copy of the wiki,
>> library and the newsgroup postings as I do for the documentation. Or at
>> least have snapshots of these available on the cloud - github or
>> wherever.
>> A few expert folk maintain their own web sites for what might go on a
>> wiki, but that an investment. Plus, knowing where all these web sites
>> are today means maintaining a list of links to them yourself.
>
> it would be nice to have mirror or two. (personally, I think that if "the gods"
> put the content on a DVD (or two) and offered those for sale, I'd buy. and
> while on money, if I could take out an annual subscription of 10-15 € or USD to
> help maintain .. POV-Ray, I would. (and am quite certain that I'm not alone in
> that))
>
Christoph had a button up at one point (for uberpov only?), but it was to
some new-ish site and I'm leery of those. I'd subscribe too, to support
POV-Ray - if the payment mechanism is reputable.
Have you seen where github itself is pushing / supporting better ways to
financially support open source projects (GitHub sponsors)? For example:
https://www.youtube.com/watch?v=FYkBA9epUEk
and
https://www.youtube.com/watch?v=e_zhHQXTwVo
and
https://help.github.com/en/github/administering-a-repository/displaying-a-sponsor-button-in-your-repository
I've watched a few of the videos and they're good in pointing out that,
in anything other than single-developer projects, support means more than
just gathering money. It means establishing and maintaining a governing
group deciding direction / how to spend. Otherwise you can end up with a
pile of money and no idea / mechanism for how to spend it...
I'd pay for a searchable database dump of the web news groups too
(perhaps USB over DVD... :-) ) - the news server representation is only
correct for the more current posts (the last 10 years or so). Searches
done via Thunderbird don't turn up, say, pre-2007 stuff - though how far
back it works depends some on the group. If you know it's an old topic,
the web search is the only way to go - but a few times now I've struggled
to find what I want with this google-based method.
Aside: Chris taught me individual groups can be searched via google with
arguments:
site:news.povray.org/povray.bugreports/ freezes
but that doesn't turn up anything, where say:
site:news.povray.org/povray.bugreports/ crash
does. I've not dug much, but it seems like the google indexing is perhaps
done only once in a while - like it's lagging substantially (a year?).
Don't know. The by-group method is useful, but I find myself not
counting on the google search completely either at this point.
Anyway... Having a paid support mechanism / data-products of some kind
or not, would be up to the core team / Chris I'd guess.
------------
FYI. I captured your February dictionary parsing bug test case for my
own use - thanks. No idea if/when I might get to looking at it, but it's
now one of my parser test cases.
My apologies for the SLOW response.
Bill P.
hi,
William F Pokorny <ano### [at] anonymousorg> wrote:
> On 12/9/19 5:03 PM, jr wrote:
> ...
> Apparently I started a response months back and didn't finish it.
> (Thunderbird was not showing anything sitting in my drafts folder.)
try handkerchief + knots. :-)
> Up front. I'm changing direction. I'm working toward a personal,
> substantially cut down, *nix only version of POV-Ray. With new git (or
> perhaps mercurial) repositories once I get done with all the pruning.
> Meaning, while the branches I've posted are today still pull-able
> against the last commit to master more than a year ago, I no longer plan
> to update/re-base those branches should POV-Ray development resume.
intriguing. does "substantially cut down" translate into subset of (current)
features?
> -----
> Now, completing the response to your months old questions:
>
> >
> > slightly confused by the interpolation options, they're not "regular" POV-Ray?
> Yes and no.
> ...
> Aside: Today interpolations 1 & 2 have seam and ringing artefacts,
> respectively, limiting their usefulness as functions / shapes. ...
the problem I'm seeing with media: artifacts (almost like "shadows") when using
interpolation(s).
> > after some thought, the 'Density File Pattern Updates' + 'Non-portable 32 bit
> > DF3 write capability' + perhaps 'Both image_map and pigment_map support in
> > density block'. would that combination "play nice"?
> >
> Yes, should be you can use all those at once. I've made an effort to
> keep the branches independent even changing a couple slightly in the
> interim so they don't collide(1,2).
good, thanks.
> ...
> >> We have a single physical web site. When it goes down ...
> > it would be nice to have mirror or two. (personally, I think that if "the gods"
> > put the content on a DVD (or two) and offered those for sale, I'd buy. and
> > while on money, if I could take out an annual subscription of 10-15 € or USD to
> > help maintain .. POV-Ray, I would. (and am quite certain that I'm not alone in
> > that))
> >
> Christoph had button up at one point (for uberpov only?), but it was to
> some new-ish site and I'm leery of those. I'd subscribe too to support
> POV-Ray - if the payment mechanism reputable.
:-) personally like using PayPal, but see below.
> Have you seen where github itself is pushing / supporting better ways to
> financially support open source projects (GitHub sponsors)? For example:
> ...
I like the apparent simplicity of a single button, but wasn't familiar with any
of the names except Patreon. but yes, contributing via the project's home page
does seem sensible.
> I'd pay for a web-news group database searchable dump too (perhaps USB
> over DVD... :-) ) ...
(just) showing my age. :-)
> Aside: Chris taught me individual groups can be searched via google with
> arguments:
yes, although I tend to just use 'site: povray.org'. there's a whole vocabulary
for narrowing searches (including '+' and '-' attached to individual
words/terms, though haven't a good reference to hand).
> ...
> Anyway... Having a paid support mechanism / data-products of some kind
> or not, would be up to the core team / Chris I'd guess.
given his (understandable) pride in reaching the 25 year milestone recently, one
would hope he'll read our remarks and (deign to) comment; not holding my
breath..
> FYI. I captured your February dictionary parsing bug test case for my
> own use - thanks. No idea if/when I might get to looking at it, but it's
> now one of my parser test cases.
>
> My apologies for the SLOW response.
no need. cheers.
regards, jr.
On 3/20/20 6:20 AM, jr wrote:
...
>
> intriguing. does "substantially cut down" translate into subset of (current)
> features?
>
Eventually I hope; got a list in my head.
The initial work has more to do with how things are organized and built.
Attempting GNU make instead of autotools' automake -> make, for example.
Lots of little things are wrong - or not quite right - in our *nix build
system. I'm also starting from a new base with many of my branches merged.
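As a rough illustration of the direction, a hand-written GNU makefile can stay quite small (the target name, source layout and flags below are placeholders, not the actual povr tree):

```make
# Hedged sketch of a hand-written GNU makefile replacing the
# automake-generated one; names and flags are placeholders.
CXX      ?= g++
CXXFLAGS ?= -O3 -std=c++11 -pthread
SRCS     := $(wildcard source/*.cpp)
OBJS     := $(SRCS:.cpp=.o)

povr: $(OBJS)
	$(CXX) $(CXXFLAGS) -o $@ $^

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c -o $@ $<

clean:
	$(RM) povr $(OBJS)
```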
The initial git repository included large boost and other libraries for
windows builds. The size required is now much smaller due to Christoph's
work to move to c++11 threads, but as a code repository that old library
data doesn't go away. On *nix we don't need these libraries in the
repository at all.
The documentation(1), includes, sample scenes and code are all in the
same repository and I think it would be better as 4 different ones.
Though, I'm focused today on the one with the code.
I'm dumping the other-than-Linux code. Otherwise, pruning the code and
comments as much as I can. To do my own thing - with any chance of
wrapping my head around the whole - the code needs to be a cut down
POV-Ray. I don't know windows coding and have no interest in learning. I
don't care about the windows editor interfaces, certain functionality, etc.
A second reason to work on a cut down is my son bought me a Raspberry Pi
4 (4GB) board for Christmas. There, even more so than on my main
machine, I want to develop and run things entirely on a ramdisk. This
limits the playpen as a whole to 2GB or so.
Bill P.
(1) - Recently someone submitted a github request asking for the
documentation to be pulled out as its own repository (and with a
documentation-specific license IIRC). Seems a pretty common approach on
github(1a). I just want that, the editor templates, etc. out of the way.
(1a) - Documentation images/binaries are also often handled 'off to the
side' of distributed version control systems like git/mercurial, and
these files are in our git repository.
If you are ever looking for a polyvalent build system for the POVRay sources (for
Windows as much as for Linux and other platforms), you should take a look at
"CMake" (www.cmake.org). Personally, I use it for my projects and don't regret
it. All you have to do is write one file:
CMakeLists.txt (in the cmake scripting language.)
Then the user who has downloaded (for example) the POVRay source code can choose
the compiler and the project type (which can be 'Codelite project' / 'Visual
studio project' / 'Unix Makefile' / 'MinGW makefile' or many others).
Cmake is well known nowadays. ;-)
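A minimal sketch of the sort of CMakeLists.txt described above (the target name and source path are placeholders, not POV-Ray's actual layout):

```cmake
cmake_minimum_required(VERSION 3.10)
project(povray CXX)

# Globbing is a convenience for this sketch; listing sources
# explicitly is the usual CMake recommendation.
file(GLOB POV_SOURCES source/*.cpp)
add_executable(povray ${POV_SOURCES})
target_compile_features(povray PRIVATE cxx_std_11)
```

Then e.g. `cmake -G "Unix Makefiles" .` followed by `make`, or a Visual Studio / MinGW generator on Windows.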
On 3/20/20 2:35 PM, Warren wrote:
> If you are ever looking for a polyvalent build system for the POVRay sources
> (for Windows as much as for Linux and other platforms), you should take a look
> at "CMake" (www.cmake.org). Personally, I use it for my projects and don't
> regret it. All you have to do is write one file:
> CMakeLists.txt (in the cmake scripting language.)
> Then the user who has downloaded (for example) the POVRay source code can choose
> the compiler and the project type (which can be 'Codelite project' / 'Visual
> studio project' / 'Unix Makefile' / 'MinGW makefile' or many others).
>
> Cmake is well known nowadays. ;-)
>
Yes, I'm aware of cmake, having looked at many build systems while
digging into a particular build issue back in 2017. I think its
strengths come down to good support for IDEs and common usage, which
increased especially while the autotools package support was unstable and
fractured in the 2000s. Its weaknesses are that it needs to be installed
to be used(1); that it created its own scripting language(2) which you
have to learn; and that, like autotools, you can get into version issues(2).
At a high level I'm headed in the opposite direction to the cmake and
autotools support-everything model with my cut down *nix-only version
of POV-Ray. I'm looking to support much less in the way of OSs and
architectures to make my hobby life simpler.
Bill P.
(1) Not how POV-Ray is being provided today, but the autotools approach
is that the developer creates the configure script and only that configure
script is packaged for users/administrators to run ahead of a
compile/install. Nothing extra needs to be installed on the system
beyond having a compliant c++ compiler, which is kinda cool.
(2) One build system I liked over both cmake and autotools - partly due
to me already knowing tcl - was a tool called autosetup. It requires tcl to
run, but the package includes a tiny implementation of tcl called jim
tcl which it will compile on the fly - given an ansi-c compiler on the
system - if need be. This build system becomes part of a project's code,
so no versioning issues. It's basically a couple of guys though - and aimed
too at just *nix and mostly c/c++. In theory at least, gnu autoconf /
gnu make support there is surer - if not at any given time better - given
make is a GNU tool needed for linux and GNU core tool compiles.
Aside (2a): Seeing autosetup, I wondered if the cmake guys could have
saved themselves some implementation effort by starting with an
extension language like tcl, lua or maybe scheme/guile. GNU make itself
now supports guile-written functions - and dll extensions, 'experimentally.'