POV-Ray : Newsgroups : povray.binaries.images : Re: A povr fork test image for new dens_object pattern.
  Re: A povr fork test image for new dens_object pattern.  
From: Samuel B 
Date: 6 Sep 2023 19:35:00
Message: <web.64f90b85b1d4735c16bed5696e741498@news.povray.org>
William F Pokorny <ano### [at] anonymousorg> wrote:
> On 9/5/23 18:59, Samuel B. wrote:
> > Parsing and render times.
> Of course quite a bit longer with dens_object. No getting around there
> being a lot more work getting done. Let's see. A render of just the R -
> no normal{} at all on my old 2 core i3:
>    1.50user 0.03system 0:01.26elapsed
> and with a dens_object normal 'similar' to that posted prior.
>    69.96user 0.11system 0:18.82elapsed

I don't know what any of that means :( Are you using Linux or something? I'm
guessing those values are in seconds? And that the object with a normal took a
bit over a minute? (POV-Ray has always shown me things in terms of seconds,
minutes, hours, PPS, etc.)
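Actually, after a bit of searching, my best guess is that those lines are the
output of the shell's time command (or GNU time(1)). Trying it with a stand-in
command, since I obviously can't run your scene here:

```shell
# Guessing: those figures are what Linux's time command reports — CPU seconds
# spent in user mode, CPU seconds spent in the kernel, and wall-clock
# (elapsed) time. 'sleep 1' is just a stand-in for the actual render command.
time -p sleep 1
# The -p format prints three lines: real (wall-clock), user, and sys.
# On a multi-core machine, user can exceed real when threads run in parallel.
```

If that's right, then "69.96user 0.11system 0:18.82elapsed" means roughly 70
CPU-seconds accumulated across threads, but under 19 seconds of wall-clock
time, so quicker than I guessed above.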

> > This povr fork sounds interesting. Does it allow an object's normal
> > to be completely overridden?
> Only a little more so than the usual POV-Ray at this point. As we
> increase the normal{} block bump_size beyond 0.5 we already start to,
> for example, invert normals. The larger that bump_size the more the
> perturbation influence becomes the total perturbed normal. This 'take
> over' usually at the cost of some distortion to the intended result.

Ah. I think the only thing the official versions can do to override a normal
atm is to use the facets pattern.
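Something like this, going off the stock docs (the coords value is just a
guess at something that looks decent):

```pov
// facets replaces the shading normal outright instead of perturbing it,
// which is what makes it the odd one out among the normal patterns.
sphere { 0, 1
  pigment { rgb <0.7, 0.75, 0.8> }
  normal { facets coords 0.9 }
  finish { specular 0.6 roughness 0.02 }
}
```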

It would be nice to be able to replace normals completely. We could do things
like make actual normal maps, or in this case, make shaders for rounded edges.
At this point, I can make a function of averaged object patterns and then
derive a normal from it, but I can't take that normal value and use it as an
actual normal :/ I can only adjust the original surface normal with a single
density value. I don't know what future versions of the SDL will look like,
but I hope they support more direct access to shading without sacrificing the
easy usability of today's syntax.
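For the record, this is the kind of thing I mean: a scalar function can only
drive a perturbation of the existing normal. (Fn_Edges below is a stand-in
for my averaged-object-pattern function, and the values are arbitrary.)

```pov
#declare Fn_Edges = function { pattern { spherical } }; // stand-in function
sphere { 0, 1
  pigment { rgb 1 }
  normal {
    function { Fn_Edges(x, y, z) }
    // bump_size scales the perturbation; past ~0.5 it starts to dominate
    // (and eventually invert) the true surface normal, as Bill notes above.
    bump_size 2
  }
}
```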

> > I've been thinking about making a fork of my own.
> It's work - and sometimes frustrating for sure - but you can also try
> what you want. To some degree you get to use what you want from other
> forks too.

Yeah, I've been wanting to play with clipka's reflection blurring to force it
to allow, for instance, only one sample. That way samples could be multiplied
via focal blur and/or AA without reaching ridiculous render times. It would
bring the renderer more in line with something like Blender's Cycles.
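For anyone following along, the stock workaround is the old averaged-normal
trick, sketched below with an arbitrary sample count and bump scale. Every one
of those normals gets evaluated per reflected ray, which is exactly the cost
I'd like to trade against focal blur / AA samples instead:

```pov
// Classic blurred-reflection workaround in stock POV-Ray: average a pile of
// randomly shifted bump normals so reflected rays get scattered.
#declare S = seed(0);
#declare N = 10; // number of averaged normals; more = smoother but slower
sphere { 0, 1
  pigment { rgb 0.1 }
  finish { reflection 0.8 }
  normal {
    average
    normal_map {
      #declare I = 0;
      #while (I < N)
        [1 bumps 0.3 scale 0.01 translate <rand(S), rand(S), rand(S)>]
        #declare I = I + 1;
      #end
    }
  }
}
```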

> > Have you tried it with specular highlights or reflection?
> I've not played with finishes yet.
> I did spend a little time with the accuracy(a) [...]
> (a) - I have some long-standing concerns with this keyword's
> implementation with respect to bias, and with an internal 'magic factor'
> that's in the code but never described. I dislike 'magic' code because it
> almost always disconnects users from any intuitive feel for what setting
> would be best. Anyhow, I might use dens_object based normals to play with
> be best. Anyhow, I might use dens_object based normals to play with that
> accuracy code some... We'll see.
> Bill P.

I always interpreted 'accuracy' as a sort of radius. I don't know if the
values match up with actual units, but I keep it low for sharp things and
higher when I need a smoother look.
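In SDL terms, my mental model is something like this (values arbitrary):

```pov
plane { y, 0
  pigment { rgb 1 }
  // I read accuracy as the spacing of the sample points used to estimate the
  // pattern's slope: smaller = sharper detail (and slower), larger = smoother.
  normal { granite 0.6 accuracy 0.005 }
}
```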


