On 6/18/2021 2:15 PM, Norbert Kern wrote:
> Hi Mike,
>
> For depth maps I use a variable to scale every finish (e.g. finish { specular
> dfac*0.3 roughness 0.0003 diffuse dfac*0.6 reflection { dfac*0.03, dfac*0.1 } }).
> For the "normal" render dfac is defined as 1, and for depth maps or other
> effect maps dfac is 0.
> Together with a white fog you get your depth map.
>
>
> Norbert
>
Could you post some example images? I don't quite understand what this does.
Mike
Mike Horvath <mik### [at] gmailcom> wrote:
> Could you post some example images? I don't quite understand what this does.
>
>
> Mike
If you set every diffuse, specular, phong, ambient, emission and reflection
value to zero, you get a black material.
A white fog then turns this into an inverse depth map. You can either adapt the
color_map of your blurring file accordingly, or invert the image in Photoshop
and co. to get a "regular" depth map.
Here is a depthmap of my last image.
Norbert
Attachment: 'mountain_forest_depth.jpg' (285 KB)
On 19.06.2021 at 15:37, Norbert Kern wrote:
> If you set every diffuse, specular, phong, ambient, emission and reflection
> value as zero, you get a black material.
> A white fog turns this to an inverse depthmap. Then you can adapt the color_map
> of your blurring file or you can invert it via Photoshop and co to get a
> "regular" depthmap.
> Here is a depthmap of my last image.
Some caveats of this approach:
- The brightness will not depend on the Z coordinate, but on the
distance to the camera; this may or may not be what you want.
- The image value will not represent the distance itself, but rather an
exponential-ish function thereof, something like:
f(d) = 1 - p^d
where p is a value < 1. Again, this may or may not be what you want.
Also, as with everything else where you want to get a per-pixel data
value out of the render rather than a color, the topic of gamma needs
attention, and so does precision, i.e. bit depth.
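For reference, assuming the standard constant-fog model, where the background's visibility falls off as exp(-d/D) with D the fog's distance setting, the depth-map value is v = 1 - exp(-d/D), i.e. p = exp(-1/D) in the notation above. Under that assumption the true camera distance could be recovered per pixel with something like the helper below (writing the output with File_Gamma=1.0 and a 16-bit format addresses the gamma and bit-depth point):

    // Hypothetical helper (names are illustrative): recover the camera
    // distance from an inverse-depth-map value v in [0,1), given the
    // fog distance D, assuming v = 1 - exp(-d/D).
    #declare Depth_From_Value = function(v, D) { -D * ln(1 - v) }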
On 2021-06-18 at 13:52, Mike Horvath wrote:
> On 6/18/2021 10:44 AM, Alain Martel wrote:
>> On 2021-06-18 at 08:41, Mike Horvath wrote:
>>> Is there a way to override textures or materials that have already
>>> been declared or applied?
>>>
>>> I am trying to create a depth map, and need to apply a custom texture
>>> to the entire object.
>>>
>>> Thanks.
>>>
>>>
>>> Mike
>>
>> In the case of a texture or material that has been declared but not
>> yet used, a new #declare will completely replace the original with the
>> new definition.
>> If it has been applied to something and then redefined, any object
>> that received it will keep the original definition.
>>
>> In your case, you need to remove, or at least comment out, any texture
>> or material from your object. Then, apply your new texture to that
>> object as a whole.
>>
>
> Instead of declaring my textures, I may start using macros. That way,
> I can just comment out the texture declaration inside the macro.
> But might this not consume more memory?
>
>
> Mike
Whenever you use a macro, it gets expanded every time it's used, so it
will use more memory while parsing.
When you use a declared texture that is applied in many places, you can
change it everywhere at once by changing the declaration.
With a macro, you can do the same by changing the definition of the
macro. That won't affect the parameters passed to that macro anywhere in
the scene, but you can change the macro so that it still accepts the
same parameters but ignores them.
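A minimal sketch of that macro route; the Depth_Pass switch, names and values are illustrative, not from any of the scenes discussed here:

    // Illustrative switch: 0 for the normal render, 1 for the depth pass.
    #declare Depth_Pass = 1;

    // Every object calls this macro. Changing its body (or the switch
    // above) retextures the whole scene at once; the Col parameter is
    // still accepted but simply ignored during the depth pass.
    #macro Scene_Texture(Col)
        #if (Depth_Pass)
            texture {
                pigment { rgb 0 }
                finish { ambient 0 diffuse 0 specular 0 emission 0 }
            }
        #else
            texture {
                pigment { color Col }
                finish { diffuse 0.6 specular 0.3 }
            }
        #end
    #end

    sphere { <0, 1, 4>, 1 Scene_Texture(rgb <0.8, 0.2, 0.2>) }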
On 6/18/2021 2:53 PM, clipka wrote:
> On 18.06.2021 at 19:33, Bald Eagle wrote:
>> Mike Horvath <mik### [at] gmailcom> wrote:
>>> Is there a way to override textures or materials that have already been
>>> declared or applied?
>>
>> What if you union everything together and do an intersection with a
>> box that is its own object that you can give a fresh texture to?
>
> That's a smart idea.
>
> You may need to specify `cutaway_textures` though, and I'm not sure
> whether it does indeed work with objects that are already explicitly
> textured.
What is `cutaway_textures`? I have never heard of it.
Mike
On 6/19/2021 10:38 AM, clipka wrote:
> On 19.06.2021 at 15:37, Norbert Kern wrote:
>
>> If you set every diffuse, specular, phong, ambient, emission and
>> reflection
>> value as zero, you get a black material.
>> A white fog turns this to an inverse depthmap. Then you can adapt the
>> color_map
>> of your blurring file or you can invert it via Photoshop and co to get a
>> "regular" depthmap.
>> Here is a depthmap of my last image.
>
> Some caveats of this approach:
>
> - The brightness will not depend on the Z coordinate, but on the
> distance to the camera; this may or may not be what you want.
>
> - The image value will not represent the distance itself, but rather an
> exponential-ish function thereof, something like:
>
> f(d) = 1 - p^d
>
> where p is a value < 1. Again, this may or may not be what you want.
>
>
> Also, like with every other thing where you want to get a per-pixel
> value out of the render rather than a color, the topic of gamma needs
> attention, and also precision, i.e. bit depth.
Would it work better if depth maps were an engine feature versus a bunch
of tricks/hacks with pigments or fog?
Mike
On 6/19/2021 3:38 PM, Mike Horvath wrote:
> On 6/19/2021 10:38 AM, clipka wrote:
>> On 19.06.2021 at 15:37, Norbert Kern wrote:
>>
>>> If you set every diffuse, specular, phong, ambient, emission and
>>> reflection
>>> value as zero, you get a black material.
>>> A white fog turns this to an inverse depthmap. Then you can adapt the
>>> color_map
>>> of your blurring file or you can invert it via Photoshop and co to get a
>>> "regular" depthmap.
>>> Here is a depthmap of my last image.
>>
>> Some caveats of this approach:
>>
>> - The brightness will not depend on the Z coordinate, but on the
>> distance to the camera; this may or may not be what you want.
>>
>> - The image value will not represent the distance itself, but rather
>> an exponential-ish function thereof, something like:
>>
>> f(d) = 1 - p^d
>>
>> where p is a value < 1. Again, this may or may not be what you want.
>>
>>
>> Also, like with every other thing where you want to get a per-pixel
>> value out of the render rather than a color, the topic of gamma needs
>> attention, and also precision, i.e. bit depth.
>
> Would it work better if depth maps were an engine feature versus a bunch
> of tricks/hacks with pigments or fog?
>
>
> Mike
Not saying you should implement it. 3D photos (see other thread) don't
seem to actually work that well except in a minority of specific cases.
Mike
Mike Horvath <mik### [at] gmailcom> wrote:
> What is `cutaway_textures`? I have never heard of it.
It's a keyword:
http://www.povray.org/documentation/view/3.6.1/362/
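Roughly, in a CSG intersection or difference it lets components that carry no texture of their own pick up the textures of the other components, rather than the default texture. A minimal sketch along the lines of the box idea above (Scene_Union and the box bounds are illustrative):

    // Scene_Union is assumed to be an already-textured union of objects.
    // The clipping box carries no texture; with cutaway_textures, the
    // surfaces it exposes inherit the textures of the objects it cuts
    // through instead of getting the plain default texture.
    intersection {
        object { Scene_Union }
        box { <-10, -10, -10>, <10, 10, 10> }
        cutaway_textures
    }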
Is this the same as:
http://news.povray.org/povray.advanced-users/thread/%3C5e52bde0%241%40news.povray.org%3E/
http://news.povray.org/povray.newusers/thread/%3Cweb.5c90a6ab69c1cfb51d791a250%40news.povray.org%3E/
?
I have done a similar thing for 3D printing, and simply used a gradient texture,
scaled and translated to contain the entire scene within the 0-1 range.
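A minimal sketch of that kind of gradient depth texture; the axis, scene bounds and identifier names are illustrative:

    #declare Scene_Min_Z = 0;     // illustrative scene bounds along z
    #declare Scene_Max_Z = 40;

    // One gradient cycle stretched and shifted to cover the whole scene,
    // so depth maps onto grey values in the 0-1 range. The emission-only
    // finish keeps the lighting from influencing the result.
    #declare Depth_Texture =
        texture {
            pigment {
                gradient z
                color_map { [0 rgb 0] [1 rgb 1] }
                scale (Scene_Max_Z - Scene_Min_Z)
                translate Scene_Min_Z*z
            }
            finish { ambient 0 diffuse 0 emission 1 }
        }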
On 19.06.2021 at 21:38, Mike Horvath wrote:
> Would it work better if depth maps were an engine feature versus a bunch
> of tricks/hacks with pigments or fog?
No. It would just be simpler to set up (no tampering with textures,
pigments, fog and/or interiors) while at the same time potentially less
flexible.
(As a matter of fact, if I'm not mistaken older versions of POV-Ray
provided such a feature. At the very least MegaPOV did.)
The caveats I mentioned aren't necessarily problems, just things to be
mindful of that might require modifications to the process if they
happen not to be exactly what you need.