Am 10.09.2017 um 23:50 schrieb Bald Eagle:
> clipka <ano### [at] anonymousorg> wrote:
>
>> And that's where the problem with UV-mapped textures arises: Since
>> `trace` knows nothing about textures, and `eval_pigment` knows nothing
>> about geometry, the two can only be combined for good effect if there is
>> a direct mapping between geometry and pigment via 3D XYZ space. But with
>> UV-mapping that's not the case, and hence the /combo/ of the two
>> functions breaks down there.
>
> So, the color of a surface on a uv-mapped object is not directly measurable by
> the eval_pigment() function? [ ! :O ]
> I would never have guessed that.
>
> It must then use a completely different algorithm than the ray-object
> intersection used to generate the final scene.
Actually no: the `trace` function is essentially the same as the one
used in the render, and so is the `eval_pigment` function.
But the `trace`-like function used in the render not only computes the
3D XYZ location and surface normal, but also the UV coordinate as yet
another output value; the render algorithm then decides whether to
call the `eval_pigment`-like function with <X,Y,Z> as the location
parameter, or whether to use <U,V,0> instead.
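For a pigment mapped via 3D space, the combo works fine; a rough sketch
(`MyObject` and `MyPigment` are just placeholders here):

```pov
#include "functions.inc"   // provides the eval_pigment() macro

#declare MyObject  = sphere { <0,0,0>, 1 }
#declare MyPigment = pigment { granite }   // a 3D-space pigment, no uv_mapping

#declare Norm = <0,0,0>;
#declare Pos  = trace(MyObject, <0,0,5>, <0,0,-1>, Norm);
#if (vlength(Norm) != 0)   // trace() found an intersection
  // sample the pigment at the 3D intersection point
  #declare Col = eval_pigment(MyPigment, Pos);
#end
```

With `uv_mapping` this breaks down, because `eval_pigment` would need
<U,V,0> rather than `Pos`, and `trace` doesn't give you the UV coordinate.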
> Can you:
> 1. use trace() to get the point and the normal
Yes, absolutely.
> 2. define a pigment constructed in the plane intersecting that point with the
> same normal
> ( I likely have no idea what I'm talking about here: I'm just confabulating the
> possibility that some slice of a function-object-pattern thing might be a
> workaround... )
You're right, that sounds a bit... confabulated. ;)
Whatever your brain is concocting there, it can't replace computing the
UV coordinate of the intersection.
> 3. then do an eval_pigment() on the plane?
You cannot do `eval_pigment` on shapes. You can only call it on pigments.
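To illustrate: `eval_pigment` (a macro from "functions.inc") takes a
pigment identifier and a 3D point, nothing else:

```pov
#include "functions.inc"

#declare P = pigment { checker rgb 0, rgb 1 }
#declare C = eval_pigment(P, <0.25, 0, 0.25>);  // yields a color vector
```

There is no variant that accepts an object and a surface point.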