Thanks Warp. Is there any way to dig a little deeper? How does the image map
become the pigment function? Assuming there are RGB values for every pixel
in my image, what exactly gets mapped into the pigment function? Is there any
scaling of the parameter values, or an offset? What would those values look
like? I'd like to understand this process better; it's clear to me that the
function comes from the bitmap, but I'm trying to get a better handle on how
that happens.
And regarding the grayscale values: are those quantized to some range
of values? What's the process there as well?
thanks much.
Dennis
"Warp" <war### [at] tag povray org> wrote in message
news:43f0f6ff@news.povray.org...
> Dennis Miller <dhm### [at] comcast net> wrote:
>> #declare Pic = function {
>>   pigment {
>>     image_map {
>>       tga concat("e:\\me\\me", str(clock*356,-3,0), ".tga")
>>       map_type 1
>>       interpolate 2
>>     }
>>   }
>> }
>
> This creates a pigment function (which uses an image map).
>
>> isosurface {
>>   function {
>>     (1 - Pic(x, y, z).grey) * 0.4 + Sphere(x, y, z)
>>   }
>
> And this uses the grayscale values of that pigment function.
>
> --
> - Warp
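[Editor's note: as a rough illustration of the question being asked, here is a
hedged sketch, in Python rather than POV-Ray SDL, of how the pixel-to-function
mapping is commonly understood to work. The assumptions: 8-bit TGA channel
values are normalized to the 0.0..1.0 range, and `.grey` collapses a color with
the luminance-style weights 0.297/0.589/0.114 that the POV-Ray documentation
gives for gray conversion. The function names are hypothetical; this is not
POV-Ray's actual internal code.]

```python
# Hypothetical model (NOT POV-Ray internals) of what Pic(x,y,z).grey
# evaluates to for one pixel of the image map.

def pixel_to_grey(r, g, b):
    """Normalize 8-bit RGB to 0..1, then apply the documented
    POV-Ray gray weights (assumed here: 0.297, 0.589, 0.114)."""
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    return 0.297 * rn + 0.589 * gn + 0.114 * bn

def iso_contribution(r, g, b, scale=0.4):
    """Mimic the (1 - Pic(x,y,z).grey) * 0.4 term of the
    isosurface function for a single sampled pixel."""
    return (1.0 - pixel_to_grey(r, g, b)) * scale
```

Under these assumptions a white pixel contributes roughly 0 to the isosurface
function (no displacement of the sphere) and a black pixel contributes the
full 0.4, which is the scaling Dennis is asking about.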