Hello,
Has anybody experimented with creating RGB normal maps in POV-Ray? Is it
possible?
It should be straightforward to take the normal of the surface where the ray
hits, quantize the XYZ components of the normal to values ranging from 0 to
255 (e.g. R = round((Nx+1)*127.5)), and store the result in an RGB image
file. But does POV-Ray, or any unofficial version of it, have a way to do this?
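The quantization step described above could be sketched like this (plain Python, assuming the usual convention of mapping each component from [-1, 1] to [0, 255]; the function name is just for illustration):

```python
def normal_to_rgb(nx, ny, nz):
    """Quantize a unit surface normal to an 8-bit RGB triple.

    Each component is mapped from [-1, 1] to [0, 255] via
    round((n + 1) * 127.5), clamped for safety.
    """
    def quantize(n):
        return max(0, min(255, int(round((n + 1.0) * 127.5))))
    return (quantize(nx), quantize(ny), quantize(nz))

# A normal pointing straight along +z encodes as the familiar
# "flat" normal-map color:
print(normal_to_rgb(0.0, 0.0, 1.0))  # (128, 128, 255)
```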
It'd be a special pigment/pattern type, right? Alternatively, could this be
simulated with slope patterns?
The application would be "bump maps" used as textures in a real-time
renderer. One could render, say, a wall using just ambient light to get the
colors, then render again with the "normal map pigment" to get the normal for
each pixel. The real-time app then combines the two as Pixel = Colormap
* (Normalmap dot Lightdirection) to allow changes in the illumination.
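The per-pixel combine step might look like this (a minimal Python sketch, assuming the same (n+1)*127.5 encoding as above and clamping N.L at zero for back-facing light; function names are illustrative):

```python
import math

def rgb_to_normal(r, g, b):
    """Invert the 8-bit encoding back to a (re-normalized) unit normal."""
    n = [c / 127.5 - 1.0 for c in (r, g, b)]
    length = math.sqrt(sum(c * c for c in n)) or 1.0
    return [c / length for c in n]

def shade(color, encoded_normal, light_dir):
    """Per-pixel combine: Pixel = Colormap * max(0, Normalmap . Lightdir)."""
    n = rgb_to_normal(*encoded_normal)
    ndotl = max(0.0, sum(a * b for a, b in zip(n, light_dir)))
    return tuple(c * ndotl for c in color)

# A surface facing the light keeps (almost) its full color:
print(shade((1.0, 0.0, 0.0), (128, 128, 255), (0.0, 0.0, 1.0)))
```

The re-normalization in rgb_to_normal matters in practice, since 8-bit quantization leaves the decoded vector slightly off unit length.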