Newsgroups: povray.advanced-users
Subject: Using POV-Ray to generate an RGB normal map?
From: Fingers
Date: 6 Sep 2006 19:30:01
Message: <web.44ff58fa2b7d3f2b26c7adc80@news.povray.org>
Hello,

Has anybody experimented with creating RGB normal maps in POV-Ray? Is it
possible?

It should be straightforward to take the normal of the surface where the ray
hits, quantize its XYZ components to values ranging from 0 to 255, and store
them in an RGB image file (e.g. R = (Nx+1)*127+1). But does POV-Ray, or any
unofficial version of it, have a way to do this? It'd be a special
pigment/pattern type, right? Alternatively, could this be simulated with
slope patterns?
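
Something like this is what I'm imagining with slope patterns (an untested
sketch; the name NormalMapTexture is just mine). As far as I can tell,
slope { D } evaluates to (N dot D + 1)/2, so slope { x }, slope { y } and
slope { z } give exactly the remapped Nx, Ny and Nz; the average pattern
divides by the number of entries, hence the colour_map tops scaled by 3:

// Encode the world-space surface normal as a colour, i.e. R = (Nx+1)/2 etc.
#declare NormalMapTexture =
  texture {
    pigment {
      average
      pigment_map {
        [1 slope { x } colour_map { [0 rgb 0] [1 rgb <3,0,0>] }]
        [1 slope { y } colour_map { [0 rgb 0] [1 rgb <0,3,0>] }]
        [1 slope { z } colour_map { [0 rgb 0] [1 rgb <0,0,3>] }]
      }
    }
    finish { ambient 1 diffuse 0 }   // flat finish: the colour *is* the data
  }

global_settings { assume_gamma 1 }   // keep the encoding linear

camera { orthographic location -5*z look_at 0 up 2*y right 2*x }

// a sphere shows the full range of normals in one image
sphere { 0, 1 texture { NormalMapTexture } }

This would encode world-space normals; for a tangent-space map the object (or
the slope directions) would have to be oriented so that the surface's U, V
and normal directions line up with x, y and z.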

The application would be "bump maps" used as textures in a real-time
renderer. One could render, say, a wall using just ambient light to get the
colors, then render again with the "normal map pigment" to get the normal for
each pixel. The real-time app would then combine the two, e.g. Pixel =
Colormap * (Normalmap dot Lightdirection), to allow the illumination to
change at run time.
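
For the two passes, the sphere in the sketch above could be replaced with a
switched version (NormalPass is just an illustrative name; I believe it can
be set from outside the scene with the Declare=NormalPass=1 INI option):

// Same object rendered twice: pass 0 bakes the plain colours,
// pass 1 bakes the encoded normals.
#ifndef (NormalPass)
  #declare NormalPass = 0;
#end

sphere {
  0, 1
  #if (NormalPass)
    texture { NormalMapTexture }        // normal pass: encoded normals
  #else
    texture {
      pigment { checker rgb 0.2, rgb 0.9 scale 0.25 }
      finish { ambient 1 diffuse 0 }    // colour pass: "just ambient light"
    }
  #end
}

The real-time side would first decode N = 2*Normalmap - 1 per texel before
taking the dot product with the light direction.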

