> Second, displacement mapping is a concept which usually applies to meshes
> (or meshed NURBS...), which are a very small subset of POV-Ray's capabilities.
AFAIK POV-Ray doesn't support NURBS.
Displacement mapping is usually done while tessellating shapes into
triangles - but POV-Ray has no need to tessellate objects (except for
bezier patches).
If patched versions of POV-Ray add support for NURBS and SDS, it would
be a nice feature to add displacement mapping for those objects :-)
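To make "displacement during tessellation" concrete, here is a minimal sketch of the idea (not POV-Ray code; the parametric surface and the `height` pattern are made up for illustration). A sphere parametrisation stands in for any patch evaluator; each generated vertex is pushed along its normal by the pattern value while the mesh is being built:

```python
import math

def sphere_point(u, v, radius=1.0):
    """Evaluate a sphere parametrisation; stands in for any parametric
    surface evaluator (e.g. a bezier-patch evaluator)."""
    return (radius * math.sin(v) * math.cos(u),
            radius * math.sin(v) * math.sin(u),
            radius * math.cos(v))

def sphere_normal(u, v):
    # For a sphere centred at the origin, the unit normal is the unit position.
    return sphere_point(u, v, 1.0)

def height(u, v):
    """Hypothetical displacement pattern (a simple ripple)."""
    return 0.1 * math.sin(5 * u) * math.sin(5 * v)

def tessellate_displaced(u_steps, v_steps):
    """Build displaced vertices: evaluate the surface, then move each
    vertex along its normal by the pattern value. The displacement
    happens at tessellation time, not at shading time."""
    verts = []
    for i in range(u_steps + 1):
        for j in range(v_steps + 1):
            u = 2 * math.pi * i / u_steps
            v = math.pi * j / v_steps
            p = sphere_point(u, v)
            n = sphere_normal(u, v)
            h = height(u, v)
            verts.append(tuple(pc + h * nc for pc, nc in zip(p, n)))
    return verts
```

Since POV-Ray only tessellates bezier patches, this is also the only place such a hook could currently go.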
Sascha Ledinsky wrote:
>
> AFAIK POV-Ray doesn't support NURBS.
No, POV-Ray supports NURBS as much as most other renderers support them
- by rendering meshes generated from them. It is possible to directly
raytrace NURBS of course but that is rarely done. See also my general
remark concerning 'does POV-Ray support xxx' questions.
> If patched versions of POV-Ray add support for NURBS and SDS, it would
> be a nice feature to add displacement mapping for those objects :-)
No. No algorithm commonly used to directly raytrace NURBS would allow
displacement mapping. So the above statement does not make sense.
Christoph
--
POV-Ray tutorials, include files, Sim-POV,
HCR-Edit and more: http://www.tu-bs.de/~y0013390/
Last updated 23 Sep. 2004 _____./\/^>_*_<^\/\.______
> No, POV-Ray supports NURBS as much as most other renderers support them
> - by rendering meshes generated from them.
Sure, POV-Ray supports triangle meshes - but you'd need an external
application to generate a triangle mesh from a NURBS mesh (of course the SDL
could do this, but it's quite slow when you need to tessellate thousands
of NURBS patches).
A scene file containing triangle meshes is much larger than a file
describing just the NURBS, which can be a problem when distributing
scenes via the internet (e.g. for the Internet Movie Project).
Another problem when tessellating "outside" of POV-Ray is the trim curves...
> It is possible to directly raytrace NURBS of course but that is rarely done.
I've read some papers about directly ray-tracing bezier patches (I think
NURBS would work quite similarly). The conclusion was that it's also just
an approximation, and a tessellated triangle mesh of the same quality
would render faster, so there would be no real advantage in raytracing
"directly".
> No. No algorithm commonly used to directly raytrace NURBS would allow
> displacement mapping. So the above statement does not make sense.
I meant the tessellation approach (as used in POV-Ray for
bezier patches). Theoretically it should be possible to add
displacement mapping to the bezier-patch tessellation code.
The big advantage of doing displacement mapping within POV-Ray would be
that all the patterns that can be used for bump mapping could be re-used.
Don't get me wrong: isosurfaces are a really cool feature of POV-Ray
and offer a lot of possibilities. It's just a bit of an overkill to use
that feature for displacement mapping...
Christoph Hormann <chr### [at] gmxde> wrote:
> No, POV-Ray supports NURBS as much as most other renderers support them
> - by rendering meshes generated from them.
But I suppose that one big difference between supporting NURBS directly
and "supporting" them only by being able to render a triangle mesh
generated from the NURBS is that when rendering NURBS directly, all
kinds of speed/memory optimizations can be made (e.g. tessellate on demand,
that is, tessellate only the part currently being rendered, and when it's
done, throw away those triangles and go on to the next part...).
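The tessellate-on-demand idea can be sketched like this (a toy illustration, not how any real renderer is structured): patches become triangles only when something actually needs them, and a small bounded cache lets finished regions be thrown away instead of keeping every triangle in memory.

```python
from collections import OrderedDict

class LazyTessellator:
    """Sketch of tessellate-on-demand: a patch is turned into triangles
    only when first requested, and a bounded LRU cache discards the
    least-recently-used patch's triangles when the cache overflows."""

    def __init__(self, patches, tessellate_fn, max_cached=4):
        self.patches = patches            # patch id -> control data
        self.tessellate = tessellate_fn   # patch data -> triangle list
        self.max_cached = max_cached
        self.cache = OrderedDict()        # patch id -> triangles
        self.tessellation_count = 0       # how often we really tessellated

    def triangles_for(self, patch_id):
        if patch_id in self.cache:
            self.cache.move_to_end(patch_id)   # mark as recently used
            return self.cache[patch_id]
        tris = self.tessellate(self.patches[patch_id])
        self.tessellation_count += 1
        self.cache[patch_id] = tris
        if len(self.cache) > self.max_cached:
            self.cache.popitem(last=False)     # evict least recently used
        return tris
```

The trade-off is that a patch whose triangles were evicted must be tessellated again the next time a ray touches it - memory bounded, time paid on re-visits.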
--
plane{-x+y,-1pigment{bozo color_map{[0rgb x][1rgb x+y]}turbulence 1}}
sphere{0,2pigment{rgbt 1}interior{media{emission 1density{spherical
density_map{[0rgb 0][.5rgb<1,.5>][1rgb 1]}turbulence.9}}}scale
<1,1,3>hollow}text{ttf"timrom""Warp".1,0translate<-1,-.1,2>}// - Warp -
Sascha Ledinsky wrote:
>
>> No. No algorithm commonly used to directly raytrace NURBS would allow
>> displacement mapping. So the above statement does not make sense.
>
>
> I meant the tesselation approach (as used in POV-Ray for
> bezier-patches). Theoretically it should be possible to add
> displacement mapping to the bezier-patch tesselation code.
Why would you want to do this if it is a mesh anyway and you could
displace the mesh (any mesh, not just patches)?
Christoph
--
POV-Ray tutorials, include files, Sim-POV,
HCR-Edit and more: http://www.tu-bs.de/~y0013390/
Last updated 23 Sep. 2004 _____./\/^>_*_<^\/\.______
delle wrote:
> Is there someone working on "Displacement Mapping" in Povray (as a Patch)?
>
> Best regards.
>
> Delle.
>
>
If you have a mesh in a 3D model file you can use PoseRay to read the
mesh, subdivide it, and displace it. Then you can either export it or
render it in POV-Ray directly from PoseRay.
http://user.txcyber.com/~sgalls/
The displacement is not adaptive to the gradient but it works.
later,
FlyerX
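The subdivide-then-displace pipeline FlyerX describes can be sketched in a few lines (a toy uniform midpoint subdivision, not PoseRay's actual algorithm): each triangle is split into four via its edge midpoints, repeated for a few levels, after which the dense vertex grid can be displaced.

```python
def midpoint(a, b):
    return tuple((x + y) / 2 for x, y in zip(a, b))

def subdivide(tri):
    """Split one triangle into four using its edge midpoints - one level
    of the uniform (non-adaptive) subdivision done before displacing."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def subdivide_mesh(tris, levels):
    """Apply `levels` rounds of subdivision: triangle count grows 4x per level."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris
```

The 4x growth per level is why such displacement is memory-hungry, and why it is "not adaptive to the gradient": flat regions get just as many triangles as detailed ones.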
Christoph Hormann wrote:
>> I meant the tesselation approach (as used in POV-Ray for
>> bezier-patches). Theoretically it should be possible to add
>> displacement mapping to the bezier-patch tesselation code.
>
>
> Why would you want to do this if it is a mesh anyway and you could
> displace the mesh (any mesh, not just patches)?
I think there is a misconception about what displacement mapping does. What
I understand as displacement mapping is in effect quite similar to bump
mapping (POV-Ray's normal { ... } feature). The difference is that true
displacement mapping does not simply change the surface normals (that's
only a side effect), but actually changes the geometry. Thus, the
displaced geometry will self-shadow, and the displacement itself is visible.
Here is an example image of a RenderMan displacement shader applied to a
sphere: http://jrman.sourceforge.net/image.php?img=bumpBalls.png
Compared to a bump-mapped version
http://www.povray.org/documentation/images/tutorial/pic1.png
the true displacement has several advantages: the displacement itself is
visible, there is correct self-shadowing, and there is no "shadow line
artifact".
While I totally agree that the displaced sphere can be done in POV-Ray
using iso-surfaces, I don't think that it makes much sense to use
iso-surfaces for e.g. character animation.
Back to your question:
I'd agree if we're talking about very-high-resolution triangle meshes
only. Displacing the vertices has the same effect as applying a
"displacement shader". But I disagree as far as NURBS, other patch
meshes, or parametric surfaces in general are concerned (currently POV-Ray
supports bicubic bezier patches only, and not very well at that).
Displacing just the control points of a NURBS mesh certainly does not have
the desired effect!
For example, take a simple cylinder-like shape made of four bicubic
patches (48 unique control points). Displacing these 48 control points
will change the shape and maybe add some bulges - but you couldn't apply
displacements that would give it (for example) the appearance of a
brick wall that way.
Now, when POV-Ray parses the patches, it turns each patch into a
triangle mesh (e.g. of 128 or more triangles, depending on u_steps and
v_steps). Displacing this triangle grid is very close to what a true
displacement shader would do.
Of course, using an external application (or an SDL macro) to tessellate the
parametric surface into a high-density triangle mesh and THEN applying
displacement to the tessellated grid would have the same effect - but then I
would neither claim that POV-Ray supports NURBS, nor that it supports
displacement mapping...
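The bump-vs-displacement distinction above can be shown in a few lines (function names are mine, for illustration only): bump mapping leaves every point where it is and only perturbs the shading normal, while displacement actually moves the point along the normal.

```python
def normalize(v):
    m = sum(c * c for c in v) ** 0.5
    return tuple(c / m for c in v)

def bump_shade(position, normal, pattern_gradient):
    """Bump mapping: geometry untouched; only the shading normal is
    tilted by the pattern gradient (the side effect displacement also has)."""
    perturbed = normalize(tuple(n + g for n, g in zip(normal, pattern_gradient)))
    return position, perturbed            # same position, new normal

def displace(position, normal, height):
    """True displacement: the point itself moves along the normal, so the
    silhouette changes and the surface can self-shadow."""
    moved = tuple(p + height * n for p, n in zip(position, normal))
    return moved, normal
```

Because `bump_shade` never moves the point, the silhouette and shadow outline stay those of the undisplaced surface - exactly the "shadow line artifact" mentioned above.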
> What would be interesting is support for adaptive render
> time subdivision of meshes with displacement.
>
Now, I would be interested in knowing how you can achieve that!
Sincerely, I'm afraid you would need to transform each triangle into a
bounding volume and then perform the actual subdivision 'on intersection'.
To me, it sounds just like a trade of memory for time.
(OK, the bigger the initial triangles, the fewer costly top-level
intersections you have, but you are going to need more subdivision on the
fly. And from my personal point of view, I like to have all allocations
fully done once the scene has been parsed.)
I may be totally mistaken.
But you get the idea - give it a try!
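The 'on intersection' gatekeeping step would rest on a cheap ray/box test: each coarse triangle gets an axis-aligned box padded by the maximum displacement, and only rays that enter the box trigger the actual subdivision. A sketch of the standard slab test (just the gate, not the subdivision machinery):

```python
def ray_hits_aabb(origin, direction, lo, hi):
    """Slab test: does the ray (origin + t*direction, t >= 0) enter the
    axis-aligned box [lo, hi] that bounds a displaced triangle?"""
    tmin, tmax = -float("inf"), float("inf")
    for o, d, l, h in zip(origin, direction, lo, hi):
        if abs(d) < 1e-12:
            # Ray parallel to this slab: must already lie between the planes.
            if o < l or o > h:
                return False
            continue
        t1, t2 = (l - o) / d, (h - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        tmin, tmax = max(tmin, t1), min(tmax, t2)
    return tmax >= max(tmin, 0.0)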
--
l'habillement, les chaussures que le maquillage et les accessoires.
Post a reply to this message
|
|
| |
| |
|
|
|
|
| |
| |
|
|
Sascha Ledinsky wrote:
>>
>> Why would you want to do this if it is a mesh anyway and you could
>> displace the mesh (any mesh, not just patches)?
>
>
>
> Here is an example image of a renderman displacement shader applied to a
> sphere: [...]
Renderman is using scanline rendering techniques, this has nothing to do
here. And there of course is no self shadowing in the image you used as
example.
> Back to your question:
> I'd agree if we're talking about very-high-resolution triangle meshes
> only. Displacing the vertices has the same effect as applying a
> "displacement shader". But I disagree as far as NURBS, other patch
> meshes, or parametric surfaces in general are concerned (currently POV
> supports bicubic bezier patches only, and that not very well).
> Displacing just the controlpoints of a NURBS mesh has certainly not the
> desired effect!
I'd suggest you reread what i wrote. This remark shows you have
completely misunderstood it.
Christoph
--
POV-Ray tutorials, include files, Sim-POV,
HCR-Edit and more: http://www.tu-bs.de/~y0013390/
Last updated 23 Sep. 2004 _____./\/^>_*_<^\/\.______
Post a reply to this message
|
|
| |
| |
|
|
|
|
| |
| |
|
|
Le Forgeron wrote:
>
>> What would be interesting is support for adaptive render
>>time subdivision of meshes with displacement.
>>
>
> Now, I would be interested in knowing how you can achieve that!
>
> Sincerely, I'm afraid you would need to transform each triangle into a
> bounding volume and then perform 'on intersection', the actual subdivision.
>
> For me, it sound just as a trade of memory for time.
Sure but as you know time always scales linearly but you can't use more
memory than you have (at least not much more).
It has already been demonstrated that this is possible and also works
quite efficiently.
Christoph
--
POV-Ray tutorials, include files, Sim-POV,
HCR-Edit and more: http://www.tu-bs.de/~y0013390/
Last updated 23 Sep. 2004 _____./\/^>_*_<^\/\.______
Post a reply to this message
|
|
| |
| |
|
|
|
|
| |