POV-Ray : Newsgroups : povray.newusers : mipmapping (Messages 12 to 21 of 31)
From: Slime
Subject: Re: mipmapping
Date: 11 May 2005 15:21:10
Message: <42825b26$1@news.povray.org>
> Trilinear between what? The four corners of the pixel? You'll lose some
> information then (just imagine: 4 black pixels on the texture with white in
> between! *gosh* :-)

I think he means using mipmaps, and then taking the two nearest mipmaps for
the scale level of the current sample, bilinearly interpolating to get the
color for each of them at the sample position, and then linearly
interpolating between the two results to get the final value.
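
For what it's worth, that lookup boils down to something like the following
minimal C++ sketch (with a made-up Image type; this is not code from POV-Ray
or any actual driver):

#include <algorithm>
#include <cmath>
#include <vector>

struct Color { float r, g, b; };

struct Image {
    int width, height;
    std::vector<Color> texels;                 // row-major, width * height entries

    Color texel(int x, int y) const {
        x = std::clamp(x, 0, width - 1);       // clamp lookups at the borders
        y = std::clamp(y, 0, height - 1);
        return texels[y * width + x];
    }

    static Color lerp(Color a, Color b, float t) {
        return { a.r + (b.r - a.r) * t,
                 a.g + (b.g - a.g) * t,
                 a.b + (b.b - a.b) * t };
    }

    // Bilinear interpolation of the four texels around (u, v), both in [0, 1].
    Color bilinear(float u, float v) const {
        float x = u * width - 0.5f, y = v * height - 0.5f;
        int x0 = (int)std::floor(x), y0 = (int)std::floor(y);
        float fx = x - x0, fy = y - y0;
        Color top = lerp(texel(x0, y0),     texel(x0 + 1, y0),     fx);
        Color bot = lerp(texel(x0, y0 + 1), texel(x0 + 1, y0 + 1), fx);
        return lerp(top, bot, fy);
    }
};

// Trilinear lookup: bilinearly sample the two mipmap levels bracketing the
// requested level of detail, then blend by the fractional part of the LOD.
// Assumes the pyramid has at least one level.
Color trilinear(const std::vector<Image>& mipmaps, float u, float v, float lod) {
    lod = std::clamp(lod, 0.0f, (float)(mipmaps.size() - 1));
    int lo = (int)std::floor(lod);
    int hi = std::min(lo + 1, (int)mipmaps.size() - 1);
    Color a = mipmaps[lo].bilinear(u, v);
    Color b = mipmaps[hi].bilinear(u, v);
    return Image::lerp(a, b, lod - lo);
}

Picking the level of detail (lod) itself is a separate question.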

In my experience, this approach generates a little too much blurring, and
when I program with OpenGL, I often wish that I could have nice
anti-aliasing instead.

 - Slime
 [ http://www.slimeland.com/ ]



From: Tim Nikias
Subject: Re: mipmapping
Date: 11 May 2005 15:27:08
Message: <42825c8c$1@news.povray.org>
> I think he means using mipmaps, and then taking the two nearest mipmaps for
> the scale level of the current sample, bilinearly interpolating to get the
> color for each of them at the sample position, and then linearly
> interpolating between the two results to get the final value.
>
> In my experience, this approach generates a little too much blurring, and
> when I program with OpenGL, I often wish that I could have nice
> anti-aliasing instead.

As I just recently learned in a course at my university, the way you
describe it is the standard hardcoded way on older graphics cards. The newer
ones (ATI Radeon 9800 and upwards, and I think the NVidia FX series and
upwards) come with programmable vertex and fragment shaders, and with
fragment shaders you can script more elaborate ways of interpolating. The guy
showed us some examples of the usual mipmap fading to avoid jumps between
the different mipmap levels, and one which interpolates among several
mipmap levels depending on the angle of the pixel (and thus the area it
would cover on the image map). It looked really crisp.
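
Just to make concrete what that level selection is keyed on, here is a small
C++ stand-in for the shader maths (hypothetical names, not code from any
actual shader or driver; du_dx and friends stand for the screen-space
derivatives of the texture coordinates that the hardware provides per
fragment):

#include <algorithm>
#include <cmath>

// Pick a mipmap level from the screen-space footprint of one fragment.
// At grazing angles one of the derivative pairs grows, which is the
// "angle of the pixel" effect mentioned above.
double mipmap_level(double du_dx, double dv_dx, double du_dy, double dv_dy,
                    double tex_width, double tex_height)
{
    double len_x = std::hypot(du_dx * tex_width, dv_dx * tex_height);  // texels per pixel step in x
    double len_y = std::hypot(du_dy * tex_width, dv_dy * tex_height);  // texels per pixel step in y

    // Classic isotropic choice: the larger axis decides the level; a custom
    // fragment shader could instead blend several levels, as described above.
    double rho = std::max(len_x, len_y);
    return rho > 1.0 ? std::log2(rho) : 0.0;
}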

However, we're drifting away from the topic at hand, which is helping tahoma
get his images to look good. :-)

-- 
"Tim Nikias v2.0"
Homepage: <http://www.nolights.de>



From: Christoph Hormann
Subject: Re: mipmapping
Date: 11 May 2005 15:45:01
Message: <d5tnad$9nm$1@chho.imagico.de>
Warp wrote:
> 
>   How so?
> 
>   I don't see why mipmapping, and better yet trilinear interpolation
> of image maps, could not be implemented in a raytracer. There are a
> couple of twists, but they can probably be solved.

Good luck in doing so.  Note that any such technique will have to modify the
very basic concept of raytracing with additions like ray differentials, and
you will have a hard time making this work efficiently in arbitrary cases
(e.g. non-standard camera types, reflections/refractions etc.).  In the end I
doubt this will lead to better results in shorter time (but I will be happy
to be proven wrong).

Christoph

-- 
POV-Ray tutorials, include files, Sim-POV,
HCR-Edit and more: http://www.tu-bs.de/~y0013390/
Last updated 03 May. 2005 _____./\/^>_*_<^\/\.______



From: Warp
Subject: Re: mipmapping
Date: 11 May 2005 17:50:29
Message: <42827e25@news.povray.org>
Tim Nikias <JUSTTHELOWERCASE:timISNOTnikias(at)gmx.netWARE> wrote:
> Trilinear between what? The four corners of the pixel? You'll lose some
> information then (just imagine: 4 black pixels on the texture with white in
> between! *gosh* :-)

  Perhaps you should look up what trilinear interpolation is.

-- 
                                                          - Warp



From: Warp
Subject: Re: mipmapping
Date: 11 May 2005 17:55:09
Message: <42827f3d@news.povray.org>
Christoph Hormann <chr### [at] gmxde> wrote:
> Good luck in doing so.  Note that any such technique will have to modify the
> very basic concept of raytracing with additions like ray differentials, and
> you will have a hard time making this work efficiently in arbitrary cases
> (e.g. non-standard camera types, reflections/refractions etc.).  In the end I
> doubt this will lead to better results in shorter time (but I will be happy
> to be proven wrong).

  For surfaces seen directly by the camera it's enough to know the length
of the ray and the scale of the image map.
  For surfaces seen indirectly an estimation will usually be more than
enough.
  Pathological cases exist, of course, but I'm pretty sure many people would
be very happy even with an estimated version.
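
  A back-of-the-envelope sketch of that estimate in C++ (made-up parameter
names, not actual POV-Ray source):

#include <cmath>

// pixel_angle:     angular width of one screen pixel, in radians
// ray_length:      distance from the camera to the intersection point
// texels_per_unit: image map resolution divided by its scale in scene units
double estimated_lod(double ray_length, double pixel_angle, double texels_per_unit)
{
    // Approximate side of the pixel's footprint on the surface, ignoring the
    // viewing angle (one of the pathological cases).
    double footprint = ray_length * std::tan(pixel_angle);

    // How many texels that footprint covers; level 0 is the full-size image.
    double texels_covered = footprint * texels_per_unit;
    return texels_covered > 1.0 ? std::log2(texels_covered) : 0.0;
}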

-- 
                                                          - Warp



From: Warp
Subject: Re: mipmapping
Date: 11 May 2005 17:57:54
Message: <42827fe2@news.povray.org>
tahoma <nomail@nomail> wrote:
> Trilinear filtering is what I asked for. Because of the dependency on
> mipmapping I just mentioned this one. If POV-Ray supported mipmapping,
> then trilinear filtering would naturally be there, I think.

  It's not all that obvious. Just adding support for mipmapping doesn't
automatically add trilinear filtering. That has to be added explicitly
as well. And it's not like trilinear filtering is the only option.
There could be anisotropic filtering instead, for example.

-- 
                                                          - Warp



From: Tim Nikias
Subject: Re: mipmapping
Date: 11 May 2005 18:29:10
Message: <42828736$1@news.povray.org>
>   Perhaps you should look up what trilinear interpolation is.

Nope, I know what it is. :-) I was just wondering what you wanted to
interpolate, but it seems Slime got it right away. To make things worse, my
description was of course only bilinear, so that should have tipped me off
right away as to what you were getting at. But then again, it has been a long
day and I'm pretty tired. Sorry for the inconvenience caused. :-)

-- 
"Tim Nikias v2.0"
Homepage: <http://www.nolights.de>



From: Christoph Hormann
Subject: Re: mipmapping
Date: 11 May 2005 18:30:01
Message: <d5u0o9$bpd$1@chho.imagico.de>
Warp wrote:
> 
>   For surfaces seen directly by the camera it's enough to know the length
> of the ray and the scale of the image map.

Assuming:

- a normal perspective camera
- no warps etc. being applied to the image map

To sum it up: in all the cases where you are probably faster using
hardware-accelerated scanline rendering anyway.

Again, if you implement something along these lines it will surely be
interesting from a technical standpoint, but the practical use will be very
limited.

Christoph

-- 
POV-Ray tutorials, include files, Sim-POV,
HCR-Edit and more: http://www.tu-bs.de/~y0013390/
Last updated 03 May. 2005 _____./\/^>_*_<^\/\.______



From: tahoma
Subject: Re: mipmapping
Date: 11 May 2005 18:33:48
Message: <opsqmualixx5erxw@leningrad>
On Wed, 11 May 2005 21:26:33 +0200, Tim Nikias
<JUSTTHELOWERCASE:timISNOTnikias(at)gmx.netWARE> wrote:

>> I think he means using mipmaps, and then taking the two nearest mipmaps for
>> the scale level of the current sample, bilinearly interpolating to get the
>> color for each of them at the sample position, and then linearly
>> interpolating between the two results to get the final value.
>>
>> In my experience, this approach generates a little too much blurring, and
>> when I program with OpenGL, I often wish that I could have nice
>> anti-aliasing instead.
>
> As I just recently learned in a course at my university, the way you
> describe it is the standard hardcoded way on older graphics cards. The newer
> ones (ATI Radeon 9800 and upwards, and I think the NVidia FX series and
> upwards) come with programmable vertex and fragment shaders, and with
> fragment shaders you can script more elaborate ways of interpolating. The guy
> showed us some examples of the usual mipmap fading to avoid jumps between
> the different mipmap levels, and one which interpolates among several
> mipmap levels depending on the angle of the pixel (and thus the area it
> would cover on the image map). It looked really crisp.

That's the way I was hoping to go with POV-Ray as well: to program the
texture (or pigment) stage to support different blending modes, for example.
But that is another topic :) It's possible, just not quite the way my mind
works, so I have to rethink some stuff.

> However, we're drifting away from the topic at hand, which is helping tahoma
> get his images to look good. :-)

I hope to get them looking good by trying different antialiasing settings.
More I cannot gain from this thread (mipmapping is not supported and maybe
not that useful). Adjusting materials is on the one hand an artist's task and
on the other hand a job for whoever translates the OpenGL materials into
POV-Ray ones.

But what I also take from this is that offline rendering (raytracing) does
not really differ that much from realtime rendering. The concepts are almost
the same; the community is a little different :)

Thanks for your help


From: tahoma
Subject: Re: mipmapping
Date: 11 May 2005 18:52:41
Message: <opsqmu53xxx5erxw@leningrad>
> In the end I doubt this will lead to better results in shorter time
> (but I will be happy to be proven wrong).

That's why mipmapping or trilinear filtering is common practice in realtime
graphics. Besides the fact that it decreases cache misses when the GPU
fetches texels, it gives better results without FSAA.
Nevertheless, you can do fancy things with different textures. If you don't
use just downsampled textures, you can achieve some nice texturing effects
that depend on the fragment distance. I don't know if there is a pattern in
POV-Ray for this.
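
Just to make that idea concrete, a tiny numeric sketch in C++ (made-up names,
not an existing POV-Ray pattern): a blend weight between a detailed "near"
texture and a deliberately different "far" texture, driven by the fragment
distance much like a mipmap fade.

#include <algorithm>

// 0 up to near_limit (use the detailed texture only), 1 beyond far_limit
// (use the coarse or stylised texture only), with a linear ramp in between.
double far_texture_weight(double fragment_distance,
                          double near_limit, double far_limit)
{
    double t = (fragment_distance - near_limit) / (far_limit - near_limit);
    return std::clamp(t, 0.0, 1.0);
}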



