POV-Ray : Newsgroups : povray.off-topic : GPU rendering
  GPU rendering (Message 15 to 24 of 34)  
From: Orchid XP v8
Subject: Re: GPU rendering
Date: 18 Jul 2008 04:16:14
Message: <4880514e$1@news.povray.org>
>> OOC, is it possible to use a GPU to copy and transform chunks of video 
>> data? (E.g., rotate, scale, that kind of thing.)
> 
> Yes, you just draw a textured square wherever you want, using whatever 
> you want as the texture (the result of a previous calculation?).

I was hoping you'd say something like that. :-)

Is it possible to take a chunk of the frame buffer and turn it into a 
texture? Or would I have to render texture to texture and then make a 
copy to the screen so I can see it?

> The only limitation is that you cannot render to a texture at the same 
> time as using that texture as a source for drawing operations.

That won't be a problem.

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Orchid XP v8
Subject: Re: GPU rendering
Date: 18 Jul 2008 04:19:40
Message: <4880521c@news.povray.org>
scott wrote:
>> Uuhhh... any chance of some code? :-}
> 
> I don't have time to write anything working now, but I'm just having a 
> look at something I have already that would be a good basis...  This is 
> using DirectX 9 btw.
> 
> Use CreateTexture() to create two new 256x256 textures with the format 
> D3DFMT_R32F (that gives you a texture full of 32bit floats, as opposed 
> to the more usual RGBA bytes).  Also create two temp textures.
> 
> You can then lock the texture and write to it using CPU code, to set the 
> initial values.
> 
> I would create a big vertex buffer holding the 65k squares, each 1 unit 
> big and centered at the origin.  Also assign an x,y coordinate to each 
> vertex of each square (this is so the vertex shader knows which square it 
> is processing).
> 
> Then, in the game loop, use the Direct3D call SetRenderTarget() to set 
> the first temp texture as the render target, and run your pixel shader 
> that calculates the new x coordinate.  Do the same for y.  If you have 
> DX10 then you can render to multiple render targets in one shot, which 
> could be faster.
> 
> Pixel shader code would look something like:
> 
> {
>  float currentX , currentY;
>  currentX = tex2D( xCoordTexture , ps_in.textureCoordinate ).r;
>  currentY = tex2D( yCoordTexture , ps_in.textureCoordinate ).r;
> 
>  < calculate new X coord here >
> 
>  return newX;
> 
> }
> 
>> (Also... what's a vertex shader?)
> 
> OK, so then the next bit, to actually draw all those points somewhere.  
> The vertex shader is what "normally" converts your mesh from model space 
> into screen space that is then used by the pixel shader to know where to 
> draw each pixel.  Usually it just multiplies by a big matrix, but we can 
> do whatever we want in it.
> 
> I'm going to look up the X,Y coordinates from those textures, using the 
> XY coordinates we set in the vertex buffer above, remember?
> 
> {
>  currentX = tex2D( xCoordTexture , vs_in.xCoordinate ).r;
>  currentY = tex2D( yCoordTexture , vs_in.yCoordinate ).r;
> 
>  vs_out.Pos.x = vs_in.Pos.x + vs_in.x + currentX;
>  vs_out.Pos.y = vs_in.Pos.y + vs_in.y + currentY;
> 
>  return vs_out;
> }
> 
> It returns each vertex of each square shifted to the position defined by 
> the coordinates in the xy texture.
> 
> The pixel shader used to draw those squares would then do something 
> really simple, like just increment the colour at that coordinate by 1.  
> Or if you want to get really clever you could make the square a bit 
> bigger (so it covers say 9 pixels) and do some antialiasing by hand in 
> the pixel shader.
> 
>> Would it be possible to do all this using just, say, OpenGL?
> 
> Yes.  But I don't know any of the syntax, sorry.

Ah, OK. Well I have a Haskell binding to OpenGL, but not for DirectX.

So the exact function calls you're talking about won't help me, but it 
gives me some idea what I'm trying to do...

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*


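[The two-texture scheme quoted above (render into a temp texture while sampling the other, then swap) is the standard GPGPU "ping-pong" pattern. A minimal CPU analogue in Python, with a toy affine map standing in for whatever the pixel shader would actually compute:]

```python
# CPU sketch of the GPGPU "ping-pong" pattern: two buffers per
# coordinate; each pass reads the current buffers and writes the
# scratch ones, then the roles swap (mirroring the rule that a
# texture cannot be sampled and rendered to at the same time).

def update(x, y):
    """Toy per-point rule; an IFS would pick a random transform here."""
    return 0.5 * x + 0.25, 0.5 * y

def iterate(xs, ys, passes):
    src_x, src_y = list(xs), list(ys)                # "current" textures
    dst_x, dst_y = [0.0] * len(xs), [0.0] * len(ys)  # "temp" textures
    for _ in range(passes):
        for i in range(len(src_x)):                  # the pixel shader's job
            dst_x[i], dst_y[i] = update(src_x[i], src_y[i])
        src_x, dst_x = dst_x, src_x                  # ping-pong swap
        src_y, dst_y = dst_y, src_y
    return src_x, src_y

points = iterate([0.0, 1.0], [0.0, 1.0], passes=3)
```

[The swap is the whole trick: the last pass's output becomes the next pass's input without ever reading and writing the same buffer.]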

From: scott
Subject: Re: GPU rendering
Date: 18 Jul 2008 04:24:33
Message: <48805341$1@news.povray.org>
> Is it possible to take a chunk of the frame buffer and turn it into a 
> texture?

You can, but it's a bit long-winded and slow.

> Or would I have to render texture to texture and then make a copy to the 
> screen so I can see it?

Yes, that's what is normally done.  It also gives you the advantage that 
you can do funky things with the pixel shader you use to write the texture 
to the screen, like colour look-up tables, motion blur, bloom, edge 
detection, and so on.


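[The "funky things on the copy to the screen" idea can be sketched on the CPU: here a colour look-up table is applied while blitting the offscreen buffer, standing in for that extra pixel-shader pass. The palette and function names are made up for illustration:]

```python
# CPU sketch: the offscreen buffer is copied to the "screen" through
# one more pass, which is a convenient place for effects.  Here the
# effect is a 256-entry colour look-up table mapping each 8-bit
# intensity to an (r, g, b) triple.

def make_fire_lut():
    """Illustrative LUT: black -> red -> yellow, a classic fractal palette."""
    lut = []
    for i in range(256):
        r = min(255, i * 2)
        g = max(0, min(255, (i - 128) * 2))
        lut.append((r, g, 0))
    return lut

def blit_with_lut(buffer, lut):
    """'Copy to screen', applying the LUT per pixel (the shader's job)."""
    return [[lut[p] for p in row] for row in buffer]

screen = blit_with_lut([[0, 128, 255]], make_fire_lut())
```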

From: scott
Subject: Re: GPU rendering
Date: 18 Jul 2008 04:25:38
Message: <48805382$1@news.povray.org>
> Ah, OK. Well I have a Haskell binding to OpenGL, but not for DirectX.
>
> So the exact function calls you're talking about won't help me, but it 
> gives me some idea what I'm trying to do...

I would try to find some very simple demos of pixel and vertex shaders 
written in OpenGL first and get them working.  IIRC they are called 
something else in OpenGL too, just to make matters more confusing.



From: Orchid XP v8
Subject: Re: GPU rendering
Date: 18 Jul 2008 04:52:07
Message: <488059b7$1@news.povray.org>
scott wrote:
>> Ah, OK. Well I have a Haskell binding to OpenGL, but not for DirectX.
>>
>> So the exact function calls you're talking about won't help me, but it 
>> gives me some idea what I'm trying to do...
> 
> I would try and find some very simple demos of pixel and vertex shaders 
> written in OpenGL first and get them working.  IIRC they are called 
> something else in OpenGL too, just to make matters more confusing.

OpenGL apparently calls a pixel shader a "fragment shader".

I found a small program that allows you to type in GLSL source code and 
see it applied to a mesh object. Should help me get to know the 
language, but won't help much with rendering a texture of floats. ;-)

The "other" fun thing is that shaders were introduced in OpenGL 2.0, for 
which no language spec is available online. You have to buy the book for 
that. But having a look at the Haskell OpenGL bindings, I think I see 
how it's supposed to work...

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Kevin Wampler
Subject: Re: GPU rendering
Date: 18 Jul 2008 14:42:03
Message: <4880e3fb@news.povray.org>
Orchid XP v8 wrote:
> There are 3 processes that need to happen.
> 
> 1. A stream of coordinates needs to be generated.
> 2. A histogram of the coordinates needs to be constructed.
> 3. A non-linear mapping from histogram frequencies to colours is performed.
> 
> Step 2 looks to be the tricky one - but it shouldn't be *vastly* 
> difficult I think. The question is whether the histograms will fit into 
> the limited RAM on the card...

I'd think that memory wouldn't be a problem here, as 256MB is enough to 
store a reasonably large histogram.


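[The three steps quoted above are the classic chaos-game density plot. A CPU sketch in Python, using the Sierpinski triangle IFS purely for illustration; the log map in step 3 is one common choice of non-linear mapping, not necessarily the one Orchid has in mind:]

```python
import math
import random

# CPU sketch of the three steps: (1) stream of coordinates from the
# chaos game, (2) histogram of where they land, (3) non-linear (log)
# mapping from hit counts to grey levels.

def chaos_game(n, size, seed=0):
    """Steps 1+2: generate n points and histogram them in one pass."""
    rng = random.Random(seed)
    corners = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
    hist = [[0] * size for _ in range(size)]
    x, y = 0.5, 0.5
    for _ in range(n):
        cx, cy = rng.choice(corners)          # pick a random transform
        x, y = (x + cx) / 2, (y + cy) / 2     # contract toward a corner
        hist[min(size - 1, int(y * size))][min(size - 1, int(x * size))] += 1
    return hist

def to_grey(hist):
    """Step 3: log map from hit counts to 0..255, brightest cell near 255."""
    peak = max(max(row) for row in hist) or 1
    scale = 255.0 / math.log(1 + peak)
    return [[int(math.log(1 + c) * scale) for c in row] for row in hist]

img = to_grey(chaos_game(100_000, 64))
```

[On the GPU, step 1 maps onto the point-coordinate textures discussed earlier, step 2 onto additive blending into a render target, and step 3 onto the final copy-to-screen shader.]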

From: Orchid XP v8
Subject: Re: GPU rendering
Date: 18 Jul 2008 16:09:39
Message: <4880f883$1@news.povray.org>
Kevin Wampler wrote:

> I'd think that memory wouldn't be a problem here, as 256MB is enough to 
> store a reasonably large histogram.

The *only* reason I increased the RAM in my system from 1 GB to 3 GB was 
to handle rendering high resolution IFS images. ;-)

OTOH, I'm talking about resolutions high enough for Zazzle...

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*


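[Whether a given histogram fits in video RAM is simple arithmetic, assuming one 32-bit counter per bucket; the 10000x10000 figure below is an arbitrary stand-in for a print-resolution render, not a number from the thread:]

```python
# Rough sizing for a 2D histogram with one 32-bit counter per bucket.

def histogram_mib(width, height, bytes_per_bucket=4):
    """Histogram size in MiB."""
    return width * height * bytes_per_bucket / 2**20

# 4096 x 4096  ->  64 MiB: fits easily on a 256 MB card.
# 10000 x 10000 -> ~381 MiB: too big, would need tiling into passes.
small = histogram_mib(4096, 4096)
large = histogram_mib(10000, 10000)
```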

From: Chambers
Subject: Re: GPU rendering
Date: 19 Jul 2008 19:58:52
Message: <48827fbc@news.povray.org>
Orchid XP v8 wrote:
> The "other" fun thing is that shaders were introduced in OpenGL 2.0, for 
> which no language spec is available online. You have to buy the book for 
> that. But having a look at the Haskell OpenGL bindings, I think I see 
> how it's supposed to work...

That's not true; you can download the specs for both GL2 and GLSL for 
free (they're separate documents, though).

...Chambers



From: Nicolas Alvarez
Subject: Re: GPU rendering
Date: 20 Jul 2008 17:26:05
Message: <4883ad6d@news.povray.org>
Orchid XP v8 wrote:
> (Also... I get the impression you need Vista for the GeForce 8 to work.)

I have an 8600 GT. Used to use it on Windows XP. Now running Linux.

(and many Compiz effects involving shaders, like transparency blur, indeed
get WAY more framerate than on my previous machine with a... GeForce 4?)



From: Orchid XP v8
Subject: Re: GPU rendering
Date: 22 Jul 2008 14:15:41
Message: <488623cd$1@news.povray.org>
Nicolas Alvarez wrote:
> Orchid XP v8 wrote:
>> (Also... I get the impression you need Vista for the GeForce 8 to work.)
> 
> I have an 8600 GT. Used to use it on Windows XP. Now running Linux.

Oh, OK then.

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.