POV-Ray : Newsgroups : povray.pov4.discussion.general
GPU Rendering (Message 11 to 20 of 41)
From: Tim Attwood
Subject: Re: GPU Rendering
Date: 13 Nov 2007 04:21:41
Message: <47396ca5$1@news.povray.org>
> Also look at nVidia's Gelato, they use the 3D card to help with rendering, 
> so it can't be all bad.  Maybe there is some technical paper somewhere 
> about what Gelato does that would give some ideas for POV4?

Gelato uses undocumented proprietary GPU commands on the cards.
NVIDIA may (probably will) introduce new commands as hardware
improves, making their products better, while public standards will always
lag behind. It's just a matter of competition.

It would be nice if graphics card companies would agree on some
published standards for communicating data from the GPU.  I wouldn't
be surprised if such standards end up in some version of DirectX,
furthering Microsoft's corner on the gaming market.



From: scott
Subject: Re: GPU Rendering
Date: 13 Nov 2007 06:46:03
Message: <47398e7b@news.povray.org>
>> Also look at nVidia's Gelato, they use the 3D card to help with 
>> rendering, so it can't be all bad.  Maybe there is some technical paper 
>> somewhere about what Gelato does that would give some ideas for POV4?
>
> Gelato uses undocumented proprietary GPU commands on the cards.
> NVIDIA may (probably will) introduce new commands as hardware
> improves, making their products better, while public standards will always
> lag behind. It's just a matter of competition.

I was thinking more of just using the "standard" shaders rather than 
programming the cards directly.  AFAIK the shaders are available to program 
in both OpenGL and DirectX, and will run on both ATI and nVidia cards.
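
A minimal sketch of that route, assuming ~2007-era GLSL; the diffuse
shader and all names here are hypothetical, and a real host program
would hand the string to the driver through the standard
glShaderSource()/glCompileShader() calls:

    #include <cstdio>

    // A vendor-neutral GLSL fragment shader embedded as a C++ string:
    // it computes a plain diffuse term and runs on any card (ATI or
    // nVidia) whose driver supports the standard.
    static const char* kDiffuseFrag = R"(
        uniform vec3 lightDir;   // unit vector toward the light
        varying vec3 normal;     // interpolated surface normal
        void main() {
            float d = max(dot(normalize(normal), lightDir), 0.0);
            gl_FragColor = vec4(vec3(d), 1.0);
        }
    )";

    int main() {
        // No GL context here; just show the source a real host would compile.
        std::printf("%s", kDiffuseFrag);
    }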



From: leope
Subject: Re: GPU Rendering
Date: 18 Nov 2007 06:45:00
Message: <web.474024cabb28b9b8688e61670@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> Allen <nomail@nomail> wrote:
> > Would it be possible to use the GPU as well as the CPU to render?
>
>   The general answer is: No.
>
> --
>                                                           - Warp

The general answer is: Yes.
"Gelato is a software program that leverages the NVIDIA GPU as a floating point
math processor. This allows Gelato to render images faster than comparable
renderers, but without the quality limitations traditionally associated with
real-time graphics processing on the GPU."



From: Warp
Subject: Re: GPU Rendering
Date: 18 Nov 2007 08:44:13
Message: <474041ad@news.povray.org>
leope <leo### [at] liberoit> wrote:
> The general answer is: Yes.

  The *general* answer is yes and then you give *one* isolated case where
hardware is used for rendering by unknown (proprietary) means?

  The general answer is still no.

-- 
                                                          - Warp



From: Chambers
Subject: Re: GPU Rendering
Date: 19 Nov 2007 14:15:00
Message: <web.4741e030bb28b9b8fcdb825b0@news.povray.org>
"zeroin23" <zer### [at] gmailcom> wrote:
> anyone looked at the below?
>
> Last Updated: 10 / 11 / 2007
> http://developer.nvidia.com/object/cuda.html

Yes, I have.  It has severe limitations in place that wouldn't affect games (or
many scanline renderers, for that matter), but which make it unusable for POV.

I'll take another look at it in a year or two, and see if it's changed enough by
then.  However, it's my personal prediction that for POV, the most efficient
course will be to wait for Intel and AMD to up the number of cores in their
chips (Intel has blabbed about the possibility of *thousands* of cores, while
AMD is being a bit more conservative in their predictions).
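
As a rough sketch of why more cores suit POV so directly (hypothetical
code, with trace_pixel() standing in for the real ray evaluation): every
pixel is an independent ray, so the image splits across however many
cores exist, keeping full 64-bit doubles throughout.

    #include <cstdio>
    #include <thread>
    #include <vector>

    // Dummy stand-in for tracing one primary ray; returns a checker
    // pattern instead of evaluating a real scene.
    static double trace_pixel(int x, int y) {
        return ((x ^ y) & 1) ? 1.0 : 0.0;
    }

    int main() {
        const int W = 640, H = 480;
        std::vector<double> image(W * H);
        int cores = (int)std::thread::hardware_concurrency();
        if (cores <= 0) cores = 4;                   // fall back if unknown
        std::vector<std::thread> pool;
        for (int t = 0; t < cores; ++t)
            pool.emplace_back([&, t] {
                for (int y = t; y < H; y += cores)   // interleaved scanlines
                    for (int x = 0; x < W; ++x)
                        image[y * W + x] = trace_pixel(x, y);
            });
        for (auto& th : pool) th.join();
        std::printf("rendered %dx%d on %d threads\n", W, H, cores);
    }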

....Chambers



From: Saul Luizaga
Subject: Re: GPU Rendering
Date: 22 Jan 2008 23:29:30
Message: <4796c2aa@news.povray.org>

> leope <leo### [at] liberoit> wrote:
>> The general answer is: Yes.
> 
>   The *general* answer is yes and then you give *one* isolated case where
> hardware is used for rendering by unknown (proprietary) means?
> 
>   The general answer is still no.

Search Criterion on Google: use the GPU as well as the CPU to render

If you click here: 
http://www.google.com/search?q=use+the+GPU+as+well+as+the+CPU+to+render&sourceid=mozilla-search&start=0&start=0&ie=utf-8&oe=utf-8&client=mozilla&rls=org.mozilla:es-ES:unofficial

you find some of these:

2CPU.com - The one stop source for everything SMP! 
(http://www.2cpu.com/story.php?id=3946)
BorisFX, HollywoodFX, and lots of other effects plug-ins use the GPU, 
too. ... But even a single-core CPU can render most effects in 
real-time, these days, ...

[PDF]CPU-GPU Hybrid Real Time Ray Tracing Framework 
(http://www.uni-weimar.de/cms/fileadmin/medien/vr/documents/Dokus/rtrt-paper.pdf)
ing the overall rendering into five render-passes. We therefore use 
the GPU wherever possible and only assign our fast CPU Ray Tracing 
algorithm where ...

[PPT]Implementing the Render Cache and the Edge-and-Point Image on ... 
(http://www.cs.cornell.edu/~eva5/ppt/epiGPU06.ppt)
We wanted to use Vertex Texture Fetch (VTF) for mapping the point cloud 
update but ... We presented a hybrid GPU/CPU system for the Render Cache 
and the EPI ...

www.gpgpu.org :: View topic - Problem with CPU-GPU parallel processing
(http://gpgpu.org/forums/viewtopic.php?t=4783)
Hi, I am trying to render a bloodtree data set. I use both CPU and GPU 
to do this. ...

[PDF]The GPU as a high performance computational resource
(http://www.math.sintef.no/gpu/pdf/Dokken_SCCG_2005.pdf)
CPU and GPU provided that the results can reside in the GPU. .... this 
efficiently we use the GPU to sample and evaluate the

and there are another nine pages on Google about this.

It looks like many people are doing it, all over the world, so I think the
general answer is: yes.



From: Saul Luizaga
Subject: Re: GPU Rendering
Date: 22 Jan 2008 23:44:13
Message: <4796c61d@news.povray.org>

> Gelato uses undocumented proprietary GPU commands on the cards.

I think you are mistaken here. nVidia has lots of info about how to 
program their GPUs; just go to the developer section, where they have 
megabytes of documentation and a free, downloadable developer environment.



From: Saul Luizaga
Subject: Re: GPU Rendering
Date: 23 Jan 2008 00:08:11
Message: <4796cbbb@news.povray.org>
I read once that OpenGL was conceived as a multi-platform 3D manager for 
3D GPUs, regardless of the brand of the video card, so this could be useful 
IMHO. I also think it would be A LOT of effort to write efficient code 
that achieves a harmonious coordination with any CPU/GPU combination. 
Among the possible obstacles to solve (a timing sketch follows the list):
-Accurately calculating the CPU-GPU latency.
-Accurately calculating the GPU-RAM latency.
-Accurately calculating the CPU-RAM latency.
-CPU-GPU coordination latency.
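
A minimal sketch of that timing pattern, assuming nothing about the
actual transfer API: plain memcpy stands in for the CPU-GPU upload (a
real probe would time the driver call instead), so only the timing
scaffold is meant literally.

    #include <chrono>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    int main() {
        const size_t kBytes = 1 << 20;                   // 1 MiB test buffer
        std::vector<char> src(kBytes, 1), dst(kBytes);
        const int kReps = 100;                           // average out noise
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < kReps; ++i)
            std::memcpy(dst.data(), src.data(), kBytes); // stand-in transfer
        auto t1 = std::chrono::steady_clock::now();
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
        std::printf("average 1 MiB copy: %.1f us\n", (double)us / kReps);
    }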

I have also read that the GPU is being used as a second powerful math 
co-processor, so I think POV-Ray should take advantage of features like 
this. It would be great for POV-Ray. It already made a great jump to 
parallelism in version 4; I think we are halfway to CPU-GPU parallel 
rendering. Maybe there is some open-source freeware already on the net 
that could help embed this feature faster/easier in a practical way, who 
knows...

I just love this program, I only want THE BEST tech for it :)



From: Patrick Elliott
Subject: Re: GPU Rendering
Date: 23 Jan 2008 00:57:39
Message: <MPG.22008439ce15c08598a0e6@news.povray.org>
In article <4796cbbb@news.povray.org>, sau### [at] netscapenet says...
> I read once that OpenGL was conceived as a multi-platform 3D manager for 
> 3D GPUs, regardless of the brand of the video card, so this could be useful 
> IMHO. I also think it would be A LOT of effort to write efficient code 
> that achieves a harmonious coordination with any CPU/GPU combination. 
> Among the possible obstacles to solve:
> -Accurately calculating the CPU-GPU latency.
> -Accurately calculating the GPU-RAM latency.
> -Accurately calculating the CPU-RAM latency.
> -CPU-GPU coordination latency.
> 
> I have also read that the GPU is being used as a second powerful math 
> co-processor, so I think POV-Ray should take advantage of features like 
> this. It would be great for POV-Ray. It already made a great jump to 
> parallelism in version 4; I think we are halfway to CPU-GPU parallel 
> rendering. Maybe there is some open-source freeware already on the net 
> that could help embed this feature faster/easier in a practical way, who 
> knows...
> 
> I just love this program, I only want THE BEST tech for it :)
> 
I am going to try to hit the main issues that always come up when this 
idea is presented. I may miss some. However, the two ***key*** issues 
here are:

1. Graphics cards are designed to operate using scanline systems, so 90% 
of what they do is completely useless to POVRay.

2. Even the stuff they could be used for, they can't, because not all 
features use the same floating point sizes, and no card currently 
supports 64-bit floating point. This creates two problems: a) POVRay 
would be limited to using only those floating point operations that "do" 
use the same sizes as it does, on 32-bit systems; and b) for the 90% of 
us who are *probably* running on 64-bit processors at this point, it 
simply won't work at all, since there is no practical way to *fake* 64-
bit operation via such a card that wouldn't add more overhead than you 
save by using the card. And that **assumes** you could make it work at 
all, since the cards are not designed to do that, and code that does do 
such things uses methods that are not easy (and may be impossible) to 
adapt to a graphics card. A small sketch below shows the precision gap.
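
The sketch, with made-up scene numbers: intersecting a ray with a unit 
sphere 10,000 units away, once in float (all a card of this era offers) 
and once in double (what POVRay uses internally).

    #include <cmath>
    #include <cstdio>

    // Distance to the near surface of a sphere dead ahead of the ray.
    // The discriminant should be exactly radius^2, but computing it
    // means subtracting two nearly equal squares -- which is where
    // 32-bit floats fall apart.
    template <typename T>
    T hit_distance(T center_dist, T radius) {
        T disc = center_dist * center_dist
               - (center_dist * center_dist - radius * radius);
        return center_dist - std::sqrt(disc);
    }

    int main() {
        std::printf("float : %f\n", hit_distance(10000.0f, 1.0f)); // 10000.000000 -- the radius is lost
        std::printf("double: %f\n", hit_distance(10000.0, 1.0));   // 9999.000000 -- correct
    }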

Put simply, by the time full 64-bit cards come out, some up-and-coming 
companies might be making POVRay almost entirely redundant anyway, by 
doing real raytracing on the card (at least one company I know of is 
trying to build one that will do that). So, it's not possible now, and it 
might be pointless when it becomes possible. You can be sure, however, 
that once it does become possible, someone will figure out a way to take 
advantage of it, in the same fashion as POVRay is now being redone to 
support multiple cores.

-- 
void main () {
    if (version == "Vista") {
        call slow_by_half();
        call DRM_everything();
        call functional_code();
    }
    else
        call crash_windows();
}




From: Warp
Subject: Re: GPU Rendering
Date: 23 Jan 2008 03:57:12
Message: <47970168@news.povray.org>
Saul Luizaga <sau### [at] netscapenet> wrote:
> 2CPU.com - The one stop source for everything SMP! 
> (http://www.2cpu.com/story.php?id=3946)
> BorisFX, HollywoodFX, and lots of other effects plug-ins use the GPU, 
> too. ... But even a single-core CPU can render most effects in 
> real-time, these days, ...

  Uh? It's talking about video encoding. That is, converting a series
of still images into an MPEG-2/MPEG-4 stream. It has nothing to do with
raytracing (or even scanline rendering).

> [PDF]CPU-GPU Hybrid Real Time Ray Tracing Framework 
> (http://www.uni-weimar.de/cms/fileadmin/medien/vr/documents/Dokus/rtrt-paper.pdf)
> ing the overall rendering into five render-passes. We therefore use 
> the GPU wherever possible and only assign our fast CPU Ray Tracing 
> algorithm where ...

  Looks to me like a theoretical "perhaps it could be done a bit like this"
experimental paper.

  "The depth of reection is currently limited to one and due to the current
implementation our framework provides only one directional light."

  "The overall performance of our framework closely depends on the amount
of secondary rays red by the Ray Tracer, while the GPU will mostly remain
under-worked. As a final result Figure 6 illustrates the computation times
of each render-pass and faces GPU and CPU. As expected the render-time is
limited by the CPU which obviously is the bottle-neck of our algorithm."

  And as far as I can see this was a very specialized (and limited)
raytracing algorithm which only supports triangles and GPU textures.
Mostly useless for a generic raytracer like POV-Ray.

> [PPT]Implementing the Render Cache and the Edge-and-Point Image on ... 
> (http://www.cs.cornell.edu/~eva5/ppt/epiGPU06.ppt)
> We wanted to use Vertex Texture Fetch (VTF) for mapping the point cloud 
> update but ... We presented a hybrid GPU/CPU system for the Render Cache 
> and the EPI ...

  It doesn't seem to be talking about raytracing at all. Also, whatever
it's talking about seems to be limited to what the GPU supports, i.e.
triangles and GPU textures. Mostly useless for a generic raytracer.

> www.gpgpu.org :: View topic - Problem with CPU-GPU parallel processing
> (http://gpgpu.org/forums/viewtopic.php?t=4783)
> Hi, I am trying to render a bloodtree data set. I use both CPU and GPU 
> to do this. ...

  It's not talking about raytracing at all.

> [PDF]The GPU as a high performance computational resource
> (http://www.math.sintef.no/gpu/pdf/Dokken_SCCG_2005.pdf)
> CPU and GPU provided that the results can reside in the GPU. .... this 
> efficiently we use the GPU to sample and evaluate the

  It's not talking about raytracing at all. It seems to be talking about how
the GPU might be used for things like image processing, solving
differential equations and linear algebra. These are completely different
(and, from the point of view of the GPU, much simpler) things than generic
raytracing.

> It looks like many people are doing it, all over the world, so I think the
> general answer is: yes.

  Only one of the pages you listed talked about raytracing, and even it
was limited (only triangles, only GPU textures, the CPU as a big bottleneck,
the example implementation supported only one reflection and one light
source). The rest had nothing to do with the subject.

  The general answer is still no.

-- 
                                                          - Warp



