I was wondering if POV-Ray rendering speed could be increased by using the
graphics card, or a custom FPGA if one is available. The graphics card may be
a little difficult to utilize (although NVIDIA has written its Gelato renderer
to use its GeForce cards), but what about an FPGA connected directly to the
computer?
"jhu" <nomail@nomail> wrote:
> I was wondering if Povray rendering speed could be increased by using the
> graphics card or a customized fpga if available. The graphics card may be a
> little difficult to utilize (although NVidia has written their Gelato renderer
> to use their GeForce cards), but how about an fpga directly connected to a
> computer?
Hi,
this has been discussed several times before; a search on this site for
"graphics card" gives more than 100 hits ;-). Read THIS
http://news.povray.org/povray.programming/thread/%3Cweb.4494e8a94ad1379aa6bb2e320%40news.povray.org%3E/?mtop=1
as an example...
Have a nice weekend
Karl
>> I was wondering if POV-Ray rendering speed could be increased by using the
>> graphics card, or a custom FPGA if one is available. The graphics card may
>> be a little difficult to utilize (although NVIDIA has written its Gelato
>> renderer to use its GeForce cards), but what about an FPGA connected
>> directly to the computer?
>
> Hi,
>
> This has been discussed several times before; a search on this site for
> "graphics card" gives more than 100 hits ;-). Read this thread, for example:
> http://news.povray.org/povray.programming/thread/%3Cweb.4494e8a94ad1379aa6bb2e320%40news.povray.org%3E/?mtop=1
>
> Have a nice weekend,
> Karl
The current methods of increasing POV-Ray render speed by hardware are:

1) purchase a faster multi-core computer
2) set up multiple computers as a render farm (see the sketch below)
3) rent processing time from an online vendor such as Amazon
($0.80 per hour per instance for a quad-core equivalent)
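For option 2, the usual low-tech trick is to split each frame into horizontal
bands with POV-Ray's +SR (start row) and +ER (end row) switches and render one
band per machine. A minimal Python sketch, with made-up host names and scene
file, and with the stitching of the partial images left out:

import subprocess

SCENE = "scene.pov"                      # hypothetical scene file
WIDTH, HEIGHT = 1024, 768
HOSTS = ["node1", "node2", "node3"]      # hypothetical render machines

band = HEIGHT // len(HOSTS)
procs = []
for i, host in enumerate(HOSTS):
    start = i * band + 1                 # POV-Ray rows are 1-based
    end = HEIGHT if i == len(HOSTS) - 1 else (i + 1) * band
    # Render just this band on the remote machine; the partial images
    # still need to be stitched together afterwards (omitted here).
    cmd = (f"povray +I{SCENE} +W{WIDTH} +H{HEIGHT} "
           f"+SR{start} +ER{end} +FN +Oband_{i}.png")
    procs.append(subprocess.Popen(["ssh", host, cmd]))
for p in procs:
    p.wait()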
It's tempting to think that graphics cards should be good at rendering
ray-traced images, but they are not: a typical GPU is optimized for z-buffered
scan-line graphics. If you started from scratch, you could put a decent
version of POV-Ray into hardware. FPGAs could be used to prototype such a
ray-tracing card, stripping unneeded instructions out of an existing
microprocessor design in order to fit more dedicated floating-point hardware
onto the chip. But it's more likely that NVIDIA or someone else will add
ray-tracing features to a future card as part of a hybrid
ray-tracing/scan-line design. It would certainly take a group of smart
electrical engineers, and I'm pretty sure the expertise around here tends to
be in software, not electrical engineering.
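To make the contrast concrete, here is the shape of the work a ray tracer
does, as a toy Python sketch (everything in it, the 80x40 "screen" and the
single hard-coded sphere, is made up for illustration): one primary ray per
pixel, each intersected analytically against the geometry, which maps poorly
onto a pipeline built to rasterize triangles into a z-buffer.

import math

WIDTH, HEIGHT = 80, 40
CX, CY, CZ, R = 0.0, 0.0, 3.0, 1.0      # sphere centre and radius

def hits_sphere(dx, dy, dz):
    # Ray from the origin: solve |t*d - c|^2 = r^2 for the nearest t > 0.
    a = dx*dx + dy*dy + dz*dz
    b = -2.0 * (CX*dx + CY*dy + CZ*dz)
    c = CX*CX + CY*CY + CZ*CZ - R*R
    disc = b*b - 4.0*a*c
    return disc >= 0.0 and (-b - math.sqrt(disc)) / (2.0*a) > 0.0

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # One primary ray through each pixel of a simple pinhole camera.
        row += "#" if hits_sphere((x - WIDTH/2.0)/WIDTH,
                                  (y - HEIGHT/2.0)/HEIGHT, 1.0) else "."
    print(row)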
jhu wrote:
> (although NVIDIA has written its Gelato renderer to use its GeForce cards),

Gelato can be controlled directly via Python. I've always wanted to try
writing a Gelato front-end that understands at least a reasonable subset
of POV-Ray SDL.
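I haven't actually written any of it, but the skeleton might look something
like the rough Python sketch below. Only bare sphere statements are handled,
the regex and the emit_sphere stub are made up for illustration, and the real
Gelato calls are left out because I don't know its Python API well enough to
fill them in.

import re

# Matches only the bare "sphere { <x, y, z>, radius" form; pigments,
# transforms, variables, #declare, etc. are all ignored.
SPHERE_RE = re.compile(
    r"sphere\s*\{\s*<([^,>]+),([^,>]+),([^>]+)>\s*,\s*([-+0-9.eE]+)")

def parse_spheres(sdl_text):
    """Yield ((x, y, z), radius) for each simple sphere statement."""
    for m in SPHERE_RE.finditer(sdl_text):
        x, y, z, r = (float(v) for v in m.groups())
        yield (x, y, z), r

def emit_sphere(center, radius):
    # Stub: the actual Gelato Python calls would go here.
    print(f"would hand Gelato a sphere at {center}, radius {radius}")

sample = "sphere { <0, 1, 2>, 0.5 pigment { color Red } }"
for center, radius in parse_spheres(sample):
    emit_sphere(center, radius)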
I don't know enough about Gelato's internals to be able to say what
level of CSG support it might have. The website only seems to talk about
meshes, so probably not much.
Alas, Dell foisted an Intel graphics card on me with my newest computer,
and the GeForce 4 in my old machine isn't supported by Gelato. :-(
--
William Tracy
afi### [at] gmailcom -- wtr### [at] calpolyedu
You know you've been raytracing too long when you invented glasses that
can be configured to use variable resolution (eg. 320x240, 640x480,
etc.), with POV-Ray style switches for other effects (eg. anti-aliasing,
radiosity, etc.)
-- Vimal N. Lad / Gautam N. Lad