  Re: GPU rendering  
From: nemesis
Date: 13 Jan 2010 14:16:25
Message: <4b4e1c09@news.povray.org>
andrel wrote:
> You will still have the general increase in power, so if you need a 30x 
> increase, just wait 7.5 years.

In 7.5 years, assuming Intel doesn't buy Nvidia, I'll be using all of the 
sheer processing power available rather than just the CPU.  So you may have 
your 30x speedup while your GPU sits idle, but I'll be making it sweat 
to give me 500-1000x speedups.

>> This is the future:  tapping all that hidden power that was being 
>> ignored so far because we insist on using a lame-O chip geared at 
>> word processing to do math operations.
> 
> That remark merely shows that you don't know anything about the history 
> and design of computers, or choose to ignore that.

It was obviously an over-the-top remark, but you get the point.

> Hopefully there will be an
> offspring (possibly GPU based) that will be able to parse POV scenes and
> generate a preview of less quality but in much smaller time. When that

This makes no sense at all:  people aren't getting into GPU computing for 
lame-O, lower-quality real-time previews, but to speed up final renders 
by a few orders of magnitude.

If you think GPU = game-like graphics quality, you're dead wrong. 
General-purpose GPU (GPGPU) programming is all about harnessing that huge 
available power for general-purpose computations, power that right now goes 
completely unused unless you're a gamer.
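
Just to make it concrete for anyone who hasn't touched GPGPU: here's a 
minimal sketch of my own (nothing to do with POV-Ray's code, and the 
function it evaluates is made up purely for illustration) of a CUDA kernel 
crunching plain floating-point math over a million-element array.  No 
textures, no triangles, just math spread over thousands of threads:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void eval(float *out, const float *in, int n)
{
    /* each thread handles one element -- plain math, nothing graphical */
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = sqrtf(in[i]) * 0.5f + in[i] * in[i];
}

int main(void)
{
    const int n = 1 << 20;                     /* a million elements */
    size_t bytes = n * sizeof(float);

    float *h_in  = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = (float)i;

    float *d_in, *d_out;
    cudaMalloc((void **)&d_in,  bytes);
    cudaMalloc((void **)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    /* launch a few thousand blocks of 256 threads each */
    eval<<<(n + 255) / 256, 256>>>(d_out, d_in, n);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[12345] = %f\n", h_out[12345]);

    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}

The exact arithmetic doesn't matter; the point is that the GPU happily runs 
this kind of workload whether or not a single pixel is ever drawn.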

-- 
a game sig: http://tinyurl.com/d3rxz9


