New LuxRender web site (http://www.luxrender.net) (Messages 81 to 90 of 175)
From: nemesis
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:12:43
Message: <47bd5cab@news.povray.org>
Warp wrote:
>   Btw, does unbiased rendering support volumetric lighting? I can't find
> any example image on this site nor in the Indigo gallery.
> 

I'm guessing they're waiting for 16-core CPUs to become a reality before 
supporting it. :)



From: scott
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:27:50
Message: <47bd6036$1@news.povray.org>
> Cascades sounds interesting. I'd certainly like to watch it. However, it 
> requires a more expensive GPU and a more expensive OS before it will even 
> consider running, so that's kind of the end of that.

You can watch it on YouTube, you might even be able to find a higher 
resolution version somewhere on the net.

http://youtube.com/watch?v=_tDK2hfxiw0

> Yes. And that is why POV-Ray can do things that a GPU can't. POV-Ray isn't 
> *trying* to be real-time. ;-)

So POV should strive to use more recent algorithms then, not just stick with 
ones that were around 20 years ago.

> I could add things like isosurfaces to the list.

You do realise that the big rock in the Cascades demo is an isosurface?

> (Have you ever seen a game where the water *actually ripples* rather than 
> just surface normal tricks?)

Yes, Crysis for one, plus there's some flight sim which is regularly cited 
in papers about this.  The method used is "vertex texture lookup", whereby 
the position of a vertex can be modified by a texture.  It is available from 
Vertex Shader 3.0 onwards (i.e. all cards less than a year or two old).  This 
means a heightfield can essentially be rendered on the GPU with little or no 
effort from the CPU.  If you're doing your water simulation on the GPU 
anyway, this saves a huge amount of bandwidth between the CPU and GPU.
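
For illustration, here is a minimal sketch of that idea in Python/NumPy rather 
than real shader code; the grid size and the ripple function are made-up 
placeholders, and on an actual GPU the per-vertex sampling below would of 
course run inside the vertex shader itself.

# A minimal CPU-side sketch of the vertex texture lookup idea:
# each vertex of a flat grid samples a height "texture" and is displaced by it.
import numpy as np

GRID = 64   # vertices per side of the water grid (arbitrary choice)

def height_texture(u, v, t):
    # Stand-in for the water simulation's height texture: an animated ripple.
    return 0.1 * np.sin(20.0 * u + 3.0 * t) * np.cos(20.0 * v + 2.0 * t)

def displace_grid(t):
    # Build a flat grid and move each vertex up/down by the sampled height.
    u, v = np.meshgrid(np.linspace(0.0, 1.0, GRID), np.linspace(0.0, 1.0, GRID))
    y = height_texture(u, v, t)              # the "vertex texture lookup"
    verts = np.stack([u, y, v], axis=-1)     # x = u, y = height, z = v
    return verts.reshape(-1, 3)

if __name__ == "__main__":
    frame = displace_grid(t=0.5)
    print(frame.shape, float(frame[:, 1].min()), float(frame[:, 1].max()))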

> Just don't try to tell me a GPU can do everything POV-Ray can. ;-)

I'm not, I'm just saying that it can do *most* of what POV can, in a 
fraction of the time.  And when people do start to use POV for animations, 
they chop out all the graphical goodies anyway to make it render in less 
than a year...



From: Invisible
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:34:33
Message: <47bd61c9@news.povray.org>
scott wrote:
>> Cascades sounds interesting. I'd certainly like to watch it. However, 
>> it requires a more expensive GPU and a more expensive OS before it 
>> will even consider running, so that's kind of the end of that.
> 
> You can watch it on YouTube, you might even be able to find a higher 
> resolution version somewhere on the net.
> 
> http://youtube.com/watch?v=_tDK2hfxiw0

Mmm, OK. Well obviously I can't do that from work, but when I get home 
tonight...

>> Yes. And that is why POV-Ray can do things that a GPU can't. POV-Ray 
>> isn't *trying* to be real-time. ;-)
> 
> So POV should strive to use more recent algorithms then, not just stick 
> with ones that were around 20 years ago.

Newton's laws of motion have been around for a tad longer than 20 years, 
and people still use 'em. ;-)

FWIW, I think it would certainly be interesting to have unbiased 
rendering as an option in POV-Ray. But the amount of work required is, 
realistically, prohibitive. You'd have to basically rewrite the whole 
program. And I don't see that happening any time soon...

>> I could add things like isosurfaces to the list.
> 
> You do realise that the big rock in the Cascades demo is an isosurface?

Really? I thought it was just a tessellated triangle mesh based on an 
isosurface? (Remember, I haven't actually been able to watch the demo yet.)

>> (Have you ever seen a game where the water *actually ripples* rather 
>> than just surface normal tricks?)
> 
> Yes, Crysis for one, plus there's some flight sim which is regularly 
> cited in papers about this.

...starting to see why Crysis is so hard to run... ;-)

OOC... Clearly Crysis has some pretty serious graphics. But is it *fun* 
to play?

>> Just don't try to tell me a GPU can do everything POV-Ray can. ;-)
> 
> I'm not, I'm just saying that it can do *most* of what POV can, in a 
> fraction of the time.

Well OK then. That I can live with...

> And when people do start to use POV for 
> animations, they chop out all the graphical goodies anyway to make it 
> render in less than a year...

LOL! Every pover knows it's true... ;-)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Severi Salminen
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:45:00
Message: <web.47bd6413b014483dd54d5bf70@news.povray.org>
> FWIW, I think it would certainly be interesting to have unbiased
> rendering as an option in POV-Ray. But the amount of work required is,
> realistically, prohibitive. You'd have to basically rewrite the whole
> program. And I don't see that happening any time soon...

How wrong you are :)

Just check out the images group. Fidos has already implemented simple but fully
working brute force rendering in POV. So no, there is no need to rewrite the whole
program. Brute force might actually make it a lot simpler.
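
For what it's worth, the core of such a brute force renderer really is small.
Below is a hedged, toy-sized sketch in Python: trace_random_path is a made-up
placeholder for a real path tracer, and the image size and sample count are
arbitrary. Each pixel just averages more and more random samples, so the
picture starts out noisy and converges over time.

# Toy sketch of brute force rendering: average many random path samples per
# pixel.  trace_random_path is a placeholder for a real path tracer.
import random

WIDTH, HEIGHT = 64, 48   # tiny image, purely for illustration

def trace_random_path(x, y):
    # A real renderer would shoot a jittered camera ray through pixel (x, y)
    # and follow it through random bounces, returning the carried radiance.
    return (random.random(), random.random(), random.random())

def render(samples_per_pixel):
    image = [[(0.0, 0.0, 0.0)] * WIDTH for _ in range(HEIGHT)]
    for y in range(HEIGHT):
        for x in range(WIDTH):
            r = g = b = 0.0
            for _ in range(samples_per_pixel):
                cr, cg, cb = trace_random_path(x, y)
                r, g, b = r + cr, g + cg, b + cb
            n = float(samples_per_pixel)
            image[y][x] = (r / n, g / n, b / n)   # running average = estimate
    return image

if __name__ == "__main__":
    print(render(samples_per_pixel=16)[0][0])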



From: scott
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:52:53
Message: <47bd6615@news.povray.org>
> FWIW, I think it would certainly be interesting to have unbiased rendering 
> as an option in POV-Ray. But the amount of work required is, 
> realistically, prohibitive. You'd have to basically rewrite the whole 
> program. And I don't see that happening any time soon...

Not just the rendering method, but things like different reflection and 
lighting models, newer methods of increasing the efficiency of ray tracing 
(I posted a link in the pov4 group), etc.  Given that most people seem to 
think SDL is POV's strongest point, why not improve the SDL to be more 
flexible?

>> You do realise that the big rock in the Cascades demo is an isosurface?
>
> Really? I thought it was just a tessellated triangle mesh based on an 
> isosurface? (Remember, I haven't actually been able to watch the demo 
> yet.)

Well yes, of course, nothing can directly show an isosurface; even POV has 
to sample the function to generate pixels.  But my point was that it shows an 
isosurface in real time, in fine detail (you'll see in the demo when you 
watch it).

> OOC... Clearly Crysis has some pretty serious graphics. But is it *fun* to 
> play?

I only played the demo, and that in a bit of a rush; the *gameplay* seemed 
pretty similar to FarCry, which isn't a bad thing.  It played fine on my nVidia 
7900 card; I think I had low or medium detail and it was above 30fps most of 
the time.  I think you'd need to play a few levels before any new gameplay 
became apparent, just IMO.



From: Stephen
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:53:46
Message: <2cpqr3hqrdue3o7feav88ehdslos7vj7j8@4ax.com>
On 20 Feb 2008 15:09:52 -0500, Jim Henderson <nos### [at] nospamcom> wrote:

>
>I've seen photos that don't look photorealistic to me.  It most certainly 
>is a matter of opinion.

As an aside, is a black and white photograph photorealistic, or a sepia one? Are
the Pre-Raphaelites, or chocolate-box paintings?

Regards
	Stephen



From: Severi Salminen
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:55:00
Message: <web.47bd655bb014483dd54d5bf70@news.povray.org>
>   I wonder if an automatic measurement and then a threshold couldn't be
> developed. For example, if a given pixel hasn't changed color for the
> last n rays which have affected that pixel, then that pixel is done.

I simply calculate the standard deviation of the pixel values. A decent metric
is to see how the change in variance goes down as the image converges.

Maybe this should be done locally, in areas of the same color.

Anyway, it can be done.
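
For instance, one way to do the automatic measurement Warp asks about is to
keep a running mean and variance per pixel (Welford's online algorithm) and
call the pixel done once the standard error of the mean drops below a
threshold. The Python sketch below is only an illustration: the 0.005
threshold and the simulated noisy samples are arbitrary assumptions, not
values from any actual renderer.

# Per-pixel convergence test: running mean/variance via Welford's algorithm,
# stopping once the standard error of the mean is below a chosen threshold.
import math
import random

class PixelStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # sum of squared deviations from the current mean

    def add_sample(self, value):
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)

    def std_error(self):
        if self.n < 2:
            return float("inf")
        variance = self.m2 / (self.n - 1)
        return math.sqrt(variance / self.n)

    def converged(self, threshold=0.005):   # threshold chosen arbitrarily
        return self.std_error() < threshold

if __name__ == "__main__":
    p = PixelStats()
    while not p.converged():
        p.add_sample(0.5 + random.gauss(0.0, 0.1))   # fake noisy pixel samples
    print(p.n, "samples, estimated pixel value", round(p.mean, 4))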



From: Invisible
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 06:55:07
Message: <47bd669b@news.povray.org>
Severi Salminen wrote:
>> FWIW, I think it would certainly be interesting to have unbiased
>> rendering as an option in POV-Ray. But the amount of work required is,
>> realistically, prohibitive. You'd have to basically rewrite the whole
>> program. And I don't see that happening any time soon...

> How wrong you are :)

Perhaps.

> Just check out the images group. Fidos has already implemented simple but fully
> working brute force rendering in POV.

Really? That's interesting. So what's the catch? ;-)

> So no, there is no need to rewrite the whole
> program. Brute force might actually make it a lot simpler.

Yes, I see how two algorithms instead of one would be a lot simpler... 
Oh, wait...

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Severi Salminen
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 07:20:01
Message: <web.47bd6bffb014483d5054540d0@news.povray.org>
"Gilles Tran" <gitran_nospam_@wanadoo.fr> wrote:

> The area_illumination feature that Warp introduced in the latest 3.7 beta is
> one of those things that the previous versions couldn't do: of course one
> could simulate it with grids of point lights, but it was just too
> impractical for common usage. Another feature that is sorely missing is
> efficient blurred reflection. There's a trick to do that in POV but the
> results are usable in only certain (limited) circumstances.

This is exactly what I don't like in POV. You can't just throw a scene at it and
let it render accurately. You have to enable many kinds of features, and you
have to guess which features your scene actually needs. POV can't decide that for
you.

This is a problem both for developers and users. Developers have to implement
many special cases and tricks to get the desired effect. Brute force renderers
don't need them, because many lighting effects are "automatically" generated.
With brute force you don't need keywords like area_illumination. You can use
any kind of light source and the result is always correct.

The same goes for blurred reflections. It is not too difficult to support any kind
of BRDF, and the result is still properly antialiased, etc. There is no need for
special tricks to mimic blurred reflections. But I will verify this statement
when I have implemented them myself...
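
To make that concrete, here is a hedged sketch of how a brute force renderer
can get blurred reflections essentially for free: instead of a single mirror
ray, each hit scatters a random direction around the mirror direction (a
simple Phong-style lobe here, chosen purely for illustration), and the
per-pixel averaging that already happens smooths and antialiases the result.
The function names and the exponent value are made up for the example.

# Sketch: glossy ("blurred") reflection by sampling random directions around
# the mirror direction with a Phong-style lobe.  Averaging many samples per
# pixel antialiases the blur automatically.
import math
import random

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def reflect(d, n):
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dot * b for a, b in zip(d, n))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sample_glossy(direction, normal, exponent):
    # Higher exponent = tighter lobe = sharper reflection.
    r = normalize(reflect(direction, normal))
    helper = (1.0, 0.0, 0.0) if abs(r[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize(cross(r, helper))          # orthonormal basis around r
    v = cross(r, u)
    cos_t = random.random() ** (1.0 / (exponent + 1.0))   # Phong lobe sampling
    sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    phi = 2.0 * math.pi * random.random()
    return tuple(r[i] * cos_t
                 + u[i] * sin_t * math.cos(phi)
                 + v[i] * sin_t * math.sin(phi) for i in range(3))

if __name__ == "__main__":
    d = normalize((0.3, -1.0, 0.2))   # incoming ray direction
    n = (0.0, 1.0, 0.0)               # surface normal
    print(sample_glossy(d, n, exponent=50))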



From: Invisible
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 21 Feb 2008 07:29:29
Message: <47bd6ea9$1@news.povray.org>
Severi Salminen wrote:

> This is exactly what I don't like in POV. You can't just throw a scene at it and
> let it render accurately. You have to enable many kinds of features, and you
> have to guess which features your scene actually needs. POV can't decide that for
> you.

This is what I don't like about GPUs and scanline rendering. Everything 
is textured polygons; the rest is lashings and lashings of deceptive 
trickery to make it *look* like the real thing. But with POV-Ray, if I 
ask for a sphere, I get a sphere. Not some polygon mesh approximating a 
sphere, but AN ACTUAL SPHERE. You can construct shapes of arbitrary 
complexity. Surfaces and textures can be magnified arbitrarily and never 
look pixellated. Reflections JUST WORK. Refraction JUST WORKS. Etc.
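
To make the "actual sphere" point concrete: a ray tracer intersects each ray
with the sphere's equation analytically, so the silhouette, reflections and
refraction stay exact at any magnification. A minimal sketch in Python (not
POV-Ray's actual code, just the standard quadratic):

# Analytic ray-sphere intersection: the sphere is the equation
# |p - centre|^2 = radius^2, not a mesh, so it stays exact at any zoom level.
import math

def intersect_sphere(origin, direction, centre, radius):
    # Returns the distance along the (normalized) ray to the nearest hit,
    # or None if the ray misses the sphere.
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c       # the quadratic's 'a' is 1 for a unit direction
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

if __name__ == "__main__":
    # A ray from (0, 0, -5) straight down the z axis hits the unit sphere at t = 4.
    print(intersect_sphere((0.0, 0.0, -5.0), (0.0, 0.0, 1.0), (0.0, 0.0, 0.0), 1.0))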

I can certainly see the advantage of an "I just throw objects in and it 
works" approach to lighting. But then, that's more or less how POV-Ray's 
radiosity feature works. You usually don't have to twiddle the settings 
all *that* much - it's more a question of how many years you're willing 
to wait for the result. And that's kind of the worrying part - how many 
years will you have to wait for the result from an unbiased renderer?

(OTOH, the fast preview you can get sounds like a useful feature. Ever 
wait 6 hours for a render only to find out that actually it looks lame? 
It's not funny...)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



