Newsgroups: povray.off-topic
  Re: Unreal engine 4  
From: Orchid Win7 v1
Date: 9 Jun 2012 08:21:02
Message: <4fd33fae$1@news.povray.org>
> while we're at it:
>
> http://www.youtube.com/watch?feature=player_embedded&v=h5mRRElXy-w#!
>
> that's sweet real raytracing on nvidia's kepler.

Doesn't look... completely right. Is that water deformation really 
volume-preserving? Because it doesn't appear to be. (That's the general 
problem with voxel-based methods, as opposed to particle-based. I don't 
know which this is.)

> and
>
> http://www.youtube.com/watch?v=kQB9ds2AYwM
>
> some kind of Final Duty Fantasy that Square-Enix swears is a real-time demo
> geared at nextgen platforms.

That would explain the occasional glitches in frame-rate, the obvious CG 
look of the images, and a few other imperfections.

> If that is the future of games, I suspect Hollywood better embrace the tech.

I think they already did, about 20 years ago? :-P

> I also question what's the point of povray or Blender these days in the face of
> these kind of things.  Particularly in the case of raytracing (or pathtracing
> for that matter) I think it's clear that scanline-based techniques won the
> industry, be it Renderman (a scanline at its heart) or games (now with possibly
> some raytracing for reflections at least).
>
> The end?  No, the beginning of some exciting times ahead!  Now games have fully
> convincing human skin, HDR global illumination, DOF and motion blur, plus run at
> magnificent 1080+p at 60FPS.  The time of real-time CGs is upon us.

There was a time when all a GPU could do was draw flat texture-mapped 
polygons really, really fast, with simple Phong lighting. And, usually, 
with pretty low polygon counts and poor texture resolution.

Seriously, look at this stuff:

http://www.unseen64.net/wp-content/gallery/half-life/barney1zu2.jpg

That's Half-Life, the game that won awards and accolades across the 
board for its ground-breaking graphics technology.

/Obviously/ an off-line renderer like POV-Ray can do far better. Even a 
scanline renderer like 3D Studio Max can do better. But what POV-Ray can 
do is render /curved surfaces/. Not to mention physically correct 
reflection and refraction (not texture mapping tricks), and full global 
illumination. (The stuff that's usually pre-computed in games like 
Half-Life. Press a button, watch a door open, and oh look, the light map 
no longer matches the geometry...)
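To illustrate what I mean about curved surfaces and real refraction: in POV-Ray scene language, a sphere is a genuine mathematical sphere, not a mesh, and refraction comes from an actual index of refraction rather than a texture trick. A minimal sketch (values are illustrative, not tuned):

camera { location <0, 2, -5> look_at <0, 1, 0> }
light_source { <10, 20, -10> color rgb 1 }

// A true curved primitive -- ray-traced directly, never tessellated.
sphere {
  <0, 1, 0>, 1
  pigment { color rgbf <0.95, 0.95, 1, 0.9> }  // mostly transparent
  finish { reflection 0.1 specular 0.6 }       // physically traced reflection
  interior { ior 1.5 }                         // real refraction: glass-like index
}

plane { y, 0 pigment { checker color rgb 0 color rgb 1 } }

The silhouette of that sphere stays perfectly round at any zoom level, which no polygon budget of the Half-Life era could match.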

In the POV-Ray forum, a perennial question was "why doesn't POV-Ray use 
the GPU to do this stuff faster?" To which the answer was always 
"POV-Ray does stuff that the GPU is simply incapable of doing". Games 
like Half-Life used all sorts of convoluted tricks to fake something that 
looked vaguely realistic, while POV-Ray actually /directly/ simulates 
the actual physical effects, achieving photo-realism without tricks or 
workarounds.

Fast-forward 15 years or so, and things have changed. GPGPU is here. And 
now, we find that the GPU can actually do all the effects that POV-Ray 
does AND MORE, and it can do it in real-time or near real-time.

In the old days, it was scanline for speed, ray-tracing for realism. 
Today "unbiased rendering" seems to be the new ray-tracing. And to get 
that kind of quality in POV-Ray, you seem to have to work really hard 
for it. You need complicated photon maps, clever material design, and 
ultimately endless tweaking of radiosity settings.
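For anyone who hasn't fought with it: this is the kind of knob-twiddling I mean. A typical global_settings block mixing radiosity and a photon map looks something like this (the numbers are illustrative starting points, not recommendations -- finding values that don't splotch or crawl is exactly the "endless tweaking"):

global_settings {
  radiosity {
    pretrace_start 0.08  // coarse-to-fine sampling passes
    pretrace_end   0.01
    count 150            // rays per sample point: quality vs. time
    error_bound 0.5      // lower = smoother, but much slower
    recursion_limit 2    // number of diffuse light bounces
  }
  photons { count 20000 }  // photon map, e.g. for caustics
}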

Or you could just fire up an unbiased renderer running on the GPU, which 
directly simulates /everything/, without effort and faster than POV-Ray.

Le roi est mort, vive le roi!


