POV-Ray : Newsgroups : povray.general : Real-time raytracing...

From: Ron Parker
Subject: Re: Real-time raytracing...
Date: 2 Apr 2001 22:08:12
Message: <slrn9cic31.c24.ron.parker@fwi.com>
On Tue, 3 Apr 2001 01:03:37 +0200, Zeger Knaepen wrote:
>tnx for your answers...
>Still, I think POV-Ray should be made much faster than it currently is :)

Feel free to contribute some fixes that make it so.

-- 
Ron Parker   http://www2.fwi.com/~parkerr/traces.html
My opinions.  Mine.  Not anyone else's.


From: Thorsten Froehlich
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 01:06:13
Message: <3ac95a45$1@news.povray.org>
In article <3ac74223@news.povray.org> , "Zeger Knaepen" 
<zeg### [at] studentkuleuvenacbe> wrote:

> I was just looking at some of the real-time raytracing demos at
> http://www.acm.org/tog/resources/RTNews/demos/overview.htm and I was
> wondering: why can't POV-Ray be that fast?

It can.  You just have to remove the parsing stage for every frame.  You
can do this either by constructing objects manually in memory or by
parsing once and then manipulating the objects directly.  It is
possible, it is not too much work, and you get a decent frame rate on an
average system.  However, you should reduce max_trace_level to two or
three.

Just be aware of which options you have to turn off: features like the
vista buffer or object bounding assume a static scene and will break
real-time manipulation of the scene if you don't update them on the fly,
too.


      Thorsten


From: Nekar Xenos
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 02:06:53
Message: <3ac9687d@news.povray.org>
Can you post a sample? This sounds plausible, so it would be nice to see it
work  =)

Nekar

"Thorsten Froehlich" <tho### [at] trfde> wrote in message
news:3ac95a45$1@news.povray.org...
> In article <3ac74223@news.povray.org> , "Zeger Knaepen"
> <zeg### [at] studentkuleuvenacbe> wrote:
>
> > I was just looking at some of the real-time raytracing demos at
> > http://www.acm.org/tog/resources/RTNews/demos/overview.htm and I was
> > wondering: why can't POV-Ray be that fast?
>
> It can.  You just have to remove the parsing stage for every frame.  You
> can do this either by constructing objects manually in memory or by
> parsing once and then manipulating the objects directly.  It is
> possible, not too much work and you get a decent frame rate on an
> average system.  However, you should reduce max trace level to two or
> three.
>
> Just be aware which options you have to turn off because they assume a
> static scene and do break real-time manipulation of the scene if you
> don't fix them on the fly, too.  For example the vista buffer or object
> bounding.
>
>
>       Thorsten


From: Zeger Knaepen
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 06:52:07
Message: <3ac9ab57$1@news.povray.org>
> On Tue, 3 Apr 2001 01:03:37 +0200, Zeger Knaepen wrote:
> >tnx for your answers...
> >Still, I think POV-Ray should be made much faster than it currently is :)
>
> Feel free to contribute some fixes that make it so.
uhm, I'm not a programmer :)

But I was wondering about something..  Can someone explain to me how exactly
that bounded_by stuff works?  If I'm right, the raytracer first checks whether
a ray intersects the bounded_by object, using a simple form of raytracing
itself.  Couldn't that be made faster by first doing a scanline rendering of
the bounded_by objects, and then only raytracing the pixels that are inside
those objects?  Of course it would have its limitations, like you couldn't use
spheres as bounded_by objects, and it would only work with the default camera
type (perspective, or whatever it's called), but I think it would be a little
bit faster...

Well, I don't know.  I'm not a programmer, so it might be a really stupid
idea...

cu!

--
ZK
http://www.povplace.be.tf


From: Christoph Hormann
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 07:04:52
Message: <3AC9AE55.824C6B21@gmx.de>
Zeger Knaepen wrote:
> uhm, I'm not a programmer :)
> 
> But I was wondering about something..  Can someone explain me how exactly
> that bounded_by-stuff works?  If  I'm right, the raytracer first checks if a
> ray intersects with the bounded_by-object using a simple form of raytracing
> itself.  Couldn't that be made faster by first doing a scanline rendering of
> the bounded_by-objects, and then only raytracing the pixels that are inside
> those objects?  Of course it would have it limitations, like you can't use
> spheres for bounded_by-objects, and it will only work with the
> default-camera-type (perspective or whatever it's called), but I think it
> would be a little bit faster...
> 
> Well, I don't know.  I'm not a programmer, so it might be a real stupid
> id...
> 

That's about what the vista buffer does, IIRC.

Bounding generally works with boxes, because a ray/box test is quite fast
to compute.  Manual bounding using bounded_by{} is only useful in certain
cases like complex CSG or hierarchical bounding of unions.
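
(For illustration: the classic "slab" ray/box test is only a couple of
multiplications and comparisons per axis, which is why axis-aligned boxes
are the shape of choice for automatic bounding.  The sketch below is just
that idea with made-up names, not the actual POV-Ray bounding code, and it
assumes the ray direction has no zero components.)

 typedef double VEC3[3];

 /* Returns 1 if the ray origin + t*dir hits the box for some t in
    [0, t_max].  inv_dir holds 1/dir per component. */
 int ray_hits_box(const VEC3 origin, const VEC3 inv_dir,
                  const VEC3 box_min, const VEC3 box_max, double t_max)
 {
   double t_near = 0.0, t_far = t_max;
   int i;

   for (i = 0; i < 3; i++)
   {
     /* Distances to the two parallel planes ("slabs") on this axis. */
     double t0 = (box_min[i] - origin[i]) * inv_dir[i];
     double t1 = (box_max[i] - origin[i]) * inv_dir[i];
     if (t0 > t1) { double t = t0; t0 = t1; t1 = t; }

     if (t0 > t_near) t_near = t0;
     if (t1 < t_far)  t_far  = t1;
     if (t_near > t_far)
       return 0;   /* the intervals no longer overlap: the ray misses */
   }
   return 1;       /* the ray is inside the box for t in [t_near, t_far] */
 }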

Christoph

-- 
Christoph Hormann <chr### [at] gmxde>
IsoWood include, radiosity tutorial, TransSkin and other 
things on: http://www.schunter.etc.tu-bs.de/~chris/


From: Zeger Knaepen
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 11:10:26
Message: <3ac9e7e2$1@news.povray.org>
> That's about what the vista buffer does IIRC.
So, how does it work with other camera_types?


cu!

--
ZK
http://www.povplace.be.tf


From: Ron Parker
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 11:17:16
Message: <slrn9cjqbu.cka.ron.parker@fwi.com>
On Tue, 3 Apr 2001 17:08:57 +0200, Zeger Knaepen wrote:
>> That's about what the vista buffer does IIRC.
>So, how does it work with other camera_types?

The short answer is that it doesn't.  From the docs:

  The vista buffer is created by projecting the bounding box hierarchy onto 
  the screen and determining the rectangular areas that are covered by each 
  of the elements in the hierarchy. Only those objects whose rectangles 
  enclose a given pixel are tested by the primary viewing ray. The vista 
  buffer can only be used with perspective and orthographic cameras because 
  they rely on a fixed viewpoint and a reasonable projection (i. e. straight 
  lines have to stay straight lines after the projection).
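
(As an aside, the idea is easy to sketch: project the eight corners of a
node's bounding box through the camera and keep the enclosing screen
rectangle; primary rays for pixels outside that rectangle never need to
test the node.  The code below is only an illustration with made-up names,
not POV-Ray's actual vista buffer code, and it assumes a pinhole camera at
the origin looking down +z with the whole box in front of it.)

 typedef double VEC3[3];

 typedef struct
 {
   double focal;          /* eye-to-image-plane distance */
   int    width, height;  /* render size in pixels       */
 } SimpleCamera;

 typedef struct { int x0, y0, x1, y1; } ScreenRect;

 ScreenRect project_box(const SimpleCamera *cam,
                        const VEC3 box_min, const VEC3 box_max)
 {
   double sx_min = 1e30, sy_min = 1e30, sx_max = -1e30, sy_max = -1e30;
   ScreenRect r;
   int corner;

   for (corner = 0; corner < 8; corner++)
   {
     /* Pick one of the eight corners of the box. */
     double x = (corner & 1) ? box_max[0] : box_min[0];
     double y = (corner & 2) ? box_max[1] : box_min[1];
     double z = (corner & 4) ? box_max[2] : box_min[2];

     /* Pinhole projection onto the image plane, then to pixel coordinates. */
     double sx = (x * cam->focal / z + 0.5) * cam->width;
     double sy = (y * cam->focal / z + 0.5) * cam->height;

     if (sx < sx_min) sx_min = sx;
     if (sx > sx_max) sx_max = sx;
     if (sy < sy_min) sy_min = sy;
     if (sy > sy_max) sy_max = sy;
   }

   /* Enclosing pixel rectangle, clamped to the image. */
   r.x0 = (int)sx_min; if (r.x0 < 0) r.x0 = 0;
   r.y0 = (int)sy_min; if (r.y0 < 0) r.y0 = 0;
   r.x1 = (int)sx_max; if (r.x1 >= cam->width)  r.x1 = cam->width  - 1;
   r.y1 = (int)sy_max; if (r.y1 >= cam->height) r.y1 = cam->height - 1;
   return r;
 }

A pixel (x, y) then only has to test this node when x0 <= x <= x1 and
y0 <= y <= y1, which is also why the docs restrict the buffer to cameras
that keep straight lines straight: with a fisheye camera, for example, the
projection of a box is no longer enclosed by a simple screen rectangle.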

-- 
Ron Parker   http://www2.fwi.com/~parkerr/traces.html
My opinions.  Mine.  Not anyone else's.


From: Zeger Knaepen
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 13:19:04
Message: <3aca0608$1@news.povray.org>
> >So, how does it work with other camera_types?
>
> The short answer is that it doesn't.  From the docs:
ok, tnx :)

>   The vista buffer is created by projecting the bounding box hierarchy onto
>   the screen and determining the rectangular areas that are covered by each
>   of the elements in the hierarchy. Only those objects whose rectangles
>   enclose a given pixel are tested by the primary viewing ray. The vista
>   buffer can only be used with perspective and orthographic cameras because
>   they rely on a fixed viewpoint and a reasonable projection (i. e. straight
>   lines have to stay straight lines after the projection).


--
ZK
http://www.povplace.be.tf


From: Thorsten Froehlich
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 14:45:08
Message: <3aca1a34$1@news.povray.org>
In article <3ac9687d@news.povray.org> , "Nekar Xenos" 
<vir### [at] iconcoza> wrote:

> Can you post a sample? This sounds plausible, so it would be nice to see it
> work  =)

You have the sample code already.  Just look into the file povray.c in
the source distribution.  Follow it through once and call all the init
code.  Then change the function FrameRender (povray.c) and merge it with
a modified version of the function Start_Non_Adaptive_Tracing (render.c),
or call trace_pixel directly.  It is really easy and not much work is
needed at all.  If you take all that code, you should end up with an
order of calls like this:

 COLOUR col;

 ...
 init_vars();              // one-time global initialisation
 ...
 // build your scene here
 ...
 Initialize_xxx            // the various Initialize_... calls (see povray.c)
 ...
 Initialize_Renderer();

 for(y...)                 // loop over every row of the image...
 {
  for(x...)                // ...and every pixel in the row
  {
   trace_pixel(x, y, col); // col contains the colour of the pixel
  }
 }

 Deinitialize_xxx          // matching cleanup when you are done
 ...

To find out how to construct objects manually, just take a look at the
Create_xxx (xxx = object name) functions in the individual source files.
For example, this creates a point light source:

 LIGHT_SOURCE *gLight;

 // light_source
 // {
 //  <-100, 80, -100>
 //  color rgb 1.2
 // }
 gLight = Create_Light_Source();
 Make_Vector(gLight->Center, -100.0, 80.0, -100.0);
 Make_Colour(gLight->Colour, 1.2, 1.2, 1.2);
 Post_Process((OBJECT *)gLight, NULL);
 Link_To_Frame((OBJECT *)gLight);

You can manipulate the light source easily by changing its center.
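
For example, here is a minimal sketch of an animation loop built from
nothing but the calls above (the 100-frame count and the width/height
variables holding the render size are assumptions, and the static-scene
options mentioned earlier in the thread have to be off):

 int frame, x, y;
 COLOUR col;

 for(frame = 0; frame < 100; frame++)
 {
  // move the light a bit further along x each frame
  Make_Vector(gLight->Center, -100.0 + frame, 80.0, -100.0);

  for(y = 0; y < height; y++)
  {
   for(x = 0; x < width; x++)
   {
    trace_pixel(x, y, col);  // copy col into your frame buffer / display here
   }
  }
 }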

As for everything else, well, now you know how to get started - most of
the fun with POV-Ray is playing around with the source code :-)


      Thorsten


From: Fabien Mosen
Subject: Re: Real-time raytracing...
Date: 3 Apr 2001 18:08:44
Message: <3ACA493B.16F65F69@skynet.be>
Ron Parker wrote:
> 
> On Tue, 3 Apr 2001 17:08:57 +0200, Zeger Knaepen wrote:
> >> That's about what the vista buffer does IIRC.
> >So, how does it work with other camera_types?
> 
> The short answer is that it doesn't.  From the docs:

BTW, I'd like to ask: why is focal_blur only available with the
"perspective" camera?  From the general principle of blur oversampling,
I can't figure out why, but maybe I'm missing something...

Fabien.

