POV-Ray : Newsgroups : povray.newusers : Scanline rendering?
Scanline rendering? (Message 11 to 18 of 18)
From: Thorsten Froehlich
Subject: Re: Scanline rendering?
Date: 21 Jan 2001 16:54:46
Message: <3a6b5aa6$1@news.povray.org>
In article <3a66d721@news.povray.org>, Warp <war### [at] tagpovrayorg> wrote:

>   For the count, I have seen this kind of raytracer made in PostScript (!)
> in less than 20 lines (!!!).

Any link to this?


    Thorsten

____________________________________________________
Thorsten Froehlich, Duisburg, Germany
e-mail: tho### [at] trfde

Visit POV-Ray on the web: http://mac.povray.org



From: Warp
Subject: Re: Scanline rendering?
Date: 23 Jan 2001 09:00:45
Message: <3a6d8e8d@news.povray.org>
Thorsten Froehlich <tho### [at] trfde> wrote:
:>   For the count, I have seen this kind of raytracer made in PostScript (!)
:> in less than 20 lines (!!!).

: Any link to this?

  Actually, 12 lines.
  Works at least with ghostview. Never tried actually printing it...
  (Don't know why the email of the author is censored; I got the file exactly
as it appears here.)

%!IOPSC-1993 %%Creator: HAYAKAWA Takashi<xxx### [at] xxxxxxxxxxxx>
/C/neg/d/mul/R/rlineto/E/exp/H{{cvx
def}repeat}def/T/dup/g/gt/r/roll/J/ifelse 8
H/A/copy(z&v4QX&93r9AxYQOZomQalxS2w!!O&vMYa43d6r93rMYvx2dca!D&cjSnjSnjjS3o!v&6A
X&55SAxM1CD7AjYxTTd62rmxCnTdSST0g&12wECST!&!J0g&D1!&xM0!J0g!l&544dC2Ac96ra!m&3A
F&&vGoGSnCT0g&wDmlvGoS8wpn6wpS2wTCpS1Sd7ov7Uk7o4Qkdw!&Mvlx1S7oZES3w!J!J!Q&7185d
Z&lx1CS9d9nE4!k&X&MY7!&1!J!x&jdnjdS3odS!N&mmx1C2wEc!G&150Nx4!n&2o!j&43r!U&0777d
]&2AY2A776ddT4oS3oSnMVC00VV0RRR45E42063rNz&v7UX&UOzF!F!J![&44ETCnVn!a&1CDN!Y&0M
V1c&j2AYdjmMdjjd!o&1r!M){( )T 0 4 3 r put
T(/)g{T(9)g{cvn}{cvi}J}{($)g{[}{]}J}J
cvx}forall/moveto/p/floor/w/div/S/add 29 H[{[{]setgray fill}for Y}for
showpage


-- 
char*i="b[7FK@`3NB6>B:b3O6>:B:b3O6><`3:;8:6f733:>::b?7B>:>^B>C73;S1";
main(_,c,m){for(m=32;c=*i++-49;c&m?puts(""):m)for(_=(
c/4)&7;putchar(m),_--?m:(_=(1<<(c&3))-1,(m^=3)&3););}    /*- Warp -*/



From: Peter J. Holzer
Subject: Re: Scanline rendering?
Date: 23 Jan 2001 18:03:17
Message: <slrn96s0tk.3d4.hjp-usenet@teal.h.hjp.at>
On 2001-01-18 11:44, Warp <war### [at] tagpovrayorg> wrote:
>  Basic raytracing is pretty simple. Anyone with minimal knowledge of
>programming and calculus can make a simple sphere/plane raytracer (with
>reflections and refractions and simple textures).
>
>  Scanline rendering is, however, a lot harder process. It requires tons of
>complicated algorithms and code.

Actually, basic scanline rendering is quite simple, even simpler than
raytracing. Making it fast enough for action games is a different matter,
however (I tried to write a racing game back when a 386/25 MHz was a fast
computer - never finished it).
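
For what it's worth, the core of it really is small: filling one flat-colored
triangle with a half-space ("edge function") test fits in a couple of dozen
lines of C. A toy sketch (sizes and names invented for the example, printing
ASCII characters instead of pixels):

#include <stdio.h>

#define W 80
#define H 40

static char image[H][W];

/* Twice the signed area of triangle (a, b, c); its sign says on which
   side of the edge a->b the point c lies. */
static int edge(int ax, int ay, int bx, int by, int cx, int cy)
{
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
}

/* Fill a triangle whose vertices are ordered so that all three edge
   tests are >= 0 for points inside it. */
static void fill_triangle(int x0, int y0, int x1, int y1,
                          int x2, int y2, char c)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (edge(x0, y0, x1, y1, x, y) >= 0 &&
                edge(x1, y1, x2, y2, x, y) >= 0 &&
                edge(x2, y2, x0, y0, x, y) >= 0)
                image[y][x] = c;
}

int main(void)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            image[y][x] = '.';
    fill_triangle(5, 5, 70, 10, 30, 35, '#');
    for (int y = 0; y < H; y++)
        printf("%.*s\n", W, image[y]);
    return 0;
}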

	hp


-- 
   _  | Peter J. Holzer    | All Linux applications run on Solaris,
|_|_) | Sysadmin WSR       | which is our implementation of Linux.
| |   | hjp### [at] wsracat      | 
__/   | http://www.hjp.at/ |	-- Scott McNealy, Dec. 2000



From: Warp
Subject: Re: Scanline rendering?
Date: 24 Jan 2001 08:17:25
Message: <3a6ed5e5@news.povray.org>
Peter J. Holzer <hjp### [at] sikituwsracat> wrote:
: Actually, basic scanline rendering is quite simple, even simpler than
: raytracing.

  It depends on whether you want a working scanline renderer or not.

  For example, I consider hidden surface removal to be essential in
scanline rendering (if you don't remove hidden surfaces your image will look
just plain wrong).
  The simplest way of doing this is to sort the polygons by depth and
draw them from the farthest to the closest.
  This method, however, is far from perfect, even if the polygons don't
intersect. It's very far from trivial to get a perfect sorting algorithm
like this. There are cases where almost any algorithm will give you a
wrong result (that is, a polygon which should be in front of another polygon
gets drawn in the wrong order relative to that other polygon).
  Making a perfect sorting algorithm is probably a lot more work than just
using another way of drawing the polygons (such as z-buffering). And of course
sorting does not work at all if the polygons intersect each other.
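
  Very roughly, that back-to-front ("painter's algorithm") loop could look
something like this in C; the types and the draw_polygon callback are just
placeholders invented for the sketch, not code from any real renderer:

#include <stdlib.h>

typedef struct { double x, y, z; } Vec3;

typedef struct {
    Vec3 v[3];      /* a triangle in camera space; larger z = farther away */
    int  color;
} Polygon;

static double avg_depth(const Polygon *p)
{
    return (p->v[0].z + p->v[1].z + p->v[2].z) / 3.0;
}

static int cmp_far_to_near(const void *a, const void *b)
{
    double da = avg_depth(a), db = avg_depth(b);
    return (da < db) - (da > db);           /* farther polygons sort first */
}

/* Sort by average depth, then draw back to front.  This is exactly the
   scheme whose failure cases are described above: it breaks for cyclic
   overlaps and for intersecting polygons. */
void render_painters(Polygon *polys, size_t n,
                     void (*draw_polygon)(const Polygon *))
{
    qsort(polys, n, sizeof *polys, cmp_far_to_near);
    for (size_t i = 0; i < n; i++)
        draw_polygon(&polys[i]);
}
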
  Z-buffering is a working solution, but it's quite elaborate if you want
it to work well. Each pixel of each polygon must have depth information,
which has to be calculated from the depths of the vertices of the
polygon. This depth information has to be perspective-corrected if you want
it to look right (if you just interpolate linearly you'll get a wrong result:
your polygons will look like they were bent in the depth direction, and
the amount of bending will change depending on the orientation of the polygon).
Calculating a perspective-correct interpolation of the depth is not
what I consider trivial.
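
  The depth test itself is the easy part. Assuming the per-pixel depth has
already been interpolated somehow (the hard part above), a minimal sketch of
the buffer and the test might look like this; the sizes and names are made
up for the example:

#define WIDTH  640
#define HEIGHT 480

static float        zbuffer[HEIGHT][WIDTH];   /* one depth per screen pixel */
static unsigned int framebuf[HEIGHT][WIDTH];  /* one color per screen pixel */

void clear_zbuffer(void)
{
    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++)
            zbuffer[y][x] = 1e30f;            /* "infinitely far away" */
}

/* Called for every pixel of every polygon being rasterized. */
void plot_pixel(int x, int y, float depth, unsigned int color)
{
    if (depth < zbuffer[y][x]) {              /* closer than what's there? */
        zbuffer[y][x]  = depth;
        framebuf[y][x] = color;
    }
}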

  Constant-colored polygons look just horrible (especially if your whole object
is of the same color, because then you'll get just the silhouette of the
object and no inner details), so some type of shading is in order if
you want your renderer to look any good. You'll need light sources, and your
polygons will need to be shaded according to them.
  Flat shading is the easiest to calculate. You just calculate the orientation
of the normal vector of the polygon with respect to each light source and
then brighten/darken the color of the polygon accordingly.
  This way you only get flat polygons, and it works only for light sources
at infinity (for point light sources the shading would be just plain wrong).
  Even flat polygons need to be shaded on a pixel-by-pixel basis if there
are point lights, and of course if you want smooth surfaces (that is, the
normal vector changing along the polygon). Here too you have to use perspective
correction if you want a correct result (especially with big polygons).
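
  The flat-shading step for a light source at infinity is basically one dot
product; something like the following sketch (Vec3 and the helper names are
invented for the example):

#include <math.h>

typedef struct { double x, y, z; } Vec3;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 normalize(Vec3 v)
{
    double len = sqrt(dot(v, v));
    Vec3 r = { v.x / len, v.y / len, v.z / len };
    return r;
}

/* Brightness factor in [0,1] for a directional light:
   'normal' is the face normal, 'to_light' points toward the light. */
double flat_shade(Vec3 normal, Vec3 to_light)
{
    double d = dot(normalize(normal), normalize(to_light));
    return d > 0.0 ? d : 0.0;    /* faces turned away get no direct light */
}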

  If you want any texturing at all on your polygons, that's a whole story
in itself. Perspective correction, mipmapping, bilinear or trilinear
filtering... The list of features is endless.

  With raytracing you get almost for free things that are extremely complicated
to do with scanline rendering.
  For example shadows. In raytracing they are laughably trivial. In scanline
rendering they are a real headache. The algorithms for shadows are quite
complicated (shadow volumes, light mapping, etc).
  Reflection and refraction are also very easy in raytracing and very
complicated in scanline rendering (if you want them to look right; basic
environment mapping is a very poor approximation).
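
  To show what "laughably trivial" means for shadows: from the point being
shaded you shoot one extra ray toward the light and check whether anything
blocks it. A rough sketch (intersect_scene() and the types are placeholders
standing in for whatever the raytracer already has, not POV-Ray internals):

#include <math.h>

typedef struct { double x, y, z; } Vec3;
typedef struct { Vec3 origin, dir; } Ray;

/* Assumed to exist in the raytracer already: returns 1 and the hit
   distance in *t if the ray hits anything in the scene. */
extern int intersect_scene(const Ray *ray, double *t);

static Vec3 sub(Vec3 a, Vec3 b) { Vec3 r = { a.x-b.x, a.y-b.y, a.z-b.z }; return r; }
static double length(Vec3 v)    { return sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

/* Returns 1 if 'point' is in shadow with respect to 'light_pos'. */
int in_shadow(Vec3 point, Vec3 light_pos)
{
    Vec3 to_light = sub(light_pos, point);
    double dist = length(to_light);
    Ray shadow_ray = { point, { to_light.x/dist, to_light.y/dist, to_light.z/dist } };
    double t;
    /* Anything hit between the point and the light blocks it.  A real tracer
       would also offset the origin slightly to avoid self-intersection. */
    return intersect_scene(&shadow_ray, &t) && t < dist;
}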

-- 
char*i="b[7FK@`3NB6>B:b3O6>:B:b3O6><`3:;8:6f733:>::b?7B>:>^B>C73;S1";
main(_,c,m){for(m=32;c=*i++-49;c&m?puts(""):m)for(_=(
c/4)&7;putchar(m),_--?m:(_=(1<<(c&3))-1,(m^=3)&3););}    /*- Warp -*/



From: Ron Parker
Subject: Re: Scanline rendering?
Date: 24 Jan 2001 09:11:58
Message: <slrn96tolf.2ps.ron.parker@fwi.com>
On 24 Jan 2001 08:17:25 -0500, Warp wrote:
>  Z-buffering is a working solution, but it's quite elaborate if you want
>it to work well. Each pixel of each polygon must have depth information,
>which has to be calculated from the depths of the vertices of the
>polygon. This depth information has to be perspective-corrected if you want
>it to look right (if you just interpolate linearly you'll get a wrong result:
>your polygons will look like they were bent in the depth direction, and
>the amount of bending will change depending on the orientation of the polygon).
>Calculating a perspective-correct interpolation of the depth is not
>what I consider trivial.

You seem to be using a different definition of "Z-buffering" than is common.
You need an extra "depth" element for each pixel on the display, not for
each pixel on each polygon (whatever that means.)  Also, since the depth 
information is only used for comparison purposes and not for appearance, I'm
not convinced by the claim that the polygons appear bent if you don't 
correct for perspective.  I'm not even sure how one would go about correcting
for perspective; it seems to me that the projection doesn't matter as long
as you can compute a distance from the 3-space position of the pixel you're 
rendering (on a notional "film plane") to the 3-space position of the point
you're mapping there.  The projection only matters for computing that mapping,
and that's not a Z-buffer problem; you'd have the same problem with any other
method.

-- 
Ron Parker   http://www2.fwi.com/~parkerr/traces.html
My opinions.  Mine.  Not anyone else's.



From: Warp
Subject: Re: Scanline rendering?
Date: 24 Jan 2001 10:24:00
Message: <3a6ef38f@news.povray.org>
Ron Parker <ron### [at] povrayorg> wrote:
: You seem to be using a different definition of "Z-buffering" than is common.
: You need an extra "depth" element for each pixel on the display, not for
: each pixel on each polygon (whatever that means.)

  I didn't express myself correctly.
  What I meant to say is that for each pixel you draw for a polygon, you need
to calculate the depth of that pixel according to the depth of the vertices
of the polygon (in order to use the Z-buffer).

: Also, since the depth 
: information is only used for comparison purposes and not for appearance, I'm
: not convinced by the claim that the polygons appear bent if you don't 
: correct for perspective.

  They appear bent if polygons intersect each other or are so close to each
other that one polygon can show through another polygon because that other
polygon is "bent".
  Especially when two polygons intersect each other, the intersection line
comes out curved if the depths are not perspective-correct.

: I'm not even sure how one would go about correcting for perspective

  When you draw one scanline of the polygon, the first pixel has a certain
depth, as does the last pixel. When drawing the in-between pixels you need
to calculate their depths as well. If you do it by linearly interpolating
the depths of the border pixels, the result is not perspective-correct.
  As said, this results in curved intersection lines and even bad hidden
surface removal (for example if there's a smaller polygon right behind the
current polygon).
  The formula for calculating perspective-correct depth is pretty similar to
the one for perspective-correct texturing.
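
  The trick, in short, is that along a screen-space span it is 1/z rather
than z that varies linearly, so you interpolate 1/z and invert. A sketch
(the function names are just made up for the example):

/* t in [0,1] is the position along the span between the two edge pixels. */
double depth_linear(double z0, double z1, double t)
{
    return z0 + t * (z1 - z0);              /* simple, but wrong in perspective */
}

double depth_perspective_correct(double z0, double z1, double t)
{
    double inv = (1.0 - t) / z0 + t / z1;   /* interpolate 1/z linearly */
    return 1.0 / inv;
}

/* Example: z0 = 1, z1 = 3, halfway across the span (t = 0.5):
   linear interpolation gives 2.0, while the perspective-correct value is
   1 / (0.5/1 + 0.5/3) = 1.5. */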

: it seems to me that the projection doesn't matter as long
: as you can compute a distance from the 3-space position of the pixel you're 
: rendering

  You are right, "as long as you can compute" them. The question is how you
compute them. You only have the depths of the vertex points; you have to
somehow calculate the depths of the in-between points from those.
  Linear interpolation is not the answer (for the exact same reason that
linear interpolation of textures is not the answer).

-- 
char*i="b[7FK@`3NB6>B:b3O6>:B:b3O6><`3:;8:6f733:>::b?7B>:>^B>C73;S1";
main(_,c,m){for(m=32;c=*i++-49;c&m?puts(""):m)for(_=(
c/4)&7;putchar(m),_--?m:(_=(1<<(c&3))-1,(m^=3)&3););}    /*- Warp -*/



From: Peter J. Holzer
Subject: Re: Scanline rendering?
Date: 25 Jan 2001 12:03:06
Message: <slrn970gv2.2dc.hjp-usenet@teal.h.hjp.at>
On 2001-01-24 13:17, Warp <war### [at] tagpovrayorg> wrote:
>Peter J. Holzer <hjp### [at] sikituwsracat> wrote:
>: Actually, basic scanline rendering is quite simple, even simpler than
>: raytracing.
>
>  It depends on whether you want a working scanline renderer or not.
>
>  For example, I consider hidden surface removal to be essential in
>scanline rendering (if you don't remove hidden surfaces your image will look
>just plain wrong).
[painter's algorithm vs. zbuffering, etc.]

>  If you want any texturing at all on your polygons, that's a whole story
>in itself. Perspective correction, mipmapping, bilinear or trilinear
>filtering... The list of features is endless.

As is the list of possible features for raytracing. Povray isn't a
trivial program either.

I found my old zbuffer demo on a backup tape. It is about 400 lines of
C code, about half of which is the z-buffer algorithm and half is the
application (open an X11 window, tessellate a sphere and a cylinder,
render both using the z-buffer and draw them into the window). Written
in a single afternoon, if I remember correctly.

Of course the thing is very simple. It can draw only triangles,
there is no support for textures, the shading is quite wrong. And
I did not perspective-correct the z value. I've put the thing on
http://www.hjp.at/programs/zbuffer/, if anyone is interested.

BTW, does anyone know where the name "scanline rendering" comes from?
It is quite wrong. Unlike raytracing (which does generally follow
the scanlines), these rendering algorithms do not create the picture
scanline by scanline, but polygon by polygon.

	hp

-- 
   _  | Peter J. Holzer    | All Linux applications run on Solaris,
|_|_) | Sysadmin WSR       | which is our implementation of Linux.
| |   | hjp### [at] wsracat      | 
__/   | http://www.hjp.at/ |	-- Scott McNealy, Dec. 2000



From: Warp
Subject: Re: Scanline rendering?
Date: 28 Jan 2001 10:22:00
Message: <3a743918@news.povray.org>
Peter J. Holzer <hjp### [at] sikituwsracat> wrote:
: As is the list of possible features for raytracing. Povray isn't a
: trivial program either.

  My point was that making a simple raytracer, with some minimal features,
is quite easy. Making a working scanline renderer with the same minimal
features is a lot harder.

  Hidden surface removal? In raytracing you get a perfect result with
minimal coding. Getting a perfect result (that is, one that works with
every possible combination of polygons) with scanline rendering takes quite
a lot of work.
  Lighting? Also easy in raytracing (at least if you use mathematical
objects). With scanline rendering you need complicated algorithms to
interpolate the normal vectors in a perspective-correct way.
  Textures? Very easy in raytracing (for procedural textures it couldn't
be easier; for texture images it certainly needs more work if you want
things like bilinear filtering). In scanline rendering you need at least
perspective-correction calculations (which you don't need in raytracing).
  Shadows? Laughably easy in raytracing. A headache in scanline rendering.
  Reflections and refractions? As with shadows.
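
  To give an idea of how little code a mirror reflection needs in a raytracer:
the reflected direction is d - 2(d.n)n, and the reflected ray is then traced
exactly like a camera ray. A sketch (Vec3 and the helpers are invented for
the example):

typedef struct { double x, y, z; } Vec3;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* 'd' is the incoming ray direction, 'n' is the unit surface normal. */
Vec3 reflect_dir(Vec3 d, Vec3 n)
{
    double k = 2.0 * dot(d, n);
    Vec3 r = { d.x - k * n.x, d.y - k * n.y, d.z - k * n.z };
    return r;
}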

: I found my old zbuffer demo on a backup tape. It is about 400 lines of
: C code, about half of which is the z-buffer algorithm and half is the
: application (open an X11 window, tessellate a sphere and a cylinder,
: render both using the z-buffer and draw them into the window). Written
: in a single afternoon, if I remember correctly.

  Does it calculate the depth right? You have to take perspective
correctness into account if you want it to work right with intersecting
polygons and polygons which are very close to each other (almost coincident).
  I know it's easy to do it the easy way, just interpolating the depth
linearly between the vertices. This, however, doesn't work perfectly.

  And besides, the number of code lines doesn't tell much. One could work
for a week on 50 lines of code if the algorithm to implement is hard enough.

: BTW, does anyone know where the name "scanline rendering" comes from?
: It is quite wrong. Unlike raytracing (which does generally follow
: the scanlines), these rendering algorithms do not create the picture
: scanline by scanline, but polygon by polygon.

  You are not completely right.
  You can draw the image polygon by polygon, and that's the easiest way.
However, it's not the most efficient way. The most efficient way is to draw the
whole image scanline by scanline, drawing only those parts of each polygon
which are visible (that is, which get into the final image). If you draw polygon
by polygon, you will be drawing a lot of extra data which never gets into
the final image, thus spending a lot of time on parts which do not affect
the image.
  If my memory doesn't fail me, most scanline renderers (such as 3DStudio)
render scanline by scanline, for this exact reason.
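
  In rough outline that scanline-order structure looks something like the
sketch below: one pass over the image rows, keeping a list of the polygons
that overlap the current row and rasterizing only the spans that actually
end up visible. All of the helper functions are placeholders (the real work,
and the real complexity, lives inside them); this is not code from any
actual renderer:

#include <stddef.h>

typedef struct Polygon Polygon;

/* Assumed helpers, declared only to show the shape of the loop. */
extern size_t polygons_starting_at_row(int y, Polygon **out);
extern void   remove_polygons_ending_at_row(Polygon **active, size_t *n_active, int y);
extern void   rasterize_visible_spans(Polygon **active, size_t n_active, int y);

void render_scanline_order(int height)
{
    Polygon *active[1024];        /* polygons overlapping the current row */
    size_t n_active = 0;

    for (int y = 0; y < height; y++) {
        /* Add polygons whose topmost vertex lies on this scanline. */
        Polygon *incoming[256];
        size_t n_in = polygons_starting_at_row(y, incoming);
        for (size_t i = 0; i < n_in && n_active < 1024; i++)
            active[n_active++] = incoming[i];

        /* Drop polygons that ended on an earlier scanline. */
        remove_polygons_ending_at_row(active, &n_active, y);

        /* Resolve visibility along this single row and draw only the
           visible spans, instead of touching every pixel of every
           polygon as the polygon-by-polygon approach does. */
        rasterize_visible_spans(active, n_active, y);
    }
}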

-- 
char*i="b[7FK@`3NB6>B:b3O6>:B:b3O6><`3:;8:6f733:>::b?7B>:>^B>C73;S1";
main(_,c,m){for(m=32;c=*i++-49;c&m?puts(""):m)for(_=(
c/4)&7;putchar(m),_--?m:(_=(1<<(c&3))-1,(m^=3)&3););}    /*- Warp -*/


