Hi all,
over the last few weeks I have been working on a new patch for POV-Ray, and it is now almost done.
It enables POV-Ray to trace stereoscopic images in one pass.
Mainly I did it for my own home use, but now I am considering sharing it / making it publicly available.
Please tell me if there is any interest in such a patch. Should I go ahead and post more details about what it does and what it's good for?
If there is some interest, I will put some images and texts on a webpage somewhere. And, of course, I will publish the patch (and/or provide a Windows/VC6 build in some way, maybe per email on request).
I myself consider it beta, in a "works for me" state (with a few known minor problems).
Hermann Vosseler
Post a reply to this message
I'm interested! I suppose you're talking about those two-colored images? I
don't like crossing my eyes during a 30-minute POV-Ray animation, you
know.... :-)
Post a reply to this message
Apache wrote:
> I'm interested! I suppose you're talking about those two-colored images? I
> don't like crossing my eyes during a 30-minute POV-Ray animation, you
> know.... :-)
>
Hi Apache,
yes and no -- :-)
Since I started this patch for my own use, I didn't build in
anaglyphic (two-colored) output. The main use for me is to put the
rendered images onto slides and then look at them in a stereo viewer, or
to project them in full color with a stereo projector and polarizing
spectacles. That is, I want high quality and high resolution.
So I focused on the problem that making two independent renders
doubles the time needed. To ameliorate this, I built in a
"Stereo-Cache" data structure in order to share the lighting/texturing and
radiosity(!) calculations between corresponding pixels of the two half
images. This was a bit tricky, but rather successful: even in
fairly simple scenes with a bit of texturing and some area lights, I
get rendering times of about 140% (compared to 200% for two
independent renders).
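Just to make the idea a bit more concrete -- this is only a simplified
sketch I am writing down here, not the actual patch code, and all the
names are made up:

/* Simplified illustration only: the left-eye pass stores, per pixel of
 * the current scan line, the world-space hit point and the complete
 * shading result (lighting, textures, radiosity).  The right-eye pass
 * reuses that result when its own ray hits (nearly) the same surface
 * point; otherwise the pixel is shaded from scratch. */

typedef struct { double x, y, z; } VEC3;   /* stand-in for POV's VECTOR */
typedef struct { double r, g, b; } RGB;    /* stand-in for POV's COLOUR */

typedef struct {
    int  valid;     /* filled in by the left-eye pass?            */
    VEC3 hit;       /* intersection point of the left-eye ray     */
    RGB  shaded;    /* complete lighting/texture/radiosity result */
} STEREO_CACHE_ENTRY;

#define REUSE_EPS 1e-4   /* tolerance for "same surface point" */

static double dist2(VEC3 a, VEC3 b)
{
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

/* Returns 1 and fills *out if the cached left-eye shading for pixel x
 * can be reused for the right-eye hit point, 0 otherwise. */
static int reuse_left_eye_shading(STEREO_CACHE_ENTRY *cache, int x,
                                  VEC3 right_hit, RGB *out)
{
    STEREO_CACHE_ENTRY *e = &cache[x];

    if (e->valid && dist2(e->hit, right_hit) < REUSE_EPS * REUSE_EPS)
    {
        *out = e->shaded;   /* skip lights, textures, radiosity */
        return 1;
    }
    return 0;   /* different occlusion -> trace and shade normally */
}

The tricky part is deciding when reuse is safe: strongly view-dependent
effects (reflections, specular highlights) really do differ between the
two eyes, so when in doubt the pixel should be shaded from scratch.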
Another feature I focused on was the ability to use some of the
non-standard camera types stereoscopically. In my patch you can
use (besides the perspective camera) the orthographic, fisheye, a cylindrical
type and a newly designed "spherical wideangle".
As far as stills are concerned, I simply convert the output to JPG,
rename it to *.JPS and point a JPS viewer (e.g. DepthCharge) at the file;
that way I can choose between "wall-eyed", "cross-eyed", red/green, red/blue
and interlaced (for LCD shutter) viewing methods.
I must confess I didn't think about the process of making animations.
But one could imagine hooking into the output_line() call
and post-processing to -- say -- red/green on the fly. Would this
be of some help for you? Or does someone know a better method?
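To sketch what I mean (again, this is not in the patch, just an
illustration, and the function is made up): such a hook could merge the
two halves of every finished scan line into a red/green line before it
gets written, roughly like this:

/* Sketch only: 'line' holds the left half image followed by the right
 * half image of one scan line.  The first width/2 pixels are
 * overwritten with a red/green anaglyph; the caller would then output
 * only those. */

typedef struct { double r, g, b; } RGB;   /* stand-in for POV's COLOUR */

static void make_anaglyph_line(RGB *line, int width)
{
    int half = width / 2;
    int x;

    for (x = 0; x < half; x++)
    {
        RGB left  = line[x];
        RGB right = line[half + x];

        line[x].r = left.r;    /* red channel from the left eye    */
        line[x].g = right.g;   /* green channel from the right eye */
        line[x].b = 0.0;       /* plain red/green, no blue         */
    }
}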
Anyway, I will package it up, write some short documentation, render some
demo images and put all of this on a web page within a few days, so you can
have a look at it. Stay tuned!
Hermann
Post a reply to this message
Hermann,
Great. I'm a stereo enthusiast too. Stereo is why I started doing
POV-Ray. I also would be more interested in JPS output for
use with LCS glasses than in anaglyph output.
Do you know of SAB's stereo patch?
http://sabix.etdv.ruhr-uni-bochum.de/sabpov/
It does something similar to what you describe. I tried it but saw
some artifacts in textures, and then POV 3.5 came along, while the
patch is for 3.1g. Doing two renders is fine, but the render-time
savings from sharing data between the left and right renders would really
make your patch worth the effort.
How do you approach the stereo window?
Harold Baize
http://www.3dculture.com/bm3d
news:3C6### [at] webconde...
Post a reply to this message
I think your solution is best as it is now. The post-processing
(red/green or whatever) depends on the specific type of glasses, I think. So
leaving that to other programs is best, because that way the images from your
patched POV-Ray can be used any way people want.
Regards,
Apache
Post a reply to this message
Harold Baize wrote:
> Hermann, Great. I'm a stereo enthusiast too. Stereo is why I
> started doing POV-Ray. I also would be more interested in output
> as JPS for use with LCS glasses than anaglyph output.
>
> Do you know of SAB's stereo patch?
>
> http://sabix.etdv.ruhr-uni-bochum.de/sabpov/
>
When I came across it, most of the work for my patch was already done. I
didn't try it, but from the description on the webpage it seems to be
focused mainly on providing different output formats.
Anyway, fisheye-type perspective is very important to me, and for this
I had to create a new projection type that can be made stereoscopic
over the whole viewing range.
> .....and then POV 3.5 came along and the patch is for 3.1g.
Yes 3.5 !!
From looking at the MegaPOV sources I guess it will be possible to
adapt my patch to it. But post-processing hooks in at the same
locations where my patch hooks in, so I decided to do it for
3.1g first and wait for the release of the 3.5 sources. I'm very impatient
to try out photons and isosurfaces, as you may guess. :-)
>
> How do you approach the stereo window?
>
My approach is: let the "up" and "right" vectors specify the image
dimensions (i.e. the dimensions of the resulting window) in POV-Ray units.
So the user may choose the POV-Ray units to represent real-world units
as he/she sees fit.
To give an example, I use 1 POV-Ray unit = 1 m and use
camera {
  location 0
  direction 1.75 * z
  up 3/4 * 1.8 * y
  right 1.8 * x
  translate ....
  look_at ....
}
to get an f=35mm view correctly adjusted for a typical home/club
projection screen of 1.8 meters width. That is, if you magnify the
resulting image so that it is 1.8 m wide, the maximum deviation (for objects
located at infinity) comes out to 65 mm.
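(For those who like to check the numbers -- assuming the usual stereo
base of 65 mm, which is my assumption here, and parallel camera axes: a
point at infinity is displaced by exactly the stereo base in the window
plane, so

    deviation at infinity = base * screen width / window width
                          = 0.065 m * 1.8 m / 1.8 m = 65 mm.)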
Of course you may use the "angle" keyword, but by using "direction", I
know precisely where in my scene the stereoscopic window will be located.
Regards,
Hermann
Post a reply to this message
Excellent! I look forward to playing with your stereoscopic
patch.
HB
news:3C7### [at] webconde...
Post a reply to this message
Hey Hermann,
I haven't forgotten about your stereoscopic patch. In fact, I'm still
eagerly awaiting it :)
George Pantazopoulos
news:3C6### [at] webconde...
Post a reply to this message
Dear Hermann,
I really like your patch! Very interesting. Looking forward to POV 3.5
support.
Best wishes,
Robert
Post a reply to this message