POV-Ray : Newsgroups : povray.text.scene-files : Looking for rendering Omnidirectional Stereo images for VR headset
  Looking for rendering Omnidirectional Stereo images for VR headset (Message 21 to 30 of 38)  
From: Clodo
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 12 Mar 2016 06:05:01
Message: <web.56e3f6cc3b547fa14e8811590@news.povray.org>
I'm happy you have addressed the Unix issue! Great!

In the meantime, I wrote a draft of a Wiki page and asked via the contact form
how I can create it on the wiki.
http://www.clodo.it/host/images/53820b79ebb17bf6c2855e82e6ff97471a166b30.png

I also created a GitHub fork with the implementation in C:
https://github.com/Clodo76/povray
Simply put, I added two parameters to the current 'spherical' camera: 'ipd' and
'ods'.
IPD defaults to 0.065.
ODS=0, the default spherical camera is used (no stereo).
ODS=1, left-eye render only.
ODS=2, right-eye render only.
ODS=3, side by side.
ODS=4, top/bottom.
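With those parameters, selecting an ODS render happens entirely in the scene file. A minimal hypothetical sketch (the fork adds 'ods' and 'ipd' as described above, but the exact syntax its parser accepts may differ):

```pov
// Hypothetical usage of the patched 'spherical' camera from the fork.
// 'ods' and 'ipd' are the fork's additions, not standard POV-Ray keywords.
camera {
  spherical
  location <0, 1.6, 0>  // viewer's head position
  ods 4                 // 0=mono, 1=left eye, 2=right eye, 3=side-by-side, 4=top/bottom
  ipd 0.065             // interpupillary distance in scene units (the default)
}
```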

The C implementation is a little faster: in a sample scene, 19 seconds for the C
implementation versus 22 seconds for the user-defined camera.

I'm wondering whether I can cite my GitHub fork on the Wiki page, or if that's not allowed.

I'm still looking into how to implement support for the camera direction.
The current implementation uses a fixed <0,0,1> direction as the front view.
But for the 'plants_demo_pano' rendering I need a different direction, otherwise the
cool stuff (the trees) is behind me.

I hope I can post some nice renders soon.

Jaime, sorry about your Cardboard... of course I can test your image with my
Oculus Rift.



From: Jaime Vives Piqueres
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 12 Mar 2016 07:03:50
Message: <56e405a6$1@news.povray.org>

> I'm still looking into how to implement support for the camera direction.
> The current implementation uses a fixed <0,0,1> direction as the front
> view.

  I still don't know how much sense it makes to have an arbitrary direction
vector... maybe it would be better to have a way to just rotate the camera around y.

> But with 'plants_demo_pano' rendering, i need another direction,
> otherwise the cool stuff (tree) are behind me.

   For the moment, you can make do by adding a minus sign to the z function of
the direction to reverse it: it works with my cave to have the interior
as the initial view.

--
jaime



From: Clodo
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 12 Mar 2016 07:25:01
Message: <web.56e409c63b547fa14e8811590@news.povray.org>
Jaime Vives Piqueres <jai### [at] ignoranciaorg> wrote:
>   Don't know still how much sense it makes to have an arbitrary direction
> vector... maybe better would a be a way to just rotate the camera around y.

Yes, but I aim to let people open their own scene source and simply render it with
spherical/ods.
The plants scene, for example, has a
# look_at posTree+y*yCam

For my C edition, I'm thinking of taking the current direction, computing its
rotation around the Y axis, and applying that, without any additional parameter.
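Deriving the yaw from an existing look direction could be sketched like this (a hypothetical Python illustration, not the fork's actual code; POV-Ray's left-handed, y-up convention with <0,0,1> as the default front view is assumed):

```python
import math

def yaw_from_direction(dx, dz):
    """Rotation around the Y axis that turns the default front
    view <0,0,1> toward the horizontal part of a look direction."""
    return math.atan2(dx, dz)

def rotate_y(vx, vy, vz, yaw):
    """Rotate a vector around the Y axis by 'yaw' radians."""
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * vx + s * vz, vy, -s * vx + c * vz)
```

Each ODS ray direction would then be passed through `rotate_y` with the yaw extracted once from the scene's look direction.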

In my user_defined edition, I can add a "#declare rotationY = 0;" and use it in
the formulas.
The fact that user_defined takes 6 separate scalar functions, rather than 2 that
can return a vector, is a pain... it also forces me to expand theta/phi in each
function, and I cannot find a syntax to declare them outside. Maybe that is the
reason for the small performance drop versus the C edition.

>    For the moment, you can make do by adding a minus sign to the z function of
> the direction to reverse it: it works with my cave to have the interior
> as the initial view.

Thanks. Anyway, I knew that: the original Google ODS algorithm is right-handed, so I
invert Z for POV-Ray's left-handed coordinate system. I will add a comment about this
in my rendering of plants.
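The handedness flip can be folded directly into the ray construction. Below is a sketch of the usual ODS mapping in Python (an illustration of the technique, not the fork's code: each eye is offset ipd/2 along the tangent of the viewing circle, and z is negated to go from a right-handed convention to POV-Ray's left-handed one):

```python
import math

def ods_ray(u, v, eye, ipd=0.065, flip_z=True):
    """Ray (origin, direction) for an omnidirectional-stereo pixel.

    u, v: normalized image coordinates in [0, 1).
    eye: -1 for the left eye, +1 for the right eye.
    Every ray is tangent to the viewing circle of radius ipd/2.
    """
    theta = 2.0 * math.pi * u - math.pi    # azimuth in [-pi, pi)
    phi = math.pi / 2.0 - math.pi * v      # elevation in (-pi/2, pi/2]
    s = eye * ipd / 2.0
    # Eye position on the viewing circle, perpendicular to the
    # horizontal part of the ray direction (tangency condition).
    ox, oy, oz = math.cos(theta) * s, 0.0, -math.sin(theta) * s
    dx = math.sin(theta) * math.cos(phi)
    dy = math.sin(phi)
    dz = math.cos(theta) * math.cos(phi)
    if flip_z:  # right-handed source convention -> left-handed POV-Ray
        oz, dz = -oz, -dz
    return (ox, oy, oz), (dx, dy, dz)
```

With ipd=0 the origin collapses to the center and the mapping degenerates to the plain spherical camera, which matches the backward-compatibility behavior described in this thread.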



From: clipka
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 12 Mar 2016 09:50:45
Message: <56e42cc5$1@news.povray.org>
On 12.03.2016 at 12:01, Clodo wrote:

> The C implementation is a little faster: in a sample scene, 19 seconds for the C
> implementation versus 22 seconds for the user-defined camera.

With more complex scenes, the relative difference between the C and SDL
implementations will probably be far smaller.



From: clipka
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 12 Mar 2016 09:53:22
Message: <56e42d62$1@news.povray.org>
On 12.03.2016 at 13:22, Clodo wrote:

> The fact that user_defined takes 6 separate scalar functions, rather than 2 that
> can return a vector, is a pain...

I totally agree.

Unfortunately POV-Ray's function engine can only handle scalar functions.



From: Jaime Vives Piqueres
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 14 Mar 2016 18:21:41
Message: <56e73975$1@news.povray.org>

> Jaime, sorry about your Cardboard... of course I can test your image
> with my Oculus Rift.

   In case you missed it, I posted a test on p.b.images... enjoy! I will
try it next weekend with my brother's phone, just to see how it feels to
be inside my own creation.

--
jaime



From: Koppi
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 17 Mar 2016 12:40:00
Message: <web.56eadc583b547fa1db8f4ad50@news.povray.org>
I posted an ODS video on https://www.youtube.com/watch?v=UEFO5mQ2zeg
and ODS YouTube HOWTO at https://github.com/koppi/pov-ods .

Happy POVing!



From: Jaime Vives Piqueres
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 17 Mar 2016 14:43:20
Message: <56eafac8$1@news.povray.org>

> I posted an ODS video on https://www.youtube.com/watch?v=UEFO5mQ2zeg
> and ODS YouTube HOWTO at https://github.com/koppi/pov-ods .
>

   Very nice demo... bookmarked!

--
jaime



From: Clodo
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 17 Mar 2016 18:05:01
Message: <web.56eb29e33b547fa14e8811590@news.povray.org>
It looks very good on the Oculus Rift but, honestly, it looks like a normal spherical
camera without a 3D effect.

Look at this ODS example (rendered with Blender):
https://www.clodo.it/files/projects/povray_ods/gooseberry_benchmark_panorama.jpg

Look how the left and right images are different:
https://www.clodo.it/files/projects/povray_ods/blender_diff_ex.png

Now look at your Cave left/right comparison:
https://www.clodo.it/files/projects/povray_ods/cave_diff_ex.png

I think the rocks are too far away.

Dark scenes work very well on HMDs.

Also, about the resolution:
8192 x 2048 side-by-side can be acceptable for toys like Cardboard on Android,
but for HMDs like the Oculus Rift or HTC Vive it's too low.

Look at this comment:
https://forums.oculus.com/viewtopic.php?f=20&t=30852



> You need this value of pixel coverage or "pixel density" or
> "pixel per display pixel" or "eye buffer scaling" as it's variously called.

> to cover one monoscopic turn.

That means 6000x6000 in top/bottom, or 12000x3000 in side-by-side.
At least for the GearVR; I don't actually know the correct value for other HMDs.
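The arithmetic is just two 2:1 equirectangular eye images packed together; a small illustrative Python helper (the per-eye width is whatever the headset needs, e.g. the 6000 figure quoted above for GearVR):

```python
def stereo_layout(eye_width, layout):
    """Output resolution for two 2:1 equirectangular eye images.

    'sbs' packs the eyes side by side, 'tb' stacks them top/bottom.
    """
    eye_height = eye_width // 2          # 360x180 degrees -> 2:1 aspect
    if layout == "sbs":
        return (2 * eye_width, eye_height)
    if layout == "tb":
        return (eye_width, 2 * eye_height)
    raise ValueError("layout must be 'sbs' or 'tb'")
```

For example, 6000 per eye gives 6000x6000 top/bottom or 12000x3000 side-by-side, and the 8192x2048 side-by-side mentioned earlier corresponds to 4096 per eye.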

Apart from the rendering time, rendering still images for HMDs isn't a problem.
The issue I'm studying is video/animation:
this kind of resolution is beyond any H264 or HEVC/H265 resolution level,
other codec types are not hardware accelerated,
and the Oculus Rift and HTC Vive want 90 FPS, the PlayStation VR wants 120 FPS...

----------------
I tried to render the plants_demo by Gilles Tran, but I'm not satisfied.
There are a lot of flying objects like butterflies and leaves, and
I currently can't find a good position for a good 3D effect.
Also, it's too bright; HMDs use OLED displays a few millimeters from the eyes, so
dark scenes like the Cave work better.

Some ODS renders, 6480 x 6480 pixels, top/bottom
(published here: https://forums.oculus.com/viewtopic.php?f=28&t=30854):

Mirrors - a scene I made:
https://www.clodo.it/files/projects/povray_ods/mirrors_20160316.jpg

Stacker Day - POV-Ray sample scene adapted for ODS (I changed some reflections and
the camera position)
https://www.clodo.it/files/projects/povray_ods/stackerday_20160316.jpg

Fractals 1 - POV-Ray sample scene adapted for ODS (I changed the ball position and
the camera position)
https://www.clodo.it/files/projects/povray_ods/fractals1_20160316.jpg

Fractals 2 - POV-Ray sample scene adapted for ODS (I changed only the camera
position)
https://www.clodo.it/files/projects/povray_ods/fractals2_20160316.jpg

Wineglass - POV-Ray sample scene adapted for ODS (nothing changed, apart from the
camera type of course)
https://www.clodo.it/files/projects/povray_ods/wineglass_20160316.jpg


----------------
I released a fork of POV-Ray on GitHub:
https://github.com/Clodo76/povray

Basically, ODS with IPD=0 produces the same render as the spherical camera.
So I added two options to the current spherical camera: 'ods' (default 0, for
backward compatibility) and 'ipd' (default 0.065).

ods = 0: original POV-Ray spherical algorithm.
ods = 1: ODS, left eye only.
ods = 2: ODS, right eye only.
ods = 3: ODS, side by side.
ods = 4: ODS, top/bottom.

Using the user_defined camera or this patched version gives the same result.
As Chris said, the C version is a little faster, but only in very, very simple
scenes. Irrelevant, imho.
I published it only in case someone wants to continue the experiments, because the
C code is more readable than the 6 user_defined functions.
And also because I wrote it before Chris's user_defined implementation :P
----------------
Soon I will create a POV-Ray Wiki page about ODS.

Ciao!



From: Clodo
Subject: Re: Looking for rendering Omnidirectional Stereo images for VR headset
Date: 17 Mar 2016 18:40:00
Message: <web.56eb31d43b547fa14e8811590@news.povray.org>
Jaime Vives Piqueres <jai### [at] ignoranciaorg> wrote:

> > I posted an ODS video on https://www.youtube.com/watch?v=UEFO5mQ2zeg
> > and ODS YouTube HOWTO at https://github.com/koppi/pov-ods .
> >
>
>    Very nice demo... bookmarked!
>
> --
> jaime

Off topic:

It's based on an experiment I did 7 years ago... wow, that's a lot of time...
https://www.clodo.it/blog/mirrors/
The experiment was about perfectly loopable video. Honestly, the scene is very
simple.
Another of my renders from the same period:
https://www.clodo.it/blog/sphere-spirals-1958-mc-escher/




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.