On 06/03/2016 15:49, clipka wrote:
> Y'all might want to try the latest and greatest POV-Ray development release:
>
> https://github.com/POV-Ray/povray/releases/tag/v3.7.1-alpha.8509766%2Bav119
>
> It adds a new function-based user-defined camera:
>
> camera {
>   user_defined
>   location {
>     FUNCTION, // x-coordinate of ray origin
>     FUNCTION, // y-coordinate of ray origin
>     FUNCTION  // z-coordinate of ray origin
>   }
>   direction {
>     FUNCTION, // x-coordinate of ray direction
>     FUNCTION, // y-coordinate of ray direction
>     FUNCTION  // z-coordinate of ray direction
>   }
>   CAMERA_MODIFIERS
> }
>
> where each FUNCTION takes the screen coordinates as parameters, ranging
> from -0.5 (left/bottom) to +0.5 (right/top).
>
It's nice, but it is not generic enough:
you cannot have a non-traced area, as with the fisheye and omnimax cameras.
Yep, give me a finger and I would take the arm up to the elbow... of the other
side, with a leg or two.
(where do I put the whole collection of smileys?)
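As a side note for readers following along: the quoted screen-coordinate convention is easy to reproduce outside POV-Ray. Below is a minimal Python sketch (all function and variable names are mine, purely illustrative) that maps a pixel to those [-0.5, +0.5] screen coordinates and builds an ordinary perspective ray from them in the same origin/direction style:

```python
import math

def screen_coords(px, py, width, height):
    """Map a pixel centre to user_defined screen coordinates,
    which range from -0.5 (left/bottom) to +0.5 (right/top)."""
    sx = (px + 0.5) / width - 0.5
    sy = 0.5 - (py + 0.5) / height  # pixel rows count downward
    return sx, sy

def pinhole_ray(sx, sy, fov_deg=90.0):
    """An ordinary perspective camera expressed in the same style:
    constant ray origin, direction depending on screen coordinates."""
    half = math.tan(math.radians(fov_deg) / 2)
    origin = (0.0, 0.0, 0.0)
    direction = (2 * sx * half, 2 * sy * half, 1.0)  # looking down +z
    return origin, direction
```

With `location` functions returning the constant origin components and `direction` functions returning the three components above, this would reproduce a regular perspective camera as a user_defined one.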

On 09.03.2016 21:12, Le_Forgeron wrote:
> On 06/03/2016 15:49, clipka wrote:
>> Y'all might want to try the latest and greatest POV-Ray development release:
>>
>> https://github.com/POV-Ray/povray/releases/tag/v3.7.1-alpha.8509766%2Bav119
>>
>> It adds a new function-based user-defined camera:
>>
>> camera {
>>   user_defined
>>   location {
>>     FUNCTION, // x-coordinate of ray origin
>>     FUNCTION, // y-coordinate of ray origin
>>     FUNCTION  // z-coordinate of ray origin
>>   }
>>   direction {
>>     FUNCTION, // x-coordinate of ray direction
>>     FUNCTION, // y-coordinate of ray direction
>>     FUNCTION  // z-coordinate of ray direction
>>   }
>>   CAMERA_MODIFIERS
>> }
>>
>> where each FUNCTION takes the screen coordinates as parameters, ranging
>> from -0.5 (left/bottom) to +0.5 (right/top).
>>
>
> It's nice, but it is not generic enough:
> you cannot have a non-traced area, as with the fisheye and omnimax cameras.
>
> Yep, give me a finger and I would take the arm up to the elbow... of the other
> side, with a leg or two.

Ooooh - I can give you _the_ finger if you like:

To identify regions that should be left untraced, simply have the
`direction` functions all return 0. :P
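The trick above can be sketched numerically: a fisheye-style direction function with a circular image area, returning a zero vector outside the circle to mark the pixel as untraced. A Python sketch (names are mine, purely illustrative):

```python
import math

def fisheye_direction(sx, sy, fov_deg=180.0):
    """Direction function for an equidistant fisheye with a circular
    image area; returns (0, 0, 0) outside the circle, which marks
    the pixel as untraced per the tip above."""
    r = math.hypot(sx, sy)        # radius from image centre
    if r > 0.5:                   # outside the image circle: do not trace
        return (0.0, 0.0, 0.0)
    if r == 0:
        return (0.0, 0.0, 1.0)    # image centre looks straight ahead
    theta = r / 0.5 * math.radians(fov_deg) / 2  # angle from view axis
    s = math.sin(theta)
    return (s * sx / r, s * sy / r, math.cos(theta))
```

The corners of the frame (where `r > 0.5`) get the zero vector, so the square frame renders a circular fisheye image with an untraced border.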

My current ODS user_defined camera (side-by-side only):
#declare odsIPD = 0.065;
#declare odsLocationX = 0;
#declare odsLocationY = 0;
#declare odsLocationZ = 0;
#declare odsDirectionX = 0;
#declare odsDirectionY = 0;
#declare odsDirectionZ = 1;

camera {
  user_defined
  location {
    function { odsLocationX + cos(select(x,(x+0.5)*2,(x*2)) * 2 * pi - pi)*odsIPD/2*select(x,-1,1) }
    function { odsLocationY }
    function { odsLocationZ + sin(select(x,(x+0.5)*2,(x*2)) * 2 * pi - pi)*odsIPD/2*select(x,-1,1) }
  }
  direction {
    function { sin(select(x,(x+0.5)*2,(x*2)) * 2 * pi - pi) * cos(pi / 2 - (1-(y+0.5))*pi) }
    function { sin(pi / 2 - (1-(y+0.5))*pi) }
    function { cos(select(x,(x+0.5)*2,(x*2)) * 2 * pi - pi) * cos(pi / 2 - (1-(y+0.5))*pi) }
  }
}
I want to complete it and clean/optimize it; it may be useful as an example in the
docs if it deserves it.

- In the original Google formulas about ODS, there isn't any reference to the
starting location and direction of the camera.
The above code always renders in the forced direction <0,0,1>.
Can anyone help me with where to place the unused odsDirectionX/Y/Z
variables? I know almost nothing about 3D maths...

- Is there a function that returns a float component from a given vector?
I aim to write
#declare odsDirection = <0,1,1>
and, in a function,
odsLocation.x
...
because I'm forced to have six functions that return floats for location and
direction; or can I write two functions that return a vector?
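For readers who want to check the side-by-side camera above numerically, here is the same math transcribed to Python (a sketch under my reading of select(): the left half of the image is the left eye, the right half the right eye; all names are mine):

```python
import math

IPD = 0.065  # interpupillary distance in scene units

def ods_side_by_side(sx, sy):
    """Ray origin and direction for the side-by-side ODS camera above.
    sx, sy are screen coordinates in [-0.5, +0.5]."""
    eye = -1.0 if sx < 0 else 1.0              # select(x,-1,1)
    u = (sx + 0.5) * 2 if sx < 0 else sx * 2   # 0..1 across each eye's half
    phi = u * 2 * math.pi - math.pi            # longitude, -pi..+pi
    theta = math.pi / 2 - (1 - (sy + 0.5)) * math.pi  # latitude
    origin = (math.cos(phi) * IPD / 2 * eye,
              0.0,
              math.sin(phi) * IPD / 2 * eye)
    direction = (math.sin(phi) * math.cos(theta),
                 math.sin(theta),
                 math.cos(phi) * math.cos(theta))
    return origin, direction
```

Every origin lies on a circle of diameter IPD in the y=0 plane, and the two eyes sit on opposite sides of that circle for the same viewing longitude; the direction vector is always unit length.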

Hi to all, some updates on this topic.

I've done a lot of tests and, for me, the Google ODS docs:
https://developers.google.com/cardboard/jump/rendering-ods-content.pdf
are simply wrong (I mean the suggested ray-tracing algorithm).
I wrote a mail (I'm still waiting for feedback) to Mach Kobayashi, who seems
related to this work:
https://community.renderman.pixar.com/article/991/rendering-for-vr.html

This .pov source: http://pastebin.com/aPbGSDud
generates this image with the Google ODS formulas:
http://www.clodo.it/host/images/f6febe4d2a5739483562719923fc528c31ac378b.png

With my Oculus Rift, the front and bottom views are perfect, but looking
left and right the 3D seems reversed.
Focus on the 4 little yellow balls (front, left, right, bottom).
Look here, where I manually drew yellow lines:
http://www.clodo.it/host/images/50ca1ff17fc834e167d44c0ab24484acd90bfd3f.jpg
From what I understand, in the top (left-eye) image the yellow ball needs to
always stay to the right of the yellow line.

So, I rewrote the ODS with another logic:
// ODS, Top/Bottom
#declare odsIPD = 1;
#declare odsLocationX = 0;
#declare odsLocationY = 0;
#declare odsLocationZ = 0;
#declare odsAngle = 90;

camera {
  user_defined
  location {
    function { odsLocationX + select(y,-1,+1)*odsIPD/2*cos(pi+2*pi*(x+0.5+odsAngle/360)) }
    function { odsLocationY }
    function { odsLocationZ + select(y,-1,+1)*odsIPD/2*sin(pi+2*pi*(x+0.5+odsAngle/360)) }
  }
  direction {
    function { sin(((x+0.5+odsAngle/360)) * 2 * pi - pi) * cos(pi / 2 - select(y, 1-2*(y+0.5), 1-2*y) * pi) }
    function { sin(pi / 2 - select(y, 1-2*(y+0.5), 1-2*y) * pi) }
    function { cos(((x+0.5+odsAngle/360)) * 2 * pi - pi) * cos(pi / 2 - select(y, 1-2*(y+0.5), 1-2*y) * pi) * -1 }
  }
}
I removed the Google ODS ray_origin formulas.
The user_defined location's x parameter is in the range -0.5..0.5, and in a
panorama it maps to the viewing longitude, so I used it to rotate the IPD
vector <-/+IPD,0,0> around the Y axis.

Now it's correct:
http://www.clodo.it/host/images/66a61ebcd04180d9203276741ad78075262d20a3.jpg
and it looks perfect in all directions with the Oculus Rift.
Here:
https://forums.oculus.com/viewtopic.php?f=28&t=30854
there are some updated sample renderings.
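The "rotate the IPD vector around the Y axis" idea can be verified numerically. A Python transcription of the location functions above (names are mine, purely illustrative):

```python
import math

def ods_origin(sx, sy, ipd=1.0, angle=90.0):
    """Ray origin of the top/bottom ODS camera above: the eye offset
    of ipd/2 rotated around the Y axis by the longitude derived from sx."""
    eye = -1.0 if sy < 0 else 1.0   # top/bottom half selects the eye
    u = sx + 0.5 + angle / 360      # panned longitude parameter
    return (eye * ipd / 2 * math.cos(math.pi + 2 * math.pi * u),
            0.0,
            eye * ipd / 2 * math.sin(math.pi + 2 * math.pi * u))
```

Every origin lies on a circle of radius ipd/2 in the y=0 plane, and the two eyes sit at diametrically opposite points of that circle for the same sx.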

I'm almost ready to build a POV-Ray wiki page about ODS; I only need to write the
user_defined functions for the side-by-side and single-eye versions.

On 22.03.2016 02:55, Clodo wrote:
> This .pov source: http://pastebin.com/aPbGSDud
> generates this image with the Google ODS formulas:
> http://www.clodo.it/host/images/f6febe4d2a5739483562719923fc528c31ac378b.png
>
> With my Oculus Rift, the front and bottom views are perfect, but looking
> left and right the 3D seems reversed.

That's no surprise, because in the camera direction Z-axis formula you
have a surplus "-1" factor at the end that's /not/ in the original
Google formulae.

If this is intended to compensate for a left-handed vs. right-handed
coordinate system, you need to apply the same factor to the camera
location Z-axis formula.
> So, I rewrote the ODS with another logic:
...
> Now it's correct:

That depends on what spatial orientation you originally aimed for;
removing the surplus "-1" factor gives a different result, flipping the
handedness and looking in the -Z direction rather than the +Z direction.

My suggestion would be to go back to the original Google ODS formulas,
and use a factor "Handedness" rather than a constant "-1" (also, make
sure to use the factor on both Z-axis formulae), which can then be set
to either "+1" for left-handed or "-1" for right-handed.

> That's no surprise, because in the camera direction Z-axis formula you
> have a surplus "-1" factor at the end that's /not/ in the original
> Google formulae.

Ouch. Caused by my incompetence in 3D math... I'm a software security expert;
I'm working on ODS only for fun.

> My suggestion would be to go back to the original Google ODS formulas,
> and use a factor "Handedness" rather than a constant "-1" (also, make
> sure to use the factor on both Z-axis formulae), which can then be set
> to either "+1" for left-handed or "-1" for right-handed.
Reverted to the original Google ODS:
// ODS - Top/Bottom
#declare odsIPD = 0.065; // Interpupillary distance
#declare odsLocationX = 0;
#declare odsLocationY = 0;
#declare odsLocationZ = 0;
#declare odsHandedness = 1; // "+1" for left-handed, "-1" for right-handed
#declare odsAngle = 0; // Rotation, clockwise, in degrees.

camera {
  user_defined
  location {
    function { odsLocationX + cos(((x+0.5+odsAngle/360)) * 2 * pi - pi)*odsIPD/2*select(y,-1,+1) }
    function { odsLocationY }
    function { odsLocationZ + sin(((x+0.5+odsAngle/360)) * 2 * pi - pi)*odsIPD/2*select(y,-1,+1) * odsHandedness }
  }
  direction {
    function { sin(((x+0.5+odsAngle/360)) * 2 * pi - pi) * cos(pi / 2 - select(y, 1-2*(y+0.5), 1-2*y) * pi) }
    function { sin(pi / 2 - select(y, 1-2*(y+0.5), 1-2*y) * pi) }
    function { cos(((x+0.5+odsAngle/360)) * 2 * pi - pi) * cos(pi / 2 - select(y, 1-2*(y+0.5), 1-2*y) * pi) * odsHandedness }
  }
}
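For checking purposes, here is a Python transcription of the complete ray computation above (names are mine, purely illustrative); it makes it easy to verify that the handedness factor only mirrors the Z components, as suggested:

```python
import math

def ods_ray(sx, sy, ipd=0.065, handedness=1.0, angle=0.0):
    """Ray for the top/bottom ODS camera above. handedness is +1 for a
    left-handed coordinate system, -1 for a right-handed one."""
    eye = -1.0 if sy < 0 else 1.0                  # select(y,-1,+1)
    u = sx + 0.5 + angle / 360                     # panned longitude parameter
    v = 1 - 2 * (sy + 0.5) if sy < 0 else 1 - 2 * sy
    phi = u * 2 * math.pi - math.pi                # longitude
    theta = math.pi / 2 - v * math.pi              # latitude
    origin = (math.cos(phi) * ipd / 2 * eye,
              0.0,
              math.sin(phi) * ipd / 2 * eye * handedness)
    direction = (math.sin(phi) * math.cos(theta),
                 math.sin(theta),
                 math.cos(phi) * math.cos(theta) * handedness)
    return origin, direction
```

Flipping handedness from +1 to -1 negates exactly the z components of both the ray origin and the ray direction, leaving x and y untouched.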
The GOOD news for me: this generates, pixel for pixel, the same image as my
rewritten version from yesterday night.
So the functions above seem to be the release candidate for the wiki page. I will
prepare the equivalent side-by-side and single-eye versions.
Thanks, thanks, thanks for your feedback.

I wrote a post on my blog about all my ODS experiments with POV-Ray:
https://www.clodo.it/blog/omni%C2%ADdirectional-stereo-ods-with-povray/

The post contains the latest edition of the ODS user_defined camera.
I hope this can be the final version, and that my experiments can be useful to
anyone as a starting point to explore this subject.
Any of my work can be used as public domain, for example if the POV-Ray team
wants to use it as a sample in the docs or wiki.

I still think that ODS deserves a user-friendly C++ implementation in the
POV-Ray codebase. A lot of video content is under development for the
upcoming VR market.

Ciao!
Fabrizio Carimati / Clodo

On 03/04/2016 20:23, Clodo wrote:
> I wrote a post on my blog about all my ODS experiments with POV-Ray:
>
> https://www.clodo.it/blog/omni%C2%ADdirectional-stereo-ods-with-povray/
>
> The post contains the latest edition of the ODS user_defined camera.
>
Nice text. Really very cool.

> I hope this can be the final version, and that my experiments can be useful to
> anyone as a starting point to explore this subject.
>
You provide the fast settings as quickres.ini... I would have liked:
* also the various scenes for the above illustrations (if anybody wants to make
them again), as downloadable links (no need to display them online)
* maybe, instead of adding to quickres.ini, another file name? (such as
easyods.ini?) (just a suggestion)
> Any of my work can be used as public domain, for example if the POV-Ray team
> wants to use it as a sample in the docs or wiki.
>
> I still think that ODS deserves a user-friendly C++ implementation in the
> POV-Ray codebase. A lot of video content is under development for the
> upcoming VR market.

The best I could do was adding an omni_directional_stereo camera to my own fork
(hgpovray).
It does not have the camera-direction limitation described in your text (and
right & up can be adjusted too).
Did you see it?

About the vertical modulation: it was not in the original paper.
Is it now a requirement to have some vertical modulation?
One of the links (Oculus VR Camera) seems to indicate that the modulation (start
and curve) could be different for nadir and zenith (each with its own settings).

Also at the same link there are the nvidia cube and 3x2 cube maps (in addition
to the longitude+latitude projection); is there any use or interest for them?
Do they also need a modulation for nadir & zenith? (it seems so, but I always
like contributions)

> Nice text. Really very cool.

Thanks! Note that I'm not a native English speaker, and I'm not a math or 3D
expert; I'm a software security expert (DDoS, XSS, VPNs, stuff like that). I made
this ODS camera only for fun.

> * also the various scenes for the above illustrations (if anybody wants to make
> them again), as downloadable links (no need to display them online)

I will do that.

> * maybe, instead of adding to quickres.ini, another file name? (such as
> easyods.ini?) (just a suggestion)

Ouch, I simply didn't know about the multiple .ini files feature :P

> The best I could do was adding an omni_directional_stereo camera to my own fork
> (hgpovray).
> It does not have the camera-direction limitation described in your text (and
> right & up can be adjusted too).
> Did you see it?

No, sorry, never tried it; I will look at it soon.

> About the vertical modulation: it was not in the original paper.
> Is it now a requirement to have some vertical modulation?
> One of the links (Oculus VR Camera) seems to indicate that the modulation
> (start and curve) could be different for nadir and zenith (each with its own
> settings)

It's not a requirement.
But it's hell to write it as a user_defined camera function.
Using an odsVerticalModulation of 0.0001 practically generates the same image we
can obtain without any vertical modulation, so I prefer not to add other
select() calls to the user_defined formulas.

I read a lot about the nadir/zenith issue. But, same as above, it's a pain to add
different methods/params to a user_defined function; it is much easier in C code.
But I don't know if my patch will be accepted into the official POV-Ray code, so
I prefer to keep the user_defined functions and my C code identical.
From what I understand, software like 3DS Max, Maya, Softimage etc. use
DomeMaster, which applies a simple texture to filter that.
In the end, it's a limit of ODS, and content creators need to avoid objects at
the nadir/zenith. I prefer to avoid the additional complexity in the user_defined
functions right now.

> Also at the same link there are the nvidia cube and 3x2 cube maps (in addition
> to the longitude+latitude projection); is there any use or interest for them?
> Do they also need a modulation for nadir & zenith? (it seems so, but I always
> like contributions)

Theory vs. practice. I own an Oculus Rift and soon a Vive. In practice there
isn't any image/video player that supports cube maps, or they are very, very
rare. For this reason, I don't know anything about that.
There are a lot of websites that share videos/images for HMDs; I have never seen
a cube-map format, always omnidirectional (stereo or mono).

I'm working on another project right now, but I hope to test your hgpovray
fork soon.

Ciao!

The following 4K 60fps ODS animation took 3 weeks to render on my Thinkpad X200:
https://www.youtube.com/watch?v=7bvMaKAcAlQ