POV-Ray : Newsgroups : povray.macintosh : [Q] POV-Ray in command line
[Q] POV-Ray in command line (Message 25 to 34 of 34)
From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 30 May 2021 11:00:01
Message: <60b3a871$1@news.povray.org>
Hi,

>> Bald Eagle writes:
>>> Francois LE COAT wrote:
>>>> I've completed the WEB page I mentioned. This image processing is
>>>> applied to a drone flying in the Vosges a few days ago. I was thinking
>>>> about applying the same computations to the flight of Ingenuity on
>>>> Mars...
>>>>
>>>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>>
>>> Nice.
>>> I modeled a drone propeller like that ... 7 years ago?
>>>
>>> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>
>>>
>>
>> Great =)
>>
>> The goal is modelling the trajectory and the visible relief with a
>> simple video, using the images from Ingenuity's camera, like in :
>>
>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>
>>
>> the video is <https://www.youtube.com/watch?v=MzWu7zwdJSk> Do you
>> remember that we talked about translate <Tx,Ty,Tz>, rotate
>> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera ?
>> Then we can deduce the trajectory, and the monocular depth.
>>
>> You'll see what it gives on Mars, if you haven't seen it in a
>> forest, with a drone, in winter ... This is spectacular :-)
> 
> I recently worked a little further on the trajectory of the drone...
> 

> 
> Instead of using only Ry (yaw) and Tz (translation), I also used
> Tx (translation) and Rz (roll) to reconstruct the trajectory. I couldn't
> use Ty (translation) and Rx (pitch) because the result does not look
> like a valid camera displacement. I have no real explanation.
> 
> But the drone's trajectory is looking better ...

I worked on the sixth flight of the "Ingenuity" helicopter on Mars...

	<https://www.youtube.com/watch?v=pKUAsuXF6EA>

Unfortunately, there was an incident with the flight timestamp data,
and the video sources from NASA aren't so good for processing :-(
I hope we will get a color video from the second embedded camera :-)

Thanks for your help.

Best regards,

-- 

<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 26 Jun 2021 09:51:19
Message: <60d730d7$1@news.povray.org>
Hi,

>>> Bald Eagle writes:
>>>> Francois LE COAT wrote:
>>>>> I've completed the WEB page I mentioned. This image processing is
>>>>> applied to a drone flying in the Vosges a few days ago. I was thinking
>>>>> about applying the same computations to the flight of Ingenuity on
>>>>> Mars...
>>>>>
>>>>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>>>
>>>> Nice.
>>>> I modeled a drone propeller like that ... 7 years ago?
>>>>
>>>> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>
>>>>
>>>
>>> Great =)
>>>
>>> The goal is modelling the trajectory and the visible relief with a
>>> simple video, using the images from Ingenuity's camera, like in :
>>>
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>>
>>>
>>> the video is <https://www.youtube.com/watch?v=MzWu7zwdJSk> Do you
>>> remember that we talked about translate <Tx,Ty,Tz>, rotate
>>> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera ?
>>> Then we can deduce the trajectory, and the monocular depth.
>>>
>>> You'll see what it gives on Mars, if you haven't seen it in a
>>> forest, with a drone, in winter ... This is spectacular :-)
>>
>> I recently worked a little further on the trajectory of the drone...
>>

>>
>> Instead of using only Ry (yaw) and Tz (translation), I also used
>> Tx (translation) and Rz (roll) to reconstruct the trajectory. I couldn't
>> use Ty (translation) and Rx (pitch) because the result does not look
>> like a valid camera displacement. I have no real explanation.
>>
>> But the drone's trajectory is looking better ...
> 
> I worked on the sixth flight of the "Ingenuity" helicopter on Mars...
> 
> 	<https://www.youtube.com/watch?v=pKUAsuXF6EA>
> 
> Unfortunately, there was an incident with the flight timestamp data,
> and the video sources from NASA aren't so good for processing :-(
> I hope we will get a color video from the second embedded camera :-)

If you read NASA's status update about the 8th flight of Ingenuity:

<https://mars.nasa.gov/technology/helicopter/status/308>

you understand that there was no color camera acquisition for the 7th
and 8th flights on Mars. This was due to the incident on the 6th
flight, and a conflict between the acquisitions of the two embedded
cameras. Let's hope that NASA has fixed the timestamping problem for
subsequent flights of the helicopter, and that we will get a color
video from Mars.

Thanks for your help.

Best regards,

-- 

<http://eureka.atari.org/>



From: BayashiPascal
Subject: Re: [Q] POV-Ray in command line
Date: 27 Jun 2021 06:40:00
Message: <web.60d85509b3a55826a3e088d5e0f8c582@news.povray.org>
Francois LE COAT <lec### [at] atariorg> wrote:
> Hi,
>
>
> If you read NASA's status update about the 8th flight of Ingenuity:
>
> <https://mars.nasa.gov/technology/helicopter/status/308>
>
> you understand that there was no color camera acquisition for the 7th
> and 8th flights on Mars. This was due to the incident on the 6th
> flight, and a conflict between the acquisitions of the two embedded cameras.
> Let's hope that NASA has fixed the timestamping problem for subsequent
> flights of the helicopter, and that we will get a color video from Mars.
>
> Thanks for your help.
>
> Best regards,
>
> --
>
> <http://eureka.atari.org/>


Well done! :-)
I too hope there will be color in the next videos.
Your system seems to work quite well. Do you have the data for the actual
trajectory, and can you quantify the accuracy of your reconstructed trajectory?
Are you working directly with the Ingenuity team, or just using their public
data?

Keep up the good work :-)

Pascal



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 27 Jun 2021 17:11:34
Message: <60d8e986$1@news.povray.org>
Hi,

BayashiPascal writes:
> Francois LE COAT wrote:
>> If you read the comment from the NASA about the 8th flight of Ingenuity:
>>
>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>
>> you understand that there was no color camera acquisition for the 7th
>> and the 8th flight on Mars. This was due to the incident on the 6th
>> flight, and a conflict between acquisition of the two embedded cameras.
>> Let's hope for subsequent flights of the helicopter, that NASA has fixed
>> the horodating problem. Let's hope we will have a color video from Mars.
> 
> Well done ! :-)
> I hope too there will be colors for the next videos.

Oh, yes! We can only see static color images for the moment. If they
moved, we would have a better representation of Ingenuity's motion.

> Your system seems to work quite well. Do you have the data for the actual
> trajectory and can you quantify the accuracy of your reconstructed trajectory ?

I have no "ground truth" about how the camera moves. NASA has data from
various sensors, because an IMU is embedded. The status update for the
sixth flight explains that those sensors drift...
"
If the navigation system relied on the IMU alone, it would not be very
accurate in the long run: Errors would quickly accumulate, and the
helicopter would eventually lose its way. To maintain better accuracy
over time, the IMU-based estimates are nominally corrected on a regular
basis, and this is where Ingenuity’s navigation camera comes in. For the
majority of time airborne, the downward-looking navcams are taking 30
pictures a second of the Martian surface and immediately feeding them
into the helicopter’s navigation system.
" <https://mars.nasa.gov/technology/helicopter/status/305>

Here's my result on the 8th flight: <https://www.youtube.com/watch?v=CRUh37xpLT4>
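The drift NASA describes can be illustrated with a toy dead-reckoning example (a hedged sketch of the general principle, not Ingenuity's actual navigation filter; the bias value is made up): integrating a slightly biased acceleration twice makes the position error grow roughly quadratically with time, which is why periodic corrections from the navcam are needed.

```python
# Toy illustration of IMU dead-reckoning drift (not NASA's actual filter).
# A constant, tiny accelerometer bias is integrated twice; the position
# error grows roughly quadratically with time.

def dead_reckon(true_accel, bias, dt):
    """Integrate (biased) acceleration twice; return estimated positions."""
    v = 0.0
    p = 0.0
    positions = []
    for a in true_accel:
        v += (a + bias) * dt          # velocity from biased acceleration
        p += v * dt                   # position from velocity
        positions.append(p)
    return positions

# Hovering vehicle: true acceleration is zero, bias is 0.01 m/s^2.
n, dt = 300, 0.1                      # 30 s at 10 Hz
est = dead_reckon([0.0] * n, bias=0.01, dt=dt)
print(f"position error after 15 s: {est[n // 2 - 1]:.2f} m")
print(f"position error after 30 s: {est[-1]:.2f} m")
```

Doubling the elapsed time roughly quadruples the error, which matches the "errors would quickly accumulate" remark in the quoted status update.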

> Are you working directly with the Ingenuity team, or just using their public
> data ?

I'm using public data, extracted from real Mars data acquired with the
helicopter. NASA is not transmitting videos at 30 frames per second, so
I have to deal with that. It is fantastic to work on Martian images,
nevertheless :-)

> Keep up the good work :-)

Thanks for your interest.

> Pascal

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 8 Jul 2021 15:55:01
Message: <60e75815$1@news.povray.org>
Hi,

> BayashiPascal writes:
>> Francois LE COAT wrote:
>>> If you read the comment from the NASA about the 8th flight of Ingenuity:
>>>
>>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>>
>>> you understand that there was no color camera acquisition for the 7th
>>> and the 8th flight on Mars. This was due to the incident on the 6th
>>> flight, and a conflict between acquisition of the two embedded cameras.
>>> Let's hope for subsequent flights of the helicopter, that NASA has fixed
>>> the horodating problem. Let's hope we will have a color video from Mars.
>>
>> Well done ! :-)
>> I hope too there will be colors for the next videos.
> 
> Oh, yes! We could only see static colors images for the moment. If it
> moved, we could better have a representation of Ingenuity's motion.
> 
>> Your system seems to work quite well. Do you have the data for the actual
>> trajectory and can you quantify the accuracy of your reconstructed 
>> trajectory ?
> 
> I have no "ground truth" about how the camera moves. NASA has data from
> various sensors, because there is an IMU that is embedded. The comment
> is explaining for the sixth flight, that those sensors are drifting...
> "
> If the navigation system relied on the IMU alone, it would not be very
> accurate in the long run: Errors would quickly accumulate, and the
> helicopter would eventually lose its way. To maintain better accuracy
> over time, the IMU-based estimates are nominally corrected on a regular
> basis, and this is where Ingenuity’s navigation camera comes in. For the
> majority of time airborne, the downward-looking navcams are taking 30
> pictures a second of the Martian surface and immediately feeding them
> into the helicopter’s navigation system.
> " <https://mars.nasa.gov/technology/helicopter/status/305>
> 
> Here's my result on 8th flight 
> <https://www.youtube.com/watch?v=CRUh37xpLT4>
> 
>> Are you working directly with the Ingenuity team, or just using their public
>> data ?
> 
> I'm using public data. And those are extracted from real Mars data
> acquired with the helicopter. NASA is not transmitting videos with
> 30 images/second. I have to deal with it. This is fantastic to work
> on martian images, nevertheless :-)
> 
>> Keep up the good work :-)
> 
> Thanks for your interest.
> 
>> Pascal

Here is the first color image sequence from the 9th flight on planet Mars:

	<https://www.youtube.com/watch?v=0ug5BgZeNK4>

The algorithm driving Ingenuity is pushed to its limits...

	<https://mars.nasa.gov/technology/helicopter/status/314>

I really look forward to seeing a real film from this color camera!

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: pkoning
Subject: Re: [Q] POV-Ray in command line
Date: 3 Aug 2021 16:30:00
Message: <web.6109a663b3a558261e101ef1b9aee0ac@news.povray.org>
I've run into the same problem.  The regular Mac release of POV-Ray is a GUI
tool.  Usually that's nice, but occasionally I want the command-line interface,
and there isn't any way to get to it.
For example, FreeCAD has a "render" tool that can feed scene descriptions to
ray-tracers such as POV-Ray.  It does so by creating the .pov file and then
invoking POV-Ray as a command-line utility.  But on Mac that doesn't work,
because it can't be told what to do via Unix command-line arguments -- even
though macOS is a Unix.
Some graphics applications have a command-line switch that says "turn off the
GUI" (like "-batch" or the like); it would be useful for POV-Ray to offer
something along those lines.
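For reference, the Unix build of POV-Ray does accept command-line switches such as +I (input file), +O (output file), +W/+H (image size) and -D (no display preview). A small sketch of how a tool like FreeCAD might assemble such an invocation (the helper name and defaults are hypothetical, not FreeCAD's actual code):

```python
# Hedged sketch: assemble a POV-Ray command line as the Unix build expects.
# The helper name and defaults are hypothetical; +I/+O/+W/+H/-D are
# standard POV-Ray switches (input, output, width, height, no preview).

def build_povray_cmd(scene, output, width=800, height=600):
    """Return an argument list suitable for subprocess.run()."""
    return [
        "povray",
        f"+I{scene}",    # input scene file
        f"+O{output}",   # output image file
        f"+W{width}",    # image width in pixels
        f"+H{height}",   # image height in pixels
        "-D",            # don't open a display/preview window
    ]

cmd = build_povray_cmd("scene.pov", "scene.png")
print(" ".join(cmd))
```

This is exactly the kind of "no GUI" invocation the Mac GUI release cannot currently be driven with.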



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 4 Aug 2021 08:45:08
Message: <610a8bd4$1@news.povray.org>
Hi,

pkoning writes:
> I've run into the same problem.  The regular Mac release of POVRay is a GUI
> tool.  Usually that's nice, but occasionally I want the command line interface
> and there isn't any way to get to it.
> For example, FreeCAD has a "render" tool that can feed scene descriptions to
> ray-tracers such as POV-Ray.  It does so by creating the .pov file and then
> invoking POVRay as a command line utility.  But on Mac that doesn't work because
> it can't be configured to be told what to do using Unix command line arguments
> -- even though Mac OS is a Unix.
> Some graphics applications have a command line switch to say "turn off GUI"
> (like "-batch" or the like); it would be useful for POVRay to offer something
> along those lines.

Thanks for reporting your experience with POV-Ray on the command line.
Because I use binaries from MacPorts, I don't know whom I should
contact about the POV-Ray port. Is there a mailing list?

Regards,

-- 

<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 4 Feb 2022 10:00:02
Message: <61fd3f72$1@news.povray.org>
Hi,

I've made a WEB page about the 18th flight of Ingenuity over planet Mars.

It's possible to reconstruct the visible relief and the trajectory of
the Ingenuity drone from a simple video sequence. The monocular disparity
is obtained by matching images with a reference, measuring the optical
flow. The trajectory is obtained from the parameters of the perspective
transformation relating successive images...

<https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/trajectory.html>
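As a rough illustration of how such per-frame transformation parameters act on images, here is a minimal sketch (my own simplification, not the author's actual software): the eight parameters <Tx,Ty,Tz,Rx,Ry,Rz,Sx,Sy> discussed in this thread are composed into a 3x3 planar transform with perspective division, where Rz rotates in the image plane, Sx/Sy shear, Tz zooms, and Rx/Ry supply the out-of-plane (perspective) terms.

```python
import math

# Hedged sketch (an illustrative composition, not the thread author's
# software): build a 3x3 planar transform from the eight motion
# parameters.  Rz rotates in the image plane, Sx/Sy shear, Tz scales
# (motion along the optical axis), Tx/Ty translate, Rx/Ry act as the
# perspective (out-of-plane) terms.

def motion_matrix(Tx, Ty, Tz, Rx, Ry, Rz, Sx, Sy):
    c, s = math.cos(Rz), math.sin(Rz)
    scale = 1.0 / (1.0 + Tz)            # crude zoom model for depth motion
    return [
        [scale * (c + Sx * s), scale * (Sx * c - s), Tx],
        [scale * (s + Sy * c), scale * (c - Sy * s), Ty],
        [Rx,                   Ry,                   1.0],
    ]

def apply(H, x, y):
    """Apply the transform H to point (x, y) with perspective division."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# With all parameters at zero the transform is the identity.
H = motion_matrix(0, 0, 0, 0, 0, 0, 0, 0)
print(apply(H, 10.0, 5.0))
```

Estimating these eight numbers between successive frames is what lets the trajectory be read off, frame by frame.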

Since April 19, 2021, the Ingenuity helicopter sent to Mars hasn't
stopped flying over the planet. It was expected to take off only
5 times, to demonstrate that it was possible. In fact, we are in
February 2022, and the 19th flight over Mars has just been attempted.
The measurements here correspond to the 18th flight over planet Mars,
on 15 December 2021.

The localization obtained for the piloting-assistance camera is not
perfect. The lens of this camera has radial distortion, which is not
taken into account by the perspective kinematic model.
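For context, a common first-order model of such radial distortion is a Brown-Conrady term (assumed here as a generic illustration; k1 is a made-up coefficient, not the calibration of Ingenuity's camera): each point is displaced by a factor depending on its squared distance from the image center.

```python
# Hedged sketch: first-order radial (Brown-Conrady) lens distortion.
# k1 is a hypothetical coefficient, not Ingenuity's actual calibration.

def distort(x, y, k1):
    """Apply first-order radial distortion to normalized coordinates."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2
    return x * factor, y * factor

# Points near the center barely move; points near the edge move more,
# which is why a pure perspective model fits worst at the image borders.
print(distort(0.1, 0.0, k1=-0.2))   # slight barrel pull inward
print(distort(0.8, 0.0, k1=-0.2))   # stronger pull at the image edge
```

A perspective-only kinematic model absorbs none of this, so the residual error grows toward the borders of the frame.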

>> BayashiPascal writes:
>>> Francois LE COAT wrote:
>>>> If you read the comment from the NASA about the 8th flight of Ingenuity:
>>>>
>>>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>>>
>>>> you understand that there was no color camera acquisition for the 7th
>>>> and the 8th flight on Mars. This was due to the incident on the 6th
>>>> flight, and a conflict between acquisition of the two embedded cameras.
>>>> Let's hope for subsequent flights of the helicopter, that NASA has fixed
>>>> the horodating problem. Let's hope we will have a color video from Mars.
>>>
>>> Well done ! :-)
>>> I hope too there will be colors for the next videos.
>>
>> Oh, yes! We could only see static colors images for the moment. If it
>> moved, we could better have a representation of Ingenuity's motion.
>>
>>> Your system seems to work quite well. Do you have the data for the actual
>>> trajectory and can you quantify the accuracy of your reconstructed trajectory ?
>>
>> I have no "ground truth" about how the camera moves. NASA has data from
>> various sensors, because there is an IMU that is embedded. The comment
>> is explaining for the sixth flight, that those sensors are drifting...
>> "
>> If the navigation system relied on the IMU alone, it would not be very
>> accurate in the long run: Errors would quickly accumulate, and the
>> helicopter would eventually lose its way. To maintain better accuracy
>> over time, the IMU-based estimates are nominally corrected on a regular
>> basis, and this is where Ingenuity’s navigation camera comes in. For the
>> majority of time airborne, the downward-looking navcams are taking 30
>> pictures a second of the Martian surface and immediately feeding them
>> into the helicopter’s navigation system.
>> " <https://mars.nasa.gov/technology/helicopter/status/305>
>>
>> Here's my result on 8th flight 
>> <https://www.youtube.com/watch?v=CRUh37xpLT4>
>>
>>> Are you working directly with the Ingenuity team, or just using their public
>>> data ?
>>
>> I'm using public data. And those are extracted from real Mars data
>> acquired with the helicopter. NASA is not transmitting videos with
>> 30 images/second. I have to deal with it. This is fantastic to work
>> on martian images, nevertheless :-)
>>
>>> Keep up the good work :-)
>>
>> Thanks for your interest.
>>
>>> Pascal
> 
> Here is the first color image sequence from 9th flight on planet Mars:
> 
>      <https://www.youtube.com/watch?v=0ug5BgZeNK4>
> 
> The algorithm driving Ingenuity is pushed to the limits...
> 
>      <https://mars.nasa.gov/technology/helicopter/status/314>
> 
> I really look forward to look at a real film from this color camera!

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Monocular Depth (was: [Q] POV-Ray in command line)
Date: 19 May 2022 10:33:33
Message: <6286553d$1@news.povray.org>
Hi,

Francois LE COAT writes:
> To explain what I'm doing I've done a WEB page that is not yet finished:
> 
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html> 
> 
> 
> POV-Ray is totally appropriate to show what I'm doing, because it can
> represent the eight parameters I'm obtaining from the camera movement.
> 
> I obtain the eight:
> 
> - Tx horizontal translation
> - Ty vertical translation
> - Tz depth translation
> - Rx pitch angle
> - Ry yaw angle
> - Rz roll angle
> - Sx horizontal shear angle
> - Sy vertical shear angle
> 
> and POV-Ray can represent them all. This has already been discussed in
> <news://povray.advanced-users> because I'm modelling the 3D motion.
> 
> The issue here, is just to make POV-Ray quiet when I'm rendering...

Here is what was done with a sequence at Mont Saint-Michel...

	<https://www.youtube.com/watch?v=zd_ZXEgX8tw>

There are three parts to the video, corresponding to three optical-flow
estimators: IODP, Farneback and DualTVL1. These give three monocular
depth (temporal disparity) measurements of increasing quality. What is
shown is that one can see the relief of a scene just as well with two
eyes in stereo as with one eye in mono, through movement.

Let's experiment... You can see the relief either by moving with one
eye closed, or standing still with both eyes open. Movement (as on a TV,
and without glasses) allows you to perceive relief! There is no need to
be equipped with a virtual-reality (VR) headset.
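The "temporal disparity" idea can be summed up in one relation: for a sideways-translating camera, the apparent image motion (optical flow) of a point is inversely proportional to its depth, just like stereo disparity. A toy version (my own simplification with made-up numbers, not the IODP/Farneback/DualTVL1 estimators themselves):

```python
# Hedged sketch of monocular depth from motion ("temporal disparity").
# For a sideways-translating camera with focal length f (pixels) and a
# baseline b travelled between two frames, flow = f * b / Z, so
# Z = f * b / flow -- the stereo-disparity relation, with the baseline
# created by the camera's own movement instead of a second eye.

def depth_from_flow(flow, f=700.0, baseline=0.5):
    """Recover depth from horizontal optical flow magnitudes (pixels)."""
    return [f * baseline / d for d in flow]

# Nearby points move a lot between frames, distant points barely move.
flows = [35.0, 7.0, 3.5]              # pixels of apparent motion
print(depth_from_flow(flows))         # depths, in the baseline's units
```

This is why a single moving eye (or camera) recovers the same relief a stereo pair does: the motion supplies the baseline.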

Thanks for your help.

Regards,

-- 

<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Monocular Depth
Date: 23 Jun 2022 08:30:02
Message: <62b45cca$1@news.povray.org>
Hi,

Francois LE COAT wrote:
>> To explain what I'm doing I've done a WEB page that is not yet finished:
>>
>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html> 
>>
>>
>> POV-Ray is totally appropriate to show what I'm doing, because it can
>> represent the eight parameters I'm obtaining from the camera movement.
>>
>> I obtain the eight:
>>
>> - Tx horizontal translation
>> - Ty vertical translation
>> - Tz depth translation
>> - Rx pitch angle
>> - Ry yaw angle
>> - Rz roll angle
>> - Sx horizontal shear angle
>> - Sy vertical shear angle
>>
>> and POV-Ray can represent those all. This already have been discussed in
>> <news://povray.advanced-users> because I'm modelling the 3D motion.
>>
>> The issue here, is just to make POV-Ray quiet when I'm rendering...
> 
> Here is what was done with a sequence at Mont Saint-Michel...
> 
> 	<https://www.youtube.com/watch?v=zd_ZXEgX8tw>
> 
> There are three parts to the video, corresponding to three measures
> of the optical-flow: IODP, Farneback and DualTVL1. This gives three
> measures of different monocular depth (Temporal Disparity), quality
> growing. What is shown is that one can as well see the relief of a
> scene, with two eyes in stereo, or with one eye in mono through
> movement.
> 
> Let's experiment... You can either see the relief by closing one eye
> moving, or with both eyes without moving. Movement (as with TV, and
> without glasses) allows you to perceive the relief! There is no need
> to be equipped with a virtual reality headset (VR).

Here is a similar demonstration with a sequence at Notre-Dame...

     <https://www.youtube.com/watch?v=F_J0l_B5A2s>

In these conditions, the drone is flying in a transverse direction, so
the monocular depth (temporal disparity) is measured horizontally. The
trajectory is reconstructed in a bird's-eye view, at a constant
altitude, with a yaw rotation (Ry). The four displacement components
shown are <Tx,Ty,Tz,Ry>. All mosaics representing the perspective
registration have an inter-correlation index of at least 60%.
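Accumulating such per-frame displacements into a bird's-eye trajectory can be sketched as a dead-reckoning loop (a hedged, minimal illustration, not the actual reconstruction used here): each frame's local translation is rotated by the running yaw before being summed.

```python
import math

# Hedged sketch: integrate per-frame <Tx, Tz, Ry> into a 2D bird's-eye
# trajectory at constant altitude.  Each step rotates the frame-local
# translation by the accumulated yaw, then adds it to the running
# position (dead reckoning).

def birdview_trajectory(steps):
    """steps: list of (Tx, Tz, Ry) per frame; returns (x, z) positions."""
    x = z = yaw = 0.0
    path = [(x, z)]
    for tx, tz, ry in steps:
        yaw += ry                                  # accumulate yaw (radians)
        x += tx * math.cos(yaw) - tz * math.sin(yaw)
        z += tx * math.sin(yaw) + tz * math.cos(yaw)
        path.append((x, z))
    return path

# Four 90-degree turns with one unit forward each: a closed square.
square = [(0.0, 1.0, math.pi / 2)] * 4
end = birdview_trajectory(square)[-1]
print(end)
```

As with any dead reckoning, per-frame errors accumulate along the path, which is why a high inter-correlation index on the registration matters.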

Thanks for your help.

Regards,

-- 

<http://eureka.atari.org/>




Copyright 2003-2021 Persistence of Vision Raytracer Pty. Ltd.