[Q] POV-Ray in command line (Messages 28 to 37 of 37)

From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 27 Jun 2021 17:11:34
Message: <60d8e986$1@news.povray.org>
Hi,

BayashiPascal writes:
> Francois LE COAT wrote:
>> If you read NASA's comment about the 8th flight of Ingenuity:
>>
>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>
>> you understand that there was no color camera acquisition for the 7th
>> and the 8th flights on Mars. This was due to the incident on the 6th
>> flight, and a conflict between the acquisitions of the two embedded
>> cameras. Let's hope that NASA has fixed the timestamping problem for
>> subsequent flights of the helicopter. Let's hope we will have a color
>> video from Mars.
> 
> Well done! :-)
> I too hope there will be colors in the next videos.

Oh, yes! For the moment we can only see static color images. If they
moved, we would have a better representation of Ingenuity's motion.

> Your system seems to work quite well. Do you have the data for the
> actual trajectory, and can you quantify the accuracy of your
> reconstructed trajectory?

I have no "ground truth" about how the camera moves. NASA has data from
various sensors, because there is an embedded IMU. The report on the
sixth flight explains that those sensors drift...
"
If the navigation system relied on the IMU alone, it would not be very
accurate in the long run: Errors would quickly accumulate, and the
helicopter would eventually lose its way. To maintain better accuracy
over time, the IMU-based estimates are nominally corrected on a regular
basis, and this is where Ingenuity’s navigation camera comes in. For the
majority of time airborne, the downward-looking navcams are taking 30
pictures a second of the Martian surface and immediately feeding them
into the helicopter’s navigation system.
" <https://mars.nasa.gov/technology/helicopter/status/305>

Here's my result on the 8th flight:
<https://www.youtube.com/watch?v=CRUh37xpLT4>

> Are you working directly with the Ingenuity team, or just using their
> public data?

I'm using public data, extracted from real Mars data acquired by the
helicopter. NASA does not transmit video at 30 images/second, so I have
to deal with that. It is fantastic to work on Martian images,
nevertheless :-)

> Keep up the good work :-)

Thanks for your interest.

> Pascal

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 8 Jul 2021 15:55:01
Message: <60e75815$1@news.povray.org>
Hi,

> BayashiPascal writes:
>> Francois LE COAT wrote:
>>> If you read NASA's comment about the 8th flight of Ingenuity:
>>>
>>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>>
>>> you understand that there was no color camera acquisition for the 7th
>>> and the 8th flights on Mars. This was due to the incident on the 6th
>>> flight, and a conflict between the acquisitions of the two embedded
>>> cameras. Let's hope that NASA has fixed the timestamping problem for
>>> subsequent flights of the helicopter. Let's hope we will have a color
>>> video from Mars.
>>
>> Well done! :-)
>> I too hope there will be colors in the next videos.
> 
> Oh, yes! For the moment we can only see static color images. If they
> moved, we would have a better representation of Ingenuity's motion.
> 
>> Your system seems to work quite well. Do you have the data for the
>> actual trajectory, and can you quantify the accuracy of your
>> reconstructed trajectory?
> 
> I have no "ground truth" about how the camera moves. NASA has data from
> various sensors, because there is an embedded IMU. The report on the
> sixth flight explains that those sensors drift...
> "
> If the navigation system relied on the IMU alone, it would not be very
> accurate in the long run: Errors would quickly accumulate, and the
> helicopter would eventually lose its way. To maintain better accuracy
> over time, the IMU-based estimates are nominally corrected on a regular
> basis, and this is where Ingenuity’s navigation camera comes in. For the
> majority of time airborne, the downward-looking navcams are taking 30
> pictures a second of the Martian surface and immediately feeding them
> into the helicopter’s navigation system.
> " <https://mars.nasa.gov/technology/helicopter/status/305>
> 
> Here's my result on the 8th flight:
> <https://www.youtube.com/watch?v=CRUh37xpLT4>
> 
>> Are you working directly with the Ingenuity team, or just using their
>> public data?
> 
> I'm using public data, extracted from real Mars data acquired by the
> helicopter. NASA does not transmit video at 30 images/second, so I have
> to deal with that. It is fantastic to work on Martian images,
> nevertheless :-)
> 
>> Keep up the good work :-)
> 
> Thanks for your interest.
> 
>> Pascal

Here is the first color image sequence from the 9th flight on planet
Mars:

	<https://www.youtube.com/watch?v=0ug5BgZeNK4>

The algorithm driving Ingenuity is pushed to its limits...

	<https://mars.nasa.gov/technology/helicopter/status/314>

I really look forward to seeing a real film from this color camera!

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: pkoning
Subject: Re: [Q] POV-Ray in command line
Date: 3 Aug 2021 16:30:00
Message: <web.6109a663b3a558261e101ef1b9aee0ac@news.povray.org>
I've run into the same problem.  The regular Mac release of POVRay is a GUI
tool.  Usually that's nice, but occasionally I want the command line interface
and there isn't any way to get to it.
For example, FreeCAD has a "render" tool that can feed scene descriptions to
ray-tracers such as POV-Ray.  It does so by creating the .pov file and then
invoking POVRay as a command line utility.  But on Mac that doesn't work
because the application can't be told what to do using Unix command line
arguments -- even though Mac OS is a Unix.
Some graphics applications have a command line switch to say "turn off GUI"
(like "-batch" or the like); it would be useful for POVRay to offer something
along those lines.
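
For reference, the Unix builds take the standard POV-Ray switches, so
something like the following (file names are just placeholders) is what a
tool like FreeCAD effectively needs to run:

	povray +Iscene.pov +Oscene.png +W800 +H600 +A -D

where -D turns off the display preview.  It's exactly that kind of
invocation that the Mac GUI build doesn't accept.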



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 4 Aug 2021 08:45:08
Message: <610a8bd4$1@news.povray.org>
Hi,

pkoning writes:
> I've run into the same problem.  The regular Mac release of POVRay is a
> GUI tool.  Usually that's nice, but occasionally I want the command line
> interface and there isn't any way to get to it.
> For example, FreeCAD has a "render" tool that can feed scene descriptions
> to ray-tracers such as POV-Ray.  It does so by creating the .pov file and
> then invoking POVRay as a command line utility.  But on Mac that doesn't
> work because the application can't be told what to do using Unix command
> line arguments -- even though Mac OS is a Unix.
> Some graphics applications have a command line switch to say "turn off
> GUI" (like "-batch" or the like); it would be useful for POVRay to offer
> something along those lines.

Thanks for reporting your experience with POV-Ray on the command line.
Because I use binaries from MacPorts, I don't know whom I should contact
about the POV-Ray port. Is there a mailing-list?
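
For what it's worth, the MacPorts build is a plain Unix binary; I
installed it with the usual port command and run it from the Terminal
(the scene and output names below are just placeholders):

	sudo port install povray
	povray +Iscene.pov +Oscene.png +W800 +H600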

Regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 4 Feb 2022 10:00:02
Message: <61fd3f72$1@news.povray.org>
Hi,

I've made a WEB page about the 18th flight of Ingenuity over planet Mars:

It's possible to reconstruct the visible relief, and the trajectory of
the Ingenuity drone, using a simple video sequence. The monocular
disparity is obtained by matching images with a reference, measuring the
optical flow. The trajectory is obtained from the parameters of the
perspective transformation describing successive images...

<https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/trajectory.html>
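
To give an idea, here is a minimal sketch of how such per-frame
parameters can be replayed in a POV-Ray animation (the include file and
array names are my own illustration; frame_number is POV-Ray's built-in
frame counter):

	// params.inc is assumed to #declare the arrays TX, TY, TZ,
	// RX, RY, RZ with one entry per video frame.
	#include "params.inc"
	#declare F = frame_number;
	camera {
	  perspective
	  location <0, 0, 0>
	  direction z
	  rotate <RX[F], RY[F], RZ[F]>      // pitch, yaw, roll
	  translate <TX[F], TY[F], TZ[F]>   // camera position
	}

Rendered with something like +KFI1 +KFF600 to step through the frames.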

Since April 19, 2021, the Ingenuity helicopter sent to Mars hasn't
stopped flying over the planet. It was expected to take off only
5 times, to demonstrate that it was possible. In fact, we are now in
February 2022, and a 19th flight over Mars has just been attempted. The
measurements here correspond to the 18th flight over planet Mars, dated
December 15, 2021.

The localization obtained for the piloting-assistance camera is not
perfect. The lens of this camera has a radial distortion, which is not
taken into account by the perspective kinematic model.

>> BayashiPascal writes:
>>> Francois LE COAT wrote:
>>>> If you read NASA's comment about the 8th flight of Ingenuity:
>>>>
>>>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>>>
>>>> you understand that there was no color camera acquisition for the 7th
>>>> and the 8th flights on Mars. This was due to the incident on the 6th
>>>> flight, and a conflict between the acquisitions of the two embedded
>>>> cameras. Let's hope that NASA has fixed the timestamping problem for
>>>> subsequent flights of the helicopter. Let's hope we will have a color
>>>> video from Mars.
>>>
>>> Well done! :-)
>>> I too hope there will be colors in the next videos.
>>
>> Oh, yes! For the moment we can only see static color images. If they
>> moved, we would have a better representation of Ingenuity's motion.
>>
>>> Your system seems to work quite well. Do you have the data for the
>>> actual trajectory, and can you quantify the accuracy of your
>>> reconstructed trajectory?
>>
>> I have no "ground truth" about how the camera moves. NASA has data from
>> various sensors, because there is an embedded IMU. The report on the
>> sixth flight explains that those sensors drift...
>> "
>> If the navigation system relied on the IMU alone, it would not be very
>> accurate in the long run: Errors would quickly accumulate, and the
>> helicopter would eventually lose its way. To maintain better accuracy
>> over time, the IMU-based estimates are nominally corrected on a regular
>> basis, and this is where Ingenuity’s navigation camera comes in. For
>> the majority of time airborne, the downward-looking navcams are taking
>> 30 pictures a second of the Martian surface and immediately feeding
>> them into the helicopter’s navigation system.
>> " <https://mars.nasa.gov/technology/helicopter/status/305>
>>
>> Here's my result on the 8th flight:
>> <https://www.youtube.com/watch?v=CRUh37xpLT4>
>>
>>> Are you working directly with the Ingenuity team, or just using their
>>> public data?
>>
>> I'm using public data, extracted from real Mars data acquired by the
>> helicopter. NASA does not transmit video at 30 images/second, so I have
>> to deal with that. It is fantastic to work on Martian images,
>> nevertheless :-)
>>
>>> Keep up the good work :-)
>>
>> Thanks for your interest.
>>
>>> Pascal
> 
> Here is the first color image sequence from the 9th flight on planet
> Mars:
> 
> 	<https://www.youtube.com/watch?v=0ug5BgZeNK4>
> 
> The algorithm driving Ingenuity is pushed to its limits...
> 
> 	<https://mars.nasa.gov/technology/helicopter/status/314>
> 
> I really look forward to seeing a real film from this color camera!

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Monocular Depth (was: [Q] POV-Ray in command line)
Date: 19 May 2022 10:33:33
Message: <6286553d$1@news.povray.org>
Hi,

Francois LE COAT writes:
> To explain what I'm doing, I've made a WEB page that is not yet
> finished:
> 
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
> 
> POV-Ray is totally appropriate to show what I'm doing, because it can
> represent the eight parameters I'm obtaining from the camera movement.
> 
> I obtain the eight:
> 
> - Tx horizontal translation
> - Ty vertical translation
> - Tz depth translation
> - Rx pitch angle
> - Ry yaw angle
> - Rz roll angle
> - Sx horizontal shear angle
> - Sy vertical shear angle
> 
> and POV-Ray can represent them all. This has already been discussed in
> <news://povray.advanced-users>, because I'm modelling the 3D motion.
> 
> The issue here is just to make POV-Ray quiet when I'm rendering...
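
As an illustration, here is a minimal sketch of how those eight
parameters can be written as a POV-Ray camera (the shear matrix below is
one possible convention, and the default #declares are my own; the real
values are passed with Declare= on the command line):

	// Default values; each can be overridden from the command line,
	// e.g.  povray +Iscene.pov Declare=Tx=0.5 Declare=Ry=12 ...
	#ifndef (Tx) #declare Tx = 0; #end
	#ifndef (Ty) #declare Ty = 0; #end
	#ifndef (Tz) #declare Tz = 0; #end
	#ifndef (Rx) #declare Rx = 0; #end
	#ifndef (Ry) #declare Ry = 0; #end
	#ifndef (Rz) #declare Rz = 0; #end
	#ifndef (Sx) #declare Sx = 0; #end
	#ifndef (Sy) #declare Sy = 0; #end
	camera {
	  perspective
	  location <0, 0, 0>
	  direction z
	  // shear of the image plane; Sx, Sy are angles in degrees
	  transform {
	    matrix < 1,                tan(radians(Sy)), 0,
	             tan(radians(Sx)), 1,                0,
	             0,                0,                1,
	             0,                0,                0 >
	  }
	  rotate <Rx, Ry, Rz>       // pitch, yaw, roll
	  translate <Tx, Ty, Tz>    // camera displacement
	}

(As for the quiet rendering, running with -D and -GA, i.e. display off
and All_Console off, is one way to silence it.)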

Here is what was done with a sequence at Mont Saint-Michel...

	<https://www.youtube.com/watch?v=zd_ZXEgX8tw>

There are three parts in the video, corresponding to three measures of
the optical flow: IODP, Farneback and DualTVL1. This gives three
measures of the monocular depth (temporal disparity), of growing
quality. What is shown is that one can see the relief of a scene just as
well with two eyes in stereo as with one eye in mono, through movement.
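
For the display side, a grey-scale depth map can be turned into relief
with a POV-Ray height_field; a minimal sketch, assuming the disparity
map and the original frame have been saved as "disparity.png" and
"frame.png" (hypothetical names):

	// The height_field spans 0..1 in x and z, with height in y
	// taken from the grey levels of the disparity image.
	height_field {
	  png "disparity.png"
	  smooth
	  pigment {
	    image_map { png "frame.png" interpolate 2 }
	    rotate x*90   // lay the image flat onto the x-z extent
	  }
	  scale <1, 0.3, 1>   // exaggerate or flatten the relief
	}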

Let's experiment... You can see the relief either by closing one eye and
moving, or with both eyes without moving. Movement (as on TV, and
without glasses) allows you to perceive the relief! There is no need to
be equipped with a virtual-reality (VR) headset.

Thanks for your help.

Regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Monocular Depth
Date: 23 Jun 2022 08:30:02
Message: <62b45cca$1@news.povray.org>
Hi,

Francois LE COAT wrote:
>> To explain what I'm doing, I've made a WEB page that is not yet
>> finished:
>>
>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>
>> POV-Ray is totally appropriate to show what I'm doing, because it can
>> represent the eight parameters I'm obtaining from the camera movement.
>>
>> I obtain the eight:
>>
>> - Tx horizontal translation
>> - Ty vertical translation
>> - Tz depth translation
>> - Rx pitch angle
>> - Ry yaw angle
>> - Rz roll angle
>> - Sx horizontal shear angle
>> - Sy vertical shear angle
>>
>> and POV-Ray can represent them all. This has already been discussed in
>> <news://povray.advanced-users>, because I'm modelling the 3D motion.
>>
>> The issue here is just to make POV-Ray quiet when I'm rendering...
> 
> Here is what was done with a sequence at Mont Saint-Michel...
> 
> 	<https://www.youtube.com/watch?v=zd_ZXEgX8tw>
> 
> There are three parts in the video, corresponding to three measures of
> the optical flow: IODP, Farneback and DualTVL1. This gives three
> measures of the monocular depth (temporal disparity), of growing
> quality. What is shown is that one can see the relief of a scene just
> as well with two eyes in stereo as with one eye in mono, through
> movement.
> 
> Let's experiment... You can see the relief either by closing one eye
> and moving, or with both eyes without moving. Movement (as on TV, and
> without glasses) allows you to perceive the relief! There is no need to
> be equipped with a virtual-reality (VR) headset.

Here is a similar demonstration with a sequence at Notre-Dame...

	<https://www.youtube.com/watch?v=F_J0l_B5A2s>

In these conditions, the drone is flying in a transverse direction. The
monocular depth (temporal disparity) is therefore measured horizontally.
The trajectory is reconstructed in a bird's-eye view, at a constant
altitude. There is a yaw rotation (Ry). The four displacement components
shown are <Tx,Ty,Tz,Ry>. All the mosaics representing the perspective
registration measure have an inter-correlation index no lower than 60%.

Thanks for your help.

Regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Ingenuity Flight (was: [Q] POV-Ray in command line)
Date: 1 Aug 2023 10:45:44
Message: <64c91a98$1@news.povray.org>
Hi,

Francois LE COAT writes:
> I've made a WEB page about the 18th flight of Ingenuity over planet
> Mars:
> 
> It's possible to reconstruct the visible relief, and the trajectory of
> the Ingenuity drone, using a simple video sequence. The monocular
> disparity is obtained by matching images with a reference, measuring
> the optical flow. The trajectory is obtained from the parameters of the
> perspective transformation describing successive images...
> 
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/trajectory.html>
> 
> Since April 19, 2021, the Ingenuity helicopter sent to Mars hasn't
> stopped flying over the planet. It was expected to take off only
> 5 times, to demonstrate that it was possible. In fact, we are now in
> February 2022, and a 19th flight over Mars has just been attempted. The
> measurements here correspond to the 18th flight over planet Mars, dated
> December 15, 2021.
> 
> The localization obtained for the piloting-assistance camera is not
> perfect. The lens of this camera has a radial distortion, which is not
> taken into account by the perspective kinematic model.

We are in August 2023, and the Ingenuity helicopter still flies over
Mars. There's no GPS satellite system on the planet, and very little
atmosphere compared to Earth, but the helicopter localizes itself with a
grey-scale camera pointing at the ground, and that's how it works...

Here is a video <https://www.youtube.com/watch?v=TP_ojUa6XtU>

The image-processing computations are based on the interpolation
performed at <https://twitter.com/stim3on/status/1419414689998594051>
of the images transmitted down by the JPL/NASA laboratory at Caltech
University...

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Ingenuity Flight
Date: 18 Aug 2023 13:00:43
Message: <64dfa3bb@news.povray.org>
Hi,

Francois LE COAT writes:
>> I've made a WEB page about the 18th flight of Ingenuity over planet
>> Mars:
>>
>> It's possible to reconstruct the visible relief, and the trajectory of
>> the Ingenuity drone, using a simple video sequence. The monocular
>> disparity is obtained by matching images with a reference, measuring
>> the optical flow. The trajectory is obtained from the parameters of
>> the perspective transformation describing successive images...
>>
>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/trajectory.html>
>>
>> Since April 19, 2021, the Ingenuity helicopter sent to Mars hasn't
>> stopped flying over the planet. It was expected to take off only
>> 5 times, to demonstrate that it was possible. In fact, we are now in
>> February 2022, and a 19th flight over Mars has just been attempted.
>> The measurements here correspond to the 18th flight over planet Mars,
>> dated December 15, 2021.
>>
>> The localization obtained for the piloting-assistance camera is not
>> perfect. The lens of this camera has a radial distortion, which is not
>> taken into account by the perspective kinematic model.
> 
> We are in August 2023, and the Ingenuity helicopter still flies over
> Mars. There's no GPS satellite system on the planet, and very little
> atmosphere compared to Earth, but the helicopter localizes itself with
> a grey-scale camera pointing at the ground, and that's how it works...
> 
> Here is a video <https://www.youtube.com/watch?v=TP_ojUa6XtU>
> 
> The image-processing computations are based on the interpolation
> performed at <https://twitter.com/stim3on/status/1419414689998594051>
> of the images transmitted down by the JPL/NASA laboratory at Caltech
> University...

Ingenuity made its 55th flight on August 12, 2023, which demonstrates
the robustness of the GNU/Linux control system, even in particularly
hostile conditions, an unknown environment, and with minimal
supervision, because the helicopter is almost autonomous, very far from
everything, somewhere in space.

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: Ingenuity Flight
Date: 10 Oct 2023 14:55:10
Message: <65259e0e@news.povray.org>
Hi,

Francois LE COAT writes:
>>> I've made a WEB page about the 18th flight of Ingenuity over planet
>>> Mars:
>>>
>>> It's possible to reconstruct the visible relief, and the trajectory
>>> of the Ingenuity drone, using a simple video sequence. The monocular
>>> disparity is obtained by matching images with a reference, measuring
>>> the optical flow. The trajectory is obtained from the parameters of
>>> the perspective transformation describing successive images...
>>>
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/trajectory.html>
>>>
>>> Since April 19, 2021, the Ingenuity helicopter sent to Mars hasn't
>>> stopped flying over the planet. It was expected to take off only
>>> 5 times, to demonstrate that it was possible. In fact, we are now in
>>> February 2022, and a 19th flight over Mars has just been attempted.
>>> The measurements here correspond to the 18th flight over planet Mars,
>>> dated December 15, 2021.
>>>
>>> The localization obtained for the piloting-assistance camera is not
>>> perfect. The lens of this camera has a radial distortion, which is
>>> not taken into account by the perspective kinematic model.
>>
>> We are in August 2023, and the Ingenuity helicopter still flies over
>> Mars. There's no GPS satellite system on the planet, and very little
>> atmosphere compared to Earth, but the helicopter localizes itself with
>> a grey-scale camera pointing at the ground, and that's how it works...
>>
>> Here is a video <https://www.youtube.com/watch?v=TP_ojUa6XtU>
>>
>> The image-processing computations are based on the interpolation
>> performed at <https://twitter.com/stim3on/status/1419414689998594051>
>> of the images transmitted down by the JPL/NASA laboratory at Caltech
>> University...
> 
> Ingenuity made its 55th flight on August 12, 2023, which demonstrates
> the robustness of the GNU/Linux control system, even in particularly
> hostile conditions, an unknown environment, and with minimal
> supervision, because the helicopter is almost autonomous, very far
> from everything, somewhere in space.

*Ingenuity over planet Mars*

#Ingenuity #Perseverance #Mars2020 #Mars #NASA #Solarocks

<https://www.youtube.com/shorts/gV0iPwSCFBY>

Here is a look back at Ingenuity's 59th flight on Mars, as captured by
the Mastcam-Z on the Perseverance Rover.

In this view, I've heavily enhanced the dust blown away during takeoff.
You can also see dust devils moving in the background!

Best regards,

-- 
François LE COAT
<https://eureka.atari.org/>


