  [Q] POV-Ray in command line (Message 21 to 30 of 37)  
From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 18 Feb 2021 11:45:17
Message: <602e999d$1@news.povray.org>
Hi,

I've completed the WEB page I mentioned. This image processing is
applied to a drone flight over the Vosges a few days ago. I was thinking
of applying the same computations to the flight of Ingenuity on Mars...

<https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>

The autonomous drone is landing today, and we'll have video sequences =)

Francois LE COAT writes:
> I had a talk with Bald Eagle, who is present in this newsgroup, about
> the transformations of POV-Ray, in order to render all the eight
> parameters I'm obtaining. I could also use OpenGL, because it may be
> more appropriate for "real-time". But POV-Ray is more realistic, and
> will become more and more convenient for "real-time" photographic
> rendering.
>
> The WEB page that I mentioned will be modified, because it doesn't yet
> represent everything I wanted to explain about what I'm doing.
>
> All this is written in C/C++ using POV-Ray and OpenCV, but I also use
> shell scripts, mainly `tcsh`. And I use extensively the macOS MacPorts
> environment, which allows me to use `xv`, ImageMagick, `ffmpeg`, etc.,
> and of course the POV-Ray and OpenCV libraries, regularly updated.
>
> All of this would simply be impossible to develop without the Unix
> environment. It also works under GNU/Linux, with the same tools.
> 
> BayashiPascal writes:
>> Francois LE COAT wrote:
>> Thank you very much for the web page, it looks like a very interesting
>> research project.
>>
>>> POV-Ray is totally appropriate to show what I'm doing, because it can
>>> represent the eight parameters I'm obtaining from the camera movement.
>>
>> Sure, POV-Ray can do that perfectly. What I meant was that another
>> rendering engine could also do it, while avoiding the problem you face
>> with POV-Ray. For example, a graphics library integrated into whatever
>> you use to launch the POV-Ray instances would avoid creating those 2000
>> external processes to run POV-Ray. In a previous comment you wrote
>> "pac%04d.png"; maybe you're using the C programming language to generate
>> the POV-Ray scripts and launch their rendering? In that case, using a
>> graphics library such as OpenGL to render the images directly in the C
>> program instead of POV-Ray would give you the same result while avoiding
>> the problem encountered with POV-Ray.
>>
>> But I still do not understand your constraints completely, and I shall
>> not go further with hypotheses.
>>
>> Anyway, good luck in your research! :-)
>>
>> Francois LE COAT wrote:
>>> To explain what I'm doing, I've made a WEB page that is not yet finished:
>>>
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>>
>>> POV-Ray is totally appropriate to show what I'm doing, because it can
>>> represent the eight parameters I'm obtaining from the camera movement.
>>>
>>> I obtain the eight:
>>>
>>> - Tx horizontal translation
>>> - Ty vertical translation
>>> - Tz depth translation
>>> - Rx pitch angle
>>> - Ry yaw angle
>>> - Rz roll angle
>>> - Sx horizontal shear angle
>>> - Sy vertical shear angle
>>>
>>> and POV-Ray can represent them all. This has already been discussed in
>>> <news://povray.advanced-users>, because I'm modelling the 3D motion.
>>>
>>> The issue here is just to make POV-Ray quiet while I'm rendering...
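
For reference, the frame loop Pascal describes above is essentially this
kind of C code (a simplified sketch, not my exact program: the pac%04d
file names come from the discussion above, and the 640x480 size is a
placeholder). The -D switch disables the display preview, and -GA turns
off all of POV-Ray's console text streams, which is what makes the
renders quiet with a Unix build of POV-Ray (e.g. from MacPorts) on the
PATH:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char cmd[512];

    /* One external POV-Ray process per frame, silenced with -D (no
       display preview) and -GA (no console text streams). */
    for (int i = 1; i <= 2000; i++) {
        snprintf(cmd, sizeof cmd,
                 "povray +Ipac%04d.pov +Opac%04d.png +W640 +H480 -D -GA",
                 i, i);
        if (system(cmd) != 0) {
            fprintf(stderr, "frame %04d failed\n", i);
            return EXIT_FAILURE;
        }
    }
    return EXIT_SUCCESS;
}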

Thanks for your help.

Regards,

-- 

<http://eureka.atari.org/>



From: Bald Eagle
Subject: Re: [Q] POV-Ray in command line
Date: 18 Feb 2021 13:30:01
Message: <web.602eb1e2b3a558261f9dae300@news.povray.org>
Francois LE COAT <lec### [at] atariorg> wrote:
> Hi,
>
> I've completed the WEB page I mentioned. This image processing is
> applied to a drone flight over the Vosges a few days ago. I was thinking
> of applying the same computations to the flight of Ingenuity on Mars...
>
> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>

Nice.
I modeled a drone propeller like that ... 7 years ago?

http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 18 Feb 2021 13:49:31
Message: <602eb6bb$1@news.povray.org>
Hi,

Bald Eagle writes:
> Francois LE COAT wrote:
>> I've completed the WEB page I mentioned. This image processing is
>> applied to a drone flight over the Vosges a few days ago. I was thinking
>> of applying the same computations to the flight of Ingenuity on Mars...
>>
>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>
> Nice.
> I modeled a drone propeller like that ... 7 years ago?
>
> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>

Great =)

The goal is modelling the trajectory and the visible relief from a
simple video, using the images from Ingenuity's camera, as in:

<https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>

The video is <https://www.youtube.com/watch?v=MzWu7zwdJSk>. Do you
remember that we talked about translate <Tx,Ty,Tz>, rotate
<Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera?
Then we can deduce the trajectory, and the monocular depth.

You'll see what it gives on Mars, if you haven't seen it in a
forest, with a drone, in winter ... This is spectacular :-)
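
For the record, the generated scene files apply those eight parameters
as a POV-Ray transform. Schematically, the C code writing the scenes
does something like this (a simplified sketch only: the emit_transform
helper is hypothetical, and the order of composition and the sign
conventions in my actual programs may differ):

#include <stdio.h>
#include <math.h>

/* Write a POV-Ray transform built from the eight motion parameters.
   Angles are given in degrees; the two shear angles enter through
   POV-Ray's 12-term matrix keyword (three rows of the linear part,
   then the translation row). */
static void emit_transform(FILE *out,
                           double Tx, double Ty, double Tz,
                           double Rx, double Ry, double Rz,
                           double Sx, double Sy)
{
    const double d2r = 3.14159265358979 / 180.0;
    fprintf(out,
        "transform {\n"
        "  matrix <1, %f, 0,\n"   /* new y picks up x*tan(Sy): vertical shear   */
        "          %f, 1, 0,\n"   /* new x picks up y*tan(Sx): horizontal shear */
        "          0, 0, 1,\n"
        "          0, 0, 0>\n"
        "  rotate <%f, %f, %f>\n"
        "  translate <%f, %f, %f>\n"
        "}\n",
        tan(Sy * d2r), tan(Sx * d2r),
        Rx, Ry, Rz, Tx, Ty, Tz);
}

POV-Ray applies the items of a transform block in the order they are
written, so here the shear acts first, then the rotation, then the
translation.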

Thanks for your help.

Best regards,

-- 

<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 16 Mar 2021 12:17:07
Message: <6050da03$1@news.povray.org>
Hi,

> Bald Eagle writes:
>> Francois LE COAT wrote:
>>> I've completed the WEB page I mentioned. This image processing is
>>> applied to a drone flight over the Vosges a few days ago. I was thinking
>>> of applying the same computations to the flight of Ingenuity on Mars...
>>>
>>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>
>> Nice.
>> I modeled a drone propeller like that ... 7 years ago?
>>
>> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>
>
> Great =)
>
> The goal is modelling the trajectory and the visible relief from a
> simple video, using the images from Ingenuity's camera, as in:
>
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>
> The video is <https://www.youtube.com/watch?v=MzWu7zwdJSk>. Do you
> remember that we talked about translate <Tx,Ty,Tz>, rotate
> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera?
> Then we can deduce the trajectory, and the monocular depth.
>
> You'll see what it gives on Mars, if you haven't seen it in a
> forest, with a drone, in winter ... This is spectacular :-)

I recently worked a little further on the trajectory of the drone...

     <https://www.youtube.com/watch?v=3PdUvGDCbQc>

Instead of using only Ry (yaw) and Tz (translation), I also used
Tx (translation) and Rz (roll) to reconstruct the trajectory. I couldn't
use Ty (translation) and Rx (pitch), because the result doesn't look
like a valid camera displacement. I have no real explanation for this.

But the aspect of the drone's trajectory is looking better ...
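
To fix ideas, the trajectory accumulation is simple dead reckoning in
the horizontal plane. Schematically (a sketch of the idea only, with
angles in radians and sign conventions that may differ from my actual
programs):

#include <math.h>

/* Camera pose accumulated frame by frame: position in the horizontal
   plane plus the heading. */
typedef struct { double x, z, yaw; } Pose;

/* Advance the pose with one frame's estimated motion: the heading
   turns by the yaw increment Ry, then the translation <Tx, Tz>,
   measured in the camera frame, is rotated into world coordinates
   and added to the position. */
static void dead_reckon(Pose *p, double Tx, double Tz, double Ry)
{
    p->yaw += Ry;
    p->x   += Tx * cos(p->yaw) + Tz * sin(p->yaw);
    p->z   += -Tx * sin(p->yaw) + Tz * cos(p->yaw);
}

Handling Rz (roll) as well amounts to rotating the camera-frame
translation around the viewing axis before accumulating it.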

Thanks for your help.

Best regards,

-- 

<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 30 May 2021 11:00:01
Message: <60b3a871$1@news.povray.org>
Hi,

>> Bald Eagle writes:
>>> Francois LE COAT wrote:
>>>> I've completed the WEB page I mentioned. This image processing is
>>>> applied to a drone flight over the Vosges a few days ago. I was thinking
>>>> of applying the same computations to the flight of Ingenuity on Mars...
>>>>
>>>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>>
>>> Nice.
>>> I modeled a drone propeller like that ... 7 years ago?
>>>
>>> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>
>>
>> Great =)
>>
>> The goal is modelling the trajectory and the visible relief from a
>> simple video, using the images from Ingenuity's camera, as in:
>>
>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>
>> The video is <https://www.youtube.com/watch?v=MzWu7zwdJSk>. Do you
>> remember that we talked about translate <Tx,Ty,Tz>, rotate
>> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera?
>> Then we can deduce the trajectory, and the monocular depth.
>>
>> You'll see what it gives on Mars, if you haven't seen it in a
>> forest, with a drone, in winter ... This is spectacular :-)
>
> I recently worked a little further on the trajectory of the drone...
>
> <https://www.youtube.com/watch?v=3PdUvGDCbQc>
>
> Instead of using only Ry (yaw) and Tz (translation), I also used
> Tx (translation) and Rz (roll) to reconstruct the trajectory. I couldn't
> use Ty (translation) and Rx (pitch), because the result doesn't look
> like a valid camera displacement. I have no real explanation for this.
>
> But the aspect of the drone's trajectory is looking better ...

I worked on the sixth flight of the "Ingenuity" helicopter on Mars...

	<https://www.youtube.com/watch?v=pKUAsuXF6EA>

Unfortunately, there was an incident with the flight's timestamp data,
and the video sources from NASA aren't so good for processing :-(
I hope we will get a color video from the second onboard camera :-)

Thanks for your help.

Best regards,

-- 

<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 26 Jun 2021 09:51:19
Message: <60d730d7$1@news.povray.org>
Hi,

>>> Bald Eagle writes:
>>>> Francois LE COAT wrote:
>>>>> I've completed the WEB page I mentioned. This image processing is
>>>>> applied to a drone flight over the Vosges a few days ago. I was thinking
>>>>> of applying the same computations to the flight of Ingenuity on Mars...
>>>>>
>>>>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>>>
>>>> Nice.
>>>> I modeled a drone propeller like that ... 7 years ago?
>>>>
>>>> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>
>>>
>>> Great =)
>>>
>>> The goal is modelling the trajectory and the visible relief from a
>>> simple video, using the images from Ingenuity's camera, as in:
>>>
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>>
>>> The video is <https://www.youtube.com/watch?v=MzWu7zwdJSk>. Do you
>>> remember that we talked about translate <Tx,Ty,Tz>, rotate
>>> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera?
>>> Then we can deduce the trajectory, and the monocular depth.
>>>
>>> You'll see what it gives on Mars, if you haven't seen it in a
>>> forest, with a drone, in winter ... This is spectacular :-)
>>
>> I recently worked a little further on the trajectory of the drone...
>>
>> <https://www.youtube.com/watch?v=3PdUvGDCbQc>
>>
>> Instead of using only Ry (yaw) and Tz (translation), I also used
>> Tx (translation) and Rz (roll) to reconstruct the trajectory. I couldn't
>> use Ty (translation) and Rx (pitch), because the result doesn't look
>> like a valid camera displacement. I have no real explanation for this.
>>
>> But the aspect of the drone's trajectory is looking better ...
>
> I worked on the sixth flight of the "Ingenuity" helicopter on Mars...
>
> <https://www.youtube.com/watch?v=pKUAsuXF6EA>
>
> Unfortunately, there was an incident with the flight's timestamp data,
> and the video sources from NASA aren't so good for processing :-(
> I hope we will get a color video from the second onboard camera :-)

If you read NASA's status report about the 8th flight of Ingenuity:

<https://mars.nasa.gov/technology/helicopter/status/308>

you will understand that there was no color-camera acquisition for the
7th and 8th flights on Mars. This was due to the incident on the 6th
flight, a conflict between the acquisitions of the two onboard cameras.
Let's hope that NASA has fixed the timestamping problem for subsequent
flights of the helicopter. Let's hope we will get a color video from Mars.

Thanks for your help.

Best regards,

-- 

<http://eureka.atari.org/>



From: BayashiPascal
Subject: Re: [Q] POV-Ray in command line
Date: 27 Jun 2021 06:40:00
Message: <web.60d85509b3a55826a3e088d5e0f8c582@news.povray.org>
Francois LE COAT <lec### [at] atariorg> wrote:
> Hi,
>
>
> If you read NASA's status report about the 8th flight of Ingenuity:
>
> <https://mars.nasa.gov/technology/helicopter/status/308>
>
> you will understand that there was no color-camera acquisition for the
> 7th and 8th flights on Mars. This was due to the incident on the 6th
> flight, a conflict between the acquisitions of the two onboard cameras.
> Let's hope that NASA has fixed the timestamping problem for subsequent
> flights of the helicopter. Let's hope we will get a color video from Mars.
>
> Thanks for your help.
>
> Best regards,
>
> --
>
> <http://eureka.atari.org/>


Well done! :-)
I too hope there will be color in the next videos.
Your system seems to work quite well. Do you have the data for the actual
trajectory, and can you quantify the accuracy of your reconstructed
trajectory?
Are you working directly with the Ingenuity team, or just using their
public data?

Keep up the good work! :-)

Pascal



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 27 Jun 2021 17:11:34
Message: <60d8e986$1@news.povray.org>
Hi,

BayashiPascal writes:
> Francois LE COAT wrote:
>> If you read NASA's status report about the 8th flight of Ingenuity:
>>
>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>
>> you will understand that there was no color-camera acquisition for the
>> 7th and 8th flights on Mars. This was due to the incident on the 6th
>> flight, a conflict between the acquisitions of the two onboard cameras.
>> Let's hope that NASA has fixed the timestamping problem for subsequent
>> flights of the helicopter. Let's hope we will get a color video from Mars.
> 
> Well done! :-)
> I too hope there will be color in the next videos.

Oh, yes! For the moment we have only seen static color images. If they
moved, we could get a better representation of Ingenuity's motion.

> Your system seems to work quite well. Do you have the data for the actual
> trajectory, and can you quantify the accuracy of your reconstructed
> trajectory?

I have no "ground truth" about how the camera moves. NASA has data from
various sensors, because there is an IMU that is embedded. The comment
is explaining for the sixth flight, that those sensors are drifting...
"
If the navigation system relied on the IMU alone, it would not be very
accurate in the long run: Errors would quickly accumulate, and the
helicopter would eventually lose its way. To maintain better accuracy
over time, the IMU-based estimates are nominally corrected on a regular
basis, and this is where Ingenuity’s navigation camera comes in. 
For the
majority of time airborne, the downward-looking navcams are taking 30
pictures a second of the Martian surface and immediately feeding them
into the helicopter’s navigation system.
" <https://mars.nasa.gov/technology/helicopter/status/305>

Here's my result on the 8th flight:
<https://www.youtube.com/watch?v=CRUh37xpLT4>

> Are you working directly with the Ingenuity team, or just using their
> public data?

I'm using public data, extracted from real Mars data acquired with the
helicopter. NASA is not transmitting video at 30 images per second; I
have to deal with that. It is fantastic to work on Martian images,
nevertheless :-)

> Keep up the good work! :-)

Thanks for your interest.

> Pascal

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: Francois LE COAT
Subject: Re: [Q] POV-Ray in command line
Date: 8 Jul 2021 15:55:01
Message: <60e75815$1@news.povray.org>
Hi,

> BayashiPascal writes:
>> Francois LE COAT wrote:
>>> If you read NASA's status report about the 8th flight of Ingenuity:
>>>
>>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>>
>>> you will understand that there was no color-camera acquisition for the
>>> 7th and 8th flights on Mars. This was due to the incident on the 6th
>>> flight, a conflict between the acquisitions of the two onboard cameras.
>>> Let's hope that NASA has fixed the timestamping problem for subsequent
>>> flights of the helicopter. Let's hope we will get a color video from Mars.
>>
>> Well done! :-)
>> I too hope there will be color in the next videos.
>
> Oh, yes! For the moment we have only seen static color images. If they
> moved, we could get a better representation of Ingenuity's motion.
>
>> Your system seems to work quite well. Do you have the data for the actual
>> trajectory, and can you quantify the accuracy of your reconstructed
>> trajectory?
>
> I have no "ground truth" about how the camera moves. NASA has data from
> various sensors, because there is an embedded IMU. The status report
> explains, for the sixth flight, that those sensors drift...
> "
> If the navigation system relied on the IMU alone, it would not be very
> accurate in the long run: Errors would quickly accumulate, and the
> helicopter would eventually lose its way. To maintain better accuracy
> over time, the IMU-based estimates are nominally corrected on a regular
> basis, and this is where Ingenuity’s navigation camera comes in. For the
> majority of time airborne, the downward-looking navcams are taking 30
> pictures a second of the Martian surface and immediately feeding them
> into the helicopter’s navigation system.
> " <https://mars.nasa.gov/technology/helicopter/status/305>
>
> Here's my result on the 8th flight:
> <https://www.youtube.com/watch?v=CRUh37xpLT4>
>
>> Are you working directly with the Ingenuity team, or just using their
>> public data?
>
> I'm using public data, extracted from real Mars data acquired with the
> helicopter. NASA is not transmitting video at 30 images per second; I
> have to deal with that. It is fantastic to work on Martian images,
> nevertheless :-)
>
>> Keep up the good work! :-)
>
> Thanks for your interest.
>
>> Pascal

Here is the first color image sequence from the 9th flight on planet Mars:

	<https://www.youtube.com/watch?v=0ug5BgZeNK4>

The algorithm driving Ingenuity is being pushed to its limits...

	<https://mars.nasa.gov/technology/helicopter/status/314>

I really look forward to seeing a real film from this color camera!

Best regards,

-- 
François LE COAT
<http://eureka.atari.org/>



From: pkoning
Subject: Re: [Q] POV-Ray in command line
Date: 3 Aug 2021 16:30:00
Message: <web.6109a663b3a558261e101ef1b9aee0ac@news.povray.org>
I've run into the same problem.  The regular Mac release of POV-Ray is a GUI
tool.  Usually that's nice, but occasionally I want the command-line
interface, and there isn't any way to get to it.
For example, FreeCAD has a "render" tool that can feed scene descriptions to
ray-tracers such as POV-Ray.  It does so by creating the .pov file and then
invoking POV-Ray as a command-line utility.  But on the Mac that doesn't
work, because the application can't be told what to do using Unix
command-line arguments -- even though macOS is a Unix.
Some graphics applications have a command-line switch that says "turn off the
GUI" (like "-batch" or similar); it would be useful for POV-Ray to offer
something along those lines.



