  Re: [Q] POV-Ray in command line  
From: Francois LE COAT
Date: 4 Feb 2022 10:00:02
Message: <61fd3f72$1@news.povray.org>

I've made a Web page about the 18th flight of Ingenuity over planet Mars:

It is possible to reconstruct the visible relief, and the trajectory of
the Ingenuity drone, from a simple video sequence. The monocular
disparity is obtained by matching the images against a reference,
measuring the optical flow. The trajectory is obtained from the
parameters of the perspective transformation describing successive
images...
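The perspective transformation between two successive images can be written as a 3x3 homography with eight free parameters. As a minimal, self-contained sketch (a generic direct linear transform in numpy, not the author's actual software), here is how such a transform can be recovered from four point correspondences and applied to image points:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 perspective transform (homography) mapping
    src points onto dst points with the direct linear transform (DLT).
    src, dst: (N, 2) arrays of matched points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in the
        # nine entries of H (defined up to scale).
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the null vector of A, i.e. the last right
    # singular vector of its SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalise so that H[2, 2] == 1

def apply_homography(H, pts):
    """Apply H to (N, 2) points, with the projective division."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Toy check: recover a known perspective transform from 4 matches.
H_true = np.array([[1.0, 0.1, 5.0],
                   [0.0, 1.2, -3.0],
                   [1e-4, 0.0, 1.0]])
src = np.array([[0, 0], [100, 0], [100, 100], [0, 100]], dtype=float)
dst = apply_homography(H_true, src)
H_est = estimate_homography(src, dst)
```

In practice the correspondences would come from the optical-flow matching described above, and the sequence of estimated homographies, chained frame to frame, encodes the camera trajectory.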


Since April 19, 2021, the Ingenuity helicopter sent to Mars hasn't
stopped flying over the planet. It was expected to take off only
5 times, to demonstrate that it was possible. In fact, it is now
February 2022, and a 19th flight over Mars has just been attempted.
The measurements presented here correspond to the 18th flight over
planet Mars, dated 15 December 2021.

The localization of the piloting-assistance camera obtained this way
is not perfect. The lens of this camera has a radial distortion, which
is not taken into account by the perspective kinematic model.
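Such radial distortion is commonly described by the Brown polynomial model, where a normalised image point is scaled by (1 + k1*r^2 + k2*r^4). A minimal numpy sketch of the model and its inversion by fixed-point iteration follows; the coefficients used here are hypothetical, not the actual Ingenuity camera calibration:

```python
import numpy as np

def distort(pts, k1, k2=0.0):
    """Apply Brown radial distortion to normalised image points:
    x_d = x_u * (1 + k1*r^2 + k2*r^4), with r the radius of x_u."""
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts * (1.0 + k1 * r2 + k2 * r2 ** 2)

def undistort(pts, k1, k2=0.0, iters=10):
    """Invert the model by fixed-point iteration: repeatedly divide
    the distorted points by the factor evaluated at the current
    undistorted estimate (converges for mild distortion)."""
    und = pts.copy()
    for _ in range(iters):
        r2 = np.sum(und ** 2, axis=1, keepdims=True)
        und = pts / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return und

# Round trip with a hypothetical barrel-distortion coefficient.
pts = np.array([[0.3, -0.2], [0.1, 0.4]])
d = distort(pts, k1=-0.25)
u = undistort(d, k1=-0.25)
```

Undistorting the points before estimating the perspective transform would remove this source of error from the recovered trajectory.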

>> BayashiPascal writes:
>>> Francois LE COAT wrote:
>>>> If you read the comment from the NASA about the 8th flight of Ingenuity
>>>> <https://mars.nasa.gov/technology/helicopter/status/308>
>>>> you understand that there was no color camera acquisition for the 7th
>>>> and the 8th flight on Mars. This was due to the incident on the 6th
>>>> flight, and a conflict between acquisition of the two embedded cameras.
>>>> Let's hope for subsequent flights of the helicopter, that NASA has fixed
>>>> the timestamping problem. Let's hope we will have a color video from Mars.
>>> Well done ! :-)
>>> I hope too there will be colors for the next videos.
>> Oh, yes! We can only see static color images for the moment. If they
>> moved, we would have a better representation of Ingenuity's motion.
>>> Your system seems to work quite well. Do you have the data for the actual
>>> trajectory, and can you quantify the accuracy of your reconstructed
>>> trajectory ?
>> I have no "ground truth" about how the camera moves. NASA has data from
>> various sensors, because there is an IMU that is embedded. The comment
>> explains, for the sixth flight, that those sensors are drifting...
>> "
>> If the navigation system relied on the IMU alone, it would not be very
>> accurate in the long run: Errors would quickly accumulate, and the
>> helicopter would eventually lose its way. To maintain better accuracy
>> over time, the IMU-based estimates are nominally corrected on a regular
>> basis, and this is where Ingenuity’s navigation camera comes in. For the
>> majority of time airborne, the downward-looking navcams are taking 30
>> pictures a second of the Martian surface and immediately feeding them
>> into the helicopter’s navigation system.
>> " <https://mars.nasa.gov/technology/helicopter/status/305>
>> Here's my result on 8th flight 
>> <https://www.youtube.com/watch?v=CRUh37xpLT4>
>>> Are you working directly with the Ingenuity team, or just using their
>>> public data ?
>> I'm using public data. And those are extracted from real Mars data
>> acquired with the helicopter. NASA is not transmitting videos at
>> 30 images/second. I have to deal with it. It is fantastic to work
>> on Martian images, nevertheless :-)
>>> Bonne continuation :-)
>> Thanks for your interest.
>>> Pascal
> Here is the first color image sequence from the 9th flight on planet Mars:
>      <https://www.youtube.com/watch?v=0ug5BgZeNK4>
> The algorithm driving Ingenuity is pushed to its limits...
>      <https://mars.nasa.gov/technology/helicopter/s
> I really look forward to seeing a real film from this color camera!

Best regards,

François LE COAT

