Hi,
> Bald Eagle writes:
>> Francois LE COAT wrote:
>>> I've completed the web page I mentioned. This image processing is
>>> applied to a drone flying in the Vosges a few days ago. I was
>>> thinking about applying the same computations to the flight of
>>> Ingenuity on Mars...
>>>
>>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>
>> Nice.
>> I modeled a drone propeller like that ... 7 years ago?
>>
>> <http://news.povray.org/povray.binaries.images/attachment/%3Cweb.53dd26749e0d00ba5e7df57c0%40news.povray.org%3E/propeller2_dragonfly.png?ttop=432863&toff=950>
>>
>
> Great =)
>
> The goal is modelling the trajectory and the visible relief from a
> simple video, using the images from Ingenuity's camera, as in:
>
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>
> The video is <https://www.youtube.com/watch?v=MzWu7zwdJSk>. Do you
> remember that we talked about translate <Tx,Ty,Tz>, rotate
> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera?
> From those we can deduce the trajectory and the monocular depth.
>
> You'll see what it produces on Mars, if you haven't seen it in a
> forest, with a drone, in winter ... It is spectacular :-)
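
To make the <Tx,Ty,Tz> / <Rx,Ry,Rz> idea concrete, here is a minimal
Python sketch of recovering a per-frame rotation and translation from
a monocular video with OpenCV. It is only an illustration under an
assumed intrinsics matrix K, using ORB matching and the essential
matrix; it is not the exact method of the page above, and it does not
estimate the shear angles <Sx,Sy,0>.

import cv2
import numpy as np

# Assumed pinhole intrinsics: focal length and principal point are
# placeholders and must be replaced by the real camera calibration.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

def motion_between(frame_a, frame_b):
    """Rotation R and unit-scale translation t from frame_a to
    frame_b (monocular, so the length of t is unknown)."""
    orb = cv2.ORB_create(2000)
    ka, da = orb.detectAndCompute(frame_a, None)
    kb, db = orb.detectAndCompute(frame_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(da, db)
    pa = np.float32([ka[m.queryIdx].pt for m in matches])
    pb = np.float32([kb[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pa, pb, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pa, pb, K, mask=mask)
    return R, t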
I recently worked a little further on the trajectory of the drone...
<https://www.youtube.com/watch?v=3PdUvGDCbQc>
Instead of using only Ry (yaw) and Tz (translation), I also used Tx
(translation) and Rz (roll) to reconstruct the trajectory. I couldn't
use Ty (translation) and Rx (pitch), because the result does not look
like a valid camera displacement; I have no real explanation for that.
But the shape of the drone's trajectory now looks better ...
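
For the record, a minimal sketch of that trajectory integration,
assuming per-frame estimates of (Ry, Rz, Tx, Tz) and keeping Rx and
Ty at zero as described above; the names and conventions are
illustrative, not my actual program.

import numpy as np

def rot_y(a):
    # Rotation about the vertical axis (yaw), angle in radians.
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    # Rotation about the optical axis (roll), angle in radians.
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def integrate(params):
    """params: iterable of (Ry, Rz, Tx, Tz) per frame.
    Returns the chained 3D camera positions."""
    R = np.eye(3)
    p = np.zeros(3)
    path = [p.copy()]
    for ry, rz, tx, tz in params:
        R = R @ rot_y(ry) @ rot_z(rz)        # accumulate orientation
        p = p + R @ np.array([tx, 0.0, tz])  # Ty held at zero
        path.append(p.copy())
    return np.array(path)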
Thanks for your help.
Best regards,
--
<http://eureka.atari.org/>