> Bald Eagle writes:
>> Francois LE COAT wrote:
>>> I've completed the WEB page I mentioned. This image processing is
>>> applied to a drone flying in Vosges a few days ago. I was thinking
>>> about applying the same computations to the flight of Ingenuity on Mars...
>> I modeled a drone propeller like that ... 7 years ago?
> Great =)
> The goal is modelling the trajectory and the visible relief with a
> simple video, using the images from Ingenuity's camera, as in
> this video: <https://www.youtube.com/watch?v=MzWu7zwdJSk> Do you
> remember that we talked about translate <Tx,Ty,Tz>, rotate
> <Rx,Ry,Rz> and shear (or skew) angles <Sx,Sy,0> for the camera?
> Then we can deduce the trajectory, and the monocular depth.
> You'll see what it gives on Mars, if you haven't seen it in a
> forest, with a drone, in winter ... This is spectacular :-)
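For anyone following along, the quoted parametrization can be sketched in a few lines. This is only a minimal Python sketch of the rigid part of the model: the X-then-Y-then-Z composition order and the use of radians are my assumptions, not necessarily what the actual estimator uses, and the image-plane shear angles <Sx,Sy,0> are deliberately left out because they act on the image, not on the rigid pose.

```python
import math

def camera_pose(tx, ty, tz, rx, ry, rz):
    """Build a 4x4 rigid camera pose from translate <Tx,Ty,Tz> and
    rotate <Rx,Ry,Rz> (radians; X, Y, Z composition order is an
    assumption).  The shear angles <Sx,Sy,0> warp the image plane
    and are not part of this rigid pose."""
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    # Elementary rotations about the three camera axes.
    Rx = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    Ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    Rz = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]

    def mul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]

    R = mul(Rz, mul(Ry, Rx))
    # Homogeneous 4x4: rotation block plus translation column.
    return [R[0] + [tx], R[1] + [ty], R[2] + [tz], [0, 0, 0, 1]]
```

With all angles at zero this reduces to a pure translation, which is an easy sanity check before feeding in real per-frame estimates.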
I recently worked a little further on the trajectory of the drone...
Instead of using only Ry (yaw) and Tz (translation), I also used
Tx (translation) and Rz (roll) to reconstruct the trajectory. I couldn't
use Ty (translation) and Rx (pitch), because the result does not look
like a valid camera displacement. I have no real explanation for that.
But the drone's trajectory now looks better ...
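To make the dead-reckoning concrete, here is a minimal sketch of how per-frame estimates could be chained into a trajectory. It keeps only yaw (Ry) and the in-plane translations (Tx, Tz), so it is a flat, horizontal-plane simplification of what I actually do; the step values are hypothetical, standing in for whatever the image registration recovers per frame.

```python
import math

def accumulate_trajectory(steps):
    """Dead-reckon a camera path in the horizontal plane.

    Each step is (tx, tz, ry): lateral and forward translation
    expressed in the camera frame, plus the yaw increment in
    radians (hypothetical per-frame estimates)."""
    x, z, yaw = 0.0, 0.0, 0.0
    path = [(x, z)]
    for tx, tz, ry in steps:
        yaw += ry
        # Rotate the camera-frame step into the world frame
        # before accumulating it.
        x += tx * math.cos(yaw) - tz * math.sin(yaw)
        z += tx * math.sin(yaw) + tz * math.cos(yaw)
        path.append((x, z))
    return path
```

Four identical forward steps with zero yaw should land the camera four units straight ahead, which is a quick way to check the frame conventions before trusting the reconstructed path.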
Thanks for your help.