> Bald Eagle writes:
>> Take for example Francois LE COAT's very interesting work:
>> Forgive me if there's a lot of things I don't understand properly, or
>> have wrong, but it seems to me that an aerial photograph of a landscape
>> has a lot of perspective distortion, and part of the photogrammetry
>> process is correcting for that.
> For the time being, I'm experimenting with modelling trajectories in 3D.
> But it is difficult work, because the public images released by NASA
> are not of good quality. If the Ingenuity helicopter flying on Mars were
> driven with my trajectory modelling, it would have crashed. But that is
> what I'm measuring, with the input data I could collect. This is strange!
> I have no choice but to improve my computations further.
For the 9th flight over Mars, I collected better images of the
surface. To understand what I'm computing, here is another work:
You can see a cursor pointing at Ingenuity's location, moving on
the map, following the helicopter's motion. What I would like to do is
to compute this in space, with "translate <Tx, Ty, Tz>" in 3 dimensions.
I could also "rotate <Rx, Ry, Rz>" and "shear <Sx, Sy, 0>", but that is
not working for the moment. And I rendered this with Persistence Of Vision.
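As a minimal sketch of that idea (this is not the author's actual scene file, and the Tx..Ry values are placeholder estimates), a POV-Ray scene could place a cursor object using the per-frame trajectory values:

```
// Hypothetical placeholder values standing in for the
// per-frame trajectory estimates.
#declare Tx = 1.2;  #declare Ty = 0.0;  #declare Tz = 3.4;  // translation
#declare Ry = 15;                                           // heading, degrees

camera { location <0, 2, -8> look_at <0, 0, 0> }
light_source { <10, 20, -10> rgb 1 }

// Cursor marking the helicopter's position.
cone {
  <0, 0.5, 0>, 0, <0, 0, 0>, 0.2
  pigment { rgb <1, 0, 0> }
  rotate <0, Ry, 0>            // "rotate <Rx, Ry, Rz>"
  translate <Tx, Ty, Tz>       // "translate <Tx, Ty, Tz>"
}
```

Note that stock POV-Ray has no "shear" keyword; a shear is usually expressed through the general "matrix <...>" transform instead.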
The algorithm I'm using is probably close to what is used on Mars,
because it is constructed to avoid the drift from the embedded IMU.
It's not viewed from the top (like (1)) but from the surface of Mars.