Hi,
>>>> Bald Eagle writes:
>>>>> Take for example Francois LE COAT's very interesting work:
>>>>>
>>>>> http://news.povray.org/povray.advanced-users/thread/%3Cweb.5bb77cec1f36de80c437ac910%40news.povray.org%3E/
>>>>>
>>>>> Forgive me if there are a lot of things I don't understand, properly
>>>>> recall, or have wrong, but it seems to me that an aerial photograph
>>>>> of a landscape has a lot of perspective distortion, and part of the
>>>>> photogrammetry process would be correcting for that.
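[For a flat scene, the perspective correction described above amounts to inverting a plane-to-image homography. A minimal numpy sketch of the idea; the matrix values here are invented for illustration, not taken from any real camera:]

```python
import numpy as np

# Hypothetical plane-to-image homography (values made up for illustration):
# maps ground-plane coordinates (X, Y) to image pixels (u, v).
H = np.array([[800.0,  50.0, 320.0],
              [ 10.0, 600.0, 240.0],
              [  0.0,   0.4,   1.0]])

def project(H, XY):
    """Apply the homography to a ground point (homogeneous coordinates)."""
    p = H @ np.array([XY[0], XY[1], 1.0])
    return p[:2] / p[2]              # perspective divide

def rectify(H, uv):
    """Undo the perspective distortion: image pixel -> ground plane."""
    q = np.linalg.inv(H) @ np.array([uv[0], uv[1], 1.0])
    return q[:2] / q[2]

uv = project(H, (2.0, 3.0))          # where the ground point lands in the image
XY = rectify(H, uv)                  # recovers (2.0, 3.0) up to round-off
print(uv, XY)
```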
>>>>
>>>> For the time being, I'm experimenting with modelling trajectories in 3D.
>>>> But it is difficult work, because the public images released by NASA
>>>> are not of good quality. If the Ingenuity helicopter flying on Mars
>>>> were driven with my trajectory modelling, it would have crashed. But
>>>> that is what I'm measuring, with the input data I could collect. This
>>>> is strange! I have no other option than to improve my computations
>>>> further.
>>>
>>> For the 9th flight over Mars, I collected better images from the
>>> surface. To understand what I'm computing, here is another work
>>> of mine...
>>>
>>> (1) <https://twitter.com/Astro_Aure/status/1414699163569246210>
>>>
>>> You can see a cursor pointing at Ingenuity's localization, moving on
>>> the map, following the helicopter's motion. What I would like to do
>>> is to compute this in space, with "translate <Tx, Ty, Tz>" in 3
>>> dimensions.
>>>
>>> (2) <https://www.youtube.com/watch?v=MT5SqhJ0Jio>
>>>
>>> I could also "rotate <Rx, Ry, Rz>" and "shear <Sx, Sy, 0>" but it is
>>> not working for the moment. And I rendered this with Persistence Of
>>> Vision. The algorithm I'm using is probably close to what is used on
>>> Mars, because it is constructed to avoid the drift from the embedded
>>> IMU. It's not viewed from the top (like (1)) but from the surface of
>>> Mars.
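[The "translate", "rotate" and "shear" placements in the quoted text compose naturally as 4x4 homogeneous matrices. A small Python sketch of that composition; the pose values are placeholders, and reading "shear <Sx, Sy, 0>" as x and y shearing proportionally to z is my assumption, not necessarily the author's convention:]

```python
import numpy as np

def translate(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

def rotate_z(deg):
    a = np.radians(deg)
    R = np.eye(4)
    R[:2, :2] = [[np.cos(a), -np.sin(a)],
                 [np.sin(a),  np.cos(a)]]
    return R

def shear(sx, sy):
    # "shear <Sx, Sy, 0>" read as: x and y pick up sx*z and sy*z.
    S = np.eye(4)
    S[0, 2] = sx
    S[1, 2] = sy
    return S

# One video frame's pose (made-up values): shear, then rotate, then translate.
M = translate(1.2, 0.8, 5.0) @ rotate_z(30.0) @ shear(0.05, 0.02)

p = M @ np.array([0.0, 0.0, 0.0, 1.0])   # helicopter model origin
print(p[:3])                              # lands at the translation <Tx, Ty, Tz>
```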
>>
>> For the 10th flight over Mars, Ingenuity followed an elaborate
>> trajectory...
>>
>>
>> NASA planned to take color pictures of remarkable points on ground.
>
> I've been occupied with the 11th flight over planet Mars yesterday,
> August 14th, which came after the helicopter's previous flight on
> August 4th...
>
>
> Separate RAW images were released by NASA at this location:
>
> <https://mars.nasa.gov/mars2020/multimedia/raw-images/?af=HELI_NAV,HELI_RTE#raw-images>
>
>
> <https://twitter.com/RevesdEspace/status/1426147951693402114>
>
> August 13th. I worked on this video. There is a delay between the
> day of the flight and the final result here. But this video comes
> from a very distant location in space!
<https://twitter.com/Astro_Aure/status/1455284564499341319>
This is interesting because the cadence is higher, at 7 images/s, so the
reconstructed 3D localization of the helicopter is now more precise...
<https://www.youtube.com/watch?v=OkclJd3Fmv0>
Ingenuity flew Oct. 24th and the mission is not yet fully accomplished!
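[The drift-avoidance idea mentioned earlier in the thread, where absolute fixes recovered from imagery correct an integrating IMU, can be illustrated with a one-dimensional complementary filter. This is a generic sketch with invented numbers, not Ingenuity's actual navigation algorithm:]

```python
import random

def fuse(imu_velocities, vision_fixes, dt=0.1, alpha=0.9):
    """Complementary filter: integrate IMU velocity (dead reckoning, which
    drifts), while pulling the estimate toward the noisy-but-drift-free
    position fixes recovered from imagery."""
    x = 0.0
    estimates = []
    for v, fix in zip(imu_velocities, vision_fixes):
        x += v * dt                        # IMU integration step
        x = alpha * x + (1 - alpha) * fix  # blend in the absolute fix
        estimates.append(x)
    return estimates

# Invented data: true velocity 1 m/s, an IMU with a +0.2 m/s bias, and
# vision fixes with 5 cm noise.
random.seed(0)
true_pos = [(i + 1) * 0.1 for i in range(100)]
imu_v    = [1.2] * 100                       # biased velocity readings
vision   = [p + random.gauss(0, 0.05) for p in true_pos]

est = fuse(imu_v, vision)
# Pure IMU integration ends at 12 m for a true 10 m; the fused estimate
# stays close to the truth because the fixes cancel the accumulating bias.
print(f"pure IMU: {1.2 * 0.1 * 100:.1f} m,  fused: {est[-1]:.2f} m,  "
      f"true: {true_pos[-1]:.1f} m")
```

A higher fix rate (like the 7 images/s mentioned above) lets the filter correct the drift more often, which is why the denser image cadence improves the reconstruction.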
Best regards,
--
<http://eureka.atari.org/>
On 04/07/2021 at 14:42, BayashiPascal wrote:
> Thomas de Groot <tho### [at] degrootorg> wrote:
>> On 4-7-2021 at 11:09, Mike Horvath wrote:
>>> On 7/4/2021 2:20 AM, Thomas de Groot wrote:
>>>> 2) I have seen on a couple of occasions (on TV) archaeologists take a
>>>> lot of photographs of an object, from all kinds of angles, and later
>>>> combine those into a 3d model (with software of course). I saw that
>>>> geology students recently used a drone to photograph the walls of a
>>>> quarry in the same manner, and assembled them into a 3d model of the
>>>> quarry. Fascinating stuff, and relatively cheap to implement,
>>>> especially for students, I understood.
>>>>
>>>
>>> I wonder *how* cheap it really is. Probably *not* cheap, in terms of the
>>> work and expertise involved.
>>>
>> I don't really know. What I understood, in particular from the Chinese
>> example, was that the hardware came 'cheap': only a perfectly common
>> digital camera was needed, no sophisticated laser-controlled equipment,
>> and a simple stepladder to get around the statues. The same applied to
>> the students with their drone. Concerning the software, I have no idea.
>> The results looked good, however, and I doubt that they had any
>> particularly high expertise in the matter.
>>
>> --
>> Thomas
>
>
> You can indeed produce good results with a standard camera. The software
> itself, if not freeware, is quite expensive, but if they were students it was
> probably provided by their university, and that's surely not much compared to
> a university budget. As for expertise, if they were students, that very
> probably means they had teachers with a high level of expertise to guide them.
>
I agree. I just came across the publication(s) I mentioned earlier - in
Dutch, unfortunately. It is indeed the software which is (relatively)
expensive. It makes use of the "Structure from Motion" photogrammetry
technique. See examples in:
https://sketchfab.com/reindersma
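[At the core of Structure from Motion is triangulation: once the camera poses are estimated from the photo set, each feature seen in two or more images is intersected back into a 3D point. A minimal numpy sketch of linear (DLT) two-view triangulation, with made-up camera matrices:]

```python
import numpy as np

# Two hypothetical camera projection matrices P = K [R | t] (values made up):
# identical intrinsics, second camera shifted 1 m along x.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

def project(P, X):
    """Project a 3D point into pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation from two pixel observations:
    solve A X = 0 for the homogeneous 3D point via SVD."""
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

X_true = np.array([0.3, -0.2, 4.0])
uv1, uv2 = project(P1, X_true), project(P2, X_true)
X_hat = triangulate(P1, P2, uv1, uv2)
print(X_hat)        # recovers the 3D point (noise-free case)
```

Real SfM packages repeat this over thousands of matched features and refine everything with bundle adjustment, but the geometric step is the same.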
--
Thomas