  Re: Optical Pendulum  
From: Francois LE COAT
Date: 17 Oct 2022 11:15:30
Message: <634d7192$1@news.povray.org>
Hi,

Francois LE COAT writes:
>> Do you know something about the experiment of the "Optical Pendulum"?
>>
>> A camera is suspended from a cable, and an image is shot at the rest
>> position. Then you push the pendulum, so that the camera oscillates,
>> and new images are acquired while the pendulum moves. The goal is to
>> evaluate the eight parameters that describe the position of the camera,
>> from the rest position to the current one. Because the pendulum
>> oscillates, we obtain pseudo-sinusoidal curves.
>>
>> The eight parameters describe the perspective transform that maps one
>> image to the others. That means translations <Tx,Ty,Tz>,
>> rotations <Rx,Ry,Rz> and two perspective parameters <Sx,Sy>. That is
>> what you can see in the video below: each image, and the corresponding
>> perspective transform parameters, compared to the rest position.
>>
>> The goal is to measure a global movement, when it is observed by the
>> camera. There are devices that determine the position, such as the GPS
>> (Global Positioning System). We can evaluate rotations with a gyrometer,
>> the accelerations with an accelerometer, the speed with an odometer.
>> The goal is to measure all this by the image, with a camera. Why?
>>
>> For example, when we send robots to the planet Mars (Perseverance and
>> Ingenuity recently), we want to pilot them with the means at our
>> disposal... On planet Earth there is a positioning system, the GPS, which
>> works with a network of satellites. But on Mars it does not exist. To
>> navigate on Mars, we find our way with a camera. To do this, you have
>> to measure the movement of the camera. This is the goal of our
>> experiment. Measuring the movement of the camera... The robots that
>> move on Mars have navigation cameras. These are their eyes. It's as
>> efficient as a GPS.
>>
>> Here is the video demonstration, with the optical pendulum experiment:
>>
>> We can see the image taken at the pendulum's rest position. Then each
>> of the images, as it oscillates. We see the perspective transformation
>> between each image and the rest image, in the image plane, i.e. in two
>> dimensions.
>>
>> Then, using the parameters obtained in 2D from the transformation, a
>> virtual camera moves in 3D, using the Persistence of Vision software.
>> It is an illustration of the use that we can make in 3D of the
>> parameters: in translation <Tx,Ty,Tz>, in rotation <Rx,Ry,Rz> and
>> in perspective <Sx,Sy>. It is a question of determining, from the
>> images, the movement of the camera in space. The movement in space
>> between two images is completely described by eight parameters.
>> POV-Ray is very well suited to represent the trajectory in 3D, because
>> it is free image-synthesis software. Of course, all these computations
>> are not yet done at the rate of video. It will probably be necessary
>> to design hardware acceleration to obtain a smoother video...
> 
> I made a new video which is a little smoother, by separating the
> acquisitions from the computation of the parameters. It may help to
> understand:
>
> Thanks to Bald Eagle for the help with the POV-Ray perspective transform!

Here is the perspective transform that we are speaking about...

     <https://www.youtube.com/watch?v=mnei7j-KRu8>

There are three rotations, three translations, and two perspective
parameters (skew) that are observed when the image is projected.
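
As a minimal sketch, assuming the eight parameters are packed into a
standard 3x3 planar homography with the two perspective terms in the
last row (the exact parameterization used in the video may differ):

        ( h11  h12  h13 )         w  = Sx*x + Sy*y + 1
    H = ( h21  h22  h23 )         x' = (h11*x + h12*y + h13) / w
        ( Sx   Sy   1   )         y' = (h21*x + h22*y + h23) / w

where the upper 2x3 block is built from the rotations <Rx,Ry,Rz> and
the translations <Tx,Ty,Tz>.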

Here you can see the transformation rendered with POV-Ray...

	<https://www.youtube.com/watch?v=4vJSN6V0_yI>
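
For reference, here is a minimal POV-Ray sketch of how recovered
translation and rotation parameters could drive the virtual camera.
The values, axis conventions and scene are placeholders of mine, not
the ones used in the video, and the skew terms <Sx,Sy> are not applied:

    // Hypothetical parameters recovered for one frame (placeholder values)
    #declare Tx = 0.10;  #declare Ty = -0.05;  #declare Tz = 0.30;
    #declare Rx = 2.0;   #declare Ry = -1.5;   #declare Rz = 0.5;  // degrees

    camera {
      perspective
      location <0, 0, -5>        // rest position of the virtual camera
      look_at  <0, 0,  0>
      rotate    <Rx, Ry, Rz>     // recovered rotations (about the world axes)
      translate <Tx, Ty, Tz>     // recovered translations
    }

    // A simple scene so the camera motion is visible between frames
    plane { y, -1 pigment { checker rgb 1, rgb 0.2 } }
    light_source { <10, 20, -10> rgb 1 }

Rendering one frame per set of parameters gives the kind of animation
shown in the video above.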

The motion in space of the camera is determined from the images
of the optical pendulum, thanks to the perspective kinematic model.

Best regards,

-- 

<http://eureka.atari.org/>

