POV-Ray : Newsgroups : povray.advanced-users : Optical Pendulum
 Optical Pendulum (Message 1 to 10 of 11)
 From: Francois LE COAT Subject: Optical Pendulum Date: 12 Apr 2022 11:45:45 Message: <62559ea9\$1@news.povray.org>
```
Hi,

Do you know something about the experiment of the "Optical Pendulum"?

A camera is suspended from a cable, and an image is shot at the rest
position. Then you push the pendulum so that the camera oscillates,
and new images are acquired while the pendulum moves. The goal is to
evaluate the eight parameters that describe the position of the camera,
from the rest position to the current one. Because the pendulum
oscillates, we obtain pseudo-sinusoidal curves.

The eight parameters describe the perspective transform that maps
one image onto the others: translations <Tx,Ty,Tz>, rotations
<Rx,Ry,Rz>, and two perspective parameters <Sx,Sy>. That's what we
can see in the video below: each image, and the corresponding
perspective transform parameters, compared to the rest position.

The goal is to measure a global movement as it is observed by the
camera. There are devices that determine position, such as GPS
(Global Positioning System). We can evaluate rotations with a
gyrometer, accelerations with an accelerometer, and speed with an
odometer. The goal is to measure all of this from the image, with a
camera. Why?

For example, when we send robots to the planet Mars (Perseverance and
Ingenuity recently), we want to pilot them with the means at our
disposal... On planet Earth there is a positioning system, GPS, which
works with a network of satellites. But on Mars it does not exist. To
navigate on Mars, we find our way with a camera. To do this, you have
to measure the movement of the camera. This is the goal of our
experiment: measuring the movement of the camera. The robots that
move on Mars have navigation cameras. These are their eyes. It's as
efficient as a GPS.

Here is the video demonstration, with the optical pendulum experiment:

We can see the image taken at the pendulum's rest, then each of the
images while it oscillates. We see the perspective transformation
between each image and the rest image, in the image plane, i.e. in
two dimensions.

Then, using the parameters obtained in 2D from the transformation, a
virtual camera moves in 3D, using the Persistence Of Vision software.
It is an illustration of the use we can make in 3D of the parameters:
translation <Tx,Ty,Tz>, rotation <Rx,Ry,Rz>, and perspective <Sx,Sy>.
The point is to determine, from the images, the movement of the
camera in space. The movement in space between two images is
completely described by eight parameters. POV-Ray is very well suited
to represent the trajectory in 3D, because it is free image synthesis
software. Of course, all these computations are not yet done at video
rate. It will probably be necessary to design hardware acceleration
to obtain a smoother video...

Best regards,

--

<http://eureka.atari.org/>
```
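As an aside, the eight-parameter transform described above is a planar perspective (projective) transform, which has exactly eight degrees of freedom. A minimal Python sketch of how such a transform acts on a pixel; the coefficient names h1..h8 are generic, not the notation used in the experiment:

```python
def warp_point(x, y, h):
    """Apply a planar perspective transform with 8 parameters
    h = (h1, ..., h8), mapping pixel (x, y) of one image into the other:

        x' = (h1*x + h2*y + h3) / (h7*x + h8*y + 1)
        y' = (h4*x + h5*y + h6) / (h7*x + h8*y + 1)

    The identity transform is h = (1, 0, 0, 0, 1, 0, 0, 0).
    """
    w = h[6] * x + h[7] * y + 1.0
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

identity = (1, 0, 0, 0, 1, 0, 0, 0)
print(warp_point(320, 240, identity))   # -> (320.0, 240.0)
```

When h7 and h8 are zero the transform is affine; the two extra denominators are what the perspective parameters contribute.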
 From: Mr Subject: Re: Optical Pendulum Date: 13 Apr 2022 07:30:00 Message:
```
Francois LE COAT <lec### [at] atariorg> wrote:
> Hi,
>
> Do you know something about the experiment of the "Optical Pendulum"?
>
> [...]
>
> The movement in space between two images is completely described by
> eight parameters. POV-Ray is very well suited to represent the
> trajectory in 3D, because it is a free image synthesis software.

This reads like what 3D tracking software, such as the tracker shipped
with Blender, would do?
```
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 13 Apr 2022 11:50:28 Message: <6256f144\$1@news.povray.org>
```
Hi,

Mr writes:
> Francois LE COAT wrote:
>> Do you know something about the experiment of the "Optical Pendulum"?
>>
>> [...]
>
> This reads as what a 3D tracking software such as the one shipped in
> Blender would do ?

I haven't seen such a tool yet. There is software to perform image
stitching, like <http://hugin.sf.net/>, which registers images, but
it does not give the perspective transformation parameters. The goal
is to compute those perspective parameters, in order to reconstruct
the motion of the camera:

- <Tx,Ty,Tz> translations in pixels
- <Rx,Ry,Rz> rotations in degrees
- <Sx,Sy> perspective in degrees

This can be directly used in Persistence Of Vision for animation.

Is that what you meant?

Thanks,

Best regards,

--

<http://eureka.atari.org/>
```
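One simple route for handing the eight measured parameters to POV-Ray, as described in the message above, is to dump them as #declare statements in an include file, one file per frame, which a scene can then #include. This is only a hypothetical helper sketch; the file name and variable names are invented, not the author's actual tool-chain:

```python
def write_pov_include(path, Tx, Ty, Tz, Rx, Ry, Rz, Sx, Sy):
    """Write the eight measured parameters as POV-Ray #declare lines,
    so a scene file can #include them to place the virtual camera."""
    names = ["Tx", "Ty", "Tz", "Rx", "Ry", "Rz", "Sx", "Sy"]
    values = [Tx, Ty, Tz, Rx, Ry, Rz, Sx, Sy]
    with open(path, "w") as f:
        for name, value in zip(names, values):
            f.write("#declare %s = %.6f;\n" % (name, value))

# Hypothetical frame with made-up parameter values:
write_pov_include("frame0001.inc", 12.5, -3.0, 0.8, 1.2, -0.4, 0.1, 0.05, -0.02)
```

A POV-Ray scene would then read them with `#include "frame0001.inc"` and use the identifiers in `translate` and `rotate` statements, re-rendering once per acquired image.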
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 26 Apr 2022 09:30:02 Message: <6267f3da\$1@news.povray.org>
```
Hi,

Francois LE COAT writes:
> Do you know something about the experiment of the "Optical Pendulum"?
>
> [...]
>
> It will probably be necessary to design a hardware acceleration,
> to obtain a smoother video...

I made a new video which is a little smoother, by separating the
acquisitions from the computation of the parameters. It may help to
understand:

Thanks to Bald Eagle for the help with the POV-Ray perspective transform!

Best regards,

--

<http://eureka.atari.org/>
```
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 17 Oct 2022 11:15:30 Message: <634d7192\$1@news.povray.org>
```
Hi,

Francois LE COAT writes:
>> Do you know something about the experiment of the "Optical Pendulum"?
>>
>> [...]
>
> I realized a new video which is a little smoother, dissociating
> acquisitions from the parameters' computation. It may help to
> understand:
>
> Thanks to Bald Eagle with the help on POV-Ray perspective transform!

Here is the perspective transform that we are speaking about...

There are three rotations, three translations, and two perspective
parameters (skew angles) that are observed when the image is
projected.

Here you can see the transformation rendered with POV-Ray...

The motion of the camera in space is determined from the images of
the optical pendulum, thanks to the perspective kinematic model.

Best regards,

--

<http://eureka.atari.org/>
```
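The rotation part of that model can be illustrated by composing the three rotations into a single 3x3 matrix. This is a generic sketch: the composition order Rz·Ry·Rx and the column-vector convention are assumptions, since the experiment's exact convention is not stated here:

```python
import math

def rotation_matrix(rx_deg, ry_deg, rz_deg):
    """Compose rotations about x, y and z (in degrees) into one 3x3
    matrix, applied to column vectors in the order Rx, then Ry, then Rz
    (an assumed convention; others are equally common)."""
    ax, ay, az = (math.radians(a) for a in (rx_deg, ry_deg, rz_deg))
    cx, sx = math.cos(ax), math.sin(ax)
    cy, sy = math.cos(ay), math.sin(ay)
    cz, sz = math.cos(az), math.sin(az)
    Rx = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    Ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    Rz = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]

    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(Rz, matmul(Ry, Rx))

R = rotation_matrix(0.0, 90.0, 0.0)   # a 90-degree yaw: x maps onto -z
```

With the eight parameters, this rotation matrix plus the translation vector and the two skew angles fully reconstruct the camera pose between two images.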
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 1 Feb 2023 11:00:07 Message: <63da8c87\$1@news.povray.org>
```
Hi,

A WEB page was made to illustrate the "optical pendulum" experiment:

<https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/optical_pendulum.html>

We determine the translation, rotation and perspective
transformations. On this WEB page you can see the pendulum swinging
live... It is not really fast for the moment, but we're trying to
accelerate it :-)

Francois LE COAT writes:
>>> Do you know something about the experiment of the "Optical Pendulum"?
>>>
>>> [...]
>>
>> I realized a new video which is a little smoother, dissociating
>> acquisitions from the parameters' computation. It may help to
>> understand:
>>
>> Thanks to Bald Eagle with the help on POV-Ray perspective transform!
>
> Here is the perspective transform that we are speaking about...
>
> There are three rotations, three translations, and two perspective
> parameters that are observed when the image is projected (Skew).
>
> Here you can see the transformation rendered with POV-Ray...
>
> The motion in space of the camera is determined from the images
> of the optical pendulum, thanks to the perspective cinematic model.

Best regards,

--

<http://eureka.atari.org/>
```
 From: ingo Subject: Re: Optical Pendulum Date: 1 Feb 2023 12:50:00 Message:
```
Francois LE COAT <lec### [at] atariorg> wrote:
> Hi,
>
> A WEB page was made to illustrate the "optical pendulum" experiment:
>
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/optical_pendulum.html>
>

Nice!

ingo
```
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 1 Feb 2023 13:25:33 Message: <63daae9d\$1@news.povray.org>
```
Hi,

ingo writes:
> Francois LE COAT wrote:
>> A WEB page was made to illustrate the "optical pendulum" experiment:
>>
>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/optical_pendulum.html>
>
> Nice!

Thanks. I don't know anyone else working on such an experiment. It
only requires a computer and a camera. It is so simple, yet
interesting! Many people are trying to match images in the best way,
but there is no real experiment like this "optical pendulum", as far
as I know.

Regards,

--

<http://eureka.atari.org/>
```
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 12 Feb 2023 11:10:19 Message: <63e90f6b\$1@news.povray.org>
```
Hi,

Do you know where the original Foucault pendulum is located?...

On the ARTE Concert YouTube channel - 2021 October 6th

This is the first pendulum, a very popular experiment, in Paris :-)

ingo writes:
>> Francois LE COAT wrote:
>>> A WEB page was made to illustrate the "optical pendulum" experiment:
>>>
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/optical_pendulum.html>
>>>
>>
>> Nice!
>
> Thanks. I don't know anyone else working on such an experiment. It only
> requires a computer and a camera. This is so simple, but interesting!
> Many people are trying to match images the best way, but there's no
> real experiment like this "optical pendulum", so far as I know about it?

Regards,

--

<http://eureka.atari.org/>
```
 From: Francois LE COAT Subject: Re: Optical Pendulum Date: 21 Mar 2023 10:51:07 Message: <6419c45b\$1@news.povray.org>
```
Hi,

> ingo writes:
>> Francois LE COAT wrote:
>>> A WEB page was made to illustrate the "optical pendulum" experiment:
>>>
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/optical_pendulum.html>
>>>
>>
>> Nice!
>
> Thanks. I don't know anyone else working on such an experiment. It only
> requires a computer and a camera. This is so simple, but interesting!
> Many people are trying to match images the best way, but there's no
> real experiment like this "optical pendulum", so far as I know about it?

The planar perspective transformation used for modelling the camera
motion was added to the "optical pendulum" WEB page. Some of the
transforms are present in Persistence of Vision, like the 3
translations and 3 rotations, but the "skew" or "shear" angles
<Sx,Sy> are not directly available. Thanks to the POV-Ray advanced
users for having helped with that.

Regards,

--

<http://eureka.atari.org/>
```
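Although POV-Ray has no dedicated shear keyword, its `matrix <...>` statement takes twelve floats (a 4x3 transform: three transformed basis rows plus a translation row) and can express one. Below is a Python sketch that builds those twelve values from the two skew angles, under one possible interpretation of <Sx,Sy>; the experiment's exact convention may differ:

```python
import math

def shear_matrix_pov(sx_deg, sy_deg):
    """Return the 12 floats of a POV-Ray 'matrix <...>' statement for a
    shear in the xy plane driven by two skew angles (degrees).
    This is one simple reading of <Sx,Sy>, not the experiment's
    verified convention."""
    tx = math.tan(math.radians(sx_deg))
    ty = math.tan(math.radians(sy_deg))
    return [1.0,  ty, 0.0,    # image of the x axis
             tx, 1.0, 0.0,    # image of the y axis
            0.0, 0.0, 1.0,    # z axis unchanged
            0.0, 0.0, 0.0]    # no translation

vals = shear_matrix_pov(5.0, 0.0)
print("matrix <" + ", ".join("%.6f" % v for v in vals) + ">")
```

The printed line can be pasted into an object or camera transform block in a scene file; with both angles at zero it reduces to the identity transform.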