  Re: Music Video Project  
From: Dave Blandston
Date: 11 Apr 2021 22:20:00
Message: <web.6073ae0b432b07e179416a1f9334df62@news.povray.org>
"m@b" <sai### [at] googlemailcom> wrote:
> I don't find it takes too much time; admittedly, I chose shorter and much
> slower songs than you are planning.
>
> If there is a constant beat throughout the song, you can set up a variable
> directly from the clock which can be used to animate all sorts of things.
>
> See:
> <https://www.tiktok.com/@matthew.bradwell/video/6944909548163484930>
> and
> <https://www.youtube.com/watch?v=djJGXZB6yRE>
>
> Best,
> m@
>
> p.s. I just watched the song - there is a lot going on there!
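
If I've understood the clock trick correctly, with the INI file mapping clock
to elapsed seconds it would be something like this -- the 120 BPM figure and
the variable names are just placeholders:

   // Assumes the INI file maps clock to elapsed seconds of the song
   #declare BPM = 120;                   // placeholder tempo
   #declare Beat = clock * BPM / 60;     // beats elapsed so far
   #declare BeatPhase = mod (Beat, 1);   // 0..1 within the current beat
   #declare Pulse = 1 - BeatPhase;       // simple sawtooth pulse per beat

   // Example use: a sphere that swells on every beat
   sphere { 0, 1 + 0.2 * Pulse pigment { rgb 1 } }

Anything periodic (brightness, scale, rotation) could then be keyed off Pulse
or BeatPhase.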

I'm curious about a couple of things, since you have already solved some
problems that I'm currently working on. Did you create one giant POV-Ray file
for the entire animation, or did you split it up into multiple files? If you
split it up, did you need to devise a method of passing information from one
file to the next? If so, I'm very interested to hear the basic details of how
you did it, especially since you had to plan much farther ahead in order to
position the camera correctly. I will definitely have to split my project into
multiple files. So far I've written a macro that automatically keeps track of
the number of frames/time elapsed, but ideally it would also be nice to pass
information about changing textures from one file to the next. I don't think
this is possible, but maybe there's a way...
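
One idea I still need to test is having each part #write a small include file
of #declare statements that the next part #includes -- roughly like this,
where PreviousFrames and LogoHue are just made-up examples:

   // Made-up example value that needs to carry over to the next part
   #declare LogoHue = 0.35;

   // At the end of Part 1, written only on its final frame:
   #if (frame_number = final_frame)
      #fopen StateFile "Part1State.inc" write
      #write (StateFile, "#declare PreviousFrames = ", frame_number, ";\n")
      #write (StateFile, "#declare LogoHue = ", LogoHue, ";\n")
      #fclose StateFile
   #end

   // At the top of Part 2:
   #include "Part1State.inc"

That would at least cover numbers; I suspect whole textures would have to be
rebuilt from the handful of parameters that define them rather than passed
along directly.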

Also, my original belief that it would be possible to visually examine the
waveform in a music editor and identify beats and other musical events has
turned out to be completely wrong. Drum beats can be identified easily enough,
but bass notes are much less obvious, especially when there is singing and/or
other guitar work going on at the same time.

It has been brought to my attention that the standard mathematical method of
detecting beats/changes is the Fourier transform. So my current options are to
try to locate and adapt some Fourier transform code to POV-Ray SDL and use it
to generate a list of acoustic event times (probably beyond my ability), or to
find an alternative. One option that I'll explore next is the "spectral" view.
In the past I used Magix Audio Cleaning Lab to remove tape hiss and unwanted
sounds from commercial recordings; one of its features, "spectral cleaning,"
displays the music in a unique visual manner that may help. Re-installing that
program and checking that option out will be my next project.
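
As far as I can tell, the core of a Fourier transform in SDL would be
something like this single-bin DFT magnitude -- the Samples array is
hypothetical and would somehow have to be generated from the audio file, for
example as #declare statements in an exported include file:

   // Brute-force magnitude of one frequency bin of an N-sample window
   #macro DFT_Magnitude (Samples, N, Bin)
      #local Re = 0;
      #local Im = 0;
      #local K = 0;
      #while (K < N)
         #local Theta = 2 * pi * Bin * K / N;
         #local Re = Re + Samples[K] * cos (Theta);
         #local Im = Im - Samples[K] * sin (Theta);
         #local K = K + 1;
      #end
      sqrt (Re * Re + Im * Im)
   #end

It's slow (every bin is another pass over the samples), so it would only be
practical for checking a few frequency bands, and getting the samples into SDL
in the first place is still the hard part.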

I made a five-second test animation by guessing the approximate times of the
first few bass notes, and it's definitely not good enough. I posted it anyway
as an example of what not to do:

https://www.youtube.com/watch?v=Rv8kGq9PUkM
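
The SDL side of this approach is simple enough: a hand-entered array of note
times and a macro that returns a decaying pulse after each one, roughly like
this (the times and decay value shown are only examples) -- the real problem
is getting accurate times to put in the array:

   // Hand-entered bass-note times in seconds (example values only)
   #declare NoteTimes = array[4] {0.00, 0.52, 1.04, 1.58}
   #declare Decay = 4; // how quickly each hit fades

   // Returns a 0..1 pulse that jumps to 1 at each note time and then decays
   #macro NotePulse (T)
      #local Result = 0;
      #local I = 0;
      #while (I < dimension_size (NoteTimes, 1))
         #if (T >= NoteTimes[I])
            #local Result = max (Result, exp (-Decay * (T - NoteTimes[I])));
         #end
         #local I = I + 1;
      #end
      Result
   #end

   // Example: drive a light's brightness from the pulse (clock in seconds)
   light_source { <0, 10, -10> color rgb <1, 1, 1> * (0.5 + 0.5 * NotePulse (clock)) }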

The clip also includes a test of using video frames in one of the characters
of the logo. I picked a random five-second segment from a concert video and it
didn't work out at all, at least not with the big "A." I think carefully
selected video segments have potential in the smaller characters, though.
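
In case anyone wants to try something similar, the general idea is to extract
the video to numbered image files and build the image_map file name from
frame_number -- the path, numbering, and frame offset below are placeholders:

   // Placeholder path and numbering; 100 is an arbitrary offset into the clip
   #declare VideoPigment = pigment {
      image_map {
         png concat ("VideoFrames/Frame", str (frame_number + 100, -4, 0), ".png")
         once
      }
   }

   // Stand-in object; the real target would be the logo character
   box {
      <0, 0, 0>, <1, 1, 0.1>
      texture { pigment { VideoPigment } finish { ambient 1 } }
   }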

Cheers,
Dave Blandston

