Slo-mo where frames play at a defined tempo?
nangu
I've been having a lot of fun with recording things on a Roland P-10 and then playing them back really slowly using the P-10's speed knob. Adjusting the speed knob and trying to match the frame rate to the tempo of music seems to work a lot better than it should, but I'm wondering if there's a more precise way to get this effect.

Is there any software that would let me specify a tempo, and then play back the video at a rate where it would advance one frame every 16th note or whatever?
nerdware
In general, it should be possible once you know the number of frames per beat, e.g. using software like ffmpeg, which has a filter for this sort of thing. Ideally, you'd want a video editor that can set the frame rate from a tap tempo.
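To put a number on "frames per beat" (just a back-of-the-envelope shell sketch with assumed values, nothing P-10 specific):

BPM=120
FPS=30
echo "scale=4; $FPS*60/$BPM" | bc      # 15.0000 frames per quarter-note beat
echo "scale=4; $FPS*60/($BPM*4)" | bc  # 3.7500 frames per 16th note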

In hardware, like the P-10? You can do it with still images, using a "strobe" slideshow setting, but that's limited to 7 images. I can't find anything else in the manual that might do this. Nothing about setting a precise frame rate, even in the MIDI section. Definitely nothing about tempo.

It's a shame, because it's easy to imagine how it could have been done. There's an obvious place for the feature in the Movie Pad Parameters section: a Beat setting that advances the movie by one frame on each beat, with the beat defined as a division of the MIDI clock. Of course, that would have meant more code in the firmware, and maybe that wasn't possible (limited memory space).

So I don't think this can be done, or at least not done easily, on the P-10 itself. Trial and error may get you something that works, but I really don't believe the P-10 was designed to do this.
nangu
I was asking about computer software, not hoping for a ninja trick to do it all in the P-10.

Best case, I hope to process a P-10 file on the computer and have the output format be similar enough that the P-10 will accept it without needing Roland's super-crusty P-10 loader software. If I can also embed an unrelated audio file into the video before writing it to the P-10's SD card, that would be extra spectacular.

Something that runs on a Mac with Yosemite would be ideal, but I can borrow a Windows machine if needed.

I'll try the ffmpeg filter that you mentioned. Thanks!!
nerdware
Well, I'm not exactly recommending ffmpeg. It's a command-line Swiss Army knife packed with audio and video filters, and it does have a filter for doing this, but being a command-line Swiss Army knife, you have to do a lot of work to make it do exactly what you want. The wonderful thing about it is that once you've worked out the correct incantations, you can save them in a file and invoke them again whenever you need that particular effect. I use it a lot, often without even thinking of it as ffmpeg. It seems a lot of us ffmpeg users do that: we build friendly front ends to hide the complex mechanics.

Anyway, I think the specific ffmpeg filter that should do what you want is called framestep. The framerate filter may also be useful, though I'm not sure. I've used them both, but in complex filtergraphs from 2015, so I can't be sure now exactly what I was doing; I just have a few files left around that used them. Both uses were for subtle effects, though I also recall using framestep for more dramatic, strobe-like effects.
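Off the top of my head, a bare-bones framestep example would look something like this (placeholder filenames, and I haven't tested this exact line): it keeps one frame in every four and then pads the output back to 30 fps, so each kept frame is held for roughly four output frames.

ffmpeg -i input.mp4 -vf "framestep=4" -r 30 strobe.mp4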

The rest is just translating tempo into frame counts. A nice simple example would be 120 BPM and 30 FPS, which gives 3.75 frames per 16th note. Unfortunately, ffmpeg knows nothing about tempo, never mind tap tempo. Being a programmer, I'd probably solve this by writing a tool that uses ffmpeg to convert the source video to individual JPEG files, syncs the image changes to the beats, and then uses ffmpeg to convert the image files back into a movie, i.e. a crude bit of code that spews out symbolic links to the image files. I don't recommend doing it like this unless you're a programmer! It's ugly, error-prone, painful to use and fiddly to get right. It won't help you sync the beats or match the tempo; you'd still need to work all that out yourself.
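The dumb, beat-free skeleton of that approach would be something like this (placeholder filenames, and the 8 images per second is just the 16th-note rate at 120 BPM; the beat-syncing logic is the part you'd have to write yourself):

mkdir -p frames
# Dump every source frame to a numbered JPEG...
ffmpeg -i input.mp4 frames/frame_%05d.jpg
# ...then rebuild a movie that shows a new image 8 times a second,
# padded out to a 30 fps output.
ffmpeg -framerate 8 -i frames/frame_%05d.jpg -r 30 -pix_fmt yuv420p rebuilt.mp4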

So I'm only saying it's possible to do it that way. It's a very old-school technique, like stop-motion animation and tape editing. It would be much easier to use, say, a video editor that has this feature built in, preferably with tap tempo, but I can't recommend anything specific as I've never done that myself. I don't even use a video editor! (Yet.) I do all my post-processing with ffmpeg, so that's a tool I can talk about. YMMV.
nerdware
To give you some idea of how I use ffmpeg, consider the following example.



#!/bin/bash
# Renders a long, slowly rotating abstract movie from a single still image
# by blending rotated, flipped and recoloured copies of it.
# PFORMAT, VCODEC and VFORMAT (pixel format, video codec and frame size)
# are expected to be set in the calling environment.
IMAGE="sapphire.png"
DURATION=500
FADEOUT=`echo $DURATION-30 | bc`   # start the fade-out 30 s before the end
SPEED1=90                          # rotation periods, in seconds
SPEED2=`echo $SPEED1*5/4 | bc`
SPEED3=`echo $SPEED1*6/5 | bc`

MODE=glow
OUTPUT=joanna.avi
OPTS="-y -pix_fmt $PFORMAT $VCODEC -s $VFORMAT"

# Split the looped image into three copies rotating at different rates,
# flip/recolour them, blend them together, then crop, scale and fade in/out.
exec ffmpeg -loop 1 -i $IMAGE -t $DURATION $OPTS -filter_complex "
split=3 [j1][j2], rotate=PI/3+2*PI*t/$SPEED1, negate, hue=h=20 [j3];
[j1] rotate=PI/3+2*PI*t/$SPEED2, hflip, [j3] blend=all_mode=$MODE [r1];
[j2] rotate=PI/3+2*PI*t/-$SPEED3, vflip, hue=h=10:s=1 [r2];
[r1][r2] blend=all_mode=$MODE, crop=1/2*iw:1/2*ih, scale=$VFORMAT, fade=in:0:50, fade=t=out:st=$FADEOUT:d=27
" $OUTPUT
nerdware
Here's a fragment using framestep, from another project:

# Two candidate displacement expressions: EXPR1 is a sine ripple along the
# X axis, EXPR2 a circular ripple centred on the frame; EXPR picks one.
EXPR1="128+30*sin(2*PI*X/400+T)"
EXPR2="128+80*(sin(sqrt((X-W/2)*(X-W/2)+(Y-H/2)*(Y-H/2))/220*2*PI+T))"
EXPR="$EXPR2"

# framestep=2 keeps every second frame; two of the three split outputs are
# turned into displacement maps with geq, and displace warps the third
# through them. FFMPEG, INPUT, OPTS and OUTPUT come from the wrapper script.
exec nice $FFMPEG -i "$INPUT" -filter_complex "
framestep=2, split=3 [a] [b] [c];
[a] geq=lum=$EXPR:cb=$EXPR:cr=$EXPR [x];
[b] geq=lum=$EXPR:cb=$EXPR:cr=$EXPR [y];
[c] [x] [y] displace=wrap
" $OPTS -y "$OUTPUT"

As you can see in both of these examples, some of the work is being done in the code I'm wrapping around ffmpeg. That's where your tempo-matching and beat-syncing code would go, if you were to do it using ffmpeg. It may be possible to make that work, but I really don't recommend it!
nangu
Thanks!!

I'm probably not sophisticated enough to do anything useful with your ffmpeg info yet, but I'll come back to it if I can't find an easier way to get where I want to go. One of my friends may be able to help.
nerdware
ADDENDUM: While framestep can only skip whole numbers of frames, the individual-JPEG technique can work around this. We can also work around it within ffmpeg by working at a higher frame rate, like 120 fps, and then downscaling the frame rate if/when/where needed. Just in case you were wondering how the 3.75-frames-per-16th-note thing might work. lol Damn, I may be using ffmpeg too much; it's getting hard to remember how much I need to explain. Video is weird and complicated, and adding computers just adds more weirdness and complexity. D'oh!
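A quick sanity check on that, with the same assumed numbers as before:

# At 120 fps and 120 BPM, a 16th note is a whole number of frames:
echo "scale=2; 120*60/(120*4)" | bc   # 15.00 frames per 16th, versus 3.75 at 30 fps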
vonkhades
nerdware wrote:
To give you some idea of how I use ffmpeg, consider the following example. [...]

So I'm trying this command, but there are three environment variables missing: $PFORMAT, $VCODEC and $VFORMAT. I've tried different values but I'm unable to make it work... can you suggest any values for those?

Already changed the input and output filename variables.
nerdware
Sorry for not explaining the undefined variables; I hadn't expected anyone would want to use my code. I blame the curse of knowledge. Very frustrating. I only gave that file as an example. You can use it as it is if you like, but I recommend reading the manual to understand how it all works. That may take some time; it took me a long time to get to the point where I could write that file.

Anyway, $PFORMAT, $VCODEC and $VFORMAT are defined externally because I don't always use the same format. The default values I have set are PFORMAT="rgb24", VCODEC="-c:v qtrle" and VFORMAT="pal", which give a QuickTime Animation (qtrle) file at the PAL video size. You should set them to whatever values work for you, e.g. VFORMAT="ntsc". You'll also need to set IMAGE to a local filename, unless you can find the same file I used; I've had it for so many years now that I don't remember where it came from.
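So, concretely, something like this in the shell before launching the script should work (the script name is just whatever you saved my example as):

export PFORMAT="rgb24"
export VCODEC="-c:v qtrle"
export VFORMAT="pal"        # or "ntsc", etc.
bash myscript.sh            # i.e. whatever filename you saved the example under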

I should also mention that the shell language used is Bash, so unless you have this shell on your computer, the script will fail. I recommend translating it to something local, if possible. Seriously, I posted this as an example only. I'm not recommending that anyone use it! Standard disclaimers apply. ("It may wipe your hard drive and smoke your computer" etc.)
vonkhades
nerdware wrote:
Anyway, $PFORMAT, $VCODEC and $VFORMAT are defined externally because I don't always use the same format. [...]

I have sh, bash and Korn shell on my computer (Arch Linux), so it should work... anyway, I know bash quite well (I use it for system configuration, service deployments, etc.); I was just wondering what those magic environment variables were. Thanks a lot for the reply; I'll try to make something interesting out of your example for myself.
nangu
I found a video that shows how to make VDMX use incoming MIDI notes to advance to the next frame.

http://www.vidvox.net/forums/viewtopic.php?f=22&t=89222&p

MIDI control of 'previous frame' also works, but I haven't figured out how to make the movie restart from the beginning under MIDI control yet. I'm a total VDMX n00b; this is the first cool thing I've done with it. It's pretty rad to have direct control over the frames. I was only hoping for accurate tempo, but now I've got one track on a Polyend SEQ actively sequencing the frames, so I can change the frame rate whenever I want.

VDMX and AVF Batch Converter won't load the AVI files that my Roland P-10 makes. I had to convert to MP4 using HandBrake before the file would load into AVF Batch Converter, then I converted it again to Hap Q. My 47 MB file turned into 77 MB and then 241 MB, but it works perfectly.
nerdware

A small point worth noting: VDMX only runs on Macs.
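Also, for what it's worth, ffmpeg should be able to do that AVI-to-Hap Q conversion in one step, assuming your build includes the Hap encoder (it needs libsnappy). Something like this, with placeholder filenames:

ffmpeg -i p10-capture.avi -c:v hap -format hap_q p10-capture-hapq.mov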
vonkhades
Here are three vids using some ffmpeg filters... it's super cool!

Recursive effect of Hindi folk dance
Recursive effect of Nunchaku Kata
Rotating buttdick
nangu
That's awesome - thanks for sharing!!
nerdware
Yeah, very tasty. Thanks.