So I just recently stumbled on a video that has got me very interested in the idea of audiovisual granular synthesis.
The concept is that you connect the parameters of a granular synthesiser (size, position, pitch, etc.) to parameters of a video that would supposedly "match".
For example, it would make sense for the position parameter of a granular synth to be connected to a video's rewind/fast-forward control, since that is pretty much what you are doing to the audio when changing the position. Likewise, size could be matched with something like blur (since the smaller something is, the blurrier it gets).
So what I would like to know is whether anyone knows a way I could possibly connect my MI Clouds module to a video and have the parameters of said video adjust as I change specific controls on the module.
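To make the mapping idea concrete, here is a minimal sketch of the two mappings described above, assuming you can get the knob/CV values into a computer as 0-127 values (e.g. via a CV-to-MIDI converter; the function names and the 0-127 range are my assumptions, not anything Clouds-specific):

```python
def cc_to_frame(cc_value, total_frames):
    """Map a 0-127 'position' control to a video frame index
    (i.e. scrubbing/rewind/fast-forward, mirroring the buffer position)."""
    return round(cc_value / 127 * (total_frames - 1))

def cc_to_blur_radius(cc_value, max_radius=25):
    """Map a 0-127 'size' control to a blur radius.
    Inverted on purpose: a smaller grain size (low value) gives MORE blur."""
    return round((1 - cc_value / 127) * max_radius)
```

In practice you would feed numbers like these into a video environment (TouchDesigner, Processing, Pure Data with GEM, etc.) that actually scrubs the clip and applies the blur; the functions only show the parameter mapping itself.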