For shiggles I'm working on creating a lightweight plugin to turn automation into CV.
...Granted I don't actually have a DC coupled interface to test it with, currently.
I quickly wrote a plug-in that outputs automation as audio (this video was recorded while it was outputting 0.0 to 1.0, before I fixed it to output -1.0 to 1.0).
warning: probably loud
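In code terms, the fix I mentioned is just remapping the unipolar automation value to a bipolar sample. A minimal sketch (the function name is mine, not from any particular plugin API):

```python
def automation_to_sample(value: float) -> float:
    """Remap a 0.0..1.0 automation value to a -1.0..1.0 audio sample."""
    return 2.0 * value - 1.0

print(automation_to_sample(0.0))   # -1.0
print(automation_to_sample(0.5))   # 0.0
print(automation_to_sample(1.0))   # 1.0
```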
It acts a bit odd at audio rates, but that seems to be a combination of buffer size, how Ableton samples the automation lines, how Ableton loops, and the sample rate.
Anyways, my curiosity is this: how does one determine the relationship between digital level and the output DC voltage when using a DC-coupled interface? Is it dependent on the interface itself? If so, does calibrating an interface just come down to attenuating the output signal (and therefore changing the min/max amplitude)?
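To make the question concrete, here's what I imagine calibration would look like, assuming the full-scale voltage is a per-interface constant you measure yourself (the 5.0 V below is purely a placeholder, not a spec for any real interface):

```python
# Hypothetical: volts produced at digital +1.0, measured per interface
# (e.g. with a multimeter while outputting a full-scale DC signal).
FULL_SCALE_VOLTS = 5.0

def volts_to_sample(volts: float) -> float:
    """Convert a target DC voltage to the digital level that should produce it."""
    sample = volts / FULL_SCALE_VOLTS
    # Clamp to the valid digital range; voltages beyond full scale can't be reached.
    return max(-1.0, min(1.0, sample))

print(volts_to_sample(2.5))   # 0.5 with the placeholder constant
print(volts_to_sample(-5.0))  # -1.0
```

If the relationship really is linear per interface, this single measured constant would be the whole calibration.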
Thanks!