When I started releasing sample packs via Wavparty, I wanted to put videos for the demo tracks on YouTube so they’d be easy to post and share. They, uh, quickly took on a life of their own. I get a lot of questions about how I make these things. Here’s what I use:
Zwobot is a collection of video tools for Ableton Live that let you generate and mangle video in real time. If you use Ableton for making music, the video workflow feels natural. You can set up most parameters to react to audio (“high” and “low” frequencies) and sync changes to your track’s tempo. I make especially healthy use of the Mosh, VHS, and Couleur effects.
It has some quirks and I’ve run into a few glitches, but they keep rolling out updates at a healthy clip.
One thing that’s pretty annoying: I can get a decently high frame rate doing live visuals, but as soon as I start recording (there’s a built-in Recorder module), the frame rate basically cuts in half! I’m lucky if I can get 20 FPS on my iMac. I’m not sure what the bottleneck is; it happens even when I’m writing to an SSD. My workaround here is pretty dumb: I just slow the track down to half speed when I’m recording visuals and speed up my videos 2x in post. 🤷‍♂️ I’m curious to see if I can avoid this hack after eventually upgrading my system.
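The 2x speed-up in post is just a timestamp rescale, so if you ever want to script it instead of doing it in an editor, something like this works. This is a sketch, not my actual workflow: it assumes ffmpeg is installed, the filenames are placeholders, and the helper function name is made up.

```python
import subprocess

def speedup_cmd(infile, outfile, factor=2.0):
    """Build an ffmpeg command that plays a clip back `factor` times faster.

    setpts=PTS/factor rescales video timestamps (2.0 = double speed);
    atempo=factor speeds the audio up to match without changing pitch.
    """
    return [
        "ffmpeg", "-i", infile,
        "-filter:v", f"setpts=PTS/{factor}",
        "-filter:a", f"atempo={factor}",
        outfile,
    ]

cmd = speedup_cmd("halfspeed_take.mov", "final.mov")
# subprocess.run(cmd, check=True)  # uncomment to actually render
```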
I bought Final Cut Pro because I needed something better than iMovie and I didn’t want to pay for a monthly Adobe Creative Cloud subscription to use Premiere. I don’t need much out of editing software here--I generally record a bunch of takes of Zwobot video out of Ableton and slice them up, matching cuts to the rhythm of the track. The recorded video generally looks a little washed out (compared to what I see during recording), so I use FCP to tweak colors and sometimes add effects.
Final Cut’s a little weird--I’m still not totally used to the strange magnetic timeline and I think I would prefer a traditional track-based editor. I know there are other options out there but I’m too lazy and cheap to pursue them.
I always have my iPhone XR in my pocket. Every once in a while I see something that looks cool and record a snippet of video. I’ll AirDrop it to my computer and plop it into Zwobot. It doesn’t need to look great; it’s gonna get totally scrambled up anyway!
I like shooting my own video but I’m also a big fan of repurposing weird snippets of found footage. There are a bunch of YouTube download sites that let you plug in a YouTube URL and grab an MP4 that you can drop right into Zwobot and mangle. Don’t narc!
This is a pretty recent addition to my setup. EYESY is basically a Raspberry Pi in a nice case. It has MIDI and audio inputs and synthesizes video using Python scripts built on the Pygame library. I’ve only been using it for a week but so far it’s a lot of fun. The workflow is a little clunky and I know just enough Python to fuck things up. People are sharing scripts on patchstorage but the community is not particularly active.
Some captured frames:
Anyway, despite not really knowing what I’m doing I’ve figured out a workflow and visual style that works for me. Someday it would be fun to try to do this stuff live.