Thesis overview + components

Heya,

 

I’m going to explain to you how my oscilloscope network works.

 

Software:

  1. Touchdesigner (free license)
  2. Oscirender (free, open-source)
  3. VB-Cable (free)
  4. loopMIDI (free)
  5. Arduino IDE (free, optional)
    1. Firmata library (free, comes with the Arduino IDE under Sketch/Examples)
  6. Blender (optional) (free)
    1. Oscirender Blender addon (free)
  7. OpenCV scripts (optional, open-source)
    1. You can also write some basic OpenCV scripts in Touchdesigner, running on your webcam
    2. If you want to use a specialized depth camera and/or libraries, to my understanding you need:
      1. A depth cam supported by Touchdesigner + your OS (Kinect is good for Windows)
      2. NDItools (free) + the ndi-python library (free) work for sending this stuff to Touchdesigner (Mac only)
      3. Or you can probably send its capture over OSC, but I haven’t tried that yet
  8. Video Waves glsl shader (free, open-source)
    1. I used this port to Touchdesigner
  9. (Optional) DAW (ableton is well supported by TD) for post-production things + audio effects
    1. I’ve been using Dirtywave M8 headless tracker
      1. It’s an open-source software tracker that runs on a Teensy 4.1 microcontroller
      2. There’s a free Touchdesigner interface (but I find this clashes with my projects)
      3. Instead I use the Web Browser interface (also free)

Hardware:

  1. Computer (obv)
  2. Arduino / your firmata-supported microcontroller (optional)
    1. whatever sensors you fancy
    2. can use lights, etc. as outputs
  3. Analog CRT out (optional, for prettier visuals)
    1. need HDMI2AV interface to send signal out
    2. Analog mixer + circuit bent video gear (optional, fun for post processing + feedback loops)
  4. Analog oscilloscope (optional)
    1. need Stereo-out audio from your computer
  5. Projector (optional)
    1. looks like shit for the feedback-y stuff IMO, but looks good with just oscilloscope waves (bonus if you somehow have access to a laser projector)

 

The project flow

Audio

Mocap (optional) -> Blender (optional) -> Oscirender -> VB Cable output -> Touchdesigner -> loopmidi -> Oscirender -> TD post-audio fx

Video

Mocap (optional) -> Blender (optional) -> Oscirender -> VB Cable output -> Touchdesigner + fx -> Video_Waaaves shaders

P Comp Stuff

Input: Gyroscope in (connected to Arduino Nano) -> Touchdesigner pipeline (as an Oscirender controller)

Outputs: Osci Audio in Touchdesigner -> Firmata -> Arduino (Mega) -> 12v relay -> 12v lightbulbs
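If it helps, the computer-side logic of that output chain boils down to a threshold gate: when the osci audio level crosses a threshold, drive the relay pin high. Here's a rough Python sketch — the function is mine, and the pyFirmata port name and pin number are placeholders, not my actual wiring:

```python
def relay_on(amplitude, threshold=0.2):
    """Audio-reactive gate: True when the osci audio level is loud
    enough that the relay (and so the lightbulbs) should fire."""
    return abs(amplitude) >= threshold

# Hardware side (assumes the pyFirmata library and an Arduino Mega;
# "COM4" and pin 7 are placeholders):
# from pyfirmata import Arduino
# board = Arduino("COM4")
# board.digital[7].write(1 if relay_on(level) else 0)
```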

 

If that means nothing to you, that’s ok! I’m going to explain how all these components work together below.

 

 

Oscirender

Documentation & Tutorials for Oscirender can be found here

Below is the UI, and I’m going to give you an overview of how I’m using it.

 

Input & Parameter Control

Oscirender takes text (.txt), 2d vector (.svg), 3d model (.obj), and scripting (.lua) inputs. For this project, I’m using a 3d model of a UV Sphere. Oscirender has a variety of 2d & 3d transformation parameters to make that model move. These parameters exist as sliders that can be set to static values or modulated with a variety of oscillator waveforms (sine, square, seesaw, triangle, sawtooth, & reverse sawtooth). The value range of a specific slider can be modified by accessing the slider tab, selecting a parameter from the drop-down, and changing the high and low range (these default to 0.0 and 1.0). Controlling these ranges was a crucial part of harmonizing my project.
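If you want the modulation spelled out, here's a rough Python sketch of what a slider + oscillator is doing — the function names and waveform math are my approximation, not Oscirender's actual code:

```python
import math

def lfo(t, freq=1.0, wave="sine"):
    """Return an oscillator value in [0, 1] at time t (seconds)."""
    phase = (t * freq) % 1.0
    if wave == "sine":
        return 0.5 + 0.5 * math.sin(2 * math.pi * phase)
    if wave == "square":
        return 1.0 if phase < 0.5 else 0.0
    if wave == "sawtooth":
        return phase
    if wave == "reverse_sawtooth":
        return 1.0 - phase
    if wave == "triangle":
        return 2 * phase if phase < 0.5 else 2 * (1 - phase)
    raise ValueError(wave)

def modulate(t, low, high, freq=1.0, wave="sine"):
    """Scale the oscillator into the slider's [low, high] range --
    the same range you set in the slider tab."""
    return low + (high - low) * lfo(t, freq, wave)
```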

 

Midi Control

The onboard sliders & oscillators are a great starting point, but I quickly found myself wanting more ways to control Oscirender’s many parameters. Oscirender supports MIDI inputs as parameter control. This works in the form of a MIDI listener event, where a parameter’s MIDI listener is engaged by clicking on it (denoted by a red highlight) and automatically pairs with the first MIDI input it receives (denoted by a yellow highlight).

 

Audio Output

Under the Audio tab you can select an output device. I’m using VB-Cable to route the audio internally, at its highest supported sample rate of 96,000 Hz. The output needs to be stereo, so both L & R audio channels are preserved for the XY-oscilloscope.

 

 

Touchdesigner

Here’s a snapshot of my current network:

My network breaks down into 2 main parts: the data that is coming in from Osci Render (Audio + Visual Processing), and the data that is going out to Osci Render (Midi).

Audio + Visual Processing

Audio handling

I’m routing my Oscirender audio into Touchdesigner’s audiodevin component via VB-Cable. I set my audio device format to stereo & rate to 96,000 Hz to be consistent with my input signal. Then I send it to 2 audio outputs: my speakers or headphones, and an HDMI2AV adapter device, the L&R audio channels of which I connect to my analog oscilloscope.

The XY Oscilloscope

I then route my audio to this XY Oscilloscope component. Funny thing about this one: I spent my first week in Touchdesigner trying to figure out how to build a good oscilloscope visualizer. This involved using a CHOP-to-SOP component to instance the incoming Left & Right audio channels as geometry points on the scope’s X & Y axes (Left = X, Right = Y). Anyway, it turns out there was a prebuilt XYscope component in the Touchdesigner palette under Tools that does just that, while also including some pretty cool feedback effects and a particle/line toggle switch. So I ditched my paltry scope and rolled with this one, with a few additional modifications that I made.

I added an HSV-to-RGB converter so I could modify the hue of the geometry’s line material (which by default only accepts RGB values). I also added a toggleable trail effect within the geometry COMP, and a laser out in case I ever get my hands on a laser projector. The laser out -> x,y channel select is also a reliable way to create oscilloscope sounds from geometry within Touchdesigner. I’ve experimented a bit with using it as a way of resampling my osci-sounds after creating further geometric transformations within Touchdesigner (as well as taking video input, tracing it as geometry, and pulling the x,y data to send out as stereo sound), but the results have been mixed (read: ear curdling).
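The hue conversion itself is just the standard HSV-to-RGB math, which Python's stdlib already does. A minimal sketch of the converter (my own wrapper, not the actual TD network):

```python
import colorsys

def hue_to_rgb(hue, sat=1.0, val=1.0):
    # hue in [0, 1); returns (r, g, b) each in [0, 1],
    # the form an RGB-only line material parameter expects
    return colorsys.hsv_to_rgb(hue % 1.0, sat, val)
```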

Anyway, the SOP geometry is then rendered as a TOP (video data) over an empty alpha channel and sent out of the component for further use.
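The Left = X / Right = Y mapping, and the reverse laser-out resampling trick, are inverses of each other. Roughly, in plain Python (my own toy functions, not actual TD operators):

```python
def stereo_to_points(left, right):
    """One scope point per sample: Left channel -> X, Right -> Y,
    flat on z = 0 (what the CHOP-to-SOP instancing produces)."""
    return [(l, r, 0.0) for l, r in zip(left, right)]

def points_to_stereo(points):
    """The resampling trick in reverse: pull x, y back out of
    geometry as L/R audio channels."""
    left = [p[0] for p in points]
    right = [p[1] for p in points]
    return left, right
```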

Post Effects

I do some minimal post effects to the oscilloscope rendering to up its holographic look, but honestly I don’t want to fuck with it much and leave it mostly preserved. That is, until…

Waaave_Pool Shaders

Waaave_Pool was kind of my intro to a lot of video synthesis stuff, and I used the Raspberry Pi port for live video effects at a few shows last year. The creator, Andrei Jay, hosts local donation-based classes on video gear/signals/synthesis; the few I’ve been to piqued my interest in video synthesis and in maintaining an open-source, DIY mentality. You can read more on their personal site.

Someone made a port of the Waaave_Pool GLSL shaders for Touchdesigner; you can download the component here.

Touchdesigner has native support for GLSL shaders in its GLSL TOP component. Basically you just import the GLSL shader code and parse through it, finding all your vector variables and creating instances of them in your parameter panel. Waaave_Pool has 35 control parameters, so luckily someone else was nice enough to have already done it and posted their component online.

Currently I’m taking the approach of just making a fuck ton of noise and LFO CHOPs to control different parameters. It takes some fiddling to make something that you like, and there’s a lot of synchronization between the different interrelated effects. I like to offset the frequencies of my noise/LFO CHOPs to keep things fresh & changing. I’m also scaling frequency against my MIDI note speed variable (more on that later) to work in some more audio-reactivity, likewise with my gyroscope sensors. The cool thing about Touchdesigner is you can find a quick way to make just about anything control anything.
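The offset-frequency idea, sketched in Python — a bank of slightly detuned LFOs drifting out of phase over time, so no two shader parameters ever line up for long (the frequencies here are made up, not my actual settings):

```python
import math

def lfo_bank(t, base_freq=0.1, offset=0.013, count=4):
    """Return `count` sine LFO values in [0, 1], each at a frequency
    slightly detuned from the last so they drift apart over time."""
    return [0.5 + 0.5 * math.sin(2 * math.pi * (base_freq + i * offset) * t)
            for i in range(count)]
```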

Midi

This network is built to take data in touchdesigner (from physical controllers interfacing over serial, opencv tracking, internal sequencing, instruments, you name it) and send it internally to Oscirender.

I’m using loopMIDI, an internal router for Windows, as my MIDI in and out device.

loopMIDI is super simple: just create a port and then reference that in the Touchdesigner MIDI control.

You can access the MIDI Mapper under the dialogs menu.

I now have 128 addressable MIDI CC controls, numbered 0-127 (think of these as variables), that can be generated in Touchdesigner & sent to Osci Render.

CHOP to MIDI

I built a custom component to take CHOP data and send it out as a MIDI signal. The incoming CHOP is assigned a channel + cc (ch#cc# format), its value range is set and scaled to 0-1 (Usable by MIDI), and then it is passed through the midiout chop, which basically pumps it back out into loopMIDI.
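The scaling step is roughly this (my own sketch, not the component itself; the mido library and port name below are placeholders — in my network the actual sending happens in the midiout CHOP):

```python
def chop_to_cc(value, low, high):
    """Scale a raw CHOP value from [low, high] down to a 0-127
    integer, the range a MIDI CC message can carry."""
    normalized = (value - low) / (high - low)
    normalized = max(0.0, min(1.0, normalized))  # clamp out-of-range input
    return round(normalized * 127)

# Hypothetical send over a loopMIDI port (requires mido; "TD-to-osci"
# is a placeholder port name):
# import mido
# port = mido.open_output("TD-to-osci")
# port.send(mido.Message("control_change", channel=0, control=1,
#                        value=chop_to_cc(0.8, 0.0, 1.0)))
```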

I also added a switch to turn the MIDI channels off. Only one MIDI CC should be active when assigning to a parameter in Oscirender; the MIDI listener in Osci Render makes this a bit of a pain in the ass. If multiple MIDI CCs are active, one will take precedence and connect to the selected MIDI parameter, even if it’s already paired with its own parameter.

MIDI Note Generator

This custom component can generate chords of up to 5 notes.
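A sketch of what a chord generator like this might compute — the chord shapes are standard semitone offsets from a root note, but the dictionary and function here are mine, not the component's internals:

```python
# Common chord shapes as semitone offsets from the root
CHORDS = {
    "major": [0, 4, 7],
    "minor": [0, 3, 7],
    "maj7":  [0, 4, 7, 11],
    "min9":  [0, 3, 7, 10, 14],  # five notes, the component's max
}

def chord_notes(root, quality="major"):
    """Return the MIDI note numbers for a chord built on `root`,
    dropping anything outside the valid 0-127 note range."""
    return [root + offset for offset in CHORDS[quality]
            if 0 <= root + offset <= 127]
```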
