Thesis Blogs May

5/1/23 (23!!)

Hey hey,

 

I’m beginning my retroactive documentation of my senior year thesis project, which is sure to bleed over and cover my larger higher ed experience and life as a whole. [edit post draft – yeah it did]

 

Starting this is tough… where do I begin this story? My thesis has been a clusterfuck work in progress, juggling lots of ideas and feelings that have been percolating for some time. It hasn’t been easy for me; that’s an understatement. It’s been quite shitty, stressful, emotionally loaded, and full of disappointments set against the equally massive and vague demands I carry for myself. That’s been my experience with formal education: we don’t jive well, and it makes me feel pretty shitty.

 

Today, for instance, I want to document my thesis process. Well, it’s May and I have a week, so where the fuck do I start? My hierarchical mind says go chronological, be formal… be selective with your words, so on and so forth. It leads me down rabbit holes: researching the best blogs (20 tabs of: best blogs reddit eyebleach), then deciding I might as well start a website, then I don’t want to pay for this shit so let me learn how to host it for free, then omg this is now a multi-week project to get this shit up and running and I just want to put keystrokes to comp and get some of this shit down. So I decided to use this cute little WordPress site @theNewSchool made for me. My impulse is to structure a bunch of nested pages, set up a dope theme with some Stable Diffusion landing image, and make everything super pretty and planned and ordered before I write. (delete this > my fascist half at war with my anarchist self that just wants to let it all out)

 

This has been more or less my experience with my thesis project. I have the tendency to look for really elegant ways to solve issues when building out digital interfaces, and I’ve had to exercise restraint. Oftentimes the working solution is best. Or rather: let me get the working solution running before rebuilding it all from scratch. The latter is a surefire way to make myself pull out the ever-decreasing hair I have remaining!

I’ll give you an example. I’ve been using a software called osci-render to generate x/y oscilloscope music (I’ll explain the nitty gritty of this elsewhere) from 3D models. I’m using TouchDesigner as an intermediary in this project (a pipeline) to route my data in from various places and do all sorts of fun processing and output stuffs. Anyways, getting TD interfaced with osci-render was a real pain in my asshole, even though my eventual solution was quite simple.

osci-render can take 3D models and create line art from them, which is rendered by a built-in oscilloscope as sound. It has a bunch of cool animation parameters, each with a different effect on the output sound. Here’s a video on it: osci-render full overview – live oscilloscope music synthesizer

It’s free & open-source; try it out here: https://github.com/jameshball/osci-render

Anyway – I wanted to control these parameters in all sorts of fun experimental ways through TouchDesigner (physical sensor input from microcontrollers, OpenCV hand-tracking stuff, automated events, random noise, MIDI notes, etc.). Well, the only built-in way osci-render can receive data is through a MIDI listener, which looks for incoming MIDI signals and pairs them with a selected synthesis parameter. For example: I plug in my MIDI controller, press the x-rotation parameter in osci-render, and turn CC knob #16; now knob #16 is paired with osci-render and controls my lil cube’s x rotation. This is nice, but I wanted to go a step further and control these parameters in a hands-off way – I wanted to shoot all sorts of data at all 30+ parameters at once and make the cube go kaboom, and I wanted the data streams to be generated by the aforementioned experimental inputs.
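For the curious, the MIDI messages doing all this work are tiny: a Control Change is just three bytes on the wire. Here’s a rough Python sketch of packing one – the function name and the 0.0–1.0 normalization are my own convention, not anything TD or osci-render dictates:

```python
def cc_message(channel: int, controller: int, value: float) -> bytes:
    """Build a raw 3-byte MIDI Control Change message.

    channel: 0-15, controller: 0-127 (e.g. 16 for that knob),
    value: a normalized 0.0-1.0 input scaled to MIDI's 0-127 range.
    """
    cc = max(0, min(127, round(value * 127)))   # clamp into MIDI's 7-bit range
    status = 0xB0 | (channel & 0x0F)            # 0xB0 = Control Change status
    return bytes([status, controller & 0x7F, cc])

# e.g. cranking x rotation to full via CC #16 on channel 1:
# cc_message(0, 16, 1.0) -> b'\xb0\x10\x7f'
```

The 7-bit value range (0–127) is also why finer-grained control over those 30+ parameters gets coarse fast.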

So, my overkill solution was to fork and rewrite osci-render to receive data streams from TouchDesigner. This didn’t work well for a few reasons. One, osci-render is written in Java, which I know nothing about. Two, I’m not much of a coder in general and know next to nothing about building apps – I got all sorts of errors when I tried to rebuild osci-render on my comp (shivers). Three, TouchDesigner doesn’t support programming in Java (to my knowledge); Python is its default scripting language. So after some error & error, I decided on using OSC to communicate between TD and osci-render, because that’s a nice and fundamental way to send data between apps. Well, I still couldn’t get that to work because I didn’t have a good understanding of how osci-render was built, and it ended up being a real headache.
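OSC really is that fundamental, for what it’s worth: a message is just a null-padded address string, a type-tag string, and big-endian values, usually shot over UDP. A minimal stdlib sketch (helper names are mine; in practice a library like python-osc handles all of this):

```python
import struct

def _pad(b: bytes) -> bytes:
    """OSC strings are NUL-terminated, then padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Pack a one-float OSC message: address, type tag ',f', big-endian float32."""
    return _pad(address.encode()) + _pad(b",f") + struct.pack(">f", value)

# sending it is one UDP datagram (address path and port are made up here):
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(
#     osc_message("/rotation/x", 0.5), ("127.0.0.1", 9000))
```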

My working solution ended up being the simplest one: send the info from TouchDesigner to osci-render via MIDI. I used another free software called loopMIDI to create an internal loopback MIDI port. Essentially, it creates a virtual MIDI port that I can open in TouchDesigner, where I synthesize MIDI signals, designate CC channels, and send them back out through loopMIDI. I then turn on the MIDI listener in osci-render and pair the individual channels with different synthesis parameters. The pairing process was a little finicky: if more than one MIDI channel was active at a time, they would compete for control of whichever osci-render parameter had an active MIDI listener. Adding on/off switches in TouchDesigner for each MIDI channel resolved this – you just turn the channel on in TouchDesigner, link the parameter in osci-render, then deactivate that channel and move on to the next one.
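That on/off-switch fix boils down to a tiny bit of gating logic. Here’s the idea sketched as plain Python outside of TouchDesigner (the class and method names are hypothetical, just to show the pairing workflow):

```python
class CCGate:
    """Per-controller on/off switches, so only one CC stream is live
    while pairing parameters against osci-render's MIDI listener."""

    def __init__(self):
        self.enabled = set()   # controller numbers currently switched on

    def toggle(self, controller: int, on: bool) -> None:
        """Flip one controller's switch on or off."""
        (self.enabled.add if on else self.enabled.discard)(controller)

    def route(self, messages):
        """messages: iterable of (controller, value) pairs.
        Drop anything whose switch is off so streams can't compete."""
        return [(c, v) for c, v in messages if c in self.enabled]
```

Turn one controller on, do the pairing in osci-render, toggle it off, next one.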

It’s not the elegant solution I wanted, but hey, it works!

I’m trying to embrace these sorts of workaround solutions. It’s in the spirit of scavenging and repurposing – a sort of software-building method for the unenlightened masses who aren’t fluent coders (read: me). And there is some elegance in its simplicity.

Anyway, this is one of what feels like 10+ major system roadblocks I ran into in this project. Some others: learning a new central software – TouchDesigner; mocap in Blender (recalled software NI mate, Kinect unsupported on Mac, found a Windows workaround, learn Blender, learn rigging, comp unable to run live mocap -> Blender -> osci-render -> TouchDesigner without mega-laggg); aforementioned crucial libraries unsupported by certain operating systems (returned a Mac & got a PC); aforementioned computer crashes (insufficient hardware – this shit needs multiple more powerful comps, which I don’t have); adapting to Windows & its obtuse path system (things just don’t work right away); learning Python to build OpenCV projects; routing video streams from Python projects to TouchDesigner (used NDI, ndi-python doesn’t work for Windows builds, screen-capture workaround too comp intensive); among others.

There were other issues that I found workable solutions to: building a virtual oscilloscope (osci-render’s web oscilloscope is too laggy, need to build my own, learn TouchDesigner, build an x/y oscilloscope… turns out there’s already a prebuilt TD x/y oscilloscope component); serial communication issues between TouchDesigner and Arduino (learning about serial com, writing scripts, serial out works but not serial in, limited # of serially accessible pins… turns out there’s a library called Firmata which solves everything & it has a prebuilt TouchDesigner component); motion capture to x/y oscilloscope rendering (so so many hardware/software/OS/pipelining/etc. issues – see previous paragraph… turns out there’s a TouchDesigner laser component that can trace camera + geometry input in a way x/y oscilloscopes can read).

I often asked myself, why the fuck am I doing this?

To make a great thesis (gag): this is the culmination of my formal ed, years of school. This was me reckoning with years of colossal expectations unfulfilled, and the ickiness of thesis was me trying to salvage it all – to finally make something great. I’ve been to something like a combined 7 high schools and colleges over the past 10 years, spanning public high schools, an authoritarian all-boys boarding school, public universities, and a private arts + design school. None of them really clicked with me. I came to Parsons with high hopes on the heels of some really dark years alone, with a portfolio stuffed full of game design & digital arts stuffs. I had one really sweet semester, then 3 more pushed online because of covid, then 2 more in a university in flux by virtue of admin decisions made in bad faith. The next semester those fundamental structural issues came to a head with the righteous adjunct faculty strikes, and now I find myself at the end of my last semester. And I’m pretty disappointed with ed – things could be so damn cool and so much less costly if we took a more radical approach to learning and put things in the hands of students + faculty – and very much done with it all.

Anyways, here are some things that thesis has taught me:

From a technical standpoint, I learned how to use TouchDesigner and that it’s my friend (especially for pipelining). I also learned the practical limits of consumer computers, the folly of my stubbornness, and the virtue of being open-minded to alternative solutions that let me quickly create instead of miring in the bullshit of systems optimization. Only fix it if it’s broken.

I’ve been compiling this entire TouchDesigner–oscilloscope ecosystem and I’m going to publish my project files so others can use them. The wealth of open-source software and free instruction I’ve found on the internet was, as always, invaluable, and I would like to start making my own contributions to that free exchange.

Traditional ed is not for me. I also found instruction and co-learning spaces outside of my Parsons degree, often for free or on a donation-based structure. I found these casual, open, and non-hierarchical learning spaces to be much more “my vibe,” for lack of better words… After some more self-reflection, I’ve decided that I want to teach post-grad, specifically in non-traditional & specialized learning environments like these.

The pressure I put on myself at points during thesis (and my k-12+4 ed) to develop monetizable skills was personally destructive and antithetical to how I believe learning should be structured; I often gatekept myself from areas of genuine interest and instead funneled my energy to projects I arbitrarily deemed of more value. And I hated it. GodblesstheSystemsFolks, for I am not 🙂

 

 

5/9/23

So this project is coming together really nicely.

I keep adding new components off my so-called wishlist, and I’m throwing in a lot of stuff I never thought I’d have the chance to get to. My work rate is increasing now that I finally know what I want to do and how the fuck everything works in TouchDesigner.

One big challenge right now… figuring out how to present the thing. More specifically: making the scope music something that can be interacted with and changed in an interesting way in an installation-type setting (which isn’t really my favorite).

There’s a lot of cool stuff I’ve stumbled on playing around with this network, but it’s kinda tricky making that experience replicable for someone with no experience with it. I don’t want to make the experience too narrow; I want it to be dynamic, like when I play with it. I’m also kind of tied to the gyroscope-ball / UV-sphere-rendering idea, for better or worse… so I’m trying to balance maintaining the visual integrity of that ball while still going into the explode-y side of things.

I’ve been experimenting with sequencing & high-value triggers to do this. Sequencing can lead people to certain things I want them to see. High-value triggers & envelopes can let things go crazy while still pulling it back.
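One way to picture that “go crazy but get pulled back” behavior: a trigger spikes a parameter, and an envelope relaxes it back toward its baseline. A sketch of the shape in Python (the function name and constants are illustrative, not values from my network):

```python
import math

def decay_envelope(t: float, peak: float = 1.0, tau: float = 0.5) -> float:
    """Envelope value t seconds after a trigger: jumps to `peak`,
    then exponentially relaxes back toward 0 with time constant `tau`."""
    return peak * math.exp(-t / tau) if t >= 0 else 0.0

# a parameter rides its baseline and spikes on a trigger, e.g.:
# x_rotation = baseline + depth * decay_envelope(time_since_trigger)
```

Bigger `tau` means a slower pull-back; stacking a few of these with different time constants gets you that crazy-then-calm motion.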

But this is tough… sequencing gets something that sounds more classically musical and makes the shader outputs look rhythmic and beautiful… but there’s something I love about the slower, more progressive transformations I can get with a hands-on approach – it’s just mesmerizing, and makes for a chilled-out-looking set on the analog gear.

I’m having a bit of trouble articulating this part, so here’s 2 vids to capture the difference in approach:

Shader Sequenced Madness

https://www.youtube.com/shorts/DRZ0GExMUQ8

Analog Chill –

https://www.youtube.com/watch?v=a70NyQgr8sY

The 2nd one was actually a live-coded Lua script (which osci-render supports) with some slow x & y transformations.
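I can’t reproduce the original Lua here, but the gist of those slow x & y transformations is a gentle per-sample-point transform, something like this (sketched in Python; the constants are just for flavor, not the actual script’s values):

```python
import math

def slow_transform(x: float, y: float, t: float):
    """Slow drift applied to each sample point over time t:
    a gentle rotation plus a subtle horizontal sine wobble."""
    a = 0.05 * t                                  # slowly growing rotation angle
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return xr + 0.1 * math.sin(0.2 * t), yr       # add the x-axis drift
```

Run over every point of the shape each frame, the whole figure turns and breathes instead of jumping around.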

 

That’s all to say this is a new network whose ins-n-outs I’m still kind of learning… but therein lies my conflict. I start jumping at every new thing I can throw into this – and there’s a lot. Which is a good problem: being able to make a ton of cool and diverse stuff with one network. But sometimes, in my haste to explore new stuff, I forget to really study what made old patches (does that term apply here? lol) cool, and miss capturing some lightning in a bottle. Retrospectively, looking at old vids I recorded on my phone, I wish I’d taken a few more notes, documented things better, maybe added save states or something.

Anyways, hoping to add some more day to day documenting as I prep to showcase this thing.

 
