Sound Group Project – Castronaut

In the Castronaut project, our team Neon Blackboard created a universe of cat astronauts' heads in Unity and configured the scene for the Oculus Rift. We also had an audio input that triggered changes in the size and color of the cat heads. We presented this project at PlayTech, and more pictures are coming soon.
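The audio-reactive behavior boils down to remapping a microphone amplitude onto a scale factor and a hue. A minimal sketch of that mapping in plain Java (the exact ranges and the helper names here are assumptions, not our actual Unity script):

```java
public class AudioReactive {
    // Linear remap, equivalent to Processing's map() / Unity's Mathf.Lerp-style math.
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    // Map a normalized microphone amplitude (0..1) to a head scale factor.
    static float amplitudeToScale(float amp) {
        return map(amp, 0f, 1f, 0.5f, 2.0f); // quiet = half size, loud = double size
    }

    // Map the same amplitude onto a hue angle for the color change.
    static float amplitudeToHue(float amp) {
        return map(amp, 0f, 1f, 0f, 360f);   // sweep the full hue wheel
    }

    public static void main(String[] args) {
        System.out.println(amplitudeToScale(0.5f)); // mid amplitude -> 1.25
        System.out.println(amplitudeToHue(0.25f));  // -> 90.0
    }
}
```

In practice the raw amplitude would be smoothed over a few frames before mapping, so the heads pulse rather than flicker.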

It was our first time using Unity, and the entire process was very challenging. We searched for many sample scripts for our sketch and received tremendous help from Kyle Li and Ryan Hall. It was indeed the most rewarding project of the semester.


Week 9-2 & 10-1: Augmented Reality in Processing

Last year I went to Dorkbot NYC and watched a talk by an NYU speaker introducing Layar, an augmented reality phone app they were developing. Through the phone's camera, the audience could see virtual objects appear in their surroundings and move around them continuously. It was incredibly enjoyable to do something similar (but simpler) in our Core Lab. We tested a Japanese Processing library that lets the computer's camera detect a "marker" on paper: a pattern we designed and generated with this Marker Generator. We generated four "marker" files of different resolutions: 4×4, 8×8, 16×16 and 32×32. The camera detected the bold black border of the "marker" first and then recognized the pixel pattern inside.
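Once the border is found, the inner cells are just a small binary grid that the tracker matches against the generated marker file, trying all four rotations since the paper can face the camera at any angle. A minimal sketch of that matching step in plain Java (the grid representation and method names are illustrative assumptions, not the library's actual API):

```java
public class MarkerPattern {
    // Flatten a square binary marker grid into a bit string, assuming the
    // black border has already been detected and stripped by the tracker.
    static String decode(int[][] grid) {
        StringBuilder sb = new StringBuilder();
        for (int[] row : grid)
            for (int cell : row)
                sb.append(cell == 1 ? '1' : '0');
        return sb.toString();
    }

    // Rotate the grid 90 degrees clockwise; a marker matches if any of its
    // four rotations equals the stored pattern.
    static int[][] rotate(int[][] g) {
        int n = g.length;
        int[][] r = new int[n][n];
        for (int y = 0; y < n; y++)
            for (int x = 0; x < n; x++)
                r[x][n - 1 - y] = g[y][x];
        return r;
    }

    public static void main(String[] args) {
        int[][] seen = {{1, 0}, {0, 0}};
        System.out.println(decode(seen));         // "1000"
        System.out.println(decode(rotate(seen))); // "0100" after a quarter turn
    }
}
```

Higher-resolution markers (16×16, 32×32) carry more distinguishing bits but need the marker to fill more of the camera frame to be read reliably.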

In the following class, we expanded our knowledge of beginShape() and endShape() in Processing by creating an ellipse with beginShape(QUAD_STRIP). We further tested augmented reality with a potentiometer that controlled the visual effects around the marker.
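Building an ellipse (really a ring) with QUAD_STRIP means emitting vertex pairs that alternate between an inner and an outer radius as you walk around the circle; Processing then stitches each consecutive pair of pairs into a quad. A minimal sketch of that vertex math in plain Java (the center, radii, and segment count are illustrative):

```java
public class EllipseStrip {
    // Compute the vertices a beginShape(QUAD_STRIP) ring would use:
    // for each angle step, one point on the inner radius then one on the outer.
    static float[][] ringVertices(float cx, float cy,
                                  float rInner, float rOuter, int segments) {
        float[][] v = new float[2 * (segments + 1)][2];
        for (int i = 0; i <= segments; i++) {
            double a = 2 * Math.PI * i / segments; // repeat the first pair to close the ring
            v[2 * i]     = new float[]{cx + (float) (rInner * Math.cos(a)),
                                       cy + (float) (rInner * Math.sin(a))};
            v[2 * i + 1] = new float[]{cx + (float) (rOuter * Math.cos(a)),
                                       cy + (float) (rOuter * Math.sin(a))};
        }
        return v;
    }

    public static void main(String[] args) {
        float[][] v = ringVertices(0, 0, 50, 100, 8);
        System.out.println(v.length); // 18 vertices: 2 per step, 9 steps to close
        System.out.println(v[0][0]);  // first inner vertex sits at x = 50.0
    }
}
```

In the actual sketch each pair would be passed to vertex() inside beginShape(QUAD_STRIP)/endShape(); more segments give a smoother ring.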



Here is the video documentation:

Augmented Reality in Processing – Nyar4 Library Test from Yumeng Wang on Vimeo.