Code 1 Final

My initial inspiration for this project was heavily based on the Nintendo game Electroplankton. I knew from the start that I wanted to create something similar that leaned heavily on design-based elements rather than on traditional gameplay or a predetermined goal. Because I am already creating plenty of more “traditional” games in my core studio and lab classes, I thought it would be great to explore something less conventional: a game heavily based on visuals, where players can play around and decide for themselves what they get out of it. My initial idea was a game based on archery, a sport I have been very much involved in and passionate about for many years. The player would click and drag to aim an arrow in the desired direction; upon releasing the mouse, the arrow would fly, bouncing off the surrounding objects in the forest, leaving behind trails of color, and building up some sort of song or melody.

I ran into many issues while attempting to bring my concept to life. It turned out to be fortunate that my initial concept was broad and mainly visual, because that allowed me to tweak the overall idea multiple times while figuring out the mechanics behind it. Though I worried that I might not be able to achieve what I had set out to do, and often considered turning my concept completely around to create something more plausible for someone with my skill set, I stuck with a core mechanic and pushed myself to discover other options. Although I had no prior outside coding experience and have only learned the basics of Processing and Unity this year, I wanted to go beyond the specifics we had covered in class and test my problem-solving skills and my knowledge of the language.

As I played around with different aspects of the project after getting the core mechanic working, clicking to instantiate shapes that then moved on their own, I started to branch out from my original concept in order to further incorporate audio into my primarily visual designs. I abstracted my core concept quite a bit to explore this new direction. I ultimately dropped the background forest scene in favor of a simpler background, so that the focus would be on the mechanic and the movement itself. With that change the arrows no longer served the overall conceptual direction, so I swapped my images out for a more abstract and colorful visual experience.

My initial direction steered me toward Box2D to create the collision effects I wanted, but this approach led to many bugs; some of the library's main components seem to have changed and are no longer recognized by Processing, so I had to take a different approach to object interaction. I still managed to check collisions myself and adjust the experience accordingly, and I successfully got my shapes to spawn on click and bounce around within the bounds of the canvas, which had been my core mechanic from the beginning of the ideation process.
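The wall-bounce mechanic can be handled without Box2D at all: move each shape by its velocity every frame, and whenever it crosses a canvas edge, reflect it back and flip the corresponding velocity component. Here is a rough standalone model of that logic in plain Java rather than Processing; the class name `Fish`, the `update()` method, and the canvas size are illustrative assumptions, not the actual code from my sketch.

```java
// Standalone model of "spawn on click, bounce within the canvas".
// In Processing this update would run once per draw() frame.
public class Fish {
    float x, y, vx, vy;
    static final int WIDTH = 800, HEIGHT = 600; // assumed canvas size

    Fish(float x, float y, float vx, float vy) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
    }

    // Move by the current velocity, then reflect at each wall the
    // position crossed, flipping that axis's velocity component.
    void update() {
        x += vx;
        y += vy;
        if (x < 0)      { x = -x;             vx = -vx; } // left wall
        if (x > WIDTH)  { x = 2 * WIDTH - x;  vx = -vx; } // right wall
        if (y < 0)      { y = -y;             vy = -vy; } // top wall
        if (y > HEIGHT) { y = 2 * HEIGHT - y; vy = -vy; } // bottom wall
    }

    public static void main(String[] args) {
        Fish f = new Fish(790, 300, 20, 0);
        f.update(); // crosses the right wall and is reflected back inside
        System.out.println(f.x + " " + f.vx); // x back in bounds, vx flipped
    }
}
```

In a real Processing sketch, `mouseClicked()` would construct a new `Fish` at `(mouseX, mouseY)` and the collision check is also the natural place to trigger the splash sound effect.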

I created a machine that produces both visuals based on audio and audio based on visuals. At the beginning of the sketch the audio starts playing and a single fish is automatically generated and released into the system; as it bounces around the screen it triggers additional sound effects. From there, the user can try their hand at the interactivity and click around the screen to generate additional fish, which behave like the first. Visually, the fish cycle through a series of random colors, all shades of blue and green, to emphasize the oceanic character of the fish, the splashing sound effects, and the ukulele music. I also included code that analyzes the audio itself and draws sound waves corresponding to the specific song it is listening to. The waveforms change color, much like the fish do, and so do the particle systems; all colors are randomized between green and blue values. The particle systems appear wherever the user clicks, letting them control where each system instantiates and giving further control over the overall image that is created.
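The "random but always oceanic" coloring comes down to constraining the random ranges per channel: keep red low and let green and blue vary widely. As a rough sketch of that idea in plain Java (the class name, method name, and exact channel ranges are my own illustrative assumptions, not the sketch's actual values):

```java
import java.util.Random;

// Model of the randomized blue/green palette shared by the fish,
// waveforms, and particle systems: red stays muted while green and
// blue are drawn from a wide range, so every color reads as oceanic.
public class OceanPalette {
    static final Random rng = new Random();

    // Returns an {r, g, b} triple in the blue/green range.
    static int[] randomOceanColor() {
        int r = rng.nextInt(40);        // 0..39, keep red low
        int g = 80 + rng.nextInt(176);  // 80..255
        int b = 80 + rng.nextInt(176);  // 80..255
        return new int[] { r, g, b };
    }

    public static void main(String[] args) {
        int[] c = randomOceanColor();
        System.out.printf("rgb(%d, %d, %d)%n", c[0], c[1], c[2]);
    }
}
```

In Processing itself this is a one-liner per frame, something like `fill(random(40), random(80, 255), random(80, 255))`, applied to the fish, the waveform stroke, and each particle alike.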

Overall, my final creation is not extremely interactive or particularly “game-like” in the sense of traditional goal-oriented games. The individual parts of the project interact with each other quite often, but user-driven interaction is fairly limited, and the project mostly runs itself. The audio responds to the visuals in the case of the fish specifically: when they collide with and bounce back from the walls, they send out a splashing sound effect. The visuals, in return, respond to the audio in that the sound waves directly correlate to the main background song and feed off that audio file. What I have created is a simple audio-visual experience in which the user can control certain, though somewhat minimal, aspects of their experience. Ideally, in a professional setting this could become a large-scale, very immersive installation, perhaps with very large screens surrounding a space in which the audience can move around or simply stand and take in their surroundings. I want those who experience my project to appreciate its simplicity and, rather than trying to take control of the project itself, let the audio and visuals speak for themselves and provide a somewhat calming addition to the existing experience of listening to music.
