This is a preliminary experiment for an interactive installation coded in Max 7. What’s here now is a patch that manipulates a live video feed of the user and an audio track through several filters.
My original “big idea,” which I am working toward, is an interactive installation in which the viewer looks at a video of a daisy in a field. The farther they are from the camera (located below the screen or projection surface the video is shown on), the higher their vantage point above the flower, the less vibrant and clear the image, and the harsher the sound. As the viewer gets closer, the flower becomes larger until the viewer is looking at it from the perspective of an ant; the picture becomes clearer, the colors grow more vibrant, and calm ambient sound plays. The piece is meant as a commentary on the connection between a person’s closeness to nature and their inner peace.
I started by following the lesson called “Fractastic Sound,” included with Max 7. It teaches how to build a patch whose visuals react to audio data, using an audio track, stereo, patternizr, mappr, and viewr modules. After completing the lesson, I built a blob-detection patch following a tutorial on Cycling ’74’s website. I then added another patternizr, feeding the webcam video, the blob-detection data, and the audio track into its different inputs.
Once that was finished, I added more Vizzie filters to make the final product more unique. After experimenting with various filters in the Vizzie library, I settled on the foggr, the scramblr, and the oper8r. The foggr received the video from the second patternizr, and the blob-detection data fed the scramblr. Both the foggr and the scramblr were then combined in the oper8r. The combined video was fed into the kaleidr filter, with the mappr data controlling the inversion of the video’s colors. This was then sent to the video viewr for the final product.
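The real patch wires these Vizzie modules together visually, but the routing can be sketched in Python with stub functions standing in for each module — a rough map of the flow, not anything resembling Max code. The dictionaries and function bodies here are placeholders I invented for illustration:

```python
# Hypothetical sketch of the patch's signal flow. Each Vizzie module is
# stood in for by a stub that tags a "frame" (here just a dict); only the
# wiring between stages mirrors the actual patch.

def foggr(frame):     return {**frame, "fog": True}        # fog effect
def scramblr(frame):  return {**frame, "scrambled": True}  # scramble effect
def oper8r(a, b):     return {**a, **b, "combined": True}  # blend two streams
def kaleidr(frame, invert_amount):                         # kaleidoscope +
    return {**frame, "kaleidoscope": True, "invert": invert_amount}  # inversion

webcam_frame = {"source": "webcam"}          # from the second patternizr
blob_frame = {"source": "blob-detection"}    # blob-detection data
mappr_value = 0.5                            # audio-driven value from the mappr

fogged = foggr(webcam_frame)            # webcam video -> foggr
scrambled = scramblr(blob_frame)        # blob data -> scramblr
combined = oper8r(fogged, scrambled)    # both streams -> oper8r
final = kaleidr(combined, mappr_value)  # -> kaleidr, inversion driven by mappr
print(final)                            # would then go to the viewr
```

Reading the chain bottom-up recovers the description above: two parallel streams, merged, then kaleidoscoped with an audio-controlled inversion amount.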
As it stands now, the patch is interactive through movement in front of the camera and through toggling the audio track on and off. To get it closer to my big idea, I would need the video distortions and the static noise to be triggered by the viewer’s proximity to the camera, so that the noise and video filters turn off once the viewer comes close enough. I would also need to learn how to fade these effects and the noise in and out, instead of having them start and stop abruptly once the viewer reaches a certain spot.
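One way to get that gradual fade is exponential smoothing of an effect amount. A minimal sketch, assuming the blob area from the blob-detection step is a rough proxy for proximity (bigger blob = closer viewer); the function names, the 100,000-pixel cap, and the smoothing rate are all placeholders, not values from the patch:

```python
# Hypothetical sketch: fade an effect amount in and out based on viewer
# proximity, instead of switching it on and off abruptly.

def proximity_from_blob(blob_area, max_area=100_000.0):
    """Map blob area (pixels) to a 0..1 proximity value (1 = closest)."""
    return min(blob_area / max_area, 1.0)

def effect_amount(proximity):
    """Distortion/noise is strongest when the viewer is far away."""
    return 1.0 - proximity

class Smoother:
    """Exponential smoothing so the effect ramps toward its target."""
    def __init__(self, rate=0.2):
        self.rate = rate      # 0..1; smaller = slower fade
        self.value = 0.0

    def step(self, target):
        self.value += self.rate * (target - self.value)
        return self.value

smoother = Smoother(rate=0.2)
# Simulated blob areas: viewer far away, then walking up to the camera.
for area in [1_000, 1_000, 50_000, 100_000, 100_000]:
    amount = smoother.step(effect_amount(proximity_from_blob(area)))
    print(round(amount, 3))  # ramps up while far, eases toward 0 when close
```

In Max itself a `line` or `line~` object ramping a control value would play the same role; the point is just that the effect amount chases its target instead of jumping.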
To push the big idea even further, I would like the movement of the flower to respond to the movement of the viewer, to the noise level in the installation room, or to both. The flower could mimic the motions of the person in front of the camera, or sway more when the room is noisy because the wind in the animation grows stronger. These two interactions could even be combined, with the flower resisting a strong wind when the user’s movement doesn’t align with the wind’s direction.
I don’t know if I’ll ever be versed enough in Max to realize my big idea and beyond, but the process of getting as close to it as possible, as a complete novice, pushed what I could do with the program in a relatively short amount of time. I’m surprised at how far I got with the patch, even after going down a different path than I originally planned. I might never see my big idea as an installation, but I’m happy with the patch I made trying to get there.