As part of the project research and development, I’ve been getting to grips with visual programming using Quartz Composer. The images are from a range of experiments in the creation of live interactive visuals, and not a reflection of the final project outcome. Quartz Composer enables the creation of live visuals in combination with various input types: sound, camera, mouse, sensors, Xbox Kinect, etc. The work below has aided my understanding of how each of these input types can be used to manipulate different kinds of imagery. The book Learning Quartz Composer by S Buchwald has been a huge help in getting my head round the software and has made the experience hugely enjoyable.
Click on an image to expand it and read a more detailed description.
MIDI input to output – maps each key from a MIDI keyboard to trigger images – in this simple example, numbers.
Cubes that follow a mouse input, one directly on the mouse and the other orbiting. Orbit size varies depending on mouse input speed.
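Quartz Composer wires this up with patches rather than code, but the orbit logic can be sketched in Python. The function and parameter names here are my own, and the speed-to-radius mapping is an assumption about how the patch behaves:

```python
import math

def orbit_position(mouse_x, mouse_y, speed, t,
                   base_radius=0.2, speed_gain=0.5):
    """Position of the orbiting cube around the mouse cursor.

    The orbit radius grows with mouse speed; `base_radius` and
    `speed_gain` are hypothetical tuning values.
    """
    radius = base_radius + speed_gain * speed
    angle = 2 * math.pi * t  # one full revolution per time unit
    return (mouse_x + radius * math.cos(angle),
            mouse_y + radius * math.sin(angle))
```

Evaluating this each frame with the current time and a smoothed mouse speed reproduces the effect: the first cube sits at `(mouse_x, mouse_y)` directly, while the second circles it at a speed-dependent distance.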
Deke video effect – takes live video and multiplies it to create an endless loop.
Rain effect driven by mouse input and movement.
Three 3D textured buildings that animate and react to a live sound feed, using three frequency bands.
A simple 3D object that can be manipulated in real time with animated lighting.
Sound reactive visualisation – live video input manipulated to map onto a sphere, cube and background, reactive to sound.
Rotating sphere and cube
This breaks sound from a microphone input into 12 frequency bands. It animates 12 blocks alongside an animated cube in the centre that embeds a live video feed.
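In Quartz Composer the Audio Input patch hands you the band levels directly, but the underlying idea can be sketched in plain Python: take the spectrum of a sample buffer and average its magnitudes into a fixed number of bands. This naive DFT is only illustrative (a real patch would use an FFT), and all names here are my own:

```python
import cmath
import math

def band_levels(samples, num_bands=12):
    """Average spectral magnitude in `num_bands` bands of a mono buffer.

    A naive DFT over the positive-frequency bins, grouped into
    equal-width bands - a sketch of what an audio-spectrum patch exposes.
    """
    n = len(samples)
    mags = []
    for k in range(n // 2):  # positive-frequency bins only
        s = sum(samples[i] * cmath.exp(-2j * math.pi * k * i / n)
                for i in range(n))
        mags.append(abs(s) / n)
    band_size = max(1, len(mags) // num_bands)
    return [sum(mags[b * band_size:(b + 1) * band_size]) / band_size
            for b in range(num_bands)]
```

Each of the 12 returned levels would then drive the height or scale of one block, so a tone concentrated in one band visibly lifts the matching block.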
Simple animated 3D interactive cube
Complex blended cubes that are animated by an LFO.
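An LFO (low-frequency oscillator) just emits a slowly repeating wave that you wire into a rotation, scale, or blend input. A minimal sine-shaped version, with parameter names assumed to mirror the usual rate/amplitude/offset/phase controls:

```python
import math

def lfo(t, rate=1.0, amplitude=1.0, offset=0.0, phase=0.0):
    """Sine low-frequency oscillator.

    `rate` is in cycles per time unit; `offset` recentres the wave so it
    can drive inputs that must stay positive. Names are assumptions.
    """
    return offset + amplitude * math.sin(2 * math.pi * (rate * t + phase))
```

Feeding `lfo(t)` into several cube transforms at slightly different rates or phases is what produces the drifting, blended motion seen here.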
Takes a live video feed and creates a black and white, blurred kaleidoscope-type effect.
A series of generated numbers that move in a sine wave pattern. The colours react and change depending on tempo.
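The travelling-wave motion comes from giving each item the same sine oscillation with a per-item phase offset. A small sketch (function and parameter names are mine, not Quartz Composer's):

```python
import math

def wave_offsets(count, t, spacing=0.25, amplitude=0.3, rate=1.0):
    """Vertical offset for each of `count` items at time `t`.

    Each item is phase-shifted by `spacing` of a cycle from its
    neighbour, so the row of numbers ripples as a travelling sine wave.
    """
    return [amplitude * math.sin(2 * math.pi * (rate * t + i * spacing))
            for i in range(count)]
```

Sampling this every frame and adding each offset to an item's base y-position gives the ripple; tempo can then be mapped to `rate` or to a colour parameter.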
A series of 3D textured buildings that animate based on sound inputs.
Takes a live video input and creates a kaleidoscope effect that rotates based on mouse movements.
An interactive particle that follows an input, in this case the mouse, to create a comet trail.
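The comet-trail feel comes from the particle easing toward the cursor rather than snapping to it, so it always lags slightly behind. A common way to sketch that is exponential smoothing, applied once per frame (the `smoothing` factor is a made-up tuning value):

```python
def follow(prev, target, smoothing=0.2):
    """Move a particle a fraction of the way toward the target.

    Called each frame, the particle trails the cursor and slows as it
    closes in, which is what leaves the comet-like lag.
    """
    px, py = prev
    tx, ty = target
    return (px + smoothing * (tx - px),
            py + smoothing * (ty - py))
```

Drawing the particle's recent positions with fading opacity turns that lag into the visible trail.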
A set of interactive cubes that move around a central position depending on mouse movement.
Multiple simple video filters – manipulates a live video input, adding blur and other effects in real time.