As part of the project research and development, I’ve been getting to grips with visual programming using Quartz Composer. The images are from a range of experiments in the creation of live interactive visuals, and are not a reflection of the final project outcome. Quartz Composer enables the creation of live visuals driven by various input types: sound, camera, mouse, sensors, Xbox Kinect, and so on. The work below has helped me understand how each of these input types can be used to manipulate different kinds of imagery. The book Learning Quartz Composer by S Buchwald has been a huge help in getting my head round the software and made the whole experience genuinely enjoyable.
Click on an image to expand it and read a more detailed description.
A series of 3D textured buildings that animate based on sound inputs.
An interactive particle that follows an input, in this case the mouse, to create a comet trail.
Splits a microphone input into 12 frequency bands and uses them to animate 12 blocks, alongside an animated cube in the centre that embeds a live video feed.
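For anyone curious about the underlying idea, here is a minimal Python sketch of splitting an audio frame into 12 bands and mapping each band’s level to a block height. This is only an illustration of the general technique; the real work is a visual Quartz Composer patch, so none of the names below come from it.

```python
# Hypothetical sketch: split one audio frame into 12 frequency bands and
# return a 0..1 level per band, which could then scale each block's height.
import numpy as np

def band_levels(samples, sample_rate=44100, bands=12):
    """Return one normalised level per frequency band for a mono audio frame."""
    windowed = samples * np.hanning(len(samples))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))                # magnitude spectrum
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)

    # Logarithmically spaced band edges between ~60 Hz and the Nyquist frequency.
    edges = np.geomspace(60.0, sample_rate / 2.0, bands + 1)

    levels = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        levels.append(spectrum[mask].mean() if mask.any() else 0.0)

    levels = np.array(levels)
    return levels / (levels.max() or 1.0)  # normalise so the loudest band is 1.0
```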
Complex blended cubes that are animated by an LFO.
A rotating sphere and cube.
Takes a live video input and creates a kaleidoscope effect that rotates based on mouse movement.
Deke video effect – takes live video and multiplies it to create an endless loop.
Three 3D textured buildings that animate and react to a live sound feed, utilising three frequency bands.
A set of interactive cubes that move around a central position depending on mouse movement.
Simple multiple video filters – manipulates a live video input, adding blur and other effects in real time.
A simple 3D object that can be manipulated in real time with animated lighting.
Two cubes that follow the mouse input: one sits directly on the cursor and the other orbits it. The orbit size varies with mouse speed.
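As a rough illustration of the orbiting behaviour (again a hypothetical Python sketch of the idea, not the actual patch), the orbit radius can simply be tied to how fast the cursor is moving:

```python
# Hypothetical sketch of the orbiting cube: one cube sits on the cursor,
# the other circles it with a radius that grows with mouse speed.
import math

def orbit_position(mouse_x, mouse_y, prev_x, prev_y, t, dt,
                   base_radius=0.05, speed_gain=0.5, angular_speed=2.0):
    """Return (x, y) for the orbiting cube at time t (dt is the frame time, > 0)."""
    speed = math.hypot(mouse_x - prev_x, mouse_y - prev_y) / dt  # cursor speed
    radius = base_radius + speed_gain * speed                    # faster mouse, wider orbit
    angle = angular_speed * t                                    # steady rotation over time
    return (mouse_x + radius * math.cos(angle),
            mouse_y + radius * math.sin(angle))
```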
A simple animated, interactive 3D cube.
A rain effect generated from mouse input and movement.
MIDI input to output – maps each key of a MIDI keyboard to trigger an image; in this simple example, the images are numbers.
Takes a live video feed and creates a blurred, black-and-white kaleidoscope-style effect.
Sound-reactive visualisation – a live video input mapped onto a sphere, a cube and the background, all reacting to sound.
A series of generated numbers that move in a sine wave pattern. The colours react and change depending on the tempo.