TouchDesigner is a visual development platform that equips you with the tools you need to create stunning realtime projects and rich user experiences. Whether you’re creating interactive media systems, architectural projections, live music visuals, or simply rapid-prototyping your latest creative impulse, TouchDesigner is the platform that can do it all.
DEEP WEB is a monumental immersive audiovisual installation created by light artist Christopher Bauder from White Void and audiovisual composer Robert Henke. Presented in enormous pitch dark indoor spaces, DEEP WEB plunges the audience into a ballet of iridescent kinetic light and surround sound.
The generative, luminous architectural structure weaves 175 motorized spheres and 12 high-power laser systems into a 25-meter-wide and 10-meter-high super-structure, bringing to life a luminous analogy to the nodes and connections of digital networks. Moving up and down, and choreographed and synchronized to an original multi-channel musical score by Robert Henke, the spheres are illuminated by blasts of colourful laser beams, resulting in three-dimensional sculptural light drawings and arrangements in cavernous darkness.
Mathieu Le Sourd (Maotik), a digital artist based in Montreal, focuses his work on the creation of immersive multimedia environments and generative visuals.
Inspired by the natural phenomenon of the tides, the multimedia environment FLOW offers a sensory, poetic, playful and aesthetic experience of the rise and fall of sea levels.
The live video projections, created in TouchDesigner, are generated from input data such as moon position, tide, wind, temperature and motion tracking of the audience.
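Driving generative visuals from heterogeneous environmental data usually starts by normalizing each input into a common 0-1 control range. The sketch below is a minimal illustration of that idea, not Maotik's actual system: the tide model is a toy single-constituent sinusoid (the real M2 lunar period is roughly 12.42 hours), and the parameter names are invented for the example.

```python
import math

def tide_level(t_hours, amplitude=1.0, period=12.42):
    """Toy semidiurnal tide model: one sinusoidal constituent.
    Returns metres above mean sea level."""
    return amplitude * math.sin(2 * math.pi * t_hours / period)

def normalize(value, lo, hi):
    """Clamp and map a raw sensor reading into a 0-1 control parameter."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def visual_params(t_hours, wind_ms, temp_c):
    """Bundle normalized inputs into parameters a generative patch could read.
    The ranges and names here are assumptions for the sketch."""
    return {
        "tide": normalize(tide_level(t_hours), -1.0, 1.0),
        "turbulence": normalize(wind_ms, 0.0, 25.0),
        "warmth": normalize(temp_c, -10.0, 35.0),
    }

print(visual_params(3.105, 12.0, 20.0))
```

In a real patch each of these parameters would be wired to something visible, for example tide height to the water plane and wind to particle turbulence.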
St. Petersburg-based Tundra define themselves as a “collaborative artistic collective” whose members include musicians, sound engineers, programmers and visual artists. Their focus is to create “spaces and experiences by making sound, visuals and emotions work together” in audiovisual performances and interactive installations.
In OUTLINES there were about 260 lasers, each controlled individually via DMX. Tundra used TouchDesigner in a two-way sync with Ableton Live via OSC and MIDI: TouchDesigner triggered scenes in Ableton at random and controlled effects depending on the visual pattern. They also built a kind of audiovisual looper for creating glitchy effects.
To control the lasers separately, Tundra used a pixel-mapping technique in which each laser's dimmer was linked to a corresponding pixel of a texture. They also built a special adjustment app in TouchDesigner to easily assign each laser's dimmer to the right pixel and texture, so all the visual content could be generated in real time with standard TouchDesigner TOPs.
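The pixel-mapping idea can be sketched in a few lines: each laser's DMX address is calibrated to one pixel of a generated texture, and every frame that pixel's brightness is quantized to an 8-bit dimmer level. This is a generic illustration of the technique, not Tundra's actual network; the texture and calibration table are invented for the example.

```python
def texture(width, height, t):
    """Toy generative texture: a horizontal brightness ramp animated by t.
    Values are floats in 0..1, like a TOP's pixel values."""
    return [[((x / width) + t) % 1.0 for x in range(width)]
            for y in range(height)]

# Hypothetical calibration table, as produced by an adjustment app:
# DMX address -> (x, y) pixel that drives that laser's dimmer.
laser_map = {1: (0, 0), 2: (4, 0), 3: (9, 0)}

def dmx_frame(tex, mapping):
    """Sample the mapped pixel for every laser and quantize to DMX 0-255."""
    return {addr: round(tex[y][x] * 255) for addr, (x, y) in mapping.items()}

frame = dmx_frame(texture(10, 1, 0.0), laser_map)
print(frame)  # → {1: 0, 2: 102, 3: 230}
```

The appeal of the approach is that any real-time image operation then becomes a laser choreography for free: animating the texture animates all 260 dimmers at once.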
Bot & Dolly is a design and engineering studio specializing in automation, robotics and filmmaking. Central to their craft are sophisticated industrial robots, unshackled from the factory floor, set free on the likes of movie sets to manipulate lights, actors, cameras and set pieces with the most precise, coordinated and complex motions.
In BOX, TouchDesigner was part of a live performance film in which, for the first time, robots, actors and 3D projection mapping performed in sync on a live set.
“Bot & Dolly’s kinematic projection platform makes it possible to synchronize projection with moving objects. Through large-scale robotics, projection mapping and software engineering, audiences will witness the trompe l’oeil effect pushed to new boundaries. We believe this methodology has powerful applications in a wide range of fields.”
Tarik Abdel-Gawad, Creative Director at Bot & Dolly
We were using TouchDesigner for lots of different things: it was master control for the entire stage, with the lightbox driven by the robot’s timeline. Theo lit the movie using TouchDesigner – 70% of the film’s live action was lit through TouchDesigner.
The movie was run through TouchDesigner as well. Our real-time visualization system, where we could superimpose graphics through the lens view to make sure everything was lining up, was all realised through TouchDesigner. We had on-set repositioning, on-set color correction – a whole visualization system run through TouchDesigner; there was script supervision for the Assistant Director, and all the notes called out during the shot – for example: “grab left”, “look right”, “spaceship coming in” – were on a TouchDesigner timeline that we controlled.
All of the audio playback was slaved to our TouchDesigner system as well so we would be sending out MIDI triggers through TouchDesigner to the sound department.
We designed and built an entire set-wide ecosystem of production around TouchDesigner.
Jeff Linnell, Director of Robotics at Bot & Dolly
The multi-user interactive interface is a combination of iPad apps and an interactive TouchDesigner system, and allows multiple users to interact with data in an aesthetic and engaging way.
The installation offers visitors valuable insight into the best surfing spots. For each wave, several means of communication show the best conditions regarding tide, wind and swell, based on past meteorological information. The characteristics of the seabed, wave and break are also given, and visitors can explore the biosphere (climate, vegetation and species) and the facilities of the area.
TouchDesigner was programmed for the video mapping of the scaled model terrain (1:1000), the real-time generation of the data visualization displayed on the model, and the multitouch-based user interface that controls and lets the user navigate the content of the Interpretation Centre.
Right Brain is a scientific experiment / data-visualization art piece using a 10-second recording of the artist’s brain waves, captured with an EEG sensor.
Ten seconds of Alpha, Beta, Gamma and Theta brain waves were recorded while the artist meditated. The wave channels were analyzed to identify when the right brain, which represents artistic activity, was dominant, isolating the ranges in which each channel was most meditative and imaginative. The sequence was then mapped directly to the motion of points in 3D space, with each point representing the cumulative power of each sample of the captured brain waves.
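The mapping from wave samples to point motion can be sketched as a running sum: each band-power sample adds to a cumulative total, and the totals for three channels become one 3D coordinate per sample. This is our reconstruction of the idea described above under stated assumptions, not the artist's actual patch; the toy channel data stands in for the real recording.

```python
def points_from_waves(alpha, beta, theta):
    """Zip three band-power channels into 3D points whose coordinates are
    the cumulative power of each channel up to that sample."""
    pts, acc = [], [0.0, 0.0, 0.0]
    for a, b, t in zip(alpha, beta, theta):
        acc[0] += a
        acc[1] += b
        acc[2] += t
        pts.append(tuple(acc))
    return pts

# Three toy 4-sample channels standing in for the 10-second recording.
alpha = [0.1, 0.2, 0.1, 0.3]
beta  = [0.4, 0.1, 0.2, 0.1]
theta = [0.2, 0.2, 0.3, 0.1]
print(points_from_waves(alpha, beta, theta))
```

Because the coordinates only ever accumulate, the point cloud traces a monotonic path through space, which is what makes the resulting motion legible as a single gesture.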
In REFACTOR the painterly experience of the visual artist is mapped into computer codes that make possible a painting factory without any presence of the painter. The platform is an interwoven spatial-visual-aural sculpture that redefines space through the arrangement of generated recurring images and sounds. Visual images are immersed in the soundscape and sounds are emerged from visual images. Painting becomes sound and fades away from our sight like sound. Compositional algorithms of the painter are brought to life by compositional algorithms of the sound artist.
Various sensors like Kinect, Leapmotion, EEG, Myo and Wacom were installed in Arabshahi’s studio and for two consecutive weeks captured his hands’ physical movements and the ways in which he used both physical painting brushes and virtual brushes in digital painting.
Analysis of these bundles of data led to the design of TouchDesigner modules imitating Arabshahi’s behaviours. Materials-science charts and the technical behaviour of acrylic paint were then used to model and simulate painting techniques such as impasto, glazing, wet-on-wet and brushwork.
The outcome of these simulations (visual elements and units in the form of lines, shapes and marks) is subsequently assembled into meaningful shapes and patterns via another layer of local compositional algorithms: “a network of modules designed to build a meaningful texture”.
At the last stage, a final compositional algorithm for general composition of the whole picture was added and the network condensed to become a drawing engine. Five of these simulation engines were used to reach the final picture.
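The layered structure described above — mark generators at the bottom, a compositional layer assembling them into a picture — can be sketched as follows. All names and parameters here are illustrative assumptions, not the project's actual modules; the generators are crude stand-ins for the behaviours learned from the sensor data.

```python
import random

def impasto(rng):
    """Thick, short mark: wide stroke, small extent (toy parameters)."""
    return {"kind": "impasto", "width": rng.uniform(8, 14), "length": rng.uniform(5, 20)}

def glaze(rng):
    """Thin translucent wash: narrow stroke, long extent (toy parameters)."""
    return {"kind": "glaze", "width": rng.uniform(1, 3), "length": rng.uniform(40, 90)}

def compose(generators, n, seed=0):
    """Local compositional layer: pick generators and place their marks
    on a normalized 0..1 canvas. Seeded for repeatable output."""
    rng = random.Random(seed)
    marks = []
    for _ in range(n):
        mark = rng.choice(generators)(rng)
        mark["pos"] = (rng.uniform(0, 1), rng.uniform(0, 1))
        marks.append(mark)
    return marks

picture = compose([impasto, glaze], n=12, seed=42)
print(len(picture), picture[0]["kind"])
```

Stacking several such `compose` layers, each consuming the marks of the one below, is one plausible reading of "condensing the network into a drawing engine".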
FLUID STRUCTURE is an immersive interactive installation which explores how an ephemeral and amorphous shape reacts under various stimuli, internal and external. Forces and collisions bend the shape until it breaks, recombining it into new aggregates. The result is an ever changing landscape, mysterious yet familiar.
A dramatic data-like visualization emphasizes the internal structure of the shape and its motion. Using computer vision, the audience is made an integral part of the process, leaving its temporary physical mark, always bound eventually to disappear. The system is driven by a state-of-the-art fluid solver able to process in real time the forces and constraints the shape is subjected to.
The techniques and measurements were critical to realizing authentic results so the team developed an in-house 3D pre-visualization of the room into which they placed the virtual camera and proxy 3D actors to gauge room dimensions, light positions, and the size of props as compared to an actor’s height.
Phenomena Labs developed a real-time live stitcher in TouchDesigner using the lens-distortion matrix from PTGui. Live-stitched video was composited with recorded stitched video to create the 360° environment. This custom setup enabled the director to visualize previous takes on top of what was currently being filmed, using TouchDesigner and an Oculus Rift.
Director Greg Barth was able to get the most precise performance from the actors by seeing the results immediately on-set with the Oculus DK2 headset. This filming technique was also valuable in post-production for looping each individual actor’s musical timing while the director looked around the room.
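Stitchers built on PTGui calibrations typically undistort each camera with a panotools-style radial polynomial: coefficients a, b, c are taken from the project file, and the fourth coefficient is fixed so the unit radius maps to itself. The sketch below shows only that one radial step, as an assumption about the general technique rather than Phenomena Labs' actual stitcher.

```python
def undistort_radius(r, a, b, c):
    """Panotools-style radial correction: map a normalized destination
    radius to the source-image radius. d is chosen so r = 1 is fixed."""
    d = 1.0 - a - b - c
    return r * (a * r**3 + b * r**2 + c * r + d)

# With all coefficients zero the mapping is the identity:
assert undistort_radius(0.5, 0.0, 0.0, 0.0) == 0.5

# A small quadratic coefficient perturbs mid-field radii while the
# image edge (r = 1) stays put:
print(undistort_radius(0.5, 0.0, -0.02, 0.0))
```

A full stitcher would apply this per camera before projecting into the shared equirectangular space; in TouchDesigner the same math would usually live in a GLSL TOP so it runs per pixel on the GPU.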