The live visuals in this session consist of generative clips from our Resgen series for Resolume. Clips were triggered on the fly, and some were audio-reactive versions. Behind the scenes, the built-in Dashboard controls were used to alter clips before they appeared in the Output Monitor. Most clips used minimal blending, with alpha channels separating the layers. The glitchy polygon visual was tweaked live; its size and rotation were adjusted in time with the music.
These live visuals were composed using a mix of hand-triggered effects and BPM synchronization. The top layer of thin cubes had its blur effects and random video playback set to BPM. The pixelized lines on the second layer had their color effects set to BPM, while the geometric explosion and directional flipping were triggered manually by toggling the effect's bypass in improvisation with the music. Lastly, another geometric layer was placed in the background, with some noise effects added for extra texture.
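Setting an effect parameter "to BPM", as above, means driving it from the beat clock rather than from wall time. A minimal sketch of that idea, outside any VJ app (the 120 BPM tempo and the sine LFO shape are illustrative assumptions, not details from the session):

```python
import math

def beat_phase(time_s, bpm=120.0):
    """Phase 0.0-1.0 through the current beat at a given tempo.

    BPM-synced effects (blur amount, color cycling, clip retriggers)
    can be driven from this phase. 120 BPM is an illustrative default,
    not the actual tempo of the track.
    """
    beats = time_s * bpm / 60.0
    return beats % 1.0

def bpm_lfo(time_s, bpm=120.0):
    """Sine LFO completing one cycle per beat, output range 0.0-1.0."""
    return 0.5 - 0.5 * math.cos(2.0 * math.pi * beat_phase(time_s, bpm))

# At 120 BPM a beat lasts 0.5 s, so the phase wraps every half second.
print(beat_phase(0.25))  # → 0.5 (halfway through the first beat)
```

Feeding `bpm_lfo` into a blur radius or hue offset gives the on-beat pulsing described above, while a manual bypass toggle simply gates whether the modulated value is applied.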
The live visuals in this session were layered using a Mask Overlay. The background layer is a moving gradient with graphical glowing lines pulsating audio-reactively to the music track. On top of the entire composition sits a black & white mask overlay, altered with a kaleidoscope effect, containing mirrored objects rotating endlessly.
These recorded live visuals were part of a test using a “track matte” technique that was shared by Resolume user subpixel. The composition itself consisted of 6 layers, though only 4 layers of footage made it into the output. The background tiled pyramid layer is horizontally mirrored, with a slide effect creating the illusion of it scrolling endlessly. The stringy, worm-like shapes act as a track matte, or “cut-out,” for the wavy-lined yellow tunnel contained within them. The glowing, dancing lines are the focal point and are layered on top of the entire composite. This layer is mirrored and contains a brightness adjustment that reacts to audio. All layers have color adjustments to create a more pleasing composition.
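The track-matte idea above boils down to using one layer's luminance as a per-pixel opacity for another. A minimal pure-Python sketch of that cut-out, with toy pixel grids (the values and names are illustrative, not taken from the session or from Resolume's internals):

```python
def apply_track_matte(fill, matte):
    """Use a grayscale matte (values 0.0-1.0) as a cut-out for a fill layer.

    `fill` and `matte` are same-sized 2D grids of pixel values in 0.0-1.0.
    Where the matte is white (1.0) the fill shows through; where it is
    black (0.0) the fill is cut away, revealing the layers beneath.
    """
    return [
        [f * m for f, m in zip(fill_row, matte_row)]
        for fill_row, matte_row in zip(fill, matte)
    ]

# Toy example: a 2x2 "tunnel" fill cut by a worm-shaped matte.
fill  = [[0.8, 0.8], [0.8, 0.8]]
matte = [[1.0, 0.0], [0.5, 1.0]]
print(apply_track_matte(fill, matte))  # → [[0.8, 0.0], [0.4, 0.8]]
```

In the composition described above, the worm shapes play the role of `matte` and the yellow tunnel plays `fill`, so the tunnel is only visible inside the worms.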
A website called funnyordie.com released free green-screen footage of Jean-Claude Van Damme, and the DocOptic team had some fun with it. We composited the footage live using VDMX with a mix of audio-reactive and live triggering. For the background, effects parameters were linked to audio analysis filters to trigger on the hit sounds. We triggered the geometric visuals on each hit using mapped buttons and created a stack of effects to give them a bright, laser-like quality. We used some TV and RGB-shifting effects on the JCVD footage, which was contained in a Media Bin set to advance automatically using the “Next on Movie End” feature of VDMX.
Recorded with Syphon Recorder, one of our recommended VJ apps.
In this pre-recorded live visuals session, 4 layers were used. The background layer is a video mask with effects applied over a seamless loop of geometric cubes; it sits behind a BPM-synchronized foreground visual, with color hue changes on both to match each other. An organic loop is also blended in with lowered opacity, a color shift, and some transformation effects (a 50% mirror with the Lighten blend mode) to change its look. Originally posted on Instagram.
These live visuals were composed of 3 separate layers. The 3D pyramid layer serves as an animated backdrop that provides constant movement in the composition; its color has been shifted using Hue effects. The stringy light particles were added on top (using the Add blend mode), with the layer's brightness/levels parameter linked to the low frequencies of the audio. The image is mirrored vertically using a Lighten blend mode to make it appear duplicated rather than blended. The abstract flower visual has its hue shifted to blend with the other visuals in the composition and is vertically mirrored with the Lighten blend mode as well.
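The blend modes mentioned throughout these sessions reduce to simple per-channel math. A sketch using the standard definitions (channel values normalized to 0.0-1.0; these are the conventional formulas, not Resolume's internal implementation):

```python
def blend_add(bottom, top):
    """Add: sum the channels, clipped at white."""
    return min(bottom + top, 1.0)

def blend_lighten(bottom, top):
    """Lighten: keep whichever channel is brighter."""
    return max(bottom, top)

def blend_multiply(bottom, top):
    """Multiply: always darkens, since both values are <= 1.0."""
    return bottom * top

def blend_darken(bottom, top):
    """Darken: keep whichever channel is darker."""
    return min(bottom, top)

# Why Lighten makes a mirrored layer look duplicated rather than blended:
# over a dark background the brighter copy wins outright instead of
# averaging, so it reads as a second object, not a translucent overlay.
print(blend_lighten(0.1, 0.9))   # → 0.9
print(blend_add(0.6, 0.7))       # → 1.0 (clipped)
print(blend_multiply(0.5, 0.5))  # → 0.25
```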
To better flow with the music, preset audio reactive effects were triggered on cue to change the mood of the composition on the fly. You can see how this technique was used in Resolume by viewing this tutorial.
The live visuals in this session consist of three layers. The top layer has mirrored circular objects, with an audio-reactive opacity slider set to trigger on mid audio frequencies; a Multiply blend mode was used. The second layer is a glowing connection loop, kaleidoscoped and set to react to audio by shifting the cloned geometry toward the center on low-frequency hits. The final layer is a looped background that is influenced by the first layer's blend mode when the audio reaction occurs. Originally posted on Instagram.
3 layers of live visuals: the top two are multicolored animated textures using the Darken (top) and Multiply (middle) blend modes at full opacity. The third layer is a group of mirrored 3D quad shapes that inherits a nice blend of the colors from the two layers above. Originally posted on Instagram.
In this session, the live visuals were made up of 3 layers. The background layer is a looped field of color and texture. On top are two layers with audio-reactive opacity: a group of transforming geometric shapes and a spherical liquid object that disperses and rejoins. Some blur and glow effects were applied to enhance the contrast between layers. Originally posted on Instagram.
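The audio-reactive opacity used in several of these sessions can be sketched as a mapping from a band level to a layer parameter, with a little smoothing so the layer decays after each hit instead of flickering. The `floor`, `gain`, and `decay` values below are illustrative defaults, not settings from any session:

```python
def audio_reactive_opacity(level, floor=0.1, gain=0.8):
    """Map an audio band level (0.0-1.0) to a layer opacity.

    `floor` keeps the layer faintly visible between hits and `gain`
    scales the response; both are hypothetical values for illustration.
    """
    return max(0.0, min(floor + gain * level, 1.0))

class Smoother:
    """Peak-hold envelope: jump up on a hit, fall back gradually."""
    def __init__(self, decay=0.8):
        self.decay = decay
        self.value = 0.0

    def feed(self, level):
        # Take the new level if it is louder, otherwise keep decaying.
        self.value = max(level, self.value * self.decay)
        return self.value

# One low-frequency hit, then silence: opacity spikes and fades.
smooth = Smoother()
for level in [0.0, 1.0, 0.0, 0.0]:
    print(round(audio_reactive_opacity(smooth.feed(level)), 2))
```

In a VJ app the `level` input would come from the built-in audio analysis (e.g. a low-frequency band), and the output would be patched to the layer's opacity slider.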