Real-time sonification for abstract visual patterns in an immersive projection space
Colormatrics builds on my real-time line-based sonification engine developed in 2016. It was newly created as a set of three sonification-driven audiovisual works designed specifically for the Cyclorama of the Cube at the Moss Arts Center, Virginia Tech, in November 2021 (see Figure 1). Colormatrics converts generative visual patterns (from either the left or the right projection screen in the video attached below) into sound and, conversely, visualizes the sonification process (on the opposite screen) in real time.

First, Colormatrics_01 generates additive-synthesis-based ambient or pad-like sounds that fit the immersive atmosphere of the space (see Figure 3). Second, Colormatrics_02 creates additive-synthesis-based beat music following self-changing graphic patterns that grow increasingly complex. These patterns (see Figure 2) comprise a total of 18 steps and loop back to the first once the last step is reached. Lastly, Colormatrics_03 shapes timbre through additive synthesis and spatializes the sound, mapping the vertical and horizontal positions of the graphics onto the 128-speaker system in the Cube (see Figure 4).

The number of sine waves used for additive synthesis is flexible and depends on the image size (1080 here), which also determines the length of the scan line (see Figure 5). In the color-to-sound mapping for each pixel, hue values are mapped to musical pitch, saturation values detune the original pitch by up to -100 cents, and brightness values determine amplitude levels. The visual scores, generated in Processing, are transmitted to Max/MSP via Syphon for sonification (see Figure 6).
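The per-pixel color-to-sound mapping described above can be sketched as follows. This is a minimal illustration in Python, not the actual Processing/Max implementation; the pitch range (`base_midi`, `midi_span`), the scan-line duration, and the function names are assumptions made for the example, while the hue-to-pitch, saturation-to-detune (up to -100 cents), and brightness-to-amplitude roles come from the description above.

```python
import colorsys
import math

A4 = 440.0  # reference tuning for MIDI note 69 (standard, assumed here)

def pixel_to_partial(r, g, b, base_midi=45, midi_span=48):
    """Map one RGB pixel (floats in 0..1) to one sine-wave partial
    (frequency in Hz, amplitude in 0..1).

    hue        -> musical pitch (base_midi..base_midi+midi_span, illustrative range)
    saturation -> detune, down to -100 cents at full saturation
    brightness -> amplitude
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    midi = base_midi + h * midi_span        # hue selects the pitch
    cents = -100.0 * s                      # saturation detunes the pitch downward
    freq = A4 * 2 ** ((midi - 69 + cents / 100.0) / 12.0)
    return freq, v                          # brightness sets the amplitude

def synthesize_scanline(column, sample_rate=44100, dur=0.05):
    """Additively synthesize one scan-line column of pixels: one sine
    partial per pixel, summed and normalized by the partial count."""
    partials = [pixel_to_partial(*px) for px in column]
    n = int(sample_rate * dur)
    norm = max(1, len(partials))
    return [
        sum(a * math.sin(2 * math.pi * f * t / sample_rate)
            for f, a in partials) / norm
        for t in range(n)
    ]
```

In the piece itself one partial per pixel along the 1080-pixel scan line would give 1080 sine waves per column; the sketch accepts any column length, mirroring the statement that the number of sine waves flexibly follows the image size.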
Figure 1: Colormatrics displayed in the Cyclorama
Figure 2: Graphic Score for Colormatrics_02
Figure 3: Graphic Score for Colormatrics_01
Figure 4: Graphic Score for Colormatrics_03
Figure 5: Additive Synth for Colormatrics
Figure 6: Implementation