AN INTERACTIVE INTER-MEDIA COMPOSITION
FOR THE KRONOS QUARTET
ARS ELECTRONICA FESTIVAL 1993, GENETIC ART - ARTIFICIAL LIFE
Natural sounds – produced by the string quartet – are digitized and thereby cloned. Digitization makes it possible to penetrate to the very smallest parts of the sounds: to shred them, reassemble them, superimpose them, add and multiply them. Interventions in the frequency spectra and manipulations of the waveforms allow the sounds to be mutated until they are no longer recognizable. These sound manipulations, together with growth phenomena, in the field of tension with the classical string quartet, form the basis of this compositional work.
The reduction of information to binary code makes it possible to transform genetic structures into musical ones, to control visual events by music, and to create artificial multimedia worlds. Genetic "fingerprints" serve as a reference for musical processes. Computer programs generate harmonic and melodic material from this basic genetic information, which we have then processed further by association.
For example, completely new complex overtone melodies and sound structures develop from micro-tonal re-recordings with the central tone, performed by the Kronos Quartet. We condense these with digital processors; orchestra-like sound proliferations thus grow out of the sound of the string quartet and expand into the room through a multichannel public address system.
Artificial life appears on the musical level in two forms. Firstly, in the composition process itself, where feedback systems are incorporated into the creative act and random number generators influence composed structures, change them, and react back on the initial material. Secondly, during the performance, where the "open system" music is constantly influenced and disturbed from the outside, reacting to the changed conditions. These system disturbances are, on the one hand, consciously controlled interventions performed by Obermaier/Spour with MIDI guitar and keyboard; on the other hand, they are uncontrolled and unpredictable events that arise when the audience reaches into the laser beams. Both types of disturbance influence the musical and visual events directly and in real time.
A further important factor in increasing the complexity of the system arises from the digitization of the string sounds. A computer program analyzes and interprets playing and sound characteristics in real time, and the information gained is fed back as a manipulator of the initial signal.
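The original analysis program is not documented here, but the feedback principle it describes can be sketched in a few lines. The following is an illustrative reconstruction under assumed mappings (loudness deepening a ring modulation, brightness speeding it up), not the software used in the performance:

```python
# Illustrative sketch, not the original program: derive control data from a
# block of digitized string sound and feed it back as a manipulator of the
# same signal. All mappings and value ranges are assumptions.
import math

def analyze_block(samples):
    """Extract simple playing characteristics from one block of samples:
    loudness (RMS) and brightness (zero-crossing rate)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return rms, crossings / len(samples)

def feedback_manipulate(samples):
    """Use the analysis of the signal to modulate the signal itself:
    louder input deepens, brighter input speeds up, a ring modulation
    applied to the very block that was analyzed."""
    rms, zcr = analyze_block(samples)
    depth = min(1.0, rms * 2.0)      # louder playing -> stronger effect
    rate = 1.0 + zcr * 40.0          # brighter playing -> faster modulation
    return [s * (1.0 - depth + depth * math.sin(rate * i))
            for i, s in enumerate(samples)]
```

A silent block passes through unchanged, so the effect only appears once the quartet actually plays, which is the closed loop the text describes.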
In addition, a special sound animation opens up the rigid acoustic space, creating a system of variable sound environments in which listeners must continuously re-adapt their hearing to the changed acoustic conditions, finding themselves in ever different virtual listening positions.
In addition to several lasers and a 6-channel public address system, slide and video projections expand the stage and involve the auditorium.
The lasers are controlled directly by the string instruments. Musical parameters such as pitch, volume or timbre generate and change laser images in real time.
Conversely, the Kronos Quartet can also interrupt laser beams with the violin bow and thereby trigger sounds. A specially developed computer program analyzes and interprets the speed of the movement, the place and the number of interruptions. The laser becomes the "instrument".
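The kind of analysis described — place, count and speed of beam interruptions turned into sound triggers — could look roughly as follows. The event format, pitch and velocity ranges are assumptions for illustration, not the program developed for the piece:

```python
# Hypothetical sketch of beam-interruption analysis: from timestamped
# interruption events, derive place (which beam), count and movement speed,
# and turn them into MIDI-style note triggers. All constants are assumed.
def interpret_interruptions(events):
    """events: list of (time_s, beam_index), one entry per broken beam.
    Returns a list of (pitch, velocity) note triggers."""
    triggers = []
    prev = None
    for t, beam in events:
        pitch = 48 + beam * 2                  # place of interruption -> pitch
        if prev is None:
            velocity = 64                      # first touch: neutral loudness
        else:
            dt = max(t - prev[0], 1e-3)        # guard against zero intervals
            speed = abs(beam - prev[1]) / dt   # beams crossed per second
            velocity = min(127, int(40 + speed * 10))  # faster bow -> louder
        triggers.append((pitch, velocity))
        prev = (t, beam)
    return triggers
```

A slow single dip yields a quiet note, while sweeping the bow quickly across several beams drives the velocity to its ceiling.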
The audience, too, can play with the sounds by reaching into the laser beams. Special sensors distributed throughout the auditorium register the interruptions and convey this information to the computer program. The audience can thus intervene directly in the musical events.
Klaus Obermaier, Robert Spour
MacPanic is a MIDI-controlled computer program for musicians that combines elements of a computer virus with those of a life-simulating computer game. The interface has been deliberately designed so that the connection between new sounds and new life activities remains recognizable for as long as possible. The figures that emerge are a parodistic reminder of the classics among computer games.
At the musician's discretion, a certain "event" can abruptly end the ability to fathom what is happening on the monitor. From then on, an individual life develops which is "chaotic" from a mathematical point of view. Chaos theory is held to be the very branch of mathematics whose formulae come closest to real life. The fathomable "Simulation of Life" thus moves a little closer to "Real Life". But does this help our understanding? Everyone has to decide that for themselves …
SOUND ANIMATION AS A REAL-TIME PROCESS
By graphically resolving the auditorium on the user interface of a computer (Macintosh), the position of a sound and its dimension become definable. This makes it possible to organize different sound movements in three-dimensional space, independent of the "real" position of the instruments, and to create new positions for musical, spatial listening.
The distribution is based on a computer-controlled matrix and six loudspeakers (source clusters) positioned within the room; their selection is governed by the structure and gesture of the music, along four different paths, via sequence-like composed motives. With the software especially designed for this installation, influence and control are possible via MIDI data as well as via real-time sound parameters, and are realized "topographically" on the computer monitor.
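The core of such a distribution matrix is a mapping from a virtual source position to gain factors for the six loudspeakers. The sketch below assumes a hexagonal ring layout and a distance-based, power-normalized gain law; the original software's layout and panning law are not documented here:

```python
# Minimal sketch of matrix-based sound distribution: a virtual source
# position (as placed "topographically" on the screen) becomes gain
# factors for six loudspeakers. Layout and gain law are assumptions.
import math

# Six speakers on a unit circle around the room, 60 degrees apart.
SPEAKERS = [(math.cos(a), math.sin(a))
            for a in (math.pi / 3 * k for k in range(6))]

def matrix_gains(x, y):
    """Distance-based amplitude panning: closer speakers receive more
    level; gains are normalized so the total power stays constant."""
    raw = [1.0 / (0.1 + math.hypot(x - sx, y - sy)) for sx, sy in SPEAKERS]
    norm = math.sqrt(sum(g * g for g in raw))
    return [g / norm for g in raw]
```

Moving the virtual source along a composed path then reduces to calling `matrix_gains` once per control frame and applying the six gains to the matrix outputs.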
A further level of sound influence between the instruments is achieved by a real-time signal processor (NeXT Cube with IMW processor card) and the freely configurable software package MaxAudio. Frequency spectra of the string instruments are analyzed in real time and used, via "organic" structural processes, to mutate sound and time parameters. The resulting communication between the composition and the electronically generated sound-time level brings about a new synthesis of musical action, in situ.
LASER LIGHT IN THE INTERACTIVE PERFORMANCE
Laser light possesses several features that can be realized only with the utmost difficulty, or not at all, by other light sources. Depending on the type of laser, the light beam has a high degree of intensity, low divergence (dispersion) and different colours. The light is coherent, i.e. unlike sunlight or bulb light, the individual light waves vibrate in phase and are, so to speak, emitted in cadence.
In the project "The Cloned Sound", different physical features are used. As the light is very finely concentrated when the beam emerges, it can be deflected via relatively small and consequently fast-moving mirrors. This ability to deflect quickly also makes the projection of images possible: their shape can be scanned so rapidly that the eye receives the impression of closed line outlines. These figures, stored as vector graphics, can be called up in real time and, moreover, manipulated in their characteristics. In this way, musical parameters control picture size, distortion and "writing" speed. Similarly, the speed of metamorphosis between several images can be determined directly by the musicians. The concept for the interactive software "Phonola" is represented in the following graphics, whereby the inputs and outputs in bold print are used in the project "The Cloned Sound".
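The parameter mapping described above — musical input controlling picture size, distortion and "writing" speed of a vector figure — can be sketched as follows. Function names, normalization and the shear-style distortion are assumptions for illustration, not the "Phonola" implementation:

```python
# Hypothetical mapping of musical parameters onto a laser vector figure.
# points: list of (x, y) vertices; pitch, volume, timbre normalized 0..1.
def transform_image(points, pitch, volume, timbre):
    """pitch -> picture size, volume -> (shear) distortion,
    timbre -> "writing" speed in points drawn per frame."""
    size = 0.5 + pitch                                 # higher pitch, larger image
    out = [(size * (x + volume * 0.3 * y), size * y)   # volume shears the figure
           for x, y in points]
    speed = max(1, int(len(points) * (0.25 + 0.75 * timbre)))
    return out, speed
```

Cross-fading the vertex lists of two such figures at a musician-controlled rate would give the metamorphosis between images that the text mentions.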
The coherence of the laser light is a prerequisite for producing interference figures and holograms. Interference figures develop when the laser light is dispersed through suitable materials and superimposed with itself. Interferences even develop in the eye of the observer, giving the laser light a special quality that is perceivable only in direct view. Holographically produced gratings are used for the reproduction of beams and figures.
The high degree of intensity of the laser light, in connection with its low divergence, makes it visible in the air even over greater distances. As a result, an artistic medium is available for representing connections and spatial light surfaces.
Whereas until now we have spoken of the creative possibilities of laser images produced by music, the following describes the reversal of this assignment.
In "The Cloned Sound", laser light surfaces receive an additional function: they serve as an input medium for the generation of sound parameters; specially developed hardware and software track the shadows and reflections of objects immersed in the light.
The software modules for sound control from light data carry the working title "Digitus Package", starting from the situation that one finger ("digitus") is immersed in the light zone. From the light reflex of the finger, the hand, a rod or, for example, a violin bow, data are acquired on place, speed and duration ("to digitize" = to represent in a countable way). From these, new data sets are compiled which can be used to produce and to influence sound. Thus the place can determine the pitch and/or samples, while the speed of immersion determines the volume. Lateral movement in the light surface modulates, for example, the timbre.
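The Digitus mapping just described — place to pitch, immersion speed to volume, lateral movement to timbre — amounts to a small conversion function. The following is an assumed reading with MIDI-style value ranges, not the actual "Digitus Package":

```python
# Illustrative reading of the "Digitus" mapping; all names, scalings and
# the 0..127 MIDI-style output ranges are assumptions.
def digitus_event(place, immersion_speed, lateral_speed):
    """place: 0..1 position along the light surface;
    immersion_speed, lateral_speed: normalized 0..1 rates.
    Returns (pitch, velocity, timbre) as integers 0..127."""
    pitch = int(36 + place * 60)                # place selects the note/sample
    velocity = int(20 + immersion_speed * 107)  # faster dip -> louder
    timbre = int(lateral_speed * 127)           # sideways motion -> filter/timbre
    return pitch, velocity, timbre
```

One such tuple per detected reflex is enough to drive a sampler or synthesizer; with many simultaneous reflexes the same data simply becomes a dense polyphonic texture, as the next paragraph describes.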
As long as the sensors detect only a few reflexes, the Digitus offers many possibilities for sound manipulation; as soon as a large number of fingers (or hands, or rods) reach into the light surface, the sound manipulation subsides in favour of an "unlimited" polyphonic game.
"Phonola" and "Digitus" are a step towards interactive freedom of creation. While in "Digitus" the laser light becomes a sensor with which sound parameters can be controlled, "Phonola" makes every possibility of variation and the entire wealth of nuances of musical expressive media available, controlled in real time, to shape the visual events in an improvisatory manner.
Friedrich Förster, Kurt Walz