William Kentridge exhibition uses spatialised sound from HOLOPHONIX Native software app

Amadeus collaborates on a new artistic work with South African artist, performer, and director William Kentridge to create the immersive sound for his latest multi-sensorial installation.

Amadeus has collaborated on a new artistic work with South African artist, performer, and director William Kentridge to create the immersive sound for his latest multi-sensorial installation, titled More Sweetly Play The Dance and hosted by France’s renowned cultural center Les Champs Libres.

The new Kentridge exhibition, More Sweetly Play The Dance, is designed to be open, festive, poetic, and generous, but also contemplative, gruff, and militant. It evokes the concept of celebration in its broadest sense by uniting artistic practices of every kind. The celebration is treated as a place to rejoice, to resist, to claim social, identity, and cultural rights, and to find catharsis: a place where the ‘spectacular’ and the ‘intimate’ meet.

This work surrounds the spectators with a seemingly endless parade of characters. A dancing procession of cartoons and videos, the 35-meter-long frieze of moving images and sound invites attendees to join a macabre dance while offering the opportunity to reflect on notions of injustice and inhumanity.

“I am interested in political art, meaning: an art of ambiguity, contradiction, unfinished gestures, and random outcomes. An art — and a politics — in which optimism is restrained and nihilism kept at bay. The film itself is part of a series of projects that deal with despair in this era of disappearing utopias,” stated its creator, William Kentridge.

A motley crew of anonymous human-sized silhouettes walks, dances, staggers, and wanders across the room, from one screen to another, advancing in front of a devastated landscape scratched and dirtied with Kentridge’s Indian ink.

Projected in front of the spectators, this parade summons a brass band, dancers, miners, animated objects, and politicians. Shadows and mysteries are also revealed, conjuring ghosts of South Africa’s heavy and tormented past. This disturbing show challenges visitors, who find themselves immersed in a surrealist, life-size environment wrapped in a sound field of multiple reliefs, nuances, and depths.

“Les Champs Libres is among the first French institutions — alongside the Théâtre National de Chaillot, the Comédie Française, and La Scala — to have believed in the artistic and perceptive added value that these new sound spatialisation technologies can provide. As such, in 2016 our auditorium was equipped with an advanced immersive sound localisation system, built around the HOLOPHONIX spatialisation processor developed by the Amadeus company. Given the artistic depth of William Kentridge’s work and its immersive character, both visual and sonic, it seemed obvious to us, in consultation with the creators, to evolve the sound creation towards a dynamic, object-oriented approach,” explained Olivier Le Du, Head of Audiovisual and Digital Exhibitions at Les Champs Libres.

“We wanted to go beyond a stereophonic left/right approach by bringing this work into a fully spatial dimension, where the sounds are in movement, following the silhouettes of this macabre dance, arising in front of, behind, and above the spectators, with nuance and relief,” added Le Du.

“From our first thoughts about the programming of More Sweetly Play The Dance, we felt it was relevant to use a spatialized system. However, we could not deprive our auditorium of its HOLOPHONIX processor for several weeks. We were also aware of a development project at Amadeus concerning a software version of the HOLOPHONIX solution for the macOS platform, which we were able to preview and for which we even suggested various ergonomic improvements,” revealed Dewi Seignard, General Manager of the Les Champs Libres auditorium.

Convinced of the added value offered by these new technologies, the Champs Libres teams then proposed a new electro-acoustic and technological setup to Kentridge’s teams, who showed great interest and agreed to rework the sound dimension of the work. Gavan Eckhart, Kentridge’s sound designer and sound engineer, traveled from South Africa to Rennes to work with Dewi Seignard for a week.

“The original sound diffusion system was based on a 10-loudspeaker setup, including four cone-shaped acoustic horns, mainly used to play ambient sounds. The audio mix consisted of five stereophonic tracks integrated within the eight video files played in the exhibition,” said Seignard.

“Based on the new HOLOPHONIX Native sound spatialisation software and 22 loudspeakers — including 16 Amadeus PMX 5 point-source loudspeakers installed on two levels, four Amadeus ML 12 subwoofers, and four acoustic horns — the new electro-acoustic configuration opened up new possibilities for presenting the original sound recordings, fortunately preserved by Gavan Eckhart, and for imagining a fully object-oriented, three-dimensional mix,” added Seignard. “The system also allowed us to synchronize the sound material perfectly with the progress of the procession, but only where needed, because the correlation between image and sound is not systematic in the artistic proposal.”

“We had to respect the purpose and the progress of the parade, adapt to movements that are not completely linear, and keep a delicate balance between the different sources. Gavan was very satisfied with the final result, repeating several times: ‘It’s magic!’,” recalled Seignard.

After listening to the different spatialisation algorithms available in the HOLOPHONIX Native solution, Eckhart chose an amplitude-based panning algorithm called ‘LBAP’ for the main system, and a stereo panning algorithm for the spatialisation of the four acoustic horns.

LBAP (Layer-Based Amplitude Panning) is an amplitude-based panning algorithm optimized for 3D loudspeaker setups arranged in multiple layers, where the layers do not necessarily all contain the same number of speakers.
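To make the principle concrete, here is a minimal, illustrative sketch of layer-based amplitude panning in Python. It is not the HOLOPHONIX implementation: the two-ring speaker layout, the constant-power pan law, and the names (LAYERS, lbap_gains) are assumptions chosen to keep the example short. The idea is simply that a source’s energy is first split between the two layers bracketing its elevation, then panned between the adjacent speakers of each layer.

```python
"""Minimal sketch of layer-based amplitude panning (LBAP), for illustration only.
The layout and constant-power pan law below are assumptions, not HOLOPHONIX code."""
import math

# Hypothetical layout: two rings of speakers, each described by an elevation
# (degrees) and a list of speaker azimuths (degrees). Ring sizes may differ.
LAYERS = [
    {"elevation": 0.0,  "azimuths": [0, 45, 90, 135, 180, 225, 270, 315]},  # lower ring
    {"elevation": 30.0, "azimuths": [0, 90, 180, 270]},                     # upper ring
]

def _pan_in_ring(azimuths, src_az):
    """Constant-power pan between the two ring speakers adjacent to src_az."""
    n = len(azimuths)
    gains = [0.0] * n
    for i in range(n):
        a0, a1 = azimuths[i], azimuths[(i + 1) % n]
        span = (a1 - a0) % 360 or 360
        offset = (src_az - a0) % 360
        if offset <= span:
            frac = offset / span
            gains[i] += math.cos(frac * math.pi / 2)
            gains[(i + 1) % n] += math.sin(frac * math.pi / 2)
            break
    return gains

def lbap_gains(src_az, src_el):
    """Split energy between the layers bracketing src_el, then pan within each."""
    lower, upper = LAYERS[0], LAYERS[1]
    el = min(max(src_el, lower["elevation"]), upper["elevation"])
    frac = (el - lower["elevation"]) / (upper["elevation"] - lower["elevation"])
    layer_gain = (math.cos(frac * math.pi / 2), math.sin(frac * math.pi / 2))
    # Per-layer speaker gains, ready to apply to the source signal.
    return [[g * x for x in _pan_in_ring(layer["azimuths"], src_az)]
            for g, layer in zip(layer_gain, (lower, upper))]

if __name__ == "__main__":
    # A source at 60° azimuth and 10° elevation: mostly the lower ring, leaning upward.
    for layer, gains in zip(LAYERS, lbap_gains(60.0, 10.0)):
        print(layer["elevation"], [round(g, 3) for g in gains])
```

In this sketch, a source sitting exactly on a layer’s elevation feeds only that ring, and a source between layers feeds both with constant total power, which is the behaviour the multi-layer loudspeaker setup described above relies on.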

“We used the REAPER DAW software to play and mix the audio tracks and control the automation linked to the sound objects. The various movements and motions linked to the sources were written and played back in OSC (Open Sound Control) thanks to the HOLOSCORE plugin developed by Amadeus and available in VST3 format,” said Seignard. “The final spatialized mix from the 22 outputs of the HOLOPHONIX Native software was recorded to a Cymatic Audio uTrack24 player/recorder. These 22 tracks were then imported into the QLab software, running on a Mac Mini, providing playback during the operational phase.”
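For readers unfamiliar with OSC-driven automation, the sketch below shows what this kind of traffic can look like when generated from a script rather than from REAPER and the HOLOSCORE plugin. The address pattern (/track/{id}/xyz), IP address, port, and trajectory are hypothetical placeholders, not the documented HOLOPHONIX API; the example only illustrates streaming timed position messages to a spatialisation processor with the python-osc library.

```python
"""Illustrative sketch only: automating a sound object's position over OSC.
The target address pattern, port, and trajectory are assumptions for the example."""
import math
import time

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

PROCESSOR_IP = "192.168.1.10"   # assumed network address of the spatialisation processor
PROCESSOR_PORT = 4003           # assumed OSC input port

def sweep_object(track_id: int, duration_s: float = 10.0, rate_hz: float = 30.0):
    """Move one sound object along a semicircular arc by streaming timed OSC messages."""
    client = SimpleUDPClient(PROCESSOR_IP, PROCESSOR_PORT)
    steps = int(duration_s * rate_hz)
    for i in range(steps):
        t = i / max(steps - 1, 1)           # 0.0 .. 1.0 over the sweep
        angle = math.pi * t                 # 180-degree arc
        x = math.cos(angle) * 5.0           # metres; coordinate convention is assumed
        y = math.sin(angle) * 5.0
        z = 1.5                             # roughly ear height
        client.send_message(f"/track/{track_id}/xyz", [x, y, z])
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    sweep_object(track_id=1)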

It should be noted that the entire sound installation is based on the AES67 protocol, including virtual sound cards, amplifiers, AD/DA converters, etc. “The video sequences are projected using eight Barco G60-W laser projectors, each delivering 7,000 lumens. Seven projectors are fitted with 0.75–0.95:1 lenses; one is fitted with a 0.95–1.22:1 lens. The synchronization of the video and audio tracks is ensured by a Crestron Electronics automation system,” said Le Du.

This successful collaboration, subtly blending art, multiple techniques, and technology, creates a true ‘augmented’ version of William Kentridge’s work, which will now be presented in this new form in his future exhibitions.

“We were so impressed with the results that we unanimously decided, in particular with video designer Rembrandt Boswijk (INDYVIDEO), that future installations of this work will be spatialized with the HOLOPHONIX solution. The integration of this new technology into our production flow was almost seamless and extremely pleasant to deal with. The result definitely takes this piece to another level of immersion and emotional realism,” concluded Kentridge’s sound designer and engineer Gavan Eckhart.

amadeuslab.com