Avolites media server at the heart of the Rio

Photo: Eliska Sky.

An Avolites Ai media server is at the heart of the spectacular new lighting installation that wraps both towers at the famous Las Vegas landmark, the Rio Hotel & Casino.

The server is being used to map, control, and schedule over 3 miles – and 351,032 pixels – of ‘illuminative possibility’, designed by the creative lighting team of Chris Kuroda and Andrew “Gif” Giffin using Clear LED’s X-Bar 25 mm product, which wraps 360° around the buildings.

Well-known for their work as live music and entertainment lighting designers, Kuroda and Giffin programmed a series of elegant cues, scenes and sequences that run automatically, bringing a unique and organically engineered lighting aesthetic to the architecture of this iconic Vegas hotel and casino.

Ruben Laine, Chief Nerd at the US-based Creative Integration Studio and originally from Australia, was asked to devise a control solution that treated video as lighting.

New look Rio
This involved outputting lighting data in a video-centric format, giving micro-manageable levels of detail for each vertical LED strip, some of which are over 4,000 pixels long.
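
A minimal sketch of that idea, assuming each vertical strip is treated as one column of a video-style canvas; the strip count, strip lengths and indexing below are illustrative placeholders, not the real Rio mapping:

```python
import numpy as np

# Treat the facade as a video-style canvas: rows are pixel positions along a
# strip, columns are strips, so every frame of content addresses every pixel.
N_STRIPS = 200        # hypothetical number of vertical strips
MAX_PIXELS = 4096     # the longest strips are reported as over 4,000 pixels

def blank_frame() -> np.ndarray:
    """One RGB frame for the whole installation."""
    return np.zeros((MAX_PIXELS, N_STRIPS, 3), dtype=np.uint8)

def strip_column(frame: np.ndarray, strip_index: int, length: int) -> np.ndarray:
    """View of the pixels belonging to one physical strip; shorter strips
    simply use the top `length` rows of their column."""
    return frame[:length, strip_index, :]

# Example: drive a hypothetical 3,800-pixel strip with a vertical intensity
# gradient - exactly the kind of per-strip detail described above.
frame = blank_frame()
gradient = np.linspace(0, 255, 3800).astype(np.uint8)
strip_column(frame, 42, 3800)[:] = gradient[:, None]   # same value on R, G, B
```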

The Rio’s lighting scheme is part of an ongoing multi-million-dollar refit to the resort being managed by Dreamscape Companies. The new LEDs replace 3.6 miles of old neon that had been in residence since the 1990s.

The overall project is the brainchild of Marty Millman, VP of Development and Construction at Dreamscape. He very specifically didn’t want new lighting that resembled any other generic or clinically pixel-mapped building installation fed with video content. He wanted something unique, different and stand-out.

A major Phish fan for many years, Millman reached out to the band’s long-term lighting creative team of Kuroda and Giffin, challenging them to produce the specific look he envisioned for The Rio, having been inspired by their lighting designs for the band.

Their work for the band frequently uses linear stage/theatre-style light sources – like Robe Tetra2s and TetraXs – as a dynamic structural base for their familiar rig of automated trusses, simultaneously adding another layer of kinetic movement.

Kuroda and Giffin have programmed hundreds of thousands of lighting cues for the assorted Phish tours and projects, using lighting consoles and effects engines, which give the animation a special, crisp and clearly defined appearance.

This was exactly what Millman wanted, and a workflow that is second nature to Kuroda and Giffin.

Video control for lighting art
Kuroda and Giffin were happy to take on the mission but quickly realised that the large number of pixels involved meant that DMX driven directly from a lighting console was not an option.

Enter Laine, who immediately grasped that they needed ‘video playback’ that did not involve video content.

Using an Avolites media server running Ai was one of Laine’s first thoughts.
“I have always been an Ai guy,” he commented, quickly moving to spec this product for the task, in combination with the powerful real-time graphics rendering of Notch.

Laine, who has used Avolites Ai media servers for over 10 years, collaborated with the Avolites team in the UK to add a new function to the Ai server’s ‘Follow-on’ actions that allows for “randomized specificity” as a custom play mode. This manages all of the media, control and scheduling through a Notch block that Laine built, giving lighting control across the entire surface of the buildings.

The philosophy of randomisation
This custom scheduling – allowing randomisation – enables the playback of a long ‘base look’ followed by a series of random sequences before returning to the base look again, then repeating the process, so the same series of sequences is never repeated and the display never becomes predictable.

The programmed lighting scenes are divided into two categories: “base looks” that are subtly animated, and “shows” that are faster, bolder, and higher contrast.

A ‘base look’ plays for five minutes, followed by a one-minute show – all randomly selected – followed again by another randomly selected base look, then another one-minute show.
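
A rough sketch of that behaviour, assuming illustrative clip names, pool sizes and a simple no-immediate-repeat rule; this is not the actual Ai ‘Follow-on’ implementation:

```python
import random

BASE_LOOKS = [f"base_{i:02d}" for i in range(60)]   # hypothetical clip names
SHOWS = [f"show_{i:02d}" for i in range(60)]

def pick(pool, previous):
    """Random choice from the pool, never returning the clip just played."""
    choice = random.choice(pool)
    while choice == previous and len(pool) > 1:
        choice = random.choice(pool)
    return choice

def schedule(cycles):
    """Yield (clip, duration_in_seconds) pairs: base look, show, base look, show..."""
    last_base = last_show = None
    for _ in range(cycles):
        last_base = pick(BASE_LOOKS, last_base)
        yield last_base, 300          # five-minute base look
        last_show = pick(SHOWS, last_show)
        yield last_show, 60           # one-minute show

for clip, seconds in schedule(cycles=3):
    print(f"play {clip} for {seconds}s")
```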

“Being able to dictate a range of files to each clip, from which it would pick randomly for its next clip, was amazing,” Kuroda explained. The lighting programming itself was loosely timed on a clip-by-clip basis, with no two clips the same length, which made it impossible to rely on fixed scheduling tools like the Calendar or Macro Script.

Kuroda, Giffin, and Laine were all impressed with the input from Avolites and, in particular, with Ai developers Simone Donadini and Terry Clark.

They started lighting programming with the linear elements in Notch, treating each vertical line as its own layer or canvas, complete with dedicated intensity controls and a “form” to allow for solids, gradients, or patterns, plus full transform controls like position and scale, as well as different colour and alpha controls.

This meant that a single layer could manoeuvre complex gradients using one element, and these layers were then stacked.

A second, independently controlled set of layers allowed Giffin to get “really funky” with the lighting programming: 20 two-dimensional ‘super layers’, rendered underneath the 200 linear layers and covering the entire array, add similar but more complex controls and effects.

Finally, animatable masks allowed the individual architectural segments and features of the buildings to be highlighted, preserving the Rio’s architectural identity.
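
An illustrative data model for that layer structure, assuming simplified per-strip rendering and alpha blending; this is a sketch, not the actual Notch block Laine built:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional
import numpy as np

@dataclass
class LinearLayer:
    """One vertical strip's layer: a 'form' (solid, gradient or pattern),
    plus intensity, colour, alpha and transform controls."""
    form: Callable[[int], np.ndarray]   # per-pixel values, 0..1, along the strip
    colour: np.ndarray                  # RGB, 0..1
    intensity: float = 1.0
    alpha: float = 1.0
    position: float = 0.0               # transform controls (not applied in
    scale: float = 1.0                  # this simplified sketch)

    def render(self, length: int):
        shape = self.form(length)                           # shape: (length,)
        rgb = shape[:, None] * self.colour * self.intensity
        return np.clip(rgb, 0.0, 1.0), self.alpha

def composite(layers: List[LinearLayer], length: int,
              mask: Optional[np.ndarray] = None) -> np.ndarray:
    """Stack layers back-to-front with simple alpha blending; an animatable
    mask (0..1 per pixel) confines the result to one building feature."""
    out = np.zeros((length, 3))
    for layer in layers:
        rgb, alpha = layer.render(length)
        out = out * (1.0 - alpha) + rgb * alpha
    if mask is not None:
        out *= mask[:, None]
    return out

# Example: a solid red base with a semi-transparent white gradient on top.
solid = LinearLayer(form=lambda n: np.ones(n), colour=np.array([1.0, 0.0, 0.0]))
grad = LinearLayer(form=lambda n: np.linspace(0.0, 1.0, n),
                   colour=np.array([1.0, 1.0, 1.0]), alpha=0.5)
pixels = composite([solid, grad], length=4000)
```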

“We wanted to achieve this without the building getting lost in the glamour and glitz of its shiny, new technicolour veil,” explained Kuroda, adding that “the genius” of this control methodology was that “it allowed our familiar tools and lighting programming workflow to be used during the creative process.”

Lighting control for video art
Ideas were discussed just as if they were standard lighting cues, then created and manipulated on the fly using a lighting console and lighting-console logic, drawing on many of their concert lighting tricks: colour wipes across the whole canvas, narrow bands of white leading in a new colour from “rocket tips”, or shapes created in the negative space and animated into numerous forms.
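
As a sketch of one of those tricks, assuming our own canvas size and band width rather than the production values, a colour wipe led by a narrow band of white could be generated like this:

```python
import numpy as np

def colour_wipe(t: float, new_colour, height: int = 4096, width: int = 200,
                band: int = 40) -> np.ndarray:
    """t runs from 0 to 1 over the wipe: pixels behind the edge show the new
    colour, a `band`-pixel white strip leads the edge (the "rocket tip"), and
    everything ahead of it is left untouched (black here)."""
    frame = np.zeros((height, width, 3))
    edge = int(t * height)
    frame[:edge] = np.asarray(new_colour, dtype=float)   # new colour fills in behind
    frame[edge:edge + band] = 1.0                        # narrow white leading band
    return frame

# Example: half-way through a wipe into a cool blue.
frame = colour_wipe(t=0.5, new_colour=(0.0, 0.4, 1.0))
```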

With around 50 or 60 slow-moving looks and another 50 or 60 fast-moving ones, they needed a server that would pick these to play randomly over the course of a year, so that nothing was repeated regularly.

This Notch and Q Series / Ai combination also effectively crunches around 2,000 universes of pixel data into 8 DMX universes of externally exposed ArtNet channels. Each sequence is played back from the console over ArtNet, recorded into Notch, then rendered at 60 frames per second for the smoothest possible motion across every pixel on The Rio’s facade.
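
As a rough sanity check on that reduction (the pixel and universe figures come from the article; assuming three DMX channels per RGB pixel is our own simplification):

```python
CHANNELS_PER_UNIVERSE = 512                     # one DMX universe
raw_channels = 351_032 * 3                      # every facade pixel streamed as RGB
raw_universes = raw_channels / CHANNELS_PER_UNIVERSE
exposed_channels = 8 * CHANNELS_PER_UNIVERSE    # the externally exposed ArtNet control channels

print(round(raw_universes), exposed_channels)   # roughly 2057 universes vs 4096 channels
```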

The Q Series media server outputs the rendered clips to Clear LED’s signal processors, and the data is then pushed down a few miles of fibre optic cable. “Q Series / Ai was, without a doubt, a crucial part of this adventure. From our original concept of running the show as live Notch blocks, through every creative, technical, and executive challenge, to the final execution. Using Q Series / Ai allowed us to effectively map the building in just a couple of hours,” commented Laine.

The new Rio lighting scheme is helping to create heightened energy and buzz around a classic Las Vegas location. Apart from thrilling visitors, it is illustrating new technical possibilities in the scale and imagination of integrated lighting and video control through a dynamic combination of Avolites Q Series / Ai and Notch.

www.avolites.com