An Avolites Ai media server is at the heart of the jaw-droppingly spectacular new lighting installation that wraps both towers at famous Las Vegas landmark, the Rio Hotel & Casino.
It is being used to map, help control and schedule over 3 miles – and 351,032 pixels – of ‘illuminative possibility’, designed by the creative lighting team of Chris Kuroda and Andrew “Gif” Giffin and built from Clear LED’s X-Bar 25 mm product, which wraps 360 degrees around the buildings.
Well-known for their work as live music and entertainment lighting designers, Chris and Gif programmed a series of beautifully elegant cues, scenes and sequences that run automatically, bringing a unique and organically engineered lighting aesthetic to the architecture of this iconic Vegas hotel and casino.
Ruben Laine, Chief Nerd at the Australia- and US-based Creative Integration Studio, was asked to devise a control solution that treated video as lighting.
This involved outputting lighting in a video-centric format, enabling micro-manageable levels of detail to be accessed for each vertical LED strip, some of them over 4,000 pixels long.

New Look Rio
The Rio’s lighting scheme is part of an ongoing multi-million-dollar refit to the resort being managed by Dreamscape Companies. The new LEDs replace 3.6 miles of old neon that had been in residence since the 1990s.
The overall project is the brainchild of Marty Millman, VP of Development and Construction at Dreamscape. He very specifically didn’t want new lighting that resembled any other generic or clinically pixel-mapped building installation fed with video content! He wanted something unique, different and stand-out.
A major Phish fan for many years, Marty reached out to the band’s long-term lighting creative team of Chris and Gif … challenging them to produce the specific look he envisioned for The Rio, having been inspired by their lighting designs for the band. Their work for Phish frequently uses linear stage/theatre-style light sources – like Robe Tetra2s and TetraXs – as a dynamic structural base to their familiar rig of automated trusses, simultaneously adding another layer of kinetic movement.
Chris and Gif have programmed hundreds of thousands of lighting cues for the assorted Phish tours and projects, using lighting consoles and effects engines, which give the animation a special crisp and clearly defined appearance.
This was exactly what Marty wanted, and a workflow that is second nature to Chris and Gif.
Video Control for Lighting Art
Chris and Gif were delighted to take on the mission but quickly realized that the ENORMOUS number of pixels involved … meant that DMX driven directly from a lighting console was not an option.
Enter Ruben, who immediately grasped that they needed ‘video playback’ … that did not involve video content!
Using the Avolites media server and Ai was one of Ruben’s first thoughts.
“I have always been an Ai guy,” he commented, quickly moving to spec this product for the task, in combination with the powerful real-time graphics rendering of Notch.
Ruben, who has used Avolites Ai media servers for over 10 years, collaborated with the Avolites team in the UK to add a new function to the Ai server’s ‘Follow-on’ actions that allows for “randomized specificity” as a custom play mode. This manages all the media, control and scheduling via a Notch block that Ruben built, giving lighting control across the entire surface of the buildings.
The Philosophy of Randomization
This custom scheduling – allowing randomization – enables the playback of a long ‘base look’ followed by a series of random sequences before returning to the base look again … and repeating the process, which also means that the same series of sequences will never get repeated and become predictable.
The programmed lighting scenes are divided into two categories, “base looks” that are subtly animated, and “shows” that are faster, bolder, and higher contrast.
A ‘base look’ plays for five minutes, followed by a one-minute show – all randomly selected – followed again by another randomly selected base look, then another one-minute show.
“Being able to dictate a range of files to each clip, from which it would pick randomly for its next clip, was amazing,” Chris explained. The lighting programming itself was loosely timed on a clip-by-clip basis, with no two clips exactly the same length, which made it impossible to use other tools such as the Calendar or Macro Script.
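To make that playback logic concrete, here is a minimal Python sketch of an alternating, randomized scheduler along the lines described above. The clip names, pool sizes and helper functions are hypothetical illustrations rather than Ai’s actual ‘Follow-on’ feature; the sketch simply shows one way a five-minute base look can be followed by a randomly chosen one-minute show without the order becoming predictable.

```python
import random

# Hypothetical clip pools: roughly 50 slow "base looks" and 50 faster "shows".
BASE_LOOKS = [f"base_{i:02d}" for i in range(50)]
SHOWS = [f"show_{i:02d}" for i in range(50)]

BASE_SECONDS = 5 * 60  # a base look holds for five minutes
SHOW_SECONDS = 1 * 60  # a show runs for one minute


def pick(pool, recent, memory=10):
    """Pick a random clip, skipping anything played in the last `memory` picks."""
    candidates = [c for c in pool if c not in recent[-memory:]] or pool
    choice = random.choice(candidates)
    recent.append(choice)
    return choice


def run_schedule(play_clip, cycles=None):
    """Alternate randomly chosen base looks and shows: base -> show -> base -> ..."""
    recent_bases, recent_shows = [], []
    n = 0
    while cycles is None or n < cycles:
        play_clip(pick(BASE_LOOKS, recent_bases), BASE_SECONDS)
        play_clip(pick(SHOWS, recent_shows), SHOW_SECONDS)
        n += 1


if __name__ == "__main__":
    # Stand-in for the media server's clip trigger: log instead of actually waiting.
    run_schedule(lambda clip, secs: print(f"play {clip} for {secs}s"), cycles=3)
```

The small recent-clip memory window is only one way to keep a look out of rotation for a while; the real “randomized specificity” play mode may weight or constrain its choices quite differently.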
Chris, Gif and Ruben were all super impressed with the input from Avolites and in particular with Ai developers Simone Donadini and Terry Clark.
They started lighting programming with the linear elements in Notch, treating each vertical line as its own layer or canvas, complete with dedicated intensity controls and a “form” to allow for solids, gradients, or patterns, plus full transform controls like position and scale, as well as different color and alpha controls.
This meant that a single layer could manipulate complex gradients using one element, and these layers were then stacked.
A second, independently controlled layer on each strip allowed Gif to get “really funky” with the lighting programming by stacking these controls, while a set of 20 ‘super layers’ rendered underneath the 200 linear layers to cover the entire array, with similar but more complex controls and effects.
Finally, animatable masks allowed individual architectural segments and features of the buildings to be highlighted, which maintained the Rio’s architectural identity.
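As a rough sketch of that layer model, the Python below describes one possible structure: a ‘linear layer’ per vertical strip with intensity, form, transform, color and alpha, a set of wider ‘super layers’ rendered underneath, and animatable masks for the architectural segments. All class names, fields and counts are assumptions for illustration and not the actual Notch block Ruben built.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple


class Form(Enum):
    SOLID = "solid"        # flat fill along the strip
    GRADIENT = "gradient"  # ramp between colors
    PATTERN = "pattern"    # repeating texture


@dataclass
class Transform:
    position: float = 0.0  # offset along the strip (0..1)
    scale: float = 1.0     # stretch or compress the form


@dataclass
class LinearLayer:
    """One vertical LED strip treated as its own layer/canvas."""
    strip_id: int
    intensity: float = 1.0
    form: Form = Form.SOLID
    transform: Transform = field(default_factory=Transform)
    color: Tuple[float, float, float] = (1.0, 1.0, 1.0)
    alpha: float = 1.0


@dataclass
class SuperLayer:
    """One of a handful of wide layers rendered underneath the linear layers."""
    intensity: float = 1.0
    color: Tuple[float, float, float] = (1.0, 1.0, 1.0)
    alpha: float = 1.0


@dataclass
class Mask:
    """Animatable mask isolating an architectural segment of the towers."""
    name: str
    strips: List[int]  # which vertical strips the mask covers
    opacity: float = 1.0


@dataclass
class Canvas:
    linear_layers: List[LinearLayer]
    super_layers: List[SuperLayer]
    masks: List[Mask]


# Example: 200 strips, 20 super layers and one hypothetical mask.
canvas = Canvas(
    linear_layers=[LinearLayer(strip_id=i) for i in range(200)],
    super_layers=[SuperLayer() for _ in range(20)],
    masks=[Mask(name="tower_crown", strips=list(range(40)))],
)
```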
“We wanted to achieve this without the building getting lost in the glamour and glitz of its shiny, new technicolor veil,” explained Chris, adding that “the genius” of this control methodology “was that it allowed our familiar tools and lighting programming workflow to be used during the creative process.”
Lighting Control for Video Art

Ideas were discussed just as if they were standard lighting cues, created and manipulated on the fly using a lighting console and lighting console logic, and drawing on many of their concert lighting tricks: color wipes across the whole canvas, narrow bands of white leading in a new color from “rocket tips”, or shapes created from the negative space and animated into numerous forms.
With around 50 or 60 slow-moving looks and another 50 or 60 fast-moving ones, they needed a server that would pick these to play randomly over the course of a year, so that nothing was repeated regularly.
This Notch and Q Series / Ai combination also effectively crunches 2,000 universes of pixel data down to 8 DMX universes of externally exposed ArtNet control channels. Each sequence is played back from the console over ArtNet, recorded into Notch, then rendered at 60 frames per second for the smoothest possible motion across every pixel on The Rio’s facade.
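For a sense of scale, the back-of-the-envelope arithmetic below is a sketch assuming 3 RGB channels per pixel and the standard 512 channels per DMX universe; it shows why driving every pixel directly from a console was never realistic, and how compact the exposed control footprint is by comparison.

```python
PIXELS = 351_032             # total pixel count quoted for the installation
CHANNELS_PER_PIXEL = 3       # assuming plain RGB pixels
CHANNELS_PER_UNIVERSE = 512  # standard DMX512 universe size

raw_channels = PIXELS * CHANNELS_PER_PIXEL
raw_universes = -(-raw_channels // CHANNELS_PER_UNIVERSE)  # ceiling division

exposed_universes = 8        # control channels exposed over ArtNet
exposed_channels = exposed_universes * CHANNELS_PER_UNIVERSE

print(f"Raw pixel data: {raw_channels:,} channels, ~{raw_universes:,} universes")
print(f"Exposed control: {exposed_channels:,} channels in {exposed_universes} universes")
# Raw pixel data: 1,053,096 channels, ~2,057 universes
# Exposed control: 4,096 channels in 8 universes
```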
The Q Series media server outputs the rendered clips to CLEAR LED’s signal processors, and the signal is then pushed down a few miles of fiber optic cable. “Q Series / Ai was without a doubt a crucial part of this adventure. From our original concept of running the show as live Notch blocks, through every creative, technical, and executive challenge, to the final execution. Using Q Series / Ai allowed us to effectively map the building in just a couple of hours,” commented Ruben, adding that they probably spent more time driving around looking for a parking spot with a good view … than actually doing the mapping!
The new Rio lighting scheme is helping to create heightened energy and buzz around a classic Las Vegas location. Apart from thrilling visitors who can marvel at its mesmeric beauty, it is illustrating new technical possibilities in the scale and imagination of integrated lighting and video control through a dynamic combination of Avolites Q Series / Ai and Notch.
Chris, Gif and Ruben are all super-excited to be part of this eye-catching and trail-blazing installation.