Meet the Team Behind Katy Perry’s Projection Mapped Spectacle!

The 118.5 million viewers tuning into America’s game on Sunday represented the largest audience ever to watch a projection mapping show. Katy Perry’s go-to projection mapping team, Lightborne, took on the pinnacle of all projection mapping spectacles when they projected onto the University of Phoenix Stadium for Perry’s incredible halftime performance. For those who missed it, see the video below or read more here:

The show featured an LED-based stage display surrounded by striking projection mapping effects (made possible by a white tarp around the center stage). Of course, the tropical beach party with walking trees, wildlife, stage dancers, and the brilliant shooting-star climax made the show truly unforgettable. Large stadium shows with projection mapping are becoming increasingly popular: see the Halifax Mooseheads, House of Mamba, and Cleveland Cavaliers.

Baz Halpin, Katy Perry’s show director, originally got in touch with Lightborne in October 2014 to help create the animations. Lightborne is no stranger to the big stage, having previously created visuals for Deadmau5, Kanye West, and Katy Perry’s numerous touring shows. As video content director Ben Nicholson said:

“Doing the Super Bowl was a simple extension of the team Katy has put together over the years that she trusts. She is very involved, very savvy and watches every frame of everything.”

[Image: Super Bowl teaser]

The key players in the show’s production included show director Hamish Hamilton, producer Ricky Kirshner, production designer Bruce Rodgers, lighting designer Bob Barnhart, and server operator Jason Rudolph. With such a complete team in place, Nicholson was able to focus on designing the content:

“The ‘Dark Horse’ perspective mapping section is obviously super fun, and represents the close collaboration with all aspects of the show,” he explains. “It was fun in rehearsals to watch the playback and see which dancers ‘fell’ into the abyss and then respond by adjusting the video to give them places to stand.”

On Nicholson’s greatest challenge in creating the show:

“With perspective mapping, if your rendered camera view is off, the gag won’t work. The surface was huge, and there was so little time to rehearse in the actual environment. Preparation and pre-visualization really made everything work,” says Nicholson.
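Nicholson’s point about the camera view can be illustrated with a simple pinhole camera model: a perspective-mapped (anamorphic) illusion only resolves correctly from one viewpoint, so the rendered camera must match the real broadcast camera. A minimal sketch of that idea (my own illustration with made-up numbers, not Lightborne’s actual pipeline):

```python
import numpy as np

def project_point(p, cam, f=1.0):
    """Pinhole projection of world point p for a camera at cam looking along +Z.

    Simplified (no rotation): image coordinates are (f*X/Z, f*Y/Z) in camera space.
    """
    rel = np.asarray(p, dtype=float) - np.asarray(cam, dtype=float)
    return f * rel[:2] / rel[2]

# A point painted on the flat stage surface, meant to read as the edge of a 3D ledge.
point = [2.0, 0.0, 10.0]

hero = project_point(point, cam=[0.0, 0.0, 0.0])   # the planned "hero" camera
moved = project_point(point, cam=[1.0, 0.0, 0.0])  # camera nudged 1 m sideways

print(hero)   # [0.2 0. ]
print(moved)  # [0.1 0. ] -- the painted illusion no longer lines up
```

Every painted point shifts by a different amount on screen as the camera moves, which is why the “abyss” effect only holds from the single pre-plotted camera position.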

The key piece of software and hardware used in the show was the D3 Designer and Media Server, which lets designers play full-resolution video files in the projected environment:

[Images: D3 pre-visualization renders of the stadium, courtesy of Lightborne]

“In D3 we plotted the hero camera position for the perspective mapping a month ahead of time, based on a CAD model of the stadium, NBC’s camera plot, and Bruce Rodgers’ detailed location scout photos,” Nicholson says. “It was lots of super-pro people communicating well, and a great piece of software.”

From the looks of it, Cinema 4D was also used in their video animation and 3D rendering pipeline:

[Images: Cinema 4D scenes from Lightborne]

To learn more, visit Lightborne’s website or contact Ben Nicholson directly.

[Source: Images from Lightborne]
[Source: TCP]

Rajinder Sodhi
I design, research, and tinker with new sensing and display techniques (like projection mapping) to make our interactions with computers feel more natural! I’ve worked for Walt Disney Imagineering and Microsoft Research and made things along the way like Aireal and RoomAlive. Check ’em out at rsodhi.com, @lumenra.
  • Jean Pierre

    Do you have any idea how they take the Cinema4D module and use it as a map?