The 118.5 million viewers tuning into America’s game on Sunday represented the largest audience ever to watch a projection mapping show. Katy Perry’s go-to projection mapping team, Lightborne, took on the pinnacle of all projection mapping spectacles when they decided to project onto University of Phoenix Stadium for Perry’s incredible halftime performance. For those of you who missed it, see the video below or read more here:
The show featured an LED-based stage display surrounded by amazing projection mapping effects (made possible by a white tarp around the center stage). Of course, the tropical beach party with walking trees, wildlife, stage dancers, and the brilliant shooting star climax made the show truly unforgettable. Large stadium shows with projection mapping are becoming increasingly popular: see the Halifax Mooseheads, House of Mamba and Cleveland Cavaliers.
Baz Halpin, Katy Perry’s show director, originally got in touch with Lightborne in October 2014 to help create the animations. Lightborne is no stranger to the big stage, having previously created visuals for Deadmau5, Kanye West and Katy Perry’s numerous touring shows. As video content director Ben Nicholson said:
“Doing the Super Bowl was a simple extension of the team Katy has put together over the years that she trusts. She is very involved, very savvy and watches every frame of everything.”
The key players involved in the show’s production included show director Hamish Hamilton, producer Ricky Kirshner, production designer Bruce Rodgers, lighting designer Bob Barnhart, and server operator Jason Rudolph. With such a complete team in place, Nicholson was able to focus on designing the content:
“The ‘Dark Horse’ perspective mapping section is obviously super fun, and represents the close collaboration with all aspects of the show,” he explains. “It was fun in rehearsals to watch the playback and see which dancers ‘fell’ into the abyss and then respond by adjusting the video to give them places to stand.”
On Nicholson’s greatest challenge in creating the show:
“With perspective mapping, if you are off in your rendered camera view, the gag won’t work. The surface was huge and there was so little time to rehearse in the actual environment. Preparation and pre-visualization really made everything work,” says Nicholson.
The key software and hardware used in the show was the D3 Designer and Media Server, which lets designers play back full-resolution video files in the projected environment:
“In D3 we plotted the hero camera position for the perspective mapping a month ahead of time, based on a CAD model of the stadium, NBC’s camera plot, and Bruce Rodgers’ detailed location scout photos,” Nicholson says. “It was lots of super-pro people communicating well, and a great piece of software.”
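Lightborne’s actual pipeline lives inside D3, but the anamorphic trick behind the “hero camera” is simple enough to sketch. The Python snippet below (an illustration only; the camera position and point values are hypothetical, not taken from the show) flattens a virtual 3D point onto the floor by intersecting the camera-to-point ray with the floor plane, which is why the content only reads as three-dimensional from that one plotted viewpoint.

```python
import numpy as np

def flatten_to_floor(camera_pos, point_3d, floor_z=0.0):
    """Project a virtual 3D point onto the floor plane (z = floor_z)
    along the ray from the hero camera through that point, so the
    flattened artwork lines up with the intended 3D shape when seen
    from the camera position."""
    cam = np.asarray(camera_pos, dtype=float)
    p = np.asarray(point_3d, dtype=float)
    direction = p - cam
    if np.isclose(direction[2], 0.0):
        raise ValueError("Ray is parallel to the floor; point cannot be flattened.")
    t = (floor_z - cam[2]) / direction[2]   # ray parameter where it hits the floor
    return cam + t * direction              # floor coordinates to draw the point at

# Hypothetical example: a hero camera 40 m up in the stands, and a point
# meant to appear floating 3 m above the field.
hero_camera = (0.0, -80.0, 40.0)
virtual_point = (5.0, 0.0, 3.0)
print(flatten_to_floor(hero_camera, virtual_point))
```

From any other seat in the stadium the flattened artwork looks stretched, which is exactly why Nicholson stresses that getting the rendered camera view right ahead of time was critical.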
From the looks of it, Cinema 4D was also used in their video animation and 3D rendering pipeline.
To learn more, visit Lightborne’s website or contact Ben Nicholson directly.
[Source: Images from Lightborne]
[Source: TCP]