To promote one of its latest series, Nickelodeon has employed some clever technology, using a combination of VR and mixed reality to allow journalists to conduct real-time interviews with the characters from a cartoon show, right inside their world.
Editor's note: This is a promotional case study, and we're not being paid to publish it, but we're running it because it's well written and has some pretty interesting background in it.
Your eyes drift off toward the horizon. In the distance, two figures stand atop a concrete block wall, ramrod straight, watching you carefully. Both of them are green. It’s obvious who they are. You’ve seen them a thousand times before, but never in person. It’s the Teenage Mutant Ninja Turtles, or at least two of them. You take a few steps forward and wave. Donatello waves back. Michelangelo smiles and turns to show off that stunning Turtle profile of his. They speak to you.
"Hey," Donnie says. "How are you doing? I hear you've got some questions to ask us." You take another step forward and glance down at your feet. A wide puddle of water spreads across the cracked concrete floor and you catch a glimpse of your reflection. It looks like you, but somehow, your head has morphed into… Hey Arnold. Awesome. You love Hey Arnold.
This is the experience journalists and a few select superfans were treated to at this year’s Comic-Con convention in San Diego, thanks to an incredible virtual reality environment cooked up by the talented staff of the Nickelodeon Entertainment Lab. And at the heart of the experience, working tirelessly behind the scenes to allow layers of disparate technology to interact seamlessly with one another, NewTek’s NDI was doing the heavy lifting, bringing the new world of Rise of the Teenage Mutant Ninja Turtles to life in a way that's never been possible before.
Chris Young, Senior Vice President of the Nickelodeon Entertainment Lab, headed up this virtual reality project, designed to create the ultimate PR splash to publicise the reboot of the Teenage Mutant Ninja Turtles franchise. Nickelodeon's new series, Rise of the Teenage Mutant Ninja Turtles, is a prequel to the original show and features the TMNT crew before they became crime fighters. The new show is set in a dense urban landscape reminiscent of New York City, along with NYC's mythical, hidden underground worlds. By creating a virtual press junket, Young was hoping to immerse journalists in the Turtles’ new world. “At the Nickelodeon Entertainment Lab, we believe the future is rendered in real-time," Young says, and at 2018’s Comic-Con, he and his crew of engineers and dreamers set out to prove it.
"We wanted journalists and superfans at Comic-Con to have this unique opportunity to step inside the Turtles’ art-directed world," Young says. "We wanted them to get a first-hand look at the new show and be able to interview Mikey and Donnie in virtual reality. Our plan was to film the interview using live-action cameras composited with gaming footage in mixed reality. Then, at the end of the interview, we would hand the journalist a thumb drive of their interview with the turtles."
All of which sounds like a tremendous idea, but how to pull it off? "Well," Young admits, "the devil is in the details."
How Did They Do It? NDI, of course
Fortunately, this is just the sort of challenge that the Nickelodeon Entertainment Lab was created to take on. The Lab was set up a few years ago to experiment with emerging technologies in the hopes of identifying outlets for Nickelodeon's universe of characters, both present and future. Pulling off a project as ambitious as a virtual press junket proved to be quite a challenge for the Lab and required bringing together technologies from a number of different disciplines and forcing them to play together nicely. That's where NewTek’s revolutionary NDI technology came into play. NDI acted as the unifying force in the Lab's virtual reality project, allowing a wide range of programs and devices to interact with one another.
Chris Young explains: "It all starts with Adobe Character Animator. We stream that into Unreal Engine (the source-available game engine developed by Epic Games) using NDI technology. So, the person wearing the VR headset is seeing the animated Turtles streaming over NDI inside the game. From there, we’re also streaming NDI into live compositing software, where we’re compositing the footage together: both virtual camera shots of the Turtles and live-action footage of the journalists."
"The journalists are shot on green screen," Young explains. "And that green-screen key is composited into a back plate coming out of the game."
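The green-screen step Young describes is a standard chroma-key composite. As a rough illustration of the idea (a minimal NumPy sketch with made-up pixel data, not Nickelodeon's actual keyer, which would work at broadcast resolution with edge softening and spill suppression):

```python
import numpy as np

def chroma_key(foreground, background, key_color=(0, 255, 0), tolerance=80):
    """Composite a green-screen foreground over a background plate.

    Pixels within `tolerance` of the key colour (Euclidean distance in
    RGB) are treated as transparent and replaced by the back plate.
    Illustrative only -- production keyers are far more sophisticated.
    """
    fg = foreground.astype(np.int16)
    key = np.array(key_color, dtype=np.int16)
    # Per-pixel distance from the key colour
    dist = np.linalg.norm(fg - key, axis=-1)
    mask = dist > tolerance            # True where the subject is
    out = background.copy()
    out[mask] = foreground[mask]
    return out

# Tiny 2x2 example: the near-green pixels get replaced by the back plate
fg = np.array([[[0, 255, 0], [200, 50, 50]],
               [[0, 250, 5], [10, 10, 10]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 99, dtype=np.uint8)
result = chroma_key(fg, bg)
```

In the Lab's pipeline, the "background" here is the game-engine back plate arriving over NDI, and the "foreground" is the live camera feed of the journalist in the green-screen cube.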
But that was only the first step in the complicated virtual-reality universe concocted by the fertile minds at the Nickelodeon Entertainment Lab. “All of those signals were then live-streamed back over NDI to our TriCaster system,” Young tells me. “Using the TriCaster, we were able to live-edit between all the animated and live-action camera angles, as well as record iso-feeds of all the different angles, in addition to the program edit. Then, once the interview was over, we were able to throw the program feed on a thumb drive to give to the journalists. From there, they could either share the interview immediately over their social channels or go back and use the iso-edits to repackage the interview in a way that worked best for telling the bigger story they wanted to tell."
Keeping Someone Else’s Head on Straight
Through experience, the staff at the Nickelodeon Entertainment Lab has come to realise that nothing ruins a virtual reality experience faster than seeing yourself wearing a clunky VR headset rig. So, rather than destroy the illusion for journalists, the Lab’s designers came up with an ingenious solution to the headset problem; they gave the journalists new heads. Before the interview, each journalist was allowed to select their favourite character from the Nickelodeon universe of animated characters and that head was keyed over their own, eliminating those pesky, illusion-destroying VR headsets.
"We thought it would be super-fun for everyone to be a Nick character," Young says. "So, the idea was to cover the journalist's head and VR headsets with a Nick character head. Although the journalists chose which character they wanted to be, it wasn't revealed to them until they looked down at a puddle we put in the ground. So, there would be this great moment when they'd see themselves and say, 'Wow, I'm Hey Arnold." It made for a great interview."
Bringing The Turtles to Life
Pulling off this live virtual reality project required a well-rehearsed team, according to Young. A crew of nine, along with a room full of blazing-fast gaming computers and other video gear, worked to bring the new Turtles’ universe to life at Comic-Con.
"We had two puppeteers who live-animated the characters of Donnie and Mikey," Young says. "They worked with the Turtles’ animation show unit to extract animation cycles from the actual show episodes. Those cycles were triggered using MIDI controllers and the two puppeteers were able to puppet live, viewing the output on NDI monitors, as well as a composite feed of the two characters together. That way, each animator knew what they were doing, as well as what the other character was doing. And then, just off to the right of the animators were actors Josh Brener and Brandon Mychal Smith, who voice the characters of Donnie and Mikey in the series.”
Since the Nickelodeon crew only had access to the actors for a few hours during Comic-Con, it was essential that the system worked without any problems.
NDI: The Glue Holding Turtle Reality Together
And that’s why the Nickelodeon Entertainment Lab chose NewTek’s NDI technology to meld all these different platforms into a single seamless world. Chris Young has been a fan of NewTek products for many years and using NDI was a natural extension of that.
“NDI does this amazing thing in a magical way,” Young says. “The great thing about NDI is that you can send it out as a source and pretty much pick it up on any machine and it just works, regardless of format or frame rate. It gives us an amazing amount of flexibility. We also wanted to route so many different sources from so many different systems with so many different requirements for each source into our project, and NDI is an elegant solution for moving video across your Local Area Network.”
Rehearsal Is Key
Once the crew at the Lab came up with their bold plan to combine all these divergent systems and technologies into a single virtual reality experience, one question remained. Would it work? And more importantly, would it work at the convention, with its requirements for rapid set-up and tear-down? To find out, the Lab’s engineering staff set up a mock-up of the system in their Burbank studio and began putting it through its paces.
“We taped the dimensions of our booth out on the studio floor,” Young says. “Then we set up a bunch of tables and all the computers with the right cable lengths for everything, so we could really understand the setup. We rehearsed it for a couple of weeks straight to get the protocol down cold. We built the eight-foot-high green screen cube for our journalists. Then, we hauled people out of the hallways at the Lab – anyone we could find – and made them play the role of journalists. In the end, it turned into a mini-broadcast production unit, built around the TriCaster system.” As a result of all that rehearsal, the system performed like a champ at Comic-Con, and Young credits NDI with holding the project together.
“Anytime you do a live event and involve advanced technology and talent, there are so many things that can go wrong," he says. "So, we were thankful that NDI was super-solid and allowed us to find the rhythm to pull the project off.”
Using NewTek’s Connect Spark to Work Out the Bugs
The crew at the Lab also uses another piece of NewTek gear in their Southern California studio to help design their virtual reality worlds. “When the Connect Spark device came out, I literally bought it on day one from the NewTek website," Young says. "It solved a major problem for us.”
NewTek’s Connect Spark is a portable IP video encoder that converts a 4K video signal to NDI and delivers it to the network for use with compatible systems, devices, and applications. Many of Nickelodeon’s current projects involve creating large-scale roaming VR experiences, and one of the biggest problems facing game designers in that realm is understanding what the end-users are seeing with an eye toward improving the experience.
“There’s nothing worse than the player telling you, ‘I think that thing over there should move.’ And you’re saying, 'What thing? I can’t see what you’re talking about.' So, the minute I saw the Connect Spark, I realised I could put it on a backpack and send wireless video at 60fps back to the designers, so they can get a true view of what the VR players are actually experiencing while they're playing. It's absolutely critical to understanding what needs to be changed.”
Real-Time is the Future of Entertainment
When they're not creating a Turtle-powered world of wonder for journalists, the engineers and designers at the Nickelodeon Entertainment Lab are focused almost exclusively on virtual reality, augmented reality and mixed reality projects. According to Young, real-time entertainment is the wave of the future, which is why the Lab is working so hard to be among the first riding that wave. “The world is going to be full of executables and binary files,” he proclaims. “The days of QuickTime and 2D video will fall away. I think the idea of immersing yourself into your entertainment is an idea that’s coming at us like a runaway train. That’s why we’re trying to understand that world with so much energy and effort.”
And most likely, NewTek’s cutting-edge technologies will continue to figure heavily into the Nickelodeon Entertainment Lab’s future plans. “There are so many projects we have in our queue, and we’re just dying to get to them all,” Young says. "We don’t have enough time in the day to do all the cool things we want to do.”