Thursday, November 7, 2024

How Disney’s ‘Mandalorian’ Technology Could Immerse Theme Park Guests In Alien Worlds


Few events have been packed with as many announcements about Disney’s theme park plans as the recent D23 convention. New attractions were unveiled for all six Disney resorts around the world. Indeed, there were so many announcements that some almost entirely escaped the media’s attention.

One of the most welcome revelations wasn’t about a new theme park land or even a new ride but an update to an existing attraction.

Disneyland in California and Disney’s Hollywood Studios in Orlando are both home to a Star Wars simulator called Millennium Falcon: Smugglers Run, which is perhaps one of the biggest missed opportunities in theme park history.

Housed behind a full-size replica of the Millennium Falcon, Smugglers Run allows guests to live out their childhood fantasies by putting them at the controls of the iconic Star Wars spaceship. Each rider assumes one of three roles depending on which seat they are allocated, with the lucky few sitting up front manning the guns and the controls.

The cockpit looks just like it does on the silver screen, right down to the beige color of the seats. The decor is accompanied by suitably lifelike visuals on the soaring screen in front of the viewing window. The footage is generated by computers in real time, enabling the pilots to fly freely within a pre-determined path and to push the lever which launches the Falcon into light speed. The gunner can fire the ship’s lasers at will and, at a key moment, a harpoon shoots from the Falcon when an engineer at the back hits a flashing button.

So Smugglers Run really does live up to its billing of putting guests in the driving seat of the fastest hunk of junk in the galaxy, as the Falcon is affectionately known in the Star Wars films. The disturbance in the Force is its story.

Instead of being based on the Falcon’s famous owner Han Solo, played so memorably by Harrison Ford, Smugglers Run is themed to a character who first appeared in the Star Wars spinoff computer-animated series, The Clone Wars. As the show was aimed at kids, many visitors won’t be familiar with the character and the same goes for the fuel he asks them to gather on their journey through space. Called Coaxium, it played a major part in the 2018 feature film Solo: A Star Wars Story, which ended up being one of the biggest box office busts in the history of Star Wars as we have reported. So that too isn’t well-known.

Neither Ford’s famous face nor his voice features in the ride, and his character’s hairy companion Chewbacca has only a fleeting role. It leaves fans wanting more, and at D23 they got it. Cheers erupted in the convention center when Dave Filoni, chief creative officer of Disney’s Lucasfilm subsidiary, announced that in 2026, Smugglers Run will get a new mission themed to the smash hit Star Wars spinoff streaming series The Mandalorian.

It caused such a stir that many members of the media missed the news that this isn’t the only Mandalorian attraction on Disney’s radar. Vanity Fair was one of very few publications which picked up the news that Filoni and Mandalorian chief Jon Favreau are considering using Epic Games’ Unreal Engine software, which creates the show’s special effects, to allow fans to step into its alien worlds and walk through its sets. Nothing more has been mentioned about what this could involve. Until now.

When Disney announced that it had bought a $1.5 billion stake in Epic Games in February, it was widely seen as purely a move into video games, but that might only be part of the story.

Video games took center stage when Disney lifted the curtain on the blockbuster deal in February as it announced that it would create an entertainment universe connected to Epic’s hugely-popular online game Fortnite. True to its word, at D23 Disney revealed that its Marvel Comics characters, including fan-favorite villain Doctor Doom, would feature in a Fortnite Battle Royale storyline.

Doom is the talk of the town as Oscar winner Robert Downey Jr. recently announced that he will play the character in upcoming Marvel movies. Disney is capitalizing on this momentum by inserting clues and hints into Fortnite about the storylines for upcoming Marvel movies. It makes the game a core part of Marvel’s storytelling but the synergy doesn’t stop there.

At D23 Disney also announced that Epic’s Unreal Engine 3D creation platform, which powers many of the most popular video games, will be used to create the visuals for the new Mandalorian mission in Smugglers Run.

Fittingly, The Mandalorian itself broke new ground through its use of the Unreal Engine. Disney’s Industrial Light & Magic (ILM) visual effects division used the software to create real-time, broadcast-ready digital backdrops which were played live during production of the show on a soaring curved screen nicknamed the Volume due to its vast size. The images it displays are so sharp that they are indistinguishable from the physical stage in front of it and the props that sit on it.

ILM called the screen and software package StageCraft and built the first one in 2018 to film the first season of The Mandalorian at Manhattan Beach Studios in Los Angeles. Since then, a second screen has been built there, with others following at Fox Studios Australia, ILM’s facility in Vancouver and Pinewood Studios in London. The Pinewood screen has helped to tempt movie studios to film there, and in turn this brings more business to ILM’s United Kingdom division.

As we revealed last week in the London Evening Standard newspaper, despite the actors’ and writers’ strikes delaying productions for much of last year, the revenue of ILM’s UK division remained stable at $111.6 million (£86.2 million). What’s more, its net profit almost trebled to $11.9 million (£9.2 million) as more staff worked remotely enabling it to lower its lease payments on its central London offices.

The StageCraft technology has been used in all three seasons of The Mandalorian as well as spinoff series The Book of Boba Fett and other Star Wars streaming shows including Ahsoka, Andor and the upcoming Skeleton Crew. Multiple Marvel movies have used it as well as others from rival studios such as Warner Bros.’ The Batman and Black Adam. Cinema could just be the start.

With their emphasis on immersion, theme parks seem like a natural home for StageCraft. It could be on its way to them according to Philip Galler, co-founder of Lux Machina, the in-camera visual effects company which partnered with ILM and Epic to develop the technologies deployed on the massive screen.

Standing 20 feet high and 75 feet wide, the original model comprises 1,326 ROE Black Pearl BP2 LED panels, giving it a 270-degree sweep. It seamlessly joins a circular flat screen which hangs above in order to recreate the sky. For complete immersion, two huge LED screens can even be dropped in to create a 360-degree environment.
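As a rough sanity check on those numbers, the panel count can be estimated in a few lines. This assumes each Black Pearl BP2 cabinet is a 500 mm square (per ROE’s published specs, not a figure from this article) and treats the 75-foot width as the diameter of the 270-degree arc:

```python
import math

# Assumed cabinet size for a ROE Black Pearl BP2 panel (an assumption
# based on the manufacturer's published specs, not this article)
PANEL_M = 0.5

FEET_TO_M = 0.3048
diameter_m = 75 * FEET_TO_M   # treating "75 feet wide" as the arc's diameter
height_m = 20 * FEET_TO_M

# Length of a 270-degree arc of that diameter
arc_m = math.pi * diameter_m * (270 / 360)

cols = round(arc_m / PANEL_M)     # panels around the arc
rows = round(height_m / PANEL_M)  # panels from top to bottom
estimate = cols * rows

print(estimate)  # 1296 — within a few percent of the quoted 1,326
```

The estimate lands close enough to the quoted 1,326 to suggest the published dimensions and panel count hang together.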

Big screens are nothing new in the theme park industry, but what makes this one a whole new world is that the images on it move in real time with the cameras. This is thanks to the Unreal Engine, which runs on 11 interlinked computers powered by NVIDIA processors, with d3 media servers to store the images and send them to the LED screens. The entire workflow is managed by ILM’s proprietary software and fed into the screens with the assistance of Lux Machina, which supplied the LED screen hardware and had a magic touch.

Before StageCraft was developed, footage on screens in the background of film shoots would only look believable if the camera remained static or moved along a pre-programmed path, because the footage on the screen could be created to reflect that particular perspective. However, as the footage was pre-rendered, it would not adjust if the camera deviated from it, so the illusion would be broken. The StageCraft screen has a trick up its sleeve which prevents that from happening.

The seam between the video wall and the screen hanging above it is lined with motion-capture cameras which track the position of the cameras on the set. This data is used to alter the perspective of the photo-realistic 4K 3D images on the screen so that they move in time with the cameras. In an interview with audio visual industry publication AV Magazine, Galler explained that “real time camera tracking, real-time and interactive rendering are some key features” of the system.
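The perspective correction behind that trick is essentially an off-axis (asymmetric-frustum) projection: given the tracked camera position and the known geometry of the wall, the renderer builds a projection matrix whose viewing frustum passes exactly through the screen plane. The sketch below illustrates the idea in Python with NumPy, following the well-known generalized perspective projection formulation; it is not ILM’s or Epic’s actual code, and it simplifies the curved wall to a single flat panel:

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Build an OpenGL-style projection matrix for a planar screen seen
    from eye position `pe`. The screen is given by three corners:
    pa = lower-left, pb = lower-right, pc = upper-left."""
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))

    # Orthonormal basis of the screen plane
    vr = pb - pa; vr /= np.linalg.norm(vr)           # right
    vu = pc - pa; vu /= np.linalg.norm(vu)           # up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # normal, toward the eye

    # Vectors from the tracked eye/camera to the screen corners
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                              # eye-to-screen distance

    # Frustum extents scaled back onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard asymmetric-frustum (glFrustum-style) matrix
    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# A camera centered on a 2x2 screen yields a symmetric frustum...
P = off_axis_projection((-1, -1, 0), (1, -1, 0), (-1, 1, 0), (0, 0, 1), 0.1, 100)
print(round(P[0, 2], 6))  # 0.0 — no horizontal skew

# ...while moving the tracked camera sideways skews the frustum, which is
# what keeps the on-wall image lined up with the real camera's viewpoint.
P = off_axis_projection((-1, -1, 0), (1, -1, 0), (-1, 1, 0), (0.5, 0, 1), 0.1, 100)
print(round(P[0, 2], 6))  # -0.5
```

Recomputing a matrix like this every frame from the motion-capture data is what lets the rendered background shift in lockstep with the physical camera.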

The 3D images on the screens can be edited in real-time during the shoot and there is a team on hand to assist with on-the-fly color-correction and virtual lighting. Having close control of the lighting on the screens is essential as it needs to perfectly match the lighting on the set in front of it and vice versa.

Galler said that from an actor’s perspective, “being able to see the environment, the action and being able to see the world they are acting in allows for significantly improved storytelling experience.” His comments don’t apply to Disney in particular but to any studio using this technology, which would have seemed like science fiction just a few years ago.

The best way to describe the system is that it is like a real-life version of the Holodeck from sci-fi series Star Trek. The LED screen immerses the crew in a photorealistic virtual environment which seems to stretch into the distance and moves in time with the actors and the cameras that are filming them. Not only is there no need to build sets, but the film-makers don’t have to spend the time and money travelling to far-flung locations as the actors can be transported to them at the click of a button.

Gone are the days of film crews traveling around the world only to find that inclement weather stops them from shooting there. The ideal lighting conditions can be created digitally and shown on the screen so that film-makers can shoot in them all day.

Crucially, the images and light on the screen even reflect off the physical props on the set, so reflections don’t need to be inserted digitally in post-production as is the case when filming in front of a green screen. It explains why Lux Machina is described as an in-camera visual effects company, and this innovative technique came into its own during the filming of The Mandalorian, as its eponymous protagonist is clad in silvery armour. If a green screen had been used, the post-production team would have had to edit out the green shine in his armour and replace it with a virtual image of his surroundings.

Instead, the surrounding environment on the screens shows up on his costume as well as on the metallic surfaces of vehicles and weapons. It makes it look like they are actually in that alien location which makes the show seem more convincing.

Indeed, Galler said that the biggest advantages of using the LED screen are “interactive lighting, reflections, if possible in-camera final pixel images in some cases, and immersive environments for productions.”

The producers of HBO’s Westworld series used similar screens to show exterior shots of the City of Arts and Sciences complex in Valencia which represents the headquarters of the company at the heart of the show. The shots are reflected in the windows of interior sets and subtly move as the camera shifts.

However, the sensitivity of the system can be a double-edged sword. In traditional film-making, the camera will only see a small portion of the set. However, when it is in the middle of a 270 or 360 degree screen, the whole environment is a light box so objects behind the camera can reflect on the actors’ faces, costumes or props.

That’s not the only hurdle that directors have to watch out for. If the lens on the camera were too sharp, the edges of the Mandalorian’s armour would stand out too much from the background and could make it seem like he had been superimposed on it.

To avoid that, Filoni and Favreau used Panavision’s full-frame Ultra Vista 1.65x anamorphic lenses as they are known for their softness. Shooting the background with a soft lens also had the added advantage of reducing the wavy patterns of visual interference that are usually seen when a camera films a screen.

Despite all the advantages of the LED screens, Galler said “there are many scenarios where you may not be able to get in-camera finals. Large battle scenes, complex choreography, and many scenes which present fast moving camera and lighting changes will be difficult to do.” Likewise, doors have to be part of the physical set as actors can’t walk through the LED screen.

He added that directors shouldn’t get distracted by the ability to set up the perfect shot. “Just because it is real-time doesn’t mean that it should be treated as real-time. You can get bogged down in trying to touch all of the various real-time handles, and lose sight of the overall goal of telling a story. Treat it like it is a real world environment. You wouldn’t show up in a redwood forest and try to move all the trees around on the day, so don’t do it in a real-time volume.”

He added it is important for directors to remember that “this workflow affects every aspect of production, including post production, accounting, production design, hair/makeup, etc.” The production process is inverted because, unlike typical productions, computer-generated backgrounds have to be finished before filming begins.

For the first season of The Mandalorian, ILM used the industry-standard Maya program to model the backgrounds in 3D before photos from real-life locations were mapped onto them to ensure that the end result was photo-realistic. ILM got photos from Iceland and the US but didn’t require many staff to do so. That’s because each member of its team used a custom rig with six Canon EOS 5DS and MKIV cameras all snapping shots simultaneously.

Photos were taken throughout the day to replicate different lighting conditions and ILM used software such as Agisoft De-Lighter to remove shadows so that they could be lit virtually on the LED screens. The screens were used in more than 50% of season one of The Mandalorian and where it wasn’t suitable for the shot, the screen was turned green at the touch of a button.

Unlike a traditional green screen, which is made of canvas, the virtual version takes no time to set up and its size can be set to precisely outline the required area. The effects are then inserted in post-production along with the digital removal of the motion-capture cameras which line the seam at the top of the screen. In contrast, grass or rocks are used to hide the seam between the bottom of the screen and the set.

The screen requires significant set-up time as well as off-site preparation to make sure that all the software and media servers are functioning. As Galler explained, Lux Machina has to “calibrate camera tracking, color and asset pipelines, production timelines, and all of the departments, to have a successful shoot.”

It has made Lux Machina a force to be reckoned with in the LED screen sector and over the past few years alone it has installed them in the UK for Warner Bros. and Apple, built a temporary one for FOX Upfront and completed its Prysm Stage at Trilith Studios in Atlanta. Lux Machina also worked on Amazon’s enormous LED screen located on the historic Stage 15 at Culver Studios. A staggering 80 feet in diameter and 26 feet in height, it is integrated with AWS, Amazon’s cloud computing platform, to optimize the workflow. Movie studios may be just the start.

ILM showcased its StageCraft system to the public for the first time at D23 and the possibilities for theme parks soon became apparent. Not only was the screen the same caliber as the one being used in the upcoming movie The Mandalorian & Grogu, but the ILM technicians who operated it at D23 do the same job on the film set.

Just in front of the semicircular screen on the convention floor was a black stage with a droid on it. At the touch of a button the background transformed from the neon-lit streets of an alien city to the hangar of a space station with ships flying in and out. Guests wielding lightsabers were given a chance to stand on the stage as a camera panned around them making it seem like they were actually in the environment shown on the screen. The 10-second clips were then e-mailed to each guest giving them even more eye-catching content than they would get at a meet and greet with characters in a theme park.

“Oh, it was awesome,” said one visitor as she walked off the stage, grinning. “A chance of a lifetime.”

Given that this was just a temporary display on the floor of a convention center, it shows how much could be achieved by a permanent facility in a theme park. Imagine being on a set where other-worldly props cover the stage and hide the edges of the soaring screen, which shows an alien city on the horizon. Instead of being a static backdrop as you would usually find in a theme park attraction, the alien cityscape in the distance gets closer as you walk towards it, exactly as it would do in reality.

Disney has even developed cute bipedal robots, which could populate the landscape, and an ingenious floor surface enabling it to dramatically reduce the footprint of the facility. Called HoloTile, it acts like a multi-directional treadmill, so guests could head in any direction they want and the objects in the distance on the screen would get closer as they walk towards them on the spot.

It would likely be a low-capacity attraction which could command a premium price tag, as it would create content which cannot be found elsewhere. Not only would it involve the same technology used by Hollywood blockbusters, but it would make guests look like they had been transported to alien worlds.

Disney is already heading in this direction as its Avengers Campus land in Paris features a photo opportunity with superhero characters taken by a ring of cameras surrounding them. It generates a slow-motion 360 degree video clip along the lines of the bullet-time scenes from the Matrix movies. It comes at the princely price of $16.54 (€15) per person on top of the $82.72 (€75) for the park’s photopass package. An even more premium product could be just what the parks need to boost their profits in the face of softening attendance as we recently reported.

When asked if he thinks StageCraft could be used by theme parks in future, Galler said “of course, I think in many cases real-time engines are used in theme park rides. As the quality improves, we will be able to lean more on real-time tech for previs, installation and eventual rider engagement.”

The interview was published three years ago but seems staggeringly prescient in light of the recent announcement at D23. So although Filoni and Favreau are still only considering using the StageCraft system to immerse fans in alien worlds, its launch in the parks might not be a galaxy far, far away.
