SV_1.13.1.png

Shoorveer

Shoorveer, from Disney+ Hotstar, is one of the most exciting projects in Indian cinema and one of the first projects on Indian television to feature air combat manoeuvres. Viga ET helped create all the ACMs using real-time rendering in Unreal Engine.

SV_1.1.1.png

Air Combat Manoeuvres

One of the first shows on Indian television to feature air combat manoeuvres.

SV_1.8.1.png

Real-Time

All content was rendered in real time using Unreal Engine. For the first time, the final pixels came straight from a game engine.

SV_1.10.1.png

Photorealism

Photorealistic output straight from Unreal Engine, using physically based rendering.

Shoorveer was one of the first Indian web series to use Unreal Engine in its production. Aired on Disney+ Hotstar, Shoorveer tells the story of the Indian army fighting against a looming threat. Much of its CG content was created in Unreal Engine, with the goal of pushing the engine to achieve cinematic visuals.

Unreal Engine was used extensively to produce Shoorveer’s air combat manoeuvres (ACMs). Production was split into two stages. In the first stage, all the close-up shots of the pilots were filmed on an LED screen setup: large LED walls stood behind the cockpit sets, and Unreal Engine rendered the skies. The skies and clouds had to accommodate many lighting scenarios, and rendering them with offline methods would have taken a long time. Thanks to Unreal’s renderer, we could render them in real time.

JT_18.png

The second phase of production involved creating all the ACMs entirely in CG: building realistic assets, then animating, lighting, and capturing shots in Unreal Engine.

“To be honest, it was ambitious to attempt this in Unreal Engine,” says Sujay, co-founder at Viga Entertainment. “We worked on a prototype initially, and the results were stunning. So the next challenge was: how do we scale this up and make another 500-odd shots for this show in the next 16 weeks?”

Mig_29.png

"The amount of CG content we had to make in Unreal was overwhelming for a small team like ours. This is where our teams innovated, developing production procedures that kept the production on track," says Vivek Reddy, co-founder of Viga Entertainment.

The Real-Time Workflows

The first step in delivering this was to build a scalable team, and we recruited many artists across different levels for this project. Once the crew was in place, we began experimenting to get our production workflow right. The teams also spent considerable time understanding the physics of the aircraft.

It took a couple of sequences for the workflow to evolve, but it eventually became an excellent process. The goal was to ensure different tasks could happen in parallel inside Unreal Engine. Once we had clarity on the shots from the animatic, we would pin down the technical details of each shot: for example, the aircraft's speed, its altitude, and the camera's lens settings. With those in place, we would move into a layout and blocking phase, marking the space with splines to rough out the manoeuvres. Once these were approved, the animation crew would take up the shots and create the final animations.
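To see why those technical details matter before blocking begins, consider a rough back-of-the-envelope calculation (a hypothetical sketch, not Viga's actual tooling, with illustrative numbers): an aircraft's speed and the shot's frame rate together determine how far the jet travels between consecutive frames.

```python
# Back-of-the-envelope shot planning: how far does a jet move per frame?
# Speed and frame-rate values below are illustrative, not from the show.

def metres_per_frame(speed_kmh: float, fps: float) -> float:
    """Distance an aircraft covers between two consecutive frames."""
    speed_ms = speed_kmh * 1000.0 / 3600.0  # km/h -> m/s
    return speed_ms / fps

# A fighter jet at 900 km/h captured at 24 fps:
step = metres_per_frame(900.0, 24.0)
print(f"{step:.2f} m per frame")  # roughly 10.4 m between frames
```

At roughly ten metres of travel per frame, even small camera-keying errors become visible jitter, which is why locking speed, altitude, and lens settings early makes the later layout and animation passes far more predictable.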

 

We had an internal review system called Moviecolab, which was used extensively during the production of this project. Moviecolab tracked each shot's progress and routed reviews to the right artists.

vlcsnap-2022-08-18-12h42m40s177.png

The animators would submit their shots to Moviecolab, and our animation supervisor would review them. Once approved, the shots were imported into Unreal Engine along with the rough cameras. “Keying the cameras directly in Unreal Engine was difficult because the jets were moving at very high velocity. Importing the cameras from the animations made the job a lot easier, and the cameras were further tweaked in Unreal to achieve the final visuals,” says Darshan, the cinematic artist on this project.
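The submit-review-approve-import flow above can be modelled as a small state machine. The sketch below is purely illustrative (Moviecolab's real data model is internal to Viga; the states and shot name are assumptions), but it captures the idea that a shot only reaches Unreal Engine after supervisor approval.

```python
# Toy model of a shot-review pipeline like the one described above.
# States and transitions are hypothetical, not Moviecolab's actual design.

VALID_TRANSITIONS = {
    "in_progress": {"submitted"},
    "submitted": {"approved", "in_progress"},  # approved, or sent back for fixes
    "approved": {"imported"},                  # pulled into Unreal with its camera
}

class Shot:
    def __init__(self, name: str):
        self.name = name
        self.status = "in_progress"

    def advance(self, new_status: str) -> None:
        """Move the shot forward, rejecting out-of-order transitions."""
        if new_status not in VALID_TRANSITIONS.get(self.status, set()):
            raise ValueError(f"{self.name}: cannot go {self.status} -> {new_status}")
        self.status = new_status

shot = Shot("ACM_0120")  # hypothetical shot name
for step in ("submitted", "approved", "imported"):
    shot.advance(step)
print(shot.status)  # imported
```

Enforcing transitions like this is what lets a tracking tool guarantee that nothing unapproved lands in the engine.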

 

We built a system in which each artist had their own sub-level to work in, without stepping on anyone else's feet. Lighting, environment, cinematography, and FX all happened in parallel in their respective sub-levels. This workflow was simple and proved non-destructive for everyone in the process.

SV_1.5.1.png

“The advantage of Unreal Engine is its extensive library of assets in the form of Megascans and the Marketplace. We could incorporate many of these production-ready assets into our environments without a lot of hassle,” says Gaurav, the environment artist on this project.

Some of the shots involved creating large-scale terrains with hundreds of trees. Rendering that in real time is not easy; what made it possible was optimized environments, continuous profiling, and efficient use of LODs to retain cinematic visual fidelity.
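The LOD idea mentioned above is simple to illustrate: meshes far from the camera are swapped for cheaper versions the viewer cannot tell apart. The sketch below shows distance-based LOD selection in the abstract; the thresholds are invented for illustration, and a real engine picks LODs automatically from screen-space size rather than hand-set distances.

```python
# Minimal sketch of distance-based LOD selection, similar in spirit to what
# a game engine does automatically. Thresholds are illustrative only.

LOD_THRESHOLDS = [  # (max distance in metres, LOD index)
    (50.0, 0),      # full-detail mesh up close
    (200.0, 1),     # reduced mesh at mid range
    (1000.0, 2),    # low-poly mesh far away
]

def select_lod(distance_m: float) -> int:
    """Pick the cheapest mesh that still looks right at this distance."""
    for max_dist, lod in LOD_THRESHOLDS:
        if distance_m <= max_dist:
            return lod
    return 3  # billboard/impostor beyond the last threshold

print([select_lod(d) for d in (10, 120, 800, 5000)])  # [0, 1, 2, 3]
```

With hundreds of trees in a terrain, swapping distant ones down to low-poly or billboard representations is what keeps the frame budget intact without a visible drop in fidelity.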

 

“The future is real-time; this project could have consumed months on rendering alone if not for Unreal. With this tech, we could see the visuals right as we worked on them. The speed at which we can see the output in Unreal is truly ground-breaking, and we are excited to see what Unreal Engine 5 brings us,” says Vivek Reddy.