
Virtual Production Film

Project overview

In the Year 2525 is a Virtual Production film utilizing the real-time systems available in Unreal Engine and the extended-reality (XR) stage at Savannah Film Studios. The film is set in a museum gallery featuring environmental artworks made by different student artists — a narrative structure chosen to accommodate and collectively assemble each artist's environment. The team was responsible for all aspects of production, from concept development and digital environment creation to XR video production, acting, editing, and color grading.

Role & Responsibilities

In my role on the project, I was responsible for crafting the creative treatment for the narrative and using Midjourney AI image generation to produce concept art. Stephen Mok produced the 3D assets, while I collaborated with him on lighting, texturing materials, and composing shots within Unreal Engine. Additionally, I oversaw the implementation of an ACES color grading workflow and served as the primary editor, assembling everyone's footage into a cohesive final product.

Creative treatment

The “Gallery” scene takes place in a museum exhibition called “Life is but a Dream,” which showcases Magic Realist paintings that offer an interactive experience to visitors. The scene follows a friendly, adorable flying drone named “W45H-U” as it explores the exhibition and admires the paintings.

The XR stage was also used to capture footage of actors portraying museum visitors within the virtual setting. For example, two visitors notice the drone as it enters the exhibition, pointing at it in surprise and amazement.

Art Direction

Due to the project’s tight schedule and limited resources, Stephen and I decided to leverage the then-emerging tool Midjourney to aid in the design of the museum environment and docent robot. This allowed us to streamline the design process, reduce our workload, and focus our efforts on the aspects of production where our strengths lie. However, this approach proved not to be fully effective, as Midjourney couldn’t produce enough detail to support the 3D modeling process. In the images below, the moodboard on the left was our starting point, and the image on the right was produced by Midjourney.

Museum Environment

Docent

Credits

Faculty: Matthew Ackers

3D Artist: Stephen Mok

Unreal Artists: Desmond Du, Stephen Mok

Editor: Desmond Du