Four years ago, a YouTube video titled "Mother and Daughter Reunite in VR" captured global attention. The clip was excerpted from the South Korean MBC VR documentary series "Meeting You" (너를 만났다) and showcased the use of VR technology to recreate deceased loved ones for grieving family members. To date, the clip has amassed over 35 million views online. Even without prior knowledge of the program or its backstory, the 10-minute video is enough to move anyone to tears due to its raw display of emotion. This emotion stems not only from the sorrow of losing a daughter but also from the profound and visible impact of their reunion.
With advancements in VR, XR, AI, and other technologies, methods for facilitating reunions between the living and the departed are on the rise. These technologies, referred to as "grief tech" by Charlotte Jee, a writer for MIT Technology Review, are designed to help people cope with grief and loss.
In Taiwan, notable examples include record producer Bao Hsiao-Song, who, after the painful loss of his daughter, delved into AI research. From just three recorded sentences his daughter left behind, he recreated her voice and, paired with an AI chatbot, enabled fluid conversation. Similarly, Japanese artist Koya Matsuo used AI to revive his late wife's singing voice and image, creating "Torichan sings Desperado." This work not only won the inaugural AI Art Grand Prix in Japan in 2023 but is also currently on display at the Museum of Contemporary Art, Taipei.
While many view such endeavors as indicative of being stuck in grief and unable to move on, the people who pursue them may instead be seeking ways to process their own emotions and find proper closure. The Korean VR documentary "Meeting You" uses technology to tailor virtual experiences for its participants, not to resurrect the dead but to offer the living a chance to bid farewell to their pain.
The fourth season of "Meeting You" premiered in 2024, focusing on a couple who lost their thirteen-year-old son, Seo-Jun, to an acute cerebral hemorrhage three years ago. Despite their efforts to remain strong for their other three children, an unhealable wound persisted in their hearts. They decided to join "Meeting You" in hopes of reuniting with their son in virtual reality and having the opportunity to say goodbye properly.
The production team invited VR technicians and psychological counseling experts to create two virtual scenes specifically for the couple to fulfill their wish. The first scene depicted a beach from their memories, commemorating Seo-Jun's love for the sea, while the second recreated Han River Park, where Seo-Jun often rode bikes with his father.
Moreover, the team utilized VIVE Mars virtual production and Unreal Engine's MetaHuman technology to depict Seo-Jun, forever thirteen, as a vivid 16-year-old before his parents. To ensure a "real" interaction in virtual reality, bidirectional communication and real-time motion capture were essential for smooth and natural movement. The production team at PONY ENT, the VFX team at IOFX MMC, and the consulting team at Niepce Studio collaborated to accomplish this innovative and challenging task.
We had the privilege of interviewing the leaders of the PONY Games and IOFX tech teams, delving into the production process, the challenges they faced, and how they utilized cutting-edge technology to create the touching moments in this season of "Meeting You":
Q: Hello, it's an honor to have you both. Could you introduce yourselves and explain the projects you were responsible for?
J: Hello, I'm Ji Myung-gu, CEO of IOFX and Technical Director at Studio Realive under SM. In the fourth season of "Meeting You," I was in charge of planning and designing various sensory technologies and coordinating the technical teams.
K: I'm Kang Min-ho, head of PONY. For the fourth season of "Meeting You," I oversaw the interactive components, real-time motion capture, and the technical coordination and production of VR interactions using Unreal Engine.
Q: Could you share with us the production process for this season and how long it took?
J: We started meetings with MBC's executive production team in April 2023. From then on, we planned and tested various technologies based on the content theme, formed a dedicated team to create content tailored to the participants, and carried out technical trials. The production timeline was about six months, involving collaboration with teams like Ponyent., Niepce Studio, Studio Realive, and Opim Studio, covering experts in MetaHuman, motion capture, interactive content, VR production, and virtual production. However, this project wouldn't have been possible without MBC's on-site technical guidance and HTC VIVE's virtual production solutions.
K: We were directly involved in the project for about six months as well. Most of our work focused on achieving real-time interaction, allowing triggers in Unreal Engine to be controlled instantly to correspond with dialogue and actions, and smoothing out on-site real-time motion capture.
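The team doesn't detail how their trigger system was built, but the idea Kang describes, an operator firing named cues the instant a line of dialogue or an action comes up, can be sketched in highly simplified form as a cue board that maps trigger names to engine callbacks (all names here are hypothetical, purely for illustration):

```python
# Minimal sketch of operator-fired animation triggers for a live session.
# In production this would dispatch into Unreal Engine over a network link;
# here, plain Python callbacks stand in for the engine side.

class TriggerBoard:
    def __init__(self):
        self._cues = {}   # cue name -> callback that plays the matching clip
        self.log = []     # which cues fired, in order, for post-session review

    def register(self, name, callback):
        self._cues[name] = callback

    def fire(self, name):
        """Called by the operator the moment a cue is needed on set."""
        if name not in self._cues:
            return False  # unknown cue: do nothing rather than break the take
        self._cues[name]()
        self.log.append(name)
        return True

board = TriggerBoard()
board.register("wave_hello", lambda: print("playing: wave_hello"))
board.register("sit_on_bench", lambda: print("playing: sit_on_bench"))
board.fire("wave_hello")
```

The point of such a design is that the timing stays human: the software only guarantees that a cue plays instantly once fired, while a person watching the conversation decides when.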
Q: It sounds like a complex and massive project. Did you encounter any challenges, and how did you overcome them?
J: Traditional drama or film production usually involves well-planned scripts and storylines, but our work relied more on capturing participants' experiences in virtual environments and presenting them in a documentary format, which led us to use many previously untried technologies. Establishing a fail-proof, real-time operating system was a significant challenge. Although we initially planned to include mixed reality (MR) content, we ultimately focused on realizing virtual reality (VR) content due to time and technical considerations, while also employing real-time motion capture and virtual production techniques.
K: Setting up and synchronizing a large amount of equipment in a limited studio space was a big challenge, and we faced several difficulties in this area. Fortunately, we had many rehearsal opportunities to identify and solve issues during the process.
Q: After watching the video, we noticed that virtual Seo-Jun could naturally converse and interact with his parents. Could you share the technical principles behind "real-time bidirectional communication" and the role it played in the shooting process?
J: First, to ensure stable operation on-site, we not only used high-spec network and hardware equipment but also optimized software and content for low-latency interaction. Additionally, we employed a monitoring system that could react to the parents' movements and behaviors by tracking the actual motion capture relative positions, allowing virtual Seo-Jun to respond.
K: Yes, since the real parents and virtual Seo-Jun were in different spaces, knowing the parents' location and movements was a challenge. To address this, we not only tracked the parents' positions but also added a "Lookat" function for Seo-Jun to directly gaze at his parents, making face-to-face interactions feel more genuine.
Q: The 16-year-old virtual Seo-Jun is incredibly lifelike. How did you recreate his expressions, appearance, and voice?
J: We simulated 16-year-old Seo-Jun using data and files provided by his family. The production team not only captured his physical appearance but also his personality through his expressions and movements. Everyone involved strived to capture everything about Seo-Jun. In processing this data, we not only saw what a wonderful and kind child Seo-Jun was but also felt the love his family had for him.
K: Technically, we used Unreal Engine 5's MetaHuman to create virtual Seo-Jun. A key feature of MetaHuman is that it doesn't rely on traditional "blendshape" technology for facial rigging but uses a skeletal structure that can be adjusted by weight, allowing for more detailed and customized adjustments, saving us a lot of time.
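For contrast with the bone-driven rig Kang describes, the traditional blendshape approach he refers to stores, for each facial pose, per-vertex offsets from a neutral mesh and mixes them by weight. A toy sketch with made-up data, just to show what that conventional technique computes:

```python
import numpy as np

# Blendshape facial animation: each target pose stores per-vertex offsets
# (deltas) from a neutral mesh; the final face is the neutral mesh plus a
# weighted sum of those deltas. Data here is a toy 3-vertex "face".
neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

deltas = {
    "smile":    np.array([[0.0,  0.1, 0.0],
                          [0.0,  0.1, 0.0],
                          [0.0,  0.0, 0.0]]),
    "jaw_open": np.array([[0.0, -0.2, 0.0],
                          [0.0, -0.2, 0.0],
                          [0.0,  0.0, 0.0]]),
}

def blend(weights):
    """Return the deformed mesh for a dict of blendshape weights in [0, 1]."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * deltas[name]
    return mesh

face = blend({"smile": 0.5, "jaw_open": 0.25})
```

Because every expressive nuance must be pre-sculpted as its own delta set, this approach gets labor-intensive fast; a rig driven by adjustable bone weights, as in MetaHuman, lets artists retune the face without re-authoring per-vertex targets, which is the time saving Kang points to.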
Q: Additionally, did you use VIVE Mars CamTrack and virtual production? Can you explain the workflow and equipment used?
J: Indeed, through VIVE Mars CamTrack and VIVE FIZTrack's virtual production technology, we were able to successfully record participants' reactions in the real world and their experiences in virtual reality, conveying these sincere emotions to a wide audience and creating touching moments. We also used the VIVE XR Elite as the head-mounted display for this season, and with VIVE Mars's virtual production solution, we were able to maximize the benefits of various technologies and integrate them together.
Q: A highlight of "Meeting You" is psychological counseling, including conversations with counselors. Did you integrate elements of psychological counseling into the VR interaction design? Can you talk about this aspect of the design?
K: During the content planning stage, we prioritized psychological counseling elements to plan scenes and interactions that would evoke happy memories for them, rather than just a typical VR experience. Fortunately, it seems that Seo-Jun's parents were truly immersed and felt it all.
J: We believe that finding a way to properly say goodbye to a deceased loved one is very important. For those who have lost someone, resolving inner concerns is key to overcoming grief and moving forward. In the process, psychological counseling and continuous communication are the most direct and effective methods.
Q: We're curious about the reactions and thoughts of the participants, Seo-Jun's parents, and other children on set and after the experience. Can you share with us?
K: On set, the staff were moved by the emotions of Seo-Jun's family, crying and laughing with them. We could deeply feel that through this project, their trauma was healed, even if just a little, which was very gratifying.
J: Additionally, this project left a valuable family reunion experience for Seo-Jun's siblings. In fact, during the project, I also reflected on the need to cherish my own family and treat them better. Admittedly, as time has passed I have gradually let this slip, which I find somewhat embarrassing.
Q: Finally, if there is a fifth season of "Meeting You," what would you like to try?
J: Although it's not for us to decide, many viewers are looking forward to the next season, so I guess there will be a fifth season. We expect that the next phase might use MR technology that we didn't include this time. With the advancement of MR equipment and related technologies, we look forward to achieving photorealism, providing users with a deeper sense of immersion. We hope these realistic technologies can be applied not only to the development of MetaHuman but also to scene creation technologies like NeRF, rather than just using virtual CGI backgrounds.
K: I believe that with the arrival of a fifth season, there will be better planning and technology. I hope to use camera-tracking and hand-tracking technologies to create content that satisfies both the visual and tactile senses.