How to Realize a Cool XR Scene?

Publisher: Supplier of LED Display  Time: 2022-03-01 15:24

In recent years, XR technology has entered the public eye. More and more concerts, press conferences, film sets and TV shows have begun to use this high-tech, realistic technology, opening up a new way of making film and stage content. As is well known, XR is an extended reality technology that integrates AR, VR and MR. How does XR achieve this integration, and how does it produce such cool images in front of the screen? Let Liancheng's editor break down XR technology for you.


The XR scene's technical framework includes several major elements: a shooting venue built from LED displays, cameras, object position tracking and positioning technology, a real-time graphics rendering engine (for the AR content), and a high-speed, low-latency network. The content produced by combining these elements is finally transmitted to the display screen, becoming a cool and realistic XR scene. The overall technical framework is outlined below.
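To illustrate why the "high-speed, low-latency network" element matters, here is a minimal Python sketch of a glass-to-glass latency budget for such a pipeline. The stage names and all numbers are assumptions for illustration only, not measurements from any real system.

```python
# Hypothetical latency budget for an XR stage pipeline (illustrative numbers only).
# Every figure below is an assumption; a real system must be measured stage by stage.

STAGE_LATENCY_MS = {
    "camera_tracking":   4.0,   # optical tracker pose update
    "network_transport": 2.0,   # tracking + video data over the stage network
    "realtime_render":  16.7,   # one frame of GPU rendering at ~60 fps
    "led_processing":    8.0,   # LED processor and screen refresh
}

def total_latency_ms(stages: dict[str, float]) -> float:
    """Sum the per-stage delays to estimate glass-to-glass latency."""
    return sum(stages.values())

if __name__ == "__main__":
    budget_ms = 50.0  # assumed target so camera moves and on-screen content stay in step
    total = total_latency_ms(STAGE_LATENCY_MS)
    for name, ms in STAGE_LATENCY_MS.items():
        print(f"{name:>18}: {ms:5.1f} ms")
    print(f"{'total':>18}: {total:5.1f} ms (budget {budget_ms:.1f} ms)")
    print("within budget" if total <= budget_ms else "over budget")
```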


To realize an XR scene, the virtual environment is first rendered onto the on-site LED display, object position tracking and positioning technology tracks the subject being filmed, and the footage is fed back to the real-time graphics rendering engine. At the same time, the video server runs localized logic to analyze performers' gestures and render graphics, outputting stereoscopic, realistic AR content that interacts with the captured footage; what is presented in front of the screen is the XR scene. It is worth noting that strong on-site network support is also required to enable low-latency XR interaction and high-bandwidth distribution of panoramic video and other content. Simply put, an XR scene takes the real scene as its base and combines it with cool AR content in the background, which is why XR technology is described as the fusion of AR, VR and MR.
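To make the steps above more concrete, the following is a minimal, purely illustrative Python sketch of one frame of such a loop. All class and function names here (CameraTracker, RenderEngine, and so on) are hypothetical stand-ins, not the API of any real XR product.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """Tracked camera state for one frame (position in metres, angles in degrees)."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float
    focal_length_mm: float

class CameraTracker:
    """Stand-in for an optical/encoder camera tracking system."""
    def read_pose(self) -> CameraPose:
        return CameraPose(0.0, 1.6, 4.0, 0.0, -2.0, 0.0, 35.0)

class RenderEngine:
    """Stand-in for the real-time graphics engine driving the LED wall."""
    def render_background(self, pose: CameraPose) -> str:
        # Render the virtual environment from the tracked camera's point of view.
        return f"background@({pose.x:.1f},{pose.y:.1f},{pose.z:.1f})"

    def render_ar_overlay(self, pose: CameraPose, video_frame: str) -> str:
        # Composite AR set extensions / CG elements over the captured footage.
        return f"AR({video_frame})"

def xr_frame(tracker: CameraTracker, engine: RenderEngine) -> str:
    pose = tracker.read_pose()                    # 1. track the physical camera
    led_content = engine.render_background(pose)  # 2. drive the LED wall
    video_frame = f"camera sees {led_content} + live actors"   # 3. capture on set
    return engine.render_ar_overlay(pose, video_frame)         # 4. add AR, output XR frame

if __name__ == "__main__":
    print(xr_frame(CameraTracker(), RenderEngine()))
```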


Case breakdowns


China's short film "Only to Meet You in Dreams" used virtual shooting for the first time, combining the XR workflow with the workflow of the film and television industry to deliver the final content. The short film used no location shooting and no green screen; the scenes were rendered in real time while the actors performed in a space constructed from LED screens. For scene rendering, XR (extended reality) uses a real-time rendering engine to build the shooting scenes, outputs and composites them through the media server, uses the camera tracking system to locate spatial position information, and maps the spatial relationship between characters and scenes in real time. Real-time rendering technology converts photorealistic dynamic digital scenes and reproduces them on the LED screen, presenting and outputting a seamless virtual scene in real time, finally producing a short film set in an XR scene.
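Mapping the spatial relationship between characters and scenes comes down to knowing, each frame, which part of the LED wall the physical camera actually sees. Below is a simplified, hypothetical Python sketch of that idea for a flat wall: given an assumed camera position and horizontal field of view, it computes where the left and right edges of the view frustum hit the wall plane, i.e. the region that must be rendered at full fidelity.

```python
import math

def inner_frustum_on_wall(cam_x: float, cam_z: float, yaw_deg: float,
                          hfov_deg: float, wall_z: float) -> tuple[float, float]:
    """Return the x-extent (left, right) on a flat LED wall at z = wall_z that a
    camera at (cam_x, cam_z) with heading yaw_deg and horizontal field of view
    hfov_deg can see.  Purely geometric illustration, not a production algorithm."""
    half = math.radians(hfov_deg) / 2.0
    yaw = math.radians(yaw_deg)
    xs = []
    for edge in (yaw - half, yaw + half):
        # Ray direction: heading 0 looks straight toward the wall along +z.
        dx, dz = math.sin(edge), math.cos(edge)
        if dz <= 0:
            raise ValueError("frustum edge does not reach the wall")
        t = (wall_z - cam_z) / dz   # distance along the ray to the wall plane
        xs.append(cam_x + t * dx)   # x coordinate where the edge hits the wall
    return min(xs), max(xs)

if __name__ == "__main__":
    # Assumed numbers: camera 6 m from the wall, 40 degree horizontal field of view.
    left, right = inner_frustum_on_wall(cam_x=0.0, cam_z=0.0, yaw_deg=0.0,
                                        hfov_deg=40.0, wall_z=6.0)
    print(f"render full-quality content between x = {left:.2f} m and x = {right:.2f} m")
```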




In addition, XR technology was also used for the live broadcast on the stage of the League of Legends World Championship finals. Since it was broadcast live, there was no chance for retakes, so dozens of cameras were involved in each game. All the cameras used on site were infrared cameras, and the walls of the S10 venue were covered with many small dots whose purpose is to reflect infrared light. Each infrared camera was fitted with a special module; as a camera moves, the module tracks the camera itself from the reflections and directions of these dots and feeds the result back to the computer in real time, with an error of less than 1 mm. In addition, the team matched the lights in Unreal Engine with the real SkyPanel fixtures through a plug-in they developed themselves, meaning that when the lights in the engine change, the lights in reality change in sync and the virtual content interacts with the arena, presenting a new way of broadcasting the event live.
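The lighting synchronization described above amounts to mirroring the state of a virtual light onto a real fixture every frame. The sketch below is a hypothetical Python illustration of that mapping; the send_to_fixture function is a stub rather than a real SkyPanel or Unreal Engine API, since the team's actual plug-in is not public, and the channel order is an assumption.

```python
from dataclasses import dataclass

@dataclass
class VirtualLight:
    """State of a light inside the render engine (all values normalized 0.0-1.0)."""
    intensity: float
    red: float
    green: float
    blue: float

def to_dmx_channels(light: VirtualLight) -> list[int]:
    """Map normalized light values to 8-bit channel values (0-255).
    The channel order (intensity, R, G, B) is an assumption; real fixtures
    define their own DMX personalities."""
    clamp = lambda v: max(0, min(255, round(v * 255)))
    return [clamp(light.intensity), clamp(light.red), clamp(light.green), clamp(light.blue)]

def send_to_fixture(channels: list[int]) -> None:
    """Stub for whatever transport the real system uses (e.g. a lighting protocol
    or a dedicated plug-in); here we just print the values."""
    print("DMX frame:", channels)

if __name__ == "__main__":
    # When the engine light changes, push the new values to the physical fixture.
    engine_light = VirtualLight(intensity=0.8, red=0.2, green=0.4, blue=1.0)
    send_to_fixture(to_dmx_channels(engine_light))
```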


On the occasion of the important moment when China's "Fendouzhe" (Striver) manned submersible successfully reached the bottom of the Mariana Trench, the CCTV News Channel of China Media Group and CCTV News new media ran synchronized popular-science explanations through an XR virtual broadcast. The virtual environment was used to cover, in real time, the real environment outside the LED screen, completing the extension of the virtual environment. In detail, the Mo-Sys camera tracking system kept the live camera and the virtual camera consistent and synchronized throughout the broadcast, and a disguise vx4 media server was used on site in the XR virtual studio as the virtual rendering engine for enhanced rendering and control. The workflow also brought AR technology into the environment: the BlackTrax dynamic tracking system tracked the positions of the participants and the virtual imagery in real time, so they could interact better with the CG content, giving the audience an immersive experience, elevating the delivery of visual effects to a new dimension, and letting audiences across the country experience the deepest blue at 10,909 meters under the ocean.
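Keeping the live camera and the virtual camera "consistent and synchronized" usually means applying the tracked pose to the virtual camera for the exact video frame it belongs to, since processed video arrives a few frames later than the tracking data. The Python sketch below illustrates that idea in a generic way; it is not the Mo-Sys or disguise workflow, and the three-frame delay figure is an assumption.

```python
from collections import deque
from typing import Optional

class TrackingDelayBuffer:
    """Hold recent tracking samples so the virtual camera can use the pose that
    matches the later-arriving video frame.  Illustrative only."""

    def __init__(self, video_delay_frames: int = 3):  # assumed processing delay
        self.delay = video_delay_frames
        self.samples = deque(maxlen=video_delay_frames + 1)

    def push(self, frame_number: int, pose: dict) -> None:
        self.samples.append((frame_number, pose))

    def pose_for_current_video_frame(self) -> Optional[dict]:
        # The oldest buffered sample corresponds to the video frame arriving now.
        if len(self.samples) <= self.delay:
            return None  # not enough history yet
        return self.samples[0][1]

if __name__ == "__main__":
    buf = TrackingDelayBuffer(video_delay_frames=3)
    for frame in range(6):
        buf.push(frame, {"pan": frame * 0.5})
        print(f"tracking frame {frame}: virtual camera applies {buf.pose_for_current_video_frame()}")
```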


XR technology, with its strong sense of the future, has taken root in China, and some applications have already blossomed. Although XR technology is still emerging, its scope of application has gradually broadened and the technology has become more mature. With the further maturing of small-pitch display technology and the continued reduction of pixel pitch, high-definition, finely detailed, naked-eye 3D display can be realized; together with better technical training systems for the relevant personnel, these developments will push XR technology toward full maturity.