Technical Summary of MTSU's XR Stage
The technology behind XR is the most critical piece of the puzzle in creating an XR stage. Research and planning for our project began in February 2021, but because the underlying technology evolves almost daily, the project has changed significantly from our original concept. Below is a summary of the technical equipment we have chosen for MTSU's XR project.
There are five key production elements in the XR workflow: the LED screens, camera tracking, content creation, cameras, and media servers.
For the LED screens, we chose ROE Visual's BP2, or "Black Pearl 2," panels. These panels have an extremely high refresh rate of 7,680 Hz, which is remarkable for LED panels. The BP2 was designed specifically for filmmaking, offers true-to-content color reproduction, and is used by major motion picture production houses; it has appeared in productions such as The Mandalorian and Dune, among many others. ROE panels also require very little maintenance, and their fine pixel pitch produces backgrounds that look remarkably true to life on camera.
We also decided to build an LED floor volume out of ROE BM4, or "Black Marble," panels. This gives us a full XR "cube" for many of our setups, fully immersing performers in an LED world. An LED floor also lets us use features of our Disguise servers such as set extension.
Camera Tracking Systems
One of the most critical advancements in the XR workflow has been the camera tracking system. The technology advances daily, so this category has been interesting to follow during our design process. The tracking sensor mounts on top of each camera and shines infrared light up into our studio lighting grid, where we have placed reflective discs that bounce the light back down toward the camera. The sensor reads the returning light and, from the pattern of reflections, generates X, Y, and Z coordinates describing where the camera is in the XR space. Data from the camera's lens system is also communicated to the tracking system. Everything is combined into a single stream of information sent to the media servers, which render the content on the LED screens based on where the camera is positioned and pointed in the room. For our setup, we purchased Mo-Sys StarTracker systems.
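Tracking systems of this kind typically deliver the combined pose-plus-lens stream as small, fixed-size network packets; the FreeD "D1" message is one widely used layout that Mo-Sys hardware can output. The sketch below is a simplified illustration of that idea (field scaling and spare bytes vary between vendors and configurations, so treat the constants as assumptions, not a spec):

```python
FREED_D1 = 0xD1  # message type byte for a camera pose packet

def _i24(v: int) -> bytes:
    """Pack a signed integer into 3 big-endian bytes (24-bit signed)."""
    return v.to_bytes(3, "big", signed=True)

def build_packet(cam_id, pan_deg, tilt_deg, roll_deg,
                 x_mm, y_mm, z_mm, zoom, focus):
    """Serialize one pose + lens sample into a 29-byte FreeD-style packet."""
    body = bytes([FREED_D1, cam_id])
    for angle in (pan_deg, tilt_deg, roll_deg):
        body += _i24(round(angle * 32768))   # angles with 15 fractional bits
    for pos in (x_mm, y_mm, z_mm):
        body += _i24(round(pos * 64))        # position in 1/64 mm steps
    for encoder in (zoom, focus):
        body += _i24(encoder)                # raw lens encoder counts
    body += b"\x00\x00"                      # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF     # all bytes sum to 0x40 mod 256
    return body + bytes([checksum])

def parse_packet(pkt):
    """Decode a packet built by build_packet back into engineering units."""
    assert len(pkt) == 29 and pkt[0] == FREED_D1
    assert sum(pkt) & 0xFF == 0x40, "bad checksum"
    def field(i, scale):
        return int.from_bytes(pkt[i:i + 3], "big", signed=True) / scale
    return {
        "camera_id": pkt[1],
        "pan": field(2, 32768), "tilt": field(5, 32768), "roll": field(8, 32768),
        "x": field(11, 64), "y": field(14, 64), "z": field(17, 64),
        "zoom": int.from_bytes(pkt[20:23], "big", signed=True),
        "focus": int.from_bytes(pkt[23:26], "big", signed=True),
    }
```

A packet like this is emitted for every frame, so the media server always knows the camera's pose and lens state at the moment each image was captured.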
For content creation and environmental design, we will use Epic Games' Unreal Engine, alongside Autodesk Maya for asset creation. Unreal Engine is built for demanding applications such as filmmaking and is ideally suited to complex scenes rendered in exceptional quality at very high frame rates. It is a complete suite of development tools for anyone working in real-time technology, giving creators the freedom and control to deliver cutting-edge entertainment, compelling visualizations, and immersive virtual worlds.
Unreal Engine will also be used in our state-of-the-art animation labs by students in MTSU's Animation concentration, who will send projects to the XR stage over network connections.
Typically, there are two ways to shoot a performance in an XR environment. The first is a single-camera shoot, which uses one high-end cinema camera to capture a separate pass for each camera angle the production needs. XR workflows require cameras that capture in a progressive format using a global shutter sensor. A global shutter captures the entire image at once, rather than line by line as a rolling shutter does. This prevents refresh-rate artifacts from the LED wall and yields a clean, crisp image in camera.
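A rough back-of-the-envelope sketch shows why the shutter type matters. With a rolling shutter, each sensor row starts exposing slightly later than the one above it, so different rows can catch the LED wall in different refresh cycles; a global shutter starts every row at the same instant. The row count and readout time below are assumed values for illustration only:

```python
LED_REFRESH_HZ = 7680   # refresh rate quoted for the Black Pearl 2 panels
READOUT_S = 0.015       # assumed full-frame rolling-shutter readout time
ROWS = 2160             # assumed sensor row count

def refresh_cycle_at_row(row):
    """Index of the LED refresh cycle active when this row starts exposing."""
    start_time = (row / ROWS) * READOUT_S   # rolling shutter: staggered starts
    return int(start_time * LED_REFRESH_HZ)

# Rolling shutter: row start times are spread across the readout window,
# so the rows collectively sample many different LED refresh cycles.
rolling = {refresh_cycle_at_row(r) for r in range(ROWS)}

# Global shutter: every row starts together, so all rows sample the
# panel in the same state.
global_shutter = {refresh_cycle_at_row(0)}

print(f"rolling shutter spans {len(rolling)} LED cycles")
print(f"global shutter spans {len(global_shutter)} LED cycle")
```

When rows land in different refresh cycles, they can record the panel at different points in its drive cycle, which is what shows up on screen as banding or flicker.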
For our single-camera XR setup, we chose the RED Komodo 6K camera package for its global shutter and for RED's proven ability to reproduce high-quality imagery. The Komodo pairs a Super 35mm sensor with a global shutter without sacrificing image quality or dynamic range.
In a multi-camera environment, we will use Hitachi's Z-HD5500 production cameras, which also have global shutter sensors. These cameras will be fitted with the Mo-Sys camera tracking system described previously.
For media servers, the industry choice at the moment for XR integration is, without question, Disguise. Disguise builds powerful media server hardware that generates massive amounts of high-resolution imagery for the LED walls in real time, as the camera moves. Disguise also offers online training that lets students learn the XR workflow on their own. Its servers have long been used in live performances by major artists including U2, The Rolling Stones, Beyoncé, Pink, Ed Sheeran, and many more.
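Conceptually, part of the media server's per-frame job is deciding, for each region of the LED wall, whether it falls inside the tracked camera's view frustum (and must show perspective-correct "inner frustum" content) or outside it (and can show the static environment). The toy 2D version below illustrates that test; the coordinates, field of view, and function name are hypothetical:

```python
import math

def in_camera_frustum(wall_x, cam_x, cam_z, pan_deg, hfov_deg):
    """True if the wall point at (wall_x, z=0) lies inside the camera's
    horizontal field of view. The camera sits at (cam_x, cam_z) with
    cam_z < 0 (in front of the wall) and is panned pan_deg off the
    wall's normal."""
    bearing = math.degrees(math.atan2(wall_x - cam_x, -cam_z))
    return abs(bearing - pan_deg) <= hfov_deg / 2

# Classify a strip of wall positions (in meters) for a camera 3 m from
# the wall, panned 10 degrees, with a 60-degree horizontal field of view.
inner = [x / 10 for x in range(-50, 51)
         if in_camera_frustum(x / 10, 0.0, -3.0, 10.0, 60.0)]
# Positions in `inner` would receive perspective-correct content rendered
# from the tracked camera's point of view; the rest of the wall shows the
# surrounding environment.
```

Because the tracking data updates every frame, this classification shifts across the wall as the camera moves, which is why the background stays perspective-correct in the shot.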
The equipment listed above was carefully selected to provide the maximum learning experience for our students. We are excited about the possibilities it opens up for them as they learn this remarkable production technique, which is rapidly becoming a new standard in the industry. Below is a detailed list of the items that make up MTSU's new XR stage.
XR Stage Primary Equipment List
- ROE Black Pearl 2 LED Panels (102)
- Mo-Sys StarTracker Camera Tracking Systems (3)
- RED Komodo 6K Camera Package (1)
- Hitachi Z-HD5500 Studio Cameras (4)
- Miller Studio Pedestals (3)
- Fiber Optic Fabric Switch (1)
- ROE Black Marble LED Floor Panels (52)
- Brompton SX40 Processor (1)
- Brompton XD Distribution Boxes (3)
- Disguise VX2 Server (1)
- Disguise RX2 Render Servers (2)