Unreal Level / Ultra Super Bake Test
We generated path-traced render frames from the Unreal Engine scene designed by Ilyas (https://www.artstation.com/ilyasozuer) for "The Prince" virtual production project. Using these frames, we first performed camera alignment and calibration in Agisoft Metashape, then exported the tie points and camera poses with Agisoft's COLMAP export feature. I really appreciate this feature; it started as a user-developed script, and Agisoft recognized its potential and integrated it natively (hoping to see more 3DGS-related updates in the future!). We trained the splats with LichtFeld Studio, developed by Janusch Patas, currently the best open-source software with a GUI for 3DGS. Finally, we displayed the resulting 3D Gaussian Splats on our LED Volume using Volinga's Unreal Engine plugin. The results were impressive.

So, why did we do this? First, the original scene was computationally heavy and had been aggressively optimized to hold 50 fps. Converting it to 3DGS let us lock in stable performance regardless of visual complexity: whether the source scene is optimized or not, and no matter how many lights it uses, the rendering cost of the splats stays roughly constant. And because we rendered the frames with path tracing (even though we couldn't polish them perfectly for this quick experiment), we got lighting and shadows that are essentially impossible to achieve in real time, even on an A6000 Ada.

Of course, this approach gives up the scene's dynamic real-time flexibility, so we think of it as a highly sophisticated "baking" technique. Overall, it was a fun experiment.
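For anyone who would rather script the Metashape step of this pipeline than click through the GUI, here is a minimal sketch using the Metashape Python API (Professional edition). The file paths, matching settings, and the COLMAP format identifier are illustrative assumptions, not the exact settings we used; in our case the COLMAP export was done through Metashape's built-in export feature in the GUI.

```python
# Minimal sketch of the camera alignment + export step, assuming the standard
# Metashape Python API (Professional edition). Paths and parameters below are
# placeholders for illustration.
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Load the path-traced render frames exported from Unreal (hypothetical folder).
frames = sorted(glob.glob("renders/prince_scene/*.png"))
chunk.addPhotos(frames)

# Camera alignment and calibration: feature matching followed by bundle adjustment.
chunk.matchPhotos(downscale=1, generic_preselection=True, reference_preselection=False)
chunk.alignCameras()

doc.save("prince_scene.psx")

# Export tie points and camera poses for 3DGS training.
# The COLMAP format identifier here is an assumption -- check the API reference
# for your Metashape version, or use the COLMAP export option in the GUI.
chunk.exportCameras("colmap_export/cameras.txt",
                    format=Metashape.CamerasFormatColmap)
```

The resulting COLMAP-style sparse reconstruction is what 3DGS trainers such as LichtFeld Studio expect as input, so from there the pipeline continues exactly as described above.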